Discussion about this post

Nick Taylor

I find the suggestion that some programmers are already trusting LLM-generated code without checking it very worrying. The problems of "hallucination" are well known, so why would they do that?
