Discussion about this post

Andrew Norman

Hilarious. This is a superb essay. Connecting dots is what LLMs do, and "confabulate" is the perfect word for what happens when they do this in a way that goes haywire and misleads...

Patrick Grafton-Cardwell

It’s odd that you observe that AIs can’t hallucinate because they lack sensory experience, and then immediately go on to say they have delusions. They can’t have delusions either, because delusions require (as James says) opinions, and AIs don’t have opinions. They don’t think or even write. They just generate syntax. I think the basic point here is good, but you should follow it all the way, past all the metaphors that grant GenAI too much mind.

20 more comments...
