📚 Izzy's Highlights

A batch of the best highlights from what Izzy's read.

What about the fact that large language models make so much stuff up? Known as “hallucinations” by AI researchers (though Hinton prefers the term “confabulations,” because it’s the correct term in psychology), these errors are often seen as a fatal flaw in the technology. The tendency to generate them makes chatbots untrustworthy and, many argue, shows that these models have no true understanding of what they say.

Geoffrey Hinton Tells Us Why He’s Now Scared of the Tech He Helped Build

technologyreview.com

A study of six different urban sites found that roughly a third of all traffic congestion was made up of people trying to find a parking spot.

The Color of Law

Richard Rothstein

Using the right roads has never precluded destroying the wrong ones.

Crossings

Ben Goldfarb

...catch up on these and many more highlights.