📚 Katie's Highlights

A batch of the best highlights from what Katie's read.

The carb-fat combo doesn’t exist naturally, but it’s one that humans clamor for, say scientists at Yale.

The Comfort Crisis

Michael Easter

The next section will get into the mathematics of self-attention, but the main gist is that a transformer learns which words in an input sequence are related and then creates a new encoding for each position in the input sequence that is a merger of all the related words.

A Very Gentle Introduction to Large Language Models Without the Hype

Mark Riedl
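To make that merging concrete: in scaled dot-product self-attention, each position's new encoding is a weighted average of every position's value vector, with the weights measuring how related the positions are. Below is a minimal NumPy sketch of a single attention head; the dimensions and random weight matrices (seq_len, d_model, Wq, Wk, Wv) are toy assumptions for illustration, not taken from Riedl's article.

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Each output row is a relatedness-weighted merger of all input rows."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])           # pairwise relatedness
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)    # softmax over positions
    return weights @ V                                # new encoding per position

rng = np.random.default_rng(0)
seq_len, d_model = 4, 8                               # 4 tokens, 8-dim embeddings
X = rng.normal(size=(seq_len, d_model))               # one embedding per position
Wq, Wk, Wv = (rng.normal(size=(d_model, d_model)) for _ in range(3))
print(self_attention(X, Wq, Wk, Wv).shape)            # (4, 8): one merged encoding per token
```

Real transformers run many such heads in parallel and stack the layers, but each head follows this same relate-then-merge pattern.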

While the Manhattan Project did ultimately accomplish its goal, hindsight obscures the fact that the project itself was a tremendous financial and technological gamble with far-reaching consequences that could not have been foreseen at its inception.

Panic About Overhyped AI Risk Could Lead to the Wrong Kind of Regulation

Divyansh Kaushik

...catch up on these and many more highlights