Join My Brain Food
A batch of the best highlights from what Louis has read.
In information theory, the cross-entropy between two probability distributions p and q over the same underlying set of events measures the average number of bits needed to identify an event drawn from the set if a coding scheme used for the set is optimized for an estimated probability distribution q, rather than the true distribution p.
Cross Entropy - Wikipedia
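A quick formula to sit alongside that highlight (the standard form for discrete distributions, with log base 2 so the units are bits):

H(p, q) = -\sum_x p(x)\,\log_2 q(x)

Equivalently, H(p, q) = H(p) + D_{\mathrm{KL}}(p \,\|\, q): the irreducible entropy of the true distribution p, plus the extra bits paid for coding against the estimate q instead of p.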
Stuart Chase and others have come near to claiming that all abstract words are meaningless, and have used this as a pretext for advocating a kind of political quietism. Since you don’t know what Fascism is, how can you struggle against Fascism?
Rationality: From AI to Zombies
Eliezer Yudkowsky
If you could get all the people in an organization rowing in the same direction, you could dominate any industry, in any market, against any competition, at any time.
The Five Dysfunctions of a Team - A Leadership Fable
Patrick Lencioni
...catch up on these, and many more highlights.