Join My Brain Food
A batch of the best highlights from what Louis has been reading.
In information theory, the cross-entropy between two probability distributions p and q over the same underlying set of events measures the average number of bits needed to identify an event drawn from the set if a coding scheme used for the set is optimized for an estimated probability distribution q, rather than the true distribution p.
Cross Entropy - Wikipedia
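For reference (this formula is not part of the original highlight), the standard definition of cross-entropy over a discrete event set X, measured in bits, is:

H(p, q) = -\sum_{x \in X} p(x) \log_2 q(x)

It equals the entropy H(p) only when q matches p exactly; any mismatch adds extra bits on average.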
p. 82
The mental “I” that exists in our brains wants to break free from its genetic servitude, to no longer be held captive by the Darwinian processes that got us all here. We, as intelligent individuals, want to live forever and to preserve our society. We want to escape from the evolutionary forces that created us.
A Thousand Brains: A New Theory of Intelligence
Jeff Hawkins
The man of knowledge must be able not only to love his enemies, but also to hate his friends.
Thus Spake Zarathustra
Friedrich Wilhelm Nietzsche
...catch up on these, and many more highlights