📚 Nicole's Highlights

A batch of the best highlights from what Nicole's read.

At Anthropic, our Claude models are not trained on user conversations [by default](https://privacy.anthropic.com/en/articles/10023580-i-want-to-opt-out-of-my-prompts-and-results-being-used-for-training-models), and we take the protection of our users’ data very seriously. How, then, can we research and observe how our systems are used while rigorously maintaining user privacy?

Clio: A System for Privacy-Preserving Insights Into Real-World AI Use

anthropic.com

Cultural changes need to be lived and driven through action. Fostering small changes can have a big impact. One of the most effective ways to do so is by creating rituals and embedding them into ways of working. A simple but powerful one could be that if a decision is needed in a meeting the requester needs to send analyzed and aligned supporting data before the meeting as a pre-read.

How Companies Can Stop Failing at AI and Data-Driven Decision-Making

Shreshth Sharma

But I sometimes think of my journey through adulthood to date as one of incrementally discovering the truth that there is no institution, no walk of life, in which everyone isn’t just winging it, all the time.

Four Thousand Weeks

Oliver Burkeman

...catch up on these, and many more highlights