Join 📚 Quinn's Highlights

A batch of the best highlights from what Quinn's read.

Explore vs. Exploit: Finding Solutions Quickly Can Get You Stuck in a Local Optimum Transcript: Speaker 1 So when I started doing the work in AI, one of the really, very, very general ideas that comes across again and again in computer science is this idea of the explore/exploit trade-off. And the idea is that you can't get a system that is simultaneously going to optimize for actually being able to do things effectively, that's the exploit part, and being able to search through all the possibilities, that's the explore part. So let me try to describe it this way. I guess we're a podcast, so you're going to have to imagine this; usually I wave my arms around a lot here. So imagine that you have some problem you want to solve or some hypothesis that you want to discover. And you can think about it as if there's a big box full of all the possible hypotheses, all the possible solutions to your problem, or possible policies that you could have, for instance, in your reinforcement learning context. And now you're in a particular place in that box. That's what you know now. That's the hypotheses you have now. That's the policies you have now. Now what you want to do is get somewhere else. You want to be able to find a new idea, a new solution. And the question is, how do you do that? And the idea is that there are actually two different kinds of strategies you could use. One of them is you could just search for solutions that are very similar to the ones you already have. And you could just make small changes in what you already think to accommodate new evidence or a new problem. And that has the advantage that you're going to be able to find a pretty good solution pretty quickly. But it has a disadvantage. And the disadvantage is that there might be a much better solution that's much further away in that high-dimensional space. And any interesting space is going to be too large to search completely systematically. 
You're always going to have to choose which kinds of possibilities you want to consider. So it could be that there's a really good solution, but it's much more different from where you currently are. And the trouble is that if you just do something like what's called hill climbing, you just look locally, you're likely to get stuck in what's called a local optimum.
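The hill-climbing failure mode described above is easy to see in code. Here's a minimal sketch (the two-peak landscape and step size are invented for illustration, not anything from the episode): greedy local search started near the small peak stops there, even though a much better peak sits further away.

```python
import math

def f(x):
    # Hypothetical two-peak landscape for illustration: a small local
    # peak near x = 1 and a taller global peak near x = 4.
    return math.exp(-(x - 1) ** 2) + 2 * math.exp(-(x - 4) ** 2)

def hill_climb(f, x, step=0.1, max_iters=1000):
    """Greedy local search: move to a neighbor only if it scores higher."""
    for _ in range(max_iters):
        best = max((x - step, x + step), key=f)
        if f(best) <= f(x):
            return x  # no neighbor improves: we're at a local optimum
        x = best
    return x

# Starting in the basin of the small peak, pure exploitation gets stuck
# near x = 1, the local optimum...
stuck = hill_climb(f, 0.0)
# ...while a start in the other basin (one crude form of "exploring")
# finds the taller peak near x = 4.
found = hill_climb(f, 5.0)
```

Random restarts are the simplest exploration fix; simulated annealing and epsilon-greedy action selection are the standard ways to trade exploration against exploitation more gradually.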

Alison Gopnik on Child Development, Elderhood, Caregiving, and A.I.

COMPLEXITY: Physics of Life

The Dataome: The Energy Intensity of the Digital World Key takeaways: • The generation and usage of digital data requires a significant amount of energy and resources. • Silicon chip production is an energy-intensive process due to the creation of ordered structures from disordered material. • Efforts to generate electric power for the current informational world are hindered by the fight against entropy. • The energy requirements for computation, data storage, and data transmission are increasing exponentially. • Without significant improvements in efficiency, the energy needed to run our digital dataome may soon match the global civilization's total energy usage. Transcript: Speaker 1 It's everything, right? It's this conversation being recorded into bits. It's the information that went to and from your phone when you picked it up in the morning. It's the video you made. It's all the financial transactions, it's all the scientific computation. And that, of course, all takes energy. It takes the construction of the technology. In the first instance, making silicon chips is an extraordinarily energy-intensive thing, because you're making these exquisitely ordered structures out of very disordered material. And so there too, we go back to thermodynamics, and you're fighting, in this sense, against entropy in a local fashion. We're having to generate electricity to power the current informational world, that piece of the dataome. And the rather sobering thing is that already, the amount of energy and resources that we're putting into this is about the same as the total metabolic utilization of around 700 million humans. And if you look at the trend in energy requirements for computation, for data storage and data transmission, the trend is all upwards. It's an exponential curve. 
And they suggest that perhaps, even if we have some improvements in efficiency, in a few decades' time we may be at a point where the amount of energy, just electrical energy, required to run our digital dataome is roughly the same as the total amount of electrical energy we utilize as a global civilization at this time.
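The "a few decades to parity" claim is just the arithmetic of exponential growth. A minimal sketch, using made-up numbers (the 2% starting share and 15% annual growth rate are assumptions for illustration, not figures from the episode):

```python
import math

def years_to_match(e0, e_total, growth_rate):
    """Years until e0, growing by `growth_rate` per year, reaches e_total.

    Solves e0 * (1 + growth_rate) ** t = e_total for t.
    """
    return math.log(e_total / e0) / math.log(1 + growth_rate)

# Hypothetical: suppose digital infrastructure uses 2% of global
# electricity today and its demand grows 15% per year while total
# supply stays flat.
t = years_to_match(e0=0.02, e_total=1.0, growth_rate=0.15)
print(f"parity in about {t:.0f} years")  # ≈ 28 years, i.e. a few decades
```

The point of the sketch is how insensitive the answer is to the starting share: because the gap closes logarithmically, even starting at 0.2% instead of 2% only adds about 16 years at the same growth rate.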

Caleb Scharf on the Ascent of Information — Life in the Human Dataome

COMPLEXITY

Risk tolerance in org change: When 1 bad thing happens, don't pave over it with rules Transcript: Speaker 1 Two is, and Sam, like you and I have talked about this a million times, like there is such a bias around risk where it's like this thing happened once. Now we make a rule for it. Now we have to uphold that rule in perpetuity. And that is what we call work debt. We don't even know if that one thing would ever happen again. But now we have hours and time and money and cycles and cycles and cycles probably forever because let's be honest, we're never going to unwrite that rule for what might have been a one-off. And so on the one hand, I think like that person's perspective is really valid. On the other hand, I'm like, if you want to be strategic, you have to learn to look at risk a little bit differently, which is that you can never eliminate it. You are always doing stuff with the issue in the rear view.

The Future of HR — Building Your Capabilities, Pt. 1 - Getting to Level 3

At Work with The Ready

...catch up on these and many more highlights.