This is about AI and the question of how we program feelings; they talk about the philosophical zombie idea. Lane throws it back to Lex, and he says something pretty amazing to me, starting at 12:40:
“Think of all the complexities that led up to the human being. The entirety of the history of four billion years has, in some deep sense, integrated the human being into this environment. From that dance of the organism and the environment, we can see how emotions arise, and our emotions are deeply connected to creating a human experience. From that you mix in consciousness and the fullness of it. But from the perspective of an intelligent organism that’s already here, like a baby, it doesn’t need to learn how to be a collection of cells, or how to do all the things it needs to do to perform its basic function of being a baby. It learns to interact with its environment, to learn from its environment, to learn how to fit into this social society. The basic response of the baby is to cry a lot of the time, maybe to convince the humans to protect it, or to discipline it, to teach it. We’ve developed a bunch of different tricks for how to get our parents to take care of us, to educate us about the world we’ve constructed, in such a way that it’s safe enough for us to survive, yet dangerous enough to learn the valuable lessons.”
“To make an apple pie, you need to build the whole universe.”
Then he says he will leave the zombie idea to the philosophers. He also leaves aside love: its definition, and what happens between two human beings when there’s a magic that just grabs them, like nothing else matters in the world and somehow you’ve been searching for this feeling, this person, your whole life.
He’s talking not about what the feeling is, but about the fact that it’s real. If he has that feeling for a toaster, or a dog, it’s still a real feeling. It could be anthropomorphism, but there’s some kind of emotional intelligence there. But Lex says there needs to be a higher order of intelligence than something like solving a mathematical puzzle, because that’s not the same as human interaction. He notes how incredible it is that we walk, in the same way it’s amazing that we create a conversation and make it meaningful. Nick puts a fine point on it: “What you’re saying is that AI cannot become meaningfully emotional until it experiences some kind of internal conflict that it is unable to reconcile, between the various aspects of reality, or its reality, and a decision it has to make, and then it feels sad, necessarily, because it doesn’t know what to do.”
Nick then references “21 Lessons for the 21st Century” by Yuval Noah Harari. Yuval thinks biochemistry is an algorithm, and that an algorithm could figure out how to write great music. Nick can’t refute it, and finds it disturbing. In Yuval’s final chapter, he talks about meditation and how it may be a way out of the algorithm: our decisions could be based on feelings, not on a logic that’s algorithmic.