Music, AI, and emotion

I’m listening to Rachmaninov’s second piano concerto, getting the goosebumps and euphoria that always come at the end of the second movement. No matter how far we push AI, this experience is quintessentially human, and something AI will never possess—at least as far as I can see, given the technological underpinnings and stated aspirations of AI researchers.

This thought makes me happy. Even as we push AI into the arts, and generative models, perhaps autonomously, produce work with the potential to inspire emotion in people, their creative process can't be driven by an internal emotional experience. Which of course raises all sorts of philosophical questions about intent, meaning, and value in art.

It also makes me wonder whether treating emotional experience as a non-goal is the right choice. On one hand, it's scary to think about an alien consciousness with humanlike drives beyond rational thought. On the other, can we expect future AI to truly understand humans and our goals, and thus be authentically aligned with humanity, without being able to connect with us on other levels?

I’m an AGI skeptic: I don’t believe transformer-based LLMs as they exist today will develop into conscious minds as we think of them. But I also don’t assume it’s impossible to create a form of conscious intelligence if or when we hit on the right architecture, which leads me to question whether we have that aim at all, emotion included or not.

I can hardly believe these musings aren’t just abstract: they’re actual issues we face, or might face, in the near future. Ten-year-old me would be absolutely agog at the world I find myself living in.

More writing to come on this topic!

Written entirely without the assistance of AI (other than extremely light proofreading for grammar).