Music and technology can be odd bedfellows. Or maybe better put, music fans and technology can have a strained relationship. Bob Dylan's electric set at Newport in 1965 was met with harsh criticism from some folk music purists. Innovations from feedback to synthesizers to sampling have all been met with some level of disdain.
Today, the relationship between artificial intelligence and music can also stir up an argument. Next week, five musicians will collaborate with AI for a concert hosted by the University of Arkansas Honors College. Two of the musicians, Nicola Radan and Cameron Summers, came to KUAF to discuss what we'll hear on Tuesday night, Sept. 2.
Nicola is a music professor at the University of Arkansas and a flutist. Cameron is a trumpeter, composer and a consultant in AI, machine learning and data science for the music and healthcare industries, and he lives in Northwest Arkansas. The two have played in ensembles together, and they began discussing Cameron's work in AI. That led to a meeting at Cameron's home.
"I think I showed him I have kind of a collection of ideas that I worked on that started in COVID lockdown," Summers said. "And they're essentially ways to interface computers and AI, machine learning with musicians as musicians like to play. And so the way that looks is often musicians are playing and feeding information into the computer and then doing something with that that is interesting, maybe improvisational. I'm usually working within an improvisatory context and often jazz because that's so central to that particular genre. And so we sat down and I said, you know, play. I think you brought your flute, right?"
"Yes, I did," Radan said.
"Play into this, feel this, hear this," Summers said. "And so he got to experiment with some of the things that I've been working on. And I think that was kind of—"
"Right? You would think maybe that's something similar," Radan said. "You would just use a pedal on the guitar and use all kinds of effects. But these are different effects that Cameron is producing or that really correspond differently or can correspond to a certain mode that you have in your mind to want to perform. So basically it kind of reads your mind in that way when you play. And that's what I see as a great way to use artificial intelligence in music rather than that artificial intelligence takes over us."
"So it sounds like it's more as if you're collaborating with yourself rather than being substituted for," Kellams said.
"Yes, by design, for sure," Summers said. "Because that was, you know, the thing about AI is it's packaging human labor, it's packaging human creativity into something that then operates separately from you. And in and of itself, that's not necessarily a bad thing. But when it comes to creative pursuits, we don't want to take that from people. And so I very much wanted to reimagine this idea of, say, automation of creativity as dependent upon human input and human creativity specifically. So these things that I have built, they are better with a better musician, for example. You can think of it as human augmentation almost, and using intelligence technologies to do that."
"If you were here with your flute and we had someone else here with their flute, you would play differently," Kellams said. "It might be microcosmically different to the untrained ear."
"Yes," Radan said.
"But what it sounds like you experienced here was your style, your information, your technique that you were working with," Kellams said.
"Yes," Radan said, "And basically what I found really inspiring was that when I play, I get new ideas that I wouldn't necessarily get if I don't have it."
"What do you have to do to prepare for this event?" Kellams said.
"All the normal things that I do as a musician," Summers said. "I've written a new piece that will premiere at this concert. So I'm writing and we're going to rehearse and we're doing all the music things. But then there's also all of this work that I've done on the technology, code that I've written. So it's computer software. I spend a lot of time going in, setting things up for the new space that we're going to be in or needing to debug, or maybe there's a new idea in the new piece that needs to be supported with a new feature. And I add this in. So I do a whole lot of software development also that accompanies the music."
"So Cameron, someone's going to be listening to this and think, wait a second, wait a second. This isn't, quote, real music," Kellams said. "This is the machines taking over. But I have two musicians in here who are embracing this."
"Yeah, absolutely. I think that this is the idea, to take that head on and say no, we don't have to sit back as consumers of this thing and just let things happen," Summers said. We have the ability to take action and have agency in this and say, you know what, let's engage with this technology in a different way. Let's think differently about it. How can it serve us in a way that's healthful, good for us? Can it make us feel good? Can it bring us together? These types of things."
"How did you get this interested in the technology side of it?" Kellams said.
"I've always had an interest in technology," Summers said. "My undergraduate studies were in mechanical engineering, but I promptly after I got my degree decided I did not want to do that at all and actually went to music school instead. So I went to the Manhattan School of Music in New York, and I got a master's degree in jazz trumpet there. I studied with some amazing players in New York. And then I was a professional musician for a long time in Los Angeles and New York. I gigged. I played with the Foo Fighters at the Grammy Awards. I played on an album that got a couple of Grammy nominations, lots of studio work, salsa, rock bands, jazz, whatever. I was there doing it.
"But the thing is, as a musician, you have gigs at night. I remember I would go on tour, I would do some Broadway stuff, and we would go on tour with these Broadway shows. But you have the show at 7 p.m., and the rest of the day you're just sitting, hanging out, doing nothing. And so I started to think, I wonder what kind of things are out there on the technology side that are related to the things that I'm interested in now. And that's when I discovered AI and machine learning as it relates to music. And so I started pursuing that pretty seriously, just on the side. And then I eventually got an internship with this company in Los Angeles that was making a digital radio and started writing software around music. And then I discovered you can actually apply these AI things to audio signals, and you can teach computers how to listen to music like people do. Wow, this is super interesting. So I pulled on that thread for a while, and that led me to Silicon Valley, where I was a researcher at a company called Gracenote for five years, cut my teeth there. And so I wrote algorithms that taught computers how to listen to music like humans. Companies like Apple, Amazon, Spotify would feed their entire music catalogs into these algorithms that I worked on. And it would spit out information. It would say, this is the mood of this piece, or these are the instruments, this is the key. And then that company I worked for would sell that data back to Apple or Spotify. And then Spotify, for example, would use that to build discovery playlists, how you'd find, say, a mood playlist on Spotify. That's directly related to work that I did.
"That was my way into it. Once we moved to Northwest Arkansas during COVID lockdown, I continued consulting, doing music related stuff. But then with lockdown and nobody playing music, I like a lot of people thought, what can I do differently here? Rethink my engagement with the arts, with music. And I thought, there's all this stuff that I've been making for these big music companies. What does that look like when you turn it around and put it into a performance context, which is what I was really interested in," Summers said.
"I think it's necessary to talk about AI in general, and we are all afraid of the wrong things that can go wrong with the use of AI or exploitation of AI," Radan said, "But I want to engage the audience in this way to understand the good way that we can see the good parts of AI. We can control the AI, particularly in music and jazz. And jazz is improvisational. Now we basically get more ideas with the technology, help us to get to that new point."
"You mentioned there will be an original piece?" Kellams said.
"Yes. Cameron composed the piece of music." Radan said. "Can you talk about it?"
"Yeah," Summers said. "Previously, the idea was introducing what is music and what is AI and how do these work together. What's the state of this? But I think people are relatively familiar with a lot of these capabilities now. People have probably gone onto their computer and touched a button and generated a three-minute radio song that sounds real. So people have this sense of what it's all about now. So instead this was looking at, well, everybody is now a little uncomfortable with what we can do and where this is going. So it was to focus on this idea of what is healthful around this, around the idea of creating, using AI in interfacing with it. It's not just necessarily going to take away. So the piece focuses on one of these capabilities I've been working on, which you could call a traditional harmonizer, but on steroids. This idea in which you put a musical instrument into it and it will generate multiple voices to harmonize with you. But this is very intelligent. It can change what it's doing based on all kinds of things, based on the things that I do. So it's not a deterministic system like a lot of things in the commercial market that people are familiar with.
"So the music is all built around this, which is stated simply a series of sketches that think of harmony and the way that harmony is good for us, things that sound beautiful, for example, and then mapping that into the system with a bunch of great musicians and just seeing what comes out of that." Summers said.
"Well, I look forward to it. Thank you both for coming in." Kellams said.
"Thanks for having us."
Nicola Radan and Cameron Summers, flute and trumpet, will be joined by Jake Herzog, guitar; Garrett Jones, double bass; and Chris Peters, drums, for an Honors College concert titled "AI in Music." It happens Tuesday, Sept. 2, at 6 p.m. in the Honors Student Lounge in Gearhart Hall on the University of Arkansas campus. The concert will include Summers' original composition, "Hymnus." The concert is free, but registration is requested.
Ozarks at Large transcripts are created on a rush deadline. Copy editors utilize AI tools to review work. KUAF does not publish content created by AI. Please reach out to kuafinfo@uark.edu to report an issue. The audio version is the authoritative record of KUAF programming.