Suno and AI music could be the sound of the future. Are we ready?



According to the French music streaming service Deezer, around 50,000 fully AI-generated songs are uploaded to the platform every day. Many of these songs won't reach a wide audience, but in the last year some of them have gained millions of listeners.

Which raises the question: If our future is going to be filled with this kind of AI music, what will that future sound like?

Deni Béchard is a senior science journalist at Scientific American. For almost a month, Béchard has only allowed himself to listen to his own AI-generated music, made with the AI music app Suno. He says the experiment is an attempt to think more critically about how we might engage with this type of music in the future.

Béchard spoke with Today, Explained host Noel King about what he's learned so far and how his AI creations compare to human-made music. The conversation has been edited for length and clarity.

There's plenty more in the full podcast – including excerpts from Béchard's songs – so give Today, Explained a listen wherever you get your podcasts, including Apple Podcasts, Pandora, and Spotify.

Okay, so you use Suno to create the songs, you said.

I think of a prompt and plug it in, and each prompt creates two songs, and I try to be as creative as possible. I usually plug it in two or three times and vary it, adding different types of instruments or different types of vocals, and just try a few of those. One that made me laugh was a song called “Organ Trafficking.” I asked for a contemporary rap song with female vocals and playful, ironic lyrics, and out came a song in which organ trafficking is, so to speak, the central metaphor. I was quite surprised.

I think one of the things I've realized is that a lot of the mainstream music I listen to is, in my opinion, heavily processed music – music designed to capture a large market. And it doesn't feel very personal to me anyway, so I realized that in this particular context [the music I made with AI] didn't feel very different most of the time.

Do you think if someone gave you a playlist of 10 songs, five of which were AI songs and five of which weren't, you would be able to tell the difference?

Wow. And what does that tell you?

I mean, it tells me that the AI is getting very good.

One thing I noticed during this process was that a lot of the popular AI music on Spotify – the songs with millions of listeners – are songs that are very emotional and dark.

It's like Xania Monet, or Solomon Ray, or Cain Walker's “Don't Tread on Me” – and Cain Walker isn't human. It's an AI avatar, right? Or Breaking Rust's “Living on Borrowed Time.” These songs all feel really authentic, as if this person really suffered these things and felt these things. That's how they come across.

I think AI works best when it just leans into that authenticity, because it kind of helps overcome the cognitive dissonance of thinking, “This isn't a really heartfelt song.” And it moves away from mainstream music created by humans, which is often heavily engineered to be a summer hit or to go viral in some way. That music often doesn't have that level of authenticity, that feeling of authenticity. I think when AI mimics that, we become more aware that it's superficial or artificial, because there's already an element of artificiality there.

Do you think you will continue making AI music after your experiment is over?

Oh my god, you love the power.

I think, you know, what surprised me about this is that I go somewhere and think, “What if I asked it to combine these styles or combine a banjo with a hip-hop track and add this type of vocal? What would I get?” Now I'm getting curious.

I would say I'm now at the point where I'm no longer concerned with connecting to the person behind the music. That's what I did at the beginning. At first I really wanted to know, “Who is this person?” It's like when you're in the middle of a book and you think, “What human mind gave rise to this book?” You turn the book over to see who the author is, you Google them, and you're like, “How the hell did they come up with that?”

In the beginning I often had the impulse to want to know who feels that, who thinks that. I would just get this cognitive dissonance. I would say, “This is a machine. This machine didn't fall in love. This machine didn't have these experiences. This machine didn't wake up at two in the morning and write this song just to express itself.” It really bothered me. It would somehow stop me from enjoying the song.

And I thought, “Well, if someone created an AI avatar and gave it a personality, and it was a fictional character that existed in the metaverse, and that AI avatar was a songwriter and sang this song, would that make it easier?” And strangely enough, it would. It would make it a little easier. And so I just imagined these AI avatars and thought, “Okay, I'll imagine a fictional character singing this song.” And that lasted maybe four or five days, and then I just got used to hearing the music and stopped thinking about it.

Does conducting this experiment and watching how you react to this music change the way you think about AI at all?

I think my conclusion from this is that in 10, 15, or 20 years there will be a lot of teenagers who will look at the discussions we're having right now and say, “What are these people talking about? This is completely normal. Why should anyone feel so conflicted about this?”

I think we'll get used to it pretty quickly. That's my gut feeling. There are many big questions surrounding copyright and artist protection and what it means to be an artist. A lot of questions will arise from this and I really hope that the artists are protected as much as possible and fairly compensated. But I think this will fit into our lives much more smoothly than we realize at the moment.



