We’re swimming in AI slop. Here’s how to spot it.
If your feed isn’t already filled with AI-generated video slop, it’s only a matter of time.
Meta and OpenAI will see to that. Meta recently announced Vibes, its endless slop feed, which consists exclusively of AI-generated content: cats, dogs, and blobs. And that’s just in Mark Zuckerberg’s first video post about it.
OpenAI’s new Sora app offers a different flavor of slop. Like TikTok, Sora has a “For You” page for scrolling through vertical video. But the most unnerving thing about Sora is how real it looks. With a feature called Cameo, users can create videos of themselves, their friends, and any public profile that grants access. That means videos of Sam Altman hanging out with Charizard or grilling Pikachu are making the rounds on social media. And of course videos of Jake Paul are circulating too.
This is only the beginning, and the technology keeps getting better. To help make sense of it, we spoke to Hayden Field, senior AI reporter at The Verge. Field and Today, Explained co-host Sean Rameswaram discuss why these tech giants are betting so heavily on AI video, what to do about it, and they even let a fake fool them.
Below is an excerpt of the conversation, edited for length and clarity. There’s much more in the full podcast, so listen to Today, Explained wherever you get podcasts, including Apple Podcasts, Pandora, and Spotify.
What is Mark Zuckerberg trying to do with Vibes?
That’s the million-dollar question. These companies, Meta especially right now, absolutely want to keep us consuming AI-generated content, and they really want to keep us on the platform.
I think it’s really just about trying to make AI a bigger part of life and everyday routine, getting people more used to it, and also planting a flag that says: “Hey, look, this is where the technology is now. It’s a lot better than back when we saw Will Smith eating spaghetti.”
How did it get so much better so quickly? Because yeah, that is not Will Smith eating spaghetti.
The AI now often trains itself. It can improve, and train itself to improve further. One of the big things standing in its way is really just compute. And all of these companies are building data centers and entire new facilities every day. They’re genuinely racing to get more computing power so they can push the technology even further.
Let’s talk about what OpenAI is doing. They just released something called Sora 2. What is Sora?
Sora is their new app, basically an endlessly scrolling social media app of AI-generated video. You can think of it, more or less, as an AI-generated TikTok. But honestly, the craziest part is that you can also make videos of yourself and your friends, if they give permission. It’s called Cameo: you record your own face moving back and forth, you record your voice by reading out a sequence of numbers, and then the technology can mimic you doing just about anything you want.
That’s why it’s so different from Meta’s Vibes, and why it feels different to scroll through. You see videos of real people, and they look real. I scrolled through and saw Sam Altman drinking a giant juice box, among a number of other things. It looks like it’s really Sam Altman, or like it’s really Jake Paul.
How can you tell whether what you’re seeing is real, at a time when that’s becoming increasingly difficult?
The tips I’m about to give you aren’t foolproof, but they’ll help a little. If you look at something long enough, you’ll likely find one of the telltale signs that it was generated by AI.
One of them is inconsistent lighting. AI sometimes has a hard time capturing the atmosphere of a place correctly. If there are a lot of light sources, maybe one corner is unrealistically dark, or the light doesn’t have the realistic quality of sunlight; that could be something you can spot. Another is unnatural facial expressions that just don’t look quite right. Maybe someone smiles too wide, or cries with their eyes open. Another is airbrushed skin, skin that looks too perfect. And finally, background details that disappear or change over the course of the video. That’s a big one.
Taylor Swift, actually: some of the promos for her new album apparently had a Ferris wheel in the background whose spokes went blurry as it turned.
Is there anything else we should look for?
I just wish we had more rules about this, and about how it has to be disclosed. OpenAI does have one safeguard: every video you download from Sora carries a watermark. Well, most videos; some pro users can download them without one.
Oh, cool, so if you pay them money, the watermark goes away. Very nice.
But the other thing is, I’ve seen a whole series of YouTube tutorials titled “How to remove the Sora watermark.”
Do companies like OpenAI or Meta even care whether we can tell if this is real or not? Or is that exactly what they want?
They say it matters to them. I think that’s about all we can say right now. But it’s hard, because it’s in the nature of a technology like this that it can be misused. So they just have to try to contain that misuse as much as possible, and that’s what they say they’re doing. We’ll have to wait and see how successful they are. And if history is any guide, right now I’m a little worried.