
Elon Musk Finally Hiring Content Moderators For X (Twitter) After Taylor Swift AI Debacle!


AI on X is more than just the likely first name of one of Elon Musk's future children. It's also a technological marvel that's become more of a terror lately thanks to the way it's being used.

Most recently, some creeps used AI to create fake pornographic photos of Taylor Swift and spread them online — largely on X, the social media giant formerly known as Twitter. Her fans fought back against the trending topic, flooding it with positive images. But that didn't stop the pics from being viewed millions of times.

We’ve heard Taylor is considering legal action over the images, which makes sense. But would she sue X? The tech company did eventually take measures to stop the spread of the photos, including removing the posts, suspending accounts that shared them, and making it impossible to search “Taylor Swift.” Was it enough?


OK, first off, we're guessing that last one is NOT something Tay and her team want as a solution! She has albums to sell ffs! She wants her fans to be able to talk about her! Secondly, well, they took quite a long time to do anything at all. We saw dozens and dozens of Swifties post complaints for several hours before X took any action. Why?

We know when he first bought Twitter, Musk fired a LOT of the folks responsible for keeping the platform safe. After all, he was a self-professed champion of free speech. He wasn't going to stand for some super woke liberals trying to clamp down on… neo-Nazi hate speech…

Well, it looks like he’s finally learning Twitter’s previous owners weren’t being PC police, they were just trying to run a business. And in order to run X, he needs to keep it safe — if only to avoid getting himself into more legal trouble and prevent the exodus of advertisers!

On Friday, in the wake of the Taylor Swift AI debacle, the company announced it was building a “trust and safety center” in Austin, Texas, staffed by 100 full-time content moderators. Notably, human beings — not AI.

The main focus will actually be child sexual exploitation, which is a problem we didn’t even realize had been spreading on X. But with the lack of regulation, it makes sense. The darkest, most unaccountable parts of the internet always seem to fill up with such heinous content. And as we understand it, this kind of fake AI NSFW material is being used to bully underage teens even more often than it’s used against the famous (and famously litigious). In a statement, head of X business operations Joe Benarroch explained:

“X does not have a line of business focused on children, but it’s important that we make these investments to keep stopping offenders from using our platform for any distribution or engagement with CSE content.”

He did, however, say this would just be “a temporary action” which was being executed “with an abundance of caution as we prioritize safety on this issue.” Hmm. We have to assume these content moderators — being presumably decent human beings — would also be able to see things like the Taylor Swift AI trend much earlier and put a stop to it. We just hope Elon keeps the new team around long enough to make a real change on X.

[Image via MEGA/WENN.]


Jan 30, 2024 06:39am PDT
