
Elon Musk Reposted a Deepfake of Kamala Harris, May Violate X Policy


On Friday evening, Elon Musk reposted a deepfake video of Vice President Kamala Harris on X — a move that may violate his own platform’s policy on synthetic and manipulated media, The New York Times reported.

The video was originally posted by the user @MrReaganUSA, who noted that the clip was a “parody” of Harris’ first campaign ad since becoming the presumptive Democratic Party nominee for the 2024 presidential election.

The clip appears to have been digitally altered to add a new voice-over that sounds like Harris.

In the video, the edited voice-over says, “I was selected because I am the ultimate diversity hire. I’m both a woman and a person of color, so if you criticize anything I say, you’re both sexist and racist.”

The deceptive voice-over also calls President Joe Biden senile and says Harris and Biden are "deep state" puppets.

In his repost of the clip, which has been viewed more than 117 million times, Musk failed to note that the video had been edited, writing only: “This is amazing 😂.”

And that may just run afoul of X's policy on synthetic and manipulated media, which states: "You may not share synthetic, manipulated, or out-of-context media that may deceive or confuse people and lead to harm ('misleading media')."

X says that for the company to take action and remove or label a post under that policy, the post must include media that is "significantly and deceptively altered," be "shared in a deceptive manner or with false context," or be likely to cause "widespread confusion on public issues."

The company says it will consider factors including whether "any visual or auditory information (such as new video frames, overdubbed audio, or modified subtitles) has been added, edited, or removed that fundamentally changes the understanding, meaning, or context of the media."

The deepfake boom

Deepfakes use artificial intelligence to replace a person’s likeness with that of someone else in video or audio footage.

Audio deepfakes are relatively simple to create but difficult to detect, studies have found.

A number of politicians have already fallen victim to the technology, highlighting its potential to wreak havoc around elections.

In one clip circulating on social media last year, Hillary Clinton appeared to give a surprise endorsement of Florida Governor Ron DeSantis. However, the clip was revealed to have been AI-generated, Reuters reported.

Biden was also on the receiving end of a deepfake following his announcement that he was dropping out of the 2024 presidential election race.

A video on social media appeared to show the president hitting out at his critics and cursing them. But again, the footage was a deepfake, per the AFP news agency.


