Grand Valley Lanthorn

The Student News Site of Grand Valley State University

Swift Action: X leads curb in spread of celebrity Deep Fakes, legislation should follow

GVL Editorial

Nearly 20 years after the infamous sex-tape scandal involving former WWE wrestler Terry “Hulk Hogan” Bollea, which ultimately led to Gawker’s unceremonious demise, celebrities are facing new privacy-invading tools that remain completely unchecked from a legislative standpoint.

With the emergence of AI, celebrities are facing new invasions of privacy almost daily. Most notably, over the past week, sexually explicit AI-generated photos of Taylor Swift circulated on the social media giant X, formerly known as Twitter.

To say we are disgusted is an understatement. Regardless of how people feel about Swift, we can all agree that this breach of her privacy and manipulation of her image is heinous.

In response, X blocked users from searching “Taylor Swift” to slow the visibility and spread of the sexually explicit AI images. We are thankful that X took swift action to curb the spread of these explicit deepfakes.

“This is a temporary action and done with an abundance of caution as we prioritize safety on this issue,” said Joe Benarroch, head of business operations at X, in a statement.

Vogue reported that “Swift is reportedly considering legal action,” but with over 45 million views on one of the AI posts, the images have spread widely, raising serious concerns for everyday people about the use of AI.

Swift may have the funds to hire a large legal team to combat the spread of these images, but many people do not have that luxury and must instead rely on legislation to curb the nefarious use of AI.

There is only one problem: there aren’t any concrete, universal laws against using AI to make sexually explicit photos of non-consenting parties, at least not yet.

According to USA Today, only 10 U.S. states have laws against the creation and/or distribution of deepfake porn. California, Florida, Georgia, Hawaii, Illinois, Minnesota, New York, Texas, South Dakota and Virginia all have varying and inconsistent laws in place.

Most of these laws impose penalties ranging from small fines up to five years in prison, but the lack of legislation across the board is concerning.

In an interview with ABC News, the White House Press Secretary commented that they are “alarmed” by what happened to Swift and that Congress “should take legislative action.” 

Swift’s unfortunate situation affords an opportunity to bring the issue of AI deepfake porn to the forefront of American attention. Lawmakers should be more motivated than ever to find a way to defend people, both celebrities and non-celebrities.

The everyday American has no way of fighting back against AI-generated content of themselves, and this fact is worrisome to us. Jobs, relationships and personal image are just a few of the things at stake for people who have AI-generated “revenge porn” made of them without their consent.

Another troublesome consideration is who AI and deepfakes primarily target: young women. Samantha Murphy Kelly wrote in CNN Business that privacy-invading technology has been used against women for years, and that increased access to this technology has only exacerbated the issue. Furthermore, MIT Technology Review found that 90 to 95% of deepfakes since 2018 involve nonconsensual porn, with almost 90% of that porn featuring women.

However, in a step in the right direction, “Rep. Joe Morelle renewed a push to pass a bill that would make non-consensual sharing of digitally-altered explicit images a federal crime, with jail time and fines,” ABC News reported.

This is a great first step toward protecting vulnerable people who would otherwise slip through the cracks without laws like these.

We urge lawmakers to seriously consider the effects that explicit AI content can have on people’s livelihoods. Swift’s public image may not be permanently tainted by these photos, but that does not lessen the toll this ordeal has surely taken on her mental state. With more celebrities speaking out about the issue, we have hope that lawmakers like Rep. Morelle and others will continue to fight to introduce tighter, more specific laws that protect people against sexually explicit deepfake material.
