Don’t F*ck With Swifties: Fans Track Down Canadian Man Who Posted AI-Generated Taylor Swift Pics On X

As AI-generated photos of Taylor Swift, currently one of the most influential celebrities in the world, went viral across X, one of the many trolls and incels who shared the deepfake pornographic pics f*cked around and taunted Swift’s fans. He, of course, found out what Swifties are capable of.

“I don’t care how powerful Swifties are, they’ll never find me,” the X user, who goes by the handle @Zvbear, posted, adding that “I’m like the Joker I use fake numbers and addresses.” 

Swifties took up the challenge, tracking him down and attributing the account to Zubear Abdi, a 28-year-old Somali man living in Canada.

@Zvbear later said he was taking his account private until the “tsunami passes,” conceding to the fans who had successfully doxxed him.

“Now I’m dealing with Swifties. A whole different animal,” he continued. “This is a Tactical Retreat, every great army has done this.”

X has blocked searches for “Taylor Swift” in an attempt to prevent the further spread of the AI-generated images. Users searching for “Taylor Swift” or “Taylor Swift AI” are met with an error message, although some have found workarounds by slightly altering the search terms or using quotation marks.

X later issued a statement saying that they have “a zero-tolerance policy” for Non-Consensual Nudity (NCN) images, and are “actively removing all identified images and taking appropriate actions against the accounts responsible for posting them.”

On Friday, White House press secretary Karine Jean-Pierre called on lawmakers to move to protect people from this type of content. Politicians echoed the call. Rep. Yvette D. Clarke (D-NY) warned that this is nothing new: “For yrs, women have been targets of deepfakes w/o their consent.” Advances in technology, however, have made creating deepfakes easier and cheaper.

Rep. Joe Morelle (D-NY) called for the passage of the Preventing Deepfakes of Intimate Images Act, emphasizing that deepfakes don’t just happen to celebrities like Swift, “they’re happening to women and girls everywhere, every day.”


Information for this story was found via the Wall Street Journal, BBC News, X, and the sources and companies mentioned. The author has no securities or affiliations related to the organizations discussed. Not a recommendation to buy or sell. Always do additional research and consult a professional before purchasing a security. The author holds no licenses.
