
Teen Marvel actress speaks out about sexually explicit deepfakes: "Why is this permitted?"

Xochitl Gomez, a seventeen-year-old Marvel star and "Dancing With the Stars" contestant, said she discovered nonconsensual, sexually explicit deepfakes featuring her face on social media and was unable to get the content removed.

Gomez said on the January 10 episode of "The Squeeze," a podcast hosted by actor Taylor Lautner and his wife, Taylor Lautner, that she had come across sexually explicit deepfakes of herself on X, formerly known as Twitter. Gomez, who plays America Chavez in "Doctor Strange in the Multiverse of Madness," said that when she asked her mother about the content, she learned her team had already tried and failed to have it taken down.

"I was grossed out by it, didn't like it, and just wanted it taken down. 'Down. Take this down. Please,' was the main thought in my mind," Gomez said on the podcast. "It wasn't that I felt it was an invasion of my privacy; it was more that it's not me. This isn't anything like me at all. And yet it has my face."

A search by NBC News on Friday readily turned up a number of deepfakes featuring Gomez on X. A representative for X did not immediately respond to a request for comment.

In June 2023, NBC News reported that nonconsensual deepfakes of young female social media personalities were circulating on X, despite the platform's policies prohibiting nonconsensual nudity. Some, but not all, of the material was removed after X was contacted.

Girls and women, some prominent and some not, have joined Gomez in speaking out against the growing crisis of nonconsensual sexually explicit deepfakes, which typically use artificial intelligence to graft a victim's face onto a pornographic image or video. Searches on Google and Microsoft's Bing for prominent women's names combined with the term "deepfakes" surface such content in the top image results, sometimes alongside links to websites that generate revenue from it. Both Google and Bing offer takedown request forms for victims of nonconsensual deepfakes and their representatives.

"It's just strange to think that if someone looked up my name, that would be the first result," Gomez said on the podcast. "They cannot be taken down."

There is currently no federal law in the United States specifically targeting nonconsensual sexually explicit deepfakes, and state laws addressing such content are a patchwork at best. A federal bill that would make the nonconsensual sharing of such material a crime has yet to advance.

A clipped segment of Gomez's podcast episode addressing the deepfakes was posted to X last week, where it has drawn more than 7 million views, according to X's metrics.

"Why is it so hard to take down? The only thought that entered my mind was, 'Why is this permitted?'" Gomez said. "Because I knew in my mind that it wasn't me, it didn't really bother me or anything like that. It was simply something that made me feel extremely uneasy because I was unable to get it removed."

She added, "Nothing good comes from thinking about it. I put down my phone […] I do my skin care routine and hang out with my friends, anything to help me forget what I just saw."