
Senators Call for Removal of X App from Apple’s ‘Safe’ App Store Amid Deepfake Concerns


[Image: A macro photograph of an iPhone screen displaying the glowing App Store icon. Reflected on the glass is a large stone gate and high brick walls enclosing a garden — the "walled garden" metaphor central to Apple's antitrust challenges.]

Grok has been making headlines lately for its lack of anything even vaguely resembling guardrails, and now three US senators are stepping in after Apple and Google have failed to proactively deal with the problem in their respective app marketplaces.

Elon Musk’s evil little AI bot recently gained image editing capabilities allowing any user on X to edit any image without permission. Unsurprisingly, the darker side of the internet responded in an entirely predictable way, flooding the social media platform with nonconsensual deepfake pornography.

Since then, things have only gotten more chaotic, as deviants and miscreants have tried to push the limits to see just how far Grok will go in undressing both adults and minors — and nobody seems to be doing anything to stop it, despite xAI’s Terms of Service clearly prohibiting “the sexualization or exploitation of children,” and “violating a person’s privacy or their right to publicity.”

A public statement from X’s Safety account insisted that it takes action against illegal content, and Elon Musk himself echoed that, adding that “Anyone using Grok to make illegal content will suffer the same consequences as if they upload illegal content.” However, despite Musk’s assurances on January 3, the tide of illicit content hasn’t just continued — it’s surged. That highlights a fundamental truth: moderation is a choice, and xAI is choosing not to moderate.

It also doesn’t in any way change the argument that Grok should be prevented from creating child sexual abuse materials (CSAM) in the first place. After all, while the public timeline is where these images are being weaponized, the availability of Grok as a standalone tool means this “undressing” is also happening behind closed doors — turning an iPhone into a more readily accessible private studio for nonconsensual imagery.

However, while X and xAI share the crux of the blame for this, Apple’s and Google’s hands aren’t exactly clean, either. As Caroline Haskins points out at Wired (Apple News+, via Daring Fireball), both companies have banned apps and developers for far less than what Grok is doing right now.


Apple and Google both explicitly ban apps containing CSAM, which is illegal to host and distribute in many countries. The tech giants also forbid apps that contain pornographic material or facilitate harassment. The Apple App Store says it doesn’t allow “overtly sexual or pornographic material,” as well as “defamatory, discriminatory, or mean-spirited content,” especially if the app is “likely to humiliate, intimidate, or harm a targeted individual or group.”

Caroline Haskins

In fact, Haskins goes on to note that Apple and Google have removed several apps that did precisely what Grok is now doing, as we also shared in 2024, following investigations by 404 Media and the BBC.

Yet, in today’s political climate, it seems that Apple has lost its moral compass, being more willing to ban an app that merely crowdsources information on ICE sightings than one that openly allows users to sexualize children. That’s a surprising turnaround for a company that once seriously considered building features into iPhones to scan for photos of child abuse.

With Apple and Google seemingly paralyzed by the fear of going up against the Musk empire, US lawmakers are now trying to light a fire under the two tech giants. Senators Ron Wyden, Ed Markey, and Ben Ray Luján have penned an open letter to Apple CEO Tim Cook and Google CEO Sundar Pichai asking for both X and Grok to be pulled from their respective app stores.

We write to ask that you enforce your app stores’ terms of service against X Corp’s (hereafter, “X”) X and Grok apps for their mass generation of nonconsensual sexualized images of women and children. X’s generation of these harmful and likely illegal depictions of women and children has shown complete disregard for your stores’ distribution terms. Apple and Google must remove these apps from the app stores until X’s policy violations are addressed.

Senators Ron Wyden, Ed Markey, and Ben Ray Luján

‘Just Plain Creepy’

While the letter is worded as a respectful request, the three senators aren’t pulling any punches here. They point out what X and Grok are doing, and then cite Apple’s policies, including one that gives it the right to remove apps for being “just plain creepy” — a standard that could easily be applied to what Grok has been up to lately.


Your app stores’ policies are clear […] Apple’s terms of service bar apps from including “offensive” or “just plain creepy” content, which under any definition must include nonconsensually-generated sexualized images of children and women. Further, Apple’s terms explicitly bar apps from including content that is “[o]vertly sexual or pornographic material” including material “intended to stimulate erotic rather than aesthetic or emotional feelings.”

Senators Ron Wyden, Ed Markey, and Ben Ray Luján

Then, to drive the point home, the open letter adds that “turning a blind eye to X’s egregious behavior” would completely undermine Apple’s most oft-used justification for maintaining control over the App Store: that it’s a “safe and trusted place for users around the world to discover and download apps.” There’s a veiled threat to be read between those lines: if life inside Apple’s walled garden isn’t any better than life outside, maybe it’s time for those walls to come tumbling down.

However, the senators don’t stop there. They also contrast Apple’s (and Google’s) inaction on Grok with their willingness to remove apps like ICEBlock and Red Dot.

Apps like ICEBlock and Red Dot, which let users lawfully report sightings of immigration enforcement activity, were swiftly taken down under pressure from the Department of Homeland Security — although ICEBlock did survive for at least three months first, possibly only because White House officials hadn’t realized they could request its removal. Neither app hosted illegal content; both were pulled on claims that they posed a risk to immigration enforcers.


The senators have called on Apple and Google to act promptly and remove the apps immediately, even if only temporarily, pending a full investigation. They have asked both companies for a written response by January 23, 2026, putting the onus squarely on them to address the issue.

Apple tightly controls the App Store and has often made controversial calls about what content is allowed on its platform. It has moved quickly against nonconsensual deepfakes and child abuse imagery in the past, which makes its sluggish response to Grok all the more conspicuous — and raises questions about how consistently its enforcement policies are applied, and how much scrutiny different kinds of content actually receive.

It was a different story in 2018, when Apple removed Tumblr from the App Store entirely over child sexual abuse material found on the platform, demonstrating a willingness to take decisive action against such content. The contrast with Grok highlights an apparent inconsistency in Apple’s approach, with the company seemingly reluctant to act promptly this time around.

Whether the senators’ pressure and the public scrutiny now focused on Apple and Google prompt a reevaluation of their content moderation practices remains to be seen. The outcome will say a lot about how the tech giants wield their control over app content — and how accountable they are for keeping their “safe” storefronts actually safe.
