Apple has reportedly removed a group of AI-powered image generation apps from its App Store over concerns that they could be used to create nude images of people without their consent. A 404 Media report found that these apps were being promoted through Instagram advertisements claiming users could "undress any girl for free." The advertisements redirected users straight to Apple's App Store, where the apps were listed as "art generators."


The report detailed how these apps were advertised and linked on Instagram. According to 9to5Google, Apple reached out to 404 Media for more details after the article was published and promptly removed the apps from its platform once it received direct links to the specific ads and App Store pages.




Apple Will Need To Stay On Alert


This recent action by Apple highlights an increased focus among app store operators on addressing inappropriate and offensive content, particularly apps that facilitate the creation of non-consensual explicit images. Although Apple removed three such apps from its App Store, the incident suggests that ongoing vigilance, along with support from third parties such as 404 Media, may be necessary to effectively monitor and take down policy-violating apps.




Apple had to rely on information provided by 404 Media to identify and remove these specific apps, underscoring the challenge the company faces in proactively detecting and addressing such violations.




The future prevalence of similar apps remains uncertain, but it is evident that Apple will need to remain vigilant to prevent such content from reappearing on its platform.