ClothOff is a highly controversial AI-powered "nudify" platform that uses deep learning models, including generative adversarial networks (GANs) and diffusion algorithms, to digitally remove clothing from uploaded photos and videos, producing hyper-realistic fake nude imagery.
It offers dedicated apps for Android, iOS, and macOS, along with features such as DeepNude AI image generation, custom undress videos with realistic motion and expressive details, face swaps (in standard, video, and porn-specific variants), multi-uploads, adjustable body parameters (e.g., breast and butt size), sex poses and sets, queue skipping, and an API for automated adult content creation.
Marketed aggressively as "Your TOP-1 Pocket Porn Studio" and "The New Porn Generator," the service provides free trials for basic undressing, with premium features unlocked via one-time purchases of VIP Coins (no recurring subscriptions) that grant higher quality, faster processing, and extra options. It promotes purported health benefits of sexual activity and masturbation, while claiming strong privacy measures: no data storage, automatic deletion of uploads, no distribution without consent, and technical blocks that allegedly prevent processing of minors' images (with automatic account bans for attempts). ClothOff strictly prohibits non-consensual use, illegal activities, and content involving anyone under 18, and states that it partners with Asulabel to donate funds supporting victims of AI abuse.
Despite these claims, ClothOff has faced widespread ethical condemnation and legal challenges for enabling non-consensual deepfake pornography and child sexual abuse material (CSAM). A federal lawsuit filed in October 2025 in New Jersey (Jane Doe v. AI/Robotics Venture Strategy 3 Ltd., the operator registered in the British Virgin Islands) alleges that the platform facilitated hyper-realistic fake nudes of a minor created from social media photos, invoking the TAKE IT DOWN Act to seek mandatory removals, data destruction, bans on AI training use, damages of up to $150,000 per image, and a potential shutdown. Supported by Yale Law clinics, the case highlights harms such as bullying, harassment, and emotional distress.
Investigative reports from Der Spiegel, Bellingcat, Ars Technica, The Guardian, and others link the platform's operations to regions of the former Soviet Union (including Belarus), document its acquisition of multiple rival nudify services, and expose its role in global abuse cases, particularly school incidents involving minors. The platform has been blocked in Italy by the country's Data Protection Authority for unlawful data processing, faced advertising bans on Meta platforms and restrictions in the UK, and seen its official Telegram bot removed, yet it continues to attract millions of monthly users while resisting regulatory pressure. ClothOff consistently denies liability for user misconduct and remains operational amid escalating global demands for stricter regulation of non-consensual AI-generated intimate imagery.