- Clothoff's global expansion and activities: Clothoff is a leading app for generating fake nudes from images of real people, and it plans to expand globally. It has resisted attempts to unmask its operators and has bought up a network of nudify apps. It runs on an annual budget of around $3.5 million and markets itself through Telegram bots and X channels. It has a large-scale marketing plan to expand into the German market, followed by the British, French, and Spanish markets, specifically targeting men between 16 and 35 who consume this kind of content.
- Legal battles and challenges: San Francisco's city attorney sued Clothoff and other nudify apps in hopes of forcing a shutdown, but deputy press secretary Alex Barrett-Shorter said attempts to serve Clothoff through legal channels have not been successful. The Take It Down Act has passed, making it easier to remove AI-generated fake nudes, but experts expect legal challenges to the law over censorship concerns.
- Clothoff's denial and user behavior: A Clothoff spokesperson named Elias denied knowing the people flagged in the investigation and disputed the budget figure. He also denied that the app uses celebrity influencers or that it can be used to nudify images of minors, though one user's experience showed otherwise. Some users treat the app flippantly, while others worry about the consequences. Early victims of nudify apps may have limited legal recourse because the relevant laws remain unclear.