The sudden surge in “nudify” apps and websites that use AI to undress women in photos is disconcerting. The arrival of AI in everyday life has brought with it a host of threats few anticipated, the latest being a pandemic of deepfake pornography.
The AI Pornography Pandemic
Time reported that the social network analysis company Graphika found more than 24 million people visited such websites in September alone. These AI undressing services are also marketed openly on social media: according to researchers, the volume of links advertising them has risen more than 2,400% on platforms including X and Reddit since the beginning of the year. The data suggests these services are being promoted aggressively and are finding a substantial audience.
Reports also suggest that these “nudify” services operate without the knowledge or consent of the person being undressed; most of the time, the individual is not even aware that a modified picture of them is in circulation. Moreover, most of the services work only on images of women, and searching “nudify” on Google directs users straight to such sites and apps.
These nudify services have also begun charging customers, a reflection of the sheer number of people using them.
According to Graphika, the surge in popularity of nudify and undressing apps is linked to the release of several open-source diffusion models, AI systems capable of generating images of significantly higher quality than those produced just a few years ago. Because the models are open source, the developers of these apps can access them freely and at no cost.
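To illustrate how low the technical barrier has become, here is a minimal sketch of generating an image with an open-source diffusion model via the Hugging Face diffusers library. The checkpoint name and prompt are illustrative examples only, not the specific tools or prompts these services use.

```python
# Minimal sketch: running a publicly released text-to-image diffusion model.
# The model ID and prompt are illustrative assumptions, not the actual
# tools or prompts used by nudify services.
import torch
from diffusers import StableDiffusionPipeline

# Download and load an example open-source checkpoint from the Hugging Face Hub
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16,
)
pipe = pipe.to("cuda")  # runs on a single consumer GPU

# Generate an image from a plain-text prompt and save it to disk
image = pipe("a watercolor painting of a lighthouse at dusk").images[0]
image.save("output.png")
```

A handful of lines of freely available code is enough to produce high-quality output, which is precisely why Graphika points to open-source models as a driver of the trend.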
Eva Galperin, director of cybersecurity at the Electronic Frontier Foundation, noted that deepfake and AI undressing has become especially prominent among high school students and other young people. While the problem was initially confined to undressing celebrities, it now extends to ordinary people as well.
TikTok has blocked the keyword “undress,” and other platforms are beginning to block such search terms as well.
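Keyword blocking of this kind is conceptually simple. As a rough, hypothetical sketch, a platform-side check might look like the following; the term list and function are assumptions for illustration, not any platform’s actual moderation logic.

```python
# Hypothetical sketch of a platform-side search blocklist.
# The blocked terms and function are illustrative assumptions only.
BLOCKED_TERMS = {"undress", "nudify"}

def is_blocked(query: str) -> bool:
    """Return True if any token of the search query is a blocked term."""
    tokens = query.lower().split()
    return any(token in BLOCKED_TERMS for token in tokens)

print(is_blocked("undress app"))   # True: contains a blocked term
print(is_blocked("photo editor"))  # False: no blocked terms
```

Real moderation systems are far more elaborate, handling misspellings, multiple languages, and evasive variants, which is why keyword blocking alone tends to lag behind determined evasion.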
Santiago Lakatos, an analyst at Graphika, emphasized how much deepfake technology has improved: “You can create something that actually looks realistic,” he said, noting that earlier deepfakes were often blurry.
So far, the closest existing legal framework is revenge porn law, which addresses cases where the victim may have participated in creating an image but not in sharing it. Even these laws offer victims little respite: most revenge porn cases are never registered or, worse, never result in justice, because the cases are difficult to prove and the legislation is lacking. Effective cyberlaws that address this issue and punish perpetrators accordingly are still needed.
While we are still working out how to address revenge porn properly, the advent of AI has brought forth a new pornography pandemic in the form of nudify. In most cases, victims are unaware that AI-generated images of them are circulating, and even when they do find out, the legal means of pursuing such cases remain obscure.
How do you remove images generated by AI and nudify apps?
Victims can attempt to have nudify and other AI-generated nude images removed through several avenues, including:
DMCA Takedown: Victims can potentially use the Digital Millennium Copyright Act (DMCA) to have content removed from websites like Pornhub. However, its application to fully synthetic deepfake images is uncertain: the DMCA has generally been used to remove images in which the victim actually appears, and for a digitally fabricated image it is not fully clear how the act would apply, since the creator could conceivably claim copyright over the image.
Terms and Conditions Violations: Social media platforms have their own terms of use that prohibit certain content. Users can report violations, but the platforms retain ultimate discretion in determining whether a violation occurred.
Court Order: A court order can be obtained through a lawsuit against the individual responsible for publishing deepfake content, especially in cases involving revenge porn, harassment, or privacy-related claims.
Search Engine Removals: Victims can ask search engines such as Google to delist or delink the offending images so that the content no longer appears in search results.
Other Resources: Organizations such as the Cyber Civil Rights Initiative and the Federal Trade Commission offer support to victims. Google’s inclusion of deepfake pornography in its content removal portal reflects a positive trend in addressing AI-generated imagery and pornography.
Which options apply depends on the specifics of each case, and victims may seek support from organizations that provide resources and guidance on dealing with AI-generated pornography and nudity.
With AI developing at an unprecedented rate, and new versions and updates announced seemingly every other day, it is becoming harder to track and monitor what this technology is capable of. Policymakers are still familiarizing themselves with the relevant concepts and developments, particularly nudify apps and deepfakes, or have yet to do so, which delays appropriate legislative responses. Most people are uncertain about the technology’s current capacities, let alone its future potential, which further slows the drafting of policy. Read more about the EU being the first to recognize the need for AI-regulating legislation here.