AI-Generated Face-Swap Porn Should be Strictly Penalised

The rise of fake AI-generated porn appears unavoidable. One of the motivations behind it can be revenge. A small community on Reddit has created and fine-tuned a desktop application that uses machine learning to take non-sexual photos and transplant them seamlessly into pornographic videos.

A Motherboard report reveals an unsettling use of a technology known as "deepfakes". Within a month of locating a Redditor who used machine learning to swap the faces of mainstream actresses onto the bodies of women performing in porn movies, the outlet found people using an app based on the same technique to create videos from images of women they know.

Wrongdoers need only put a little effort into scraping social media accounts for photos and using web apps that find porn featuring women whose faces resemble the person they are targeting. Similar technology has been used in movies for years, but now that the AI can run on desktop GPUs or in the cloud, ordinary people suddenly have access to it and are using it in unsettling ways.

Fake Porn is Not New

Fake porn has been around for years: people have exploited their exes and celebrities online using photo-editing software such as Photoshop. Machine learning tools, however, now allow users to create far more realistic fake footage. All they need is a few hours of spare time and enough images of their victim. Anyone can collect images with an open-source photo-scraping tool that grabs publicly available photos of a person, then use free web-based tools to find a porn star who matches the victim. The same technique is also used to create fake news. It is therefore important to keep an eye on how this technology can be unleashed as a means of targeted abuse and an unspeakably disconcerting execution of revenge porn.

Law Needs to be Updated

If people don't know an image is photoshopped, they assume it shows someone real. The abuse is the same, the harassment is the same, and the adverse impact on the victim's family and employer is just as unbearable. There is no ground on which to argue that the harm is lesser because the image is photoshopped. AI-generated fake pornographic videos will only worsen the issue.

There are many examples in which this technology has been used for the wrong reasons and has affected people's lives. Creating such material may amount to harassment or a malicious communication. Civil courts also recognize a concept of "false privacy", meaning information that is false but nevertheless private in nature. There are copyright issues, too, in the re-use of images and video that the person sharing them did not create.

All these technologies are being created, and we all have to deal with them. Now is the time to talk about the ethical standards involved: what should people consider before using these apps, and what is the impact on the victim? The law should reflect that.
