AI-Generated Face-Swap Porn Should be Strictly Penalised

February 5, 2018

An increase in the amount of fake, AI-generated porn is unavoidable. One of the motives behind it is revenge. A small community on Reddit has created and fine-tuned a desktop application that uses machine learning to take faces from non-sexual photos and transplant them seamlessly into pornographic videos.

A Motherboard report reveals an unsettling use of the technology behind "deepfakes". Within a month of locating a Redditor who used machine learning to swap the faces of mainstream actresses onto the bodies of women performing in porn videos, the outlet found people using an app based on the same technique to create videos from images of women they know.

Wrongdoers need only put some effort into scraping social media accounts for photos and using web apps that find porn featuring women whose faces resemble the person they are targeting. Similar technology has been used in movies for years, but with the AI now running on desktop GPUs or cloud computing, ordinary people suddenly have access to it and are using it in unsettling ways.


Fake Porn is Not New

It has existed for years, as people have exploited their exes and celebrities online using photo-editing software like Photoshop. Nowadays, however, machine learning tools allow users to create far more realistic fake footage. All they need is a few hours of spare time and enough images of their victim. Anyone can collect images using an open-source photo-scraping tool, which grabs photos of a person that are publicly available online, and can then use free web-based tools to find a porn performer who resembles the victim. The same technique is also used to create fake news. So it is important to keep an eye on how this technology can be unleashed as a means of targeted abuse and as an unspeakably disconcerting form of revenge porn.


The Law Needs to be Updated

If people don’t know an image is photoshopped, they assume it depicts someone real. The abuse is the same, the harassment is the same, and the adverse impact on the victim’s family and employer can be unbearable. There is no ground for claiming the harm is lessened simply because the image is photoshopped. AI-generated fake pornographic videos will only worsen the issue.

There are many examples of this technology being used for malicious ends, and it has damaged people’s lives. Its use may amount to harassment or a malicious communication. Civil courts also recognise a concept of “false privacy”, meaning information that is false but nevertheless private in nature. There are copyright issues, too, in the re-use of images and video that the person did not create.

All these technologies are being created, and we all have to deal with them. So now is the time to talk about ethical standards: what should we be thinking about before people use these apps, and what about the impact on the victim? The law should reflect that.

1 Comment so far

  BruceThomson — 6 February, 2018, 04:45

    Interesting article.
    The arbitrariness of a face…
    – it can be stolen and used for porn.
    – surgically it can be put on another head
    We are in an era where the real you is not your appearance but your behaviour as experienced by those around you (if they can authenticate that it’s really you inside that avatar they’re with).
    – In recent years I’ve spent many hours in virtual reality worlds like SecondLife and HighFidelity and IMVU. I was astonished late one night after being in these different worlds, to emerge with the amazing feeling that I hadn’t actually ’emerged’, but simply transferred into a different ‘avatar’, my biological body.
    – That had a major impact on my self-perception, and perception of others. The ‘avatarness’ of our bodies. I judge people far less by appearance now, and much more by their behaviour.

