
LifeFuel Porn may soon feature people (AI) who don't exist


guaucel
Major ★★★★★
Joined: Dec 11, 2017
Posts: 2,380

Porn has often driven technological innovation, but research into deepfakes raises questions about its potential misuses.



A professional headshot of a man with his face carefully poised in a neutral expression, the kind that might be used in an acting portfolio. A teenage girl with red hair and glasses, pouting at the camera against an outdoor backdrop. These photographs are the sort that saturate our online world, ones that you might find on a Facebook profile or LinkedIn page. The only difference? These people don’t exist. They are the product of an algorithm: two neural networks trained on images and pitted against each other to create convincing fakes – and experts believe that they could soon replace pictures of real people in everything from the profiles that we match with on dating apps to the bodies that we watch in porn.


Designed by former Uber engineer Phil Wang, thispersondoesnotexist.com makes use of code called StyleGAN (a style-based generative adversarial network, or GAN). Wang has used this code to generate a seemingly endless stream of faces.


“Our sensitivity to faces, when you really think about it, is a product of evolution for successful mating,” he told me. “What the site really demonstrates is that even for a data distribution that we are so well crafted to understand as human beings, the machine can pick apart all the relevant features and recompose them in a way that’s coherent.”
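For anyone wondering what a "generative adversarial network" actually is, here is a minimal sketch of the idea in PyTorch. It is not StyleGAN itself, which is a far larger architecture; the network sizes, the assumed 64x64 image shape, and the hyperparameters are illustrative assumptions, not details from the article. The core trick is two networks trained against each other: a generator turns random noise into images, and a discriminator tries to tell those images apart from real photos.

```python
# Minimal GAN sketch (assumed sizes and hyperparameters, not StyleGAN).
import torch
import torch.nn as nn

latent_dim = 128
img_dim = 64 * 64 * 3  # flattened 64x64 RGB image (assumed shape)

# Generator: random noise -> fake image pixels in [-1, 1].
generator = nn.Sequential(
    nn.Linear(latent_dim, 1024), nn.ReLU(),
    nn.Linear(1024, img_dim), nn.Tanh(),
)
# Discriminator: image -> single real/fake logit.
discriminator = nn.Sequential(
    nn.Linear(img_dim, 1024), nn.LeakyReLU(0.2),
    nn.Linear(1024, 1),
)

loss = nn.BCEWithLogitsLoss()
opt_g = torch.optim.Adam(generator.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(discriminator.parameters(), lr=2e-4)

def train_step(real_images):
    """One adversarial update on a batch of flattened real images."""
    batch = real_images.size(0)
    real_labels = torch.ones(batch, 1)
    fake_labels = torch.zeros(batch, 1)

    # 1) Discriminator: score real images as real, generated images as fake.
    fake_images = generator(torch.randn(batch, latent_dim)).detach()
    d_loss = loss(discriminator(real_images), real_labels) + \
             loss(discriminator(fake_images), fake_labels)
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # 2) Generator: try to make the discriminator label its output as real.
    g_loss = loss(discriminator(generator(torch.randn(batch, latent_dim))), real_labels)
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()
    return d_loss.item(), g_loss.item()
```

Run long enough on a large dataset of face photos, the generator's output starts to resemble the kind of portraits described above; StyleGAN layers many refinements on top of this basic competition, but the adversarial loop is the same.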


Wang’s innovation is fascinating and seemingly innocuous, yet it shares the same technological basis as far more sinister creations. Over the past few years GANs have been widely misused to create malicious content, such as material that maps the faces of celebrities onto existing, often pornographic, footage – known as deepfakes.


“A recent study found that 96% of all deepfake videos were pornographic and in many cases are being used to harass and terrorize women,” says Rachel Thomas, founder of fast.ai and an expert in applied data ethics. “In general, our legal system has been slow to catch up with addressing [this kind of] sexual imagery and the use of AI is deepening and accelerating this problem.”
(worried whore)



The core of the matter is that women must be replaced.
 
ok but porn is gay
 
Tfw even software gets laid but you don't
 


This whore is concerned about "objectification of women", but when she dresses like a whore to turn Chad on, that's "not objectification of herself"
 
This would mean that real life looking women can be in hentai and get fucked by tentacles and shit
 
