[ Removed by moderator ]

707 points by wiredmagazine 23 hours ago on reddit | 149 comments

TrueReddit-ModTeam | 15 hours ago

Your content at /r/TrueReddit was removed because of a violation of Rule 5:

> Immediately post a submission statement according to the following or the post may be removed.

> Submission statements should be: a 2+ sentence comment in reply to the post, in your own words, and a description of exactly why the post is relevant and insightful

If you provide a submission statement and respond to this message, the post will be reinstated. Thank you.

Brittanicals | 21 hours ago

In my town. I worked at this school and quit over issues with admin. https://www.fox13seattle.com/news/everett-middle-schooler-ai-porn. The principal was said to have asked to see the students' phones with the AI porn, then tried to delete the pics before calling police.

AdSevere1274 | 23 hours ago

I have also read that in Japan they are using pictures of cats on dating sites... Hopefully Instagram will self-destruct..

>In South Korea and Australia, schools have given pupils the option not to have their photos in yearbooks or stopped posting images of students on their official social media accounts, citing their use for potential deepfake abuse. “Around the world, there have been cases where school images were taken from public social media pages, altered using AI, and turned into harmful deepfakes,” one school in Australia said. “Imagery will instead feature side profiles, silhouettes, backs of heads, distant group shots, creative filters, or approved stock photography.”

iifwe | 21 hours ago

One month later: "new AI able to generate accurate image of person based on side profile, silhouette, back of head, and the kinds of stock photography the person likes".

AdSevere1274 | 21 hours ago

LOL.. yup.. or a cat or dog version of one's face..

RueTabegga | 22 hours ago

I cannot wait to see what they come up with. On one hand it's very creative, but on the other, what is the point of a yearbook at all if not looking back through the faces?

NOTTedMosby | 22 hours ago

Bro they just don't want their faces on fake naked bodies fucking a dude they never met.

ShotFromGuns | 21 hours ago

I think they get it but are pointing out what a fucking bummer it is that these kids aren't going to get real yearbooks that memorialize a part of their childhood because a bunch of billionaires got desperate to shove pointless technology down everybody's throats.

JohnTDouche | 15 hours ago

Do kids want to do yearbooks? Do they have the option not to?

AdSevere1274 | 22 hours ago

It was always cringe for people to remember.. Now they don't have to.. It is for the best!!

Stanford_experiencer | 22 hours ago

What the fuck?

AdSevere1274 | 21 hours ago

Well.. that's what I think.. High school is not always the best time for all people to remember.. in my opinion. That is where comparisons start and never end!

Dr_Weed_MD | 21 hours ago

I'm sorry you don't have good memories from high school.

AdSevere1274 | 21 hours ago

Who is stopping you from having good memories.. Don't you have a cell phone now.. By the way I never graduated from high school... I went to college and bypassed the end years!.. but I know people who did.. So you are making false assumptions about me, that is the nature of bullying in schools.. People assume and rant a lot.. You proved my point.

TeamRedundancyTeam | 17 hours ago

This feels like a pointless reaction to try to ease people's minds without actually doing anything. People can't prevent their photos from being online; they often post them themselves, or their friends do. They can have photos taken of them in public spaces. Changing yearbooks won't change a damn thing.

They're going to have to make it a more serious crime, especially when they're underage, and actually investigate and punish it.

AdSevere1274 | 17 hours ago

People can prevent it alright by removing themselves from social media.. Less is more. They can ask people not to post their pictures online. Underage / overage makes no difference.. Bullying is bullying.. The tech bros need a lesson..

daretoeatapeach | 16 hours ago

Most likely, the deepfakes are being made by people who know the victims and attend their schools. Most likely, a good portion of those students own cell phones. And if they don't own cell phones their friends own cell phones. So all they have to do is take a photo of the victim at lunch when she's talking to friends or whatever. So, avoiding social media is not going to help.

AdSevere1274 | 16 hours ago

They can, but then everybody will know who they are.. You know the guilty parties will be identified.. What people don't need, first of all, is social media profiting from people's miseries, so the age of fixation on faces and bodies will hopefully fade..

BarryRightWrong | 23 hours ago

Jfc. These machine learning tools should never have left research facilities. That's the only place they can do anything close to good.

Everything else seems to be slop, porn, misinformation, a con, or something far worse. This would be squarely in the 'far worse' category.

64_hit_combo | 23 hours ago

I think it's absolutely intentional that the effective 'service' of these products is to spit out anything to make others doubt what they're seeing is real

DangerousLoner | 22 hours ago

Absolutely my thought too. The powers that be wanted this stuff out before the Epstein photos and videos leak so they can blame AI. The sooner the better to get people to doubt their eyes and ears.

FilthySJW | 16 hours ago

So all of the frontier AI labs are a giant conspiracy to undermine our collective agreement on what real things are online?

Okay. Yeah, seems totally plausible.

LoaKonran | 22 hours ago

This is what happens when the money men get involved. Something that might have been useful 10 years down the line instead gets packaged and sold for immediate gain as they search for literally any use for it before the bubble bursts. Tech bros and grifters all the way down.

hudson27 | 21 hours ago

No, this is what happens when technology gets ahead of policy. Governments took over a decade to figure out what to do with the internet when it first came on the scene.

caboosetp | 21 hours ago

The '90s internet was still a wild west, it just had a much higher barrier to entry. Regulations and conventions still had to catch up. Accidentally watching a cartel member skin someone alive is a lot harder to do now.

AI got this insane push to be fucking everywhere, and the technology was available in people's hands to do it.

cheerful_cynic | 17 hours ago

And then they all do their absolute utmost to clamber over each other making bets & offset bets & algorithmic bets via Wall Street, gambling everyone's retirement on how much money they're gonna make using their barely developed new technologies.

kermityfrog2 | 22 hours ago

Yeah, and nobody holds these corporations responsible. They take all the profits and socialize all the problems and fallout. They will damage a whole generation of young people and face no punishment.

mrbombasticat | 15 hours ago

First time?

LabRatsAteMyHomework | 22 hours ago

Bro I just want legit reviews on products, not AI slop with intentional spelling mistakes brigading every product I research for personal use.

anormalgeek | 18 hours ago

At this point, Pandora's Box has been opened though. There are already models out in the wild that can be run 100% locally.

TwistedBrother | 15 hours ago

It feels like the barrier to entry for local models is high enough at present that those who would do anything personal with them have the sense to either not do that or not share it.

What's happening here is that it's become so frictionless that muppets who have more computer access than sense are just making noods on point-and-click platforms. These platforms are services connected to Discord servers that get passed around. Kind of like Midjourney but utterly unregulated and ultra dodgy.

It's kind of challenging to remove local models when literally billions of copies have now been downloaded. But let's put that in perspective: this didn't happen immediately after Stable Diffusion was released. It only became a serious concern once platforms allowed photo editing on request and services could be run in a point-and-click manner.

Also, open models are almost always nerfed, and deliberately so. You get Ken dolls or body horror. It takes effort to add the features you want.

CaffeinatedT | 22 hours ago

If populations have atrophied brains, are desperate for employment, and are disoriented by a breakdown of reality and information, that benefits big money interests, so of course it gets pushed.

b88b15 | 23 hours ago

I don't see how we stuff this genie back in the bottle.

Master-Ad-5153 | 21 hours ago

There's the incredibly expensive and difficult legal route - holding the AI companies, their subsidiaries, suppliers, etc. responsible and punishing every node in the chain with applicable sentencing from CSAM laws already on the books. Assuming full cooperation and convictions for all, that's going to cost insane amounts of legal fees and take upwards of decades to conclude. And these actions won't stop the problem.

Or... Data centers are basically the same as warehouses (not advocating for anything in particular)

anormalgeek | 18 hours ago

> the AI companies

It's too late for that. There are already AI models that you can download from file-sharing sites and run locally on your own hardware. The tech is already out in the wild.

stay_fr0sty | 17 hours ago

I could create a deepfake nude right now on my gaming pc, all locally. It’s not difficult (and it’s only getting easier).

I could then use TOR to upload the deepfake to the internet.

I could then use bitcoin to buy a $1 SMS service number in Nigeria. After that I can send the link to any phone I want from my TOR browser.

All that would be completely untraceable.

There is no putting deepfakes back in the bottle. They are only going to get more prevalent and more realistic. It sucks.

USMCLee | 19 hours ago

I agree. Using the CSAM laws to hold everyone involved accountable when AI is used to create CSAM is the way to go.

TAKEitTOrCIRCLEJERK | 19 hours ago

there's too much of it. we could run every court and DA office round the clock for years without prosecuting a fraction of the offenses.

USMCLee | 19 hours ago

Sounds like a great job creation program.

Stanford_experiencer | 12 hours ago

The entire reason CSAM is illegal is because it's an image of real abuse. Not because of obscenity. You can't have a similar sentence for the same reason rape is almost never a capital offense. Flattening the sentences gives offenders a clear incentive to escalate.

ranninator | 19 hours ago

We love our glorious warehouses, full of shiny expensive wares... And, of course, we absolutely DISAVOW anything unfortunate that might befall them as a consequence of the actions of the shiny little wares they house.

Stanford_experiencer | 12 hours ago

are you okay jesus christ

SunkEmuFlock | 17 hours ago

We go full Dune universe.

b88b15 | 16 hours ago

Claude clearly violates the Butlerian Jihad.

anormalgeek | 18 hours ago

We don't. At least not on the tech side.

It's like trying to stop gun deaths once the secret of gunpowder got out. You'll never be able to stop it on the tech side anymore.

The best option (although still definitely not perfect) is passing and enforcing laws regarding nonconsensual porn production and distribution. Most US states have done so.

Scarlet_Bard | 23 hours ago

When the AI server farm in your town isn’t training AIs to fire everyone with a desk job or conducting mass surveillance on American citizens, it’s producing deepfake child porn. This is the glorious AI future these tech oligarchs are shoving down our throats.

SunkEmuFlock | 17 hours ago

Don't sell these server farms short. They're also great wartime targets because the government is using them for wartime duties. That data center next to your neighborhood? If shit pops off hard enough, someone will send a missile into that building. Good luck!

Relevant-Doctor187 | 21 hours ago

Threaten to charge the owners of each AI company with producing child pornography and this will clear up fast.

Course it'll probably turn into more spying on law-abiding citizens in the end.

no_dice | 22 hours ago

I have a 13-year-old girl and I tell her "the internet is permanent" all the time. This deepfake stuff terrifies me because at the end of the day it doesn't really matter if it's not a real image in terms of victimization, and the punishment for doing these things is basically nothing.

A kid up the street from us created a video deepfake of one of his teachers and circulated it on Snapchat; he got a one-week at-home suspension and that was it.

Stanford_experiencer | 22 hours ago

>at the end of the day it doesn’t really matter if it’s not a real image in terms of victimization

It absolutely does.

Raerth | 21 hours ago

I think you're misunderstanding the point they're making.

Stanford_experiencer | 21 hours ago

I was trafficked as an infant. A fake photo is not the same as real harm.

no_dice | 20 hours ago

I in no way, shape, or form equated a picture being circulated to peers with being trafficked? What I'm saying is that if a teenage girl has a photo circulated of her, it doesn't matter all that much whether it's real in terms of how her peers will react to it and treat her as a result.

-Speechless | 19 hours ago

bean soup theory. not everything on the internet is catered to you specifically.

Eriiiii | 20 hours ago

shame they didn't teach you reading comprehension while they had ya

Stanford_experiencer | 20 hours ago

?

Eriiiii | 20 hours ago

The post you originally responded to is saying exactly what you're saying, but you read it wrong.

flaginorout | 22 hours ago

Make it a felony (if it doesn’t already fall under a felony statute).

Penalty for juveniles is 10 days in the clink. On probation for 1 year and they aren’t permitted to use a smart phone, at all, and can’t use a computer for anything other than school work or job applications.

For adults? They should know better.

1 year in the clink.

Visstah | 22 hours ago

There probably should be PSAs to let everyone know this, but it absolutely is a felony, treated exactly the same as possessing or distributing regular child sexual abuse material. Any adult doing it will almost definitely get more than a year.

Genesis72 | 22 hours ago

As much as this is a gross problem that needs solving, I'm not sure that making a whole bunch of children felons is the right way to go about it.

flaginorout | 22 hours ago

Why? Their record is expunged when they’re 18 anyway.

This crime can be socially and mentally devastating for the victims. IMO- it’s 10x worse than any sort of traditional bullying.

Zero tolerance.

nondescriptzombie | 21 hours ago

How about we make it illegal for AI to produce deepfake nudes?

Why is this even a feature set?

overdrivetg | 19 hours ago

This would mean making all faceswaps illegal, which (arguably) have legitimate uses.

Also, the reality is that anyone with scissors and glue (or a pencil and artistic talent) could always have made "[deep]fake nudes". Does that mean we should go after scissors and glue and pencils next?

I can see a good argument that sharing these kinds of images is where you cross the line into tangible harm, and therefore where enforcement should focus, but my belief in the odds of a reasoned discussion here on this topic is about 0.00%.

Stanford_experiencer | 12 hours ago

>I can see a good argument to be made that sharing these kinds of images is where you cross the line into tangible harm and therefore where enforcement should focus

It's the only way to do it without making things worse. Either you trust someone to have a computer, or you don't.

lawrenceugene | 21 hours ago

Sex offenses for juveniles are not expunged.

Genesis72 | 21 hours ago

Because a system based on harsh retribution like this doesn’t really work when the perpetrator isn’t capable of making informed decisions in the way an adult is.

Kids make stupid, shitty decisions all the time, because they’re kids and their brains don’t work in the same way that adults do. That’s why we build a society that tries to protect them.

There should be consequences. But incarcerating a child for over a week for a cyber crime, making them a felon, and introducing a strong chance for them to be parole violators (and therefore pushing them even farther into the carceral system) doesn’t solve the problem.

We need to be addressing the root of the issue: penalizing and criminalizing hosting these services, and penalizing parents whose children offend.

theshadowofself | 20 hours ago

I agree with most everything you say except charging parents (unless it's proven they helped in the creation of the circulated images). Then you're breaking up the family of a disturbed child (if they're making this to begin with), but what happens to them if their parents are arrested? I'm genuinely curious how you think it would play out.

It's an idea that might sound reasonable in theory but in practice will likely be anything but. There was the story recently about a woman charged with child abuse/neglect because she let her 9- and 6-year-old walk a half mile by themselves across a busy street and the 6-year-old got hit by a car. A tragic accident turned into negligent homicide by an overzealous (likely racist) prosecutor hell-bent on someone paying. Or the legislation that lets a mother be charged with failure to protect her children if she stays in a violent relationship, ignoring the fact that most of these women are abuse victims themselves and have nowhere to go. Parents do need to be accountable for their children, but we also need to recognize there are nuances in every case, and a kid doing something like this doesn't automatically imply any wrongdoing on the parents' part. Especially if the kid is in high school.

donkeyrocket | 20 hours ago

The core of the problem isn't being addressed though and it'll just be a long line of children racking up felonies and other children traumatized by their fakes being distributed. It'll change nothing as the kids don't have the mental maturity or parental oversight to grasp the severity of what they're doing... or they wouldn't be doing it in the first place.

Accessibility is the main driver. And the tools having zero guardrails is the other. Put the onus and liability on the companies providing these tools and you'll see these instances plunge.

If your tool is so open-ended that you cannot prevent it from creating deepfake nudes or any sort of pornography without flagging it, then it should be outlawed. The big bad word "regulation" comes to mind.

slainascully | 22 hours ago

The victims - overwhelmingly teenage girls - have to live with these crimes forever. I think a small amount of actual punishment for the perpetrators would make quite a difference.

Khatib | 21 hours ago

Yeah, but dumb kids aren't deterred much by consequences. Most will just assume they won't get caught, the same way they do with other illegal things like alcohol and drug use.

The regulations need to go on the software more than the kids. Both, really, but we need to block AI from making unsolicited porn of anyone, not just kids.

slainascully | 17 hours ago

I mean we can do both? The fact that deepfakes are overwhelmingly (some orgs cite 99%) targeting women and girls shows that this isn’t just mindless schoolyard bullying, it’s got a very nasty sexualised misogynistic angle. I just don’t think the kids doing this should be let off with extremely minor ‘punishments’ when the victims are dropping out of school.

Khatib | 16 hours ago

Yes, but I don't think this statement is crazy at all:

> I think a small amount of actual punishment for the perpetrators would make quite a difference

They need some harsher punishment. But saying a felony for a teen is a "small amount" of punishment is not true. If an adult makes deepfake porn of a kid, that should totally be a felony. It maybe already is, I don't know. I haven't kept up with that.

slainascully | 16 hours ago

The minor punishments refer to the article where one perpetrator was sentenced to 60 hours community service.

I just think the negative reaction to the mere suggestion that actual crimes, with an extreme level of harm, deserve any kind of incarceration is repeating the exact same misogyny that drives this. The impact on girls isn't enough to justify affecting the boys' futures. Why?

Genesis72 | 21 hours ago

I just think that a better solution than putting children in jail is actually addressing the root of the issue.

Penalize any sites hosting the tools and charge parents of children who are caught using them

slainascully | 21 hours ago

Okay, so how do we address the root of the issue? Some sites are saying that 99% of deepfake abuse targets women and girls, so unless we can solve misogyny overnight, I don't think we can keep putting off protecting some kids at the expense of others.

donkeyrocket | 20 hours ago

While it doesn't get at the core issue, you build restrictions into the tools. Doing so does absolutely nothing to inhibit using the tool in other ways. There's zero reasonable rationale for these things to be able to create accurate nudes.

Regulate the industry and hold the companies accountable for CSAM created using their tools. No, we can't swiftly or broadly socially engineer boys out of doing this shit, but we can take away the method that makes producing these images as easy as clicking a button.

buzzkill_aldrin | 16 hours ago

> While not getting at the core core issue, you build restrictions into the tools. [...] Regulate the industry and hold the companies accountable for CSAM created using their tool.

This does nothing about open source models that can be run locally.

Ahnteis | 21 hours ago

Real penalties for the leadership of the companies creating the fakes. Put a few CEOs in jail and magically the AI companies will find a way to block this. Sure, some teens will run their own AI models, but stop the major source and the problem will be mostly resolved.

SpezRuinedHellsite | 18 hours ago

> i don’t think we can keep putting off protecting some kids as the expense of others.

Monstrous.

"Regulating a trillion dollar insustry is hard! Lets incarcerate the ((BAD)) children instead!"

slainascully | 18 hours ago

Why does everyone think I believe we shouldn't also regulate the tech?? You can in fact do two things to deal with a crime that is insanely gender-skewed, lasts forever, and produces CSAM.

PabloPicasso | 21 hours ago

Put the CEO, CTOs, CFOs, engineers, and financiers of the AIs and infrastructure in jail and let it be known that they prey on children.

Hillgrove | 20 hours ago

Who is going to make it a felony? The people that voted for the pedo president? lol

Master-Ad-5153 | 21 hours ago

That's an interesting idea, but honestly it sounds difficult to impossible to achieve in reality. It would be practically impossible to identify most perpetrators and tie them to the crime(s). Then you have the issue of how long it would take to bring everyone to court.

flaginorout | 21 hours ago

Sounds like 9/10 times it's some misguided kid who is doing this. It would take the cops about 30 minutes to trace back to the originator and find the culprit.

Stanford_experiencer | 22 hours ago

No.

Shdwrptr | 21 hours ago

If the deepfake problem is so widespread doesn’t that automatically make every image plausibly fake so none of it is taken at face value?

You can’t really have it both ways that the deepfake problem is massive AND all images are treated as genuine

Marha01 | 17 hours ago

The solution is to teach children that the deepfake nudes are fake even when they look very realistic, and therefore they should not be ashamed at all. Alternatively, flood the internet with deepfake nudes of everyone.

sarbanharble | 22 hours ago

Sue the living hell out of these companies.

Thebandroid | 22 hours ago

As a 32-year-old who remembers what it's like to be a brainless, horny animal with an internet connection: no, it's pretty much exactly how I thought it would be.

Gralla | 20 hours ago

Recommend this podcast - https://www.cbc.ca/listen/cbc-podcasts/1353-the-naked-emperor

bIII7 | 23 hours ago

Deepfake nudes are your face on someone else's body. The novelty needs to wear off already, either in the minds of the people who make them or the minds of the people who have been future-photoshopped. Sharing deepfakes should be punished, and direct harassment should have repercussions if it doesn't already in this situation. It can be made into nothing, because not much is truly happening.

petielvrrr | 22 hours ago

I'm sorry, but no. It's not nothing. You should talk to one of the women who's had this done to them. It does not feel like nothing, it does not feel like it's someone else's body, and it is not treated like nothing by those around them.

I agree with punishing it, but don’t put this back on to the victims and tell them it’s on them to fix the issue by getting over it.

bIII7 | 22 hours ago

I'm not telling anyone to do anything. I am sorry, but I'm not all too interested in what it "feels like". To me, this is about the law, and nothing else. If you need me to be some other way, consider not replying to this comment.

petielvrrr | 22 hours ago

>Deepfake nudes are your face on someone else's body. The novelty needs to wear off already, either in the minds of people who make them, or the minds of people who have been future-photoshopped.

And

>It can be made into nothing, because not much is truly happening.

Those are your words.

bIII7 | 22 hours ago

Right, and I'm not telling anyone to change their feelings, just that I'd like to see some feelings or other change over time. Because the things that are happening which, you know, HAVE VICTIMS, are crimes for the most part: extortion, blackmail, defamation, CP, etc. Maybe in some cases that could be expanded, and it should be. But the outrage is also partially about something that should not really be illegal: essentially, photoshopping faces onto nudes. Gross, I know, but it should not be illegal unless further crimes are committed. The novelty of AI doing this will likely wear off, as it seems to be with other forms of generative AI. Most tech novelty does wear off unless it is genuinely valuable. I don't think random bodies are going to keep appealing to people once the hype wears off. Or, on the other hand, it could go the other way.

dyslexda | 22 hours ago

Hey, I don't know if you know this, but you've got a paywall. You probably shouldn't be spamming articles here if folks can't read them.

Datruyugo | 20 hours ago

This is terrible but the fucking title...fear mongering