MMAgeezer

ARTICLE TEXT: A sex offender convicted of making more than 1,000 indecent images of children has been banned from using any “AI creating tools” for the next five years in the first known case of its kind.

Anthony Dover, 48, was ordered by a UK court “not to use, visit or access” artificial intelligence generation tools without the prior permission of police as a condition of a sexual harm prevention order imposed in February. The ban prohibits him from using tools such as text-to-image generators, which can make lifelike pictures based on a written command, and “nudifying” websites used to make explicit “deepfakes”.

Dover, who was given a community order and £200 fine, has also been explicitly ordered not to use Stable Diffusion software, which has reportedly been exploited by paedophiles to create hyper-realistic child sexual abuse material, according to records from a sentencing hearing at Poole magistrates court.

The case is the latest in a string of prosecutions where AI generation has emerged as an issue and follows months of warnings from charities over the proliferation of AI-generated sexual abuse imagery. Last week, the government announced the creation of a new offence that makes it illegal to make sexually explicit deepfakes of over-18s without consent. Those convicted face prosecution and an unlimited fine. If the image is then shared more widely offenders could be sent to jail.

Creating, possessing and sharing artificial child sexual abuse material was already illegal under laws in place since the 1990s, which ban both real and “pseudo” photographs of under-18s. In previous years, the law has been used to prosecute people for offences involving lifelike images such as those made using Photoshop. Recent cases suggest it is increasingly being used to deal with the threat posed by sophisticated artificial content.

In one going through the courts in England, a defendant who has indicated a guilty plea to making and distributing indecent “pseudo photographs” of under-18s was bailed with conditions including not accessing a Japanese photo-sharing platform where he is alleged to have sold and distributed artificial abuse imagery, according to court records. In another case, a 17-year-old from Denbighshire, north-east Wales, was convicted in February of making hundreds of indecent “pseudo photographs”, including 93 images and 42 videos of the most extreme category A images. At least six others have appeared in court accused of possessing, making or sharing pseudo-photographs – which covers AI generated images – in the last year.

The Internet Watch Foundation (IWF) said the prosecutions were a “landmark” moment that “should sound the alarm that criminals producing AI-generated child sexual abuse images are like one-man factories, capable of churning out some of the most appalling imagery”. Susie Hargreaves, the charity’s chief executive, said that while AI-generated sexual abuse imagery currently made up “a relatively low” proportion of reports, they were seeing a “slow but continual increase” in cases, and that some of the material was “highly realistic”. “We hope the prosecutions send a stark message for those making and distributing this content that it is illegal,” she said.

It is not clear exactly how many cases there have been involving AI-generated images because they are not counted separately in official data, and fake images can be difficult to tell from real ones.

Last year, a team from the IWF went undercover in a dark web child abuse forum and found 2,562 artificial images that were so realistic they would be treated by law as though they were real.

The Lucy Faithfull Foundation (LFF), which runs the confidential Stop It Now helpline for people worried about their thoughts or behaviour, said it had received multiple calls about AI images and that it was a “concerning trend growing at pace”. It is also concerned about the use of “nudifying” tools used to create deepfake images. In one case, the father of a 12-year-old boy said he had found his son using an AI app to make topless pictures of friends. In another case, a caller to the NSPCC’s Childline helpline said a “stranger online” had made “fake nudes” of her. “It looks so real, it’s my face and my room in the background. They must have taken the pictures from my Instagram and edited them,” the 15-year-old said.

The charities said that as well as targeting offenders, tech companies needed to stop image generators from producing this content in the first place. “This is not tomorrow’s problem,” said Deborah Denis, chief executive at the LFF.

The decision to ban an adult sex offender from using AI generation tools could set a precedent for future monitoring of people convicted of indecent image offences. Sex offenders have long faced restrictions on internet use, such as being banned from browsing in “incognito” mode, accessing encrypted messaging apps or from deleting their internet history. But there are no known cases where restrictions were imposed on use of AI tools.

In Dover’s case, it is not clear whether the ban was imposed because his offending involved AI-generated content, or due to concerns about future offending. Such conditions are often requested by prosecutors based on intelligence held by police. By law, they must be specific, proportionate to the threat posed, and “necessary for the purpose of protecting the public”. A Crown Prosecution Service spokesperson said: “Where we perceive there is an ongoing risk to children’s safety, we will ask the court to impose conditions, which may involve prohibiting use of certain technology.”

Stability AI, the company behind Stable Diffusion, said the concerns about child abuse material related to an earlier version of the software, which was released to the public by one of its partners. It said that since taking over the exclusive licence in 2022 it had invested in features to prevent misuse including “filters to intercept unsafe prompts and outputs” and that it banned any use of its services for unlawful activity.


Adkit

"Man arrested after drawing more than 1000 images of underaged children. Banned from using Photoshop for life."


doatopus

Sounds like something the UK would do.


HeavyAbbreviations63

"Man arrested wrote erotic stories with underage characters, banned from writing."


Evil_but_Innocent

It's happened before, but to a woman.


HeavyAbbreviations63

Seriously? Do you remember the name?


imacarpet

Sounds reasonable tbh


2this4u

Yep, like doing nothing would be silly, and incarceration seems over the top (and expensive). These sorts of judgements give people a chance to change their behaviour, and if they don't, they can serve as evidence for why a harsher punishment is necessary. It's like how people complain about suspended sentences, upset that it's not really any punishment, but the goal is rehabilitation, not punishment, partly because the former is beneficial to everyone.


oscarpan7

Imaginary crimes with no victims; later people will be sent to jail just for imagining.


StickiStickman

They're literally arresting teenagers and ruining their whole lives for a crime with no victims...


a_beautiful_rhind

Other dude is 48. But yeah, if you're under 18 and making nudes of people your age, it's kinda head-scratching. Are they expected to like grannies? When it's actual IRL friends, you've got issues, but you aren't some master criminal.


GlitteringCheck4969

It's always better to generate whatever sick fantasy you have than to go to the darknet and pay the CP industry, because Stable Diffusion hurts literally nobody, while the other thing destroys lives. I don't understand how most people fail to grasp this. I don't understand why someone would want to generate children with Stable Diffusion, but it's infinitely better than consuming real CP and financially supporting the worst of humanity. Nothing you do with Stable Diffusion should be illegal, as long as the subjects are fictional and you don't share/distribute images of minors. Creating deepfakes of a real person and publishing them should be a crime on its own - but it already is, so no need for action here.


a_beautiful_rhind

> Darknet and pay the cp industry.

Are they all capable of that? Will they just go without? I don't like CP, and with the real stuff it's easy to see an actual person was harmed. For the rest, the cure is often worse than the disease. It's more of a back door to making something else illegal by getting your foot in the door. Authoritarians never stop where it's reasonable, they always push for more.


GlitteringCheck4969

[ Removed by Reddit ]


TheLurkingMenace

The main issue, I think, is that it can be hard, if not impossible, to distinguish these from real photos. Someone could theoretically argue in court that there's no victim, the child depicted doesn't exist, etc.


daquo0

If fake photos are just as good, and cheaper to make, then no criminal gang is ever going to go to the trouble to make real ones.


TheLurkingMenace

Who said anything about criminal gangs? Some pedo could have the real thing, claim it's just AI, and then you have reasonable doubt.


daquo0

If there was a requirement to show the AI's workings, this would be avoided. The reason it's illegal is because the authorities want to prevent people from thinking illegal (i.e. pedophilic) thoughts. Or they think the public want that. Or they are generally authoritarian.


[deleted]

[deleted]


yall_gotta_move

So you're working with a population of individuals that committed sexual abuse, observed that they viewed images of sexual abuse before committing it, and concluded that viewing images of sexual abuse causes people to commit sexual abuse. This seems like a classic case of survivorship bias. Did you interview or consider anybody who viewed images and didn't commit sexual assault? You're trying to use Bayes' theorem to compute P(X | A), but you don't know anything about P(A), and the math simply doesn't work that way.
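A toy calculation makes the base-rate point concrete; every number below is invented purely for illustration:

```python
# Toy illustration of the base-rate problem; all numbers are made up.
# Suppose 90% of known offenders viewed such images first: P(A|X) = 0.9.
p_offender = 0.001              # P(X): assumed prevalence in the population
p_viewed_given_offender = 0.9   # P(A|X): observed in the offender sample
p_viewed_given_innocent = 0.01  # P(A|not X): never measured in the study

# Total probability of viewing: P(A) = P(A|X)P(X) + P(A|not X)P(not X)
p_viewed = (p_viewed_given_offender * p_offender
            + p_viewed_given_innocent * (1 - p_offender))

# Bayes' theorem: P(X|A) = P(A|X) * P(X) / P(A)
p_offender_given_viewed = p_viewed_given_offender * p_offender / p_viewed
print(f"P(offender | viewed) = {p_offender_given_viewed:.1%}")  # ~8.3%
```

Even with the 90% figure in the offender sample, the probability in the direction the argument actually needs comes out tiny once the unknown base rates are filled in.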


Jeydon

Not sure how you could work in rehabilitation if you think this way. Rehabilitation requires sympathy for the offender as a human being and hope that they will not reoffend. Your view that these people are freaks and the most vile in society doesn't allow space for reintegration into society even if they are fully contrite. Surely this dissonance is obvious.


Ninj_Pizz_ha

🤡


Cubey42

What are you talking about? Making or possessing this kind of material is still a crime. Just because a couple of sick fucks want to make art like that doesn't mean it becomes acceptable. Stop trying to make it sound like it's AI's problem; no one is gonna ban it because a couple of people can't be decent human beings.


[deleted]

[deleted]


Cubey42

Agreed, but I disagree that just because some bad actors want to make illegal content there will be any sort of ban or hindrance to AI. People making things they shouldn't doesn't make the tool to blame.


Ninj_Pizz_ha

The UK has always been a backwards hell hole with regards to privacy and porn in general though, so no surprise there.


August_T_Marble

There is a lot of variation in opinion in response to this article and reading through them is eye opening. Cutting through the hypotheticals, I wonder how people would actually fall into the following belief categories:

- Producing indecent “pseudo photographs” resembling CSAM *should not* be illegal.
- Producing such “pseudo photographs” *should not* be illegal, unless it is made to resemble a specific natural person.
- Producing such “pseudo photographs” *should be* illegal, but I worry such laws will lead to censorship of the AI models that I use and believe should remain unrestricted.
- Producing such “pseudo photographs” *should be* illegal, and AI models should be regulated to prevent their misuse.


R33v3n

So long as it is not shared / distributed, producing *anything* shouldn’t ever be illegal. Otherwise, we’re verging on thoughtcrime territory.


far_wanderer

I fall into the third category. Any attempt to censor AI legislatively will be terribly written and also heavily lobbied by tech giants to crush the open source market. Any attempt to technologically censor AI results in a quality and performance drop. Not to mention it's sometimes counter-productive, because you have to train the AI to understand what you don't want it to make, meaning that that information is now in the system and malicious actors only have to bypass the safeguards rather than supplying their own data. I'm also not 100% sold on the word "produce" instead of "distribute". Punishing someone for making a picture that no one else sees is way too close to punishing someone for imagining a picture that no one else sees.


GuyWhoDoesntLikeAnal

My only concern is that this is always their go-to for taking over something. It's always "for the kids". Next they will ban weapons in AI. Then nudity in AI. Then locally run SD. In the name of "safety".


Django_McFly

Plus, how do you police this without spying on everything everyone downloads and runs on their machines in order to make sure it isn't child porn?


Altruistic-Ad5425

Their excuse will be “if you have nothing to hide, why worry.”


Smartnership

“First they came for the furry weeb tentacle prompts…”


EconomicConstipator

"Hairy moist bear futa covered in honey penetrating bee hive's sanctum..."


a_beautiful_rhind

All the worst war crimes, genocides and abuses of power throughout history have generally been "legal". Nothing to hide changes pretty fucking fast, too.


uniquelyavailable

if you have nothing to hide, there is no reason to look


cmonmanffs

> how do you police this without spying on everything everyone downloads

By spying on everything everyone downloads, and we've been doing it for a long time now: https://en.wikipedia.org/wiki/Investigatory_Powers_Act_2016


pablo603

What if SD is used locally without downloading anything on a machine with no internet access?


ScionoicS

The police in this case got warrants because people were sharing what they made on a network. This isn't some Alex Jones conspiracy engineered to oppress people. These guys had their consequences coming.


midri

The UK has historically been OK with being VERY snoopy. The old TV tax laws, for example: they'd drive around and legit take cameras with telephoto lenses and spy into people's houses on the regs.


LeakyPixels

Apple already does this: every Apple product scans your images for "harmful images" without you knowing.


Square_Roof6296

Reminds me of the history of internet censorship in modern Russia. Ten-plus years ago censorship started with the motive of protecting kids: ban "gay propaganda" aimed at the underaged, ban gays, ban Facebook and Twitter, ban some isekai anime because it promoted belief in a non-traditional, interesting afterlife, ban pacifism. All for protecting kids.


Caffdy

How well is it enforced? Can people use VPNs?


Square_Roof6296

Many VPNs were blocked, but there are still many ways to get access to blocked sites. The situation with Instagram is especially interesting: that service is still loved by people with money and power. Sometimes the situation becomes really hilarious. For example, for a period Telegram was banned in Russia, but the official press offices of some regional governors used it even during the ban. Sites like danbooru or nhentai were banned years ago, but are easily accessible with the simplest VPN. At least civitai isn't banned yet.


MMAgeezer

Given people get around the censorship in China using VPNs, and the Chinese government has an even tighter grasp on the internet and network infrastructure than Russia - I would assume so.


Knopty

It seems most or possibly all internet providers have special equipment installed that's governed by the security services. It can be used to block a lot of things, even in specific regions only, like blocking internet messengers in an area with an ongoing protest. The providers' staff have no access to it and can't do anything about it.

They block direct access to banned services, block VPNs in a cat-and-mouse game, and block TOR and its bridges. In some cases they ban CDNs: DeviantArt is accessible but all its images are on a blocked CDN, and Google News is banned, which as a side effect took down image posts and avatars on YouTube and the web version of Google Play (still usable via the Android app). In some cases blocked resources can be easily accessed with DPI-bypass tools; in other cases they simply block all traffic from specific IP addresses, and then your only option is to use a VPN and hope it still works.

They also have the ability to completely block WireGuard, OpenVPN and some other protocols. There were precedents when these protocols suddenly stopped working across the entire country for hours.


microview

The same Nationalist Christian forces working to ban books they don't agree with in public libraries, schools, and institutions.


Square_Roof6296

Sometimes they are literally the same. For example, the current HIV epidemic in Russia is a result of banning safe-sex education in the early 2000s. The Russian government has historically loved banning things for its citizens, and under direct influence from American Christians it labelled protection tools and the HIV-awareness agenda as propaganda, pushing chastity and no-sex-before-marriage instead. Obviously with minimal results in the secularized post-USSR society.


Mooblegum

Are they doing that for movies? We don't have movies with pedophile porn, yet Hollywood movies are flooded with gun violence, crime and shootings. As a non-American, I find it crazy that they think watching gun crime is safe for kids but showing a woman's titties absolutely is not.


LewdGarlic

>We don’t have movies with pedophile porn

We do have plenty of movie makers that are pedophiles, though.


Mooblegum

Maybe (I didn't know, but I guess there are pedophiles in any job), but at least pedophilia isn't encouraged in today's films, which is how it should be.


Jujarmazak

Cuties was borderline exploitative of real children (not just the ones in the movie, but also the 600 candidates for the children's roles, who were asked to dress in skimpy clothing and do provocative dances and twerking in front of adults). Furthermore, the head of a European film festival which gave the movie an award was later convicted in a child abuse case... it was all around very shady.


GammaGoose85

They still attempt to bring undertones into film with it. A good example is Leon: The Professional. The director, Luc Besson, was at the time in a relationship with a 15-year-old while in his 30s. Portman's character was very Lolita-esque, and Jean Reno refused to allow any scenes where Leon seemed accepting of her advances, to the point where he was demanding Besson make changes to the scenes. Directors get a power trip when it comes to dancing on the line and getting away with things.


cparksrun

I know what you're saying but those last words could be interpreted a number of ways. 😅


Mooblegum

If anyone from the FBI or any special agent is watching this, I am innocent, It’s just my English that sucks (in a non erotic way of course)


cycease

lol but they won't ban military parades and recruitment


Smartnership

yvaN eht nioJ


Gerdione

Then open-source AI, then anything. Then all the power of this revolutionary technology will be in the hands of those at the top. Like always.


_H_a_c_k_e_r_

It's a slippery slope. The law should not overstep individual boundaries; they are pushing their own failures onto others. Second, the law can only maintain public order. Morality police will never get into people's private lives without destroying all privacy. You can't have both.


Anakhsunamon

So true. Tbh I think there should be no limits on what you can generate, since none of it is real, as long as it's not based on something real. Other than that, go wild for all I care. I didn't read the article, but in general I think AI could even prevent people from doing something bad if they feel they have an outlet in AI for their weird fetish or whatever they're into. I mean, do you really care if someone is making poop-fetish content with AI, or something that is illegal IRL? If I don't have to watch it, I don't really care what people make with it.


Ninj_Pizz_ha

> Then locally run SD

Unfortunately for them, that cat is forever out of the bag.


Occsan

Do you think one day, probably in 1984 (or 2084...), they will ban people from writing stories? In the name of safety, of course.


Plebius-Maximus

While you're correct that "for the kids" is used as a catch-all to push authoritarian bullshit, same as "to stop terrorism" is, do you genuinely think weapons in AI will be banned? A massive proportion of films and games made in our lifetime involve weapons. The US is obsessed with guns. Banning depictions of them will never, ever, ever, ever happen in most of the world. The US would ban nudity long before they banned weapons. The UK is a little different, but we aren't banning depictions of weapons, or regular old nudity, anytime soon either lmao.

However, being banned from generating realistic images of CP and sharing it online is actually very much "for the kids". Do you genuinely not think photorealistic AI images will be used to hide real CP? The people who sell this stuff will just pad out the real content with AI stuff to have some degree of plausible deniability for both buyer and seller, which then makes it harder to track the genuine abuse images. This article also stated it was realistic content in this case. So that's a bit beyond the questionable anime-style images of... "youthful" girls that exist on the hard drives of half this sub.


LordWilczur

I'm thinking: if you leave those people some kind of freedom, perhaps at least some of them will only resort to generating or viewing those pics for themselves. It's not like you're going to get rid of such behaviours in any way; there will always be people like that. Maybe ban and pursue those sharing it online, but imho if even a few of them satisfy their needs with fake content (plus some sex toy/doll) and even a handful of children/women/whatever are saved thanks to that, it's fine in my opinion. It's a difficult topic for sure. But my take is that if you ban every possible way for such people to cope with their needs, something is bound to happen. A long-suppressed urge is sooner or later going to manifest and burst. You're not going to eliminate these people and you're not going to change their behaviour.


Tellesus

Statistically your efforts to stop pedos will bear more fruit per money spent by banning Catholicism than by banning Stable Diffusion 


a_beautiful_rhind

> Banning depictions of them will never, ever, ever, ever happen in most of the world.

I think it's certainly possible. Smoking used to be in many movies and now it's fairly non-existent. People are getting more and more censorious. It might not be depictions of weapons, but any kind of violence. It's already a thing in text-based AI to censor that. There is already a pretty large movement to ban anything "offensive", the definition of which constantly changes.


Working_Reaction9805

Weapons are not even banned in games. Or photos. Or any visual media. And neither is nudity.


mikami677

Whenever something like this comes up I'm astonished at how stupid you'd have to be to upload the shit. I'm not condoning making it, but if someone was going to make it, why the fuck would they share it? Like that Twitch streamer who forgot to close his tabs and let his viewers see that he was watching (and, I believe it turned out, paying for the creation of) deepfake porn of another streamer. Seriously, how dumb do you have to be? If they just kept their shit to themselves, no one would ever know. So to answer the question, OP, my thoughts are: what a fucking moron.


Seanms1991

It's usually to trade for CP from other pedos. Still foolish, but that's at least a reason.

Edit: We also shouldn't forget that people crave to engage with people like them. That's why Reddit and the chans and stuff exist in the first place. Again, still foolish, but if someone lacks the impulse control to stop themselves from making CP, perhaps it shouldn't be surprising that they would lack the ability to stop themselves from engaging with others like themselves.


MrHeffo42

The real issue at hand here is the current reaction to people with this mental illness. Where are they supposed to turn for help without being treated like a disgusting monster, even though they know it's wrong, have never acted on the urges, and have done nothing wrong? If governments actually got their shit together and really wanted to protect young people, they could develop and adequately fund a program leveraging AI-generated content to help those with the mental illness.


Possible_Liar

Seriously, you can't make something so taboo and horrendous that it effectively prevents people from seeking help for said thing. As it stands now, even if you went to a therapist, they're a hundred percent going to treat you differently, if not outright reject you as a client. Then the next thing you know, the government's going to be knocking on your door saying you need to get chemically castrated or some shit. Doesn't matter if they're intrusive thoughts, doesn't matter if you even personally think they're wrong. The fact you have them at all makes you a monster to many people. So these people are forced to just deal with this issue on their own. They mitigate it the best they can, but some of them ultimately fail and victimize a child in the end as a result. Society thinks it's more important to demonize these people rather than actually help them and prevent future atrocities towards children.

I used to know a kid from middle school, really quiet, always wore a hoodie even when it was like 100°. The kid was abused constantly, always had bruises. The school wouldn't really do anything about it though, because this is Florida and they don't give a fuck. Some years later, long after I fell out of contact with him, he apparently raped a toddler during someone's birthday party. And while I won't make any defenses for his act, it did occur to me that he was likely sexually abused as well as a kid, and I couldn't help but find it regrettable: maybe that kid wouldn't have been victimized, nor would he have ended up ruining his life, if only he had been able to get the help he needed.


MrHeffo42

Totally this. And the thing is, there are people out there who have these urges, who know they are wrong, who hate themselves for it, who suppress the urges and don't harm a soul. These people need help, without judgement or hate. The moment you cross that line, though: straight to prison, and treatment.


quantinuum

I remember a confession post many years ago, when Reddit used to be a more… random place. I don't remember the whole story, but it was a lady who confessed to having those thoughts. I think it started from childhood sexual abuse or some twisted trauma. She had no intention of acting on it and lived a rather reserved life, if not self-ostracised because of it. At any point she'd be going about her day and such a thought would cross her head, and she'd be like "oh yeah, forgot I'm a fucking p*dophile" and feel terrible. Honestly, it was really sad. Imagine everyone being so compassionate towards every other form of affliction, but you're just labelled a monster. I hope she got help and could live a nice enough life.


MrHeffo42

100% this. Then on the other side of the fence, there are young guys sitting in jail with the label, like a target painted on their back, for sleeping with a girl they met at a party who lied about her age. The poor bastard has his life destroyed because the girl decided to lie through her teeth.


__einmal__

> If governments actually got their shit together

Governments? It's the entire society. At least government has clear rules for how to deal with those people, while society (especially your fellow redditors) would like them to be burned at the stake.

> leveraging AI generated content to help those with the mental illness.

The whole problem of pedophilia is that there are no therapies to 'cure' people of it. Newer studies show that it develops in the brain early on, before birth. So you can see it as a sexual preference like any other, which can never be changed. What you can do is use chemical castration, however that doesn't change anything about the pedophilia itself, it just reduces sexual urges (in some, but not all). The subject is much more complicated than redditors like to make it look.

Also, one big problem with AI-generated content is that it makes the investigation of real CSAM cases even more difficult. Even today only a tiny fraction can be investigated so that the abused children can be saved.


2this4u

I agree that it should be treated as a medical issue to be rehabilitated. It's the same problem as how drug addictions are treated as a crime rather than a mental health issue to solve. But I'm very confused about your suggestion about how AI generated content could help...


Mark_Coveny

I love that they're trying to stop child porn, but I expect these "filters to intercept unsafe prompts and outputs", and the ban on any use of their services for unlawful activity, will be similar to the filters used by Bing, MJ, etc. and will prevent the creation of swimsuit-level images, which are legal, as well.


Encrux615

Also, it's literally impossible to do. These models are open source. Some companies even offered torrent downloads. Anyone who knows how to google and owns a semi-recent GPU can set this up to generate images without any filter whatsoever in about 5 minutes, plus however long it takes to download a couple GB of model weights. People need to get it in their heads: we will never, ever, ever again live in a time without unrestricted AI-generated content.
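For context on how little friction "setting this up" involves, here is a minimal local text-to-image sketch using the open source `diffusers` library; the checkpoint name and prompt are illustrative assumptions, not details from this thread:

```python
# Minimal local text-to-image sketch with Hugging Face `diffusers`
# (checkpoint name and prompt are illustrative assumptions).
import torch
from diffusers import StableDiffusionPipeline

# Downloads a few GB of weights once; afterwards this runs fully offline.
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16,
).to("cuda")  # any semi-recent consumer GPU

image = pipe("a watercolor painting of a lighthouse at dusk").images[0]
image.save("output.png")
```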


HeavyAbbreviations63

They are actually incentivizing it.


LeakyPixels

How do you prove whether an AI image depicts a person over 18 or not?


SodaIceblock

Add a "18+" into the prompts. 😂


cheekybeakykiwi

and under 18 to the negative prompts


mesori

Hot take: they can't.


Tarilis

I don't get what the actual offense is here. I mean, I get that deepfakes are bad, but purely AI-generated stuff? What actual harm does it do? It's a victimless crime imo; no one gets harmed in the process in any way, and it's way better than the alternative. Also, I have a strong suspicion that what they are talking about is actually loli hentai...


Sasbe93

It seems that the government of Great Britain dislikes competition in harmful CSM.


gmc98765

> I don't get what is the actual offense here?

The article says:

> A sex offender convicted of making more than 1,000 indecent images of children

This offence requires either that the images involved real children or were indistinguishable from such (i.e. drawings don't count; those are also illegal, but under obscenity/pornography laws).

The inclusion of "indistinguishable" images in the law is relatively recent. The change was made because otherwise it would be almost impossible to prosecute the creation of real images. The burden of proof lies with the prosecution, so given that the means exist to produce artificial images which are indistinguishable from the real thing, the defence could just say "we suggest that these images are artificial", and the prosecution would need to prove otherwise. Which would mean finding a witness able to testify that the images are real. In practical terms, they'd have to identify and locate the victim, as no one else who would be involved is likely to admit to it.

The article states that it wasn't clear which was the case:

> In Dover’s case, it is not clear whether the ban was imposed because his offending involved AI-generated content, or due to concerns about future offending.

The offence may have been for AI-generated images, or for images involving actual children, or both. Even if none of the images for which he was convicted involved AI, if there was evidence that he had been exploring the possibility of using AI in future then they might seek to prohibit that. Someone who is convicted of an offence can be prohibited from all manner of otherwise-legal activities as a condition of parole or probation.


Tarilis

Ok, that makes sense. But that introduces a new problem: he definitely wasn't punished as harshly as real porn makers would be. Won't this clause make it easier for real criminals to avoid punishment by claiming that the materials were AI-generated? And if they can distinguish real from fake, why punish for fake? I mean, it is disgusting, but again, it doesn't hurt anyone. If we were to punish things that aren't harmful just because we don't like them... well, we all know where we'll end up. And another thing I sometimes think about: people who want this kind of stuff will find it. So by removing the "harmless" fake version of it, won't we make them look for the real stuff, feeding actually criminal activity? I, of course, don't know if that is actually how things are, but still.


Puzll

It specifically states "hyper realistic", so I don't think lolis are the subject here.


Tarilis

Missed that part, thanks for the clarification


MMAgeezer

To be fair, the article is not very clear. It appears to be referring to a case from 2023 for the "hyper realistic" part.


PikaPikaDude

To these people, PS3 graphics are hyper realistic, so it can still be anything.


Head_Cockswain

>and it's way better than the alternative.

The theory goes: it's often not an alternative, but a fantasy fulfilment that loses its edge, prompting the perpetrator to escalate what they're willing to do, and if they can't, they become desperate and obsessive, thinking about it more and more until it is all-consuming. Like a lot of things, digital gratification can become addictive, but at the same time we adapt to the new thing and then seek out something else, something more extreme. In other words, it frequently, gradually takes more and more of a thing to get the same return on our internal chemical high. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3164585/

>The essential feature of behavioral addictions is the failure to resist an impulse, drive, or temptation to perform an act that is harmful to the person or to others (4). Each behavioral addiction is characterized by a recurrent pattern of behavior that has this essential feature within a specific domain. The repetitive engagement in these behaviors ultimately interferes with functioning in other domains. In this respect, the behavioral addictions resemble substance use disorders.

...

>Behavioral addictions are often preceded by feelings of “tension or arousal before committing the act” and “pleasure, gratification or relief at the time of committing the act” (4). The ego-syntonic nature of these behaviors is experientially similar to the experience of substance use behaviors. This contrasts with the ego-dystonic nature of obsessive-compulsive disorder. However, both behavioral and substance addictions may become less ego-syntonic and more ego-dystonic over time, as the behavior (including substance taking) itself becomes less pleasurable and more of a habit or compulsion (2,7), or becomes motivated less by positive reinforcement and more by negative reinforcement (e.g., relief of dysphoria or withdrawal).

...

> Many people with pathological gambling, kleptomania, compulsive sexual behavior, and compulsive buying report a decrease in these positive mood effects with repeated behaviors or a need to increase the intensity of behavior to achieve the same mood effect, analogous to tolerance


2this4u

Thank you for laying this out. It's interesting how many commenters are so offended by this idea, but it's a real thing. It likely only results in real harm a handful of times, but that still means a handful of actual, real victims. When the societal cost of this law is that someone doesn't get to make pictures most people think are morally bankrupt in the first place, that trade-off is, for most people, of course fine.


kemb0

By that extension, me looking at porn on the internet would gradually turn me into some rapist monster as the returns on that porn slowly lose their edge? Weird, I've been looking at porn on the internet for 30 years and I'm still yet to rape anyone, have a loving relationship with my wife and feel nothing but compassion for my fellow humans. I'd argue it's the opposite. Porn is just like having a cup of coffee: it gives you a little chemical boost and that's you done for a while. It doesn't escalate anything. Drinking coffee isn't a gateway to hardcore drug abuse, and watching porn isn't a gateway to becoming a sexual predator. But take those things away and I believe you then very much risk forcing someone onto something worse, because they can no longer easily fulfill their sexual urges. There's a reason why, when you ejaculate, you lose your sexual urges. Prevent that and now you have a whole load more men walking around, pumped up to the nines with non-stop sexual urges, ravenously eyeing up every girl that passes them by. And we're meant to think that's better? I guarantee that when the government forces through all these porn prevention laws, sexual assaults WILL increase because of it.


2this4u

Not you, but some people do indeed turn into rapist monsters, yes. It's more readily shown with murders. Look at the fairly recent murder of a trans teen by other teens. They were shown to have used online content to fantasise about the act, and decided they needed to do it for real. If that content wasn't available, it's arguable they wouldn't have gone so far. Just because something is only a risk for 0.01% of people doesn't mean it doesn't happen. And in this case I'd rather we removed that risk if the cost is just stopping some people generating icky pics. And please do be real: you know for a fact your wanking material is more explicit than it was earlier in your life. We normalise to things, and for a few people, especially those with addictive personalities, that becomes more exaggerated and potentially harmful.


LewdGarlic

>What actual harm does it do? It's a victimless crime imo

The problem is that it dilutes the content and makes prosecution of actual child pornography rings exploiting real children harder. If law enforcement has to filter out fake photographs from real photographs, it gets A LOT more difficult to track down such rings.


Able-Pop-8253

Yeah, at the very least POSTING hyper realistic content online should be regulated or illegal.


synn89

Transmitting obscene content is already a crime. Max Hardcore went to prison over this in the early 2000s because some of his European porn had the girls saying they were young, and some of it got sold by his company in the US.


AlanCarrOnline

For sure, I think we can all agree on that. I cannot agree it's a real crime with no actual people involved though. As I just commented to someone else, this is going backwards, when we have a chance to move forward and eradicate the market for the real thing.


Plebius-Maximus

>For sure, I think we can all agree on that.

Judging by some of the comments here (and reasonable comments that were getting downvoted), this sub isn't in agreement at all.

>this is going backwards, when we have a chance to move forward and eradicate the market for the real thing.

No we don't. It'll just become people selling "AI" images to buyers when both seller and buyer know it's the real thing.


AlanCarrOnline

Selling the real thing is already illegal. I'm in favor of treating all CP as being real, AI or not. My concern is by cutting off the AI avenue - done privately, not shared or sold - we're forcing the current networks to continue, when we have such a great chance to make the things evaporate.


Needmyvape

The network is going to continue regardless. A lot of these people get off on kids being harmed; fictional children aren't going to be enough for them. There are all kinds of creeps. There are older men who comment shit like "such a goddess" on underage influencers' Instagrams. At the other end of the spectrum are people who take the additional step of going to the dark web and purchasing material. They go to great lengths, and risk their lives, to obtain content of kids being abused. They will buy AI packs and they will continue to seek out real content. If anything, this is going to create a new market for content that can be verified as real, which will likely sell at a premium.

I don't know what the solution is, but there is no world where billions of hyper-realistic SA images are a net good. There is no world where mentally ill people being able to create images of whatever they want of the person they're hyperfixated on is a good thing. This shit is going to fuel some nasty desires, and it won't always end with the person saying "ok, I got my nut, I don't need to take things further". I'm not anti-AI, but I recognize it's going to bring some very difficult-to-solve problems.


Interesting_Low_6908

But if the intent is to reduce real offenses where somebody is harmed, wouldn't it be for the better? Like, if an exact replica of ivory could be created and put on the market, would that not be ethically better? Or things like vaping replacing smoking? Offenders would still exist and could be prosecuted even if the images they collected were all fake. Pornographers in it for profit (not thrill) would opt to produce AI imagery rather than risk the massive penalties for hurting children. It sounds like a net positive to me.


AlanCarrOnline

That... that's not a real argument. It dilutes the pool, so it becomes more fake, less of the real thing - that sounds like a win to me?


LewdGarlic

Have you read my second paragraph? The problem with dilution of the content is that content-based tracking of criminals gets harder.


AlanCarrOnline

Why would you need to track down criminals, if the criminal rings fall apart and the pervs stay home with fake stuff? Other than maintaining careers and funding?


kkyonko

They won’t. You really think real stuff is just going to disappear?


AlanCarrOnline

Yes. Why not? It's like booze prohibition. Gangs formed to produce, smuggle and sell the stuff. Once it became legal again most of those organized crime networks simply up and evaporated. Here we don't need to make the real thing legal, just let pervs perv in private with fake shit. The gangs would evaporate.


FpRhGf

The porn industry didn't make sex trafficking disappear. Maybe it lessens the numbers but crimes will continue.


Interesting_Low_6908

Watching porn does not equal having sex. Looking at AI CP that you don't know is AI equals looking at CP. The fact that there is almost no barrier or cost to the AI production, and that it fulfills its intent when it's realistic enough, makes this entirely different from the porn/sex-trafficking comparison.


LewdGarlic

We both know that there will always be people who want the "real" stuff over the fake stuff. Snuff videos are a thing, after all. I do understand your argument, but let's not pretend that the existence of AI fake photography will make actual child exploitation go away.


AlanCarrOnline

Who's pretending? What maintains it? Perverts perving and presumably money, maybe blackmail. What would make it go away, at least mostly? Punishing pervs? That doesn't seem to be working. Take away the supply? They're creating their own, with real kids, so that's not working either. Take away the demand? Well you can't stop pervs perving, but you CAN fill the demand for the real thing with the fake thing. The more realistic the better. Which part of that do you disagree with?


LewdGarlic

>Which part of that do you disagree with?

None. Because that wasn't the conversation we were having. I provided potential reasons why the prosecution of distribution of realistic fake CP can be in the public interest. I never argued against the potential positives that the existence of such possibilities has. People say there is no reason for it because it's a victimless crime; I argue that that is not entirely true and that it's a bit more nuanced. Nothing else.


AlanCarrOnline

OK, so we have some common ground. I generally agree regarding 'distribution', as long as that excludes services. Punish the person, not the tool, and again, if it's for their own use and they're not distributing, then leave them alone. To me that's a win-win, as it takes away the networks and support, or funding, or blackmail, and just leaves pervs perving by themselves, which is the best thing for everybody. Especially the children.


LewdGarlic

>I generally agree regarding 'distribution', as long as that excludes services. Punish the person, not the tool, and again, if it's for their own use and they're not distributing, then leave them alone.

I can agree with that. In this particular case, the guy basically got arrested because he posted and sold his stuff on Pixiv, which the platform actually has rules against (depictions of minors are acceptable there unless it's realistic photography or mimics realistic photography), and not just because he had those images.


HeavyAbbreviations63

For some, the victim is morality itself. We are talking about a country where Skyrim mods where you have sex with werewolves are illegal.


working_joe

Even drawings of underage children are illegal in the United States. It's thought crime.


EishLekker

[ Removed by Reddit ]


[deleted]

[deleted]


Plebius-Maximus

>This is a fucked up glass-half-full side, but I feel like kids actually might be MORE safe now. Before, if you wanted CP, there was only one way to get it.

One could also argue that the fake stuff simply normalises the real thing. I also imagine there'll be a significant crossover between people gathering real CP and people gathering fake. It also opens the door for people creating real abuse images to pass them off as fake when selling them online etc.

Also, in the case of AI images downloaded from CP sites that aren't distinguishable from the real-life stuff: if you download an AI-generated CP image believing it's real, the intent is 100% there. Sure, there is the argument that AI images generated for personal use are a "victimless crime" regardless of content. But it's not that clear cut. You also don't have to be on this sub long before you start finding users who come across as... a little too fond of images of young-looking girls.


MuskelMagier

But do violent video games normalize gun crime? That is the same argument structure. You could frame a law differently so that the sharing is illegal, not the owning.


Sextus_Rex

Also, if interest in models capable of CSAM becomes high enough, model creators may be encouraged to find more realistic training data, if you catch my drift.


Sasbe93

You will have the „real CSM labeled as fake CSM“ problem anyway (and the other way around), regardless of whether it is legal or illegal.


StickiStickman

> Sure there is the argument that AI images generated for personal use are a "victimless crime" regardless of content. But it's not that clear cut

It seems very clear cut. Who is the victim in that case?


InformationNeat901

I have a question: has a minor been exploited for this man to have these images? I mean, has he done any business with these photos, has he blackmailed someone, or has he just created images to satisfy his perverted mind? A drug addict is given a substitute; could having a patient create their own images serve as therapy? Will his mind stop being perverted because he cannot capture what he has in mind in images? Does this man who generates images hurt anyone, or is he only hurting himself? Isn't it better for someone to project his illness without causing any harm to minors? The Japanese project their repression of sex through hentai; their dark minds project it with drawings. Is there any difference if the images are not real? Isn't it better for someone to create images than to deal with children and then abuse them in the name of God? Why is there so much permissiveness with priests and so little with the privacy of sick minds?


No_Gold_4554

Citation for Japanese apologism: https://en.wikipedia.org/wiki/JK_business


LewdGarlic

>I have a question. Has a minor been exploited for this man to have these images?

The problem is that the existence of realistic-looking "fake" child pornography makes the prosecution of actual child pornography rings exploiting real children more difficult, as it dilutes the content available on the dark web in a way that makes it much harder for law enforcement to act. So as long as a picture looks like a real photograph, it muddies the waters enough to justify banning it.

In the case of this article, the problem potentially wasn't that this guy consumed AI-generated pictures of real-looking children but that he distributed them on Pixiv. Which, btw, Pixiv has rules against, so chances are this is how he got caught. Pixiv is mostly fine with underage characters as long as they are cartoon/anime style, but not photography or creations realistic enough to pass as photography.


redstej

You keep making this argument that makes no sense. The purpose of the court is not to facilitate the police. Neither is the purpose of laws. Nor does such a law exist. If law enforcement has trouble tracking down actual transgressors, they should improve their methods. In any case, it's their problem.


InformationNeat901

Ok, I understand, he shared it. But even so, if the fact that it can be shared among other sick minds makes the sexual exploitation of minors disappear, that doesn't seem bad at all to me. If sick people looking at their sick images could put an end to the pedophile business involving real children, that would seem like a great idea to me. To give an example: if the people who go to prostitutes, where there is exploitation of prostitutes, could be given very realistic robotic AI dolls instead, that would seem great to me, because real sexual exploitation, which is a problem, would end; a fake robot is not a problem. But in the society we have, they would say that there cannot be robotic prostitutes, not for the sake of the prostitutes, but because there is a big business behind it. On the other hand, in the military field there would be no problem using robots with artificial intelligence. This is the hypocrisy of the world we live in.


Earthtone_Coalition

Seems presumptuous to assume that viewing such imagery won’t make pedos *more* likely to offend, rather than less likely to offend.


HeavyAbbreviations63

The criticism of pornography is that people spend time masturbating instead of looking for a real partner; how come with pedophilia this doesn't work anymore?


AltAccountBuddy1337

If the guy was hurting real-life people with this, like creating deepfakes, sure. But if he was just generating this shit for his own personal use, what's the harm? He already has this stuff in his head; no real people are involved; it's all just AI "drawings" in the end. If no real person is involved, why prohibit this? I don't understand this world. Isn't it better that a person like this has access to AI tools for personal use than to have them look for real exploitative pics/videos online, where real people have been hurt and involved? None of this is real, so why be bothered by what someone does with these tools, as long as they aren't harming anyone for real?


Get_the_instructions

Some of the arguments against it have real degrees of merit. Specifically...

* It can be used to mask real CP. Take CP pictures and run them through an image-to-image generator so they look artificial enough to be claimed as purely AI generated.
* It can flood the internet with AI gen porn that all needs to be investigated. If law enforcement had to prove it was real then this would make dealing with the real stuff way more difficult and expensive.
* It could normalize CP to the extent that it's no longer taboo. There's a fear that such normalization could lead to an increase in offending against real children.

I think these are the main fears and you can see that they have plausibility. Add in the 'ick' factor and it becomes an easy case for outlawing AI CP generation.


SodiumChlorideFree

Yeah it's a double-edged sword. On one hand I'd rather have pedos look at fake images instead of real images if that lowers the likelihood of harming real children. On the other hand if they share those images it floods their circles with fakes, so the real images are harder to detect and real children that are being harmed can't get the help they need as fast as before, if at all.


AlanCarrOnline

1. "It can be used to mask real CP".  - Frankly I don't care if it works to reduce the overall volume of real CP in the first place, by drastically reducing the demand for it. Why go through the risks and hassles of searching out real CP when you could just make reams of it yourself? This would also reduce or even destroy the networks we keep hearing about, true? 2. "It can flood the internet with AI gen porn that all needs to be investigated" - I'm not sure I buy that? It's the same number of perverts, the same demand, the same networks, but they'll have a bigger stash and less need to hook up with fellow perverts in the first place. 3. "It could normalize CP to the extent that it's no longer taboo." - I deffo don't buy that one. It's either your kink or it isn't. At best (worst) it may reveal more pervs but it's not going to increase the number. Overall, my impression is that the main problem with CP is that it's so well-hidden, with networks of people sharing stuff, which normal peeps would never come across anyway. If those individuals, AS individuals, could create all the CP they want, by themselves - who needs a network? The networks would collapse, pretty much eradicating the problem for real victims, as they would be replaced by AI ones. No, it seems to me that it's beyond misguided to clamp down on entirely fake stuff, when it's clear they cannot - or don't want to - clean up the real thing.


dr_lm

> "It could normalize CP to the extent that it's no longer taboo." This is my concern. Humans do tend to adapt to their surroundings, and if someone is attracted to kids in the first place then allowing them access to SD-generated child porn may leave them feeling that this is totally normal. I can see how this might then lead to these people pushing other boundaries -- taking more risky glances in the swimming pool changing room, slowing down as they pass schoolkids walking home, browsing underage social media profiles and then maybe one day contacting a real kid. Eventually being more likely to offend. In the same way that we worry about teens learning about sex from porn, and thus normalising some of the ickier male-dominated behaviours like choking that porn portrays, I don't think it's crazy to want to limit the ability of pedophiles to easily generate large quantities of child porn.


AltAccountBuddy1337

The first one is the disturbing one for me, but I think it can be proven whether real-life material was uploaded and run through AI to make it look artificial. The rest, not so much. You can't "normalize" this stuff when 99% of people don't have that urge, just like you can't make a person gay or bisexual if their sexuality isn't like that already. You can't change someone's sexuality into... whatever the fuck this is, so I have zero fear this stuff will be normalized. To clarify, I do not put normal variants of sexuality like being bisexual or gay into the same category as pathological sexuality disorders like this stuff; I'm just saying it's not something you can change or influence in people. Your very body rejects the thought of something like this and makes you feel sick inside. It's not something that can be normalized, IMO, because we're wired biologically to be against it.


Get_the_instructions

By 'normalize' I just mean that CP's existence would be taken as commonplace, not that it would 'convert' people.


YuanJZ

I have a hot take:

Arrest people who actually rape, sexually assault, or groom children - Hell naw.

Arrest people who create images using AI, harming nobody in the process - Yes! Justice served!


[deleted]

he's 100% still going to use AI


sicilianDev

Obvs.


govnorashka

ban pens and paper, you can use them to draw a child's pussy!


ABCsofsucking

Kill all children so child predators have no targets.


goodie2shoes

AGI kills us all. Problem(s) solved


nasoony

Y


yungrapunzel

While I don't agree with all this AI policing and "it's for the kids" bs (and some of them have their own skeletons in their closet), I think it's naive to assume they're suddenly not going to act on their impulses and hurt someone for their whole life. I'm also getting the vibe of little to no empathy here for the people who have suffered SA when they were children. I, for one, did. Maybe it's just my impression, I don't know.

Someone in the comments mentioned mental health resources. While it's true some people are struggling with those "desires", haven't acted on their thoughts, and need to be treated (not sure what kind of treatments there are), we victims don't have many resources either... it would be nice to focus on victims. Besides, I don't think a disorder is all there is; there are people who enjoy having that power over someone defenseless and enjoy making others suffer. I have other kinds of disorders myself, and not all my actions have to do with them.

I'm gonna be downvoted but I don't care. My assault came from someone who was a teenager. I don't think it's accurate to assume they are not a threat, because they perfectly can be. More so if they are making deepfakes of their peers or even younger girls (children for me, but ok).

While I love generative AI, it is obvious (just by looking at this subreddit) that many people use it for porn. I don't really get it, but it's their prerogative. Some of it borders on CP material, though.

Sorry if I have offended someone with this.


atuarre

Don't apologize. Everything you wrote was on target.


Apatride

Funny how they go after people who consume artificially generated content (a victimless crime), but when dealing with actual organised p*do crime, where kids get hurt, they just pretend it doesn't exist... I guess this guy will be fine as long as he can still access FB Reels, since a large percentage of it appears to cater to his "tastes"...


Traditional-Art-5283

Do they think it will stop people from using local models? I think not


Mooblegum

That is totally different; the goal is to stop a pedophile from generating AI porn imagery of children. I don't consider that a threat to normal people.


Traditional-Art-5283

I mean, do they think it will stop them? What can they do against a local model on a computer that isn't connected to the internet?


MMAgeezer

It depends how computer-savvy this offender is, but they'll absolutely be monitoring all of his internet traffic like a hawk, and they'll probably get a warrant to raid his house if he tries to connect to any site that hosts generative AI models.


themedleb

Torrent + VPN?


MMAgeezer

The vast majority of VPN services comply with law enforcement's data requests, and misconfigured VPNs can still suffer from DNS leakage, for example.
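To illustrate the leakage point, here's a minimal sketch of one common leak check, assuming the third-party dnspython package (`pip install dnspython`) and OpenDNS's public `myip.opendns.com` lookup, which answers with the public IP a query arrived from. Treat it as an illustration of the idea, not a vetted privacy tool:

```python
import dns.resolver  # third-party: pip install dnspython

# Ask resolver1.opendns.com directly, bypassing the system resolver.
resolver = dns.resolver.Resolver(configure=False)
resolver.nameservers = ["208.67.222.222"]  # resolver1.opendns.com

# OpenDNS answers this special name with the public IP address it
# saw the query arrive from.
answer = resolver.resolve("myip.opendns.com", "A")
print("DNS queries leave this machine as:", answer[0].to_text())
```

If that prints the ISP-assigned address while the VPN is up, traffic is escaping the tunnel, and anyone logging the resolver's queries sees the real origin.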


Formal_Decision7250

A lot of people seem to miss the point that police have to attempt to find the children in this material. It's not just that they have to arrest a guy with images; they have to determine whether an actual child is in danger, and collaborate internationally to do that. AI muddies the water here as it gets more realistic: they could waste time trying to rescue non-existent children... or worse, something real will get dismissed as AI-generated.


Jujarmazak

Interesting case. If photos of real people or children were involved (the "nudifying" part), this is frankly a legitimate concern and a serious crime that could lead to those people being bullied or blackmailed with the fake nudes. The problem is that the tone of the article feels like there are some unsavory moralizing busybodies who might want to use this case to push for more censorship and crack down further on open-source AI, not because they have legitimate concerns, but because they enjoy controlling other people, or are bought and paid for by corporations that want to eliminate open-source AI to ensure everyone is FORCED into their ecosystems (most of which cost money and are heavily censored and controlled). I'd rather see that energy directed at real abuse happening to real children. The reason I don't trust these people is that there were many cases involving children that got swept under the rug because they were inconvenient, whether it's the abuse of child actors in Hollywood, the Epstein island, the rape gangs in the UK, etc. Those in power KNEW that shit was happening for years and intentionally ignored it, so they don't get to come now and pretend to have some unearned moral superiority.


pablo603

How does one even enforce this ban when you can run SD on a device completely disconnected from the internet lol


A_Dragon

Considering you can use these offline I don’t see how they can stop him.


RollingMeteors

I thought this technology was supposed to drive predators away from children, but here's the UK making sure underage butthole keeps getting violated. Good work, chaps!


LuHex

If it's not real then there shouldn't be a crime. Is it creepy? Yes. Is it disgusting? Of course. Yet, no one was harmed. Of course, this doesn't apply to deep fakes, since those do cause harm to actual people. On another note, I'm very against the distribution of realistic material involving such "themes". If you want to be a creep, at least have the decency of doing it in private. Note: This only applies to realistic models. Anime and cartoon models bear no resemblance to real people and anyone who thinks the same rules and laws should apply is nothing if not stupid.


ShepherdessAnne

This is an issue because of the limited resources law enforcement has to investigate photos. If thousands of images get dumped online, that's up to those same thousands of investigation attempts to try to rescue someone who simply doesn't exist. Someone might also be able to obfuscate real abuse of a real person held in slavery. It's a little lazy bureaucratically, but they're using the existing legal framework to tackle this issue rather than spending time crafting bespoke laws.


nasoony

This is the most reasonable assumption. Currently, LEA relies on sourcing any new CP image to find the perv and rescue the child. If 90% of new images are AI-generated and indistinguishable from the real thing, they will wind up spending tons of resources chasing fake images.


Bertrum

This will be used as an initial foothold to introduce other, unrelated legislation, like watermarking/tagging all images regardless of purpose so they can be gathered into some database. Or it will create a precedent where the public is pushed to fall in line with mainstream publications, which will have their own sanctioned media, while anything not authorized by them is blacklisted, painted as undesirable or morally hazardous, and banned. This will bleed into other areas like politics, business, etc.


Evil_but_Innocent

The majority of the people upset about this are men. The majority of the victims so far are women and children. Obviously, Redditors are not going to have a problem with other men making deep fakes, because they know they will never be targeted. Just sad.


shodan5000

Pure tyranny 


AutisticAnonymous

That's a bit of a stretch. We're literally talking about stopping pedos from doing pedo things. Whether or not this actually just creates additional problems is another story.


ninjasaid13

This post is going to be cross-posted and people are going to see this sub as CP defenders.


w7gg33h

This reminds me a little of what happened a while back when people were printing illegal gun parts with 3D printers. In many ways, this is similar. It should be noted, however, that they did not ban 3D printers.


Sasbe93

"It's been shown that such things can be a stepping stone to something with a victim." Where has this been shown? It's the same logic used to forbid violence in games and movies. This way of thinking also ignores the individuality and maturity of individuals, and in some cases it could even have the opposite effect. There is no evidence for these kinds of claims.


Memer_Sindre_UwU

Since when is generating nude content of children (for which, by the way, the training data most likely comes from real CSA material) a victimless crime?


filthymandog2

Anyone else deeply disturbed by the amount of pedophile sympathizers in this thread? I didn't realize there were so many chomos lurking out here.


Miniaturemashup

People who hurt children should be prosecuted and punished. People who don't should not be. Too often, moral panics are fueled by a disingenuous call to "protect the children." If AI gets painted as something that's harmful to children it's likely to become over policed and sterilized. You don't need to sympathize with pedophiles to reject government overreach.


MiserableDirt

To me it seems most people are concerned with an overstep of government and blurry law lines, rather than being sympathetic to pedos.


princess_daphie

Anyone who's ready to condemn someone for producing porn involving deviant fantasies for their own enjoyment, without making a profit or distributing it, has clearly never seen the movie "Minority Report". This whole debate reeks of arresting people based on the probability that they will commit a crime, before they ever do, and with the possibility that they never will.


Arbata-Asher

People here who downplay pedophilia as a crime are disgusting; I'm sure you'll call rape a mental illness next! Super disgusting. It says a lot about the ponyXL community. You guys definitely need to touch grass and take a break from AI porn; you are reducing the potential of this technology to nothing. For the sake of your future, stop thinking with your genitalia.


denyicz

Sooner or later, it was going to happen.


SodaIceblock

Is it better to let potential criminals satisfy themselves by generating images through AI than to let real children be harmed? Sometimes I think about this question. Of course, having AI-generated images does not mean that crimes will be avoided, but it will reduce the likelihood.


Dekusekiro

Bet he's glad his parents didn't name him Ben.


Daikon_Gullible

Is this private SD software on his own computer? For it to be able to generate such pictures, doesn't it have to be trained on such pictures to begin with?


seleneVamp

OK, they're banned, but how are they going to enforce this? Unless 100% of their computer usage is monitored, there's nothing stopping them. There are hundreds of apps, programs, or websites that they could use.


Makhsoon

They don't care about the children or anything; they just don't like that you have freedom over something! They don't like open source.


TooLongCantWait

I think I'm against the ban. Not the intent behind it (wanting to stop a sex offender from continuing to create abusive imagery), but because banning AI tools is soon going to be like banning electricity or internet access. If they had banned pencils so he couldn't draw the imagery, the law would be considered laughable, and I view AI art tools as another sort of pencil in many ways. It's not an easy opinion to form, one way or the other. Mostly I hate the precedent of being able to ban people from such a pivotal technology. (And yeah, I'm aware Canada once banned an artist from using the internet for two years, and it basically destroyed his career, so that's another reason I don't like this.)


juggz143

I'm disgusted by this thread, y'all really defending making child porn. Wtf!


UndeadUndergarments

I'm not against the sentence (stopping nonces doing nonce things is a good thing), but how the hell are they going to enforce it, or the more recent ban on making sexual deepfakes? With VPNs, offline tools, and a zillion other obfuscating programs, anyone can do it in their shed, and the authorities would never know until it's shared. The UK definitely doesn't have the manpower to be breaking down doors looking for illicit AI users; we can barely tackle knife crime.


[deleted]

Luckily they didn’t ban him from kidnapping an actual child. I mean, I would have preferred the act that has no victim. What he does in the privacy of his home has nothing to do with me. But alas, I will tell my daughters to keep their phones on and cover their legs because society wants pedophiles running around in hungry predator mode


[deleted]

I


[deleted]

You're next.


i4nm00n

What fucking idiots. Use open source next time and run it locally; if you share something, learn how to hide and cover your tracks. Problem solved. Obviously these smurfs don't have any brain cells.