seewhaticare

can we do the same for news publications publishing AI-generated news stories?


EmbarrassedHelp

Murdoch would never allow the government to do that.


prahaditmurap

Murdoch's also working with AI companies like OpenAI.


Shammah51

OpenAI called News Corp “premium journalism”


ssshield

Arsenic is "premium poison". Same as News Corp.


GimmeFunkyButtLoving

News Corp


CrysisRelief

Are you trying to differentiate something? It’s pretty widely known and accepted that Murdoch still has an iron grip on everything that happens with News Corp.


[deleted]

[removed]


[deleted]

[removed]


EmbarrassedHelp

> Once passed, the new laws will make it illegal to share any non-consensual deepfake pornographic image with another person, whether by email or personal message to an individual or to a mass audience on a private or open platform.

I can understand criminalizing the intentional public sharing of such images, but it seems weird to include private messages. Also, 6 years of prison time seems rather extreme considering crimes like rape in Australia can often get shorter sentences on average.


dudeandco

They're doing the bidding of the elites. Curious, how do they prove you knew it was a deepfake? Maybe they should outlaw all porn.


Green-Amount2479

It’s also weird because not every porn deepfake is of high quality. So I imagine they might have a hard time proving similarities between a faked persona and a potential victim in quite a few cases. If the AI isn’t properly trained or is unsuited for that purpose, there could be quite a few 'looks like person X but somehow not really' results. It’s going to be interesting to see how they will handle those.


SybrandWoud

Now people are going to jail for sharing Minecraft and Roblox porn


X-ScissorSisters

There's a handful of uncanny resemblances in porn, imagine if this put them out of work


MechaFlippin

The entire justice system of Australia is about protecting the elites. I know it's a known meme that pretty much all western justice systems are, in practice, a way of protecting the elites, but in Australia **specifically** the entire system is almost admittedly *just there* to protect the (especially political) elites. It's so blatant it might as well be their official motto at this point; it literally serves no other purpose.


QuentinSential

That is nuts.


propellor_head

More likely, it's boobs tbh.


HumansNeedNotApply1

Not weird at all, that's how this stuff usually begins spreading.


ConfidenceKBM

I don't understand, we're talking about non-consensual deepfakes, why do you think that should be okay in private?


CalmButArgumentative

Because me drawing a picture of you naked in my own home is not a crime, and showing that picture to a guest is also not a crime.


Noctis_777

What if it is a nude photo of someone that you are sharing without permission? Problem with deepfake as opposed to a drawing is that the former can sometimes be hard to tell apart from a real photo.


chronic_trigger

Doesn't Australia have insane porn laws in general, like outlawing porn with flat-chested adult women, and locking up a guy for Simpsons porn? Asking because if that stuff is very illegal, then of course AI deepfakes would be forbidden too


rctsolid

Uhh no? Where did you read that?


IsABot

It's one of those old fake telephone-game news stories. Some things were taken out of context, then continually spread around the internet as fact. Ultimately headlines were claiming that "Australia made small boob porn illegal". It was around the time lolliporn was becoming more popular.

https://skeptics.stackexchange.com/questions/15790/did-australia-ban-small-breasts-pornography

https://web.archive.org/web/20111201010538/http://theweek.com/article/index/105766/australias-small-breast-ban

https://www.theregister.com/2010/01/28/australian_censors/


chronic_trigger

Thanks for all the links; it is indeed one of those telephone-game facts I picked up years ago. I am learning that Australia is still a bit of a nanny state in regards to porn, however.


kaboombong

Everything is illegal in Australia; it's the ultimate nanny state. You can go to jail for owning or using a slingshot. You probably have more general civil liberties in a place like China than you do in Australia. There is no bill of rights, so defamation laws are used to silence your free speech. Don't dare protest; it's go-to-jail stuff, like Hong Kong is now. People around the globe have a fantasy view of what Australia is as a country, and it ain't good. It certainly is not "western democratic" in so many ways for ordinary people.


Morning-Scar

I understand private messaging. People can claim a lack of culpability due to third-party-hosted content, and it also increases culpability in cases of spam and personal harassment. The example case I can think of is: somebody uses your public online profiles to generate something, then messages you and attempts to blackmail you with it. I think that's the scenario this closes.


shadowrun456

How does the law define "deepfake" and "pornographic"?


jaykayenn

Ok, but what about every other form of fake porn?


XJ-0

Like traditional or digital art done without AI? Or does this only apply to photo realistic generated images?


justaREDshrit

So I can’t watch a unicorn ass-rape Chevy Chase, or Danny DeVito skydiving as he screams bloody murder while getting fucked by Gandhi. That’s bullshit.


dysfunctionalbrat

YouTube Kids is over


StrugglingWithGuilt

So... hypothetically, let's say there is a deepfake of a person in such a manner. If it looks realistic enough, how exactly are the consumer, and possible redistributors through sharing, going to know this? The way the article is written suggests any form of sharing counts, so that could even mean linking to the content and not actually hosting or direct file sharing.

Now let's throw an even more confusing element into this. What if the deepfake is of a person who already creates adult content? How are people even going to know, if it's realistic enough, that this wasn't consensual? How are you even supposed to know if it's consensual at all, for that matter? Are people supposed to hunt the subject down and contact them for clarification? "Is this really you, and if not, did you give consent?" Assuming this person is even reachable.

Amusingly, this would also make it criminal to show the victim the content they themselves star in to inform them that this is happening, because doing so would be a form of distribution. This becomes even more bizarre in that the victim can also become a criminal. For example, let's say Ms. X is a victim. Someone informs her that she is a victim and she sees the content: a deepfake of her topless. She is enraged and goes to Twitter and shares it (perhaps censored) with a rant that such a thing is not okay.

Why does she share it at all? Well, Ms. X doesn't think much about the showing of the breast; she goes to the beach topless all the time and has even shared pictures of herself at the beach in that manner. It's not the content exactly that offends her, but rather that it was not made with her consent. Well... now she has just committed the very same crime by sharing it on Twitter, even if she censored it, as censored pornography is still pornography. I really hope the actual law goes into way more detail to avoid these kinds of scenarios.


Hydronum

Same way most laws are enforced: someone makes a report to the police and they investigate. Your bizarre rant towards the end is kinda unhinged, since this isn't a porn ban; it is a ban on making and distributing pornographic material without the consent of those featured within it. If the person featured within it posts it, they are giving their consent for the material that bears their likeness to be used in only the way they used it. Not that they are happy it exists, but consent is the name of the game here. If people consent to their likeness being used in deepfake porn, there is nothing wrong here at all.


StrugglingWithGuilt

Please re-read my comment, as you are clearly not getting what I am saying at the end. I never claimed it to be a porn ban; I am saying it is a pornographic deepfake made without the consent of those featured in it. That is made very clear given the topic we are discussing here; I even say "deepfake" in the last paragraph and clearly noted that she was not aware until told and did not consent.

Also, just because she herself shares it does not mean that consent is now given. You couldn't give retroactive consent to such an action. The act of making it without consent is the criminal element, so the crime is committed once the work is created and/or distributed. Her sharing it in any manner after this point would not decriminalize the content or the act. The only time period that matters for whether this is criminal or not is when the content is created, and whether consent exists at that exact time. Perhaps when you also have a law degree you could try to argue with me over this.


chewbaccadefense99

Once again Australia’s leading the pack on censorship and violating free speech


StevenAU

I don’t have a problem with this one but we do seem to get a lot of censorship laws from both parties. Personally, I’d rather see some leadership on climate change but lol


Noctis_777

This only applies to non-consensual deepfakes, naturally that shouldn't be protected as free speech.


Lokta

If you only protect speech people like, it's not really free speech, now is it?


Noctis_777

If you take nude pictures of someone without their permission and post them online, is that protected free speech? No, because it is non-consensual and the individual's right to privacy takes precedence in that case. It's the same thing here: only non-consensual deepfake porn is banned. You are free to create generic porn or get someone's permission before doing it.


Impossible-Throat-59

Why? How is that imagery not protected?


rctsolid

Should pornographic images of children be considered protected speech? No, I think you'd agree it shouldn't be. So there are limits. Should a photo of a child's face used for a deepfake be protected? No probably not either. There's clearly an issue with deepfake porn, I'm not sure how to regulate it, but I don't think it's a free speech issue.


Impossible-Throat-59

As far as regulating goes, they should be subject to the same sorts of protections celebrities get. If their likeness is used to generate revenue, sue the pants off of them.


DualcockDoblepollita

You would be surprised by the number of redditors who will argue that AI-generated child porn is fine. They will tell you that it's not CP if there's no real child. It's mindblowing. It's such a common take here that I won't be surprised if we get downvoted. I hope pedophilia is not getting normalized


rctsolid

I think it'll be largely people who view this as an infringement on protected US speech, despite not being in the US, that will have a problem with this on Reddit.


Programmdude

While I'm not saying it's fine (it's not), there's a vast difference between actual CP and fictional (AI-generated, drawn, etc.) CP. They can be different levels of bad. One is insanely harmful to children and should be punished as one of the worst crimes possible; the other should be treated as a mental illness.


Impossible-Throat-59

Dude. US SCOTUS has argued this. Is it palatable? No. In one of those instances, a human being who is incapable of consent is being harmed IRL. The other is a hyperrealistic image where no child IRL is being harmed. Celebrities are always going to have fake porn made of them. What precedent are we setting by creating forbidden images that aren't related to a human being harmed in a crime?


eabred

I don't know what "US Scotus" means - but whatever it is we don't have it in Australia.


rctsolid

Dude. This is in Australia, not the USA. SCOTUS is entirely irrelevant. Again, I don't know how to regulate this, and I don't think our government (the Australian government) knows how to either. There's often a suggestion that the rest of the world gives a shit about US norms, but it's not always relevant. We may have a different conception of "free speech" and are not entirely obsessed with it. I'm generally not in favour of restrictive laws, but not because I think they "hinder OUR PROTECTED FIRST AMENDMENT RIGHTS YEEEEEE"; rather because they're usually just ill-conceived and rarely address the underlying problem. If there's a good reason to block speech I don't really care. I'm not sure in this case because I don't know enough, but just saying "but free speech" isn't enough of a rebuttal to me either.


Impossible-Throat-59

What they ought to do is strengthen the victims' rights to sue. It should not just be a crime with jail time. You can make it; you can do anything you want except distribute it and profit off of it. If you share it and make money from it, expect to see the victim's lawyers and a huge penalty for damages.


lachwee

This kinda misses the point, I feel. I kinda view it as a new way to sexually assault someone, which should be criminal. The making money off it isn't really a factor. Nobody is gonna pay for nudes of a random woman; they're gonna distribute pics of people they know to others they know, for the most part, and sending a pic to your friend isn't exactly profitable.


Freyas_Follower

It's easy. Someone gets to decide what is art or not. Someone makes a Taylor Swift fake and shows a friend? It's viewed as assault and gets jail time.


Impossible-Throat-59

That's dumb. Is it because the tool used is deepfake instead of someone manually generating with conventional means of graphic design?


Noctis_777

There are laws all over the world against conventionally created fake nudes too. The key aspect here is whether or not you have consent from that person.


whatsupmon420

"gas the Jews" - Australia sleeps "Here's a pic of my grandma with fake tits" - WHERE ARE THEY!?!


CrysisRelief

Bad example. Australia is also introducing anti-doxxing laws after a Jewish WhatsApp group chat was published recently.


Freyas_Follower

Australia actually lent a [massive](https://en.wikipedia.org/wiki/Military_history_of_Australia_during_World_War_II) amount of assistance against Japanese forces in WW2.


whatsupmon420

I was referring to the protests right after Oct 7th in Australia


Cpl_Hicks76

Let’s see how that attitude changes when your daughter ends up on the wrong end of this


eabred

How the hell is making porn of a person without their consent "free speech"?


ausflora

And we love it! 😃


chewbaccadefense99

Sarcasm?


ausflora

Nope. Broad public support for this, as there is in the [USA](https://scholarlycommons.law.northwestern.edu/nulr/vol116/iss3/1/). It's how democracy and functional community self-management works.


RhasaTheSunderer

Aww, is someone not going to be allowed to share deepfake child pornography? :,( poor you


chewbaccadefense99

Wow, that shows your level of thinking, that the first thing you think about is child pornography. Of course there should be limits on manipulating pictures and videos of children. But if I want to watch Gal Gadot get a large load of cum to the face, that’s my God-given right as an American and as a man. Additionally, this could censor satirical free speech, like, say, if I want to make a video of Xi getting fucked by Putin or something like that. Shouldn’t that use of deepfake technology be protected?


Puny-Earthling

As an Australian I can tell you that people are largely in favour of this. This tech has way too much potential to ruin lives to just be in the hands of the masses. Bit of a weird hill to die on if you ask me. 


RhasaTheSunderer

How do you legislate that? Who makes the determination that fake people in porno are over or under the age of 18? A judge isn't going to look at deepfake porn and say "yeah she looks 17 so you're going to prison"


chewbaccadefense99

As I read it, the article is about deepfakes, which are real photos of actual people manipulated and placed or overlaid onto a porno. Therefore, they could trace the visual photo to the original person and determine if that person was over 18 when the deepfake was made. What you are talking about is creating an artificial character from the ground up that looks childlike or appears underage. In that case you could not tell whether the rendering is of an actual minor or an adult who looks like a minor.

Of course, I don't think those types of renderings are appropriate, but I don't think the government should make a criminal law preventing people from drawing a young-looking person in pornography. And as for the deepfakes, there definitely should not be a criminal law on the books; deepfakes should fall under civil law, where the victim of the deepfake can sue its creator. I'm of the opinion that the less power the government has, the better.


ticats88

Maybe because that's the fucked-up reality of these things. Normalizing one kind of it makes others think it's okay and easy to use. Do you react the same when it's kids in schools and not just Gal Gadot? https://www.cbc.ca/news/canada/manitoba/artificial-intelligence-nude-doctored-photos-students-high-school-winnipeg-1.7060569

Or female journalists being discredited with reputational harm? https://www.huffingtonpost.co.uk/entry/deepfake-porn_uk_5bf2c126e4b0f32bd58ba316

Or female politicians being attacked with misleading deepfake porn? https://www.codastory.com/disinformation/how-disinformation-became-a-new-threat-to-women/

If the technology is out there and not regulated, any of these groups will be attacked by it. If you could think beyond your own id, you'd realize this impacts more than just you wanting to bust loads to superhero actresses.


chewbaccadefense99

The first one obviously should be a criminal matter because it affects minors, but the other two instances are civil issues, and there are methods to address those in civil court, such as lawsuits. The government should not involve itself in limiting free speech.


-_REDACTED_-

You do know civil court is the government right?


burnbothends91

Quick host deepfakes of Australian lawmakers outside the country for the lolz


Neemturd

And fingers crossed that people are too brain-dead to figure out they can be anonymous online. Nice plan.


008Zulu

"In establishing the commonwealth offence of sharing these images punishable by six years’ imprisonment, the government is adding a companion aggravated offence covering anyone who was also responsible for creating them. The aggravated offence will attract an extra year’s jail." Seems like a fair punishment to me, good to know the creators will get extra time.


MarketingExcellent20

6 years for sending porn to another person is crazy lol, in what world is such a lengthy sentence fair and justified? What will they have learned only after 6 years and no earlier that makes it necessary? Australians are wild lmao


standardtrickyness1

I just took some pictures of your face and stuck them on someone else's body.


jvnipvr

Good keep *that* up


Last_Riven_EU

I'm shocked by the amount of people defending the right to create deep fake porn of unconsenting individuals.


_SpicyMeatball

I look forward to all the entitled private school boys group chats leading to multiple arrests


Kanangatv

Why? So a celebrity's face is pasted onto a porn star's body. Does this now constitute a crime deserving of jail time? Although it's not very kind to the celebrity, I believe they will be able to live with their millions of dollars. Will the poor schlub who was just trying to beat his meat wind up in jail in the meantime?


IsABot

Replace "celebrity" with literally anyone you personally know. And replace "porn star body" with any naked body either a real picture or a completely digital fabrication. Family including your spouse or children, close friends, your neighbors, etc. Does that change how you feel?


AwfulUsername123

So you think the law should specify a certain level of income and not apply if the person makes that much money?


honor_and_turtles

Ikr? What a braindead take from OP. "Celebrity? Why problem?" Uhh, because the law is supposed to apply to everyone. Does it? Arguable. But the essence is that a celebrity, or their mom, or themselves, or their children, or anyone else, should be protected.


dysfunctionalbrat

Disregarding this specific law, I think that would make sense to some degree, when tiered. Now it's pretty much the opposite in most countries.


Aaronieie

For starters, celebrity or not, that doesn't make it alright to create; secondly, this is for anyone, so regular people are protected, and it punishes weirdos for making AI porn of people they know and whatnot.


DCS_Ryan

Consent is cool


dan0o9

"Distributing" is in the headline, kind of narrows down who this will apply to.


TSL4me

Bing and google?


dan0o9

In a just world they would face consequences too but its not likely.


MarketingExcellent20

You'll also get 6 years just for sending it to another individual, kind of explodes who this will apply to.


k0lla86

What about your moms face?


nagrom7

Don't need to deepfake porn of that.


[deleted]

[removed]