Fears Grow Over the Rise of Degrading Deepfake Pornography

Concerns are mounting as deepfake pornography cases surge, prompting calls for urgent legislative action to protect victims

#Politics#

#Deepfake Pornography#, #Australia#, #Chanel Contos#, #AI#, #Legislation#, #Image-Based Abuse#

Melbourne: Concern is mounting over deepfake pornography, with experts warning that lawmakers cannot keep pace with how quickly the technology is evolving. Creating fake, hyper-realistic pornographic images has become alarmingly easy, and the harm is spreading.

A deepfake is digitally manipulated media that depicts a person doing something they never actually did. It is a form of image-based abuse, and its toll is growing. Last year, Taylor Swift was targeted, and more recently a student in southwestern Sydney got into trouble for creating deepfake pornography of female classmates using AI and photos taken from social media.

In another case, a student in Victoria created graphic images of around 50 girls from his school, and in Melbourne, fake sexual images of a female teacher were circulated. Organizations such as Teach Us Consent say they are being flooded with reports of this kind of abuse from young people.

Chanel Contos, who runs Teach Us Consent, said deepfake technology is often used for bullying or humiliation, driven by a sense of male entitlement. eSafety Commissioner Julie Inman Grant added that the pace of technological advance is making it increasingly difficult to tell what is real and what is fake.

She said her office is already receiving reports of AI-generated child sexual abuse material. The Australian Federal Police share the concern, warning that the rise of AI could make it harder to identify real child victims among the synthetic material in circulation.

Importantly, any material depicting child abuse, whether real or AI-generated, is classified as child abuse material and is illegal in Australia. Even so, experts are calling for more robust laws to tackle the problem, warning that the technology is moving far faster than legislation can keep up.

Contos also said that while public figures such as Andrew Tate may influence some men to see this behavior as acceptable, they are not the sole driver of the rise in deepfakes. Easy access to pornography, she said, is fueling the demand for deepfake content featuring real people.

Legislation addressing non-consensual deepfake pornography was passed last year, but there are real fears it will not go far enough. The Attorney-General has said the new laws will apply to all forms of deepfake material shared without consent, with serious penalties for offenders. Whether they can keep pace with the technology remains an open question.

Ultimately, the problem needs to be tackled from every angle: legal, social and cultural. That means teaching respect and equality, and holding technology companies accountable for the misuse of their products. If you or someone you know has been affected by image-based abuse, support resources are available.

Image Credits and Reference: https://au.news.yahoo.com/fears-over-growth-degrading-act-063930350.html