Rep. Alexandria Ocasio-Cortez, D-N.Y., is pushing for federal legislation that would crack down on deepfake pornography generated by AI, calling it “sexual violence,” The Hill reported.

“Since the public release of AI tools, out of all the images and video that AI has generated, over 90 to 95% of it has been nonconsensual deep fake pornography and over 90% of that is targeting women,” Ocasio-Cortez said in a YouTube video released Tuesday while touting the Defiance Act.

“This is sexual violence. As a prominent and visible female elected official, I’ve been personally targeted by this, but it’s not just prominent people in the limelight. What is actually concerning is the way that this is being used to target everyday people, and if you are a woman of any kind that wants to aspire to anything — starting a business, becoming a teacher, running for office — you are overwhelmingly facing the risk of being targeted by this kind of reputational sexual violence that is at its core exploitative.”

Ocasio-Cortez said the bipartisan bill would amend the Violence Against Women Act to create a civil cause of action for victims of deepfake pornography.

“What that means is that if you are a victim or survivor of AI fake pornography, you will start to have federal protections where you can begin to pursue accountability in court for perpetrators and people who generate, perpetrate and spread this kind of imagery against you,” she said.

Politico reported that the White House issued a “call to action” this week, urging Congress to strengthen legal protections for victims of AI-generated fake pornography, but that lawmakers have struggled to find a solution.

Solange Reyner

Solange Reyner is a writer and editor for Newsmax. She has more than 15 years of experience in the journalism industry covering news, sports, and politics.

