Women in AI: Dr. Rebecca Portnoff is Working to Safeguard Children from Harmful Deepfakes

In keeping with its ongoing Women in AI series, an effort to give AI-focused women academics and others more time in the spotlight, TechCrunch caught up with Dr. Rebecca Portnoff, vice president of data science at the nonprofit Thorn, which builds technology to protect children from sexual abuse.

She did her undergraduate degree at Princeton University before getting her PhD in computer science at the University of California, Berkeley. She has worked her way up at Thorn since 2016, starting as a volunteer research scientist, and now leads a team that is perhaps one of the only ones in the world dedicated to building machine learning and AI to stop, prevent, and defend children from sexual abuse.

"So, during my senior year at Princeton, when I was determined to figure out what to do after graduation, my sister recommended that I read 'Half the Sky' by Nicholas Kristof and Sheryl WuDunn, which brought me to the topic of child sexual abuse," she told TechCrunch, saying the book inspired her to study how to make a difference in this space.

She went on to write her doctoral dissertation on the use of machine learning and AI in this space.

Her mission to protect children

At Thorn, Portnoff's team works on identifying victims, preventing revictimization, and preventing the viral spread of sexual abuse material. Last year she spearheaded the joint Thorn and All Tech Is Human Safety by Design initiative, which aims to prevent people from using generative AI to commit sexual harm against children.

"It was a tremendous lift, collectively defining principles and mitigations to stop generative models from producing abuse material, to make such material more reliably detected, and to prevent the distribution of the models, services, and apps used to produce this abuse material, and then aligning industry leaders to commit to those standards," she recalled.
She said she'd met lots of dedicated people along the way, but "I've also got more grey hair than I did at the start of it all."

Another big discussion has been the use of AI to create images of a person's nudity or sexual acts without their permission, especially as AI porn generation has become highly sophisticated, as TechCrunch has reported. To date, there is no comprehensive federal law that protects people from sexual generative AI images created of them without their consent, though individual states like Florida, Louisiana, and New Mexico have enacted their own legislation targeting AI child abuse. In her view, this is one of the most serious issues facing AI as it evolves. "One in 10 minors report they knew of cases where their peers had generated nude imagery of other kids," she said.

"We don't have to live in this reality, and it's unacceptable that we've allowed it to get to this point already." She said, however, that there are mitigations that can be put in place to prevent and reduce this misuse.

Thorn is pushing tech companies to adopt its safety-by-design principles and mitigations, and to publicly share how they are preventing the misuse of their generative AI technologies and products to further child sexual abuse. It is also working with professional organizations like the Institute of Electrical and Electronics Engineers (IEEE) and the National Institute of Standards and Technology (NIST) on setting standards that companies could be audited against, and it is engaging with policymakers on how important this is.

"Legislation based on impact will have to be in place to bring all parties and companies on board," she said.  Working as a female in AI

As Portnoff gained more prominence in AI, she recalled, people would dismiss her advice and ask to speak with someone who had technical expertise. "My answer? 'No problem, you are speaking with someone with a technical background,'" she said.

She said a few things have helped her navigate working in such a male-dominated field: preparing herself, acting with confidence, and assuming good intentions. Being prepared helps her enter rooms with more confidence, while confidence allows her to navigate challenges with curiosity and boldness, "seeking first to understand and then to be understood," she continued.
"Assuming good intent helps me approach challenges with kindness rather than defensiveness," she said. "If that good intent truly isn't there, it'll show eventually."

Her advice to women interested in entering AI is to always believe in their own ability and significance. She said it's easy to fall into the trap of letting the assumptions people make about you define your potential, but that everyone's voice is going to be needed in the current AI revolution.

"As ML/AI becomes more embedded in our human systems, all of us need to work together to ensure that it's done in a way that builds up our collective flourishing and strengthens the most vulnerable among us."

Building Ethical AI

There are many facets of responsible AI, she said, including transparency, fairness, reliability, and safety. "But all of them have one thing in common," she continued. "Responsibly building ML/AI requires engaging with more stakeholders than just your fellow technologists."

That means more active listening and collaboration. "If you are following a roadmap for building responsible AI, and you find that you haven't talked to anyone outside your organization or your engineering team in the process, you're probably headed in the wrong direction."
And as investors continue to pour billions of dollars into AI startups, Portnoff suggested that they can start looking at responsibility early on during due diligence, examining a company's commitment to ethics before making an investment, and then requiring certain standards to be met. This, she said, can "prevent harm and enable positive growth."

"There's a lot of work that needs to be done," she said, speaking generally. "And you will be the one to make it happen."

 
