So how does Tay Tay shake off digital pimps? Meet Nina Jankowicz, the digital deep-fake slayer

By Julian Bajkowski

April 17, 2024

Nina Jankowicz is the founder of Sophias Strategies and a former Fulbright-Clinton Public Policy Fellow. (Image: The Wilson Center)

“I personally know very well what it means to be the person who is the villain of the week, or the month or the year. Frankly, it was over a year for me. I still am sometimes mentioned on Fox News, and every time I am, it drives a huge wave of harassment toward me and my family, which is one of the reasons I’m suing Fox News for defamation. This is the power that can’t be understated.”

Those are the words of Nina Jankowicz, disinformation and online interference expert, former inaugural head of the US Department of Homeland Security’s short-lived Disinformation Governance Board (DGB), and a persistent and fierce campaigner against gendered abuse online.

These days she’s the vice president of the Centre for Information Resilience, and also the author of the key texts “How to Lose the Information War” and “How to Be a Woman Online”.

If you’ve ever wondered what it’d be like to be a lightning rod for every kind of shock-jock tactician, habitual liar and epistemological engineer plying their dark trade on the right, left and straight-up for-profit sides of online propaganda (think platforms), Jankowicz is the woman all those haters love to hate, and with good reason. (And yes, there is a strategic Tay Tay angle; be patient.)

The background sheet is pretty simple.

In a brief dalliance with regulating the outer extremities of the riot-grade firehose of online bile and lies that supposedly doubles as political discourse these days, the Biden administration announced the Disinformation Governance Board in April 2022 — and it was dead by the end of the next month thanks to a pile-on that attracted everyone from Trump supporters to progressives decrying the creation of an alleged “truth police”. Viva that First Amendment.

It was a moment that sent a shudder down the spines of many other governments that had watched, at a distance, the collision of foreign interference, disinformation, astroturfing and behavioural influence and manipulation by electronic means that broke into the open during the 2016 US presidential election, went on to the Capitol riot, and is still running hot as the next election looms, and not just in the US.

Campaign monitor

“We’ve seen campaigns, taken down by the French government, that the Russians were trying to interfere in the 2017 French presidential election. We’ve seen the Iranian interference in 2020 where Iranians were pretending to be Proud Boys in the US election and targeting Democratic voters and swing states, in order to try to suppress their voter turnout,” Jankowicz says.

Speaking with Dr Miah Hammond-Errey on the Technology and Security podcast, released today, Jankowicz is concerned that the ideologically unfettered preoccupation with free speech in the US will potentially impede the mitigation of threats “that we saw in previous election cycles”.

The stakes are higher too, because there is now a hot war in Ukraine, where Russian president Vladimir Putin stands to benefit if Western and American support for Kyiv wavers. Elon Musk, meanwhile, now owns the platform formerly known as Twitter.

With brute mechanical manipulation of platforms through fakes, bots and rented trolls much harder than in 2016, Jankowicz reckons Russian diplomatic accounts, “through Russian media and through information laundering”, are identifying “useful idiots” around the world “who are willing to carry the flag for Putin and the Kremlin”.

“That’s really hard to push back against,” Jankowicz says. “It’s not a fake account. It’s somebody who’s expressing their real opinion and moving that across the world’s information environment.”

Local and/or general

The interview was recorded immediately before the Bondi Junction atrocity and the stabbing of an Assyrian Orthodox bishop in Sydney just days later, but the local Putin protagonists are already in plain view. Simeon Boikov, aka @AussieCossack, was swiftly expounding on both incidents.

A key claim from Boikov was the false statement that the Bondi assailant was a Jewish man. The bullshit gained traction in the mainstream and it ran. The false attribution was debunked, but it lingers in all the wrong fora.

According to a BBC report, Boikov is still holed up at the Russian Consulate in Sydney. He has been posting from there since a warrant was issued for him over the alleged assault of an elderly Ukrainian man at a demonstration, an incident caught on film that shows the man tumbling to the ground.

Useful, definitely. Idiot, maybe not so much.

X-Ray Spex

Pressed on the need for regulation of social and online platforms, Jankowicz errs on the side of transparency and understanding rather than the traditional approach of whack-a-mole. That might sound light touch, but it makes more sense in terms of actually knowing what’s in the pipe and about to hit the fan.

“So I think the first order of business is [to] have these hard conversations about what is the proper relationship between government and the social media platforms; what oversight should the social media platforms have?” Jankowicz told Hammond-Errey.

“What I would like to see is not even a regulatory regime but a transparency and oversight regime over the social media platforms. So we understand the decisions that they’re making, what they’re moderating, why they’re moderating it, how much they’re responding to user reports of harassment and threats and things like this.”

Just how far Australia, let alone the US, is from that was borne out on Tuesday when Australia’s eSafety commissioner, Julie Inman Grant, revealed Meta and X had been issued takedown notices relating to images of the stabbing of the Assyrian Orthodox bishop at the pulpit on Monday.

The scenario mirrors the latency and lag that accompanied the Christchurch, New Zealand, mass killings by an Australian neo-fascist armed with an AR-15 assault rifle, where platforms failed to respond in time to stop huge volumes of horrendous trophy footage being copied and shared.

And it all comes as Meta walks away from the existing news bargaining code so it doesn’t have to pay regulated publishers for verified news it profits from.

A declared war against women being deep-faked?

There is also something resonant about Susan Faludi in Jankowicz’s broader take, if you can imagine Faludi (or Barbie) as a national security analyst.

While she doesn’t explicitly call it out, Jankowicz points to powerful new technology gaining mass accessibility and unregulated utility through, well, porn, the same force that first propelled the internet into the maelstrom of unauthorised content-sharing that later consumed the recording industry and Hollywood.

The problem with porn was that you first needed to convince people to consensually participate in making it. Then you had to compensate them, and then try to make a margin on distribution.

Some 25 years of the internet destroyed that classic distribution model. AI, and the advanced special effects generated by Hollywood’s tech sector, have delivered a new business model, and it respects even fewer boundaries than before.

The inputs are not paid and are predominantly non-consensual.

“One of the things that I try to raise the alarm bell a lot about is deepfake pornography. Most of the deepfakes that exist today are non-consensual deepfake porn of women, I think over 96%. Now, that statistic is old, from 2019,” Jankowicz says.

“Misogyny has been normalised by politicians, by people in power, by influencers. And so when there’s no consequence for the people at the top who are doing it, we see kind of an open door for anybody else to engage in it as well,” she observes.

Porn, like a lot of the so-called sex industry, used to be on the margins. It may have been exploitative, but that was because access to people willing to make it was demographically limited. The 1980s and 1990s thinned that even further through epidemiology.

There are definitely some blurred and broken boundaries, but in the main, industry participants did not want to go to gaol. That has all shifted again, though.

Deepfakes and the vaporising of consent

Jankowicz makes an important point: all women, but especially those with a public or political presence, no longer necessarily get a say over how their bodies, or other people’s bodies, get used as a digital means to an end.

She particularly calls out the initial training of AI as a precursor to the repetition of aberrant behaviour.

“These [AI] models are trained on women’s bodies, and a lot of that kind of modelling already exists. I’m worried about that. I’m worried just about the constant violence that women in our kind of public life have to face… I think there’s such a reticence to call it what it is: that it is violence,” Jankowicz said.

It can be argued that virtualised violence against women is essentially digitised rape.

Jankowicz says that, having experienced digital violence herself, “knowing kind of what you go through as a person, in your trauma response, your physical response when the stuff is happening to you, there’s no other way to describe it.”

“And it does have a lasting effect on you.”

Shake it down

So, where’s the Taylor Swift angle, other than her immense appeal and resonance? Probably in her ability to propel the ordinary as extraordinary to the masses. The great schtick isn’t just the message but its potent delivery in the lingua franca of the age. It wasn’t called the “Eras” tour for nothing.

“My hope with Taylor is that she will use the position of power that she’s in not just to kind of call attention to this with Congress, or the tech companies, but to really bring together a community of deepfake survivors and use her power, her influence, to potentially have a class action suit against the creators or distributors of these deepfakes,” Jankowicz told the podcast.

“In large part, deepfake pornography on the internet exists on a couple of key websites that Google indexes. So, again, potentially pressuring Google to de-index and demonetise these sites or potentially bringing a class action suit against the sites that store and amplify these deepfakes.”

Sounds like a plan. The Mandarin is reliably informed by teenagers and young adults that searching for Tay Tay on several popular free adult content websites is a bit of a waste of time, indeed a nil result.

You need to go much darker, much deeper. That says something.

Imagine a universal right not to be virtually raped through AI on a global platform. Could be a thing. Could be a right.


READ MORE:

How AI deepfakes threaten elections across the world in 2024
