A former dark-web fraudster who once stole millions of dollars through identity theft now says the biggest online threats no longer come from humans at all. Brett Johnson, who later worked with the U.S. Secret Service as a consultant, argues that artificial intelligence is transforming cybercrime into something faster, more organised and far more difficult to trace.
In recent interviews reported by Digi24 and Business Insider, Johnson said the most alarming developments he sees today — deepfakes, industrial-scale scam farms and synthetic identities — are all powered by AI systems that can outpace human detection.
A new phase for online deception
Johnson, who spent more than a decade hacking financial systems and selling stolen credit cards before turning to law enforcement, said the criminal ecosystem he once helped build is evolving rapidly. Instead of lone operators, he now sees entire operations run on automated tools that can mimic voices, generate fake video calls or craft personalised messages in seconds.
Deepfake technology, he warned, is only at the beginning of its criminal potential. Fraudsters already use AI-generated video and audio to impersonate colleagues or trusted officials. Johnson pointed to a case in which a finance worker approved more than $25 million in transfers during a video call populated entirely by AI-generated doubles of his real co-workers.
He said this kind of realistic impersonation erodes the most basic foundation of fraud prevention. Criminals traditionally needed to gain a victim’s trust; with deepfakes, they can hijack trust that already exists. The result, he said, is a future in which “we’re no longer able to trust anything we see or hear in an online environment.”
Scam farms scale up
Johnson also described a shift toward organised scam farms — physical complexes where trafficked or coerced workers carry out cons in rotating shifts. These operations run romance schemes, cryptocurrency fraud and long-term emotional manipulation, draining victims of life savings. Rather than loose networks of criminals, he said, these outfits now resemble corporate structures with supervisors, quotas and defined roles.
Cases reported in both outlets reflect that shift. One victim, believing he was investing alongside an online romantic partner, put more than a year's salary into a fake cryptocurrency scheme. Having lost the money, he ultimately left Turkey for Uzbekistan to earn enough to support his family.
Synthetic identities rise
Another fast-growing threat, Johnson said, is synthetic identity fraud: the blending of real and fabricated personal information into a digital persona that corresponds to no actual person. Synthetic IDs now dominate new-account fraud worldwide, he told both outlets, accounting for an estimated 80% of cases in that category and a growing share of credit losses across the financial system.
Because these identities belong to no real person, there is no victim to notice and report suspicious activity; a synthetic persona can accumulate credit, open accounts and disappear without the warning signs that typically accompany stolen data. Johnson warned that as AI automates the creation of these personas, banks may struggle to detect the fraud until long after the damage is done.
Staying protected
Online crime is easier to commit than ever, Johnson said, because criminals no longer need technical knowledge — they can buy tutorials, scripts and tools off the shelf. But he argues that individuals can still reduce their risk.
He advises maintaining awareness on every platform, freezing credit for all household members, placing alerts on financial accounts, avoiding password reuse, enabling multifactor authentication, and limiting personal information posted on social media. These steps, he said, can significantly slow down would-be attackers in a landscape where AI accelerates everything else.
Sources: Digi24; Business Insider