When AI Becomes the Perfect Scammer: The Deepfake Fraud Epidemic

Exploring how AI deepfakes are revolutionizing fraud, from voice cloning scams to fake video calls, and why traditional security measures are failing against this new threat.


Hey there,

you’ve probably heard about Google’s new image model, “nano banana.” If not, don’t worry; I’ll show you what it is and how it can be useful in the next issue of this newsletter.

Why all the excitement around this?

It preserves the appearance of people or objects remarkably well across iterations. We’re not far from it replacing Photoshop.

And yes, it’s impressive. But while testing it, I was once again struck by where we’re heading, so I decided to explore the current state of deepfakes and AI manipulation, because it’s getting harder to tell what’s real and what isn’t.

That’s when I came across this story. In 2019, an executive at a UK energy company received a phone call.

Imagine: on the other end was his boss from the German parent company. Same voice, same accent, everything exactly as usual. The boss told him to transfer 220,000 euros (about $243,000) to a Hungarian supplier. Of course, he made the transfer immediately.

The problem? His boss had never called him at all. It was a near-perfect AI clone of his voice, and the criminals had tricked him completely.

This was six years ago. Back then, we all thought “interesting, but this doesn’t happen often.” Today? This is just the beginning…


AI Fraud Is Now Everywhere

The situation has spiraled out of control. Across Europe and America, AI fraud is spreading like wildfire. In 2024, deepfake scams caused an estimated $359 million in damages. In just the first half of 2025, losses already topped $410 million, more than in all of 2024, can you imagine?

And these aren’t the clumsy Nigerian-prince emails of the past. These are phone calls, video calls, even Zoom meetings where scammers look and sound exactly like your boss, your banker, or even your child.

Some examples: in one widely reported case from early 2024, a finance employee at the engineering firm Arup in Hong Kong wired roughly $25 million after a video call in which the “CFO” and several colleagues were all deepfakes. Later that year, scammers cloned the voice of Ferrari’s CEO and nearly talked an executive into a “confidential” deal; the attempt failed only because he asked a question the real CEO alone could have answered.

This is an abuse of trust. What we see and hear is no longer necessarily real.

Anyone Can Now Be a Scammer

And what’s the worst part? You no longer need to be a hacker or spend millions on technology to pull off a scam like this. About 20 seconds of audio and a free program from the web are enough.

You can create:

Fake boss calls. “Hello, this is the director, I urgently need you to make a transfer…” And people fall for it.

Completely fictional people. AI generates a face, documents, and a résumé: a complete fake identity for loan fraud or money laundering.

Messages that sound authentic. AI writes emails that sound EXACTLY like your company’s communication.

Fake customer support profiles. “Hello, I’m from your bank, please tell me your PIN code to solve your issue…”

Once, you had to be James Bond to pull off a scam like this. Now anyone with a computer and an internet connection can do it.


Finance Is Most at Risk

Why are banks and financial institutions most vulnerable?

They move enormous sums of money every day.

Everything is based on trust and quick decisions. “The boss said it’s urgent!” and the money is already on its way.

The faces and voices of financiers are all over the web: interviews, conferences, earnings presentations. All this is material for training AI.

And what was once considered top-notch protection, voice recognition and video identification? AI now outsmarts it with ease.


When They Hit You Personally

It’s not just about corporations and banks. It’s becoming even more personal.

British journalist Cathy Newman lived through a nightmare when she discovered AI-generated porn videos with her face circulating online. She said it was “humiliating and traumatic.”

Schoolchildren are becoming victims of bullying through fake recordings in which AI puts them in humiliating situations.

Parents receive phone calls where their “children” are crying and begging for help. And can you imagine that feeling of horror when you think your child is in danger?

It’s always the same recipe: they create a sense of urgency, exploit your trust, and use AI for a perfect illusion.


The Worst Part of All

It’s not just the money people lose. We’re losing trust, and with it the integrity of society as a whole. We used to take it for granted that we could believe what we see and hear. Now the integrity of information is falling apart before our eyes.

Even authentic recordings can be dismissed as fake, while fake recordings spread as truth. In the end, nobody knows what’s real and what isn’t.

When we lose trust in truth itself, everything else starts to fall apart. Think about it: in a family without trust, there’s nothing left. The same applies to society.

Yes, the challenge is enormous. But humans have always adapted and found solutions. I truly believe we will overcome this too. With knowledge and the right tools, we can protect what is real.

Because if we lose our sense of reality, we don’t just lose information. We lose the foundation on which we build relationships and society.

There you go. I wanted to show you some of the inevitable downsides that unfortunately come with such powerful technology. Awareness is half the solution. 😉

Next week, we’ll look at how this new image model works and whether it’s really as good as they say…

Enjoy until then.

Primož

Email me to book your free 15-min AI strategy call.

Book a Free Call