Deep Fakes Coming To A Screen Near You

Last week, attackers stole 25 million dollars from a company in Hong Kong using a deep fake video stream. They used AI to impersonate company executives on a video call and instructed an employee to wire the money.

The poor guy sent the money, and I bet he feels terrible. Another story making the rounds was the recent wave of deep fake nudes of Taylor Swift, which has apparently boosted sponsorship of a bill in the US Congress that aims to protect victims of deep fake porn.

How concerned should we be?

I looked into free deep fake video creators this week and wasn’t impressed with what I could find. I bet the paid tools are orders of magnitude better. This is something I will keep an eye on.

All indications point towards stormy weather ahead. But the good news is that if you’re reading this, you’re trying to stay ahead of the storm, and that’s important. A pillar of BlueOx's defense philosophy is the idea that perfection is unnecessary. The goal is to get ahead of the pack, at which point you’re much less likely to become a victim.

What should we be looking out for?

Fake videos or streams - Imagine the following scenario. A video call comes in from an unknown number, but it looks like a friend or family member. The video cuts in and out a bit. The message is urgent: your ‘friend’ or ‘family member’ is in trouble and needs your help immediately. The urgent situation will almost certainly involve something tragic.

Phone Call - You get a call from a friend. It sounds identical to your friend, and the caller ID shows her number. She's been traveling and has been in an accident. She needs money immediately and asks you to send it.

While it sounds crazy, remember that these people have spent years thinking about how to manipulate and trick you. They will try to put you on tilt and provoke strong emotions.

What should we do?

  1. Awareness is crucial. The chances of a scam like this working on you drop sharply once you know it exists. You're on the right track if you’re reading this!
  2. Call back - Don’t ask the caller, just do it. For example, if it’s your “son,” hang up and call him back directly (using the contact stored in your phone). This is a straightforward step. There is a well-worn maxim in information security – trust but verify.
  3. Set up a trusted family safe word - Create a family safe word with a simple rule: if an emergency arises, the person asking for help should volunteer the word. You don’t want to be in the position of asking the scammer, “What's the secret word?!”. In general, we aim to balance security with usability. A safe word might be unreasonable for some, but it could be helpful for older and at-risk members of the community. It also doesn’t scale well, but it’s not a bad idea for your close and immediate family.
  4. Our lesson, Deep Fake Detection Master, will help take your defense to the next level, giving you a chance to practice and hone your skills.

Conclusion

Concerns about using AI for malicious purposes are on the rise, but as always, it’s essential to balance click-thirsty media with reality.

Deep fake tech is not likely to slow down anytime soon, so we should expect to see more scams and hacks leveraging this technology for nefarious ends.

The first crucial step in staying ahead of these threats is an awareness that the tech exists and is being used for malicious purposes. If you have older friends and family who could fall victim to a scam using deep fakes, you should help them understand what deep fakes are and what they are being used for.

Implementing a ‘safe word’ with close family members, especially older family and friends, could be helpful. While this does take some extra work, it can be a powerful tool for short-circuiting an attack.

I provided a copy of this post to Coach Ox, our new AI-based support assistant. If you have any questions, please ask Coach Ox or post in our new community!