
Do you have an elderly parent or grandparent? What would happen if they got a frantic call—in your voice from your phone number—saying you’d been kidnapped and needed money immediately?
We’re now in a time when your number can be spoofed, and AI can replicate your voice so convincingly your spouse, parent, or child can’t tell it isn’t you.
Some AI tech is that good. Some people are that bad.
My work is dedicated to making the world a more prosperous place. Unfortunately, in today’s world, a necessary element of prosperity is keeping your family safe from bad people who would do them harm. People are being assaulted for their crypto keys, being kidnapped like Nancy Guthrie, and having the titles to their homes stolen out from under them.
Please. Think about it…
How many spoofed spam calls or emails are you already getting? You’ve seen those emails “confirming” your invoices from the Best Buy Geek Squad, bills for iPhones you never bought, and phony Bitcoin transactions.
What are the chances your grandma gets one of those fake statements (from what looks exactly like her bank) saying a large amount is being deducted from her account, and calls the number to report it? Then she gets talked out of her PIN, and her account is wiped out. These scams, and millions more, have been around a long time.
But now, AI multiplies this threat exponentially.
I filed a new trademark recently and received five different “Official” filing notices, ranging from $500 to $7,500. All fake. Do you have any idea how many corporate accounting departments pay those automatically, without a second thought? (Many struggling solopreneurs pay them too, and for them that money is a serious hit.)
If you have a corporation, there’s a strong chance you’ve received — or even paid — a fake annual report notice. If you have a mortgage, you’ve likely seen phony title filings. And hacked social accounts are now used to solicit “loans” from friends daily. (At any given time, there are usually anywhere from one to five counterfeit Facebook pages pretending to be me, soliciting people for investments, endorsing horrible products, or hawking shit coins.)
The reason criminals attempt these scams is that they work. And now criminals are about to use AI to deluge you with more sinister exploitation than you can imagine. (And you can imagine a lot.)
The old rule was “Don’t click suspicious links.” The new rule is: Assume anything digital can be faked. Voice can be cloned, video can be fabricated, and email headers can be spoofed. Trust must now be systematized — not emotional. Verify before reacting.
AI-enabled fraud is exploding because it scales deception. Criminals no longer need skill — all they need are prompts. The defense is no longer “be careful.” The defense is systems.
Install systems, not hope. Allow me to suggest ten defense strategies to protect you and your loved ones.
1. Create a family passphrase. This one is non-negotiable. Choose a private passphrase that only immediate family knows. Make it impossible to guess and unrelated to your life: not a pet’s name, a birthday, or your mother’s maiden name. Those were all easy to hack even before AI.
Important note: the trigger for the passphrase is just as important as the phrase itself. Don’t ask, “What’s the safe word?” Use a separate triggering question instead. For example, you could say, “Does cousin Byron know about this yet?” (assuming you don’t actually have a cousin named Byron), and this would cue your loved one to respond with the passphrase.
Do not discuss the passphrase in DMs or email. Do not say it near smart devices. Keep this conversation as analog as possible. If a loved one calls claiming an emergency, speak the triggering phrase. If they can’t reply with the passphrase, take no immediate action and find a methodical way to verify. This single step stops most AI voice-clone panic scams. Finally, once a passphrase has been used, assume it has been compromised and change it.
2. Break the loop. AI fraud relies on controlling the communication loop, so your first response should be to break it. Never reply through the channel the message arrived on. Instead, verify through a channel you already know and trust: call the person back on the number you have saved, or contact the company through its official website or app.
3. Educate the people scammers target most. Scammers go after the most trusting and least tech-savvy people in your life, often elderly relatives. Have explicit conversations with them about these schemes. Normalize skepticism. Share this blog post with them and discuss it afterward.
4. Adopt the 24-hour rule. Criminals weaponize urgency. Any urgent request involving money, account access, or personal information gets an automatic 24-hour verification rule. No exceptions. Legitimate emergencies survive verification. Scams do not.
5. Reduce your digital footprint. AI voice cloning needs only a few seconds of audio. If you’re like me, the ship has already sailed on this. But if not, reduce the amount of your voice and image that is publicly available.
One member in my Breakthrough U program refuses to post his photo anywhere on the internet, and only uses an avatar. He’s not paranoid. He’s strategic.
You can’t eliminate risk — but you can reduce available training data.
6. Freeze your credit. Most countries have free systems for placing credit freezes, which prevent criminals from opening new accounts or loans in your name. You can temporarily unfreeze when applying for credit. Most people wait until after the fraud. Don’t be most people. Also consider a title insurance policy on your home and a credit monitoring service. Identity theft recovery can take months or years. Prevention takes 10 minutes.
7. Lock down your critical accounts: email, banking, cloud storage, and social media. Email is the most critical of all, because if criminals control your email, they can reset the passwords to everything else.
8. Tighten your business controls. Those AI-generated fake invoices I mentioned are rising sharply, and small businesses are prime targets because their controls are loose. Protect your business by verifying every invoice against a known vendor and requiring a second set of eyes before payments go out.
9. Turn on account alerts. Banks now offer a multitude of alerts. Use them for large transactions, new payees, and logins from new devices. The faster you spot fraud, the more likely recovery becomes.
10. Recognize emotional manipulation. AI fraud succeeds because it bypasses logic and triggers emotion: fear, urgency, love, and greed. When you feel your pulse spike, that’s your cue to slow down. Emotion is the scammer’s entry point. Cynicism is not good. Skepticism is.
Forward this to the people you love. (There are share buttons at the top of the post.) Then sit down and install these systems together. AI and other technologies are taking us into a golden era of innovation. But they’re also allowing the bad guys to do more bad things at scale. Please stay prosperous.
Peace,
- RG