AI Voice Impersonation Scams: What You Need to Know Now

In today’s digital age, rapid advancements in AI have led to the emergence of voice cloning and deepfake audio, exciting innovations that also bring serious risks, including a growing potential for scams. Here’s a look at the types of scams we can expect to flourish as voice cloning spreads, and how you can arm yourself against them. 

 

Voice-Driven Scam Types to Watch For 

Impersonation & Deepfake Scams 

Imposter Scams: Fraudsters clone voices of colleagues or superiors to request urgent financial actions. Deepfake video calls may follow. 

Business Email Compromise (BEC) Scams: AI mimics CEOs or executives to trick employees into wire transfers or data leaks. 

Extortion & Ransom: Voice clones of loved ones convey fake emergencies, coercing victims into paying ransoms. 

 

Tech & Support-Based Scams

Tech Support: Scammers pose as service reps, claiming issues and requesting access or payment. 

Subscription Renewal: Fake reps from streaming or magazine services ask for credit card info to “renew.” 

Survey & Prize: Callers claim you’ve won a prize—just share your info to “redeem.” 

 

Financial & Identity Theft 

Banking Scams: Imitate bank reps to collect account verification details under the guise of security. 

Insurance Fraud: Pose as agents requesting renewals or sensitive personal information. 

Investment Scams: Offer high-return opportunities, pressuring fast action with cloned advisor voices. 

 

Personal & Medical Scams 

Healthcare Scams: Imitate doctors or providers to solicit payments or medical data. 

Emergency Scams: Fake voices of family/friends claim accidents or crises, asking for immediate funds. 

  

Emotional Scams 

Romance Scams: Deepen online dating deception with AI-generated voice calls matching fake profiles. 

Charity Scams: Mimic real organizations after disasters to solicit donations. 

 

Lifestyle & Travel Scams

Travel Scams: Fake travel agents promote discounted vacations, demanding upfront payment. 

 

Protecting Yourself from AI Voice Scammers:

Here are some key ways to identify and prevent falling for these scams: 

  • Engage Them with a Random Question: Throw them off with an unexpected question like, “How’s the weather in [random city]?” An AI prepped with a specific script will likely falter. 
  • Test Their Musical Abilities: Ask them to hum a tune or sing a song. Most current AI voice clones struggle to match the pitch and variation of genuine human singing. 
  • Introduce Humor: Tell a joke and observe the response. AI doesn’t truly understand humor, and its response may be off the mark or entirely out of context. 
  • Watch for Repetition: AI voice clones tend to regurgitate the same scripted answers. If you notice repeated or eerily similar responses, you’re likely dealing with AI. 
  • Get Personal: Pose a question that the person they’re impersonating would know. An AI, lacking that personal knowledge, will give an incorrect answer or deflect. 
  • Call Back: If you receive an unexpected call demanding action or information, hang up and call back a trusted number for that individual or organization. 
  • Background Noise Assessment: Listen for inconsistencies in background noise. AI-generated audio might lack ambient sounds typical of a genuine call or might use repetitive background loops. 
  • Voice Analysis Software: There are emerging tools and software that can detect discrepancies between a genuine voice and its cloned counterpart. These can be especially useful for businesses or frequent targets of such scams; the sketch after this list shows the basic idea. 
  • Temporal Consistency: Engage the caller in longer conversations. AI voice clones may show inconsistencies over time, especially in longer interactions. 
  • Use of Idioms and Phrases: Every person has specific ways they speak or certain phrases they use frequently. If the voice does not use these or uses them incorrectly, it may be a sign. 
  • Ask for Past Memories: Discussing shared memories or experiences can be tricky for a scammer. They might dodge the question or give a vague response. 
  • Emotional Consistency: Gauge the emotional consistency of the speaker. While AI can mimic voice tones, matching the correct emotional tone in a dynamic conversation can be challenging. 
  • Set up Voice Passwords: For critical communications, especially within businesses, set up voice passwords or phrases known only to the parties involved. 
  • Use of Other Verification Means: Before making any financial transaction or sharing sensitive information, always verify the request through a different communication channel. Rely on other methods in tandem, such as video calls (though be wary of deepfake video technology), text verifications, or email confirmations. 
  • Unexpected Changes in Topic: Switch between topics rapidly or bring up something entirely unexpected. An AI, especially one operating on a script, might struggle to keep up or respond appropriately. 
  • Monitor for Latency: Listen for unnatural pauses or delays in the conversation. This could indicate the AI processing the received information to generate a response. 
  • Check In Regularly with Contacts: Checking in with contacts on a set routine establishes a pattern. Any deviation from it can be a red flag. 
  • Multi-factor Authentication (MFA): Introduce MFA in your business and personal dealings. This adds an extra layer of security even if someone has access to your voice or personal details. 
  • Stay Updated: Technology is advancing rapidly, so keep abreast of the latest scams, the state of voice cloning technology, and the recommended safety measures. The more you know, the better equipped you’ll be to detect fakes. 
  • Educate and Train: If you’re in a business setting, ensure your staff is trained to recognize and report potential threats. 
  • Set a verbal codeword: Choose one that only you and those closest to you know, and make sure everyone uses it in messages when they ask for help. (Financial institutions and alarm companies often set up accounts with a codeword in the same way to ensure that you’re really you when you speak with them.) 
  • Protect your identity: Identity monitoring services can notify you if your personal information makes its way to the dark web and provide guidance for protective measures. This can help shut down other ways that a scammer can attempt to pose as you.  
  • Clear your name from data broker sites: How’d that scammer get your phone number anyway? It’s possible they pulled that information off a data broker site. Data brokers buy, collect, and sell detailed personal information, which they compile from several public and private sources, such as local, state, and federal records, in addition to third parties.  
  • Trust Your Instincts: If something feels off, it probably is. Always trust your gut and take a moment to evaluate the situation. 
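
For readers curious how the voice analysis tools mentioned above work, here is a minimal sketch of the underlying idea: comparing an audio “fingerprint” (averaged MFCC features) from a recording you know is genuine against a new recording. This is a toy illustration, not a real deepfake detector; the file names and the similarity threshold are hypothetical, and production tools rely on trained machine-learning models.

```python
# Toy sketch: compare MFCC "fingerprints" of two recordings.
# Real voice-clone detectors use trained models; this only shows the idea.
import librosa
import numpy as np

def voice_fingerprint(path: str) -> np.ndarray:
    """Load audio and average its MFCC features over time."""
    y, sr = librosa.load(path, sr=16000)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=20)
    return mfcc.mean(axis=1)

# Hypothetical file names, for illustration only.
known = voice_fingerprint("known_genuine_sample.wav")
incoming = voice_fingerprint("incoming_call_recording.wav")

# Cosine similarity: closer to 1.0 means more similar fingerprints.
similarity = float(np.dot(known, incoming) /
                   (np.linalg.norm(known) * np.linalg.norm(incoming)))
print(f"Similarity: {similarity:.2f}")
if similarity < 0.9:  # hypothetical threshold; real tools tune this on data
    print("Fingerprints differ noticeably; treat the call with suspicion.")
```

Note that a high similarity score alone doesn’t prove a voice is genuine; good clones are designed to match these features, which is why the behavioral checks above remain important.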

 

Social Media Safety Tips: 

Preventing AI voice cloning involves safeguarding your voice data in much the same way you protect other personal information. As AI and deep learning technologies become more advanced, even short audio samples can be used to recreate a person’s voice.  

 

When considering social media and online sharing, here are some steps individuals can take to protect their voices: 

  • Limit Public Videos: Refrain from posting long videos where you’re speaking. If you need to share a video, consider using text overlays or subtitles instead of verbal communication. 
  • Privacy Settings: Ensure that your social media profiles are set to private, limiting access to known friends and family. Regularly review and update these settings as platforms often undergo changes. 
  • Be Mindful of Voice Apps: Be cautious when using voice-based social media applications or features, such as voice tweets or voice messages. 
  • Avoid Voice Challenges: Social media platforms sometimes have voice challenges or trends that encourage users to share voice notes or videos. Participating in these activities can expose your voice to a broader audience. 
  • Review Stored Media: Periodically check platforms where you’ve previously uploaded videos or podcasts. Consider removing or replacing older content, especially if it’s no longer relevant. 
  • Beware of Voice Phishing: Be cautious of any unsolicited calls or messages asking you to verbally confirm personal information. 
  • Educate and Inform: Let friends and family know about the risks of AI voice cloning. The more people are aware, the less likely they are to inadvertently share content that features your voice. 
  • Voice Authentication: If you use voice authentication for any services, be aware that your voiceprint is a valuable piece of data. Ensure that such services have robust security measures in place. 
  • Check for Consent: If you’re attending events or webinars, or if you’re part of podcasts or interviews, always ask how your voice will be used. If possible, get a written agreement that restricts unauthorized distribution or use. 
  • Think before you click and share: Who is in your social media network? How well do you really know and trust them? The wider your connections, the more risk you may be opening yourself up to when sharing content about yourself. Be thoughtful about the friends and connections you have online, and set your profiles to “friends and family” only so your content isn’t available to the greater public. 

 

Scenarios Most Likely to Succeed in AI Voice Scams: 

Emergency Situations: 

  • Car crash or breakdown. 
  • Robbery. 
  • Medical emergency or hospitalization. 
  • Unexpected legal trouble or arrest. 

Lost Personal Items: 

  • Lost phone or wallet. 
  • Lost passport or travel documents. 
  • Misplaced luggage while traveling. 

Travel-related Issues: 

  • Claiming to be stranded abroad and needing help. 
  • Booking mishaps or hotel payment issues. 
  • Trouble at customs or border controls. 

Financial Urgencies: 

  • Unexpected bills or debts. 
  • Taxes owed immediately to avoid penalties. 
  • Business or investment opportunity that’s “too good to miss”. 

Personal Relationship Tactics:

  • Relationship problems needing financial assistance. 
  • Unexpected pregnancies or related issues. 
  • Urgent need for money for family events, like funerals or weddings. 

Housing or Living Situation:

  • Eviction notice or immediate rent payment. 
  • Utilities being shut off due to unpaid bills. 
  • Natural disasters causing urgent relocation. 

Digital Compromises: 

  • Ransom for supposedly compromised explicit photos or videos. 
  • Alleged hacking of personal accounts demanding money for recovery. 
  • Payment demands following unauthorized software or media downloads. 

Employment or Job Opportunities: 

  • Unanticipated travel expenses for a “guaranteed” job offer. 
  • Payment for training or certifications for an “exclusive” opportunity. 
  • Advance payment for freelancing or work-from-home opportunities. 

 

Account Takeover Prevention: 

In the context of bank account takeovers, AI voice impersonation can be alarmingly effective, especially when combined with other tactics. Here’s how: 

  • Phishing Calls: Using a familiar voice (like a bank or credit union executive or staff) to convince account holders to share sensitive information such as PINs, passwords, or OTPs (One-Time Passcodes). 
  • Two-Factor Authentication (2FA) Bypass: Impersonating the account holder to request or intercept 2FA codes through a call. (See the sketch after this section for how app-based codes work.) 
  • Resetting Account Credentials: Using voice impersonation to call customer support, posing as the account holder, and requesting a password reset or account changes. 
  • Fake Account Alerts: Posing as the financial institution’s fraud department to report suspicious activity and convincing the user to provide or confirm account details or move money to a “secure” account. 
  • Manipulating Account Security Settings: After gaining initial access through voice impersonation, the attacker might alter account settings to ease future unauthorized access. 
  • Authorizing Fraudulent Transactions: Using voice commands to authorize payments or wire transfers via phone banking systems that rely on voice authentication. 
  • Gathering Additional Information: Engaging in casual conversations to extract more personal information from victims, which can then be used in further scams or for security questions. 
  • Social Engineering of Bank or Credit Union Staff: AI voice impersonation can be used to sound like a senior banking executive, instructing junior employees to make unauthorized transactions or changes to an account. 
  • Mimicking Recorded Verbal Approvals: If a financial institution records verbal consent or approvals for documentation purposes, AI voice cloning can be used to forge such consent. 
  • Combination with Deepfake Technology: Combining voice impersonation with deepfake video can lead to convincing video calls, further fooling victims or bank personnel. 
  • Creating Fake References or Verifications: Using AI to simulate voices of references or contacts that the financial institution might call for account verification. 

Remember, while these tactics are potential threats, it’s essential to be aware of them to put defenses in place. Financial institutions are continually enhancing their security measures and training their personnel to recognize and prevent such attempts. On the user’s end, being cautious and verifying any unusual requests independently can be a strong defense against potential AI voice impersonation scams. 
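
As a concrete example of why app-based 2FA helps, here is a minimal sketch of time-based one-time passcodes (TOTP), the kind generated by authenticator apps, using the pyotp Python library. The flow shown is a generic illustration, not any particular institution’s implementation: valid codes are derived from a shared secret plus the current time, so a cloned voice alone cannot produce one.

```python
# Minimal TOTP sketch with the pyotp library: codes come from a shared
# secret plus the current time, so a convincing voice alone isn't enough.
import pyotp

# Done once at enrollment: generate a shared secret and load it into the
# user's authenticator app (typically via a QR code).
secret = pyotp.random_base32()
totp = pyotp.TOTP(secret)

# At verification time, the user reads the 6-digit code off their app...
code_from_user = totp.now()  # stand-in for the code the user types in

# ...and the server checks it against the same shared secret.
if totp.verify(code_from_user):
    print("Code valid: second factor passed.")
else:
    print("Code invalid: deny the request, no matter how the voice sounds.")
```

The corollary to the 2FA bypass tactic above: a one-time code is only a second factor if you never read it to a caller. Legitimate institutions won’t phone you and ask for one.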

 

Fake Voices: How AI Tricks People 

Realistic Voices:
AI can be trained to copy almost any voice, in any language or accent. This means scammers who don’t speak English well can still sound like native English speakers. 

Sounds More Real:
Even if a scammer speaks some English, you might notice their accent or grammar mistakes. But with AI, the voice can sound completely natural—like someone who grew up speaking English—making the scam more believable. 

Easier to Do at Scale:
With AI, scammers don’t need a big team. They can create lots of fake calls or messages using just one tool that speaks perfect English. 

Can Change Easily:
AI voices can be changed quickly to match different tones, styles, or ways of speaking. This helps scammers sound more like the people they’re trying to fool. 

Uses Online Voice Clips:
There’s a lot of voice data online—like from speeches, podcasts, and videos. Scammers can use this data to train AI to copy someone’s voice, even if they don’t understand the language. 

Works with Other Tech:
Scammers can use AI voices along with other tools, like chatbots. This means they can have full conversations with people using AI that talks and responds in real time. 

Fewer Mistakes:
Unlike humans, AI doesn’t forget scripts or get flustered. It’s consistent and less likely to make mistakes that would reveal it’s fake. 

Tricks Voice Security:
Some banks use voice recognition to verify who you are. AI can fool these systems by copying someone’s voice—even if the scammer doesn’t understand the words being said. 

Pre-Programmed Replies:
AI can be set up with scripted answers to common questions, so the conversation sounds smooth—even if the scammer isn’t really talking. 

 

Want to Learn More Security Tips?

Explore Honor’s Security Center for our procedures and resources in the event you encounter fraudulent activity.
