AI Voice Impersonation Scams: What You Need to Know Now

Rapid advances in AI have made voice cloning and deepfake audio widely available, and with them comes a growing potential for scams. Here’s a look at common scam types and how to protect yourself.


Voice-Driven Scam Types to Watch For

Impersonation & Deepfake Scams

  • Imposter Scams: Fraudsters clone voices of colleagues or superiors to request urgent financial actions. Deepfake video calls may follow.
  • Business Email Compromise (BEC) Scams: AI-cloned voices of CEOs or executives pressure employees into sending wire transfers or disclosing sensitive data.
  • Extortion & Ransom: Voice clones of loved ones convey fake emergencies to coerce payments.

Tech & Support-Based Scams

  • Tech Support: Scammers pose as service reps, claiming issues and requesting access or payment.
  • Subscription Renewal: Fake reps ask for credit card info to “renew.”
  • Survey & Prize: “You’ve won!” calls that ask you to share personal info to redeem.

Financial & Identity Theft

  • Banking Scams: Imitate bank reps to collect “verification” details.
  • Insurance Fraud: Pose as agents requesting renewals or sensitive data.
  • Investment Scams: High-return pitches pressuring fast action with cloned advisor voices.

Personal & Medical Scams

  • Healthcare Scams: Imitate doctors or providers to solicit payments or medical data.
  • Emergency Scams: Fake voices of family/friends claim accidents or crises and demand immediate funds.

Emotional Scams

  • Romance Scams: AI-generated voice calls deepen online dating deception.
  • Charity Scams: Mimic real organizations after disasters to solicit donations.

Lifestyle & Travel Scams

  • Travel Scams: Fake “agents” promote discounted vacations and demand upfront payment.

Protecting Yourself from AI Voice Scammers

Key ways to identify and avoid these scams:

  • Engage with a random question: Throw them off with something unexpected (e.g., “How’s the weather in ___?”).
  • Test musical ability: Ask them to hum or sing; clones struggle with natural singing variation.
  • Introduce humor: Tell a joke; AI often responds oddly or out of context.
  • Watch for repetition: Scripted, repeated phrasing is a red flag.
  • Get personal: Ask for details only the real person would know.
  • Call back: Hang up and call a trusted number for the person or organization.
  • Check background noise: Listen for unnatural or looping ambience.
  • Use voice analysis tools: Emerging software can flag cloned audio (most useful for businesses).
  • Stretch the call: Inconsistencies often appear over longer conversations.
  • Check idioms and phrasing: Off, missing, or misused personal catchphrases can expose fakes.
  • Ask about shared memories: Scammers will dodge or answer vaguely.
  • Gauge emotion: AI struggles to maintain natural emotional tone dynamically.
  • Set voice passwords/codewords: Use a private phrase for critical communications.
  • Verify via another channel: Confirm requests by text/email/video (stay wary of deepfake video too).
  • Switch topics abruptly: Scripted AIs often stumble.
  • Notice latency: Unnatural pauses can indicate AI processing.
  • Regular check-ins: Establish routines with contacts; deviations can signal fraud.
  • Enable MFA: Multi-factor authentication adds protection even if your voice is cloned (see the sketch after this list).
  • Stay updated: Keep current on scam tactics and safety tips.
  • Train your team: Educate staff to recognize/report threats.
  • Use a family codeword: Require it in “help” requests.
  • Protect your identity: Monitoring can alert you if data leaks to the dark web.
  • Opt out of data brokers: Remove personal info from people-search sites where possible.
  • Trust your instincts: If something feels off, pause and verify.
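
A note on the MFA and codeword points above: even a perfect voice clone cannot supply a one-time code from your authenticator app or a private phrase agreed on in person. The snippet below is a minimal sketch of that idea, not a description of Honor’s systems; the pyotp library, the secret, and the codeword are assumptions made purely for illustration.

    # Minimal sketch: why a cloned voice alone cannot pass a codeword + MFA check.
    # The secret and codeword below are placeholders invented for this example.
    import hmac

    import pyotp  # pip install pyotp

    TOTP_SECRET = pyotp.random_base32()   # stored with the account, never spoken aloud
    FAMILY_CODEWORD = "blue-heron-42"     # hypothetical private phrase agreed on in person

    totp = pyotp.TOTP(TOTP_SECRET)

    def verify_request(spoken_codeword: str, one_time_code: str) -> bool:
        """Approve a sensitive request only if BOTH independent checks pass."""
        codeword_ok = hmac.compare_digest(spoken_codeword, FAMILY_CODEWORD)
        totp_ok = totp.verify(one_time_code)  # 6-digit code from an authenticator app
        return codeword_ok and totp_ok

    # The real person passes; an impersonator with only a cloned voice does not.
    print(verify_request("blue-heron-42", totp.now()))  # True
    print(verify_request("trust me, hurry", "000000"))  # almost certainly False

The same principle works at household scale: agree on the codeword in person and never repeat it to an inbound caller.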

Social Media Safety Tips

Preventing AI voice cloning involves safeguarding your voice data just like other personal information. Even short clips can be enough to clone your voice.

When sharing online, consider:

  • Limit public videos: Avoid posting long speaking clips; consider captions instead.
  • Use privacy settings: Keep profiles private; review settings regularly.
  • Be mindful of voice apps: Use voice features cautiously.
  • Avoid “voice challenges”: Trendy voice posts expand exposure.
  • Review stored media: Remove outdated voice-heavy content.
  • Beware of voice phishing: Don’t verbally confirm sensitive info to unsolicited callers.
  • Educate friends/family: Awareness reduces accidental sharing.
  • Harden voice authentication: Ensure services using your voiceprint have strong security.
  • Get consent details: For webinars/podcasts, ask how your audio will be used.
  • Think before you share: Limit audience to people you truly know and trust.

Scenarios Where AI Voice Scams Are Most Likely to Succeed

Emergency Situations

  • Car crash or breakdown
  • Robbery
  • Medical emergency or hospitalization
  • Unexpected legal trouble or arrest

Lost Personal Items

  • Lost phone or wallet
  • Lost passport or travel documents
  • Misplaced luggage while traveling

Travel-Related Issues

  • “Stranded abroad” stories
  • Booking mishaps or hotel payment issues
  • Trouble at customs or border controls

Financial Urgencies

  • Unexpected bills or debts
  • “Immediate” tax payments to avoid penalties
  • “Too good to miss” investment offers

Personal Relationship Tactics

  • Relationship problems needing money
  • Unexpected pregnancies or related issues
  • Urgent family event expenses (funerals, weddings)

Housing or Living Situation

  • Eviction or “immediate” rent payment
  • Utilities about to be shut off
  • Disaster-related urgent relocation

Digital Compromises

  • Ransom for “compromised” explicit media
  • “Hacked account” recovery fees
  • Demands after unauthorized software/media downloads

Employment or Job Opportunities

  • Surprise travel expenses for a “guaranteed” job
  • Pay for training/certifications for an “exclusive” role
  • Advance payment for freelance/work-from-home gigs

Account Takeover Prevention

AI voice impersonation becomes especially dangerous when combined with other tactics during account takeovers:

  • Phishing calls: Using a familiar voice to extract PINs, passwords, or one-time passcodes.
  • 2FA bypass: Impersonating you to request or intercept codes.
  • Credential resets: Calling support to reset passwords or change details.
  • Fake fraud alerts: Urging you to “move funds to a secure account.”
  • Security setting changes: Altering alerts or recovery info after access.
  • Fraudulent authorizations: Using voice systems to OK payments or wires (see the sketch after this list).
  • Info harvesting: Casual chats to gather answers for security questions.
  • Social engineering staff: “Executive” voices pushing employees to act.
  • Forged verbal approvals: Cloning voices to fake recorded consent.
  • Deepfake combos: Voice + video for convincing live calls.
  • Fake references: Simulated voices when institutions verify contacts.
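
Several of the tactics above share one weakness: the caller supplies both the request and the “proof.” The sketch below illustrates the kind of out-of-band confirmation rule businesses use to counter this; the contact directory, threshold, and function names are hypothetical examples, not any institution’s actual controls.

    # Hypothetical sketch of an out-of-band confirmation rule for high-risk requests.
    # The directory, threshold, and helpers are invented for illustration only.
    from dataclasses import dataclass

    # Contact numbers collected in advance, never taken from the inbound call itself.
    KNOWN_CONTACTS = {"cfo@example.com": "+1-555-0100"}
    HIGH_RISK_THRESHOLD = 10_000  # dollars; wires above this always need a callback

    @dataclass
    class Request:
        requester: str        # who the caller claims to be
        action: str           # e.g. "wire_transfer", "password_reset"
        amount: float
        came_in_by_phone: bool

    def requires_callback(req: Request) -> bool:
        """Inbound voice requests for risky or large actions need independent confirmation."""
        risky = req.action in {"wire_transfer", "password_reset", "contact_info_change"}
        return req.came_in_by_phone and (risky or req.amount >= HIGH_RISK_THRESHOLD)

    def approve(req: Request, confirmed_by_callback: bool) -> bool:
        # Staff dial the number on file themselves; a number the caller offers never counts.
        if requires_callback(req):
            return req.requester in KNOWN_CONTACTS and confirmed_by_callback
        return True

    wire = Request("cfo@example.com", "wire_transfer", 250_000, came_in_by_phone=True)
    print(approve(wire, confirmed_by_callback=False))  # False: a voice alone never authorizes it

The design choice that matters is that the confirmation channel is chosen by the recipient of the request, never by the caller.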

Stay cautious and independently verify any unusual request. Financial institutions are continually strengthening defenses and training teams to spot these attacks.


Fake Voices: How AI Tricks People

  • Realistic voices: AI can copy almost any voice, language, or accent.
  • More believable: Cloning hides a scammer’s real accent or grammar mistakes.
  • Scalable: One tool can generate many convincing calls.
  • Adaptable: Voices can shift tone and style quickly.
  • Fueled by public audio: Clips from talks, podcasts, and videos train clones.
  • Works with chatbots: Real-time, two-way AI conversations.
  • Fewer slip-ups: AI sticks to scripts consistently.
  • Beats voice security: Some systems can be fooled by good clones.
  • Scripted replies: Pre-programmed answers keep talk flowing.

Want to Learn More Security Tips?

Explore Honor’s Security Center for our procedures and resources in the event you encounter fraudulent activity.
