Protecting Yourself From Artificial Intelligence (AI)-Powered Imposter Scams


In the digital age, technology has brought convenience and efficiency to our lives, but it has also opened new avenues for fraudsters. One of the most concerning trends is the use of AI to carry out imposter scams. These sophisticated scams can be particularly challenging to detect and prevent.

What are AI-powered imposter scams?

AI-powered imposter scams use advanced technology to mimic the voice, appearance, or behavior of someone you know, such as a family member, friend, or even a bank representative. The same technology can also replicate business communications, producing phishing emails, text messages, and fake websites that appear to come from a legitimate organization.

Remember: KeyBank will not call, email, or text you and ask for your full Social Security number, username, password, or other personal information. If you receive such contact, be cautious and end the communication. Do not call suspicious numbers, and do not open links or attachments from suspicious sources.

Types of AI-powered imposter scams

AI-enhanced phishing and smishing

  • Large Language Models (LLMs) are used to create grammatically correct emails and text messages, often impersonating a business or organization.


AI-generated web content

  • AI can be used to quickly create fake websites, online marketplaces, or product listings to defraud users.


Voice cloning

  • Fraudsters gather voice samples from social media, phone calls, or public sources and use AI to clone the voice of a known individual.


Deepfake videos

  • Deepfake technology can create highly realistic video content that appears to show a person doing or saying something they never actually did.
  • These videos can be used to impersonate someone and gain your trust.


Once the fraudster has a convincing imitation, they may use social engineering tactics to manipulate you into sharing sensitive information or making financial transactions.

For example, a fraudster might call you pretending to be a family member in distress, claiming they need immediate financial assistance to get out of a difficult situation.

How to spot AI-powered imposter scams

AI-enhanced phishing and smishing

  • Look for subtle inconsistencies in email addresses and sender information.
  • Be aware that while AI might remove typos, it can produce text that feels overly formal or robotic.


AI-generated web content

  • Look for websites that were recently created or use slightly altered URLs, such as letters swapped for numbers.
  • Watch out for images that look “too perfect” or unrealistic.
  • Be cautious of prices that seem too good to be true.


Voice cloning

  • Listen for unnatural speech patterns, such as robotic-sounding speech or inconsistency in tone or emotion.
  • If the caller sounds like a friend or family member but can't answer personal questions, it is most likely a scam. Hang up and call them back at a number you know is theirs.


Deepfake videos

  • Watch for unnatural blinking, strange facial expressions, stiff body movements, or lip movements that don't match the speech.
  • Look for flaws in the background, such as blurry objects or unnatural lighting and shadows.
  • Pay attention to imperfections in audio quality, speech patterns, and tone.
  • Look for watermarks or labels. Some social media platforms flag potentially altered content.

How to help protect yourself

  • Before you respond, verify requests for payment or sensitive information by directly contacting that person or organization through a known phone number or channel.
  • Scrutinize unsolicited messages that convey a sense of urgency.
  • Enable multi-factor authentication whenever possible.
  • Be cautious with the personal information, photos, and videos you share online.
  • Check your accounts daily, and immediately report unauthorized transactions to increase the chance of recovering your money.
  • Stay informed about the latest fraud tactics, and share this information with your family and friends.

What to do if you think you've become a victim of an AI-powered imposter scam

  1. Immediately contact any financial institution where you maintain an account. If you are a KeyBank client, please call the KeyBank Fraud Client Service Center at 1-800-433-0124, or dial 711 for TTY/TRS.
  2. File a report with the Federal Trade Commission (FTC) at IdentityTheft.gov.
  3. Change your passwords and security questions for all affected accounts.
  4. Consider placing a fraud alert on your credit reports and monitoring your credit for signs of identity theft.

By staying informed and taking proactive steps, you can significantly reduce the risk of falling victim to AI-powered imposter scams.

Learn more about our commitment to fraud prevention and cybersecurity at key.com/fraud.

The information and recommendations contained here have been compiled from sources believed to be reliable based on current information and conditions and are subject to change. KeyBank assumes no duty to update any information in the material in the event that such information changes. KeyBank does not represent or warrant its accuracy, reliability, or completeness or accept any liability for any loss or damage (whether direct or indirect) arising out of the use of all or part of this material. This material is provided as general information only; particular situations may require additional information or actions. Nothing in this material shall be regarded as an offer, solicitation, recommendation, or advice (whether financial, accounting, legal, tax, or other) given by KeyBank and/or its officers or employees or other presenters. If legal advice or other expert assistance is required, the services of a competent professional should be sought.