THE FOURTH SCAM OF THE SEASON

O come all ye fakers

AI-generated voices, video calls and social media content are increasingly being used to impersonate acquaintances, family members, celebrities and professionals. These AI scams, commonly known as deepfakes, may take the form of an assistant requesting a money transfer to pay for gifts, a charity seeking financial support, or a business contact asking for personal information.

Deepfake scams are an easy trap to fall into. Stephen Henry from Toronto, Canada was recently tricked into sending a cybercriminal $12,000 after receiving a deepfake video of Canadian Prime Minister Justin Trudeau endorsing an investment opportunity.


Warning signs
  • A call or video from someone you recognize asking for unusual urgency (e.g., “transfer funds now”)
  • The voice or video quality seems slightly off (e.g., odd lip-sync, background noise, unnatural pauses)
  • The message comes from a new contact or unexpected channel
  • Requests for private information, or for money to be transferred to unknown accounts


How to stay safe
  • If a request seems urgent and abnormal, call the person using a known number rather than trusting the one in the video/text
  • Use multifactor verification for major financial transactions
  • Limit what personal information you share publicly (photos, voice clips) to reduce material scammers can use
  • Maintain stringent protocols for any high-value transfers, including “two-person sign-off” so the details can be checked thoroughly