FBI warns AI scammers impersonate US government officials

FBI warns of AI scammers impersonating U.S. officials, using AI audio & phishing texts to steal credentials. Verify messages claiming official identity, FBI says.

The FBI announced on May 15 that hackers are targeting government officials by impersonating high-level U.S. officials in text and voice messages, pairing AI-generated audio with phishing texts to manipulate targets into handing over their personal account credentials.

According to the warning, the campaign began in April and appears to be ongoing, though the FBI disclosed few details.

The campaign relies on text messages and AI-generated voice messages that mimic senior U.S. officials. It focuses largely on current and former senior government officials at the federal and state level, as well as their contacts, with the aim of breaching their accounts.

The FBI now advises the public that “if you receive a message claiming to be from a senior U.S. official, do not assume it is authentic.”

The incident is the latest example of attackers leveraging AI and deepfake technology to convincingly replicate a person’s voice and appearance.

As large language models become more widespread and more adept at generating lifelike media, scammers are increasingly leveraging them in phishing operations.

Perhaps the most prominent case so far involved a deepfake of President Joe Biden that surfaced during the run-up to the 2024 New Hampshire presidential primary.

Political consultant Steve Kramer, then working for Democratic challenger Dean Phillips, oversaw the creation of the audio message, which told Democrats not to vote, and had it delivered to voters in the days leading up to the primary.

The Federal Communications Commission fined Kramer $6 million, and authorities are criminally investigating him for impersonating a candidate in an effort to influence voter turnout in New Hampshire.

Since the technology emerged, scammers have rapidly expanded their use of deepfake audio, deploying it in everyday financial fraud and in increasingly sophisticated cyber and espionage campaigns.

In a post on X dated May 13, Polygon co-founder Sandeep Nailwal revealed that scammers had impersonated him using deepfake technology in a separate scam.

Nailwal said the “attack vector is horrifying” and that it had left him slightly shaken, as several people “called me on Telegram asking if I was on zoom call with them and am I asking them to install a script.”

The scammers hijacked the Telegram account of Shreyansh, who leads ventures at Polygon, and used it to lure people into a Zoom meeting featuring AI-generated versions of Nailwal, Shreyansh, and a third person, Nailwal explained.

“The audio is disabled and since your voice is not working, the scammer asks you to install some SDK, if you install game over for you,” Nailwal said.

“Other issue is, there is no way to complain this to Telegram and get their attention on this matter.”

“I understand they can’t possibly take all these service calls but there should be a way to do it, maybe some sort of social way to call out a particular account.”

In response to the post, one user said that con artists had targeted them as well, and Dovey Wan, a prominent Web3 personality, disclosed that scammers had deepfaked her in a similar scam.

Nailwal suggested that the best defense is to avoid installing anything during online interactions you did not initiate and to store and manage crypto wallets on a completely separate device.

For protection, the FBI suggests that you confirm the identity of anyone who messages you, carefully check sender information for inaccuracies, and watch for distorted hands, feet, or unnatural faces in images and videos.

The FBI further suggests that you avoid sharing sensitive data with unknown contacts, steer clear of suspicious links, and implement two-factor or multifactor authentication wherever possible.

“AI-generated content has advanced to the point that it is often difficult to identify,” the FBI advised. “When in doubt about the authenticity of someone wishing to communicate with you, contact your relevant security officials or the FBI for help.”
