
How Deepfakes and AI Impact Identity Fraud


By Roy Urrico

“Who are you?” has never been a more relevant question in the financial services industry. A blog from Velera and a report from the Identity Theft Resource Center (ITRC) both provide insight into the battle against fraudsters intent on stealing identities and funds.


Fighting GenAI, Deepfakes and More


Generative artificial intelligence (GenAI) needs only three seconds of someone’s actual voice to clone it for deepfake fraud – the use of artificial intelligence (AI) to create fraudulent conversations, images and videos – explained Nicole Reyes, vice president of risk engagement at St. Petersburg, Fla.-based payments CUSO Velera, in a blog. Reyes described how credit unions and other financial institutions need to fight back by employing AI in a layered approach reinforced with decades of human expertise.

Nicole Reyes, vice president of risk engagement at Velera.

“We see AI evolving every day in ever-emerging fraud schemes,” said Reyes. She acknowledged the explosion of AI has only accelerated the attacks. “By the end of 2027, deepfake fraud is estimated to have an impact of $40 billion in stolen identities, lost data, hijacked savings and more in the United States alone.”


Also, the availability and sophistication of AI have allowed for “DIY fraudsters,” said Reyes. “You no longer need a technical background to be able to write malicious code, so even novice users can now perpetrate fraud. While the increasing scope of AI fraud presents a large technical challenge for credit unions, it also poses a member trust issue.”


So, what is a fraud fighter to do? Reyes outlined the four categories that make up what Velera calls the “Deepfake-Fighting Secret Sauce” for protecting its clients and their members:


  • Identification strategies, such as account transaction-anomaly software that looks for “needles in a haystack” — rare signs of fraud hidden within millions of routine transactions — as well as multi-factor authentication (MFA); a simplified sketch of the anomaly-scoring idea appears after this list.

  • Enhancement strategies, such as constantly reexamining verification protocols and strengthening those wherever possible.

  • Member-focused actions, such as education for employees and members.

  • Technology advancements, such as biometric verification and real-time fraud monitoring systems.
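
To make the “needles in a haystack” idea concrete, the sketch below scores each incoming transaction against an account’s own spending history and flags statistical outliers for review. It is a minimal, hypothetical illustration, not a description of Velera’s software; the function names, the three-standard-deviation threshold and the sample amounts are all assumptions.

```python
# Hypothetical sketch: flag transactions that sit far outside an account's
# own spending pattern. Names, thresholds and data are illustrative assumptions.
from statistics import mean, stdev

def anomaly_score(history: list[float], amount: float) -> float:
    """Return how many standard deviations `amount` sits from the account's norm."""
    if len(history) < 2:
        return 0.0  # not enough history to judge
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return 0.0
    return abs(amount - mu) / sigma

def flag_transactions(history: list[float], incoming: list[float], threshold: float = 3.0):
    """Yield (amount, score) pairs for incoming transactions that look out of pattern."""
    for amount in incoming:
        score = anomaly_score(history, amount)
        if score >= threshold:
            yield amount, score

# Example: routine debits plus one out-of-pattern transfer.
routine = [42.10, 18.75, 63.00, 25.40, 51.20, 37.85, 29.99, 44.50]
incoming = [38.00, 4950.00, 27.60]
for amount, score in flag_transactions(routine, incoming):
    print(f"Review transaction of ${amount:,.2f} (score {score:.1f})")
```

A production system would weigh far richer signals (merchant, geography, device, transaction velocity) and model-based scoring, but the shape of the task is the same: score millions of routine transactions and surface the handful that look out of pattern.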


Velera offers several fraud-fighting products, pointed out Reyes, including IDCheck, which leverages AI-trained models to validate the authenticity of an individual using both documentation and facial biometrics analysis; and Intelligent Fraud Decisioning, a transactional fraud decisioning engine that uses cognitive AI, comprehensive data and enhanced analytics to adapt quickly to fraud trends.
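
The products above are described only at a high level. As a rough illustration of how layered verification signals might be combined, the hypothetical sketch below folds a document check, a liveness check and a facial-match score into a single approve/review/deny decision. Every name, field and threshold here is an assumption for illustration; it does not represent IDCheck, Intelligent Fraud Decisioning or any Velera API.

```python
# Hypothetical layered identity-verification decision. Names, fields and
# thresholds are assumptions, not any vendor's actual product or API.
from dataclasses import dataclass

@dataclass
class VerificationSignals:
    document_valid: bool      # ID document passed template/security-feature checks
    face_match_score: float   # 0.0-1.0 similarity between live selfie and ID photo
    liveness_passed: bool     # basic defense against replayed or synthetic faces

def decide(signals: VerificationSignals, match_threshold: float = 0.85) -> str:
    """Combine layered signals into 'approve', 'review' or 'deny'."""
    if not signals.document_valid:
        return "deny"        # fails the first layer outright
    if not signals.liveness_passed:
        return "review"      # possible presentation attack or deepfake
    if signals.face_match_score >= match_threshold:
        return "approve"
    return "review"          # weak biometric match: escalate to a human analyst

print(decide(VerificationSignals(True, 0.92, True)))   # approve
print(decide(VerificationSignals(True, 0.60, True)))   # review
print(decide(VerificationSignals(False, 0.95, True)))  # deny
```

Routing weak or conflicting signals to human review, rather than auto-denying, reflects the layered AI-plus-human-expertise approach Reyes describes.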


Impersonation Scams Rise Dramatically


In its 2025 Trends in Identity Report, the El Cajon, Calif.-based ITRC, a nationally recognized nonprofit organization established to support victims of identity crime, outlined the identity crimes reported to the ITRC from April 1, 2024, through March 31, 2025. The report also looked at how criminals convinced people to willingly share information, as well as how stolen information was used to open new accounts and evade law enforcement.


Impersonation scams were the most reported type of scam to the ITRC, a 148 percent increase year-over-year. Criminals typically impersonated a general business (51%) or a financial institution (21%).

Eva Velasquez, CEO at ITRC.

“Our 2025 Trends in Identity Report highlights many findings for us to follow, like sharp increases in impersonation scams, stolen birth certificates and account takeover involving existing accounts,” said Eva Velasquez, CEO at ITRC. “One trend that has continued is a decline in the number of victims reporting identity crimes. Fewer people are reporting instances of identity theft, fraud and scams, but there is every reason to believe it is just that – fewer reports, not fewer crimes being committed.”


Victims also reported attempted misuse of their identity credentials. Thieves tried to open a new account (69%) more often than attempting to take over an existing account (31%). Attempted misuse largely involved financial accounts (85%), specifically credit card accounts (56%) and checking accounts (14%).


There was a 754 percent increase in reports of account takeover involving tech accounts and a 47 percent increase in reports of account takeover involving person-to-person payment apps. The number of fraudulent new property leases and rentals reported rose 102 percent, and reports of fraudulent federal student loans increased 111 percent.


The top methods of identity compromise reported to the ITRC were personally identifiable information (PII) shared in a scam, stolen documents containing personal information and unauthorized access to a computer or mobile device. There was a 41 percent decrease in victims reporting their PII was shared in a scam. However, other reported compromises rose overall, including a 71 percent increase in reports of stolen documents containing personal information.


Individuals who reported stolen documents with personal information primarily reported stolen driver’s licenses, Social Security cards, payment cards, birth certificates and phones or tablets. Reports of stolen birth certificates spiked 612 percent.


“We are only at the very beginning of what artificial intelligence (AI) can do to facilitate identity and cybercrimes,” Velasquez continued. “The power of AI in the hands of professional criminals is accelerating a shift we have long warned about – where traditional crime patterns give way to a landscape in which anyone can be a victim. The ITRC is ready to help people and businesses prevent identity crimes and recover when they happen.”


The ITRC identified the following identity trends:


  • AI technology makes it easier for thieves to coerce unsuspecting victims into giving away their identity credentials.

  • Identity thieves are increasingly able to access various existing accounts.

  • Individuals are becoming more curious about protecting their identity.

Mona Terry, COO and Head of Victim Services for ITRC.

“One macro trend that has carried over from 2023 into 2024 (and continues this year) is a decline in the number of victims reporting crimes. Fewer people are reporting identity crimes, but those who do suffer greater financial losses,” said Mona Terry, COO and Head of Victim Services for ITRC, in the report. She noted the ITRC, the FTC and the FBI’s Internet Crime Complaint Center (IC3) have all reached the same conclusion about what is happening; however, there is no consensus as to why.


Terry said some of the reasons include:


  • Criminals are using technology like AI to target victims more precisely, so they do not need to attack as many people, but those they do attack lose more money.

  • Victim fatigue associated with the unrelenting pace of data breaches and cyberattacks has created a sense of hopelessness and powerlessness.

  • More people are taking personal responsibility for protecting their identity information, and more organizations are deploying tools that effectively block or minimize attacks.
