
Deepfake Technology—A Cautionary Tale for Financial Services Firms

According to recent news reports, unknown bad actors used deepfake technology to defraud a Hong Kong-based multinational company of approximately USD 25 million. News outlets reported that the bad actors deployed artificial intelligence (AI) to mislead a company employee into believing that they were communicating with real company personnel, when in fact the interactions were an illusion. The defrauded employee believed they were on a video call with the company's CFO and other company employees. However, the likenesses and voices of the other parties on the call, including the CFO, were replicated through AI using publicly available video and audio footage. Based on instructions received during the video call, the employee transferred HKD 200 million (approximately USD 25.6 million) to various Hong Kong bank accounts across 15 transfers, according to reports.

Fraud and Other Compliance Risks

AI use is proliferating in the marketplace, bringing helpful and positive results to organizations around the world. It is no surprise that AI is also finding its way into the hands of bad actors. Replicating audio and video likenesses taken from publicly available footage is a natural extension of established social engineering attacks, which manipulate human behavior to disclose personally identifiable information (PII). Examples of social engineering attacks include business email compromise and phishing. Phishing occurs when a perpetrator sends a message falsely representing that it came from a legitimate source to gather PII for illicit purposes (e.g., identity theft and money transfers). Reports of the Hong Kong multinational deepfake fraud noted that the defrauded employee received a message purportedly from the CFO ahead of the deepfake video call.

Sophisticated fraud schemes facilitated through AI technologies pose significant fraud risk and other compliance challenges for organizations and compliance professionals. Days after news outlets reported the fraud suffered by the Hong Kong-based company, the SEC filed settled charges against a company and its founder for allegedly luring investors by falsely claiming that a hedge fund used AI and machine-learning technologies to execute trading strategies. The SEC further alleged that neither the hedge fund nor the AI tools ever existed.

This news report and the recent SEC enforcement action together underscore how phishing, deepfakes and other evolving AI technologies present a vast range of fraud and compliance risks to financial services firms and their investors. The SEC is tracking these developments and has proposed new rules concerning AI, as detailed in Kroll’s recent article, “AI Risks and Compliance Strategies.” Compliance professionals should proactively prepare for the unique challenges they may face as a result.

Key Takeaways for Compliance Professionals

How can financial services firms guard against increasingly complex fraud attempts? What lessons can we learn from the recent fraud perpetrated through deepfake technology? The circumstances warrant reviewing good hygiene practices regarding internal AI use, as well as reinforcing authorized-instruction and money movement practices to mitigate fraud risk arising from AI technology. While not all financial services firms maintain custody of funds or assets directly, it is a good reminder for chief compliance officers (CCOs) to regularly inventory the circumstances unique to their firms and the internal controls regarding authorized parties, contact information and interactions with custodians.

The importance of human intervention as an integral step in fraud prevention cannot be overstated. Money movement systems and associated authorizations are automated for resource efficiency and client convenience, but human involvement remains integral to reducing fraud risk. Changes to authorized parties, contact information or wire transfer instructions must be verified with pre-existing internal or client personnel on record before the request is executed. Said differently: to obtain confirmation, do not reply to the email, do not call back the number and do not use the video link contained in the request. Instead, as a safeguard against email compromise and deepfake use, implement revised instructions or update information only after receiving direct confirmation from existing authorized personnel. A CCO can implement regular testing, monitoring and reporting on authorized instructions, along with regular reminders and communications regarding established protocols, as part of an overall compliance program.
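For firms that automate parts of this workflow, the out-of-band confirmation rule above can be expressed as a simple guard. The following is a purely illustrative Python sketch, with hypothetical account identifiers, contact data and function names; it shows only the core principle that confirmation must flow through a contact already on record, never through channels supplied in the request itself:

```python
# Hypothetical sketch (assumed names and data) of an out-of-band
# confirmation rule: changes to authorized parties or payment instructions
# are confirmed only via a contact that existed on record BEFORE the
# request arrived, never via channels supplied in the request itself.

from dataclasses import dataclass

@dataclass
class ChangeRequest:
    account: str
    requested_via: str       # channel the request arrived on, e.g. "email"
    supplied_callback: str   # phone number or video link included in the request

# Contact details held on record before any request arrived (assumed data).
CONTACTS_ON_RECORD = {"ACME-001": "+1-555-0100"}

def confirmation_channel(req: ChangeRequest) -> str:
    """Return the only acceptable channel for confirming this change."""
    on_record = CONTACTS_ON_RECORD.get(req.account)
    if on_record is None:
        raise ValueError("no pre-existing contact on record; escalate for review")
    # Deliberately ignore req.supplied_callback and req.requested_via:
    # replying in-channel or calling the supplied number defeats the control.
    return on_record

req = ChangeRequest("ACME-001", "email", "+1-555-9999")
print(confirmation_channel(req))  # prints the number on record, not the one in the email
```

The design point is that the request's own contact details never enter the decision, which is exactly what neutralizes a spoofed email or a deepfake call-back link.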

Maintain a “back to the basics” approach, focused on fundamental human intervention and confirmation procedures to reduce fraud risk, regardless of how convincing or time-sensitive the circumstances appear. The good news is that CCOs, risk managers and firm personnel can mitigate fraud risks by proactively taking the following steps:

  • Policies and procedures: Maintain clear operating procedures outlining the required review and approval of authorized parties, contact information and money movement instructions. Procedures should prohibit updates to money movement instructions, wire instructions, bank account information, phone numbers or contact details until the changes are first confirmed with an existing contact within the firm or at the client, regardless of who appears to be sending the email, is on the phone or is attending the video call (as evidenced in the reported deepfake fraud outlined above).
  • Client contracts: Review contracts and wire transfer agreements with clients for language as to authorized instruction protocols and steps to be followed to update authorized parties, their contact information and approved bank account or custodian information.
  • Contact library: Review and authenticate contact details in writing with authorized client representatives, using dual authentication to confirm details if changes are requested (e.g., confirmation with another authorized party).
  • Client communications: Communicate reminders and notices of firm protocols to clients on a regular basis. For example, note that firm representatives will never ask for or accept changes to authorized details over the phone or video and remind clients of the key steps and procedures utilized to update authorized instructions (e.g., callbacks for instruction updates received via email).
  • Client disclosures: Review disclosures made to clients regarding receipt of updates to authorized instructions and client contact information, such as disclosures provided within regular client communications, quarterly statements or regulatory filings (e.g., Form ADV).
  • Internal controls: Restrict single-approval procedures as much as possible, and automate controls where feasible; require dual review and approval prior to moving funds, especially outside the organization.
  • Compliance program: Test, monitor and review authorized contact changes and client account activity as part of the overall compliance program to ensure procedures are followed and protocols are functioning effectively.
  • Training: Instruct and remind firm personnel as part of the overall training program regarding proper procedures to implement or change authorized party and contact information. Convey current events, case study details and the significant risk involved in updating contact details or authorized instructions without following established procedures.
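The dual review and approval control in the internal controls step above can likewise be sketched in a few lines. This is an illustrative example with a hypothetical approver roster and function name, not a production authorization system:

```python
# Hypothetical sketch of a dual-approval gate for money movement:
# a transfer may execute only after two DISTINCT authorized approvers sign off.

AUTHORIZED_APPROVERS = {"alice", "bob", "carol"}  # assumed roster

def can_execute_transfer(approvals: set[str]) -> bool:
    """Require at least two distinct, authorized approvers."""
    valid = approvals & AUTHORIZED_APPROVERS  # drop unauthorized names
    return len(valid) >= 2

print(can_execute_transfer({"alice"}))            # False: single approval
print(can_execute_transfer({"alice", "mallory"})) # False: second party not authorized
print(can_execute_transfer({"alice", "bob"}))     # True: dual authorized approval
```

Using a set means a single person approving twice still counts once, so one compromised or deceived employee cannot release funds alone.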

In addition to the guidance outlined in “AI Risks and Compliance Strategies,” Kroll’s experts stand ready to leverage our experience in regulatory compliance to craft policies, procedures, testing, training and recordkeeping, designed to help firms mitigate the risk of noncompliance when they adopt AI tools into their workplace operations. Kroll will design gap analyses targeted to identify risks and recommend enhancements to compliance programs to account for AI adoption. We will also prepare SEC-registered firms for navigating the complexities associated with examination and investigation inquiries, especially as the SEC continues to probe AI applications within the financial services industry. Contact our experts today to learn more.


