Deepfake CEO Fraud 2024
A sophisticated deepfake attack using AI-generated voice and video to impersonate a CEO and authorize fraudulent wire transfers totaling $25 million.
Key facts:
- Amount stolen: $25 million in fraudulent wire transfers
- Attack vector: AI-generated deepfake video call combined with voice cloning to impersonate the CEO
Phase 1: Reconnaissance
Attackers collected public videos and audio recordings of the CEO from earnings calls, interviews, and conference presentations to train deepfake models.
Phase 2: Social Engineering
Phishing campaign targeted the CFO's assistant to gather information about wire transfer procedures and approval workflows.
Phase 3: Execution
Deepfake video call conducted with CFO using AI-generated CEO likeness, requesting urgent wire transfers for a "confidential acquisition."
Phase 4: Discovery
Fraud discovered when the real CEO inquired about the transfers during a routine financial review meeting.
Vulnerabilities Exploited
- Over-reliance on visual/audio verification
- Lack of multi-factor authentication for large transfers
- Insufficient employee training on deepfake threats
- No technical verification of video call authenticity
Recommended Mitigations
- Implement code words or challenge questions for sensitive requests
- Require multiple independent approval channels for large transfers
- Deploy deepfake detection tools for video and voice calls
- Conduct regular security awareness training on deepfake threats
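The multi-channel approval mitigation can be sketched in code. This is a hypothetical illustration, not a real banking API: the names (`TransferRequest`, `REQUIRED_CHANNELS`, the threshold value) are assumptions chosen for clarity. The key idea is that a video or voice call, which can be deepfaked, never counts as an approval channel by itself; release requires confirmation over every independent channel.

```python
# Hypothetical sketch of an out-of-band approval gate for large wire
# transfers. All names and the threshold are illustrative assumptions.
from dataclasses import dataclass, field

LARGE_TRANSFER_THRESHOLD = 100_000  # USD; assumed policy threshold

# Independent channels that must each confirm the request.
# Note: "video_call" is deliberately absent -- it is spoofable.
REQUIRED_CHANNELS = {"callback_known_number", "code_word_challenge"}


@dataclass
class TransferRequest:
    amount: float
    requested_by: str
    approvals: set = field(default_factory=set)  # channels that confirmed


def record_approval(req: TransferRequest, channel: str) -> None:
    """Record a confirmation received over an independent channel."""
    if channel in REQUIRED_CHANNELS:
        req.approvals.add(channel)


def can_release(req: TransferRequest) -> bool:
    """Release funds only when every required channel has confirmed.

    Small transfers pass through; large ones need all channels."""
    if req.amount < LARGE_TRANSFER_THRESHOLD:
        return True
    return REQUIRED_CHANNELS.issubset(req.approvals)


if __name__ == "__main__":
    req = TransferRequest(amount=5_000_000, requested_by="CFO")
    print(can_release(req))  # False: no out-of-band confirmations yet
    record_approval(req, "callback_known_number")
    record_approval(req, "code_word_challenge")
    print(can_release(req))  # True: both independent channels confirmed
```

In this sketch the attack from the case study fails at the gate: a convincing deepfake call produces a `TransferRequest` but no entry in `approvals`, so `can_release` stays `False` until someone calls the real CEO back on a known number and passes the code-word challenge.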