Ferrari Executive Foils Deepfake Scam with Quick Thinking
Author: admin | 30 Jul 2024
A quick-thinking executive exposed an imposter using a deepfake of the CEO by asking a personal identification question.
The explosive growth of artificial intelligence has bred a new class of criminals who wield AI deepfakes as their weapon of choice.
A few days ago, a high-ranking Ferrari executive nearly fell victim to a deepfake voice scam by fraudsters posing as the CEO, Benedetto Vigna. The exchange began with highly convincing WhatsApp messages and calls that appeared to come from the CEO himself. The executive initially continued the conversation, but once a few suspicious messages raised red flags, he showed remarkable quick-wittedness and outright asked the caller to verify his identity with a personal question. The flustered scammer immediately dropped the call.
Bloomberg reported that the conversation contained messages that raised suspicion, particularly when the scammer urged the executive to maintain discretion and sign an NDA, despite claiming that Italy's market regulator and the Milan stock exchange had already been notified.
The text messages were followed by a voice call from "Vigna" himself, complete with his signature southern Italian accent. The imposter even tried to explain away the unfamiliar number, citing the sensitivity of the matter, and asked the executive to execute a hedge transaction. The odd request, along with slight mechanical inflections in the voice, set off alarm bells in the executive's head. He pushed back on the request, saying, "I'm sorry, Benedetto, but I need to verify your identity," and then asked the CEO about a book he had recommended days earlier. The result was as expected: the caller vanished. The company is investigating the matter further.
There has been a sharp increase in deepfake attacks targeting senior management and leaders of world-leading brands. According to Rachel Tobac, CEO of SocialProof Security, fraudsters are now weaponizing AI voice cloning over phone calls.
Cybersecurity experts paint a grim picture, predicting that AI deepfake frauds will become more common and harder to detect as they grow increasingly realistic. They recommend stricter verification checks and filters to protect business leaders and their hard-earned money, and warn against transferring any amount to anyone before verification, even if the order appears to come from your boss.
Deepfake technology has become advanced and highly accurate, so verifying identities on voice calls, video calls, and every other communication channel is of utmost importance in preventing deepfake attacks. Using multi-factor authentication (MFA), staying vigilant and alert during conversations, and never trusting anyone who asks for money without proper verification are robust defenses against deepfake fraud.
Suggested Read: Police to Catch Wanted Criminals Using Facial Recognition at Festival