
SRA Warns Solicitors Over Deepfake Fraud: A Growing Threat to the Legal Sector

Author: admin | 08 Mar 2024

The recent rise in deepfake attacks, especially those created with generative AI, poses a risk to the credibility of multiple industries worldwide, including the legal sector.

In this post we analyze the impact of the SRA's recent update and how it helps solicitors prevent deepfake attacks.

On March 5th, 2024, the Solicitors Regulation Authority (SRA) updated its Sectoral Risk Assessment on anti-money laundering and terrorist financing. The document highlights key aspects of the legislative framework that governs AML regulations in the UK.

Key Takeaways of SRA Update

  • Increased reliance on video for identity verification raises the risk of impersonation attacks through deepfakes, which may drive up identity fraud cases in the legal sector.
  • Fraudsters often target residential properties, impersonating genuine owners without their consent or knowledge in order to take illegal ownership and sell the properties. This illicit practice fosters money laundering and fraud.
  • Law firms should investigate clients who insist on depositing money in instalments or in any manner that indicates potential money laundering activity.
  • Solicitors should watch out for pooled investments, where a large number of investors make transactions and the source of funds becomes difficult to trace.
  • Third-party managed accounts reduce a law firm's burden of monitoring clients' transactions; however, the responsibility for ensuring compliance with AML regulations still rests with the firm.
  • A reduced level of Enhanced Due Diligence (EDD) for domestic politically exposed persons (PEPs) is also a risk.

Impact of SRA Update Regarding AI Tools and Deepfakes

According to the SRA’s Risk Assessment, when solicitors do not meet their clients face-to-face, the resulting anonymity raises the risk of identity fraud. A client refusing to meet in person without a valid reason is another indicator of potentially fraudulent activity. Solicitors should also be aware of a newer threat vector: AI deepfakes. Highly realistic and convincing videos and images can now easily be created using AI tools that are freely available online.

The document raises a serious question about whether EDD measures alone can curb identity fraud, or whether firms need facial identity verification solutions to do the job accurately and swiftly.

How Facial ID Verification Can Combat Deepfakes in the Legal Industry

Facial identity verification vendors play a critical role in mitigating the deepfake threat for solicitors. Deepfake detection in video calls requires high speed and accuracy: sensitive information is shared in the legal sector, so solicitors need to take extra measures to prevent deepfake injection attacks during video calls. The SRA’s document also highlights enhanced due diligence (EDD), which corresponds to the existing KYC (Know Your Customer) solutions implemented at law firms. Although existing KYC solutions can detect anomalies and suspicious activity when verifying digital identities, rapid advances in AI are making deepfakes ever more realistic.

What are the features of an Ideal Facial Identification Solution Suite?

Facia is committed to setting benchmarks for the biometric identification industry. We do this by delivering the highest verification speed at the maximum possible accuracy, minimizing both false accepts and false rejects. Here’s how we ensure a seamless and robust digital identity experience for solicitors and protect them against AI deepfake attacks:

Features of Facia, an ideal facial identification solution.

Solicitors can choose Facia to ensure secure and seamless video calls with their clients without worrying about identity theft. Protecting legal information, the firm’s reputation, and the goodwill of legal practitioners is also necessary to comply with SRA guidelines.

Facia can prevent deepfakes on video calls and can be integrated with video-calling software such as Zoom or Google Meet to verify the liveness of the subject. This helps solicitors stay ahead of a deepfake attack by detecting and blocking it.

Let us look at how a simple Zoom video call can be compromised by a deepfake attack and how Facia detects it under the various scenarios that may occur during the call:

Trust Facia for its speed: it verifies both active liveness and passive liveness in under one second during video calls on web conferencing software such as Zoom.

Facia can verify liveness this quickly because its algorithm is designed to efficiently combat digital injection attacks, especially those created with AI.
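To make the active-liveness idea concrete, here is a minimal, hypothetical sketch of a challenge–response check running over a video-call frame stream. The `read_frame` and `detect_gesture` callables are stand-ins for a real computer-vision pipeline; this is not Facia's actual API, just an illustration of the pattern (issue a random challenge, verify the live response within a tight deadline).

```python
import random
import time

# Hypothetical challenge set for an active-liveness check.
CHALLENGES = ["turn_left", "turn_right", "blink", "smile"]

def active_liveness_check(read_frame, detect_gesture, timeout=1.0):
    """Issue a random challenge and verify the user performs it in time.

    read_frame() yields video frames; detect_gesture(frame) returns the
    gesture seen in a frame. Both are hypothetical stand-ins for a real
    vision pipeline. A replayed or injected video is unlikely to perform
    the randomly chosen gesture within the deadline.
    """
    challenge = random.choice(CHALLENGES)
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        frame = read_frame()
        if detect_gesture(frame) == challenge:
            return True, challenge   # correct live response
    return False, challenge          # timeout: possible injection/replay
```

A passive-liveness check would instead analyze a single frame for artifacts (texture, reflections, blending seams) with no user interaction, which is why vendors typically combine both.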

Other Features

  • Facia can detect deepfakes in both low-resolution and high-resolution settings.
  • It detects deepfakes through liveness checks effectively and swiftly, even in blurry images.
  • Facia’s SDK is under 4 MB, making it a lightweight identity verification solution that integrates smoothly with multiple web conferencing platforms.
  • It also ports easily across operating systems such as iOS, Android, and Windows.

IDV solutions should also minimize false rejection rates when verifying genuine clients. The ideal state of an IDV solution is reached by minimizing the reducible error from both false accepts and false rejects.
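The trade-off between false accepts and false rejects can be illustrated with a short sketch (not Facia's algorithm): sweep a match-score threshold over labeled genuine and impostor scores, compute the False Accept Rate (FAR) and False Reject Rate (FRR) at each threshold, and pick the threshold minimizing their sum.

```python
def far_frr(genuine_scores, impostor_scores, threshold):
    """Scores above the threshold count as 'accept'.

    FRR: fraction of genuine users wrongly rejected.
    FAR: fraction of impostors wrongly accepted.
    """
    frr = sum(s <= threshold for s in genuine_scores) / len(genuine_scores)
    far = sum(s > threshold for s in impostor_scores) / len(impostor_scores)
    return far, frr

def best_threshold(genuine_scores, impostor_scores, candidates):
    """Pick the candidate threshold with the lowest combined error."""
    return min(candidates,
               key=lambda t: sum(far_frr(genuine_scores, impostor_scores, t)))
```

Raising the threshold lowers FAR but raises FRR, and vice versa; the "reducible error" is whatever overlap remains between the two score distributions, which is what better models shrink.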

Facia is your partner in serving justice at its best.

Wrap Up

FATF’s guidance on digital ID projected that 60% of the world’s GDP would be digitized by 2022, and that prediction proved correct. Identity theft attacks have also grown more sophisticated, hitting every industry with staggeringly high numbers in just two years. AI deepfakes are now enabling fraud and other crimes in law firms as well. To counter this, facial biometrics help firms better understand and protect digital identities and meet customer due diligence goals effectively.

What is the Solicitors Regulation Authority (SRA)?

The SRA is the regulatory body responsible for overseeing solicitors and law firms in England and Wales. It sets and enforces professional standards to ensure that legal services provided are of high quality and that public trust in the legal profession is maintained.

What is the law on deepfakes in the UK?

The UK currently lacks specific legislation directly addressing deepfakes. However, misuse of deepfakes can fall under existing laws concerning harassment, defamation, or privacy violations. Actions causing harm through deepfakes may lead to legal consequences under these frameworks.

Can you sue someone for a deepfake?

Yes, you can sue someone for creating or distributing a deepfake if it results in harm such as defamation, emotional distress, or privacy invasion. The UK’s laws on defamation and privacy can be applied in cases where deepfakes cause personal or financial damage.

How do deepfakes impact the legal system?

Deepfakes present challenges to the legal system, particularly in areas such as evidence integrity, privacy rights, and personal security. They complicate the authentication of visual and audio evidence in court and can be used to commit or conceal crimes, necessitating advanced forensic techniques for detection.

How to identify AI-generated videos?

Some tips for identifying AI-generated videos:

  • Look for unnatural movements
  • Pay attention to skin tones and lighting
  • Check for inconsistencies in video quality
  • Use reverse image search to see if the images appear elsewhere
  • Be skeptical of content that seems too good to be true
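The reverse-image-search tip above relies on perceptual hashing: tools that find where an image appears elsewhere compare compact fingerprints that survive recompression and small edits, rather than exact bytes. Below is a minimal, dependency-free sketch of one such fingerprint, the difference hash (dHash), over a grayscale image given as a list of pixel rows; real search services use far more robust pipelines, so treat this only as an illustration of the idea.

```python
def dhash(pixels, hash_size=8):
    """Difference hash: downscale to (hash_size+1) x hash_size via
    nearest-neighbour sampling, then set one bit per horizontally
    adjacent pixel pair (1 if the left pixel is brighter)."""
    h, w = len(pixels), len(pixels[0])
    cols, rows = hash_size + 1, hash_size
    small = [
        [pixels[r * h // rows][c * w // cols] for c in range(cols)]
        for r in range(rows)
    ]
    bits = 0
    for row in small:
        for left, right in zip(row, row[1:]):
            bits = (bits << 1) | (1 if left > right else 0)
    return bits

def hamming(h1, h2):
    """Number of differing bits; small distance = likely same image."""
    return bin(h1 ^ h2).count("1")
```

Because the hash encodes only relative brightness gradients, a uniformly brightened or recompressed copy of a frame hashes to (nearly) the same value, while a genuinely different image lands far away in Hamming distance.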