Incognito is a natural insect repellent, developed and sold by the company of the same name. Liz Bonnin is a well-known BBC science TV presenter.
Incognito had previously asked Ms Bonnin to endorse its products, but she had declined.
Incognito’s CEO, Howard Carter, was then apparently contacted by Ms Bonnin, who indicated that she had reconsidered the proposal.
Was this too good to be true? The headlines say it all:
“Firm conned out of £20,000 by scammer pretending to be TV star”
“Firm tricked by AI-generated voice”
"Firm uses presenter in advert after it is tricked by AI fake”
This was undoubtedly distressing for both parties, and costly for Incognito. It is a cautionary tale for all businesses, but there are practical steps they can take to avoid being bitten in the same way.
Impersonation fraud and deepfakes
Incognito was defrauded by an impersonation scam utilising deepfakes.
Mr Carter was first contacted on Facebook by a profile purporting to be Ms Bonnin. Although he initially believed the profile to be “suspect”, the voice notes he received “clinched” his belief that he was in discussions with the real Ms Bonnin. As a result, Mr Carter agreed a deal, negotiated via email and WhatsApp, to pay £20,000 for Ms Bonnin’s endorsement of Incognito’s product.
Incognito launched its marketing campaign using images of Ms Bonnin received from the scammers impersonating her but, on the same day, Ms Bonnin posted on X that she had not agreed to endorse the product.
Ms Bonnin had not negotiated the deal with Mr Carter; fraudsters had impersonated her, including by creating an AI-generated version of her voice.
It is not only Incognito and Ms Bonnin who have fallen victim to impersonation scams. In February 2024, CNN reported that a finance employee at a multinational firm had been tricked into paying out US$25m to fraudsters who digitally posed as the company’s CFO during a video conference call[1].
According to UK Finance (the trade association representing the UK banking and finance industry), impersonation scams are increasing year on year. Its most recent report on fraud reveals that £177.6m was lost to impersonation scams in 2022[2]. The advent of generative AI and deepfakes, in particular, means that such scams are becoming more elaborate.
The consequences for Incognito of falling victim to a deepfake scam are that it is out of pocket to the tune of £20,000, it has exposed itself to a prospective claim from Ms Bonnin for passing off in respect of the misrepresentation that she had endorsed its product and, more generally, it has been the subject of adverse publicity that has likely damaged its reputation. Deepfake scams can also present security risks for businesses.
Incognito may also be unable to recover its money from the fraudsters. It has fallen victim to authorised push payment (APP) fraud, and the scope of any duty owed by banks and payment service providers (PSPs) to retrieve funds for victims of APP fraud is unclear. It is expected that, in October 2024, the Payment Systems Regulator will launch a new mandatory reimbursement scheme for APP fraud, obliging in-scope PSPs to reimburse victims in certain circumstances. However, prevention is better than cure…
Don’t get bitten
Just as the best way to protect against insect bites is to apply repellent, businesses should take precautions to protect themselves against impersonation scams, including those using deepfakes. UK Finance’s Take Five to Stop Fraud campaign recommends:
- STOP: Take a moment to stop and think before parting with your money or information;
- CHALLENGE: Could it be fake? It’s OK to reject, refuse or ignore any requests. Only criminals will try to rush or panic you; and
- PROTECT: Contact your bank immediately if you think you’ve fallen for a scam and report it to Action Fraud.
There are other steps businesses could take to avoid falling victim to deepfakes, including:
- Employee training: Employees could be educated about deepfake technology and its potential risks. They may be trained to recognise signs of deepfake content, such as inconsistencies in audio or video, unnatural facial movements, and/or discrepancies in context;
- Verification procedures: Businesses should implement robust verification procedures for significant financial and sensitive transactions, including changes to critical business processes. Businesses may require multiple levels of authorisation and verification for high-risk actions;
- Security measures: Businesses may implement multi-factor authentication for accessing sensitive systems and authorising financial transactions. Businesses could also implement trusted communication channels and communication protocols that outline procedures for verifying the authenticity of requests, especially those involving sensitive information or financial transactions. Employees should be encouraged to follow these policies rigorously to prevent falling victim to deepfake scams.
Although generative AI offers tremendous opportunities for businesses, it also presents real risks from unscrupulous fraudsters. Businesses should take appropriate precautions to avoid getting bitten…
If you wish to discuss any of the issues raised in this article, please contact Luke Moulton, a partner in our Commercial Litigation team specialising in complex commercial disputes and intellectual property litigation, especially in the advanced manufacturing, media and entertainment and technology sectors.
[1] https://edition.cnn.com/2024/02/04/asia/deepfake-cfo-scam-hong-kong-intl-hnk/index.html
[2] https://www.ukfinance.org.uk/news-and-insight/press-release/new-figures-show-ps1776m-was-lost-impersonation-scams-in-2022-take