
Combatting Deepfakes and Fraud: Why Trust and AI Are Key to Securing Customer Communications

Cybercrime, fraud and scams are on the rise, and the sophistication of generative AI makes it increasingly difficult to determine whether voice calls, texts and emails are authentic. Scammers, otherwise known as ‘bad actors’, are even using deepfake images on fraudulent video calls and cloning people’s voices (known as vishing), making it nearly impossible to tell what’s real and what’s not. It’s a scary world out there for customers, particularly in the financial services industry where scammers are rife. Many customers are too fearful to open communications from companies even when they are legitimate, resulting in wasted sales and marketing effort as promotions, customer care initiatives and updates go ignored. This hurts engagement and response rates and ultimately leads to a decline in the bottom line.

Incoming Bad Actors

The stats depicting the increase in cybercrime are alarming, with experts predicting that the estimated cost of cybercrime will grow by approximately 70% between 2024 and 2029, costing businesses worldwide around $15.83 trillion (Statista). The threat reaches everyone, from governments, public sector organisations and businesses hit by ransomware, through to individuals.

The ‘bad actors’ are only after two things: your personal data or your money. Fraudulent voice calls, text messages, emails and social media scams are on the increase, with over 42% of adults being scammed this year and a shocking 33% saying they have been duped and lost money to a scammer (NatWest). A staggering 32 million phishing emails have been reported to Action Fraud UK’s Suspicious Email Reporting Service (SERS), with about 1 in 5 consumers falling for scams such as phishing links.

Previously, scams or fraudulent communications were relatively easy to identify because the spelling or grammar was wrong or the tone and language felt off. Now, with generative AI, bad actors can tune the language, tone and spelling to produce word-perfect communications in the same style as the company they are mimicking. They can also replicate the logo, corporate colours and overall look and feel.

There is every scam going: parcel delivery notices, notes from Amazon, people pretending to be from your bank or Microsoft, and even fraudulent texts from your supposed child saying they need your help, so please send money. Pension scams are also huge; Action Fraud reported that in 2023 a total of £17.7 million was lost to pension fraud, equating to an average loss of nearly £47,000 per person. However, bad actors and cybercrime do not discriminate, and everyone, irrespective of age, is vulnerable.

Improving Awareness and Education

Cybercrime will keep increasing and get more sophisticated by the day, so it is crucial that children and adults alike are made aware of it. We need to understand what to look out for in scams so we can do our best to avoid them.

Schools hold talks on County Lines to warn children about the dangers of drugs, how they are targeted and the tricks used to rope them in. The same should be done for cybercrime, particularly as children spend so much time on screens, are vulnerable and would be excited at the prospect of making easy money. A recent scheme, Russian Coms, gave rise to a new phenomenon, Fraud as a Service (FaaS), where children and adults could buy a handset and service (including 24x7 support) advertised on social media to over 7,000 followers, with the sole intention of scamming people and making money from it. It allowed fraudsters to pretend to be callers from a bank or telecoms firm in order to steal money or personal details, treating adults and vulnerable children keen to make money as the perfect recruits.

The National Crime Agency reported that between 2021 and 2024, over 1.3 million calls were made by Russian Coms users to 500,000 unique UK phone numbers. About 170,000 people in the UK are believed to be victims, with an average reported loss of more than £9,400. Thankfully, the online platform was shut down by the NCA.

We all see coverage of scams in the media, but there needs to be a concerted effort, backed by initiatives, to educate, inform and raise awareness of cybercrime. Who is accountable for owning this? Is it the government or the National Crime Agency? There is a free shortcode, 7726 (which spells out SPAM on a phone keypad), to which you can forward a scam or suspicious text to report it. The phone operator can then investigate the origin of the text and arrange to block or ban the sender. Did you know about this number? Exactly!

Increasing the Trust

Building trust is imperative, and it is no surprise that customers are scared to open emails, texts and social media messages, or even to take phone calls from companies. However, companies don’t need to panic and return to the old days of direct mail. Instead, they need to ensure that all their digital and voice communications are personalised and verified so customers trust them. This can be reinforced by sending out guidelines and a message, albeit by direct mail, informing customers what your company will and will not ask for over digital communications and phone calls.

Branded messages for mobile phones using Rich Communication Services (RCS) elevate your messages with multimedia, using images, videos and action buttons to create an engaging and interactive experience. But critically, they can be verified and seamlessly integrated with real-time communications, producing delivery receipts and read confirmations with the ability to automate responses. The result is secure, verified text messages and phone calls. With a call branding solution, customers can see that the call is coming from a verified company, as your logo, number and the reason for the call appear on screen with the incoming call. This helps to secure your communications and build trust with customers.
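
As a rough illustration of what sending a verified, branded RCS message might look like programmatically, the sketch below posts a message with action buttons and a delivery-receipt request to a messaging API. The endpoint, payload fields and BRAND_API_KEY variable are hypothetical placeholders for illustration only; a real RCS Business Messaging provider defines its own API and brand verification process.

```python
# Minimal sketch: sending a branded RCS message with action buttons.
# The endpoint, payload structure and credential below are hypothetical
# placeholders, not any real provider's API.
import os
import requests

API_URL = "https://api.example-messaging.com/v1/rcs/messages"  # hypothetical endpoint
API_KEY = os.environ.get("BRAND_API_KEY", "demo-key")           # hypothetical credential

payload = {
    "to": "+447700900123",               # recipient's mobile number (test range)
    "agent": "verified-brand-agent",     # pre-verified sender identity: brand name, logo, colours
    "body": {
        "text": "Your delivery is scheduled for tomorrow between 9am and 11am.",
        "suggestions": [                 # action buttons shown beneath the message
            {"type": "reply", "text": "Reschedule"},
            {"type": "url", "text": "Track parcel", "url": "https://example.com/track"},
        ],
    },
    "delivery_receipt": True,            # request delivered/read confirmations
}

response = requests.post(
    API_URL,
    json=payload,
    headers={"Authorization": f"Bearer {API_KEY}"},
    timeout=10,
)
response.raise_for_status()
print("Message accepted:", response.json().get("message_id"))
```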

This enables financial services businesses, utilities, logistics companies, the public sector and more to achieve higher engagement and response rates, particularly as the blue verified tick reassures customers that they are interacting with an authentic and secure business.

From Bad Actors to Good Actors

Harness the superpower of AI, turn it into a good actor and use its phenomenal capabilities to secure your IT infrastructure and communications, stopping cyber criminals and bad actors in their tracks. AI security solutions can monitor vast amounts of data and identify irregularities in real time, using fraud detection algorithms that improve the identification and treatment of bad actors. This enables security professionals to respond rapidly and, where appropriate, to generate an automated response.
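
To make the idea concrete, the sketch below uses an Isolation Forest, one common anomaly detection technique, to flag irregular events against a baseline of historical activity. The chosen features (call duration, amount, hour of day) and the contamination rate are illustrative assumptions rather than a description of any particular product.

```python
# Minimal sketch: flagging irregular events with an Isolation Forest.
# The features and thresholds are illustrative assumptions, not a
# production fraud-detection model.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)

# Baseline of historical "normal" events: [call_duration_sec, amount_gbp, hour_of_day]
normal_events = np.column_stack([
    rng.normal(180, 40, 5000),    # typical call lengths
    rng.normal(60, 20, 5000),     # typical payment amounts
    rng.integers(8, 20, 5000),    # mostly business hours
])

model = IsolationForest(contamination=0.01, random_state=42)
model.fit(normal_events)

# New incoming events scored as they arrive.
new_events = np.array([
    [190.0, 55.0, 14],     # looks ordinary
    [1200.0, 4900.0, 3],   # long call, large amount, 3am: suspicious
])

scores = model.decision_function(new_events)   # lower score = more anomalous
flags = model.predict(new_events)              # -1 = anomaly, 1 = normal

for event, score, flag in zip(new_events, scores, flags):
    label = "ALERT" if flag == -1 else "ok"
    print(f"{label}: event={event.tolist()} score={score:.3f}")
```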

A report by Microsoft, in collaboration with Dr Brauer at Goldsmiths, University of London, found that 87% of the UK organisations surveyed were vulnerable to attack, yet only 27% of them were using AI to strengthen their security. He urges organisations to ‘fight fire with fire’ and use the same AI technologies to secure themselves and tip the balance back in their favour. The report reveals that stronger cybersecurity could potentially save the UK economy £352 billion a year.


Reduce Fear and Increase Engagement

The government, the National Crime Agency, schools and businesses need to come together to educate, inform and spread awareness of what to look out for in scams, how to avoid them and how to report them.

For you, it is time to take action, ‘fight fire with fire’ and do everything in your power to push the ‘bad actors’ off centre stage and focus on the ‘good actors’ to increase trust with customers and secure your IT infrastructure and communications.