2022 Predictions: Deepfakes and voice synthesis pose growing fraud threat

In 2021 we saw a large increase in remote working and reduced face-to-face interaction, which naturally drove more traffic into call centres. One surprising side effect was a drop in fraudulent calls reaching agents, owing to the increased waiting times. However, fraudsters now appear to be returning to normal working habits just like the rest of us: the last couple of months have shown a steady increase in fraudulent calls. Based on this observation and past experience, we can expect fraudsters to return to targeting call centres in 2022 at the high rates of the pre-pandemic years.

Call centres and other organisations also need to be aware that the techniques fraudsters use in these attacks will adapt to the changes brought about by COVID-19. Since the beginning of the pandemic, the use of video as a means of communication, both personal and professional, has increased exponentially. This has opened up new opportunities for fraudsters, who already favour the voice channel in their attacks. One side effect of this increase in remote communication has been a wave of innovation in audio and visual tools, some good, some not so good. The one that has received the most attention is deepfakes.

Deepfakes were already on the rise in 2021. They have been 'harmlessly' used in the media for initiatives such as documentaries, sparking a conversation about whether it is right to use a synthetic voice, and we have even seen technology companies building deepfake tools with the potential for customer use. As with all technology, however, it can be used for both good and malicious purposes, and as seen in the media, fraudsters already have the capability to create deepfakes to con businesses.

Deepfakes are not just about images and video: voice synthesis (making a machine sound like a particular person) and voice conversion (making a human speaker sound like someone else) are growing trends, and fraudsters are increasingly taking advantage of these tools. These techniques are less well known to the public because of the limited real-world applications available today; however, the threat is very real, and it is a tactic fraudsters have already adopted, as in the recent US$35m bank heist.

With fraudsters looking to hone their deepfake and voice synthesis capabilities, I predict these techniques will only grow in popularity as we move into 2022. It is therefore vital that businesses be aware of them and adopt the appropriate technology to combat them.

If contact centres take just one piece of advice into the new year, I implore them to secure their telephony channel to help reduce omnichannel fraud.
