Artificial Intelligence: A New Tool for Cybercriminals

A significant increase in cyberattacks on bank clients was observed last year, and the advent of artificial intelligence may power the next wave. In a novel twist, cybercriminals are using doctored videos of well-known personalities to dupe unsuspecting victims. According to Tomáš Stegura of the Czech Banking Association, the first attempts have already appeared, including a fake video of a Czech bank CEO urging people to invest.

In 2020 and 2021, pandemic restrictions drove a sharp increase in internet use and online transactions. These habits persisted even after the restrictions eased, and the number of people shopping online continues to grow, making this expanding pool of potential victims increasingly attractive to attackers.

Fraudsters commonly rely on fraudulent phone calls or counterfeit websites to carry out their scams. They coax individuals into handing over their online banking credentials and thereby gain control over their accounts. One favorite trick is to persuade the victim to install a remote desktop program and then authorize full control over the computer. Once that is granted, the scammers can click through and operate everything themselves.

Artificial intelligence (AI) is a new tool in the fraudster’s arsenal. As AI becomes more accessible, criminals will surely not overlook its potential. Its most immediate use is polishing the text on fraudulent websites so they appear more trustworthy. It can also be used in video calls in which scammers impersonate someone else. Advertisements already circulate on social media mimicking famous figures, such as a president or Elon Musk, to gain trust and make the offers seem more credible.

AI’s ability to create realistic fake videos, in which a famous actor might “recommend” an investment or a scam website, is a real threat. The first attempts, such as the fake bank CEO urging investment, have already been spotted. That video could still be identified as a montage by simple observation, but AI will only get better over time. As the quality rises, the credibility of any recommendation seen online will fall.

The trend is unavoidable, and banks will try to respond by providing their clients with better security tools and verification methods to ensure they are dealing with the genuine counterparty. If you are approached by a caller invoking your bank’s name, it is always better to verify the call through your internet or mobile banking.