Fighting scammers

Experts at a recent forum discuss the need for literacy and tools such as Whoscall to counteract digital fraud

Whoscall, a mobile app that helps people identify unknown phone calls and text messages, recently revealed that last year people in Thailand received the highest number of fraudulent SMS messages in Asia.

According to Whoscall, Thais received 79 million fraudulent SMS messages and calls last year, an 18% increase from 2022's total of 66.7 million. On average, each person in Thailand received 20.3 fraudulent SMS messages over the year. The Philippines followed with an average of 19.3 messages per person, while Hong Kong residents received 16.2.

In response, Cofact (Collaborative Fact Checking), an organisation that fights fake news, recently held a forum titled "From Cheap Fakes To Deepfakes: Is Fact-Checking And Media Literacy Enough?" at the Bangkok Art and Culture Centre.

Pinyo Treepetcharaporn, director of the Technology Risk Supervision Department at the Bank of Thailand, started the forum by explaining differences between cheap fakes and deepfakes.

"Cheap fakes are a type of deception that use simple technology such as photo or video editing tools manipulate the appearances of celebrities. Cheap fakes are often basic. For example, they might use the face of a celebrity and make it appear as if that person is speaking with their mouth moving, but the rest of the face is static. In this case, it is easy for people to tell that the photo or video has been edited," Pinyo explained.

Thitinan Suthinaraphan, marketing director of Whoscall, gave an example of a scam using a cheap fake that many people are familiar with: a video call from a police officer.

"Scammers usually instruct victims to download Line and communicate with them through video call. The video typically shows a static face of a police officer with only his lips moving. Unfortunately, many Thais believe these cheap fakes and end up with significant financial loss," explained Thitinan.

"Deepfakes feature sophisticated technology and AI to make the appearance and voice of a celebrity in a video clip similar to the real thing. However, there are still limitations since deepfakes require combining several clips to create an effective video. As a result, most deepfake targets are well-known people who appear in the media."

Masato Kajimoto, a professor at the Journalism and Media Studies Center at the University of Hong Kong, shared information that raised concerns about cheap fake and deepfake technologies.

"Open AI, an artificial intelligence research organisation, recently announced that they can generate your voice if they have a 15 second sample. Moreover, they claimed their technology can generate a person's voice speaking multiple languages. If they have a recording of my voice, they can make me speak Thai. This is great technology, but you can imagine how easy it will be to trick people," Prof Kajimoto explained.

In response to the growing problem of deepfakes, Google DeepMind has launched a tool called SynthID. Faith Chen, lead of the APAC news partnership at Google News Initiative, explained that this technology embeds a digital watermark into images and videos generated by AI. While invisible to the human eye, the watermark can be detected by AI algorithms to verify the content's origin and confirm it has not been modified.

"This is a tool that can help all of us use generative AI images at work or on YouTube. We can easily identify what is AI generated," Chen said.

The moderator, Asst Prof Jessada Salathong from the Faculty of Communication Arts at Chulalongkorn University, raised an interesting question. Noting that cases of fraud and scams appear less frequent in Japan than in Thailand, he asked Prof Kajimoto whether people in Japan have greater media literacy.

"In Japan, traditional scams by phone calls are still rampant. News consumption on the internet is extremely low. Less than 10% of people actually read or share news on the internet. They still rely on TV news and radio or they don't care about news at all," said Prof Kajimoto

"Another reason is the language barrier. Many scammers come from overseas. It's difficult to trick someone using Japanese language. However, AI technologies are constantly developing. If scammers are able to translate their messages effectively into Japanese, the situation may change significantly."

Since fraudulent technologies have become more advanced, media literacy is crucial. Chen said young people and the elderly are the most vulnerable groups.

"Media literacy doesn't mean you have to know how to detect AI-generated images from normal ones. It can be as simple as training the elderly and young people to reason and think critically when they read, watch, listen or see something, especially social media," said Chen.

When the moderator asked what people should do to avoid being scammed, Thitinan said that people should be aware that scammers use three emotions to manipulate victims: fear, greed and romance. Additionally, Thitinan pointed out that deepfake technology is currently expensive, but it is likely to become cheaper and more accessible.

"A couple of months ago, an employee at a company in Hong Kong was defrauded by a fake video of the CFO generated by AI to approve a transfer of US$25 million [920.5 million baht]. Deepfake technology is expensive, but can provide significant profit to scammers," Thitinan said.

"If people use the Whoscall app, it will alert them which phone numbers are from scammers. However, scammers can now generate numbers which are difficult to detect. Thus, people should check any suspicious call before deciding to transfer money."

Pinyo highlighted a slogan on a poster created by Cofact which reads "Don't be quick to believe, share and/or transfer".

"Since technology today is more sophisticated, scammers are also becoming more skilled at using it to their advantage. For instance, they can now send text messages through the same messaging platform as banks," he said.

" 'Don't be quick to believe' means you must verify the source first. For example, if you receive an SMS from a bank asking you to click on a link, many people will know that banks do not send SMS messages with links to customers. However, some may not be aware of this. They should call the bank to confirm if they sent an SMS.

" 'Don't be quick to transfer' means you must be cautious about requests for money. You should check the bank account number of the recipient on websites like checkgon.com and blacklistseller.com. Both websites allow you to check either a phone number or account number to see if the recipient has been reported for scams in the past. Scammers always come up with new tricks, and we may not be familiar with them all. Therefore, we need to protect ourselves."
