Source: Guangzhou Daily

"In the past, we said 'seeing is true', but with the development of technology, the technical means of criminals are also updated, creating the" fake people to speak false ", sometimes it is invincible, and we must pay attention to identification and prevention."Recently, at the World Digital City Conference" Biomedical identification Technology Application Forum "held in Shenzhen, Professor Chen Youbin, a national special expert, said in an exclusive interview with the reporter of this newspaper:" Face recognition technology has been applied to various applications, which is caused by the application of various applications.The problems of personal privacy leaks caused by this are also increasingly prominent, and the security management of face recognition technology is imperative. "

AI face swapping and voice synthesis

"Friends" deceived money with fake chaos

"Today, many businesses in social life can be carried out online. Financial aspects such as account opening, card opening, wealth management, and credit;Signing, notarization, etc., have you ever thought about the person that appeared on the screen really the person you thought? In the era of artificial intelligence, this is particularly vigilant. "Chen Youbin said.

Not long ago, the Baotou police reported a case of fraud carried out with AI: Mr. Guo, the legal representative of a company in Fuzhou, was cheated out of 4.3 million yuan within 10 minutes. According to the report, a "friend" of Mr. Guo suddenly contacted him via WeChat video, saying that a customer and friend of his who was out of town needed a 4.3 million yuan deposit and wanted to borrow the corporate account of Mr. Guo's company to route the payment. Trusting his friend and having "verified" his identity over the video chat, Mr. Guo transferred 4.3 million yuan to the bank card designated by the other party without further checks. Only when Mr. Guo later phoned his friend did he learn he had been cheated: the criminals had used AI face-swapping and voice-mimicking technology to impersonate the friend and defraud him. "The criminals simulated the voice of the victim's friend over video. The person looked like him and the voice sounded like his, but in fact it was not his friend at all," Chen Youbin said.

Coincidentally, in February 2022 a Mr. Chen reported to a police station that he had been cheated out of nearly 50,000 yuan by a "friend". Police investigation found that the scammer had captured face images from a video posted on a social platform and used "AI face swapping" to synthesize them, creating the illusion that Mr. Chen was on a video chat with the "friend" and thereby carrying out the fraud.

On October 7 this year, the Beijing bureau of the National Financial Regulatory Administration issued a risk alert: lawbreakers illegally obtain personal information, use AI technology to synthesize the portraits and voices of people the victim knows, and then impersonate them to commit fraud. After winning a victim's trust, they follow scripts prepared in advance, asking for bank card transfers or pushing virtual investment and wealth management, order-brushing rebates and similar schemes, to lower the victim's guard. Victims often find it hard to detect the fraud in a short time; once they believe the scammer and complete the transfer, the money disappears without a trace.

An AIGC digital human, or a natural person?

Human biometric characteristics can be "deeply forged"

"Criminals change face and acoustic technology through AI, pretending to be acquaintances to perform fraud." Chen Youbin introduced that China ’s application in the field of face recognition is ranked at the forefront of the world.focus on.

"Is the person in front of the camera, is he a real natural person? Is the person real? Is the certificate true? Is the person's certificate consistent? Need a means of remote nuclear body." Chen Youbin introduced, appearing in the appearance, appearing inForgery or forged videos, people who generate or forgery with AI technology are called "AIGC Digital".It can be a person who does not exist in the real world, or it can be forged by those who exist in the real world.And "natural person" is a person who appears in real images or real videos and exists in the real world.

It is understood that in Guangzhou and Shenzhen alone there are thousands of companies producing and selling virtual digital humans. The "digital humans" created with AI technology have highly realistic appearances and natural language interaction abilities, can hold real-time conversations, and can even reach a level of realism where "you cannot tell unless you look very closely".

As an expert who has worked for years in image recognition and artificial intelligence, Chen Youbin cautioned: "Faces, voiceprints, fingerprints, irises, signatures and the like are the most commonly used human biometric characteristics. Whether the person you see is really the person himself at this very moment, or someone else who has grafted on his likeness to impersonate him, is something you need to learn to distinguish."

Chen Youbin explained that, compared with traditional techniques, the destructive power of "AI face swapping" lies in the fact that it is not mere "forgery" but "deep forgery". "Scammers generally run big-data analysis on the various kinds of information people post online, including personal details, work information and interpersonal relationships on social media, to study their daily habits, work routines, asset situation and so on. They then combine this with the fraud they intend to commit, use AI technology to screen the population, determine the target group and lock onto specific victims."

Be alert to voiceprint forgery

Saying a "change of a sound and saying"

Chen Youbin introduced that "Deep Fake" is a multimedia tampering and synthetic technology based on deep learning, mainly including the forgery and tampering of images, voice, video and text.Improve the target person to deceive another person."Deep fake development is very fast. This technology was first used to simulate some parts that could not be taken in reality when making movies in Hollywood. Later, it was used by criminals.Shaking your head and shaking your head, changing your face, etc. Now there are many models. "

He told reporters that this deep-forgery technology can alter a face's age, gender and race without changing the person's identity, and can manipulate the mouth or expression; it can also change identity by migrating, exchanging or superimposing faces, mixing approaches such as "swap the face and change the expression" or "change the expression without changing the face". "Real-time interactive forgery maps the synthesis onto the present moment and is aimed at real-time interactive scenarios, for example using a false identity in an online meeting to steal information or disrupt order, or imitating a target person in a video call to deceive the user at the other end."

Chen Youbin warned that voice information can also be faked by extracting a person's "voiceprint". What is a voiceprint? He explained that human speech arises from a complex physiological and physical process between the brain's language centres and the vocal organs. Because the vocal organs, namely the tongue, teeth, larynx, lungs and nasal cavity, differ greatly between individuals in size and shape, no two people have exactly the same voiceprint. People now frequently receive marketing calls from strangers, and in the course of talking to them may unknowingly leak their voiceprints.
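As an illustration of what "extracting a voiceprint" can mean in its simplest form, the sketch below uses the librosa library to compute MFCC features from two recordings, averages them over time, and compares the resulting vectors with cosine similarity. Real speaker-verification systems use far richer embeddings; the file names here are hypothetical.

```python
# Crude voice-feature comparison: time-averaged MFCCs plus cosine similarity.
import numpy as np
import librosa

def voiceprint(path: str, n_mfcc: int = 20) -> np.ndarray:
    """Return a fixed-length voice feature: MFCCs averaged over time."""
    y, sr = librosa.load(path, sr=16000)               # mono, 16 kHz
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc)
    return mfcc.mean(axis=1)                           # shape: (n_mfcc,)

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical recordings: one enrolled sample and one incoming call.
enrolled = voiceprint("enrolled_speaker.wav")
candidate = voiceprint("incoming_call.wav")
print(f"similarity: {cosine_similarity(enrolled, candidate):.3f}")
```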

"Criminals may call you through different people 'bombing'. Perhaps their purpose is not to sell at all, but to obtain your sound patterns, living habits and social relations, and then implement fraud." Chen Youbin introducedThe time of voice will affect the accuracy of sound pattern recognition. The longer the effective voice duration, the more data obtained by the algorithm, and the higher the accuracy will be.Therefore, he reminded: "Elderly people who are not familiar with artificial intelligence technology need to be vigilant, including young people, including young people. Don't talk more when you receive strange calls. You must pinch your nose and change your voice when you talk.Otherwise, after a period of data study, the criminals may use your sound texture information and speech habits to synthesize your voice, replace it with your face, and scam in your social network. "

Look carefully for the "clues" of AI forgery

Confirm through multiple channels and do not blindly trust video content

How can fraud based on AI deep forgery be prevented? Chen Youbin advised: "If the person really is someone you know well, chatting with them a few more times will usually expose the problem." It is understood that, taking banking as an example, multiple layers of verification are required before a financial transaction can be completed, so as to minimize risk. Ordinary people can also uncover the "clues" of AI deep forgery through various checks of their own.

He explained that because the source material used for synthesis consists mostly of photos with the eyes open, a "lack of blinking" can be regarded as one of the telltale features of a synthesized video. Other features include lip movements that are out of sync with the speech, emotions that do not match the content, patches that look unnatural or incoherent, blurred textures on the teeth and lips, and asymmetric ears. Fake calls packaged with AI voice mimicking and AI face swapping can reproduce the target's speech and behaviour quite realistically, but they are still full of flaws, such as a low blink rate, uncoordinated eye movements, unnatural facial expressions and awkward pauses between sentences. These are the characteristics of a "digital human". The following precautions are relatively easy to remember: call back by phone to confirm, and do not readily believe a video; remember to ask private questions, and if they cannot be answered, do not trust the caller; a simple blink can expose an AI-swapped face.
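The "blink to expose an AI face" tip can be made concrete with the common eye-aspect-ratio (EAR) heuristic: when the eye closes, the ratio of vertical to horizontal distances between eye landmarks drops sharply, so a video with no such drops over a long stretch is suspicious. The sketch below assumes per-frame eye landmarks or EAR values are already available from a face-landmark model; the six-point layout and the 0.2 threshold are conventional assumptions, not a definitive detector.

```python
# Blink counting via the eye-aspect-ratio (EAR) heuristic.
import numpy as np

def eye_aspect_ratio(eye: np.ndarray) -> float:
    """eye: array of shape (6, 2), the six landmarks around one eye."""
    v1 = np.linalg.norm(eye[1] - eye[5])   # vertical distance 1
    v2 = np.linalg.norm(eye[2] - eye[4])   # vertical distance 2
    h = np.linalg.norm(eye[0] - eye[3])    # horizontal distance
    return (v1 + v2) / (2.0 * h)

def count_blinks(ear_sequence, threshold: float = 0.2) -> int:
    """Count closed-then-open transitions in a per-frame EAR sequence."""
    blinks, closed = 0, False
    for ear in ear_sequence:
        if ear < threshold:
            closed = True
        elif closed:           # eye reopened after being closed
            blinks += 1
            closed = False
    return blinks

# Toy per-frame EAR values: a live face blinks, many synthetic ones do not.
live_like = [0.30, 0.31, 0.12, 0.10, 0.29, 0.31, 0.11, 0.30]
synthetic_like = [0.30] * 8
print(count_blinks(live_like), count_blinks(synthetic_like))  # 2 0
```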

It is reported that the "Regulations on the Security Management of Face Recognition Technology (Trial) (Draft for Comment)", released by the Cyberspace Administration of China on August 8, also sets requirements for public places, business premises and situations where the technology may infringe on others' privacy. Relevant departments have also reminded the public to raise their awareness of personal information protection: be especially wary of informal software that asks you to enter personal information; be careful when enabling "location services", entering ID numbers, or providing "face recognition information" and "fingerprint recognition information"; and report to the relevant departments in a timely manner when an app is found to collect personal information excessively or forcibly. In addition, set the viewing permissions of personal social accounts appropriately, do not excessively publicize or share photos and videos that contain personal information, stay alert to links sent from unfamiliar platforms, and do not readily share your phone screen with strangers.