Many people may be surprised that even government officials cannot contact the owner and managers of Telegram, a messaging app popular in Singapore, and that the platform takes no responsibility for harmful content. It is shocking that educators communicate with students and parents through this app.

A recent survey shows that online harms are an increasingly common problem in Singapore. Young people face the highest risk: 58% of them have experienced online harm, and women are more likely to suffer sexual harms and other forms of sexual harassment. The four most common online harms are sexual harassment, cyberbullying, impersonation or identity theft, and defamation.

SG Her Empowerment (SHE) surveyed more than 1,000 Singapore citizens and permanent residents for its report on online harms in Singapore. The report shows that two in five victims suffer severe physical and psychological trauma, such as fearing for their personal safety, depression and self-harm, while 30% of victims also feel angry, sad, anxious, embarrassed, ashamed or helpless.

In addition, according to a poll released in October, nearly half (46%) of Singaporeans believe that mental illness is the biggest health issue facing the nation today. A national survey jointly conducted by the National University of Singapore and the Ministry of Education in April this year shows that about one-third of local young people face mental health conditions such as depression, anxiety and loneliness.

The worrying figures above were gathered before artificial intelligence (AI) came into wide use in online activities; the growing popularity of AI has since made the problem worse.

Singaporeans are more vulnerable to online harms

In the past, perpetrators had to obtain actual pornographic photos to cause damage. Today, they can take photos of innocent people from social media and use AI applications to synthesise realistic pornographic images. In September this year, exactly this happened in Almendralejo, a Spanish town of only 30,000 people. More than 20 innocent girls aged 11 to 17 in the town became victims: ordinary photos they had posted on social media were taken, altered and circulated online.

Generative artificial intelligence (GenAI) is in the early stage of commercial development. If left unchecked, bad actors may use AI-generated images and videos to harass, extort and blackmail, undermine democratic processes, and distort the development of free societies. Pornographic content made with deepfake technology has recently been used to suppress, humiliate or extort ordinary people and journalists. More than 95% of AI-generated deepfake content targets women; if it is not stopped, it may hinder the world's progress towards gender equality and bring negative social and economic consequences.

Digitalisation is widespread in Singapore, and we go online far more frequently than previous generations, making us more vulnerable to online harms and to the mental health challenges that follow.

It is not only the victims themselves who suffer. As fraud and scams become more common, the economic costs keep rising; for example, companies may start buying insurance for executives who could be blackmailed with faked photos.

Online harms may be the next major threat facing Singapore and the world.

What are the countermeasures?

The anonymity of the internet, the speed at which content spreads, and the fact that online activity spans the jurisdictions of multiple countries make the problem of online harm extremely difficult to tackle.

You cannot stop a perpetrator you cannot identify. Legal proceedings, though effective, take time and money, and are of little help to a victim who simply wants every hurtful post deleted as soon as possible. Moreover, with a virtual private network (VPN), anyone can easily circumvent blocks and browse barred websites.

Governments around the world have begun formulating regulations to deal with online harms.

Since 2015, the Australian government has empowered its eSafety Commissioner to direct online platforms to delete illegal content. Posts that would normally take hours or even days to be removed can be taken down within 12 minutes when the platform cooperates. More importantly, enforcement is only a small part of the work of the eSafety Commissioner's office: most of its effort goes into public outreach and educator training, comprehensive study of emerging online harms, and building partnerships with regulators and technology platforms.

The European Union's recent Digital Services Act likewise requires online platforms to take measures to prevent and remove posts containing illegal goods, services or content, and to improve the transparency of their algorithms. Under the Act, a platform that becomes aware of illegal content but does not act quickly to remove it or disable public access to it bears the corresponding legal liability. Importantly, platforms must also establish accessible mechanisms for receiving notices of illegal content.

In addition, France, Ireland, Germany and Britain have enacted online safety laws targeting specific categories of harm. Singapore recently amended the Broadcasting Act and enacted the Online Criminal Harms Act.

Online activity crosses national borders, and judicial mechanisms alone are not enough to respond effectively to online harms; a multi-pronged approach is needed to create a safer online space.

First, cyberspace has unique communication characteristics, and regulators must have corresponding powers. According to the SHE survey, most respondents believe that the rapid removal of harmful posts is the most urgent countermeasure, so any enforcement scheme must provide this layer of protection, which is especially important when vulnerable groups such as children are involved. New regulations must be able to deal with harms that have yet to emerge, and be implemented in an open, transparent and coordinated manner to build trust with all relevant groups. For example, content creators could be required to watermark AI-generated content. Furthermore, if countries can reach agreement on legal categories and enforcement cooperation, the response to online harms will be far more effective.

The role of online platforms should be clearer

Secondly, online platforms play the most important role. Although these platforms are not content creators, they are the distributors of content. They should also develop and roll out business and technology solutions to make the internet safer. We can learn from the EU's Digital Services Act: a platform that becomes aware of online harms but fails to act should be held liable.

Third, new regulations must be able to effectively deal with perpetrators who are anonymous online or beyond the reach of local jurisdiction.

Finally, we must raise public awareness. A worrying finding in the SHE survey is that 20% of respondents regard online harms as nothing out of the ordinary. We should not treat online harms as an unavoidable phenomenon; instead, we should take action, raise self-awareness, and teach the young, the elderly and the vulnerable to protect themselves. We must also be vigilant about sharing personal information, including by choosing to use responsible online platforms. Many people may be surprised that even government officials cannot contact the owner and managers of Telegram, a messaging app popular in Singapore, and that the platform takes no responsibility for harmful content. It is shocking that educators communicate with students and parents through this app.

Regulation of the internet is often accused of undermining online freedom and democracy. Yet the SHE survey shows that, to avoid continued online harassment, 66% of respondents self-censor online and 68% simply stop participating in online activities altogether. Transparent regulation can in fact promote individuals' freedom to act online and build a stronger society.

At present, we have not found the perfect countermeasure, but we must not ignore the problem.

The author is joint managing partner of Yizheng Law Firm

and the founder of a Singapore women's development and support organisation

The translation was provided by the author