Responsibility of social network platforms

American experts have recently warned of a serious mental health crisis among young people in the country, with social networks named as one of the main causes. The rapid and uncontrollable spread of harmful, malicious, and fake news in cyberspace has raised questions about the legal responsibility of technology companies towards their users.
Illustrative image. (Photo: Reuters)

The unsafe use of social networks has had negative effects on users, especially young people, teenagers, and children. According to the US Department of Health and Human Services, 95% of the country's young people use social networks, and up to a third of them use them almost constantly.

US Surgeon General Vivek Murthy stressed that social networks expose children to extremist and toxic content, such as material that "normalises" self-harm or suicide.

A recent study showed that YouTube's recommendation algorithm pushed violent videos and gun-related imagery to children. Meanwhile, according to TSB Bank in the UK, scams on Meta's online platforms, including Facebook, WhatsApp, and Instagram, account for 80% of the impersonation, purchase, and investment scams recorded in the bank's statistics on social networks.

In the face of these alarming figures, questions about the responsibility of online platforms for strictly controlling content have once again resurfaced. Big tech companies enjoyed a peak period of outstanding profits and revenue during the outbreak of the COVID-19 pandemic. However, the risks to network users were also clearly revealed during this period. From October to December 2020 alone, Facebook had to remove 1.3 billion fake accounts.

Aware of the implications and dangers of cyberspace, many countries have taken drastic steps to regulate online platforms. In March 2023, Utah became the first US state to enact a law requiring social networking sites to give parents more control over the accounts of users under the age of 18. This move has increased pressure on social networking platforms to verify the age of their users.

The European Union (EU) has advanced the Digital Services Act (DSA), with regulations banning advertising targeted at children or based on sensitive data such as religion and sex, as well as forcing platforms to aggressively fight misinformation. Technology companies that fail to comply with the DSA face fines of up to 6% of their global revenue. In Asia, Indonesia, in its role as ASEAN Chair in 2023, called on countries in the region to work closely together against fake news.

Faced with increased pressure from governments around the world, most online platforms have made moves to enhance information transparency, speed up the closure of fake accounts, and block the spread of false information. In reality, however, many people still fall victim to cybercrime, requiring these platforms to act more decisively and responsibly.

Recently, Twitter's decision to withdraw from the EU's voluntary Code of Practice on Disinformation, which complements the DSA, has caused considerable controversy. EU officials believe that the spread of misinformation on Twitter has been increasing and causing discomfort to users.

Stressing the importance of strict management of online platforms, EU Industry Commissioner Thierry Breton affirmed that large-scale operations must come with great responsibility. The financial penalties imposed on social networks over recent violations show that the reliability of these platforms remains precarious. Technology companies need to continue implementing strict policies to build a safe online environment for everyone, an urgent demand in the new era.