Prime Minister Ranil Wickremasinghe recently said that Sri Lanka would enact new laws to regulate social media and hate speech. He added that he was fully aware of the importance of social media and that the Government’s objective was only to minimise its negative use.
Social media are a recent invention. The two most popular websites, Facebook and Twitter, were founded in 2004 and 2006 respectively. They may be new, but they are big: nearly 1.2 billion people regularly use Facebook and 255 million regularly use Twitter. There are many other social media forums, based all over the globe and with different focuses of activity, all with the purpose of “social networking,” or connecting people to express themselves and interact with each other.
As John Cooper QC, politician and human rights expert, says, “the vast majority of people who use the social media are like society. The majority are decent, intelligent, inspiring people. The problem comes with a small minority, as in society, who spoil it for everyone else.”
How can this minority “spoil it” for everyone? There are a number of ways: cyberbullying, revenge porn, trolling, virtual mobbing and posting hate messages. (Trolling happens when someone creates conflict on sites by posting messages that are particularly controversial or inflammatory. Virtual mobbing occurs when a number of individuals use social media to make comments about another individual, because they are opposed to that person’s opinions.)
In recent years, we have seen conspiracy theories trend on social media platforms, fake Twitter and Facebook accounts stoke religious and social tensions, external actors interfere in elections, and criminals steal troves of personal data.
German experience
The Prime Minister also said that he had advised the Foreign Affairs Ministry to look at how other countries are working on filtering social media. For a brief study, we will consider Germany and the UK.
In Germany, a law came into force in January that foresees fines of up to €50 million (Rs. 6,900 million) for social media platforms that fail to remove hate speech within 24 hours. “Social networks are no charity organizations that guarantee freedom of speech in their terms of service,” said Gerd Billen, an undersecretary in Germany’s Justice and Consumer Protection Ministry. Social networks, he said, had to comply with German law and not only with their own rules.
“We cannot simply accept the fact that illegal Fake News impact our democratic elections or that (online) hate crimes poison our public discourse,” Billen added.
The German law forces major social networks to withhold certain comments or posts from users in the country if they are deemed illegal and offensive and have been reported by users. Networks such as Facebook, Twitter or Instagram now face fines if they systematically fail to comply with the 24-hour deadline.
The German law, the Network Enforcement Act, is an international test case: how it plays out is being closely watched by other countries considering similar measures. The German Government has also said it wants to add an amendment to help web users get incorrectly deleted material restored online.
The legislation was introduced amid a recent spike in online hate speech and complaints by authorities that the sheer volume of incidents had become difficult to prosecute. All social networks have fallen in line with the new legislation.
Leading social networks such as Facebook and Twitter have acknowledged some failures in the past but emphasized that they will voluntarily step up efforts to combat online hate crimes.
UK experience
The UK government, too, has unveiled sweeping plans to “regulate the internet,” which it claims will stop the web being used for bullying, hate posts and other abuses. The UK government hopes to introduce new taxes on social media companies, which will be used to improve the quality of the internet. Social media companies will also be asked to commit to a “code of conduct.”
“We just want to make sure that we do regulate the social media in appropriate way, so that we allow the freedoms the internet gives you,” said UK Culture Secretary Karen Bradley in an interview. “We need an approach that protects everyone without restricting growth and innovation in the digital economy. We want the Government, industry and communities to work together to keep citizens safe online,” she added.
The UK government is also considering changing the legal status of Google, Facebook and other social network companies amid growing concerns about copyright infringement and the spread of extremist material online. Social media networks are currently considered conduits of information rather than publishers under UK law, meaning they have limited responsibility for what appears on their sites.
However, Patricia Hodgson, the previous chairman of Ofcom, the UK’s communications regulator, said she believed the likes of Google and Facebook were publishers, raising the prospect that they could eventually face more regulation.
“We need to be careful here that what we do is not a sledgehammer to crack a nut. But we have to do this in a way that doesn’t allow harm to anyone,” she added.
Different angle
In today’s digital landscape, some analysts believe that regulation alone is not the answer to fake news or hate posts, although it is welcomed by most people. The Government may implement a set of laws that enables hefty fines for social media companies which fail to take down ‘obviously illegal content’ within a specified time limit. However, two important aspects should be considered before doing so.
First, the costs of such legislation might outweigh the benefits.
Second, such legislation can have dangerous effects. Placing legal responsibility on social media companies to determine the lawfulness of content on their platforms will create an atmosphere in which they exercise unwarranted caution to avoid trouble. Beyond undermining the right to free speech, companies may even censor important public feedback, for example on Governmental corruption.
Perhaps we need to look for a solution from a different angle. In the traditional print, radio and TV media space, we have seen how media companies generally censor themselves to avoid publishing content that violates standards of public decency. We must, however, recognize that unlike traditional media companies, where content is generated by a small group of individuals, social media platforms represent a very broad base of content producers and users. That is why social media platforms are considered avenues for public interests to be represented.
So, we have the Government on one side and the social media platforms on the other, both claiming to be stewards of public interests. If we proceed further with this paradigm, we need to take three mutually agreeable steps to solve this problem.
First, both parties should agree on the content standards to be interpreted and operationalized on social media platforms. This has to be done through an inclusive mechanism. When it comes to interpreting content laws, the scale and speed of the digital world make court decisions impractical. The best idea is for Governments and social media companies to co-develop a swift mechanism which allows a spectrum of public voices to influence the interpretation of content laws in grey cases.
Second, the Government and social media companies should establish a system of public accountability.