Regulating online speech and its urgent need
What should be done about “bad” speech on the internet, particularly speech on social media platforms like Facebook and Twitter, is one of the most hotly contested topics of the present day.
Hate speech, disinformation and propaganda campaigns, advocacy of and incitement to violence, efforts to limit exposure to ideas that one disagrees with or that conflict with preexisting convictions, and other problematic communications are all examples of “bad” speech. Because the internet is by nature a system of global communication, “bad” speech can originate from both domestic and international sources.
There is no doubt that these types of extremely harmful expressions have existed for as long as there have been communications technologies, but the current debate is based on the idea that the prevalence and design of this most recent and potent communications technology magnify these harms tremendously beyond anything we have ever experienced. Some contend that if it goes unchecked, democracy itself could be in danger.
Why is this in the news?
The Ministry of Electronics and Information Technology (Meity) has proposed government-appointed grievance appellate committees (GACs) and an industry self-regulatory body (SRB) to hear appeals against content-moderation decisions, a move critics fear will tilt online speech toward a unilateral government and industry agenda.
The daily consequences of unregulated online speech:
- Campaigns of gendered misinformation and harassment: these affect people’s mental health, their productivity at work, and whether and how they use online spaces.
- GLAAD’s 2021 Social Media Safety Index: 64% of LGBTQ social media users reported encountering harassment and hate speech on platforms including Facebook, Twitter, YouTube, Instagram, and TikTok.
- Triggering communal violence: failure to delete harmful content and stop its spread can have serious offline consequences, including violence and deaths, in countries such as India and Sri Lanka.
The proposals for the regulation of online speech:
- Grievance Appellate Committees: according to a proposal released by the Ministry of Electronics and Information Technology (Meity), the central government will establish GACs to act as appellate bodies for content decisions made by social media platforms.
- Self-regulatory bodies appointed by social media platforms: as the term implies, companies such as Twitter, Meta, etc. will appoint their own members to a self-regulatory body that handles complaints about social media content.
Criticisms of the GAC and SRB:
- Lack of a substantive framework: the government seeks the authority to apply a highly subjective standard to specific pieces of content and/or specific people, despite not having established a substantive policy with objectively defined boundaries for prohibited speech.
- Arbitrary removal of content: notably, the government already exercises this prerogative, frequently directing social media platforms to remove or block content without stating a reason, and with little resistance from the platforms.
- Serving the Government’s objectives: the reinstatement of content or users proactively blocked by platforms does not follow the national-security or public-order logic of takedowns. A likely further goal of the GACs is to give the ruling government machinery an institutional route to get aligned accounts or content reinstated, not merely taken down.
- Expanding the Government’s unchecked powers: despite Twitter’s complaint before the Karnataka High Court about the Centre’s “disproportionate use of power” to issue “overbroad and arbitrary” content-blocking orders, platforms in India have a very poor track record of resisting government pressure.
- Lack of consensus in the SRB: the other genuine risk is that such a body may fail because of internal strife or non-compliance, opening the door to the government’s GACs. The opposing viewpoints of the constituent platforms make this outcome plausible.
Suggestions:
- The administration of speech, including the framing and enforcement of rules, must therefore lie wholly outside the purview of government.
- An independent statutory regulator, answerable to Parliament, could provide oversight outside direct government control.
- Standard operating procedure for content removal: in the meantime, there needs to be clarity on how content-moderation decisions are made, especially when the government issues takedown orders.
Conclusion
The present proposals focus on regulating individual pieces of content, yet social media platforms shape our information ecosystems at a fundamental level. As platforms increasingly intervene by elevating particular perspectives, our public debate must advance to address the structural forces influencing those ecosystems.