This article discusses good content moderation practices, specifically how content can be moderated through co-regulation.
What is the state of the online ecosystem today?
Until the government introduced the Information Technology (IT) Rules 2021, platforms established grievance redressal mechanisms voluntarily, through their terms of service.
The IT Rules 2021, however, mandate platforms to establish a grievance redressal mechanism that resolves user complaints within fixed timelines.
Grievance Appellate Committees (GACs), comprising government appointees, will now hear appeals against the platforms' grievance redressal decisions.
This signifies the government's tightening control over online speech, much like Section 69A of the IT Act 2000.
However, such government control over online speech is unsustainable.
With rising stakes for free speech and growing online risks, a modern intermediary law must re-imagine the role of government.
What should a modern intermediary law look like?
Under such a law, government orders to remove content must be necessary and proportionate, and must comply with due process.
The European Union (EU) Digital Services Act (DSA), which regulates intermediary liability in the EU, is a good reference point.
The DSA requires government take-down orders to be proportionate and reasoned.
The DSA also gives intermediaries an opportunity to challenge the government’s decision to block content.
An intermediary law must devolve crucial social media content moderation decisions to the platform level.
Platforms must have the responsibility to regulate content under broad government guidelines.
Instituting such a co-regulatory framework will serve three functions.
What functions will the co-regulatory framework serve?
Co-regulation will give the platforms the flexibility to define the evolving standards of harmful content.
With co-regulation, platforms will retain reasonable autonomy over their terms of service.
Co-regulation will promote free speech online: heavy government oversight incentivises platforms to engage in private censorship by over-removing content, and co-regulation reduces this pressure.
Co-regulation aligns government and platform interests.
Online platforms themselves seek to promote speech and security on their services so that their users have a free and safe experience.
Incentivising platforms to act as Good Samaritans will build healthy online environments.
Instituting co-regulatory mechanisms allows the state to outsource content regulation to platforms, which are better equipped to tackle modern content moderation challenges.
While maintaining platform autonomy, co-regulation also holds platforms accountable for their content moderation decisions.
Whenever platforms remove content or redress user grievances, their decisions must follow due process and be proportionate.
But when platforms use de-prioritisation tools to reduce the visibility of content, users are mostly unaware of it.
Users are therefore unable to challenge such actions, which take place through platform algorithms that are often confidential.
Platform accountability can be increased through algorithmic transparency.
What is next?
An intermediary law should take forward the baton passed by the 2021 Rules.
The GACs must be done away with because they concentrate censorship powers in the hands of the government.
A Digital India Act is expected to be the successor law to the IT Act.
This is a perfect opportunity for the government to adopt a co-regulatory model for regulating online speech.