- Aditi Mozika
Towards the end of 2018, the Ministry of Electronics and Information Technology (MeitY) proposed certain amendments to the Information Technology (Intermediaries Guidelines) Rules, 2011. These changes would significantly alter the way the government visualizes user rights and intermediary liability in the digital sphere. One of the important amendments that has been proposed is a reduction in the take-down timeframe to 24 hours. While the rationale behind this measure, that of prompting swift action to remove harmful content, is appreciated, the provision raises several concerns. This Article flags these concerns, examines the importance of framing an appropriate timeframe to prevent over-censoring of free speech online, and suggests measures the government ought to implement to protect stakeholders while ensuring the take-down of harmful content.
Intermediaries are platforms that facilitate the sharing of information through the internet. According to Section 2(1)(w) of the Information Technology Act, 2000, an intermediary includes telecom service providers, internet service providers, search engines, online payment sites, online marketplaces, and cyber cafés, among others. In short, almost every internet-based service provider falls within the ambit of this definition. The IT Act allows intermediaries to avail of safe harbour under Section 79 if they follow the rules laid down in the Intermediaries Guidelines. These include observing due diligence by filtering content, appointing grievance officers, providing information to government agencies, and monitoring user-generated content.
With the advancement of technology, the world also saw an increase in the circulation of propaganda, hate speech, and disinformation, with severe consequences extending to lynching and death in some cases. To curb this, governments took to strengthening their intermediary liability laws by adopting an interventionist approach that imposed greater responsibility on intermediaries as a measure to check the spread of such information across internet platforms. India has been keen on overhauling its safe harbour regime as well, and to this end the MeitY proposed certain changes to the Intermediaries Guidelines. The Draft Amendments seek to significantly revise the current model, crafting burdensome obligations that intermediaries must satisfy to avail of safe harbour. They propose to require traceability of the originators of content, which can have a chilling effect on free speech. Another important, yet neglected, proposed amendment is the reduction in the take-down timeframe.
The take-down timeframe is the period an intermediary is given to respond to a legal take-down order. Under Rule 3(8) of the Draft Amendments, the time-limit for intermediaries to respond to a legal content take-down request, also known as the turn-around time, has been reduced from the erstwhile 36 hours to 24 hours. This implies that once an intermediary receives a take-down order from the government or a court, it must remove the concerned piece of content within 24 hours of receipt of the notice. If it fails to do so, the safe harbour available to it under Section 79 would cease to apply. In addition to this direct consequence, a host of further concerns about the reduced timeframe have been raised.
First, the definition of an intermediary under the IT Act is extremely broad, covering entities ranging from search engines to cyber cafés. A one-size-fits-all regulation would therefore be inefficient, as different platforms have different capabilities and different resources available to them. Larger platforms with adequate technical architecture, like YouTube or Facebook, would face no difficulty in executing a legal take-down order within the given timeframe, but the case would be different for a smaller company. Likewise, a search engine could easily delink websites, but a cyber café would not be able to respond to an order in a similar manner.
Moreover, failure to observe these rules invites harsh sanctions. Under safe harbour principles, intermediaries that fail to remove content are exposed to the same level of liability as the person who uploaded it. Such a consequence does not augur well for any intermediary, but it poses a bigger threat to smaller companies: a larger intermediary may be able to endure a financial loss as a sanction for its failure to remove content, but a smaller company could be crippled. In the long run, this could drive smaller companies out of the market altogether.
In furtherance of the above, take-down laws must also acknowledge and incorporate the distinction between different kinds of harmful content. Several regulations, like Germany’s NetzDG, have attempted to calibrate obligations based on the nature of the content. Indian law, by contrast, covers an array of content deemed illegal, ranging from content that threatens ‘the sovereignty and integrity of India’ to subjective categories like ‘decency’. While the former warrants immediate action, such an accelerated timeframe may not be justifiable, or even suitable, for the latter.
Second, intermediaries, under the threat of heavy criminal sanctions, would err on the side of caution. Owing to the fear of losing protection, they would make quicker and less nuanced decisions when determining whether a piece of content is illegal, and would prefer removing even lawful content, paying less heed to technical requirements or correct legal procedure, in order to avoid sanctions. This poses a grave risk of censorship of legitimate speech.
Third, intermediaries, acting hastily on account of the short turn-around time, may resort to over-removal of content or may not follow due process. This is because intermediaries would lack sufficient time to scrutinize a request properly and to ensure that all technical and legal requirements are adhered to.
In addition to these specific concerns, there are other issues that need to be addressed. The most important of these is that the current legal framework lacks proper procedural safeguards. Indian law mandates content take-down, but prescribes no procedure for notifying a user that such action has been taken. A notice mechanism must be put in place, as due process is an important legal principle that cannot be ignored.
Conclusion and recommendations
Popular communications applications have been a major source of abusive, threatening, and terrorist content in India. In view of this reality, it is impractical to assert that the internet should be absolutely free from regulation. Yet this concern should not supersede the need to protect human rights, including the freedom of expression. The proposed amendments mark a departure from the current model of intermediary liability in India, putting great pressure on online intermediaries to keep their platforms free of disagreeable content. They also raise serious issues of due process and opacity in the legal process of content take-down. Such regulations can lead to the unintended consequences highlighted above, and the government therefore needs to address them by taking necessary action.
First, the government must keep in mind that there are vast differences between the various categories of intermediaries and the various types of content. Several jurisdictions have framed regulations operating on different timeframes, and adopting such an approach would serve the larger interest of the country. Some countries also require intermediaries to submit transparency reports, which shed light on the effectiveness of turn-around times and can help in introducing informed changes to the law. The government must provide research to substantiate its claim that a 24-hour period is optimal for all kinds of intermediaries, regardless of their size or services. The government could also consider an independent regulator as a compromise between pure governmental regulation and self-regulation; such a regulator would, while assessing liability, take into account the nature of the intermediary as well as that of the content.
Governance of intermediaries necessarily requires varying levels of regulation, rooted in the different composition and technical architecture of these entities. It must be understood and accepted that moderation of content on the internet cannot be done using a single blanket formula, and more importantly, not at the cost of users’ freedom of speech and expression.
The author is a fourth-year student at GNLU, Gandhinagar.