INTRODUCTION
In today’s rapidly digitising economy, platforms have emerged as the new marketplaces and service providers. Whether it is an established social media network, an e-commerce platform, or a cloud service provider, all of these entities share a common legal status: they are intermediaries. While a platform business focuses primarily on seamless connectivity, the legal reality is a complicated balance between facilitating users’ needs and bearing accountability for users’ wrongdoing. In India, this equilibrium is governed by the Information Technology Act, 2000, and in particular by S.79.
SAFE HARBOUR CONCEPT
The concept of intermediary liability rests on ‘safe harbour’ protection. Consider the postal analogy: a post office would go bankrupt if it were sued each time someone posted a defamatory letter. Intermediaries are therefore immune from liability for third-party data, information, or communication transmitted through them. S.79 of the IT Act accordingly provides that an intermediary can claim immunity for a user’s unlawful content, provided it acts in a limited role of merely facilitating access to a communication system, did not initiate the transmission, did not select the receiver of the message, and did not modify its contents.
This safe harbour protection is not absolute: it is conditioned on the platform observing ‘due diligence’, failing which the immunity is lost and the intermediary becomes liable for the user’s misconduct. The regime was substantially reshaped by Shreya Singhal v. Union of India, which held that an intermediary loses immunity only if it fails to take down content after receiving a valid court order or government notification. With AI becoming more prevalent and deepfakes and online financial frauds proliferating, stricter regulation followed in the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021. The Rules created a new category, Significant Social Media Intermediaries (SSMIs), defined as social media intermediaries with more than 5 million registered users. SSMIs carry much stricter obligations: appointing a Chief Compliance Officer, a Nodal Contact Person, and a Resident Grievance Officer; publishing monthly compliance reports; and deploying mechanisms to detect explicitly inappropriate content.
LEGAL RISKS FOR PLATFORM BUSINESSES
The first legal risk for platforms is loss of the safe harbour. If an intermediary is shown to have colluded with a user in the commission of an unlawful act, or if it fails to remove unlawful content even after receiving an order for its removal, the platform can be prosecuted directly, for example for defamation or copyright infringement. The second risk is a heavily debated one: the tension between users’ right to privacy through end-to-end encryption, as used by messaging platforms such as WhatsApp, and the government’s demand for the ability to trace the first originator of a message. The platform faces a choice between complying with the law by breaking its encryption, thereby jeopardising user privacy, and preserving encryption at the cost of non-compliance.
This has become a Catch-22: if platforms moderate excessively, they can be taken to court over free speech and, in other circumstances, over arbitrary blocking of content; if they moderate too little, they risk losing the safe harbour. Moreover, now that users can appeal a platform’s content-blocking decisions to a government body, the Grievance Appellate Committee (GAC), the platform’s internal policies have become subject to a secondary layer of oversight.
The contrast between standard intermediaries and SSMIs in terms of due diligence is instructive. A standard intermediary must publish privacy policies and terms and conditions, whereas an SSMI must additionally publish monthly compliance reports. Both must acknowledge a user complaint within 24 hours and dispose of it within 15 days. Only SSMIs are required to appoint resident Indian officers, and only messaging-focused SSMIs face the traceability requirement of identifying the first originator of a message.
STRATEGIES FOR RISK MITIGATION
Legal counsel for platform-based businesses cannot rely on mere hope to protect their entities. They must draft strong terms of service prohibiting unlawful content; implement rapid takedown mechanisms so that unlawful posts can be removed quickly; and invest in compliance technologies that use artificial intelligence to scan for and identify unlawful content, paired with human judgment to guard against excessive censorship and thereby protect free speech. Regular audits also serve as a test of whether the safe harbour protection can be effectively maintained. These measures must be actively maintained and implemented to satisfy the existing law. Beyond compliance, the human cost of these legal risks is immense and can even force a small business to shut down, while transparency and user safety foster a sense of user trust.
CONCLUSION
The IT Act, 2000 was created in the age of dial-up internet, whereas the 2021–23 Rules were promulgated for the age of artificial intelligence. With the ongoing effort to replace the IT Act with the proposed Digital India Act, the definition of an intermediary is bound to be redrawn yet again. For platforms, it is essential to realise that intermediary liability can no longer be treated as a passive risk; it is an active risk that must be addressed regularly and deliberately.

Hritvik Gupta is a legal writer and researcher associated with LEGALLANDS LLP, where he contributes analytical and research-driven articles on corporate governance, international trade laws, and policy reforms. His writing reflects a deep understanding of evolving legal frameworks and their impact on cross-border commerce and regulatory compliance.
Hritvik’s work bridges practical legal insight with emerging global regulatory trends, offering readers a balanced perspective that combines academic depth with real-world application. He takes a keen interest in the intersection of law, technology, and international policy, contributing to the discourse on how businesses and governments can adapt to dynamic legal environments.
Through his contributions to Legallands.com, Hritvik aims to make complex legal developments more accessible, insightful, and relevant to businesses, professionals, and policymakers operating in an increasingly interconnected world.
