Introduction:
The Karnataka High Court, on July 17, 2025, resumed hearing a critical petition filed by X Corp (formerly Twitter) challenging the interpretation and application of provisions under the Information Technology Act, 2000, and the IT Rules, 2021. The petition primarily questions whether Section 79(3)(b) of the IT Act empowers authorities to issue content-blocking orders or whether such powers are exclusively governed by Section 69A of the Act read with the Information Technology (Procedure and Safeguards for Blocking for Access of Information by Public) Rules, 2009. The case also delves into issues surrounding safe harbour protections for intermediaries, the validity of takedown obligations under Rule 3(1)(d) of the IT Rules, 2021, and the balance between freedom of speech and content moderation in the digital era. During the proceedings, Solicitor General Tushar Mehta, representing the Union of India, underscored the evolving threats of harmful online content and stressed the necessity for coordinated regulatory frameworks to ensure online safety, accountability, and responsible technology use. The case brings into sharp focus the tension between individual rights and State regulation in cyberspace, making it one of the most significant constitutional and technology-related disputes in recent times.
Arguments Presented by X Corp:
X Corp, represented by senior advocates, argued that the government lacks authority under Section 79(3)(b) of the IT Act to issue blocking orders, as the statute only provides for the removal of unlawful content after an intermediary fails to comply with lawful directions. The petitioner contended that only Section 69A, read with its corresponding rules, allows for content blocking, and it prescribes a structured mechanism ensuring due process, transparency, and safeguards against arbitrary actions.
The platform further asserted that its role is limited to that of an intermediary, functioning as a neutral conduit for user-generated content, akin to a notice board, and thus enjoys safe harbour protection under Section 79. It argued that imposing broad obligations beyond statutory limits undermines the legal certainty essential for intermediaries operating in India.
Significantly, X Corp raised constitutional concerns, submitting that the Right to Freedom of Speech and Expression under Article 19(1)(a) is indirectly impacted by such government directives. Though the intermediary itself does not claim a direct right under Article 19, it asserted that excessive and vague takedown demands have a chilling effect on user speech, impairing democratic discourse and creating an environment of fear and self-censorship.
The petitioner also emphasized the need for judicial oversight before mandating content removal or blocking, to ensure compliance with the principles laid down in Shreya Singhal v. Union of India, where the Supreme Court upheld Section 69A on the condition of procedural safeguards and reasoned orders.
Arguments by the Union of India:
Solicitor General Tushar Mehta, representing the Centre, strongly defended the regulatory architecture and argued that the digital ecosystem today faces unprecedented challenges, including disinformation, cybercrimes, hate speech, and unlawful content posing threats to sovereignty, public order, and individual dignity.
Highlighting the alarming rise in cybercrime, Mehta pointed out that complaints surged from 26,049 in 2019 to over 22,68,346 in 2024, an increase of nearly ninety-fold, thereby demonstrating the urgency for robust monitoring and regulatory enforcement. He emphasized that intermediaries cannot wash their hands of responsibility under the guise of neutrality, as their business models thrive on user engagement, algorithmic amplification, and content curation.
On X Corp’s contention regarding Article 19, the SG asserted that intermediaries are not rights bearers under the Constitution and, therefore, cannot claim freedom of speech protections. He characterized the platform as a “notice board”, asserting that only individuals who create content can claim fundamental rights.
Referring to Rule 3(1)(d) of the IT Rules, 2021, Mehta explained that intermediaries are obligated not to host or publish unlawful content upon being notified by the government or upon receipt of a court order, failing which they lose safe harbour protection. He clarified that this rule does not impose penal liability but merely conditions the intermediary’s statutory immunity.
Addressing concerns over due process, the SG maintained that Section 79 operates as an exception providing immunity, subject to compliance, and argued that similar regulatory frameworks exist globally. He further contended that the argument of “exception cannot override the rule” is misplaced because the provision merely creates an exemption contingent upon adherence to legal obligations.
On the question of AI and algorithmic amplification, Mehta observed that social media platforms curate and sequence content, influencing user behavior and amplifying harmful narratives. He warned that while Artificial Intelligence is a technological boon, its misuse in digital ecosystems has made the law’s evolution imperative.
Mehta concluded by stressing the need to balance free speech with regulatory safeguards, citing instances such as the Supreme Court’s concerns in the Ranveer Allahbadia case about regulating vulgar and harmful online content.
Observations by the Court:
Justice M. Nagaprasanna acknowledged the complexity of reconciling free speech guarantees with regulatory imperatives in the digital age, especially given the rapid pace of technological evolution. The bench noted that while safe harbour protections are crucial for intermediary liability frameworks, unregulated cyberspace could foster grave societal harms.
The Court orally remarked on the pervasive nature of algorithmic monitoring and targeted advertising, observing that digital footprints significantly compromise individual privacy. It agreed with the SG that intermediary platforms wield enormous influence over content dissemination, raising legitimate concerns of accountability and transparency.
At the same time, the Court stressed that exceptions cannot swallow the rule of law, and regulatory measures must withstand the constitutional test of proportionality. It sought clarity on whether government directions under Rule 3(1)(d) are subject to procedural safeguards akin to those mandated under Section 69A to prevent arbitrary takedowns.
The matter has been posted for further hearing on July 18, with the bench indicating that its final ruling will address the scope of intermediary obligations, constitutional implications, and future regulatory contours for social media platforms.