
The Legal Affair

Let's talk Law


Delhi High Court Calls for Strict Accountability of Digital Platforms Over Obscene and Pornographic Mobile Applications


Introduction:

In a significant development concerning digital governance, intermediary liability, and online safety, the Delhi High Court recently expressed serious concern over the unchecked circulation of obscene and pornographic content through mobile applications hosted on digital play-stores operated by major technology companies. The Court directed Google LLC and Apple Inc. to take immediate and strict action to curb the dissemination of vulgar and sexually explicit content through applications available on their respective platforms.

The observations came during the hearing of a Public Interest Litigation filed in the matter titled Rubika Thapa v. Union of India & Others. The petition raised grave concerns regarding the alleged inaction of authorities and digital intermediaries in controlling mobile applications that allegedly promote pornographic live streaming, obscenity, immoral trafficking, extortion rackets, and other forms of organized cybercrime.

The Division Bench comprising Chief Justice DK Upadhyaya and Justice Tejas Karia observed that intermediaries cannot merely function as passive hosts waiting for complaints to arise. Instead, they are obligated under the Information Technology framework to exercise due diligence before permitting applications to be uploaded and distributed through their digital marketplaces.

The petition particularly highlighted several applications allegedly engaged in displaying explicit and vulgar content through live streaming services. According to the petitioner, these platforms were not only facilitating circulation of obscene material but were also allegedly being misused for criminal activities such as prostitution rackets, honey-trap schemes, deepfake-enabled extortion, illegal financial transactions, substance abuse networks, and organized cybercrime.

The matter assumes importance in the backdrop of increasing public concern regarding digital platforms hosting user-generated content without adequate regulatory oversight. With smartphones and internet access penetrating every section of society, the availability of explicit applications through mainstream app stores has raised larger constitutional and legal questions concerning free speech, intermediary liability, child protection, public morality, cyber safety, and national security.

The petition specifically invoked provisions of the Information Technology Act, 2000, including Sections 67, 67A, and 67B, which criminalize publication and transmission of obscene and sexually explicit material in electronic form. It also referred to relevant provisions of the Bharatiya Nyaya Sanhita, 2023 concerning obscenity and organized crime.

The controversy further involved the role of intermediaries under the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021. These rules impose obligations upon digital intermediaries to exercise due diligence, remove unlawful content, cooperate with governmental agencies, and ensure safe digital ecosystems.

During the proceedings, the Court was informed that several applications named in the petition were allegedly operating from foreign jurisdictions including the United States, Turkey, Russia, Japan, and China, thereby complicating regulatory enforcement under Indian law.

The High Court’s observations are particularly notable because they indicate judicial willingness to hold large technology corporations accountable not merely for reactive compliance but also for proactive content governance. The Court emphasized that technology companies running digital marketplaces must exercise caution and responsibility before allowing applications to become accessible to Indian users.

The case therefore lies at the intersection of constitutional law, information technology law, cyber regulation, intermediary liability, public morality, and national security concerns. It also reflects the growing judicial focus on balancing technological innovation with legal accountability in an increasingly digitized society.

Arguments of the Parties:

The petitioner, Rubika Thapa, approached the Delhi High Court through a Public Interest Litigation highlighting the alarming rise of mobile applications allegedly disseminating obscene, vulgar, and pornographic material through mainstream digital platforms such as Google Play Store and Apple App Store.

Represented by Advocate Tanmaya Mehta along with other counsel, the petitioner argued that many of these applications host live-streaming services involving explicit nudity and sexually suggestive content designed specifically to attract users and generate revenue. It was contended that the nature of the content available on these platforms exceeded ordinary obscenity and crossed into serious violations of statutory criminal law.

The petitioner submitted that such applications were not isolated instances of adult content but had evolved into platforms facilitating organized criminal activities. According to the plea, these applications were allegedly being misused for prostitution rackets, immoral trafficking, honey traps, extortion schemes, substance abuse networks, illegal financial transactions, and even deepfake-based cybercrime.

Particular emphasis was placed on the allegation that several applications enabled users to engage in private interactions involving explicit conduct in exchange for money. The petitioner argued that this ecosystem fostered exploitation and criminal conduct under the guise of digital entertainment.

The PIL named several applications including Tango.Me, Pure, Chamet, Bolo Ji, PyaarChat, Bling, StreamKar, LivHub, MuMu, Chato, Vibely, Fun Party, Hiiclub Pro, Jalwa, and Winku. According to the petitioner, these applications openly hosted sexually explicit content and violated Indian legal standards concerning obscenity and cyber safety.

The petitioner further argued that the conduct of these platforms violated Sections 67, 67A, and 67B of the Information Technology Act, 2000, which prohibit publication and transmission of obscene and sexually explicit material through electronic means.

Additionally, the plea highlighted the challenge posed by the foreign origins of many of these entities. It was argued that several companies failed to disclose proper ownership details, registered offices, or operational transparency, while routing their services through servers located outside India. This, according to the petitioner, complicated law enforcement efforts and enabled such entities to escape effective regulatory scrutiny.

Another significant aspect raised in the petition concerned national security and economic implications. The petitioner alleged that some platforms were being used for laundering illicit funds through international channels and facilitating extortion through technologically manipulated content such as deepfakes.

On behalf of the petitioner, Advocate Tanmaya Mehta argued before the Court that the content available on these applications was “worse than pornographic material” and that these platforms were actively contributing to criminal exploitation and cyber abuse.

In response, counsel appearing for Google LLC submitted that the company already had a robust grievance redressal mechanism in place. It was argued that applications violating platform policies or legal standards could be removed upon receipt of complaints or identification of unlawful activity.

Google’s counsel sought to emphasize that intermediary platforms maintain reporting systems and compliance frameworks intended to address problematic content.

However, the Court appeared unconvinced by a purely reactive approach. The Bench observed that the issue was not merely whether Google could act upon receiving complaints, but whether it had an obligation to proactively ensure that such applications were not permitted to operate in the first place.

The Union Government, represented by Additional Solicitor General Chetan Sharma, broadly supported the concerns raised in the petition. The Additional Solicitor General emphasized that intermediaries under the 2021 IT Rules are required to play a proactive and responsible role in monitoring and preventing dissemination of unlawful content.

The Union Government stressed that digital intermediaries must go beyond minimal compliance and actively ensure that obscene and unlawful content does not become accessible through their platforms.

The Court also considered the role of the Indian Computer Emergency Response Team (CERT-In) in regulating cyber threats and coordinating responses to unlawful digital activity.

The proceedings therefore evolved into a larger debate regarding the extent of responsibility borne by technology corporations operating digital marketplaces in India. While intermediaries argued that mechanisms for complaint-based takedown existed, the petitioner and the Union Government insisted that due diligence obligations require more active scrutiny before applications are made available to the public.

Court’s Judgment:

After hearing the parties, the Delhi High Court expressed serious concern regarding the availability and circulation of obscene and pornographic content through mobile applications hosted on digital play-stores.

The Division Bench observed that intermediaries such as Google LLC and Apple Inc. cannot escape responsibility merely by claiming that they possess complaint-based takedown mechanisms. The Court emphasized that the obligations imposed under the Information Technology Rules, 2021 require intermediaries to exercise due diligence not only after complaints are received but also at the stage when applications are initially uploaded and approved for public access.

The Court made it clear that digital platforms facilitating app distribution occupy a central role in the online ecosystem and therefore must adopt greater responsibility in preventing dissemination of unlawful content.

The Bench observed:

“We expect that having regards to the averments made in the petition, the respondents no. 2 (Google LLC), 3 (Apple INC) and 4 (Indian Computer Emergency Response Team) shall act strictly to ensure that such dissemination of videos is immediately checked and the 2021 Rules are followed in their letter and spirit.”

This observation reflected the Court’s insistence that compliance with intermediary obligations cannot remain superficial or technical. Instead, intermediaries must implement the 2021 Rules in both substance and spirit.

The Court specifically referred to the due diligence obligations imposed under the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021. These rules require intermediaries to prevent hosting or dissemination of unlawful content, cooperate with authorities, and establish effective grievance mechanisms.

Importantly, the Bench underscored that the duty of intermediaries is not confined to responding to user complaints after harmful content has already spread. The Court indicated that technology companies must proactively assess applications seeking entry into their platforms and prevent those evidently facilitating unlawful activities from becoming available altogether.

The observations of the Court signify a judicial shift toward enhanced intermediary accountability in India’s digital regulatory framework.

The High Court also appeared conscious of the broader societal implications of unrestricted access to sexually explicit applications. By acknowledging allegations relating to extortion, honey traps, prostitution rackets, and organized cybercrime, the Court recognized that such platforms may produce consequences extending far beyond moral concerns.

Another significant aspect of the proceedings was the Court’s recognition that digital obscenity and cyber exploitation may intersect with issues of public order, financial crime, and national security. The allegations regarding deepfake-enabled extortion and international routing of illicit funds added urgency to the Court’s concerns.

At the same time, the Court refrained from issuing immediate coercive directions such as blanket bans or removal orders at the preliminary stage. Instead, it adopted a measured judicial approach by issuing notice and directing the concerned respondents to submit action-taken reports.

The Court therefore balanced procedural fairness with regulatory urgency. By requiring the respondents to explain steps already taken and measures proposed for future compliance, the Bench ensured that the matter would proceed through structured judicial scrutiny.

The matter was accordingly listed for further hearing on July 17.

The judgment is significant because it reinforces the evolving legal principle that intermediaries operating large-scale digital ecosystems cannot function as entirely passive entities. Indian courts are increasingly recognizing that technology companies possessing substantial control over content distribution must exercise corresponding responsibility.

The ruling also reflects the judiciary’s attempt to adapt traditional legal concepts of accountability to rapidly changing digital realities. Questions concerning obscenity, platform liability, algorithmic dissemination, and cyber exploitation now require courts to interpret existing legal frameworks in technologically complex contexts.

Furthermore, the decision may have wider implications for regulation of app stores, content moderation policies, and intermediary compliance standards in India. Technology corporations operating in India may face increasing judicial scrutiny regarding how they assess applications before making them publicly available.

Ultimately, the Delhi High Court’s intervention underscores that digital freedom cannot become a shield for unlawful exploitation, obscenity, or organized cybercrime. The Court’s observations send a strong message that intermediaries must actively contribute toward building a safer and legally compliant digital environment.