Opinion

It’s imperative to mete out exemplary punishment to those behind the ‘Bulli Bai’ app to deter others in future

Cyberviolence, especially cyberviolence with sexual overtones such as the Bulli Bai app, is nothing but a symptom of a society that allows misogynistic and Islamophobic behaviour to thrive

Representative image (IANS Photo)

On January 1, the Mumbai Police registered a case against the developers of an app, “Bulli Bai”, hosted on GitHub, which used photos of approximately 100 Muslim women, obtained and doctored without their permission, to mock-auction them to other users on the app. First information reports (FIRs) were also registered against Twitter handles that promoted the app. Based on a complaint by social activist Khalida Parveen, an FIR against the now-deleted app has also been registered in Hyderabad. The National Commission for Women, too, wrote to the Delhi Police Commissioner, asking that an FIR be registered immediately.

This is not the first time something like this has happened. In May 2021, a YouTube account under the name ‘Liberal Doge’ mock-auctioned Muslim women from India and Pakistan to its 87,000-strong audience. Then in July 2021, an app called “Sulli Deals”, also hosted on GitHub, generated profiles of more than 80 Muslim women, using images from their social media accounts, and described them as “deals of the day”. The police registered an FIR against unknown persons at the time, but no arrests were made.

Thereafter, in November 2021, accounts on the popular social audio app, Clubhouse, were also created to mock-auction Muslim women using the Sulli Deals tag.

In all instances, outspoken women, mostly from Muslim backgrounds, have been listed and their pictures morphed. These include investigative journalist Ismat Ara; radio jockey Sayema; actors Shabana Azmi and Swara Bhasker; Fatima Nafees, the 65-year-old mother of Jawaharlal Nehru University student Najeeb Ahmad, who disappeared from his campus in 2016; and Pakistani Nobel Peace Prize laureate Malala Yousafzai, among many others.

Some of the profiles generated on the app were of teenage girls. Both “Sulli” and “Bulli” are derogatory slang terms used to refer to women from the Muslim community.

Fortunately, the attention Bulli Bai has garnered has resulted in some action this fourth time around. Two people have been arrested: Vishal Kumar Jha, a 21-year-old engineering student, from Bengaluru, and Shweta Singh, an 18-year-old engineering aspirant who is the main accused, from Uttarakhand. Jha has been remanded to custody by a metropolitan court till January 10, while Singh remains under arrest.

They have been charged under Sections 153 (wantonly provoking with intent to cause riot), 153B (publishing any imputation that any class of persons cannot, by reason of their being members of any religious, racial, language or regional group or caste or community, bear true faith and allegiance to the Constitution of India), 295A (deliberate and malicious acts intended to outrage religious feelings), 354 (assault or use of criminal force against a woman with intent to outrage her modesty), 500 (punishment for defamation), and 509 (insulting the modesty of a woman) of the Indian Penal Code (IPC), and Section 67 (publishing lascivious material) of the Information Technology Act, 2000.

Nikhil Narendran, a partner in Trilegal’s telecom, media, and technology practice, believes an argument can be made that the content amounts to obscenity, being lascivious or appealing to prurient interests. However, new precedent would have to be established to obtain a conviction.

On this front, provisions on obscenity in the IPC (Sections 292, 293, and 294) and the IT Act (Section 67) could be applied. Provisions of the Indecent Representation of Women (Prohibition) Act, 1986 may also be applied, in his opinion, as the content is “derogatory, denigrates women and is against public morality.”

Union Information Technology Minister Ashwini Vaishnaw has announced that the Indian Computer Emergency Response Team (CERT-In), the nodal agency for monitoring cyber security incidents, has been asked to form a high-level committee to investigate the case.

GitHub is a cloud-based code hosting service that allows developers to store and manage their code and keep track of any changes made to it. It is built on Git, an open-source version control system under which the full codebase and its history are available on every developer’s computer; GitHub adds a user-friendly interface that makes it easier for participants or team members to collaborate.
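
For readers unfamiliar with this workflow, the short sketch below (in Python, using only the standard subprocess module; the repository URL and branch name are hypothetical and not drawn from this case) illustrates the basic cycle the paragraph describes: cloning a repository from GitHub, recording changes locally, and pushing them back so collaborators can see them.

    # A minimal sketch of the Git/GitHub workflow described above.
    # Assumes Git is installed locally; the remote URL and branch name are hypothetical.
    import subprocess

    def git(*args: str) -> None:
        """Run a git command and raise an error if it fails."""
        subprocess.run(["git", *args], check=True)

    REMOTE = "https://github.com/example-org/example-project.git"  # hypothetical remote

    git("clone", REMOTE, "example-project")                      # copy the codebase and its full history locally
    git("-C", "example-project", "checkout", "-b", "my-change")  # start a separate branch for new work
    # ... edit files inside example-project/ ...
    git("-C", "example-project", "add", ".")                             # stage the local edits
    git("-C", "example-project", "commit", "-m", "Describe the change")  # record them in the local history
    git("-C", "example-project", "push", "-u", "origin", "my-change")    # publish the branch back to GitHub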

This platform was used to build and host the two mock-auction apps, which used pictures stolen from women’s social media accounts and invited users to bid on them. GitHub has taken down the app without divulging the identity of the people involved, and issued a statement saying, “GitHub requires content to be respectful and civil; threats of violence are forbidden, and speech that attacks a person or a group on the basis of their identity is not allowed.”

Karthik Venkatesh, a programme manager at Delhi-based think tank The Dialogue, said, “Under the intermediary rules, a platform acts as a ‘passive conduit’ when it comes to content that is hosted on it. They are bound by their community guidelines published on the website, which categorically state that ‘bullying and/or harassment content’ or ‘pornographic content’ is disallowed on the platform. Repeated incidents of women’s photographs being uploaded and auctioned on GitHub point towards a disturbing trend for online safety.”

As to GitHub’s liability under the Intermediary Rules, Narendran says that “if GitHub takes down the content, it will not be liable, provided that it complies with the due diligence requirements.”

Since GitHub is headquartered overseas, any information collection by Indian law enforcement will have to go through the Mutual Legal Assistance Treaty (MLAT) process. An MLAT is an agreement between two or more countries to facilitate the gathering and exchanging of information to enforce criminal or public laws. Requests for assistance can involve the examination and identification of people, places, or things, questioning, transfer of custody, or other assistance from law enforcement agencies in the receiving country, when the object of investigation is within its territory. GitHub is based out of the United States, a country with which India has entered into a Treaty on Mutual Legal Assistance in Criminal Matters.

“In this instance, there are four major stakeholders: the women, the platform (GitHub), law enforcement, and the owners/developers of the app. To conduct an investigation and to ensure that they gather the requisite evidence on the complaints by the women against the perpetrators of the obscene content, the police must go to the platform and request data. Since GitHub is a foreign company, the regular route followed for data-sharing requests is the MLAT process, which is long and tedious and in dire need of reform,” Venkatesh said.

In his view, such reform “must concentrate on building consensus on the standards that will govern law enforcement access to data in a timely manner. Given the nature of the internet, building cooperation on matters such as this will be beneficial for all parties. Towards this, a data protection framework that lays down institutional mechanisms and obligations of data fiduciaries that are at par with global standards will allow for swifter data sharing. As far as legislative reforms go, legal mandates on auditing the practices of platforms and on enforcing community guidelines against content that is illegal or pertains to a crime will help in demystifying the black box that platforms operate in.”

The Delhi Police had, last year, sought approval for an MLAT request when FIRs were registered against the Sulli Deals app. They received permission from the Union Government earlier this week, and have sought information from GitHub regarding the developer of the Bulli Bai app. Twitter has also been asked to block and remove related offensive content from its platform.

There has been a worrying increase in the number of instances of harassment of women online. While this is often shrugged off as men being men, much of this online “locker room talk” has the potential to manifest as real-world violence against women. Discouragingly, very little is done by law enforcement to curb such cyber offences.

According to Venkatesh, “[t]he prosecution rates for cyber offences have remained low, despite the enactment of the IT Act and the IT rules. One could point to law enforcement’s lack of resources, manpower and technical expertise, while the financial resources allocated for fighting cybercrimes remain low. Despite thousands of complaints to the National Crime Records Bureau (NCRB) on Child Sexual Abuse Material, the conversion of these complaints into FIRs remains abysmal.”

Online gender-based violence is just as harmful as offline violence, and is rooted in the gender inequalities persistent in society. Not only will continued online violence such as this have a chilling effect and lead to self-censorship by women, it will also have long-term mental health consequences if left unaddressed.

Cyberviolence, especially that with sexual overtones such as the Bulli Bai app, is a symptom of a bigger problem – a society that deprives women of their right to equally participate in political, social, and cultural life; one that is misogynistic and Islamophobic.

In this case, the targeting of vocal women, largely belonging to a minority community, is an express assertion of power. This power imbalance must be addressed with awareness programmes and complaint and protection mechanisms, accompanied by strong punishments that will deter such behaviour in the future.

(IPA Service)

Views are personal

Courtesy: The Leaflet
