The closed-door hearing followed accusations in newspaper reports that Facebook was allowing anti-Muslim hate speech on the platform and that its top policy official in India had shown favoritism toward Prime Minister Narendra Modi’s Bharatiya Janata Party. The social media giant has denied the allegations.
After a hearing that lasted three and a half hours, the committee “agreed to resume discussions later, including with representatives of Facebook,” chairman Shashi Tharoor said in a tweet.
Tharoor, an opposition Congress party lawmaker, did not give any details about the hearing.
India is Facebook’s largest market with nearly 328 million users. Facebook also owns WhatsApp, which has more than 400 million users in India.
As usage has spread across India, Facebook and WhatsApp have become fierce battlegrounds for India’s political parties. Leaders of Modi’s Hindu nationalist party have come under scrutiny for running online campaigns laced with false claims and attacks on the minority Muslim population.
Modi's party and its leaders have repeatedly denied the allegations and instead accuse Facebook of censoring pro-India content.
On Tuesday, technology minister Ravi Shankar Prasad wrote to Facebook CEO Mark Zuckerberg and said the platform was censoring content posted by right-wing users.
In August, the main opposition Congress party wrote two letters to Zuckerberg asking him to specify the steps being taken to investigate allegations about Facebook’s operations in India, after a Wall Street Journal report said Facebook India’s head of public policy, Ankhi Das, “opposed applying hate-speech rules” to members of Modi’s party even after the issue was flagged internally.
In the second letter, the party said it was considering “legislative and judicial action” to make sure “a foreign company cannot continue to cause social disharmony.”
Two Facebook spokeswomen did not immediately comment. On Aug. 21, the company denied any bias toward Modi’s party and said it was “open, transparent and nonpartisan.”
The company said in a recent email to AP that it enforces content moderation policies globally "without regard to anyone’s political position or party affiliation.”
Facebook and WhatsApp have often been used to spread hate speech to incite deadly attacks on minority groups amid rising communal tensions across India.
A 2019 analysis by Equality Labs, a South Asia research organization, showed that groups sharing anti-Muslim content on Facebook included supporters of Modi’s party or were linked to Rashtriya Swayamsevak Sangh, a Hindu nationalist paramilitary volunteer organization and the ideological parent of the BJP. It found that 93% of the hate speech reported to Facebook was not removed.
Thenmozhi Soundararajan, executive director of Equality Labs, said Facebook lacks the capacity to remove widespread hate speech on its own and has been disingenuous and slow to act.
“They have no interest in removing violent users because it is against their business interests,” Soundararajan said.
She said Facebook India must ensure diversity on its content moderation team and allow consumer oversight of hate content.
The controversy comes as Facebook and Jio, India’s cheapest and most popular phone service provider, await a green light from India’s Supreme Court to roll out WhatsApp Pay, an e-commerce and digital payments platform poised to help the social media behemoth further penetrate India’s trillion-dollar digital market.
Facebook invested $5.7 billion in cash in Reliance Jio, a subsidiary of Reliance Industries, whose chairman, Mukesh Ambani, is India’s richest man.