For research purposes, a Facebook researcher created a new account to see what the platform was like for a user living in Kerala, India. For the next three weeks, the account was operated by a simple rule: follow every recommendation generated by Facebook’s algorithms to watch videos, explore new pages, and join groups on the site.
The result was unexpected: the account was inundated with misinformation, hate speech, and celebrations of violence, all of which were documented in an internal Facebook report published later that month. The report was one of dozens of memos and studies written by Facebook employees that show the company struggling with misinformation, hate speech, and celebrations of violence in India, one of its biggest markets.
They presented stark evidence of one of the most scathing criticisms that politicians and human rights activists have levied against such world-spanning companies: that they expand into a country without fully understanding its potential impact on local politics and culture, and then fail to deploy the resources to act on problems once they occur.
Around 340 million people in India use Facebook and its various platforms, making the country one of the company’s largest markets. But the problems Facebook faces on the subcontinent are an amplified version of the issues it has confronted around the world, made worse by its lack of resources and expertise in India’s 22 officially recognized languages.
The material was collected by Frances Haugen, a former Facebook product manager who became a whistle-blower and recently testified before a Senate subcommittee about the company and its platforms. References to India were scattered among the documents Ms. Haugen filed with the Securities and Exchange Commission in a complaint earlier this month.
According to the documents, Facebook did not devote sufficient support and resources to India and was unable to grapple with the problems it had introduced there, including anti-Muslim posts.
Of the company’s global budget for time spent classifying misinformation, 87 percent is earmarked for the United States, while only 13 percent is set aside for the rest of the world, even though North American users make up only 10 percent of the social network’s daily active users, according to one document describing the allocation of Facebook’s resources.
Andy Stone, a Facebook spokesman, said the figures were incomplete and did not include the company’s third-party fact-checking partners, most of whom are outside the United States.
The lopsided focus on the United States has had consequences in a number of countries besides India. For example, company documents showed that during Myanmar’s election in November, Facebook introduced measures to demote misinformation, including disinformation shared by the Myanmar military junta.
Facebook said it had trained its A.I. systems on five of India’s 22 officially recognized languages. But in Hindi and Bengali, it still did not have sufficient data to adequately police content, and much of the content targeting Muslims “is never flagged or actioned,” the Facebook report said.
Ten days after the researcher opened the fake account to study misinformation, a suicide bombing in the disputed border region of Kashmir set off a spike in violence and a round of accusations, misinformation, and conspiracy theories between Indian and Pakistani nationals.
A report published in March 2021 showed that many of the problems cited during the 2019 elections persisted.
In India, “there is a question about resourcing” for Facebook, but the answer is not “simply throwing more money at the problem,” said Katie Harbath, who spent ten years at Facebook as a public policy director and worked directly on securing India’s national elections. Instead, she said, Facebook needs to find a solution that can be applied in countries around the world.
The report said several dehumanizing posts compared Muslims to “dogs” and “pigs,” alongside misinformation claiming that the Quran, the holy book of Islam, calls for men to rape their female family members.
In the internal report, titled Adversarial Harmful Networks: India Case Study, Facebook researchers wrote that there were groups and pages “replete with misleading and inflammatory anti-Muslim content” on Facebook. The report also showed that Facebook had considered designating Bajrang Dal a dangerous organization because it was “inciting religious violence” on the platform, but it has not yet done so.
“Join the group and help run it; increase the number of members of the group,” said one post seeking recruits on Facebook to spread Bajrang Dal’s messages. “Fight for justice and truth until the unjust are destroyed.”
Edited by Anupama Roy