The most important lessons to be learned from the Facebook Papers



Facebook is no stranger to the spotlight. A whistleblower and top company executives have been summoned to testify before Congress in response to the release of leaked internal research and documents. And while the company has come under fire repeatedly in recent years for its role in spreading misinformation, particularly during the 2016 election, the past two months have been especially turbulent.


Facebook (FB) whistleblower Frances Haugen made a series of disclosures to the Securities and Exchange Commission, and her legal counsel provided redacted versions of those disclosures to Congress, shedding new light on the inner workings of the tech giant. CNN was among a group of 17 US news organizations that reviewed the redacted versions of the documents delivered to Congress. Haugen's documents were also shared with the Wall Street Journal, which published a multi-part investigation showing that Facebook was aware of problems with its platforms.


Facebook has pushed back on Haugen's claims with a 1,300-word statement from CEO Mark Zuckerberg, who says the documents were cherry-picked to create a false narrative about the company.


The following are some of the most important takeaways from the tens of thousands of pages of internal documents that were reviewed.


The spread of misinformation


In one SEC filing, Haugen claims that "Facebook misled investors and the public about its role in the propagation of misinformation and violent extremism in connection with the 2020 election and the January 6th insurrection."


Documents obtained by The Intercept describe a June 2019 research project called "Carol's Journey to QAnon," which examined which pages and groups Facebook's algorithms would promote to a fake account set up in the name of Carol Smith, a fictional 41-year-old conservative mother. Within two days of following verified pages for conservative figures such as Fox News and Donald Trump, Facebook's algorithm recommended that Carol follow a QAnon page.


As a spokesperson for Facebook told CNN, "While this was a single hypothetical user study, it demonstrates the type of research the company conducts to improve its systems and assisted us in our decision to remove QAnon."


Another report, titled "Stop the Steal and the Patriot Party: The Growth and Mitigation of an Adversarial Harmful Movement," presents an analysis conducted after the January 6 Capitol riot, suggesting that Facebook could have done more to prevent the spread of the "Stop the Steal" movement that helped fuel it.


Additionally, leaked comments from some Facebook employees on January 6 suggest that the company may have played a role in the incident by failing to act more quickly to halt the growth of Stop the Steal groups on its platform.


According to CNN, in response to these documents, a Facebook spokesperson stated, "The responsibility for the violence that occurred on January 6 belongs to those who attacked our Capitol and those who encouraged them."


Lack of global language support


Haugen's disclosures made public internal Facebook documents and research showing the company's failure to curb hate speech and misinformation in countries such as Myanmar, Afghanistan, India, Ethiopia, and much of the Middle East, where coverage of many indigenous languages is insufficient.


Facebook's platforms are available in more than 100 languages around the world, according to a company spokesperson who spoke to CNN Business. The company's global content moderation teams are comprised of "15,000 people who review content in more than 70 languages in more than 20 locations," according to the spokesperson.


For example, in India, which has the largest Facebook user base, hate speech classifiers for Hindi and Bengali, two of the country's most popular languages spoken by more than 600 million people collectively, were unavailable for several years on Facebook. The lack of Hindi and Bengali classifiers, according to a Facebook researcher who spoke at an internal presentation on anti-Muslim hate speech, "means that much of this content is never flagged or removed."


According to CNN, Facebook says it added hate speech classifiers for Hindi in 2018 and Bengali in 2020, and for Tamil and Urdu more recently.


In response to reports about the leaked research, Miranda Sissons, Facebook's director of human rights policy, and Nicole Isaac, Facebook's international strategic response director, issued a statement on October 23. "The review and prioritization of countries with the highest risk of offline harm and violence is carried out every six months by our team of industry experts, who are recognized as leaders in their field. When a crisis occurs, we respond by deploying assistance that is tailored to the country in question."


Human trafficking


According to company documents reviewed by CNN, Facebook has been aware of human traffickers using its platforms at least since 2018. However, the company has struggled to crack down on related content.


According to an internal report published in September 2019, "Our platform makes it possible to complete all three stages of the human exploitation lifecycle (recruitment, facilitation, and exploitation) through the use of real-world networks... They used FB [Facebook] profiles, IG [Instagram] profiles, Pages, Messenger, and WhatsApp to facilitate trafficking, recruitment, and facilitation on behalf of these 'agencies.'"


Other documents describe how Facebook researchers identified and removed Instagram accounts purporting to sell domestic workers, along with the steps the company took to address the issue, including removing certain hashtags from the platform. A CNN investigation last week found a number of similar Instagram accounts advertising domestic workers for sale. After CNN inquired, a Facebook official confirmed that the accounts violated the company's policies, and the accounts and posts have since been removed.


"We categorically prohibit the exploitation of human beings," Facebook spokesperson Andy Stone stated. "We've been fighting human trafficking on our platform for many years, and our goal remains the same: to prevent anyone looking to exploit others from finding a home on our platform," says the company.


Inciting violence around the world


It appears from internal documents that Facebook was aware that its current strategies were insufficient to prevent the spread of violent posts in countries considered "at risk" of conflict, such as Ethiopia, and that it was working to improve them.


Facebook works with third-party fact-checking organizations to identify, review, and rate potential misinformation on its platform, using an internal tool that surfaces content flagged as false or misleading through a combination of artificial intelligence and human moderators.


Ethiopia, where a civil war has been raging for the past year, ranks in the top tier of Facebook's Conflict Risk Index of countries at risk of conflict. Yet a March internal report titled "Coordinated Social Harm" found that Ethiopian armed groups were using Facebook to incite violence against ethnic minorities in the "context of civil war." "Current mitigation strategies are insufficient," the report warned in a bold headline.


This is not the first time that concerns have been raised about Facebook's role in the promotion of violence and hate speech. The United Nations (UN) expressed concern about Facebook's role in the Myanmar crisis in 2018, and the company admitted that it had not done enough to prevent its platform from being used to incite violence. Facebook CEO Mark Zuckerberg promised to increase the company's moderation efforts as a result.


"I genuinely believe that many lives are on the line — that Myanmar and Ethiopia are only the beginning of the story," Haugen said in his presentation to the consortium.


A Facebook spokesperson said the company has invested $13 billion and employs 40,000 people to work on the safety and security of its platform, including 15,000 people who review content in more than 70 languages across more than 20 locations around the world. The spokesperson added that Facebook's third-party fact-checking program includes more than 80 partners who review content in over 60 languages, with 70 of those fact-checkers based outside of the United States.


The impact on teenagers


Facebook has made a concerted effort to grow its young adult audience, according to the documents. This is despite internal research indicating that the company's platforms, particularly Instagram, may have a negative impact on young adults' mental health and well-being.


Facebook has previously acknowledged that young adult engagement with the Facebook app is "low and continuing to decline," and the company has taken steps to reach out to that demographic in order to increase engagement. Among its strategies to "resonate and win with young people" was a three-pronged approach aimed at convincing young adults to "choose Facebook as their preferred platform for connecting with the people and interests that matter to them." These prongs included "fundamental design and navigation changes to foster a sense of connectedness and entertainment" and "ongoing research to concentrate on youth well-being and integrity efforts," according to the report.


The Wall Street Journal first reported that Facebook's internal research found the company's platforms "exacerbate body image issues for one in three teen girls," a finding the company itself has confirmed. The research also found that "13.5 percent of teen girls on Instagram say the platform exacerbates thoughts of 'Suicide and Self Injury,'" and that "17 percent of teen girls on Instagram say the platform exacerbates 'Eating Issues,' such as anorexia."


Asked about the internal research on September 14, Instagram's head of public policy, Karina Newton, said the company "stands by" it, but claimed that the Wall Street Journal "focuses on a limited set of findings and paints them in a negative light."


Algorithms that fuel division


In 2018, Facebook made a change to its News Feed algorithm to give priority to "meaningful social interactions." According to internal company documents obtained by CNN, Facebook discovered shortly after the change that it had sparked widespread outrage and division on the social media platform.


A late-2018 study of 14 Facebook publishers, titled "Does Facebook reward outrage?", found that the more negative comments a Facebook post attracted, the more likely its link was to be clicked.


"The mechanics of our platform are not neutral," one member of the team wrote.
