
Four revelations from the Facebook Papers

Facebook is battling its most serious crisis since the Cambridge Analytica scandal, after a whistleblower accused the company of putting “profit over safety” and shed light on its inner workings through thousands of pages of leaked memos.

The documents were disclosed to US regulators and provided to Congress in redacted form by Frances Haugen’s legal counsel. A consortium of news organisations, including the Financial Times, has obtained the redacted versions received by Congress.

Earlier this month, Haugen testified in Congress that the social media company does not do enough to ensure the safety of its 2.9 billion users, plays down the harm it can cause to society, and has repeatedly misled investors and the public. The Wall Street Journal also published a series of articles called the Facebook Files.

Here are four striking revelations contained in the documents:

Facebook has a big language problem

Facebook is often accused of failing to moderate hate speech on its English-language sites, but the problem is far worse in countries that speak other languages, even after the company promised to invest more following criticism of its role in facilitating genocide in Myanmar in 2017.

A 2021 document warned that the company had very few content moderators for the Arabic dialects spoken in Saudi Arabia, Yemen and Libya. Another study of Afghanistan, where Facebook has 5 million users, found that even the pages explaining how to report hate speech were mistranslated.

Facebook allocated just 13% of its budget for developing misinformation-detection algorithms to the world outside the US © Ilana Panich-Linsman/The Washington Post/Getty

These failings occurred even though Facebook’s own research had marked some of the countries as “high risk” because of their fragile political landscapes and the frequency of hate speech.

According to one document, in 2020 the company allocated 87% of its budget for developing misinformation-detection algorithms to the US, with just 13% going to the rest of the world.

Haugen said Facebook should be transparent about the resources it devotes to each country and language.

Facebook often doesn’t understand how its own algorithms work

Several documents show Facebook struggling to understand how its own algorithms behave.

According to a September 2019 memo, men were being served 64% more political posts than women in “almost every country”, with the problem particularly acute in African and Asian countries.

Men tended to follow more accounts producing political content, according to the memo, but Facebook’s feed-ranking algorithm also played a significant role.

Facebook found that men were served 64% more political posts than women in “almost every country” © Paul Morris/Bloomberg

A June 2020 memo concluded it was “virtually guaranteed” that Facebook’s “major systems show systematic bias based on the race of the affected users”.

The author suggested that the news feed ranking might be more influenced by people who share frequently than by those who share and engage less often, something that may correlate with race. As a result, content from some races was being prioritised over others.

Facebook made it harder to report hate speech even as its AI fell short

Facebook has long said its artificial intelligence programs can detect and remove hate speech and abuse, but the files show the technology’s limits.

According to a March 2021 memo by a group of researchers, the company takes action on as little as 3-5% of hate speech and 0.6% of violent content. Another memo suggests its AI may never get beyond 10-20%, because it is “very difficult” for the technology to understand the context in which language is used.

Nonetheless, Facebook had already decided in 2019 to rely more on AI and to cut the money it spent on human moderation of hate speech. In particular, the company made it harder to report hate speech and to appeal moderation decisions.

“When fighting hate speech on Facebook, our goal is to reduce its prevalence, which is the amount of it that people actually see,” Facebook said, adding that hate speech accounted for just 0.05% of what users see, a figure that had fallen by 50% over the previous three quarters.

Facebook fiddled while the Capitol burned

The documents reveal Facebook’s struggle to contain the explosion of hate speech and misinformation on its platform around the January 6 riots in Washington, which caused turmoil inside the company.

Facebook turned off emergency safeguards against hate speech before the Capitol riot in Washington on January 6 © Leah Mills/Reuters

According to the memos, the company turned off certain emergency safeguards in the wake of the November 2020 election, then scrambled to turn them back on as violence flared. One internal assessment found that the rapid implementation of measures was held up by waiting for approval from the policy team.

Even proactive action failed to have the desired effect. In October 2020, Facebook publicly announced it would stop recommending “civic groups”, which discuss social and political issues. However, due to technical difficulties in implementing the change, 3 million US users were still being served daily recommendations for at least one of the 700,000 identified civic groups between mid-October 2020 and mid-January 2021, according to one research note.

Facebook’s response

Facebook declined to comment on some of the specifics of the allegations, but denied that it prioritises profit over people’s safety and wellbeing: “The truth is we have invested $13 billion and have more than 40,000 people to do one job: keep people safe on Facebook.”
