Facebook reveals what content it suppresses but won’t delete, though key details remain hidden


Topline

Facebook released its content distribution guidelines on Thursday, reiterating that it limits the spread of “problematic or low-quality” content, including clickbait, misinformation and sensationalized health claims, as the company faces increasing pressure from users and legislators to disclose how it decides what content to display.

Key facts

According to the content distribution guidelines published Thursday, Facebook reduces the reach of content shown in its News Feed for many reasons, including spam, unsafe reporting on suicide and “posts from widely untrusted news publishers.”

Facebook said the guidelines, previously shared in piecemeal announcements, have been newly collected in the Transparency Center the platform launched a few months ago.

Facebook said it reduces the reach of content for three reasons: to incentivize creators to produce “high-quality and accurate content,” to foster “safer communities” and to respond to direct user feedback.

Publishing these measures, which apply to questionable posts that don’t warrant removal from the site, in one place should make Facebook’s internal processes clearer, the company said, as it faces growing pressure around the world to reveal how it controls what users see.

According to the guidelines, Facebook will also demote posts from people who “may have multiple accounts to evade enforcement,” news articles that lack clear authorship, and links leading to pages containing explicit or shocking content.

Things we don’t know

The guidelines provide only a high-level view of how Facebook controls the News Feed. For example, they do not specify how content is demoted, by how much its reach is reduced, or whether different types of posts are demoted in different ways.

Key background

The guidelines are not a new initiative by Facebook, but they shed a small amount of light on the company’s largely invisible and poorly understood practice of shaping what users see. Recent investigations by the Wall Street Journal and the New York Times have revealed details of a project to show people positive stories about Facebook, separate rules for high-profile users and politicians, and the company’s full awareness that its products can be harmful to users, including young girls. In 2014, Facebook disclosed a secret experiment conducted on nearly 700,000 people to determine whether it could manipulate users’ emotions through the News Feed (the researchers found that it could).

Further reading

No More Apologies: Inside Facebook’s Push to Defend Its Image (The New York Times)

Facebook Oversight Board Will Investigate Whether Company Has Different Rules for Powerful Users (Forbes)

Facebook starts sharing more about what it demotes in the News Feed (The Verge)

Facebook Tried to Make Its Platform a Healthier Place. It Got Angrier Instead. (Wall Street Journal)


