
FOR IMMEDIATE RELEASE
September 25, 2021

FOR INFORMATION CONTACT:
Noelle Garnier
202-891-7843
ngarnier@nrb.org

Facebook’s “Transparency Center” Confirms What “Shadow Banned” Accounts Already Knew

WASHINGTON, D.C.— Not long ago, claims that certain social media accounts and content were being “shadow banned” were ridiculed as a hoax. Those days are over for good.

In the name of “insightful transparency,” Facebook has unveiled its Content Distribution Guidelines, a partial listing of the types of content whose visibility it limits on its platform or “demotes” in users’ News Feeds. To be clear, these are not new protocols; Facebook is simply revealing to the public some of the standards that already govern its content suppression practices.

In a blog post, Anna Stepanov, Director of Product Management at Facebook, connected these protocols to three of Facebook’s corporate values: responding to direct feedback, incentivizing investment in high-quality content, and fostering an environment where users feel “safer” from “problematic” content. 

“Some content may be problematic for our community, regardless of the intent,” Stepanov writes. “We’ll make this content more difficult for people to encounter.” 

Facebook’s partial list of “Types Of Content We Demote” includes:

  • Comments that Facebook predicts will be reported because comments like them have been reported before
  • Content that Facebook’s third-party fact-checkers have “debunked” as “False, Altered, or Partly False”
  • Content from news publishers that Facebook users “broadly” rate as “untrusted” in on-platform surveys
  • “Unoriginal” news articles lacking additional facts and analysis
  • Content “borderline to” or “likely violating” Facebook’s community standards
    • Besides violent and sexual imagery, this bucket includes content that could discourage COVID-19 vaccinations, or content posted by groups and pages “associated with (but not representing)” certain “conspiracy networks”

Perhaps your organization has sound editorial standards and wholesome audience engagement, and it isn’t associated with any “conspiracy networks,” yet your reach has been diminished nonetheless. Based on Thursday’s big reveal, simply playing by the rules (as much as any group speaking about religious beliefs can, given the broad and ideologically informed nature of Facebook’s content standards) won’t protect your ministry or media company from content suppression on Facebook.

Instead, you’re at the mercy of users—and not just those who “Like” your page and appreciate your messages. If hostile users flag and report your content because of religious or ideological viewpoint objections, Thursday’s updates confirm that Facebook’s stated corporate value of “responding to people’s direct feedback” may just win out, resulting in “demoted” posts and limited visibility. 

Furthermore, if your organization posts any information about COVID-19 recovery or treatments, even from the perspective of a personal experience or testimony, you may run afoul of the new guidelines if Facebook deems the information “sensationalized.” (The White House revealed in July 2021 that the federal government was assisting Facebook in identifying COVID-19 misinformation.) 

And that’s just the part we know. The true extent of Facebook’s content distribution policies remains cloaked in secrecy—but the platform’s history of suppressing Christian views on sexual orientation and gender identity, abortion, and religious liberty may provide clues.  

Enforcing “safe” conversation through algorithmic ideological judgments is a step toward a less humane world with less critical thought. The elevation of “safety” as a principle governing public expression will degrade the quality of public discourse, preclude countless challenging and worthwhile conversations, and silence messages with the power to transform hearts and minds.  

“We are wary that Facebook’s commitment to a ‘safe’ experience may amount to an ‘open season’ on religious viewpoints, while doing nothing to improve the quality of online discourse,” said NRB CEO Troy Miller. “In general, mob rule and guilt by association are two very bad ways to ‘improve’ the conversation.”

NRB opposes religious viewpoint discrimination in every form, including content moderation policies that limit the ability to teach the Bible, preach the gospel, and promote Christian values in the public square.  

Organizations must also proactively defend against these threats. One of the most important steps for a ministry or media company to take right now is to back up your content (videos, content lists, and more) somewhere other than the distribution platforms themselves. Don’t risk losing access to your content and messages by storing them solely on user accounts that can be suspended, locked, or removed. Ensure that your audience can access your digital media on more than one platform. Evaluate threats to your organization’s digital infrastructure before they happen. Finally, develop a plan to get the word out if Big Tech de-platforming strikes you, and communicate with us at NRB about any de-platforming actions taken against your organization.

NRB has monitored threats to religious freedom on new media platforms for over a decade. Today, as since its founding in 1944, NRB is committed to representing Christian broadcasting wherever threats to religious free speech emerge.
