In an April hearing focused on online viewpoint suppression, House Judiciary Committee Chairman Bob Goodlatte (R-Va.) declared his intent to make sure Big Tech corporations answered for their content filtering practices. This week, Goodlatte made good on that declaration by convening a hearing that featured officials from Facebook, Twitter, and Google’s YouTube.
In announcing this week’s hearing, Goodlatte touted the amazing possibilities of social media, but also warned, “[T]his same technology can be used to suppress a particular viewpoint and manipulate public opinion.” He hoped this hearing would help the Big Tech companies explore “how they can be better stewards of free speech in the United States and abroad.”
The chairman did not mince words in his opening statement, particularly as he pointed to a recent case in which Facebook, citing a hate speech violation, blocked a Texas newspaper’s post around Independence Day. Shockingly, the text in question was the Declaration of Independence. “Think about that for a moment,” Goodlatte said. “If Thomas Jefferson had written the Declaration of Independence on Facebook, that document would have never seen the light of day. No one would be able to see his words because an algorithm automatically flagged it—or at least some portion of it—as hate speech.”
Monika Bickert, Facebook Head of Global Policy Management; Juniper Downs, YouTube Global Head of Public Policy and Government Relations; and Nick Pickles, Twitter Senior Strategist on Public Policy, all said they regretted mistakes, but attempted to convince committee members that their content moderation decisions were not ideologically driven. Bickert declared Facebook’s “deep commitment to building a community that encourages and fosters free expression.” YouTube’s Downs said, “We have a natural and long-term incentive to make sure that our products work for users of all viewpoints.” And Pickles told the committee that Twitter’s rules are “not based on ideology or a particular set of beliefs.”
Democrats on the committee went further. In April, Rep. Jerrold Nadler (D-N.Y.), the top Democrat on the committee, declared, “The notion that social media companies are filtering out conservative voices is a hoax—a tired narrative of imagined victimhood.” In that vein, Rep. Jamie Raskin (D-Md.) this week rebuked the “imaginary narrative” and “conservative fantasy” that gave rise to the hearing, and other Democrats echoed that sentiment. They (as well as protesters who appeared to have been given access to the hearing room by Democrat staff) also trained their fire on Facebook. A number of Democrats seemed to fault Facebook for, in their opinion, being too attentive to the complaints of conservative consumers.
Sadly, the problem of viewpoint discrimination is not a hoax. While some may not wish it to be true, NRB has in fact documented a pattern of Christian and conservative content being censored by Silicon Valley titans. Its Internet Freedom Watch initiative (internetfreedomwatch.org) shows examples of censorship in a timeline dating back to Apple’s 2010 discrimination against the late Chuck Colson’s Manhattan Declaration app. One of NRB’s objectives in launching Internet Freedom Watch in December 2017 was to seek congressional oversight, specifically calling for hearings on the problem of internet censorship.
In a letter from NRB President & CEO Dr. Jerry A. Johnson, which was included in the April hearing’s public record by Chairman Goodlatte, Johnson said we were at a “pivotal moment.” He called on tech titans to do the right thing by “proactively and publicly declaring a robust and unapologetic stand for the free speech rights of their users.” Johnson also thanked Goodlatte for holding the hearing, because, he said, “We must determine if we are facing algorithmic or human discrimination and what is being done to correct this problem in either case.”
Many committee members attempted to unearth details about how the tech companies make content determinations and what level of legal responsibility they should bear for those choices. Notably, in his remarks at the hearing, Goodlatte said, “The online environment is becoming more polarized—not less; and there are concerns that discourse is being squelched—not facilitated. Moreover, society as a whole is finding it difficult to define what these social media platforms are and what they do. For example, some would like to think of them as government actors, as public utilities, as advertising agencies, or as media publishers—each with its own set of legal implications and potential shortfalls.”
Goodlatte further declared, “It’s clear, however, that the platforms need to do a better job explaining how they make decisions to filter content and the rationale for why they do so.”
By Aaron Mercer, Vice President of Government Relations
Published: July 20, 2018