#13: Who Picks Up the (Fact) Check?

What public meetings tell us about divisive algorithms

Welcome to the Civic Signals newsletter, a project dedicated to building better public digital spaces. If you’re new here and like what you see, please subscribe to get this weekly email in your inbox.

Keywords: Binary code 

| bahy-nuh-ree kohd |
n. a coding system that represents text, computer processor instructions, or other data with a two-symbol system, usually with 0s and 1s.

The modern binary number system, which serves as the basis of most computer coding, was invented by Gottfried Leibniz in 1689. It uses sequences of zeroes and ones to represent other numbers, letters, and words, converting verbal statements into logic. However, Leibniz was inspired by a much earlier text: the I Ching, or “Book of Changes,” an ancient Chinese divination text (dating to around the 9th century BC) that uses broken and solid lines representing yin and yang, respectively. Those lines stack into hexagrams to represent words, which then become part of a sequence in the book. The hexagram below represents the number 61, or Inner Truth.
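Leibniz’s reading of the hexagrams maps neatly onto arithmetic: treat each solid line as a 1 and each broken line as a 0, and the six stacked lines become a six-bit number. A minimal sketch of that reading (the line sequence here is illustrative, not a transcription of any particular hexagram):

```python
def hexagram_value(lines):
    """Read six hexagram lines as binary digits: solid = 1, broken = 0.

    `lines` is ordered from the highest-order bit to the lowest.
    """
    value = 0
    for line in lines:
        value = value * 2 + line  # shift left, then append the next bit
    return value

# Six lines read as the bits 1 1 1 1 0 1:
print(hexagram_value([1, 1, 1, 1, 0, 1]))  # 61
```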

This method of stringing together information laid the logical foundations for communication systems like Morse code and Braille and for computing systems like ASCII and Boolean algebra. For an example of the trappings of binary thinking, consider how politicians have cautioned that reopening after the pandemic is NOT like a light switch.
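The same two-symbol idea is how ASCII turns text into bits: each character gets a number, and each number gets a fixed-width binary string. A quick sketch:

```python
def to_binary(text):
    """Return each character of `text` as an 8-bit binary string."""
    return " ".join(format(ord(ch), "08b") for ch in text)

print(to_binary("Hi"))  # 01001000 01101001
```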


What’s Clicking

Online

Offline

Linked

This week’s Double Click
Coronagrifting: A Design Phenomenon by Kate Wagner of McMansionHell


Shopping Checkout

This week, Twitter and Facebook entered the 2020 election content-moderation fray in earnest. As both tech giants reckon with what exactly their responsibilities in the 2020 election look like, President Trump’s prospective executive order has amped up the volume on the fight, with some chilling effects on free speech.

The Washington Post obtained a draft executive order that would empower federal regulators like the FCC and the FTC to rethink Section 230 of the Communications Decency Act of 1996. That law has been interpreted to spare tech platforms the liability for user-generated content that publishers bear for editorial content. But the law is meant to shield platforms from the behavior of their users, not to institute political neutrality. The move comes just after Twitter decided to attach a warning label fact-checking the president’s tweet about mail-in voting and voter fraud.

Meanwhile, Facebook CEO Mark Zuckerberg went on Fox News on Wednesday to say that the company would not be fact-checking the president: "I believe strongly that Facebook shouldn't be the arbiter of truth of everything that people say online." (Note: Facebook is, in fact, doing its own fact-checking.) Later Wednesday night, Twitter CEO Jack Dorsey clapped back at Zuckerberg’s comments:

Section 230 may seem wonky, but at its core, it’s a question we’re familiar with: What does it mean to have private internet platforms become our new public squares?

One of the Supreme Court precedents the drafted executive order leans on is actually helpful for illustrating what kind of public space tech platforms are. Pruneyard v. Robins (1980) centers on the right to exercise free speech in the parts of a private shopping mall that were regularly open to the public. In that case, a California mall tried to stop high school students from soliciting signatures on a petition against a United Nations General Assembly resolution. The draft executive order would combine that logic with a 2017 decision that defines social media as “protected space” under the First Amendment. As tech journalist Kara Swisher put it, “That’s right, shopping centers is the argument. For you kids, that was where Paul Blart works.”

But in this instance, the president isn’t being prevented from speaking on these platforms; he’s just getting corrected by their own exercise of free speech. If social media platforms are like shopping malls, then this is like managing what gets posted on a bulletin board at the entrance. Whether posting warnings will actually curb much truth-vandalism is another question.

Instead, this federal action hints at harming tech platforms’ advertising businesses with antitrust threats, after conservatives have launched failed anti-bias lawsuits against these companies. In this shopping-mall analogy, the Trump administration looks a lot more like an aggressive vendor pushing stores to sell its red hats than like anyone staging an actual First Amendment-protected protest on sites that have the right to exercise editorial control.

A Tribe Called Stressed

Earlier this week, the Wall Street Journal published a story about how Facebook’s algorithms encourage division. Over 2017 and 2018, research by Facebook’s Integrity Team on divisive content and bad behavior found that the priority on “user engagement,” a metric that combines time spent, likes, shares, and comments to rank content on the site, had aggravated the problem of polarization and tribal behavior.
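To make the mechanism concrete: an engagement metric of this kind is, at its simplest, a weighted sum of interaction signals. Facebook’s actual formula and weights are not public; the values below are purely illustrative.

```python
def engagement_score(time_spent_sec, likes, comments, shares):
    """Hypothetical engagement score: a weighted sum of interaction signals.

    The weights are made up for illustration; the real ranking system's
    structure and weights are not public.
    """
    return (0.01 * time_spent_sec  # time on content counts a little
            + 1.0 * likes          # lightweight reactions count more
            + 2.0 * comments       # comments signal stronger engagement
            + 3.0 * shares)        # shares amplify reach the most
```

Because outrage reliably drives comments and shares, any metric shaped like this will tend to rank divisive content highly, which is the dynamic the Integrity Team identified.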

That team combatted “fake news, spam, clickbait, and inauthentic users” and found that the bad behavior was coming from a “small pool of hyperpartisan users.” Even before the team’s creation, Facebook researchers had found signs that group-recommendation algorithms were steering people toward joining extremist groups.

Civic Signals’ Eli Pariser is quoted in the WSJ story:

At Facebook, “There was this soul-searching period after 2016 that seemed to me this period of really sincere, ‘Oh man, what if we really did mess up the world?’ ” said Eli Pariser, co-director of Civic Signals, a project that aims to build healthier digital spaces, and who has spoken to Facebook officials about polarization.

Mr. Pariser said that started to change after March 2018, when Facebook got in hot water after disclosing that Cambridge Analytica, the political-analytics startup, improperly obtained Facebook data about tens of millions of people. The shift has gained momentum since, he said: “The internal pendulum swung really hard to ‘the media hates us no matter what we do, so let’s just batten down the hatches.’ ”

But as the internal debate went on, recalibrating the platform for the public good reportedly lost Zuckerberg’s interest. The story traces a dramatic shift from that mea culpa to today’s more combative Facebook: in January, Zuckerberg said he would stand up “against those who say that new types of communities forming on social media are dividing us,” and the Journal reports that, privately, Zuckerberg doesn’t see social media as bearing responsibility for polarization.

Maybe there’s some truth to that. Sit in on any local neighborhood planning meeting and you can witness heated debates about everything from building new apartments to putting in bike lanes. Even among people of the same red or blue political tribe, outrage and resistance to change can emerge in nearly any venue that doesn’t have mechanisms to curb our natural gravitation towards conflict. But Facebook’s leadership demurred on making political judgments part of its content ranking system, finding no way to distinguish the coordinated effort of partisan “super sharers” from a hypothetical Girl Scouts troop selling cookies.

But what city leaders have been reckoning with recently is how representative that audience is of the community at large. You can get a lot of people to come to a meeting if they’re mad and have enough time on their hands, but that crowd won’t represent the moderate opinion of everyone who’s just okay with a decision and doesn’t have the time, between work, school, and play, to show up to a meeting or get into an argument on the internet.

It even seemed as if the urgency of COVID-19 might help social platforms find some sort of less outrage-driven reset, as our co-director Talia Stroud mentioned in sharing highlights from her research in a Harmony Labs livestream on Wednesday:

You had this golden opportunity where everyone is thinking about loving their neighbor. … In March, what did well on Facebook and what people were craving were the core critical information needs that the public had: Where do I get tested? How many people are testing positive? 

One month later, if you look at what’s doing well on social? It’s local political disagreement that gets the most shares, the most reactions, the most comments. I want so badly to have this conversation about what a beneficial media business model looks like. I think there was a window there where people weren’t looking for the same kind of vitriol that does well but I am nervous that moment has passed.

I’ll take two boxes of Thin Mints,
Andrew Small

Illustrations by Josh Kramer

PS. We’re still taking in submissions for our #MakeItDigital feature from last week’s edition. Don’t be shy! Send us a note at civic.signals@gmail.com


Civic Signals is a partnership between the Center for Media Engagement at the University of Texas, Austin and the National Conference on Citizenship, and was incubated by New America.