Concerned about the perpetuation of hate and mistrust? Missed any of the “historic” antitrust hearings featuring top chiefs from Amazon, Google, Apple, and Facebook?
Don’t worry. As entertaining as it was for those missing the point, the highlight of the most recent hearing was not Rep. Sensenbrenner getting Facebook and Twitter confused; it was Mark Zuckerberg clarifying his position on sharing and the democracy of speech.
His clarification is an object lesson in standing one’s ground, even as that ground continues to rumble and seize.
From where I sit, he’s got a lot of ground to stand on, and with the amount of time he has been spending sowing the seeds of doubt about TikTok, there is a lot more ground to cover.
Zuckerberg has said that Facebook wants people to share their ideas. Facebook does not want to become the arbiter of truth.
Zuckerberg also made it clear that he will not allow advertisers to determine content. A vital guiding principle? A convenient populist truth? Before settling on either, it’s essential to dig into what’s driving the current turmoil.
Largely anonymous truthy groups with massive social reach have become emotion machines for humans and cash machines for Facebook. Emotion leads to sharing, sharing leads to engagement, and engagement supplies much-needed dopamine.
There is no financial incentive whatsoever to bring this to an end, and it doesn’t favor political parties. Ban one and the optimization engine will just shift to the other. Think of it as an emotional machine whack-a-mole.
Although the subject did come up a couple of times, much of the discussion missed the mechanics that have led to the current advertiser boycott. It seems very few people outside the mystical walled garden understand how all the hate comes together, and like the great secret Google algorithm, maybe it’s better that way.
To examine a key part of the machine fueling the ad industry’s demand for Facebook change, have a look at what happened with The Houston Chronicle’s recent printed obituary supplement. While the situation in Houston right now is very fluid, here is a breakdown of how the story unfolded and an illustration of how the toxic cocktail of social infrastructure and media plays out all over the world, without respect for party lines.
The Houston Chronicle recently published a 43-page printed obituary section. The Chronicle does this quarterly. The obituaries are sold as advertisements; only those who purchase ads are included, and some of those memorialized died of COVID-19.
Newsweek covered the Chronicle’s obituary section with the headline: “Texas Newspaper Prints 43-Page Obituary Section as Coronavirus Deaths Soar.”
According to Newsweek, that equated to 900 people in greater Houston dying of all causes in the first six months of 2020. Among its frequently cited “sources” were Tweets. Are you surprised that Bari Weiss named Twitter the ultimate editor of The New York Times in her resignation letter?
The media coverage was followed by a flurry of Tweets, like this one from Rogelio Garcia Lawyer’s account, which has about 93,000 followers. It was retweeted over 55,000 times and liked by more than 120,000 people.
The nonprofit news organization Truthout disseminated Garcia’s Tweet to over 825,000 followers. Truthout’s post was shared, and “engaged” with emotion, thousands of times.
Truthout’s page states that it is “dedicated to providing independent reporting and commentary on a diverse range of social justice issues.” There was no reporting and no commentary on Garcia’s Tweet. It wasn’t even a retweet; it was a photo of a Tweet.
Truthout has over 1,300 such photos in its timeline. Text and other markers help with hate-content mitigation, but a photo of a Tweet, like the one in my example, is far harder to screen.
It’s hard to count the number of groups on Facebook, but the scenario I just played out happens many times a day for over a billion people around the world. Other forms of social media and news outlets everywhere feed the machines.
People have been living on dopamine-evoking headlines for a long time, but massive emotion aggregators, cancel culture, a complicit media, and a technical infrastructure that facilitates retreating into brain-chemical sympathetic bubbles have created the perfect storm of hate.
This isn’t a political problem — it’s a sharing enablement problem. Facebook wants people to share their ideas. If people were sharing their own ideas, not the trigger bait via emotion mills, the problem would be so much easier to manage.
Zuckerberg said fake content hurts his business because people don’t want to hear bad things. But hate is bad too, and like any other emotion, it can be used to manipulate. The profit motive demands sharing. Billions of fake accounts have been removed, but the sharing machines are prolific.
Zuckerberg says that roughly nine out of ten pieces of hate speech are removed before they go live.
How do you pre-cognitively determine what is hate speech? If most of it is removed before it is ever reported, you are in fact the arbiter of truth. But who cares? Facebook’s stock is doing great.