This time it’s personal.

Last month, my daughter Sukee, a science journalist and social media maven, wrote and published an article about a scientific expedition to Thwaites Glacier in Antarctica. When she posted the story on Facebook on April 22 — in time for Earth Day — she ended up getting a lesson in politics.

[Photo: Sukee at the New England Aquarium in a hammerhead shark costume]

Sukee started delivering science to the people as a New England Aquarium educator in 2012. With a van full of tidepool animals, she brought the aquarium experience to schoolchildren across the region. At the aquarium, she wrote and delivered live presentations on climate and overfishing to hundreds of visitors. Now, as a digital editor at NOVA, where her peers have described her as genetically engineered for the job (you’re welcome, PBS), she serves an online audience of more than a million.

The glacier article got some traction among those followers and was worthy of a boost via ad spending on Facebook. With the boost, the post would be shown to Facebook users between the ages of 18 and 65 who are interested in science, Antarctica, and climate change mitigation. Within hours, Facebook rejected the boost: because the post’s social copy contained the phrase “climate change,” it deemed the post politically sensitive.

For a journalist at an institution with science education at its core, whose performance is largely measured by audience engagement, it was a lost opportunity. As Sukee’s dad, I felt it was unfair. And as a climate activist, I wanted to say, “WTF Facebook!”

Scientists go to great lengths to gather data, as do the journalists who tell their stories. Should Facebook deem these tales of science “too political to boost,” thereby lowering their chances of gaining wider exposure?

Surprised to learn that “clean energy” and “climate change” are categorized as political topics on Facebook, I asked preeminent climate scientist Katharine Hayhoe, professor and co-director of the Climate Center at Texas Tech University, about her feelings on the topic:

The more we know, the more we care.

Those were Professor Hayhoe’s words as she described her climate outreach work. Her video series Global Weirding, produced by KTTZ Texas Tech Public Media, answers tough climate questions in an easy-to-understand and entertaining format. She mentioned Global Warming’s Six Americas to illustrate her approach to public engagement. At one end of the scale are the alarmed and the concerned: folks like us at 350 CO and our allies. At the other end are the doubtful and the dismissive, a small but often vocal opposition to climate action. In the middle are the cautious and the disengaged, who are more likely to be swayed than those at the extremes.

She got the word out by posting videos on YouTube and promoting them on Facebook. Posts promoting the series were boosted to a segment of her target audience – the many friends and family of series fans – people not necessarily on board with, or even curious about, climate change, but potentially open to learning more.

This changed about a year and a half ago when Facebook quietly added “clean energy” and “climate change” to its list of political topics. Hayhoe was no longer permitted to boost posts without first registering as a political organization. “Over my dead body,” the scientist said, adding, “a thermometer doesn’t show a different temperature depending on how you vote.” Standing firm for scientific integrity, she refused to comply even after Facebook relabeled these so-called ‘political’ topics as simply ‘sensitive.’

Since Facebook’s policy change, Hayhoe has seen a notable decline in new subscribers. She told me the growth rate for her Global Weirding series “was cut in half.”

Facebook’s policy puts scientists and journalists at a disadvantage, forcing them to vie for attention with recommendation algorithms that promote increasingly radical views in a quest for billions in ad dollars, because “that’s what gets the clicks,” said Hayhoe. Just as worrisome are Facebook’s fact-checking partnerships – first with the climate-denialist Weekly Standard and more recently with an arm of the anti-science media site the Daily Caller, founded by Fox News host Tucker Carlson.

Penn State climatologist Michael Mann put it bluntly when he described Facebook as “another tool in the toolbox used by fossil fuel interests and plutocrats to confuse the public and policymakers.”

It’s not just Facebook: Twitter has a bot problem.

A 2018 study at Brown University showed that over 15% of Twitter users discussing climate change were actually bots. These fake accounts – engineered to sow discord and false information in the climate debate – were responsible for 20% of all tweets collected in the study. More recently, climate journalist Emily Atkin wrote that despite its ban on political advertising, Twitter has allowed Big Oil – in its quest for government handouts during the pandemic – “to continue their historically effective, yet highly deceptive practice of corporate reputation advertising to buy public support for burning fossil fuels.”

Google’s AI: A rise of the machines isn’t the real threat

And it’s not just climate: an Oxford study showed that artificial intelligence (AI), analytics, paid ads, and search engine optimization (SEO) are employed to spread disinformation and manipulate public opinion across the globe, especially during elections.

University of North Carolina Professor Zeynep Tufekci cautions us in her 2017 TED talk that the same algorithms companies use to get you to click on ads also influence your access to political and social information. She says, “What we need to fear is not what artificial intelligence will do to us on its own but how the people in power will use artificial intelligence to control us and to manipulate us in novel, sometimes hidden, subtle and unexpected ways.”

And just as we ‘go to press,’ The Atlantic has published an eye-opening story by Tufekci explaining Facebook’s symbiotic relationship with the political right. Highly recommended.

Sukee’s story and my related inquiry led me down a rabbit hole of black-box algorithms and dystopian visions that some part of me wishes I could unsee. These social platforms are not the flat, open, virtually democratic forums they claim to be. They are in the business of making money, lots of it, with algorithms that monetize our privacy and policies that put our fragile democracy at risk.

Be an educated information consumer. Viewer, beware: check the source, question motives, and get another opinion. And don’t just give away your privacy. If you haven’t recently, please check your (and your kids’, if you’ve got ’em) privacy settings.

Ron Bennett, architect &
decarbonization advocate

Did you learn something from this blog post? If so, consider donating to 350 Colorado to help us produce and share more educational content like this!