I spent yesterday away and unavailable on all social networking channels in observance of the #blackouttuesday movement and #theshowmustbepaused initiative. Rather than spending my free moments doom-scrolling and building up more and more anger over recent, horrifying events, I donated to Black Lives Matter International and the Houston Bail Relief Fund, drank a little bourbon, listened to some of my favorite Blues musicians, and just took a breath. The day without Facebook, Instagram, and Twitter was… well… refreshing. Liberating. I didn’t avoid the news, I simply consumed it via direct, non-artificially curated channels, and I discovered something. Now this is probably not new to some of you. Likely several of you have noticed it, but it’s a phenomenon that happens so subtly it can be hard to detect: Facebook’s feed algorithm is like gasoline and a match.
Okay, so I’m sure there are those of you saying, “duh”, and to be honest: I already knew this, but I was uncertain of just how insidious it could be. For example: when I was primarily sharing music-related content on my Facebook feed, the overwhelming majority of what I saw on my feed was music-related. Doing as the algorithm is supposed to do, it assessed my interests and began to feed me content that was related to that interest. And it did a damn good job of it too – it connected me with some truly interesting shit, so that damned thing works… all too well, I’m afraid.
When my posts turned a little more political… well… I’m pretty sure you can do the simple math here: My feed slowly began to change. Music was still in the mix, but now Facebook started interjecting posts that aligned closely with what it determined to be my political views. Mind you, this is when I was sharing items of a political nature in a more informational capacity. And then some racist-assed fascist, piece of shit, motherfucker killed Mr. Floyd and I got really angry. I angrily shared some content. Facebook’s algorithm picked up on that anger pretty swiftly and efficiently and started to put before me items which made me increasingly outraged and angry. The music moved way down in priority, and acutely upsetting items which happened to align with my anger moved to the top.
Facebook’s feed algorithm presented me items which pushed me into confirmation bias pretty easily and thoroughly. Now, as an information professional I try really hard to keep an eye on this shit. I try to be very conscientious about what I share and I try to ensure the veracity of the shared items by sticking to authoritative sources as much as possible when it’s not just dumb stuff for shits and giggles. I’ve deleted posts because they were inaccurate and eaten my fair share of crow as gracefully as possible. Yesterday’s social networking blackout afforded me some needed perspective and gave me a chance to sort this out in my head. And then the alarm went off and hasn’t stopped going off since: If this “interests” algorithm could do this to me, what’s it doing to people who are already steeped in confirmation bias, partisanship, and prejudice? If the algorithm so efficiently matches interests… well… the thought is frightening. Buzzfeed had some thoughts on the matter back in 2016. Go on, give it a read. I’ll wait…
I’m pretty sure I know a handful of people who have been radicalized by Facebook’s feed algorithm – I really can’t trace the origins of their ideological shift to any other source. I mean I witnessed it at work: react to one post, Facebook’s algorithm takes notice. Share a post like the one to which you reacted, and Facebook begins to tilt the feed in that particular direction. Share and react to a bunch of posts on that same topic and the floodgate opens. Before you know it you’re steeped in an echo chamber of posts that align very accurately with the preference you’ve expressed to the exclusion of rational, informed, and opposing opinions. I imagine it’s worse within Facebook groups of shared interests.
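The feedback loop I’m describing – react, the system notices, the feed tilts, you react more – can be sketched as a toy ranker. This is purely my own illustration, not Facebook’s actual code; the topic names and weights are made up, and a real system would use far more signals:

```python
# Toy sketch of an engagement-weighted feed (hypothetical, not Facebook's
# real system): every reaction bumps a topic's weight, and the next
# ranking pass surfaces more of that topic -- a simple feedback loop.

from collections import Counter

def rank_feed(posts, interest):
    """Order posts by the user's accumulated interest in each topic."""
    return sorted(posts, key=lambda topic: interest[topic], reverse=True)

def react(interest, topic, weight=1):
    """Reacting or sharing raises the topic's weight for future ranking."""
    interest[topic] += weight

# Start out as a music person who occasionally touches politics.
interest = Counter({"music": 5, "politics": 1})
posts = ["music", "politics", "music", "politics"]

# Initially, music dominates the top of the feed.
assert rank_feed(posts, interest)[0] == "music"

# A burst of angry political engagement...
for _ in range(10):
    react(interest, "politics")

# ...and now the ranker leads with politics, crowding music down.
assert rank_feed(posts, interest)[0] == "politics"
```

The point of the sketch is that nothing here is malicious on its face – it’s just optimizing for what you interact with – yet the loop has no counterweight, so sustained reaction to one kind of post steadily crowds out everything else.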
Unfortunately, I have no real remedy other than to encourage people to verify the stuff they see on social networks before they hit “share” or “like”, and to use the old “count to ten” method of taking stock of one’s anger reaction… But that’s not really a good remedy because the people this advice would help the most won’t heed it – they can’t see outside of the neat little stall their social network channels have made for them. In the past, I’ve considered deleting my Facebook account and the other social networking channels as well, but that’s not a solution for anyone other than me. I have arrived at the opinion that everyone should immediately eschew social networks; the human mind was not designed to exist like this, with poisonous ideas being rammed down our throats on a near constant basis, reinforced by a terminal case of FOMO.
People are giving weaselly, little turd biscuits like Mark Zuckerberg the power and total permission to brain fuck them for profit. It’s important to remember that this is how they make their money: the more you interact with the content, and the longer you remain in the network scrolling and reacting and sharing, the more money they make. The reaction and share buttons are communicating your preferences alright, but not just to the people who follow you and whom you follow. Your preferences are telling Facebook’s algorithm what content you want to consume, and it feeds it to you down a chute, all veiled under the illusion of actual choice when really: it’s choosing for you after a while.
I really wish there were more I could do about this; it’s a big wall to surmount as an almost-librarian. I’m working on a strategy; in the meantime, feel free to share some of these resources with people who may need them. They won’t thank you for your efforts – they don’t like being pulled out of their carefully constructed comfort zones – but we have to do something.