Mike Caulfield, former director of community outreach for MIT’s OpenCourseWare program, discusses whether Facebook’s current strategy for users sharing content is evidence of confirmation bias – that people see something they agree with and share it – or whether opinions are shaped by what shows up on a given timeline:
“The process that Facebook currently encourages, on the other hand, of looking at these short cards of news stories and forcing you to immediately decide whether to support or not support them trains people to be extremists. It takes a moment of ambivalence or nuance, and by design pushes the reader to go deeper into their support for whatever theory or argument they are staring at. When you consider that people are being trained in this way by Facebook for hours each day, that should scare the living daylights out of you.”
This thought is indeed frightening, and I can definitely say I’ve fallen victim to it. News stories splash across my timeline in no discernible order thanks to Facebook’s algorithm. A headline strikes me as intriguing, and the person who shared it is someone I trust and, like me, someone who ostensibly believes in checking sources. So I hit the share button and move on. I don’t check the sources, and often I haven’t even read the article. But I remember the headline, and that headline then informs my perception of related news going forward. Caulfield continues:
“And the problem is that — unlike previous social sites — Facebook doesn’t know, because from Facebook’s perspective they have two goals, and neither is about the quality of the community or well-being of its members. The first goal is to keep you creating Facebook content in the form of shares, likes, and comments. Any value you get out of it as a person is not a direct Facebook concern, except as it impacts those goals. And so Facebook is designed to make you share without reading, and like without thinking, because that is how Facebook makes its money and lock-in, by having you create social content (and personal marketing data) it can use.
“The second Facebook goal is to keep you on the site at all costs, since this is where they can serve you ads. And this leads to another problem we can talk about more fully in another post. Your average news story — something from the New York Times on a history of the Alt-Right, for example — won’t get clicked, because Facebook has built their environment to resist people clicking external links. Marketers figured this out and realized that to get you to click they had to up the ante. So they produced conspiracy sites that have carefully designed, fictional stories that are inflammatory enough that you *will* click.”
Here is the crux of the issue. Facebook’s business model relies on unpaid content created by its users and on doing everything possible to keep people on the site, obsessively scrolling through and refreshing their timelines. Facebook is one of the most powerful communication tools in the world at the moment, and rather than using that power responsibly and with the common good in mind, the company remains slavishly devoted to its bottom line to the detriment of society. Mark Zuckerberg can hang as many #BLM signs in Facebook headquarters as he wants, but until the company puts responsibility ahead of profits, it is just another exploitative corporate power using progressive causes as a marketing ploy.