Jared Schumacher
The virtues of Facebook are manifest to those who frequent the social media behemoth: it allows the almost instantaneous "sharing" of information with those to whom we wish to be connected, de-limiting the ways in which information is communicated and the social pot is stirred. This is to say that Facebook’s virtue is that it closes the distance gap by enabling immediate connection.
But its limits as a medium are becoming increasingly obvious the more it has been appropriated. For the past decade or so, detractors have been lining up to criticize the vacuity of a virtual social bond and the way it detrimentally affects an individual user's real-life sense of self and his or her perceived community. Witness the Catfish phenomenon as a critical case in point. But these criticisms often focus somewhat narrowly on the virtuality of relationship, its abstraction from concrete human embodiment. Because humans are embodied beings – so the argument goes – they require concrete interaction. And Facebook cannot de-limit this human need virtually.
A recent article adds a distinctive critical voice to the chorus. In a piece for Wired magazine, Mat Honan summarizes the results of his comical-because-tragic experiment to "like" everything on his Facebook feed.
The premise was simple: Honan clicked the "like" button for everything that was posted on his Facebook wall. Some of this content was posted by friends, and – because Facebook's business model is dependent upon "suggested content", which really means advertisements placed within one's wall as if from friends – some of it by Facebook itself. As Honan explains:
See, Facebook uses algorithms to decide what shows up in your feed. It isn’t just a parade of sequential updates from your friends and the things you’ve expressed an interest in. In 2014 the News Feed is a highly-curated presentation, delivered to you by a complicated formula based on the actions you take on the site, and across the web. I wanted to see how my Facebook experience would change if I constantly rewarded the robots making these decisions for me, if I continually said, “good job, robot, I like this.”
The results of the experiment were fascinating. As he details, "[a]fter checking in and liking a bunch of stuff over the course of an hour, there were no human beings in my feed anymore. It became about brands and messaging, rather than humans with messages." The robots used his encouragement in the form of "liking" to create a marketing echo chamber, and a radicalizing one at that. As his friends on the political right and left posted things that he then "liked", his feed was funneled to aggregation sites like Huffington Post and Buzzfeed, which continued to feed him ad nauseam with similar polarizing stories, products, and pages, all for him "to like", but little of which he actually did.
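The runaway dynamic Honan describes can be illustrated with a toy model. The sketch below is purely hypothetical – Facebook's actual ranking formula is proprietary and far more complex – but it shows the basic mechanism: if every item shown gets a "like", and likes raise the ranking weight of the liked item's source, whichever source happens to lead at the start quickly monopolizes the feed.

```python
# Toy model of a "like"-driven feed ranker (illustrative only;
# not Facebook's actual algorithm, whose details are proprietary).
def build_feed(posts, weights, size=5):
    # Rank posts by the accumulated like-weight of their source.
    ranked = sorted(posts, key=lambda p: weights[p["source"]], reverse=True)
    return ranked[:size]

posts = (
    [{"source": "friend", "text": f"friend post {i}"} for i in range(10)]
    + [{"source": "brand", "text": f"sponsored post {i}"} for i in range(10)]
)

weights = {"friend": 1.0, "brand": 1.0}  # start with no bias at all

# Simulate indiscriminately liking everything shown, as Honan did.
for _ in range(20):
    for post in build_feed(posts, weights):
        weights[post["source"]] += 1.0  # every like rewards that source

# After a few rounds, one source has crowded out the other entirely.
print(weights)
print([p["source"] for p in build_feed(posts, weights)])
```

Because Python's sort is stable, the source listed first wins the initial tie, collects all the early likes, and is then shown – and liked – exclusively thereafter: a feedback loop in miniature, with no malice required, only indiscriminate approval.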
The cascade effect this experiment had on others is also worthy of note. Facebook frequently posts liking activity on friends' feeds. So, when Honan went on his liking rampage, his friends were involuntarily drawn into his experiment. Their feeds became colonized by his activity, limiting their exposure to articles and content they might actually otherwise have "liked."
After two days of liking activity, and some time for reflection, Honan concludes his experiment with this piece of epigrammatic analysis: "By liking everything, I turned Facebook into a place where there was nothing I liked. To be honest, I really didn’t like it. I didn’t like what I had done." Indiscriminate liking had the ironic effect of radicalizing while banalizing everything.
Despite this unhappy ending, the experiment did help to expose one of the most obvious and yet under-recognized facts of human interaction on the web. There seems to be a broad but naive assumption that, because "the internet" is a medium, it is itself unmediated. But this is categorically untrue. In ways we are only beginning to understand, the media we use to interrelate, however virtually, are themselves mediated by those who provide the services in the first place. Facebook's algorithm, pushed to the extreme in this experiment, exposes the prejudices of those who designed it. Facebook exists to make money for its (now) public investors, and those investors profit from the radicalization of news and products and from the filtering of information to sites that are themselves heavily monetized. It seems that for us humans there is no unmediated reality, virtual or otherwise. Even if 'the virtual' de-limits our capacity to interact at a distance, it does so precisely by delimiting in other ways. Which means that, while it may indeed close the distance gap, Facebook also functions as a social filter, creating different gaps in its place. We tell Facebook who our friends are and what we "like" or don't like, which gives us the appearance of control over the information that we allow to inform us; in actuality, Facebook controls our social network through a process of selection, limiting our exposure to information in ways that are not altogether clear.
To some extent, there is nothing perfidious in all of this. There is simply too much information in the world, even when filtered through our "friends", for us to digest in any meaningful way at a given moment. We need filters. However, the concealment of this selective process, the veiling of the algorithm and its predilections for proprietary reasons, molds our perception of reality far more than we are aware. When this concealment is allied with the economic self-interest Facebook possesses as a publicly traded company, its tendency to represent reality in its own best financial interests is compounded.
Only by becoming aware of what exactly is happening when we like Facebook can the morality of its use be adequately judged. Only when we are first clear about what we like about and on Facebook can we use it responsibly. Otherwise, Facebook will take our liking behavior as a tacit affirmation that its robots are constructing a (view of) reality that we actually like. And because (virtual) reality is deeply interconnected, what we "like" affects, and sometimes afflicts, those we truly care about.