the social algorithm and social engineering

This is an extract from the introduction to the Facebook-run study appearing in the current edition of the journal Science. It reminds us that the answer to the question of whether the communicative protocol deployed by a social institution (in this case, Facebook) affects the quality of the communication itself is an emphatic yes. Here, that effect works to the social (and fiscal) benefit of the institution, and if it means more eyeball time that individuals spend on Facebook, efforts in this direction (social engineering) will only increase. The centralization of power overlays the nexus of control of the protocols through which we communicate. This is nothing new: from the British Crown licensing printing presses, to censorship boards, to Stasi registration of typewriters, to military propaganda machines, to network television. In some of those cases, though, the institution holding control of the protocol is not the same as the one deploying it. When the two coincide, the control tends to be dominant, all-encompassing, or else invisible (the fish-knowing-about-water problem).

(Translation: read the word "curation" in what follows as "control" to doubly underscore the concept.)

The specific deliberative issue that Bakshy et al. examine is whether Facebook’s curation of news feeds prevents the intersection of conflicting points of view. That is, does a “filter bubble” emerge from this algorithmic curation process, so that individuals only see posts that they agree with (5)? Such an algorithmic sorting has the potential to be unhealthy for our democracy, fostering polarization and undermining the construction of a vision of the common good.
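As a toy illustration of what such algorithmic sorting means in practice (this is a hypothetical sketch, not Facebook's actual algorithm, whose features and weights are proprietary), imagine each post and each reader assigned an ideology score, with the curation step ranking posts by how closely they align with the reader:

```python
# Toy sketch of ideological feed curation. NOT Facebook's algorithm; the
# ideology scores, the distance metric, and the cutoff are all illustrative
# assumptions. Each post and reader gets a hypothetical score in [-1, 1];
# the "curation" ranks posts by proximity to the reader's score, so
# ideologically dissonant posts sink out of view -- a filter bubble.

def curate_feed(posts, reader_ideology, top_k=3):
    """Return the top_k posts ranked by ideological proximity to the reader."""
    ranked = sorted(posts, key=lambda p: abs(p["ideology"] - reader_ideology))
    return ranked[:top_k]

posts = [
    {"id": "a", "ideology": -0.9},
    {"id": "b", "ideology": -0.2},
    {"id": "c", "ideology": 0.1},
    {"id": "d", "ideology": 0.7},
    {"id": "e", "ideology": 0.95},
]

# A reader scored at 0.9 mostly sees aligned posts; the dissonant
# posts "a" and "b" never make the feed at all.
feed = curate_feed(posts, reader_ideology=0.9)
print([p["id"] for p in feed])  # -> ['e', 'd', 'c']
```

The point of the sketch is that no post is ever censored outright; the bubble emerges purely from a ranking rule, which is what makes the effect easy to miss from inside it.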

Their answer, after parsing the Facebook pages of ~10 million U.S. individuals with self-declared ideologies, is that the curation does ideologically filter what we see. However, this effect is modest relative to the choices people themselves make that filter information, including who their friends are and what they choose to read given the curation. The deliberative sky is not yet falling, but it is not completely clear either.

This is an important finding and one that requires continued vigilance. A small effect today might become a large effect tomorrow, depending on changes in the algorithms and in human behavior. Ironically, these findings suggest that if Facebook incorporated ideology into the features the algorithms pay attention to, it could improve engagement by removing dissonant ideological content. It is also notable that Facebook announced on April 21 (well after the analysis conducted in this paper) three major changes to the curation of news feeds. These changes had benign objectives, such as ensuring that one sees updates from "the friends you care about" (6). It is plausible, however, that the friends Facebook infers you care about also tend to be ideologically aligned with you, accentuating the filtering effect. Furthermore, the impact of curation on other dimensions of deliberative quality on Facebook remains to be examined. Open questions include whether the curation privileges some voices over others, and whether certain types of subjects are highlighted in a way that systematically undermines discussion of the issues of the day (pets over politics).

The impact of social algorithms is a subject with rich scientific possibilities, not least because of the enormous data streams captured by these socio-technical systems (7). It is not possible to determine definitively whether Facebook encourages or hinders political discussion across partisan divides relative to a pre-Facebook world, because we do not have anywhere near the same quality or quantity of data for the pre-Facebook world. The existence of Facebook, Twitter, etc., should nonetheless be a boon to the study of political deliberation, because it is now possible to study these systems at a societal scale.

Lazer, D., 2015. The rise of the social algorithm. Science. Available at: https://www.sciencemag.org/cgi/doi/10.1126/science.aab1422 [Accessed May 15, 2015].
