By public sphere we mean first of all a realm of our social life in which something approaching public opinion can be formed. […] The expression ‘public opinion’ refers to the tasks of criticism and control which a public body of citizens informally – and in periodic elections, formally as well – practises vis-à-vis the ruling structure organized in the form of a state.
(Habermas, 1964, pp. 49-50)
Habermas spent his life theorizing about democracy and capitalism, among other things. He traced the emergence of the public sphere, investigated its features, and defended its importance. In 1964, the year Habermas published The Public Sphere, an analysis of the public sphere would have taken place in the context of newspapers, television, and radio; today, any discussion of the state and flourishing of the public sphere needs to take place in the context of social media.
Since the advent of Facebook and similar social media platforms, many scholars have seen in these platforms an opportunity to enhance democratic participation (Van Dijk, 2013). This is because social media platforms soon established themselves as an alternative medium through which to receive and discuss information. However, scholars have recently started to question their earlier enthusiasm.
Scholars have observed how social media algorithms predict and show the kind of content individuals might prefer to see on their feed, based on their previous activity. For instance, if you follow pages about music, you are more likely to see posts related to music on your feed. In this article, I will discuss four ways in which algorithms may be threatening online democracy.
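To make this mechanism concrete, here is a minimal sketch of what engagement-based feed ranking might look like. The data structures, topic labels, and scoring rule are illustrative assumptions for this article, not a description of any platform’s actual code.

```python
# Hypothetical sketch of engagement-based feed ranking: posts whose
# topics match a user's past engagement are scored higher and shown first.

from collections import Counter

def build_interest_profile(engagement_history):
    """Count how often the user engaged with each topic."""
    return Counter(topic for post in engagement_history for topic in post["topics"])

def rank_feed(candidate_posts, profile):
    """Order candidate posts by overlap with the user's interest profile."""
    def score(post):
        return sum(profile.get(topic, 0) for topic in post["topics"])
    return sorted(candidate_posts, key=score, reverse=True)

# A user who engages with music ends up seeing music first.
history = [{"topics": ["music"]}, {"topics": ["music", "concerts"]}]
candidates = [
    {"id": 1, "topics": ["politics"]},
    {"id": 2, "topics": ["music"]},
]
profile = build_interest_profile(history)
print([p["id"] for p in rank_feed(candidates, profile)])  # [2, 1]
```

Even in this toy version, the politics post is pushed below the music post purely because of past behavior, which is the dynamic the rest of this article examines.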
The practice of citizenship requires access to information and to the various communities in which citizens claim membership.
(Cohen, 2013, p. 1913)
1. Algorithms limit citizens’ ability to exercise control
Social media algorithms act as the contemporary gatekeepers of information: the timeline functions much like those search engines that tailor “both results and the accompanying advertising to what is known about the searcher” (Cohen, 2013, p. 1913). Because the timeline is tailored, only certain content and types of information become visible without the active intervention, or rather discomfort (Cohen, 2013), of users. By limiting the kind of information a user can access, algorithms shrink the realm over which the user, as a citizen, might be able to exercise “the democratic control of state activities” (Habermas, 1964, p. 50).
The algorithm that powers the News Feed, with the goal of driving engagement […] is arguably doing more damage to our politics than the most biased human editor ever could.
(Thompson, 2016)
2. Algorithms create filter bubbles
Algorithms are not only making it more difficult to access information fundamental for identifying the realms over which the public should exercise supervision; they also act as culturally and ideologically segregating forces that make it harder to initiate dialogue among different users, or rather citizens. Yet, as Habermas argued, dialogue is fundamental for a functioning public sphere. On social media platforms, the possibility of initiating dialogue and reaching understanding becomes harder due to the “subtle process of continual feedback [whereby] stimuli are tailored to play to existing inclinations” (Cohen, 2013, p. 1917), which segregates users into what are commonly called “filter bubbles” (Thompson, 2016).
News stories that are ideologically akin to users’ social and political beliefs are more likely to appear on their timelines, creating structural difficulties for users to engage in dialogue with people who hold different ideas (Thompson, 2016). Hence, users become “locked in” not only to the platform but, more specifically, to cultural bubbles built around distinct understandings of the world that hardly ever come into contact, which is precisely what reduces the possibility of reaching understanding (Cohen, 2013; Thompson, 2016). The ubiquitous and yet invisible algorithm through which users are modulated therefore does not provide favorable conditions for the initiation of dialogue, a fundamental feature of democratic public spheres.
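The “continual feedback” Cohen describes can be illustrated with a toy simulation. Everything here is an assumption for illustration: the engagement bias, the update rule, and the binary agree/disagree split are not drawn from any real platform, but they show how a ranker that rewards engagement can narrow a feed over time.

```python
# Toy simulation of the feedback loop behind filter bubbles (illustrative
# assumptions throughout, not a model of any real platform). Each round,
# the user mostly engages with ideologically agreeable posts, and the
# ranker responds by showing even more of them.

import random

def simulate_bubble(rounds=10, feed_size=10, agree_bias=0.9, seed=0):
    rng = random.Random(seed)
    weight_agree = 1.0  # the ranker's learned weight for agreeable content
    for r in range(1, rounds + 1):
        # Probability a feed slot shows agreeable content grows with the weight.
        p_agree = weight_agree / (weight_agree + 1.0)
        shown_agree = sum(rng.random() < p_agree for _ in range(feed_size))
        # The user engages mostly with agreeable posts; the ranker updates.
        engaged = sum(rng.random() < agree_bias for _ in range(shown_agree))
        weight_agree += engaged * 0.5
        print(f"round {r}: {shown_agree}/{feed_size} agreeable posts shown")

simulate_bubble()
```

Run it and the share of agreeable posts climbs round after round: no editor decided to exclude disagreeable views, yet the loop of tailored stimuli and biased engagement produces exactly the segregation described above.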
3. Algorithms weaken the power of social media to challenge discourses and ideas
The fact that individuals are locked in filter bubbles may be weakening the power of social media to act as a public sphere where ideas, ideologies, and discourses are challenged. As Fraser (1990) points out, the public sphere is constituted by multiple parallel arenas that “formulate oppositional interpretations of their identities, interests and needs” (p. 67). The multiplicity of publics emerging on social media platforms may be said to have the dual character of any other public:
On one hand, they function as spaces of withdrawal and re-groupment; on the other hand, they also function as bases and training grounds for agitational activities directed towards wider publics.
(Fraser, 1990, p. 68)
However, by segregating social media users into filter bubbles, algorithms limit the ability of publics to reach out to the wider arena and establish a communicative process that could take “the form of contestation as that of deliberation” (Fraser, 1990, p. 68), and thereby challenge normative understandings.
For instance, social media are powerful tools through which the social justice activism and concerns of global movements such as #BlackLivesMatter and #MeToo can reach wider publics. However, algorithms may not show news related to these movements to those who hold other political opinions, which defeats the movements’ objectives. As such, algorithms seem to be weakening the public sphere’s function of questioning dominant normative frameworks and narratives.
4. Algorithms weaken the desire to practice citizenship
Algorithms may be weakening the public sphere, and therefore democracy, also because instead of encouraging practices of citizenship, they seem to be encouraging consumption (Thompson, 2015). For instance, as Facebook explained a few years ago, algorithms are aimed at maximizing profits by turning users into consumers:
We do this not only because we believe it’s the right thing but also because it’s good for our business. When people see content they are interested in, they are more likely to spend time on News Feed and enjoy their experience.
(quoted in Thompson, 2015)
Fostering a class of users who act as citizens responsible for the public sphere seems to be neither the primary objective nor the underlying philosophy of Facebook, the owner of multiple social platforms, including WhatsApp and Instagram (Van Dijk, 2013). Indeed, Facebook aims at having users consume information as a source of entertainment rather than as material for the critical assessment of the political scene (Bauman & Lyon, 2012).
The particular design features of our artifacts make some activities seem easier and more natural and others difficult and these implicit behavioral templates or affordances encourage us to behave in certain ways rather than others.
(Cohen, 2013, pp. 1912-1913)
The process whereby the public sphere succumbs to the pursuit of information-commodity rather than information-assessment and deliberation may be endangering democracy. As Tufekci (2016) argues, “if you don’t have those institutions of deliberation, you can’t have a deliberative democracy”. The disintegration of the public sphere may lead to what Baudrillard (1985) calls the
Massive delegation of the power of desire, of choice, of responsibility […] to apparatuses either political or intellectual, either technical or operational, to whom has devolved the duty of taking care of all of these things.
(Baudrillard, 1985, p. 585)
A need for media literacy and online citizenship
If modulation not only restricts the functioning of the public sphere but may also disintegrate “the desire to practice […] citizenship” (Cohen, 2013, p. 1918), the challenge is to figure out how to reverse this process. Users, or rather citizens, need a way to break out of the vicious circle in which our “society is […] continually adjusting the information environment to each individual’s comfort level” (Cohen, 2013, p. 1918).
One practical, though not exhaustive, solution would be to reactivate the discomfort that neoliberal democratic citizenship requires, whereby citizens actively engage in assessing information rather than ‘enjoying’ the Feed as consumers. This means understanding how algorithms influence the kind of information we can access, discomforting ourselves in order to break out of our structured cultural bubbles, and critically assessing the ‘fakeness’ of a story ourselves.
However, discomfort requires literacy: the new kind of literacy that Jenkins et al. (2006) suggested, which I presented in a previous article on the digital divide. Indeed, to realize that we need to discomfort ourselves, we must be able to understand the assumptions of social media platforms, their objectives, and their underlying philosophy.