Algorithms are emergent phenomena. The word is a Latinized version of the name of Muhammad ibn Musa al-Khwarizmi, who, around 830 AD, authored The Book of Calculation by Completion and Balancing. Contrary to widespread belief, The Book of Calculation is not a mathematical textbook but a summary of practices as diverse as geometric approximation techniques and trade regulations.
Millennia before mathematics arose as a discipline, ancient civilizations marked territories – rocks, soil, human bodies – through techniques of spatial and temporal abstraction: encircling, incising, carving and scarifying. Algorithms were here codified rituals. In the Middle Ages, they were heterogeneous methods used for predictive operations. Today, they are automated computerized procedures. Given this history, it would be wrong to think of algorithms as of abstract mathematical ideas imposed on concrete data. Algorithms are emergent diagrams contingent on repetition and the processual organization of time, space, objects, and actions.
The emergent quality of algorithms corresponds to the somewhat mysterious property of all ritual, usually referred to as operational performance efficacy. In a nutshell, performance efficacy is the capacity of performed action to actualize that which it suggests. Any action performed in a particular spatial and temporal order, with a non-random use of objects, has the power of tradition. Whether performed for the first time or the thousandth, when ordered and witnessed, it has the aura of tradition, as any aspect of human life or behavior may lend itself to ritualization.[i]
Operational performance efficacy differs from doctrinal performance efficacy, which characterizes religious rituals and operates in an explicit manner, by postulating a cosmic order, affecting the spirit world, and demonstrating the validity of the explanations it offers. For example, in the Christian communion, the host, which represents the body of Christ, unites the congregant’s physical body with the immortal body of Christ while simultaneously reaffirming the immortality of the Christian soul.
Operational efficacy, by contrast, describes the change that occurs in the onlooker’s worldview regardless of whether this effect was targeted or not. While doctrinal performance efficacy emanates from the ritual’s content, operational performance efficacy emerges from the ritual’s form – its spatio-temporal ordering and repeatability. Repeatability suggests past repetitions, and thus, also, truth, if truth is seen as a temporal-experiential category, as the stability of basic assumptions.
Today, many voices warn us of impending digital totalitarianism taking hold of neoliberal democracies. Comparing psychographic voter targeting and data manipulation to China’s Social Credit System – which seeks to raise the level of trustworthiness across the entire society – the authors of Reality Lost premise digital totalitarianism on Google’s Orwellian practices. Like many other critics, Hendricks and Vestergaard correctly note that even if it is not mandatory to use Google’s or Amazon’s smart products, personal assistants, learning devices and sensors are ceaselessly gathering data. Following Arendt, they suggest that the aim of (any form of) totalitarianism is identical to a real-world actualization of behaviourism – the deterministic predictability of stimuli and responses akin to Pavlov’s dogs. They suggest that the predatory design of new technologies precipitates “the total elimination of autonomy and self-determination by data-driven behavioral control”.[ii]
Without a doubt, there is an alarming relationship between artificial intelligence and present-day autocracies. This is true in China, as well as, increasingly, the U.S., where new levels of control achieved by applying computing to unstructured data, facial-recognition technologies, and social-media-monitoring algorithms are proving disconcertingly useful to authoritarians. Corporate and governmental misuse of algorithms is no less problematic. Uber uses behavioural modification in the form of incessant nudging to make its drivers extend their workday ‘of their own free will’; the justice and penal systems discriminate against already disadvantaged populations by, for example, ‘reading’ data about a person’s level of education against their family’s incarceration history in order to predict recidivism. Clearly, such computations stem from discriminatory premises. Worse still, they generate pernicious feedback loops through the increased interlinking of educational, employment, medical and criminal records.
The problem, however, is that although justified, these critiques link opaque algorithmic operations to a single origin: the programmer’s intention, or, in the case of adaptive (rather than merely optimizing) algorithms, the algorithm’s immediate learning environment, as can be seen from the alarming example of Google auto-tagging pictures of black people as ‘gorillas’. It is certainly not wrong to conclude that the predominantly white racial makeup of Google’s programmers may have something to do with this intolerable error. But should our concern stop there? How relevant are anthropocentric notions such as intention, or even unconscious bias, in a world of AI, seen not as an algorithmic variation on human consciousness – still a distant, if at all likely, scenario – but as a far more diffuse operation: the digital systems’ iterative sequencing of generalizations obtained from limited data?
I’d like to suggest that medial efficacy – which is in many ways similar to operational performance efficacy – is a more apt way of thinking about opaque algorithmic operations. Derived from Aristotle’s ‘betweens’, a medium is both a milieu and a practice where objects, operational principles and subjects converge. In German media studies, this in-between-ness is reflected in the expression Kulturtechniken, a term that stresses the relation of performance to objects and people. As Vismann notes:
“a plough drawing a line in the ground is an agricultural tool which determines the political act… It produces the subject who a posteriori claims mastery over both the tool and the action associated with it.”[iii]
The problematic dualism of agent and patient, activity and passivity, denies things agency, attributing it, instead, to programmers or designers. Contra this view, Vismann suggests that a tool’s (or programme’s) features cannot be independent of their conditions of production, material properties, and spatio-temporal circumstances. This is why we need to distinguish
“between persons, who de jure act autonomously, and cultural techniques, which de facto determine the entire course of action. To inquire about cultural techniques is not to ask about the feasibility, success, chances and risks of certain innovations and inventions in the domain of the subject. Instead, it is to ask about the self-management or auto-praxis [Eigenpraxis] of media and things.”[iv]
In German, Eigenpraxis has the connotation of ‘own’ or ‘proper’. It refers to the agent-thing’s iterative steering of emergent processes in new and, for humans, often imperceptible directions. As a term, medial efficacy draws on Eigenpraxis, on operational performance efficacy, and on McLuhan’s work on such media as the radio. For McLuhan, the radio collapses the auditory and the living space, producing media totalitarianism – a form of totalitarianism unrelated to human intentions, though this is not to say that it cannot be combined with them.
Further developing McLuhan’s thought, Epping-Jäger analyses the coalescence of media and human (Nazi) totalitarianism, where iterations of ‘a medially configured vocal power’ anchor the phonocentric experience of Volksgemeinschaft in the Nazi parades, carefully staged within 75 meters of loudspeakers that amplified Hitler’s voice to 50,000 decibels, the level needed to dominate the 500,000 m² of Berlin’s Tempelhof Field.[v] Although Hitler’s speeches have a historically conclusive semantic dimension, the phatic function of the voice (the human medium), its amplification (the technical medium), and its reach (the geographical medium) act as milieus-conduits-agents that pre-exist semantic content.
Similar to operational performance efficacy, medial efficacy emerges from the configuration of time, space, objects and action, predicated on the relationality of operational protocols, iteration, velocity, and amplification, most of which unfold beneath the threshold of human perception.
A glimpse of its operation can be gleaned from high-frequency trading (HFT), a digital ecosystem in which faster algorithms randomly distort existing information (the state of the market), communicate these ‘errors’ to slower institutional algorithms, and, in this way, create new economic realities from nothing but the configuration of time, objects, action, and repetition. Although the initial placement and cancellation of, say, the same order 10,000 times per second is programmed, the speed ratio, the distortions of scale caused by repetition, and the learning capacity of adaptive HFT algorithms are not. Once an algorithmic action has been performed 100,000,000 times, it becomes impossible to predict its effects.
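The dynamic described above – a fast agent’s placements and cancellations distorting the information that a slower agent acts on – can be sketched as a deliberately toy simulation. This is not a model of real market microstructure: the function name, agents, and every parameter (`fast_per_tick`, `slow_every`, the drift rate) are illustrative assumptions, chosen only to show how a price can drift away from an unchanged ‘fundamental’ value through nothing but speed ratio and repetition.

```python
import random

def run_market(ticks=1000, fast_per_tick=100, slow_every=10, seed=0):
    """Toy sketch of a fast/slow algorithmic ecosystem (illustrative only)."""
    rng = random.Random(seed)
    fundamental = 100.0        # the 'real' value: held constant throughout
    slow_quote = fundamental   # the slow institutional algorithm's quote
    history = []
    for t in range(ticks):
        # Fast agent: a burst of bids placed a hair above the slow agent's
        # quote and cancelled within the same tick; at any sampling instant
        # only the highest phantom bid is visible. No trade ever occurs.
        phantom_offset = max(rng.uniform(0.0, 0.05)
                             for _ in range(fast_per_tick))
        visible_bid = slow_quote + phantom_offset
        # Slow agent: samples the book an order of magnitude less often and
        # drifts its quote toward the (distorted) bid it observes.
        if t % slow_every == 0:
            slow_quote += 0.1 * (visible_bid - slow_quote)
        history.append(slow_quote)
    return history

prices = run_market()
drift = prices[-1] - 100.0  # positive: the quote has left the fundamental
```

The point of the sketch is that nothing in the code changes the fundamental value, yet the slow agent’s quote ratchets steadily upward: a new ‘economic reality’ produced solely by the configuration of time (sampling ratio), objects (phantom orders), action, and repetition.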
The question, therefore, is not: who programmed algorithm x that did y? Rather, it is: how do the various relationships of scale, velocity, and adaptability create new epistemic, operational, and economic realities? What precisely is algorithmic Eigenpraxis? And, most importantly, how do we stop thinking in and with single entities, e.g. the ‘multicellular digital organism’, and start thinking in diffuse categories like medial efficacy?
[i] Sally F. Moore and Barbara G. Myerhoff, Secular Ritual, Assen: Van Gorcum, 1977, pp. 8–12.
[ii] Vincent F. Hendricks and Mads Vestergaard, Reality Lost: Markets of Attention, Misinformation and Manipulation, Trans. Sara Høyrup, Cham: Springer Open, 2019, p. 135.
[iii] Cornelia Vismann, “Cultural Techniques and Sovereignty,” Trans. Ilinca Iurascu, Theory, Culture & Society, Vol. 30 (6), 2013, pp. 83–84.
[iv] Ibid., p. 84.
[v] Cornelia Epping-Jäger, “Stimmgewalt. Die NSDAP als Rednerpartei,” Stimme. Annäherung an ein Phänomen, D. Kolesch and S. Krämer (eds.), Frankfurt am Main: Suhrkamp, 2006, p. 166.