79 Theses on Technology: Remember the Titans

Ned O’Gorman takes issue with Theses 40-46, in which Alan Jacobs critiques our modern characterization of technology as having agency independent of conscious human will:

Things, in this instance manuscripts, are indeed meaningful and powerful. Why would we want to divest things of their poetic quality, their meaningfulness, and indeed their power? Kevin Kelly may be off in his aims or misguided in his understanding, but he’s right to recognize in things, even and especially in technologies, sources of meaning and meaningfulness.

Of course technologies want. The button wants to be pushed; the trigger wants to be pulled; the text wants to be read—each of these want as much as I want to go to bed, get a drink, or get up out of my chair and walk around, though they may want in a different way than I want. To reserve “wanting” for will-bearing creatures is to commit oneself to the philosophical voluntarism that undergirds technological instrumentalism.

Jacobs responds to O’Gorman:

I believe I understand why O’Gorman wants to make this argument: The phrases “philosophical voluntarism” and “technological instrumentalism” are the key ones. I assume that by invoking these phrases O’Gorman means to reject the idea that human beings stand in a position of absolute freedom, simply choosing whatever “instruments” seem useful to them for their given project. He wants to avoid the disasters we land ourselves in when we say that Facebook, or the internal combustion engine, or the personal computer, or nuclear power, is “just a tool” and that “what matters is how you use it.” And O’Gorman is right to want to critique this position as both naïve and destructive.

But he is wrong if he thinks that this position is entailed in any way by my theses; and even more wrong to think that this position can be effectively combated by saying that technologies “want.” Once you start to think of technologies as having desires of their own you are well on the way to the Borg Complex: We all instinctively understand that it is precisely because tools don’t want anything that they cannot be reasoned with or argued with. And we can become easily intimidated by the sheer scale of technological production in our era. Eventually, we can end up talking even about what algorithms do as though algorithms aren’t written by humans.

O’Gorman and Jacobs may have a profound disagreement on these points, but I wonder if they aren’t also talking past each other just a little bit. O’Gorman cites Mary Carruthers, who argued that medieval texts were thought to contain meaning that must be explicated by the interpreter. Our more contemporary notions of texts as codes are much more in keeping with the anti-essentialism of our episteme, whereas many scholars in the medieval era may have thought of hermeneutics as a way of getting closer to the absolute, divinely wrought truth of things. If you think about this debate in terms of conflicting epistemes, or ways of producing knowledge that are sanctioned by contemporary history, O’Gorman and Jacobs may be, in a way, disagreeing on the terms of this debate. (Just to be clear, I don’t think they misunderstand each other, but I’m intrigued by how the stakes for each of them seem to be somewhat mutually incompatible.)

O’Gorman is saying (or at least implying on a fundamental level) that it is incredibly useful to think of things as being invested with agency. In other words, we must acknowledge that, for a long time, even well before the advent of modernity, people thought of their tools as being invested with a certain amount of power or will or agency (or whatever terms you prefer) independent of their users. This has been a vitally important way of thinking about the human relationship to the world: a relationship mediated by tools, by things.

Jacobs is saying very explicitly that this way of thinking is not useful; furthermore, he’s saying that it invests far more perceived power in the role of the mediator than it should. If tools are our mediators between ourselves and the world, we ought not to trust them as implicitly as we do. Once a tool is invested with desire, it can be thought of as having motives. A mediator—a maker of meaning—with motives of its own is little different from another person or, given the amount of material power now possessed by our tools, a god. An incompetent god, as Jacobs says, but a god nonetheless.

(Even in my own rhetoric, it’s been difficult not to talk about tools as if they had agency; I just referenced the “material power now possessed by our tools,” as if tools were capable of taking possession of power. They’re not, but it’s dangerously easy to talk about them as if they could.)

The point Jacobs makes is that it is incredibly easy to fall into habits of speaking about tools as independent entities, and that the first step in surrendering our sense of self is to render submission to an alien power. Jacobs is a Christian, so a distinction that he makes in Theses 44 and 45 is that the human submission of the self to the power of tools is a kind of idolatry—the worship of false gods. What makes this submission to a power outside the self idolatry, and therefore false, is that tools are made by people, and we, the people, are not gods.

Nor is O’Gorman making the case that we are. My reading of his argument is that to strip technology of a capacity for desire is to strip ourselves of the capacity to appreciate how much our own desires are layered into the technology by its construction, use, and place as signifier in our discourse. That would be another way to evade the human, as he says.

I think I agree more with Jacobs on what’s at stake on this point, even though I agree with O’Gorman that, if our discourse is any indication, things do possess meanings, just as they do harbor desires. That’s precisely the danger, and it highlights the radicalism of what Jacobs is proposing, which is that we need to realign the way we speak and think about technology, perhaps to the point of doing violence not only to technology, but to the discourse that has grown up around it and even made the development of new technologies possible.

At stake is that we (mistakenly) think of ourselves as Tom Swift, master of invention and futuristic lifeways, when in reality we are Victor Frankenstein (that modern Prometheus), creating something that is more than mere technology but—somehow—something less than human (at least as far as we permit ourselves to perceive it). For the consequence of failing to recognize the ways in which we unconsciously submit to our tools is that we bend our collective will to (the will of) our idols and not only mistake our tools for false gods but unconsciously think of ourselves as godlike in the process. Gods of human creation cannot help but reflect the nature of their creators. If we are to meditate on the fate of gods who gave birth to other gods, we would do well to remember the Titans.

Alan Jacobs’s 79 Theses on Technology

As Chad Wellmon explains, the 79 Theses on Technology submitted by Alan Jacobs for disputation are somewhat “tongue-in-cheek,” but they are also most certainly “provocative.” I won’t quote them all here (because you should simply follow the link to the original post), but there are a few that I found to be particularly amusing or stimulating.

1.) Everything begins with attention.

5.) To “pay” attention is not a metaphor: Attending to something is an economic exercise, an exchange with uncertain returns.

9.) An essential question is, “What form of attention does this phenomenon require? That of reading or seeing? That of writing also? Or silence?”

11.) “Mindfulness” seems to many a valid response to the perils of incessant connectivity because it confines its recommendation to the cultivation of a mental stance without objects.

13.) The only mindfulness worth cultivating will be teleological through and through.

14.) Such mindfulness, and all other healthy forms of attention—healthy for oneself and for others—can only happen with the creation of and care for an attentional commons.

15.) This will not be easy to do in a culture for which surveillance has become the normative form of care.

16.) Simone Weil wrote that “Attention is the rarest and purest form of generosity”; if so, then surveillance is the opposite of attention.

20.) We cannot understand the internet without perceiving its true status: The Internet is a failed state.

25.) Building an alternative digital commons requires reimagining, which requires renarrating the past (and not just the digital past).

26.) Digital textuality offers us the chance to restore commentary to its pre-modern place as the central scholarly genre.

31.) Blessed are they who strive to practice commentary as a legitimate, serious genre of responsiveness to others’ thoughts.

32.) And blessed also are those who discover how to write so as to elicit genuine commentary.

38.) To work against the grain of a technology is painful to us and perhaps destructive to the technology, but occasionally necessary to our humanity.

42.) Our current electronic technologies make competent servants, annoyingly capricious masters, and tragically incompetent gods.

54.) The contemporary version of the pathetic fallacy is to attribute agency not to nature but to algorithms—as though humans don’t write algorithms. But they do.

62.) The chief purpose of technology under capitalism is to make commonplace actions one had long done painlessly seem intolerable.

63.) Embrace the now intolerable.

71.) The Dunning-Kruger effect grows more pronounced when online and offline life are functionally unrelated.

72.) A more useful term than “Dunning-Kruger effect” is “digitally-amplified anosognosia.”

77.) Consistent pseudonymity creates one degree of disembodiment; varying pseudonymity and anonymity create infinite disembodiment.

Wellmon opens the dialogue concerning the theses:

But this image of a sovereign self governing an internal economy of attention is a poor description of other experiences of the world and ourselves. In addition, it levies an impossible burden of self-mastery. A distributive model of attention cuts us off, as Matt Crawford puts it, from the world “beyond [our] head.” It suggests that anything other than my own mind that lays claim to my attention impinges upon my own powers to willfully distribute that attention. My son’s repeated questions about the Turing test are a distraction, but they might also be an unexpected opportunity to engage the world beyond my own head.

If we conceive of attention as simply the activity of a willful agent managing her units of attention, we foreclose the possibility of being arrested or brought to attention by something fully outside ourselves. We foreclose, for example, the possibility of an ecstatic attention and the possibility that we can be brought to attention by a particular thing beyond our will, a source beyond our own purposeful, willful action.

–Chad Wellmon, opening the dialogue about the theses

In my theses I am somewhat insistent on employing economic metaphors to describe the challenges and rewards of attentiveness, and in so doing I always had in mind the root of that word, oikonomos (οἰκονόμος), meaning the steward of a household. The steward does not own his household, any more than we own our lifeworld, but rather is accountable to it and answerable for the decisions he makes within it. The resources of the household are indeed limited, and the steward does indeed have to make decisions about how to distribute them, but such matters do not mark him as a “sovereign self” but rather the opposite: a person embedded in a social and familial context within which he has serious responsibilities. But he has to decide how and when (and whether) to meet those responsibilities. So, too, the person embedded in an “attention economy.”

–Alan Jacobs, engaging in the dialogue