Ned O’Gorman takes issue with Theses 40-46, in which Alan Jacobs critiques our modern characterization of technology as having agency independent of conscious human will:
Things, in this instance manuscripts, are indeed meaningful and powerful. Why would we want to divest things of their poetic quality, their meaningfulness, and indeed their power? Kevin Kelly may be off in his aims or misguided in his understanding, but he’s right to recognize in things, even and especially in technologies, sources of meaning and meaningfulness.
Of course technologies want. The button wants to be pushed; the trigger wants to be pulled; the text wants to be read—each of these wants as much as I want to go to bed, get a drink, or get up out of my chair and walk around, though they may want in a different way than I want. To reserve “wanting” for will-bearing creatures is to commit oneself to the philosophical voluntarism that undergirds technological instrumentalism.
Jacobs responds to O’Gorman:
I believe I understand why O’Gorman wants to make this argument: The phrases “philosophical voluntarism” and “technological instrumentalism” are the key ones. I assume that by invoking these phrases O’Gorman means to reject the idea that human beings stand in a position of absolute freedom, simply choosing whatever “instruments” seem useful to them for their given project. He wants to avoid the disasters we land ourselves in when we say that Facebook, or the internal combustion engine, or the personal computer, or nuclear power, is “just a tool” and that “what matters is how you use it.” And O’Gorman is right to want to critique this position as both naïve and destructive.
But he is wrong if he thinks that this position is entailed in any way by my theses; and even more wrong to think that this position can be effectively combated by saying that technologies “want.” Once you start to think of technologies as having desires of their own you are well on the way to the Borg Complex: We all instinctively understand that it is precisely because tools don’t want anything that they cannot be reasoned with or argued with. And we can become easily intimidated by the sheer scale of technological production in our era. Eventually, we can end up talking even about what algorithms do as though algorithms aren’t written by humans.
O’Gorman and Jacobs may have a profound disagreement on these points, but I wonder if they aren’t also talking past each other just a little bit. O’Gorman cites Mary Carruthers, who argued that medieval texts were thought to contain meaning that must be explicated by the interpreter. Our more contemporary notion of texts as codes is much more in keeping with the anti-essentialism of our episteme, whereas many medieval scholars may have thought of hermeneutics as a way of getting closer to the absolute, divinely wrought truth of things. If you think about this debate in terms of conflicting epistemes, or historically sanctioned ways of producing knowledge, O’Gorman and Jacobs may be, in a way, disagreeing about the terms of the debate itself. (Just to be clear, I don’t think they misunderstand each other, but I’m intrigued by how the stakes for each of them seem somewhat mutually incompatible.)
O’Gorman is saying (or at least implying on a fundamental level) that it is incredibly useful to think of things as being invested with agency. In other words, we must acknowledge that, for a long time, even well before the advent of modernity, people thought of their tools as being invested with a certain amount of power or will or agency (or whatever terms you prefer) independent of their users. This has been a vitally important way of thinking about the human relationship to the world: a relationship mediated by tools, by things.
Jacobs is saying very explicitly that this way of thinking is not useful; furthermore, he’s saying that it invests far more perceived power in the role of the mediator than it should. If tools are our mediators between ourselves and the world, we ought not to trust them as implicitly as we do. Once a tool is invested with desire, it can be thought of as having motives. A mediator—a maker of meaning—with motives of its own is little different from another person or, given the amount of material power now possessed by our tools, a god. An incompetent god, as Jacobs says, but a god nonetheless.
(Even in my own rhetoric, it’s been difficult not to talk about tools as if they had agency; I just referenced the “material power now possessed by our tools,” as if tools were capable of taking possession of power. They’re not, but it’s dangerously easy to talk about them as if they did.)
The point Jacobs makes is that it is remarkably easy to fall into habits of speaking about tools as independent entities, and that surrendering our sense of self begins with submission to an alien power. Jacobs is a Christian, so a distinction he makes in Theses 44 and 45 is that the human submission of the self to the power of tools is a kind of idolatry—the worship of false gods. What makes this submission to a power outside the self idolatry, and therefore false, is that tools are made by people, and we, the people, are not gods.
Nor is O’Gorman making the case that we are. My reading of his argument is that to strip technology of a capacity for desire is to strip ourselves of the capacity to appreciate how deeply our own desires are layered into technology by its construction, its use, and its place as a signifier in our discourse. That would be another way to evade the human, as he says.
I think I agree more with Jacobs about what’s at stake here, even though I agree with O’Gorman that, if our discourse is any indication, things do possess meanings, just as they do, in a sense, desire. That’s precisely the danger, and it highlights the radicalism of what Jacobs is proposing: that we need to realign the way we speak and think about technology, perhaps to the point of doing violence not only to technology but to the discourse that has grown up around it and even made the development of new technologies possible.
At stake is that we (mistakenly) think of ourselves as Tom Swift, master of invention and futuristic lifeways, when in reality we are Victor Frankenstein (that modern Prometheus), creating something that is more than mere technology but—somehow—something less than human (at least as far as we permit ourselves to perceive it). For the consequence of failing to recognize the ways in which we unconsciously submit to our tools is that we bend our collective will to (the will of) our idols and not only mistake our tools for false gods but unconsciously think of ourselves as godlike in the process. Gods of human creation cannot help but reflect the nature of their creators. If we are to meditate on the fate of gods who gave birth to other gods, we would do well to remember the Titans.