
Technological innovation is the only shared story that makes sense anymore

“This fall, one of us spent a day touring three of the Smithsonian museums in Washington, DC: the National Museum of Natural History, the National Museum of American History, and the Air and Space Museum. Only the last seemed to make “sense.” That is, only the Air and Space Museum offered a relatively coherent narrative. Moving from room to room, the museum’s story was fairly straightforward. From early-modern seafaring, to the Wright brothers, to World War II aerial combat, to nuclear deterrence, to the age of unmanned aerial vehicles, the world has been caught up in an age of ineffable aeronautical adventures. And the United States is the late-modern vanguard. Emblazoned on the tails of fighter jets and the bellies of missiles was the national story of technological flight.

Walking through the National Museum of American History, on the other hand, made no such sense. There was no coherent overall narrative. It was strictly an episodic experience, like watching the History Channel for a day. (No surprise: The History Channel is a prominent museum sponsor.) The National Museum of Natural History—dedicated to the cultural keeping of “nature”—was even more fragmented. Offering no history, no narrative, it simply assembled a pastiche of stuffed mammals, winged butterflies, arctic photographs, and tropical fish around an acquisitive centerpiece, the Hope Diamond.

After leaving the Mall and its museums, this tourist came away with a clear message: Technological innovation is the only shared story that makes sense anymore. Neither the “imagined community” of the nation-state nor the Earth, which for aeons has grounded humans narratively and otherwise, has the symbolic power to make history cohere, at least in the United States. Even natural scientists, as the Museum of Natural History made clear, are engineers taking flights into the statistical improbabilities of human evolution and considerably warmer futures. “History” is technological innovation, a story told best through the marvels hanging from the ceilings of the Air and Space Museum.”

–Ned O’Gorman and Chad Wellmon, “Media Are Elemental: Marvelous Clouds”


79 Theses on Technology: Remember the Titans

Ned O’Gorman takes issue with Theses 40-46, in which Alan Jacobs critiques our modern characterization of technology as having agency independent of conscious human will:

Things, in this instance manuscripts, are indeed meaningful and powerful. Why would we want to divest things of their poetic quality, their meaningfulness, and indeed their power? Kevin Kelly may be off in his aims or misguided in his understanding, but he’s right to recognize in things, even and especially in technologies, sources of meaning and meaningfulness.

Of course technologies want. The button wants to be pushed; the trigger wants to be pulled; the text wants to be read—each of these wants as much as I want to go to bed, get a drink, or get up out of my chair and walk around, though they may want in a different way than I want. To reserve “wanting” for will-bearing creatures is to commit oneself to the philosophical voluntarism that undergirds technological instrumentalism.

Jacobs responds to O’Gorman:

I believe I understand why O’Gorman wants to make this argument: The phrases “philosophical voluntarism” and “technological instrumentalism” are the key ones. I assume that by invoking these phrases O’Gorman means to reject the idea that human beings stand in a position of absolute freedom, simply choosing whatever “instruments” seem useful to them for their given project. He wants to avoid the disasters we land ourselves in when we say that Facebook, or the internal combustion engine, or the personal computer, or nuclear power, is “just a tool” and that “what matters is how you use it.” And O’Gorman is right to want to critique this position as both naïve and destructive.

But he is wrong if he thinks that this position is entailed in any way by my theses; and even more wrong to think that this position can be effectively combated by saying that technologies “want.” Once you start to think of technologies as having desires of their own you are well on the way to the Borg Complex: We all instinctively understand that it is precisely because tools don’t want anything that they cannot be reasoned with or argued with. And we can become easily intimidated by the sheer scale of technological production in our era. Eventually, we can end up talking even about what algorithms do as though algorithms aren’t written by humans.

O’Gorman and Jacobs may have a profound disagreement on these points, but I wonder if they aren’t also talking past each other just a little bit. O’Gorman cites Mary Carruthers, who argued that medieval texts were thought to contain meaning that had to be explicated by the interpreter. Our more contemporary notion of texts as codes is much more in keeping with the anti-essentialism of our episteme, whereas many scholars in the medieval era may have thought of hermeneutics as a way of getting closer to the absolute, divinely wrought truth of things. If you think about this debate in terms of conflicting epistemes, or ways of producing knowledge sanctioned by their historical moment, O’Gorman and Jacobs may, in a way, be disagreeing about its very terms. (Just to be clear, I don’t think they misunderstand each other, but I’m intrigued by how the stakes for each of them seem mutually incompatible.)

O’Gorman is saying (or at least implying on a fundamental level) that it is incredibly useful to think of things as being invested with agency. In other words, we must acknowledge that, for a long time, even well before the advent of modernity, people thought of their tools as being invested with a certain amount of power or will or agency (or whatever terms you prefer) independent of their users. This has been a vitally important way of thinking about the human relationship to the world: a relationship mediated by tools, by things.

Jacobs is saying very explicitly that this way of thinking is not useful; furthermore, he’s saying that it invests far more perceived power in the role of the mediator than it should. If tools are our mediators between ourselves and the world, we ought not to trust them as implicitly as we do. Once a tool is invested with desire, it can be thought of as having motives. A mediator—a maker of meaning—with motives of its own is little different from another person or, given the amount of material power now possessed by our tools, a god. An incompetent god, as Jacobs says, but a god nonetheless.

(Even in my own rhetoric, it’s been difficult not to talk about tools as if they had agency; I just referenced the “material power now possessed by our tools,” as if tools were capable of taking possession of power. They’re not, but it’s dangerously easy to talk about them as if they did.)

The point Jacobs makes is that it is incredibly easy to fall into habits of speaking about tools as independent entities, and that speaking this way is the first step in surrendering our sense of self, a rendering of submission to an alien power. Jacobs is a Christian, so one point he makes in Theses 44 and 45 is that the human submission of the self to the power of tools is a kind of idolatry—the worship of false gods. What makes this submission to a power outside the self idolatry, and therefore false, is that tools are made by people, and we, the people, are not gods.

Nor is O’Gorman making the case that we are. My reading of his argument is that to strip technology of a capacity for desire is to strip ourselves of the capacity to appreciate how much our own desires are layered into a technology through its construction, its use, and its place as a signifier in our discourse. That would be another way to evade the human, as he says.

I think I agree more with Jacobs about what is at stake here, even though I agree with O’Gorman that, if our discourse is any indication, things do possess meanings, and even desires. That’s precisely the danger, and it highlights the radicalism of what Jacobs is proposing: that we need to realign the way we speak and think about technology, perhaps to the point of doing violence not only to technology but to the discourse that has grown up around it and even made the development of new technologies possible.

What’s at stake is that we (mistakenly) think of ourselves as Tom Swift, master of invention and futuristic lifeways, when in reality we are Victor Frankenstein (that modern Prometheus), creating something that is more than mere technology but—somehow—something less than human (at least as far as we permit ourselves to perceive it). For the consequence of failing to recognize the ways in which we unconsciously submit to our tools is that we bend our collective will to (the will of) our idols, not only mistaking our tools for false gods but also, in the process, coming to think of ourselves as godlike. Gods of human creation cannot help but reflect the nature of their creators. If we are to meditate on the fate of gods who gave birth to other gods, we would do well to remember the Titans.


A valuation of the humanities

“Today the world’s biggest problems have indeed grown big enough to concern the very survival of the human species: environmental catastrophe, genocidal weapons, and fragile technological and economic systems each put the species—not just individuals—at risk. But the solutions to these problems, inasmuch as they can be achieved, will be essentially, and not merely accidentally, social and political in nature.

[…]

There is no science that can save us from the historically embedded habits and the wider structures that cause us, seemingly instinctively, to value the lives of some more than others based on skin color, gender, or any of a number of other social markers of the Other. And the only solutions for structural problems within the law are both better law and better practice of the law.

These problems require citizens capable of reflecting on matters like discrimination and the law, and leaders who understand that the world’s problems can’t be fixed simply through technology. The world’s largest problems are not equivalent to the problem of gravity. If they were, perhaps science and technology could solve them. We’d just need more well-funded Newtons and Einsteins. Rather, we have problems that are inherently political and/or social in nature and that require political and/or social solutions. Moreover, it should be obvious by now that scientific and technological “fixes” often create new problems (e.g., industrialism’s creation of global warming, genocidal killing machines, and antibiotics).

So while it seems silly to say it, it needs to be said, in light of the legitimate value political and academic leaders are putting on life: The arts and humanities save lives!”

—Ned O’Gorman, “The Arts and Humanities Save Lives!”