The medium is the memory

Rumsey draws a powerful analogy to underscore memory’s materiality. The greatest memory system, she reminds us, is the universe itself. Nature embeds history in matter. When, in the early 19th century, scientists realized that they could read nature’s memory by closely examining the Earth and stars, we gained a much deeper understanding of the cosmos and our place in it. Geologists discovered that the strata in exposed rock tell the story of the planet’s development. Biologists found that fossilized plants and animals reveal secrets about the evolution of life. Astronomers realized that by looking through a telescope they could see not only across great distances but far back in time, gaining a glimpse of the origins of existence.

Through such discoveries, Rumsey argues, people both revealed and refined their “forensic imagination,” a subtle and creative way of thinking highly attuned to deciphering meaning from matter. We deploy that same imagination in understanding and appreciating our history and culture. The upshot is that the technologies a society uses to record, store and share information will play a crucial role in determining the richness, or sparseness, of its legacy. To put a new spin on Marshall McLuhan’s famous dictum, the medium is the memory.

Whether through cave paintings or Facebook posts, we humans have always been eager to record our experiences. But, as Rumsey makes clear, we’ve been far less zealous about safeguarding those records for posterity. In choosing among media technologies through the ages, people have tended to trade durability for transmissibility. It’s not hard to understand why. Intent on our immediate needs, we prefer those media that make communication easier and faster, rather than the ones that offer the greatest longevity. And so the lightweight scroll supplants the heavy clay tablet, the instantaneous email supplants the slow-moving letter. A cave painting may last for millennia, but a Facebook post will get you a lot more likes a lot more quickly.

–Nicholas Carr, "When our culture's past is lost in the cloud"

A minor quibble: Carr ends his column by advising, “We should make sure that there’s always a place in the world for the eloquent object, the thing itself.” I know what he means by that, and it’s a point I agree with. At the same time, his definition of “materiality,” in this particular column, is a bit limited in scope. True, the “digital” record of a Facebook timeline is not the same as the “physical” record of, say, a diary. Both are still material. The difference is that the materiality of a Facebook timeline is scattered–into the code that structures a web site or whatever browser a person is using, into whatever hard drives or servers are tasked to archive the timeline and call it up on demand–whereas the materiality of a diary is self-contained: the book, the “thing itself.”

Electronic signals, we should remember, are material things, if we understand “materiality” to refer to anything with atomic substance. But Abby Rumsey’s point about the fragility of digital stuff is well-taken. Those digital archives are not only fragile in their material state (as anyone whose computer has suddenly died before she could save the document she was writing can attest), but also, as noted elsewhere in the column, eminently mutable (as anyone who has accidentally deleted the document he was revising before saving the latest change can attest).

All of this is to emphasize the point Carr/Rumsey makes in that third paragraph: digital media are more immediately transmissible, but the meaning and form of communication are (perhaps) not as adequately preserved. The physical chassis of my laptop will likely outlive me in significant respects. The digital world housed within or accessed through it likely will not. Not in its current form. It’s worth reflecting on the fact that this very blog post does, in fact, physically exist. The electronic signals that sustain and transmit it literally exist; the codes for it are stored somewhere. But these various physical materials only come together in the form of this commonplace blog when the vast machinery of the Internet (including your machine and mine) is mobilized to make it so, for the fleeting moments of access. Not unlike the various chemicals and biological materials that house me for the scant few decades I–as in I Me Myself, the being whose history belongs to this body and mind–exist on this planet.

When I die, these materials will disperse, never to come together in precisely the same form again, never housing the particular meaningfulness or resonance of my life, as I have lived it. What I think Carr and Rumsey touch on, whether they know it or not, is whether the resonance of a human soul can be housed by media. If it can, it is less likely to be in digital form. Looks like Ray Kurzweil still has his work cut out for him.


Cause for alarm

He wants to lead the U.S. military. Yet he tells us that he had to act as he did partly because of a cartoon on the cover of a magazine. What could Vladimir Putin bait Donald Trump into doing with a strategic joke about his executive branch?

–Conor Friedersdorf, "The Unnerving Insecurities of Donald Trump"

It would indeed be troubling to watch Trump fall into the hands of a master baiter like Putin.

The Temporal Advancement Phenomenological Paradox

Let us ponder the paradoxically obvious. Alan Jacobs is a teacher who is older than his students. To wit:

In one of my classes we’re reading McLuhan’s Understanding Media, and today I called my students’ attention to this passage… I pointed out to them McLuhan’s implicit claim: that he stands in the same relation to the new electronic age as Tocqueville stood to “typographic man.” He is an acute observer of the world he is describing to us, but not a native of it, and his slight distance is the key to his perceptiveness. I asked them to be especially attentive to his metaphor of “the spell” – and then I told them, “Basically, McLuhan is applying for the position of Defense Against the Dark Arts teacher for our entire culture.”

Most of them looked blankly at me.

The deftness of his analogy demonstrates that Jacobs is likely a brilliant in-person lecturer. What is easy for teachers to lose sight of, I have learned, is that our students really do get younger every semester. I’m in my early-to-mid-thirties,[1] and it doesn’t seem to me as though my students are that much younger than I am (after all, they’re nearly in their twenties!), but as you age, you find that time works a bit differently than it did when you were younger. An astonishingly elementary observation, I know, but one that never fails to astonish me whenever I reflect on it. For me, something five years old is something “recent”; for my students, it’s a quarter of a lifetime ago.[2]

So. Harry Potter and the Deathly Hallows was published in 2007, and the last movie came out in 2011. Assuming the median age of Jacobs’s students to be 19, they were 14 when the last movie came out and 10 when the last book came out. And the last book actually to take place mostly at Hogwarts was Harry Potter and the Half-Blood Prince, published in 2005, when his students were 8, and filmed in 2009, when his students were 12.

Put like that, the wizarding world of Harry Potter is for today’s college students half a lifetime ago. For us teachers, it’s contemporary pop culture; for our students, it’s period nostalgia.

There is already a name for this, I’m sure. But for now let’s call it Tanukifune’s Temporal Advancement Phenomenological Paradox. It’s something like the inverse of Zeno’s Achilles and Tortoise Paradox.[3]

Here’s how mine goes. For every year older you get, an event half a lifetime ago will seem twice as recent.

Let’s say you get a bike when you’re seven.  When you’re 14, it will seem like you’ve had the bike forever. Let’s say you get a car when you’re 16. When you’re 32, that first car will have the glow of long-ago/far-away nostalgia. Not an eternity, but definitely “way back when.” Let’s say you get married when you’re 25. When you’re 50, you won’t be able to believe it’s been a quarter century. The years were noticeable, but they just flew by. When you’re 30, let’s say you have a baby. When you’re 60, you’ll swear that your 30-year-old daughter came home from the hospital only yesterday.

See how that works? The older you get, the more recent an event half a lifetime ago will seem to be. Another way to put it is that as you age, your sense of the “contemporary” expands in proportion to your years. Hence, college students who have either never read/watched Harry Potter, or who read/watched the series so long ago that, to them, it’s long ago and far away, whereas to their English teacher, Harry Potter is still “new.”[4]
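
For the quantitatively minded, here is a back-of-the-envelope Python sketch of that intuition. The function name, the milestone list, and the choice to measure “recency” as the fraction of your life elapsed since an event are all my own illustration, nothing more rigorous than that.

# A rough sketch of the "paradox": measure how distant an event feels
# as the fraction of your life that has elapsed since it happened.
# Purely illustrative; the framing is mine, not a formula anyone defends.

def lifetime_fraction(age_at_event, current_age):
    """Fraction of your life that has passed since the event."""
    return (current_age - age_at_event) / current_age

milestones = [("bike", 7), ("first car", 16), ("wedding", 25), ("baby", 30)]

for label, age_then in milestones:
    age_now = age_then * 2  # revisit each event half a lifetime later
    share = lifetime_fraction(age_then, age_now)
    print(f"{label}: {age_now - age_then} years ago, "
          f"but only {share:.0%} of a lifetime ago")

# The raw gap keeps growing (7 years, 16, 25, 30), yet each event sits at
# exactly 50% of a lifetime ago -- which is roughly why the wedding at 50
# and the baby at 60 still feel "like yesterday."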

__________

[1] Recent birthdays screw everything up, sort of like how I always date all my checks with the previous year every January.
[2] I once asked my class as part of an icebreaker during roll call where they would go if they could travel anywhere in time and space. A student replied that she’d go back to the 80s to see her parents when they met. I was like, “Oh, so basically the plot from Back to the Future!” I got what I presume were the same blank stares Jacobs received.
[3] If you don’t feel like following the link, here’s my layman’s attempt: Achilles, as Zeno put it, would never be able to catch a tortoise who had a head start in a race, because closing half the distance between them would take a certain amount of time, and closing half the remaining distance would take more time, and closing half of what remained after that would take even more time, and so on—and, meanwhile, the tortoise would continue to increase his lead. Achilles must spend every moment of the race closing a fraction of the distance between the tortoise and himself. In essence, in order to get from Point A to Point B, one must first reach a point midway between them, right? Call that Point A.1. But in order to get from Point A.1 to Point B, one must first reach Point A.2, then Point A.3, A.4, etc. And so on, forever and ever. Put even more succinctly, the distance remaining between two points may be halved infinitely, thus ensuring that a person departing from one point will, quite logically, never reach the other if he must first always reach a midpoint between them. (A quick numerical version of this halving follows these notes.)
[4] Or maybe Jacobs was just cursed with students who don’t read good YA fantasy. They probably haven’t seen Back to the Future, either. Oh well. One thing at a time.
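
And, as promised in footnote [3], a quick numerical rendering of the halving. The 100-meter starting distance and the ten steps are arbitrary figures of my own, chosen just to make the footnote concrete.

# Footnote [3], made concrete: start some distance from Point B and
# repeatedly cover half of whatever distance remains.

remaining = 100.0  # meters to Point B; an arbitrary illustrative figure
for step in range(1, 11):
    remaining /= 2  # each "step" closes only half the remaining gap
    print(f"after step {step}: {remaining:.4f} m still to go")

# However many halvings you perform, some sliver of distance is always
# left over -- which is the sense in which Zeno insists the traveler
# "never" arrives.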

Buddha and Idi

An analogy may be useful here. The position that Harris’ thought experiment thrusts us into is comparable to sailors adrift on a lifeboat. Clearly, all of these sailors would recognize that it would be a positive thing if everyone on the boat had water to drink, or as Harris inversely presents it: a lifeboat with no water is objectively worse than a lifeboat with water. Thus—Harris’ reasoning goes—there are objectively right and wrong answers to moral questions related to having water on a lifeboat.  But where does this get us exactly? While all of these sailors may agree that having water is better than not having water, this does not bring us any closer to objective truth. There is in reality still no unity of value here, only the appearance of it. To underscore my point, let us assume that the Buddha and Idi Amin are among these sailors. Both the Buddha and Idi Amin would certainly agree that having no water on the lifeboat is categorically bad, but they would do so for entirely different reasons. While the Buddha may be adopting this position from the utilitarian’s ‘point of view of the universe’, having compassion for all people on the boat and viewing the situation from a perfectly objective stance, Idi Amin likely remains stuck in the subjective stance, thinking of himself. Yet the Buddha and Idi Amin appear to be in perfect moral agreement. What happened? The problem is that within Harris’s thought experiment, moral value is never actually identified, just preferences. Moral debate begins when interests start to collide—when there is some but not enough water for all the sailors—not when interests are comfortably aligned. Harris’s thought experiment does not bring us any closer to objective value—it merely achieves the illusion of it. With one cup of water, this illusion is shattered and the profound moral incongruity between the Buddha and Idi Amin that was present all along comes suddenly into view.

And here comes the truly difficult part. While we may find Idi Amin morally repugnant, we cannot provide a rational account of why Amin should adopt an objective view of his situation. The insight that value is reducible to brain states tells us nothing about why Idi Amin should care about the brain states of other people.

–Bryan Druzin, “Shouldering the Burden of Utilitarianism”

The Gamergate candidate

My students—budding historians—tell me exactly what budding historians are supposed to say: It has always been like this. And in a way they’re right. Go back to the Early Republic and consider how Burr, Adams, Hamilton and the like went after each other. It was vicious. Adams was the worst. He famously called Hamilton “the Bastard brat of a Scotch peddler”; Paine’s Common Sense, a “crapulous mess”; and Jefferson’s soul, “poisoned with ambition.” But the difference with Trump is that, unlike past political mudslinging, his insults are divorced from political reality. Trump isn’t hissing out insults to underscore his political position, or to denigrate the political position of another. He’s doing it to bully for the sake of bullying. Trump issues taunts apolitically, all over the place (against Republicans and Democrats), and with abandon. He’s often compared to a third grader on a playground. But, honestly, that’s not fair to third graders, most of whom seem to understand that you don’t behave that way.

–James McWilliams, “Why Trump?”

That last is the best line in a relatively light piece in which McWilliams suggests that what is historically unique about Donald Trump’s campaign is the link between his vulgar outbursts and his upticks in the polls: a link forged, broadly and collectively speaking, by our social media. I’m not totally persuaded by this argument, even though I suppose that, to a certain extent, social media do foster “a culture of insult that prevails in the darker arenas of the Web, in places where We The People are allowed to be at our worst.” And I also suspect that Trump is being used as a way to, as McWilliams argues, legitimate those discourses.

In other words, he’s quintessentially a Gamergate candidate. Whatever that may mean.