Narcissist eugenics

“Reading Emanuel’s essay, I began to despair — not just about the moral outlook it expresses, but about whether its readers will even recognize how monstrous it is. Emanuel has taken the ethic of meritocratic striving that currently dominates elite culture in the United States and transformed it into a comprehensive vision of the human good. Viewed in its light, the only life worth living is one in which you endlessly, relentlessly strive to look as smart and clever as possible in the eyes of other smart and clever people. The ultimate goal of such a life is to be considered the smartest and cleverest person of all. Once old age or any other misfortune gets in the way of continually striving for that goal, one might as well cease to exist.


This isn’t just an academic question. Emanuel and his striving siblings have an adopted sister named Shoshana who suffers from cerebral palsy. The clear implication of Emanuel’s argument is that her life — like that of someone who endures a steep decline in old age — is not worth living. Is that what Emanuel really thinks? If so, he should say so explicitly. If not, he should explain why not, since every word of his essay appears to push in the opposite direction.

Though he would no doubt blanch at the suggestion, Ezekiel Emanuel has written an essay that in its implications clearly amounts to a defense of eugenics. Yet Emanuel’s position differs in one important respect from the version of eugenics that rose to prominence in the Western world in the early decades of the 20th century. Whereas eugenics was originally motivated by public-spirited ends — by the desire to purify the race for the sake of overall human flourishing — Emanuel’s version is provoked by a more personal concern: Extraordinarily creative, intelligent, ambitious, and successful individuals want to be seen and remembered as exemplars of “vitality,” and not as “burdens” slowly succumbing to the “agonies of decline.”

This is eugenics induced by narcissism.”

—Damon Linker, “Should You Hope to Die at 75? Absolutely Not”

A spiritual reality not one’s own

“Thus I do not mean to decry a fashion, but to underscore the motive behind the contemporary taste for the extreme in art and thought. All that is necessary is that we not be hypocritical, that we recognize why we read and admire writers like Simone Weil. I cannot believe that more than a handful of the tens of thousands of readers she has won since the posthumous publication of her books and essays really share her ideas. Nor is it necessary—necessary to share Simone Weil’s anguished and unconsummated love affair with the Catholic Church, or accept her gnostic theology of divine absence, or espouse her ideals of body denial, or concur in her violently unfair hatred of Roman civilization and the Jews. Similarly, with Kierkegaard and Nietzsche; most of their modern admirers could not, and do not embrace their ideas. We read writers of such scathing originality for their personal authority, for the example of their seriousness, for their manifest willingness to sacrifice themselves for their truths, and—only piecemeal—for their “views.” As the corrupt Alcibiades followed Socrates, unable and unwilling to change his own life, but moved, enriched, and full of love; so the sensitive modern reader pays his respect to a level of spiritual reality which is not, could not, be his own.”

Susan Sontag, “Simone Weil”

Uncultivated wastes

By midcentury, there could be 10 billion humans, all demanding and deserving a quality of life presently experienced by only a few. It will be an extraordinary, planet-defining challenge. Meeting it will require, as green modernists correctly observe, new ideas and tools. It also demands a deep, abiding respect for non-human life, no more negligible than the respect we extend to one another. Power is not the same thing as supremacy.


We need to embrace more wilderness, not less. And though framing humanity’s role as global gardening sounds harmless, even pleasant, the idea contains a seed of industrial society’s fundamental flaw: an ethical vision in which only human interests matter. It’s a blueprint not for a garden, but for a landscaped graveyard.

Brandon Keim, “Earth is not a garden”

Though the earth, and all inferior creatures, be common to all men, yet every man has a property in his own person: this no body has any right to but himself. The labour of his body, and the work of his hands, we may say, are properly his. Whatsoever then he removes out of the state that nature hath provided, and left it in, he hath mixed his labour with, and joined to it something that is his own, and thereby makes it his property. It being by him removed from the common state nature hath placed it in, it hath by this labour something annexed to it, that excludes the common right of other men: for this labour being the unquestionable property of the labourer, no man but he can have a right to what that is once joined to, at least where there is enough, and as good, left in common for others.


But the chief matter of property being now not the fruits of the earth, and the beasts that subsist on it, but the earth itself; as that which takes in and carries with it all the rest; I think it is plain, that property in that too is acquired as the former. As much land as a man tills, plants, improves, cultivates, and can use the product of, so much is his property. He by his labour does, as it were, inclose it from the common. Nor will it invalidate his right, to say every body else has an equal title to it; and therefore he cannot appropriate, he cannot inclose, without the consent of all his fellow-commoners, all mankind. God, when he gave the world in common to all mankind, commanded man also to labour, and the penury of his condition required it of him. God and his reason commanded him to subdue the earth, i.e. improve it for the benefit of life, and therein lay out something upon it that was his own, his labour. He that in obedience to this command of God, subdued, tilled and sowed any part of it, thereby annexed to it something that was his property, which another had no title to, nor could without injury take from him.


I ask, whether in the wild woods and uncultivated waste of America, left to nature, without any improvement, tillage or husbandry, a thousand acres yield the needy and wretched inhabitants as many conveniencies of life, as ten acres of equally fertile land do in Devonshire, where they are well cultivated?

John Locke, “Second Treatise of Civil Government, Chapter 5: Of Property”

Humanities without university

“In 1872, just three years after he landed his first, and only, professorship at the University of Basel without even having finished his dissertation, Nietzsche delivered a series of lectures, On the Future of Our Educational Institutions, in the city museum. Before crowds of more than 300 people, Nietzsche staged a dialogue on the future of German universities and culture between two young students and a cantankerous old philosopher and his slow-witted but earnest assistant.

The grousing philosopher lamented the decline of universities into state-sponsored factories that produced pliant citizens and mindless, “castrated” scholars who cared not a bit for life. By the end of the lectures, it’s difficult to say whether Nietzsche thought there was a future at all for German universities. Nietzsche lasted a few more years in his position, resigning only when ill health forced him to. But he left an oeuvre that looked to the university and saw little but ruin.

As Nietzsche was writing, parts of the German university might not have been in decay, but they were in decline, the humanities in particular. Between 1841 and 1881, enrollment in philosophy, philology, and history within “philosophy faculties,” which comprised the core liberal arts fields, declined from 86.4 percent to 62.9 percent of all students matriculating at German universities, whereas enrollments in mathematics and the natural sciences increased from 13.6 to 37.1 percent. The mood among humanists was often such that they sounded quite a bit like the embattled literature professors of today. In academia, crisis is generally a matter of perception, and even in what now seems like a “golden age” for humanists, there was, in fact, a seismic shift for the humanities.

More recent forms of Quit Lit tend to lack a key feature of Nietzsche’s model, however. Nietzsche never conflated the humanities or humanistic inquiry with the university. For him, humanistic inquiry—and Nietzsche was deeply humanistic as his lifelong commitment to philology attests—transcended the institutional and historically particular shape of universities, which he saw as little more than extensions of a Prussian bureaucratic machine.


I am not suggesting that we should give up on universities. Universities, especially modern research universities, have long helped sustain and cultivate the practices and virtues central to the humanities. But just as German universities were becoming international paradigms, emulated from Baltimore to Beijing, Nietzsche made a fateful diagnosis. Those practices and virtues could ossify and wither in the arcane and self-justifying bowels of the modern, bureaucratic university. “Human inquiry,” in contrast, would live on.

We may well benefit from an exercise in imagination. Could the humanities survive the collapse of the university? I think so.”

Chad Wellmon, “Quit Lit: Do the Humanities Need the University?”

Comic Book Guy: catalog as critique

johnthelutheran has a wonderful rant about the people who complain whenever you (or anyone else) bother to talk critically about a film you’ve just seen (or book you’ve read, song you’ve heard, etc.). He’s responding to a piece you can read here in its entirety. Here’s an excerpt from that response:

There can be a real joy and pleasure – excitement, interest, fun – in thinking about and discussing “a comic or a book or a movie or a TV show” to find out what else it has going on within it, beyond the immediate surface impression. Sometimes (often), that can greatly enhance your enjoyment of the original work. Sometimes, you realise there’s nothing beyond the surface, and that’s fine. Sometimes, the work falls to pieces in your hands on closer inspection. But those occasions are more than paid for by the deep, positive gains of a critical appreciation for the things you enjoy.

The whole point about Comic Book Guy is that he isn’t “critical”. He’s negative. His besetting sin is the deadly sin of acedia (“sloth”): the sin whose fruit is Marie-Antoinette Syndrome, where “nothing tastes”. But the cure for this cultural acedia isn’t just “let go and be a kid again” (though that has its place), but learning that true criticism has, as its goal, the increasing of one’s enjoyment of things worth enjoying.

I would also add that Comic Book Guy is not only simply negative, but that his criticism is often incredibly superficial. He’s the guy who watches the new Star Wars and destroys it online because: “The stripe on the Gundaar lieutenant’s uniform was pale blue, which clearly violates continuity as established in volume fourteen, issue 72 from 1986 of Star Wars Universe: The Gundaar Treaty, page 12, in which a Gundaar lieutenant’s uniform clearly is shown to have turquoise stripes. Worst. Star Wars. Ever.”

Technical problems are certainly a factor in any critical judgment. But there are deeper issues in any piece of art. David Gerrold talks a lot about story structure, which is something not immediately apparent when you watch a film. It’s there. If you know what you’re looking for, it’s “visible.” But it’s not a surface detail.

What CBG does is data cataloguing. When a piece of data doesn’t match his unusually stringent (and idiosyncratic) categories, he rejects it and everything along with it. I’d almost call his approach scientific — in that, when a data point corrupts an experiment, the experiment does not yield a consistent result — except that even scientists try to account for that anomalous data point. A corrupted experiment may still have value for long-term research. It’s not necessarily the Worst. Experiment. Ever.

This kind of “criticism” can be fun. MST3K got tons of mileage out of continuity errors and dumb technical mistakes. (My favorite: in The Girl with Gold Boots, a bad edit shazams a man into existence in a restaurant booth. “I’m back!”) But the Satellite of Love crew also took potshots at the deeper problems CBG simply doesn’t notice. When characters behave inconsistently, or the story structure is wonky, those things get shredded, too. Manos isn’t the Worst. Movie. Ever. just because of terrible acting or bad lighting; there are scenes that add literally nothing to the narrative except length, padding an already interminable film.[1] That’s not surface detail; that’s the substrate where artistic decisions have gone deeply, horribly awry.

While bad surface details can distract from a good experience, they don’t usually undermine a story in themselves. More often, the plague of corrupted data points to much deeper issues that are the true source of the cracks we see on the surface. Comic Book Guy and his ilk catalogue the symptoms without looking at the disease, and that sometimes leads them to misdiagnose an otherwise healthy story as a terminal case.


[1] If you’re one of those who thinks that Manos: The Hands of Fate is actually a masterpiece of pre-Lynchian/post-Buñuelian pararealism, good for you. The goat-man down the hall has a shiny star for your lapel.

European timber and American sand

A passionate traveler, Montaigne sought, in difficult journeys he made more difficult by refusing an orderly itinerary, the same process of becoming he recorded in his essays. He understood that what he found upon his travels was inextricable from the finding, and that he himself would be transformed by both.

Put another way, all this has been to suggest that a difference so distinct as to constitute the opposite of universality cannot be. As vehicles of meaning, the ships that discovered America exploded on impact. After the landfall, all there was on the beach was a confusion of European timber and American sand. Difference being an analytical concept, its objective reality lies only in its construction of subjective realities. Thus while the Aztecs and the Spanish certainly existed before their encounter, the difference between them did not. That was the creature of the conquest.

— Myra Jehlen, “Why Did the Europeans Cross the Ocean: A Seventeenth-Century Riddle”

Build your own damn self

Perhaps I am emblematic of everything that is wrong with elite American education, but I have no idea how to get my students to build a self or become a soul. It isn’t taught in graduate school, and in the hundreds of faculty appointments and promotions I have participated in, we’ve never evaluated a candidate on how well he or she could accomplish it. I submit that if “building a self” is the goal of a university education, you’re going to be reading anguished articles about how the universities are failing at it for a long, long time.

I think we can be more specific. It seems to me that educated people should know something about the 13-billion-year prehistory of our species and the basic laws governing the physical and living world, including our bodies and brains. They should grasp the timeline of human history from the dawn of agriculture to the present. They should be exposed to the diversity of human cultures, and the major systems of belief and value with which they have made sense of their lives. They should know about the formative events in human history, including the blunders we can hope not to repeat. They should understand the principles behind democratic governance and the rule of law. They should know how to appreciate works of fiction and art as sources of aesthetic pleasure and as impetuses to reflect on the human condition.

On top of this knowledge, a liberal education should make certain habits of rationality second nature. Educated people should be able to express complex ideas in clear writing and speech. They should appreciate that objective knowledge is a precious commodity, and know how to distinguish vetted fact from superstition, rumor, and unexamined conventional wisdom. They should know how to reason logically and statistically, avoiding the fallacies and biases to which the untutored human mind is vulnerable. They should think causally rather than magically, and know what it takes to distinguish causation from correlation and coincidence. They should be acutely aware of human fallibility, most notably their own, and appreciate that people who disagree with them are not stupid or evil. Accordingly, they should appreciate the value of trying to change minds by persuasion rather than intimidation or demagoguery.

I believe (and believe I can persuade you) that the more deeply a society cultivates this knowledge and mindset, the more it will flourish. The conviction that they are teachable gets me out of bed in the morning. Laying the foundations in just four years is a formidable challenge. If on top of all this, students want to build a self, they can do it on their own time.

Steven Pinker, “The Trouble with Harvard”

The birth of crowdwriting?

We have entered a Brave New World in which one price cannot serve all, and sales points need to be calculated and contextually negotiated. And even beyond flexible price points, there is the possibility of ongoing “public patronage” in which the public “subscribes” to an individual artist. Surely, it is only a matter of time before sites like Gittip, which allows users to donate a minimum of 25 cents a week to members of the technorati, expand to cover authors and artists.


What if, as Scott Turow fears, authors cannot adjust to the challenges of online distribution and piracy, leaving us with a system that rewards a writerly 1 percent of a few winners and lots of losers? One possible consequence is that idealistic, imaginative, and socially engaged members of our generation will feel compelled to make a more direct and practical impact on the world—rather than writing about social inequality, they might be “forced” to take jobs as policy-makers, for example. In fact, with the collaborative and interdisciplinary mentality behind many crowdfunded projects, such possibilities have already emerged.

Sarah Ruth Jacobs, “The Author in the Age of Digital Distribution”

Spectral epistemology

Davidson’s messianic hopes as well as Nichols’s cultural despair mistakenly suppose that there can somehow be a vacuum of epistemic authority. But, in truth, forms and functions of epistemic authority, be they the disciplinary order of the research university or Wikipedia’s fundamental principles or “Five Pillars,” are themselves filtering technologies, helping us to orient ourselves amid a surfeit of information. They help us discern and attend to what is worthwhile. Google searches point us in the direction of some resources and not others. Technologies are normative, evaluative structures to make information accessible, manageable, and, ultimately, meaningful. It is not a question, then, of the presence or absence of epistemic authority; it is about better or worse forms of epistemic authority. Expertise and cultural authority are still with us. But now it might be more spectral, embodied not in the university don but in the black-boxed algorithm.

Chad Wellmon, “Algorithms Rule”

The hows and whys of scale and precedence

“There is a word in literary theory for what Mitchell’s doing: metalepsis, the transgression of the boundaries of a fictional world by an object, idea, or character. But there’s not much precedent for how he’s doing it. Among recurrent figures in adult literature, the one who comes closest to behaving like a Mitchell character is Falstaff, ambling from Henry IV to The Merry Wives of Windsor. But if Shakespeare had done what Mitchell is doing, Falstaff would have been the grandfather of Oberon, who would have first appeared as a page boy in Richard III.”

Kathryn Schulz, “Boundaries Are Conventions. And The Bone Clocks Author David Mitchell Transcends Them All.”

I wonder if it’s true that Falstaff is about the closest thing modern literature (that is, literature from the “early modern” period to the present) has to offer to a David Mitchell character. In the course of five hundred-odd years, it seems improbable that there haven’t been more metalepses in the sense that Schulz uses the term. (Refreshers on the term here and here.) In the 20th century alone, haven’t Thomas Pynchon, Jack Kerouac, Salman Rushdie, and (of course) James Joyce performed similar operations in their works? If anyone knows, please share. It’s quite possible that nobody has tried it on Mitchell’s scale, but I’m not sure if that’s more of a “what” than a “how” question. Perhaps an “if”…

The quibble stems from a frustration I have with a widespread tendency to stake claims of precedence. It seems that if you want to make a favorite artist of yours seem especially significant, you claim that whatever it is you like about him, he did it first, or is unique among his peers for doing it. Just because someone does something well doesn’t mean that it’s important that she’s the only one doing it, or very nearly the first to have done it. Staking a claim like that distracts from or distorts the substantive reasons for that thing being done at all. Not that it’s unimportant to ask “how,” but it’s also important to ask “why.” The problem here is that Schulz smuggles in an assertion about the “how” that may not be true, and which might affect our understanding of the “why.”

Otherwise, it’s an intriguing interview. I’ve only read Cloud Atlas, but I hope to read Mitchell’s entire oeuvre at some point.