Introducing “iGen.”

From Jean M. Twenge’s recent essay in The Atlantic:

The more I pored over yearly surveys of teen attitudes and behaviors, and the more I talked with young people like Athena, the clearer it became that theirs is a generation shaped by the smartphone and by the concomitant rise of social media. I call them iGen. Born between 1995 and 2012, members of this generation are growing up with smartphones, have an Instagram account before they start high school, and do not remember a time before the internet. The Millennials grew up with the web as well, but it wasn’t ever-present in their lives, at hand at all times, day and night. iGen’s oldest members were early adolescents when the iPhone was introduced, in 2007, and high-school students when the iPad entered the scene, in 2010. A 2017 survey of more than 5,000 American teens found that three out of four owned an iPhone.

The advent of the smartphone and its cousin the tablet was followed quickly by hand-wringing about the deleterious effects of “screen time.” But the impact of these devices has not been fully appreciated, and goes far beyond the usual concerns about curtailed attention spans. The arrival of the smartphone has radically changed every aspect of teenagers’ lives, from the nature of their social interactions to their mental health. These changes have affected young people in every corner of the nation and in every type of household. The trends appear among teens poor and rich; of every ethnic background; in cities, suburbs, and small towns. Where there are cell towers, there are teens living their lives on their smartphone.

Twenge supplies a wealth of correlational data linking smartphone use to a number of generationally distinct patterns in what she calls “iGen.” Among the more worrying findings, she documents the rise of cyberbullying among young people, especially among girls. Then this:

Social-media companies are of course aware of these problems, and to one degree or another have endeavored to prevent cyberbullying. But their various motivations are, to say the least, complex. A recently leaked Facebook document indicated that the company had been touting to advertisers its ability to determine teens’ emotional state based on their on-site behavior, and even to pinpoint “moments when young people need a confidence boost.” Facebook acknowledged that the document was real, but denied that it offers “tools to target people based on their emotional state.”

At no time in human history have we possessed tools more finely attuned to the art of manipulating the psychology of masses of people. These tools are supremely scalable. The same platforms that can target a demographic of heterogeneous millions can individualize their content to reach, perhaps, a niche demographic of dozens. Taken in the context of Mark Zuckerberg’s utopian manifesto from earlier this year, the existence of the “boost” document ought to give us serious pause.

Allow me to go one step further. Scientists based in Portland, Oregon, recently succeeded in using the gene-editing tool CRISPR/Cas9 to correct, in human embryos, a genetic mutation that causes hypertrophic cardiomyopathy. This is an incredible victory for medical science. But as I’ve said before, I’ve read Margaret Atwood’s Oryx and Crake. You should, too.

We have the tools to shape and reshape the human experience on a very literal level. On the genetic level, CRISPR is but the first feeble step toward technology whose power will enable us to program our own genetic makeup on scales previously imagined only in science fiction. Similarly, the algorithms of social media sites like Facebook have the potential to shape their users’ desires, feelings, and perceptions in ways that are simultaneously microscopically managed and macroscopically unpredictable. I strive to make these observations not in a spirit of alarm or histrionics but in the mindset of sober assessment. If, despite my attempts at sobriety, you feel alarmed… well, good.

Our civilization’s specific weakness

You are where your attention is. If you’re watching a football game with your son while also texting a friend, you’re not fully with your child — and he knows it. Truly being with another person means being experientially with them, picking up countless tiny signals from the eyes and voice and body language and context, and reacting, often unconsciously, to every nuance. These are our deepest social skills, which have been honed through the aeons. They are what make us distinctively human.

By rapidly substituting virtual reality for reality, we are diminishing the scope of this interaction even as we multiply the number of people with whom we interact. We remove or drastically filter all the information we might get by being with another person. We reduce them to some outlines — a Facebook “friend,” an Instagram photo, a text message — in a controlled and sequestered world that exists largely free of the sudden eruptions or encumbrances of actual human interaction. We become each other’s “contacts,” efficient shadows of ourselves.

–Andrew Sullivan, I Used to Be a Human Being

Amorality and attention economy

In the bubble economy of pageviews and rageclicks, the reasons for the attention fall away. And so a perpetual attention-generating machine like Donald Trump becomes not just a symptom but an attractor: the news media turned him into a phenomenon in pursuit of attention to their properties, even as the “serious” members of the press denied he could ever become a candidate. After all, all he was was a bid for attention, devoid of any real political program. Alas, such a distinction between politics and attention is no longer meaningful.

Only in a world where raw attention is an ultimate end could Trump have become a presidential nominee. By being deaf to all ulterior motives beyond self-aggrandizement, Trump is oddly incorruptible, apparently unwilling to be tamed by teleprompter or Svengali. By refusing to have any principles, he can’t be manipulated through them, nor can he betray them. “This was clearly madness,” Musil writes of Moosbrugger’s hothouse rhetoric, “and just as clearly it was no more than a distortion of our own elements of being.” Trump’s secret is that there is no secret. He is the Pollock canvas on which we’ve flung our collective vomit and feces. In the chaos that results, we can almost make out our reflection.

–David Auerbach, Make America Austria Again: How Robert Musil Predicted the Rise of Donald Trump

Hand the kid a hacksaw

When you tell a 22-year-old to turn off the phone, don’t ruin the movie, they hear please cut off your left arm above the elbow. You can’t tell a 22-year-old to turn off their cellphone. That’s not how they live their life.

At the same time, though, we’re going to have to figure out a way to do it that doesn’t disturb today’s audiences. There’s a reason there are ads up there saying turn off your phone, because today’s moviegoer doesn’t want somebody sitting next to them texting or having their phone on.

–Adam Aron, interviewed by Brent Lang

I will never—ever—go to a cinema if I think there’s even a remote chance of sharing a theater with the “cell phone section.” Why? Read the above quote with a minor modification:

When you tell a 22-year-old to put out their cigarette, don’t ruin the movie, they hear please cut off your left arm above the elbow. You can’t tell a 22-year-old to put out their cigarette. That’s not how they live their life.

At the same time, though, we’re going to have to figure out a way to do it that doesn’t disturb today’s audiences. There’s a reason there are ads up there saying put out your cigarette, because today’s moviegoer doesn’t want somebody sitting next to them coughing or blowing smoke in their face.

How in the world do you reconcile these mutually exclusive audiences? The point is not that it’s patently ridiculous that 22-year-olds live their lives permanently attached to smartphones. (Although it is patently ridiculous. If some 22-year-old thinks giving up his phone is like being asked to cut off his arm above the elbow, I say hand the kid a hacksaw.) The point is that I, as a consumer, as a citizen, value what Matthew Crawford calls the “attentional commons,” largely for the same reason that most Americans who enjoy breathing unpolluted air value the Clean Air Act.

I can’t even imagine what novel forms of attentional pollution cinema chains and telecom advertisers will devise once they know they have a captive audience in an environment already primed for product placement and surrounded by personalized digital devices. Nor can I imagine the novel forms of rudeness to which my fellow creatures will descend once that barn door is cracked open. If the traditional film (one that does not incorporate interactive, smartphone-dependent elements) continues to exist and to be exhibited by cinema chains, those chains are going to drive away anyone and everyone who still goes to the movies for the movie-going experience.

I totally understand that entertainment media evolve all the time, and at some point there will be a sea change in the moviegoing experience. Until that time, though, people like Aron should understand that people like me go to the movies to watch movies, not to dink around on our phones and be distracted by the cancerous pests who do so. Have I made my position perfectly clear?


Knowledge at first sight

“Rather than blame things for being obscure, we should blame ourselves for being biased and prisoners of self-induced repetitiveness. One must forget many clichés in order to behold a single image. Insight is the beginning of perceptions to come rather than the extension of perceptions gone by. Conventional seeing, operating as it does with patterns and coherences, is a way of seeing the present in the past tense. Insight is an attempt to think in the present.

Insight is a breakthrough, requiring much intellectual dismantling and dislocation. It begins with a mental interim, with the cultivation of a feeling for the unfamiliar, unparalleled, incredible. It is being involved with a phenomenon, being intimately engaged to it, courting it, as it were, that after much perplexity and embarrassment we come upon insight—upon a way of seeing the phenomenon from within. Insight is accompanied by a sense of surprise. What has been closed is suddenly disclosed. It entails genuine perception, seeing anew. He who thinks that we can see the same object twice has never seen. Paradoxically, insight is knowledge at first sight.”

–Abraham J. Heschel, The Prophets (1962), p. xvi

Alan Jacobs’s 79 Theses on Technology

As Chad Wellmon explains, the 79 Theses on Technology submitted by Alan Jacobs for disputation are somewhat “tongue-in-cheek,” but they are also most certainly “provocative.” I won’t quote them all here (because you should simply follow the link to the original post), but there are a few that I found to be particularly amusing or stimulating.

1.) Everything begins with attention.

5.) To “pay” attention is not a metaphor: Attending to something is an economic exercise, an exchange with uncertain returns.

9.) An essential question is, “What form of attention does this phenomenon require? That of reading or seeing? That of writing also? Or silence?”

11.) “Mindfulness” seems to many a valid response to the perils of incessant connectivity because it confines its recommendation to the cultivation of a mental stance without objects.

13.) The only mindfulness worth cultivating will be teleological through and through.

14.) Such mindfulness, and all other healthy forms of attention—healthy for oneself and for others—can only happen with the creation of and care for an attentional commons.

15.) This will not be easy to do in a culture for which surveillance has become the normative form of care.

16.) Simone Weil wrote that ‘Attention is the rarest and purest form of generosity’; if so, then surveillance is the opposite of attention.

20.) We cannot understand the internet without perceiving its true status: The Internet is a failed state.

25.) Building an alternative digital commons requires reimagining, which requires renarrating the past (and not just the digital past).

26.) Digital textuality offers us the chance to restore commentary to its pre-modern place as the central scholarly genre.

31.) Blessed are they who strive to practice commentary as a legitimate, serious genre of responsiveness to others’ thoughts.

32.) And blessed also are those who discover how to write so as to elicit genuine commentary.

38.) To work against the grain of a technology is painful to us and perhaps destructive to the technology, but occasionally necessary to our humanity.

42.) Our current electronic technologies make competent servants, annoyingly capricious masters, and tragically incompetent gods.

54.) The contemporary version of the pathetic fallacy is to attribute agency not to nature but to algorithms—as though humans don’t write algorithms. But they do.

62.) The chief purpose of technology under capitalism is to make commonplace actions one had long done painlessly seem intolerable.

63.) Embrace the now intolerable.

71.) The Dunning-Kruger effect grows more pronounced when online and offline life are functionally unrelated.

72.) A more useful term than “Dunning-Kruger effect” is “digitally-amplified anosognosia.”

77.) Consistent pseudonymity creates one degree of disembodiment; varying pseudonymity and anonymity create infinite disembodiment.

Wellmon opens the dialogue concerning the theses:

But this image of a sovereign self governing an internal economy of attention is a poor description of other experiences of the world and ourselves. In addition, it levies an impossible burden of self mastery. A distributive model of attention cuts us off, as Matt Crawford puts it, from the world “beyond [our] head.” It suggests that anything other than my own mind that lays claim to my attention impinges upon my own powers to willfully distribute that attention. My son’s repeated questions about the Turing test are a distraction, but it might also be an unexpected opportunity to engage the world beyond my own head.

If we conceive of attention as simply the activity of a willful agent managing her units of attention, we foreclose the possibility of being arrested or brought to attention by something fully outside ourselves. We foreclose, for example, the possibility of an ecstatic attention and the possibility that we can be brought to attention by a particular thing beyond our will, a source beyond our own purposeful, willful action.

–Chad Wellmon, opening the dialogue about the theses

 In my theses I am somewhat insistent on employing economic metaphors to describe the challenges and rewards of attentiveness, and in so doing I always had in mind the root of that word, oikonomos (οἰκονόμος), meaning the steward of a household. The steward does not own his household, any more than we own our lifeworld, but rather is accountable to it and answerable for the decisions he makes within it. The resources of the household are indeed limited, and the steward does indeed have to make decisions about how to distribute them, but such matters do not mark him as a “sovereign self” but rather the opposite: a person embedded in a social and familial context within which he has serious responsibilities. But he has to decide how and when (and whether) to meet those responsibilities. So, too, the person embedded in an “attention economy.”

–Alan Jacobs, engaging in the dialogue