Otherwise–who knows?

I have said that this new development has unbounded possibilities for good and for evil. For one thing, it makes the metaphorical dominance of the machines, as imagined by Samuel Butler, a most immediate and non-metaphorical problem. It gives the human race a new and most effective collection of mechanical slaves to perform its labor. Such mechanical labor has most of the economic properties of slave labor, although, unlike slave labor, it does not involve the direct demoralizing effects of human cruelty. However, any labor that accepts the conditions of competition with slave labor accepts the conditions of slave labor, and is essentially slave labor. The key word of this statement is competition. It may very well be a good thing for humanity to have the machine remove from it the need of menial and disagreeable tasks, or it may not. I do not know. It cannot be good for these new potentialities to be assessed in the terms of the market, of the money they save; and it is precisely the terms of the open market, the “fifth freedom,” that have become the shibboleth of the sector of American opinion represented by the National Association of Manufacturers and the Saturday Evening Post. I say American opinion, for as an American, I know it best, but the hucksters recognize no national boundary.

Perhaps I may clarify the historical background of the present situation if I say that the first industrial revolution, the revolution of the “dark satanic mills,” was the devaluation of the human arm by the competition of machinery. There is no rate of pay at which a United States pick-and-shovel laborer can live which is low enough to compete with the work of a steam shovel as an excavator. The modern industrial revolution is similarly bound to devalue the human brain, at least in its simpler and more routine decisions. Of course, just as the skilled carpenter, the skilled mechanic, the skilled dressmaker have in some degree survived the first industrial revolution, so the skilled scientist and the skilled administrator may survive the second. However, taking the second revolution as accomplished, the average human being of mediocre attainments or less has nothing to sell that is worth anyone’s money to buy.

The answer, of course, is to have a society based on human values other than buying or selling. To arrive at this society, we need a good deal of planning and a good deal of struggle, which, if the best comes to the best, may be on the plane of ideas, and otherwise—who knows? […]

Those of us who have contributed to the new science of cybernetics thus stand in a moral position which is, to say the least, not very comfortable. We have contributed to the initiation of a new science which, as I have said, embraces technical developments with great possibilities for good and for evil. We can only hand it over into the world that exists about us, and this is the world of Belsen and Hiroshima. We do not even have the choice of suppressing these new technical developments. They belong to the age, and the most any of us can do by suppression is to put the development of the subject into the hands of the most irresponsible and most venal of our engineers. The best we can do is to see that a large public understands the trend and the bearing of the present work, and to confine our personal efforts to those fields, such as physiology and psychology, most remote from war and exploitation. As we have seen, there are those who hope that the good of a better understanding of man and society which is offered by this new field of work may anticipate and outweigh the incidental contribution we are making to the concentration of power (which is always concentrated, by its very conditions of existence, in the hands of the most unscrupulous). I write in 1947, and I am compelled to say that it is a very slight hope.

—Norbert Wiener, Cybernetics: or Control and Communication in the Animal and the Machine (1961 [orig. 1948]), The MIT Press, pp. 27-29


Introducing “iGen.”

From Jean M. Twenge’s recent essay in The Atlantic:

The more I pored over yearly surveys of teen attitudes and behaviors, and the more I talked with young people like Athena, the clearer it became that theirs is a generation shaped by the smartphone and by the concomitant rise of social media. I call them iGen. Born between 1995 and 2012, members of this generation are growing up with smartphones, have an Instagram account before they start high school, and do not remember a time before the internet. The Millennials grew up with the web as well, but it wasn’t ever-present in their lives, at hand at all times, day and night. iGen’s oldest members were early adolescents when the iPhone was introduced, in 2007, and high-school students when the iPad entered the scene, in 2010. A 2017 survey of more than 5,000 American teens found that three out of four owned an iPhone.

The advent of the smartphone and its cousin the tablet was followed quickly by hand-wringing about the deleterious effects of “screen time.” But the impact of these devices has not been fully appreciated, and goes far beyond the usual concerns about curtailed attention spans. The arrival of the smartphone has radically changed every aspect of teenagers’ lives, from the nature of their social interactions to their mental health. These changes have affected young people in every corner of the nation and in every type of household. The trends appear among teens poor and rich; of every ethnic background; in cities, suburbs, and small towns. Where there are cell towers, there are teens living their lives on their smartphone.

Twenge supplies a wealth of correlational data strongly linking smartphone use to a number of generationally distinct patterns in what she calls “iGen.” Among the more worrying data, she documents the rise of cyberbullying among young people, especially among girls. Then this:

Social-media companies are of course aware of these problems, and to one degree or another have endeavored to prevent cyberbullying. But their various motivations are, to say the least, complex. A recently leaked Facebook document indicated that the company had been touting to advertisers its ability to determine teens’ emotional state based on their on-site behavior, and even to pinpoint “moments when young people need a confidence boost.” Facebook acknowledged that the document was real, but denied that it offers “tools to target people based on their emotional state.”

At no time in human history have we possessed tools more finely attuned to the art of manipulating the psychology of masses of people. These tools are supremely scalable. The same platforms that can target a demographic of heterogeneous millions can individualize their content to reach, perhaps, a niche demographic of dozens. Taken in the context of Mark Zuckerberg’s utopian manifesto from earlier this year, the existence of the “boost” document ought to give us serious pause.

Allow me to go one step further. Scientists based in Portland, Oregon, recently succeeded in using the gene-editing tool CRISPR/Cas9 to edit the DNA of human embryos, correcting a genetic mutation that causes hypertrophic cardiomyopathy. This is an incredible victory for medical science. But as I’ve said before, I’ve read Margaret Atwood’s Oryx and Crake. You should, too.

We have the tools to shape and reshape the human experience on a very literal level. On the genetic level, CRISPR is but the first feeble step toward technology whose power will enable us to program our own genetic makeup on scales previously imagined only in science fiction. Similarly, the algorithms of social media sites like Facebook have the potential to shape their users’ desires, feelings, and perceptions in ways that are simultaneously microscopically managed and macroscopically unpredictable. I strive to make these observations not in a spirit of alarm or histrionics but in the mindset of sober assessment. If, despite my attempts at sobriety, you feel alarmed… well, good.

Not alone in the universe

When people are searching for meaning, their minds seem to gravitate toward thoughts of things like aliens that do not fall within our current scientific inventory of the world. Why? I suspect part of the answer is that such ideas imply that humans are not alone in the universe, that we might be part of a larger cosmic drama. As with traditional religious beliefs, many of these paranormal beliefs involve powerful beings watching over humans and the hope that they will rescue us from death and extinction.

–Clay Routledge, Don’t Believe in God? Maybe You’ll Try U.F.O.s

Routledge ends with this: “The Western world is, in theory, becoming increasingly secular — but the religious mind remains active. The question now is, how can society satisfactorily meet people’s religious and spiritual needs?”

To betray our nature.

Anthropocene describes what we are doing to our environment, while posthuman is largely phenomenological, a condensed articulation of what it’s like to live in a world where we are constantly making and remaking ourselves, especially via biotechnology. And surely there is some truth in these points, but I want to suggest that the apparent disjunction obscures a deeper unity. A world in which we remake our environment and ourselves is a world that does not feel human to us. We do not know how to play the role of gods, and on some level perceive that to act as gods is to betray our nature.

–Alan Jacobs, Anthropocene theology

The tortoise, the hare, and American science

“Put simply, privatization will mean that more ‘sexy,’ ‘hot’ science will be funded, and we will miss important discoveries since most breakthroughs are based on years and decades of baby steps,” said Kelly Cosgrove, an associate professor of psychiatry at Yale University. “The hare will win, the tortoise will lose, and America will not be scientifically great.”

–Adrienne LaFrance, Scientists Brace for a Lost Generation in American Research

Ice Age Solastalgia

The mammoth’s extinction may have been our original ecological sin. When humans left Africa 70,000 years ago, the elephant family occupied a range that stretched from that continent’s southern tip to within 600 miles of the North Pole. Now elephants are holed up in a few final hiding places, such as Asia’s dense forests. Even in Africa, our shared ancestral home, their populations are shrinking, as poachers hunt them with helicopters, GPS, and night-vision goggles. If you were an anthropologist specializing in human ecological relationships, you may well conclude that one of our distinguishing features as a species is an inability to coexist peacefully with elephants.

But nature isn’t fixed, least of all human nature. We may yet learn to live alongside elephants, in all their spectacular variety. We may even become a friend to these magnificent animals. Already, we honor them as a symbol of memory, wisdom, and dignity. With luck, we will soon re-extend their range to the Arctic. […]

Nikita and Sergey seemed entirely unbothered by ethical considerations regarding mammoth cloning or geoengineering. They saw no contradiction between their veneration of “the wild” and their willingness to intervene, radically, in nature. At times they sounded like villains from a Michael Crichton novel. Nikita suggested that such concerns reeked of a particularly American piety. “I grew up in an atheist country,” he said. “Playing God doesn’t bother me in the least. We are already doing it. Why not do it better?”

–Ross Andersen, Welcome to Pleistocene Park

Curiosity over partisanship?

In other words, curiosity seems to be the pin that bursts our partisan bubbles, allowing new and sometimes uncomfortable information to trickle in. Nothing else works like curiosity does, the authors point out—not being reflective, or good at math, or even well-educated.

Instead, they write, it’s “individuals who have an appetite to be surprised by scientific information—who find it pleasurable to discover that the world does not work as they expected … [who] expose themselves more readily to information that defies their expectations.”

–Olga Khazan, How to Overcome Political Irrationality About Facts

Residents and transients

The scientists who study these communities are microbial ecologists, and like ecologists of the macro world, they like to think about the interactions between different organisms. But this research is still new, and this way of thinking is only just starting to make it into the world of clinical microbiology, which is still focused on defeating the bad microbes. “We noticed there is kind of a division between the clinical human skin microbiology research and this more recent emergence of microbial ecology,” says Roo Vandegrift, an ecologist who co-authored the recent paper. “There wasn’t very much crosstalk between those segments of the scientific literature.”

Even the language the two groups use is different. Microbial ecologists tend to divide microbes by how they behave in communities: are they residents or are they transient? Clinical microbiologists divide them up based on whether they’re harmful: commensal or pathogenic? You could say the two sets of vocabulary roughly map onto one another. Resident microbes tend to be commensal (the bacteria are harmless to you but they’re benefitting from living on you) and the transients are the pathogens that make you sick when they appear.

–Sarah Zhang, What Is the Right Way to Wash Your Hands?

Every week

To be clear, these E. coli with mcr-1 found in China were still susceptible to antibiotics other than colistin, but if a bacterium with genes that make it resistant to every other drug then picks up mcr-1, you get the nightmare scenario: a pan-resistant bacterium. These drug-resistant infections usually happen in people who are already sick or have weakened immune systems.

The Lancet report immediately kicked off a flurry of activity in microbiology labs around the world. And soon enough, the reports started coming out. In December, scientists reported finding mcr-1 in Denmark. Then Germany and Vietnam and Spain and the United States and on and on. It seems like every month there is a new mcr-1 development, I said to James Spencer, a microbiologist at the University of Bristol who collaborated with the Chinese scientists on the original Lancet paper. “Every week,” he corrected me. By the time anyone had figured out mcr-1’s existence, it had already spread around the world.

–Sarah Zhang, Resistance to the Antibiotic of Last Resort is Silently Spreading

Science fiction capabilities, not scientific capabilities

Many of Rid’s tales unfold in the Defense Department and in the General Electric factory in Schenectady, New York, where Vietnam-driven businessmen, engineers, and government men created (unsuccessful) prototypes of robot weapons, and where Kurt Vonnegut sets his first novel, the cybernetics-inspired Player Piano. It turns out, although Rid does not say this in so many words, that science fiction has been as instrumental in the rise of the digital as any set of switches. Consider, for example, the creation of the Agile Eye helmet for Air Force pilots who need to integrate “cyberspace” (their term) with meatspace. The officer in charge reports, according to Rid, “We actually used the same industrial designers that had designed Darth Vader’s helmet.” This fluid movement between futuristic Hollywood design, science fiction, and the DOD is a recurring feature of Rise of the Machines. Take the NSA’s internal warning that “[l]aymen are beginning to expect science fiction capabilities and not scientific capabilities” in virtual reality. Or Rid’s account of the so-called “cypherpunks” around Timothy May. Their name was cribbed from the “cyberpunk” science fiction genre (“cypher” refers to public-key encryption), and they were inspired by novels like Vernor Vinge’s True Names (1981), one on a list of recommended books for the movement on which not a single nonfiction text figures.

–Leif Weatherby, The Cybernetic Humanities