Anthropocene describes what we are doing to our environment, while posthuman is largely phenomenological, a condensed articulation of what it’s like to live in a world where we are constantly making and remaking ourselves, especially via biotechnology. And surely there is some truth in these points, but I want to suggest that the apparent disjunction obscures a deeper unity. A world in which we remake our environment and ourselves is a world that does not feel human to us. We do not know how to play the role of gods, and on some level perceive that to act as gods is to betray our nature.
–Alan Jacobs, Anthropocene theology
“Put simply, privatization will mean that more ‘sexy,’ ‘hot’ science will be funded, and we will miss important discoveries since most breakthroughs are based on years and decades of baby steps,” said Kelly Cosgrove, an associate professor of psychiatry at Yale University. “The hare will win, the tortoise will lose, and America will not be scientifically great.”
–Adrienne Lafrance, Scientists Brace for a Lost Generation in American Research
In other words, curiosity seems to be the pin that bursts our partisan bubbles, allowing new and sometimes uncomfortable information to trickle in. Nothing else works like curiosity does, the authors point out—not being reflective, or good at math, or even well-educated.
Instead, they write, it’s “individuals who have an appetite to be surprised by scientific information—who find it pleasurable to discover that the world does not work as they expected … [who] expose themselves more readily to information that defies their expectations.”
–Olga Khazan, How to Overcome Political Irrationality About Facts
The scientists who study these communities are microbial ecologists, and like ecologists of the macro world, they like to think about the interactions between different organisms. But this research is still new, and this way of thinking is only just starting to make it into the world of clinical microbiology, which is still focused on defeating the bad microbes. “We noticed there is kind of a division between the clinical human skin microbiology research and this more recent emergence of microbial ecology,” says Roo Vandegrift, an ecologist who co-authored the recent paper. “There wasn’t very much crosstalk between those segments of the scientific literature.”
Even the language the two groups use is different. Microbial ecologists tend to divide microbes by how they behave in communities: are they residents or are they transient? Clinical microbiologists divide them up based on whether they’re harmful: commensal or pathogenic? You could say the two sets of vocabulary roughly map onto one another. Resident microbes tend to be commensal (the bacteria are harmless to you but they’re benefitting from living on you), and the transients are the pathogens that make you sick when they appear.
–Sarah Zhang, What Is the Right Way to Wash Your Hands?
To be clear, these E. coli with mcr-1 found in China were still susceptible to antibiotics other than colistin, but if a bacterium with genes that make it resistant to every other drug then picks up mcr-1, you get the nightmare scenario: a pan-resistant bacterium. These drug-resistant infections usually happen in people who are already sick or have weakened immune systems.
The Lancet report immediately kicked off a flurry of activity in microbiology labs around the world. And soon enough, the reports started coming out. In December, scientists reported finding mcr-1 in Denmark. Then Germany and Vietnam and Spain and the United States and on and on. It seems like every month there is a new mcr-1 development, I said to James Spencer, a microbiologist at the University of Bristol who collaborated with the Chinese scientists on the original Lancet paper. “Every week,” he corrected me. By the time anyone had figured out mcr-1’s existence, it had already spread around the world.
–Sarah Zhang, Resistance to the Antibiotic of Last Resort is Silently Spreading
Many of Rid’s tales unfold in the Defense Department and in the General Electric factory in Schenectady, New York, where Vietnam-driven businessmen, engineers, and government men created (unsuccessful) prototypes of robot weapons, and where Kurt Vonnegut sets his first novel, the cybernetics-inspired Player Piano. It turns out, although Rid does not say this in so many words, that science fiction has been as instrumental in the rise of the digital as any set of switches. Consider, for example, the creation of the Agile Eye helmet for Air Force pilots who need to integrate “cyberspace” (their term) with meatspace. The officer in charge reports, according to Rid, “We actually used the same industrial designers that had designed Darth Vader’s helmet.” This fluid movement between futuristic Hollywood design, science fiction, and the DOD is a recurring feature of Rise of the Machines. Take the NSA’s internal warning that “[l]aymen are beginning to expect science fiction capabilities and not scientific capabilities” in virtual reality. Or Rid’s account of the so-called “cypherpunks” around Timothy May. Their name was cribbed from the “cyberpunk” science fiction genre (“cypher” refers to public-key encryption), and they were inspired by novels like Vernor Vinge’s True Names (1981), one on a list of recommended books for the movement on which not a single nonfiction text figures.
–Leif Weatherby, The Cybernetic Humanities
“The best way to predict the future is to invent it,” computer scientist Alan Kay once famously said. I’d wager that the easiest way is just to make stuff up and issue a press release. I mean, really. You don’t even need the pretense of a methodology. Nobody is going to remember what you predicted. Nobody is going to remember if your prediction was right or wrong. Nobody – certainly not the technology press, which is often painfully unaware of any history, near-term or long ago – is going to call you to task. This is particularly true if you make your prediction vague – like “within our lifetime” – or set your target date just far enough in the future – “In fifty years, there will be only ten institutions in the world delivering higher education and Udacity has a shot at being one of them.”
Let’s consider: is there something about the field of computer science in particular – and its ideological underpinnings – that makes it more prone to encourage, embrace, espouse these sorts of predictions? Is there something about Americans’ faith in science and technology, about our belief in technological progress as a signal of socio-economic or political progress, that makes us more susceptible to take these predictions at face value? Is there something about our fears and uncertainties – and not just now, days before this Presidential Election where we are obsessed with polls, refreshing Nate Silver’s website obsessively – that makes us prone to seek comfort, reassurance, certainty from those who can claim that they know what the future will hold?
–Audrey Watters, The Best Way to Predict the Future is to Issue a Press Release
This is how science moves, through layers of history, through what phenomenologists call the “lifeworld.” The great scientist Benno Müller-Hill famously described the “two faces” of molecular biology. “In the textbooks, almost everything is solved and clear,” he wrote. “Most claims are so self-evident that no proofs are given. Old, classical experiments disappear.” In a sense, the first face is taken as a given. “The other face of molecular biology is seen at scientific conferences or read in recent issues of Nature, Science or Cell,” at the cutting edge of the field, where knowledge struggles with ignorance. Robert Pogue Harrison has suggested the metaphor of “two angels,” drawing on Paul Klee’s painting — made famous in Walter Benjamin’s “Theses on the Philosophy of History” — that depicts an “angel of history” who is “borne upward through the air on outspread wings, facing backward.” In Benjamin’s vision, all the angel sees are “the accumulated ruins of the has-been.” Harrison goes on to suggest that
science flies on the wing of another kind of angel — the angel of neoteny — who weaves in and out of enfolded spaces, forever turning a corner or rounding a bend, entering or exiting a crease of the cosmos, such that his expectant, forward-looking gaze sees anew a world it has seen countless times before, always as if for the first time.
–Jim Kozubek, On Writing a History of Crispr-Cas9
Chief—let’s call him Chief for brevity’s sake—was so popular because his daughters were fantastic milk producers. He had great genes for milk. But, geneticists now know, he also had a single copy of a deadly mutation that spread undetected through the Holstein cow population. The mutation caused some unborn calves to die in the womb. According to a recent estimate, this single mutation ended up causing more than 500,000 spontaneous abortions and costing the dairy industry $420 million in losses. […]
The USDA team now knew something was wrong with this segment of Chief’s DNA, but they didn’t know exactly where or why. Remember, the USDA was working with genetic markers, which did not actually get at the underlying DNA sequence. So they called up Harris Lewin, who had, by chance, undertaken the then-enormously-expensive project of sequencing Chief’s entire genome a few years earlier. Chief and his son Walkway Chief Mark were the first two dairy bulls to ever be sequenced.
Lewin and his postdoc Heather Adams got to work. “Within 48 hours, we had a candidate,” he says. The stretch of DNA in question corresponded to the gene Apaf1, which had been well studied in mice. Brain cells in mice embryos with a faulty Apaf1 would grow out of control, until the embryo eventually died. “The reason we had a candidate so quickly was because of the tremendous investment in mouse genetics,” says Lewin. The scientists trudging through the mouse genome could probably never have known that an obscure gene they isolated had such a huge effect on the dairy industry.
–Sarah Zhang, The Dairy Industry Lost $420 Million From a Defective Cow Gene