Otherwise–who knows?

I have said that this new development has unbounded possibilities for good and for evil. For one thing, it makes the metaphorical dominance of the machines, as imagined by Samuel Butler, a most immediate and non-metaphorical problem. It gives the human race a new and most effective collection of mechanical slaves to perform its labor. Such mechanical labor has most of the economic properties of slave labor, although, unlike slave labor, it does not involve the direct demoralizing effects of human cruelty. However, any labor that accepts the conditions of competition with slave labor accepts the conditions of slave labor, and is essentially slave labor. The key word of this statement is competition. It may very well be a good thing for humanity to have the machine remove from it the need of menial and disagreeable tasks, or it may not. I do not know. It cannot be good for these new potentialities to be assessed in the terms of the market, of the money they save; and it is precisely the terms of the open market, the “fifth freedom,” that have become the shibboleth of the sector of American opinion represented by the National Association of Manufacturers and the Saturday Evening Post. I say American opinion, for as an American, I know it best, but the hucksters recognize no national boundary.

Perhaps I may clarify the historical background of the present situation if I say that the first industrial revolution, the revolution of the “dark satanic mills,” was the devaluation of the human arm by the competition of machinery. There is no rate of pay at which a United States pick-and-shovel laborer can live which is low enough to compete with the work of a steam shovel as an excavator. The modern industrial revolution is similarly bound to devalue the human brain, at least in its simpler and more routine decisions. Of course, just as the skilled carpenter, the skilled mechanic, the skilled dressmaker have in some degree survived the first industrial revolution, so the skilled scientist and the skilled administrator may survive the second. However, taking the second revolution as accomplished, the average human being of mediocre attainments or less has nothing to sell that is worth anyone’s money to buy.

The answer, of course, is to have a society based on human values other than buying or selling. To arrive at this society, we need a good deal of planning and a good deal of struggle, which, if the best comes to the best, may be on the plane of ideas, and otherwise—who knows? […]

Those of us who have contributed to the new science of cybernetics thus stand in a moral position which is, to say the least, not very comfortable. We have contributed to the initiation of a new science which, as I have said, embraces technical developments with great possibilities for good and for evil. We can only hand it over into the world that exists about us, and this is the world of Belsen and Hiroshima. We do not even have the choice of suppressing these new technical developments. They belong to the age, and the most any of us can do by suppression is to put the development of the subject into the hands of the most irresponsible and most venal of our engineers. The best we can do is to see that a large public understands the trend and the bearing of the present work, and to confine our personal efforts to those fields, such as physiology and psychology, most remote from war and exploitation. As we have seen, there are those who hope that the good of a better understanding of man and society which is offered by this new field of work may anticipate and outweigh the incidental contribution we are making to the concentration of power (which is always concentrated, by its very conditions of existence, in the hands of the most unscrupulous). I write in 1947, and I am compelled to say that it is a very slight hope.

—Norbert Wiener, Cybernetics: or Control and Communication in the Animal and the Machine (1961 [orig. 1948]), The MIT Press, pp. 27-29


Judge Roy Moore and American Christianity

In an interview conducted by Jeff Stein for Vox, one of Alabama’s Republican senatorial candidates, Judge Roy Moore, attempted to clarify his view of the relationship between the American Constitution and Christianity:

But to deny God — to deny Christianity or Christian principles — is to deny what the First Amendment was established for. You see, the First Amendment was established on Christian principles, because it was Jesus that said this: “Render therefore unto Caesar the things which are Caesar’s; and render unto God the things that are God’s.” He recognized the jurisdiction the government does not have — and that was the freedom of conscience.

If you were a complete atheist, or a Buddhist, or a Muslim, or whatever, you have freedom in this country to worship God and you can’t be forced otherwise. That’s a Christian concept. It’s not a Muslim concept.

Developing his theme of contrast between Christianity and Islam, Moore claimed this:

There are communities under Sharia law right now in our country. Up in Illinois. Christian communities; I don’t know if they may be Muslim communities.

But Sharia law is a little different from American law. It is founded on religious concepts.

To recap: the U. S. Constitution — the entire basis of the American legal system — is founded on Christian principles, but Sharia law is different because it is founded on religious concepts.

Also, when Stein challenges Moore to elaborate on those communities allegedly living under Sharia law, Moore replies, “I was informed that there were. But if they’re not, it doesn’t matter.” Because why would anybody care about things like verifiable evidence for bold claims about a key issue?

Moore’s most basic claims about the legal relationship between religion and the U. S. Constitution are self-evidently contradictory and incoherent. By the way, Moore is a former chief justice of Alabama’s Supreme Court. And if you believe the polls, he’s about to be the Republican nominee for Jeff Sessions’s old Senate seat. In practice, this means that the people of Alabama are very likely to make him their next U. S. senator.

In one sense, this is significant to me only as a barometer of the degree to which not-insignificant portions of the electorate are eager to embrace patent fruitcakery, so long as it is sufficiently white and sufficiently bigoted. As a Christian, though, I find it more significant still: I hate that people like Moore too often symbolize my faith to people on all sides of the front lines in America’s culture wars.

Many on the right hasten to offer apologetics for his pernicious balderdash; many on the left hasten to cast all American Christians from the same mold as Moore, because they think that, deep down, he’s merely the most blatant, odious symptom of our unsupportable mass delusion. Judge Moore does not speak for me. To the extent that he represents any historical variant of the rich, multilayered tapestry of the Christian religion, he is representative of those threads tangled together underneath a moldy coffee stain.

And if you think Judge Moore speaks for you, then you are welcome to all the justifiable criticism and caricature that inevitably follows when a buffoon who has smeared himself in feces lights himself on fire and sings the national anthem in the public square. It’s an offensive spectacle to all who have eyes to see and ears to hear, and it is to be greatly regretted that the stench will cling to the clothes of all who happened to be present to witness it, regardless of where they happened to be standing at the time.

Only a kind of obsessive monoculture

Ms. Tippett: … I want to take a slight diversion, which I don’t think is completely a diversion, which is your love of science fiction and the way science fiction is in your fiction. And I also love science fiction, and my story is not your story, but I grew up in a very small town and went to Brown, which was like going to a different planet. And you came from Santo Domingo to central New Jersey; it was like a different planet. And for the very first time, when I was reading you, and the science fiction references keep jumping out at me, including “Fear is the mind-killer,” it occurred to me that science fiction is there for people who change worlds. What did you say a little while ago? You were talking, also, about that numinous world that — the sense that there are many worlds within the world. I just kind of wanted to note that. I mean — and it’s not an escape. It’s actually revealing or kind of opening your imagination to vast cosmic possibilities that aren’t immediately reflected in the world around you.

Mr. Díaz: Yeah, well, it could be an escape, but I do find science fiction to be — for me has been an excellent literary technology for understanding our many worlds, for understanding what’s been disavowed about our societies, for understanding our political unconscious. It’s really — science fiction is really good to think, man. And for some folks, the aliens and all the stuff about otherness is just surface titillation. For others of us, it becomes a source for theorizing about real-world alterity and alternate possibilities. And that’s the way I reacted to science fiction, in some ways. For me, science fiction offered the possibility of different ways of being and of ways of possibly overcoming the cage that surrounded us.

Ms. Tippett: Yeah, and another reference that I feel is kind of in the ether right now is this Whitman line of “I contain multitudes.” It’s come up a lot, lately, and you invoke that in the context of a question about what is America — that there are these multiple Americas. I wonder how your long view of time, your rootedness in the whole sweep of history, of your ancestors, of your people as the ground on which you stand in the present, how that speaks to you about multiple Americas and how to live with this, generatively.

Mr. Díaz: Well, I mean shoot. It’s a question that has bedeviled the New World and bedeviled societies for a long time. I mean shoot, we’ve got the Babel myth at the heart of the Bible, the idea that God struck down humans by making them more diverse. [laughs] Only a kind of obsessive monoculture would think that’s a terrible thing. But, you know, so it goes. I just — when I think about what is required for all of us to live on this planet, it’s going to be the kinds of solidarities and the kinds of civic imaginaries and the kinds of radical tolerances that we’re not seeing. We’re going to have to practice a democracy that we’ve yet to define or even lay down the first four bricks of. There’s nothing about our impoverished political systems, our imagined communities, that is going to be able to hold us together in the face of the coming storm of climate change. We need a lot more than we have. And the fact that so many of us are scared by our multiplicity shows you how much work we have to do.

Our multiplicity is our damn strength. There is no getting around it. People want to make it the danger. People want to make it the problem. No, it’s only going to be the problem if we don’t make it our strength. And you don’t want to be so fantastically reductive, but really, at an operational level, it’s really what it comes down to — either we’re going to embrace humanity and figure out how we can all live together and work together to overcome the damage that certain sectors of us have inflicted on the planet, or we’re not. And I, for one, think eventually there’s — I don’t trust our politicians. I don’t trust our mainstream religious figures. I don’t trust our business leaders. I don’t trust any of the sort of folks who already have power and have already shown us how little they can do for us, and they’re showing us their cowardice and their avarice — I don’t trust any of those people. But I do trust in the collective genius of all the people who have survived these wicked systems. I trust in that. I think from the bottom will the genius come that makes our ability to live with each other possible. I believe that with all my heart.

Junot Díaz in conversation with Krista Tippett

This is a fascinating, somewhat confusing exchange. Díaz and Tippett link sf to alterity, and they link alterity to the plurality inherent in systems of democracy. So far, so good. But Díaz alludes to the Babel story to illustrate the notion that humanity has struggled with multiculturalism for millennia. “God struck down humans by making them more diverse.” Hm, okay. If language is a metonym for all diversity, sure. And if scattering people to diverse areas around the globe equals “striking down,” I guess. But then he says, “Only a kind of obsessive monoculture would think that’s a terrible thing.” This is the confusing part. To which “obsessive monoculture” is he referring? Who sees what part of that as a terrible thing?

I suppose that Babel often serves as a kind of metaphor for irreconcilable breakdowns in communication. Fair enough. And we do, I further suppose, generally think of communication breakdowns as bad things. But that’s us: the generations raised to believe in the rightness of democratic politics. Weirdly enough, I wouldn’t take exception to Díaz labeling us 21st-century moderns as a kind of obsessive monoculture. But I don’t think that he’s doing that.

God’s reason for scattering the people is that if they succeed in building their city and its tower to heaven, then “nothing they plan to do will be impossible for them.” There’s not much elaboration there. I’m confident that theologians over the centuries have spilled much ink and hot air over why God really confused humanity’s languages or the myriad things the story signifies. On the most basic level, it simply seems that God did not think it good that humans find nothing to be impossible, and it’s worth meditating on why God would place barriers in front of people reaching for radical possibilities of self-definition and agency.

This kind of meditation is something sf is really good at. And one might even generalize that stories modeled on the story of Babel tend to emphasize the hubris, avarice, and cowardice of leaders who want to place themselves on the same plane as God at the expense of common people and the natural world.

That still doesn’t help me understand which “obsessive monoculture” Díaz refers to or precisely why invoking the Babel story helps us understand why it would view multiplicity as such a terrible thing. Perhaps he meant nothing more than to imply some sort of intrinsic correlation between the Bible and fear of the Other. But, you know, so it goes.

 

Politics of ruin

Every dystopia is a history of the future. What are the consequences of a literature, even a pulp literature, of political desperation? “It’s a sad commentary on our age that we find dystopias a lot easier to believe in than utopias,” Atwood wrote in the nineteen-eighties. “Utopias we can only imagine; dystopias we’ve already had.” But what was really happening then was that the genre and its readers were sorting themselves out by political preference, following the same path—to the same ideological bunkers—as families, friends, neighborhoods, and the news. In the first year of Obama’s Presidency, Americans bought half a million copies of “Atlas Shrugged.” In the first month of the Administration of Donald (“American carnage”) Trump, during which Kellyanne Conway talked about alternative facts, “1984” jumped to the top of the Amazon best-seller list. (Steve Bannon is a particular fan of a 1973 French novel called “The Camp of the Saints,” in which Europe is overrun by dark-skinned immigrants.) The duel of dystopias is nothing so much as yet another place poisoned by polarized politics, a proxy war of imaginary worlds.

Dystopia used to be a fiction of resistance; it’s become a fiction of submission, the fiction of an untrusting, lonely, and sullen twenty-first century, the fiction of fake news and infowars, the fiction of helplessness and hopelessness. It cannot imagine a better future, and it doesn’t ask anyone to bother to make one. It nurses grievances and indulges resentments; it doesn’t call for courage; it finds that cowardice suffices. Its only admonition is: Despair more. It appeals to both the left and the right, because, in the end, it requires so little by way of literary, political, or moral imagination, asking only that you enjoy the company of people whose fear of the future aligns comfortably with your own. Left or right, the radical pessimism of an unremitting dystopianism has itself contributed to the unravelling of the liberal state and the weakening of a commitment to political pluralism. “This isn’t a story about war,” El Akkad writes in “American War.” “It’s about ruin.” A story about ruin can be beautiful. Wreckage is romantic. But a politics of ruin is doomed.

–Jill Lepore, A Golden Age of Dystopian Fiction

Introducing “iGen.”

From Jean M. Twenge’s recent essay in The Atlantic:

The more I pored over yearly surveys of teen attitudes and behaviors, and the more I talked with young people like Athena, the clearer it became that theirs is a generation shaped by the smartphone and by the concomitant rise of social media. I call them iGen. Born between 1995 and 2012, members of this generation are growing up with smartphones, have an Instagram account before they start high school, and do not remember a time before the internet. The Millennials grew up with the web as well, but it wasn’t ever-present in their lives, at hand at all times, day and night. iGen’s oldest members were early adolescents when the iPhone was introduced, in 2007, and high-school students when the iPad entered the scene, in 2010. A 2017 survey of more than 5,000 American teens found that three out of four owned an iPhone.

The advent of the smartphone and its cousin the tablet was followed quickly by hand-wringing about the deleterious effects of “screen time.” But the impact of these devices has not been fully appreciated, and goes far beyond the usual concerns about curtailed attention spans. The arrival of the smartphone has radically changed every aspect of teenagers’ lives, from the nature of their social interactions to their mental health. These changes have affected young people in every corner of the nation and in every type of household. The trends appear among teens poor and rich; of every ethnic background; in cities, suburbs, and small towns. Where there are cell towers, there are teens living their lives on their smartphone.

Twenge supplies a lot of correlated data that strongly link smartphone use to a number of generationally distinct patterns in what she calls “iGen.” Among the more worrying data, she documents the rise of cyberbullying among young people, especially among girls. Then this:

Social-media companies are of course aware of these problems, and to one degree or another have endeavored to prevent cyberbullying. But their various motivations are, to say the least, complex. A recently leaked Facebook document indicated that the company had been touting to advertisers its ability to determine teens’ emotional state based on their on-site behavior, and even to pinpoint “moments when young people need a confidence boost.” Facebook acknowledged that the document was real, but denied that it offers “tools to target people based on their emotional state.”

At no time in human history have we possessed tools more finely attuned to the art of manipulating the psychology of masses of people. These tools are supremely scalable. The same platforms that can target a demographic of heterogeneous millions can individualize their content to reach, perhaps, a niche demographic of dozens. Taken in the context of Mark Zuckerberg’s utopian manifesto from earlier this year, the existence of the “boost” document ought to give us serious pause.

Allow me to go one step further. Scientists based in Portland, Oregon, recently succeeded in using the gene-editing program CRISPR/Cas9 to edit the DNA of embryos to eliminate the development of a genetic mutation that would cause hypertrophic cardiomyopathy. This is an incredible victory for medical science. But as I’ve said before, I’ve read Margaret Atwood’s Oryx and Crake. You should, too.

We have the tools to shape and reshape the human experience on a very literal level. On the genetic level, CRISPR is but the first feeble step toward technology whose power will enable us to program our own genetic makeup on scales previously imagined only in science fiction. Similarly, the algorithms of social media sites like Facebook have the potential to shape their users’ desires, feelings, and perceptions in ways that are simultaneously microscopically managed and macroscopically unpredictable. I strive to make these observations not in a spirit of alarm or histrionics but in the mindset of sober assessment. If, despite my attempts at sobriety, you feel alarmed… well, good.

Neither savage nor barbarian

Clara broke in here, flushing a little as she spoke: ‘Was not their mistake once more bred of the life of slavery that they had been living? — a life which was always looking upon everything, except mankind, animate and inanimate — “nature”, as people used to call it — as one thing, and mankind another. It was natural to people thinking in this way, that they should try to make “nature” their slave, since they thought “nature” was something outside them.’

‘Surely,’ said Morsom; ‘and they were puzzled as to what to do, till they found the feeling against mechanical life, which had begun before the Great Change amongst people who had leisure to think of such things, was spreading insensibly; till at last under the guise of pleasure that was not supposed to be work, work that was pleasure began to push out the mechanical toil, which they had once hoped at the best to reduce to narrow limits indeed, but never to get rid of, and which, moreover, they found they could not limit as they had hoped to do.’

‘When did this new revolution gather head?’ said I.

‘In the half-century that followed the Great Change,’ said Morsom, ‘it began to be noteworthy; machine after machine was quietly dropped under the excuse that the machines could not produce works of art, and that works of art were more and more called for. Look here,’ he said, ‘here are some of the works of that time — rough and unskilful in handiwork, but solid and showing some sense of pleasure in the making.’

‘They are very curious,’ said I, taking up a piece of pottery from amongst the specimens which the antiquary was showing us; ‘not a bit like the work of either savages or barbarians, and yet with what would once have been called a hatred of civilization impressed upon them.’

–William Morris, News from Nowhere, or an epoch of rest, being some chapters from a utopian romance. Edited by James Redmond, Routledge & Kegan Paul, 1970, pp. 154-55. (1890)

 

Best practices.

In a still-influential paper from 1937 titled “The Nature of the Firm,” the economist and Nobel laureate Ronald Coase established himself as an early observer and theorist of corporate concerns. He described the employment contract not as a document that handed the employer unaccountable powers, but as one that circumscribed those powers. In signing a contract, the employee “agrees to obey the directions of an entrepreneur within certain limits,” he emphasized. But such characterizations, as Anderson notes, do not reflect reality; most workers agree to employment without any negotiation or even communication about their employer’s power or its limits. The exceptions to this rule are few and notable: top professional athletes, celebrity entertainers, superstar academics, and the (increasingly small) groups of workers who are able to bargain collectively.

Yet because employment contracts create the illusion that workers and companies have arrived at a mutually satisfying agreement, the increasingly onerous restrictions placed on modern employees are often presented as “best practices” and “industry standards,” framing all sorts of behaviors and outcomes as things that ought to be intrinsically desired by workers themselves. Who, after all, would not want to work on something in the “best” way? Beyond employment contracts, companies also rely on social pressure to foster obedience: If everyone in the office regularly stays until seven o’clock every night, who would risk departing at five, even if it’s technically allowed? Such social prods exist alongside more rigid behavioral codes that dictate everything from how visible an employee’s tattoo can be to when and how long workers can break for lunch.

–Miya Tokumitsu, The United States of Work

You won’t get privacy on the Republican party line.

As the Electronic Frontier Foundation has pointed out, there are also serious implications for security: If ISPs look to sell consumer data, “internet providers will need to record and store even more sensitive data on their customers, which will become a target for hackers.” Even if they anonymize your sensitive data before they sell it to advertisers, they need to collect it first—and these companies don’t exactly have a perfect track record in protecting consumer data. In 2015, for example, Comcast paid $33 million as part of a settlement for accidentally releasing information about users who had paid the company to keep their phone numbers unlisted, including domestic violence victims.

This is all made much more difficult for consumers by the dearth of broadband competition. More than half of Americans have either one or even no options for providers, so if you don’t like your ISP’s data collection policies, chances are you won’t be able to do much about it, and providers know that. It’s highly unlikely that providers, particularly the dominant companies, will choose to forego those sweet advertising dollars in order to secure their customers’ privacy, when they know those customers don’t have much choice. […]

All is not completely lost. Your ISP still has to allow you to opt out of having your data sold, so you can call them or go online to find out how to do that. (If you do that, let us know how it went.) But today’s news is devastating for privacy overall. Consumers could have had more control over their privacy; your data could have been safer. Things could have been better, if Congress had done what it usually does and done nothing. Instead, they made things worse for anyone who doesn’t run an internet company or an advertising agency.

–Libby Watson, Congress Just Gave Internet Providers the Green Light to Sell You Browsing History Without Consent

 

Ice Age Solastalgia

The mammoth’s extinction may have been our original ecological sin. When humans left Africa 70,000 years ago, the elephant family occupied a range that stretched from that continent’s southern tip to within 600 miles of the North Pole. Now elephants are holed up in a few final hiding places, such as Asia’s dense forests. Even in Africa, our shared ancestral home, their populations are shrinking, as poachers hunt them with helicopters, GPS, and night-vision goggles. If you were an anthropologist specializing in human ecological relationships, you may well conclude that one of our distinguishing features as a species is an inability to coexist peacefully with elephants.

But nature isn’t fixed, least of all human nature. We may yet learn to live alongside elephants, in all their spectacular variety. We may even become a friend to these magnificent animals. Already, we honor them as a symbol of memory, wisdom, and dignity. With luck, we will soon re-extend their range to the Arctic. […]

Nikita and Sergey seemed entirely unbothered by ethical considerations regarding mammoth cloning or geoengineering. They saw no contradiction between their veneration of “the wild” and their willingness to intervene, radically, in nature. At times they sounded like villains from a Michael Crichton novel. Nikita suggested that such concerns reeked of a particularly American piety. “I grew up in an atheist country,” he said. “Playing God doesn’t bother me in the least. We are already doing it. Why not do it better?”

–Ross Andersen, Welcome to Pleistocene Park

From “The Mark Manifesto”

With a community of almost two billion people, it is less feasible to have a single set of standards to govern the entire community so we need to evolve towards a system of more local governance.

–Mark Zuckerberg, Building Global Community [via Recode]

There’s so much in the manifesto that smarter people than me will hash over, but this stood out to me, appearing as it does about three-quarters of the way through a polemic advocating for Facebook’s centrality to the building of a truly global community. I’ve no idea how this claim will be translated into algorithmic practice. The general tenor of that section of the manifesto gives the impression that what Zuckerberg means is that individuals will still (sort of) control what they see, but that those settings will be refined by Facebook’s programmers to set regional norms for community standards.

But in a global community, how are locality and region going to be defined? In a digital space where people choose their associations, how will Facebook determine boundaries? To what extent will cookies, likes, and reposts determine new forms of subcommunity identity? If Facebook is successful in its global agenda, will nation-states morph into digitally facilitated forms of groupthink? Zuckerberg seems determined not to contribute to the atomization of society via his particular social media platform (and it’s clear that he’s wrestled with this issue pretty extensively), but what checks and balances do Zuckerberg and his army of programmers intend to build into the code? Zuckerberg also intends to grow the Facebook community; if 2 billion makes it “less feasible to have a single set of standards,” what happens when Facebook hits 3 billion?

Zuckerberg claims at the outset of the manifesto that the goal is “building the long term social infrastructure to bring humanity together.” I feel like there’s a lot of slippage between terms like “community,” “government,” “standards,” and “infrastructure” throughout, as there tends to be in any extended political conversation, but very little acknowledgement of who or what comprises this infrastructure.

It’s fine and dandy to insist that the sociability of people is the nucleus of Facebook. And that’s sort of true. But it’s also true that Facebook remains a private company whose product is a proprietary digital system whose language is known only to Zuckerberg and his employees. Facebook is infrastructure, even social infrastructure in a capacious sense of the word. But Zuckerberg seems to entertain seriously the idea that it’s the users who are driving the formation of the community even as he promotes the role of the Facebook corporate entity in giving it shape and function. What does locality look like in a global village whose infrastructure is housed in Silicon Valley, yet whose fiber-optic materials and electronic signals remain almost literally invisible to the eye of the people who “live” there?