The best I could do as moderator some days was to keep the conversation from completely turning into a flaming cesspool. Last month, I was speaking to a friend, describing my long-held hope that things might someday improve, that every time a conversation in comments went really well, maybe it signaled a turning point—that from then on, things would get better. As soon as I said that aloud, I realized that it sounded as if I had been living in a long-term abusive relationship.
–Alan Taylor, "For Ten Years, I Read the Comments"
Davidson’s messianic hopes as well as Nichols’s cultural despair mistakenly suppose that there can somehow be a vacuum of epistemic authority. But, in truth, forms and functions of epistemic authority, be they the disciplinary order of the research university or Wikipedia’s fundamental principles or “Five Pillars,” are themselves filtering technologies, helping us to orient ourselves amid a surfeit of information. They help us discern and attend to what is worthwhile. Google searches point us in the direction of some resources and not others. Technologies are normative, evaluative structures that make information accessible, manageable, and, ultimately, meaningful. It is not a question, then, of the presence or absence of epistemic authority; it is a question of better or worse forms of epistemic authority. Expertise and cultural authority are still with us. But now they might be more spectral, embodied not in the university don but in the black-boxed algorithm.
–Chad Wellmon, “Algorithms Rule”