Harper's Magazine, September 2021, by Joseph Bernstein
In the beginning, there were ABC, NBC, and CBS, and they were good. Midcentury American man could come home after eight hours of work and turn on his television and know where he stood in relation to his wife, and his children, and his neighbors, and his town, and his country, and his world. And that was good. Or he could open the local paper in the morning in the ritual fashion, taking his civic communion with his coffee, and know that identical scenes were unfolding in households across the country.
Over frequencies our American never tuned in to, red-baiting, ultra-right-wing radio preachers hyperventilated to millions. In magazines and books he didn’t read, elites fretted at great length about the dislocating effects of television. And for people who didn’t look like him, the media had hardly anything to say at all. But our man lived in an Eden, not because it was unspoiled, but because he hadn’t considered any other state of affairs. For him, information was in its right—that is to say, unquestioned—place. And that was good, too.
Today, we are lapsed. We understand the media through a metaphor—“the information ecosystem”—which suggests to the American subject that she occupies a hopelessly denatured habitat. Every time she logs on to Facebook or YouTube or Twitter, she encounters the toxic byproducts of modernity as fast as her fingers can scroll. Here is hate speech, foreign interference, and trolling; there are lies about the sizes of inauguration crowds, the origins of pandemics, and the outcomes of elections.
She looks out at her fellow citizens and sees them as contaminated, like tufted coastal animals after an oil spill, with “disinformation” and “misinformation.” She can’t quite define these terms, but she feels that they define the world, online and, increasingly, off.
Everyone scrounges this wasteland for tainted morsels of content, and it’s impossible to know exactly what anyone else has found, in what condition, and in what order. Nevertheless, our American is sure that what her fellow citizens are reading and watching is bad. According to a 2019 Pew survey, half of Americans think that “made-up news/info” is “a very big problem in the country today,” about on par with the “U.S. political system,” the “gap between rich and poor,” and “violent crime.” But she is most worried about disinformation, because it seems so new, and because so new, so isolable, and because so isolable, so fixable. It has something to do, she knows, with the algorithm.
What is to be done with all the bad content? In March, the Aspen Institute announced that it would convene an exquisitely nonpartisan Commission on Information Disorder, co-chaired by Katie Couric, which would “deliver recommendations for how the country can respond to this modern-day crisis of faith in key institutions.” The fifteen commissioners include Yasmin Green, the director of research and development for Jigsaw, a technology incubator within Google that “explores threats to open societies”; Garry Kasparov, the chess champion and Kremlin critic; Alex Stamos, formerly Facebook’s chief security officer and now the director of the Stanford Internet Observatory; Kathryn Murdoch, Rupert Murdoch’s estranged daughter-in-law; and Prince Harry, Prince Charles’s estranged son. Among the commission’s goals is to determine “how government, private industry, and civil society can work together . . . to engage disaffected populations who have lost faith in evidence-based reality,” faith being a well-known prerequisite for evidence-based reality.
The Commission on Information Disorder is the latest (and most creepily named) addition to a new field of knowledge production that emerged during the Trump years at the juncture of media, academia, and policy research: Big Disinfo. A kind of EPA for content, it seeks to expose the spread of various sorts of “toxicity” on social-media platforms, the downstream effects of this spread, and the platforms’ clumsy, dishonest, and half-hearted attempts to halt it. As an environmental cleanup project, it presumes a harm model of content consumption. Just as, say, smoking causes cancer, consuming bad information must cause changes in belief or behavior that are bad, by some standard. Otherwise, why care what people read and watch?