Social media has drastically restructured the way we communicate in an incredibly short period of time.
We can discover, “Like,” click on, and share information faster than ever before, guided by algorithms most of us don’t quite understand, according to “Stewardship of global collective behavior,” a paper published in the prestigious science journal PNAS in June.

Vox.com reports that seventeen researchers who specialize in widely different fields, from climate science to philosophy, make the case that academics should treat the study of technology’s large-scale impact on society as a “crisis discipline.” A crisis discipline is a field in which scientists across different specialties work quickly to address an urgent societal problem — the way conservation biology tries to protect endangered species, or climate science aims to stop global warming.

The paper argues that our lack of understanding of the collective behavioral effects of new technology is a danger to democracy and scientific progress. For example, the authors say that tech companies have “fumbled their way through the ongoing coronavirus pandemic, unable to stem the ‘infodemic’ of misinformation” that has hindered widespread acceptance of masks and vaccines. They warn that if new technology is left misunderstood and unchecked, its unintended consequences could contribute to phenomena such as “election tampering, disease, violent extremism, famine, racism, and war.”

Recode interviewed the paper’s lead author, Joe Bak-Coleman, a postdoctoral fellow at the University of Washington Center for an Informed Public, as well as co-author Carl Bergstrom, a biology professor at the University of Washington, to better understand this call for a paradigm shift in how scientists study the technology we use every day. Bergstrom said his sense is that social media in particular — along with a broader range of internet technologies, including algorithmically driven search and click-based advertising — has changed the way people get information and form opinions about the world.
“There’s no reason why good information will rise to the top of any ecosystem we’ve designed,” he says. Bergstrom’s hope is very much that the paper “will sort of galvanize people,” highlighting the magnitude of what has happened and the urgency of fixing it. What concerns him is that this information ecosystem has developed to optimize something orthogonal to things we consider extremely important, such as the veracity of information and its effects on human well-being, on democracy, on health, and on the ecosystem.