The Need for People-Centered Sources of Hope for Our Digital Future Ahead

By David Bray

Photo free for reuse from Pixabay; video from PCI’s 10 Dec 2018 event hosted by the Internet Society

With the People-Centered Internet coalition, several of us — including Vint Cerf, Mei Lin Fung, Marci Harris, Ray Wang, Eric Rasmussen, Lin Wells, Annie Sobel, Bevon Moore, Stephanie Wander, Corina DuBois, Don Codling, John Taschek and many more — met on 10 Dec 2018 to discuss the challenges facing open, pluralistic societies in our digital era. Specifically, we were concerned about whether human rights are being eroded by new technologies, and we wanted to help create People-Centered challenge incentives for innovators, entrepreneurs, and other creators to find ways to uplift communities and re-empower human rights for our digital future ahead.

A few weeks ago, I was asked to give an executive talk in Canada on the issues of Resilience and Governance. Those present were concerned about growing polarization, unrest, and social turbulence in several parts of Europe and the United States. For my talk, I opened with the somewhat provocative position that societies that separate their private sector activities from their public sector activities (i.e., the public sector doesn't own the private sector, the private sector hasn't wholly captured the public sector such that it does what they want, and some form of kleptocracy hasn't made the two indistinguishable) will have a harder time adapting to the challenges of the next decade.

Caveat: I am *not* advocating that we stop separating private sector activities from public sector activities, just that this very distinction may prove challenging for the era ahead. No one would ask small businesses to purchase surface-to-air missiles to protect their property from nation-state actors, yet in Western Europe and North America we do ask them to do the equivalent to protect their data and intellectual property from cyber threats. Similarly, when it comes to misinformation, disinformation, and malicious influence operations, countries that don't separate their private and public sectors can censor, lock up, or worse (eliminate) such attempts in ways that open societies cannot. Again, I certainly am *not* advocating for such means, just noting that the challenges open societies face are to a degree uniquely theirs.

The challenges currently facing open societies may only get worse as the growing Internet of Things, combined with small satellites, essentially “instruments the world” and in doing so: (1) risks allowing anyone to take the observed actions of an individual, group, or organization out of context, and (2) risks providing actors with enough data to build reliable models of what information or other motivational cues are needed to provoke an action from an individual or group — and thus prompts important questions of free will for those of us living in open societies.


So — circling back to the concern that human rights are being eroded by new technologies and the need to create People-Centered challenge incentives for innovators, entrepreneurs, and other creators to find ways to uplift communities. It’s possible that historians, looking back at the last twenty years, might conclude the goal of “digitally instrumenting the world” was tied to the thought that greater visibility and transparency would improve commerce and social functions and add convenience to our everyday lives. Yet historians looking at 2016 and beyond may conclude that, unfortunately, instrumenting the world only made things more likely to be taken out of context, more subject to misinformation and disinformation, and even harder when it comes to discerning truth amid the tsunami of digital data hitting us daily and hourly.

Back in the 1990s and early 2000s, those of us tracking digital advancements used to think more transparency and access to more facts would create greater understanding and remedy bad behaviors in the world. What we missed is that each of us, as humans, has confirmation biases that kick in once we have made up our minds on a topic. Facts that run counter to our position will not sway us; instead we’ll “dig in” further on our existing views. We like to think we’re rational beings, yet we’re not. Similarly, cognitive ease — hearing something repeated over and over — makes us more likely to believe it’s true, something marketing and politics both already know well. Refuting something as untrue only gives more airtime to the subject, invoking the double combination of cognitive ease and confirmation bias in such a way that a malicious meme thrives in an open society.

The quote attributed to Scott Guthery is astute: “Noise pulls people apart; truth doesn’t bring them back together.”

Future historians may conclude that our human brains and behaviors were not ready for the democratization of information and the consequences that followed. Moreover, it’s important to remember that marketing and politics have always been influence operations to a degree. Like using botox to remove wrinkles temporarily, it seems like open societies were okay (to a degree) with these activities if — to extend the botox analogy — they neither paralyzed nor killed the patient. For the decade ahead, what we may be discovering is it’s all too easy to weaponize information to paralyze or kill productive thought and social cohesion.

So, given all these challenges, what are potential sources of hope? Here are three:

(1) We need more research on what can make people aware of their confirmation biases and cognitive-ease effects, and at least open to considering additional perspectives — the challenge is we may find that the selection pressures that shaped human evolution are hard to overcome (e.g., most of us want to be certain and not re-evaluate our worldviews on a daily basis)

(2) We can find ways to give people ownership of the data they produce, so that they can choose how it is and isn’t used — the challenge is: how do you adequately define informed consent and all the second- and third-order effects of using such data, let alone keep up with a tsunami of prompts for its use?

(3) We must remove some of the marketing or political incentives to weaponize information — or introduce positive incentives to increase group cohesion beyond self-gain. If every organization or society has a set of incentives perfectly designed to produce the results it is currently experiencing, then we must ask: what different set of incentives would trigger the behaviors we want in order to survive the challenges ahead?

Human uses of tools and technologies, like all the items contained in the mythical Pandora’s box, cannot be put back in the box once opened. Yet in the myth of Pandora’s box, at the very bottom there was hope. In our socially turbulent era, especially for several parts of Europe and the United States, we need hope that open, pluralistic societies can withstand the challenges they confront. Together we can encourage People-Centered challenge incentives for innovators, entrepreneurs, and other creators to find ways to uplift communities and re-empower human rights for our digital future ahead.

This article was first published here.
