By Dr. David Bray
Never before has 50% of humanity (approximately the share of people on the planet with access to the Internet as of 2018) had access to so much information each day. Yet with that access comes a challenge of our modern age: all of us can find information that supports our existing beliefs.
Each of us can and does experience confirmation bias (where we “plus up” information that reinforces our existing views and dismiss information that challenges those views) and cognitive easing (where, if something is repeated often enough, we become comfortable with it and believe it to be true, even if it is not). Much of this has to do with our human biology: the time and energy it would take to re-examine every aspect of our lives daily would be overwhelming.
Where Do We Want to Go?
Last week, in my capacity as Executive Director for the People-Centered Internet, I had the opportunity to give a talk on the “Future of Governance” at Singularity University where I actively asked the participants to consider what the future might have in store for us. Over the years I’ve written how the pace of advances in technological capabilities appears to be accelerating (though this acceleration may mostly be improvements to existing technologies vs. entirely new technologies). This acceleration is also triggering social impacts as we strive to sort out what the new capabilities mean for how we live, work, and play.
In the past, the professions arose as a way for individuals unfamiliar with a field to know whether a professional, possessing textbook knowledge and real-world skills beyond those of the general public, was behaving ethically. The public granted professional societies the ability to self-govern, provided their members demonstrated themselves qualified through their experience and adhered to a code of values.
I asked the participants whether the professions have been “flattened” in some respects in our era of information richness bordering on information overload. Specifically, could information overload prompt societies to actively devalue expertise and claim experts are no longer needed, either because they’re out of date with the changing world or because information to counter their views can be found online?
By asking this question of the participants at Singularity University, I wanted to hear their thoughts on how to balance the need for expertise in pluralistic, open societies with the public’s need to be able to question experts. I personally believe that researchers and practitioners in economics, sociology, management science, and other human-focused fields are only now beginning to appreciate how confirmation bias and cognitive easing are affected by our information-rich, potentially information-overloaded era.
I also emphasized during the talk that I am a strong supporter of the “power of diversity”, namely that groups with varying perspectives can outperform groups with singular perspectives. This means we need to be cautious of experts with just one way of looking at the world. For example, Frank Lloyd Wright was an amazing architect, and his buildings are remarkable from an architectural standpoint. With the benefit of hindsight, some have said that from an engineering perspective his buildings could have been improved; it would have been better to pair him with an equally talented engineer.
Yet bringing together diverse groups of people around challenges involves finding a common way of communicating ideas and concepts, as well as becoming comfortable with the different perspectives diverse experts bring to the world. Here again, confirmation bias and cognitive easing risk polarizing different groups rather than helping to bring them together.
How Will Technology Impact How We Co-Exist?
We humans have opted for different forms of governance over the centuries. Some of these have been governments of different forms, some have been other forms of self-governance such as the example of the professional societies. I like Tim O’Reilly’s definition that I once heard at a Sci Foo conference, namely: governing is that which we choose to do together that we can’t do alone.
Last week, I asked the participants at Singularity University to consider how technology might empower, or potentially erode, different forms of government. Did we risk a future where autocracies were strengthened by surveillance and algorithmic control? Did the current forms of social media lend themselves to a neo-feudalism where vassals follow different “lords” and “ladies” online? Could new faiths arise around different AIs? Could distributed ledgers assist with more participatory forms of democracy?
I also asked the participants to consider whether nation states themselves might fade. After all, where is a packet of information on the Internet: do you apply the rule of law based on its origin, its destination, or both? Europe’s General Data Protection Regulation (GDPR) represents a notable effort to apply European rules outside of Europe, and it will be interesting to see the regulation’s long-term effects.
Recognizing governance is done by groups beyond just governments, in the future we might see Boards with both elected humans and algorithms. We may see partnerships across sectors, mediated by AI or machine learning to help ensure fairness and transparency.
I asked the participants to consider whether new technological capabilities challenge societies to re-examine what they consider good vs. bad regarding organizational and individual norms. New capabilities might change the definition of what’s ethical and fair relative to the norms of the past, and it might take 10 to 20 years for a new generation to sort out a consensus. If so, a significant challenge arises when the pace of technological change accelerates: modern societies strain to keep up, and significant friction and heightened emotions emerge as a result.
Fast-forwarding to this week: I recently read a draft platform policy paper released by Senator Mark Warner of Virginia. Back in 2012–2013 I had the opportunity to work with Sen. Warner when he was one of 12 members of the National Commission for the Review of the Research and Development Programs of the U.S. Intelligence Community. I was serving as the non-partisan Executive Director sandwiched between six Democrats and six Republicans. During meetings with him and his staff, I found he asked insightful, forward-leaning questions. As a side note, that Commission back in 2013 also made us all concerned about the potential impacts of future technology advancements on open societies.
In reading Sen. Warner’s latest draft platform policy paper, it is clear he and his staff have put a lot of thought into the challenges of a rapidly changing world. These include the challenges of misinformation, fake images and videos, and disinformation; challenges of data ownership and data transparency; and challenges of bots, algorithms, and a world in which machines can emulate human-like behaviors. As a non-partisan, I appreciate the forward-leaning potential solutions proposed in the draft, recognizing that a lot of additional consideration is needed from all sides. I celebrate that at least someone is making a thoughtful effort to recommend solutions that fit the challenges of governance now and in the future, rather than pretending we don’t face serious challenges that won’t be solved by simple one-sided answers.
Strengthening Pluralistic, Open Societies
Considering how technology impacts societies, and considering the current challenges we seem to be facing with governance at local, community, national, and global levels, I submit we have a lot of work to do. Together, we humans are going to have to figure out whether laws based on geography work for some of these issues, or whether certain platforms are trans-national in nature. We’re going to have to figure out how to protect individuals from attacks on their privacy or person, while at the same time finding some way to address the challenge of attributing misinformation online.
Sen. Warner’s draft platform policy paper raises concerns about “dark patterns” that mislead people into thinking they must give up more information, via an app or website, in order to use the service. Such concerns touch on the challenges of user experience and design, as well as what people are thinking and weighing when they decide. Knowing this fully can be hard to do, yet we need to consider new solutions if we are to find ways to co-exist as pluralistic, open societies. The People-Centered Internet coalition (Vint Cerf, Mei Lin Fung, Ray Wang, Marci Harris, Stephanie Wander, Corina DuBois, and several other participants) is looking to hold a big event on 10 December 2018 focused on the “Unfinished Work of the Internet” in multiple areas, including digital inclusion, privacy, cybersecurity, the future of work, and several other important areas. As a new father, I regularly wonder what both the world and the Internet of 2030 will look like for my son.
Together we can find new ways of working and living in a changing world, and ways to still value both expertise and a diversity of perspectives in an information-rich era.
Two quick closing thoughts. First, as part of the talk I gave at Singularity University, I also asked the participants to engage in an interactive case study involving the impacts of a new technology on society (in this case, healthcare). I asked them to consider their (1) obligations to society, (2) potential biases, (3) proactive steps they should take, and (4) anticipatory safeguards they may want to put in place given the situation. Perhaps this framework could help guide future efforts as we, as a species, strive to figure out where we want to go and how we want to co-exist?
Second, I also raised a closing question for all participants, one I’ll also ask here: will the Internet reinforce global human rights? This is a central question for the decade ahead, and one that requires reflecting on the decade past to ask: “what could we do better?” There are some things we only learn can be improved through experience. We learn what works, what doesn’t, what we should do vs. not do, and what better social norms might look like from our experiences with new technological capabilities. With advances in the Internet and other areas, such as advanced data analytics and machine learning, the last decade offers several instances where we as a society are now realizing that these technological capabilities require a renewed emphasis on the Internet as a source of hope and human freedom. This will include sorting out the future of governance and the future of how we co-exist.
All of this means that positive #ChangeAgents willing to strengthen pluralistic, open societies are needed now more than ever. We have work to do.