We live in a world in flux. The past is no longer a reliable guide to the future, so we need to process a lot of new information from new sources in order to survive and thrive. Our filters are due for replacement.

That’s proving difficult.

We all possess tunnel vision. Any new information about the world must pass through several filters before it can even reach us. And it must pass through several more filters before we’ll accept it.

Some of these filters are new. In the analog, mass media age, editors and other gatekeepers decided the news that was “fit for print” or the ideas worth stocking on bookshelves. In the digital, social media age, we’ve offloaded a lot of that gatekeeping to computer algorithms. Most of them are trained to learn our preferences and selectively expose us to more of the ideas we like and fewer of the ideas we ignore. This has a feedback effect, which by now is well-documented: the more we consume the algorithm’s choices, the stronger and narrower our initial preferences become.
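To make that feedback loop concrete, here is a minimal, purely illustrative simulation. The topic names, engagement model and update rule are all assumptions chosen for brevity; no real platform works this way in detail. A recommender shows more of whatever the user engages with, engagement tracks the user’s current leanings, and a small initial tilt snowballs.

    import random

    # Purely illustrative sketch of the feedback loop described above.
    # The topics, engagement model and update rule are assumptions chosen
    # for brevity -- no real platform works exactly like this.

    TOPICS = ["politics", "sports", "science", "art", "travel"]

    # The algorithm's estimate of how much the user likes each topic.
    estimated = {t: 1.0 for t in TOPICS}

    # The user's true leanings, with one small initial tilt.
    true_pref = {t: 1.0 for t in TOPICS}
    true_pref["politics"] = 1.2

    def recommend():
        """Pick a topic in proportion to the algorithm's current estimates."""
        topics, weights = zip(*estimated.items())
        return random.choices(topics, weights=weights, k=1)[0]

    for _ in range(5000):
        shown = recommend()
        # Engagement is more likely for topics the user already leans toward.
        if random.random() < true_pref[shown] / sum(true_pref.values()):
            estimated[shown] += 0.1   # the algorithm doubles down...
            true_pref[shown] += 0.01  # ...and the user's own taste narrows

    print(sorted(estimated.items(), key=lambda kv: -kv[1]))
    # The small initial tilt toward "politics" typically snowballs until it
    # dominates both the feed and the user's preferences.

The point of the sketch is only that the narrowing needs no malice: a recommender that faithfully learns from engagement amplifies whatever tilt it starts with.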

Some of our filters are ancient. Pascal Boyer, an evolutionary psychologist, notes that in nature, many species are locked in “an arms race between deception and detection.” Accomplished liars, like the mimic octopus, thrive by duping other animals into believing their deceits. So humans, like other animals, have developed strong defences against the threat of manipulation. (Our tendency to stick with what we already believe protects us from being taken in by every skilful swindler who comes along.)

Our tunnel vision isn’t a flaw; it’s a feature.

But it’s a feature in sore need of an upgrade.

There are at least three good reasons to believe that performing this upgrade would improve our vision.

First, to counter-balance the echo-chamber effect of algorithms upon our existing beliefs, attitudes and behaviors.

Second, to help us adapt to the more complex systems we’re now living in. In the past, we were able to use our powers of tunnel vision to create highly efficient explanations that worked. We built simple models that only paid attention to a small set of information but nonetheless generated a good understanding of what was happening to us and why. These tools were created for a more stable, more linear world, when the past was a reliable guide to the future.

They’re less useful today:

“In recent years we have seen economic forecasts misfire, political polls turn out to be wrong, financial models fail, tech innovations turn dangerous, and consumer surveys mislead.”
- Gillian Tett

In her book Anthro-Vision, Gillian Tett, US managing editor of the Financial Times, describes how simple models assume away the real world’s complexity in dangerous ways.

Third, we need to upgrade our filters because information is a very different species today than it was eons ago when our filters were installed and our intuitions evolved. In our analogue past, a lot of bad information was filtered out for us by the harsh conditions we lived in. Information decayed. It cost scarce resources to store and maintain. It took time and energy to spread. Spreading harmful information could lead to exile or death. Chances were, if news managed to reach you at all, it had some value in the struggle for survival.

Today those properties are reversed. Digital information is permanent. It flows instantly, freely, from many to many. The reputational rewards for saying bad things outweigh the costs. Logically, the fact that information reaches us, the fact that it spreads, the fact that it persists, should tell us nothing about its quality. Yet we still treat it as if it does.

Tett concludes that we need to widen our “tunnel vision” and develop “lateral vision” instead. It’s an ambitious call-to-action, but performing this upgrade may not require a radical overhaul. A big part of the shift from tunnel vision (which sounds bad!) to lateral vision (which sounds much better!) is probably a shift in attitude: less hostility, and more curiosity, about the ideas that never get through to us or aren’t welcome if they do.

On that curiosity-driven note, let’s tour the filters that form and maintain the walls of our tunnel vision. Not so we fix them or knock them down, but so we can play with them more, climb over some of them, and begin to discover what our lateral vision might reveal.

Your beliefs have three layers of protection

Over the past several years, a rich but confusing list of biases has burst into popular media and public awareness. One way to organize thinking about our various information filters is to arrange them into three concentric circles:

  1. Cognitive filters that we generate ourselves,
  2. Group filters that are generated by the company we keep, and
  3. Collective filters that are generated by our culture and society.

Each level offers distinct opportunities to broaden our beliefs, attitudes and behaviors.

[Diagram: three concentric circles of filters – cognitive, group and collective]

Layer 1: Protective cognition

Our cognitive filters are easiest for us to adjust. (Hence, they get a lot of attention in the self-development and self-improvement literature. In that literature, they are often framed as flaws to overcome. That’s ironic, because the research suggests we’d respond to this information about ourselves much better if we thought of these filters as features instead…)

One of the core insights into the power of our cognitive filters came from research done by Leon Festinger and his team in the 1950s. They infiltrated a doomsday cult of believers whose core belief was proven false – and yet they kept believing. Their leader, Dorothy Martin, claimed to be in contact with aliens (The Guardians) who had promised to swoop down and collect Dorothy and all true believers on a specific day, December 20, 1954 – just in time to spare them from a cataclysmic Flood scheduled for the next day.

The aliens did not come. And indeed, the Flood did not come either. Aha! Proof, Martin claimed, that their faith had prevented calamity!

The cult’s casual members drifted away, but its most fervent adherents became even stronger believers. Why – when their founding belief had been proven false? Cognitive dissonance was Festinger’s answer.

How we process information is not rational; it is motivated.
- Leon Festinger

Our powers of protective cognition consist of two major faculties:

Motivated reasoning is the search for new information or explanations that confirm our beliefs. The Flood did not come, therefore our faith must have worked! We develop a filter – a confirmation bias – to help us find more reasons to validate the cognitive commitments we’ve already made to ourselves.

Belief perseverance is the search for reasons to reject any new information that might undermine our beliefs.

Skeptic: “These aliens you claim contacted you…They never contacted any of us!”
Believer: “Of course not. None of you are believers!”

The social psychologist Tom Gilovich put it this way:

“For desired conclusions, it’s as if we ask ourselves ‘Can I please believe this?’ But for unpalatable conclusions we instead ask ourselves, ‘Do I have to believe this?’…

People come to some information seeking permission to believe, and to other information looking for escape routes not to.”

Our powers of protective cognition are extraordinary. We can fabricate false memories – and make ourselves forget they’re false. We can build palaces of explanation that defy gravity. Undermine their central pillars, and they still stand. You can see this feat unfold clearly in jury studies:

1. Eyewitness testimony causes a juror to believe “The defendant is guilty.”  
2. The eyewitness’s testimony is later shown to be a lie.
3. The juror still believes the defendant is guilty, even though the evidence no longer supports that belief.

Why? One explanation is that between steps 1 and 2, the juror develops all sorts of additional justifications in their own head for why they believe what they believe about the defendant. Debunking the original cause of our belief doesn’t debunk all of the derivative rationales we've developed in the meantime. (This finding has important implications for democracy, political scientist Brendan Nyhan points out: The more we’re confronted about our beliefs and forced to justify them, the more ingrained they might become.)

We use these mental powers to protect ourselves from moral and ethical distress, from embarrassment, and from conflict with our group (more on that later).

Which is not to say that all our cognitive filters are irrational. We are also deeply motivated to be accurate. We need an accurate map of the world to survive. But this accuracy motivation has its own “features.” Economists today talk about bounded rationality. We are not the perfect utility-maximisers described in Economics 101 textbooks. In real life, any rational search we make is bounded by practical constraints. We have only so much time and energy to go out and gather firsthand the beliefs, attitudes and behaviors we need to thrive in our world. (Few of us have done our own experiments to demonstrate that the Earth revolves around the Sun.)

So most of what we believe comes to us secondhand. Others say things, and then we filter what they say by who we trust, by who we look up to as role models (which in turn depends on what we value), and by what we are incentivized to care about most. In short, a lot of factors turn each person’s rational search for accuracy into a quite personal journey.

Even the information we gather for ourselves firsthand is warped (or enriched?) by our own faulty reasoning. Our minds trip us into false beliefs if we’re not careful. For example, the regression fallacy can lead us to see a causal relationship where none exists. A classic case involves speed cameras:

1. There’s a sudden, unusual spike in the number of accidents on a stretch of road.
2. In response, a speed camera is installed.
3. The number of accidents returns to the long-term average.

Did installing the speed camera cause the number of accidents to go down? Or was it going to go back down anyway?
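The trap here is regression to the mean: an unusually bad year is usually followed by a more ordinary one, camera or no camera. A small simulation makes the point; the accident rates and the spike threshold below are invented for illustration, not drawn from real traffic data.

    import random

    random.seed(42)

    # Accident counts fluctuate randomly around a fixed long-term mean of
    # about 10 per year. The "camera" in this sketch has no effect at all --
    # it is simply installed after any unusually bad year.

    def accidents_per_year():
        # Binomial noise around the long-term mean (roughly Poisson-shaped).
        return sum(random.random() < 0.10 for _ in range(100))

    cameras, drops = 0, 0
    for _ in range(100_000):
        this_year = accidents_per_year()
        if this_year >= 15:                       # a spike triggers a camera
            cameras += 1
            if accidents_per_year() < this_year:  # next year, same process
                drops += 1

    print(f"{cameras} cameras installed; accidents fell the next year "
          f"{100 * drops / cameras:.0f}% of the time")
    # Accidents fall after almost every installation, even though the camera
    # changes nothing: extreme years are usually followed by typical ones.

None of this proves that real speed cameras are useless; it only shows why “the numbers went down after we acted” is weak evidence on its own.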

What is often called “cognitive bias” spans a wide range of astonishing things each of us does with new information, mostly without realizing it, and mostly for our own protection. Our guts host a biome of micro-organisms that break food down into the nutrients our bodies find useful (and excrete the rest). Our brains do something similar with the information we ingest.

Layer 2: Tribal defences

Another factor that helps us form our tunnel vision is the company we keep. As Festinger found out in the 1950s, a lot of our motivation to filter information has to do with protecting our status as a member of the group. Will believing this new information strengthen or put at risk my sense and status of belonging?

Our beliefs, attitudes and behaviours are tribal. And throughout most of our species’ history, maintaining our status and membership in the group has been far more important to our own survival than knowing the truth. From an evolutionary perspective, “true” and “false” beliefs are not merely rational or empirical things. They also serve a vital social function, with high-stakes consequences. They signal who’s in the group and who’s not – a key distinction when your group is competing for scarce resources against other groups.

Do you believe in climate change? For us as individuals, the most immediate consequence of answering this question Yes or No is going to be social: who you attract and who you repel by wearing that belief on your sleeve or on your social media feed.

It’s easy to scan the world today and see the influence that this cognitive tribalism has upon us. Julie Beck, a senior editor for The Atlantic, asked eminent social psychologist Carol Tavris, co-author of Mistakes Were Made (But Not by Me): “What would get someone to change their mind about a false belief that is deeply tied to their group identity?”

“Probably nothing,” Tavris replied. “I mean that seriously.”

One very clear study of cognitive tribalism involved a 2014 US national survey of science literacy. When asked, “True or False? Human beings, as we know them today, developed from earlier species of animals”, 55% of Americans surveyed answered True. But when asked, “True or False? According to the theory of evolution, human beings, as we know them today, developed from earlier species of animals”, 81% answered True.

Adding that one clause added 26 percentage points to Americans’ scientific literacy score. Why? The first version of the question included a hidden test of identity. Many US religious groups reject the theory of evolution as an article of faith. So the question behind the question was: Whose team are you really on? The second version of the question was a purer test of knowledge.

We’re intelligent animals who can wear information like a team jersey to signal which side we are on. This signalling power may also explain the popularity of conspiracy theories in today’s complicated media environment. Conspiracy theories are a highly efficient way to perform an extremely valuable, but often expensive, search: Who can I really trust? If you are willing to trust me on this, then I’m also willing to trust you, and atop that strong trust foundation we can do things in our group and as a group that outsiders couldn’t imagine...

The benefit of cognitive tribalism is strong group cohesion. The cost can be groupthink: hostility to expressing or even hearing outside views. The tribe has a harder and harder time hearing accurate information, sometimes to the point of self-destruction.

When the social motive becomes the sole motive, then the same filters that aid group survival can threaten it. (The expression “to drink the Kool-Aid” dates back to an infamous, extreme episode of groupthink in 1978, when over 900 members of a religious cult drank poisoned punch and died at the command of their leader, Jim Jones.)

More ordinary forms of groupthink abound. Every group and organization has a culture that is more receptive to some beliefs, attitudes and behaviours than others. Whenever we ourselves don't agree or don't align with the group, but we don't dissent either because we want “to get along,” we are strengthening the filtering power of the company we keep.

Layer 3: The weirdness of culture

The broadest limit on our vision is the hardest to perceive. And yet it is there, walling us off from other ways of seeing. This is the collective tunnel of the culture we’ve grown up in.

Anthropologists, who make it their business to feel the far walls of our cultural tunnels, often describe Western, educated, industrialized, rich and democratic societies as WEIRD societies. That label is a helpful starting point for anyone in any society. It helps to think that there is something fundamentally weird about how I, as a member of my culture, see the world. It inspires curiosity to explore what makes me weird and to discover the other weird ways that other societies see their world.

Some of our deepest, most intuitive and embodied filters can only be found by exploring what’s being generated at this cultural layer. Our ear for music is weird, for example. Arrangements of notes that Western ears find jarring sound just fine to people who haven’t grown up listening to Western music.

Another weird, cultural thing is how we resolve tensions between the welfare of the individual and the welfare of the wider group. We may not realize how Greek we really are. Or how Confucian we really are. It's as if someone plucked a harp string 2,000 years ago, and the echo of that chord still resonates and reverberates through the air we breathe – even though the original harp is long gone.

Global pandemics offer a rare opportunity to see some of these deep differences plainly. If your civilization’s experiment is to pursue individual liberty as the highest good, government can suspend some freedoms for the sake of public health – but only at great cost, great conflict, for a limited time and with limited enforcement measures. If instead your experiment is to create harmony on earth, government can go to far greater lengths to root out the causes of disease and discord as it sees them.


Opportunities for Action

One humbling take-away from any tour of our cognitive filters is how warped our perspectives all must be. Today it’s common to talk about the information environment as “post-truth” or “post-fact.” These phrases imply a past golden age of information purity that never existed. We have always filtered and twisted. We have always lived in our own information silos. In the past, those silos were more geographical in nature. News was more local.

Now, the silos are more identity-based – which is different, but not altogether new. The one-to-many, mainstream media model of the mid-20th century, when a handful of editors supplied common information for everyone’s consumption, was the anomaly, not the norm.

The second humbling take-away is how much more we might see, by playing with our filters.

→For Individuals
Improved self-awareness

We can start by recognizing more of the cognitive, company and collective filtering that shapes our own beliefs, attitudes and behaviors.

When new information crosses our view, do we notice our own motivated reasoning, belief perseverance and cognitive tribalism at play? New technologies may be a good place to start noticing your beliefs. Blockchain, AI, nanotechnology, synthetic biology…Most of us don't know much about them. But in this techno-abundant age, all of us are predisposed to believe certain things about them. Are you hunting for more evidence that irresponsible profit-seekers are wilfully blind to the unintended consequences of their actions? Are you hungry for new evidence that human ingenuity will innovate solutions to all our problems if we just let it do its thing?

Self-awareness is often easier to grow in groups, so find or form a good group to go exploring with. Of course, this works better if the group contains cognitive diversity — people with different beliefs, attitudes and behaviors.  

Improved empathy

After touring the many ways we warp the world around us, it is much easier to appreciate how wide-ranging other people’s own relationships with “the mainstream” can be. In the 1930s, sociologist Robert Merton drew a map of the different relationships we tend to form with mainstream society’s goals (what people strive for) and means (how people strive for them). He called it his “Deviance Typology”:

[Diagram: Merton’s Deviance Typology – accepting or rejecting society’s goals and means]

People who accept society’s goals but not the means Innovate. People who reject the goals but accept the means Go Through The Motions. People who accept both Conform. People who accept neither Retreat. Among those who retreat, some will Rebel. They pursue new goals with new means.

It's a useful framework today, when so many of the means and goals of society are being questioned. When common values and common meanings are no longer commonly understood or accepted, a lot of people feel energized by the opportunity to find new values and new meanings. And a lot of people feel lost and empty.

By playing with your own filters, you can relate to people in all these boxes.

→For organizations
Improved diversity-driven outcomes

For organizations, the obvious opportunity is to get to know your cognitive tribalism much better. It’s now common wisdom that diverse teams enjoy a “diversity bonus.” More and more hiring and promotion practices are being reinvented to improve diversity and thereby unlock more creativity and dynamism.

The next conversation companies need to have is “What counts as diversity?” A multicultural cohort of Harvard MBAs is diverse in some ways but not in others. Explore your own group’s diversity along the three dimensions of cognitive difference:

  1. Attitudes: How do people evaluate X? (“I like you.”)
  2. Beliefs: What attributes or outcomes do people associate with X? (“I think you’re a good collaborator.”)
  3. Behaviors: What actions do people take in relation to X? (“I often look to you when starting up a new project.”)

Another clear opportunity for organizations is to ignite fresh curiosity about the customers, markets and other things you think you understand already.

For example, it's a common notion in the auto industry that “millennials do not buy cars.” They’ve all grown up with Uber and shared scooters and they’re all environmentally conscious urbanites who are happy to trade away the convenience of car ownership to promote other values. That belief completely ignores the rich range of cultural, company and cognitive filters through which individuals evaluate, rationalize and adapt to the moral dilemmas that new situations present in real life. Since the pandemic, the number of driving tests taken by younger people has risen rapidly. Private transport revealed a big advantage over public transit – social isolation. The market for used cars boomed among millennials, even though these are typically the vehicles with the worst fuel economy and emissions.

→For Society
Lateral learning

For us as collectives and civilizations, the opportunity is to look outward.

"The single most important human insight to be gained from comparing societies is perhaps the realization that everything could have been different in our own society – that the way we live is only one among innumerable ways of life which humans have adopted."

- Thomas Hylland Eriksen, anthropologist.

The pandemic is a rare and concrete moment for cross-cultural learning. We all encountered the same threat. Right now, we are still bent on judging our own governments – what they did right or wrong, how well or badly the public health system performed under crisis, etc. Let’s start shifting energy into learning how different countries did things differently and what the consequences were. It can be the first in a brave new series of cross-cultural projects to broaden our thinking and responses to other global challenges: inequity, social cohesion, racism, climate change.

These projects require courage. World events in the 20th century helped us to expand our tribalism to a civilizational scale. The same risk of exile-from-the-group that prevents my making friends with people on the other political team makes it a risky business to empathise with how people over there see us.

The spaces where “competitors” can meet without retribution from their own groups are hard to find and hard to create.

But the rewards are great. Embracing diversity is the key to dynamism, creativity and resilience. This pandemic is a golden opportunity for everyone to explore the truth of that statement.

Dare we let it go to waste?