An Inflection Point in Our Cognitive Growth

Brian G Herbert
10 min read · Jan 9, 2023


We are adapting to a massive increase in the volume of data and the degree to which technology and automation are forced on us. It has consequences for our ability to think, remember, and progress…or stagnate if we simply defer analytical thought to others or to our machines!

Think for a minute what an immigrant faces if they don’t speak the language of a host country. That is how 2020 hit many people when suddenly forced to get everything done online. It's not surprising that we often instinctively go with the simplest, fastest approach.

The more we defer to intuitive conclusions, the more we create a world that is not based on merit or fair treatment of others. I see it in many areas of business, from the selection of customers to the selection of job candidates. The majority go with the flow of an increasingly caste-like system driven by algorithms about which most of us know little.

It’s too vague to simply say “we are facing accelerating change”. I first wanted to itemize the forces of change in our business and personal lives. Then I wanted to explain how these forces, particularly the information spike, can tend to cause us to defer to gut decisions when we should be more deliberate and analytical.

Defending our Right to Think in the Age of Automation
Analytical Thought: Use It or Lose It!

First, the major forces of change:

1. Loss of protected markets driving change in every industry. Even low-tech businesses face pressure to re-engineer or die. Open markets are a net positive (except for incumbents), except when they end up being dominated by one or a few aggregators (see next point).

2. Competition for talent is global in many professional categories. Covid-19 in 2020 also accelerated the trend towards remote work, further freeing up where workers can live. However, one trend against a more free labor market is the rise of aggregators. Where aggregators control access to many of the open positions, professionals tend to have fewer options and it is more difficult to differentiate in the market.

3. The accelerating pace of technology adoption. We must adopt new technology even when we doubt its utility or the implementation is poorly designed or buggy. Positives: convenience and faster transactions. Negatives: loss of control over privacy, loss of alternatives to bad online systems, and immature support for exceptions or “long tail” needs (current workflows often support only a narrow band of assumptions, with plans to expand later; only ‘elite’ customers have options for handling special needs).

4. The overwhelming volume of data/information. The challenge to individuals is determining the accuracy, objectivity, and importance of sources. I could see a growing market for applications like "NewsGuard", which establishes rules for scoring the objectivity and accuracy of websites that have news content. With apps like ChatGPT or LaMDA, it is even difficult to distinguish human-written content from machine-generated content! A secondary risk from data growth is the protection of rights to our data as well as privacy.

5. Rapid cultural shift re: our social interactions. It has happened so fast that we are trying to get a handle on its impact on how we stay in touch with friends, incidents of mass violence, political polarization, and ultimately how to have a happy, rewarding life. Has our shift online had a positive impact on those? If not, how do we adapt, and how much regulation do we need to protect us from us?

6. Starting around 1999, customers stopped being customers. We became a conduit for advertisers to get to our wallets and for all companies to get data about us. The “genius” model that brought us free Gmail, blogs, podcasts, and other content also turned us into something other than a true customer, something that has a bit of a lab-monkey feel to it.

Reality is a Hallucination!

Dr. Anil Seth of the University of Sussex is one of the foremost neuroscientists studying consciousness and our perception of reality. Seth argues that what we call reality is a “controlled hallucination”: the brain’s best guess, continuously adjusted by successive approximations from our sensory inputs (wow, that’s a mind-blower!).

We refer to intuition as our “gut” for good reason. During a stressful situation, such as under “fight or flight” conditions, the amygdala controls several aspects of the GI tract and we feel that constriction of our ‘gut’ and a call to action. Trouble is, our sensory inputs are simultaneously and autonomously constrained. So the more we act in this way, the less we are taking in objective data and the more likely we are to be misled by “tunnel vision”.

Amidst rapid change and increasing complexity, we crave the simple and unambiguous

I’ve observed our information overload increase over 20 years, and I’ve seen individual and cultural changes as we react. I needed to write an article about what I’ve learned and to remind others that the only way we truly cede control is when we fail to act with purpose and intention.

Amidst the pressure to make quick decisions based on too much or ambiguous information, we crave the simple and deterministic. It’s in our nature to seek mental shortcuts. We call these little “rules of thumb” heuristics, and their close companions are assumptions and biases.

Why do I include bias as a shortcut? One common cliché for the subjective mind is that it “sees what it wants to see”. This may be more accurately stated as seeing what requires the least effort to see. Prejudice flows downstream, in the same direction as the momentum of a biased person’s brain. It requires more effort to check impulses and force the mind to hold off judgment and evaluate evidence. Heuristics, assumptions, prejudice, and bias all originate from the same drive to conserve energy, the laziness we apply to get through a situation with the minimum expenditure.

The Assumption: Fast = proactive and competitive, Slow = opportunity cost and uncertainty

We admire quick and absolute, and we fear doubt. Fast is proactive, competitive, and successful. Slow accumulates opportunity cost and late-mover disadvantages. “Do it now then make adjustments” is the assumption, but rarely is the latter part reality. I want to point out that “Fast vs. Slow” is simply one dimension on which to describe the two most prominent methods of human decision-making. Another way to describe it could be “Shallow versus Deep”. This shifts the positive connotation to the latter of the two terms. It is hard to convince anyone that “Slow” thinking is preferable to “Fast”, but if I call it "Deep" thinking I might get somewhere 😂. Ah, the benefits of the time I spent in marketing!

Nobel Prize-winning Israeli scientist Daniel Kahneman’s book, Thinking, Fast and Slow, is a good place to learn about the differences between intuitive or “gut” decisions and deliberate, analytical decisions. University of Chicago’s Richard Thaler, as well as Kahneman and his former research partner Amos Tversky, are credited as founders of the field of Behavioral Economics. Through years of experiments involving game theory, these scientists identified characteristic heuristics and biases that have a considerable influence on human nature. One of the most interesting details I got from Kahneman and Thaler’s work is that level of education and economic or social status do not lessen the effect of what they termed “cognitive illusions” (if it were a system, we’d call it a “bug”), and can actually increase a person’s myopia.

We use different areas of our brain for Fast vs. Slow thinking, and as we train one or the other, we expand certain neural connections and even allocate a larger area of our brain to support it. The more we do anything, the more our brain adapts by ‘training’ new neural connections and expanding the grey real estate dedicated to it. Fascinating neuroscience stuff, but let’s skip it to look at the shortcuts employed for fast thinking: heuristics, assumptions, and bias.

A heuristic is a shortcut that has no empirical basis, but that we trust to provide us with an answer that will have a high correlation with the truth we seek. The rise of trust in “Experts” is a good example, as is trust in the wisdom of the crowd, or any of a number of other heuristics for selecting products or services. These heuristics give our brains a pass on collecting balanced information and trying to make an objective decision. We tend to emulate those we admire even in areas in which the famous person/celebrity/expert has no known qualifications. This heuristic is known by several names, but I prefer “the halo effect”.

Finding the Expert: the Athena Health Story

Michael Lewis, author of… well, every awesome non-fiction book of the past 30 years, covered the work of Kahneman and Tversky in his book, “The Undoing Project”. In season 3 of his podcast “Against the Rules”, he dealt with the expert problem and brought in Todd Park, co-founder of Athena Health, to tell a story about experts as it related to medical billing. Sounds riveting, doesn’t it? Yet the story about identifying the expert and realizing her importance to the mission, as well as the critical business message about the need to be ready to pivot and solve the most important problem, was fascinating.

Todd and his brother Eddie founded Athena to become a nationwide network of affordable, highly-effective prenatal care clinics. They started with a single clinic in San Diego County and quickly found out that getting paid by insurers for claims was the problem that would either save or sink the business. This crew of would-be world-slaying entrepreneurs found themselves working around the clock just to get paid and keep the doors open on their initial, Proof-of-Concept clinic.

They came to the realization that they needed an expert on claims processing, and it wasn’t going to be anybody who claimed to be an expert. The real expert, as Todd says, is typically an L6 in a hospital’s bureaucracy: six levels down from the chief executive, probably located in a basement office with no windows. While this person will not hesitate to speak up to anyone about billing matters, they don’t tend towards self-promotion. They found Sue Henderson, a twenty-year veteran of hospital accounting and administration in Massachusetts. Perhaps Lewis liked this story because of the parallels to Moneyball: false assumptions about the value of human assets lead to opportunities to turn an industry on its head and arbitrage unrealized value.

My point in telling the story of Athena Health is that the founders had to check their initial assumptions about the success factors for the business and the type of expert they needed. They learned the hard way that there were thousands of reasons an insurer could justify not paying a claim. In true "thinking slow" fashion (and as a great case study for Lean startup methodology), the business pivoted to bring in the best medical billing person they could find and build a business around solving the huge problem with slow payment of medical claims.

Even Simple Decisions have Four Possible Outcomes

When a machine or a person makes a prediction (let’s assume a binary one, as in True/False, Yes/No, Compatible/Jerk, Accept/Reject), there are two possibilities for the prediction and two possibilities for the actual result, or “truth”. That means we can commit two types of errors: False Positives (known as Type I errors) or False Negatives (known as Type II errors).

The "Confusion Matrix" for a Binary Classification

Anytime we make a binary decision, we run the risk of a false positive or a false negative. Heuristics and assumptions save a lot of time over carrying out an objective analysis, but they carry inherent biases, so it is important to weigh the time saved against the consequences of an incorrect decision.
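The four outcomes above can be tallied in a few lines. This is a minimal sketch (the function name and example data are my own, for illustration), counting true/false positives and negatives from a list of predictions against actual outcomes:

```python
def confusion_matrix(predictions, truths):
    """Tally the four possible outcomes of a binary prediction."""
    counts = {"TP": 0, "FP": 0, "FN": 0, "TN": 0}
    for pred, truth in zip(predictions, truths):
        if pred and truth:
            counts["TP"] += 1   # predicted yes, actually yes
        elif pred and not truth:
            counts["FP"] += 1   # Type I error: false positive
        elif not pred and truth:
            counts["FN"] += 1   # Type II error: false negative
        else:
            counts["TN"] += 1   # predicted no, actually no
    return counts

# Example: four hiring decisions vs. how the hires actually worked out
preds  = [True, True, False, False]
truths = [True, False, True, False]
print(confusion_matrix(preds, truths))
# → {'TP': 1, 'FP': 1, 'FN': 1, 'TN': 1}
```

Note that the two error cells carry different costs: a bad hire (false positive) and a missed great candidate (false negative) are rarely equally painful, which is exactly why the time saved by a heuristic has to be weighed against which error it tends to produce.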

Take recruiting for example, where our own biases can have a large influence on the outcome. Due to a phenomenon called attribution bias, we tend to interpret the behavior of others in the context of our own reality. Going back to Dr. Seth's definition of reality as a very subjective hallucination, this means that much of the attribution we do with others will be wrong. This is a good reason to have multiple people involved in recruiting decisions and to try to assess abilities with objective criteria.

In “The Alignment Problem: Machine Learning and Human Values” by Brian Christian and “Weapons of Math Destruction” by Cathy O’Neil, the authors detail powerful stories where we ceded trust to data analysis and prediction algorithms only to realize we underestimated their potential for bias. The first sets of word vectors trained on huge corpora of human-generated text presented a quandary: a long history of prejudice in human writing seeped into the vectors produced by these machine-learning models. The algorithms weren’t biased, but did that make their biased output OK? Remaining vigilant is a good reason to keep our analytical brains sharp!
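How does prejudice “seep into” word vectors? A word’s vector ends up geometrically closer to the words it co-occurs with in the training text, and similarity is typically measured by cosine distance. The toy sketch below uses made-up 3-dimensional vectors (real embeddings have hundreds of dimensions and are learned, not hand-written) to show how a skew in the source text becomes a measurable skew in the geometry:

```python
import math

def cosine(u, v):
    """Cosine similarity: 1.0 for identical directions, ~0 for unrelated."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

# Hypothetical vectors: if the training text mentions "doctor" near "he"
# more often than near "she", the learned geometry reproduces that skew
# with no intent on the part of the algorithm.
vectors = {
    "doctor": [0.9, 0.8, 0.1],
    "he":     [1.0, 0.7, 0.0],
    "she":    [0.2, 0.9, 0.8],
}
print(cosine(vectors["doctor"], vectors["he"]) >
      cosine(vectors["doctor"], vectors["she"]))
# → True: the historical bias is now a measurable property of the model
```

The point is that the training procedure is neutral; it faithfully compresses whatever associations, fair or prejudiced, exist in the corpus.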

I recently heard Daniel Kahneman on a podcast where he said it is unreasonable to think that our assumptions won’t factor into things like hiring decisions, but he recommended holding off instincts long enough to collect balanced data and make a logical analysis. After that, you can decide as you will, but at least you have given both your fast and slow processes a chance to weigh in!

If we get systemically swayed, then arbitrage opportunities will be created. Are we likely to be swayed more often in the future, potentially by biased, automated prediction machines? Who will benefit from that, and who will suffer, and how can we stay alert and at the top of our decision-making game!?

Written by Brian G Herbert

Award-winning Product Manager & Solution Architect for new concepts and ventures. MBA, BA-Psychology, Certificates in Machine Learning & Big Data Analytics