How to Apply Evidence-Based Problem Solving to Improve the Outcomes of Your Projects

Dec 11, 2022

I’ve said it before: I’m not a fan of the expression “data-driven organization.” While we often hear that data-driven corporations outperform their peers, it’s a mistake to think that simply investing in data infrastructure and data acquisition will improve the bottom line. As I noted in a previous article, many data-driven companies end up data-rich but insight-poor.

And the same is true in business analysis. A substantial investment in acquiring and crunching data will not ensure that the business problem will be fully understood, that the constraints involved will be properly identified, or that we’ll be able to build a solution that is desirable, viable, and usable.

The reason why being data-driven can be problematic is simple:

Data is often presented in ways that fool us into thinking we know things that aren't true.

Consider this illustrative example from Twitter. (I decided to redact the name after realizing this wasn’t a joke, but rather one of several instances of the same account misinterpreting data.)

“Why no interest in long COVID in Japan or Sweden or Italy or Brazil?” Anyone capable of thinking analytically will arrive at a logical explanation for this unsurprising “trend.”

First, as even the author noticed, those countries are not part of the “Anglosphere.” If we wanted to analyze online search interest for “long COVID” in non-English-speaking countries, we’d have to combine the searches in English with searches in their local languages (for example, “COVID longa” in Brazilian Portuguese).
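To make the point concrete, here is a minimal sketch of what combining search interest across language variants might look like. All the numbers and the non-English search terms are hypothetical placeholders, not real Google Trends data:

```python
# Illustrative sketch with made-up numbers: to gauge interest in
# "long COVID" in a non-English-speaking country, the English term's
# relative search volume must be combined with local-language variants.
# (Real figures would come from a tool like Google Trends.)

search_volume = {
    "Brazil": {
        "long COVID": 12,    # English term (hypothetical value)
        "COVID longa": 58,   # Brazilian Portuguese variant (hypothetical)
        "COVID longo": 9,    # alternate spelling (hypothetical)
    },
    "Japan": {
        "long COVID": 8,     # hypothetical
        "コロナ後遺症": 64,   # a Japanese phrasing for COVID aftereffects (hypothetical)
    },
}

def total_interest(country: str) -> int:
    """Sum search volume across all language variants for a country."""
    return sum(search_volume[country].values())

for country in search_volume:
    english_only = search_volume[country].get("long COVID", 0)
    print(f"{country}: English-only={english_only}, combined={total_interest(country)}")
```

Looking only at the English-term column would dramatically understate interest in both countries, which is exactly the mistake the tweet made.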

Second, what evidence do we have that Google search volume is a good proxy for interest in a topic in every country? For instance, many of my friends in Brazil use Ecosia instead of Google so that their online searches can contribute to planting trees in the country to protect its endangered plants and animals. And it’s quite possible that there, and in many other countries, people prefer to rely on newsletters like Your Local Epidemiologist to receive science-backed information about the pandemic among other public health topics, rather than risking being misinformed by the results of a search engine.

The reality is that data without analytical thinking can be as bad as or worse than relying only on our intuition. Measuring things and gathering data is an excellent approach to reduce uncertainty and find patterns that result in new opportunities and better solutions to existing problems—but only if we understand the components of the evidence-based approach to problem solving.

The core elements of evidence-based problem solving

1. First principles thinking

First principles reasoning, the topic of my previous article, is where it all starts. What are we absolutely sure is true? What has been treated as truth without being proven? Only when we follow this line of questioning do we avoid the risk of making incorrect decisions based on ambiguous data or biased opinions.

2. Hypothesis testing

When we’re trying to solve a problem, some assumptions will represent core beliefs that must be true for our solution to succeed. For example, in a software development project, a core assumption is that the stakeholders have used objective evidence to reach the conclusion that building a custom solution is better than trying to mold an off-the-shelf application to the organization’s needs.

But not all project assumptions should be treated as a given. The biggest enemy of successful problem solving is confirmation bias: our tendency to pay attention to information that confirms our beliefs and ignore or downplay facts that contradict our ideas. To avoid missing key pieces of information, or interpreting subjective information in a way that favors what we want to believe, we must treat our unproven assumptions as hypotheses to be tested and, when possible, falsified.

Imagine that you work for a B2B company that sells a content management solution. The sales team has been asking for a collaborative editing feature to allow users other than the document creator to make edits. If you only look for supporting evidence, you’re likely to find “proof” that the feature is going to be popular with the user base.

But if the reason for building the feature is to help the business retain customers, collecting more data may help you falsify this hypothesis. Maybe customers are only interested in the capability if it can log the editor's name, and due to technical constraints, the proposed solution won’t allow that. Or maybe none of the customers threatening to leave for a competitor are interested in the collaborative editing feature, and delivering it will not increase the retention rate.
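One way to put such a hypothesis to the test is a simple two-proportion comparison: are at-risk customers actually more interested in the feature than everyone else? The sketch below uses entirely made-up survey counts and a hand-rolled z-statistic, purely to illustrate the shape of the analysis:

```python
import math

# Hypothetical survey counts: did at-risk customers (those threatening
# to churn) express interest in collaborative editing more often than
# other customers? If not, the "feature will improve retention"
# hypothesis is in trouble.
at_risk_interested, at_risk_total = 6, 40     # made-up numbers
other_interested, other_total = 45, 200       # made-up numbers

def two_proportion_z(x1, n1, x2, n2):
    """Z statistic for the difference between two sample proportions."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

z = two_proportion_z(at_risk_interested, at_risk_total,
                     other_interested, other_total)
# With these numbers, |z| < 1.96, so at the 5% level the data fail to
# show that at-risk customers especially want the feature.
print(f"z = {z:.2f}")
```

In a real project you would likely reach for a statistics library rather than computing the test by hand, but the logic is the same: look for evidence that could disprove the retention hypothesis before committing to build.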

Hypothesis testing doesn’t need to be a costly or lengthy process. In some cases, the information required to prove or disprove a hypothesis is already available (for example, data from a customer survey). In other cases, the process may consist of simply writing down your hypothesis, finding people to talk to, figuring out what you need to learn, devising questions to get you there, and scheduling your interviews.

If, while looking for evidence that validates or invalidates your hypothesis, you end up falsifying it, that’s a reason to celebrate. By analyzing the merit of proposed ideas and rejecting the ones that can’t produce real value, the business can free scarce resources to work on more promising initiatives.

3. Information value analysis

In every project, there will always be a degree of uncertainty. When building a new feature for a customer-facing product, how can we be sure that offering what the users are asking for will prevent them from leaving if a competitor creates a better alternative at a lower price? When replacing the company’s CRM tool, how can we know if the new vendor’s promises to close some capability gaps will actually fulfill the stated business needs?

In reality, if we try to test every possible hypothesis, gather evidence for even the most basic assumption, and weigh every possible outcome, we risk wasting time or getting into analysis paralysis. 

That's why we need to pay attention to the expected value of more information when looking for evidence to support project decisions. If we're fairly certain that the new CRM tool is the best alternative despite some capability gaps, the cost of more information is unlikely to justify seeking additional evidence to support the choice. On the other hand, if being wrong about a feature's ability to increase customer retention could have a sizable impact on the bottom line, the risk reduction benefits of more information may very well justify the extra time and effort to investigate further.
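The idea of "the expected value of more information" can be made precise with a standard decision-analysis quantity: the expected value of perfect information (EVPI), an upper bound on what any further research could be worth. The sketch below uses hypothetical probabilities and payoffs for the feature decision discussed earlier:

```python
# Illustrative sketch (all numbers made up): expected value of perfect
# information (EVPI) for the "build the retention feature" decision.
# EVPI = E[payoff if we could learn the truth before deciding]
#        - best E[payoff] deciding under uncertainty.
# A small EVPI means further investigation isn't worth much.

p_improves = 0.6  # assumed probability the feature really improves retention

# Payoffs in $k for each (decision, scenario) pair -- hypothetical.
payoff = {
    ("build", True): 500,    # built, retention improves
    ("build", False): -200,  # built, no retention gain (sunk cost)
    ("skip", True): 0,
    ("skip", False): 0,
}

def expected(decision):
    """Expected payoff of a decision under the assumed probability."""
    return (p_improves * payoff[(decision, True)]
            + (1 - p_improves) * payoff[(decision, False)])

best_without_info = max(expected(d) for d in ("build", "skip"))
with_perfect_info = (
    p_improves * max(payoff[("build", True)], payoff[("skip", True)])
    + (1 - p_improves) * max(payoff[("build", False)], payoff[("skip", False)])
)
evpi = with_perfect_info - best_without_info
print(f"EVPI = ${evpi:.0f}k")  # upper bound on what more research is worth
```

With these illustrative numbers, spending more than the EVPI on additional research would be irrational; spending much less to meaningfully reduce the risk of the $200k downside could be a bargain.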

# # #

Being “data-driven” doesn’t help create project success; being evidence-based does.

Evidence-based problem solving reduces the risk of blind spots and confirmation bias and increases the chances of achieving the desired outcomes. In high-stakes projects, risks can be dramatically reduced when a business analyst is willing to apply first principles thinking, hypothesis testing, and information value analysis to integrate the best evidence into the decision-making process.

Author: Adriana Beal 

Adriana Beal has been working as a data scientist since 2016. Her educational background includes graduate degrees in Electrical Engineering and Strategic Management of Information from top schools in her native country, Brazil, as well as certificates in Big Data and Data Analytics from the University of Texas and a Machine Learning Specialty certification from AWS. Over the past five years, she has developed predictive models to improve outcomes in healthcare, mobility, IoT, customer science, human services, and agriculture. Prior to that, she worked for more than a decade in business analysis and product management, helping U.S. Fortune 500 companies and high-tech startups make better software decisions. Adriana has two IT strategy books published in Brazil and work published internationally by IEEE and IGI Global. You can find more of her useful advice for business analysts at



Copyright 2006-2024 by Modern Analyst Media LLC