
    Challenged Election Results and Confirmation Bias


    Two massive dangers can turn analytics from an incredibly useful tool into a substantial liability: data corruption and confirmation bias. We saw both start to play out over the weekend when U.S. President-elect Donald Trump tweeted that he would have won the popular vote had it not been for voter fraud, and the Clinton camp, which had been arguing that Trump’s claims were unfounded, moved for a recount. What seems very strange is that Russia was clearly trying to influence the election through fake news, which we used to call propaganda, and we also know, given a long list of breaches, that the aging U.S. electronic voting infrastructure is badly out of date and insecure.

    In other words, assuming the results weren’t tampered with, regardless of the outcome, would be foolish. Yet only Trump called out the problem before the vote, and once he won, the two camps (at least initially) switched sides. I think this is a good showcase for data corruption and confirmation bias, which can render any analytics tool less than useless, because it then simply becomes a stronger justification for the wrong decision.

    Garbage In, Garbage Out

    Garbage in, garbage out was one of the first sayings I learned when I was studying programming. It basically argues that it doesn’t matter how good the tool is; if the raw material is worthless, the result will be equally worthless. This is why it is important to validate the data source, which wasn’t done adequately during the election. That led to the better-funded organization making a critical strategic error and losing badly. The Democrats went into the election believing they could not lose, and they lost. Ironically, almost exactly the same thing had happened to the Republicans in the prior election, when Mitt Romney’s bad analytics led him to believe he was comfortably ahead when he was actually behind.
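
    The same principle applies to any analytics pipeline: check the raw material before the tool ever sees it. Below is a minimal sketch of that idea in Python, assuming a hypothetical polling dataset; the file name, column names, row-count minimum, and non-response threshold are illustrative assumptions, not anything taken from the campaigns’ actual tooling.

        import pandas as pd

        REQUIRED_COLUMNS = {"respondent_id", "region", "response"}

        def validate_survey(df: pd.DataFrame, min_rows: int = 500) -> list[str]:
            """Return a list of data-quality problems; an empty list means no obvious garbage."""
            problems = []
            missing = REQUIRED_COLUMNS - set(df.columns)
            if missing:
                problems.append(f"missing columns: {sorted(missing)}")
                return problems  # the remaining checks need these columns
            if len(df) < min_rows:
                problems.append(f"sample too small: {len(df)} rows (fewer than {min_rows})")
            if df.duplicated(subset=["respondent_id"]).any():
                problems.append("duplicate respondents detected")
            null_rate = df["response"].isna().mean()
            if null_rate > 0.10:
                problems.append(f"non-response rate {null_rate:.0%} exceeds the 10% threshold")
            return problems

        # Usage: refuse to run the analysis until the input passes the checks.
        issues = validate_survey(pd.read_csv("poll_results.csv"))
        if issues:
            raise ValueError("Refusing to analyze questionable data: " + "; ".join(issues))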

    But one of the big reasons people don’t challenge bad data is confirmation bias.

    Confirmation Bias

    Confirmation bias is when you form a position and then selectively look at information that supports the position you’ve already taken. For an analyst, this means taking a position first and then doing the research, rather than the other way around. In this election, we had the media (which was the unofficial validation body for the polls) and the Democratic Party convinced there was only one right answer, and that answer was that Hillary Clinton would win. So even though clear anomalies had been caught during the Republican primary polling, and there were clear indications that Trump supporters were declining, in large numbers, to be polled, the results showing Clinton with a comfortable lead were accepted.

    Then, when the results of the election varied massively from the polls, rather than triggering an automatic audit, the analysts who ran the polls moved to defend their bad results. When two data sources that are both believed to be accurate don’t agree, either could be wrong; in fact, both could be wrong. You’d be best served to audit the one you will rely on, because if it is wrong, you’ll make a bad decision.
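
    That audit step is easy to mechanize. Here is a minimal sketch, again in Python, of comparing two sources and flagging the gaps large enough to justify a closer look; the region labels, the numbers, and the five-point threshold are invented for illustration and are not actual polling or election figures.

        def flag_for_audit(predicted: dict[str, float],
                           observed: dict[str, float],
                           threshold: float = 5.0) -> dict[str, float]:
            """Return the regions where the two sources diverge by more than `threshold` points."""
            return {
                region: round(observed[region] - predicted[region], 1)
                for region in predicted.keys() & observed.keys()
                if abs(observed[region] - predicted[region]) > threshold
            }

        # Made-up numbers, for illustration only; any non-empty result should
        # trigger an audit of whichever source the decision will actually rely on.
        gaps = flag_for_audit(
            predicted={"Region A": 46.0, "Region B": 45.0, "Region C": 47.0},
            observed={"Region A": 52.5, "Region B": 50.1, "Region C": 48.2},
        )
        print(gaps)  # e.g. {'Region A': 6.5, 'Region B': 5.1} (order may vary)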

    In the end, what should happen after the analysis is a significant revision of the polling and voting process to ensure a timely and accurate result, but that seems unlikely at the moment because of an excessive focus on blame rather than on problem resolution.

    Wrapping Up: Challenge All Results

    Elections are fun, but I’ve seen a number of companies fail because they relied on bad customer satisfaction data, bad competitor intelligence, bad win/loss analysis, and bad employee morale scores, all of which had been altered to give managers the results they wanted to see (and that supported their bonuses) rather than the results they needed to see. One large company I follow is aggressively hiding from what appears to be a complete meltdown of its customer base, and I’m not sure if it is oblivious or has simply given up and is hoping things will automatically turn around.

    If you get accurate results, you make decisions based on fact, but if your results are compromised, that may lead to decisions that could eliminate your firm. Currently, people tend to challenge results only when those results tell them something they don’t like. If you want to be successful, you should challenge the results you like as well, because they could be covering up an avoidable disaster.

    Rob Enderle is President and Principal Analyst of the Enderle Group, a forward-looking emerging technology advisory firm. With over 30 years’ experience in emerging technologies, he has provided regional and global companies with guidance in how to better target customer needs; create new business opportunities; anticipate technology changes; select vendors and products; and present their products in the best possible light. Rob covers the technology industry broadly. Before founding the Enderle Group, Rob was the Senior Research Fellow for Forrester Research and the Giga Information Group, and held senior positions at IBM and ROLM. Follow Rob on Twitter @enderle, on Facebook and on Google+.
