If we accept argumentative theory and confirmation bias as facts, and I believe both theories are representative of how we approach problem solving, then many of us will rightfully view analytics as more of a problem than a solution.
Argumentative theory suggests we use the power of argument to dominate and improve our status, and that we are hardwired for this because higher status equates to better mates and a better lineage.
Confirmation bias suggests we ignore anything we don't already agree with and tend to form opinions very quickly, often before we have had time to digest the underlying data. To net this out and connect the dots: We are hardwired to take positions that have a high probability of being wrong, and of looking stupid when measured against objective analysis. In other words, it is more important to dominate and appear right than to actually do the right thing, while analytics, if done correctly, always points to the best answer.
In short: Analytics works against our God-given right to be stupid, and we are hardwired to do stupid stuff. Let's explore this in the context of the Newtown school shooting, which has captured near-global interest and polarized the two sides of the gun-control debate.
Facts
My sources of facts are here, from the Huffington Post; here, from Wikipedia; and here, from the Lawyer Herald, which showcases that the motive was not clear but might have been tied to an attempt to have the shooter committed. I did look at some of the "Truther" information, which is kind of fascinating and, if prevalent enough, could actually corrupt the analysis here. Salon does a nice job of breaking down the false information here, but this showcases how data corruption can take place, and the debunking effort won't get the same attention as the "Truther" effort.
The shooter may have been mentally ill, but he was never classified as such, and there is an unconfirmed story that his mother was trying to have him committed. The shooter did not own any of the guns he used. He did play violent video games, but no evidence was found (he did destroy his PC) that he had planned the shooting.
The sequence of events has the shooter shooting the guns' owner (his mother), then proceeding to the school, where he shot through the protective door and committed the additional deadly crimes.
So we have a young, untrained shooter with guns he didn't purchase and a clear intent to kill everyone in the school, and what prevented that outcome was the sound of a siren indicating police were coming. The report indicates the shooter shot himself when he became convinced that he would soon be captured.
Analysis
Looking at this as a security analyst would, no amount of additional background checks would have prevented the shooter from getting the guns unless those checks prevented the mother, who was not mentally ill, from getting them in the first place. In fact, in the case of stolen guns, as long as there are guns to be stolen, access to them remains likely. Had the assault rifle not been available, he would still have had two pistols and a shotgun.
Recommendations based on this attack profile would have included hardening the door against weapons available to normal attackers and a faster police response (likely tied to a perimeter system of cameras looking for suspicious people approaching the school and/or attempting to break in).
Note that this specific proposal doesn't appear to be coming from either side. The pro-gun side wants to use the event to arm more people, and the anti-gun side wants certain guns eliminated and background checks intensified. The first seems unwise because it raises the potential for a gunfight at a school, and the second doesn't appear related to this particular event.
Analytics
Now, in the end, another event like Sandy Hook is unlikely, just as it is unlikely we will ever have another 9/11; subsequent attacks tend to vary greatly from each other. This suggests that if we really want to fix the "problem," we need to analyze attacks against soft targets, both domestic and foreign, and look for common patterns to develop a strategy that will prevent or at least mitigate the next attack.
The only proposal on the table to do that is the administration's proposal to allow the Centers for Disease Control to analyze patterns of attack and, much as it recommends a response to an outbreak of illness, recommend a viable solution. This is being fought because the other side recognizes that the side with the detailed studies is more likely to win, and it apparently doesn't want to fund detailed studies itself. If it did, the likely result would be corruption of the study process, which could occur on either side, and both sides, regardless of whether that corruption actually happened, would likely conclude that the other had introduced bias.
In short, the NRA is against this move because it believes the government's agenda runs counter to its own primary (pro-gun-sales) agenda. Its position is validated by the initial attack on gun owners' rights, which was put forth before the analysis was actually approved or done.
Wrapping Up: To Work, Analytics Has to Address Real Human Concerns
If we believe anyone who disagrees with us is biased and that biased people will corrupt studies to prove us wrong, then analytics are doomed to fail even if they are accurate. Belief plays a massive role in making an outcome achievable. If I don’t trust you to be unbiased, then I’ll never trust your data.
This suggests that the first step in an analytics project is to establish and reaffirm trust that the effort will provide the correct answer. The second step is to find and either remove or reduce the influence of those in power who will unreasonably object to any outcome. It follows that a process that first establishes analytics in areas where there is little conflict, to build trust, and then moves laterally into other, more conflicted areas will be more successful.
I'll leave you with a recollection. I helped run a team in IBM that tried to spin out the software portion of the business as a separate company. The analysis, and we did a ton of it, suggested that the software company's valuation, once it was up and running, would have substantially exceeded what IBM was then worth, and that the resulting company would have been far more powerful against Microsoft. But power resided with the hardware side, and we hadn't taken that into account. Even had we realized early on that we needed to convince hardware of this move, it still wouldn't have happened. As a result, we spent millions on analysis that was damn good but still wasted.
You just can’t avoid the human element as much as you’d like to.
For instance, and back to Sandy Hook: If the analysis said that the most effective protection was to arm and train teachers (a likely outcome), I doubt the administration would release it, and if it concluded that cities that banned guns entirely were safer (also a likely outcome), the NRA would have a coronary. If neither side will accept a likely outcome, then analytics are pointless. In short, we often can't handle the truth. Once you know that, you can either avoid pointless projects (like my trying to spin out IBM software) or focus up front on ensuring they won't be pointless (I think the latter would be preferable with Sandy Hook).
Or consider this: While the most powerful use of analytics is likely on your own customers, sales, which has a ton of power, may feel that this is an attack on that power and block it. But if the first use were on the competitor's customers, sales would likely be supportive (because that intelligence enhances its efforts) and would then be less likely to successfully block the customer-focused effort, which risks showcasing bad sales practices.
Sometimes how you approach a problem can have more impact on success than the tool you use to eventually solve it.