Let’s say you’re a system administrator, and the CEO comes to you and says he wants you to try to hack into a competitor’s email system. Let’s also say you don’t particularly want to go to prison, nor do you want the CEO to fire you. What do you do?
I recently had the opportunity to discuss that scenario during a fascinating conversation about “intelligent disobedience” with Ira Chaleff, a consultant in the field of “followership” studies and author of “Intelligent Disobedience: Doing Right When What You’re Told to Do Is Wrong.” To kick off the discussion, Chaleff explained the term this way:
When people first see the term, and they haven’t encountered it before, there’s a tendency to think of it as civil disobedience, and it’s not. Intelligent disobedience is when you are getting instructions or orders from your chain of command, and you know that the order, if executed the way you’re being asked to, will have a harmful result. That’s when you need to be able to say, ‘No,’ and if possible, say, ‘Here’s an alternate way of doing this that can get you close to where you want to go, without the risk you’re going to encounter by doing it the way you’re asking me to do it.’
As for the hacking scenario I presented to Chaleff, he stressed that in the moment, the pressures on the individual are tremendous. But his advice for handling the situation seems sound:
The first thing they have to do is deal with their own internal mindset. Their mindset is likely to be, ‘This is the CEO. If I don’t do what he says, I’m going to lose my job, and the job market is tight right now. So I’m going to do this just this once and take the short-term risk.’ Instead, you’ve got to step back and understand yeah, maybe you will lose your job. But if you go ahead and do this illegal thing you’re being asked to do, and the company and you get prosecuted, your career could be really messed up. You could even be facing jail time. So you’ve got to balance the short-term risk with the long-term risk, and stabilize yourself in that moment. Then the first thing you want to ask is, ‘Did I hear you right?’ Because maybe you didn’t. And just by asking that, sometimes [the CEO will] say, ‘No, no, that’s not what I meant.’
But then there’s the scenario in which the CEO responds, “Yes, that’s what I’m asking you to do.” In that case, Chaleff turns to the analogy of training guide dogs for the blind—the analogy that inspired his book:
Of course, guide dogs should obey almost everything, except when the order is going to put the human and dog in harm’s way. Then they have to know how to disobey, even when the order’s repeated—they’ve got to really be firm. There’s a move in guide dog training called a ‘counter pull.’ Imagine that the blind person gives the command to the dog to go forward, but there are steps right in front of him that he’s not aware of. The dog is trained to pull the person in the other direction. I think that’s what the IT professional needs to do in that situation—a counter pull to the CEO, to save the CEO from getting the company and himself or herself in deep trouble. Once you’ve drawn the line and stopped the action, you can say, ‘What is it we’re trying to achieve? Let me see if I can help you achieve that with less risk, and certainly without violating the law.’
Chaleff went on to point out that the nature of the IT profession is such that “educating upward” becomes a critical skill in this context:
Because IT professionals are often dealing with people in their chain of command who don’t fully understand what it takes to execute what they’re being asked to do, I would think there has to be a certain amount of adeptness in educating upward; and also knowing how to say yes when you can, and to say no, pretty firmly, when you really can’t. It might be that it’s not within the budget, the manpower, the time frame for it to be done.
Chaleff explained that at its core, the issue is one of risk management:
You can put in all kinds of risk management programs, but ultimately, if there’s a culture that dissuades individuals from speaking up and dissenting when necessary, you are at risk, and your company is at risk. I’ve had CEOs tell me that what keeps them up at night is the prospect that somebody won’t tell them what they really know about what they’re being asked to do. It’s absolutely up to the people who really know what’s happening to speak up, even when they know it’s unpopular. … We have to understand that we really do need to make the locus of accountability ourselves. We can never say, ‘I just did what I was told to do.’
Chaleff wrapped up the conversation by providing a sobering example of why that’s particularly important in the IT profession:
Let’s say you were asked to do something with some code that you knew wasn’t a best practice. It’s a shortcut that makes the system more vulnerable, and the system could get hacked. And three years down the road, there’s a hack that causes power to go out in a hospital, and somebody dies. If you know that what you’re being asked to do potentially has that danger, and there are safer ways to do it, you need to take a stand now. And know that you probably saved something bad from happening, even though you might never know yourself what the result was.
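To make Chaleff’s hypothetical concrete, here is one common form such a “shortcut” takes: building a database query by pasting user input directly into the SQL string, rather than letting the database driver bind it as a parameter. This is a hedged illustration only (the function names and table are invented for the sketch, using Python’s standard sqlite3 module), not a claim about any system Chaleff had in mind:

```python
import sqlite3

def find_user_unsafe(conn, username):
    # The "shortcut": interpolating input into the SQL string.
    # A crafted username can rewrite the query -- SQL injection.
    query = f"SELECT id FROM users WHERE name = '{username}'"
    return conn.execute(query).fetchall()

def find_user_safe(conn, username):
    # The safer way: a parameterized query. The driver binds the
    # value as data, so it can never alter the query's structure.
    return conn.execute(
        "SELECT id FROM users WHERE name = ?", (username,)
    ).fetchall()

# Demonstration with an in-memory database and two sample rows.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER, name TEXT)")
conn.executemany("INSERT INTO users VALUES (?, ?)",
                 [(1, "alice"), (2, "bob")])

# Input crafted so the unsafe query matches every row in the table.
payload = "x' OR '1'='1"
print(find_user_unsafe(conn, payload))  # leaks all users
print(find_user_safe(conn, payload))    # matches nothing
```

The unsafe version returns every user for the crafted input, while the parameterized version returns none. The point is the one Chaleff makes: the shortcut works fine for years, and the person who insisted on the safer version may never see the attack they prevented.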
A contributing writer on IT management and career topics with IT Business Edge since 2009, Don Tennant began his technology journalism career in 1990 in Hong Kong, where he served as editor of the Hong Kong edition of Computerworld. After returning to the U.S. in 2000, he became Editor in Chief of the U.S. edition of Computerworld, and later assumed the editorial directorship of Computerworld and InfoWorld. Don was presented with the 2007 Timothy White Award for Editorial Integrity by American Business Media, and he is a recipient of the Jesse H. Neal National Business Journalism Award for editorial excellence in news coverage. Follow him on Twitter @dontennant.