Keil Hubert: Executive Override

Power corrupts. It’s one of the first lessons that you learn in cyber security. Business Technology’s resident U.S. blogger Keil Hubert argues that a user who misuses their power to undermine security once is highly likely to do so again.


I’ve been advancing the argument lately that an effective cyber security program needs to focus on people first and foremost. My last column made the case that security leaders have a duty to actively pay attention to our co-workers, so that we can spot the problems people are experiencing and mitigate them before they lead (down the road) to actions that compromise security protocols. That column was an extension of the ideas that I expressed during my presentation at this year’s European Information Security Summit back on 19th February. Today, I’d like to twist that idea like a Rubik’s Cube and look at the ‘people’ element of security from the top down.

Back on 10th January, Cory Doctorow wrote an article on BoingBoing titled ‘Senior execs are the biggest risk to IT security’ where he railed against the inherent hypocrisy of senior managers setting security policies that everyone has to obey even as they ignore and circumvent their own rules. From Cory’s article:

‘Senior management often sets “business-wide” policies that everyone except the policy-makers themselves are required to abide by. Everyone I know who’s worked in corporate IT has horror stories about senior managers who refuse to adopt good password strategies, good email hygiene, etc.’ [emphasis added]

I’ve kept this short article and the original piece that Cory was commenting on open in a pair of tabs in a browser window since the day Cory’s piece posted. I keep re-reading them, because I feel strongly that this is one of the greatest threats to an effective cyber security program that we’ll ever have to deal with. Cory went on to say:

‘More widely, the problem of leaders establishing “one rule for them, another for us,” is an endemic one that cuts across several domains.’

It almost certainly manifests in all phases of the offender’s personal and professional life.

Let me expand on that thought: the problem of leaders establishing “one rule for them, another for us” is probably the most crucial early indicator that a company can have that they’re going to experience a catastrophic security incident. I say that, because the do-as-I-say-not-as-I-do phenomenon constitutes incontrovertible evidence that someone with great power inside the organization likely holds a cavalier disregard for the principles and protocols that make your entire cyber security program function. A senior leader who is willing to use her authority to exempt herself from one rule won’t stop at just the one transgression. As I said in the close of my TEISS presentation, ‘If they did it once, they’ll do it again.’

There are exceptions, of course; if you confront an executive about their counterproductive behaviour and they then faithfully change their behaviour to return to the fold, you might be able to trust them to comply in the future. Might. It really depends on whether the incident was caused by ignorance, or by petulance. The first circumstance is correctable and forgivable; the second isn’t.

I have one short example of this phenomenon that should resonate with most enterprise IT folks: Several years back, a company that I occasionally dealt with had instituted a mandatory travel scheduling system. No one liked it, mostly because the client-server application was buggy and unresponsive. It was, however, mandatory. All users needing to travel had to log into this system and file their itinerary before they departed the home office. Most people grumbled about the system, and then made do. Most.

The presiding division executive didn’t like the new travel system one bit. It was one more user ID and password combination for her to have to remember. It was slow. It was buggy. She had far too many things to do already to occupy her time, and felt that the new system was an imposition. Fair enough; had anyone in IT known about the system, we likely would have empathized. Maybe even helped troubleshoot its issues. But IT didn’t know about it.

The new system had come down through functional lines (e.g. lines of business) from the international head office, not from within the company’s IT community. No one in IT knew that a local instance of the app was being hosted at the local office. The industrial arm of the business decided to hire a contractor to manage the problem – a contractor that wasn’t under IT’s control. You can probably imagine what happened next.

Like this, only with more laughing on the part of the unimpressed baddies.

First, our busy executive decided that she’d had enough of the laggy app and demanded that it be withdrawn from the local data center – she ordered the contractor to put it on a consumer DSL circuit that a co-located snack bar in the office park was leasing for their customers. That put the host – and all of its stored Personally-Identifiable Information – directly on the naked Internet with no firewall, no IPS, no monitoring, and no configuration management.

That wasn’t enough, though. The app ran faster once it was out from behind the company firewall, but it was still too cumbersome for the weary executive. She leaned on the contractor to eliminate the individual user accounts and convert the system to a single shared admin account. Everyone in the department used the same login. So, too, did the hackers who promptly took control of the machine.

When the contractor finally let IT know about the situation, the damage was done. All of the stored sensitive information had been exfiltrated by the cyber criminals who had taken over the unprotected machine. The contractor was apologetic – she knew that what she was doing was wrong, but felt bullied into obeying the person who signed her paycheck. By the time the IT department (and, indirectly, the security team) was made aware of the severity of the situation, the only option was to vaporize the infected machine and start over.

I chose this example because I was read in on the details from my friends who worked there. I met several of the key players in the drama, and I was asked for my expert opinion on how to resolve it. My considered advice – that upper management sack or censure the executive who repeatedly used her clout to violate company security protocols – was ignored. As far as I know, the miscreant never suffered so much as a verbal admonishment. The organization, however, suffered greatly thereafter. As I pointed out at the time: if she did it once, she’d do it again. From what I heard from my mates at the company, that’s exactly how things played out in the years following the initial incident.

It’s normal to feel frustrated with dodgy IT solutions. We all get miffed when things that we have to use don’t work as promised. I empathize. From the perspective of the cyber security team, though, frustration doesn’t constitute sufficient grounds for violating security protocols.

It’s always adequate grounds for leaving the office to have a relaxing pint, though.

Yes, there should always be an ‘executive override’ process. If the situation is critical to the business, then the appropriate executive with dominion over the entirety of the system should be fully informed of the risks and be allowed to make an informed decision as to whether or not to change things. It’s an executive’s job to accept (and live with) risks. The key to this is that the executive making the decision must be fully informed! In the example that I gave, the site boss with responsibility over the operation was never told. That’s unacceptable. It can also constitute the kind of negligence that gets a business sued into the ground.

As cyber security experts, we have to factor the human dimension into everything that we do. Security starts and ends with users, customers, partners, and key decision makers. We need to understand the ‘who’ element first and foremost rather than get entranced by the ‘how’ elements. Pay attention to your community … and when someone in a position of meaningful power misuses their authority to willfully undermine your defenses, do yourself and your company a favour and sack ‘em.

POC is Keil Hubert, keil.hubert@gmail.com

Keil Hubert is a business, security and technology operations consultant in Texas. He’s built dot-com start-ups for KPMG Consulting, created an in-house consulting practice for Yahoo! Broadcast, and helped launch four small businesses (including his own).

His experience creating and leading IT teams in the defence, healthcare, media, government and non-profit sectors has afforded him an eclectic perspective on the integration of business needs, technical services and creative employees. He currently commands a small IT support organization for a military agency, where his current focus is mentoring technical specialists into becoming credible, corporate team leaders.
