The natural tendency of an engineer is to assume that “security” is an engineering issue that reduces to a type of reliability. And the Common Criteria security document outlines
a solid engineering approach (written in astoundingly opaque bureaucratese) for assuring that software is designed, developed, and tested to limit security failures. But “software security” means different things in different contexts.
- Ross Anderson’s famous Why Information Security is Hard paper explains that “security” is often concerned with avoiding liability or blame while evading the actual costs of engineering security. For many people, “software security” means “some hoops we have to jump through to satisfy an auditor or evade responsibility”.
- Green Hills Software uses “security” as a way to frighten people away from Linux. See my Groklaw note for details.
- The Digital Rights Management (DRM) people mean “Make sure we get paid, no matter how dangerous or insecure this makes the system from the customer’s point of view.” See DRM and Security.
The lesson here is that “security” is more like “efficiency” than “reliability”. You’d be wise to find out who gets the benefit from a given meaning of “security” before you sign up. Bruce Schneier tells a great story:
The other week I visited the corporate headquarters of a large financial institution on Wall Street; let’s call them FinCorp. FinCorp had pretty elaborate building security. Everyone — employees and visitors — had to have their bags X-rayed.
Seemed silly to me, but I played along. There was a single guard watching the X-ray machine’s monitor, and a line of people putting their bags onto the machine. The people themselves weren’t searched at all. Even worse, no guard was watching the people. So when I walked along with everyone else in line and simply didn’t put my bag onto the machine, no one noticed.
It was all good fun, and I very much enjoyed describing this to FinCorp’s VP of Corporate Security. He explained to me that he got a $5 million rate reduction from his insurance company by installing that X-ray machine and having some dogs sniff around the building a couple of times a week.
I thought the building’s security was a waste of money. It was actually a source of corporate profit.
So next time you see an impressive security credential or a virtual X-ray machine with patriotic slogans all over it, please look for the actual motivation and use. You might be surprised. In fact, the Common Criteria approach to software security asks for just this type of caution. You are supposed to do a thorough threat evaluation so that you can identify what your security needs really are instead of being stampeded into using someone else’s definition. Unfortunately, the Common Criteria documents are in exactly the kind of prose that you’d expect to see resulting from the collaboration of government security and standards organizations. See this note for more.