Last week I was discussing my research at Security Lancaster with a friend. She said, "You must think security is the most important thing in software."
I hastened to deny it. You see, I don't think security is by any means the most important aspect of software creation.
I'm researching software security because I feel it is poorly understood and poorly practised by most contemporary software developers, and because better research and understanding will have a real impact. But most important? Certainly not. Other aspects of software are just as important, and in most applications more so: usability, performance, reliability, time to market, maintainability, and of course functionality. After all, if nobody uses the software, its security is almost irrelevant (unless it's a gateway to an existing system).
Another thing I've learned is that there is really no single thing called software security. Depending on the component, application, or service you're working on, you'll need different kinds of security. One approach considers this via four key concepts: Confidentiality, Integrity, and Availability (CIA), plus Non-repudiation. For example, if you're developing a website to take credit card payments, confidentiality will be particularly important: you don't want attackers to learn personal or credit card information. If you're developing a secure backup system, integrity will be very important to you: you want to be sure that what you have backed up accurately reflects what it is supposed to. If you're inventing a social media service, availability will be critical to you: you don't want denial-of-service attacks to cost you your customers. And if you're developing the software for a cash-dispensing machine, non-repudiation will be very important to you: you don't want customers to be able to lie and claim that they have not received their money.
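To make the integrity example concrete, here is a minimal sketch in Python (the language and function names are my own illustration, not part of any particular backup product) that verifies a backed-up file against its original by comparing SHA-256 digests:

```python
import hashlib
from pathlib import Path

def file_digest(path: Path) -> str:
    """Return the SHA-256 hex digest of a file's contents."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        # Read in chunks so large backup files don't exhaust memory.
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_backup(original: Path, backup: Path) -> bool:
    """Integrity check: the backup matches iff the digests agree."""
    return file_digest(original) == file_digest(backup)
```

In a real system you would store the digest alongside the backup (ideally signed, which also moves you towards non-repudiation) rather than re-reading the original, but the principle is the same: integrity means being able to detect that the data has changed.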
And there are many more fine-grained kinds of security requirements, which we can identify for a given system by looking in detail at its needs and the potential threats against it.
Nor does achieving software security require a complex process that dominates every aspect of software development. In saying this, I'm going against much of the existing writing about software security, but the conclusion is quite clear from studying how successful teams achieve security in practice. I've learned that heavyweight secure development processes are fragile and do not appeal to developers, but that developers with the right attitudes will deliver secure software. Thus we can achieve software security with small changes to the way developers interact with the different people they work with.
I'm working on helping software teams make those changes. Getting good software security is, above all, a matter of having the right attitude.
It's not hard!