A very long time ago, when I was a foolish young UNIX sysadmin (I’m only one of those things now), I made a pretty serious security misstep on one of the servers that ran our backup systems. I won’t go into details, but it had to do with the sudo command, a text editor, and ignorance.
Fortunately, one of the more experienced (and frankly, smarter) sysadmins had my back. They had noticed my new procedure, investigated it… and then politely took me aside to explain the error of my ways. They then spent time helping me work out a solution that still gave people the self-service feature they needed, but with less chance of catastrophe. In my self-edited and unreliable memory, I was grateful for the improvements—both to me, and to my config.
That spirit of sharing, collaboration, and blame-free troubleshooting has stayed with me through the years, and I hope I’ve occasionally been able to help someone out with their own security slip-ups.
Along these lines, it came as no surprise when reading the Puppet State of DevOps Report to find that when you integrate security earlier into the software delivery lifecycle, the result is—wait for it—better security.
Looking at the report, it’s clear that the advantages of shifting security left in the software lifecycle rely as much on shifting those DevOps behaviors into the security teams as on moving security tools into the pipelines, if not more so.
Whereas more traditional security operations practices focus on testing and controls (most app teams will have experienced receiving a tool-generated security report listing multiple issues that need addressing), an approach built on DevOps principles encourages early collaboration, sharing, and joint responsibility.
Looking at the “Improving Security Posture” section of the report (beginning on page 31), it’s important to appreciate how much of the improvement relies on the friction-free injection of security thinking into the design and build of software and infrastructure, as opposed to simply putting the right controls and technology in place.
If the benefits rely on the sharing and integration of security professionals’ skills and mindset into the heart of the software delivery pipeline, then some of the most significant changes will be behavioral.
Immersing a security team early in the software development lifecycle will certainly require them to adapt, but these changes will have to go both ways. While security professionals will have to adopt new ways of working, and probably learn to speak more ‘dev’ than they currently do, the development team may well have to embrace the Tao of the Andon cord.
If you haven’t investigated the Andon cord and its place in the quality-control practices that Toyota pioneered in manufacturing, then it’s well worth your time. There are articles, academic papers, whole books, and even higher-education courses covering the subject. But before you abandon your technology career to pursue that MBA you always promised yourself, finish this article, because I think the most significant thing about the Andon cord is both the simplest and the hardest to achieve.
When a worker in a production line pulls their Andon cord to halt production due to a defect, the first thing their co-workers and management team do is rush over and thank them. And they have to mean it. Pulling the Andon cord is embraced as a good thing, as it’s a step towards quality improvement. Managers and co-workers are grateful that the production line is stopped due to a problem—because that’s an opportunity to improve.
Can your development team embrace the security team finding defects and pulling the (virtual) cord with gratitude? Can a DevOps team learn to value builds or deployments that fail security tests as much as they do the ones that pass?
Can the expectant business owners learn that, to make great software, we should be looking for more frequent test failures as we build better and better tests that identify problems before they become apparent in a deployment?
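To make that a little more concrete, here is a minimal sketch of what a ‘virtual cord’ might look like as a pipeline gate: a step that halts the build whenever a security scan reports findings. The scanner command and the shape of its JSON report are hypothetical placeholders rather than any particular tool’s interface; the point is simply that the non-zero exit stops the line on purpose, and the stop is treated as useful signal rather than a nuisance.

```python
#!/usr/bin/env python3
"""A minimal sketch of a 'virtual Andon cord' for a delivery pipeline.

The scanner command and report format below are hypothetical placeholders;
swap in whatever security tooling your pipeline actually runs.
"""

import json
import subprocess
import sys

# Hypothetical scanner CLI that writes its findings to a JSON file.
SCANNER_CMD = ["security-scan", "--output", "findings.json"]
REPORT_PATH = "findings.json"


def main() -> int:
    # Run the scan. If the scanner itself crashes, that is a different
    # failure from "the scan found problems", so let it raise loudly.
    subprocess.run(SCANNER_CMD, check=True)

    with open(REPORT_PATH, encoding="utf-8") as report:
        findings = json.load(report)  # assumed: a list of finding dicts

    if findings:
        # Pull the cord: stop the line, and thank whoever built the test.
        print(f"Security gate: {len(findings)} finding(s). Stopping the build.")
        for item in findings:
            print(f"  - {item.get('id', 'unknown')}: {item.get('summary', '')}")
        print("Thank you for pulling the cord; fixing this now is the cheap option.")
        return 1  # a non-zero exit fails this pipeline stage

    print("Security gate: no findings. Carry on.")
    return 0


if __name__ == "__main__":
    sys.exit(main())
```

In practice the gate would live in whatever CI system you already use; the design choice that matters is that a finding halts the stage there and then, rather than quietly attaching a report for someone to triage later.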
That can be a tough mental adaptation.
While I’m grateful for the inevitable editing corrections that will have been applied to this article by the time you read it, it’s not always easy to see the thing you created dissected by others. Your rational brain knows the process delivers a better product, but your inner chimp just wants to fling its arms around.
So, although it might be a hard mindset to adopt, it’s critical to successfully integrating security early in the software lifecycle, and so much better than the familiar cycle of after-the-fact reviews and hurried remediation.
Changing attitudes is a whole other area of study, but it’s one that leaders and practitioners in IT need to become proficient in, especially in this time of revolution in ways of working. It’s often (much) more difficult than simply adopting new technology. If you are going to successfully ‘shift left’ on security (and the data from this report says you should), then spend as much time on the human elements as on the technical ones.