We've been working hard on gathering data over the last few months and it was well worth the effort. I had the privilege of leading this project, with great contributions from our 17 project partners and tremendous help from Jeremiah Grossman at WhiteHat.
We started out with a simple question - how much security spending is enough when developing web apps and software? We all know that there are a lot of web apps out there that are not sufficiently secure. Some of this bad security is a result of sheer ignorance. But most bad security results from a lack of effort - or, applying that universal law of business (TIME = MONEY), from a lack of resources.
In other words, most web apps are insecure because no one bothered spending the time or money to secure them. But since one could theoretically spend 90% of a project budget eliminating every esoteric web app vulnerability known to man, how much spending is enough? Executives want to know the industry norm, set aside that budget, and see the security issue disappear so the company can focus on its core business. Although this may not make some security purists happy, in the vast majority of companies security is a tax. And as with most taxes, when the rates are unclear and the tax code is too complicated, people start fudging and figure they'll take their chances.
The security tax is relatively well understood in web security's better funded cousin, network security. For basic network security, conventional wisdom puts the security tax somewhere between roughly 5 and 10%. No such consensus exists for development. It is this vacuum that the OWASP Security Spending Benchmarks Project addresses.
THE FACTS
From the get-go we have focused on making a different kind of survey - a collaborative data collection effort by the community without any added spin. We want the facts to speak for themselves, with the goal of better understanding security spending in development. That is why we have kept our own thoughts and analysis limited to blogs like this one and out of the actual report. In what I believe is a first for a security spending survey, we have also released the raw survey data, which can be found on Survey Monkey.
There's a detailed description of our methodology and respondents on the project page. A bit further down in this post you can read about some factors that may have affected the quality of our data. So if there are any statisticians in the crowd, please keep in mind that we realize our resulting data is far from perfect. I do, however, think that our open community approach has led to results that are at least as good as the funded proprietary surveys that until now have been the only source of data on this topic.
So with that grain of salt sitting on the plate, let's take a deep dive into some of the key survey results...
Data breaches loosen the old security purse strings. The survey data validates the unsurprising fact that companies that have suffered security incidents are more likely to invest in security and to have security executives on staff. Anyone who has spent any time negotiating security budgets knows that security incidents are unfortunately a powerful (and somewhat belated) kick in the behind to get security projects rolling. This also makes sense from an enterprise risk perspective - most customers and even regulators can forgive one data breach (after all, stuff happens, as they say). Two data breaches is of course a different ball of wax.
The recession/depression is not negatively affecting app security spending. At first it seems a bit surprising, but we found that web application security spending is expected to either stay flat or increase in nearly two thirds of companies. After all, isn't everyone but the government basically broke? But of course the reason we are broke is bad risk management. As I predicted back at New Year's, we got into this big pile of financial $%*#& due to a fundamental inability to assess risk, and a ton of new risk-related regulation is coming in the financial sector. And once legislators get the old regulating pen out, you can bet that infosec regulation is on the way too.
Most companies have specific IT security budgets. You don't get very far without a budget, and 67% of companies surveyed have specific IT security budgets. For companies that had suffered incidents, this was almost 90%. The interesting thing about having a specific IT security budget is that it pits competing security interests against each other at budget time.
There's very little development headcount dedicated mainly to security. In a whopping 40% of companies, less than 2% of developer headcount is dedicated to security. This makes sense to me, because the job of developers is basically to build stuff without making really obvious security mistakes. As I have said before, for many organizations I don't see a real alternative to some kind of build-then-try-to-break approach to producing secure code.
Most companies do third party security reviews. At least 61% of respondents perform an independent third party security review before deploying a Web application, while 17% do not (the remainder don't know or do so only when requested by a customer). I find this one of the most surprising statistics, considering the expense of third party reviews and the (probably false) assumption many companies might have that they can do this in house. This statistic seems to indicate that there is widespread acceptance of the breakers model of building secure code.
Security is important when hiring. For most companies it is completely infeasible to do a security review of every line of code a developer writes. That's why developer education and previous security experience are so important in producing secure code. That might explain the surprising fact that half of respondents consider security experience important when hiring developers, and a majority provide their developers with security training.
Competitive advantage is not an important factor in security spending decisions. Competitive advantage ranked last out of five factors that could influence security spending, while compliance ranked first. This reflects a reality that I see everywhere except for certain navel-gazing security conferences and highly sensitive industries - namely, that ordinary folks do not make purchasing decisions on the basis of security. They expect basic security as part of a product or web application.
Web application security still gets a relatively small part of the security spending pie. There are a couple of reasons for this in my opinion. Web app security is still somewhat new, at least compared to classical network security approaches. Many standards and RFPs still place a very heavy emphasis on network over application security. For example, the recently announced Prioritized Approach for PCI prioritizes virtually all network security requirements ahead of web application security requirements. Another reason web application security receives relatively little budget, in my opinion, is that it does not come bundled with other functionality. Many security products offer new and visible ways of managing and monitoring users. Most web application security spending, on the other hand, leads to something that is almost completely invisible, namely a locked-down application. Locked-down applications don't make for very good PowerPoint slides. "Total network awareness" products do.
I've got more to say on that but I feel we are digressing a bit. If you are still with me, I'd like to dedicate some ink/pixels to an honest critique of the data quality in the OWASP SSB survey.
LIES, DAMN LIES, AND STATISTICS
I always read survey results with a healthy dose of skepticism. Too often you read stories along the lines of "1 in 3 grandmothers reports losing social security check due to insecure mailbox", with a press release the next day announcing a new mailbox lock.
So having just reviewed our results, it's time to analyze the accompanying grain of salt. Openness and collaboration are a big part of the OWASP Security Spending Benchmarks Project. Although we made every effort to maximize the quality of the data and analysis, the data is not perfect. Surveys never are, but of course businesses are constantly forced to make critical decisions on the basis of imperfect data. Ultimately the goal of the project is to capture the best possible picture of the development spending landscape and - equally important - to stimulate a discussion that can lead to further consensus on this issue.
I may have missed something, but here in no particular order are the possible issues that affect the reliability of survey results:
(1) People not answering truthfully.
Well, not much you can do about this. We kept this to a minimum during the OWASP SSB survey by rejecting responses that took less than 2 minutes to fill out and by spreading the survey through a trusted list of contacts.
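To make the response-time filter concrete, here is a minimal sketch of that kind of cleaning step, assuming the raw export includes start and end timestamps per response (the file name and column names below are hypothetical, not the actual export's):

```python
import pandas as pd

# Load the raw survey export. The file and column names are hypothetical;
# adjust them to match the actual Survey Monkey export.
df = pd.read_csv("owasp_ssb_raw.csv", parse_dates=["start_time", "end_time"])

# Reject responses completed in under 2 minutes, on the assumption that
# a thoughtful respondent needs at least that long to read the questions.
duration = df["end_time"] - df["start_time"]
cleaned = df[duration >= pd.Timedelta(minutes=2)]

print(f"Kept {len(cleaned)} of {len(df)} responses")
```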
(2) Intentional skewing of results by subverting the survey process (e.g., responding multiple times).
Internet surveys that offer some degree of anonymity will always be vulnerable to this. Again, our survey methodology, as described on the project page, was designed to keep this phenomenon to a minimum.
(3) A non-representative respondent base
I think that this is the most significant potential weakness of our current survey. Although we have a number of non-security partners involved, the current list of partners consists primarily of security consultancies and other security-focused organizations. I don't think this significantly skewed the results, but there is a possibility that the companies we reached through our partner network were somewhat more security-aware than a randomly selected company. In the next quarter, we plan to expand our partner base to include a greater number of non-security related companies.
(4) Badly formulated or suggestive questions.
This is the "Do you (a) support candidate X or are you (b) a heartless fascist?" problem. There is an entire industry built around the proper phrasing of survey questions to get accurate results. Although none of the partners is a survey expert to the best of my knowledge, many of us have been involved in similar efforts in the past, and we attempted to phrase the survey in a neutral way that would lead to the most accurate results.
(5) Drawing incorrect conclusions from the collected data.
Our project report attempted to avoid this by sticking to the facts and leaving the analysis to the blogs. Also, unlike every commercial survey I have read in the last few years, our project actually releases its raw data for the community to evaluate. There are no "proprietary methods" or "confidential sources". In fact, we welcome competing analyses and others using our data to further the discussion around this topic.
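As one example of the kind of competing analysis the raw data makes possible, here is a minimal sketch that cross-tabulates past incidents against having a dedicated security budget (the column names are hypothetical; you would first rename the columns of the actual export to match):

```python
import pandas as pd

# Hypothetical column names; rename the actual export's columns to match.
df = pd.read_csv("owasp_ssb_raw.csv")

# Cross-tabulate past security incidents against having a dedicated
# IT security budget, as percentages within each incident group.
table = pd.crosstab(df["had_incident"], df["has_security_budget"],
                    normalize="index") * 100
print(table.round(1))
```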
(6) Opaque and non-verifiable process.
I think we steered safely clear of this pitfall as well. Our project plan is always available on the project homepage. Anyone who is willing to commit some time and energy to promoting the survey and providing strategic input is welcome to participate.
THE NEXT STEPS IN THE OWASP SSB PROJECT
We are planning to build on the momentum of this first survey to get more companies involved and to set new thematic priorities. The current status of the project will always be available on the project homepage.
UPDATED PRESS COVERAGE:
This project generated significant press coverage, which is great for our goal of establishing industry-wide benchmarks.
Click here for a video interview I gave Search Security on the project.
Other coverage includes articles in SC Magazine, Dark Reading, Search Security, PC World, Information Week, and numerous other publications.
There has also been coverage in German and Spanish.