Gone are the days when organizations thought that perimeter security was enough to thwart application-level attacks. More and more organizations are seeing the problem and taking action to prevent coding flaws that could lead to data breaches and application-level cyberattacks.
They start by getting all hands on deck, followed by a couple of security presentations, tutorials, and even a round of security training. Three months later, no real improvement has been made: developers, managers, and executives are confused about why things didn’t work out, and external and internal security assessments come back with the same old issues, such as SQL injection and cross-site scripting.
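To make the recurring finding concrete, here is a minimal Python sketch of a SQL injection and its parameterized fix. It uses the standard sqlite3 module; the table and the user input are made up for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, is_admin INTEGER)")
conn.execute("INSERT INTO users VALUES ('alice', 0)")

user_input = "' OR '1'='1"  # classic injection payload

# Vulnerable: user input concatenated straight into the SQL string.
vulnerable = f"SELECT name FROM users WHERE name = '{user_input}'"
print(conn.execute(vulnerable).fetchall())  # the payload matches every row

# Fixed: a parameterized query treats the input as data, not as SQL.
safe = "SELECT name FROM users WHERE name = ?"
print(conn.execute(safe, (user_input,)).fetchall())  # no rows match
```

The fix is a one-line change, which is exactly why it is so frustrating to see the same finding reappear assessment after assessment.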
What explains the ineffectiveness of these application security programs? We’ve put together the top five reasons we’ve seen while helping organizations build internal application security capabilities.
1. Depending too much on static code analysis tools
Many organizations lead the program with a static code analysis tool, and why not? The tool was sold to them as a bulletproof magic wand that would identify all their security flaws and help fix them.
While no security program really scales without a good tool, the problems start when organizations put all their faith in the tool without building a proper application security program: one that includes guidelines, proper training on the tool, and a clear picture of which issues the tool usually finds and which ones it usually misses.
2. Providing training without actual tools
Other organizations lead by training their development staff without giving them the capability to execute the techniques they learned in the training.
Understanding what a vulnerability looks like is the first step, but it must be followed by systematic ways of finding that vulnerability and reliable, time-effective techniques for verifying it.
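A reliable, time-effective verification can be as small as a repeatable check against a known payload. The sketch below is hedged: `render_comment` is a hypothetical view helper, standing in for wherever your application embeds user content in HTML, and `html.escape` from the Python standard library does the output encoding:

```python
import html

def render_comment(comment: str) -> str:
    """Hypothetical view helper: escape user content before embedding it in HTML."""
    return f"<p>{html.escape(comment)}</p>"

# A repeatable verification: the classic XSS payload must come back inert.
payload = "<script>alert(1)</script>"
rendered = render_comment(payload)
assert "<script>" not in rendered
print(rendered)  # <p>&lt;script&gt;alert(1)&lt;/script&gt;</p>
```

Checks like this can live in the regular test suite, which gives developers exactly the kind of pass/fail signal for security that they already have for functional bugs.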
Keep in mind that your software developers are not security experts, just as they are not QA experts or usability experts. They should have a solid grounding in security, and they should be able to find and fix the basic forms of vulnerabilities. But they can’t be experts in security unless it is their full-time job.
3. Lack of goal setting and proper communication
For the longest time, application security was an IT issue. IT staff, who weren’t trained in application security, tested applications with the same techniques they used to test networks. That mistake led to years of exposed applications, delayed remediation, and lots of time spent on what could have been a very straightforward vulnerability mitigation process.
Many organizations nowadays are putting the load on development teams to solve the application security problem, but this expectation is not clearly defined or properly communicated.
Development staff are usually engineers, and engineers do not work well with vague expectations. They know a software development task is done when the code works with no bugs. They have no such signal for security: nothing visibly changes when their code is secure versus when it is not.
4. No champion
Organizations that are doing scrum properly usually use a scrum master. A scrum master is the facilitator for a product development team that uses scrum.
We can argue that proper application security programs require a security master – someone who facilitates the secure software development efforts and ensures that security activities are done. A security master consistently makes sure that security bugs are addressed and that developers receive the training they require. Finally, a security master guides developers through the secure development process and answers their questions or points them in the right direction.
5. No alignment with personal goals
This is the one reason that executives will never hear, but that developers keep saying over and over to themselves and to each other: What’s in it for me? Why should I put extra time and effort into security? How does it align with my role, my career, my promotion, my raise? Heck, forget about me: how does security align with the company’s goals and future?
A frowning executive might answer, “Well, if a data breach happened and we lost customers’ data, that would reflect badly on our brand and the health of our financials.”
To that a developer responds, “Great, then which one is more important: security or deadlines?” Developers are usually under very tight deadlines, and adding another task without proper alignment to their careers will not work. When that alignment is strong, magic starts to happen and developers find creative answers to difficult security questions. The famous question, “Which one is more important: security or deadlines?” stops being an issue, because good developers will find ways to achieve both.
What do you think? Have you seen one or more of these reasons for failure on an application security program? Are there other important ones to add? Once you understand the cause, it’s that much easier to fix.