
Why Don't Developers Write More Secure Code?

Jun 21, 2016 | by: Sherif Koussa
Developers have been rapped in some circles for writing code with security flaws, but is such criticism justified? Where is security on developers' priority list?

Programmers certainly have a lot on their plates, and while security has been a burning issue in recent times, it hasn't been a top priority for developers. Secure coding for secure applications is easy in theory, so why is it so hard to write secure programs in practice?

A survey of more than 200 developers conducted a few years ago identified half a dozen priorities of developers. In order of importance, they were functions and features as specified or envisioned, performance, usability, uptime, maintainability and, at the bottom of the list, security. 

The main priorities for developers stem from the agile cycle and from end-user expectations. Features and functionality are the main priority and selling point of most applications, and with expedited timelines it can be difficult to delay releases for security checks; the company's goal is to be first to market with new features and functionality for its customers.

After functions and features, developers focus on the performance and usability of those functions. UI/UX plays a growing role in end users' decisions and requires just as much devotion as the function itself. If performance and usability are not up to the end user's standards, adoption suffers and the original goal of the new feature is defeated.

Uptime and maintainability are the next natural things to evaluate. The function, performance and usability are there, but how sustainable is the upkeep? If the end user is happy with the application and its functions, is it easy to maintain that functionality and keep the application up? Security has a large role to play here: if things are secure and operating correctly, the likelihood of application downtime is reduced.

The agile cycle can make it difficult to integrate security at the right time, and security teams need to appreciate and understand how security fits in with functions, performance, usability, uptime, and maintainability. Otherwise, security will keep being ignored by developers.

Less help received from the quality assurance department

Without a doubt, security can get in the way of some of those priorities, which is why it is not at the top of the priority list for developers. However, developers have had to adjust to similar scenarios in the past. For example, quality assurance (QA) used to generate debates about the right ratio of developers to test engineers. Today, that's less of an issue because every developer takes on responsibility for testing and writes unit tests whenever they add new features and functionality. QA testers haven't totally disappeared, but there are fewer of them than there used to be; most of the remaining testers perform manual tests that are difficult to automate. The same thing has to happen with application security: it needs to be embedded into the workflow where it can help productivity in the long run, not hurt it.
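
One practical way to embed it, borrowed from the unit-testing habit, is to write security regression tests that run with every build. Here is a minimal sketch in Java, assuming JUnit 5 on the classpath and a hypothetical RedirectValidator class an application might use to prevent open redirects; the names are illustrative, not taken from any particular codebase.

// Assumes JUnit 5 and a hypothetical RedirectValidator class; the point is that
// the security check runs on every build like any other unit test.
import static org.junit.jupiter.api.Assertions.assertFalse;
import static org.junit.jupiter.api.Assertions.assertTrue;

import org.junit.jupiter.api.Test;

class RedirectValidatorTest {

    private final RedirectValidator validator = new RedirectValidator("example.com");

    @Test
    void acceptsRelativePathsOnOurOwnSite() {
        assertTrue(validator.isSafe("/account/settings"));
    }

    @Test
    void rejectsOpenRedirectsToOtherHosts() {
        // Regression test for a previously fixed open-redirect flaw.
        assertFalse(validator.isSafe("https://evil.example.net/phish"));
        assertFalse(validator.isSafe("//evil.example.net/phish"));
    }
}

Once a test like this lives in the same suite as the functional tests, reintroducing the flaw fails the build the same way a broken feature would.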

Most developers don't know what secure code looks like

Although there may be some resistance from developers to expanding their role in securing software, most want to write secure code; many just don't know what that means. They know some of the basics (validate input, check for buffer overflows, encrypt data in transit, limit privileges), but many aren't equipped to address more advanced problems such as authentication weaknesses, application logic flaws, and advanced input validation.
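
Application logic flaws illustrate the gap well: code can pass every basic check and still be exploitable. The sketch below, using a hypothetical OrderService in Java, shows input that is perfectly valid as an integer yet dangerous to the business logic; it is an illustrative example rather than a reference implementation.

// Hypothetical example: a negative quantity parses cleanly as an integer,
// but without a logic check it would turn a purchase into a credit.
public class OrderService {

    public long orderTotalInCents(int quantity, long unitPriceInCents) {
        // "Basic" input validation only confirms the value is an integer.
        // This business-logic check is what actually prevents the abuse.
        if (quantity <= 0) {
            throw new IllegalArgumentException("quantity must be positive");
        }
        // multiplyExact also guards against integer overflow.
        return Math.multiplyExact((long) quantity, unitPriceInCents);
    }
}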

Security is a marginal topic in higher-ed curricula

There are a few reasons why developers have a knowledge gap when it comes to security, starting with the fact that they aren't taught application security in school. Forrester looked at the top 40 computer science programs and found that “none of the top-ranked computer science programs in the United States require a class about secure coding or secure application design.” In most schools, general cybersecurity courses are offered as an option rather than treated as a priority.

Not only is there a lack of formal cybersecurity education, but there's also a general unawareness of application security trends, because developer communities tend to be siloed from cybersecurity communities.

Overall, developers haven't received a lot of training about writing secure code. Just as security isn't high on a developer's list of priorities, teaching students how to incorporate security thinking and awareness into code design, development and testing hasn't been high on the priority list of universities either. This is carried into the workforce, and further hinders the development of secure coding practices within software development.

Security tools are a major frustration point for developers

If a developer doesn't know they're introducing security flaws into their code, there's a tendency to believe that what they're producing is secure. That's especially true when reports from the security team contain findings that are not vulnerabilities at all but false positives. So if an organization expects its developers to buy into taking greater responsibility for security, it needs to make sure it has good tools: vulnerability scanners, source code analyzers, and savvy application security SMEs to educate developers, without prejudice, on where their code needs improving.


Tools can be another pain point for developers who want to produce secure code. It's not uncommon to hear developers complain that the tools they have lack the sophistication needed to identify security risks and fix them. On the other hand, many developers aren't willing to spend the time necessary to tweak those tools to get more out of them with less pain. There are tools that help developers with security analysis, such as Static Application Security Testing (SAST), which is becoming popular among developers. Finding coding errors early in the development life cycle can save organizations both time and money, as well as make applications more secure.
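
To make the SAST point concrete, here is a sketch of the classic finding such tools report, written with plain JDBC; the class and table names are hypothetical, and the second method simply shows the standard parameterized-query fix.

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class UserLookup {

    // Typically flagged by SAST: untrusted input concatenated into SQL (SQL injection).
    public ResultSet findUserUnsafe(Connection conn, String username) throws SQLException {
        Statement stmt = conn.createStatement();
        return stmt.executeQuery("SELECT * FROM users WHERE name = '" + username + "'");
    }

    // The fix a developer can apply early in the life cycle: bind the input
    // as a parameter instead of building the query string by hand.
    public ResultSet findUserSafe(Connection conn, String username) throws SQLException {
        PreparedStatement stmt = conn.prepareStatement("SELECT * FROM users WHERE name = ?");
        stmt.setString(1, username);
        return stmt.executeQuery();
    }
}

Catching the first method in a pull request costs minutes; catching it in production costs an incident.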

Mixed messages received by developers

Developers have also been sent mixed signals about their role in producing secure code. The security industry often touts new products as alternatives to secure coding. That was the case with Web Application Firewalls (WAF) and Runtime Application Self-Protection (RASP). The pitch for WAFs was that they could stop attackers before they could exploit flaws in an application's code. Theoretically, that diminishes the risk created by insecure code and relieves the pressure on developers to write flawless code.

In reality, though, hackers found ways to defeat WAFs, so producing secure code still matters. In the same vein, RASP is being sold as the answer to flawed programming. It's designed to see into applications and shut them down if they misbehave. That's fine as a temporary fix, but to get the app running again, whatever's wrong with it needs to be fixed. RASP can reduce the risks created by insecure code (although it's limited in the classes of vulnerabilities it can protect against), but it's not going to make an application as secure as it could be if its code had been written with security in mind during the design and build phase.
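
A small illustration of the difference: a WAF tries to recognize and block attack payloads at the perimeter, while fixing the code removes the whole class of flaw. The sketch below is a hand-rolled HTML encoder for untrusted output, shown only to make the point; in a real project a maintained library such as the OWASP Java Encoder is the better choice.

// Illustrative only: encode untrusted data before placing it in HTML text,
// so reflected XSS is eliminated in the application rather than filtered by a WAF.
public final class HtmlText {

    private HtmlText() {
    }

    public static String encode(String untrusted) {
        StringBuilder out = new StringBuilder(untrusted.length());
        for (char c : untrusted.toCharArray()) {
            switch (c) {
                case '<':  out.append("&lt;");   break;
                case '>':  out.append("&gt;");   break;
                case '&':  out.append("&amp;");  break;
                case '"':  out.append("&quot;"); break;
                case '\'': out.append("&#39;");  break;
                default:   out.append(c);
            }
        }
        return out.toString();
    }
}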

No matter how sophisticated the tools get, they will not run themselves; they need engineers with expert security skills to run them.

Developers can also be sent mixed signals about writing secure code by their organizations. Executive buy-in to secure coding is as important as getting the developers themselves to embrace the concept. Unless management understands the value of secure coding and conveys its support through things like training and the purchase of state-of-the-art tools, any effort to improve the security practices of coders is unlikely to gain traction.

We help DevOps teams build confidence in their application security.

Penetration Testing as a Service (PTaaS) is one way in which we test your code and infrastructure for vulnerabilities, including flaws in your custom business logic. Companies are switching to PTaaS because it aligns naturally with their current software development practices. Rather than making security testing an afterthought or an additional step, it makes security part of the process right alongside the testing that you already do. Test as you deploy code and start sleeping better at night.

Interested? Learn More!


About the Author

Sherif Koussa
Sherif Koussa is OWASP Ottawa Chapter Co-Leader, a software developer, a hacker, and the founder and CEO of Software Secured and Reshift. In addition to contributing to OWASP Ottawa for over 14 years, Sherif has contributed to WebGoat and the OWASP Cheat Sheets. He also helped the SANS and GIAC organizations launch their GSSP-Java and GSSP-NET exams and contributed to a few of their courses. After switching from software development to security, Sherif took on the mission of supporting developers in shifting security left and shipping more secure code organically.