Why isn’t cyber security taught in schools?

Black Duck Editorial Staff

Oct 12, 2016 / 5 min read

Recent, highly publicized hacks of private and government entities have ignited discussion about expanding the role of cyber security education in traditional CS curricula. That's a good thing. What isn't: most colleges don't require computer science students to take security courses, and many don't even offer them. Why? Because most developers think that security people should handle security problems, and so do the vast majority of their college instructors. After all, developers want to focus on features and functionality, not on fixing problems.

Universities are failing at cyber security education

Per an oft-cited study by CloudPassage conducted earlier this year, only one of the top 36 undergraduate computer science programs in the United States made passing a cyber security course a graduation requirement. After college, retrofitting these young developers with a security-focused utility belt isn't really an option, not just because it's too costly, but because no one has yet found an approach that actually works.

So how can we feed a well-balanced diet of development and security to twentysomethings before they come off the assembly line? How can we give them the know-how to build solid applications, not just functional ones?

The answer is simple: We need to flip the script and start teaching professors and mentors about the benefits of weaving security into every aspect of developer education. Understanding the best way to do that, of course, starts with understanding why we’re not already doing it.

Reason 1: The current academic model doesn’t work when it comes to security

Traditionally, computer science programs have done an excellent job with theory. Students are taught to be great computer scientists—not great developers.

IEEE member Roy Wattanasin addresses one of the main problems with cyber security education: there is no template or guide for instructors to follow. They have to create their own course curricula, and with new zero-days and online exploit kits popping up daily, it's almost impossible to know where to begin. Wattanasin surveyed computer science programs in the Boston area and found that no two computer science courses were alike and no school offered a course specifically on cyber security. This is a huge issue for employers; with such fragmented education, there is no telling how well equipped, if at all, these graduates are to tackle security issues.

Security knowledge, unlike high-level theory, requires developers to get their hands dirty. This isn’t to say traditional methods aren’t useful to a fledgling developer; they just don’t compare to actually building something with security baked in from the get-go.

Understanding security issues requires developers to:

  1. Care about the thing they're protecting, whether that's user data, control of an application, or something else
  2. Think like an attacker and understand how an attacker would gain access
  3. Have a clear understanding of what a security issue looks like in the code base, how it manifests, and how it's exploited (see the sketch below)
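
To make that third point concrete, here is a minimal sketch, not from the original article, of what one of the most common security issues, SQL injection, looks like in a code base, how it's exploited, and what the fix looks like. It uses Python and the standard sqlite3 module purely for illustration.

    import sqlite3

    # Illustrative sketch only: a classic SQL injection, side by side with the fix.

    def find_user_vulnerable(conn, username):
        # VULNERABLE: user input is concatenated directly into the SQL statement.
        # Submitting the string  ' OR '1'='1  turns the WHERE clause into a
        # condition that is always true, dumping every row in the table.
        query = "SELECT id, username FROM users WHERE username = '" + username + "'"
        return conn.execute(query).fetchall()

    def find_user_safe(conn, username):
        # FIX: a parameterized query keeps the input as data, never as SQL syntax.
        return conn.execute(
            "SELECT id, username FROM users WHERE username = ?", (username,)
        ).fetchall()

    if __name__ == "__main__":
        conn = sqlite3.connect(":memory:")
        conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, username TEXT)")
        conn.executemany("INSERT INTO users (username) VALUES (?)",
                         [("alice",), ("bob",)])
        payload = "' OR '1'='1"
        print(find_user_vulnerable(conn, payload))  # returns every user
        print(find_user_safe(conn, payload))        # returns nothing

An exercise built around both versions walks a student through all three steps above: caring about the data, thinking like the attacker, and seeing the vulnerability and its fix in the code itself.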

And for many developers, it’s just not practical to go through the entire process—especially when security issues can be someone else’s problem.

Reason 2: Cyber security degrees—Great for security people, but what about developers?

The way security people think, react to problems, and reach solutions is foreign to developers. It’s almost like they speak different languages. So while undergraduate cyber security degrees might very well result in high-quality, focused professionals, they aren’t useful for developers. “You know the situation is bad when companies like Bloomberg, Facebook, Google, and Microsoft are creating their own cyber security programs to train employees,” says Ming Chow, a professor at Tufts University.

What the world really needs isn’t a new degree. It’s a developer-education reboot. Why? Without dovetailing cyber security education into developer education from the start, developers won’t become truly bilingual. They’ll always be “developers who know something about security,” not “security-driven developers.” And only the latter can be trained to incorporate the principles of security into dev work and ensure nothing gets lost in translation.

Reason 3: Nontraditional & DIY approaches aren’t always reliable

Boot camps and online courses do a fantastic job of preparing new developers to write code. Graduates of these programs may very well have an edge over CS students because of the project-based, hands-on nature of their education. But despite their explosive growth, devs coming out of these programs don't get the same level of cyber security education they would in traditional academic programs. The skills they learn in these crash-course environments need to be continually developed.

Even if we assume it's possible, in principle, for a self-taught developer to focus on writing secure code from the start, that rarely works in practice. A Google search or a trip down a Stack Overflow rabbit hole can do more harm than good.

This also impacts veteran devs looking to bolster their security knowledge when on-the-job training doesn't cut it. The lack of solid resources disincentivizes learning and reinforces the mistaken idea that security issues should be handled by security people.

Change is coming

Pockets of the academic world have recognized that the CS curriculum in its current form is failing to prepare students for the workforce, and they’re trying to fix it one course at a time.

Ming Chow offered colleagues the following manifesto at the end of his New England Security Day presentation “Chipping Away at the Security Education Problem.”

  • There is no excuse not to integrate cyber security into computer science education, especially systems and application-based courses.
  • Inform students of the security and privacy problems and opportunities; ask students to be good citizens.
  • Encourage and challenge students to develop the curiosity and mindset of the bad guy.
  • Do not use only traditional teaching and learning techniques for courses. Learning how to take tests isn’t helping.
  • Provide mentorship and networking opportunities.

By adhering to these principles, professors can turn out stronger developers (and better job candidates).

Provided the professors themselves have an understanding of security, small tweaks to existing projects won’t be difficult to implement. Baby steps will pay major dividends. Encourage CS students to think about security at the beginning of each assignment. Prompt them to discuss potential threats, where they come from, and how to prevent them.
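
As a hypothetical example of such a tweak, a standard intro assignment that opens a file named by the user can gain a single security-minded check, plus a built-in prompt to discuss the threat it defends against. The directory name and function below are invented for illustration.

    import os

    # Hypothetical "baby step": the assignment still just reads a submission file,
    # but now students must discuss the threat (a user typing "../../etc/passwd")
    # and defend against it.

    SAFE_DIR = os.path.abspath("submissions")

    def read_submission(filename):
        # Resolve the full path and refuse anything that escapes the expected folder.
        path = os.path.abspath(os.path.join(SAFE_DIR, filename))
        if os.path.commonpath([SAFE_DIR, path]) != SAFE_DIR:
            raise ValueError("refusing to read outside the submissions directory")
        with open(path, "r", encoding="utf-8") as f:
            return f.read()

The assignment barely changes, but the student now has to name a threat, explain where it comes from, and write the few lines that prevent it.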

Educators—academic or otherwise—are in the best possible position to incite change. A shift toward more secure code begins with recognition of its importance and subsequent inclusion in curricula. Every CS professor and boot camp instructor can ask students, “What could go wrong?” and start a conversation.

Where does this leave us?

Hopeful!

If we want students to become developers who write secure code, the answer is simple: Put them in an environment where secure code is the only kind of code. Unlike security professionals, developers are in the unique position to fix security issues before they start. Like, before the toasters and smart cribs start spying on us. Clearly, application security is evolving, and developer education needs to evolve with it.

If proper training means delaying the inevitable rise of Skynet, we’re all for it. So, professors! Developers! Boot camp mentors! Stack Overflow contributors! Rise up and help us eliminate the plague of application insecurity by equipping young developers with the cyber security education they need to write secure code.
