CIO, February 15, 2010 Issue

Kenneth van Wyk

Applied Insight

Until we start really pushing for positive validation approaches — mechanisms that only allow things that are safe and prevent all others — we're going to continue to be ‘surprised’ by novel attacks. To do positive validation, we need to understand the context of what is going on, which means the solution must reside in (or very close to) our application software. I know full well this is much easier to say than to achieve, but we have to do much better.
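
As a rough sketch of what positive validation can look like in code (this example is mine, not the column's; the field names and allowlist patterns are assumptions), the idea is to enumerate what is acceptable and reject everything else by default:

    import re

    # Allowlist of known-good patterns. Anything not listed, or not matching,
    # is rejected; the default answer is no.
    ALLOWED = {
        "username": re.compile(r"[a-z][a-z0-9_]{2,15}"),
        "quantity": re.compile(r"[0-9]{1,4}"),
    }

    def validate(field, value):
        """Return True only when the value matches a known-good pattern."""
        pattern = ALLOWED.get(field)
        if pattern is None:
            return False  # unknown field: deny by default
        return pattern.fullmatch(value) is not None

Contrast that with a blocklist that tries to enumerate every dangerous input; the allowlist default is what keeps us from being surprised by the next novel attack.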

Penetration Testing Penetration tests are a vital part of our security arsenal, but relying solely on them to determine whether an application or system is secure is downright negligent. The reason is that penetration testing tools and techniques are all too often simply based on network and application scanning. These are inherently outside-in approaches that fail to look deeper into the underlying software architecture, design and source code. Any responsible security testing program must include appropriate design and source code reviews, along with other rigorous dynamic testing (module/API fuzzing, for example). All this, of course, in addition to penetration testing.
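
To make the fuzzing point a bit more concrete, here is a minimal sketch of a module-level fuzz harness in Python; parse_record is a hypothetical function standing in for whatever interface is under test, and none of this comes from the original column:

    import random
    import string

    def random_input(max_len=256):
        # Produce a random, frequently malformed, string to feed the target.
        length = random.randint(0, max_len)
        return "".join(random.choice(string.printable) for _ in range(length))

    def fuzz(target, iterations=10000):
        """Call the target with random inputs and report anything it cannot handle."""
        for i in range(iterations):
            data = random_input()
            try:
                target(data)
            except Exception as exc:  # any unhandled exception is a finding
                print(f"iteration {i}: {type(exc).__name__} on input {data!r}")

    # fuzz(parse_record)  # parse_record is a placeholder for the module under test

Real fuzzers are far smarter than this, but even a crude harness exercises code paths that an outside-in network scan will never reach.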

Bolt-on Security Whenever we read about a new attack tool or technique, it's a natural reaction to want to ensure that our business systems are properly protected against it. When they're not, we seek the quick fix. That's all natural and expected. But that search for a quick fix often leads us down a path of bolt-on security. We buy a product from a vendor, put it in front of our business systems, and then expect miracles. Instead, we get disappointment all too often. There's nothing wrong with stopgap solutions to nasty problems, but we have to recognize that they're just that: stopgap solutions. If we're not also keeping a diligent eye on the underlying systemic issue, we end up with ugly mishmash systems that are bound to fail at the worst possible time.

Overly Optimistic Code This topic is a bit out in left field, but it's something I've seen repeatedly. Many times, the software we rely on is written far too optimistically. Anyone who has written a line of code has no doubt made this sort of mistake — I know I have. What I mean by this is that our software is often written with the assumption that the actions it takes will work just fine. Take a file written to disk, for example. We assume there'll always be adequate disk space to hold the file and that the write operation will work cleanly. The truth is that computer environments often throw unanticipated obstacles at us that cause our assumptions to fail in spectacular ways. And many of these failures have significant security ramifications: customer records stolen, authentication credentials spoofed, and so on.
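
To illustrate the alternative (this sketch is mine, and the path handling and error policy are assumptions, not the author's code), a pessimistic version of the file-write example treats failure as a normal, expected outcome:

    import os
    import tempfile

    def save_report(path, data):
        """Write data to path without assuming the write will simply succeed."""
        try:
            # Write to a temporary file first, then rename it into place, so a
            # full disk or a crash mid-write cannot leave a half-written report.
            fd, tmp_path = tempfile.mkstemp(dir=os.path.dirname(path) or ".")
            with os.fdopen(fd, "w") as tmp:
                tmp.write(data)
                tmp.flush()
                os.fsync(tmp.fileno())
            os.replace(tmp_path, path)
            return True
        except OSError as exc:  # disk full, bad permissions, missing directory...
            print(f"could not save report: {exc}")
            return False

The optimistic version is one line: open, write, done. The pessimistic version admits that the environment can and will fail, and defines what happens when it does.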

Erosion of Software Understanding When I got started in this industry in the late 1980s, it was pretty much a given that folks who had the title of system administrator were adept at programming. Nearly everyone in the field had at least a solid computer science background of some sort. Look around today, though, and you'll see that information security has branched out into its own specialty niche and has fewer and fewer practitioners who can even read software source code. This is a grave mistake, folks. It's not enough for us to know how the latest exploit (and exploit tool) works. We all really need to maintain a deep understanding of the underlying technologies we're working with. It is considered essential for security techies to keep up with the latest security technologies. Nearly everyone among us reads the likes of full-disclosure already. That's all well and good, but we've got to take that deeper.

When doing system reviews, I always look for and encourage an attitude that anticipates things going wrong. Take on the mindset of someone walking across a busy city street with a toddler in tow. It is up to the adult — the software developer, if you will — to anticipate fail states and to keep the toddler safe. When we adopt that kind of attitude in our code, things like positive input validation become obvious.

These are just a few things to consider as we dive into 2010 and beyond. And no, I don't have all the answers, by any stretch. These are tough problems, but the only way we're going to stand a chance is if we understand them and work toward doing better. CIO

With more than 20 years in the information security field, Kenneth van Wyk has worked at Carnegie Mellon University's CERT/CC, the US Department of Defense, Para-Protect and others. He has published two books on information security and is working on a third. He is the president and principal consultant at KRvW Associates LLC. Send feedback on this column to editor@cio.in
