Several years ago I worked on developing a curriculum for a master’s degree in information security. The choice we faced at the time was between an “MBA with a security minor” and a technical focus similar to SANS. Our original idea was to strike a balance between the two: that is, to focus on the technical knowledge that a technical lead in information security would need, combined with the background to take on a technical leadership role within a security organization.
An example will clarify this. We did not expect that our students would complete the curriculum knowing how to configure an XXX firewall. We did expect that the student would know what a firewall is and what it does, and be able to explain how various firewall rules affect the flow of traffic. So, for example, we expected our student to understand the statement “Block all incoming TCP connections that aren’t associated with a connection request from ‘inside’ the network,” and be able to communicate it to a non-technical audience (e.g., “We don’t allow folks from outside our network to connect to our internal servers.”).
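To make that statement concrete, here is a minimal sketch of the stateful logic behind such a rule. Everything here is illustrative — the function names and the data structure are hypothetical, not any real firewall’s API:

```python
# Sketch of the rule "block incoming TCP connections that aren't
# associated with a connection request from inside the network."
# A real firewall tracks connection state in a kernel table; here a
# plain set stands in for that table (purely illustrative).

established = set()  # (inside_host, outside_host, port) connections initiated from inside

def outbound(inside_host: str, outside_host: str, port: int) -> None:
    """An inside host opens a connection to the outside; remember it."""
    established.add((inside_host, outside_host, port))

def allow_inbound(outside_host: str, inside_host: str, port: int) -> bool:
    """Permit an incoming segment only if it belongs to a connection
    that was initiated from inside the network."""
    return (inside_host, outside_host, port) in established

# An inside host browses out; the reply traffic is allowed back in.
outbound("10.0.0.5", "93.184.216.34", 443)
print(allow_inbound("93.184.216.34", "10.0.0.5", 443))  # reply: allowed
print(allow_inbound("203.0.113.9", "10.0.0.5", 22))     # unsolicited: blocked
```

That is the whole idea our hypothetical graduate should be able to explain: the firewall remembers who started the conversation, and unsolicited visitors are turned away.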
Under a new administration, the program changed focus to a more “business” oriented program, focusing on risk management, eGRC and the like. All technical courses were designated as electives, except for a two-semester course that covered the (ISC)2 CBK. It was possible to achieve a master’s degree without ever being exposed to issues such as cryptography, network security, software security testing and so forth beyond the material covered in the (ISC)2 CBK.
I don’t object to the role of the CISO or the CSO, nor do I object to the technical roles of a network security analyst or systems administrator. However, I strongly believe that there needs to be an individual who is able to communicate effectively with the suits and the T-shirts. The proof of the pudding is indeed in the eating, and a corporate GRC policy is only as good as the implementation of that policy.
I always look askance at articles like “10 highest paying IT security jobs,” as posted at CSO Online, especially when it comes to the numbers. Where is the data from? What was the sample size? And all those other statistical-type questions.
So while we can’t necessarily trust the numbers, maybe we can trust the positions. I’ve broken them out into management and technical positions based on my own biased heuristics. 🙂 The numbers reflect each position’s rank in the original article.
Management:
2. Chief security officer
3. Global information security director
4. Security consultant (judgement call on my part)
5. Chief information security officer
6. Director of security
9. Application security manager

Technical:
1. Lead software security engineer — average salary: $233,333!
7. Cyber security lead
8. Lead security engineer
10. Cybersecurity engineer
Call me cynical, but at least some of our current security issues are reflected in the fact that 60% of the top-paid job descriptions are management, and 3 of the 4 technical positions are at the bottom of the pay scale. Proof again that the further you get from the actual hands-on work, the less you know about the dirty details. And the devil is in the details.
Well, yet again, another member of law enforcement has decided to step into the privacy/encryption wars. Today, Suffolk D.A. Dan Conley dragged out the same old argument that providing encryption to the masses only serves the interests of criminals:
“In America, we often say that none of us is above the law. But when unaccountable corporate interests place crucial evidence beyond the legitimate reach of our courts, they are in fact granting those who rape, defraud, assault and even kill a profound legal advantage over victims and society.”
Which translates, I think, to this: those of us who use encryption to maintain a shred of privacy, in the face of unlimited collection of all kinds of communication without benefit of a search warrant, are somehow playing into the hands of criminals who will now be able to carry on their nefarious activities behind the shield of encrypted communications.
Notice that Mr Conley, like so many of his ilk, never provides any statistics demonstrating that law enforcement was unable to proceed because of encryption. Instead, we hear about horrific cases that supposedly wouldn’t have been solved if the perpetrator had had access to encryption technology. And we never hear about cases where the perpetrator did use encryption and law enforcement was able to circumvent it.
Funny thing about this is that the same arguments were made when Phil Zimmermann published the code for Pretty Good Privacy (PGP). As far as I know, the sky hasn’t fallen yet, although it may have and knocked me unconscious. No, wait, I pinched myself … I’m awake.
Derek Brink, in a blog post on an RSA blog entitled Watch Your Language: How Security Professionals Miscommunicate about Risk, addresses the issue of risk thusly.
Shon Harris, author of the popular CISSP All-in-One Exam Guide, defines risk as “the likelihood of a threat agent exploiting a vulnerability, and the corresponding business impact.” Douglas Hubbard, author of The Failure of Risk Management: Why It’s Broken, and How to Fix It, defines risk as “the probability and magnitude of a loss, disaster, or other undesirable event.” (And in an even simpler version: “something bad could happen.”)
All well and good. But. The devil is in the details of “likelihood.” One favorite measure of the metric-minded among us is the Annual Loss Expectancy (ALE), which is the SLE (Single Loss Expectancy) multiplied by the ARO (Annualized Rate of Occurrence).
The problem in measuring risk thusly arises when the likelihood (ARO) is very, very low and the consequences (SLE) are very, very high. The old expression “1 in a million” works out to an ARO of 0.000001 and an SLE of $1,000,000. Is the ALE then $1.00? No. The loss, if the event occurs, is $1,000,000. If it happens, it happens. If it doesn’t, it doesn’t. The event won’t happen 0.25 times per year, or 0.33 times a year.
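The arithmetic above can be sketched in a couple of lines; the figures are the post’s own “1 in a million” hypothetical, not real data:

```python
# Annual Loss Expectancy: ALE = SLE x ARO.
# SLE = Single Loss Expectancy (loss if the event happens once),
# ARO = Annualized Rate of Occurrence (expected events per year).

def ale(sle: float, aro: float) -> float:
    """Return the Annual Loss Expectancy."""
    return sle * aro

sle = 1_000_000        # a very, very large single loss
aro = 1 / 1_000_000    # a very, very unlikely event: "1 in a million" per year

print(f"ALE: ${ale(sle, aro):,.2f}")  # roughly $1.00 on paper
```

The formula dutifully reports about a dollar a year of “expected” loss, which is exactly the trap: the organization’s actual exposure in the bad year is the full million.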
This makes it damnably difficult for an organization to budget for security. If an organization is required to spend the amount that represents the impact multiplied by the probability of that loss, do you spend $1.00? Or do you spend $1,000,000? The answer lies somewhere in between.
A Nobel prize to the individual who figures out this equation.
I recall, a number of years ago, that Marshall Rose described technical folk as divided into go’ers and do’ers. The go’ers were most likely to attend conferences and working groups, as well as act as representatives to standards committees. The do’ers, on the other hand, stayed in front of their workstations, working out thorny protocol issues and writing interoperable code against imperfect specifications.
And going even further back, we can distinguish between knowing how and knowing that. I don’t fully know the details of the internal combustion engine, but I can still drive a car. I do expect my mechanic to understand the details, at least to the extent that she is able to diagnose a particular problem and come up with a solution.
Which is why the following post caught my attention. In SANS NewsBites Vol. 15, Num. 103, Alan Paller wrote:
The top story at the end of 2013 could just as well have been the top story ten years ago. Federal chief information security officers continue to “admire the problem” by paying $250/hour consultants to write reports about vulnerabilities rather than paying them to fix the problem. Sadly most of the federal CISOs and more than 85% of the consultants lack sufficient technical skills to do the forensics and security engineering to find and fix the problems. Paying the wrong people to do the wrong job costs the U.S. taxpayer more than a billion dollars each year in wasted spending plus all the costs of cleaning up after the breaches. How about a 2014 New Years resolution to spend federal cybersecurity money usefully: either by ensuring all the sensitive data is encrypted (at rest and in transit) and/or the organization implements the Top 4 Controls on the way to implementing the 20 Critical Security Controls?
Now, I’m not sure that a CISO needs to have the technical skills “to do the forensics and security engineering to find and fix the problem.” But the CISO should know whether they have the expertise in-house to do so, or whether the consultants they are hiring have these skills, and should have the clout necessary to ensure that the right people are hired and that the job is done right. Otherwise, the top story of 2023 will be the same as 2013.
I could rant on, but I don’t want to break a New Year’s resolution quite yet. 🙂
It’s just the same old song / with a different beat …
Available at http://www.packtpub.com. I was one of the technical reviewers for this book, and I think it fills an important niche for using Wireshark for network analysis (and don’t forget security as well).
Digital Forensics, Incident Response and Root-Cause Analysis have many tools in common. Digital Forensics and Incident Response have different procedures.
via Digital Forensics Links | Aggressive Virus Defense.
And this, ladies and gentlemen, is what we need to be teaching our security students, instead of all this GRC bullshit.
I’ve recently reached agreement with a major technical publisher to write a book on security. Yes, I’m being deliberately coy at the moment, but more details will follow in good time.