Derek Brink, in a post on an RSA blog entitled *Watch Your Language: How Security Professionals Miscommunicate about Risk*, addresses the issue of risk thus:

Shon Harris, author of the popular *CISSP All-in-One Exam Guide*, defines risk as “the likelihood of a threat agent exploiting a vulnerability, and the corresponding business impact.” Douglas Hubbard, author of *The Failure of Risk Management: Why It’s Broken, and How to Fix It*, defines risk as “the probability and magnitude of a loss, disaster, or other undesirable event.” (And in an even simpler version: “something bad could happen.”)

All well and good. But. The devil is in the details of “likelihood.” One favorite measure of the metric-minded among us is the Annual Loss Expectancy (ALE), which is the product of the SLE (Single Loss Expectancy) and the ARO (Annualized Rate of Occurrence).

The problem in measuring risk this way arises when the likelihood (ARO) is very, very low and the consequence (SLE) is very, very high. The old expression “1 in a million” works out to an ARO of 0.000001 and an SLE of $1,000,000. Is the ALE then $1.00? No. The ALE is $1,000,000. If it happens, it happens. If it doesn’t, it doesn’t. The event won’t happen 0.25 times per year. Or 0.33 times a year.
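The arithmetic above can be sketched in a few lines of Python (the function name and variables are illustrative, not from any standard library):

```python
def ale(sle: float, aro: float) -> float:
    """Annual Loss Expectancy: single-loss cost times annualized rate of occurrence."""
    return sle * aro

sle = 1_000_000       # Single Loss Expectancy: $1,000,000 if the event occurs
aro = 1 / 1_000_000   # "1 in a million" chance per year

# On paper this works out to roughly $1.00 per year -- but in any actual
# year the loss is either $0 or the full $1,000,000, never a dollar.
print(f"ALE: ${ale(sle, aro):,.2f}")
```

The formula quietly assumes losses average out over many occurrences, which is exactly what fails for rare, catastrophic events.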

This makes it damnably difficult for an organization to budget for security. If an organization is required to spend the amount that represents the impact multiplied by the probability of that loss, then do you spend $1.00? Or do you spend $1,000,000? The answer lies somewhere in between.

A Nobel prize to the individual who figures out this equation.