Probability and risk: how snake eyes help explain odds in FAIR analysis

Explore how the dice odds example—two ones on two six-sided dice—shows probability, a core idea in FAIR risk analysis. See how probability differs from possibility and prediction, and how likelihood guides decisions in information security, finance, and everyday risk.

Let’s start with a tiny math moment that quietly shapes big decisions: the odds of rolling snake eyes with two dice. If you roll a pair of standard six-sided dice, there are 36 possible outcomes (6 sides on the first die times 6 on the second). Only one of those outcomes gives you two ones—snake eyes. So the chance is 1 in 36. Simple, right? But that little fact packs a punch when you’re thinking about risk in a structured way, especially in the world of FAIR—the Factor Analysis of Information Risk framework.
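That counting argument is easy to verify by brute force. Here is a small Python sketch that enumerates all 36 outcomes and counts the ones that match:

```python
from fractions import Fraction
from itertools import product

# Enumerate every outcome for two six-sided dice: 36 equally likely pairs.
outcomes = list(product(range(1, 7), repeat=2))

# Snake eyes is the single outcome where both dice show 1.
snake_eyes = [roll for roll in outcomes if roll == (1, 1)]

probability = Fraction(len(snake_eyes), len(outcomes))
print(len(outcomes))   # 36
print(probability)     # 1/36
```

Counting matching outcomes over the whole sample space is exactly the definition of probability the article leans on.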

Probability, versus the other “P” words

Here’s the thing: that 1 in 36 is a probability. It’s a precise expression of likelihood, tied to the number of outcomes that could happen. It’s not just about whether something could occur (that would be possibility), and it isn’t a forecast about the future based on trends (that would be a prediction). And it certainly isn’t about mysticism or inevitability (that would be prophecy). Probability sits in a clean, mathematical lane: it answers the question, “How likely is this event, given what could happen in the sample space?”

In everyday risk talk, people often mix these up. You might hear, “There’s a possibility you’ll lose data this year.” That’s a yes-or-no sense of whether something could occur, not a number you can tally. Or someone might say, “We predict that losses will double next year.” That’s a forecast, policy-dependent and uncertain, not a fixed probability. The dice example isolates probability as a measurable chance, a number you can work with when you’re weighing options and trade-offs.

Probability in FAIR terms

FAIR is all about turning uncertainty into something you can manage. It does this by breaking risk into two big parts: how often a loss event is likely to occur (frequency) and how bad it would be if it does occur (magnitude). Combine the two, frequency times magnitude, and you get the risk you actually face.

  • Loss Event Frequency (LEF): How often a threat actually causes a loss, within a given period.

  • Loss Magnitude (LM): How severe a loss would be if the event happens (financial impact, downtime, reputational damage, etc.).

That dice example lines right up with LEF: the chance of snake eyes tells you something about the likelihood that a particular kind of loss event will occur. In a real setting, you’re not counting dice; you’re counting threats, vulnerabilities, and exposures. But the math idea is the same: understand the probability of an event and the scale of its impact, then combine them to gauge risk.
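The frequency-times-magnitude idea fits in a few lines of code. A minimal sketch, with purely illustrative numbers (the frequency and cost figures below are made up, not from any real analysis):

```python
# Hypothetical loss event, for illustration only.
loss_event_frequency = 0.25   # expected loss events per year (about one every 4 years)
loss_magnitude = 80_000.0     # expected cost per event, in dollars

# FAIR-style annualized exposure: how often x how bad.
annualized_loss_exposure = loss_event_frequency * loss_magnitude
print(f"${annualized_loss_exposure:,.0f} per year")  # $20,000 per year
```

The arithmetic is trivial on purpose: the hard part of a real FAIR analysis is justifying the two inputs, not multiplying them.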

A quick mental model you can carry forward

  • Think of risk as a blend: how often something happens × how bad it would be if it does.

  • Probability is the bridge between those pieces. It gives you a number to sit with when you’re deciding what controls to implement.

  • Not all events are as simple as rolling two dice. Some risks are correlated, some are seasonally influenced, and some depend on complex networks of systems. That’s where distributions and more nuanced thinking come in.

From dice to daily decisions

The 1 in 36 example isn’t just a math aside; it’s a window into decision-making under uncertainty. Consider a small business that relies on a digital platform. The team can estimate the probability that a specific security incident leads to measurable losses in a year. It won’t be exactly 1/36, but the principle applies: you quantify likelihood, you estimate potential damage, and you decide where to invest controls.

In the FAIR mindset, you wouldn’t freeze at the word “risk.” You’d translate risk into terms you can act on: frequencies, magnitudes, and the confidence you have in those estimates. You’d ask questions like:

  • How often could a particular threat exploit a vulnerability in our environment?

  • If it happens, what’s the potential financial impact, downtime, or regulatory consequence?

  • How do our controls shift the probability (or the magnitude) of those losses?

A touch more nuance, please: independence and real-world quirks

Two dice are a neat, clean setup: each die is independent, and every outcome carries the same 1/36 probability. Real-world risks aren’t so tidy. Dependencies matter. One cyber vulnerability exploited by a threat actor might raise the chance of a second issue—compounding risk, not just a single event. A regulatory change could simultaneously affect multiple lines of defense. In FAIR language, you learn to map these interdependencies, adjust probability estimates, and keep the overall risk picture honest.
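The difference between the dice and the real world comes down to one multiplication rule. For independent events you multiply the marginal probabilities; for dependent events you need the conditional probability, which can be much larger. A small sketch (all probabilities hypothetical):

```python
# Independent dice: P(both show 1) is the product of the marginals.
p_first = 1 / 6
p_second_independent = 1 / 6
print(round(p_first * p_second_independent, 4))  # 0.0278, i.e. 1/36

# Dependent incidents: suppose exploiting vulnerability A raises the
# chance of incident B (hypothetical conditional probability).
p_incident_a = 0.10
p_b_given_a = 0.40   # far above B's hypothetical standalone 0.10
p_both = p_incident_a * p_b_given_a
print(round(p_both, 2))  # 0.04, four times what independence (0.01) would give
```

Treating correlated incidents as independent is one of the quietest ways a risk model goes wrong, which is why FAIR pushes you to surface those dependencies explicitly.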

That doesn’t mean the dice analogy is useless. On the contrary, it gives you a baseline. If you can understand a simple, clean probability, you’re better equipped to grapple with more complex probability models, including distributions, ranges, and sensitivity analysis. And when you can explain those ideas in plain terms, you’re already ahead in the room—whether you’re communicating with your team, a manager, or a stakeholder.

A practical lane to move in: communicating risk with numbers

One value FAIR emphasizes is making risk talk concrete. Saying “we expect some losses” is not as useful as saying, “the annual expected loss is X dollars with a confidence interval from Y to Z.” If you think in those terms, you’re using probability to shape action—prioritizing defenses where the likelihood and impact intersect in the most consequential way.
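One common way to produce that kind of statement is a Monte Carlo simulation: sample many possible years, then report the mean and a percentile interval. Here is a rough sketch using only the standard library; every input (the frequency, the triangular cost range, the trial counts) is a hypothetical stand-in, not a recommended parameterization:

```python
import random
import statistics

random.seed(42)  # reproducible illustration

LEF = 0.25                         # hypothetical expected loss events per year
COST = (20_000, 60_000, 200_000)   # hypothetical (min, mode, max) cost per event

def one_year() -> float:
    # Crude event count: many small Bernoulli trials approximating the frequency.
    events = sum(random.random() < LEF / 1000 for _ in range(1000))
    # Each event's cost drawn from a triangular distribution over the range.
    return sum(random.triangular(COST[0], COST[2], COST[1]) for _ in range(events))

losses = sorted(one_year() for _ in range(10_000))
mean = statistics.mean(losses)
p5, p95 = losses[500], losses[9500]  # rough 90% interval
print(f"expected annual loss ≈ ${mean:,.0f} (90% interval ${p5:,.0f}-${p95:,.0f})")
```

Note how skewed the result is: with a frequency of 0.25, most simulated years have no loss at all, so the lower end of the interval sits at zero while the mean is pulled up by the bad years. That shape is itself useful information for a decision-maker.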

Here are a few tiny-but-useful ways to bring probability into everyday risk chats:

  • Use straightforward numbers when possible. A single probability number or a small range is clearer than vague guesses.

  • Pair likelihood with impact. A low probability event with enormous impact might still demand attention (think critical data breaches).

  • Consider ranges, not just point estimates. Real-world estimates wobble; expressing a distribution helps everyone see the uncertainty.

Rhetorical nudges that help without shouting

Let me explain with a couple of quick ideas:

  • If you were choosing between controls, you’d pick the ones that push up the odds of staying within acceptable risk levels. Probability helps you compare options on a like-for-like basis.

  • When a risk discussion gets a bit abstract, anchor it to a concrete scenario. “What would this look like if a server goes down for eight hours? What’s the dollar impact, and how does that affect our odds of a bigger loss?”

A tiny detour: how probability ties into risk appetite

Every organization has a taste for risk, a balance between ambition and caution. Probability is the language that helps you calibrate that balance. If the chance of a negative event is uncomfortably high, you either invest more heavily in controls or you adjust your risk tolerance. It’s not about chasing perfect safety; it’s about making informed bets, with a clear sense of probability and consequence.

A few quick takeaways to carry along

  • The snake eyes example is a clean illustration of probability: a 1/36 chance, because there are 36 equally likely outcomes and only one that matches the event.

  • In risk analysis, probability isn’t a forecast or a mystical prediction. It’s a numeric statement about how likely something is to occur.

  • FAIR uses probability to connect how often a loss event might happen with how bad the loss could be. The math helps you prioritize defenses and allocate resources.

  • Real-world risks are messier than dice. Independence, correlation, and changing conditions matter. Build models that reflect that complexity, but always anchor them to clear numbers you can explain.

  • Communicate risk with concrete figures. A few precise numbers, paired with a sense of the uncertainty, are far more persuasive than hand-wavy vibes.

A gentle closer: why this matters beyond the numbers

Numbers are not the enemy; they’re a bridge to better decisions. When you can translate a probabilistic statement into a story about threats, vulnerabilities, and defenses, you empower teams to act with clarity. The dice lesson is more than a math detour; it’s a reminder that risk lives in the space between what could happen and what we choose to do about it.

If you’re exploring risk frameworks, think of probability as your compass. It points you toward the questions that matter: which threats deserve attention, how often they might bite, and how hard the hit could be. And in the end, that blend of thinking—probability guiding action—helps a business stay steady, even when uncertainty is part of the landscape.

So next time you hear a number attached to risk, take a moment to ask: where did that probability come from, and what story does it tell about the steps we should take? The dice are easy to read, but the real world demands a little more nuance. The payoff, though, is a strategy that’s thoughtful, data-driven, and playfully practical.
