Subjectivity and Risk Scoring

One common misconception that I was guilty of subscribing to is that while qualitative risk analysis always carries some degree of subjectivity, quantitative risk analysis should remain strictly objective. My thinking has since shifted: there is no such thing as truly objective data with which to conduct such an analysis, because even if the measurements are perfectly reproducible and repeatable, observer bias is still injected into the mix. Rather than fighting the subjectivity, it's much easier to simply accept it. I know that sounds like a terrible idea, but bear with me.

In a previous post, I described pulling marbles from a bag, using a Bayesian technique to develop an estimate. A few people actually got in touch with me about how ridiculous it is to automatically assume an equiprobability of marbles without any prior knowledge of the contents. Lacking any prior knowledge or experience, we use the Bayesian concept of an uninformative prior, which gives us a general starting point. This concept demands indifference: until we find out otherwise, we assume equiprobability.

But what if we had prior knowledge or experience? If I had spent my youth playing marbles, perhaps I could tell you that a bag of that size and weight likely contains about 20 marbles. Could we use this to our advantage? Absolutely. We can now reasonably assume that the red marble will be drawn a minimum of 5% of the time. This is what is often referred to as an a priori, or informative, prior. But that's just a minimum; we could be facing a significantly higher percentage. However, an incomplete data set is what drove us towards a Bayesian technique in the first place. We now have the option of using either the informative or the uninformative prior for our first round of testing.
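As a sketch of the difference between the two priors, here is one way they might be encoded for the marble example. The 20-marble bag and the "at least one red" belief come from the post; the choice of enumerating hypotheses over the count of red marbles is my own illustration, not a prescribed method.

```python
from fractions import Fraction

# Uninformative prior: with no knowledge of the bag, the principle of
# indifference assigns equal probability to every hypothesis about how
# many of the 20 marbles are red (including zero).
hypotheses = list(range(21))                      # 0..20 red marbles
uninformative = {h: Fraction(1, 21) for h in hypotheses}

# Informative prior: experience says the bag holds about 20 marbles,
# and we believe at least one is red, so the "0 red" hypothesis is
# ruled out. The 1-red case is the low end of the spectrum.
informative = {h: Fraction(1, 20) for h in range(1, 21)}

# The minimum chance of drawing red under the informative prior:
print(min(informative) / 20)                      # 1/20 -> the 5% floor
```

Both priors still sum to 1; the informative prior simply reallocates belief using experience, which is exactly the point of contention discussed above.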

This is something that pains people to discuss, because we are now possibly treating a 5% likelihood as the low end of the spectrum based upon my personal experience. Before we grab the pitchforks and torches, let's remember that expert judgment is regularly called upon throughout the planning of various aspects of a project, and that human input remains important. This is not to say that we should not debate the veracity of this estimate, because we should! But let's not debate the source; let's not argue solely because it was based upon a person's opinion rather than empirical data.

At the end of the first test, when we draw the first marble and record the result, we have our first posterior. However, when we run the test again, we will change the equation we use to calculate probability: we will have a new likelihood for drawing a red marble, as the posterior from the previous experiment becomes the next experiment's prior.
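A minimal sketch of that loop, assuming draws with replacement and a conjugate Beta prior over the chance of drawing red (my choice of model; the post itself does not fix one):

```python
def update(prior, red_drawn):
    """One Bayesian update: the returned posterior is the next prior.

    `prior` is a Beta distribution (alpha, beta) over the probability
    of drawing a red marble; the conjugate update is a count bump.
    """
    alpha, beta = prior
    return (alpha + 1, beta) if red_drawn else (alpha, beta + 1)

prior = (1, 1)                         # uninformative: uniform on [0, 1]
for red_drawn in [True, False, False, True, False]:   # recorded draws
    prior = update(prior, red_drawn)                  # posterior -> prior

alpha, beta = prior
print(alpha / (alpha + beta))          # posterior mean, 3/7 ~ 0.43
```

Each pass through the loop is one "test": the distribution coming out of one draw is fed straight back in as the prior for the next, which is the mechanism the paragraph above describes.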

As further data is collected, our calculations will continue to evolve, and with this additional data we will develop refined probabilities. Herein lies the argument people make against Bayesian methods for risk management on a predictive life-cycle project: if planning is done up front, how can appropriate plans ever be completed when the results of risk management activities continue to change? The problem is one of attempting to conduct planning in a vacuum, not one of the methodology we are using for risk.

“When the facts change, I change my mind. What do you do, sir?” – John Maynard Keynes

It goes without much argument that planning within a project is iterative and ongoing. In fact, project managers regularly engage in what is called progressive elaboration, where a plan is continually refined as more information becomes available. Risk management is no different. One of the processes invoked is Control Risks, in which risks are reassessed at a frequency determined in the risk management plan. This reassessment determines whether a shift in probability and/or impact has occurred since the last assessment.

This type of reassessment should be intuitive for most people. I have lived in New Jersey for a number of years, and it gets some great weather, especially in the wintertime. When we receive word that we are going to have a major snowstorm, I try to check the weather every 3-4 hours to see if it's still coming towards us and how much snow we are going to get. It's a running joke that if they call for the storm of the century, we'll get flurries; but the opposite holds true as well. Think about how unreasonable it would be for me to watch the news once and tell the kids that they don't have school next week!

Risk is uncertainty. The thing that hurts most projects is that we try to turn that uncertainty into certainty, which simply does not work. I frame everything in terms of likelihood of occurrence, based upon probability developed from data. There's an 80% chance of snow? I like to tell my kids there's a 20% chance that they're going to school. Risks can be managed, and we can do our due diligence to gather as much data as possible.

Karl Cheney
Castlebar Solutions

Systemigrams and Complexity

I have been down in Virginia this week for some training and had some time to talk with some really smart people, something I always enjoy. One of the topics that invariably pops up in the greater D.C. area is government contracts and the administration thereof. It all started with discussing how difficult it is to give initial estimates for work that will somehow, hopefully, land within the tolerance of whatever the government's independent estimators come up with.

These discussions are always fun to be a part of, especially when you are working with people from different industries. One came from a background of bidding as a prime and subcontracting out work, and touted his experience with LPTA (lowest price technically acceptable) contracting, while the other works at an engineering firm where the complexity of the solutions runs contrary to the cost-only provisions of LPTA. Hearing the words complexity and solution in the same sentence always piques my interest, so I immediately started asking how they manage to develop accurate estimates where uncertainty reigns.

The traditional means of reducing risk is to conduct planning in a predictive life-cycle project, or to conduct a spike in an agile organization. The issue is that funding and resources may not be available to a project manager who is tasked with putting together an RFP for a procurement activity. This is where systemigrams come in. When I first heard the word systemigram, I asked them to say it again, then to spell it, and finally I admitted my ignorance of the topic. To be fair, and in my defense, even for a portmanteau it sounds like it was made up on the spot.

While a systemigram may be overwhelming to look at initially, think about how beautifully it can represent the multitude of factors that affect something like cyber security on social media. That is not a concept that lends itself well to illustration. Systemigrams do an awesome job of organizing the chaotic thoughts that surround complexity.

Blair, Boardman and Sauser wrote about systemigrams as one approach for the United Kingdom's Ministry of Defence to examine Systems of Systems (SoS), in lieu of the traditional models used previously, which typically discounted, rather than made sense of, the complexity an organization faces. The process begins with capturing the system in a narrative and then illustrating it. The concept is hardly new, but the manner in which it is carried out makes it far more useful, as it becomes possible to examine the subject holistically. They listed the following rules for building a systemigram:

Rules for Prose
1. Address strategic intent, not procedural tactics.
2. Be well-crafted, searching the mind of reader and author.
3. Facilitation and dialogue with stakeholders (owner/originator of strategic intent) may be required to create structured text.
4. Length variable but less than 2,000 words; scope of prose must fit scope of resulting systemigram.

Rules for Graphic
1. Required entities are nodes, links, inputs, outputs, beginning, end.
2. Sized for a single page.
3. Nodes represent key concepts: noun phrases specifying people, organizations, groups, artifacts, and conditions.
4. Links represent relationships and flow between nodes: verb phrases (occasional prepositional phrases) indicating transformation, belonging, and being.
5. Nodes may contain other nodes (to indicate break-out of a document or an organizational/product/process structure).
6. For clarity, the systemigram should contain no crossover of links.
7. Based on experience, to maintain a reasonable size for presentation purposes, the ratio of nodes to links should be approximately 1.5.
8. Main flow of the systemigram is from top left to bottom right.
9. Geography of the systemigram may be exploited to elucidate the “why,” “what,” and “how” in order to validate the transformational aspect of the systemic diagram.
10. Color may be used to draw attention to subfamilies of concepts and transformations.
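To make the graphic rules concrete, here is a small sketch of a systemigram as a graph of noun-phrase nodes and verb-phrase links, with a check for the node-to-link ratio from rule 7. The class and the toy "cyber security on social media" fragment are my own illustration, not from Blair, Boardman and Sauser.

```python
from dataclasses import dataclass, field

@dataclass
class Systemigram:
    nodes: set = field(default_factory=set)     # noun phrases (rule 3)
    links: list = field(default_factory=list)   # (source, verb, target) (rule 4)

    def add_link(self, source, verb, target):
        self.nodes.update({source, target})
        self.links.append((source, verb, target))

    def ratio_ok(self, target=1.5, tolerance=0.5):
        # Rule 7: keep the ratio of nodes to links near 1.5
        # so the diagram stays a reasonable size for presentation.
        return abs(len(self.nodes) / len(self.links) - target) <= tolerance

# Toy fragment of a cyber-security-on-social-media systemigram:
sg = Systemigram()
sg.add_link("Users", "share", "Personal data")
sg.add_link("Personal data", "attracts", "Attackers")
sg.add_link("Attackers", "exploit", "Platform weaknesses")

print(len(sg.nodes), len(sg.links), sg.ratio_ok())   # 4 3 True
```

Rules like single-page sizing or "no crossover of links" depend on the drawn layout and cannot be checked from the graph alone, which is part of why building a good systemigram remains a craft.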

Even with these rules clearly stated, people struggle with complex concepts. It turns out that the ability to create and interpret systemigrams depends upon abilities often called Systems Thinking Skills (STS). Dorani et al. broke these skills down into seven categories: Dynamic Thinking, System-as-Cause Thinking, Forest Thinking, Operational Thinking, Closed-Loop Thinking, Quantitative Thinking, and Scientific Thinking. These all boil down to looking at the whole as more than just the sum of its parts.

This work expanded upon that of Richard Plate, who developed CMAST, the Cognitive Mapping Assessment of Systems Thinking. Plate used CMAST to evaluate the ability of middle-school-aged children to understand and interpret non-linear relationships in a complex system. Plate's work is an awesome read, and one story towards the end really stood out for me: a child wanted very badly to be correct, but was unable to deviate from a linear structure to a more uncomfortable, but correct, structure that involved branching.

One of the reasons I believe complexity is arguably the single largest threat to any project's success is that I have met many adults, some in positions of authority, who thought like this child: in a linear manner, unwilling to deviate even though they want to, even though they know their answer is wrong. They just can't wrap their minds around the complexity.

Karl Cheney
Castlebar Solutions