
The DOD caught with its Pants down – A Revolution in Risk Assessment needed?

December 29, 2012

It seems to have passed commentators by, but the US Government Accountability Office (GAO) recently uncovered a major under-reporting of risk at the Department of Defense. (see Figure 1 below).1

The GAO’s recommendations read like a school report: the DOD should re-do its homework and try harder in the future…

However, I have a different and more revolutionary approach to risk management to propose – and it is based on real world feedback, agile style.

Figure 1: All is well, apparently…

Background

The GAO’s recent report on the usefulness of the Federal Information Technology Dashboard uncovered the fact that the DOD is:

“masking significant investment risks, has not employed its own risk management guidance, and has not delivered the transparency intended by the Federal IT Dashboard.”

The Federal IT Dashboard was introduced in 2009, and whilst it has increased the transparency of the performance of major federal IT investments, the GAO appears unsure of its usefulness. It notes that although agencies adjust the risk assessments of their investments over time, there is not enough evidence as to whether this reflects mere changes to risk assessment processes or actual changes in the real underlying risks.

In this latest report the GAO looked at six departments, and although the DOD whitewash of its risk assessments is the most egregious, the other departments (with the possible exception of the Department of the Interior – DOI) look suspiciously optimistic also (see Figure 1).

The GAO’s Recommendations?

The GAO has called, yet again, for more information on investment performance and more care in the rating processes used for risk assessment.

We have heard these recommendations before. In a recent presentation to senior government advisers in London I highlighted how increasing detail in planning and analysis at the FBI led to no greater insight into risk management.

In its annual assessments of the FBI’s case management overhaul, which in the end wasted over $600m, the GAO had recommended ever more detailed planning and risk management, and was encouraged by the mountains of paperwork that had been produced. But the program failed, despite ‘Green’ risk assessments from an ‘independent’ $100m Programme Management Office (PMO). The annual GAO scrutiny simply highlights the problem of trying to predict risk by enforcing bureaucratic conformity rather than by delivering. Bureaucratic conformity, after all, is the best way to ensure ‘groupthink’ and optimism bias.

In the end, only an ‘agile revolution’ at the FBI saved the day after the prime contractor had failed to deliver again and again.

Figure 2: Brian in PS Panel Session after his talk (transcript of talk here)2

We Have Heard this all Before

This recent GAO report repeats similar advice. The GAO has found that the DOD’s risk assessments rate none of its 87 current investments as even moderately high risk. Indeed, over 85% of its investments are apparently at low or moderately low risk.

The GAO is skeptical, saying that the DOD deliberately downplays delays and cost increases so as to reduce the likelihood of scrutiny by the Office of Management and Budget (OMB):

“The DOD is masking significant investment risks, has not employed its own risk management guidance, and has not delivered the transparency intended by the Dashboard.”

The GAO’s recommendation to DOD? That more performance assessment information should be fed into the same process – in other words, more of the same.

Analyzing the GAO Report in More Depth

The report admits the limited usefulness of the dashboard:

“Both OMB and several agencies suggested caution in interpreting changing risk levels for investments … An increase in an investment’s risk level can sometimes indicate better management by the program … conversely, a decrease in an investment’s risk level may not indicate improved management if the data and analysis on which the CIO rating are based is incomplete, inconsistent, or outdated.”

So the implication is that none of the risk assessments can be taken at face value, and projects that are badly run are the most likely to have the most unrealistic assessment. But surely those are the ones most in need of effective risk management?

A Revolution in Risk Assessment is Needed: Agile Risk Management

The crux of my argument in my book “Agile Project Management for Government” is that decisions should be based on practical feedback from what works – and that this feedback is needed early in a project lifecycle and frequently thereafter.3

My proposal in this blog is a simple one: we should not assess the risk of project failure by measuring compliance to bureaucratic assessments. We should instead rate all commitments to spend as ‘Red’ until they start to deliver, and then they should move into ‘Yellow’ status until substantial implementation has taken place that proves the project concept.

This will encourage smaller, more modular projects (as the OMB and GAO concur are required) and earlier ‘proof of the pudding’ from real-world success. It builds on Barry Boehm’s spiral development model, in which each iteration of a spiral of work starts with a risk assessment.4

The Federal IT Dashboard should be reworked to separate money balanced on precarious assumptions about the likely success of implementation from money being invested in proven concepts that are being incrementally developed and released…
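
To make the proposed rating rule concrete, here is a minimal sketch in Python. The names used (agile_rag_rating, releases_delivered, concept_proven) are hypothetical illustrations of the rule, not part of any existing dashboard or standard:

```python
from enum import Enum

class Rag(Enum):
    RED = "Red"        # commitment to spend; nothing delivered yet
    YELLOW = "Yellow"  # early releases delivered; concept not yet proven
    GREEN = "Green"    # substantial working implementation proves the concept

def agile_rag_rating(releases_delivered: int, concept_proven: bool) -> Rag:
    """Rate an investment from delivery evidence only, never from paperwork.

    Every commitment starts Red, moves to Yellow once real releases reach
    users, and only turns Green when substantial implementation has proven
    the project concept.
    """
    if releases_delivered == 0:
        return Rag.RED
    if not concept_proven:
        return Rag.YELLOW
    return Rag.GREEN

# Example: two releases are live, but the core concept is not yet proven
print(agile_rag_rating(releases_delivered=2, concept_proven=False))  # Rag.YELLOW
```

Note that in this sketch the rating can only improve as real deliveries accumulate – no volume of planning documentation or compliance paperwork can move an investment out of ‘Red’.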

Comment below…

References

1 US Government Accountability Office. “Information Technology Dashboard: Opportunities Exist to Improve Transparency and Oversight of Investment Risk at Select Agencies.” GAO-13-98. Accessed December 29, 2012. http://www.gao.gov/assets/650/649561.pdf

2 Wernham, Brian. “Agile saves the FBI Sentinel project.” Accessed December 29, 2012. http://www.publicserviceevents.co.uk/ppt/mc12-brian_wernham.pdf

3 Wernham, Brian. Agile Project Management for Government. New York, London: Maitland and Strong, 2012.

4 Boehm, Barry, and W. Hansen. “The Spiral Model as a Tool for Evolutionary Acquisition.” CrossTalk, 2001.

© Brian Wernham 2012 CC BY-NC-ND

From → Agile Governance

4 Comments
  1. It’s not just DOD, it’s a much broader issue…

    Everyone in DC knows the dashboard is bad due to self-reporting. Here is a good article from IW that gives the details (here)

    I do like your proposal, but I’m not sure what you are saying is quite going to fix the problem… The problem isn’t the reporting; the problem is the lack of accountability (who is getting fired for inaccurate dashboard reports? No one…)

  2. Scot,

    Yup – as your link shows, after several years of dashboard reporting, and with all of GAO’s previous recommendations implemented, these problems still remain in the Federal IT Dashboard…

    In the UK an independent review process is used, but these reviews are still problematic – see here:
    http://books.google.co.uk/books?id=7cZyrdr6CaIC&pg=PA236&dq=wernham+gateway+report&hl=en&sa=X&ei=PczhUJW5EMqThgeQh4HYBg&ved=0CDYQ6AEwAA#v=onepage&q=wernham%20gateway%20report&f=false

    Perhaps an ‘alternative’ Federal IT Dashboard could be published using my proposals. GAO could track the existing approach against my proposed Agile IT Dashboard approach and see which is more predictive of lurking problems?

    Brian

  3. Hi Brian,

    I like your idea of project status being red until proven otherwise. I once worked with an organisation that had a four-colour system: Red = in difficulty, Yellow = at risk, Green = OK, White = unknown. The PMO would hold weekly conference calls to review the White projects and only looked at the Red projects at the request of the SRO. What was really useful was that they had a means of turning rated projects back to White if certain criteria hadn’t been met – e.g. the risk register not being updated in the last 30 days, or the SRO not attending two project board meetings in a row.

    Back to the DOD’s dashboard….
    I was once told that if you ask 100 men whether they are above average, approximately 80 of them will say yes. This means that at least 30 of them are either deliberately lying or over-optimistic about their own ability. Whether or not it’s true, it illustrates the issue with self-reporting.

    That’s why Independent Review is so useful. Unfortunately, many government departments, government agencies and indeed regulated industries see Independent Review as a threat (as in your blog’s reference to the fear of coming under OMB scrutiny) and then focus on conformity rather than doing a good job.

    I’ve also seen first-hand many project teams over-inflating costs and schedules in the initial analysis to ensure they have sufficient headroom to never have to report Red/Yellow status, which is just as wasteful as being over-optimistic.

    That’s why effective governance requires treating good decision-making and its required behaviours as a ‘system’ with accountability at its heart. APM’s Directing Change provides useful guidance on how to assess and improve an organisation’s governance ‘system’.

    http://www.apm.org.uk/group/apm-governance-specific-interest-group

    Best regards
    Andy

  4. Some more thoughts on whether the new ISO 31000 standard would have done better:

    It is clear that DoD’s risk management framework based on MIL-STD-882D failed to deliver credible evaluations of risk to the White House.

    So – we can conjecture that if the DoD had adopted ISO 31000 this would not have happened. BUT: what evidence do we have of this?

    Jeff Walker of Booz Allen Hamilton made an interesting presentation at the Environment, Energy Security & Sustainability Symposium of the NDIA in 2011 where he compared the standards.

    His conclusion?

    That they are all “essentially the same”, being based on establishing a repeatable, documented structure: the ‘waterfall’ approach of “Identify Risks, Evaluate Risks, Develop Mitigations, Verify Mitigations, Accept Risk”.

    See his paper here:
    http://wp.me/a2n2JV-4E

    Brian
