
christopher bischoff's List: Risk Management

  • Of note to #risk guys: Probabilistic Risk Assessment (PRA) methodologies are NOT suited to managing risk in malicious environments. — Brian Snow
      • From the business investor’s perspective, there are three categories of risk[2]:

        1. Delivery Risk. The risk that the software is NOT delivered on time, on budget, and to the required quality.
        2. Business Value Risk. The risk that the project does not deliver the value expected.
        3. Existing Business Model Risk. The risk that the project actually damages the existing organisation.

        Agile and Lean Software Development Techniques address the first category of risk. Feature Injection, Lean Start Up techniques and Business Value are starting to address the second category of risk. Real Options[3] can be used to manage all three types of risk.


    • Risk != uncertainty (unless you’re a Knightian frequentist, and then you don’t believe in measurement anyway), though if you were to account for risk in an equation, the amount of uncertainty would be a factor.
    • risk != “likelihood” (to a statistician or probabilist anyway). Like uncertainty, likelihood has a specific meaning.


    • On that particular topic, I cannot better the words of Richard Feynman, from his famous minority report on the Challenger Space Shuttle disaster:

      It appears that there are enormous differences of opinion as to the probability of a failure with loss of vehicle and of human life. The estimates range from roughly 1 in 100 to 1 in 100,000. The higher figures come from the working engineers, and the very low figures from management. What are the causes and consequences of this lack of agreement? Since 1 part in 100,000 would imply that one could put a Shuttle up each day for 300 years expecting to lose only one, we could properly ask "What is the cause of management's fantastic faith in the machinery?"
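      Feynman's arithmetic is easy to verify. A quick sanity check (illustrative, not from the report itself) of what a 1-in-100,000 per-launch failure probability would imply:

      ```python
      # Sanity check of Feynman's figure: at a claimed failure rate of
      # 1 in 100,000, launching one Shuttle every day for 300 years would
      # still be expected to lose roughly one vehicle.
      launches = 300 * 365          # one launch per day for 300 years
      p_failure = 1 / 100_000       # management's claimed per-launch failure probability
      expected_losses = launches * p_failure

      print(launches)               # 109500 launches
      print(expected_losses)        # ~1.1 expected losses over the whole period
      ```

      The working engineers' estimate of roughly 1 in 100 would instead predict about a thousand losses over the same schedule, which is the gap Feynman is pointing at.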

    • Ultimately, risk management is a numbers game; you multiply a wild-ass guess by a fudge factor. Worse, the potential cost of failure is folded in as an estimated factor, too. So you're trying to balance an unjustified estimate of the cost of failure against a wild-ass guess multiplied by a fudge factor. Generally, what is really going on is that risk management is used as a sort of statistical shell game to manipulate the perceived value of security when dealing with a clueless senior manager. Bluntly: it's lying with statistics. Those who engage in it do so because they think their managers are idiots. The fact that they are often right is sad, but should not surprise anyone.
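    The "numbers game" being criticized is essentially the classic annualized-loss-expectancy formula (ALE = single loss expectancy × annual rate of occurrence). A minimal sketch, with the input numbers purely hypothetical, shows how little machinery sits behind the resulting figure:

    ```python
    # The calculation the annotation criticizes: ALE = SLE x ARO.
    # Every input here is a guess; the figures below are hypothetical,
    # chosen only to illustrate the arithmetic.
    def annualized_loss_expectancy(single_loss_estimate: float,
                                   annual_rate_of_occurrence: float) -> float:
        """The 'wild-ass guess times fudge factor' product."""
        return single_loss_estimate * annual_rate_of_occurrence

    ale = annualized_loss_expectancy(
        single_loss_estimate=250_000,    # guessed cost of one incident
        annual_rate_of_occurrence=0.1,   # guessed incidents per year
    )
    print(ale)  # 25000.0
    ```

    The point of the critique is that the output inherits the (unstated) error bars of both inputs, yet is usually presented to management as a single authoritative number.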

