Your organization’s approach to risk assessment is imperfect.
You already knew that. Any risk assessment or risk analysis begins by putting limits on, or “bounding,” the topic – identifying which risks to include, which stakeholders to consider, how far into the future to think, et cetera.
But recent research in Science suggests conventional risk management frameworks are fundamentally flawed. The “bounded rationality” that defines and limits a risk analysis is rife with human judgments. And researchers in communication science and decision-making are just starting to recognize those judgments and the impact they have on risk management.
What Is Risk?
Risk is defined as the probability of an event occurring multiplied by the impact it would have.
An unlikely event with a big impact — like a tsunami in the Pacific — could still be a significant risk. Risks are often obvious with new technologies like driverless cars or genetically modified organisms. But all businesses, even in traditional industries, face risks.
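The definition above can be sketched as an expected-loss calculation. A minimal sketch, with entirely hypothetical probabilities and dollar figures, showing why a rare but severe event can outrank a common, mild one:

```python
# Minimal sketch of risk as expected loss: probability × impact.
# All probabilities and dollar figures are hypothetical.

def risk(probability: float, impact: float) -> float:
    """Expected loss from an event: likelihood times consequence."""
    return probability * impact

# A rare, high-impact event can still outrank a common, low-impact one.
flood = risk(probability=0.01, impact=5_000_000)   # rare, severe
outage = risk(probability=0.30, impact=100_000)    # common, mild

print(flood)   # 50000.0
print(outage)  # 30000.0
```

On these (invented) numbers, the unlikely flood carries more expected loss than the frequent outage — which is why low-probability, high-impact events belong in the analysis.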
Climate change poses a broad threat that leaders in the private sector are starting to tackle on their own. Drought and flooding could seriously reduce crop yields for farmers. Freak storms could down power lines and disrupt supply chains across industries. Decreasing availability of water – or government-imposed limits on water usage – could strain water-intensive manufacturing operations.
(NBS’s systematic review Business Adaptation to Climate Change summarizes the risks and opportunities of climate change across many different sectors.)
Other risks arise from businesses’ interactions with people. Community unrest in developing nations threatens the viability of certain mining companies. Human rights abuses at overseas manufacturers can undermine entire supply chains – obliterating brands.
New technologies not yet invented could threaten whole lines of business, as the music, newspaper, and taxi industries have learned.
Experts do their best to identify, prioritize, and value a company’s potential risks – including their costs and benefits. The next step is communicating the results.
Risk Management Today
Professor of engineering and public policy Baruch Fischhoff, of Carnegie Mellon University, reviewed risk analysis models used by the U.S. Environmental Protection Agency, Food and Drug Administration, Department of Homeland Security, and other high-stakes organizations.
“Rational” is different from objective.
Fischhoff explores the concept of “bounded rationality”: “Rather than attempting to address all aspects of a complex decision,” said Fischhoff, “… analyses ‘bound’ it, in the sense of ignoring enough of its elements to be able to treat those that remain ‘rationally.’”
“Rationality” implies objective calculation. However, as Fischhoff discusses, analysis and decision-making in risk management are anything but objective. Human judgments are inherent in risk analysis: “To use analyses wisely, decision-makers need to know what judgments were made and how [those judgments] affected the results.”
Fischhoff identifies two types of judgments that analysts make in risk assessment: scientific and ethical judgments.
Because science is often incomplete, we may not know exactly how likely a given event is. To inform their assessments, analysts and decision-makers may have less rigorous evidence than they would like. The data they use – and its quality and source – represent a scientific judgment.
For example, life cycle analyses assess the energy and materials involved in creation, use, and disposal of products, processes, or services. The person conducting a life cycle analysis must decide how far upstream and downstream to go. Upstream: Does he or she include the energy and material embodied in manufacturing equipment? Downstream: Should the analysis account for methane released from landfill? (Check out NBS’s report Measuring and Valuing Environmental Impacts.)
One might consider scientific judgments the “known unknowns” of risk management.
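The boundary-setting judgment in the life cycle example can be made concrete. A hypothetical sketch (all emission figures invented) showing how the decision to include upstream equipment and downstream landfill methane changes the total:

```python
# Hypothetical sketch: how system-boundary choices change a life cycle total.
# All figures (kg CO2e per unit of product) are invented for illustration.

stages = {
    "raw_materials": 12.0,
    "manufacturing": 8.0,
    "use": 20.0,
    "disposal": 5.0,
    # Boundary-dependent stages the analyst must judge:
    "equipment_embodied": 3.0,   # upstream: energy/material in the machinery
    "landfill_methane": 6.0,     # downstream: emissions after disposal
}

CORE = ("raw_materials", "manufacturing", "use", "disposal")

narrow = sum(stages[s] for s in CORE)  # boundary excludes judgment calls
wide = sum(stages.values())            # boundary includes them

print(narrow)  # 45.0
print(wide)    # 54.0
```

The two totals differ by 20 per cent here — not because the science changed, but because the analyst drew the boundary differently.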
Ethical judgments are subjective decisions that determine where analysts focus their attention and how they (and decision-makers) interpret the results. Analysts make ethical decisions about, for example, which of all possible risks to analyze; which stakeholders to consider when evaluating the risk; and how to define the terms (e.g. “risk of death” could be defined as the probability of dying any time before your natural death or as expected life years lost – a more dramatic outcome for people who die young). Analysts also make ethical judgments about what metric to use to quantify the risk and its potential costs and benefits (e.g. dollars, lives, hours of quality life).
In the lifecycle example, deciding whether impacts in other countries matter is an example of an ethical judgment.
Naturally, people assess risks, costs, and benefits differently. A community living near a mine is likely to be more concerned about potential pollution from it than the consumer who uses its product.
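The “risk of death” example above shows how much the choice of metric matters. A hypothetical sketch with invented numbers (and an assumed life expectancy of 80), in which two hazards kill the same number of people yet look very different once measured in life years lost:

```python
# Hypothetical illustration: the same fatalities measured two ways.
# Numbers are invented; life expectancy is assumed to be 80 years.

LIFE_EXPECTANCY = 80

def deaths(fatalities_by_age):
    """Metric 1: simple count of deaths."""
    return sum(fatalities_by_age.values())

def life_years_lost(fatalities_by_age):
    """Metric 2: expected life years lost (weights young deaths more)."""
    return sum(n * max(LIFE_EXPECTANCY - age, 0)
               for age, n in fatalities_by_age.items())

# Two hypothetical hazards, each killing 100 people per year.
hazard_young = {30: 100}   # deaths concentrated among 30-year-olds
hazard_old = {75: 100}     # deaths concentrated among 75-year-olds

# By death count the hazards are identical...
print(deaths(hazard_young), deaths(hazard_old))  # 100 100
# ...but by life years lost, the first is ten times worse.
print(life_years_lost(hazard_young))  # 5000
print(life_years_lost(hazard_old))    # 500
```

Which metric is “right” is exactly the kind of ethical judgment Fischhoff describes — the arithmetic is trivial, but the choice of what to count is not.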
To reduce the subjectivity of their assessments, analysts and decision-makers can do the following:
Ask for input – before, during, and after.
In the 1980s, the US Environmental Protection Agency (EPA) conducted a series of “risk ranking” exercises with staff, its Scientific Advisory Board and, eventually, citizens from many states: “In these exercises, participants chose the risks (e.g. infectious disease and urban sprawl) and the valued outcomes (e.g. morbidity, mortality, economic development).” Analysts roughly estimated outcomes for each risk. Participants compared those outcomes and set priorities for future analysis and action.
“Having stakeholders define the terms of an analysis tailors it to their needs,” said Fischhoff. “However, it also limits comparisons across analyses.”
Engage your stakeholders before, during and after a risk assessment. Consider inviting stakeholders who you believe may be unaffected by the risks you are examining. It’s the only way to determine if your assumptions about what matters to them are accurate.
It’s particularly important to engage stakeholders on controversial issues.
Show your work.
Document the assumptions – or ethical judgments – in your analysis.
Documentation lets stakeholders examine the validity of your risk assessment, ultimately increasing trust and reducing skepticism. It enables other analysts and researchers to replicate or build upon your work. And it increases the likelihood that affected stakeholders will be receptive to your risk assessment.
Offer à la carte instead of a five-course meal.
Science communication has classically employed a “deficit approach.” Experts decide what knowledge people need in order to be literate on a topic – climate change, finance, energy, and so on. The experts then proceed to (try to) fill people’s heads with all the available information on the topic.
Research has since moved to a more user-centric model, which involves analyzing the decisions or problems facing stakeholders and identifying the few things they most need to know about the issue.
Approach risk the same way you approach community engagement.
Ultimately, your risk analysts should take a lesson from your community relations team. Identify the groups affected by your potential risks, engage them early, communicate the information they care about, and adjust your approach based on their feedback.
The model below shows a process for ongoing, collaborative risk management. Analysts define the problem, evaluate risks and possible solutions, and then monitor implementation. At all stages, they communicate with those affected by the risks.
Fischhoff, B. 2015. The realities of risk-cost-benefit analysis. Science.
Fischhoff, B. 2012. Better ways of thinking about risks. Harvard Business Review.