EXECUTIVE OFFICE OF THE PRESIDENT
OFFICE OF MANAGEMENT AND BUDGET
WASHINGTON, D.C. 20503
M-07-24 Updated Principles for Risk Analysis
MEMORANDUM FOR THE HEADS OF EXECUTIVE DEPARTMENTS AND AGENCIES
FROM: Susan E. Dudley
Administrator, Office of Information and Regulatory Affairs,
Office of Management and Budget

Sharon L. Hays
Associate Director and Deputy Director for Science,
Office of Science and Technology Policy

SUBJECT: Updated Principles for Risk Analysis
Federal agencies take a variety of actions to improve public health, safety, and the environment. Agency activities designed to reduce risks are influenced by numerous factors, including Congressional priorities; information on the degree of risk faced by different populations, entities, or individuals; available resources; and the ease of implementing chosen priorities. Development of these actions often begins with an assessment of the risks posed under certain conditions, as well as assessments of the potential changes in risk achievable under different policy options.
In 1995, an interagency working group, co-chaired by the Office of Management and Budget (OMB) and the Office of Science and Technology Policy (OSTP), developed a set of principles to guide policymakers in assessing, managing, and communicating policies to address environmental, health, and safety risks (the 1995 Principles). The 1995 Principles, shared with regulatory agencies in a memorandum from Sally Katzen, then Administrator of OMB's Office of Information and Regulatory Affairs (OIRA), remain valid today.
This Memorandum reinforces the 1995 Principles with reference to more recent guidance from the scientific community, the Congress, and the Executive Branch. This Memorandum also benefits from feedback received on OMB's Proposed Risk Assessment Bulletin issued in 2006 (Proposed Risk Assessment Bulletin).
In January 2006, OIRA, in consultation with OSTP, released the Proposed Risk Assessment Bulletin for public comment and asked the National Academy of Sciences (NAS) to conduct an expert peer review. The NAS issued its report on the Proposed Risk Assessment Bulletin in 2007 (the 2007 NAS Report on the Proposed Risk Assessment Bulletin). While supportive of the goal of "increasing the quality and objectivity of risk assessment in the federal government," the NAS recommended an approach that would "outline goals and general principles of risk assessment." After carefully evaluating these constructive recommendations from the NAS, as well as feedback from a rigorous interagency review and public comments, we have decided not to issue the bulletin in final form. Rather, we are issuing this Memorandum to reinforce generally accepted principles for risk analysis upon which a wide consensus now exists.
Recognizing the diversity of documents that stem from risk analysis techniques, this Memorandum reinforces generally accepted principles for risk analysis related to environmental, health, and safety risks. As a whole, the Memorandum endeavors to enhance the scientific quality, objectivity, and utility of agency risk analyses and to advance the complementary objectives of improving efficiency and consistency across the Federal family. The general principles presented here should continue to assist and guide agencies as they conduct risk analyses, and thereby should enhance the many decisions that are based upon these analyses.
The 1995 Principles were divided into five parts: general principles, principles for risk assessment, principles for risk management, principles for risk communication, and priority setting. This Memorandum reiterates each of these principles (in bold text) and, where appropriate, highlights and references more recent guidance (in plain text). OMB and OSTP will work with Federal agencies to ensure consistency with the principles in this memorandum. Agencies should review their current risk analysis practices and guidelines and incorporate these principles as they develop, update, and issue risk analyses and guidelines.
Should you have any questions regarding this Memorandum, or if you would like to discuss it, please do not hesitate to contact either of us or our staff at 202-395-4852 (OIRA) or 202-456-7116 (OSTP).
General Principles
- 1. These Principles are intended to be goals for agency activities with respect to the assessment, management, and communication of environmental, health, and safety risks. Agencies should recognize that risk analysis is a tool--one of many, but nonetheless an important tool--in the regulatory tool kit. These Principles are intended to provide a general policy framework for evaluating and reducing risk, while recognizing that risk analysis is an evolving process and agencies must retain sufficient flexibility to incorporate scientific advances.
The 2007 NAS Report on the Proposed Risk Assessment Bulletin recommended that OMB "develop goals for risk assessment that emphasize the central objective of enhanced scientific quality and the complementary objectives of efficiency and consistency among agencies evaluating the same or similar risks." Recognizing that "[r]isk assessment is not a monolithic process or a single method," and that "[a]ll risk assessments share some common principles, but their application varies widely among domains," the NAS recommended that "affected federal agencies develop their own technical risk assessment guidelines that are consistent with the OMB general principles."
- 2. The Principles in this document are intended to be applied and interpreted in the context of statutory policies and requirements, and Administration priorities.
- 3. As stated in Executive Order No. 12866, "In setting regulatory priorities, each agency shall consider, to the extent reasonable, the degree and nature of the risks posed by various substances or activities within its jurisdiction" [Section 1(b)(4)]. Further, in developing regulations, federal agencies should consider "...how the action will reduce risks to public health, safety, or the environment, as well as how the magnitude of the risk addressed by the action relates to other risks within the jurisdiction of the agency" [Section 4(c)(1)(D)].
Agencies should refer to Circular A-4 for expanded and updated guidance regarding best practices for agency regulatory analysis.
- 4. In undertaking risk analyses, agencies should establish and maintain a clear distinction between the identification, quantification, and characterization of risks, and the selection of methods or mechanisms for managing risks. Such a distinction, however, does not mean separation. Risk management policies may induce changes in human behaviors that can alter risks (i.e., reduce, increase, or change their character), and these linkages must be incorporated into evaluations of the effectiveness of such policies.
- 5. The depth or extent of the analysis of the risks, benefits and costs associated with a decision should be commensurate with the nature and significance of the decision.
Subsequent reports have reaffirmed this principle. The 1997 Presidential/Congressional Commission on Risk Assessment and Risk Management (referred to as the Presidential Commission on Risk) issued a two-volume report that stated as follows:
- The level of detail considered in a risk assessment and included in a risk characterization should be commensurate with the problem's importance, expected health or environmental impact, expected economic or social impact, urgency, and level of controversy, as well as with the expected impact and cost of protective measures.
A 2007 NAS report on global change assessments evaluated lessons learned from relevant past global change assessments. The NAS, drawing on its analysis and the relevant literature, identified essential elements of effective assessments and included elements relating to the adequacy of funding and the balance between benefits of the assessment and the opportunity costs of producing it.
Principles for Risk Assessment
- 1. Agencies should employ the best reasonably obtainable scientific information to assess risks to health, safety, and the environment.
Risk analyses should be based upon the best available scientific methodologies, information, and data, and upon the weight of the available scientific evidence. The Presidential Commission on Risk observed:
- Because so many judgments must be based on limited information, it is critical that all reliable information be considered. Risk assessors and economists are responsible for providing decision-makers with the best technical information available or reasonably attainable, including evaluations of the weight of the evidence that supports different assumptions and conclusions.
Congress emphasized using the best available scientific evidence for risk information in the 1996 amendments to the Safe Drinking Water Act (SDWA). Pursuant to the SDWA, an agency is directed "to the degree that an agency action is based on science," to use:
- (i) the best available, peer-reviewed science and supporting studies conducted in accordance with sound and objective scientific practices; and (ii) data collected by accepted methods or best available methods (if the reliability of the method and the nature of the decision justifies use of the data).
Agencies have adopted or adapted this SDWA standard in their Information Quality Guidelines with regard to the analysis of risks to human health, safety, and the environment.
- 2. Characterizations of risks and of changes in the nature or magnitude of risks should be both qualitative and quantitative, consistent with available data. The characterizations should be broad enough to inform the range of policies to reduce risks.
In the 1996 SDWA amendments, Congress adopted a basic quality standard for the dissemination of public information about risks of adverse health effects. Under the 1996 SDWA amendments, the Environmental Protection Agency is directed "to ensure that the presentation of information on public health effects is comprehensive, informative, and understandable." The Information Quality Guidelines adapt this language and further direct the agencies to:
- in a document made available to the public in support of a regulation [to] specify, to the extent practicable-- (i) each population addressed by any estimate [of applicable risk effects]; (ii) the expected risk or central estimate of risk for the specific populations [affected]; (iii) each appropriate upper-bound or lower-bound estimate of risk; (iv) each significant uncertainty identified in the process of the assessment of [risk] effects and the studies that would assist in resolving the uncertainty; and (v) peer-reviewed studies known to the [agency] that support, are directly relevant to, or fail to support any estimate of [risk] effects and the methodology used to reconcile inconsistencies in the scientific data.
Agencies have adopted or adapted this standard in their Information Quality Guidelines with regard to analysis of risks to human health, safety, and the environment.
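The five elements quoted above from the Information Quality Guidelines amount to a per-estimate disclosure record. As a minimal illustrative sketch (the class and field names below are assumptions for exposition, not an official schema), such a record might be structured as follows:

```python
from dataclasses import dataclass, field

@dataclass
class RiskEstimateDisclosure:
    """One record per risk estimate, mirroring the five elements the
    Information Quality Guidelines direct agencies to specify.
    Field names are illustrative, not an official schema."""
    population: str                    # (i) population addressed by the estimate
    central_estimate: float            # (ii) expected or central estimate of risk
    lower_bound: float                 # (iii) appropriate lower-bound estimate
    upper_bound: float                 # (iii) appropriate upper-bound estimate
    key_uncertainties: list = field(default_factory=list)   # (iv) significant uncertainties
    supporting_studies: list = field(default_factory=list)  # (v) peer-reviewed studies

    def plausible_range(self):
        """Bounds reported alongside, not instead of, the central estimate."""
        return (self.lower_bound, self.upper_bound)
```

A record of this kind makes it straightforward to verify, estimate by estimate, that each required element has been specified before the supporting document is disseminated.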
In addition, the Information Quality Guidelines state that agency disseminations should be objective; these guidelines, as well as individual agency information quality guidelines, provide further discussion of objectivity and its application in agency disseminations. While risks should not be minimized nor exaggerated, the 2007 NAS Report on the Proposed Risk Assessment Bulletin stated that "[i]nformation on the variability of effects across potentially affected populations is essential to decision-making."
When characterizing risk in its 1996 report on understanding risk and how it informs decisions, the NAS stated that "quantitative models to organize and interpret data are particularly important to risk characterization." When a risk analysis is influential, increased efforts to provide useful quantitative estimates of risk are particularly important. Due to the inherent uncertainties associated with estimates of risk, presentation of a single estimate may be misleading and provide a false sense of precision. Expert panels agree that when a quantitative characterization of risk is provided, a range of plausible risk estimates should be provided. When something more than a superficial analysis can be conducted, quantitative uncertainty analysis, sensitivity analysis, and a discussion of model uncertainty can greatly inform risk management decisions.
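The practice of reporting a central estimate together with a range of plausible estimates, rather than a single number, can be sketched in a few lines. Everything in the sketch below is a hypothetical illustration: the linear risk model, the lognormal input distributions, and all parameter values are assumptions made for exposition, not agency data or an endorsed method.

```python
import random
import statistics

def simulate_risk(n_draws=10_000, seed=1):
    """Monte Carlo sketch of a quantitative uncertainty analysis for a
    hypothetical linear risk model (risk = dose * potency). All
    distributions and parameter values are illustrative assumptions."""
    rng = random.Random(seed)
    risks = []
    for _ in range(n_draws):
        dose = rng.lognormvariate(-2.0, 0.5)     # hypothetical exposure (mg/kg-day)
        potency = rng.lognormvariate(-4.0, 0.8)  # hypothetical slope factor ((mg/kg-day)^-1)
        risks.append(dose * potency)
    risks.sort()
    # Report a central estimate with plausible bounds, not a single number.
    return (risks[int(0.05 * n_draws)],   # lower bound (5th percentile)
            statistics.median(risks),     # central estimate
            risks[int(0.95 * n_draws)])   # upper bound (95th percentile)

if __name__ == "__main__":
    lo, central, hi = simulate_risk()
    print(f"plausible risk range: {lo:.2e} to {hi:.2e} (central estimate {central:.2e})")
```

Presenting the 5th and 95th percentiles alongside the median conveys both the analyst's best estimate and the width of the uncertainty, which a single point estimate would obscure.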
Experts have recognized that when presenting risk information qualitatively, or quantitatively, agencies' methodological approaches will likely vary and will depend upon the context for which the analysis is used. These methodologies are continuing to develop as the science associated with quantitative uncertainty analysis advances. As technical guidance continues to develop, so too should agencies' presentation of quantitative risk information.
- 3. Judgments used in developing a risk assessment, such as assumptions, defaults, and uncertainties, should be stated explicitly. The rationale for these judgments and their influence on the risk assessment should be articulated.
If important judgments are supported by, or conflict with, empirical data, that information should be discussed. The discussion should address the range of scientific and/or technical opinions regarding the likelihood of plausible alternate judgments and the direction and magnitude of any resulting changes that might arise in the analysis due to changes in key judgments. Every effort should be made to perform a quantitative evaluation of reasonable alternative assumptions. When an analysis combines multiple assumptions, the basis and rationale for combining the assumptions should be explicitly described.
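A quantitative evaluation of reasonable alternative assumptions can be as simple as recomputing the estimate under each plausible modeling choice and reporting the spread. The dose-response forms and parameter values below are hypothetical illustrations chosen for this sketch, not recommendations:

```python
def risk_linear(dose, slope=0.01):
    """Linear no-threshold assumption: risk proportional to dose."""
    return slope * dose

def risk_threshold(dose, threshold=5.0, slope=0.01):
    """Threshold assumption: no excess risk below a threshold dose."""
    return max(0.0, slope * (dose - threshold))

def risk_sublinear(dose, coeff=0.003, power=1.5):
    """A hypothetical sublinear alternative."""
    return coeff * dose ** power

ASSUMPTIONS = {
    "linear": risk_linear,
    "threshold": risk_threshold,
    "sublinear": risk_sublinear,
}

def compare_assumptions(dose):
    """Recompute the risk estimate under each alternative assumption so
    the influence of the modeling choice is stated explicitly."""
    return {name: model(dose) for name, model in ASSUMPTIONS.items()}

if __name__ == "__main__":
    for name, estimate in compare_assumptions(10.0).items():
        print(f"{name:>10}: {estimate:.4f}")
```

Tabulating the estimates side by side lets the reader see directly how much of the final number is driven by the choice of assumption rather than by the data.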
Critical judgments are often made when choosing and presenting study results. Results based on different effects and/or different studies should be presented to convey how the choice of effect and/or study influences the analysis. The presentation of information regarding different scientifically plausible endpoints should allow for a robust discussion of the available data, associated uncertainties, and underlying science. In its 2007 report evaluating global change assessments, the NAS recommended that "[t]here should be a deliberate effort to clarify the importance of alternative assumptions and to illustrate the impacts of uncertainties." When relying on data from one study over others, the agency should provide a clear rationale and/or scientific basis for its choice.
- 4. Risk assessments should encompass all appropriate hazards (e.g., acute and chronic risks, including cancer and non-cancer risks, to human health and the environment). In addition to considering the full population at risk, attention should be directed to subpopulations that may be particularly susceptible to such risks and/or may be more highly exposed.
A good risk analysis should clearly summarize the scope of the assessment, including a description of: the agent, technology and/or activity that is the subject of the analysis; the hazard of concern; the affected entities (populations, subpopulations, individuals, natural resources, ecosystems, critical infrastructure, or other) that are the subject of the assessment; the exposure/event scenarios relevant to the objectives of the assessment; and the type of event-consequence or dose-response relationship for the hazard of concern. In the 2007 NAS Report on the Proposed Risk Assessment Bulletin, the NAS reaffirmed that including this information would improve the clarity of a risk analysis and is consistent with the recommendations of previous expert reports.
More recent reports have reaffirmed that, in addition to considering the full population or entities at risk, the risk analysis should also consider subpopulations or sub-entities that may be particularly susceptible to such risks and/or may experience greater exposures. If a risk analysis is to address only specific subpopulations, the scope should be very clear about this limitation.
Where there are known differences in risk for different individuals, subpopulations or ecosystems, analysts should characterize this variability. Risk managers will be better informed when an understanding of variability and the key contributors to the cause of this variability are presented in the risk analysis. As guidance on the presentation of risk information to the public continues to develop, so too will agencies' presentation and discussion of variability.
The Presidential Commission on Risk stated: "A good risk management decision is based on a careful analysis of the weight of scientific evidence that supports conclusions about a problem's potential risks to human health and the environment." This may include consideration of both positive and negative studies, in light of each study's technical quality. Agencies and the risk assessment community are continuing to develop techniques for weight of evidence evaluations.
Agencies should consider confounding and/or synergistic factors. The scientific process of considering these elements may assist policy makers in developing a broader sense of how risk can be reduced significantly and the range of decision options that need to be considered in developing risk management approaches.
- 5. Peer review of risk assessments can ensure that the highest professional standards are maintained. Therefore, agencies should develop policies to maximize its use.
Agencies should refer to the Peer Review Bulletin for updated guidance regarding agency best practices for peer review.
- 6. Agencies should strive to adopt consistent approaches to evaluating the risks posed by hazardous agents or events.
Principles for Risk Management
- 1. In making significant risk management decisions, agencies should analyze the distribution of the risks and the benefits and costs (both direct and indirect, both quantifiable and non-quantifiable) associated with the selection or implementation of risk management strategies. Reasonably feasible risk management strategies, including regulation, positive and negative economic incentives, and other ways to encourage behavioral changes to reduce risks (e.g., information dissemination), should be evaluated. Agencies should employ the best available scientific, economic and policy analysis, and such analyses should include explanations of significant assumptions, uncertainties, and methods of data development.
Agencies should refer to Circular A-4 for updated guidance regarding agency best practices for regulatory analysis.
- 2. In choosing among alternative approaches to reducing risk, agencies should seek to offer the greatest net improvement in total societal welfare, accounting for a broad range of relevant social and economic considerations such as equity, quality of life, individual preferences, and the magnitude and distribution of benefits and costs (both direct and indirect, both quantifiable and non-quantifiable).
Agencies should refer to Circular A-4 for updated guidance regarding agency best practices for regulatory analysis.
Principles for Risk Communication
- 1. Risk communication should involve the open, two-way exchange of information between professionals, including both policy makers and "experts" in relevant disciplines, and the public.
In describing its approach to risk characterization in the 1996 NAS Report on Understanding Risk and Informing Decisions, the NAS stated that "[t]he responsible organization's staff should describe the stated and implicit purposes of the decision-making activity, the type of decision and general aims furthered by the activity, and the intended users of the risk characterization."
Agencies should provide this information as it will assist the readers and users in better understanding the questions that the analysis sought to answer and will help to ensure that the risk analyses are used for their intended purposes. This information is particularly important in cases where likely users of the risk analyses are not the original intended audience for the document.
- 2. Risk management goals should be stated clearly, and risk assessments and risk management decisions should be communicated accurately and objectively in a meaningful manner.
An executive summary that discloses the objectives and scope, the key findings of the analysis, and the key scientific limitations and uncertainties can be an important part of risk communication. In the 2007 NAS Report on the Proposed Risk Assessment Bulletin, the NAS commented that the inclusion of an executive summary could improve the clarity of risk assessments. Presentation of information in a helpful and concise introductory section of the report will not only foster improved communication of the findings, but will also help ensure that the risk analysis is appropriately interpreted by diverse end users.
- 3. To maximize public understanding and participation in risk-related decisions, agencies should:
- a. explain the basis for significant assumptions, data, models, and inferences used or relied upon in the assessment or decision;
A high degree of transparency with respect to data, assumptions, and methods will increase the credibility of the risk analysis, and will allow interested individuals, internal and external to the agency, to understand better the technical basis of the analysis.
- b. describe the sources, extent and magnitude of significant uncertainties associated with the assessment or decision;
In the 1996 NAS Report on Understanding Risk and Informing Decisions, the NAS observed that "[t]here is strong agreement that risk analysts should explicitly summarize uncertainty, and there are methods for doing so." In the 2007 NAS Report on Global Change Assessments, when referring to critical elements of a credible, legitimate and salient assessment, the NAS included as one of four central elements:
- Deliberative and consistent methods of treating and communicating uncertainties add credibility and salience. Regardless of method (statistics, sensitivity analysis, scenario development, or expert judgment), each measure must be defined and communicated in a consistent manner.
In the same report the NAS also stated:
- An effective characterization of uncertainty in assessments requires determining what sorts of uncertainty information would be useful for decision makers as well as developing quantitative or qualitative measures of uncertainty . . . . The manner in which uncertainties are acknowledged and characterized will affect both the salience and credibility of the assessment.
The agency also should identify the sources of the underlying information (consistent with sensitive information and confidentiality protections) and the supporting data and models, so that the public can evaluate whether there may be some reason to question objectivity. Data should be accurately documented, and error sources affecting data quality should be identified and disclosed.
- c. make appropriate risk comparisons, taking into account, for example, public attitudes with respect to voluntary versus involuntary risk; and,
When making risk comparisons, agencies should be careful to consider the perspectives, assumptions, attitudes and context that the public associates with each risk. Agencies may want to consult the risk communication literature when considering appropriate comparisons. Although the risk assessor has considerable latitude in making risk comparisons, the fundamental point is that risk should be placed in a context that is useful and relevant for the intended audience. Furthermore, effective communication of risk information can assist the public in balancing benefits and risks. As our understanding regarding the presentation of risk information to policy makers and the public continues to develop, so too will agencies' presentation and discussion of risk comparison information.
- d. provide timely, public access to relevant supporting documents and a reasonable opportunity for public comment.
Agencies should refer to OMB's Final Bulletin for Agency Good Guidance Practices, as well as the Peer Review Bulletin, for updated guidance regarding best practices for increasing public access and public comment concerning guidance documents and influential scientific information. In addition, as noted in the OMB Information Quality Guidelines, influential risk analyses should be reproducible. For guidance on how to provide the public with timely access to government information, agencies should refer to OMB's Circular A-130, which addresses the management of Federal information resources, and OMB Memorandum M-06-02, which addresses improving public access to Federal information.
Principles for Priority Setting Using Risk Analysis
- 1. To inform priority setting, agencies should seek to compare risks, grouping them in broad categories of concern (e.g., high, moderate, and low).
- 2. Agencies should set priorities for managing risks so that those actions resulting in the greatest net improvement in societal welfare are taken first, accounting for relevant management and social considerations such as different types of health or environmental impacts; individual preferences; the feasibility of reducing or avoiding risks; quality of life; environmental justice; and the magnitude and distribution of both short- and long-term benefits and costs.
- 3. The setting of priorities should be informed by internal agency experts and a broad range of individuals in state and local government, industry, academia, and nongovernmental organizations, as well as the public at large. Where possible, consensus views should be reflected in the setting of priorities.
- 4. Agencies should attempt to coordinate risk reduction efforts wherever feasible and appropriate.
- ↑ U.S. Office of Mgmt. and Budget (OMB), Memorandum for the Regulatory Working Group, Principles for Risk Analysis (1995), available at http://www.whitehouse.gov/omb/inforeg/regpol/jan1995_risk_analysis_principles.pdf.
- ↑ OMB, Proposed Risk Assessment Bulletin, (2006) [hereinafter Proposed Risk Assessment Bulletin], available at http://www.whitehouse.gov/omb/inforeg/proposed_risk_assessment_bulletin_010906.pdf.
- ↑ See Press Release, OMB, OMB Requests Peer Review of Proposed Risk Assessment Bulletin (Jan. 9, 2006), available at http://www.whitehouse.gov/omb/pubpress/2006/2006-01.pdf.
- ↑ National Research Council, National Academy of Sciences, Scientific Review of the Proposed Risk Assessment Bulletin from the Office of Management and Budget (2007) [hereinafter 2007 NAS Report on the Proposed Risk Assessment Bulletin].
- ↑ Id. at 6 - 7.
- ↑ OMB received 79 public comments on the Proposed Risk Assessment Bulletin. These comments are posted on OMB's website, available at http://www.whitehouse.gov/omb/inforeg/comments_rab/list_rab2006.html.
- ↑ While many of the principles presented in this Memorandum may be relevant to other fields, such as financial or information technology risk analyses, the focus of this Memorandum is on those risk analyses related to environmental, health, and safety risks.
- ↑ The enhancement of information quality, objectivity, transparency, and reproducibility is addressed in other OMB guidance as well. See OMB, Circular A-4 for Regulatory Review (2003) [hereinafter Circular A-4] (includes requirements for regulatory analysis as it relates to risk management decisions, particularly those required under Executive Order 12866), available at http://www.whitehouse.gov/omb/circulars/a004/a-4.pdf. In addition, pursuant to what is commonly referred to as the Information Quality Act (Sec. 515 of the Treasury and General Government Appropriations Act for FY 2001, Pub. L. No. 106-554), OMB issued government-wide Guidelines for Ensuring and Maximizing the Quality, Objectivity, Utility, and Integrity of Information Disseminated by Federal Agencies (2002), 67 Fed. Reg. 8452 (Feb. 22, 2002) [hereinafter Information Quality Guidelines], available at http://www.whitehouse.gov/omb/fedreg/reproducible2.pdf; and its Final Information Quality Bulletin for Peer Review (2004), 70 Fed. Reg.2664 (Jan. 14, 2005) [hereinafter Peer Review Bulletin], available at http://www.whitehouse.gov/omb/memoranda/fy2005/m05-03.pdf. These documents were each issued first as a proposal for public comment. In addition, the Information Quality Guidelines were then implemented by agencies in their own agency-specific information quality guidelines. This Memorandum is intended to complement and support the Information Quality Guidelines.
- ↑ This framework is consistent with the long-standing distinction between risk assessment, risk management, and risk communication. See National Research Council, National Academy of Sciences, Risk Assessment in the Federal Government: Managing the Process (1983).
- ↑ 2007 NAS Report on the Proposed Risk Assessment Bulletin, supra note 4, at 110.
- ↑ Id. at 106, 110-11.
- ↑ See Exec. Order No. 12866, 58 Fed. Reg. 51,735 (Oct. 4, 1993), available at http://www.whitehouse.gov/omb/inforeg/eo12866.pdf. See also Exec. Order No. 13132, 72 Fed. Reg. 2763 (Jan. 23, 2007), available at http://www.whitehouse.gov/omb/inforeg/eo12866/index_eo12866.html.
- ↑ See footnote 8. Circular A-4 refined OMB's "Economic Analysis of Federal Regulations Under Executive Order 12866" (1996) (a "best practices" guide to preparing a regulatory analysis), available at http://www.whitehouse.gov/omb/inforeg/riaguide.html. The "best practices" guide was issued as guidance in 2000, OMB's Memorandum for the Heads of Departments and Agencies on Guidelines to Standardize Measures of Costs and Benefits and the Format of Accounting Statements (2000), available at http://www.whitehouse.gov/omb/memoranda/m00-08.pdf; and reaffirmed in 2001, OMB's Memorandum for the President's Management Council on Presidential Review of Agency Rulemaking (2001), available at http://www.whitehouse.gov/omb/inforeg/oira_review-process.html. Before finalization, Circular A-4 went through public comment, interagency review, and peer review.
- ↑ Presidential/Congressional Commission on Risk Assessment and Risk Management, Framework for Environmental Health Risk Management, 1 Final Report 25 (1997) [hereinafter Risk Commission Report I].
- ↑ See National Research Council, National Academy of Sciences, Analysis of Global Change Assessments: Lessons Learned (2007) [hereinafter 2007 NAS Report on Global Change Assessments].
- ↑ Id., supra note 15, at 4 (two essential elements of effective assessments are "[a]dequate funding that is both commensurate with the mandate and effectively managed to ensure an efficient assessment process" and a "balance between the benefits of a particular assessment and the opportunity costs (e.g., commitments of time and effort) to the scientific community").
- ↑ In this Memorandum, "scientific" information includes information related to applied sciences (such as engineering) and technical information related to these fields.
- ↑ Risk Commission Report I, supra note 14, at 38. The Risk Commission Report I provides examples of the kinds of considerations entailed in making judgments on the basis of the weight of the scientific evidence in a toxicity study: quality of the toxicity study, appropriateness of the toxicity study methods, consistency of results across studies, biological plausibility of statistical associations, and similarity of results to responses and effects in humans.
- ↑ 42 U.S.C. § 300g-1(b)(3)(A,B).
- ↑ 42 U.S.C. § 300g1(b)(3)(A).
- ↑ Links to agency specific information quality guidelines can be found in the OMB Draft 2007 Report to Congress on the Costs and Benefits of Federal Regulations, Appendix C, available at http://www.whitehouse.gov/omb/inforeg/2007_cb/2007_draft_cb_report.pdf. For specific examples of agency guidelines, see http://dms.dot.gov/submit/DataQualityGuidelines.pdf (Department of Transportation guidelines), http://aspe.hhs.gov/infoquality/Guidelines/index.shtml (Department of Health and Human Services guidelines), http://www.sti.nasa.gov/qualinfo.html (National Aeronautics and Space Administration guidelines), http://www.dol.gov/informationquality.htm (Department of Labor guidelines).
- ↑ 42 U.S.C. § 300g-1(b)(3)(B).
- ↑ See Information Quality Guidelines, supra note 8.
- ↑ See supra note 21.
- ↑ See Information Quality Guidelines, supra note 8. On a substantive level, objectivity ensures accurate, reliable and unbiased information.
- ↑ See 2007 NAS Report on the Proposed Risk Assessment Bulletin, supra note 4, at 63 (in referring to the Proposed Risk Assessment Bulletin's statement that risk assessments should be presented such that they are "neither minimizing nor exaggerating the nature and magnitude of risks," the NAS stated that this "could, however, degrade risk analysis if it were interpreted so as to deprive decision-makers of important information on sensitive subpopulations on the grounds that such information may generate risk estimates considerably higher than a central tendency or general population estimates.").
- ↑ National Research Council, National Academy of Sciences, Understanding Risk: Informing Decisions in a Democratic Society 99-100 (1996) [hereinafter 1996 NAS Report on Understanding Risk and Informing Decisions].
- ↑ In this Memorandum, the term "influential" is defined in the same way as it is defined in the Information Quality Guidelines. Influential means that "the agency can reasonably determine that dissemination of the information will have or does have a clear and substantial impact on important public policies or important private sector decisions." In their information quality guidelines, agencies have defined "influential" in ways appropriate for them given the nature and multiplicity of issues for which the agencies are responsible.
- ↑ See National Research Council, National Academy of Sciences, Models in Environmental Regulatory Decision Making 136 (2007) [hereinafter 2007 NAS Report on Environmental Decision Making] ("there are substantial problems in reducing the results of a large-scale study with many sources of uncertainty to a single number or even a single probability distribution. We contend that such an approach draws the line between the role of analysts and the role of policy makers in decision making at the wrong place."); Id. at 7 ("Effective decision making will require providing policy makers with more than a single probability distribution for a model result (and certainly more than just a single number, such as the expected net benefit, with no indication of uncertainty). Such summaries obscure the sensitivities of the outcome to individual sources of uncertainty, thus undermining the ability of policy makers to make informed decisions and constraining the efforts of stakeholders to understand the basis for the decisions."). See also 1996 NAS Report on Understanding Risk and Informing Decisions, supra note 27, at 66 ("Risk characterizations often fail because they attribute meaning to scientific estimates in ways that mislead participants in the risk decision process or are incomprehensible to them.").
- ↑ See 2007 NAS Report on the Proposed Risk Assessment Bulletin, supra note 4, at 37 ("The committee agrees with OMB that in some cases `presentation of single estimates of risk is misleading' and that ranges of `plausible risk' should be presented; however, the challenge is in the operational definitions of such words as central, expected, and plausible.") (emphasis in original).
- ↑ See 2007 NAS Report on Environmental Decision Making, supra note 29, at 6 ("A wide range of possibilities is available for performing model uncertainty analysis. At one extreme, all model uncertainties could be represented probabilistically, and the probability distribution of any model outcome of interest could be calculated. However, in assessing environmental regulatory issues, these analyses generally would be quite complicated to carry out convincingly, especially when some of the uncertainties in critical parameters have broad ranges or when the parameter uncertainties are difficult to quantify. Thus, although probabilistic uncertainty analysis is an important tool, requiring EPA to do complete probabilistic regulatory analyses on a routine basis would probably result in superficial treatments of many sources of uncertainty. The practical problems of performing a complete probabilistic analysis stem from models that have large numbers of parameters whose uncertainties must be estimated in a cursory fashion. Such problems are compounded when models are linked into a highly complex system, for example, when emissions and meteorological model results are used as inputs into an air quality model."); Id. at 7 ("It is not necessary to choose between purely probabilistic approaches and deterministic approaches. Hybrid analyses combining aspects of probabilistic and deterministic approaches might provide the best solution for quantifying uncertainties, given the finite resources available for any analysis. For example, a sensitivity analysis might be used to determine which model parameters are most likely to have the largest impacts on the conclusions, and then a probabilistic analysis could be used to quantify bounds on the conclusions due to uncertainties in those parameters.").
- ↑ Id. at 7-8 ("In some cases, presenting results from a small number of model scenarios will provide an adequate uncertainty analysis (for example, cases in which the stakes are low, modeling resources are limited, or insufficient information is available). In many instances, however, probabilistic methods will be necessary to characterize properly at least some uncertainties and to communicate clearly the overall uncertainties.").
- ↑ See 1996 NAS Report on Understanding Risk and Informing Decisions, supra note 27, at 53 ("Simplifying assumptions generate especially serious problems when some of the assumptions are unreasonable in the face of information available to people outside the analytical process.").
- ↑ Id. at 99 ("Without good analysis, deliberative processes can arrive at agreements that are unwise or not feasible."). See also National Research Council, National Academy of Sciences, Health Risks from Dioxin and Related Compounds 7 (2006) ("Although EPA [in its dioxin assessment] addressed many sources of variability and uncertainty qualitatively, the committee noted that the Reassessment would be substantially improved if its risk characterization included more quantitative approaches. Failure to characterize variability and uncertainty thoroughly can convey a false sense of precision in the conclusions of the risk assessment.").
- ↑ See Circular A-4, supra note 8, at 39 ("Inferences and assumptions used in your analysis should be identified, and your analytical choices should be explicitly evaluated and adequately justified.").
- ↑ See 2007 NAS Report on Global Change Assessments, supra note 15, at 126.
- ↑ See Circular A-4, supra note 8, at 39.
- ↑ See Risk Commission Report I, supra note 14, at 24 (a list of questions should be addressed in a risk characterization).
- ↑ See 2007 NAS Report on the Proposed Risk Assessment Bulletin, supra note 4, at 62.
- ↑ See Risk Commission Report I, supra note 14, at 24 (a list of questions that includes asking about individuals or groups that are at risk and if some people are more likely to be at risk than others).
- ↑ See 2007 NAS Report on the Proposed Risk Assessment Bulletin, supra note 4, at 80 ("information on the variability of effects across potentially affected populations--due to differences in sensitivity, exposure, or both--is essential to decision-making.").
- ↑ See Risk Commission Report I, supra note 14, at 23; Id. at 23-24 ("It is important that risk assessors respect the objective scientific basis of risks and procedures for making inferences in the absence of adequate data. Risk assessors should provide risk managers and other stakeholders with plausible conclusions about risk that can be made on the basis of the available information, along with evaluations of the scientific weight of evidence supporting those conclusions and descriptions of major sources of uncertainty and alternative views.").
- ↑ Id. at 24 (a list of questions that includes asking about "[w]hat other sources cause the same type of effects or risks.").
- ↑ See Peer Review Bulletin, supra note 8.
- ↑ See Circular A-4, supra note 8.
- ↑ Id.
- ↑ See 1996 NAS Report on Understanding Risk and Informing Decisions, supra note 27, at 146; Id. ("Different types of decisions may require different types of knowledge and perspectives and hence require different participants in the analytic-deliberative process--both inside and outside the organization.").
- ↑ Key limitations are those that are most likely to affect significantly the determinations and/or estimates of risk presented in the analysis.
- ↑ See 2007 NAS Report on the Proposed Risk Assessment Bulletin, supra note 4, at 64 ("These qualitative standards . . . [presented in an executive summary] could improve the clarity of risk assessment in the federal government if risk assessments do not implement them already (although the existence of such problems is not established by the bulletin).").
- ↑ See 1996 NAS Report on Understanding Risk and Informing Decisions, supra note 27, at 67; Id. at 100 (in a good quantitative analysis, "[a]ny assumptions used are clearly explained, used consistently, and tested for reasonableness," "[c]alculations are presented in such a form that they can be checked by others interested in verifying the results," and "[u]ncertainties are indicated, including those in data, models, parameters, and calculations."). See 2007 NAS Report on Global Change Assessments, supra note 15, at 126 (NAS recommends that "[t]here should be a deliberate effort to clarify the importance of alternative assumptions and to illustrate the impacts of uncertainties."). See also U.S. Environmental Protection Agency, Science Policy Council Handbook, Risk Characterization 15 (2000) [hereinafter EPA Risk Characterization Handbook] ("[Transparency] ensures that any reader understands all the steps, logic, key assumptions, limitations, and decisions in the risk assessment, and comprehends the supporting rationale that lead to the outcome."). In the EPA Risk Characterization Handbook, EPA lists 10 elements of full disclosure. Id. at 15-16.
- ↑ See 1996 NAS Report on Understanding Risk and Informing Decisions, supra note 27, at 107 ("When uncertainty is recognizable and quantifiable, the language of probability can be used to describe it."). See also EPA Risk Characterization Handbook, supra note 50, at 41 ("While it is generally preferred that quantitative uncertainty analyses are used in each risk characterization, there is no single recognized guidance that currently exists on how to conduct an uncertainty analysis. Nonetheless, risk assessors should perform an uncertainty analysis . . . . Uncertainty analysis should not be restricted to discussion of precision and accuracy, but should include such issues as data gaps and models.").
- ↑ See 2007 NAS Report on Global Change Assessments, supra note 15, at 60.
- ↑ Id. at 126.
- ↑ See Information Quality Guidelines, supra note 8.
- ↑ See OMB, Statistical Policy Working Paper No. 31, Measuring and Reporting Sources of Error in Surveys, 1-5 (2001) ("The measurement and reporting of error sources is important for everyone who uses statistical data. For the analyst, this information helps data analyses through an awareness of the limitations of the data."), available at http://www.fcsm.gov/01papers/SPWP31_final.pdf.
- ↑ See National Research Council, National Academy of Sciences, Improving Risk Communication 165-79 (1989). See also Risk Commission Report I, supra note 14, at 4 (the problems a regulation is intended to address should be placed in their "public health and ecological context.").
- ↑ See Institute of Medicine, National Academy of Sciences, Seafood Choices: Balancing Benefits and Risks 207 (2006) ("In the committee's judgment, it is important to conduct substitution analyses of the potential impacts of changes in consumption despite the uncertainties about the underlying nutrient and contamination levels."). Id. at 231 (in referring to guidance and information that can simplify such tradeoffs, the NAS lists considerations for guidance development).
- ↑ See OMB, Final Bulletin for Agency Good Guidance Practices (2007), available at http://www.whitehouse.gov/omb/fedreg/2007/012507_good_guidance.pdf, and Peer Review Bulletin, supra note 8.
- ↑ See Information Quality Guidelines, supra note 8, 67 Fed. Reg. at 8460 (independent reanalysis of the original or supporting data using the same methods would generate similar analytical results, subject to an acceptable degree of precision).
- ↑ See OMB, Circular A-130, Management of Federal Information Resources (2000), available at http://www.whitehouse.gov/omb/circulars/a130/a130trans4.pdf; OMB Memorandum No. 06-02, Improving Public Access to and Dissemination of Government Information and Using the Federal Enterprise Architecture Data Reference Model (2005), available at http://www.whitehouse.gov/omb/memoranda/fy2006/m06-02.pdf.