Society For Risk Analysis Annual Meeting 2017

Session Schedule & Abstracts


* Disclaimer: All presentations represent the views of the authors, and not the organizations that support their research. Please apply the standard disclaimer that any opinions, findings, and conclusions or recommendations in abstracts, posters, and presentations at the meeting are those of the authors and do not necessarily reflect the views of any other organization or agency. Meeting attendees and authors should be aware that this disclaimer is intended to apply to all abstracts contained in this document. Authors who wish to emphasize this disclaimer should do so in their presentation or poster. In an effort to make the abstracts as concise as possible and easy for meeting participants to read, the abstracts have been formatted such that they exclude references to papers, affiliations, and/or funding sources. Authors who wish to provide attendees with this information should do so in their presentation or poster.

Common abbreviations


Poster Session

Room: Salons 3-6   6:00 pm–8:00 pm



P.1  Identification of potential biological hazards in groundwater underlying cemeteries and graveyards. Leung ACW*, Minnery JG, Chung R; Public Health Ontario   alvin.leung@oahpp.ca
Applied Risk Management

Abstract: Growing interest in natural or green burials is challenging not only traditional cemetery practice but also how cemeteries are assessed and managed. In Ontario, new cemetery site applications are reviewed by local municipalities, the Ministry of Environment and Climate Change, and local public health authorities, with limited guidance available to inform decision making and risk management. Public Health Ontario was approached to assist a local health unit in reviewing a cemetery site application that included several burial types, both traditional (e.g., embalmment and coffin) and natural (e.g., shroud without embalmment). There was concern that microbiological constituents could leach during decomposition at burial sites and migrate into the groundwater that supplies residential drinking water wells adjacent to the proposed site. Existing guidance recommends a minimum distance of 0.5 m between the grave and the water table, without supporting rationale. Public Health Ontario reviewed existing guidance and conducted a targeted literature search on assessing and managing potential impacts of cemeteries on groundwater quality. Overall, it was found that proper cemetery construction and design are important in reducing potential impacts to groundwater. Combining knowledge of pathogen survival times with the existing hydrogeological conditions at a proposed site can help characterize the risk of contaminating a secure drinking water source. Finally, standard microbial indicators (including heterotrophic plate count, total coliform, and E. coli) could be monitored in the vicinity of a cemetery to detect changes in potable groundwater quality. Advances in forensic science illuminate both the limitations of relying on indicator organisms and opportunities to improve our metrics of groundwater quality.

P.2  Climate Change Vulnerability, Risk Assessment and Adaptation Scenario Development for Municipalities. Thorne ST*, Kovacs DK, Austin LA, Qiu X, Horb E, Martyn N, Hay A; Decision • Partners, Inc., Novus Environmental, RiskLogik, Southern Harbour   dkovacs@decisionpartners.com
Applied Risk Management

Abstract: For many urban centers, climate change will have an increasing impact on development, economy, and vitality. This demands efforts to reduce and stabilize greenhouse gas emissions (i.e., mitigation) and to increase community and municipal sustainability and resilience (i.e., adaptation). These impacts, manifested, for example, as increasingly severe extreme weather events and decreased lifecycles for critical infrastructure, fundamentally alter the relationship between civil society, the economy, and the environment that supports them both. Many cities have wisely embarked on programs of risk characterization and response in order to mitigate and adapt to as many of the potential impacts as possible in advance of extreme events, and to avoid their potentially crippling costs. But even in affluent cities, there are not enough resources to mitigate or adapt to every risk. Limited resources must be deployed on the basis of the greatest benefit to the greatest number of citizens or the protection of the most critical infrastructure assets. Until now, it has been extremely difficult and time consuming to define, with any degree of empirical accuracy, repeatability, or auditability, what is critical and must be protected under changing circumstances. Further, even when critical risks have been identified, it is difficult to address them holistically in a way that takes advantage of synergies across multiple projects while, at the same time, prioritizing efforts based on the values, interests, and priorities of those who have a stake in the process or the outcome. Our team applies a multidisciplinary approach of risk assessment, risk management, resilience, and stakeholder engagement to develop a holistic, strategic climate change mitigation and adaptation solution.

P.3  Development of methodology for finding underestimated chemical substances for health risk based on human kinetic adjustment factor analyzed by QPPR-PBPK model. Sato N*, Kojima N, Tokai A; Osaka University   satou@em.see.eng.osaka-u.ac.jp
Applied Risk Management

Abstract: To address interspecies differences and human variability in sensitivity in chemical risk assessment, an uncertainty factor (UF) of 10 is commonly used. However, because the UF has been reported to underestimate risks to highly susceptible populations, the World Health Organization proposed the chemical-specific adjustment factor (CSAF). CSAFs stem from quantitative evaluations of sets of substances based on test-animal and human toxicokinetic (TK) and toxicodynamic (TD) datasets. This study focuses on the Human Kinetic Adjustment Factor (HKAF), a sub-factor of the CSAF (√10 ≈ 3.16). The objective is to extract priority evaluation substances whose HKAF exceeds 3.16. The HKAF was calculated for 14 substances using the Quantitative Property-Property Relationship Physiologically Based Pharmacokinetic (QPPR-PBPK) model, applying average physiological parameter values for adults and infants. The biochemical parameters of the substances were estimated from physical property data using QPPR. One substance, glycol, exceeded the default value at 3.35, and three substances presented values higher than 2.7 (bisphenol A: 2.97, NMP: 2.72, and butoxyethanol: 2.70). These substances also presented high blood:air partition coefficients, exceeding 10,000 (-). Furthermore, the QPPR method suggested that a low value of molecular weight × water vapor pressure / water solubility leads to a high blood:air partition coefficient; for these substances in particular, the values were lower than 0.1. In future work, these parameters will be used to extract priority assessment substances from the 149 substances that satisfy the data requirements for QPPR. Extending this study to the remaining substances will allow identification of substances exceeding the default UF value and will prevent underestimation of risk by replacing the UF with the CSAF.
This work is supported by Japanese METI.
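The screening relationship described in the abstract above can be sketched as a simple check. This is an illustrative sketch only: the substance names and property values are hypothetical placeholders, and only the 0.1 cutoff and the form of the descriptor (molecular weight × vapor pressure / water solubility) come from the abstract.

```python
# Illustrative screen for the QPPR descriptor discussed in the abstract:
# a low value of (molecular weight * vapor pressure / water solubility)
# was associated with a high blood:air partition coefficient and hence a
# potentially elevated HKAF. Substance data below are hypothetical.

def screening_descriptor(mol_weight, vapor_pressure, water_solubility):
    """Return MW * VP / S; units must be used consistently across substances."""
    return mol_weight * vapor_pressure / water_solubility

# (molecular weight, vapor pressure, water solubility) -- placeholder values
substances = {
    "substance_A": (62.1, 0.06, 1000.0),
    "substance_B": (118.2, 0.10, 900.0),
    "substance_C": (228.3, 500.0, 0.3),
}

# Flag substances whose descriptor falls below the 0.1 value noted above.
flagged = [name for name, (mw, vp, s) in substances.items()
           if screening_descriptor(mw, vp, s) < 0.1]
```

Substances A and B fall below the cutoff and would be carried forward for HKAF evaluation; substance C, with high volatility relative to its solubility, would not.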

P.5  Risk-based National Standards of the Republic of China (CNS) on Chemical Level in Consumer Products: A Suggested Framework. Chuang YC*, Huang SZ, Wu C, Wu KY; National Taiwan University   mcyc1979@gmail.com
Applied Risk Management

Abstract: The National Standards of the Republic of China (CNS) are developed by the Bureau of Standards, Metrology and Inspection (BSMI) in Taiwan, the authority responsible for standardization, metrology, and product inspection. As risk perception has risen, BSMI launched a review project, initially aiming to harmonize local health risk impacts and international standards within the current CNS for consumer products. The suggested framework for a risk-based CNS draws on international organizations' frameworks for product risk assessment and comprises nine major steps and a core concept. The step-by-step approach includes product definition, hazard identification, subject identification, scenario construction, potential harm identification, severity assessment, probability assessment of harm, risk characterization, and uncertainty analysis, with the core concept of real-time information updating, to form a comprehensive risk assessment framework for consumer products. The suggested framework can harmonize local regulation and consumer behavior in the National Standards to reduce local consumers' risk concerns relative to international standards. Recently, a practical project has been launched within BSMI to evaluate volatile organic compound levels in consumer products for the CNS using the risk-based framework.

P.6  Awareness-Based Risk Management: Seeing, Transforming, and Unleashing Organizational Capacity. Redinger CF*; The Institute for Advanced Risk Management   cfr@redinger360.com
Applied Risk Management

Abstract: Risk management is a lens through which organizational values and culture are laid bare, and through which they can be transformed. Awareness-Based Risk Management (ABRM) provides a way to transform an organization's risk management function: it generates a shift in context away from compliance and the status quo, whereby risk awareness increases, along with organizational energy, vitality, creativity, and resilience. The risk function is not only stronger but also affects the culture, fostering a higher sense of purpose and alignment with organizational vision and mission. ABRM is not value-neutral. It advocates the view that the aspiration to improve human, community, environmental, and global health and well-being is universal, and that the capacity to do so is latent in most systems. ABRM suggests a way to unleash this latent capacity. A key component of ABRM is reframing the "objectives" and "uncertainty" concepts in ISO's definition of risk in terms of "what matters and is important" and "paradox". This reframing and shift in thinking increases engagement; it fundamentally shifts the relationship to risk for individuals and teams. This poster presents: a risk management maturity continuum; ABRM's seven dimensions; an organizational risk-awareness measurement scheme; and several case examples.

P.7  Developing the probability prediction model for the carcinogenic potency by using the Bayesian method to support hazard assessment under Japan’s Chemical Substances Control Law. Yamaguchi H*, Yamada T, Hirose A; National Institute of Health Sciences   h-yamaguchi@nihs.go.jp
Applied Risk Management

Abstract: Hazard management exposure levels in human health assessment under Japan's Chemical Substances Control Law (CSCL) are derived mainly from hazard assessment data for four endpoints: repeated-dose toxicity, reproductive toxicity, genetic toxicity, and carcinogenicity. About 200 chemicals have already been identified as high-priority chemicals requiring risk assessment, and carcinogenicity data are unavailable for about half of them. Because chronic assays are numerous and take a long time, conducting chronic testing of all chemicals is impossible. The purpose of this study is to develop a probabilistic model for predicting carcinogenic effects using existing short-term toxicology data, and to propose a practical scheme for prioritizing chronic toxicity tests in risk assessment under the CSCL. Taylor et al. (1993) and Yokota et al. (2004) constructed models incorporating a Bayesian approach to predict the likelihood of carcinogenic potency based on LD50 values, Ames test results, and the Maximum Tolerated Dose (MTD). In this study, we developed models using toxicity data from 28-day or 90-day repeated-dose studies instead of the MTD as input data. We ran simulations using substances for which data on all four endpoints were available: LD50, Ames results, sub-chronic NOAEL, and TD50. Prior distributions of the probability of carcinogenic potency were estimated from LD50 values and Ames test results. Applying the Bayesian method, posterior distributions were derived from the NOAELs obtained in sub-chronic tests. These results were compared with the carcinogenic potency obtained from long-term carcinogenicity tests. Finally, we discuss improving the tiered testing approach to support hazard assessment under the CSCL, adding the perspective of the Threshold of Toxicological Concern (TTC) and the category approach based on chemical structural similarity.
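The core update step described in this abstract can be sketched with a minimal discrete Bayes' rule calculation. This is only a schematic: in the study the prior comes from LD50 and Ames results and the evidence is the sub-chronic NOAEL, but all numeric values below (the prior and the two conditional probabilities) are hypothetical placeholders, not values from the study.

```python
# Minimal sketch of a discrete Bayesian update for a binary hypothesis
# (carcinogenic vs. not carcinogenic), following the scheme in the abstract:
# a prior from short-term data is updated by sub-chronic evidence.

def posterior_prob(prior, p_evidence_given_h, p_evidence_given_not_h):
    """Bayes' rule: P(H | E) from P(H), P(E | H), and P(E | not H)."""
    numerator = prior * p_evidence_given_h
    denominator = numerator + (1.0 - prior) * p_evidence_given_not_h
    return numerator / denominator

prior = 0.30                  # hypothetical prior, e.g. after a positive Ames test
p_low_noael_if_carc = 0.8     # assumed: a low sub-chronic NOAEL is likely for a carcinogen
p_low_noael_if_not = 0.2      # assumed: and less likely otherwise

# Observing a low NOAEL raises the posterior probability of carcinogenic potency.
post = posterior_prob(prior, p_low_noael_if_carc, p_low_noael_if_not)
```

With these placeholder numbers the posterior rises above the prior, illustrating how sub-chronic results can shift the priority ranking of a chemical for chronic testing.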

P.8  The IRGC approach to risk and resilience assessment – the IRGC Resource Guide on Resilience. Florin MV*, Linkov I, Trump B; IRGC, EPFL   marie-valentine.florin@epfl.ch
Applied Risk Management

Abstract: Responses to disasters, both natural and technology-related, often show the limitations of traditional risk assessment and management. In the context of risk, resilience has been discussed as both a supplement and an alternative to conventional risk management. IRGC describes resilience building as a possible risk management strategy when there is much uncertainty about impacts and a need to prepare to cope with unexpected shocks. Both governments and industry explicitly call for resilience-based risk management. The IRGC annotated 'Resource Guide on Resilience' is a collection of authored papers about resilience, guiding readers to a selection of the best literature sources. It highlights both the variety of approaches to resilience and their common features and dynamics. It stresses the importance of including resilience as a component of the risk governance process, including in research, policy, strategies, and practices. It maps risk and resilience in the context of governance, and reviews how resilience has been manifested, managed, and measured in different fields and sectors. IRGC's objective with the guide is to help scientists and practitioners working on risk governance and on resilience evaluation and building. It does so by providing background information on the various perspectives and tools for integrating risk and resilience, and for measuring resilience and the effectiveness of actions taken to build it.

P.9  State of knowledge and data gaps regarding the potential for cyanide poisoning from consumption of apricot kernels in the United States. Savidge MJ*, Hsu LC, Smegal DC; U.S. Food and Drug Administration   matthew.savidge@fda.hhs.gov
Decision Analysis and Risk

Abstract: Apricot kernels are one of several different foods that contain the cyanogenic glycoside amygdalin. Once ingested, amygdalin is metabolized to hydrogen cyanide, a known toxin. In recent decades, amygdalin, and consequently, amygdalin-rich foods such as apricot kernels, have become a popular “natural” alternative to traditional cancer treatment therapies. Despite little evidence backing these claims, the popularity of apricot kernels continues to grow within the United States and abroad, which has resulted in numerous cases of cyanide poisonings around the world. Following the increase in poisoning cases in their countries, agencies in Australia (FSANZ) and the EU (EFSA) completed assessments on the likelihood of cyanide poisoning from consumption of apricot kernels available to consumers in their respective countries. Both agencies concluded that the consumption of apricot kernels poses a likely health concern for all populations, and FSANZ issued a ban on apricot kernels in 2016. Here, we discuss the approaches taken by FSANZ and EFSA and the data gaps currently limiting FDA’s ability to evaluate the potential for cyanide poisoning from apricot kernel consumption in the United States. The two primary data gaps identified by FDA are 1) lack of quality data on levels of amygdalin in apricot kernels and 2) lack of reliable information on consumption patterns in the U.S. Addressing these data gaps would provide valuable insight on the magnitude of exposure to amygdalin from apricot kernel consumption in the U.S relative to other countries, and could potentially inform policy decisions regarding the sale of apricot kernels in the U.S.

P.10  Understanding the causes and consequences of harms to residents of retirement homes in Ontario, Canada. Mangalam S, Pham P, Castellino A*, Salamati F; PRISM Institute   Paul.Pham@rhra.ca
Decision Analysis and Risk

Abstract: Retirement (seniors) homes in modern jurisdictions are regulated and typically required to be operated as places where residents live with dignity, respect, privacy, and autonomy, in security, safety, and comfort, and can make informed choices about their care options. An evidence-based hybrid framework combining harm reduction and quality-of-life improvement is recognized as the most effective and efficient approach to regulatory oversight. This paper describes a hypothetical integrated decision support framework based on a combination of a risk-based harm reduction model and a resident-reported quality-of-life survey methodology. Evidence on harms is used to build a cause-consequence model and quantify risk. Quality-of-life surveys of retirement home residents help establish a quality score. The resulting risk-quality index helps monitor, measure, and prioritize regulatory actions.

P.12  Qualitative Risk Assessment for Drinking Water Standards Using TTC Approach. Hughes B*, Cox K, Bhat V; NSF International   bhughes@nsf.org
Decision Analysis and Risk

Abstract: NSF International evaluated whether the Threshold of Toxicological Concern (TTC) was “fit for purpose” for setting total allowable concentrations (TAC) and single product allowable concentrations (SPAC) for drinking water contaminants with limited toxicology data under NSF/ANSI Standard 61 (NSF 61). Currently, a threshold of evaluation (TOE) of 3 ppb for the TAC and 0.3 ppb for the SPAC is assigned to chemicals that lack toxicology data and genotoxic structural alerts. If negative mutagenicity and clastogenicity data exist, qualitative TAC and SPAC values are set at either 10 ppb (genotoxicity data only) or 50 ppb (an additional <90-day study exists). The present evaluation compared these qualitative approaches with the TTC by first assigning chemicals into one of four categories: 1) those with genotoxic structural alerts (TAC = 3 ppb and SPAC = 0.3 ppb), 2) Cramer Class III (TAC = 10 ppb and SPAC = 1 ppb), 3) Class II (TAC = 50 ppb and SPAC = 5 ppb), or 4) Class I (TAC = 200 ppb and SPAC = 20 ppb). Structural alerts and Cramer classes were assigned with the OECD Toolbox and ToxTree software. Analysis of a randomly selected subset of TOE compounds (n = 112) with the OECD Toolbox found no structural alerts for 97% of chemicals, indicating that use of the TOE is adequately protective of public health. For chemicals where the qualitative assessment assigned a TAC and SPAC of 10 ppb (n = 44), the majority (~61%) were Class III chemicals and would have been assigned a similar TAC of 10 ppb using the TTC approach. For those chemicals where the qualitative assessment assigned a TAC and SPAC of 50 ppb (n = 11), the OECD Toolbox categorized 55% as Class III and 45% as Class II or I. Results using ToxTree categorized 82% as Class III chemicals and only 18% as Class II or I. However, the small number of chemicals evaluated in this qualitative category makes these predictions difficult to interpret. The ongoing FDA reevaluation of the Cramer class tool may increase the accuracy of these predictions.
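The category-to-limit mapping described in this abstract can be written down directly. The ppb values are those stated in the abstract; the category keys are simplified labels introduced here for illustration.

```python
# TAC/SPAC lookup for the four structural categories in the abstract
# (values in ppb). Category names are simplified labels, not NSF terminology.

TTC_LIMITS_PPB = {
    "genotoxic_alert": {"TAC": 3,   "SPAC": 0.3},
    "cramer_III":      {"TAC": 10,  "SPAC": 1},
    "cramer_II":       {"TAC": 50,  "SPAC": 5},
    "cramer_I":        {"TAC": 200, "SPAC": 20},
}

def allowable_concentrations(category):
    """Return the (TAC, SPAC) pair in ppb for a structural category."""
    limits = TTC_LIMITS_PPB[category]
    return limits["TAC"], limits["SPAC"]
```

Note the fixed 10:1 ratio between TAC and SPAC across all four categories, which mirrors the 3 ppb / 0.3 ppb TOE convention.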

P.13  YPLL: A Comprehensive Quantitative Tool to Evaluate Worker Risk Under Green and Sustainable Remediation. Greenberg GI*, Beck BD; Gradient   ggreenberg@gradientcorp.com
Decision Analysis and Risk

Abstract: The goal of hazardous waste site remediation programs is to reduce risks to populations potentially exposed to chemicals or radionuclides. Environmental remediation practices have evolved to incorporate workers' health and safety in the remedy selection process under Green and Sustainable Remediation (GSR) principles. Many quantitative GSR tools evaluate risk to workers from remediation and from transportation of waste and other materials (e.g., traffic accidents). These risks are calculated for various remedial options to help select a remedy that ideally minimizes worker risk while maximizing the effectiveness of the selected remedy. One GSR tool not commonly used is "years of potential life lost" (YPLL), an epidemiological measure of premature mortality calculated as the difference between a predetermined age, such as life expectancy, and the age at premature death. YPLL can aid in evaluating risks to workers from remediation activities and transportation, and can be useful in selecting a remedy that aligns with GSR principles. We propose using this metric in the remedy selection process to help select a remedy that produces the greatest net public health benefit. An advantage of YPLL over other quantitative tools is its ability to quantitatively weigh the public health benefits of remediation (i.e., reduced risks to nearby residents exposed to chemicals at a hazardous waste site) against the public health costs (i.e., increased risks) experienced by remediation workers. YPLL is a well-established metric that uses the same underlying data as other GSR tools. We will compare YPLL for remediation workers with two models (SiteWise™ and SRT) commonly used to evaluate workers' health and safety, focusing on their most important differences and similarities. We will also present some of the key assumptions used, show how they can influence YPLL results, and discuss the advantages and disadvantages of using YPLL in the remedy selection process.
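The YPLL definition given in this abstract translates directly into a short computation. The ages and the reference age of 75 below are hypothetical example values, not figures from the presentation.

```python
# YPLL as defined in the abstract: for each premature death, the difference
# between a predetermined reference age (e.g., life expectancy) and the age
# at death, summed over deaths. Deaths at or after the reference age
# contribute zero. Example ages are hypothetical.

def years_of_potential_life_lost(ages_at_death, reference_age=75):
    """Sum of (reference_age - age) over deaths occurring before reference_age."""
    return sum(reference_age - age for age in ages_at_death if age < reference_age)

# Deaths at 45 and 60 contribute 30 and 15 years; the death at 80 contributes 0.
ypll = years_of_potential_life_lost([45, 60, 80], reference_age=75)
```

In a remedy comparison, the same calculation would be applied to predicted worker fatalities (e.g., transportation accidents) and to avoided resident fatalities, allowing the benefit and cost sides to be expressed in the same units.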

P.14  Quantitative Microbial Risk Assessment (QMRA) for estimating risk associated with infectious waste in Blood Centers. Gois LHB, Monteiro LKS*, Jorquera O, Cohim E, Kiperstock A; Universidade Federal da Bahia   lorenamonteiro.ufba@gmail.com
Decision Analysis and Risk

Abstract: The blood cycle in the health sector is part of a supply chain with many specificities, which adds losses and generates a great deal of waste in the related processes. Beyond the economic loss associated with unused hemocomponents, the supply chain also requires substantial financial and energy resources to process the waste, which is classified as microbiological and potentially infectious and is treated through special waste management methods. In Brazil, Blood Centers collect blood, after which nonconforming blood bags are sent for disposal. They are treated by incineration or autoclaving and then discarded in landfills. In health services, many risk minimization and control actions have been adopted based on the Precautionary Principle. Admitting a risk without knowing it makes it merely potential, and real evidence is lacking on the effects of Blood Center waste on human health. Quantitative Microbial Risk Assessment (QMRA) is a tool used by institutions to support decisions, and it can be successfully used to estimate the risk to society associated with a given event. The present work aims to provide evidence, using the QMRA method, of the risk involved with Blood Center waste, opening the possibility of reducing the costs and energy losses involved in the current disposal process.

P.15  Reliability as a method for risk assessment in hemovigilance. Calazans B, Pessoa RWS, Coutinho IBS*, Oliveira-Esquerre KPR, Kiperstok A; Federal University of Bahia - UFBA   asaicaro@gmail.com
Decision Analysis and Risk

Abstract: The Brazilian hemotherapy service comprises approximately 2000 institutions. The National Health Surveillance Agency (ANVISA) has developed a semi-quantitative method, the Potential Risk Assessment Method in Hemotherapy Services (MARPSH), to evaluate hemotherapy services through control items defined by legislation. However, these items often lack support from scientific evidence and are instead defined by experts. Furthermore, their compliance is evaluated in a binary fashion, despite the continuous empirical nature of the problem. We therefore propose a method for evaluating the reliability of hemotherapy entities in the potential risk evaluation, based on more flexible metrics within reliability theory. Each failure mode is connected with a prevention barrier. An escalation factor is proposed for each barrier as a function of the degree of conformity to the ideal state. The failure rate is defined a priori by expert opinion and may have its value continuously updated through objective evidence. Finally, the institution's evaluation result is ranked among the levels of potential risk already used by ANVISA. Simulations comparing MARPSH and the reliability method indicated that using reliability does not negate current practices, but rather allows optimization of the sensitivity of the evaluations. The absence of a cause-effect structure for failures in MARPSH underestimates the importance level of some items or of the whole set. Continuous, data-based assessment allows the reliability analysis to be optimized and sensitized. Benefits of this metric include the opportunity to incorporate blood bank risk assessment and failure propagation, and the possibility of updating the risk as improvements are incorporated into the process.

P.16  Urban Heat Projections in a Changing Climate: Washington D.C. as a Case Study. Zhang Y*, Ayyub BM; Center for Technology and Systems Management, University of Maryland, College Park   zhyating@umd.edu
Decision Analysis and Risk

Abstract: Extreme heat events pose a rising threat due to a changing climate. The Urban Heat Island (UHI) effect further amplifies heat impacts and raises heat-related risks during heat events. It is therefore important to characterize the future trends and levels of heat events so that cities can manage and reduce the associated risks. However, previous predictions of local heat events have been limited by low spatial resolution and oversimplified atmospheric models. This study utilizes the improved Weather Research and Forecasting (WRF) model, which dynamically downscales projection results from the Community Earth System Model (CESM), to generate high-quality heat event predictions. Using this method, we predicted heat waves in the Washington metropolitan area for the next 80 years and investigated the interaction between heat events and the UHI effect. Results based on the highest greenhouse gas concentration scenario, Representative Concentration Pathway (RCP) 8.5, indicate that the intensity, frequency, and annual duration of heat waves will increase continuously. The UHI effect and heat events strengthen each other, an interaction that can be weakened by increasing urban surface albedo or the vegetation cover ratio.

P.17  International Activities Related to Development of Guidance on Human Intrusion in the Context of Disposal of Radioactive Waste. Barr C, Pinkston K*, Seitz R, Bailey L, Guskov A, McKenney C; United States Nuclear Regulatory Commission (Author 1, 2 and 6); Savannah River National Laboratories (Author 3); Radioactive Waste Management Limited, UK (Author 4); International Atomic Energy Agency (Author 5)   cynthia.barr@nrc.gov
Decision Analysis and Risk

Abstract: The International Atomic Energy Agency’s Human Intrusion in the Context of Disposal of Radioactive Waste (HIDRA) project is developing guidance to assist countries with decision-making approaches within a safety case using the results of potential inadvertent human intrusion analyses considered in post-closure safety assessments for radioactive waste disposal facilities. HIDRA differentiates between inadvertent intrusion into geological disposal facilities where radioactive waste is by its very nature isolated from the biosphere, and near-surface disposal facilities, where the potential for inadvertent human intrusion into the disposal facility can be higher given the proximity to human activities in the biosphere. In the first phase of HIDRA, a draft report was completed providing background information on international safety standards and guidance for a general approach to address inadvertent human intrusion. The guidance addressed exposure scenario development, consideration of protective measures to reduce the potential for and/or consequences of human intrusion, and consideration of societal factors including risk communication. The objective of Phase 2 of the HIDRA project is to test implementation of the methodology developed in Phase 1. As part of this effort, examples of human intrusion analyses from various member countries are summarized and important findings and risk insights documented.

P.18  A TOPSIS-based model for performance appraisal of risk management system. Sheikh Hassani N*; Akdeniz University   sheikhhassani.n@gmail.com
Decision Analysis and Risk

Abstract: Risk management systems have become an essential component of the management system in many firms. Therefore, having useful tools and methods to appraise the performance of these risk management systems is critical. A performance appraisal process helps firms ensure that the risk management system is properly fulfilling pre-defined targets. It also provides the opportunity for further improvement and for setting more rigorous targets. To evaluate the performance of risk management systems, researchers have introduced various performance indicators. These indicators reflect a wide range of organizational, developmental, capacity, and institutional aspects of a risk management system. This research utilizes the Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS) to derive a performance index and rank the risk management system along different performance indicators. The TOPSIS method has been widely used in the risk assessment phase to evaluate the risks of different alternatives. This research, however, expands its application into the monitoring phase of the risk management system and provides an effective measure for performance appraisal of the risk management system itself.
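The standard TOPSIS procedure the abstract relies on can be sketched compactly. This is a generic illustration, not the author's implementation: the decision matrix (alternatives × indicators) and weights are hypothetical, and all criteria are treated as benefit criteria (higher is better) for simplicity.

```python
# Compact TOPSIS sketch: vector-normalize the decision matrix, weight it,
# find the ideal and anti-ideal points, and score each alternative by its
# relative closeness to the ideal. Matrix and weights are hypothetical.
import math

def topsis_scores(matrix, weights):
    m, n = len(matrix), len(matrix[0])
    # Vector normalization per criterion, then weighting.
    norms = [math.sqrt(sum(row[j] ** 2 for row in matrix)) for j in range(n)]
    v = [[weights[j] * matrix[i][j] / norms[j] for j in range(n)] for i in range(m)]
    # Ideal (best) and anti-ideal (worst) points for benefit criteria.
    ideal = [max(v[i][j] for i in range(m)) for j in range(n)]
    anti = [min(v[i][j] for i in range(m)) for j in range(n)]
    scores = []
    for i in range(m):
        d_best = math.sqrt(sum((v[i][j] - ideal[j]) ** 2 for j in range(n)))
        d_worst = math.sqrt(sum((v[i][j] - anti[j]) ** 2 for j in range(n)))
        scores.append(d_worst / (d_best + d_worst))  # closeness coefficient in [0, 1]
    return scores

# Three hypothetical risk management systems scored on three indicators.
scores = topsis_scores([[7, 9, 9], [8, 7, 8], [9, 6, 8]], [0.5, 0.3, 0.2])
```

The closeness coefficient serves as the performance index described in the abstract: alternatives are ranked in decreasing order of their score.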

P.20  Primary Voting Risk Management. Gurian PL*; Drexel University   pgurian@drexel.edu
Decision Analysis and Risk

Abstract: Voting is often seen as an expression of values, but it is also a consequential act with risky outcomes. This study evaluates primary voting under two different objective functions. The first values a vote by its ability to increase the probability of a candidate winning. The second values a vote by the changes it induces in the proportion of votes for each candidate. Both valuation approaches are applied to a set of six hypothetical voters in the Pennsylvania and Indiana 2016 presidential primaries, using polling data that was available to voters at the time of the primary and the actual number of voters in each primary. Results indicate that voters will not necessarily vote for their preferred candidate or in the primary of the party with which they are most ideologically aligned. For example, a voter in the Indiana primary with preferences intended to represent a “conventional Democratic” voter (Clinton > O’Malley > Sanders > Kasich > Cruz > Trump) would vote for their second least preferred candidate, Cruz, under the probability-of-winning objective function. This is driven largely by the closeness of the Republican primary polls compared to the strong lead Clinton had in the Democratic primary. Under a proportional utility objective function, the “conventional Democrat” would vote for Kasich, with the driving factor in this case being the large difference in utility scores among the Republican candidates. This approach to primary voting will tend to: 1) favor voting in the primary with fewer expected voters, 2) favor voting in the primary with the greater difference in preferences among candidates, 3) discourage strategic voting to advance a weaker candidate from the opposing party’s primary to the general election, and 4) limit the pool of viable candidates to the two front runners in each party’s primary under the marginal probability-of-winning objective function.
It is argued that if adopted more widely, this approach would decrease the polarization of the primary electorate and reward candidates with broad appeal.
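The second objective function can be sketched as follows: one added vote shifts every candidate's vote share, and the vote's value is the utility-weighted sum of those shifts. This is a minimal illustration with hypothetical shares and utilities, not the study's actual model or polling data.

```python
def proportional_utility_gain(shares, utils, n_voters, choice):
    """Utility change from casting one vote for `choice` under the
    proportional objective: each candidate's share moves as the vote
    total grows from n_voters to n_voters + 1."""
    new_total = n_voters + 1
    gain = 0.0
    for j, (p, u) in enumerate(zip(shares, utils)):
        votes = p * n_voters + (1 if j == choice else 0)
        gain += u * (votes / new_total - p)
    return gain

def best_vote(shares, utils, n_voters):
    """Candidate whose added vote maximizes the proportional objective."""
    return max(range(len(shares)),
               key=lambda c: proportional_utility_gain(shares, utils,
                                                       n_voters, c))
```

In this formulation the gain from any one vote shrinks as the electorate grows, which is why the approach favors voting in the primary with fewer expected voters.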

P.21  Moral Hazard in Loss Reduction and the State Dependent Utility. HONG J, SEOG S*; Seoul National University   seogsh@snu.ac.kr
Decision Analysis and Risk

Abstract: We consider a state dependent utility model with binary states where moral hazard occurs in loss reduction. We find different results depending on the relative sizes of the marginal utilities between the loss state and the no loss state. (i) If the marginal utilities are equal between the two states, the optimal insurance involves full insurance up to a limit and coinsurance above the limit, which corresponds to the case of the state independent utility. (ii) If the marginal utility in the loss state is greater than that in the no loss state, then the optimal insurance includes full insurance, and the moral hazard problem becomes less severe than in the state independent case. (iii) If the marginal utility in the loss state is less than that in the no loss state, then the optimal insurance includes a deductible up to a limit and coinsurance above that limit, and the moral hazard problem becomes more severe. We extend the model to a two-period setting, and apply it to the cases of a debt contract of a firm and a wage contract.
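The contract shapes in cases (i) and (iii) can be illustrated schematically; the parameterization below (including the hypothetical deductible level) is purely illustrative, as the paper derives the optimal contracts formally.

```python
def optimal_indemnity(loss, limit, coinsurance, case):
    """Stylized indemnity schedules (illustrative shapes only):
    case (i): full insurance up to `limit`, coinsurance above it;
    case (ii): full insurance;
    case (iii): a deductible, full coverage up to `limit`, then coinsurance."""
    if case == "equal_marginal_utility":          # case (i)
        return loss if loss <= limit else limit + coinsurance * (loss - limit)
    if case == "higher_in_loss_state":            # case (ii)
        return loss
    # case (iii): lower marginal utility in the loss state
    deductible = 0.2 * limit                      # hypothetical deductible
    covered = max(loss - deductible, 0.0)
    capped = min(covered, limit - deductible)
    extra = coinsurance * max(loss - limit, 0.0)
    return capped + extra
```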

P.22  A Single Changepoint Software Reliability Growth Model with Heterogeneous Fault Detection Processes. Nagaraju V*, Fiondella L, Wandji T; University of Massachusetts Dartmouth   vnagaraju@umassd.edu
Decision Analysis and Risk

Abstract: Non-homogeneous Poisson process software reliability growth models enable several quantitative decisions about the software testing process such as reliability, optimal release time, and mean time to failure. Most of these software reliability growth models are characterized by a single continuous curve. However, during the software testing process, the failure data is affected by additional factors such as testing strategy, environment, integration testing, and resource allocation. These factors can introduce one or more changepoints into the fault detection process. Recently, several researchers have proposed non-homogeneous Poisson process software reliability models with one or more changepoints to characterize the failure data more precisely. However, one of the limitations of previous research is that only homogeneous combinations of the failure distribution before and after changepoints are considered. This simplifying assumption may not necessarily be true in all cases because even simple visual inspection suggests that the failure data could follow different distributions at various stages. Therefore, it is possible to observe different combinations of failure distribution before and after changepoints. This paper develops heterogeneous single changepoint models possessing different failure distributions before and after the changepoint. The heterogeneous models are compared with existing homogeneous models using goodness-of-fit measures such as the Akaike Information Criterion and Predictive Sum of Squares Error. Experimental results suggest that heterogeneous changepoint models better characterize 60% of failure data sets considered. Optimal release policies are discussed and a general procedure is presented based on the goodness-of-fit measures to identify the best model to predict the optimal release time that considers the subset of the data available.
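A heterogeneous single-changepoint mean value function can be sketched as below, assuming, purely for illustration, an exponential (Goel-Okumoto-type) fault-detection distribution before the changepoint and a Weibull-type one after, joined continuously; the AIC used for model comparison is also shown. This is a generic construction, not the paper's specific models.

```python
import math

def mean_value_heterogeneous(t, a, b1, b2, k2, tau):
    """Expected cumulative faults detected by time t: exponential
    detection before the changepoint tau, Weibull-type detection of the
    remaining faults after it (continuous at tau by construction)."""
    F1 = lambda s: 1.0 - math.exp(-b1 * s)
    F2 = lambda s: 1.0 - math.exp(-((b2 * s) ** k2))
    if t <= tau:
        return a * F1(t)
    return a * (F1(tau) + (1.0 - F1(tau)) * F2(t - tau))

def aic(log_likelihood, n_params):
    """Akaike Information Criterion; lower is better."""
    return 2 * n_params - 2 * log_likelihood
```

Fitting would maximize the NHPP log-likelihood over (a, b1, b2, k2, tau); a homogeneous model is the special case where both segments use the same distribution family.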

P.23  Entropy for Quantifying Uncertainty and Risk in Economic Disparity. Mishra S*, Ayyub B, Zhang Y; University of Maryland College Park and International Finance Corporation   saurabhthemishra@gmail.com
Decision Analysis and Risk

Abstract: The rise in economic disparity presents significant risks to global social order and the resilience of local communities. However, existing measurement science for economic disparity (e.g., the Gini coefficient) and risk metrics do not explicitly consider the information, deficiencies and uncertainties associated with the underlying income distribution. This paper introduces probabilistic principles of Shannon entropy to quantify uncertainty and risk in economic disparity, with theoretical and empirical evidence for global, national and city level data. Entropy rises as incomes converge, whereas low entropy signals high-risk tipping points for anomaly and conflict detection. Happy societies tend to have high entropy, but public discourse is required for entropy optimization, with significance for the economy's productive structure and social order. Experiments for Washington, D.C., San Francisco, Baltimore, and Detroit are used to assess risk profiles within and between cities. Entropy produces reliable results at significantly lower computational cost than the Gini coefficient.
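The two metrics can be sketched as follows; these are textbook implementations for illustration, not the paper's code. Entropy is maximized when income shares are equal (incomes converge), while the Gini coefficient is zero in that case, and the pairwise-difference Gini shown here is O(n^2) versus a single O(n) pass for entropy.

```python
import math

def shannon_entropy(shares):
    """Shannon entropy (nats) of income shares; highest when shares
    are equal, low when income is concentrated."""
    return -sum(p * math.log(p) for p in shares if p > 0)

def gini(incomes):
    """Gini coefficient via the mean absolute difference between all
    pairs, normalized by twice the mean (simple O(n^2) sketch)."""
    n = len(incomes)
    mean = sum(incomes) / n
    mad = sum(abs(a - b) for a in incomes for b in incomes) / (n * n)
    return mad / (2 * mean)
```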

P.24  Practical Multi-Criteria Decision Analysis with an Alternatives Assessment Framework. Howard B*, Kenney M, Gerst M, Giraud R; American Chemistry Council   brett_howard@americanchemistry.com
Decision Analysis and Risk

Abstract: Alternatives Assessments (AAs) afford value chain stakeholders an opportunity to evaluate a chemical or product holistically, providing an advantage over traditional single attribute determinations. To date, however, AAs have experienced sporadic application, with common complaints being that they are too difficult to implement and do not capture tradeoffs accurately. Much of this stems from an “apples to oranges” comparison mindset: for example, how does one relate reduced environmental impact to increased hazard? Multi-criteria decision analysis (MCDA) is one method stakeholders can use to analyze tradeoffs in otherwise incomparable criteria. By combining MCDA approaches with existing LCA, hazard, exposure, and performance information databases, the process can drastically reduce the time required to perform an AA and increase the informational value of the output. The Value Chain Outreach division within ACC has recently explored applying MCDA techniques in real-world circumstances; this talk will discuss some of our findings and the challenges that remain.
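One common way MCDA puts otherwise incomparable criteria on a shared scale is a weighted sum over min-max-normalized scores, inverting criteria where lower is better (e.g., hazard). The sketch below is a generic illustration of that idea, not ACC's method.

```python
def mcda_score(alternatives, weights, maximize):
    """Weighted-sum MCDA with min-max normalization across alternatives.
    alternatives: dict of name -> list of criterion values.
    maximize: per-criterion flag, False where lower is better."""
    cols = list(zip(*alternatives.values()))      # values per criterion
    scores = {}
    for name, values in alternatives.items():
        s = 0.0
        for j, v in enumerate(values):
            lo, hi = min(cols[j]), max(cols[j])
            x = 0.5 if hi == lo else (v - lo) / (hi - lo)
            s += weights[j] * (x if maximize[j] else 1.0 - x)
        scores[name] = s
    return scores
```

For example, an alternative that performs better and carries less hazard than another dominates it under any positive weighting.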

P.25  Inspections outcomes and their association with contract manufacturing and drug application submissions. Liu W*, Schick A, Kazemi R; US Food and Drug Administration (FDA)   reza.kazemi-tabriz@fda.hhs.gov
Decision Analysis and Risk

Abstract: Regulatory agencies around the world conduct careful inspections of regulated drug manufacturing facilities to determine whether the firm is compliant with regulations and good manufacturing practices (GMPs) to mitigate risk to public health. Although many hypotheses and questions have been raised in forums and conferences about the relationship between GMP inspections and the state of quality in manufacturing facilities, or the effect that potential “risky behaviors” of facilities have on phenomena like shortages, few analytical and data-driven studies have actually been conducted to evaluate these hypotheses. In this paper we set out to study two such questions. First, are pharmaceutical contract manufacturing organizations (CMOs) more susceptible to quality problems compared to facilities that cannot be identified as contract manufacturers? And as an extension, do they experience worse inspection outcomes compared to their counterparts? This study will also take into account the uncertainties involved in identifying contract manufacturing facilities. The second question has to do with regulatory agencies’ assessment of the state of quality of drug manufacturing facilities, particularly with respect to the quantity/characteristics of products with which manufacturers are involved. For instance, is there a trend whereby facilities with a favorable inspection outcome subsequently submit or are named in an increasing number of applications? This change in the variety or volume of products being manufactured affects the risk profile of the facility, and can dramatically impact regulators’ facility risk assessment. Since regulatory agencies have limited resources for conducting inspections, this study may be particularly important in helping them develop optimal risk-based inspection schedules, allowing them to re-direct their inspectional resources in order to more efficiently advance their mission of protecting public health.

P.26  A Probabilistic Risk Model for Contaminated Site Management. Bailey A*, Peterson J; SLR International Corporation   abailey@slrconsulting.com
Decision Analysis and Risk

Abstract: Probabilistic risk assessment (PRA) uses distributions as an alternative to point estimates to calculate and present risk estimates, resulting in more complete and informed risk characterization. PRA also provides insight into the critical factors influencing risk estimates by formally evaluating the roles of variability and uncertainty. PRA is a useful tool to inform management decisions for contaminated sites where unacceptable point estimates of risk have adverse consequences, and to prioritize additional research to further inform risk estimates. The PRA presented herein evaluated cancer risks from arsenic-impacted soil under three types of residential exposure scenarios. A model was developed to incorporate changes in body weight, exposure duration, soil ingestion rate, and exposure time as receptors aged. Measured data from the population in the site vicinity, where available, were used to characterize exposure assumptions. The resulting risk distributions were used to focus remediation on areas likely to pose the greatest potential risk, thereby reducing remediation costs. This project represents the first PRA, to the authors’ knowledge, accepted and used by a State regulatory agency to guide management decisions for a contaminated site. This presentation will focus primarily on the PRA methods and model, as well as application of results.
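The general shape of such a calculation can be sketched as a Monte Carlo sample of the standard incidental soil-ingestion risk equation, risk = C × IR × EF × ED × SF / (BW × AT), reporting an upper percentile of the resulting risk distribution. All distributions and parameter values below are hypothetical placeholders, not the study's inputs.

```python
import math
import random

def mc_cancer_risk(trials=10000, seed=1):
    """Monte Carlo sketch of soil-ingestion cancer risk from arsenic.
    Returns the 95th-percentile risk; every input below is illustrative."""
    rng = random.Random(seed)
    risks = []
    for _ in range(trials):
        c = rng.lognormvariate(math.log(20), 0.5)   # soil arsenic, mg/kg
        ir = rng.triangular(20, 200, 50) * 1e-6     # soil ingestion, kg/day
        bw = rng.normalvariate(70, 10)              # body weight, kg
        ef_ed = 350 * 26                            # exposed days over 26 yr
        at = 70 * 365                               # averaging time, days
        sf = 1.5                                    # slope factor, (mg/kg-day)^-1
        risks.append(c * ir * ef_ed * sf / (bw * at))
    risks.sort()
    return risks[int(0.95 * trials)]
```

A full PRA would additionally vary parameters with receptor age, as the abstract describes, and separate variability from uncertainty (e.g., via two-dimensional simulation).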

P.27  Implications of Anthropogenic Climate Change on Radioactive Waste Disposal in the United States. Lee RC*, Crowe B, Duffy P, Sully M, Levitt D, Black P; Neptune and Company, Inc.   rlee@neptuneinc.org
Decision Analysis and Risk

Abstract: Radioactive waste disposal in the US occurs in a regulatory framework that requires risk assessment of potential disposal sites. The main concern is radionuclide transport out of the disposal site into the nearby environment. Future environmental radionuclide concentrations, resultant risks, and associated uncertainties are quantified in “performance assessments” (PAs), which are probabilistic radionuclide transport/risk models specific to each disposal site. Past PAs have assumed that large-scale future climate changes will follow glacial cycles of approximately the past 1 million years, and have ignored anthropogenic “greenhouse gas” emissions. Considering near universal consensus on anthropogenic climate influences, it is now prudent to address these in PAs and associated decision analyses (DAs). Important aspects for PAs/DAs include influences of climate change on overall temperature/precipitation (affecting water infiltration), frequency of maximum precipitation events (affecting erosion), likelihood of human habitation, and delay of the next glacial period. Examples of PAs at different US sites are presented, in which projections of anthropogenic climate influences over the next millennia are being incorporated. A New Mexico site could experience lower precipitation/higher temperatures, lowering infiltration and reducing habitation; but increases in extreme monsoon events, hastening erosion. A West Texas site could experience desertification, and thus less infiltration and habitation. A New York site could experience greater precipitation and more extreme events, thus increasing runoff and erosion. A Utah site is less likely to be inundated by the return of Lake Bonneville, due to delay of the next glacial period. 
The economic implications of these climate influences on radioactive waste disposal are likely in the trillions of dollars over multiple sites and waste types, as disposal decisions are associated with large costs and substantial risk implications.

P.28  Assessing consumer product manufacturers’ tradeoffs among design criteria in chemical substitution decisions. Rao V*, Francis R, Tanir J; The George Washington University, Human and Environmental Sciences Institute   vrao81@gwu.edu
Decision Analysis and Risk

Abstract: Many chemical manufacturers are embracing alternatives assessment to identify alternative chemicals in consumer and/or specialty products that are safer and more environmentally friendly than chemicals of concern identified in emerging regulations. For example, California has established a Green Chemistry Initiative and has enacted regulation to improve evaluation of chemicals, raise transparency to end users, and require substitution of priority chemicals of concern. Thus, it is important to understand tradeoffs made by chemical and product manufacturers and designers with regards to final product design and re-design decisions. The objective of this research is to characterize tradeoffs among six factors affecting product design: Business Strategy, Economic Considerations, Functionality and Performance, Health and Environmental Endpoints, Public Perception, and Regulatory Factors. These factors were further disaggregated into 33 attributes distributed across the six factors. We assessed tradeoff weights for each factor and the degree of influence of the specified attributes on a recent product design or re-design by administering a survey to consumer or specialty product manufacturers. Our results indicate the most highly weighted factors are Economic Considerations, Functionality and Performance, and Health/Environmental Endpoints. The most important attributes are Product Price, Product Performance, Meets Desired Specifications, Company Reputation, and Meets Regulatory Standards. Although health/environmental endpoints are one of the three most important decision factors, our results suggest consideration of this decision factor must be situated in the broader economic context faced by product manufacturers.

P.30  An algorithmic adversarial risk analysis approach for bi-agent influence diagrams. González-Ortega J*, Ríos Insua D; ICMAT   jorge.gonzalez@icmat.es
Decision Analysis and Risk

Abstract: In his landmark paper, Shachter (1986) proposed extending the computation of optimal decision policies in influence diagrams to the multi-agent case as a fundamental problem. So far, this suggestion has been approached from a (non-cooperative) game theoretic perspective, stemming from Koller and Milch (2003) who introduced Multi-Agent Influence Diagrams (MAIDs) and provided algorithms for finding Nash equilibria in problems modelled as MAIDs. A main drawback of such methodology is its underlying common knowledge assumptions, criticized in e.g. Raiffa et al. (2002). Most versions of non-cooperative game theory assume that adversaries not only know their own payoffs, preferences, beliefs and possible actions, but also those of their opponents. However, in many contexts, including counter-terrorism or cybersecurity, these premises will not generally hold. Adversarial Risk Analysis (ARA) provides a way forward. Instead of addressing the problem simultaneously for all agents, a single decision maker (defender, she) is supported. Her problem is viewed as a decision analytic one, but procedures which employ the game theoretic structure are used to estimate the probabilities of the opponent's (attacker, he) actions, resulting in a maximized expected utility solution. Many different attacker rationalities may be considered in the ARA framework, though a level-2 thinking strategy is adopted, see Stahl and Wilson (1995), so that the defender will ponder over the attacker's strategy but assume that he will not do the same about hers. The uncertainty in the assessments about the attacker's probabilities and utilities propagates to his random optimal decision, which constitutes the required attack forecast. General adversarial problems between two agents are studied, consisting of intermingled sequential and simultaneous movements, spanning across the corresponding planning period. The approach is illustrated through a driving example in critical infrastructure protection.
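The ARA forecast-then-respond logic can be sketched as follows: sample the attacker's uncertain utilities, record his optimal action in each draw (his random optimal decision), and choose the defence maximizing expected utility against that forecast. This is a generic one-shot illustration; the paper treats intermingled sequential and simultaneous moves, which this sketch omits.

```python
import random

def ara_defend(d_actions, a_actions, d_utility, a_utility_sampler,
               n=1000, seed=0):
    """One-shot ARA sketch. a_utility_sampler(rng) draws one realization
    of the attacker's utilities as a dict action -> utility; d_utility(d, a)
    is the defender's utility for defence d against attack a."""
    rng = random.Random(seed)
    counts = {a: 0 for a in a_actions}
    for _ in range(n):
        ua = a_utility_sampler(rng)               # one draw of his utilities
        best = max(a_actions, key=lambda a: ua[a])
        counts[best] += 1
    forecast = {a: c / n for a, c in counts.items()}   # attack forecast
    return max(d_actions, key=lambda d: sum(
        forecast[a] * d_utility(d, a) for a in a_actions))
```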

P.33  Untangling the mystery of assessing snow avalanche hazard - a conceptual model. Haegeli P*, Statham G, Birkeland KW, Greene E; Simon Fraser University, Parks Canada Agency, USFS National Avalanche Center and Colorado Avalanche Information Center   pascal_haegeli@sfu.ca
Decision Analysis and Risk

Abstract: Snow avalanches claim about 150 lives in the western world every year, more than any other natural hazard. Most victims are backcountry recreationists, but avalanches also threaten villages, utility lines, resource operations and cause traffic hazards and economic losses by blocking highways and railways. Avalanche risk is managed in real-time by continuously monitoring weather and snowpack conditions to assess the hazard and determine its effect on the element(s)-at-risk. Mitigation measures are then chosen based on objectives, such as warning the public, skiing a slope or keeping a road open. Assessment methods using a combination of hazard, exposure and vulnerability to determine and compare risks are widely used in natural hazards. Despite recent advances in the adoption of explicit risk concepts among avalanche forecasters, the process by which observations and data are combined into hazard assessments has so far not been formally described. This lack of formal structure makes the process vulnerable to human error and poses a significant hurdle for evaluation, targeted improvements, and effective communication. We introduce a conceptual model of avalanche hazard that describes the assessment process by decomposing the intuitive, judgment-based reasoning process of experts. Starting from a qualitative, risk-based framework, we progressively break down the practice of avalanche hazard assessment into its core components. We then define these components before reassembling them into a probability-consequence framework. The resulting model offers a tangible pathway from observations to hazard assessments that is universally applicable in any type of avalanche risk management context. This makes the model extremely valuable for operational application, training and communication.
We conclude with a discussion of our practical experience with the model, its potential for future research and ideas about the benefits of our approach for other dynamic risk environments.

P.34  Simulation of reconstruction of the affected area of the 2011 Great East Japan Earthquake. Maeda Y*, Masuda R; Shizuoka University   maeda.yasunobu@shizuoka.ac.jp
Decision Analysis and Risk

Abstract: The Great East Japan Earthquake of March 11, 2011 caused serious damage. The disaster involved not only the earthquake itself but multiple hazards, including the tsunami and the radiation contamination caused by the accident at TEPCO's Fukushima Daiichi NPS. Although the affected area is currently recovering, there are regional differences in reconstruction due to differences in the magnitude and types of damage. This research created a system dynamics model of three prefectures in the affected area, Iwate, Miyagi, and Fukushima, and performed simulations to estimate future trends in gross production and population in the three prefectures. The model was built in the Vensim simulation language. It captures the relationship between the economy and population, as well as the impacts of the disaster and the reconstruction, and simulates the period from 2000 to 2040, covering conditions before, during, and after the disaster. Based on the simulation results, we estimated the future of the three prefectures, the factors that will influence reconstruction from the Great East Japan Earthquake, and the factors required for further development of the three northeastern prefectures. The findings are as follows. First, gross prefectural production in the three prefectures was already trending downward before the earthquake, and the model reproduced this trend. Second, even with the addition of reconstruction factors, gross prefectural production returns to a decreasing trend when the reconstruction period expires, so growth cannot be sustained by reconstruction alone. Third, if the change in labor input is raised from negative to zero, Miyagi and Iwate prefectures can maintain gross prefectural production at reconstruction-period levels. Fourth, recovery in Fukushima prefecture is delayed relative to Miyagi and Iwate due to the influence of radiation.

P.35  Developing a decision framework for outer continental shelf sand resource management. Bates ME*, Fox-Lent C, Corr J, Cialone M, Knorr P; US Army Corps of Engineers   matthew.e.bates@usace.army.mil
Decision Analysis and Risk

Abstract: This poster presents a multi-criteria decision analysis framework for managing sand resources (e.g., shoals) in the US outer continental shelf. Sand is mined from these areas for beach nourishment, combatting erosion and providing recreational value and storm protection. Offshore sand resources are finite and once depleted or made unusable by poor management practices are not typically renewed on human time scales. Working through a participatory process with practitioners in various government agencies and in industry, the authors developed a decision framework for evaluating areas for proximal use based on a balance of physical features, project needs, ease of extraction and transport, and long-term site sustainability.

P.36  Probability Distortion is an Optimal Response to Imprecise Probabilities. Johnson KL*, Luhmann CC; Stony Brook University   kelli.johnson@stonybrook.edu
Decision Analysis and Risk

Abstract: When making choices under uncertainty, people behave as if small probabilities are larger than they were said to be and large probabilities are smaller than they were said to be. In other words, people’s subjective probability representations are regressive with respect to nominal probabilities. This phenomenon is known as probability distortion, and is generally considered to reflect limitations in human perception or decision making. However, probabilities are often imprecise, in which case distorting probabilities represents an advantageous decision-making strategy. For a sample drawn from a distribution, the sample proportion serves as the nominal probability. The nominal probability therefore represents the mode of the likelihood function, otherwise known as the maximum likelihood estimate. Alternatively, the weighted probabilities observed in probability distortion are regressive with respect to the nominal probability, and therefore represent the weighted mean probability. Whereas maximum likelihood estimates (nominal probabilities) result in more instances of zero error, weighted mean probabilities result in less error on average. Consequently, regressive probability distortion is advantageous when probability estimates are uncertain. The current work demonstrates this effect via agent-based computational modeling. Conditions under which various measures of the likelihood function should be used for decision making were also simulated. These results have implications regarding human decision-making research, as well as optimal strategies for making decisions based on uncertain probabilities.
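The advantage of regressive estimates can be illustrated by comparing the sample proportion (the maximum likelihood estimate) with the posterior-mean estimate under a uniform prior, (k + 1) / (n + 2), which is regressive toward 0.5 in the same way as empirical probability weighting. This is a simplified stand-in for the abstract's agent-based simulations, with illustrative parameter values.

```python
import random

def estimate_errors(true_p, n, trials, seed=0):
    """Average absolute error of two probability estimators across
    repeated samples of size n from a Bernoulli(true_p) process:
    the MLE k/n versus the regressive posterior mean (k+1)/(n+2)."""
    rng = random.Random(seed)
    mle_err = mean_err = 0.0
    for _ in range(trials):
        k = sum(rng.random() < true_p for _ in range(n))
        mle_err += abs(k / n - true_p)
        mean_err += abs((k + 1) / (n + 2) - true_p)
    return mle_err / trials, mean_err / trials
```

For probabilities near 0.5, shrinking every estimate toward the middle reduces error on average, even though the MLE is exactly right more often, which is the tradeoff the abstract describes.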

P.37  Risk Evaluation in Peer Review of Grant Applications. Gallo SA*, Thompson L, Schmaling K, Glisson S; American Institute of Biological Sciences   sgallo@aibs.org
Decision Analysis and Risk

Abstract: The process of peer review is used to identify the most scientifically meritorious research projects for funding. Most research funding agencies would like to fund the most highly impactful research; however, much criticism of peer review focuses on the perception that panelists are biased against innovation. Some recent evidence, from our group and others, suggests that review scores of funded projects are only moderately correlated with citation impact, and one study found that reviewers systematically assigned poorer scores to highly novel work. Moreover, it is clear that reviewers’ definitions for excellent research and paradigm-shifting research are different; innovative research may not always be considered excellent. But it is clear that more needs to be done to understand the decision-making processes of reviewers, both as individuals and as a panel, in evaluating high-risk research. In an effort to address this gap, the American Institute of Biological Sciences developed a comprehensive peer review survey that examined, in part, the differences in applicant and reviewer perceptions of review outcomes. The survey was disseminated to 14,138 reviewers and applicants (9% response rate). Only 19% of respondent applicants indicated innovation potential as an area addressed in review feedback, while 84% of respondent reviewers indicated that they factored innovation into selecting the best science and 74% viewed innovation as an essential component of scientific excellence. Similarly, while only 33% of respondent applicants received comments on the riskiness of their grant applications, 63% of respondent reviewers indicated that the risks associated with innovative research impacted the scores they assigned to the grant applications. These results suggest a disconnect in perception between how innovation is evaluated in grant applications and how the feedback is received.

P.38  Evaluating impacts to the DoD mission and the defense industrial base from chemical regulation under the amended Toxic Substances Control Act. Vogel C*, Rak A, Underwood P, Scanlon K, Bandolin N, Esola S; (1,2) Noblis, Inc.; (3,4) Office of the Assistant Secretary of Defense for Energy, Installations, and Environment; (5) U.S. Army Public Health Command; (6) DoD Defense Contract Management Agency Industrial Analysis Group   catherine.vogel@noblis.org
Decision Analysis and Risk

Abstract: The Toxic Substances Control Act (TSCA), amended in June 2016, provides the Environmental Protection Agency (EPA) with new authority to evaluate and address risks to human health and the environment from chemical substances and mixtures. The EPA identified an initial list of 15 chemicals for which risk evaluations and risk management actions will be developed through TSCA Sections 6(b) and 6(h) rule makings. TSCA-driven risk management actions can range from use restrictions and implementation of additional industrial hygiene control measures to full manufacturing bans. Many of the chemicals identified by EPA are used in Department of Defense (DoD) sustainment activities or in the manufacturing of components for weapon systems and platforms. The Deputy Assistant Secretary of Defense Environment, Safety and Occupational Health (ESOH) staff, the Military Departments, and the Defense Contract Management Agency Industrial Analysis Group are using a collaborative, multi-pronged approach to identify and evaluate potential DoD mission impacts from the TSCA rule makings. The approach includes (1) identifying the conditions and usage amount of each chemical through database research; (2) engaging subject matter experts to identify the mission criticality of these chemicals; (3) assessing industrial base impacts; and (4) engaging with EPA to inform the TSCA rule makings. A pilot assessment of two chemicals with proposed TSCA Section 6(b) rule makings is underway to identify supporting industrial base suppliers, explore the availability of potential chemical substitutes, and project the associated industrial base impact of the regulations. The pilot effort will aid in proactively identifying and addressing DoD mission risk from future TSCA rule makings. The poster describes the collaborative risk evaluation approach, summarizes supporting data, and presents key pilot assessment results.

P.39  Analysis of Consumers’ Preference to Accidental and Chemical Risk in a Purchase of a Domestic Appliance. Tsunemi K*, Kawamoto A, Ono K; National Institute of Advanced Industrial Science and Technology   k-tsunemi@aist.go.jp
Decision Analysis and Risk

Abstract: The aim of this study is to construct a framework for qualitative and quantitative evaluation of various effects, including consumers’ preference, to contribute to establishing safety targets for domestic appliances. A questionnaire survey was conducted on consumers’ preference regarding accidental risk and chemical risk, focused on flame retardants used in plastic parts of electric and electronic home appliances. First, we classified the effects of flame retardants into four groups (health, environment, safety and economy) and identified four alternatives: products containing a brominated flame retardant, a phosphorus flame retardant, an inorganic flame retardant, or no flame retardant. Next, we qualitatively evaluated the health, environment, safety and economy effects of the four alternatives. Then, we conducted a questionnaire survey of consumers on the purchase of a domestic appliance and analyzed consumers’ preferences by the analytic hierarchy process (AHP). As a result, the degree of consumers’ preference for health, environment, safety and economy was quantified as 24%, 18%, 32% and 25%, respectively. This revealed that consumers were more conscious of avoiding accidental risk than chemical risk. However, the alternative product containing an inorganic flame retardant, which has a low level of safety, had the largest degree of consumer preference, which was inconsistent with the former result.
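The AHP weighting step can be sketched with the standard geometric-mean approximation to the principal eigenvector of a pairwise comparison matrix; the matrix in the example below is illustrative, not the study's survey data.

```python
import math

def ahp_weights(matrix):
    """Priority weights from an AHP pairwise comparison matrix
    (matrix[i][j] = how strongly criterion i is preferred to j)
    via the geometric-mean row approximation, normalized to sum to 1."""
    gms = [math.prod(row) ** (1.0 / len(row)) for row in matrix]
    total = sum(gms)
    return [g / total for g in gms]
```

For a perfectly consistent matrix, this row geometric mean recovers the eigenvector weights exactly; with real survey responses one would also check the consistency ratio before accepting the weights.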

P.40  Methodology for deriving provisional advisory levels (PALs) for chlorine. Kobylewski-Saucier SE, Taylor ML*, Lipscomb JC; 1. Consolidated Safety Services, Inc; 2. WinTech, LLC; 3. US Environmental Protection Agency   lipscomb.john@epa.gov
Dose Response: Chemical Specific Methods and Results

Abstract: PALs are health threat characterizations (oral and inhalation threshold values) that estimate levels of harm from exposure to toxic industrial chemicals and chemical warfare agents (CWA) at various exposure levels and durations. PAL values inform options for emergency response and risk management decisions (e.g. temporary re-entry of contaminated areas and reuse of previously contaminated resources); they are not no-effect levels but provide “degree of injury” tiers. Exposures at the PAL 1, 2, and 3 tiers are associated (respectively) with reversible, irreversible or escape-impairing, and/or lethal health effects for exposure durations ranging from up to one day to up to two years. Chlorine is used in the manufacturing of numerous products and has been involved in several unanticipated releases, including its use as a CWA. The respiratory tract is the primary target for inhaled chlorine toxicity; effects range from sensory irritation to epithelial tissue damage to increased airway resistance to death at high doses. Chlorine hydrolysis products formed in the body can disrupt enzyme function and membrane structure and damage tissues. Based on its toxicity and general availability, chlorine is considered an important potential agent of harm in both accidental and subversive release scenarios. This project evaluates chlorine toxicity information within the context of the comprehensive methodology for PALs derivation, including consideration of direct and secondary mechanisms of toxicity in identifying critical effects (CE) and points of departure (POD). This paper will demonstrate how CE and POD are selected using a weight-of-evidence approach involving analysis of the relationship between toxicity mechanisms and characteristic physiological endpoints. The views expressed in this paper are those of the authors and do not necessarily reflect the views or policies of the Agency. Mention of trade names or commercial products does not constitute endorsement or recommendation for use.

P.42  Implications of Recent Changes to the Toxicity of 1,4-Dioxane on the Derivation of Regulatory Criteria. Sager SL*, Forsberg ND, Prucha C, Bull L; ARCADIS U.S., Inc. (Sager & Forsberg) and Waste Management (Prucha & Bull)   shawn.sager@arcadis.com
Dose Response: Chemical Specific Methods and Results

Abstract: 1,4-Dioxane is emerging as a water resource contaminant of interest because of its presence in many industrial and commercial products as well as its historical use as a solvent and as a stabilizer for solvents (e.g., 1,1,1-trichloroethane). Due to its high solubility, 1,4-dioxane is frequently detected in groundwater at low part-per-billion levels. The United States Environmental Protection Agency (USEPA) has not established a drinking water standard for 1,4-dioxane, although several states have set groundwater or drinking water standards for this constituent. In 2010, USEPA revised its cancer evaluation for liver tumors in rodents, concluding that 1,4-dioxane is likely to be carcinogenic to humans. USEPA justified its cancer risk assessment approach by also concluding that the available toxicological data were insufficient to adequately support a nonlinear cancer mode of action (MOA) for 1,4-dioxane. Applying these conclusions to the derivation of groundwater quality criteria yields sub-part-per-billion levels. Scientific investigations by Dourson et al. (2014) and the Alliance for Risk Assessment in 2017 to resolve MOA data gaps concluded that 1,4-dioxane causes liver tumors in rodents through a regenerative cell proliferation MOA. This cancer MOA implies a threshold of exposure below which tumors do not form and supports the use of a non-linear low-dose extrapolation procedure for estimating risks for the most sensitive tumor endpoint. Using these results, alternative risk-based drinking water criteria can be derived that are over 1,000 times greater than the criterion derived using USEPA’s current toxicity values.
This presentation will summarize recent advances in the understanding of 1,4-dioxane’s cancer MOA and discuss the impact of the new information on establishing risk-based drinking water quality criteria utilizing methodologies established by several states.

P.43  Use of DistillerSR to Facilitate Systematic Reviews. Wilkins A*, Thayer K; Federal Government   wilkins.amina@epa.gov
Dose Response: (More) Tools to Operationalize Human Health Risk Assessment

Abstract: To support the Environmental Protection Agency’s (EPA) mission to protect human health and the environment, the Integrated Risk Information System (IRIS) Program evaluates studies of chemicals in the environment to identify adverse health effects and characterize exposure-response relationships. Over the last several years, the IRIS Program has been adopting systematic review methodology in the development of its assessments. As part of implementing systematic review, the IRIS Program is increasingly using a variety of specialized software applications to increase the efficiency of conducting assessments. This poster describes the use of one web-based systematic review software tool, DistillerSR, commonly used in the field of systematic review to screen studies for relevance. DistillerSR facilitates group screening efforts and offers many features, including tracking of screening conflicts, creation of inclusion/exclusion reports, project management capabilities, and user-customized form creation. Key features and staff time metrics for recent screening work will be described in this presentation. The views expressed herein are those of the authors and do not necessarily reflect the views or policies of the US EPA.

P.44  Validation and Application of a Text Mining Tool for Identification and Categorization of Mechanistic Data Related to the Key Characteristics of Carcinogens: Case Studies of a Problem Formulation Tool. Chappell G*, Welsh B, Harvey S, Harris M, Wikoff D; ToxStrategies, Inc.   gchappell@toxstrategies.com
Dose Response: (More) Tools to Operationalize Human Health Risk Assessment

Abstract: Efforts to determine efficacious methods for identification and categorization of information in toxicological systematic reviews are ongoing. We assessed the utility of a text-mining and machine learning tool (SWIFT) in the characterization of mechanistic data associated with carcinogenic endpoints, organized by the ten key characteristics of carcinogens (KCC) (Smith et al., 2016). Objectives included: 1) validation of a process that employs SWIFT to categorize literature by the KCC using five systematic review datasets, and 2) assessment of the approach as a problem formulation method via characterization of the literature for 20 substances of varying carcinogenic potential. With respect to validation, we found that the text-mining strategy returned ~60% of the KCC-relevant studies that were identified by an analyst via title and abstract screening, demonstrating the utility (and limitations) of the method as a problem formulation tool. Search syntax optimization was highly influential, with the internally developed syntax returning different results than the default KCC search strings. Regarding machine learning, we found that the prioritization tool did not adequately reduce the screening effort for relatively small datasets (<1,000 articles): the effort required to train the program to sufficiently rank papers by relevance outweighed manual screening. For the second objective, the profiles of studies categorized according to the KCC varied considerably across agents, both within and across carcinogenic potential groupings: for many agents (8/20), the data were predominantly associated with few (≤5) KCC, while others (4/20) had a relatively even distribution of studies across the ten KCC.
Collectively, our findings demonstrate the potential utility of computational tools to support problem formulation and identification of data gaps, although the importance of evaluation beyond categorization to mechanistic categories is emphasized.

P.45  Proposed key characteristics of male reproductive toxicants as a method for organizing and screening mechanistic evidence for non-cancer outcomes. Arzuaga X*, Yost E, Hotchkiss A, Beverly B, Gibbons C; U.S. Environmental Protection Agency, Office of Research and Development, National Center for Environmental Assessment, and National Institutes of Health, National Institute of Environmental Health Sciences, National Toxicology Program   arzuaga.xabier@epa.gov
Dose Response: (More) Tools to Operationalize Human Health Risk Assessment

Abstract: The adoption of systematic review practices for risk assessment includes integration of evidence obtained from experimental, epidemiological, and mechanistic studies. Although mechanistic evidence plays an important role in mode of action analysis, the process of sorting and analyzing mechanistic studies and outcomes is a challenging exercise due to the diversity of research models and methods, and the variety of known and proposed pathways for chemical-induced toxicity. The recently identified Ten Key Characteristics of Carcinogens provide a valuable tool for organizing chemical-specific data on potential mechanisms of carcinogenesis. However, such an approach has not yet been developed for non-cancer adverse outcomes. Our objective in this study was to identify a set of key characteristics that could be applied for screening mechanistic evidence for male reproductive effects. Identification of seven key characteristics of male reproductive toxicants was based on a survey of established mechanisms/pathways of toxicity. As a proof of principle, we applied this set of key characteristics to organize experimental and mechanistic studies that evaluate the effects of the PCB mixture Aroclor 1254 on the male reproductive system. A database was developed to capture the available information and experimental design details on Aroclor 1254 for each of the key characteristics of male reproductive toxicants. The proposed key characteristics provide a useful method that can facilitate the systematic and transparent organization of mechanistic evidence relevant to chemical-induced effects in the male reproductive system. Disclaimer: The views expressed are those of the authors and do not necessarily represent the views or policies of the US EPA.

P.46  High-throughput benchmark dose modeling using a web-application and Python interface library for US EPA Benchmark Dose Modeling Software (BMDS). Shapiro AJ*; US National Toxicology Program   andy.shapiro@nih.gov

Abstract: The US EPA benchmark dose modeling software (BMDS) is a widely used application for dose-response modeling and has been extensively applied in regulatory decision making. However, the software is designed as a desktop application, and the dose-response models are written in a performant low-level programming language with fixed-text input and output files, making it difficult to interface with modern software. Because of these implementation limitations, BMDS has generally been applied to lower-throughput modeling applications or to highly customized software packages that solve a specific scientific problem (e.g., BMDExpress for microarray data). Rewriting the application in a different language, which would allow easier integration into modeling pipelines, would present a challenge: outputs from a new application would not be accepted without a large burden of software equality tests. Instead, we present a Python-based interface for easy and efficient execution of the BMDS software, https://pypi.python.org/pypi/bmds. This flexible, open-source Python interface allows integration of the BMDS modeling software into other systems. The interface provides automatic creation of model input files, model execution, parsing of output files, figure creation, and model-recommendation logic. Inputs and settings are fully customizable. Further, a web application has also been created with a REST API for job submission, allowing seamless integration into other applications with no installation of BMDS on the target machine. Performance on a standard desktop computer allows dose-response throughput of up to one dataset per second, enabling BMDS to be used in new scientific realms such as high-throughput assays or microarrays. Further, the application has been designed to be agnostic to model input settings and recommendation methods, allowing for maximum flexibility and adaptability.
All software presented is currently available, open-source, and free.
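To make the kind of computation such an interface automates concrete, here is a minimal sketch (not the actual `bmds` package API) of how a benchmark dose follows in closed form from a fitted dichotomous log-logistic model for a given benchmark response (BMR); the parameter values are illustrative only, not fitted to any real dataset:

```python
import math

def loglogistic_prob(dose, g, a, b):
    """Dichotomous log-logistic model: P(d) = g + (1 - g) / (1 + exp(-a - b*ln(d)))."""
    if dose <= 0:
        return g
    return g + (1.0 - g) / (1.0 + math.exp(-a - b * math.log(dose)))

def bmd_extra_risk(a, b, bmr=0.10):
    """Closed-form BMD: solve extra risk (P(d) - g)/(1 - g) = BMR for dose d."""
    return math.exp((math.log(bmr / (1.0 - bmr)) - a) / b)

# Illustrative parameters only (not from any real analysis):
bmd = bmd_extra_risk(a=-2.0, b=1.5, bmr=0.10)
```

A high-throughput pipeline would loop such fit-then-solve steps over thousands of datasets, which is the workload the abstract's one-dataset-per-second figure refers to.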

P.47  A Web-based Bayesian Dose-Response Assessment System. Shao K*, Shapiro A; Indiana University   kshao@indiana.edu
Dose Response: (More) Tools to Operationalize Human Health Risk Assessment

Abstract: Since the introduction of the benchmark dose (BMD) methodology, the traditional BMD modeling tools (e.g., EPA’s BMDS and RIVM’s PROAST) have played a key role in facilitating and promoting the BMD approach. These tools employ maximum likelihood estimation to fit dose-response models and produce mainly point estimates for model parameters, benchmark dose values, and other quantities of interest. As the mainstream regulatory risk assessment community moves toward a probabilistic assessment framework, more robust methods are needed to provide quantitative estimates (such as distributional estimates) to support risk assessment and enable benefit-risk analysis. To address these needs and advance the state of the science, a new system that is fully based on Bayesian statistics has been developed. This software application uses the Stan probabilistic programming language for model fitting, with MCMC sampling for Bayesian inference. Modeling outputs include distributions of model-fit parameters, distributions of BMD estimates, and risk/response calculations for user-specified dose levels. Posterior model weights are used not only to evaluate model-fitting performance but also to calculate model-weight-averaged BMDs. A new module uses the posterior sample of BMD estimates from the dose-response modeling step to perform probabilistic low-dose extrapolation as introduced in Chiu and Slob (2015). Here we present the new software system and the web-based interface that has been developed to enable users to conduct their own analyses.
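The core idea — MCMC sampling yielding a posterior distribution of BMDs rather than a single point estimate — can be sketched in pure Python with a toy quantal-linear model and a random-walk Metropolis sampler; this stands in for the Stan-based machinery, and the dataset, flat prior, and tuning constants below are invented for illustration:

```python
import math
import random

def loglik(b, doses, n, cases, g=0.01):
    """Binomial log-likelihood for a quantal-linear model P(d) = g + (1-g)(1 - exp(-b*d))."""
    ll = 0.0
    for d, ni, yi in zip(doses, n, cases):
        p = g + (1 - g) * (1 - math.exp(-b * d))
        p = min(max(p, 1e-12), 1 - 1e-12)  # guard the logs
        ll += yi * math.log(p) + (ni - yi) * math.log(1 - p)
    return ll

def posterior_bmd(doses, n, cases, bmr=0.10, iters=5000, seed=1):
    """Random-walk Metropolis on the slope b (flat prior, b > 0); each draw yields a BMD."""
    rng = random.Random(seed)
    b = 0.1
    ll = loglik(b, doses, n, cases)
    draws = []
    for _ in range(iters):
        prop = abs(b + rng.gauss(0, 0.02))  # reflect at zero to keep b positive
        llp = loglik(prop, doses, n, cases)
        if rng.random() < math.exp(min(0.0, llp - ll)):
            b, ll = prop, llp
        # For quantal-linear, extra risk = 1 - exp(-b*d), so BMD = -ln(1 - BMR)/b
        draws.append(-math.log(1 - bmr) / b)
    return sorted(draws)  # toy sketch: no burn-in discarded

# Invented dichotomous dataset: dose, group size, responders
bmds = posterior_bmd([0, 10, 30, 100], [50, 50, 50, 50], [1, 4, 12, 30])
bmdl = bmds[len(bmds) // 20]   # ~5th percentile, a BMDL analogue
median = bmds[len(bmds) // 2]
```

The full system additionally averages such posterior samples across models using posterior model weights.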

P.48  Defining Priors for Bayesian Dichotomous Dose-Response Analysis. Allen BC*, Blessinger TD; U.S. Environmental Protection Agency and Independent Consultant   blessinger.todd@epa.gov
Dose Response: (More) Tools to Operationalize Human Health Risk Assessment

Abstract: Bayesian approaches in dose-response modeling are becoming more common given the wide availability of Markov Chain Monte Carlo techniques and recent advances in Bayesian model averaging. An important consideration for implementation of Bayesian approaches is the specification of parameter priors. A Bayesian model averaging method recently released in EPA’s Benchmark Dose Software (BMDS) defines model-specific parameter priors empirically based on 558 datasets obtained from EPA’s IRIS database. This poster presents an alternative approach in which the priors applied to model-free quantities of interest (e.g., extra risk at a particular dose, background response, BMD50) are consistent across models that are fit to the data. Also investigated is the impact of the correlation among model parameters (which was not accounted for in the empirical investigation noted above). To address this issue, an alternative model parameterization is proposed for which the assumption of independence among parameter priors is reasonable.

P.49  Applying in vitro toxicity data to inform chemical risk assessment. Wheeler MW*, Bailer JB, Whittaker C; National Institute for Occupational Safety and Health   mwheeler@cdc.gov
Dose Response: Predicting and Observing

Abstract: Various in vitro toxicity assays, including those in the USEPA ToxCast high-throughput toxicity testing database, have been developed in an effort to better inform chemical risk assessments; however, few modeling methodologies have been created to fully use this information. For example, whole dose-response curves for a collection of responses are typically not considered, and more complex models that involve multiple structurally similar chemicals are frequently not used. This presentation investigates applying machine learning approaches, including Gaussian processes, functional data models, and clustering techniques, to datasets created from the National Toxicology Program’s databases as well as the US EPA’s ToxCast high-throughput toxicity testing platform to better classify risk. The results are presented to explore potential advantages of using this information to augment a standard quantitative risk assessment based on the analysis of results from a single long-term bioassay study. We examine the potential of combining this information to develop models that will augment traditional quantitative risk assessment.

P.50  Estimating Chronic Toxicity Values from Short Term Tox Tests: Application to Chemical Substitution Decisions. Kratchman J*, Gray G; George Washington University   gmgray@gwu.edu
Dose Response: Predicting and Observing

Abstract: With increasing demand for safer chemicals, more chemical alternatives assessment frameworks are being developed to replace target chemicals in products and processes. However, many chemical alternatives lack chronic toxicity data, leaving assessments imbalanced. In the absence of such data, this study investigated whether short-term non-cancer toxicity data can be used to predict chronic toxicity effect levels by focusing on the dose-response relationship instead of a critical effect. This information is then applied to an alternatives assessment decision. Data from National Toxicology Program (NTP) technical reports were extracted and modeled using a novel high-throughput processing approach and the Environmental Protection Agency’s Benchmark Dose Software. Best-fit and minimum benchmark doses (BMDs) and benchmark dose lower limits (BMDLs) were modeled for all NTP-pathologist-identified significant non-neoplastic lesions, final mean body weight, and mean organ weights for 41 chemicals tested by NTP between 2000 and 2012. Relationships between the short-term and chronic data were then developed using orthogonal regression techniques. The findings indicate that short-term animal studies may reasonably provide a quantitative estimate of a chronic BMD or BMDL. This can allow relative human toxicity comparisons for chemicals that lack chronic toxicity data. Next, two pairs of target-alternative chemical recommendations were considered. The alternative chemicals lacked chronic toxicity data, whereas the targets had well-studied non-cancer health effects. Using the established quantitative relationships, chronic health effect levels were predicted for the alternative chemicals and compared to known points of departure (PODs) for the targets. The findings indicate that some alternatives assessment approaches can recommend substitute chemicals with greater toxicity concerns than the target chemical.
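The orthogonal-regression step can be illustrated with the closed-form slope for errors-in-both-variables fitting with equal error variances (Deming regression with λ = 1), which, unlike ordinary least squares, treats short-term and chronic BMD estimates symmetrically; the data pairs below are invented log10(BMD) values, not results from the study:

```python
import math

def orthogonal_fit(xs, ys):
    """Orthogonal (Deming, lambda = 1) regression: minimizes perpendicular distances."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = (syy - sxx + math.sqrt((syy - sxx) ** 2 + 4 * sxy ** 2)) / (2 * sxy)
    return slope, my - slope * mx  # (slope, intercept)

# Invented short-term vs. chronic log10(BMD) pairs, for illustration only:
slope, intercept = orthogonal_fit([0.5, 1.0, 1.5, 2.0], [1.0, 2.0, 3.0, 4.0])
```

With a fitted slope and intercept, a chronic BMD for a data-poor alternative can be predicted from its short-term BMD, which is the comparison the abstract describes.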

P.51  The Correlation between Liver Tumor Incidence and Early-stage Liver Weight Change - An Analysis Using NTP Data. Chen Q*, Shao K; Indiana University   chenqira@umail.iu.edu
Dose Response: Predicting and Observing

Abstract: Liver tumors are a significant and complicated disease in humans. Various studies have focused on the causes of liver tumors and on biomarkers for early detection of tumor occurrence. In this study, we aim to investigate whether there is a correlation between short-term liver weight change in rodents and liver tumor incidence. We screened all 593 published NTP technical reports and found 174 compounds that have at least one species/sex combination with positive or clear evidence of liver tumors. Next, we searched for data on absolute and relative liver weight at early stages for these compounds and finally enrolled 79 chemicals that have both long-term cancer data and short-term liver weight data. For the data analysis, we employed the benchmark dose (BMD) method to analyze the corresponding dichotomous and continuous dose-response data for each compound to determine the dose level that causes a significant change in each endpoint, followed by calculation of the Pearson correlation coefficient between these two sets of BMDs. Multiple models were applied to take model uncertainty into account. Preliminary results show that the correlation coefficients range from -0.26 to 0.86, with a mean of 0.53 for 3-month relative liver weight. In addition, more than 43% of the correlations are greater than 0.6, which indicates a fairly strong association. Generally, although considerable uncertainties still exist, there is relatively high agreement between the BMD estimates from the tumor data and the short-term liver weight data. These results suggest that liver weight increase can be used as a risk indicator for liver tumors, giving an early warning signal.
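The correlation step reduces to the standard Pearson formula applied to the paired BMD estimates; a minimal sketch follows, where the BMD pairs are invented purely to show the calculation, not values from the NTP analysis:

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / math.sqrt(sxx * syy)

# Invented tumor-BMD vs. liver-weight-BMD pairs (mg/kg-day), for illustration:
r = pearson([12.0, 45.0, 80.0, 150.0, 300.0], [10.0, 50.0, 70.0, 180.0, 260.0])
```

In the study this coefficient is computed per model combination, which is how the reported range of -0.26 to 0.86 across models arises.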

P.52  NMR- and MS-based metabolomics to investigate molecular effects of repeated dose exposure of maleic acid in Sprague-Dawley rats. Wu C*, Chen CH, Chen HC, Liang HJ, Chen ST, Lin WY, Wu KY, Chiang SY, Lin CY; National Taiwan University   d01841001@ntu.edu.tw
Dose Response: Predicting and Observing

Abstract: Maleic anhydride (MAH), an organic raw material frequently used in consumer and industrial products, has been intentionally adulterated into a variety of starch-based foods. We aim to elucidate possible mechanisms of maleic acid (MA) toxicity by 1) determining changes in the metabolic profile of Sprague-Dawley rat urine after repeated exposure using 1H NMR spectroscopy and multivariate analysis, and 2) investigating whether MA induces oxidative stress using LC-MS/MS. Adult male Sprague-Dawley (SD) rats were divided into four dose groups and subjected to a 28-day repeated-dose study (0, 6, 20, and 60 mg/kg) via oral gavage. Urine samples were collected twice a day on Days 0, 7, 14, 21, and 28; organs underwent histopathological examination. Our results demonstrated that MA exposure increases the urinary concentrations of 8-OHdG, 8-NO2Gua, and 8-isoPGF2α; analysis of acetoacetate, hippurate, alanine, and acetate demonstrated time- and dose-dependent variations in the treatment groups. Changes in body weight gain and relative kidney weights in the medium- and high-dose groups were significantly different compared to untreated rats (p < 0.05). Physio-morphological alterations were evident in the kidneys and liver. Our results suggest that MA consumption escalates oxidative damage and membrane lipid destruction and disrupts energy metabolism. These changes in biomarkers and metabolites can assist in characterizing the possible mechanisms by which maleic acid induces nephro- and hepatotoxicity.

P.53  Evaluating the Association between Alterations in Maternal Thyroid Hormones and Adverse Neurodevelopmental Outcomes. Brown L*, Reichle L, Klein R, Ginsberg G; Abt Associates, Inc.; Partnership for Pediatric and Environmental Health   lauren_brown@abtassoc.com
Dose Response: Predicting and Observing

Abstract: In utero exposures to endocrine disrupting chemicals can impact a range of health outcomes for the offspring. Specifically, when considering potential disruptions to proper maternal thyroid functioning due to chemical exposures, there is increasing concern that adverse neurodevelopmental outcomes may be seen in the offspring. This is especially true early in pregnancy, before the fetus has its own functioning thyroid. Results of a comprehensive literature review on the associations between maternal thyroid hormone levels and offspring neurodevelopmental outcomes will be presented. Specifically, we found that altered free thyroxine (fT4) in early pregnancy has the most evidence regarding the relationship between potentially altered thyroid hormone homeostasis in pregnancy and adverse neurodevelopmental outcomes. We will present multiple approaches to utilizing data from the identified literature to evaluate the magnitude of the potential impact of altered maternal fT4 levels on offspring neurodevelopment. These approaches consist of both evaluating dose-response relationships between maternal fT4 and offspring neurodevelopmental outcomes and examining how shifts in distributions of thyroid hormone levels may place additional proportions of a population at risk for adverse neurodevelopmental outcomes. The presented methods can be used to assess the potential neurodevelopmental impacts on the fetus of a pregnant mother exposed to a chemical that may alter homeostatic thyroid functioning (e.g., PCBs, perchlorate, PBDEs, PFAS, BPA).

P.54  Estimate of IQ loss in infants due to exposure to arsenic in infant cereals. Lynch MT*, Chiger A, Houlihan J; Abt Associates (1&2) and Healthy Babies Bright Futures (3)   mtklynch@gmail.com
Dose Response: Predicting and Observing

Abstract: Children in the United States may be exposed to inorganic arsenic via food and drinking water. Our goal is to quantify the risks of IQ loss in children from arsenic in infant rice cereal. We will present the results of a literature review for dose-response functions linking inorganic arsenic exposures to IQ loss in children, and for reference doses based on these effects. We used our arsenic exposure estimates (presented in a companion poster titled “Inorganic Arsenic Exposures Associated with Consumption of Infant Rice Cereal” by Andrea Chiger, Jane Houlihan and Meghan Lynch) in our quantitative dose-response analysis. We will present estimates of avoided IQ losses in children from reduced consumption of infant rice cereal (due to a switch to other cereals with lower arsenic content, such as oatmeal, or to theoretical decreases in arsenic concentrations in infant rice cereal). Our analyses indicate that inorganic arsenic exposures from infant rice cereal consumption may be associated with adverse neurodevelopmental effects, specifically IQ loss, in U.S. children.
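The avoided-IQ-loss calculation the abstract describes amounts to applying a dose-response slope to exposure under baseline and reduced-consumption scenarios and taking the difference; a minimal sketch, where both the slope and the intake values are hypothetical placeholders (not the study's dose-response function or exposure estimates):

```python
def iq_loss(daily_intake_ug_kg_day, slope_iq_points_per_ug_kg_day):
    """Linear low-dose approximation: IQ points lost = slope x average daily intake."""
    return slope_iq_points_per_ug_kg_day * daily_intake_ug_kg_day

# Hypothetical illustration only -- neither value is from the study:
baseline = iq_loss(0.20, 0.5)   # rice cereal consumption scenario
switched = iq_loss(0.05, 0.5)   # lower-arsenic cereal scenario
avoided = baseline - switched   # avoided IQ loss per child
```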

P.55  Association with Using Statins and the Risk of New Diagnosis Diabetes Mellitus in Transient Ischemic Attack Patients. Ho WC*, Yin MC, Chu YR, Peng YH, Tsan YT, Chen PC; China Medical University   whocmu@gmail.com
Dose Response: Predicting and Observing

Abstract: Diabetes mellitus (DM) is a major public health issue in Taiwan and worldwide, affecting people's health and quality of life. Transient ischemic attack (TIA) is an important cerebrovascular disease in Taiwan, where cerebrovascular disease is the third leading cause of mortality. Statins are widely used for hyperlipidemia because of their cholesterol-lowering effect and potentially pleiotropic effects, and they can potentially reduce the risk of TIA patients developing more severe stroke. Some studies have shown that statin use may be related to the risk of developing DM, but the results remain controversial. Therefore, the aim of this study was to investigate the risk of developing DM after statin use among TIA patients. The study design was a retrospective cohort study based on the Taiwan Longitudinal Health Insurance Database 2000 (LHID2000). The study subjects were newly diagnosed cases of transient ischemic attack from January 1, 1997 to December 31, 2011. Statin dose was assessed according to cumulative defined daily doses (cDDDs). Cox proportional hazards regression models were used to investigate the relationship between statin use and the risk of newly diagnosed DM among TIA patients. To account for differences in the timing of personal statin use, a Cox proportional hazards model with a time-dependent covariate was used to calculate hazard ratios (HR) and 95% confidence intervals (95% CI) after adjusting for sex, age, income, and urbanization level. Sensitivity analyses and subgroup analyses were also conducted. All hypothesis tests were 2-sided with statistical significance at p < 0.05. The results showed that statin use among TIA patients may reduce the risk of newly diagnosed DM. When three exposure groups based on cumulative dose were analyzed (low, medium, and high doses), a dose-response relationship was found. More research is needed on the mechanism and effect of statins on DM among TIA patients.
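The cDDD exposure metric used above can be sketched directly: cumulative dispensed dose is divided by the WHO defined daily dose, then binned into the low/medium/high groups. The dispensing records, the 20 mg reference dose, and the cutoff values below are all assumptions made for illustration, not the study's actual definitions:

```python
def cumulative_ddd(dispensings, ddd_mg):
    """cDDD = total dispensed dose divided by the defined daily dose.

    `dispensings` is a list of (daily_dose_mg, days_supplied) records.
    """
    return sum(dose_mg * days for dose_mg, days in dispensings) / ddd_mg

def exposure_group(cddd, low_cut=30.0, high_cut=120.0):
    """Bin a patient's cDDD into low/medium/high; cutoffs here are illustrative."""
    if cddd < low_cut:
        return "low"
    elif cddd < high_cut:
        return "medium"
    return "high"

# Hypothetical patient: 30 days at 20 mg/day, then 30 days at 40 mg/day,
# against an assumed 20 mg defined daily dose:
cddd = cumulative_ddd([(20, 30), (40, 30)], ddd_mg=20.0)
group = exposure_group(cddd)
```

In the analysis, such group labels (or time-varying cDDD totals) enter the Cox model as the exposure covariate.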

P.56  Statin Use and Temperature on Transient Ischemic Attacks among Diabetes Mellitus Patients. Chang PH*, Chou YJ, Yin MC, Chu YR, Tsan YT, Chan WC, Ho WC, Chen PC; China Medical University   peipei10220312@gmail.com
Dose Response: Predicting and Observing

Abstract: Ischemic stroke (IS) is the major type of stroke in Taiwan. Studies indicate that transient ischemic attack (TIA) has a mechanism similar to ischemic stroke and may serve as an early-stage opportunity for prevention of ischemic stroke. Diabetes mellitus (DM) patients tend to have an increased risk of stroke. Statins are widely used medicines, prescribed for their cholesterol-lowering effect in patients with hyperlipidemia to prevent vascular diseases. Studies show that temperature affects arterial pressure and cholesterol concentration and can thereby promote thrombosis. The objective of this study is to investigate whether statin use and temperature have an impact on transient ischemic attacks among DM patients. The study design was a retrospective cohort study. The medical records of subjects, including TIA events and statin use, were collected from the Longitudinal Health Insurance Database, covering two million people with DM. Meteorological factors, including temperature and air pollution data, were obtained from the Taiwan Environmental Protection Administration's 77 air pollution monitoring stations. Statin exposure and temperature were analyzed using a time-stratified case-crossover approach. A conditional logistic regression model was used to relate average daily meteorological factors and daily air pollution concentrations to TIA. The results showed that temperature changes are associated with TIA and that statins have a protective effect on transient ischemic episodes. The modifying effect of temperature changes on the relationship between statins and transient ischemic episodes can be an important topic.

P.57  Environmental risk assessment in e-SCM. Mohammadabbasi M*, Sheikh Hassani N; Tehran University   borninfarvardin@gmail.com
Ecological Risk Assessment

Abstract: Electronic supply chain management (e-SCM) involves using the internet, mostly in the manufacturing industry, to manage the whole supply chain. An effective e-SCM solution allows companies to produce products that meet clients' needs and achieve an appropriate return on investment. As companies become increasingly concerned with their environmental responsibility, there is a notable tendency to integrate environmental issues into supply chain management systems. Introducing an environmental dimension into the supply chain brings new trade-offs into supply chain decisions. Addressing these trade-offs requires specific tools and models such as multi-criteria decision making (MCDM) methods. In this paper, we outline the similarities and differences of risk assessment in e-SCM and traditional supply chains. We illustrate methods that help companies using e-SCM to achieve better environmental risk assessment performance. Results from empirical research are reported to further show the application of MCDM methods such as TOPSIS and AHP in successfully assessing environmental risks of e-SCM.

P.59  Minimizing Average Procurement Unit Cost for Rotorcraft Tradespace Exploration. Bhattacharya S*, Nagaraju V, Fiondella L, Spero E, Ghoshal A; University of Massachusetts, Dartmouth   sbhattacharya@umassd.edu
Economics and Benefits Analysis

Abstract: In recent years, Tradespace Exploration (TSE) has emerged as a systematic strategy to assess the effectiveness and suitability of various alternative conceptual designs. One of the primary advantages of TSE is that it provides an environment for detailed consideration of tradeoffs, which can be used by stakeholders before committing to any configuration. TSE has gained significant momentum for developing and conceptualizing products capable of performing in the wide range of adverse conditions commonly encountered by military systems. Combined with technology, TSE provides a collaborative environment among stakeholders for analysis of alternatives. However, much of the previous TSE research emphasized tradeoffs between functional requirements, especially those related to performance. Our past work proposed models for rotorcraft TSE to quantify the impact of non-functional requirements such as reliability, availability, and survivability. The U.S. Army Research Laboratory, in collaboration with Georgia Tech, developed the Capability Assessment and Tradeoff Environment (CATE) tool, whose primary aim is to perform tradespace exploration for Future Vertical Lift and Joint Multi-Role rotorcraft technologies. At the heart of CATE is NASA's Design and Analysis of Rotorcraft (NDARC) software, a conceptual-level tool to design a rotorcraft for specified missions under designated conditions and then conduct performance analysis under nominal conditions as well as in scenarios that exceed anticipated mission conditions. We seek to further this analysis and develop more rigorous quantitative reliability, availability, and maintainability (RAM) models that can be integrated with the present CATE system and benefit rotorcraft TSE efforts. This paper presents a model for rotorcraft TSE that explicitly considers the relationship between subsystem reliability improvement and average procurement unit cost (APUC).
Examples illustrate how the model allocates resources to improve subsystem reliability in a manner that minimizes APUC for any fleet size specified.

P.60  Economic recession of the old industrial base in Northeast China. Jiang HZ*, Tiffany P; Northeast Yucai School; University of California, Berkeley   zhanzhansy@outlook.com
Economics and Benefits Analysis

Abstract: From the so-called "traditional old industry base" that played the most important role in supporting the whole country's economy since the 1950s to today's last place in GDP, the northeastern part of China (i.e., Liaoning, Jilin, and Heilongjiang provinces) is experiencing a chronic, serious economic downturn. The GDP growth rate for the first quarter of 2016 shows that Liaoning is the only province with negative growth among China's more than thirty provinces. The generally poor economic conditions affect almost everyone living in the region, from large enterprises and local governments to homeowners. The risk of a further recession is still very high, especially for Liaoning province. We raise the question: what are the primary and secondary reasons that Northeastern China has suffered such a regional recession in recent years? We investigated many factors that could help explain the recession and the risk of a further, more severe one. These factors include: regional advantage in national-level policy, the aging of industrial structures, the feasibility of and challenges in transforming traditional industries (e.g., steel, petroleum), funding shortages and a financial credit crisis, a crisis of trust in local government, unemployment, inter-regional competition, talent migration, and environmental deterioration (e.g., air pollution). We employed two main methodologies. First, we collected relevant data and built econometric models to identify important factors. Then, we used qualitative methods (including focus groups and interviews with local policy makers, economists, and laymen) to understand why. This study provides the first comprehensive, detailed analysis of the economic recession in Northeastern China and offers insight to local and national policy makers.

P.61  Annualized Loss of Revenue Caused by Cyber-attacks for Power Generation in Virginia Using Agent-Based Modeling. Poyraz OI, Keskin OK*, Pinto CA; Old Dominion University   okesk001@odu.edu
Economics and Benefits Analysis

Abstract: In 2016, the highest annualized cost of cyber-crime to a commercial company was $74M, and the average annualized cost of cyber-attacks to the utilities and energy industry was $14.80M. The cost of cyber-crime has been increasing as the frequency and sophistication of cyber-attacks increase. In response, organizations invest more in cyber security and cyber insurance to protect against loss of confidential information, business disruption, loss of revenue, and even loss of hardware. If state-sponsored or non-state attackers decide to infiltrate an organization's network, then with enough resources and dedication they can eventually accomplish this malicious goal through intrusion methods such as social engineering and Trojans. If such compromises occur in the critical infrastructure sector, they can throw a nation into chaos for a time; the attacks in Estonia (2007), Georgia (2008), and Ukraine (2015) are examples. The motivation of this study is to illustrate the potential cost of cyber-crime and to provide a ballpark figure for evaluating the risk: critical infrastructure decision-makers need to gauge whether cyber investment or insurance is a necessity for their organization. In this study, we conducted a Monte Carlo simulation and computed the average annual loss of revenue using agent-based modeling. The model includes the current power generation entities of the Commonwealth of Virginia and is represented on a map using Geographical Information Systems. We also included losses caused by business disruption, hardware and software damage, and cyber-security monitoring costs in order to calculate the total annualized average loss from cyber incidents. The input parameters of the simulation are detection time, recovery time, and the unit price of electricity. We conducted the simulation using real data from the Energy Information Administration and industry surveys by the Ponemon Institute.
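
The simulation logic described above can be sketched as a simple Monte Carlo loop over incidents. This is a minimal illustration, not the authors' agent-based model: the capacity, price, incident frequency, downtime distributions, and cost figures below are invented for demonstration.

```python
import random

random.seed(42)

# Assumed illustrative inputs (not the study's data).
CAPACITY_MW = 600.0           # generating capacity of one plant
PRICE_PER_MWH = 35.0          # $/MWh electricity price, assumed
INCIDENTS_PER_YEAR = 2        # assumed cyber-incident frequency
MONITORING_COST = 1.5e6       # $/yr fixed cyber-security monitoring, assumed

def simulate_annual_loss():
    """One Monte Carlo draw of annual loss from cyber incidents:
    lost revenue during downtime plus hardware/software and monitoring costs."""
    loss = MONITORING_COST
    for _ in range(INCIDENTS_PER_YEAR):
        detect_h = random.lognormvariate(2.0, 0.5)   # hours to detect
        recover_h = random.lognormvariate(3.0, 0.5)  # hours to recover
        downtime = detect_h + recover_h
        loss += downtime * CAPACITY_MW * PRICE_PER_MWH  # lost revenue
        loss += random.uniform(5e4, 2e5)                # hardware/software
    return loss

draws = [simulate_annual_loss() for _ in range(10_000)]
average_annual_loss = sum(draws) / len(draws)
```

Averaging many draws gives the annualized expected loss; in the paper this computation is driven per-agent by the EIA and Ponemon inputs rather than the placeholder distributions used here.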

P.62  Can Scanner Food Purchase Data Help Us Identify Sources of Foodborne Illness? Ashton L, Berck P, Cole D, Hoffmann S*, Todd J; University of Wisconsin, Madison; University of California, USDA Animal Health Inspection Service, USDA Economic Research Service   shoffmann@ers.usda.gov
Economics and Benefits Analysis

Abstract: Since the adoption of the Sanitary and Phytosanitary Agreement in 1995, food safety authorities around the world have been working to move from a hazard-based to a risk-based approach to food safety management. The international focus on risk-based food safety management has included a call to develop methods "to estimate the proportion of illnesses and death that is truly foodborne and the major food vehicles, processes, and food handling practices responsible for each hazard" (FAO/Codex, Principles and Guidelines for Microbial Risk Management, p. 7). The result has been an emerging area of food safety research: source attribution (Pires et al. 2009). Source attribution of foodborne illness is surprisingly difficult. As a growing body of research has focused on developing new approaches to estimating quantitative relationships between food exposures and human disease, it has become clear that a wider range of methods is needed. This paper develops a new source attribution method using analysis of daily time series data to study the relationship between enteric disease and food exposure. Our study uses daily data from the Foodborne Diseases Active Surveillance Network (FoodNet) on illness from Campylobacter and Shiga toxin-producing Escherichia coli (STEC) O157, and daily food purchase data collected by Nielsen. Both pathogens have been studied using other source attribution methods, providing a basis for examining the external validity of our approach. These data are regionally specific, allowing examination of geographic heterogeneity in the association between food exposures and disease. We explore the effects of temporal aggregation, lag structure, and food category structure on statistical power. We find patterns of food association consistent with those found in prior case-control studies, but our results reveal more specific regional and temporal patterns in these associations.
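
The core statistical idea, regressing daily illness counts on lagged daily purchases, can be sketched with synthetic data. This is a minimal one-predictor illustration, not the authors' model: the series, the 3-day lag, and the effect size below are invented, and a real analysis would control for seasonality and other covariates.

```python
import random
import statistics

random.seed(7)

# Synthetic daily series (assumed): purchases of one food category, and
# illness counts that respond with a 3-day incubation lag.
days = 200
purchases = [random.gauss(100, 10) for _ in range(days)]
LAG = 3
illness = [0.05 * purchases[t - LAG] + random.gauss(0, 1) if t >= LAG else 0.0
           for t in range(days)]

def lagged_slope(y, x, lag):
    """Simple OLS slope of y_t on x_{t-lag} (one-predictor regression)."""
    pairs = [(x[t - lag], y[t]) for t in range(lag, len(y))]
    xs = [a for a, _ in pairs]
    mx = statistics.mean(xs)
    my = statistics.mean(b for _, b in pairs)
    sxy = sum((a - mx) * (b - my) for a, b in pairs)
    sxx = sum((a - mx) ** 2 for a in xs)
    return sxy / sxx

# Scanning candidate lags shows which exposure window fits best; the
# association should peak at the true (here, constructed) 3-day lag.
slopes = {lag: lagged_slope(illness, purchases, lag) for lag in range(1, 8)}
```

In this constructed example the slope at lag 3 recovers the planted coefficient of 0.05 while other lags hover near zero, which is the signature the paper's lag-structure exploration looks for in real data.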

P.63  Consumer Approval of Nanomaterials in Food and Medicine. Hilgard J*, Nucci ML, Hallman WK; Annenberg Public Policy Center, University of Pennsylvania; Illinois State University; Rutgers University   jhilgard@gmail.com
Emerging Nanoscale Materials

Abstract: This study examines consumers' approval, evaluation, and policy stances regarding the use of nanotechnology in food and medicine. Although nanotechnology may provide useful innovations in product design, packaging, and supplementation, consumers often see novel food technologies as risky, as exemplified by opposition to genetically modified foods. To understand public awareness of and opinion toward nanotechnology, we conducted a web-based survey of a nationally representative panel (N = 1058). Participants evaluated a variety of nanotechnology applications in terms of their approval of the product, their behavioral intentions toward the product, and whether the product should require special labels. Embedded experiments manipulated attributes of some products in a randomly assigned, between-subjects fashion. Overall, awareness of nanotechnology is low, and attitudes are generally ambivalent or mildly supportive. Products that provided solutions to serious problems, such as disease, were seen more favorably than products that embellished existing functionality, such as vitamin supplements. Furthermore, results indicate that attitudes are more favorable toward nano-products that are not directly ingested by humans or by livestock. Taken together, it appears that consumers see ingested nanoparticles as a source of risk. Nanotechnology may find greater approval when fed to pets rather than humans, used in packaging rather than in the food itself, or used to address serious problems that conventional products cannot solve.

P.64  A Novel Clustering Analysis for the Examination of Metal Oxide Pulmonary Toxicity in Rodents. Ramchandran V*, Gernand JM; Penn State University   vzr124@psu.edu
Emerging Nanoscale Materials

Abstract: A quantitative, analytical relationship between the characteristics of emerging nanomaterials and their toxicity is desired to better assist in the mitigation of toxicity by design. Experimental toxicology studies carry drawbacks in time and cost that can be overcome or limited by the development of computational approaches. Quantitative structure-activity relationships (QSARs) and meta-analyses are popular methods for developing predictive toxicity models. A meta-analysis investigating the dose-response and recovery relationship in metal oxide (MO) pulmonary toxicity studies on rodents was performed using a novel clustering approach. The primary objective of the clustering is to categorize groups of similarly behaving MOs (those with similar dose-response-recovery relationships), leading to the identification of physicochemical differences between the clusters and an evaluation of their contributions to toxicity. Studies are grouped by the similarity of their dose-response-recovery behavior; the algorithm uses a combination of genetic algorithms and robust hierarchical clustering to categorize the different MO particles, with the Akaike information criterion (AIC) as the performance metric to guard against overfitting. Differences in the toxicity of the clusters can be explained by their respective potency and correlated with variations between the attributes of the clusters. The clustering analysis of MO particles (for 5 response variables) revealed at least 6 toxicologically distinct groups among the MOs on the basis of dose-response similarity. Analysis of the attributes of the clusters reveals that they also differ in length, diameter, and chemical composition. MO particles with short lengths and small diameters were found to be more potent than the other MOs analyzed.
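
The grouping-and-AIC idea can be sketched in one dimension. This is a minimal illustration, not the authors' genetic-algorithm procedure: the "potency" values are synthetic, clustering is plain agglomerative merging, and AIC is used only to compare coarse groupings.

```python
import math
import random

random.seed(1)

# Synthetic potency scores (assumed): two toxicologically distinct groups.
potency = ([random.gauss(1.0, 0.1) for _ in range(8)] +
           [random.gauss(5.0, 0.1) for _ in range(8)])

def agglomerate(values, k):
    """Greedy 1-D agglomerative clustering: repeatedly merge the two
    adjacent clusters with the closest means until k clusters remain."""
    clusters = [[v] for v in sorted(values)]
    while len(clusters) > k:
        means = [sum(c) / len(c) for c in clusters]
        gaps = [means[i + 1] - means[i] for i in range(len(means) - 1)]
        i = gaps.index(min(gaps))
        clusters[i:i + 2] = [clusters[i] + clusters[i + 1]]
    return clusters

def aic(clusters, n):
    """AIC for an equal-variance Gaussian model: n ln(SSE/n) + 2k.
    Lower is better; the 2k term penalizes extra clusters."""
    sse = sum(sum((v - sum(c) / len(c)) ** 2 for v in c) for c in clusters)
    return n * math.log(sse / n) + 2 * len(clusters)

# Note: with few points per cluster the 2k penalty is weak, so AIC can
# still favor finer splits; the clear signal here is k=2 versus k=1.
n = len(potency)
scores = {k: aic(agglomerate(potency, k), n) for k in range(1, 5)}
```

With two well-separated groups, the AIC for a two-cluster fit is far below the one-cluster fit, which is the kind of comparison that identifies "toxicologically distinct" groups in the meta-analysis.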

P.65  Understanding the growing costs for FEMA’s Public Assistance Program: The role of repeated hazards and institutional knowledge by applicants. Ghaedi H*, Reilly A; University of Maryland   hghaedi@umd.edu
Engineering and Infrastructure

Abstract: The Federal Emergency Management Agency (FEMA) is responsible for providing assistance to qualified entities before, during, and after a disaster to reduce the impact of hazards. One of the most important types of assistance FEMA provides is Public Assistance (PA), which funds debris removal and the repair and restoration of public infrastructure damaged by hazards. The Government Accountability Office recommends an overhaul of how PA is funded because of its rising and unsustainable costs. In the current study, we shed light on whether institutional knowledge developed by applicants who experience repeated hazards increases the likelihood that they will apply for PA in the future, and ultimately the amount of funding they qualify for. This could suggest that the rising costs borne by FEMA after disasters partially result from more entities discovering that they are entitled to funding and from a better understanding of the bureaucratic complexities of filing for assistance, which makes entities more likely to file.

P.66  Costs of Seismic Retrofits of Existing Federal Buildings for Disaster Resilience. Halper SH*, Saadat Y, Ayyub BM; University of Maryland, College Park   shalpe25@terpmail.umd.edu
Engineering and Infrastructure

Abstract: Throughout the history of the United States, many buildings were designed according to building codes that predate seismic provisions. As a result, many older buildings are not adequately equipped to withstand loads produced by earthquakes. This creates a safety hazard to the occupants of these buildings and their surroundings, so retrofitting is required, especially for the most vulnerable buildings. Multiple retrofit methods exist that are suited to different types of building structures and different structural deficiencies. The purpose of this work is to determine which retrofit methods are most economical for a particular building type. This involves identifying a system for classifying building structures, developing methods for estimating the costs of each retrofit method for each building structure type and the benefits gained as a result of the retrofit, and ultimately computing benefit-cost ratios. Cost estimates include direct and indirect costs. Costs depend strongly on factors such as seismicity (low, moderate, high, very high), performance objective (life safety, damage control, immediate occupancy), occupancy class (parking, retail, residential, industrial, institutional/educational, commercial, assembly), and occupancy during construction (in place, temporarily removed, vacant). Benefits are determined by losses avoided, including loss of life and property losses. The benefit-cost ratios provide a basis for selecting economically justifiable and effective solutions. This work concentrates on buildings belonging to the United States Federal Government and outlines a methodology for producing benefit-cost ratios for such buildings.
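
The benefit-cost computation can be sketched as annual losses avoided divided by the annualized retrofit cost. The discount rate, horizon, and dollar figures below are invented placeholders, not the study's estimates.

```python
def annualized_cost(upfront, rate, years):
    """Convert an upfront retrofit cost to an equivalent annual payment
    using the standard capital recovery (annuity) factor."""
    a = rate / (1 - (1 + rate) ** -years)
    return upfront * a

def benefit_cost_ratio(annual_loss_before, annual_loss_after,
                       upfront_cost, rate=0.03, years=30):
    """BCR = expected annual losses avoided / annualized retrofit cost."""
    benefit = annual_loss_before - annual_loss_after
    return benefit / annualized_cost(upfront_cost, rate, years)

# Illustrative only: a $1M retrofit that cuts expected annual seismic
# losses from $120k to $30k, evaluated at 3% over 30 years.
bcr = benefit_cost_ratio(120_000.0, 30_000.0, 1_000_000.0)
```

A ratio above 1 marks the retrofit as economically justifiable under these assumptions; the study's actual ratios would also fold in loss-of-life valuation and the seismicity and occupancy factors listed above.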

P.67  Validating and Improving Downscaling Methods of Global Climate Model Results in Predicting Extremes using Copula. Hu H*, Ayyub BM; University of Maryland, College Park   hhu3@umd.edu
Engineering and Infrastructure

Abstract: The ability to analyze and predict future precipitation plays a key role in risk and damage management. Substantial effort has gone into building Global Climate Models that provide climate projections at the global scale. However, applying a global-scale projection at a local scale requires a downscaling procedure that transforms a global trend into a local trend. Several downscaling methods have been proposed in previous work, but none has been fully validated against historical data for extreme events. In this work, three downscaling methods (Global Daily Downscaled Projections, Localized Constructed Analogs, and Multivariate Adaptive Constructed Analogs) are validated and compared in the context of extreme precipitation. We observe that all three downscaling methods work well for non-extreme precipitation; for extreme precipitation, however, their accuracy decreases significantly. To explore the underlying reason, probabilistic methods based on copulas are used. A copula is a probabilistic tool that decouples the marginal distributions from the joint dependency among random variables. Using copulas, we find that existing downscaling methods reproduce the marginal distributions, which matter most for average precipitation, with high precision, but capture the copula dependency, which plays a key role in projecting and analyzing extreme precipitation, much less accurately. Inspired by this observation, a novel method is proposed that corrects existing downscaling results for extremes. The method is nonparametric and performs better as more downscaling methods are incorporated.
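
The central copula idea, separating marginals from dependence, can be sketched with rank transforms. This is a minimal empirical illustration, not the authors' method: the synthetic series and the crude tail-dependence measure below are assumptions.

```python
import random

random.seed(3)

def pseudo_observations(xs):
    """Rank-transform a sample into (0,1): this strips out the marginal
    distribution, leaving only the dependence structure (empirical copula)."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0] * len(xs)
    for r, i in enumerate(order, start=1):
        ranks[i] = r
    n = len(xs)
    return [r / (n + 1) for r in ranks]

def upper_tail_dependence(u, v, q=0.9):
    """Fraction of joint exceedances of the q-quantile among marginal
    exceedances: a crude empirical measure of extreme co-occurrence."""
    joint = sum(1 for a, b in zip(u, v) if a > q and b > q)
    marginal = sum(1 for a in u if a > q)
    return joint / marginal if marginal else 0.0

# Synthetic stand-ins for observed vs. downscaled precipitation: one pair
# strongly dependent, one independent (marginals identical by construction).
n = 2000
x = [random.gauss(0, 1) for _ in range(n)]
dependent = [xi + random.gauss(0, 0.2) for xi in x]
independent = [random.gauss(0, 1) for _ in range(n)]

u = pseudo_observations(x)
t_dep = upper_tail_dependence(u, pseudo_observations(dependent))
t_ind = upper_tail_dependence(u, pseudo_observations(independent))
```

Both pairs have the same marginals after rank transformation, yet their tail co-occurrence differs sharply, which is exactly the distinction the abstract argues downscaling methods get wrong for extremes.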

P.68  Risk perception of hydrogen fueling stations among the Japanese public given risk and benefit information. Ono K*, Tsunemi K; AIST   kyoko.ono@aist.go.jp
Engineering and Infrastructure

Abstract: Hydrogen storage facilities, such as hydrogen fueling stations (H2 stations), are essential infrastructure for the use of fuel cell vehicles. We are interested in the public acceptance of H2 stations and how their installation is perceived by the public. The aims of this study are to describe the characteristics of public perception of H2 stations in Japan using risk perception and acceptance scales, and to analyze differences in acceptance when risk or benefit information about an H2 station is provided. We conducted an online survey asking respondents to rate their acceptance of having an H2 station constructed at the gas station nearest their home. Respondents were divided into 18 groups receiving different combinations of risk and/or benefit information. We also asked respondents about their risk perception, risk acceptance, and risk-avoiding tendencies. Factor analysis was conducted to extract factors characterizing public acceptance in each group. We found the following to be explanatory factors for acceptance: gender, degree, vehicle use, knowledge about hydrogen, risk perception of H2 stations, and inherent risk acceptance and avoidance. Binomial regression analysis and structural equation modelling were used to construct an acceptance model, and the risk perception factor "Dread" was dominant among the effective independent variables. We also discuss differences in acceptance rates among groups that did or did not receive risk or benefit information. This work was supported by the Japan Science and Technology Agency (JST), Cross-ministerial Strategic Innovation Promotion Program (SIP).

P.69  Reliability analysis of an effluent containment, transport, and segregation system. Santana SPB*, Pessoa RWS, Oliveira-Esquerre KP; Federal University of Bahia   karlaesquerre@ufba.br
Engineering and Infrastructure

Abstract: Water pollution is a major global concern, and every effort to avoid it must be made. In an effluent treatment plant, the reliability of the effluent is of great importance when the effluent is reused or, especially, discharged into the receiving water body. Data analysis becomes challenging when seeking to quantify the probability of an overflow (failure) in an effluent containment, transport, and segregation system. Reliability analysis is used in this work as a tool to estimate the probability of events that compromise the safety or operation of the system or process, supporting subsequent decision making for the system considered. Rainfall was taken into account as an accelerating factor for overflow that can directly influence the plant's operation. The probabilities of overflow from each basin to another basin or to the water body (sea or river) were estimated. Among other findings, the results show that under rainfall the probability of failure is higher than during dry seasons. The developed tool can support operational decisions and better management of the system's risks.

P.70  Resilience-oriented Analysis of Risk Management and Ontology-based Categorisation of Hazards in Interdependent Infrastructure Systems. Yan J*, Tang J; Future Resilient Systems, Singapore-ETH Centre   junqing.tang@frs.ethz.ch
Engineering and Infrastructure

Abstract: A resilient and safe infrastructure network should exhibit high connectivity efficiency, tolerable robustness to disturbance, and moderate service flow. Disruptions can be roughly classified as random failures or deliberate attacks, arising from either natural hazards or technical system failures. Both traditional risk management approaches and resilience analyses of general infrastructure systems place comprehensive emphasis on a critical understanding of unexpected hazards. Owing to differing levels of granularity and complexity, the uncertainty and ambiguity of the risks to which urban infrastructure systems are subject are increasingly intriguing, as those risks are evolving and dynamic in nature. Therefore, a new approach, resilience-based risk management, is in great demand; yet it is an emergent scientific approach that remains underdeveloped in the field. We propose that this new approach can be realised in the following ways: 1) develop a framework that organises resilience and risk analysis for general infrastructure systems; 2) by applying the new approach and framework, provide essential aid to decision-making during risk management and analysis. The effectiveness of the approach can be assessed by testing on real-world infrastructure systems. For example, our study could identify the weak points of a transport network with respect to different disturbances; by reinforcing those weak points, the network's robustness, and thus its mobility, could be improved. This work is still under development; future work involves applying the resilience-oriented risk management approach to real-world case studies and testing the categorisation method on real infrastructure network hazard data.

P.71  Traffic-Accident Prediction Using Advanced Machine Learning Techniques. Aguiar Filho A, Soares ES*, Esquerre KP, Barreto TB, Pessoa RW; Federal University of Bahia   soares.eduardo.sampaio@gmail.com
Engineering and Infrastructure

Abstract: The present work aims to understand the relation between vehicular accident types and possible causes on Brazilian highways. To carry out the analyses, a database from the National Department of Transport Infrastructure (DNIT) covering a seven-year period (2005 to 2011) was used. The prediction model was built using the space-time characteristics of the highway with gradient boosted regression trees (GBRT), via the LightGBM package. Model quality was evaluated on a test data set representing 30% of the total data, using the confusion matrix, the no-information rate, and the multi-class logarithmic loss. The model shows good qualitative performance. The most influential variable in identifying the accident type is the location of the stretch of highway (km). Climatic conditions and the quality and characteristics of the highway stretches, among others, are suggested as predictive variables for future studies to improve predictive performance. After this initial test, an improved model using another powerful machine learning strategy was proposed.
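
The GBRT idea can be sketched without LightGBM as gradient boosting of regression stumps under squared loss. This is a regression stand-in for the paper's multi-class setting, and the "km marker" data are synthetic: severity is constructed to jump on one stretch of road.

```python
import random

random.seed(0)

# Synthetic highway-km feature and accident-severity response (assumed):
# severity is higher on the stretch between km 40 and km 60.
X = [random.uniform(0, 100) for _ in range(300)]
y = [(3.0 if 40 < x < 60 else 1.0) + random.gauss(0, 0.3) for x in X]

def fit_stump(X, resid):
    """Best single-split regression tree (a stump) minimizing SSE."""
    best = None
    for t in sorted(set(round(x) for x in X)):
        left = [r for x, r in zip(X, resid) if x <= t]
        right = [r for x, r in zip(X, resid) if x > t]
        if not left or not right:
            continue
        lm, rm = sum(left) / len(left), sum(right) / len(right)
        sse = (sum((r - lm) ** 2 for r in left) +
               sum((r - rm) ** 2 for r in right))
        if best is None or sse < best[0]:
            best = (sse, t, lm, rm)
    _, t, lm, rm = best
    return lambda x: lm if x <= t else rm

def boost(X, y, rounds=50, lr=0.1):
    """Gradient boosting with squared loss: fit a stump to the current
    residuals each round and add it with a shrinkage factor."""
    base = sum(y) / len(y)
    pred = [base] * len(y)
    stumps = []
    for _ in range(rounds):
        resid = [yi - pi for yi, pi in zip(y, pred)]
        s = fit_stump(X, resid)
        stumps.append(s)
        pred = [pi + lr * s(xi) for pi, xi in zip(pred, X)]
    return lambda x: base + lr * sum(s(x) for s in stumps)

model = boost(X, y)
mean_y = sum(y) / len(y)
sse_model = sum((yi - model(xi)) ** 2 for xi, yi in zip(X, y))
sse_mean = sum((yi - mean_y) ** 2 for yi in y)
```

Each stump can only split once, but the ensemble of shrunken stumps recovers the elevated-severity interval, which mirrors how GBRT learns location effects from the km variable.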

P.72  Optimal Re-allocation of Cargo Across Transportation Modes for the Recovery of Throughput During an Inland Waterway Disruption. Amodeo DC*, Francis R; The George Washington University   dcamodeo@gmail.com
Engineering and Infrastructure

Abstract: A number of studies have looked at the relationship between attributes of port delays along inland waterways and regional economic loss. These studies have typically integrated a loss model with either a simulation or a stochastic program to generate a set of shipper responses to disruption. To our knowledge, none of these studies has examined the role of operational terminals in facilitating modal shifts when the disruption stems from barriers to vessel movement such as low water levels, lock closures, or vessel collisions. In this paper we propose a model to optimize cargo re-allocation mid-voyage when carriers face uncertain transit times due to disruption. The degree of feasible modal shift can be viewed as a contributing factor in network resilience: the ability to re-allocate a mode's commodity mix ensures continued service when one or more modes are disrupted. A critical input to this study is the capacity and capability of river terminals. After presenting the proposed model, we illustrate its use on a 500-mile commercially navigable segment of the Tennessee River.

P.73  Measuring the Impact of Socio-economic Status on Post-Hurricane Power Restoration. Kerr SE*, Patwardhan A; University of Maryland College Park   siokerr@umd.edu
Engineering and Infrastructure

Abstract: As we increasingly consider resilience a central strategy for addressing climate change, recovery emerges as an important dimension that is often the focus of public policy. The progression of global climate change will increase the scale and magnitude of disasters, so it is more important than ever to understand how we can not only prevent impacts but also recover from them. Recovery processes can be very non-uniform; communities struck by similar levels of damage in the aftermath of a disaster can have very different outcomes. As a result, we must work to better understand the factors that make the recovery process more efficient and effective for some communities than for others. This project considers the impact of socio-economic status on post-hurricane recovery at the regional level, using power restoration as a metric for understanding short-term recovery of a specific infrastructure system on a broad spatial scale. The relationship is examined using a cross-sectional analysis that compares the duration and nature of recovery processes at the zip code level following Hurricane Matthew in the southeastern United States. The research uses outage data scraped from utility websites following the storm, and controls for the hazard and spatial characteristics expected to affect the restoration process. We hypothesize that there is a level of subjectivity inherent to the decision-making process that guides power restoration efforts, which could cause deviations in recovery outcomes along socio-economic lines. There is little acknowledgment in the literature of the possibility that power restoration could be influenced by such factors, so positive results will be of great relevance to policy makers and utilities alike. More broadly, by establishing power restoration as a valid proxy for short-term infrastructural recovery, the research lays the groundwork for future studies of this nature.

P.74  Sensitivity analysis on resident evacuation behavior in the Integrated Scenario-based Evacuation (ISE) framework. Yang K*, Davidson R, Nozick L, Brian B, Brian C, Wachtendorf T, Drasback K, DeYoung S, Kolar R, Yi W; University of Delaware   kunyang@udel.edu
Engineering and Infrastructure

Abstract: Hurricane evacuation is a complicated process involving uncertainty in the evolution of the hurricane and many interactions among natural, human, and infrastructure systems, all of which change over time. The newly developed Integrated Scenario-based Evacuation (ISE) computational framework integrates dynamic hazard modeling based on an ensemble of scenarios, dynamic population evacuation behavior prediction, and dynamic traffic assignment modeling to solve for a tree of evacuation recommendations that minimize risk and travel time. Each path through the tree corresponds to a set of recommended evacuation orders at each decision time that are conditional on how the hurricane has evolved until that time. The performance of the ISE framework has been demonstrated for a full-scale case study in North Carolina for Hurricane Isabel. In this presentation, we examine the sensitivity of the framework's evacuation recommendations and performance to the evacuation behavior of the population. Specifically, we conduct a series of analyses, including ones that assume nobody leaves (providing an upper bound on risk), no official evacuation orders are given (only shadow evacuation), orders are given and are more influential than currently assumed (due perhaps to an education program), and full compliance (i.e., everyone who is given an order leaves, but no one else). Results are examined in terms of false negatives (people not leaving when they should) and false positives (people leaving when it is not necessary). This sensitivity analysis is useful for helping to interpret the results, understand the role of population behavior relative to the hazard and other components of the process, and guide future model development.

P.75  Modeling homeowner retrofit behavior for wind and flood. Yahyazadeh Z*, Davidson R, Trainor J, Kruse J, Nozick L; University of Maryland   Zeinabj@udel.edu
Engineering and Infrastructure

Abstract: For existing homes constructed before current building codes were in place, retrofitting (i.e., strengthening) can be an effective way to reduce damage from hurricanes and other extreme events. It has not been widely implemented, however. Understanding the process by which homeowners make retrofit decisions is critical to developing programs to encourage it. This presentation provides an analysis that adds to the scant empirical literature on the subject. We combine revealed and stated preference survey data for homeowners in North Carolina to develop separate mixed logit models for homeowner decisions about retrofits aimed at addressing four types of hurricane damage: wind damage to the roof, openings (windows, doors), and roof-to-wall connection, and flood damage. In each case, we examine the effect of three possible types of economic incentives: a low-interest loan, an insurance premium reduction, and a grant. Combining the two data types allows us to combine the strengths of each: investigation of incentives that do not yet exist through stated preference data, and take-up rates thought to be more reliable based on revealed preference data. In particular, the study aims to develop the statistical models so they can be used to: (1) predict regional take-up rates for different retrofit types under different possible incentives; (2) better understand the effects of the different incentive types; and (3) identify characteristics of homeowners more likely to undertake retrofits. Results provide evidence that offering a grant increases the likelihood of retrofitting, but offer no such evidence for incentives in the form of low-interest loans or insurance premium reductions. The models also suggest the probability of retrofitting varies by type, with the most interest in strengthening openings, and that homeowners are more likely to retrofit when they are closer to the coast, younger, in newer homes, or within a year of a hurricane experience.

P.76  Resource adequacy risks to the bulk power system in North America. Murphy SJ*, Apt J, Sowell F; Carnegie Mellon University   apt@cmu.edu
Engineering and Infrastructure

Abstract: Approximately 5% of all electric generating units in the USA and Canada have unscheduled outages at any given time. To prevent a shortage of power when these unscheduled outages happen, the industry procures reserve generation that runs at idle until needed. But at times the outages can exceed 15 or even 20% of all capacity. Current practice treats all outages as independent. Using a database that covers ~85% of all generators in the USA and Canada, we show that correlated failures occur in nearly every region. We test whether the observed increases in outages may be due to random coincidence of independent failures in two ways: block subsampling and modeling outage events occurrences as binomial random variables, finding that almost all regions show correlated failures with a 99% confidence level. Since hurricane Sandy and the very cold month of January 2014 occurred during the four years we examined, we remove those events from the data set and still find with high confidence that electric generators exhibit significant correlated outages. We test whether these correlated outages occur at a particular time of the year, that might be different in different regions (winter in the northeast or summer in the south, for example). Even with just four years of data we see evidence of correlated failures in all seasons. Further, we see no recurrent seasonal patterns in the average unscheduled unavailable capacity, nor in its variance. In order to characterize the generator outage data in a way that can be used in reliability analysis, we present the results of four analyses. 
These are: Weibull and lognormal fits to each region’s series of unscheduled unavailable capacity; fits to each unit type’s distribution of normalized derating magnitudes; summaries of the mean time between failure and mean time to recovery by region and unit type, along with Weibull and gamma distribution fits to each; and time series of unavailable capacity from maintenance and scheduled events.
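The binomial check described in the abstract can be sketched roughly as follows; the unit count, outage probability, and threshold are illustrative assumptions, not the study's data:

```python
import math

def binom_sf(k, n, p):
    """P(X >= k) for X ~ Binomial(n, p)."""
    return sum(math.comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Illustrative assumptions (not the study's data): a region with 200 units,
# each independently unavailable 5% of the time.
n_units, p_out = 200, 0.05

# If failures were independent, how often would >= 15% of units be out at once?
threshold = math.ceil(0.15 * n_units)          # 30 units
p_exceed = binom_sf(threshold, n_units, p_out)

# Observing such hours far more often than this predicts is evidence of
# correlated failures.
print(f"P(>= {threshold} simultaneous outages) = {p_exceed:.1e}")
```

Under independence, outage fractions far above the mean are vanishingly rare, so repeated observation of 15-20% unavailability is itself strong evidence of correlation.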

P.77  Assessing the Risk of Wind Drought for Wind Farms. Schell K*, Guikema SD, Pinson P; University of Michigan   krschell@umich.edu
Engineering and Infrastructure

Abstract: There are over 10,000 individual wind turbines currently operating in the state of Texas. Tax credits have helped spur past and future investment in the Texas wind industry, which anticipates a further 11 GW of new capacity additions in the near future. While this increase in wind capacity, at times, provides a remarkable percentage of load with renewable generation, variable wind power output is often not synchronous with peak demand, which raises the issue of its contribution to overall resource adequacy. We propose a new method for assessing wind resource adequacy in the planning phase, utilizing cross-spectral analysis of wind speed and system load time series. The results indicate which geographic locations in an electricity system have wind resource potential that is most able to contribute to meeting system load over time. This metric gives wind farm planners information on where to site wind farms so as to reduce reliability risk and increase supply adequacy. This knowledge is particularly important as electricity systems move toward maximum levels of variable renewable power penetration.
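As a rough illustration of the idea (not the authors' implementation), a naive DFT cross-spectrum between a site's wind speed and the system load highlights shared energy at the frequencies where the two co-vary; all series and numbers below are hypothetical:

```python
import cmath
import math

def dft(x):
    """Naive discrete Fourier transform (O(n^2), fine for short series)."""
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
            for k in range(n)]

def cross_spectrum(wind, load):
    """Cross-spectral density estimate S_wl[k] = W[k] * conj(L[k]) / n."""
    n = len(wind)
    W, L = dft(wind), dft(load)
    return [W[k] * L[k].conjugate() / n for k in range(n)]

# Hypothetical hourly series for one candidate site over two days: load peaks
# mid-afternoon, and wind at this site happens to share the daily cycle.
load = [10 + 3 * math.cos(2 * math.pi * (t - 15) / 24) for t in range(48)]
wind = [6 + 2 * math.cos(2 * math.pi * (t - 15) / 24) for t in range(48)]

S = cross_spectrum(wind, load)
# Shared energy at the diurnal frequency (24-h period -> bin k = 48/24 = 2);
# sites with more cross-spectral energy at load-relevant frequencies are
# better placed to help meet load.
print(f"|S_wl| at the 24-h period: {abs(S[2]):.1f}")
```

In practice one would compare this quantity across candidate sites, at the frequencies where load carries most of its variance.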

P.78  Impact of Large-Magnitude Earthquakes on Structures in Deep Sedimentary Basins. Marafi N*, Berman J, Eberhard M; University of Washington   abostrom@uw.edu
Engineering and Infrastructure

Abstract: The Cascadia Subduction Zone (CSZ) is capable of producing long-duration, large-magnitude earthquakes that could severely affect buildings and infrastructure in the Pacific Northwest (PNW). In addition, the deep sedimentary basins that underlie several cities in the Puget Sound region are expected to amplify ground-motion intensity. The effects of long duration and basin amplification are poorly understood for the CSZ, because no ground-motion recordings are available for large-magnitude earthquakes in this region. To compensate for the paucity of recorded subduction events in the PNW, suites of simulated M9 CSZ ground motions are used to study their effects on archetypical structures in the PNW region. The severity of these ground motions is quantified in terms of several intensity measures that capture the ground-motion duration and frequency content known to affect structures. Finally, the performance of these archetypes is evaluated in terms of collapse risk, and appropriate design recommendations are proposed to compensate for these effects.

P.79  Public Health Implications of EPA’s UCMR3 Sampling of Contaminants in Drinking Water. Greene CW*, Suchomel AE; Minnesota Department of Health   christopher.greene@state.mn.us
Exposure Assessment

Abstract: The Minnesota Department of Health (MDH) has conducted an analysis of community water supply data gathered under the EPA’s Unregulated Contaminant Monitoring Rule 3 (UCMR3) program between January 2013 and September 2015. The UCMR3 analyte list included several chemicals with no previous data in finished drinking water in Minnesota, as well as chemicals with recently revised drinking water standards. MDH viewed the UCMR sampling as an opportunity to estimate potential impacts to public health from exposure to drinking water contaminants. Analytical results for each chemical were compared to the best available Minnesota or EPA health-based standard. Hazard indices (HI) were calculated for individual chemicals and summed for each community sampled. Chromium VI, manganese, and chlorate were the major risk drivers at locations throughout the state. In areas known to be impacted by perfluoroalkyl substances (PFAS), PFOS, PFOA, and PFHxS were important contributors to total HI, but these communities often had other individual contaminants, such as chromium or cobalt, with an HI greater than 1. MDH conducted a basic additivity analysis on the basis of common health endpoints, and found that in two community water systems, the HI summed by endpoint exceeded 1 even when no individual chemical’s HI exceeded 1. Although there are limitations inherent in the sampling process (i.e., the number of samples per facility was small), the UCMR3 data were useful for chemical prioritization (identifying risk drivers) and for geographic analysis (identifying the scope and extent of contamination). MDH also identified three contaminants (1,2,3-trichloropropane, 1,3-butadiene, and 17alpha-ethinylestradiol) whose analytical reporting limits were higher than MDH’s health-based criteria, indicating a need for improved analytical methods. 
As EPA prepares for the next round of UCMR analysis (UCMR4), MDH is working to develop health-based drinking water standards for the UCMR4 target analytes.
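The endpoint-additivity screen described above can be sketched as follows; the chemicals, concentrations, and health-based values are invented for illustration, not MDH's data:

```python
# Hypothetical results for one community water system: concentration and
# health-based value (both ug/L), plus the shared health endpoint.
samples = {
    "chromium_vi": {"conc": 0.06, "hbv": 0.10, "endpoint": "gi_tract"},
    "cobalt":      {"conc": 0.30, "hbv": 0.50, "endpoint": "gi_tract"},
    "manganese":   {"conc": 40.0, "hbv": 100.0, "endpoint": "nervous_system"},
}

# Hazard index per chemical: HI = concentration / health-based value.
hi = {chem: d["conc"] / d["hbv"] for chem, d in samples.items()}

# Additivity screen: sum HIs across chemicals sharing a health endpoint.
by_endpoint = {}
for chem, d in samples.items():
    by_endpoint[d["endpoint"]] = by_endpoint.get(d["endpoint"], 0.0) + hi[chem]

assert all(v < 1 for v in hi.values())     # no single chemical exceeds 1...
print(round(by_endpoint["gi_tract"], 2))   # ...yet the endpoint sum does: 1.2
```

This reproduces the pattern MDH reports: endpoint-summed hazard can exceed 1 even when every individual chemical's HI is below 1.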

P.80  Understanding Chemical Emission from In-Situ Water Pipe Repairs. Teimouri M*, Ra K, Conkling E, Boor B, Howarter JA, Whelton AJ; Purdue University   awhelton@purdue.edu
Exposure Assessment

Abstract: Water pipe repairs are increasingly being completed with polymer coatings and cured-in-place pipes (CIPP). These technologies enable pipe owners to avoid costly pipe replacement activities (e.g., road closures, building repairs) by installing a barrier between the corroded pipe and the water it conveys. CIPP pipes are new pipes chemically manufactured in-situ inside an existing damaged pipe. Today, 50% of all water pipes are repaired by CIPP technology. At present, little information is available about chemical emission into air and water by CIPP installation activities. Results of two ongoing CIPP chemical emission studies will be presented. In 2016, the US National Science Foundation funded a rapid response study to investigate chemical air emissions caused by the CIPP installation process. In 2016, six US state transportation agencies also funded a project to better understand and limit emissions into waterways when CIPP technology is used for storm water culvert repairs. Results showed that chemical air emissions can be high and transient, and can contain compounds other than styrene. Results are being further analyzed and prepared for release. Characterization of installed CIPP materials has also indicated that a variety of organic compounds can be released from the CIPP after the contractors leave the worksite. Results in this presentation will be described in the context of workplace safety and environmental toxicity thresholds.

P.81  Determining the health protective capability of analytical detection methods for short-duration exposures. Lipscomb JC*, Willison S, Parry E, Chattopadhyay S, Snyder E; US EPA National Homeland Security Research Center   lipscomb.john@epa.gov
Exposure Assessment

Abstract: Emergency response decisions require integrating exposure and risk information. Optimal guideline values for acute or short-term exposures are based on dose-response data for the relevant duration and address multiple levels of severity. U.S. EPA’s National Homeland Security Research Center (NHSRC) has tools to characterize exposures and risk during temporary reutilization of previously contaminated infrastructure: the Provisional Advisory Levels (PALs) and Selected Analytical Methods for Environmental Remediation and Recovery (SAM). Oral and inhalation PALs cover three tiers of severity (minimal, reversible; more severe, irreversible or escape-impairing; and lethal) for up to 24 hours, 30 days, 90 days and 2 years. PALs decrease with time and increase with effect severity. SAM recommends optimal analytical methods for a matrix-analyte pair and describes their performance. Using SAM, the sufficiency of analytical capability for acrylonitrile (ACN), a widely used industrial chemical, relative to the PAL values was evaluated. Oral PALs (as drinking water equivalents for children) ranged as low as 0.064 mg/L. SAM identified EPA Method 524.2 (GC/MS; run time ~30 minutes) as the optimal method for drinking water. This method provides a detection limit of 0.00022 mg/L and a limit of quantitation of 0.0009 mg/L, sufficient to detect ACN concentrations associated with even minimal, reversible effects. Inhalation PALs ranged as low as 0.030 mg/m3. From SAM, OSHA Method PV2004 for acrylamide (HPLC/UV) may be applicable, with a possible detection limit of 0.001 mg/m3. Confidence will be increased when this method can be verified for ACN and a limit of quantitation established. For some acute exposures, the logistical constraints of sample collection, transport and analysis serve to emphasize the value of enhanced field detection capabilities. This process will be applied to other priority chemicals.
This abstract may not represent the views and policies of the US EPA/ORD/NHSRC.
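The sufficiency check described in the abstract reduces to comparing a method's quantitation limit against the relevant guideline value; a minimal sketch using the numbers quoted above:

```python
def method_is_sufficient(guideline, loq):
    """A method can support decisions at a guideline value only if it can
    quantify at or below that value."""
    return loq <= guideline

# Values quoted in the abstract for acrylonitrile (ACN) in drinking water.
oral_pal = 0.064      # mg/L, lowest oral PAL as a drinking-water equivalent
loq_524_2 = 0.0009    # mg/L, EPA Method 524.2 limit of quantitation

print(method_is_sufficient(oral_pal, loq_524_2))   # True: LOQ ~70x below PAL
```

The same comparison flags the opposite case in the abstract: three UCMR-type contaminants whose reporting limits sit above their health-based criteria fail this test, signaling the need for better methods.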

P.82  Concentration of Cadmium in Spinach in U.S. Monitoring Data. Nyambok EO*, Hoffman-Pennesi D, Gavelek A, Briguglio S, Spungen J, Wirtz MS; Oak Ridge Institute for Science and Education   Edward.Nyambok@fda.hhs.gov
Exposure Assessment

Abstract: In 2015 the California Department of Public Health recalled spinach due to elevated levels of cadmium. Cadmium is a toxic element widely distributed in the environment and therefore may be found in food. Soil concentrations of cadmium vary by region, resulting in variation in cadmium levels in plants. Plants also differ in their rate of cadmium uptake and accumulation; some plants, such as spinach, tend to accumulate cadmium more readily. The U.S. produces roughly 200 million pounds of spinach and imports about 28 million pounds annually. Many countries monitor contaminants in food over time as part of market basket surveys referred to as Total Diet Study (TDS) programs. Our study reviewed cadmium data specific to spinach from TDS programs conducted in the United States (U.S.), Canada, and France. We reviewed spinach data from the U.S. (2003-2012), Canada (2005-2012), and France (2007-2009) to see how the U.S. compares to Canada and France. Among the TDS foods analyzed for cadmium, spinach is in the top five. The poster will also present trends in cadmium levels in spinach over time for each of the three countries in the study.

P.83  Exposures to Styrene from Food Packaging under CA Proposition 65. Mattuck R*, Dubé EM, Liu X, Greenberg GI; Gradient   rmattuck@gradientcorp.com
Exposure Assessment

Abstract: Polystyrene food containers may contain small amounts of residual styrene monomer that could potentially migrate into food. The State of California listed styrene as a human carcinogen under Proposition 65, and in 2017 established a no significant risk level (NSRL) of 27 ug/day. Although the US Food and Drug Administration (FDA) has previously determined that polystyrene is safe for use in food packaging, the NSRL is significantly lower than the acceptable daily intake (ADI) previously estimated for styrene. FDA is concerned with estimating cumulative exposure to a food contact substance (e.g., styrene) in the total diet, and their method employs a Consumption Factor (CF), which represents the fraction of the daily diet expected to contact a specific type of packaging material. However, under Proposition 65, a manufacturer must demonstrate that their specific product will not result in exposures above the NSRL, thus the use of the CF may not be appropriate. We used FDA's method only to calculate the amount of styrene that could potentially migrate into food, for several combinations of polystyrene containers and foods. We then obtained data on the amounts consumed and the frequency of consumption for these foods from the National Health and Nutrition Examination Survey (NHANES). Combining migration concentrations and consumption data, we derived estimated daily intakes (EDI) for styrene from specific foods, and compared the EDI to the NSRL. Overall, we concluded that consumption of food in contact with polystyrene food packaging is unlikely to result in exposures to styrene above the NSRL.
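The migration-times-consumption calculation can be illustrated with a minimal sketch; the migration level and consumption amount below are hypothetical placeholders, and only the 27 ug/day NSRL comes from the abstract:

```python
# Hypothetical inputs for one container/food pair (illustration only; these
# are not the study's measured values).
migration_ug_per_g = 0.02   # styrene migrating into the food, ug/g
daily_intake_g = 250.0      # NHANES-style mean daily consumption of the food

# Estimated daily intake (EDI) of styrene from this food alone.
edi_ug_per_day = migration_ug_per_g * daily_intake_g

nsrl_ug_per_day = 27.0      # California Proposition 65 NSRL for styrene
print(f"EDI = {edi_ug_per_day:.1f} ug/day; below NSRL: {edi_ug_per_day < nsrl_ug_per_day}")
```

A product-level Proposition 65 evaluation repeats this per food/container combination rather than applying FDA's diet-wide Consumption Factor.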

P.84  Senior Toxicologist, Human Health Risk Assessment. Lopez TK*; Tetra Tech   theresa.lopez@tetratech.com
Exposure Assessment

Abstract: In assessing human exposure to or establishing risk-based values for a surface water body, it can be necessary to account for exposures through fish ingestion as well as water ingestion. To do so, a bioaccumulation factor (BAF) that estimates tissue concentrations in fish from water is needed for each chemical evaluated. BAFs are available in the literature, but site-specific BAFs are not typically developed during site investigations. In practice, published BAF values for each species or trophic level are relied upon to estimate human exposures from fish ingestion. These BAFs assume a fixed linear relationship between a chemical’s concentration in water and the concentration in fish. However, this assumption does not hold for metals, since the rate of bioaccumulation decreases as the concentration in water increases; thus fixed BAFs increasingly overestimate exposure as concentrations in the water column rise. Establishing or selecting a BAF for metals for fish of any trophic level is complicated by other factors as well: unlike organic chemicals, metals are naturally present in ambient water and fish tissue; ionic form, temperature, and pH complicate the determination of metal uptake and availability; and some metals are essential nutrients for aquatic life. Rather than using a fixed BAF for each metal to estimate fish tissue concentration, it is proposed that BAFs be calculated from the slope of the line characterizing tissue concentration versus water concentration. Information establishing tissue concentration at various water concentrations, available from many sources and for many trophic levels, can be used to quantify the relationship between water and fish tissue concentration. Using the slope of that line allows for calculation of metal BAFs that should more accurately predict fish tissue concentrations relative to water conditions, providing a more accurate estimate of human exposure for a specific water body.
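The proposed slope-based BAF can be contrasted with a fixed ratio in a short sketch; the paired water and tissue concentrations are hypothetical, chosen so that uptake flattens at higher water concentrations:

```python
# Paired water/tissue concentrations for one metal and trophic level.
# Hypothetical numbers, chosen so uptake flattens at higher water levels.
water = [0.5, 1.0, 2.0, 4.0, 8.0]           # ug/L
tissue = [40.0, 70.0, 120.0, 190.0, 280.0]  # ug/kg

# Fixed-BAF practice: a single literature ratio (here from the lowest pair).
fixed_baf = tissue[0] / water[0]            # 80 L/kg

# Proposed alternative: the slope of tissue vs. water concentration
# (ordinary least-squares slope over the observed range).
n = len(water)
mean_w = sum(water) / n
mean_t = sum(tissue) / n
slope_baf = (sum((w - mean_w) * (t - mean_t) for w, t in zip(water, tissue))
             / sum((w - mean_w) ** 2 for w in water))

# A fixed ratio taken at a low concentration overstates accumulation at
# higher concentrations; the slope tracks the observed relationship.
print(f"fixed BAF = {fixed_baf:.0f} L/kg; slope-based BAF = {slope_baf:.0f} L/kg")
```

With these invented data the slope-based BAF comes out well below the fixed ratio, reproducing the overestimation the abstract describes.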

P.85  Inorganic Arsenic Exposures Associated with Consumption of Infant Rice Cereal. Chiger A*, Lynch MT, Houlihan J; Abt Associates and Healthy Babies Bright Futures   andrea_chiger@abtassoc.com
Exposure Assessment

Abstract: Inorganic arsenic is a developmental neurotoxicant that is commonly found in a variety of foods, particularly those containing rice. However, there is currently no standard for arsenic in infant rice cereals sold in the United States. We investigated inorganic arsenic exposures in U.S. children as a result of infant rice cereal consumption. We first compiled data sources on infant rice cereal consumption, inorganic arsenic concentrations in infant rice cereal, and estimates of dietary exposures in children. We then evaluated the available data and created arsenic exposure estimates for average and high consumers of infant rice cereal. To place these estimates into context, we compared them to the existing EPA reference dose for the dermal effects of arsenic, and to estimates of a reference dose for its neurodevelopmental effects, which were based on the primary literature. Further details of our dose-response assessment will be presented in a companion poster entitled “Estimates of IQ Loss in Infants due to Exposure to Arsenic in Infant Rice Cereals” by Meghan Lynch, Andrea Chiger and Jane Houlihan.

P.86  Risk assessment of exposure to Acrylamide from baby food in Taiwan. Lai TR*, Huang YC, Chuang YC, Wu KY, Chiang SY; China Medical University   routinelai@gmail.com
Exposure Assessment

Abstract: Acrylamide is commonly found in various high-carbohydrate foods, especially those processed at high temperatures. Many studies demonstrate that acrylamide possesses neurotoxicity, genotoxicity and reproductive toxicity. Acrylamide has been classified by the International Agency for Research on Cancer as probably carcinogenic to humans (Group 2A). Previous data have shown that ingestion risk to children is 2-3 times higher than for adults, and acrylamide is detected in several baby food items. However, there is a lack of research assessing the risk of acrylamide in baby food products in Taiwan. The aim of this study was to assess the risk of acrylamide in baby food products. A probabilistic risk assessment was conducted for the 0-3 age group of the general population in Taiwan with Monte Carlo simulation. The residues of acrylamide in 93 baby food items were cited from a report of the Taiwan Food and Drug Administration, and the food consumption data were taken from the National Food Consumption Database in Taiwan. Our results showed that the mean and 95th percentile margins of exposure for five specified items of baby foods ranged from 217,507 to 58,480,130 and from 388,346 to 115,232,718, respectively. The total hazard index at the mean and 95th percentile was 0.185 and 0.690, respectively, both less than 1. The lifetime cancer risks at the mean and 95th percentile of the Lifetime Average Daily Dose (LADD) of acrylamide for ages 0-3 were estimated at 5.93 x 10^-7 to 3.22 x 10^-5 and 2.77 x 10^-6 to 1.22 x 10^-4, respectively. In conclusion, baby foods at the 95th percentile LADD of acrylamide may pose a potential carcinogenic risk to the general population.
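A margin-of-exposure Monte Carlo of the kind described can be sketched as below; the residue and consumption distributions are invented stand-ins, and the 0.17 mg/kg-bw/day BMDL10 is a value often cited for acrylamide carcinogenicity rather than a figure from this abstract:

```python
import random
import statistics

random.seed(0)

# Hypothetical distributions (stand-ins, not the TFDA residue data or the
# National Food Consumption Database): lognormal residue and intake.
def simulate_exposure():
    residue = random.lognormvariate(mu=-3.5, sigma=0.8)  # mg/kg in the food
    intake = random.lognormvariate(mu=1.0, sigma=0.5)    # g food /kg-bw/day
    return residue * intake / 1000.0                     # mg/kg-bw/day

# An assumed point of departure, not taken from this abstract.
bmdl = 0.17  # mg/kg-bw/day

exposures = sorted(simulate_exposure() for _ in range(10_000))
moe_median = bmdl / statistics.median(exposures)
moe_p95 = bmdl / exposures[int(0.95 * len(exposures))]

# A larger margin of exposure means a larger safety margin; high-percentile
# consumers always have the smaller margin.
print(f"MOE at median exposure: {moe_median:,.0f}; at 95th pct: {moe_p95:,.0f}")
```

The same simulated exposure distribution can feed the hazard-index and lifetime-cancer-risk summaries the abstract reports.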

P.87  The risk assessment of furan residue in commercial baby formula in Taiwan for infants and children under the age of three. Huang YC*, Wu KY; National Taiwan University   Mirroronthewall_tw@hotmail.com
Exposure Assessment

Abstract: Breast feeding, long emphasized by pediatricians, is the most ideal feeding choice for infants. However, most women in Taiwan are career women with long working hours and few friendly environments for day care and breast feeding. Furan is known to be both hepatotoxic and carcinogenic in rats and mice and has been classified as a possible human carcinogen (Group 2B) by the International Agency for Research on Cancer (IARC). However, the risk of furan in dietary items has not been well established. In 2010, furan in a total of 12 baby formula items from Taiwanese markets was quantified by a static headspace sampling technique. These results enable further risk assessment of furan exposures specifically from baby formula for children under the age of three. A benchmark dose (BMDL) of 0.36217 mg/kg was derived from the incidence of liver tumors in B6C3F1 mice exposed to furan in the 2008 National Toxicology Program bioassay. Considering the age of the subjects in this study, a 10-fold uncertainty factor was applied for extrapolation from adults to children; hence the final reference dose was 0.036217 mg/kg. We obtained the average consumption of baby formula for infants and children under age 3 based on 2015 data provided by the National Food Consumption Database in Taiwan. Possible exposure to furan from baby formula for infants and children under age 3 was estimated using Markov chain Monte Carlo (MCMC) methods. The median exposure for children aged 3 and under is 0.00897 mg/kg. This study provides important information for risk assessment of furan residues in baby formula. It can help raise awareness among the general public and give people a quick overview of what we feed to infants and children under the age of three, who are vulnerable and have no ability to select their own food. Furthermore, we urge the government to strengthen regulations to promote friendly environments for breast feeding in public as well as in the workplace.

P.88  Probabilistic risk assessment of Fipronil in vegetables and fruits in Taiwan. Huang YC*, Chuang YC, Wu KY, Chiang SY; Department of Public Health, China Medical University   u105070003@cmu.edu.tw
Exposure Assessment

Abstract: Fipronil is a broad-spectrum phenylpyrazole insecticide that blocks the GABA-gated chloride channel of insects. Fipronil has been widely used for controlling insects on various crops, including vegetables and fruits. Fipronil causes neurological and developmental toxicity and was classified as Group C (possible human carcinogen) based on the increase of thyroid tumors in rats. In recent years, some studies have estimated the risk of fipronil residues in tea in Taiwan, but few studies evaluate the risk in vegetables and fruits. This study used Monte Carlo simulation to conduct a probabilistic risk assessment of fipronil in vegetables and fruits for adults (ages 19-65) in Taiwan. The fipronil residues in 47,585 samples of vegetables and fruits were cited from reports released by the Council of Agriculture in Taiwan. The vegetable and fruit consumption data were cited from the National Food Consumption Database of Taiwan. The distribution of fipronil residues, the Lifetime Average Daily Dose (LADD) and the Hazard Index (HI) were estimated by Monte Carlo simulation. The total LADD of fipronil for consumers aged 19-65 is 3.28E-05 mg/kg/day. Based on the reference dose of 0.0002 mg/kg bw/day for fipronil, the mean HI and the upper bound of the 95% confidence interval of the HI are calculated to be 0.164 and 0.447, both below one. This study shows that consumers may not be subject to adverse health effects from fipronil exposure via diet. However, farmers often use multiple insecticides at the same time; thus, the risk of fipronil should not be disregarded.

P.89  Risk assessment of imported canned foods in Taiwan: take Bisphenol A (BPA) as an example. Hsiao IL*, Wu KY; National Taiwan University   kuenyuhwu@ntu.edu.tw
Exposure Assessment

Abstract: In canned foods, chemical migration from packaging and microorganism growth are the two factors that potentially impact human health. Imported canned foods raise particular concern because of poor hygienic conditions and material quality in canned-food manufacturing in developing countries. In this study, we conducted a risk assessment of imported canned foods in Taiwan, taking bisphenol A (BPA), one of the most important toxic chemicals, as an example. BPA can migrate from epoxy resin-coated canned foods. Exposures to BPA have been associated with reproductive, developmental, and cardiovascular effects. The European Food Safety Authority (EFSA) re-evaluated the tolerable daily intake (TDI) of BPA from 50 down to 4 µg/kg bw/day in 2015 due to new evidence of effects on the immune system. By integrating the currently available BPA data in canned food from publications and using Taiwan FDA canned food import statistics, we evaluated the risk of canned foods from the ten major import countries. Using Monte Carlo simulation to capture the large variability of the data, the results showed that the upper limit of the 95% confidence interval of overall exposure for adults (19-64 years old) was 14 ng/kg-day, corresponding to a hazard index (HI) of 0.0035. The BPA concentration is the most sensitive factor in the exposure assessment (+37.3 to +77.7% variance between countries). This potential dose is very low in comparison to our previous study evaluating aggregate health risks of BPA in Taiwan with the CalTOX multimedia model (1.05 μg/kg-day).

P.90  Risk Assessment of Arsenic in Prescriptions of Traditional Chinese Medicine. Horng RL*, Chuang YC, Hsiao JL, Lin YT, Wu KY, Chiang SY; China Medical University    ny88021@gmail.com
Exposure Assessment

Abstract: The use of complementary and alternative medicine is increasing worldwide, and the safety of Traditional Chinese Medicines (TCMs) is of concern. Different levels of heavy metals have been detected in TCMs. Among them, arsenic is classified by the International Agency for Research on Cancer as a Group 1 carcinogen. Due to the high prevalence of chronic insomnia in Taiwan, TCMs are widely used to treat insomnia. The most common prescriptions of TCMs for the treatment of insomnia are Jia Wey Shiau Yau Saan (Formula #1), Suan Tzao Ren Tang (Formula #2), Chair Hwu Jia Long Guu Muu Lih Tang (Formula #3) and Uen Dann Tang (Formula #4). Therefore, the objective of this study was to conduct a probabilistic risk assessment of arsenic in these 4 prescriptions of TCMs. We assumed that people used TCMs for a quarter of the year and that the average decoction transfer rate for arsenic after boiling was 10%. Monte Carlo simulation was used to simulate the distributions of the Lifetime Average Daily Dose (LADD), Hazard Index (HI) and Cancer Risk (CR). The mean LADDs of Formula #1 to Formula #4 are 4.54E-05, 3.47E-06, 1.18E-04 and 6.21E-06 mg/kg-day, respectively. The mean HIs of Formula #1 to Formula #4 are 0.15, 0.01, 0.39 and 0.02, respectively. The HIs for the 4 prescriptions of TCMs were all less than 1, indicating the absence of a non-carcinogenic health hazard. The carcinogenic risks of the 4 prescriptions of TCMs were all higher than 1E-06, indicating a potential risk of cancer. The arsenic residue data for TCMs comprise both inorganic and organic arsenic; if the residues of inorganic and organic arsenic were separated, the uncertainty of the risk assessment would be reduced.

P.91  Dietary Contaminant Exposure Estimates Should Reflect Risks to Sensitive Sub-Populations: A Case Study of Lead and Hot Tea Consumption in Persons of Reproductive Age Participating in NHANES 2013-2014. Guerrette ZN*, Fleischer JG, Whittaker MH; ToxServices LLC   zguerrette@toxservices.com
Exposure Assessment

Abstract: Dietary exposure estimates are typically calculated as usual intake estimates, a product of food frequency questionnaire data and the amount consumed per consumption event. While these estimates are suitable to express exposure to dietary contaminants whose risk is a function of exposure duration, these estimates likely underestimate risk of adverse effects from chemicals whose toxicity is predicated on exposure during critical windows of susceptibility, especially developmental toxicants. Due to the nature and the potential severity of developmental impacts, which are dependent more on exposure timing rather than exposure duration, a more conservative approach to the assessment of chemical exposure is warranted if the chemical being evaluated is a developmental toxicant. This approach does not normalize consumption across time, but instead includes data only for days on which the dietary item is consumed. We have demonstrated differences in exposure estimates using these approaches applied to hot tea consumption data from the NHANES 2013-2014 dataset and compared exposure estimates to the 0.5 µg/day MADL for lead, which is based on this chemical’s reproductive and developmental toxicity. Tea consumption is a significant non-occupational route of lead exposure. Among men and women of child-bearing age (15-49 years old) participating in NHANES 2013-2014, the single day arithmetic mean hot tea consumption was 0.416-0.436 L/day for hot tea consumers. Across both sampling days, the arithmetic mean hot tea consumption considering only days in which tea was consumed was 0.402 L/day. As the average lead content in a 200 mL beverage is estimated as 10 µg/L, an average hot tea consumption of 0.402 L/day would produce a daily lead intake of 4.02 µg/day, a value eight times greater than the MADL for lead. To protect sensitive sub-populations, the exposure estimation method chosen needs to reflect the nature of the toxicity induced by the contaminant being evaluated.
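The intake arithmetic in the abstract is straightforward to reproduce; all three figures below are quoted from the text:

```python
# Figures quoted in the abstract.
tea_l_per_day = 0.402     # mean hot-tea consumption on consumption days, L/day
lead_ug_per_l = 10.0      # estimated lead content of the beverage, ug/L
madl_ug_per_day = 0.5     # Prop 65 MADL for lead, ug/day

daily_lead = tea_l_per_day * lead_ug_per_l
ratio = daily_lead / madl_ug_per_day

print(f"intake = {daily_lead:.2f} ug/day, {ratio:.1f}x the MADL")
```

The consumption-day average (0.402 L/day), rather than a usual-intake average diluted over non-consumption days, is what drives the eight-fold exceedance of the MADL.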

P.93  Human Health Exposure and Risk Assessment of Mercury in Camp 6, Benguet, Philippines. Diola MD*, Resurreccion AC, Fujimura M; University of the Philippines Diliman and National Institute for Minamata Disease   mddiola@up.edu.ph
Exposure Assessment

Abstract: Humans can be exposed to mercury in the environment through different pathways such as ingestion, inhalation and dermal contact. One of the primary sources of mercury (Hg) emission into the environment is artisanal gold mining. This study aims to estimate the health exposure, daily intake and risk of mercury for the residents of Camp 6, Tuba, Benguet, Philippines, a small community known for artisanal mining activities. Hair samples, considered among the best indicators of human mercury contamination, were collected from residents and analyzed for mercury content. Residents were also surveyed to gather socio-economic, demographic, and site-specific exposure data. Residential Hg intake dosage was quantified using Hg measurements from the hair samples of 111 residents. Hair Hg concentrations from the participants ranged from 0.03 to 24.17 ppm with a geometric mean (GM) of 0.224 ppm. Male residents were found to have significantly higher hair Hg concentrations (GM=0.263) than the females (GM=0.179), suggesting that many factors affect Hg levels in hair, such as hormones, occupational exposure, and the amount of fish consumed. It was also observed that miners have significantly higher Hg concentrations than non-miners, signifying additional mercury exposure to miners due to direct external exposure to mercury vapour, which may be released during small-scale mining activities. Using Monte Carlo simulation, daily Hg intake was found to follow a lognormal distribution with a mean, median and standard deviation of 25.66 ng/kg-bw/day, 22.36 ng/kg-bw/day, and 14.48, respectively. The average daily Hg intake dose is within the acceptable limit set by the US EPA (100 ng/kg-bw/day). This corresponds to a non-carcinogenic hazard index of 0.22, which indicates that the residents of Camp 6, at present, are not potentially at risk and are not likely to have adverse health effects due to Hg exposure.

P.94  State-of-the-Art Consensus on How to Evaluate Bioavailability in Contaminated Soil: Guidance from ITRC. Ries D*, Durant K, Sorrentino C, Interstate Technology and Regulatory Council; state government   riesd@michigan.gov
Exposure Assessment

Abstract: The Bioavailability in Contaminated Soil (BCS) guidance from the Interstate Technology and Regulatory Council (ITRC) focuses on lead, arsenic, and polycyclic aromatic hydrocarbons (PAHs). It is a consensus-based, easy-to-read, web-based document that represents the shared knowledge of representatives from state and federal regulatory agencies, the private sector, academia, and tribal and public stakeholders. It aims to provide detailed information on available bioavailability and bioaccessibility methods, including what the risk assessment practitioner should consider to make informed decisions for a specific site. The BCS guidance includes case studies that present how the bioavailability of lead, arsenic and PAHs has been evaluated at sites. It also discusses the challenges, how these challenges were overcome, and the lessons learned. In vivo bioavailability methods provide insights into site-specific bioavailability; however, the high cost and long duration of in vivo studies severely limit their applicability to a small number of large contamination sites. In the past few years, various groups have developed in vitro methods to measure bioaccessibility as a surrogate for bioavailability. These in vitro methods are available for arsenic (As) and lead (Pb), and their relatively low cost and short turnaround time allow for the inclusion of site-specific bioavailability considerations at many small or low-contamination sites.
This new guidance will provide regulators, stakeholders, and practitioners with the tools they need to make informed decisions.

P.95  Impact of industrial activities emissions on mortality rates in Chile: An ecological study. Gutierrez VV, Fortt A*; Universidad Diego Portales and GreenRiver   antonia.fortt@greeriver.cl
Exposure Assessment

Abstract: The Chilean population is exposed to significant pollution from different industrial activities, mainly mining, agriculture, energy industries, power plants, paper production, and transportation. The aim of this research is to determine whether higher cancer mortality rates are associated with the presence of these different industrial activities across the country. Following the method used by Ruiz-Rudolph et al. (2016), we conducted an ecological study that uses Chilean communes as small-area observation units to assess cancer mortality. Public data on mortality rates are available at the commune level; communes are the smallest units of local administration in Chile. For each commune, data on cancer mortality were aggregated for the 2000-2016 period. Public databases of pollution emissions are available for different industries. The impact of pollution from different economic activities on cancer mortality rates is estimated using the model proposed by Besag, York, and Mollié (BYM) (Besag et al., 1991), which has been used extensively in spatial epidemiology. Significantly higher rates of cancer mortality were observed in communes with large industrial emitters.
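For context, the BYM specification referenced in this abstract is commonly written as follows (a standard statement of the model with covariates and random effects, not necessarily the authors' exact parameterization):

```latex
\begin{aligned}
O_i &\sim \mathrm{Poisson}(E_i\,\theta_i),\\
\log \theta_i &= \alpha + \boldsymbol{\beta}^{\top}\mathbf{x}_i + u_i + v_i,\\
v_i &\sim \mathcal{N}(0,\sigma_v^2),\qquad
u_i \mid u_{j\neq i} \sim \mathcal{N}\!\Bigl(\tfrac{1}{n_i}\textstyle\sum_{j\in\partial_i} u_j,\ \tfrac{\sigma_u^2}{n_i}\Bigr),
\end{aligned}
```

where O_i is the observed cancer death count in commune i, E_i the expected count, x_i the vector of industrial-emission covariates, v_i an unstructured random effect, and u_i a spatially structured (intrinsic CAR) effect that averages over the n_i neighboring communes in the set ∂_i.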

P.96  Use of air dispersion modeling to estimate historical community exposure from manufacturers of asbestos-containing products. Bare JL*, Abramson MM, Barlow CA, Scott PK; Cardno ChemRisk, Arlington, VA; Cardno ChemRisk, Pittsburgh, PA; Cardno ChemRisk, Boulder, CO; Cardno ChemRisk, Pittsburgh, PA    jennifer.bare@cardno.com
Exposure Assessment

Abstract: Manufacturing of asbestos-containing products can result in the environmental release of asbestos into the surrounding community. Unlike exposures to asbestos during the manufacture or use of asbestos-containing products, community exposure has not been as well-characterized. However, asbestos emissions to a community cannot be generalized across all types of manufacturing facilities. In this analysis, a methodology using air dispersion modeling to estimate potential community exposure to asbestos from a hypothetical manufacturer of asbestos-containing friction products from 1965 to 1989 is presented. USEPA's preferred air dispersion model, AERMOD, was used to predict the annual airborne concentrations of asbestos up to 2000 meters from the facility. Continuous emission sources included 1) unloading of asbestos from a box truck, 2) uncontrolled fugitive building emissions prior to 1975, 3) controlled baghouse emissions after 1975, and 4) disposal of scrap asbestos. Prior to the introduction of a baghouse in 1975, asbestos emissions for the hypothetical facility were primarily attributed to the uncontrolled fugitive building emissions. Thereafter, there was a substantial decrease in overall asbestos emissions, with unloading emissions as the greatest contributor. As expected, predicted airborne concentrations of asbestos were greatest closest to the manufacturing facility and decreased by approximately four-fold after the installation of the baghouse in 1975. Uncertainties associated with the air dispersion modeling included the assumptions regarding historical asbestos usage and asbestos handling and processing at the hypothetical manufacturer. This methodology may be applied to similar exposure scenarios to reconstruct historical community exposure to asbestos with manufacturer-specific data and records.

P.97  Human exposure to nine flame retardants in indoor environments. Patterson J*, Chaisson C, Diskin K, Parker A, Babich M, Biggs MB; University of Cincinnati, formerly Toxicology Excellence for Risk Assessment; The LifeLine Group; U.S. Consumer Product Safety Commission†   patteji@ucmail.uc.edu
Exposure Assessment

Abstract: Flame retardants (FRs) are chemicals added to materials to improve their resistance to fire. FRs have been detected in indoor environments, particularly in air, household dust, and consumer products. Incidental ingestion of household dust is believed to be a major source of human exposure. Toxicity observed in animals includes reproductive and developmental toxicity, organ toxicity, and cancer. Human exposure was estimated for 9 FRs (TDCPP, TCPP, TCEP, TEP, TPP, TBB, TBPH, TBBPA, antimony trioxide) from indoor air and household dust in 4 environments (home, office, child care, and car) for which published data were available. Data from the most competent and relevant publications were used to represent concentrations for each FR and relevant media scenario. Dust and air particulate measurements were used as indicators of the overall contaminant levels of FRs indoors. Using probabilistic exposure assessment methods (LifeLine™ software), a range of aggregate exposures from oral, dermal, and inhalation routes was estimated for several age groupings. The highest exposures for all FRs except TEP (at the 50th percentile of the population’s exposure) were for children ages 1 to <3 years. For over half of the FRs, the home environment contributed the most to the combined daily exposure for infants and children ages 1 to <3. TCPP levels in the home environment generated the highest estimates of exposure for infants and children 1 to <3 years. For adults, exposure to TCEP from air and dust in the office environment was the highest. The many FR studies presented substantial information on exposure concentrations, but the reliability of the assessments’ conclusions is limited by the frequent lack of information on experimental design, collection methodologies, description of environments, and FR sources. 
†These comments are those of the CPSC staff and have not been reviewed or approved by, and may not necessarily reflect the views of, the Commission.
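As a rough illustration of the probabilistic aggregate-exposure approach described above (not the LifeLine™ implementation; every distribution and parameter value below is an invented placeholder, not a value from the study):

```python
import random

def aggregate_dose_ng_kg_day(rng, bw_kg=12.0):
    """One Monte Carlo draw of an aggregate daily flame-retardant dose
    (ng/kg-day) from dust ingestion plus inhalation, for a young child.
    All distributions below are illustrative placeholders."""
    c_dust = rng.lognormvariate(6.0, 1.0)   # FR in dust, ng/g
    m_dust = rng.uniform(0.03, 0.10)        # dust ingested, g/day
    c_air = rng.lognormvariate(0.5, 0.8)    # FR on air particulate, ng/m3
    q_air = rng.uniform(5.0, 9.0)           # inhalation rate, m3/day
    return (c_dust * m_dust + c_air * q_air) / bw_kg

rng = random.Random(42)
doses = sorted(aggregate_dose_ng_kg_day(rng) for _ in range(10_000))
p50, p95 = doses[len(doses) // 2], doses[int(len(doses) * 0.95)]
```

Repeating such draws per age group and environment, then summing across environments, yields population exposure percentiles of the kind reported in the abstract.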

P.98  VOC Exposures from Use of Aerosol Brake Cleaner. Williams PRD*, Fries M, Ovesen J, Maier A; E Risk Sciences, LLP and University of Cincinnati College of Medicine   pwilliams@erisksciences.com
Exposure Assessment

Abstract: Airborne exposures were measured during the use of a common aerosol brake cleaner under varying conditions. Short-term (15 min) and task-based (≥1 hr) personal, area, and background samples were collected using charcoal tubes. Real-time monitoring of total VOCs during application of the brake cleaner was also conducted. Sampling occurred during 8 simulations in which the product composition, potential for dilution ventilation, and use of an industrial floor fan varied. Each simulation involved the disassembly, cleaning and inspection, and reassembly of brakes (disc and drum) on four passenger cars and light duty trucks. For the personal short-term samples, the average detected concentration of total hydrocarbons (THC) and toluene across all simulations ranged from 19.0–217.5 mg/m3 (5.5–61.5 ppm) and 10.4–162.5 mg/m3 (2.8–44.0 ppm), respectively. For the personal task-based samples, the average detected concentration of THC and toluene across all simulations ranged from 3.6–66.0 mg/m3 (1.0–18.5 ppm) and 2.4–52.0 mg/m3 (0.6–14.0 ppm), respectively. Lower airborne concentrations were measured for the area samples. The highest measured concentrations of THC and toluene occurred during the greatest use of aerosol brake cleaner, when the bay doors were closed, and when the floor fan was turned off. Benzene was not detected in any of the samples. Peak concentrations of total VOCs ranged from approximately 20–764 ppm, with no measurements exceeding 300 ppm for 10 minutes. The ambient temperature during the study was hot (82–96°F) and humid (RH 48–73%). Wind speed measurements using a hot wire anemometer only exceeded 30 ft/min when the floor fan was on. The amount of aerosol brake cleaner used in each simulation (33.4–136.6 grams) was driven by the number of parts cleaned. 
The scenarios evaluated in this study encompassed both typical and worst-case conditions during the use of an aerosol brake cleaner, and should be applicable to other products with similar compositions and use patterns.
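The paired mg/m3 and ppm values reported above follow from the standard vapor conversion at 25 °C and 1 atm (molar volume 24.45 L/mol); a minimal helper:

```python
def mg_m3_to_ppm(mg_m3, mol_weight_g_mol, molar_vol_l_mol=24.45):
    """Convert a vapor concentration from mg/m3 to ppm at 25 C and 1 atm,
    where 24.45 L/mol is the molar volume of an ideal gas."""
    return mg_m3 * molar_vol_l_mol / mol_weight_g_mol

# Toluene (MW 92.14 g/mol): 10.4 mg/m3 converts to roughly 2.8 ppm,
# matching the low end of the short-term personal samples above.
toluene_ppm = mg_m3_to_ppm(10.4, 92.14)
```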

P.99  Study on risk assessment of aloe-emodin for Taiwanese population. YEN YT*, WU KY; National Taiwan University   r04841015@ntu.edu.tw
Exposure Assessment

Abstract: Aloin is added to dietary supplements sold in the Taiwanese market. In addition to its use in dietary supplements, aloin is also used as a laxative in medicine and as a bittering agent in alcoholic beverages. However, the impact of these dietary supplements on human health is a concern, since chronic aloin consumption is associated with adverse health effects such as bowel irritation and diarrhea. In the National Toxicology Program (NTP) two-year chronic toxicity study, aloe-emodin demonstrated carcinogenicity in animals; in rats, chronic exposure to aloe-emodin resulted in colon adenocarcinoma. We determined residues of aloin in dietary supplements using Taiwanese consumption data from the National Food Consumption Database and the proposed daily suggested intake, which is no more than 10 mg/day. Residue monitoring data were collected from a market survey by TFDA. We calculated the mean concentration (MC), lifetime average daily dose (LADD), and margin of exposure (MOE). Aloin residue was detected in large amounts in various dietary supplements sold in the Taiwan market. The initial results are as follows: the LADD is 4.30 × 10^-3 mg/kg/day, and the MOE is 150.78. This MOE is lower than the benchmark, indicating a potential public health concern; the guidelines for aloin in dietary supplements therefore require revision. Moreover, people may be exposed to aloin from multiple foods simultaneously. This study established a complete health risk assessment of aloin in dietary supplements for 2015. Through this study, people will realize the importance of proper use of aloin. Further studies are needed to integrate all possible aloin residues in foods. We provide important information for the risk assessment of aloin. 
Moreover, this study can equip risk-related managers with auxiliary scientific evidence to improve policy formation and social practice about dietary supplements in Taiwan.
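The LADD and MOE quantities used in this assessment can be sketched as follows (generic textbook formulas; the demo inputs are illustrative, not the study's actual data):

```python
def ladd_mg_kg_day(conc_mg_g, intake_g_day, exposure_dur_yr, bw_kg, lifetime_yr=70.0):
    """Lifetime average daily dose: (C x IR x ED) / (BW x LT), mg/kg-day."""
    return conc_mg_g * intake_g_day * exposure_dur_yr / (bw_kg * lifetime_yr)

def margin_of_exposure(pod_mg_kg_day, ladd):
    """MOE = point of departure / estimated dose; smaller values mean
    less margin and greater potential concern."""
    return pod_mg_kg_day / ladd

# Illustrative numbers only: exposure over a full lifetime
dose = ladd_mg_kg_day(conc_mg_g=0.5, intake_g_day=1.0,
                      exposure_dur_yr=70.0, bw_kg=60.0)
moe = margin_of_exposure(pod_mg_kg_day=1.0, ladd=dose)
```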

P.100  Comprehensive Multipathway Risk Assessment of Chemicals Associated with Recycled Rubber. Lemay JC*, Peterson MK, Pacheco Shubin SE, Prueitt RL; Gradient   jlemay@gradientcorp.com
Exposure Assessment

Abstract: Thousands of synthetic turf fields in the US are regularly used by millions of individuals (particularly children and adolescents). Although many risk assessments have concluded that there are low or negligible risks related to exposure to chemicals found in the recycled rubber used to make turf fields, concerns remain about the safety of this product. In response to these concerns, some federal and state agencies have initiated long-term studies of recycled rubber infill. In general, criticisms of existing studies focus on their limitations, which include low sample sizes and limited evaluation of relevant exposure pathways and scenarios. To address such concerns, we conducted a multipathway human health risk assessment (HHRA) of exposures to chemicals in recycled rubber. We compiled all the available chemical composition data on recycled rubber from the literature as well as all the available data related to chemical concentrations in the air above synthetic turf fields made with recycled rubber. We limited the relevant data to over 100 recycled rubber samples (including virgin samples) from more than 50 North American fields and nearly 100 air samples from indoor and outdoor artificial turf fields. We also assessed background data on exposure to chemicals in natural soil and air to provide context for the HHRA's results. We evaluated ingestion, dermal absorption, and inhalation pathways according to US EPA guidance, and considered multiple exposure scenarios (e.g., youths playing on indoor and outdoor fields, spectators). The results showed that, even using 95% upper confidence limit concentrations, the estimated non-cancer hazards and cancer risks for all evaluated scenarios were below levels that US EPA considers acceptable. In addition, we found that risk levels for athletes playing on synthetic turf fields were lower than those associated with playing on natural turf fields with background levels of heavy metals and polycyclic aromatic hydrocarbons.

P.101  Toxicological risk assessment of toluene and formaldehyde in a consumer product packaging material. Fleischer JG*, Whittaker MH; ToxServices LLC   jfleischer@toxservices.com
Exposure Assessment

Abstract: Manufacturers/distributors causing exposures to substances included on California’s Proposition 65 list of substances known to cause cancer or reproductive toxicity must provide specific warnings to consumers. Notices of Violation are issued for an increasing variety of consumer products that may cause exposure but do not provide the warning. ToxServices calculated exposure to formaldehyde and toluene emitted from a multi-layered consumer product packaging material to determine if exposure levels exceed established Proposition 65 safe harbor levels of 40 µg/day (formaldehyde gas) and 13,000 µg/day (inhaled toluene). Exposures below a carcinogen’s NSRL or a reproductive toxicant’s MADL do not pose significantly increased cancer or reproductive toxicity risks and therefore have safe harbor from Proposition 65. We conducted two-compartment, quantitative inhalation exposure assessments using California Code of Regulations methodologies. Analytical testing (headspace GC for toluene, UV-VIS spectrophotometry for formaldehyde) measured detectable levels of both substances. We calculated air concentrations of formaldehyde and toluene in the direct area of use of the packaging material and in the entire home for residents ranging in age from infancy to adulthood, including pregnant women. Additional variables included in age- and life stage-specific exposure calculations included air change rate, packaging material life-span, whole home and room air volumes, inhalation rate, and daily exposure duration. Formaldehyde exposure ranged from 0.0106 µg/day (0-2 year olds) to 1.43 µg/day (>18 year olds), with a lifetime average exposure of 1.93 µg/day. Toluene exposure ranged from 0.00432 µg/day (pregnant women) to 0.77 µg/day (>18 year olds). 
Since daily formaldehyde exposure is less than the 40 µg/day NSRL and daily inhaled toluene exposure is less than the 13,000 µg/day MADL, the distributor of the packaging material has safe harbor from Proposition 65 warning requirements.
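A single-compartment version of this kind of screening calculation might look like the following (a sketch under stated assumptions: a well-mixed room at steady state with invented emission and ventilation values; the study itself used California's two-compartment methodology):

```python
def room_air_conc_ug_m3(emission_ug_hr, room_vol_m3, air_changes_hr):
    """Steady-state well-mixed room concentration: C = E / (V x ACH)."""
    return emission_ug_hr / (room_vol_m3 * air_changes_hr)

def daily_inhaled_dose_ug(conc_ug_m3, inh_rate_m3_day, frac_day_exposed):
    """Daily inhaled mass for time-weighted occupancy of the room."""
    return conc_ug_m3 * inh_rate_m3_day * frac_day_exposed

NSRL_FORMALDEHYDE_UG_DAY = 40.0   # Prop 65 safe harbor, formaldehyde (gas)
MADL_TOLUENE_UG_DAY = 13_000.0    # Prop 65 safe harbor, toluene (inhalation)

# Illustrative inputs only
conc = room_air_conc_ug_m3(emission_ug_hr=0.5, room_vol_m3=30.0, air_changes_hr=0.5)
dose = daily_inhaled_dose_ug(conc, inh_rate_m3_day=16.0, frac_day_exposed=0.7)
has_safe_harbor = dose < NSRL_FORMALDEHYDE_UG_DAY
```

Running such calculations per age group and life stage, with age-specific inhalation rates and durations, gives the exposure ranges compared against the safe harbor levels above.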

P.102  Mapping the Emissions Exposure Risk due to Hydraulic Fracturing in Pennsylvania. Banan Z*, Gernand JM; Pennsylvania State University   zoya@psu.edu
Exposure Assessment

Abstract: In the past decade, the shale gas boom has led to increasing exploration and production of this strategic energy source, with considerable potential economic and environmental benefits. However, there are environmental health risks posed by emissions from shale gas development activities. Pollutant dispersion simulation results demonstrate that setback policy in Pennsylvania is not adequate to protect people from health effects due to exposure. We estimate the exposure level of a resident within the vicinity of the set of actual historical wells to evaluate the severity of this exposure experience. This study uses a Gaussian plume model to calculate the exposure values of fine particulate matter (PM2.5) over the Marcellus shale region of Pennsylvania, considering the density and timing of these activities. These results serve as input to create exposure maps that quantify the level of environmental health risk associated with these emissions. Results reveal the wellsite vicinities that did not comply with the EPA’s exposure standards, implying a failure of public health protection policy at these locations. We combine these exposure results with disease risk and population density to create a series of risk maps that identify expected case numbers as a function of geographic area and time. Based on these risk maps, we evaluate policy alternatives for regulating the density of shale gas development activity in time and space to keep individual communities below the recommended exposure levels for pollutant emissions in the future.
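A minimal ground-level Gaussian plume calculation of the kind used here can be sketched as follows (the Briggs open-country, stability-class-D dispersion coefficients are chosen purely for illustration, not taken from the study):

```python
import math

def plume_conc_g_m3(q_g_s, u_m_s, x_m, y_m, h_m):
    """Ground-level concentration from a continuous point source at downwind
    distance x (m), crosswind offset y (m), effective release height h (m),
    with full ground reflection. Sigma formulas: Briggs open-country, class D."""
    sig_y = 0.08 * x_m / math.sqrt(1.0 + 0.0001 * x_m)
    sig_z = 0.06 * x_m / math.sqrt(1.0 + 0.0015 * x_m)
    return (q_g_s / (math.pi * u_m_s * sig_y * sig_z)
            * math.exp(-y_m**2 / (2.0 * sig_y**2))
            * math.exp(-h_m**2 / (2.0 * sig_z**2)))

centerline = plume_conc_g_m3(1.0, 5.0, 1000.0, 0.0, 10.0)
offset = plume_conc_g_m3(1.0, 5.0, 1000.0, 200.0, 10.0)
```

Evaluating such a function over a grid of receptors and hourly meteorology, then averaging over well activity periods, produces the kind of exposure surface mapped in the study.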

P.103  Application of an Excel-based Toxicokinetic (TK) Model for Deriving Health-based Water Guidance for PFOS and PFOA. Goeden HM*, Greene CW, Jacobus JA; Minnesota Department of Health   helen.goeden@state.mn.us
Exposure Assessment

Abstract: Perfluorinated compounds have been widely released and human exposure is ongoing and ubiquitous. The Minnesota Department of Health (MDH) released revised noncancer health-based water guidance (nHBG) for PFOS and PFOA in May 2017. Traditionally, noncancer health-based water guidance values are calculated using a reference dose, a relative source contribution factor, and intake rate. During MDH’s review it became clear that traditional nHBG derivation methods would be inadequate due to maternal transfer at birth and potential accumulation of PFOS and PFOA in breastmilk at higher levels compared to drinking water. The revised values are based on a novel Excel-based TK model tailored to the physical-chemical properties, exposure parameters, and human transfer coefficients especially critical for evaluating early life exposures. Although exposures during infancy are short-term, this life stage is of particular concern because: (1) infants consume a much greater volume of liquid per unit body weight than older children and adults; and (2) due to the long elimination half-lives, the body burden instilled in infancy may take years to eliminate. To address these concerns, MDH developed an Excel-based, one-compartment TK model to predict serum levels of PFOS and PFOA from birth through attainment of steady-state conditions. Two exposure scenarios were evaluated: 1) an infant exclusively fed with formula reconstituted with contaminated water starting at birth, followed by a lifetime of drinking contaminated water; and 2) an infant exclusively breastfed for 12 months, followed by a lifetime of drinking contaminated water. In both scenarios, infants began life with a pre-existing body burden through placental transfer from a mother at steady-state conditions. Based on modeling results, breastfed infants were the most heavily exposed, and predicted serum levels early in life exceeded steady state levels.

P.104  Dynamical Systems Modeling of the Human Hypothalamic-Pituitary-Thyroid Axis: Developing Quantitative Adverse Outcome Pathways for Thyroid Endocrine Disruptors. Fueta PO*, Zhang Q; Emory University   drpatrickfueta@gmail.com
Exposure Assessment

Abstract: While cell-based assays are making headway to replace animal-based approaches for chemical toxicity testing, a major challenge lies in translating in vitro results into in vivo health risk. Predicting the organism-level dose response of endocrine disrupting chemicals (EDCs) is particularly challenging, where the perturbations observed in vitro may be buffered in vivo by homeostatic regulation common to endocrine systems. Using the human hypothalamic-pituitary-thyroid (HPT) axis we present a dynamical systems modeling approach to mechanistically link toxicological and epidemiological data across multiple physiological scales, which will help extrapolate and predict the risks and mechanisms of EDCs. Specifically, an ordinary differential equation (ODE)-based model of the HPT axis was constructed to capture the feedback regulation between T3, T4, and TSH, their synthesis, metabolism, and plasma buffering. The model represents an average euthyroid condition and can simulate primary or secondary hyper- or hypothyroidism induced by EDCs. A select parameter space was then sampled to optimize the model against the NHANES thyroid profile data (2007-2012). The resulting correlated parameter distributions established a virtual reference thyroid population. Pearson correlation and multiple linear regression of the thyroid data and optimized model parameters vs. urinary EDCs including sodium iodide symporter (NIS) inhibitors, environmental phenols, and perfluorinated chemicals were then performed. These analyses confirmed the thyroid-disrupting mechanisms of well-known EDCs such as perchlorate and predicted novel thyroid-disrupting mechanisms for chemicals such as thiocyanate and BPA. Hierarchical clustering demonstrated the optimized model parameters can be used as additional features to refine chemical grouping. 
Lastly, using in vitro NIS inhibition data, we demonstrated how the dynamical model can be applied to predict in vivo dose response of thyroid hormone disruptions.
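To illustrate the homeostatic buffering this abstract refers to, here is a toy negative-feedback ODE, Euler-integrated (a generic two-variable caricature with invented rate constants, not the authors' HPT model): halving synthesis capacity lowers T4 far less than proportionally, because TSH rises to compensate.

```python
def simulate_feedback(synthesis_scale=1.0, days=200.0, dt=0.01):
    """Toy HPT-like loop: TSH drives T4 synthesis; T4 suppresses TSH release.
    synthesis_scale < 1 mimics an inhibitor (e.g., reduced NIS activity).
    All rate constants are invented for illustration."""
    t4, tsh = 50.0, 1.0
    for _ in range(int(days / dt)):
        dt4 = 5.0 * synthesis_scale * tsh - 0.1 * t4        # synthesis - clearance
        dtsh = 2.0 / (1.0 + (t4 / 100.0) ** 4) - 1.0 * tsh  # feedback-inhibited release
        t4 += dt4 * dt
        tsh += dtsh * dt
    return t4, tsh

t4_base, tsh_base = simulate_feedback(1.0)
t4_inhib, tsh_inhib = simulate_feedback(0.5)
```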

P.106  Development of an Indoor Consumer Exposure Assessment Tool (ICET) and case studies by using ICET. Kajihara H*, Higashino H; National Institute of Advanced Industrial Science and Technology (AIST)   kajihara.hideo@aist.go.jp
Exposure Assessment

Abstract: We developed an indoor consumer exposure assessment tool, named “ICET”, software that estimates inhalation, dermal, and oral exposure to consumer products in the indoor environment. The tool estimates concentrations of chemicals in indoor air and in house dust, and the amount of substances migrating to the surface of the skin, from both articles and mixtures. A database reflecting the Japanese lifestyle is included in the tool, covering types of houses, body weight, respiratory volume, and some default data sets concerning products. The tool estimates not only personal exposure in an individual house but also nationwide exposure distributions across houses and residents in Japan using Monte Carlo simulation. Case studies were conducted on inhalation exposure to paradichlorobenzene contained in a bug repellent placed in a closet, and on dermal exposure to plasticizer contained in PVC film. In the bug repellent case, the indoor concentration estimated by ICET roughly agreed with the measured concentration. On the other hand, the dermal exposure to plasticizer estimated by ICET was about 30 times greater than the measured value, though it was still closer to the measured amount than the estimate from ECETOC-TRA. The main cause of this estimation error was found to be the transfer rate from product to skin surface, which we examined further.

P.107  Association Between Melanoma and Glioma Risk: A Nationwide Study in Taiwan. Chu YR*, Yin MC, Chang PH, Luo RY, Ho WC, Il'yasova D; China Medical University   rayray0508@gmail.com
Exposure Assessment

Abstract: Glioma incidence has increased in recent years, yet the etiology of brain tumors remains largely unknown. A previous study showed that melanoma patients have a greater incidence of glioma than the general population in the United States. Because glioma and melanoma do not have common environmental risk factors, this observation suggests a shared genetic predisposition that could guide future research into specific genes and drug targets for both malignancies. However, this observation has to be confirmed in other populations. The aim of this study was to investigate the association between melanoma and glioma in the Taiwanese population. We used claims data from Taiwan’s National Health Insurance Research Database (NHIRD) from 1998 to 2010. The study population included 1,000,000 randomly selected men and women ages 20 and older from the NHIRD database. Glioma was defined by ICD-9-CM codes 191, 192.0-192.3, 192.8, 192.9, 225 and 237.5. Melanoma was defined by ICD-9-CM codes 172, 173, 190.0, 190.9, 216.X (X=0, 3-7, 9), 224, 223.2, 235.1, 235.2, 237.6, 238.2, 238.3, 238.8. We excluded participants under age 20 in 1998 and those of unknown gender (n=324,879). Cox proportional hazards regression was conducted to estimate the association between history of melanoma and glioma risk. The hazard ratio of developing glioma was significantly higher in patients with melanoma than in those without (hazard ratio (HR) = 6.18; 95% confidence interval (CI) = 5.57-6.85) and was lower in male patients than in female patients (HR = 0.77; 95% CI = 0.71-0.84), adjusted for covariables. The hazard ratio increased with age, peaking in the 60-69 age group and decreasing at 70 and older. The present study showed that Taiwanese patients with melanoma are at a higher risk of developing glioma. The exact underlying etiologies require further investigation.

P.108  Air Quality Concerns Following Ocean Oil Spills. Rosenstein AB*, French-McCay D, Rowe J; Author 1: Lexington Environmental Risk Group, LLC, Lexington, MA. Authors 2 and 3: RPS/ASA, South Kingstown, RI   envriskexpabr@aol.com
Exposure Assessment

Abstract: Decision-making during oil spill emergency responses should take into account all exposure pathways for workers and communities, in particular, air exposures. Workers can be exposed to contaminated air following a spill while working on oil rigs, boats, or during beach cleanups. Communities have the potential for air exposures, especially if the spill occurs close to shore. However, air quality data following oil spills are often lacking. In addition, although there are some published guidelines for "air events" such as wildfires, there are no published guidelines specifically for practitioners who need to make quick decisions on worker or community health related to air exposures following an ocean oil spill. Oil spills emit volatile organic compounds (VOCs), including highly‐volatile, intermediate‐volatile and semi‐volatile organic compounds, which result in increased oxidant (e.g. organic hydrogen radicals and ozone) and secondary organic aerosol production from oxidation products. Since air exposures will vary considerably by spill location and oil type, oil spills may be categorized to predict the levels and composition of atmospheric exposures. We will present our proposed “decision tool,” based on (1) modeling of air emissions following oil spills, and (2) available air quality guidelines, including a discussion of petroleum-related compounds without published guidelines. Variables that may be components of the air exposure decision tool include: location of spill (e.g. shallow or deep water, distance from shore); size of the spill (e.g. large, medium, small); and type of oil spilled (e.g. crude oil, condensate, diesel, heavy fuel oil). Risk analysts have an important role in the emergency response decision-making context; this proposed decision tool, based on air emissions modeling and a thorough review of air quality guidelines, will be a proactive, health-protective step that practitioners could use in the immediate aftermath of an oil spill.

P.109  Indirect Health Risk Reduction Through Transgenic Bt Corn. Yu J*, Hennessy D, Wu F; Michigan State University   yujina@msu.edu
Foundational Issues in Risk Analysis

Abstract: Risk analysis has important methodologies that can help in evaluating controversial questions of policy importance, such as the use and potential impacts of transgenic (genetically modified) crops in the United States and elsewhere. Traditional plant pathological and agronomic studies had previously reached no consensus as to whether transgenic Bt corn, the GMO planted most commonly worldwide, has resulted in significantly lower aflatoxin levels than non-GM corn. If this benefit did exist, then Bt corn could be an important strategy in reducing dietary aflatoxin exposure worldwide, with corresponding human health and economic benefits. In this study, we focus on the United States, using data from the USDA Risk Management Agency from 2001-2011 to evaluate the incidence of aflatoxin-related crop insurance claims among corn growers as a function of Bt corn planting across all US corn-planting counties. Our analysis factored in other risk factors for aflatoxin occurrence in corn, including high summer temperatures and drought (data from NOAA). We found that a significant negative correlation exists between Bt corn planting and aflatoxin-related insurance claims in the US, even when controlling for temperature and drought/humidity. In the US, this benefit primarily results in better economic returns for corn growers. Elsewhere worldwide, there could be important human health implications from the results of this risk assessment.

P.110  Risk denial: societal, organizational and cultural perspectives. Merad MMe*; CNRS   myriam.merad@unice.fr
Foundational Issues in Risk Analysis

Abstract: Asbestos, pesticides, endocrine disruptors: the long list of scandals has appeared on the front pages of newspapers. These emblematic cases are symptomatic of systems’ and organizations’ refusal to recognize the magnitude of their potential consequences for health and the environment. This blindness, or even risk denial, whose ramifications are cognitively, organizationally, and culturally framed, is often analyzed from the point of view of a conscious and voluntary strategy by stakeholders or shareholders to deny the evidence (willful blindness). It is also frequently considered, from a societal point of view, as symptomatic of the rise of cynicism within certain areas of activity, or even more widely within society. Cinema has also echoed it, with iconic films such as J.C. Chandor's Margin Call (2011), George Clooney's The Ides of March (2011), and more recently Emmanuelle Bercot's La fille de Brest (2016) and John Madden's Miss Sloane (2016). While this may cover part of the explanation of risk denial mechanisms, several other explanatory mechanisms must also be recognized. The “manufacture of consent” (following N. Chomsky), the “weight of evidence” and “turning points” model (following Baxter and Bullis), risk perception biases, the balance between risk-taking and innovation, self-censorship, etc., are all aspects to be considered in analyzing the deep roots of individual and/or collective framing of risk denial. On behalf of the scientific program committee of the meeting, this paper will share the conclusions of the IMdR (French Risk Management Institute) Paris 2017 meeting on “Risk denial”. Based on a multi-sectorial and multi-disciplinary approach, these conclusions will help clarify individual and collective risk denial mechanisms in extreme situations and in innovation contexts.

P.111  Application of Principles of Failure Modes, Effects, and Criticality Analysis to Fluid Milk Food Safety Plan. Kottapalli B*, Butler S, Peers M, Holzhueter D; ConAgra Brands   bala.kottapalli@gmail.com
Microbial Risk Analysis

Abstract: Failure Modes, Effects and Criticality Analysis (FMECA) is a semi-quantitative risk assessment methodology designed to identify and address potential failure modes during manufacturing of a process or product. The FMECA risk assessment tool can be used to verify the effectiveness of implementation of preventive controls in a manufacturing facility’s food safety plan. The purpose of this study was to apply FMECA principles to analyze the effectiveness of critical control points (CCPs) and pre-requisite programs in mitigating food safety concerns during fluid milk production. Preliminary hazard analysis (using Ishikawa diagrams) was used to predict potential food safety issues occurring at different process steps throughout the manufacturing of fluid milk. The potential failure modes were ranked based on severity (S), likelihood of occurrence (O), and likelihood of detection (D), each on a scale of 1-10. A team comprising subject matter experts and operations and quality assurance personnel ranked the failure modes. The risk priority number (RPN = S × O × D) was calculated, and a score of 130 or above was deemed high risk, requiring corrective actions. Pareto diagrams were created using MINITAB statistical software to identify high risk processing steps that require corrections and/or corrective actions. The results of FMECA indicated that the preventive controls implemented in the fluid milk manufacturing process significantly minimize or prevent (RPNs < 130) food safety hazards from a public health standpoint. The FMECA tool used in this study can provide a scientific basis for verifying the effectiveness of the preventive controls executed by Conagra Brands’ fluid milk manufacturing facility and compliance with FSMA requirements.
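The study's scoring rule is straightforward to express (the 1-10 scales and the 130 threshold come from the abstract; the example scores are invented):

```python
def rpn(severity, occurrence, detection):
    """Risk priority number: S x O x D, each scored on a 1-10 scale."""
    for score in (severity, occurrence, detection):
        if not 1 <= score <= 10:
            raise ValueError("each score must be between 1 and 10")
    return severity * occurrence * detection

HIGH_RISK_RPN = 130  # at or above this threshold, corrective action is required

def needs_corrective_action(severity, occurrence, detection):
    return rpn(severity, occurrence, detection) >= HIGH_RISK_RPN
```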

P.112  Applicability of whole genome sequencing data for Salmonella risk assessment in poultry meat. Karanth S*, Mishra A, Pradhan AK; University of Maryland College Park   skm@mail.umd.edu
Microbial Risk Analysis

Abstract: Salmonella is a major foodborne pathogen, with an annual estimate of 1.2 million illnesses, 450 deaths, and $365 million in direct medical costs in the USA according to the Centers for Disease Control and Prevention (CDC). It is also one of the most diverse pathogenic bacteria, comprising more than 50 serogroups and over 2,500 named serovars, each with a variable pathogenicity profile. Whole genome sequencing (WGS), a rapid, cost-effective method to reveal the complete genetic make-up of an organism, can be used to identify specific pathogenic serotypes of microorganisms. This in turn could help better predict microbial behavior, which could be used to refine existing microbial risk assessment (MRA) models for Salmonella. The aim of this study was to determine the applicability of WGS data in developing serovar-specific risk assessment models for Salmonella. Data on the prevalence of different serovars of Salmonella in chicken (obtained by WGS) between 2000 and 2015 were obtained from the GenomeTrakr network, and salmonellosis outbreak survey data were obtained from the CDC Foodborne Outbreak Online Database (FOOD Tool). An autoregressive moving average time series model was developed to determine the prevalence of Salmonella spp. and some of its most common pathogenic serotypes over this period for incorporation into an MRA model. Time series analysis of the prevalence and outbreak data indicated an increased prevalence of Salmonella spp. during spring (March–May). However, decomposition and linear regression of the prevalence data indicated different seasons of increased prevalence for some major pathogenic Salmonella strains, such as Salmonella enterica serovar Enteritidis (November–January), Heidelberg (February–April and October–December), and Typhimurium (June–August), at the 95% confidence level. The results of this study can provide useful insights into incorporating genomic data in future microbial risk assessment studies.
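Not the study's ARMA model, but a minimal sketch of the seasonal aggregation that such an analysis rests on; the month-to-season mapping follows the abstract's spring definition (March–May), and the isolate counts are invented.

```python
from collections import defaultdict

# Map month numbers to seasons (spring = March-May, as in the abstract).
SEASONS = {12: "winter", 1: "winter", 2: "winter",
           3: "spring", 4: "spring", 5: "spring",
           6: "summer", 7: "summer", 8: "summer",
           9: "fall", 10: "fall", 11: "fall"}

def seasonal_totals(monthly_counts):
    """Sum isolate counts by season; monthly_counts is (month, count) pairs."""
    totals = defaultdict(int)
    for month, count in monthly_counts:
        totals[SEASONS[month]] += count
    return dict(totals)

# Invented counts in which spring months dominate, echoing the study's finding.
print(seasonal_totals([(3, 40), (4, 55), (5, 50), (7, 30)]))
```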

P.113  Assessing food safety risk of Toxoplasma gondii in muscle tissues of naturally infected meat animals in the United States. Rani S*, Dubey JP, Pradhan AK; University of Maryland and USDA Animal Research Services   sur1404rani@gmail.com
Microbial Risk Analysis

Abstract: Toxoplasma gondii is a protozoan parasite that infects virtually all warm-blooded animals, including humans, livestock and marine mammals. Various surveys have found that 10–50% of the adult population has been exposed to this parasite. T. gondii infection causes mental retardation, loss of vision and other congenital health problems in human infants. Toxoplasmosis is an important cause of morbidity and mortality in immunosuppressed individuals and can cause serious health problems in healthy adults. Annually, approximately 1 million people in the USA are infected with T. gondii, and over 4,800 people develop symptomatic ocular disease. There are no quantitative data on the concentration of viable T. gondii in naturally infected meat animals. The goal of this study was to quantify T. gondii concentrations in muscle tissues of naturally infected lambs and goats. The muscle tissues of 4 lamb and 4 goat legs, along with their hearts, were serologically tested for T. gondii antibodies and then bioassayed in mice in different amounts (5 g, 10 g and 50 g) to observe infection rates. Results suggested that T. gondii DNA was detected more consistently by PCR, but its detection was not correlated with the presence of viable T. gondii in muscle tissues. This study indicated that the parasite can be present in naturally infected meat animals and could pose a threat of foodborne illness to consumers.

P.116  Horizontal Gene Transfer Under Dynamic System Conditions for Understanding Dose-Response Relationships for Antibiotic Resistance Risks. Chabrelie AE*, Zhang L, Bornhorst G, Mitchell J; Michigan State University and University of California, Davis   chabreli@msu.edu
Microbial Risk Analysis

Abstract: Antimicrobial resistance (AMR) is a looming global concern, and the associated genes are now considered emerging hazards because of their ability to proliferate antibiotic resistance. Understanding how horizontal gene transfer (HGT) occurs under system conditions is a powerful step in determining how AMR is spread in the environment and in humans. Traditional pathogen dose-response models do not consider the potential body burden of AMR strains, which is the result of replication and gene transfer. There is a lack of information about HGT within dynamic systems like the animal and human gut, as most studies are done under static conditions. In this study, in vitro experiments were conducted to quantify how different parameters impact E. coli AMR transmission through HGT. E. coli was selected as it represents one of the most common commensal bacteria in the large intestine of humans and animals, as well as in the environment. The parameters investigated were time, nutrient concentration, donor-to-recipient ratio, and viscosity. These factors correlate with the digestion process, diet, the amounts of resistant and susceptible bacteria ingested, and the conditions within the gut. Preliminary results showed a clear correlation between HGT and these parameters, allowing researchers to refine their model for better estimating the development of AMR in animals and humans.

P.117  PSCMT: A supply chain model of microbial contamination risk in fresh tomatoes. Zoellner C*, Jackson P, Al-Mamun MA, Grohn YT, Worobo R; Cornell University   cez23@cornell.edu
Microbial Risk Analysis

Abstract: According to the Centers for Disease Control and Prevention, 46% of foodborne illness each year in the United States of America is attributed to contaminated fresh produce. The source of contamination with human pathogens is often in the production and handling environments of fresh produce. However, the postharvest supply chain also presents conditions that either encourage or discourage microbial growth and survival. The objective of this study was to develop a generic modeling tool of microbial contamination dynamics in order to identify pathogen control points in fresh produce supply chains. The postharvest supply chain is explicitly modeled, including transportation of the harvested produce to a packaging location and subsequent activities to sort, wash, cut, pack, store, distribute and sell the product in the supermarket. The model starts by introducing a microbial load at one node, and a modular process risk modeling approach determines the resulting levels throughout the supply chain due to growth, survival, transfer and reduction. Each node utilizes input parameter values from published studies of microbial behavior on fresh produce to govern these modular microbial processes. Upon simulation, the microbial contamination level for each node is displayed on different scales and at discrete time points. The tool is demonstrated with Salmonella Montevideo in a fresh tomato supply chain consisting of four nodes—field, packinghouse, distribution center and supermarket—to model the impact of supply chain decisions. This detailed, mechanistic model of microbial behavior dynamics is developed to address the unique risks of fresh produce supply chains and is intended for identification of practices that may reduce contamination risks and foodborne illness.
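The modular node-by-node propagation can be sketched as a running log10 load; the node names follow the four-node tomato demonstration above, but the per-node log10 changes are invented placeholders, not the model's parameter values.

```python
# Sketch of modular process risk modeling: a log10 CFU load enters the chain
# and each node applies a net log10 change (growth/survival positive,
# washing/reduction negative). Node deltas here are illustrative only.
NODES = [("field", 0.0), ("packinghouse", -1.5),
         ("distribution center", 0.5), ("supermarket", 0.3)]

def propagate(initial_log_cfu, nodes=NODES):
    """Return the contamination level (log10 CFU) after each node."""
    load, trace = initial_log_cfu, []
    for name, delta in nodes:
        load += delta
        trace.append((name, round(load, 2)))
    return trace

print(propagate(3.0))
```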

P.118  Health risk assessment of cadmium in rice. Kuen Yu Hwu JB, Lai Szu Chi*; Taiwan University   r04841017@gmail.com
Occupational Health and Safety

Abstract: The aim of this study was to assess the health risk of cadmium in rice consumed from 2010 to 2015. Adults aged 19-65 years, with an average body weight of 62.96 kg, were our target population, representing the general public. Rather than a NOAEL, we used a benchmark dose lower confidence limit (BMDL) to derive the reference dose (RfD), calculated with the US EPA Benchmark Dose Software (BMDS). The mean concentration (MC), lifetime average daily dose (LADD) and hazard index (HI) were estimated by applying Bayes' theorem via Markov chain Monte Carlo (MCMC) simulation. Because the oral dose-response relationship for cadmium is not well established, we estimated the slope factor for respiratory carcinogenesis and converted it to an oral carcinogenic slope by applying a 50% conversion rate. From rat carcinogenicity experiments in IRIS, assuming a rat body weight of 0.25 kg, a BMDL of 0.837 was calculated using BMDS; extrapolating the animal dose-response relationship to humans yielded a human BMDL of 0.254. Furthermore, assuming a daily inhalation rate of 20 m³/day and an adult body weight of 70 kg, the oral carcinogenic slope was calculated to be 6.5, similar to the US EPA-published dietary carcinogenic slope of 6.4.
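A simplified deterministic core of the dose and hazard calculation named above (the study itself propagated these quantities through MCMC); the formula omits exposure-duration and averaging-time adjustments, and the input values are illustrative, not the study's.

```python
def ladd(conc_mg_per_kg, intake_kg_per_day, body_weight_kg):
    """Lifetime average daily dose (mg/kg-day), assuming lifelong daily intake."""
    return conc_mg_per_kg * intake_kg_per_day / body_weight_kg

def hazard_index(dose_mg_per_kg_day, rfd_mg_per_kg_day):
    """HI > 1 suggests potential for non-carcinogenic effects."""
    return dose_mg_per_kg_day / rfd_mg_per_kg_day

# Illustrative inputs: 0.2 mg/kg Cd in rice, 0.25 kg rice/day, 62.96 kg adult,
# and a hypothetical RfD of 0.001 mg/kg-day.
dose = ladd(0.2, 0.25, 62.96)
print(round(dose, 5), round(hazard_index(dose, 0.001), 2))
```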

P.119  Risk Analysis and Patient Safety: A tool for Improvement. Elmontsri M*, Banarsee R, Azeem M; Imperial College London   m.elmontsri10@imperial.ac.uk
Occupational Health and Safety

Abstract: The delivery of healthcare is a complex process that requires the integration of different functions and stakeholders to ensure that patients are provided with high-quality, safe care. Risks in the healthcare environment can lead to significant negative outcomes in the process of care. Statistics on harm done to patients tell the untold story of healthcare delivery, and the consequences of poor healthcare delivery are unacceptable. Risk analysis can help healthcare providers ensure that their services are provided in accordance with patients’ expectations. It can be used as an integral tool in the overall governance and management process to ensure that all types of risks are captured, evaluated, analyzed and controlled. Healthcare providers can strengthen their health systems by involving different stakeholders in the risk analysis process, including patients, health professionals, policy makers, families, nurses and administrators. Patient involvement in risk analysis will also yield new insights about the process of care, as patients perceive the various risks in the healthcare environment differently.

P.120  Incorporating ToxCast and ExpoCast Data into Naphthalene Risk Assessment. Bailey LA*, Rhomberg LR; Gradient   lbailey@gradientcorp.com
Occupational Health and Safety

Abstract: Inhalation of naphthalene causes nasal olfactory epithelial tumors in rats and benign lung adenomas in mice. The available human data have not identified an association between naphthalene exposure and increased respiratory cancer risk. Therefore, naphthalene carcinogenic risk assessment in humans depends entirely on experimental evidence from rodents. The United States Environmental Protection Agency (US EPA) Toxicity Forecaster (ToxCast) Database contains 882 in vitro assays for naphthalene, with more than 750 assays conducted in human cells. We obtained and reviewed all naphthalene ToxCast assay data and used that information for the following analyses: 1) Using a physiologically-based pharmacokinetic (PBPK) model for naphthalene (Campbell et al. 2014), we determined the naphthalene inhalation concentrations that correspond to relevant activity concentrations for all active naphthalene assays, and compared those concentrations to the naphthalene human equivalent concentration (HEC) derived in our recent naphthalene paper (Bailey et al., 2015); 2) We evaluated target endpoints for active assays in the context of proposed modes of action for naphthalene; and 3) We reviewed the assays within ToxCast to determine which might be expected to be positive based on proposed modes of action and carcinogenic endpoints for naphthalene. In addition, we evaluated naphthalene exposure information within the US EPA High-Throughput Exposure Forecaster (ExpoCast) Database in the context of our naphthalene HEC. Overall, although there are limitations and some uncertainties within the naphthalene ToxCast and ExpoCast data, the results from our analyses of these data are consistent with, and provide additional support for, the conclusions in our 2015 analysis.

P.121  Incorporating Health Risk Assessment into Facility Layout and Process Design. Huang SH*, Chuang YC, Wu KY; National Taiwan University   d04841012@ntu.edu.tw
Occupational Health and Safety

Abstract: Facility planning and process design have a significant impact on occupational health and safety. Factors such as raw material use, storage, and machine equipment affect workers’ exposure to hazardous substances. Considering occupational health risks during the planning and design phase allows better anticipation of potential hazards and implementation of appropriate control or contingency measures. In current practice, occupational exposure is often monitored only once the occupational setting is in operation; even when noncompliance is discovered, site revisions face challenges due to their potential impact on production line efficiency and cost. The objective of this study was to incorporate occupational health into facility layout design by characterizing the potential health risks of an exposure scenario at an early stage, when limited information is available. Using the Stoffenmanager exposure modeling webtool, we predicted the potential exposure level of each exposure scenario. By collecting past domestic exposure assessment data from various processes, we integrated this information with the predicted exposure levels using Markov chain Monte Carlo sampling to conduct health risk assessments evaluating the chemical hazard index (HI) or carcinogenic risk. The resulting risk characterization provides a reference for future new plant process designs. As a preliminary finding, we used existing exposure scenarios from an optoelectronic semiconductor manufacturing factory in Taiwan to conduct a health risk assessment of chronic exposure to positive photoresists, a material used in the optoelectronic semiconductor industry. The by-product benzene, in particular, had the highest HI and cancer risk. These results demonstrate that applying the method during process design may help further incorporate occupational health and provide insights for design improvements.

P.122.  Recommendations for sieving soil and dust samples at Superfund sites for assessment of incidental ingestion via dermal adherence. Stifelman M, Brown J, Lowney Y, Follansbee M*, Diamond G, Burgess M; SRC, Inc.   follansbee@srcinc.com

Abstract: Incidental ingestion is the primary pathway for exposure to lead, and other contaminants, in soil and dust and is dependent on dermal adherence. Hence, site-specific risk assessment requires that soil and dust samples be sieved to accurately represent incidentally ingested material that adheres to skin. Soil and dust particle size, an important determinant of dermal adherence, is generally inversely associated with lead concentration, mobility, and bioavailability. Reliable data on the particle size fraction that is most likely to adhere to hands, and on the lead concentration found in that particle size, can improve the accuracy of exposure and risk calculations in lead risk assessments. We reviewed the literature for relevant data on the relationship between particle size and dermal adherence, and between particle size and lead enrichment. The review revealed a growing body of evidence showing that dermally adhered soil and dust is dominated (>90%) by particles <150 µm. Additionally, although dependent on site-specific conditions, the review also revealed consistent enrichment of lead in smaller particles (<150 µm). Based on this new information, US EPA now recommends that the concentration of lead in the <150 µm fraction be used to represent the particle size fraction associated with incidental ingestion, and this particle size fraction has also been suggested for use in assessing exposure to other contaminants in soil. Previously, US EPA’s Office of Land and Emergency Management recommended that the lead concentration in the <250 µm particle size fraction be used to represent the fraction of soil and dust that adheres to hands and could be incidentally ingested.
The Office of Land and Emergency Management recognizes, however, that this recommendation to sieve to 150 µm to obtain the fine fraction may be adjusted on a site-specific basis to obtain smaller or larger particle size fractions as site-specific history or circumstances warrant. For example, larger particle size fractions may be appropriate when contact with wet soil is anticipated (e.g., exposures to soils or sediments at shorelines). We will provide a summary of the technical studies and regulatory analyses that form the basis for selecting this particle size range as the appropriate cutoff for use in human health risk assessment of ingestion exposures to contaminants in soil.

P.122  Integrating NIOSH Efforts to Protect Workers: Linking Cumulative Risk Assessment, Exposome, and Total Worker Health®. Dotson GS*, Chosewood LC, Middendorf PJ; CDC/National Institute for Occupational Safety and Health    fya8@cdc.gov
Occupational Health and Safety

Abstract: The nature of work has shifted greatly in the last century to reflect changes in the economy, population demographics, and the availability of advanced technologies. The changing nature of work has resulted in a modern work environment that consists of both recognized and emerging occupational risk factors. Occupational health professionals need innovative approaches and tools capable of characterizing the risks workers encounter in the modern work environment. For this reason, the National Institute for Occupational Safety and Health (NIOSH) is exploring the integration of three novel initiatives: 1) cumulative risk assessment (CRA), 2) the exposome, and 3) Total Worker Health® (TWH). CRA attempts to characterize the cumulative, or combined, risk associated with co-exposures to multiple stressors. The term exposome refers to the totality of an individual’s exposures over a lifetime and how those exposures relate to health. TWH is defined as policies, programs, and practices that integrate protection from work-related safety and health hazards with promotion of injury and illness prevention efforts to advance worker well-being. Each of these initiatives reflects independent, but complementary, research programs that focus on better understanding and addressing risk factors that impact worker safety, health and well-being. This poster will: 1) provide an overview of each of these research initiatives; 2) illustrate the links between CRA, the exposome, and TWH; and 3) explore the challenges and benefits of integrating the initiatives.

P.123  A risk by any other name would not smell as sweet. Pace ND*, Poole C; University of North Carolina at Chapel Hill   nelsonpace@gmail.com
Other

Abstract: Risk and odds are two methods of reporting disease frequency. Odds ratios can approximate risk ratios when the underlying risk of disease is rare, and the approximation improves with greater rarity of disease. When the approximation agrees to several decimal places, we suggest that odds ratios be referred to as risk ratios because (1) risk ratios are more easily understood than odds ratios, which will facilitate scientific communication, (2) the numeric difference between risk ratios and odds ratios for very rare diseases is unobservable in the published literature due to factors such as rounding, and (3) the decimal places needed to portray these minute differences are often beyond the level of precision of most measured variables. For these reasons, we advise that calculated odds ratios in very rare disease research be referred to as risk ratios.
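The rare-disease approximation the abstract relies on is easy to verify numerically; the example risks below are arbitrary illustrations.

```python
def risk_ratio(p1, p0):
    """Ratio of disease risk in the exposed (p1) to the unexposed (p0)."""
    return p1 / p0

def odds_ratio(p1, p0):
    """Ratio of disease odds, p/(1-p), in the exposed to the unexposed."""
    return (p1 / (1 - p1)) / (p0 / (1 - p0))

# As disease risk falls (with RR held at 2.0), the OR converges to the RR.
for p0 in (0.1, 0.001, 0.00001):
    p1 = 2 * p0
    print(p0, risk_ratio(p1, p0), round(odds_ratio(p1, p0), 5))
```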

P.124  Comparing Verbal and Numeric Forecasts New Findings and Implications. Nguyen JD*, John RJ; University of Southern California   hoangdun@usc.edu
Other

Abstract: Whereas the process of risk assessment requires probability estimates in numerical form, many organizations prefer to use verbal measures of uncertainty. This research compares the accuracy of verbal predictions and numeric forecasts. A sample of 118 NFL football experts was recruited from Turk Prime. The experts were randomized into one of two experimental conditions. Experts in the NUMBER condition were asked to make predictions of various possible outcomes in the NFL 2016-2017 regular season using a numerical scale. Experts in the VERBAL condition were asked to make the same predictions using a verbal scale that included 11 different probability words. The experts in the VERBAL condition were later invited to participate in a separate study about “preference for gambles.” The experts were presented with a series of binary gambles and asked to choose their preferred options. The payoffs of the binary gambles were identical, but the first gamble described the chance of winning using a numeric value whereas the second described the chance of winning using a verbal expression of uncertainty. Using an iterative procedure, we quantified the numeric values corresponding to the verbal expressions in the verbal response scale of the main study. These quantified values were then used to transform the verbal responses of the VERBAL experts into numeric values. Results from the main experiment showed that verbal forecasts did not differ significantly from numerical predictions in overall accuracy, quantified by the Brier score. However, numerical judgments were more resolute, or discriminatory, than verbal judgments, while the degree of underconfidence was much less extreme among VERBAL experts. These results contradict findings in previous studies and call for more attention to evaluating the use of verbal measures of uncertainty.
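The Brier score named above is the mean squared difference between probability forecasts and binary outcomes (lower is better); the example forecasts are invented.

```python
def brier_score(forecasts, outcomes):
    """Mean squared error between probability forecasts and 0/1 outcomes."""
    return sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / len(forecasts)

# A forecaster who said 0.9 for an event that occurred and 0.3 for one that
# did not scores (0.01 + 0.09) / 2 = 0.05.
print(brier_score([0.9, 0.3], [1, 0]))
```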

P.125  Methodology for Policy Characterization based on the Multiple Risk Evaluation Results: Case Study for Japanese Chemical Replacements. Kojima N*, Xue M, Zhou L, Machimura T, Ebisudani M, Tokai A; Graduate School of Engineering, Osaka University   kojima_n@see.eng.osaka-u.ac.jp
Other

Abstract: Managing multiple risks requires a more comprehensive methodology than managing a single target risk. We have been exploring practical methods for carrying out risk governance of chemicals. The objective of this research is to develop a methodology for policy characterization based on policy evaluations that analyze risk-risk trade-offs: not just the target risk but also countervailing risks. This study comprised two steps: first, evaluating each policy in preparation for a comprehensive policy comparison; and second, defining a way of illustrating the results so as to grasp the profiles of target and countervailing risk reduction. In the policy evaluation step, we evaluated five (and more) cases of chemical replacement in Japan. For example, we examined adhesive replacements with lower formaldehyde residue for wooden construction materials; several refrigerant replacements to reduce target risks such as ozone depletion potential (ODP); and the replacement of LAS with AE in home detergents. Every chemical replacement inevitably involved several endpoints for other populations, so we evaluated risk by life stage, by population over time, and by risk category, using material flow (and stock) analysis, life-cycle assessment, and probabilistic risk assessment. Based on these results, we set the trade-off ratio as the horizontal axis and, as a preliminary result, the flow-stock ratio as the vertical axis to check whether risks from consumer products have increased. As a notable result for refrigerants in air conditioners, we found that although the replacement decreased the target health risk from ODP and the countervailing risk from GWP (global warming potential) by 2030, longer equipment lifetimes prolonged the period over which the risk occurs. This work is supported by the Ministry of Environment, Japan (1-1501).

P.128  Influence of industrial activities emissions on mortality rates in Chile: An ecological study. . Fortt A*, Gutierrez V.V; Universidad Diego Portales and GreenRiver   antonia.fortt@greenriver.cl
Risk and Development

Abstract: The Chilean population is exposed to significant pollution from different industrial activities, mainly mining, agriculture, energy industries, power plants, paper production, and transportation. The aim of this research is to determine whether higher cancer mortality rates are associated with the presence of these different industrial activities across the country. Following the method used by Ruiz-Rudolph et al. (2016), we conduct an ecological study that uses Chilean communes—the smallest units of local administration in Chile—as small-area observation units to assess cancer mortality. Public data on mortality rates are available at the commune level. For each commune, data on cancer mortality were aggregated for the 2000-2016 period. Public databases are available for pollution emissions from different industries. The impact of pollution from different economic activities on cancer mortality rates is estimated using the model proposed by Besag, York, and Mollie (BYM) (Besag et al., 1991), which has been used extensively in spatial epidemiology. Significantly higher rates of cancer mortality were observed in communes with large industrial emitters.

P.129  Current and Emerging Human Health Impacts Associated with Land-Based Pollution in Low and Middle-Income Countries (LMICs): Data Gaps & Research Needs. Williams PRD*, Meiro-Lorenzo M, Puech Fernandez MR, Kadeli LG; E Risk Sciences, LLP and World Bank   pwilliams@erisksciences.com
Risk and Development

Abstract: Little is known about the health impacts of land-based pollution in low and middle income countries (LMICs), although preliminary estimates suggest that exposures and risks are significantly greater than in high income countries. Such pollution occurs at both legacy and active sites involving industries such as lead-acid battery recycling, industrial mining and ore processing, lead smelting, tanning operations, artisanal small-scale gold mining, and e-waste recycling. This presentation provides an overview of key findings from the available literature, identifies important knowledge and data gaps, and discusses research currently underway by the World Bank to better understand the health impacts of contaminated sites in LMICs. For example, although a number of studies have collected data in LMICs, different study designs and sampling methods make it difficult to compare or generalize from these study findings. Most research has also focused on a single pollutant or category (e.g., lead, mercury, metals, pesticides) and a limited number of exposure pathways and health outcomes. Additionally, non-uniform survey questionnaires have resulted in variable and incomplete information on population exposure factors, such as intake rates, time-activity patterns, behaviors, lifestyle factors, and comorbidities. Most lacking are data and analyses linking environmental concentrations, population factors, estimated or measured exposure levels, and observed or predicted health effects at the local or regional level. Few attempts have been made to utilize, validate, or modify existing exposure and risk models for use in developing countries.
To address these issues, the World Bank is supporting research to collect and analyze site-specific and population factor data from selected target countries (e.g., Nigeria, Ghana, Tanzania, Bangladesh, Pakistan) and improve the robustness of affordable tools to estimate health and economic impacts from land-based pollution in LMICs.

P.130  The social and economic effects of environmental contamination and remediation. Zwickle A*, Cox J, Hamm J, Zhuang J, Upham B, Dearing J; Michigan State University   zwicklea@msu.edu
Risk Communication

Abstract: Environmental contamination can have natural, social, and economic impacts, especially when it occurs in populated areas. As part of a larger effort to study Superfund sites in the United States, this research focuses on the economic effects of dioxin contamination in a mid-sized Michigan city. A person’s home is usually their most valuable asset, and a drop in its value can have significant economic and social implications. In addition to the direct physical impacts, well-publicized pollution events can create social forces, such as stigma and perceptions of health risks, which negatively impact home values. To study the effect that environmental contamination and remediation have had on home values over time, we collected assessed home values for nearly 800 parcels for the years 2000-2017. Homes were placed into three groups: those found to contain dioxin levels above the threshold deemed safe, which were subsequently remediated; those found to contain dioxin levels below the threshold, which were not remediated; and a comparison group. We then conducted a multi-level longitudinal analysis to determine whether the value of homes in each group changed at different rates over time. This analysis is set against the backdrop of notable contamination-related events in the community’s history, such as a contentious and well-reported public meeting and the highly visible cleanup and remediation phase. We conclude this presentation with a discussion of the practical implications of our findings for future remediation processes, as well as avenues for future research.

P.132  Modeling social media engagement across the disaster continuum. Sutton J*, Resnick S, Vos SC, Yu Y, Olson M, Butts SC; University of Kentucky   jeannette.sutton@uky.edu
Risk Communication

Abstract: Existing research on social media and risk communication in the context of disaster focuses on the threat and response phases, with little attention to the communication strategies implemented during nonthreat periods. Furthermore, much of the research in this area remains descriptive and provides limited guidance on effective messaging strategies for engagement. In this paper, we add to the scholarship by examining social media use across the disaster continuum in order to identify message strategies that enhance engagement during threat and nonthreat periods. To date, social media “engagement” has been conceptualized primarily as dialogic communication, displayed through symmetrical or asymmetrical posts and directed messages that are aimed at building an online participatory community. This strategy imagines communication on social media as interpersonal conversation; as a result it fails to capitalize on all of the strengths of this channel, including network features and the ability to reach mass audiences. In this paper we propose a model of social media risk communication that builds on the current literature to inform an ongoing strategy of engagement across the threat-nonthreat time continuum. We draw from our previous findings to code, analyze, and model the message design and network features of a systematic, proportional, stratified random sample of 6,000 Tweets distributed by the census of National Weather Service accounts from 2014-2016. We conduct a thematic content analysis of three primary engagement strategies—informing, instructing, and dialogue—across two time periods, threat and nonthreat, to develop a classifier system. We then use automated coding to identify microstructure features (#, @, RT, and URL). We build models to identify the message design and network features that affect engagement, in the form of message sharing. The results identify strategies that increase engagement during threat and nonthreat periods.

P.133  Engaging with human gene editing: public views toward decision-making about controversial scientific issues. Rose KM*, Scheufele DA, Brossard D, Xenos MA; University of Wisconsin-Madison   kmrose@wisc.edu
Risk Communication

Abstract: The rise of low cost, efficient gene editing technologies (e.g., CRISPR/Cas9) has given new urgency to calls for public dialogue about the potential technical and societal risks surrounding human gene editing. In fact, many issues raised by such technologies do not have exclusively scientific answers but require broader societal debates about the moral, political, and societal complexities. This was echoed by the 2017 National Academy of Sciences and National Academy of Medicine report on human genome editing, which brought attention to both the potential implications and calls from the scientific community for public engagement in policy decisions related to the technology. While many in the scientific community recognize the need for public involvement in decisions concerning controversial technologies with far-reaching risks and benefits, public attitudes toward engagement in these decisions have yet to be determined. In this study, we examine public attitudes toward involvement in regulatory and scientific decisions related to human gene editing. Specifically, we analyze attitudes toward public engagement and the capabilities of the scientific community to responsibly develop the technology using data gathered from a nationally representative survey of U.S. adults (N=1,600; RR1=35.9%) from December 2016 to January 2017. We find sharp divides across ideological and religious groups in attitudes toward the technology and in views of the abilities of scientists to appropriately regulate themselves with respect to its future development; however, the same groups that differ in their attitudes are united in their support for public engagement. Implications for public engagement in policy making on this emerging technology are discussed.

P.135  What's Numbers Got To Do With It?: The Role of Statistical Content in Risk Perception About Road Safety. Steinhardt JS*; Michigan State University   jsteinh@gmail.com
Risk Communication

Abstract: Numbers saturate news coverage and health and risk messaging. But as our expertise in the creation of statistical information increases, the ability to use those statistics in decision making remains frustratingly inadequate. There has been a wealth of research on how to train people to better use the numbers they interact with on a daily basis. Far less research, however, explores the appropriate way to use numbers in communication. Three experiments explored the role of numbers in risk perception related to road safety while driving. Experiment 1 found that the presence of numbers influences risk perception, but whether those numbers reflect accurate statistics or random numbers does not change their influence. Experiment 2 found that removing all statistics from infographics and replacing them with linguistic gist representations of the numbers (i.e., words like “some,” “many,” “none”) increased risk perception, even though people found these infographics less informative than the ones containing numbers. Experiment 3 found that the effects described in the first two experiments pertain not just to infographics but also to bite-sized, text-based risk communication. The results suggest that the gist representations of the numbers in the context of the infographics are equivalent regardless of their value, such that the very presence of statistics, but not their meaning, influences judgment and risk perception. They also suggest that people do not always realize how they are using statistical information in their judgment and decision-making processes.

P.136  Communicating earthquake preparedness. Marti M*, Stauffacher M, Matthes J, Wiemer S; ETH Zurich   michele.marti@sed.ethz.ch
Risk Communication

Abstract: Despite global efforts to reduce seismic risk, actual preparedness levels remain universally low. Although earthquake-resistant building design is the most efficient way to decrease potential losses, its application is not a legal requirement in all earthquake-prone countries and, even where it is, it is often not strictly enforced. Risk communication encouraging homeowners to take precautionary measures is therefore an important means to enhance a country’s earthquake resilience. Our study illustrates that specific interactions of mood, perceived risk and frame type significantly affect homeowners’ attitudes towards general precautionary measures for earthquakes. The interdependencies of the variables mood, risk information and frame type were tested in an experimental 2 x 2 x 2 design (N = 156). Only in combination, and not on their own, do these variables effectively influence attitudes towards general precautionary measures for earthquakes. The control variables gender, “trait anxiety” index and alteration of perceived risk adjust the effect. Overall, the group with the strongest attitudes towards general precautionary actions for earthquakes comprises homeowners with induced negative mood who process high-risk information and gain-framed messages. However, the conditions combining induced negative mood, low-risk information and loss-framed messages, and induced positive mood, low-risk information and gain-framed messages, also significantly influence homeowners’ attitudes towards general precautionary measures for earthquakes. These results mostly confirm previous findings in the field of health communication. For practitioners, our study emphasizes that carefully compiled communication measures are a powerful means to encourage precautionary attitudes among homeowners, especially those with an elevated perceived risk.

P.137  Testing Procedures to Mitigate Unfairness Perceptions Associated with Research-Related Conflicts of Interest. Besley JC*, McCright AM, Zahry NR, Elliott NE, Martin JD, Kaminski NE; Michigan State University   jbesley@msu.edu
Risk Communication

Abstract: Two between-subject experiments explored the perceived fairness of a hypothetical public-private research partnership to study the health risks of transfats. Fairness was measured as researchers’ willingness to listen to a range of voices and minimize bias. The practical value of the research is that it clarifies the challenges that investigators who take industry funding may face in having their research seen as legitimate. Conceptually, the research speaks to considering scientific research as decision-making processes that people may judge based on their perception of the use of fair research procedures. Analyses include mean comparisons and General Linear Model estimation. Experiment 1 (n = 1,263) assigned research subjects to a partnership that included a random combination of an industry partner, a university partner, and a non-governmental organization (NGO) partner, as well as one of three processes aimed at mitigating the potential for conflicts of interest (COI) to harm the quality of the research. The procedures included an arm’s length process meant to keep the university-based research team from being influenced by the other partners; an independent advisory board to oversee the project; and a commitment to making all data and analyses openly available. The results suggest that having an industry partner has substantial negative effects on perceived research fairness and that the benefit of adding a single COI-mitigation process may be relatively small. Subjects in Experiment 2 (n = 1,076) assessed a partnership that included a university and either an NGO or industry partner, as well as a combination of the three COI-mitigation procedures used in Experiment 1. Subjects could therefore be assigned to assess a collaboration without any mention of a COI-mitigation process, one of the three processes, two of the three processes, or all three processes. The results suggest there may be little value in combining COI-mitigation procedures.

P.138  Processing risks: What makes the U.S. public attend to information about the 2016 presidential election vs. climate change. Yang J.*, Chu H.; University at Buffalo   zyang5@buffalo.edu
Risk Communication

Abstract: The 2016 presidential election presented a unique case study for risk communication because the frenzied campaign unavoidably left many Americans feeling uncertain about the future, if not fearful or anxious due to elevated risk perceptions. These risk perceptions, and the strong emotional responses they generated, inevitably had an impact on U.S. electors’ communication behaviors and decision-making processes. Although not a risk object in the traditional sense, the 2016 election embodied many characteristics that define risks, such as the many unknown and uncontrollable factors that influenced its outcome. Applying the risk information seeking and processing (RISP) model, this study examines the social cognitive variables that motivated the U.S. public to attend to information about the risks posed by the election. Further, the utility of the RISP model in explicating information processing in the election context is juxtaposed with an identical model specified to a dataset on climate change, a typical risk that enjoyed much less issue salience during the election cycle. To examine what made the U.S. public attend to information about the risks posed by the election vs. climate change, both datasets were collected from Oct 6 to Oct 23, 2016 using Qualtrics national panels. Results indicate that in the climate change context, risk perception had a stronger relationship with affective responses, which were significantly related to systematic processing. In contrast, affective responses were not significantly related to systematic processing in the election sample. Perceived knowledge had a stronger relationship with information insufficiency and information processing in the election context. The relationships between other RISP variables and systematic processing were consistent across both datasets, which justified the use of the RISP model in studying the presidential election as a risk issue.

P.139  Media Representations of Water Issues as Health Risks. Boyd A*, Mayeda A, Paveglio T, Flint C; Washington State University   amanda.boyd@wsu.edu
Risk Communication

Abstract: Water is a natural resource that is critical to sustaining human life. Media representations may impact public risk perceptions and inform water conservation or health promotion behaviors. The objective of this research is to identify how the media frames public health risks associated with water resource issues. A content analysis of newspaper articles focusing on water resource issues was conducted for a three-year timeframe from January 2012 to December 2014. Articles were gathered from eight newspapers in four Western U.S. states using the search term ‘water’ and subsequently reduced to a set of articles with any mention of human health. Articles were coded to determine what health related issues were discussed and the environmental issues they were connected with (e.g., drought, climate change, contamination). The search initially returned 3,338 articles with a major focus on water issues. Approximately 10% of these articles contained any mention of a health-related issue. The three major health themes present in water-related newspaper coverage were: (1) risks from water contamination, (2) general health risks, and (3) illness or disease. The results indicate that while health risks are seldom mentioned, the risks most frequently discussed in relation to water resources are those with direct and immediate impacts. These findings suggest the issues being reported in the media may not be consistent with the nature of health impacts associated with water resource issues, which are most often long-term and indirect.
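The two-stage screening described here (retrieve on the search term ‘water’, then retain articles with any mention of human health) amounts to a simple keyword filter. The health-term list and helper below are hypothetical stand-ins, since the study's actual screening criteria are not given in the abstract.

```python
# Hypothetical health-term list for the second screening stage; the study's
# real criteria for "any mention of human health" may differ.
HEALTH_TERMS = ("health", "illness", "disease", "contamination")

def mentions_health(article_text):
    """Return True if the article text contains any health-related term."""
    text = article_text.lower()
    return any(term in text for term in HEALTH_TERMS)

corpus = [
    "Drought forces new water restrictions across the state.",
    "Water contamination raises illness concerns for well owners.",
]
health_subset = [a for a in corpus if mentions_health(a)]
```

In the study, roughly 10% of the 3,338 retrieved articles survived this second screen before being hand-coded for health themes.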

P.140  Comparison of risk perception among thirty risk factors in Japan. Ohkubo C*; Japan EMF Information Center   ohkubo@jeic-emf.jp
Risk Communication

Abstract: Risk perception of men (n=1506) and women (n=1495) regarding 30 risk factors was surveyed via the internet in 2016. The questionnaire covered the degree of risk perceived for 30 factors in daily life: outdoor air pollution, indoor air pollution, diesel emissions, chemical substances in general, dioxins, pesticides, food additives, industrial wastes, nuclear power plants, ionizing radiation, X-rays, radon, UV radiation, solar radiation, electromagnetic field radiation, high-voltage power lines, cellular phones, base stations, microwave ovens, induction hobs, home electric appliances, motor vehicles, aircraft, daily foods, red meat, processed meat, genetically modified food, passive smoking, smoking and alcohol drinking. Respondents were asked for their risk perception of each factor. For comparison, risk perception was categorized into three levels: “danger”, “neither danger nor safe” and “safe”. For both genders combined, the ten factors most often rated “danger”, in descending order, were passive smoking, smoking, ionizing radiation, pesticides, outdoor air pollution, dioxins, nuclear power plants, UV radiation, diesel emissions and food additives. Among men, they were smoking, passive smoking, outdoor air pollution, ionizing radiation, dioxins, nuclear power plants, pesticides, UV radiation, diesel emissions and industrial wastes. Among women, they were passive smoking, smoking, outdoor air pollution, UV radiation, dioxins, pesticides, ionizing radiation, nuclear power plants, food additives and diesel emissions. Among the electromagnetic-field-related factors, the five most often rated “danger” for both genders combined, in descending order, were high-voltage power lines, cellular phones, microwave ovens, base stations and home electric appliances. For every risk factor, the proportion of women answering “danger” was higher than that of men.

P.143  Up and down in the cycle: the effects of media attention on political debate and policy regarding the public risk of earthquakes. Opperhuizen AE*, Schouten KIM, Klijn EH; Erasmus University Rotterdam   opperhuizen@fsw.eur.nl
Risk Communication

Abstract: The media function as an amplification station by transmitting information about a risk event to the general public. They are an important source of information that influences public and political perceptions of a risk. In the social amplification of risk framework, the media are often described as an institution with its own rules. While political scholars have focused on the media's function as agenda-setter for political (risk) debate, there is limited research on the ripple effects the media create in the policy sphere. In this study we focus on the agenda-setting role of the media in both the political sphere and the policy sphere. We do so for the case of increasing earthquake risk caused by gas drilling in the Netherlands. The risk of earthquakes offers the opportunity to study media attention over a long period (25 years) and at the same time provides in-depth knowledge about the effect of media attention on political risk debate and risk policy. A supervised machine learning (SML) technique is applied to conduct a content analysis of media articles (N=2265) from five different newspapers, political documents reporting on debates in Parliament (N=124), and policy documents. We expect that the frequency of media attention influences how often the earthquake risk is the subject of policy debate. In line with symbolic agenda-setting theory, we also expect the content of topics covered by the media to be reflected in the political sphere. Finally, in line with substantive agenda setting, the risk control reflex and ripple-effect theory, we expect media attention to influence the policy sphere as well.
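In outline, a supervised approach like the one described trains a classifier on hand-labeled documents and then applies it to the remaining corpus. The pure-Python nearest-centroid sketch below is an illustrative stand-in (the sample snippets, labels, and function names are invented); the study's actual SML pipeline is not specified in the abstract.

```python
from collections import Counter
import math

def vectorize(text):
    """Bag-of-words term counts for one document."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def train_centroids(labeled_docs):
    """Sum word counts per label, giving one centroid vector per class."""
    centroids = {}
    for text, label in labeled_docs:
        centroids.setdefault(label, Counter()).update(vectorize(text))
    return centroids

def classify(text, centroids):
    """Assign the label whose centroid is most similar to the document."""
    vec = vectorize(text)
    return max(centroids, key=lambda lbl: cosine(vec, centroids[lbl]))

# Invented hand-labeled training snippets, not the study's data.
train = [
    ("earthquake damage gas drilling groningen", "earthquake risk"),
    ("tremor compensation gas extraction houses", "earthquake risk"),
    ("budget debate coalition cabinet election", "other"),
]
centroids = train_centroids(train)
```

A production pipeline would replace this with a properly validated classifier, but the structure is the same: human coding of a training sample, then automated labeling of thousands of articles and documents.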

P.145  Forecasting Barriers to Wide Scale Adoption of Self-Driving Car Technology. Dixon GN*, Hart PS, Clarke CE, O'Donnell N; The Ohio State University   graham.n.dixon@gmail.com
Risk Communication

Abstract: Recent advances in automotive technology have made fully automated self-driving cars possible. Despite offering many benefits, such as increased safety, improved fuel efficiency, and greater disability access, public support for self-driving cars remains low. While previous studies find that age and sex influence self-driving car support, factors influencing support likely go beyond these demographic variables. Using a national survey of American adults (N = 1008), we find that age and sex do not significantly associate with support for self-driving car policies when controlling for psychological traits and cultural values. Instead, significant predictors of support included trust in automotive institutions and regulatory bodies, recognition of self-driving car benefits, positive affect toward self-driving cars, and a greater perception that manually driven cars are riskier than self-driving cars. Importantly, we also find that individualism is negatively associated with support. Thus, people who value personal autonomy and limited government regulation may perceive policies encouraging self-driving car use as threatening to their worldviews. Altogether, our results point to solutions for encouraging greater public support of self-driving vehicles, while also forecasting potential barriers as self-driving cars emerge as a fixture in transportation policy.

P.146  News media framing of the risk of induced seismicity in four U.S. states. Lambert CE*, McComas KA; Cornell University   cel247@cornell.edu
Risk Communication

Abstract: In the past 10 years, earthquakes in the central and eastern US have increased, due to industrial activities like deep injection of oil and gas related wastewater. As seismic activity has risen, attitudes and responses in affected states have differed, with some placing limits on injection soon after seismic activity began, and others introducing regulations after a period of years. The unfolding debates about induced seismicity raise questions of how news media are covering seismic risk from induced earthquakes, an area of study that has received little attention. We investigate variability in news media framing of induced earthquakes across multiple states with different regulatory responses to assess what frames the news media use in their coverage, how the use of frames varies over time, and to compare framing and levels of coverage with earthquake activity and regulatory action. Our study consists of a content analysis of newspaper coverage of induced earthquake events in Arkansas, Oklahoma, Ohio, and Texas, chosen because they have experienced induced earthquake sequences within the last 10 years and have instituted a range of regulatory responses. We draw on previous literature to categorize whether coverage emphasizes earthquakes as (1) a problem that needs to be solved, by referring to scientific information, public accountability, risk linkages, or causal attribution, or (2) not a problem, framing the situation as a non-issue or as having a solution already in hand, by focusing on governmental statements, economic perspectives, scientific uncertainty, or denial of causal relationships. We present a timeline of media coverage and framing, earthquake activity, and regulatory action in each state. By designating frames as problem/no problem, the results provide a picture of how the debate over induced seismicity as a risk issue has unfolded over time in the media and how that discussion corresponds to sociopolitical and geological events.

P.149  The effect of gain vs. loss message framing and spatial distance on influencing support for aquaculture among U.S. seafood consumers. Rickard LN, Kumara SMSP*; University of Maine   laura.rickard@maine.edu
Risk Communication

Abstract: In response to declining wild fisheries and increasing global seafood demand, aquaculture – the breeding, rearing, and harvesting of animals or plants in water environments – has emerged as one of the fastest growing food production sectors, currently contributing over 50% of the world’s seafood. Yet, farming seafood is not without drawbacks, and attention to possible environmental and human health risks has amplified risk perceptions, especially among U.S. and European audiences, and rendered some policies controversial. In this study, we examine whether support for aquaculture may be influenced by communication about its benefits, including environmental sustainability and job creation. Specifically, we investigate the impact of gain versus loss framing (i.e., highlighting the advantages of adopting or disadvantages of not adopting) and spatial distance (i.e., the location in which aquaculture occurs, from the perspective of the message recipient) as two variables that may influence support for aquaculture policies and products. Using a nationally representative sample of U.S. residents collected by GfK in January 2017 (N = 1210), we report on a messaging experiment utilizing a 2 (gain vs. loss) x 2 (near – U.S. vs. far – China) between-subjects, experimental design, with a no-message control group designed to explore the main and interactional effects of message treatment on support for aquaculture. Four simulated newspaper messages are used to highlight the gains of implementing or losses of failing to implement aquaculture, in the context of the U.S. or China. We explore the possible mediating effect of perceived benefit (versus risk) of aquaculture, as well as the potential moderating variables of source credibility, political ideology, subjective and objective (i.e., fact-based) knowledge about aquaculture, and level of seafood consumption. Theoretical implications for strategic risk communication messaging, as well as applied insights for aquaculture industry and government stakeholders, will be presented.

P.150  The Role of Science News Sources in Shaping Risk Perceptions of Agricultural Use of Pesticides. Li N, Powers R*; Texas Tech University   nan.li@ttu.edu
Risk Communication

Abstract: Pesticides have been extensively used in agriculture and urban settings to control weeds, insects and other pests. The use of pesticides has resulted in a range of benefits, including increased crop production and decreased insect-borne diseases. Nonetheless, the agricultural use of pesticides has also raised persistent concerns about possible adverse effects on human health and the environment. As most people lack direct experience with and/or scientific knowledge of pesticides, they have to rely on external information sources, such as mass media, to judge the safety of pesticide use in farming. This study conducts a secondary analysis of the 2016 General Social Survey Ballot 2 data (Sample N=859) to investigate how the US public forms its risk perceptions of the agricultural use of pesticides. Results show that females, those who are pessimistic about scientific advances, and those concerned about environmental quality tend to perceive pesticides as more hazardous. In addition, people who rely on television as their primary source of science news are less likely to think pesticides are dangerous than those who primarily use online-only news. Moreover, although the Internet allows its users to selectively expose themselves to information that aligns with their generally optimistic attitudes toward science, and hence attenuates their risk perceptions, the same does not apply to printed media and television. These findings further our understanding of the different roles of printed, broadcast and online-only media in communicating about chronic environmental risks. Although previous studies have characterized mass media as an amplifier of environmental risk, this function may vary as the media landscape becomes increasingly fragmented. Future studies should develop a more coherent framework explaining the role of mass media in communicating risks under different circumstances and develop risk communication strategies tailored to individuals’ news use behaviors.

P.152  Food Fraud and Consumer Risk Perception in Quebec (Canada). De Marcellis-Warin N*, Peignier I; Ecole Polytechnique de Montreal   ingrid.peignier@cirano.qc.ca
Risk Communication

Abstract: Food fraud occurs when there is intentional misrepresentation to consumers of the nature, origin or ingredients of a food product. This may involve diluting a product with lower-quality materials or ingredients (for example, an olive oil cut with hazelnut oil and sold as extra-virgin, or the 2013 horse meat scandal involving beef burgers containing horse meat), adding foreign substances to the original product (like the 2008 Chinese milk scandal, in which infant formula was adulterated with melamine), selling non-organic products as “organic”, misrepresenting the animal or fish species, or using a misleading label. Food fraud can therefore have not only economic consequences but also consequences for public health. What are consumers' concerns and perceptions of the risks associated with food fraud? How well informed are consumers regarding these risks? How willing are they to change their purchasing behaviour given a specific «Zero Food Fraud» certification? The Barometre CIRANO 2017 seeks to answer these questions for the Quebec population. It is a unique tool for better understanding Quebeckers’ current concerns and risk perceptions and for identifying the determinants of the social acceptability of more than 40 major issues. It consists of a large online survey conducted each year since 2011 with more than 1,000 persons representative of the Quebec population. The 2017 edition covered food fraud in greater depth, including a specific case study. Taking consumers' risk perceptions, confidence and knowledge into account before choosing or implementing solutions to manage food fraud is very important to ensure that the measures in place, and those planned, are effective and earn the public's trust. Measures in place must not only be an expenditure for industry but must also represent value for the consumer.

P.153  Risk Perceptions of Lone-Wolf Terrorist Threats and Policy Preferences for Government Counterterrorism Spending: Evidence from a U.S. National Panel Survey. Liu X, Mumpower JL*, Portney KE, Vedlitz A; Texas A&M University   jmumpower@tamu.edu
Risk Communication

Abstract: Using data from a two-wave U.S. national survey (May and November 2016), we replicated and extended past research. First, we investigated the ability of sociodemographic variables and other individual-level characteristics to predict public risk perceptions about lone-wolf terrorist threats. Second, we compared the ability of psychometric variables (severity of consequences, levels of public and scientific understanding, number affected, and likelihood) and sociodemographic ones to predict public risk perceptions. The results supported previous research that found both psychometric and sociodemographic variables were significant predictors of perceived risk, but that psychometric variables were generally stronger. Third, we assessed how well perceived risk, sociodemographic variables, and other individual-level characteristics can predict preferences for national and local government spending levels for counterterrorism. We replicated previous research which found that perceived risk was a significant predictor of preferred spending levels and that adding sociodemographic variables somewhat improved the ability to make such predictions. Fourth, we tested models to predict changes in public risk perceptions of terrorist threats between Wave 1 and Wave 2. We found that no sociodemographic variables were significant predictors, but that changes over time in several of the psychometric variables significantly predicted changes in perceived risk levels. Fifth, we found that change in levels of perceived risk over time was a significant predictor of change over time in preferred levels of government counterterrorism spending, and that predictive ability was somewhat enhanced by adding certain individual-level characteristic variables to the model, particularly change in perceptions of governmental competence.

P.154  Who moved my coffee? Using psychological distance to frame climate change impacts. Chu H.*, Yang J.; University at Buffalo, State University of New York   hchu5@buffalo.edu
Risk Communication

Abstract: Framing is an effective communication technique to motivate the public to engage in climate change mitigation and adaptation behaviors (Nisbet, 2009). However, existing empirical evidence has been inconsistent (e.g. Hart, 2011; Hart & Nisbet, 2012). Among different framing strategies, those assuming stronger personal relevance usually demonstrate higher effectiveness in motivating people to acknowledge the anthropogenic causes of climate change (ACC) or to support mitigation policies (Myers et al., 2012; Spence & Pidgeon, 2010). Although not consistently examined in these studies, psychological distance may be the underlying mechanism behind issue framing effects identified in various existing studies (Liberman & Trope, 2008; Rickard et al., 2016). Building on existing research, this study employs audiovisual messages highlighting climate change impacts that are either close or far in spatial and social distance, to investigate whether psychological distance is indeed an effective lens through which to frame climate change impacts. Results are largely consistent with existing research. Theoretical and practical implications of the findings will be discussed.

P.155  Perception and acceptance of HPV vaccination: Evaluating the impacts of message framing, motivation, cultural cognition and gender in a cross-country context. Liu S.*, Yang J.; University at Buffalo, State University of New York   sixiaoli@buffalo.edu
Risk Communication

Abstract: Human papillomavirus (HPV) is the most common sexually transmitted virus worldwide, and the prevention of HPV transmission is an important public health issue. It has been widely accepted that HPV vaccination holds great promise to prevent the infection of HPV, which in turn reduces the contraction rates of HPV-induced diseases (e.g., genital warts, cervical cancers). This research aims to examine the influence of message framing (gain vs. loss), motivation orientation (approach vs. avoidance), cultural cognition, and gender on young adults’ perceptions of risks associated with HPV and HPV vaccines, as well as their intentions to get vaccinated. Also, given the newly approved HPV vaccines in mainland China, this research will compare the framing effects in a cross-cultural setting, based on undergraduate participants recruited from both the U.S. and China.

P.156  The influence of narrative and participatory drama on social interaction and efficacy around health and environmental issues in Malawi. Young CE*, McComas KA; Cornell University   cey26@cornell.edu
Risk Communication

Abstract: This project tests the role of narrative and participatory drama as creative methods of communication in response to the impacts of climate change and other environmental and social pressures in sub-Saharan Africa. Specifically, the research seeks to understand the influence that these communication methods have on increasing social interaction, engagement, attitudes, and efficacy around inter-related sustainability issues among smallholder farmers in Malawi. In order to test the role of drama, 500 farmers in two regions of the country participated in an integrated curriculum on climate change, agroecology, soil health, health and nutrition, and social equity. Half of the participants used stories and drama in their training, and the other half acted as a control group using small group discussions. This poster includes some of the key quantitative (N=500) and in-depth interview (N=47) findings. The poster focuses on those findings related to risk perceptions and increased confidence around new ideas and innovations. The research draws on theories around health, environment, science, and risk communication, and considers the complex interrelationships between multiple sustainability issues, including health & nutrition, climate change, social equality, and agroecology & soil health. It also considers the communication of sensitive social issues, such as HIV/AIDS, alcoholism, and violence in the household. This project builds on previous research by offering unique context and scale, as well as a focus on communicating the relationship among multiple health, environmental, and social issues.

P.158  Perceptions of risk and uncertainty in climate-adaptive forestry. Findlater KM*, Peterson St-Laurent G, Hagerman S, Kozak R; University of British Columbia   k.findlater@alumni.ubc.ca
Risk Communication

Abstract: Climate change is increasingly impacting ecosystems worldwide, notably by shifting species distributions and increasing the frequency and severity of natural disturbances. In the forest sector – economically vital to the province of British Columbia (BC), Canada – climate-adaptive practices can address the risk that trees planted today will be mismatched with future local climates. Ongoing advances in genetics are enabling forward-looking and genomically-informed reforestation programs that assist with the migration of trees to better match predicted future climates and better withstand changes in pest and pathogen regimes. However, public and stakeholder perceptions of risk and uncertainty in reforestation strategies like assisted gene flow (within species range) and assisted migration (beyond species range) are largely unknown, with the potential for hesitation, concern and resistance to climate-adaptive practices stemming from biotechnological advances. This study seeks to understand public and stakeholder conceptions of risk and uncertainty in climate-adaptive forest management practices, broadly, and genomically-informed reforestation, in particular. It evaluates perceptions of such practices in BC through structured focus groups (stakeholders) and an online survey (public). This mixed-methods approach provides a rich understanding of the drivers of perceived risk and uncertainty created by climatic and technological changes, and important sources of misalignment with experts’ perceptions. In this talk, we present preliminary results about how the malleability of support might play out in a deliberative context. In particular, the survey data suggest that even brief consideration of ecological, economic and scientific trade-offs shifts participants’ expressed risk perceptions. These findings will help policy-makers better understand sociocultural and political barriers to climate-adaptive practices in mixed natural/managed landscapes.

P.159  The salience of environmental hazards: Making sense of citizen concerns and their implications for risk communication. Binder AR*; North Carolina State University   arbinder@ncsu.edu
Risk Communication

Abstract: For the public policy process to respond successfully to citizen concerns about environmental hazards, an understanding of how citizens think about these hazards is paramount. A straightforward approach to an assessment would be to list a number of environmental hazards and ask citizens which one is the most important problem in their opinion. Indeed, this is the approach represented by a perennial question in the General Social Survey of citizens in the United States. An alternative to this approach would be to ask the question in a more open-ended way, e.g., “Of all the environmental problems that you might be able to think of, which one is the most important?” This inquiry compares and contrasts the usefulness of these different approaches for informing environmental policy and risk communication. Using a nationally representative survey of U.S. citizens from 2013, I take a multi-methodological approach. From a qualitative perspective, I analyze the open-ended responses (which had no space constraints on survey respondents) to identify common hazards named by survey respondents. This analysis lends itself to developing a typology of environmental themes (broader than specific hazards). From a quantitative perspective, I then test whether or not the environmental themes that emerged from the qualitative analysis are significantly related to a variety of independent variables: values (political ideology, environmentalism, religiosity), social trust (trust in scientists and other public figures), news media consumption, and factual scientific knowledge, among others. Finally, a comparison is conducted between the origins of the open-ended survey responses and the closed-ended responses to see how the question format influences different people in different ways. Implications for policymaking and risk communication are discussed.

P.160  Current situation of emergency and long-term responses on community risks by chemical accidents. Murayama TM*, Imanaka IA, Nishikizawa NS, Nagaoka NA; Tokyo Institute of Technology   murayama.t.ac@m.titech.ac.jp
Risk Communication

Abstract: Emergency response, as well as long-term mitigation of the environmental and social impacts induced by chemical accidents, is an important issue for industrial society. In Japan, there have been approximately 200 such accidents annually over the last ten years. The Japanese national government has published a guideline on impact assessment for chemical accidents, and local governments hosting chemical complexes are required to compile manuals on emergency response to chemical accidents. While those activities may be partially effective for managing some impacts of accidents, less attention has been paid to the community risks surrounding chemical complexes. Through a questionnaire survey of local governments, we clarify the current state of emergency and long-term responses to community risks, as well as the overall procedures and organizational structures for managing risks induced by chemical accidents.

P.162  The impact of advocacy by scientists on credibility and citizens' deference on specific issues. Stenhouse N*, Vraga E, Myers T, Kotcher J, Beall L, Maibach E; University of Wisconsin-Madison   stenhouse@wisc.edu
Risk Communication

Abstract: Scientists wishing to lend their voice to policy debates often worry that advocating positions on polarized issues may reduce the credibility of scientists in the eyes of the public. As well as fearing credibility loss, scientists fear that citizens or policymakers will be less likely to defer to them – to take their advice on specific scientific issues seriously – if they are seen as advocates. The present study (data collected, but not yet analyzed) measures the impact of advocacy messages by scientists on these two outcomes – deference and credibility. A previous study has shown that some forms of advocacy by scientists on social media do not substantially harm credibility. However, it might be the case that advocacy harms deference but not credibility – perhaps citizens perceive scientists who advocate as just as credible, per se, as those who do not, but still become less willing to defer to their expertise. Without measuring both these effects, we cannot be certain of what impact advocacy has on both these important outcomes. We look at the impact of scientists’ advocacy messages on credibility and deference on a large quota sample (N=2451) of US adults. The advocacy messages related to four issues – climate change, severe weather, influenza, and marijuana use. In addition to testing the impact of these messages on citizens’ willingness to defer to scientists on these specific issues, we measure the “spillover” impact on deference for other issues such as biotech crops and nuclear power.

P.165  Thematic Mapping of Cyber Security and Cyber Security Risk: Expert Elicitation of Researchers and Practitioners. Taber DL*, King ZM, Cains MG, Henshel DS; Indiana University   mgcains@indiana.edu
Security and Defense

Abstract: The National Initiative for Cybersecurity Careers and Studies defines cyber security as an activity or process that protects and/or defends information and systems against damage, unauthorized use or modification, or exploitation. Given the interdisciplinary nature of cyber security research, it is important that researchers from disparate disciplines share a common understanding of what is meant by cyber security and cyber security risk. Common definitions of cyber security and cyber security risk are imperative when conducting research across disciplinary and sectoral (i.e. academia, government) boundaries. In an attempt to identify this common understanding, researchers in the Cyber Security Collaborative Research Alliance (CSec CRA) conducted interviews on these topics with experts in academia and the U.S. Army. The common understandings were determined through thematic analysis of the interview corpus. Thematic analysis is commonly used in qualitative research to identify overarching patterns, or themes, that are expressed both implicitly and explicitly across datasets. This method draws on the practice of developing theory from trends via systematic investigation of qualitative data (standardized in the social sciences as Grounded Theory). The results of refined thematic analysis can be visually summarized in thematic “maps” which are useful in distilling and relating commonly-held perceptions of cyber security and cyber security risk across diverse issues and stakeholder groups. The coding process represented in thematic mapping illustrates the analytical consolidation of similar ideas into representative themes. The research presented here provides a comparison of thematic maps of both cyber security and cyber security risk that were generated from interviews across the disciplines of academic research and Army practice.

P.166  Hazard Assessment of Ethylbenzene for Potential Impacts to National Defense. Rak A*, Vogel CM, Bandolin N; Noblis and US Army Public Health Center   andrew.rak@noblis.org
Security and Defense

Abstract: The Department of Defense’s (DoD’s) Chemical and Material Risk Management (CMRM) Program has a well-established three-tiered process for over-the-horizon scanning for Emerging Contaminants, conducting qualitative and quantitative impact assessments in critical functional areas, and developing sound risk management options. This “Scan-Watch-Action” process was used to examine potential risks from increased regulation of ethylbenzene under the Toxic Substances Control Act (TSCA), the Environmental Protection Agency’s Integrated Risk Information System (IRIS) Program, and the European Union Registration, Evaluation, Authorization and Restriction of Chemicals (REACH) regulation. Subject matter experts (SMEs) from throughout the DoD used the Emerging Contaminants Assessment System (ECAS) tool to evaluate the potential risks to DoD associated with this mission-critical chemical. Members of the CMRM Program team used the Impact Assessment Criteria Assessment Tool (ICAT) to analyze SME input. Together, these two groups developed a set of initial risk management options (RMOs) to be considered within the DoD. The risks identified by the SMEs and the potential RMOs are presented for each of five functional areas. The uncertainties in the SMEs’ risk estimates are also discussed, and recommendations for further analysis are presented. The assessment concluded that increased regulation of ethylbenzene poses moderate or high risk to three DoD functional areas: Acquisition/Research, Development, Testing, and Evaluation; Site Cleanup; and Environment, Safety, and Health. Risk management actions are required to develop safer alternatives that meet military performance requirements. The CMRM Program will continue to monitor regulatory and scientific developments, with a specific focus on potential changes under TSCA Section 6 and the IRIS Program.

P.167  National Academies decadal survey of social and behavioral sciences for national security. Bhatt S, Schuck JA*; National Academies of Sciences, Engineering, and Medicine   jschuck@nas.edu
Security and Defense

Abstract: The National Academies of Sciences, Engineering, and Medicine is conducting a decadal survey of research opportunities in the social and behavioral sciences that can contribute to national security. Decadal surveys are used to assess and project research possibilities for the coming decade. A committee of experts has been appointed to carry out this work, which will address a broad range of areas, including assessment, characterization, and communication of risk. This poster will identify the members of the committee and their statement of work in carrying out the decadal survey. It will also identify how the scientific community and other allied professionals can provide input to the decadal survey. More information on the decadal survey is available at http://nas.edu/SBSDecadalSurvey.

P.168  The risk assessment of pesticide residue, Fluopyram, in tea in Taiwan. Huang J*, Wu KY; National Taiwan University   jessewww@gmail.com

Abstract: Fluopyram, a novel broad-spectrum fungicide from the pyridinyl-ethyl-benzamide class, acts by inhibiting the enzyme succinate dehydrogenase (SDH). In chronic and carcinogenicity exposure studies, fluopyram can damage the liver: effects noted at lower doses included increased liver weights and hepatocellular hypertrophy, whereas effects at higher doses included hepatocellular degeneration or necrosis. The existing database on fluopyram is also adequate to characterize the potential hazards to the reproductive systems of fetuses, infants, and children. This study conducted a risk assessment of fluopyram in Taiwan to examine what an appropriate maximum residue limit (MRL) for tea would be. We determined the intake (unit: grams/person/day) of 29 kinds of fruits and vegetables on which fluopyram is legally allowed, using Taiwanese consumption data from the 2016 National Food Consumption Database. The MRL data (unit: ppm) were provided by the Food and Drug Administration (FDA). We then calculated the maximal daily intake and estimated the unknown MRL for tea, using a benchmark dose to derive a reference dose (RfD) in place of the NOAEL. The resulting estimate for the tea MRL is 6.59 ppm. Compared with the 6 ppm value set by the Council of Agriculture, our estimate is slightly higher but close. On this basis, we can judge whether it is appropriate to allow fluopyram use on tea and explore risk communication with society. This study demonstrates the relationship between residue levels and health risk. We suggest that the government pursue long-term qualitative and quantitative studies of fluopyram in foods, and we call for a reduction in the misuse of fluopyram.
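The back-calculation underlying the tea MRL estimate can be illustrated numerically. The following is a minimal sketch under placeholder assumptions: the body weight, acceptable daily intake (ADI), consumption figures, and MRLs are invented for illustration and are not the study's data.

```python
# Hypothetical illustration of back-calculating an MRL for one commodity from
# a daily-intake budget. All numbers below are placeholders, not study data.

BODY_WEIGHT_KG = 60.0          # assumed adult body weight
ADI_MG_PER_KG = 0.01           # hypothetical acceptable daily intake

# (consumption in g/person/day, MRL in ppm = mg residue per kg food) for foods
# where the pesticide is already permitted -- placeholder values
foods = {
    "apple":  (100.0, 0.5),
    "tomato": (80.0,  0.9),
    "grape":  (50.0,  2.0),
}
tea_consumption_g = 10.0       # hypothetical daily tea intake

# Theoretical maximum daily intake (TMDI) from the currently permitted foods
tmdi_mg = sum(g / 1000.0 * mrl for g, mrl in foods.values())

# Residual "intake budget" that tea may occupy without exceeding the ADI
budget_mg = ADI_MG_PER_KG * BODY_WEIGHT_KG - tmdi_mg

# Back-calculated maximum residue limit for tea (ppm = mg/kg)
tea_mrl_ppm = budget_mg / (tea_consumption_g / 1000.0)

print(f"TMDI from existing uses: {tmdi_mg:.4f} mg/day")
print(f"Estimated allowable tea MRL: {tea_mrl_ppm:.2f} ppm")
```

A real assessment would use an RfD derived from a benchmark dose, population consumption distributions, and all 29 commodities, but the intake-budget arithmetic has this shape.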

P.171  Biphasic low-dose patterns of inhibition-activation for three nuclear receptors linked to suppressed apoptosis, cell proliferation, and tumorigenesis: HSP70, Nrf2, and CAR. Bogen KT*; Exponent, Inc., Health Sciences   kbogen@exponent.com

Abstract: Evaluations of potential health risks associated with low-level environmental exposures often hinge on low-dose dose-response extrapolations and associated mechanistically informed expectations that typically cannot be verified experimentally. The multistage somatic mutation (MSM) theory of cancer underlies default regulatory assumptions that increased cancer risk—which often dominates other potential impacts—may occur from chronic, low-level exposures to genotoxic chemical carcinogens with a linear-no-threshold (LNT) dose-response. Recent observations challenge this assumption, and thus also challenge recent proposals to apply LNT risk extrapolation for non-cancer endpoints. Examples illustrated here are significantly biphasic (U-shaped) patterns of activation by three highly conserved, ultrasensitive nuclear-receptor-type molecular switches: a heat shock protein (HSP70) involved in chaperoning misfolded proteins, the Nrf2 anti-oxidant response pathway, and the human constitutive androstane receptor (hCAR). Via weighted nonlinear model regression implemented using Mathematica software, biphasic in vitro activation patterns are shown to occur for HSP70 in murine embryo fibroblast NIH-3T3 cells (using data presented by Beckham et al., Photochem Photobiol 2004; 79(1):76–85), for seven Nrf2 agonists in human liver HepG2 cells (using data obtained by Shukla et al., Environ Health Perspect 2012; 120(8):1150–6), and for two rodent-liver-tumor-promoting aminoazo hCAR agonists (4-aminoazobenzene and ortho-aminoazotoluene), also in HepG2 cells (using hCAR activation data archived in 2017 by the National Center for Biotechnology Information). Each such low-dose pattern of inhibition/activation is shown to incorporate a highly significant negative initial linear slope and an overall U-shaped response.
These observations highlight sources of fundamental uncertainty in dose-response extrapolation that complicate health risk assessment and management for environmental chemicals that induce toxic endpoints mediated by these important nuclear receptors.
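The weighted nonlinear regression approach described above (the study used Mathematica) can be sketched in Python with synthetic data. The quadratic model form, noise level, and parameter values here are illustrative assumptions, not the study's.

```python
# Minimal sketch of fitting a biphasic (U-shaped) dose-response by weighted
# nonlinear least squares. Synthetic data; not the study's model or measurements.
import numpy as np
from scipy.optimize import curve_fit

def biphasic(dose, a, b):
    # Fold-activation relative to control: a negative initial linear slope
    # (a < 0) plus an upturn at higher doses (b > 0) yields a U shape.
    return 1.0 + a * dose + b * dose**2

rng = np.random.default_rng(0)
dose = np.linspace(0.0, 10.0, 12)
sd = np.full_like(dose, 0.05)                  # per-point measurement SDs
y = biphasic(dose, a=-0.15, b=0.03) + rng.normal(0.0, sd)

# sigma with absolute_sigma=True makes this a weighted least-squares fit
popt, pcov = curve_fit(biphasic, dose, y, sigma=sd, absolute_sigma=True)
a_hat, b_hat = popt
a_se = np.sqrt(pcov[0, 0])

print(f"initial slope a = {a_hat:.3f} +/- {a_se:.3f}")
print(f"upturn coefficient b = {b_hat:.3f}")
```

Testing whether the fitted initial slope is significantly negative (e.g., a_hat + 1.96·a_se < 0) mirrors the abstract's "highly significant negative initial linear slope" criterion.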

P.172  Applying a Global Sensitivity Analysis Workflow to Improve Computational Efficiency in Physiologically-Based Pharmacokinetic Modeling. Hsieh NH*, Reisfeld B, Bois FY, Weihsueh WA; Department of Veterinary Integrative Biosciences, Texas A&M University   nhsieh@cvm.tamu.edu

Abstract: Population physiologically-based pharmacokinetic (PBPK) models have come to play a key role in toxicological dose-response assessments, but can be computationally intensive to calibrate. The purpose of this study is to apply global sensitivity analysis (GSA) to ascertain which PBPK model parameters are nonidentifiable, and therefore can be assigned fixed values in Bayesian parameter estimation, increasing computational efficiency with minimal bias. We illustrate this approach using a published human population PBPK model for acetaminophen (APAP) and its two major metabolites, APAP-glucuronide and APAP-sulfate. The Morris Elementary Effects method and variance-based Sobol indices (estimated using three different algorithms) were used to determine the sensitivity of parameters in the original model. Unsupervised hierarchical clustering was used to distinguish between sensitive and insensitive parameters. We compared Bayesian model calibration results using the "original" versus "original sensitive" parameters. We then expanded our GSA to encompass all PBPK parameters, including those fixed in the published model, comparing the model calibration results using "all" versus "all sensitive" parameters. The three variance-based GSA estimators gave similar results, which differed from the results of the Morris method. We found that 12 of the 21 original parameters have low sensitivity, and could be fixed to improve computational efficiency without discernible changes in prediction accuracy or precision. We further found 10 additional sensitive parameters among the parameters that were previously fixed in the published PBPK model. Adding these additional sensitive parameters improved model performance beyond that of the original publication, while maintaining similar computational efficiency. We conclude that GSA provides an objective, transparent, and reproducible approach to improve the performance and computational efficiency of PBPK models.
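For readers unfamiliar with the Morris method, a bare-bones elementary-effects screen on a toy function illustrates the kind of sensitive-versus-insensitive parameter ranking the workflow relies on. The surrogate model and all settings below are placeholders, not the APAP PBPK model.

```python
# Morris elementary-effects screening, stripped to its core: one-at-a-time
# steps along random trajectories, summarized by mu* (mean |elementary effect|).
import numpy as np

def model(x):
    # Toy surrogate: x0 and x1 matter strongly, x2 weakly, x3 not at all.
    return 10 * x[0] + 5 * x[1] ** 2 + 0.1 * x[2] + 0.0 * x[3]

def morris_mu_star(f, k, r=50, delta=0.5, seed=1):
    """mu* (mean absolute elementary effect) for k parameters on [0, 1]."""
    rng = np.random.default_rng(seed)
    ee = np.zeros((r, k))
    for t in range(r):
        x = rng.uniform(0.0, 1.0 - delta, size=k)   # room to step by +delta
        y0 = f(x)
        for i in rng.permutation(k):                # one-at-a-time steps
            x2 = x.copy()
            x2[i] += delta
            ee[t, i] = abs(f(x2) - y0) / delta
            x, y0 = x2, f(x2)
    return ee.mean(axis=0)

mu_star = morris_mu_star(model, k=4)
ranking = np.argsort(mu_star)[::-1]
print("mu* :", np.round(mu_star, 3))
print("most -> least influential parameter index:", ranking)
```

Parameters with mu* near zero (here x3) are candidates for fixing before Bayesian calibration; the study paired such screens with Sobol indices and hierarchical clustering to make the sensitive/insensitive cut objectively.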

P.173  Physiologically Based Pharmacokinetic (PBPK) Modeling of Interstrain Variability in Perchloroethylene Metabolism in Mice. Dalaijamts C*, Cichocki JA, Luo YS, Rusyn I, Chiu WA; Texas A&M University   cdalaijamts@cvm.tamu.edu

Abstract: Perchloroethylene (perc) organ-specific toxicity has been associated with both oxidative and conjugative metabolism pathways. Previous perc PBPK modeling could accurately predict oxidation but suggested the need to better characterize glutathione (GSH) conjugation as well as toxicokinetic uncertainty and variability. We updated the previously published “harmonized” perc PBPK model for mice to characterize the uncertainty and variability of perc toxicokinetics, with particular emphasis on modeling GSH conjugation metabolites. The updated PBPK model includes physiologically based sub-models for the conjugation metabolites trichlorovinyl glutathione (TCVG), trichlorovinyl cysteine (TCVC), and N-acetyl trichlorovinyl cysteine (NAcTCVC), and adds a brain compartment for perc and the oxidative metabolite trichloroacetic acid (TCA). Previously compiled mouse kinetic data on perc and TCA in B6C3F1 and Swiss mice were augmented with data from a recent study in male C57Bl/6J mice that measured perc, TCA, and GSH conjugation metabolites in serum and multiple tissues. A hierarchical Bayesian population approach was used to estimate model parameters and characterize the uncertainty and interstrain variability, implemented using Markov chain Monte Carlo (MCMC) simulation. All convergence criteria were satisfied with four MCMC chains, each 100,000 iterations long. The updated model performed as well as or better than the previously published model. Tissue dosimetry for both oxidative and conjugative metabolites was successfully predicted across the three strains of mice, with estimated residual errors within 2-fold for the majority of the data. Inter-strain variability across the three strains was evident for oxidative metabolism; GSH conjugation data were only available for one strain. The updated PBPK model fills a critical data gap in quantitative risk assessment by providing predictions of the internal dosimetry of perc and its oxidative and GSH conjugation metabolites.
It also lays the groundwork for future studies to characterize perc toxicokinetic variability.
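A drastically simplified sketch of Bayesian toxicokinetic calibration, shrunk to a one-compartment model with a random-walk Metropolis sampler and synthetic data: the actual study used a multi-compartment perc PBPK model, a population-level hierarchy across strains, and formal convergence diagnostics, none of which appear here.

```python
# Toy Bayesian calibration: estimate an elimination rate ke from noisy
# concentration data via random-walk Metropolis. Synthetic placeholders only.
import numpy as np

def conc(t, ke, c0=10.0):
    return c0 * np.exp(-ke * t)       # one-compartment elimination

rng = np.random.default_rng(2)
t_obs = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
y_obs = conc(t_obs, ke=0.3) * np.exp(rng.normal(0.0, 0.1, t_obs.size))

def log_post(log_ke):
    # Lognormal error model (sd 0.1 on log scale) + normal prior on log ke
    resid = np.log(y_obs) - np.log(conc(t_obs, np.exp(log_ke)))
    return -0.5 * np.sum((resid / 0.1) ** 2) - 0.5 * (log_ke + 1.0) ** 2

chain, cur, lp = [], np.log(0.1), log_post(np.log(0.1))
for _ in range(5000):                 # random-walk Metropolis updates
    prop = cur + rng.normal(0.0, 0.1)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        cur, lp = prop, lp_prop
    chain.append(cur)

ke_post = np.exp(np.array(chain[1000:]))   # drop burn-in
print(f"posterior mean ke: {ke_post.mean():.3f} (true value 0.3)")
```

In the hierarchical version, each strain (or animal) gets its own ke drawn from a population distribution whose mean and variance are also sampled, which is what lets the model separate uncertainty from interstrain variability.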

P.174  Assessing the Risk of Maritime Accidents. Large PJ*, Zouhair F; U.S. Coast Guard   paul.j.large@uscg.mil

Abstract: As part of the rulemaking process, Federal Agencies need to identify, quantify, and where possible monetize the benefits of proposed regulations. This need becomes especially salient when the proposed regulation is intended to prevent or mitigate the impacts of accidents in an industry where the consequence distribution is heavily right-skewed. The maritime industry faces risks of low probability and high consequence, which are often difficult to model. The U.S. Coast Guard (USCG) is responsible for promulgating regulations that prevent or mitigate maritime accidents. Because the possible impacts from these events can be significant, they may be of particular interest to policymakers. The goal of the analysis is to predict the risks of worst-case accidents and to identify return levels and return periods using extreme value theory alongside competing models. Such risk assessment is essential for policymakers designing regulations that manage risk by mitigating Black Swan events and promoting safety. This analysis is based on USCG commercial vessel accident data from 2007-2016 and models the risk curve for resulting fatalities and injuries. The USCG accident data contain information related to marine investigations reportable under 46 C.F.R. 4.03. The data reflect information collected by USCG personnel concerning vessel accidents throughout the United States and its territories.
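The return-level concept from extreme value theory can be illustrated with a toy peaks-over-threshold analysis. The data, threshold choice, and observation period below are synthetic placeholders, not USCG figures.

```python
# Peaks-over-threshold sketch: fit a generalized Pareto distribution (GPD) to
# threshold exceedances, then compute T-year return levels. Synthetic data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
years = 10
losses = rng.pareto(2.5, size=2000) * 5        # synthetic accident severities

u = np.quantile(losses, 0.95)                  # exceedance threshold
exc = losses[losses > u] - u
rate = exc.size / years                        # exceedances per year

# Fit the GPD to the exceedances, with location fixed at zero
xi, _, sigma = stats.genpareto.fit(exc, floc=0)

def return_level(T):
    # Severity expected to be exceeded on average once every T years
    m = T * rate
    return u + sigma / xi * (m ** xi - 1)

for T in (10, 50, 100):
    print(f"{T:>3}-year return level: {return_level(T):.1f}")
```

The shape parameter xi governs how quickly return levels grow with the return period; heavy-tailed (xi > 0) fits are exactly the "skewed right" regime the abstract describes, where worst-case outcomes dominate expected regulatory benefits.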

P.175  Field Evaluations of Newly Available “Interference-free” O3 Monitors and 2-10 meter near-ground O3 gradients. Ollison WM*, Leston AR; American Petroleum Institute and AirQuality Research & Logistics, LLC   ollisonw@api.org

Abstract: Metal oxide scrubbers ideally remove only ozone (O3) from the scrubbed reference streams of conventional O3 photometers deployed in the U.S. O3 standard compliance network; in reality, however, they change ambient vapor levels of water, mercury, and 254 nm-absorbing aromatic species, ultimately enhancing reported O3 levels when the reference is subtracted from the un-scrubbed sample stream. Recent comparisons of Federal Equivalent Method (FEM) conventional photometers (doi.org/10.1080/10962247.2017.1339645) with newer FEM photometers equipped with post-scrubber Nafion™ humidity equilibration (e.g., 2B Technologies 202/205) or gas-phase nitric oxide scrubbers (e.g., 2B 211), which also eliminate Hg and UV-active aromatic compound interference, show that the newer instruments improve air quality standard compliance. Newer 2B heated graphite-scrubbed photometers (doi.org/10.5194/amt-10-2253-2017) also reduce photometer artifacts many-fold and avoid the gas handling burdens of NO-based scrubbers. Currently allowed U.S. O3 compliance monitor inlet heights range from 2 to 15 meters (m) above ground level (AGL), averaging 5.4 m at urban and 10 m at rural compliance locations. Previous near-ground O3 gradient studies (Atm. Environ. 32: 1317-1322, 1998) report 20% O3 drops over a 4 m to 0.5 m inlet height range under stable conditions, and up to 7% decreases under well-mixed conditions. Human nose heights are typically 1-2 m AGL, so 2 m inlets best approximate outdoor population exposures. The use of newer FEM O3 photometers with 2 m inlets provides both improved air quality compliance and more realistic health risk assessments.

P.176  A Review of Non-Chemical Stressors and Their Importance in Cumulative Risk Assessment. Hibbert K*, Tulve NS; U.S. Environmental Protection Agency   hibbert.kathleen@epa.gov

Abstract: Non-chemical stressors are factors found in built, natural, and social environments, including physical and psychosocial factors. Extant research has shown correlations between non-chemical stressors found in a child’s social environment (e.g., food security, violence) and changes in children’s health and well-being. However, limited data are available on the interrelationships between chemical and non-chemical stressors and children’s health. Children may be more vulnerable to combined interactions of chemical and non-chemical stressors due to their developmental stage and lifestage-specific activities/behaviors. The objectives of this review were to 1) examine the state-of-the-science of non-chemical stressors found in a child’s social environment and 2) statistically rank and prioritize those stressors. A systematic review of non-chemical stressors found in a child’s social environment was performed on the extant literature. Combinations of search strings (e.g., acculturation + health + child*) were entered into PubMed and PsycINFO. Inclusion criteria resulted in 244 articles. The available non-chemical stressor data from the articles were extracted for statistical analysis and classified into 11 topic categories: acculturation, adverse childhood experiences, economic, education, food, greenspace, overcrowding, social support, stress, urbanization, and exposure to violence. Depending on the topic category, initial analyses suggested significant positive and negative impacts on children’s health. Preliminary analyses identified the most frequently reported non-chemical stressors, sub-categories of non-chemical stressors, the proportion of studies that considered multiple exposures involving at least one chemical and non-chemical stressor, and correlations between a non-chemical stressor and a health outcome.
Our research suggests that non-chemical stressors, in combination with chemical exposures, should be considered in cumulative risk assessment for children’s health.

P.177  Framework for Managing Risks under Ontario’s Local Air Quality Regulation. Gilmore J*, Jugloff D, Onica T, Grant C, Schroeder J; Ontario Ministry of Environment and Climate Change    James.Gilmore@Ontario.ca

Abstract: Ontario’s local air quality regulation governs contaminants released from industrial and commercial facilities. Air standards under the regulation are used to assess the environmental performance of regulated facilities and, when exceeded, drive actions to reduce emissions through technology and best practices. Risk is managed according to a framework developed in cooperation with public health agencies. Under the framework, modelling and monitoring information around a facility is evaluated and risk management actions are defined according to three levels of a contaminant. The air standard level is set at concentrations that are protective against adverse effects. The as-low-as-reasonably-achievable (ALARA) level exceeds negligible risk but remains within an acceptable range for risk management; it requires actions by a facility to reduce emissions as low as reasonably achievable. Exceeding the upper risk threshold (URT) level requires reporting and prompts timely action to reduce risks. For carcinogens, the standard and the URT are set at risk-specific concentrations equivalent to one-in-a-million (10-6) and one-in-ten-thousand (10-4) risk levels, respectively. For non-carcinogens, standards are set at concentrations well below levels where effects are observed, and URTs are generally set at 10 times the air standard. Using modelled receptor concentrations as maximum annual average levels, examples of carcinogens (benzene, benzo[a]pyrene, and chromium VI) are used to categorize various facilities. This characterization can then be used to direct actions through improvements in pollution control technologies and to develop communication material for the public. The framework allows the ministry to work with facilities to reduce risk, as much as possible, in local communities through an open and transparent process.
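The risk-specific concentration arithmetic for carcinogens described above can be sketched as follows; the inhalation unit-risk value used here is a placeholder for illustration, not an Ontario benchmark.

```python
# Risk-specific concentration sketch: an air standard at a 1e-6 lifetime risk
# and a URT at 1e-4 differ by a factor of 100 for any given unit risk.
UNIT_RISK_PER_UG_M3 = 7.8e-6     # hypothetical inhalation unit risk, (ug/m3)^-1

def risk_specific_conc(target_risk, unit_risk=UNIT_RISK_PER_UG_M3):
    """Air concentration (ug/m3) corresponding to a target lifetime cancer risk."""
    return target_risk / unit_risk

standard = risk_specific_conc(1e-6)   # air-standard level
urt = risk_specific_conc(1e-4)        # upper risk threshold level

print(f"air standard: {standard:.3g} ug/m3, URT: {urt:.3g} ug/m3")
print(f"URT / standard = {urt / standard:.0f}")
```

Modelled maximum annual-average receptor concentrations can then be compared against these two levels to place a facility in the standard, ALARA, or URT-exceedance category.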

P.178  Evaluation of ACGIH TLVs for Toluene Diisocyanate. Goodman JE*, Lynch HN, Prueitt RL, Mohar I; Gradient   jgoodman@gradientcorp.com

Abstract: In 2016, the American Conference of Governmental Industrial Hygienists lowered the 8-hr Threshold Limit Value - time-weighted average (TLV-TWA) for toluene diisocyanate (TDI) from 5 parts per billion (ppb) to 1 ppb, and the 15-minute short-term exposure limit (STEL) from 20 ppb to 5 ppb, with the intention of protecting against respiratory effects. We critically reviewed the human and animal evidence on which the TLVs were based and found that the human evidence indicates that maintenance of the previous 8-hr TLV-TWA and 15-minute STEL (5 ppb and 20 ppb, respectively, which were in effect from 1983 to 2016) is protective of occupational asthma (OA) in most workers, and is also protective of lung function decrements and other respiratory effects. Although some studies suggest OA cases may occur at TWA concentrations less than 5 ppb, many studies acknowledge the possibility that very high peak exposures, well above 20 ppb, may have contributed to OA onset. Advances in industrial hygiene measures have reduced peak exposures and the incidence of upset conditions such as spills and accidents, so these high peak exposures are unlikely to occur in modern TDI facilities. The animal literature supports the human evidence and indicates that TDI-induced asthma is a threshold phenomenon. The evidence does not indicate that the lower TDI TLVs will result in a lower incidence of respiratory effects, including OA.

P.179  Science in the news: the politicization of fracking. McClaran N*; Michigan State University   mcclaran@msu.edu

Abstract: Prior research has shown that if an agent (political or non-political) questions the inherent uncertainty of scientific findings, people are more inclined to be anxious about a scientific technology, to consider it to have higher risk, and to be less supportive of its adoption overall. This effect has been labeled the politicization of science. Although prior research has explored the politicization of science through one-paragraph summaries, no known research has tested whether this effect occurs within news articles, a common way to disseminate scientific information to the public. This study thus seeks to fill this gap in the literature. By manipulating news articles to (a) be framed as a conflict between either political ideologies or interest groups, or to have no conflict frame at all, and (b) either include or exclude a commonly found politicizing statement, this study tests the effects of politicization on perceived risk and support of a controversial scientific technology (i.e., hydraulic fracturing). Moreover, this study will be among the first to test whether politicization can occur implicitly through conflict frames. That is, does presenting fracking as an issue between two sides inherently cause people to be less likely to support fracking regulations due to increased skepticism of the science behind the issue? The results of this study have implications for how scientific technologies are portrayed in news coverage and provide further insight into why public support for pro-environmental issues continues to fluctuate despite accumulating scientific consensus.

P.180  Risk Assessment Guidance for Enzyme-containing Products. Kruszewski FH*; American Cleaning Institute   fkruszewski@cleaninginstitute.org

Abstract: The purpose of this guidance is to describe the potential health hazards of enzymes present in consumer products and to provide a framework for manufacturers of these products to conduct risk assessments that help ensure the safety of new products containing enzymes. Enzymes generally have good safety profiles. However, enzymes, like many other proteins, can act as allergens and induce the production of allergen-specific IgE antibody upon repeated inhalation or exposure to mucous membranes, which may lead to allergy symptoms, including asthma. The primary challenge associated with enzyme use is preventing the generation of allergen-specific antibody and the development of symptoms of Type 1 hypersensitivity. This hazard is the primary focus of the risk assessment for enzymes and must be managed carefully. Another hazard that should also be addressed is primary irritation of the eye and skin. If the risks posed by enzymes are not managed appropriately, the consequences may spread beyond a single product or company and could lead to unwarranted limitations on the use of enzyme technology in other consumer applications. It is recommended that companies using enzymes in consumer products responsibly consider how they manage enzyme safety, including the conduct of appropriate risk assessments and risk management programs. Such programs will include measures to manage exposures to enzymes, and their design should be developed on a case-by-case basis to address parameters specific to the type of product and its applications. Experience in the cleaning products industry demonstrates that the potential risk of adverse effects can be successfully managed by identifying the hazards, carefully assessing exposure, characterizing the risk, and then applying appropriate risk management. This guidance document outlines strategies and methods that have been used successfully by the cleaning products industry. An updated guidance document will be available in 2018.

P.181  Application of Livestock Shipment Models to Address Regional Risk of Disease Spread and Detection. Hallman CN*, Portacci K, Miller RS, Sellman S, Brommesson P, Beck-Johnson L, McKee C, Gorsich E, Tsao K, Tildesley M, Wennergren U, Lindström T, Webb C; 1,2,3,9 U.S. Department of Agriculture, 2150 Centre Ave, Fort Collins, CO; 4,5,11,12 Linkoping University, Linkoping, Sweden; 6,7,8,13 Colorado State University, Fort Collins, CO; 10 University of Warwick, Coventry, UK   clayton.n.hallman@aphis.usda.gov

Abstract: National patterns of livestock shipment play an important role in disease transmission risk and in the development of surveillance strategies intended to mitigate that risk. The US Animal Movement Model (USAMM) is a Bayesian model that simulates annual national-scale networks of county-level shipments of beef and dairy cattle. USAMM combines data on cattle shipments moving across state lines (Interstate Certificates of Veterinary Inspection) with covariates on the cattle industry from the national census (National Agricultural Statistics Service). The hierarchical Bayesian framework of the model allows both within-state and between-state shipments to be modeled at the county level. Simulated shipment networks are publicly available, and a web-based interactive Shiny application is available to visualize shipment patterns (https://usamm-gen-net.shinyapps.io/usamm-gen-net/). USAMM is valuable to a broad range of risk assessment projects and represents the only data available to represent livestock movements between counties. We illustrate the utility of USAMM in supporting risk assessments that address foreign animal disease spread, movement of at-risk livestock, and risk-based targeted disease surveillance. We also illustrate how USAMM-generated shipment networks can be used in conjunction with a disease spread model (US Disease Outbreak Simulator) to examine potential national-scale disease spread. We expect that USAMM will support a diversity of risk assessment and risk identification activities.
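USAMM itself is a hierarchical Bayesian model fit to inspection-certificate data, but the general idea of drawing a realization of a county-level shipment network and then treating shipments as potential transmission pathways can be sketched in miniature. The sketch below is purely illustrative: the Poisson shipment draws, the county labels, and the reachability analysis are this editor's assumptions, not part of USAMM or the US Disease Outbreak Simulator.

```python
import math
import random

def sample_shipment_network(rates, rng):
    """Draw one realization of a county-to-county shipment network.

    `rates` maps (origin, destination) county pairs to expected annual
    shipment counts; each realized count is drawn from a Poisson
    distribution (Knuth's algorithm, adequate for small rates)."""
    def poisson(lam):
        limit, k, p = math.exp(-lam), 0, 1.0
        while True:
            p *= rng.random()
            if p <= limit:
                return k
            k += 1
    return {pair: poisson(lam) for pair, lam in rates.items()}

def reachable_counties(network, seed):
    """Counties that could receive disease introduced in `seed`,
    following realized shipments as transmission pathways."""
    infected, frontier = {seed}, [seed]
    while frontier:
        origin = frontier.pop()
        for (src, dst), n in network.items():
            if src == origin and n > 0 and dst not in infected:
                infected.add(dst)
                frontier.append(dst)
    return infected
```

A risk assessment would repeat this over many sampled networks to estimate, for each county, how often it lies downstream of an introduction point.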

P.182  A Game-Theoretic Approach to Attacker-Defender Interaction in Cyber Systems. Outkin A.V.*, Eames B.K., Jones S.T., Vugrin E.D., Walsh S., Phillips C.A., Hobbs J.A., Galiardi M., Wyss G.D.; Sandia National Laboratories   avoutki@sandia.gov

Abstract: Modern society relies heavily on cyber-intermediated systems that affect nearly all aspects of modern life, including communications, energy, transportation, social networking, news provision, and other systems. Disruptions to those systems can be impactful, hard to detect, and hard to respond to. We present a game-theoretic approach to attacker-defender interaction in a resource contention game called PLADD (probabilistic, learning attacker and dynamic defender). We provide a comprehensive mathematical framework for analyzing dynamic attacker-defender interaction with incomplete information, which can be used to create simple, analytically tractable, yet practically insightful models for understanding cyber disruptions to these systems and their security. We build upon an existing model called FlipIt, extending it into a scenario in which a probabilistic attacker and a defender play for control over a resource. Using a martingale-based approach, we solve analytically for defender strategies and show how defender strategies affect attacker payoffs. We compare the analytical solution to a simulation and show how the simulation can be extended to analytically intractable scenarios. Finally, we discuss how PLADD can be extended into the domain of multi-resource games to represent more realistic attack scenarios.
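The flavor of a FlipIt-style contest, and of checking an analytical solution against simulation as the abstract describes, can be shown with a toy model that is not the authors' PLADD model: assume the defender retakes the resource on a fixed period `delta`, and after each takeover the attacker immediately launches an attack whose time-to-success is exponential with rate `lam` (a stand-in for a probabilistic attacker). Under these assumptions the attacker's long-run fraction of control has a closed form, E[max(0, delta - T)] / delta with T ~ Exp(lam), which Monte Carlo reproduces:

```python
import math
import random

def attacker_control_fraction(delta, lam, periods=100_000, seed=1):
    """Monte Carlo estimate of the attacker's long-run control fraction.
    Each defender period contributes max(0, delta - T) units of attacker
    control, where T is the attack's exponential time-to-success."""
    rng = random.Random(seed)
    held = 0.0
    for _ in range(periods):
        t = rng.expovariate(lam)
        held += max(0.0, delta - t)
    return held / (periods * delta)

def attacker_control_fraction_exact(delta, lam):
    """Closed form for the same quantity:
    E[max(0, delta - T)] = delta - (1 - exp(-lam*delta)) / lam."""
    return 1.0 - (1.0 - math.exp(-lam * delta)) / (lam * delta)
```

Shortening the defender period `delta` drives the attacker's control fraction toward zero, at the cost of more frequent defender moves, which is the central trade-off these games formalize.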

P.183  Application of a 3-D chemical fate prediction model for risk assessment of agricultural chemicals in Japanese river water. Kobayashi N*, Komatsubara Y, Eriguchi T, Ikarashi Y; National Institute of Health Sciences   norihiro.kobayashi@nihs.go.jp

Abstract: In order to ensure the safety of drinking water, many agricultural chemicals are monitored by many water suppliers in Japan. However, the analytical methods for agricultural chemicals in tap water are complicated, so applying them requires considerable labor and cost. In the present study, we developed a 3-D chemical fate prediction model and applied it to approximately 250 agricultural chemicals, the “Complementary Items” for tap water under the Japanese Waterworks Act. The model considers two forms of agricultural chemicals (particulate phase and dissolved phase) in the estuary, and can simulate the diffusion and sinking of agricultural chemicals from the loading points of their sources. As input data, the model requires flow field data (current velocity, water temperature, etc.), particulate matter (phytoplankton and detritus) concentrations, loading flux, and the physicochemical properties of the agricultural chemicals (log Koc and half-life in water). Run-off ratios 30 km downstream from the loading points and the mass balances of these agricultural chemicals in the river were estimated. The results obtained from this model can be used to select target chemicals for environmental monitoring in Japanese river water. Furthermore, the model can also be applied to human health and ecological risk assessments of agricultural chemicals. Improving the model’s predictive capability will be the focus of our next study.
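Two of the model ingredients named above, partitioning between dissolved and particulate phases via log Koc and loss via the half-life in water, can be sketched with standard textbook relations. The functions below are this editor's illustrative assumptions (equilibrium Kd = Koc x foc partitioning and simple first-order decay over a plug-flow travel time), not the authors' 3-D model:

```python
import math

def dissolved_fraction(log_koc, foc, spm_mg_per_L):
    """Fraction of a chemical in the dissolved phase, assuming
    equilibrium partitioning to suspended particulate matter (SPM)
    with organic-carbon fraction `foc`, so Kd = Koc * foc [L/kg]."""
    kd_L_per_kg = (10.0 ** log_koc) * foc
    spm_kg_per_L = spm_mg_per_L * 1e-6  # mg/L -> kg/L
    return 1.0 / (1.0 + kd_L_per_kg * spm_kg_per_L)

def runoff_ratio(half_life_days, distance_km, velocity_km_per_day):
    """Fraction remaining after first-order decay over the travel
    time from the loading point to `distance_km` downstream."""
    travel_days = distance_km / velocity_km_per_day
    k = math.log(2.0) / half_life_days  # first-order rate constant
    return math.exp(-k * travel_days)
```

For example, a chemical whose half-life equals the one-day travel time to the 30 km point would arrive at roughly half its loaded mass, before accounting for diffusion and sinking, which the full 3-D model adds.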

P.184  Effect of Risk Probability Disclosure on System Reliability: An Economic Experiment. Akai K*, Makino R, Takeshita J, Kudo T, Aoki K; Shimane University   akai@med.shimane-u.ac.jp

Abstract: The purpose of this study is to experimentally examine the effect of disclosing the risk probability of each unit in a production system on human behavior and on the resulting reliability of the production system, using human subjects in a laboratory. To this end, we used an economic experiment based on the theoretical model of Hausken (2002). To evaluate the effect of risk probability disclosure, we conducted one experiment in which the risk probability was disclosed to subjects and one in which it was not. We first conducted the non-disclosed-risk experiment and then the disclosed-risk experiment, within subjects, for both series and parallel systems. Our experimental results show that risk probability disclosure has two positive effects. First, subjects succeeded in improving system reliability while cutting back on effort to reduce the risk of their units when the risk probability was disclosed. In each system, the disclosed-risk condition achieved significantly higher system reliability on average than the non-disclosed-risk condition, although the average level of effort was significantly lower under the disclosed-risk condition. Second, disclosing the risk probability simplified the subjects’ decision-making process and reduced its cost, because subjects decided how much effort to exert based only on the risk probability information, without considering other factors such as the number of accidents.
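The series/parallel distinction underlying the experiment can be made concrete with the standard reliability identities (these are textbook formulas, not the Hausken (2002) strategic model): a series system works only if every unit works, while a parallel system works if at least one unit does.

```python
from math import prod  # Python 3.8+

def series_reliability(unit_risks):
    """P(system works) for a series system: every unit must work.
    `unit_risks` are independent per-unit failure probabilities."""
    return prod(1.0 - r for r in unit_risks)

def parallel_reliability(unit_risks):
    """P(system works) for a parallel system: at least one unit works,
    i.e. 1 minus the probability that all units fail."""
    return 1.0 - prod(unit_risks)
```

With two units each failing with probability 0.1, the series system works with probability 0.9 x 0.9 = 0.81, while the parallel system works with probability 1 - 0.1 x 0.1 = 0.99, which is why effort to reduce unit risk matters so differently in the two architectures.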



[back to schedule]