Society For Risk Analysis Annual Meeting 2015

Session Schedule & Abstracts


* Disclaimer: All presentations represent the views of the authors, and not the organizations that support their research. Please apply the standard disclaimer that any opinions, findings, and conclusions or recommendations in abstracts, posters, and presentations at the meeting are those of the authors and do not necessarily reflect the views of any other organization or agency. Meeting attendees and authors should be aware that this disclaimer is intended to apply to all abstracts contained in this document. Authors who wish to emphasize this disclaimer should do so in their presentation or poster. In an effort to make the abstracts as concise as possible and easy for meeting participants to read, the abstracts have been formatted such that they exclude references to papers, affiliations, and/or funding sources. Authors who wish to provide attendees with this information should do so in their presentation or poster.

Common abbreviations


Poster Session

Room:    6:00 PM-8:00 PM



P.2  Quantifying the Contribution of Individual Emissions Sources to PM2.5 Social Costs for Designing Cost-Effective Control Strategies. Heo J*, Adams PJ, Gao HO; Cornell University and Carnegie Mellon University   heo.cmu@gmail.com

Abstract: Fine particulate matter (PM2.5) and its precursor emissions originating from numerous regional sources travel long distances and generally account for a substantial fraction of the social costs (or public health burden) of PM2.5 pollution within a region or an urban area. Quantifying the contribution of individual sources to these social costs is valuable for designing optimal control strategies. Recently, regression models have been developed using a database generated by a state-of-the-art chemical transport model (CTM); these models demonstrated robust out-of-sample validation. The models bring the technical rigor and accuracy of CTMs to policy research without the tremendous computational costs of running CTMs, estimating the source-by-source contribution to the air quality burden at a receptor location. Using these models, this study aims to provide a method for designing optimal air quality control strategies. We focus on quantifying the fractional contributions of four species (elemental carbon (EC) and the inorganic particulate matter precursors SO2, NOx, and NH3) emitted anywhere in the United States. Preliminary results indicate that the spatial scope of emissions control varies substantially depending upon the location of the affected area. For example, local emissions in New York City (i.e., emissions within the city) are responsible for 9% of the total burden, showing that the dominant fraction originates from external sources. Local emissions in Los Angeles, by contrast, account for a larger fraction, 38%. Emission sources that account for 50% of the burden in New York City extend as far as 880 km away, while those for Los Angeles lie within 36 km, showing that controlling regional sources is more important for New York City than for Los Angeles. Our new approach provides detailed social cost accounting of individual emission sources, which assists the identification and design of cost-effective and equitable air quality control strategies.
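
To make the distance-to-burden computation concrete, here is a minimal Python sketch that, given per-source contributions and source-receptor distances (invented values, not the study's CTM-derived data), computes the local share of the burden and the radius enclosing 50% of it:

    import numpy as np

    rng = np.random.default_rng(0)
    n = 1000
    distance_km = rng.uniform(0, 1500, n)      # source-to-receptor distance
    contribution = rng.exponential(1.0, n)     # per-source social cost (arbitrary units)

    # share of the total burden due to sources within 50 km of the receptor
    local_share = contribution[distance_km <= 50.0].sum() / contribution.sum()

    # radius that encloses 50% of the cumulative burden, nearest sources first
    order = np.argsort(distance_km)
    cum_share = np.cumsum(contribution[order]) / contribution.sum()
    r50 = distance_km[order][np.searchsorted(cum_share, 0.5)]

    print(f"share of burden from sources within 50 km: {local_share:.0%}")
    print(f"radius enclosing 50% of the burden: {r50:.0f} km")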

P.3  Profiling Adapters and Mitigators: An Empirical Study on Risk Perceptions and Behavioral Responses toward Air Pollution in Beijing. Tan H*, Xu J; Southwestern University of Finance and Economics    tanhuimin@swufe.edu.cn

Abstract: Air pollution is largely a collectively generated problem, posing health risks to all who are exposed to it. Little is known about the characteristics of members of the general public who take adaptive measures versus those who take mitigation actions in response to air pollution, or about the relationship between the public's cognitive factors and its behavioral responses to air pollution. We conducted a questionnaire survey (N=979) in Beijing, China, where air pollution is a substantial problem, to assess risk perception, concern, perceived efficacy and obligation, and behavioral responses of Beijing residents to air pollution. We identified three segments of the general public: mediocre wabblers, concerned activists, and conservative adapters. Our study suggests that the mitigation behaviors of concerned activists are more likely to be further enhanced if they believe others are more capable of controlling their own behaviors; mediocre wabblers would engage in more extensive mitigation if they believe both themselves and others should take greater responsibility for air pollution; and neither moral obligation, perceived efficacy, nor environmental values influence the behaviors of conservative adapters. We highlight the most promising segments of the general public for taking mitigation actions and discuss possible segment-specific information campaigns and policy strategies.
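
The abstract does not name its segmentation method; as one illustration of how such segments can be derived, the following sketch clusters synthetic survey scores with k-means (an assumption for illustration, not necessarily the authors' technique):

    import numpy as np
    from sklearn.preprocessing import StandardScaler
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(1)
    # columns: risk perception, concern, perceived efficacy, perceived obligation
    X = rng.normal(size=(979, 4))

    Z = StandardScaler().fit_transform(X)
    labels = KMeans(n_clusters=3, n_init=10, random_state=1).fit_predict(Z)

    for k in range(3):
        members = Z[labels == k]
        print(f"segment {k}: n={len(members)}, mean profile={members.mean(axis=0).round(2)}")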

P.5  Risk-Based Maritime Security Response Operations. Kuck JW, Howard PM, Taylor J*; ABS Group   jkuck@absconsulting.com

Abstract: The United States Coast Guard conducts daily maritime security activities, including port patrols, vessel escorts, and security zone enforcement. Traditionally, port security activity prioritization focused on maintaining minimum activity levels for various security activities. The increasing sophistication of Coast Guard risk assessment and analysis tools enables increasingly refined estimation of the risk reduction benefits of maritime port security activities. Assessing risk reduction for various port security activities permits a risk-based view of conducting operations, as opposed to a strict activity-based protocol, focusing limited resources on those activities that provide the greatest risk reduction impact. In 2015, the USCG completed implementation of a service-wide pilot project to prioritize Maritime Security Response Operations (MSRO) based on risk reduction benefits. Now, rather than striving to complete a pre-determined number of activities, Captain of the Port enforcement branches coordinate assets to maximize risk reduction across projected MSRO activity demand. Performance is measured as the percentage of risk reduced against a specific, port-level risk reduction target that considers the optimal risk reduction potential of available resources against actual MSRO activity demand. This project integrates security activity modeling, risk assessment, resource optimization, and a dynamic planning process. Geographic modeling and linear programming combine risk exposure, projected risk exposure uncertainties, risk reduction metrics, resource availability, and other factors to determine an optimal allocation solution.
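
A minimal sketch of the allocation step described above, using linear programming to maximize risk reduction under a patrol-hours constraint; all activity names and numbers are hypothetical placeholders:

    from scipy.optimize import linprog

    risk_reduction = [4.0, 2.5, 1.0]   # risk reduced per activity performed
    hours_per      = [6.0, 3.0, 1.0]   # resource cost per activity (hours)
    demand         = [10, 20, 40]      # projected activity demand (upper bounds)
    hours_available = 120.0

    # linprog minimizes, so negate the objective to maximize risk reduction
    res = linprog(c=[-r for r in risk_reduction],
                  A_ub=[hours_per], b_ub=[hours_available],
                  bounds=list(zip([0] * 3, demand)))
    print("activities to perform:", res.x.round(1))
    print("total risk reduced:", -res.fun)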

P.6  Comparison of a Site Risk Assessment Conducted Using EPA Superfund Risk Assessment Guidelines vs. LDEQ RECAP Methods. Greenberg GI*, Beyer LA; Gradient   ggreenberg@gradientcorp.com

Abstract: We conducted a risk assessment for a site in Louisiana using both EPA's Superfund Risk Assessment Guidelines and Louisiana Department of Environmental Quality (LDEQ) Risk Evaluation/Corrective Action Program (RECAP) methods. Risks were evaluated for a recreator who is exposed to sediment and surface water while crabbing in waters near the site. RECAP methods are consistent with EPA risk assessment guidance but differ in their specific requirements. EPA's risk assessment approach consists of four steps: hazard identification, dose-response assessment, exposure assessment, and risk characterization. RECAP is based on a tiered framework consisting of a Screening Option (SO) and three Management Options (MO-1, MO-2, and MO-3). Similar to EPA's hazard identification step, the SO identifies chemicals of concern (COCs), which are carried forward to the MO. Unlike EPA's method, RECAP's tiered MO approach compares exposure point concentrations to RECAP standards (RS). With each increasing MO tier, the RS becomes more refined, incorporating more site-specific assumptions (e.g., exposure, fate/transport). The two methods also have different requirements for data usability, resulting in different datasets. Key discrepancies include the use of SPLP and filtered samples, soil depth, and proxy values for non-detected chemicals. EPA and RECAP also have different specifications that could impact COC selection (e.g., detection frequency and health-based screening values). For the risk calculations, EPA incorporates soil bioavailability and updated exposure assumptions and toxicity values. Total petroleum hydrocarbon (TPH) risk calculations also differ, as EPA and RECAP rely on different sources to define TPH fractions and toxicity values. Despite the differences between the EPA and LDEQ RECAP approaches, both resulted in the same conclusion: recreator exposures to Site media did not pose a risk of adverse health effects.
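
For readers unfamiliar with the EPA calculation referenced above, a textbook Superfund-style (RAGS Part A form) incremental lifetime cancer risk computation looks like the following; the exposure values are generic placeholders, not the site's numbers:

    def cancer_risk(c_mg_kg, ir_mg_day, ef_day_yr, ed_yr, bw_kg, at_day, slope):
        # chronic daily intake (mg/kg-day); 1e-6 converts mg soil -> kg soil
        cdi = (c_mg_kg * 1e-6 * ir_mg_day * ef_day_yr * ed_yr) / (bw_kg * at_day)
        return cdi * slope  # slope factor in (mg/kg-day)^-1

    ilcr = cancer_risk(c_mg_kg=12.0, ir_mg_day=100, ef_day_yr=45, ed_yr=26,
                       bw_kg=80, at_day=70 * 365, slope=1.5)
    print(f"incremental lifetime cancer risk = {ilcr:.1e}")  # compare to 1e-6..1e-4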

P.7  Risk management strategy regarding nanotechnologies within the EDF Group. Tossa P, Delon C, Brugidou M, Noel D, Cabanes PA*; Électricité de France (EDF)   pierre-andre.cabanes@edf.fr

Abstract: Nanoscience research has revealed interesting applications for energy. EDF (Électricité de France), an electricity producer, is interested in certain applications of this research that could concern both centralised production (nuclear, fossil-fired power plants, etc.) and decentralised production (biomass and photovoltaic), as well as the transmission of electricity through power lines and the storage of electricity (fuel cells, batteries, and supercapacitors). Similarly, certain products purchased by the company to meet specific requirements may contain nanomaterials (paint, printer ink, overalls, etc.). Nanotechnology is a new type of technology for which specific regulations are struggling to get off the ground in France and around the world. To ensure nanotechnologies are developed in a responsible manner within EDF, a risk management strategy has been put in place. For applications developed within R&D, existing knowledge of the hazards of the nanoproducts used must be gathered or, where it is lacking, acquired in order to assess the exposure potential of these products. With regard to purchased products, suppliers must initially be asked whether these products contain nano-objects; at a later date, an exposure assessment method will have to be developed.

P.8  Addressing Affordability Issues in the Federal Flood Insurance Program. Xian SY*, Lin N; Princeton University    sxian@princeton.edu

Abstract: Hurricanes Katrina and Sandy left the National Flood Insurance Program (NFIP) more than 30 billion US dollars in debt. To put the program on a sounder financial footing, the NFIP recently moved toward the elimination of certain premium discounts through the passage of the Biggert-Waters Flood Insurance Reform Act (BW12). However, many residents in flood-prone areas could not afford the increased premiums, and as a result Congress passed the Homeowner Flood Insurance Affordability Act and reinstated discounted rates. To address the affordability issue, Kousky and Kunreuther (2014) proposed a voucher program coupled with a low-interest loan to cover the combined cost of the loan payment and the reduced premium (due to mitigation) in excess of the affordable share of a household's income. The proposal is novel in stimulating mitigation and addressing affordability while reinstating the risk-based premium principle, but a more systematic framework for implementing it still needs to be developed. In particular, such a framework should consider not only current costs and benefits (reduced premiums) but also the long-term benefits of avoiding or reducing future losses. Here we develop a methodology to further design the voucher program and quantify its cost effectiveness, considering future losses from storms under a changing climate and sea level rise. Unlike Kousky and Kunreuther (2014), who use hypothetical parameters and house characteristics, we first find the optimal parameters, such as elevation height, that minimize the total cost of mitigation and future losses. The decision of whether FEMA should implement the voucher with or without mitigation for a specific house is then based on cost effectiveness. In a few test cases we found that a voucher with mitigation is not beneficial when future losses are ignored, but can be beneficial once future losses are considered. We will apply our framework to a coastal community (Ortley Beach, NJ) as a case study to test our methodology.
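
A minimal sketch of the optimization idea described above: choose the elevation height that minimizes up-front mitigation cost plus discounted expected future losses. The damage model and all numbers are hypothetical placeholders, not the study's parameters:

    import numpy as np

    heights = np.arange(0.0, 4.5, 0.5)            # candidate elevation heights (ft)
    elev_cost = 10_000 + 40_000 * heights         # up-front mitigation cost ($)

    def expected_annual_loss(h, base=8_000.0, decay=0.9):
        # stylized damage model: losses fall off with elevation
        return base * np.exp(-decay * h)

    horizon, rate = 30, 0.03
    discount = sum(1 / (1 + rate) ** t for t in range(1, horizon + 1))
    total = elev_cost + discount * expected_annual_loss(heights)

    best = heights[np.argmin(total)]
    print(f"cost-minimizing elevation: {best} ft, total cost ${total.min():,.0f}")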

P.9  Managing Coastal Flood Risks: A Structured Decision Making (SDM) Approach to Mitigating the Impacts of Sea-Level Rise in Vancouver, British Columbia. Beaudrie CEH*, Lyle T, Long G, Badelt B; Compass Resource Management Ltd, Vancouver, British Columbia; Ebbwater Consulting, Vancouver, British Columbia; City of Vancouver, British Columbia   christian.beaudrie@gmail.com

Abstract: Rising sea levels pose increasing flood risks for coastal communities, particularly major population centers along the British Columbia Coast. With a projected sea level rise of 1m by 2100, BC communities face the challenging task of understanding hazards, vulnerabilities, and consequences from flood events, and identifying suitable measures to protect multiple interests over large areas. This talk highlights the application of a Structured Decision Making (SDM) approach to evaluate the impacts of sea level rise and select mitigation options to reduce flood risks for Vancouver, British Columbia. The process involved a series of stakeholder workshops to identify interests that may be impacted, develop suitable mitigation alternatives, review performance of each alternative and consider trade-offs, and finally to develop recommendations for a suite of mitigation alternatives to protect vulnerable neighbourhoods across the city. To address the challenge of communicating complex risk information, stakeholders were engaged using both spatial illustrations of flood extents for a number of flood scenarios, and an interactive decision support tool to facilitate comparison of alternatives and trade-offs. This work breaks new ground in evaluating the implications of sea level rise on coastal communities, and provides a model for other communities grappling with the challenges of assessing and managing flood risks from a rising sea.

P.10  Using Means Objectives to Present Risk Information. Huynh CH, Simon J*; California State Polytechnic, Pomona and Naval Postgraduate School   jay.simon@gmail.com

Abstract: When making decisions involving alternatives with risk, individuals are not always able to express or view the possible outcomes in terms of a fundamental objective. In many cases, using a means objective is more practical or more accessible. However, to apply information about a means objective correctly, a decision maker must first translate it into information about a fundamental objective. This paper presents and discusses the results of two studies regarding decision makers' preferences when information is presented either in terms of a means objective or a fundamental objective.

P.11  Measuring Individual Differences in Near-Miss Appraisals. Cui J*, Rosoff H, John RS; University of Southern California   jinshucu@usc.edu

Abstract: A near-miss is defined as “an event that had a nontrivial probability of ending badly, but by chance did not” (Dillon, Tinsley, & Cronin, 2011). Research has found that individuals can react differently (risk-averse or risk-seeking) to a near-miss event depending on whether the experience is interpreted as resilient or vulnerable (Tinsley, Dillon and Cronin, 2012; Rosoff, Cui and John, 2013). We hypothesize that people differ in the way they interpret near-miss events and resolve decisions following near-miss experiences. In this study, we developed the Near Miss Appraisal Scale (NMAS), which measures the psychological appraisal of near-miss experiences. The scale contains 20 items that each describe a near-miss event. Respondents provide a cognitive appraisal of each near-miss event, in the form of an assessment of whether the event described changes their appraisal of the risk of such an event in the future. An increase in risk appraisal suggests that the respondent views near misses as identifying vulnerabilities, while a decrease suggests that the respondent views near misses as confirmation of safety and resiliency. Specifically, respondents rate the likelihood with which they might engage in risk-mitigating behaviors in the future, using a 7-point rating scale ranging from 1 (extremely unlikely) to 7 (extremely likely). A higher score indicates an appraisal of vulnerability, in that the respondent is more likely to take a protective measure in the same situation in the future; a lower score indicates an appraisal of resilience, in that the respondent is less likely to take protective measures. We evaluate the psychometric properties of the items and the scale using Classical Test Theory (CTT), factor analysis (FA), and Item Response Theory (IRT). We establish discriminant validity for the scale against other psychological measures, such as risk aversion/loss aversion constructs, e.g., the Domain-Specific Risk-Taking (DOSPERT) scale (Blais and Weber, 2006).
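
As one illustration of the CTT step mentioned above, the following sketch computes Cronbach's alpha for a 20-item, 7-point scale on synthetic responses (not the NMAS data):

    import numpy as np

    rng = np.random.default_rng(2)
    latent = rng.normal(size=(500, 1))   # respondents' underlying trait
    # 20 items loading on the trait, rounded and clipped to the 1-7 scale
    items = np.clip(np.rint(4 + 1.2 * latent + rng.normal(0, 1, (500, 20))), 1, 7)

    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()     # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)      # variance of total scores
    alpha = k / (k - 1) * (1 - item_var / total_var)
    print(f"Cronbach's alpha = {alpha:.2f}")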

P.13  The Goldilocks Fallacy. Vanden Bosch P*; Marymount University   petevandenbosch@gmail.com

Abstract: When faced with a range of options, we have a strong tendency to choose an intermediate one. This Goldilocks heuristic serves us well in many circumstances. For instance, in the presence of conflict, we seek compromise. We keep to the middle of our lanes when driving, without consciously reasoning why this is a really good idea. And when we see seven predictions for a hurricane’s path, we intuitively assign higher likelihood to the intermediate predictions and less to the extremes, without consciously invoking the Law of Large Numbers. All of these mental shortcuts have some justification. But we also apply this Goldilocks heuristic in questionable ways. This presentation focuses on the idea that, in making choices when there are conflicting objectives – which are very common – choosing a middle path often fails to give you the best decision. In fact, sometimes it leads to the worst. In this presentation, we’ll focus on some telling examples in risk analysis, quantify what “often” and “sometimes” really mean, and examine the psychological reasons that drive us to make these poor Goldilocks decisions.
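
A worked example of the claim that the middle option can be worst: with two conflicting objectives whose utilities are convex in the choice variable, both extremes beat the compromise. Values are illustrative:

    def total_utility(x):
        u_first  = x ** 2            # objective 1 rewards pushing x high
        u_second = (1 - x) ** 2      # objective 2 rewards pushing x low
        return u_first + u_second    # equal-weight sum

    for x in (0.0, 0.5, 1.0):
        print(f"option x={x}: total utility = {total_utility(x):.2f}")
    # both extremes score 1.00; the 'Goldilocks' middle option scores 0.50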

P.14  Verbal decision analysis of risk-related causal factors in operator errors. Yemelyanov AM*, Baev S, Yemelyanov AA, Tikhomirov NP; GSW State University, Plekhanov Russian University of Economics   alexander.yemelyanov@gsw.edu

Abstract: Existing accident-report databases, such as those of the NTSB and NASA-ASRS, are mostly suitable for statistical analysis of predetermined error categories, rather than for the analysis of risk-related causal factors, particularly in decision-making (> 60% of all errors). To improve accident report descriptions, we focus on a method of error modeling that incorporates the risk-as-feelings approach, verbal decision analysis, and human reliability analysis. According to the risk-as-feelings approach (P. Slovic), individuals rely on feelings that function as good-versus-bad information to influence their choices when making decisions in complex situations with limited mental resources. We also use verbal decision analysis with soft (fuzzy) words to measure and compare the operator's attitude (feelings) toward an emotional event. For this purpose, negative and positive feelings are measured by their intensity (on a scale from “weak” to “strong”) and likelihood (from “seldom” to “often”), then integrated into significance-as-anxiety and significance-as-value, and finally mapped onto a one-dimensional scale of preference. This allows different types of risk to be compared when making subjective decisions. Lastly, in the model, the selected performance shaping factors (PSFs) are integrated into the operator's performance parameters and, together with the suggested classification algorithms, work efficiently for the causal analysis of operator errors. Various examples will be discussed to demonstrate this efficiency.
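
A stylized sketch of the mapping described above, with invented numeric grades for the verbal labels (the paper's actual fuzzy membership functions are not given in the abstract):

    INTENSITY  = {"weak": 0.2, "moderate": 0.5, "strong": 0.9}
    LIKELIHOOD = {"seldom": 0.2, "sometimes": 0.5, "often": 0.9}

    def significance(intensity, likelihood):
        # combine verbal intensity and likelihood into one significance score
        return INTENSITY[intensity] * LIKELIHOOD[likelihood]

    def preference(pos, neg):
        # significance-as-value minus significance-as-anxiety -> single scale
        return significance(*pos) - significance(*neg)

    print(preference(pos=("strong", "often"), neg=("weak", "sometimes")))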

P.15  Playing with Fire: Assessing the effects of risk interdependency and social norms on homeowners’ wildfire mitigation decisions using choice experiments. Brenkert-Smith H*, Dickinson K, Flores N; University of Colorado   hannahb@colorado.edu

Abstract: In the face of the expanding wildfire hazard, the actions that homeowners take to reduce fire risk on private property are crucial. However, homeowners’ wildfire risk mitigation actions are interdependent: the risk that any individual faces is affected by the conditions on neighboring properties. Meanwhile, social norms provide another mechanism linking one’s own risk mitigation choice to neighbors’ choices. Households may be encouraged to undertake more mitigation when presented with social comparisons highlighting high levels of mitigation among neighbors. Our web-based survey of homeowners in fire-prone areas on Colorado’s Western Slope combines survey data on current knowledge, risk perceptions, and practices with choice experiments that vary 1) fuel conditions on neighbors’ property (i.e., neighbors’ risk levels), and 2) neighbors’ mitigation actions (i.e., social norms) in order to assess the impact of these factors on wildfire mitigation choices. Of particular interest are the interactive effects of these different factors. For example, if risk interdependency tends to lead to underinvestment in risk mitigation, can social norms help to overcome this problem and “tip” communities toward higher levels of protection? In this presentation we describe the survey and experiment, and discuss initial findings from our data.

P.16  Decision Making under Risk and Ambiguity. Wang Y*; Georgia Institute of Technology   yan_wang@gatech.edu

Abstract: In contrast to risk, where available information takes the form of known probability distributions, ambiguity refers to situations in which probabilistic information is incomplete or imprecise. Various approaches to decision making under ambiguity have been developed, in which uncertainty is associated with either the decision maker's preference or the probability. In this research, a scheme for decision making under imprecise risk is developed as an extension of the Choquet expected utility model and cumulative prospect theory. In the new scheme, capacities are derived from lower and upper probabilities as well as prior beliefs. The degrees of ambiguity aversion and prior conviction are incorporated in the dynamic information gathering and belief update process. The goal is to improve the robustness of decision making under both risk and ambiguity.
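
For illustration, a minimal Choquet expected utility computation, using an epsilon-contamination lower probability as the capacity; this particular capacity is an assumption for the sketch, whereas the paper derives capacities from lower and upper probabilities and prior beliefs:

    def capacity(event, probs, eps=0.2):
        # lower probability: v(A) = (1-eps) * P(A) for A != full set, v(S) = 1
        if set(event) == set(range(len(probs))):
            return 1.0
        return (1 - eps) * sum(probs[i] for i in event)

    def choquet_eu(utilities, probs, eps=0.2):
        # sort states by utility, descending; integrate over upper-level sets
        order = sorted(range(len(utilities)), key=lambda i: -utilities[i])
        ceu, prev = 0.0, 0.0
        for k in range(1, len(order) + 1):
            v = capacity(order[:k], probs, eps)
            ceu += utilities[order[k - 1]] * (v - prev)
            prev = v
        return ceu

    probs = [0.5, 0.3, 0.2]                 # reference prior over 3 states
    print(choquet_eu([10, 5, 0], probs))    # 5.2, below the EU of 6.5: ambiguity-averse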

P.17  Is it necessary to invest in information technology security countermeasures? A theoretical model of decision making based on risk mitigation of banking phishing. Nsiempba JJ*, De Marcellis Warin N, Fernandez J; École Polytechnique de Montréal   jude-jacob.nsiempba@polymtl.ca

Abstract: Phishing bank fraud is a threat that falls under the category of what we would call a human threat: in addition to exploiting security breaches, it uses social engineering to manipulate users and bypass safety measures. It causes enormous financial losses to individuals and businesses. In 2013, Canadian banks repaid nearly $139M to credit card holders for losses incurred by these clients as a result of phishing attacks. These financial losses tend to grow yearly despite the proliferation of security countermeasures, which, alone, are no longer sufficient to reduce this type of risk. Therefore, when facing this subtle type of threat, users must use their judgment in selecting the countermeasures that provide the best risk-reduction-to-cost ratio. To achieve this, users need economic arguments that enable them to make an informed investment decision. In this paper, we propose a theoretical model of the decision to invest in a countermeasure, based on risk mitigation. This risk reduction rests on what we call the probability of monetization, that is, the probability that a card stolen by phishing and sold on the black market can be converted into cash. We use top-down logic and an influence diagram to represent the decision-making framework. Based on specific assumptions about types of credit cards and their features, our model develops a new theoretical approach for calculating the monetization function using functional forms. We show that the net profit induced by the acquisition of a new countermeasure depends on three major factors: the amount of money available on the credit card, the commission paid to the mule for laundering, and the expenses incurred for fraud protection. Although this model cannot compensate for the lack of relevant data needed for risk analysis, we believe that its use in deciding whether or not to invest in a countermeasure would be very useful in reducing, or better yet avoiding, the risk of making mistakes.
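
A stylized version of the investment rule described above: invest if the expected fraud loss avoided by lowering the monetization probability exceeds the countermeasure's cost. All probabilities and dollar values are hypothetical:

    def expected_fraud_loss(cards_stolen, p_monetization, avg_amount):
        # expected loss = stolen cards x chance each is monetized x amount drained
        return cards_stolen * p_monetization * avg_amount

    p_before, p_after = 0.30, 0.12   # monetization probability without / with countermeasure
    avoided = (expected_fraud_loss(10_000, p_before, 800.0)
               - expected_fraud_loss(10_000, p_after, 800.0))
    countermeasure_cost = 1_200_000.0
    print(f"avoided loss ${avoided:,.0f} vs cost ${countermeasure_cost:,.0f} "
          f"-> invest: {avoided > countermeasure_cost}")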

P.18  Decision-Analytical Approach to Managing Harmful Algal Blooms: Methodology and Case Study. Radomyski A, Pang C*, Subramanian V, Nadimi M, Barba D, Linkov I; Ca’ Foscari University of Venice, Italy   chengfang.pang@unive.it

Abstract: Harmful Algal Blooms (HABs) have attracted a great deal of attention in recent years due to concerns over their potential to cause economic losses and damage ecosystem health. The number of toxic blooms characterized by widespread production of highly potent toxins (cyanotoxins) has increased globally each year, with human activities identified as a major cause of this change through water eutrophication, increased air temperature, disturbance of thermal stratification, and modification of local hydrology. With the advent of new remediation technologies, mitigating the consequences of specific HAB events is more feasible than in the past, but doing so requires careful selection of remedial alternatives given the specific HAB conditions and the specifics of the site and affected communities. In this work, we discuss HAB management solutions, specifically taking into account various groups of stakeholders, decision makers, and risk managers and the varied priorities they give to human health, environmental and ecological, socioeconomic, and technical feasibility criteria. We demonstrate how stakeholders may prioritize management alternatives through the use of Multi-Criteria Decision Analysis (MCDA) in a case study of an Alpine lake affected by HABs.
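
A minimal MCDA sketch of the prioritization described above: alternatives are scored against the four criteria groups and ranked under stakeholder-specific weights. Alternatives, scores, and weights are invented for illustration; the study elicits them from stakeholders:

    import numpy as np

    alternatives = ["algaecide", "aeration/mixing", "nutrient diversion"]
    criteria     = ["human health", "ecological", "socioeconomic", "technical"]
    scores = np.array([[0.8, 0.3, 0.6, 0.9],     # normalized 0-1, higher = better
                       [0.6, 0.7, 0.5, 0.7],
                       [0.9, 0.9, 0.3, 0.4]])

    weights = {"health agency":  [0.5, 0.2, 0.1, 0.2],
               "lake residents": [0.3, 0.2, 0.4, 0.1]}

    for who, w in weights.items():
        totals = scores @ np.asarray(w)          # weighted-sum aggregation
        best = alternatives[int(totals.argmax())]
        print(f"{who}: scores {totals.round(2)} -> prefers {best}")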

P.19  Incorporating Decision Points in Models of Risk Scenarios. Treeman NM*, Mosleh A; University of Maryland College Park and University of California Los Angeles   treeman@usna.edu

Abstract: The inherent risk of a technologically complex system is not a stationary characteristic, but a continually evolving status that depends on myriad factors. Current probabilistic risk assessment (PRA) processes attempt to qualify and quantify the risk character of a system in order to provide guidance on possible means of reducing this overall inherent risk. Post-accident evaluation of the existing risk analysis for a specific system provides a means of identifying deficiencies in risk analysis processes. In the case of the nuclear accident at Fukushima Daiichi, one area of the risk assessment that could require improvement is the incorporation of the influence of decisions made by operators, site managers, and operating company management, as well as local and national government organizations and officials. Such decisions can significantly affect the course of accident scenarios. This paper introduces an extension of conventional PRA methods of risk scenario modeling to explicitly include “decision points”: specifically, decisions about when and how to intervene in system functions, as well as in managing, controlling, or mitigating the accident's adverse impacts on people and the environment.

P.20  Toward Risk-Informed Regulation in Healthcare Using Socio-Technical Risk Analysis. Maddi A*, Pence J, Mohaghegh Z; University of Illinois Urbana-Champaign   amaddi3@illinois.edu

Abstract: Risk-informed regulation has been used in various industries, such as nuclear, aviation, and oil and gas. Risk-informed regulation considers multiple deterministic and probabilistic criteria in decision making and oversight processes. It offers an advantage over risk-based regulation in that decisions are not made solely on the basis of risk estimations from techniques such as Probabilistic Risk Assessment (PRA). However, the healthcare literature shows that a risk-based approach is being considered in regulation, monitoring, and decision making. This paper addresses the benefits of a risk-informed approach for the healthcare industry. Relevant techniques such as Failure Mode and Effect Analysis (FMEA), Failure Mode Effect and Criticality Analysis (FMECA), Healthcare Failure Mode and Effect Analysis (HFMEA), and Clinical Risk and Error Analysis (CREA), and their shortcomings with respect to human error and organizational factors, are discussed. This paper introduces an approach based on Socio-Technical Risk Analysis (SoTeRiA), which helps to: (1) determine patient-level and organizational-level errors and failure mechanisms, (2) take into consideration human-system-software interfaces and dependencies, and (3) consider the likelihoods and sequences of events. The risk-informed approach is evaluated through comparison with existing and proposed techniques for regulation in healthcare. For instance, the regulation of medical equipment as defined by the U.S. Food and Drug Administration is risk-based; there is great potential to combine deterministic and probabilistic information in order to provide decision makers with appropriate information on the regulatory oversight of medical devices, resulting in improvements to patient safety.
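
To make concrete the kind of scoring the FMEA-family techniques above rely on, here is the classic risk priority number calculation (RPN = severity x occurrence x detection), with invented ratings on the usual 1-10 scales:

    failure_modes = [
        # (mode, severity, occurrence, detection)
        ("infusion pump over-delivery", 9, 3, 4),
        ("mislabeled specimen",         7, 5, 3),
        ("missed alert (alarm fatigue)", 8, 6, 5),
    ]
    # rank failure modes by descending RPN
    for mode, s, o, d in sorted(failure_modes, key=lambda m: -(m[1] * m[2] * m[3])):
        print(f"RPN {s * o * d:>3}  {mode}")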

P.21  Beta Bayesian Kernel Methods for the Prediction of Global Supply Chain Disruptions. Baroud H*, Francis R, Barker K; University of Oklahoma and George Washington University   hbaroud@ou.edu

Abstract: Multinational corporations often operate facilities or have suppliers in countries prone to natural hazards, extreme weather, or political turmoil. While such business strategies offer low operating costs, they may introduce high risks leading to high-impact disruptions in the global supply chains of these corporations. With multinational companies operating from different locations, a great deal of information, services, and money circulates between facilities. It is therefore critical to assess the risks and consequences of potential global disruptive events such as tsunamis, earthquakes, or factory fires, among others, and the accurate prediction of such risk plays a vital role in (i) assessing the cost effectiveness of overseas operations, and (ii) identifying risk management and preparedness strategies. We deploy a Bayesian kernel model using the Beta-Binomial conjugate prior to assess the probability of a supply chain disruption given (i) historical information on past disruptions, and (ii) a set of attributes describing the company, its risk management strategies, and its suppliers' risk assessments, among other possible attributes. The model is extended to accommodate non-continuous variables by incorporating different types of kernel functions. Given data from a survey of companies on the resilience of their supply chains, we are able to produce the probability distribution of global supply chain disruptions for any company in the sample. We consider several variations of the model and compare their output. The approach is also compared with classical forecasting tools for validation purposes. This method provides risk managers with an accurate estimate of the likelihood of a supply chain disruption. It can also integrate the decision maker's risk preference in the form of the prior distribution while updating it with newly acquired information.
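
The core conjugate update behind the model can be sketched as follows: with a Beta(a, b) prior on the disruption probability and k disruptions observed in n periods, the posterior is Beta(a + k, b + n - k). The full model wraps this in a kernel over company attributes; the numbers here are illustrative:

    from scipy import stats

    a, b = 1.0, 9.0          # prior: disruptions are believed rare
    k, n = 3, 24             # observed: 3 disrupted periods out of 24

    posterior = stats.beta(a + k, b + n - k)
    lo, hi = posterior.interval(0.95)
    print(f"posterior mean P(disruption) = {posterior.mean():.3f}")
    print(f"95% credible interval = ({lo:.3f}, {hi:.3f})")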

P.22  Improving risk prediction models using PGA, LASSO and SVM in prostate cancer prediction. Pirasteh F*, Sanei M; Pukyong National University   pirasteh@pknu.ac.kr

Abstract: Absolute cancer risk is the probability that an individual with given risk factors and a given age will develop cancer over a defined period of time. Examples of these risk factors include race, age, sex, genetics, body mass index, family history of cancer, and many others. For prostate cancer, some common gene variations have recently been linked to higher risk. This research examines whether testing for these gene variants is useful in predicting prostate cancer risk. Prediction models that estimate the probability of cancer presence based on individual gene data over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing earlier or more frequent screening and counseling on behavioral changes to decrease risk. Such models will also be useful for designing future chemoprevention and screening intervention trials in individuals at high risk of specific cancers in the general population. To address this issue, the GSE8218 dataset from the Vaccine Research Institute of San Diego, a study of prostate cancer gene expression profiles, is considered. As a case of high-dimensional data (about 22,000 genes as variables and only 148 samples), variable selection methods, namely PGA (Parallel Genetic Algorithm) and LASSO (Least Absolute Shrinkage and Selection Operator), were applied to reduce the prediction dimension and thus the prediction error (risk). SVM (Support Vector Machine) was then applied as the prediction model, and results based on RMSEP (Root Mean Squared Error of Prediction), MAE (Mean Absolute Error), and MSE (Mean Squared Error) show the efficiency of SVM in predicting samples with specific genes associated with prostate cancer.
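
A sketch of the LASSO-then-SVM pipeline described above, with synthetic data standing in for the GSE8218 expression matrix (the PGA step is omitted):

    from sklearn.datasets import make_classification
    from sklearn.feature_selection import SelectFromModel
    from sklearn.linear_model import Lasso
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import make_pipeline
    from sklearn.svm import SVC

    # stand-in for the expression matrix: 148 samples, many gene features
    X, y = make_classification(n_samples=148, n_features=2000,
                               n_informative=20, random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

    model = make_pipeline(
        SelectFromModel(Lasso(alpha=0.01, max_iter=5000)),  # keep genes with nonzero coefficients
        SVC(kernel="rbf", C=1.0),
    )
    model.fit(X_tr, y_tr)
    print(f"held-out accuracy: {model.score(X_te, y_te):.2f}")
    print("genes retained:", model[0].get_support().sum())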

P.23  Application of Benford’s Law and Zipf’s Law in the Development of Data Driven Decision Support for Environmental Enforcement. Hatami P*, Mitchell J, Gibbs C, Rivers L; Michigan State University   jade@msu.edu

Abstract: The Clean Water Act mandates that the EPA collect data from municipal wastewater treatment plants and other permitted facilities under the National Pollutant Discharge Elimination System (NPDES). However, undetected cases of regulated plants self-reporting inaccurate estimates of their pollutant discharges can lead to chemical and biological pollution, posing both ecological and human health risks. Furthermore, the limited resources of regulatory agencies constrain their capacity to assess the veracity of the data proactively, so enforcement relies heavily on other means, such as the on-site inspection process, which may not detect all cases; the frequency of these risks is therefore unknown. Data-driven methods for evaluating self-reported discharge data are needed to support regulatory enforcement. This study investigates the applicability and conformance of self-reported discharge values from wastewater treatment plants in one state environmental agency using two methods. Benford's Law is a robust screening tool for the evaluation of large data sets and has been applied to a number of environmental data streams. However, its limitations include reduced effectiveness on small-range data sets and its failure to use the correlations between the multiple components measured at one site. Zipf's Law addresses these limitations and can be used to detect mishandled data that cannot be effectively detected using Benford's Law. Our preliminary results indicate that these tools are promising approaches for analyzing self-reported data and may form the basis of a screening tool to aid in decisions about environmental enforcement, once parameters are categorized based on their conformance to the methods and ranked according to their applicability to the laws. Microbial indicator data and total suspended solids tended to fit the distributions with the highest level of significance.
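
A minimal version of the Benford screening step: compare first-digit frequencies of reported values against the Benford distribution with a chi-square test. The values below are synthetic stand-ins for discharge monitoring data:

    import numpy as np
    from scipy.stats import chisquare

    rng = np.random.default_rng(3)
    values = rng.lognormal(mean=3.0, sigma=1.5, size=5000)   # positive, wide-range

    # leading digit of each value: v // 10^floor(log10(v))
    mags = np.floor(np.log10(values))
    first_digits = (values // 10.0 ** mags).astype(int)
    observed = np.bincount(first_digits, minlength=10)[1:]   # counts for digits 1-9

    benford = np.log10(1 + 1 / np.arange(1, 10))             # Benford proportions
    stat, p = chisquare(observed, f_exp=benford * observed.sum())
    print("observed shares:", (observed / observed.sum()).round(3))
    print(f"chi-square = {stat:.1f}, p = {p:.3g}")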

P.24  Adverse Outcome Pathways for Effects Associated with Exposure to Inorganic Arsenic. Clewell HJ*, Greene TB, Gentry PR; Institute for Chemical Safety Sciences, The Hamner Institutes for Health Sciences, Research Triangle Park, NC; Ramboll Environ, Monroe, LA   rgentry@environcorp.com

Abstract: Adverse Outcome Pathway (AOP) descriptions for both noncancer and cancer outcomes are proposed for compounds that disrupt cellular function through their avid binding to vicinal dithiols in cellular proteins, using the available data for inorganic arsenic as a case study. These AOPs integrate data from diverse sources, including epidemiology, in vivo animal studies, and in vitro studies. For cancer outcomes, the AOP describes a co-mutagenic mode of action consisting of two parallel processes: (1) inhibition of DNA repair due to binding of key proteins associated with DNA damage repair, and (2) an oxidative stress/inflammatory response, driven by binding of key proteins in the NRF2 and NF-κB cell signaling pathways, that impairs the cell's ability to delay cell division until DNA damage has been repaired. For noncancer outcomes, the AOP proceeds from binding to vicinal dithiols in key cellular proteins to the development of a proliferative phenotype, characterized by altered oxidative stress and inflammatory signaling, that can result in a number of noncancer effects, including proliferative skin lesions, inflammatory cardiovascular effects, and inhibition of immune responses. While these AOPs were developed using inorganic arsenic as a case study, they are also potentially applicable to other compounds that bind avidly to vicinal protein sulfhydryls, including phenylarsine oxide and its derivatives, and possibly tellurium and periodate. (This work was supported by EPRI.)

P.26  Incorporating Ecosystem Services into a Conceptual Model of Cumulative Risk Assessment: Cardiovascular Disease as a Case Study . Menzie C*, Kashuba R, Law S; Exponent Inc.   camenzie@exponent.com

Abstract: The evaluation of risk factors leading to cardiovascular disease (CVD) is a particularly good example of a complex system of mechanisms resulting in an undesired health effect. A conceptual model that represents cumulative risk as a network of interrelated nodes and arrows (and the process of creating it) is useful for many reasons, including defining and organizing the problem, determining the scope and scale, enabling multi-directional causal reasoning, and facilitating communication. Depending on their perspective, local, state, and federal government officials of environmental and health-driven organizations relate to different parts of the model, creating an interface for discussing how different entities can work together on different parts of the problem, take ownership of different areas, and discuss how those areas interface with others in the overall complex problem. From the environmental perspective, a CVD risk mitigation framework can be derived from the concept of landscape and ecosystem services. Ecosystem services are human benefits derived from nature, typically grouped into four categories: supporting, regulating, provisioning, and cultural services. Many subcategories of such services directly affect factors that impact CVD risk. For example, the regulating service of air purification provided by vegetated terrestrial systems can decrease concentrations of particulate matter, ozone, and other air pollutants associated with the incidence of CVD. The cultural service of recreation can offer increased physical activity and decreased psychological stress, both known CVD risk factors. Approaching human health risk mitigation from an ecosystem services perspective creates an organized, integrated structure for the management of environmental drivers.

P.27  Dose response curves derived from clinical ozone exposures can inform public policy. Lange SS*, Tao G, Rhomberg LR, Goodman JE, Dourson ML, Honeycutt ME; Texas Commission on Environmental Quality; Gradient; Toxicology Excellence for Risk Assessment   sabine.lange@tceq.texas.gov

Abstract: Context: Ozone is one of six criteria air pollutants for which regulations are set by the US Environmental Protection Agency through the National Ambient Air Quality Standards (NAAQS). The ozone NAAQS is currently set at 75 parts per billion (ppb). Objectives: We used data from human clinical studies to inform policy decisions about a protective ozone level. Methods: We plotted mean forced expiratory volume (FEV1) response versus total inhaled ozone dose (calculated from ozone concentration, duration of exposure, and ventilation rate) from clinical studies of 1-8 hour durations. This produced two distinct sigmoidal curves, one for shorter-duration exposures (<3 hours) and one for longer-duration exposures (6-8 hours). The initial plot used data from healthy young adults; additional analyses incorporated data from children and asthmatics, but the results did not differ. Results: There were clear thresholds of effect, consistent with the known ozone mode of action. We estimated typical ozone doses at ambient ozone concentrations of 75, 70, or 65 ppb (8-hour maximum average). Doses were similar at these three concentrations, and almost all were below those associated with a 5% FEV1 decrement, even when different exposure times and ventilation rates were assessed. Conclusions: This type of analysis can determine thresholds of ozone toxicity, which can be crucial for choosing an ambient ozone concentration that is protective of human health. Setting the ozone NAAQS at 65-70 ppb would have a marginal impact, if any, on ozone doses and lung function, but could have significant societal and economic implications.
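
A back-of-envelope version of the dose metric used above, total inhaled dose = concentration x duration x ventilation rate; the ventilation rate and unit conversion are illustrative, not the study's specific values:

    def inhaled_dose_ug(conc_ppb, hours, vent_L_min):
        conc_ug_m3 = conc_ppb * 1.96          # ozone: 1 ppb ~ 1.96 ug/m3 at 25 C
        litres = vent_L_min * 60 * hours      # total air volume breathed (L)
        return conc_ug_m3 * litres / 1000.0   # L -> m3

    for ppb in (65, 70, 75):
        dose = inhaled_dose_ug(ppb, hours=8, vent_L_min=25)   # moderate exertion
        print(f"{ppb} ppb, 8 h: {dose:.0f} ug inhaled ozone")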

P.28  Evaluating dose-additivity for dioxin-like compounds using a combined component-chemical/mixture data approach. Swartout JC*; U.S. Environmental Protection Agency   Swartout.Jeff@epa.gov

Abstract: An analysis of the National Toxicology Program rodent bioassays on three dioxin-like compounds (TCDD, PeCDF [2,3,4,7,8-pentachlorodibenzofuran], and PCB-126) and their mixture was conducted to evaluate dose additivity and estimate relative potency (ReP) values for each of the components. Dose additivity (with constant ReP) was defined as equality of dose-response shape across the components. Dose-response models were fit to administered dose and fat concentrations for each of the component data sets, for the combined components, and for the combined components plus the mixture, for 24 common positive health endpoints. A single shape parameter (Weibull power or Hill coefficient) was fit for the combined-data models, with simultaneous fitting of RePs for each component and the mixture itself; a likelihood-ratio test was used to evaluate equality of shape parameters across all components. For 19 of the 24 endpoints, no significant shape differences were found across the 3 components and the mixture, supporting constant relative potency at all doses. For PeCDF, component RePs from the combined fits frequently differed from RePs computed from the individual component fits. Component RePs (PeCDF = 0.2, PCB-126 = 0.1) were similar for most of the 17 liver endpoints, but varied significantly across organ systems for PeCDF (0.015 to 6.3). RePs for liver endpoints based on fat concentrations differed from those based on administered dose by a factor of 2, with higher PeCDF values and lower PCB-126 values; this result may be due to differences in tissue distribution of the components. Whole-mixture RePs were significantly less than 1 for 8 endpoints based on administered dose and 15 endpoints based on fat concentrations, suggesting the possibility of antagonistic interactions among the components for those endpoints. Although constant RePs are supported for most endpoints, RePs for the 5 endpoints failing the test for shape equivalence varied by 3- to 80-fold over the response range.
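
A compact sketch of the shape-equality test described above: fit Weibull dose-response curves to two components with separate versus shared shape parameters and compare by likelihood ratio. The quantal data are invented:

    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import chi2

    # (dose, n animals, n responding) per component -- hypothetical
    data = {"TCDD":  [(0, 50, 0), (3, 50, 6), (10, 50, 18), (30, 50, 35)],
            "PeCDF": [(0, 50, 0), (30, 50, 7), (100, 50, 20), (300, 50, 36)]}

    def nll(log_params, datasets, shared):
        # Weibull model P(d) = 1 - exp(-(b d)^k); log-scale params keep k, b > 0
        theta = np.exp(log_params)
        total = 0.0
        for i, obs in enumerate(datasets):
            k, b = (theta[0], theta[1 + i]) if shared else (theta[2 * i], theta[2 * i + 1])
            for d, n, r in obs:
                p = np.clip(1 - np.exp(-(b * d) ** k), 1e-9, 1 - 1e-9)
                total -= r * np.log(p) + (n - r) * np.log(1 - p)
        return total

    sets = list(data.values())
    fit_sep = minimize(nll, np.log([1, .04, 1, .004]), args=(sets, False), method="Nelder-Mead")
    fit_sh  = minimize(nll, np.log([1, .04, .004]),    args=(sets, True),  method="Nelder-Mead")

    lr = 2 * (fit_sh.fun - fit_sep.fun)          # shared-shape model is nested, df = 1
    print(f"LR = {lr:.2f}, p = {chi2.sf(lr, df=1):.3f}")
    b = np.exp(fit_sh.x)[1:]
    print(f"ReP of PeCDF vs TCDD (shared-shape fit): {b[1] / b[0]:.3f}")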

P.29  Web-based Bayesian Benchmark Dose Estimation System. Shao K*, Shapiro A; Indiana University Bloomington   kshao@indiana.edu

Abstract: The benchmark dose (BMD) methodology has been widely accepted as the preferred dose-response modeling approach in support of regulatory health risk assessment. Unlike the traditional No Observed Adverse Effect Level (NOAEL) method, which can be calculated using simple statistical testing, the BMD methodology requires a software package to perform model fitting and parameter estimation. The Benchmark Dose Software (BMDS), first published by the US Environmental Protection Agency (EPA) in 2000, played a key role in facilitating and promoting the BMD methodology. As the mainstream regulatory risk assessment community moves toward a probabilistic assessment framework, however, the point estimates produced by the current BMDS are not as robust as the output of more modern statistical methods. A BMD estimation system fully based on Bayesian statistics is being developed, along with a web-based interface for interacting with it. The application has a number of features, including estimation of a full distribution for the BMD instead of BMD/BMDL point estimates, risk/response calculations for user-specified dose levels, and quantitative model evaluation using model weights rather than model selection via the Akaike information criterion (AIC). The web-based application also handles software versioning and can store ongoing analyses, ensuring the transparency and reproducibility of risk assessments. Finally, the Bayesian framework has the potential to incorporate knowledge from various sources as prior information to inform model fitting and BMD estimation.
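
A minimal illustration of the Bayesian idea behind such a system: a grid posterior for a one-parameter exponential dose-response model, from which a full BMD distribution (rather than a single BMDL point) follows. The data, prior, and model are hypothetical simplifications:

    import numpy as np
    from scipy.stats import binom

    dose  = np.array([0.0, 10.0, 30.0, 100.0])
    n     = np.array([50, 50, 50, 50])
    cases = np.array([0, 4, 11, 28])

    k_grid = np.linspace(1e-4, 0.02, 2000)       # model: P(d) = 1 - exp(-k d)
    post = np.ones_like(k_grid)                  # flat prior on k
    for d, nn, r in zip(dose, n, cases):
        post *= binom.pmf(r, nn, 1 - np.exp(-k_grid * d))
    post /= post.sum()

    bmr = 0.10                                   # extra-risk benchmark
    bmd_grid = -np.log(1 - bmr) / k_grid         # BMD as a function of k
    mean_bmd = np.sum(post * bmd_grid)
    # 95th percentile of k corresponds to the 5th percentile of BMD (the BMDL)
    bmdl = bmd_grid[np.searchsorted(np.cumsum(post), 0.95)]
    print(f"posterior mean BMD = {mean_bmd:.1f}, BMDL (5th pct) = {bmdl:.1f}")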

P.31  Problem Formulation Efforts in the IRIS Program. Subramaniam R*, Birchfield N, Cooper G, Fite K, Flowers L, Li Z, Jones S, Rieth S, Starkey C, Cogliano V; US Environmental Protection Agency   Subramaniam.Ravi@epa.gov

Abstract: The US EPA’s Integrated Risk Information System (IRIS) Program has begun including a problem formulation step as one of the enhancements to the IRIS assessment development process recommended in recent National Research Council reports (2009, 2014). Problem formulation, which is informed by a prior scoping process that identifies Agency needs for the chemical being assessed, informs assessment development by identifying potential health endpoints to be evaluated by systematic review methods, scientific issues that will be addressed, and data gaps for consideration. To identify potential health effects and related scientific issues, a broad literature survey, as recommended by the NRC, is conducted that relies largely on past reviews and assessments by other governmental and international health agencies. This information is used to determine the health outcomes of most interest and to formulate the appropriate questions for systematic review. In addition, the problem formulation includes a focused discussion of other toxicologically relevant information representing potentially controversial issues to be addressed in the assessment. These can include toxicokinetic data and models, mode of action hypotheses, genotoxicity information, and data that might inform considerations of susceptibility. Problem formulation has recently been implemented in developing IRIS assessments for inorganic arsenic, naphthalene, ethylbenzene, PCBs (noncancer), and hexavalent chromium. The problem formulation document and the outcome of the internal scoping process are made publicly available and discussed in a public meeting with the scientific community and stakeholders. Drawing on these cases, our presentation will discuss lessons learned thus far regarding the general approach to problem formulation and advances for the future. The views expressed are those of the authors, and do not necessarily represent the views or policies of the U.S. EPA.

P.32  Benzo(a)pyrene [B(a)P]-induced colon tumorigenesis is enhanced by Western diet in the PIRC rat model. Harris KL*, Pulliam SR, Niaz MS, Okoro E, Gou Z, Washington MK, Adunyah SE, Ramesh A; Meharry Medical College and Vanderbilt-Ingram Cancer Center   kharris10@email.mmc.edu

Abstract: The objective of this study was to investigate the effect of diet type on benzo(a)pyrene [B(a)P]-induced colon cancer in an adult male rat model, the Polyposis In the Rat Colon (PIRC) kindred type. Groups of PIRC rats (n = 5) were fed the AIN-76A regular diet (RD) or a Western diet (WD) and received 25, 50, or 100 µg B(a)P/kg body wt. via oral gavage for 60 days. Rats fed the diets alone, with no B(a)P, served as controls. After exposure, rats were sacrificed; blood samples were collected and concentrations of cholesterol, triglycerides, leptin, glucose, insulin, and adiponectin were measured. Colon tissues were scored for tumors, preserved in 10% formalin, and examined for pathological changes. Colon and liver samples were analyzed for activation of the drug metabolizing enzymes (DMEs) CYP1A1, CYP1B1, and GST. Rats that received WD + B(a)P showed increased levels of cholesterol, triglycerides, leptin, and insulin in comparison to the RD + B(a)P groups and controls. Glucose levels showed a significant increase (p < 0.001) only at 100 µg B(a)P/kg body wt. + WD. Adiponectin concentrations did not vary much between the WD and RD groups at any B(a)P dose. Colon tumor counts were higher in B(a)P + WD rats than in their B(a)P + RD counterparts, exhibiting a B(a)P dose-response relationship, with 100 µg B(a)P/kg registering the greatest counts. Increased incidence of adenomas and high-grade dysplasia was encountered in rats fed the WD compared to the RD and controls (p < 0.05). Immunohistochemical analyses of colon tissue samples for PCNA, cyclin D1, TGF-β, and β-catenin revealed increased levels of cell proliferation and nuclear positivity among all treatment groups. Western diet consumption increased DME activation among rats that were given B(a)P. Our results demonstrate that WD accelerates the development of B(a)P-induced colon tumors through proinflammatory action, characterized by gains in tumor number and size, and body weight loss.

P.33  Characterizing Determinants of Risk: Concentration, Duration, and Timing of Exposure. Woodall GM, Hotchkiss AK, Makris SL, Jarabek AM*, Sams RL, Davis JA, Schlosser PM, Lin YS; US Environmental Protection Agency   woodall.george@epa.gov

Abstract: An ongoing challenge in human health risk assessment is to determine the best approach for characterizing the risk from real-world exposures. Three major determinants characterize exposure: concentration (how much), duration (the frequency and how long), and critical timing (when). Toxicity will vary with these exposure determinants and the pollutant's physicochemical properties to form a 3-D response surface representing the concentration/duration/response continuum. Systems-based characterizations of pathogenesis provide a more comprehensive understanding of disease progression via measurements of key events early in an adverse outcome pathway (AOP). Approaches to identify and incorporate the most relevant key events in AOPs advance their application in risk assessments and build better bridges with computational models of both exposure and toxicity, thereby ensuring assessments are based on the most contemporary methods using sound scientific evidence. The prognostic value of key events in an AOP for predicting apical disease states may allow endpoints that occur earlier or at lower levels on the continuum to be included in analyses of risk. Case studies of different chemical categories illustrate these concepts: 1) single acute-duration increases in exposure; 2) fluctuations in exposure levels (e.g., repeated episodic increases); 3) repair vs. accumulation of effects from long-term, low-level averages and/or episodic increases in exposure; and 4) exposures during susceptible life stages or windows of vulnerability. Effective communication of these complex evaluations in a clear and transparent fashion will be addressed through the development of visualization tools. This project supports a flexible array of response estimates characterizing dose from real-world exposure scenarios, across durations from acute to chronic, and builds capacity to apply computational approaches such as AOPs. The views are those of the authors, and not necessarily those of the U.S. EPA.

P.34  Development of the dose-response relationship for human Toxoplasma gondii infection associated with meat consumption. Guo M*, Buchanan RL, Dubey JP, Hill DE, Gamble HR, Jones JL, Pradhan AK; University of Maryland, College Park, MD; Agricultural Research Service, United States Department of Agriculture, Beltsville, MD; National Academy of Sciences, Washington, D.C.; Centers for Disease Control and Prevention, Atlanta, GA   miaoguo312@gmail.com

Abstract: Toxoplasma gondii is a protozoan parasite that is responsible for approximately 24% of deaths attributed to foodborne pathogens in the United States. A substantial portion of human T. gondii infections is thought to be acquired through the consumption of meat. The dose-response relationship for human exposure to T. gondii-infected meat is not known because no human data are available. The goal of this study was to develop and validate dose-response models based on animal studies, and to compute scaling factors so that animal-derived models can predict T. gondii infection in humans. Relevant studies in the literature were collected, and appropriate studies were selected based on animal species, life stage, genotype of T. gondii, and route of infection. Data were pooled and fitted to four sigmoidal mathematical models, with model parameters estimated using maximum likelihood estimation. Data from a mouse study were selected to develop the dose-response relationship. Exponential and beta-Poisson models, which predicted similar responses, were selected as reasonable dose-response models based on their simplicity, biological plausibility, and goodness of fit. The confidence interval of the parameter was determined by constructing 10,000 bootstrap samples. Scaling factors were computed by matching predicted infection cases with epidemiological data, and the mouse-derived models were validated against dose-infection data in rats. The resulting human dose-response models are P(d) = 1 - exp(-0.0015 × 0.005 × d) and P(d) = 1 - (1 + d × 0.003/582.414)^(-1.479). Both models predict the human response after consuming T. gondii-infected meat and provide an enhanced risk characterization in a quantitative microbial risk assessment model for this pathogen.
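
The two human dose-response models quoted above, implemented directly (d is the ingested dose of T. gondii organisms):

    import numpy as np

    def p_exponential(d):
        # P(d) = 1 - exp(-0.0015 x 0.005 x d)
        return 1 - np.exp(-0.0015 * 0.005 * d)

    def p_beta_poisson(d):
        # P(d) = 1 - (1 + d x 0.003 / 582.414)^(-1.479)
        return 1 - (1 + d * 0.003 / 582.414) ** -1.479

    for d in (10, 100, 1000, 10000):
        print(f"dose {d:>6}: exponential {p_exponential(d):.4f}  "
              f"beta-Poisson {p_beta_poisson(d):.4f}")
    # as the abstract notes, the two models predict similar responses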

P.35  A dose response model for the Mycobacterium avium complex that takes into account recent developments in taxonomy and epidemiology for use in quantitative microbial risk assessment models. Hamilton KH*, Haas CN; Drexel University   kh495@drexel.edu

Abstract: The Mycobacterium avium complex (MAC) is a group of waterborne pathogens of great public health importance, known to be harbored, amplified, and selected for more human-virulent characteristics by amoeba species in biofilms. Nevertheless, a quantitative microbial risk assessment (QMRA) has not been performed, owing to the lack of dose-response models, which in turn reflects significant heterogeneity within even a single species or subspecies of MAC as well as the range of human susceptibilities to mycobacterial disease. The primary human-relevant species and subspecies responsible for the majority of the human disease burden and present in drinking water, biofilms, and soil are M. avium subsp. hominissuis, M. intracellulare, and M. chimaera. Infectious dose studies and development of a dose-response model for MAC have been identified as a top priority for opportunistic pathogens in premise plumbing; development of such a model will inform future infectious dose studies of MAC and identify the research gaps that must be filled to complete a QMRA. Five new exponential dose-response models are proposed here for human-relevant species of MAC, with endpoints of lung lesions, death, and lymph node lesions in experimental in vivo animal models. Although the physical and biochemical tests currently used in clinical settings do not differentiate between M. avium and M. intracellulare, differentiating between environmental species and subspecies of the MAC can aid in the assessment of health risks and the control of MAC sources.

P.36  Predicting a change in newborn’s birth weight based on maternal exposures to lead. Lynch MTK*, Brown LPM; Abt Associates   meghan_lynch@abtassoc.com

Abstract: Blood lead levels have continued to decline in recent years. However, recent studies continue to show adverse effects of lead exposure across a diverse set of health endpoints, and no safe level of lead exposure has been identified. This suggests that further declines in lead exposure below today’s levels could still yield important benefits. For example, an NTP Monograph found “sufficient evidence that maternal blood Pb levels <5 µg/dL are associated with reduced fetal growth and lower birth weight” (National Toxicology Program, 2012, p. 139). We therefore closely evaluated the literature considered in this Monograph, other government reports such as EPA’s Integrated Science Assessment (2013), and the primary literature on this topic published since 2012. We propose a concentration-response function relating blood lead levels in mothers to lower birth weight in newborns, and we discuss the uncertainties of the proposed function. This function could potentially be used to support the benefits analysis of future regulations intended to reduce lead exposure in adults.
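
The proposed function itself is not given in the abstract; the sketch below only illustrates the general form such a concentration-response calculation could take in a benefits analysis, with a placeholder slope that is not the authors' value:

```python
# Hypothetical linear concentration-response function: mean change in birth
# weight (grams) per 1 ug/dL change in maternal blood lead. The slope below
# is a placeholder for illustration, not the function proposed in the poster.
BETA_G_PER_UGDL = -25.0

def birth_weight_change(delta_blood_pb_ugdl, beta=BETA_G_PER_UGDL):
    """Predicted mean change in birth weight for a given change in blood lead."""
    return beta * delta_blood_pb_ugdl

# Example: benefit of lowering mean maternal blood lead from 2.0 to 1.5 ug/dL
print(birth_weight_change(1.5 - 2.0), "grams on average")
```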

P.37  From Literature Search to Evidence Integration. Henning CC*, Turley AT; ICF International   cara.henning@icfi.com

Abstract: ICF developed DRAGON, an online tool for systematic literature review, to meet the challenge of managing data and decisions to inform human health risk assessments. Following a literature search, integrating evidence from epidemiology, toxicology, and in vitro studies is necessary to support decisions about the hazard caused by a chemical, but organizing the disparate data yielded by a comprehensive search proves challenging across species, exposure routes, and outcomes. DRAGON’s recently improved functionality has helped us to better organize literature returned from a comprehensive search, resulting in more efficient review of titles and abstracts. The parallel structure of the new extraction modules in DRAGON allows for more thorough integration of data from human, animal, and cell studies. We demonstrate the application of these modules for a sample human health risk assessment.

P.39  Development of an Age Dependent Dose Response Model for Western, Eastern and Venezuelan Encephalitis Viruses. Mraz AL*, Weir MH, Nappier SP, Haas CN; Temple University   alexis.f.layman@gmail.com

Abstract: Equine encephalitis viruses (EEVs) are causative agents of febrile disease and encephalitis, an inflammation of the brain. The three types of EEVs, Western (WEEV), Eastern (EEEV), and Venezuelan (VEEV) equine encephalitis viruses, are clinically indistinguishable, making monitoring and treatment more difficult. EEVs are typically transmitted via insect vectors, but can also be infectious via droplet or aerosol transmission, making them a bioterror option. VEEV is the most likely biowarfare candidate and was previously developed as such by the United States and the former USSR. Areas with poor sanitation and water infrastructure may experience increased risks compared to other regions; given the dynamics of potential exposure routes, areas with more robust infrastructure may still be at risk, especially in light of global climate change. Quantifying these risks has been impossible due to the lack of a dose-response model. This research not only develops a dose-response model for the three types of EEV, but also develops an age-dependent dose-response model. The age dependency is important because EEVs, like other pathogens, affect the young and the elderly more severely. Understanding the risk of EEVs in susceptible populations is therefore an important tool for scientific professionals and engineers. With dose response being the yardstick of risk, this model is vital to a quantitative microbial risk assessment model for utilities, health professionals, and scientists. The methodology of the model optimization, as well as the use of the resulting dose-response models, will be presented.

P.40  Reviewing evidence of time-dependent toxicities of organic and inorganic mercury in the developing brain. Pletz J*, Tennekes HA, Sánchez-Bayo F; Experimental Toxicology Services (ETS) Nederland BV (1,2), The University of Sydney (3)   julia.pletz@yahoo.co.uk

Abstract: Methylmercury is commonly found in freshwater and saltwater fish as a result of bioaccumulation (increasing concentrations in tissues over time). Ethylmercury is a metabolite of thimerosal, a mercury-containing chemical used as a preservative in some vaccines. The most sensitive target of organic mercury exposure, particularly methylmercury, is the nervous system. In children, methylmercury exposure during pregnancy (in utero) has been associated with delays in reaching developmental milestones (e.g., age at first walking) and decreases in intelligence, increasing in severity with increasing exposure. Ethylmercury exposure from thimerosal in some vaccines has been associated, in some studies, with autism and other neurological disorders in children. A review of the scientific literature on mercury-induced toxicity highlights that latency periods have typically been observed, even in highly exposed individuals intoxicated with methylmercury or ethylmercury. This poster reviews dose-time-response relationships observed following exposure to organic mercury compounds. The mode of action underlying time-dependent toxicity is irreversible binding to critical receptors, causing adverse and cumulative effects; covalent binding to thiol groups of intracellular enzymes and structural proteins is presumed to cause serious adverse effects. We have examined whether dose-response data from in vitro and in vivo studies fit the Druckrey-Küpfmüller equation c·t^n = constant (c = exposure concentration, t = latency period), first established for genotoxic carcinogens, and whether irreversible effects are reinforced by exposure time (n ≥ 1) or whether toxic effects are expressed mainly at higher concentrations, with time having only a minor influence on the adverse outcome (n < 1). The results will be discussed in the light of experimental and epidemiological data.
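
Given paired (c, t) observations, the exponent n in the Druckrey-Küpfmüller relation can be estimated by log-log regression, since c·t^n = K implies log c = log K − n·log t. A sketch with invented data:

```python
import numpy as np

# Hypothetical (concentration, latency) pairs, for illustration only
c = np.array([30.0, 10.0, 3.0, 1.0])       # exposure concentration
t = np.array([40.0, 90.0, 300.0, 700.0])   # latency period (days)

# Druckrey-Kupfmuller: c * t**n = K  =>  log c = log K - n * log t,
# so the fitted slope of log c against log t equals -n.
slope, intercept = np.polyfit(np.log(t), np.log(c), 1)
n = -slope
print(f"estimated n = {n:.2f} (n >= 1 suggests effects reinforced by exposure time)")
```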

P.41  Characterization and Application of High-Throughput Platform-Based Quantitative Screening Estimates. Wesselkamper SC*, Zhao QJ, Lambert JC; U.S. Environmental Protection Agency, National Center for Environmental Assessment, Cincinnati, OH   wesselkamper.scott@epa.gov

Abstract: Over the past four decades, the U.S. EPA has made significant progress in protecting human health and the environment from the adverse effects of chemical exposures. Nonetheless, several U.S. EPA Programs and Regions are often tasked with addressing the potential hazard(s) to human health and the environment of chemicals for which little to no data exist. The shared problem formulation in this context warrants basic hazard identification and associated quantitative dose-response assessment for screening and prioritization purposes. Considering the lack of repeat-dose toxicity data for a significant number of potentially hazardous chemicals of interest to U.S. EPA client offices, new investigations have been proposed to develop approaches for interpreting and applying non-traditional, higher-throughput data to human health risk assessment and technical support efforts conducted within the U.S. EPA’s Human Health Risk Assessment (HHRA) program. Initially, these approaches will address chemicals with inadequate or non-existent hazard databases, will utilize data from alternative platforms such as structural read-across/(quantitative) structure-activity relationships [(Q)SAR], in vitro biological activity assays (e.g., ToxCast), and toxicogenomics, and will build upon output data from the U.S. EPA’s Chemical Safety for Sustainability (CSS) program. This effort is highly relevant because alternative methods and data may fill a critical need and be directly responsive to decision-maker needs across U.S. EPA Programs and Regions. The long-term objective is to perform methods development and proof-of-concept evaluations that inform how data from these alternative platforms may ultimately yield quantitative screening estimates and other fit-for-purpose applications for large numbers of chemicals.

P.42  Cumulative Risk Assessment of Methyl Yellow Residues in Food. Huang YW*, Wu KY; National Taiwan University   a-3306067@hotmail.com

Abstract: In 2014 it was found that dried tofu products had been adulterated with an industrial dye, para-dimethylaminoazobenzene (DAB, methyl yellow), intermittently for the past 20 years. The scandal was initially uncovered by authorities in Hong Kong, and many flavored types of dried preserved tofu were recalled after the contamination came to light. Dose-response aspects of the carcinogenicity of DAB in rats following oral administration were investigated by Druckrey (1943, 1951, 1967). Druckrey (1948) gave daily dosages of 1, 3, 10, 20 or 30 mg DAB/rat over the animals’ lifespans. All doses produced liver tumours, the induction time being inversely proportional to the daily dose and ranging from 34 days in rats given 30 mg DAB/day to 700 days in rats given 1 mg DAB/day. For rats given 3-30 mg/rat/day, the total carcinogenic dose was about 1000 mg. This study aims to estimate human exposure to methyl yellow from food, provide a quantitative risk assessment, and further estimate a permissible concentration in food. A number of dose-response models can be fitted to dichotomous response data using the US EPA Benchmark Dose Software. The benchmark dose method involves fitting a mathematical model to all the dose-response data within a study, so that more biological information is incorporated in the resulting estimates of guidance values. The human dietary intake of methyl yellow via foods was estimated based on consumption data for different food groups from the National Food Consumption Database. The permissible concentration in food was estimated using the usual limit for cancer risk of 1 in 1,000,000. The BMDL was estimated at 0.710741 mg/kg bw/day, and the permissible concentration in food was estimated at less than 4.24 ppb. We strongly recommend that the methyl yellow content of food be kept below 4.24 ppb.
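
The chain from BMDL to permissible food concentration follows standard linear low-dose extrapolation. The sketch below shows only the generic structure of that calculation; the body weight and intake rate are placeholders, not the study's inputs:

```python
# Generic structure of a permissible-concentration calculation (illustrative;
# the poster's exact consumption inputs are not reproduced here).
TARGET_RISK = 1e-6        # acceptable lifetime cancer risk
BMDL10 = 0.710741         # mg/kg bw/day, as reported in the abstract
CSF = 0.10 / BMDL10       # slope factor via linear extrapolation from the BMDL10
BODY_WEIGHT = 70.0        # kg, assumed adult body weight (placeholder)
INTAKE = 0.1              # kg/day of the relevant food group (placeholder)

allowable_dose = TARGET_RISK / CSF                        # mg/kg bw/day
permissible_conc = allowable_dose * BODY_WEIGHT / INTAKE  # mg/kg food
print(f"permissible concentration ~ {permissible_conc * 1e3:.2f} ppb")
```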

P.43  Food Safety Assessment on Butter yellow, 4-dimethylaminoazobenzene. Chiang SY*, Huang YW, Wu KY; China Medical University, Taiwan   sychiang@mail.cmu.edu.tw

Abstract: 4-Dimethylaminoazobenzene (DAB), known as butter yellow, methyl yellow, and dimethyl yellow, was once used as a food colorant to color butter, but was soon discovered to be a potent carcinogenic dye and was banned in the USA in 1945. A countrywide recall of dried bean curd was announced by the Taiwan FDA after the discovery of possible traces of banned DAB in more than 100 tofu-related foods in 2014. DAB is mutagenic in Ames tests, causes sister-chromatid exchange in mammalian cells, and induces hepatoma in rodents. Its potential health effects have been of great concern to consumers in Taiwan. This study assessed the cancer risk of DAB in foods. Benchmark dose software was used to fit the hepatoma data from male B6C3F1 mice; the best-fit BMDL10 was 0.71 mg/kg/day, estimated using the multistage model. The cancer slope factor was estimated at 0.51 (mg/kg/day)^-1 following the 2005 US EPA guidelines for carcinogen risk assessment. The daily intake rates of the more than 100 foods were taken from the Taiwan National Food Consumption Database; the total daily intake rate of these foods is 460 g/day. The estimated maximum allowable residue is 4 µg/kg in these foods, given a negligible cancer risk of one in a million.

P.44  Impact of Statins Use and Air Pollution on Stroke among Diabetes Mellitus Patients. Ho WC*, Wu TT, Lin MH, Fan KC, Lin YS, Chen PC, Wu TN, Sung FC, Lin RS; China Medical University   whocmu@gmail.com

Abstract: Diabetes mellitus (DM) patients have a higher incidence of stroke. Statins are widely used for hyperlipidemia and cardiovascular disease because of their cholesterol-lowering effect. Studies have reported that statins may reduce the risk of stroke through their anti-inflammatory effect, but more research is needed. Air pollution induces oxidative stress, systemic inflammation, and endothelial dysfunction in blood vessels, which can cause stroke, especially among DM patients. The objective of this study is to estimate the effects of statins and air pollution exposure on stroke among DM patients. The study design was a retrospective cohort study. We used the medical records of subjects, including diagnosis of DM, stroke events, and statin use, from the Longitudinal Health Insurance Database 2000 (LHID2000). Subjects were patients aged >18 years with a first-time diagnosis of DM. We collected and analyzed their first occurrence of stroke after the first-time diagnosis of DM, and used their statin prescription records to calculate the cumulative defined daily dose (cDDD). Air pollution data were collected from the high-density Taiwan Environmental Protection Administration monitoring stations and used to estimate concentrations with Geographic Information Systems (GIS). Cox proportional hazards models with time-dependent covariates for statins and air pollution exposure were used, adjusted for gender, comorbidities, co-medication, and other sociodemographic characteristics. Hypothesis testing was two-sided at the 0.05 significance level and was performed using SAS software version 9.4. The results show that air pollution increases the risk of a first stroke and that statins may reduce this risk among DM patients; potential antagonistic effects were found. Using statins and preventing the adverse effects of air pollution can both be important in reducing stroke risk among DM patients, potentially through an inflammation-mediated mechanism. More studies are suggested.
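
The cDDD metric used here is conventionally computed as the total dispensed drug amount divided by the WHO defined daily dose, summed over prescriptions. A small illustration; the prescription records are hypothetical, and the DDD values shown are believed correct but should be verified against the WHO ATC/DDD index:

```python
import pandas as pd

# Hypothetical prescription records for one patient (illustrative only)
rx = pd.DataFrame({
    "drug":       ["atorvastatin", "atorvastatin", "simvastatin"],
    "total_mg":   [28 * 20.0, 28 * 20.0, 30 * 40.0],  # tablets x strength
    "who_ddd_mg": [20.0, 20.0, 30.0],                 # WHO defined daily dose
})

# Cumulative defined daily dose (cDDD): dispensed amount / WHO DDD, summed
rx["ddd"] = rx["total_mg"] / rx["who_ddd_mg"]
print("cDDD =", rx["ddd"].sum())
```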

P.45  Statin use and age-specific risk of cancer in patients with hypertension. Chou YJ*, Ho WC, Tsan YT, Wu TT, Lin MH, Chan WC, Chen PC, Wu TN, Sung FC, Lin RS; China Medical University   yirongchou@gmail.com

Abstract: Hypertension is a major worldwide public health issue. Statins are widely used for hyperlipidemia and for preventing coronary heart disease because of their cholesterol-lowering effect, so hypertension patients may have higher exposure to statins. Preclinical evidence has shown that statins are potential anticancer agents through their anti-inflammatory, antiproliferative, proapoptotic, and anti-invasive effects. Many studies have therefore investigated the association between statins and cancer risk among the general population and diabetes mellitus (DM) patients. The objective of this research is to investigate whether statin use is associated with cancer incidence in patients with hypertension. The study design is a retrospective cohort study. The medical records of subjects, including cancer events and statin use, were collected from the Longitudinal Health Insurance Database 2005 (LHID2005). Cox proportional hazards regression models were used to estimate the relationship between statin use and cancer occurrence in patients with hypertension. The statin exposure dosage, estimated as the sum of the dispensed defined daily doses (DDDs) of any statin, was used to relate statin use to cancer risk. The results show that statin use may reduce the risk of cancer incidence in patients with hypertension. It is important to assess cancer risk in patients with hypertension, and further study is warranted.

P.49  Cumulative Risk Assessment of Pesticides in the Taiwan Population. Chen YH*, Wu CH, Wu KY; National Taiwan University   chendwayne@gmail.com

Abstract: A cumulative health risk assessment was performed on five pesticides that are widely used on various vegetables and fruits in Taiwan. These pesticides are known to inhibit acetylcholinesterase (AChE), which leads to neurological and reproductive toxicity. Notably, among the five, chlorothalonil is classified as carcinogenic to animals. Since agriculturalists often mix multiple pesticides during cultivation, assessing cumulative health risks reflects realistic hazards more accurately than assessing a single pesticide. The present study evaluates the health risks of the five pesticides by calculating reference doses (RfDs) using the Benchmark Dose Software. The BMDL10 values of carbofuran, dimethoate, methamidophos, and terbufos are 0.206, 2.514, 0.387, and 0.006 mg/kg/day, respectively. Since chlorothalonil is carcinogenic, its cancer slope factor of 0.0074 was used. Applying Bayesian probability combined with Markov chain Monte Carlo (MCMC) simulation, data from the National Food Consumption Database of Taiwan were used to conduct the exposure assessment. This study established ADIs (carbofuran: 2.6 x 10^-3, dimethoate: 2.15 x 10^-2, methamidophos: 3.87 x 10^-3, and terbufos: 5.83 x 10^-2 mg/kg/day) that are currently absent in Taiwan and that can be used by risk managers. The lifetime average daily doses (LADDs), obtained using MCMC to estimate exposure, are much lower than those obtained using the pesticides’ official MRLs. This indicates that the MRLs set by Taiwanese authorities could result in a lenient LADD. Although both the individual and the combined HI values indicate that consumers are not subject to potential adverse health effects, further investigation of consumers’ exposure to multiple compounds is warranted. A review of the risk management protocol for pesticides should be considered, since compliance with current regulations may be inadequate to safeguard health.
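
The combined-risk screen reduces to a hazard index, HI = Σ LADD_i / ADI_i, compared against 1. A minimal sketch using the ADIs reported above and placeholder exposure estimates:

```python
# Hazard index for combined exposure: HI = sum(LADD_i / ADI_i).
# The ADIs are those reported in the abstract; the LADDs are placeholders.
adi = {"carbofuran": 2.6e-3, "dimethoate": 2.15e-2,
       "methamidophos": 3.87e-3, "terbufos": 5.83e-2}   # mg/kg/day
ladd = {"carbofuran": 1e-4, "dimethoate": 5e-4,
        "methamidophos": 2e-4, "terbufos": 1e-3}        # placeholder exposures

hi = sum(ladd[p] / adi[p] for p in adi)
print(f"HI = {hi:.3f} -> {'potential concern' if hi > 1 else 'below level of concern'}")
```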

P.53  Risk and Insurance Demand. Seog SH*; Seoul National University   seogsh@snu.ac.kr

Abstract: Risk represents the possible variation of events in the future. As the future can be understood only through the present, risk should be understood and estimated from the present. The interaction between the present and the future, however, is not fully reflected in the standard insurance literature. Often, the standard approach is built on one-time models. In a one-time model, interaction occurs only between different states in the future. While the interaction between states is important, such a "snapshot" model virtually ignores an essential part of risk: the interaction between different times. We analyze insurance demand in a two-time model with both consequential and intrinsic risk aversion, where intrinsic risk aversion implies being worried or nervous. First, we find that an actuarially fair premium does not guarantee full insurance in general, unlike in the standard approach; incomes at different times and discount factors are also important in determining insurance demand. Second, an actuarially fair premium leads to full insurance only if the individual has access to a capital market. That is, the standard result is not obtainable if insurance demand is considered in isolation. Third, the additional consideration of intrinsic risk aversion leads to higher insurance demand. Fourth, when there is a capital market, the additional consideration of intrinsic risk aversion leads to higher insurance demand but lower savings.

P.54  What Drives Economic Contagion? Findings from a Borrower-lender Game. Welburn J*; University of Wisconsin - Madison   welburn@wisc.edu

Abstract: The 2010 Eurozone debt crisis demonstrated a significant source of risk to economic stability; adverse economic events in one country can quickly spread across countries and regions. While other post-crisis spillover effects occur over extended periods of time, this spread of crises, a phenomenon known as contagion, results in effects that occur in the short-term. The process by which contagion occurs can follow from cascading shocks spread through trade and debt channels. Alternatively, the same effect can be produced by common-cause shocks where multiple countries experience similar adverse shocks simultaneously. Our aim is to elucidate the key driver of contagion, whether it be debt, trade, the combined effect of debt and trade, or a common cause. We adopt an interdisciplinary approach by combining traditional economic modeling with methods from engineering risk analysis. We present a within-period sequential-move game with two borrower countries and a single lender to model the process of contagion immediately following a crisis. Each borrower can receive adverse independent and common-cause shocks, which can then spread through debt and/or trade channels. We model the effect of specific contagion channels through subgames – a debt model, a trade model, and a model with both debt and trade (to highlight their interaction). Furthermore, we present computational results calibrated to the 2010 Eurozone crisis. We discuss how contagion could occur through each channel, demonstrate that lender beliefs can drive contagion even in the absence of cascading shocks, and give recommendations on how future crises could be managed.

P.55  Efficient food standards for radiocaesium based on cost-benefit analysis of the regulation. Oka T*; Fukui Prefectural University   oka@fpu.ac.jp

Abstract: After the Fukushima Daiichi nuclear accident in March 2011, the Japanese government determined provisional regulation values for radioactive substances and began to regulate the distribution of contaminated agricultural products. Since April 2012, stricter food standards (for example, 100 Bq/kg for the sum of Cs-134 and Cs-137 in food in general) have been applied. In this study, the costs of the regulation are estimated in terms of cost per life-year saved (CPLYS) and set against the benefits. On the basis of those results, efficient values for the food standards are proposed, in the sense that the cost does not exceed the benefit. The CPLYS values for the prohibition of distribution of vegetables and rice are 8.0 million to 51 million yen and 310 million to 1.0 billion yen, respectively. That for halting production of dried persimmon is 290 million yen. That for the countermeasures carried out in rice production (fertilization with potassium and zeolite, and deep cultivation) is at least 300 million to 1.1 billion yen. That for the countermeasures in the production of persimmon (bark washing) is 37 million yen. Most of the CPLYS values are larger than the value of a life-year, 20 million yen. The efficient values for the food standards are estimated to be 1000 Bq/kg for vegetables, 720 Bq/kg for rice, and 3600 Bq/kg for dried persimmon in the case of regulation of distribution after production. The existing countermeasures will be efficient if the reduction in caesium concentration of white rice is greater than 600 Bq/kg, and if the initial caesium concentration in dried persimmon is greater than 450 Bq/kg. This implies that, once the countermeasures are taken into account, the efficient standard is not likely to be lowered in the case of rice, whereas in the case of dried persimmon it may be significantly reduced, though it remains greater than the present standard.
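
The efficiency test applied throughout the abstract is a direct comparison of CPLYS with the assumed value of a life-year. A sketch with illustrative inputs:

```python
# Cost per life-year saved (CPLYS) screening test: a countermeasure is
# efficient only if CPLYS does not exceed the value of a life-year.
VALUE_OF_LIFE_YEAR_YEN = 20e6  # 20 million yen, as assumed in the abstract

def is_efficient(cost_yen, life_years_saved):
    """Return (CPLYS, efficient?) for a given countermeasure."""
    cplys = cost_yen / life_years_saved
    return cplys, cplys <= VALUE_OF_LIFE_YEAR_YEN

# Illustrative numbers only, not the study's estimates
cplys, ok = is_efficient(cost_yen=3.7e7, life_years_saved=1.0)
print(f"CPLYS = {cplys:.2e} yen/life-year, efficient: {ok}")
```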

P.56  Modeling the Economic Cost of Non-Fatal Injuries from Terrorist Attacks. Heatwole NT*; University of Southern California   nat@ajheatwole.com

Abstract: Terrorist attacks can result in a wide array of consequences - deaths, injuries, property damage, business interruption, etc. One aspect of terrorism risk analysis, therefore, is assessing how reductions in terrorism morbidity might be compared against reductions in the other consequence dimensions. However, many terrorism risk analyses sidestep the issue of injury values by examining only fatalities. The severity of injury is assessed using: 1) the six-level Abbreviated Injury Scale (AIS), for which various monetary values of injury have been presented in the literature (including medical costs, lost work, and more quality-related costs); 2) the Injury Severity Score (ISS), a companion of the AIS that is especially useful in cases of multiple (simultaneous) injuries; and 3) the portion of injured victims who are hospitalized overnight (versus emergency department treated/released), combined with the number of days they remain in the hospital. A literature review located several studies of terrorism injuries that each use one or more of these severity metrics, and various meta-regression techniques (ordinary least squares, binary logistic, ordinal logistic, and stepwise) are then used to link the severity of terrorism injuries to various attributes related to the attack (e.g., weapon type), the victims (e.g., age, gender), background conditions (e.g., per capita GDP of country), and other parameters. A method is also presented for mapping the AIS monetary values of injury onto the ISS scale (or, equivalently, for mapping ISS values onto the AIS). The severity distribution for terrorism injuries is also compared to that for other injury types/contexts (e.g., motor vehicle accidents, all injuries in the U.S. generally). Although the model focuses on terrorism injuries specifically, the methods developed can be applied to any hazard for which the severity of the injuries (as assessed using one of the aforementioned metrics) can be specified.
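
For context, the ISS is conventionally derived from AIS scores as the sum of squares of the three highest scores from different body regions, with any AIS of 6 setting the ISS to its maximum of 75. A short sketch of that standard mapping:

```python
def injury_severity_score(region_ais):
    """ISS from per-body-region AIS scores (standard definition): sum of
    squares of the three highest AIS values, each from a different body
    region; any unsurvivable injury (AIS 6) sets ISS to the maximum, 75."""
    if any(a == 6 for a in region_ais.values()):
        return 75
    top3 = sorted(region_ais.values(), reverse=True)[:3]
    return sum(a * a for a in top3)

# Example: head AIS 4, chest AIS 3, extremity AIS 2 -> 16 + 9 + 4 = 29
print(injury_severity_score({"head": 4, "chest": 3, "extremities": 2,
                             "abdomen": 1, "face": 0, "external": 0}))
```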

P.57  Achievement of a good balance between the enhancement of risk reduction and production – An economic experiment approach. Makino R*, Akai K, Takeshita J; AIST   ryoji-makino@aist.go.jp

Abstract: Industrial accidents, for example explosions at chemical plants, are major technological risks. Such disasters not only disrupt firms’ production processes but also cause damage to human health and ecosystems. Although estimating risk through Probabilistic Risk Analysis (PRA) has primarily been a non-behavioral, physical-engineering approach, the behavioral dimension should be taken into account when assessing the reliability of a system. To fill this gap, Hausken (2002) [1] presented a theoretical model merging PRA and game theory. Makino & Takeshita (2013) [2] conducted an economic experiment based on Hausken’s model and suggested the importance of introducing the behavioral dimension into PRA. In our previous study, however, we focused mainly on firms’ risk reduction activities and ignored their production activities. This was a shortcoming, because a firm in general faces a trade-off between the resources allocated to risk reduction measures and those allocated to production enhancement. In this study, we conduct an economic experiment in which subjects playing the role of top managers or workers of a firm face the trade-off described above. The purpose of this study is to establish an effective way of achieving a good balance between risk reduction and production enhancement. [1] Hausken, K. (2002), “Probabilistic Risk Analysis and Game Theory,” Risk Analysis 22, 17-27. [2] Makino, R. and Takeshita, J. (2013), “Can game theory predict the human behavior on safety? From the viewpoint of an economic experiment,” Society for Risk Analysis annual meeting 2013.

P.58  Advancing Methods for Benefits Analysis. Bateson TF*, Blessinger T, Subramaniam R, Axelrad DA, Dockins C; U.S. Environmental Protection Agency   bateson.thomas@epa.gov

Abstract: Benefit-Cost Analysis is widely employed and accepted for evaluating environmental policies, and is required by Executive Order 12866 and certain statutes. For most contaminants, there are few tools with which to evaluate the non-cancer human health benefits of exposure reductions. This presentation describes an effort that brings together economists, epidemiologists, statisticians, and toxicologists from across the U.S. Environmental Protection Agency for the purpose of quantifying health risks and their associated economic valuations. Key issues to be addressed include: 1) Need for standardized weight of evidence conclusions for all non-cancer hazards to communicate clearly and provide improved support for benefits analysis; 2) Need to explore how to include effects with a “suggestive” or “possible” weight of evidence conclusion, and how to incorporate weight-of-evidence uncertainty into the benefits analysis; 3) Need to estimate risk at a given dose by applying dose-response modeling techniques and incorporating uncertainty and variability, which will directly support economic analysis and provide additional information for decision makers; 4) Need to establish linkages of upstream and early biomarkers of effects to health outcomes that are amenable to economic valuation. These efforts will enable a more rigorous and informative characterization of health risks. Disclaimer: The views expressed in this abstract are those of the authors and do not represent U.S. EPA opinions and/or policy.

P.59  Benefit Analysis of Vehicle Crash Imminent Braking Systems for Bicyclist Fatality Reduction. Good DH*, Chien S, Li L, Christopher L, Zheng J, Krutilla K, Tian R, Chen Y; Indiana University and Indiana University - Purdue University Indianapolis   good@indiana.edu

Abstract: The World Health Organization estimates that there are approximately 1.2 million fatalities per year resulting from traffic crashes. About half of the victims are not the occupants of a motor vehicle and do not benefit from improvements in passive safety systems such as seat belts or air bags. In response, motor vehicle manufacturers and tier 1 suppliers have been developing and deploying a new generation of active safety systems designed to protect pedestrians and bicyclists. While bicycle crashes with motor vehicles are rarer in the US than in other countries, they are expected to increase as Americans improve their health through mobility. This study evaluates the benefits of two systems with respect to their ability to mitigate crashes with bicycles. Compared to pedestrians, cyclist crashes are far more complex (for example, pedestrians do not attempt to merge into traffic, and their range of speeds is much lower). We develop testing scenarios based on US crash data and surrogate cyclist targets that are realistic from the radar and visual perspectives, develop injury prediction models, and test these vehicles on a track with repeated trials. In addition to an estimate of benefits, we offer suggestions for implementing these procedures in the DOT’s New Car Assessment Program (NCAP) and in future modifications to the Federal Motor Vehicle Safety Standards.

P.61  The Social and Economic Effects of Wage Violations: Estimates for California and New York. Forsell T*, Haverstick K, Nadeau L; Eastern Research Group, Inc. (ERG)   Tess.Forsell@erg.com

Abstract: This project estimated compliance with labor laws and the costs of non-compliance. The Fair Labor Standards Act (FLSA) sets national standards for a minimum hourly wage, maximum hours worked per week at the regular rate of pay, and premium pay if the weekly standard is exceeded (overtime pay). State governments can implement labor laws that provide higher wage floors, more restrictive overtime laws, or stricter exemption requirements. Accounting for variation in state labor law adds to the complexity of evaluating compliance. The study focused on minimum wage violations in California and New York. We used two large nationally representative datasets, the Current Population Survey (CPS) and the Survey of Income and Program Participation (SIPP). The first step in estimating the costs and benefits of non-compliance was to evaluate which workers are covered by these labor laws. Lost wages to workers, the primary cost, were then estimated. Using the CPS, an estimated $22.5 million in wages were lost per week by workers in California and $10.2 million per week by workers in New York in 2011 due to minimum wage violations. However, failure to comply with the FLSA and state labor laws has implications far beyond the dollar amount of unpaid wages. Lack of compliance with the FLSA and state minimum wage laws contributed to higher poverty rates, lower government revenue (due to lower employment and income tax payments by employees), and higher government expenditures on social support programs (such as the Supplemental Nutrition Assistance Program). Estimates of the number of families in poverty, lost tax revenue, and government expenditures on social support programs were also constructed. Thus, lack of compliance with labor laws can impact the Department of Labor’s goal of providing a standard of protection to the labor force.
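
The core lost-wage calculation implied by this method is a per-worker hourly shortfall times hours worked. A minimal sketch with invented records; real estimates would also apply survey weights and the coverage rules discussed above:

```python
import pandas as pd

# Illustrative worker records: hourly wage paid vs. applicable minimum wage
workers = pd.DataFrame({
    "wage_paid":  [7.25, 8.50, 6.00, 9.00],
    "min_wage":   [8.00, 8.00, 8.00, 8.00],  # placeholder state minimum
    "hours_week": [40, 35, 20, 40],
})

# Weekly lost wages from minimum wage violations:
# hourly shortfall (if any) times hours worked
workers["lost"] = ((workers["min_wage"] - workers["wage_paid"]).clip(lower=0)
                   * workers["hours_week"])
print("weekly lost wages:", workers["lost"].sum())
```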

P.63  Release of silver nanoparticles from nanocomposite water treatment membranes: an assessment of potential environmental exposures across the product’s life cycle. Rice JR*, Wiesner M; Duke University   jacelyn.rice@duke.edu

Abstract: Incorporating nanomaterials can provide membranes with unique properties by inducing new characteristics and functions, and can aid in tuning physicochemical membrane properties. In recent years, progress has been made in the development of polymer-matrix nanocomposite membranes for water treatment, giving way to the design of next-generation polymeric membranes with high performance and antifouling properties. One such example is the incorporation of silver nanoparticles (NPs) into polymer membranes for bacterial resistance. While previous studies debate the adverse effects of silver nanoparticle exposure on environmental species, it is generally accepted that dissolution of silver nanoparticles accounts for some degree of the observed toxicity. The potential effects of leached nanomaterials on the environment highlight the need to quantify release under varying use and disposal scenarios. Here we aim to help address this research need by exploring the roles of polymer type, dissolution matrix, and membrane pore size in the release and transformation of silver nanoparticles from embedded polymer membranes. In doing so, we (1) synthesize and characterize several types of polymer membranes impregnated with Tween-20 nanoparticles, (2) perform passive release experiments designed to mimic the material and chemical stresses to which the product is subjected during the use and disposal phases of its life cycle, and (3) assess efficacy by testing the membranes’ antibacterial properties.

P.64  Nanoinformatics: Advances, Applications, and Assessing the Continuing Challenge of Uncertainty. Gernand JM*; Penn State University   jmgernand@psu.edu

Abstract: The field of nanoinformatics continues to progress in the development of databases, statistical analyses, and data mining algorithms for identifying important risk and functional aspects of engineered nanomaterials. The successful development of these tools and resources is critical for the safe and responsible development of this technology. Over the past several years, special discussions and meetings on this topic have focused on the needs and plans for developing nanoinformatics infrastructure and methodologies. Now the first steps are being made, with traditional and non-traditional applications to report along with their successes and limitations. This presentation highlights the advances, applications, and challenges of existing efforts in this area, which still tend to be relatively young, and shows how the continuing uncertainty revealed by nanoinformatics analysis can inform future developments and knowledge discovery. Quantitative structure-activity relationships (QSARs) and related models of nanomaterial toxicity have been successfully developed, but from isolated and independent perspectives that cannot yet be readily consolidated; the bridging and prospective unifying questions can now be defined, however. Several ambitious database projects to capture and serve accumulated and newly created data on nanomaterial risks and functions have been making progress, and these activities have identified the main impediments to full implementation while allowing the community to estimate the level of investment required to achieve the desired aims. This presentation also presents a case study on the analysis of uncertainty over time with respect to nanomaterial characterization and the quantification of outcomes, with a prospective estimate of when certain data milestones may be reached.

P.65  Socioeconomic impact analysis in critical infrastructure failure and hazardous site disasters. Iuliani L*, de Marcellis-Warin N, Galbraith J; École Polytechnique de Montréal   lucas.iuliani@polymtl.ca

Abstract: Among the most challenging and commonly identified gaps in knowledge related to enhancing critical infrastructure (CI) protection capabilities is the incomplete understanding of the interdependencies between infrastructures themselves. Over time, these networks have become increasingly interdependent at the physical, cyber, and human levels. Accurate modelling methods are therefore commonly viewed as a prerequisite to answering complex questions about the faithful measurement of damage propagation and of the cascading downstream socioeconomic impacts on populations resulting from the failure of these interconnected key assets. Similarly, the hardening of these networks is made all the more difficult by their size, so prioritizing the most critical nodes for protection becomes essential. The proposed work presents an overview of existing interdependency models focused on the resilience of economic arrangements to destructive events and discusses their adequacy when applied to case studies of recent destructive events such as the Lac-Mégantic rail disaster of 2013 and the 2003 electrical blackout of the East Coast. To address this question usefully, this study attempts to develop measures of vulnerability in economic systems and to select appropriate indicators, both social and economic, with which to quantitatively assess the socioeconomic impact of such destructive events. Among these are direct impact indicators such as loss of life, assets, stocks, and infrastructure, as well as indirect impacts such as GDP fluctuations, production declines, loss of housing, education, and health services, and ultimate recovery times.

P.66  A Stakeholder-Based Survey for Assessing the Viability of a Water Biofilter Concept in the Philippines. Santos JR*, Latayan JS, Pagsuyoin SA, Srija S; George Washington University   joost@gwu.edu

Abstract: Access to safe water continues to be a serious health concern across the globe, particularly in the rural areas of developing nations. Inadequate standards and practices for treating and storing drinking water have increased the incidence of illness and mortality due to waterborne diseases and contaminants. In this paper, we assess the viability of utilizing the seeds of the Moringa oleifera (MO) tree as a main component in the conceptualization of a portable water biofilter. This paper makes three major contributions. First, we reviewed the literature to discuss the risks posed by unsafe water, as well as various methods and practices for treating and disinfecting water sources. Second, we investigated the extent to which MO could be utilized for the water biofilter concept. Third, we developed a stakeholder survey and disseminated it to the residents of a rural municipality in the Philippines. Results of the survey will be used to assess community interest and to evaluate the overall viability of pursuing the MO biofilter concept.

P.67  Assessing Terrorist Threats for Energy Systems by Utilizing Historical Data and Expert Judgments. Sinka D*; ENCONET   davor.sinka@enconet.hr

Abstract: Energy systems are relatively attractive targets for terrorist attacks and should be adequately protected. Quantitative risk analysis (QRA), with threat assessment as one of its standard components, is considered a promising methodological framework for optimizing such protection. This paper presents the results of research consisting of three parts. In the first part, the terrorist target selection process was examined and the factors influencing it were identified. In the second part, a statistical analysis of terrorist attacks on energy systems was performed. The input data were derived from the Global Terrorism Database (GTD), covering the period from 1996 to 2010. The analysis determined the relative contributions of attacks on energy systems to the total number of attacks, and examined how those contributions are influenced by various factors related to the characteristics of the target, the terrorist group, and the environment in which the group operates. In the last part of the research, a method for quantitative threat assessment was developed. It is intended for energy system operators and is applicable to the electricity, gas, and oil sectors. The method is based on the results of the statistical analysis (i.e., on historical data) and on expert judgments, and it can be applied using publicly available data only. The method is realized through the implementation of Bayesian networks, which allows uncertainties to be taken into consideration and permits easy modification and updating.
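
The abstract does not spell out the network structure, but the underlying combination of a historical base rate with an expert judgment can be illustrated by a single-node Bayesian update; the numbers below are placeholders, not GTD-derived values:

```python
# Minimal Bayesian update combining a historical base rate with an
# expert-judgment likelihood ratio (a stand-in for one node of the
# Bayesian-network method described in the abstract).
def posterior_attack_prob(base_rate, likelihood_ratio):
    """P(attack | indicator) via the odds form of Bayes' rule."""
    prior_odds = base_rate / (1.0 - base_rate)
    post_odds = prior_odds * likelihood_ratio
    return post_odds / (1.0 + post_odds)

# Illustrative: a 0.5% historical base rate per facility-year, and an expert
# judging an observed indicator to be 8x more likely under an attack scenario
print(f"{posterior_attack_prob(0.005, 8.0):.4f}")
```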

P.68  Researching Engineering Causes in 2003 Boumerdes-Algiers (Algeria) Earthquake Disaster. Benouar D*; USTHB   dbenouar@gmail.com

Abstract: This paper attempts, as a case study, to investigate the engineering causes of the Boumerdes-Algiers (Algeria) earthquake disaster of 21 May 2003, which killed more than 2,278 people, injured more than 11,450, left 1,240 missing and 182,000 homeless, and destroyed or seriously damaged at least 200,000 housing units and about 6,000 public buildings in five wilayas (provinces). On Wednesday 21 May 2003, at 19h 44m 2s (18h 44m 2s UTC), a destructive earthquake occurred in the Boumerdes-Algiers region, affecting a rather densely populated and industrialized region of about 3,500,000 people. It is one of the strongest recorded seismic events in North Africa. The depth of the focus was about 10 km, and the magnitude of the earthquake was calculated at M = 6.8. The main shock lasted about 40 seconds and was followed by two large aftershocks (both reaching M 5.8, on 27 and 29 May 2003). Disasters are increasingly being understood as ‘processes’ rather than discrete ‘events’. Moreover, the causes of disasters are driven by complex engineering, socio-economic, socio-cultural, and various geophysical factors. Such interacting driving factors, occurring across a range of temporal and spatial scales, combine in numerous ways to configure disaster risks. Using some selected disasters in Algeria, the dynamics of such risks and their configurations will be explored using a new forensic-style approach. Indeed, it seems the more we have learned, the more we are losing. The approach is based on the idea that this situation arises because much current research is still informed by a focus on surface symptoms of observations and events rather than on the critical causes and processes of disaster risk construction and accumulation.

P.69  Triple Bottom Line Modeling of Green Storm Water Infrastructure – Step 1 Environmental Benefits. Weir MH*; Temple University   mark.weir@temple.edu

Abstract: Storm water is an often misunderstood and occasionally neglected portion of the complete water infrastructure system. Given the understanding of combined sewer overflows (CSOs) and other environmental impacts of current grey infrastructure, green infrastructure is an attractive option. Typically, the costs incurred to establish green storm water infrastructure require additional justification; added to this is the rather untested nature of green infrastructure for large urban water systems, making such justification more urgent. Drawing on the concepts of integrated water management, triple bottom line (TBL) modeling has been proposed as a means of assessing future community and environmental benefits. TBL modeling combines benefits from three fields: environmental, social, and economic. A dynamic TBL model is being developed for the city of Philadelphia, which is engaged in a large-scale green storm water infrastructure program. The first step, the environmental model, will be presented. In the environmental model, air pollutant concentration reductions based on green infrastructure type are coupled with a model relating air pollutant concentrations to respiratory diseases and acute conditions (e.g., asthma). In the longer term, the model will directly incorporate additional environmental benefits as well as social and economic benefits.

P.70  Building models and tools for national infrastructure flood risk assessment. Pant R*, Hall JW, Thacker S, Barr S, Alderson D, Lamb R; University of Oxford   raghav.pant@ouce.ox.ac.uk

Abstract: National infrastructures are vital to social and economic sustainability and wellbeing, and addressing critical infrastructure risks has become central to safety and security. Infrastructure networks consisting of critical nodes and edges span entire national landscapes or even traverse national borders, and the impacts of failures in such networks cascade easily from localised effects towards global scales. There is ample evidence to support this claim, as events such as hurricanes (Katrina 2005, Sandy 2012), tsunamis (Japan 2011), and floods (Great Britain 2013/2014) have caused widespread infrastructure disruptions. It is important to understand the risks due to extreme climate events if we are to plan for future infrastructure protection and sustainability. To address such issues, in this research we have developed a national infrastructure risk assessment methodology and tools that improve our understanding of interdependent infrastructure failures and answer important questions about damage propagation across infrastructures. The risk assessment tool includes: (i) spatially coherent probabilistic extreme weather events; (ii) interdependent national infrastructure networks; (iii) network demand models; and (iv) macroeconomic linkages. The outcomes of the framework include: (i) spatial estimates of key vulnerabilities in national infrastructure networks; and (ii) demographic and economic consequences of national infrastructure failures. The risk assessment framework is demonstrated through a flood risk analysis of interdependent national-scale networks for Great Britain. For an island like Great Britain, flooding risks are a major concern, and future climate projections indicate likely increases in sea level and in the frequency of extreme rainfall events. The analysis shows the importance of understanding interdependence among infrastructures, which leads to greater failure propagation.

P.71  Implementation of Soot Production Models For Fire Simulations in CFD Tools. Mariño OA*, Muñoz F; Universidad de los Andes   oa.marino2865@uniandes.edu.co

Abstract: To design a fire protection system, one option is a performance-based approach in which the fire dynamics must be simulated in order to obtain the radiative effects of the fire. Many variables influence the consequences for a person or a structure, but in particular the amount and concentration of soot produced can change the incident heat radiation and the propagation of the fire. To account for these effects, a soot model must be implemented in a field model, the most accurate class of fire models. The software of this kind most used by fire protection engineers is the Fire Dynamics Simulator (FDS), which does not include a soot production model. The aim of this project is to implement a soot model in FDS while maintaining the main features of the software: relatively low computational cost, simple chemistry models, and applicability at large scale. For this reason, global soot models were chosen for implementation in FDS V6, the latest release of the software. In a study case (a laminar co-flow flame), the simulated soot volume fractions are of the same order of magnitude as the experimental data; parameter adjustment is being performed to achieve a better fit.

P.72  Comparison of VOC Drinking Water Contaminant Levels in New Jersey to Regulatory and Human-Health Benchmarks . Williams PRD*; E Risk Sciences, LLP   pwilliams@erisksciences.com

Abstract: Potential threats to drinking water and water quality continue to be a major concern in many regions of the United States. In this analysis, the occurrence and detected concentrations of several volatile organic compounds (VOCs) in public water systems, private wells, and ambient groundwater wells in New Jersey were evaluated and compared to federal and state regulatory and human-health benchmarks. Analyses were based on three databases that contain water quality monitoring data for New Jersey: (1) Safe Drinking Water Information System (SDWIS), (2) Private Well Testing Act (PWTA), and (3) National Water Information System (NWIS). Primary maximum contaminant levels (MCLs), health-based MCLs, or other published human-health benchmarks were used. Tetrachloroethylene (PCE) was detected at a concentration at or above 0.44 ug/L, 1 ug/L, and 5 ug/L in 6.9%, 3.9%, and 1% of sampled public water systems, respectively. Trichloroethylene (TCE) was detected at a concentration at or above 1 ug/L and 5 ug/L in 3.6% and 1.1% of sampled public water systems, respectively. Benzene was detected at a concentration at or above 0.15 ug/L, 1 ug/L, and 5 ug/L in 3.1%, 0.9%, and 0.3% of sampled public water systems, respectively. MTBE was detected at a concentration at or above 10 ug/L, 20 ug/L, and 70 ug/L in 2%, 1.4%, and 0.3% of sampled systems, respectively. Mean detected concentrations of PCE, TCE, benzene, and MTBE were 1.3 ug/L, 1.5 ug/L, 1.1 ug/L, and 2.7 ug/L, respectively. The detection frequency of these same VOCs was notably lower in private wells, but higher in ambient groundwater wells. PCE, TCE, and benzene were also detected more often above corresponding regulatory or human-health benchmarks than MTBE due to their higher detected concentrations in water and/or greater toxicity values. These data are useful for evaluating historical or current contamination of water supplies in New Jersey and potential opportunities for public exposures and health risks.
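
The detection-frequency tabulations reported above amount to threshold exceedance counts over monitoring records. A minimal sketch with invented sample data, not the SDWIS/PWTA/NWIS records themselves:

```python
import pandas as pd

# Illustrative monitoring records: one detected concentration (ug/L) per system
samples = pd.DataFrame({"system": range(8),
                        "pce_ugL": [0.0, 0.6, 0.0, 1.2, 0.0, 5.3, 0.2, 0.0]})

# Fraction of systems at or above each benchmark, as tabulated in the abstract
for threshold in (0.44, 1.0, 5.0):
    frac = (samples["pce_ugL"] >= threshold).mean()
    print(f">= {threshold} ug/L: {frac:.1%} of sampled systems")
```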

P.73  State-Level Innovations in the Assessment of Drinking Water Contaminants of Emerging Concern. Greene CW*, Goeden HM; Minnesota Department of Health   christopher.greene@state.mn.us

Abstract: The Minnesota Department of Health (MDH), using funding derived from the Minnesota Clean Water, Land, and Legacy Amendment to the state Constitution, has developed an innovative program to identify, screen, and assess drinking water contaminants of emerging concern (CECs) and, when possible, develop health-based drinking water standards. In the state of Minnesota, approximately 1.4 million people, or 25 percent of the population, derive their drinking water from the Mississippi River. Like most rivers and streams that receive treated wastewater discharges, the Mississippi has been shown to contain measurable concentrations of certain pharmaceuticals, personal care product ingredients, plasticizers, and pesticides. To a lesser extent, CECs may also be found in municipal or private groundwater wells. New industrial, commercial, or consumer uses may also contribute to releases of CECs to drinking water sources. Many of these CECs are not required to be monitored under state or federal drinking water programs, but have been measured in source water by the U.S. Geological Survey (USGS), the Minnesota Pollution Control Agency (MPCA), and other parties. MDH’s program allows interested persons or organizations to nominate chemicals to be considered for review. MDH staff conduct screening assessments of toxicity and exposure potential, then select candidates for a more intensive review process aimed at developing health-based standards for multiple exposure durations. Through this program, MDH has developed enhanced capabilities for dealing with risk assessment issues that arise with CECs, including making decisions based on limited toxicological data and allocating potential risks among multiple exposure sources such as consumer products and diet. MDH’s experiences assessing CECs may be informative to state or local authorities concerned about risks from water reuse, wastewater impacts on surface water quality, or exposures to consumer products.

P.74  The Impact of Rodent Reflex Bradypnea on Human Health Risk Assessments of Inhaled Irritants. Whalan JE*, Pauluhn J; US Environmental Protection Agency   whalan.john@epa.gov

Abstract: Reflex bradypnea (RB) is a protective sensory reflex that allows rodents—but not humans—to markedly reduce their exposure to inhaled upper respiratory tract (URT) irritants such as aldehydes (e.g., formaldehyde), isocyanates, ammonia, and pyrethroids. When an irritant exposure above some biological threshold stimulates trigeminal nerves in the URT, rodents experience a rapid and sustained decrease in ventilation (as much as 90%), so they inhale a much lower chemical dose than if they were breathing normally. This bradypnea is accompanied by decreases in body temperature (as much as 14 °C), metabolic rate, heart rate, and activity, and by altered blood pH. These protective physiological effects may be misconstrued as adverse “systemic” outcomes. This poster demonstrates that a human health risk assessment that fails to account for a reduced inhaled rodent dose may be biased to understate the true human risk. It also demonstrates that behavioral and developmental effects due to RB may not be relevant to humans. For example, RB-induced hypothermia can impede learning and motor function tests (swim maze, rotarod, etc.). RB in pregnant dams can result in fetal hypothermia, impaired placental transfer of O2 (hypoxia) and CO2 (hypercapnia), developmental delays, and other effects that may be erroneously considered evidence of toxicity. The impact RB can have on human health risk assessments has not received the attention it deserves from toxicologists and risk assessors, largely because current testing guidelines do not require examination of RB-related endpoints. This analysis shows the major impact RB can have on the interpretation of findings, and it demonstrates why it may be necessary to adjust points of departure (PODs) in risk assessments of inhaled irritants to make them health protective for humans. The views expressed are those of the authors, and do not necessarily represent the views or policies of the U.S. EPA.

P.75  Estimation of Distribution of Chicken Meat Consumption in Canadian Populations. Nguyen LB*, Smith M; Health Canada   loan.nguyen@hc-sc.gc.ca

Abstract: In a quantitative microbial risk assessment, the distribution of serving sizes of a food of interest is one of the key elements of the exposure assessment. We discuss the data and methodology used to derive single-day chicken meat consumption for Canadian populations. It was derived from the 24-hour dietary recall component of the 2004 Canadian Community Health Survey, Cycle 2.2, Wave 3 (CCHS, 2004). CCHS 2004 is a multi-stage cross-sectional sample survey whose sampling population covers about 98% of Canadian individuals across all age groups. The variation in single-day consumption among individuals in a Dietary Reference Intake (DRI) age-sex group is described by a zero-inflated gamma distribution - combining the fraction of individuals who did not eat chicken on a randomly selected day with a gamma distribution describing how single-day consumption varies among individuals who did eat chicken on that day - with parameters estimated by maximum likelihood. A non-parametric bootstrap (bootstrapping survey design weights) was used to estimate the sampling distribution (multivariate normal) of the parameters of the fitted zero-inflated gamma distributions. The distributions of single-day consumption for the sub-populations of interest are constructed as mixtures, f(x) = a1·f1(x) + a2·f2(x) + ... + (1 − a1 − ... − an)·fn+1(x), of the fitted distributions of those sub-populations’ component DRI age-sex groups, with mixture weights a1, a2, ..., an taken from Canadian population census data. Results from the model were used to derive the distribution of chicken meat consumption on a random day for each sub-population of interest through Monte Carlo simulation.
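
A sketch of the mixture construction and Monte Carlo step described here, with invented parameters and census weights (the fitted CCHS values are not reproduced):

```python
import numpy as np

rng = np.random.default_rng(1)

def sample_zig(n, p_zero, shape, scale):
    """Zero-inflated gamma: with probability p_zero no chicken is eaten (0 g);
    otherwise the serving size is gamma distributed."""
    eats = rng.random(n) >= p_zero
    out = np.zeros(n)
    out[eats] = rng.gamma(shape, scale, size=eats.sum())
    return out

# Mixture over two hypothetical DRI age-sex groups with census weights
params = [(0.6, 2.0, 50.0), (0.5, 2.5, 60.0)]  # (p_zero, shape, scale) per group
weights = [0.45, 0.55]                          # census population fractions

n = 100_000
groups = rng.choice(len(params), size=n, p=weights)
draws = np.concatenate([sample_zig((groups == g).sum(), *params[g])
                        for g in range(len(params))])
print(f"mean single-day consumption: {draws.mean():.1f} g")
```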

P.76  Assessing the Health Risks of Gossypol in the Taiwanese population. Hsing HH*, Chuang YC, Wu C, Wu KY; Institute of Occupational Medicine and Industrial Hygiene, National Taiwan University   r03841029@ntu.edu.tw

Abstract: Gossypol is an anti-fertility agent found in cotton plants. Humans are likely to be exposed to gossypol through the consumption of cottonseed oil and edible tissues of livestock. However, no health-based guidance value has been established for gossypol, so it is important to quantify gossypol concentrations and assess the risk to the general population. We estimated the reference dose (RfD) using data from a large clinical trial in China in which 8,806 men were administered gossypol at 15 or 20 mg/day for various periods. BMDL10 values were calculated using the benchmark dose (BMD) method. The BMDL10 values were 0.20 and 0.1769 mg/kg/day; dividing each by an uncertainty factor of 10 for human variability yields ADIs of 0.020 and 0.0177 mg/kg/day, respectively. The RfDs yielded by the two methods are very similar, indicating that both are comparably precise estimates of the reference dose. Risks were characterized using the Hazard Index (HI) method: the Lifetime Average Daily Dose (LADD) was divided by the estimated ADI, and the resulting HI was compared to a value of 1. There are insufficient data on gossypol content in edible oil in Taiwan, so the LADD was estimated from edible oil consumption data in the National Food Consumption Database of Taiwan and the detection limit of the current analytical method for free-form gossypol (0.05 ppm). The HI for the general population was less than 1, indicating that gossypol concentrations in edible oil at or below the detection limit of the current analytical method pose no appreciable adverse health effects. Cottonseed meal can be used as livestock feed; however, a risk assessment of gossypol residues in edible tissues using Markov chain Monte Carlo (MCMC) simulation has not yet been conducted and will be addressed in a future study.
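
The BMDL-to-ADI and Hazard Index arithmetic above is simple enough to show directly. A minimal sketch follows, using the two BMDL10 values quoted in the abstract; the helper name is illustrative, not from the study.

```python
# BMDL10 values below are the ones quoted in the abstract.
UF_HUMAN = 10.0                          # intraspecies uncertainty factor

for bmdl10 in (0.20, 0.1769):            # mg/kg/day from the two BMD fits
    adi = bmdl10 / UF_HUMAN
    print(f"BMDL10 {bmdl10:.4f} -> ADI {adi:.4f} mg/kg/day")

def hazard_index(ladd, adi):
    """Hazard Index: lifetime average daily dose over the ADI.
    HI < 1 is read as no appreciable concern."""
    return ladd / adi
```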

P.77  Progress in High Throughput Exposure Assessment for Prioritizing Human Exposure to Environmental Chemicals. Setzer RW*, Wambaugh JF, Isaacs KK; US Environmental Protection Agency, Research Triangle Park, NC, USA   setzer.woodrow@epa.gov

Abstract: Thousands of chemicals in commerce have little or no information about exposure or health and ecological effects. The US Environmental Protection Agency (USEPA) has ongoing research programs to develop and evaluate models that use the often minimal chemical information available for rapidly assessing the potential for exposure. Two high-throughput exposure models (Mitchell et al., 2013) were evaluated for their ability to predict biomonitoring data from the National Health and Nutrition Examination Survey (NHANES) for about 100 chemicals found in urine samples. The two models were essentially unable to predict the exposures. A binary indicator of near-field exposure explained a significant fraction of the total variance in inferred exposure (Wambaugh et al., 2013). Further analysis refined the nature of the ‘near-field’ predictor. About half the variance in the inferred NHANES exposures can be explained using a regression-like model on indicators for use categories and an estimate of production volume (Wambaugh et al., 2014). This heuristic model can make exposure predictions for almost 8,000 chemicals of interest, but, since the predictions are based on regression modeling, they are subject to domain-of-applicability concerns. SHEDS-HT (Isaacs et al., 2014) is a probabilistic exposure model for chemicals with near-field and dietary exposure. While it can currently make predictions for only 39 chemicals of the NHANES evaluation dataset, it can explain about 40% of the variance of inferred exposure for those chemicals. This compares favorably with the empirical model, since no calibration against the evaluation data was involved in the SHEDS-HT fits. A critical feature of this model evaluation and development is the incorporation of data-derived prediction uncertainties, which allows their use for chemical prioritization and screening with explicit characterization of the associated uncertainties. This abstract does not necessarily reflect U.S. EPA policy.
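
As a rough illustration of what a "regression-like model on use-category indicators and production volume" can look like, here is a hypothetical sketch; the column names and data are invented, and this is not the EPA model or its dataset.

```python
# Hypothetical heuristic exposure regression: use-category indicator
# variables plus log production volume predicting log inferred exposure.
import pandas as pd
from sklearn.linear_model import LinearRegression

df = pd.DataFrame({
    "near_field": [1, 0, 1, 0],            # used in consumer products?
    "pesticide":  [0, 1, 0, 0],            # pesticidal use category
    "log10_production_volume": [5.2, 6.1, 4.3, 7.0],
    "log10_inferred_exposure": [-5.1, -6.0, -5.5, -6.8],  # from biomonitoring
})

X = df[["near_field", "pesticide", "log10_production_volume"]]
y = df["log10_inferred_exposure"]
model = LinearRegression().fit(X, y)
print("R^2 on training data:", model.score(X, y))
```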

P.81  Using a Toxicological Framework for Chemical Prioritization from Children’s Safe Product Act Data. Smith MN*, Faustman EM, Grice J; University of Washington   rissa8@u.washington.edu

Abstract: Thousands of chemicals are incorporated into consumer products and released into the environment without a consistent understanding of their effects in humans and often even less of an understanding about the potential cumulative effects of combined exposures. The Children’s Safe Product Act (CSPA) requires that manufacturers of children’s products sold in Washington State report if their products contain 66 chemicals that are designated to be of high concern in children. While this requirement has successfully generated an extensive database, information regarding the potential route of exposure and toxicity had not been integrated. In this project we create an integrating framework for a toxicological interpretation of the most frequently reported CSPA chemicals. The framework scores lifestage, exposure duration, exposure routes, toxicokinetic factors, chemical properties, toxicity and potency assessed by international (EU REACH, GHS, IARC) and national (EPA IRIS) databases. Scores are integrated using a decision framework and a multi-attribute utility function to calculate priority scores. Among the ten most frequently reported chemical groups, the priority scores were highest, on average, for formaldehyde, methyl ethyl ketone and styrene. The framework has allowed us to examine these chemicals and exposure pathways in a lifestage-specific manner that will guide Washington in taking action on the most immediate potential concerns for children’s health. As more states are beginning to enact similar children’s product legislation, the applications of this framework will expand. This work was supported by Washington State Department of Ecology and the University of Washington Center for Child Environmental Health Risks Research (NIEHS 5 P01 ES009601 and EPA RD 834514).
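
As an illustration of combining attribute scores through a multi-attribute utility function, here is a hypothetical additive-weighting sketch; the attributes, weights, and scores are invented and do not reproduce the CSPA framework's actual scoring rules.

```python
# Additive multi-attribute utility: weighted sum of attribute scores,
# each pre-normalized to [0, 1]. Weights are illustrative and sum to 1.
WEIGHTS = {"lifestage": 0.2, "exposure_route": 0.2, "toxicokinetics": 0.15,
           "chemical_properties": 0.15, "toxicity": 0.2, "potency": 0.1}

def priority_score(scores):
    """Priority score for one chemical from its attribute scores."""
    assert set(scores) == set(WEIGHTS)
    return sum(WEIGHTS[k] * scores[k] for k in WEIGHTS)

# Invented example scores for a frequently reported chemical.
formaldehyde = {"lifestage": 0.9, "exposure_route": 0.8, "toxicokinetics": 0.6,
                "chemical_properties": 0.5, "toxicity": 0.9, "potency": 0.8}
print(f"priority score: {priority_score(formaldehyde):.2f}")
```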

P.82  The risk assessment of pesticide residues in vegetables and fruits in Taiwan: Carbofuran, Chlorothalonil, Dimethoate, Methamidophos, Terbufos. Chen Y.J.*, Chen Y.H., Wu C., Wu K.Y.; Institute of Occupational Medicine and Industrial Hygiene, National Taiwan University   b99605044@ntu.edu.tw

Abstract: The objective of this study was to evaluate the theoretical maximum daily intake (TMDI), estimated lifetime average daily dose (LADD), and acceptable daily intake (ADI) of residues of carbofuran, chlorothalonil, dimethoate, methamidophos, and terbufos in vegetables and fruits in Taiwan, using food intake and residue data, and to compare intakes with the ADIs in order to estimate the health risk from pesticide exposure. Existing ADIs for these pesticides were derived from no-observed-adverse-effect levels (NOAELs), owing to the lack of methods to simulate actual exposure scenarios; we instead used the benchmark dose (BMD) method to calculate reference doses (RfDs) to replace the NOAELs. In this study, the LADD was estimated with a Bayesian model via Markov chain Monte Carlo (MCMC) simulation, the TMDI was calculated according to the newest maximum residue limit (MRL) rules in Taiwan, and ADIs were calculated by benchmark dose from existing animal studies. The study reveals that the TMDIs of carbofuran, chlorothalonil, dimethoate, methamidophos, and terbufos were 0.0424, 0.0766, 0.0083, 0.007242, and 0.000377 mg/kg/day, and the ADIs of carbofuran, dimethoate, methamidophos, and terbufos were 2.06×10^-3, 2.15×10^-2, 3.87×10^-3, and 1.1×10^-3 mg/kg/day, respectively. The TMDI exceeded the ADI for all five pesticides. The LADDs of the five pesticides were 3.26×10^-4, 3.50×10^-4, 6.71×10^-5, 3.48×10^-5, and 3.20×10^-5 mg/kg/day, respectively, and the percent ratios of LADD to ADI were 15.8, 2.33, 0.31, 0.90, and 0.029, respectively. All LADDs were below the ADIs while the TMDIs were not, so the MRLs of the five pesticides should undergo revision. Results show that carbofuran, dimethoate, methamidophos, and terbufos accumulating in the human body could cause neurotoxicity and reproductive toxicity in humans, and that the carcinogen chlorothalonil, the most harmful and most widely used of these pesticides, may cause malignant tumors in humans. This study therefore informs the government's revision of improper MRLs in agriculture and efforts to decrease pesticide abuse.

P.83  Comparison of Bayesian and Frequentist Inference in Probabilistic Exposure Assessment of Dietary Intake from Pesticide Residues Survey with Left-Censored Data. Chuang YC*, Wu KY; National Taiwan University   mcyc1979@gmail.com

Abstract: Pesticide residue monitoring surveys can provide valuable information on the incidence and levels of pesticide residues in vegetables and fruits for conducting a dietary exposure assessment. However, the dataset from a national survey of pesticide residues is not only censored at the detection limit of the analytical methods ("non-detects", NDs) but also truncated by the maximum residue level. When more than 80% of a sample is censored, the EFSA report recommends collecting additional data from similar food categories to obtain larger sample sizes rather than performing a probabilistic exposure assessment. Therefore, a study on statistical estimation of dietary intake from highly censored datasets was implemented with Bayesian and frequentist inference. In the Bayesian approach, a non-informative prior is used, and the probability of a positive concentration below the limit of detection (LOD) is modeled with a binomial distribution to obtain the mean residue level through Markov chain Monte Carlo (MCMC) iteration. In contrast, in the frequentist approach the residue probability density function was fit by maximum-likelihood estimation to the censored data, with the NDs substituted by LOD/2 in the traditional manner. Consumption data were selected for vegetables and fruits that have been found to contain residues of one or more of the pesticides of interest. The dietary intake of pesticides from vegetables and fruits was obtained after 50,000 simulation iterations with the frequentist and Bayesian inferences; the mean dietary intakes were 1.71×10-4 and 2.35×10-3 mg/kg/day, respectively. The results show that the posterior distribution of dietary intake obtained by Bayesian inference converges to a representative distribution even for highly censored data, so the quality of a probabilistic exposure assessment may be improved, with reduced uncertainty, compared with the traditional frequentist approach.
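
The contrast between LOD/2 substitution and a likelihood-based treatment of non-detects can be sketched compactly. The example below assumes a lognormal residue distribution (the abstract does not name one) and fits it by censored maximum likelihood; a Bayesian version would instead place priors on mu and sigma and sample them by MCMC. All values are synthetic.

```python
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(0)
LOD = 0.05
true = rng.lognormal(mean=-4.0, sigma=1.0, size=500)   # mostly below LOD
detects = true[true >= LOD]
n_nd = (true < LOD).sum()                              # heavy censoring

# Traditional approach: substitute LOD/2 for every non-detect.
substituted = np.concatenate([detects, np.full(n_nd, LOD / 2)])
print("LOD/2 mean:", substituted.mean())

# Censored MLE: each non-detect contributes P(X < LOD) to the likelihood.
# (Dropping the constant Jacobian term for detects does not move the optimum.)
def neg_log_lik(params):
    mu, log_sigma = params
    sigma = np.exp(log_sigma)              # keeps sigma positive
    ll = stats.norm.logpdf(np.log(detects), mu, sigma).sum()
    ll += n_nd * stats.norm.logcdf(np.log(LOD), mu, sigma)
    return -ll

res = optimize.minimize(neg_log_lik, x0=[np.log(detects).mean(), 0.0],
                        method="Nelder-Mead")
mu_hat, sigma_hat = res.x[0], np.exp(res.x[1])
print("censored-MLE mean:", np.exp(mu_hat + sigma_hat**2 / 2))
```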

P.84  Oral Bioaccessibility of Nickel and Cobalt from Metal Alloy Emissions in Soil and Dust. Verwiel AH*, Proctor DP; ToxStrategies, Inc.   averwiel@toxstrategies.com

Abstract: Bioavailability is the extent to which bioaccessible metals are systemically absorbed. Measuring the bioaccessible fraction of metals in vitro provides a baseline for understanding bioavailability and for estimating more appropriate remedial action levels for hazardous waste sites. EPA has developed estimates of oral bioaccessibility and relative bioavailability for lead and arsenic in soil and mine tailings to more accurately estimate potential human health risks. We evaluated the oral bioaccessibility of nickel and cobalt in soil and dust affected by deposition of metal alloys from industrial operations. The noncancer toxicity criterion for cobalt is based on soluble cobalt sulfate administered as a dietary supplement, and the nickel criterion is based on water-soluble nickel sulfate hexahydrate. Both of these forms are more bioaccessible than cobalt and nickel in soil and dust affected by metal alloys, which are engineered to resist corrosion. Bioaccessibility testing was performed by Ohio State University using in vitro gastrointestinal (IVG) methods. Five background soil samples, along with 6 soil samples and 5 dust samples from areas affected by emissions, were analyzed using IVG. Bioaccessibility in all but one background sample was not measurable; nickel bioaccessibility was 12.4% in the one sample with a measurable level. For affected soil samples, only nickel was measurable in the IVG, and bioaccessibility ranged from 1.36% to 5.26% in the gastric and intestinal extraction phases. For dust samples, cobalt and nickel bioaccessibility was similar, ranging from 0.98% to 4.41% in gastric and intestinal extractions. The soil and dust samples with the highest concentrations of nickel and cobalt had the lowest bioaccessibility (<1.75%), supporting the expectation that nickel and cobalt in soil and dust affected by metal alloy emissions have very limited relative oral bioavailability.

P.85  Chemical Risk Analysis and Management in King Saud University Laboratories and Stores, Riyadh, Saudi Arabia: A Case Study. Shereif M*; Associate Professor, Dept. of Chemistry, College of Science, King Saud University   mshereif@ksu.edu.sa

Abstract: Attention to chemical risk analysis and management came to the forefront at King Saud University, Riyadh, Saudi Arabia after hosting experts from Sandia National Laboratories (CSP), CRDF, and USEPA. This initial involvement led to the participation of our faculty members and chemical laboratory and store personnel in three separate international symposia and workshops from 2008 to 2015, conducted by the Chemical Pollution Protection Committee (CPPC) at King Saud University. Valuable insights from the training were translated into practical applications appropriate for chemical risk analysis and management in our chemical laboratories. The workshop and symposia topics included chemical storage, fire safety, personal protective equipment, and emergency procedures and preparedness, among others. Practical activities covered the evaluation of hoods, conducting a safety audit, proper gloving techniques, and managing dry and wet spills. "Just-in-time" acquisition of chemicals is close to impossible in our institution, which is probably also the case for most schools and universities in developing countries. While common acids and solvents are readily available from local distributors, specialty reagents take at least three months to import. As a consequence, some schools purchase more than necessary and overstock reagents, taking up scant storage space. Some researchers also fail to account for their used reagents, and these "orphan" chemicals often remain on the shelves or in the freezer. Aggravating the problem, the institution is often left to properly dispose of these chemicals. The CPPC contracted with a specialized Saudi company to take responsibility for the final disposal of more than 150 tons of hazardous chemical wastes from the University's chemical laboratories and stores.

P.86  Assessment Of The Explosion Characteristics Of Dust Clouds: Standards Versus Reality. Vizcaya DM, Amín M*, Pinilla A, Muñoz F; Universidad de los Andes   dm.vizcaya62@uniandes.edu.co

Abstract: Dust explosions are events that may occur when a combustible dust is dispersed near an ignition source in a confined facility, which makes them a matter of central interest in process safety. The generation of a dust cloud thus represents a hazard for industries that store or handle this type of combustible material. Accordingly, some organic, plastic, and metal powders that are usually considered non-hazardous materials have led to several incidents associated with explosive atmospheres. To characterize flammable dusts, standard tests have been developed on different equipment. These tests allow the characterization of parameters associated with the material, such as the maximum pressure (Pmax), the maximum rate of pressure rise ((dP/dt)max), the minimum explosive concentration (MEC), and the deflagration index (Kst). Nevertheless, over several years questions have been raised regarding the evaluation of some of these parameters, creating a need to further analyze the standard test conditions in order to improve the quality of the results. Hence, this paper gathers the evidence on this subject made available by the scientific community to date. On this basis, it is possible to establish the main shortcomings of the standard tests and contribute to their improvement. Among the key results gathered, it was found that within the 20 L sphere test there is a reduction of particle size due to fragmentation phenomena. This implies that values such as the MEC and Kst calculated during the test correspond to conditions different from those intended. Moreover, the Kst does not take into account phenomena such as turbulence, combustion velocity, and flame area, which is why this parameter is hardly scalable to industrial scenarios, as it accounts only for the facility volume.
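
For context, the deflagration index named above is conventionally obtained from the cube-root law, Kst = (dP/dt)max * V^(1/3), which normalizes the measured rate of pressure rise by vessel volume. A small sketch follows; the numerical values are illustrative, not from the study.

```python
# Cube-root law relating the measured maximum rate of pressure rise in a
# test vessel to the volume-normalized deflagration index K_St.
def k_st(dp_dt_max_bar_per_s, vessel_volume_m3):
    """K_St in bar*m/s from (dP/dt)max in bar/s and vessel volume in m^3."""
    return dp_dt_max_bar_per_s * vessel_volume_m3 ** (1.0 / 3.0)

# Example: a 20 L sphere (0.020 m^3) reading of 740 bar/s gives
# K_St ~ 740 * 0.271 ~ 201 bar*m/s (dust explosion class St 2).
print(f"K_St = {k_st(740.0, 0.020):.0f} bar*m/s")
```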

P.89  The Concept of Unacceptable Risk in EPA Regulatory Policies. Farber G*; US EPA   farber.glenn@epa.gov

Abstract: Regulatory programs under a variety of environmental statutes use threshold levels of hazard and exposure to set threshold levels of risk. Employing these levels for regulatory or cleanup decisions carries the implication that lower levels are “acceptable”, or at least not appropriate for action. This session will examine the trigger thresholds in various EPA regulatory programs, comparing the bases for establishing the criteria. How does the concept of acceptable risk fit into these policies, and what are the implications?

P.90  Discovery of thresholds of nursing accidents by analysis of open data. Maeda Y*, Marui R; Shizuoka University, Japan Post Insurance Systems Solutions   maeda.yasunobu@shizuoka.ac.jp

Abstract: This paper reports that keywords marking thresholds between accidents and near-misses in hospital nursing work were obtained from open data of accident/near-miss case reports using text mining techniques. The open data are collected through the Project to Collect Medical Near-Miss/Adverse Event Information, operated by the Japan Council for Quality Health Care. In this research, 15,096 accident/near-miss reports were analyzed. First, words in the case reports were extracted through a combination of morphological analysis and the TF-IDF indicator. Then, decision trees discriminating accidents from near-misses were obtained with the C4.5 algorithm. In the decision trees, the keywords 'falling', 'removing', 'toilet', 'walk', 'internal use', and 'day shift' were found at the thresholds between accidents and near-misses. This result suggests that nursing accidents are not merely worse cases than near-misses, but have different characteristics from them.
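
A toy version of this pipeline can be sketched with scikit-learn. The reports below are invented English stand-ins for the Japanese case texts, whitespace tokenization stands in for morphological analysis, and sklearn's CART-based tree stands in for C4.5.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.tree import DecisionTreeClassifier

reports = [
    "patient falling while walking to toilet unassisted",
    "removing of infusion line noticed before any harm",
    "internal use medicine mix-up during day shift",
    "requested walk assistance in time no injury",
]
labels = [1, 0, 1, 0]          # 1 = accident, 0 = near-miss

vec = TfidfVectorizer()
X = vec.fit_transform(reports)                 # TF-IDF term weights
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, labels)
print(tree.predict(vec.transform(["patient falling near the toilet"])))
```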

P.91  Why is Qualitative Research so Important for Risk Analysis in Latin America? Padlog, M. PM*; University of Guadalajara   mpadlog@cencar.udg.mx

Abstract: SRA-LA was born in 2008 with a hallmark of diversity, as professionals and researchers from different countries gathered to create a group intended to study, make proposals about, and find solutions to the multiple hazards that the continent presents to its millions of inhabitants. From the start, the members representing the several Latin American countries in the newborn society focused on attending to the needs of local communities, which are in no way similar to one another. The extension of their territories; the particular composition of their peoples, with specific cultural backgrounds shaped by their pre-Columbian past, colonization processes, and immigration flows; and their different levels of education, industrialization, social and political organization, technology availability, and economic insertion in global processes represent a wide range of hazards and risks, rising to the disaster level according to the vulnerability to which each community is exposed. Qualitative research is a methodology capable of opening the field to interdisciplinary work and of maximizing the results of research across such different cultures. The method itself provides specific tips, clues, and suggestions for addressing not just regions or countries but particular communities, for which general policies of disaster prevention and mitigation need to be adjusted to suit their idiosyncrasies, enrich their response capability, and reinforce their resilience processes. Qualitative research findings contribute to developing dynamic models for investigating and preventing risks, capable of adjusting to the local differences of each case study for mitigation in emergencies. Focusing on diversity may save effort and resources in risk management, safety policies, and the construction of secure environments.

P.92  U.S. EPA Provisional Peer-Reviewed Toxicity Value and Community Site Specific and Regulatory Support Program. Shannon T*, Gatchett A, Zhao QJ, Kaiser JP, Phillips L, Woodall G; U.S. Environmental Protection Agency, National Center for Environmental Assessment   Gatchett.Annette@epa.gov

Abstract: The Superfund Technical Support program is responsible for providing general regulatory support to the Office of Solid Waste and Emergency Response (OSWER) and the development of Provisional Peer-Reviewed Toxicity Value (PPRTV) assessments. The Program is devoted to responding to emerging, often crisis-level, chemical/substance issues with sound science that allows for quick action and effective solutions. Scientists conducting work in this program are also working on methods to support decision-making at contaminated waste sites, developing tools to help understand community risk, or providing rapid responses. Research in this program focuses on two main areas: 1) PPRTV assessments, and 2) site-specific and Superfund/regulatory support. PPRTVs are toxicity values derived for use in the U.S. EPA’s Superfund program when such toxicity values are not available in the IRIS database. PPRTVs are used by the Superfund program and regional decision-makers when making site-specific clean-up decisions. The second research area provides scientific expert consultations and technical support to OSWER and the regions regarding requests for both human and ecological risk assessment. Rapid risk assessment support to emergent situations and targeted evaluations of complex problems of Agency priority (e.g., Gulf Oil spill) is regularly required. The implications of the regulatory decisions include improvements in human health in the vicinity of Superfund sites, reduction or reversal of damages to natural resources, reduction of harm in emergency situations, improved economic conditions and quality of life, improved environmental practices by industry, and advances in science and technology. This poster provides an overview and highlights of the proposed research over the next five years. The views expressed in this abstract are those of the authors and do not necessarily reflect the views and policies of the U.S. EPA. Mention of trade names or commercial products does not constitute endorsement or recommendation for use.

P.93  From Risk Communication to Risk Culture: Challenges for New Approaches. Okamura C*, Lolive J; CETESB - Environmental Company of São Paulo State   cintiaokamura@hotmail.com

Abstract: São Paulo is the anchor city of the São Paulo metropolitan area, ranked as the most populous in Brazil and the second most populous in the Americas. This region concentrates many industrial activities, crowded roads and highways, a high density of residential buildings, houses, and slums, and many contaminated areas (4,771 are currently listed in the State of São Paulo, among them 1,685 in the city of São Paulo). Its inhabitants are thus exposed to a unique combination of risks. CETESB, the governmental agency in charge of risk management, has acquired a strong capability for monitoring, hazard analysis, and regulation concerning the biophysical component of risk. Knowledge of the anthropic component is still missing for developing better instruments. In particular, its top-down risk communication is showing its limitations in the face of a loss of trust, the growing concern of populations in risk areas, and conflicts between the stakeholders involved. The research we present is in progress. It is based on a Franco-Brazilian cooperation (National Centre for Scientific Research of France, CETESB, and the Faculty of Public Health of the University of São Paulo). Coordinated by CETESB, it proposes to test relevant methods (controversy analysis, artistic simulation of disaster, risk ambiances, a forum of contaminated areas) to analyze, reveal, stimulate, and promote the experience of the exposed populations of risk areas of São Paulo and its region, in order to develop an inclusive and participatory risk culture. The research focuses on two pilot sites chosen by the research team with members of CETESB: a residential complex in a heavily contaminated area, and the city of São Sebastião, which contains the largest Latin American oil terminal. The knowledge generated by this research will improve risk management policy. It will be translated into standards for action to develop an awareness, communication, and participation protocol, which will be implemented by CETESB.

P.94  Poker, Beer, and Zombies: The Application of Adult Learning Theory to Teach Risk Management to Undergraduates. Spicer KE*; Murray State University   kspicer@murraystate.edu

Abstract: Teaching the concepts of Risk Management, including Risk Identification, Risk Assessment/Analysis, the development of Risk Control strategies, and Risk Communication, to undergraduate Occupational Safety students presents unique challenges. Questions to consider include: (1) What do the students already know about hazard identification and probability? (2) To be effective in their careers as safety professionals, what do they need to know about risk management? and (3) What are the most effective methods to develop the students’ understanding of this material and their ability to apply these concepts in the “real world”? Sometimes educators in higher education forget that university students are adults, and there exists a plethora of adult learning theory principles that can aid in the effective education of adult students. This study seeks to identify the effectiveness of applying the following adult learning principles: the Principle of Readiness, the Principle of Association, the Principle of Involvement, the Principle of Repetition, and the Principle of Reinforcement. These principles were applied within the problem-based learning method of teaching; problems included a hands-on risk analysis and a disaster preparedness communication project known affectionately as “The Zombie Project.” Using topics that are of interest to college students, including poker, beer, and zombies, while applying the listed principles of adult learning theory resulted in a high level of student understanding, based on student performance on the completed projects.

P.95  U.S. EPA Human Health Research on Community and Site-specific Risk Program. Gatchett A*, Wright JM, Segal D, Shannon T; U.S. Environmental Protection Agency, National Center for Environmental Assessment, Cincinnati, OH   Gatchett.Annette@epa.gov

Abstract: The purpose of the U.S. EPA’s Human Health Risk Assessment (HHRA) program in the Office of Research and Development is to develop and apply state-of-the-science risk assessment methods to estimate health and environmental risks from exposures to chemical and non-chemical stressors including various mixture combinations. More specifically, the community and site-specific risk emphasis is to provide rapid response assessments and cumulative risk methods to address emergency response, Superfund site assessment, sustainability characterization, and community concerns. Communities today are faced with an urgent need for coordinated assistance to assess and address issues of chemical and other environmental contamination. The U.S. EPA’s HHRA program is frequently called upon to quickly assist in these situations, often in the face of large scientific uncertainties due to data gaps. Specific work under this topic includes quick turn-around exposure and risk assessments, technical support on human health or ecological risks to support different Superfund sites or regional concerns, the development of Provisional Peer-Reviewed Toxicity Value (PPRTV) assessments, and the development of methods and tools for conducting cumulative impact and risk assessments. Taken together, this work helps ensure that the U.S. EPA’s programs and regions have the tools and information they need to make decisions and address community concerns. The Community and Site Specific research plan is uniquely positioned to support risk management decisions and regulatory needs of various stakeholders, including Agency program and regional offices as well as state/tribal environmental protection programs and interested communities. This poster provides an overview and highlights of the proposed research over the next five years. The views expressed in this abstract are those of the authors and do not necessarily reflect the views and policies of the U.S. EPA. Mention of trade names or commercial products does not constitute endorsement or recommendation for use.

P.96  Application of Mental Modeling Technology™ with Synthetic Interviews™ to Support Stakeholder Engagement through Artificial Intelligence Products. Butte G, Kovacs D*, Ketchum C, Pribanic V, Thorne S; Decision Partners; MedRespond   dkovacs@decisionpartners.com

Abstract: Interactive Decision Support Technology (IDST) is a ground-breaking integration of Decision Partners' Mental Modeling Technology™ with MedRespond's Synthetic Interview™ artificial intelligence and online communication products. The IDST solution has been designed to improve stakeholder judgment and decision making through realistic dialogue with virtual experts that are available 24x7x365. The virtual experts effectively engage, inform, and motivate stakeholders (patients, doctors, employees, customers, and others) on the topic at hand because the virtual experts “Understand” stakeholder motivation and decision behaviors; “Respond” to stakeholder questions via immersive conversational videos; and “Remember” individual stakeholders over time, in order to “Follow” changes in stakeholder information needs, priorities, and thinking. The virtual dialogue is informed by traditional Mental Models research, which elicits in-depth understanding of stakeholder perceptions, interests, priorities, and informational needs, and it extends Mental Models research results by providing ongoing insight into changes in stakeholders' Mental Models (both individually and collectively) over time. With a virtual video expert at the core of “conversations” around topics such as health and well-being or socio-technical issues such as energy, one-way, one-size-fits-all online communications are converted into two-way, personalized virtual dialogues, providing stakeholders with a powerful online communication experience that better prepares them to make well-informed decisions. IDST provides the enterprise or sponsor organization with dynamic insights into stakeholder beliefs, attitudes, and perspectives on the topic at hand, enabling focused communication investments that inform and influence decision making and behavior in a manner that is predictable, rapid, scalable, and cost-effective. In this presentation we present background on the IDST approach and case examples.

P.97  4-N-Nitrosomethylamino-1-(3-pyridyl)-1-butanone (NNK) and N-Nitrosonornicotine (NNN): Risk Assessment of Two Tobacco-Specific Nitrosamines (TSNAs). Fiebelkorn SA*, Cunningham FH, Dillon D, Meredith C; British American Tobacco, Group Research and Development, Southampton, SO15 8TL, UK   stacy_fiebelkorn@bat.com

Abstract: Tobacco smoke contains over 6,000 constituents, at least 150 of which have established toxicological properties. Both NNK and NNN have been proposed by the WHO Study Group on Tobacco Product Regulation for mandatory lowering. To further understand this prioritisation, we have employed margin of exposure (MOE) calculations, drafted postulated modes of action (MOA), conducted a series of in vitro tests to confirm the MOA, and investigated the parameters for constructing a physiologically-based pharmacokinetic (PBPK) model. MOE calculations for NNK, using 8 data sets with varying routes of administration and duration, generated values ranging between 278 and 89,543. MOE calculations for NNN, using 7 data sets with varying routes of administration and duration, generated values ranging between 2,758 and 255,652. These ranges fall both above and below the critical MOE value of 10,000, creating ambiguity as to the priority of NNK and NNN. However, comparing the lowest MOE for NNK versus NNN suggests a difference between the two compounds of approximately 10-fold. The key events in our postulated MOA for both NNK and NNN involve metabolic activation, DNA damage, and genotoxicity (mutation) leading to cell proliferation and tumours. We compared the in vitro genotoxicity of NNK and NNN using the Ames test and the mouse lymphoma assay (MLA). NNK induced dose-dependent mutations in two Ames strains in the presence of S9 when tested up to 5000 µg/plate, whereas NNN gave no conclusive results even when tested up to 30,000 µg/plate. NNK induced mutation in L5178Y cells (MLA) following 3-hour treatment with S9 when tested up to 10 mM; however, NNN did not induce mutations with or without S9 when tested up to 10 mM. These data suggest a difference in the potency of NNK and NNN, and we propose a further step of constructing a PBPK model to contextualize these data against the effective tissue doses experienced by smokers.
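
The MOE computation itself is a simple ratio of a point of departure to estimated exposure, judged against the critical value of 10,000 noted above. A minimal sketch follows; the numerical values are illustrative, not the study's data sets.

```python
def margin_of_exposure(pod_mg_kg_day, exposure_mg_kg_day):
    """MOE = point of departure / estimated human exposure; values at or
    above 10,000 are conventionally treated as lower priority."""
    return pod_mg_kg_day / exposure_mg_kg_day

# Illustrative values only.
moe = margin_of_exposure(pod_mg_kg_day=1.0, exposure_mg_kg_day=2.0e-4)
print(f"MOE = {moe:,.0f}")   # 5,000: below 10,000, hence higher priority
```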

P.98  Indoor Environmental and Air Quality Characteristics, Prior Health Conditions, and Building-Related Symptoms. Lukcso D, Guidotti TL*, Franklin DE, Burt A; Medical Advisory Services and Building Health Sciences   tee@teeguidotti.com

Abstract: We investigated environmental conditions by many modalities in 71 discrete areas of 12 buildings in a government building complex that had experienced persistent occupant complaints despite correction of deficiencies following a prior survey. An online health survey was completed by 7,637 building occupants (49% response rate). Building environmental measures were within current standards and guidelines. Four environmental factors were consistently associated with group-level building-related health complaints: physical comfort/discomfort, odor, job stress, and glare. Several other factors were frequently commented on by participants, including cleanliness, renovation and construction activity, and noise. Low relative humidity was significantly associated with lower respiratory and “sick building syndrome”-type symptoms. No other environmental conditions (including formaldehyde, PM10, or mold levels, which were tested by seven parameters) correlated directly with individual health symptoms. Indicators of atopy or allergy (sinusitis, allergies, and asthma), when present singly, in combinations of two conditions, or together, were hierarchically associated with increased absence, increased presenteeism (presence at work but at reduced capacity), and an increase in reported symptom-days, including symptoms not related to respiratory disease. We find that in buildings without unusual hazards and with environmental and air quality indicators within the range of acceptable indoor air quality standards, there is an identifiable population of occupants with a high prevalence of asthma and allergic disease who disproportionately report discomfort and lost productivity due to symptoms. These outcome indicators are more closely associated with host factors than with environmental conditions. An occupant-centered medical evaluation should guide environmental investigations, especially when the building and its work areas are within regulatory standards and industry guidelines.

P.99  What are the best control strategies for an equine influenza outbreak? Cogger N*, Rosanowski S; Massey University   n.cogger@massey.ac.nz

Abstract: Equine influenza (EI) is a highly infectious respiratory disease of horses that is not present in New Zealand (NZ). Given that the equine population in NZ is naïve to the virus, it has the potential to spread rapidly. Consequently, government and industry must respond rapidly if they are to eradicate the disease. To ensure a rapid response, decision making around control strategies should occur in peacetime, and infectious disease modelling can provide useful information to inform such decisions. This abstract presents the results of infectious disease modelling of an EI outbreak in NZ when movement restriction was the only control strategy and when movement controls were combined with vaccination that was suppressive, protective, or targeted to high-value animals. The modelling was done using InterSpread Plus, a spatially explicit SIR model. The model requires information about: (1) the location of all properties with horses and racetracks; (2) the frequency and distance of movement of animals, humans, and fomites between properties and to and from racetracks; (3) the incubation period and virus production in horses; and (4) local-spread and airborne-spread probabilities. The results showed that an EI outbreak in NZ would be widespread and require a six-month nationwide standstill. Vaccination used in conjunction with a nationwide standstill did significantly reduce the size of the outbreak. However, the reduction was not substantial, and the number of properties vaccinated, especially under the strategy of vaccinating all properties within a 7-10 km band around known infected properties (IPs), was substantial. Consequently, the decision whether or not to use vaccination is not clear cut and could benefit from an economic assessment of the costs and benefits.
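
For intuition about why reducing contact (e.g., movement restrictions) shrinks an outbreak, a deterministic SIR toy model is sketched below. InterSpread Plus itself is a stochastic, spatially explicit simulator that this sketch does not attempt to reproduce, and all parameter values are invented.

```python
def sir(beta, gamma, s0, i0, days, dt=0.1):
    """Integrate dS/dt = -beta*S*I, dI/dt = beta*S*I - gamma*I
    (S, I as population proportions) with forward Euler steps."""
    s, i = s0, i0
    for _ in range(int(days / dt)):
        new_inf = beta * s * i * dt
        s, i = s - new_inf, i + new_inf - gamma * i * dt
    return s, i

# Movement restrictions are mimicked here as a lower contact rate beta.
for label, beta in [("no controls", 0.6), ("movement standstill", 0.25)]:
    s_end, _ = sir(beta, gamma=0.2, s0=0.999, i0=0.001, days=180)
    print(f"{label}: final fraction ever infected = {1 - s_end:.2f}")
```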

P.100  Use of a Quantitative Microbial Risk Assessment Model to Estimate Exposure to Campylobacter from Consumption of Chicken in the United States. Kang D*, Eifert J; Virginia Tech   dkang@vt.edu

Abstract: The current paradigm in food-safety policy has progressively shifted from hazard- to risk-based approaches, and risk analyses have become more integrated in developing modern food safety systems. Although Campylobacter spp. has been recognized as a major causative agent of foodborne illness in the U.S., it is still overshadowed by higher-profile pathogens. Chicken consumption provides a source of relatively inexpensive animal protein in consumer diets but is also considered a primary food vehicle for Campylobacter-related illnesses. Using risk analysis, the annual health and societal impact of Campylobacter-associated chicken consumption in the U.S. can be measured and updated. Quantitative microbial risk assessment models using Monte Carlo simulations were developed to gain insight into the true prevalence and concentration of Campylobacter spp. at various stages of chicken transport, processing, retail, and consumer storage and handling. The pathogen levels were fit into a dose-response model that yielded estimates of the annual health impacts of chicken product consumption. Along with a baseline model, the application of intervention steps with various reductive capacities along these stages was assessed. Health-impact estimates were separated by predicted illnesses of various levels caused by consumer exposure and were linked to cost-of-illness predictors using nationally available data to demonstrate the societal burden caused by this pathogen-food-consumer interaction. Campylobacter-contaminated chicken consumption presents significant health and socio-economic burdens in the U.S., as demonstrated by this quantitative risk assessment. Pairing commercially available risk assessment software with the processing power now accessible to end-users should yield more sophisticated quantitative risk assessment models with fewer resource burdens than in the past.
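
The dose-response step of such a model is often the approximate Beta-Poisson form. The sketch below uses the widely cited Campylobacter jejuni parameters (alpha = 0.145, beta = 7.59, from fits to human feeding-trial data) with an invented ingested-dose distribution; it is not the authors' model.

```python
import numpy as np

rng = np.random.default_rng(1)
ALPHA, BETA = 0.145, 7.59     # commonly cited C. jejuni Beta-Poisson fit

def p_infection(dose_cfu):
    """Approximate Beta-Poisson: P(inf) = 1 - (1 + dose/beta)^(-alpha)."""
    return 1.0 - (1.0 + dose_cfu / BETA) ** (-ALPHA)

# Illustrative lognormal ingested-dose distribution after cooking/handling.
doses = rng.lognormal(mean=1.0, sigma=2.0, size=100_000)
risk = p_infection(doses)
print(f"mean per-serving infection risk: {risk.mean():.3f}")
```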

P.101  Development of a pre-harvest system model to understand the ecology of E. coli O157:H7 in leafy greens production. Mishra A*, Pradhan AK; University of Maryland, College Park   abhinavmishra75@gmail.com

Abstract: Leafy green vegetables are identified as the fresh produce commodity group of highest concern from a microbiological safety perspective, because they are often grown in the open field and are vulnerable to contamination from manure, soil, irrigation water, and contact with (feces of) wildlife. Leafy greens are generally consumed raw, with no cooking step involved. Pathogenic bacteria such as Escherichia coli O157:H7 are the main pathogens causing foodborne disease through consumption of leafy greens, and the incidence of foodborne disease is generally correlated with climate conditions. In the United States between 1998 and 2005, the majority (58%) of VTEC outbreaks (n = 342) were related to contaminated food and mainly occurred in the period from May to October. E. coli O157:H7 is often detected in various on-farm sources (or sub-systems) such as water, feed, and bedding material, and in environmental sources such as wild birds, rodents, insects, water, and soil. During the past decade, a new "systems" food safety vocabulary has begun to emerge from the need to deal with highly complex food safety issues. Systems thinking is an approach to problem-solving based on the belief that the component parts of a system are best understood in the context of their relationships with each other and with other systems, rather than in isolation. This study was conducted with the objective of developing a system model of the ecology of E. coli O157:H7 in leafy greens, characterized by the inclusion of interaction terms in addition to terms for variability and uncertainty in the various subsystems.

P.102  A Bayesian approach to the estimation of Salmonella growth in raw chicken meat. Nguyen Loan*; Health Canada   loan.nguyen@hc-sc.gc.ca

Abstract: A Bayesian hierarchical approach, proposed by Pouillot et al. (2003) and Muller et al. (2006), was used to estimate the growth variability and the uncertainty of the parameters of the Salmonella growth model. Our model combined the Baranyi et al. (1994) growth model as the primary model with the Rosso et al. (1993) cardinal model as the secondary model to describe how the growth parameters vary among Salmonella strains and raw chicken products. Parameter estimation uses Markov chain Monte Carlo applied to pooled Salmonella growth data, extracted from the published literature and ComBase, to describe uncertainty in the model parameters. Fairly uninformative prior distributions for the parameters of the predictive growth model (Ellouze et al., 2010), and both gamma and uniform prior distributions for the parameters' standard deviations (Gelman, 2006), were considered in the model. This structure lets us describe how growth varies among Salmonella strains and chicken products, which is necessary for constructing our risk assessment, and is preferable to using a single, individual case model from among those that appear in the literature.
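
For reference, the Rosso et al. (1993) cardinal model named above as the secondary model scales the optimum growth rate by a dimensionless function of the cardinal temperatures. A sketch follows; the parameter values are illustrative, not the study's posterior estimates.

```python
def rosso_mu_max(T, mu_opt, Tmin, Topt, Tmax):
    """Rosso cardinal temperature model: mu_max(T) = mu_opt * tau(T),
    with tau = 0 outside (Tmin, Tmax) and tau(Topt) = 1."""
    if T <= Tmin or T >= Tmax:
        return 0.0
    num = (T - Tmax) * (T - Tmin) ** 2
    den = (Topt - Tmin) * ((Topt - Tmin) * (T - Topt)
                           - (Topt - Tmax) * (Topt + Tmin - 2 * T))
    return mu_opt * num / den

# Illustrative cardinal temperatures (deg C) and optimal rate (1/h).
for T in (10, 25, 37, 45):
    print(T, round(rosso_mu_max(T, mu_opt=2.0, Tmin=5, Topt=37, Tmax=47), 3))
```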

P.103  Modeling of Environmental and Meteorological Risk Factors for Contamination by Foodborne Pathogens in Produce Farms. Pang H*, Lambertini E, Pradhan AK; University of Maryland, Center for Food Safety and Security Systems, College Park, MD, 20742   haopang@umd.edu

Abstract: Foodborne outbreaks attributed to produce have been traced back to contamination at the pre-harvest level. Geographical factors surrounding produce farms may influence the probability of microbial contamination of produce at the pre-harvest stage. The objectives of this study were to: 1) identify possible risk factors for pathogen contamination in produce at the pre-harvest level, and 2) compare different modeling tools that can be used to analyze and identify risk factors in order to control and manage pathogen contamination risk at the farm level. A broad literature search was conducted, and studies that investigated possible risk factors for contamination by Listeria, Salmonella, and pathogenic E. coli in a variety of produce at the pre-harvest level were summarized and discussed. Potential pre-harvest risk factors were identified and divided into environmental and meteorological factors. Presence and survival of pathogens in wild and domestic animals, water, soil, and manure are well documented. Meteorological factors such as temperature, freeze-thaw cycles, and rainfall have been investigated as possible risk factors, although consistent evidence is lacking to conclusively support the association between these factors and contamination on produce farms. Classification trees and logistic regression are the primary statistical modeling tools that have been used to identify potential risk factors for contamination. This study analyzed possible risk factors for microbial contamination on produce farms and discussed statistical tools that can be used to evaluate and determine those risk factors. The information provided in this study could serve as a useful resource to evaluate and rank risk factors in produce production.

P.104  Prevalence, Isolation, and Genetic Characterization of Toxoplasma gondii in Chicken from Amish Community. Ying YQ*, Guo M, Dubey JP, Pradhan AK; Department of Nutrition and Food Science and Center for Food Safety and Security Systems, University of Maryland, College Park, MD, and Animal Parasitic Diseases Laboratory, Agricultural Research Service, US Department of Agriculture   yingyuqing@gmail.com

Abstract: Toxoplasma gondii is a protozoan parasite that can infect all warm-blooded animals. Consumption of raw or undercooked meat products and ingestion of oocysts from contaminated food and water are the two major sources of T. gondii infection. T. gondii prevalence can vary across areas and ethnic groups owing to differences in environmental oocyst contamination and dietary habits. A high infection rate was observed among pregnant Amish women, with 59 (52%) of 114 infected; the Amish community thus has a higher prevalence than the national infection rate. However, little is known about the major source of T. gondii infection in the Amish population. The objective of this study was to conduct a survey of the chickens sold at Amish markets in the mid-Atlantic area. Chickens are good indicators of environmental oocyst contamination because they feed from the ground. Approximately 200 heart samples will be purchased from each of six Amish markets in Maryland and Pennsylvania. Serum or interstitial fluid from the samples will be diluted to at least two dilutions and tested for antibodies to T. gondii by the modified agglutination test (MAT). Tissues (hearts) of the seropositive samples will be bioassayed in mice to isolate viable T. gondii, and isolates will be genetically characterized by PCR-RFLP at 10 loci. In a preliminary experiment, two samplings were conducted on 113 chicken heart samples purchased from the Dutch Country Farmers Market in Maryland. T. gondii antibodies were detected in 19 samples by MAT at a titer of 1:5 or higher; however, no parasite was isolated through bioassay. This study will provide statistically supported guidance to consumers on the food safety of meat products from Amish markets.

P.106  A fuzzy linear programming model for optimal allocation of health workers in a medical facility under crisis conditions. Yu KDS*, Tan RR, Aviso KB, Promentilla MAB, Santos JR; De La Salle University   krista.yu@dlsu.edu.ph

Abstract: Disease pandemics create problems for medical establishments, since the increased demand for treatment services is compounded by exposure of the healthcare workforce to contagion, which in turn can reduce the capacity to provide services. Past work on crisis management in health care systems has explored the implementation of non-pharmaceutical interventions such as social distancing and modifications to workplace practices, as well as strategies for the optimal allocation of human resources to maximize the impact of health care. Interactions between staff in a hospital, for example, are necessary to deliver essential services, but the frequency of contact also increases the probability of infection. This study develops a fuzzy linear programming model that allows an optimal allocation of health workers that minimizes the impact of reduced staff services resulting from exposure to health risks such as influenza pandemics and outbreaks of other contagious diseases. An extension of input-output models, which are widely used to illustrate the interdependent nature of economic sectors, is also incorporated to analyze human interactions within the organization. Results show that the inability of doctors and nurses to provide optimal care to all patients may leave an excess of support staff who do not contribute positively to operations under crisis conditions. Hence, management may use this information to assess alternative strategies such as importing essential workforce from other institutions to meet the surge in demand for emergency response, referring patients to other medical institutions, and sending non-essential labor home to reduce their exposure to health risks.
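
One standard way to cast such a problem is Zimmermann's max-min fuzzy linear program, in which an auxiliary satisfaction level lam is maximized subject to each fuzzy goal holding with membership at least lam. The sketch below is hypothetical, with invented staffing numbers and coefficients; it is not the study's actual model.

```python
from scipy.optimize import linprog

# Decision variables x = [doctors, nurses, support, lam].
# Fuzzy goal: treatment capacity 4d + 2n + 0.5s should reach 100 patients
# per shift (fully satisfied) and must exceed 70 (tolerance 30):
#     4d + 2n + 0.5s >= 70 + 30*lam
# Crisp constraint: at most 25 staff available during the outbreak.
c = [0, 0, 0, -1.0]                       # maximize lam
A_ub = [[-4, -2, -0.5, 30],               # fuzzy goal, rearranged
        [1, 1, 1, 0]]                     # headcount limit
b_ub = [-70, 25]
bounds = [(0, 15), (0, 25), (0, 20), (0, 1)]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
d, n, s, lam = res.x
print(f"lam = {lam:.2f}; doctors {d:.0f}, nurses {n:.0f}, support {s:.0f}")
```

In this toy instance the optimizer leaves support staff at zero, loosely echoing the abstract's finding that excess support staff may not contribute under crisis conditions.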

P.107  Stochastic Input-Output Analysis and Extensions for Impact Analysis: A United States Case Study. Ali J*, Santos JR; George Washington University   jalalali@gwu.edu

Abstract: The input-output (I-O) model's capability to provide macroeconomic policy insights on interdependent economic systems has recently been extended into the field of quantitative risk analysis. As with any quantitative model, estimates of input data and associated parameters are inevitably prone to error or bias; the same can be said of the susceptibility of the I-O technical coefficients to imprecision originating from various sources of uncertainty. Hence, this paper provides a methodology based on stochastic I-O analysis to address these issues and to measure the uncertainty in I-O model results. The research uses the supply and use tables from the US Bureau of Economic Analysis for a period of 14 years (1998-2011) to estimate the probability distributions of the technical coefficients. The coefficients are assumed to follow the Dirichlet distribution, and their moments are evaluated using a Monte Carlo simulation of 10,000 iterations. The simulation methodology is implemented in MATLAB, and the results are used to generate key-sector analysis. Probability distributions can be established to measure the backward and forward linkages for each economic sector. In addition, we used the eigenvalue method to determine the key sectors based on their contribution to the economy and to assess the sensitivity of the sectors to economic disruptions. In sum, this research develops a stochastic model based on historical I-O data, and the results are envisioned to contribute positively to strategic economic planning and macroeconomic risk analysis.
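
The core simulation loop described above (Dirichlet-distributed technical coefficients, a Leontief inverse per draw, and linkage statistics) can be sketched compactly. The study implements this in MATLAB; the Python sketch below uses an invented three-sector economy, and the concentration parameters would in practice be estimated from the 1998-2011 supply and use tables.

```python
import numpy as np

rng = np.random.default_rng(7)
# Dirichlet concentration parameters, one vector per sector's column.
alpha = np.array([[8.0, 2.0, 1.0],
                  [3.0, 9.0, 2.0],
                  [1.0, 1.0, 6.0]])

n_sim, n = 10_000, 3
backward = np.empty((n_sim, n))
for k in range(n_sim):
    # Sample each column of A; rescale so intermediate shares sum to 0.6,
    # leaving 0.4 of each sector's output as value added (keeps I - A
    # invertible and the economy productive).
    A = 0.6 * np.column_stack([rng.dirichlet(alpha[:, j]) for j in range(n)])
    L = np.linalg.inv(np.eye(n) - A)      # Leontief inverse
    backward[k] = L.sum(axis=0)           # column sums = backward linkages

print("mean backward linkages:", backward.mean(axis=0).round(2))
```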

P.108  Mental Models of Indoor Air Quality: Does anybody believe the research? Hamilton M, Rackes A, Gurian PL*, Waring MS; Drexel University   pgurian@drexel.edu

Abstract: This study used the initial steps of the mental models risk communication approach to characterize current understanding of indoor air quality among a diverse group of stakeholders, including designers, facilities managers, owners, and tenants, and to contrast this with technical understanding. The study began with a search of the technical literature on indoor air quality and the development of an influence diagram summarizing causal relationships and decision options. Two cost-effective intervention points were identified as having substantial benefits based on the literature: 1) increasing ventilation rates above minimum design standards, and 2) the use of high-efficiency filters. A series of semi-structured interviews indicated that many stakeholders see indoor air quality as a peripheral issue rather than one having substantial health and productivity impacts. Some are concerned about bringing poor-quality outdoor air into the building; this concern is legitimate in many contexts but can be addressed with high-efficiency filters. The semi-structured interview results were used to design a structured survey that was administered to 112 stakeholders across the United States. The survey assessed perceptions of current indoor air quality and estimates of the benefits and costs of, as well as willingness to pay for, IAQ improvements. Only minorities of respondents saw indoor air quality improvements as providing health, productivity, and absenteeism benefits, in contrast to studies that have documented these benefits. Respondents holding green building credentials were no more willing to affirm the benefits of improved indoor air quality than those without. Survey respondents estimated the costs of improved ventilation and filtration to be over ten times higher than engineering cost estimates of these measures. This study suggests that approaches to inform and certify professionals on the health risks and benefits of indoor air quality interventions may need to be revised.

P.109  Snow Avalanches Risk in North India and Role of GIS/RS and ICT in Avalanche Management. Walia AB*; Centre for Disaster Management Lal Bahadur Shastri National Academy of Administration   waliaabhi@gmail.com

Abstract: Snow avalanches are among the major disasters of the Himalayan region. Between December 2004 and February 2005, one of the highest snowfalls recorded in the last 30 years showed their destructive potential in Jammu & Kashmir, India. Snow avalanches of the Indian Himalayas are known for their massiveness and great destructive potential. An area of about 200,000 sq. km of Jammu & Kashmir, Himachal Pradesh, and Uttarakhand is exposed to avalanche danger. In J&K the most affected areas are the higher reaches of the Kashmir and Gurez valleys, Kargil and Ladakh districts, and some of the major highways. In Himachal Pradesh the areas vulnerable to the hazard are Chamba (34 villages), Lahoul-Spiti (48 villages), Kullu, and Kinnaur (25 villages). Parts of Tehri-Garhwal and Chamoli districts in Uttarakhand are also affected by avalanches. The aims of the poster are: to introduce snow avalanches in an accessible form so that people can easily understand avalanches, their causes, and the problems they pose; to share knowledge about avalanche danger in the North Indian mountainous regions and the role of GIS in its management; to disseminate information about avalanche management across the three major phases, i.e., before, during, and after; to study the vulnerability of the Indian mountainous region to avalanches; and to assess the efforts of government and NGOs in avalanche management. The proposed research will be conducted with the help of primary and secondary data: primary data will be collected through personal interviews and meetings with the dealing agencies and concerned professionals, and secondary information will be taken from printed materials and the internet. This poster will give a clear picture of the status of avalanche risk and management in the North Indian mountainous regions.

P.110  Human and Ecological Risk Assessment of Indiana University Golf Course. Cains MG*, McFetridge E, Winter A, Duan Y; Indiana University   marianacains@gmail.com

Abstract: Indiana University Golf Course is an 18-hole, 233-acre championship course that used 31 pesticide products in 2014, applying 548 kg of active ingredients to eliminate fungus, insects, and weeds. Multiple terrestrial and aquatic species are present within the vicinity, in addition to University Lake and Griffy Lake. Due to the potential health risks posed to IU Golf Course pesticide applicators and the surrounding ecosystem, an environmental risk analysis was conducted on five of the 26 applied active ingredients (bensulide, carbaryl, chlorothalonil, iprodione, and tebuconazole). The active ingredients were selected based on toxicity, bioaccumulation potential, half-life, amount of active ingredient applied, and frequency of application. Human risk was assessed using the EPA RAGS framework and the EPA Occupational Pesticide Handler dermal and inhalation exposure factors for golf course pesticide mixing, loading, and application. Terrestrial ecological risk was assessed for the Carolina chickadee, American robin, Canada goose, meadow jumping mouse, meadow vole, and rabbit using the EPA T-REX model. Aquatic ecological risk was assessed for daphnia, largemouth bass, and channel catfish using the TurfPQ and PondPQ models developed by Dr. Douglas Haith of Cornell University. Suggested risk management improvements include: requiring all pesticide handlers and applicators to always wear the level of personal protective equipment prescribed by the pesticide product label; avoiding pesticide application within 15 feet of a waterbody; and discontinuing the use of Sevin (active ingredient: carbaryl). Future steps for this environmental risk assessment include incorporating the personal protective equipment actually worn by golf course pesticide applicators, adding a distance component to the PondPQ model, calculating probabilistic ecological risk rather than risk quotients, and expanding the analysis to include all 26 active ingredients.

P.111  An Iterative and Multidisciplinary Framework for Determining Read-Across Chemical Surrogates. Rice JW, Ritter HC*, Kneeland JM, Zhang J, Butler C, Noble AE; Gradient   hritter@gradientcorp.com

Abstract: Businesses and governments around the world are increasingly integrating chemical risk assessment into their safety and regulatory programs for both new and existing chemicals. In order to accurately and comprehensively assess chemical risks, potential hazards to human health and the environment must be evaluated. Unfortunately, many substances have no readily available toxicity data, necessitating a different approach: "read-across" of hazard properties based on a similar chemical "surrogate" or "analog." The read-across approach involves identifying suitable chemical surrogates that are structurally and toxicologically similar to a chemical of interest (COI) and for which robust toxicological data exist. Based on this approach, we have developed an iterative, multidisciplinary framework for consistently and reproducibly identifying targeted and justifiable surrogates in the absence of chemical-specific data for COIs. Identifying robust chemical surrogates involves preserving COI reactive functional groups, incorporating known structural alerts, and considering bioavailability. The use of structure activity relationships (SARs) and chemical grouping by functional group, moiety, or chemical class to evaluate chemical hazard profiles is well established; with a large portfolio of chemicals to assess, the systematic leveraging of chemical groupings additionally streamlines surrogate selection and ensures consistency within the portfolio. Ultimately, correctly applying the read-across approach enables accurate chemical hazard assessment, minimizes analytical costs and dependence on animal testing, and assists compliance with hazard communication frameworks such as the Globally Harmonized System for Classification and Labeling of Chemicals (GHS).

P.112  Apportioning Multimedia Exposure and Risk across Human and Ecological Receptors. Richmond-Bryant J*, Lorber M, Price PS, Wright JM, Segal D, Gatchett A, Jarabek AM; US Environmental Protection Agency   richmond-bryant.jennifer@epa.gov

Abstract: A priority research area for the Environmental Protection Agency’s Human Health and Risk Assessment agenda is risk apportionment for multiple stressors and their sources on human receptors. The objectives of this research area are to address scientific challenges regarding integration of exposure to multiple stressors into cumulative risk applications and to explore stressors that may modify exposures and influence dose-response relationships. In one study, human exposure to elevated levels of multiple airborne phthalates is modeled by an air-to-blood-to-bladder model, and measured levels of phthalate metabolites in the urine are compared to model results. Previously, the focus of phthalate exposure research was on consumer products and food; this study is novel in quantifying exposures associated with airborne phthalates. A phthalate cumulative risk assessment, first presented at the 2014 Society for Risk Analysis Annual Meeting, is being used in a new case study of overall cumulative risk. Previous results have suggested that often, total risk in an overall population, life stage, or segment of a population is driven by a single source or chemical stressor. If it can be shown that phthalate cumulative risk is dominated by one or two phthalates, then regulatory strategies for risk reduction may be developed around controlling specific sources. Additional exposure and risk apportionment research focuses on the impact of multiple stressors on cardiovascular disease risk factors using a model of disease relationships to the physical and chemical properties of environmental stressors. Finally, in response to an increased awareness of environmental chemicals partitioning to breast milk as a potential exposure route important to children’s health, a comprehensive literature review on the epidemiological evidence of potential reduction in benefits of breastfeeding on infant health due to the presence of multiple chemicals in breast milk will be conducted. Views expressed in this manuscript are the authors’ and do not necessarily represent U.S. EPA views or policies.

P.113  Degradation Products as Read-across Surrogates for Hazard Assessment of Readily Degradable Substances. Ritter HC, Pizzurro DM*, Lunsman TD; Gradient   dpizzurro@gradientcorp.com

Abstract: Industry leaders and international regulatory bodies are increasingly integrating chemical risk assessment into their safety and regulatory programs. A critical first step in risk assessment is chemical hazard assessment, which involves comprehensive toxicological evaluation of hazards inherent to the individual chemical. Hazard assessment using the Globally Harmonized System for Classification and Labeling of Chemicals (GHS) guidance on chemical hazard classification fosters transparency and compliance with global regulations and is a predominant method for this process. To reduce reliance on animal testing and analytical costs, a read-across approach is increasingly used; this relies on identifying suitable "surrogate" compounds, for which toxicological data already exist, that are chemically and toxicologically similar to the compound of interest (COI) so that appropriate hazard assignment of the COI can proceed. A particularly challenging scenario for read-across surrogate identification is the case in which the COI can readily degrade or dissociate (defined herein as the propensity of a substance to degrade with speed or facility) into other constituents. In this case, one must evaluate the appropriateness of assigning dissociation or degradation products as surrogates for the COI, an evaluation that is often complex and for which no unified set of standards currently exists. Such an evaluation considers many important factors, including the mechanism, extent, and timescale of dissociation/degradation of the substance, as well as the potential for the constituents to recombine. This analysis presents examples of assessing read-across surrogates for several categories of readily degradable compounds, including compounds readily undergoing ionic dissociation, abiotic degradation, and biotic degradation (e.g., metabolism), and outlines a multidisciplinary approach to identifying supported, toxicologically appropriate surrogates for chemical hazard assessment.

P.114  Human Health Risk Assessment: Contemporary Characterizations and Challenges. Vandenberg JJ*, Jarabek AM, D'Amico L, Johnson M, Shams D, Bland N, Avery J; Government   jarabek.annie@epa.gov

Abstract: The Human Health Risk Assessment (HHRA) national research program in the US EPA's Office of Research and Development provides state-of-the-science assessment products tailored to support science-based decisions about environmental pollutants that impact human health and the environment. The landscape of such regulatory decisions ranges from chemical prioritization or rapid response to emergent situations using limited data, to determinations regarding whether to retain or revise National Ambient Air Quality Standards based on the integration of evidence from hundreds of epidemiological studies. Such decisions are also being made in the context of rapidly emerging biotechnology and alternative testing strategies supplying new data streams from other species and in vitro assays. Environmental justice awareness, evolving community concerns, and an appreciation for the influence of ecosystem degradation on public health warrant cumulative risk characterizations. This poster describes the elements of the HHRA national research program that comprise its current risk characterization products. Also identified are projects and opportunities to address such challenges, including strategies for flexibly applying new computational advances and developing approaches to integrate multiple stressors. Goals for, and approaches to, stakeholder and partner engagement are presented; such engagement is especially important to enhance scoping and problem formulation for assessment products. Collectively, these projects in the HHRA program will improve the hazard and dose-response assessments used to inform Agency decisions. (The views are those of the authors, and not necessarily those of the US EPA.)

P.117  Health Risk Assessment for Exposure to Photoresists in Semiconductor Manufacturing Industries. Huang SZ*, Wu KY; National Taiwan University   shaozuhuang@gmail.com

Abstract: The objective of this study was to assess the potential health risks of chronic exposure to photoresists, mixtures of organic compounds used in the semiconductor industry for photolithography. Because of trade secrets, information on the exact composition of these products is often limited; existing literature and patent descriptions suggest potential compositions as well as by-products formed during use. Human health risk assessment was conducted using the United States Environmental Protection Agency (U.S. EPA) risk assessment process. For exposure assessment, the exposure scenario for photoresists was obtained from an optoelectronic semiconductor manufacturing factory in Taiwan. The exposure modeling tool Stoffenmanager was used to estimate workers' exposure under this scenario. Because occupational personal sampling data for one of the identified compounds had been monitored by the factory, these measurements were combined with the exposure model estimate, used as the prior distribution, in a Bayesian update performed with the Markov chain Monte Carlo method. The updated distribution of the estimated exposure was then used to determine the hazard index (HI) of each compound. The evaluated photoresist used propylene glycol methyl ether acetate (PGMEA) mixed with Novolac resin and a photoactive compound. Potential by-products of the photoresist included phenol, cresol, benzene, toluene, and xylene. The dose-response of each compound was re-assessed and updated when possible, and reference concentrations and hazard indices were determined. The HI of PGMEA after the Bayesian update was 0.13, with a 95% upper bound of 0.26. The HIs of the other compounds were ranked to determine which compounds should take precedence for validation.
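
As a rough illustration of the workflow described above (an exposure-model prior updated with monitoring data via Markov chain Monte Carlo, then converted to a hazard index), the following minimal sketch uses a random-walk Metropolis sampler; all numerical values are invented and are not the study's data:

```python
# Minimal sketch, assuming a lognormal exposure model: a Stoffenmanager-style
# estimate serves as the prior on the mean log-exposure, personal samples
# update it via Metropolis MCMC, and the posterior is converted to an HI.
import numpy as np

rng = np.random.default_rng(42)

samples = np.array([1.8, 2.4, 1.1, 3.0, 2.2])   # hypothetical air samples, mg/m^3
log_y = np.log(samples)
sigma = 0.5                                      # assumed known log-scale SD

mu_prior_mean, mu_prior_sd = np.log(2.0), 1.0    # prior from the exposure model

def log_post(mu):
    prior = -0.5 * ((mu - mu_prior_mean) / mu_prior_sd) ** 2
    like = -0.5 * np.sum(((log_y - mu) / sigma) ** 2)
    return prior + like

# Random-walk Metropolis sampling of mu (log geometric-mean exposure)
draws, mu = [], mu_prior_mean
for _ in range(20000):
    prop = mu + rng.normal(0, 0.2)
    if np.log(rng.uniform()) < log_post(prop) - log_post(mu):
        mu = prop
    draws.append(mu)
post_mu = np.array(draws[5000:])                 # discard burn-in

rfc = 15.0                                       # hypothetical RfC, mg/m^3
hi = np.exp(post_mu) / rfc                       # hazard index = exposure / RfC
print(f"HI median {np.median(hi):.2f}, 95th pct {np.percentile(hi, 95):.2f}")
```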

P.118  Using pharmacokinetic data to replace default adjustment factors in assessing risk from non-clinical exposures to pharmaceuticals. Willis AM*, Ovesen J, Reichard J, Sandhu R, Maier A; 1,2,5. University of Cincinnati; 3. Toxicology Excellence for Risk Assessment; 4. SafeDose, Ltd.   willisam@ucmail.uc.edu

Abstract: On a molecular level, the toxicity of a chemical can be predicted based on the concentration/total dose of the biologically active chemical at the site of action. The effect is dictated by pharmacokinetics (PK) (delivery of the chemical to the site of action) and pharmacodynamics (PD) (how the chemical produces a biological response). Most pharmaceuticals have rich datasets, including known PK/PD parameters such as target receptor, mechanism of drug action, and receptor occupancy estimates. PK data on bioavailability, tissue distribution, drug protein binding, metabolism, and excretion may also exist. Several recent guidances have advocated the use of data-driven risk assessment methodologies for setting Acceptable Daily Exposures (ADEs) and Occupational Exposure Limits (OELs) for pharmaceuticals. They advocate the derivation of chemical-specific health-based limits with a rigorous evaluation and synthesis of all pharmacology and toxicology data. However, many of the specifics of derivation are left open to interpretation and limited guidance is provided on how to move away from default processes. The ultimate goal is to replace default factors with data when quantitative chemical-specific information is available. This work aims to lend additional guidance on how to utilize PK/PD data for the replacement of defaults in pharmaceutical risk assessment. Areas that can utilize PK/PD data include: adjusting for bioaccumulation, steady state (S), and dose averaging for intermittent dose schedules; assessing bioavailability from different routes of exposure; estimating interindividual variability (UFH); and supporting interspecies extrapolation (UFA); among others. While the focus here is ADEs, many of these issues apply equally to other types of pharmaceutical risk assessments, such as OELs and Reference Doses (RfDs). In either case, the appropriate use of PK/PD data to set ADEs depends on the underlying dataset and the target subpopulation being protected.
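
The arithmetic behind replacing a default uncertainty factor with a data-derived one is simple; the sketch below illustrates it for a hypothetical ADE derivation (the formula is the common point-of-departure-over-composite-factor form, and every number is invented for illustration):

```python
# Illustrative arithmetic only (all values hypothetical): an ADE derived from
# a point of departure (POD), with a default 10x interindividual factor
# swapped for a smaller data-derived pharmacokinetic factor.
def ade_mg_per_day(pod_mg_kg_day, bw_kg=50.0, uf_interspecies=10.0,
                   uf_intraspecies=10.0, uf_other=1.0):
    """ADE = POD * body weight / product of uncertainty factors."""
    composite = uf_interspecies * uf_intraspecies * uf_other
    return pod_mg_kg_day * bw_kg / composite

default = ade_mg_per_day(pod_mg_kg_day=0.5)
# Hypothetically, measured human kinetic variability might support 3.2x
# instead of the default 10x for intraspecies variability:
data_driven = ade_mg_per_day(pod_mg_kg_day=0.5, uf_intraspecies=3.2)
print(f"default ADE {default:.3f} mg/day vs data-driven {data_driven:.3f} mg/day")
```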

P.119  An Analysis of Violations of the OSHA Regulatory Standard on Benzene. Williams PRD*; E Risk Sciences, LLP   pwilliams@erisksciences.com

Abstract: In this study, the number and type of OSHA Benzene Standard violations and corresponding violations of OSHA's Hazard Communication Standard (HCS) were evaluated. OSHA violation and inspection data collected since the 1970s were obtained from the Department of Labor enforcement website. Analysis of these data indicated that 938 violations of the benzene standard have been issued to date, compared to >80,000 violations for some other regulated substances (e.g., lead). The number of benzene standard violations was found to vary by time period, standard provision, industry sector, and other factors. Approximately 70% of the benzene standard violations occurred during the late 1980s to early/mid-1990s, soon after the 1987 final benzene rule was promulgated. The majority of benzene standard violations also pertained to noncompliance with provisions dealing with exposure monitoring (37%), communication of hazards (23%), respiratory protection (10%), and medical surveillance (9%). Only 200 out of approximately 550,000 HCS violations were attributed to potential benzene hazards in the workplace. Additionally, 55% of benzene standard violations pertained to the manufacturing sector, particularly industries where benzene products may be used or produced (e.g., petroleum refining). The greatest percentage of benzene standard violations was issued to private facility owners (90%), during inspections where union representation was present (56%), and from complaint-driven inspections (45%). Violations of the benzene standard have typically involved a single instance per facility and 10 or fewer exposed employees, and initial penalties have generally been <$5,000 per violation. Despite some limitations, the OSHA inspection database contains the best available data for assessing historical or current violations of the benzene standard. These data may be of interest to those involved in benzene risk assessment, risk management, or public policy issues.

P.120  Chicago Transit Authority Train Noise Exposure. Phan LT*, Jones RM; University of Illinois at Chicago   lphan6@uic.edu

Abstract: Objectives: The goal of this study is to characterize the noise exposure of riders on Chicago Transit Authority (CTA) trains and to identify factors influencing noise levels. Methods: Twenty-eight UIC students were recruited to participate in this study. Participants were asked to ride a specific train route while wearing a noise dosimeter and to complete a questionnaire about factors that might influence noise levels. Participants rode in the first car, where the driver's cab is located. We used CEL-35x dBadge noise dosimeters, set to integrate sound levels over 1-minute intervals with a 5 dB exchange rate, an 80 dB threshold, and a 90 dB criterion. Noise measurements were taken in-vehicle only. Noise levels were tabulated as: 1) 1-min peak (Lpeak) and average (Leq) noise levels, and 2) peak and average noise levels by train line segments between stations. Results: Train drivers have longer-duration exposure than riders. The estimated 8-hour noise doses, however, do not exceed the exposure limits: the highest OSHA and ACGIH 8-h projected doses were on the Blue line, at 13% and 25.24%, respectively. Linear mixed-effect regression models showed that the 1-min and segment-average noise levels of the Blue line, at about 78 dBA, were significantly higher than those of all other lines (p<0.05). The segment-average noise level for travel through tunnels (77 dBA) was significantly higher than for elevated segments (by 2 dBA) and for at-grade segments (by 3 dBA) (p<0.05). Occupancy and passing trains were not associated with segment-average noise levels. Conclusions: The Blue line had higher noise levels than other train lines, but noise doses do not exceed occupational exposure limits for ride durations or projected 8-hour durations. Train drivers may have different exposures than riders, however, because they are in a cab with operable windows. Future work should confirm that drivers' noise exposures are below exposure limits.
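
For orientation, the percent-dose bookkeeping behind such projections is the standard OSHA/ACGIH formula; the sketch below applies it to invented 1-minute Leq values (the ACGIH parameters shown are the commonly used ones, not taken from the abstract):

```python
# Standard noise-dose arithmetic (Leq values below are made up).
import numpy as np

leq_1min = np.array([72, 75, 81, 78, 84, 79, 76, 82])  # dBA, hypothetical ride

def dose_percent(levels_dba, minutes_each, criterion, exchange, threshold):
    """Percent dose = 100 * sum(C_i / T_i), where T_i = 8 / 2**((L - criterion)/exchange)
    is the allowed time (hours) at level L; levels below the threshold contribute 0."""
    L = np.asarray(levels_dba, dtype=float)
    T = 8.0 / 2.0 ** ((L - criterion) / exchange)            # allowed hours at L
    C = np.where(L >= threshold, minutes_each / 60.0, 0.0)   # exposure hours
    return 100.0 * np.sum(C / T)

ride_osha = dose_percent(leq_1min, 1, criterion=90, exchange=5, threshold=80)
ride_acgih = dose_percent(leq_1min, 1, criterion=85, exchange=3, threshold=80)

# Project to 8 hours by assuming the ride pattern repeats all shift
minutes = len(leq_1min)
print(f"OSHA 8-h projected dose: {ride_osha * 480 / minutes:.1f}%")
print(f"ACGIH 8-h projected dose: {ride_acgih * 480 / minutes:.1f}%")
```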

P.122  What Happened To The Acute Exposure Guideline Level (AEGL) Program? Fensterheim R*, Choi H, Strother D, Jaques A; RegNet Environmental Service; Toxsolve   rfensterheim@regnet.com

Abstract: Acute Exposure Guideline Levels (AEGLs) describe risks to humans from short-term chemical exposures due to spills or other catastrophic events. The robustness of the technical reviews produced high-quality analyses for important chemicals in commerce, and AEGLs and their supporting materials are used globally, well beyond accidental release scenarios. The foundation of the program was the National Advisory AEGL Committee, formed in 1996. The Committee, chartered under the Federal Advisory Committee Act (FACA), had broad representation that contributed to its success, including EPA, DOD, DOE, and DOT, as well as other federal and state governments, the chemical industry, academia, and the private sector. Guidance and peer review were provided by the National Academy of Sciences (NAS). The NAS organized a committee to review draft reports; concerns identified by the NAS were submitted to the FAC for resolution, and once concurrence was achieved, the final values were published. All meetings of the FAC, drafts, and proposed values were announced in the Federal Register; the public and other stakeholders were invited to comment and present. The last meeting of the FAC was in April 2010; the Committee was never reconvened, and the charter expired in October 2011. Since the NAS had not completed its review of all proposed values, a new process was established that bypassed the FAC and did not include the same public notice and comment period. Under this new process, the NAS's concerns were addressed by a new contractor, and agency stakeholders were given a two-week review. While many of the AEGL values finalized under the new process were similar to those proposed, several are strikingly different. An analysis of the differences between the AEGL values proposed by the FAC and those finalized by the NAS is presented. The lack of full public stakeholder involvement is believed to have contributed to some of the differences, and public policy questions are explored concerning the status of the AEGL values that bypassed the full FACA process.

P.123  The development of a heat wave vulnerability index for Osaka, Japan. Macnee RGD*, Tokai A; Osaka University   robert.macnee@gmail.com

Abstract: An emerging environmental concern in urban areas is the impact of heat waves on health. Heat waves account for more fatalities than any other meteorological event in the developed world. The impact of heat waves depends on the intensity and duration of each event and on environmental and socio-demographic factors. Heat wave impacts are also spatially dynamic; there is no globally defined temperature threshold beyond which excess deaths occur. Improved forecasting enables regional early warning systems to detect when heat waves will occur with some accuracy. However, in order to develop effective adaptation strategies and increase the resilience of a city, it is important to develop a method to clearly identify the areas that are most vulnerable. Due to the large number of variables that determine the impact of a heat wave and the spatial variability of heat thresholds, it is not sufficient to rely on regional weather forecasting alone. This project proposes the use of a heat wave vulnerability index developed from a principal component analysis of the variables that influence heat wave vulnerability. Initially, observational evidence is presented to show an increasing trend in the annual number of “hot” days in Osaka Prefecture (1980-2014). Proxy measures of vulnerability are obtained from census data and land-use classification for the ward and city districts of Osaka Prefecture. Each component is weighted according to its influence and then summed to develop an index score for each district. The resulting map is combined with 1 km² resolution air temperature observations and downscaled Global Climate Model (GCM) projections. Thus, temperature “hot spots” and sensitive areas will be identified simultaneously. This assessment of vulnerability, combining exposure and sensitivity components, can provide a precedent for efficient, targeted action to reduce the impact of heat waves at present and under climate change.
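
A minimal sketch of the index construction described above, assuming the principal components are weighted by their explained variance and summed per district; the input matrix is synthetic, not the Osaka census data:

```python
# Sketch: PCA on proxy vulnerability variables, components weighted by
# explained variance, summed into one score per district (data synthetic).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# rows = districts, cols = proxy variables (e.g., % elderly, population
# density, % green space, single-person households) -- placeholders only
X = rng.normal(size=(66, 4))

Z = StandardScaler().fit_transform(X)       # standardize the variables
pca = PCA(n_components=3).fit(Z)
scores = pca.transform(Z)                   # component scores per district

# Weight each component by its share of explained variance, then sum
index = scores @ pca.explained_variance_ratio_
rank = np.argsort(index)[::-1]              # most vulnerable districts first
print("Top 5 district row indices:", rank[:5])
```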

P.125  Qualitative Interviews with Science and Risk Communication Trainers about Communication Goals. Besley JC*, Dudo AD, Yuan S; Michigan State University   jbesley@msu.edu

Abstract: Twenty-four qualitative interviews were conducted with science communication trainers to better understand how these trainers address goal setting in their work. The results suggest that trainers believe the scientists they train want help achieving a range of personal and societal goals. Personal goals were primarily related to career, while societal goals were primarily related to ensuring that science is part of decision-making related to health or environmental risk (e.g., climate change or vaccines). Interviews also suggested that the training being offered rarely explicitly addresses which intermediate objectives might allow scientists to achieve their overall goals. There was recognition that increasing knowledge was unlikely to have a substantial effect on how non-scientists view issues involving science/risk, but the training being provided appears to emphasize communication skills such as clarity and message selection. What appears to be missing is any discussion of how scientists could attempt to communicate elements of trustworthiness (i.e., warmth and competence) or procedural fairness (i.e., a willingness to listen and respect for others’ views). In some cases, trainers noted that their training includes a focus on trust-related strategies, such as helping scientists be more relaxed and personable in front of a camera or helping them listen to audiences, but these were typically discussed in the context of enabling knowledge transmission rather than as potentially complementary pathways to realizing scientists’ overall goals. Another potential limitation of the training currently offered is that trainers generally said they allow the scientists themselves to set their goals, rather than providing guidance on which goals are most likely to be effective. The interviews are part of a larger project aimed at understanding scientists’ views about public engagement related to health and environmental risks, as well as other issues.

P.127  A valid scale of past experiences for tornado risks. Demuth JL*; NCAR and Colorado State University   jdemuth@ucar.edu

Abstract: One’s past experience with a hazard is potentially a key factor in how one perceives a future risk, as experience is a key mechanism through which one acquires knowledge about a risk. Despite this, past hazard experience has been conceptualized and measured in wide-ranging and often simplistic ways by researchers, resulting in mixed findings about the relationship between experience and risk perceptions. Thus, the dimensions of past hazard experiences are not known, nor is it known how one’s experiences relate to one’s assessment of future risks. Past hazard experience is particularly relevant to weather risks, which are common enough for people to acquire many experiences. This poster will present the results of a study to develop a valid scale of past experiences in the context of tornado risks. The scale is developed by, first, conceptualizing and identifying dimensions of past tornado experience, and then by examining the relationship between the different experience dimensions and people’s tornado risk perceptions. Data were collected through two mixed-mode surveys of members of the public who reside in tornado-prone areas. An initial set of items measuring people’s most memorable tornado experience as well as their experiences with multiple tornado threats was developed for and evaluated with the first survey. Additional aspects of people’s past tornado experiences were elicited in their own words. The item set was then revised and evaluated with the second survey, along with measures of people’s tornado risk perceptions. Four dimensions of people’s most memorable tornado experiences emerged: risk awareness, risk personalization, personal intrusive impacts, and vicarious impacts. Two dimensions of people’s multiple tornado experiences also emerged: common personal threats and impacts, and negative emotional responses. Moreover, these different types of experiences relate differently to people’s tornado risk perceptions. These results will be discussed.

P.128  Launching a New Product in a Buzzing World: The Apple Watch’s Reputation at Risk. Digoin G*, de Marcellis-Warin N, Warin T; Ecole Polytechnique de Montreal   guillaume.digoin@polymtl.ca

Abstract: Successfully launching a new tech product is vital. If a company does not control all the risks associated with the launch, the product (and the company) can simply be disregarded by consumers. This is particularly true for reputation risks. If the product’s reputation is damaged right from the beginning, the damage is often impossible to offset and can also affect the company’s reputation. Corporate reputation is indeed one of the most important assets of a company (de Marcellis-Warin & Teodoresco, 2012). Although this has always been relevant, it is even more important these days with social networks. A study conducted by the Reputation Institute in 2015 reveals that conversations on social networks (by consumers or even unrelated persons) have a negative impact on corporate reputation (-2.4%). However, when a company communicates on social media, its reputation increases (+0.4%). In this research, we analyze the launch of the Apple Watch in early 2015. As one of the most successful tech companies in the world, Apple enjoys a very high reputation. The launch of Apple’s latest connected watch was widely commented on social networks, especially Twitter. We collected tweets before, during, and after the launch of the Apple Watch. This dataset allows us to perform a sentiment analysis and determine whether the conversations have a positive or a negative impact on the product’s reputation as well as on Apple’s reputation. We can also identify the specific topics that were discussed (technical, design, etc.). Finally, we highlight the differences between conversations initiated by consumers and those initiated by non-consumers.
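
As an illustration of the kind of tweet-level sentiment scoring described above (not the authors' actual pipeline), here is a minimal lexicon-based sketch using NLTK's VADER analyzer, with invented tweets:

```python
# Lexicon-based sentiment scoring sketch; tweets are made up for illustration.
import nltk
from nltk.sentiment import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)  # one-time lexicon fetch

tweets = [
    "The Apple Watch looks gorgeous, can't wait for mine!",
    "Battery life on the Apple Watch is a joke.",
    "Apple Watch shipping dates announced today.",
]

sia = SentimentIntensityAnalyzer()
for t in tweets:
    score = sia.polarity_scores(t)["compound"]  # -1 (negative) .. +1 (positive)
    label = ("positive" if score > 0.05
             else "negative" if score < -0.05 else "neutral")
    print(f"{label:8s} {score:+.2f}  {t}")
```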

P.129  The Challenge of Communicating the Risk of Inaction: Linking Causal Attribution to Biased Information Processing. Dixon GN*; Washington State University   graham.n.dixon@gmail.com

Abstract: In communicating the societal and personal benefits of vaccination, many persuasion techniques fail to produce the desired effect on vaccine compliance and beliefs. Statistical information about vaccine safety is largely ineffective at improving vaccine attitudes and intentions; refuting nonscientific information decreases intent to vaccinate; and using emotional testimony or pictures to highlight the consequences of non-vaccination can attenuate risk perception surrounding vaccine-preventable disease and increase the erroneous perception that vaccines cause autism. Furthermore, using emotional pictures to illustrate the risks associated with not vaccinating results in motivated reasoning: it is persuasive only for those with views favorable toward vaccination, but backfires for individuals harboring anti-vaccine views. Creating effective vaccine persuasion campaigns therefore requires further scrutiny of the psychological aspects of vaccine risk perception and the manner in which different audiences process vaccine-related messages. This research (1) addresses why persuasive messages on vaccination backfire for vaccine skeptics and (2) investigates ways to correct these biasing effects. Theoretically, I apply attribution theory to vaccine risk perception and motivated reasoning as a way of understanding why vaccine risk messages result in biased processing. Understanding the causal attributions of risks can then inform best practices for communicating about vaccination to skeptical audiences.

P.130  Protecting lives or promoting risk? Hurricane Sandy survivors’ perceptions of severe weather communication. Eosco GM*, Rickard LN, Scherer CW, Haase D; ERG; University of Maine; Cornell University; SUNY-ESF   lrickard@esf.edu

Abstract: For those living directly on the coast, storm surge is the most dangerous and potentially deadly risk. During Hurricane Sandy in 2012, 40 deaths were directly attributed to flooding that occurred due to a dramatic slow rise of ocean surge. Beyond Sandy, storm surge has easily been one of the most challenging risks to communicate over the last decade. How individuals decide whether or not to evacuate is explored in this study of 75 individuals living within a few feet or blocks of the coast in five Connecticut communities. These individuals participated in 90- to 120-minute focus groups (n = 7) examining their use of information sources during Sandy and their evacuation decision-making. Findings suggest that the more “storm proof” (i.e., in compliance with FEMA policy) residents perceive their house to be, the less likely they are to evacuate during severe weather events. If, by following FEMA standards, individuals perceive their homes as safer and are less likely to evacuate, is this policy to protect property running counter to the overarching goal of protecting human life? Do FEMA standards spark an unintended “risk compensation” effect, wherein residents’ (possibly unrealistic) perceived safety overrides messages they receive about storm severity and necessary evacuation? In turn, how do these perceptions relate to behavioral decisions and the challenges facing first responders in these communities? We explore these and other questions emerging from the focus groups, and suggest theoretical and practical implications for risk communication and emergency management.

P.131  Should society be compensated for the risks imposed by Climate Change? Gutierrez VV*, Cifuentes LA; Universidad Diego Portales and Pontificia Universidad Católica de Chile   virna.gutierrez@udp.cl

Abstract: Chile is one of the countries considerably affected by climate change. Even though Chile contributes close to 0.3% of the world’s greenhouse gas emissions, Chilean authorities are engaged in responding constructively to develop solutions and to adapt to the significant impacts of climate change. For this reason, Chile created its National Climate Change Action Plan, which brings together a number of public policies related to climate change and its adverse effects. However, in order to implement this plan successfully, authorities should take public opinion into account. This research advances the understanding of public perception of the risks imposed by climate change on society, the environment, and individuals. We explore how much people believe society, the environment, and individuals should be compensated for climate change impacts. We also relate the compensation demanded to the classical variables used in the field of risk perception (perceived risk, public acceptability, and trust in regulating authorities). We used an online survey to poll a sample of the population of Santiago, Chile; a total of 525 subjects answered the survey. Data were analyzed using structural equation modeling procedures. Results show that the compensation demanded for the effects of climate change on the environment is higher than that demanded for effects on society or on an individual. Perceived risk is higher for impacts on the environment than for impacts on society or an individual. Acceptability of the risks of climate change and trust in the authorities in charge of managing it are low. Demanded compensation depends on perceived risk and also on trust. Implications for decision makers and public policies are discussed.

P.132  Who accepts using Fukushima produce in school lunches, and why? Hiromi H*, Iwabuchi M, Kumagai Y, Sekizaki T; the University of Tokyo   ahiromix@mail.ecc.u-tokyo.ac.jp

Abstract: Since the accident at the Fukushima nuclear power plant, many municipalities have paid more attention to choosing produce for school lunches. Four years after the disaster, the radiocesium contamination of foods distributed to the market is well controlled, and most inspected foods show “not detected” levels. We investigated the recovery of Fukushima produce usage in school lunches, focusing on the information provided by elementary schools and on parents’ trust in school lunches. A total of 168 responses were collected from elementary schools in Tokyo and Fukushima in October 2014. Only one school in each prefecture agreed to administer a questionnaire to parents, yielding 216 and 51 parental responses, respectively, in December 2014. The following facts were found. In Fukushima, milk and rice usage had almost recovered to pre-disaster levels, while none of the responding schools in Tokyo used rice produced in Fukushima in 2014 (16% did before the disaster). The main information provided to parents in Fukushima concerned control measures and inspection results, while in Tokyo the focus was on clarifying the production area. Those who understand radiation risk very well, as well as those who referred to information provided by the school, tend to accept the use of Fukushima produce, while those who have a certain degree of knowledge and referred to internet-based information tend to refuse it. Parents in Tokyo who agree to the use of Fukushima produce intend to support recovery from the disaster.

P.133  Measurement of the thresholds of fear for probabilistic earthquake forecasting and examining the effects by communication methods and demographic factors in Japan. Hirota S*, Oki S; Tokyo City University and Keio University   sumire@tcu.ac.jp

Abstract: After 3.11, the development of earthquake communication methods has gained importance. Although earthquake forecasting using probability and return period, such as “the probability of occurrence of an earthquake with a seismic intensity of 6 is 25% in 30 years,” is common, the way such forecasting is perceived has not yet been investigated. The purposes of this study, therefore, were to measure the thresholds of fear for such probabilistic forecasts and to examine the effectiveness of communication methods including framing and/or showing reference areas. 1,114 participants, householders or their spouses aged in their 30s to 50s and living in highly or less vulnerable areas of Osaka, responded to a web survey in February 2015. The survey included the Cognitive Reflection Test (CRT) and questions on demographics. The thresholds of percentages and of the fear period were measured using the method of limits in ascending/descending order. The participants were presented with the probabilistic forecasts for their residence areas by one of the available communication methods and were asked about their behavioral intentions for prevention. Measurement methods and experimental conditions were randomized across participants. Results showed that, though the period thresholds took several fixed values, the percentage thresholds differed completely between measuring series. In descending series, decreasing the percentages exponentially raised the probability thresholds of “not fearful”. In contrast, in ascending series, the thresholds of “fear” had several clear peaks. Positive framing and showing reference areas produced relatively high behavioral intentions, but the effects were not significant. Log-linear analysis revealed that gender (p < .001), income level (p < .01), CRT score (p < .05), and age (p < .05) significantly influenced changes in behavioral intentions. These findings indicate that the effectiveness of probabilistic forecasts is influenced by the mode of communication, and that earthquake risk communication should be tailored to recipients’ demographics.

P.134  The paradox of risk communication: People might fear something even though it is described as safe, except for people with high numeracy. Ikawa M*, Kusumi T; Kyoto University   ikawa.miho.73u@st.kyoto-u.ac.jp

Abstract: Introduction: The more a risk informant explains safety, the more doubtful their communication may be considered. This phenomenon is called the paradox of risk communication. Previous findings indicated that the paradox tends to occur when a risk manager is not trusted. These studies, however, did not examine individual differences in numeracy: the ability to process basic probability and numerical concepts. Hence, the purpose of this study was to examine the possible relationships among risk perception, numeracy, and trust in a risk informant. Method: 1,300 Japanese people (47.2% female, mean age 44.3 years), drawn from the participant pool of an internet research company, took part in this study. The participants were asked to assess how trustworthy the Japanese government was and to read information about radiation risk, namely the radioactive contamination of food following the Fukushima Daiichi nuclear disaster in 2011. The information consisted of two parts: the first expressed the risk numerically, while the second reported that the risk had been revised downward (i.e., the control standard value of radiation risk changed from 5 mSv per year to 1 mSv per year). After reading the information, participants were asked to complete a risk perception scale and a numeracy scale. Results and Discussion: Chi-square tests revealed that participants who trusted the Japanese government became less afraid of radiation risk after reading the information on risk reduction. The tests also revealed that participants with high numeracy became less fearful. The information helped participants with high numeracy understand why the control value of radiation risk changed, more so than those with low numeracy. These findings suggest that the paradox of risk communication differs between people with high and low numeracy. Implications for the roles of trust and numeracy in risk communication will be discussed.

P.135  EPA’s Risk Assessment Training and Experience Program (RATE): A Critical Tool for Advancing National and International Collaboration and Harmonization of Risk Assessment. Kadry AM*, Walsh D, Sams R; National Center for Environmental Assessment, Office of Research and Development, U.S. Environmental Protection Agency, Washington, DC    kadry.abdel@epa.gov

Abstract: The United States Environmental Protection Agency (US EPA) is the global leader in conducting state of the science human health risk assessments. These risk assessments are often the first to apply new risk assessment guidance, scientific methods and data. State of the art risk assessments rely on peer-reviewed epidemiological and laboratory animal studies to identify hazards associated with exposure to environmental contaminants and to perform quantitative dose-response analysis. Biologically-based mathematical models and mechanistic data are used to answer questions about the human relevance of animal studies and to select appropriate methods to extrapolate from high experimental doses to the low doses people encounter in the environment. Development of quantitative methods to better characterize uncertainty for risk estimation represents a key focus within the field of risk assessment. EPA plays a critical role in the scientific community to promote the advancement of risk assessment. There is a need to communicate and provide comprehensive training and experience to EPA stakeholders and the risk assessment community on current, state-of-the-art risk assessment practices as used and implemented by federal agencies. The US EPA’s National Center for Environmental Assessment (NCEA) has developed and provided training in risk assessment to US EPA’s stakeholders and the international risk assessment community. Risk assessment training has been offered to EPA programs, at conferences such as SRA, and to assessors in several countries around the globe (e.g., Canada, Switzerland, Poland, South Africa, Ghana, Australia, Egypt, Saudi Arabia, Chile and United Arab Emirates). In this presentation we will discuss key approaches to assure training program success and future opportunities for national and international cooperation in risk assessment training. The views expressed in the proposal are those of the authors and do not necessarily reflect the views or policies of the U.S. Environmental Protection Agency.

P.137  How Was Health Risk-Related News Reported in Taiwan? A Pilot Analysis of News Reports on Ractopamine-Containing Beef Imported from the United States. Lu EH*, Wu KY; Institute of Occupational Medicine and Industrial Hygiene, College of Public Health, National Taiwan University, Taipei, Taiwan.   shren199322@gmail.com

Abstract: The media are considered risk amplifiers: news reports may shape public risk perception of a new hazard or issue. It is therefore critical to study the factors that influence news coverage of health risk issues. Ractopamine-containing beef (RCB) imported from the United States was used as an example in this pilot study to elucidate factors associated with news appearing in Taiwan’s four highest-circulation newspapers. News appearing during the peak period of controversy over the RCB issue, from January to March 2012, was selected for this study, with news from 2011 as a reference. News items containing “US beef products” or “RCB” were selected for analysis. The headlines and articles of news items associated with the concept of health risk were counted separately: a headline or article containing the keywords ractopamine detected, lifted or banned, dose limit, food safety, or health effects was coded as 1, and all others as 0. The codes were summed for headlines and articles for each newspaper in each year. The ratio of the difference in news appearance for each press and the percentage of news appearing in different sections were calculated between 2012 and 2011. In total, 1,710 news items were selected: 181 in 2011 and 1,529 in 2012. Headlines rarely carried the concept of health risk, while more than 50% of the articles delivered the concept in 2012. When an article conveyed the concept, it appeared on the front page about 50% of the time in two of the presses. The articles often described intrinsic toxic effects rather than dose effects; when an article did convey the concept of dose effects, it usually referred to the rate at which ractopamine was detected. In conclusion, reporters in Taiwan, like laypeople, lack the concept of dose effects; their reports might mislead readers and amplify the risk.
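
A toy reconstruction of the binary keyword-coding scheme described above; the keyword list paraphrases the abstract and the records are invented:

```python
# Toy coding sketch: a headline or article scores 1 if it contains any
# risk-concept keyword, 0 otherwise; scores are summed per paper and year.
KEYWORDS = ["ractopamine", "lifted", "banned", "dose limit",
            "food safety", "health effect"]

def code(text: str) -> int:
    t = text.lower()
    return int(any(k in t for k in KEYWORDS))

articles = [  # invented records
    {"paper": "A", "year": 2012, "headline": "US beef imports to resume",
     "body": "Officials set a dose limit for ractopamine residues."},
    {"paper": "A", "year": 2011, "headline": "Trade talks continue",
     "body": "Beef quotas were the main topic."},
]

totals: dict = {}
for a in articles:
    for field in ("headline", "body"):
        key = (a["paper"], a["year"], field)
        totals[key] = totals.get(key, 0) + code(a[field])
print(totals)  # e.g., {('A', 2012, 'headline'): 0, ('A', 2012, 'body'): 1, ...}
```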

P.138  Risk Perception on EMF Health Effects of Pregnant Women in Japan. Ohkubo C*; Japan EMF Information Center   ohkubo@jeic-emf.jp

Abstract: The risk perception of pregnant women (n=1,164, aged 20-30) regarding electromagnetic fields (EMF) and health was surveyed in Japan and compared with that of non-pregnant women and men in the same age group (n=1,224) as a control. The internet survey was conducted in June 2014. The questionnaire covered the degree of risk perception of EMF health effects, items of concern in daily life, ill-health effects imagined to be caused by EMF exposure, trust in stakeholders, and the reliability of information sources. The pregnant women were more sensitive to health issues than the control group: more than half of the pregnant women were concerned about the health effects of EMF exposure, versus almost forty percent of the control group. EMF-emitting devices of concern among the pregnant women were mobile telephones (57%), microwave ovens (46%), personal computers (26%), and others. Facilities of concern were electrical power substations (21%), power lines (16%), mobile phone base stations (15%), and others. Among the pregnant women, ill-health effects imagined to be related to EMF exposure were fetal development (37%), birth defects (27%), miscarriage (19%), infertility (17%), and others. Concern about EMF exposure effects on childhood leukemia and brain tumors was relatively low (about 10% each). Degrees of trust in international organizations (53%) and the national government (42%) were higher than trust in electrical power companies (21%) among the pregnant women. The pregnant women showed a tendency to rely on information from medical staff, their parents, and newspapers, but not on information from the internet.

P.139  The proof is in the picture: Exploring the influence of visual type on hurricane risk perception. Rickard LN*, Eosco GM, Scherer CW; University of Maine; ERG; Cornell University    lrickard@esf.edu

Abstract: Whether a photograph, computer graphic, or radar image, visual representations of severe weather range from the “iconic” to the “indexical” (Messaris, 1997). Whereas iconic visuals, such as maps, serve as an analogy between an object and its signifier, indexical visual representations, such as photographs, are considered physical traces: “proof” that some object exists. Past research suggests that iconic visuals implicitly convey uncertainty, including in relation to the timing of the event, the amount of risk to a location, or if the event will occur. Indexical visuals, on the other hand, tend to prompt increased certainty and perceived risk. In the case of severe weather, however, providing indexical images may be impossible, as often these events have yet to occur. Building on this foundation, the present study explores the influence of visual type on perceived risk related to a hurricane forecast. In a between-subjects factorial design, we present residents of New York, New Jersey, and Connecticut (N = 1,052) with a forecast describing hypothetical Hurricane Pat accompanied by either a map showing predicted storm surge amounts, a photograph of storm surge impacting a neighborhood, or no visual. In addition, half of the forecasts contained information characterizing evacuation decision-making as a shared responsibility between forecasters, emergency personnel, and individual citizens. Results suggest that the condition with a photo but without shared responsibility information elicited the most concern about storm surge. Moreover, when predicting risk judgment, experimental condition, perceptions of the visual (e.g., its believability), and individual characteristics (i.e., education, number of children at home) were significant predictors, whereas past experience with hurricanes was not. Theoretical and practical implications will be presented.

P.140  Current information needs and preferred communication channels in municipalities affected by the Fukushima nuclear accident. Sato A*; United Nations University, Institute for the Advanced Study of Sustainability   akiko.sato@unu.edu

Abstract: This study concerns issues of information demand and supply in the aftermath of the March 2011 accident at the Fukushima Daiichi Nuclear Power Plant. It has been repeatedly emphasized, even before the accident, that effective risk communication should be understandable and actionable; progress in implementation is, however, extremely limited. Previous research suggests that mismatches existed between information demands and available or easily accessible information. Importantly, the circumstances of the affected individuals and communities have changed and greatly diversified during the past four years. There are also indications that deep-rooted norms related to gender, family, and social life influence individual radiation risk perception, as well as indications of the “information vulnerability” of certain population segments, such as the elderly. These factors may contribute to the information gaps. To this end, this study attempts to assess current information needs and preferred communication channels among the affected population through consultations with relevant scholars and practitioners and through semi-structured interviews with key stakeholders (life-support counselors, public health nurses, civil society organizations, and neighborhood councils of temporary housing complexes) in four municipalities at different stages of decontamination work and evacuation-order lifting. The presentation will discuss whether and how information needs and preferred communication channels differ by geographical origin and by social factors such as age and gender. The findings will be used to develop policy recommendations for effective information and message development and delivery, in order to enable the affected individuals to make well-informed decisions for their livelihood reconstruction and well-being, and to strengthen collaborative efforts for disaster recovery among actors engaged in risk communication in relation to the Fukushima nuclear accident.

P.141  Analyzing the Discourse of Trust in Post-Spill Charleston through Local Newspapers. Song H*; Cornell University   hs672@cornell.edu

Abstract: Although many studies in risk communication have focused on the effects or predictors of trust through survey or experimental methods, few studies have explored how news media construct the public discourse of trust. This study used quantitative content analysis to explore how local newspapers in Charleston, WV used keywords related to trust or distrust in articles covering the Elk River chemical spill of January 2014. Consistent with the general assumption in the risk communication literature, the general public was most often cited as the party making the judgment of trust (i.e., the trustor). However, in conflict with literature assuming that the targets of trust judgments should be specific, the majority of trust judgments were made against abstract non-human targets (e.g., water) or without any reference to a specific target at all (i.e., trustees). Also, as expected, nearly three quarters of the trust words described situations where trust was damaged at the moment, indicating a relationship of distrust (i.e., negative trust valence). Trustees representing or affiliated with businesses, especially the water utility company, were more likely to be portrayed with negative trust valence than non-business trustees. In contrast, governmental agents were more likely to receive positive trust judgments than non-governmental trustees. Although the two local newspapers analyzed had contrasting political orientations, they did not differ in their likelihood of covering the trustworthiness of business or governmental trustees, nor did the valence of trust attached to the two types of trustees differ between the two newspapers. Findings suggest that risk communication research should pay more attention to the specificity of trust judgments in everyday discourse. The public may be ascribing trust or distrust to an entire system or final product resulting from collaboration between the private and public sectors rather than to each separate risk-managing entity.

P.142  A Longitudinal Study of Electronic Cigarette Use Among College Students. Trumbo CW*, Kim SJ, Harper R; Colorado State University   ctrumbo@mac.com

Abstract: Several studies have reported that virtually all college students have heard of e-cigarettes and 13%-15% have tried them. In this population of young adults there is evidence that e-cigarettes are providing a path to nicotine dependence and possibly uptake of smoking. College students are of specific interest for several reasons. Half of young adults in the U.S. attend a college or university. Studies have shown a high degree of social acceptability for e-cigarette use by college students. While longitudinal data on e-cigarette use by college students is as yet limited, research has shown an increasing trend. Associated trends in students’ use of cigarettes (declining) and hookah (increasing) suggest that the college population is oriented toward uptake of non-cigarette delivered nicotine. This investigation was grounded in three theoretical perspectives: the Theory of Reasoned Action, the Heuristic-Systematic Model, and cognitive-affective risk perception. Multi-item measures were used for each concept with good to excellent reliabilities. The study made use of two data collections, each using identical measures and populations. The first data collection was accomplished in fall 2013 using an online survey presented as an extra credit activity in an undergraduate large lecture class serving a broad cross-section of students on campus (N = 309). The second data collection was completed very recently at the end of the spring 2015 semester, sampling from the same course pool and using the same methods (N = 398). Preliminary analysis shows that the predictive relationships demonstrated in the first survey were replicated in the second, with information processing predicting risk perception, which predicts intention to use an e-cigarette. The percentage of respondents aware of e-cigarettes remained very high (95%) while the percentage of respondents who had tried an e-cigarette increased from 14% to 20%. Further analysis is in progress.

P.143  Who trusts the government? The relationships between trust in sources of information, risk perception and disaster preparedness in Canada. Yong AG*, Beaudry M, Lemyre L, Pinsent C, Dugas T, Krewski D; University of Ottawa   ayong089@uottawa.ca

Abstract: Trust is an important element in risk governance, communication, and management, because individuals are more likely to follow emergency directives from trusted information sources. To further understand whom the Canadian public trusts, we investigated how self-reported trust in different sources of information is associated with socio-demographic characteristics, use of information sources, risk perception, and disaster preparedness by analyzing data from a nationally representative Canadian survey (N = 3,263). Factor analyses of trust in six information sources (friends and family, social media, traditional media, official internet webpages, government, and experts) revealed that trust was a unidimensional construct, although trust levels for different information sources differed by socio-demographic group. For example, men reported a higher level of trust in government than women, and immigrants reported higher levels of trust in social media and government than Canadian-born individuals. Results also showed that trust in an information source predicted a higher likelihood of receiving information from that source. The level of trust in different information sources differentially predicted disaster preparedness behaviours. For instance, trust in social media predicted a decreased likelihood of first aid or CPR training, whereas trust in official internet webpages predicted an increased likelihood of this preparedness behaviour. Individuals who reported higher trust in social media were significantly more likely to have heightened risk perception for flooding, whereas individuals with more trust in traditional media reported lowered risk perception. Findings suggest that the relationships among trust, risk perception, and behaviours are multifaceted. Hence, targeting and tailoring risk messages to selected social groups by type of information source may be invaluable. Theoretical and practical implications will be discussed.

P.144  Foresight tools for responding to cascading effects in a crisis. Sellke P*; Dialogik   sellke@dialogik-expert.de

Abstract: Cascading effects pose a major challenge in crisis and disaster situations. The effects can be technical as well as social, affecting unexpected systems or triggering unwanted behavior by the public. Especially if the degree of technical and social preparedness is low, cascading effects might lead to major negative consequences, i.e., more severely damaged infrastructure, higher losses, and a longer recovery time. A model is proposed to intervene in current crisis response practices by bridging the gap between over-reliance on unstructured information collection on one side and a lack of attention to the structural, communication, and management elements of cross-border and cascading crisis situations on the other. Information and communication tools and models are discussed that are intended to assist stakeholders in evaluating what information is significant and relevant. These new approaches are part of the research project FORTRESS and will be systematically built using evidence-based information from historical crisis case studies, comprehensive analysis of the relationships between systems, and system and sensitivity information from current crisis management contexts and practices in four system simulations. This will make it possible to build a collaborative and accessible modeling platform for cascading and cross-border effects in a range of crisis situations.

P.145  “Weight-of-Evidence” Risk Messages about Genetically Modified (GM) Foods: Persuasive Effects and Motivated Reasoning. Vianna B*, Clarke CE; George Mason University   cclark27@gmu.edu

Abstract: Public perception and scientific discourse diverge for a variety of risk issues, and there is growing scholarly interest in using news media to convey the “weight of evidence” for issues where evidence supports a particular conclusion. This article extends this research by examining (1) genetically modified (GM) foods as a new case study; (2) two novel dependent variables (audience beliefs related to GM food safety, and the certainty with which those beliefs are held); and (3) political ideology as a moderator of these message effects. A messaging experiment (n = 176) revealed that weight-of-evidence information emphasizing the safety of GM foods for human consumption heightened participants’ views that they are safe and affected the strength of their conviction, depending on pre-exposure views. However, perceived scientific uncertainty (i.e., believing that scientific evidence was conflicting) did not mediate, and political ideology did not moderate, either of these direct effects. We discuss risk communication implications.

P.147  Understanding of Risk and Media Literacy. AOYAGI M*; National Institute for Environmental Studies   aoyagi@nies.go.jp

Abstract: In considering risk communication practices among the general public, people’s media use is crucial. We focus on what we call “media literacy”, defined as the ability to access, analyze, evaluate and use various media. Using nationally representative data, we discuss the importance of media literacy in risk communication practices and governance. We carried out a nationwide public opinion survey of nationally representative Japanese adults aged 20 to 74 in October 2014. Effective responses were 1,548 out of 4,000 samples contacted. Using these data, we analyzed people’s media exposure and literacy on risk issues. 1) We asked about “information sources for daily news events, including risk issues.” 92.6% of our respondents chose television programs and 70.2% printed newspapers, while 24.0% chose online news sites, including those of newspaper publishers. These were followed by family and friends (21.3%), radio (20.4%), magazines (12.3%), and SNS (12.0%). About 23% of our respondents answered that they did not use the internet at all. 2) We then asked about “the most trusted information sources for environmental issues.” The most chosen item was journalists (45.5%), followed by professors/experts (27.1%), national government (24.6%), local government (23.1%), and environmental NGOs (21.3%). 3) We posed two quiz questions concerning the science of radioactivity. We investigated the relationship between the information sources respondents use, their trusted information sources, and their quiz answers. The results show a clear relationship among them. Respondents who chose the wrong options were more likely to trust “journalists” and “SNS posts by non-experts”, while those who chose the right options were more likely to trust “professors/experts” and to read newspapers. These relationships are statistically significant by Pearson’s chi-squared test. 4) We also asked our respondents whether they would choose food from regions with possible radioactive contamination. Anxiety level is closely related to the choice of information sources and to purchasing behavior.
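
To illustrate the kind of test reported in point 3), the following minimal Python sketch runs a Pearson chi-squared test of independence between trusted information source and quiz result; the contingency table below is hypothetical, not the survey's actual counts.

    # Minimal sketch: Pearson chi-squared test of independence between
    # trusted source and quiz result. Counts are invented for illustration.
    from scipy.stats import chi2_contingency

    #        quiz right  quiz wrong
    table = [[310, 110],   # trusts professors/experts
             [150, 180]]   # trusts journalists / SNS posts by non-experts
    chi2, p, dof, expected = chi2_contingency(table)
    print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p:.2e}")

A significant p-value here, as in the survey, would indicate that quiz performance and trusted source are not independent.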

P.148  Risk Assessment on the Legibility of the Prescriptions by Medical Practitioners in Quezon City, Philippines. Mallare ANLB, Sanchez NADG, Tolentino RMS*, Resurreccion JZ; University of the Philippines, St. Luke's Medical Center   nadgsanchez@gmail.com

Abstract: Medication errors are unintentional errors in the overall prescription and administration of drugs, commonly attributable to the illegibility of a medical practitioner’s handwriting. The purpose of this study is to evaluate the legibility of the handwriting of doctors in both private and public hospitals in an area of study where automation of prescriptions is not available, and to assess the risks that it may entail for patients. Error-prone drugs, abbreviations, and symbols were collected from the Institute for Safe Medication Practices to compose a preset prescription, which was written out by a randomly selected pool of licensed surgical, medical, and intensive care physicians in Quezon City, Philippines. Each prescription was assessed by a pharmacist, a young adult (20-34 years old), a middle-aged adult (35-49 years old), and a senior citizen (50 years old and up). The rating and frequency of the assessors’ misinterpretations of the prescriptions were tabulated. The factors significantly affecting the legibility of the doctors’ handwriting (i.e., the assessor’s nature, the doctor’s general specialization, and the prescription components) were then subjected to an ANOVA at a 95% confidence level and to Tukey’s test. Results show that pharmacists achieved better comprehension than the other assessing groups, and that abbreviations contributed the most errors among the prescription parts. A fault tree summarizing the event of wrongly dispensed drugs was developed, in which the assessors, the doctor’s specialization, and the prescription parts influenced the frequency of errors 21.3%, 65.4% and 84.9% of the time, respectively. By ensuring that only pharmacists are allowed to dispense drugs to patients, the contribution of the other assessing groups to the total error would be nullified, reducing the risk from 21% to 5%. These findings may be used to formulate policies and strategies for writing medical prescriptions to reduce their risks.
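
As a hedged illustration of the statistical step described above (a one-way ANOVA across assessor groups followed by Tukey's test), the Python sketch below uses invented legibility scores; the group sizes and values are placeholders, not the study's data.

    # Minimal sketch: one-way ANOVA across assessor groups, then Tukey's HSD
    # for pairwise comparisons (tukey_hsd is available in recent SciPy).
    from scipy import stats

    pharmacists = [9, 8, 9, 7, 8]   # hypothetical legibility scores
    young       = [6, 5, 7, 6, 5]
    middle      = [6, 6, 5, 7, 6]
    seniors     = [4, 5, 4, 6, 5]

    f, p = stats.f_oneway(pharmacists, young, middle, seniors)
    print(f"F = {f:.2f}, p = {p:.4f}")   # reject at alpha = 0.05 if p < 0.05

    print(stats.tukey_hsd(pharmacists, young, middle, seniors))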

P.149  Nuclear Energy in the Media: Examining How Fukushima Influenced Debates over the Future of Nuclear. Bell MZ*, Yang ZJ; State University of New York At Buffalo   mzbell@buffalo.edu

Abstract: Although the ‘nuclear renaissance’ of the 1970s was never quite fulfilled, nuclear power continues to be a stable source of low-carbon energy across the world, including in the US. However, the continued reliance on nuclear energy has not been without controversy, with issues tied to waste storage, proliferation, nuclear safety and public risk perception. Although risk perception has its own cognitive foundations, it can be tied to media representations of nuclear energy, which are often cited as either heavily scaremongering or deceptively rose-tinted. Over the years, several media content analyses have been conducted, often centering on risk events such as Three Mile Island or Chernobyl. This paper continues that legacy, taking the 2011 Fukushima Daiichi incident as a timely opportunity to explore the ways in which a new nuclear risk affects the representation of ‘nuclear energy’, and asking whether representations in three major US newspapers differed before and after the Fukushima accident. Results show that these newspapers overall portrayed nuclear energy in a negative light, but coverage was more negative in 2011. There were also shifts in the amount of renewable energy and fossil fuel mentions after 2011, which suggests that the disaster engendered debate over the future of nuclear power. Lastly, there also appears to be evidence of a risk tradeoff between nuclear and climate change risks. Fukushima additionally introduced natural hazards into the debate about nuclear energy, a theme that was minimal prior to this incident. Overall, the results of this content analysis indicate that, albeit in a limited way, Fukushima had some impact on the newspaper discourse surrounding nuclear energy.

P.151  Proposal for a constructivist model of "communication-uncertainty" and a typology according to the nature of uncertainty. Camin JM*; Université Michel de Montaigne Bordeaux 3   jmcamin@orange.fr

Abstract: While the main activities of a project manager are carried out through the communication process, it is observed that many projects are subject to delays, cost overruns, or specification defects. An excess of measures to prevent risk? Defective management of communication that leaves too much room for uncertainty? The Uncertainty Reduction Theory developed by Berger and Calabrese (1975) in the field of communication does not fully explain how a project dissipates the uncertainty existing between actors. By revisiting an operational project within the framework of action research, we strive to identify how uncertainty and communication mutually influence and shape each other. We used the constructivist approach and the actor-network theory of Callon and Latour (1981, 1984, 2006, 2007) to capture the meaning of this circular relationship. We present one of the results of our thesis: the communication process used in the construction of the network differs according to the nature of the uncertainty encountered or felt. By positioning uncertainty as a socially constructed phenomenon, we present a constructivist model of "communication-uncertainty" in which the observer is an intentional actor limited by constraints (Boudon, 2009). We propose a typology in the field of communication, consistent with the proposals of researchers in other fields (Klir, Ayyub, Walker, Rowe, Hoffman), distinguishing the nature of uncertainty as follows: variability uncertainty (the inherent variability of things), epistemic uncertainty, ambiguous or not (due to the imperfection of our knowledge), and scale uncertainty (related to the imperfection of our models of representation).

P.152  Communicating Risk in Disaster Risk Management Systems-A Study Based on Developing and Utilizing National Risk and Vulnerability Assessments Undertaken in Sweden. Lin L*; Lund University   lexin.lin@risk.lth.se

Abstract: Disaster Risk Management (DRM) requires a wide variety of stakeholders, from different functional sectors and across geographical boundaries, to be actively involved in anticipating, preparing for and responding to disaster risks. Since many risks human beings face today are complex and trans-boundary, stakeholders from different administrative levels and with various disciplinary backgrounds in DRM depend on each other to generate risk-related information and assemble a holistic picture for assessing and managing disaster risks. Therefore, the exchange and sharing of risk-related information between stakeholders becomes a key issue that is likely to influence the success of DRM activities. The purpose of this study is to explore communication-related issues between stakeholders in the DRM system. Using the multi-level, multi-stakeholder Swedish DRM system as a case, the author investigates the communication of risk-related information between national-level stakeholders during the process of developing and utilizing Risk and Vulnerability Assessments (RVAs). Empirical data were collected through 20 semi-structured interviews involving 15 national-level authorities in the Swedish DRM system, aiming to answer the research question: What are the perceived challenges at the national level of the Swedish DRM system concerning information sharing and stakeholder collaboration in the development and utilization of RVAs? The major conclusions were the lack of a constructive feedback mechanism, resulting in seemingly one-way communication in the DRM system; the limited private-stakeholder participation; and the lack of collaboration among stakeholders from different functional sectors. The communication challenges identified in this study are important and general. They can be relevant for both researchers and practitioners as they reflect upon the practice of risk communication in a multi-stakeholder DRM system.

P.153  Risk Perception in User-Centered Product Design. Seligsohn EN*, Wang Y; Georgia Institute of Technology   yan_wang@gatech.edu

Abstract: When customers decide which product to buy, they typically consider the perceived risks associated with the purchase. It is important to incorporate the customer’s perception of risks in a user-centered product design paradigm. Objective risk is typically quantified as the combination of probability and consequence, whereas subjective risk is an individual’s feeling about the importance of an amount at stake and the subjective certainty that he or she will gain or lose the stake. Consumers’ purchase decisions are directly related to perceived risks concerning finance, safety, reputation, etc. Existing research on risk perception in product development focuses on warning label design to ensure that consumers are aware of product safety and potential hazards. In this research, the effects of the risk perceptions of both consumers and product engineers on design are studied. It is common for consumer products to appear safer than they are because the product design does not accurately project the true risk. It is also seen that design standards and the involvement of insurance companies can affect the risk perception of consumers.

P.155  The Policies and Politics of Science Education: The Environmental Literacy Improvement Act. Herovic E*; University of Kentucky    eherovic89@gmail.com

Abstract: Appropriate means of teaching science to the public have been under intense study, and debates about educating the populace on climate change have become increasingly contested and controversial. Specifically, science education curricula at the K-12 level are at stake, as there exists a risk of changed dialogue on global climate change in schools. Three states in America are currently pushing the Environmental Literacy Improvement Act, which would require curricula to include the teaching of climate change denial in schools. This paper explains the ELIA, the risk it poses to climate change dialogue, and the politics that surround the act, using Kingdon’s (2003) multiple streams framework and Baumgartner and Jones’ (1993) framework of policy monopolies.

P.156  An Analysis of Japanese Companies’ Litigation against Trade Secret Misappropriation by Insiders. Takizawa K*; Waseda University   kazuko.takizawa@aoni.waseda.jp

Abstract: The number and impact of trade secret misappropriation by insiders (mainly companies’ former employees)—such as showing technological data to competitors or selling client lists to brokerages—has increased in Japan. Many plaintiff companies pursue compensation and sanctions based on the violation of (1) non-competes, (2) non-disclosure agreements, and/or (3) Japan’s Unfair Competition Prevention Act. This study analyzes relevant lawsuits between Japanese companies and their former employees and/or new employees since the year 2000. Non-competition agreements may prevent employees from joining competitors for less than two years; however, the enforceability of such agreements has often been limited, especially in terms of employees’ mobility. Non-disclosure agreements are more common in Japan and may be enforced for longer periods; nonetheless, some judges have concluded that such contracts are void if employers have failed to clarify which information is subject to the agreement. Some leading Japanese companies, such as Toshiba and Nippon Steel & Sumitomo Metal Corporation, have filed lawsuits under the Unfair Competition Prevention Act. The possible expansion of the scope of this act, along with a mitigation of its requirements, has recently been discussed.

P.158  New Breeding Techniques: The Risks of Innovation Versus the Inadequacy of Regulation. Anyshchenko A*, Xiang W; University of Copenhagen   artem.anyshchenko@jur.ku.dk

Abstract: There is a growing debate on whether plants manipulated using the crop breeding tools commonly known as new breeding techniques (NBTs) are genetically modified organisms. Analysis of the legal status of NBTs in different jurisdictions may help elaborate on the issue. There are two approaches to the regulation of GMOs: process-based and product-based. According to the process-based approach adopted in the EU, GMOs are defined as arising from the use of certain specific methods. The product-based approach adopted in the US defines GMOs as possessing a new combination of genetic material that could not have occurred naturally. The latter approach suggests that some plants, even though modified by genome editing, may fall outside the scope of GMO regulation, since they result in a product lacking a transgenic insertion. Opposition to genetic engineering is rationalized, inter alia, by a discourse of risks to health made possible by the release of new organisms into the environment. In Europe, GMOs are associated with higher levels of perceived risk than in North America. However, it is unclear whether NBTs result in GMOs, and the question of an adequate risk policy on NBTs has yet to be answered. Would the resulting products fall under the scope of the existing GMO legislation? In other words, is it legally feasible to separate NBTs from their transgenic counterparts? Does the law adequately reflect the present state of affairs in the field of agricultural biotechnology? EU regulation on GMOs treats NBTs with a degree of uncertainty that makes the law itself unclear and ineffective. Public policies based on the precautionary principle have had limited success in mitigating the risks of adverse environmental change. Risks attributed to biotechnologies on grounds of scientific uncertainty collide with the inadequacy of risk governance. Rapid scientific developments require a new policy for the risks inherent in genome editing, and therefore the issue of the legal status of NBTs deserves careful consideration.

P.159  Wind Turbine Noise and Health: Findings of an Expert Panel. Guidotti TL*; Chair, Panel on Wind Turbine Noise & Health, Council of Canadian Academies   tee.guidotti@gmail.com

Abstract: Wind energy generates an increasing share of electricity in Canada, as elsewhere. In some communities where wind turbine installations are concentrated, concern over health effects from wind turbine noise has been vocal. Concern is primarily focused on health effects of sound at frequencies below the threshold of hearing (“infrasound”). Health Canada charged the Council of Canadian Academies (CCA) with performing an objective assessment of the extant scientific literature. A panel of 10 experts was convened. First, the scientific literature, grey literature, publications of record, and legal filings were examined to develop a list of 32 candidate health outcomes of concern. Then, a systematic search for scientific literature on wind turbine noise and health outcomes yielded 38 relevant references. Deliberations on interpretation emphasized the overall pattern, the convergence of evidence from different sources, and the cumulative weight of evidence, rather than individual observations. The literature as a whole was evaluated for sufficiency of evidence to conclude that there was a causal association, guided by the Hill criteria. The Panel concluded that there was 1) sufficient evidence of a causal association between wind turbine noise and annoyance, defined as “a feeling of displeasure evoked by noise”, which is considered a health outcome under the WHO definition of “health”; 2) limited evidence for sleep disturbance; 3) adequate evidence against an association for hearing loss; 4) inadequate evidence of a direct causal relationship with stress, notwithstanding known relationships observed in community noise studies; and 5) inadequate evidence for all other health effects, including cardiovascular disease, tinnitus, and vertigo. The Panel also observed that although conventional (dBA) sound level measurements do not capture the full spectrum of sound generated by wind turbines, they serve as an acceptable surrogate measure for sound intensity.

P.161  Resilience: Concept and Application to Energy Transformation. Renn O*, Dreyer M; University of Stuttgart   ortwin.renn@sowi.uni-stuttgart.de

Abstract: Resilience is the ability of an entity to cope with stress, crisis and disaster. It includes technical (vulnerable infrastructures, existence of barriers), organizational (institutional coping and adaptive capacity) and social (level of education, awareness and cooperation) elements. These three elements need to be addressed in three different time frames: the preparatory phase before the stress or crisis occurs (prevention, risk reduction, risk mitigation); the coping phase after the stress or crisis strikes (disaster management capacity, recovery capabilities); and post-crisis learning (once recovery has been organized and new coping strategies can be developed). The framework thus proposes a three-by-three matrix with a vertical dimension (technical, organizational and social) and a horizontal dimension (pre-crisis, recovery and post-crisis adaptation). A risk-based analysis includes the creation of scenarios with simulations of unlikely but still possible stress situations and common-mode failures. The process is to identify vulnerabilities, to assess existing coping strategies and to evaluate institutional capabilities to deal with crisis situations. This abstract model of resilience and risk has been applied to the German energy transformation. Germany's National Academies of Sciences initiated an expert working group (chaired by the author) to address the issue of resilience for the ongoing energy transition. The interim results of this study will be presented.

P.163  Representing uncertainties in economic consequences of multiple hazards. Chatterjee S*, Prager F, Chen Z, Rose A; Pacific Northwest National Laboratory   samrat.chatterjee@pnnl.gov

Abstract: This talk focuses on uncertainty characterizations, including mathematical intervals and probability distributions, for the economic consequences of multiple hazards. Uncertainty quantification and propagation are performed using sampling with variance reduction and regression (both least squares and quantile) with stochastic regressors. Consequence probability distributions are developed that may be useful for homeland security policy-makers conducting national risk assessments and for emergency management decision-making.
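
A minimal sketch of one ingredient of this approach, fitting a least-squares and an upper-tail quantile regression of a consequence measure on a stochastic regressor, is shown below with synthetic data; the distributions and coefficients are assumptions for illustration only.

    # Minimal sketch: OLS (mean response) versus quantile regression
    # (90th percentile) with a stochastic regressor, on synthetic data.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    x = rng.lognormal(mean=0.0, sigma=0.5, size=500)  # stochastic regressor
    y = 2.0 * x + rng.gumbel(0.0, 0.5, size=500)      # consequence proxy
    X = sm.add_constant(x)

    ols = sm.OLS(y, X).fit()               # central tendency
    q90 = sm.QuantReg(y, X).fit(q=0.9)     # upper tail of consequences
    print(ols.params, q90.params)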

P.165  Risk-Informed Strategic Decision Making: Adapting to Meet New Realities. Rouse JF*; Joint Staff, Arete Associates   jrouse@arete.com

Abstract: The Chairman's Risk Assessment has informed national-level strategic decision making for fifteen years. Over this time, its methodology has gone through periods of both slow maturation and rapid change, driven by a dynamic strategic environment, complex security challenges, and transformed U.S. strategy. Additionally, as major processes within the Department of Defense changed, the risk assessment methodology adjusted to keep pace with and inform those decisions, and to nest them and their associated risks within a more comprehensive approach to risk governance. Accordingly, the Joint Staff recently completed a major revision of the Chairman’s Risk Assessment methodology. These methodology changes, the history and rationale behind them, and the institutional challenges encountered will be of interest and use to national security professionals who perform risk assessments, develop strategy and policy, or supervise strategic processes.

P.166  Recovery Scenarios from Various Types of Attacks. Russell D*; Global Environmental Operations, Inc   dlr@mindspring.com

Abstract: Much attention has been focused on attack scenarios for municipalities, most directed at infrastructure or the general population. This presentation discusses how some low-cost precautions and planning can provide substantial mitigation for some likely scenarios and some uncommon ones. Scenarios include 1) a water system attack, 2) a dirty bomb attack, and 3) a chemical attack. The discussion focuses on the likelihood of success, the response to and cleanup from the attack, and the logic of which scenarios are likely to be more or less effective.

P.167  Hazard Assessment of Selected Flame Retardant Chemicals of Importance to National Defense. Rak A*, Vogel CM, Bass N; Noblis Inc. and US Army Public Health Command   andrew.rak@noblis.org

Abstract: The Department of Defense’s (DoD’s) Chemical and Material Risk Management (CMRM) Program has a well-established three-tiered process for over-the-horizon scanning for emerging contaminants, conducting qualitative and quantitative impact assessments in critical functional areas, and developing sound risk management options. This “Scan-Watch-Action” process was used to examine potential risks from selected brominated flame retardants (FRs). Subject matter experts (SMEs) from throughout the DoD used the Emerging Contaminants Assessment System (ECAS) tool to evaluate the potential risks to the DoD associated with these mission-critical chemicals. Members of the CMRM Program team used the Impact Assessment Criteria Assessment Tool (ICAT) to analyze the SME input. Together, these two groups developed a set of initial risk management options (RMOs) to be considered within the DoD. The risks identified by the SMEs and the potential RMOs for each FR chemical are presented for each of five functional areas. The uncertainties in the SMEs’ risk estimates are also discussed, and recommendations for further analysis are presented. The assessment concludes that selected FRs require risk management actions to mitigate possible risks to operations and maintenance, including additional research into safer alternatives that meet performance requirements.

P.168  Human Factor Trust Framework within Holistic Cyber Security Risk Assessment. Cains MG*, Henshel D, Hoffman B, Oltramari A; Indiana University, Army Research Labs, Carnegie Mellon University   marianacains@gmail.com

Abstract: As part of a continuing effort to develop a holistic cyber security risk assessment framework and model, a characterization of human factors, including human behavior, is needed to understand how the actions of defenders, users, and attackers affect cyber security risk. We have developed an initial framework for incorporating trust as a factor/parameter within a larger characterization of the human influences (users, defenders and attackers) on cyber security risk. The work group developing this new cyber security risk assessment model and framework has chosen to distinguish between trust and confidence by using "trust" only for human factors and "confidence" for all non-human factors (e.g., hardware and software), in order to reduce confusion between the two concepts within our larger holistic cyber security risk assessment framework. The presented trust framework details the parameter relationships that build trust in cyber defenders and the parameter metrics used to measure trust. Trust in the human factors is composed of two main categories: inherent characteristics, those that are part of the individual (personality, motivation, rationality/irrationality, benevolence/malevolence, integrity, expertise, and attention/awareness), and situational characteristics, those outside the individual (authorized or unauthorized insider access). The use of trust as a human factor in holistic cyber security risk assessment will also rely on understanding how differing mental models and risk postures impact the level of trust given to an individual, and the biases affecting the ability to give that trust. The ability to understand how trust is developed and given within the context of cyber security will facilitate the development of a more holistic and predictive risk model that effectively incorporates human factors into the risk equation.

P.170  Framing Risk Assessment of Complex Systems. Henshel DH*, Cains MG, Hoffman B; Indiana University and Army Research Laboratory   dhenshel@indiana.edu

Abstract: The world is full of complex systems, both natural (the human body, an ecosystem) and anthropogenic (cyber systems). Complex systems are characterized by having many components, often in both hierarchical and parallel relationships, exhibiting feedback and feed-forward interactions in addition to classical cause-and-effect interactions. Modeling risk in complex systems faces the following difficulties: the need to combine metrics that are determined using very different units; metrics that exist at multiple scales; serial risks that are introduced by serial interactions within the system or that “ripple out” from the initial system stressor; dynamic interactions that alter the system itself as well as the risk in the system; and multiple types of variables to include in the model, including some that are primary risk (type-of-impact) metrics, some that are magnitude metrics, and some that contribute only a weighting value to other metrics. The ideal approach to modeling risk in a complex, dynamic system (such as current vulnerabilities in a cyber network) would be to calculate the risk as a “living process”, responsive to the current state of the system and capable of recalculation when new data are available (e.g., new detection data) or when the system state has changed. The standard engineering approach to risk assessment has been to simplify first and then slowly integrate complexity into the system. This approach, however, does not enable responsive risk modeling of a complete cyber network. Our approach, focusing on cyber defense, is to start by defining the universe of the complex system, characterizing and identifying the variability and uncertainty in each risk-related variable, enumerating metrics, and then determining risk as relevant to the risk management question at hand. This paper presents an example of this model in action using a discrete cyber mission.

P.171  When the Presidential Candidate Is No Different from Ordinary People: Revisiting the “Weakest Link” in the Cyber Security Chain. Nguyen KD*, Rosoff H, John RS; University of Southern California   hoangdun@usc.edu

Abstract: It is widely agreed that humans are the weakest link in the cyber security chain. Yet the security of any cyber infrastructure largely depends on the participation of end-users in practicing safe behaviors. Nonetheless, the extent to which users take precautionary actions against cyber risks is conditional upon their perceptions of the risks and of the effectiveness, cost, and benefit of the self-protective measures. The current study examined this issue by eliciting the trade-off values, or “security premiums”, that users were willing to sacrifice to protect their information security in a phishing context. We also tested the generalizability of the security premiums by examining the effect of various usage contexts on the premiums. Methodologically, we asked users to make trade-offs between pairs of attributes including security, cost, latency, and productivity, from which we could quantify the security premiums. The results indicated that half of the respondents were willing to pay a premium of $9 to $11 per month, to wait 8 to 9 additional minutes, and to forgo access to 21 to 39 legitimate pieces of information to obtain a more effective phishing filter that reduces the number of phishing attempts from 24 to 6. Importantly, the security premiums that users were willing to sacrifice for information security protection significantly predicted how frequently users scanned their computers for viruses, controlling for demographic factors, perceived cyber risk, perceived severity of cyber threats, and belief in computer security self-efficacy. Interestingly, the value of information security protection was partly contingent upon the usage context, such that social media use invoked greater security premiums than email use and web surfing. These results increase our understanding of how and why people engage in self-protective cyber behaviors, and offer valuable implications for the design of a more usable information security system.

P.172  Life-Cycle Assessment of Dredged-Sediment Management Alternatives. Bates ME*, Fox-Lent C, Seymour L, Wender BA, Bridges TS, Linkov I; US Army Corps of Engineers, Engineer Research and Development Center, Environmental Lab; Massachusetts Institute of Technology; Arizona State University   matthew.e.bates@usace.army.mil

Abstract: Managing dredged sediments in a way that properly balances environmental risks and public benefits is often a point of controversy between and among federal agencies and stakeholders. Current decision making includes environmental criteria, but is often limited to those measuring local or immediate effects. Specifically, the variety of distributed and long-term impacts resulting from transportation by truck or barge, use of loading equipment, and long-term site management has implications for climate change, air quality, non-renewable resource consumption and other factors that affect human and ecosystem health. Life-Cycle Assessment (LCA) is a method of accounting for a wider range of impacts and benefits than is included in most risk assessment strategies. By developing an LCA comparing dredged-material management strategies, we show how fuller criteria can be included in future sediment-management decisions. This paper applies LCA to dredged-sediment management through a comparative analysis of potential upland, open water, and containment-island placement sites in the Long Island Sound region, NY/CT.

P.173  Balancing Research and Funding using Value of Information and Portfolio Tools for Nanomaterial Risk Classification. Bates ME*, Keisler JM, Zussblatt NP, Plourde KJ, Wender BA, Linkov I; US Army Corps of Engineers, Engineer Research and Development Center, Environmental Lab; University of Massachusetts Boston; University of California Santa Barbara; Arizona State University   matthew.e.bates@usace.army.mil

Abstract: Currently, risk research for nanomaterials is prioritized through expert workshops and other deliberative processes. However, analytical techniques that quantify and compare alternative research investments are increasingly recommended. Here we apply value of information and portfolio decision analysis – methods commonly applied in financial and operations management – to prioritize risk research for multiwalled carbon nanotubes and nanoparticulate silver and titanium dioxide. We modify the widely accepted CB Nanotool hazard evaluation framework, which combines nano- and bulk-material properties into a hazard score, to operate probabilistically with uncertain inputs. Literature is reviewed to develop uncertain estimates for each input parameter and we apply Monte Carlo simulation to assess how different research strategies can improve hazard classification. The relative cost of each research experiment is elicited from experts; this enables identification of efficient research portfolios – combinations of experiments that lead to the greatest improvement in hazard classification at the lowest cost. Nanoparticle shape, diameter, solubility, and surface reactivity were most frequently identified within efficient portfolios in our results. This type of analysis can be usefully applied by researchers and funding agencies trying to most efficiently allocate limited R&D resources.
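
The probabilistic scoring step might look something like the sketch below, in which uncertain material properties are sampled and propagated through a simple additive hazard score; the sub-scores, distributions, and threshold are hypothetical stand-ins, not the CB Nanotool's actual scheme.

    # Minimal sketch: Monte Carlo propagation of uncertain inputs through an
    # additive hazard score, tallying the high-hazard classification rate.
    import random

    def sample_hazard_score():
        toxicity   = random.triangular(0, 10, 6)  # uncertain sub-scores
        solubility = random.triangular(0, 10, 3)
        reactivity = random.triangular(0, 10, 5)
        return toxicity + solubility + reactivity

    n = 100_000
    high = sum(sample_hazard_score() > 15 for _ in range(n))
    print(f"P(classified high-hazard) ~ {high / n:.3f}")

Narrowing one input distribution (say, after a solubility experiment) and re-running the simulation indicates how much that research would sharpen the classification, which is the intuition behind the value-of-information comparison.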

P.174  Multi-pollutant health risk assessment for industrial sectors in Canada. Jessiman B, Colas G, Dinu T, Hancock-Chen T, Judek S, Lyrette N, Raymond P, Willey JB*; Health Canada   jeff.willey@hc-sc.gc.ca

Abstract: Health Canada (HC) has initiated a novel approach to characterizing the complete emissions profiles and resulting exposures and health risks to nearby populations for several Canadian industrial sectors. Three specific facilities for each sector were selected to undergo atmospheric dispersion modelling. These facilities were meant to be representative of the sector with respect to raw materials, technologies and emissions. Detailed data were derived and retained from the modelling, including a full range of percentiles for the predicted ground-level concentrations of a complete suite of emitted air pollutants within 30 x 30 km study areas. Population data were incorporated to estimate exposures and analyze risk via single pollutant and multipollutant methods, using air quality standards and guidelines as well as other techniques, including a Hazard Index approach. This sector-based study is intended to provide HC with a better understanding of specific air emissions and exposures as a result of industrial activities, and to make linkages with health impacts. Assessment of the emissions affecting air quality in the vicinity of industrial sectors will be useful to both industry and governments in developing strategies to protect human health.

P.175  Evaluation of Risk-Based Microbiological Criteria for Campylobacter in Broiler Carcasses in Belgium Using TRiMiCri. Seliwiorstow T, Uyttendaele M, De Zutter L, Nauta MJ*; Faculty of Veterinary Medicine, Ghent University, Belgium; Faculty of Bioscience Engineering, Ghent University, Belgium; National Food Institute, Technical University of Denmark   maana@food.dtu.dk

Abstract: Campylobacteriosis is the most frequently reported foodborne zoonosis worldwide. Consumers’ exposure to Campylobacter might be reduced by establishing a microbiological criterion (MC) for Campylobacter on broiler meat. In the present study two possible approaches were evaluated, using TRiMiCri, the freely available software tool for risk-based microbiological criteria (http://tools.food.dtu.dk/trimicri). The first is the traditional approach, which applies a microbiological limit (ML-MC); the second is based on a relative risk estimate (RRL-MC). The analyses were based on quantitative Campylobacter data collected from 28 Campylobacter-positive batches processed in 6 Belgian broiler slaughterhouses. To evaluate the performance of ML-MC, n = 6 samples were used, with different values of c (0, 1, 2) and m (100; 1,000; 10,000). Results showed that more than 90% of Campylobacter-positive batches were non-complying (NC) with the strict ML criteria based on m = 100, for all applied values of c. The RRL approach requires a baseline risk, which was estimated from the Campylobacter baseline data collected in Belgium in 2008. Approximately 60% of the evaluated Campylobacter-positive batches accounted for a higher risk than the baseline risk. For both approaches, application of less stringent criteria results in a lower percentage of NC batches and a higher minimum relative residual risk (MRRR; the change in risk when all batches are sampled and all NC batches undergo treatment that effectively eliminates Campylobacter, so that they are replaced by zero-risk batches). It was also observed that the number of samples (n) had little effect on the risk estimates. Additionally, the results from ML-MC and RRL-MC follow the same curve when the percentage of NC batches is plotted against MRRR, although for RRL-MC the percentage of NC batches was lower and the MRRR higher. To conclude, the results indicate that TRiMiCri is a useful and user-friendly tool for making risk-based decisions on the choice of an MC.
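
For readers unfamiliar with ML-MC sampling plans, the compliance rule has a compact form: a batch with n samples complies if at most c of the measured counts exceed the limit m. The sketch below is a minimal illustration with invented counts, not data from the Belgian batches.

    # Minimal sketch of an (n, c, m) microbiological-limit check.
    def complies(counts, m=1000, c=1):
        exceedances = sum(1 for x in counts if x > m)
        return exceedances <= c

    batch = [120, 850, 2300, 90, 400, 1500]  # n = 6 hypothetical counts
    print(complies(batch, m=1000, c=1))      # False: two counts exceed m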

P.176  Self-participation in Desertification: A Study on Risk Perception and Coping Behaviors. Zhou Y*, Song Y, Tian J; Peking University and Carnegie Mellon University   humoristecharles@gmail.com

Abstract: Many recent studies have shown that human activities play a crucial role in the desertification process. Our current project aimed at understanding people's risk perception of, and coping behavior toward, desertification caused by human activities. We had two major goals. First, we were interested in whether lay people recognize the influence of human activities in general, and of their own participation, in exacerbating desertification. Second, while current desertification control policies mostly target areas under potential desertification threat, research on risk perception usually recruits and assesses only people from areas suffering moderate to high desertification. We specifically collected risk perception data from people living in areas with different degrees of desertification to fill this gap. In our first study, we recruited 400 participants from three counties with different levels of desertification in Ningxia, China, and asked them to rate the riskiness of, as well as psychological factors for, 10 causes of desertification (including both natural and human causes). In particular, we proposed a new set of psychological factors, which we labeled self-relevancy; we expected self-relevancy to capture people's own perceived participation in desertification. We found that, in contrast to the scientists' view, lay people in all desertification areas acknowledged the influence of human activities and of their own participation in desertification. In our second study, we investigated whether people from areas with the three degrees of desertification would show distinct risk perceptions, spontaneous coping behaviors, self-relevancy, or willingness to accept policies addressing human-caused desertification that might be detrimental to their own interests. We found that people from potential desertification areas show lower risk perception, fewer spontaneous coping behaviors, lower self-relevancy, and less willingness to accept such policies. Our results help promote effective risk communication and encourage the implementation of more coping policies.

P.177  Alaska Specific Calculator Tool for Addressing Risk Based Human-Health Cleanup Levels. Galloway LD*, Wu T, Dolislager FD, Stewart DJ; University of Tennessee, Knoxville and State of Alaska DEC Contaminated Sites   galloway@utk.edu

Abstract: The Alaska Cleanup Level Calculator (AKCALC) was developed to assist Alaska in setting Method 2 human health cleanup levels for its list of regulated chemicals. A number of Alaska-specific features were incorporated into this tool to enable it to comport with the Alaska Department of Environmental Conservation's regulatory approach for establishing cleanup levels. AKCALC is based on the inputs and equations used by the EPA's Regional Screening Level (RSL) calculator, with Alaska-specific parameters. The output provides cleanup values for soil in three different Alaska regions, for groundwater, and for migration to groundwater. An additional tool (AKRISK) was developed to enable the user to calculate cumulative risk. The new tools will be useful in setting regulatory cleanup standards for contaminated sites in Alaska through the regulatory process.
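
As a hedged sketch of the general form of equation such calculators use (here, a noncancer screening level for incidental soil ingestion), the snippet below uses generic default-style parameter values; these are illustrative assumptions, not AKCALC's Alaska-specific inputs.

    # Minimal sketch: noncancer soil screening level for incidental soil
    # ingestion, in the general RSL-style form. All values are illustrative.
    def soil_screening_level(thq=1.0,        # target hazard quotient
                             bw=15.0,        # body weight, kg (child)
                             at_days=6*365,  # averaging time, days
                             ef=350.0,       # exposure frequency, days/yr
                             ed=6.0,         # exposure duration, yr
                             ir_soil=200.0,  # soil ingestion rate, mg/day
                             rfd_o=3e-4):    # oral reference dose, mg/kg-day
        # SL (mg/kg) = THQ*BW*AT / (EF*ED*IR*(1/RfD)*1e-6 kg/mg)
        return (thq * bw * at_days) / (ef * ed * ir_soil * (1.0 / rfd_o) * 1e-6)

    print(round(soil_screening_level(), 1))  # illustrative result, mg/kg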

P.178  Forensic Investigation Style of an Unexpected Large-Scale Urban Disaster: The November 10, 2001 Algiers Floods and Debris Flow. Benouar D*, Zelloum H, El Hadj F; Universite USTHB   dbenouar@usthb.dz

Abstract: This paper attempts to present the impact of the unexpected floods and debris flow that struck the city center of Algiers (capital of Algeria) on November 10, 2001. According to official reports, this event caused the loss of 712 human lives, injured 350 people and left 116 missing. Some 1,800 housing units were damaged, and 56 schools, several bridges, roads and public works suffered considerable damage. The streets of the affected area were covered by the debris flow, which deposited more than 800,000 m3 of mud and debris. More than 350 cars were damaged, and several of them, as well as buses, were buried with their passengers under the mud and debris. Unfortunately, there is a great deficit in ongoing research on how science is used to facilitate social and political decision-making in the context of risk, particularly for unexpected disasters. Naturally, this requires an integrated approach of research and policy-making across all hazards and disciplines. The analysis of this event has allowed us first to make an inventory of the vulnerability factors of the site and its environment that contributed to the human and economic losses: the existence of the catchments, the high density of inhabitants, open spaces, soil cover, topography, the physical vulnerability of buildings, roads and bridges, the vulnerability of public buildings, etc. Within the limits of the available data, this analysis allows these factors to be integrated into the urban design or reconstruction phase through standards and regulations in order to reduce the risks. For existing and constructed sites, risk reduction consists in making new decisions to reduce the vulnerability of the environment and enhance the resilience of the population. Recommendations are made for disaster risk reduction for the affected site of Algiers in terms of reducing vulnerabilities, and thus reducing risk and curbing the loss of human lives and economic assets through sound knowledge-based measures.

P.179  Seeing is Believing? An Examination of Perceptions of Local Weather Conditions and Climate Change among Residents in the U.S. Gulf Coast. Shao W*, Goidel RK; Auburn University at Montgomery   wshao@aum.edu

Abstract: What role do objective weather conditions play in coastal residents’ perceptions of local climate shifts, and how do these perceptions affect attitudes toward climate change? While scholars have increasingly investigated the role of weather and climate conditions in climate-related attitudes and behaviors, they typically assume that residents accurately perceive shifts in local climate patterns. We directly test this assumption using the largest and most comprehensive survey of Gulf Coast residents conducted to date, supplemented with monthly temperature data from the United States Historical Climatology Network and extreme weather events data from the National Climatic Data Center. We find that objective conditions have limited explanatory power in determining perceptions of local climate patterns. Only the 15- and 19-year hurricane trends and the decadal summer temperature trend have some effect on perceptions of these weather conditions, while the decadal trend in the total number of extreme weather events and the 15- and 19-year winter temperature trends are correlated with belief in climate change. Partisan affiliation, in contrast, plays a powerful role in individual perceptions of changing patterns of air temperatures, flooding, droughts, and hurricanes, as well as in belief in the existence of climate change and concern for future consequences. At least when it comes to changing local conditions, “seeing is not believing”: political orientations rather than local conditions drive perceptions of local weather conditions, and these perceptions, rather than objectively measured weather conditions, influence climate-related attitudes.

P.180  Key Elements for Judging the Quality of a Risk Assessment. Fenner-Crisp PA*, Dellarco VL; Both authors: Independent Consultant and U.S. Environmental Protection Agency (Retired)   pfennercrisp@aol.com

Abstract: A number of U.S. federal, state and local government agencies produce risk assessments on a continuing basis. In the years since publication of the 1983 National Research Council report Risk Assessment in the Federal Government: Managing the Process (the “Red Book”), advances in risk assessment have occurred but the need for further improvement continues to be recognized. Many reports have been published that contain recommendations for improving the quality, transparency and usefulness for decision-making of risk assessments prepared by these agencies. A substantial measure of consensus has emerged as to what characteristics high quality assessments should possess. The aim of this effort was to distill a set of key attributes from the accumulated recommendations into a Guide for use by decision-makers, risk assessors, peer reviewers and other interested stakeholders to judge whether or not an assessment “measures up” to current best scientific practices and results in a scientifically credible, transparent and useful product. By “best practices,” we mean that an assessment reflects a critical, open-minded approach in selecting reliable data and models fit for their intended use, and analyzing and integrating that information. The use of the Guide is intended to promote transparency and consistency with regards to the conduct and quality of assessments. Most of the features cited in the Guide are applicable to any type of assessment, whether it encompasses just one or all four phases (hazard identification, dose-response assessment, exposure assessment and risk characterization) of the risk assessment paradigm, whether qualitative or quantitative, screening level or highly sophisticated and complex. Just as agencies at all levels of government are responsible for determining the effectiveness of their programs, so should they determine the effectiveness of their assessments used in support of their regulatory decisions.

P.181  Techno-Economic Feasibility of Desalination Technology for Agriculture. Welle P*, Mauter M; Carnegie Mellon University   pdw@andrew.cmu.edu

Abstract: Global food systems face consistently high levels of risk. In arid regions especially, agriculture suffers from issues related to water supply and water quality. Desalination technologies may have a role in mitigating both supply and quality issues by making previously inaccessible water sources usable and/or by treating surface water. In this study, we examine the techno-economic viability of different use cases for desalination technology in the Central Valley of California and analyze the possibility of policy-encouraged technology adoption. By relating the agricultural valuation of water to the existing technological state of the art, we address how close desalination technologies are to being profitable in the marketplace. This knowledge can guide researchers seeking to refine these technologies by highlighting the scales and scenarios in which they are most likely to be useful, as well as policy makers who wish to reduce pollution through technology implementation. The study serves as an example of a set of methods for assessing the potential of a new technology to change how large systems cope with risk.

P.182  How Much Risk of the GM Issue Has Been Reported in Chinese Newspapers? A Comparative Analysis of National and Local Newspaper Coverage of GM Issues in China, 2000-2014. Zhang Xiao*; The University of Tokyo and GSII (Graduate School of Interdisciplinary Information Studies)   blue2624@gmail.com

Abstract: GM issues have triggered widespread concern in Chinese society, and the debate shows no signs of abating. Public rejection is seen as closely related to risk perceptions of GM foods, and the GM controversy is considered relevant to intensive media coverage of potential risks. However, research on risk in GM media coverage in developing countries, including China, has been very limited. Therefore, this study applied quantitative and qualitative content analysis to examine 16,976 GM-related news articles published in 718 newspapers from 2000 to 2014 in China. We not only examined general newspaper coverage of GM over the 15-year period but also focused on differences in coverage, including risk presentations, between the national and local levels, given the assumption that diverse perspectives and more risk coverage may have been masked by an exclusive focus on mainstream leading newspapers. We examined the change in the volume of news articles, identified 70 events based on coverage clusters, and specifically compared reporting patterns between national and local coverage. The proportion of articles mentioning “risk” has generally increased, but very few have “risk” in the headline. We noticed that a given newspaper published only about 1 to 5 GM risk articles per year on average from 2000 to 2014. Risk articles are on average longer than GM articles generally, especially risk articles at the national level, but the average article length at both levels has decreased gradually, with quite similar trends, since 2010. One of our major findings was that local newspapers leave wider room for dissenting or “fringe” voices, and thus carry more diverse voices and perspectives, while national coverage still plays a very important role in agenda-setting in the GM controversy. Although GM risk articles were a minority, they have the potential to offer critical perspectives that shape public risk perception of GM issues in China.

P.183  Practical usage of regional air monitoring to evaluate community-level chemical release exposures. Robinson HJ*; Ramboll Environ   hrobinson@ramboll.com

Abstract: Significant chemical releases, such as those caused by oil spills, industrial fires, natural disaster-related spills, wildfires, and train accidents, can produce airborne exposures at the community-wide level, particularly if the incident is of extended duration or widespread effect. In 2015 alone, more than 35 incidents occurred that required evacuation for potential airborne exposures. This study investigates the robustness of regional air monitoring networks for providing baseline information about community air conditions after unexpected release events. A spatial analysis was performed to compare the locations of air monitoring stations in EPA’s Air Quality System (AQS) with the locations of airborne releases over the past five years, as documented by the National Response Center. This poster displays the results of this analysis and identifies the situations and areas in which risk evaluators may be able to use existing air monitors to provide baseline and incident-related data. For example, a number of recent incidents have occurred in less populated areas where speciated air monitors are less common but PM2.5 monitors remain regionally distributed. When using regional monitoring data to assess community risks during a chemical release, further consideration must also be given to sampling duration, timeliness of data access, and the appropriateness of existing analytical methods for capturing incident-related emissions.
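
The core of such a spatial comparison can be sketched as a nearest-monitor distance query; in the hypothetical Python below, the coordinates are placeholders rather than actual AQS or National Response Center records.

    # Minimal sketch: great-circle distance from a release site to the
    # nearest monitoring station (haversine formula).
    import math

    def haversine_km(lat1, lon1, lat2, lon2):
        r = 6371.0  # mean Earth radius, km
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
        a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
        return 2 * r * math.asin(math.sqrt(a))

    monitors = [(34.05, -118.25), (36.16, -115.15)]  # placeholder stations
    release  = (34.90, -117.02)                      # placeholder incident
    print(min(haversine_km(*release, *m) for m in monitors))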

P.184  After the Flood: Risk Perceptions and Management Preferences Following the YYC Flood of 2013. Tanner A*, Arvai J; University of Calgary   alexa.tanner@ucalgary.ca

Abstract: Many studies have examined public perceptions of flood-related risks. My graduate research builds on past work to further elucidate the variables that drive flood risk perceptions, as well as public trust in risk managers and support for small- and large-scale risk management activities. This research took place in Calgary, Canada, two years after the city experienced Canada’s most costly natural disaster. The Calgary flood of 2013 resulted in the lengthy evacuation of 100,000 residents and led to over $445 million in municipal flood recovery costs. We used a quantitative survey to understand the role and influence of a variety of explanatory variables, including self-rated coping appraisal, differing value orientations thought to influence responses to environmental risks, and previous experience with flooding (including evacuation and residential damage as a result of the 2013 flood). Comparisons were also made between those who currently live in the flood zone and those who do not, in order to understand their differing perceptions and views towards mitigation and their flood risk beliefs.

P.185  Expert Panel Review of the Carcinogenic Potential of the Herbicide Glyphosate. Williams GM*, Sorahan TM, Aardema MJ, Acquavella J, Berry CL, Brusick DJ, Burns M, Viana de Camargo JL, Garabrant DH, Greim H, Kier L, Kirkland D, Marsh G, Solomon K, Weed D, Roberts A; New York Medical College   ashley.roberts@intertek.com

Abstract: The carcinogenic potential of glyphosate has been the subject of numerous reviews by health and regulatory agencies for over 30 years. These agencies have repeatedly concluded that glyphosate does not pose a carcinogenic risk to humans. However, the International Agency for Research on Cancer (IARC) recently classified glyphosate as a probable human carcinogen (Group 2A). Because this classification is causing confusion for various stakeholders, an Expert Panel (EP) was assembled to review the primary evidence in the areas evaluated by IARC: animal bioassays, genotoxicity, exposure, and epidemiology. The Animal Bioassay and Genotoxicity EPs concluded that the reviews by IARC's equivalent working groups suffered from significant weaknesses, such as selectivity in the choice of data reviewed, failure to use all relevant biological information to evaluate relationship to treatment in animal bioassays, and failure to conduct weight-of-evidence (WOE) evaluations using all available data and appropriate weighting. The Animal Bioassay EP conducted a thorough WOE evaluation and concluded that there is no evidence of carcinogenicity in rats or mice. A review of the complete database by the EP Genotoxicity workgroup supports the conclusion that glyphosate does not pose a genotoxic hazard and therefore should not be classified as a genotoxic carcinogen. The Epidemiology review focused on one cohort and six case-control studies of non-Hodgkin’s lymphoma (NHL), as IARC did. Only the Agricultural Health (cohort) Study assessed exposure before disease outcomes occurred, considered extent of exposure, and controlled for the confounding effects of other pesticides. That study found evidence of no association between glyphosate exposure and NHL incidence and was determinative in the epidemiology evaluation. Thus, none of the results from a very large database, generated using different methodologies, provides evidence of, or a potential mechanism for, human carcinogenesis.

P.186  Identification and Quantification of Cumulative Factors that Increase Environmental Exposures and Impacts. Huang H.*, Barzyk T.M.; ORISE at EPA   hongtaihuang@gmail.com

Abstract: Evaluating the combined adverse effects of multiple stressors upon human health is an imperative component of cumulative risk assessment (CRA). In addition to chemical stressors, other non-chemical factors are also considered. For example, smoking elevates the risk of lung cancer associated with radon exposure; toluene and noise together induce higher levels of hearing loss; and children exposed to violence have higher risks of developing asthma in the presence of air pollution. Environmental Justice (EJ) indicators, used as a tool to assess and quantify some of these non-chemical factors, include health, economic, and social indicators such as vulnerability and susceptibility. Vulnerability factors encompass race, ethnicity, behavior, geographic location, etc., while susceptibility factors include life stage, genetic predisposition, pre-existing health conditions, and others, although these two categories are not always mutually exclusive. Numerous findings regarding combined effects of EJ indicators and chemical stressors have been identified. However, fewer studies have analyzed the interrelation between multiple stressors that exert combined harmful effects upon individual or population health in the context of exposure assessment within the risk assessment framework. In this study, we connected EJ indicators to variables in the exposure assessment model, especially the Average Daily Dose (ADD) model, in order to better understand how the interaction of multiple stressors affects individual- or population-level exposure assessment. The major intention of this method is to quantify what are often considered qualitative EJ indicators to provide a more accurate representation of environmental exposures and impacts.
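
For readers who want the arithmetic made concrete, the following minimal sketch (in Python) applies the standard Average Daily Dose equation with a hypothetical susceptibility multiplier standing in for a quantified EJ indicator; every input value below is an illustrative assumption, not data from the study.

    # Standard ADD equation: ADD = (C x IR x EF x ED) / (BW x AT).
    def average_daily_dose(c, ir, ef, ed, bw, at):
        """c: mg/L, ir: L/day, ef: days/yr, ed: yr, bw: kg, at: days."""
        return (c * ir * ef * ed) / (bw * at)

    # Hypothetical EJ adjustment: a susceptibility factor > 1 scaling the
    # intake rate (e.g., a life-stage-specific ingestion rate). All values
    # are invented for illustration.
    baseline = average_daily_dose(0.005, 2.0, 350, 30, 70, 30 * 365)
    susceptible = average_daily_dose(0.005, 2.0 * 1.5, 350, 30, 70, 30 * 365)
    print(f"baseline ADD: {baseline:.2e} mg/kg-day")
    print(f"with susceptibility multiplier: {susceptible:.2e} mg/kg-day")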

P.187  Risks to U.S. Wastewater Workers During Ebola Outbreaks: A Bayesian Belief Network Model. Zabinski J*, MacDonald Gibson J; University of North Carolina at Chapel Hill   zabinski@unc.edu

Abstract: During the recent Ebola virus epidemic in West Africa, several cases of Ebola virus disease (EVD) were treated in U.S. hospitals. Municipal wastewater utilities connected to these hospitals raised concerns about potential risks to their workforce from Ebola transport through wastewater discharged from the hospitals. At some wastewater utilities, workers failed to report to work due to their fears about Ebola. The magnitude of risk to wastewater workers is currently unknown, and assessing this risk presents a major challenge due to limited data on the behavior of Ebola virus in wastewater, differences among hospitals in the management of liquid wastes from EVD patients, limited understanding of the dose-response relationship for Ebola, and the wide variation in potential scenarios via which wastewater and sewer workers could be exposed to Ebola virus. Information on the potential magnitude of risks under different scenarios for hospital treatment of liquid waste from EVD patients is essential to inform future risk management decisions, in case another outbreak of Ebola or a similar emerging pathogen occurs. We model wastewater treatment workers’ risk of developing EVD under alternative hospital waste management practices using a Bayesian belief network (BBN). This modeling approach allows for the simultaneous incorporation of heterogeneous data (including wastewater treatment system parameters, viral activity data, and EVD patient waste treatment protocol information). The BBN characterizes the wastewater system and its interaction with Ebola virus from the point of emission, while also permitting the specific identification of high-impact variables in reducing infection risk. Our model can aid wastewater treatment utilities in focusing efforts on workers at greatest risk through the most effective interventions and in advising hospitals on treatment of liquid wastes from EVD patients. The BBN’s structure can also be adapted to characterize risk from other infectious pathogens within wastewater treatment systems.
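
As a loose illustration of the Bayesian-belief-network arithmetic involved, the sketch below marginalizes over two parent nodes (hospital pretreatment of liquid waste, and infectious virus reaching a worker) to obtain a worker infection probability; the network structure and every probability are invented for illustration and are not the model's values.

    # Nodes: T = hospital pretreats waste, V = infectious virus reaches
    # worker (conditional on T), I = worker infected (conditional on V).
    p_pretreat = {True: 0.7, False: 0.3}         # P(T), hypothetical
    p_virus_given_t = {True: 0.01, False: 0.2}   # P(V=yes | T), hypothetical
    p_infect_given_v = {True: 0.05, False: 0.0}  # P(I=yes | V), hypothetical

    # Marginalize: P(I) = sum over t of P(t) * sum over v of P(v|t) * P(I|v)
    p_infection = 0.0
    for t, p_t in p_pretreat.items():
        for v in (True, False):
            p_v = p_virus_given_t[t] if v else 1 - p_virus_given_t[t]
            p_infection += p_t * p_v * p_infect_given_v[v]
    print(f"P(worker infection) = {p_infection:.4f}")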

P.188  Reducing early-life exposure to radiation: a review of radon testing programs in Canadian schools. Nicol AM*, Palmer A, Telfer J, Warje O; Simon Fraser University   anicol@sfu.ca

Abstract: Introduction: Radon, a radioactive soil gas, is the second leading cause of lung cancer among Canadians. Reducing exposure during the early years contributes to an overall lowering of the lifetime risk of developing cancer. Preventing exposure to radon gas requires reducing exposure in indoor environments where people spend significant amounts of time. Outside of homes, schools and daycares are the spaces most frequented by children. This review was undertaken to examine which programs and policies exist across Canadian schools and to collect data on radon testing experiences. Methods: Policy reviews were conducted at the federal, provincial, and school-board levels to capture legislation. Ministries of Education and school boards were contacted to request information on testing and results. Radon mitigation specialists were interviewed. Where information on costs and program implementation was available, data were collated. Results: Overall, there were significant challenges in finding information on school radon testing. Three provinces had comprehensive, province-wide testing programs, two of which began in the 1990s. Each of these provinces had adopted radon reduction programs in a unique manner. Three provinces had little to no information available about radon testing, even though each has regions where homes have tested above guidelines. The remaining six provinces had low testing rates, with some initiatives happening through local public health authorities and others through school boards. Confusion was noted in some provinces regarding who was responsible for setting out radon school testing requirements. Conclusion: Canada lacks a comprehensive school testing initiative for radon gas, even though the federal government encourages the testing of all public buildings. Data from current school radon programs, including costs and challenges, are not being shared. Making results accessible across the country and supporting the adoption of province-wide testing legislation are needed to reduce radiation exposure during children’s formative years.

P.189  The Dose-Response Framework: An online compendium of risk methods organized by problem formulation. Kroner O*, Haber L, Dourson M; Toxicology Excellence for Risk Assessment (TERA) Center of the University of Cincinnati   oliver.kroner@uc.edu

Abstract: With improvements in our biological understanding, computational power, and an ever-changing regulatory landscape, new methods for evaluating human health impact and risk are emerging. Since 2010, the Beyond Science & Decisions workshop series has provided a venue for testing, vetting, and improving novel risk assessment methods. As a project of the Alliance for Risk Assessment, this effort has grown to a collaboration of over 60 organizations representing government, industry, scientific societies, consultancies, and environmental NGOs. Over 40 case studies illustrating such methods have been presented and reviewed by the Science Panel. To make these case studies available to the risk community, the ARA Dose-Response Framework (www.chemicalriskassessment.org) was created. This free online compendium of state-of-the-science methods organizes the case studies by problem formulation, offering different methods for conducting qualitative, screening-level, or in-depth assessments, and links to key guidance documents on a range of risk assessment issues. It was developed by panel members and workshop participants as a way to categorize and identify gaps in available methods, and to aid risk assessors in identifying useful tools for different problem formulations. It is intended as a tool to help guide the risk assessor in selecting appropriate method(s) for addressing different issues related to hazard characterization and dose-response assessment, and to help the field of risk assessment identify gaps in methodology. The Dose-Response Framework is now available via the National Library of Medicine’s Enviro-Health Links (https://sis.nlm.nih.gov/enviro/toxweblinks.html).

P.190  GMOs and pesticides – Going beyond the data with new tools for risk communication. Reeves WR*; Monsanto Company   william.r.reeves@monsanto.com

Abstract: Genetically modified organisms (GMOs) and pesticides are a divisive issue for many consumers, and discussions about their risks and benefits often devolve into emotional arguments. On the surface, the debate can appear to be driven by pro- or anti-science worldviews, but the field of risk communication provides a more nuanced understanding. By using the knowledge and consent paradigm described by Douglas and Wildavsky (1982), it is possible to understand why disagreements over GMOs and pesticides are among the most intractable. The solutions to these types of disagreements are never simple, but they require personal communication more than data. This presentation describes how Monsanto’s efforts to share information about its products have been influenced by outlets such as the GMO Answers web site and social media in general. It also poses questions about what other steps need to be taken to further this dialogue.

P.191  Use of In Ovo Genotoxicity Assay for Risk Assessment of Food-Borne Compounds. Kobets T*, Duan JD, Brunnemann KD, Iatropoulos MJ, Vock E, Deschl U, Williams GM; New York Medical College, Valhalla, NY, USA and Boehringer Ingelheim Pharma GmbH & Co. KG, Biberach an der Riss, Germany   Tetyana_Kobets@nymc.edu

Abstract: The in ovo genotoxicity assay, a unique enhanced non-animal model that is intermediate between in vitro and in vivo assays, has been developed as an alternative for assessing the potential of chemicals to induce DNA damage. The model utilizes chicken (Chicken Egg Genotoxicity Assay; CEGA) or turkey (Turkey Egg Genotoxicity Assay; TEGA) eggs and evaluates the endpoints of DNA adduct formation, using the 32P-nucleotide postlabeling (NPL) assay, and DNA strand break formation, using the alkaline single-cell gel electrophoresis (comet) assay. CEGA was used to assess the genotoxic potential of several food-borne compounds, including three known genotoxic activation-dependent carcinogens (aflatoxin B1, benzo[a]pyrene, and methyl eugenol) and their structurally similar weak or non-genotoxic comparators (aflatoxin B2, benzo[e]pyrene, and eugenol). For this purpose, fertilized white chicken eggs received 3 daily injections on days 9-11 of incubation, with total doses of 1.6, 3.2, or 6.4 µg/egg of aflatoxin B1 or aflatoxin B2; 250 or 500 µg/egg of benzo[a]pyrene or benzo[e]pyrene; 1 or 2 mg/egg of methyl eugenol; or 1.5 or 3 mg/egg of eugenol. Three hours after the last injection, fetal livers were collected for the genotoxicity assays. The established carcinogens aflatoxin B1 and benzo[a]pyrene produced DNA strand breaks, and benzo[a]pyrene also produced DNA adducts. The hepatocarcinogen methyl eugenol formed DNA adducts. Weakly carcinogenic aflatoxin B2 was weakly positive in the comet assay and negative in NPL. In contrast, the non-carcinogens benzo[e]pyrene and eugenol did not induce genotoxicity. Data on other naturally occurring compounds will be presented. The findings in CEGA provide evidence that the in ovo genotoxicity assay is a reliable alternative model for the evaluation of chemical-induced genotoxicity for risk assessment.

P.192  An Evaluation of the Influenza Risk Reduction from Antimicrobial Spray Application of Porous Surfaces. Chabrelie A*, Mitchell J, Rose J, Charbonneau D, Ishida Y; Michigan State University   jade@msu.edu

Abstract: Antimicrobial spray products are used by millions of people around the world for cleaning and disinfection of commonly touched surfaces. Influenza A is a pathogen of major concern, causing over 36,000 deaths and 114,000 hospitalizations annually in the United States alone. One of the proposed routes of transmission for Influenza A is transfer from porous and non-porous surfaces to hands and subsequently to mucous membranes. Therefore, routine cleaning and disinfection of surfaces is an important part of the environmental management of Influenza A. While the emphasis is generally on spraying hard surfaces and laundering cloth and linens with high-temperature machine drying, this study examines the impact that using an antimicrobial spray on a porous surface has on reducing the risk of infection. A Quantitative Microbial Risk Assessment (QMRA) for a single touch resulting in direct contact with a treated, contaminated, porous surface was conducted to determine the reduction in Influenza A risk associated with the measured viral inactivation. The risks of infection with and without the use of the antimicrobial spray product were compared. The analysis indicates that Influenza infection risks associated with a single touch to contaminated fabrics are relatively high, especially considering the potential for multiple touches in a real-world scenario. However, use of the antimicrobial spray product resulted in a 4-log risk reduction. Thus the results of this study inform and broaden the range of risk management strategies for Influenza A.
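
For intuition, the sketch below compares single-touch infection risks with and without a 4-log inactivation, assuming an exponential dose-response model; the dose-response parameter and exposure numbers are hypothetical stand-ins, not the study's measured values.

    import math

    def p_infection(dose, r=0.01):
        # Exponential dose-response: P = 1 - exp(-r * dose); r is assumed.
        return 1.0 - math.exp(-r * dose)

    surface_conc = 1e4      # virions on the touched fabric area (assumed)
    transfer_frac = 0.01    # surface -> hand -> mucosa transfer (assumed)
    dose_untreated = surface_conc * transfer_frac
    dose_treated = dose_untreated / 10**4   # 4-log inactivation by the spray

    print(f"risk, untreated fabric: {p_infection(dose_untreated):.3f}")
    print(f"risk, treated fabric:   {p_infection(dose_treated):.2e}")

With these assumed inputs, the untreated single-touch risk is high (about 0.6) while the treated risk falls to roughly 1e-4, mirroring the qualitative pattern described above.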

P.193  Monte Carlo N-Particle (MCNP) Enhancements to Area Correction Factors (ACF), Gamma Shielding Factors (GSF), and Surface Factors for Rooms (Fsurf) Used in Superfund Risk and Dose Calculators. Stewart DJ*, Dolislager FG, Galloway LD, Bellamy MB, Finklea LR, Walker S; University of Tennessee   dstewart@utk.edu

Abstract: To model the external exposure pathway in risk and dose assessments of radioactive contamination at Superfund sites, the U.S. Environmental Protection Agency (EPA) uses slope factors (SFs), also known as risk coefficients, and dose conversion factors (DCFs). Without any adjustment, these external radiation exposure pathways effectively assume that an individual is exposed to an infinite slab source geometry, meaning that the thickness of the contaminated zone and its areal extent are so large that it behaves as if it were infinite in its physical dimensions. MCNP has been used to improve site-specific risk and dose calculations by modeling isotope-specific, source-depth-specific, and material-specific adjustment factors that provide a more accurate representation than the previous defaults. The ACF and GSF enhancements now allow the user to select from 19 different site size options and clean soil cover depths from 0 to 10 meters. The Fsurf enhancements now allow the user to select from 8 different room materials, 4 receptor positions, and 5 room sizes. The adjustment factors and the risk and dose coefficients are now based on the same source depths.
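
As a rough sketch of where such factors enter the calculation, the snippet below scales an infinite-slab external-exposure risk estimate by an ACF and a GSF; every numeric value is a placeholder for illustration, not a calculator default.

    slope_factor = 2.0e-10   # risk per year per pCi/g (hypothetical SF)
    soil_activity = 1.5      # pCi/g contaminant activity (hypothetical)
    exposure_years = 26

    acf = 0.8   # area correction factor for a finite source (hypothetical)
    gsf = 0.4   # gamma shielding factor for clean soil cover (hypothetical)

    infinite_slab_risk = slope_factor * soil_activity * exposure_years
    adjusted_risk = infinite_slab_risk * acf * gsf
    print(f"infinite-slab risk: {infinite_slab_risk:.2e}")
    print(f"site-adjusted risk: {adjusted_risk:.2e}")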

P.194  Can Air Pollution Sources Adversely Affect Soil and Vegetation? Zemba SG*, Lester RR; Sanborn, Head & Associates and CDM Smith   szemba@sanbornhead.com

Abstract: Clean Air Act regulations require evaluation of the impacts of air pollution emissions on soils and vegetation. Assessments are often qualitative, as emissions from modern industrial facilities are too low to deposit substantial levels of contaminants. However, concerns over contaminant deposition occasionally arise and demand quantitative evaluation. Case studies will be presented that evaluated the potential for air pollutants to affect soils at levels that could present human health and ecological risks. One case examined fugitive dust emissions from an aggregate material storage facility. Dust deposition into shallow soil was found to insubstantially increase contaminant concentrations in soils relative to background levels. In this case, electron microscopy found evidence of aggregate particle deposition at levels too small to distinguish with conventional sampling. In a similar study, deposition modeling of airborne emissions from a Portland cement facility also indicated insubstantial increases in soil contaminant levels. A third study modeled nitrogen deposition to soil due to emissions from a proposed power plant. Excessive nitrogen loading could endanger restoration of a tallgrass prairie by encouraging invasive plant species incapable of fixing atmospheric nitrogen. The increased level of nitrogen deposition, when added to the background rate, was found to be lower than that associated with invasive species proliferation. In general, these studies indicate that very high emissions are needed to effect noticeable changes in soil contaminant levels. Examples in which atmospheric deposition substantially impacted soils are the historic use of lead as a gasoline additive and a former municipal solid waste incinerator with very high dioxin emissions. In both cases, simple accumulation models indicate substantial increases in soil concentrations due to atmospheric deposition, capable of endangering health.
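
A minimal version of the "simple accumulation model" mentioned above, with a deposition flux mixed into a surface soil layer, might look like the following; every input is an illustrative assumption.

    dep_flux = 0.5         # mg/m2-yr contaminant deposition (assumed)
    years = 30             # accumulation period
    mix_depth = 0.05       # m, shallow soil mixing depth (assumed)
    bulk_density = 1.5e3   # kg/m3 soil bulk density (typical value)

    soil_mass_per_m2 = mix_depth * bulk_density        # kg soil per m2
    delta_c = dep_flux * years / soil_mass_per_m2      # mg/kg increment
    print(f"soil concentration increment: {delta_c:.3f} mg/kg over {years} yr")

Comparing such an increment to background soil concentrations is what distinguishes the "insubstantial" cases above from the lead and dioxin examples.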

P.195  Evaluation of developmental toxicity of multi-wall carbon nanotubes in pregnant mice after repeated intratracheal instillation. Kobayashi N*, Tanaka S, Ema M, Ikarashi Y, Hirose A; National Institute of Health Sciences   norihiro.kobayashi@nihs.go.jp

Abstract: Some studies have reported that maternal exposure to nanomaterials, including carbon nanotubes, may induce teratogenicity. In order to evaluate the developmental toxicity, including teratogenicity, of multi-wall carbon nanotubes (MWCNTs) via inhalation exposure, we conducted repeated intratracheal instillation studies of MWCNTs in pregnant mice. The MWCNT dispersions were repeatedly administered to pregnant Crlj:CD1(ICR) mice on gestation days 6, 9, 12, and 15 at dosages of 0, 0.5, 1.0, 2.0, and 4.0 mg/kg/day. Ten pregnant mice per group were dissected on gestation day 17, and developmental toxicity to embryos and fetuses was evaluated. The body weights of MWCNT-exposed mice at dosages of 2.0 and 4.0 mg/kg/day decreased, although the changes were not statistically significant. Fetal body weight was significantly decreased in the 2.0 and 4.0 mg/kg/day MWCNT-exposed groups. No statistically significant differences between the control group and MWCNT-exposed groups were observed in the numbers of corpora lutea or implantations. However, statistically significant changes in the number of live fetuses, sex ratio, placental weight, and external malformations (ectrodactyly and micromelia) of fetuses were observed in the 4.0 mg/kg/day MWCNT-exposed group. Furthermore, skeletal anomalies of fetuses were observed at dosages of 1.0 mg/kg/day and above. The number and percentage of skeletal anomalies changed in a dose-dependent manner. From these results, we conclude that the lowest-observed-adverse-effect level (LOAEL) for the developmental toxicity of MWCNTs is 1.0 mg/kg/day.

P.196  Characterising uncertainty in a Toxicokinetic/Toxicodynamic (TK/TD) model-based risk assessment of skin sensitisation. MacKay C*, Reynolds J, Gosling JP, Cubberley R, Dhadra S, Gellatly N, Pendlington R, Pickles J, Tang D, Maxwell G; Unilever Safety and Environmental Assurance Centre and University of Leeds   cameron.mackay@unilever.com

Abstract: Reliable methods of hazard characterisation are key to informing risk assessment of skin sensitising agents. However, combining consumer exposure data with non-animal hazard data to form a risk assessment remains a key challenge. We present a toxicokinetic/toxicodynamic (TK/TD) modelling approach to solving this problem. TK models were built to describe the absorption and distribution of chemical sensitisers in skin and their reaction with skin proteins (the molecular initiating event). These models were parameterised using literature data and a variety of non-animal experiments that enable time- and dose-dependent prediction of the sensitiser-modified fraction of skin protein. A TD model was built to describe the dynamics of antigen (i.e., the sensitiser-modified protein) processing, antigen presentation, and subsequent CD8+ T-cell activation (a key event in sensitisation). The TD model was parameterised using data from the immunological literature and coupled to the TK model to enable a dose-dependent prediction of the CD8+ T-cell activation event. Reverse dosimetry analysis was applied to determine the dose at which sensitisation occurs in the average individual. In our analyses, we treat the models as approximations of reality. The uncertainty in model predictions due to parameter uncertainty (limitations in knowledge of parameter values) and model uncertainty (limitations in model representativeness introduced via modelling assumptions) has been accounted for. Expert knowledge elicitation was used to set distributions on model parameters, which were then updated using available experimental data and Bayesian inference. Global sensitivity analysis was performed to determine how parameters contribute to the uncertainty in the inferred sensitising dose. The impact of modelling assumptions on the ability of the TK/TD model to accurately represent reality was assessed, and judgements on model uncertainty were made on this basis.
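
As a heavily reduced illustration of propagating parameter uncertainty through a TK step, the sketch below samples two uncertain first-order rate constants and reports percentiles of a toy protein-modification fraction; the model form, parameters, and distributions are hypothetical stand-ins for the full TK/TD system described above.

    import math, random

    random.seed(1)

    def modified_fraction(dose_uM, k_reac, k_clear, hours=24.0):
        # Competing first-order protein reaction vs. clearance; the
        # saturable form below is a toy choice, not the paper's model.
        k_tot = k_reac + k_clear
        reacted = dose_uM * (k_reac / k_tot) * (1 - math.exp(-k_tot * hours))
        return reacted / (reacted + 100.0)   # 100 uM nominal protein sites

    samples = sorted(
        modified_fraction(
            50.0,
            random.lognormvariate(math.log(0.05), 0.5),  # 1/h, uncertain
            random.lognormvariate(math.log(0.5), 0.3),   # 1/h, uncertain
        )
        for _ in range(10_000)
    )
    print(f"median modified fraction: {samples[len(samples) // 2]:.3f}")
    print(f"95th percentile:          {samples[int(0.95 * len(samples))]:.3f}")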

P.197  Status of Regulatory Decisions for Perfluoroalkyl Compounds: Is the Level of Protection to the General Public Worth the Uncertainty and Cost? Anderson JK, Goodrum P*; Integral Consulting Inc.   janderson@integral-corp.com

Abstract: Perfluoroalkyl substances (PFASs) continue to present significant challenges for regulatory agencies, as the science behind their health and environmental effects is continuously evolving. Many agencies struggle to set regulatory levels for PFASs, in part due to difficulties assessing risks from low-level exposure and the constant generation of new toxicity information. At the Federal level, final guidance on PFASs is limited to the January 2009 U.S. EPA Office of Water provisional health advisories for PFOA and PFOS. In October 2009, the U.S. EPA Office of Solid Waste and Emergency Response applied the sub-chronic RfDs for PFOA and PFOS from the Office of Water’s assessment to the Superfund risk-based equations to derive screening levels for water and other media; however, these values have not been adopted in the EPA Regional Screening Level Table. In the absence of national standards, some States have pursued drinking water advisories or standards for various PFASs, but with different interpretations of the science and methods for addressing data gaps. Both U.S. EPA (2014) and ATSDR (2015) have released draft assessments, which contain conflicting and opposing conclusions. In 2013, U.S. EPA began collecting occurrence data from large public drinking water supply systems for six PFASs to assess whether health-based standards under the Safe Drinking Water Act are necessary. PFASs have been detected in fewer than 5 percent of all water supply systems, but those detections have occurred across 33 states. Exposure for the general population has been shown to occur primarily through treated textiles and consumer products. It is unclear how information on exposure and occurrence is utilized by agencies to prioritize regulatory actions and develop the most public-health-protective regulations. This presentation will evaluate and compare the various PFAS standards and general population PFAS exposure. Implications for public health policy and risk assessment will be presented.

P.198  Predictive Quantification of Inhalation Risks to Support Natural Resource Damage Assessment. Rosenstein AB*, Mori CS, Colegrove KM, Schwacke LH; Risk Assessment Consultant; Industrial Economics, Inc.; Zoological Pathology Program, College of Veterinary Medicine, University of Illinois at Urbana-Champaign; and Oceans & Human Health Branch, NOAA/NCCOS Hollings Marine Labor   envriskexpabr@aol.com

Abstract: An inhalation risk assessment was conducted for marine mammals in the Gulf of Mexico following the Deepwater Horizon (DWH) oil spill. Inhalation is an important exposure route for these animals due to their unique anatomy, physiology, and diving behavior. This risk assessment incorporated information on relevant marine mammal species, air exposure levels, and the inhalation toxicity of petroleum and petroleum-related compounds. Inhalation toxic effect levels from laboratory mammal toxicity studies were extrapolated to marine mammals and then compared to modeled and measured air concentrations to derive risk estimates. Based on data from both live health assessments and post-mortem analyses of stranded bottlenose dolphins, researchers identified three primary adverse health effects that are the main contributors to the increased prevalence of sick animals, dead animals, and reproductive failure within the DWH oil spill footprint: lung disease, adrenal system disruption, and poor body condition. Researchers also identified a suite of other adverse health effects that compound the primary injuries, including anemia, liver disease, and dental disease. The marine mammal inhalation risk assessment supports the conclusions of the bottlenose dolphin live health assessments and post-mortem analyses that adverse health impacts occurred, and may continue to occur over a longer time frame, for Gulf of Mexico marine mammals following the DWH oil spill.

P.199  Effects of Ozone Monitor Upgrades and Inlet Height Adjustments on Ambient Exposure Risk and NAAQS Compliance. Ollison WM*, Leston AR; American Petroleum Institute   ollisonw@api.org

Abstract: The recent tightening of the ozone (O3) National Ambient Air Quality Standards (NAAQS) did not require upgrading compliance monitoring networks with recently approved “interference free” Federal Equivalent Method O3 monitors. Both the Teledyne-Air Pollution Instrumentation (TAPI) T265 and the 2B Technologies 211 would address interference bias in the array of conventional UV (254 nm) photometers presently deployed. Nor did the Environmental Protection Agency (EPA) modify sampling protocol guidance for inlet heights, whose wide allowed range (2-15 meters) may exaggerate estimated ambient exposure and measured exceedance of the NAAQS above 2 meters. To address these deficiencies, we measured the impacts of new monitor deployment and inlet height at the Westport, CT O3 monitoring site. A nitric oxide (NO) scrubbed TAPI T400 photometer, sequentially sampling at 6.2 meters and 2 meters, was collocated with its conventional MnO2-scrubbed TAPI T400 counterpart sampling at 6.2 meters, with both data sets adjusted for daily relative instrument drift. A comparison of the scrubber-upgraded and conventional 6.2-meter TAPI T400 time series provides a measure of conventional monitor interference bias; a comparison of the upgraded monitor’s 6.2-meter and 2-meter time series provides a measure of the O3 gradient between a 2-meter height, more representative of outdoor nose heights, and the Westport 6.2-meter compliance monitoring height. While current network inlet heights may vary between 2 and 15 meters, they currently cluster in the 4-6 meter range, averaging 5.2 meters for urban sites; however, 10-meter inlet heights are mandated at the rural EPA CASTNet compliance sites. The effects of improved network instrumentation and reconsidered inlet height sampling on measured O3 outdoor exposure and NAAQS compliance will be described in the poster presentation.

P.200  Climate change impacts on heat-related mortality in large urban areas in China. Li Y*, Zhang W; East Tennessee State University, Renmin University (Beijing, China)   liy005@etsu.edu

Abstract: Global climate change is anticipated to raise overall temperatures and is likely to increase future mortality attributable to heat. China, a rapidly developing nation with the world’s largest population, has experienced noticeable changes in climate over the past century, with an increase in annual air temperature of 0.5–0.8°C. While increasing evidence suggests that climate change has posed significant health risks to the Chinese population, including heat-related mortality, the extent to which climate change will affect future mortality and the sources of uncertainty in projecting prospective changes in mortality remain unexplored. This work-in-progress study aims at estimating excess future premature deaths in large urban areas in China resulting from potential increases in temperature under climate change and investigating the sources of uncertainty. We include 51 large Chinese cities in this study, which together contain approximately one third of the total population of China. We use an integrated approach, which combines temperature predictions from climate models, local temperature-mortality relationships, and population forecasting, to project future excess mortality attributable to higher temperatures during the warm season. The poster presents the results of predicting temperature change during 2040-2050, relative to the baseline period 1950-2000, in the 51 cities selected. We ensemble outputs from 19 climate models used in the IPCC Fifth Assessment Report, including outputs for all four AR5 emission scenarios (RCPs 2.6, 4.5, 6.0, and 8.5).
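
For readers unfamiliar with such projections, the sketch below applies the standard log-linear heat-mortality impact function to a range of warming values; the coefficient and baseline are illustrative assumptions, not the study's estimates.

    import math

    def excess_deaths(baseline_deaths, beta, delta_t):
        # Attributable deaths = y0 * (exp(beta * dT) - 1)
        return baseline_deaths * (math.exp(beta * delta_t) - 1)

    y0 = 2000     # warm-season baseline deaths in one city (assumed)
    beta = 0.03   # log-linear temperature-mortality slope (assumed)
    for dT in (1.0, 2.0, 3.5):   # ensemble-style range of warming, deg C
        print(f"dT = {dT:.1f} C -> {excess_deaths(y0, beta, dT):.0f} excess deaths")

In a full analysis, the spread of dT across the 19-model ensemble and four RCPs is one of the main sources of uncertainty the study sets out to quantify.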

P.201  Associations Between Cardiovascular Birth Defects and Disinfection By-Product Exposures in Massachusetts, 2000-2004. Wright JM, Evans A, Kaufman JA*, Rivera-Nunez Z, Narotsky M; Association of Schools and Programs of Public Health   kaufman.john.a@gmail.com

Abstract: Epidemiological studies suggest that women exposed to disinfection byproducts (DBPs) in treated water have an increased risk of delivering babies with cardiovascular defects, though evidence for specific DBP-defect associations is limited. We used a case-control design of birth defects nested in a retrospective cohort study of all births in Massachusetts from 2000-2004 with complete trihalomethane (THM) and haloacetic acid (HAA) data (n=22,033). We randomly matched each case (n=2,203) to 10 controls based on week of conception. Drinking water source, disinfection treatment information, and quarterly DBP data were obtained from all Massachusetts towns and linked to birth data by town of residence and birth month. We used weighted-average first-trimester DBP exposures across all quarterly monitoring sample locations. We examined nine birth defects in relation to four mixtures and five individual DBP categorical exposures. We observed statistically significant adjusted odds ratios (aORs) for four birth defects when comparing higher exposure groups to low exposure referent groups: 2.4 (95%CI=1.2–4.9) for atrioventricular or ventricular septal defect with dibromochloromethane (DBCM); 3.4 (95%CI=1.1–11.4) for pulmonary stenosis with trichloroacetic acid (TCAA); 4.3 (95%CI=1.1–16.9) for tetralogy of Fallot with TCAA; and 6.5 (95%CI=1.2–34.6) for transposition of the great arteries with HAA5. We detected an aOR of 1.8 (95%CI=0.9–3.6) for ventricular septal defect (VSD) with elevated THM4 exposure, within the range consistently observed by previous meta-analyses. The strongest association for VSD detected for an individual DBP exposure was an aOR of 1.5 (95%CI=1.0–2.4) for DBCM, one of the DBP components of THM4. Such results for specific DBPs rather than proxy or aggregate measures are a strength of this study. The views expressed herein are those of the authors only and do not necessarily reflect the views or policies of the USEPA.

P.202  A Food Processing Vulnerability Tool Exploring Public Health Risks. Hartnett E*, Milton B, Wilson M, Schaffner DW, Haas C; Risk Sciences International   ehartnett@risksciences.com

Abstract: The Food Safety Modernization Act (FSMA) Section 106 requires that a vulnerability assessment of the food system be conducted, including biological, chemical, radiological, or other risk assessments. The Department of Homeland Security (DHS) has defined vulnerability as the “likelihood that an attack is successful, if it is attempted”. The success of an attack on the food system depends on the ability of the hazard to persist in the product at concentrations sufficient to cause harm after distribution. We are developing a web-based food processing vulnerability assessment tool for agents of concern that may be used in attacks targeting food production systems. The aim is to inform decisions focused on protecting the production system from attacks. The tool is based upon an underlying stochastic simulation model that tracks the levels of a suite of agents of concern in a user-defined food production system. Results are presented incorporating assessments for a set of multiple agents of concern, both microbial and chemical (e.g., Bacillus anthracis, Clostridium botulinum, Yersinia pestis, and other agents that may be used in attacks on the food supply). The tool provides quantitative estimates that directly measure risk and vulnerability using public-health-based metrics. By adopting a multi-hazard approach, the results provide a holistic view of the vulnerability of the production system, as opposed to single-hazard, single-food assessments. Results of the vulnerability assessment of illustrative processing systems will be presented. Our tool is generic in nature and can be applied to a multitude of food production systems. It enables exploration of the impact of risk mitigation measures upon the vulnerability of a food production system. Use of the tool will provide stakeholders with science-based quantitative information that can be directly used to inform decisions enhancing the resiliency of the supply chain and minimizing the risks to the consumer.
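
A toy version of the underlying stochastic simulation, tracking an agent's log10 level through a short, hypothetical chain of processing steps, might look like this; the stage effects are invented for illustration and are not the tool's parameters.

    import random

    random.seed(42)

    def simulate_final_level(initial_log10, n=10_000):
        finals = []
        for _ in range(n):
            level = initial_log10
            level -= random.gauss(2.0, 0.5)   # heat step: ~2-log reduction
            level += random.gauss(0.0, 0.2)   # mixing step: no net change
            level -= random.gauss(1.0, 0.3)   # storage die-off: ~1-log
            finals.append(level)
        return sorted(finals)

    finals = simulate_final_level(6.0)   # 6-log10 initial contamination
    print(f"median final level: {finals[len(finals) // 2]:.2f} log10 units")
    print(f"95th percentile:    {finals[int(0.95 * len(finals))]:.2f} log10 units")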

P.203  Probabilistic risk assessment of the exposure to formaldehyde via fish consumption in Taiwan. Chiang SY*, Chen SR, Yang WT, Liu JM, Wu KY; China Medical University and National Taiwan University   sychiang@mail.cmu.edu.tw

Abstract: Formaldehyde, a known human carcinogen, is reportedly present in fish products, probably resulting from the degradation of trimethylamine N-oxide. Its potential health effects from fish consumption are of great public concern. This study assessed the potential health risk of formaldehyde due to fish consumption in Taiwan. Data on residual formaldehyde in imported fish products were taken from a report released by the Taiwan Food and Drug Administration. Monte Carlo simulation was used to assess the distribution of the total average lifetime daily dose, adopting the fish product intake rates published by the National Health Research Institutes of Taiwan. Based on the current ADI of 0.15 mg/kg bw/day for formaldehyde, the upper bound of the 95% confidence interval of the hazard index is calculated to be 2.1. Our data suggest that there may be some risk from formaldehyde exposure for those who frequently consume fish products in Taiwan.
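
The Monte Carlo hazard-index calculation described above can be sketched as follows; the distribution parameters are invented for illustration and are not the values from the Taiwanese surveys.

    import random

    random.seed(0)
    ADI = 0.15   # mg/kg bw/day, the ADI cited in the abstract

    his = []
    for _ in range(100_000):
        conc = random.lognormvariate(-1.0, 0.8)    # mg formaldehyde/kg fish
        intake = random.lognormvariate(-2.5, 0.6)  # kg fish/day
        bw = max(random.gauss(60.0, 10.0), 30.0)   # body weight, kg
        ladd = conc * intake / bw                  # lifetime avg daily dose
        his.append(ladd / ADI)                     # hazard index

    his.sort()
    print(f"median hazard index: {his[len(his) // 2]:.3f}")
    print(f"97.5th percentile:   {his[int(0.975 * len(his))]:.3f}")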

P.204  Understanding American Public Perceptions of Scientists' Communication Goals. Kotcher J*, Myers T, Stenhouse N, Vraga E, Maibach E; George Mason University   john.kotcher@gmail.com

Abstract: Despite recent efforts within the scientific community to train and mobilize more scientists for public engagement, relatively little is known about public beliefs regarding the reasons scientists engage in public communication, including their antecedents and consequences. Drawing on data from a nationally representative survey of Americans (n=1000), we asked participants to rate how important they think a number of different goals are to scientists when they engage in public communication (e.g., to ensure that people are informed, to advance a political agenda, to get more funding). We examine the factor structure of these goal perceptions and present a structural equation model that examines demographic and media-use predictors of attributions about scientists’ goals, and how these goal perceptions affect the perceived credibility of scientists. We found that beliefs about scientists’ communication goals break down into two latent factors, representing what we refer to as altruistic and egoistic goals. Furthermore, we find that being female, older age, and use of traditional science media are positively related, and conservative political ideology negatively related, to altruistic goal perceptions. Women, those with higher levels of education, and those with higher political media use tend to have greater egoistic goal perceptions. In turn, altruistic goal perceptions have a positive direct effect on credibility, whereas egoistic goal perceptions have a small, negative, and only marginally significant direct effect on credibility. By examining the factor structure of a number of different communication goals, our study sheds light on how individuals integrate beliefs about the intentions of scientists to form judgments about their credibility. Additionally, our analysis reveals that different sources of media have distinctive influences on the perceived communication goals of scientists, likely due to variations in the depiction of scientists within these media.

P.205  Communicating Environmental Health Risks to Indigenous Populations: A Systematic Literature Review and Recommendations for Future Research. Boyd AD*, Furgal CM, Dickson D; Washington State University   amanda.boyd@wsu.edu

Abstract: Indigenous populations are recognized as a group potentially vulnerable to environmental health risks due to their intimate relationship with the environment and, in many circumstances, their reliance on local environments for aspects of culture, health, and well-being. Barriers to effective communication and health risk management are linked to cultural, economic, and geographic factors. A systematic literature review was conducted to consolidate peer-reviewed research published between 1980 and 2014 on the communication of environmental health risks to Indigenous populations. The comprehensive literature review procedures included searching databases and key journals representing various fields in communication, environmental health, and Indigenous studies. The review yielded a total of 4,469 potential articles, of which 14 manuscripts met the inclusion criteria. The 14 articles were analyzed to provide lessons learned for effective risk communication. Factors that influence successful risk communication strategies with Indigenous populations include: (1) developing messages that are congruent with the population’s cultural beliefs and understanding of the environment; (2) including Indigenous populations in message design and delivery; (3) using credible and trustworthy spokespeople in message delivery; (4) identifying and utilizing effective communication materials and channels; and (5) ensuring that messages are understandable to the target audience. Gaps in the literature include the lack of longitudinal studies that empirically measure changes in perception, awareness, and behavior, as well as a general lack of theory-based research. Results from this review will provide directions for future research to help guide the development of more effective risk communication research and strategies for Indigenous populations.

P.206  Lung Cancer Risk from Residential Radon Exposure. Corrigan RM*; University of Ottawa   rcorr073@gmail.com

Abstract: Strong evidence exists linking radon exposure to increased lung cancer risk. Epidemiological studies of underground miners exposed to high concentrations of radon demonstrate a higher incidence of lung cancer - an association that has been corroborated by case-control studies and the improved credibility of plausible biologic mechanisms. Based on the miner studies, radon has been classified as a human carcinogen and is ranked as the second greatest cause of lung cancer after smoking. Radon is also the greatest source of natural radiological exposure to humans, accounting for approximately 50% of the dose received from natural sources. A recent Health Canada survey of radon concentrations in homes across Canada yielded higher concentrations than older surveys. A number of exposure-response models are used to estimate the health detriment from radon exposure. The population attributable risk of lung cancer is estimated at approximately 3% to 10%, depending on the model used. Excess lifetime relative risk is calculated to be approximately 0.04 to 0.15.
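
The population-attributable-risk arithmetic behind such estimates follows Levin's formula, PAR = p(RR - 1) / (1 + p(RR - 1)); the prevalence and relative-risk inputs in the worked example below are illustrative only.

    def attributable_fraction(p_exposed, rr):
        # Levin's formula for the population attributable risk.
        x = p_exposed * (rr - 1.0)
        return x / (1.0 + x)

    # e.g., ~10% of homes above a reference level, modest excess risk
    for p, rr in [(0.10, 1.3), (0.10, 2.0)]:
        print(f"p={p:.2f}, RR={rr:.1f} -> PAR = {attributable_fraction(p, rr):.1%}")

With these assumed inputs the formula returns roughly 3% and 9%, which brackets the 3% to 10% range cited above and shows how the choice of exposure-response model drives the spread.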

P.207  Alliance for Risk Assessment’s 1,4-Dioxane Reanalysis in Support of a Regenerative Hyperplasia Mode of Action (MOA). Nance P*, Dourson M; Toxicology Excellence for Risk Assessment (TERA) Center, University of Cincinnati   patricia.nance@uc.edu

Abstract: The State of Kentucky petitioned the Alliance for Risk Assessment (ARA) to obtain additional information from Japanese studies to inform 1,4-dioxane’s cancer mode of action (MOA) following a recent reanalysis. That reanalysis leads to the conclusion that these rodent tumors are evoked by a regenerative hyperplasia MOA that stimulates existing background mutations. Regenerative hyperplasia in this context is due to overwhelming toxicity in the rodent liver, as evidenced by an increase in blood levels of enzymes indicative of liver cell damage, and associated histopathology, due to 1,4-dioxane exposure occurring in a dose- and time-related manner throughout the lifespan. Additional information and translations of the Japanese studies are also supportive of a regenerative hyperplasia MOA, with one exception: the reported findings from the histopathology and clinical chemistry of the mouse liver in the Japanese studies are contradictory. This contradiction may be due in part to the investigators changing criteria for liver histopathology scoring during the course of reporting their results. A limited amount of additional information from the Japanese studies, including potentially rereading some of the mouse liver histopathology slides, may be helpful. The intent of this ARA project is to obtain this limited additional information from the Japanese studies, or other information as appropriate, in order to resolve the hypothesized MOA for 1,4-dioxane’s liver tumor formation (and potentially other tumors).

P.208  Risk governance through integrating risk evaluation and institutional systems: the case of chemicals management. Tokai A*, Todoroki A, Machimura T, Xue M, Kojima N, Ebisudani M, Sakamoto Y, Shiga Y, Manabe Y, Zhou L; Osaka U.   tokai@see.eng.osaka-u.ac.jp

Abstract: The WSSD 2020 target requires a higher level of chemical risk management worldwide, and much effort is being devoted to meeting it. We aim to build a methodology to support this target by integrating risk evaluation and institutional options. Japanese chemical risk management has been carried out through the PRTR and the Law Concerning the Examination and Regulation of Manufacture, etc. of Chemical Substances. These institutional systems mainly address the management of chemical flows; in this research project, the concept of stock management is introduced, and the possible extension of the coverage of chemicals risk management is discussed. We employ a few representative chemicals and manufactured goods as examples and perform case studies. The main points of argument are stock management and its possible contribution to risk governance. This research was supported by the Environment Research and Technology Development Fund (1-1501) of the Ministry of the Environment, Japan.


