Society For Risk Analysis Annual Meeting 2013

Session Schedule & Abstracts


* Disclaimer: All presentations represent the views of the authors, and not the organizations that support their research. Please apply the standard disclaimer that any opinions, findings, and conclusions or recommendations in abstracts, posters, and presentations at the meeting are those of the authors and do not necessarily reflect the views of any other organization or agency. Meeting attendees and authors should be aware that this disclaimer is intended to apply to all abstracts contained in this document. Authors who wish to emphasize this disclaimer should do so in their presentation or poster. In an effort to make the abstracts as concise as possible and easy for meeting participants to read, the abstracts have been formatted such that they exclude references to papers, affiliations, and/or funding sources. Authors who wish to provide attendees with this information should do so in their presentation or poster.

Common abbreviations


Poster Session

Room: Key Ballroom 7-12   6 PM - 8 PM

Chair(s): TBD



Caring for Consumers
 

P.1  Approach for developing Specific Consumer Exposure Determinants (SCEDs) for fuel and lubricant scenarios. Qian H*, Zaleski R, Money C; ExxonMobil Biomedical Sciences, Inc.   hua.qian@exxonmobil.com

Abstract: The ECETOC TRA tool, a preferred lower tier exposure tool under REACH, provides conservative (intentionally high) estimates of consumer exposure. Under REACH, if a predicted exposure exceeds a substance hazard benchmark (Derived No Effect Level or DNEL) using lower tier tools, the assessment is either refined via higher tier analysis or Risk Management Measures are implemented to reduce the exposure predictions to values < DNEL. Much effort has recently been directed to identifying a rigorous and transparent approach for refining the TRA defaults so that initial exposure estimates are closer to reality, limiting the need to perform higher tier analysis, which requires more data. In 2012, ECETOC introduced the concept of Specific Consumer Exposure Determinants (SCEDs), a template that helps provide a basis for delivering a more realistic estimate of consumer exposure. We populated this template and developed 9 SCEDs to cover a range of consumer fuel use scenarios and 4 SCEDs to better define consumer lubricant use scenarios based on public data. SCED development required: data mining, data assessment, data documentation, and evaluation of the final scenario as a whole. We describe the steps taken in SCED development, and provide examples of a completed product. This approach has general utility for documentation of consumer exposure determinants and development of improved consumer exposure scenarios. It is currently being implemented by other industry sectors as well.

P.2  Developing Specific Consumer Exposure Determinants (SCEDs) for assessing risk from chemicals used in consumer products. Money C, Corea N, Rodriguez C, Ingram J, Zaleski RT, Lewis J*; ExxonMobil, SC Johnson, Procter and Gamble, Unilever   chris.money@exxonmobil.com

Abstract: The EU REACH Regulation requires that an assessment of the risks to consumers is undertaken when a substance is registered and is classified as hazardous. The assessment must cover all known consumer uses of the substance. The ECETOC TRA tool has proved to be the preferred lower tier exposure tool under REACH that provides conservative estimates of consumer exposure. In order to ensure that TRA users are able to reliably and efficiently refine exposure predictions, while maintaining the degree of conservatism inherent to lower-tier models, it should be possible to enter relevant habits and practices data into the tool. However, in the absence of standardized descriptions of consumer use, there is a likelihood that different TRA users will describe common consumer uses differently. European industry has therefore undertaken an activity, in cooperation with EU regulators, which aims to describe typical habits and practices information for commonly encountered uses of consumer products in Europe. For each use, the habits and practices information is compiled in a transparent manner and described in a structured form (the Specific Consumer Exposure Determinant, SCED) that is capable of electronic data transfer and manipulation. The core elements within the SCED template are specifically designed for use with TRA algorithms. The process for developing the SCEDs, together with examples, will be described.

Decision Analysis & Risk
 

P.3  Mercury at Oak Ridge: Outcomes from Risk Evaluations can Differ Depending upon Objectives and Methodologies. Burger J*, Gochfeld M, Powers CW, Kosson D, Clarke J, Brown K; Rutgers University, Consortium for Risk Evaluation with Stakeholder Participation, Vanderbilt University   burger@dls.rutgers.edu

Abstract: Risk evaluations play an important role in environmental management, remediation, and restoration. Yet when different agencies and groups evaluate risk, the objectives and methods may differ, leading to different conclusions, which can confuse managers, policy-makers, and the public. In this paper we examine two evaluations of the potential risk from mercury contamination deriving from the Y-12 facility at the Department of Energy’s Oak Ridge Reservation (Tennessee, USA). The U.S. Agency for Toxic Substances and Disease Registry (ATSDR) examined the past and present risks from mercury to humans, using data provided in government reports and publications. The Consortium for Risk Evaluation with Stakeholder Participation (CRESP) used a risk-informed prioritization model it developed for managers to evaluate different remediation projects. The CRESP prioritization model considered both human and ecological receptors, as well as future potential risks. Risk was an important component of both evaluations, and both evaluations found that there was a completed pathway of mercury from the source on the Oak Ridge Reservation to offsite human receptors, although the evaluations differed in their final conclusions. In both cases, the pathway to off-site human exposure was through fish consumption. The two evaluations are compared with respect to purpose, specific goals, target audience, receptors, assumptions, time frames, evaluation criteria, and conclusions. When these aspects are considered, the risk evaluations are congruent, although the risk communication messages differ. We conclude that there are many different possible risk evaluations, and the aforementioned variables must be carefully considered when making management decisions, determining remediation goals, and communicating with regulators, managers, public policy makers, and the public.

P.4  Food Safety? A Supply Chain Matter: Probabilistic Risk Model based on the Agro-Food Trade Network. Convertino MC*, Liang SL; University of Minnesota    matteoc@umn.edu

Abstract: Food safety is a major issue for the worldwide population. In 2010, foodborne outbreaks in the USA caused an estimated $152 billion in costs, with 325,000 persons hospitalized and 5,000 deaths due to foodborne illness. To fight this increasing trend, a risk-based system built upon data-driven analysis is needed to inform the efficient targeting of efforts to minimize foodborne risks to the US consumer. Here we propose a model for the assessment of the potential total health risk of food based on the food supply chain (FSC) as a subset of the international agro-food trade network. The number of connected countries, the betweenness centrality of the exporting countries, and the average path length are the supply network variables considered. Considering the safety of each country and the network variables, we introduce a global safety index (GSI) for characterizing the riskiness of each country based on local and FSC variables. The intermediary country risk, the food-pathogen health risk, and the company reliability are the second most important factors for the total health risk. Policies that act on both the supply chain variables and the safety index by means of the GSI reduce the average total health risk by 44%. This reduction is much larger than that achieved by policies focused on individual risk factors of the food life-cycle. The proposed FSC model is scalable to any level of the global food system and offers a novel perspective on how global public health is conceived, monitored and regulated.
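
The supply-network variables named in this abstract (connectivity and average path length) and a GSI-style index can be sketched in a few lines of Python. The mini trade network, the safety scores, and the equal weighting below are illustrative assumptions, not the authors' data or model:

```python
from collections import deque

def shortest_paths(graph, source):
    """BFS shortest-path lengths from source in an unweighted directed graph."""
    dist = {source: 0}
    q = deque([source])
    while q:
        u = q.popleft()
        for v in graph.get(u, []):
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

# Hypothetical mini trade network: country -> list of importing countries.
trade = {
    "A": ["B", "C"],
    "B": ["C", "D"],
    "C": ["D"],
    "D": [],
}

# Two of the supply-network variables from the abstract, defined simply:
# number of connected countries (out-degree) and average path length.
out_degree = {c: len(v) for c, v in trade.items()}
path_lengths = [d for c in trade for d in shortest_paths(trade, c).values() if d > 0]
avg_path_length = sum(path_lengths) / len(path_lengths)

# Toy GSI: combine a local safety score with a normalized connectivity term
# (the 0.5/0.5 weights and safety scores are assumptions for illustration).
local_safety = {"A": 0.9, "B": 0.7, "C": 0.8, "D": 0.6}
max_deg = max(out_degree.values())
gsi = {c: 0.5 * local_safety[c] + 0.5 * (1 - out_degree[c] / max_deg)
       for c in trade}
```

A real analysis would replace the toy graph with the international agro-food trade network and add betweenness centrality, but the index construction follows the same pattern: local safety plus network position.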

P.6  Spatial analysis of risk perception. The case of Nuclear Power Plant. Dumitrescu A*, Lemyre L, Pincent C; University of Ottawa   adumi039@uottawa.ca

Abstract: The beginnings of research in risk perception can be traced to the mid-1960s and the public debate over the use of nuclear energy and associated industries, which had promised cheap, clean and safe energy. Over the past decades, the study of public understanding and perception of risk has developed into a wide-ranging and interdisciplinary field of research, approached from psychometric-cognitive and cultural perspectives. Among all these approaches and theories, there is no doubt that the psychometric paradigm has emerged as a leading theory in this field. While psychometric research has made important contributions, it has been criticized for its limitations in approaching the perception of risk across different geographical locations. Indeed, studies of public understanding and perception of risks in relation to physical space, proximity and place have produced mixed and conflicting results. The National Health Risk Perception Survey 2012, involving a representative sample of approximately 3,000 respondents from the Canadian population, was used to study the proximity effect. Our research proposes a new methodology that aims to explore the relationship between proximity to a hazard and perceptions of risk by analysing the spatiality of risk perception. By geocoding risk perception data, we compared risk perception between the population living in the proximity of a nuclear power plant and the rest of the Canadian population. Our results are consistent with the findings of other studies, which showed that risk perception is lower among people living close to nuclear power plants. Additionally, correlations between living distance from nuclear power plants and risk perception were explored.
The analysis of the spatial dimension of risk perception provides an exceptional level of integration of individual, environmental and contextual variables, and also provides a link between the risk assessment attributable to a specific hazard and risk perception.

P.7  Risk Perception in Libya: An Overview. Elmontsri ME*; Higher Institute of Occupational Safety and Health   mustafa@oshc.org.ly

Abstract: “Risk” has become an increasingly topical subject in recent decades in Western countries, studied largely from a socio-psychological approach. The aims of those studies were to understand how people react to and perceive specific types of risks, which helps decision and policy makers understand what society is worried about and how such risks affect their decisions. The science of risk analysis has been studied in detail in the developed world, whereas in developing countries such research remains limited. Therefore, the aim of this paper is to examine the ways in which Libyan people perceive the various societal risks that confront them by adopting the psychometric paradigm, using a survey strategy to obtain the required data. This research aims to provide a valuable and crucial insight into the current risk perception of the Libyan public. It will also provide a baseline for further research in this field.

P.8  Using quantitative bias analysis to characterize the uncertainty of inputs based on epidemiological data. Forshee RA*, Lu Y, Izurieta H, Egger J, Cooney D, Lash T, Fox M, Brown J; U.S. Food and Drug Administration, SciMetrika LLC, Emory University, Boston University, Harvard Pilgrim   richard.forshee@fda.hhs.gov

Abstract: Many important public health decisions are informed by epidemiological data, but the results of epidemiological studies may be affected by several potential sources of bias including selection bias, unmeasured and unknown confounders, and various forms of misclassification. A growing body of literature is seeking to account for these potential sources of bias in a more rigorous and transparent manner by using quantitative, probabilistic methods. We have adapted quantitative bias analysis methods, such as probabilistic bias analysis and multiple bias modeling, for use in post-market studies of vaccine safety. The project examines the potential impact of several forms of misclassification and missing and partially missing confounders on the estimates of the possible relationship between vaccination and risk of an adverse event. Quantitative bias analysis methods were developed for several study designs that are commonly used in post-market safety analyses including cohort, case-control, self-controlled case series, and vaccinee-only risk interval studies. Quantitative bias analysis was used to generate an adjusted distribution of the risk based on both the statistical uncertainty and the other potential sources of bias. These approaches could be used to create inputs to probabilistic risk assessment approaches that better characterize the uncertainty associated with post-market safety data.
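
The core move in probabilistic bias analysis can be sketched for one bias, nondifferential exposure misclassification in a case-control design: draw sensitivity and specificity from prior distributions, back-correct the observed 2x2 table, and collect the distribution of corrected odds ratios. The counts and the uniform priors below are hypothetical, not from the FDA studies:

```python
import random
random.seed(1)

# Observed 2x2 table (hypothetical counts).
a, b = 120, 880   # cases: exposed, unexposed
c, d = 80, 920    # controls: exposed, unexposed

def correct(n_exp, n_unexp, se, sp):
    """Back-correct misclassified exposure counts given sensitivity/specificity."""
    n = n_exp + n_unexp
    true_exp = (n_exp - (1 - sp) * n) / (se + sp - 1)
    return true_exp, n - true_exp

ors = []
for _ in range(5000):
    # Bias parameters drawn from assumed uniform priors.
    se = random.uniform(0.80, 0.95)
    sp = random.uniform(0.90, 0.99)
    A, B = correct(a, b, se, sp)
    C, D = correct(c, d, se, sp)
    if min(A, B, C, D) > 0:          # discard draws implying impossible counts
        ors.append((A * D) / (B * C))

ors.sort()
median_or = ors[len(ors) // 2]
interval = (ors[int(0.025 * len(ors))], ors[int(0.975 * len(ors))])
```

A full analysis would add a layer of sampling for conventional random error and repeat the exercise for the other biases (confounding, selection), which is what multiple bias modeling refers to.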

P.9  Analyzing the effects of unintended uses of commodities on phytosanitary risk: The example of U.S. potato exports to Mexico. Fowler G*, Erikson L, Ahern R, Caton B, Gutierrez W, Griffin R; United States Department of Agriculture   glenn.fowler@aphis.usda.gov

Abstract: Diversion of commodities from intended use is a recurring plant health issue in agricultural trade because consumption is generally low risk while other uses, e.g. propagation, may not be. Consequently, having mechanisms to characterize the risk of this occurring is highly relevant to policy makers and trade dispute arbitration. Here we analyze the risk of U.S. table stock potatoes being diverted for seed by Mexican potato producers as an example of characterizing the risk of unintended commodity use in a plant health context. We constructed probabilistic pathway models characterizing the movement of quarantine significant white, yellow and Russet potatoes from the United States into Mexico at current and double export volumes. We then modeled the likelihood of these potatoes being diverted for seed and the subsequent establishment of bacteria, nematode, surface feeder and virus pests in Mexico. This analysis can be adopted as a mechanism for modeling the unintended use of other commodities and informing trade policy.

P.10  Interim action values for management of contaminants in soils for protection of human health risks. Gilmore J*, Martinez C, Pagliarulo M; Ontario Ministry of Environment   James.Gilmore@Ontario.ca

Abstract: The Ontario Ministry of the Environment has developed interim action values (IAVs) for several contaminants as part of its Soil Assessment Protocol, which informs investigation, analysis and risk reduction measures (RRMs) to address contaminated soil. An IAV represents the upper limit beyond which interim risk reduction measures should be considered. As the name implies, IAVs are intended to inform short-term risk management and mitigation decisions, which may need to be reconsidered over time as more information becomes available on the exposure conditions or on the science underpinning the IAV. IAVs can be developed for various media including soil, groundwater, soil vapour and indoor air, as needed. Interim action values are generally developed from generic soil standards (GSS) by: a) Reviewing the relevant human health component values (HHCVs) underpinning the GSS (e.g., direct contact for incidental ingestion and dermal contact); b) Adjusting the target risk levels for non-cancer and cancer effects to the selected range for risk management; c) Selecting the more stringent effect and ensuring that risks posed by other effects (e.g., acute effects) are not exceeded. The IAV may also be refined by reviewing data from biomonitoring or other studies, where available. Using arsenic as an example, an IAV of 200 µg/g was developed. This value is within the range of soil concentrations studied that showed no significant elevated arsenic exposure, reflects a 2 in 10,000 (or 1 in 5,000) incremental cancer risk and is within the range of risks posed from total inorganic arsenic exposure in the general Canadian population (1 in 1,000 to 1 in 10,000), and is equivalent to approximately ten times an upper estimate of background soil concentrations of arsenic at 18 µg/g.
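
Step (b) of the derivation, adjusting the target risk level, can be illustrated with linear low-dose extrapolation, under which a risk-based soil value scales in proportion to the target risk. This is a simplification for illustration; the base value and base risk below are hypothetical, and the actual IAVs also reflect background concentrations and other considerations:

```python
# Linear low-dose extrapolation: a risk-based soil value scales in
# proportion to the chosen target risk level.
def scale_soil_value(value_at_base_risk, base_risk, target_risk):
    return value_at_base_risk * (target_risk / base_risk)

# Hypothetical: a generic component value of 1.0 ug/g derived at a
# 1-in-1,000,000 target risk, rescaled to the 2-in-10,000 level
# selected for risk management.
iav_like = scale_soil_value(1.0, 1e-6, 2e-4)
```

The rescaled value (200 ug/g) is then checked against the other lines of evidence the abstract lists: biomonitoring ranges, population background risks, and background soil concentrations.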

P.11  Toxidromes - A decision-making tool for early response to chemical mass exposure incidents. Kirk M, Hakkinen P*, Ignacio J, Kroner O, Maier A, Patterson J; University of Virginia   Kroner@TERA.org

Abstract: A common language to describe and recognize clinical manifestations of toxic chemical exposures is essential for emergency responders and hospital first receivers to be prepared to provide rapid and appropriate medical care for victims of industrial chemical mass exposures and terrorist attacks. In these situations, when the identity of the chemical is not known, first responders need a tool to rapidly evaluate victims and identify the best course of treatment. Military and civilian emergency response communities use a “toxic syndrome” (toxidrome) approach to quickly assess victims and determine the best immediate treatment when information on chemical exposures is limited. Toxidromes can be defined by a unique group of clinical observations, such as vital signs, mental status, pupil size, mucous membrane irritation, and lung and skin examinations. Data on over 20 toxidrome systems were evaluated to identify salient features and develop a consistent lexicon for use by state, local, tribal, territorial, and federal first responders and first receivers. A workshop of over 40 practitioners and experts in emergency response, emergency medicine, and medical toxicology developed names and definitions for 12 unique toxidromes that describe and differentiate the clinical signs and symptoms from exposures to chemicals. These toxidromes focus on acute signs and symptoms caused by inhalation and dermal exposures. Each toxidrome is characterized by exposure routes and sources, organs/systems affected, initial signs and symptoms, underlying mode of action, and treatment/antidotes. Toxidrome names and definitions are designed to be readily understood and remembered by users. Communication in a crisis requires accurate and succinct terms that can quickly convey the health conditions of patients. These toxidromes lay the foundation for a consistent lexicon that, if adopted widely, will improve response to chemical mass exposure incidents.

P.12  Risk analysis for networks with cooperative games. Mohri H*, Takeshita J; Waseda University, National Institute of Advanced Industrial Science and Technology   mohri@waseda.jp

Abstract: We propose a risk measure based on cooperative game theory. Hausken (2002) introduced a risk measure based on non-cooperative game theory for risk analysis. In the social sciences, non-cooperative game theory is a standard tool of modern microeconomics, e.g. industrial organization theory, and is now widely taught in faculties of economics and management. Cooperative game theory is less popular by comparison, because researchers in economics typically assume that agents, such as companies in markets, are competitive. Nevertheless, we may encounter situations in which agents cooperate with respect to risk. Consider, for example, a chemical plant: it is usually operated by one company and consists of many sections, and all sections must cooperate to make chemical products in one plant under one company. Some economists note that, if agents are cooperative, a cooperative game can be converted into a non-cooperative game via the Nash program. As noted above, it is easy to imagine a cooperative game situation within a single chemical plant; moreover, the agents of infrastructure networks should also be regarded as cooperative. In this study, we first argue how cooperative games should be introduced into risk analysis. Second, we address concrete examples: graphs with simple mathematical structures, namely series (tandem), parallel, and combined graphs. We then discuss how more complicated structures should be treated.
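
A standard cooperative-game quantity that could attribute risk (or value) to the sections of such a plant is the Shapley value. The sketch below computes it exactly for a small "combined" graph; the characteristic function is an assumed example, not the authors' model:

```python
from itertools import permutations

def shapley(players, v):
    """Exact Shapley values: average marginal contribution over all orderings."""
    phi = {p: 0.0 for p in players}
    orders = list(permutations(players))
    for order in orders:
        coalition = set()
        for p in order:
            before = v(frozenset(coalition))
            coalition.add(p)
            phi[p] += v(frozenset(coalition)) - before
    return {p: phi[p] / len(orders) for p in players}

# Toy "combined" plant graph (assumed): section s1 is in series with a
# parallel pair (s2, s3), so the plant produces (value 1) only when s1
# operates together with s2 or s3.
players = ("s1", "s2", "s3")
def v_combined(S):
    return 1.0 if "s1" in S and ("s2" in S or "s3" in S) else 0.0

shares = shapley(players, v_combined)
# The series-critical section s1 receives the largest share, while the
# redundant parallel sections s2 and s3 split the remainder equally.
```

The same function applies unchanged to pure series games (v = 1 only for the grand coalition) and pure parallel games (v = 1 for any nonempty coalition), the other two structures the abstract mentions.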

P.13  Development of practical risk evaluation method with the example of traffic relevant environmental measures . Tokai A*, Nakazawa K, Nakakubo T, Yamaguchi H, Kojima N, Sakagami M, Higuchi Y, Nagata Y, Ishimaru T; Osaka University   tokai@see.eng.osaka-u.ac.jp

Abstract: Given the multiple risk situations in our daily life, it is necessary to understand the actual risk conditions that lie beneath the revealed risks. To grasp them, we examined a method that supplements the results of traditional health risk assessment. For this purpose, we identified three objectives in this research project: first, to develop a risk durability evaluation method; second, to apply this method to actual environmental measures, in particular transportation-related technologies; and third, to examine the possibility of voluntary risk-reduction action by citizens through a questionnaire survey of residents of Osaka prefecture. The main findings were as follows. Regarding the first task, we developed a risk durability evaluation method based on the concepts of value of information and trade-off analysis, using the example of a flame retardant added to plastics, implemented in the software Analytica. This software enabled us to build a user-friendly model whose graphical user interface supports insight into the risk stemming from this chemical. To analyze the uncertainty in the risk assessment of the flame retardant, we estimated the value of information of specific parameters required for risk estimation. As to the second task, we carried out risk analysis and lifecycle analysis for three types of environmental measures for the automobile industry: material substitution, fuel substitution and product replacement. We clarified the risk trade-offs for these three types of measures and evaluated their relative superiority from the viewpoint of health risk and greenhouse gas emissions. Regarding the third task, results were obtained from 232 valid respondents aged 20 or over living in Osaka prefecture.

P.14  Air pollution in Salvador, BA, Brazil: An experience of risk analysis. Vianna NA*, Saldiva PHN; University of Sao Paulo   nelza@usp.br

Abstract: A scientific basis for understanding the effects of air pollution on human health is needed worldwide, because local data are important to support decision making and air pollution control. Developing countries have difficulty detecting and measuring air pollutants, which is a challenge for the implementation of air quality standards. In Brazil, a program of the Environmental Health Agency called VIGIAR conducts surveillance of air quality effects on human health, but the technology to measure pollutants is not available in all Brazilian cities. Implementing this program in a metropolitan area of a Brazilian megacity required involving stakeholders. The aim of this study was to build a framework between the academic and public sectors for the application of risk analysis and air quality management. The pollutants were characterized in terms of chemical composition. A receptor model was used to detect particulate matter 2.5 over two years in the city of Salvador, and the composition of the particulate matter was studied to understand local emissions. Alternative tools such as biomonitoring were used, including morphological analysis of particles. Data on respiratory and cardiovascular diseases were used to study effects on human health. Risk communication strategies are still needed. After validation of the methodology, these results will be used to support decision making in Salvador, as well as to inform policy for air pollution control and the protection of human health.

P.15  Practice makes perfect: Lessons and outcomes based on mode of action/human relevance framework application to case studies. Willis AM*, Maier A, Reichard J, Haber L, Patterson J; Toxicology Excellence for Risk Assessment (TERA)   Willis@TERA.org

Abstract: A public workshop, organized by a Steering Committee of scientists from government, industry, university, and research organizations, was held at the National Institute of Environmental Health Sciences (NIEHS) in September, 2010. The workshop explored the development of dose-response approaches for receptor-mediated liver cancer within a Mode of Action (MOA) Human Relevance Framework (HRF) (WHO/IPCS). Case studies addressed activation of the aryl hydrocarbon receptor (AHR), the constitutive androstane receptor (CAR), and the peroxisome proliferator-activated receptor alpha (PPARα). The workshop case studies provided a valuable exercise in applying the MOA/HRF and a number of insights and lessons emerged that may be useful for future applications. Inclusion of associative events and modulating factors into the framework was useful for the consideration of mechanistic data to inform dose-response. In particular, associative events and modulating factors could be a useful approach for the integration of molecular and biomarker data to inform risk assessment. Further guidance in the MOA/HRF would be useful to clarify the number of key events needed to define a MOA, to address the absence of human studies needed to inform critical interspecies differences, and to assess activation of multiple MOA cascades and interactive molecular signaling pathways. In addition, variation in human population susceptibility needs to be addressed for assessing human relevance, particularly the possibility that populations exhibit widely ranging individual thresholds. A qualitative “value of information” approach to assess the utility of further research to better inform dose-response and a detailed and systematic uncertainty analysis for use in the framework would be useful.

P.16  Weight-of-Evidence Evaluation of Short-term Ozone Exposure and Cardiovascular Effects. Sax S*, Prueitt R, Goodman J; Gradient   ssax@gradientcorp.com

Abstract: There is a considerable body of research on the cardiovascular (CV) effects associated with ozone exposure, including epidemiology, toxicology, and controlled human exposure studies. US EPA is considering these data to determine whether to update the ozone National Ambient Air Quality Standards (NAAQS). We conducted a weight-of-evidence (WoE) analysis to determine if there was an association between CV effects and short-term ozone exposures at levels below the current primary ozone NAAQS of 75 parts per billion. Our analysis followed an updated WoE framework based on EPA's NAAQS framework. We found that the epidemiology evidence of CV morbidity and mortality is inconsistent and lacks coherence across specific CV endpoints. Specifically, the lack of epidemiology evidence of morbidity effects is not coherent with reported mortality estimates. Toxicology studies, although somewhat more consistent, are conducted at high exposure levels well above the current NAAQS, and there is limited information on dose-response relationships. Furthermore, there is a lack of coherence between reported results from epidemiology studies (suggesting no effects) and results from animal studies (suggesting small effects at high exposure levels). Similarly, controlled human exposure studies report inconsistent effects after exposure to high ozone levels above the current NAAQS. Overall, our WoE analysis indicates that CV effects are not associated with short-term ozone exposures below the current NAAQS.

P.18  A Bayesian Belief Network (BBN) for Modeling Risk of Adverse Events Due to the Particulate Matter in Injectables. Kazemi R*, Rahaman F, Urban J; USFDA   rezakazemi@gmail.com

Abstract: Particles in injectable medications come chiefly from two main sources: intrinsic contaminants that result from manufacturing and packaging processes, and extrinsic particles that are introduced at the time of administration to patients. These particles are generally considered to be harmful and can have many shapes or types (e.g. glass, metal, rubber, lamellae) and many sizes. Many factors play a role in determining whether these particles will have an imminent effect on a patient’s health, and whether the particulates could cause serious health problems or death, temporary health problems, or no serious adverse health reaction at all. Among these factors are the particulate profile (e.g. type, size), the amount of particulate administered to the patient, the route of administration, whether barriers such as filters will be effective in screening out the particulates, and the patient’s resistance factors. Because of the uncertainties involved in how these particles may influence the risk of adverse events in patients, a Bayesian Belief Network (BBN) formalism was used to assess the risk of adverse events due to the injection of particulates. BBNs are probabilistic in nature, so the uncertainty in assessing the risk of adverse events, given the different states of all influencing factors, can be explicitly expressed and modeled. Given a patient’s condition and the characteristics of the particles, the model can assess the likelihoods of adverse events with their respective uncertainties.
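
The kind of inference a BBN supports can be sketched with a deliberately tiny two-parent network: filter use (F) and particle load (L) as parents of an adverse event (A). All probabilities below are illustrative assumptions, not FDA estimates, and the real model has many more factors:

```python
# Minimal discrete Bayesian network, evaluated by direct enumeration.
p_filter = {True: 0.6, False: 0.4}           # P(F): filter used?
p_load = {"high": 0.3, "low": 0.7}           # P(L): particulate load
p_adverse = {                                 # P(A=True | F, L)
    (True, "high"): 0.05, (True, "low"): 0.01,
    (False, "high"): 0.30, (False, "low"): 0.05,
}

# Marginal risk of an adverse event: sum over the parent states.
p_a = sum(p_filter[f] * p_load[l] * p_adverse[(f, l)]
          for f in p_filter for l in p_load)

# Diagnostic query: posterior probability that the particle load was high,
# given that an adverse event occurred (Bayes' rule by enumeration).
p_high_given_a = sum(p_filter[f] * p_load["high"] * p_adverse[(f, "high")]
                     for f in p_filter) / p_a
```

The same enumeration generalizes to the factors the abstract lists (particle type and size, route of administration, patient resistance); dedicated BBN libraries simply perform this sum-product computation efficiently on larger graphs.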

Dose Response
 

P.19  Determining a concentration-response relationship suitable for estimating adult benefits of reduced lead exposure. Brown LPM*, Lynch MK, Post ES, Belova A; Abt Associates, Inc.    lauren_brown@abtassoc.com

Abstract: Lead is a highly toxic pollutant that can damage neurological, cardiovascular, and other major organ systems. The neurological effects are particularly pronounced in children. However, the recent literature has found that a wide spectrum of adverse health outcomes can occur in people of all ages. In addition, a threshold below which exposure to lead causes no adverse health effects has not been identified. This suggests that further declines in lead exposure below today’s levels could still yield important benefits. A well-established quantitative risk assessment-based approach to evaluating the benefits of reductions in lead releases for adults does not exist. We will present our efforts to create a rigorous approach to value adult health benefits for endpoints such as cardiovascular mortality. We reviewed recently published government reports and the primary literature. We then assessed the weight-of-evidence for associations between lead exposure and cardiovascular, renal, reproductive, immune, neurologic and cancer endpoints for the purposes of benefits estimation. We closely evaluated the literature and will propose a concentration-response function relating blood lead levels to adverse effects, particularly cardiovascular mortality, in adults. This function could potentially be used to support the benefits analysis of future regulations intended to result in a decrease in lead exposure for adults.
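
A common functional form in this kind of benefits analysis is a log-linear concentration-response function; a sketch of how one would be applied is below. The beta coefficient, baseline rate, and population are placeholders, not the function the authors propose:

```python
import math

# Log-linear concentration-response function, a common form in
# environmental health benefits analysis.
def avoided_cases(baseline_rate, beta, delta_exposure, population):
    """Cases avoided per year for a delta_exposure reduction in the risk metric."""
    return baseline_rate * (1 - math.exp(-beta * delta_exposure)) * population

# Hypothetical inputs: adult CV mortality rate of 0.004/yr, beta of 0.05
# per ug/dL blood lead, a 1 ug/dL reduction, population of 1,000,000 adults.
cases = avoided_cases(0.004, 0.05, 1.0, 1_000_000)
```

In a regulatory benefits analysis, the avoided cases would then be monetized (e.g. with a value of statistical life) to yield the benefit estimate for the rule.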

P.20  Assessing the impact of human metabolic variability on the health risks of occupational and environmental exposures to chloroform. Deveau M*, Krewski D, Nong A; University of Ottawa; Health Canada   michelle.deveau@uottawa.ca

Abstract: Approximately 15,000 Canadians are occupationally exposed to chloroform, primarily in the recreational sector. Non-occupational exposures can occur when chloroform is formed as a disinfection byproduct in drinking water. Occupational and environmental exposure limits are designed to prevent liver toxicity from metabolites and neurological effects from the parent compound. Because chloroform is primarily metabolized by the 2E1 isoform of cytochrome P450 (CYP2E1), variability in the levels of the enzyme in the human population could influence susceptibility to the compound. The objective of this research was to investigate the effect of interindividual variability in CYP2E1 activity on the health risks of chloroform and to identify whether existing exposure limits sufficiently account for these differences. To do this, a human physiologically based pharmacokinetic (PBPK) model for chloroform was used, and distribution data on CYP2E1 in human liver were inputted into the model to simulate exposure scenarios for selected occupational and environmental exposure limits. Estimates were obtained for 5th percentile, average and 95th percentile metabolizers. As expected, the 5th percentile group metabolized less chloroform, resulting in higher blood chloroform concentrations. Likewise, the 95th percentile metabolizers had higher levels of metabolism. However, the differences amongst the groups were less than 2-fold, despite much higher variability levels for CYP2E1 and microsome concentrations; therefore, these factors only have a minor impact on the risks of liver toxicity and acute neurological effects within the population. The population variability in CYP2E1 appears to be sufficiently addressed in the selected occupational and environmental exposure limits.
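
The finding that blood levels vary much less than enzyme activity is what flow-limited hepatic clearance predicts, and it can be illustrated with the well-stirred liver model rather than a full PBPK model. All parameter values below are illustrative assumptions, not the study's PBPK inputs:

```python
# Well-stirred liver model: hepatic clearance is capped by liver blood
# flow, which damps the effect of CYP2E1 variability on blood levels.
def hepatic_clearance(q_liver, cl_intrinsic):
    return q_liver * cl_intrinsic / (q_liver + cl_intrinsic)

q_h = 90.0          # liver blood flow, L/h (typical adult, assumed)
cl_int_p5 = 200.0   # 5th percentile intrinsic (CYP2E1) clearance, L/h
cl_int_p95 = 800.0  # 95th percentile: a 4-fold enzyme difference

dose_rate = 1.0     # arbitrary constant uptake rate
css_p5 = dose_rate / hepatic_clearance(q_h, cl_int_p5)    # slow metabolizer
css_p95 = dose_rate / hepatic_clearance(q_h, cl_int_p95)  # fast metabolizer
ratio = css_p5 / css_p95  # steady-state blood concentration ratio
```

With these assumed values, a 4-fold spread in intrinsic clearance produces well under a 2-fold spread in steady-state blood concentration, qualitatively matching the abstract's conclusion that CYP2E1 variability has only a minor impact.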

P.21  Ambient air pollution and allergic disease among children. FAN KC*, HO WC, LIN MH, CAFFREY JL, WU TT, PAN SC, CHEN PC, WU TN, SUNG FC, LIN RS; China Medical University   applet2899@gmail.com

Abstract: The prevalence of childhood eczema, allergic rhinitis, and asthma has been increasing worldwide. Air pollution related to allergic disease is an important public health issue, especially for highly sensitive groups such as children. Critical exposure windows for air pollution related to allergic disease may occur during gestation and at ages 0-1 and 1-2 years. The purpose of this study is to assess the potential adverse health effects of air pollution related to allergic diseases (eczema, allergic rhinitis, and asthma). Two databases are used in this study: 1) the Longitudinal Health Insurance Database 2005 (LHID2005) and 2) the Environmental Protection Agency (EPA) air monitoring database. Geographic Information Systems (GIS) will be used to estimate air pollution exposure. Furthermore, Cox proportional hazard regression models will be used to adjust for sex, geographic area, urbanization level, household Environmental Tobacco Smoke (ETS) exposure, and airborne lead concentrations within three exposure windows: the 10 months before birth, ages 0-1, and ages 1-2. All statistical analyses will be performed with SAS version 9.3 (SAS Institute, Cary, NC, USA). Other studies indicate that components of PM2.5 are associated with hospitalization for several childhood respiratory diseases, including pneumonia, bronchitis, and asthma. We therefore anticipate that long-term air pollution exposure is not only associated with asthma but also affects children's lung function and contributes to allergic disease.

P.23  Assessment of benzo(a)pyrene (a tobacco smoke toxicant) as a driver of genotoxicity. Fiebelkorn SA*, Bishop EL, Breheny D, Cunningham FH, Dillon DM, Meredith C; British American Tobacco, Group R&D   clive_meredith@bat.com

Abstract: Over 5,600 constituents have been identified in tobacco smoke, some with well-established toxicological properties. Our proposed framework for the risk assessment of tobacco smoke toxicants combines both computational and in vitro experimental components. Initially we use Margin of Exposure (MOE) calculations to segregate tobacco smoke toxicants into high and low priority for risk management action, using guidelines developed by the European Food Safety Authority (EFSA). We conduct Mode of Action (MOA) analyses on these prioritised toxicants using the International Programme on Chemical Safety (IPCS) framework. Experimentally, we then test individual priority toxicants for their activity in several in vitro assays, using the MOA for each toxicant to inform assay selection. Here we describe our findings from applying this risk assessment framework to benzo(a)pyrene (BaP) as the prototypical tobacco smoke toxicant. Following a detailed literature search, we generated twelve MOEs for BaP ranging from 16,805 to 2,400,000, indicating a lower priority for risk reduction research. Our MOA analysis for BaP proposed four key events: genotoxicity, mutation, cell proliferation and tumour formation. From our in vitro toxicity data, the concentrations of BaP equating to a point of departure were 1.0-1.28 µg/plate (Ames), 0.75-1.0 µg/ml (micronucleus) and 1.4-1.5 µg/ml (mouse lymphoma assay). These data confirm the genotoxic and mutagenic potential of BaP, supporting the first two key events in the proposed MOA. The data have subsequently been used to generate in vitro MOEs, which support the in vivo MOE conclusions (1,200,000-30,000,000). Additional in vitro data sets from disease models provide further weight of evidence for the postulated MOA key events. Future refinement of our conclusions for BaP would include the use of PBPK models to predict tissue dose within the respiratory tract of smokers, and a cumulative risk assessment of the various polycyclic aromatic hydrocarbons present in tobacco smoke.
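The MOE screening step described above reduces to a simple ratio and threshold; EFSA guidance treats an MOE of 10,000 or more for a genotoxic carcinogen as indicating low priority for risk management. A minimal sketch (the numerical inputs below are placeholders, not the study's exposure estimates):

```python
# Margin of Exposure screen; EFSA convention for genotoxic carcinogens:
# MOE >= 10,000 indicates low priority for risk management action.
def margin_of_exposure(reference_point, exposure):
    """MOE = toxicological reference point / estimated human exposure
    (same units, e.g. mg/kg bw/day)."""
    return reference_point / exposure

def priority(moe, threshold=10_000):
    """Segregate a toxicant into high or low priority for risk management."""
    return "low" if moe >= threshold else "high"

bmdl10 = 0.5        # mg/kg bw/day, hypothetical animal reference point
exposure = 1.2e-5   # mg/kg bw/day, hypothetical smoker intake
moe = margin_of_exposure(bmdl10, exposure)
print(f"MOE = {moe:,.0f} -> {priority(moe)} priority")
```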

P.24  A Decision Tool for Assessing Polymers and Polymeric Substances with Potential Hazards to Human Health. Gadagbui B, Maier A, Nance P*, JayJock M, Franklin C; Toxicology Excellence for Risk Assessment   nance@tera.org

Abstract: Polymers display a wide variety of characteristics - e.g., presence of non-bound residual monomers, polymerization chemicals, degradation products, and additives - that may pose a potential health hazard. There is a paucity of direct testing data on many polymers to adequately evaluate their toxicity, although several regulatory agencies have provided guidance for assessing polymer safety. We evaluated these approaches and identified the strengths and weaknesses of each. No single published model appears to cover all characteristics of interest, suggesting the need for a comprehensive decision tool to identify polymeric substances that may pose potential toxicological hazards to human health. We developed a decision tool that incorporates a weight of evidence approach integrating information from many individual hazard flags. Hazard flags were placed into four broad categories: (1) empirical hazard information on the polymer or residual monomer; (2) evidence of toxicity based on structural properties (i.e., based on polymer class, monomer components, or reactive functional groups); (3) potential for significant tissue dose (i.e., based on molecular weight distribution or systemic bioavailability); and (4) hazard based on foreseeable special use considerations. Some of these hazard flags have not been considered previously by the regulatory agencies. We tested this approach on a number of polymers to demonstrate how the new tool incorporates all available regulatory approaches as well as the new features, and provides a comprehensive decision framework for evaluating polymer safety.
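A flag-based weight-of-evidence screen of this kind can be sketched as follows. The four categories mirror the abstract, but the individual flag names and the escalation rule (two or more flagged categories trigger further evaluation) are invented here for illustration:

```python
# Hypothetical sketch of a flag-based weight-of-evidence polymer screen.
# Category names follow the abstract; flags and scoring rule are assumed.
FLAG_CATEGORIES = {
    "empirical":  ["polymer_toxicity_data", "residual_monomer_toxicity"],
    "structural": ["polymer_class_alert", "reactive_functional_group"],
    "dose":       ["low_mw_fraction", "systemic_bioavailability"],
    "use":        ["special_use_concern"],
}

def assess_polymer(flags):
    """Count categories with at least one raised flag; two or more
    flagged categories escalate the polymer for further evaluation."""
    hit = sum(any(f in flags for f in members)
              for members in FLAG_CATEGORIES.values())
    return "further evaluation" if hit >= 2 else "lower concern"

print(assess_polymer({"reactive_functional_group", "low_mw_fraction"}))
```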

P.25  Development of Chemical-Specific Adjustment Factors for Long-Lived Chemicals: PFOS as a Model Chemical. Haber LT*, Dourson ML, Mohapatra A; TERA   Haber@tera.org

Abstract: Guidance for the development of chemical-specific adjustment factors (CSAFs) has been available for a number of years, and has been applied in assessments of several chemicals, such as boron and 2-butoxyethanol. Typical dose metrics considered for interspecies extrapolation include the area under the concentration times time curve (AUC) or maximal concentration (Cmax). The IPCS (2005) guidance provides some lines of evidence to aid in choosing the dose metric, but notes that “a reasonable assumption is that effects resulting from subchronic or chronic exposure would normally be related to the AUC, especially for chemicals with long half-lives, whereas acute toxicity could be related to either the AUC or the Cmax.” Despite this guidance, CSAFs have been derived primarily for chemicals with short half-lives. A challenge with using AUC for interspecies extrapolation is how to take into account the duration of exposure, particularly if steady state has not been reached. We investigated the development of a CSAF for long-lived chemicals, using perfluorooctanesulfonate (PFOS) as an example. We evaluated the data supporting AUC vs. Cmax as the appropriate dose metric, with particular attention to the relationship between various dose metrics and measures of toxicity in post-exposure recovery groups. We also considered the implications of non-steady state kinetics, as well as relating the exposure duration to the critical effect.
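The AUC-versus-Cmax question can be made concrete with a toy one-compartment calculation (illustrative parameters, not PFOS kinetics): for a long half-life chemical under constant dosing, the concentration (and hence Cmax) approaches a plateau while the AUC keeps growing with exposure duration, so the two dose metrics diverge whenever steady state has not been reached.

```python
import math

# Toy one-compartment kinetics for a long-lived chemical (assumed values).
half_life_days = 100.0
k = math.log(2) / half_life_days   # first-order elimination rate, 1/day
dose_rate = 1.0                    # input rate (amount/volume per day)

def conc(t):
    """Concentration under constant dosing: C(t) = (R/k)(1 - exp(-k t))."""
    return dose_rate / k * (1 - math.exp(-k * t))

def auc(t_end, dt=0.1):
    """Trapezoidal AUC of C(t) from 0 to t_end."""
    n = int(t_end / dt)
    ts = [i * dt for i in range(n + 1)]
    return sum((conc(a) + conc(b)) / 2 * dt for a, b in zip(ts, ts[1:]))

# Far from steady state the AUC grows faster than linearly with duration,
# while the concentration approaches its plateau R/k:
for t in (50, 100, 200):
    print(t, round(conc(t), 1), round(auc(t), 1))
```

This is why relating an AUC-based chemical-specific adjustment factor to a fixed exposure duration is the challenge the abstract highlights.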

P.26  Ambient air pollution and Attention Deficit Hyperactivity Disorder (ADHD) among children. Lin MH*, Ho WC, Caffrey JL, FAN KC, WU TT, CHEN PC, LIN CC, WU TN, SUNG FC, LIN RS; China Medical University   whocmu@gmail.com

Abstract: Attention Deficit Hyperactivity Disorder (ADHD) is the most commonly diagnosed neurobehavioral disorder of childhood. Animal studies suggest that traffic-related air pollution may have adverse neurologic effects, but studies of neurobehavioral effects in children remain scarce. The purpose of this study is to assess the potential adverse effects of air pollution exposure during maternal pregnancy on childhood ADHD. Two databases are used in this study: 1) the Longitudinal Health Insurance Database 2005 (LHID2005) and 2) the Environmental Protection Agency (EPA) air monitoring database. Geographic Information Systems (GIS) will be used to estimate air pollution exposure. Furthermore, Cox proportional hazard regression models will be used to adjust for sex, geographic area, urbanization level, household Environmental Tobacco Smoke (ETS) exposure, and airborne lead concentrations. All statistical analyses will be performed with SAS version 9.2 (SAS Institute, Cary, NC, USA). A p-value of less than 0.05 is set to declare statistical significance. The results showed that air pollution, especially traffic-related air pollution, could be related to childhood ADHD. Trimester-specific effects of air pollutants were found. Further research is suggested.

P.27  Birth weight, household smoking and the risk of wheezing in adolescents: a retrospective cohort study. Ho WC*, Lin MH, Caffrey JL, Lin YS, Fan KC, Wu TT, Chen PC, Wu TN, Sung FC, Lin RS; China Medical University, 15F, No. 91 Hsueh-Shih Road, Taichung City, Taiwan   whocmu@gmail.com

Abstract: OBJECTIVE: Low birth weight (LBW) and environmental tobacco smoke (ETS) exposure are each associated with respiratory difficulties (wheezing) in children. This study was designed to examine the combined association of LBW and ETS with wheezing. METHODS: A retrospective birth cohort analysis linked with a national survey of allergic disorders among 1,018,031 junior high school students in Taiwan (1995-96) was performed. The reported incidence of wheezing (yes or no) and ETS exposure (4 categories: 0, 1-20, 21-40 and ≥41 household cigarettes per day) were obtained from validated questionnaires. Logistic regression models were used to assess the associations of interest. RESULTS: LBW was associated with higher odds ratios (ORs) of reporting ever wheezing (1.08; 95% confidence interval, 1.01 to 1.16), current wheezing (1.09; 95% confidence interval, 1.00 to 1.20) and wheezing with exercise (1.11; 95% confidence interval, 1.02 to 1.21) within the smoke-free cohort. Higher ETS exposure correlated with a higher risk of wheezing (ever, current and with exercise). With ETS exposure, adolescents from the lowest birth weight cohorts were more likely to report wheezing (ever, current and with exercise). CONCLUSIONS: ETS and LBW each constitute a public health risk for respiratory symptoms in children. Furthermore, LBW may exaggerate the risk among those exposed to ETS. LBW, ETS and associated respiratory impairments may deserve special attention as part of a comprehensive environmental health risk assessment directed toward prevention and intervention.
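The odds ratios reported above come from standard 2x2-table logistic estimates. A minimal sketch of the calculation with a Woolf-type 95% confidence interval (the counts below are made up to land near the abstract's ever-wheezing OR of 1.08, and are not the study's data):

```python
import math

# Odds ratio with a Woolf (log-scale normal) 95% CI; illustrative counts.
def odds_ratio_ci(a, b, c, d, z=1.96):
    """2x2 table: a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)   # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

or_, lo, hi = odds_ratio_ci(520, 4480, 4800, 44800)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```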

P.28  Development of practical quantifying method applicable for risk assessment of metabolic inhibition during co-exposure in workplaces by applying a PBPK model in humans. Ishimaru T*, Yamaguchi H, Tokai A, Nakakubo T; Osaka University   ishimaru@em.see.eng.osaka-u.ac.jp

Abstract: At present, chemical substances in workplaces are managed based on administrative control levels for single substances. When a number of chemical substances are used in a workplace, they are managed on the assumption that risks increase additively: the Hazard Index is calculated as the sum of the ratios of each chemical's exposure level to its administrative control level, such that values larger than 1 are of concern. However, management based on this assumption cannot appropriately address compounds subject to metabolic inhibition. We therefore aim to develop a method to quantify the effect of metabolic inhibition in order to support risk management in occupational workplaces. In particular, we construct a method to derive dose-response curves by applying a PBPK model for metabolic inhibition, and assess the effect of co-exposure using toluene and n-hexane as a case study. By integrating a PBPK model applicable to co-exposure to toluene and n-hexane into a hierarchical model that evaluates dose-response relations separately for pharmacokinetics (PK) and pharmacodynamics (PD), we derived dose-response curves that include metabolic inhibition. By quantifying the change in risk metrics such as the BMD10 between the dose-response curves excluding and including metabolic inhibition, the effect of metabolic inhibition was quantified for each administered concentration of the competing chemical substance, and the threshold of the co-exposure interaction was evaluated. Moreover, this method could be applied to other combinations of chemicals that cause mutual metabolic inhibition, provided the inhibition mechanism is clear. For further development of this method, we deem it necessary to classify compounds that may cause mutual metabolic inhibition in workplaces and to clarify the competition mechanism at the metabolic enzyme.
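The additivity-based screen the abstract starts from can be sketched directly; the Hazard Index is just a sum of exposure-to-limit ratios. The control levels below happen to be illustrative values, not an endorsement of any jurisdiction's limits:

```python
# Hazard Index screen under the additivity assumption the abstract critiques.
def hazard_index(exposures, limits):
    """HI = sum(E_i / L_i); HI > 1 flags the mixture for concern."""
    return sum(e / limits[chem] for chem, e in exposures.items())

# Administrative control levels (ppm), illustrative values only:
limits = {"toluene": 20.0, "n-hexane": 40.0}
exposures = {"toluene": 12.0, "n-hexane": 18.0}   # measured workplace air, ppm

hi = hazard_index(exposures, limits)
print(f"HI = {hi:.2f} -> {'concern' if hi > 1 else 'acceptable'}")
```

The abstract's point is that when one chemical inhibits the other's metabolism, internal doses can rise faster than this linear sum implies, which is what the PBPK-based correction is meant to capture.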

P.29  The Role of Dietary Zinc in Cadmium Nephrotoxicity. Lin YS*, Caffrey JL, Ho WC, Bayliss D, Sonawane B; U.S. Environmental Protection Agency   Lin.Yu-Sheng@epa.gov

Abstract: Background: Animal studies have shown that cadmium (Cd) and zinc (Zn) are associated with increased and decreased renal risk, respectively. Goal: To examine the joint effect of Cd exposure and Zn intake on renal risk. Methods: The data were obtained from 5,205 adults aged 50 years and older from the Third National Health and Nutrition Examination Survey (NHANES III, 1988-94). Results: Logged urinary Cd was positively associated with albuminuria (odds ratio=1.29; p=0.01). Despite an apparent protective effect of Zn intake, this finding was not significant. However, when considered jointly with Cd, there was a significant inverse association between albuminuria and the Zn-to-urinary Cd ratio (p<0.01). Discussion: Whereas Cd is an important risk factor for albuminuria in older Americans, the role of Zn in moderating Cd-associated renal risk remains to be investigated.

P.30  Assessment of the Occupational Exposure Limit of p-Phenylenediamine for Hairdressers . Lin HC*, Guo YL, Wu KY; Institute of Occupational medicine and Industrial Hygiene, College of Public Health, National Taiwan University, Taipei, Taiwan   losingweights0418@gmail.com

Abstract: p-Phenylenediamine (PPD) is an ingredient of permanent oxidative hair colouring products that has been reported to cause severe allergic contact dermatitis, so hairdressers' exposures to PPD have been of concern. Previous studies have characterized the elicitation of allergic skin reactions as a function of the concentration applied to skin and the exposure duration. To protect hairdressers, PPD has been banned in Sweden and France, while many other jurisdictions, including the European Cosmetics Toiletry and Perfumery Association and the U.S., limit the PPD content of hair colouring products. There is currently no occupational exposure limit for PPD. The aim of this study was therefore to propose an occupational exposure level of PPD for hairdressers using health risk assessment. In a previous study, 16 volunteers were tested with patches containing 1%, 0.3%, 0.1% and 0.01% PPD in petrolatum for 15 min, 30 min and 120 min. These data were used for dose-response modeling with the Benchmark Dose Software; only the log-probit and log-logistic models fit the data set. A BMDL10 of 0.57 %*min was adopted, and an uncertainty factor of 20, accounting for inter-individual differences and database deficiencies, was used to derive a proposed exposure level of 1.2 ug/cm2 of PPD on skin over an 8-hr work shift.
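The final derivation step above is a point of departure divided by a composite uncertainty factor. Sketched numerically (the conversion from %*min to ug/cm2 depends on patch-test assumptions not given in the abstract, so the sketch stops at the adjusted point of departure):

```python
# Point-of-departure adjustment: BMDL divided by a composite uncertainty factor.
bmdl10 = 0.57   # %*min, from the log-probit / log-logistic fits
uf = 20         # inter-individual variability x database deficiencies

adjusted_pod = bmdl10 / uf
print(f"Adjusted point of departure = {adjusted_pod:.4f} %*min")
```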

P.31  The Relationship of Mercury Exposure, Omega-3 Intake, and Risk of Chronic Kidney Disease. LIN YS*, GINSBERG G, SONAWANE B; US EPA   Lin.yu-Sheng@Epa.gov

Abstract: Background: It remains unclear whether environmental exposure to mercury (Hg) is associated with increased renal risk, and whether omega-3 fatty acid (FA) intake could affect Hg nephrotoxicity. Goal: To examine the relation of chronic kidney disease (CKD) to blood Hg and omega-3 FAs. Methods: The data were obtained from 1,046 adults aged 40 yrs or older from the National Health and Nutrition Examination Survey 2003-4. Results: The adjusted odds ratio for increased CKD risk in the highest tertile of blood Hg compared with the lowest was 2.96 (95% confidence interval = 1.05-8.39). Despite the postulated role of omega-3 FAs in modulating Hg nephrotoxicity, there was only a marginal association between omega-3 FAs and CKD risk. Discussion: Hg exposure is associated with CKD risk; additional studies are needed to assess the role of omega-3 FAs and the sources, exposure routes, and forms of Hg responsible for Hg nephrotoxicity.

P.32  Workplace Environmental Exposure Level (WEEL) Methodology with Octamethylcyclotetrasiloxane (D4) as a Case Study. Parker AL*, Nance PM, Maier A; Toxicology Excellence for Risk Assessment   parker@tera.org

Abstract: Workplace Environmental Exposure Levels (WEELs) are health-based guide values for chemical stressors developed by a volunteer group of professional experts known as the WEEL Committee. The WEEL Committee is a collaborative initiative with the goal of promoting worker health protection through increased access to high-quality occupational exposure limits, enhancements in methods for establishing worker-health exposure guidelines, and education and training in occupational risk assessment methods. A WEEL is intended to be an airborne chemical concentration to which nearly all workers may be repeatedly exposed, for a working lifetime, without experiencing adverse health effects. WEELs are derived using scientifically sound, state-of-the-art risk assessment procedures and a multi-tiered review process. An extensive review of all available relevant information of sufficient quality is used in the development of a WEEL. The Committee considers developing WEELs only for chemical exposures for which no valid guidance exists. Candidate chemicals are identified from various sources, including the USEPA High Production Volume (HPV) lists and solicitations from stakeholders. A new stakeholder process allows interest groups or companies to request the development of a WEEL for a specific chemical of interest through the Occupational Alliance for Risk Science (OARS) initiative. The first stakeholder-sponsored WEEL, for octamethylcyclotetrasiloxane (D4), is in the final steps of this new process. The new WEEL and all other WEELs developed through OARS will be provided at no cost on the OARS website.

P.33  Workshop on lessons learned, challenges, and opportunities: The U.S. Endocrine Disruptor Screening Program. Patterson J*, Becker R, Borghoff S, Casey W, Dourson M, Fowle J, Hartung T, Holsapple M, Jones B, Juberg D, Kroner O, Lamb J, Marty S, Mihaich E, Rinckel L, Van Der Kraak G, Wade M, Willett C; 1,5,11 Toxicology Excellence for Risk Assessment (TERA); 2 American Chemistry Council; 3, 9, 15 Integrated Laboratory Systems (ILS); 4 National Institute of Environmental Health Sciences; 6 independent consultant; 7 Center for Alternatives to Animal Testing, Johns Hopkins University; 8 Battelle;10 Dow AgroSciences; 12 Exponent, Inc.; 13 The Dow Chemical Company; 14 ER2;16 University of Guelph; 17 Health Canada; 18 Humane Society of the United States   patterson@tera.org

Abstract: Fifty-two chemicals were recently screened using 11 Endocrine Disruptor Screening Program (EDSP) Tier 1 assays, and the data were submitted to the EPA for review. Over 240 scientists participated in a workshop on the EDSP in April 2013 to share lessons learned and experiences with the EDSP and to identify opportunities to inform ongoing and future efforts to evaluate the endocrine disruption potential of chemicals. The first session focused on the conduct and performance of the 11 Tier 1 assays. Speakers and workshop participants highlighted challenges in conducting the assays and solutions developed by the laboratories, as well as issues relevant to data interpretation. The second session focused on how to apply relevant information from the current Tier 1 battery to identify potential modes of action, and on the value of a weight of evidence (WoE) assessment for evaluating potential interactions with endocrine pathways. Presentations and discussions explored the development of critical systematic evaluation of existing data prior to implementation of Tier 2 testing, and the application of alternative data to replace Tier 1 assays. The third session provided perspectives on the future of endocrine screening and the promise of in vitro high-throughput analyses, toxicity pathways, and prediction models. A number of common themes and suggestions emerged from the extensive discussions, including the need for a critical review and update of current Tier 1 testing guidelines, the use of biomonitoring data for exposure-based prioritization, reduction of the number of animals used in testing, and use of a robust WoE approach to align available Tier 1 data with potency and exposure information to better inform decisions on Tier 2 testing.

P.34  Air Pollution Patterns May Modify the Effect of Weight Gain on Lung Function among Adolescents. Wu TT*, Chen LH, Ho WC, Lin MH, Pan SC, Fan KC, Chen PC, Wu TN, Sung FC, Lin RS; China Medical University   martinwu1006@gmail.com

Abstract: Lung function is a very important index of respiratory health. Weight gain and air pollution can both adversely affect lung function. The objective of this study is to assess whether air pollution patterns modify the effect of weight gain on reduced lung function. The study design was a retrospective birth cohort linking birth registry records (birth weight and gestational age) with a nationwide junior high school student respiratory health survey database in central Taiwan. Study subjects were a 10% random sample. For robust exposure assessment, subjects who had moved during follow-up were excluded from the analysis. Air pollution data, including SO2, CO, O3, NO2 and PM10, were collected by high-density Taiwan Environmental Protection Administration monitoring stations. Multiple regression models were used, adjusting for sex, age, height, weight, parental education level, family smoking, incense burning, exercise and temperature. Obesity was related to reduced lung function, and low birth weight had a similar effect. Obese adolescents who were born with low birth weight showed the greatest reduction in lung function. Furthermore, air pollution patterns may modify these effects. It is necessary to protect the public from the adverse effects of weight gain, especially considering the potential interaction with air pollution patterns.

P.35  Residential and Occupational Exposure to Wood Treating Operations and Risk of Non-Hodgkin Lymphoma: A Meta-Analysis. Williams BH*, Pierce JS, Glynn ME, Johns LE, Adhikari R, Finley BL; Cardno ChemRisk   Brenten.Williams@cardno.com

Abstract: There are hundreds of former and currently active wood treating facilities in the United States, and over time concerns have been raised regarding the potential chronic health effects associated with wood treating-related exposures. It has been suggested in the peer-reviewed literature that exposure to chemicals related to historical wood treating operations (in particular, pentachlorophenol [PCP]) is associated with an increased risk of non-Hodgkin lymphoma (NHL). To test the merits of this assertion, we conducted a systematic review of all published and unpublished analyses that report risk estimates for NHL in (1) residents of communities surrounding wood treating operations, (2) wood treating workers, and (3) non-wood treating workers who were exposed to chemicals associated with wood treating operations (creosote, coal tar and associated polycyclic aromatic hydrocarbons [PAHs], and PCP). A total of 12 studies, including independent cohort, record-linkage, and case-control studies, were included in the meta-analysis. Using a random effects model, meta-relative risks (meta-RRs) were calculated for each exposure group. The summary relative risk (meta-RR) for NHL overall was 1.31 (95% confidence interval [CI]: 0.93, 1.85). No statistically significant meta-RRs were observed among residents of communities in the vicinity of wood treating facilities (meta-RR=0.75; 95% CI: 0.37, 1.51); wood treating workers (meta-RR=1.89; 95% CI: 0.69, 4.12); workers exposed to coal tar, creosote, and associated PAHs (meta-RR=1.37; 95% CI: 0.80, 2.34); and workers exposed to PCP (meta-RR=1.61; 95% CI: 0.99, 2.62). Notably, many of the occupational studies, in particular those conducted among manufacturing workers, were limited by the inability to distinguish the potential confounding effects of contaminants, particularly polychlorinated dibenzo-p-dioxins (PCDDs), within chlorophenols. Nevertheless, there is no evidence in the studies reviewed that residential or occupational exposures related to wood treating operations increase the risk of NHL.
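Random-effects pooling of this kind is commonly done with the DerSimonian-Laird estimator; a minimal sketch follows (the abstract names a random effects model but not the specific estimator, and the input RRs and CIs below are hypothetical, not the twelve studies actually meta-analyzed):

```python
import math

# DerSimonian-Laird random-effects pooling of log relative risks.
def pooled_rr(rrs, cis, z=1.96):
    """rrs: study relative risks; cis: (lower, upper) 95% CIs.
    Returns (pooled RR, CI lower, CI upper)."""
    y = [math.log(r) for r in rrs]                            # log RRs
    se = [(math.log(hi) - math.log(lo)) / (2 * z) for lo, hi in cis]
    w = [1 / s**2 for s in se]                                # fixed-effect weights
    ybar = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    q = sum(wi * (yi - ybar)**2 for wi, yi in zip(w, y))      # Cochran's Q
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)                   # between-study variance
    w_re = [1 / (s**2 + tau2) for s in se]                    # random-effects weights
    mu = sum(wi * yi for wi, yi in zip(w_re, y)) / sum(w_re)
    se_mu = math.sqrt(1 / sum(w_re))
    return math.exp(mu), math.exp(mu - z * se_mu), math.exp(mu + z * se_mu)

rr, lo, hi = pooled_rr([1.2, 0.9, 1.6], [(0.8, 1.8), (0.5, 1.6), (1.0, 2.6)])
print(f"meta-RR = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

A CI that spans 1.0, as in the abstract's overall meta-RR of 1.31 (0.93, 1.85), is what "not statistically significant" means here.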

P.36  Residential and occupational exposure to wood treating operations and bladder cancer: A meta-analysis. Glynn ME*, Pierce JS, Williams B, Johns LE, Adhikari R, Finley BL; Cardno ChemRisk   Meghan.Glynn@cardno.com

Abstract: The wood treating industry has operated for over 100 years in the United States, with sites commonly operating for decades. Over time, concerns have been raised regarding the potential chronic health effects associated with wood treating-related exposures. In at least one case it has been suggested that there might be an association between risk of bladder cancer and exposure to chemicals associated with historical wood treating operations (e.g., creosote, coal tar and associated polycyclic aromatic hydrocarbons [PAHs], and pentachlorophenol [PCP]). A literature search was conducted to identify all published and unpublished analyses that reported risk estimates for bladder cancer in (1) residents of communities surrounding wood treating operations, (2) wood treating workers, and (3) non-wood treating workers who were exposed to chemicals associated with wood treating operations (e.g., creosote/coal tar/PAHs and PCP). A total of 18 studies, including independent cohort, record-linkage, and case-control studies, were included in the meta-analysis. Using a random effects model, meta-relative risks (meta-RRs) were calculated for each exposure group. The summary relative risk (meta-RR) for bladder cancer overall was 1.04 (95% confidence interval [CI]: 0.93, 1.17). No statistically significant meta-RRs were observed among residents of communities in the vicinity of wood treating operations (meta-RR=0.99; 95% CI: 0.73, 1.34); wood treating workers (meta-RR=1.11; 95% CI: 0.53, 2.04); workers exposed to coal tar, creosote, and associated PAHs (meta-RR=1.04; 95% CI: 0.86, 1.27); and workers exposed to PCP (meta-RR=1.00; 95% CI: 0.82, 1.23). In conclusion, the studies reviewed provided no evidence of an association between residential and occupational exposure to wood treating operations and an increased risk of bladder cancer.

Ecological Risk
 

P.37  Identifying regional features of temperature variability using cluster analysis and quantile regression applied to the daily surface level observations. Timofeev A.A.*, Sterin A.M.; RIHMI-WDC   arseni@developitbest.com

Abstract: Assessing extreme climate variability, which may cause unwanted weather anomalies, is very important; used appropriately, this information can reduce the losses caused by extreme weather events. As shown in our previous presentations, quantile regression avoids the drawbacks of traditional approaches and provides comprehensive data describing changes in the statistical distribution across the full range of quantiles. However, the more detailed the information, the more difficult it becomes to analyze and interpret. In earlier work we moved from linear process diagrams to colored cross-sections to visualize quantile trend values obtained at one location or distributed vertically along radiosonde soundings from the same weather station. Both methods are effective for separate examination of a limited number of weather stations. Here we introduce a new approach: use vectors of quantile trend values as input for cluster analysis, then plot the resulting clusters on a map as simple colored marks. This shows whether there are evidently similar points (weather stations) and how their similarity in variability changes matches their geographical locations. Any part of the distribution of interest can be selected through an appropriate choice of quantiles; for extreme-event analysis, for example, the top section of the distribution is of interest. Using this approach, we obtained striking results, with highly pronounced regional similarities in the distribution changes of certain parts of the statistical distribution of surface temperature. Quantile regression, combined with cluster analysis of its results, thus provides comprehensive information about changes in climate variability projected onto a geographical map.

P.38  Determining detection rates of environmental DNA sampling for monitoring the risk of invasive fish species. Song JW*, Small MJ; Carnegie Mellon University   jsong31@gmail.com

Abstract: Early detection of invasive species is critical to effective aquatic ecosystem risk management. A newly developed detection method is eDNA sampling: the analysis of water samples for species-specific environmental DNA (eDNA), DNA fragments released into the water, to infer the presence of the species. This technique promises improved detection sensitivity and specificity and reduced monitoring costs compared to traditional techniques. However, the use of eDNA sampling in decision-making frameworks is challenging due to the many uncertainties associated with the DNA technology and sampling methodology. These uncertainties have received particular attention in the use of eDNA sampling for detection of invasive Asian carp species in the Great Lakes region, where many costly and ineffective risk management efforts have been undertaken on the basis of eDNA evidence. In this paper, the uncertainties in the relationship between fish presence and eDNA presence in a river system are explored. A one-dimensional advective-reactive-dispersive transport model is integrated with a fish dispersal model to determine the eDNA concentration profile, spatially and temporally, in a specified river system. The model can then evaluate the relationship between fish density and eDNA concentration and the potential detection rates in each section of the river. The results suggest that under high flow conditions, such as in major river channels, there is a high likelihood of false negatives due to the washout of eDNA. The potential for false positives is higher under low flow conditions, such as in slower-moving backwater areas, where the persistence of eDNA can influence the results. A stronger understanding of the detection rates of eDNA sampling will help inform improved sampling methodologies and better integration with decision-making frameworks.
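The transport component described above can be sketched with an explicit upwind finite-difference scheme for one-dimensional advection, dispersion, and first-order decay of eDNA shed from a fixed source. This is a generic illustration, not the authors' model, and all parameter values are assumed:

```python
# Minimal 1-D advection-dispersion-decay solver (explicit upwind scheme)
# for eDNA shed continuously at one river cell. Illustrative parameters.
def edna_profile(n=100, dx=10.0, dt=1.0, u=0.05, d=0.5, k=1e-4,
                 source_cell=10, source_rate=1.0, steps=3000):
    """u: flow velocity (m/s), d: dispersion (m^2/s), k: eDNA decay (1/s).
    Returns the concentration in each of n cells of width dx after `steps`
    time steps of size dt (zero-concentration boundaries)."""
    c = [0.0] * n
    for _ in range(steps):
        new = c[:]
        for i in range(1, n - 1):
            adv = -u * (c[i] - c[i - 1]) / dx                      # upwind advection
            disp = d * (c[i + 1] - 2 * c[i] + c[i - 1]) / dx**2    # dispersion
            new[i] = c[i] + dt * (adv + disp - k * c[i])           # plus decay
        new[source_cell] += source_rate * dt                       # continuous shedding
        c = new
    return c

profile = edna_profile()
peak = max(range(len(profile)), key=lambda i: profile[i])
print(f"peak at cell {peak}; upstream c[5]={profile[5]:.3g}, "
      f"downstream c[15]={profile[15]:.3g}")
```

Even this toy version reproduces the asymmetry the abstract discusses: positive flow carries eDNA downstream of the fish (washout), while upstream concentrations fall off within a few cells.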

P.40  Long-term variability of wind regime in the atmosphere over the Arctic . Agurenko AO*, Khokhlova AV; RIHMI-WDC   agualina@ya.ru

Abstract: Meteorological observations are very important in estimating climate risks, i.e. in determining the frequency and intensity of potentially hazardous events, as well as in designing efficient disaster mitigation strategies. Countering hazardous natural processes can be effective only with knowledge of both their origin and evolution. In this connection, research on long-term trends in climate parameters on global and regional scales under a changing climate is of continuing interest. The aim of this work is to study long-term wind speed variability in the atmosphere over the northern polar region. The climate of the Arctic is the product of interactions among a large range of physical, chemical, and radiative processes involving ocean, sea ice, land surface, snow cover, clouds, and aerosols. Many of these interactions operate via atmospheric circulation, and wind speed and its variability significantly determine the circulation regime and the transport of substances. Long-term upper-air observations from the IGRA (Integrated Global Radiosonde Archive) dataset were used in the study. The IGRA consists of radiosonde and pilot balloon observations at over 1,500 globally distributed stations, with observations available for standard, surface, tropopause and significant levels. The work is based on the analysis of long-term wind speed time series from observations at over 50 stations located at 60-80°N for the period 1972-2010. Series of mean monthly and maximum wind speeds were constructed for this period at standard pressure surfaces from the ground to the 30 hPa level. The time series were analyzed, linear trend coefficients were determined, and areas of minimum and maximum changes in mean and maximum wind speeds were identified. The results are of importance in analyzing meteorological risks in order to develop efficient strategies to mitigate disaster impacts.
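The trend-coefficient step above amounts to fitting an ordinary least-squares line through each station's monthly series. A minimal sketch on a synthetic series (the data below are generated, not IGRA observations):

```python
import math

# OLS linear trend of a time series, as used for the wind-speed trend maps.
def linear_trend(values):
    """Return (slope per time step, intercept) of the OLS line."""
    n = len(values)
    xs = range(n)
    xbar = sum(xs) / n
    ybar = sum(values) / n
    sxy = sum((x - xbar) * (y - ybar) for x, y in zip(xs, values))
    sxx = sum((x - xbar) ** 2 for x in xs)
    slope = sxy / sxx
    return slope, ybar - slope * xbar

# Synthetic monthly means: 0.01 m/s per month trend plus a seasonal cycle,
# 468 months = 39 years (1972-2010).
series = [10 + 0.01 * t + 2 * math.sin(2 * math.pi * t / 12)
          for t in range(468)]
slope, intercept = linear_trend(series)
print(f"trend = {12 * slope:.3f} m/s per year")
```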

P.41  Metacommunity Resilience of the Amazon Tropical Forest Facing Human and Natural Stressors. Convertino MC*, Munoz-Carpena RMC, Kiker GK, Perz SP; University of Minnesota   matteoc@umn.edu

Abstract: Climate extremes and rapid urbanization are stressors that both shape and threaten ecosystems. Thus, questions arise about future scenarios for ecosystems and about how we as a society can potentially steer ecosystem evolution considering natural variability and human needs. Here we reproduce biodiversity patterns of the Amazon’s MAP (Madre de Dios - Acre - Pando) tropical rainforest, affected by the construction of the transoceanic highway and by climate change, with a neutral metacommunity model at different scales and resolutions. The influence of environmental variability on species loss and richness increases with scale and decreases with the heterogeneity of tree clumpiness. At the ecosystem scale, drought sensitivity is 37% higher than at the plot scale, where the difference in scales is of seven orders of magnitude. Conversely, the anthropic disturbance posed by the road is much larger at the plot scale and undetectable at the ecosystem scale because dispersal is not affected. A non-trivial pattern is found between species cluster size and persistence time. Bimodal distributions of clumpiness result in highly stable species richness and persistence time distributions. The species persistence time follows a power-law function whose exponent increases with the magnitude of disturbance. This power law is preserved, together with the distribution of tree cover, despite changes in the shape of the species richness distribution. We propose the product of the persistence time, its probability of occurrence, and the average species cluster size as a measure of the metacommunity risk of ecosystems as a function of their resilience. A spatial resilience index, defined as the ratio of metacommunity risks in the disturbed and undisturbed cases, is calculated to identify the most resilient communities. Our results show that societal development pressure should consider the ecosystem’s tree distribution so as to minimize biodiversity loss and maximize persistence time. The spatial resilience index can be used to plan agricultural and urban expansion that preserves resilient communities.
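The risk and resilience measures proposed above can be sketched in a few lines; all numerical values below are illustrative stand-ins, not results from the study.

```python
# Sketch (not the authors' code): metacommunity risk as the product of
# persistence time, its probability of occurrence, and mean species
# cluster size; spatial resilience index as the ratio of risks in the
# disturbed vs. undisturbed case. All numbers are illustrative.

def metacommunity_risk(persistence_time, probability, mean_cluster_size):
    """Risk measure proposed in the abstract: T * P(T) * <cluster size>."""
    return persistence_time * probability * mean_cluster_size

def spatial_resilience_index(risk_disturbed, risk_undisturbed):
    """Ratio of metacommunity risks (disturbed / undisturbed)."""
    return risk_disturbed / risk_undisturbed

# Illustrative values for one community, with and without disturbance:
risk_u = metacommunity_risk(persistence_time=120.0, probability=0.4, mean_cluster_size=35.0)
risk_d = metacommunity_risk(persistence_time=80.0, probability=0.4, mean_cluster_size=30.0)
print(spatial_resilience_index(risk_d, risk_u))  # < 1: disturbance lowered the risk measure
```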

P.42  Design of Ecosystem Monitoring Networks by Value of Information Optimization: Experiment in the Amazon. Convertino M*, Munoz-Carpena R, Kiker G, Perz S; University of Florida (on leave), and Emerging Pathogens Institute at the University of Florida   matteoc@umn.edu

Abstract: Effective monitoring of ecosystems is crucial for assessing and possibly anticipating shifts, quantifying ecosystem services, and making decisions based on these shifts and services. The selection of monitoring sites is typically suboptimal, following local stakeholder or research interests that do not capture the whole ecosystem’s patterns and dynamics. Here we propose a novel model for the design of optimal monitoring networks for biodiversity based on the concept of the value of information (VoI). We consider the trinational frontier among Brazil, Peru, and Bolivia as a case study. Using a multiresolution texture-based model, we estimate species richness and turnover on satellite imagery as a function of different sets of information coming from plot data organized in network topologies. The optimal monitoring network is the one that minimizes the integrated VoI, defined as the variation of the VoI over the 28 years considered. This is equivalent to minimizing the sum of the species turnover of the ecosystem. We identify the small-world network as the optimal and most resilient monitoring network, whose nodes are the hotspots of species richness. The hotspots are identified as the sites whose VoI is the highest over the whole period considered. Hence, the hotspots are the most valuable communities for inferring biodiversity patterns and the most ecologically valuable according to the richness-resilience hypothesis. The small-world monitoring network has an accuracy ∼50% higher than other network topologies in predicting biodiversity patterns. The network that results from the optimal trade-off between the value of data and their uncertainty and relevance has deep implications for understanding ecosystem function and for management decisions. Hence, because of the optimal integration of environmental, social, and economic factors, the model allows sustainable monitoring and planning of biodiversity for the future.

Economic & Benefit Analysis
 

P.43  Can game theory predict human behavior on safety? From the viewpoint of an economic experiment. Makino R*, Takeshita J; AIST   ryoji-makino@aist.go.jp

Abstract: Estimating risk through Probabilistic Risk Analysis (PRA) has primarily been a non-behavioral, physical engineering approach. To assess the reliability of a system, however, the behavioral dimension must be taken into account. The theoretical model of Hausken (2002), which merges PRA and game theory, considers safety as a “public good,” a notion from economics [1]. It is well known in economic theory that the supply of public goods is lower than its optimal level without corrective interventions. Using a game-theoretic analysis, Hausken (2002) described the situation where the safety level remains low, meaning that the supply of safety as a public good is lower than its optimal level. Although his model is valuable, it has not yet been empirically validated. We validate Hausken’s model using the techniques of experimental economics. We basically follow the method for public-goods experiments employed by Fehr and Gachter (2002) [2]. In our study, 48 participants take part in the experiments, divided into 12 groups of n = 4 participants. They work virtually on PCs that are inter-connected in series or parallel in a computerized laboratory. We establish safety rules for the virtual work, and all participants are asked to follow these rules while they work. The experimental design is such that if some, not necessarily all, of the participants follow the safety rules, the risk of occupational/industrial accidents remains low, while observing the rules imposes a cost on the rule followers. The costs are designed to give participants an incentive to break the safety rules. We examine the conditions under which participants break the safety rules or, to put it differently, take unsafe actions. [1] Hausken, K. (2002), “Probabilistic Risk Analysis and Game Theory,” Risk Analysis 22, 17-27. [2] Fehr, E. and Gachter, S. (2002), “Altruistic Punishment in Humans,” Nature 415, 137-140.
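As a hedged illustration of why the series/parallel wiring of the workers matters, the sketch below computes system safety from each worker's compliance probability; the probabilities are assumed values, and this is not the experiment's actual payoff design.

```python
# Illustrative sketch (assumed parameters, not the experiment's design):
# how system safety depends on whether n = 4 workers, connected in
# series or in parallel, each follow the safety rules with probability p.
from math import prod

def series_safety(ps):
    # In a series system, safety requires every worker to comply.
    return prod(ps)

def parallel_safety(ps):
    # In a parallel system, one complying worker is enough.
    return 1 - prod(1 - p for p in ps)

ps = [0.8] * 4  # each of 4 workers complies with probability 0.8
print(series_safety(ps))    # 0.8**4 = 0.4096
print(parallel_safety(ps))  # 1 - 0.2**4 = 0.9984
```

Under these assumptions, compliance incentives matter far more in a series configuration, where a single rule-breaker compromises the whole system.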

P.44  Cost-effectiveness of the decontamination activities in the evacuation zones due to the Fukushima nuclear accident. Oka T*; Fukui Prefectural University   oka@fpu.ac.jp

Abstract: Under the Act on Special Measures Concerning the Handling of Radioactive Pollution, the areas contaminated with radioactive materials due to the accident at the Fukushima Daiichi Nuclear Power Station are divided into two categories: i) the Special Decontamination Area, which consists of the areas within a 20 km radius of the Fukushima Daiichi Power Station and of the areas where the annual cumulative dose could exceed 20 mSv, and ii) the Intensive Contamination Survey Area, in which additional exposure doses over 1 mSv/y were observed. We have estimated the costs and effectiveness of the decontamination works being implemented in the Special Decontamination Area by specifying the decontamination methods for houses and buildings, agricultural lands, forests, and roads, by assuming efficiencies for these methods, and on the basis of land-use data for each 1 km mesh. The effects of decontamination appear as reductions in the air dose rate, from which reductions in the annual cumulative dose are calculated assuming a value for the occupancy/shielding factor. The return ratio of the evacuated people as a function of annual cumulative dose has been estimated from questionnaire surveys of the municipalities in the affected area. Taking this into account, the effects of decontamination are represented by reductions in the cumulative dose weighted by the return ratio, measured by the integral of the return ratio with respect to dose, from the dose after decontamination to the dose before it. These values are converted into reductions in the loss of life expectancy due to exposure over the next 30 years, which, combined with the costs of decontamination, yield values of cost per life-year saved. The resulting cost per life-year saved averages 5 billion yen over the total area (ranging from 10^8 to 10^12 yen) when the occupancy/shielding factor is 0.6. The relation of the cost-effectiveness to the level of contamination depends on the value of the occupancy/shielding factor.
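The effectiveness measure described above can be sketched as follows; the return-ratio function, the life-year conversion factor, and the cost figure are hypothetical stand-ins, not the study's estimates.

```python
# Sketch of the abstract's effectiveness measure (all functions and
# numbers are hypothetical): the benefit of decontamination is the
# integral, from the post- to the pre-decontamination annual dose, of
# the return ratio with respect to dose; dividing the decontamination
# cost by the life-years saved yields cost per life-year saved.

def return_ratio(dose_mSv):
    # Hypothetical: fewer evacuees return as the annual dose rises.
    return max(0.0, 1.0 - dose_mSv / 50.0)

def dose_reduction_benefit(dose_after, dose_before, steps=1000):
    # Trapezoidal integral of the return ratio between the two doses.
    h = (dose_before - dose_after) / steps
    total = 0.0
    for i in range(steps):
        d0 = dose_after + i * h
        total += 0.5 * (return_ratio(d0) + return_ratio(d0 + h)) * h
    return total  # effective mSv/yr averted, weighted by returnees

benefit = dose_reduction_benefit(dose_after=5.0, dose_before=20.0)
life_years_saved = benefit * 1e-3   # hypothetical conversion factor
cost_yen = 2.0e9                    # hypothetical decontamination cost
print(cost_yen / life_years_saved)  # yen per life-year saved
```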

P.45  Evaluating the timing of benefits from abatement of short- and long-lived climate change species. Zheng JM*, Gilmore EA, Sarofim MC; University of Maryland   jzheng12@umd.edu

Abstract: Chemical species that affect the climate have different atmospheric lifetimes. Black carbon (BC) persists in the atmosphere for only days to weeks, while carbon dioxide (CO2) can persist for up to millennia. When the objective of a policy is to reduce overall damages, a metric that compares the benefits of abatement of two or more climate forcers can be useful. However, it is difficult to compare BC with other greenhouse gases using the commonly applied metrics, because those metrics take a flawed approach to the lifetimes of different species and the associated damages. Here, we compare the relative economic benefits of abatement of a short-lived (BC) and a long-lived (CO2) species. To evaluate the benefits of abatement, we first translate a marginal emission into a temperature change using simplified analytical models. We then convert the temperature change into economic damages using a quadratic damage function. We define the marginal emission for a range of climate change emission scenarios, specifically the representative concentration pathways (RCPs). For each scenario, we show how the relative benefits of abatement evolve as a function of time. We then discuss how to apply this damage metric to environmental policies such as emissions trading, and the associated uncertainties.
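The two-step chain described above (emission pulse to temperature change to quadratic damages) can be sketched in a simplified form; the lifetimes, potency values, and damage coefficient below are illustrative assumptions, not the paper's.

```python
# Minimal sketch (all parameter values are illustrative, not the
# paper's): a unit emission pulse raises temperature in proportion to
# the forcer's remaining atmospheric fraction, which decays with its
# lifetime; damages are quadratic in the temperature change.
import math

def temperature_change(t_years, lifetime_years, potency):
    # Remaining fraction of the pulse after t years, scaled by potency.
    return potency * math.exp(-t_years / lifetime_years)

def damages(delta_T, alpha=1.0):
    # Quadratic damage function: D = alpha * (delta T)^2.
    return alpha * delta_T ** 2

# Short-lived BC (lifetime ~ weeks) vs. long-lived CO2 (~ centuries):
for t in (0.1, 1, 10, 100):
    d_bc = damages(temperature_change(t, lifetime_years=0.04, potency=900.0))
    d_co2 = damages(temperature_change(t, lifetime_years=300.0, potency=1.0))
    print(t, d_bc, d_co2)  # BC damages vanish quickly; CO2 damages persist
```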

P.46  Real systematic risk for modeling weighted prices as an asset for decision making. Anyika E*, Weke PO, Achia TN; University of Nairobi   mmnk55378@gmail.com

Abstract: In this paper, the notion that real non-diversifiable (systematic or market) risk does not exist, or is simply one minus diversifiable (non-systematic) risk, is investigated. Real systematic risk is then developed from its basic principle of not being diversifiable, unlike non-systematic risk, which can be reduced by increasing the number of assets in a portfolio. Systematic risk, together with non-systematic risk, is then weighted against the expected returns of assets to determine the maximum returns of these assets at minimum risk. A Real Risk Weighted Pricing Model is thus developed that can postulate the expected returns and risks of assets in the present and the near future. This enables capital allocation, investment, and financial decisions to be determined accurately.
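The diversification argument underlying the abstract can be illustrated with the standard result for an equally weighted portfolio; the variance and covariance values below are illustrative.

```python
# Illustrative sketch of the diversification principle the abstract
# builds on: for n equally weighted assets with common variance and
# pairwise covariance, portfolio variance = var/n + (1 - 1/n)*cov.
# The first (non-systematic) term vanishes as n grows; the covariance
# term (the systematic risk) cannot be diversified away.

def portfolio_variance(n, var, cov):
    return var / n + (1.0 - 1.0 / n) * cov

var, cov = 0.04, 0.01  # hypothetical asset variance and pairwise covariance
for n in (1, 10, 100, 10000):
    print(n, portfolio_variance(n, var, cov))  # approaches cov = 0.01
```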

P.47  Design of Institutional Mechanisms for Effective Risk Management: Assignment of Responsibility in the Case of Waste Disposal. Farber GS*; US EPA   farber.glenn@epa.gov

Abstract: Policy schemes for disposal of hazardous materials are designed to facilitate risk reduction by reducing exposure to toxic and radioactive hazards. Why do some policy schemes for disposal of hazardous materials work effectively, while others function poorly and fail to mitigate those risks? A great deal of attention is paid to engineering design of waste management units and structures, but insufficient attention is given to the institutional mechanisms and incentives in designing policies. This paper examines the role of several of these mechanisms in obtaining effective risk management outcomes, focusing on the schemes for assigning responsibility for waste disposal.

Engineering & Infrastructure
 

P.48  Exploring the concept of transportation systems risks. Chikaraishi M*, Fischbeck P, Chen M; The University of Tokyo   chikaraishi@ut.t.u-tokyo.ac.jp

Abstract: In present-day society, transport modes such as transit and automobiles are indispensable tools required to maintain a minimum level of wellbeing. Their importance spans the urban-rural dimension. In the U.S., national, state, and local governments have generally assumed the responsibility of developing and maintaining transportation infrastructure (e.g., highways and mass transit rail). The balance between mass transit availability and private vehicle use varies dramatically. In areas with effective mass transit systems, personal vehicle use can, in many cases, be considered voluntary, making the associated risk voluntary as well. However, in areas without mass transit, personal vehicle use is a necessity, and a large portion of the associated risk is essentially involuntary. Despite the varying character of personal vehicle risks, most traffic risk studies have focused solely on personal vehicle risks (e.g., the number of fatalities or injuries per unit exposure of vehicle travel). In this study, we first propose an alternative transportation risk measure that focuses on the accident risks of the entire system. We argue in particular that the proposed system-risk measure is much more appropriate for policy discussions over the allocation of scarce resources to improve safety. Understanding the impact of shifting personal vehicle risks from involuntary to voluntary by making mass transit more readily available changes the framing of the problem. In this study we compare differences in vehicle and system risks across multiple exposure measures (i.e., per mile, per trip, per driver) for urban, suburban, and rural areas using data from the National Household Travel Survey, the Fatality Analysis Reporting System, and the American Community Survey.

P.49  Recovery estimation model of thermal power plants damaged by complex hazards -Case of the 2011 Tohoku-oki Earthquake. Yuyama A*, Kajitani Y; Central Research Institute of Electric Power Industry   yuyama@criepi.denken.or.jp

Abstract: The 2011 Tohoku-oki Earthquake and the resulting nuclear accidents at the Fukushima Daiichi power plant caused power shortages in Japan. Eighteen thermal power plants (39 units) located in the Kanto and Tohoku regions were affected by the earthquake and tsunami, and, as a result, about 30% of the thermal power generation capacity in East Japan was lost. This was the most severe accident involving thermal power plants in Japan and became one of the main causes of the power crisis. In order to secure a stable power supply, restoration of the damaged plants was carried out immediately after the disaster, along with other remedial measures such as the installation of emergency power sources. All affected plants except one scrapped unit had resumed operation by April 2013. Clarifying the damage and recovery mechanisms of thermal power plants subjected to such large-scale, complex hazards is necessary for analyzing power shortage risk in future disasters. Thus, we examine the relationships between recovery duration and various factors such as observed seismic ground motion, tsunami height, failure mode, and restoration policy. In addition to these internal factors, it is shown that damage to infrastructure the plants depend on, such as disruption of the industrial water supply, could delay restoration.

P.50  Constraint analysis for siting solar energy projects. Reid R, Loftis B, Dwyer S*; Kleinfelder, Inc.   sdwyer@kleinfelder.com

Abstract: A risk analysis methodology (constraints analysis) was developed to evaluate conditions affecting site selection for ground-mounted solar photovoltaic (PV) systems. Utility companies active in the solar market have applied this methodology in their site selection efforts to evaluate environmental, engineering, and regulatory constraints that could render a site economically or physically infeasible for development. The constraints analysis addresses up to 16 characteristics for a given site, including flooding, presence of jurisdictional waters, threatened and endangered species, sensitive habitats, regulatory environment, topography, land ownership, zoning, site access, geotechnical conditions, and distance to electrical transmission infrastructure. The primary goals of the constraints analysis are to optimize the allocation of capital and to minimize capital at risk. Presently, the constraints analysis tool is largely qualitative and relies on subjective judgments regarding each site characteristic. Approaches to advancing the constraints analysis through the use of advanced analytical tools, such as multi-criteria decision analysis and GIS, will be discussed.

P.51  Hydraulic Fracturing Failure Rates – Key to Understanding Actual Risks. Pawlisz AV*; Conestoga-Rovers & Associates   apawlisz@craworld.com

Abstract: Extraction of natural gas deposits via hydraulic fracturing (fracking) has grown at an unprecedented rate in the United States and worldwide. For various reasons, this method of natural resource retrieval has met considerable opposition from the regulatory community and the public. One of the sensitive issues is the potential for adverse impacts to the environment and human health, particularly with respect to groundwater extraction, drinking water pollution, deep chemical injection, well failures, blowouts, on-site spills, air emissions, transport accidents, and noise. This presentation compiles the most recent data on incident/accident/spill/release rates published by the industry, government agencies, and the open literature. The failure data are used to conduct a predictive risk assessment in which the calculated odds ratios are compared to those for conventional hydrocarbon extraction methods. The overall objective is to provide insight into how fracking compares to other drilling and oil/gas operations in terms of the potential for overall environmental impacts.

Give me some numbers! Tox and uncertainty values
 

P.52  Provisional Advisory Level (PAL) Development for Superwarfarins (Brodifacoum and Bromadiolone). Stewart D*, Glass-Mattie D, Dorman D, McConnell E, Adeshina F; University of Tennessee and Oak Ridge National Laboratory   dstewart@utk.edu

Abstract: PAL values developed for hazardous materials by the US EPA represent general public emergency exposure limits for oral and inhalation exposures, corresponding to three severity levels (1, 2, and 3) for 24-hr, 30-d, 90-d, and 2-yr durations. PAL 1 represents the threshold for mild effects; PAL 2 represents the threshold for serious, irreversible, or escape-impairing effects; PAL 3 represents the threshold for lethal effects. PALs have not been promulgated, nor have they been formally issued as regulatory guidance. They are intended to be used at the discretion of risk managers in emergency situations when site-specific risk assessments are not available. The mention of trade names does not imply EPA endorsement. PAL values were developed according to SOP and QAPP requirements. Brodifacoum (CAS No. 56073-10-0) and bromadiolone (CAS No. 28772-56-7) are both members of the newer generations of anticoagulant rodenticides collectively named superwarfarins. Anticoagulant rodenticides have historically been used as effective means of controlling populations of mice, rats, and other rodents in urban and agricultural settings. All anticoagulants act by blocking the coagulation cascade, which can cause free bleeding that can be fatal. Superwarfarins work by inhibiting vitamin K1 epoxide reductase, which leads to a depletion of vitamin K1 and impairment of blood clotting ability. In humans and animals, vitamin K1 can be administered after ingestion as an antidote to prevent free bleeding. Brodifacoum and bromadiolone are absorbed quickly, distributed primarily to the liver, and excreted in the feces, mostly unchanged. Oral PAL values were developed using human case reports and animal data. Oral PAL 1, 2, and 3 values are 0.017, 0.16, and 1.8 mg/L for 24 hours and NR (not recommended), 0.0025, and 0.0075 mg/L for 30 days. Values are not recommended for 90-day and 2-year oral exposures and all inhalation durations due to insufficient data.

P.54  Probabilistic Cancer Risk Assessment for Aflatoxin B1 with Bayesian Statistics Markov Chain Monte Carlo Simulation. Liu SY*, Chang CS, Chung YC, Chen CC, Wu KY; National Taiwan University   r01841027@ntu.edu.tw

Abstract: Aflatoxins are found in nuts, peanuts, corn, spices, traditional Chinese medicine, maize, and rice. In particular, aflatoxin B1 has been shown to induce liver cancer (hepatocellular carcinoma, HCC) in many animal species and is classified as a human carcinogen by IARC. Exposure to aflatoxin B1 through food consumption is considered a risk factor for HCC and could act synergistically with hepatitis B virus infection. However, the available data on residues in foods are very limited, and the intake rates of foods containing aflatoxin B1 are very uncertain. Therefore, the aim of this study was to perform a probabilistic cancer risk assessment for aflatoxin B1 with Bayesian statistics coupled with Markov chain Monte Carlo simulation (BS-MCMC) to reduce uncertainty in the distributions of aflatoxin B1 residues and intakes. The aflatoxin B1 residue data were cited from official reports of routine monitoring data published by the Taiwan Food and Drug Administration. Questionnaires were used to collect the frequency of consumption of foods containing aflatoxin B1 from 124 study subjects. A cancer slope factor, 0.128 (g/kg/day)-1, was assessed with the Benchmark Dose Software and linear extrapolation. These data were used as prior information for BS-MCMC modeling. Our results reveal that the cancer risk was 2.64±2.07 x 10-7 for the HBsAg(-) population and 6.75±5.29 x 10-6 for the HBsAg(+) population. These results suggest that reduction of aflatoxin B1 exposure is necessary for the HBsAg(+) population.
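The deterministic core of this risk calculation (before the Bayesian MCMC treatment of uncertainty) can be sketched as follows; the intake value is hypothetical, while the slope factor is the one reported above.

```python
# Sketch of the deterministic core of the calculation (the study itself
# fits full posterior distributions with Bayesian MCMC): excess lifetime
# cancer risk = cancer slope factor x chronic daily intake, via linear
# low-dose extrapolation. The intake below is a hypothetical value.

CSF = 0.128  # cancer slope factor from the abstract, (g/kg/day)^-1

def excess_cancer_risk(daily_intake, csf=CSF):
    # Linear low-dose extrapolation: risk = CSF * intake.
    return csf * daily_intake

# Hypothetical chronic daily intake of aflatoxin B1:
print(excess_cancer_risk(2.0e-6))  # on the order of 1e-7
```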

P.55  Potential impacts of uncertainty in the C8 Science Panel exposure assessment for perfluorooctanoate. Avanasi Narasimhan R*, Shin HM, Vieira VM, Bartell SM; UCI, UCI, UCI, UCD   ravanasi@uci.edu

Abstract: The C8 Health Project is a cross-sectional epidemiologic study of 69,030 people who were environmentally exposed to perfluorooctanoate (PFOA) near a major U.S. fluoropolymer production facility located in West Virginia. A previously published retrospective exposure assessment model (including PFOA release assessment, integrated fate and transport modeling, and dose reconstruction) predicts the blood serum PFOA concentration for 43,360 non-occupationally exposed residents from 1951-2008; these predictions were validated against 2005-2006 serum PFOA measurements, which are available for every participant (Shin et al., 2011). The fate and transport model that predicts the PFOA water concentration in the six public water districts (PWD) utilizes a number of uncertain physiochemical and hydrogeological parameters. The aim of the present study is to evaluate the extent to which uncertainty and spatial heterogeneity in the water concentration predictions could influence the serum predictions and the relative ranking of exposures for individuals in a variety of epidemiologic studies relying on the retrospective exposure estimates. Using Monte Carlo simulation, we perturb the individual PWD PFOA water concentration for every year by randomly sampling from lognormal distributions centered on the original predicted concentrations. We evaluate the impacts of uncertainty by comparing the Spearman rank correlation coefficient between the predicted and measured serum concentrations for each iteration, and by evaluating the similarity of the iterated serum predictions within each year. Preliminary results suggest that random variability/uncertainty in historical water concentrations has little impact on the validity of the serum predictions as measured by comparison with the 2005-2006 serum measurements. We are now evaluating the potential impacts of systematic errors in the water concentration predictions, using autoregressive correlation structures and shifted distributions.
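A single iteration of the Monte Carlo perturbation described above might look like the following sketch; all concentrations and the geometric standard deviation are made-up values, not the study's data.

```python
# Sketch of one iteration of the uncertainty analysis (simplified, with
# made-up numbers): perturb each predicted water concentration by
# sampling from a lognormal distribution centred on the original
# prediction, then compare predicted and "measured" serum ranks with
# Spearman's rank correlation.
import math, random

def spearman_rho(x, y):
    # Spearman rank correlation via 1 - 6*sum(d^2)/(n*(n^2-1)); no ties.
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0] * len(v)
        for rank, i in enumerate(order):
            r[i] = rank
        return r
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

random.seed(0)
predicted = [1.0, 2.5, 4.0, 8.0, 16.0]   # hypothetical predictions
# Lognormal noise centred on each prediction (geometric standard
# deviation of 1.5, an assumed value):
perturbed = [p * math.exp(random.gauss(0, math.log(1.5))) for p in predicted]
measured = [1.2, 2.0, 5.0, 7.5, 20.0]    # hypothetical measurements
print(spearman_rho(perturbed, measured))
```

Repeating the perturbation step many times and tracking the distribution of rho would indicate how robust the exposure ranking is to this source of uncertainty.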

Methods, Models & Data: Potpourri
 

P.56  Proposing a framework of QAAR approaches for predicting the toxicity of chemical substances: A case study on predicting and extrapolating the missing NOEL values. Takeshita J*, Gamo M; National Institute of Advanced Industrial Science and Technology (AIST)   jun-takeshita@aist.go.jp

Abstract: We propose a Quantitative Activity–Activity Relationship (QAAR) model to predict unknown toxicity values of chemical substances from existing animal testing data, namely acute oral toxicity and 28-day repeated dose toxicity studies. In view of the global movement to reduce animal testing in assessing and managing chemical risk, the OECD has encouraged the use of statistical methods for predicting the toxicity of chemical substances. One of the most popular such methods is the Quantitative Structure-Activity Relationship (QSAR). The Quantitative Activity-Activity Relationship (QAAR), on the other hand, was introduced to estimate unknown toxicity values from the relationships between different toxicity endpoints. For example, suppose that for a target substance in vivo data exist for some endpoints but not all; when the toxicity of every endpoint is needed, QAAR works well, since it enables us to predict any endpoint's toxicity from the existing in vivo data. When we deal with an existing substance, we often face this situation, since a fair amount of literature information on the substance is available. In this study, we first develop a QAAR using covariance structure analysis. Our model is based on correlations among the organ-specific NOEL values included in the training set. The major advantage of the model is that it enables estimation with confidence intervals. Secondly, we predict the missing NOEL values of substances for which NOEL data from 28-day repeated dose toxicity studies are available for some organs but not all. Finally, we extrapolate all NOEL values of substances that have only acute oral toxicity studies, from their LD50 values.

P.57  Comparative study of risk with nursing work in Japan and China. Maeda Y*, Marui R, Yamauchi H, Yamaki N; Shizuoka University   tymaeda1@ipc.shizuoka.ac.jp

Abstract: Risks associated with nursing work, particularly medical incidents, in hospitals in China and Japan were compared. Data on medical incidents in Japan were obtained from the 2011 survey of 1,275 Japanese hospitals operated by the Japan Council for Quality Health Care. For China, a questionnaire survey was conducted from December 2012 to January 2013 among 631 nurses at Nanjing Drum Tower Hospital. As a result, the situations related to medical incidents, the factors behind them, the length of service of the nurses who discovered them, and the seasons in which incidents occurred frequently were common to Japan and China, whereas the frequency of medical incidents differed. In addition, satisfaction with nursing work schedules was investigated through several questions in the Chinese survey. The satisfaction level with the schedule was very high on average; however, dissatisfaction was also found for some questions. The independence of medical incident reporting from satisfaction with scheduling was tested. For some questions, a significant relationship between dissatisfaction with the schedule and the frequency of medical incidents was obtained. This suggests that medical incidents are related to the busyness of nursing work.

P.58  Environmental attitudes and behaviours of university students: A case study at a Chilean university. Heyl ME*, Moyano E, Cornejo F, Cifuentes LA; Faculty of Engineering, Pontifical Catholic University of Chile   marianne.heyl@gmail.com

Abstract: Encouraging the adoption of pro-environmental behaviour is critical to reducing environmental impacts and moving toward a more sustainable future. Higher education plays an important role in educating and forming professionals who will help protect the environment through their decisions and behaviours in their personal and professional lives. The aim of this study is to identify whether there are significant differences between university students depending on their degree programme (related to the environment or not), the year in which they are studying, and gender. We also investigate which factors (perceived effort, positive environmental attitudes, or perceived positive consequences) significantly influence the frequency of pro-environmental behaviours among students. The sample consisted of 383 students in their first, third, and sixth years, for whom two instruments were designed to measure environmental attitudes and behaviours. Significant differences were noted between those studying programmes related to the environment and those who are not, as opposed to the variations at different stages of the course. However, students hold positive environmental attitudes that are not reflected in the performance of environmental behaviour. Regression analysis shows that all three factors significantly influence the frequency of pro-environmental behaviour, with perceived effort (a negative influence) being the most influential variable.

P.59  DRAGON: A Single Risk Assessment Database to Promote Transparency and Data Sharing. Henning CC*, Overton AJ, Marin K, Cleland JC, Turley AT; ICF International   cara.henning@icfi.com

Abstract: With the availability of large volumes of data for risk assessment and a greater emphasis on consistency and transparency in federal agencies, data management and data sharing are of keen interest to risk assessors. The DRAGON tool is a database that stores risk assessment data and allows nimble management of the overall assessment process. Within DRAGON, risk assessors can implement systematic review of the literature, manage the assessment of the quality of key studies and store related decisions, manage the data entry process, perform dose-response modeling, and rapidly generate reports in a variety of formats. The database itself has a unified structure that allows data sharing across agencies and risk assessors with similar interests in a given chemical. Data-entry forms, reports, and assessment decision logic can nonetheless be tailored for each agency to meet different internal priorities and needs. The database includes an evolving standard vocabulary of health outcomes that can be crosswalked to any other vocabulary if needed. The vocabulary is based on a system-based classification for each endpoint. Specific endpoints can also be mapped to custom categories for each assessment as desired. DRAGON also provides a framework for coordinating the work of multiple people working on assessments of chemicals with large databases, to improve consistency and to facilitate quality assurance procedures.

P.60  Implementing Systematic Review for Chemicals with Large Databases. Turley AT*, Overton AJ, Marin K, Henning CC; ICF International   audrey.turley@icfi.com

Abstract: Systematic review allows risk assessors to use transparent logic to search for, categorize, and select data for use in chemical risk assessments for deriving toxicity reference values. The need for transparency is particularly important to federal agencies that are placing increased emphasis on stakeholder engagement. In addition, chemicals that may have tens of thousands of references require data management strategies beyond existing tools (e.g., EndNote) that allow additional sorting and querying of the results. The DRAGON Screen tool allows for the agile categorization and characterization of studies so that literature can be sorted and prioritized for inclusion in weight-of-evidence and data analyses. The tool allows the assessment manager to construct multiple and evolving rounds of review. The categories in each round are tailored to the particular chemical and assessment priorities, and new rounds can be added as needed throughout the literature screening process. Two data managers independently categorize each study based on title and abstract screening, without knowledge of each other's selections. A third expert then reviews the two sets of categories for a study and resolves any discrepancies. Studies can then pass into the data entry phases. For chemicals with large databases, the risk assessment team may choose to pass only a subset of categories through to the data entry phase. The logic of how this decision was made, as well as the general literature flow, is stored in the tool and can be provided to stakeholders for review at any time.

P.61  A pragmatic way of achieving a High Sustainable Organization: Governance and organizational learning in action in the French public sector. Merad M*, Marcel F; INERIS   myriam.merad@ineris.fr

Abstract: Sustainability is increasingly the key challenge for organizations. French public organizations are currently working on issues related to both the assessment and the governance of sustainability. In this paper we propose a “proactive-based assessment” that helps make clear and explicit the different stages of progress organizations have reached in terms of sustainability, and their responsiveness in that respect. Three new concepts for addressing the problem of sustainability in a public organization are proposed, based on a research-in-action approach: the concept of “critical capital”, the concept of the High Sustainable Organization (HSO), and the concept of learning stages within an HSO. Our contribution is mainly based on investigation and pragmatic observation within the French working group on the “Governance” of public organizations.

Microbial Risk Assessment
 

P.62  Modeling the relationship between post-vaccination hemagglutination inhibition (HI) titer and protection against influenza. Huang Y*, Anderson S, Yang H; The US Food and Drug Administration   Yin.Huang@fda.hhs.gov

Abstract: The objective of this research is to evaluate the relationship between post-vaccination HI titer in the host and protection against influenza using modeling approaches. The HI titer is currently used as a surrogate endpoint for protection against disease in FDA’s regulatory review of influenza vaccine products. We expect the results of this research to provide insight into whether HI titer is a good predictor of protection against influenza and, if so, what level of HI titer is needed for sufficient protection. We first searched available data from human challenge studies that reported post-vaccination HI titer, challenge dose, and post-challenge influenza infection. Five large-scale studies were identified. Among them, four studies used single challenge doses while one reported multiple-dose challenges. We grouped the volunteers based on their HI titer levels. We assumed the relationship between challenge dose and infection rate (response) could be described by exponential or beta-Poisson dose-response models, which have been widely used for a number of infectious disease agents. We estimated the model parameters for each HI titer group and examined the dependency between host susceptibility, represented by the model parameters, and post-vaccination HI titer. The dose-response models were further modified by incorporating this dependency and fitted to the data set with graded challenge doses using maximum likelihood estimation. An exponential dependency between the model parameters and HI titer was identified, and the dose-response models incorporating this dependency provided statistically acceptable fits to the data while the original models failed to do so. The modified models can potentially be used to identify the critical level of post-vaccination HI titer required for sufficient protection against influenza, and thereby enhance our ability to evaluate the efficacy and protection offered by future candidate influenza vaccines.
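The two standard dose-response forms named in the abstract, plus an exponential titer dependency of the kind described, can be written compactly. This is a sketch of the general modeling approach, not the authors' fitted model; all parameter values used are hypothetical:

```python
import math

def exponential_response(dose, r):
    """Exponential dose-response: P(infection) = 1 - exp(-r * dose)."""
    return 1.0 - math.exp(-r * dose)

def beta_poisson_response(dose, alpha, n50):
    """Approximate beta-Poisson model; N50 is the median infectious dose,
    so P(infection) is exactly 0.5 at dose = N50 by construction."""
    return 1.0 - (1.0 + dose * (2.0 ** (1.0 / alpha) - 1.0) / n50) ** (-alpha)

def r_given_titer(titer, r0, k):
    """Hypothetical exponential dependency of host susceptibility on HI
    titer: r falls off exponentially as log2(titer) rises."""
    return r0 * math.exp(-k * math.log2(titer))
```

Substituting `r_given_titer(...)` for the fixed parameter `r` turns a single dose-response curve into a family indexed by post-vaccination titer, which is the structure the maximum likelihood fit then estimates.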

P.63  Risk factors identification for Toxoplasma gondii infection in meat products destined for human consumption. Guo M*, Buchanan RL, Dubey JP, Hill D, Gamble HR, Jones J, Pradhan AK; University of Maryland   miaoguo312@gmail.com

Abstract: Toxoplasma gondii is a parasite responsible for approximately 24% of all estimated annual deaths attributed to foodborne pathogens in the U.S. The main transmission route for human infection is consumption of raw or undercooked meat products that contain T. gondii tissue cysts. Risk assessment studies of meat-borne T. gondii infection have so far been very limited. The objective of this study was to compare risk among different meat products, identify risk factors, and summarize risk assessment studies for human T. gondii infection through consumption of meat products, both conventional and organic, over the past twenty years. Relevant studies were searched in the PubMed and Google Scholar databases using the key word ‘Toxoplasma gondii’ in combination with ‘pig’, ‘pork’, ‘sheep’, ‘lamb’, ‘chicken’, ‘cattle’, ‘meat’, ‘organic meat’, ‘risk’, and ‘risk assessment’. This structured review focused on studies of T. gondii infection through the meat-consumption route. Risk factors identified on farm include outdoor access, farm type, feeding, presence of cats, rodent control, bird control, farm management, carcass handling, and water quality. Seroprevalence of T. gondii is greater in conventionally raised pigs and sheep than in cattle and poultry. Seroprevalence of T. gondii is greater in organic than in conventional meat products, indicating a higher risk of T. gondii infection from organic meats. To better understand the risk of toxoplasmosis in humans from meat consumption in the U.S., a quantitative microbial risk assessment of meat-borne toxoplasmosis based on data and information relevant to the U.S. is critically needed. This study can serve as a useful resource and information repository for informing quantitative risk assessment studies of T. gondii infection in humans through meat consumption.

P.64  Scald and food safety risks posed by unsafe water, refrigerator, and freezer temperatures in residences of Meals On Wheels recipients in 4 US states. Hallman WK*, Cuite CL, McWilliams RM, Senger-Mersich A; Rutgers, The State University of New Jersey   hallman@aesop.rutgers.edu

Abstract: In the US, an increasing number of older adults live alone. Taking on-site measurements in the residences of 800 Meals On Wheels recipients in 4 US states (AR, IA, NJ and SC), this study examined the potential scald risks to older individuals posed by unsafe water temperatures and the food safety risks posed by unsafe refrigerator and freezer temperatures. Most water heater manufacturers have voluntarily adopted a 120°F standard for domestic hot water, as recommended by the US Consumer Product Safety Commission. However, the thinner skin of older adults burns more quickly than that of younger people, so older adults are at increased risk of scalds and burns even at 120°F. This study adopted a nominal acceptable water temperature range of 114.8 to 118.4°F, since studies have shown that this range ensures thorough removal of grease films (which may promote bacterial growth) yet reduces the risk of scalding. Only 27% of homes had hot water temperatures within this range. More than half (52%) were >120°F and 11% were >130°F (exposure to which can result in second-degree burns within seconds). The highest temperature recorded was 184.5°F. Since older adults have a worse prognosis than younger patients after scald burns, the potential health consequences are serious for a large percentage of this population. The USDA recommends a freezer temperature <0°F and a refrigerator temperature <40°F to minimize microbial growth. However, 71.6% of the homes surveyed had freezer temperatures >0°F, with homes in AR and SC at statistically significantly higher risk of out-of-range freezers. In addition, 26.3% had refrigerator temperatures >40°F, with homes in AR at statistically significantly higher risk of having an out-of-range refrigerator. The results suggest that the risks of scalding and microbial exposure are high for a large number of older individuals and highlight the need for surveillance and for the development of prevention strategies to ensure safer water, refrigerator, and freezer temperatures in the homes of these individuals.

P.65  Biological weapons and bioterrorism threat assessment. Jyothikumar V*; University of Virginia   vinzymes@gmail.com

Abstract: Present-day terrorists do not officially represent countries or states; often they represent a religious ideology expressed through violence and death. The use of weapons of mass destruction (WMD) against civilian noncombatants is not novel or unique to present times. Mankind has exploited diseases, toxins, and poisons since the earliest days of recorded history to wage warfare, commit crimes, and coerce others. However, the accessibility of biological weapon agents, their enhanced capacity to cause morbidity and mortality, and improved tactics for their employment have significantly increased the need to develop more effective means of detecting and countering such weapons. Infectious diseases have unquestionably played a significant and defining role in the overall progression of mankind, religions, and cultures, and in the structure and organization of economies and governments. Bacterial pathogens that could potentially be used as biological threat agents are typically 1–2 microns in size and are most effective when inhaled or ingested. Antibiotic-resistant strains of anthrax, plague, and tularemia are known to exist naturally and may be exploited for weapons. Bacillus anthracis, for example, can be weaponized by attaching spores to carrier particles. A new, economical way to screen such samples in nature is autofluorescence. Autofluorescence is based on the detection of natural intrinsic fluorescence emitted by endogenous molecules such as co-enzymes, collagen, and flavins. After excitation by a short-wavelength light source, these fluorophores emit light of longer wavelengths. The overall fluorescence emission patterns differ among bacterial species due to corresponding differences in fluorophore concentration and metabolic state. Bacterial spores contain a high concentration of endogenous fluorophores, among other components, which may allow proper discrimination of spores from other suspicious particles.

P.66  Foodborne pathogens in leafy greens: Data, predictive models, and quantitative risk assessments . Mishra A*, Lambertini E, Pradhan AK; University of Maryland College Park, MD   amishra1@umd.edu

Abstract: In the last few years, technological innovations in the production, harvesting, processing, and packaging of fresh produce have advanced tremendously in the U.S., and consumption has increased accordingly. Consumers are eating more fresh produce, purchasing a broader variety, and demanding more convenience products such as ready-to-eat salads. Fresh produce is generally consumed raw, making it a high-risk food in terms of pathogen contamination. A recent study by the Centers for Disease Control and Prevention indicated that between 1998 and 2008, leafy greens outbreaks accounted for 22.3% of foodborne outbreaks. Pathogen contamination of fresh produce, including leafy greens, has been a major concern to stakeholders such as the food industry, regulatory agencies, and consumers. In this study, we performed a structured review of the literature to gain more insight into the available data and information related to contamination sources, predictive microbial models, and quantitative risk assessment models for pathogens such as Listeria monocytogenes, Salmonella, and Escherichia coli O157:H7 in leafy greens in the farm-to-table continuum. It was observed that microbial contamination mostly originated from the pre-harvest environment. Contamination can be effectively controlled by storing leafy greens at appropriate temperatures and times, and by applying intervention steps such as washing and irradiation. Several research studies on predictive modeling and quantitative microbial risk assessment of pathogens in fresh produce, focused on one or more steps such as irrigation, harvesting, processing, transportation, storage, and washing, have been reported in the last few years. We divided these into three categories: pre-harvest models, storage models, and models related to intervention steps, primarily washing. Overall, our study provides valuable information to inform future quantitative microbial risk assessment studies of leafy greens.

P.67  Quantitative Risk Assessment for Escherichia coli O157:H7 in Fresh-cut Lettuce. Pang H*, Buchanan RL, Pradhan AK, University of Maryland, College Park, MD 20742; Schaffner DW, Rutgers University, New Brunswick, NJ 08901   haopang@mail.umd.edu

Abstract: Leafy green vegetables, including lettuce, are of serious food safety concern, as they are recognized vehicles for foodborne pathogens such as Escherichia coli O157:H7 that can cause human illness. Development and application of quantitative risk assessment models are recognized as strong tools to identify and minimize potential risks associated with foodborne pathogens. This study aimed to develop a quantitative microbial risk assessment (QMRA) model for E. coli O157:H7 in fresh-cut lettuce and to evaluate the effects of intervention strategies on public health risk. The supply chain of fresh-cut lettuce was modeled from in-field production to consumption at home. A simulation model for exposure and health outcome assessment was developed using @RISK software. The model was simulated using Latin Hypercube Sampling for 100,000 iterations to estimate the number of illnesses due to consumption of fresh-cut lettuce in the U.S. With a prevalence of 1% in lettuce arriving at the processing plant, the baseline model (with no intervention strategies) predicted 8,921 cases per year in the U.S. Each of the intervention strategies evaluated reduced the public health risk, with bacteriophage treatment the most effective. Sensitivity analysis indicated that irrigation water quality is the most important factor affecting the number of predicted cases. The developed risk model can be used to estimate the public health risk of E. coli O157:H7 from fresh-cut lettuce and to evaluate potential intervention strategies to mitigate that risk.
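Latin Hypercube Sampling, the propagation step named in the abstract, is a generic stratified-sampling technique. A minimal standard-library sketch (not the @RISK implementation) looks like this:

```python
import random

def latin_hypercube(n_samples, n_dims, seed=42):
    """Stratified sampling: each dimension is split into n_samples
    equal-probability strata, with exactly one draw per stratum."""
    rng = random.Random(seed)
    cols = []
    for _ in range(n_dims):
        # one uniform draw inside each of the n_samples strata, then shuffled
        col = [(i + rng.random()) / n_samples for i in range(n_samples)]
        rng.shuffle(col)
        cols.append(col)
    return list(zip(*cols))  # n_samples points, each with n_dims coordinates
```

The uniform coordinates are then mapped through the inverse CDFs of the model's input distributions (prevalence, concentration, growth, dose-response parameters), which is why far fewer iterations are needed than with plain Monte Carlo for the same coverage of the input space.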

P.69  Risk Assessments for Listeria monocytogenes and Salmonella spp. in Melons. Wang M*, Lambertini E, Micallef SA, Pradhan AK; University of Maryland, College Park, MD   meowang@umd.edu

Abstract: In the past decade, with the increasing public preference for fresh produce, the risk of illnesses associated with consuming raw and minimally processed fruits and vegetables has drawn increased scrutiny from stakeholders including consumers, industry, government, and academia. Annual consumption of non-citrus fresh fruits, including melons, increased 45.5% from 1976 to 2009. Melons are highly popular due to their high nutritional value and attractive natural flavor. However, melons are vulnerable to pathogen contamination because they are grown on the ground, minimally processed, and eaten raw; melons are therefore considered the second-highest fresh produce commodity of concern for microbial risk. Salmonella spp. and Listeria monocytogenes, two of the most deadly foodborne pathogens, have been associated with melon contamination, recalls, and, most importantly, two recent large-scale outbreaks in 2011 and 2012. While government guidelines on Good Agricultural Practices and post-harvest Best Practices have been published for cantaloupes, no quantitative estimates of risk or mitigation effectiveness are available for any melon variety. In support of such quantitative risk assessment efforts, the goal of this study was to systematically review existing data on the risk of contamination by Salmonella spp. and Listeria monocytogenes and their ecology in the melon production chain. Specific objectives were to review: (i) production and consumption of common melon varieties (cantaloupe, honeydew, and watermelon), (ii) potential contamination sources in the farm-to-fork supply chain, (iii) prevalence and survival of pathogens associated with melons, and (iv) potential intervention strategies for risk reduction in the melon industry. This systematic review synthesizes critical information needed for building farm-to-fork quantitative microbial risk assessment (QMRA) models for L. monocytogenes and Salmonella spp. on melons.

Probabilistic Risk: Staff, Water and Statistics
 

P.70  Probabilistic Risk Assessment with the Bayesian Statistics Markov Chain Monte Carlo Simulation. Wu KY*, Chung YC, Chen CC, Hsiao CH; National Taiwan University   kuenyuhwu@ntu.edu.tw

Abstract: The US EPA has already adopted probabilistic risk assessment (PRA) for decision making. Previously, PRA was conducted mainly using Monte Carlo (MC) simulation, which frequently requires either empirical or probability distributions of parameters to simulate the distribution of lifetime daily dose. The simulation results are valid only if the input parameters, data, and assumptions are valid. In practice, risk assessors frequently suffer from insufficient data to fit distributions for some parameters, especially concentrations and intake rates; even worse, censored data can hinder completion of an assessment, such as a large proportion of residue data below the detection limit. In order to reduce uncertainty due to insufficient data, Bayesian statistics with Markov chain Monte Carlo (MCMC) simulation was applied to perform PRA. The limited data available were used as prior information. MCMC simulation was performed with WinBUGS to obtain the posterior distributions of parameters and health risk. Four examples will be presented at this meeting: assessment of lifetime cancer risk for N-nitrosodimethylamine (NDMA) in drinking water (only one sample with a detectable NDMA level out of 50 samples collected), assessment of lifetime cancer risk for aflatoxin B1 in food (only a few data points above regulatory limits were available), assessment of health risk for medical staff exposed to cisplatin using urinary platinum as a biomarker to reconstruct exposures, and assessment of lifetime cancer risk for acrylamide in high-temperature processed foods with high uncertainty in residue and intake rate data. With limited data available, the posterior distributions of parameters and health risk theoretically converge to representative distributions for the study population, so that the quality of the risk assessment may be improved without additional investment of resources to collect data.
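A toy version of the censored-data idea can illustrate the mechanics: left-censored lognormal concentrations, implicit flat priors, and a random-walk Metropolis sampler rather than WinBUGS's Gibbs sampling. All numbers and names below are illustrative, not from the study:

```python
import math, random

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def log_likelihood(mu, sigma, detects, n_censored, lod):
    """Lognormal log-likelihood: exact density for detects,
    P(X < LOD) for each non-detect (left-censored observation)."""
    if sigma <= 0:
        return float("-inf")
    ll = n_censored * math.log(max(norm_cdf((math.log(lod) - mu) / sigma), 1e-300))
    for x in detects:
        z = (math.log(x) - mu) / sigma
        ll += -math.log(x * sigma * math.sqrt(2.0 * math.pi)) - 0.5 * z * z
    return ll

def metropolis(detects, n_censored, lod, n_iter=5000, seed=1):
    """Random-walk Metropolis over (mu, sigma) of the log-concentration."""
    rng = random.Random(seed)
    mu, sigma = math.log(lod), 1.0
    ll = log_likelihood(mu, sigma, detects, n_censored, lod)
    chain = []
    for _ in range(n_iter):
        mu_p = mu + rng.gauss(0.0, 0.3)
        sigma_p = abs(sigma + rng.gauss(0.0, 0.1))  # reflect at zero
        ll_p = log_likelihood(mu_p, sigma_p, detects, n_censored, lod)
        if math.log(rng.random() + 1e-300) < ll_p - ll:
            mu, sigma, ll = mu_p, sigma_p, ll_p
        chain.append((mu, sigma))
    return chain
```

The key point is in `log_likelihood`: non-detects contribute the probability mass below the detection limit instead of being dropped or substituted with LOD/2, which is what lets one detect out of fifty samples still constrain the posterior.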

P.71  Probabilistic Risk Assessment of Cisplatin for Medical Staff of Medical Centers in Taiwan. Chen YT*, Chang CH, Chung YC, Chen CC, Wang GS, Wu KY; National Taiwan University   r01844001@ntu.edu.tw

Abstract: Cisplatin, a platinum-based medicine widely used in chemotherapy, is mutagenic, can cause chromosomal aberrations and micronuclei, and can induce nephrotoxicity, birth abnormalities, and reproductive effects. Potential cisplatin exposures of medical staff, including the pharmacists and nurses who handle the chemotherapeutic medicine, have been of great concern. A preliminary study was conducted to analyze urinary platinum to assess exposures for 126 medical staff in three medical centers in Taiwan; urinary platinum was detectable in only 5 of the study subjects, in the range from 1.59 to 89.1 ppt. In this study, the urinary platinum levels were used to reconstruct cisplatin exposures for probabilistic risk assessment using Bayesian statistics with Markov chain Monte Carlo simulation in the WinBUGS software. The results showed that cisplatin exposures were 0.03 mg/kg/day for pharmacists in the departments of chemotherapeutic medicine, with corresponding hazard indexes (HI) of 129.98 ± 12.95; 0.02 mg/kg/day for nurses in oncology wards, with corresponding HI of 86.53 ± 8.67; and 0.04 mg/kg/day for nurses in oncology clinics, with corresponding HI of 173.14 ± 17.23. These results suggest that interventions should be implemented thoroughly and that further studies should be conducted to protect medical staff handling chemotherapy medicines in Taiwan.

P.72  Probabilistic Assessment of Cancer Risk for N-Nitrosodimethylamine in Drinking Water by Using Bayesian Statistics with Markov Chain Monte Carlo Simulation. Chang CH, Chuang YC, Chen CC, Wu KY*; National Taiwan University   achchg@gmail.com

Abstract: N-Nitrosodimethylamine (NDMA) is an emerging nitrogenated disinfection by-product (N-DBP) in drinking water, formed during chloramination and chlorination. NDMA is genotoxic and carcinogenic to rodents after cytochrome P-450 metabolism, with the liver as its target organ. The potential health risk posed by NDMA exposures through water consumption has been of great concern. With the majority of NDMA measurements in drinking water below the detection limit, the research community has yet to perform a cancer risk assessment for it. To deal with the large proportion of non-detectable (ND) NDMA data in drinking water, Bayesian statistics with Markov chain Monte Carlo simulation was first used to assess probabilistic cancer risk for NDMA in drinking water, implemented in WinBUGS 1.4 (Bayesian analysis Using Gibbs Sampling for Windows, version 1.4). The NDMA concentration dataset was drawn from a published study in Taiwan, in which 50 drinking water samples were collected and only one sample was detected, at 4.5 ng/L. The posterior distributions revealed a mean NDMA concentration in drinking water of 0.54 ng/L. The estimated 95th percentile of lifetime cancer risk was 2.897E-6. These results suggest that NDMA levels in drinking water in Taiwan do not pose a significant cancer risk; nevertheless, regulation of NDMA may be needed in order to fully protect the general public.
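The final risk figure in an assessment like this typically comes from a standard lifetime average daily dose (LADD) calculation applied to the posterior concentration. The sketch below uses the posterior mean concentration reported in the abstract, but the intake rate, body weight, and oral slope factor are assumptions for illustration only:

```python
def lifetime_cancer_risk(conc_ng_per_L, intake_L_per_day, bw_kg, slope_factor):
    """Risk = LADD (mg/kg/day) x oral slope factor (per mg/kg/day)."""
    ladd = conc_ng_per_L * 1e-6 * intake_L_per_day / bw_kg  # ng/L -> mg/L
    return ladd * slope_factor

# 0.54 ng/L is the posterior mean from the abstract; 2 L/day, 70 kg, and a
# slope factor of 51 per mg/kg/day are hypothetical inputs.
risk = lifetime_cancer_risk(0.54, 2.0, 70.0, 51.0)
```

With these assumed inputs the point estimate lands in the 1E-7 to 1E-6 range, the same order of magnitude as the 95th-percentile risk reported in the abstract; the MCMC approach replaces the point concentration with its full posterior distribution.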

P.73  Assessing the Health Risks of Dimethylformamide in an Occupational Setting. Wu CH*, Huang YF, Wu KY; National Taiwan University   charlenehwu@gmail.com

Abstract: Dimethylformamide (DMF) is an organic compound able to induce adverse health effects in humans. It is used in the production of pharmaceutical products, adhesives, and synthetic leathers. Employees working in these industries are likely to be at risk, particularly through inhalation and dermal absorption. Exposure assessment of DMF was conducted on 106 employees from three synthetic leather plants in Taiwan. The employees’ urinary samples were collected and subsequently analyzed for N-methylformamide (NMF), a DMF metabolite and the biomarker identified to measure the body burden of DMF exposure. Results demonstrated that NMF concentrations in urinary samples collected prior to a work shift were mostly undetectable. However, urinary concentrations of NMF were significantly higher post-shift than pre-shift. Of the 318 urinary samples collected, the NMF concentrations in 59 samples exceeded the American Conference of Governmental Industrial Hygienists (ACGIH) recommended Biological Exposure Index (BEI) of 15 mg/L. To assess the health risks of DMF, a reference concentration (RfC) was calculated using the Benchmark Dose software. The benchmark dose, based on existing animal data, was calculated as 3.98 ppm (11.94 mg/m3). Based on this BMDL10, the RfC for DMF in these synthetic leather plants was derived as 0.04 ppm (0.12 mg/m3). Hazard indices (HI) were also calculated for all 106 employees; 89.6% of the employees had an HI value greater than one. Our results demonstrate that the majority of employees exposed to DMF are subject to noncarcinogenic adverse health effects, even though the amount of DMF exposure does not exceed the current permissible exposure limit (PEL) of 10 ppm. Further investigation focusing on exposure to multiple compounds in these occupational settings is warranted. A review of the risk management protocol for DMF should be considered, since compliance with current regulations is inadequate to safeguard health.
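The RfC and hazard index arithmetic follows the usual pattern: RfC = BMDL10 divided by a composite uncertainty factor, and HI = exposure divided by RfC. The factor of 100 below is inferred from the reported numbers (3.98 ppm to 0.04 ppm) and should be treated as an assumption, as should the example exposure:

```python
def derive_rfc(bmdl10_ppm, uncertainty_factor=100.0):
    """Reference concentration from a benchmark dose lower bound."""
    return bmdl10_ppm / uncertainty_factor

def hazard_index(exposure_ppm, rfc_ppm):
    """HI > 1 flags potential noncarcinogenic health effects."""
    return exposure_ppm / rfc_ppm

rfc = derive_rfc(3.98)       # ~0.04 ppm, matching the abstract
hi = hazard_index(0.5, rfc)  # 0.5 ppm is a hypothetical worker exposure
```

Note that a hypothetical exposure of 0.5 ppm is well under the 10 ppm PEL yet yields HI > 1, which is exactly the gap between compliance and health protection that the abstract highlights.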

Risk Communication
 

P.74  Atrazine effects on amphibians: Is it safe to go back into the water? Smith D.W.*; Conestoga-Rovers & Associates   dwsmith@craworld.com

Abstract: In the mid-2000s, scientific debate about atrazine’s potential to cause endocrine disruption in amphibians played out in the bright lights and high stakes of EPA re-registration. As part of that process, EPA had its Science Advisory Board evaluate the available science on endocrine disruption. Depending on which conventional wisdom on the internet one reads, the SAB review found serious errors in the science showing significant negative effects on amphibian development at low atrazine concentrations. Other sources claim the complete opposite, e.g., that the SAB “found each and every one of the studies [showing no significant effects] to be fundamentally or methodologically flawed, some containing defects as egregious as allowing control and test subjects to intermix.” Since industry funded much of the latter research, this debate devolved into the all-too-familiar acrimonious discussions about conflicts of interest, and even produced a published paper purporting to show a relationship between funding source and results. From afar (I luckily had no involvement at all in any of this), the debate seems to exemplify some issues bedeviling science in general and risk assessment in particular. Specifically, why are scientists in general, and the SAB in this specific case, so poor at communicating to the public? Secondly, what exactly are “conflicts of interest,” who has them, and what effects do they likely have on science? Thirdly, what are professional ethics in the midst of a debate about potentially important environmental risks? My talk will review this incident, give my best scientific view of what exactly the SAB said, and review current discussions of scientific ethics.

P.75  Public risk perception towards urban air pollution. Zhu KJ*, Xu JH; Peking University   cocojne@pku.edu.cn

Abstract: The notorious hazy days prevailing in China in the winters of 2011 and 2012 triggered a public crisis over air quality, yet how the public perceives air pollution has been little studied. Understanding the public’s perception of risks is a prerequisite for predicting public responses and designing risk communication strategies and policy interventions. This paper presents an empirical study exploring how the public perceives risks posed by urban air pollution in China and what factors affect that perception. Existing studies were mostly conducted in other cultural settings, and their results cannot be borrowed directly for a Chinese setting. Our research therefore begins with an exploratory study using one-on-one in-depth interviews to elicit public attitudes towards urban air pollution as well as potential influencing factors. A convenience sample of 43 diverse respondents in Beijing was recruited using a snowball approach. Qualitative analysis of the open-ended interviews based on grounded theory revealed interesting findings, such as confusion between air pollution and climate, perceived relationships between economics and air pollution, and high constraint recognition in public involvement. Based on the results of the interviews and their preferred wording, a confirmatory questionnaire was designed to test the prevalence of misconceptions and beliefs, and to test the factors affecting public risk perception in China.

P.76  Applying Mental Modeling Technology™ to Developing the Communications Research and Analytics Roadmap for the Census Bureau. Kovacs DC*, Thorne SL, Butte GE, Wroblewski MJ; Decision Partners; United States Census Bureau   dkovacs@decisionpartners.com

Abstract: The U.S. Census Bureau (CB) serves as the leading source of quality data and information about the nation's people, places, and economy. More than just numbers, this information shapes important policy decisions that help improve the nation’s social and economic conditions. The Center for New Media and Promotions (CNMP), within the Communications Directorate of the Census Bureau, coordinates, develops, and implements integrated communications and promotional campaigns with consistent messaging and branding about the Census Bureau. This includes exploring innovative ways of communicating through the web, digital, and social media; promotional activities; and evolving communications channels and platforms in support of the Data Dissemination Initiative and other CB programs. To support their activities, the CNMP sponsored development of a Communications Research and Analytics Roadmap (CRAR) to provide the needed insight to guide the development of effective integrated communications services. The Roadmap was informed by Foundational Research comprising an assessment of existing research and Mental Models research, a key component of Mental Modeling Technology. The mental models research is the topic of this presentation. First, an Expert Model of “Influences on Integrated Communication for Data Dissemination” was developed based on a review of background materials and discussions with a select group of CB employees. Using the model as an analytical basis, an interview protocol was designed and in-depth mental models interviews were conducted with 26 key internal CB stakeholders. The results of the expert interviews provided critical insight to support the development of the CRAR, which will provide guidance needed to improve the effectiveness of CB communications and data collection at a time of significantly escalating CB data collection costs and ever tightening government budgets.

P.77  The constitutive role of communication for coordinated safety behavior in an organization managing high-hazard processes. Marynissen H*, Ladkin D, Denyer D, Pilbeam C; Cranfield University   hugo@pm.be

Abstract: The dissemination of safety procedures and guidelines is perceived as pivotal to keeping organizations that manage high-hazard technologies incident free. Clear communication is seen as essential to the transfer of these procedures and guidelines. However, previous research in a gas-receiving terminal clearly indicates that individuals in that organization hold divergent perceptions of the present risks. This suggests that transmitting information through various forms of communication fails to create a uniformly perceived interpretation of the potential risks in an organization. Hence, these variable risk perceptions might actually endanger safe operations. On the other hand, the gas terminal that was the subject of this research has been operating accident-free for more than twenty years, which is at odds with the average fatal accident rate in onshore gas companies. It might therefore be argued that this gas terminal achieves some form of coordinated safety behavior based on a different way of relating within the organization. In an attempt to uncover this coordinated safety behavior, this research explores the interactive processes among all staff. Based on Social Network Analysis and qualitative interviews, it indicates how ongoing conversations about safety and risk-avoidant behavior constitute a safety culture in this gas-receiving terminal. Furthermore, it adds new insights to the existing knowledge in the field of “communication constitutes organization” (CCO) research, and more specifically to the use of CCO in High Reliability Organizations. Finally, recommendations for practice and viable directions for further research are indicated.

P.80  Irrational fears for radioactivity: Qualitative and quantitative evaluation. Aoyagi M*, Kanamori Y, Yoshida A; National Institute for Environmental Studies   aoyagi@nies.go.jp

Abstract: Using qualitative (focus group interviews) and quantitative (opinion survey) surveys carried out from October 2012 to February 2013, we discuss public understanding, attitudes, and images of the radioactive contamination caused by the Great East Japan Earthquake. As we have already reported for the qualitative part elsewhere (at other SRA chapter meetings), logical explanation could not remove “fears” from participants in the focus group interviews. We further explored people’s understanding of radioactive contamination through a quantitative questionnaire survey of Japanese respondents drawn from nationally representative probabilistic samples of males and females between 20 and 80 years old in February 2013. The most frequently chosen source of information about social events in general was TV programs (92%), followed by printed newspapers (75%). Internet resources such as SNS (6.4%) and newspapers on websites (10.8%) were chosen less often than radio (22.8%), friends and family members (19.5%), and magazines (14%), although about 70% of our respondents said they used the internet. TV programming is indeed the strongest channel for diffusing information on risks such as global warming or radioactive contamination: 55% of our respondents chose “journalists or commentators appearing on TV programs” as their most trusted information source, twice as high as “academics from universities or other research institutes” (26.8%), “national government” (22.2%), or “international governmental bodies” (14.5%). These findings appear to be one reason people hold ambiguous fears about radioactive contamination. We also asked three quiz questions on the science of radioactivity and found that more than half of respondents answered at least one incorrectly.
The majority of people do not have sufficient knowledge of radioactivity and could not understand news or information from national or local governments, because they relied on journalists and commentators appearing on TV programs, whose comments may not always be based on scientific evidence.

P.81  Progress in new tools for participatory vulnerability analysis to climate stressors. Webler TW*, Tuler SP; Social and Environmental Research Institute   twebler@seri-us.org

Abstract: City officials want to better understand how their communities are vulnerable to climate change. Drawing on the social science of hazard management and deliberative learning, we developed a method for participatory vulnerability assessment that organizes expert and local knowledge about climate hazards. Facilitated deliberation promotes learning and is favored by participants as a “natural” way of carrying out self-governance. We report here on the results of employing this method in the City of Boston.

P.82  Challenges Associated with Communicating Multidimensional Risk Data to a Diverse Set of Stakeholders. Wilson P*, Kubatko A, Hawkins B, Cox J, Gooding R, Whitmire M; Battelle Memorial Institute and the Department of Homeland Security Chemical Security Analysis Center   wilsonph@battelle.org

Abstract: The Chemical Terrorism Risk Assessment (CTRA) is a DHS CSAC-funded program that estimates the risk among chemical terrorism attack scenarios and assists in prioritizing mitigation strategies. Presenting multidimensional results (frequency, consequence, and risk for a wide variety of attack scenarios) in a manner that is easily digestible by stakeholders from diverse backgrounds is a perpetual challenge on the CTRA. Graphical formats are commonly more comprehensible and meaningful than vast numeric tables; however, visually capturing multiple dimensions poses a difficult challenge. Experience has shown that pie and bar charts are the most aesthetically appealing and easily understood formats, yet they generally present only one dimension of the data and do not capture the uncertainty inherent in the results. Whisker and box plots portray the uncertainty associated with a single dimension of the data, but are generally not well understood by, and thus not appealing to, stakeholders. Risk square plots that mimic traditional risk assessment matrices have proven useful for communicating results, but struggle to depict the vast number of attack scenarios comprising the CTRA and the wide range of scenario aggregates of interest to the various types of stakeholders. CTRA stakeholders often desire drastically different aggregations in order to meet the specific needs of their missions. To better meet the needs of this wide array of stakeholders, notional data will be used to illustrate examples of risk visualizations for potential use in communicating results. Interactive data visualization concepts that allow stakeholders to customize scenario aggregation to meet specific needs will also be discussed.

P.84  Utilizing Need for Affect and Need for Cognition from a Dual-Processing Framework: Measuring Environmental Policy Preference by Experimental Design Studies. Kim S-J*; Colorado State University   kim.sage@gmail.com

Abstract: Haddock, Maio, Arnold, and Huskinson (2008) reported that an affective message had stronger effects on attitude change among those high in NFA and low in NFC. Conversely, a cognitive message was found to elicit more positive attitude change in those categorized as high in NFC and low in NFA. Based on the review of the literature, the present study proposes several ways to utilize both NFA and NFC more effectively. Thus, this paper suggests H1: Individuals exposed to a cognitive message who are high in NFC and low in NFA will be more likely to support an environmental policy issue (a new 54.5 MPG standard by 2025) than individuals exposed to other messages (i.e., affective, both cognitive and affective, and neither cognitive nor affective messages); H2: Individuals exposed to an affective message who are low in NFC and high in NFA will be more likely to support the issue than individuals exposed to other messages; H3: Individuals exposed to a combined (both cognitive and affective) message who are high in NFC and high in NFA will be more likely to support the issue than individuals exposed to other messages; H4: Individuals exposed to a neutral (neither cognitive nor affective) message who are low in NFC and low in NFA will be more likely to support the issue than individuals exposed to other messages. In addition, this study adds involvement and endorser as moderators of these relationships; furthermore, it examines opinion leadership on the climate change issue and behavior/intention toward the adoption of a vehicle meeting the new 54.5 MPG standard as additional dependent variables. A series of experimental design studies (Studies 1, 2, and 3) will be introduced and their strengths and limitations discussed.

P.85  Improving natural disaster risk communication strategies: Characterizing public trust in institutions involved in natural disaster management in Chile. Zacharias C.A., Jimenez R.B.*, Bronfman N.C.; Universidad Andres Bello   rjimenez@unab.cl

Abstract: In recent years, natural disasters in Chile have tested the response capacity of local authorities, revealing significant weaknesses in the way these risks are currently assessed, managed, and communicated to the public. Several studies suggest that public trust in the institutions in charge of managing complex risks such as natural disasters has a significant influence on public attitudes towards the decisions and initiatives that emerge from those institutions. Therefore, the development of effective natural disaster communication strategies must address public concerns, preferences, and levels of public trust in the relevant actors in disaster prevention, warning, and response. So far, the extent to which Chileans trust the local institutions in charge of communicating and informing the community about natural disaster risks remains unknown. To fill this void, this study aimed to characterize the degree of public trust in the institutions responsible for risk communication and education about natural hazards in Chile. Chileans’ trust in ten institutions with relevant roles in natural disaster risk communication (government, scientific community, NGOs, and others) will be characterized using a survey. For each institution, participants will rate a set of six observable variables that account for two dimensions of institutional trust (competence-based trust and integrity-based trust) using a 5-point Likert scale. The survey will be implemented in July 2013. To better represent the Chilean population, it will be administered in four major Chilean cities: Iquique, Santiago, Valparaiso, and Concepcion. Data from the survey will be analyzed using descriptive statistics and confirmatory factor analysis. Results from the present study will provide a useful tool for decision makers in natural disaster risk management, especially for relevant actors and governmental agencies in charge of developing and implementing risk communication, information, and education strategies.

P.86  Risk Communication Activities of Health Risks by the Japan EMF Information Center. Ohkubo C*; Japan EMF Information Center   ohkubo@jeic-emf.jp

Abstract: In response to the World Health Organization (WHO)’s publication of Environmental Health Criteria monograph (EHC) No. 238 and WHO Fact Sheet No. 322 on extremely low frequency (ELF) electromagnetic fields, the Japanese Ministry of Economy, Trade and Industry (METI) set up a Working Group on Electric Power Facility and EMF Policy in June 2007. The Working Group compiled a report incorporating its recommendations to METI. To address issues related to the potential long-term effects of ELF-EMF, the Working Group recommended that a neutral and permanent EMF information centre be established to promote risk communication and facilitate people’s understanding based on scientific evidence. In response to this recommendation, the Japan EMF Information Centre (JEIC) was established in July 2008. JEIC is financed by donations from stakeholders, and an Administration Audit Committee was founded to ensure and monitor the neutrality and transparency of JEIC operations. JEIC is determined to develop into a world-class risk communication center with expertise in EMF. Its philosophy and purpose are to provide easy-to-understand scientific information on EMF and its health effects, to narrow the gap in risk perception among stakeholders, and to promote risk communication from a fair perspective. JEIC’s activities toward these purposes include (1) creating an EMF information database, including an EMF research database; (2) communicating with mass media; (3) organizing public meetings; and (4) answering questions by telephone and email.

P.87  Kids + chemical safety: a tool for educating the public about chemicals. Nance P, Kroner O, Dourson M*; Toxicology Excellence for Risk Assessment   nance@tera.org

Abstract: With an increasingly “plugged-in,” connected, and informed public, there is an evolving need for rapid availability and global dissemination of accurate information. Important decisions about personal health and about public health and safety are made daily by the scientific and medical community, legislators, the public, and the media, often based on inaccurate, incomplete, or biased information on the internet. The exposure of children to chemicals in their environment and the possible effects on childhood growth and development is a paramount social concern. Many websites dedicated to children and chemical exposures are available. However, these websites can generally be characterized either as government sites that are technically dense, not interactive with users, and primarily targeted to the scientific community, or as sites developed by special interest groups that lack technical depth, may or may not accurately represent the toxicology of the subject chemicals, and may or may not be interactive with users, but that are nevertheless written at a level understandable to a broad public audience. A challenge for protecting children’s health from chemical exposures is to develop a website that can rapidly communicate the independent, scientifically accurate information needed to make important decisions in a way that a broad user audience can understand and apply. Kids + Chemical Safety is a scientifically accurate website, staffed by experts in toxicology, public health protection, and scientific communication, that represents perspectives evenly, provides current information, and is interactive and understandable, serving a broad audience of scientists, parents, and the media.

P.88  Effect of information trustworthiness on cancer risk perception after a nuclear disaster. Kuroda Y*, Iwamitsu Y, Takemura K, Ban N, Sakura O, Sakata N, Tsubono K, Nakagawa K; The University of Tokyo, Kitasato University, and Waseda University   kuroday-rad@h.u-tokyo.ac.jp

Abstract: This study examines the effect of trustworthiness on risk perception among residents of Tokyo and Fukushima after the Fukushima Daiichi nuclear disaster. A cross-sectional study was conducted among 2,000 residents of Tokyo and Fukushima (1,000 per city), selected by stratified random sampling. Participants anonymously filled out a questionnaire on 8 cancer risk factors: Smoking, Drinking, Poor Diet and Lack of Exercise, Obesity, Stress, Food Additives, Low Vegetable Consumption, and Radiation Exposure. From these 8 factors, participants were asked to select and rank the top 3 (1 = highest risk). They also rated the trustworthiness of sources of information about radiation on a 5-point scale (1 = Not Reliable, 5 = Reliable). This study was approved by the Institutional Review Board at the University of Tokyo. Responses were obtained from 554 participants from Fukushima (mean age = 52.8 ± 16.3 y) and 465 participants from Tokyo (mean age = 51.6 ± 15.8 y). Participants from both cities rated Smoking (40.7%), Radiation Exposure (31.5%), and Stress (17.7%) as the factors with the highest risks for cancer. Radiation was rated significantly higher as a risk factor by participants from Fukushima than by participants from Tokyo (χ² = 6.21, df = 1, p < .01). Sources of information about radiation were classified as “Reliable” (score of 5 or 4) or “Unreliable” (score of 3, 2, or 1). A chi-square test revealed that those receiving information from an unreliable source were more likely to report radiation as a higher risk factor (χ² = 6.81, df = 1, p < .01). Trustworthiness is significantly related to perception of radiation risk; thus, building trustworthiness is the key issue to address for effective radiation risk communication.
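The 2×2 chi-square comparison reported above (reliable vs. unreliable source, crossed with whether radiation was ranked as a high risk) can be sketched in a few lines of pure Python. The counts below are hypothetical, since the abstract reports only the test statistics, not the underlying contingency table.

```python
def chi_square_2x2(table):
    """Pearson chi-square statistic for a 2x2 contingency table.

    table = [[a, b], [c, d]] of observed counts, e.g. rows = information
    source (reliable / unreliable), columns = radiation ranked high (yes / no).
    Compare the result against the chi-square critical value for df = 1
    (3.84 at p = .05, 6.63 at p = .01).
    """
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    n = sum(row_totals)
    stat = 0.0
    for i in range(2):
        for j in range(2):
            expected = row_totals[i] * col_totals[j] / n
            stat += (table[i][j] - expected) ** 2 / expected
    return stat

# Hypothetical counts for illustration only:
print(chi_square_2x2([[10, 20], [20, 10]]))  # 6.666..., significant at p < .01 for df = 1
```

A larger statistic than the df = 1 critical value leads to rejecting independence, which is the logic behind the reported association between source trustworthiness and radiation risk ranking.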

P.89  Investigating “consumer awareness” in evaluating food safety hazards related to beef in Japan. Kumagai Y*, Hosono H, Sekizaki T; the University of Tokyo   2791388998@mail.ecc.u-tokyo.ac.jp

Abstract: In an emergency situation, effective risk communication may reduce unnecessary public concern and the behaviors that follow from it. In 2011, Japan had two food-safety crises involving beef-related health hazards: enterohemorrhagic Escherichia coli O111 and O157, and radioactive contamination. In this situation, Japanese consumers became increasingly aware of the risks of beef-related health hazards. The aim of this study was to investigate how “consumer awareness” influenced the evaluation of health hazards in an emergency situation. We conducted an internet-based questionnaire survey in October 2011 with 3,957 respondents. Respondents were asked (1) to rank each beef-related health hazard (“enterohemorrhagic E. coli (EHEC)”, “Salmonella spp.”, “Campylobacter spp.”, “bovine spongiform encephalopathy (BSE)”, “radioactive substances”, “antibiotic residues”, and “cloned animals”) in descending order of risk, and (2) the reasons why they ranked their chosen hazard highest. We analyzed the words in the free-text reasons, categorized them into 8 broad types of “consumer awareness” (“severity”, “probability of occurrence”, “anxiety and fear”, “adverse effects on infants”, “reliability of governmental management”, “avoidance by oneself”, “attention to media”, and “production area”), and explored the factors that influenced the evaluation of health hazards. The results confirmed the following patterns of “consumer awareness” in risk ranking: (1) for EHEC and Salmonella spp., “severity”, “probability of occurrence”, and “anxiety and fear”; (2) for BSE, “anxiety and fear”, “severity”, and “avoidance by oneself”; (3) for radioactive substances, “reliability of governmental management”, “anxiety and fear”, “attention to media”, and “adverse effects on infants”. These results imply that “reliability of governmental management” is a very important factor for emerging hazards such as radioactive substances.

P.90  Effects of changing frequency of heterogeneous stimuli over time on estimation of frequency. Kugihara N*; Graduate School of Human Sciences, Osaka University   kugihara@hus.osaka-u.ac.jp

Abstract: Two kinds of stimuli (photos and words) were shown repeatedly to participants for twenty minutes. The photos had neutral emotional valence (e.g., spoon, tissue paper, dish) and the words were nonsense syllables (e.g., nuse, heyo, rue). The stimuli were presented according to two schedules, HF (high frequency) and LF (low frequency). Under HF, presentation frequency increased rapidly, reached its peak (60 times per minute) in two minutes, then decreased gradually. Under LF, frequency increased gradually, reached its peak (6 times per minute) in two minutes, then decreased. The HF and LF schedules ran simultaneously. In one condition the HF stimuli were words and the LF stimuli were photos; in the other condition, HF were photos and LF were words. When a disaster occurs, mass media tend to identify and pursue a target to blame for the tragic event. The frequency of newspaper articles pursuing such targets varies and fluctuates over time. Some research indicates that, as time advances, scapegoats (targets) shift from persons or groups to society or to our culture. My past laboratory studies showed that these transitions were mainly caused by memory bias: the frequency of rare targets picked up by articles is overestimated, and the degree of overestimation increases as time advances. However, those experiments used the same kind of stimuli, so participants may have failed to discriminate between them. To avoid this problem, the present study used heterogeneous stimuli (words and photos) for HF and LF. Results showed that perceived frequencies of LF stimuli were overestimated, and the subjective peak of LF appeared later than the actual peak. By contrast, frequencies of HF stimuli were underestimated, and the estimated peak corresponded closely to the presentation peak. These results indicate that even when the presented stimuli are heterogeneous, we show subjective biases in estimating frequency and its peak.

P.91  Rethinking Risk Data: ITER 2.0. Kroner O*, Wullenweber A, Willis AM; Toxicology Excellence for Risk Assessment (TERA)   Kroner@TERA.org

Abstract: For 17 years, the International Toxicity Estimates for Risk (ITER) database (www.tera.org/iter) has been a centralized source of peer-reviewed chronic human health risk values. ITER is a free resource, providing access to international risk information with a side-by-side comparison of risk values. Since 1996, the database has grown to include over 700 chemicals, with risk values derived by organizations from around the world. During this time, however, the world has seen major advancements in computational processing power, database mining and programming, and user interface design; in short, we are learning to extract more knowledge from the available data. With an eye to the future, a series of stakeholder surveys has been conducted to evaluate how ITER and risk information are currently used by risk assessors, and how ITER might be restructured to best meet the risk assessment community’s needs. Survey results indicated several areas for improvement and have spurred a call for ITER 2.0, which is currently underway. The redesigned system will be built to support additional problem formulations (NAS, 2009), offering flexibility to include additional data types such as acute values, occupational exposure levels, biomonitoring equivalents, and possibly ecological risk values. Possible user interface enhancements will allow searching and sorting by chemical class, target endpoint, date of derivation, and uncertainty factors, and will enable cross-chemical comparisons and meta-analyses. The development of ITER 2.0 based on user feedback will help organizations share their risk work and help risk scientists navigate the available data to streamline public health protection efforts.

P.92  Managing communication in times of crisis through ambiguity: A framework for crisis communication. Eller EG*, Calderon AA; Stephenson Disaster Management Institute, Louisiana State University   mail@ericeller.de

Abstract: It has been suggested that the control of ambiguity as part of a crisis communication strategy can be an effective mechanism to manage and affect stakeholders’ perceptions of organizations. Previous research on the perception of ambiguity suggests that positive and negative effects can be attained by both (1) the communicating organization (e.g., through flexibility, credibility, and other outcomes) and (2) the recipient of the message (e.g., stakeholders with varied levels of trust, confusion, etc.). The purpose of the presented work is to contribute to the understanding of how, if at all, ambiguity should consciously be managed in crisis communication. We consider ambiguity a multidimensional construct, and we argue that in crisis communication ambiguity can be found and managed on several levels, such as in the content of the message, in the context of the relationship between the communicating parties, and in the form and context of the communication. We also suggest several characteristics of the message recipient that affect the interpretation and impact of ambiguity. The present work offers a practical framework for the management of ambiguity in crisis communication, based on prior research and critiqued by a group of experts. Through translational research, this paper presents an applied research framework for use by scholars and decision makers at all levels, taking into consideration their perspectives, experiences, concerns, comments, and ideas during Hurricane Katrina (2005) and the Deepwater Horizon BP oil spill (2010). We believe that the presented framework offers a new perspective on the management of ambiguity in times of crisis, thereby providing both a basis for future research and a practical framework that can be used to collect data to further educate the field of crisis communication.

P.93  Uneven recall and inaccurate risk assessments from reading balanced news articles of controversial risk topics: The role of exemplars and affect. Dixon GN*; Cornell University   gnd5@cornell.edu

Abstract: This paper examines how the uneven placement of emotional pictures in a two-sided news article influences readers' information processing and risk perceptions. In study 1, participants (n=198) were randomly assigned to one of three balanced (i.e., two-sided) news articles on vaccine safety – one article with an emotional picture exemplifying vaccine safety arguments only; one article with an emotional picture exemplifying vaccine danger arguments only; and one article with no picture (control condition). In both experimental conditions, readers recalled risk arguments from the side with the exemplar significantly more than the side without it. The control condition yielded no significant difference in recall between the two sides. Study 2, which is currently ongoing, investigates the extent to which affective reactions toward the exemplar mediate the relationship between exemplar exposure and recall. Furthermore, it is hypothesized that the ease with which readers recall such information will significantly influence their risk perceptions. For scientific controversies where the evidence supports only one side of a two-sided news article, uneven placement of an affect-inducing exemplar might lead readers to primarily recall the side that is supported by little or no evidence. This is important because journalists who believe they are presenting a “balanced” article on a risk controversy might unknowingly influence their readers to largely process and recall only one side of a two-sided message, subsequently leading to inaccurate risk assessments.

P.94  “Magical thinking” in high risk cancer families. Flander LB*, Keogh LA, Ugoni A, Ait Oaukrim D, Gaff C, Jenkins MA; University of Melbourne   l.flander@unimelb.edu.au

Abstract: About half of people from mutation-carrying families decline genetic counselling and/or testing to identify their mutation status and risk of colorectal cancer (CRC). We report on perceived CRC risk and a qualitative analysis of reasons for declining in this group. We studied 26 participants (mean age 43.1 years; 14 women) in the Australasian Colorectal Cancer Family Registry who were relatives of mismatch repair gene mutation carriers, had not been diagnosed with any cancer at the time of recruitment, and had declined an invitation to attend genetic counselling and/or testing at the time of interview. A structured elicitation protocol was used to capture bounded estimates of perceived risk over the next 10 years. Understanding of genetic testing and CRC risk, reasons for declining testing, and self-reported colonoscopy screening were elicited during a 45-minute semi-structured interview. A subgroup of decliners (31%) unconditionally rejected genetic testing, in contrast to conditional decliners who would consider genetic testing in the future; they were confident their decisions would avoid the potential negative impact of testing. Mean perceived 10-year risk of CRC was 54% [95% CI 37, 71] in unconditional decliners, compared with 20% [95% CI 5, 36] in conditional decliners. This difference remained after adjusting for potential confounding (age, gender, and reported screening colonoscopy). Unconditional decliners thus perceived themselves to be at about 2.6 times the risk perceived by conditional decliners. Their biased judgment under perceived high risk may reflect “magical thinking,” which becomes a heuristic to avoid “tempting fate” (Risen and Gilovich 2008). Defensive motives to protect against threatening health information may contribute to unconditional declining of genetic testing (Etchegary and Perrier 2007).

P.95  Two years since the Fukushima accident: Are people still willing to support the affected area? Hosono H*, Kumagai Y, Sekizaki T; the University of Tokyo   ahiromix@mail.ecc.u-tokyo.ac.jp

Abstract: We conducted three web-based consumer surveys, in November 2011 (N=4,363), March 2012 (N=5,028), and January 2013 (N=6,357), to investigate how Japanese consumers regard food produced in the area affected by the Fukushima accident. Across these surveys, risk perception of 7 beef-related hazards declined as time passed. Only a limited relationship was observed between knowledge and either risk perception or willingness to pay for food from the disaster-affected area, while the intention to support the recovery of the affected area significantly increased willingness to pay for such food. Another notable finding was that in the third survey, implemented about two years after the disaster, 22.5% of respondents did not want to accept food from the affected area even when radioactive cesium was below the standard level, whereas the corresponding ratios in the first and second surveys were 13.0% and 9.8%, respectively. We therefore developed and applied a web-based lottery to identify individual risk-aversion levels, along with a donation experiment with the respondents of the third survey, to relate willingness to accept food from the disaster-affected area to willingness to support the recovery of that area.

P.96  Burgers or tofu? Eating between two worlds: Risk information seeking and processing during dietary acculturation. Lu H*; Marquette University   hunter8828@gmail.com

Abstract: As the fastest growing ethnic student group in the United States, Chinese international students are becoming a prominent part of the society. Studies have shown that with acculturation, Chinese international students consume a more Americanized diet (more fat and cholesterol and less fiber than their traditional diet) and that this acculturated diet is associated with a higher prevalence of chronic diseases. The primary focus of the current study is understanding what cognitive, affective, and/or socio-demographic factors might motivate Chinese international students to seek and process risk information about the potential health risks of adopting a more Americanized diet. Guided by the Risk Information Seeking and Processing (RISP) model, an online 2 (severity: high vs. low) x 2 (coping strategies: present vs. absent) between-subjects experiment was conducted among 635 participants. Data were analyzed primarily using structural equation modeling. Highlights of this study include the situational roles that informational subjective norms (ISN) and affective responses play in risk information seeking and processing. More specifically, ISN had direct relationships with systematic and heuristic processing, while working through information insufficiency to affect information seeking and information avoidance. Negative affect was positively related to information seeking and also worked through information insufficiency to influence information seeking. Positive affect was positively related to information avoidance and heuristic processing. Implications include boosting perceived social pressure and using appropriate fear appeals in healthy-eating education programs, and creating more awareness of the potential risks of tasty, convenient, inexpensive, but unhealthy food. The author would like to be considered for the Student Travel Award.

P.97  Exploring the impact of negative emotions on information seeking about radioactive food contamination in Japan after March 11, 2011. Okada T*, Inaba T; Hitotsubashi University   sd121003@g.hit-u.ac.jp

Abstract: Although previous research shows that negative emotions can increase information seeking, few studies have examined the qualitative outcomes of this behavior. The nature of these outcomes is crucial, for if people receive biased information or acquire incorrect knowledge, the uninformed may suffer. Thus, we explore the extent and viewpoints (negative or positive) of information sought, and the knowledge gleaned, in anxiety- and anger-induced information search concerning measures of radioactive food contamination. Since the March 2011 nuclear power plant accident in Japan, the government has monitored the radioactivity of foods produced in affected areas and bans shipments if levels exceed scientifically established legal limits, but widespread skepticism remains as to whether food within those limits is truly safe. Survey data (N=800) provided the following results. First, anxious people were more likely to seek news from many sources. Next, anxiety increased exposure to negative views about legal standards for food radioactivity, while anger led to more exposure to positive views; no significant effect of emotions was found on knowledge of food safety. Although anxiety was positively related to knowledge of the impact of low-dose radiation on health (government-provided information frequently reported shortly after the accident), no relationship was found between emotions and knowledge of the rationales for the changed legal standards or their implementation. Finally, the model was not significant in predicting whether one’s understanding of the changing legal standards was accurate. These results imply that each emotion influences the quality of information seeking differently and that, irrespective of the number of news sources sought, the information obtained tends to be biased toward negative views. Further, while anxiety may direct attention to news about potential harm, the timing and characteristics of the reported news may be critical to the economic consequences of that information seeking.

P.98  Numeracy and Beliefs About the Preventability of Cancer. Steinhardt JS*, Niederdeppe J, Lee T; Cornell University   jsteinh@gmail.com

Abstract: Fatalistic beliefs about cancer and uncertainty about the information in news stories about cancer are barriers to cancer-preventing behaviors. This research explores the relationship between numeracy, the ability to reason mathematically and interpret basic statistical and probabilistic information, and both fatalistic beliefs and uncertainty about the information presented in news stories about cancer. Numeracy was measured using a 7-item subjective numeracy scale. A sample of 601 adults aged 18 and older was asked to read news stories about cancer in one of 15 randomized conditions and then answer questions related to cancer prevention and subjective numeracy. Higher levels of numeracy were associated with less fatalistic beliefs about cancer and less uncertainty about cancer risk and prevention. Interactions between numeracy and cancer news story exposure on fatalistic and uncertain beliefs about cancer risks and prevention, however, were not statistically significant. We conclude with a discussion of the implications of these findings for future research on numeracy and health communication about complex issues like cancer.

P.99  Alternating Hydrologic Extremes: Risk Communication and Weather Whiplash. Trumbo CW*, Peek L, Laituri M; COLORADO STATE UNIVERSITY   ctrumbo@mac.com

Abstract: Our focus in this work is on the cascading effects of alternating hydrologic extremes. Possibly as a consequence of climate change, there is an increasing likelihood that areas of the world will undergo rapid “weather whiplash” between drought and flood. These alternating extremes pose extraordinary risk to agricultural systems and economies, rural and urban infrastructure, human population patterns and migration, and the natural ecosystems that we all ultimately depend on. Using the Spatial Hazard Events and Losses Database for the United States (SHELDUS), we accessed some 65,000 county-level records of financial losses from natural hazards over the period 1960-2010. The data were parsed to isolate floods and droughts, and a summary metric was computed to identify the cases in the top 80th and 95th percentiles for total losses (both crop and property damage, in 2000 dollars). By identifying cases that fell in the top 80th percentile for the union of flooding and drought, we identified 99 counties with the highest cumulative losses from the combined hazard, focusing the data reduction on approximately 4,700 cases. By then parsing the data by geographic area, we were able to sort cases by date and identify specific circumstances in which losses from floods and droughts occurred in spatial-temporal proximity. To conclude this phase of the project, we will examine historical records such as news sources to gain insight into the consequences of the combined hazard events and how risk was communicated. This approach has identified the most acute cases; we anticipate that continuing analysis will identify broader and more nuanced patterns and generate additional historical information. We will then seek additional collaborators with expertise in GIS and atmospheric science to use climatological data to identify areas where this combined natural hazard may increase under climate change scenarios.
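The county-screening step described above can be sketched in code. Everything below is illustrative: the county names and loss figures are invented, and treating the “combined hazard” criterion as counties in the top percentile for *both* flood and drought losses is one plausible reading of the procedure, not the authors’ exact method.

```python
# Illustrative sketch of the SHELDUS-style screening: isolate flood and
# drought records, total losses per county, and keep counties in the top
# `pct` percentile for BOTH hazards. All data here are invented.

def percentile_threshold(values, pct):
    """Value at the given percentile (nearest-rank method)."""
    ordered = sorted(values)
    k = max(0, int(round(pct / 100.0 * len(ordered))) - 1)
    return ordered[k]

def high_loss_counties(records, pct=80):
    """Counties at or above the `pct` percentile for both flood and drought losses."""
    totals = {"flood": {}, "drought": {}}
    for r in records:  # each record: {"county", "hazard", "loss"} (crop + property, 2000 USD)
        if r["hazard"] in totals:
            t = totals[r["hazard"]]
            t[r["county"]] = t.get(r["county"], 0.0) + r["loss"]
    cutoffs = {h: percentile_threshold(list(t.values()), pct)
               for h, t in totals.items() if t}
    return sorted(c for c in totals["flood"]
                  if totals["flood"][c] >= cutoffs["flood"]
                  and totals["drought"].get(c, 0.0) >= cutoffs["drought"])
```

Sorting the surviving counties’ records by date would then expose the spatial-temporal proximity of flood and drought losses discussed above.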

P.100  Risk perception of drinking water quality in a US-Mexico Border community. Victory K*, Cabrera N, Larson D, Reynolds K, Latura J, Beamer P; University of Arizona and Mariposa Community Health Center   kerton1@email.arizona.edu

Abstract: The United States and Mexico are the largest consumers of bottled water worldwide, but it is unclear what drives this increased consumption. We previously demonstrated, in a cross-sectional study, that in the border town of Nogales, Arizona, approximately 85% of low-income residents primarily drink bottled water and 50% cook with it. In the current study, we interviewed ninety low-income Latinos in Nogales, AZ to assess differences in perceived risks of drinking municipal tap water and bottled water and to understand why these families use bottled water as their primary drinking water source. Respondents viewed drinking tap water as a significantly more risky activity than consuming bottled and other purchased sources of water (p<0.001). Additionally, 98% of respondents feared that drinking local municipal tap water could result in adverse health effects such as cancer, lupus or gastrointestinal illnesses, and did not associate drinking bottled water with any health outcomes. The majority of respondents (79%) stated that their primary reason for not drinking their tap water was fear of chemical and microbial contamination, compared to only 17% who preferred the taste of bottled water. Furthermore, respondents had significantly higher perceived risk (p<0.001) of drinking tap water in Mexico than in the U.S., but perceived no differences among U.S. cities. Parents aged thirty-five and older had significantly higher perceived risk (p<0.001) regarding the safety of their tap water compared to younger parents. We found no significant differences in perceived risks by gender, education, income or immigration status. Based on these results, future studies are needed to assess whether these findings are localized to Nogales or persist in other parts of the state or border region.

P.101  Natural disaster cognitive appraisals and disaster preparedness in immigrants and native-born in the Canadian context: A need for psychosocial considerations. Yong AG*, Lemyre L, Pinsent C, Krewski D; University of Ottawa   ayong089@uottawa.ca

Abstract: In Canada, the immigrant population has been identified as a higher-risk population in the context of emergencies. Past research shows that immigrants experience more negative consequences after disasters compared to the native-born. Thus, it is important to promote emergency preparedness within this population well before an event occurs. Immigrants may have different risk perceptions due to socio-contextual factors, so psychosocial considerations are critical in order to better target risk communication and management for this population. Yet little is documented about the risk mental models of immigrants. To better understand perceptions of natural disasters among immigrants to Canada and their relation to preparedness, data from a representative national survey of the Canadian public (n = 3263) were analyzed. This study examined natural disaster cognitive appraisals (perceived risk, likelihood, knowledge, mastery, complexity and uncertainty) among Canadian-born respondents and immigrants, and how these factors predicted preparedness behaviours. Results showed that while there were no significant differences in cognitive appraisals between Canadian-born respondents and immigrants, there were significant differences in how these factors predicted uptake of preparedness behaviours. For example, while perceived mastery positively predicted actual preparedness for both immigrants and the Canadian-born, perceived knowledge positively predicted actual preparedness for the Canadian-born, but not immigrants. Also, intention to comply with recommendations to evacuate was predicted differentially for Canadian-born respondents versus immigrants. Results suggest that there may be socio-contextual nuances in the pathways that explain the differences in uptake of preparedness. Therefore, psychosocial considerations for immigrants need to be incorporated in risk communication and management aimed at better preparedness and response. 
Theoretical and practical implications will be discussed.

P.102  Public Collaboration on a 30-year Commitment to Assess Superfund Health Outcomes in Butte, Montana. Ackerlund W.S.*; Kleinfelder   sackerlund@kleinfelder.com

Abstract: A collaborative process is described for gaining meaningful community participation on an atypical Superfund study to investigate the adequacy of health protection achieved through remediation. The process will guide the scope and conduct of health studies of the Butte community to be performed every five years for thirty years. The process emerged to address differences among stakeholders about health study needs and concerns for trust and transparency in the risk analysis. It effectively coordinates a technical working group (comprised of responsible parties, oversight agencies, supporting agencies, and consultants), a citizen’s advisory committee, EPA’s Technical Assistance Grant (TAG), and citizens at large. Major collaboration principles applied, lessons learned, and project benefits to date are identified. The collaboration effort provides a useful example of how to achieve common understanding among diverse stakeholders about public health risks.

P.103  Public response to information about the risk of cancer after the nuclear disaster in Fukushima. Sakata N*, Kuroda Y, Tsubono K, Nakagawa K; The University of Tokyo   sakatan-rad@h.u-tokyo.ac.jp

Abstract: The purpose of this study was to assess the response of residents in the affected area of Fukushima to a presentation on cancer risk that compared exposure to radiation with lifestyle choices. In March 2013, residents of Fukushima who had not been evacuated attended a lecture by an expert about cancer risks. After the lecture, a questionnaire about their response to the presentation was completed by 173 residents. The questionnaire concerned the perceived usefulness of, or aversion toward, the comparison of exposure to radiation and lifestyle choices; residents responded on a 4-point Likert scale. In addition, the reason for any aversion was requested, and content analysis was performed on the qualitative data. Of the 173 residents (mean age ± SD = 59.53 ± 11.1), 85.5% rated the expert’s information useful or very useful, while 14.5% responded that the discussion was not very useful or not useful. Additionally, 59.3% responded that they did not feel any aversion toward the comparison of exposure to radiation and lifestyle, while 40.7% responded that they did. Five categories and twelve codes were extracted from the residents’ responses, including “could not understand the methodology,” “did not like the fact that the expert classified the risk of radiation as low,” “it was inappropriate to compare exposure to radiation and lifestyle choices,” “distrust of government and experts,” and “the risk assessment for children was incomplete.” Comparing exposure to radiation and lifestyle choices was considered helpful in understanding the risk of cancer by most residents, but feelings of aversion were also present. Such aversion engendered by information about cancer risks should be anticipated and addressed in future presentations.

P.104  Crisis and Emergency Risk Communication to Family Physicians in Canada. KAIN NA*; University of Alberta   nkain@ualberta.ca

Abstract: Family physicians play a unique role in the Canadian health care system as both recipients and translators of complex health risk information. This role is especially highlighted during times of public health crisis, such as the SARS outbreak in 2003, the Maple Leaf Foods Listeria outbreak in 2008, and the H1N1 pandemic influenza outbreak in 2009. Reviews of these crises outline the necessity of improved risk communication of appropriate and timely information to family physicians. Public health and professional agencies need to better understand the information-seeking behaviours, knowledge requirements, and trust relationships of this community in order to maximize the potential of knowledge dissemination to this group and to improve the risk communication of crisis/emergency information to family physicians in Canada. This paper outlines an original research study that will: 1) explore the way in which Canadian family physicians access information during times of public health crisis/emergency, 2) acquire a clear understanding of whom family physicians trust for timely, accurate and credible information, and 3) assess the knowledge requirements and risk communication needs of family physicians. Using a phenomenological approach, individual interviews will be conducted with family physicians from various practice settings and locations across Canada. The interviews will be audio-recorded, transcribed verbatim, and coded to identify descriptions of the phenomenon of risk communication of crisis/emergency information to family physicians; these descriptions will then be clustered into categories to describe the “essence” of this phenomenon. A set of recommendations will be proposed for public health and professional agencies in Canada to improve risk communication strategies for family physicians relating to crisis/emergency information.

Risk & Development
 

P.106  Risk Management in Colombia: The Challenge of Development. Orozco G*; Universidad del Norte   gorozcorestrepo@gmail.com

Abstract: This article addresses both the vulnerability of biodiversity in the Colombian territory and the government’s priority of expanding mining activities in favor of economic growth. The main purpose is to explain why current risk management is not adequate to the major environmental problems in Colombia.

P.107  Selection of next-generation low global-warming-potential refrigerants by using a risk trade-off framework. Kajihara H*; National Institute of Advanced Industrial Science and Technology   kajihara.hideo@aist.go.jp

Abstract: Because the refrigerants currently used in air-conditioners have high global-warming potential (GWP), substances with lower GWP, such as R-1234yf, are being sought as candidate next-generation refrigerants. However, low-GWP substances often have comparatively high chemical reactivity and may carry increased risks of combustibility, toxicity, generation of degradation products, and increased CO2 emissions caused by poor energy-saving performance. It is therefore possible that there is a risk trade-off between currently used refrigerants and low-GWP ones. In this research, I proposed a framework for evaluating this risk trade-off in the following five categories: (1) environmental characteristics; (2) combustion characteristics; (3) toxicity; (4) volume of greenhouse gas emissions; and (5) applicability to air-conditioning equipment. I then selected substances well suited as next-generation refrigerants in accordance with a specific screening process. I showed the importance of clearly specifying the combination of end points and assessment criteria in decision-making based on risk trade-offs. This yields a rapid understanding of the necessary data, as well as flexible decision-making that is relevant to social conditions.
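The screening step over the five categories can be illustrated with a toy filter. All candidate property values, thresholds, and criterion names below are assumptions invented for the sketch; the abstract does not publish its actual screening data or cutoffs.

```python
# Toy screen of candidate refrigerants against the five categories named
# above. Property values and thresholds are illustrative, not study data.

CRITERIA = {
    "gwp": lambda v: v < 150,                # (1) environmental characteristics
    "flammability_class": lambda v: v <= 2,  # (2) combustion characteristics
    "toxicity_class": lambda v: v == "A",    # (3) toxicity (lower-toxicity class)
    "lifecycle_co2": lambda v: v < 1.0,      # (4) greenhouse gas emissions (relative)
    "cop_ratio": lambda v: v >= 0.95,        # (5) applicability: efficiency vs. incumbent
}

def screen(candidates):
    """Return names of candidates passing every criterion."""
    return sorted(name for name, props in candidates.items()
                  if all(test(props[key]) for key, test in CRITERIA.items()))

candidates = {   # illustrative values
    "R-1234yf": {"gwp": 4, "flammability_class": 2, "toxicity_class": "A",
                 "lifecycle_co2": 0.9, "cop_ratio": 0.97},
    "R-410A":   {"gwp": 2088, "flammability_class": 1, "toxicity_class": "A",
                 "lifecycle_co2": 1.0, "cop_ratio": 1.0},
}
```

Making each criterion an explicit, named predicate is one way to realize the abstract’s point that the combination of end points and assessment criteria should be clearly specified before screening.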

P.108  Contamination risks and effects on suburban areas from a ceramic and tile factory: a case study. COLON L, MONZON A, DEMICHELIS S*; National University of Lanus   sandrademichelis@yahoo.com

Abstract: The aim of this study is to evaluate the contamination produced by a ceramic and tile factory located in the Industrial Park of Burzaco, Almirante Brown County, Buenos Aires Province, Argentina. The surrounding area is urbanized, with no barrier of any kind between industry and residents. The factory discharged uncontrolled residual waters from its production process into an artificial pond that is in contact with the population: the pond eventually discharges into a natural stream and, because it was not adequately built, it also affects groundwater. Wastewater, soil, and air sources of pollution were analyzed. In addition, a neighborhood at risk, which has been surrounded by the growth of the industrial park, is directly affected by particulate material disposed in open areas on the soil and by uncontrolled gaseous emissions. The population’s vulnerability is increased by heavy rains, since runoff transports particulates from accumulation areas and pollutants from the overflowing lagoon. This work presents an integrated system of environmental and wastewater management that combines technological improvements, including an effluent treatment system, with an intervention protocol for the administrative and production sectors, in order to guarantee a correct life cycle for the factory’s products and diminish risks to inhabitants.

Risk, Policy & Law
 

P.109  Setting a Regulatory Cleanup Level for the Emerging Contaminant Sulfolane. Farris AM, Buss SD*, Cardona-Marek T; Alaska Department of Environmental Conservation and SPB Consulting   buss.steph@gmail.com

Abstract: Sulfolane is an industrial solvent used in oil and gas processing. When sulfolane was first detected in the groundwater at the North Pole Refinery in North Pole, Alaska, it was not regulated by the State of Alaska or the United States Environmental Protection Agency (US EPA). In addition, little data was available on the long-term toxicity of the compound. In 2004, the Alaska Department of Environmental Conservation (ADEC) set a cleanup level for sulfolane in groundwater of 350 parts per billion (ppb) based on toxicity levels from a Canadian Council of Ministers of the Environment report. This concentration was not exceeded at the refinery boundaries, so no further characterization was completed until additional monitoring wells were installed in 2009 and concentrations were higher than expected. Sulfolane was then tested for and discovered in over 300 private drinking water wells downgradient of the refinery, as well as in the city municipal wells. This discovery led ADEC to coordinate with the State Department of Health, the Agency for Toxic Substances and Disease Registry (ATSDR), and US EPA toxicologists to better evaluate the chemical’s potential health effects and re-evaluate the cleanup level. Nearly 30 toxicologists reviewed the available data on sulfolane and recommended substantially lowering the established level. The State of Alaska calculated a cleanup level of 14 ppb in groundwater based on a US EPA Provisional Peer-Reviewed Toxicity Value protective of groundwater ingestion and on site-specific exposure parameters. Beyond setting the cleanup level, ADEC nominated sulfolane to the National Toxicology Program for additional toxicity research. The nomination was accepted and additional studies are underway.

P.110  The role of statistical models in drinking water distribution system asset management. Rao V*, Francis R; The George Washington University   vrao81@gwu.edu

Abstract: A robust asset management plan needs to be in place for water utilities to effectively manage their distribution systems. Of concern to utilities are broken pipes, which can allow bacteria to enter the water system and cause illness among consumers. Typically, water utilities allocate a portion of funds every year for renewal of pipes and valves. However, pipe renewal is largely reactive, replacing pipes that have already broken, and long-term asset management planning for pipe replacement is not a priority for water utilities. Water utilities are beginning to use probabilistic break models and other statistical tools to predict pipe failures. These models incorporate variables such as pipe length, diameter, age, and material. Although these models are emerging in the water industry, their direct impact on long-term asset planning remains to be seen. In addition, their effectiveness is open to question, as little research has evaluated their ability to assist in asset management planning. This paper discusses the role of probabilistic pipe break models in structuring long-term asset management decisions and tradeoffs made by drinking water utility companies.
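The kind of statistical break model described, one incorporating pipe length, diameter, age, and material, can be sketched as a simple exponential rate model. The coefficients below are invented for illustration; in practice they would be fitted to a utility’s historical break records, and real models are considerably richer.

```python
import math

# Minimal sketch of a probabilistic pipe break model: an exponential
# rate model lambda = exp(b0 + b1*age + ...), scaled by segment length,
# giving expected breaks per year. Coefficients are invented.

COEF = {"intercept": -7.0, "age": 0.03, "diameter_mm": -0.002,
        "material": {"cast_iron": 0.8, "ductile_iron": 0.2, "pvc": 0.0}}

def expected_breaks_per_year(length_m, age_yr, diameter_mm, material):
    """Expected annual breaks for one pipe segment (rate per meter x length)."""
    log_rate = (COEF["intercept"]
                + COEF["age"] * age_yr
                + COEF["diameter_mm"] * diameter_mm
                + COEF["material"][material])
    return length_m * math.exp(log_rate)

def rank_for_renewal(pipes, top_n=2):
    """Rank segments by expected breaks to prioritize long-term renewal."""
    scored = sorted(pipes, key=lambda p: expected_breaks_per_year(**p), reverse=True)
    return scored[:top_n]
```

Ranking segments by predicted break rate, rather than by observed breaks alone, is what would shift renewal planning from reactive replacement toward the long-term prioritization the paper discusses.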

P.111  SafeWater CBX: Incorporating Uncertainty and Variability in Benefits Analysis. Stedge J*, Brad F; Abt Associates   gerald_stedge@abtassoc.com

Abstract: Incorporating variability and uncertainty into public health regulation benefits assessments is critical to fully understanding the potential impacts; however, doing so can be data intensive and computationally complex. To support the development of national primary drinking water standards, we developed the SafeWater CBX model which is designed to estimate the health benefits associated with alternative maximum contaminant levels (MCLs) in drinking water. SafeWater CBX is the first model ever developed to fully incorporate both variability and uncertainty in drinking water benefits assessment. The model first estimates the exposed populations at each public water system (PWS) entry point to the distribution system (EP). The exposed population is categorized by age and gender. Based on EP-level distributions of contaminant occurrence (which vary by source water type and region and are uncertain), drinking water consumption (which varies by age), and dose-response functions (which vary by age and gender and are also uncertain), the model then estimates the expected cases of illness each year (over a 50-year period of analysis) at any number of alternative MCLs. SafeWater CBX then values both the reduced expected illnesses and deaths avoided using cost of illness estimates and the value of statistical life (both of which are uncertain). The health benefits can be displayed by source water type (ground or surface water), age group, sex, PWS system size, and region. In addition to its ability to incorporate variability and uncertainty into the benefits analysis, SafeWater CBX also provides users with the option to run in “mean mode” where all inputs are treated as certain (variability is still modeled). This capability allows users to conduct sensitivity analyses in real time (15 minutes per run), making it possible to incorporate information on benefits into the regulatory option selection process.
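The distinction the model draws between variability and uncertainty can be illustrated with a minimal two-dimensional Monte Carlo sketch: an outer loop samples uncertain inputs (here, a dose-response slope), and an inner loop samples variable ones (per-person consumption). The distributions, parameter values, and function below are invented for illustration and are not SafeWater CBX internals.

```python
import random

# Two-dimensional Monte Carlo sketch: uncertainty in the outer loop,
# variability in the inner loop. All distributions are invented.

def expected_cases(pop=1000, n_uncertain=200, n_var=100, seed=1):
    rng = random.Random(seed)
    estimates = []
    for _ in range(n_uncertain):                   # uncertainty loop
        slope = rng.lognormvariate(-7.0, 0.5)      # uncertain risk per (L/day)
        risks = []
        for _ in range(n_var):                     # variability loop
            intake = rng.lognormvariate(0.0, 0.4)  # L/day, varies by person
            risks.append(slope * intake)
        estimates.append(pop * sum(risks) / len(risks))
    estimates.sort()
    return {"mean": sum(estimates) / len(estimates),
            "p5": estimates[int(0.05 * len(estimates))],
            "p95": estimates[int(0.95 * len(estimates))]}
```

A “mean mode” of the kind the abstract mentions would collapse the outer loop to point estimates while keeping the inner variability loop, which is what makes fast sensitivity runs possible.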

P.112  The Balance between Protection of Human Health and Compliance with Regulatory Standards. Sager SL*, Locey BJ, Schlekat TH; ARCADIS U.S., Inc.   ssager@arcadis-us.com

Abstract: Drinking water standards such as maximum contaminant levels (MCLs) are frequently selected as remedial goals for groundwater whether or not this resource is used as a potable water supply. Some states go as far as promulgating groundwater standards based on MCLs. However, when toxicity values change, there can be a lag between those changes and a revision to the drinking water standard, and this lag has vast implications for industrial sites. As examples, the changes in the toxicity values of 1,1-dichloroethene (1,1-DCE) and tetrachloroethene (PCE) recommended by the United States Environmental Protection Agency (USEPA), and their implications for setting groundwater standards and remedial goals, will be discussed. The groundwater standard for 1,1-DCE in the State of North Carolina will be presented as a case study, and implications for the groundwater standard for PCE will also be discussed. North Carolina recently revised the groundwater standard for 1,1-DCE from the MCL of 7 micrograms per liter (µg/L) to a health-based concentration of 350 µg/L. In its economic analysis in support of the revision, the State of North Carolina reported cost savings of over $1,000,000 from reduced sampling and analysis, reporting, and regulatory oversight. However, since compliance with the MCL is still required for public water supplies, these savings are primarily achieved at sites without potable water use. A similar scenario is envisioned for PCE, with broader economic impacts. The State of North Carolina groundwater standard is a health-based concentration of 0.7 µg/L based on an old toxicity value. A revised standard could be set equal to the current MCL of 5 µg/L, or it could be recalculated to equal 16 µg/L if the latest toxicity information from USEPA is utilized. In addition, this paper will discuss the cost to both industry and the public of the time lag in updating standards based on the latest science.

Risky Eating
 

P.113  Probabilistic Risk Assessment for 2-Amino-1-Methyl-6-Phenylimidazo[4,5-b]- Pyridine (PhIP) through Daily Consumption of High-Temperature Processed Meats and Fishes in Taiwan. Liu LH*, Chan CC, Wu KY; National Taiwan University   smorezed@gmail.com

Abstract: 2-Amino-1-methyl-6-phenylimidazo[4,5-b]pyridine (PhIP) has been reported in many pan-fried, oven-broiled, and grilled meats and fishes, which are important sources of nutrients. It caused colon, prostate, and mammary cancer in animal bioassays and is classified as a possible human carcinogen. PhIP causes the formation of DNA adducts, is a mutagen, and has a genotoxic mode of action relevant to humans. Daily dietary intakes of PhIP through consumption of high-temperature processed meats and fishes have therefore been of great concern. This study aimed to perform a probabilistic cancer risk assessment of PhIP due to daily consumption of these meats and fishes for the Taiwanese population. Dose-response modeling was performed with the Benchmark Dose Software; a BMDL10 of 0.248 mg/kg-day was adopted for species extrapolation to derive a cancer slope factor of 0.4029 (kg-day/mg). PhIP concentrations in meats and fishes cooked by different methods were taken from the literature. Questionnaires were used to collect the frequency of consumption of these meats and fishes from 123 study subjects, in order to adjust the intake rates derived from national survey data. Probabilistic assessment of cancer risk and margin of exposure (MOE) was conducted using Monte Carlo simulation with the Crystal Ball software. The results reveal a mean cancer risk of 4.64 x10(-6), with an upper 95th confidence bound of 1.58x10(-5); the mean MOE is 270,000, with a lower 95th confidence bound of 10,000.
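The two summary metrics used above can be sketched directly. The BMDL10 (0.248 mg/kg-day) and slope factor (0.4029 kg-day/mg) are the values reported in the abstract; the example daily dose is hypothetical, since the study derived its dose distribution from questionnaires and literature concentrations.

```python
# Sketch of the two risk metrics, using the abstract's BMDL10 and cancer
# slope factor. The example dose is hypothetical, not a study result.

BMDL10 = 0.248       # mg/kg-day, point of departure from benchmark dose modeling
CSF = 0.4029         # (mg/kg-day)^-1, cancer slope factor

def cancer_risk(dose_mg_kg_day):
    """Lifetime excess cancer risk under linear low-dose extrapolation."""
    return dose_mg_kg_day * CSF

def margin_of_exposure(dose_mg_kg_day):
    """MOE = point of departure (BMDL10) / estimated daily dose."""
    return BMDL10 / dose_mg_kg_day

dose = 1.0e-5                        # mg/kg-day, hypothetical example
risk = cancer_risk(dose)             # 4.029e-6
moe = margin_of_exposure(dose)       # 24,800
```

In the study itself these formulas would be evaluated inside a Monte Carlo loop over the dose distribution, which is why the mean risk and mean MOE are reported with separate confidence bounds.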

P.114  An exposure and health risk assessment of metals in apple juice. Banducci AM*, Tvermoes B, Bebenek I, Monnot A, Devlin K, Madl A; Cardno Chemrisk   amber.banducci@cardno.com

Abstract: Concerns have recently been raised about heavy metal contamination in apple juices and its potential impact on children’s health. Heavy metals such as aluminum (Al), arsenic (As), chromium (Cr), copper (Cu), lead (Pb), manganese (Mn), mercury (Hg), and zinc (Zn) are, and historically have been, used in a number of herbicides, fungicides, insecticides, and other pesticides in the United States and worldwide. As a result, these metals have the potential to contaminate fruit used to make juices. This study investigated the possible human health risks associated with heavy metal contamination in apple juice. The concentrations of several metals, including Al, As, cadmium (Cd), Cr, Cu, Pb, Mn, Hg, and Zn, were measured in six commercially available brands of apple juice and three organic brands. The concentrations of total As, Cd, Cr, Cu, Hg, and Zn in all nine apple juice brands sampled were below each metal’s respective FDA maximum contaminant level for bottled water. However, in some juices the levels of Al, Pb, and Mn exceeded the FDA maximum contaminant levels for bottled water. Thus, to understand the implications of these findings with regard to children’s health, hazard quotients (HQs) for Al and Mn were calculated to estimate the non-carcinogenic risk of heavy metal exposure from apple juice. In addition, blood Pb concentrations were estimated to characterize potential risk from Pb exposure following apple juice consumption. Due to recent concerns, non-carcinogenic risk was also estimated for As. Our results suggest that the exposure concentrations of Al, Mn, Pb, and As that result from apple juice consumption do not pose an increased non-carcinogenic health risk for children.
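The hazard-quotient screen described above can be sketched as follows. The juice concentration, intake rate, body weight, and reference dose below are all assumptions invented for the sketch, not the study’s measurements.

```python
# Illustrative hazard-quotient (HQ) screen: HQ = average daily dose / oral
# reference dose, with HQ < 1 suggesting low non-carcinogenic concern.
# All numeric inputs are assumed values for illustration.

def hazard_quotient(conc_mg_per_L, intake_L_per_day, bw_kg, rfd_mg_per_kg_day):
    """HQ for one metal from one beverage pathway."""
    add = conc_mg_per_L * intake_L_per_day / bw_kg   # average daily dose, mg/kg-day
    return add / rfd_mg_per_kg_day

# Hypothetical child scenario: 0.7 L juice/day, 15 kg body weight,
# assumed Mn concentration and assumed oral reference dose.
hq_mn = hazard_quotient(conc_mg_per_L=0.05, intake_L_per_day=0.7,
                        bw_kg=15.0, rfd_mg_per_kg_day=0.14)
```

With these assumed inputs the HQ comes out well below 1, which is the shape of the conclusion the abstract reports for Al and Mn (with the study’s own measured inputs).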

P.115  Dietary, occupational, and ecological risk assessment of carbaryl and dimethoate. Chiang SY*, Chang-Chien GP , Horng CY , Wu KY ; China Medical U., Taiwan   sychiang@mail.cmu.edu.tw

Abstract: The application of pesticides may cause adverse impacts on human users and environmental receptors. We carried out consumer, occupational, and ecological risk assessments of two commonly used pesticides, carbaryl (a carbamate) and dimethoate (an organophosphate). For the consumer health risk assessment, based on the current tolerance and highest residue levels in crops, fruits, and vegetables, the non-carcinogenic risk index (hazard index, HI) of carbaryl, but not dimethoate, was less than 1. Further analysis using Monte Carlo simulation showed that the means and upper 95% confidence limits of total HIs for carbaryl and dimethoate in different crops did not exceed one. For the occupational exposure risk assessment, distributions of pesticide exposure were assessed by HPLC analysis of personal air sampling tubes and patches from 27 workers and 16 farmers. Some of the upper 95% confidence limits of total HIs for carbaryl and dimethoate were larger than 1, suggesting the importance of strengthening personal protective measures at work. The ecological risk assessment shows that carbaryl poses a potential risk to aquatic insects, but not to fishes and frogs, whereas dimethoate has significant ecological hazard effects on minnows, stoneflies, and frogs. These results can serve as a reference for government pesticide regulatory decisions.

P.117  Questionnaire survey on water ingestion rates for various types of liquid and the seasonal differences between summer and winter. Ohno K*, Asami M, Matsui Y; National Institute of Public Health, National Institute of Public Health, Hokkaido University, Japan   ohno-k@niph.go.jp

Abstract: Water ingestion rates not only from community water supplies but also from commercial beverages were surveyed to obtain ingestion rates for various types of water and to investigate seasonal differences. The surveys were conducted during winter and summer in 2012. To represent the general population aged 20–79 in Japan, members of an Internet research company were invited to participate; 1188 individuals responded in winter and 1278 in summer. Respondents were asked to record the daily intake volume of each kind of water on two arbitrary working days and one non-working day during the research period. The kinds of water were non-heated and heated tap water as a beverage, soup made from tap water, bottled water, and commercial beverages including alcoholic drinks. There were few differences in water ingestion rates among the three days, so the first day’s results are presented here. The mean water intakes from all kinds of water in summer and winter were 1936 and 1638 mL/day, respectively (95th percentiles: 3570 and 2900 mL/day); mean intake in summer was thus 1.2 times that in winter. Nevertheless, the mean intakes from tap water, including soup, in summer and winter were 1159 and 1124 mL/day, respectively (95th percentiles: 2400 and 2200 mL/day), showing only small seasonal differences. The main component driving seasonal differences was commercial beverages, with mean intakes in summer and winter of 635 and 437 mL/day, respectively (95th percentiles: 2500 and 1200 mL/day). With respect to tap water, large seasonal differences were observed for non-heated water; mean intake in summer was 2.1 times that in winter (545 vs. 255 mL/day; 95th percentiles: 1676 and 950 mL/day). 
In conclusion, seasonal differences in water intake should be considered in risk assessment, especially microbial risk assessment; the default water ingestion rate of 2 L/day may not always be the most appropriate, depending on the types of water taken into account.

P.118  Measurement of Hand to Mouth Lead Transfer Efficiency - A Simulation Study. Sahmel J, Devlin KD, Hsu EI*; Cardno Chemrisk   Elleen.Hsu@cardno.com

Abstract: There are currently no known empirical data in the published literature that characterize hand-to-mouth transfer efficiencies for lead. The purpose of this study was to quantify the potential for the hand-to-mouth transfer of lead in adult volunteers using human saliva on a test surface as a surrogate for the mouth. Commercially available 100% lead fishing weights, confirmed by bulk analysis, were used as the source of dermal lead loading in this study. Volunteers were instructed to collect saliva in a vial prior to the study. A small amount of saliva was poured on to a sheet of wax paper placed on a balance. Volunteers were instructed to handle lead fishing weights with both hands for approximately 20 seconds and then press three fingers from the right hand, ten presses per finger, into the saliva with approximately one pound of pressure. The left hand remained as a control with no saliva contact to compare total dermal loading. Palintest® wipes were used to perform a series of wipes to collect lead from the saliva and skin surfaces. Samples were analyzed by the NIOSH 7300 method, modified for wipes. Quantitative analysis yielded a lead hand-to-mouth transfer efficiency that ranged from 12 to 34% (average 24%). A two-tailed paired t-test determined that the amount of lead loaded onto each hand was not statistically different (p-value: 0.867). These quantitative transfer data for lead from the skin surface to saliva are likely to be useful for the purposes of estimating exposures in exposure assessments, including those involving consumer products, and human health risk assessments.
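The transfer-efficiency arithmetic can be sketched as below. The wipe-sample masses are invented, and dividing saliva-transferred lead by total right-hand loading (transferred plus residual) is one plausible reading of how the 12-34% figures were computed, not the authors’ documented formula.

```python
# Sketch of the hand-to-mouth transfer-efficiency calculation.
# Wipe masses (micrograms of lead) are invented for illustration.

def transfer_efficiency(saliva_ug, hand_residual_ug):
    """Fraction of the hand's lead loading transferred to saliva.
    Assumes total loading = transferred + residual on the sampled hand."""
    total = saliva_ug + hand_residual_ug
    return saliva_ug / total

eff = transfer_efficiency(saliva_ug=12.0, hand_residual_ug=38.0)  # 0.24
```

With these invented masses the efficiency is 24%, coincidentally matching the study’s reported average; the control (left) hand in the study served to verify that both hands carried comparable loading.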

P.119  Probabilistic Assessment of Lifetime Cancer Risk for Acrylamide through Daily Consumption of High-Temperature Processed Foods in Taiwan with Bayesian Statistics Markov Chain Monte Carlo Simulation. Wu CY*, Chang CH, Chung YC, Chen CC, Wu KY; National Taiwan University   b97310020@ntu.edu.tw

Abstract: Acrylamide (AA), a probable carcinogen, is present in foods, especially in carbohydrate-rich foods processed at high temperature. The potential health risk has been of great concern because daily AA intakes can vary geographically with dietary habits and cooking processes. In order to assess the lifetime cancer risk, a variety of high-temperature processed foods were collected from several counties in Taiwan. In total, the AA contents of 300 samples were analyzed with liquid chromatography tandem mass spectrometry. Questionnaires were used to collect intakes of these foods from 132 study subjects. Given the limited data on AA contents and food intake rates, Bayesian statistics Markov chain Monte Carlo simulation was used to perform a probabilistic assessment of lifetime cancer risk for AA through consumption of high-temperature foods in Taiwan and to estimate the representative distribution of daily AA intake. In this study, the mean intake dose is 1.655 micrograms/kg-day for the samples. Using the EPA (2010) cancer slope factor (0.51 (mg/kg-day)^-1), the mean cancer risk for the Taiwan population is 8.44×10^-4. The risk in our study is higher than those of other studies, which could be attributed to food intake rates being overestimated because the surveys drew on a young population. Nevertheless, it may be advisable for the young study subjects to reduce their consumption of fried foods.
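The headline risk number follows directly from the reported mean intake and slope factor under the standard linear low-dose model (risk = dose × slope factor); a minimal check of that arithmetic, with the unit conversion from µg to mg as the only extra step:

```python
# Reported mean daily acrylamide intake and EPA (2010) cancer slope factor.
mean_intake_ug_per_kg_day = 1.655  # micrograms/kg-day
cancer_slope_factor = 0.51         # per (mg/kg-day)

# Lifetime excess cancer risk = daily dose (mg/kg-day) x slope factor.
risk = (mean_intake_ug_per_kg_day / 1000.0) * cancer_slope_factor
print(f"mean lifetime cancer risk: {risk:.2e}")  # prints 8.44e-04
```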

Security & Defense
 

P.120  Phase I Impact Assessment Results for 1-bromopropane and 3-nitro-1,2,4-triazol-5-one (NTO). Rak A*, Vogel CM, Bass N; Noblis Inc., US Army Public Health Command   andrew.rak@noblis.org

Abstract: The Department of Defense’s (DoD’s) Emerging Contaminants (EC) Program has a well-established three-tiered process for over-the-horizon scanning for ECs, conducting qualitative and quantitative impact assessments in critical functional areas, and developing sound risk management options. This “Scan-Watch-Action” process was used to examine potential risks from 1-bromopropane and the insensitive high explosive NTO. Subject matter experts (SMEs) from throughout the DoD used the Emerging Contaminants Assessment System (ECAS) tool to evaluate the potential risks to DoD associated with these two mission-critical chemicals. Members of the EC Program team used the Impact Assessment Criteria Assessment Tool (ICAT) to analyze SME input. Together, these two groups developed a set of initial risk management options (RMOs) within the DoD. The risks identified by the SMEs and the potential RMOs for each chemical are presented for each of five different functional areas. The uncertainties in the SMEs’ risk estimates are also discussed, and recommendations for further analysis are presented. These assessments conclude that 1-bromopropane requires significant risk management actions to mitigate possible risks from occupational exposure, while NTO requires that additional toxicity and environmental fate data be collected.

P.121  Quantitative Approach to Risk on Fuel Transportation Pipelines. Parra LM*, Munoz F; Universidad de los Andes   lm.parra71@uniandes.edu.co

Abstract: Hazardous materials transportation by pipeline is a widespread industrial practice all over the world. It is socially accepted as favorable; however, like any other industrial practice, it is dangerous and poses risks to society, the environment, and infrastructure. Since production sites are often far from consumption centers, increased demand in Colombia and the world has led to increased transportation needs, forcing the growth of the pipeline network. Performing risk analysis before incidents occur can provide engineering tools to support decision making regarding the compatibility of activities within a territory. This type of analysis examines information related to the consequences of a critical event, such as radiant heat or overpressure effects that can affect communities in the vicinity of industrial facilities. This work develops a methodology for risk assessment and evaluation of the societal and individual risk associated with accidental events on fuel pipelines. It includes a study of past events in Colombia and Latin America, as well as an analysis of the causes and types of events. Because risk values vary along a pipeline due to differences in environmental and societal conditions, the analysis is performed for segments of the pipeline; in order to obtain manageable segments for analysis, a dynamic segmentation method is proposed. Individual and societal risk is estimated for each segment, identifying those that require priority actions. Societal risk values are expressed as curves of accident frequency versus expected number of fatalities. The last part of this work proposes risk criteria to support decision making in Colombia with respect to land-use planning, which may generate tools to support decision making and determine whether risk values can be reduced.
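Societal risk curves of the kind described (frequency of accidents vs. expected number of fatalities) are conventionally tabulated as F-N pairs; a minimal sketch, using an invented accident record for one pipeline segment:

```python
def fn_curve(fatalities_per_event, observation_years):
    """F-N curve: for each fatality level N, the annual frequency of
    events causing N or more fatalities."""
    levels = sorted(set(fatalities_per_event))
    return [(n, sum(1 for f in fatalities_per_event if f >= n) / observation_years)
            for n in levels]

# Hypothetical record: four accidents on one segment over 20 years.
events = [1, 3, 3, 10]
for n, freq in fn_curve(events, 20):
    print(f"N >= {n:2d}: {freq:.3f} events/year")
```

Plotting these pairs on log-log axes against a criterion line is the usual way such curves support land-use decisions.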

P.122  A New Endophyte Risk Assessment Model. Bromfield KB*, Rowe AJ, Atapattu AA; Environmental Protection Authority   kate.bromfield@epa.govt.nz

Abstract: Fungal endophytes are microorganisms that occur naturally within plant tissues, and do not usually cause any disease symptoms. They are an important component of the plant microbiome, and affect the plant’s growth and its response to pathogens, herbivores, and varied environmental conditions through the production of secondary metabolites (alkaloids). Recent advances in plant biotechnology mean the plant traits conferred by these endophytes in association with their natural host plant can be transferred into new plant species, in much the same way that traits are manipulated in genetically modified organisms. For example, some grass endophytes are being artificially inoculated into cereal crops to confer insect pest and drought resistance. However, some of the alkaloids produced by these endophytes are known to cause illness in grazing livestock, so there is a need to assess the risks of these changes to the chemical profile of the plants receiving these endophytes. We present a model for assessing the risks associated with the manipulation of endophytes across plant groups. We have tested this model using two case studies, presented here, and we are looking to expand its application further. The questions that drive this risk assessment model will inform any plant biosecurity risk assessment, including the assessment of plants with genetically modified traits. This model is the first of its kind and provides regulators with a simple yet effective approach to risk analysis, while ensuring consistency among decision makers.

P.123  Nanoscale risk assessment and uncertainty quantification in atomistic simulations. Wang Y*; Georgia Institute of Technology   yan.wang@me.gatech.edu

Abstract: Uncertainties in atomistic simulations imply the associated risks in simulation-based materials and drug development. Lack of data, conflicting information, numerical and measurement errors are the major sources of epistemic uncertainty in simulation. In particular, the sources of model form uncertainty for molecular dynamics (MD) include imprecise interatomic potential functions and parameters, inaccurate boundary conditions, cut-off distance for simplification, approximations used for simulation acceleration, calibration bias caused by measurement errors, and other systematic errors during mathematical and numerical treatment. The sources for kinetic Monte Carlo (kMC) simulation include unknown stable and transition states, and imprecise transition rates. In this work, we illustrate the sensitivity and effect of model form uncertainty in MD and kMC simulations on physical and chemical property predictions. A generalized interval probability formalism is applied to quantify both aleatory and epistemic uncertainties. New reliable MD and kMC simulation mechanisms are proposed, where the robustness of simulation predictions can be improved without the traditional second-order Monte Carlo style sensitivity analysis. Examples of engineering materials and biochemical processes are used to demonstrate the new nanoscale risk assessment approach.

P.124  Using portfolio optimization to select an optimal set of water security countermeasures. Bates ME*, Shoaf H, Keisler JM, Dokukin D, Linkov I; US Army Corps of Engineers, Engineer Research and Development Center   Matthew.E.Bates@usace.army.mil

Abstract: Counterterrorism decisions for infrastructure security can be challenging due to resource constraints and the large number and scope of potential targets and threats to consider. This poster presents a multi-criteria portfolio decision model (PDM) that optimizes countermeasure selection to maximize effectiveness under various counter-terrorism budget levels. Multi-criteria decision analysis (MCDA) is used to assess the holistic benefits of protective countermeasures when applied to key infrastructure in specific threat environments. Resulting scores, cost estimates, and synergistic/redundant interactions between projects are used to construct an efficient funding frontier that tracks how budget changes affect optimal portfolio composition. Results are presented for a case study based on literature data and author judgment optimizing protection against terrorist threats to a water supply network.
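A minimal sketch of the portfolio selection step, using a brute-force search over countermeasure subsets under a budget constraint; the countermeasure names, costs, and benefit scores are invented for illustration, and project interactions are omitted for brevity:

```python
from itertools import combinations

# Hypothetical countermeasures: name -> (cost, MCDA benefit score).
projects = {
    "perimeter fence": (3, 5.0),
    "water-quality sensors": (4, 7.0),
    "security patrols": (5, 8.0),
    "backup supply line": (6, 9.0),
}

def best_portfolio(projects, budget):
    """Exhaustively search all subsets for the highest-benefit affordable portfolio."""
    best_names, best_benefit = (), 0.0
    items = list(projects.items())
    for r in range(len(items) + 1):
        for combo in combinations(items, r):
            cost = sum(c for _, (c, _) in combo)
            benefit = sum(b for _, (_, b) in combo)
            if cost <= budget and benefit > best_benefit:
                best_names, best_benefit = tuple(n for n, _ in combo), benefit
    return best_names, best_benefit

# Efficient frontier: optimal benefit at several budget levels.
for budget in (5, 7, 10):
    names, benefit = best_portfolio(projects, budget)
    print(budget, benefit, names)
```

Tracing the optimum across budget levels, as in the loop above, is what yields the funding frontier the abstract describes.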

P.125  Application of multi-criteria decision snalysis to humanitarian sssistance and disaster response site suitability analysis. Bates ME*, Linkov I, Clark TL, Curran RW, Bell HM; US Army Corps of Engineers - Engineer Research and Development Center, Pacific Disaster Center   Matthew.E.Bates@usace.army.mil

Abstract: Humanitarian Assistance and Disaster Response (HADR) managers often face the complex task of prioritizing limited funds for investment across broad regions of varying need. In selecting regions and sites for project investment, project funders must assess and trade off site investment suitability along multiple dimensions. For example, governmental HADR resources might be invested to fit a combination of needs including investing agency mission, local community hazard exposure, local community resilience, and projected investment sustainability, each of which can be decomposed into many relevant sub-criteria. This poster presents a framework for HADR site suitability analysis based on the integration of spatial and non-spatial data from Geographic Information Systems (GIS) and other HADR sources via Multi-Criteria Decision Analysis, an analytical approach for integrating data across traditionally-incommensurate criteria via value functions and tradeoff weights. This framework is applied to a case study using HADR data to analyze investment suitability at the Department-level in El Salvador.
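The MCDA integration step can be sketched as a weighted additive value model; the sites, criterion values (assumed already normalized to [0, 1] by value functions), and tradeoff weights below are all invented:

```python
# Hypothetical sites scored on three HADR criteria, each normalized to [0, 1].
sites = {
    "Site A": {"hazard_exposure": 0.8, "resilience_gap": 0.3, "sustainability": 0.9},
    "Site B": {"hazard_exposure": 0.5, "resilience_gap": 0.9, "sustainability": 0.4},
}
# Tradeoff weights (assumed; must sum to 1).
weights = {"hazard_exposure": 0.5, "resilience_gap": 0.3, "sustainability": 0.2}

def suitability(site):
    """Weighted additive value across normalized criteria."""
    return sum(weights[c] * v for c, v in site.items())

ranked = sorted(sites, key=lambda s: suitability(sites[s]), reverse=True)
print(ranked)
```

In practice the normalized criterion values would come from GIS layers rather than being entered by hand.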

P.126   Microbial contamination in poultry chillers estimated by Monte Carlo simulations. Holser RA*; Russell Research Center   Ronald.Holser@ars.usda.gov

Abstract: The risk of contamination exists in meat processing facilities where bacteria normally associated with the animal are transferred to the product. If the product is not stored, handled, or cooked properly, the results range from mild food poisoning to potentially life-threatening health conditions. One strategy to manage risk during production is the practice of Hazard Analysis and Critical Control Points (HACCP). In keeping with the principles of HACCP, a key processing step for controlling bacterial growth occurs at the chiller. The risk of microbial contamination during poultry processing is influenced by the operating characteristics of the chiller. The performance of air chillers and immersion chillers was compared in terms of pre-chill and post-chill contamination using Monte Carlo simulations. Three parameters were used to model the cross-contamination that occurs during chiller operation: one estimates the likelihood of contact, a second estimates the likelihood of contamination resulting from that contact, and a third represents the influence of antimicrobial treatments in reducing bacterial populations. Results were calculated for 30%, 50%, and 80% levels of contamination in pre-chill carcasses. Air chilling showed increased risk of contamination in post-chill carcasses. Immersion chilling with 50 mg/L chlorine or 5% trisodium phosphate added to the chiller water as antimicrobial treatments reduced contamination to negligible levels in post-chill carcasses. Simulations of combination air/immersion chiller systems showed reductions of microbial contamination, but not to the extent of immersion chillers; this is attributed to the reduced exposure time to antimicrobial treatments. These results show the relation between chiller operation and the potential to mitigate the risk of microbial contamination during poultry processing.
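A minimal sketch of a three-parameter Monte Carlo model of this kind (contact likelihood, transfer likelihood given contact, and a surviving fraction after antimicrobial treatment); the parameter values are illustrative only, not the study's:

```python
import random

def post_chill_rate(pre_chill_frac, p_contact, p_transfer, survival_frac,
                    n_carcasses=20000, seed=1):
    """Fraction of carcasses still contaminated after the chiller."""
    rng = random.Random(seed)
    contaminated = 0
    for _ in range(n_carcasses):
        dirty = rng.random() < pre_chill_frac           # pre-chill status
        if not dirty and rng.random() < p_contact:      # contact in chiller
            dirty = rng.random() < p_transfer           # transfer given contact
        if dirty and rng.random() < survival_frac:      # antimicrobial step
            contaminated += 1
    return contaminated / n_carcasses

# 50% pre-chill contamination; air chill has no antimicrobial effect,
# immersion with antimicrobial leaves a small surviving fraction (assumed 2%).
air = post_chill_rate(0.5, 0.5, 0.5, survival_frac=1.0)
immersion = post_chill_rate(0.5, 0.5, 0.5, survival_frac=0.02)
print(f"air: {air:.3f}, immersion: {immersion:.3f}")
```

As in the abstract, the immersion configuration drives post-chill contamination to near-negligible levels while air chilling leaves cross-contamination unchecked.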

P.127  Challenges Associated with Practical Environmental Restoration Risk Assessment and Management Decisions for Perfluoroalkyl Substances (PFASs). Phillips JK*, Anderson JK; TRC Solutions; US Air Force   JKPhillips@trcsolutions.com

Abstract: Perfluoroalkyl substances (PFASs) are emerging environmental contaminants with widespread applications in industry. PFASs do not have federal cleanup standards; however, some PFASs are environmentally persistent, bioaccumulate in living organisms, and have demonstrated toxicity in laboratory animals. Thus, despite the lack of federal regulations, it may be prudent to assess and potentially mitigate human and/or environmental exposures. A risk management decision process for the management of emerging contaminants such as PFASs at restoration sites is outlined. The identification of PFASs can significantly impact site objectives, schedule, cost and ongoing remedial activities, particularly without clear regulatory criteria. PFASs present unique challenges, including identifying potential sources related to PFAS release and characterizing PFAS-contaminated groundwater and/or soil. EPA’s Office of Water is conducting a reanalysis of PFAS toxicity information to revise its 2009 subchronic Provisional Health Advisories (PHAs). PHAs are non-enforceable guidelines that may or may not be utilized in state-led regulatory environmental cleanup decisions, leading to inconsistent national application. Within the US, several states have PFAS guidance levels; however, only Minnesota has promulgated standards. This poster presentation will introduce PFASs, their sources, and the available screening levels for data comparison. It will also highlight the management challenges and current technical options available for groundwater contaminated with PFASs. Until consistent and defensible toxicity values are developed and practical remedial technologies are available, it remains challenging to execute consistent risk management practices to protect human health and the environment from PFAS exposures.

P.128  Application of socio-economic analysis for restriction and authorization of chemicals in Korea. Yong Jin LEE*, Ji Yeon YANG, Geon Woo LEE, Dong Chun SHIN; Yonsei University   yjlee75@yuhs.ac

Abstract: Industrial chemicals are essential to modern society and bring benefits in the form of improved health, food supply, goods, general lifestyle and well-being. Some chemicals, if they enter the environment, can cause problems for human health and ecosystems, and it is important to identify the potential hazardous endpoints, quantify the risk that genuine harm will occur, and develop strategies to mitigate that risk. Socio-economic analysis (SEA) weighs the costs of any restrictions on the production and use of chemicals against the benefits to human health and the environment. Industry needs to understand the principles and practices of socio-economic analysis in order: (1) to carry out, where appropriate, an SEA as an argument for authorisation, and (2) to be able to contribute as stakeholders in socio-economic discussions with regulatory authorities when an SEA is used as a basis for justifying restrictions. In this study, we construct and calculate social benefits and costs using data on the value of a statistical life (VSL), cost of illness (COI), and the results of cost evaluation. The VSL and COI results are calculated from the adverse effects of carcinogens and non-carcinogens, data obtained from databases of healthcare expenditures and several illness-related costs, and questionnaires about income, expenditures, quality of life, and so on. VSL reflects the aggregation of individuals' willingness to pay for fatal risk reduction and therefore the economic value to society of reducing the statistical incidence of premature death in the population by one. The VSL estimated from cancer deaths is US$73.6 million (2013), obtained through contingent valuation methods. In this study, appropriate authorization and restriction of chemical substances will be suggested through evaluation of the costs together with the results calculated from the VSL and COI.
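The VSL definition used here (aggregate willingness to pay for a one-statistical-death reduction) reduces to a simple ratio; the WTP and risk-reduction figures below are invented for illustration and are not the study's estimates:

```python
# Hypothetical contingent-valuation result: mean WTP for a small risk reduction.
mean_wtp_usd = 50.0    # per person, per year
risk_reduction = 1e-5  # reduction in annual mortality risk purchased

# VSL = WTP / risk reduction: 100,000 such people collectively "buy" one
# statistical life, paying 100,000 x $50 = $5 million.
vsl = mean_wtp_usd / risk_reduction
print(f"VSL: ${vsl:,.0f}")
```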

P.129  Development of exposure guidelines for chronic health effects following acute exposures to TICs. Winkel DJ*, Hawkins BE, Roszell LE; BATTELLE MEMORIAL INSTITUTE, US ARMY PUBLIC HEALTH COMMAND   winkeld@battelle.org

Abstract: Joint Chemical, Biological, Radiological, and Nuclear doctrine (JP 3-11) requires military commanders to minimize total risk in operational planning and execution. Incorporation of Military Exposure Guidelines (MEGs) into risk estimates provides a mechanism to consider short- and long-term chemical exposure risks. However, current MEGs (and civilian guidelines) do not address chronic non-cancer health effects resulting from a single acute exposure. This gap is a source of concern for planners in the medical community, as these effects may have implications for long-term protection of exposed military or civilian populations. Challenges in establishing this type of guideline are small sample sizes, difficulties and inconsistencies in identifying long-term effects, and uncertainty in exposure concentration and duration. Given these challenges, this investigation describes an approach to develop such guidelines, using chlorine as an exemplar chemical. Chlorine was selected due to its use in attacks with improvised explosive devices, its presence in industry, and a prevalence of cases in the literature. Reports from chlorine exposures were reviewed, and data on exposure concentration, duration, and long-term health outcomes were compiled. Health outcomes ranged from the presence of physical symptoms (e.g., shortness of breath) to abnormal pulmonary function test results. Binomial distributions were used to address issues with a small sample population; uniform distributions were used to address incomplete exposure terms. The approach was applied to the data identified to date, and a probit analysis was used to generate a probit curve capturing the dose-response of long-term health effects due to acute chlorine exposure. The curve compares favorably to existing guidelines (both military and civilian) in that only severe exposures have the potential to cause chronic health effects. This approach is believed to be novel and may be applicable to other TICs with a limited data set.
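A probit dose-response of the kind described maps an exposure (concentration × duration) onto a probability of long-term effects through the normal CDF; the coefficients below are invented for illustration and are not the derived guideline values:

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def p_chronic_effect(conc_ppm, minutes, a=-8.0, b=1.5):
    """Probit model P = Phi(a + b * ln(C * t)); a and b are hypothetical."""
    return norm_cdf(a + b * math.log(conc_ppm * minutes))

# Mild vs. severe hypothetical chlorine exposures.
print(f"5 ppm for 10 min  : {p_chronic_effect(5, 10):.3f}")
print(f"400 ppm for 30 min: {p_chronic_effect(400, 30):.3f}")
```

With coefficients of this shape, only severe exposures produce an appreciable probability of chronic effects, consistent with the comparison the abstract reports.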

P.130  Understanding risk: Applying the CAUSE model in a content analysis of emergency management organizations’ coverage of Hurricane Sandy. Kowalek D*; Howard University   denna.kowalek@gmail.com

Abstract: Andersen and Spitzberg (2010) state that, by many measures, the world is becoming a more dangerous place. The fact that there are more people in more places, from more cultures, often at greater levels of density, means that when disasters occur, they have the potential to affect more people, and more organizations and institutions are responsible for managing such disasters. Examining emergency management organizations’ communicative messages during Hurricane Sandy in the fall of 2012 allows risk and crisis communicators to determine how hurricane information was disseminated, thus informing further precautionary and preparedness actions. Understanding how people view disasters and precautionary and preparedness actions will help generate more effective risk communication campaigns. This research utilizes Rowan et al.'s (2009) CAUSE model as a framework to understand how the National Weather Service (part of NOAA), the American Red Cross, and FEMA incorporated precautionary and preparedness actions into their coverage of Hurricane Sandy. This will be done to determine how emergency management organizations created understanding through their messages, which is crucial for comprehending precautionary and preparedness actions regarding disasters. A content analysis will be conducted of messages from the National Weather Service, the American Red Cross, and the Federal Emergency Management Agency (FEMA) to understand how messages addressed the U in the CAUSE model, understanding, while discussing precautionary and preparedness actions regarding Hurricane Sandy.

P.131  Communicating conservation with labels: Experiment on the effectiveness of using IUCN categories for advocacy. Song H*, Underhill JC, Schuldt JP; Song and Schuldt: Cornell University, Underhill: Johns Hopkins University   hs672@cornell.edu

Abstract: The Red List published by the International Union for Conservation of Nature and Natural Resources (IUCN) uses a categorical system with labels such as “Critically Endangered” or “Vulnerable” to communicate the level of threat faced by each species. This study examined whether messages using such categorization information would be as effective as messages using statistical information in communicating risk. In an online experiment, 169 participants were randomly assigned to read four descriptions about threatened species written with either categorization information (verbal group) or statistical information (statistical group). Readability measured by the Flesch-Kincaid Grade Level score was controlled for across conditions (e.g., “According to the IUCN, the Bigeye Tuna is classified as a Vulnerable (VU) species” vs. “According to the IUCN, the Bigeye Tuna population declined by 42% around the globe over the past 15 years”). Although there were no significant differences in perceived message clarity or behavioral intention, perceived risk of extinction was higher among the statistical group than the verbal group. Thus, professionals communicating with lay audiences about threatened species may wish to cite relevant statistics instead of, or along with, the Red List categories. A follow-up study featuring a more diverse participant sample and varying levels of statistical complexity is currently underway.

P.132  Treed Exponential Models for Evaluating Factors Affecting Nanomaterial Dose-Response and Setting Occupational Exposure Limits. Gernand JM*, Casman EA; Penn State University   jmgernand@psu.edu

Abstract: Existing research has demonstrated that some materials produce significantly increased toxic responses when particles are sized in the ultrafine or nano-range (<100 nm). Further investigation revealed that even small changes in the characteristics of these nanomaterials can result in divergent outcomes following exposure. Understanding which controllable properties of nanomaterials may be responsible for differences in toxicity is critical for appropriate risk assessment, setting regulatory policy on exposure limits for these materials, and understanding the biological mechanisms involved. Multiple regression techniques can provide some insight, but traditional linear models and even new machine learning models make assumptions of linear or constant dose-response relationships that violate the best current understanding in toxicology. This work presents a new modeling framework, treed exponential models, for evaluating the effects of changes in specific nanomaterial properties on dose-response. We demonstrate this modeling technique on a collection of published nanoparticle pulmonary toxicity experiments. This technique combines the benefits of machine learning regression tree (RT) models with the accumulated mechanistic knowledge contained in traditional dose-response exponential curve models. These models facilitate comparisons between different types of nanomaterials and other toxins, and provide quantitative guidance regarding when different types of nanomaterials should be considered distinct groups.
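A minimal sketch of the idea, assuming a single tree split on particle diameter and a saturating exponential dose-response fit in each leaf; the records and the 50 nm threshold are invented (here generated from known rate constants so the fit is checkable), and the grid-search fit stands in for a proper least-squares routine:

```python
import math

def exp_response(dose, k, emax=1.0):
    """Saturating exponential dose-response: E = Emax * (1 - exp(-k * dose))."""
    return emax * (1.0 - math.exp(-k * dose))

def fit_k(points, emax=1.0):
    """Least-squares grid search for the rate constant k in one tree leaf."""
    grid = [i * 0.01 for i in range(1, 201)]
    return min(grid, key=lambda k: sum((r - exp_response(d, k, emax)) ** 2
                                       for d, r in points))

# Synthetic records (diameter nm, dose, response), generated from k=0.5 for
# fine particles and k=0.1 for coarse particles.
data = [(20, d, exp_response(d, 0.5)) for d in (1, 2, 4, 8)] + \
       [(80, d, exp_response(d, 0.1)) for d in (1, 2, 4, 8)]

# One tree split on diameter < 50 nm; fit an exponential curve per leaf.
fine = [(d, r) for dia, d, r in data if dia < 50]
coarse = [(d, r) for dia, d, r in data if dia >= 50]
print(f"k_fine = {fit_k(fine):.2f}, k_coarse = {fit_k(coarse):.2f}")
```

The recovered leaf-specific rate constants are what make the fitted curves comparable across nanomaterial groups.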

P.133  Quantitative assessment of in vivo toxicological interactions from criteria pollutant mixtures containing oxides of nitrogen. Datko-Williams L*, Young B, Wilkie A, Madden M, Dubois JJ, Wichers Stanek L, Johns D, Oesterling Owens B; U.S. Environmental Protection Agency; U.S. Centers for Disease Control and Prevention   datko-williams.laura@epa.gov

Abstract: The U.S. EPA sets National Ambient Air Quality Standards (NAAQS) for individual criteria air pollutants by evaluating sources, ambient concentrations, and health impacts. The Agency recognizes that air pollution exists as a complex mixture; however, biological interactions between mixture components are not well characterized. We reviewed literature cited in EPA’s Integrated Science Assessments and Air Quality Criteria Documents to identify studies of criteria pollutant mixtures and in vivo toxicological interactions. The current analysis considered mixtures containing oxides of nitrogen (NOX) and all health endpoints, although most studies focused on mixtures of NOX + O3 (ozone) and respiratory system effects. Studies with complete response data (mean, variance, n observations) for each treatment group were included in a quantitative analysis of the relationship between combined effects and individual component effects in the mixture. For each endpoint, the interaction was categorized as additive, greater than additive, or less than additive. Additivity was defined as the absence of a statistical difference between the sum of responses to individual pollutants and the response to the mixture of pollutants. Departures from additivity were tested using analysis of variance (H0: combined effects = sum of individual effects, p = 0.05). At least one endpoint deviated from additivity in all animal studies (n=17), while the majority of endpoints in the human studies (n=9) were additive. When studies were compared, no pattern among endpoints or exposure conditions emerged. Thus, this analysis suggests that deviations from additivity exist in health impacts of criteria air pollutant mixtures, although most endpoints and mixtures were not different from additive. The views expressed in this abstract are those of the authors and do not necessarily represent the views or policies of the U.S. EPA.
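The additivity classification described, comparing the mixture response with the sum of single-pollutant responses from summary statistics (mean, variance, n), can be sketched as follows; the group statistics are invented, and a z-approximation stands in for the study's analysis of variance:

```python
import math

def additivity_z(mix, a, b):
    """z-statistic for H0: mean(mixture) = mean(A) + mean(B).
    Each group is (mean, variance, n)."""
    (m_mix, v_mix, n_mix), (m_a, v_a, n_a), (m_b, v_b, n_b) = mix, a, b
    diff = m_mix - (m_a + m_b)
    se = math.sqrt(v_mix / n_mix + v_a / n_a + v_b / n_b)
    return diff / se

def classify(z, crit=1.96):
    if z > crit:
        return "greater than additive"
    if z < -crit:
        return "less than additive"
    return "additive"

# Hypothetical endpoint summaries (mean, variance, n) per treatment group.
no2_alone = (10.0, 4.0, 10)
o3_alone = (5.0, 4.0, 10)
print(classify(additivity_z((15.2, 4.0, 10), no2_alone, o3_alone)))
print(classify(additivity_z((25.0, 4.0, 10), no2_alone, o3_alone)))
```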

P.134  Trust in a wide variety of risk managers after a catastrophic disaster. Nakayachi K*; Doshisha University   nakayachi@mail.doshisha.ac.jp

Abstract: The results of an opinion survey suggest that the public’s trust in technological and scientific experts deteriorated overall after the 2011 Tohoku earthquake in Japan. It is not surprising that trust in risk managers responsible for nuclear technology declined after the Level 7 accident at the Fukushima Daiichi nuclear power plant. Similarly, public confidence in experts responsible for disaster protection against earthquakes and tsunamis also decreased as a result of the disaster's death toll of over twenty thousand. Does the public, however, distrust experts who were not directly involved in predicting and responding to these disasters? The opinion survey mentioned above used the general terms “scientists” and “technical experts” as the job descriptions to be evaluated by the respondents. An expert, however, is a specialist with competency in a particular area. Therefore, if the public’s trust in experts had deteriorated overall, risk managers in different areas (e.g., BSE, new infectious diseases, agrochemicals, etc.) would also have been mistrusted more than before the Tohoku earthquake. This research empirically examines the fluctuation in public trust of a wide variety of risk managers after the 2011 Tohoku earthquake by comparing survey data collected before and after the earthquake. The results revealed that, of the fifty-one risk management systems, only two (earthquakes and nuclear power) showed decreased levels of trust. There was no significant difference in trust scores for thirty of the hazards unrelated to earthquakes and nuclear power. Interestingly, for the other nineteen hazards, trust actually increased without any evidence that risk management had improved. The fact that public trust in risk managers unrelated to earthquakes and nuclear power increased after the earthquake is well explained by the finite-pool-of-worry hypothesis and supported by findings of Nakayachi et al. (in press).

P.135  Diminishing risks of soil pollution in public spaces: a proposal for remediation. Valentini M, Curra C, DEMICHELIS SO*, DEMICHELIS SANDRA; ENVIRONMENT LABORATORY - DDPY - UNLA   sandrademichelis@yahoo.com

Abstract: With the purpose of developing a set of proactive guidelines, this paper carries out an environmental urban diagnosis in order to remediate an area along the Roca railway line in the town of Remedios de Escalada, Lanús. The main objective is to intervene in the territory to diminish the risks produced by urban soil contamination with oil, heavy metals and hydrocarbons, which results from the deposition and accumulation of cars abandoned in an underutilized area left by the breakdown of the space. The problem is compounded by the absence of planning and strategic management and by neglect of the cadastral situation in the area, among other factors, with a consequent reduction in soil and environmental quality both in the affected areas and in nearby places. The overall purpose is to strengthen the relationship of individuals with the territory where they live and to contribute to the development, presentation and subsequent conversion of the space, thereby promoting a better quality of life for its inhabitants. A series of strategic guidelines has been designed and ordered according to the needs and relevant problems of the site, proposing: removing and disposing of the abandoned cars, recycling tires, recycling/reusing scrap metal and metals in general, remediating soil at the contaminated points, establishing a groundwater monitoring program, and creating public spaces once the vehicles and waste have been removed and remediation has taken place. We understand that transforming conflict areas and restoring the importance of the environment is a way to achieve balance, together with the implementation of best practices that contribute to improving quality of life.

P.136  Bad decisions increase health risks: reopening of an abandoned asphalt plant, a case study. BRACCA M, MONZON A, DEMICHELIS SO*; ENVIRONMENT LABORATORY - DDPYT - UNLA   sandrademichelis@yahoo.com

Abstract: The overall goal of this work is to diminish risk by planning the recovery of use and the habitat remediation of an abandoned asphalt plant (now a micro-dump) belonging to the municipality of Lanús. The site is surrounded by housing, and its obsolete equipment and accumulated waste constitute an environmental liability that requires early intervention. After an EIS, we recommended the elimination of the landfill, neutralization of obsolete material, relocation of the asphalt plant, and soil remediation. The situation analysis concludes that the ground is not suitable for the development and operation of the plant, and the existence of waste and debris justifies intervention. The government, however, proposed not to move the plant but to reactivate it. The negative impacts on human health that would occur upon reopening are associated with the emission of asphalt fumes, with immediate consequences for those directly exposed and an increased risk of various cancers. As for environmental and health damage, the existing waste causes pest invasion, air pollution, and the generation of leachates that pollute groundwater. The transfer proposal showed that the investment would be much more expensive than reopening; however, this comparison does not include the health costs reopening would generate for the government, since public health is covered by the state. Preventive measures are therefore proposed: relocating the plant is better than reopening it, and clearing the dump and the property and remediating the soil by gas extraction and aeration will diminish damages. At present, the authorities prefer to reopen the plant, but they have not performed an EIS. If the eradication proposal fails and the plant is reopened, a program that includes surveillance and mitigation must be developed.

P.137  Kinetics and micromechanics associated with crack growth in brittle materials. Djouder S, Chabaat M*, Touati M; Built Environment Research Laboratory, Dept of Structures and Materials, Civil Engineering Faculty, University of Sciences and Technology Houari Boumediene   mchabaat@yahoo.com

Abstract: In this study, kinetics and micromechanics associated with crack growth in brittle materials are considered. It is known that crack growth characteristics contain information on the material strength and fracture mechanisms, and there are sufficient experimental data evidencing that in most cases a growing crack is surrounded by a severely damaged zone (DZ) which often precedes the crack itself. During its propagation, the DZ is characterized by a few degrees of freedom (elementary movements) such as translation, rotation, isotropic expansion, and distortion. On the basis of a stress field distribution obtained by the use of a Semi-Empirical Approach (SEA), which relies on Green's functions, the driving forces corresponding to the mentioned degrees of freedom are formulated within the framework of the plane problem of elastostatics. A number of theoretical models have been proposed for the description of the stress field and kinetics of a damage zone [1, 2]. The traditional one identifies the DZ as a plastic zone and uses the well-developed techniques of plasticity theory to determine its size, shape, energy release rates, etc. According to recent experimental results, some damage patterns do not obey any model of plasticity, and the shape of the DZ can be difficult to model. A plasticity criterion is therefore not adequate for damage characterization. However, the elastoplastic solution is currently employed due to the lack of other approaches. Throughout this study, the SEA is proposed for evaluating the stress field and the different energy release rates. This approach is based on the representation of displacement discontinuities by means of Green's function theory [3, 4]. The latter has so far been used in a purely theoretical context. Herein, we suggest a more realistic model (arbitrary orientations of discontinuities rather than rectilinear ones) for which the result can be obtained using experimental data, thus avoiding the difficulties of analytical solutions.

P.138  Keeping track of nanotechnology in your everyday life: The Nanotechnology Consumer Products Inventory 2.0. Kuiken T*, Quadros M; Woodrow Wilson Center, Virginia Tech   todd.kuiken@wilsoncenter.org

Abstract: The Woodrow Wilson International Center for Scholars and the Project on Emerging Nanotechnologies created the Nanotechnology Consumer Product Inventory (CPI) in 2005. This first-of-its-kind inventory tracks consumer products claiming to contain nanomaterials and has become one of the most frequently cited resources showcasing the widespread applications of nanotechnology. The CPI now contains 1,628 consumer products that have been introduced to the market since 2005, representing a 24 percent increase since the last update in 2010. In the years since its launch, the CPI has been criticized for its lack of scientific data. To address some of these concerns, this update adds qualitative and quantitative descriptors, such as size, concentration, and potential exposure routes for the nanomaterials contained in consumer products. It also includes published scientific data related to those products, where available, and adds a metric to assess the reliability of the data in each entry. In addition, the newly re-launched inventory seeks to address scientific uncertainty with contributions from those involved with nanomaterial production, use, and analysis. This is the first major overhaul of the inventory since it was launched in 2005. The re-launched inventory seeks to "crowd source" expertise in an effort to create an inventory with more accurate information on consumer products. Registered users are encouraged to submit relevant data pertaining to nanoparticle function, location, and properties; potential exposure pathways; toxicity; and lifecycle assessment, as well as add product data and information on new products. The Virginia Tech Center for Sustainable Nanotechnology worked with the Wilson Center to redevelop the inventory to improve the reliability, functionality, and scientific credibility of this database. Virginia Tech's Institute for Critical Technology and Applied Science provided funding for the effort.

P.139  Review of Health Effects and Toxicological Interactions of Air Pollutant Mixtures Containing Oxides of Nitrogen. Madden M*, Young B, Datko-Williams L, Wilkie A, Dubois JJ, Stanek LW, Johns D, Owens EO; ORISE, U.S. EPA-ORD, U.S. CDC-NIOSH   madden.meagan@epa.gov

Abstract: The U.S. EPA sets National Ambient Air Quality Standards (NAAQS) to protect against health effects from criteria air pollutants with the recognition that human populations are exposed to complex air pollutant mixtures. Exposure to these mixtures may differentially affect human health relative to single pollutants as a result of biological interactions between constituents of the mixture. If the effects of a mixture are equal to the sum of the effects of individual components, the effects are additive and the interaction effect is zero; additivity is often assumed as the null hypothesis for interaction effects. Alternatively, synergism (effects greater than additive) and antagonism (effects less than additive) are possible interactions, although the definitions and usage of these terms are not consistent across studies. To understand the potential biological interactions of exposure to air pollutant mixtures, we reviewed toxicological evidence (animal and controlled human exposure) from mixture studies cited in EPA’s Integrated Science Assessments (ISAs) and Air Quality Criteria Documents (AQCDs). We used quantitative and qualitative methods to determine the effects of pollutant mixtures on all health-related endpoints evaluated in these studies, specifically focusing on mixtures containing oxides of nitrogen. Many studies could not be analyzed quantitatively using our statistical model due to incomplete reporting of data. Instead, studies with incomplete response data were evaluated qualitatively for evidence of interaction effects and relevant vocabulary such as “additivity,” “synergism,” and “antagonism.” Although a number of studies reported deviations from additivity, there was no discernible pattern to the relationship between similar exposure scenarios and the direction or magnitude of the biological response. The views expressed in this abstract are those of the authors and do not necessarily represent the views or policies of the U.S. EPA.
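The additivity null hypothesis described in this abstract can be expressed as a simple computation. The sketch below (in Python, with hypothetical effect values) classifies a two-pollutant mixture response as additive, synergistic, or antagonistic under the definitions given above; it is an illustration of the concept, not the study's statistical model.

```python
# Illustrative sketch: classifying a two-pollutant interaction under the
# additivity null hypothesis. All effect values below are hypothetical.

def classify_interaction(effect_a, effect_b, effect_mixture, tol=1e-9):
    """Compare a mixture's effect to the sum of single-pollutant effects.

    Interaction effect = mixture effect - (effect_a + effect_b).
    Zero -> additive; positive -> synergism; negative -> antagonism.
    """
    interaction = effect_mixture - (effect_a + effect_b)
    if abs(interaction) <= tol:
        return "additive"
    return "synergism" if interaction > 0 else "antagonism"

# Hypothetical responses (e.g., % change in some health-related endpoint):
print(classify_interaction(2.0, 3.0, 5.0))  # additive: 5.0 == 2.0 + 3.0
print(classify_interaction(2.0, 3.0, 8.5))  # synergism: 8.5 > 5.0
print(classify_interaction(2.0, 3.0, 3.5))  # antagonism: 3.5 < 5.0
```

As the abstract notes, many published studies could not be analyzed this way because the component responses needed for the right-hand side of the comparison were not fully reported.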

P.140  Public risk perception towards urban air pollution. Zhu KJ*, Xu JH; Peking University   cocojne@pku.edu.cn

Abstract: The notoriously hazy days prevailing in China in recent winters triggered a public crisis over air quality, yet how the public perceives air pollution is meagerly studied. Understanding the public's perception of risks is a prerequisite for predicting public responses and designing risk communication strategies and policy interventions. This empirical study explores how the public perceive and react to risks posed by urban air pollution in China, and what factors affect their reactions, through in-depth one-on-one interviews and qualitative analysis. A convenience sample of 43 diverse respondents in Beijing was recruited through snowball sampling. Qualitative analysis of the open-ended interviews based on grounded theory revealed interesting findings. Despite some confusion between the concept of air pollution and perceptions of temperature and climate, most of the interviewees are aware of the severe air pollution problem in Beijing. However, concerns about air pollution and its adverse health effects are attenuated by perceived low risks and individual powerlessness. The participants are also distracted by immediate priorities, e.g., social and economic problems. Many participants recognize that each individual contributes to the problem, but express great reluctance to change their behaviors. Major barriers to individual engagement in mitigation strategies include a lack of knowledge, a shifting of responsibility to others (the next generation, government, industries, etc.), inaction of government, perceived individual powerlessness, and reluctance to sacrifice comfort and convenience. These misunderstandings and barriers are important focuses for future communication strategies and policy interventions if the public is to be fully involved in mitigating air pollution. It is also suggested that future research test the prevalence of these phenomena and explore the potential reasons.

P.141  Analysis of U.S. soil lead (Pb) studies from 1970-2012. Wilkie A*, Datko-Williams L, Richmond-Bryant J; ORISE; U.S. EPA   wilkie.adrien@epa.gov

Abstract: Although lead (Pb) emissions to the air have substantially decreased in the United States since the phase-out of leaded gasoline was completed in 1995, amounts of Pb in some soils remain elevated. Lead concentrations in residential and recreational soils are of concern because health effects have been associated with Pb exposure. Elevated soil Pb is especially harmful to young children due to their higher likelihood of soil ingestion. In this study, U.S. soil Pb data published from 1970 through 2012 were compiled and analyzed to reveal spatial and/or temporal soil Pb trends in the U.S. over the past 40 years. A total of 84 soil Pb studies across 62 U.S. cities were evaluated. Median soil Pb values from the studies were analyzed with respect to year of sampling, residential location type (e.g., urban, suburban), and population density. In aggregate, there was no statistically significant correlation between year and median soil Pb; however, within single cities, soil Pb generally declined over time. Our analysis shows that soil Pb quantities in city centers were generally highest and declined towards the suburbs and exurbs. In addition, there was a statistically significant, positive relationship between median soil Pb and population density. In general, the trends examined here align with previously reported conclusions that soil Pb levels are higher in larger urban areas and that Pb tends to remain in soil for long periods of time. The views expressed in this abstract are those of the authors and do not necessarily represent the views or policies of the U.S. Environmental Protection Agency.
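As an illustration of the kind of trend test such an analysis involves, the sketch below correlates population density with median soil Pb across cities. All data values are invented for illustration; the study's actual data are not reproduced here, and the choice of a log-log Pearson correlation is an assumption, not necessarily the authors' method.

```python
# Hypothetical sketch of a city-level density vs. soil-Pb trend test.
import numpy as np

# (population density per sq. mile, median soil Pb in ppm) - invented values
density = np.array([1500, 3200, 5400, 9800, 15000], dtype=float)
soil_pb = np.array([80, 150, 210, 480, 900], dtype=float)

# Pearson correlation on log-transformed values, since both variables
# span orders of magnitude across cities
r = np.corrcoef(np.log(density), np.log(soil_pb))[0, 1]
print(f"log-log correlation r = {r:.2f}")  # positive, as in the reported trend
```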

P.142  Model Validation in Disaster Relief Partner Selection and Maintenance. Coles JB*, Zhuang J; University at Buffalo   jbcoles@buffalo.edu

Abstract: In this research we study how optimization, simulation, and game theory models could help agencies make better decisions after a disaster. To better understand the behavioral dynamics of interagency interaction, we interviewed over 60 agencies about their network behavior using an ego-centric approach, and used these data to propose a set of experiments examining agency decision making in disaster relief operations. The full process of network development is complex, but in this poster we focus on the process of partner selection in an environment that is both cooperative and competitive. The partner selection model proposed here was developed from interviews conducted with agencies involved in disaster relief operations in response to the 2010 earthquake in Haiti, the 2011 tornado in Joplin, Missouri, and Hurricane Sandy along the east coast of the United States. The model will be initially validated using student data to provide a granular estimate of how interagency dynamics work. Once the initial validation is complete, we will conduct a secondary validation process with decision makers working in disaster relief agencies.

P.143  A Probabilistic Model of U.S. Intra-Day Tap Water Exposure and Its Application in PBPK Modeling. Schlosser PM*, Isaacs K, Sasso AF, Gift JS; U.S. Environmental Protection Agency   schlosser.paul@epa.gov

Abstract: While the previously developed SHEDS model (http://www.epa.gov/heasd/research/sheds.html) provides probabilistic sampling for total daily water consumption, it does not provide information on the intra-day distribution of that ingestion – the fraction of the total consumed in any hour of the day. For chemicals such as methanol or chloroform, which are rapidly absorbed, the peak blood concentration (Cmax) depends strongly on this distribution and is a determinant for key toxic effects. We analyzed 2003-2010 NHANES dietary recall data for hourly ingestion, based on total moisture (g) in each food or beverage. (While the recall diaries allowed recording of events in 15-min intervals, the data appeared biased toward on-the-hour records, so consumption was summed into 1-h intervals.) Besides directly consumed tap water, a subset of NHANES foods and beverages assumed to be prepared with tap water was identified, and each was assigned an assumed fraction of total moisture attributed to tap water (e.g., 50% of canned soup prepared with water). Maximum-likelihood log-normal intake distributions were then fit for each 1-h interval. An assumed ingestion rate distribution (geometric mean, GM = 1 L/h) was applied, allowing consumption to occur in less than an hour. Eighteen percent of the diaries included "extended consumption" occasions (vs. "Breakfast," "Lunch," etc.), assumed to be drinking from a large water bottle, coffee mug, etc., over longer periods; these observations were analyzed separately from the hourly distributions and simulated with appropriate probability. Since the lengths of the extended consumption events were not reported, a slower rate distribution was assumed (GM = 0.2 L/h), with a minimum time of 1 h. With simulated distributions based on the NHANES diaries, most individuals appear to consume tap water less than 3 h each day. Use of these patterns as input to a PBPK model yields mean Cmax predictions 2x or more higher than those from an idealized pattern of 6 ingestion events per day.
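The per-hour maximum-likelihood lognormal fit described above can be sketched as follows. The intake values, the GSD of the ingestion-rate distribution, and the unit conversions are illustrative assumptions, not the study's actual inputs; only the GM = 1 L/h rate comes from the abstract.

```python
# Illustrative sketch of fitting a lognormal to one hour's tap-water intakes
# and simulating a drinking duration. For a lognormal, the maximum-likelihood
# fit is simply the mean and standard deviation of the log-transformed data.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical nonzero tap-water intakes (g) recorded in one 1-h interval
intakes_g = np.array([120.0, 250.0, 90.0, 400.0, 180.0, 310.0])

log_x = np.log(intakes_g)
mu_hat, sigma_hat = log_x.mean(), log_x.std()   # MLE uses 1/n, not 1/(n-1)
geometric_mean = np.exp(mu_hat)

# Simulate one hour's intake, then convert it to a drinking duration using an
# assumed ingestion-rate distribution with GM = 1 L/h (per the abstract);
# the GSD of the rate distribution is an assumption made here.
intake_L = rng.lognormal(mu_hat, sigma_hat) / 1000.0
rate_L_per_h = rng.lognormal(np.log(1.0), 0.5)
duration_h = min(intake_L / rate_L_per_h, 1.0)  # consumption fits in the hour
print(f"GM intake = {geometric_mean:.0f} g, simulated duration = {duration_h:.2f} h")
```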

P.144  A tool to facilitate the incorporation of metagenomic data into environmental microbial decision-making and risk analysis. Smith MN, Port JA, Cullen AC, Wallace JC, Faustman EM*; University of Washington   faustman@u.washington.edu

Abstract: Advances in microbial genomics have opened up new opportunities for translation to public health risk research. Traditionally, methods for studying changes in the population dynamics of microbial communities have required cell culture and have focused on single organisms. Some microbes cannot be cultured with current methods or are poorly characterized, leading to incomplete assessment of microbial environmental health. The development of metagenomics, in which DNA is extracted directly from environmental samples and sequenced, now allows for the characterization of the taxonomic and functional potential of microbial communities and provides an expanded tool for microbial environmental health monitoring. However, new bioinformatics and analytical challenges have arisen in the interpretation and translation of metagenomic data for decision-making and risk management. We provide a tool for the translation of metagenomic data into environmental health monitoring information relevant to public health decision-making and risk assessments. This framework allows functional data from Clusters of Orthologous Groups of proteins (COGs) to be interpreted within the context of public health. Using metagenomic data from 107 published biomes, we performed a functional analysis to identify COGs that are reflective of potential human impacts. Biomes with higher known potential human impact, such as the wastewater treatment plant (WWTP), had a greater relative abundance of public-health-relevant COGs. Overall, we demonstrate that this is a valuable tool for distinguishing between environments with differing levels of human impact in the public health context. 
This project is supported by the NOAA-funded Pacific Northwest Consortium for Pre- and Post-doctoral Traineeships in Oceans and Human Health and the UW Pacific Northwest Center for Human Health and Ocean Studies (NIEHS: P50 ESO12762 and NSF: OCE-0434087), NOAA (UCAR S08-67883) and the Center for Ecogenetics and Environmental Health (5 P30 ES007033).

P.145  Decision Aiding for Extreme Event Evacuation. Chen NC*, Yates JY; Texas A&M University   nnchen@tamu.edu

Abstract: Evacuating a large population from an impending extreme event is fraught with complexity, uncertainty, and risk. Evacuees have to make decisions on route planning and point of destination, while emergency managers need to ensure that the appropriate personnel and infrastructure are available and capable of facilitating the evacuation. During evacuations, individual evacuees exhibit an increasing desire to communicate with family, friends, and local/state/federal authorities in situ. We develop an agent-based simulation model to examine the impact of communication within social connections and emergency management during regional evacuations. We show how improved communication among evacuees impacts the evacuation process, and we demonstrate how this knowledge can lead to improved public evacuation management. Furthermore, to better enable evacuee communication during an event, we formulate a time-dependent discrete optimization model to determine the location of telecommunications equipment as well as the assignment of evacuees to equipment.

P.146  Sensitivity of regulatory ozone risk assessment to improved exposure and response models. Ollison W*, Capel J, Johnson T; 1 - American Petroleum Institute, 1220 L Street, NW, Washington, DC 20005; 2 - Consultant, 1005 Demerius Street, Durham, NC 27701; 3 - TRJ Environmental, Inc., 713 Shadylawn Road, Chapel Hill, NC 27914   ollisonw@api.org

Abstract: We evaluate the sensitivity of EPA’s current ozone (O3) exposure model (APEX) to (1) alternative pulmonary function response models, (2) attainment AQ rollback approaches, (3) altitude effects, and (4) newly measured O3 penetration/deposition rates and microenvironmental (ME) factors, corrected for O3 measurement error. Results are provided for Denver air quality (AQ) scenarios representing 2006 “as is” conditions and the attainment of the current ozone NAAQS. We test recently published pulmonary function models that incorporate realistic O3 response thresholds and subject response variability proportional to the level of response. A CAMx model is used to adjust 2006 Denver AQ to simulate NAAQS attainment conditions. The modeled rollback projections account for NOx control-related increases in urban and background O3 levels from reduced NO-O3 titration that are not addressed by EPA’s quadratic rollback approach. Inhaled O3 mass is adjusted to account for altitude acclimation among Denver residents. Impacts of newly measured indoor O3 penetration-deposition rates on estimated responses are compared to projections using current APEX indoor mass-balance model assumptions. APEX ME factors are also adjusted according to recent field measurements made using new interference-free O3 photometers. Cumulative impacts of these updated components in the APEX exposure analysis are tabulated and compared to those of the current APEX model.

P.147  EVALUATING LONG TERM INACTIVATION OF BACILLUS SPORES ON COMMON SURFACES. Enger KS, Murali B, Birdsell D, Gurian P, Wagner DM, Mitchell J*; Michigan State University   jade@msu.edu

Abstract: Bacillus spores resist inactivation, but the extent of their persistence on common surfaces is unclear. This work addresses knowledge gaps regarding biothreat agents in the environment in order to reduce uncertainty in related risk assessment models. Studies were conducted to investigate the long-term inactivation of B. anthracis and three commonly used surrogate organisms: B. cereus, B. atrophaeus, and B. thuringiensis. Models of inactivation kinetics were subsequently evaluated for fit. Spores were applied to 25 cm2 rectangles of three materials: laminate countertop, stainless steel, and polystyrene Petri dish. They were held at 22ºC and 50% relative humidity. Viable spores were measured at 1, 30, 90, 196, and 304 days by swabbing rectangles and eluting swabs in phosphate buffered saline. After serial dilution, colonies were grown and counted. R (cran.r-project.org) was used to fit persistence models to the data: exponential, logistic, Juneja and Marks 1 (JM1), Juneja and Marks 2 (JM2), Gompertz, Weibull, lognormal, gamma, biphasic spline, and double exponential. B. thuringiensis counts increased at 24 hours on all materials, with a subsequent decline. Several experiments showed evidence of a U shape: a decrease followed by an increase in spore counts (B. anthracis & B. atrophaeus on laminate; B. anthracis & B. cereus on steel). Spores on polystyrene showed little inactivation. The maximum inactivation was 56% (B. atrophaeus spores on steel at 196 days). Fitting models to the data from laminate and steel indicated that the gamma, lognormal, JM1, and JM2 models fit the data better than the other models (by lowest BIC, or within 2 units of the lowest BIC). Models fitted to data from the polystyrene material were uninformative because little inactivation was observed. Spore inactivation was not loglinear. U-shaped inactivation curves might be explained by lower adhesion to the surface as the spores age, enhancing recovery.
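The model-fitting and BIC-comparison step described above can be sketched minimally as follows (here in Python rather than the R used by the authors, with invented survival fractions and only two of the ten candidate models, so the code illustrates the procedure rather than reproducing the study).

```python
# Sketch: fit exponential and Weibull survival models to hypothetical spore
# data and rank them by BIC (lower is better), as in the abstract's approach.
import numpy as np
from scipy.optimize import curve_fit

days = np.array([1, 30, 90, 196, 304], dtype=float)
surviving_fraction = np.array([1.00, 0.95, 0.80, 0.55, 0.50])  # hypothetical

def exponential(t, k):
    return np.exp(-k * t)

def weibull(t, k, n):
    return np.exp(-((k * t) ** n))

def bic(model, popt, t, y):
    # Gaussian-residual BIC: n*ln(RSS/n) + p*ln(n)
    rss = np.sum((y - model(t, *popt)) ** 2)
    return len(y) * np.log(rss / len(y)) + len(popt) * np.log(len(y))

popt_exp, _ = curve_fit(exponential, days, surviving_fraction, p0=[0.01])
popt_wei, _ = curve_fit(weibull, days, surviving_fraction, p0=[0.01, 1.0],
                        bounds=([1e-6, 0.1], [1.0, 5.0]))

scores = {"exponential": bic(exponential, popt_exp, days, surviving_fraction),
          "Weibull": bic(weibull, popt_wei, days, surviving_fraction)}
print(min(scores, key=scores.get), "has the lower (better) BIC")
```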

P.148  Robust Approval Process in the Face of Strategic Adversaries and Normal Applicants. Zhuang J, Wang X*, Song C, Xu J; University at Buffalo, SUNY   xwang54@buffalo.edu

Abstract: The objective of this project is to explore a new class of decision models to provide structural insights for robust screening when dealing with adaptive applicants and incomplete information. This research is motivated by public concern over balancing congestion and safety in security screening. Such screening has been used to identify and deter potential threats (e.g., terrorists, attackers, smugglers, spies) among normal applicants wishing to enter an organization, location, or facility. In-depth screening could reduce the risk of being attacked. However, it may also create delays and deter normal applicants, which decreases the welfare of both the approver (authority, manager, screener) and the normal applicants. This research will consider the factors of security, congestion, and equity, and the strategic and non-strategic responses of various applicant types. In particular, this research studies the applicants' strategies of applying, reneging, learning, and deceiving. It also studies the approver's strategies of screening, dynamic service rates, multiple servers and priority processing, multi-layer screening, and secrecy and deception. If successful, this research will lead to new frameworks that decision makers can use for screening diverse groups of strategic applicants. These new frameworks have the potential to reduce costs, avoid unnecessary waiting and inconvenience, and improve the effectiveness and efficiency of approval processes. Potential applications of this research include immigration systems, job market background checks, and airport/container/border controls. Its relevance is illustrated by the recent national debate on selective "pat-downs" and "advanced imaging" screening, and the associated changing travel patterns. This research will engage many graduate, undergraduate, and high school students, including those from under-represented groups. 
The results of this research will be disseminated broadly to local, national and international communities.

P.149  Modeling and Validating Multi-period, Multi-type, and Multi-target Attacker-defender Games. Zhang J*, Zhuang J; University at Buffalo, SUNY   jzhang42@buffalo.edu

Abstract: In this research, we present a novel class of multi-period and multi-target attacker-defender games in which the attackers may have different motivations and multiple attack options, including assassination, armed assault, bombing/explosion, facility/infrastructure attack, hijacking, and hostage taking. We use optimization methods such as minimax SQP, quasi-Newton, and line-search algorithms to solve the problem. Utilizing historical terrorism event data (e.g., the Global Terrorism Database) and defensive investment data (e.g., Urban Areas Security Initiative programs), we numerically illustrate the model results. Our results show that the defender's budget and the defense cost effectiveness have a huge impact on the optimal resource allocation, and that when the budget is tight and defense effectiveness is low, most of the defensive resources would be allocated to New York City. Our research provides some new insights on modeling and validating resource allocation models involving adaptive adversaries.

P.150  Incentives in Government Provision of Emergency Preparedness and Disaster Relief. Guan P*, Shan X, He F, Zhuang J; University at Buffalo, SUNY   peiqiugu@buffalo.edu

Abstract: The goal of this project is to help provide a solid foundation for motivating more comprehensive ways to assess the risk tradeoffs in multi-stakeholder disaster management and resource allocation. This will be accomplished by taking advantage of theoretical decision frameworks such as game theory and prospect theory, and will use robust optimization techniques to address the uncertainty that surrounds disasters. This project will address under-studied questions such as: (a) How should governments and private sectors balance between the funding for emergency preparedness and the funding for disaster relief, when they are uncertain about the disaster location and consequences? (b) How should governments distribute incentives to reduce vulnerability to disasters? and (c) How should decision makers balance equity, efficiency, and effectiveness when preparing for and responding to disasters? As a 2012 National Research Council report states, "there is currently no comprehensive framework to guide private-public collaboration focused on disaster preparedness, response, and recovery." If successful, this project will help to address this issue by providing insights, practical guidelines, and decision support tools to help save lives and property in the face of disasters. This research will engage many graduate, undergraduate, and high school students, including those from under-represented groups. The models, results, and insight gained will be shared with international, federal, and local representatives through seminars, conferences, publication, media coverage, and websites.

P.151  Modeling attacker-defender games with risk preferences. Zhuang J, Fu J*, Jose VRR; University at Buffalo, SUNY   jfu3@buffalo.edu

Abstract: Traditional models of attacker-defender games generally assume that players are risk-neutral; i.e., they choose strategies that maximize their expected payoff or benefit. In practice, decision makers could be either risk seeking or risk averse, which has not been extensively studied in the attacker-defender game literature. For example, terrorists could have risk preferences (Yang et al. 2013). Governments could be risk averse: Standish (2002) argued that Western governments tend to be extremely risk-averse and constantly introduce disruptive risk-averse policies in many areas such as air travel security; Stewart et al. (2011) stated that the amount of government spending for "many homeland security measures would fail a cost-benefit analysis using standard expected value methods of analysis [suggesting] not surprisingly that policy makers within the US Government /DHS are risk-averse." Unfortunately, the growing attacker-defender game literature has not rigorously studied the players' risk preferences. To fill this gap, the research objective of this proposal is to study how the incorporation of alternative models of behavior (e.g., expected utility and cumulative prospect theories) affects the equilibrium behavior of players in attacker-defender games. Though Zhuang and Bier (2007, Operations Research) asserted that the more risk-averse a defender is, the more likely she is to defend, and that the less risk-averse an attacker is, the more likely he is to attack, our preliminary results show that this behavior may not hold when other factors such as initial wealth, risk aversion/risk seekingness, loss aversion, and optimism are incorporated into the model. If successful, this project can enrich our understanding of how players in attacker-defender games behave when a rich set of empirically recognized aspects of human behavior and decision making is introduced into analytic models, and help transform a large group of existing models that ignore player risk preferences.

P.152  First Conference on Validating Models of Adversary Behavior. Zhuang J, Bier V, Zhang J*; University at Buffalo, SUNY; University of Wisconsin-Madison   Jingzhang42@buffalo.edu

Abstract: Hundreds of billions of dollars have been spent on homeland security since September 11, 2001, and numerous models have been developed to study the strategic interactions between defenders and adversaries (e.g., attackers or terrorists). Unfortunately, few if any models have yet been validated using empirical data, limiting the application of those models in practice. Supported by the U.S. Department of Homeland Security (DHS) through the National Consortium for the Study of Terrorism and Responses to Terrorism (START) and the National Center for Risk and Economic Analysis of Terrorism Events (CREATE), Drs. Jun Zhuang and Vicki Bier organized a conference on validating models of adversary behavior in Buffalo/Niagara Falls, NY, on June 23-26, 2013. The conference was intended to bridge theoretical and empirical research on adversarial modeling and to facilitate transitioning the best existing models of adversary behavior into practice by assessing and demonstrating their validity and applicability to real-world problems. A secondary goal of the conference was to encourage synergy and communication between risk analysts, statisticians, economists, and other social scientists engaged in terrorism modeling and research. In this poster presentation, we summarize the conference findings.

P.153  Simulating Non-Dietary Ingestion of Listeria monocytogenes from Residential Surfaces. Canales RA*, Sinclair RG, Soto-Beltran M, Reynolds K; The University of Arizona   rcanales@email.arizona.edu

Abstract: While infection by Listeria monocytogenes in healthy individuals typically leads to only mild symptoms, in susceptible populations listeriosis has a high fatality rate. In fact, the United States Centers for Disease Control lists Listeria as one of the top five pathogens causing foodborne illness resulting in death. The objective of this work is to compose and present a simulation framework for estimating health risks from non-dietary ingestion of Listeria in residential environments. Although there is evidence that the principal sources of Listeria are ready-to-eat foods and unpasteurized dairy products, we take a cue from the chemical risk assessment field and recognize that additional exposure pathways may warrant exploration. The framework is composed of simulated activities and transfer of Listeria from household surfaces to hands, and subsequent transfer from hands to mouth. Hand and mouth activities are modeled using published behavioral data and incorporate values for the surface area of contact. The framework is applied to data collected from an exploratory study of pathogens on household surfaces in an urban low-income community in Lima, Peru. Approximately 25% of fomites tested were positive for Listeria, with positive concentrations ranging from 0.2 to greater than 75 MPN/10 cm². Inputs were incorporated as truncated probability distributions and the framework was run as a Monte Carlo assessment, resulting in distributions of non-dietary ingestion estimates and risks. While the resulting exposure and risk estimates were relatively low, the primary insight from constructing the framework is that the data needed for realistic assessments of non-dietary ingestion exposure to Listeria in residential environments are limited. Little is known about adult behaviors and contacts with surfaces, or about transfer of Listeria from surfaces to skin and from skin to mouth. Risk assessment is further complicated because dose-response models for Listeria are inconsistent and poorly understood.
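The Monte Carlo structure described above (surface concentration × contact area × surface-to-hand transfer × hand-to-mouth transfer × contact frequency, with truncated input distributions) can be sketched as follows. All parameter values and distribution shapes here are illustrative placeholders, not the authors' fitted inputs:

```python
import random

def truncated_lognormal(mu, sigma, upper):
    """Draw from a lognormal distribution, redrawing until the value is below the cap."""
    while True:
        x = random.lognormvariate(mu, sigma)
        if x <= upper:
            return x

def simulate_dose(n_iter=10_000, seed=1):
    """Monte Carlo estimate of daily non-dietary ingestion (MPN/day).

    Returns the mean and the 95th-percentile dose across iterations.
    Hypothetical inputs: surface concentration capped at 75 MPN/10 cm^2
    (the upper bound reported in the abstract); transfer efficiencies,
    contact areas, and frequencies are placeholder ranges.
    """
    random.seed(seed)
    doses = []
    for _ in range(n_iter):
        conc = truncated_lognormal(mu=0.5, sigma=1.0, upper=75.0)  # MPN per 10 cm^2
        area_frac = random.uniform(0.05, 0.15)       # fraction of a 10 cm^2 patch contacted
        surf_to_hand = random.uniform(0.05, 0.35)    # transfer efficiency, surface -> hand
        hand_to_mouth = random.uniform(0.10, 0.40)   # transfer efficiency, hand -> mouth
        contacts = random.randint(5, 20)             # hand-to-mouth contacts per day
        doses.append(conc * area_frac * surf_to_hand * hand_to_mouth * contacts)
    doses.sort()
    return sum(doses) / n_iter, doses[int(0.95 * n_iter)]

mean_dose, p95_dose = simulate_dose()
```

The resulting dose distribution would then feed a Listeria dose-response model to yield risk estimates, the step the abstract notes is hampered by inconsistent dose-response data.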

P.154  Comparing bioactivity profiles of diverse nanomaterials based on high-throughput screening (HTS) in ToxCast™. Wang A*, Filer D, Shah I, Kleinstreuer N, Berg E, Mosher S, Rotroff D, Marinakos S, El-Badawy A, Houck K; AW, DF, IS, DM, KH: US EPA. NK: ILS. EB: BioSeek Inc. DR: NC State Univ. SM: Duke Univ.   wang.amy@epa.gov

Abstract: Most of the over 2,800 nanomaterials (NMs) in commerce lack hazard data. Efficient NM testing requires suitable toxicity tests for prioritizing which NMs to test. The EPA's ToxCast program is evaluating HTS assays to prioritize NMs for targeted testing. Au, Ag, CeO2, Cu(O2), TiO2, SiO2, and ZnO nanoparticles, their ion and micro counterparts, carbon nanotubes (CNTs), asbestos, and pesticides containing nano-Cu(O), 62 samples in total, were screened at 6 to 10 concentrations each for 262 bioactivity/toxicity endpoints in cells and zebrafish embryos. Cellular stress and immune response pathways were primarily affected. NMs' core chemical composition was more important than size for bioactivity. NMs had profiles similar to those of their ion counterparts, suggesting ion shedding is a key factor in their mechanism of action. Ag, Cu, and Zn (nano, ion) were more cytotoxic and active in more assays than the other samples. While 3 asbestos samples had similar immune response profiles, 6 CNTs had profiles distinct from asbestos. Potential bioactivity targets that were not directly measured were suggested by reference profiles similar to our data; for example, our nano-TiO2 profile resembled that of a microtubule stabilizer that interferes with mitosis. Dividing endpoints into cytotoxicity and various functional domains, we developed a ToxPi-based ranking approach for in vitro bioactivity. Samples active in more domains at lower concentrations were ranked higher than samples active in fewer domains and/or at higher concentrations. Ag, Cu, and Zn samples ranked high in in vitro bioactivity; asbestos, Au, CeO2, some CNTs, and some TiO2 samples ranked low. Recognizing that our assays using submerged cells may have limited sensitivity to inhalation effects, we are exploring prioritization approaches for various purposes. We demonstrated that HTS assays can identify affected cellular pathways, predict targets, and may be useful for ranking NMs for specific purposes.
This abstract does not reflect EPA policy.
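The ranking principle described above (more active domains at lower concentrations rank higher) can be illustrated with a minimal ToxPi-style scoring sketch. The sample names, domain labels, and AC50 values below are hypothetical and not the study's actual data:

```python
import math

# Hypothetical AC50 values (uM) per assay domain; None = inactive in that domain.
samples = {
    "nano-Ag": {"cytotox": 3.0, "immune": 10.0, "stress": 5.0},
    "nano-Zn": {"cytotox": 8.0, "immune": 15.0, "stress": None},
    "nano-Au": {"cytotox": None, "immune": None, "stress": 90.0},
}

def toxpi_score(profile, max_conc=100.0):
    """Sum per-domain potency slices: each active domain contributes
    -log10(AC50 / top tested concentration), so activity in more domains
    and at lower concentrations both increase the score."""
    score = 0.0
    for ac50 in profile.values():
        if ac50 is not None:
            score += max(0.0, -math.log10(ac50 / max_conc))
    return score

# Rank samples from highest to lowest in vitro bioactivity score.
ranked = sorted(samples, key=lambda s: toxpi_score(samples[s]), reverse=True)
```

Under this scoring, a sample active in all three domains at low AC50s (nano-Ag here) outranks one active in a single domain near the top tested concentration (nano-Au), matching the ranking logic the abstract describes.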

P.155  Ammonia removal from waste water from cattle and livestock and its reuse. Cabrera VB*, De Las Pozas C; Universidad San Sebastian   victor.cabrera27@gmail.com

Abstract: Waste water from some farms ought to be treated with zeolite, an aluminum-silicate volcanic mineral that can be used to remove heavy metals and other toxins, among them ammonia and ammonium. Zeolite resembles a sand whose negative surface charge attracts and binds the toxins and other toxic elements found in the waste. The treated water can then be used for watering crops and farmland in the urban areas of central Chile. If the waste water is not treated with zeolite, contamination is a potential danger, not just for crops but also for the soil: infiltration of this water will contaminate the water table and aquifers and will leach through soil layers and strata, so groundwater will be affected. According to the Langmuir model, zeolite compounds used in waste water from horse stables, cowsheds, livestock barns, and piggery sheds have given an excellent fit for NH4+ and NH3 ions.
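The Langmuir-model fit mentioned above relates adsorbed loading q to equilibrium concentration c via q = q_max·K·c / (1 + K·c), which linearizes as c/q = 1/(K·q_max) + c/q_max. A minimal fitting sketch on synthetic data (the q_max and K values are illustrative, not the study's measurements):

```python
def langmuir(c, q_max, k):
    """Langmuir isotherm: adsorbed amount q at equilibrium concentration c."""
    return q_max * k * c / (1.0 + k * c)

def fit_langmuir(concs, loadings):
    """Recover q_max and K from the linearized form c/q = 1/(K*q_max) + c/q_max
    via an ordinary least-squares line fit of c/q against c."""
    ys = [c / q for c, q in zip(concs, loadings)]
    n = len(concs)
    mx, my = sum(concs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(concs, ys))
             / sum((x - mx) ** 2 for x in concs))
    intercept = my - slope * mx
    q_max = 1.0 / slope          # slope of the line is 1/q_max
    k = slope / intercept        # intercept is 1/(K*q_max)
    return q_max, k

# Synthetic equilibrium data (e.g. mg NH4+ per g zeolite) from q_max=15, K=0.4
concs = [1.0, 2.0, 5.0, 10.0, 20.0, 50.0]
loadings = [langmuir(c, 15.0, 0.4) for c in concs]
q_max_hat, k_hat = fit_langmuir(concs, loadings)
```

With noise-free data the linearized fit recovers the parameters exactly; with real sorption measurements, the quality of this fit is what the abstract characterizes as the Langmuir model's excellent agreement for ammonium and ammonia.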


