Society For Risk Analysis Annual Meeting 2016

Session Schedule & Abstracts

* Disclaimer: All presentations represent the views of the authors, and not the organizations that support their research. Please apply the standard disclaimer that any opinions, findings, and conclusions or recommendations in abstracts, posters, and presentations at the meeting are those of the authors and do not necessarily reflect the views of any other organization or agency. Meeting attendees and authors should be aware that this disclaimer is intended to apply to all abstracts contained in this document. Authors who wish to emphasize this disclaimer should do so in their presentation or poster. In an effort to make the abstracts as concise as possible and easy for meeting participants to read, the abstracts have been formatted such that they exclude references to papers, affiliations, and/or funding sources. Authors who wish to provide attendees with this information should do so in their presentation or poster.

Common abbreviations

Poster Session

Room: Grande Ballroom   6:00 pm–8:00 pm

P.1  Risk mapping of technological disasters and its application in land use planning: the state of the art. Alves EN*; Engine Engenharia Ltda

Abstract: This review paper examines journal articles published over the past 16 years dealing with risk mapping of accidents involving hazardous materials and its application in Land Use Planning (LUP). The aim is to identify the key issues and decision-making approaches that have been highlighted at different times and places, thereby delineating the state of the art on the topic and building a theoretical framework to guide new discussions and paths toward improving this field, which is assuming an important role, at local, regional, and national scales, in industrialized and urbanized countries. The history of industrial accidents clearly demonstrates that their consequences can be severely amplified by the presence of dwellers living in areas at risk. The major articles reviewed show that risk mapping in LUP has been urged in many countries, particularly those in the European Union driven by Seveso II, since a policy for the vicinity of hazardous establishments is an essential element in mitigating the effects of accidents on the surrounding area. The field of engineering has shown great command of techniques and methodologies for preventing loss of containment and for defining risk distances, but it remains focused on technological development inside industries. To achieve the targets for preparedness and response to major accidents within a perspective of social and environmental justice, it is necessary to dialogue with land use planners, who in turn have always worked independently; despite great advances in LUP theory and practice, many problems remain unresolved, particularly those regarding safety in urban and environmental areas.

P.2  Inter-organizational collaboration during complex risk events: Communication task performance and satisfaction in homogeneous and mixed stakeholder teams. Beaudry M*, Lemyre L, Blust-Volpato SA, Boutette P, Pinsent C; University of Ottawa

Abstract: During complex risk events, emergency management organizations must collaborate with external organizations that often have different cultures and command systems. To ensure the short- and long-term success of these collaborations, it is important to understand the key drivers of performance and of stakeholder satisfaction in mixed teams. This study examined interorganizational collaboration using an in vivo simulation of a public communication task related to a complex radiological event. Participants were senior disaster and emergency management officers working in three types of organizations: the military, emergency response services (with an incident command system), and public service organizations (without an incident command system). Comparing homogeneous and mixed teams, we examined externally rated performance (decision quality), self-rated process and outcome satisfaction, and self-reported team functioning. Nonparametric analyses revealed that externally rated performance was significantly higher in homogeneous teams and that mixed teams experienced more frequent differences of opinion; however, frustration with those differences was negatively associated with performance only in homogeneous teams. In mixed teams, performance was related to a higher sense of belonging and broader leadership distribution. Satisfaction with the problem-solving process was associated with performance, sense of belonging, and trust in homogeneous teams, and with outcome satisfaction in mixed teams. More research is needed to capture the factors influencing performance in mixed interorganizational teams across various types of tasks and across the risk management timeline; in particular, the roles of social identity and shared governance warrant further investigation. 
Lastly, training may benefit from exercises and simulations in mixed teams.

P.3  Development of Cloud-Based Food Safety Assessment System from Post-Market Surveillance with Bayesian inference via Markov Chain Monte Carlo technique. Chuang YC*, Wu KY; National Taiwan University

Abstract: To ensure food safety and quality, post-market surveillance of food products is carried out yearly by local Health Bureaus in Taiwan. Over 4,000 cases are inspected from markets and vendors each year, and the surveillance items include pesticide residues in agricultural products and rice, veterinary drug residues in foods, heavy metal contents in fruits, vegetables, and rice, and mycotoxins in commercial foodstuffs. Compliance rates reported in the 2015 TFDA annual report ranged from 87.2% to 100%, indicating that food safety was well managed under current regulatory standards. However, the current health risk level is hard to quantify, which limits improvement of risk management performance. Conducting a health risk assessment faces two challenges: 1) the detailed inspection information is scattered across individual local Health Bureaus without appropriate compilation; and 2) the monitoring dataset from post-market surveillance is highly censored and may therefore produce biased exposure concentrations. A cloud-based application combined with Bayesian inference via the Markov chain Monte Carlo technique (BSMCMC) can be adopted to overcome both problems simultaneously. A cloud storage database provides local Health Bureaus a friendly interface, without spatial limitation, for collecting inspection information including sampling size, residue concentrations, food manufacturer, instrument detection limits, and uncertainty. A proper prior distribution can be created from the collected data and combined with the likelihood function of the Bayesian inference model to yield the posterior distribution of the mean residue concentration. The BSMCMC model thereby reduces the uncertainty of risk assessment from highly censored data. 
The cloud-based food safety assessment system not only assesses health risk levels with a limited dataset but also improves the efficiency of resource management in risk management.

P.4  Enhancing operational risk management for wintertime oil spills with smart response services. Goerlandt F*, Tabri K, Aps R, Höglund A, Lensu M, Rytkönen J; Aalto University

Abstract: While maritime transport is of vital economic importance to the Baltic Sea area, challenging winter navigation conditions pose a hazard to ships operating in these waters. To counteract the environmental risks posed by oil pollution from shipping accidents, adequate measures for accident prevention and spill mitigation are critically important. Operational oil spill risk management is facilitated by smart response services, encompassing several technological and scientific developments. When a ship collision or grounding has occurred, tools for predicting the amount of oil spilled and the spill duration provide critical information to response services about the necessary resources for mitigation actions. Likewise, projections about the fate of the oil in the moving ice fields are useful for operational planning purposes. Finally, linking this information to knowledge about the vulnerability of various ecosystem services of the sea area further assists in prioritizing actions to minimize the consequences of the spill. This work presents the overall rationale and selected results of the STORMWINDS project, which aims to advance the state of the art in operational risk management for accidental spills in ice conditions, through the development of advanced tools and online services to assist oil spill response operations.

P.5  Estimation of human risks induced by chemical accidents. Murayama T*, Toshida M; Tokyo Institute of Technology

Abstract: While environmental risks from the dispersion of chemical substances from factories under normal conditions are estimated and managed using data such as those from the PRTR system, the management of impacts from discharges of hazardous substances in emergency situations, such as accidents and earthquakes, is still being developed. After identifying a target substance and factory in a prefecture, we estimated environmental risks, including casualties and health impacts from acute effects, as well as the distribution of risk levels, considering annual probability and population under various wind and other atmospheric conditions. The results suggest that casualty risk would be higher within roughly 1 km, and that the estimation is suitable for understanding the distribution of risks.
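The abstract does not state which dispersion model was used; a common screening-level choice for estimating downwind concentrations from an accidental release is a Gaussian plume, sketched here with invented emission rates and a simplified, illustrative dispersion-coefficient fit (not a validated parameterization).

```python
import math

def plume_concentration(q, u, x, y, z=0.0, h=10.0):
    """Ground-level concentration (g/m3) of a continuous release.

    q: emission rate (g/s), u: wind speed (m/s), x: downwind distance (m),
    y: crosswind distance (m), z: receptor height (m), h: release height (m).
    The power-law dispersion coefficients below are illustrative values for
    roughly neutral stability.
    """
    sigma_y = 0.08 * x / math.sqrt(1 + 0.0001 * x)
    sigma_z = 0.06 * x / math.sqrt(1 + 0.0015 * x)
    term_y = math.exp(-y * y / (2 * sigma_y ** 2))
    # Ground reflection: mirror-image source term at -h
    term_z = (math.exp(-(z - h) ** 2 / (2 * sigma_z ** 2))
              + math.exp(-(z + h) ** 2 / (2 * sigma_z ** 2)))
    return q / (2 * math.pi * u * sigma_y * sigma_z) * term_y * term_z

# Centerline concentration 1 km downwind of a hypothetical 100 g/s release
# in a 3 m/s wind
c = plume_concentration(q=100.0, u=3.0, x=1000.0, y=0.0)
print(f"concentration 1 km downwind: {c:.2e} g/m3")
```

Mapping such concentrations against acute toxicity thresholds, wind-direction frequencies, and population density is what yields the kind of spatial risk distribution the abstract describes.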

P.6  Association between air pollution exposure and acute myocardial infarction emergency room visits: the effects of comorbid chronic conditions. Pan SC*, Huang CC, Ho WC, Chen BY, Guo YL; National Taiwan University

Abstract: Background: Increasing epidemiological evidence has shown that exposure to air pollution may be associated with an increased risk of emergency room (ER) visits for acute myocardial infarction (AMI). However, information on susceptibility caused by chronic medical conditions has been limited. Objective: The aim of this study was to investigate whether comorbid chronic conditions modify the adverse effects of air pollution exposure on AMI ER visits. Methods: Healthcare utilization information was obtained from the National Health Insurance Research Database (NHIRD). Daily ambient air pollutant levels were extracted from Taiwan EPA air monitoring data. First-time AMI ER visits during 2006-2011 were included. We used a time-stratified case-crossover design and conditional logistic regression, adjusted for temperature and relative humidity, to investigate the relationship between air pollutant exposure and AMI ER visits in groups with and without comorbid chronic disease. Results: There were 963 first-time AMI ER visits during 2006-2011. The odds ratio (OR) of AMI ER visits associated with a per-interquartile-range increase in same-day PM2.5 (lag 0) was 1.12 (95% confidence interval, CI=0.97-1.28). The OR was 1.26 (95% CI=1.00-1.57) for the diabetic group and 1.04 (95% CI=0.87-1.23) for the non-diabetic group. Conclusions: Exposure to PM2.5 increases the risk of AMI ER visits, and people with pre-existing diabetes are especially susceptible to PM2.5-related AMI ER visits.
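An OR "per interquartile-range increase" is obtained by scaling the fitted log-odds coefficient by the IQR before exponentiating. The coefficient, standard error, and IQR below are hypothetical values chosen only to mimic the scale of the reported estimates; they are not from the study.

```python
import math

# Hypothetical output of a conditional logistic regression of AMI ER visits
# on same-day PM2.5, adjusted for temperature and relative humidity.
beta = 0.0075   # log-odds per 1 ug/m3 increase in PM2.5 (assumed)
se = 0.0045     # standard error of beta (assumed)
iqr = 15.0      # interquartile range of PM2.5 in ug/m3 (assumed)

or_iqr = math.exp(beta * iqr)
ci_low = math.exp((beta - 1.96 * se) * iqr)
ci_high = math.exp((beta + 1.96 * se) * iqr)
print(f"OR per IQR increase: {or_iqr:.2f} (95% CI {ci_low:.2f}-{ci_high:.2f})")
```

With these assumed inputs the point estimate and interval come out close to the abstract's overall result (1.12, 95% CI 0.97-1.28), which illustrates why effect sizes in air pollution studies are conventionally reported per IQR rather than per unit of concentration.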

P.7  Creation of REDESASTRE as a strategy for capacity building and support for the implementation of the Sendai Framework in the Parana State - Brazil. Pinheiro EG, Stringari D*; Disaster Research Center of Parana State - Brazil

Abstract: The State of Paraná, in southern Brazil, established the country's first thematic network to address disaster risk reduction (DRR), called REDESASTRE. The network brings together more than 15 cooperating higher education institutions and research centers, which meet to integrate different areas in the search for technical and scientific alternatives for DRR. The initiative grew out of public managers' perception that something needed to be done beyond the usual preparation for response. Unlike other research centers in Brazil, the University Centre for Studies and Research on Disasters (CEPED/Paraná) was created by an official act of the Governor and, despite being a university center of the Paraná State University, is linked directly to the State Coordination of Protection and Civil Defense. The Centre and REDESASTRE aim at research and university extension; education (training courses); and technological innovation for the development of new technologies for DRR. Among the actions implemented by CEPED/Paraná is cooperation with UNISDR, which accredited CEPED/PR as the only multiplier in Brazil of the training course for the global campaign "Building Resilient Cities". REDESASTRE has the potential to engage academia, raising awareness of the importance of the subject and narrowing the gap between the real needs indicated by public managers and the demands of the local community, which require science and technology so that risks can be known and, consequently, diminished. REDESASTRE also raises awareness in strategic sectors of the economy, such as industry, encouraging them to financially support initiatives fostering research, extension, and technological innovation, so as to make the network financially sustainable. 
The results of this combination so far indicate that REDESASTRE has a relevant role in the implementation of the Sendai Framework.

P.8  Screening for Developmental and Reproductive Toxicity Hazards in the Workplace. Sullivan KS*, Dodge DG, Lewandowski TA; Gradient Corporation

Abstract: We have conducted conservative screening-level analyses aimed at protecting workers from developmental and reproductive toxicity (DART) for over 10 years. These DART assessments address inhalation exposures only and help occupational physicians decide whether workplace activities should be restricted to protect mothers and the developing child. The assessment uses a three-part process: exposure assessment, DART hazard identification, and risk characterization. Exposure assessments identify workplace chemicals of concern (CoC) and quantify exposure duration and frequency. Toxicological data for the CoC are identified and captured during the DART hazard identification phase using sources such as online databases (TOXNET, IRIS, IUCLID, ATSDR, Prop65), occupational exposure limits (OELs) and their documentation, and MSDSs. Chemicals are categorized as known, suspect, or not suspect based on the toxicological review; if no suitable information is found, a classification of insufficient data is assigned. Risk characterization includes calculating a DART reference dose (DRD) and worker dose (WD). The DRD is derived from OELs that account for DART effects, or from animal or human data with applied safety factors. A DART-specific hazard quotient (HQ; HQ=WD/DRD) is established and serves as the assessment product. The assessment results are provided to the occupational medicine team and the employee. Using this process, we have evaluated over 1,500 individual chemicals and/or products. Eighteen percent were classified as known or suspect DART agents. Fifty percent of the individual chemicals evaluated did not have an established OEL (for any health effect); among these substances, 8% were known or suspect DART agents. These results suggest a need to consider DART effects regardless of whether a chemical has an existing OEL. 
Based on our experience, this screening process is a useful tool for rapidly informing risk managers whether there are DART concerns so that appropriate interventions can be considered.
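The risk characterization step reduces to two divisions and can be sketched directly. The point of departure, safety factor, and worker dose below are hypothetical numbers for illustration, not values from any actual assessment.

```python
def dart_reference_dose(point_of_departure, safety_factor):
    """DRD: a point of departure (e.g., an OEL that accounts for DART
    effects, or an animal/human effect level) divided by the applied
    safety factors."""
    return point_of_departure / safety_factor

def hazard_quotient(worker_dose, drd):
    """DART-specific hazard quotient, HQ = WD / DRD. An HQ at or above 1
    flags a potential concern for the occupational medicine team."""
    return worker_dose / drd

# Hypothetical screening example: an animal effect level of 10 mg/m3 with a
# composite safety factor of 100, and an estimated worker dose of 0.05 mg/m3.
drd = dart_reference_dose(10.0, 100)
hq = hazard_quotient(0.05, drd)
print(f"DRD = {drd} mg/m3, HQ = {hq}")
```

Here HQ = 0.5, below the usual screening threshold of 1, so this hypothetical exposure would not trigger a restriction.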

P.9  Uncertainty analysis of the assessment processes in the screening hazard assessment of human health under Japan’s Chemical Substances Control Law. Yamaguchi H*, Matsumoto M, Kato H, Hirose A; National Institute of Health Sciences

Abstract: Hazard classes for human health under the screening assessment of Japan’s Chemical Substances Control Law (CSCL) were established for about 450 chemical substances between fiscal years 2010 and 2014. However, about 10,000 substances are targeted in the screening assessment, of which about 7,500 are manufactured or imported in amounts exceeding 10 tons. Several contracted institutions had conducted pre-screening assessments of thousands of substances and estimated hazard assessment values, but the information and data they collected required refinement before they could be applied to the governmental screening assessment, because the institutions' assessment procedures were inconsistent with one another. Consequently, many resources have been spent on data evaluation for this refinement. Toward establishing an effective hazard screening procedure, this study aimed to quantitatively evaluate the uncertainty in the hazard screening assessment process and to identify its sources by comparing the hazard assessment values, assessment procedures, and sources of hazard information adopted by several authorities. We then discuss improvements to the procedure to support efficient screening assessment under the CSCL.

P.10  SISDC Mobile: A support tool for municipalities in disaster management. Barros E*, Borges MMF; University Centre for Disaster Studies and Research on the State of Paraná

Abstract: In Brazil, disaster risk management is governed by Federal Law 12,608 of 10 April 2012, which addresses, among other things, the powers of the Union, states, and municipalities to execute the National Policy on Protection and Civil Defense within their competencies. In this context, the State Coordination of Protection and Civil Defense of the State of Paraná (CEPDEC) has been developing, since 2005, the Computerized System of Civil Defense (SISDC) to support municipalities in disaster management. The SISDC has established itself as an important support tool; however, CEPDEC identified the need for a specific tool that would let municipalities improve the quality of their mapped attention areas, carry out mapping in remote locations, and subsequently transmit the information to the SISDC. Thus, in partnership with the State of Paraná Technology Company (CELEPAR), the mobile version of SISDC was released in 2016 (available on the Google Play store in two versions, training and system), providing municipalities in the state with the following features: mapping of attention areas for flooding and landslides (with the mobile version on a tablet or smartphone, the operator can create a new area on a previously registered base map by marking points in the field or by walking through the area); a specific form for flooding areas; a specific form for registering landslide areas; and an individual registration form for residences in attention areas. The State Coordination of Protection and Civil Defense believes the implementation of this tool is a milestone in the progress of Civil Defense in the state of Paraná: a unique, modern, and functional tool that will help the 399 municipalities of Paraná carry out proper disaster risk management, increasing resilience and contributing to the preservation of life.

P.11  Comparison and validation of statistical methods for predicting the failure probability of trees. Kabir E*, Guikema SD; University of Michigan

Abstract: This paper examines disparate statistical methods for predicting the failure likelihood of trees during storms and compares their accuracies. Accurate predictions play a key role in helping arborists take preventive measures aimed at decreasing the chance of failure, or even remove hazardous trees. The data consist of four factor variables, namely the location of each tree, the tree species, whether the tree was pruned, and whether any trees around it had been removed, plus two continuous variables, diameter at breast height (DBH) and height. Several data mining methods are used to predict the failure probability of trees: logistic regression, random forest regression, classification and regression trees (CART), multivariate adaptive regression splines (MARS), artificial neural networks (ANN), Naïve Bayes, and an ensemble model. These models are validated through one hundred holdouts, and the most accurate are chosen for further analysis. Our results indicate that logistic regression, random forest, and an ensemble of these two models predict the failure rate better than the others.
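The repeated-holdout validation scheme can be sketched in a few lines. The synthetic data generator and the two toy models below (a majority-rate baseline and a single-feature height split) are illustrative stand-ins, assumed for the sketch; they are not the paper's models or data.

```python
import random

random.seed(0)

# Synthetic tree records: (dbh_m, height_m, pruned, failed).
# The failure rule is invented purely so the harness has data to run on.
def make_tree():
    dbh = random.uniform(0.1, 1.5)
    height = random.uniform(5.0, 30.0)
    pruned = random.random() < 0.5
    p_fail = min(0.9, 0.05 + 0.5 * (height / 30.0) + (0.2 if not pruned else 0.0))
    return (dbh, height, pruned, random.random() < p_fail)

data = [make_tree() for _ in range(400)]

def majority_model(train):
    """Baseline: predict the overall training failure rate for every tree."""
    rate = sum(t[3] for t in train) / len(train)
    return lambda tree: rate

def height_split_model(train):
    """One-feature model: failure rate above vs. below the median height."""
    med = sorted(t[1] for t in train)[len(train) // 2]
    tall = [t for t in train if t[1] >= med]
    short = [t for t in train if t[1] < med]
    p_tall = sum(t[3] for t in tall) / len(tall)
    p_short = sum(t[3] for t in short) / len(short)
    return lambda tree: p_tall if tree[1] >= med else p_short

def holdout_accuracy(fit, n_holdouts=100):
    """Mean accuracy over repeated random 75/25 train/test splits."""
    accs = []
    for _ in range(n_holdouts):
        random.shuffle(data)
        train, test = data[:300], data[300:]
        model = fit(train)
        correct = sum((model(t) >= 0.5) == t[3] for t in test)
        accs.append(correct / len(test))
    return sum(accs) / len(accs)

results = {name: holdout_accuracy(fit)
           for name, fit in [("majority", majority_model),
                             ("height split", height_split_model)]}
print(results)
```

In the paper's setting the candidate `fit` functions would be the listed learners (logistic regression, random forest, CART, MARS, ANN, Naïve Bayes, ensemble), and the same one-hundred-holdout loop would rank them by mean accuracy.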

P.12  Can risk governance function without a risk council? Bonneck S*

Abstract: The carcinogenic mode of action of acrylamide was not doubted by any expert when the substance was discovered in food in 2002. The criteria for the precautionary principle were undoubtedly met and the reduction of acrylamide contents in a number of products was soon to be technically feasible. In spite of this, consumer exposure could not be minimized as the risk management was: a) not transparent: the risk management did not determine the extent of the acceptable risk. It neither introduced mandatory regulatory measures nor did it explain why it did not intervene; b) not efficient: eight risk assessments were commissioned between 2002 and 2015. The results did not go considerably beyond the knowledge of the early 1990s and were not relevant for further developing the risk management process. As no effort was made to set questions in individual research areas, a large number of research results were generated where the relevance for consumer health protection remains completely unclear; c) not effective: no risk assessment policy was issued. Therefore what remained concealed was the fact that the assumptions regarding the exposure assessments were not based on scientific facts and were also not in line with the precautionary principle. Furthermore, no options for management measures were developed and evaluated. The risk governance process could have been more transparent, more efficient and more effective if it had been supervised and coordinated by a risk council. This board should include representatives of all risk relevant disciplines that consider consumer protection as their highest priority. Only a higher-ranking risk council can ensure that solutions acceptable to the general public are found within a certain timeframe.

P.13  Thailand’s granary faces risks of drought due to climate change. Yi Carine*; Tohoku University

Abstract: Global climate change, including increases in extreme weather such as heat waves, flooding, wildfires, and drought, is frequently observed. Climate change affects topography, land use, agricultural behavior, and ultimately human populations. Unlike floods, drought is an extreme water- and heat-related disaster whose process is slow, gradually expanding its affected areas and leaving long-lasting impacts on soil. Nevertheless, drought hazard seems to have received less attention than floods throughout the world until California's drought occurred. The global climate requires a balance of precipitation and evaporation among the earth's surface, the atmosphere, and groundwater, as well as changes in agricultural practices, with no international boundary; a disaster adaptation strategy is therefore needed. This study focuses on northeast Thailand, where reported crop productivity is low due to the region's poor soil conditions, amplified by unsustainable agricultural practices; decreasing precipitation and a lack of water resources have also been reported recently. Thai people now face risks of flood and drought at a more extreme level as major disaster events. Furthermore, drought changes not only river ways but also groundwater levels and land use, as farmers react to changing seasons. To clarify the impacts of drought in the local area, precipitation and temperature changes, climate-driven changes in crop productivity, changes in global crop market prices, and any changes in crop export volumes need to be interpreted. Increasing water demand from population and economic growth, as well as water storage capacity in these drought-suffering areas, which rely on rainfall and traditional water management at the personal and community level, will be studied for long-term water and soil risk management.

P.14  Estimation and Management of risks of injury at institutions due to fuel burning appliances. Sridharan S, Mangalam S*, Wiersma R, Ravindran K, Reid D, Larez J; Technical Standards and Safety Authority

Abstract: This paper will outline a novel methodology used by the Technical Standards and Safety Authority to estimate and manage the risk of injury or fatality to residents at institutions due to fuel-burning appliances such as boilers, water heaters, and furnaces.

P.15  Understanding causes and outcomes of electrical injuries at Institutions from an Epidemiological Perspective. Moody Joel*; Electrical Safety Authority

Abstract: This paper will be presented by the Electrical Safety Authority (ESA), which is responsible for enhancing public electrical safety through training, inspection, authorization, investigation, registration, enforcement, audit, and other regulatory and non-regulatory public electrical safety quality assurance services. ESA will discuss the causes and outcomes of injuries sustained by occupants at institutions due to electrical technologies, using a population-based epidemiological perspective.

P.16  A risk-based framework for protecting the rights of residents of retirement homes in Ontario, Canada. Bates A*, Castellino A, Mangalam S; Retirement Homes Regulatory Authority

Abstract: This paper will be presented by the Retirement Homes Regulatory Authority (RHRA), who are responsible for overseeing Ontario retirement homes’ compliance with legislative requirements, and educating the public, the retirement home sector and residents about these requirements, the rights of residents, and best practices for the operation of retirement homes. The RHRA will discuss the particular challenges of building a risk framework to accommodate the organization’s broad mandate, which includes protection of residents’ dignity, respect, privacy, autonomy, security, safety, comfort, and informed choices about care options.

P.17  Pathways to learning in selecting voluntary risk management practices. Scott RP*; University of Washington

Abstract: Voluntary best management practices (BMPs) are a common framework for addressing diffuse risks and externalities in a manner that allows firm-level flexibility in response and limits government coercion. However, the use of BMPs in oil and gas risk management relies on firms baselining practices off each other, adapting practices based on new knowledge and feedback, and selecting practices that do in fact minimize or mitigate risks. This project evaluates: 1) what pressures firms to adopt new best management practices in response to new information from firms, residents, or agencies? and 2) how do various pressures drive specific firm practice choices? Based on 18 semi-structured interviews with drilling permit managers, this paper evaluates the use of firm experiences, agency feedback, other firms' actions, and citizen involvement in firms' choice of BMPs. The interviews were conducted in the state of Colorado in 2016 and were transcribed and coded using attribute and provisional coding, followed by structural coding emphasizing differences between paired observations. The interviews are coded and analyzed in the software Dedoose, and each interview is analyzed alongside administrative data on firm and location characteristics gathered from the State of Colorado. The results highlight patterns that emerge as clear pathways to learning and adaptation based on the paired comparisons. I specifically characterize pathways through which specific pressures, including normative pressure, coercive pressure, uncertainty, and economic motivations, foster the use of deliberation as part of the learning process. This work builds on the case to hypothesize about conditions under which BMP systems may be most effective for risk management, emphasizes areas where regulatory agencies may be able to supplement private motivations, and provides new questions for risk management research.

P.18  Establishing and Implementing Enterprise Risk Management in Government Agencies. Arimoto CW, Howard PM*; ABSG Consulting Inc

Abstract: Government agencies face a variety of risks across their organizations, from internal risks such as data breaches, human resources failures, and poor decision-making, to external risks such as changing operating environments and mission creep. A number of oversight bodies, such as the Office of Management and Budget (OMB) and the Government Accountability Office (GAO), have been advocating for government agencies to implement an Enterprise Risk Management (ERM) program that identifies, assesses, evaluates, and prioritizes organization-wide risks. These programs inform decision-making and enable effective mitigation of unacceptable risks. The first step in implementing an ERM program at a government agency is to determine the current state of risk management across the organization. The next step is to determine which frameworks and guidelines are most applicable and appropriate for the particular agency. Finally, the agency establishes and executes the program to better inform strategic planning and performance management and to prevent catastrophic loss to the organization. This presentation will provide an overview of the process of establishing and implementing an ERM program in a government agency.

P.19  Enterprise Risk Management Implementation after Organizational Crisis: Opportunity to build a resilient structure in a multinational company. Janickova M*; Paris Dauphine University

Abstract: In Multinational Companies (MNCs), major challenges and crises often dramatically impact organizational systems, largely because risk management systems are ineffective at anticipating uncontrolled events. The starting point of our research is the literature on MNC structural change (Chandler, 1962; Mintzberg, 1979), followed by the more recent integrated approach called Enterprise Risk Management (Bromiley, 2015; Power, 2009, 2007; Mikes, 2014, 2011, 2009). Our literature review highlights that ERM, as a new topic area in the research literature, needs further empirical shaping. Moreover, we identify in the previous literature a lack of consensus on building ERM in practice, which reveals an unexplored gap between key concepts in organizational studies, especially between the dynamics of organizational crisis, structural change, and systems of actors. To fill this gap, we propose an Integrated Risk Management System Framework. We present a revelatory single-case study (Yin, 2003) that allows us to examine change in organizational structure brought about by introducing ERM; our embedded approach is appropriate for examining risk structure and change in detail. We collect and triangulate data from multiple sources, including observations, semi-structured interviews, and documentation. The data analysis supports our conceptual framework and leads to our main results. Our findings contribute to the development of risk management theory within organizational studies. We conclude with a discussion of effective ERM practices and conditions that could be examined by future research.

P.20  Evaluation of a Model which Supports Decision-Making on Information Security Risk Treatment Using Statistical Data. Kawasaki (Aiba) R*, Hiromatsu T; Institute of Information Security

Abstract: This research aims to evaluate the effectiveness of a model proposed by Kawasaki (Aiba) and Hiromatsu (2014), which supports decision-making by top management on information security risk treatment. The model identifies a suitable set of risk measures, and the costs needed for those measures, under the constraints of an organization's total budget and target level of information security risk. By using this model, top management can avoid making decisions based only on experience and intuition. Evaluating the model's effectiveness, however, poses a difficult problem: the evaluation requires the actual outcomes of information security risk assessment and risk treatment, and such information is generally not disclosed for security reasons. Therefore, this research focuses on using statistical data. First, we show how to apply statistical data to the model and present the results of that application. We then consider the effectiveness of the model with reference to these results.
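The kind of budget-constrained selection the model performs can be illustrated with a minimal 0/1 knapsack-style sketch. All measure names, costs, and risk-reduction values below are hypothetical, and exhaustive search stands in for whatever optimization the authors actually use:

```python
from itertools import combinations

def select_measures(measures, budget, target_residual, total_risk):
    """Pick the cheapest subset of measures whose combined risk
    reduction brings residual risk down to the target level."""
    best = None
    names = list(measures)
    for r in range(len(names) + 1):
        for subset in combinations(names, r):
            cost = sum(measures[m][0] for m in subset)
            reduction = sum(measures[m][1] for m in subset)
            if cost <= budget and total_risk - reduction <= target_residual:
                if best is None or cost < best[1]:
                    best = (subset, cost)
    return best

# Hypothetical measures: name -> (cost, risk reduction), arbitrary units.
measures = {
    "firewall upgrade": (30, 40),
    "staff training":   (10, 25),
    "encryption":       (20, 30),
}
print(select_measures(measures, budget=35, target_residual=50, total_risk=100))
```

Brute force is fine for a handful of measures; a real deployment with many candidate measures would need integer programming or a heuristic.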

P.21  Going further than Physical and Cyber Connections: Consideration of Logical Interdependencies. Lewis LP*, Petit FD, Berry MS; Argonne National Laboratory

Abstract: Infrastructure interdependencies are fundamental considerations when assessing regional resilience. Most assessments focus solely on the physical, cyber, and geographic interdependencies existing between infrastructure systems. Although a fourth class of connections, logical interdependencies, has been identified in scholarship, it has yet to be integrated into resilience and risk assessment methodologies; the term logical dependency is widely used but has seen little refinement beyond its initial identification. The lack of deeper inquiry into the human interests and activities that define these logical dependencies, such as business continuity principles, economic market forces, societal aspirations, equal access, and distributive justice, is a significant deficiency in the holistic understanding of community resilience we seek to build. A multidisciplinary or “socio-technical” point of view is needed to fully elucidate the extensive range of influences acting upon infrastructure, from the individual asset to the sector level. Refining the concept of logical dependency and defining the elements characterizing this type of critical infrastructure relationship are essential steps toward drawing connections between infrastructure and its management, from the operator to the policy-maker. Novel assessments are being developed that incorporate the social, behavioral, economic, political, and legal forces that influence and are impacted by the strategic management of critical infrastructure. Developing this capability will enable policy-makers, economic actors, infrastructure operators, and community planners to draw more meaningful and actionable conclusions about the fundamental relationship between critical infrastructure sectors and their impact on community resilience.
The objectives of this presentation are to define the concepts and propose a framework for identifying and characterizing system logical dependencies to enhance regional resilience.

P.22  Prioritizing Chemical Residue Testing in Meat, Poultry, and Egg Products. Ward L*, LaBarre D, Duverna R, Muniz Ortiz JG, Kishore R, Kause J, Catlin MC; USDA FSIS Office of Public Health Science

Abstract: As the public health agency of the United States Department of Agriculture, the Food Safety and Inspection Service (USDA-FSIS) is responsible for ensuring that the nation’s commercial supply of meat, poultry, and egg products is safe, wholesome, and correctly labeled and packaged. The interagency National Residue Program (NRP), which is organized and administered primarily within FSIS, is responsible for identifying and prioritizing chemical hazards of concern for FSIS-regulated product classes, for collecting FSIS laboratory data describing residues detected in meat and poultry samples, and for analyzing and reporting those data to ensure that permissible levels of hazardous compounds are not exceeded. Recent audits of the NRP have highlighted the need for a more transparent prioritization strategy for chemical residues than its prior reliance on expert judgment. FSIS is shifting toward a more systematic, transparent, and public health-focused risk ranking scheme. FSIS, in collaboration with the Environmental Protection Agency and the Food and Drug Administration, is currently evaluating two risk ranking approaches. The first, the Public Health-Based (PH) model, is a straightforward combination of exposure and toxicity characteristics, with a relative public health risk score generated for each evaluated chemical. The second, the Latent Variable (LV) model, uses the same chemical characteristics and parameters as the PH model but inputs them into a more complex statistical model that approximates the probability of an adverse event occurring with exposure to observed chemical residue levels. Despite their many differences in construction, the PH and LV methods generate highly similar relative public health risk rankings. A comparison of the two approaches and their outputs will be presented.
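The PH-style ranking, described as a straightforward combination of exposure and toxicity, can be sketched as a product of two normalized indices. The residue names and index values below are invented for illustration and do not come from the NRP:

```python
def ph_rank(chemicals):
    """Rank chemicals by a simple relative public-health score:
    the product of an exposure index and a toxicity index."""
    scored = {name: exp * tox for name, (exp, tox) in chemicals.items()}
    return sorted(scored, key=scored.get, reverse=True)

# Hypothetical residues with (exposure index, toxicity index) on 0-1 scales.
chemicals = {
    "residue A": (0.8, 0.3),   # high exposure, low toxicity
    "residue B": (0.2, 0.9),   # low exposure, high toxicity
    "residue C": (0.5, 0.6),   # moderate on both axes
}
print(ph_rank(chemicals))  # highest relative risk first
```

A multiplicative score is only one possible combination rule; the actual PH model may weight or transform its inputs differently.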

P.23  Key role of capacity building and participation in promoting the improvement of articulated risk and impact assessment system in Western Mexico. Clausen JE*, Gomez Quiroga G; ITESO University

Abstract: Articulated risk and impact assessments are part of good governance systems. The Guadalajara Metropolitan Area (GMA) is the second largest in Mexico. The ongoing urbanization of its periurban territories has intensified air emissions, the vulnerability of local aquifers to depletion and pollution, the degradation of the best agricultural soils, and substantial loss of forest coverage, endangering one of the GMA's main sources of good-quality drinking water and its capacity for resilience to climate change. In recent years, collaborative efforts with the UN have sustained a new vision for regional spatial and urban planning in the GMA, incorporating a new approach to environmental governance and an articulated system of risk and impact assessment. Urban models, mixed land use, proximity agriculture, sustainable transportation, environmental health, and integrated water resources management are all issues valued during the risk assessment scoping process. Ongoing capacity building on risk and impact assessment at local campuses, collaboration with international organizations, and enriched NGO and citizen participation have all contributed to triggering the political will of government to assess and improve the structure and performance of the regional decision analysis and risk assessment system through legislation, from a best-practice perspective. This poster discusses whether investing in capacity building and participation in general can nudge governments to take the next step and introduce risk assessment legislation.

P.24  IRGC resource guide on resilience. Florin MV, Linkov I*; IRGC, Switzerland and US Army Engineer R&D Center, Boston

Abstract: Responses to disasters, both natural and technology-related, often show the limitations of traditional risk assessment and management. In the context of risk, resilience has been discussed as both a supplement and an alternative to conventional risk management. IRGC describes resilience as a risk management strategy for situations in which there is much uncertainty about impacts and a need to prepare to cope with surprises. Both governments and industry explicitly call for resilience-based risk management. Even though the field is still evolving, there is a need to map risk and resilience in the context of governance and to summarize how resilience has been manifested, managed, and measured in different fields and sectors. The IRGC ‘Resilience In And For Risk Governance’ (RIARG) resource guide stresses the importance of including resilience as an important component of the risk governance process, including in research, policy, strategies, and practices. IRGC’s objective with the guide is to provide an annotated bibliography of existing ideas and tools for integrating risk and resilience and for measuring resilience and the effectiveness of actions taken to build it. The guide focuses in particular on metrics for resilience assessment and instruments for resilience management, and it aims to encourage the development of methods for resilience quantification. The resource guide is composed of invited authored papers, which the poster will present, highlighting both the variety of approaches to resilience and their common features and dynamics. It is designed to help scientists and practitioners working on risk governance and resilience evaluation by giving them background information on the various perspectives and guiding them to the best available literature sources. The resource guide was developed in 2016 and will be launched at IDRC 2016.

P.26  Race/ethnicity and climate change polarization: Evidence from a U.S. survey experiment. Schuldt JP*, Pearson AR; Cornell University

Abstract: Social identities and group affiliations have been acknowledged to play an important role in public disagreements on climate change and other prominent scientific issues, perhaps none more so than political orientation (political ideology and party identification). Less research, however, has examined fundamental social identities besides politics that may matter just as much—if not more—in the ongoing debate over climate change. Drawing on research suggesting that concern about climate change is stronger among U.S. non-Whites relative to Whites, we explored whether the climate beliefs of U.S. racial and ethnic minorities would be less sensitive to political ideology and other known polarizing factors (labeling the phenomenon as “global warming” rather than “climate change”). Analyzing data from a large U.S. national survey experiment (n = 2,041), we found that political ideology (liberalism-conservatism) was a weaker predictor of non-Whites’ (vs. Whites’) beliefs for every climate change opinion metric we surveyed—namely, belief in its existence, perceptions of the scientific consensus, and support for climate mitigation policy. Additional analysis revealed that non-Whites were significantly less likely to personally identify as an “environmentalist” than were Whites—a variable that was less strongly correlated with political ideology among non-Whites—and that non-Whites were less likely to perceive a scientific consensus on climate change. Further, although Whites were less likely to report believing in the existence of “global warming” as compared to “climate change,” labeling did not influence the existence beliefs of non-Whites, a pattern that remained when controlling for key covariates (including political ideology and party identification). Our findings highlight a need to better understand how different social groups interpret environmental risks and respond to common climate-related messages, while offering insights into the dynamics of U.S. public opinion on climate change amid rapidly shifting demographics.

P.27  Putting on your thinking cap: completing a warm-up reasoning task produces critical but biased evaluations of scientific evidence. Drummond C*, Fischhoff B; Carnegie Mellon University

Abstract: Prior research suggests that the motivation to maintain personal beliefs leads individuals to evaluate scientific evidence more critically when it opposes their beliefs (Lord, Ross & Lepper, 1979; Kunda, 1990). We test whether a “putting on your thinking cap” manipulation, in which participants complete a warm-up scientific reasoning task before reading scientific evidence, reduces the degree to which participants’ evaluations are biased in favor of pro-attitudinal scientific evidence. In our experiment, proponents (N = 302) and opponents (N = 303) of the Affordable Care Act (ACA) were recruited to take an online survey. Participants were randomly assigned to read a news article describing one of two real scientific studies. The studies used nearly identical methods and data, but only one found positive effects of the ACA on access to healthcare. In the “thinking cap” condition, participants completed a warm-up reasoning task, the Scientific Reasoning Scale (SRS; Drummond & Fischhoff, 2015), which consists of eleven short scientific reasoning problems, before reading the news article. In the control condition, the SRS was administered at the end of the study. We found that participants who read pro-attitudinal articles judged the scientific evidence to be of higher quality than those who read counter-attitudinal articles, and that the magnitude of this bias did not differ between the “thinking cap” and control conditions. Instead, we found a main effect of the “thinking cap” condition such that participants who took the SRS before reading the article judged the science to be of lower quality. Our results indicate that putting your thinking cap on increases critical thinking but fails to reduce bias due to prior beliefs. We discuss the implications of this research with regard to public skepticism of science.

P.28  Public perceptions of clean energy technologies. Abdulla A*, Vaishnav P; UC San Diego and Carnegie Mellon University

Abstract: In order to deeply decarbonize the world's energy system by the end of this century, it is likely that all available low-carbon technologies will have to be deployed to some extent. Each of these must contend with different technical, economic, and institutional challenges. Moreover, each has a different risk profile and therefore a different level of social acceptability. There is a substantial literature that examines public attitudes toward technologies that are considered particularly controversial. Of these findings, perhaps the most replicated and undisputed is nuclear power's unique status as a technology that engenders "dread." In this paper, we recruit respondents and randomly assign them to one of two groups. The first group is given information about the risk profile of each of the clean technology options available for mass deployment. We focus on the following five risk categories: economic cost, service reliability, health impacts, environmental emissions, and land use implications. The second group is given the same risk profile information, but the names of the options are withheld. In each case, we ask respondents to rank the technologies in order of preference. We then task them with creating a clean energy portfolio to meet U.S. electricity demand in 2030 under an emissions constraint. Upon completion of the protocol, the names associated with each technology are revealed to the second group, and they are offered the option of adjusting their preferences. The results of this investigation will allow policymakers to strike the right balance between allocating resources to reducing the actuarial risk associated with different energy technologies versus addressing public perceptions of those risks.

P.29  Game-theoretic model for attack and defense of smart grids at three levels. Shan X*, Zhuang J, Rao N; University of Houston - Clear Lake and State University of New York at Buffalo and Oak Ridge National Laboratory

Abstract: As society relies more on electric technologies, efficient generation and delivery of electric power becomes increasingly important. Smart grids provide a promising solution to increasing electricity needs. While smart grids have a number of advantages over traditional grids, one of their main disadvantages is susceptibility to cyber attacks, which have not been the focus of studies on smart grids. In this paper, a game-theoretic model is developed to identify optimal defenses and attacks at three different levels (i.e., electricity generation plants, transmission systems, and distribution systems). We define parent and child networks: for example, the electricity generation network is the parent network of the transmission network, which is in turn the parent network of the distribution network. Network failure can be due to direct attacks, cascading failure caused by intense attacks on a parent network, or inadequate maintenance. We identify the best responses and equilibrium strategies of both the attacker and the defender, who interact at the three system levels. The results show that the best response of the defender is a function not only of direct attacks but also of spread from connected networks. We also conduct sensitivity analyses of the equilibrium strategies. Results show that if the probability of a successful attack against electricity generation plants is above a certain level, the defender enhances efforts to protect electricity generation plants; the attacker's effort at any of the three levels, on the other hand, is not influenced by this probability. This paper yields interesting insights for modeling and analyzing the strategic interactions between the attacker and the defender of smart grid networks, which play an increasingly important role in modern societies.
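The best-response logic in a three-level attack-defense game can be sketched with a toy payoff matrix. The damage numbers below are purely illustrative, not the functional forms of the actual model; they are chosen so that protecting the parent (generation) network also limits cascading damage to the child networks:

```python
# Toy three-level attacker-defender game with best-response iteration.
LEVELS = ["generation", "transmission", "distribution"]

# damage[a][d]: expected damage when the attacker targets level a while
# the defender concentrates protection on level d (hypothetical values).
damage = {
    "generation":   {"generation": 2, "transmission": 8, "distribution": 9},
    "transmission": {"generation": 4, "transmission": 5, "distribution": 7},
    "distribution": {"generation": 3, "transmission": 6, "distribution": 4},
}

def attacker_best(d):
    """Attacker maximizes expected damage given the defended level."""
    return max(LEVELS, key=lambda a: damage[a][d])

def defender_best(a):
    """Defender minimizes expected damage given the attacked level."""
    return min(LEVELS, key=lambda d: damage[a][d])

def pure_equilibrium(max_iter=50):
    """Alternate best responses; return (attack, defense) at a fixed point."""
    a = d = LEVELS[0]
    for _ in range(max_iter):
        a_new = attacker_best(d)
        d_new = defender_best(a_new)
        if (a_new, d_new) == (a, d):
            return a, d
        a, d = a_new, d_new
    return None  # no pure-strategy equilibrium found

print(pure_equilibrium())
```

With these numbers the iteration converges to the attacker targeting transmission while the defender protects generation; in general a zero-sum matrix like this may have only mixed-strategy equilibria, which the sketch does not compute.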

P.30  Adversarial hypothesis testing. González-Ortega J*, Ríos Insua D, Cano J; Instituto de Ciencias Matemáticas and Universidad Rey Juan Carlos

Abstract: Hypothesis testing is one of the standard problems in statistical inference. Though not exempt from debate, it is a thoroughly studied problem from a decision-theoretic perspective, from both the frequentist and the Bayesian points of view, stemming from the seminal work of Wald (1950). Drawing on recent developments in cybersecurity, there has been an upsurge in applied areas that may be jointly termed adversarial signal processing. This covers applications from online fraud detection to steganography, through spam detection and watermarking, among many others. Several of the issues posed may be seen as hypothesis testing problems in which hostile adversaries somehow perturb the data observed by a decision maker so as to confound him about the relevant hypothesis. However, attempts in this area have focused mostly on zero-sum, game-theoretic minimax approaches to hypothesis testing, which is not satisfactory since losses for the various agents will typically be asymmetric. Moreover, the beliefs and preferences of the adversary will not be readily available, violating the common knowledge assumption of game theory. Thus, key assumptions of the proposed solution approaches do not hold. Using recent concepts from Adversarial Risk Analysis (ARA), we provide a novel approach to the Adversarial Hypothesis Testing (AHT) problem. We consider an agent, called the Defender, who needs to ascertain which of several hypotheses holds, based on observations from a source that may be perturbed by another agent, whom we designate the Attacker. We study the AHT problem from the Defender's perspective. In doing this, we observe that the Defender needs to forecast the Attacker's decision, which we do by simulating from the corresponding Attacker's decision-making problem. Our approach is illustrated through a model for batch acceptance.
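The ARA step of forecasting the Attacker by simulation can be sketched for a binary test. Everything below is a stylized assumption, not the authors' model: two Gaussian hypotheses, an Attacker who shifts the signal back toward H0 only under H1, and an Attacker cost drawn from a uniform distribution to represent the Defender's uncertainty about him:

```python
import math
import random

def normal_pdf(x, mu, sigma=1.0):
    """Density of N(mu, sigma^2) at x."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def simulate_attack_prob(n=10000, gain=1.0, seed=0):
    """ARA step: the Defender forecasts the Attacker by simulating the
    Attacker's decision problem.  The Attacker's unknown cost of perturbing
    is modeled as Uniform(0, 2); he perturbs whenever the sampled cost is
    below his assumed gain from confounding the Defender."""
    rng = random.Random(seed)
    return sum(rng.uniform(0, 2) < gain for _ in range(n)) / n

def posterior_h1(y, p_attack, prior_h1=0.5):
    """Posterior probability of H1 given one observation y.  Under H0 the
    signal has mean 0; under H1 it has mean 1, but with probability
    p_attack the Attacker has shifted it back to mean 0 to mimic H0."""
    like_h0 = normal_pdf(y, 0.0)
    like_h1 = p_attack * normal_pdf(y, 0.0) + (1 - p_attack) * normal_pdf(y, 1.0)
    num = prior_h1 * like_h1
    return num / (num + (1 - prior_h1) * like_h0)

p_att = simulate_attack_prob()
print(round(p_att, 2), round(posterior_h1(1.0, p_att), 3))
```

The point of the construction is that the Defender never assumes common knowledge: the Attacker's behavior enters only through the simulated probability of perturbation.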

P.31  Implementation of a decision support tool for sustainable remediation in practice - lessons learned. Norrman J*, Söderqvist T, Volchko Y, Rosén L, Franzén F; 1, 3, 4) Chalmers University of Technology; 2,5) Enveco Environmental Economics Consultancy

Abstract: Sustainable Choice of REmediation (SCORE) is a multi-criteria decision analysis tool developed for evaluating the economic, environmental, and social sustainability of remediation strategies at contaminated sites. In the SAFIRE research project, SCORE is applied and evaluated in five real case studies to compare and evaluate potential remedial strategies, with input from and together with stakeholders. One of the case studies is the Southern area of the BT Kemi industrial site in Teckomatorp, in the municipality of Svalöv, Southern Sweden. The company BT Kemi manufactured pesticides from 1965 and secretly buried oil drums containing organic pollutants. The site is associated with a famous Swedish environmental scandal that culminated in 1977. The factory was closed and the site was remediated at that time, but it is still not fully remediated today; the final remediation will take place in the coming 2-3 years. The project is publicly funded and run by the municipality. The project acknowledges that psychology is an important factor, and one of its main goals is to change attitudes toward the village and its bad reputation. SCORE is applied in the Southern area of the BT Kemi site: social sustainability is evaluated with selected stakeholders in a workshop, economic sustainability is evaluated by means of a cost-benefit analysis with input from stakeholders, and ecological sustainability is evaluated together with consultants and the project leader based on the environmental risk assessment. Additional information from the public for the assessment of social and economic sustainability is collected by means of a questionnaire. Feedback on the process, the SCORE tool, and its input into the decision process is collected after the SCORE analysis. The lessons learned from this case study, together with the other case studies, will help improve both the process of applying SCORE and the SCORE tool itself.

P.32  Is sustainable remediation of contaminated land more efficient? Anderson R*, Norrman J, Rosén L, Volchko Y; Chalmers University of Technology

Abstract: It is estimated that there are 80,000 potentially contaminated sites in Sweden, of which approximately 1,300 are considered to pose substantial risk to human health and the environment. Soil remediation reduces the negative impacts of contaminants on humans and ecosystems; however, the process itself often results in other negative effects, such as large environmental footprints and high costs to society. Increased focus on implementing sustainable remediation solutions has been seen internationally in recent years. SCORE (Sustainable Choice of REmediation), developed at Chalmers University of Technology, is a multi-criteria decision analysis tool for assessing the economic, environmental, and social sustainability of remediation alternatives, incorporating social cost-benefit analysis and uncertainty analysis. In addition to sustainability, the Swedish EPA is also concerned with the efficiency of remediation, given the slow progress, low level of innovation, and high costs of state-funded projects. The main objective of the SAFIRE research project is to evaluate whether sustainability assessments can improve the efficiency and effectiveness of site remediation. Sustainability assessments were performed on four real case sites in Sweden using the SCORE method together with stakeholders. Alternatives from each site were then evaluated and compared using a number of efficiency and effectiveness indicators. In choosing the indicators, focus was placed on risk reduction and on reaching stakeholders’ project-specific goals. Results of the efficiency and effectiveness analysis of the case study sites are presented. The study aims to demonstrate how remediation sustainability and remediation efficiency are related, an important question for national remediation programs worldwide.

P.33  Developing a Predictive Model to Detect Mishandling in the Self-reported Water Discharge Data. Abouali M*, Mitchell J, Nejadhashemi AP, Hatami P, Gibbs C, Rivers L; Michigan State University and North Carolina State University

Abstract: The National Pollutant Discharge Elimination System (NPDES) regulates the level of pollutant discharges from point sources into the waters of the U.S. in order to protect public health and the environment. However, the NPDES program relies heavily on self-reported data without robust platforms to assess the integrity of that data, which could pose risks with adverse water quality impacts. Therefore, the need for data-driven methods to support regulatory enforcement is an important area of research. The goal of this study is to build a model capable of detecting fraudulent or mishandled data from wastewater treatment plants based on self-reported discharge data from one state environmental agency. Longitudinal data for water quality parameters like dissolved oxygen, nitrate, BOD, and phosphorus over several years were incorporated in a correlation analysis with precipitation and climate data. Based on the correlation analysis, causal statements were identified, which could subsequently be tested using stochastic regressors through regression analysis or structural models where appropriate mechanisms are well established. Hence, the final models will be able to estimate plausible probability distributions for dependent water quality parameters given a set of probability distributions for independent variables. Validation of the models is facilitated using methods of cross validation by splitting the time series across several years of self-reported data. Future work includes establishing thresholds for consistency with predicted distributions that can be used to support decisions about the veracity of future streams of self-reported data.
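One simple instance of the residual-based screening this abstract describes is to regress a reported water-quality parameter on precipitation and flag reports that deviate sharply from the fitted relationship. The data below are invented, and ordinary least squares with a fixed residual threshold stands in for the study's fuller stochastic and structural models:

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

def flag_suspicious(xs, ys, k=2.0):
    """Flag reports whose residual from the fitted relationship exceeds
    k standard deviations: candidates for integrity review, not proof
    of mishandling."""
    a, b = fit_line(xs, ys)
    resid = [y - (a + b * x) for x, y in zip(xs, ys)]
    sd = (sum(r * r for r in resid) / len(resid)) ** 0.5
    return [i for i, r in enumerate(resid) if abs(r) > k * sd]

# Hypothetical monthly precipitation (mm) vs. reported nitrate (mg/L);
# the report at index 5 breaks the otherwise tight relationship.
precip  = [10, 20, 30, 40, 50, 60, 70, 80]
nitrate = [1.0, 1.5, 2.0, 2.5, 3.0, 9.0, 4.0, 4.5]
print(flag_suspicious(precip, nitrate))  # -> [5]
```

A flagged index only says the report is inconsistent with the fitted pattern; deciding whether that reflects fraud, equipment failure, or a genuine event is the enforcement question the abstract leaves to further analysis.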

P.34  VRAKA – a method for environmental risk assessment of potentially polluting shipwrecks. Landquist H*, Rosén L, Lindhe A, Hassellöv I-M; Chalmers University of Technology

Abstract: Shipwrecks around the world containing unknown amounts of hazardous substances pose an increasing risk of polluting the marine environment. Many lie far below the sea surface, and the amounts of hazardous substances they contain may be uncertain or unknown. Mitigation operations on these shipwrecks are costly, so risk assessment and decision support for prioritization are needed. A holistic method for risk assessment of shipwrecks should encompass both the probability of a discharge and its consequences. The aim of this study was therefore to develop VRAKA (short for shipwreck risk assessment in Swedish), a comprehensive method for probabilistic risk assessment of shipwrecks. The method consists of two parts. The first is a tool to estimate the probability of a release of hazardous substances from a wreck, based on a fault tree model combining site- and wreck-specific information with activities that might damage the shipwreck; input information for the fault tree has been derived by expert elicitation. The second part can be applied to estimate the consequences of a discharge and can be performed in three tiers depending on available resources: an initial approach combines the probability of release with the potential amount of hazardous substances contained; the next tier involves an environmental sensitivity matrix; and the third tier combines an advanced oil spill trajectory tool with coastal sensitivity estimations. The probabilistic approach of VRAKA, facilitated by the fault tree model, yields a distribution of probable outcomes rather than a point value, facilitating the incorporation of the uncertainties involved in shipwreck risk assessment. VRAKA can deliver decision support regarding risk mitigation of potentially polluting shipwrecks while taking uncertainties into account. It offers a comprehensive approach facilitating the prioritization of shipwrecks for risk reduction measures, enabling efficient resource use.
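The fault-tree combination in the first part can be sketched with standard AND/OR gates under an independence assumption. The gate structure and probabilities below are invented for illustration; VRAKA's actual tree and inputs come from expert elicitation:

```python
def p_or(ps):
    """OR gate: probability that at least one independent event occurs."""
    q = 1.0
    for p in ps:
        q *= (1.0 - p)
    return 1.0 - q

def p_and(ps):
    """AND gate: probability that all independent events occur."""
    out = 1.0
    for p in ps:
        out *= p
    return out

# Top event: release of hazardous substances from the wreck.
# Hypothetical annual probabilities for damage-causing activities.
p_trawling_damage  = p_and([0.30, 0.20])  # trawling occurs AND penetrates hull
p_anchor_damage    = p_and([0.10, 0.50])  # anchoring occurs AND strikes wreck
p_corrosion_breach = 0.05                 # hull breach from corrosion alone

p_release = p_or([p_trawling_damage, p_anchor_damage, p_corrosion_breach])
print(round(p_release, 3))
```

Replacing the point probabilities with distributions (e.g., sampling each basic-event probability) turns the same tree into the distribution of outcomes the abstract emphasizes.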

P.35  The consequences of climate change-driven land-use shifts in New England forests. Borsuk ME*, Thompson JR, Kittredge DB, Lindsay M, Orwig DA, Foster DR; Dartmouth College

Abstract: Climate change is generating new opportunities, risks, and uncertainties for forest landowners. We are exploring how climate change is shifting land-use regimes in New England by altering human decision-making and how these changes, in turn, are affecting regional forest ecosystems and the provisioning of ecosystem services. In particular, we investigate how landowners respond to the direct impacts of climate change (including increased disturbance frequency) as well as to an indirect, socially-mediated impact in the form of carbon offset credits. We simulate the impacts of this changing climate and landowner behavior to quantify their effects on regional forest carbon stores, forest structure and composition, and timber yields. The project couples a multi-agent model to a process-based regional forest landscape model. The former is being used to propagate the behaviors of different land-owner functional types, while the latter is serving to assess the land-use regimes as they interact with climate change and natural ecosystem dynamics. Information to parameterize the models is coming from several social science data collection activities. One overarching hypothesis of our study is that forest land-use change in response to climate change is having greater near-term ecological consequences than climate change itself.

P.36  Should we design for the 100-year flood? Xian SY*, Small MJ, Lin N; Princeton University; Carnegie Mellon University; Princeton University

Abstract: Flood adaptation measures are imperative to deal with increasing flood risk. However, current flood adaptation strategies may be far from optimal. First, estimates of future flood hazards are often made assuming a stationary mean sea level or a uniform global rate of increase, without consideration of local variations. Second, flood adaptations are based on design storms with specified current or future flood return levels rather than on the full distribution of flood hazards. In this study, we provide a framework that integrates the estimation of flood hazards under uncertain local sea level with the calculation of flood return levels and optimal protection levels for the future. The optimal mitigation level is defined as the level that minimizes the combined cost of mitigation and future expected losses (net present value). We first estimate the annual probability of flooding and the storm surge and damage associated with the full range of flood return levels (e.g., 50 and 100 years). The optimal mitigation level is then determined for a future assessment period corresponding to the next 30 years. Alternative realizations of flooding (from Monte Carlo simulation) are used to evaluate the return-period design and the optimal mitigation design across the multiple simulations. Each realization of future floods generates an amount of regret, or absolute loss, associated with over-mitigation or under-mitigation. The distributions of regret and absolute loss are obtained for the two approaches to protection design: i) mitigate for a design storm; and ii) choose optimal mitigation across the distribution of storm events. We apply this framework to two properties, in NYC and Florida, as an illustration and comparison. The results show that the economically optimal design is more favorable than the commonly used flood return levels. The study suggests that decision makers should take into account the uncertainty of local sea level rise and the economic optimum when flood adaptation is prescribed.
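The contrast between a return-period design and an economically optimal one can be sketched with a minimal Monte Carlo example. All inputs are hypothetical (an exponential surge distribution, linear mitigation cost, fixed damage, 3% discount rate) and not the paper's calibrated values:

```python
import random

def expected_cost(elev, surges, damage=1.0e6, cost_per_m=5.0e4,
                  years=30, rate=0.03):
    """Mitigation cost plus discounted expected flood losses (NPV) for a
    protection elevation `elev`, using simulated annual-maximum surges."""
    p_exceed = sum(s > elev for s in surges) / len(surges)
    npv_loss = sum(p_exceed * damage / (1 + rate) ** t
                   for t in range(1, years + 1))
    return cost_per_m * elev + npv_loss

rng = random.Random(1)
surges = [rng.expovariate(1.0) for _ in range(20000)]  # annual max surge (m)

# 100-year design level (empirical 99th percentile) vs. economic optimum.
design_100yr = sorted(surges)[int(0.99 * len(surges))]
candidates = [e / 10 for e in range(0, 101)]           # 0.0 .. 10.0 m
optimal = min(candidates, key=lambda e: expected_cost(e, surges))
print(round(design_100yr, 2), optimal)
```

With these toy numbers the cost-minimizing elevation exceeds the 100-year level, echoing the abstract's conclusion that the economic optimum can differ from the conventional return-level design; with other cost or damage assumptions the ordering could reverse.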

P.38  Portfolio Analysis for Research Prioritization: Application to NOAA Fisheries. Wood MD*, Foran CM; US Army Engineer Research & Development Center

Abstract: NOAA Fisheries is responsible for understanding and monitoring the wellbeing of species and ecosystems in the Nation’s continental waters while considering the priorities of the organization, the federal government, and the regional economies that rely on commercial fishing, tourism, and other activities. Every year, NOAA Fisheries develops and executes a portfolio of cruises for each of its six Fisheries Science Centers. These cruise portfolios must maximize scientific understanding and meet stakeholder requirements while maintaining flexibility under changing organizational, scientific, and budgetary conditions. We present a portfolio analysis tool developed for NOAA Fisheries that uses leadership preferences, expressed as weights elicited in structured interviews, and historic information from past years’ cruise prioritizations, expressed as optimized weights derived through a genetic algorithm. The result was a prioritized list of cruises for each of the Fisheries Science Centers and for the Nation as a whole, which was successfully used to inform which cruises were approved for allocated NOAA Fisheries funding in FY17. Additional findings and lessons learned will be discussed.
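The core prioritization step can be sketched as a weighted-sum ranking followed by funding in priority order. The criterion names, weights, cruise ratings, and costs below are all hypothetical, and greedy selection stands in for whatever allocation rule the tool actually applies:

```python
def prioritize(cruises, weights, budget):
    """Score each cruise as a weighted sum of its criterion ratings, then
    fund cruises in priority order until the budget is exhausted."""
    def score(c):
        return sum(weights[k] * c["ratings"][k] for k in weights)
    ranked = sorted(cruises, key=score, reverse=True)
    funded, remaining = [], budget
    for c in ranked:
        if c["cost"] <= remaining:
            funded.append(c["name"])
            remaining -= c["cost"]
    return [c["name"] for c in ranked], funded

# Hypothetical leadership weights and cruise ratings (0-10 scales).
weights = {"science": 0.5, "stakeholder": 0.3, "flexibility": 0.2}
cruises = [
    {"name": "survey A", "cost": 4,
     "ratings": {"science": 9, "stakeholder": 6, "flexibility": 5}},
    {"name": "survey B", "cost": 3,
     "ratings": {"science": 6, "stakeholder": 9, "flexibility": 7}},
    {"name": "survey C", "cost": 5,
     "ratings": {"science": 7, "stakeholder": 5, "flexibility": 9}},
]
ranking, funded = prioritize(cruises, weights, budget=7)
print(ranking, funded)
```

In the tool described above, a second set of weights would come from a genetic algorithm fit to past prioritization decisions; combining the two weight sets is a design choice this sketch omits.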

P.39  Visualization of Life Cycle Assessment (LCA) Output. Brondum M*, Wood M, Linkov I; United States Army Corps of Engineers

Abstract: Practitioners of Life Cycle Assessment (LCA) are presented with the dual mission of conducting elegant analyses that veridically capture the environmental impacts of a process, and developing information that enables decision makers to come to the conclusion that is best for their organization. Practitioners’ disposition toward more detail, combined with the inherent complexities of the analysis, produces a scenario in which decision makers are overwhelmed with information that makes their task more difficult (rather than easier) to accomplish. This phenomenon has been described as information overload and can lead a decision maker to use heuristic shortcuts that result in biased decisions. By simplifying data presentation, decision makers will be less likely to experience information overload. The effectiveness of information presented to decision makers follows an inverse bell curve performance function, where very small or very large volumes of information can handicap decision making capacity. One of the benefits of LCA is its ability to incorporate massive amounts of data, but that same benefit can become a hindrance if the results of an analysis are not presented carefully. In order to make effective use of the information supplied by LCA, tools are required to integrate the results in a clear manner that facilitates rational decision making. The goal of this presentation is to provide guidance on presenting LCA results in a way that enables decision makers in munitions acquisitions and range management to interpret and understand the impacts of munitions and training activities on environmental safety and occupational health (ESOH). Efforts are being made to improve information for munitions acquisitions and range managers to make informed decisions. However, these improvements will only be useful to the extent that the improved information can be effectively communicated to those decision makers.

P.40  Advances in risk assessment of farm product and biota intake in SADA version 6. Bolus KA*, Manning KL, Stewart RN, Dolislager FG, Walker SA; Oak Ridge National Laboratory

Abstract: The Spatial Analysis and Decision Assistance (SADA) freeware program is a joint research and development effort between the Oak Ridge National Laboratory and the University of Tennessee. For nearly two decades, SADA has enabled environmental risk assessors (over 18,000 registered) to situate risk and decision analytics entirely within a spatial context. SADA represents a substantial integration of toxicological data, risk models, and advanced geospatial methods, resulting in new approaches for directly developing risk-informed sample designs, remedial designs, cost analysis, and uncertainty analysis within an open modeling environment. In the upcoming Version 6, supported by the U.S. EPA, modernizations include expanded graphics capabilities, integration of population data and distribution models, and improved chemical and radiological risk and dose models. Risk upgrades include substantial advancements in biota modeling. Previously, produce intake rates were based on general fruit and vegetable consumption rates. Now, produce intake rates are derived from 22 individual produce items that contribute to the overall produce risk. Mass loading factors have been expanded from a single MLF that was applied to all produce to 22 individual mass loading factors that correspond with the specific produce items that now make up the produce intake rates. Additionally, animal product intake rates have been added to the models, including goat milk, sheep milk, duck, mutton, goat meat, rabbit, turkey, and venison. Formerly, the plant transfer factors used in these models were element-specific only. Now, transfer factors are specific to element, soil type, and climate zone. These new transfer factors are from the recent TRS-472 and TRS-479 from the IAEA and from Science Report SC030162/SR2 from the Environment Agency of the UK, and were used to replace most of the NCRP generic values.
Future plans are to utilize the spatial awareness of the location of each site assessed in SADA and apply appropriate site-specific exposure conditions to the assessment.

P.41  Optimizing Resources: An Environment, Health & Safety Risk Model. Pierce A*, Warshaw C, Posin L, Hancock G; General Electric Co. and Gnarus Advisors

Abstract: Are you facing challenges allocating resources and prioritizing support in a constrained environment? The GE Environmental, Health and Safety (EHS) function faced the same challenges as it moved from a business support model to a shared service, and it continues to face them today in a dynamic business environment. GE partnered with Gnarus Advisors to develop a risk model that provides objective insights into the risk of its fixed facilities and services operations. The output of the model is a ranking of operations globally, by business, region, and EHS media, aimed at informing resource allocation, program development, and support. This session will focus on the model’s development and real-world application, with a special focus on the governance program.

P.42  Comparison of evaluation functions for setting priority of risk management. Maeda Y*, Muramatsu G; Shizuoka University

Abstract: In a risk management process, risk managers first recognize and describe the risks they intend to deal with, second assess the magnitudes of these risks, third plan risk control measures, and fourth carry out the plan. In the third step, they have to determine which risks are preferentially treated and which are not treated but accepted. In other words, they have to rank the risks. How should they do this? In this research, five evaluation functions for ranking risks, that is, magnitude of risk, cost-effectiveness of risk control measures, benefit-risk ratio, voting, and majority judgement, were used to rank twenty risks and then compared. The twenty risks were chosen from the results of a questionnaire survey in which university students were asked to "enumerate risks around you." These risks were ranked using the five evaluation functions. For voting and majority judgement, the results were obtained from the answers to another survey of the same students. As a result, we obtained the following remarks. First, the ranks of risks derived from these evaluation functions differ from each other. In particular, ranks based on rational approaches, i.e., magnitude of risk, cost-effectiveness, and benefit-risk ratio, are entirely different from ranks derived from the other functions, voting and majority judgement. Second, the risk of nuclear plants is ranked high by voting but low by the other four functions. Third, the risk of traffic accidents is ranked high by most functions, while the benefit-risk ratio ranks it low. These results suggest the importance of risk evaluation policy in the risk management process.

P.43  Siting High-Level Radioactive Waste Disposal Facilities: 50 Years of Failure. Luk SY, Mumpower JL*; Texas A&M University

Abstract: According to The Blue Ribbon Commission on America’s Nuclear Future, disposal of high-level radioactive waste (HLRW) is a problem for which "we know what to do, we know we have to do it, and we even know how to do it." But for more than fifty years, the U.S. has failed to find a way to dispose of commercial HLRW. Despite having the necessary knowledge, experience, and financial resources, the U.S. has no place to dispose of approximately 70,000 metric tons of accumulated HLRW and the 2,200 metric tons added annually. It is now arguably further from a solution than it was a decade ago. The Department of Energy filed a license application in 2008 for construction of a HLRW repository at Yucca Mountain, Nevada, but in 2010 requested withdrawal of the application. The rest of the world has done little better. No country has yet achieved a full, satisfactory resolution of the problem. A handful of countries (Finland, France, and Sweden) appear to have success in sight, but it is still years or decades off. In contrast, the U.S. is also home to a success story. The Waste Isolation Pilot Plant (WIPP) near Carlsbad, New Mexico, is a deep geological repository that disposes of the nation’s defense-related transuranic radioactive wastes. WIPP opened in 1999, and its success has been seen as an indicator that it will eventually prove possible to site, construct, and operate HLRW disposal facilities for commercial waste. Recent events, however, raise doubts. In early 2014, two accidents resulted in the closing of WIPP; the re-opening date is uncertain. This paper discusses proposed frameworks for the design of a siting process; reviews the history of failed efforts to site a U.S. HLRW disposal facility and the successful siting of WIPP; and reviews the history of siting efforts in other countries, particularly the few where success appears most likely. It summarizes lessons learned and conclusions from analysis of efforts to site HLRW disposal facilities.

P.44  Application of Structured Decision Making to Radiological Air Monitoring. Black PK*, Stockton TB, Perona R, Ryti RT; Neptune and Company, Inc.

Abstract: We employed an analytic-deliberative structured approach to a Hanford Site radiological monitoring program decision that integrates science with values and preferences. This value-focused thinking approach facilitates transparent and defensible decision making, and aids in communicating the monitoring rationale. The example application of structured decision making (SDM) evaluates management options for a Hanford Site air monitoring network decision. The fundamental objectives elicited from the decision makers included maximizing social sustainability and minimizing public health and environmental impacts. By shifting from an alternatives focus to a values focus, an influence diagram is initialized that connects the management options with models and measures that are explicitly connected to objectives. The influence diagram is populated with conditional probabilities using either existing information or elicitation, resulting in a Bayesian network that captures the decision uncertainty. The example SDM application demonstrates SDM’s utility for monitoring program decisions as well as the ability to scale the relatively simple example SDM model to a more complex and holistic assessment of program objectives.

P.45  An Exposure Based Multi-Criteria Decision Analysis (MCDA) Approach for the Risk Prioritization of Antibiotic Products. Chabrelie AE*, Mitchell J, Norby B; Michigan State University

Abstract: Antimicrobials are required in the production of cattle intended for the meat and dairy industries. Because the use of antimicrobials may contribute to increased antimicrobial resistance in bacterial communities across multiple environments over the life cycle of these products, it is important to rank risks associated with their usage. However, many existing data gaps limit the ability to quantitatively characterize risks across these diverse exposure pathways. Hence, this study develops a decision analytic framework, to prioritize risks based on exposure potential in order to inform stewardship initiatives. By using MCDA, several disparate types of information or criteria that play a role in the transmission of antimicrobial resistance can be integrated. First, information related to usage - quantities manufactured, prescribed and administered. Second, the properties of the antibiotic compounds themselves, such as their degradation kinetics and mechanisms; and properties driving accumulation in certain environmental compartments. Third, information related to their interaction with pathogens and commensals to exert selection pressure for resistance. Finally, criteria related to the availability of interventions or alternatives to potentially reduce usage. The approach developed will be presented based on a recent survey of antibiotics used in dairy cattle. The strength of this approach is the ability to integrate relevant statistical and mechanistic data with expert judgment, allowing for an initial tier assessment that can further inform research and risk management strategies.
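One minimal way such disparate criteria could be aggregated is a weighted sum; the criteria names, weights, product labels, and scores below are purely hypothetical, and the authors' actual MCDA aggregation may differ:

```python
# Hypothetical weighted-sum MCDA sketch for ranking antibiotic products by
# exposure potential; all weights and scores are illustrative placeholders.
criteria_weights = {"usage": 0.35, "persistence": 0.25,
                    "resistance_selection": 0.25, "alternatives": 0.15}

products = {  # normalized 0-1 scores per criterion (higher = greater concern)
    "antibiotic_A": {"usage": 0.9, "persistence": 0.4,
                     "resistance_selection": 0.7, "alternatives": 0.2},
    "antibiotic_B": {"usage": 0.3, "persistence": 0.8,
                     "resistance_selection": 0.5, "alternatives": 0.6},
}

def priority_score(scores):
    # aggregate the disparate criteria into a single exposure-based priority
    return sum(w * scores[c] for c, w in criteria_weights.items())

ranking = sorted(products, key=lambda p: priority_score(products[p]), reverse=True)
print(ranking)  # highest-priority product first
```

In practice, the criterion scores would come from the statistical and mechanistic data and expert judgment described above, and the weights from the elicitation step.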

P.46  Impact of Temperature and Humidity on Stroke among Diabetes Mellitus Patients Using Statins. Ho WC*, Chou YJ, Tsan YT, Chan WC, Lin MH, Lin YS, Chen PC; China Medical University

Abstract: Cerebrovascular diseases have been a common cause of death globally during the past decade. Diabetes mellitus is a well-known risk factor for stroke: persons with diabetes mellitus have an increased risk of stroke. Statins are widely used medicines, prescribed for their cholesterol-lowering effect in patients with hyperlipidemia to prevent vascular disease. Among the noted pleiotropic effects of statins, anti-inflammation may reduce the risk of stroke. Previous studies reported that temperature and relative humidity are associated with stroke, by changing blood viscosity or increasing plasma fibrinogen levels, which may lead to inflammation and contribute to stroke. The objective of this study is to investigate whether meteorological factor exposure and statin use have an impact on stroke among the diabetes mellitus population. This study used the Longitudinal Health Insurance Database, a cohort of 2 million people with diabetes mellitus during the 1999-2010 period. We collected and analyzed incident stroke in the diabetes mellitus cohort; statin use was measured by cumulative defined daily dose (cDDD) together with diabetes mellitus severity. The meteorological factors, including temperature and relative humidity, were obtained from 77 monitoring stations of the Taiwan Environmental Protection Administration. A time-stratified case-crossover approach was used to establish cases and controls at a 1:3 ratio. Conditional logistic regression models were used to estimate risks associated with average daily meteorological factors, daily air pollution, and statin usage. Subgroup analyses by level of statin intake were also conducted. Finally, since meteorological factors may trigger stroke, we will evaluate whether statins interact with meteorological conditions in the risk of stroke. 
The expected impact is that the preventive and therapeutic efficacy of statins may modify the effects of meteorological factors on stroke among the diabetes mellitus population.

P.48  A Series of Unfortunate Events: Perpetuation of the Pervasive Misconception that Rats Receive a 3-5 Times Lower Lung Tissue Dose than Humans at the Same Ozone Concentration. McCant DD*, Lange SS, Haney JT, Honeycutt ME; Texas Commission on Environmental Quality

Abstract: Unfortunately, researchers continue to perpetuate the misunderstanding that human lung tissue doses of ozone (O3) are 3-5 times greater than rat tissue doses at the same O3 concentration, referencing Hatch et al. (1994). The origin of this erroneous assertion lies in the fact that Hatch et al. did not expose humans and rats under the same conditions, a difference that continues to go unaccounted for and pervades the scientific literature. Hatch et al. exposed exercising humans to 0.4 ppm and resting rats to 2 ppm and found comparable 18O incorporation into bronchoalveolar lavage constituents. This important difference in activity state is not always appropriately considered when the perceived implications of the Hatch et al. study are cited in the peer-reviewed literature. However, this difference in activity state explains the comparable incorporation of 18O by exercising humans and resting rats at 5-fold different exposure concentrations. More specifically, although exercising humans were exposed to a 5-times lower O3 concentration than resting rats, their ventilation rate was 5-times higher than the resting rate, offsetting the 5-times lower exposure concentration and producing the same dose that would be expected at rest when exposed to 2 ppm (i.e., 0.4 ppm x exercising human ventilation rate of 64.6 L/min ≈ 2 ppm x resting human study ventilation rate of 13.5 L/min). In other words, humans exposed to 2 ppm at rest should be expected to experience approximately the same dose as exercising humans at 0.4 ppm, which produced a dose comparable to resting rats at 2 ppm. Correcting the misconception that rats must be exposed to 3-5 times environmental concentrations to achieve the same environmentally relevant O3 doses as humans is important for a correct understanding of available O3 studies by the scientists and policy makers responsible for making regulatory decisions (e.g., setting the federal O3 standard).
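The ventilation-rate offset at the heart of this argument is simple arithmetic, since the inhaled dose rate scales as concentration times ventilation rate; using the rates quoted above:

```python
# Inhaled O3 dose rate ~ concentration x ventilation rate, so the 5-fold lower
# exposure concentration for exercising humans is offset by their roughly
# 5-fold higher ventilation rate.
exercising_human = 0.4 * 64.6  # ppm x L/min, exercising at 0.4 ppm
resting_human = 2.0 * 13.5     # ppm x L/min, hypothetical 2 ppm exposure at rest
print(f"exercising: {exercising_human:.1f}, resting: {resting_human:.1f} ppm-L/min")
```

The two dose rates differ by under 5%, which is why the exercising-human and resting-rat exposures produced comparable 18O incorporation despite the 5-fold concentration difference.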

P.49  Review and Assessment of Phosgene Mammalian Lethality Data and the Development of a Human Estimate. Sommerville DR*, Channel SR; US Army Edgewood Chemical Biological Center and Leidos

Abstract: New human estimates for the lethal effects of phosgene inhalation were derived from a review and statistical analysis of existing mammalian lethality data. The estimates are expressed as a function of exposure duration for healthy subpopulations and the general population. Median lethal dosages and quantal response data were analyzed for 11 species: mouse, rat, guinea pig, rabbit, cat, goat, sheep, swine, horse, dog, and monkey. A total of 155 median lethal dosages and 31 probit slopes were compiled from some 42 studies/sources dating back to World War I. The resulting human estimates were expressed via the toxic load model, L(C^n t)50 = k, where n = 1, with C in mg/m^3 and t in minutes (thus following Haber’s Rule). The LCt50 equals 1,500 and 1,100 mg-min/m^3 for the military (healthy subpopulation) and general populations, respectively. The revised healthy subpopulation estimate is lower than WWII-era military estimates but in exact agreement with more recent analyses by the United States and the United Kingdom. The general population estimate is lower than previous estimates, reflecting a more robust estimate with inclusion of non-human primate data that was generally unavailable to non-military organizations. The base 10 probit slope (concentration) is estimated at 9.5 (healthy subpopulation) and 7.0 (general population).
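The reported parameters map onto the standard log-probit dose-response form; the sketch below shows the implied lethality probability as a function of the inhaled dosage C*t (with n = 1), using the abstract's LCt50 and slope values but hypothetical exposure scenarios:

```python
from math import erf, log10, sqrt

def lethality_probability(C, t, lct50, probit_slope):
    """P(lethality) under the toxic load model with n = 1 (Haber's Rule):
    the probit response is linear in log10 of the inhaled dosage C*t."""
    z = probit_slope * log10(C * t / lct50)
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))  # standard normal CDF

# Abstract's estimates: LCt50 = 1,500 (healthy) / 1,100 (general) mg-min/m^3,
# base-10 probit slopes of 9.5 and 7.0, respectively. Exposures are hypothetical.
p_median = lethality_probability(C=100, t=15, lct50=1500, probit_slope=9.5)  # at LCt50
p_double = lethality_probability(C=200, t=15, lct50=1500, probit_slope=9.5)  # 2x LCt50
```

At C*t equal to the LCt50 the probability is 0.5 by construction; the steep slope of 9.5 means doubling the dosage pushes lethality close to certainty.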

P.50  Prediction of hepatotoxicity in rats by statistical approaches. Takeshita J*, Oki H, Yoshinari K; (1) National Institute of Advanced Industrial Science and Technology, (2, 3) University of Shizuoka

Abstract: Today, hazard and risk assessments of chemical substances are mainly carried out based on the results of animal experiments. On the other hand, in terms of time and cost efficiency and animal welfare, there is an increasing worldwide demand for alternatives to animal experiments. The development of alternatives, however, has not progressed enough, especially for “repeated-dose toxicity (RDT),” which is one of the most important toxicities in the hazard assessment of chemical substances. As alternatives to animal experiments, we suggest two statistical approaches to predicting in vivo hepatotoxicity using three different kinds of data: a) existing data on in vivo hepatotoxicity in rats, b) data from in vitro nuclear receptor assays, and c) molecular descriptors (in silico data). First, we obtained in vivo RDT data from the “Hazard Evaluation Support System Integrated Platform (HESS),” which was developed in Japan. There were 365 endpoints and 606 compounds in the HESS data as of September 2014. Then, we conducted several kinds of in vitro nuclear receptor assays for 190 compounds out of the 606. We also calculated various molecular descriptors of the 190 compounds using Dragon 6. Thus, we constructed a training dataset for this study. We then took two approaches to predicting in vivo toxicity: 1) We selected “Hepatocellular Hypertrophy (Centrilobular)” as the endpoint of hepatotoxicity to be studied, and predicted the presence or absence of the toxicity, as well as its strength, using the in vitro and in silico data. 2) We grouped the 190 compounds based on the in vitro and in silico data, and then filled in the missing information on potential hazards of compounds whose in vivo hepatotoxicity had not been obtained, based on our assumption of their similarity to the compounds in the training dataset. In this talk, we will present a summary of the training dataset and the results of the two approaches to predicting in vivo hepatotoxicity in rats.

P.51  The effects of air pollution and statin use on the risk of stroke in diabetes mellitus patients after transient ischemic attack: a 5-year population-based cohort follow-up study. Yin MC*, Wu TT, Chou YJ, Chu YR, Chan WC, Tsan YT, Ho WC, Chu CC, Chen PC; China Medical University

Abstract: Stroke is a major worldwide public health issue. The relationship of air pollution to cardiovascular diseases has been assessed. Statins are widely used for hyperlipidemia because of their cholesterol-lowering effect. Preclinical evidence has shown that statins can reduce the incidence of stroke in diabetes mellitus (DM) patients through their anti-inflammatory, antiproliferative, proapoptotic, and anti-invasive effects. However, few studies have investigated the effects of air pollution and statin use after transient ischemic attack (TIA) in diabetics. The aim of this study was to investigate whether air pollution and statin use are associated with stroke incidence among diabetes mellitus patients who had a TIA, focusing in particular on potential critical period effects both before and after TIA. The study design was a retrospective cohort study. The medical records of subjects, including stroke events and statin use, were collected from the Longitudinal Health Insurance Database 2000 (LHID2000). Air pollution exposure was based on the Taiwan Environmental Protection Administration’s high-density air monitoring stations and assessed by a geographical information system (GIS). Cox proportional hazards regression models were used. The results showed that air pollution potentially increases the risk of stroke for both exposure periods among DM patients who had a TIA. Statin use might interact with air pollution on this risk. It is important to assess stroke risk among DM patients who had a TIA, and to identify the roles of air pollution and statins in prevention/intervention strategies for stroke. Further study is warranted.

P.53  National-level evaluation of pesticide risks to endangered and threatened species. Rossmeisl CM*, Peck C, Garber K; U.S. Environmental Protection Agency

Abstract: The United States Environmental Protection Agency (USEPA), the US Fish and Wildlife Service (USFWS), the National Oceanic and Atmospheric Administration’s National Marine Fisheries Service (NMFS), and the US Department of Agriculture are jointly developing a framework to address the risk from pesticide use to federally threatened and endangered species (listed species) and their designated critical habitats. This framework is intended to be applied on a national level to anywhere a pesticide may be applied as allowed on a label. The framework includes 3 steps, with the first two steps focused on risks to an individual of a listed species through a weight of evidence approach and the third step focused on risks to entire populations. The methodology is being developed by the agencies through the assessment of risks of three organophosphate insecticides: chlorpyrifos, diazinon, and malathion. Draft results of the first two steps of the process have been completed for approximately 1780 species of plants and animals. USEPA, USFWS, and NMFS are currently working to translate information used to identify risks for individuals, which is based on field and waterbody level scales, into information that is useful at the population, landscape, and watershed scales. This presentation will give an overview of the methodology used in the individual-level risk assessments for the first two steps of the process and how this information could be integrated into a population-level assessment.

P.54  Improving Ecological Risk Assessment by Embracing Benchmark Dose Analysis. Mayfield DB*, Skall DG; Gradient

Abstract: Benchmark dose (BMD) analysis is routinely used in the assessment of human health effects from exposure to environmental contaminants. Accordingly, the US Environmental Protection Agency (US EPA) has developed technical guidance and software tools (i.e., benchmark dose software [BMDS]) to allow risk assessors to characterize exposure-response relationships following a systematic process. By comparison, ecological risk assessments (ERAs) seldom employ dose-response modeling, relying instead upon no- or lowest-observed adverse effect levels (NOAELs/LOAELs). For example, ecological screening levels and remediation goals for wildlife are often developed using food-web models based on conservative assumptions of exposure and toxicity. The toxicity reference values (TRVs) underlying these concentration goals are typically based on NOAELs or LOAELs derived from available laboratory toxicity studies. In many cases, these procedures result in unrealistically low estimates of hazard. Fortunately, BMD modeling can be employed in ERAs by using existing BMD methods and EPA guidance. In this study, we applied BMD analysis to characterize exposure-response relationships for selected chemicals following a systematic process. This analysis demonstrates the utility of the BMD approach for developing more robust toxicity values for use in wildlife risk assessments.
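As a minimal illustration of the BMD idea (with synthetic curve parameters, not values from the study): assume or fit a dose-response model, then invert it at a benchmark response (BMR), here 10%:

```python
def response(dose, top, ec50, slope):
    """Log-logistic dose-response: fraction of test animals affected."""
    return top / (1.0 + (ec50 / dose) ** slope) if dose > 0 else 0.0

def benchmark_dose(bmr, top, ec50, slope):
    """Invert the fitted curve at the benchmark response (e.g. BMR = 0.10)."""
    return ec50 / (top / bmr - 1.0) ** (1.0 / slope)

# With a hypothetical fitted curve (top = 1.0, EC50 = 50 mg/kg-day, slope = 2):
bmd10 = benchmark_dose(0.10, top=1.0, ec50=50.0, slope=2.0)
print(f"BMD10 = {bmd10:.1f} mg/kg-day")
```

Unlike a NOAEL, which is constrained to the tested dose levels, the BMD is a defined point on the whole fitted curve, which is what makes the resulting TRVs less sensitive to study design.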

P.55  Extrapolation Strategies for Ecological Risk Assessment: Inhalation Toxicology in Cetaceans . Rosenstein AB*, Collier TK, Mori C; Independent Consultant

Abstract: There are often limited or no toxicity data for ecological receptors, and physiological information for many species is also similarly limited. In our recent research, we extrapolated laboratory animal inhalation toxic effect levels to marine mammals (cetaceans) by scaling, using body mass and lung volume. Other differences between terrestrial and marine mammals, some of which may derive from living and feeding in deep water versus in air, are not well understood. Examples include: physiological differences in lung structure and nasal filtration; environmental and chemical differences related to the possible impact of gas pressures at depth on toxicant uptake; and metabolic differences. These differences were not incorporated into the extrapolations we have conducted so far. Here we address the following question: what is the effect on the final exposure and risk estimates of leaving out these types of factors? We will present a summary of extrapolation approaches that have been used in ecological risk assessments in the past, and we will discuss promising methods currently being developed to take into account metabolic, physiological, and other species differences.

P.56  An Attacker-Defender Resource Allocation Game with Complementary or Substituting Effects. He M*, Zhuang J; University at Buffalo

Abstract: Many game-theoretic models have been developed to study optimal government (defender) resource allocation strategies against strategic adversaries (attackers). However, to the best of our knowledge, the complementary or substituting effects of the government’s resource allocations have not been taken into consideration in the literature, especially when facing strategic attackers. This work fills the gap by developing a multi-stage game-theoretic model with a more realistic contest success function, which focuses on the optimal resource allocation to minimize the economic losses from attacks by intelligent terrorists, taking into account the complementary or substituting effects. We study how the joint effectiveness of multiple security investments and the uncertainty of the interactions between different security programs influence the defender’s strategy. In a sequential game model, we study the best response of the attacker for each potential defense allocation. Then we explore the dynamics between defender and attacker, and the corresponding expected payoffs. We provide analytical solutions to a single-target model and then provide numerical illustrations of a multiple-target model, using real data. We also compare the results of the models with and without consideration of the complementary or substituting effects. Our preliminary results show that the optimal resource allocation depends significantly on the joint effectiveness of security investment, and that the defender would be worse off by not considering such effects. Finally, we study how the optimal strategies of each player and the corresponding probabilities of successful attack change as the parameters vary. This research provides new insights into homeland security budget allocation through a multi-stage game model. 
Moreover, the impact of complementary or substituting effects on optimal budget allocations emphasizes the importance of developing a more realistic model of defense efforts for use in homeland security budget allocation decisions.
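The backward-induction structure of such a sequential game can be sketched as follows; the ratio-form contest success function, the joint-effectiveness expression, and all parameter values are illustrative assumptions rather than the authors' model:

```python
def success_prob(attack, defense):
    # ratio-form contest success function
    return attack / (attack + defense) if attack > 0 else 0.0

def defense_effect(g1, g2, rho):
    # hypothetical joint effectiveness of two security investments:
    # rho > 0 models complementarity, rho < 0 substitution
    return g1 + g2 + rho * (g1 * g2) ** 0.5

def attacker_best_response(defense, value=100.0, unit_cost=2.0):
    # the attacker maximizes expected payoff p*value - cost over an effort grid
    efforts = [i * 0.25 for i in range(81)]
    return max(efforts, key=lambda a: success_prob(a, defense) * value - unit_cost * a)

def defender_optimum(budget, rho):
    # the defender anticipates the attacker's best response (backward induction)
    # and splits the budget to minimize expected damage plus spending
    best = None
    for g1 in [i * 0.5 for i in range(int(budget * 2) + 1)]:
        g2 = budget - g1
        d = defense_effect(g1, g2, rho)
        a = attacker_best_response(d)
        loss = success_prob(a, d) * 100.0 + g1 + g2
        if best is None or loss < best[0]:
            best = (loss, g1, g2)
    return best

loss_c, g1_c, g2_c = defender_optimum(budget=10.0, rho=0.5)   # complementary
loss_s, g1_s, g2_s = defender_optimum(budget=10.0, rho=-0.5)  # substituting
```

With complementary investments the sketch splits the budget evenly across the two programs, while with substituting investments it concentrates spending in one program, which mirrors the qualitative point that ignoring these joint effects leaves the defender worse off.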

P.57  Combining Quantitative Microbial Risk Assessment and Disability Adjusted Life Years to estimate Microbial Risk Reduction for Cost-Benefit Analysis in Drinking Water Systems. Bergion V*, Rosén R, Lindhe A; Chalmers University of Technology

Abstract: Waterborne outbreaks of gastrointestinal diseases can result in large societal costs. Health care costs, productivity losses, costs of illness, and costs due to reduced trust in drinking water authorities are some examples. To mitigate waterborne diseases, microbial risk reduction measures can be undertaken. In order to choose the most appropriate microbial risk reduction measure(s), drinking water authorities need transparent and structured decision support. Quantitative Microbial Risk Assessment (QMRA) estimates the microbial risks in a drinking water system, and commonly, Disability Adjusted Life Years (DALYs) are used as a metric for potential negative health effects. We present a novel approach combining QMRA and economic valuation of DALYs in order to estimate and monetise the health benefits resulting from microbial risk reduction measures in a drinking water system, as input for Cost-Benefit Analysis (CBA). Health benefits, other benefits, and costs were used to compare the different microbial risk reduction measures using CBA. To include uncertainties in the CBA, Monte Carlo simulation was used for calculation of the Net Present Value (NPV). Lake Vomb in Sweden was used as a case study, in which different microbial risk reduction measures for On-site Wastewater Treatment Systems (OWTSs) were compared. Preliminary results indicate that connecting the ten OWTSs posing the largest microbial risk has the highest probability of a positive NPV in the CBA. This approach will make up a key part of a transparent and structured decision support framework for drinking water authorities. Furthermore, using DALYs as a health indicator, it is possible to compare mitigation measures that reduce both chemical and/or microbial health risks, and to compare measures implemented in different parts of the drinking water system. 
The study was performed within the project “Risk based decision support for safe drinking water”, funded by the Swedish Water and Wastewater Association.
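The Monte Carlo NPV step of such a CBA can be sketched as below; the investment cost, monetised-DALY value, and risk-reduction distributions are illustrative placeholders, not the study's figures:

```python
import random

def npv_simulations(n=10_000, horizon=30, discount=0.035, seed=42):
    """Sample the NPV of a risk reduction measure: monetised health benefits
    (DALYs avoided per year) minus a one-off investment cost."""
    rng = random.Random(seed)
    pv_factor = sum(1.0 / (1.0 + discount) ** t for t in range(1, horizon + 1))
    npvs = []
    for _ in range(n):
        invest = rng.uniform(1.0e6, 3.0e6)              # one-off cost (SEK)
        dalys_avoided = rng.triangular(0.02, 0.3, 0.1)  # per year, uncertain
        value_per_daly = rng.uniform(0.5e6, 1.5e6)      # monetised DALY (SEK)
        npvs.append(dalys_avoided * value_per_daly * pv_factor - invest)
    return npvs

npvs = npv_simulations()
p_positive = sum(v > 0 for v in npvs) / len(npvs)
print(f"P(NPV > 0) = {p_positive:.2f}")
```

Repeating this per candidate measure gives each one a probability of a positive NPV, which is the comparison statistic the abstract reports for the OWTS options.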

P.58  Combining cost benefit analysis with multi criteria analysis for sustainability assessment of regional water supply policies. Sjöstrand K*, Rosén L, Kärrman E, Blom L, Lindkvist J, Ivarsson M, Lång LO, Lindhe A; (1,3) SP Technical Research Institute of Sweden, (1,2,8) Chalmers University of Technology, (4) City of Gothenburg, (5) Gothenburg Region, (6) Enveco Environmental Economics Consultancy, (7) Geological Survey of Sweden

Abstract: The provision of safe drinking water is of primary importance in society and a prerequisite for public health and economic development. This provision is however threatened by a variety of risks due to e.g. climate change and societal development. To manage these risks the water utilities are facing complex decision situations. Today, decisions on costly investments are being made with limited knowledge of which choices are the most sustainable. In order to achieve a safe and sustainable water supply, a regional perspective on the environmental, social and economic effects of the decisions is increasingly promoted in Sweden. There are, however, few decision support tools adapted to the regional level. Hence, this study focuses on the development of a decision support framework for assessing the sustainability of risk reducing measures by adapting Cost-Benefit Analysis (CBA) and Multi Criteria Analysis (MCA) to a regional perspective. The Gothenburg region serves as a case study for which five measures are evaluated: (1) centralization of water supply production; (2) centralization of water supply organization; (3) shift of the main raw water source; (4) maximization of groundwater usage; and (5) use of additional raw water sources. All measures aim to enhance safety by reducing risks. Uncertainties concerning the net present values are analyzed using statistical simulation (Monte Carlo). The CBA results are complemented with environmental and social effects in an MCA, including criteria such as intrinsic values, equity and health. The case study results are then used to design a decision support framework that allows for non-market valuations and economic and ecological tradeoffs under uncertainty, a novelty on the regional scale. 
In conclusion, it is expected that the results of this study will provide decision makers with a framework that can improve their ability to make well-informed decisions and ensure the society a safe water supply for generations to come.
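The Monte Carlo treatment of NPV uncertainty described in the abstract can be sketched as follows. All parameter values, distributions and units here are hypothetical illustrations, not figures from the study:

```python
import random

def npv(cash_flows, rate):
    """Discount a list of annual net cash flows (year 0 first) to present value."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

def simulate_npv(n_draws=10_000, horizon=20, rate=0.035, seed=1):
    """Monte Carlo NPV of a risk-reducing measure with uncertain
    investment cost and uncertain annual benefit (hypothetical inputs)."""
    rng = random.Random(seed)
    results = []
    for _ in range(n_draws):
        investment = rng.triangular(80, 160, 110)  # upfront cost, MSEK (assumed)
        annual_benefit = rng.normalvariate(9, 2)   # yearly benefit, MSEK (assumed)
        flows = [-investment] + [annual_benefit] * horizon
        results.append(npv(flows, rate))
    results.sort()
    mean = sum(results) / n_draws
    interval = (results[int(0.05 * n_draws)], results[int(0.95 * n_draws)])
    return mean, interval

mean, (low, high) = simulate_npv()
print(f"mean NPV = {mean:.1f} MSEK, 90% interval = ({low:.1f}, {high:.1f})")
```

The 90% interval is read directly from the sorted draws; a fuller analysis would also report the sensitivity of the NPV to each uncertain input.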

P.59  The long and winding road: controlling CO2 emissions from international aviation. Vaishnav PT*; Carnegie Mellon University

Abstract: In 2013, the International Civil Aviation Organization (ICAO), the UN agency charged with regulating CO2 emissions from international aviation, declared that it would implement a market-based mechanism (MBM) that would require airlines to buy credits to offset any growth in their carbon dioxide emissions after 2020. In March 2014, ICAO published a “strawman” proposal for the MBM, which was revised in March 2016. The revised draft was the subject of a high-level meeting (HLM) in May. The meeting ended in an impasse on whether and how to phase the MBM in, as well as on how to account for each country’s contributions to global emissions. The proposed analysis will trace the development of the fledgling mechanism through to the final version that emerges from the meeting of ICAO’s Assembly in 2016, where the mechanism will be voted upon. It will evaluate the impacts – many of them arguably unintended – of the different iterations of the mechanism for different types of airlines. Drawing on the documentation that ICAO provides of its Global Aviation Dialogues (GLADs), the analysis will attempt to correlate changes in the mechanism with positions taken by individual states, or groups of states. One of the big unresolved issues in the design of the MBM is the standard that any carbon dioxide offset purchased to meet its requirements must meet. The proposed analysis will draw on the management sciences literature to argue that there is a considerable risk that this standard will not have high integrity. As the first mechanism to address CO2 emissions from an entire sector of the global economy, ICAO’s MBM may serve as a template for other sectors, for example ocean shipping. Understanding what it gets right, and what it does not, may be important to the design of other global mechanisms.

P.60  PM2.5 related welfare loss in Beijing, China: health and psychological mood impacts. Yin H*, Xu LY; Beijing Normal University

Abstract: In recent years, PM2.5 pollution has blanketed most cities in China, causing serious health impacts and substantial welfare losses. Some local governments have started to implement emission pricing regulations, including pollution charges and “cap and trade”. One reason for skepticism about such regulations is that little is known about the magnitude of the welfare loss caused by PM2.5 pollution. This paper used the contingent valuation (CV) method to elicit people’s willingness to pay (WTP) to avoid PM2.5 pollution and whether they support relevant policies. Additionally, impacts on people’s mood due to the reduced visibility caused by PM2.5 pollution were also considered in this study. A survey was conducted in Beijing, China to estimate the welfare loss due to PM2.5 in 2015. We conducted face-to-face interviews with 727 residents in Beijing using a stratified sampling method. The survey used a payment card vehicle to elicit the respondents’ WTP for reducing PM2.5 pollution to the second national limit (35 µg/m3). The results showed that 85% of respondents were willing to express their WTP for PM2.5 reduction and mood improvement. The total WTP for health and mood improvement was US$477 million and US$330 million per year, respectively. As expected, WTP increased with income, education and understanding of PM2.5; it was larger for men and smaller for older people. However, there was no significant relationship between WTP and daily PM2.5 concentration, which indicates that the temporary PM2.5 pollution level has little influence on people’s WTP for pollution reduction. The welfare loss due to PM2.5 pollution was around US$807 million in 2015, about 0.22% of the regional GDP. The study results could provide scientific support for establishing a PM2.5 pollution compensation and tax system in China.

P.61  Benefits of mercury controls for China and the neighboring countries in East Asia. Zhang W, Zhen G, Chen L, Wang H, Li Y*, Ye X, Tong Y, Zhu Y, Wang X; East Tennessee State University

Abstract: Exposure to mercury poses significant risks to the health of humans and wildlife. Globally, coal-fired power plants (CFPPs) are a major source of mercury emissions, with China being the largest contributor to global atmospheric mercury. As a signatory country of the Minamata Convention on Mercury, China is developing its National Implementation Plan on Mercury Control, which gives priority to controlling mercury emissions from CFPPs. While social benefits play an important role in designing environmental policies in China, the potential public health and economic benefits of mercury control in the nation are not yet understood, mainly due to the scientific challenges of tracing mercury’s emissions-to-impacts path. Moreover, little is known about the potential benefits for neighboring countries in East Asia resulting from China’s mercury control. This study evaluates the health and economic benefits for China and neighboring countries in East Asia from mercury reductions at China’s CFPPs. Four representative mercury control policy scenarios are analyzed, and the evaluation is explicitly conducted along the policies-to-impacts path under each policy scenario. We link a global atmospheric model to health impact assessment and economic valuation models to estimate economic gains for China and its three neighboring countries (Japan, South Korea and North Korea) from avoided mercury-related adverse health outcomes under the four emission control scenarios, taking into account the key uncertainties in the policies-to-impacts path. Under the most stringent control scenario, the cumulative benefit of the mercury reduction by 2030 is projected to be $430 billion for the four countries together (95% confidence interval: $102-903 billion, in 2010 USD).
Our findings suggest that although China is the biggest beneficiary of the mercury reduction at CFPPs, neighboring countries including Japan, South Korea and North Korea can also benefit (~7% of the total benefits) from China’s mercury reduction.

P.62  Cost-Benefit Analysis of Copper Recycling in Remediation Projects. Volchko Y*, Karlfeldt Fedje K, Norrman J, Rosén L; Chalmers University of Technology

Abstract: Soil contamination is a worldwide problem. In Sweden alone there are more than 80,000 potentially contaminated sites. Remediation in Sweden is usually performed using the conventional “excavation and disposal” method because of low disposal fees and time constraints on remedial actions. To complement this most common remediation method, a metal recycling technology can be used to extract valuable metals, e.g. copper, from excavated masses at contaminated sites, thus returning these metals to the societal cycle instead of landfilling them. To assess the economic competitiveness of metal recycling in remediation projects against conventional “excavation and disposal” methods, five alternative remedial actions were assessed for the Köpmannebro site in Sweden by means of a Cost-Benefit Analysis (CBA). Uncertainties in the analysis results were handled with Monte Carlo simulations. All studied alternatives assumed excavation and disposal of the masses heavily contaminated with copper, but differed in distances to the disposal sites, transport means and treatment of excavated masses before landfilling. The CBA resulted in negative net present values (NPVs) for all five remediation alternatives, because the remedial action itself was associated with very high costs and low benefits. Project risks (i.e. 12.5% of the costs for remedial action), copper prices, the efficiency of copper recycling and the costs of the increased health risks due to the remedial action contributed most to the uncertainties in the NPVs for the alternatives assuming metal recycling. The simulated mean NPV for one of these remediation alternatives was slightly lower than for direct disposal of contaminated masses because of the metal recycling opportunity and disposal of the excavated and treated materials at the local landfill.
If copper prices at least doubled, all the metal recycling alternatives for the Köpmannebro site could become competitive with the conventional “excavation and disposal” method.

P.63  Development of innovative methodology for safety assessment of industrial nanomaterials: Report of research project in Japan (FY2011-2015). Gamo M*, Honda K, Yamamoto K, Fukushima S, Takebayashi T; 1) 2) 3) National Institute of Advanced Industrial Science and Technology (AIST), 4) Japan Bioassay Research Center, 5) Keio University

Abstract: Many industrial nanomaterials have been developed as breakthrough technologies in various fields. Even among nanomaterials with the same composition, there is a huge number of variations in physicochemical properties such as size, shape, surface area, surface coating, and crystalline phase. Although intensive research on health effects has been conducted, it has proven impractical to conduct a detailed assessment of each of this variety of nanomaterials. From this viewpoint, a five-year research project named "Development of Innovative Methodology for Safety Assessment of Industrial Nanomaterials" was launched in September 2011 by the Ministry of Economy, Trade and Industry (METI) in Japan, and was completed in March 2016. The project consists of two themes. The first is the development of equivalence criteria for nanomaterials, which would contribute to reducing the number of animal tests by categorizing the variety of nanomaterials into a limited number of categories. For this purpose, using intratracheal administration tests, we compared the pulmonary toxicity of seven TiO2, four NiO, and seven SiO2 nanomaterials with different physicochemical properties, and evaluated the sensitivity of pulmonary toxicity to these properties. The second is the establishment of intratracheal administration as a low-cost and convenient method for screening nanomaterials for pulmonary toxicity. We conducted an inhalation exposure test and an intratracheal administration test using the same nanomaterials at comparable dose levels, and discussed the differences and similarities between the results of the two tests. We have also developed a standardized protocol for the intratracheal administration test. *) This work is part of the research program "Development of innovative methodology for safety assessment of industrial nanomaterials" supported by the Ministry of Economy, Trade and Industry (METI) of Japan.

P.64  Technology “Risk Radars”: An example in the area of nanotechnology. Jovanovic AS*, Quintero FQ, Klimek P, Markovic N; Steinbeis Advanced Risk Technologies, Stuttgart, Germany

Abstract: The paper presents the development of the RiskRadar in the large EU project caLIBRAte (“Performance testing, calibration and implementation of a next generation system-of-systems Risk Governance Framework for nanomaterials”). The objective of the caLIBRAte project is to establish a state-of-the-art versatile risk governance framework for the assessment and management of human and environmental risks of manufactured nanomaterials (MN) and MN-enabled products. After reviewing “risk radar”-like initiatives worldwide (e.g. in the insurance industry), the paper presents the development of the caLIBRAte Risk Radar, which is largely based on the method developed in the iNTeg-Risk project. It helps identify and monitor emerging risks in the area of nanotechnology by considering environmental, health/safety, socio-political, economic/financial, regulatory/legal and technological aspects. Indications of possible risks are collected from different sources: the expert level (a platform for including experts, opinions and warnings); the scientific publications level (bibliometric analysis); the public and stakeholders’ perception level (conventional sources such as reports on surveys, focus groups and the like); and social media/networks. A special technique for the automatic identification of new risks in internet-based sources, originally developed for the insurance industry and measuring the singularity and ubiquity of new information, has been adapted and deployed. The results of the identification and monitoring process will also be used for the predictive part, based on agent-based models previously calibrated on the monitoring results.

P.65  Risk assessment of groundwater drawdown in subsidence sensitive areas. Sundell J*, Rosén L; Chalmers University of Technology

Abstract: Groundwater drawdown induced subsidence in soft soil is a severe problem in many regions around the world. One common cause of drawdown and subsidence is leakage of groundwater into sub-surface constructions. When planning future sub-surface projects in areas at risk of subsidence, potential damage costs need to be estimated. This is of utmost importance in cities where the risk objects (buildings and other constructions) are many and of high value. To reduce the risk of costly damage, safety measures and additional investigations can be planned for. Since groundwater drawdown and subsidence is a transient process, a time window for implementing safety measures exists. In this study, we present a novel method that recognizes the whole cause-and-effect chain of groundwater drawdown induced subsidence, the large spatial scale of the problem, its spatial variability and the transient process. Applied to a case study in Sweden, the method combines three probabilistic models for spatial estimation of: (1) soil and bedrock stratification, (2) groundwater drawdown, and (3) subsidence. The combined result of these three models gives probability density functions for subsidence magnitudes at given time steps and locations. The risk of damage and its associated cost is estimated by combining this result with the subsidence sensitivity of individual risk objects. By means of cost-benefit analysis, the benefit of a safety measure is compared to the cost of implementing it. Possible risk reduction measures can then be ranked with respect to profitability; if a measure yields a positive net benefit at a certain location, it is recommended for implementation. When the results are mapped, the method provides clear and communicative decision support for the planning of safety measures, monitoring and additional investigations.


Abstract: This presentation describes how Cuba approaches Process Safety (PS) through regulators and controls, regulatory frameworks, and risk analysis and treatment, establishing the Safety Report (SR) within Risk Management (RM). Through its regulatory bodies, the Cuban state requires the SR for Major Hazard Facilities (FHM); the Holder of the Environmental License (HEL) must perform a Safety Assessment (SA) commensurate with the size and nature of the major-accident hazards at existing facilities. Engineering and project work associated with FHM makes it possible to incorporate the SR approach from the early stages of facility design through PS, ensuring not only compliance with environmental and safety regulations but also risk management in future and existing FHM in high-risk industries.

P.67  Accidents risk assessment on China petroleum and chemical enterprises. Zhao Y*; Peking University

Abstract: With the development of its economy, China has become a large producer and consumer of petrochemical products in recent years. By the end of 2010, there were about 35,000 enterprises above the state designated size in China's petroleum and chemical industry. However, due to weak safety management capacity and a lack of prevention and control measures, environmental incidents and safety accidents occurred frequently, posing a significant threat to public health and the environment. Faced with the possible losses from accidents, quantitative risk assessment and loss evaluation are needed to set reasonable insurance expenses and security budgets. At the same time, risk analysis can provide a basic framework for accident prediction and loss distribution simulation for petrochemical enterprises, and supply reference values for related risk assessment model parameters. In view of this, this study assessed the accident risk of Chinese petrochemical enterprises, measuring and predicting accident risk levels and loss distributions on a nationwide scale using a probabilistic risk model and a geographic information system. As there is at present no relevant accident database in China, this study built a petrochemical accident information database by retrieving China's petrochemical accident news during 2006-2015 through web crawlers, combined with accident information disclosed on the websites of the China State Administration of Work Safety, the Ministry of Environmental Protection and their affiliates. In addition, according to China's administrative divisions and corresponding risk levels, a risk map of Chinese petrochemical enterprise accidents was drawn.
Based on a preliminary description of the spatial and temporal distribution of the accidents, the number of petroleum- and chemical-related incidents has declined over the past decade, and at both the national and provincial scales the regional distribution of accidents is quite uneven.

P.68  The Environmental Competitiveness of Small Modular Reactors: A Life Cycle Study. Carless TS*, Griffin WM, Fischbeck PS; Carnegie Mellon University, Department of Engineering and Public Policy   tscarles@andrew.cmu.edu

Abstract: The Energy Information Administration estimates the demand for electricity in the US will increase by 29% between 2012 and 2040. In an effort to mitigate climate change, the US has pledged significant reductions in greenhouse gas (GHG) emissions over the next ten years. To bridge this energy gap and reduce GHG emissions, an increase in nuclear power using small modular reactors (SMRs) may help meet future energy needs by providing affordable, baseload, and low-carbon electricity. Currently, there are no SMRs in commercial operation. This work conducts a prospective attributional life cycle assessment of an SMR. Monte Carlo simulation and sensitivity analyses are used to account for the uncertainties in the analysis. The analysis finds the mean (and 90% confidence interval) life cycle GHG emissions of the Westinghouse SMR (W-SMR) to be 9.1 g CO2-eq/kWh (5.9 to 13.2 g CO2-eq/kWh) and of the Westinghouse AP1000 to be 8.4 g CO2-eq/kWh (5.5 to 12.1 g CO2-eq/kWh). The GHG emissions of the AP1000 are 9% less than those of the W-SMR. However, when the nuclear fuel cycle is not included in the analysis, the GHG emissions of the W-SMR and the AP1000 are effectively the same given the inherent uncertainties in the analysis. The analysis also finds that both types of plants stochastically dominate the Generation II 4-loop SNUPPS, whose mean (and 90% confidence interval) life cycle GHG emissions are 13.6 g CO2-eq/kWh (10.5 to 17.3 g CO2-eq/kWh). While the AP1000 has the benefits of economies of scale, the W-SMR’s modularity enables it to make up some of the difference through efficiencies in construction, operation and maintenance, and decommissioning.
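The stochastic-dominance comparison described above can be checked on empirical samples with a simple first-order dominance test. The draws below are hypothetical normal samples centred on the reported means, with made-up spreads; they stand in for the study's actual Monte Carlo output:

```python
import random

def ecdf(sample, x):
    """Empirical CDF: fraction of draws less than or equal to x."""
    return sum(1 for v in sample if v <= x) / len(sample)

def stochastically_dominates(a, b):
    """First-order stochastic dominance for 'lower is better' quantities
    (like g CO2-eq/kWh): a dominates b if a's empirical CDF lies at or
    above b's everywhere, and strictly above it somewhere."""
    points = sorted(set(a) | set(b))
    at_least = all(ecdf(a, x) >= ecdf(b, x) for x in points)
    somewhere = any(ecdf(a, x) > ecdf(b, x) for x in points)
    return at_least and somewhere

rng = random.Random(0)
# Hypothetical emission-intensity draws (g CO2-eq/kWh) centred on the
# reported means; the standard deviations are illustrative assumptions.
ap1000 = [rng.normalvariate(8.4, 1.5) for _ in range(1000)]
snupps = [rng.normalvariate(13.6, 1.5) for _ in range(1000)]
print(stochastically_dominates(ap1000, snupps))
```

With distributions this well separated, the lower-emission sample dominates; with heavy overlap, neither sample dominates and the comparison must fall back on means and intervals.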

P.69  Health impacts of transportation and the built environment: A quantitative risk assessment. Mansfield TJ*, MacDonald Gibson J; University of North Carolina at Chapel Hill

Abstract: The design of urban transportation networks can affect three kinds of human health risks: (1) motor vehicle crashes, (2) air pollution from automobiles, and (3) physical inactivity occurring when motor vehicles replace walking and cycling as the main means of transportation. However, the relative magnitude of each of these risks in relation to the way cities are designed is poorly understood, and tools and methods that simultaneously assess all three risks are limited. Furthermore, available tools rely on static methods that fail to account for cumulative health impacts over time. This work developed the first dynamic micro-simulation model for quantifying all three risks and then applied the model to compare transportation health risks between neighborhood groups of varying designs within the Raleigh-Durham-Chapel Hill region. The model combines information on crash risk as a function of vehicle miles traveled, demographic and built environment variables routinely collected by the US Census Bureau, modeled estimates of fine particulate air pollution arising from traffic computed at the census block scale, and baseline public health data from the North Carolina State Center for Health Statistics in order to estimate premature mortality risks from each of the three transportation-risk sources at the census block group scale. The model estimates that the combined health impacts of transportation are lowest in the most walkable block groups in the region (18.4 annual excess deaths per 100,000 persons on average over 10 years, compared to 22.9 in the least walkable block groups). While air pollution health impacts are higher in the most walkable block groups (2.14 annual excess deaths per 100,000 persons compared to 1.15), physical inactivity and crash risks are lower in these areas (2.70 annual excess deaths per 100,000 compared to 6.66 and 13.5 compared to 15.1, respectively).
Thus, designing neighborhoods to encourage walking has important net health benefits.
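As a quick arithmetic check on the figures quoted above, the three per-source excess-death rates sum (to within rounding) to the reported combined totals:

```python
# Annual excess deaths per 100,000: [air pollution, physical inactivity, crashes]
most_walkable = [2.14, 2.70, 13.5]
least_walkable = [1.15, 6.66, 15.1]

print(round(sum(most_walkable), 1))   # 18.3 (reported combined total: 18.4)
print(round(sum(least_walkable), 1))  # 22.9 (matches the reported total)
```

The small 0.1 gap for the most walkable group is consistent with the abstract rounding each component independently.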

P.70  Cooling Energy Analysis of Commercial Buildings in the U.S. Lokhandwala M*, Shevade P, Nateghi R; Purdue University

Abstract: Over the past decade there has been considerable improvement in technology for indoor climate control in commercial buildings in the United States. Innovation in this area has focused mostly on reducing the energy consumption costs, and associated CO2 emissions, of regulating the temperature inside the built environment. Energy Usage Intensity (energy consumed per square foot of floor area) is a common benchmarking parameter used to compare the energy efficiency of buildings in the United States. We developed predictive models for the cooling energy usage intensity of commercial buildings using the Commercial Buildings Energy Consumption Survey (CBECS) data to identify the main contributors to cooling energy loads. With the number of Cooling Degree Days projected to increase in the coming years, the results of this study can help in devising energy usage policies and also have important implications for future innovations in cooling technology.
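Energy Usage Intensity as defined above is a simple ratio, which makes it size-independent and therefore usable as a cross-building benchmark. A minimal sketch with hypothetical building figures:

```python
def energy_usage_intensity(annual_kwh: float, floor_area_sqft: float) -> float:
    """Energy Usage Intensity: annual energy consumed per square foot of floor area."""
    if floor_area_sqft <= 0:
        raise ValueError("floor area must be positive")
    return annual_kwh / floor_area_sqft

# Two hypothetical buildings: the larger one consumes more energy overall
# but is the more efficient of the two per unit area (lower EUI).
print(energy_usage_intensity(1_200_000, 50_000))  # 24.0 kWh/sqft
print(energy_usage_intensity(300_000, 10_000))    # 30.0 kWh/sqft
```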

P.71  Visualizing uncertainty in marine navigation in the Canadian Arctic. Pelot RP*, Etienne L, Stoddard MA; Dalhousie University

Abstract: Quantitative risk assessment of marine traffic in the Canadian Arctic is fraught with difficulties: data sources are incomplete, trip frequency is low, causal factors are complex, consequences in the relatively pristine environment are hard to estimate, response resources are remote, and conditions are evolving due to climate change. This study is part of a broader effort to address these risk assessment challenges. Invoking POLARIS (Polar Operational Limitations Assessment Risk Indexing System), an evaluation system recently developed by the International Association of Classification Societies (IACS) to specify navigability in diverse ice regimes for various ice-class ships, we have developed tools for exploring and visualizing the outcomes. The Canadian Arctic waters are divided into 16 navigation zones by Transport Canada, so we developed stacked column charts by week to illustrate the accessibility of each zone for a given ice class, depending on whether ships operate with icebreaker escort and/or at reduced speed. This can serve for advance planning, giving a broad perspective on the limitations across the entire region. More specifically, one can also calculate the various levels of passability likely to be encountered along a specific path for a planned voyage at a given time, and colour-code the route accordingly. The preceding calculations and views are based on an eight-year time series, from which average-case or worst-case conditions can be deduced. A supplementary view therefore shows the variability in each week using box plots, adding further information for voyage planners anticipating the conditions likely to be encountered. This presentation focuses on information to help avoid hazardous ship-ice encounters; its importance is underscored by the potential consequences, ranging from minor, such as delays or extra fuel consumption from unexpectedly diverting the route, to getting stuck in ice and thus requiring rescue or suffering vessel damage.

P.73  Comparison of Risk-Based Concentrations Derived for Pesticides in Drinking Water with US EPA Human Health Benchmarks. Mattuck R*; Gradient

Abstract: We conducted an analysis of risk-based concentrations (RBCs) for pesticides in wastewater being discharged from a manufacturing facility to a publicly owned treatment works (POTW), to allow the facility to develop an in-house monitoring program. We calculated human health-based RBCs for 10 pesticides (Cadusafos, Carfentrazone, Chlorimuron, Clomazone, Cloransulam, Fluthiacet-methyl, Metsulfuron, Pyroxasulfone, Quinclorac, Sulfentrazone), and used the published EPA Regional Screening Levels (RSLs) for 7 additional compounds (Atrazine, Bifenthrin, Carbofuran, Carbosulfan, Imazethapyr, Metribuzin, Zeta-cypermethrin). The RBCs were calculated to be protective of human health assuming no dilution in the POTW or receiving waters, and were derived using the methodology US EPA uses to derive the RSLs for tapwater. The facility's wastewater concentrations were all below the drinking water RBCs. The RBCs were then compared to the US EPA human health benchmarks for pesticides currently registered for use on food crops. Our analysis found that the RBCs differed from the US EPA human health benchmarks (HHBPs), with RBC/HHBP ratios ranging from 0.004 to 1143. These differences were mainly due to differences in input assumptions, including whether the value was based on child vs. adult intake; the relative contribution of the drinking water source; whether the endpoint was based on a cancer or non-cancer effect; and the source of the toxicity factor. Our analysis demonstrates that acceptable discharge values can vary widely depending on which EPA methodology is used.

P.74  Prioritization of water contaminants using the USGS-EPA Water Quality Portal. Greene CW*; Minnesota Department of Health

Abstract: The Water Quality Data Portal (WQP) is a data access system developed by the U.S. Geological Survey (USGS) and U.S. Environmental Protection Agency (EPA), and administered by the National Water Quality Monitoring Council. It contains over 270 million records from 2.2 million monitoring locations. The WQP includes the USGS National Water Information System (NWIS) data set, with data from 1.5 million U.S. sampling sites; EPA’s Storage and Retrieval (STORET) database, containing data from over 400 partner agencies; and other, smaller data sets. Through a web-based interface, users can access data on sampling sites or analytical results by chemical, site, geographic region, or other parameter. Data can also be retrieved through a Web Services request in the form of a URL specifying the desired search parameters, returning a file in Extensible Markup Language (XML) or another convenient format. The WQP’s shared Application Programming Interface (API) allows users to develop their own data analysis tools. The Minnesota Department of Health (MDH) uses the WQP as a screening and evaluation tool for assessing Contaminants of Emerging Concern (CECs) in the state’s surface water and groundwater. In addition to retrievals of environmental occurrence data for CECs, the WQP enables assessments of geographic water quality trends (such as changes in concentrations upstream and downstream of known emission sources or wastewater discharges) and temporal trends that may result from changes in chemical use and/or discharge. These analyses have enabled MDH to prioritize chemicals being considered for toxicological review and drinking water guidance development, and to focus its efforts on those chemicals for which drinking water guidance would be most useful and effective.

P.75  Review of potential risk from various exposure pathways to Marcellus shale flowback water. Abualfaraj N*, Gurian P, Olson M; Drexel University

Abstract: Concern over natural gas extraction across the U.S., and particularly from the Marcellus Shale formation, which underlies approximately two-thirds of the state of Pennsylvania, has been growing in recent years as natural gas drilling activity has increased. Identifying sources of concern and risk from shale gas development, particularly from the hydraulic fracturing process, is an important step in better understanding sources of uncertainty within the industry. Scenarios of concern are modeled in order to estimate occupational and residential risk from exposure to flowback water based on the most likely exposure pathways for on-site workers and the general public. Examining health risks to workers due to inhalation of volatilized contaminants from on-site holding ponds, using mean, 2.5th percentile, and 97.5th percentile concentrations of 12 VOCs found in flowback water, revealed that these risks were minimal under typical exposure conditions. An occupational risk assessment for worker exposure to flowback water through accidental spills at hydraulic fracturing sites was carried out for contaminants of concern found in flowback water. The occupational cancer risk estimate for median concentrations did not exceed the target lifetime cancer risk of 10^-6 except for benzo(a)pyrene, which exceeds the target risk level even at the 2.5th percentile concentration. A risk assessment for residential exposure of the general public in shale gas development areas to a list of carcinogenic and non-carcinogenic chemicals of concern found in flowback water, through the ingestion, inhalation, and dermal exposure pathways, revealed that several carcinogenic compounds in flowback water exceed target limits and significantly increase the risk of an individual developing cancer following chronic exposure. In general, exposure via the dermal pathway posed the greatest risk to human health.
Exposure to radionuclides in flowback water, particularly through the inhalation pathway, poses a greater threat to human health than the other contaminants examined.

P.76  Mercury Contamination in the Columbia River Basin: Health Risk Assessment of Tribal Exposure through Subsistence Lifeways. Arachy H*; Harvard University

Abstract: Fish consumption is important to riverine tribal cultures and represents deeply held beliefs rooted in spiritual practices, subsistence lifestyles and community. A principal exposure pathway of contaminants to riverine tribes is through fish consumption. A large Columbia River Basin database on mercury concentrations in fish was used to evaluate trends in mercury contamination in fish from the waterways for a range of consumption rates. There were significant and important differences in mercury levels among species, but the locational differences were relatively small. The findings from this study demonstrated that few fish are low enough in mercury to be safe for tribal members eating resident fish at traditional historic rates, or even at a moderate rate. The traditional health risk assessment methodology is based on exposure assumptions that represent the general American population. To limit human exposure to mercury residues in locally caught species, fish consumption advisories have been established to protect local populations from health risk. The state's fish advisories suggest reducing fish consumption with the goal of lowering risk; this shifts the burden of avoiding risk to the tribal members, who now carry the burdens of contaminant exposure, socio-economic impacts, and heritage and cultural loss. Tribal members are forced to choose between culture and health. These exposures represent potentially disproportionate risks for many tribes, and these issues point to the potential inadequacy of health risk assessments to reflect important cultural differences in environmental justice communities. To address these risks, federal and state agencies should take a subsistence traditional lifestyle into consideration when performing a risk assessment.

P.77  The risk assessment of Carbofuran residue in vegetables and fruits in Taiwan from 2010 to 2015. Chao KP*, Wu KY; National Taiwan University

Abstract: Pesticides are widely used in agriculture to maintain crop quality and growth. However, the potential adverse health impacts of pesticides deserve to be assessed. Carbofuran is often detected in fruits and vegetables in Taiwan, and previous studies showed that it may inhibit cholinesterase (ChE) and is genotoxic. The objective of this study was to establish a probabilistic risk assessment of carbofuran in foods from 2010 to 2015. Existing acceptable daily intakes (ADIs) of carbofuran were derived from the no-observed-adverse-effect level (NOAEL), which lacks a probabilistic perspective. Thus, we used Benchmark Dose Software (BMDS) to calculate a BMDL10 to replace the NOAEL. The mean concentration (MC), lifetime average daily dose (LADD), and hazard index (HI) were estimated using Bayesian statistics with Markov chain Monte Carlo (MCMC) simulation. Consumption data were obtained from the National Food Consumption Database, and adults aged 19-65 years were the target population. The study revealed that the MC of carbofuran was 1.008 x 10^-3, 4.848 x 10^-4, 6.86 x 10^-4, 0.01629, 0.02656, and 4.245 x 10^-3 ppm in the six most frequently detected foods. The LADDs for these six foods were 9.795 x 10^-7, 2.777 x 10^-6, 9.975 x 10^-7, 9.543 x 10^-5, 9.364 x 10^-5, and 1.936 x 10^-5 mg/kg/day, and the corresponding hazard quotients (HQs) were 1.91 x 10^-4, 4.11 x 10^-4, 7.85 x 10^-5, 0.01238, 7.187 x 10^-3, and 1.668 x 10^-3. The HI was 0.02758. Although the HI indicates that consumers are not subject to potential adverse health effects, the aggregate health risks of carbofuran deserve particular attention. Moreover, carbofuran residues reported by the Taiwan Food and Drug Administration (TFDA) and the Council of Agriculture still exceeded the maximum residue level (MRL), so the management of carbofuran needs to be revised. We recommend that governing authorities investigate the long-term health effects of carbofuran in foods.
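
The screening arithmetic behind figures like these is simple: each food's hazard quotient is its LADD divided by a health-based reference value, and the hazard index sums the quotients. A minimal sketch in Python; all numeric values below are hypothetical placeholders, not the study's data:

```python
# Hazard screening arithmetic: HQ = LADD / reference dose, HI = sum of HQs.
# The ADI and LADD values below are hypothetical placeholders, not study data.

def hazard_quotient(ladd_mg_kg_day, adi_mg_kg_day):
    """Hazard quotient for one food item."""
    return ladd_mg_kg_day / adi_mg_kg_day

def hazard_index(hqs):
    """Hazard index: sum of hazard quotients across food items."""
    return sum(hqs)

adi = 1.0e-3                          # hypothetical ADI, mg/kg/day
ladds = [9.8e-7, 2.8e-6, 1.0e-6]      # hypothetical LADDs for three foods
hqs = [hazard_quotient(l, adi) for l in ladds]
print(hazard_index(hqs) < 1)          # prints True: HI below 1
```

An HI aggregated this way exceeds 1 only if the combined quotients do, which is why aggregate risk can warrant attention even when each individual HQ is small.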

P.78  Exposure sources and predictors of urinary phthalate metabolites in Taiwanese children. Chen CC*, Wang YH, Wang SL, Huang PC, Chen ML, Hsiung AC; National Health Research Institutes

Abstract: Exposure to phthalates is prevalent and is known to have developmental and reproductive effects in children. In this study, we measured nine phthalate metabolites in 228 children aged 0-12 years who participated in the Risk Assessment of Phthalate Incident in Taiwan (RAPIT). Two urine samples were collected from each participating child between 2013 and 2016, along with questionnaire items on exposure and consumption frequencies covering diet, plastic food containers, prepackaged beverages, microwave use, personal care products, toys, and the indoor living environment. A mixed model was used to assess the associations between each phthalate metabolite and the corresponding scores of the potential exposure categories, with participants as random effects to account for intraindividual variation. Plastic food containers or wrapping had significantly positive associations with mono-(2-ethylhexyl) phthalate (MEHP), mono-(2-ethyl-5-hydroxyhexyl) phthalate (MEHHP), mono-(2-ethyl-5-oxohexyl) phthalate (MEOHP), and mono-benzyl phthalate (MBzP) in boys aged 4-6 years. Frequency of microwave use was significantly associated with mono-n-butyl phthalate (MnBP) and mono-isobutyl phthalate (MiBP) in boys aged 0-3 years and with MEHHP, MEOHP, and MiBP in boys and girls aged 4-6 years, respectively. Consumption frequency of meat and seafood was associated with MEHHP and MBzP in children aged 4-6 years and with mono-ethyl phthalate (MEP) in boys aged 7-12 years. The indoor environment was associated with MiBP in girls aged 4-6 years and with di-(2-ethylhexyl) phthalate (DEHP) metabolites and MnBP in boys aged 7-12 years. The findings suggest that plastic food containers and microwave food heating are the main sources of phthalate exposure in Taiwanese children, followed by phthalate migration into foods and the indoor environment.

P.79  Estimations of health risk in food, by national food sampling analysis, to Taiwan. Chen Y.J*, Wu J.Y, Huang S.Z, Wu K.Y; National Taiwan University and Public Health

Abstract: The aim of this study was to evaluate three indicators, the lifetime average daily dose (LADD), hazard quotient (HQ), and lifetime cancer risk (LCR), to estimate health risks and rank the sampling order. The resulting priority sampling list will enhance the efficiency of the annual food monitoring program. Residue data were provided by the public health bureau, and intake data came from the National Food Consumption Database (NFCD). The exposure assessment of these chemicals used a Bayesian method combined with Markov chain Monte Carlo (MCMC) simulation. In addition, we collected reference doses (RfDs) and cancer slope factors (CSFs) from EFSA, Codex, the US EPA, and the Food Safety Commission of Japan; for each compound, the lowest RfD was combined with the LADD to calculate the HQ and LCR. We also aggregated the HQs of specific kinds of food into a hazard index (HI) to inform the government about high-risk foods. The study assessed 52 compounds with detected concentrations. The chemicals of high concern, with HQ values greater than 0.2, were 29.3 (Acrylamide), 2.49 (λ-Cyhalothrin), 2.05 (Cyfluthrin), 1.59 (Tolfenpyrad), 1.29 (Permethrin), 0.986 (Fludioxonil), 0.965 (Difenoconazole), 0.569 (Arsenic), 0.451 (Flufenoxuron), 0.370 (Acesulfame Potassium), 0.280 (Saccharin), 0.254 (Chlorpyrifos), 0.222 (Buprofezin), and 0.213 (Dinotefuran). Moreover, carcinogens with an LCR greater than 1E-06 need to be addressed: 3.18~0.318 (Aflatoxin G1), 2.93E-02 (Acrylamide), 2.18E-05~2.18E-06 (Aflatoxin B1), 5.58E-06~5.58E-07 (Aflatoxin B2), and 1.87E-06 (Chlorothalonil). Aflatoxin risk differs significantly for hepatitis B carriers, who account for up to 13.18% of the Taiwanese population: in the non-carrier group the CSF of aflatoxin B1 is 0.2, whereas in the carrier group it is 0.02 (ng/kg-day)^-1 (Felicia Wu et al., 2013). The authorities should pay particular attention to acrylamide, because its HQ was the highest and its LCR ranked second among all chemicals. A review of the food risk management protocol should be considered, since compliance with current regulations may be inadequate to safeguard health.

P.80  Assessing the Health Risks of Gossypol from animal derived food in the Taiwanese population. Hsing HH*, Chuang YC, Wu KY; National Taiwan University

Abstract: In 2013, a food safety incident occurred in which some manufacturers illegally mixed cottonseed oil into other cooking and edible oils. This event evoked public health concern about gossypol. A known anti-fertility agent, gossypol is a polyphenolic compound that occurs naturally in various parts of the cotton plant, with the highest concentrations in the seeds. Cottonseeds are rich in protein and are commonly used as livestock feed, and several animal studies have investigated tissue levels of gossypol in different species after feeding diets containing gossypol or cottonseed meal. This study aimed to derive a reference dose for gossypol by benchmark dose calculation, estimate gossypol residues in edible tissues using Markov chain Monte Carlo (MCMC) simulation, and then assess the health risk of gossypol to the general Taiwanese population. A clinical trial involving 151 men of various ethnic origins was used to calculate the benchmark dose; the resulting BMDL10 was divided by an uncertainty factor of 10 for human variability, yielding an RfD of 1.96 x 10^-5 mg/kg/day. Residue data came from a quantitative analysis of free gossypol in animal-derived foods in Sinkiang, and gossypol residues in all kinds of livestock tissues were estimated by MCMC simulation, enabling a health risk assessment of general populations. The results show that the average free gossypol residue level is 0.34 mg/kg in chicken liver and 0.36 mg/kg in pig liver. The hazard index was greater than 1 for the general population and for all but one of the young age groups, indicating that free gossypol residues in livestock tissues might pose adverse health effects. Although this study has some limitations, exposure to gossypol through the consumption of food from livestock fed cottonseed products should be of concern in the future.

P.81  Modeling study on the areal variation of the sensitivity of photochemical ozone concentrations and associated health impacts to VOC emission reduction in Japan. Inoue K*, Higashino H; National Institute of Advanced Industrial Science and Technology

Abstract: To reduce photochemical ozone concentrations in Japan, measures to cut VOC emissions from evaporative point sources by 30 percent have been implemented uniformly nationwide since 2006. However, the resultant effect of VOC emission reduction on ozone concentrations in 2010, the target year, varied greatly among areas such as the Kanto, Tokai, and Kinki areas. This was likely because the sensitivity of ozone concentrations to VOC emissions varied with the locations in which emission reduction was conducted. Using a chemical transport model, we quantitatively estimated the sensitivity of the regional average ozone concentration and associated health impacts (e.g., premature mortality) to reducing VOC emissions in different locations. Two indexes, the "ozone concentration reduction efficiency" (decrease in regional average ozone concentration / amount of VOC reduction) and the "ozone health impact reduction efficiency" (decrease in regional total health impacts associated with ozone exposure / amount of VOC reduction), were calculated by repeating the simulations with VOC emissions reduced in each specific place. The results show that both indexes varied widely with the places in which emission reduction was conducted. In particular, the ozone health impact reduction efficiency over coastal areas was more than 10 times higher than that over inland areas. This means that more than 10-fold higher health benefits can be obtained from the same amount of VOC emission reduction by carefully choosing the locations in which to reduce emissions. The variation of the ozone concentration reduction efficiency among areas (such as the Kanto, Tokai, and Kinki areas) was also consistent with the variation in the observed decrease in average ozone concentrations among those areas, implying that the calculated areal variation of the effects of VOC emission reduction may occur in the real world.
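
Both indexes are simple benefit-per-reduction ratios and can be sketched directly. In this minimal Python example, the coastal and inland figures are made-up placeholders used only to illustrate the comparison, not model output:

```python
# The two efficiency indexes defined in the abstract are ratios of a benefit
# to the amount of VOC emissions reduced. All values below are hypothetical.

def conc_reduction_efficiency(delta_ozone, voc_reduced):
    """Decrease in regional average ozone / amount of VOC reduction."""
    return delta_ozone / voc_reduced

def health_impact_reduction_efficiency(delta_health_impact, voc_reduced):
    """Decrease in regional total ozone-related health impacts / VOC reduction."""
    return delta_health_impact / voc_reduced

# Same hypothetical VOC cut (1000 units) at two hypothetical sites:
coastal = health_impact_reduction_efficiency(10.0, 1000.0)
inland = health_impact_reduction_efficiency(0.8, 1000.0)
print(coastal / inland)  # roughly 12.5: an over-10-fold difference in benefit
```

The point of the comparison is that the denominator (the VOC cut) is the same, so the ranking of locations is driven entirely by where a unit of reduction buys the most ozone or health benefit.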

P.82  Proposed methods for characterizing dermal exposure to BPA for purposes of Proposition 65. Lewis RC*, Singhal A, Gauthier A, Kalmes R, Sheehan P; Exponent, Inc.

Abstract: Bisphenol A (BPA), a synthetic monomer that is widely used in polycarbonate plastics and epoxy resins, was recently recognized by the State of California as a reproductive toxicant under Proposition 65. The State has proposed for BPA a Maximum Allowable Dose Level (MADL) of 3 ug/day specifically for dermal contact with solid materials. This regulatory development affects a large number of industries that sell relevant products in California, and consequently there has been much interest in understanding exposures via this route to evaluate compliance with Proposition 65. However, there is no standardized approach for estimating dermal exposure to BPA. Although previous approaches have been applied to phthalates, a group of chemicals that are also used in consumer applications and regulated under Proposition 65, those approaches may not be appropriate because BPA and phthalates have somewhat different chemical properties. We therefore evaluate various methods to characterize dermal exposure to BPA using different dermal wiping methods that simulate the amount of BPA that may be dislodged during specific product handling scenarios. We compare wet and dry media as well as direct skin wiping and washing methods after contact with products. Based on experimental studies involving several different consumer goods that contain BPA, our preliminary results suggest that a reasonable characterization of dermal exposure can be made. Because BPA is water soluble, the approaches used to simulate skin perspiration are expected to provide an upper-bound estimate of dermal exposure. Results from the various approaches are used to determine a range of dermal uptake values, which are compared to the Proposition 65 MADL of 3 ug/day.

P.83  Probabilistic Risk Assessment of Fipronil Residue in Tea in Taiwan. Lu EH*, Wu KY; National Taiwan University

Abstract: Tea leaves used in chain tea shops have been found to contain multiple insecticides, including fipronil, ametryn, fenpropathrin, and DDT. Fipronil, banned for use on tea trees, is an insecticide commonly used on corn and rice in Taiwan. Previous studies showed that fipronil may lead to neurological toxicity and thyroid cancer. The objective of this study was to establish a probabilistic risk assessment of fipronil in tea consumption in Taiwan. The established reference dose (RfD) was derived from a no-observed-effect level (NOEL); this study therefore used Benchmark Dose Software (BMDS) to calculate the corresponding benchmark dose level, BMDL10, based on an existing animal study. The lifetime average daily dose (LADD) was estimated using Bayesian statistics with Markov chain Monte Carlo simulation (BSMCMC). Tea consumption data were obtained from the National Food Consumption Database of Taiwan, with three target populations aged 12-16, 16-18, and 19-65 years. Transfer rate and the infusion process were factored into the exposure assessment. This study reveals that the LADDs are 3.85 x 10^-5, 4.1 x 10^-5, and 5.8 x 10^-5 mg/kg/day for the 12-16, 16-18, and 19-65 year age groups, respectively, and the corresponding hazard indexes (HIs) are 0.085, 0.091, and 0.129. Although the HIs show that consumers are not subject to potential adverse health effects, the risk of fipronil should not be ignored. Farmers often spray multiple insecticides at the same time, so consumers may be exposed to multiple insecticides when drinking tea. Although the residue model result is below the MRL, fipronil residues in the FDA tea reports still exceed the MRL, and some imported tea leaves likely to contain excessive insecticides are still used in tea shops. It is still necessary to list insecticides in tea leaves as inspection targets, and it is essential for experts and the government to cooperate to improve insecticide policy.

P.84  Improvements in biota modeling for EPA’s Preliminary Remediation Goal and Dose Compliance Concentration calculators: intake rate derivation, transfer factor compilation, and mass loading factor. Manning KL*, Dolislager F, Bolus KA, Walker S; University of Tennessee, Knoxville, TN; Oak Ridge National Laboratory, Oak Ridge, TN; US EPA, Washington, DC

Abstract: Recent improvements have been made in biota modeling for EPA’s Preliminary Remediation Goal (PRG) and Dose Compliance Concentration (DCC) calculators. These risk assessment tools set forth EPA's recommended approaches, based upon currently available information with respect to risk assessment, for response actions at Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) sites (commonly known as Superfund). Previously, produce intake rates were based on general fruit and vegetable consumption rates. Now, the produce intake rates are derived from 22 individual produce items, found in the 2011 Exposure Factors Handbook, that contribute to the overall produce ingestion PRG and DCC. Mass loading factors (MLFs) were also improved from a single MLF that was applied to all produce to 22 individual MLFs that correspond with the 22 individual produce items that make up the new produce intake rates. MLFs from Hinton (1992), the Environment Agency’s Contaminated Land Exposure Assessment (CLEA) model, and Pinder and McLeod (1989) were used and converted, where necessary, from dry weight to fresh weight. In site-specific mode, a user is now able to select additional animal products including Goat Milk, Sheep Milk, Duck, Mutton, Goat Meat, Rabbit, Turkey, and Venison as well as Rice and Grains, which are not included in the default produce intake rates. Formerly, the transfer factors used in these risk assessment tools were specific to element only. Now, the transfer factors are element-specific, biota-specific, climate zone-specific, and soil type-specific. These new transfer factors are from the recent IAEA TRS-472 and TRS-479 as well as Science Report: SC030162/SR2 from the Environment Agency of the United Kingdom and were used to replace most of the old generic values from NCRP, BAES, RESRAD, and RAD-SSL. These updates will greatly improve the accuracy and utility of the PRG and DCC calculators.

P.85  Probabilistic health risk assessment of 2-amino-3,4-dimethylimidazo[4,5-f] quinoline on fish consumption. Msibi SS*, Chuang YC, Wu C, Wu KY; National Taiwan University

Abstract: 2-Amino-3,4-dimethylimidazo[4,5-f]quinoline (MeIQ) is a heterocyclic amine formed during the condensation reaction of creatinine and amino acids that occurs during the cooking of meat and fish; the general public is thus exposed to the compound through the ingestion of such foods. MeIQ has been found at different concentrations in different types of meat, with sun-dried sardines having the highest concentration. The cooking method also affects the concentration: studies have shown that pan-frying and grilling/barbecuing yield higher levels of the compound, and the concentration further depends on cooking temperature and time, with higher temperatures and longer cooking times generally increasing the amounts of all heterocyclic amines produced. Studies have shown that MeIQ is a potent mutagen, testing positive in the Ames test, and in several animal toxicology studies it was found to be a carcinogen, with significant incidences of fore-stomach, liver, and intestine tumor formation in the experimental groups. There are insufficient data on the carcinogenicity of the compound in humans, so it is currently reasonably anticipated to be a human carcinogen. Exposure to MeIQ via fish consumption was calculated using world population consumption rate data. After extrapolation using data from a CDF1 mouse study, the human BMDL10 was calculated to be 0.299 mg/kg/day, with a cancer slope factor (CSF) of 0.334 (mg/kg/day)^-1. Consumption rates for marine and freshwater fish were pooled, and the lifetime average daily dose (LADD) at the 95th percentile was 2.71 x 10^-6 mg/kg/day, with an estimated risk of 9.1 x 10^-7. The risk of developing cancer from the consumption of meat and fish is therefore very low, although this risk may be compounded by other factors and other carcinogens found in the diet and environment.

P.86  Measuring, Assessing and Communicating Individual External Doses in the Evacuation zone in Fukushima. NAITO W*; National Institute of Advanced Industrial Science and Technology

Abstract: Five years after the Fukushima nuclear disaster, radiation levels have greatly decreased due to physical decay, weathering, and decontamination operations in the affected areas of Fukushima. For the government to lift the evacuation order and for individuals to return to their original residential areas, it is important to assess current and future realistic individual external doses. In this study, we used personal dosimeters along with the Global Positioning System and a Geographic Information System to understand realistic individual external doses and to relate individual doses, ambient doses, and the activity patterns of individuals in the affected areas. The results showed that the measured individual doses correlated well with the ambient doses based on the airborne monitoring survey. Linear regression analysis suggested that the additional individual doses were on average about one-fifth of the additional ambient doses. The reduction factors, the ratios of individual doses to ambient doses, were on average 0.14 for time spent at home and 0.32 for time spent outdoors. We have developed a user-friendly tool using the obtained data to assess and communicate individual external doses among various stakeholders. Our results and the tool are a valuable contribution to understanding and communicating realistic individual doses in relation to airborne-monitoring-based ambient doses and individuals' time-activity patterns.

P.87  City Noise: Propagation and Health Impact. Piotrowski A, De Guidici P, Soledano B, Payre C, Cabanes PA*; EDF

Abstract: The objective of our study was to assess the potential effects of different scenarios of an urban development project on population health, with an initial focus on the effects associated with noise. Beyond this first objective, we sought to propose and test a methodology applicable a priori to other projects of a similar nature underway in other urban areas. Acoustic levels were evaluated for the present and for 2020 and 2030, based on planning scenarios developed with the city. Noise propagation was modeled with NoiseM@p software, which enabled us to describe the time distribution of sound emissions, that is, their distribution throughout the day according to the day of the week. The exposure-risk curves recommended by the World Health Organization (WHO) and used to calculate impacts are discussed and supplemented with qualitative data for the cases studied (for example, strong tonal components in the noise, or emergence time) to refine results that do not fully capture all relevant situations, especially the emergence of sound source events against a quiet background (nighttime noise, for example). The modeling of noise levels showed that this urban development project will increase the number of people exposed to high noise levels; this proportion will rise by around 50% from today to 2020. Nonetheless, the proportions of individuals bothered by noise and of those whose sleep it disturbs will return close to their current levels by 2030 if a proactive scenario is implemented, one that promotes in particular the use of public transportation. The difference in the incidence of myocardial infarctions attributable to noise will remain less than 1 per year between today and 2020 or 2030. This approach will be extended to other pollutants.

P.88  Using Diffusive Samplers to Measure Formaldehyde in Residential Indoor Air. Singhal A*, Renee K, Sheehan P; Exponent, Inc.

Abstract: Starting in early 2015, claims of elevated formaldehyde emissions from laminate flooring created a need to determine the laminate-specific formaldehyde contribution to indoor air. This study focuses on the first part of the evaluation, which was used as a screening tool to measure total formaldehyde levels in homes. Homes that requested an evaluation (n>40,000) were provided with diffusive samplers (passive badge monitors) to measure aggregate indoor air formaldehyde concentrations. Passive badges, however, are designed for occupational settings where concentrations may be 10–100 times greater than in residential indoor air, and few data are available on their ability to measure formaldehyde at the low levels typical of homes. Therefore, we conducted experimental studies in which badges from three different manufacturers were placed in chambers with a known concentration of 13 or 14 µg/m3 for 24 hours. These badges were subsequently analyzed by two or more of the participating labs to determine inter-badge and inter-lab differences. Badge A appeared to provide the most accurate results, while badges B and C tended to under-report. Preliminary indoor air data from actual houses (n~8,000) confirmed these results. No statistically significant differences were noted between labs in the experimental studies. Our analysis suggests that not all badges perform equally well at low concentrations: while some badges may have lower sensitivity and therefore higher detection limits, others report accurate levels at low formaldehyde concentrations. Our findings suggest that badges, after a pilot-study calibration at the concentrations of interest, can be a very effective and inexpensive screening tool for measuring residential indoor air formaldehyde concentrations.

P.89  Presentation of new EPA online Vapor Intrusion Screening Level (VISL) tool. Stewart DJ*, Galloway LD, Dolislager FG, Smith S, Frame AM, Gaines LG; The University of Tennessee, Knoxville, TN; US Environmental Protection Agency, Washington, DC

Abstract: The U.S. Environmental Protection Agency (EPA), Office of Superfund Remediation and Technology Innovation (OSRTI), through an interagency agreement with the Oak Ridge National Laboratory, developed an online calculator for vapor intrusion screening levels (VISL) that will be linked to the EPA Regional Screening Levels (RSL) database. This tool provides screening level concentrations for groundwater, soil gas (sub-slab and exterior), and indoor air to help risk assessors, risk managers, and concerned citizens determine risks from vapor intrusion. Vapor intrusion is the general term for migration of hazardous vapors from any subsurface vapor source, such as contaminated soil or groundwater, through the soil and into an overlying building or structure. Chemicals must be sufficiently volatile and toxic to pose inhalation risk via vapor intrusion from soil and groundwater sources. This is determined by calculating the chemical’s pure phase vapor concentration and the groundwater vapor concentration. The soil gas vapor concentration and the groundwater vapor concentration must be greater than the air screening level to determine a VISL. The VISLs for groundwater and soil gas (either sub-slab or soil gas collected below or adjacent to buildings) are calculated from the target indoor air concentrations using empirically-based, conservative attenuation factors that reflect reasonable worst-case conditions, as described in EPA’s draft vapor intrusion guidance (EPA, 2002), and default exposure parameters and factors that represent Reasonable Maximum Exposure (RME) conditions for chronic exposures. In addition to calculating screening levels, this tool can calculate indoor air concentrations and risks from soil gas and groundwater concentrations entered by the user. The online VISL tool will be available soon.

P.90  Health Risk Assessment of maleic and fumaric acid in the Taiwanese adult population via LC-MS/MS and Bayesian Statistics Markov Chain Monte Carlo Simulation. Wu CH*, Shih IT, Chuang YC, Wu KY; National Taiwan University

Abstract: Maleic anhydride (MAH) has been used purposefully as a food adulterant in a variety of starch-based local Taiwanese delicacies such as tapioca and vermicelli, although it is mainly used in the manufacture of polyester resins for boats, autos, piping, and electrical goods. Once consumed and hydrolyzed, maleic acid (MA) may cause renal impairment such as tubular injury and necrosis in the proximal tubules. Fumaric acid, an isomer of maleic acid, has by contrast been used as an acidity regulator in foods since 1946. The aims of this study were to 1) determine the total amount of maleic acid and fumaric acid in foods with liquid chromatography coupled with tandem mass spectrometry (LC-MS/MS) and 2) assess the exposure of adults in Taiwan via Bayesian statistics and Markov chain Monte Carlo simulation (BSMCMC). Of the 66 food samples collected, including instant coffee, tapioca starch, and rice cake, 60 contained quantifiable concentrations of MA and 39 contained quantifiable residues of fumaric acid. Combining these data with intake rates from the National Food Consumption Database, the lifetime average daily dose (LADD) and hazard index (HI) were calculated as follows. The LADD of MA and MAH for adult males and females is 0.63 μg/kg/day and 0.56 μg/kg/day, respectively, with 95% upper limits of 1.14 μg/kg/day and 0.97 μg/kg/day. The average exposure dose for fumaric acid is 3.55 μg/kg/day and 3.63 μg/kg/day, respectively, with 95% upper limits of 10.14 μg/kg/day and 10.01 μg/kg/day. The HI value of MA for adult males and females is 0.01137 and 0.00972, respectively, while the HI value of fumaric acid is 0.00169 and 0.00167. Although the HI of maleic acid is the higher of the two, the individual HQs of maleic acid and fumaric acid are far below 1. Our results indicate that the current level of exposure to maleic acid and fumaric acid is unlikely to induce adverse health effects in humans.

P.91  The risk assessment of dietary exposure to acrylamide for adults in Taiwan. Yeh SS*, Wu C, Wu K; National Taiwan University Hospital, National Taiwan University, National Taiwan University

Abstract: Acrylamide (AA) has been detected in potatoes and other foods cooked at high temperatures. AA can cause neurological damage and reproductive hazards, and it has also been hypothesized to act as a genotoxicant via its metabolite glycidamide (GA), causing various cancers in animal tissues. Many countries have investigated the dietary intake of AA in the general population; however, a risk assessment of AA in foods in Taiwan was lacking. We gathered data on AA analyses and the national food survey to calculate the theoretical maximum daily intake (TMDI) and lifetime average daily dose (LADD) for adults aged 19 to 65 years. Moreover, the benchmark dose lower confidence limit at 10% risk (BMDL10), margin of exposure (MOE), and cancer slope factor (CSF) were derived from the two-year drinking water cancer studies conducted by the National Toxicology Program. The results showed that the TMDIs for twelve food categories were 0.605, 0.04585, 0.1598, 0.3356, 0.0329, 0.5455, 0.3731, 0.3797, 0.5716, 0.1328, 0.6058, and 0.04218 mg/kg/day. The LADD was estimated to be 5.975 x 10^-3 mg/kg/day for consumers at the high (95th) percentile, and the BMDL10, MOE, and CSF were calculated to be 0.21 mg/kg/day, 35, and 3.262 (mg/kg/day)^-1, respectively. Using the above data, the lifetime cancer risk associated with the highest daily intake of acrylamide in foods over 70 years was estimated to be 0.01949. Although these results were calculated under highly conservative assumptions, they represent a non-negligible magnitude of cancer risk associated with AA intake in Taiwan.
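
The headline numbers here follow from the reported inputs: the margin of exposure is the BMDL10 divided by the LADD, and the lifetime cancer risk is the LADD multiplied by the CSF. A quick check in Python using the values reported in the abstract:

```python
# Reproducing the abstract's MOE and lifetime cancer risk from its inputs.

def margin_of_exposure(bmdl10, ladd):
    """MOE = BMDL10 / LADD."""
    return bmdl10 / ladd

def lifetime_cancer_risk(ladd, csf):
    """LCR = LADD x CSF."""
    return ladd * csf

ladd = 5.975e-3   # mg/kg/day, 95th-percentile consumer
bmdl10 = 0.21     # mg/kg/day
csf = 3.262       # (mg/kg/day)^-1

print(round(margin_of_exposure(bmdl10, ladd)))    # prints 35
print(round(lifetime_cancer_risk(ladd, csf), 5))  # prints 0.01949
```

Both values round to the figures reported in the abstract, which is a useful internal-consistency check on such dose-response summaries.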

P.92  Risk assessment for non carcinogenic health effects for people living in a contaminated area by chemicals in Sao Paulo, Brazil. Toledo MC*, Nardocci AC; University of Sao Paulo

Abstract: Contamination of urban areas is a cause for concern, because these densely populated areas pose risks to human health. Vila Carioca, located in São Paulo city, is considered critical because of the high levels of contamination in its soil and groundwater, leading to exposure of the resident population. The contamination began in 1960 due to industrial activities. Risk studies have already been conducted; however, there is still much uncertainty and controversy about the health risks to the population. Investigations have also included some preliminary clinical tests (blood, urine, and hair), with results suggesting alterations in leukocytes and in the hepatic and hematologic systems. This study proposes a risk assessment of non-carcinogenic health effects for people living in the residential area contaminated by dangerous chemicals. Exposure occurred through groundwater ingestion for 25 years and through soil ingestion over the entire lifetime (70 years). The methods recommended by the United States Environmental Protection Agency were followed, with 15 compounds selected for the groundwater assessment and 20 for soil. The hazard quotient for water ingestion was considered not tolerable for cis-1,2-dichloroethylene, which can be associated with urinary system problems.

P.93  Solving Complex Radioactive Decay Chains for Future Assessment and Cleanup Decisions. Galloway LD*, Bolus KA, Bellamy MB, Dolislager FG, Walker S; University of Tennessee; Ingenium Inc; Oak Ridge National Laboratory; Environmental Protection Agency

Abstract: There is a need to understand how radionuclide activity changes with time, as the activity measured in the past will differ from current and future levels. When a radionuclide decays, its activity decreases exponentially as a function of time as it transforms into a different atom - a decay product. The atoms keep transforming into new decay products until they reach a stable state and are no longer radioactive. The series of decay products created on the way to this stable state is called the decay chain. For radionuclide chains, the daughter products can have significant implications for dosimetry and remediation. Thus, risk assessors evaluating sites with radioactive contamination need to plan for future progeny ingrowth in addition to sampled radionuclides. These are important considerations for risk quantification during characterization and cleanup planning, particularly when sampling may have occurred years before the remediation cleanup work begins. If a radionuclide's half-life and current activity are known, then hand-calculating the future activity is straightforward. However, calculating the ingrowth of progeny quickly becomes cumbersome for longer chains such as the Thorium-232 decay series. For the more complex chains where many daughters are formed, possibly with multiple branches, this calculation involves solving a complex set of simultaneous differential equations known as the Bateman Equation. The Decay Chain Activity Projection Tool calculates the activity of radionuclides and their progeny as a function of time. This tool uses a combination of Python and Perl to automatically construct the radionuclide decay chains, solve the resulting Bateman Equation, and provide the user with tabular solution output and plots. The risk assessor can then use the data for exposure assessment and cleanup decisions without further costly sampling.
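For the simplest case of a single parent and one daughter, the Bateman solution can be hand-coded; longer, branched chains are what motivate a dedicated tool. The sketch below is the standard textbook two-member form, not the tool's own implementation, and the half-lives in the example are hypothetical:

```python
import math

def bateman_two_member(n1_0, lam1, lam2, t):
    """Classic two-member Bateman solution for a parent -> daughter chain.
    n1_0: initial number of parent atoms; lam1, lam2: decay constants
    (1/time), assumed unequal; returns (parent, daughter) atom counts at t."""
    n1 = n1_0 * math.exp(-lam1 * t)
    n2 = n1_0 * lam1 / (lam2 - lam1) * (
        math.exp(-lam1 * t) - math.exp(-lam2 * t))
    return n1, n2

# Hypothetical example: parent half-life 10 d, daughter half-life 1 d.
lam1 = math.log(2) / 10.0
lam2 = math.log(2) / 1.0
parent, daughter = bateman_two_member(1e6, lam1, lam2, t=5.0)
# After half a half-life, parent ~ 7.07e5 of the initial 1e6 atoms,
# with the daughter near secular ingrowth equilibrium.
```

Each additional chain member adds another coupled exponential term (and branching adds weighted copies), which is why general chains like Th-232 are solved programmatically rather than by hand.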

P.94  Evidence Integration Facilitated by DRAGON ONLINE. Turley AT*, Burch DF, Henning CC; ICF International

Abstract: What strengthens our conclusion on whether a chemical exposure poses a health risk? More evidence? Higher quality evidence? Evidence from more than one data stream? The answer varies based on the decision context (e.g., regulatory, problem formulation) and the decision framework (e.g., traditional risk assessment paradigm, 21st century paradigm), but having well-organized, well-annotated data facilitates evaluation and development of our conclusion. DRAGON ONLINE is a tool for risk assessors that enables the organization, evaluation, and annotation of toxicology and epidemiology data to support risk assessment decisions. By standardizing the data elements evaluated across scientific studies, we can more easily reach conclusions because it is easier to understand data across studies and even evidence streams. On this poster, we demonstrate the utility of DRAGON ONLINE for the organization and evaluation of epidemiologic and toxicologic data identified through a comprehensive literature search. We show the flexibility of DRAGON ONLINE for the evaluation of individual studies and the visualization capabilities available to the risk assessor. Over the course of developing DRAGON ONLINE, we have extracted data from more than 2,000 toxicology and epidemiology studies and evaluated the quality and potential for bias of more than 1,000 studies. We share the lessons learned through this experience regarding standardization of text, clearly defined extraction fields, and quality control procedures and explain how each of these elements impacts our ability to integrate data across evidence streams.

P.95  Delimiting the study of risk: risk assessment guidelines and values-based judgments. Kokotovich AE*; University of Minnesota

Abstract: How values-based judgments are recognized and addressed has been and continues to be an important topic within risk assessment. While values-based judgments are often seen as belonging only to the realm of problem formulation, there has been acknowledgment, going back to the “Red Book”, of the role of values-based judgments – known as “risk assessment policy” – throughout the steps of risk assessment. Risk assessment guidelines are proposed as a way to rigorously deal with the unavoidable judgments in risk assessment, yet there has been a lack of scholarship reflecting on how values-based judgments within guidelines shape risk assessment. The work presented here begins by developing a conceptual model of risk assessment guidelines, scientific studies, and the conducting of individual risk assessments that acknowledges the role of values-based judgments throughout. I then present a case study of conflicting genetically modified organism ecological risk assessment guidelines to examine the implications of values-based judgments within guidelines. Specifically, I examine the differences between two competing risk assessment guidelines for assessing the potential impacts of insect-resistant genetically modified plants on non-target organisms. Utilizing document analysis and in-depth interviews, I found that judgments (e.g., concerning hazard identification, substantial equivalence testing, species selection, and indirect effects) delimit how each set of guidelines examines the risks from genetically modified plants and influence which scientific studies are called for. I conclude by arguing that inclusive, transparent, and deliberative scrutiny of risk assessment guidelines is needed to ensure that consequential judgments are recognized and reflected upon. This is especially true when regulatory bodies decide upon guidelines for new stressors, such as plants with novel traits from new gene-editing techniques like CRISPR.

P.96  Reference framework for the application of Quantitative Risk Analysis for hydrocarbon pipelines, coupled with uncertainty treatment methods: Uncertainty in scenario identification through event trees. Ocampo Pantoja FA*, Villalba NA, Muñoz F; Universidad de los Andes

Abstract: Risk assessment is imperative to reduce the consequences and frequency of accidents generated by leaks in pipelines. After a failure in an urban pipeline, the contents can be released rapidly, forming a spreading pool and generating a vapor cloud that can lead to different events such as fires and explosions. Generally, deterministic assessments are carried out, where certain assumptions are made in the models used to calculate failure and event probabilities, event consequences, and risk values. This approach leaves uncertainty that is not considered and therefore generates results only for “mean” or “worst-case” scenarios. Not taking uncertainty into account can lead to decisions with an unknown degree of conservatism: overestimated results could increase the cost of decisions, and underestimated results can lead to poor risk management. There is a lack of a clear directive on using methods of uncertainty assessment in QRA, making them difficult to apply widely. For that reason, a framework is needed for carrying out such evaluations in order to handle, reduce, and properly communicate uncertainty to decision makers. To develop such a framework, uncertainty should be evaluated in two main steps of the QRA methodology: 1) scenario identification/frequency estimation and 2) consequence analysis, all applied to a case study of an urban gasoline pipeline. This work deals with the first step by evaluating three approaches to addressing uncertainty in event tree analysis (a well-known technique for scenario identification and frequency calculation) in order to compare their results in treating epistemic uncertainty: evidence theory, fuzzy set theory, and a hybrid probabilistic-possibilistic representation. Since these results must be coupled with consequence analysis in subsequent steps, an evaluation of transformation methods from non-probabilistic to probabilistic representations (crisp values) is performed to provide guidance on their selection in this particular field.

P.98  Realizing Disaster Causation: Critical Realism as an Underpinning Philosophy for Disaster Risk Analysis. Huang T*; Department of Urban Planning, National Cheng Kung University

Abstract: Disasters are increasing in both frequency and intensity. People seek to go beyond addressing merely the symptoms of disasters and instead treat the underlying causes in order to prevent future disasters from occurring in the same manner as before. Although there is common understanding that disasters and their risk are influenced by various factors and processes, there is still limited knowledge regarding the causation of those factors and processes. A disaster is often defined as a natural or man-made hazard resulting in “an event” of significant physical damage or destruction, loss of life, or drastic change to the environment. The perception of disaster as an event implies that it has a beginning and an end. Therefore, we analyze the disaster with reference to the occurrence of the event, that is, before, during, and after its onset. As illustrated in the critical realist framework of causation, the unsafe conditions of our societal system as a whole emerge out of the workings of underlying structures. Given the unsafe conditions, the onset of a disaster is contingent on triggering hazards, be they natural or man-made. The unfolding process of disaster starts from problematic structures of the system leading to internal functional disorder, or dynamic pressures; these in turn manifest as symptoms or warning signs, which jointly determine the system’s conditions at the time. Hazards, by definition, are things that can cause risk or danger; in critical realist terms, hazards are the “other mechanisms” that trigger the escalation of the already unsafe conditions into a state of crisis or emergency. Hazards are not necessarily extrinsic. Depending on the coping actions, the outcomes of the crisis and emergency events eventually impact the structures, creating further underlying causes of disaster. That, in turn, deteriorates the system structure and starts another disaster cycle.

P.99  Computing Risks with Confidence. Ferson S*, Sentz K; Los Alamos National Laboratory

Abstract: Confidence distributions encode frequentist confidence intervals for a parameter at any confidence level. They characterize inferential uncertainty about parameters estimated from sparse or imprecise sample data, just like bootstrap distributions or Bayesian posterior distributions, but they enjoy a frequentist guarantee of statistical performance that makes them useful in risk and uncertainty analyses. Although there is no confidence distribution for the inference problem of estimating a binomial probability from success/failure data, an imprecise generalization of confidence distributions, which we call a ‘c-box’, can be derived for this problem, and it can be propagated through mathematical expressions using the ordinary machinery of probability bounds analysis. Remarkably, the results also offer the same statistical guarantee. C-boxes allow analysts to literally as well as figuratively compute with confidence. We illustrate the application of binomial c-boxes to a risk assessment and describe numerical simulations that confirm the statistical coverage properties of c-boxes and computations derived from them.
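The binomial c-box the authors describe has a concrete closed form: its bounding distributions are Beta(k, n-k+1) and Beta(k+1, n-k) for k successes in n trials. A stdlib-only sketch follows; the function names are ours, the integer-parameter beta quantile is computed by bisection via the identity I_p(a, b) = P(Bin(a+b-1, p) >= a), and this is an illustration of the construction rather than any released tool's code:

```python
import math

def binom_cdf(x, n, p):
    """P(X <= x) for X ~ Binomial(n, p)."""
    return sum(math.comb(n, i) * p**i * (1 - p)**(n - i) for i in range(x + 1))

def beta_ppf_int(u, a, b, tol=1e-10):
    """Quantile of Beta(a, b) for integer a, b >= 1, by bisection on the
    regularized incomplete beta function I_p(a, b) = P(Bin(a+b-1, p) >= a)."""
    lo, hi = 0.0, 1.0
    n = a + b - 1
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        cdf = 1.0 - binom_cdf(a - 1, n, mid)  # = I_mid(a, b)
        if cdf < u:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

def binomial_cbox(k, n, u):
    """Interval of u-th quantiles of the binomial c-box for p given k of n:
    bounds come from Beta(k, n-k+1) and Beta(k+1, n-k), pinned to 0 or 1
    at the k=0 and k=n edges."""
    lo = beta_ppf_int(u, k, n - k + 1) if k > 0 else 0.0
    hi = beta_ppf_int(u, k + 1, n - k) if k < n else 1.0
    return lo, hi

# e.g., 2 successes in 10 trials: sweeping u over (0, 1) traces the two
# bounding CDFs; the classical 95% Clopper-Pearson interval falls out as
lo = binomial_cbox(2, 10, 0.025)[0]
hi = binomial_cbox(2, 10, 0.975)[1]
# ~(0.025, 0.556)
```

Because the bounds are ordinary distributions, such structures can be pushed through arithmetic with probability bounds analysis, which is the propagation machinery the abstract refers to.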

P.100  Data Resources for the Development of a Quantitative Microbial Risk Assessment for Norovirus in Foodservice facilities. Miranda R*, Schaffner DW; Rutgers, The State University of New Jersey

Abstract: Norovirus is a highly contagious virus and presents an increased risk to the elderly, young children, and the immunocompromised. Norovirus can be spread through contaminated food, water, and virus particles in vomit or feces. Norovirus is most commonly spread through direct contact, which can include shaking hands, caring for someone who is sick, or sharing drinks or utensils. Norovirus can survive on surfaces for weeks, and some disinfectants are less effective in eliminating the virus. This research project develops mathematical models to predict survival, spread, and cross-contamination of Norovirus in food production, processing, and handling environments from published literature data, and combines those models into a quantitative microbial risk assessment (QMRA) framework to assist in Norovirus risk management efforts. The data are extracted from tables or figures provided in the literature as they pertain to the conditions of interest. Data from the literature include viral shedding, survival of Norovirus on food and surfaces, prevalence in water and food, disinfectants and inactivation treatments, dose response models, hand hygiene, and environmental factors. Data from outbreaks are used for model validation. Validation will be considered successful if we are able to recreate past Norovirus outbreaks given reasonable assumptions regarding the starting conditions. The software platform used for the development of the QMRA is AnyLogic. This software supports system dynamics models, process-centric (or discrete event) models, and agent-based modeling. Its modeling language allows the description of complexity and heterogeneity at varying levels of detail. The software uses a graphical interface, tools, and library objects to model food retail, foodservice, or manufacturing, as well as human resources and consumer behavior. The object-oriented model design paradigm provides for a modular and hierarchical design and the incremental construction of large models.

P.101  Quantification of the Effect of 17β-estradiol on Escherichia coli and Enterococcus faecalis Survival and Persistence in Water. Mraz AL*, Weir MH; The Ohio State University   ALEXIS.L.MRAZ@GMAIL.COM

Abstract: Endocrine disrupting chemicals (EDCs) are known to have a disparate set of negative effects on human health, such as early onset puberty, infertility and certain cancers. In an attempt to remove EDCs from the environment, particularly water systems, the effects of bacteria on EDCs have been studied for decades. However, there is little information on the effects EDCs have on microbial communities. Through a greater understanding of EDCs’ effects on microbiology and microbial ecology we can develop a first step to a greater understanding of wider EDC effects and how best to combat them. For example, this can help inform EDC removal processes to combat human health effects caused by EDCs, and better understand the effects on the human gastrointestinal flora. Performing a survival analysis of Escherichia coli and Enterococcus faecalis when exposed to varying levels of 17β-estradiol and conducting a genetic analysis of those bacteria can inform the underlying health effects of EDCs, and the best way to treat EDCs before health effects occur. Modeling survival and persistence of common bacteria can provide an accessible projective model to inform future science and decision-making efforts to other researchers, scientists, engineers, regulators and the general public.

P.102  Evaluation of Salmonella survival and growth in Rehydrated Dry Pet Food. Qu Y*, Lambertini E, Buchanan RL, Pradhan AK; University of Maryland, College Park

Abstract: Salmonella enterica is one of the major foodborne pathogens linked to several disease outbreaks and product recalls. Salmonella has been reported to survive for extended periods in dry food products. Recent human salmonellosis outbreaks have been associated with dry pet foods and treats. Such outbreaks and product recalls have raised concern about these products as potential vehicles for pathogens, which can be infectious for both pets and their owners. Information on the ability of Salmonella to survive or grow in dry dog food in case of intentional or accidental rehydration is currently not available. The goal of this study was to characterize the behavior and evaluate the risk associated with the survival and growth of Salmonella in rehydrated dog food. Dry dog food formulations from different brands were rehydrated to a moisture content of 35% and stored at an ambient temperature of about 30°C for 3 days. A cocktail of several Salmonella serovars isolated from pet food or pet treats was used for the inoculation of the dog food. Sampling was done at 8-10 time points to derive Salmonella growth or decline curves. Time trends of Salmonella growth or survival were modeled by fitting the data points with suitable mathematical equations. This study was helpful in providing critical information to develop potential contamination prevention strategies for Salmonella in pet food.

P.103  Development of a risk model to predict Mycobacterium avium subsp. paratuberculosis contamination in bulk tank milk. Rani S*, Lambertini E, Pradhan AK; University of Maryland

Abstract: Infectious diseases in dairy cattle are of significant concern to dairy industries because of their huge impact on animal health, milk production, and economics. Mycobacterium avium subsp. paratuberculosis (MAP) is a pathogenic bacterium associated with Johne’s disease, one of the important endemic infectious diseases in dairy cattle. Milk can be contaminated with MAP through direct shedding in milk by cows (internal route), or through contact with feces or farm equipment (external route). Humans can be exposed to MAP via milk consumption, since there is evidence of its survival in milk after pasteurization. While current farm management practices aim to limit milk contamination, to date the relative importance of different contamination routes is poorly understood. The goal of this study was to develop a risk model of MAP transmission in dairy farms, with a focus on herd and farm environmental practices. A probabilistic modeling framework was used to predict MAP contamination from cows’ shedding (internal route), feces, feed, surfaces of the milking parlor, and other environmental sources into the bulk tank milk. The data were extracted from a survey of dairy farms across the U.S. and from published literature. The model estimated the probability and level of MAP in raw milk and the likelihood of MAP exposure to consumers.

P.104  Psychosocial intervention to strengthen community resilience to disasters. De la Yncera NC*, Lopez E, Lorenzo A; Universidad Autónoma del Estado de Morelos

Abstract: Adequately facing risky situations and disasters is a priority for our society, owing to the rise in their effects in recent years. This study presents the experiences and results of two investigations, one carried out in Cuba and the other in Mexico. The first is related to the application of the Innovation Project “The Improvement of Psychosocial Coping Styles in Emergency Situations and Disasters,” which was put into practice in the coastal community of Coloma, in the municipality of Pinar del Rio, Cuba (2010-2014); the second concerns work experiences in the municipality of Yautepec, Mexico, related to community resilience in facing disasters and natural threats (2015-present). In both cases, an action-research methodology is used for the actions planned. We concluded that the first stage of the study must be directed toward the psychosocial aspects that most of the time go unnoticed, are minimized, or are not considered in risk management. In this regard, social representations, risk perceptions, and ways of facing disasters have been explored. We observed different actions that we consider lessons learned. We have also looked into the protective factors that are activated by severe events, as well as the coexisting risk factors and vulnerability. We have detected some resilience pillars that activate during such adversities. We present some psychosocial strategies that contribute to resilience in order to avoid and/or minimize the damage provoked by disasters in every aspect of people’s daily life, as well as to contribute to community welfare.

P.105  Asbestos risk assessment modeling: what are the keys to “Carolinas’ mystery”? Korchevskiy A*; Chemistry & Industrial Hygiene, Inc.

Abstract: Asbestos health risk assessment modeling has progressed significantly during the last two decades, with several approaches proposed and validated (Hodgson and Darnton, 2000; Berman and Crump, 2008; Berman, 2011, 2013). However, all risk assessment models face an outlier problem with the asbestos textile cohorts of South and North Carolina. For example, according to Darnton (2011), the mesothelioma potency factor for the South Carolina asbestos textile cohort was 17 times higher than for Quebec chrysotile miners; for the North Carolina cohort, this ratio was equal to 10. We attempted to model mesothelioma potency factors for different types of fibrous minerals using a log-log regression and various characteristics of fibers as independent variables. It was demonstrated that the silicon oxide, magnesium oxide, and iron oxide (mostly ferric iron) content of fibers, along with their median aspect ratio, were strong predictors of mesothelioma potency (measured by the Hodgson and Darnton method), with R=0.98, P<0.01. The potency factor was demonstrated to be proportional to the aspect ratio minus three, raised to the power 1.75. The model allows potency to be reconstructed for Quebec chrysotile, amosite, anthophyllite, Libby amphiboles, crocidolite from South Africa and Australia, and erionite with good accuracy. At the same time, the airborne aspect ratio of chrysotile asbestos in textile settings was reported to be twice as high as in mining. This difference, based on the proposed model, can account for at least 60% of the excess potency factor for the South Carolina cohort, and for up to 100% of the difference for North Carolina workers. Other factors that could be responsible for the variation in the textile cohorts’ potency include underreported amphibole exposures, issues with mesothelioma diagnosis, and differences in gender susceptibility to respiratory effects, among others.

P.106  Health Risk Communication to a Non-Technical Workforce. Sexton KR*, Bhojani FA; Shell

Abstract: In manufacturing operational settings, where the workforce is largely non-technical and perceived risks are frequent, the seven principles of risk communication defined by the EPA should be carefully and appropriately applied. Recently, employees in a manufacturing site voiced concern over a potential health risk. Company physicians and scientists worked together to assess the risk and then tendered the results to local Health, Safety, and Environment (HSE) and medical staff to deliver to the employees. In the first presentations to the workforce, the presenters failed to build a high level of trust and credibility, recognize the emotional reaction from the audience, and collaborate with credible sources, thus violating several key risk communication principles. Mistrust was instantly evident as employees noted a potential weakness in the risk assessment and were suspicious of the methodology and results. They felt their concerns were not sufficiently addressed, so a second stage of risk assessment was undertaken. Restoring the relationship between the healthcare professionals and the workforce required many steps. First, Health staff were invited to all subsequent presentations in order to meet the employees and answer any questions about the methodology used to assess risk. Second, relationships were developed between site leadership, the local medical team, and the corporate scientists to deliver a unified message to the workforce, with all present for each discussion with employees. Finally, a collaboration between the scientists and an academic partner addressed the limitations of the initial risk assessment, and the final results presented to the workforce included this new partnership. This presentation will focus on what went wrong, how we corrected it, and how trust and credibility were re-established in a workforce initially skeptical of the corporate environment.

P.107  The risk assessment of radiation exposure and stochastic effect from Japanese Seafood for Taiwanese after Fukushima accident. Chen KW*, Chuang YC, Wu C, Wu KY; National Taiwan University

Abstract: On March 11, 2011, a magnitude 9.0 earthquake occurred off the east coast of Japan, triggering powerful tsunami waves. At the same time, the Fukushima Daiichi Nuclear Power Plant was disabled, leading to releases of radioactive material. Taiwan has maintained a close relationship with Japan, and both peoples have similar dietary habits. This study evaluated the additional radiation exposure from Japanese seafood and calculated the stochastic effect. Up to April 10, 2016, 84,140 pieces of Japanese food had been tested and 214 (0.25%) of them were found to be radioactive. Thirty-five were seafood and 144 were tea products. Nineteen foodstuffs tested positive only for their packages. The frequency of positive tests peaked at the end of 2011 and then decreased gradually. There have been no radioactive foodstuffs since October 2015. Eight pieces of foodstuffs were contaminated with I-131 during March and April 2011, which may reflect its short half-life (8.04 days). The mean level of Cs-134 in Japanese seafood is 0.01048 Bq/kg, with a 95% confidence interval of 0.008119 to 0.01528 Bq/kg. The annual intake of Cs-134 is 0.6794 Bq, with a 95% confidence interval from 0.04388 to 1.73 Bq. After considering the IPF (0.05), the mean annual intake of Cs-134 from seafood is 0.03397 Bq, with a 95% confidence interval of 0.002194 to 0.0865 Bq. The resulting radiation dose is 6.45E-07 [4.17E-08, 1.64E-06]. The mean level of Cs-137 is 0.01152 Bq/kg [0.009833, 0.01582]. The annual intake of Cs-137 is 0.758 Bq [0.03027, 1.995]. After considering the IPF, the mean annual intake of Cs-137 is 0.0379 Bq [0.001514, 0.09975]. The radiation dose is 4.93E-07 [1.97E-08, 1.30E-06]. The total radiation dose from Japanese food after considering the IPF is 1.14E-06 [6.14E-08, 2.94E-06]. The stochastic effect, calculated with the IPF of 0.05, is 4.78E-06 [2.58E-07, 1.23E-05]. The radiation exposure from Japanese seafood is much lower than the Codex intervention exemption level (1 mSv per year).
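The abstract's intake chain for Cs-134 can be reproduced directly from its quoted figures; a minimal check (the interpretation of the IPF as a multiplicative adjustment factor follows the abstract's arithmetic, since the acronym itself is not expanded there):

```python
# Values taken from the abstract.
mean_conc = 0.01048      # mean Cs-134 level in Japanese seafood (Bq/kg)
annual_intake = 0.6794   # annual Cs-134 intake before adjustment (Bq)
ipf = 0.05               # IPF applied multiplicatively in the abstract

# The implied annual seafood consumption underlying the intake figure.
implied_consumption = annual_intake / mean_conc   # kg of seafood per year

# IPF-adjusted intake, matching the abstract's 0.03397 Bq.
adjusted_intake = annual_intake * ipf

print(round(implied_consumption, 1))  # -> 64.8
print(round(adjusted_intake, 5))      # -> 0.03397
```

The same factor of 0.05 carried through the Cs-137 figures is what makes the isotope-by-isotope doses and the total of 1.14E-06 internally consistent.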

P.108  Safety Culture and Return to Work: Does Perception Matter? Gosen DG*, Shelton LM; Grenoble Ecole de Management

Abstract: Work injuries are burdensome and a major cause of time away from work. The financial burdens on both employee and employer resulting from missed days of work after an injury, together with the employee’s emotional pain, have fueled debate for over two decades. Returning to work after an injury goes beyond physical limitations to include social, psychological, and economic factors. Organizational policies and the social environment are also known to determine an employee’s return. Many of the reviewed return-to-work studies focused on reactive measures dealing with bringing an employee back to work after an injury takes place, and not on proactive safety measures like safety culture and its impact on employees’ perception of their organization. We propose that safety culture enhances perceived organizational support and results in positive occupational outcomes. In particular, we propose that employees perceive their organization’s safety culture as a form of care for their health and wellbeing, and reciprocate such behavior with favorable outcomes. We suggest that safety culture is directly and indirectly responsible for improving critical organizational outcomes, i.e., the number of days lost after an injury and the number of litigated cases, through the mediating effect of perceived organizational support. In addition, the proposed model examines the moderating effects of direct supervisors, since employees view them as representatives of the organization whose favorable behaviors indicate organizational support. We also suggest that safety culture is directly responsible for accident prevention and for mitigating the negative effects that accidents may have. This paper is unique in that it explicitly discusses an additional, often overlooked benefit of a strong safety culture: its enhancement of perceived organizational support. By explicitly recognizing that improvements in safety culture can reduce injury-related outcomes through the positive mediating influence of perceived organizational support, as well as by directly improving safety performance, organizations can better appreciate the value of investing in safety culture.

P.109  Risk estimation on hydrogen fueling station and surrounding area. Tsunemi K*, Kato E, Kawamoto A, Kihara T, Saburi T; National Institute of Advanced Industrial Science and Technology

Abstract: The aim of this study is to identify and quantify the human risk related to hydrogen explosions during the operation of a hydrogen fueling station. First, five types of accident were identified according to the volume of hydrogen leakage from the high-pressure hydrogen storage tank, and the event tree method was applied to estimate the probability of an explosion accident. Next, the maximum pressure and maximum impulse at the hydrogen fueling station and in the surrounding area were estimated using the FLACS software developed by GEXCON. Then, the consequence and risk of an explosion were estimated using existing fragility curves for damage to hearing, whole-body displacement, and head impact. As a result, the maximum pressure at the station was estimated at 6.9 kPa for the rupture type of accident, 6.0 kPa for the major type, and 1.3 kPa for the medium type. The maximum pressure gradually decreased from the station toward the surrounding area, and the maximum pressure at 50 m from the explosion point was less than half of that at the explosion point. The rate of damage to hearing was up to 81%, the mortality rate from whole-body displacement was up to 2.3 x 10^-12%, and the mortality rate from head impact was up to 0.017%. The individual risk of damage to hearing was up to 2.5 x 10^-2 year^-1, the mortality risk from whole-body displacement was up to 1.1 x 10^-16 year^-1, and the mortality risk from head impact was up to 3.7 x 10^-7 year^-1. Thus, the spatial extent of explosion effects was within a 200-300 m radius, and the mortality risk from explosion was under 10^-6 year^-1, which is a negligible risk level of concern.

P.110  Cumulative Risk Assessment for Occupational Health: Challenges and Solutions. Williams PRD*, Maier A; E Risk Sciences, LLP

Abstract: Novel methods, such as cumulative risk assessment (CRA), have received growing attention in environmental and community settings. However, CRAs conducted to date have generally not included the occupational domain, and occupational health assessments have typically not adopted a CRA approach. In the current analysis, we explore some of the key challenges and data gaps hindering the wider use of occupationally-based CRAs, and present a range of existing methods and tools that can be used to advance this approach in the near term. Specifically, we discuss methodological shortcomings and data limitations related to development of a uniform CRA framework, identification and inclusion of non-chemical stressors, establishment of a “common currency” for assessing the combined effects of stressor co-exposures, and creation of a clearinghouse or central database for exposure and effects data. We also highlight important regulatory, public policy, social, and ethical issues that may arise from performing CRAs in occupational settings. Additionally, we present several examples where existing tools can be used or modified for occupationally-based CRAs: 1) using mode-of-action hypotheses to identify potential stressor interactions, 2) using auditing checklist tools or worker surveys to document potential combinations of stressors for hazard identification, 3) adjusting occupational exposure limits in the dose-response assessment to incorporate scenario and population specific susceptibility factors, 4) modifying acceptability criteria in the hazard quotient as a refinement to the risk characterization, and 5) working with occupational health care providers and wellness teams to address combined effects of stressors. These approaches can be applied in a qualitative or quantitative way to begin to address the effects of occupational risk factors, coupled with stressors arising from non-occupational domains (e.g., personal factors, community exposures), on worker health.

P.112  Regulatory risk assessor perspective on the historical drinking water contamination at Camp Lejeune, NC. Haney JT*; Texas Commission on Environmental Quality

Abstract: Historical drinking water contamination at the U.S. Marine Corps Base at Camp Lejeune, NC is particularly notable because not only were the concentrations extraordinarily high, but an estimated 500,000 to 1 million people (e.g., civilian workers, military personnel and their families) may have been unknowingly exposed via water use (e.g., drinking and other household uses) over more than three decades. Although the former condition is not particularly rare in Superfund, large numbers of people (both military and civilian workers, adults and children) with substantial, long-term daily exposure (from years to perhaps decades, as opposed to a hypothetical exposure scenario) to highly contaminated water via drinking, food preparation, showering/bathing, and other pathways (e.g., baby formula preparation, in utero, workplace use) represent an approximate worst-case exposure scenario under which there is a greater than usual potential for adverse health effects. Historical trichloroethylene (TCE) and other chemical concentrations were sufficiently elevated to raise potential health concerns. For example, the 1952-1984 mean concentration (138 μg/L) exceeded the USEPA’s current TCE maximum contaminant level by 28-fold, with the corresponding dose (3.9E-03 mg/kg-day) exceeding all three candidate USEPA reference doses (RfDs) by 8- to 11-fold. The mean dose also exceeds supporting RfD values based on toxic nephropathy and increased kidney weight, as well as the point of departure (POD) for toxic nephropathy. Furthermore, estimated doses for 29% of the monthly averages and 34% of the 9-month rolling averages exceed the POD for the highest RfD, which is based on fetal heart defects. The incidences of nephropathy and fetal heart defects should be thoroughly evaluated among those exposed.
Long-term follow-up will be required to assess potential health effects for the 500,000 to 1 million who may have used the contaminated water at Camp Lejeune or were exposed in utero.
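The headline figures above can be reproduced with a simple screening-level calculation. The sketch below is illustrative only, not the author's method: the 2 L/day drinking-water intake and 70 kg body weight are assumed conventional adult screening defaults (not stated in the abstract), and the 5 μg/L MCL is the current USEPA limit for TCE.

```python
# Screening-level ingestion dose: dose = C (mg/L) x intake (L/day) / body weight (kg)
MEAN_TCE_UG_L = 138.0    # 1952-1984 mean concentration (from the abstract)
TCE_MCL_UG_L = 5.0       # current USEPA maximum contaminant level for TCE
INTAKE_L_DAY = 2.0       # assumed conventional adult drinking-water intake
BODY_WEIGHT_KG = 70.0    # assumed conventional adult body weight

dose = MEAN_TCE_UG_L * 1e-3 * INTAKE_L_DAY / BODY_WEIGHT_KG  # mg/kg-day
mcl_ratio = MEAN_TCE_UG_L / TCE_MCL_UG_L

print(f"dose ≈ {dose:.1e} mg/kg-day")            # ≈ 3.9e-03, as in the abstract
print(f"MCL exceedance ≈ {mcl_ratio:.0f}-fold")  # ≈ 28-fold, as in the abstract
```

With these defaults the arithmetic lands on the abstract's 3.9E-03 mg/kg-day dose and 28-fold MCL exceedance; dividing that dose by a candidate RfD on the order of 4E-04 to 5E-04 mg/kg-day yields the stated 8- to 11-fold range.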

P.113  Associate professor. Seo K*; Aoyama Gakuin University

Abstract: The catastrophic accident at the Fukushima nuclear power plant in 2011 sparked debate about energy policy in Japan. Although many people agree that nuclear power should be phased out in the future, they do not necessarily agree to withdrawing from nuclear power generation immediately. This paper reports the results of a student survey implemented in 2013-2014, when there were no active nuclear power plants in Japan. The question was “which do you think is the more realistic option, restarting nuclear power generation or not restarting?”. Students were also asked about their risk perception related to each scenario. The questionnaires were open rather than anonymous, and the results were analyzed together with the students' academic performance in the class. The academic records of the two groups were statistically different (p=0.02): students with relatively high records expected nuclear power plants to be restarted, and those with relatively low records expected immediate withdrawal from nuclear energy use. This result recalls the finding that, in the US, relatively highly educated people tend to support nuclear power generation (1). Students who answered that restarting nuclear power generation was realistic had a negative perspective on the feasibility of renewable energy use in the near future; those who answered that phasing out nuclear power generation was realistic worried about its acceptability to residents neighboring the power plants. Both groups were rather positive about improving safety technology, but neither was quite positive about improving the management system of nuclear power plants. 1. Greenberg M, Truelove HB. Energy Choices and Risk Beliefs: Is It Just Global Warming and Fear of a Nuclear Power Plant Accident? Risk Analysis. 2011;31(5):819-831.

P.114  Risk factors associated with cyberbullying in Chilean high school students. Gutiérrez VV*, Toledo MI; Universidad Diego Portales

Abstract: Bullying among young students is a well-known phenomenon; however, much less is known about cyberbullying. Cyberbullying has been described as harmful behavior that is: (1) intentional, (2) carried out “repeatedly and over time,” and (3) takes place in an interpersonal relationship characterized by an imbalance of power, using mobile phones and the internet (Olweus, 1999, p. 10; Smith et al., 2008, p. 376). Cyberbullying brings terrible consequences to those who suffer from it. Some research has shown that cybervictims can have low self-esteem, depression, poor academic performance and, in extreme situations, suicidal thoughts and attempts. Nowadays, when technology is available to everyone and adolescents are using it in an unprecedented way, cyberbullying has become a real concern for schools and policy makers. Little is known about the risk factors that engage students in cyberbullying behavior. Some studies have found that traditional bullying might be a risk factor for cyberbullying, meaning that traditional victims would also be cybervictims. On the other hand, while popularity among peers is thought to be a protective factor, time spent online, computer proficiency and usage of information technology are catalogued as risk factors. Most studies have been conducted in European and American student populations, but few in Latino populations. The aim of this study is to determine risk and protective factors of cyberbullying, focusing on victims. We surveyed 749 high school students from 31 schools in the Metropolitan Region of Santiago, Chile. We asked students whether they had been bullied in the last semester in different cyber contexts. We analyzed the frequency of cyberbullying against risk factors such as gender, age, self-esteem, popularity, traditional bullying, and frequency of and time spent online. Findings are discussed in terms of policy recommendations.

P.115  Risk factors of cyberbullying in 5th grade Chilean students. Ahumada W*, Gutiérrez VV, Toledo MI; Universidad Diego Portales

Abstract: Currently, the Internet, mobile technologies and web pages allow us to access information and communicate easily. But with the use of technology and the internet a new phenomenon can appear: cyberbullying. Cyberbullying can be defined as harm that one person or a group can inflict on other(s) through hostile, deliberate and repeated behaviors, using a variety of electronic media. The effects on cybervictims are diverse: depression, low academic performance, decreased self-esteem and even suicidal thoughts. The objective of this research is to determine risk factors associated with cybervictimization. We hypothesized that those factors could be gender, self-esteem, popularity on social networks and a prior history of bullying. The survey was administered to elementary school students in the city of Santiago, Chile; 841 students responded. The results indicate that the presence of cyberbullying is related to the studied risk factors. The results are discussed in terms of public policy.

P.117  Nuclear Risk Communication. Khan KJ*; Vienna University

Abstract: This paper is based on the forethought that the lack of communication between the scientific community and the general public can possibly be bridged through an intervention by development evaluation professionals, who dare to enter any critical situation to make an assessment on behalf of key stakeholders and to provide as accurate an account as possible of the causes and effects given the circumstances. Development evaluators also act as ‘Change Agents’ who propose and lead a new direction, particularly in the aftermath of a disaster. Independent evaluators enjoy the trust of the public vis-à-vis the government and multinational corporations. Evaluators’ hub of activities lies mostly in the non-profit sector, and their work touches the lives of people at every level. Their code of conduct shields them from hypocrisy and gives them the courage to speak out in the best interest of people. Thus was born the idea to initiate a dialogue between the scientific community and evaluators for disseminating information and building a knowledge base for public use on issues like nuclear risk. The paper deals with the vocabulary used in communicating and perceiving the risk associated with nuclear power plants in the aftermath of the Fukushima Daiichi accident resulting from a massive earthquake and tsunami in Japan in 2011. Two brief activities have been undertaken: a review of news and articles in selected newspapers issued on 11 March 2016 commemorating the five years since the Fukushima Daiichi Nuclear Power Plant accident, and a brief survey of a group of development evaluators recalling the day of the accident and their interaction with the scientific community during the past five years. The learning from the ‘content analysis’ of these two activities is captured in the paper to be orally presented.

P.118  Risk perception on health effects of EMF among high school students in Japan. Ohkubo C*; Japan EMF Information Center

Abstract: Risk perception of high school students (n=1006) regarding electromagnetic fields (EMF) and health effects was surveyed and compared with that of young adults (n=1224, 20-30 years old) as a control in Japan. The internet survey was conducted in June 2014. The questionnaire covered degree of risk perception of the health effects of EMF, items of concern in daily life, ill health effects imagined to be caused by EMF exposure, and the reliability of information sources. The high school students are more concerned about EMF and health issues (43%) than the control (40%); however, their scientific knowledge about EMF is lower (47%) than the control's (58%). Among the high school students, the ill health effects imagined to be related to EMF exposure were none, sleep disturbance, impairment in concentration, brain tumor, other tumors, childhood leukemia, and others, whereas those among the control were none, sleep disturbance, maldevelopment of the fetus, fetal malformation, brain tumor, infertility, and others, in descending order of frequency. The degrees of reliability placed on information from international organizations, government agencies, universities, research institutes, power companies, cellular phone companies, electronics companies, NPOs, and civic activists were all higher among the high school students than among the control. In summary, the high school students are slightly more concerned about EMF and health issues than the control but have poorer scientific knowledge of EMF, and their higher reliance on all information sources may reflect their more trusting attitudes. In conclusion, scientific information about EMF and health issues should be provided to students during their school days, before they are influenced by rumors.

P.119  “Weather Whiplash” — An analysis of alternating hydrologic events 1960 to 2014 and the associated representation of risk. Trumbo CW*, Peek LA, Laituri M, Schumacher RS, Mokry M; Colorado State University

Abstract: Due to climate change, areas of the world will undergo more frequent “weather whiplash” between drought and flood. In this pilot study we use a database of natural disaster property loss claims to examine the co-occurrence of drought and flood disasters. The goals of the project include a historical examination of how risk and other disaster-relevant concepts were represented in public reports. Using the Spatial Hazard Events and Losses Database for the United States, we accessed some 87,000 county-level records for financial losses from natural hazards over the period 1960-2014. The data were parsed to isolate floods and droughts, and a summary metric was computed to identify the cases in the top 80th and 95th percentiles for total losses (both crop and property damage, in 2000 dollars). By parsing the data by geographic area we were able to sort cases by date to select specific circumstances in which losses from floods and droughts occurred in spatial-temporal proximity. The analysis is proceeding to transform the original dataset into a time series of approximately 650 months, with multiple events represented in each month, and to include temporal and spatial metrics that may be used to calculate proximities. Once complete, these data may be analyzed using techniques such as geographically-weighted modeling, and visualized in GIS. Importantly, a more fluid time window may be examined. The final phase of this project will examine historical records such as news sources to gain insight into the risk-related social responses and consequences of these past events. Specifically, the socio-historical analysis will allow us to understand how individuals, households, policy makers and elected officials, and the private sector, for instance, responded to alternating weather extremes.
Moreover, this analysis will allow for further investigation into the myriad consequences (environmental, economic, social, and political) of these events.
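The percentile screen described above can be sketched in a few lines. This is an illustration under assumptions, not the project's code: the field names and the toy records are hypothetical stand-ins for SHELDUS county-level loss fields.

```python
# Flag loss records whose combined crop and property damage falls at or above
# the 80th (or 95th) percentile of total losses, as in the abstract's screen.
def loss_thresholds(records, pcts=(0.80, 0.95)):
    totals = sorted(r["crop_loss"] + r["property_loss"] for r in records)
    # Empirical percentile by rank; adequate for a screening cut like this one.
    return {p: totals[min(int(p * len(totals)), len(totals) - 1)] for p in pcts}

# Hypothetical county-level records (illustrative values, not SHELDUS data).
records = [
    {"county": "A", "hazard": "flood",   "crop_loss": 1e4, "property_loss": 9e4},
    {"county": "B", "hazard": "drought", "crop_loss": 5e5, "property_loss": 0.0},
    {"county": "C", "hazard": "flood",   "crop_loss": 0.0, "property_loss": 2e3},
    {"county": "D", "hazard": "drought", "crop_loss": 8e6, "property_loss": 1e5},
    {"county": "E", "hazard": "flood",   "crop_loss": 2e4, "property_loss": 3e4},
]

thresholds = loss_thresholds(records)
top = [r["county"] for r in records
       if r["crop_loss"] + r["property_loss"] >= thresholds[0.80]]
```

The same pass over the full 87,000-record extract, grouped by county and month, would yield the flood/drought co-occurrence cases the abstract describes.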

P.120  Seeking for your own sake: Chinese citizens’ motivation for information seeking about air pollution. Yang JZ*, Huang J; University at Buffalo

Abstract: Based on data collected from a panel of Chinese residents maintained by Qualtrics (N = 504), this study examined participants’ information seeking behaviors about air pollution through mass media, social media, interpersonal channels, and over the Internet in general in the past six months. Guided by the risk information seeking and processing (RISP) model, results from structural equation modeling indicate that consistent with the propositions of the RISP model, information insufficiency, negative affect, perceived information gathering capacity, and attitudes toward information seeking exerted significant positive influences on information seeking. Perceived hazard characteristics had a significant indirect effect on information seeking through information insufficiency. In contrast to recent research based on the RISP model, however, informational subjective norms (ISN) were not a significant predictor of information seeking. These findings suggest that for a risk topic (e.g. air pollution) that poses a direct threat to the health and wellbeing of the research population, individuals’ communication behaviors are driven by their own cognitive evaluations and affective responses, rather than social motivations related to others’ expectations of their information level about the issue (ISN). For theory-testing purposes, results from this study attest to the applicability of the RISP model to examining an important environmental issue in a Chinese context. In terms of practical implications, findings from this research suggest that public communications about air pollution in China should aim to foster a sense of urgency and to enhance self-efficacy among the target audience.

P.121  Bridging the gap: Exploring the role of situated distance cues in climate change visualization messaging. Schuldt JP*, Rickard LN, Yang ZJ; Cornell University, University of Maine, and University at Buffalo (SUNY)

Abstract: While communication research assumes that spatially proximal visual depictions of climate impacts (e.g., local flooding) are more effective than distal depictions (e.g., global sea level rise), psychological research complicates this view, pointing to additional context-dependent (“situated”) influences on distance perception. The extent to which any given climate-impact depiction feels near or far may depend on fleeting cues to spatial distance and the level of meta-cognitive fluency audiences experience when processing information about the location of depicted impacts. Such situated cues are relevant to climate change visualization efforts that are attracting increased attention as a means of portraying current or anticipated climate impacts to public audiences—depictions that routinely incorporate spatial-distance representations (e.g., maps) or fluency-relevant experiences (e.g., easy vs. difficult-to-pronounce place names). While understudied in this context, situated cues may alter the distance that audiences perceive between themselves and the impacted place or community, which in turn, may hold implications for key climate engagement outcomes (e.g., personal concern and climate policy support). To explore these concepts, we present U.S. participants with a map representing the distance between the U.S. and a region experiencing potentially catastrophic climate impacts (i.e., Republic of Maldives). Depending on condition, participants view a map that is designed to make the Maldives feel either relatively far or relatively close to the U.S. (e.g., depending on image size/resolution). All participants then watch the same video depicting climate change impacts in the Maldives, indicate their perception of climate change risk and support for climate change policy, and decide whether to donate money to a Maldivian environmental NGO. 
We will present key findings and discuss theoretical and practical implications for emerging climate visualization efforts and environmental risk communication more broadly.

P.122  Risky discourses: framing as a function of accountability in climate change editorials. Holley JR*; Cornell University

Abstract: This study engages the social contingency model to examine the extent to which rhetorical elements of emphasis frames are engaged within newspaper editorials about climate change when authorship is known vs. not known (i.e., high accountability vs. low accountability). Five frames are examined across 304 articles from the Las Vegas Review-Journal and the New York Times: attribution of responsibility, human-interest, conflict, morality, and economic consequences frames. Accountability coping mechanisms and audience dispositions (i.e., conservative and homogenous; liberal and heterogeneous) are considered relative to the extent to which frame elements are used by authors. Findings reveal the general tendency for authors within disparate political environments to invoke the human-interest and morality frames when authorship is known vs. not known. For the Las Vegas Review-Journal, the conflict frame was used significantly more when authorship was not known. Findings suggest individuals may cope with accountability by extending impression management techniques to include discourse that may be considered socially or culturally favorable to audiences despite political dispositions.

P.123  Of sea lice and superfood: A comparison of regional and national news media coverage of aquaculture . Rickard LN*; University of Maine

Abstract: As wild fisheries decline, aquaculture – the cultivation of aquatic organisms, such as fish, crustaceans, mollusks, and plants – will provide the majority of the seafood consumed in the U.S. Scientific and technological advances over the past three decades have made American aquaculture production increasingly environmentally sustainable and economically viable: a source of local jobs and affordable food. Yet, a legacy of environmental and human health concerns, and current controversy surrounding siting operations and the use of genetically modified species, suggest that perceived risks of aquaculture may loom large. As domestic aquaculture expands, knowing what U.S. publics think – in order to design strategic risk communication, and foster support for policy – will be increasingly critical to industry and government sectors alike. A news media content analysis can provide a critical first step toward gauging public opinion. The present study examines U.S. news media coverage of aquaculture over a ten-year period (2005-2015). To account for differences in aquaculture development and practices, we compare coverage (N = 493 articles) in four regional news outlets and four national newspapers for discussion of aquaculture risks, benefits, scientific issues, political/legal issues, and environmental sustainability, examining both prominence and co-occurrence of these themes over time. Consistent with past research, for the majority of the study period, risk dominated the aquaculture discussion in both the regional and national newspapers. News media coverage of aquaculture during the last three years, however, has also included increasing attention to benefits and sustainability, a pattern that may be attributed, in part, to the growing distinction of shellfish aquaculture in the U.S.
Comparing within and between regional and national newspapers revealed differences in thematic prominence that suggest that the conversation about aquaculture, rather than being monolithic, may vary geographically within the U.S. Implications and directions for future research are discussed.

P.124  Communicating the unfamiliar risk of ocean acidification to members of the public. Spence EM*, Pidgeon NF, Pearson PN; Cardiff University and Understanding Risk Group

Abstract: The effects of climate change on the marine environment are becoming more prevalent, including the novel risk of ocean acidification (OA). The absorption of anthropogenic carbon dioxide by the ocean and the resulting changes in ocean pH have already affected shellfish hatcheries and fisheries, which are vital livelihoods for some communities. As little research has been conducted on public risk perceptions of OA, we aimed to explore this through a mental models approach. We compared expert and public risk perceptions of OA in order to highlight areas of agreement, important knowledge gaps, and key misunderstandings. Through comparison of the different mental models constructed, we found low awareness of this risk, with many attributing OA mainly to pollution and the dumping of waste. Despite this, many identified that it would impact numerous organisms, resulting in marine ecosystems being altered. More generally, OA was perceived as a highly negative issue. A survey was conducted to establish whether these findings held in a wider population, as well as to explore numerous psychological factors including concern, psychological distancing and affect. We will also discuss why public perceptions and understanding of climate risks such as OA are relevant, and how these findings may inform future risk communication for members of the public.

P.125  The Perceived Risks and Benefits of Drones and Their Various Uses. Zwickle A*, Hamm J, Farber HB; Michigan State University and University of Massachusetts School of Law

Abstract: There are many different kinds of drones (Unmanned Aerial Vehicles) performing a wide variety of functions in our society today. Drones are a rapidly emerging technology whose growth has quickly outpaced the rules and regulations in place to govern their use. They are currently being used for recreational, research, civic, commercial, and military purposes and present benefits and risks at the individual, community, and national levels. In this presentation we present findings from a recent survey measuring the public’s perception of those risks and benefits, their trust in drone operators and regulators, and support for possible policies regulating the use of different types of drones. This research expands upon a recent survey conducted in Australia (Clothier et al., 2015), enabling us to make some cross-cultural comparisons. Our findings reveal that while the public is concerned about the threats that drones pose to both privacy and safety, they are split on what their mental image of a “drone” actually is. In fact, the term used to refer to drones has an impact on the American public’s attitudes towards them (as found in Australia). Furthermore, the level of concern regarding privacy and safety differs depending on what type of drone is under consideration and for what purpose it is to be used. Respondents were generally undecided on whether they would support a total ban on drones, but that level of support differed based on the type and use of specific drones. Their affective response towards drones was generally positive, but also varied significantly across drone type and use. As is common with emerging technologies, respondents demonstrated the affect heuristic through a negative correlation between perceived risks and benefits regardless of the kind of drone or its intended use. Overall, our survey showed a public that feels positive about drone technology while at the same time wary of certain ways in which drones can be employed.

P.126  Exploring the Acceptability of Human Induced Earthquakes. McComas K, Lu H*; Cornell University

Abstract: Earthquakes generate little positive affect and even less so when they are human induced. Even so, are some human induced earthquakes more acceptable than others, especially if they help to diversify energy portfolios and mitigate the effects of climate change? In response to this question, this paper presents data collected from a representative sample of New York state residents (N=800) from February to April 2016 on public acceptance of earthquakes depending on their causes. Not surprisingly, respondents felt significantly more negative about human induced versus naturally occurring earthquakes. Further, although no earthquake was deemed “acceptable,” respondents rated human induced earthquakes significantly less acceptable than naturally occurring ones. Some human induced earthquakes were, however, deemed more acceptable than others. Specifically, respondents felt significantly more negative about, and rated as less acceptable, earthquakes caused by the disposal of wastewater in wells related to natural gas development as compared to earthquakes caused by enhanced geothermal systems, groundwater extraction for agriculture, and carbon capture and sequestration. This finding is perhaps related to the ongoing controversy in New York State related to the development of natural gas in the Marcellus Shale, although this linkage was not explicit in the survey. It also may be due to the perceived benefits of the process; however, the survey connected each human-induced earthquake to some type of benefit, including providing a source of energy to local communities. Finally, procedural justice mattered to human-induced earthquake acceptability, as acceptability was significantly higher when people believed that people like them had a voice in the decision and significantly lower when they learned that the decision was entirely expert-driven. This finding underscores the importance of public involvement in decision making.

P.127  Effects of climate change on Malian farmers. Wooten EK*, Rivers L; North Carolina State University

Abstract: Food security in Africa has become increasingly unstable in recent years. Its absence threatens the capacities and developmental abilities of individuals, households, communities, and states. Seventy-five percent of African nations are classified as at extreme or high risk of food insecurity. Reasons for such high risk include war, disease, poverty, and climate change. In Mali, a largely agrarian country, the risk of food insecurity is severe. One reason for this risk, and the focus of our project, is the effect of climate change on the region. Climate change in Mali is being seen as the Sahelian climate of the north shifts into the Sudanian climate zone located in the south, the region where most of the country's agriculture takes place. In this area the climate shift has led to increased rainfall variability and made predicting the start of the rainy season difficult for Malian farmers. These changes leave farmers facing uncertainty in their planting decisions, which is critical because agriculture in Mali is 93% rain fed. Overall this is leading to yield reductions. With most agriculture in the country being subsistence farming, crops are used to feed farmers’ large polygamous families, which can exceed a hundred people. Trying to feed large families with decreasing yields has made farmers’ food security more precarious. Our objective is to look at how climate change, by making food security more uncertain, is affecting farmers’ lives. To do this, 42 individual open-ended interviews were conducted with farmers in the Koutiala area of Mali. We will qualitatively code these interviews to examine how farmers’ lives are being altered by climate change and its effects on their food security. Our work will contribute to the food security literature on the link between climate change and food security in the context of Malian farming.

P.128  Enhancing Environmental Risk Assessment with the Protocol for Assessing Community Excellence in Environmental Health. Bartlett R*; California Department of Public Health

Abstract: Organizations working in communities confronted with social and environmental stressors should apply balanced approaches that assess technical aspects of contamination and promote broad community collaboration. From October 2014 through April 2016, the California Department of Public Health (CDPH) facilitated the Protocol for Assessing Community Excellence in Environmental Health (PACE EH) in a Los Angeles-area city challenged by environmental justice concerns. CDPH implemented PACE EH while conducting public health assessments (PHAs) at two Superfund sites located in the city. PHAs assess possible health risks to communities resulting from hazardous waste site contamination. PACE EH unites communities and health agencies to evaluate and address community environmental health concerns collaboratively. Per PACE EH guidance, CDPH formed and cultivated a “Community Environmental Health Assessment Team” (CEHAT), a committee composed of community residents, local organizations, and government stakeholders. The CEHAT gathered the public’s environmental health concerns using a CDC-based qualitative survey. The CEHAT used the survey data to rank, assess, devise strategies, and carry out action steps that are currently addressing the community’s most vital concerns. CDPH was able to foster robust support for the PACE EH process from community residents and governance at the local, state, and federal levels. In the short term, the acquired multi-level trust allowed CDPH to: 1) seamlessly assess community concerns, 2) evaluate environmental contamination, 3) promote health information, and 4) quickly acquire feedback. CDPH is integrating portions of the PACE EH results into the PHAs.
The broader outcomes resulting from PACE EH are the community’s expanded capacity to address their most important environmentally related concerns and the creation of a positive space where state and federal agencies can effectively cooperate to conduct clean up and health promotion work in the future.

P.129  Tornado risk perceptions in response to warning polygons. Huang S-K, Jon I, Lindell MK*; University of Washington

Abstract: The National Weather Service (NWS) has replaced county-wide warnings with smaller warning polygons to provide people with more specific information about tornado threats. Previous studies of strike probability (ps) judgments in response to tornado warning polygons have found that people infer ps is highest at the polygon’s centroid, lower just inside the polygon edges, still lower just outside the polygon edges, and lowest (but not zero) in locations beyond that. However, it is unclear whether a warning polygon, together with additional information such as radar images of storm cells, would affect ps ratings. Thus, 167 participants were presented with 23 hypothetical warning polygons in 3 different scenarios. In the first scenario (warning polygon only), the distribution of ps ratings replicated findings from previous studies. In the second scenario (warning polygon plus radar image of a major storm cell with a hook echo), ps was highest at both the polygon centroid (M = 4.3 on a 1-5 scale) and the polygon edge closest to the storm cell (M = 4.4, t214 = .40, ns), lower just inside the other polygon edges (M = 3.5, t322 = 5.87, p < .001), still lower just outside the polygon edges (M = 2.7, t700 = 10.76, p < .001), and lowest beyond that (M = 2.3, t700 = 6.26, p < .001). The results in the third scenario (warning polygon, major storm cell with hook echo, and two minor storm cells adjacent to the main storm cell) were similar to those in the second scenario. Overall, these data suggest people judge their risk more accurately when they see a warning polygon in the context of a radar image of the storm cell on which the polygon is based. They also confirm that, contrary to NWS guidance, people perceive they are at risk of a tornado strike even if they are outside a tornado warning polygon. This information can help meteorologists to better understand how people interpret the uncertainty associated with warning polygons and, thus, improve tornado warnings.
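The location-by-location comparisons of mean ps ratings reported above are independent-samples t tests. As a hedged illustration (not the authors' analysis code), a minimal Welch t statistic on hypothetical rating vectors looks like this; the rating values below are invented, not the study's data.

```python
import statistics as st

def welch_t(a, b):
    """Welch's t statistic for two independent samples with unequal variances."""
    va, vb = st.variance(a), st.variance(b)
    return (st.mean(a) - st.mean(b)) / (va / len(a) + vb / len(b)) ** 0.5

# Hypothetical 1-5 strike-probability ratings just inside vs. just outside a
# polygon edge (illustrative values only).
inside = [4, 3, 4, 5, 3, 4]
outside = [3, 2, 3, 3, 2, 2]
t_stat = welch_t(inside, outside)
```

A positive t_stat indicates higher mean ratings inside the edge than outside; with the study's sample sizes the resulting degrees of freedom are what appear as the subscripts in the reported statistics (e.g., t322, t700).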

P.131  Differences in Risk Perceptions about Medical Practices among General People and Health Professionals. Yuko A*; Tokaigakuin University

Abstract: Medical practices have brought many benefits to human health and vigor. However, people often try to avoid medical practices if they can, since these practices also carry risks, such as dread felt by the participant or future impacts. This phenomenon can be viewed as a problem of risk perception. Distinct characteristics of risk perception have been obtained for various other types of hazards (e.g., traffic accidents, biotechnology, nuclear power) in previous studies; for example, when nuclear waste began to be recognized as a hazard, its perceived risk increased, especially because the issue was considered unnatural and even immoral. Therefore, we expect that specific features will be observed for medical practices as well. Furthermore, grasping the differences in risk perception between lay people and medical experts will lead to better risk communication between medical staff and patients and smoother communication among the various medical professionals treating a patient together (i.e., team medical care). Additionally, risk perception is an important factor in accepting risks related to medical practices. Since many countries aim to improve their medical check-up rates, findings related to risk perception will also be helpful in achieving this goal. We examined the differences in risk perception between medical experts (doctors, nurses, and pharmacists) and lay people who did not work as health professionals through a questionnaire. We surveyed 677 Japanese adults recruited via the Internet. We targeted 17 relatively well-known medical practices (e.g., protective vaccination, blood transfusion, X-ray tests). After asking whether participants had heard of the medical practices (awareness exceeded 70%), we asked them questions about the risk dimensions of practices they were familiar with (e.g., old/new or fatal/not fatal). In this paper, we report the characteristics of risk perception for each group.

P.132  IPCC reports on Climate Change and Media: Comparing media coverage of IPCC AR4 and AR5. AOYAGI M*; National Institute for Environmental Studies

Abstract: This paper compares and discusses two IPCC reports, AR4 (2007) and AR5 (2013–2014), in terms of their media exposure and public perception in Japan. According to a newspaper database, newspaper coverage of climate change increased in both periods in the world as a whole, though the peaks were longer and higher in 2007. In Japan, however, there was a huge peak in 2007 but little or no peak in 2013–2014. We explore why there was so little media coverage of the IPCC report in 2013–2014 compared with 2007. Our hypothesis is that this relates to the revision of the Japanese Basic Energy Plan, which was deeply connected with the Japanese economic situation at that time. The Japanese government submitted a tentative 2020 greenhouse gas reduction target (-3.8% compared to 2005) on November 29, 2013. The revised Basic Energy Plan was approved by the cabinet and released on April 11, 2014, just after the Japanese government hosted the IPCC WG2 general meeting in late March in Yokohama. This was followed by the greenhouse gas reduction target for 2030 (-26% compared to 2013) on July 17, 2015. Interestingly, Prime Minister Abe was in office when both AR4 and AR5 were released. During the latter period, the so-called "Abenomics" policy was implemented and appeared to work well; the unemployment rate improved. We analyze these changes in target levels, the Japanese economic situation in those periods, energy policy, media coverage of climate change, and public attitudes towards climate change.

P.133  FrackMap: A Tool to Communicate about Fracking and Potential Environmental and Public Health Impacts in the United States. De Marcellis-Warin N*, Backus A; Harvard Center for Risk Analysis, Harvard T.H. Chan School of Public Health, Polytechnique Montreal and CIRANO

Abstract: In recent years, unconventional oil and gas development (including hydraulic fracturing and horizontal drilling) has increased exponentially across the United States. These activities have raised concerns in some communities about potential environmental and health impacts, especially for people living in communities proximate to hydrofracking sites. FrackMap was created using the Harvard WorldMap, a public-domain collaborative mapping platform. FrackMap brings together a range of fracking-related datasets (oil and gas permits, shale formations, horizontal legs, reports of specific chemicals used, etc.). Moreover, we are in the process of adding several layers to the map. In particular, a new layer will help visualize the current scientific knowledge and peer-reviewed literature about potential environmental and health impacts associated with U.S. shale gas plays. We identified peer-reviewed articles published during the last 10 years that include location data, and we map them by state and by shale play. Another layer will map geolocated tweets in the US using keywords and hashtags such as #shalegas, #fracking and #hydrofracking, as well as specific topics such as #frackquakes, #frackingwastewater, etc. These tweets provide interesting information about people's feelings and risk perception. FrackMap is an innovative tool for communicating through maps and interactive data visualization.

P.134  Communicating threat and efficacy through the media: An analysis of news broadcasts about the Zika virus. Olson MK*, Sutton JN, Vos SC; University of Kentucky

Abstract: News media, such as local or national news programs, are often tasked with providing risk related information to their audiences. This subsequently affects audience awareness and protective and preventative behaviors (Neuwirth, Dunwoody, & Griffin, 2000; Parrot 1996). The Zika virus has become a popular news topic due to its impact on maternal and infant health and its potential to spread within the United States as seasons change. Therefore, the news media has an increased role in communicating information not only about the threat of the Zika virus, but also how audiences can protect themselves. Using content analytic procedures, this study applies the Extended Parallel Process Model (EPPM) to assess the extent to which the news media presents EPPM message elements and seeks to provide a comprehensive depiction of how threat and efficacy messages are presented to audiences. We start by examining CDC media statements to identify what threat information and recommendations have been provided nationally and internationally. We then analyze three months of evening news segments in 2016 from national broadcast and cable news channels to assess how news media present severity and susceptibility (i.e., threat) of contracting Zika and self and collective efficacy information. In general, we find that news stories focus on details about virus transmission and population susceptibility but fail to communicate efficacy information present in CDC press releases to audiences. Such a lack of information may lead to maladaptive or unproductive behaviors in response to the Zika threat. In contrast, effective risk communication will contain information that promotes efficacy, leading to self protective behaviors and collective preventive actions.

P.135  Investigating risk communications in the Fukushima-Daiichi NPP accident. Tsuchida S*; Kansai University

Abstract: Risk communications (crisis communication, care communication, and consensus communication) conducted mainly by the Japanese government were investigated. A) Crisis communication: During the crisis, most of the evacuees from the contaminated area around the Fukushima-Daiichi NPP did not have sufficient information about the evacuation; they did not know how or where to go. No information about the distribution of radioactive contaminants was provided by the government on the first and second days of the accident. Results of a mail survey of the evacuees showed that 26.0% of them obtained no information about the evacuation at that time. Many people of Minami-Soma city and Namie town were unable to choose a sound evacuation route and headed to and stayed in Iidate or Kawamata, which were among the most heavily contaminated areas. TEPCO was barred by the national government from holding independent press conferences at the end of April, and it lost its position as the primarily responsible risk communicator. The government and TEPCO made no effort to communicate with the public to seek support in overcoming the accident. B) Care communication, accompanied by medical checks of tens of thousands of evacuees, was needed from the beginning of the accident; the National Institute of Radiological Sciences (NIRS) and the Japan Atomic Energy Agency (JAEA) provided it. A year after the accident, the Japanese government organized a committee of officials from all the ministries, chaired by the Minister of the Environment, to make plans for care communication with the victims. Children and parents were the main communication targets, and teachers, local government workers, and medical staff were wanted as the care communicators. C) No reports of official consensus communication by the government concerning NPPs were found after the Fukushima-Daiichi NPP accident. However, the restarts of the Sendai, Takahama, and Ikata NPPs were officially accepted by each local government.

P.136  Extreme weather and climate change: The role of media use and interpersonal discussion in the formation of risk perceptions about climate change. Anderson AA*; Colorado State University

Abstract: Individuals’ experiences with weather play an important role in how they perceive climate change (e.g., Brulle, Carmichael, & Jenkins, 2012; Joireman, Barnes Truelove, & Duell, 2010). In particular, extreme weather events are associated with perceptions that climate change is risky (Leiserowitz et al., 2014). And perceptions of weather experiences are connected to beliefs that climate change is occurring (Borick and Rabe, 2014). Few studies, however, have analyzed how communication around extreme weather events shapes attitudes about climate change, although some studies have shown that local television weathercasters can enhance climate change awareness (Anderson et al., 2013; Bloodhart, Maibach, Myers, Zhao, & Ebi, 2015; Zhao et al., 2013). In this study, I use a statewide survey of Coloradans (n = 863) following a state-wide flooding event in 2013 to analyze how communication of the extreme weather event shaped climate change risk perceptions. An ordinary least squares linear regression shows evidence that, after controlling for general beliefs about climate change, several sources of communication are related to increased risk perceptions of climate change. These communication sources are: 1) attention to news about extreme weather in general, 2) attention to flood-specific news, 3) face-to-face discussion about the flood, and 4) social media discussions about the flood. This study expands existing scholarship on the relationship between weather and climate change perceptions by pointing to the important role of media use and interpersonal discussion during weather events.

P.137  The Relationship Between Stigma and Public Acceptance of Food Products: An Example of Chewy Starch in Taiwan. Wu CY, Huang SZ*, Wu HC, Wu KY; National Taiwan University

Abstract: In 2013, a prohibited food additive (maleic anhydride) was found in chewy starch, a commonly used ingredient in several Taiwanese cuisines, causing widespread panic after media coverage in Taiwan. The objective of this study was to investigate whether the food safety incident had stigmatized food products that contained chewy starch, and to explore how stigma, risk and benefit perceptions, trust toward government and food industries, and dose-response sensitivity affected the acceptance of the food products. A total of 714 Taiwanese participants aged 16 to 45 were recruited for a questionnaire survey. Path analysis using LISREL was employed to analyze the direct and indirect relationships in this model. The risk and benefit perceptions were defined as mediators to predict the level of food acceptance. The results showed that stigmatization had a significant effect on both mediators: a positive effect on risk perception (β = 0.51) and a negative effect on benefit perception (β = -0.341). Dose-response sensitivity had a negative association with risk perception (β = -0.15). Trust towards the food industry had a negative association with risk perception (β = -0.07) and a positive association with benefit perception (β = 0.25). The direct association between risk perception and food acceptance was not significant, suggesting its effect needed to be mediated by benefit perception. The study showed that the two major influential factors were stigmatization and benefit perception. Stigmatization had both direct and indirect effects on food acceptance, whereas benefit perception had a direct effect on acceptance. Trust had little impact on the level of acceptance through the mediation of perceptions. These findings suggest that management of such crises should consider strategies that acknowledge and address the potential cost of stigmatization and minimize its impact through other means, such as increasing the perceived benefits of the stigmatized food.

P.138  The Role of Risk Attitudes in the Reception of Risk Information for Risk Mitigation Strategies in Wildfire. Walpole HD*, Wilson RS; The Ohio State University

Abstract: Although evacuation is the preferred method in the US for preserving public safety in wildfire, alternatives such as “staying and defending” have garnered interest due to their increased rates of structure survivability and reduced strain on public safety resources and evacuation routes. While interest has grown in why some people choose to evacuate and others prefer to stay and defend, little research examines whether or when communication interventions can be effective at altering a person’s intentions in this context. Using an online sample (n = 274), we conducted an experiment assessing the effects of information detailing the benefits of different strategies on intended evacuation behavior and associated perceptions of the risks and benefits of those strategies. Drawing on previous research exploring the role of domain-specific risk attitudes in evacuation decision making, we hypothesized that the effectiveness of information would be conditional on the domain-specific risk attitudes of the participants. Specifically, we hypothesized that more tolerant attitudes towards a domain of risk would reduce the effectiveness of information at altering risk and benefit perceptions and, ultimately, behavioral intentions to engage in a strategy that reduces risks in that domain. Our results indicated that pro-evacuation information had no effect on perceptions or intended behavior regardless of risk attitude; however, pro-defense information affected behavior through increased perceived benefits of defense. This effect was present only for those with lower tolerance for property risks. Understanding how a resident’s risk attitude contributes to the effectiveness of risk messages can help us understand why some communications fail and help us craft messages with broader appeal to those at risk from wildfire.

P.139  Public cues to relative credibility of disputing scientists. Johnson BB*; Decision Research; University of Oregon

Abstract: When large groups of scientists disagree over the causes or consequences of a phenomenon, public certainty, trust in science, or cooperation with expert advice can suffer. One little-examined issue in understanding public interpretations of and reactions to such disputes is the nature of cues laypeople might use to decide which position in the dispute is more likely to be correct (e.g., nanotechnology is on balance beneficial or risky; dietary salt intake should be reduced for those at risk or for everyone). Cues might include such categories as use of scientific method, credentials, values, experience, and the proportion of scientists on each side. Controlling for other variables that might critically affect public responses to a two-sided debate (e.g., ideology; prior position on the issue; understanding of scientific reasoning; science mistrust), this paper reports survey and experiment results that explore the effects of varied cues manipulated in mock news articles about real scientific disputes. Dependent variables include choices of the “correct position,” trust in the dueling scientists, and support for research funding, among others.

P.140  Disaster preparedness and natural disasters in Canada: A mixed-method inquiry of Canadians’ experiences. Yong AG*, Lemyre L, Pinsent C, Krewski D; University of Ottawa

Abstract: Urbanization, population growth and varied physical features in Canada have increased the risks of Canadians experiencing a significant loss from natural hazards. This highlights the need to increase disaster preparedness in Canada. In view of better risk communication and management, this study aimed to understand the Canadian public’s natural disaster risk perception and preparedness responses, as well as the context and meaning in their decision-making, using quantitative and qualitative methodology. We conducted a nationally representative survey of Canadian adults (N = 3,263) who reported their risk perception, preparedness behaviours and risk beliefs using 5-point Likert ratings. Then, semi-structured interviews on natural disaster risks and issues in Canada were conducted. Results showed the Canadian public’s risk perception was moderate (M = 2.76, SD = 1.06). As well, their level of routine pre-disaster preparedness (e.g., keeping an emergency kit) remained low (M = 2.37, SD = 1.08). However, they were inclined to follow evacuation recommendations (M = 4.31, SD = .88) and were likely to have people search for them post-disaster (M = 3.68, SD = 1.41). Linear regressions revealed the Canadian public’s disaster preparedness was driven by three underlying risk belief systems: External Responsibility for Disaster Preparedness, Self-preparedness Responsibility, and Illusiveness of Preparedness. Interview findings showed the Canadian public’s decision-making included: (a) downplaying the risk in Canada, (b) lack of urgency, knowledge and information in disaster preparedness, and (c) overconfidence in the Canadian government to provide adequate care. Results suggest risk management and communication should focus on clarifying the responsibility of individuals, communities and government in disaster preparedness, as well as using context-based knowledge in public awareness and education programs. Theoretical and practical implications will be discussed.

P.142  Examining factors influencing risk perceptions of hydropower. Mayeda AM*, Boyd AD; Washington State University

Abstract: Public opinion is increasingly recognized as a critical factor in the development and management of energy systems. The perspectives of those living near current or proposed projects are particularly critical to assess because these residents may have a greater interest in the project and potentially have more input into the siting of the technology. A systematic review of quantitative and qualitative empirical research published between 1980 and 2015 was conducted to synthesize and consolidate the results of studies that examined public perceptions of hydropower. The review involved searching databases and journals using multiple keywords and synonyms for hydroelectricity and perceptions. The initial searches yielded 12,398 articles. Sixteen of these articles met the criteria for inclusion and were examined further to assess the factors associated with the support for or opposition to hydroelectric dams. Factors influencing public perceptions of hydroelectric dams included: (1) public participation and consultation in hydroelectric dam development; (2) availability of information about the energy source to members of affected communities; (3) socio-economic impacts associated with hydroelectric dams; and (4) environmental and ecological impacts of the technology. The findings from this review will provide insights for future research to help guide the development of more effective risk communication research and policy development in this area.

P.143  Media Coverage of Mercury Contamination in the Arctic. Fredrickson ML*, Boyd AD, Furgal C; Colorado School of Public Health, Washington State University, Trent University

Abstract: Media can affect public views and opinions on environmental hazards and public health. This is especially true of issues that are relatively unknown or poorly understood by many affected populations, such as the risks associated with mercury contamination. It is now well recognized that mercury is a global issue, with environmental levels reaching significant concern even in distant parts of the globe, such as the Arctic. There is an increasing need to communicate about the risks of contaminants both to the Indigenous populations who live in these regions and to those elsewhere who may be able to influence policy and public discourse on the issue. To better understand how the issue of mercury contamination in the Arctic has been presented in the media, a content analysis was conducted across fourteen newspapers in the Canadian North and South. During the past decade, a total of 10,424 articles from the 14 newspapers contained the word ‘mercury.’ Articles focusing on mercury contamination in the Arctic were analyzed in detail. There were a total of 81 relevant articles, with 58 appearing in northern publications and 23 in southern publications. This study analyzed how different news sources presented the health risks of mercury in the Arctic, how mercury was defined, whether any pathways of personal efficacy were provided, who was quoted as an information source, and where the article originated. Results demonstrate that very few Indigenous people were cited as sources, the articles often failed to describe mercury at all, and many did not provide direction to support personal efficacy. Results are discussed in relation to agenda-setting theory, which asserts that the media set the agenda for what the public thinks about. This study provides insight into how communicators can improve the development of environmental health risk messages.

P.144  Digital risk perception and communication unplugged: Twenty years of data processing. Wardman J K*; University of Nottingham

Abstract: Digital advances in information and communication technologies have markedly expanded the repertoire of risk management tools and resources that are contemporarily available to researchers and practitioners. Yet, with few notable exceptions (e.g. Bostrom 2003), the transition from ‘analogue’ to ‘digital’ risk management over the past twenty years has passed by with surprisingly little critical examination. Following Fischhoff’s (1995) and Leiss’s (1996) respective works outlining key developmental stages traditionally associated with risk management learning and practice, this paper offers a timely assessment of current advances and trends which I term ‘digital risk perception and communication’ (DRPC). The paper identifies a series of focal DRPC strategies that researchers and practitioners now ‘hope will do the trick’, and further argues that DRPC characteristically marks a ‘Fourth Phase’ in the evolution of risk communication as contrasted with the three preceding phases previously specified by Leiss (1996). Current progress in the consolidation of DRPC skills and capacities is discussed along with early lessons learned about how far each focal strategy will go when confronting risk management problems in the information age. It is concluded that DRPC has unquestionably increased the various capacities of organisations to responsibly manage risk with exacting levels of detail, powerful means of social persuasion, and unprecedented public engagement. However, this is yet to widely translate into the execution of sound and ethical risk communication as a matter of good practice, and innovations in DRPC may prove in certain instances to be disruptive to societal expectations for the conduct of risk management to empower citizens.

P.145  Urban Parks as the Nexus for Neighborhood Vulnerability and Resilience. Winter PL*, Milburn LA, Li W, Padgett PE; USFS, Pacific Southwest Research Station

Abstract: Urban parks serve as the nexus for this ongoing inquiry into neighborhood vulnerability and resilience. A number of environmental and social effects are anticipated from climate change. Urban parks, home to urban forests, offer myriad ecological, social and economic benefits to surrounding neighborhoods, including benefits that will help buffer impacts from climate change. Four communities, two affluent and two disadvantaged across a number of socioeconomic dimensions, are the focus of our analysis. For this study, neighborhood is defined as those street segments lying within a half-mile radius of two urban parks within each community. California EnviroScreen data show the dramatic difference in pollution burden for the eight neighborhoods. Observations of park use paired with assessments of ozone exposure further highlight issues surrounding community well-being. To render a more complete picture of neighborhood condition, the physical environment was rated through an application of Google Earth Pro and Street View. Independent raters assessed street segments using the Active Neighborhood Checklist. Typically applied to inquiries in community health and active living, the ratings reflect the mix of land uses in a neighborhood (e.g., residential or commercial structures), non-residential uses (e.g., markets or educational facilities), visible street characteristics, and environmental quality (e.g., presence of public art or litter). In sum, the findings inform a greater understanding of current community conditions as well as areas where interventions are most needed. More broadly, the neighborhood context offers a framing that is meaningful to community members, NGOs, and policy makers capitalizing on advances in climate change communication science.

P.146  Communicating visual risk: Threat, efficacy, and emotion in SNS messages about Zika. Vos SC*, Sutton JN, Olson MK; University of Kentucky

Abstract: Existing empirical evidence suggests that images can communicate risk information as effectively as text and, in some cases, more effectively (Chang, 2013). However, little empirically based guidance exists as to how visual risk messages should be constructed, even though researchers (e.g., Bostrom, Anselin, & Fairris, 2008; Lipkus & Hollands, 1999) have repeatedly called for more research and theory development. These calls came before the advent of visual risk communication on social networking sites (SNS), like Facebook and Twitter. These relatively new platforms facilitate the sharing of visual risk messages. The sharing of messages, or amplification, is a key measure of message success on social media, as each time a message is shared it is exposed to a new group of users. The growing use of images in SNS risk communication raises questions about which visual message elements are more effective for increasing message retransmission. Previously we have found that including an image increases the amplification of Ebola risk messages. However, this work focused on the type of visual communication and did not examine the message communicated by the image. In this poster, we draw on the extended parallel process model (EPPM) to examine how risk is communicated visually by public health organizations on SNS during a public health crisis. We use the current Zika crisis to examine visual risk messages distributed during a six-month period in 2016 by 536 Twitter accounts that represent federal, state, and local public health agencies. We conduct a content analysis of the types of visual elements used to communicate risk, coding for the visual communication of threat, efficacy, and emotional appeals. We analyze the effect of these variables on amplification, using negative binomial regression to model how these message elements contribute to message retransmission. We use the results to build on previous research and begin theorizing the visual communication of risk.

P.148  The chronological change of consumer anxieties and concerns related to radioactive contamination of foods in Japan: applying the text mining approach. Yamaguchi H*, Shintani K, Hamada NS; National Institute of Health Sciences

Abstract: Since the accident at the Fukushima Daiichi nuclear plant, the anxieties and concerns of Japanese consumers related to radioactive contamination of foods have increased. The purpose of this study is to analyze the chronological change in consumers’ anxieties and concerns about radioactive contamination of foods by applying a text mining approach. We used text data from consumer inquiries submitted through “Hitokoe Seikyo,” the two-way communication system of the Consumer Co-operative, provided by the Tohto Co-operative. First, we performed frequency analysis, cluster analysis and co-occurrence network analysis of the text data on an annual basis using the text mining approach. We then conducted syntax analysis and identified the main anxieties and concerns of consumers. As a result, three consumer anxieties or concerns were extracted: (i) the safety of the provisional reference values in Japan, (ii) the need for voluntary efforts by the Co-operative, and (iii) requests for origin labeling of food. This suggests that the main topics of the anxieties and concerns did not change over the years, although the volume of data fell to one-tenth within two years of the accident.

P.149  Trust shaped through knowledge and elaboration: Considering the attitude strength properties of trust. Song H*; Cornell University

Abstract: Theories of trust in risk communication often characterize trust as a cognitively effortless judgment based on heuristics such as value similarity or affect. Although research has well demonstrated that trust enables individuals to make judgments about environmental hazards without undertaking complex risk analyses, this should not be confused as meaning that judgments about trust are intrinsically shaped through heuristic information processing. Individuals who find themselves incapable of dealing with technical complexities may nevertheless invest considerable effort in making accurate judgments about risk managers, provided that they are sufficiently motivated. It is important to consider cognitively effortful forms of trust because research suggests that attitudes rooted in substantial knowledge and elaboration tend to be stronger. Stronger attitudes, by definition, are more persistent over time, resistant to counter-attitudinal persuasion, and impactful on subsequent cognitive and behavioral states. When trust is weak, in contrast, the public may easily withdraw their trust or be reluctant to act upon the trust to express support. To address this gap, researchers can focus on risk communication processes that enhance knowledge and elaboration about risk managers. People can acquire rich knowledge about their risk managers through direct face-to-face interactions such as public meetings. Mass-mediated news stories can also convey considerable knowledge about important risk managers. Knowledge about risk managers can be further solidified through elaboration processes when individuals are involved in discussions or perceive high personal relevance regarding certain risk issues. Empirical support for these processes will yield implications showing how sincere commitments to engage the public that go beyond appeals to heuristics can build strong forms of trust.

P.150  Global Attitudes Towards Climate Change: Evidence From 15 Countries. Shao W*, Xian S, Lin N, Lee TM; Auburn University Montgomery

Abstract: An international survey data set, supplemented with contextual data such as GDP per capita, carbon emissions per capita, local weather and climate, and local climate plans and policies, is used to examine how people from 15 countries perceive the risks of climate change and how willing they are to reduce its adverse impacts. We first estimate multi-level models explaining variations in risk perceptions of climate change and willingness to address this issue across countries. We then select a list of mega-cities from each country and investigate the key factors that determine one's risk perceptions of climate change and willingness to address this issue among these cities. Results of this study are expected to provide policy makers from the international to the local level with important information that can serve as a useful guide for formulating effective climate change strategies.

P.151  How has the GM issue been covered in Chinese newspapers? A Comparative Analysis of National and Local Newspaper Coverage of the GM Issue in China, 2000–2014. Zhang X*; The University of Tokyo

Abstract: Proper newspaper coverage of the GM food issue is important given the current dilemma facing the GM industry in China. This study examined news coverage of the GM issue published in 718 newspapers, one wire service, and one news website from 2000 to 2014 in China. News articles in newspapers that the public can access daily and easily can highlight the salience of the GM issue in China and influence public perception. We examined GM articles in local and national newspapers in mainland China by applying content analysis methods, both quantitative and qualitative, including language aspects. Our data again verified the event-oriented trend in coverage, and we identified as many as 70 GM events since 2009. Furthermore, qualitative content analyses of unilateral coverage of events (national or local coverage on the first day only) showed differences in subjects, attitudes, and information sources. By comparing the number of GM articles in national and local newspapers, we found that in terms of volume, growth rate, frequency, and improvement in frequency, local newspapers performed much better. The results of this study lead to the conclusion that the GM issue has become increasingly salient in Chinese newspaper coverage, especially at the local level. Although national newspapers still play an important leading and agenda-setting role in GM coverage, diverse information sources, varied narrative forms, and great potential exist at the local level. We were also able to observe an interaction cycle consisting of different national and local coverage, which promoted the development of event coverage.

P.152  Implementing Geographic Information Systems to Support Coast Guard Operational Decision Making. Todd AL, Howard PM*; ABS Consulting

Abstract: Under the United States Coast Guard’s (USCG) Ports, Waterways, and Coastal Security (PWCS) mission, the ultimate goal is to prevent and respond to terrorism risk within the maritime domain. Efficient resource allocation and operational planning are crucial to the success of the mission. With a limited set of available assets and a vast area to protect, optimizing resource allocation and prioritizing Coast Guard actions is the smartest way to achieve success. The Maritime Security Risk Analysis Model (MSRAM) is a terrorism risk management tool that enables the USCG to understand and mitigate the risk of terrorist attacks on targets in the U.S. maritime domain. USCG analysts in each major port perform a detailed risk analysis for all significant targets operating within their area of responsibility, ultimately producing a dataset that can inform a wide variety of risk management decisions. MSRAM is an application developed in MS Access, with a complementary GIS tool called the Risk Management Workspace (RMW). The RMW is most often used as a means to more clearly communicate risk to USCG officials and decision makers. Additionally, its built-in tools and capabilities allow for the generation of location-specific analyses, providing defensible and repeatable results that give a clear national picture of U.S. maritime risk. By combining census data with built-in calculators that estimate and display on a map the consequences associated with explosions, chemical releases, and IED-style attacks, the RMW allows for consistent analyses of maritime targets and simplified communication of those results. This presentation will provide an overview of the available GIS functionalities developed to assist the Coast Guard in best allocating its available assets and resources to maximize mission success. It will demonstrate how fast-paced geospatial tools can inform real-life U.S. 
policies, decision making and security efforts in a timely and reliable manner that can be implemented simply, by an analyst without a specific scientific background.

P.153  When are climate victim portrayals persuasive? The interplay of perspective taking and social-identity cues. Lu H*, Schuldt JP; Cornell University

Abstract: Findings from communication and psychology suggest that encouraging audiences to adopt a more empathic versus objective perspective while processing messages about victims may prove a useful persuasive strategy, by promoting an emotional connection between audiences and victims that motivates helping behaviors. At the same time, research in climate change communication reports differential effects of victim portrayals across political partisans in the U.S., where the issue remains highly politicized. Drawing on these findings, we explored the conditions under which taking an objective perspective regarding a climate change victim may prove more persuasive among a key audience: political conservatives and moderates. N = 502 U.S. participants read a modified news article about an expectant mother living in Puerto Rico who is worried about Zika virus, a vector-borne disease linked to climate change. Depending on condition, participants were instructed to adopt an empathic or objective perspective while reading the article, which, furthermore, referred to Puerto Rico as a “U.S. territory” or not, a social-identity cue intended to prime ingroup versus outgroup thinking (a control condition received no perspective-taking instructions). Immediately after, participants reported their support for climate change mitigation policy and other climate beliefs. Results revealed a two-way interaction between perspective taking and social-identity cue, such that the objective perspective condition increased policy support, but only when the ingroup cue was present; this effect was driven by political conservatives and moderates. Further analysis revealed beliefs about the origins and consequences of climate change as key mediators. 
Our findings complement prior work on the politics-contingent effects of climate victim portrayals, while offering practical insights for risk communicators and environmental advocates seeking to communicate about climate-related emerging and zoonotic diseases.

P.155  Social vulnerability and the occurrence of gastrointestinal diseases associated with precipitation seasons in São Paulo, Brazil. Roncancio DJ*, Nardocci AC; University of Sao Paulo, School of Public Health

Abstract: Hydro-meteorological extreme events can have an important impact on human health. Direct or indirect exposure to flood water can result in outbreaks of gastrointestinal diseases and respiratory infections, among others. Climate change scenarios project an increase in extreme precipitation events in the Brazilian southeast region. Since improving risk management and adaptation strategies depends on knowing which areas and populations are most vulnerable to natural disasters and their related health problems, it is necessary to understand clearly the impacts these events may have on human health. The study aimed to compare the basin-level social vulnerability to natural hazards (SoVI) in the city of São Paulo against the spatial distribution of public health care system registers of diarrhea in the same area. Calculation and mapping of the SoVI followed the Cutter et al. (2003) methodology, processing social data from the 2010 census with principal components analysis. Trends in the spatial distribution of the proportions of authorizations for hospitalization (PAH) due to diarrhea in two age groups (children and the elderly), for the rainy and dry seasons of 2009 and 2010, were used to assess the effects of the city's seasonal precipitation on health. Results show three main components explaining the city's vulnerability in different amounts: demographic characteristics (43%), urbanization and average family income (17%), and basic sanitation (16%). Additionally, 65% of the basins fell within the medium vulnerability level, 15% in the medium-high level, 16% in the medium-low level, and 2% within the extremes (very high and very low). Clusters of basins with high proportions of PAHs overlapping basins with high social vulnerability, as well as the opposite scenario, were found. This shows how the principal components of social vulnerability combine differently in each basin to trigger risk factors for outbreaks of infectious diseases, and can help differentiate risk management strategies.
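The SoVI construction described in the abstract (z-scored census indicators, principal components analysis, additive combination, quantile classes) can be sketched as follows. This is a minimal illustration of the Cutter et al. (2003) recipe on synthetic data; the indicator columns, basin count, and quantile cut-points are assumptions, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic census-style indicators for 50 basins; the five columns are
# hypothetical stand-ins (e.g. % elderly, % children, average family income,
# % with basic sanitation, population density), not the study's variables.
X = rng.normal(size=(50, 5))

# Standardize each indicator to z-scores, per the Cutter et al. (2003) recipe.
Z = (X - X.mean(axis=0)) / X.std(axis=0)

# Principal components from the correlation matrix of the indicators.
R = np.corrcoef(Z, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(R)
order = np.argsort(eigvals)[::-1]              # descending explained variance
eigvals, eigvecs = eigvals[order], eigvecs[:, order]
explained = eigvals / eigvals.sum()

# Retain the leading components (the abstract reports three) and combine
# their scores additively into the vulnerability index.
scores = Z @ eigvecs[:, :3]
sovi = scores.sum(axis=1)

# Classify basins into five levels (very low .. very high) by quantile,
# mirroring the five vulnerability classes reported in the abstract.
levels = np.digitize(sovi, np.quantile(sovi, [0.02, 0.17, 0.82, 0.98]))
```

In the study itself, component retention and the sign of each component's contribution would be decided by inspecting the loadings, not fixed in advance as done here.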

P.156  Structuring, Implementation and Management of a Specialized Base for Oiled Wildlife Rescue in the Event of Environmental Accidents in the Estuarine Complex Area of Paranaguá, Paraná State, Brazil. Stringari D*, Pinheiro EG, Schneider GX, Zamarchi K; Disaster Research Center of Parana State - Brazil

Abstract: The city of Paranaguá and its estuarine zone are located in an area of high environmental fragility that is susceptible to environmental disasters caused by the presence of the largest grain port in Latin America, especially spills of oil products. In the last decade, two major disasters in the port of Paranaguá revealed great disarticulation and lack of coordination in wildlife rescue actions, which led to the death of almost all rescued animals. The creation of an Oiled Wildlife Rescue (OWR) unit to act in the area of the Estuarine Complex of Paranaguá (ECP) is therefore an unprecedented initiative in the state of Paraná and in Brazil, as is the creation of a Voluntary Brigade specialized and trained for this purpose. Installing the facilities and an office to manage the Voluntary Brigade that will act in the rescue and rehabilitation of oiled wildlife in the estuarine zone of Paranaguá is one of the project's objectives. In addition, we intend to further empower the academic community and local actors for their role in the ECP risk areas, strengthening the culture of prevention and risk perception in the city in connection with the local civil defense and emergency foundations. This project is expected to produce a large number of volunteers trained to act in the rescue, rehabilitation and restoration of coastal and marine species, and to strengthen the integration of the local fishing and academic communities with the port activities of APPA (Paranaguá and Antonina Ports Administration).

P.158  Application of the Averted Disability-Adjusted Life Year Metric for Proactive Decision-Making in a Regulatory Environment. Sridharan S, Mangalam S*; Technical Standards & Safety Authority

Abstract: This paper focuses on characterizing the Technical Standards and Safety Authority's (TSSA's) impact on compliance via regulatory inspections conducted on operating plants in Ontario, Canada. Regulatory inspection programs that identify non-compliances, often with underlying failures in the regulated system, can support a proactive decision-making framework. The paper details the development and implementation of the novel Averted Fatality Equivalent (AFE) metric, predicated on the averted Disability-Adjusted Life Year (DALY), for measuring the impact of regulatory inspections. The novel application of this metric is significant for a public safety regulator in quantifying the value of inspections in preventing or averting risks when non-compliances are identified and corrected before they manifest in an occurrence with potentially deleterious health impacts. This innovative indicator is of significant value to regulators in clearly demonstrating the effect of the actions of the regulator and regulated parties on the overall state of compliance and its ultimate impact on safety.
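As a rough illustration of the kind of calculation behind an averted-DALY-based metric, the sketch below computes DALYs from the standard YLL + YLD decomposition and converts expected averted DALYs into fatality equivalents. All parameter values, function names, and the conversion factor are hypothetical; TSSA's actual AFE model is not specified in this abstract.

```python
# Hypothetical sketch of an Averted Fatality Equivalent (AFE) style metric.
# DALYs averted by correcting a non-compliance are estimated from the standard
# DALY = YLL + YLD decomposition, then expressed in fatality equivalents.

REFERENCE_LIFE_EXPECTANCY = 80.0  # years; assumed reference for YLL

def dalys(deaths, age_at_death, cases, disability_weight, duration_years):
    """Disability-Adjusted Life Years: years of life lost (YLL) plus
    years lived with disability (YLD)."""
    yll = deaths * max(REFERENCE_LIFE_EXPECTANCY - age_at_death, 0.0)
    yld = cases * disability_weight * duration_years
    return yll + yld

def averted_fatality_equivalents(p_occurrence, consequence_dalys,
                                 dalys_per_fatality=40.0):
    """Expected DALYs averted by fixing a non-compliance before an
    occurrence, converted into fatality equivalents."""
    return p_occurrence * consequence_dalys / dalys_per_fatality

# Example: an inspection corrects a defect with a 1% annual chance of an
# incident causing 1 death (age 40) and 5 injuries (weight 0.2, 10 years).
consequence = dalys(deaths=1, age_at_death=40, cases=5,
                    disability_weight=0.2, duration_years=10)   # 50.0 DALYs
afe = averted_fatality_equivalents(0.01, consequence)           # 0.0125 AFE
```

Summing such values across all corrected non-compliances would give an inspection program's aggregate averted-risk measure of the kind the paper describes.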

P.159  Health Outcomes and Congressional Control of Consumer Safety Regulations. Larson DB*; Virginia Polytechnic Institute and State University

Abstract: In 2007, the U.S. was racked by a series of massive recalls of children’s products. A year later, Congress responded by passing the Consumer Product Safety Improvement Act of 2008 (CPSIA). Though the name indicated a broad overhaul of consumer safety, most of the provisions focused on the regulation of children’s products that had recently proven so hazardous. The federal agency charged with regulating these products, the Consumer Product Safety Commission (CPSC), was directed to issue new standards for a handful of products aimed at children, such as cribs, strollers, and bouncers, and given additional authority to implement them. Since the passage of CPSIA, CPSC has issued 18 mandatory product standards for children’s products, marking significant progress toward the fulfillment of Congress’ mandate. At the same time, three standards have been issued for non-CPSIA products, including those product categories CPSC leadership designated as priorities. This raises questions as to who should be setting these priorities: Congress as the representative of the people or CPSC as the specialists in product safety. The democracy versus expertise quandary is a longstanding one in public administration, but the standards issued, and not issued, by CPSC since the passage of CPSIA offer an opportunity to assess the current balance. To assess the impact of congressional mandates on consumer health outcomes, we will utilize injury data available through the National Electronic Injury Surveillance System, maintained by CPSC. We will examine the injury rates associated with each product category for which a standard has been issued as a result of CPSIA and assess whether they have declined since the enactment of the new federal standard. Additionally, we will compare the prevalence of injuries in these product categories to those categories CPSC has designated as priorities. 
This will provide an additional comparative opportunity between expert and congressional prioritization.

P.160  Geographic Risk Evaluation and Assessment Tool (GREAT): Model for Transfusion Transmitted Infectious Diseases. Chada K*, Lane C, Huang Y, Zhang G, Walderhaug M, Toledo S, Yang H; U.S. Food and Drug Administration and Engility Corporation

Abstract: Increased global travel and the associated risk of donors infected with emerging infectious diseases demand continuous evaluation of blood safety management policies. Donor deferral and blood screening are major risk mitigation measures for ensuring the safety of the US blood supply. The Geographic Risk Evaluation and Assessment Tool (GREAT) was developed to rank the geographic risk of infectious diseases, estimate geographic risk contributions, and evaluate the potential donor loss associated with donor deferral policies. The tool comprises embedded databases for major inputs such as geography-specific disease incidence, travel, immigration, existing geographic donor deferrals, population demographics, and others. The major outputs are geographic risk contributions and the potential donor loss, risk reduction, false positives, and positive predictive values associated with policy options for donor deferral, blood testing, or a combination of donor deferral with blood screening. The tool is developed in Java and utilizes the ArcGIS Runtime SDK for Java (Esri Inc.) for map support. GREAT features automated data mining for updating inputs, multi-format data importing, and high-resolution visual presentation of model outputs. GREAT's framework allows flexible modification to perform risk assessments of emerging infectious diseases. The tool can be applied to geographic risk assessment of transfusion transmission of diseases such as chikungunya, dengue, malaria, vCJD, or Zika. An example application of GREAT to the evaluation of transfusion-transmitted Zika will be presented. We anticipate that GREAT will support an expedited process for evaluating risk mitigation options for emerging transfusion-transmitted diseases.

P.161  Risk Governance through the Cooperation of a Risk Evaluation Technology and the Institutional System: attention to chemical stock in product. Kojima N*, Tokai A, Machimura T, Xue M, Zhou L, Todoroki A, Ebisudani M; Osaka University

Abstract: The WSSD 2020 target requires a higher level of chemical risk management worldwide, and much effort has been devoted to it. We aim to build a methodology to support this target by integrating risk evaluation and institutional options. Japanese chemical risk management has been carried out through the PRTR and the Law Concerning the Examination and Regulation of Manufacture, etc. of Chemical Substances. These institutional systems mainly address the management of chemical flows; in this research project, the concept of stock management is introduced and the possible extension of the coverage of chemical risk management is discussed. We employ a couple of representative chemicals and manufactured goods as examples and perform case studies. The main point of argument is stock management and its contribution to risk governance. For estimating annual flows and stocks, we collected the in-flows of both chemicals and products, chemical emissions from products, and wastage rates based on product lifetimes. The combinations of chemicals and products were based on the real situation in Japan, and for the pilot case studies described below, we chose representative combinations drawn from social issues. From three case studies (sick-building syndrome caused by adhesives in plywood at home; ozone depletion and global warming caused by refrigerants in home appliances; and water contamination by detergents from homes and business facilities), we evaluated the human health risks or environmental impacts caused by the chemicals in the in-flow/out-flow and stock of each product since 2001, when the PRTR started. From these results, we quantitatively showed that reductions in risks from chemical stocks would be delayed for several years after counter-measures on in-flows. This research was supported by the Environment Research and Technology Development Fund (1-1501) of the Ministry of the Environment, Japan.

P.162  Hazard Assessment of Four Selected Flame Retardant Chemicals of Importance to National Defense. Rak A*, Barry J, Morgan A; Noblis and University of Dayton Research Institute (UDRI)

Abstract: The Department of Defense’s (DoD’s) Chemical and Material Risk Management (CMRM) Program has a well-established three-tiered process for over-the-horizon scanning for Emerging Contaminants, conducting qualitative and quantitative impact assessments in critical functional areas, and developing sound risk management options. This “Scan-Watch-Action” process was used to examine potential risks from selected flame retardants (FRs). FRs are chemicals added to materials, or chemically reacted into them, to prevent or slow the ignition or spread of fire. While they play a crucial role in the safety and protection of DoD personnel and assets, the use and disposal of products containing flame retardants may result in their release into the environment, depending on how they are incorporated into the product, with potential negative impacts on human health and the environment. In our prior study, subject matter experts (SMEs) from throughout DoD evaluated the potential risks to DoD associated with these mission-critical chemicals. The SMEs did not identify any DoD-critical applications for TBB or TBPH that would be threatened by a potential phase-out, nor did they identify risks associated with their use. Therefore, our current assessment focused on TBBPA, TCEP, HBCD, and decaBDE. We qualitatively identified risks for the five DoD functional areas based on the probability that an adverse impact will occur and the severity of the potential impact. Here we present the risks identified and the potential risk management options to address them. The assessment concludes that select FRs require risk management actions to mitigate possible risks to operation and maintenance, including additional research into safer alternatives that meet performance requirements. 
Overall the risk posed to DoD by these FRs remains moderate, due to the risk that DoD could be without the flame-retardant products it needs before satisfactory substitutes are available.

P.163  Epistemic uncertainty in agent-based modeling. Ferson S*, Sentz K; Applied Biomathematics, Los Alamos National Laboratory

Abstract: Traditional approaches to handling uncertainty in agent-based models employ Monte Carlo methods to randomly sample parameters and probabilistically determine whether and how a behavior or interaction rule is realized by an individual agent. A simulation of all agents thereby represents a single realization from among many possible scenarios, and simulations with many replications are used to reveal differential probabilities and the likelihoods of extreme results. Unfortunately, Monte Carlo is a poor way to project epistemic uncertainty through a complex model, and it is an unsatisfying scheme for representing the uncertainty about volitional choices of agents. Adding epistemic uncertainty to agent-based models properly requires the ability to (1) characterize stochastic drivers imprecisely, (2) specify agent attributes and other quantities as intervals, probability distributions, or p-boxes, and (3) execute behavior rules in a way that respects uncertainty in their conditional clauses. When uncertainty makes the truth value of the conditional clause of any rule unclear, the simulation should hold that the rule both fires and does not fire. This may result in subsequent uncertainties elsewhere in the simulation including the status of attributes of agents, even perhaps whether an agent exists or not. These facilities advance agent-based modeling to uncover a more comprehensive picture of the effects of epistemic uncertainty, which can be vastly more important than aleatory uncertainty. We compare this approach with traditional simulation using only Monte Carlo methods to reveal the differences between these two approaches to uncertainty.
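The contrast drawn above, between Monte Carlo realizations that commit each rule to a single branch and epistemic propagation that holds an indeterminate rule as both firing and not firing, can be sketched with a toy interval type. The Interval class and the flee/stay rule are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch contrasting Monte Carlo sampling with interval (epistemic)
# propagation for one agent behavior rule.
import random

class Interval:
    """Toy interval-valued quantity for epistemic uncertainty."""
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi
    def __lt__(self, threshold):
        # Three-valued comparison: True, False, or None (indeterminate).
        if self.hi < threshold:
            return True
        if self.lo >= threshold:
            return False
        return None

def fire_rule(resource, threshold=1.0):
    """Agent flees if its resource is low. With interval-valued attributes
    the condition can be indeterminate, so both outcomes are carried forward."""
    cond = resource < threshold
    if cond is None:
        return {"flee", "stay"}          # the rule both fires and does not fire
    return {"flee"} if cond else {"stay"}

# Epistemic view: a single interval-valued run captures the whole range.
assert fire_rule(Interval(0.8, 1.2)) == {"flee", "stay"}

# Aleatory view: Monte Carlo replications sample point values, so each
# realization commits to a single branch; many replications are needed to
# see that both outcomes are possible.
random.seed(1)
outcomes = {next(iter(fire_rule(Interval(x, x))))
            for x in (random.uniform(0.8, 1.2) for _ in range(1000))}
```

A full implementation along the lines the abstract describes would also propagate the resulting indeterminacy into downstream agent attributes, whereas this sketch only shows the rule-firing step.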

P.164  Surveillance of a Comparative Set of Homeland Security Risks. Lundberg RP*; Sam Houston State University

Abstract: This research is designed to identify perceptions of homeland security risk relative to other risks following a homeland security event. While it is well known that people become more concerned about a risk when a recent event of that type is fresh in memory, it is less clear to what extent a recent homeland security event of one type affects perceptions of homeland security events of other types. For example, a terrorist bombing may increase concern for a range of terrorist scenarios but not for natural disasters, or a hurricane response perceived as a failure may increase concern for all disaster scenarios. These spillover effects, in which perceptions of one homeland security risk are correlated with another, are unclear but may be important for disaster planning and response. To examine them, we survey perceptions of risk in the U.S. public using Amazon Mechanical Turk across a set of ten hazards, including natural disasters, terrorist events, and other major disasters. After establishing a baseline level of concern for each of the risks, additional surveys will be conducted following homeland security events: immediately after the event, and two, four, and eight weeks after the event. One contribution of this research is the assessment of risk both individually and relative to the other hazards; if an event occurs we would expect concern for that hazard to increase, but understanding whether perceptions of risk for similar hazards increase or decrease, and over what timeframe, may be useful as well.

P.166  Water stability index for risk identification within transboundary river basins. Hamilton MC*, Speight HA, Hunke JA, Voyadgis DE, Veeravalli S, Becker SJ, Lyon SL; US Army Corps of Engineers Geospatial Research Laboratory

Abstract: Stability tasks are conducted as part of United States military operations abroad, in coordination with other instruments of national power, to maintain or reestablish a safe and secure environment and provide essential governmental services, emergency infrastructure reconstruction, and humanitarian relief. To this end, the United States Army conducts primary stability tasks in order to achieve five end-state conditions: a safe and secure environment, established rule of law, social well-being, stable governance, and a sustainable economy. For stability tasks to be effective, it is critically important to understand and monitor the conditions that can lead to instability and to target interventions accordingly. Water plays a vital role in underpinning stable and productive societies and the ecosystems on which they depend. In general, water security is defined as the reliable availability of an acceptable quantity and quality of water for health, livelihoods, and production. While several water security indices exist at the national scale, few indices prioritize regions at a spatial scale that is informative for stability tasks. This research is developing a Water Stability Index (WSI) designed to inform the prioritization of where, and what type of, stability interventions should occur, at the river basin-country geospatial unit of analysis. The WSI focuses on water-related instability risk factors and includes concepts of ecological vulnerability, social vulnerability and adaptive capacity. The WSI is unique in that it aligns risk indicators under the five stability end states so that interventions can be targeted accordingly.

P.167  The security risk management regulation regime applied in the Norwegian context. Jore SH*; University of Stavanger

Abstract: In today’s fight against terrorism, national security concerns cannot be reduced to external threats and handled exclusively by military means. Current counterterrorism strategies are based on building resilience through the involvement of civil society, including private and public organizations. In Norway, several regulations, standards and reports propose a risk-management regulation regime for mitigating terrorism and other security threats. A risk-management regulation regime builds on the assumption that organizations have the necessary competence to know what types of risks they might face in the future and the appropriate means by which these threats should be met. This presentation aims to describe and discuss the challenges and advantages of applying a risk-management regulation regime in the area of security, i.e. protection from terrorism and other intentional, malicious crimes, in contrast to safety, which implies protection from non-intentional acts. We identify challenges organizations face when applying the security risk regulation regime. The application of a risk-management regulation regime is discussed in light of three Norwegian case studies: 1) the current aviation security regime, 2) the rebuilding of the Government complex in Oslo after the bombing on July 22, 2011, and 3) the implementation of security risk management in the Norwegian petroleum sector. We conclude that several aspects of a risk-management regulation regime are challenging from an organizational perspective, including the risk-analytical methodology currently available to organizations.

P.168  Military coalition’s organizational challenges in complex emergencies. Stene LK*, Olsen OE; University of Stavanger

Abstract: Violent conflicts and complex emergencies such as those in Afghanistan, Somalia, Syria and Iraq expose local populations and communities to great suffering. The international response has proven to be exigent and hazardous. The security dimensions of these conflict emergencies and new wars call for an appropriate response that often includes military forces to establish and maintain stability and prepare the ground for political solutions. New concepts and ways of organizing multinational military interventions have been worked out to deal with the challenges and interdependencies characterizing today’s new wars and complex emergencies. Starting with a discussion of complex emergencies and new wars, this paper discusses organizational aspects of a multinational military coalition working under vague mandates and exit strategies to provide security in a complex emergency, and how this influences crisis management. The paper draws on studies of the NATO-led International Security Assistance Force (ISAF) in northern Afghanistan, as well as on recent developments in ongoing complex emergencies and experiences gained by the authors during missions and fieldwork in Afghanistan, Somalia and Iraq.

P.169  Modeling Exposures in Municipal Water Contamination Scenarios using Synthetic Systems. Richter BP*, Wilson PH, Hawkins BE, Winkel DJ, Whittaker IC, Gooding RE, Bradley DR, Cox JA; Battelle Memorial Institute

Abstract: The Chemical Terrorism Risk Assessment is a Department of Homeland Security Chemical Security Analysis Center program that assists in prioritizing mitigation strategies and assessing the risk of chemical terrorism attacks, including the intentional introduction of harmful chemicals into municipal water distribution systems (MWDS). A relative scarcity of consistently detailed, city-specific information, paired with the need to assess consequences for the wide variety of systems in the US, represents a challenge to the current state of the art in hydraulic modeling, which requires explicit MWDS network information for each system considered. An innovative model involving the creation of synthetic MWDS networks based on the characteristics of real MWDS networks was developed to address this challenge. Synthetic MWDS network results were compared against industry-standard simulation results for real cities provided by USEPA. The predicted exposures were found to be consistent with the USEPA-provided datasets, supporting the use of the synthetic model for estimating mixing, dilution, and consumption.

P.170  A case study in data access, exposure assessment, and extended analyses: diesel exhaust exposure and lung cancer. Crump KS, Van Landingham C, McClellan RO*; Private Consultant

Abstract: The International Agency for Research on Cancer (IARC) in 2012 classified diesel exhaust as a "human carcinogen," largely based on findings from the Diesel Exhaust in Miners Study (DEMS) conducted by NIOSH and NCI scientists. This action raises the question: are the results of analyses of DEMS data sufficiently robust to support quantitative risk assessment? DEMS was designed to test for an association between exposure to Diesel Engine Exhaust (DEE) and lung cancer in workers in 8 nonmetal mines from dieselization through 1997. Respirable Elemental Carbon (REC) was used as a metric for DEE and estimated from diesel horsepower (HP), earlier measurements of CO in the mines, and assumed relationships among HP, CO, and REC. Cox proportional hazards models revealed a statistically significant association between REC and lung cancer in the total cohort and in ever-underground workers. Statistically significant associations were also observed in a nested case-control study controlling for smoking. The authors of this abstract, with private industry support, gained access to DEMS data and extended the original analyses. Uncertainties in REC estimates using CO as a surrogate prompted us to develop alternative REC estimates using only mine-specific HP and ventilation in CFM. The alternative REC exposure estimates were applied in a conditional logistic regression of the case-control data. Trend slopes calculated with the new REC estimates were not statistically significant. Slopes were smaller by a factor of 5 without, and a factor of 12 with, control for radon exposure compared to the slopes in the original analyses. The varied results from different analyses should be considered in conducting quantitative risk assessments using DEMS data. This study illustrates the value of data sharing, construction of alternative exposure estimates, and conduct of alternative analyses.

P.171  Aviation Security: Examining the Effects of Agent and Screening Procedure on Perceptions of Risk, Safety, and Fairness. Nguyen KD*, John RJ; University of Southern California

Abstract: Public perception of airport security screening is an important factor in determining the continuance of aviation security policies. Although the TSA has used many different security screening policies, there is little research describing how travelers perceive these measures. Applying Organizational Justice Theory, we hypothesized that: a) a randomized screening procedure is perceived as fairer, both procedurally and distributively, than either profile-based or behavioral screening; and b) computerized screening is viewed as procedurally fairer than human screening. We also explored the effects of the different screening procedures, and of human vs. computer screening, on perceptions of personal risk and safety, as well as the effects of fairness on policy satisfaction. Six hundred respondents from Amazon Mechanical Turk were randomized into one of six conditions in a 2 (Human vs. Computer) by 3 (Randomized vs. Behavioral vs. Profiled Screening) design. They read a description of a hypothetical screening policy and rated the policy on several dimensions. The results provide partial support for both hypotheses. Interestingly, there was an interaction effect: randomized screening was perceived as more consistent and more equal in distributing screening costs than the other two procedures, and this effect was larger when the screening was conducted by a computer than by a human. Randomized screening was also perceived as safer than the other two procedures. Respondents believed they had a lower risk of being selected for a security screening under a behavioral procedure than under a profile-based procedure. Mediation analyses suggest that perceptions of procedural fairness and distributive fairness partially mediate the effects of screening consistency and equal distribution of screening costs, respectively, on satisfaction with airport screening policy. These findings have important implications for the design of future security screening policies.

P.172  The True Meaning of Terrorism and Response to Terrorism. Wang TW*; University at Buffalo and Industrial and Systems Engineering

Abstract: This research focuses on the divergences and controversies over defining terrorism, and on how to define, understand, and deal with it. Based on a systematic survey of previous definitions of terrorism, and drawing fully on other relevant information and data, this article explores the main characteristics and nature of terrorism and puts forward a clear, pragmatic, and widely acceptable definition from a relatively unique perspective. It then uses that definition not only as a tool to explain, clarify, and solve problems relating to terrorism in practice, but also to deepen the battle against the intangible conception of terrorism by proposing suggestions for striking at the tangible entities of terrorist organizations and terrorists.

P.173  Factors that Influence Public Perspectives of Energy Development in Canada: Results of a National Survey on Climate Change and Energy Systems. Joo J, Mayeda A*, Chakrabarti K, Wang T, Song X, Hmielowshi J, Boyd A; Washington State University

Abstract: Energy production is a critical component of the Canadian economy. Canada is not only the fifth-largest energy producer in the world; it is also one of the highest per-capita consumers of energy. This high production and consumption of energy results in high levels of greenhouse gas (GHG) emissions in Canada. There is increasing concern about these GHG emissions and recognition that low-carbon energy systems need to be developed and sited. The public's opinions about energy systems can affect the development of these technologies. A survey was administered to examine public views on and understanding of key issues surrounding energy systems, climate change, and government regulation. The survey was administered via Internet and phone to a representative sample of 1,479 Canadians. Results indicate that multiple factors influence public opinion about the development of energy systems. The majority of participants indicated that impacts on human health and the environment were the most important factors when thinking about energy sources. Fewer responded that reliability of energy supplies or independence from other countries' fuels was the most important factor in their perspectives on energy. Trust in government to regulate technology and perceptions of industry's ability to safely develop energy systems were shown to be important factors in the public's attitudes toward these developments. We conclude by discussing the policy challenges associated with energy systems development and provide directions for future risk perception research on climate change and energy systems.

P.174  Futuristic Risk Assessment for Coastal Flooding in a Changing Climate Era: A Case of Ernakulam, India. Walia AB*; Centre For Disaster Management, LBSNAA

Abstract: The Ernakulam district of India has a 46-km coastal belt with 12 coastal villages on the western coast. The district has a high population density and infrastructure of high importance, including a major harbour, a major naval base, government infrastructure, and tourism hotspots. Transportation by water is one of the major ways of commuting, and the city has many water channels, some of which are used by ships to enter the harbour and naval base. In the changing climate scenario, however, the city is at high risk from storm surges and tsunami, as its preparedness level is not up to the mark. In view of the city's high vulnerability, a risk assessment exercise was carried out that considers future climate change consequences, together with a suggested framework for dealing with the worst-case scenario.

P.175  Disaster Risk Management in India and Iran: Conceptual Framework for Disability-Sensitive DRM Planning. Walia A, Ardalan A*, Patrick V, Singh S; 1. CDM, LBSNAA 2. TUMS 3. HHI, Harvard 4. UNICEF

Abstract: Disasters have severe impacts on people, property, and the environment. They can result from natural hazards such as earthquakes, floods, and cyclones, or from human-influenced causes such as climate change, accidents, and conflicts, and can lead to catastrophic situations. In every disaster, people with disabilities suffer most because of their physical and mental condition. It is crucial to consult and involve people with disabilities in disaster risk reduction, response, and recovery planning to ensure that the capabilities and contributions of disabled and older people are recognized and utilized. People with disabilities are differently abled and have many abilities. Appropriate needs assessments are critical to address the specific needs of people with disabilities during disasters and to ensure that mainstream services and facilities are accessible to them. Disaster management planning at all levels needs to be in line with the needs of people with disabilities.

P.176  Cognitive Sophistication and Learning about Risk from Experience. Royal AY*; Resources for the Future

Abstract: This study examines the relationship between cognitive sophistication, risk perception, and risk mitigation when people learn about risk through experience. Findings are based on observations from a controlled laboratory experiment in which subjects made risk mitigation choices and reported risk perceptions in response to a repeated low-probability hazard. The experiment also contained tasks designed to independently evaluate each subject's risk preferences and level of cognitive sophistication. Subjects possessing cognitive biases characterized by the representativeness heuristic were relatively more pessimistic about risk when unfavorable outcomes were frequently experienced, reflecting a possible over-reliance on small samples of outcomes (the hot-hand fallacy). Although reliance on the representativeness heuristic predicted patterns in subjects' reported beliefs, it was not significantly correlated with mitigation choices. However, subjects with greater cognitive ability, as measured by a cognitive reflection test, tended to make risk mitigation choices that more closely matched their risk preferences, implying a lower degree of bias and inefficiency among more sophisticated subjects.

P.177  Opening the Black Boxes of Sustainability Management: How Do Metrics Frame Decisions? Stoycheva S*; Ca' Foscari University of Venice, Italy

Abstract: Despite the ongoing debate regarding the meaning of sustainability in the business context, there is a common understanding that to account for how a corporation is doing with respect to sustainability, this performance must be measurable. Whatever the means of operationalizing corporate sustainability, any measurement generally implies the identification of quantitative indicators to represent properties and relations. It can be argued that such processes of measurement (data construction, analysis, and final presentation) and the construction of metrics involve tacit assumptions and informal practices that affect decision making. If not challenged on these points, metrics and indicators may fall into the trap of being "black boxed," which makes their use smoother but hampers their ability to adapt to changing contexts and to remain a valid representation of reality. Reliance on quantitative measurement of organizational performance as a common means for informed decision making is thus not exempt from undesired outcomes if guided by a naive approach to measurement. To date, however, the available body of knowledge on sustainability measurement fails to address how numbers are constructed, given meaning, and enacted in organizational practices. The aim of this study is precisely to explore the impact of corporate sustainability measurement in framing organizational decisions, by challenging the construction, interpretation, and display of metrics produced by organizations through ethnostatistical analysis.
The analysis will involve three stages of research, each with its own methodology: a) an ethnography of metrics construction and application, focusing on the informal practices and tacit assumptions involved in the production of numbers; b) a computer-based simulation of the diverging results that different assumptions would imply; and c) a literary and textual analysis of the rhetorical use made of the resulting numbers, and of their impact on decisions.

P.178  Comparing Urban and Rural Vulnerability to Heat-Related Mortality: A Systematic Review and Meta-analysis. Li Y*, Odame E, Zheng S, Silver K; East Tennessee State University

Abstract: Studies of the adverse impacts of high temperature on human health have primarily focused on urban areas, due in part to the fact that urban centers generally have higher population density and are often significantly warmer than their surrounding rural areas (the heat island effect); urban areas are thus considered more vulnerable to summer heat. However, heat vulnerability is also affected by other population characteristics, such as age, education, income, and social isolation, which may mark greater vulnerability among rural populations. Here we explore vulnerability to heat-related mortality in rural areas through a systematic review and meta-analysis of existing evidence. We searched for studies, published in English between 2000 and 2016, that examined the association between high ambient temperature and mortality in rural areas. Heat-mortality effect estimates from the selected studies were divided into two groups: (1) rural effect estimates (RRrural) and their corresponding urban effect estimates (RRurban), from studies that reported risk estimates for both urban areas and their surrounding rural areas (7 studies included); and (2) rural effect estimates only (12 studies included). For Group 1, we performed a meta-analysis of the ratio of the rural estimate to the urban estimate in order to compare the magnitude of effects in rural versus urban areas. For Group 2, we performed a meta-analysis of the effect estimates in rural areas only. The pooled ratio estimate (RRrural/RRurban) for Group 1 is 1.051 (95% CI: 0.954, 1.160), which indicates that the rural relative risk is about 5% larger than the urban relative risk. The pooled estimate for Group 2 is 1.191 (95% CI: 1.13, 1.251). Our preliminary results suggest that vulnerability to heat in rural areas may be similar to, or even higher than, that in urban areas, indicating that more studies are needed to understand rural vulnerability to heat-related hazards.
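The ratio-pooling step described above can be sketched with a simple inverse-variance (fixed-effect) model, in which each study's log relative-risk ratio is weighted by the inverse of its variance recovered from the 95% CI. The three studies below are hypothetical placeholders, not the seven studies analyzed in this abstract:

```python
import math

def pool_relative_risks(studies):
    """Fixed-effect inverse-variance pooling of relative risks.

    `studies` is a list of (rr, ci_low, ci_high) tuples; the standard
    error of each log-RR is recovered from the width of its 95% CI.
    Returns the pooled RR with its 95% CI.
    """
    weights, weighted_logs = [], []
    for rr, lo, hi in studies:
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)
        w = 1.0 / se ** 2
        weights.append(w)
        weighted_logs.append(w * math.log(rr))
    pooled_log = sum(weighted_logs) / sum(weights)
    pooled_se = 1.0 / math.sqrt(sum(weights))
    return (math.exp(pooled_log),
            math.exp(pooled_log - 1.96 * pooled_se),
            math.exp(pooled_log + 1.96 * pooled_se))

# Hypothetical per-study ratios RRrural/RRurban with 95% CIs
studies = [(1.10, 0.95, 1.27), (1.02, 0.90, 1.16), (1.08, 0.98, 1.19)]
pooled, ci_lo, ci_hi = pool_relative_risks(studies)
```

A random-effects model (e.g., DerSimonian-Laird) would add a between-study variance term to the weights; the fixed-effect version above shows only the core arithmetic.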

P.179  Modeling the Growth of Media Attention and Public Attention during Disasters. Li J*; University of Science and Technology of China

Abstract: Understanding the growth patterns of media attention and public attention during disasters is a key issue for disaster communication. Based on the Gaussian function, this study constructed three growth models to estimate the growth of media attention and public attention, and then tested the models using data from 41,016 news stories about 185 disasters that occurred in China from 2003 to 2012. The factors that influence the likelihood of media attention include newsworthiness and disaster severity, while newsworthiness, disaster severity, and GDP per capita of the affected area affect the likelihood of public attention. We also compared the strength of media attention with that of public attention; results show that causation, newsworthiness, and frequency influence these differences.

P.180  Analysis of the Corpus Christi Refinery Row Public Health Assessment. Lange SL*, Jones L, Haney JT, McCant D, Schaefer HR, Phillips T, Honeycutt ME; Texas Commission on Environmental Quality

Abstract: The US EPA, the Texas Commission on Environmental Quality, and the citizens and industry in Corpus Christi have spent decades working together to improve air quality, particularly in the area called Refinery Row. In August 2016, the Agency for Toxic Substances and Disease Registry (ATSDR) released a draft public health assessment (PHA) for the Corpus Christi Refinery Row area in Texas. ATSDR analyzed 1980-2010 air monitoring data from the area and drew conclusions about the public health risk posed by monitored chemical concentrations. Those conclusions included concerns about public health risk from exposure to a number of chemicals, including benzene, chromium, and particulate matter, as well as discussions of increased cancer risk and birth defect prevalence. The objective of our analysis of the PHA was to compare standard toxicological and risk assessment methods with those used by ATSDR in order to determine the validity of its public health risk conclusions. We found significant differences between standard risk assessment and toxicological practice and the methods in the PHA, including: using chemical concentrations measured on industrial sites for a public health exposure analysis; not considering sampling duration and exposure duration when calculating risk; using highest yearly mean concentrations for lifetime cancer risk calculations instead of data from the entire sampling period; and not appropriately communicating uncertainties of the birth defect and cancer risk analyses. Our own analysis of current monitoring data shows that no monitored chemical concentrations in this area are above toxicity comparison values, so the measured levels would not be expected to cause health concerns.
A public health assessment that uses inappropriate risk assessment methods and improperly communicates uncertainties in the analysis can cause undue alarm in the population and can damage hard-won relationships between the public and regulatory agencies.

P.181  The Open Data for Resilience Initiative: Approaches for Making Risk Analysis More Transparent, Inclusive, and Effective. Soden R*, Balog S, Deparday V; World Bank

Abstract: The World Bank's Global Facility for Disaster Reduction and Recovery (GFDRR) launched the Open Data for Resilience Initiative (OpenDRI) in 2011 to make disaster and climate risk assessment more inclusive, transparent, and effective. Since then, OpenDRI has worked in over 30 countries to bring the philosophy and practices of the open data movement to the challenge of building a more resilient future. Central to our approach has been an understanding of risk analysis as a situated, socio-technical process in which the landscape of human capacities, social networks, and institutional arrangements is as important as the data and models that are produced. This understanding has led us to explore a variety of tactics outside the traditional toolbox of risk modelers, including exploring the opportunities offered by serious games, involving non-traditional actors in peer review processes, investing in the development of open source software communities, and working with participatory mapping techniques to develop asset inventories. Through five years of work, OpenDRI has drawn from diverse fields of practice and scholarship to craft a bespoke approach to risk analysis and communication. We are currently launching a research project aimed at: 1) assessing the efficacy of our work with our in-country partners and its contribution to the wider community of practice in which we participate; 2) more clearly framing our activities within the research areas of risk communication and civic technology; and 3) charting a path forward for the next steps of our program. The project will provide both a framework for assessing this work and clear, practical guidance for those seeking to implement similar efforts. We will also include case studies from projects developed by the OpenDRI team and other organizations. Our presentation will describe this work and solicit feedback and partnership from attendees.

P.182  Air Quality and Unconventional Oil and Natural Gas Development: A Systematic Review of the Literature from a Public Health Perspective. Naufal Z, Blake U*; American Petroleum Institute

Abstract: The expansion of oil and natural gas development from unconventional resources in the US has brought significant benefits along with increased scrutiny by academics, NGOs, and regulators. Air emissions associated with oil and gas operations, and their potential impacts on human health and the environment, are among the cited concerns. A number of studies of ambient air quality and oil and gas emissions have been conducted in several major unconventional oil and natural gas development areas in the US. Using measured air quality data as a surrogate for personal exposures, some of these studies have attempted to determine whether associations exist between concentrations of air pollutants near oil and gas operations and human health effects in neighboring communities. A systematic review of peer-reviewed studies published between January 2008 and June 2016, together with a qualitative analysis of the ambient data (qualitative because of the heterogeneity of the studies), was conducted to assess the usability and generalizability of the data as a surrogate for personal exposures. Based on defined inclusion/exclusion criteria, 25 of the more than 400 identified peer-reviewed studies were determined to be eligible for review. Our findings suggest that there are clear limitations that hamper the applicability of the current data in elucidating the relationship between oil and natural gas industry activities, ambient air quality, and implications for public health. This review also outlines the challenges to generalizing study findings by highlighting the importance of temporal and spatial variability in air emissions from industry operations.

P.183  Guiding versus choosing: the role of Life Cycle Assessment in US state level policymaking. Scott RP*, Cullen AC; University of Washington

Abstract: Life Cycle Assessment (LCA) is commonly proposed as a tool for assessing the complete environmental footprint of policy and infrastructure decisions. While its use in private and federal decision making has been characterized in previous research, we evaluate the characteristics of LCA that have either enhanced or limited the usefulness of the methodology in US state-level policy decisions. After selecting cases of state-funded LCA via a systematic web search, we review 26 cases of states applying LCA as a method of assessing the environmental impacts of policy alternatives. Each LCA is coded in light of its intended use, and the methods and data utilized are classified. We pay particular attention to the treatment of uncertainty in input data, impact categories, and potential alternatives. We find that data limitations and impact uncertainties are frequently cited as reasons for calling LCA results into question; however, such explanations are much more common when LCA results are not tied to a specific decision context. Interestingly, we find LCA is primarily used to explore potential alternatives, not as a method of comparing the relative merits of alternatives. This suggests that LCA may currently be useful as a method for suggesting alternatives and shaping proposals for the policy process, while methodological limitations and uncertainties appear to challenge its usefulness as a method for deciding among alternatives. We therefore suggest that state-level practitioners be vigilant about emphasizing the usefulness of LCAs for future decisions while also producing analyses that could later be tailored for use within a specific decision context.

P.184  Quantity Neglect in Judgments of the Ecological Impact of “Green” Consumer Goods. Kim B*, Schuldt JP; Cornell University

Abstract: Over recent decades, limiting consumption has been emphasized as a means to reduce environmental impacts and mitigate global risks from climate change. In this vein, consumption of pro-environment or "green" consumer goods is widely considered highly desirable, with research suggesting that it gives rise to a halo effect in judgment. However, little is known about the role of quantity in people's judgments about the ecological impact of pro-environmental goods, an important omission given that it takes more resources to produce more goods, even "green" goods. To address this gap, we report on an experiment in which participants (n = 274) judged the total carbon footprint of a fictional family that was depicted as owning and driving either one Toyota Prius (a widely recognized hybrid-electric vehicle) or two Toyota Priuses, depending on randomly assigned condition. In addition, we measured individual-difference variables shown to predict the emergence of "green" halo effects in past research, including participants' level of pro-environmental values (using the New Ecological Paradigm scale; Dunlap et al., 2000). Results revealed that whereas participants with stronger pro-environmental values (M + 1SD) rated the family's total household carbon footprint as significantly greater in the two-Prius condition than in the one-Prius condition (thus accurately accounting for quantity in their ecological impact judgments), participants with weaker pro-environmental values (M - 1SD) rated the family's carbon footprint similarly regardless of the number of vehicles owned, demonstrating what we are calling "quantity neglect." A follow-up study currently underway attempts to replicate this effect and to more systematically examine alternative theoretical accounts for it, including the greater likelihood of cognitive elaboration among pro-environmental individuals, for whom the experimental scenario may have been deemed more personally relevant.

P.185  The role of systematic review in risk assessment – the missing link between the objectivity and transparency of scientific evidence and confidence of regulatory decisions. Tsaioun K*, Stephens ML, Hoffmann S, Maertens A, Busquet F, Hartung T; EBTC and CAAT Johns Hopkins Bloomberg School of Public Health

Abstract: The public has for some time demanded transparent and objective assessment of the effectiveness and safety of medicines and other chemicals for humans and the environment. In clinical research, such approaches, termed evidence-based (EB), brought the needed objectivity, consistency, and transparency to the field and imposed strict design and publication standards on clinical trials. This allows clinicians and the educated public to quickly compare the effectiveness of different treatments in systematic reviews. EB approaches include, inter alia, the establishment of a common ontology, improved and consistently structured reporting of evidence, probabilistic uncertainty and risk assessment, and the development of data integration and synthesis methodology. However, these approaches have not yet been adopted in chemicals risk assessment. As a result, there are many published studies, but their number guarantees neither quality nor consistency and does not lead to clear conclusions, which causes public confusion and lack of confidence in policy. Confidence is further compromised by the lack of a priori determination of specific primary and secondary endpoints, the lack of declared, pre-specified approaches to data analysis, the lack of transparency in post-hoc statistical analyses, and the lack of availability of the primary data that would allow alternative analyses. EB methods are designed to assemble, assess, integrate, analyze, and summarize different streams of evidence (animal, epidemiological, mechanistic, in vitro, in silico) in a transparent and objective manner. EB methods, adopted by industry and regulators and understood by the public, can bridge the gap between evidence generation and regulatory decisions. This approach holds great promise to become a standard that informs confident regulatory decisions. The theory and principles of evidence-based methodologies will be described, along with examples of the application of this approach to the assessment of safety test methods.

P.186  Combined Incremental Lifetime Cancer Risk for Nitrosamines: a comparison of combustible cigarette and e-cigarette emissions. Fiebelkorn SA*, Meredith C; British American Tobacco, Research and Development, Southampton, Hampshire, United Kingdom

Abstract: Eight nitrosamines have been identified by the US FDA as Harmful and Potentially Harmful Constituents (HPHC) in tobacco smoke. Some of these have been detected in e-cigarette emissions. Combined Incremental Lifetime Cancer Risk (ILCR) can be used to prioritise toxicants based on published cancer potency values and estimated human exposure. We present here the results for 14 nitrosamines (NNK, NAB, NAT, NNN, NDBA, NDEA, NDELA, NDiPA, NDMA, NDPA, NEMA, NMOR, NPIP and NPYR) based on yield data for cigarette and e-cigarette emissions reported by Margham et al. (Chem. Res. Toxicol., 2016, 29, 1662-1678). Inhalation unit risks (IUR) or oral cancer slope factors (CSF) published by the USEPA or OEHHA are available for 11 of the nitrosamines. CSFs were converted to IURs assuming a human daily breathing volume of 20 m3 and a 70 kg body weight. For NAB, NAT and NDiPA, no published cancer potency values were identified, and measured yields for e-cigarettes were below the limit of detection or not quantifiable. The ILCR for each nitrosamine was calculated by multiplying the IUR by the estimated daily exposure for each of the two product types. Daily exposure was based on average daily usage estimates and machine-generated yields. The yields used are based on measured values or derived from limits of detection or quantification. For the nitrosamines, the combined ILCR due to continuous smoking exposure, based on a simple additivity model, was estimated as 4.22 x 10-3, while for e-cigarettes it was estimated as 1.2 x 10-5. Further, assuming that nitrosamines below the limit of detection or equivalent to the air blank level are not present, the adjusted combined ILCRs become 4.19 x 10-3 for smoking and 2.7 x 10-6 for e-cigarette use, representing almost three orders of magnitude of difference.
Since the CSF data used are generally based on oral rather than inhalation exposure, and on liver rather than lung lesions, further assessment based on inhalation data is recommended when such data become available.
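The ILCR arithmetic described above (CSF-to-IUR conversion at 20 m3/day and 70 kg, then ILCR = IUR x exposure, summed under simple additivity) can be sketched as follows. All slope factors and exposure levels below are illustrative placeholders, not the published nitrosamine potency values or the measured yields from Margham et al.:

```python
def iur_from_csf(csf_per_mg_kg_day,
                 breathing_m3_day=20.0, body_weight_kg=70.0):
    """Convert an oral cancer slope factor ((mg/kg-day)^-1) to an
    inhalation unit risk per ug/m3, using the abstract's assumptions
    of 20 m3/day breathed and 70 kg body weight.  The factor of 1000
    converts mg to ug."""
    return csf_per_mg_kg_day * breathing_m3_day / body_weight_kg / 1000.0

def combined_ilcr(exposures_ug_m3, iurs_per_ug_m3):
    """Simple additivity model: the combined ILCR is the sum of the
    per-toxicant products IUR x lifetime-average exposure."""
    return sum(iur * exp for iur, exp in zip(iurs_per_ug_m3, exposures_ug_m3))

# Hypothetical inputs for three toxicants (illustrative only):
# one IUR derived from a hypothetical oral CSF, two given directly.
iurs = [iur_from_csf(49.0), 1.4e-2, 2.6e-3]   # per ug/m3
exposures = [2.0e-4, 5.0e-5, 1.0e-4]          # lifetime-average ug/m3
total_ilcr = combined_ilcr(exposures, iurs)
```

With these placeholder numbers, `iur_from_csf(49.0)` gives 0.014 per ug/m3, and the combined ILCR is simply the sum of the three per-toxicant risks.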

P.187  Accident risk assessment for China's petroleum and chemical enterprises. Zhao Y*; Peking University

Abstract: With the development of society and the economy, China has become a large producer and consumer of petrochemical products in recent years. By 2015, there were about 30,000 enterprises above state-designated size in China's petroleum and chemical industry. However, due to weak safety management capacity and a lack of prevention and control measures, environmental incidents and safety accidents occur frequently, directly threatening public health and the environment. To set reasonable insurance expenses and security budgets, quantitative risk assessment and evaluation of possible accident losses are needed. Further, risk analysis can provide a basic framework for accident prediction and loss distribution simulation for petrochemical enterprises, and provide reference values for related risk assessment model parameters. In view of this, this study assesses accident risk in Chinese petrochemical enterprises using a probabilistic risk model. By retrieving China's petrochemical accident news during 2006-2015 through web crawlers, combined with accident information disclosed on the websites of the China State Administration of Work Safety, the Ministry of Environmental Protection, and their affiliates, this study compiled a petrochemical accident information database containing 1,509 accident records. The accident risk is analyzed by employing Bayesian methods for both the frequency (Poisson) and severity (Generalized Pareto) distributions. The results show significant differences in severity and frequency among provinces: the highest petrochemical accident risk is found in provinces along China's eastern coast, where more extreme accidents occur. In addition, accident time trends at the province level are independent of the accident risk level of the province.
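A minimal frequency-severity simulation in the spirit of the Poisson/Generalized Pareto model described above might look as follows. The accident rate and tail parameters are illustrative assumptions, not values estimated from the 1,509-record database, and the Bayesian posterior-updating step is omitted:

```python
import math
import random

def sample_poisson(lam, rng):
    """Knuth's method: multiply uniforms until the running product
    drops below e^-lam; the count of draws gives a Poisson sample."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while p > limit:
        k += 1
        p *= rng.random()
    return k - 1

def sample_gpd(sigma, xi, rng):
    """Inverse-CDF sample from a Generalized Pareto severity
    distribution with scale sigma and shape xi > 0."""
    u = rng.random()
    return sigma / xi * ((1.0 - u) ** (-xi) - 1.0)

def simulate_annual_losses(lam, sigma, xi, n_years, seed=1):
    """Monte Carlo aggregate loss: each year draws a Poisson accident
    count, then sums a GPD severity for each accident."""
    rng = random.Random(seed)
    losses = []
    for _ in range(n_years):
        n_accidents = sample_poisson(lam, rng)
        losses.append(sum(sample_gpd(sigma, xi, rng)
                          for _ in range(n_accidents)))
    return losses

# Hypothetical province: ~12 accidents/year with heavy-tailed severities
losses = simulate_annual_losses(lam=12.0, sigma=0.5, xi=0.3, n_years=2000)
```

The resulting empirical loss distribution could then feed insurance pricing or budgeting, e.g. by reading off its mean and high quantiles; in the full Bayesian treatment, lam, sigma, and xi would be posterior draws rather than fixed values.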

P.188  Reactions to Terror: In the Air and on the Ground. Baucum MP*, Rosoff H, John RS; University of Southern California

Abstract: Understanding the differential impact of terrorism across the various contexts in which it strikes is of paramount importance for researchers and policymakers who wish to keep up with terrorism's evolving nature. Extant literature has focused mainly on terrorism in the aviation sphere, largely due to the high-profile and disastrous nature of the 9/11 attacks. However, recent research has suggested that the threat of terrorism today looks much different than it did in 2001, shifting more toward soft-target attacks on urban spaces and public transportation systems. Given the apparent shift to low-security targets, the current paper aims to uncover how the context of an attack might affect the public's psychological and behavioral reactions. This study presented participants with mock news stories featuring 1) a suicide bombing on a public bus, 2) a cyberattack on a public train, or 3) a suicide bombing at an airport passenger terminal. Path analysis was used to model the relationships between pre-existing risk attitudes and post-scenario measurements of negative affect, risk perception, and intention to alter travel behavior. Analyses revealed that the attack context manipulation had its greatest impact through moderating the relationships between other risk-related variables. Results also replicated past findings regarding the nuanced role of negative affect in risk perception. We discuss the usefulness of Partial Least Squares path analysis in risk perception studies, as well as how our results reflect aviation's focal role in the Western terrorism narrative and how this might affect future research endeavors and risk communication efforts.

P.189  Integrated microbial risk assessment of infection by Giardia and Cryptosporidium from drinking water delivered by eleven surface water systems in Sao Paulo State, Brazil. Razzolini MTP*, Lauretto MS, Sato MIZ, Nardocci AC; University of Sao Paulo and CETESB

Abstract: Giardia and Cryptosporidium were quantified monthly over one year in surface water catchments from 11 cities across Sao Paulo state to support surveillance actions based on risk assessment. The percentage of samples below the detection limit (DL, 0.01/L) ranged from 16.7% to 100% for Giardia (mean = 70.5%) and from 1.7% to 91.7% for Cryptosporidium (mean = 69.8%), with maximum concentrations of 17.7 cysts/L and 11.5 oocysts/L, respectively. The high proportion of negative samples and the high variability in pathogen concentrations typically make it challenging to fit the data and estimate risk in an integrated approach. In this study, variability was treated via cluster analysis: each catchment point was summarized by the pair of log geometric means for the parasites (assuming half-DL for concentrations below the DL).

P.190  Making the Case for Watches, Warnings, and Advisories: Results from a Case Study Analysis of NWS Forecasters and Partners. Eosco GM*; Eastern Research Group

Abstract: NOAA’s National Weather Service (NWS) forecasts hazardous weather situations and issues watches, warnings, advisories (WWA), and other information products to convey the threats posed by these events. These products are intended to help communities prepare for and respond to hazardous weather to protect people’s lives and property. To better understand how the NWS and its stakeholders perceive and use the current system, Eastern Research Group, Inc. (ERG) worked with the NWS to develop an online case study survey instrument (using Qualtrics) that asked participants (including NWS forecasters, emergency managers and broadcast meteorologists) to respond to a series of open-ended questions about a particular hazardous weather event where the messaging did (or did not) work well from their viewpoint or from the viewpoint of their community or audience. This survey resulted in a set of case studies that provided insights into: (1) The strengths of the current WWA system from a hazard messaging standpoint. (2) The weaknesses surrounding the WWA system from a hazard messaging standpoint and how these weaknesses relate to potential solutions. (3) Whether changing the current WWA language is desired by stakeholders. A total of 706 case studies were qualitatively analyzed. ERG used a mix of inductive and deductive approaches to analyze these data. For the first phase of the work, ERG conducted an inductive, bottom-up analysis to detect theoretical patterns in a subset of the data with no preconceived notions of particular findings. In the second phase of the analysis, the theoretical patterns were analyzed to develop emerging themes and associated keywords. These keywords were then used to employ a deductive, top-down approach to identify and summarize the recurring themes in all of the remaining case studies. The major findings, challenges, and limitations from this study will be presented.

P.191  Background Radiation Dose and Cleanup Criteria. Yu C*; Argonne National Laboratory

Abstract: Throughout history, mankind has been exposed to radiation from many environmental sources, including cosmic rays, cosmogenic radionuclides, and primordial radionuclide decay products. In its Report No. 160 (2006), the National Council on Radiation Protection and Measurements (NCRP) stated that the U.S. population receives an effective dose of 6.2 mSv per person per year. Nearly all of this dose results from ubiquitous background radiation (50%) and medical exposure (48%). Background radiation varies from location to location; for example, in Denver, Colorado, the background dose is about twice as high as in Chicago, Illinois. For radiation protection, both the NCRP and the International Commission on Radiological Protection recommend a dose constraint of 1 mSv/yr above background. This value has been adopted by the U.S. Department of Energy (DOE), the U.S. Nuclear Regulatory Commission (NRC), and other organizations, including the International Atomic Energy Agency, as the dose limit for protection of the general public. For cleanup and release of radiologically contaminated sites, both DOE and NRC implement a dose criterion of 0.25 mSv/yr plus ALARA (As Low As Reasonably Achievable). Note that this criterion is a small fraction (~4%) of the background dose and well within the variation of the dose received by the average U.S. population. In this paper, we use the RESRAD-ONSITE code, which can calculate both dose and risk for radionuclides, to compare the risks associated with background radiation and with dose-based cleanup criteria. We also use the RESRAD-BUILD code in another example to illustrate the potential dose incurred while working in a building constructed with granite (such as the U.S. Capitol Building). Using the average radionuclide concentrations found in granite, the dose for an office worker is on the order of 1 mSv/yr, which is four times higher than the DOE or NRC dose criterion for release of property.
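
The comparisons in this abstract are simple dose ratios; a quick check of the quoted figures (the dose values are those cited in the abstract, with the granite-office dose taken as 1 mSv/yr):

```python
# Figures quoted in the abstract (NCRP Report No. 160; DOE/NRC criteria).
BACKGROUND = 6.2      # mSv/yr, average U.S. effective dose per person
CLEANUP = 0.25        # mSv/yr, DOE/NRC dose-based cleanup criterion
GRANITE_OFFICE = 1.0  # mSv/yr, order-of-magnitude RESRAD-BUILD estimate

print(f"cleanup criterion / background: {CLEANUP / BACKGROUND:.1%}")       # ~4%
print(f"granite office / criterion: {GRANITE_OFFICE / CLEANUP:.0f}x")      # 4x
```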

P.192  An economic lab experiment to compare the risk and productivity between parallel and series production systems. Akai K*, Makino R, Takeshita J, Kudo T, Aoki K; Shimane University

Abstract: This study compares the risk and productivity of the parallel and series production systems proposed by Husken (2002) in Risk Analysis, Vol. 22, No. 1. We recruited Japanese undergraduate and graduate students and paid them according to their achievements in the laboratory. The experimental software was built with z-Tree, which connects subjects over a network. Subjects play the role of laborers and invest effort (at a cost) in their work; each system then determines the probability of accidents. We compare two treatments: a “death game” treatment, in which laborers cannot continue the game once an accident happens, and a “continuous game” treatment, in which laborers can keep working after an accident occurs. The laboratory results show that the parallel system is more efficient in both risk reduction and productivity under each treatment. In the death game treatment, sessions in each system finish within the first five or six periods. This result implies that learning effects among subjects are important for achieving high performance at production sites, especially in the series production system, where educating laborers about accident probabilities and the importance of risk reduction is critical.

P.193  Persistence and Stability of Large-Scale Command and Control Networks. Ganin AA*, Kitsak M, Eisenberg DA, Alderson DL, Linkov I; US Army Engineer Research and Development Center and University of Virginia; Northeastern University; Arizona State University; Naval Postgraduate School; US Army Engineer Research and Development Center

Abstract: In command and control (C2) systems comprised of social and communication networks, people and infrastructure provide diverse services to complete a shared mission. In difficult and time-sensitive decision-making situations, becoming isolated from the rest of the C2 network dramatically reduces both an individual’s capacity to perform their role and the network’s capacity to fulfill mission goals. Thus, the connectivity of these networks, and, in particular, the composition and the size of their largest connected components, must be studied to assess their robustness to unexpected losses. Well-known classical results for this form of robustness are based on percolation theory, where nodes and/or links are removed at random and the connectivity of the remaining sub-network is analyzed. Percolation theory establishes that the distribution of links among network nodes (degree distribution) is a key characteristic in determining network robustness. Still, percolation theory stops at identifying the persistence of network connectivity, and has yet to answer any questions regarding the types of nodes still connected to the network. In this work we develop a combined analysis of both the collective persistence of the connected component as well as individual node persistence. We show that networks characterized by heterogeneous degree distributions contain super-stable nodes, which typically remain part of the connected component. Our preliminary results indicate that it is not the mere number of links but rather the location of the node within the network that determines its individual persistence, which can be quantified as the likelihood the node belongs to the connected component. Where the number of links afforded to critical C2 nodes is limited via restricted social interactions and technological constraints, our results may be used for the optimal topological placement of critical nodes within C2 networks.
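
The percolation experiment underlying this kind of analysis can be sketched in a few lines. The graph below is a hypothetical preferential-attachment network standing in for a C2 system (all sizes and the 50% removal fraction are illustrative, not from the study): remove nodes at random, then compare how often high-degree versus low-degree survivors remain in the largest connected component.

```python
import random
from collections import defaultdict, deque

random.seed(1)

def preferential_graph(n, m=2):
    """Toy heterogeneous-degree graph: each new node attaches to up to m
    targets chosen with probability proportional to current degree."""
    adj = defaultdict(set)
    weighted = list(range(m))  # node ids repeated in proportion to degree
    for v in range(m, n):
        for t in set(random.choices(weighted, k=m)):
            adj[v].add(t)
            adj[t].add(v)
            weighted += [t, v]
    return adj

def largest_component(adj, alive):
    """Return the node set of the largest connected component of the
    subgraph induced by the surviving nodes `alive` (BFS per component)."""
    best, seen = set(), set()
    for s in alive:
        if s in seen:
            continue
        comp, queue = {s}, deque([s])
        seen.add(s)
        while queue:
            u = queue.popleft()
            for w in adj[u]:
                if w in alive and w not in seen:
                    seen.add(w)
                    comp.add(w)
                    queue.append(w)
        if len(comp) > len(best):
            best = comp
    return best

adj = preferential_graph(300)
by_degree = sorted(adj, key=lambda v: len(adj[v]))
low, hubs = set(by_degree[:30]), set(by_degree[-10:])

# persist[group] = [times found in giant component, times node survived removal]
persist = {"hub": [0, 0], "low": [0, 0]}
for _ in range(200):
    alive = {v for v in adj if random.random() > 0.5}  # random 50% node failure
    giant = largest_component(adj, alive)
    for group, members in (("hub", hubs), ("low", low)):
        for v in members & alive:
            persist[group][1] += 1
            persist[group][0] += v in giant

hub_rate = persist["hub"][0] / persist["hub"][1]
low_rate = persist["low"][0] / persist["low"][1]
print(f"hub persistence: {hub_rate:.2f}, low-degree persistence: {low_rate:.2f}")
```

Conditioned on surviving the removal, hubs almost always stay attached to the giant component while low-degree nodes are frequently stranded, which is the "super-stable node" effect the abstract describes.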

P.194  Meta-Analysis of Cancer in Petroleum Refinery Workers. Schnatter AR*, DeVilbiss EA, Chen M; ExxonMobil Biomedical Sciences, Inc.    A.R.SCHNATTER@EXXONMOBIL.COM

Abstract: Petroleum refineries are complex facilities that process crude oil into refined products such as motor gasoline. Workforces at these facilities involve many skilled positions, and there is relatively low turnover. This makes them an excellent candidate for epidemiologic study. Formal studies of these workers using modern statistical techniques began appearing in the 1970s. The International Agency for Research on Cancer has classified the petroleum refinery work environment as ‘probably carcinogenic to humans’. This was based on findings regarding skin cancer and leukemia in 13 studies. The latest review of the epidemiologic literature on refinery workers was over 15 years ago, and was based on 21 studies. Since then, many studies have been added to the literature and/or updated with new mortality or cancer incidence information. The purpose of this project is to perform a meta-analysis of cancer occurrence in petroleum refinery workers. Guidelines that have emerged since the last review of refinery workers (such as PRISMA) are being employed to improve transparency. The present meta-analysis will be based on 66 studies, 203 abstracted tables within each study, and over 8000 measures of risk from the 203 tables. We will summarize the development of a ‘construct’ to define relevant data within the studies, the number of studies screened, and the development and justification of exclusion criteria being employed. We will also summarize the outcomes under investigation, statistical analyses being employed including the investigation of heterogeneity, and will present a funnel plot with summary results for one outcome. The database promises to be an excellent resource for new knowledge on cancer incidence and mortality patterns in this complex work environment and will serve to supplement health surveillance activities in the company.

P.195  Influence of Risk Perception on Attitudes and Norms Regarding Electronic Cigarettes. Trumbo CW*; Colorado State University

Abstract: A hallmark of the reduction in tobacco use has been the shift in social norms concerning smoking in public. Research has shown there to be significantly greater tolerance of public electronic cigarette use than for public smoking. Related concerns about public views on electronic cigarettes reside in the attitudes that individuals hold toward the potential for electronic cigarettes to be used in smoking cessation, and the addictive potential of the devices. Such attitudes may drive views on acceptability of public electronic cigarette use. Understanding normative acceptability of electronic cigarette use in public spaces is critical for effective formulation of relevant policy and regulations. While the literature on social dimensions of electronic cigarettes is expanding, little attention has been given to the role of risk perception. This study undertakes such an effort. In spring 2015 an online survey was conducted on attitudes towards electronic cigarettes among a sample of students at a Western university (a convenience sample of students invited from several large lecture courses, 395 participants). Measures included scales for the acceptability of public electronic cigarette use, level of nicotine use from a range of sources, belief that electronic cigarettes are useful for tobacco cessation, beliefs about the addictiveness of electronic cigarettes, and the perception of risk for electronic cigarettes. All measures present good to excellent reliabilities and strong correlations. Path analysis demonstrates a model in which risk perception predicts attitudes towards addictiveness and cessation, which subsequently predict public acceptability for electronic cigarette use (sub-model R-squares of .35 and .38, p < .001). Further elaboration of this analysis will explore the relative effects of cognitive versus affective risk perception.

P.196  Investigating a System-Theoretic Framework for Mitigating Complex Risks in International Transport of Spent Nuclear Fuel. Williams AD, Jones KA*, Osborn D, Kalinina E, Mohagheghi A, Parks J; Sandia National Laboratories

Abstract: New analytical approaches are desired to effectively manage the growing complexity of safety, security and safeguards (3S) threats to the nuclear fuel cycle (NFC). As NFC global infrastructure expands within this dynamic environment, increasingly complex risks emerge—a risk space captured in the 3S challenges of transporting spent nuclear fuel (SNF) via multimodal and international transportation routes. This research hypothesizes that an integrated 3S approach to SNF transportation risk results in design, implementation and evaluation benefits. Given that the extant 3S literature is primarily conceptual, this research develops a scientific and technically rigorous 3S approach to assessing, managing and mitigating the complex risks of SNF transportation. Traditional SNF evaluation methods for 3S are challenged by uncertainties related to ignoring interdependencies, stochastic assumptions of environmental factors and time independent domain risk mitigation strategies. In contrast, this research utilizes system-theoretic frameworks to model SNF transportation and demonstrates how to assess, manage and mitigate complex risks of SNF transportation with a time dependent, dynamic control theoretic complex system model. Leveraging the growing applications of the system theoretic process analysis (STPA) and dynamic probabilistic risk assessment (DPRA) methods, this research investigates the gaps, interdependencies, conflicts and leverage points often overlooked by traditional methods that rely on analyzing each ‘S’ in isolation. Preliminary results indicate that a system-theoretic, 3S analytical framework is better able to manage the risk complexity of SNF transportation in international environments. Sandia National Laboratories is a multi-mission laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy’s National Nuclear Security Administration under contract DE-AC04-94AL85000.

P.197  A Model for Coupled Population and Infrastructure Growth. Snell ML*, Eisenberg DA; Arizona State University

Abstract: The last few years have seen a rash of "One in a Thousand Year" storm events. South Carolina, Colorado, Arizona, and most recently West Virginia have all suffered the consequences of extreme precipitation. However, these states have not suffered equally. Among the storm-affected areas, Colorado, affluent with relatively low population density, saw rapid renewal of services but has yet to see complete reconstruction. Arizona, affluent with relatively high population density, saw lower levels of infrastructure destruction with near-complete reconstruction. South Carolina, less affluent than the Phoenix region but with a similar population density, witnessed more infrastructure destruction. West Virginia, with low levels of both affluence and population density, saw infrastructure destruction on a level rarely evidenced in modern society. The consequences of storm events with similar return periods differed in each region due to the relationship between population and supportive infrastructure. We propose a new coupled population and infrastructure model to understand these differences in regional flood consequences, with future work centered on modeling adaptation to these consequential differences.

P.198  Multilayer Command and Control Networks. Eisenberg DA*, Kitsak M, Ganin A, Linkov I, Alderson DL; Arizona State University

Abstract: The expansion from studying complex networks in isolation to multilayered systems that link separate networks together is one of the fastest growing areas of research in network science. Command and control (C2) systems are sociotechnical systems comprised of interdependent networks of physical infrastructure, information and knowledge sharing, social hierarchy, and goal-driven activities, suggesting that they should be studied via this new multilayered network paradigm. Despite a growing body of research, multilayer network studies continue to overlook fundamental structures and functions of C2 systems, diminishing the capacity of experts to translate research into practice. In this poster, we outline contemporary C2 theory, how it relates to multilayer network analysis, and detail meaningful approaches to analyzing C2 systems with multilayered methods that inform real-world practice. Specifically, we review the characteristics of physical, information, cognitive, and social networks which comprise C2 systems and provide examples from military, emergency management, and civilian infrastructure domains in which they have been studied. We use this knowledge to inform more complicated multilayer structures that combine inherently different networks and analysis methods, such as nodal centralities, network structures, and dynamics. Taken together, this work integrates necessary knowledge from C2 and network literature to provide a basis for relating fundamental multilayer network research conducted on abstracted or synthetically generated systems to their real-world counterparts.

P.199  Developmental toxicity assessment of various sizes of multi-wall carbon nanotubes in mice after repeated intratracheal instillation to initiate grouping and read across. Kobayashi N, Tanaka S, Ikarashi Y, Hirose A*; National Institute of Health Sciences

Abstract: Grouping and read across are necessary and appropriate tools for filling data gaps in the hazard assessment of manufactured nanomaterials. Some studies have reported that maternal exposure to nanomaterials, including carbon nanotubes, may induce teratogenicity. In order to initiate grouping and read across for filling data gaps in the developmental toxicity including the teratogenicity of multi-wall carbon nanotubes (MWCNTs) via airway exposure, we conducted repeated intratracheal instillation studies of various sizes of MWCNTs in pregnant mice. Four types of MWCNT dispersions were repeatedly administered to pregnant Crlj:CD1(ICR) mice on gestational days 6, 9, 12, and 15 at dosages of 4.0 mg/kg/day. Ten pregnant mice per group were dissected on gestational day 17, and then developmental toxicity was evaluated.

P.200  Risk Choices of Farms under the 2014 Farm Bill. Liu XL*, Goodman T; Fort Valley State University

Abstract: The enacted 2014 Farm Bill represented a fundamental shift in U.S. agricultural policy. Traditional direct and counter-cyclical payments were replaced by three new support programs: Agriculture Risk Coverage (ARC), Price Loss Coverage (PLC), and the Supplemental Coverage Option (SCO). These “shallow loss” programs, together with the pre-existing federal crop insurance programs, built a risk management safety net that is affordable and universally available to farmers. The new Farm Bill had a tremendous effect on the risk management choices of U.S. farms. The heavily subsidized crop insurance programs became a dominant force in risk management planning: about 90 percent of crop land was enrolled in these programs, and higher levels of coverage were usually purchased. In contrast, traditional production risk tools such as diversification and shared leases, and marketing risk tools such as spreading sales and contract sales, were crowded out to some extent. Other risk-sharing vehicles, such as Community Supported Agriculture (CSA) farms and food hubs, did not rise to expected prominence, although they are effective both in mitigating risk and in promoting the development of local communities. Based on USDA data and our observations, we examined the various risk management choices of farms, weighed the pros and cons of the heavily subsidized programs, and identified CSA farms, food hubs, and organic farming as effective alternatives, specifically for small-scale farms. In the long run, federal farm subsidy programs should be designed to benefit the large number of small-scale farms, and CSA, food hubs, and organic farming should receive more attention from the government for their key roles in risk mitigation and rural community development.

P.201  Probabilistic risk assessment of the exposure to chlorpyrifos from some edible herbal medicine. Chang BS, Chen YJ, Wu KY, Chiang SY*; China Medical University

Abstract: Dietary exposure to pesticides from edible herbal medicines via food consumption is of particular concern. Chlorpyrifos, an organophosphate pesticide, has previously been detected in commonly used herbal medicines. This pilot study assessed the potential risk of chlorpyrifos exposure from some edible herbal medicines in Taiwan. Data on chlorpyrifos residues in seven commonly used edible herbal medicines, including Jujubae Fructus, were released by the Taiwan Food and Drug Administration (TFDA). Chlorpyrifos was detected in 24 out of 140 samples by gas chromatography-tandem mass spectrometry. The intake rates of edible herbal medicines were taken from the Taiwan National Food Consumption Database. These data were used as prior information. The posterior distributions of chlorpyrifos residues, daily intake, and hazard index for adults in Taiwan were assessed with Bayesian Markov chain Monte Carlo simulation using the OpenBUGS software to overcome insufficient data. The hazard index was calculated to evaluate the noncarcinogenic health risk from chlorpyrifos exposure via consumption of the edible herbal medicines. Based on the current RfD of 0.003 mg/kg bw/day for chlorpyrifos, the mean and the upper bound of the 95% confidence interval of the hazard index were calculated to be 0.06 and 0.17, respectively. These results suggest that dietary consumption of these edible herbal medicines poses no appreciable chlorpyrifos risk to humans.
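
The hazard index used here has the standard form HI = (C × IR) / (BW × RfD): residue concentration times intake rate, divided by body weight times the reference dose. A minimal Monte Carlo sketch of that calculation follows; only the RfD is taken from the abstract, and every distribution parameter below is a hypothetical placeholder, not the TFDA data or the study's Bayesian posteriors.

```python
import random
import statistics

random.seed(0)

RFD = 0.003  # mg/kg bw/day, chlorpyrifos reference dose cited in the abstract

def draw_hazard_index():
    """One Monte Carlo draw of HI = (residue * intake) / (body weight * RfD).
    All distribution parameters are hypothetical, for illustration only."""
    residue = random.lognormvariate(-4.0, 1.0)  # mg/kg in herbal medicine
    intake = random.uniform(0.001, 0.010)       # kg/day consumed
    body_weight = random.gauss(60.0, 10.0)      # kg
    return residue * intake / (body_weight * RFD)

his = sorted(draw_hazard_index() for _ in range(50_000))
mean_hi = statistics.fmean(his)
p95_hi = his[int(0.95 * len(his))]
print(f"mean HI: {mean_hi:.4f}, 95th percentile HI: {p95_hi:.4f}")
```

An HI below 1 at the upper percentiles is the usual screening basis for concluding no appreciable noncarcinogenic risk, which is the interpretation the abstract applies to its 0.06 mean and 0.17 upper bound.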

P.202  Risk Perceptions and Behavioral Adaptations to Coupled Environmental Hazards in Phoenix, AZ. Chakalian P*, Larsen L, Gronlund C, Stone B; Arizona State University; University of Michigan; University of Michigan; Georgia Institute of Technology

Abstract: How do individual characteristics and structural constraints explain variation in risk perceptions, and how do those perceptions shape individuals’ thermal comfort and behavioral adaptations to heat and power-failure hazards? Through an NSF Hazards SEES grant, researchers at Georgia Tech, the University of Michigan, and Arizona State University have been investigating what would happen in Atlanta, GA; Detroit, MI; and Phoenix, AZ if the three cities suffered metro-wide power failures lasting several days during a concurrent heat wave. Over the summer of 2016, researchers at Arizona State University collected household surveys in Phoenix, AZ to help answer this question. Using a stratified random sample, 149 surveys were collected to represent a wide range of geographies, housing types, and demographics. The 67-question surveys were administered at respondents’ doors on computer tablets and took between 10 and 20 minutes to complete. Participants were given $5.00 in compensation for their time. Respondents were asked about the perceived seriousness of several environmental risks, including extreme weather, power failure, air pollution, and climate change. The results have been analyzed to compare perceptions across hazards and across groups of participants. This poster presents preliminary results indicating both amplification and attenuation of these risk perceptions among geographically distinct groups of participants. These results provide directions for further investigation and could help increase the efficacy of risk management strategies in one of the country’s hottest cities.

[back to schedule]