Category Archives: S&T Research

Ultrafine Particulate Matter Increases Cardiac Ischemia/Reperfusion Injury via Mitochondrial Permeability Transition Pore.

Ultrafine particulate matter (UFP) has been associated with increased cardiovascular morbidity and mortality. However, the mechanisms that drive PM-associated cardiovascular disease and dysfunction remain unclear. We examined the impact of intratracheal instillation of 100 μg of UFP from the Chapel Hill, NC air shed in Sprague-Dawley rats on cardiac function, arrhythmogenesis, and cardiac ischemia/reperfusion (I/R) injury using a Langendorff working heart model. We found that exposure to UFP significantly exacerbated cardiac I/R injury without changing overall cardiac function or causing major changes in arrhythmogenesis. Cardiac I/R injury was attenuated by administration of cyclosporin A (CsA), suggesting a role for the mitochondrial permeability transition pore (mPTP) in UFP-associated cardiovascular toxicity. Isolated cardiac mitochondria displayed decreased Ca2+ buffering capacity before opening of the mPTP. These findings suggest that the UFP-associated expansion of cardiac I/R injury may result from Ca2+ sensitization of the mPTP, increasing mitochondrial permeability transition and potentially initiating mPTP-associated cell death pathways.

Identifying populations sensitive to environmental chemicals by simulating toxicokinetic variability

We incorporate inter-individual variability, including variability across demographic subgroups, into an open-source high-throughput (HT) toxicokinetics (TK) modeling framework for use in a next-generation risk prioritization approach. Risk prioritization involves rapid triage of thousands of environmental chemicals, most of which have little or no existing TK data. Chemicals are prioritized based on model estimates of hazard and exposure to decide which chemicals should be first in line for further study. Hazard may be estimated with in vitro HT screening assays, e.g., the U.S. EPA’s ToxCast program. Bioactive ToxCast concentrations can be extrapolated to doses that produce equivalent concentrations in body tissues using a reverse dosimetry approach in which generic TK models are parameterized with (1) chemical-specific parameters derived from in vitro measurements and predicted from chemical structure and (2) physiological parameters for a virtual population. We have developed HTTK-Pop, a software package that simulates population physiological parameters based on the most recent CDC NHANES data on distributions of demographic and anthropometric quantities in the modern U.S. population. HTTK-Pop implements a Monte Carlo approach, accounting for the correlation structure in physiological parameters, which is used to estimate ToxCast oral equivalent doses for the most sensitive portion of the population. For risk prioritization, oral equivalent doses are compared to estimates of exposure rates based on NHANES urinary analyte biomonitoring data. The inclusion of inter-individual variability in the TK modeling framework allows targeted risk prioritization for demographic groups of interest, including potentially sensitive life stages and subpopulations.
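As an illustration of the reverse dosimetry step described above, the following Python sketch draws a virtual population, computes steady-state plasma concentrations per unit oral dose under a simple one-compartment clearance model, and converts a hypothetical bioactive concentration into oral equivalent doses. The distributions, parameter names, and values here are placeholders for illustration only; HTTK-Pop itself samples correlated, NHANES-based physiological distributions, which this toy example does not attempt to reproduce.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_population(n):
    """Draw hypothetical physiological parameters for n virtual individuals.
    These independent lognormal/beta draws stand in for the correlated,
    NHANES-based distributions used by HTTK-Pop."""
    return {
        "renal_clearance_Lph":   rng.lognormal(np.log(7.0), 0.3, n),
        "hepatic_clearance_Lph": rng.lognormal(np.log(10.0), 0.5, n),
        "fraction_unbound":      rng.beta(2, 8, n),
    }

def steady_state_css(pop, dose_mg_per_kg_day=1.0, body_weight_kg=70.0):
    """Steady-state plasma concentration (mg/L) for a constant oral dose
    under a simple one-compartment, clearance-limited model."""
    total_clearance = pop["fraction_unbound"] * (
        pop["renal_clearance_Lph"] + pop["hepatic_clearance_Lph"]
    )  # L/h
    dose_mg_per_h = dose_mg_per_kg_day * body_weight_kg / 24.0
    return dose_mg_per_h / total_clearance

def oral_equivalent_dose(bioactive_conc_uM, molar_mass_g_per_mol, css_per_unit_dose):
    """Reverse dosimetry: dose (mg/kg/day) producing the bioactive concentration."""
    bioactive_mg_per_L = bioactive_conc_uM * 1e-6 * molar_mass_g_per_mol * 1e3
    return bioactive_mg_per_L / css_per_unit_dose

pop = sample_population(10_000)
css = steady_state_css(pop)                 # Css per 1 mg/kg/day
oed = oral_equivalent_dose(1.5, 250.0, css)  # hypothetical bioactive concentration of 1.5 uM
print("5th percentile oral equivalent dose (sensitive individuals):", np.percentile(oed, 5))
```

Lower percentiles of the oral equivalent dose distribution correspond to more sensitive individuals, since less external dose is needed to reach the bioactive internal concentration.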

On-road Emissions and Chemical Transformation of Nitrogen Oxides

Nitrogen dioxide (NO2) is not only linked with a number of adverse effects on the respiratory system but also contributes to the formation of ground-level ozone (O3) and fine particulate matter (PM2.5) pollution. NO2 levels near major roads have been monitored as part of the one-hour and annual NO2 standards in the revised National Ambient Air Quality Standards (NAAQS). Our analysis of near-road monitoring data in Detroit, MI and Atlanta, GA strongly suggests that a large fraction of NO2 is actually produced through the chemical reaction of NO with O3 during the “tailpipe-to-road” stage, even with a relatively short residence time. To further substantiate this finding, we designed a field campaign to compare tailpipe-level and on-road NO2 concentrations normalized by CO2 concentrations. This comparison was accomplished by measuring NO2 and CO2 concentrations in the same exhaust plumes at the tailpipe level with a Portable Emission Measurement System (PEMS) and at the on-road level with an electric vehicle-based mobile platform. The results showed that CO2-normalized NO2 concentrations, which account for the effect of dilution, were significantly higher at the on-road level than at the tailpipe level. Furthermore, we employed a turbulent reacting flow model, CTAG, to simulate the coupled on-road turbulence and chemistry behind a single vehicle, and found that a simplified chemical mechanism using a three-reaction (NO-NO2-O3) system can largely capture the rapid NO-to-NO2 conversion (on a timescale of seconds) observed in the field studies. In summary, results from near-road monitoring, on-road experiments, and numerical simulations all support the importance of on-road NOx chemical transformation. Our findings provide insights for developing future near-road NO2 mitigation and monitoring strategies.
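To make the reduced mechanism concrete, here is a minimal Python sketch of a three-species NO-NO2-O3 box model in which NO titrates ambient O3 to NO2 and NO2 photolysis regenerates NO and O3. The rate constants and initial plume concentrations are rough, generic values chosen only to illustrate the seconds-scale conversion; they are not the CTAG model's parameters, and the sketch omits dilution and turbulence entirely.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Approximate values for ~298 K and midday photolysis, for illustration only.
K_NO_O3 = 4.4e-4   # ppb^-1 s^-1, NO + O3 -> NO2 (+ O2)
J_NO2 = 8.0e-3     # s^-1, NO2 + hv -> NO + O; O assumed to re-form O3 instantly

def rhs(t, y):
    no, no2, o3 = y
    titration = K_NO_O3 * no * o3   # converts NO and O3 into NO2
    photolysis = J_NO2 * no2        # regenerates NO and O3
    return [-titration + photolysis,
             titration - photolysis,
            -titration + photolysis]

# Fresh exhaust plume mixing into background air: NO-rich, modest NO2, ambient O3 (ppb).
y0 = [200.0, 20.0, 40.0]
sol = solve_ivp(rhs, (0.0, 30.0), y0, dense_output=True)

for t in (0, 5, 10, 30):
    no, no2, o3 = sol.sol(t)
    print(f"t={t:>2.0f} s  NO2/NOx = {no2 / (no + no2):.2f}  O3 = {o3:.1f} ppb")
```

Even this toy system shows the NO2/NOx ratio rising within seconds as ambient O3 is consumed, consistent with the on-road conversion discussed above.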

Rubbertown NGEM Demonstration Project – Community Meeting 3

Rubbertown Next Generation Emission Measurement Demonstration
Background:
Industrial facilities, regulators, and nearby communities have a mutual interest in the effective detection of fugitive emissions of volatile organic and odorous compounds. If unanticipated emissions that require mitigation can be found and fixed in a timely manner, multiple benefits can result, such as lower air shed impacts, healthier air quality, safer working environments, cost savings through reduced product loss, and improved community relations.
Under its Next Generation Emission Measurement (NGEM) program, EPA is working to develop new sensor and modeling approaches that can assist facilities in detection and mitigation of fugitive air pollution sources from facility leaks and operational malfunctions.
To help advance NGEM research, EPA and the Louisville Metro Air Pollution Control District (LMAPCD) are working together on a research project to demonstrate NGEM approaches near facilities in the Rubbertown industrial area of Louisville, KY.
The area has faced challenges related to the control of ozone and the exposure of local communities to pollutants. LMAPCD has made extensive efforts to control air toxics, including ozone precursors, in the area. Despite these efforts, fugitive emissions remain a source of concern from both an air quality agency and a community perspective. These air quality challenges and the close physical proximity of industrial sources made the Rubbertown industrial area an ideal location for studying NGEM technologies.
Approach:
The project team will conduct a year-long demonstration field study of select NGEM technology prototypes developed by EPA researchers and other groups. The study will start in September 2017.
The project is a measurement study with goals to document NGEM system performance and advance NGEM methods while producing source emission case study data useful for both LMAPCD and industrial facilities.
The research supports the community, the City of Louisville, and industry by furthering the development of innovative and cost effective approaches to improve air quality monitoring that protects public health and the environment.
Scientists will measure volatile organic compounds (VOCs) and air toxics using a network of SPod fenceline sensor systems, field-packaged gas chromatographs (GCs), and open-path spectroscopic equipment. They will also perform GMAP (OTM 33) mobile measurements using vehicles equipped with time-resolved sensors, driven downwind in relatively close proximity to potential sources.
Science questions include:
• Can emerging NGEM approaches cost-effectively augment current industry work practices to help identify and reduce emissions?

Navigating through the minefield of read-across tools: A review of in silico tools for grouping

Read-across is a popular data gap filling technique used within analogue and category approaches for regulatory purposes. In recent years there have been many efforts focused on the challenges involved in read-across development, its scientific justification, and its documentation. Tools have also been developed to facilitate read-across development and application. Here, we describe a number of publicly available read-across tools in the context of the category/analogue workflow to better articulate their respective capabilities, strengths, and weaknesses. No single tool addresses all aspects of the workflow. We highlight how the different tools can complement each other and identify opportunities for their further development to address the continued evolution of read-across.

Application of IATA – A case study in evaluating the global and local performance of a Bayesian Network model for Skin Sensitization

Since the publication of the Adverse Outcome Pathway (AOP) for skin sensitization, there have been many efforts to develop systematic approaches to integrate the information generated from different key events for decision making. The types of information characterizing key events in an AOP can be generated from in silico, in chemico, in vitro, or in vivo approaches. Integration of this information and its interpretation for decision making are known as integrated approaches to testing and assessment (IATA). One such IATA, published by Jaworska et al. (2013), describes a Bayesian network model known as ITS-2. The current work evaluated the performance of ITS-2 using a stratified cross-validation approach. We also characterized the impact of refinements to the network by replacing its most significant component, the output from the commercial expert system TIMES-SS, with structural alert information readily generated from the freely available OECD QSAR Toolbox. Without any structural alert flags or TIMES-SS predictions, the network predicted sensitization potential with an accuracy of 79% (+3%/-4%). If the TIMES-SS prediction was replaced by an indicator for the presence of a structural alert, the network predictivity increased to 84% (+2%/-4%), only slightly lower than that of the original network (89% ±2%). The local applicability domain of the original ITS-2 network was also evaluated using reaction mechanistic domains to better understand which types of chemicals ITS-2 made the best predictions for, i.e., a local validity domain analysis. We ultimately found that the original network was successful at predicting which chemicals would be sensitizers, but not at predicting their relative potency.
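For readers unfamiliar with the evaluation design, the sketch below shows how a stratified cross-validation of a binary sensitizer classifier can be set up in Python with scikit-learn. The data, features, and the naive Bayes stand-in model are all hypothetical; the actual ITS-2 Bayesian network and its inputs are described in Jaworska et al. (2013).

```python
import numpy as np
from sklearn.model_selection import StratifiedKFold
from sklearn.naive_bayes import BernoulliNB
from sklearn.metrics import accuracy_score

# Hypothetical data: rows are chemicals, columns are binary key-event indicators
# (e.g., a structural-alert flag plus in chemico / in vitro assay calls).
rng = np.random.default_rng(1)
X = rng.integers(0, 2, size=(120, 4))
y = (X.sum(axis=1) + rng.integers(0, 2, size=120) >= 3).astype(int)  # 1 = sensitizer

# Stratified folds keep the sensitizer / non-sensitizer ratio constant in each split,
# which matters when classes are imbalanced.
skf = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
scores = []
for train_idx, test_idx in skf.split(X, y):
    model = BernoulliNB().fit(X[train_idx], y[train_idx])
    scores.append(accuracy_score(y[test_idx], model.predict(X[test_idx])))

print(f"Cross-validated accuracy: {np.mean(scores):.2f} +/- {np.std(scores):.2f}")
```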

Cold Temperature Effects on Speciated VOC Emissions from Modern GDI Light-Duty Vehicles

In this study, speciated VOC emissions were characterized from three modern gasoline direct injection (GDI) light-duty vehicles. The vehicles were tested on a chassis dynamometer housed in a climate-controlled chamber at two temperatures (20 and 72 °F) using the EPA Federal Test Procedure (FTP) and a portion of the Supplemental FTP (i.e., US06) that represents more aggressive driving conditions. The vehicles operated on gasoline blended with 10% ethanol. VOC emissions from diluted vehicle exhaust were sampled with SUMMA canisters for EPA Method TO-15 analysis and with 2,4-dinitrophenylhydrazine (DNPH) cartridges for carbonyl analysis by EPA Method TO-11A. This presentation will report the impact of ambient cold temperature, driving cycle, and GDI technology on speciated VOC emissions.

20170312 – A framework to build scientific confidence in read across results. (SOT CE course presentation)

Read-across acceptance remains a major hurdle, primarily due to the lack of objectivity and clarity on how to practically address uncertainties. One avenue for building scientific confidence in the development and evaluation of read-across is to take advantage of new in vitro bioactivity data streams, which have the potential to provide mechanistic information. A read-across prediction can be formulated in the context of a local neighborhood that is then amenable to objective evaluation using a QSAR-like framework. In this talk, we present such a framework, where the read-across prediction relies on a similarity-weighted activity of nearest neighbors based on chemistry and bioactivity descriptors. We illustrate how this framework can be used to make predictions for untested chemicals and how the uncertainty of the prediction can be evaluated dynamically across the entire neighborhood for a number of different toxicity effects.
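A minimal Python sketch of the generic similarity-weighted nearest-neighbor idea is shown below; the descriptors, similarity metric, and neighborhood size are illustrative choices, not the specific implementation presented in the talk. The spread of the analogues' activities (not computed here) is one simple handle on the neighborhood-level uncertainty mentioned above.

```python
import numpy as np

def read_across_predict(target, neighbors, activities, k=5):
    """Similarity-weighted read-across prediction (generic sketch).

    target     : 1-D descriptor vector (chemistry and/or bioactivity descriptors)
    neighbors  : 2-D array, one row of descriptors per source chemical
    activities : measured activity for each source chemical
    k          : number of nearest neighbors used as analogues
    """
    # Jaccard-like similarity for binary fingerprints; any similarity metric could be used.
    inter = np.minimum(neighbors, target).sum(axis=1)
    union = np.maximum(neighbors, target).sum(axis=1)
    similarity = inter / np.where(union == 0, 1, union)

    nearest = np.argsort(similarity)[::-1][:k]
    weights = similarity[nearest]
    if weights.sum() == 0:
        return np.nan, nearest          # no meaningful analogues in the neighborhood
    prediction = np.dot(weights, activities[nearest]) / weights.sum()
    return prediction, nearest

# Toy example with hypothetical binary fingerprints and activity values.
rng = np.random.default_rng(42)
source_fp = rng.integers(0, 2, size=(50, 16))
source_act = rng.normal(loc=2.0, scale=0.5, size=50)
pred, used = read_across_predict(source_fp[0], source_fp[1:], source_act[1:], k=5)
print("Predicted activity:", round(pred, 2), "based on analogues", used)
```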

Developing qualitative ecosystem service relationships with the Driver-Pressure-State-Impact-Response framework: A case study on Cape Cod, Massachusetts

Understanding the effects of environmental management strategies on society and the environment is critical for evaluating their effectiveness but is often impeded by limited data availability. In this article, we present a method that can help scientists support environmental managers’ thinking about causal effects on ecosystem services in coupled human and natural systems. Our method aims to model qualitative cause-effect relationships between management strategies and ecosystem services, using information provided by knowledgeable participants, and to characterize the tradeoffs between strategies. We select and organize management strategies, environmental variables, and ecosystem services as indicators using the Driver-Pressure-State-Impact-Response framework. We evaluate the relationships between indicators using a decision tree and numerical representations of interaction strength. We use a matrix multiplication procedure to model direct and indirect interaction effects, and we provide guidelines for combining effects. Results include several data tables from which information can be visualized to understand the plausible interaction effects of implementing management strategies on ecosystem services. We illustrate our method with a coastal water quality management case study on Cape Cod, Massachusetts.
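The sketch below illustrates the matrix-multiplication step in Python with a tiny hypothetical indicator network. The indicators, signs, and interaction strengths are invented for illustration and are not values from the Cape Cod case study, and the simple summation of path contributions is only one of several possible combination rules.

```python
import numpy as np

# Hypothetical DPSIR-style indicators.
indicators = ["fertilizer management", "nutrient loading", "water clarity", "shellfishing"]

# direct[i, j] = judged direct effect of indicator i on indicator j
# (negative = decreases, positive = increases, 0 = no direct link).
direct = np.array([
    [0.0, -0.8,  0.0,  0.0],   # management reduces nutrient loading
    [0.0,  0.0, -0.6,  0.0],   # nutrient loading reduces water clarity
    [0.0,  0.0,  0.0,  0.7],   # water clarity supports shellfishing
    [0.0,  0.0,  0.0,  0.0],
])

# Indirect effects through intermediate indicators are products of direct effects
# along multi-step paths, i.e., powers of the direct-effect matrix.
indirect_2step = direct @ direct
indirect_3step = indirect_2step @ direct

# One simple way to combine: total effect = sum of path contributions.
total = direct + indirect_2step + indirect_3step

i, j = indicators.index("fertilizer management"), indicators.index("shellfishing")
print("Direct effect of management on shellfishing:", direct[i, j])
print("Total (direct + indirect) effect:           ", round(total[i, j], 3))
```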

Computational Model of Secondary Palate Fusion and Disruption

Morphogenetic events are driven by cell-generated physical forces and complex cellular dynamics. To improve our capacity to predict developmental effects from cellular alterations, we built a multi-cellular agent-based model in CompuCell3D that recapitulates the cellular networks and collective cell behavior underlying growth and fusion of the mammalian secondary palate. The model incorporated multiple signaling pathways (TGFβ, BMP, FGF, EGF, SHH) in a heuristic computational intelligence framework to recapitulate morphogenetic events from palatal outgrowth through midline fusion. It effectively simulated higher-level phenotypes (e.g., midline contact, medial edge seam (MES) breakdown, mesenchymal confluence, fusion defects) in response to genetic or environmental perturbations. Perturbation analysis of various control features revealed model functionality with respect to cell signaling systems and feedback loops for growth and fusion, diverse individual cell behaviors and collective cellular behavior leading to physical contact and midline fusion, and quantitative analysis of the TGFβ/EGF switch that controls MES breakdown – a key event in morphogenetic fusion. The virtual palate model was then executed with chemical perturbation scenarios to simulate switch behavior leading to disruption of fusion following chronic (e.g., dioxin) and acute (e.g., retinoic acid, hydrocortisone) toxicant exposures. This computer model adds to similar systems models progressing toward a ‘virtual embryo’ for simulation and quantitative prediction of adverse developmental outcomes following genetic perturbation and/or environmental disruption.
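As a purely schematic illustration of the switch concept (not the CompuCell3D model itself), the toy Python function below treats the likelihood of MES breakdown as a steep Hill function of the TGFβ-to-EGF signaling ratio, so that a perturbation that raises EGF or suppresses TGFβ keeps the switch from flipping. All parameters are hypothetical.

```python
# Schematic switch: MES breakdown modeled as a steep Hill function of the
# TGFb-to-EGF signaling ratio. Threshold and Hill coefficient are arbitrary.
def mes_breakdown_fraction(tgf, egf, threshold=1.0, hill=8):
    ratio = tgf / egf
    return ratio**hill / (threshold**hill + ratio**hill)

for label, tgf, egf in [("control", 2.0, 1.0),
                        ("perturbation raises EGF", 2.0, 3.0),
                        ("perturbation suppresses TGFb", 0.6, 1.0)]:
    print(f"{label:>30s}: breakdown fraction = {mes_breakdown_fraction(tgf, egf):.2f}")
```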

Plant reproduction is altered by simulated herbicide drift to constructed plant communities

Herbicide drift may have unintended impacts on native vegetation, adversely affecting the structure and function of plant communities. However, these potential effects have rarely been studied or quantified. To determine potential ecological effects of herbicide drift, we constructed a series of small plant community plots using perennial species found in Willamette Valley, Oregon, grasslands, including: Eriophyllum lanatum (Oregon sunshine), Iris tenax (toughleaf iris), Prunella vulgaris var. lanceolata (lance selfheal), Camassia leichtlinii (large camas), Festuca roemeri (Roemer’s fescue), Elymus glaucus (blue wildrye), Ranunculus occidentalis (western buttercup), Fragaria virginiana (Virginia strawberry), and Potentilla gracilis (slender cinquefoil). Studies were conducted on two Oregon State University farms over two years and evaluated single and combined effects of drift rates of 0.01 to 0.2 x field application rates (FAR) of 1119 g ha-1 for glyphosate [active ingredient (a.i.) of 830 g ha-1 acid glyphosate] and 560 g ha-1 a.i. for dicamba. Species response endpoints were % cover, number of reproductive structures, mature and immature seed production (dry weight), and vegetative biomass. Herbicide effects differed with species, year, and farm. Among the more notable responses, Eriophyllum lanatum had a significant reduction in total seed production or % immature seed dry weight with as little as 0.01 x FAR of dicamba, glyphosate, or the combination of both herbicides, but a significant reduction in % cover near the end of the growing season only with 0.2 x FAR of both herbicides. Elymus glaucus had a significant reduction in total seed production with 0.1 x FAR of glyphosate alone or in combination with dicamba in one year. The other species showed similar trends but had fewer significant responses. These studies indicated potential unintended effects of low levels of herbicides on reproduction of native plants and demonstrated an experimental protocol whereby a plant community can be evaluated for ecological responses.

Insights from the Development of HiveScience

The National Advisory Council for Environmental Policy and Technology (NACEPT) recently assessed EPA’s approach to citizen science. The Council concluded that integration of citizen science into EPA’s structure will accelerate virtually every Agency activity. HiveScience is a new EPA citizen science project for beekeepers. Because HiveScience was the first project of its kind, it blazed a trail by developing new processes to support citizen science efforts within the Agency. Using HiveScience as a model project, this presentation describes the path from idea formation to project launch and highlights challenges associated with navigating the Agency’s policies and procedures. To realize the full potential of citizen science efforts, the Agency should address areas of procedural deficiency, thereby streamlining the project development process. This small investment will open the Agency to a modern data collection mechanism that not only enables collection of novel information but also promotes the formation of a positive relationship with the public.

Assessing Effects of Pesticides on the Bee Immune System

Populations of some managed and wild pollinators are in decline as a result of multiple interacting factors, including parasites, disease, poor nutrition, and pesticides. The role that diminished immunity plays in these declines is not understood. The U.S. Environmental Protection Agency (EPA) is working to identify and implement tests for assessing the impact of pesticides on bees, including sublethal effects on the immune system. These efforts are a response to goals described in the National Strategy for Promoting the Health of Honey Bees [Apis mellifera] and Other Pollinators. Although sublethal effects may be measured in these studies, it is uncertain how these measurement endpoints relate to regulatory risk assessment endpoints and the extent to which honey bees serve as a reasonable surrogate for non-Apis bees. The National Strategy and the 2012 EPA White Paper describing the conceptual framework for assessing risks of pesticides to bees discussed uncertainties related to assessing exposure to, and effects of, individual pesticides and combinations of pesticides on solitary and social bees. This presentation will discuss efforts to examine immune responses in non-Apis bees and how those responses relate to effects observed in honey bees.

Environmental Quality Index (EQI)

Presentation on the Environmental Quality Index (EQI)
• Developed to explore:
– associations with adverse health effects
– how various environmental factors contribute in concert to health disparities in low-income, underrepresented minority, and vulnerable populations
• Results from these studies could be used to generate hypotheses for exploring cumulative exposures in communities
– Help communities prioritize interventions
• Characterizing environmental quality across the U.S.

20170403 – Identifying "known unknowns": A comparison between ChemSpider and the US EPA’s CompTox Dashboard (ACS Spring National meeting) 1 of 7

Non-targeted analysis (NTA) workflows in high-resolution mass spectrometry require mechanisms for compound identification. One strategy for tentative identification is the use of online chemical databases such as ChemSpider. Such databases support searching by molecular formula and monoisotopic mass and rank-order the results by the associated number of data supplier sources, bringing the most likely candidate “known unknowns” to the top of the list. The U.S. EPA’s iCSS CompTox Dashboard (https://comptox.epa.gov) is a highly curated and freely available resource containing more than 720,000 chemicals of relevance to environmental health science. In this research, we evaluated the performance of the Dashboard relative to ChemSpider for the identification of “known unknowns” using 162 chemicals representing a number of previously studied datasets from the peer-reviewed literature. Molecular formulae and monoisotopic masses were searched using both applications, and the results were ordered using their different ranking approaches. A greater percentage of chemicals ranked in the top position when using the Dashboard, which offered better overall performance for identifying “known unknowns.” Additional data will be presented evaluating alternative sources for tentative identification of chemicals. For example, the presence of chemicals in consumer products was incorporated into the tentative identification process and evaluated via the Dashboard. An approach for weight-ordering of identification rankings for inclusion in a non-targeted analysis workflow is being developed as part of the CompTox Dashboard. This abstract does not necessarily represent the views or policies of the U.S. Environmental Protection Agency.
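To show the rank-ordering logic more concretely, here is a small hypothetical Python sketch: candidates matching a monoisotopic-mass query are filtered by a ppm tolerance and sorted by the number of associated data sources, with consumer-product presence as a tiebreaker. The candidate records, counts, and field names are invented; this is not the Dashboard's or ChemSpider's actual API or scoring code.

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    name: str
    formula: str
    monoisotopic_mass: float
    n_data_sources: int
    in_consumer_products: bool

def rank_candidates(candidates, query_mass, tol_ppm=5.0):
    """Keep candidates within a mass tolerance, then rank the most-referenced first."""
    tol = query_mass * tol_ppm / 1e6
    hits = [c for c in candidates if abs(c.monoisotopic_mass - query_mass) <= tol]
    # Sort by data-source count, then by consumer-product presence as a tiebreaker.
    return sorted(hits, key=lambda c: (c.n_data_sources, c.in_consumer_products), reverse=True)

# Hypothetical candidate library with invented data-source counts.
library = [
    Candidate("bisphenol A",        "C15H16O2", 228.1150, 96, True),
    Candidate("hypothetical isomer", "C15H16O2", 228.1150, 3, False),
]
for rank, c in enumerate(rank_candidates(library, 228.1150), start=1):
    print(rank, c.name, c.n_data_sources)
```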

20170824 – Enhancing the Application of Alternative Methods Through Global Cooperation (WC10)

Progress towards the development and translation of alternative testing methods for safety-related decision making is a common goal that crosses organizational, stakeholder, and international boundaries. The challenge is that different organizations have different missions and regulatory frameworks and need to apply alternative methods in different decision contexts. Advancing the development and application of alternative methods will require focusing on common goals that address key challenges in advancing toxicology testing in the 21st century and provide common benefit across organizations and international boundaries. The talk will describe the EPA’s global cooperation activities across multiple stakeholder groups for the development of alternative testing methods and the lessons learned in translating the methods to decision making. This abstract does not necessarily reflect U.S. EPA policy.

Rivers and streams in the media: a content analysis of ecosystem services

While ecosystem services research has become common, few efforts are directed toward in-depth understanding of the specific ecological quantities people value. Environmental communications as well as ecological monitoring and analysis efforts could be enhanced by such information. For example, small changes in the way ecosystems are described could strongly influence relevance to the public and improve the foundation for environmental decision-making. Clarifying valued attributes is particularly important for nonmarket ecosystem services, since price and quantity data cannot be readily observed as with goods and services bought and sold in traditional markets. Focusing on rivers and streams, we conducted a content analysis of existing publications to document the breadth and frequency with which various measurable attributes, such as flooding, water quality characteristics, and wildlife appeared in different news sources over a multiyear timeline. In addition to attributes, motivations for human interest in river-related resources were also coded, such as recreation or preservation for future generations. To allow testing of differences between materials written for different audiences, three sources were sampled: a blog hosted by National Geographic, New York Times articles, and Wall Street Journal articles. The coding approach was rigorously tested in a pilot phase, with measures developed to ensure high data quality, including use of two independent coders. Results show numerous similarities across sources with some notable differences in emphasis. Significant relationships between groups of attribute and motivation codes were also found, one outcome of which is further support for the importance of “non-use” values for fish and wildlife. Besides offering insight on ecosystem services, the project demonstrates an in-depth quantitative approach to analyzing preexisting qualitative data.

Recreational freshwater fishing drives non-native aquatic species richness patterns at a continental scale

Aim

Mapping the geographic distribution of non-native aquatic species is a critically important precursor to understanding the anthropogenic and environmental factors that drive freshwater biological invasions. Such efforts are often limited to local scales and/or to single species, due to the challenges of data acquisition at larger scales. Here, we map the distribution of non-native freshwater species richness across the continental United States and investigate the role of human activity in driving macroscale patterns of aquatic invasion.
Location

The continental United States.
Methods

We assembled maps of non-native aquatic species richness by compiling occurrence data on exotic animal and plant species from publicly accessible databases. Using a dasymetric model of human population density and a spatially explicit model of recreational freshwater fishing demand, we analysed the effect of these metrics of human influence on the degree of invasion at the watershed scale, while controlling for spatial and sampling bias. We also assessed the effects that a temporal mismatch between occurrence data (collected since 1815) and cross-sectional predictors (developed using 2010 data) may have on model fit.
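The kind of count regression this implies can be sketched in Python as below, using synthetic watershed-level data. The variable names, coefficients, and the simple log-effort offset are illustrative assumptions, and the spatial-bias corrections and temporal matching used in the actual analysis are omitted.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Synthetic watershed-level data for illustration only.
rng = np.random.default_rng(7)
n = 500
df = pd.DataFrame({
    "log_pop_density":    rng.normal(0, 1, n),   # dasymetric population density (standardized)
    "log_fishing_demand": rng.normal(0, 1, n),   # modeled recreational fishing demand (standardized)
    "sampling_effort":    rng.uniform(1, 50, n), # e.g., occurrence records per watershed
})
# Synthetic response: richness driven more strongly by fishing demand than by population.
mu = np.exp(0.5 + 0.2 * df.log_pop_density + 0.8 * df.log_fishing_demand) * df.sampling_effort / 25
df["richness"] = rng.poisson(np.asarray(mu))

X = sm.add_constant(df[["log_pop_density", "log_fishing_demand"]])
model = sm.GLM(df.richness, X, family=sm.families.NegativeBinomial(),
               offset=np.log(df.sampling_effort))   # offset controls for sampling effort
print(model.fit().summary().tables[1])
```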
Results

Non-native aquatic species richness exhibits a highly patchy distribution, with hotspots in the Northeast, Great Lakes, Florida, and human population centres on the Pacific coast. These richness patterns are correlated with population density, but are much more strongly predicted by patterns of recreational fishing demand. These relationships are strengthened by temporal matching of datasets and are robust to corrections for sampling effort.
Main conclusions

Distributions of aquatic non-native species across the continental US are better predicted by freshwater recreational fishing than by human population density. This suggests that observed patterns are driven by a mechanistic link between recreational activity and aquatic non-native species richness and are not merely the outcome of sampling bias associated with human population density.