Category Archives: S&T Research

Can Particulate Pollution Affect Lung Function in Healthy Adults?

Accompanying editorial to the paper from Harvard by Rice et al. entitled “Long-Term Exposure to Traffic Emissions and Fine Particulate Matter and Lung Function Decline in the Framingham Heart Study.” By almost any measure, the Clean Air Act and its amendments must be considered one of the most significant and arguably most successful pieces of environmental legislation in modern times (1). Air quality has improved significantly since its passage and continues to do so. The levels of fine particulate matter (PM2.5) and the larger coarse particles (PM10) have both declined by a third nationally from 2000 to 2013 (2). Because these pollutants have been implicated in respiratory and cardiac diseases, this decline is thought to have produced significant health benefits, with more than one study associating reduced air pollution with increased life expectancy (3, 4). Despite these improvements, more than 46 million people still live in areas where the annual average level of particle pollution is considered unhealthful (5). The study shows that the lung function of middle-aged men and women may also be reduced by long-term exposure to air pollution or traffic. Using the Framingham Offspring and Third Generation cohorts based in Massachusetts, the authors show that individuals residing near a major road or in areas with higher PM2.5 levels have lower average FEV1 and FVC. In addition, the natural decline in lung function with age appeared to be accelerated in these people. No association between long-term air pollution and the FEV1/FVC ratio was apparent, so the authors conclude that the effects are not associated with airflow obstruction. The findings on lung decline complement previous longitudinal studies of children living in Southern California, where PM2.5 reduced lung function growth between the ages of 10 and 18 years, the period when rapid lung development normally occurs (6).
Similar associations between air pollution and deficits in lung function growth have now been seen in schoolchildren in Mexico City (7), China (8), and Europe (9).

Considerations in linking energy scenario modeling and Life Cycle Analysis

The U.S. EPA Office of Research and Development (ORD) has been exploring approaches for estimating U.S. anthropogenic air pollutant emissions through the mid-21st century. As a result, we have developed the Emission Scenario Projection methodology, or ESP. In this document, we provide an overview of ESP development and capabilities, then highlight possible future directions. Next, we switch gears to describe a typical life cycle analysis (LCA) and briefly discuss some limitations of standard LCA approaches. Finally, linkage of ESP and LCA is proposed as a way to address these limitations. Three possible linkages are listed: (i) driving the underlying assumptions in the LCA with scenarios developed via ESP; (ii) gaining insights into the spatial allocation of LCA results using the future-year inventory projections developed with ESP; and (iii) integrating LC factors directly into the ESP energy system representation to capture LC impacts of various scenarios. This document and the related presentation are intended to stimulate discussion between the emission projection and LCA communities.

20170624 – In vitro data and in silico models for computational toxicology (Teratology Society ILSI HESI workshop)

The challenge of assessing the potential developmental health risks of the tens of thousands of environmental chemicals is beyond the capacity of resource-intensive animal protocols. Large data streams coming from high-throughput screening (HTS) and high-content screening (HCS) profiling of biological activities, coupled with machine-learning algorithms, literature-mining tools, and systems modeling, constitute a newer paradigm for toxicity testing in the 21st century. Newer resources are available to measure molecular components of cellular and tissue-level phenomena in great depth and detail. With HTS/HCS data now in hand (ToxCast/Tox21), ‘evolution’ implies the advancement of best practices and computational approaches to assemble the individual pieces into an integrative model that: scales to the human exposure universe; incorporates extant knowledge of human embryology; and deals probabilistically with spatiotemporal dynamics in a morphogenetic series of events. With the advent of computational approaches and computer models fit for that purpose, ‘revolution’ implies their continued refinement and cohesion to satisfy the fundamental principles of teratology: chemical structure, dosimetry, initiating mechanism(s), genetic susceptibility, stage specificity, and bioavailability. This presentation will provide examples of how in vitro data and in silico models can be integrated with biological knowledge to simulate how embryos might react to diverse exposure scenarios. [This abstract does not reflect US EPA policy.]

20170907- Computational embryology as an integrative platform for predictive DART (45th Conf of Europ Teratology Society)

Chemical regulation is challenged by the large number of chemicals requiring assessment for potential human health and environmental impacts. For example, the USEPA lists more than 85,000 chemicals on its inventory of substances that fall under the Toxic Substances Control Act (TSCA). While developmental and reproductive toxicity (DART) testing is an important regulatory consideration, traditional animal-based test methods focusing on apical endpoints lack the throughput and mechanistic support needed for chemicals management under TSCA reform. Hypothesis-driven Integrated Approaches to Testing and Assessment (IATA), in parallel with Adverse Outcome Pathways (AOPs) for DART, can help focus resources by strategically targeting the most probable developmental hazards predicted by alternative in vitro assays and non-testing (in silico) platforms. This paradigm shifts regulatory emphasis to high-dimensional data streams drawing from extant knowledge of cellular and molecular embryology, a compendium of high-throughput screening and high-content screening (HTS/HCS) data, and web-based chemistry dashboards. For example, the ToxCast/Tox21 program has provided HTS/HCS data on thousands of chemicals and hundreds of in vitro assays that include biochemical assays, reporter cell lines, and zebrafish developmental toxicity. Newer assays derive from pluripotent stem cell platforms (mouse, human) in the near term, with microscale organotypic culture models and engineered microphysiological systems on the horizon. Vast collections of HTS/HCS data and chemistry information, in combination with AOPs, can flip the emphasis from descriptive endpoints to mechanistic chemical-biological interactions. Translating local interactions into quantitative predictions of developmental toxicity is challenged by the complex cellular dynamics in an embryo (cell signaling, migration, proliferation, apoptosis, differential adhesion, matrix remodeling, …).
Mechanistic modeling through computational embryology can help navigate this complexity. Steady progress has been made with multicellular agent-based models (ABMs) that recapitulate morphogenetic drivers for somitogenesis, urethrogenesis, palatogenesis, and other events. Computational systems models such as these offer a novel heuristic approach to reconstruct tissue dynamics from the bottom up, cell-by-cell and interaction-by-interaction. Individually, they simulate emergent phenotypes and can be used to predict adverse outcomes or cybermorphs that bring an AOP to life. Collectively, they form an integrative platform or ‘virtual embryo’ that represents the spatial and temporal diversity of morphological development for predictive DART. [This abstract does not reflect US EPA policy.]

Ultrafine Particulate Matter Increases Cardiac Ischemia/Reperfusion Injury via Mitochondrial Permeability Transition Pore.

Ultrafine particulate matter (UFP) has been associated with increased cardiovascular morbidity and mortality. However, the mechanisms that drive PM-associated cardiovascular disease and dysfunction remain unclear. We examined the impact of intratracheal instillation of 100 µg UFP from the Chapel Hill, NC air shed in Sprague-Dawley rats on cardiac function, arrhythmogenesis, and cardiac ischemia/reperfusion (I/R) injury using a Langendorff working heart model. We found that exposure to UFP significantly exacerbated cardiac I/R injury without changes to overall cardiac function or major changes in arrhythmogenesis. Cardiac I/R injury was attenuated by administration of cyclosporin A (CsA), suggesting a role for the mitochondrial permeability transition pore (mPTP) in UFP-associated cardiovascular toxicity. Isolated cardiac mitochondria displayed decreased Ca2+ buffering before opening of the mPTP. These findings suggest that UFP-associated expansion of cardiac I/R injury may be a result of mPTP Ca2+ sensitization, resulting in increased mitochondrial permeability transition and potential initiation of mPTP-associated cell death pathways.

Identifying populations sensitive to environmental chemicals by simulating toxicokinetic variability

We incorporate inter-individual variability, including variability across demographic subgroups, into an open-source high-throughput (HT) toxicokinetics (TK) modeling framework for use in a next-generation risk prioritization approach. Risk prioritization involves rapid triage of thousands of environmental chemicals, most of which have little or no existing TK data. Chemicals are prioritized based on model estimates of hazard and exposure, to decide which chemicals should be first in line for further study. Hazard may be estimated with in vitro HT screening assays, e.g., U.S. EPA’s ToxCast program. Bioactive ToxCast concentrations can be extrapolated to doses that produce equivalent concentrations in body tissues using a reverse dosimetry approach in which generic TK models are parameterized with 1) chemical-specific parameters derived from in vitro measurements and predicted from chemical structure; and 2) physiological parameters for a virtual population. We have developed HTTK-Pop, a software package to simulate population physiological parameters based on the most recent CDC NHANES data on distributions of demographic and anthropometric quantities in the modern U.S. population. HTTK-Pop implements a Monte Carlo approach, accounting for the correlation structure in physiological parameters, which is used to estimate ToxCast oral equivalent doses for the most sensitive portion of the population. For risk prioritization, oral equivalent doses are compared to estimates of exposure rates based on NHANES urinary analyte biomonitoring data. The inclusion of inter-individual variability in the TK modeling framework allows targeted risk prioritization for demographic groups of interest, including potentially sensitive life stages and subpopulations.
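The reverse dosimetry step described above can be illustrated with a minimal Monte Carlo sketch. All distributions, parameter values, and the steady-state clearance formula below are illustrative assumptions, not HTTK-Pop's actual NHANES-derived inputs:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000  # virtual individuals

# Hypothetical lognormal distributions for hepatic clearance and renal
# filtration (L/h/kg); HTTK-Pop derives such quantities from NHANES data.
cl_hep = rng.lognormal(mean=np.log(0.5), sigma=0.4, size=n)
gfr = rng.lognormal(mean=np.log(0.1), sigma=0.3, size=n)
fub = 0.05  # fraction unbound in plasma (chemical-specific, assumed)

# Steady-state plasma concentration per unit oral dose (mg/kg/day),
# using a simple total-clearance approximation.
css_per_dose = 1.0 / (24 * (cl_hep + gfr * fub))

# Reverse dosimetry: the oral dose producing a bioactive plasma
# concentration (here an assumed value standing in for a ToxCast AC50).
bioactive_conc = 1.0
oed = bioactive_conc / css_per_dose

# The most sensitive 5% of the virtual population has the lowest
# oral equivalent dose.
oed_sensitive = np.percentile(oed, 5)
```

Individuals with low clearance reach the bioactive concentration at lower doses, so the lower tail of the oral-equivalent-dose distribution identifies the sensitive subpopulation used for prioritization.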

On-road Emissions and Chemical Transformation of Nitrogen Oxides

Nitrogen dioxide (NO2) is not only linked with a number of adverse effects on the respiratory system, but also contributes to the formation of ground-level ozone (O3) and fine particulate matter (PM2.5) pollution. NO2 levels near major roads have been monitored as part of the one-hour and annual NO2 standards in the revised National Ambient Air Quality Standards (NAAQS). Our analysis of near-road monitoring data in Detroit, MI and Atlanta, GA strongly suggests that a large fraction of NO2 is actually produced through chemical reactions with O3 during the “tailpipe-to-road” stage, even with a relatively short residence time. To further substantiate this finding, we designed a field campaign to compare tailpipe-level and on-road NO2 concentrations normalized by CO2 concentrations. This comparison was accomplished by measuring the same exhaust plumes at the tailpipe level with a Portable Emission Measurement System (PEMS) and at the on-road level with an electric vehicle-based mobile platform. The results showed that CO2-normalized NO2 concentrations, taking into account the effect of dilution, were significantly higher at the on-road level than at the tailpipe level. Furthermore, we employed a turbulent reacting flow model, CTAG, to simulate the coupled on-road turbulence and chemistry behind a single vehicle, and found that a simplified chemical mechanism using a three-reaction (NO-NO2-O3) system can largely capture the rapid NO to NO2 conversion (with a timescale of ~seconds) observed in the field studies. In summary, results from near-road monitoring, on-road experiments, and numerical simulations all support the importance of on-road NOx chemical transformation. Our findings provide insights into developing future near-road NO2 mitigation and monitoring strategies.
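The seconds-scale NO-to-NO2 conversion can be sketched with a forward-Euler integration of the three-reaction (NO-NO2-O3) system. The rate constant is a standard literature value; the plume concentrations and photolysis frequency are assumed for illustration, and dilution is ignored (unlike in the full CTAG model):

```python
# Rate constant for NO + O3 -> NO2 + O2, ~1.9e-14 cm^3 molec^-1 s^-1 at 298 K.
k1 = 1.9e-14
j_no2 = 8e-3  # NO2 photolysis frequency (s^-1), assumed daytime value

# Initial plume mixing ratios (ppb) converted to molec/cm^3
# (1 ppb ~ 2.46e10 molec/cm^3 at surface conditions); values assumed.
ppb = 2.46e10
no, no2, o3 = 500 * ppb, 25 * ppb, 40 * ppb

dt, t_end = 0.01, 30.0  # integrate 30 s of plume aging
for _ in range(int(t_end / dt)):
    titration = k1 * no * o3   # NO + O3 -> NO2 + O2
    photolysis = j_no2 * no2   # NO2 + hv -> NO + O (O + O2 -> O3)
    no += (photolysis - titration) * dt
    no2 += (titration - photolysis) * dt
    o3 += (photolysis - titration) * dt
```

With NO at several hundred ppb inside the fresh plume, the pseudo-first-order loss rate of O3 (k1·[NO]) is on the order of 0.2 s⁻¹, so ambient O3 entrained into the plume is titrated to NO2 within seconds, consistent with the rapid conversion reported above.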

Rubbertown NGEM Demonstration Project – Community Meeting 3

Rubbertown Next Generation Emission Measurement Demonstration: Background: Industrial facilities, regulators, and nearby communities have a mutual interest in the effective detection of fugitive emissions of volatile organic and odiferous compounds. If unanticipated emissions that require mitigation can be found and fixed in a timely manner, multiple benefits can result, such as lower air shed impacts, healthier air quality, safer working environments, cost savings through reduced product loss, and improved community relations.
Under its Next Generation Emission Measurement (NGEM) program, EPA is working to develop new sensor and modeling approaches that can assist facilities in detection and mitigation of fugitive air pollution sources from facility leaks and operational malfunctions.
To help advance NGEM research, EPA and the Louisville Metro Air Pollution Control District (LMAPCD) are working together on a research project to demonstrate NGEM approaches near facilities in the Rubbertown industrial area of Louisville, KY.
The area has faced challenges related to the control of ozone and the exposure of local communities to pollutants. LMAPCD has made extensive efforts to control air toxics, including ozone precursors, in the area. Despite these efforts, fugitive emissions remain a source of concern from both an air quality agency and a community perspective. These potential air quality challenges and the close physical proximity of industrial sources made the Rubbertown industrial area an ideal location for studying NGEM technologies.
The project team will conduct a year-long demonstration field study of select NGEM technology prototypes developed by EPA researchers and other groups. The study will start in September 2017.
The project is a measurement study with goals to document NGEM system performance and advance NGEM methods while producing source emission case study data useful for both LMAPCD and industrial facilities.
The research supports the community, the City of Louisville, and industry by furthering the development of innovative and cost effective approaches to improve air quality monitoring that protects public health and the environment.
Scientists will measure volatile organic compounds (VOCs) and air toxics using a network of SPod fenceline sensor systems, field-packaged gas chromatographs (GCs), and open-path spectroscopic equipment. They will also perform GMAP OTM 33 mobile measurements using vehicles equipped with time-resolved sensors that are driven downwind in relatively close proximity to potential sources.
Science questions include:
• Can emerging NGEM approaches cost-effectively augment current industry work practices to help identify and reduce emissions?

Navigating through the minefield of read-across tools: A review of in silico tools for grouping

Read-across is a popular data gap filling technique used within analogue and category approaches for regulatory purposes. In recent years there have been many efforts focused on the challenges involved in read-across development, its scientific justification and documentation. Tools have also been developed to facilitate read-across development and application. Here, we describe a number of publicly available read-across tools in the context of the category/analogue workflow to better articulate their respective capabilities, strengths and weaknesses. No single tool addresses all aspects of the workflow. We highlight how the different tools can complement each other and what some of the opportunities for their further development could be to address the continued evolution of read-across.

Application of IATA – A case study in evaluating the global and local performance of a Bayesian Network model for Skin Sensitization

Since the publication of the Adverse Outcome Pathway (AOP) for skin sensitization, there have been many efforts to develop systematic approaches to integrate the information generated from different key events for decision making. The types of information characterizing key events in an AOP can be generated from in silico, in chemico, in vitro, or in vivo approaches. Integration of this information and its interpretation for decision making are known as integrated approaches to testing and assessment, or IATA. One such IATA, published by Jaworska et al. (2013), describes a Bayesian network model known as ITS-2. The current work evaluated the performance of ITS-2 using a stratified cross-validation approach. We also characterized the impact of refinements to the network by replacing its most significant component, the output from the commercial expert system TIMES-SS, with structural alert information readily generated from the freely available OECD QSAR Toolbox. Without any structural alert flags or TIMES-SS predictions, the network yielded a sensitization potential predictivity of 79% (+3%/−4%). If the TIMES-SS prediction was replaced by an indicator for the presence of a structural alert, the network predictivity increased to 84% (+2%/−4%), which was only slightly less than that of the original network (89% ±2%). The local applicability domain of the original ITS-2 network was also evaluated using reaction mechanistic domains to better understand which types of chemicals ITS-2 predicted best – i.e., a local validity domain analysis. We ultimately found that the original network was successful at predicting which chemicals would be sensitizers, but not at predicting their relative potency.
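The stratified cross-validation used to evaluate such a model can be sketched generically. Only the fold-assignment step is shown; the actual ITS-2 evaluation details are not reproduced, and the function name is illustrative:

```python
import numpy as np
from collections import defaultdict

def stratified_folds(labels, n_folds=5, seed=0):
    """Assign each sample to a fold so that class proportions are
    preserved within every fold (a generic sketch, not the authors'
    exact procedure)."""
    rng = np.random.default_rng(seed)
    folds = np.empty(len(labels), dtype=int)
    by_class = defaultdict(list)
    for i, y in enumerate(labels):
        by_class[y].append(i)
    # Shuffle each class separately, then deal its members round-robin
    # across the folds so every fold sees the same class balance.
    for idx in by_class.values():
        for j, i in enumerate(rng.permutation(idx)):
            folds[i] = j % n_folds
    return folds
```

Stratification matters here because sensitizers and non-sensitizers are unevenly represented; without it, a fold could contain too few sensitizers to estimate predictivity reliably.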

Cold Temperature Effects on Speciated VOC Emissions from Modern GDI Light-Duty Vehicles

In this study, speciated VOC emissions were characterized from three modern GDI light-duty vehicles. The vehicles were tested on a chassis dynamometer housed in a climate-controlled chamber at two temperatures (20 and 72 °F) using the EPA Federal Test Procedure (FTP) and a portion of the Supplemental FTP (i.e., US06) that represents more aggressive driving conditions. The vehicles operated on gasoline blended with 10% ethanol. VOC emissions from diluted vehicle exhaust were sampled with SUMMA canisters for EPA Method TO-15 analysis and with 2,4-dinitrophenylhydrazine (DNPH) cartridges for carbonyl analysis by EPA Method TO-11A. This presentation will report the impact of cold ambient temperature, driving cycle, and GDI technology on speciated VOC emissions.

20170312 – A framework to build scientific confidence in read across results. (SOT CE course presentation)

Read-across acceptance remains a major hurdle, primarily due to the lack of objectivity and clarity on how to practically address uncertainties. One avenue that can be exploited to build scientific confidence in the development and evaluation of read-across is taking advantage of new in vitro bioactivity data streams, which have the potential to provide mechanistic information. A read-across prediction could be formulated in the context of a local neighborhood that is then amenable to objective evaluation using a QSAR-like framework. In this talk, we present such a framework, where the read-across prediction relies on a similarity-weighted activity of nearest neighbors based on chemistry and bioactivity descriptors. We illustrate how this framework can be used to make predictions for untested chemicals and how the uncertainty of the prediction can be evaluated dynamically across the entire neighborhood for a number of different toxicity effects.
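A similarity-weighted nearest-neighbor prediction of this kind can be sketched as follows. The Jaccard similarity on binary fingerprints is an assumption for illustration; the framework in the talk combines chemistry and bioactivity descriptors:

```python
import numpy as np

def read_across(query_desc, neighbor_desc, neighbor_activity, k=5):
    """Predict activity for a query chemical as the similarity-weighted
    mean activity of its k most similar neighbors (illustrative sketch)."""
    q = np.asarray(query_desc, dtype=bool)
    X = np.asarray(neighbor_desc, dtype=bool)
    # Jaccard similarity between the query and each neighbor fingerprint.
    inter = (X & q).sum(axis=1)
    union = (X | q).sum(axis=1)
    sim = np.where(union > 0, inter / np.maximum(union, 1), 0.0)
    # Keep the k nearest neighbors by similarity.
    top = np.argsort(sim)[::-1][:k]
    w = sim[top]
    if w.sum() == 0:  # no overlap with any neighbor: fall back to the mean
        return float(np.mean(neighbor_activity))
    return float(np.dot(w, np.asarray(neighbor_activity)[top]) / w.sum())
```

Because the prediction is an explicit weighted average over a local neighborhood, the spread of neighbor activities and similarities gives a natural handle on the uncertainty of each individual prediction.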

Developing qualitative ecosystem service relationships with the Driver-Pressure-State-Impact-Response framework: A case study on Cape Cod, Massachusetts

Understanding the effects of environmental management strategies on society and the environment is critical for evaluating their effectiveness, but is often impeded by limited data availability. In this article, we present a method that can help scientists support environmental managers’ thinking about causal effects on ecosystem services in coupled human and natural systems. Our method aims to model qualitative cause-effect relationships between management strategies and ecosystem services, and the tradeoffs between strategies, using information provided by knowledgeable participants. We select and organize management strategies, environmental variables, and ecosystem services as indicators using the Driver-Pressure-State-Impact-Response framework. We evaluate the relationships between indicators using a decision tree and numerical representations of interaction strength. We use a matrix multiplication procedure to model direct and indirect interaction effects, and we provide guidelines for combining effects. Results include several data tables from which information can be visualized to understand the plausible interaction effects of implementing management strategies on ecosystem services. We illustrate our method with a coastal water quality management case study on Cape Cod, Massachusetts.
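The matrix multiplication procedure for propagating direct effects into indirect ones can be sketched on a hypothetical four-indicator chain (the indicators and interaction strengths below are invented for illustration, not the Cape Cod case study values):

```python
import numpy as np

# Hypothetical DPSIR chain: response -> pressure -> state -> service.
labels = ["nutrient limits", "N loading", "eelgrass cover", "shellfish harvest"]
A = np.array([
    [0.0, -0.8,  0.0, 0.0],  # limits reduce nitrogen loading
    [0.0,  0.0, -0.6, 0.0],  # loading reduces eelgrass cover
    [0.0,  0.0,  0.0, 0.7],  # eelgrass supports shellfish harvest
    [0.0,  0.0,  0.0, 0.0],
])

# The p-th matrix power gives interaction effects along paths of length p;
# summing powers accumulates direct plus indirect effects.
total = np.zeros_like(A)
P = np.eye(A.shape[0])
for _ in range(3):  # paths up to length 3 span this small network
    P = P @ A
    total += P
```

Here the indirect effect of nutrient limits on shellfish harvest is (−0.8)·(−0.6)·0.7 = +0.336: two negative links along the path combine into a net positive effect, which is exactly the kind of chained inference the procedure makes explicit.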

Computational Model of Secondary Palate Fusion and Disruption

Morphogenetic events are driven by cell-generated physical forces and complex cellular dynamics. To improve our capacity to predict developmental effects from cellular alterations, we built a multicellular agent-based model in CompuCell3D that recapitulates the cellular networks and collective cell behavior underlying growth and fusion of the mammalian secondary palate. The model incorporated multiple signaling pathways (TGFβ, BMP, FGF, EGF, SHH) in a heuristic computational intelligence framework to recapitulate morphogenetic events from palatal outgrowth through midline fusion. It effectively simulated higher-level phenotypes (e.g., midline contact, medial edge seam (MES) breakdown, mesenchymal confluence, fusion defects) in response to genetic or environmental perturbations. Perturbation analysis of various control features revealed model functionality with respect to cell signaling systems and feedback loops for growth and fusion, diverse individual cell behaviors and collective cellular behavior leading to physical contact and midline fusion, and quantitative analysis of the TGFβ/EGF switch that controls MES breakdown – a key event in morphogenetic fusion. The virtual palate model was then executed with chemical perturbation scenarios to simulate switch behavior leading to a disruption of fusion following chronic (e.g., dioxin) and acute (e.g., retinoic acid, hydrocortisone) toxicant exposures. This computer model adds to similar systems models toward a ‘virtual embryo’ for simulation and quantitative prediction of adverse developmental outcomes following genetic perturbation and/or environmental disruption.

Plant reproduction is altered by simulated herbicide drift to constructed plant communities

Herbicide drift may have unintended impacts on native vegetation, adversely affecting the structure and function of plant communities. However, these potential effects have rarely been studied or quantified. To determine potential ecological effects of herbicide drift, we constructed a series of small plant community plots using perennial species found in Willamette Valley, Oregon grasslands, including: Eriophyllum lanatum (Oregon sunshine), Iris tenax (toughleaf iris), Prunella vulgaris var. lanceolata (lance selfheal), Camassia leichtlinii (large camas), Festuca roemeri (Roemer’s fescue), Elymus glaucus (blue wildrye), Ranunculus occidentalis (western buttercup), Fragaria virginiana (Virginia strawberry), and Potentilla gracilis (slender cinquefoil). Studies were conducted on two Oregon State University farms over two years, and evaluated single and combined effects of drift rates of 0.01 to 0.2 x field application rates (FAR) of 1119 g ha-1 for glyphosate [active ingredient (a.i.) of 830 g ha-1 acid glyphosate] and 560 g ha-1 a.i. for dicamba. Species response endpoints were % cover, number of reproductive structures, mature and immature seed production (dry weight), and vegetative biomass. Herbicide effects differed with species, year, and farm. Among the more notable responses, Eriophyllum lanatum had a significant reduction in total seed production or % immature seed dry weight with as little as 0.01 x FAR of dicamba, glyphosate, or the combination of both herbicides, but a significant reduction in % cover near the end of the growing season only with 0.2 x FAR of both herbicides. Elymus glaucus had a significant reduction in total seed production with 0.1 x FAR of glyphosate alone or in combination with dicamba in one year. The other species showed similar trends but had fewer significant responses.
These studies indicated potential unintended effects of low levels of herbicides on reproduction of native plants, and demonstrated an experimental protocol whereby a plant community can be evaluated for ecological responses.

Insights from the Development of HiveScience

The National Advisory Council for Environmental Policy and Technology (NACEPT) recently assessed EPA’s approach to citizen science. The Council concluded that integration of citizen science into EPA’s structure will accelerate virtually every Agency activity. HiveScience is a new EPA citizen science project for beekeepers. Because HiveScience was the first of its kind, the project blazed a trail by developing new processes to support citizen science efforts within the Agency. Using HiveScience as a model project, this presentation describes the path from idea formation to project launch and highlights challenges associated with navigating the Agency’s policies and procedures. In order to realize the full potential of citizen science efforts, the Agency should address areas of procedural deficiency, thereby streamlining the project development process. This small investment will open the Agency to a modern data collection mechanism that not only enables collection of novel information, but also promotes the formation of a positive relationship with the public.

Assessing Effects of Pesticides on the Bee Immune System

Populations of some managed and wild pollinators are in decline as a result of multiple interacting factors, including parasites, disease, poor nutrition, and pesticides. The role that diminished immunity plays in these declines is not understood. The U.S. Environmental Protection Agency (EPA) is working to identify and implement tests for assessing the impact of pesticides on bees, including sublethal effects on the immune system. These efforts are a response to goals described in the National Strategy for Promoting the Health of Honey Bees [Apis mellifera] and Other Pollinators. Although sublethal effects may be measured in these studies, it is uncertain how these measurement endpoints relate to regulatory risk assessment endpoints and the extent to which honey bees serve as a reasonable surrogate for non-Apis bees. The National Strategy and the 2012 EPA White Paper describing the conceptual framework for assessing risks of pesticides to bees discussed uncertainties related to assessing exposure of, and effects on, solitary and social bees from individual pesticides and combinations of pesticides. This presentation will discuss efforts to examine immune responses in non-Apis bees and how those responses relate to effects observed in honey bees.