Dynamic interactions of atmospheric and hydrological processes result in large spatiotemporal changes in precipitation and wind speed during coastal storm events under both current and future climates. This variability can impact the design and sustainability of water infrastructure and other environmental assets. We examined the nature of this challenge and explored the feasibility of determining a future design basis for U.S. eastern coastal sites. Long-duration historical precipitation datasets, along with wind records, at 14 local USHCN stations were analyzed. Design storm values were determined using sliding time windows of 30-yr and 50-yr duration. The results are compared to the Atlas-14 precipitation design curves and to design storms derived from the downscaled CMIP5 daily precipitation datasets (1950-2000). We show that the Atlas-14 design charts, based mostly on pre-1990 data, sufficiently describe the design storms of the past but do not capture the high-intensity precipitation of recent years. The 132 CMIP5 climate models and their ensembles also underestimate the design storms. A similar bias is found in design wind speeds from AOGCM outputs. Based on these results, a post-processing bias-correction procedure is proposed for developing design storm values from CMIP5 and AOGCM projections.
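The sliding-window design storm analysis described above can be sketched as follows. This is an illustrative outline only, not the study's code: it assumes a Gumbel (extreme-value type I) fit by the method of moments on an annual-maxima series, and all function and variable names are invented for the example.

```python
import math

def gumbel_design_value(annual_maxima, return_period_yr):
    """Method-of-moments Gumbel fit; returns the T-year design value."""
    n = len(annual_maxima)
    mean = sum(annual_maxima) / n
    var = sum((x - mean) ** 2 for x in annual_maxima) / (n - 1)
    beta = math.sqrt(6 * var) / math.pi   # Gumbel scale parameter
    mu = mean - 0.5772 * beta             # location (Euler-Mascheroni constant)
    # T-year quantile: x_T = mu - beta * ln(-ln(1 - 1/T))
    return mu - beta * math.log(-math.log(1.0 - 1.0 / return_period_yr))

def sliding_window_design(annual_maxima, window_yr, return_period_yr):
    """Design value for each window_yr-long slice of the annual-maxima record."""
    return [gumbel_design_value(annual_maxima[i:i + window_yr], return_period_yr)
            for i in range(len(annual_maxima) - window_yr + 1)]
```

Plotting the sliding-window estimates against the fixed Atlas-14 value is one simple way to expose the kind of nonstationarity the abstract reports.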
In order to protect human health from chemicals that can mimic natural hormones, the U.S. Congress mandated the U.S. EPA to screen chemicals for their potential to be endocrine disruptors through the Endocrine Disruptor Screening Program (EDSP). However, the number of chemicals to which humans are exposed is too large (tens of thousands) to be accommodated by the EDSP Tier 1 battery, so combinations of in vitro high-throughput screening (HTS) assays and computational models are being developed to help prioritize chemicals for more detailed testing. Previously, CERAPP (Collaborative Estrogen Receptor Activity Prediction Project) demonstrated the effectiveness of combining many QSAR models trained on HTS data to prioritize a large chemical list for estrogen receptor activity. The limitations of single models were overcome by combining all models built by the consortium into consensus predictions. CoMPARA is a larger-scale collaboration between 35 international groups, following the approach of CERAPP to model androgen receptor activity using a common training set of 1746 compounds provided by the U.S. EPA. Eleven HTS ToxCast/Tox21 in vitro assays were integrated into a computational network model to detect true AR activity. Bootstrap uncertainty quantification was used to remove potential false positives/negatives. Reference chemicals (158) from the literature were used to validate the model, which showed 95.2% and 97.5% balanced accuracies for AR agonists and antagonists, respectively. A library of ~80k bioactivities, representing ~11k chemicals curated from PubChem literature data using ScrubChem tools, was integrated with CoMPARA’s consensus predictions, which combined several structure-based and QSAR modeling approaches. The results of this project will be used to prioritize a large set of more than 50k chemicals for further testing over the next phases of ToxCast/Tox21, among other projects. This work does not reflect the official policy of any federal agency.
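The two scoring ideas in the abstract above, consensus predictions across many models and balanced accuracy against reference chemicals, can be sketched minimally. This is not the CoMPARA weighting scheme (which is more elaborate); it is a simple majority-vote illustration with hypothetical function names.

```python
def consensus_predict(model_preds):
    """Majority vote across individual model calls (0 = inactive, 1 = active)."""
    return int(sum(model_preds) * 2 >= len(model_preds))

def balanced_accuracy(y_true, y_pred):
    """Mean of sensitivity and specificity, the metric used for agonist/antagonist validation."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    pos = sum(y_true)
    neg = len(y_true) - pos
    sens = tp / pos if pos else 0.0
    spec = tn / neg if neg else 0.0
    return (sens + spec) / 2
```

Balanced accuracy is the natural choice here because active chemicals are a small minority of the screened set, so plain accuracy would reward a model that calls everything inactive.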
Peatlands store 30% of global soil carbon. Many of these peatlands are located in boreal regions, which are expected to have the highest temperature increases in response to climate change. As climate warms, peat decomposition may accelerate and release greenhouse gases. The Spruce and Peatland Responses Under Changing Environments (SPRUCE) project initiated soil warming in 2014 in ten peatland mesocosms (five temperature treatments from ambient (+0°C) to +9°C) and elevated CO2 in half of the mesocosms in 2016. Peat cores at three depths (acrotelm, catotelm, deep peat) were analyzed in the laboratory for denitrification, nitrification, and ammonification. Denitrification increased with the addition of nitrogen but had a weak response to initial temperature increases. The strongest temperature response was in the deep peat. Mineralization rates responded weakly to temperature increases. Nitrification increased only in the catotelm, while ammonification decreased in deep peat. Denitrification and nitrification rates were correlated, decreasing with depth. Ammonification was inversely correlated with nitrification, increasing with depth. As soil temperatures rise, water levels fall, and decomposition increases, we expect nitrification-denitrification to increase, mobilizing nitrate and increasing N2O releases.
The needs associated with the deteriorating water infrastructure are immense and have been estimated at more than $1 trillion over the next 20 years for water and wastewater utilities. To meet this growing need, utilities must adopt innovative technologies and procedures for managing their systems. In response, the U.S. Environmental Protection Agency (EPA) developed an innovative technology demonstration program. The purpose of the demonstration program is to evaluate rehabilitation technologies that have the potential to reduce costs and increase the effectiveness of the operation, maintenance, and renewal of aging and deteriorating water distribution and wastewater conveyance systems. This paper provides an impartial assessment of the effectiveness and cost of four innovative technologies for water distribution and wastewater collection pipes. The technologies demonstrated include spray-on polymeric lining and cured-in-place pipe (CIPP) lining for water mains, and spray-applied geopolymer mortar and an internal pipe sealing system for wastewater mains.
The Storm Water Management Model (SWMM) is a widely used tool for urban drainage design and planning. Hundreds of peer-reviewed articles and conference proceedings have been written describing applications of SWMM. This review focused on collecting information on model performance with respect to calibration and validation in the peer-reviewed literature. The major developmental history and applications of the model are also presented. The results provide a quick reference for others to gauge the integrity of their own unique SWMM applications. A gap analysis assessed the model’s ability to perform water quality simulations considering green infrastructure (GI)/Low Impact Development (LID) designs and effectiveness. We conclude that the level of detail underlying the conceptual model of SWMM versus its overall computational parsimony is well balanced – making it an adequate model for large- and medium-scale hydrologic applications. However, embedding a new mechanistic algorithm or providing user guidance for coupling with other models will be necessary in order to realistically simulate diffuse pollutant sources, their fate and transport, and the effectiveness of GI/LID implementation scenarios.
We experimentally assessed kinetics and thermodynamics of electron transfer (ET) from the donor substrate (acetate) to the anode for a mixed-culture biofilm anode. We interpreted the results with a modified biofilm-conduction model consisting of three ET steps: (1) intracellular ET, (2) electron-hopping extracellular ET (EET), and (3) conductive EET governed by Ohm’s law. The steady-state current density was 0.82 ± 0.03 A/m2 in a miniature microbial electrolysis cell operated at fixed anode potential of -0.15 V versus standard hydrogen electrode. Illumina 16S-rDNA and -rRNA sequences showed that the Geobacter genus was less than 30% of the community of the biofilm anode. Although Monod kinetics for utilization of acetate were relatively slow, biofilm conductivity was high at 2.44 ± 0.42 mS/cm, indicating that the maximum current density could be as high as 268 A/m2 if only the conductive EET was limiting. Due to high biofilm conductivity, the maximum energy loss for the EET was negligible at 340 µV; the energy loss in the second ET step was only 20 mV, placing at least 87% of the energy loss at the intracellular step. The potential for a rate-limiting extracellular cofactor (EC) involved in the second ET was -0.15 V, which means that >99% of the EC was in the oxidized state, which reinforces that the intracellular ET was the main kinetic and thermodynamic bottleneck to electron transfer from donor substrate to the anode for a highly conductive biofilm.
Research is being conducted to develop fluorescent sensors to detect nitrification in drinking water distribution systems.
The Narragansett Bay Estuary Program developed 24 environmental indicators for its 2017 State of the Bay and its Watershed report with the collaboration of over 50 bi-state and regional partners. A geographical approach was undertaken at different scales using an array of geospatial methodologies to identify, synthesize, and analyze indicators and to report on their status and trends. This poster depicts a map of the watershed and bay, partitioned to showcase some of the most relevant results for a selection of indicators in each area. Mapping watershed stressors and conditions in this way highlights drivers of change as well as localized variation throughout the landscape and bay.
To promote and strengthen the resiliency of coastal watersheds in the face of climate change and development, ecological outcomes as well as economic, social, and environmental justice issues need to be considered. An integrated assessment framework is being developed to help watershed managers, coastal communities, and other stakeholders strengthen coastal resiliency by identifying and prioritizing conservation and restoration efforts within coastal watersheds.
This framework is linked to a desktop and web-based decision support system (DSS) incorporating ecological integrity principles with ecosystem services (ES). The DSS operates within a geospatial platform, allowing for spatially explicit analysis of individual ecological units and their associated ESs at multiple scales, and provides web-based and mobile applications (tablets and smartphones) developed for a range of users, from technical users/stakeholders to the general public. The DSS allows for the evaluation of both ecological integrity and ESs of key functional processes, components, and elements of watershed integrity relative to the location within the watershed (e.g., headwater streams, floodplains, riparian condition, coastal wetlands). This coastal watershed resiliency DSS can be used to make decisions for: 1) prioritizing protection and restoration of upland and riparian habitat for water quality and mitigating non-point source stressors; 2) reducing flooding risks by identifying opportunities to restore floodplains and riparian zones, increasing aquatic connectivity for habitat and flood resilience; 3) planning for sea level rise adaptation, marsh migration, and marsh hydrology restoration; 4) optimizing green infrastructure to reduce nutrients and non-point source pollutants; and 5) identifying the best locations for optimizing economic development and multimodal transportation.
The Advanced Light-Duty Powertrain and Hybrid Analysis (ALPHA) tool is an analysis tool to estimate greenhouse gas emissions from light-duty vehicle sources.
Low-density polyethylene (LDPE) sheets are often used as passive samplers for aquatic environmental monitoring to measure the freely dissolved concentrations of hydrophobic organic contaminants (HOCs). HOCs that are freely dissolved in water (Cfree) will partition into the LDPE until a thermodynamic equilibrium is achieved; that is, the HOC’s chemical potential in the passive sampler is the same as its potential in the surrounding environment. However, achieving equilibrium for high molecular weight HOCs can take several months or even years. One way to evaluate the equilibrium status or estimate the uptake kinetics is by using performance reference compounds (PRCs). PRCs are often isotopically labeled versions of target compounds and are partitioned into the LDPE prior to deployment. Based on the fraction of each PRC lost during deployment, a sampling rate (Rs) or a fractional equilibrium (feq) can be determined for target HOCs, under the assumption that PRC desorption from the passive sampler occurs at the same rate as uptake of the unlabeled target HOCs. In this study, LDPE passive samplers were pre-loaded with six 13C-labeled PCBs as PRCs and deployed in New Bedford Harbor, MA, USA. Triplicate samplers were collected after 30-, 56-, 99-, and 129-day deployments. PRC-corrected Cfree concentrations were estimated for 27 target PCBs (log KOW ranging from 5.07 to 8.09) at each time point. Results allowed for calculation of desorption rates of PRCs as well as uptake rates for target HOCs and confirmed that exchange kinetics are indeed isotropic for isomers. Results were fit to a traditional first-order kinetic model, a sampling rate model, and a diffusion model to assess how well each predicted equilibrium Cfree. Samplers at equilibrium showed agreement within 20%. However, for PCBs with slower kinetics, as the fractional equilibrium achieved decreased in magnitude, the Cfree agreement between models and other time points also decreased.
In general, results from the 30-day deployment yielded the highest Cfree estimates for PCBs with a log KOW greater than 6.5, or when a feq of 15% or less was achieved over the course of the deployment. These results provide a field-based evaluation of the usefulness of PRCs but also suggest caution when correcting passive sampling data by a factor of 10 or more.
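The PRC correction underlying the abstract above can be sketched with a minimal first-order model. This is an illustration of the standard approach under the isotropic-exchange assumption stated in the abstract, not the study's code; function names and the example numbers are hypothetical.

```python
import math

def elimination_rate(prc_fraction_retained, deploy_days):
    """First-order PRC loss: fraction_retained = exp(-ke * t), so ke = -ln(f)/t."""
    return -math.log(prc_fraction_retained) / deploy_days

def fractional_equilibrium(ke, deploy_days):
    """Isotropic exchange: target uptake mirrors PRC loss, feq = 1 - exp(-ke * t)."""
    return 1.0 - math.exp(-ke * deploy_days)

def prc_corrected_cfree(c_pe, k_pe_water, feq):
    """Correct the measured polymer concentration (ng/g) to an equilibrium
    freely dissolved estimate: Cfree = C_PE / (K_PE-water * feq)."""
    return c_pe / (k_pe_water * feq)
```

Note how the correction factor 1/feq grows rapidly as feq shrinks; a feq of 15% or less, as flagged in the abstract, means multiplying the measured value by roughly 7 or more, which is where the cautionary note about factor-of-10 corrections comes from.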
Increased consumption and improper disposal of prescription medication, such as beta (β)-blockers, contribute to their introduction into waterways and pose threats to non-target aquatic organisms. Beta-blockers are widely prescribed for medical treatment of hypertension and arrhythmias. They prevent binding of agonists, such as catecholamines, to β-adrenoceptors. In the absence of agonist-induced receptor activation, adenylate cyclase activation and increases in blood pressure are limited. With their widespread use, there has been rising concern about the impacts of β-blockers on coastal ecosystems, especially because wastewater treatment plants are not designed to eliminate these drugs from the discharge. Few studies have characterized the sublethal effects of β-blocker exposures in marine invertebrates. The aim of our research is to evaluate cellular biomarker responses of two commercially important filter-feeding marine bivalves, Eastern oysters (Crassostrea virginica) and hard clams (Mercenaria mercenaria), upon exposure to two β-blocker drugs, propranolol and metoprolol. Bivalves were obtained from Narragansett Bay (Rhode Island, USA) and acclimated in the laboratory. Following acclimation, gill and digestive gland tissues were harvested and separately exposed to concentrations ranging from 0 to 1000 ng/L of each drug for 24 hours. Tissues were bathed in 30 parts per thousand filtered seawater, antibiotic mix, nutrient media, and the test drug. Tissue samples were analyzed for biomarker assays including tissue damage (lysosomal membrane destabilization and lipid peroxidation), total antioxidant capacity, and activity of glutathione-S-transferase (GST) – a detoxification enzyme. Elevated tissue damage and changes in GST activities were noted in the exposed tissues at environmentally relevant concentrations. Digestive gland tissues were more responsive to the exposures than gill tissues.
Differences in species sensitivities and responses to the exposures were also observed. These studies enhance our understanding of the potential impacts of prescription medication on coastal organisms, and demonstrate that filter feeders such as marine bivalves may serve as good model organisms to examine the effects of water soluble drugs. Evaluation of a suite of biomarkers allows us to better define molecular initiating events and subsequent key events that might be used to develop adverse outcome pathways (AOPs) for unintended environmental exposures to β-blockers.
This review examines the use of life-cycle assessments (LCAs) to compare different lightweight materials being developed to improve light-duty vehicle fuel economy. Vehicle manufacturers are designing passenger cars and light-duty trucks using lighter-weight materials and design optimization techniques to meet EPA’s greenhouse gas (GHG) emissions standards and the National Highway Traffic Safety Administration’s (NHTSA’s) Corporate Average Fuel Economy (CAFE) standards in the United States. Vehicle mass reduction, along with other efficiency improvements, has the potential to reduce tailpipe or use-phase emissions significantly. While tailpipe emissions are by far the primary contributor to life-cycle energy use and GHG impacts for most vehicles, the production and end-of-life (EOL) phases of lightweight materials can differ significantly from those of the conventional materials used in automobile manufacturing. Twenty-six LCAs conducted since 2010 were reviewed and compared. Aluminum was identified as the material most often specified by the LCAs as providing the highest benefits in terms of life-cycle energy and GHG reductions. Advanced high-strength steels were also effective lightweighting materials for reducing energy and GHG impacts. Magnesium and carbon fiber-reinforced polymers were the lightest material options but incurred much higher production GHG impacts and were modeled with low rates of recycling. Major factors from the LCAs are highlighted and discussed, including secondary mass reduction, recycling allocation, powertrain size reductions and fuel reduction values, material substitution rates, lifetime vehicle travel distance, and production location and grid mix. With fleet-wide adoption, there may be major shifts in material flows, which must be accounted for in LCAs intended to guide material selection for lightweight vehicles, in addition to other consequential factors including grid mixes and recycling technologies.
As the existing standards guide the reduction of use-phase impacts over time, LCA can provide insights regarding the important role of material production and EOL in determining the overall impacts from the automotive sector.
Chitosan derived porous layered nitrogen-enriched carbonaceous CNx catalyst (PLCNx) has been synthesized from marine waste and its use demonstrated in a metal-free heterogeneous selective oxidation of 5-hydroxymethyl-furfural (HMF) to 2,5-furandicarboxylic acid (FDCA) using aerial oxygen under mild reaction conditions.
Natural gas combined-cycle (NGCC) turbines with carbon capture and storage (CCS) are a promising technology for reducing carbon dioxide (CO2) emissions in the electric sector. However, the high cost and efficiency penalties associated with CCS, as well as methane leakage from natural gas extraction and distribution, may limit the role of NGCC-CCS in achieving stringent greenhouse gas (GHG) reduction goals. The MARKet ALlocation (MARKAL) energy system optimization model and the EPA U.S. nine-region database are used to identify optimal U.S. market penetrations of NGCC-CCS through 2055 in response to alternative GHG reduction trajectories and methane leakage rates. The results indicate that NGCC-CCS is better suited for widespread deployment under a moderate 30% GHG reduction trajectory than under a more stringent 50% trajectory because of upstream methane leakage from gas extraction and the assumption that 15% of CO2 in exhaust gases with CCS remains uncaptured. Parametric sensitivity runs were conducted for the 50% GHG reduction trajectory, evaluating various fuel, CO2 capture, and technology parameters. Of the parameters examined, methane leakage rate, NGCC-CCS efficiency and CO2 capture rate, and natural gas price are found to be the strongest factors influencing NGCC-CCS deployment, in that order. Across all 46 sensitivity runs, NGCC serves as a mid-term solution to a low-CO2 future and is retrofitted with CCS in the long term as the modeled GHG trajectories become more stringent. This is an important result, as it indicates NGCC may play both a short- and long-term mitigation role, and that CCS retrofitability of NGCC plants and siting near CO2 storage sites are key considerations. The modeling results suggest that CCS may play a larger role in some regions of the U.S. than others.
The West South Central Census Division, followed by the East North Central Census Division, have the highest electricity generation from NGCC-CCS, while New England has no NGCC-CCS adoption. NGCC-CCS market penetration is shown to have a mixed impact on air pollutant emissions and energy-related water consumption, depending on the region and which technologies are displaced.
Understanding the sorption mechanisms for organophosphate flame retardants (OPFRs) on impervious surfaces is important if we are to improve our understanding of the fate and transport of OPFRs in indoor environments. Traditional Langmuir and Freundlich models are widely adopted to describe the sorption behavior of indoor semivolatile organic compounds (SVOCs). In a real indoor environment, it is possible that the sorption process of SVOCs on surfaces is more heterogeneous (multilayer adsorption) than homogeneous (monolayer adsorption). Therefore, interpreting the sorption mechanisms of OPFRs on surfaces using Langmuir’s equation may not be accurate. In this study, we adopt both Langmuir and Freundlich isotherms to characterize the adsorption/desorption dynamics of OPFRs on a stainless steel surface and compare the two models through a series of empty chamber studies. The chamber tests involve two types of stainless steel chambers (53-L small chambers and 44-mL micro chambers) using tris(2-chloroethyl) phosphate (TCEP) and tris(1-chloro-2-propyl) phosphate (TCPP) as target compounds. Test results show that the Freundlich model better represents the adsorption/desorption process in the empty small chamber. Micro chamber test results show that both the Langmuir and Freundlich models fit the measured gas-phase concentrations of OPFRs well. We further apply the Freundlich model and the parameters obtained from the empty small chamber tests to predict the gas-phase concentrations of OPFRs in a small chamber with an emission source. Comparisons between model predictions and measurement results demonstrate the reliability and applicability of the obtained sorption parameters.
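The model comparison described above can be illustrated with a minimal fitting sketch. This is not the study's procedure; it uses the classical linearized forms of the two isotherms (log-log regression for Freundlich, the C/q-versus-C form for Langmuir), and all names and data are assumptions for the example.

```python
import math

def linfit(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

def fit_freundlich(c, q):
    """Freundlich: q = Kf * C^(1/n); linear in log space:
    log q = log Kf + (1/n) log C. Returns (Kf, n)."""
    slope, intercept = linfit([math.log(x) for x in c],
                              [math.log(y) for y in q])
    return math.exp(intercept), 1.0 / slope

def fit_langmuir(c, q):
    """Langmuir: q = qmax*K*C / (1 + K*C); linearized as
    C/q = 1/(qmax*K) + C/qmax. Returns (qmax, K)."""
    slope, intercept = linfit(c, [ci / qi for ci, qi in zip(c, q)])
    qmax = 1.0 / slope
    return qmax, 1.0 / (qmax * intercept)
```

Fitting both forms to the same surface/gas-phase data and comparing residuals is the basic logic behind the abstract's conclusion that the Freundlich (heterogeneous, multilayer) model represents the small-chamber data better.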
Accompanying editorial to the paper from Harvard by Rice et al. entitled “Long-Term Exposure to Traffic Emissions and Fine Particulate Matter and Lung Function Decline in the Framingham Heart Study.”

By almost any measure, the Clean Air Act and its amendments must be considered one of the most significant and arguably most successful pieces of environmental legislation in modern times (1). Air quality has improved significantly since its passage and continues to do so. The levels of fine particulate matter (PM2.5) and the larger coarse particles (PM10) have both declined by a third nationally from 2000 to 2013 (http://www.epa.gov/airtrends/pm.html). Because these pollutants have been implicated in respiratory and cardiac diseases, this decline is thought to have resulted in significant health benefits, with more than one study associating reduced air pollution with increased life expectancy (3, 4). Despite these improvements, more than 46 million people still live in an area where the annual average level of particle pollution is considered unhealthful (5). The study shows that the lung function of middle-aged men and women may also be reduced by long-term exposure to air pollution or traffic. Using the Framingham Offspring and Third Generation cohorts based in Massachusetts, the authors show that individuals residing near a major road or in areas with higher PM2.5 levels have lower average FEV1 and FVC. In addition, the natural decline in lung function with age also appeared to be accelerated in those people. No association between long-term air pollution and the FEV1/FVC ratio was apparent, and so the authors conclude that the effects are not associated with airflow obstruction. The findings on lung decline complement previous longitudinal studies of children living in Southern California, where PM2.5 reduced lung function growth between the ages of 10 and 18 years, the period when rapid lung development normally occurs (6).
Similar associations between air pollution and deficits in lung function growth have now been seen in schoolchildren in Mexico City (7), China (8), and Europe (9).
The U.S. EPA Office of Research and Development (ORD) has been exploring approaches for estimating U.S. anthropogenic air pollutant emissions through the mid-21st century. As a result, we have developed the Emission Scenario Projection methodology, or ESP. In this document, we provide an overview of ESP development and capabilities, then highlight possible future directions. Next, we describe a typical life cycle analysis (LCA) and discuss briefly some limitations of standard LCA approaches. Finally, linkage of ESP and LCA is proposed as a way to address these limitations. Three possible linkages are listed: (i) driving the underlying assumptions in the LCA with scenarios developed via ESP; (ii) gaining insights into the spatial allocation of LCA results using the future-year inventory projections developed with ESP; and (iii) integrating LC factors directly into the ESP energy system representation to capture LC impacts of various scenarios. This document and the related presentation are intended to stimulate discussion between the emission projection and LCA communities.
Review of vegetative barrier research and recommendations
The challenge of assessing the potential developmental health risks of the tens of thousands of environmental chemicals is beyond the capacity of resource-intensive animal protocols. Large data streams coming from high-throughput screening (HTS) and high-content screening (HCS) profiling of biological activities, coupled with machine-learning algorithms, literature-mining tools, and systems modeling, represent a newer paradigm for toxicity testing in the 21st century. Newer resources are available to measure molecular components of cellular and tissue-level phenomena in great depth and detail. With HTS/HCS data now in hand (ToxCast/Tox21), ‘evolution’ implies the advancement of best practices and computational approaches to assemble the individual pieces into an integrative model that: scales to the human exposure universe; incorporates extant knowledge of human embryology; and deals probabilistically with spatiotemporal dynamics in a morphogenetic series of events. With the advent of computational approaches and computer models fit for that purpose, ‘revolution’ implies their continued refinement and cohesion to satisfy the fundamental principles of teratology: chemical structure, dosimetry, initiating mechanism(s), genetic susceptibility, stage specificity, and bioavailability. This presentation will provide examples of how in vitro data and in silico models can be integrated with biological knowledge to simulate how embryos might react to diverse exposure scenarios. [This abstract does not reflect US EPA policy].