Category Archives: Earth Science


This study measured polycyclic aromatic hydrocarbon (PAH) composition in particulate matter emissions from residential cookstoves. A variety of fuel and cookstove combinations were examined, including: (i) liquefied petroleum gas (LPG), (ii) kerosene in a wick stove, (iii) wood (10% and 30% moisture content on a wet basis) in a forced-draft fan stove, and (iv) wood in a natural-draft rocket cookstove. LPG combustion had the highest thermal efficiency (~57%) and the lowest PAH emissions per unit fuel energy, resulting in the lowest PAH emissions per unit of useful energy delivered (MJd). The average benzo[a]pyrene (B[a]P) emission factor for LPG was 0.842 µg/MJd; the emission rate was 0.043 µg/min. The highest PAH emissions came from wood burning in the natural-draft stove (209-700 µg B[a]P/MJd). PAH emissions from kerosene were significantly lower than those from wood burning in the natural-draft cookstove, but higher than those from LPG. In rural regions where LPG and kerosene are unavailable or unaffordable, the forced-draft fan stove may be a viable alternative, because its emission factor (5.17-8.07 µg B[a]P/MJd) and emission rate (0.52-0.57 µg/min) are similar to those of kerosene (5.36 µg B[a]P/MJd and 0.45 µg/min). Compared with wood combustion, LPG stoves emit lower total PAH levels and smaller fractions of high-molecular-weight PAHs. The relatively large variation in PAH emissions from LPG calls for additional tests to identify the major factors influencing emissions; these tests should also account for different LPG fuel compositions and burner types.
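As a back-of-the-envelope illustration of how a fuel-energy emission factor relates to a delivered-energy (MJd) factor through thermal efficiency, consider the sketch below; the fuel-basis value is hypothetical, chosen only so the result matches the LPG figure reported above.

```python
def ef_per_useful_energy(ef_fuel_ug_per_mj, thermal_efficiency):
    """Convert an emission factor on a fuel-energy basis (ug per MJ of fuel)
    to a useful-energy-delivered basis (ug/MJd), since the energy delivered
    to the pot is fuel energy times thermal efficiency."""
    return ef_fuel_ug_per_mj / thermal_efficiency

# Hypothetical fuel-basis value (illustrative, not from the study),
# chosen so the delivered-basis result matches the reported LPG figure
# of 0.842 ug B[a]P/MJd at ~57% thermal efficiency.
ef_fuel = 0.48  # ug B[a]P per MJ of fuel energy
print(round(ef_per_useful_energy(ef_fuel, 0.57), 3))  # 0.842
```

This is why a high-efficiency stove can rank best on a delivered-energy basis even when its fuel-basis advantage is smaller.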


The path for incorporating new approach methods and technologies into quantitative chemical risk assessment poses a diverse set of scientific challenges. These challenges include achieving sufficient coverage of toxicological mechanisms to meaningfully interpret negative test results, developing increasingly relevant test systems, building computational models to integrate experimental data, placing results in a dose and exposure context, characterizing uncertainty, and efficiently validating the test systems and computational models. The presentation will cover progress at the U.S. EPA in systematically addressing each of these challenges and delivering more human-relevant, risk-based assessments. This abstract does not necessarily reflect U.S. EPA policy.


The path for incorporating new alternative methods and technologies into quantitative chemical risk assessment poses a diverse set of scientific challenges. Some of these challenges include development of relevant and predictive test systems, computational models to integrate and extrapolate experimental data, and rapid characterization and acceptance of these systems and models. The series of presentations will highlight a collaborative effort between the U.S. Environmental Protection Agency (EPA) and the Agency for Science, Technology and Research (A*STAR) that is focused on developing and applying experimental and computational models for predicting chemical-induced liver and kidney toxicity, brain angiogenesis, and blood-brain barrier formation. In addressing some of these challenges, the U.S. EPA and A*STAR collaboration will provide a glimpse of what chemical risk assessments could look like in the 21st century.


We screened the ToxCast library in a metabolic biomarker-based in vitro assay (Stemina devTOXqP) utilizing human embryonic stem (hES) cells (H9 line). This assay identifies the concentration of a chemical that disrupts cellular metabolism in a manner indicative of teratogenicity [Palmer et al. 2013]. Undifferentiated H9 cells were exposed for 72 h, and media from the final 24 h period were analyzed by LC-MS to determine the ornithine to cystine ratio (ORN/CYSS). ORN is derived from arginine breakdown during the urea cycle, and CYSS is formed by oxidation of cysteine molecules that covalently link via a disulfide bond; the corresponding ‘teratogen index’ flags a chemical when ORN/CYSS falls below 0.88. To date, the raw and plate-normalized data for 286 samples in concentration series (269 chemicals plus replicates, n=3) and another 812 samples at a single concentration (n=4) have been entered into the ToxCast pipeline for QA, processing, analysis, and eventual release to the public. A preliminary analysis revealed the following trends. First, 166 compounds (15.5% of the tested compounds) were ‘active’ based on the default ORN/CYSS threshold (0.88). These included many known teratogens, such as trans-retinoic acid (LEC = 3 nM), 5-fluorouracil (100 nM), methotrexate (100 nM), thalidomide (300 nM), and carbamazepine (3 uM), among others. Second, for 23 compounds with FDA codes (A, B, C, D, X) or ECVAM classifiers, the default model had a balanced accuracy of 84% (sensitivity 0.80, specificity 0.88). Third, many chemicals not yet classified were predicted positive, including the angiogenesis inhibitors TNP-470 (10 nM) and 5HPP-33 (10 uM). Fourth, at the concentrations tested, specificity was demonstrated for a parent-metabolite pair where only the proximate teratogen was active; for chemical stereoisomer pairs where only one compound was active; and for 3 closely-related structural isomers where only one structure was active.
Cross-referencing with ToxCastDB (in vitro) and ToxRefDB (in vivo) is being undertaken to assess the added value of the devTOXqP assay in computational models for predicting teratogenicity in a human cell-based system. (Disclaimer: this abstract does not reflect EPA policy).
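For readers unfamiliar with the metrics, the classification rule and balanced-accuracy figure above reduce to simple arithmetic; the sketch below is illustrative only, not the Stemina pipeline.

```python
def classify(orn_cyss_ratio, threshold=0.88):
    """Flag a sample as a putative teratogen when the ORN/CYSS biomarker
    ratio falls below the default threshold (0.88, per the abstract)."""
    return orn_cyss_ratio < threshold

def balanced_accuracy(sensitivity, specificity):
    """Balanced accuracy is the mean of sensitivity and specificity,
    which is robust when positive and negative classes are imbalanced."""
    return (sensitivity + specificity) / 2

print(classify(0.52))                      # True (flagged)
print(balanced_accuracy(0.80, 0.88))       # 0.84, matching the reported 84%
```

The reported 84% is exactly the mean of the stated sensitivity (0.80) and specificity (0.88).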


A predictive model for prenatal developmental toxicity using ToxCast Phase I showed the RAR assay set to be the strongest weighting factor (Sipes et al. 2011). Retinoid signaling mediates growth and differentiation of the embryo. ToxCast has 6 reporter assays for trans-activation of retinoic acid receptors (RARa, RARb, RARg, and RXRa, RXRb, RXRg) and cis-activation of DR5 response elements by RAR/RXR, yielding a total of 879 ToxCast AC50s (concentration at 50% effect) mapped to molecular targets in the retinoid system. In total, 97 of 1858 chemicals (5.2%) hit at least one assay in the retinoid system at an AC50 below 2 uM. Several persistent organochlorines activated the RAR system at AC50s below 2 uM; several organotins and tert-butyl compounds activated the RXR system below 0.2 uM and 2 uM, respectively. All-trans retinoic acid was the most potent chemical tested in the RARa (AC50 = 0.43 nM) and RXRa (AC50 = 0.31 nM) trans-activation assays, and was second only to several tributyltin compounds on the DR5 (AC50 = 6.26 nM) assay. A subset of 263 chemicals was tested for effects on mouse embryonic stem cell (mESC) differentiation and cytotoxicity. In all, 54% of the subset showed effects on mESC, including 58% of the DR5 compounds, 63% of hCYP1A1 compounds, and 81% of RAR/RXR compounds. A preliminary analysis with the devTOXquickPredict (Stemina) ESC assay revealed a weak correlation between chemicals active on the RAR/RXR/DR5 system and a targeted biomarker response (ORN/CYSS ratio) that is ~84% predictive for teratogenicity in a human system. In conclusion, in vitro profiling of retinoid signaling identified ~5% of ToxCast chemicals with a potential for disruption of retinoic acid signaling through trans-activation of RAR or RXR at submicromolar concentrations. The potential for RAR and RXR to heterodimerize with different nuclear receptor families suggests these compounds may disrupt multiple signaling pathways. This abstract does not reflect EPA policy.
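The 2 uM AC50 cutoff used above amounts to a simple potency filter; a minimal sketch with hypothetical chemical names and values (not ToxCast data):

```python
# Hypothetical AC50 values in uM; None indicates no measurable activity.
# The 2 uM cutoff mirrors the threshold used in the analysis above.
ac50_um = {"chem_A": 0.15, "chem_B": 1.8, "chem_C": 25.0, "chem_D": None}

def hits_below(ac50s, cutoff_um=2.0):
    """Return chemicals with at least one assay AC50 below the cutoff,
    skipping entries with no measurable activity."""
    return sorted(c for c, v in ac50s.items() if v is not None and v < cutoff_um)

print(hits_below(ac50_um))  # ['chem_A', 'chem_B']
```

Lower AC50 means higher potency, so the filter keeps the most potent actives.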

Solar Energy and Other Appropriate Technologies for Small Potable Water Systems in Puerto Rico

This presentation describes a Region 2 research demonstration project that studied the efficacy of sustainable solar-powered water delivery and monitoring systems in reducing the economic burden of operating and maintaining Non-PRASA drinking water systems, and in reducing the climate impact of fossil fuel use in Puerto Rico.


The PubChem Bioassay database is a non-curated public repository with data from 64 sources, including ChEMBL, BindingDb, DrugBank, EPA Tox21, the NIH Molecular Libraries Screening Program, and various other academic, government, and industrial contributors. Extracting these public data into quality datasets usable for analytical research presents several big-data challenges, for which we have designed manageable solutions. According to our preliminary work, there are approximately 549 million bioactivity values and related metadata within PubChem that can be mapped to over 10,000 biological targets. However, these data are not ready for use in data-driven research, mainly due to a lack of structured annotations. We used a pragmatic approach that provides increasing access to bioactivity values in the PubChem Bioassay database. This included restructuring individual PubChem Bioassay files into a relational database (ScrubChem). ScrubChem contains all primary PubChem Bioassay data that was: reparsed; error-corrected (when applicable); enriched with additional data links from other NCBI databases; and improved by adding key biological and assay annotations derived from logic-based language processing rules. The utility of ScrubChem and the curation process were illustrated using an example bioactivity dataset for the androgen receptor protein. This initial work serves as a trial ground for establishing the technical framework for accessing, integrating, curating, analyzing, and making use of such massive bioactivity data. This abstract does not necessarily reflect U.S. EPA policy.
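As an illustration of the restructuring step, flattening nested assay records into relational rows might look like the sketch below; the record schema and field names are simplified assumptions, not ScrubChem's actual schema.

```python
# Hypothetical, heavily simplified bioassay records; real PubChem Bioassay
# files are far richer. This only sketches the relational restructuring idea.
raw_records = [
    {"aid": 1, "target": "AR", "results": [{"cid": 10, "value": 5.2},
                                           {"cid": 11, "value": None}]},
    {"aid": 2, "target": "ER", "results": [{"cid": 10, "value": 0.9}]},
]

def to_rows(records):
    """Flatten nested assay records into (aid, cid, target, value) rows,
    dropping entries with missing activity values as a trivial stand-in
    for the error-correction step described above."""
    rows = []
    for rec in records:
        for res in rec["results"]:
            if res["value"] is not None:
                rows.append((rec["aid"], res["cid"], rec["target"], res["value"]))
    return rows

print(to_rows(raw_records))  # [(1, 10, 'AR', 5.2), (2, 10, 'ER', 0.9)]
```

Once in row form, the data can be indexed and joined by target or compound, which is what makes target-level datasets (e.g., for the androgen receptor) easy to assemble.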

In a drought, over-irrigated lawns lose 70 billion gallons of water a year

Researchers estimated the flow of water vapor from the soil to the atmosphere in Los Angeles landscapes.


In the summer of 2010, Los Angeles lost about 100 gallons of water per person per day to the atmosphere through evaporation, mostly from overwatering of lawns and trees.

Lawns accounted for 70 percent of the water loss, while trees accounted for 30 percent, according to a study published today.


This is an NSF News item.

Wetland Loss Patterns and Inundation-Productivity Relationships Prognosticate Widespread Salt Marsh Loss for Southern New England

Tidal salt marsh is a key defense against, yet is especially vulnerable to, the effects of accelerated sea level rise. To determine whether salt marshes in southern New England will be stable given increasing inundation over the coming decades, we examined current loss patterns, inundation-productivity feedbacks, and sustaining processes. A multi-decadal analysis of salt marsh aerial extent using historic imagery and maps revealed that salt marsh vegetation loss is both widespread and accelerating, with vegetation loss rates over the past four decades totaling 17.3%. Landward retreat of the marsh edge, widening and headward expansion of tidal channel networks, loss of marsh islands, and the development and enlargement of interior depressions on the marsh platform all contributed to vegetation loss. Inundation due to sea level rise is strongly implicated as a primary driver: vegetation loss rates were significantly negatively correlated with marsh elevation (r2 = 0.96; p = 0.0038), with marshes situated below mean high water (MHW) experiencing greater declines than marshes sitting well above MHW. Growth experiments with Spartina alterniflora, the Atlantic salt marsh ecosystem dominant, across a range of elevations and inundation regimes further established that greater inundation decreases belowground biomass production of S. alterniflora and, thus, negatively impacts organic matter accumulation. These results suggest that southern New England salt marshes are already experiencing deterioration and fragmentation in response to sea level rise and may not be stable as tidal flooding increases in the future.
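The reported elevation-loss relationship (r2 = 0.96) is a simple linear regression; as a sketch with made-up numbers (not the study's data), an ordinary least-squares fit illustrating a negative loss-rate versus elevation trend could look like:

```python
# Hypothetical elevations (m relative to MHW) and vegetation loss rates (%),
# illustrating the negative loss-rate vs. elevation relationship reported above.
elev = [-0.10, -0.05, 0.00, 0.05, 0.10, 0.15]
loss = [9.0, 7.5, 6.2, 4.1, 2.8, 1.5]

def ols(x, y):
    """Ordinary least-squares slope, intercept, and r^2 for simple regression."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    slope = sxy / sxx
    return slope, my - slope * mx, sxy * sxy / (sxx * syy)

slope, intercept, r2 = ols(elev, loss)
print(slope < 0, r2 > 0.9)  # True True
```

A negative slope with high r2 is the pattern the authors report: lower-elevation marshes, flooded more often, lose vegetation faster.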

National Science Foundation presents FY 2018 budget request

NSF's FY 2018 budget request includes investments that will yield the highest return for the Nation.

National Science Foundation (NSF) Director France A. Córdova today publicly presented President Donald J. Trump’s Fiscal Year (FY) 2018 NSF budget request to Congress.

The FY 2018 NSF budget request of $6.65 billion reflects NSF’s commitment to establishing clear priorities in areas of national importance and identifying the most innovative and promising research ideas that will yield the highest return on investment for the nation. It supports fundamental research that will



Brainstorming transformative solutions – Sustainable Puerto Rico in 2080 – a focus on energy and food security

This narrative scenario depicts one of many possible futures for the island of Puerto Rico in which the goals of energy and food supply resilience have been met. Set in the year 2080, the narrative describes a series of hypothetical (but possible) events, a set of proactive governance actions and policies, and citizen responses to those events and interventions. The narrative is based on expert opinion and extrapolation of trends in energy markets, technology, and policy development, as well as recent events in Puerto Rico. It was developed as part of a futures exercise, drawing on the outputs of a recent stakeholder and expert workshop, to inform modeling efforts underway by a coalition of researchers and local stakeholders in an NSF-funded project entitled Urban Resilience to Extreme Events.


In vivo toxicology data are subject to multiple sources of uncertainty: observer severity bias (a pathologist may record only more severe effects and ignore less severe ones); dose spacing issues (these can lead to missing data, e.g. if a severe effect has a less severe precursor but both occur at the same tested dose); imperfect control of key independent variables (in databases, one can rarely control key input variables such as animal strain or dosing schedules); effect description heterogeneity (terminology changes over time, which can lead to information loss); and statistical issues (too few chemicals with a given phenotype, or too few animals in dose groups). These issues directly contribute to uncertainties in models built from the data. We are investigating the use of collections of endpoints (toxicity syndromes) to address these issues. These are identical in concept to medical syndromes, which allow a physician to diagnose an underlying disease more accurately than can be done by examining one symptom at a time. Our test case is anemia, for several reasons: most of the phenotypes (e.g. cell counts) are quantitative; related effects are measured in an automated way; anemia is relatively common, at least at high doses (~30% of chemicals in our database show significant drops in red cell count); the causes of anemia are well understood; and there is a standard clinical decision tree to classify anemia. Using a database of 658 chemicals, we have made several preliminary observations. First, only a subset of chemicals show clear syndrome patterns that map to the standard clinical decision tree, which may reflect remaining noise in the data or an indication that chemicals can lead to anemia through more than one route. Second, anemia often occurs in only a subset of the species tested (e.g. rat, mouse, dog); this may be a reflection of the doses used, or a true cross-species effect.
Finally, the majority of anemia occurs at relatively high doses, where significant weight loss and even mortality are seen. Overall, though, the syndrome methodology appears to be a promising approach to classifying the toxicity of chemicals. This abstract does not necessarily reflect U.S. EPA policy.
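The standard clinical decision tree mentioned above typically branches first on red-cell indices; the toy sketch below shows that first branch with illustrative, non-species-specific thresholds, not the authors' implementation.

```python
def classify_anemia(rbc_low, mcv_fl):
    """A toy version of the first branch of the clinical anemia decision
    tree: confirm anemia (low red cell count), then split by mean
    corpuscular volume (MCV, fL). Reference ranges (~80-100 fL) are
    illustrative human values, not species-specific cutoffs."""
    if not rbc_low:
        return "no anemia"
    if mcv_fl < 80:
        return "microcytic"   # e.g. iron deficiency pattern
    if mcv_fl <= 100:
        return "normocytic"   # e.g. hemolysis or marrow suppression pattern
    return "macrocytic"       # e.g. B12/folate pathway pattern

print(classify_anemia(True, 72))  # microcytic
```

Mapping each chemical's endpoint pattern onto branches like these is what lets a syndrome classify toxicity more robustly than any single endpoint.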