Category Archives: Earth Science

NSF awards $36.6 million in new food-energy-water system grants

Farmland has a role in meeting increasing food and bioenergy demands in sustainable ways.

The number of humans alive on our planet today is some 7.5 billion. By 2087, projections show, 11 billion people will be living on Earth.

How will we continue to have a sustainable supply of food, energy and water, and protect the ecosystems that provide essential “services” for humans?

To help answer these questions, the National Science Foundation (NSF) has partnered with the U.S. Department of Agriculture’s National Institute for Food and Agriculture (NIFA) to award

More at https://www.nsf.gov/news/news_summ.jsp?cntn_id=242998&WT.mc_id=USNSF_51&WT.mc_ev=click


This is an NSF News item.

Relationship Between Total and Bioaccessible Lead on Children’s Blood Lead Levels in Urban Residential Philadelphia Soils

Relationships between total soil or bioaccessible lead (Pb), measured using an in vitro bioaccessibility assay, and children’s blood lead levels (BLL) were investigated in an urban neighborhood in Philadelphia, Pennsylvania, USA, with a history of soil Pb contamination. Soil samples from 38 homes were analyzed to determine whether accounting for the bioaccessible Pb fraction improves statistical relationships with children’s BLLs. Total soil Pb ranged from 58 to 2,821 mg/kg; the bioaccessible Pb fraction ranged from 47 to 2,567 mg/kg. Children’s BLLs ranged from 0.3 to 9.8 μg/dL. Hierarchical models were used to compare relationships between total or bioaccessible Pb in soil and children’s BLLs. Total soil Pb as the predictor accounted for 25% of the variability in child BLL; bioaccessible soil Pb as the predictor accounted for 28% of BLL variability. A bootstrapping analysis confirmed a significant increase in R2 for the model using bioaccessible soil Pb as the predictor with 99.3% of bootstraps showing a positive increase. Estimated increases of 1.4 μg/dL and 1.6 μg/dL in BLL per 1,000 mg/kg Pb in soil were observed for this study area using total and bioaccessible Pb, respectively. Children’s age did not contribute significantly to the prediction of BLLs.
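The bootstrap comparison of model fit described in this abstract can be illustrated with a short sketch. This is a minimal, hypothetical example, not the authors' code: simple linear regression stands in for the study's hierarchical models, and the arrays below are synthetic placeholders, not the study data.

```python
# Minimal sketch of a bootstrap R^2 comparison between two soil Pb predictors.
# Assumptions: ordinary least squares instead of hierarchical models;
# soil_pb_total, soil_pb_bioacc, and bll are hypothetical paired observations.
import numpy as np

rng = np.random.default_rng(0)

def r_squared(x, y):
    """R^2 of a simple least-squares fit of y on x."""
    slope, intercept = np.polyfit(x, y, 1)
    resid = y - (slope * x + intercept)
    return 1.0 - resid.var() / y.var()

def bootstrap_r2_gain(x_total, x_bioacc, y, n_boot=10_000):
    """Fraction of bootstrap resamples in which bioaccessible Pb
    explains more BLL variance than total Pb, plus the mean gain."""
    n = len(y)
    gains = np.empty(n_boot)
    for b in range(n_boot):
        idx = rng.integers(0, n, n)  # resample homes with replacement
        gains[b] = (r_squared(x_bioacc[idx], y[idx])
                    - r_squared(x_total[idx], y[idx]))
    return (gains > 0).mean(), gains.mean()

# Synthetic example sized like the study design (38 homes):
n = 38
soil_pb_total = rng.uniform(58, 2821, n)
soil_pb_bioacc = soil_pb_total * rng.uniform(0.6, 0.95, n)
bll = 1.0 + 1.5e-3 * soil_pb_bioacc + rng.normal(0, 1.5, n)

frac_positive, mean_gain = bootstrap_r2_gain(soil_pb_total, soil_pb_bioacc, bll)
print(f"bootstraps with positive R^2 gain: {frac_positive:.1%}, mean gain: {mean_gain:.3f}")
```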

NSF issues new EPSCoR awards, investing in science and engineering across nation

Naomi Ward, University of Wyoming associate professor of molecular biology, collects soil samples for microbiome analysis at the LaPrele Creek mammoth kill site near Douglas, Wyoming.

The National Science Foundation (NSF) has awarded five jurisdictions nearly $20 million each through the Established Program to Stimulate Competitive Research (EPSCoR), which builds research and development capacity in states that demonstrate a commitment to research but have thus far lacked the levels of investment seen in other parts of the country.

The new EPSCoR Research Infrastructure Improvement (RII) Track-1 awards will bolster science and engineering academic research

More at https://www.nsf.gov/news/news_summ.jsp?cntn_id=243159&WT.mc_id=USNSF_51&WT.mc_ev=click


This is an NSF News item.

Human Health Risk Assessment: A case study application of principles in dose response assessment

This case study application workshop will build on the fundamental concepts and techniques in risk assessment presented and archived at previous TRAC meeting workshops. Practical examples from publicly available, peer-reviewed risk assessments will be used as teaching aids. Course modules will be organized according to the key components of the risk assessment process: hazard characterization, dose-response modeling (including Benchmark Dose methodology), dosimetric adjustment, point of departure selection, uncertainty analysis, and derivation of risk values such as the reference dose and cancer slope factor. Participants will have a unique opportunity to learn and apply conventional methodologies, detailed considerations, and emerging approaches in support of human health risk assessment.
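As a rough illustration of the reference dose derivation covered in these modules, the sketch below divides a point of departure by a composite uncertainty factor. All numeric values are placeholders for illustration only, not values from any particular assessment, and the simple product of uncertainty factors is a simplification of agency practice.

```python
# Minimal sketch of reference dose (RfD) derivation from a point of departure (POD).
# Placeholder values only; not drawn from any specific risk assessment.

pod_bmdl = 10.0  # hypothetical BMDL from benchmark dose modeling, mg/kg-day

uncertainty_factors = {
    "interspecies (animal-to-human)": 10,
    "intraspecies (human variability)": 10,
    "subchronic-to-chronic extrapolation": 3,
}

total_uf = 1
for uf in uncertainty_factors.values():
    total_uf *= uf  # composite uncertainty factor

rfd = pod_bmdl / total_uf
print(f"Total UF = {total_uf}, RfD = {rfd:.4f} mg/kg-day")
```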

Is biochar-manure co-compost a better solution for soil health improvement and N2O emissions mitigation?

Land application of compost is a promising strategy for improving soil health and environmental quality, but substantial emissions of greenhouse gases, especially N2O, need to be controlled during compost production and use. Using biochar as a bulking agent for composting has been proposed as a novel way to address this issue because of biochar's large surface area and porosity, and thus its high ion exchange and adsorption capacity. Here, we compared the impacts of biochar-manure co-compost (BM) and manure compost (M) on soil biological properties and processes in a microcosm experiment. Our results showed that BM and M addition significantly enhanced soil total C and N, inorganic and organic N, microbial biomass C and N, cellulase enzyme activity, the abundance of N2O-producing bacteria and fungi, and gas emissions of N2O and CO2. However, compared to the M treatment, BM significantly reduced soil CO2 and N2O emissions by 35% and 27%, respectively, over the experimental period. The 15N-N2O site preference was ~17‰ for M and ~27‰ for BM, suggesting that BM suppressed N2O production from bacterial denitrification and nitrifier denitrification. Soil glucosaminidase activity and nirK gene abundance were lower in BM than in M treatments, whereas soil peroxidase activity and the abundance of ammonia-oxidizing archaea were greater in BM than in M treatments. Our data demonstrate that biochar-manure co-compost can substantially reduce soil N2O emissions relative to manure compost through its controls on soil organic C stabilization and the activities of microbial functional groups, especially bacterial denitrifiers.

Changes in non-extreme precipitation may have not-so-subtle consequences

Low flow in a stream. Non-extreme rainfall is essential for maintaining ecosystem functions.

Find related stories on NSF’s Critical Zone Observatories site.

Extreme floods and droughts receive a lot of attention. But what happens when precipitation — or lack thereof — occurs in a more measured way?

Researchers analyzed more than five decades of data from across North America and found that changes in non-extreme precipitation are more significant than previously realized.

More at https://www.nsf.gov/news/news_summ.jsp?cntn_id=243121&WT.mc_id=USNSF_51&WT.mc_ev=click


This is an NSF News item.

Waterborne Disease Outbreaks — United States, 2009–2014

Background: The Centers for Disease Control and Prevention (CDC) has conducted national surveillance for waterborne disease outbreaks since 1971 in partnership with the Council of State and Territorial Epidemiologists and the United States Environmental Protection Agency (USEPA). Outbreaks were reported using a paper-based process until 2009, when CDC transitioned to the web-based National Outbreak Reporting System (NORS). Since the 1990s, national waterborne disease outbreak data have been summarized in biennial reports, classified by the type of water exposure implicated. Data on all waterborne disease outbreaks are not routinely summarized together.

Methods: Public health officials in the 50 states, the District of Columbia, U.S. territories, and Freely Associated States voluntarily reported waterborne disease outbreaks to NORS and collaborated with CDC on a verification process for data on 2009–2014 outbreaks. The verification process follows the joint CDC–USEPA review of outbreak reports to clarify details necessary to finalize a national summary (note: 2013–2014 data are preliminary).

Results: In total, 433 waterborne disease outbreaks with a first illness onset date during 2009–2014 were assigned one of five water exposure categories: treated recreational water (n=200 [46%]), drinking water (n=110 [25%]), untreated recreational water (n=61 [14%]), other water (n=31 [7%]), or unknown water (n=31 [7%]). These outbreaks resulted in at least 10,161 illnesses, 843 hospitalizations, and 76 deaths. Outbreak exposures occurred in 45 states and Puerto Rico. Most outbreaks (n=242 [56%]) started during June–August. Among outbreaks with a single confirmed etiology (n=344 [79%]), Legionella (n=138 [40%]), Cryptosporidium (n=98 [28%]), and Giardia (n=24 [7%]) were the most frequently identified pathogens; four outbreaks identified more than one etiology. Investigations of outbreaks with an unidentified etiology (n=85 [20%]) primarily implicated chemicals/toxins (n=39 [46%]) or bacteria (n=21 [25%]) as suspected etiologies.

Conclusion: NORS is a voluntary national reporting system used to report all waterborne disease outbreaks to CDC. While data from NORS inform the development of prevention measures (e.g., health policy and communications) specific to individual water exposure categories, aggregate analyses can provide additional information to inform our overall understanding of the epidemiology and burden of waterborne disease in the United States. This is a description of a proposed presentation and does not necessarily reflect EPA policy.

Emerging Contaminants.

Recent temporal trends of contaminants in Narragansett Bay show the appearance over time of contaminants of emerging concern (CECs) resulting from the use of personal care products, pharmaceuticals, and industrial products. In contrast, legacy contaminants such as PCBs and toxic heavy metals generally show a sustained decline due to the enactment of stricter regulatory standards. Within Narragansett Bay there is a north-to-south pollution gradient, driven by the high density of human population and wastewater treatment facilities in the upper Bay. Spatially, this trend appears to hold for a number of CECs, including pharmaceutical compounds, because these classes of pollutants originate from the same point-source inputs within the Bay. Future research needs include identifying the key CECs of concern in coastal waters such as Narragansett Bay and assessing their behavior, fate, and potential to cause adverse effects.

Differential Mutagenicity and Lung Toxicity of Smoldering Versus Flaming Emissions from a Variety of Biomass Fuels

Wildfire smoke properties change with combustion conditions and biomass fuel types. However, the specific role of these conditions in the health effects that follow smoke exposure is uncertain. This study applies a novel combustion and smoke-collection system to examine emissions from multiple biomass fuel types (red oak, peat, pine needles, pine, and eucalyptus) burning under different combustion phases (flaming and smoldering). The combustion system sustains the flaming or smoldering phase for up to 60 min and uses multi-stage, cryogenically cooled impingers to capture particulate matter (PM) and semi-volatile organic compounds from the smoke emissions. Biomass smoke PM was extracted and assessed for mutagenicity in Salmonella strains TA100 and TA98 (+/-S9), as well as for lung toxicity in mice via oropharyngeal aspiration. Carbon dioxide (CO2), carbon monoxide (CO), and PM concentrations, monitored continuously during the combustion process, were used to calculate modified combustion efficiency (MCE) and emission factors (EFs). Average MCEs were 73% during smoldering and 98% during flaming phases, and the EFs for CO, CO2, and PM correlated well with MCE. On an equal-mass basis, the extractable organic matter from the peat, pine, and eucalyptus flaming PM had the highest mutagenic potencies; similarly, the lung-toxic potencies of the peat and eucalyptus flaming PM were greater than those of the respective smoldering PM. However, after adjusting for the emitted PM mass (i.e., real-life smoke exposure situations), the mutagenicity and lung-toxicity emission factors were higher for the smoldering than for the flaming emissions, with the highest emission factors exhibited by pine needles for mutagenicity and eucalyptus for lung toxicity. These results demonstrate that (1) different fuel types and combustion phases can dramatically alter emission characteristics, mutagenicity, and lung toxicity; (2) the present combustion system can be used for health-risk assessment of inhalation exposure to various types of wildfire smoke; and (3) smoldering emissions produce greater toxicity emission factors than flaming emissions. [Abstract does not represent official USEPA policy.]
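The MCE and emission-factor calculations referenced in this abstract can be sketched briefly. The definition MCE = dCO2 / (dCO2 + dCO) on a background-corrected molar basis is standard in the biomass burning literature; the carbon-balance emission factor and all numeric inputs below are illustrative assumptions, not measurements from this study.

```python
# Minimal sketch of modified combustion efficiency (MCE) and a
# carbon-mass-balance emission factor (EF). Placeholder inputs only.

def mce(d_co2_ppm, d_co_ppm):
    """Modified combustion efficiency from background-corrected mixing ratios."""
    return d_co2_ppm / (d_co2_ppm + d_co_ppm)

def emission_factor(pollutant_mass_g, carbon_emitted_g, fuel_carbon_fraction=0.50):
    """Carbon-balance EF in g pollutant per kg dry fuel burned, assuming
    essentially all fuel carbon is emitted as the measured carbon species."""
    fuel_burned_kg = (carbon_emitted_g / fuel_carbon_fraction) / 1000.0
    return pollutant_mass_g / fuel_burned_kg

# Hypothetical smoldering-dominated sample:
print(f"MCE = {mce(d_co2_ppm=300.0, d_co_ppm=110.0):.2f}")           # ~0.73
print(f"EF_PM = {emission_factor(pollutant_mass_g=2.0, carbon_emitted_g=250.0):.1f} g/kg fuel")
```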

Report on Responses to NTIA’s Request for Comments on Promoting Stakeholder Action Against Botnets and Other Automated Threats

Date: September 15, 2017
Docket Number: 170602536-7536-01

This report identifies the common themes found in the responses to NTIA’s “Request for Comments on Promoting Stakeholder Action Against Botnets and Other Automated Threats.” It is not a comprehensive discussion of all comments, nor does it reflect a government decision. The full text of all comments is available here.

Emissions from prescribed burning of timber slash piles in Oregon.

Emissions from burning piles of post-harvest timber slash (Douglas fir) in Grande Ronde, Oregon were sampled using an instrument platform lofted into the plume by a tether-controlled aerostat (balloon). Carbon monoxide, carbon dioxide, methane, particulate matter (PM2.5), black carbon, ultraviolet-absorbing PM, elemental/organic carbon, filter-based metals, polycyclic aromatic hydrocarbons (PAHs), polychlorinated dibenzodioxins/dibenzofurans (PCDD/PCDF), and volatile organic compounds (VOCs) were sampled to determine emission factors, the amount of pollutant formed per amount of biomass burned. The study compared emissions from piles covered with polyethylene (PE) sheets to prevent fuel wetting against emissions from uncovered piles. Results showed that the uncovered ("wet") piles burned with lower combustion efficiency and had higher emission factors for VOCs, PM2.5, PCDD/PCDF, and PAHs. Removing the PE prior to ignition and varying the PE size or thickness produced no statistically distinguishable differences in emissions. The results suggest that dry piles, whether covered with PE or not, exhibit significantly lower emissions than wet piles because they burn with better combustion efficiency.

When and where to intervene? Coastal nutrient loading, groundwater travel times, and estuary dynamics

There is often a lag between the timing of an intervention in a natural system and its resultant impact. Delays in the transport of pollutants from their sources, inertia in the effects of policy on economic actors, and long-term memory in natural systems turn environmental management problems into dynamic decision problems. The pertinent management questions are therefore not just where and how to intervene, but also when and to what extent.

This paper presents the dynamic planning problem of addressing coastal eutrophication, with an example and parameterization for an estuary system on Cape Cod (Barnstable County, MA). Because groundwater, the primary transport mechanism for nitrogen pollution in this system, has long travel times through the aquifer, the effect of an abatement effort depends on its spatial location along this travel-time gradient. Interventions under consideration range from source control, such as upgrading septic systems, to in-estuary approaches, with aquaculture a popularly cited option. For a community, finding the right balance of these choices is more complicated than simply implementing the fastest-acting option first or prioritizing the cheapest, because travel time, marginal costs, scalability, and estuary dynamics combine to make efficient choices more nuanced in space and time.

We present a conceptual model of groundwater transport and estuary nutrient loading, followed by an analytical description of the inter-temporal tradeoffs and the efficiency of source and in-estuary nutrient abatement efforts, using concepts from optimal control theory. This model is then parameterized for the Three Bays estuary system on Cape Cod, MA, and its associated watershed. Our results inform the timing, spatial allocation, and combination of treatment options by simulating the optimal control paths. These control paths are compared, in terms of efficiency, against heuristics and policy-relevant simplifications.
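The inter-temporal tradeoff at the heart of this problem can be sketched with a toy present-value comparison: a source control whose benefit reaches the estuary only after a groundwater travel-time delay versus an in-estuary measure that acts immediately. The discount rate, costs, removal rates, and the simple present-value framing below are illustrative assumptions, not the paper's optimal control model.

```python
# Toy sketch of the delayed-benefit tradeoff between a source control
# (e.g., a hypothetical septic upgrade whose effect arrives after the
# groundwater travel time) and an in-estuary measure that acts at once.
# All parameter values are placeholders for illustration.

def present_value_removal(annual_removal_kg, delay_years, horizon_years, discount_rate=0.03):
    """Discounted cumulative nitrogen removal for a measure that starts
    delivering benefits after `delay_years` and runs through the horizon."""
    return sum(
        annual_removal_kg / (1.0 + discount_rate) ** t
        for t in range(delay_years, horizon_years)
    )

horizon = 30  # planning horizon in years

# Hypothetical options: same annual removal, different delays and costs.
source_control = {"removal": 100.0, "delay": 15, "cost": 40_000.0}  # long travel time
in_estuary     = {"removal": 100.0, "delay": 0,  "cost": 60_000.0}  # immediate effect

for name, opt in [("source control", source_control), ("in-estuary", in_estuary)]:
    pv = present_value_removal(opt["removal"], opt["delay"], horizon)
    print(f"{name:15s} PV removal = {pv:7.1f} kg, cost per PV kg = ${opt['cost'] / pv:.0f}")
```

A full treatment, as the paper describes, would optimize the mix and timing of both options jointly rather than ranking them by a single cost-effectiveness ratio.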