Sunday, May 18, 2008

Rare Musk Ox May Be Threatened By Climate Change

ScienceDaily (Apr. 27, 2008) — The Wildlife Conservation Society (WCS) recently launched a four-year study to determine if climate change is affecting populations of a quintessential Arctic denizen: the rare musk ox. Along with collaborators from the National Park Service, U.S. Geological Survey, and Alaska Fish and Game, Wildlife Conservation Society researchers have already equipped six musk ox with GPS collars to better understand how climate change may affect these relics of the Pleistocene. The research team will be assessing how musk ox are faring in areas along the Chukchi and northern Bering Seas, and the extent to which snow and icing events, disease, and possibly predation may be driving populations.

"Musk ox are a throwback to our Pleistocene heritage and once shared the landscape with mammoths, wild horses, and sabered cats," said the study's leader, Dr. Joel Berger, a Wildlife Conservation Society scientist and professor at the University of Montana. "They may also help scientists understand how arctic species can or cannot adapt to climate change."

Once found in Europe and northern Asia, today musk ox are restricted to Arctic regions of North America and Greenland, although they have been introduced into Russia and northern Europe. They were reintroduced in Alaska after being wiped out there in the late 19th century. Currently they are found in two national park units: Alaska's Bering Land Bridge National Preserve and Cape Krusenstern National Monument.

Next year, the team will collar an additional 30-40 animals.

Technological Breakthrough In Fight To Cut Greenhouse Gases

ScienceDaily (Apr. 27, 2008) — Scientists at Newcastle University have pioneered breakthrough technology in the fight to cut greenhouse gases. The Newcastle University team, led by Michael North, Professor of Organic Chemistry, has developed a highly energy-efficient method of converting waste carbon dioxide (CO2) into chemical compounds known as cyclic carbonates.

The team estimates that the technology has the potential to use up to 48 million tonnes of waste CO2 per year, reducing the UK's emissions by about four per cent.

Cyclic carbonates are widely used in the manufacture of products including solvents, paint-strippers and biodegradable packaging, as well as having applications in the chemical industry. Cyclic carbonates also have potential for use in the manufacture of a new class of efficient anti-knocking agents in petrol. Anti-knocking agents make petrol burn better, increasing fuel efficiency and reducing CO2 emissions.

The conversion technique relies upon the use of a catalyst to force a chemical reaction between CO2 and an epoxide, converting waste CO2 into cyclic carbonate, a chemical for which there is significant commercial demand.

The reaction between CO2 and epoxides is well known, but until now it has required a lot of energy, needing high temperatures and high pressures to work successfully. The current process also requires the use of ultra-pure CO2, which is costly to produce. The Newcastle team has succeeded in developing an exceptionally active catalyst, derived from aluminium, which can drive the reaction needed to turn waste carbon dioxide into cyclic carbonates at room temperature and atmospheric pressure, vastly reducing the energy input required.

Professor North said: 'One of the main scientific challenges facing the human race in the 21st century is controlling global warming that results from increasing levels of carbon dioxide in the atmosphere.

'One solution to this problem, currently being given serious consideration, is carbon capture and storage, which involves concentrating and compressing CO2 and then storing it,' he said. 'However, long-term storage remains to be demonstrated.'

To date, alternative solutions for converting CO2 emissions into a useful product have required processes so energy-intensive that they generate more CO2 than they consume. Professor North compares the process developed by his team to that of a catalytic converter fitted to a car.

'If our catalyst could be employed at the source of high-concentration CO2 production, for example in the exhaust stream of a fossil-fuel power station, we could take out the carbon dioxide, turn it into a commercially valuable product and at the same time eliminate the need to store waste CO2,' he said.

Professor North believes that, once it is fully developed, the technology has the potential to utilise a significant amount of the UK's CO2 emissions every year. 'To satisfy the current market for cyclic carbonates, we estimate that our technology could use up to 18 million tonnes of waste CO2 per year, and a further 30 million tonnes if it is used as an anti-knocking agent. Using 48 million tonnes of waste CO2 would account for about four per cent* of the UK's CO2 emissions, which is a pretty good contribution from one technology,' commented Professor North.

The technique has been proven to work successfully in the lab. Professor North and his team are currently carrying out further lab-based work to optimise the efficiency of the technology, following which they plan to scale up to a pilot plant.

* Based on 2004 figures from the UN.

The paper 'Synthesis of cyclic carbonates from atmospheric pressure carbon dioxide using exceptionally active aluminium(salen) complexes as catalysts' has been published in the European Journal of Inorganic Chemistry. The project was funded by the Engineering and Physical Sciences Research Council.

Carbon Dioxide Removed From Smokestacks Could Be Useful In DVD And CD-ROM Manufacture

ScienceDaily (Apr. 9, 2008) — Carbon dioxide removed from smokestack emissions in order to slow global warming in the future could become a valuable raw material for the production of DVDs, beverage bottles and other products made from polycarbonate plastics, chemists are reporting.

In separate reports presented at the 235th national meeting of the American Chemical Society on April 8, 2008, Thomas E. Müller, Ph.D., and Toshiyasu Sakakura, Ph.D., described innovative ways of making polycarbonate plastics from CO2. Those processes offer consumers the potential for less expensive, safer and greener products compared to current production methods, the researchers agreed.

"Carbon dioxide is so readily available, especially from the smokestacks of industries that burn coal and other fossil fuels," Müller said. He is at the new catalysis research center CAT, a joint five-year project of RWTH Aachen and industrial giant Bayer Material Science AG and Bayer Technology Services GmbH. "And it's a very cheap starting material. If we can replace more expensive starting materials with CO2, then you'll have an economic driving force."

In another ACS presentation, scientists from Japan also reported using CO2 as an alternative feedstock to turn carbonates and urethanes into plastics and battery components. Sakakura, the team's lead researcher, noted that the new process is simpler and faster than another process developed by a Japanese firm. Sakakura is with the National Institute of Advanced Industrial Science and Technology in Tsukuba, Japan.

Müller pointed out that millions of tons of polycarbonates already are sold each year, with the volume rising. Perhaps no other consumer product has such great potential for use in removing carbon dioxide from the environment, he added. These hard, tough materials represent "intriguing sinks" for exhaust carbon dioxide and are the mainstay for producing eyeglass lenses, automotive headlamp lenses, DVDs and CDs, beverage bottles, and a spectrum of other consumer products.

Trapping carbon dioxide in those plastics would avoid the release of many millions of tons into the environment, Müller said. "Using CO2 to create polycarbonates might not solve the total carbon dioxide problem, but it could be a significant contribution."

Consumers may be drinking from a carbon dioxide product and watching movies on waste-CO2 DVDs sooner than they think. "I would say it's a matter of a few years" before CO2-derived polymers are available to the public, Müller said.

Saturday, May 17, 2008

Fertilizing Oceans to Combat Climate Change: Research Guidelines

New: Joint statement from SCOR and GESAMP on Deliberate Nutrient Additions to the Ocean

The Scientific Committee on Oceanic Research (SCOR) and GESAMP have released a joint statement regarding Deliberate Nutrient Additions to the Ocean. For any further questions regarding this statement, please refer to the contacts at the end of the document. The statement can be downloaded here.

Organic and Conventional Production Systems in the Wisconsin Integrated Cropping Systems Trials: I. Productivity 1990-2002 -- Posner et al. 100 (2): 253 -- Agronomy Journal

Joshua L. Posner (a,*), Jon O. Baldock (b), and Janet L. Hedtcke (a)

a Dep. of Agronomy, Univ. of Wisconsin, 1575 Linden Dr., Madison, WI 53706
b AGSTAT, 6394 Grandview Rd., Verona, WI 53593

Major funding provided by the W.K. Kellogg Foundation's Integrated Food and Farming Systems program and a federal appropriation from the Agricultural Research Service (ARS) Integrated Farming Systems program.
* Corresponding author (jlposner@wisc.edu).

During the last half-century, agriculture in the upper U.S. Midwest has changed from limited-input, integrated grain–livestock systems to primarily high-input specialized livestock or grain systems. This trend has spawned a debate regarding which cropping systems are more sustainable and led to the question: can diverse, low-input cropping systems (organic systems) be as productive as conventional systems? To answer this question, we compared six cropping systems, ranging from diverse organic systems to less diverse conventional systems, at two sites in southern Wisconsin. The results of 13 yr at one location and 8 yr at the other showed that: (i) organic forage crops can yield as much dry matter as their conventional counterparts, with quality sufficient to produce as much milk; and (ii) organic corn (Zea mays L.), soybean [Glycine max (L.) Merr.], and winter wheat (Triticum aestivum L.) can produce 90% as well as their conventionally managed counterparts. The average yields for corn and soybean, however, masked a dichotomy in productivity. Combining Wisconsin Integrated Cropping Systems Trial (WICST) data with other published reports revealed that in 34% of the site-years, weed control was such a problem, mostly due to wet spring weather reducing the effectiveness of mechanical weed control techniques, that the relative yields of low-input corn and soybean were only 74% of conventional systems. However, in the other 66% of the cases, where mechanical weed control was effective, the relative yield of the low-input crops was 99% of conventional systems. Our findings indicate that diverse, low-input cropping systems can be as productive per unit of land as conventional systems.
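As a quick check on the abstract's headline figure, the sketch below recombines the two reported cases (34% of site-years at 74% relative yield, 66% at 99%) into a weighted average, which lands at roughly 90%, consistent with the abstract. The input numbers come from the abstract; the weighted-average calculation itself is only an illustrative back-of-the-envelope check, not part of the published analysis.

```python
# Back-of-the-envelope check of the WICST relative-yield figures quoted above.
# Inputs are taken from the abstract; the weighted average is illustrative only.

poor_weed_control_share = 0.34   # fraction of site-years with ineffective mechanical weed control
poor_weed_control_yield = 0.74   # relative yield (low-input vs. conventional) in those site-years

good_weed_control_share = 0.66   # fraction of site-years with effective weed control
good_weed_control_yield = 0.99   # relative yield in those site-years

overall_relative_yield = (poor_weed_control_share * poor_weed_control_yield
                          + good_weed_control_share * good_weed_control_yield)

print(f"Weighted relative yield: {overall_relative_yield:.1%}")  # about 90%, matching the abstract
```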

CAFOs Uncovered

CAFOs Uncovered: The Untold Costs of Confined Animal Feeding Operations

The U.S. livestock industry—a large and vital part of agriculture in this country—has been undergoing a drastic change over the past several decades. Huge CAFOs (confined animal feeding operations) have become the predominant method of raising livestock, and the crowded conditions in these facilities have increased water and air pollution and other types of harm to public health and rural communities.

CAFOs are not the inevitable result of market forces. Instead, these unhealthy operations are largely the result of misguided public policy that can and should be changed.

In this report, the Union of Concerned Scientists analyzes both the policies that have facilitated the growth of CAFOs and the enormous costs imposed on society by CAFOs. We also discuss sophisticated and efficient alternatives for producing affordable animal products, and offer policy recommendations that can begin to lead us toward a healthy and sustainable food system.

Nature grants free access for biomedical journals - SciDev.Net

Nature grants free access for biomedical journals

Naomi Antony

29 April 2008


Developing countries in Africa, Asia and Latin America will gain free access to more than 65 Nature journals, it was announced last week (22 April).

Nature Publishing Group (NPG) and INASP (International Network for the Availability of Scientific Publications) have teamed up to make NPG's collection of biomedical journals available to more than 20 partner countries, as part of INASP's Programme for the Enhancement of Research Information (PERI).

Journals in this collection include Nature, the Nature Clinical Practice series, NPG research journals and the Nature Reviews journals in life sciences and medicine.

Lucy Browse, head of information delivery at INASP, said that the decision came as the result of requests from their country coordination teams to include NPG biomedical journals within PERI.

"[This] inclusion will be of huge benefit to researchers within our partner countries and also strengthen the resource availability within the digital libraries," says Browse.

"It means that researchers in approximately 600 libraries [around the world] will potentially be able to access [these journals]."

Browse told SciDev.Net that INASP cooperates with over 50 publishers and aggregators to negotiate free or heavily subsidised access for their partner countries, helping to bridge gaps in research communication.

"At INASP, our mission is to enhance the research communication cycle within our partner and network countries … Our activities enable access to research information to be increased in a sustainable way and also encourage the research outputs and communication of colleagues in developing countries to reach a global audience," she says.

According to NPG and INASP, countries will have access to 2008 content as well as content published between 2004 and 2007.




Friday, May 16, 2008

Job and Economic Development Impact Model

NREL has developed several Job and Economic Development Impact (JEDI) models, available for download from the Energy Analysis Web site. The JEDI models are easy-to-use, spreadsheet-based tools that estimate the economic impacts of constructing and operating power generation plants at the state level. First developed to model wind energy development impacts, JEDI recently expanded to offer more technologies. Models have been developed and are now available for download to estimate job and economic impacts from dry mill corn ethanol and concentrating solar power (CSP) plants. Additional models estimating the jobs and economic impact from other technologies will be available in the coming months. The site provides more information about JEDI, as well as free downloads.

Renewables Portfolio Standards


NREL staff members Lori Bird and Karlynn Cory were contributors to a new Lawrence Berkeley National Laboratory (LBNL) report released in April, "Renewables Portfolio Standards in the United States: A Status Report with Data through 2007" (PDF, 1.5 MB). This report provides a comprehensive overview of early experience with renewables portfolio standards (RPS) in the United States. State-level RPS programs are among the most important drivers for renewable energy deployment in the United States. As their popularity and importance have grown, so too has the need to keep up with the design, early experience, and projected impacts of these RPS programs. This report seeks to fulfill this need by providing basic, factual information on RPS policies. Drawing from a variety of sources, this report — the first in what is envisioned to be an ongoing series — provides comprehensive information on a broad range of RPS-related topics. The report concentrates on key recent developments, while also providing information on historical RPS experience and design.

The Oceans Finally Show Their Stripes -- Berardelli 2008 (428): 3 -- ScienceNOW

The Oceans Finally Show Their Stripes

By Phil Berardelli
ScienceNOW Daily News
28 April 2008

Researchers have assembled the most detailed picture of ocean currents ever produced, and in so doing they have revealed a vast array of striated currents that roughly parallel the equator. This new level of resolution should improve understanding of a wide variety of ocean-related phenomena.

Scientists have had a relatively easy time deciphering the big movements in the oceans. Major currents such as the Gulf Stream are easy to detect, for example, and climate researchers have for some time been tracking the Global Conveyor Belt--a giant underwater river that carries cold Arctic waters from the North Atlantic to the middle of the Pacific. More subtle ocean dynamics, known as weak structures, have remained largely invisible. Now an international team has assembled a high-resolution picture of the oceans using satellite radar altimetry and 20 years' worth of data collected from thousands of buoys in an ocean-surface network called the Global Drifter Program.

One of the most dramatic features revealed by the new data is a network of banded currents moving west to east and vice versa across all of the oceans at the snail's pace of about 30 meters an hour. "They're almost everywhere," says physical oceanographer and co-author Peter Niiler of the Scripps Institution of Oceanography in San Diego, California. "We've discovered many more weak structures than we thought," Niiler says. As Niiler, lead author Nikolai Maximenko of the University of Hawaii, Honolulu, and colleagues reported 24 April in Geophysical Research Letters, the currents in the northeastern Pacific, which move back and forth between the U.S. West Coast and Hawaii, could have as their source a current that loops up the California coast and back south. The sources of stripes in other regions aren't yet known, but they all cross major ocean currents and behave more like waves than jets of water.

"Who would have thought the bands would be so ubiquitous," says geophysical fluid dynamicist Geoff Vallis of Princeton University. And physical oceanographer Terrence Joyce of Woods Hole Oceanographic Institution in Massachusetts says the bands are highly unusual, because they don't align with ocean currents that are defined by temperature or sea-surface height. The bands cross over the currents, he says, "yet they still persist. That's very curious, and the paper doesn't explain this curiosity."




The Effect of Rearing Density on Growth and Survival of Cobia in a Closed Recirculating Aquaculture System

a Maritech, Research and Development, 805 46th Place, Vero Beach, Florida 32963 USA
b University of Texas at Austin Marine Science Institute, Fisheries and Mariculture Laboratory, 750 Channel View Drive, Port Aransas, Texas 78373 USA
c College of Science and Technology, Texas A&M University – Corpus Christi, Corpus Christi, Texas 78412 USA
1 Corresponding author.

Cobia, Rachycentron canadum, is a highly prized game fish as well as an important commercial fish in many parts of the world. This species is found worldwide in tropical, subtropical, and warm temperate seas, except for the eastern Pacific. Cobia is thought to be highly migratory in the Gulf of Mexico where it is absent from commercial and recreational catches during the late fall and winter (Ditty and Shaw 1992). Cobia migrate from the Florida Keys north and west along the Gulf coast during early spring and reappear in the northern Gulf of Mexico in late spring and summer.

Cobia is an excellent candidate for aquaculture because of its fast growth rate, reaching 6–10 kg in 12–14 mo (Liao et al. 2004), as well as excellent flesh quality (Chen 2001). However, intensive culture of cobia is relatively new to aquaculture. Techniques for the natural spawning of cobia broodstock in recirculating aquaculture systems (RAS) have only recently been developed in the United States (Arnold et al. 2002). Although larval rearing techniques have been under investigation for the past few years (Faulk and Holt 2003, 2005), little information is currently available regarding optimal rearing densities for cobia in RAS. The availability of large numbers of larvae and/or juveniles is necessary for successful grow-out operations, so high-density larva culture is favored.

Stocking density has been shown to influence growth rate (Pickering and Stewart 1984), feeding behavior (Kentouri et al. 1994), disease resistance (Mazur and Iwama 1993), and survival (Sodeberg and Meade 1987) in finfish. Density effects on cultured fish may result in behavioral interactions among siblings including competition for food and space that often result in cannibalism. Reduced water quality in high-density rearing systems can also affect growth and survival. If water quality can be controlled and sufficient food provided, high rearing densities may still affect growth and survival through responses related to crowding. Reduced appetite is one of the effects associated with reduced growth of juveniles under crowded conditions (Wendelaar Bonga 1997). The effect that crowding has on larvae is not well understood, but it often results in reduced growth and survival (Houde 1975; Alvarez-Gonzalez et al. 2001). This study was conducted to evaluate the effects of different rearing densities on the growth and survival of cobia larvae in an RAS.



Blackwell Synergy - J World Aquaculture Soc, Volume 37 Issue 2 Page 204-209, June 2006 (Full Text)

CHEMISTRY: Chloramine Complexities

Jake Yeston

Chloramine is a comparatively recent weapon in the ongoing battle to eliminate harmful microorganisms from drinking water supplies. Though its disinfecting properties are straightforward, the concomitant generation of ammonia as a byproduct can give rise to a complex web of downstream chemistry that remains an active area of study. One important reaction is microbial nitrification, or oxidation of the ammonia to nitrite and nitrate, which also lowers the water's pH by acid production. Zhang et al. have systematically explored the efficiency of nitrification in plumbing pipes of differing compositions--polyvinyl chloride (PVC), copper, lead, and brass--at various pH and phosphate levels. They found that relative to PVC, copper inhibited nitrifier growth, whereas lead enhanced it (probably through reductive cycling of nitrate back to ammonia via lead corrosion). Brass initially resisted nitrification activity, but then shifted its behavior after ~120 days, as the efficiency of copper leaching from the alloy diminished. A perhaps counterintuitive consequence of this reaction web is that PVC pipes may ultimately cause more metal ion leaching into the water stream than copper pipes, as the acid byproducts of nitrification degrade brass valves and faucets. -- JSY

Environ. Sci. Technol. 42, 10.1021/es702483d (2008).

Fire-Derived Charcoal Causes Loss of Forest Humus -- Wardle et al. 320 (5876): 629 -- Science

Fire-Derived Charcoal Causes Loss of Forest Humus

David A. Wardle,* Marie-Charlotte Nilsson, Olle Zackrisson

Fire is a global driver of carbon storage and converts a substantial proportion of plant biomass to black carbon (for example, charcoal), which remains in the soil for thousands of years. Black carbon is therefore often proposed as an important long-term sink of soil carbon. We ran a 10-year experiment in each of three boreal forest stands to show that fire-derived charcoal promotes loss of forest humus and that this is associated with enhancement of microbial activity by charcoal. This result shows that charcoal-induced losses of belowground carbon in forests can partially offset the benefits of charcoal as a long-term carbon sink.

Department of Forest Ecology and Management, Swedish University of Agricultural Sciences, SE901-83 Umeå, Sweden.

* To whom correspondence should be addressed. E-mail: david.wardle@svek.slu.se


Expanding Oxygen-Minimum Zones in the Tropical Oceans -- Stramma et al. 320 (5876): 655 -- Science

Expanding Oxygen-Minimum Zones in the Tropical Oceans

Lothar Stramma,1* Gregory C. Johnson,2 Janet Sprintall,3 Volker Mohrholz4

Oxygen-poor waters occupy large volumes of the intermediate-depth eastern tropical oceans. Oxygen-poor conditions have far-reaching impacts on ecosystems because important mobile macroorganisms avoid or cannot survive in hypoxic zones. Climate models predict declines in oceanic dissolved oxygen produced by global warming. We constructed 50-year time series of dissolved-oxygen concentration for select tropical oceanic regions by augmenting a historical database with recent measurements. These time series reveal vertical expansion of the intermediate-depth low-oxygen zones in the eastern tropical Atlantic and the equatorial Pacific during the past 50 years. The oxygen decrease in the 300- to 700-m layer is 0.09 to 0.34 micromoles per kilogram per year. Reduced oxygen levels may have dramatic consequences for ecosystems and coastal economies.

1 Institut für Meereswissenschaften an der Universität Kiel (IFM-GEOMAR), Düsternbrooker Weg 20, 24105 Kiel, Germany.
2 National Oceanic and Atmospheric Administration, Pacific Marine Environmental Laboratory, 7600 Sand Point Way NE, Seattle, WA 98115, USA.
3 Scripps Institution of Oceanography, 9500 Gilman Drive, La Jolla, CA 92093, USA.
4 Baltic Sea Research Institute Warnemünde, Post Office Box 301161, 18112 Rostock, Germany.

* To whom correspondence should be addressed. E-mail: lstramma@ifm-geomar.de


Tuesday, May 13, 2008

Global Farm Animal Production and Global Warming: Impacting and Mitigating Climate Change

Gowri Koneswaran1 and Danielle Nierenberg1,2

1Humane Society of the United States, Washington, DC, USA; 2Worldwatch Institute, Washington, DC, USA

Abstract
Background: The farm animal sector is the single largest anthropogenic user of land, contributing to many environmental problems, including global warming and climate change.

Objectives: The aim of this study was to synthesize and expand upon existing data on the contribution of farm animal production to climate change.

Methods: We analyzed the scientific literature on farm animal production and documented greenhouse gas (GHG) emissions, as well as various mitigation strategies.

Discussions: An analysis of meat, egg, and milk production encompasses not only the direct rearing and slaughtering of animals, but also grain and fertilizer production for animal feed, waste storage and disposal, water use, and energy expenditures on farms and in transporting feed and finished animal products, among other key impacts of the production process as a whole.

Conclusions: Immediate and far-reaching changes in current animal agriculture practices and consumption patterns are both critical and timely if GHGs from the farm animal sector are to be mitigated.

Environ Health Perspect 116:578–582 (2008). doi:10.1289/ehp.11034, available via http://dx.doi.org/ [Online 31 January 2008]

New Thinking on Flame Retardants

Kellyn S. Betts

To meet U.S. fire safety standards, manufacturers until recently have primarily chosen polybrominated diphenyl ethers (PBDEs) as the flame retardants used in furniture, bedding, electronics, and other consumer products. But growing evidence of adverse health effects from exposure to these compounds is driving both bans on their use and a search for new and safer alternatives. Two promising possibilities are incorporating flame retardants into the materials themselves by using nanomaterials in fabrics, and using halogen-free flame retardants in electronics, including a polymer that releases water vapor rather than hazardous gases when it burns.


The full version of this article is available for free in HTML or PDF formats.

Unwelcome Guest: PBDEs in Indoor Dust

Kellyn S. Betts

Abstract

Although house dust is known to be a predominant source of exposure to PBDEs, it's not yet clear which part of the dust these chemicals bind to. The dust pictured in the original article contains pet hair (rust brown), pollen (yellow), plant fibers (green), dead skin cells (light to medium brown), dirt and minerals (orange), textile fibers (blue), and spider silk (pink).

Researchers have known for years that house dust is a major exposure route for lead and certain pesticides. Now a growing body of evidence indicates that polybrominated diphenyl ether (PBDE) flame retardants released from a wide variety of consumer products such as electronics and upholstered furniture are finding their way into air and dust in homes, cars, and workplaces. Several states and nations have banned or are considering banning PBDEs, but because these substances have recently been shown to bioaccumulate and persist for long periods in the environment, people will continue to be exposed to them. Moreover, investigators are finding extremely high concentrations of these substances in individuals in certain segments of the U.S. population, including children and the elderly. Studies are now focusing on how these substances are ending up in dust, how and to what extent exposure is occurring, and the impact of exposure on human health.


The full version of this article is available for free in HTML or PDF formats.

Monday, May 12, 2008

Technology Review: Corn Primed for Making Biofuel

In an effort to help boost the nation's supply of biofuels, researchers have created three strains of genetically modified corn to manufacture enzymes that break down the plant's cellulose into sugars that can be fermented into ethanol. Incorporating such enzymes directly into the plants could reduce the cost of converting cellulose into biofuel.

Last year, new federal regulations called for production of renewable fuels to increase to 36 billion gallons annually--nearly five times current levels--by 2022. Today, nearly all fuel ethanol in the United States is produced from corn kernels. To meet the required increase, researchers are turning to other sources, such as cellulose, a complex carbohydrate found in all plants. Corn leaves and stems, prairie grasses, and wood chips are leading candidates for supplies of cellulose. Cellulosic ethanol has many advantages over that produced from corn kernels. Cellulose is not only extremely abundant and inexpensive; studies also suggest that the production and use of ethanol from cellulose could yield fewer greenhouse gases.

However, the biggest obstacle to making cellulosic ethanol commercially feasible is the breakdown of cellulose. Enzymes that degrade cellulose, called cellulases, are typically produced by microbes grown inside large bioreactors, an expensive and energy-intensive process. "In order to make cellulosic ethanol really competitive, we really need to bring those costs down," says Michael J. Blaylock, vice president of system development at Edenspace, a crop biotechnology firm based in Manhattan, KS.

Mariam Sticklen, professor of crop and soil science at Michigan State University, in East Lansing, figured that she could eliminate the cost of manufacturing enzymes by engineering corn plants to produce the enzymes themselves. Instead of relying on the energy-intensive process of producing them in bioreactors, "the plants use the free energy of the sun to produce the enzymes," she says.

Typically, the breakdown of cellulose requires three different cellulases. Last year, Sticklen reported modifying corn with a gene for a cellulase that cuts the long cellulose chains into smaller pieces. The gene came from a microbe that lives in a hot spring. A month later, Sticklen inserted a gene derived from a soil fungus into the corn genome. That gene codes for an enzyme that breaks the smaller pieces of cellulose into pairs of glucose molecules. In this latest effort, Sticklen has modified corn to produce an enzyme that splits the glucose pairs into individual sugar molecules; the enzyme is naturally produced by a microbe that lives inside a cow's stomach. The final result: three strains of corn, each of which produces an enzyme essential to the complete breakdown of cellulose.

To avoid the possibility of transferring the genes to other crops or wild plants, the enzymes are only produced in the plant's leaves and stems, not in its seeds, roots, or pollen, says Sticklen. What's more, to prevent the corn from digesting itself, she engineered the plants so that the enzymes accumulate only in special storage compartments inside the cells, called vacuoles. The cellulases are released only after the plant is harvested, during processing. Sticklen described her modified crops last week at the American Chemical Society's national meeting in New Orleans.

By Alexandra M. Goho


Costs, Considerations Of Switching To Natural Or Organic Agricultural Methods

ScienceDaily (Apr. 24, 2008) — When Kansas State University graduate student Ben Wileman was a practicing veterinarian in Belle Fourche, S.D., natural and organic labels were a big focus for the beef producers he saw.

"They tended to be terms that were thrown around a lot, but few people really seemed to know what they truly meant," Wileman said.

"Organic" is defined by the U.S. Department of Agriculture; "natural," however, can be defined differently depending on who's doing the labeling. But both terms mean one thing: higher costs for producers. That's why Wileman hopes that his research will be another tool to help those in the beef industry pondering whether to abandon conventional methods and go natural or organic.


Wileman, a doctoral student in diagnostic medicine and pathobiology at K-State, is examining the economics and logistics of conventionally raised beef versus organic and naturally raised beef. He is working with Dan Thomson, associate professor of clinical sciences at K-State. The research was presented in February at the Western Veterinary Conference in Las Vegas and will be presented again in July at the American Veterinary Medicine Association conference in New Orleans.


"The reason we're looking at this is because before anyone decides to go all-natural or all-organic, they need to be aware of what it's going to cost them and cost consumers," Wileman said. "We want producers to be knowledgeable about what to expect in terms of performance and economics."


Although the scientific facets of organic foods have been probed, Wileman said that little research has been done on the economic impact. Using data from the U.S. Department of Agriculture, the K-State researchers considered feed costs and availability, the number of organic grain producers, the supply and demand for such grains going to beef cattle, and the performance impacts. They found that a producer would have to make about $120 more per head on naturally finished cattle to make the same profit as they would have on conventionally finished ones. For organically finished cattle, that increases to about $400 more per head.
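To put the per-head figures above in more familiar terms, the minimal sketch below spreads the reported cost premiums over an assumed carcass weight. The $120 and $400 figures come from the article; the 800 lb carcass weight is a hypothetical round number chosen for illustration, not a value from the K-State study.

```python
# Rough translation of the per-head premiums quoted above into a per-pound premium.
# The $120 and $400 figures come from the article; the 800 lb carcass weight is assumed.

carcass_weight_lb = 800          # hypothetical carcass weight per head (not from the article)

premiums_per_head = {
    "natural": 120.0,            # extra dollars per head needed to break even (from the article)
    "organic": 400.0,
}

for system, premium in premiums_per_head.items():
    per_lb = premium / carcass_weight_lb
    print(f"{system}: ${premium:.0f}/head -> about ${per_lb:.2f} extra per pound of carcass")
```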


The greatest contributing factor to the cost of going natural or organic is feed prices, Wileman said. In areas where there are relatively few certified-organic grain producers, transporting and certifying grain adds a major expense.


What's more, Wileman said, is that research done at K-State shows that beef producers are competing for a mere 2 percent of a consumer's income. He said another thing to keep in mind is research showing that most growth in organic and natural food items has come from the same shoppers buying more products, not from an increase in the numbers of like-minded consumers.


With this in mind, Wileman said there are a few things that the beef industry should consider when contemplating going organic or natural. Producers need to consider that they won't be able to feed their cattle in the same way and may consider forming cooperatives to meet their needs. Likewise, feedlots must be mindful of feed handling to prevent mixing organic grains with conventionally grown grains. Finally, packagers and restaurants need to know that they will have to absorb the increased costs of going natural or organic -- or be prepared to pass those costs on to their consumers.


The K-State researchers don't want to dissuade producers and others in the beef industry from going natural or organic, but they do want to offer information that can help them make that decision.


"There's not a problem with going natural or organic, but there will be production and economic issues that they will need to compensate for," Wileman said. "We want to be able to show what the implications of going organic or natural are before a producer or corporation makes that decision."


Because much of the scientific research on organic foods has centered on fruits and vegetables, Wileman said there is plenty of room to study the performance aspects of organic and natural beef production. For instance, he said that some research already has shown that natural diets can increase the prevalence of liver abscesses in cattle. Little is known about how these diets might affect other diseases like foot rot, he said.


"There are a lot more questions that need to be answered," he said.


Greenhouse Gases, Carbon Dioxide And Methane, Rise Sharply In 2007

ScienceDaily (Apr. 24, 2008) — Last year alone, global levels of atmospheric carbon dioxide, the primary driver of global climate change, increased by 0.6 percent, or 19 billion tons. Additionally, methane rose by 27 million tons after nearly a decade with little or no increase. NOAA scientists released these and other preliminary findings today as part of an annual update to the agency’s greenhouse gas index, which tracks data from 60 sites around the world.

The burning of coal, oil, and gas, known as fossil fuels, is the primary source of increasing carbon dioxide emissions. Earth's oceans, vegetation, and soils soak up half of these emissions. The rest stays in the air for centuries or longer. Twenty percent of the 2007 fossil fuel emissions of carbon dioxide are expected to remain in the atmosphere for thousands of years, according to the latest scientific assessment by the Intergovernmental Panel on Climate Change.

Viewed another way, last year’s carbon dioxide increase means 2.4 molecules of the gas were added to every million molecules of air, boosting the global concentration to nearly 385 parts per million (ppm). Pre-industrial carbon dioxide levels hovered around 280 ppm until 1850. Human activities pushed those levels up to 380 ppm by early 2006.


The rate of increase in carbon dioxide concentrations accelerated over recent decades along with fossil fuel emissions. Since 2000, annual increases of two ppm or more have been common, compared with 1.5 ppm per year in the 1980s and less than one ppm per year during the 1960s.
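The two ways the article states last year's increase (19 billion tons of CO2 and 2.4 ppm) can be cross-checked with the commonly cited conversion of roughly 2.13 gigatonnes of carbon, about 7.8 gigatonnes of CO2, per ppm of atmospheric CO2. That conversion factor is an assumption added here for illustration; it is not quoted by NOAA in the article.

```python
# Cross-check of the CO2 figures quoted above.
# Conversion factor (an assumption, not from the article): 1 ppm CO2 ~ 2.13 Gt carbon ~ 7.8 Gt CO2.

GT_CO2_PER_PPM = 2.13 * 44.0 / 12.0    # ~7.8 Gt CO2 per ppm of atmospheric CO2

increase_ppm = 2.4                     # reported rise for 2007
increase_gt = increase_ppm * GT_CO2_PER_PPM
print(f"2.4 ppm is roughly {increase_gt:.0f} Gt of CO2")   # ~19 Gt, matching the '19 billion tons'

concentration_ppm = 385.0              # approximate global mean after the increase
print(f"Relative increase: {increase_ppm / concentration_ppm:.1%}")  # ~0.6%, as stated
```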


Methane levels rose last year for the first time since 1998. Methane is 25 times more potent as a greenhouse gas than carbon dioxide, but there’s far less of it in the atmosphere—about 1,800 parts per billion. When related climate effects are taken into account, methane’s overall climate impact is nearly half that of carbon dioxide.


Rapidly growing industrialization in Asia and rising wetland emissions in the Arctic and tropics are the most likely causes of the recent methane increase, said scientist Ed Dlugokencky from NOAA’s Earth System Research Laboratory.


”We’re on the lookout for the first sign of a methane release from thawing Arctic permafrost,” said Dlugokencky. “It’s too soon to tell whether last year’s spike in emissions includes the start of such a trend.”


Permafrost, or permanently frozen ground, contains vast stores of carbon. Scientists are concerned that as the Arctic continues to warm and permafrost thaws, carbon could seep into the atmosphere in the form of methane, possibly fueling a cycle of carbon release and temperature rise.


Want To Reduce Your Food-related Carbon Footprint? What You Eat Is More Important Than Where It Came From

ScienceDaily (Apr. 22, 2008) — The old adage, "We are what we eat,'' may be the latest recipe for success when it comes to curbing the perils of global climate warming. Despite the recent popular attention to the distance that food travels from farm to plate, aka "food miles," Carnegie Mellon researchers Christopher L. Weber and H. Scott Matthews argue in an upcoming article in Environmental Science & Technology journal that it is dietary choice, not food miles, which most determines a household's food-related climate impacts.

"Our analysis shows that despite all the attention given to food miles, the distance that food travels is only around 11% of the average American household's food-related greenhouse gas emissions,'' said Weber, a research professor in Carnegie Mellon's department of civil and environmental engineering and engineering and public policy.

The researchers report that fruit, vegetables, meat and milk produced closer to home rack up fewer petroleum-based transport miles than foods trucked cross-country to your table. Yet despite the large distances involved--the average distance traveled by food in the U.S. is estimated at 4,000-5,000 miles--the large non-energy-based greenhouse gas emissions associated with producing food make production matter much more than distance traveled.


The authors suggest that eating less red meat and/or dairy products may be a more effective way for concerned citizens to lower their food-related climate impacts. They estimate that shifting to an entirely local diet would reduce greenhouse gas emissions by the equivalent of driving 1,000 fewer miles per year, and that changing just one day per week's meat- and dairy-based calories to chicken, fish, or vegetables would have about the same impact. Shifting entirely from the average American diet to a vegetable-based one would cut emissions by the equivalent of 8,000 miles driven per year.
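For a rough sense of scale, the sketch below converts the "miles driven" equivalents quoted above into CO2-equivalent mass, using an assumed average passenger-car factor of about 0.4 kg CO2e per mile. That emission factor is an assumption added for illustration and is not part of the Carnegie Mellon analysis.

```python
# Converting the article's 'equivalent miles driven per year' into approximate CO2e mass.
# The 0.4 kg CO2e per mile passenger-car factor is an assumed round number, not from the study.

KG_CO2E_PER_MILE = 0.4

diet_shifts_miles = {
    "buy entirely local food":                        1000,
    "one day/week of meat+dairy -> chicken/fish/veg": 1000,
    "fully vegetable-based diet":                     8000,
}

for shift, miles in diet_shifts_miles.items():
    tonnes = miles * KG_CO2E_PER_MILE / 1000.0
    print(f"{shift}: ~{miles} miles/yr -> roughly {tonnes:.1f} t CO2e per household per year")
```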


"Where you get your food from is a relevant factor in family food decisions, but what you are eating - and the processes needed to make it - is much more important from a climate change perspective,'' said Matthews, associate professor of civil and environmental engineering and engineering and public policy at Carnegie Mellon.


Water Needed To Produce Various Types Of Energy




ScienceDaily (Apr. 22, 2008) — It is easy to overlook that most of the energy we consume daily, such as electricity or natural gas, is produced with the help of a dwindling resource – fresh water. Virginia Tech professor Tamim Younos and undergraduate student Rachelle Hill are researching the water-efficiency of some of the most common energy sources and power generating methods.
Younos, associate director of the Virginia Water Resources Research Center based at Virginia Tech and research professor of water resources in the College of Natural Resources, and Hill, an undergraduate researcher from Round Hill, Va., majoring in environmental science with an aquatic resources concentration in the College of Agriculture and Life Sciences, have analyzed 11 types of energy sources, including coal, fuel ethanol, natural gas, and oil, and five power-generating methods, including hydroelectric, fossil-fuel thermoelectric, and nuclear methods.

Younos said they based their calculations on available governmental reports by using a standard measurement unit, which makes this study unique. “Our unit is gallons of water per British Thermal Unit (BTU),” explained Younos. “We selected BTU as a standard unit because it indicates pure energy as heat and is applicable to all energy production and power generation methods.”

According to the study, the most water-efficient energy sources are natural gas and synthetic fuels produced by coal gasification. The least water-efficient energy sources are fuel ethanol and biodiesel.

In terms of power generation, Younos and Hill have found that geothermal and hydroelectric energy types use the least amount of water, while nuclear plants use the most.

Hill took the study one step further and calculated how many gallons of water are required to burn one 60-watt incandescent light bulb for 12 hours a day, over the course of one year. She found that the bulb would consume between 3,000 and 6,000 gallons of water, depending on how water-efficient the power plant that supplies the electricity is.

Hill added that the results are estimates of the water consumption based on energy produced by fossil fuel thermoelectric plants, which produce most of the United States’ power – about 53 percent. “The numbers are even more staggering if you multiply the water consumed by the same light bulb by the approximately 111 million U.S. homes,” said Hill. “The water usage then gets as high as 655 billion gallons of water a year.”

By contrast, burning a compact fluorescent bulb for the same amount of time would save about 2,000 to 4,000 gallons of water per year.
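The bulb numbers above can be reproduced with simple arithmetic; the sketch below treats the article's 3,000-6,000 gallon range and 111 million homes as given, and derives the rest. The annual kWh figure and the implied gallons-per-kWh intensity are calculations added here, not values quoted in the article.

```python
# Reproducing the light-bulb water arithmetic from the article.
# A 60 W bulb burning 12 h/day for 365 days -> annual electricity use in kWh.
annual_kwh = 0.060 * 12 * 365          # = 262.8 kWh per year

# Article's reported water consumption range for that bulb (gallons per year).
low_gal, high_gal = 3000, 6000

# Implied water intensity of the supplying power plant (derived here, not quoted).
print(f"Implied intensity: {low_gal/annual_kwh:.0f}-{high_gal/annual_kwh:.0f} gallons per kWh")

# Scaling the upper end to ~111 million U.S. homes, one such bulb each,
# lands near the article's ~655 billion gallons per year figure.
homes = 111e6
print(f"National total: ~{high_gal * homes / 1e9:.0f} billion gallons per year")
```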

Younos noted that the results of this analysis should be interpreted with a grain of salt. “There are several variables such as geography and climate, technology type and efficiency, and accuracy of measurements that come into play. However, by standardizing the measurement unit, we have been able to obtain a unique snapshot of the water used to produce different kinds of energy.”


Mercury In River Moves Into Terrestrial Food Chain Through Spiders Fed To Baby Birds




ScienceDaily (Apr. 20, 2008) — Songbirds feeding near the contaminated South River are showing high levels of mercury, even though they aren’t eating food from the river itself, according to a paper published by William and Mary researchers in the journal Science.

Lead author Dan Cristol said his paper has wide-ranging international environmental implications. Mercury is one of the world’s most troublesome pollutants, especially in water. The South River, a major tributary of Virginia’s Shenandoah River, has been under a fish consumption advisory for years, as are some 3,000 other bodies of water in the U.S.

The paper shows high levels of mercury in birds feeding near, but not from, the South River. Cristol and his colleagues also identify the source of the pollutant—mercury-laden spiders eaten by the birds. The Science paper is one of the first, if not the first, to offer scientific documentation of the infiltration of mercury from a contaminated body of water into a purely terrestrial ecosystem.

“In bodies of water affected by mercury, it’s always been assumed that only birds or wildlife that ate fish would be in danger,” said Cristol, an associate professor in William and Mary’s Department of Biology. “But we’ve now opened up the possibility that mercury levels could be very high in the surrounding terrestrial habitat, as well. It’s not just about the fish, the people who eat the fish and the animals that eat the fish. We’ve also got to look at a strip of habitat all the way around the lake or river that is affected.”

Cristol and his co-authors, all students at the College of William and Mary, have been researching mercury impacts on birds along the South River for the past three years. The waters of the river were polluted with industrial mercury sulfate from around 1930 to 1950. He explained that mercury enters the food chain through a process called methylation, in which bacteria convert the mercury to a more potent form. The methylated mercury is passed up the food chain, becoming more concentrated in the bodies of larger animals through a phenomenon known as biomagnification.

Biomagnification of mercury in fish, fish-eating birds, and other animals has been studied extensively, but little attention has been paid to effects on animals that live near a contaminated river yet have no direct connection to its aquatic ecosystem. The researchers studied the food actually brought by songbird parents to their nestlings.

“The birds eat a lot of spiders. Spiders are like little tiny wolves, basically, and they’ll bioaccumulate lots of contaminants in the environment. The spiders have a lot of mercury in them and are delivering the mercury to these songbirds,” Cristol said. “The question that remains is this: How are the spiders getting their mercury?”

Cristol’s group is a part of the Institute for Integrated Bird Behavior Studies at William and Mary. Co-authors on the paper are master’s degree students Ariel E. White ’07, Rebecka L. Brasso ’07, Scott L. Friedman ’07 and Anne M. Condon ’08, along with undergraduates Rachel E. Fovargue ’09, Kelly K. Hallinger ’09 and Adrian P. Monroe ’08. Cristol and his group will continue their studies of the effect of mercury in the songbirds of the Shenandoah Valley, including an examination of the effects of the contaminant on the reproduction and lifespan of the birds.

Their paper appears in the April 18 issue of the journal Science.


Environmental Principles and Policies by Sharon Beder

Biodiversity banks, emissions trading, fishing quotas, water rights. A whole new suite of environmental policy instruments is being introduced in Australia and around the world. But can policies designed primarily to facilitate economic growth also protect the environment? Are they fair and equitable? Do they fit with the precautionary principle? Are they putting human rights at risk? These are the questions that Sharon Beder's latest book, Environmental Principles and Policies, sets out to answer.



Environmental Principles and Policies examines six key environmental and social principles that have been incorporated into international treaties and national laws. It uses them to evaluate the new wave of economic-based and market-based policy instruments that are currently being introduced in many nations.



This book differs from other texts on environmental policy-making as a result of its critical and interdisciplinary approach. Rather than merely setting out policies in a descriptive or prescriptive way, it analyses and evaluates policy options from a variety of perspectives. This enables students and general readers not only to gain a thorough grasp of important principles and current policies, but also to be able to apply the principles and critically evaluate them.

Ozone: Friend or Foe? -- Berardelli 2008 (424): 1 -- ScienceNOW

By Phil Berardelli
ScienceNOW Daily News
24 April 2008

The ozone layer protects all life on Earth, but it's frustrating scientists' attempts to curb global warming. Take geoengineering: Researchers have proposed that injecting sulfur particles into the stratosphere might counter the effects of greenhouse gas buildup, but a new study suggests that the approach could thin the planet's already fragile ozone layer. Leaving the ozone layer alone comes with its own risks, however. A second study warns that the gradual recovery of the Antarctic ozone hole could speed the continent's warming.

The sulfur strategy goes like this: Researchers release sulfur particles from high-flying aircraft or large balloons in an attempt to mimic volcanoes. That's ostensibly a good thing, because the large amounts of ash and sulfur dioxide volcanoes eject reduce Earth's absorption of sunlight, resulting in a cooling effect. In 1991, for example, a large eruption by Mount Pinatubo in the Philippines flung out thousands of metric tons of SO2 and cooled temperatures, albeit slightly and temporarily, around the world.

But don't load those sulfur carriers just yet, says atmospheric scientist Simone Tilmes of the National Center for Atmospheric Research in Boulder, Colorado. Tilmes and colleagues analyzed the chemical behavior of sulfur in the atmosphere, studied the effects of the Mount Pinatubo eruption, and modeled the potential impact of an attempted geoengineering effort. Adding sulfur to the atmosphere would spark chemical reactions that liberate chlorine, which destroys ozone, the team reports online today in Science. The effect, the researchers say, would reverse 2 decades of efforts to restore Earth's ailing ozone layer (ScienceNOW, 11 September 2007).

Physicist Paul A. Newman of NASA's Goddard Space Flight Center in Greenbelt, Maryland, says the paper is important because it identifies the relationship between atmospheric sulfur levels and "surprisingly large" ozone losses. "Nature provided us with an excellent 'geoengineering experiment' with the Mount Pinatubo eruption," he says. "We need to think long and hard before we experiment with the global climate," adds atmospheric scientist Uma Bhatt of the University of Alaska, Fairbanks. "We need more studies like [this] to assess the risk involved with geoengineering."

Meanwhile, a study conducted by researchers at the University of Colorado, Boulder, has concluded that a complete recovery by the Antarctic ozone hole--which developed in the 20th century because of the effects of ozone-destroying chemicals released into the atmosphere by human activity--could amplify warming in the Southern Hemisphere. Atmospheric scientist Judith Perlwitz and colleagues report in the 26 April issue of Geophysical Research Letters that if the ozone hole continues to recover, temperatures in the Antarctic stratosphere could rise as much as 9°C by the end of the century and contribute to global temperature increases.

A Cheap CO2 Trap

Crystals could capture greenhouse gases released by power plants

It's possible today to chemically capture carbon dioxide emitted by smokestacks. But the process is expensive and energy intensive, and it can inflate the cost of electricity produced from coal by 80 to 90 percent.

A new material could reduce that cost significantly. A group led by Omar Yaghi, a chemist at the University of California, Los Angeles, combined organic molecules and metal atoms to form highly porous crystals whose structure resembles that of industrial materials called zeolites. A liter of the UCLA crystals stores up to 80 liters of carbon dioxide. Yaghi's materials, which can be custom-made with different pore sizes and internal structures, have an electrostatic attraction to carbon dioxide, selectively trapping molecules of the gas inside their pores. The carbon dioxide can be released by a mere drop in pressure. Then it could be compressed and stored underground indefinitely, never entering the atmosphere.
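For a sense of what "80 liters of CO2 per liter of crystal" means by mass, the minimal sketch below applies the ideal-gas molar volume, assuming the 80 liters are measured at roughly standard temperature and pressure; the article does not specify the reference conditions, so that assumption is mine.

```python
# Rough mass equivalent of 80 L of CO2 stored per litre of the porous crystal.
# Assumes the gas volume is quoted at ~STP (22.4 L/mol), which the article does not specify.

MOLAR_VOLUME_L = 22.4      # litres per mole of an ideal gas at STP
MOLAR_MASS_CO2 = 44.0      # grams per mole of CO2

stored_volume_l = 80.0     # litres of CO2 per litre of crystal (from the article)
moles = stored_volume_l / MOLAR_VOLUME_L
grams = moles * MOLAR_MASS_CO2
print(f"~{grams:.0f} g of CO2 per litre of crystal")   # roughly 160 g under these assumptions
```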

Yaghi is continuing to develop versions of the material that could offer even better performance. The power industry will need to get involved to see what savings will result at real power plants, he says. But he adds that it should be possible within two to three years to test the materials under actual operating conditions.

https://www.technologyreview.com/Energy/20578/