Wednesday, August 12, 2009
Govt tries to douse drought panic
More than a quarter of the 620-plus districts face the threat of a drought this year, raising the spectre of brutal food price increases in the country by December.
Finance minister Pranab Mukherjee admitted Tuesday that sowing of rice was down 20 per cent because of an erratic monsoon. “We have declared 161 districts as drought-prone,” Mukherjee told reporters on the sidelines of a conference.
On Monday, the Met department had lowered its monsoon forecast for the second time since June. It said rain during June-September would be 87 per cent of the average against the 93 per cent it had forecast earlier — placing it on a par with the rainfall deficit in 2002, when the country last faced a drought.
But Mukherjee tried to douse panic by saying the government had a contingency plan, and referred to the country’s experience in tackling droughts. “There is no point in pressing the panic button. This country managed the century’s worst drought in 1987. We transported drinking water through railways. We organised fodder for the cattle. This country has the experience of handling the situation and I will advise not to press the panic button.”
Prime Minister Manmohan Singh exuded the same optimism and told a team of business leaders that his government would tackle inflation and had enough food stocks to deal with any shortages. “He (Singh) was quite confident that the government would be able to handle the food inflation,” Ficci secretary-general Amit Mitra said after leading a delegation for talks with the Prime Minister.
A flurry of meetings was taking place at Krishi Bhavan, which houses the agriculture ministry, and at North Block.
The drought-prone districts are not large crop producers, but the rain deficit raised fears that the kharif season could turn out as bad as 2004, when the summer crop fell by 12 per cent. Agronomists expect this year’s rice crop to be 12 million tonnes less than the 99.4 million tonnes produced last year. However, the country has large grain buffer stocks of over 52 million tonnes, including rice stocks of almost 20 million tonnes.
On Tuesday, the Prime Minister brought back close aide and noted economist C. Rangarajan to head the Prime Minister’s economic advisory council — a position Rangarajan gave up last October when he was nominated to the Rajya Sabha. Rangarajan is expected to guide the crisis management plans to tackle the drought and will also negotiate with the World Bank and the IMF for a greater voice for India. The reconstituted advisory council also took on board two agro-economists — Suman Bery, director-general of the National Council of Applied Economic Research, and V.S. Vyas, former director of IIM Ahmedabad.
Officials, however, admit that the big worry is over essential commodities like sugar, pulses and edible oil, whose prices have surged in recent months. India is the largest consumer of sugar and has already indicated that it will be importing both raw and white sugar this year.
China’s Incinerators Loom as a Global Hazard
In this sprawling metropolis in southeastern China stand two hulking brown buildings erected by a private company, the Longgang trash incinerators. They can be smelled a mile away and pour out so much dark smoke and hazardous chemicals that hundreds of local residents recently staged an all-day sit-in, demanding that the incinerators be cleaner and that a planned third incinerator not be built nearby.
After surpassing the United States as the world’s largest producer of household garbage, China has embarked on a vast program to build incinerators as landfills run out of space. But these incinerators have become a growing source of toxic emissions, from dioxin to mercury, that can damage the body’s nervous system.
And these pollutants, particularly long-lasting substances like dioxin and mercury, are dangerous not only in China, a growing body of atmospheric research based on satellite observations suggests: they float on air currents across the Pacific to American shores.
Chinese incinerators can be better. At the other end of Shenzhen from Longgang, no smoke is visible from the towering smokestack of the Baoan incinerator, built by a company owned by the municipal government. Government tests show that it emits virtually no dioxin and other pollutants.
But the Baoan incinerator cost 10 times as much as the Longgang incinerators, per ton of trash-burning capacity.
The difference between the Baoan and Longgang incinerators lies at the center of a growing controversy in China. Incinerators are being built to wildly different standards across the country and even across cities like Shenzhen. For years Chinese government regulators have discussed the need to impose tighter limits on emissions. But they have done nothing because of a bureaucratic turf war, a Chinese government official and Chinese incineration experts said.
The Chinese government is struggling to cope with the rapidly rising mountains of trash generated as the world’s most populated country has raced from poverty to rampant consumerism. Beijing officials warned in June that all of the city’s landfills would run out of space within five years.
The governments of several cities with especially affluent, well-educated citizens, including Beijing and Shanghai, are setting pollution standards as strict as Europe’s. Despite those standards, protests against planned incinerators broke out this spring in Beijing and Shanghai as well as Shenzhen.
Increasingly outspoken residents in big cities are deeply distrustful that incinerators will be built and operated to international standards. “It’s hard to say whether this standard will be reached — maybe the incinerator is designed to reach this benchmark, but how do we know it will be properly operated?” said Zhao Yong, a computer server engineer who has become a neighborhood activist in Beijing against plans for an incinerator there.
Yet far dirtier incinerators continue to be built in inland cities where residents have shown little awareness of pollution.
Studies at the University of Washington and the Argonne National Laboratory in Argonne, Ill., have estimated that a sixth of the mercury now falling on North American lakes comes from Asia, particularly China, mainly from coal-fired plants and smelters but also from incinerators. Pollution from incinerators also tends to be high in toxic metals like cadmium.
Incinerators play the most important role in emissions of dioxin. Little research has been done on dioxin crossing the Pacific. But analyses of similar chemicals have shown that they can travel very long distances.
A 2005 report from the World Bank warned that if China built incinerators rapidly and did not limit their emissions, worldwide atmospheric levels of dioxin could double. China has since slowed its construction of incinerators and limited their emissions somewhat, but the World Bank has yet to do a follow-up report.
Airborne dioxin is not the only problem from incinerators. The ash left over after combustion is laced with dioxin and other pollutants. Zhong Rigang, the chief engineer at the Baoan incinerator here, said that his operation sent its ash to a special landfill designed to cope with toxic waste. But an academic paper last year by Nie Yongfeng, a Tsinghua University professor and government adviser who sees a need for more incinerators, said that most municipal landfills for toxic waste lacked room for the ash, so the ash was dumped.
Trash incinerators have two advantages that have prompted Japan and much of Europe to embrace them: they occupy much less real estate than landfills, and the heat from burning trash can be used to generate electricity. The Baoan incinerator generates enough power to light 40,000 households.
And landfills have their own environmental hazards. Decay in landfills also releases large quantities of methane, a powerful global warming gas, said Robert McIlvaine, president of McIlvaine Company, an energy consulting firm that calculates the relative costs of addressing disparate environmental hazards. Methane from landfills is a far bigger problem in China than toxic pollutants from incinerators, particularly modern incinerators like those in Baoan, he said.
China’s national regulations still allow incinerators to emit 10 times as much dioxin as incinerators in the European Union; American standards are similar to those in Europe. Tightening of China’s national standards has been stuck for three years in a bureaucratic war between the environment ministry and the main economic planning agency, the National Development and Reform Commission, said a Beijing official who insisted on anonymity because he was not authorized to discuss the subject publicly.
The agencies agree that tighter standards on dioxin emissions are needed. They disagree on whether the environment ministry should have the power to stop incinerator projects that do not meet tighter standards, the official said, adding that the planning agency wants to retain the power to decide which projects go ahead.
Yan Jianhua, the director of the solid waste treatment expert group in Zhejiang province, a center of incinerator equipment manufacturing in China, defended the industry’s record on dioxin, saying that households that burn their trash outdoors emit far more dioxin.
“Open burning is a bigger problem according to our research,” Professor Yan said, adding that what China really needs is better trash collection so that garbage can be disposed of more reliably.
Critics and admirers of incinerators alike call for more recycling and reduced use of packaging as ways to reduce the daily volume of municipal garbage. Even when not recycled, sorted trash is easier for incinerators to burn cleanly, because the temperature in the furnace can be adjusted more precisely to minimize the formation of dioxin.
Yet the Chinese public has shown little enthusiasm for recycling. As Mr. Zhong, the engineer at the Baoan incinerator, put it, “No one really cares.”
G.E. Resumes Hudson Dredging, With Limits by E.P.A.
General Electric resumed dredging for contaminants in the upper Hudson River on Tuesday afternoon after shutting down operations in response to tests that showed that chemicals from the cleanup had traveled several miles downstream.
The Environmental Protection Agency, which had ordered the dredging halted on Friday, said that operations would restart in stages and initially be confined to only three of 11 sites on the river where the dredges had been operating. Additional dredging areas may be added pending another round of water sampling, said Kristen Skopeck, a spokeswoman for the agency.
Water tests conducted about five miles south of Fort Edward in Washington County, where most of the dredging is under way, showed that levels of the chemicals known as PCBs exceeded water quality standards. E.P.A. officials have asked General Electric, which is overseeing the cleanup, to find methods for confining the sediment disturbed by the dredging to each site and keeping PCB levels down elsewhere in the river.
The agency said that “enhanced engineering controls” would be incorporated at each dredge location to keep water from spilling out of the dredge buckets and back into the river.
The dredging operation, which began in May along a six-mile segment south of Fort Edward, is the first phase of a cleanup expected to last through 2015. (The current phase is expected to continue well into the fall.)
Two General Electric factories discharged PCBs for three decades beginning in the 1940s before PCBs were banned in 1977 as a health threat to people and wildlife. That led to the federal designation of nearly 200 miles of the river, from Hudson Falls, N.Y., to the southern tip of Manhattan, as a contaminated Superfund site.
Under the legislation that created the Superfund program, the responsible party, G.E., is required to supervise and pay for the cleanup. The company periodically posts updates on the operation at www.hudsondredging.com.
Another 34 miles of river, running to Troy, N.Y., are to be dredged in the project’s second phase.
Ms. Skopeck said the temporary shutdown was not expected to significantly delay the cleanup. Mark Behan, a spokesman for General Electric, said that heavy rains this summer had swelled the river’s flow and caused intermittent suspensions of some dredging already.
"It will have an effect in productivity,” he said of the shutdown. "But it’s too early to tell how much of an effect.” ownstream.
Stowaway insects imperil Darwin's finches
The famous Galapagos finches could be among the first casualties of mosquitoes that are stowing away on aircraft, potentially bringing fatal viruses to the islands.
Live mosquitoes captured in the holds of aircraft arriving on the Galapagos from mainland Ecuador were found to survive and breed on the islands. Although none of the captured mosquitoes carried lethal viruses such as the West Nile virus (WNV) – which decimated bird populations in the US after arriving in New York in 1999 – they have the potential to do so.
WNV has been reported in Colombia and Argentina, and could have reached Ecuador, says Simon Goodman of the University of Leeds, who co-led the research team. It is not only the finches that are at risk. "West Nile virus also affects reptiles and mammals, and so could impact other iconic Galapagos species such as marine iguanas and sea lions," Goodman says.
Wildlife threat
Goodman and his colleagues found 74 live insects after searching the holds of 93 aircraft landing on Baltra Island in the Galapagos. Of these, six were Culex quinquefasciatus mosquitoes, which transmit WNV and the parasite that causes bird malaria. Two more were caught in aircraft that landed on nearby San Cristobal.
"The consequences for wildlife could be severe," says Marm Kilpatrick of the University of California, Santa Cruz. The findings are probably an underestimate of the true numbers of mosquitoes arriving, he says.
By comparing genes from mosquitoes caught on the mainland with those on the Galapagos, the researchers were able to show that arrivals from Ecuador survive and breed with Galapagos mosquitoes.
Galapagos face ecological disaster due to tourism: study
Mosquitoes brought into the Galapagos on tourist planes and boats threaten to wreak "ecological disaster" in the islands, central to Darwin's theory of evolution, a study said Wednesday.
The insects can spread potentially lethal diseases in the archipelago off Ecuador's Pacific coast, used by Charles Darwin as the basis of his seminal work "On the Origin of Species by Means of Natural Selection".
"Few tourists realise the irony that their trip to Galapagos may actually increase the risk of an ecological disaster," said Simon Goodman of Leeds University, one of the study's co-authors.
"That we haven't already seen serious disease impacts in Galapagos is probably just a matter of luck."
The study found that the southern house mosquito, Culex quinquefasciatus, was regularly hitching rides on planes from the South American mainland, and island-hopping on tourist boats between the different islands.
Species threatened by diseases such as avian malaria or West Nile include the islands' best-known residents, its giant tortoises, as well as marine iguanas, sea lions and finches.
Arnaud Bataille, another researcher on the eight-page study, said: "On average the number of mosquitoes per aeroplane is low, but many aircraft arrive each day from the mainland in order to service the tourist industry."
Worse, "the mosquitoes seem able to survive and breed once they leave the plane," he added.
Goodman noted that Ecuador recently introduced a requirement for all aircraft flying to the Galapagos to have insecticide treatment, but said similar moves are needed for ships, and the impact needs to be evaluated.
"With tourism growing so rapidly, the future of Galapagos hangs on the ability of the Ecuadorian government to maintain stringent biosecurity protection for the islands," he said.
The study, co-authored by Leeds University, the Zoological Society of London, the University of Guayaquil, the Galapagos National Park and the Charles Darwin Foundation, was published in the journal Proceedings of the Royal Society, Britain's de facto academy of sciences.
Some 10,000 people, mostly fishermen, live on the volcanic Galapagos archipelago, which rose from the Pacific seabed 10 million years ago and became famous when Darwin visited to conduct research in 1835.
Tuesday, August 11, 2009
Vilsack at CU: Climate-change innovations create opportunity
U.S. Secretary of Agriculture Tom Vilsack praised those at a biochar conference Monday at the University of Colorado, calling them innovators who potentially could help fight climate change and even create new economic opportunities for farmers.
Vilsack gave the keynote address at the conference, which is the first major biochar gathering in the United States. Biochar, created when organic materials are burned in a low-oxygen environment, is touted as an environmentally friendly way to turn infertile soils into nutrient-rich dirt.
"These are the kinds of innovations I think we're going to see all over the country," Vilsack said.
He talked about his support for a "cap-and-trade" system to reduce carbon emissions, under which companies that produce more carbon emissions than allowed under a cap would have to buy carbon permits through a government auction. The proceeds could pay for new energy research.
Companies also could buy "carbon offsets" at a lower cost from farms, forests and other sources. It's those offsets that could create an economic opportunity for farmers and ranchers, Vilsack said, and biochar is an offset candidate.
He said other income-generating possibilities for farmers include biomass and biofuels.
"We're seeing more interest in renewable energy on the farms," he said.
Revitalizing rural America -- and making it possible to earn a good living through farming -- is a priority for his department, he said.
A recent survey showed that mid-sized family farms have declined in the last five years, he said, though there are about 100,000 more small vegetable, fruit and specialty product farms. One of his department's initiatives is a program to help create local supply chains for small farmers.
He also told the audience that, while the agriculture department's 2010 budget is light on money for research, there's a greater research emphasis in the 2011 budget.
Jim Amonette, a biochar conference attendee who works at the Pacific Northwest National Laboratory, said he was impressed by Vilsack's address.
"It's nice to have a secretary of agriculture who knows what biochar is and understands the role agriculture can play in energy," he said.
Deborah Martin, a Boulder research hydrologist with the U.S. Geological Survey, said she liked that Vilsack talked about the need for a worldwide approach to sustainability.
"He was terrific," she said. "He was well-educated and well-informed."
Climate change press coverage gets weird
For those of you not familiar with this period in Earth's history, the PETM (the Palaeocene–Eocene Thermal Maximum, roughly 55 million years ago) is a singular event in the Cenozoic (the last 65 million years). It was the largest and most abrupt perturbation to the carbon cycle over that whole period, defined by an absolutely huge negative isotope spike. Although there are smaller analogues later in the Eocene, the carbon flux that must have been brought into the ocean/atmosphere carbon cycle in that one event is on a par with the entire present-day reserve of conventional fossil fuels. A really big number – but exactly how big?
The story starts off innocently enough with a new paper by Richard Zeebe and colleagues in Nature Geoscience that tackles exactly this question. They use a carbon cycle model, tuned to conditions in the Paleocene, to constrain the amount of carbon that must have come into the system to cause both the sharp isotopic spike and a very clear change in the "carbonate compensation depth" (CCD) – the depth at which carbonates dissolve in sea water (a function of pH, pressure, total carbon amount etc.). There is strong evidence that the CCD rose hundreds of metres over the PETM – causing clear dissolution events in shallower ocean sediment cores. What Zeebe et al. come up with is that around 3000 Gt of carbon must have been added to the system – a significant increase on the original estimates of about half that much made a decade or so ago, though less than some high-end speculations.
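You can get a feel for where numbers like this come from via the classic isotope mass-balance argument. Here's a minimal sketch in Python – to be clear, this is not the Zeebe et al. model (which also uses the CCD constraint), and every input below is an illustrative assumption:

```python
# Back-of-envelope isotope mass balance for the PETM carbon release.
# Emphatically NOT the Zeebe et al. carbon cycle model; all inputs are
# illustrative assumptions, chosen only to show the shape of the argument.

M0 = 50_000.0        # GtC: assumed pre-event ocean-atmosphere carbon reservoir
d13C_initial = 2.0   # permil: assumed pre-event mean d13C of that reservoir
excursion = -3.0     # permil: assumed size of the negative isotope spike
d13C_source = -60.0  # permil: typical of biogenic methane (hydrate) carbon

d13C_final = d13C_initial + excursion

# Conservation of 13C: M0*d_initial + dM*d_source = (M0 + dM)*d_final
dM = M0 * (d13C_initial - d13C_final) / (d13C_final - d13C_source)
print(f"implied carbon release: {dM:.0f} GtC")  # ~2500 GtC with these inputs
```

Note how strongly the answer depends on the assumed source signature: a less depleted source such as ordinary organic carbon (around -25 permil) would need more than twice as much carbon to produce the same spike.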
Temperature changes at the same time as this huge carbon spike were large too. Note that this is happening on a Paleocene background climate that we don't fully understand either – the polar amplification in very warm paleo-climates is much larger than we've been able to explain using standard models. Estimates range from 5 to 9 deg C warming (with some additional uncertainty due to potential problems with the proxy data) – smaller in the tropics than at higher latitudes.
Putting these two bits of evidence together is where it starts to get tricky.
First of all, how much does atmospheric CO2 rise if you add 3000 GtC to the system in a (geologically) short period of time? Zeebe et al. did this calculation and the answer is about 700 ppmv – quite a lot, eh? However, that is a perturbation to the Paleocene carbon cycle – which they assume has a base CO2 level of 1000 ppm – and so you only get a 70% increase, i.e. not even a doubling of CO2. And since the forcing that goes along with an increase in CO2 is logarithmic, it is the percentage change in CO2 that matters rather than the absolute increase. The radiative forcing associated with that is about 2.6 W/m2. Unfortunately, we don't (yet) have very good estimates of background CO2 levels in the Paleocene. The proxies we do have suggest significantly higher values than today, but they aren't precise. Levels could have been less than 1000 ppm, or significantly more.
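Since the logarithmic point is exactly where readers tend to stumble, here's a minimal sketch using the common simplified forcing expression. It returns ~2.8 W/m2 rather than 2.6 (the precise value depends on the radiative transfer treatment used), but the instructive comparison is how much more forcing the same 700 ppm would represent on a low, modern-style baseline:

```python
import math

# Simplified CO2 radiative forcing, dF = 5.35 * ln(C/C0) (Myhre et al. 1998).
C0 = 1000.0      # ppm: the paper's assumed Paleocene background CO2
C = C0 + 700.0   # ppm: after adding ~3000 GtC, per Zeebe et al.

dF = 5.35 * math.log(C / C0)
print(f"CO2 up {100 * (C - C0) / C0:.0f}% -> forcing ~ {dF:.1f} W/m^2")  # ~2.8

# The same 700 ppm added to a 280 ppm baseline is a 3.5-fold increase, and
# the logarithm makes that a much bigger forcing (~6.7 W/m^2):
print(f"vs. 280 ppm baseline: {5.35 * math.log((280 + 700) / 280):.1f} W/m^2")
```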
If (and this is a key assumption that we'll get to later) this was the only forcing associated with the PETM event, how much warmer would we expect the planet to get? One might be tempted to use the standard 'Charney' climate sensitivity (2-4.5ºC per doubling of CO2) that is discussed so much in the IPCC reports. That would give you a mere 1.5-3ºC warming, which appears inadequate. However, this is inappropriate for at least two reasons. First, the Charney sensitivity is a quite carefully defined metric that is used to compare a certain class of atmospheric models. It assumes that there are no other changes in atmospheric composition (aerosols, methane, ozone) and no changes in vegetation, ice sheets or ocean circulation. It is not the warming we expect if we just increase CO2 and let everything else adjust.
In fact, the concept we should be looking at is the Earth System Sensitivity (a usage I am trying to get more widely adopted) as we mentioned last year in our discussion of 'Target CO2'. The point is that all of those factors left out of the Charney sensitivity are going to change, and we are interested in the response of the whole Earth System – not just an idealised little piece of it that happens to fit with what was included in GCMs in 1979.
Now for the Paleocene, it is unlikely that changes in ice sheets were very relevant (there weren't any to speak of). But changes in vegetation, ozone, methane and aerosols (of various sorts) would certainly be expected. Estimates of the ESS taken from the Pliocene, or from the changes over the whole Cenozoic, imply that the ESS is likely to be larger than the Charney sensitivity, since the vegetation, ozone and methane feedbacks are all amplifying. I'm on an upcoming paper that suggests a value about 50% bigger, while Jim Hansen has suggested a value about twice as big as Charney. That would give you an expected range of temperature increases of 2-5ºC (our estimate) or 3-6ºC (Hansen) (note that the uncertainty bands are increasing here, but the ranges are starting to overlap with the observations). All of this assumes that there are no huge non-linearities in climate sensitivity in radically different climates – something we aren't at all sure about either.
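Putting the forcing and sensitivity numbers together is simple arithmetic: the expected equilibrium warming is the sensitivity per doubling, scaled by the ratio of the forcing to the ~3.7 W/m2 of one CO2 doubling. A minimal sketch, using the ranges discussed above:

```python
# Expected equilibrium warming: dT = S * dF / F2x, where S is the sensitivity
# per CO2 doubling and F2x ~ 3.7 W/m^2 is the forcing of one doubling.
F2x = 3.7   # W/m^2 per CO2 doubling
dF = 2.6    # W/m^2: the CO2-only PETM forcing quoted above

for label, S_low, S_high in [
    ("Charney (2-4.5 C/doubling)",     2.0, 4.5),
    ("ESS ~50% bigger (our estimate)", 3.0, 6.75),
    ("ESS ~2x Charney (Hansen)",       4.0, 9.0),
]:
    print(f"{label:32s}: {S_low * dF / F2x:.1f} to {S_high * dF / F2x:.1f} C")

# Charney gives ~1.4-3.2 C -- well short of the observed 5-9 C warming;
# the larger Earth System Sensitivities start to overlap the observations.
```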
But let's go back to the first key assumption – that CO2 forcing is the only direct impact of the PETM event. The source of all this carbon has to satisfy two key constraints – it must be from a very depleted biogenic source and it needs to be relatively accessible. The leading candidate for this is methane hydrate – a kind of methane ice that is found in cold conditions and under pressure on continental margins – often capping large deposits of methane gas itself. Our information about such deposits in the Paleocene is sketchy to say the least, but there are plenty of ideas as to why a large outgassing of these deposits might have occurred (tectonic uplift in the proto-Indian ocean, volcanic activity in the North Atlantic, switches in deep ocean temperature due to the closure of key gateways into the Arctic etc.).
Putting aside the issue of the trigger though, we have the fascinating question of what happens to the methane that would be released in such a scenario. The standard assumption (used in the Zeebe et al paper) is that the methane would oxidise (to CO2) relatively quickly and so you don't need to worry about the details. But work that Drew Shindell and I did a few years ago suggested that this might not quite be true. We found that atmospheric chemistry feedbacks in such a circumstance could increase the impact of methane releases by a factor of 4 or so. While this isn't enough to sustain a high methane concentration for tens of thousands of years following an initial pulse, it might be enough to enhance the peak radiative forcing if the methane was being released continuously over a few thousand years. The increase in the case of a 3000 GtC pulse would be on the order of a couple of W/m2 – for as long as the methane was being released. That would be a significant boost to the CO2-only forcing given above – and enough (at least for relatively short parts of the PETM) to bring the temperature and forcing estimates into line.
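To see how a sustained release might give you 'a couple of W/m2', here is a rough steady-state sketch. The release duration, the lifetimes and the baseline methane level are all assumptions picked for illustration – none of them come from the Zeebe et al. or Shindell papers:

```python
import math

# Rough steady-state forcing from a sustained methane release.
# Every input is an illustrative assumption, not a value from the papers.
release_GtC = 3000.0     # total carbon released as CH4
duration_yr = 10_000.0   # assumed duration of the release
flux_Tg_per_yr = release_GtC / duration_yr * (16.0 / 12.0) * 1000.0  # Tg CH4/yr

lifetime_yr = 10.0 * 4   # yr: ~10 yr today, x4 for the chemistry feedback
burden_Tg = flux_Tg_per_yr * lifetime_yr   # steady-state extra CH4 burden
extra_ppb = burden_Tg / 2.75               # ~2.75 Tg CH4 per ppb

baseline_ppb = 700.0                       # assumed background CH4
M = baseline_ppb + extra_ppb

# Simplified CH4 forcing (Myhre et al. 1998), neglecting the N2O overlap term:
dF = 0.036 * (math.sqrt(M) - math.sqrt(baseline_ppb))
print(f"steady-state CH4 ~ {M:.0f} ppb -> extra forcing ~ {dF:.1f} W/m^2")
# ~2 W/m^2 with these inputs -- "a couple of W/m2" while the release lasts.
```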
Of course, much of this is speculative given the difficulty in working out what actually happened 55 million years ago. The press response to the Zeebe et al paper was, however, very predictable.
The problems probably started with the title of the paper, "Carbon dioxide forcing alone insufficient to explain Palaeocene–Eocene Thermal Maximum warming", which on its own might have been unproblematic. However, it was paired with a press release from Rice University titled "Global warming: Our best guess is likely wrong", containing the statement from Jerry Dickens that "There appears to be something fundamentally wrong with the way temperature and carbon are linked in climate models".
Since the know-nothings agree one hundred per cent with these last two statements, it took no time at all for the press release to get passed along by Marc Morano, posted on Drudge, and declared the final nail in the coffin for 'alarmist' global warming science on WUWT (Andrew Freedman at WaPo has a good discussion of this). The fact that what was really being said was that climate sensitivity is probably larger than produced in standard climate models seemed to pass almost all of these people by (though a few of their more astute commenters did pick up on it). Regardless, the message went out that 'climate models are wrong', with the implicit sub-text that current global warming is nothing to worry about – almost the exact opposite of the point the authors wanted to make (another press release, from U. Hawaii, was much better in that respect).
What might have been done differently?
First off, headlines and titles that simply confirm someone's prior belief (even if that belief is completely at odds with the substance of the paper) are a really bad idea. Many people do not go beyond the headline – they read it, they agree with it, they move on. One should also avoid truisms. All 'models' are indeed wrong – they are models, not perfect representations of the real world. The real question is whether they are useful – what do they underestimate? overestimate? are they sufficiently complete? Thus a much better, more specific title for the press release would have been "Global warming: Our best guess is likely too small" – and much less misinterpretable!
Secondly, a lot of the confusion is related to the use of the word 'model' itself. When people hear 'climate model', they generally think of the big ocean-atmosphere models run by GISS, NCAR or the Hadley Centre etc. for the 20th Century climate and for future scenarios. The model used in Zeebe et al was not one of these; instead it was a relatively sophisticated carbon cycle model that tracks the different elements of the carbon cycle, but not the changes in climate. The conclusions of the study relating to the sensitivity of the climate used the standard range of sensitivities from the IPCC TAR (1.5 to 4.5ºC for a doubling of CO2), which have been constrained – not by climate models – but by observed climate changes. Thus nothing in the paper related to the commonly accepted 'climate models' at all, yet most of the commentary made the incorrect association.
To summarise, there is still a great deal of mystery about the PETM – the trigger, where the carbon came from and what happened to it – and the latest research hasn't tied up all the many loose ends. Whether the solution lies in something 'fundamental', as Dickens surmises (possibly related to our basic inability to explain the latitudinal gradients in any of the very warm climates), or whether it's a combination of a different forcing function and more inclusive ideas about climate sensitivity, is yet to be determined. However, we can all agree that it remains a tantalisingly relevant episode of Earth history.