UN urges all countries to sign climate accord
The UN secretary-general on Monday urged all countries to formally sign on to the Copenhagen Accord to start tackling climate change and step up work toward a legally binding treaty in 2010.
Ban Ki-moon also urged richer nations to contribute to a multi-billion-dollar fund, due to become operational in January, to help poorer countries cope with global warming.
Robert Orr, the UN policy coordination chief, said the document will shortly be opened for signatures from all countries.
"I urge all governments to formally sign on to the Copenhagen accord by registering their support" through the United Nations Framework Convention on Climate Change, Ban said.
"The faster we have all the signatures, the more momentum we can give it," he said.
Ban said the UN will seek to streamline the negotiating process, which was strongly criticized, ahead of the next UN climate conference in Mexico City in 2010.
He said he had already discussed ways to improve negotiations with Mexican President Felipe Calderon and is willing to discuss the issue with other world leaders, opinion makers, and civic leaders.
Ban said he will encourage world leaders "to directly engage in achieving a global legally binding climate change treaty in 2010."
The UN chief also urged countries to contribute "to ensure that the Copenhagen Green Climate Fund becomes fully operational as soon as possible."
Under the accord, developed countries will finance a three-year, 10-billion-dollar-a-year program starting in 2010 to fund developing nations' projects to deal with drought, floods and other impacts of climate change, and to develop clean energy.
It also set a "goal" of mobilizing 100 billion dollars a year by 2020 for the same purposes.
China attaches great importance to tackling climate change.
In 2007, it established the National Leading Group on Climate Change (NLGCC), headed by Premier Wen Jiabao.
That same year, China issued its National Climate Change Program, the first ever by a developing country.
In that program, China set an objective of lowering its energy consumption per unit of GDP by roughly 20% from the 2005 level by 2010. In its Mid- and Long-Term Plan for the Development of Renewable Energy, China also set objectives of increasing the proportion of renewable energy in the primary energy mix to 10% by 2010 and to 15% by 2020.
To achieve such objectives, China has adopted a series of effective policies and measures, achieving remarkable progress.
Firstly, China succeeded in lowering its energy consumption per unit of GDP by 1.79%, 4.04% and 4.59% respectively for 2006, 2007, and 2008, which strongly suggests the prospect of meeting the 20% objective by 2010.
Secondly, between 2006 and 2008, China shut down small thermal power-generation units with a total installed capacity of 34.21 GW, phased out 60.59 million tons of backward steel-making capacity, 43.47 million tons of iron-smelting capacity and 140 million tons of cement-production capacity.
All of these steps reduced pollution markedly.
Thirdly, between 2000 and 2008, China increased its wind power generating capacity from 340 MW to 10 GW, hydropower from 79.35 GW to 163 GW, and nuclear power from 2.1 GW to 9.1 GW.
It has also made great efforts to reduce agricultural and rural greenhouse gas emissions.
Indeed, by the end of 2007, more than 26.5 million rural households were using household biogas digesters, thereby avoiding CO 2 emissions by 44 million tons.
Fourthly, China has increased its carbon sinks by promoting reforestation.
China's forest coverage rate increased from 12% in the early 1980s to 18.21% today.
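The year-by-year energy-intensity reductions cited above (1.79%, 4.04% and 4.59%) compound multiplicatively, so the cumulative progress toward the 20% target can be checked directly. A minimal sketch, using only the percentages from the text:

```python
# Annual reductions in energy consumption per unit of GDP (from the text).
annual_cuts = {2006: 0.0179, 2007: 0.0404, 2008: 0.0459}

# Reductions compound: each year's intensity is (1 - cut) times the
# previous year's, so multiply the remaining fractions together.
remaining = 1.0
for year in sorted(annual_cuts):
    remaining *= 1 - annual_cuts[year]

cumulative_cut = 1 - remaining
print(f"Cumulative reduction 2006-2008: {cumulative_cut:.1%}")  # ~10.1%

# Average annual cut still needed in 2009 and 2010 to reach 20% overall.
needed = 1 - (0.80 / remaining) ** 0.5
print(f"Required average annual cut, 2009-2010: {needed:.1%}")  # ~5.7%
```

The arithmetic shows why the text can only say the figures "strongly suggest" meeting the target: the remaining two years would each need a cut somewhat larger than any single year achieved so far.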
This year, China will complete the formulation of provincial climate change programs throughout the country, promoting effective implementation of the National Climate Change Program.
Furthermore, in China's economy stimulus package, 210 billion yuan is allocated for energy conservation, pollutants reduction, and ecosystem protection projects, 370 billion yuan for economic structural adjustment and technology renovation, and 400 billion yuan for new energy-efficient housing that will use environmentally friendly materials.
In addition, 370 billion yuan will be used to improve rural living standards in an environmentally sound and sustainable way.
China is making huge efforts to combat climate change despite the fact that it remains a low-income developing country with a per-capita GDP of just about $3,000.
Indeed, by United Nations standards, China still has 150 million people living in poverty.
China has no other choice but to pursue sustainable development in order to meet the basic needs of its people and to eradicate poverty.
In this process, the world is assured that China will make every effort to address climate change.
The international community has great expectations for reaching a positive outcome in Copenhagen.
In China's view, the key to the success in Copenhagen lies in the realization of the full, effective and sustained implementation of the Convention and its Kyoto Protocol (KP).
Developed country Parties to the KP, collectively, must reduce their greenhouse gas emissions by at least 25-40% below their 1990 level by 2020.
Non-KP developed countries should undertake comparable commitments with quantified emission reduction targets.
Developed countries should also fulfill their obligations under the Convention to provide financial support and technology transfer to enable developing countries to effectively tackle climate change.
In addition, appropriate mechanisms and institutional arrangements should be established for adaptation, financial support and technology transfer.
Developing countries will, in the context of sustainable development and with measurable, reportable, and verifiable support in terms of financing, technology, and capacity-building, take nationally appropriate mitigation actions.
The global financial crisis has undoubtedly exacerbated the challenge of climate change.
But since climate change is a more far-reaching and serious challenge, the world must not waver in its determination and commitment to addressing it.
Indeed, the international financial crisis, if handled properly, may also be turned into an opportunity to reach a win-win solution for both climate protection and economic development.
With a deep sense of responsibility for its own people and the entire human race, China will continue to implement proactive policies and measures to address climate change and make unremitting efforts to protect the Earth's ecosystem.
South Africa blasts Copenhagen failure
South Africa says Copenhagen's failure to produce a legally binding climate change agreement was unacceptable, joining a global chorus of condemnation even though it helped draft the final accord.
South Africa's environment minister Buyelwa Sonjica and her two top climate change negotiators said Tuesday that part of the blame rested with the way the host guided the conference.
In their first media briefing since returning from talks in the Danish capital that ended Saturday, the trio described an atmosphere of distrust and suspicion that Denmark was plotting to force its own position on other nations.
In the end, South African negotiator Joanne Yawitch said, the Danes unveiled an 11th-hour draft that was "seriously problematic".
She said negotiators edited late into the night and came up with a document South Africa found more balanced, but that she felt substantive changes were unwelcome.
Her fellow negotiator Alf Wills said the resulting agreement was limited not only in terms of what it did to save the planet, but in the number of nations that accepted it, saying it did not extend beyond the 28 represented at the late-night negotiations.
Sonjica said substantive talks were hijacked by debates over how to handle the process.
"Process is important, since it determines outcomes, but some ill-restrained interventions combined with poor decisions by those guiding the process meant that process problems caused the loss of three days — precious time indeed," Sonjica said.
Copenhagen's outcome was "not acceptable. It's definitely not acceptable. It's disappointing," Sonjica said.
South Africa, along with the US, India, Brazil and China, drafted the climate change agreement reached in Denmark.
The compromise calls for reducing emissions to keep temperatures from rising more than two degrees Celsius (3.6 F) above preindustrial levels. The nonbinding agreement also calls on rich nations to spend billions to help poor nations deal with drought and other impacts of climate change, and to develop clean energy.
Sonjica said South African President Jacob Zuma had discussed with other African leaders whether the talks should be abandoned, but it was decided it would be better to continue to try to influence the talks from inside.
"And maybe what we have now would have been worse" had there been a walk out, she said.
A working laboratory for energy technologies
An eighth-place ranking may not sound like all that much, but there are reasons to look more closely at the Danish example. A top placement among the world's most energy-efficient and climate-friendly economies has been achieved despite the fact that Denmark does not have any hydroelectric power resources worth mentioning, nor the large forest areas that typically form the basis for a large part of a country's production of renewable energy.
Neither does Denmark use nuclear power, which is a large source of CO2-free energy in other countries in the same group.
Energy efficiency in Denmark has been created by a range of new technologies and solutions, and this can today serve as an example of how one can create a high level of growth without a corresponding increase in energy consumption or greenhouse gas emissions.
The means to achieve this has partly been a strong political focus on energy policy.
Denmark was one of the first countries to set out detailed plans for developing the energy sector back in the 1970s. Added to this has been the strong commitment of the Danish business sector to developing – and using – energy-efficient solutions. The windmill industry is the best-known example of this, but there is much more. A common-sense approach to energy-efficient measures such as insulating houses and cost savings in production has gone hand in hand with high-tech solutions for the whole society.
For example, there is an electricity supply system that can handle the fact that windmills supply, in some periods, more than 100 per cent of the energy required, and in other periods supply nothing at all.
And it can do this in a competitive manner.
The last factor is the strong focus on energy saving and a secure energy supply, which has been the case since the oil crises in the 1970s. In 1985 the Danish parliament (Folketinget) rejected nuclear power and opted to focus on new, sustainable sources of energy.
Denmark in 2009 is in many ways a dynamic, working laboratory for the meeting of new energy technologies and old common sense in its relationship with nature.
What is the greenhouse effect and global warming?
The most recent assessment report from the Intergovernmental Panel on Climate Change (IPCC) says that the earth's average temperature rose by 0.74 degrees Celsius in the period from 1906 to 2005, and that the average temperature will continue to rise.
The greenhouse effect is a natural mechanism that retains the heat emitted from the earth's surface.
The earth's average temperature is at the moment around 14 degrees celsius (57 degrees fahrenheit). If the natural greenhouse effect did not exist, the average temperature would be around minus 19 degrees celsius (minus 2 degrees fahrenheit).
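The figures quoted above are easy to verify; a quick sketch converting between the two scales and taking the difference, using only the numbers in the text:

```python
def c_to_f(celsius):
    """Convert degrees Celsius to degrees Fahrenheit."""
    return celsius * 9 / 5 + 32

with_greenhouse = 14      # current average surface temperature, deg C
without_greenhouse = -19  # estimated average without the natural greenhouse effect

print(f"{with_greenhouse} C = {c_to_f(with_greenhouse):.0f} F")        # 57 F
print(f"{without_greenhouse} C = {c_to_f(without_greenhouse):.0f} F")  # -2 F

# The natural greenhouse effect thus accounts for roughly 33 degrees
# Celsius of warming at the surface.
print(f"Natural greenhouse warming: {with_greenhouse - without_greenhouse} C")
```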
The greenhouse effect is caused by a range of different gases in the earth's atmosphere.
Water vapour makes the most significant contribution to the greenhouse effect, followed by CO2.
The atmospheric content of greenhouse gases – in particular CO2 – and the consequences for the climate are being discussed because the concentration of these gases in the atmosphere has risen precipitously over approximately the last 250 years, and especially over the last 50.
At present the concentration of CO2 in the atmosphere is about 385 ppm (parts per million). Before industrialization it was about 280 ppm.
Analyses of air contained in ice from the Antarctic ice cap show that there is far more CO2 in the air today than at any time in the last 650,000 years.
The consequence is that the greenhouse effect is becoming stronger, and therefore the earth is becoming warmer. How much warmer has, however, been a matter of dispute.
The most recent assessment report from the IPCC is from 2007.
It concludes that the earth's average temperature has risen by 0.74 degrees in the period from 1906 to 2005.
The warming is stronger over land areas than over the sea, and accordingly it is strongest in the northern hemisphere.
At the same time occurrences of heat waves and violent downpours have also increased, the oceans have risen, and the ice at the world's poles and on its mountains has begun to melt.
All of these effects are predictable in the event of global warming.
The IPCC's most recent assessment report concludes that the average temperature will continue to rise, but that the extent and the duration of this rise, and the severity of its consequences, depend on how quickly and how effectively emissions of greenhouse gases can be restricted and, over time, reduced.
Danish Cleantech Solutions - driving export and attracting foreign investments
Denmark is in a unique position to help the world mitigate and adapt to climate change as well as to handle other environmental challenges.
Hosting COP15, Denmark brings not only ambitions and political will to the global scene, but also solid proof that caring for the environment and the climate does not contradict aspirations for growth and welfare.
And Denmark is capable of providing a wide range of solutions and technologies that the world needs in order to address climate change.
Since the oil crisis of 1973, Denmark has converted production processes and facilities from being solely fossil-fuel based to drawing on more diverse sources of energy.
Simultaneously, energy efficiency measures have been developed in order to cut down energy consumption and CO2 emissions. Add to this combined heat and power plants, district heating, waste and waste water management.
This, and the fact that Danish environmental policy is integrated into all major policy sector objectives, puts Denmark among the world leaders in developing and commercializing cleantech technologies. Read more about "The Danish Example".
Danish companies and organizations provide energy and cleantech products and technologies, as well as knowledge and consultancy.
Together, they are capable of providing world class end-to-end solutions based on quality and efficiency.
Danish climate solutions are elaborated in publications co-produced by the Scandinavian think tank Monday Morning and Invest in Denmark, the official Danish investment promotion agency under the Danish Ministry of Foreign Affairs. The publications aim to portray the wide range of opportunities and world-class competencies within renewable energy and climate solutions that Denmark offers. Furthermore, they illustrate how Denmark has become one of the world leaders in the market for cleantech solutions.
Danish Cleantech Solutions has become a key export commodity for Denmark.
Energy equipment in particular has become an important driver of Danish export growth in recent years.
Danish exports of energy technology and equipment have more than tripled in the last decade, outperforming most other Danish export items. Danish expertise in wind energy is second to none, and Danish companies are still market leaders in wind power technology.
In 2007, the wind power sector alone contributed 6.5 % of total Danish exports and the share is still growing.
The Trade Council of Denmark and Danish diplomatic missions abroad support the export and other globalization efforts of Danish companies. In 2008 and 2009, the Trade Council of Denmark is dedicating a special, focused promotion to Danish cleantech solutions. 17 markets have been singled out for targeted efforts:
Brazil, Canada, China, France, Germany, India, Italy, Japan, Mexico, Poland, Russia, Spain, South Africa, South Korea, UK, Ukraine, and the US.
In general, Danish diplomatic missions stand ready to assist foreign companies interested in Danish solutions. For contact information in your country, please link to " Danish Missions Abroad ".
Danish Cleantech Solutions is not only interesting as an export commodity.
The Danish cleantech clusters offer a unique co-existence of manufacturers, sub-suppliers, research and educational institutions, consultants, and power companies using state-of-the-art cleantech solutions. This makes Denmark a very interesting target for foreign investors interested in cleantech.
For more information on Danish Cleantech Clusters and investment in Danish technologies please link to Invest in Denmark .
Invest in Denmark stands ready to assist foreign investors.
For foreign companies looking for investment opportunities or interested in learning more about Danish cleantech solutions, 2009 offers a special opportunity:
it is possible to order a tailor-made tour of energy installations, companies, and national and local authorities. The Danish Climate Consortium has overall responsibility for the Energy Tours.
Atmosphere Changes
The release of greenhouse gases and aerosols resulting from human activities is changing the amount of radiation coming into and leaving the atmosphere, likely contributing to changes in climate.
Greenhouse Gases
Greenhouse gas concentrations in the atmosphere have historically varied as a result of many natural processes (e.g. volcanic activity, changes in temperature, etc). However, since the Industrial Revolution humans have added a significant amount of greenhouse gases to the atmosphere by burning fossil fuels, cutting down forests and other activities. Because greenhouse gases absorb and emit heat, increasing their concentrations in the atmosphere will tend to have a warming effect.
But the rate and amount of temperature increase is not known with absolute certainty.
Changes in the atmospheric concentration of the major greenhouse gases are described below:
Figure 1:
Atmospheric Concentrations of Carbon Dioxide in Geologic Time and in Recent Years:
This diagram is in three sections. The first, using data from ice cores in Antarctica, shows CO2 concentrations from 647,426 B.C. to 337 B.C., with a clear cyclical pattern of peaks and valleys. The second, using data from other Antarctic ice cores, shows CO2 concentrations from 8947 B.C. to 1975 A.D. The diagram shows a slight upward trend in concentrations until the 20th century, when they shoot up rapidly.
The third section, using data from CO2 monitoring stations around the world, shows CO2 concentrations from 1959 to 2006.
The trend shows a steady increase in concentrations from about 320 ppm in 1959 to approximately 380 ppm in 2006.
Carbon dioxide (CO2) concentrations in the atmosphere increased from approximately 280 parts per million (ppm) in pre-industrial times to 382 ppm in 2006 according to the National Oceanic and Atmospheric Administration's (NOAA) Earth Systems Research Laboratory, a 36 percent increase.
Almost all of the increase is due to human activities (IPCC, 2007). The current rate of increase in CO2 concentrations is about 1.9 ppmv/year. Present CO2 concentrations are higher than any time in at least the last 650,000 years (IPCC, 2007). See Figure 1 for a record of CO2 concentrations from about 420,000 years ago to present.
For more information on the human and natural sources of CO2 emissions, see the Emissions section and for actions that can reduce these emissions, see the What You Can Do Section.
Figure 2:
Atmospheric Concentrations of Methane in Geologic Time and in Recent Years:
This diagram is in three sections. The first, using data from ice cores in Antarctica and Greenland, shows methane concentrations from 648,679 B.C. to 346 B.C. Concentrations during the period vary widely, from as high as 800 ppb to as low as less than 100 ppb.
The second, using data from other ice cores, shows methane concentrations from 8945 B.C. to 1980 A.D. The diagram shows a relatively flat trend in concentrations until the 20th century, when they shoot up rapidly.
The third section, using data from several atmospheric monitoring stations around the world, shows methane concentrations from 1985 to 2001.
The trend shows an increase in concentrations during most of the period, with an apparent leveling off in the later years.
Methane (CH4) is more abundant in the Earth's atmosphere now than at any time in at least the past 650,000 years (IPCC, 2007). Methane concentrations increased sharply during most of the 20th century and are now 148% above pre-industrial levels. In recent decades, the rate of increase has slowed considerably (see Figure 2). For more information on CH4 emissions and sources, and actions that can reduce emissions, see EPA's Methane Site.
Figure 3:
Atmospheric Concentrations of Nitrous Oxide in Geologic Time and in Recent Years:
This diagram is in three sections. The first, using data from ice cores in East Antarctica and Greenland, shows nitrous oxide concentrations from 104,301 B.C. to 1871 A.D. Concentrations during the period varied widely, ranging from 180 ppb to more than 280 ppb, with an upward trend toward the end of the period.
The second, using data from a variety of sources, shows nitrous oxide concentrations from 9000 B.C. to 1976 A.D. The diagram shows a relatively flat trend in concentrations until the 20th century, when they shoot up rapidly.
The third section, using data from several atmospheric monitoring stations around the world, shows nitrous oxide concentrations from 1977 to 2005.
The trend shows a steady increase in concentrations, rising from around 300 ppb in 1977 to 320 ppb in 2005.
Nitrous oxide (N2O) has increased approximately 18 percent in the past 200 years and continues to increase (see Figure 3). For about 11,500 years before the industrial period, the concentration of N2O varied only slightly.
It increased relatively rapidly toward the end of the 20th century (IPCC, 2007). For more information on N2O emissions and sources, see EPA's Nitrous Oxide Site .
How are Greenhouse Gas Concentrations from Thousands of Years Ago Determined?
Portions of the Antarctic ice sheet are several miles deep, consisting of ice that has accumulated over hundreds of thousands of years or longer. Paleoclimatologists (scientists who study the history of the Earth's climate) drill holes in this ice to extract what are called "cylindrical cores," or "ice cores."
Ice cores can provide valuable information about the Earth's past.
For example, the cores contain trapped air bubbles that can be analyzed to obtain snapshots of the composition of the atmosphere at the time the ice accumulated.
Through this analysis, concentrations of greenhouse gases (CO2, CH4, N2O) dating back thousands of years or longer can be obtained with a high level of confidence.
See the National Aeronautics and Space Administration's (NASA) Earth Observatory feature "Paleoclimatology: The Ice Core Method" for more information.
* Tropospheric ozone (O3) is created by chemical reactions from automobile, power plant and other industrial and commercial source emissions in the presence of sunlight.
It is estimated that O3 has increased by about 36% since the pre-industrial era, although trends vary substantially by region (IPCC, 2007). Besides being a greenhouse gas, ozone can also be a harmful air pollutant at ground level, especially for people with respiratory diseases and for children and adults who are active outdoors. Measures are being taken to reduce ozone emissions in the U.S. (through the Clean Air Act) and in other countries.
* Chlorofluorocarbons (CFCs) and hydrochlorofluorocarbons (HCFCs) are used in coolants, foaming agents, fire extinguishers, solvents, pesticides and aerosol propellants. These compounds have steadily increased in the atmosphere since their introduction in 1928.
Concentrations are slowly declining as a result of their phaseout via the Montreal Protocol on Substances that Deplete the Ozone Layer.
* Fluorinated gases such as hydrofluorocarbons (HFCs), perfluorocarbons (PFCs), and sulfur hexafluoride (SF6) are frequently used as substitutes for CFCs and HCFCs and are increasing in the atmosphere.
These various fluorinated gases are sometimes called "high global warming potential greenhouse gases" because, molecule for molecule, they trap more heat than CO2.
For more information, visit EPA's High Global Warming Potential Gases Site.
Aerosols
The burning of fossil fuels and biomass (living matter such as vegetation) has resulted in aerosol emissions into the atmosphere.
Aerosols absorb and emit heat, reflect light and, depending on their properties, can either cool or warm the atmosphere. NASA's Earth Observatory describes how aerosols can also affect how clouds form.
* Sulfate aerosols are emitted when fuel containing sulfur, such as coal and oil, is burned.
Sulfate aerosols reflect solar radiation back to space and have a cooling effect.
Concentrations of these aerosols have decreased over the past two decades as a result of efforts to reduce sulfur dioxide emissions from coal-fired power plants in the United States and other countries.
* Black carbon (or soot) results from the incomplete combustion of fossil fuels and biomass burning (forest fires and land clearing) and is believed to contribute to global warming (IPCC, 2007). Though global concentrations are likely increasing, there are significant regional differences. In the United States and many other countries, efforts to reduce particulate matter (of which black carbon is a part) are lowering black carbon concentrations.
* Other aerosols emitted in small quantities from human activities include organic carbon and associated aerosols from biomass burning.
Mineral dust aerosols (e.g., from deserts and lake beds) largely originate from natural sources, but their distribution can be affected by human activities.
Radiative Forcing
Radiative forcing is the change in the balance between solar radiation entering the atmosphere and the Earth's radiation going out.
On average, a positive radiative forcing tends to warm the surface of the Earth while negative forcing tends to cool the surface.
Radiative forcing is measured in Watts per square meter, a measure of the rate of energy transfer per unit area.
For example, an increase in radiative forcing of +1 Watt per square meter is like shining one small holiday tree light bulb over every square meter of the Earth.
Greenhouse gases have a positive radiative forcing because they absorb and emit heat.
Aerosols can have a positive or negative radiative forcing, depending on how they absorb and emit heat and/or reflect light.
For example, black carbon aerosols - which have a positive forcing - more effectively absorb and emit heat than sulfates, which have a negative forcing and more effectively reflect light.
The following are estimates of the change in radiative forcing in the year 2005 relative to 1750 for different components of the climate (IPCC, 2007):
* The radiative forcing contribution (since 1750) from increasing concentrations of well-mixed greenhouse gases (including CO2, CH4, N2O, CFCs, HCFCs, and fluorinated gases) is estimated to be +2.64 Watts per square meter - over half due to increases in CO2 (+1.66 Watts per square meter), strongly contributing to warming relative to other climate components described below.
* The radiative forcing contribution from increasing tropospheric ozone, an unevenly distributed greenhouse gas, is estimated to be +0.35 Watts per square meter (on average), resulting in a relatively small warming effect.
This forcing varies from region to region depending on the amount of ozone in the troposphere at a particular location.
* The radiative forcing contribution from the observed depletion of stratospheric ozone is estimated to be -0.05 Watts per square meter, resulting in a relatively small cooling effect.
* While aerosols can have either positive or negative contributions to radiative forcing, the net effect of all aerosols added to the atmosphere has likely been negative.
The best estimate of aerosols' direct cooling effect is -0.5 Watts per square meter; the best estimate of their indirect cooling effect (by increasing the reflectivity of clouds) is -0.7 Watts per square meter, with an uncertainty range of -1.8 to -0.3 Watts per square meter. Therefore, the net effect of changes in aerosol radiative forcing has likely been a cooling effect, ranging from small to relatively large.
* Land use change (including urbanization, deforestation, reforestation, desertification, etc) can have significant effects on radiative forcing (and the climate) at the local level by changing the reflectivity of the land surface (or albedo). For example, because farmland is more reflective than forests (which are strong absorbers of heat), replacing forests with farmland would negatively contribute to radiative forcing or have a cooling effect.
Averaged over the Earth, the net radiative forcing contribution of land use changes, while uncertain, is estimated to be -0.2 Watts per square meter (IPCC, 2007), resulting in a relatively small cooling effect.
* Based on a limited, 25-year record, the effect of changes in the sun's intensity on radiative forcing is estimated to be relatively small, or a contribution of about +0.12 Watts per square meter, resulting in a relatively small warming effect.
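The component estimates in the list above can be summed for a rough net forcing. A minimal sketch using only the central values quoted in the list (the uncertainty ranges are ignored here, so this is an illustration, not a formal IPCC result):

```python
# Central estimates of radiative forcing in 2005 relative to 1750,
# in Watts per square meter, as quoted in the list above.
forcings = {
    "well-mixed greenhouse gases": +2.64,
    "tropospheric ozone":          +0.35,
    "stratospheric ozone":         -0.05,
    "aerosols (direct)":           -0.50,
    "aerosols (indirect)":         -0.70,
    "land use change":             -0.20,
    "solar intensity":             +0.12,
}

# Positive terms warm the surface, negative terms cool it; the net is
# dominated by the well-mixed greenhouse gases.
net = sum(forcings.values())
print(f"Net forcing from these components: {net:+.2f} W/m^2")  # +1.66
```

The sum works out to about +1.66 Watts per square meter: the aerosol cooling offsets a substantial part of, but far from all of, the greenhouse-gas warming.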
NOAA's Annual Greenhouse Gas Index (AGGI), which tracks changes in radiative forcing from greenhouse gases over time, shows that radiative forcing from greenhouse gases has increased 21.5% since 1990 as of 2006.
Much of the increase (63%) has resulted from the contribution of CO2.
The contribution to radiative forcing by CH4 has been nearly constant in recent years, while that of CFCs has been declining.
Electric cars in Denmark
27-08-2009
Electric cars can significantly contribute to reducing the use of fossil fuels.
That is why the Danish government supports the market penetration of electric cars in Denmark.
More electric cars will bring three important gains in particular:
1. We reduce the use of fossil fuels in a sector which is not governed by the European emissions trading system.
2. Electric cars are far more energy efficient than cars powered by gasoline or diesel.
These cars drive more than three times as many miles "per gallon".
3. Electric cars can benefit from the production of renewable energy - not least wind power production - which is one of the key competencies in Denmark.
This supports the government's vision of sustainable transport.
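The "three times as many miles per gallon" claim in point 2 can be sanity-checked with a back-of-the-envelope calculation. The specific figures below are illustrative assumptions, not from the text: a typical gasoline car at 30 miles per gallon, an electric car at 4 miles per kWh, and the approximate energy content of a gallon of gasoline (about 33.7 kWh):

```python
# Illustrative assumptions (not from the text above):
gasoline_car_mpg = 30.0   # assumed efficiency of a typical gasoline car
ev_miles_per_kwh = 4.0    # assumed efficiency of a typical electric car
kwh_per_gallon = 33.7     # approximate energy content of a gallon of gasoline

# Miles an electric car travels on the energy contained in one gallon.
ev_mpg_equivalent = ev_miles_per_kwh * kwh_per_gallon
ratio = ev_mpg_equivalent / gasoline_car_mpg

print(f"EV equivalent: {ev_mpg_equivalent:.0f} miles per gallon of energy")
print(f"Efficiency ratio vs gasoline car: {ratio:.1f}x")  # well above 3x
```

Under these assumptions the electric car covers roughly 135 miles on a gallon's worth of energy, comfortably more than three times the gasoline car's range.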
Initiatives of the Danish government
CO2 emissions from transport have grown markedly in recent years.
This is why it is crucial that the transport sector becomes an active player in bringing about new solutions for reducing CO2-emissions.
The high level of CO2-emissions from transport is one of the reasons why the Danish government on January 29th this year consolidated an agreement on denmarks green transport policy, which will ensure a reduction in the CO2-emissions from the transport sector.
The settlement includes several concrete initiatives, but the central one as regards the market penetration of electric cars is a green reorganisation og the taxation on cars, so it is still economically sensible to acquire an electric car.
In February 2008 the Danish government reached an Energy Agreement, which also featured a test scheme for electric vehicles.
The test scheme is to generate new specific and practical experience with electric cars and the required infrastructure.
The test scheme may also help shed light on the opportunities for integrating electric cars as a flexible storage facility into the Danish electricity system, and this may both help optimize energy exploitation and help adapt the system to the fluctuating wind power.
DKK 35 million has been set aside for the test scheme for electric vehicles in the period 2008-12.
Read more about the test scheme for electric vehicles.
Biofuels:
can they fuel our lifestyle without taking food from the poor?
A consultation by the UK Nuffield Council on Bioethics wants to hear public opinion on the new generation of biofuels
Just in case you thought it was safe to stop thinking about biofuels, here comes another study – this time into the ethics. Can a new generation of biofuels ensure we don't increase greenhouse gas emissions and take food from the poor to fuel our cars?
The UK Nuffield Council on Bioethics (NCB) launched a consultation today calling for anyone and everyone's views on biofuels – everything from ethanol to futuristic synthetic hydrocarbons from algae.
The story for biofuels is well-told among environmentalists. Hailed as a sustainable way to produce liquid fuels for transport, their promise quickly began to fade
as the inadvertent side effects of growing the crops began to spoil the claims made by manufacturers.
First generation biofuels are made from food crops including sugar cane, soy or wheat.
In some cases, however, the net greenhouse gas emissions from these fuels (once transportation and processing were taken into account) were no significant improvement on burning the fossil fuels they replaced, such as diesel.
In addition, using food crops meant that farmers found a more lucrative market for their crops. Tortilla wars and rising food prices in general started to raise alarm bells.
In the UK, the Gallagher review (pdf) suggested a slowdown of the UK's Renewable Transport Fuel Obligation.
The obligation forced fuel suppliers to mix 2.5% biofuels into the road transport fuel they sold in 2008-09.
The target was set to increase by 1.25 percentage points per year, reaching 5% in 2010-11.
Beyond the UK, at a European level, a critical report (pdf) by the European Commission's Joint Research Centre, called Biofuels in the European Context:
Facts and Uncertainties, has caused havoc with EU targets.
There's also further research suggesting that fertiliser used to grow biofuels can also be a significant source of greenhouse gases.
However, despite all these problems with the first generation biofuels, the NCB reckons second-generation fuels are much more interesting.
"Research into new types of biofuels is looking more promising," said Joyce Tait, chair of the NCB's working party on biofuels. "Rather than using food crops to produce biofuels, in the future we may be able to use algae, trees, the inedible 'woody' parts of plants, and agricultural waste.
"In addition, scientists are working to increase the yield of biofuel crops and improve the production process, in order to maximise the energy output of land and reduce net greenhouse gas emissions."
Before these new technologies are brought to life, however, Tait says society must think soon about how it can avoid the problems of first generation biofuels:
"We also want to find out how consumers feel about moving towards a greater use of biofuels. People's attitudes will have a major impact on whether biofuels can successfully become part of the energy mix."
The council will look at the displacement of local communities from land given over to biofuel production, stories of poor conditions for workers, and environmental pollution.
"We want to ensure that the ethical dimension is taken into account.
We want to see that the production of new types of biofuels, especially in developing countries, has a positive effect on local communities and supports economic development by creating jobs and new sources of income," said Tait.
The NCB wants to hear [Word doc] from anyone with a personal or professional interest in biofuels, both from developing and developed countries – the deadline for responses is March 15 next year. The final report, meanwhile, with recommendations for policy makers, will be published some time before the end of 2010.
China means business with first-ever carbon emissions targets
The Asian powerhouse has clearly bought into the climate change diplomacy game – but how much difference will these self-imposed goals actually make?
Steel mills blow industrial smoke over residential buildings in Benxi, China.
The country yesterday set its first-ever carbon targets. Photograph: Gilles Sabrie/Corbis
China could regret setting its first carbon target.
Even if the impact on the economy proves manageable, the country's negotiators have now condemned the world's most populous nation to jargon-filled number crunching and climate geekery for decades to come.
During the past six years in China, I can count the number of times I have heard locals talk about carbon offsetting on one finger. They didn't need to:
under the Kyoto protocol, China and other developing nations were not obliged to do anything to reduce emissions. That will all change with yesterday's announcement, which paves the way for China to establish carbon trading, carbon taxing and, perhaps one day, carbon offsetting.
What it will not mean is an overall reduction of greenhouse gases from the world's biggest emitter. The new target is a 40-45% reduction in carbon intensity (emissions per yuan of economic activity) between 2005 and 2020.
That means slowing the rate of increase rather than cutting back.
China's emissions will increase by between 90% and 108% between 2005 and 2020 if the economy grows at 8% per year, according to Arthur Kroeber of Dragonomics Research & Advisory.
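The relationship between an intensity target and absolute emissions can be checked with simple arithmetic. The growth and intensity figures are from the article; the function and the 15-year horizon (2005-2020) are my framing, not Kroeber's exact model, and the article's 108% upper bound presumably assumes faster growth than the 8% used here:

```python
# Sketch: absolute emissions = economic output x carbon intensity.
# A 40-45% cut in intensity combined with ~8%/year GDP growth still
# yields a large rise in absolute emissions over 2005-2020.
def emissions_change(gdp_growth, intensity_cut, years=15):
    """Fractional change in absolute emissions after `years`."""
    gdp_factor = (1 + gdp_growth) ** years       # size of economy vs 2005
    return gdp_factor * (1 - intensity_cut) - 1  # emissions vs 2005

for cut in (0.40, 0.45):
    change = emissions_change(0.08, cut)
    print(f"{cut:.0%} intensity cut, 8% growth -> emissions {change:+.0%}")
```

With a 40% intensity cut the arithmetic gives roughly a 90% emissions rise, matching the low end of Kroeber's range.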
But it could be a lot worse.
According to the World Wide Fund for Nature, China's new target will prevent more than 4 gigatons of carbon entering the earth's atmosphere between 2010 and 2015, in addition to the 1.5 gigatons already saved by the energy efficiency drive during the current five-year plan.
There appears to have been considerable coordination between China and the US in announcing roughly equivalent targets within a day of each other. The World Resources Institute calculates that President Obama's goal of a 17% emissions reduction is worth slightly more than a 40% improvement in carbon intensity. A like-for-like deal seems to have been reached, even though China remains publicly adamant that its actions are voluntary while those of the developed nations are mandatory.
Xie Zhenhua, the vice chairman of the National Development and Reform Commission, stressed yesterday that the goal only applies at home.
It is not, he said, "internationally binding or subject to international verification".
This may upset some foreign observers, but China has a better record of meeting ambitious domestic targets over the past five years than many countries have managed with internationally binding commitments.
A bigger question mark over China's announcement is the lack of ambition relative to what it has already been doing.
The target is less than the country is aiming for in the current five years and less than it achieved in the previous 15 years. Xie acknowledged that China achieved energy conservation gains of 47% between 1990 and 2005.
But he insisted the lower headline figure of the new target masked the fact that it is harder to achieve because all the low-hanging fruit has already been picked.
There is some truth in this – over the past five years, China has replaced thousands of small, inefficient power plants, steel factories and cement makers with more modern facilities. It has also invested heavily in renewable energy.
Doing so again will be more difficult and costly.
But other countries are also pushing themselves hard despite increased costs and challenges – most notably Japan, which is already one of the world's most efficient nations but still raised its carbon reduction target 10% this year.
China's vice minister for foreign affairs, He Yafei, has said it is unreasonable to compare developed and developing nations because of the rich world's historical responsibility for carbon emissions. This is contentious. Data from the World Resources Institute puts China's cumulative emissions since 1900 at third behind the US and Russia.
However, given its 1.3 billion population, the carbon footprint of the average person in China is around a third of the European level and a quarter of the US level.
In addition, almost a fifth of the emissions that are calculated as Chinese are used to manufacture products for export to countries like the UK.
But look forward instead of back and the picture is very different.
If current trends continue, China will soon be the number one climate villain in a whole new set of categories. People living in rich cities like Shanghai already have a higher average carbon footprint than people in Japan, the UK or France.
Without stronger action, this will be true of an ever increasing number of people in China.
A carbon intensity target does not mean a cut in emissions; it means a slowing of the growth in greenhouse gases relative to the expansion of the economy.
This could still mean very significant carbon savings. The bad news is that China's emissions are still likely to increase substantially between now and 2020.
But the jargon is clearly coming along.
China is very serious about contributing in every way to the global warming debate.
The U.S. Environmental Protection Agency, as expected, on Monday declared greenhouse gases a danger to public health, a decision that could soon lead to new emissions regulations for businesses across the economy.
The "endangerment finding" announced by EPA Administrator Lisa Jackson is necessary to move ahead on new emissions standards for cars due out in March 2010.
Made under the Clean Air Act, it also opens up large emitters such as power plants, oil refineries, chemical plants and metal smelters to regulations that limit their output of carbon dioxide and other gases.
"These long overdue findings cement 2009's place in history as the year when the U.S. government began addressing the challenge of greenhouse-gas pollution and seizing the opportunity of clean-energy reform," Ms. Jackson said.
The controversial decision, proposed by the administration earlier this year, comes as a global climate summit opens in Copenhagen.
It gives the administration leverage in its negotiations and puts pressure on Congress to pass a bill that cuts greenhouse gases in a more economically efficient way.
Though the House has passed such a bill, the Senate has faced a number of political hurdles.
Without any cost analyses of new greenhouse-gas regulations, it is difficult to estimate what the actual impact could be on the economy.
Dan Riedinger, a spokesman for the utility industry group Edison Electric Institute, pointed to cost predictions for federal legislation as a guide to the cost.
Estimates for legislation range from $100 to $1,000 a year extra for families, and such legislation is specially designed to moderate costs.
"The only certainty is that EPA regulation would be far more expensive than congressionally designed legislation," Mr. Riedinger contends.
Although industry officials say no economic study of the impacts of greenhouse-gas regulations under the Clean Air Act has been published, the EPA strongly challenges dire economic assertions.
Ms. Jackson indicated the agency would soon finalize a new "tailoring rule" that will set a greenhouse-gas-emissions threshold for regulators at 25,000 tons a year.
This is designed to target the largest emitters in the country.
The EPA says that would mean around 13,600 coal-burning power stations, crude refineries, metal smelters and other industrial facilities would come under existing regulations.
Specifically, for any new construction or modification that would affect greenhouse-gas emissions, companies would be required to apply for permits that include the "best available technology."
The EPA is expected to finalize what is considered the best available technology in 2011.
Asked when the agency would draft new regulations for existing large emitting facilities, Ms. Jackson declined to give a timeline.
Industry lawyers say if the EPA finalizes its auto-emissions rule by March 31, as expected, regulation of greenhouse gases such as carbon dioxide would automatically start 60 days later.
Jeff Holmstead, a former EPA air administrator under the George W. Bush administration and now head of the Bracewell & Giuliani Environmental Strategies Group, said this is the first time the agency has ever made a standalone endangerment finding.
He thinks it was a political decision.
"It's clearly designed to set the stage for the Copenhagen conference," Mr. Holmstead said.
Previously, the EPA had synchronized endangerment determinations with its rule-makings.
But provisions in the EPA's tailoring rule may mean the 25,000-tons-a-year threshold won't apply in many states.
Global Warming:
The Blog Epic, Part I
This is a repost of the first in a series of entries on Global Warming.
The global warming debate has been running continuously since the now very obscure publication of "Moment in the Sun" (1968) by Dr. Robert Rienow and Leorna Train Rienow.
Most people think of the literary beginning of the environmental movement as having been "Silent Spring" by Rachel Carson, and maybe so, but for me, it was Rienow.
This is partly because "Moment..." was the first book I read on the topic, one of the first "adult" books I read at all, and on those early mornings before school I was able to watch Dr. Rienow on that crazy newfangled box ... the black and white TV my parents had just acquired ... on a thing called "Sunrise Semester" produced by SUNY-Albany.
Rienow would lecture, and he and his wife and (I assume) the occasional student would put on skits lampooning industrialists and other polluters.
I remember one day, years after having last seen Sunrise Semester, having just acquired a car and a license (at a ripe old age of 18 or so) exploring the territory south of town, along the Hudson River. I encountered an old narrow road running down into the wooded valley from a minor highway, and took the turn thinking it would lead somewhere interesting.
Soon enough there was another turn onto a narrow gravel way called "Holly Hock Hollow" ... that name sounded familiar, but I could not place it.
So I made that turn as well. A mile and a half or so later, the road leveled off to join the floodplain of a small creek, and I started to see little wooden signs in the forest, extolling in a few words here and there the virtues of nature, and imploring the reader to "leave no trace of your visit" and "respect the trees and animals" and such.
Eventually I spied, alongside the road where a stone wall opened to a gate, a sign:
"Holly Hock Hollow Farm ~ Robert and Leorna Rienow."
Holy Crap, I had found the very place where the professor and his wife lived.
For me, it was like finding Gandalf's hideaway, or a really good used bookstore, or, well, I don't know what.
Naturally, I did not have the guts to stop in and say hello, and although I drove by the place on my explorations several more times in coming years, I never bothered the couple.
But my memory of that discovery will never fade (but details subject to random neural modifications, of course).
Anyway, at some point in time, I believe in the 1970s, many scientists realized that the greenhouse model was a powerful predictor, and started to believe that global warming was going to happen, even in the absence of enough clear empirical data.
Keep in mind:
Theories can be very powerful. A theory like the "Greenhouse Model" was very powerful, and had already been tested in a lot of contexts, including other planets. But the empirical data of change in the Earth's climate was not fully developed at that time.
From this early speculative period into the 1980s (maybe the late 1980s?) the data started to come in line as well, and an increasing number of scientists were forced to conclude that global warming was underway and likely to get worse.
But we had Reagan/Bush, Reagan/Bush, Bush, (Clinton/Gore, Clinton/Gore), Bush/Cheney, Bush/Cheney in the White House, and a congress that I think on average was more often Ree-pub than DemocratIC. And Big Oil has always been powerful.
So moving from informed speculation to virtual certainty by the early or mid 1990s, then to the point of hard and fast conclusions that not even dyed-in-the-wool right-wing yahoos could deny, was delayed.
It probably could have happened by the late 1990s or so, but we had to wait another seventeen years. In other words ... yes, had Al Gore been inaugurated rather than the Loser Bush, this would all have happened already.
Have I got this right?
Remember, we were almost there.
We were there at Kyoto but some bad decisions were made and we slid back a decade or so in terms of political reality.
But I admit these dates are subject to revision after a closer look. I do recall writing an article for a monthly newspaper some time around 1988 (or maybe 1990?) that, in my view, summarized a number of lines of evidence and absolutely nailed down (for the readers of that fairly left wing publication) the fact that global warming was real and anthropogenic. I think a lot of us feel that we've been spinning wheels for many years, and that this planet, our civilization, the environment, have all been cheated out of a couple of decades of progress.
So what is this an introduction to? I plan to systematically go through a number of topics related to Global Warming (and more broadly climate change, to some extent) and provide up to date information and description.
What are the components of "forcing," what are the greenhouse gases, and why do some matter more than others?
Why is sea level so important, and so incredibly interesting?
What is the link between overall climate pattern and important events such as hurricanes and tornadoes, or whether we have a lot of snow or very little in a given winter?
And so on.
Why Greenhouses have nothing to do with the Greenhouse Effect, and more importantly, why CAN'T I microwave toast?
A greenhouse is a glass house that is sealed to keep air in and insulated to keep heat in but at the same time allow sunlight in.
This sunlight contributes to the heat in the greenhouse by warming the ground or other material in the greenhouse, and of course the light energy is used by the plants. But the point of a greenhouse is to keep air that is warmed, by the sun and/or heaters that may be required in the greenhouse, from wafting away.
This is not how the so-called "greenhouse" effect works. There is no thing out there keeping warm air from wafting away from the planet.
The air just stays there, greenhouse effect or not, moving around and doing the weather thing, and looking blue much of the time.
It is possible to find descriptions of the greenhouse effect (in the atmosphere) that make the analogy very directly, but this is incorrect. A gardener's greenhouse works because it keeps air that has been warmed from leaving the vicinity at the same time it lets in light for plants to use, while the greenhouse effect in the atmosphere is an entirely different process. For the first substantive post in this series, we'll look at the gory details of what a greenhouse gas is, and how the greenhouse effect works.
The sun is very hot.
In part, this means that the matter that the sun is made of emits energy of some kind, and since it is VERY hot, this energy tends to be of very high frequency (short wavelength) ... in the form of what we call "electromagnetic" radiation.
The relationship between an object's temperature and the frequency/wavelength of energy it emits is a matter of physics beyond our current scope, but you can think of it this way:
A slowly moving object (say something vibrating at a few hundred up to several thousand times a second) will "hum" ... it will emit sound.
Each movement of the object "back and forth" makes a "wave" of sound, so the faster the movement, the higher pitch the sound.
Electromagnetic radiation ... which sometimes goes by the name of "light" or "radio waves" and so on ... is a kind of energy that can be stored in and sometimes comes out of atoms. It is a phenomenon happening at a much higher frequency than this sound wave analogy, and instead of being a series of sound waves (which involves the repeated compression of, for instance, air molecules) it is a series of waves and/or particles sometimes going by the name of photons. As you probably know already, this kind of phenomenon cannot be described as a stream of "things" (photons) in a way that explains all of its properties, and it cannot be described as a "wave" of energy in a way that explains all of its properties. If you really need to think about electromagnetic radiation in detail, you have to think of it as both/either/or particle and wave.
Fortunately, you don't need to think about it at this level to understand the greenhouse effect.
What you do need to know is that this form of energy has a wide range of wavelengths, some of which we see ("light"); the frequency of the energy determines much of its properties; and hotter things emit higher frequency energy because the atoms in the hotter things are wiggling back and forth faster.
A quick digression on frequency and wavelength:
Frequency is the rate at which something vibrates, or changes its state ... like from negative to positive charge ... measured in units such as "billion cycles per second." "Wavelength" is measuring the same exact thing, but instead of cycles per second, it is how much distance is traveled by this energy ... typically moving at the speed of light ... before it completes one full cycle from one state to the other. Think of it as the distance between the tips of waves on the sea.
If the distance between the wave tips is shorter, there are more waves hitting the shore per minute, but if the distance is greater, fewer waves hit the shore per minute.
Higher frequency (many waves) = shorter wave length, lower frequency (few waves) = longer wave length.
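The inverse relationship described above can be written down directly: for a wave traveling at the speed of light, frequency equals the speed of light divided by wavelength. A minimal sketch, using the visible-light range mentioned later in the post:

```python
# Sketch: frequency and wavelength measure the same thing, inverted.
# frequency (cycles per second) = speed of light / wavelength.
C = 299_792_458  # speed of light in meters per second

def frequency_hz(wavelength_m):
    return C / wavelength_m

# Red light (~700 nm) vs violet light (~400 nm):
for nm in (700, 400):
    print(f"{nm} nm -> {frequency_hz(nm * 1e-9):.2e} Hz")
```

Running this shows the shorter 400 nm wave cycling noticeably faster than the 700 nm wave, which is the whole "higher frequency = shorter wavelength" rule in two lines of arithmetic.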
If you knew about a certain wavelength, and discovered an energy of shorter wave length, you might think of calling this "shortwave."
If you then discovered even shorter wavelength (higher frequency) energy, you might have to call this "microwave" (because you already used the word "short") etc.
Thus we have things we call shortwave radios and microwave ovens. These different machines use energies of different wavelengths.
Light (electromagnetic radiation that our eyes have evolved to convert to neural signals ... i.e., energy we can see) has a range of wavelengths from about 700 to 400 nanometers (nanometers are very small ... there are 1,000,000,000 of them in a meter). Energy that is of higher frequency is called ultraviolet, because the highest frequency light we see is what we call "violet" in color, so higher frequency is "ultra" (ultra = extreme). Energy of lower frequency is called "infrared" because the lower end of the frequency range of visible light is called by us humans "red" ...
"Infra" means beneath, as in infrastructure (the roads, sewers, etc.) or "inferior."
When you feel heat, you are actually perceiving energy that is down in this infrared range of wavelength.
The next level down in frequency from infrared is called "Microwave."
The boundaries between these named ranges of wavelength/frequency are not always stark in terms of the effects of the energy.
The higher frequency end of microwaves and the lower range of infrared will both cook your food.
The higher or middle end of infrared happens to cook your food in a way that facilitates the famous "Maillard reaction" ... a reaction between sugars and amino acids that makes your food taste good.
This is why microwaves and "heat" both cook your food but the food comes out differently in taste and texture depending on method.
Yes, this IS related to global warming.
The difference between microwaving vs. toasting a piece of bread has to do with the way in which specific, different, molecules react to specific, different, wavelengths of energy. A bunch of water molecules heated in a microwave or on a stove is the same ... hot water. The various molecules in a slice of bread heated in a microwave vs. in a toaster react very differently, producing very different results (something inedible vs. toast).
A gas is a "greenhouse gas" because of the way it (its molecules!) reacts to a particular form of radiation (infrared).
Energy From the Sun
Most of the energy that reaches the surface of the earth is high frequency, including light.
Light is only barely affected by the gases that make up our atmosphere.
In other words, as the light wave/particles are moving through the atmosphere of the earth, most of them don't get absorbed by the stuff the air is made out of.
(Now, this is not a coincidence.
That which we call "light" moves around pretty freely in our planet's atmosphere.
We evolved on this planet.
Our eyes can detect in fine detail this energy that moves around freely.
Our eyes can't detect the energy that is typically absorbed high in the atmosphere, because it is never around at the surface, so why would natural selection shape our eyes to be able to "see" it?
If we evolved on a different kind of planet, physicists would probably have a somewhat different set of instruments to detect and measure the energies they are so interested in.
Perhaps an "optical" telescope would be used for a somewhat different (shifted one way or another, or narrower, or broader) range of "light." OK, that was today's shameless promotion of thinking-of- EVERYTHING -in-terms-of-evolution digression...)
When this high frequency (short wavelength) energy from the sun encounters the relatively solid matter that the earth's surface is made of, including rocks, plants, liquid water, etc., it is absorbed by that matter. Not so much by the MOLECULES that matter is made of, but by the ATOMS that those molecules are made of. At this energy level (light, radiation, and such) the absorption is happening at the atomic level ... this is an important fact.
You can think of it this way:
Photons (light "particles") have a very high probability of encountering an atom in, say, a rock.
The atom "absorbs" the photon .... this means the photon essentially becomes part of the atom for the time being.
An atom with this extra bit (the photon) changes. The way it changes is that one of the electrons (the outermost part of the atom is a cloudy space within which the electrons are flying around) stores this energy, what physicists refer to as "becoming excited."
This is probably why physicists do not get a lot of dates.
What has happened here is that high-frequency energy, the kind of energy that is emitted by a very hot object, has found its way to a cool object (earth surface temperatures are cool relative to the sun), and gotten stored there in the atoms that object is made of.
The Earth is a Big Space Heater
Now, this relatively cool matter can release the energy (depending on various laws of physics I won't go into), but since the frequency of the energy released by an object goes up with the temperature of the object (remember that from several paragraphs back?), this energy is of lower frequency ... longer wavelength ... than sunlight.
So, the relatively hard surface of the earth converts high frequency energy into lower frequency energy.
This low frequency energy is what we think of as "heat."
In this way, the surface of the earth is a simple machine that converts sunlight into heat.
So, as long as the sun is shining on the earth, the surface of the earth is a big heater. Since the atmosphere is sitting right there on top of the surface, this big heater (the earth's surface) heats up the atmosphere.
Eventually, this heat ... now in the atmosphere ... makes its way to the outer limits of the atmosphere where it radiates off into space.
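The temperature-to-wavelength claim above can be made quantitative with Wien's displacement law (peak wavelength = constant / temperature). The post doesn't name the law; the constant and the two temperatures below are standard textbook values, used here purely as an illustration:

```python
# Sketch: Wien's displacement law. Hotter objects emit at shorter
# wavelengths: the sun peaks in visible light, while the earth's
# much cooler surface peaks deep in the infrared ("heat").
B = 2.898e-3  # Wien's constant, meter-kelvins

def peak_wavelength_m(temperature_k):
    return B / temperature_k

sun = peak_wavelength_m(5778)   # sun's surface, ~5778 K
earth = peak_wavelength_m(288)  # earth's surface, ~288 K (15 C)
print(f"sun:   peak near {sun * 1e9:.0f} nm (visible light)")
print(f"earth: peak near {earth * 1e6:.1f} micrometers (infrared)")
```

The sun's peak lands around 500 nm, in the middle of the visible range; the earth's surface peaks around 10 micrometers, squarely in the infrared that greenhouse gases absorb.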
Neither rain, nor sleet, nor snow, nor gloom of night ...
On its way towards outer space, this heat energy is absorbed by the molecules of the atmosphere itself, then re-released.
Think of a bit of energy as a letter that you put in the mailbox.
You know that when you put the letter in the mailbox, it does not simply disappear and rematerialize in the recipient's mailbox.
The letter changes hands many times, from the postal worker who picks it up, to other postal workers who sort the mail, move it from one place to the next, to the postal worker who eventually puts it in the recipients box.
Hold this analogy in your mind for a moment...
If the atmosphere was made of a gas that is lousy at absorbing heat energy, the heat would radiate more or less directly into outer space, the total time required being a function of the total thickness of the atmosphere (and some other things). But if the atmosphere contains a certain number of molecules that are good at absorbing heat energy, that is like having a lot of extra postal workers ... a certain unit of heat energy leaving the surface of the earth will be absorbed by a molecule, held for a while, then released, again and again.
The greater the relative number of these heat-absorbing molecules in the atmosphere, the more this will happen, and the total amount of time this energy hangs around in the atmosphere will be greatly increased.
Imagine that a bit of gas absorbs a unit of heat.
It then releases the heat.
It will release the heat in all directions around itself. So a unit of heat that may have been moving "up" towards the outer edge of the atmosphere gets stopped and then released, and some of it continues on its way to the edge of the atmosphere, but some of it is released back towards the surface of the earth.
So it isn't just the number of postal workers (greenhouse gas molecules) that slows down the delivery (of your postcard, or of the heat to outer space) because there are more "handlers."
Imagine every postal worker has a 50-50 chance of sending your postcard on in the right direction towards its destination, and a 50-50 chance of sending it back in the direction of the sender. That's what the gas does. It randomizes the heat's flow.
So a million letters mailed each day in an efficient postal system all move through the system quickly, so at any moment there are just over a million or so letters in the various bins and boxes in the postal system.
But if there are a lot of postal workers and they are pretty random in which way they "send" each letter they handle, the letters will build up, and the bins, boxes, trucks, and mailbags will swell with letters that are taking forever to move through the system.
Greenhouse gases are molecules that absorb and release heat passing through the system.
The more greenhouse gases, the more the heat is passed around in random directions in the atmosphere, and the more the atmosphere "swells" with this heat.
Hot object (sun) irradiates cold object (the earth) with high frequency energy, cold object (earth's surface) converts high frequency energy into low frequency energy (heat) which radiates away.
Greenhouse gas molecules interfere with this process by randomizing the direction in which the heat goes. It's like when they change the gate of your departing flight and for a while nobody knows where the new gate is, only with heat.
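The postal-worker picture can even be turned into a toy simulation. This is purely illustrative; the layer count, absorption probability, and trial count below are made-up values for the analogy, not physical parameters:

```python
import random

def escape_time(absorb_prob, layers=20, trials=2000, seed=42):
    """Average number of steps for a 'unit of heat' to cross `layers`
    atmospheric layers. At each layer it may be absorbed and re-emitted
    in a random direction: onward toward space, or back toward the surface."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        pos, steps = 0, 0
        while pos < layers:
            steps += 1
            if rng.random() < absorb_prob and rng.random() < 0.5:
                pos = max(0, pos - 1)  # re-emitted back toward the surface
            else:
                pos += 1               # continues upward toward space
        total += steps
    return total / trials

print(escape_time(0.0))  # no absorbers: exactly 20 steps, straight out
print(escape_time(0.5))  # with "postal workers" randomizing direction, escape takes longer
```

The point the simulation makes is the same one the analogy makes: the more absorbers there are, the longer each unit of heat lingers in the system, so more heat is "in transit" at any moment.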
But ... what are greenhouse gases, already?
So why do some molecules absorb (and release) this heat while others don't?
It's a matter of how the atoms that make up the molecule are bound together. The atoms in a molecule are held together by electromagnetic forces. The nature of this binding between atoms varies in different kinds of molecules. A molecule made of two identical atoms (which is how atmospheric nitrogen and oxygen usually occur, two atoms per molecule) is bound together with such tightness and symmetry that it essentially acts like a single atom when it comes to low frequency radiation (like heat). Heat moves across a collection of gas molecules of this type like waves in water ... the molecules all sit there but the movement of the molecules (heat) passes across this matrix of molecules:
Heat "arrives" by pushing on some molecules, then the molecules just push the next ones in line ... and thus the heat passes along. (sort of) But if the molecule is made of different atoms, put together a certain way, then the relationship among the atoms in the molecule is in a sense flexible, so this heat energy (motion) can go from a wave of movement across a matrix to a bunch of movement WITHIN the molecule itself. Thus the energy is trapped for a while inside that molecule.
When the molecule then releases this energy, there is no "memory" of the direction in which it was moving ... the energy now simply moves outward from the molecule.
There may be a directionality to that ... frankly I don't know ... there must be in some cases ... but the direction of emission of this energy from a given molecule is not related to the direction from which the heat originally came, and there are a lot of molecules, so the effect is omnidirectional.
The non-greenhouse gas molecules are like the hallways in the post office ... they have nothing to do with stopping or redirecting the energy (letters). The greenhouse molecules are the perfect random postal workers. They stop the energy (letter), hold on for a while, and then send the energy (letters) off in a random direction.
Dry "atmosphere" is made of nitrogen (78%), oxygen (21%), argon (1%), and a tiny amount of other gases. The atmosphere can then include varying amounts of water vapor. Less than one percent of dry air is carbon dioxide and other greenhouse gases, not counting water. At "100% humidity" something like 7% (depending on temperature) of the air is water vapor, while locally it can be as little as almost zero.
Water vapor is a greenhouse gas.
Water vapor is both fairly fixed and highly variable.
The total amount of free water on the earth does not really change, but how much is in the atmosphere at one point in time in a given spot varies a lot.
The other atmospheric gases don't change back and forth between gas and liquid (or solid) like water does, so they are more or less a constant (on a day to day basis) but since water converts back and forth between liquid or solid and gas form at typical Earth atmospheric temperatures (through evaporation and precipitation), it varies all the time.
This, mainly, is what we know of as weather (along with a few details such as whether the non-gas water is liquid or ice/snow!). The point is that humans do not change the water vapor system in any way that alters the greenhouse effect, but by adding (or removing) the other greenhouse gases, we can have a large effect.
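As a quick sanity check on the composition figures above (a trivial sketch; the percentages are the rounded values quoted in the text):

```python
# Approximate dry-air composition, by volume, from the figures in the text.
# Trace gases (CO2 and the rest, under 1%) are omitted here.
dry_air_percent = {"nitrogen": 78.0, "oxygen": 21.0, "argon": 1.0}

# The three main gases already account for essentially all of dry air,
# which is why the greenhouse gases are such a tiny fraction.
print(sum(dry_air_percent.values()))  # 100.0
```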
So how long it takes for your letter to get delivered on a given day may have to do with how many other people send mail that day (seasonally varying perhaps) and things like traffic, delays at airports, etc. but that all evens out over time so there is an average delivery rate.
But if you go and hire twice as many inefficient postal workers, you slow down all delivery, on average, and over the long term.
Carbon dioxide is the main greenhouse gas that humans alter. Prior to human effects, the level of this gas in the atmosphere is estimated to have been about 260 - 280 parts per million.
The current level, elevated primarily because of human activities, is about 380 parts per million.
That's a lot of postal workers.
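Those two numbers make the size of the change easy to quantify. A back-of-the-envelope sketch using the figures quoted above, taking 270 ppm as the midpoint of the pre-industrial range:

```python
# Figures from the text: pre-industrial CO2 ~260-280 ppm, current ~380 ppm.
preindustrial = 270.0  # ppm, midpoint of the quoted range
current = 380.0        # ppm

increase = current - preindustrial
pct = 100 * increase / preindustrial
print(f"CO2 is up {increase:.0f} ppm, roughly {pct:.0f}% above pre-industrial levels")
```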
Clearing a Path to Nature
The disengagement of children from nature has begun to alarm some of America's more thoughtful naturalists, scientists, and environmentalists.
By Richard Louv Posted May 28, 2009
If children's direct experience of nature is vanishing, where are future environmentalists going to come from?
In 1978, Thomas Tanner, professor of environmental studies at Iowa State University, conducted a study of environmentalists' formative influences—what it was in their lives that had steered them to environmental activism.
He polled staff members and chapter officers of major environmental organizations.
"Far and away the most frequently cited influence was childhood experience of natural, rural, or other relatively pristine habitats," according to Tanner. Most enjoyed unstructured play and discovery in such settings almost every day during childhood.
"Several studies since mine have supported my findings," he says. "But for some reason, you don't hear many environmentalists expressing much concern about the intimacy factor between kids and nature."
One naturalist who has given this issue some thought is Robert Stebbins, professor emeritus at the Museum of Vertebrate Zoology at the University of California, Berkeley.
For more than 20 years, Stebbins' book, A Field Guide to Western Reptiles and Amphibians, which he wrote and illustrated, has remained the undisputed bible of herpetology, inspiring countless youngsters to chase snakes.
"We've got to teach children and young people that we're related to every living thing," Stebbins told me.
As we spoke, he dropped scores of slides into an old viewer. "Look," Stebbins said, "Ten years of before-and-after photos."
Taken by Stebbins and his students in the California desert over a period of 10 years, the slides document the destruction caused by all-terrain vehicles (ATVs): grooves and slashes, tracks that will remain for centuries; desert crust ripped up by rubber treads; great clouds of dirt rising high into the atmosphere; a gunshot desert tortoise, with a single tire track cracking its back.
Stebbins discovered that 90 percent of invertebrate animal life—insects, spiders and other arthropods—had been destroyed in the ATV-scarred desert areas.
Forgotten Connections
Looking at Stebbins' slides, I wondered whether this destructiveness was simply the inevitable product of population growth in a fragile area, or whether our culture is producing a succession of generations with depreciating regard for the environment.
"One time, I was out watching the ATVs. I saw these two little boys trudging up a dune. I went running after them. I wanted to ask them why they weren't riding machines—maybe they were looking for something else out there," said Stebbins. "They said their trail bikes were broken. I asked them if they knew what was out there in the desert, if they'd seen any lizards. ‘Yeah,' one of them said, ‘But lizards just run away.' These kids were bored, uninterested.
If only they knew."
So how do we help them know?
For the past decade, Robert F. Kennedy Jr., an environmental lawyer and son of the late senator, has helped an organization called Riverkeeper bring back the Hudson River from its polluted, watery grave.
He likes to take his children scuba diving in the Hudson.
He "buddy dives" with them, which is a method to teach correct underwater breathing.
With a single oxygen tank, he and a child will descend to the bottom of the river and sit next to a large rock, sheltered from the current.
He holds the child around the shoulders or waist (protectively, but also to feel the child's breathing) and the two of them pass the mouthpiece back and forth.
They sit down there, among the dancing plants, and watch the fish go by:
the aggressive bass and whiskered catfish, and even an occasional sturgeon, monstrous, prehistoric, and graceful.
Kennedy told me of his earliest experiences as the family's "nature child," as he called himself, and how those experiences shaped his fathering.
"I spent every afternoon in the woods when I was growing up in Virginia," he said. "I loved finding mud puppies, salamanders, crayfish, frogs. From the time I was 6 years old, my room was filled with aquariums. And it still is today. I have a 350-gallon tank and I have aquariums all over my house."
Catfish, eel, bullheads, striped bass, largemouth bass, blue fish, perch, sturgeon and trout.
He and his kids catch them in the Hudson, bring them home alive, and keep them in the aquariums for a few days—then release them.
Practical Things You Can Do
Most children are good at introducing themselves to nature; they just need a little encouragement.
The garden spider, the ant highway, the bluegill taking a worm—these are all doorways into that other world, the one outside the Nintendo universe.
As parents, we can help open those doors. Particularly in urban areas, exposure to nature doesn't come naturally.
We need to bring nature to our kids. We can:
* Join nature organizations, and encourage them to pay attention to kids. The National Audubon Society, the National Wildlife Federation, the Nature Conservancy, the Sierra Club or other organizations are beginning to address the breach between kids and nature.
Most have monthly publications with spectacular natural photography; some have special memberships and magazines for kids.
* Take a nature break.
Take children for a walk in the woods. If there isn't one nearby, visit a local zoo. Look for insects under leaves and birds in the trees. Get dirty.
Play in the mud.
Stand in the rain until your clothing is soaked.
Show kids how good nature can feel.
* Go camping, boating, hiking.
"When we go camping, I try to tap a vein of mystery I remember when I was a child," says John Johns, a Los Angeles businessman. "I get them up before dawn so we can see the coyotes. We hike under the moon, no flashlights. On camping trips, if the parents have a good time, the kids will have a good time.
They'll connect with nature."
* Take nature vacations. Families that don't enjoy camping can still vacation in natural settings by renting a mountain or beach cabin for a few days. Ski trips give kids a chance to roll in the snow.
Dude ranches give family members a chance to ride horses, sleep under the stars, and pretend they are cowboys and cowgirls.
* Encourage schools to incorporate nature into the curriculum.
Some schools adopt nearby canyons, fields or woods and, as part of biology class, clean up the trash, remove the non-native plants, and study the animal life.
These programs help kids experience nature up close, and improve science education by making it personal and hands-on.
* Conduct family or school nature treasure hunts and nighttime explorations. Anne Lambert, mother of three and former high school teacher, quotes Rachel Carson:
"It is not half so important to know as to feel" when introducing a young child to the natural world.
Remembering the Value of Dreamtime
Most of all, we need to give our kids some dreamtime—and recognize the connection between nature, emotional health, and creativity.
Many of our most gifted thinkers and creators were touched, as children, by the magic of nature.
Samuel Langhorne Clemens held down an adult job at the age of 14, but when his working day ended at three in the afternoon, he headed to the river to swim or fish or navigate a "borrowed" boat.
There he dreamed of becoming a pirate or a trapper scout and became Mark Twain.
The poet T. S. Eliot, who grew up next to the Mississippi River, wrote, "I feel that there is something in having passed one's childhood besides the big river which is incommunicable to those who have not."
The art critic Bernard Berenson recalled:
"As I look back on fully 70 years of awareness and recall the moments of greatest happiness, they were for the most part, moments when I lost myself all but completely in some instant of perfect harmony.
In childhood and boyhood this ecstasy overtook me when I was happy out of doors."
Creativity begins, he theorized, "with the natural genius of childhood and the 'spirit of place.'" Today, most children are probably hard-pressed to induce this spirit of place while stuck in a traffic jam on their way to soccer practice, or trapped inside a house because of the fear of crime, or fixated on achieving the next level of Mortal Kombat.
My sons often remind me that there is much that is good about today's childhood.
But still I wonder:
Am I doing enough to pass on to them a sense of natural wonder?
Am I listening?
Surely my 10-year old Matthew, who enjoys Nintendo as much as the next kid, is sending me a clear message:
He says he wants to do a lot more fishing this year.
Bigger threat to children around the world:
war or carbon emissions?
So as I watch and read all this propaganda-gone-wild in Copenhagen about how the biggest threat to our planet was apparently a potential 0.6 degrees Celsius increase in temperature over the last hundred years that might or might not have been caused by carbon emissions, I keep wondering if all these well-intentioned politicians, bureaucrats, oligarchs, executives, celebrities, agencies, corporations, NGOs, charities and world citizens just might have their priorities a little bit wrong.
To be clear, I'm concerned about the environment and am avidly against pollution, including excessive CO2 release.
But.
Some 200MM people were killed by war and violence in the last 100 years.
80MM or so of those were in genocidal situations.
Tens of millions of children around the world will be killed, tortured, maimed, raped or otherwise physically hurt in the next ten years.
Hundreds of millions of children are at risk of being victims of political violence or war right now, today.
So I wonder about these people's and organizations' and governments' priorities as I watch this video that has been a big hit in Copenhagen.
The video features children pleading that the developed world's governments save children in Africa and Asia and South/Central America by creating policies like "Cap n Trade".
And then the UN and other political agencies will have trillions of dollars to dole out to companies, charities and governments as they see fit to help stave off carbon emissions from killing children in Africa?
Really?
Really.
In reality, I ask you — if we really want to help children in Darfur and Kenya, should we help them feel safe from political violence or from global warming?
Before I took this job as a TV anchor, on Christmas Eve for several years, I used to go with the Rotary Club of Ruidoso to play Santa at an orphanage in Juarez, Mexico.
I'm actually going to get home for Christmas Eve this year, but I'm not sure we can get to the orphanage because of all the violence tearing apart Mexico.
What's a bigger threat to those kids at the orphanages in Juarez, for the orphans that Faces of Kibera helps in Kenya, and for the children in rural villages in genocidal Darfur in the Sudan — potential climate change over the next hundred years or personal safety tonight?
sea levels
Although sea levels have been rising since the end of the last glaciation (nearly 11,000 years), the rate of rise has increased over the past 200 years as average global temperatures have increased.
The rise is due to two factors, the freshwater being added to the oceans from ice melt in the cryosphere , and the thermal expansion of the oceans due to rises in sea temperature.
The contribution from Antarctica melt water is uncertain, and there is a distinct possibility of surprises from this southern region.
The floating ice shelves, notably the Wordie and Larsen A and B shelves, broke up very rapidly during the 1990s, after rapid regional warming.
Climate, like other complex systems, does not always vary in a smooth fashion, and sudden changes can occur over wide areas. Critical levels, or thresholds, may be reached in a system whereupon drastic, and perhaps disastrous, results occur.
Threshold events in this case include the complete or partial shutdown of the ocean thermohaline circulatory system, disintegration and melting of the Antarctic and Greenland Ice Sheets (the polar caps), and major changes in the carbon cycle due to biospheric effects (see the Snowball Earth scenario).
The IPCC Report
The IPCC 4th Report shows there is strong evidence that global sea levels gradually rose in the 20th century and are currently rising at an increased rate, after a period of little change between AD 0 and AD 1900.
Levels are projected to rise at an even greater rate in this century.
The two major causes for the rise are thermal expansion of the oceans (water expands as it warms) and the loss of land-based ice due to increased melting.
How Much Is The Sea Rising?
Estimates for the 20th century show that global average sea level rose at a rate of about 1.7 mm per year. Satellite observations available since the early 1990s provide more accurate data with nearly global coverage.
This decade-long satellite altimetry data set shows that since 1993 sea level has been rising at a rate of around 3 mm per year, significantly higher than the average during the previous half century.
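The two rates quoted here are easy to compare directly. This is a rough sketch that assumes a constant rate, which understates things, since the text notes the rate itself is increasing:

```python
def cumulative_rise(rate_mm_per_year, years):
    """Total sea-level rise in cm, assuming the rate stays constant
    (a simplification: in reality the rate has been accelerating)."""
    return rate_mm_per_year * years / 10.0

# Rates from the text: ~1.7 mm/yr (20th-century average), ~3 mm/yr (since 1993).
print(cumulative_rise(1.7, 100))  # ~17 cm over a century at the older rate
print(cumulative_rise(3.0, 100))  # ~30 cm over a century at the satellite-era rate
```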
Global levels are projected to rise during the 21st century at a greater rate than during 1961 to 2003.
Thermal expansion is projected to contribute more than half of the average rise, but land ice will lose mass increasingly rapidly as the century progresses. An important uncertainty relates to whether discharge of ice from the ice sheets will continue to increase as a consequence of accelerated ice flow, as has been observed in recent years. In particular, the Arctic is warming at a rate higher than the global average, resulting in increasing surface melt from the Greenland Ice Sheet.
More detail on how the oceans are measured is available here.
IPCC AR4 Global mean sea levels
Figure Above:
Time series of global mean sea levels (deviation from the 1980-1999 mean) in the past and as projected for the future.
For the period before 1870, global measurements of sea level are not available.
The grey shading shows the uncertainty in the estimated long-term rate of change.
The red line is a reconstruction of global mean sea level from tide gauges and the red shading denotes the range of variations from a smooth curve.
The green line shows global mean sea level observed from satellite altimetry.
The blue shading represents the range of model projections for the 21st century, relative to the 1980 to 1999 mean, and has been calculated independently from the observations. Beyond 2100, the projections are increasingly dependent on the emissions scenario.
Impacts of Sea Level Rise
Rapid urbanisation in low-lying coastal areas of both the developing and developed world is increasing population densities and the value of human-made assets exposed to coastal climatic extremes such as tropical cyclones. IPCC model-based projections show the average annual number of people who would be flooded by coastal storm surges increasing severalfold, creating 200 million climate refugees.
This is based on what is called a ‘mid-range scenario' of a 40-cm sea-level rise by 2080, which is pretty conservative.
The U.S. Environmental Protection Agency (EPA) notes that the sea level has risen 15 to 20 cm (6 to 8 inches) in the past 100 years, and it is predicted to rise another 50 centimeters (20 inches) over the next century (with some estimates as high as 90 centimeters, or 3 feet).
The sea level is definitely rising, and it is jeopardizing rapidly growing coastal communities. Official decisions on evacuation of whole populations from some atolls in the Pacific Ocean have been taken or are being considered.
For example, 980 people, the entire population of the Carteret Atoll, will need to be evacuated by 2015, and the island is destined to become history. A similar fate awaits the small nation of Tuvalu and Majuro in the Marshall Islands.
The potential damage to infrastructure in coastal areas from sea-level rise will run to tens of billions of US dollars for individual countries such as Egypt, Poland, and Vietnam.
Sea Encroaches Florida Hurricane
The first image above on the far left was taken on 12 August 1997 of a house at Floralton Beach, Florida.
When Hurricane Frances came through on 8 September 2004 all vegetation and dune lines were wiped out (middle image). As a result, the house was directly exposed and completely destroyed when coastal surges from Hurricane Jeanne hit on 29 September 2004.
It is interesting to note that seventy-one percent of annual United States disaster losses are the result of coastal storms. It is estimated that within 60 years, one out of every four of those structures will be destroyed and insurance costs will skyrocket.
This is not surprising given that this narrow fringe, comprising less than one fifth of the contiguous United States land area, accounts for over one half of the nation's population and housing supply.
The global carbon cycle
The global carbon cycle can be divided into two categories:
the geological, which operates over large time scales (millions of years), and the biological-physical, which operates at shorter time scales (days to thousands of years); as humans, we meddle with both categories.
The global carbon cycle refers to the movements of carbon, as it exchanges between reservoirs (sinks), and occurs because of various chemical, physical, geological, and biological processes. The ocean contains the largest active pool of carbon near the surface of the Earth, but the deep ocean part of this pool does not rapidly exchange with the atmosphere.
Below in the diagram, you can get some idea where and how carbon is stored in the whole Earth system.
The global carbon cycle is usually thought to have four major carbon sinks interconnected by pathways of exchange.
These sinks are:
* the atmosphere,
* the terrestrial biosphere (which usually includes freshwater systems and non-living organic material, such as soil carbon),
* the oceans (which includes dissolved inorganic carbon and living and non-living marine biota),
* and the sediments (which includes fossil fuels ).
Carbon exists in the Earth's atmosphere primarily as the gas carbon dioxide (CO2). Although it is a very small part of the atmosphere overall (approximately 0.04% and rising fast), it plays an important role in supporting life.
Other gases containing carbon in the atmosphere are methane and chlorofluorocarbons (the latter one we introduced and are still adding to). These are all greenhouse gases whose concentrations in the atmosphere are increasing, contributing to the rising average global surface temperature.
Carbon Cycle Diagram
Global Carbon Cycle - Sinks and Storage
Carbon is taken out of the atmosphere and into the rest of the Earth system in several ways:
1. When the sun is shining, plants perform photosynthesis to convert carbon dioxide into carbohydrates, releasing oxygen in the process. Deforestation and land clearing pose serious problems to the carbon cycle, and obliterating this sink means more carbon is forced into the atmosphere.
2. At the surface of the oceans towards the poles, seawater becomes cooler and CO2 is more soluble.
Cold ocean temperatures favour the uptake of carbon dioxide from the atmosphere whereas warm temperatures can cause the ocean surface to release carbon dioxide.
With seas warming this means CO2 is not so easily absorbed, and remains in the atmosphere.
This is coupled to the ocean's thermohaline circulation which transports dense surface water into the ocean's interior. During times when photosynthesis exceeded respiration, organic matter slowly built up over millions of years to form coal and oil deposits. All of these biologically mediated processes represent a removal of carbon dioxide from the atmosphere and storage of carbon in geologic sediments.
3. In upper ocean areas of high productivity, organisms form tissue containing carbon, and some also form carbonate shells or other hard body parts. Apart from trees in forests, phytoplankton in the Earth's oceans are very important organisms that soak up carbon.
The seas contain around 36,000 gigatonnes of carbon. In warmer seas, organisms cannot produce carbonate shells at the same rate, and increasingly acidic seas dissolve shells or make it difficult to create shelly material.
This means, of course, that carbon dioxide is not being taken up as quickly through this process and more carbon remains in the atmosphere, propelling global warming.
4. As shelled organisms die, bits and pieces of the shells fall to the bottom of the oceans and accumulate as sediments. Only small amounts of residual carbon from plankton settle out to the ocean bottom but over long periods of time these represent a significant removal of carbon from the atmosphere.
Global Carbon Cycle - Sources
Carbon can be released back into the system in many different ways:
1. Through the respiration performed by plants and animals.
2. Through the decay of animal and plant matter. Fungi and bacteria break down the carbon compounds in dead animals and plants and convert the carbon to carbon dioxide if oxygen is present, or methane if not.
The melting permafrost is releasing large amounts of methane, which contributes to global warming at a rate 21 times that of carbon dioxide.
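That multiplier is commonly used to express methane releases in CO2-equivalent terms. A minimal sketch using the factor of 21 quoted here (more recent assessments use somewhat different values):

```python
GWP_METHANE = 21  # warming potential relative to CO2, as quoted in the text

def co2_equivalent(tonnes_ch4, gwp=GWP_METHANE):
    """Express a methane release as the mass of CO2 with the same
    warming effect, using the simple global-warming-potential multiplier."""
    return tonnes_ch4 * gwp

print(co2_equivalent(1.0))  # 1 tonne of CH4 warms like 21 tonnes of CO2
```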
3. Through combustion of biomass which oxidizes the carbon it contains, producing carbon dioxide (as well as other things, like smoke). Burning fossil fuels such as coal, petroleum products, and natural gas releases millions of tonnes of carbon that has been stored in the geosphere for millions of years. Fires also consume biomass and organic matter to produce carbon dioxide (along with methane, carbon monoxide, smoke), and the vegetation that is killed but not consumed by the fire decomposes over time adding further carbon dioxide to the atmosphere.
Wildfires and forest fires are likely to increase as land masses dry out with higher rates of evaporation.
4. Production of cement. A component, lime, is produced by heating limestone, a process that releases a substantial amount of carbon dioxide and so impacts the global carbon cycle.
5. At the surface of the oceans where the water becomes warmer, dissolved carbon dioxide is released back into the atmosphere.
6. Volcanic eruptions and metamorphism are part of the global carbon cycle and release gases into the atmosphere.
These gases include water vapour, carbon dioxide and sulphur dioxide.
Find out how volcanic gases are measured here.
Latest Trends and Cause for Alarm!
There has been a decline in the efficiency of natural land and ocean sinks which soak up carbon dioxide (CO2) emitted to the atmosphere by human activities (anthropogenic), according to findings published in late October 2007 in the Proceedings of the National Academy of Sciences of the US (PNAS).
The swift increase in atmospheric CO2 is due to faster economic growth coupled with a halt in carbon intensity reductions, in addition to natural sinks removing a smaller proportion of emissions from the air. Carbon intensity is the amount of carbon emitted to produce one dollar of global wealth.
The study's lead author, Dr Pep Canadell, executive director of the Global Carbon Project, explained "Fifty years ago, for every tonne of CO2 emitted, 600kg were removed by natural sinks. In 2006 only 550kg were removed per tonne and that amount is falling."
"In addition to the growth of global population and wealth, we now know that significant contributions to the growth of atmospheric CO2 arise from the slow down of natural sinks and the halt to improvements in carbon intensity."
The rising growth rate of atmospheric CO2 is generating climate forcings that are bigger and arriving sooner than expected.
By altering the global energy balance, these mechanisms "force" the climate to change.
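Canadell's figures translate directly into an "airborne fraction": the share of each emitted tonne of CO2 that stays in the atmosphere after natural sinks take their cut. A simple sketch using the numbers quoted above:

```python
def airborne_fraction(kg_removed_per_tonne):
    """Fraction of each emitted tonne (1000 kg) of CO2 that remains in the
    atmosphere after natural sinks remove their share (figures quoted
    from Canadell in the text above)."""
    return (1000 - kg_removed_per_tonne) / 1000

print(airborne_fraction(600))  # ~50 years ago: 0.40 of each tonne stayed airborne
print(airborne_fraction(550))  # 2006: 0.45 stayed airborne, and the share is growing
```

The shift from 0.40 to 0.45 is what "declining sink efficiency" means in practice: the same emissions now push more CO2 into the atmosphere.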
Taking Action
We already possess the scientific, technical, and industrial know-how to solve the carbon and climate problem for the next half-century. A concept known as "carbon wedges" proposes to limit the human contribution to the global carbon cycle, in an effort to reduce global warming.
Adoption of the wedge concept is essential if we are going to curb our extraordinary abuse of fossil-based fuels.
Go Green Girl!
Never underestimate a vegetarian hippie chick with a racecar.
Meet Leilani Munter.
After earning a degree in Biology from UC San Diego, Leilani got the racing bug after doubling for Catherine Zeta Jones in a driving stunt.
Capable of success behind the wheel of any type of racecar (she has raced NASCAR and Indy-type cars), Leilani uses the platform of racing to reach thousands of racefans with her environmental awareness message.
Her website, www.carbonfreegirl.com, is a story of her journey to be carbon neutral and "a diary of her lifetime project of sorting through the facts & fiction about alternative fuels, organic food, clean energy, green buildings, solar power, wind farms, composters, recycling, politics & more all while she makes a living driving a race car at 200 mph."
Additionally, Leilani hopes her efforts will encourage racing sanctioning bodies to increase their environmental initiatives with expanded recycling programs and the use of alternative fuels.
Leilani is competing to be a Green Girl in the Project Green Search campaign, aimed at inspiring people to align their careers with their environmental, social and humanitarian ethics.
We at GoingGreen.com endorse Leilani Munter to be the next Project Green Girl.
Voting is only through Friday, October 16.
To vote for Leilani or to pick your own favorite green girl, go to ProjectGreenSearch.com.
Better to have no deal at Copenhagen than one that spells catastrophe
The only offer on the table in Copenhagen would condemn the developing world to poverty and suffering in perpetuity
On the ninth day of the Copenhagen climate summit, Africa was sacrificed.
The position of the G77 negotiating bloc, including African states, had been clear:
a 2C increase in average global temperatures translates into a 3–3.5C increase in Africa.
That means, according to the Pan African Climate Justice Alliance, "an additional 55 million people could be at risk from hunger", and "water stress could affect between 350 and 600 million more people".
Archbishop Desmond Tutu puts it like this:
"We are facing impending disaster on a monstrous scale … A global goal of about 2C is to condemn Africa to incineration and no modern development."
And yet that is precisely what Ethiopia's prime minister, Meles Zenawi, proposed to do when he stopped off in Paris on his way to Copenhagen:
standing with President Nicolas Sarkozy, and claiming to speak on behalf of all of Africa (he is the head of the African climate-negotiating group), he unveiled a plan that includes the dreaded 2C increase and offers developing countries just $10bn a year to help pay for everything climate related, from sea walls to malaria treatment to fighting deforestation.
It's hard to believe this is the same man who only three months ago was saying this:
"We will use our numbers to delegitimise any agreement that is not consistent with our minimal position … If need be, we are prepared to walk out of any negotiations that threaten to be another rape of our continent … What we are not prepared to live with is global warming above the minimum avoidable level." And this:
"We will participate in the upcoming negotiations not as supplicants pleading for our case but as negotiators defending our views and interests."
We don't yet know what Zenawi got in exchange for so radically changing his tune or how, exactly, you go from a position calling for $400bn a year in financing (the Africa group's position) to a mere $10bn.
Similarly, we do not know what happened when secretary of state Hillary Clinton met Philippine president Gloria Arroyo just weeks before the summit and all of a sudden the toughest Filipino negotiators were kicked off their delegation and the country, which had been demanding deep cuts from the rich world, suddenly fell in line.
We do know, from witnessing a series of these jarring about-faces, that the G8 powers are willing to do just about anything to get a deal in Copenhagen.
The urgency does not flow from a burning desire to avert cataclysmic climate change, since the negotiators know full well that the paltry emissions cuts they are proposing are a guarantee that temperatures will rise a "Dantesque" 3.9C, as Bill McKibben puts it.
Matthew Stilwell of the Institute for Governance and Sustainable Development – one of the most influential advisers in these talks – says the negotiations are not really about averting climate change but are a pitched battle over a profoundly valuable resource:
the right to the sky.
There is a limited amount of carbon that can be emitted into the atmosphere.
If the rich countries fail to radically cut their emissions, then they are actively gobbling up the already insufficient share available to the south.
What is at stake, Stilwell argues, is nothing less than "the importance of sharing the sky".
Europe, he says, fully understands how much money will be made from carbon trading, since it has been using the mechanism for years. Developing countries, on the other hand, have never dealt with carbon restrictions, so many governments don't really grasp what they are losing.
Contrasting the value of the carbon market – $1.2 trillion a year, according to leading British economist Nicholas Stern – with the paltry $10bn on the table for developing countries for the next three years, Stilwell says that rich countries are trying to exchange "beads and blankets for Manhattan".
He adds:
"This is a colonial moment.
That's why no stone has been left unturned in getting heads of state here to sign off on this kind of deal … Then there's no going back.
You've carved up the last remaining unowned resource and allocated it to the wealthy."
For months now NGOs have got behind a message that the goal of Copenhagen is to "seal the deal".
Everywhere we look in the Bella Centre, clocks are ticking.
But any old deal isn't good enough, especially because the only deal on offer won't solve the climate crisis and might make things much worse, taking current inequalities between north and south and locking them in indefinitely.
Augustine Njamnshi of the Pan African Climate Justice Alliance puts the 2C proposal in harsh terms:
"You cannot say you are proposing a 'solution' to climate change if your solution will see millions of Africans die, and if the poor, not the polluters, keep paying for climate change."
Stilwell says that the wrong kind of deal would "lock in the wrong approach all the way to 2020" – well past the deadline for peak emissions. But he insists that it's not too late to avert this worst-case scenario. "I'd rather wait six months or a year and get it right because the science is growing, the political will is growing, the understanding of civil society and affected communities is growing, and they'll be ready to hold their leaders to account to the right kind of a deal."
At the start of these negotiations the mere notion of delay was environmental heresy.
But now many are seeing the value of slowing down and getting it right.
Most significant, after describing what 2C would mean for Africa, Archbishop Tutu pronounced that it is "better to have no deal than to have a bad deal".
That may well be the best we can hope for in Copenhagen.
It would be a political disaster for some heads of state – but it could be one last chance to avert the real disaster for everyone else.
Who Consumes All Those End-Of-Life Vehicles?
Have you ever wondered who consumes all the ELVs in this province?
If you have, you are not alone.
About 1 million new automobiles are produced in the world each week, and in Ontario approximately 500,000 vehicles are put to rest each year.
An End-Of-Life Vehicle is one which has reached the end of the road through age or a collision write-off.
The ELV is a vehicle that is going to be recycled.
Recycling the vehicle for its scrap value (about 75% of the vehicle by weight) is almost always the outcome, but the processes used to get there often differ.
And while some processes give our dear old beater a respectable burial (utilizing proper equipment and care in removing its fluids), others do not.
The issue of ELVs is obviously an important one, and governments around the world are taking proactive measures as they address the key issue of the environment.
Much of the focus has been directed at the automakers to make their vehicles more recyclable.
The ELV directive adopted last year requires all European automakers to recycle or reuse 85% by weight of the metals and materials in each vehicle by 2006, and 95% by 2015.
Significant progress is being made with the automakers, but their efforts alone do not solve the problem of what to do with the millions of vehicles that will eventually reach the end of the road.
At the Recycling Council of Ontario Roles and Responsibilities Forum on April 28, 1999, five distinct groups of businesses were identified that deal in the management of ELVs and irreparable vehicles.
These included:
1. Auto Dismantlers/Recyclers:
whose primary business is the sale of used auto parts.
Material recovery is secondary.
Fluid recovery is an integral part of processing each vehicle.
2. Salvage Yards:
whose primary business is metals recovery.
The sale of used auto parts is secondary.
Fluids are recovered if parts are to be removed.
3. Scrap Metal Dealers/ Junk Yards:
whose only business is metals recovery.
Fluids generally are not recovered.
4. Car Dealers, Body Shops, Tow-Trucks
5. Individuals:
this category includes backyard mechanics, and also auto thieves and organized international and domestic auto theft rings.
Of the five distinct groups of businesses involved in the management of ELVs, only the Auto Dismantler/Recycler ensures fluid recovery as an integral part of the processing of each vehicle.
There have been some very interesting shifts in what happens to the ELV.
In years past it was safe to say that most vehicles were purchased by Auto Dismantlers/Recyclers for processing and then passed on to the scrap processors to carry on the recycling process.
Today, this is not necessarily the case.
Salvage yards and Scrap Metal Dealers do not incorporate the same processes as Auto Dismantlers to ensure that proper fluid evacuation and disposal of these fluids takes place.
The underground economy, which acquires vehicles from the public or at salvage auctions (where anyone with any business license can attend), also has a detrimental effect on the community in that parts are bought and sold for cash.
The Government realizes no revenue from these transactions.
This fact, coupled with the realization that many salvage vehicles are being exported out of the country, accounts for the reduced availability of used parts for the public and the economic loss to professional collision repairers.
Carbon Trading:
A Brief Introduction
Rather than reducing emissions, carbon trading avoids the fundamental changes needed to mitigate climate change.
Alternatives must be developed together with local communities to prevent a repeat of the dispossession and social injustice caused by offsetting schemes, says Oscar Reyes.
21st September 2009 - Published by Carbon Trade Watch
Carbon trading is allowing industrialised countries and companies to avoid their emissions reduction targets. It takes two main forms:
"cap and trade" and "carbon offsetting."
What is cap and trade?
Under cap and trade schemes, governments or intergovernmental bodies set an overall legal limit on carbon emissions in a certain time period ("a cap") and then grant industries a certain number of licenses to pollute ("carbon permits"). Companies that do not meet their cap can buy permits from others that have a surplus – typically because they have been given an overly generous allowance in the first place.
They can also purchase "offsets."
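The permit arithmetic described above can be sketched as a toy model. Everything here is hypothetical – the firm names, allowances, emissions and permit price are invented purely to show how a surplus becomes a sellable asset and a shortfall becomes a cost; it does not model any real scheme.

```python
# Toy cap-and-trade settlement: each firm's surplus (allowance minus
# actual emissions) can be sold; a shortfall must be covered by buying
# permits. All figures below are hypothetical.

def settle(firms, permit_price):
    """Return each firm's permit surplus and the resulting cash flow.

    firms: dict mapping name -> (allowance, actual_emissions) in tonnes.
    A positive cash_flow means revenue from selling spare permits;
    a negative one means the cost of buying permits to cover a shortfall.
    """
    ledger = {}
    for name, (allowance, emissions) in firms.items():
        surplus = allowance - emissions  # may be negative
        ledger[name] = {
            "surplus": surplus,
            "cash_flow": surplus * permit_price,
        }
    return ledger

firms = {
    "PowerCo": (100, 120),  # under-allocated: must buy 20 permits
    "SteelCo": (150, 110),  # over-allocated: 40 spare permits to sell
}
result = settle(firms, permit_price=10.0)
print(result)
```

Note that total emissions (230 tonnes) can sit exactly at the combined cap (250) while the over-allocated firm still books a profit without cutting anything – which is the "hot air" problem the text goes on to describe.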
What are carbon offsets?
Carbon trading runs in parallel with a system of carbon offsets. Instead of cutting emissions themselves, companies, and sometimes international financial institutions, governments and individuals, finance "emissions-saving projects" outside the capped area to generate carbon credits which can also be traded within the carbon market.
The UN's Clean Development Mechanism (CDM) is the largest such scheme with almost 1,800 registered projects in developing countries by September 2009, and over 2,600 further projects awaiting approval.
Based on current prices, the credits generated by approved schemes will cost around $35 billion by 2012.
Although offsets are often presented as emissions reductions, what these projects do at their hypothetical best is to stabilise emission levels while moving them from one location to another, normally from Northern to Southern countries. In practice, this "best case" scenario is rarely seen, with the result being that offsetting increases emissions whilst also exacerbating social and environmental conflicts.
So what's wrong with cap and trade?
There are fundamental theoretical flaws in the whole cap and trade scheme even before you look at the actual record of its implementation.
This is because the scheme was never set up to directly tackle the key task of a rapid transition away from fossil fuel extraction, over-production and over-consumption, but sought instead to quantify existing pollution as a means to create a new tradable commodity.
Within this framework, traders invariably opt for the cheapest credits available at the time, but what is cheap in the short-term is not the same as what is environmentally effective or socially just.
Some of the key problems with the cap and trade approach are:
- The "trade" component does not reduce any emissions. It simply allows companies to choose between cutting their own emissions or buying cheaper "carbon credits," which are supposed to represent reductions elsewhere.
- The "cap" has too many holes and sometimes caps nothing.
The cap is only as tight as the least stringent part of the whole system.
This is because credits are sold by those with a surplus, and the cheapest way to produce a surplus is to be given too many credits in the first place ("hot air" credits as a result of caps being set too high). The aim of trading is to find the cheapest solution for polluting industry, and it is consistently cheaper to buy "hot air" credits than to actually reduce emissions.
Cap setting is a political process that is highly susceptible to corporate lobbying, which means that there is invariably over-allocation of pollution permits. In fact, lobbying is encouraged through extensive industry "stakeholder" involvement.
- Offsets loosen the cap.
While cap and trade in theory limits the availability of pollution permits, "offset" projects are a licence to print new ones. When the two systems are brought together, they tend to undermine each other – since one applies a cap and the other lifts it.
An offset is essentially a permit to pollute beyond the cap.
Most current and proposed "cap and trade" schemes allow offset credits to be traded within them – including the EU Emissions Trading Scheme (EU ETS) and the US cap and trade scheme (proposed in the 2009 American Clean Energy and Security Act, ACES).
Will markets concerned with growth be able to deliver carbon reductions?
The other problem is that markets are in essence growth-oriented, and so look for new sources of accumulation.
In carbon markets, this is achieved by increasing their geographical scope and the number of industrial sectors and gases they cover. Yet this contradicts the essence of tackling climate change which is about reducing use of fossil fuels and consumption.
It is therefore not a surprise that introducing carbon as a commodity has resulted in new opportunities for profit and speculation.
The carbon market is already developing along the same lines as the financial markets, with the use of complex financial instruments (futures trading and derivatives) to hedge risk and increase speculative profit.
This runs the risk of creating a "carbon bubble."
This is not a surprise, as it was created by many of the same people at the Chicago Climate Exchange who created the derivatives markets that led to the recent financial crash.
What examples have there been of Cap and Trade schemes?
There have been a number of cap and trade markets – the EU ETS, the United States Acid Rain Program, Los Angeles' Regional Clean Air Incentives Market (RECLAIM), the Chicago Emissions Reduction Market System (ERMS) and the Regional Greenhouse Gas Initiative.
The EU ETS, established in January 2005, is the largest cap and trade scheme in operation worldwide and is the best for illustrating how carbon trading has failed in practice.
Why does the European Union Emissions Trading Scheme (EU ETS) consistently over-allocate pollution permits?
Most cap and trade markets use projections of historical emissions provided by industry itself to calculate the initial caps. Industry has a clear incentive to overstate its past emissions to gain more credits. As a result, cap and trade markets start out with too many permits. This was true of the EU ETS, which consistently awarded major polluters more free pollution permits (called EUAs, European Union Allowances) than their actual level of carbon emissions. This meant it gave them no incentive to reduce emissions, and as a result the price of the permits collapsed – ending 2007 at €0.01.
In phase I (2005-2007) as a whole, according to the EU's own data, major polluters had permits worth 3.4 per cent more than their actual level of emissions.
But didn't the second phase of the EU ETS (2008-2012) resolve this over-allocation?
The EU claims that it has learned from its mistakes and that the second phase of its scheme is working.
Whilst it is true that for the first time in 2008, polluters were awarded fewer permits than their actual level of emissions, there is still over-allocation of permits:
- The vast majority of factories and economic sectors are still over-allocated – it is only the power sector that needs to purchase credits
- The impact of the EU-wide recession means that the ETS as a whole will again be over-allocated in 2009
- Corporations get the same number of credits even if they temporarily close or scale down operations for short-term economic reasons
But isn't Phase II nevertheless leading to emissions reductions?
The EU claims emissions reductions of 3 per cent, or 50 million tons, in ETS sectors in 2008.
The trouble is that at least 80 million tons of "carbon offsets" in the developing world were bought as part of the ETS in 2009 – more than the level of the cap.
So, again, the ETS does not require emissions reductions by companies in the EU.
Moreover there is also evidence that some of the supposed "cuts" are fake.
One such example is Lithuania which claimed it would be forced to use coal-powered electricity as a result of the closure of Ignalina, a nuclear power plant.
As a result it gained a large surplus of credits, which have been sold on and treated as "emissions reductions" elsewhere.
So who profited from carbon trading?
Companies receive most carbon credits for free.
This is equivalent to a subsidy – and with allocations made on the basis of historical emissions, the largest subsidy goes to the dirtiest industry (especially coal-fired power plants).
Windfall profits also arise from an accounting trick around "opportunity costs."
Power companies choose to do the cheapest thing to meet their ETS target – which is usually buying Clean Development Mechanism (CDM) credits – but pass on costs as if they were doing the most expensive thing: actually reducing emissions. Even power companies receiving free credits from the ETS have nevertheless passed on the cost of these credits to consumers. Research by market analysts Point Carbon and WWF calculated that the likely "windfall" profits made by power companies in phase II could be between €23 and €71 billion, and that these profits were concentrated in the countries with the highest level of emissions.
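The "opportunity cost" windfall works by simple arithmetic: a company receives its permits for free but prices its output as if it had bought them at the market rate. The sketch below is hypothetical – the permit volume, price and pass-through rate are invented for illustration and do not correspond to any real company's figures.

```python
# Hypothetical illustration of the opportunity-cost windfall: permits
# received free of charge are priced into consumer bills anyway.

def windfall_profit(free_permits, permit_price, pass_through=1.0):
    """Revenue gained by charging customers for permits that cost nothing.

    free_permits: number of permits received for free (tonnes CO2).
    permit_price: market price per permit (euros).
    pass_through: fraction of the permit price passed on to consumers.
    """
    return free_permits * permit_price * pass_through

# Made-up numbers: 50 million free permits, €20 each, 80% passed through.
profit = windfall_profit(free_permits=50_000_000,
                         permit_price=20.0,
                         pass_through=0.8)
print(f"windfall: €{profit:,.0f}")  # €800,000,000
```

The key point the calculation makes concrete: the windfall scales with the number of free permits, so – as the text notes – the largest subsidy flows to the dirtiest, most heavily allocated firms.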
ArcelorMittal, the world's largest steel company, is another typical example.
It routinely receives a quarter to a third more credits than it would need to even begin reducing emissions. The company is likely to have made over €2 billion in profits from the ETS between 2005 and 2008, with over €500 million of this achieved in 2008 alone – yet has needed to make no proactive changes to its emissions to do so.
What about phase III of the EU ETS?
EU ETS phase III runs from 2013 to 2020, and the debate in Brussels is focussed on the risk of "carbon leakage."
This relates to industry claims that strict regulations in one part of the world will encourage outsourcing to locations where regulations are weaker. It is already being used as a blackmail tactic by industry to reduce its targets or obligations within the EU ETS (and other proposed schemes in Australia and the US). Over half of the 258 industrial sectors in Europe being assessed for exposure to carbon leakage under the EU ETS will qualify for free emission allowances from 2013, according to an initial assessment by the European Commission.
So what is the problem with carbon offsetting?
Carbon offsets allow companies and countries to avoid cutting their own emissions by buying their way out of the problem with theoretical reductions elsewhere.
There are both inter-government schemes – most famously the UN Clean Development Mechanism (CDM) - as well as voluntary programmes undertaken largely for purchase by individual consumers. Unfortunately both systems are deeply flawed:
- Selling stories. Offsetting rests on "additionality" claims about what "would otherwise have happened," offering polluting companies and financial consultancies the opportunity to turn stories of an unknowable future into bankable carbon credits. The EU admits that at least 40 per cent of these are bogus, while a survey by International Rivers found over 60 per cent of projects to be "non-additional."
- Offsets increase emissions. The net result for the climate is that offsetting tends to increase rather than reduce greenhouse gas emissions, displacing the necessity to act in one location with a theoretical claim to act differently in another. Moreover, it keeps delaying any real domestic action and allows the expansion of fossil fuel extraction.
- Making things the same.
The value of CDM projects is premised on constructing a whole series of dubious "equivalences" between very different economic and industrial practices, with the uncertainties of comparison overlooked to ensure that a single commodity can be constructed and exchanged.
This does not alter the fact that burning more coal and oil is in no way eliminated (and certainly not in the same time frame) by building more hydro-electric dams, planting more trees or capturing the methane in coal mines.
Carbon offsets have serious negative social and local environmental impacts
The use of "development" rhetoric masks the fundamental injustice of offsetting, which hands a new revenue stream to some of the most highly polluting industries in the South, while simultaneously offering companies and governments in the North a means to delay changing their own industrial practices and energy usage.
In practice, carbon offset projects have more often than not resulted in land grabs, local environmental and social conflicts, the displacement of Indigenous Peoples from their territories, and the repression of local communities and movements.
Might reforestation programmes such as REDD work?
The inclusion of tree planting and other "sinks" projects in the CDM and cap and trade schemes is also under consideration.
These pose additional measurement problems, as many projects are not additional, are difficult to measure, do not include the upkeep of the trees and assume instant absorption of already released carbon – when in fact it will take thousands of years for the carbon to be absorbed.
"Reforestation" also tends to count monoculture plantations as forests, but they are not: they lack biodiversity and so contribute to soil degradation, and they require intensive synthetic fertilisers, which contribute significantly to climate change, pollute water and damage local peoples' health.
Schemes for Reducing Emissions from Deforestation and Degradation (REDD) repeat the error of emissions trading by commodifying forests. They presume that deforestation happens because standing forests make less money than forests that are cut down.
In fact, the commodification of forests is what drives deforestation.
This commodification includes the role of corporate and development bank investment in new infrastructure, mining and oil extraction projects; industrial logging; and land clearance to make way for monoculture plantations for the pulp and paper and palm oil industries. REDD is likely to fuel property speculation and so exacerbate land conflicts, dispossessing Indigenous Peoples and forest communities.
What impact will new trading schemes have on offsetting and forest carbon markets?
The most active buyers of offset credits in 2008 were European companies, which bought 80 million credits from the CDM or Joint Implementation projects (a similar UN scheme, operated in countries which have emissions reduction commitments under the Kyoto Protocol) as either a cheaper alternative to reducing their own emissions (under ETS), or for the purpose of speculation and re-sale.
But this market is likely to expand massively if the American Clean Energy and Security Act (ACES) is passed, which proposes to allow US companies to purchase from 1 to 1.5 billion international offsets every year. This would spur a massive increase in damaging offset projects, putting enormous pressure on the already inadequate checks on their environmental integrity.
What are sectoral credits?
Sectoral credits would introduce new offsets as part of what are called Nationally Appropriate Mitigation Actions (NAMAs) in the climate policy jargon.
This is one of a number of proposals currently being debated for inclusion in a new UN climate treaty. The basic idea is that developing countries should commit to reducing their greenhouse gas emissions "in an indicative range below business as usual," as the draft of the G8's L'Aquila declaration in July 2009 puts it.
This deviation from an assumed future trajectory would be counted as a "reduction" (although it need be nothing of the sort) and traded to help industries in developed countries avoid reducing their own emissions. The private money flowing through these carbon markets could also be "double counted" as part of the financial commitment that the industrialised countries agreed to make at the UN Climate Conference in Bali.
But isn't carbon trading better than nothing?
No. As carbon trading helps to avoid change and even increases emissions while exacerbating local conflicts, it is not a question of alternatives to carbon trading but rather of taking measures that actually tackle climate change.
So what are the alternatives?
Carbon markets should be dismantled, starting with offsets. A clear intention to discontinue carbon markets can fatally undermine them even in advance of legislative action.
Alternatives then need to be developed that are properly consulted and developed together with local communities to prevent a repeat of the dispossession and social injustice caused by offsetting schemes.
A range of different approaches will be needed but may include:
- Recognition of existing climate solutions. The vast range of solutions that already exist – which tend to be distinguished by their sensitivity to the local contexts in which they operate – are overlooked in favour of large-scale "technological fixes" or market-based schemes.
- Leave fossil fuels in the ground.
Proposals to halt new coal power plants and the exploration of new and often "unconventional" sources of oil extraction are at the frontline of the struggle for climate justice – and should form part of a rapid transition to a post-fossil fuel economy.
- Rediscovering environmental protection.
There are a broad range of environmental policy instruments that have proven to be more effective than market-based approaches – ranging from efficiency standards for electrical appliances and buildings to feed-in tariffs for renewables. The rediscovery of such measures could form part of a solution.
- New revenues:
tax and/or end currency and fuel speculation.
Rather than a regressive carbon tax, revenue can be generated by a tax on currency speculation. A heavy tax or an end to speculation on fossil fuel prices would also help as a transitional measure.
This should be accompanied by pro-active policy measures to tackle fuel poverty, such as a ban on pre-pay metering.
- Renewable energy should be supported, but not uncritically – with the involvement of local populations, and not as a basis for sustaining expansions in fossil fuel use or supporting an unsustainable model of industrial expansion.
- Public energy research.
Private research on energy alternatives and use favours "least cost" false solutions (e.g. agrofuels, hydroelectric dams, nuclear power) rather than environmentally effective alternatives, and so is less effective than public research.
However, this would need to be allied with the democratic transformation of the institutions of "environmental governance," the agenda for which currently tends to be set by transnational corporations.
- Re-estimating energy demand.
Current models presume limitless growth and overstate future energy demand, which has encouraged oversupply and kept prices low – which is, in turn, a key structural driver of over-consumption.
- The Transition Towns movement is going some way towards re-estimating demand with its "Energy Descent Action Plans", but lacks a structural analysis of heavy industry use (or capitalist accumulation) and is often divorced from organising for more equitable distribution of energy.
- Changing economic calculations. Cost-benefit accounting either fails to take account of environmental or social costs, or is grossly reductionist in its assumptions.
- Challenging the "growth" fetish.
It is often claimed that continued GDP growth can go hand in hand with reductions in emissions. However, there is no evidence that "advanced" economies are significantly reducing their carbon footprints, or that such a transformation could happen quickly enough to reduce emissions. On the positive side, GDP is a very poor indicator of human well-being, so it is not a condition for social improvement or a good life.
If the obsession with economic growth is set aside, it becomes easier to see how tackling climate change and maintaining a sustainable and enjoyable life are far from contradictory goals.
The Human Brain is Made for Environmental Complacency
Many governments are elected on platforms promising to address climate change, but fail to implement meaningful environmental policies once in office.
Can this be explained by human psychology – and will it take a local climate catastrophe for them to finally act, asks Chris Goodall.
Most governments in the developed world were elected on platforms that promised aggressive policies on greenhouse gas emissions. The reality has not matched the commitments made.
The reasons for this are multitudinous and no one should ever underestimate the difficulties of weaning advanced societies off the use of cheap and convenient access to fossil fuels.
But in addition to the standard reasons for slow progress we can see a large number of obstacles that spring from human psychology.
In particular, some of the resistance to aggressive action on climate seems to spring from mental attitudes that may have helped us survive as a species in the past.
Perhaps politicians intuitively recognise the existence of these barriers. So they continue to say that climate change is the most important problem facing humanity at the same time as adding new runways to the local airport or sanctioning the development of new coal-fired power stations.
I see two groups of reasons why action on climate change is not as fast or as effective as the scientific consensus suggests is necessary.
First, although we are constantly fed with information on the severity of the threat, at some subconscious level most people believe that climate change is not dangerous.
Second, the desire to protect future generations – and current generations who live far from us – is much less well entrenched in human thinking than we piously assume. 'What has posterity ever done for us?' The phrase may now be ridiculed, but it contains a worrying truth.
Optimism Bias
Human beings seem to have a psychological predisposition towards believing matters will eventually turn out well.
The phrase 'optimism bias' is sometimes used to describe this phenomenon.
We see this in many different circumstances. In the planning of a new construction project, for example, the costs are routinely underestimated.
The UK Department for Transport web site says that 'there is a demonstrated, systematic, tendency for project appraisers to be overly optimistic and that to redress this tendency appraisers should make explicit, empirically based adjustments to the estimates of a project's costs, benefits, and duration.'
In the case of climate change, we may unconsciously have a similar bias. Although the results from scientific work seem increasingly worrying, many of us may be saying at the back of our minds that the concerns are exaggerated.
Inherent optimism may have helped our ancestors and ourselves cope with present adversity and future threats. It does not help us deal with a long-distant and highly uncertain set of risks from rising temperatures and changing climate patterns.
Central Estimate Bias
Humans tend systematically to over-estimate the tightness of the distribution of likely outcomes (loosely speaking, we underestimate the width of the 'bell curve'). Ask an individual a question on a subject about which they know little and then request an estimate of the probability that his or her answer is nearly right.
People will routinely be far more certain than they should be.
Examples might be a question that asked how many species there are on the planet or the number of books published a year. People don't generally know the answer but will nevertheless be far more confident than they should be about the general correctness of their estimate.
This phenomenon helps us to be usefully decisive.
Rather than endlessly discussing which way to go to hunt, perhaps our ancestors found it useful to have an exaggerated certainty.
This phenomenon allows leaders to get groups to engage in purposeful action.
Unfortunately this human attribute is not helpful when it comes to climate change.
The world faces a high degree of uncertainty about the impacts of warming, with a very wide distribution of possible outcomes. We may experience 1.5 degrees of temperature rise or it may be four times that level.
Humankind can probably cope with the smaller number but the larger figure will make most of the globe uninhabitable.
Similarly, the Siberian permafrost may melt, causing an outpouring of methane that rapidly destabilises the world climate.
Or it may not.
It is the width of bell curve of outcomes that should worry us, but we naturally tend to compress the range of outcomes into a much tighter range than is justified.
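The effect of compressing the bell curve can be made concrete with a small numerical sketch. The central estimate, the "catastrophic" threshold and both standard deviations below are hypothetical values chosen only to show how an overconfident (narrow) distribution all but erases the tail risk that a wider, more honest distribution keeps in view; this is not a climate model.

```python
# Illustration: same central warming estimate, two different levels of
# confidence about the spread. The narrow curve is the "overconfident"
# one described in the text. All numbers are hypothetical.
from statistics import NormalDist

central = 3.0    # assumed central warming estimate (degrees C)
threshold = 5.0  # hypothetical "catastrophic" threshold (degrees C)

overconfident = NormalDist(mu=central, sigma=0.5)  # compressed bell curve
honest = NormalDist(mu=central, sigma=1.5)         # wider, more realistic

p_over = 1 - overconfident.cdf(threshold)  # tail risk under overconfidence
p_honest = 1 - honest.cdf(threshold)       # tail risk with honest spread

print(f"narrow curve: P(> {threshold} C) = {p_over:.5f}")
print(f"wide curve:   P(> {threshold} C) = {p_honest:.5f}")
```

With these illustrative numbers the narrow curve puts the chance of exceeding the threshold at a few in a hundred thousand, while the wide curve puts it near one in eleven – the same central estimate, radically different risk.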
The books about climate change from Nigel Lawson and Bjorn Lomborg are particularly good examples of this. Having been sceptics about the existence and then the severity of climate change, both authors write with excessive certainty that the eventual temperature rise is going to be about 2 degrees. The overconfidence is their way of getting us all to underestimate the risks of unpredicted climatic change.
Problems Dealing with a High Noise-to-Signal Ratio
We are much more aware of weather than we are of climate.
In countries like the UK, the variability of the weather is high.
It can be sunny and 25 degrees one day and rainy and 15 degrees the next.
This variability (or 'noise') tends to drown out the underlying 'signal' (changes in the climate).
A coldish 2008/9 winter in the US may be connected with recent opinion poll evidence that shows increasing numbers of people thinking that the potential effects of climate change are exaggerated.
People in countries with a lot of weather can always find data that supports whatever opinion they happen to hold on climate change.
This is unsurprising:
survival in past centuries depended more on the weather in the crop growing season than it did on the climate.
It may be no accident that countries with less weather and more climate seem to have smaller percentages of their population denying the existence of climate change.
The rapid spread of deserts in eastern China is obvious, and perhaps is correlated with polls showing the Chinese are among the most worried by the effects of climatic change.
Assumption of Exaggeration in those Trying to Persuade Us
Scientists are increasingly seen as salespeople trying to 'sell' their research findings. Correctly or otherwise, ordinary people seem to believe that the conclusions in scientific papers are biased by the need to impress the journalists that cover the topic, who then amplify the results in order to attract attention from their readers.
The general population tends to discount the findings, presuming them to be exaggerated and distorted by the need to show increasingly bad outcomes. A cynical citizenry may also believe that striking results are more likely to get the authors future research funding.
I think it is probable that pressure from the press does slightly distort scientific research and, being human, scientists may sometimes amplify their concerns in order to attract attention.
But the huge majority of climate change work is carefully designed and robust.
Most people in the general population don't know that the process of peer review will tend to dampen, not exaggerate, the upsetting implications of a new piece of research.
An Underlying Faith in Smoothly Adjusting and Self-Correcting Processes
The latter half of the twentieth century brought a profound change in the way that people in developed countries saw their world.
Effective markets meant that prices generally quietly and unobtrusively adjusted supply and demand so that crises of availability became rare.
Although there are good counter-examples, such as the severe depletion of Atlantic fish stocks, markets have been generally very good at dealing with temporary disruptions. For example, it's possible that a smaller percentage of people have starved to death in the last generation than for any comparable period in the last thousand years.
We may have partly lost the ability to comprehend the risk of sudden and unpredictable environmental collapse.
Perhaps our pre-industrial ancestors would have understood the threat from catastrophic climate change much better than we can.
Until the recent implosion of large parts of the banking system, trade and financial flows seemed superb at avoiding the awful effects of natural disasters and other extreme events. We have gradually lost the sense that food or raw material shortages can get worse and worse.
So an escalating and near-irreversible climate change threat – a classic 'commons' problem, like the over-fishing of many of the world's seas – is not fully comprehended by the modern mind-set.
The liberal capitalism of the last twenty years has been so successful that we have become blind to potential threats from environmental collapse.
The examples of such crises in the past – from Easter Island through to soil degradation in the US in the Dust Bowl years – are now ignored.
The dominance of what might be called the economist's model of the world is under threat from the deepening recession.
The invisible hand is now looking a little arthritic.
But for the last thirty years it has provided the standard ideological framework in Anglo-Saxon economies. We are, in the words of Keynes, all the slaves of some defunct economist.
Whether we like to acknowledge it or not, the way we think still owes much to Milton Friedman and his friends. The discontinuities, non-linearities and tipping points of climate change will require us to reprogramme our minds. It will take many years.
I remember intellectuals like Keith Joseph acting as the nuclei around which free-market liberalism began to form in the mid-seventies. One looks in vain for similar cells of green philosophers or economists now.
The Lack of an Observable Enemy
CO2 is invisible, largely innocuous except for its absorption of certain frequencies of infra-red radiation, and it is a natural part of the carbon cycle.
It sustains living systems and helps maintain the planet at a habitable temperature.
These are not the usual characteristics of an environmental enemy.
Depletion of the ozone layer was an easier problem to address.
A small number of manufacturers were making CFCs for a limited number of uses and the effects on the stratosphere were clear to even the sceptics of the day.
It was possible to begin the process of phasing out their use without too many obstacles because the enemy was obvious.
Human societies have always sought to identify enemies, whether it be racial minorities, foreigners with different ideologies or people who simply don't fit in.
But with CO2, the opponent is not easy to locate.
We all produce carbon dioxide, and greenhouse gases are not only invisible, which makes the problem difficult to see, but also all-pervasive.
We do not even know how to start battling the opponent – some say we should ban leisure flying, others suggest we need to stop burning coal, increase forest cover or turn down the thermostat.
Compared to the usual cry of 'repel the barbarians', this doesn't make for effective warfare on CO2.
The lack of progress on greenhouse gas reduction has unnerved many activists, who now devote far too much attention to fighting among themselves rather than leading the charge against the shared enemy.
Unknown Unknowns
Donald Rumsfeld's contribution to world history will be dominated by his disastrous actions before and after the Iraq War. But the useful restatement of the idea of 'unknown unknowns' will merit a footnote in his Wikipedia entry.
Getting people to accept even the possible existence of unknown unknowns in climate science or in other fields is difficult.
It was always thus.
Perhaps the more successful of our ancestors found it generally not useful to worry too much about the things we don't even know we don't know.
Many global warming scientists intuitively understand this.
Although they should be telling us that they really don't understand many aspects of the climate system, they fear that the admission of any ignorance will reduce their credibility.
They tend to give us an exaggerated impression of the certainty with which we understand things or, more correctly, know what we don't know.
The Tsunami and the Brandt report
Since the tsunami world opinion has shifted.
People have been so moved by the plight of the people in the devastated areas that they have begun to talk about poverty and injustice in other parts of the world, such as Africa, say Mohammed Mesbahi and Dr. Angela Paine.
1st February 05 - Mohammed Mesbahi and Dr. Angela Paine, STWR
The response of the world public to the tsunami disaster on the 26th December 2004 was (and continues to be) one of heartfelt empathy and an instinctive desire to help fellow human beings in trouble.
Never before have so many people, from so many countries given so much to the victims of a disaster. World governments have been shamed into promising far greater sums of aid than they originally wanted to offer by the sheer magnitude of the public's generosity.
The US initially pledged $15 million but in the end promised $350 million, while the UK government felt obliged to raise its pledge to $96 million, still a tiny fraction of the money these governments have so far spent on the war in Iraq ($148 billion by the US and $11.5 billion by the UK).
As George Monbiot says, the UK has spent almost twice as much on the war in Iraq as it spends annually on aid to the third world.
The US gives just over $16 billion in foreign aid:
less than one ninth of the money it has so far burnt in Iraq.
How many people realise, however, as Devinder Sharma points out, that many of the deaths caused by the Tsunami could have been prevented?
The area affected has been hit by tsunamis in the past, with far fewer deaths resulting, because the coastlines of South East Asia were protected by a natural defence system, composed of coral reefs and mangrove forests. Many of the previous tsunamis were tamed by the coral reefs before hitting the coast, where they were absorbed by a dense layer of red mangrove trees. These flexible trees, with long branches growing right down into the sand below the surface of the sea, absorb the shock of tsunamis. Behind the red mangrove trees there is a second layer of black mangrove trees, which are taller and slow down the waves.
Thousands of miles of coastline in South East Asia were densely covered in mangrove forests, protecting the coastline from erosion, absorbing carbon dioxide and providing a breeding ground for crustaceans and fish, on which the local population depended for their livelihood.
This was a fragile environment, which ecologists have long recommended should enjoy special protection.
In India a Coastal Regulation Zone (CRZ) was created to protect a 500 meter buffer zone along the coast.
While the belt of mangrove forest still existed, the people of the area lived inland, behind it.
In 1960 a tsunami hit the coast of Bangladesh in an area where the mangroves were intact.
No-one died.
These mangroves were subsequently cut down by the shrimp (prawn) farming industry and in 1991 thousands of people were killed when a tsunami of the same magnitude hit the same region.
On Dec 26th 2004, Pichavaram and Muthupet in South India, which still have their mangrove forests, suffered fewer casualties than the surrounding mangrove-less stretches of coast.
Mangroves also acted as a barrier, helping people to survive on Nias Island, Indonesia, close to the epicentre of the Dec 26 tsunami.
Burma and the Maldives suffered less from the tsunami because the shrimp and tourism industries had not yet destroyed all their mangroves and coral reefs.
Since the 1960s, the mangrove forests of South East Asia have been systematically destroyed to make way for commercial shrimp (prawn) farming and a massive increase in the tourism industry.
The aquaculture and tourism industries succeeded in diluting any protective regulations that were in place, until they were able to take over most of the buffer zone.
Almost 70% of the region's mangrove forests have now disappeared.
Since three quarters of South East Asian commercial fish species spend part of their life cycle in the mangrove swamps the loss of these swamps has resulted in declining fish harvests. To compound this situation, the commercial feeds, pesticides, antibiotics and non-organic fertilizers used in intensive shrimp farms have generated huge amounts of pollution, destroying the remaining fish and harming the coral reefs.
As the fish have declined, desperate fishermen resorted to dropping dynamite into the reefs to drive them out.
Scientists working for the UN Environment Programme (UNEP) have recently compiled The World Atlas of Coral Reefs, an underwater survey.
They found that one third of the world's coral reefs are in South-east Asia and almost all are under threat. 70% of the world's coral reefs have already been destroyed. 80% of Indonesia's reefs are in danger. Dynamite fishing has contributed to the destruction of an ecosystem already under threat from sediment erosion caused by the loss of mangrove forests, shrimp farm pollution and untreated sewage from the tourism industry.
According to Susan Stonich, a professor at the University of California, international corporations based in the first world but operating in the third world produce 99% of farmed shrimp.
But almost all of it is eaten in the US, Western Europe and Japan, where consumption has increased by 300% in the last ten years. Today world shrimp production, in an industry worth $9 billion, is almost 800,000 metric tons and 72% of farmed shrimp comes from Asia.
Hundreds of nongovernmental organizations have sprung up at local, national and international levels to oppose this destructive aquaculture industry.
In 1997 the Industrial Shrimp Action Network (ISA Net) was formed, a global alliance opposed to unsustainable shrimp farming.
Aquaculture corporations responded by forming the Global Aquaculture Alliance (GAA) to counter the claims of the ISA Network.
Commercial shrimp farming has displaced local communities, exacerbated conflicts, decreased the quality and quantity of drinking water and decimated the natural fish species on which the local population rely.
The population of these areas ended up living right on the coast, without the benefit of their protective mangrove forests. Their coral reefs were by now eroded by pollution, dynamite fishing, tourists (who tread on the reefs) and the rising temperature of the sea.
The reason why the aquaculture and tourism corporations have been allowed to destroy the coastal environment of South East Asia is because the current neoliberal trade system favours corporations over and above all concerns for the environment and the people living in it.
Trade liberalisation, through the World Trade Organisation, has enabled corporations to challenge the legislation of the countries they wanted to operate in, legislation that was designed to protect the local environment.
Ecological and human disasters such as the 2004 tsunami will continue to occur as long as the current Global Economic system is allowed to exist in its present form.
Way back in the 1980s Willy Brandt warned that the current global economic system, with its emphasis on profit at all costs, would lead to environmental degradation and worsening poverty in the third world.
He said "Important harm to the environment and depletion of scarce resources is occurring in every region of the world, damaging soil, sea and air. The biosphere is our common heritage and must be preserved by cooperation – otherwise life itself could be threatened" (North-South, pp. 72-73). How prophetic these words sound today.
He set up the Independent Commission on International Development Issues to make an in-depth study of the global economy.
His team of advisers included many experts in the field of international policy and economics. Their detailed report came to the conclusion that the developed nations dominated international trade and that this was unbalanced and biased in favour of large corporations based in the West.
The Brandt Commission was the first major independent global panel to examine connections between the environment, international trade, international economics and the third world.
The United Nations Conference on Environment and Development took Brandt's proposals regarding the environment seriously enough to hold international conferences in Rio in 1992 and in Kyoto in 1997.
However America refused to sign the Kyoto Protocol and corporate power prevented any of the Brandt Report recommendations being put into practice.
The Brandt Reports called for a complete restructuring of the global economy, in order to protect the environment and meet the needs of the world population.
Willy Brandt said "We see a world in which poverty and hunger still prevail in many huge regions; in which resources are squandered without consideration of their renewal; in which more armaments are made and sold than ever before; and where a destructive capacity has been accumulated to blow up our planet several times over".
He proposed a Summit of World Leaders, with the backing of a global citizens' movement, to discuss how to meet the needs of the majority of the world's people.
This would, he recognised, mean reforming the international economy.
He proposed a series of measures, including:
An emergency aid program to assist countries on the verge of disaster
Third world debt forgiveness
Fair trade
The stabilisation of world currencies
A reduction in the arms trade
Global responsibility for the environment
A major overhaul of the global economic system.
Brandt also recognised that poverty contributes to high birth rates and that overpopulation puts pressure on the environment.
This has indeed happened all over the world, including South East Asia.
Two decades later, world leaders had not responded to any of Brandt's proposals in any meaningful way.
They continued to allow an ever increasing export of arms to some of the most repressive regimes, and public apathy towards the plight of the world's hungry billions continued.
In the 1980s Brandt was calling for preventive action and his proposals were falling on deaf ears. Only now is preventive action beginning to be taken seriously.
The World Bank estimates that losses caused by disasters in the 1990s could have been cut by $280 billion if $40 billion had been spent on preventive measures. Whether protection of the environment came into the equation is not clear but surely the preservation of the coastal environment of South East Asia was more important than providing a luxury item of food to the US, Europe and Japan.
Brandt also called for coordinated relief programmes for areas where disasters had already occurred.
Only one organisation has the people and the close relationships with governments to make coordinated disaster aid work, the UN's Office of Coordination of Humanitarian Affairs (OCHA). Yet immediately after the tsunami world leaders were in disagreement over coordination of the relief operation.
George Bush refused to cooperate with the UN because of his long-running differences with the UN leadership.
World opinion eventually forced him to recognise the need for cooperation with the OCHA for the smooth running of the disaster relief.
However the OCHA is far from perfect, partly because it has not been given the support it needs by all the member countries of the UN. Willy Brandt recognised that the UN needed to be restructured to make it democratic and effective and all the UN agencies needed to be reformed to make them more efficient.
He called for emergency programs for food, housing and healthcare to be coordinated.
He recommended cutting the red tape to ensure that resources reached impoverished people directly, unfiltered through inefficient bureaucracy.
He called for national projects, overseen by representatives from developed and developing nations.
He recommended that instead of fighting wars, armies and navies from the developed world could be deployed to bring in the food, resources and technology needed to help poor nations reverse hunger and poverty.
This has indeed been happening since the tsunami.
Armies and navies have indeed been bringing food, resources and technology to the disaster areas. Ironically, as George Monbiot points out in the Guardian Jan 4, the US marines who have been sent to Sri Lanka to help the rescue operation were, just a few weeks ago, murdering the civilians, smashing the homes and evicting the entire population of the Iraqi city of Falluja.
Since the tsunami world opinion has shifted.
People have been so moved by the plight of the people in the devastated areas that they have begun to talk about poverty and injustice in other parts of the world, such as Africa.
Some of the poorest people in the world are concentrated in Sub-Saharan Africa, where "We have the resources to save millions of lives and raise the basic infrastructure" (Jeffrey Sachs, Kofi Annan's Special Adviser). Over the past few decades official development assistance to third world countries has been declining and few donor countries now give the internationally-agreed 0.7% of their gross domestic product.
Jeffrey Sachs would like to see donor countries increase their aid budget.
But in the end it will be popular opinion which pushes governments into rethinking their aid policies. Since the tsunami, people have been increasingly questioning the meanness of their countries' aid budgets and demanding that more aid is given to third world countries.
Jeffrey Sachs has recently presented the "Global Plan to Achieve the Millennium Development Goals".
The report, developed by 300 economists and researchers, reiterates many of the aims of the Brandt Reports:
Eradicate extreme poverty and hunger
Achieve universal primary education
Promote gender equality and empower women
Reduce child mortality
Improve maternal health
Combat HIV/AIDS, malaria and other diseases
Ensure environmental sustainability
Develop a global partnership for development
20,000 poor people die every day from preventable diseases in Africa, partly because their governments are paying $30 million a day in interest to the World Bank, the IMF and the rich world creditor nations. Currently, for every dollar that is given to Africa in aid, a dollar and a half goes out to pay the interest on debts.
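The scale of those flows is easy to check with back-of-the-envelope arithmetic, taking the article's figures at face value. The "implied aid" figure below is simply the stated 1-to-1.5 ratio applied to the interest total, not an independent statistic:

```python
daily_interest = 30_000_000          # dollars paid out per day, per the article
annual_interest = daily_interest * 365

# For every $1.00 of aid received, $1.50 leaves as debt service,
# so annual aid would be the interest outflow divided by 1.5.
outflow_per_aid_dollar = 1.5
implied_annual_aid = annual_interest / outflow_per_aid_dollar

print(f"annual interest outflow: ${annual_interest / 1e9:.2f} billion")
print(f"implied annual aid:      ${implied_annual_aid / 1e9:.2f} billion")
```

On those numbers Africa pays out roughly $11 billion a year in interest against about $7.3 billion received in aid: a net transfer from the poorest continent to its creditors.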
Amazon rainforest vanishing at twice the rate of previous estimates
The Amazonian rainforest is being destroyed at double the rate of all previous estimates, according to research published today in the journal Science.
The destruction is leaving the forest more prone to fires and allowing more carbon dioxide to be released into the atmosphere, according to scientists.
21st October 2005, The Guardian UK
A new analysis of satellite images of the Brazilian part of the Amazon basin, which forms part of the largest contiguous rainforest on Earth, shows that on average 15,500 sq km (6,000 square miles) of forest is being cut down by selective logging each year. This is besides a similar amount clear-cut annually for cattle grazing or farming.
Conservationists have been able to monitor large clear-cut areas using satellite images. But the extent of selective logging, where individual trees of high value, such as mahogany, are felled and smuggled out of the forest, had been unclear, the effects being masked from satellites by the forest's dense canopy.
"People have been monitoring large-scale deforestation in the Amazon with satellites for more than two decades, but selective logging has been mostly invisible until now," said Gregory Asner, of the Carnegie Institution, Washington.
He tackled the problem by developing an analytical method named the Carnegie Landsat Analysis System, which allows each pixel of an image to be scrutinised for the amount of forest left to determine the overall ratio of forested to deforested land.
Natalino Silva, of the Brazilian Agricultural Research Corporation, said:
"We can now see what's happening from the top of the forest all the way to the soil.
We have a whole new picture of the Amazon region and selective logging."
The analysis revealed some surprising facts. "We discovered that annually an area about the size of Connecticut is disturbed this way," said Professor Asner. "Selective logging negatively impacts many plants and animals and increases erosion and fires. Additionally, up to 25% more carbon dioxide is released to the atmosphere each year - above that from deforestation - from the decomposition [of plant material] that the loggers leave behind.
Timber harvests are much more widespread than previously thought."
Using images of the Amazon basin taken from 1999 to 2002, Prof Asner studied the five states that account for 90% of deforestation.
The extent of selective logging was found to be between 4,685 and 7,973 square miles each year.
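The article quotes the logging figures in two different units; a quick conversion (1 square mile is about 2.59 square kilometres) confirms that the 4,685 to 7,973 square-mile range brackets the 15,500 sq km annual average given earlier:

```python
KM2_PER_SQ_MILE = 2.58999

low_km2 = 4685 * KM2_PER_SQ_MILE    # lower bound of annual logging extent
high_km2 = 7973 * KM2_PER_SQ_MILE   # upper bound
avg_km2 = 15_500                    # annual average quoted in the article

print(f"range: {low_km2:,.0f} - {high_km2:,.0f} sq km")
assert low_km2 < avg_km2 < high_km2  # the average sits inside the range
```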
Michael Keller, of the US Forest Service, who was the co-author of the research, said:
"We expected to see large areas of logging, but the extent to which logging penetrates deep into the frontier is much more dramatic than we anticipated."
A large mahogany tree can fetch hundreds of dollars at the sawmill, making it a tempting target.
"People go in and remove just the merchantable species from the forest," said Prof Asner. "Mahogany is the one everybody knows about, but in the Amazon there are at least 35 marketable hardwood species, and the damage that occurs from taking out just a few trees at a time is enormous."
About 400m tonnes of carbon enter the atmosphere every year because of traditional deforestation in the Amazon, and Prof Asner estimates that selective logging releases an additional 100m tonnes.
"When a tree trunk is removed, the crown, wood debris and vines are left behind to decompose, releasing carbon dioxide gas into the atmosphere," he said.
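The two carbon figures line up with the "up to 25% more carbon dioxide" claim Prof Asner makes earlier in the article:

```python
deforestation_tonnes = 400_000_000      # annual carbon from traditional clearing
selective_logging_tonnes = 100_000_000  # Asner's additional estimate

extra_fraction = selective_logging_tonnes / deforestation_tonnes
print(f"selective logging adds {extra_fraction:.0%} on top of deforestation")
```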
A thinned canopy also makes the forest more dry and prone to fire.
"On average, for every tree removed, up to 30 more can be severely damaged by the timber harvesting operation itself," said Prof Asner.
Climate Change Alert in Bahrain
RISING sea levels due to global climate change may prove a disaster for island states like Bahrain, a United Nations official warned yesterday.
Experts predict that a one-metre sea level rise could flood 10 per cent of Bahrain's coastal areas, said UN resident co-ordinator and UN Development Programme resident representative Sayed Aqa.
"This represents land reclamation efforts of the last 30 years," he told the UN Day celebration at the UN House yesterday.
Minister of State for Foreign Affairs Dr Nazar Al Baharna was the keynote speaker at the meeting, which was attended by ambassadors and heads of various UN agencies.
The Bahrain UN team has chosen climate change as the theme for this year's UN Day celebration, said Mr Aqa.
"Although climate change scenarios are based on predictions, it is inevitable that small state islands like Bahrain, will be among the most affected areas of the world," he added.
"A serious threat for small islands is sea-level rise.
Current estimates of future global sea-level rise of 5mm per year (with a range of two to 9mm/year) represent a rate that is two to four times higher than what has been experienced globally over the past 100 years.
"Although the level of vulnerability will vary from island to island, it is expected that practically all small island states will be adversely affected by sea-level rise.
"Negative effects will range from household-level impacts on livelihoods to wider effects on national economies."
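Taken at face value, the quoted rates imply the following over a century. This is a straight-line extrapolation that glosses over any acceleration, so it is an illustration of the stated figures, not a forecast:

```python
central_mm_per_year = 5          # quoted central estimate of sea-level rise
low_mm, high_mm = 2, 9           # quoted range
years = 100

central_m = central_mm_per_year * years / 1000
low_m, high_m = low_mm * years / 1000, high_mm * years / 1000

# The quote says this rate is 2-4x the historical one, which implies
# a past-century rate of roughly 1.25-2.5 mm per year.
past_low = central_mm_per_year / 4
past_high = central_mm_per_year / 2

print(f"projected century rise: {central_m:.1f} m (range {low_m:.1f}-{high_m:.1f} m)")
print(f"implied historical rate: {past_low:.2f}-{past_high:.2f} mm/year")
```

Even the low end of that range, 0.2 metres, would be significant for a low-lying island state; the one-metre scenario cited for Bahrain sits just above the high end.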
Dr Al Baharna, who reiterated Bahrain's commitment to the principles of the United Nations, said Bahrain has drawn up a strategy to combat the direct and indirect dangers of climate change.
"Climate change is one of the most important and complex challenges facing humanity in the 21st Century," he noted.
"The former UN General Assembly president, Bahrain's Shaikha Haya bint Rashid Al Khalifa, confirmed in her farewell speech in the 61st UN session on September 17 this year, that the climate change issue has become a reality.
"She talked about the danger of climate change in the world environment and its effects on water, air and soil."
Dr Al Baharna praised new UN secretary-general Ban Ki-moon's initiative to hold a high-level meeting on climate change challenges.
UNDP assistant administrator and regional director for Arab states Amat Al Alim Alsoswa also spoke.
Both Mr Aqa and Ms Alsoswa congratulated Bahrain's leadership and people for the numerous achievements in the national reform agenda.
They also hailed the kingdom's development at the international level, with the historic presidency of the UN General Assembly, the positive contributions made by Shaikha Haya during her term and the awarding of the UN Habitat Award to Prime Minister Shaikh Khalifa bin Salman Al Khalifa.
Clean getaway?
Hydrogen-powered cars are being touted as the pollution-free alternative of the future.
But, reveals Lucy Siegle, they'll come with a dirty secret...
29th August 2005 - Lucy Siegle, The Observer
At school, my class showed genuine brilliance at sidetracking teachers on to red herrings. It was always preferable to spend an hour listening to a random anecdote - even if it was about cricket or growing dahlias - than to get bogged down in a boring set text.
Still, this miseducation left me with the ability to spot a red herring at a hundred paces. And the idea that the hydrogen economy will be the world's environmental saviour smells very fishy to me.
On the surface it all adds up.
Hydrogen fuel cells use platinum to create electricity by combining hydrogen and oxygen into water, theoretically providing a pollution-free alternative to oil.
Not surprisingly, George Bush seems punch drunk on hydrogen, announcing a billion dollar development programme so that 'The first car driven by a child born today could be powered by hydrogen and [be] pollution-free.'
General Motors, not a company previously renowned for its eco credentials, is working hard to make George's dream a reality.
The company predicts that by 2020 there will be 1.1bn hydrogen-fuelled cars worldwide.
But a large raft of experts disagree.
According to them, hydrogen cars will not be truly viable for at least 30 years, while some doubt that it will ever happen at all.
Meanwhile, the current experimental form of hydrogen is far from ideal.
Because hydrogen, unlike coal, is not a primary energy source, it first needs unbinding from other compounds - an extremely energy-intensive process, which racks up CO2 emissions.
And as we know, climate change, like time and tide, waits for no man, not even George Bush.
It all raises the question:
why hype this long-term ecological red herring and ignore other, more viable technologies?
These include biofuels, cleaner diesels and even hybrid engines - such as the Toyota Prius, unofficial car of Hollywood, which has the capability to reduce fossil fuel dependency even more through battery technology, when run in conjunction with solar or wind power.
But hydrogen remains flavour of the month, especially last month, when an extra $4bn was allocated for research in the new US energy bill.
But the bill also contained a big clue to all this favouritism:
funding for a new generation nuclear reactor in Idaho which, surprise, surprise, will produce hydrogen as well as electricity.
The key to hydrogen development, it seems, is nuclear power. Inextricably linking the futures of car drivers with nuclear energy bolsters the case for both controversial technologies.
Therein lies the rub for anyone who believed that the hydrogen car was the way to become truly green.
Let's hope this didn't include the first family to take possession of the Honda FCX, the world's first hydrogen fuel-cell car, which they will lease in a two-year experiment.
The way things are developing in the hydrogen field, Mr and Mrs Spallino and their two daughters from southern California are set to become an archetypal nuclear family, in more than just the sociological sense.
How best to save the Dead Sea?
Jordan's plan to save the shrinking Dead Sea by channelling more water to it from the Red Sea could have a detrimental environmental impact, environmentalists have warned.
However, not doing anything could lead to an environmental, economic and human catastrophe, say experts.
Two water-related projects are currently being proposed in Jordan:
the Red Sea-Dead Sea canal project aims to save the Dead Sea by siphoning off at least 2.5 billion cubic metres (cu. m) of water from the Red Sea and pumping it to the Dead Sea.
The Jordan National Red Sea Water Development Project (JRSP) aims to address the country's chronic potable water shortage by pumping water from the Red Sea through pipelines to a yet-to-be-built nuclear-run desalination plant that will produce some 700 million cu. m of drinking water a year when fully operational.
There is some overlap between the two projects as both require water to be pumped out of the Red Sea.
Experts say JRSP and the Red Sea-Dead Sea canal project can be carried out simultaneously.
However, environmentalists are concerned that they could produce more problems than they alleviate.
"You need to study the effect of taking out 2.5 billion cubic metres of water from the Red Sea annually [for the Red Sea-Dead Sea project] - which means 60 cu. m per second.
Pumping this quantity of water will definitely affect the current of the Gulf of Aqaba and its coral reefs," Munqeth Mehyar, chairman and co-director of Friends of the Earth Middle East (FoEME), said.
He warned of the negative impacts that could result from mixing marine water from the Red Sea with the Dead Sea water, known to be rich in minerals.
Dead Sea water levels declining
The Dead Sea is considered the lowest point on earth - about 400 metres below sea level.
Its water is 10 times more saline than ocean water, and its distinctive chemical composition and fresh/salt water interface have created a unique ecology of international importance.
But the Dead Sea and its environment are changing as a result of a sharp decrease in water inflow from the River Jordan, which has been increasingly diverted for agricultural and industrial use.
Recent figures from the Ministry of Water and Irrigation show that inflows to the Dead Sea are just 10 percent of what they were in the 1960s.
The Dead Sea has lost more than one third of its water surface in the past few decades due to evaporation and industrial use, according to the World Bank.
Its water level is dropping by nearly a metre a year, a rate at which scientists say it could dry up within the next 50 years if action is not taken.
The declining water level has far-reaching environmental, social and economic consequences for the Dead Sea region and beyond.
The response has been the Red Sea-Dead Sea project proposal, which consists of a 250 km canal or pipeline extending from the port city of Aqaba in Jordan through the Wadi Araba area to the southern Dead Sea, at an estimated cost of US$12 billion.
A World Bank-funded environmental impact assessment of the Red Sea-Dead Sea project is currently under way.
Desalination project proposed
The Jordanian government has said it will go ahead with the project to save the Dead Sea, whatever the cost.
However, because of delayed international aid to kick-start it, the government wants to begin with the JRSP project to pump water from the Red Sea through pipelines, for desalination.
Satellite Observations Show Declining Levels of Gas Flaring, Greenhouse Emissions
Trend is encouraging in several countries, but too much natural gas still wasted, says World Bank-led Global Gas Flaring Reduction partnership
WASHINGTON, D.C., November 17, 2009 — Global gas flaring has declined by a total of 22 billion cubic meters (bcm) over the past three years despite a 5% rise in crude oil production over the same period, according to the latest satellite study commissioned by the World Bank-led Global Gas Flaring Reduction partnership (GGFR).
The satellite estimates indicate that gas flaring peaked at about 162 bcm in 2005 and declined to 140 bcm in 2008.
The survey, which was funded by the World Bank's GGFR public-private partnership, was executed by scientists at the US National Oceanic and Atmospheric Administration (NOAA). The decrease in gas flaring corresponds to a reduction of some 60 million tons of CO2 emissions between 2005 and 2008.
According to these satellite estimates, the top 10 flaring countries are:
Russia, Nigeria, Iran, Iraq, Algeria, Kazakhstan, Libya, Saudi Arabia, Angola and Qatar. Most of the reduction in gas flaring has come from Russia and Nigeria. (See table for top 20 countries below.)
"These latest satellite results are certainly encouraging, but it is still too early to celebrate because much natural gas is still being wasted around the world," says Bent Svensson, program manager of the World Bank-led GGFR partnership.
"Over the next two to three years we will continue to collect and evaluate data to confirm whether this downward trend is continued."
Flaring, or burning of gas, occurs to dispose of natural gas liberated during crude oil production and processing, most often in remote areas where there is no gas transportation infrastructure or local gas market. Since its inception in 2002, the GGFR partnership has encouraged more vigorous efforts to put this gas to use instead of flaring it, such as re-injecting it into the ground to boost oil production, converting it into liquefied natural gas for shipment, transporting it to markets via pipelines, or using it on site to generate electricity.
The first globally consistent survey of gas flaring was conducted using satellite data to produce a series of national and global estimates of gas flaring volumes covering a twelve-year period spanning 1995 through 2006.
The latest report summarizes the progress that NOAA's Earth Observation Group has made toward improving the estimation of gas flaring volumes from satellite observations and extends the long‐term record by adding estimations from 2007 and 2008. GGFR, which will start a third phase in 2010, also encourages on-site monitoring to help track changes in gas flaring volumes and to report progress in reducing flaring.
NOAA scientists used low-light imaging data from the U.S. Air Force Defense Meteorological Satellite Program to assess the volumes of gas burned in flares, which are visible in observations of nighttime lights under cloud-free conditions. Current and planned satellite sensors will continue to provide data suitable for estimating gas flaring volumes for years to come.
One of the most significant improvements in the utilized methodology was a comprehensive review of suspected flares using Google Earth imagery for visual confirmation.
As a result, additional flares were detected and some previously suspected lights were disqualified.
Google Earth has proven to be an invaluable resource in the identification and confirmation of flares.
NOAA scientists warn, however, that satellite results should be used with caution, as there still are several sources of error and uncertainty, including variations in flare efficiency, mis-identification of flares, non-continuous sampling, and potential environmental effects.
Background information:
What is gas flaring?
When crude oil is brought to the surface from several kilometers below, gas associated with such oil extraction usually comes to the surface as well.
If oil is produced in areas of the world which lack gas infrastructure or a nearby gas market, a significant portion of this associated gas may be released into the atmosphere, un-ignited (vented) or ignited (flared).
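The volume figures above can be translated into CO2 tonnages with a back-of-the-envelope calculation. The sketch below is mine, not the report's, and leans on simplifying assumptions: the flared gas is treated as pure methane, combustion is complete, and the gas is ideal at 0 °C and 1 atm (real associated gas contains heavier hydrocarbons, so actual emission factors run higher).

```python
# Back-of-the-envelope CO2 from gas flaring. Assumptions (mine, not the
# report's): flared gas is pure methane, combustion is complete, and the
# gas behaves ideally at 0 degrees C and 1 atm.
MOLAR_VOLUME_L = 22.414  # litres per mole of an ideal gas at STP
M_CO2 = 44.01            # molar mass of CO2 in g/mol

def co2_tonnes_per_bcm():
    """Tonnes of CO2 produced per billion cubic metres (bcm) flared.

    CH4 + 2 O2 -> CO2 + 2 H2O yields one mole of CO2 per mole of CH4.
    """
    moles_per_m3 = 1000.0 / MOLAR_VOLUME_L         # ~44.6 mol per cubic metre
    kg_co2_per_m3 = moles_per_m3 * M_CO2 / 1000.0  # ~1.96 kg CO2 per cubic metre
    return kg_co2_per_m3 * 1e9 / 1000.0            # tonnes of CO2 per bcm

# Under these assumptions, the ~140 bcm flared in 2008 works out to
# roughly 275 million tonnes of CO2.
print(round(co2_tonnes_per_bcm() * 140 / 1e6))
```

On these idealized assumptions each bcm of flared methane produces about two million tonnes of CO2, which makes clear why even a 22 bcm reduction is climatically significant.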
On GGFR
Launched at the World Summit on Sustainable Development in August 2002, the GGFR public-private partnership brings together representatives of the governments of oil-producing countries, state-owned companies and major international oil companies so that they can jointly overcome the barriers to reducing gas flaring, by sharing global best practices and implementing country-specific programs in gas-flaring countries. Funding is provided in part by the European Union, the World Bank, oil companies and donor countries.
GGFR partners and donors
The GGFR partnership, managed and facilitated by a team at the World Bank in Washington, DC, includes the following partners:
Algeria (Sonatrach), Angola (Sonangol), Azerbaijan, Cameroon, Canada (CIDA), Chad, Ecuador (PetroEcuador), Equatorial Guinea, France, Gabon, Indonesia, Iraq, Kazakhstan, Khanty-Mansijsysk (Russia), Mexico, Nigeria, Norway, Qatar, the United States (DOE) and Uzbekistan; BP, Chevron, ConocoPhillips, ENI, ExxonMobil, Marathon Oil, Maersk Oil & Gas, Shell, StatoilHydro, TOTAL, Qatar Petroleum, Pemex; OPEC Secretariat, European Union, the World Bank and the IFC
Climate Change
Earth Oil and natural gas take us down the street and around the world.
They warm and cool our homes and businesses. They provide the ingredients for medicines, fertilizers, fabrics, plastics and other products that make life safer, easier and better.
While we rely on them for most of our energy and will likely do so for years to come, emissions from their production and use may be helping to warm our planet by enhancing the natural greenhouse effect of the atmosphere.
That's why oil and gas companies are also working to reduce their greenhouse emissions.
To start, they're closely managing their own energy use.
One strategy involves heat and power technology that turns waste heat into energy, reducing energy consumption and emissions. The most recent data from the U.S. Energy Information Administration show that CO2 greenhouse emissions from U.S. industry, including oil and natural gas companies, have declined and were actually below 1990 levels. In 2006, for example, improvements in energy efficiency at API member refineries -- compared to the technology used in 2002 -- produced energy savings equivalent to taking more than 528,000 cars off the road, or savings equivalent to the electricity used by more than 950,000 homes.
Companies are also expanding use of alternative energy with lower greenhouse emissions. They are researching, developing and/or marketing virtually every form of renewable or alternative energy, including solar power, biofuels, geothermal energy, and wind power. They are working with the automakers and government agencies on new fuel/vehicle technology such as fuel cells and hydrogen power. While companies must continue to meet the demands of today's and tomorrow's consumers for oil and gas, they are also preparing for a future in which alternatives will play a much more significant role.
On other fronts, companies are reducing natural gas flaring to cut emissions (while also adding to energy supplies) and storing CO2 underground, where it can be safely held for thousands of years. The oil and gas industry has also been implementing new emissions estimation and tracking tools to enable it to assess how well it is meeting the goals it has set for itself and report progress to the public.
To see more about what companies are doing, please look at Climate Challenge:
A Progress Report.
To view additional and more recent information about voluntary efforts, see Companies Address Climate Change.
Please also visit our Resources and Links page for links to member company websites and more information about company initiatives.
And just as our industry is taking action, everyone can contribute to improved energy efficiency.
The Power Is in Your Hands campaign, created by 23 companies, trade associations and nonprofits, provides consumers with information on what they can do to help.
Its website offers valuable energy efficiency information for homes, transportation, and businesses. The campaign also encourages all Americans to take the Six Degrees of Energy Efficiency Web Challenge.
Warming Turns Fish into Daredevils
A temperature increase of a few degrees can make fish more aggressive and bold.
This damselfish may appear docile, but a small temperature increase could affect its personality.
As the world grows warmer, some fish may stop acting like themselves.
With a small rise in temperature, a new study found, some fish become more daring and more aggressive than they would otherwise be.
The finding suggests that climate change could put fish in peril in unexpected ways.
"The fact that big effects on behavior were happening over the course of just a couple degrees surprised me," said Peter Biro, a fish ecologist and evolutionary biologist at the University of New South Wales in Sydney, Australia.
"I would have never expected that from the things I had read in the literature."
Biro stumbled on the discovery by accident.
He was working with damselfish in an indoor lab that naturally got warmer over the course of a day.
His original goal was to study personality traits in the fish.
But as the lab heated up every day, he noticed some fish becoming more active, aggressive and bold.
To figure out what was behind their personality transformations, Biro and colleagues put 30 young damselfish in individual aquaria.
All fish were the same size and the same age, and the researchers gave them as much food as they wanted so that hunger wouldn't affect their behavior.
Then came the personality tests.
To measure boldness, Biro used what he calls "the scary test."
First, he shoved a wooden stick into each fish's tank, sending it into hiding.
When he removed the stick, he timed how long it took for the fish to emerge from their shelters.
In a test of aggressiveness, each fish was placed near a jar that contained another fish.
Scientists then watched to see how frequently the fish tried to attack or scare their intruders.
One round of tests happened at a temperature of just over 26 degrees Celsius (79 degrees Fahrenheit).
In a second round, the water was warmed by just under 3 degrees Celsius to 29 Celsius (84 degrees Fahrenheit).
Testing lasted for two weeks.
In warmer water, fish became an average of six times more active (with some fish becoming as much as 23 times more active), four times more aggressive and four times bolder, the researchers reported in the journal Proceedings of the Royal Society B.
Fish that had cowered in their shelters for up to 10 minutes during scary tests in cool water emerged immediately after the stick was removed in the warmer conditions.
A shift in metabolism probably explains why timid fish can become more daring in milder times, Biro suspects.
Fish are cold-blooded, so their bodies grow warmer and more active with a warmer environment.
To sustain that higher rate of energy burn, they need to eat more.
And in order to get more food, they need to be bolder and more aggressive.
The new findings suggest that even small changes in temperature -- from hour to hour or season to season -- can alter fish personalities, said Steven Cooke, an environmental biologist at Carleton University in Ottawa, Canada.
U.N. Reports Pollution Threat in Asia
The byproduct of automobiles, slash-and-burn agriculture, cooking on dung or wood fires and coal-fired power plants, these plumes of pollution rise over southern Africa, the Amazon basin and North America.
But they are most pronounced in Asia, where so-called atmospheric brown clouds are dramatically reducing sunlight in many Chinese cities and leading to decreased crop yields in swaths of rural India, say a team of more than a dozen scientists who have been studying the problem since 2002.
"The imperative to act has never been clearer," said Achim Steiner, executive director of the United Nations Environment Program, in Beijing, which the report identified as one of the world's most polluted cities, and where the report was released.
The brownish haze, sometimes in a layer more than a mile thick and clearly visible from airplanes, stretches from the Arabian Peninsula to the Yellow Sea.
In the spring, it sweeps past North and South Korea and Japan.
Sometimes the cloud drifts as far east as California.
The report identified 13 cities as brown-cloud hot spots, among them Bangkok, Cairo, New Delhi, Tehran and Seoul, South Korea.
It was issued on a day when Beijing's own famously polluted skies were unusually clear. On Wednesday, by contrast, the capital was shrouded in a thick, throat-stinging haze that is the byproduct of heavy industry, coal-burning home heaters and the 3.5 million cars that clog the city's roads.
Last month, the government reintroduced some of the traffic restrictions that were imposed on Beijing during the Olympics; the rules forced private cars to stay off the road one day a week and sidelined 30 percent of government vehicles on any given day.
Over all, officials say the new measures have removed 800,000 cars from the roads.
According to the United Nations report, smog blocks from 10 percent to 25 percent of the sunlight that should be reaching the city's streets. The report also singled out the southern city of Guangzhou, where soot and dust have dimmed natural light by 20 percent since the 1970s.
In fact, the scientists who worked on the report said the blanket of haze might be temporarily offsetting some warming from the simultaneous buildup of greenhouse gases by reflecting solar energy away from the earth.
Greenhouse gases, by contrast, tend to trap the warmth of the sun and lead to a rise in ocean temperatures.
Climate scientists say that similar plumes from industrialization of wealthy countries after World War II probably blunted global warming through the 1970s. Pollution laws largely removed that pall.
Rain can cleanse the skies, but some of the black grime that falls to earth ends up on the surface of the Himalayan glaciers that are the source of water for billions of people in China, India and Pakistan.
As a result, the glaciers that feed into the Yangtze, Ganges, Indus and Yellow Rivers are absorbing more sunlight and melting more rapidly, researchers say.
According to the Chinese Academy of Sciences, these glaciers have shrunk by 5 percent since the 1950s and, at the current rate of retreat, could shrink by an additional 75 percent by 2050.
"We used to think of this brown cloud as a regional problem, but now we realize its impact is much greater," said Veerabhadran Ramanathan, who led the United Nations scientific panel.
"When we see the smog one day and not the next, it just means it's blown somewhere else."
Although the clouds' overall impact is not entirely understood, Mr. Ramanathan, a professor of climate and ocean sciences at the University of California, San Diego, said they might be affecting precipitation in parts of India and Southeast Asia, where monsoon rainfall has been decreasing in recent decades, and central China, where devastating floods have become more frequent.
He said that some studies suggested that the plumes of soot that blot out the sun have led to a 5 percent decline in the growth rate of rice harvests across Asia since the 1960s.
For those who breathe the toxic mix, the impact can be deadly.
Henning Rodhe, a professor of chemical meteorology at Stockholm University, estimates that 340,000 people in China and India die each year from cardiovascular and respiratory diseases that can be traced to the emissions from coal-burning factories, diesel trucks and wood-burning stoves. "The impacts on health alone are a reason to reduce these brown clouds," he said.
Despite momentum, no smooth path to climate deal
PORT OF SPAIN (Reuters) - Commonwealth states representing a third of the world's people said on Sunday momentum was growing toward a global climate deal, but nagging doubts remained over funding levels and degrees of commitment.
Seeking to successfully tip the outcome of U.N. climate talks on December 7-18 in Copenhagen, the group of more than 50 nations from across the world made the climate change issue the centerpiece of a three-day summit in Trinidad and Tobago.
They declared firm support for an "operationally binding" deal to be achieved in Copenhagen that would cover tougher greenhouse gas emissions targets, climate adaptation financing for poorer nations and transfer of clean-energy technology.
The Commonwealth group, which welcomed Rwanda as its 54th member, called for a full legally binding climate treaty to be adopted "no later than 2010" and insisted fast funding be made available to poor states to counter the global warming threat.
Commonwealth leaders hailed the consensus achieved in their Port of Spain Climate Change Declaration as improving the odds for a comprehensive agreement in Copenhagen and as proof that their geographically diverse group was a viable institution.
"There is heavy traffic on the road to Copenhagen.
The good news is that it is converging and hopefully moving purposefully into a single lane," Commonwealth Secretary-General Kamalesh Sharma said in comments closing the Port of Spain summit.
U.N. Secretary-General Ban Ki-moon and the leaders of Denmark and France had participated in the Commonwealth summit, adding weight to the group's climate deliberations.
"I have no doubt it will make an impact on Copenhagen," South African President Jacob Zuma told reporters.
But even as the Commonwealth leaders were congratulating themselves on their climate consensus, European Commission President Jose Manuel Barroso was declaring in China that pledges made so far by governments to cut greenhouse gases were not sufficient for an effective pact to fight global warming.
"If you sum up all the commitments made so far, according to our estimates, we are not yet where we should be if we want Copenhagen to succeed," said Barroso, who will attend a European Union-China summit in Nanjing on Monday.
"There is still much work to be done," acknowledged Australian Prime Minister Kevin Rudd in Port of Spain.
COSTS OF CLIMATE DEAL
Although prospects for a broad political framework pact on climate change were brightened last week by public promises of greenhouse gas curbs by leading emitters China and the United States, Barroso's blunt comments delivered a reality check on the contentious path to next month's Copenhagen talks.
The world's industrialized powers are under pressure to make substantial cuts in their greenhouse gas emissions.
At the same time, developing countries, including tiny island states which risk disappearing if ocean levels continue to rise through global warming, are clamoring for tens of billions of dollars of aid to help them fight climate change.
Hoisting One for Wind Power:
Climbing Crane Expected to Keep Vestas Turbines Spinning [Slide Show]
A Danish wind-power provider develops its own technology to maintain towering turbines in gusts roaring at up to 15 meters per second
Now that the United Nations climate talks have wrapped up in Copenhagen, nations agreeing to the accord drafted there are obliged to keep their promises to cut greenhouse gas emissions. Wind power is one of the key sources of renewable energy expected to play an important role in helping to cut emissions and wean society from its dependence on fossil fuels, which means wind-power companies must be prepared to quickly fix mechanical problems that threaten to slow down renewable energy production.
Blown generators, misbehaving gearboxes and damaged rotors keep turbines from maximizing the energy they draw from nature, raising the question of how to reliably maintain dozens of mammoth towers, some of which rise more than 100 meters above the ground.
Large mobile cranes are used to assemble wind turbine components at the top of lofty towers and, later, to service these turbines. But the rapid expansion of and interest in wind-harnessing technology has strained available crane resources, creating a shortage of capacity, says Jacob Juhl Christensen, product manager for Vestas Wind Systems A/S's Technology Product Management R&D division in Denmark.
"Sometimes, even though spare parts are available to repair a turbine, it may take months to secure a crane to do the repair work," he says. Given that some of Vestas's turbines operate 70 to 105 meters above the ground, where the winds are strong, the heaviest cranes are required to do repair work.
Unfortunately, such cranes are also the most difficult to move from place to place.
Vestas is developing its own crane technology to avoid such problems. The Vestas Tower Crane, still in the prototype phase, is designed to attach to a cable lowered down from a small crane located inside the turbine tower's nacelle (the cover housing the wind turbine's generator and gearbox) and hoisted upward.
The Tower Crane, which measures 10 meters long, 2.9 meters high and 3.3 meters wide, actually ascends through a multistep process:
The cable lowered from the nacelle to the ground first connects to and hoists a tackle (a system of pulleys that distributes the crane's weight), which Vestas calls the "nacelle attachment system," or NAS. Once the NAS is pulled up and attached to the side of the nacelle, it uses a number of cables to hoist the crane itself.
After the crane reaches the top of the tower it clamps its four claws around the post like a giant robotic hand to provide stability, allowing the crane to operate in winds as strong as 15 meters per second, Christensen says.
The 53-metric-ton Vestas Tower Crane, first demonstrated earlier this year, has been used to a limited extent to date as the technology is further developed.
The current version of the crane features a single crane arm that can install and remove gearboxes and generators. Future versions are expected to be able to do replacement or repair work to the rotor blades themselves.
The plan for Vestas's next-generation Tower Crane is for it to be able to work with several of the company's different turbine models with only minor modifications. The company expects to have a prototype of the new crane within a year, in time for the launch of its V112 three-megawatt turbine, which will have heavier components and require a crane with a stronger arm than the current Tower Crane has.
Vestas's Tower Crane is not the only novel approach to wind turbine crane technology. ITI Energy, part of Scottish Enterprise, the country's main economic agency, has invested about $3.2 million in a special wind turbine access system called "Orangutan". ITI Energy's system, designed by Oreada, Ltd., also based in Scotland, is made up of two friction clamps, connected by a hydraulic structure that allows caterpillar-like motion up and down the turbine tower.
After testing the technology on a tower built at one-fifth scale (compared with a normal turbine tower), Scottish Enterprise is now looking for partners to turn the prototype Orangutan into a climbing machine that can be commercialized.
Tackling climate change
ScienceDaily (Nov. 26, 2009) — Tackling climate change by reducing carbon dioxide and other greenhouse emissions will have major direct health benefits in addition to reducing the risk of climate change, especially in low-income countries, according to a series of six papers appearing Nov. 25 in the British journal The Lancet.
The studies, three of them coauthored by Kirk R. Smith, professor of global environmental health, and one coauthored by Michael Jerrett, associate professor of environmental health sciences, both at the University of California, Berkeley, use case studies to demonstrate the co-benefits of tackling climate change in four sectors:
electricity generation, household energy use, transportation, and food and agriculture.
"Policymakers need to know that if they exert their efforts in certain directions, they can obtain important public health benefits as well as climate benefits," said Smith, who was the principal investigator in the United States for the overall research effort.
"Climate change threatens us all, but its impact will likely be greatest on the poorest communities in every country.
Thus, it has been called the most regressive tax in human history.
Carefully choosing how we reduce greenhouse gas emissions will have the added benefit of reducing global health inequities."
Each study in the series examines the health implications in both high- and low-income countries of actions designed to reduce the release of carbon dioxide (CO2) and other greenhouse gases.
Climate change due to emission of greenhouse gases from fossil fuel energy sources causes air pollution by increasing ground-level ozone and concentrations of fine particulate matter.
The studies were commissioned by the NIEHS, part of the National Institutes of Health (NIH), in part to help inform discussions next month at the U.N. Framework Convention on Climate Change in Copenhagen.
The NIEHS is one of the key sponsors of the international event.
"These papers demonstrate there are clear and substantive improvements for health if we choose the right mitigation strategies for reducing greenhouse gas emissions," said Birnbaum.
"We now have real life examples of how we can save the environment, reduce air pollution and decrease related health effects;it's really a win-win situation for everyone."
A case study led by Smith on the health and climate benefits from a potential 150-million-stove program in India from 2010-2020 gives the largest co-benefit of any examined in the six papers.
Smith has shown that providing low-emission stove technologies in poor countries that currently rely on solid-fuel household stoves for cooking and heating is a highly cost-effective way to benefit both health and climate.
The 10-year program could prevent 2 million premature deaths in India, he said, in addition to reducing greenhouse pollution by hundreds of millions of tons.
The paper coauthored by Jerrett contains analysis of 18 years of data on the long-term health effects of black carbon -- the first study of its kind ever conducted.
The study followed 352,000 people in 66 U.S. cities and was conducted by a team of U.S. and Canadian researchers led by Jerrett and Smith.
Black carbon is a short-lived greenhouse pollutant which, along with ozone, is responsible for a significant proportion of global warming.
Unlike CO2, these short-lived greenhouse pollutants exert significant direct impacts on health.
Also, because they are short-lived, emission controls are almost immediately reflected in changes in warming.
Green Architecture:
What Makes a Structure a "Living Building"?
A Pacific Northwest organization has defined an environmentally sound structure as one that generates its own energy, captures and treats all of its water, operates efficiently, and is aesthetically pleasing
Over the past couple of decades, architects and builders looking to green their projects turned to the addition of various piecemeal elements to save water here or cut down on electricity there.
Those who added more than a few green touches could apply for and get certified by the United States Green Building Council (USGBC) under its Leadership in Energy and Environmental Design (LEED) program.
While these efforts have been laudable—essentially launching the green building industry as we know it today—they represent merely the infancy of what green building might someday become.
The concept of the "living building" has now emerged as a new ideal for design and construction.
The Cascadia Region Green Building Council (CRGBC)—the Pacific Northwest chapter of the USGBC—defines a living building as a structure that "generates all of its own energy with renewable nontoxic resources, captures and treats all of its water, and operates efficiently and for maximum beauty."
The group has been pushing for adoption of the concept by construction industries here at home, and also helped to launch the International Living Building Institute to promote the concept internationally.
"We view our role as the organization that is meant to ask the really tough questions, to push the boundaries as far as possible," says Jason McLennan, CEO of CRGBC. To this end, in 2006 the group launched its Living Building Challenge (LBC), a "call to the design and construction community to pursue true sustainability in the built environment."
So far 60 different projects around North America are vying to meet the high standards of the LBC, which exceed even the highest status of LEED certification.
The first building to be completed for consideration under the LBC program is the Omega Center for Sustainable Living, in Rhinebeck, NY. The 6,200 square-foot, one-level building, which serves as headquarters for the Omega Institute for Holistic Studies, features a geothermal heating and cooling system, solar panels, rain gardens that direct water run-off to irrigate plantings, a 4,500-square-foot greenhouse that helps filter wastewater for reuse, "daylighting" design that brings natural light indoors to minimize electric light usage, and eco-friendly building materials all around.
It was designed—per LBC criteria—to be "net-zero," meaning it uses no more energy than it generates itself. Once the building has been in operation for a full year next summer, CRGBC will audit it to see if its performance lives up to the green hype.
Dozens of other LBC contenders around North America will be audited, as well.
Of course, the costs of creating a living building today are very high.
Achieving net-zero can be especially costly, and stands out as one of the biggest obstacles to greater interest in the living building concept.
Another challenge is finding materials that meet LBC standards, since many common building materials—such as PVC piping for wastewater transport—off-gas chemicals and have other hazardous attributes. The LBC also expects builders to source as many materials as possible locally, to boost local economies and make efficient use of nearby natural resources. McLennan remains confident that costs will come down as green materials, technologies and methods become more commonplace within the general building industry.
East Antarctic ice began to melt faster in 2006
LONDON (Reuters) - East Antarctica's ice started to melt faster from 2006, which could cause sea levels to rise faster than anticipated, according to a study by scientists at the University of Texas.
In the study, published in the journal Nature Geoscience, scientists estimated that East Antarctica lost ice mass at an average rate of 5 to 109 gigatonnes per year from April 2002 to January 2009, with the rate accelerating from 2006.
The melt rate after 2006 could be even higher, the scientists said.
"The key result is that appear to start seeing a large amount of ice loss in East Antarctica, mostly in the long coastal regions (in Wilkes Land and Victoria Land), since 2006," Jianli Chen at the university's center for space research and one of the study's authors, told Reuters.
"This, if confirmed, could indicate a state change of East Antarctica, which could pose a large impact on global sea levels in the future," Chen said.
Previous estimates for East Antarctica projected anywhere between a 4 gigatonne per year loss and a 22 gigatonne per year gain, according to the report.
The full study is available at www.nature.com/ngeo.
Climate change is turning Antarctica's ice into one of the biggest risks for the coming centuries.
Even slight melting could drive up sea levels and affect the world's coastal cities.
Rising temperatures are thought to be the main cause of melting ice, and world leaders are under pressure to agree on a new climate treaty at an upcoming U.N. summit in Copenhagen to curb global warming.
MELTDOWN
The scientists used satellite observations of gravity change over the period April 2002 to January 2009 to calculate the rate of the ice loss in East Antarctica's coastal regions.
The ice sheet's mass has long been difficult to estimate.
"At various times, estimates have disagreed on the sign of the mass balance, as well as its magnitude," the report said.
The whole Antarctic region could be losing ice at a rate of 113-267 gigatonnes a year, with 106-158 gigatonnes coming from West Antarctica, the scientists estimate.
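To put these mass figures in perspective, a rough back-of-the-envelope conversion relates gigatonnes of melted ice to millimeters of global sea-level rise. This sketch assumes all meltwater reaches the ocean and uses an approximate ocean surface area; it is illustrative arithmetic, not part of the study itself.

```python
# Rough conversion from ice-mass loss (Gt/yr) to global sea-level
# rise (mm/yr), assuming all meltwater spreads over the ocean.
OCEAN_AREA_M2 = 3.61e14   # approximate global ocean surface area
WATER_DENSITY = 1000.0    # kg per cubic metre of meltwater

def gt_per_year_to_mm_sea_level(gigatonnes: float) -> float:
    """Convert a mass-loss rate in Gt/yr to mm/yr of sea-level rise."""
    kg = gigatonnes * 1e12                    # 1 Gt = 10^12 kg
    volume_m3 = kg / WATER_DENSITY            # metres^3 of water
    return volume_m3 / OCEAN_AREA_M2 * 1000   # metres -> millimetres

# The whole-Antarctica range of 113-267 Gt/yr quoted above:
low = gt_per_year_to_mm_sea_level(113)   # roughly 0.31 mm/yr
high = gt_per_year_to_mm_sea_level(267)  # roughly 0.74 mm/yr
```

By this crude estimate, the quoted Antarctic losses correspond to well under a millimeter of sea-level rise per year; the concern in the study is the trend, not the current rate.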
A separate study on Thursday found that melting ice from Greenland and Antarctica will lead to a much sharper rise in sea levels than previously thought.
Climate change will cause a rise of at least 1 meter in sea levels by the end of this century, according to a review of scientific data by environmental group Clean Air-Cool Planet.
The projection is in sharp contrast to a 2007 study by the U.N.'s Intergovernmental Panel on Climate Change, which said world sea levels could increase 18-59 centimeters by 2100.
Agriculture can adapt to climate change
Better crop management, including smarter application of pesticides, can help poor farmers cope with climate change
Innovative agricultural technologies can produce crops that meet climate change challenges, says ICRISAT head William Dar.
Sustainable land and water management combined with innovative agricultural technologies could mitigate climate change and help poor farmers adapt to its impacts.
New knowledge, technology and policy for agriculture have never been more critical, and adaptation and mitigation strategies must urgently be applied to national and regional development programmes.
Without these measures developing countries will suffer increased food insecurity.
For the 1.5 billion people engaged in agriculture in the developing world, even a small loss in agricultural productivity could mushroom into a large loss of income.
And new strategies must be built around 'green' agricultural technologies, such as adaptive plant breeding, pest forecasting, rainwater harvesting and fertiliser microdosing, where small amounts of fertiliser are given to each seed.
Water and land use
Water is critical for agriculture across the semi-arid tropics.
Although rainfall predictions remain uncertain, scientists agree that climate change will reduce water availability and storage, and warmer temperatures will increase the amount of water needed by crops.
Improving crop production in these regions largely depends on better capture and storage of rainwater.
But rainwater harvesting and storage technologies remain underdeveloped.
And we know little about the economic viability of such systems — implementing them may well require financial investment beyond the capacity of most rural communities.
As almost 95 per cent of water in developing countries is used to irrigate farmlands, policies to improve irrigation efficiency are also critical.
Research is needed on water flows and water quality, and infrastructure needs to be improved.
Better land and crop management are equally important.
There are already some promising and economically viable technologies to reduce risk of crop failure, improve soil fertility and increase productivity under variable climatic conditions.
These include methods to reduce agricultural inputs, such as fertiliser microdosing and smarter application of pesticides, as well as technologies for minimising soil disturbance such as reduced tillage, conservation agriculture and crop rotation.
Revising planting dates, plant densities and crop sequences can help cope with delayed rainy seasons, longer dry spells and earlier plant maturity that are already being observed across parts of Africa including Malawi, Mozambique, Zambia and Zimbabwe.
Climate-proof crops
Changes in growing seasons in the tropics can also, to a large degree, be mitigated by redeploying existing improved crop varieties that can cope with a wide range of climatic conditions.
For example my organisation, the International Crops Research Institute for the Semi-Arid Tropics (ICRISAT), has developed pearl millet hybrids that can cope with temperatures of 40 degrees Celsius and deliver normal yields with limited water.
Our short duration varieties of chickpea and pigeon pea mature in 65–75 days and so can escape terminal drought — lack of water at later stages of growth.
Ocean Power Gets Fast Track
The Federal Energy Regulatory Commission and Washington state agreed yesterday to coordinate environmental reviews and establish licensing schedules for emerging hydrokinetic technologies.
FERC authorizes operating licenses for marine projects that produce energy from oceans and rivers, including wave and tidal power. States must sign off on environmental issues related to coastal zone management and pollution in their waters. Other federal entities, including the Interior Department, must conduct environmental, safety and security reviews.
Under the memorandum of understanding [pdf] (MOU) signed by Washington state and FERC, the two agreed to notify each other if there is a potential applicant, quickly coordinate and agree to a schedule to process the application, and coordinate environmental reviews with each other and stakeholders. FERC also will take into account in its licensing process any "comprehensive plan" Washington has for siting projects.
FERC signed a similar hydrokinetic energy MOU with Oregon last year.
The agreement will help the state use another form of clean and renewable energy, said FERC Commissioner Philip Moeller, who hails from Washington.
"The next crucial step is to place some of these projects in the water so that any effects on the marine ecosystem can be thoroughly analyzed.
It's time for action on renewable energy technologies," he said.
FERC has already established a short-term license that lets wave developers test their technology without applying for the full hydroelectric license, which can take a significant amount of time, sometimes decades, to be issued.
The MOU is the latest step to help reduce regulatory barriers to support this fledgling renewable energy technology.
The industry, along with offshore wind power, got a huge boost in April when Interior and FERC signed an MOU that settled the offshore renewable energy permitting process, partitioning the leasing duties to Interior and the licensing and environmental reviews to FERC, although Interior would help with the environmental permitting.
In February, the industry received a setback when Finavera Renewables filed with FERC an application to surrender the first hydrokinetic license issued by the agency.
The conditioned license was for a project to construct and operate the 1-megawatt Makah Bay Offshore Wave Energy Pilot Project, offshore from Clallam County, Wash.
Congress has also been boosting its policy support for wave energy, including bills from Sen. Lisa Murkowski (R-Alaska) and Rep. Jay Inslee (D-Wash.) authorizing up to $250 million annually for Energy Department marine-power research, development and deployment efforts, as well as a technology verification program, plus $30 million in President Obama's 2010 appropriations request.
ISLAM'S GREEN INITIATIVE
The UK-based Alliance of Religions and Conservation (ARC), working with the U.N., recently hosted 200 representatives from nine major world religions, spanning more than 60 different religious organizations.
Baha'i, Buddhists, Christians, Hindus, Jews, Muslims, Shintoists, Taoists and Sikhs all gathered at London's Windsor Castle with a united environmental agenda.
In an era of increasing religious divide, a once little-thought-of topic known as "the environment" was able to bring together ancient faith groups to discuss a modern solution.
And with Islam at the forefront of today's news, Muslim leaders proved Islam's ability to adapt and meet new needs.
Under the newfound coalition's eco-commitment and a Muslim Seven Year Plan, Medina, Islam's second most important city after Mecca, is to serve as a model green city.
This move is critical since Saudi Arabia is essentially, for better or worse, presently the pillar of the Arab nations.
Medina, "The City of the Prophet", is a strategic start pointing that has the capacity to really launch a green campaign in neighboring territories.
The Seven Year Plan was presented by Sheikh Ali Goma'a, Egypt's Grand Mufti, who has already introduced the plan at his own institution, Dar Al Iftaa.
Some key initiatives of the Seven Year Plan include:
* Develop and implement a "Green Hajj".
With 2-3 million people visiting Mecca during Hajj alone, transforming the experience into an environmentally-friendly pilgrimage will reap immediate benefits.
* Construct a "green mosque" and introduce this model for other Islamic buildings worldwide.
* In the first phase, develop 2-3 green model cities; in the second phase, adapt ten other Muslim cities to implement the model.
* Integrate eco-awareness into Islamic education.
* Publish "green Qurans", printed on paper procured from sustainable wood.
* Create a specialized TV channel focused on Islam and the environment.
* Create award and prize systems for excellence in this field.
The ultimate goal here, as with other faith groups, is to radically redefine faith-based relationships with the environment.
While the "greenie" movement is still seen as a secular front by a number of conservative groups, the world's oldest religions with a following in the billions will be able to bring much needed attention and authority to an issue that has predated our recognition of it.
Faith groups also realize the inherent relationship they have with the environment.
Islam's Sufi Muslims have long been known to have a deep reverence for nature.
However, despite regional shifts toward eco-awareness, ARC Secretary General Martin Palmer accurately points out the difficulty Islamic groups face in changing what are essentially government policies.
As Palmer states, essentially Muslims groups will be "saying to Islamic governments that this is how you should act Islamically."
Palmer also astutely notes that implementing green changes is possible if they can be shown to be Islamic in nature.
In anticipation of this obstacle, plans are in place to formally launch the Muslim Association for Climate Change Action (MACCA).
Can Alternative Energy Save the Economy and the Climate?
The "new energy" economy rolls forward even as hopes for an international deal to combat climate change at Copenhagen shift into reverse.
It's part of a $1 billion investment in the United States by wind turbine maker Vestas, what Colorado Gov. Bill Ritter touts as a "new energy economy."
"We have a caseload of 56 prospects. Of those, a majority are energy-related industries," said Raymond Gonzales, president of the Brighton Economic Development Corporation.
"People are looking.
They're not slowing down.
And they're aggressively looking at the United States."
Some say these efforts - not the upcoming Copenhagen climate treaty talks - provide the most promising route to energy independence, climate change mitigation and job creation.
Regardless of whether delegates emerge next month with a comprehensive replacement for the Kyoto Protocol, industry's full-throttle acceleration toward a low-carbon future will continue, they say.
Vestas isn't the only company spending millions of its capital.
Several utilities are investing some $1 billion in industrial-scale carbon capture and storage tests at coal plants in Wisconsin, West Virginia and Oklahoma.
The race to perfect the batteries that will power the next generation of automobiles and buses has manufacturers in Europe, the United States and China scurrying to build plants and research centers.
"The vast majority of the utility industry (has) pretty much accepted the reality that CO2 is something they have to cope with," said Revis James, director of the energy technology assessment center for the Electric Power Research Institute, or EPRI, a California-based nonprofit that helps drive long-range development and is coordinating carbon capture experiments at coal plants in the Midwest and Southeast.
Failure in Copenhagen won't "substantially stop what's going to happen," James added.
"The utilities have to deal with (carbon emissions). They have to respond one way or another."
Many business leaders and policy analysts counter that the status quo - a piecemeal, federated approach to carbon and energy policy - doesn't send a strong enough signal to produce the revolution required of our economic and energy sectors.
Private-sector investments and regional and local government efforts to boost "green" technology are good, they say.
But that's just the down payment:
The transformative change necessary to avoid the worst warming won't come until the international community firmly sets a global standard in place.
"What you want is something sustainable, predictable and long-term," said Roby Roberts, spokesman for Vestas Americas. "That's what you want out of the climate rules, but that's going to be a few years away.
Vestas is making its $1 billion bet in the United States because local governments have sent enough of a signal and all signs point to the country as the next great wind market, Roberts said.
For instance, some 29 states have imposed renewable fuel portfolio standards - requirements that utilities generate a minimum percentage of their electricity from renewable fuels such as wind, solar or geothermal.
"Given the fundamentals in the United States, it does rival any place in the world," Roberts said.
"Instead of waiting, we're putting out money there."
But the lack of certainty is unsettling, Roberts acknowledged.
Renewable fuels standards are inconsistent.
Tax and pricing policies are fickle.
"They have a lot of off-ramps that really give you pause."
True change, advocates say, will not happen until carbon has a price.
And that needs either comprehensive domestic energy and climate legislation now before the Senate or a worldwide climate treaty.
"You can't have on-again, off-again fiscal policies," said Energy Secretary Stephen Chu at a White House press briefing last month.
"As soon as you stop fooling around with all this on-again, off-again stuff you start seeing real growth."
So what, then, to make of all the growth underway?
Outside Racine, Wisc., the plume from We Energies' Pleasant Prairie power plant dominates the skyline, sending some 3,500 gallons a minute of vaporized Lake Michigan water pouring skyward round the clock.
The 1,300-megawatt plant is Wisconsin's biggest.
But down below, deep in the shadow of the plant's 450-foot stack, is the feature that makes this plant unique in the state:
A way to capture some of the carbon dioxide it sends skyward from all that coal.
Bolted to the plant is a labyrinth of pipes, valves and catwalks surrounding two modest cooling towers. Diverted flue gas is cooled and filtered through ammonia in one column, then pumped to the other, where steam from the plant reheats it and strips off the carbon.
It is simply a test.
The five-story contraption is dwarfed by other scrubbers stripping soot, sulfur and other pollutants from the plant's flue.
It can capture 90 percent of the carbon dioxide from the exhaust, although We Energies is diverting less than 1 percent of its flue through the CO2 scrubber.
There's no place to store that carbon anywhere in Wisconsin - the state's geology is just too porous, say plant operators - and so after capture the carbon dioxide is released back into the flue stream.
But the test is a success - the first industrial-scale example of carbon capture in the United States. It is step No. 1 in a three-part, $1.1 billion experiment being conducted by a consortium of energy companies to establish that carbon capture and sequestration technology can strip greenhouse gas emissions from today's coal-fired power plants.
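The Pleasant Prairie figures deserve a moment of arithmetic: the scrubber removes 90 percent of the CO2 in the gas it treats, but it only treats a small slice of the plant's flue stream. A quick sketch (using the "less than 1 percent" figure from the article as an upper bound) shows how little of the plant's total CO2 output the pilot actually captures.

```python
# Effective capture fraction for the Pleasant Prairie pilot:
# 90% removal efficiency applied to <1% of the flue stream.
capture_efficiency = 0.90   # fraction of CO2 removed from diverted gas
flue_fraction = 0.01        # upper bound on share of flue diverted

effective_capture = capture_efficiency * flue_fraction
# i.e. under 1% of the plant's total CO2 output is captured -
# the point of the pilot is to prove the chemistry at scale,
# not to reduce the plant's overall emissions.
```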
The second part of this experiment went online last month:
American Electric Power's Mountaineer plant in West Virginia became the first plant in the United States to sequester carbon dioxide emissions, diverting approximately 20 megawatts' worth of emissions and burying the carbon dioxide in saline rock formations 8,000 feet below the Ohio River. The third test, expected to start in 2012, will scale the process up to semi-commercial production - a 200-megawatt coal plant in Oklahoma.
AEP, We Energies and 37 other partners involved with these tests are working feverishly to show that baseload coal can be part of a low-carbon energy future, said Henry Courtright, senior vice president of EPRI, which is helping coordinate the project.
They're not waiting for regulations, he added.
They're looking to beat them.
A new carbon framework will emerge, Courtright said.
And if coal doesn't have a way to beat those constraints, it will be left behind.
"We see a great deal of (regulatory) uncertainty over the next 10 years," Courtright said.
"That's why it's vitally important to have this carbon capture technology in place by 2015.
If it slips out, you're probably looking at nuclear as your low-carbon, predictable energy source."
In this sense, Copenhagen is irrelevant.
Managers building next-generation battery plants or looking for places to put wind turbine factories aren't worried the talks will end in mid-December with no consensus on carbon limits.
They fear their competitors - or a different industry, or an as-yet unseen technology, or even another country - will emerge victorious in the race to decarbonize various industrial and economic sectors.
China intends to invest $1 million an hour for the next decade - $88 billion in all - in green technologies. The Energy Department expects solar and wind to balloon into a $3.5 trillion market.
President Obama last month offered up $3.4 billion in matching grants to hasten development of a "smart grid" in the United States.
"What's truly amazing is the amount of investment flowing into green technology in the absence of any price signal," said Kristen Sheeran, director of the Economics for Equity and the Environment Network.
"It's clear we'll continue to see these kinds of investments flowing into green technology, if for no other reason that the Chinese are doing it ... and U.S. producers are realizing this is where the future is going to be made."
But to maximize development - to see revolution across vast sectors of the economy - many economists say government has to step in with some sort of incentive or signal.
In general, economists see two ways to drive new technologies and shift cultural paradigms:
Policy makers can push the technology into the market by targeting investing at specific research.
Or they can pull it by setting a bar or standard and offering incentives to clear the mark.
Economists disagree wildly on the effectiveness of various strategies. California in the late 1990s and early 2000s tried to pull the automobile market into electric power by requiring that a certain percentage of each automaker's sales in the state be zero-emissions vehicles. President Carter in the 1970s tried to push the development of solar and wind energies, investing some $1.4 billion a year in solar alone by 1981.
Neither ultimately worked.
So far federal and state governments have mostly pushed efforts to decarbonize the economy, offering grants and tax incentives for specific projects.
But so long as carbon is free, those efforts push against prevailing economic forces, said Adam Jaffe, dean of College of Arts and Sciences and professor of economics at Brandeis University.
"You can push against economic forces, and you may have some success, but if you get a framework that puts a price on carbon, then the economic forces are on your side," he said.
For climate this is particularly crucial, he said, given the problem is global, the consequences dire, the remedies unknown and almost uncountable.
By way of analogy, Jaffe compares a price on carbon to the economic incentives that have sent generations of miners off to the hills in search of glory and riches:
Instead of searching for gold, these new carbon prospectors will be looking for ways to cut emissions.
"We don't know where the deposits are or how deep in the ground they are or which are expensive and which are cheap," he said.
"But the clearest thing to do, if you want a whole bunch of people running around prospecting for carbon, is to make it clear that carbon has value."
Sheeran agrees. Failure at Copenhagen means technical change continues, albeit in fits and starts. It will remain confined mostly in the developed world, continuing the global technological divide.
It also likely won't force the breakthroughs that can bring the steep emissions cuts necessary to keep atmospheric carbon dioxide levels at 350 parts per million - a threshold the globe passed in 1987 and that is seen simultaneously as impossible to reach yet crucial to maintain to avoid the worst climatic effects.
"The faster we can get a price signal the faster we can get across that the era of carbon emissions is coming to an end," she said.
A taste of that new era can be found out here on the Colorado prairie, where Weld County Road 4 has been renamed New Energy Drive.
Right now it's the taste of dust.
Vestas' two plants are going up faster than Weld and Adams counties can pave the roads to the site, and truck traffic combined with winds whipped the grit across the landscape on a recent autumn afternoon.
But soon it will be the taste of money, or so city leaders hope.
Vestas is bringing 2,000 jobs to the region, and Brighton civic leaders anticipate the creation of another 4,000 as the plant draws ancillary suppliers.
The city spent $40 million in infrastructure improvements last year alone, including $8 million delivering sewage, water, utilities and other upgrades to the Vestas site.
Raymond Gonzales, president of the city's Economic Development Corporation, grew up in Brighton before leaving for Washington, D.C., and a stint in Albuquerque as then-Gov. Bill Richardson's labor secretary.
He remembers when Brighton was a farm town with nothing but a K-Mart distribution center on its outskirts. That's still there, except now it is landlocked by neighborhoods and shopping centers. The city is booming, having evolved from farms to a bedroom community to a live/work place of its own.
It has a new hospital, new City Hall, new $5.5 million library.
The old Armory re-opened last month as an arts center, the first time Brighton has had such a center since the opera house burned on July 25, 1955.
"Having a federal framework is critical to the success of this," Gonzales said.
"Federal policy, state policy has to be in favor of attracting these kinds of investments."
Colorado wants that economy and has put the state policy in place, said James Martin, executive director of Colorado's Department of Public Health and the Environment.
Federal and international policies will eventually follow.
But in the meantime, he added, there's plenty of work for states and local governments to do.
"I don't think you'll see any retrenchment in Colorado" if federal or international climate mitigation efforts collapse, he said.
"These are very large markets that will exist no matter what."
"Copenhagen is tremendously important for a host of reasons," he added.
"But our commitment to renewable energy and natural gas are independent of the climate debate in many respects."
"You'll see that across the country," he said.
"As industry leaders break out of the pack and demonstrate it can be done, everybody follows."
"It's a new paradigm, but it's a fairly painless paradigm shift."
NOAA - U.S. Posts Third Coolest and Highest Precipitation for October on Record
November 10, 2009
The October 2009 average temperature for the contiguous United States was the third coolest on record for that month, according to NOAA's State of the Climate report issued today.
Based on data going back to 1895, the monthly National Climatic Data Center analysis is part of the suite of climate services provided by NOAA.
The average October temperature of 50.8 degrees F was 4.0 degrees F below the 20th Century average.
Preliminary data also reveal this was the wettest October on record, with average precipitation across the contiguous United States reaching 4.15 inches, 2.04 inches above the 1901-2000 average.
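The anomalies NOAA reports imply the long-term baselines directly, and they also let the "nearly doubled" precipitation claim quoted later in the report be checked. A small sketch of that arithmetic:

```python
# Baselines implied by NOAA's October 2009 anomalies.
october_temp_f = 50.8
temp_anomaly_f = -4.0   # 4.0 F below the 20th-century average
baseline_temp_f = october_temp_f - temp_anomaly_f   # 54.8 F

october_precip_in = 4.15
precip_anomaly_in = 2.04   # above the 1901-2000 average
baseline_precip_in = october_precip_in - precip_anomaly_in  # 2.11 in

# 4.15 / 2.11 is about 1.97, consistent with precipitation
# "nearly doubling" the long-term average.
ratio = october_precip_in / baseline_precip_in
```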
U.S. Temperature Highlights
October 2009 statewide temperature ranks.
* October 2009 was marked by an active weather pattern that reinforced unseasonably cold air behind a series of cold fronts.
* Temperatures were below normal in all regions with the exception of the Southeast which had near normal temperatures for the month.
* Oklahoma recorded its coldest October on record while the month ranked in the top five for Arkansas, Colorado, Iowa, Kansas, Minnesota, Missouri, Montana, Nebraska, South Dakota, and Wyoming.
* Florida was the only state to record an above normal temperature average in October.
It was the sixth consecutive month that Florida's temperature was above normal.
U.S. Precipitation Highlights
October 2009 statewide precipitation ranks.
* The nationwide average precipitation of 4.15 inches nearly doubled the long-term average.
This was the first month since December 2007 that no region in the United States recorded below normal precipitation.
* Iowa, Arkansas, and Louisiana recorded their wettest October while only Florida, Utah, and Arizona had below normal precipitation.
* Moderate-to-exceptional drought covered 12 percent of the contiguous United States, the second-smallest drought footprint of the decade, based on the U.S. Drought Monitor.
Major drought episodes in California and South Texas improved significantly.
Drought conditions, however, emerged across much of Arizona.
* About 45 percent of the contiguous United States had moderately-to-extremely wet conditions at the end of October, according to the Palmer Index.
This is the largest such footprint since February 2005.
National Solar Observatory, NASA say no "Maunder Minimum" — sorry, deniers — Solar Cycle 24 poised to rev up
June 18, 2009
The sunspot cycle is about to come out of its depression, if a newly discovered mechanism for predicting solar cycles — a migrating jet stream deep inside the sun — proves accurate.
And that will add a small amount of warming in the next few years, which were already predicted to be record-setting by two recent studies.
When we last looked at the sun [please, don't try that at home], we were at "a 12-year low in solar ‘irradiance'."
As NASA explained in April:
"the sun's brightness has dropped by 0.02% at visible wavelengths" since the solar minimum of 1996, which was "not enough to reverse the course of global warming."
It's been "the quietest sun we've seen in almost a century," said sunspot expert David Hathaway of the Marshall Space Flight Center.
The deniers have been rooting for a Maunder Minimum to stifle global warming (which it wouldn't have done anyway, see here). But human-caused global warming is so strong that not bloody much stifling has been going on given that "this will be the hottest decade in recorded history by far," nearly 0.2°C warmer than the 1990s. Heck, even with a La Niña and an unusually inactive sun, 2008 was almost 0.1°C warmer than the decade of the 1990s as a whole — and of course the 1990s were, at the time, the hottest decade in recorded history.
Changes in the sun just ain't the big dog anymore when it comes to driving climate change (see here).
Yesterday, NASA reported remarkable news, "Mystery of the Missing Sunspots, Solved?":
June 17, 2009:
The sun is in the pits of a century-class solar minimum, and sunspots have been puzzlingly scarce for more than two years. Now, for the first time, solar physicists might understand why.
At an American Astronomical Society press conference today in Boulder, Colorado, researchers announced that a jet stream deep inside the sun is migrating slower than usual through the star's interior, giving rise to the current lack of sunspots.
Rachel Howe and Frank Hill of the National Solar Observatory (NSO) in Tucson, Arizona, used a technique called helioseismology to detect and track the jet stream down to depths of 7,000 km below the surface of the sun.
The sun generates new jet streams near its poles every 11 years, they explained to a room full of reporters and fellow scientists. The streams migrate slowly from the poles to the equator and when a jet stream reaches the critical latitude of 22 degrees, new-cycle sunspots begin to appear.
Above:
A helioseismic map of the solar interior. Tilted red-yellow bands trace solar jet streams. Black contours denote sunspot activity.
When the jet streams reach a critical latitude around 22 degrees, sunspot activity intensifies. [more graphics]
Howe and Hill found that the stream associated with the next solar cycle has moved sluggishly, taking three years to cover a 10 degree range in latitude compared to only two years for the previous solar cycle.
The jet stream is now, finally, reaching the critical latitude, heralding a return of solar activity in the months and years ahead.
"It is exciting to see", says Hill, "that just as this sluggish stream reaches the usual active latitude of 22 degrees, a year late, we finally begin to see new groups of sunspots emerging."
The current solar minimum has been so long and deep, it prompted some scientists to speculate that the sun might enter a long period with no sunspot activity at all, akin to the Maunder Minimum of the 17th century.
This new result dispels those concerns. The sun's internal magnetic dynamo is still operating, and the sunspot cycle is not "broken."
If Solar Cycle 24 is going to rev up soon, it won't affect global temperatures quickly. NASA explained in January:
Because of the large thermal inertia of the ocean, the surface temperature response to the 10-12 year solar cycle lags the irradiance variation by 1-2 years. Thus, relative to the mean (i.e., the hypothetical case in which the sun had a constant average irradiance), actual solar irradiance will continue to provide a negative anomaly for the next 2-3 years.
Also, Solar Cycle 24 has recently been predicted to be on the wimpy side.
No, it's a long and strong El Niño that would give us a global temperature record this year or next.
But when you put everything together — record, rapidly rising GHG concentrations, neutral or positive ENSO, and a return to the normal solar cycle — then you get what the peer-reviewed scientific literature has forecast:
# The "coming decade" (2010 to 2020) is poised to be the warmest on record, globally.
# The coming decade is poised to see faster temperature rise than any decade since the authors' calculations began in 1960.
# The fast warming would likely begin early in the next decade — similar to the 2007 prediction by the Hadley Center in Science (see "Climate Forecast: Hot — and then Very Hot").
Causes of energy crisis
by Jack Hewitt
Posted on December 26th 2009
Affordable and abundant energy is the lifeblood of our society.
Coal, oil and natural gas supply almost 90% of the world's energy needs.
Hydro energy, nuclear energy and coal are primarily used to produce electrical energy.
Biomass is used for cooking and heating.
Natural gas is used mostly for heating.
Our salvation lies primarily in wind and solar power. Although these sources may seem small, they represent the future because they are sustainable.
Oil is uniquely versatile and as a result powers almost all our machines. Every day, oil-powered airplanes carry huge numbers of people across the oceans at nearly the speed of sound.
Oil-powered vehicles transport and produce our food.
In the USA alone, oil-powered vehicles have more seats than there are people to fill them. Oil-powered machines underpin the only way of life we have known for generations. Clearly, we are living in the age of oil, but that age is rapidly drawing to a close.
At current production rates, there is only enough oil to last about 44 more years.
In practice, production cannot stay constant: as reserves are depleted, output will inevitably decline.
Likewise, there is only enough coal to last 133 years and only enough natural gas to last 61 more years. Certainly by now, everyone realizes that gas and oil will become expensive and scarce within the lifetimes of our children or their children.
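The lifetimes quoted above are simple reserves-to-production (R/P) ratios. A minimal sketch, using assumed round-number figures for oil that roughly reproduce the 44-year estimate (they are illustrative, not exact 2009 reserve data):

```python
# Reserves-to-production (R/P) ratio: how long a resource lasts if
# production stays constant until the reserve is used up.
def rp_ratio(reserves, annual_production):
    """Years remaining at constant production."""
    return reserves / annual_production

# Assumed: ~1,300 billion barrels of proven oil reserves,
# ~30 billion barrels produced per year.
oil_years = rp_ratio(1300, 30)
print(round(oil_years))  # about 43 years, close to the article's 44
```

The same ratio applied to coal and gas reserve figures yields the 133- and 61-year estimates; the article itself notes why the constant-production assumption is optimistic.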
There will inevitably be a transition to more renewable energy sources. That transition may be haphazard or planned — it is on us to decide. 66.3 percent of the world's gas reserves are in the Middle East and the Russian Federation.
The United States has 3.4 percent.
On the other hand, the United States consumes 25 percent of the world's oil, 70 percent of which is imported.
The coming times of scarce energy reserves will be very hard for everyone here but it will be even harder if it is not anticipated.
It is hugely important that the public and decision-makers understand the facts about the energy crisis we are about to face.
Americans consume 25% of all the energy produced in the world every year, but that consumption can't last forever. To that end, Congress passed the Energy Policy Act of 1992.
This act promotes increases in the generation and utilization of electricity from renewable energy sources and furthers the advances of renewable energy technologies. In 1996, the Renewable Energy Policy Project released "The Environmental Imperative," a plan for the energy market to draw on renewable energy to avoid the severe environmental impacts of the fossil fuel cycle.
This plan details the environmental necessity of accelerating the use of renewable resources. It is important to realize that it usually takes thirty to forty years to shift fuel patterns significantly, and that using electricity as an alternative to oil will require a major adjustment by the American public.
There is only a limited window of opportunity for us to make this energy transition without a major economic disruption.
What is thermal energy
by Max Rutherford
Posted on December 12th 2009
Energy is, in essence, the capacity to perform tasks and do work.
Thermal energy is energy in the form of heat.
Heat is the motion of the molecules and atoms inside a substance.
The more rapidly the molecules and atoms inside a substance vibrate, the hotter that substance is and the greater the thermal energy it produces and radiates.
Heat and temperature are closely related: if you increase the heat of a substance by applying energy, causing its molecules and atoms to travel or vibrate more rapidly, you invariably increase the substance's temperature.
However, in the case of water, once it reaches its boiling point you cannot raise its temperature further by adding thermal energy.
Instead, the added energy converts the water into steam.
The sun is the principal source of heat and its core is a mass of rapidly-moving gas particles. The sun's radiated heat provides our planet with a very powerful stream of thermal, solar energy.
Other forms of energy can also be transformed into thermal energy.
For example, the burning of fossil fuels utilizes combustion to convert the stored solar energy in the fossil fuel (originally plants that absorbed sunlight in order to produce food) into thermal energy.
We use thermal energy to generate electricity, heat our homes and enable steam-driven processes.
The uses of thermal energy require heat to be transferred from a hot substance to a cold one, which raises the temperature of the cold substance and can bring about a change of state – from solid to liquid, liquid to gas, and so on.
Heat transference happens chiefly by three methods, or by a combination of them.
The first is conduction, in which thermal energy is transferred through a solid, or from one substance to another, molecule by molecule: heated molecules vibrate and collide more frequently, heating neighbouring molecules in turn, until the energy has dissipated through the material. How readily heat travels through a substance – its value as a conductor – depends on its molecular structure.
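The conduction process described above is quantified by Fourier's law, Q = kAΔT/d, where k is the material's thermal conductivity (its "value as a conductor"). A sketch using assumed values for a single-glazed window pane:

```python
def conduction_rate(k, area, delta_t, thickness):
    """Fourier's law: steady-state heat flow in watts through a slab.
    k: thermal conductivity (W/m.K), area (m^2),
    delta_t: temperature difference (K), thickness (m)."""
    return k * area * delta_t / thickness

# Assumed values: a 1 m^2 pane of glass (k ~ 0.8 W/m.K), 4 mm thick,
# with a 20 K difference between inside and outside air:
watts = conduction_rate(0.8, 1.0, 20, 0.004)  # about 4000 W of heat flow
```

Swapping glass for a good insulator (k around 0.04 W/m.K) cuts the flow twentyfold, which is the whole point of choosing materials by conductivity.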
Convection is the next process and in this process heat is transferred by the movement of hot particles. Convection occurs in liquids and gases, but not in solids as their particles are unable to move freely.
Convected heat always travels to a colder place, and the hot particles that contain thermal energy are the ones that travel.
As an example, convection heaters blow out hot air particles which then disperse into cooler air.
The final process is radiation, and the sun is a good example of it. The sun's radiated energy can travel across space because, unlike conduction and convection, it does not need any particles of matter.
This kind of radiant energy travels in electromagnetic waves and these waves are divided according to wavelength.
An example of the application of this kind of energy is in the use of infrared lamps used to ease muscular pain.
These lamps use electricity to generate heat, which is transferred to the body by radiation.
A World Imperiled: Forces Behind Forest Loss
As the first seven sections of this site have described, tropical rainforests are incredibly rich ecosystems that play a fundamental role in the basic functioning of the planet.
Rainforests are home to probably 50 percent of the world's species, making them an extensive library of biological and genetic resources.
In addition, rainforests help maintain the climate by regulating atmospheric gases and stabilizing rainfall, protect against desertification, and provide numerous other ecological functions.
However, these precious systems are among the most threatened on the planet.
Although the precise area is debated, each day at least 80,000 acres (32,300 ha) of forest disappear from Earth.
At least another 80,000 acres (32,300 ha) of forest are degraded.
Along with them, the planet loses as many as several hundred species to extinction, the vast majority of which have never been documented by science.
As these forests fall, more carbon is added to the atmosphere, climatic conditions are further altered, and more topsoil is lost to erosion.
Despite increased awareness of the importance of these forests, deforestation rates have not slowed.
Analysis of figures from the Food and Agriculture Organization of the United Nations (FAO) shows that tropical deforestation rates increased 8.5 percent from 2000-2005 when compared with the 1990s, while loss of primary forests may have expanded by 25 percent over the same period.
Nigeria and Vietnam's rate of primary forest loss has doubled since the 1990s, while Peru's rate has tripled.
Overall, FAO estimates that 10.4 million hectares of tropical forest were permanently destroyed each year in the period from 2000 to 2005, an increase since the 1990-2000 period, when around 10.16 million hectares of forest were lost.
Among primary forests, annual deforestation rose to 6.26 million hectares from 5.41 million hectares in the same period.
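The change between the two periods can be computed directly from the FAO figures just quoted (the headline percentages cited earlier in this section come from a different baseline comparison):

```python
# Percent change in annual forest loss between 1990-2000 and 2000-2005,
# using the figures quoted above (millions of hectares per year).
def pct_change(old, new):
    return (new - old) / old * 100

all_tropical = pct_change(10.16, 10.4)   # all tropical forest
primary_only = pct_change(5.41, 6.26)    # primary forest only
print(f"{all_tropical:.1f}%, {primary_only:.1f}%")  # prints "2.4%, 15.7%"
```

The contrast is the point: overall loss crept up, while the loss of primary forest jumped far faster.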
On a broader scale, FAO data shows that primary forests are being replaced by less biodiverse plantations and secondary forests.
Due to a significant increase in plantation forests, forest cover has generally been expanding in North America, Europe, and China while diminishing in the tropics.
Industrial logging, conversion for agriculture (commercial and subsistence), and forest fires—often purposely set by people—are responsible for the bulk of global deforestation today.
But enough about the extent and some of the effects of deforestation.
What is responsible for this loss?
This is the question this section addresses.
Deforestation and Degradation
Before expanding further on forest loss it is critical to first explain what is considered "forest" and what is meant by deforestation and forest degradation.
Climate bill chaos:
Your guide to the two newest proposals
December 28, 2009
In the madness leading up to and following the U.N. Climate Conference in Copenhagen two weeks ago, many have lost track of the climate change legislation working its way through Congress in the U.S. The last many heard of it, the controversial Kerry-Boxer bill was bound up in committee, poised for almost certain doom.
Since then, two shiny, new proposals have been put on the table.
But do they really have a better shot?
Before we take a closer look at the two fresh packages, called the Kerry-Lieberman-Graham bill (PDF) and Cantwell-Collins bill (PDF), let's take a moment to ponder what all this activity will actually accomplish.
When it comes to successfully pushing laws through Congress, strong backing and cohesiveness are key.
So these bills — pitched on consecutive days — are facing some steep odds. Not only has the urgency of emissions reductions been weakened by Copenhagen's failure to produce a treaty, but now the Senate's attention is being divided between several different ideas.
Their saving grace? Bipartisan and moderate-Democrat sponsors who might make their approval more palatable to previously staunch opponents of climate change legislation.
Then again, Sen. John Kerry (D-Mass.) and Sen. Joe Lieberman (D-Conn.) don't necessarily have popularity and political capital on their side right now.
So it will be interesting to watch.
First, let's look at Kerry-Lieberman-Graham, since it debuted first.
Like the Kerry-Boxer and Waxman-Markey bills before it, it would establish a fairly elaborate cap-and-trade system for carbon permits. It's surprising that Kerry is still peddling this idea after it's drawn such ire from his peers in the Senate.
Republicans and moderate Democrats have done a good job arguing that cap-and-trade would further burden companies still struggling to rebound from the recession and ultimately hurt consumers. Cap-and-trade also proved unpopular in Copenhagen, and the ranks of those against it continue to swell.
Kerry-Lieberman-Graham would also:
* Set a goal of a 17 percent reduction in carbon emissions (below 2005 levels) by 2020, and an 80 percent reduction by 2050.
* Create a global system for trading carbon offsets and carbon-based securities. (Offsets are a tricky business because they are basically promises from organizations that they will prevent a certain amount of emissions from being released — there are no guarantees.)
* Distribute an initial round of carbon permits (85 percent of them at least) to utilities and other organizations, which could then sell them to bring in revenue. (Some are viewing this as an underhanded subsidy for utilities without many strings attached.)
* Earmark government support for nuclear energy, clean coal technology and more efficient oil and gas drilling. (This seems like a concession to opponents who say the bill will hurt the job market.)
The Cantwell-Collins bill, also proposed before the world got swept up in Copenhagen, slims the climate change issue down considerably (it's about 50 pages instead of 1,500). Its sponsors, Sen. Maria Cantwell (D-Wash.) and Sen. Susan Collins (R-Maine), claim that, if implemented correctly, it could cut greenhouse gas emissions by 20 percent by 2020 and 83 percent by 2050 — actually exceeding the goals of Kerry-Lieberman-Graham.
Its real strength, however, is that it would refund 75 percent of all of the revenue it might bring in.
Instead of using the typical language of cap-and-trade systems, the bill says it would sell "carbon shares" to fuel makers and return the proceeds to American taxpayers in the form of a monthly rebate check.
It's estimated that this system could result in annual payments of $1,100 for an average family of four — amounting to $21,000 between 2012 (when it would go into effect) and 2030.
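The $21,000 figure follows from simple arithmetic on the article's own numbers (assuming the estimated $1,100 annual payment stays flat for the whole period):

```python
# A flat $1,100 annual rebate paid from 2012 through 2030 inclusive:
annual_rebate = 1100
payment_years = 2030 - 2012 + 1        # 19 payment years
total = annual_rebate * payment_years
print(total)  # 20900, i.e. roughly the quoted $21,000
```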
Americans certainly do like to get paid — so this proposal is sounding easier to swallow already.
Here's the rundown on Cantwell-Collins (also known as the CLEAR Act):
* It would cap fossil fuels both imported and produced domestically.
Fossil fuel producers would have to buy emissions permits from the U.S. government at monthly auctions that they could then trade on a secondary market.
* The regulations are targeted at the actual producers of fossil fuels — not anyone else along the supply chain.
For example, the coal mining companies would be the ones to purchase the permits, not power plant operators or utilities.
* While 75 percent of the money generated by the government from the program would be distributed to regular citizens as rebate checks, the other 25 percent would be funneled into clean energy research.
* Early estimates say that the Cantwell-Collins program could raise $10 to $32 billion every year to start, a figure that will go up as carbon prices inevitably rise.
* Just as Kerry-Lieberman-Graham builds in support for nuclear and clean coal interests, this bill includes provisions to cut down on non-CO2 greenhouse gases such as methane, and to encourage carbon sequestration.
The two proposals are impressively different (finally, options!), and Cantwell-Collins seems particularly progressive.
Whether or not it will actually achieve reductions in emissions is up for debate.
Many are arguing that it would take too long for its initiatives to make a difference in how fuel producers operate and that the resulting investment in cleantech research would be minimal.
But some money is better than nothing — and as long as giveaways of allowances to polluters don't dilute the permitting market, the bill should have some teeth.
Regardless of their various merits and drawbacks, one of these bills needs to win a majority on the Senate floor this year to make any difference at all.
And the sooner the better. Former vice president Al Gore is calling for a deadline of Earth Day in April, and this seems like a fair timeline, considering how swiftly the health care measure moved once Obama gave it his full attention.
If I had to make a prediction, I'd place my bets on Cantwell-Collins. Based on response to Kerry-Boxer and Waxman-Markey before that, the idea of carbon trading has about 40 solid supporters (all Democrats). What will push one of the bills over the top is how it communicates the nature of the trading system.
Cantwell-Collins appears to be much simpler in its approach (making it easier to explain to constituents), and seems to give fewer perks to Wall Street and utility fat cats. It also distinguishes itself by emphasizing the importance of further research into sources of clean energy.
At the same time, it takes the interests of the existing oil, gas and coal infrastructures into account with its auction-based pitch.
And the clincher:
it means money in regular people's pockets. This will almost certainly sway the moderate Democrats and Republicans who have stymied earlier legislation.
While there's a chance that neither bill will make it past the Senate, one thing is for certain:
President Barack Obama will have to choose between them at some point.
He was a big supporter of both Waxman-Markey and Kerry-Boxer, and seems to be a proponent of cap-and-trade in general.
That said, he needs to get something passed, and fast, to make it look like he's making good on his major campaign promises. He already missed his end of 2009 deadline.
So if Cantwell-Collins seems to have a better shot, it'll probably get his blessing.
Recycling
Recycling is the process of re-using materials that would otherwise be discarded in landfill sites.
Most local councils offer door-to-door collections of certain recyclable materials such as paper and glass.
They also provide banks of containers in the community where you can deposit your recycling waste.
This might include clothes, plastics and cardboard.
Recycling can reduce greenhouse gas emissions by reducing the amount of mining/growing of raw materials, transportation of these materials and the energy that goes into these processes for the manufacture of new products.
It also reduces the impact on the environment of refuse disposal in landfill sites.
Reusing is even better than recycling, as no extra emissions are created.
Recycling is an excellent example of a sustainable industry.
Many of the mainstream products available today are made from recycled materials.
Fleece jackets and other synthetic products are usually made from recycled plastic bottles and containers.
Glass bottles can be made from recycled glass (usually from recycled glass bottles!).
Parts for aeroplanes and cars can be made from recycled drink cans.
Paper and cardboard can be made from recycled paper.
Food waste can be turned into compost and waste from your toilet can be recycled as fertiliser and water to irrigate crops.
It's all common sense really, but why waste perfectly good materials?
The world could save a lot of money and greenhouse gas production in the future by adopting a stricter policy of recycling.
It does not make sense to bury materials underground, only to quarry and mine new materials elsewhere to make the same products.
Used grease and cooking oils can be collected and turned into Biodiesel, a fuel for vehicles with diesel engines.
Shoe soles can be made from recycled aeroplane tyres.
But, the best type of recycling is Re-Using!
If you can re-use a bottle or envelope yourself, the recycling process does not have to take place and even less greenhouse gas is produced.
Global warming activists ignore the science they claim to support
Written by Todd Myers, KPBJ | December 29 2009
Polar bear populations are higher today than during the last forty years.
When discussing global warming, one phrase recurs:
"scientific consensus."
Environmental activists often cite "science" when arguing for far-reaching and costly responses to global warming.
Ironically, those activists ignore the findings of that same science.
The potential impacts they cite are based not on science but on speculation which contradicts the actual science.
One activist claimed that, "In the lifetime of a child born today, sea levels could rise three to six feet."
The scientific consensus says this is nonsense.
Using the science from the UN's Intergovernmental Panel on Climate Change (IPCC), University of Washington scientists found the most likely sea level rise for the next century is about 13 inches, with the high of 50 inches called "very unlikely."
Understand also, sea levels rose about one foot during the last 150 years.
When it comes to sea level science, environmental activists ignore the findings they pretend to support.
The same is true with recent storms. Many people, from environmental activists to the Governor, claim that recent storms, like those that caused floods in Centralia, are evidence of global warming.
Top local climatologists, however, disagree.
University of Washington atmospheric scientist Cliff Mass says the link between climate change and recent storms is false.
He notes that "there is no strong evidence for these claims," and that "initial simulations of future Northwest climate do not suggest heavier rain events."
Activists seize upon weather headlines in the hope that the public will believe they are linked to climate change, even when they are not.
Recent wildfires are also cited as evidence of climate change.
The National Wildlife Federation claimed that "Warmer temperatures are also to blame for the invasion of mountain pine beetles, which have already decimated over 32 million acres of forest in Washington and British Columbia."
Forestry scientists say the primary cause of insect infestations is that too many trees are fighting for too few nutrients and water. In many Washington forests there are many more trees per acre than hundreds of years ago. Stressed trees cannot fight off natural infestations that were manageable for centuries. While working at the State Department of Natural Resources, I spoke with many foresters and entomologists who demonstrated this very process.
Until recently, the environmental community made this very argument.
Arguing for a "natural" policy of letting forest fires burn, the Northwest Ecosystem Alliance said in 2001 that "Because we have vigorously enforced a no-burn policy in these forests, many have become clogged with thick clusters of trees that could easily explode into the monstrous conflagrations…" They have since changed their tune, not based on the science, but on politics.
Warmer temperatures can increase infestation, but ignoring the role of overstocked forests, and opposing the thinning necessary to help those forests recover, demonstrates a commitment to science only when it is convenient.
The threat to polar bears is another claim trotted out in a fact-free way. I spent a week in Barrow, Alaska one November, braving 30 degree below zero temperatures to see these truly magnificent creatures. I have a strong affinity for them.
But we must not ignore the facts. Polar bear populations are higher today than during the last forty years. Many more bears are killed by hunters each year than by climate change.
The belief that polar bears are drowning is based more on the cartoon segment of Al Gore's movie than on the reality of these notoriously strong-swimming animals.
One final claim is that climate change will have a dramatic impact on our mountain snowpack.
Scientists have repeatedly rebutted the claims made by climate alarmists. Snowpack has actually increased since 1980, a period when temperatures were increasing.
U.W. scientist Mark Stoelinga argues that increasing temperatures may impact snowpack, but that many other factors are involved.
He notes that in recent decades, "We can't see the global-warming signature in terms of a decline in snowpack."
Claims that, in the words of one environmental activist, "nearly 60 percent of the Cascade snowpack could be lost," are political, not scientific.
There is a real risk from the increase in carbon-dioxide in the atmosphere.
But a crisis mentality that relies more on fear than science is the surest path to costly solutions that fail to solve the real problem.
The Climate Conundrum Over Nuclear Energy
by Richard Harris
December 11, 2009
Nuclear power poses a major conundrum.
Nobody's thrilled about nuclear waste, people fear potential accidents, and proliferation of nuclear weapons is downright scary.
On the other hand, nuclear plants generate lots of power and no carbon dioxide.
And when people crunch the numbers to see how to phase out carbon dioxide emissions, they often come up with nuclear energy playing a major role.
"Nuclear technology is something that's there, we know how to do it, there's no technical challenge in being able to apply it, unlike many of the other technologies," says Richard Meserve, president of the Carnegie Institution for Science.
He used to chair the Nuclear Regulatory Commission, and he served on a recent National Academy of Sciences committee looking at the future of energy in America.
For example, there's no proven technology to capture and store carbon dioxide from coal plants.
So Meserve says the challenge in nuclear isn't technology, it's money.
"These plants are very expensive at the front end.
Wall Street lost a lot of money when we built our existing nuclear plants because of the long time to construct the plants in a period of very high interest rates."
So even though there are now 26 applications pending for new reactors, most everyone seems to be too timid to take the financial plunge.
The federal government is talking about loan guarantees and other incentives to help plants get over that hump.
Some see that as a giveaway to the industry.
Others see that as necessary to keep nuclear in the mix.
Tom Terbush of the Electric Power Research Institute says his utility-funded group is agnostic about where power should come from.
But when he looks at the deep carbon cuts being promised for the next decade or two, he too comes up with nuclear as part of the solution.
"These are big, basic plants," Terbush says, and "yes, they take a few years to build.
But we know how to build them.
And once you build them, a typical generating unit can produce enough power for nearly a million homes, so it's a big chunk."
Terbush says every option has its challenges.
You would have to put up a thousand wind turbines, for example, to match the power output of a single nuclear plant.
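That thousand-turbine comparison can be roughly reproduced once capacity factors are taken into account; the reactor and turbine sizes below are illustrative assumptions, not figures from EPRI:

```python
# Average output = nameplate capacity x capacity factor, in MW.
nuclear_avg = 1000 * 0.90   # assumed 1,000 MW reactor, ~90% capacity factor
turbine_avg = 2 * 0.30      # assumed 2 MW turbine, ~30% capacity factor

turbines_needed = nuclear_avg / turbine_avg
print(round(turbines_needed))  # 1500, i.e. on the order of a thousand turbines
```

The capacity factor matters: comparing nameplate ratings alone would understate the number of turbines by roughly a factor of three.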
So his organization figures that nuclear needs to play a substantial role, along with every other technology that produces clean power.
"If you're putting all your eggs in one or two or three baskets, even if you just put all your eggs in the nuclear basket, say, that probably wouldn't get you to where you need."
Protecting Big Carbon
Posted by Richard Sunday, December 20, 2009 climate change
In 2004, the global carbon market was worth less than $300 million.
But in 2005, the trade really started to soar, ending the year with $10.8 billion-worth of transactions. A year later, in 2006, the "carbon" market had grown to $31 billion.
In 2007, again it more than doubled its turnover, to $64 billion.
Last year, it did it again, reaching a colossal $126 billion.
By 2020, some estimates suggest the annual value will reach $2 trillion.
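Those year-end figures imply extraordinary growth rates. A quick compound-annual-growth-rate (CAGR) check, treating the values above (in billions of dollars) as given:

```python
# Compound annual growth rate between two year-end market values.
def cagr(start, end, years):
    return (end / start) ** (1 / years) - 1

boom = cagr(0.3, 126, 4)        # 2004 ($0.3bn) to 2008 ($126bn)
implied = cagr(126, 2000, 12)   # growth implied by a $2tn market in 2020
print(f"{boom:.0%}, {implied:.0%}")  # prints "353%, 26%"
```

Even the "slow" implied rate of roughly 26% a year helps explain why the financial houses were piling in.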
Not only does this represent a very significant business volume, its stunning growth rate makes carbon trading the hottest item in town, with banks, financial houses and independent brokers piling in to make a killing.
The larger part of the market comprises the EU's mandatory Emissions Trading Scheme (ETS), along with other much smaller allowance schemes, which together accounted for 73 percent of trading volume in 2008.
But the whole system is underpinned by what is known as "project-based transactions".
These comprise, in the main, so-called "carbon credits" generated by the UN's Clean Development Mechanism (CDM).
This mechanism was formally created in 1997 by the Kyoto climate treaty and started operating in a very small way in 1998, growing from 78 million "credits" (or Certified Emission Reductions, CERs, as they are formally known) to 333 million this year, with a projection of 1.7 billion by the end of 2012.
The "end of 2012" is, of course, the key milestone – when the Kyoto treaty lapses. And as the World Bank warned in 2008, that is when the gravy train could come to an abrupt halt.
"Created by regulation," it observed, "the carbon market's biggest risk is caused, perversely, by the absence of market continuity beyond 2012 and this can only be provided by policymakers and regulators."
It was those policymakers and regulators who were gathered at Copenhagen for the last two weeks, their primary concern – as Booker points out in his column this week - to protect this new and very valuable business.
Thus does he say that Copenhagen was not about global warming but money.
The cash that Hillary Clinton so dramatically plonked on the table, rising to $100 billion by 2020, which includes the £1.5 billion offered by Gordon Brown (money which of course he hasn't got) and which like a crazed gambler he last week upped to £6 billion (even more money he hasn't got), was merely a "sweetener" to persuade the developing countries to maintain the money-machine set in motion by Kyoto.
And that was the only really concrete achievement of Copenhagen, winning the agreement to the perpetuating of those Kyoto rules that have created this vast industry.
That much was acknowledged by John Prescott on Channel 4 News yesterday, who talked up the Copenhagen "accord" – as we must now call it – saying that we had achieved a "Kyoto II" – the essence of which is the carbon trading scheme.
Interestingly, this business has two main beneficiaries. On the one hand, there are all those Western "entrepreneurs" who have piled into what has become the fastest growing commodity market in the world.
But, on the other are that small number of people in China and India who have learnt how to work this system to their huge advantage, and account for the majority of CDMs.
As can be seen from the UN's own statistics - see chart above – these two countries account for by far the majority of the registered schemes, taking a 70 percent share of the total.
And, of the two, China is by far the bigger player, with 197,792,890 CERs as against 38,308,631 from India.
The level of China's involvement, as a major beneficiary of the scheme, makes a nonsense of the commentators at Copenhagen who were predicting that China might sabotage a deal.
With so much money at stake, there was no way China was not going to fall into line, showing up the much-reported spat between Obama and Chinese premier Wen Jiabao for exactly what it was – pure theatre.
Thus does Booker conclude that the part played at Copenhagen by all the tree-huggers, abetted by the BBC and their media allies, was to keep hysteria over warming at fever pitch while the politicians haggled over the real prize, to keep the Kyoto system in place.
The only tree they were concerned with hugging was the money tree and all the vast political apparatus that now supports it, allowing governments to tax and regulate us into handing over ever more of our money, largely without realising it, every time we drive a car, fly in a plane, pay our electricity bill or carry out any of a vast range of activities that involve the emission of CO2.
Compared with these sums, even the billions we all unwittingly spend on subsidies to the developers of useless wind turbines are chicken feed.
The tree-huggers have been well and truly "had" – but then so have we.
It is we who are going to pay, through our electricity bills, our taxes and our living expenses, in increasing amounts for this hidden bonanza which the negotiators so diligently protected last week.
Trading in what amounts to thin air, on the farcical premise that life-giving carbon dioxide is a "pollutant", they have perpetrated the biggest heist in the history of mankind, all to protect "Big Carbon".
Saving money on climate change
Posted on December 28, 2009 by robertkyriakides
We are now into the time of saving money.
For years nations have lived beyond their real means. What they thought was gold was merely gold plate, and the dreams of wealth for all, created en passant by those who pursue wealth for themselves, have proved to be, like all dreams, illusions. This is a game in which a handful of people win but the mass of humanity loses.
So the time of saving money descends, which for Governments means the time of not spending money; they have already spent money for many years to come, and we have now got to get used to Governments spending less.
There is plenty of unnecessary spending by Governments in every part of their budget.
When it comes to Government spending on climate change, for example, large amounts of money are spent on nothing in particular. There are reviews, enquiries and consultations, all of which are a complete waste of time.
You can tell from most consultations that the outcome of the consultation is predetermined by the questions. Why then bother with the expense of the consultation?
There are many committees and boards and advisory bodies. Almost none of them have any expertise.
It makes you wonder whether these bodies are intended to achieve anything real or whether they are simply sinecures for the party faithful or a means of isolating criticism.
They could all be abolished without the climate coming to any further harm.
Finally there is all the expenditure on carbon trading.
This is wasteful of money and drives people to believe that markets can solve the problem of climate change; markets, of course, cannot solve anything; they exist to provide a choice of goods and services.
Climate change can only be solved by measures, and markets abhor measures or any regulation which gets in the way of their optimizing their opportunities to make money.
We could abolish the whole apparatus of carbon trading without adding a single ounce of carbon into the atmosphere, because carbon trading saves no emissions now and will be unlikely to save any in the future.
Solar Could Generate 15% of Power by 2020, If US Ends Fossil Fuel Subsidies
The Result:
882,000 New Jobs, 10% Drop in Emissions
by Stacy Feldman - Dec 29th, 2009
Solar power technologies could generate 15 percent of America's power in 10 years, but only if Washington levels the playing field on subsidies, a report by the Solar Energy Industries Association (SEIA) says.
That means either rolling back fossil fuel subsidies, as President Obama proposed earlier this year, or increasing subsidies for clean energy, the association says.
Fossil fuels received $72 billion in total federal subsidies from 2002 to 2008, keeping prices artificially low, according to figures from the Environmental Law Institute (ELI). About 98 percent of that went to conventional energy sources, namely coal and oil, leading to more emissions. The rest, $2.3 billion, was pumped into a new technology to trap and store carbon dioxide spewed by coal plants.
During that same period, solar got less than $1 billion, according to the SEIA, a trade group representing 1,100 solar companies across the nation.
To compete and gain market share — and stop global warming — this inconsistency "must reverse itself immediately," said Rhone Resch, SEIA president and CEO.
There had been hints of this happening.
In September, the G20 group of the largest 20 economies agreed to phase out the $300 billion spent worldwide in fossil fuel subsidies "over the medium term" to combat climate change.
But neither the Obama administration nor Congress has yet taken steps to comply with the G20 commitment.
For solar to have a shot, the world cannot wait, Resch told reporters at the Copenhagen climate talks this month.
"We either remove subsidies with oil and gas or create parity with solar," he said.
Almost a million jobs could hang in the balance.
Currently, solar contributes less than 1 percent of energy used in the U.S. and employs some 60,000 people.
Increasing that amount to 15 percent would result in a total of 882,000 new jobs, the association said.
That's compared with a dwindling coal mining industry that employs 85,000 people, said Resch.
The solar ramp-up would also fight climate change. A 15 percent scenario would slash America's energy-related emissions by an estimated 10 percent, curbing national carbon dioxide output by 1.4 gigatons (1 gigaton equals 1 billion tons).
To get there, however, rooftop solar photovoltaic systems would need to grow massively — from today's 1,500 MW to 350,000 MW by 2020.
Concentrating solar power, which generates electricity by focusing sunlight on giant mirrors on desert land, would have to leap to 50,000 MW, up from just 424 MW today.
It "won't happen naturally," Resch said.
Domestic policy provisions that favor renewable energy sources are needed now, the solar industry argues. Many of these would not cost the government "a penny," said Resch.
In fact, getting to 15 percent solar would require a relatively small government investment of between $2 billion and $3 billion in total, he said.
But, he added,
"The government will have to change the way things have been done."
The policies proposed by SEIA are contained in the association's "Solar Bill of Rights."
They include:
the right to connect to a grid with uniform standards; the right to new transmission lines to connect solar resources in the Southwest to population centers; and the right to equal access to public land.
The last one is vital for utility-scale solar power. The oil industry currently leases over 45 million acres of federal land, much of it on sun-blessed stretches of Southwestern earth.
The solar industry has access to "zero" of that, said Resch.
Also vital is global warming legislation that creates a long-term price on carbon and a federal "renewable portfolio standard" that would ensure a chunk of the nation's electricity gets produced by green power.
The industry hopes momentum from the utilities and the states will trickle up to the federal government.
In 2009, solar accounted for 13 percent of all new utility announcements and filings, according to figures from the Edison Electric Institute.
"There are orders right now for solar in excess of 10 GW from utilities," Resch said.
Assuming the solar industry returns to its pre-recession growth rate of 50 percent each year, electricity from the sun will be the lowest cost option in almost every state by 2018, the association said.
Copenhagen Plea
When SEIA presented the 15 percent accelerated deployment scenario at the Copenhagen talks this month, the U.S. trade group wasn't alone.
Over 40 solar associations from around the world banded together to release a report summarizing surveys of the leading solar nations.
The main point was this:
If the EU industry makes good on its pledge to get 12 percent of its electricity from solar by 2020, and if the U.S. can hit 15 percent in the same time frame, 6.3 million new jobs would flow.
On top of that, China and India have each pledged promising near-term solar booms.
"Our message was clear," said Resch, "We are ready now to help solve the climate crisis."
Before the talks, solar representatives sent the UN secretary-general a letter, urging him to keep in mind that solar energy "offers a concrete way forward" in negotiations on how to curb and adapt to global warming.
In the end, it didn't help.
The Copenhagen Accord that emerged produced no binding commitments to slow climate change, and no hard signals to stimulate clean-tech investment.
But it appears the summit was not for naught for Big Solar.
"This is the first time in the history of climate negotiations that the global solar industry has gathered together with one voice," said Resch.
It's also the first UN climate convention where the renewable energy industries outweighed the fossil fuel industries in "both in numbers and in influence," he added.
The key in the short term, Resch said, is not legally binding and verifiable carbon reductions but action in the biggest economies.
"If agreement has to wait until Mexico City or South Africa, fine, but we can no longer wait to start building the solar industry and making sure we have uniform policies around the world," Resch said.
Plan to Turn Africa into the Saudi Arabia of Solar Gains Traction
As technology improves and the price of solar plummets, a high-profile plan to power all of Europe from the Sahara sun, called EUMENA-DESERTEC, is quickly becoming more realistic.
That was the finding at the three-day Copenhagen Congress, where Anthony Patt of the International Institute for Applied Systems Analysis informed scientists that the cost of concentrated solar power (CSP) for North Africa is getting on par with alternative technologies. On top of that:
"The cost of moving [electricity] long distances has really come down."
What kind of total investment are we talking about?
About $70 billion.
That's over 10 years, to be shared among 30 countries or more – not a lot of money, particularly by bailout standards.
For context, the nine leading economies in Europe spent $3.36 trillion – nearly 50 times that amount – to shore up precarious banks, according to a study by Independent Strategy of London.
For even more perspective, recall the cost of America's AIG rescue:
$180 billion so far.
With $70 billion, Patt says, governments could prove the worth of the DESERTEC concept and stimulate private investment that would drive it to completion.
And they wouldn't have to wait long to reap benefits.
Researchers claim if construction begins in 2010, the deserts of the Middle East and North Africa (MENA) could supply Europe's urban consumption markets with 55,000 gigawatt hours of electricity in 10 years' time – enough to meet the needs of 35 million people.
By 2050, they could power most of Europe and two-thirds of their own countries, all by using just a fraction of unused Sahara land.
Add a string of wind farms along the North African coast, Patt says, and all of Europe's power needs could be met.
EUMENA-DESERTEC, six years in the works, was developed by the Trans-Mediterranean Renewable Energy Corporation (now known as the DESERTEC Foundation), the brain child of the Club of Rome, with support from the German Aerospace Bureau and other influential groups.
The technology behind it, concentrated solar power, is critical to the solar sector's assault on the world of fossil fuels. The good news is that it's commercially available and infinitely scalable.
In Africa, it would look like this:
A giant Saharan network of solar mirrors would concentrate sunlight onto receiver tubes that contain water, producing extreme temperatures. The heat would boil the fluid, driving the traditional steam turbines that would churn out exportable clean power.
The Sahara receives some of the most intense solar radiation in the world.
That is clearly a huge plus. The distance to Europe is not.
To blunt the challenge of undersea transmission, DESERTEC has proposed an HVDC (High-Voltage Direct Current) "Euro-supergrid," designed to integrate with existing and less efficient HVAC transmission lines. Under the scheme, electricity could be transmitted from North Africa to the UK with relatively minimal line losses of 10 percent or less.
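Combined with the 55,000 gigawatt-hour export figure cited earlier in the piece, that loss rate gives a rough sense of what would actually reach consumers. A back-of-the-envelope sketch, assuming the full 10 percent worst-case loss:

```python
# Rough check: HVDC line losses applied to the quoted export figure.
sent_gwh = 55_000        # GWh/yr from MENA deserts, per the researchers
loss_fraction = 0.10     # "10 percent or less", taking the worst case

delivered_gwh = sent_gwh * (1 - loss_fraction)
print(delivered_gwh)     # 49500.0 GWh delivered to European consumers
```

Even at the pessimistic end of the quoted range, roughly nine-tenths of the desert-generated power would survive the journey.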
Pilot CSP systems are now planned for Egypt, Morocco, Algeria and Dubai, Patt says. Libya and Tunisia could also be considered.
His findings were the first of a major research effort.
Full conclusions will be presented to governments later this year, although the gist of the research is coming clear:
Desert solar is on the cusp of getting a lot cheaper, especially in the Middle East and North Africa, but not exclusively.
The DESERTEC concept, though envisioned for Africa, is applicable to other regions of the world, particularly the Southwestern United States. The region ranks with North Africa and interior Australia as one of the world's three best areas for vast solar farms.
That explains the creation of DESERTEC-USA. The organization was conceived to market the desert solar idea in American circles. Dive into the web site.
The case it makes is compelling.
The arid Southwest has the nation's strongest direct normal radiation resources. These are so powerful that experts estimate a solar mirror field roughly 100 miles on a side in Nevada could theoretically provide ALL of America's electricity.
The MENA nations are on the verge of becoming the Saudi Arabia of solar. And the reason is simple.
For governments and investors, the billions it would take to build vast desert solar farms and the accompanying transmission infrastructure are starting to look positively modest, compared to the future benefits of the plan.
Why not the United States?
Deforestation Deal, Copenhagen's Supposed Savior, Hits New Low as Targets Dropped
UN climate talks on ending deforestation hit a new low on Saturday after a leaked document revealed that immediate targets to halt forest loss had been cut out of a draft agreement.
Poorer forested countries had been willing to accept deforestation targets, but only with financial assistance.
They wanted rich countries to commit to providing billions of dollars for the effort before they agreed to bind themselves to any goals.
Currently, there are no dollar commitments on the table.
According to UN estimates, $22.4 billion to $37.3 billion in immediate funding would be needed between 2010 and 2015.
"It's hardly surprising that developing countries won't commit to global targets for deforestation when rich countries haven't yet provided the necessary financing for REDD or global targets for deep reductions of industrial emissions," said Nathaniel Dyer of Rainforest Foundation UK.
A program for Reducing Emissions from Deforestation and Degradation (REDD) would, ideally, reward developing countries with multi-billion dollar payouts in exchange for forest preservation.
Hard targets are seen as necessary to ensure the scheme delivers more than hot air.
"Without targets, REDD becomes toothless," said Peg Putt of the Wilderness Society.
Earlier versions of the text would have required nations to halve deforestation by 2020 and end it entirely by 2030.
As of late Saturday, that quantifiable mid-term target had been abandoned.
The long-term goal was placed between brackets.
The first paragraph of the leaked text now reads:
"... all Parties should collectively aim to reduce emissions by halting and reversing forest cover loss in developing countries [by 2030] compared to current levels."
In UN speak, brackets are an ominous sign.
They mean the language is still up for debate and could be whacked from the document at any time.
Advocates further expressed frustration over newly diluted safeguards. For example, a sentence that helps to guard natural forests from being razed for palm oil plantations was cut from the main text and pasted into the preamble.
The problem with preambular language is that it's "aspirational" rather than directive, Bill Barclay of Rainforest Action Network told SolveClimate.
That means the safeguard would lack legal force.
"Limiting safeguards to the preamble weakens the agreement and deprives it of any assurance of compliance," said Rosalind Reeve of Global Witness.
Deforestation generates some 20 percent of global greenhouse gas emissions. An agreement to protect the world's forests was expected to be an easy win in Copenhagen.
Observers say success now hinges on meetings next week between high-level ministers and heads of state.
"Ministers must act to strengthen the REDD text next week if we have any hope of a REDD that will be effective in protecting tropical forests," Barclay said.
Overall Ambition Lacking
With week one of negotiations under their belt, key leaders in Copenhagen expressed dissatisfaction on Saturday with the sluggish pace of the climate talks overall.
Connie Hedegaard, the president of the UN climate conference, said:
"On the core discussions in negotiations, we still need more on finance, we still need more on commitments. There are still many unresolved issues."
Emissions-reduction targets of 11 to 18 percent below 1990 levels by 2020 are on the table now, at a time when the science is urging a minimum cut of between 25 and 40 percent.
UN climate chief Yvo de Boer said it will be up to the almost 110 heads of state arriving in Copenhagen next week to raise this lagging level of ambition.
These leaders are needed to get "stronger commitments from industrialized countries; to see significant engagement from developing nations; and to [secure] finance that will make developing country engagement possible," he said.
Sweden's environment minister, Anders Carlgren, speaking for the European Union, told reporters, "We haven't achieved enough."
"If we were to continue at this pace, we wouldn't manage what it is to be achieved next week," he warned.
Anticipation mounted this week that the EU would unilaterally commit to upping its emissions-reduction target from the current 20 percent from 1990 levels by 2020 to 30 percent.
That didn't happen.
"We want to go to 30 percent reduction," said Carlgren, but not without more ambitious targets from the U.S. and China.
"We can't sell out our 30 percent target as a cheap offer," he said.
"We have to make sure that we use that lever to put sufficient pressure on other parties to deliver what is needed to reach the 2 degrees target."
The decision could be made "literally in the last hours" of Copenhagen, Carlgren said.
When asked what specifically it would take for the EU to agree to the 10 percent leap, he suggested a commitment from the U.S. on long-term financing.
Carlgren also said that Beijing would need to boost its national carbon intensity target of 40 to 45 percent below 2005 levels by 2020, and then enshrine it as an international commitment, although he did not give specific figures.
De Boer said that "all" countries will need to do more.
"Many of the countries that are here to address climate change are operating in exposed international markets," he said.
"That means, the more they raise collective ambition," the fewer "competitive distortions."
Organic Farming May Help Meet Climate Goals
Freshly picked, organically grown potatoes sit in a basket on an allotment in the village of Lane End, Buckinghamshire, southern England, June 23, 2007.
LONDON - The conversion of all UK farmland to organic farming would achieve the equivalent carbon savings to taking nearly one million cars off the road, the Soil Association said on Thursday.
Britain's largest organic certification body, issuing results of a research project, said on average organic farming produces 28 percent higher levels of soil carbon compared with non-organic farming in northern Europe.
"The widespread adoption of organic farming practices in the UK would offset 23 percent of UK agricultural emissions through soil carbon sequestration alone, more than doubling the UK government's pathetically low target of a 6-11 percent reduction by 2020," the Soil Association said.
"A worldwide switch to organic farming could offset 11 percent of all global greenhouse gas emissions," the organic group added.
Professor of Soils and Global Change at Aberdeen University Pete Smith said organic farming had many practices which increased soil carbon.
He said the main challenge, however, was whether a switch to organic farming would maintain the productivity of the land, adding it would be fairer to compare farming methods on a "per unit of product" basis.
"If you accept there could be lower production, you may need to spread agriculture to other areas of land," he said.
"Any benefit on carbon you get could be more than wiped out by plowing up land elsewhere.
The difference between organic and conventional is not so striking when you look at it on a per unit of product basis," he said.
Invert your thinking:
Squeezing more power out of your solar panels
Editor's Note:
Scientific American's George Musser will be chronicling his experiences installing solar panels and taking other steps to save energy in 60-Second Solar. Read his introduction here and see all posts here.
When people talk about improving the efficiency of solar energy production, they usually talk about the panels themselves. What fraction of sunlight do they convert into electricity?
Most solar cells today are made of crystalline silicon, but could cleverer designs or advanced materials such as thin films, organic polymers, layered semiconductors, and phosphorescent dyes do better?
Probably, but that's only half the story.
The auxiliary equipment that connects the panels to your household wiring or the electrical grid is just as important.
A Lawrence Berkeley Labs study I cited in an earlier post found that solar has gotten cheaper over the past decade largely because of better auxiliary equipment rather than better panels. To find out what further steps engineers can take, I talked to Guy Sella, the co-founder and CEO of SolarEdge, an American-Israeli manufacturer of such equipment.
The need for this equipment arises from how a solar photovoltaic cell works. Light shining on the cell knocks electrons off the silicon atoms, and an electrical voltage built into the semiconductor material pulls the electrons in one direction, creating an electrical current.
What happens then depends on what you connect to the cell.
If you don't connect anything and just leave the wires dangling, the current has nowhere to go, electrons pile up on one side of the cell, and the voltage across the cell increases until it reaches the built-in voltage -- typically 0.6 volts for silicon.
The BP SX3400b panels that are going up on my house each consist of 50 cells connected in electrical series, for about 30 volts if you don't connect an electrical load.
Twelve of these panels are strung together for a total of about 360 volts.
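The voltages quoted above follow from simple series addition: each cell's built-in 0.6 V stacks with its neighbors'. A minimal sketch of that arithmetic, using the post's own figures:

```python
# Open-circuit voltage of a series string: cell voltages simply add.
V_CELL = 0.6             # built-in voltage of a silicon cell, volts
CELLS_PER_PANEL = 50     # cells wired in series inside each panel
PANELS_PER_STRING = 12   # panels wired in series into one string

v_panel = V_CELL * CELLS_PER_PANEL        # 30.0 V per panel
v_string = v_panel * PANELS_PER_STRING    # 360.0 V per string
print(v_panel, v_string)                  # prints: 30.0 360.0
```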
When you attach a load and start to draw power from the cell, the voltage drops -- gradually at first, then precipitously as the electrons flow out too quickly for a voltage to develop across the cell.
This behavior is captured in a graph known as the current-voltage, or I-V, curve.
When the voltage reaches zero, the cell delivers its maximum current -- which is about 9 amps for my BP panel in full-on sunlight and less when it's twilight or overcast.
Because the cells in a panel and the panels in a string are wired in series, the amperage of one determines the amperage of all.
If you need more current, you have to wire strings of panels in parallel.
My solar array consists of two 12-panel strings, doubling the current.
Because power equals volts times amps, a panel doesn't do a whole lot of good if it generates 30 volts at 0 amps or 9 amps at 0 volts. In between these extremes, it produces useful power, and there's a sweet spot in the middle where the power is maximized -- for my panels, 8.16 amps at 24.5 volts, giving 200 watts of power. If you hit this sweet spot and point this panel straight at the sun, it will convert 16 percent of the incoming solar energy to electricity.
When most people talk about efficiency, this is the number they're referring to, but it presumes you've hit the sweet spot, and that's easier said than done.
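The sweet spot is easy to see on a toy I-V curve. This sketch uses an assumed exponential shape (the parameter N is illustrative, not from BP's datasheet), anchored only to the 9 A short-circuit current and 30 V open-circuit voltage quoted above, and scans for the point of maximum power:

```python
import math

I_SC = 9.0    # short-circuit current, amps (full sun)
V_OC = 30.0   # open-circuit voltage, volts
N = 2.0       # toy "knee sharpness" parameter: an assumption for illustration

def current(v):
    """Toy exponential I-V curve: I_SC at 0 V, falling to zero at V_OC."""
    return I_SC * (1 - math.expm1(v / N) / math.expm1(V_OC / N))

# Scan the curve in 0.01 V steps for the maximum-power point (P = V * I).
v_mp = max((v / 100 for v in range(3001)), key=lambda v: v * current(v))
print(f"sweet spot: {current(v_mp):.2f} A at {v_mp:.1f} V "
      f"= {v_mp * current(v_mp):.0f} W")
```

The real panel's quoted sweet spot behaves the same way: 8.16 A × 24.5 V ≈ 200 W, which is where its 200 W rating comes from.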
The job of optimizing the electrical performance of the panels typically falls to a piece of equipment called the inverter. Its main function is to convert the direct current produced by the cell into the alternating current used by the electrical grid -- a process known as "inversion" because it reverses the more common function of converting AC to DC (as battery chargers, for example, do). But a modern inverter does more than invert.
It also adjusts how much current it draws in order to maximize the panels' power output.
As Sella explained, it's tricky for many reasons:
* Electrical mismatches. Because of the vagaries of manufacturing, different panels have slightly different I-V curves. The inverter responds only to the average I-V curve.
Consequently, it draws too little current for some panels and too much for others, reducing their power output by several percent.
* Partial shading.
If the shadow of a tree branch or another solar panel falls on the panel (as in the above photo) and diminishes the sunlight hitting it by, say, a percent, you might innocently think it would diminish the power output by a percent.
Actually, even a small shadow can completely zero out the power. Because the cells are wired in series, knocking out one can knock out all, just as a single blown Christmas tree bulb can black out a whole string of bulbs. Even when uneven illumination doesn't choke off all the power, it worsens the electrical mismatches. In a typical setup, Sella said the power output declines as much as 25 percent.
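The weakest-link effect described above can be seen in miniature. The per-panel currents here are made up for illustration, and real panels soften exactly this failure mode with bypass diodes, which the post doesn't get into:

```python
# In a series string every panel must carry the same current,
# so the most-shaded panel throttles the entire string.
# These per-panel currents are illustrative, not measured values.
currents = [8.2] * 11 + [2.0]     # 12-panel string, one panel heavily shaded

string_current = min(currents)    # the weakest panel sets the string current
fraction = string_current / max(currents)
print(f"string current: {string_current} A "
      f"({fraction:.0%} of the unshaded value)")   # prints 24%
```

One shaded panel out of twelve, and the whole string delivers about a quarter of its potential current.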
* Temperature fluctuations. As the temperature increases, electrons flow through the semiconductor material of a solar cell more readily and the built-in voltage decreases. For my BP panels, the peak voltage drops by about 0.1 volt per degree Celsius. The trouble is that the inverter can handle only a limited range of voltages -- my SMA America SB4000US unit works from 220 to 480 volts. During extreme temperature swings, the voltage will fall outside this range and the energy will be lost.
Depending on your climate, up to 15 percent of your annual energy production goes to waste.
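A back-of-the-envelope check of that voltage window, using the figures quoted (-0.1 V per degree Celsius per panel, a 12-panel string, a 220-480 V inverter range). The 25 °C reference point and the sample cell temperatures are assumptions for illustration; note that cell temperatures run well above air temperature in strong sun:

```python
V_REF = 12 * 24.5        # ~294 V string voltage at the sweet spot (25 C assumed)
T_REF = 25.0             # reference cell temperature, C (assumed)
COEFF = -0.1 * 12        # volts per degree C for the whole 12-panel string
V_MIN, V_MAX = 220.0, 480.0   # inverter operating window

for temp in (-20, 25, 70, 90):           # sample cell temperatures, C
    v = V_REF + COEFF * (temp - T_REF)
    status = "ok" if V_MIN <= v <= V_MAX else "OUT OF WINDOW"
    print(f"{temp:+4d} C: {v:6.1f} V  {status}")
```

In this sketch the string stays inside the window over everyday conditions, but a sufficiently hot cell temperature pushes it below the inverter's 220 V floor, and that energy is lost.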
* Inability to optimize.
Because of the above problems, the overall array I-V curve might have multiple sweet spots, some sweeter than others. The inverter will lock onto one, even if a better choice lies elsewhere.
And whenever the sun's brightness changes because of cloud cover or the time of day, the inverter needs to find the new optimum.
In fickle weather, it may not be able to keep pace.
Between these two problems, you give up 10 percent or so of the panels' potential output.
* Incomplete use of available space.
Even if I had room on my roof for a 25th panel, I couldn't install it.
It would mean that one string would have 13 panels and the other 12, yet the strings must be of equal length. I couldn't subdivide my array into five strings of five panels each, since the length of the strings is dictated by the voltage that the inverter can handle.
Because of the need to keep the number of panels numerically balanced, Sella said the typical commercial solar installation can utilize only about three-quarters of its roof.
* Damage or theft.
If a panel breaks or gets stolen (it happens), the whole array can fail.
What's worse, you can't just replace the lost panel with the latest model; you have to use the exact same model as the original, or else you'll create an electrical mismatch.
Thus a photovoltaic system installed in 2009 is locked into 2009 technology for its 25-year lifetime.
To get around these problems, Sella said that SolarEdge has developed a small box that you can attach to each panel (see photo at top). This box optimizes the electrical performance of each panel individually.
He said the company has tested its technology on 17 houses in the U.S., Europe, Israel, and Japan and found it wrung another 10 to 20 percent of power out of the arrays at no extra cost.
In fact, by simplifying the wiring or allowing more flexible use of roof space, the SolarEdge box can cut the installation cost.
Sella said it will come out in October.
Over the past 30 years, solar power has gone from 40 times as expensive as fossil fuels to just a few times. At the rate the technology continues to improve, it won't be long before it's competitive even without government subsidies.
SolarEdge's PowerBox units on the backside of solar panels in Germany (first image). Partial shading of rows of solar panels in Spain (second image). Courtesy of SolarEdge.
World's First Fuel Cell Ship Docks in Copenhagen
Pleasure yachts and tall ships line the wharves and quays of Nyhavn here in the Danish capital.
Shipping in Denmark goes back to the Vikings and their long ships that made perilous sea crossings even beyond Greenland.
Now what may be the future of shipping is docked around the corner from Nyhavn at Kvaesthusmolen pier, a bright orange and yellow North Sea supply ship from Norway dubbed "Viking Lady"—the first ship to employ a fuel cell in history.
As a result of flourishing world trade, shipping is now responsible for roughly three percent of global emissions of greenhouse gases, or more than one billion metric tons of carbon dioxide every year, along with smog-forming nitrogen oxides, acid-rain causing sulfur dioxides and soot.
In fact, emissions of nitrogen oxides from one ship burning diesel in a year are greater than those from 22,000 cars. That's because ships burn bunker fuel or diesel to cleave through the waves but, according to Tor Svensen, CEO of Det Norske Veritas (DNV) Maritime, "it is possible for shipping to reduce emissions, even taking into account growth in world trade."
In fact, ships could reduce CO2 emissions by 500 million metric tons by 2030 while increasing profits, according to an analysis done by DNV. After all, fuel costs for a tanker ship are fully 41 percent of its total operating costs. A tax on CO2 emissions of just $15 would drive cuts of 700 million metric tons, according to Svensen.
Energy savings of as much as 40 percent can be achieved through better hull design, more efficient engines and even the type of paint used on the ship.
"Just by polishing the propeller occasionally, one can do a lot," says Alte Palomaki, a spokesman for ship and turbine-maker Wartsila Corporation.
But in the case of the 5,900 metric ton Viking Lady, Norwegian shipping company Eidesvik and its partners have gone further, installing a 320-kilowatt molten carbonate fuel cell that operates on liquefied natural gas (and can be reconfigured, if necessary, to run on methanol). Storage tanks for the hydrogen and carbon dioxide that gets the fuel cell started press up against the stern of the 92.2 meter-long ship (in case of explosion) as do the machines to regasify the fuel.
The fuel cell operates at 650 degrees Celsius and is warm to the touch, even on a blustery, frigid day in Copenhagen's harbor.
Already, liquefied natural gas is cheaper than diesel—if you can find it.
Engineer and project developer Kjell Sandaker of Eidesvik notes there are as many as 15 such fueling stations along the Norwegian coast, and the bright orange Viking Lady gases up once a week. Its onboard turbines also burn the gas directly to supply electricity to the engines, though they can burn diesel if necessary.
The ship's 220 cubic meter tank can hold roughly 90 metric tons of liquefied natural gas at a time.
"If the ships are ordered, we believe filling stations will also come," DNV's Svensen says. Already, at least one cruise ship that might employ the technology is under construction.
"In the North Sea, when drilling for oil they find gas," Eidesvik's Sandaker says. "By going on gas, we increase fuel efficiency" and decrease emissions.
But the €12 million fuel cell from MTU On Site Energy is just in the testing phase, which will continue until mid-2010, and is not responsible for driving any of the four electric engines or propellers—after nearly a decade of development work.
"It's been two weeks working," Sandaker says. "It's been through its first storm in the North Sea."
The investment was made, in part, to get an understanding of fuel cell technology and how it might be applied to shipping, according to DNV's Viking Lady project head Tomas Heber Tronstad.
Initial estimates are that such fuel cells would cut CO2 emissions from an individual ship by 50 percent.
But the investment was also made because Norway has a tax on nitrogen oxide emissions that paid an immediate return for installing gas rather than diesel engines, says Eidesvik CEO Jan Fredrik Meling.
Compared to a traditional ship, even without using the fuel cell, the Viking Lady reduces nitrogen oxide emissions by 90 percent, CO2 emissions by 20 percent and eliminates sulfur dioxide and soot emissions.
"The technology has existed for years," Meling adds. "Demand must be created."
And old ships can be retrofitted with catalytic converters, like those in cars, to bring down emissions, according to Wartsila's Palomaki.
Ultimately, whether the Viking Lady remains unique in the annals of shipping will depend on the political decisions that come out of the Copenhagen climate conference and in national capitals. "It will take 20 to 30 years for this technology without government support," says DNV's Tronstad.
"If they want to act on climate soon, this is a technology that is available today."
Climate Numerology:
How Much Atmospheric Carbon Dioxide Is Safe?
Last December world leaders met in Copenhagen to add more hot air to the climate debate.
That is because although the impacts humanity would like to avoid—fire, flood and drought, for starters—are pretty clear, the right strategy to halt global warming is not.
Despite decades of effort, scientists do not know what "number"—in terms of temperature or concentrations of greenhouse gases in the atmosphere—constitutes a danger.
When it comes to defining the climate's sensitivity to forcings such as rising atmospheric carbon dioxide levels, "we don't know much more than we did in 1975," says climatologist Stephen Schneider of Stanford University, who first defined the term "climate sensitivity" in the 1970s. "What we know is if you add watts per square meter to the system, it's going to warm up."
Greenhouse gases add those watts by acting as a blanket, trapping the sun's heat.
They have warmed the earth by roughly 0.75 degree Celsius over the past century.
Scientists can measure how much energy greenhouse gases now add (roughly three watts per square meter), but what eludes precise definition is how much other factors play a role—the response of clouds to warming, the cooling role of aerosols, the heat and gas absorbed by oceans, human transformation of the landscape, even the natural variability of solar strength.
"We may have to wait 20 or 30 years before the data set in the 21st century is good enough to pin down sensitivity," says climate modeler Gavin Schmidt of the NASA Goddard Institute for Space Studies.
Despite all these variables, scientists have noted for more than a century that doubling preindustrial concentrations of CO2 in the atmosphere from 280 parts per million (ppm) would likely result in global average temperatures roughly three degrees C warmer.
But how much heating and added CO2 are safe for human civilization remains a judgment call.
European politicians have agreed that global average temperatures should not rise more than two degrees C above preindustrial levels by 2100, which equals a greenhouse gas concentration of roughly 450 ppm.
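The two numbers are linked because warming is roughly logarithmic in CO2 concentration. A minimal sketch using the three-degrees-per-doubling sensitivity and the 280 ppm preindustrial baseline cited above (the logarithmic form itself is the standard approximation, not something stated in the article):

```python
import math

CLIMATE_SENSITIVITY_C = 3.0    # warming per doubling of CO2, per the article
PREINDUSTRIAL_PPM = 280.0

def equilibrium_warming_c(ppm):
    """Standard approximation: delta-T = sensitivity * log2(C / C0)."""
    return CLIMATE_SENSITIVITY_C * math.log2(ppm / PREINDUSTRIAL_PPM)

print(round(equilibrium_warming_c(560), 2))  # 3.0 (one full doubling)
print(round(equilibrium_warming_c(450), 2))  # ~2.05, near the two-degree target
```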
"We're at 387 now, and we're going up at 2 ppm per year," says geochemist Wallace Broecker of Columbia University.
"That means 450 is only 30 years away.
We'd be lucky if we could stop at 550."
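Broecker's 30-year figure is straight linear extrapolation from those two numbers; a minimal sketch of the arithmetic:

```python
# Years until atmospheric CO2 reaches a target concentration, assuming
# the constant 2 ppm/year growth rate Broecker cites (linear, no acceleration).
def years_to_reach(target_ppm, current_ppm=387.0, ppm_per_year=2.0):
    return (target_ppm - current_ppm) / ppm_per_year

print(years_to_reach(450))  # 31.5 -- "only 30 years away"
print(years_to_reach(550))  # 81.5
```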
Goddard's James Hansen argues that atmospheric concentrations must be brought back to 350 ppm or lower—quickly.
"Two degrees Celsius [of warming] is a guaranteed disaster," he says, noting the accelerating impacts that have manifested in recent years. "If you want some of these things to stop changing—for example, the melting of Arctic sea ice—what you would need to do is restore the planet's energy balance."
Other scientists, such as physicist Myles Allen of the University of Oxford, examine the problem from the opposite side:
How much more CO2 can the atmosphere safely hold?
To keep warming below two degrees C, humanity can afford to put one trillion metric tons of CO2 in the atmosphere by 2050, according to Allen and his team—and humans have already emitted half that.
Put another way, only one quarter of remaining known coal, oil and natural gas deposits can be burned.
"To solve the problem, we need to eliminate net emissions of CO2 entirely," Allen says. "Emissions need to fall by 2 to 2.5 percent per year from now on."
Climate scientist Jon Foley of the University of Minnesota, who is part of a team that defined safe limits for 10 planetary systems, including climate, argues for erring on the side of caution.
He observes that "conservation of mass tells us if we only want the bathtub so high either we turn down the faucet a lot or make sure the drain is bigger. An 80 percent reduction [in CO2 by 2050] is about the only path we go down to achieve that kind of stabilization."
Best summary of small wind turbine selection
This is the best analysis I've found if you are seriously considering buying a wind turbine.
This is a re-posting of a blog written by Amy Berry for Green by Design and can be seen here.
Amy can be found on Twitter at @wind2power and helps promote the Windspire Turbine by Mariah Power. I'll comment and link to relevant Mapawatt posts in blue!
Used to be if you wanted to put a wind turbine up at your house you either had to live on a remote farm, or grow your hair long and pledge allegiance to an aging group of touring musicians. Thanks to major improvements in technology and a general awareness of the benefits of making your own energy from clean and free wind, small wind power is going mainstream.
According to the American Wind Energy Association (AWEA), the US small wind market grew by 78% last year with many new turbines hitting the market.
But more options don't make finding the right wind power solution easy.
If you are interested but not sure how to even get started, here are 9 things to know as you consider wind power.
1. Small wind turbines can be broken into two main technologies:
Horizontal Axis Wind Turbines (HAWTs) or Vertical Axis Wind Turbines (VAWTs). HAWTs are propeller based turbines that are traditionally mounted on tall poles and are commonly used in large wind farm settings. HAWTs have blades which rotate vertically around a horizontal axis, similar to a propeller on an airplane. VAWTs include two main classes:
a tall vertical airfoil style (Darrieus), and a solid winged style (Savonius). Darrieus Turbines come in a few varieties. Some have rotors with curved blades that look like an eggbeater and rotate about a vertical axis. Another variation uses straight-sided airfoils and is called a Giromill.
Like propeller turbines, Darrieus turbines utilize some lift to capture wind energy.
Savonius Turbines have rotors with solid vanes or "scoops" which rotate about a vertical axis.
2. There is no precise definition for "small wind" but it usually applies to machines with less than 100 kilowatt (kW) ratings. The "ratings" refer to how much power the turbine can instantaneously generate at a specific wind speed.
There are no standards in the small wind industry, so manufacturers are able to set their ratings at varying wind speeds. It is not uncommon to find one turbine rated at 25mph and another one rated at 48mph.
Obviously the higher wind speed used will result in a higher kW rating, so it's not a completely useful figure to go by.
3. While kW ratings will give you a general sense for the size of a turbine, what really matters is how much energy it will produce over a period of time.
Wind turbine companies provide energy curves that tell how many kilowatt hours (kWhs) you can expect to generate at specific average wind speeds. You can check your monthly electric bills to gain an understanding of how many kWhs you use.
Electricity use varies by season and time of day, so ideally you should add up the kWhs of the last 12 months.
Points 1-3 are summed up (in much less detail) in my short slide-show on the Basics of Wind Power.
4. This should go without saying, but you need wind to create wind power. All turbines have a minimum wind speed at which they will start to generate power; this is known in the wind world as the "cut-in" wind speed.
It is possible for a turbine to spin at speeds below the cut-in speed, but those rotations won't be fast enough to actually create energy.
The majority of small wind turbines require a minimum of 10mph average annual winds to generate significant energy.
Wind power is a cubic function of wind speed.
For all you non-math people out there, this means that a little more wind can create a lot more power. When determining average annual wind speeds, note that a 10mph average annual wind does not mean it blows 10mph all day, every day.
Because of the cubic function, a day of high wind can generate enough power to make up for multiple days of low wind.
For you math geeks, average wind speeds follow the Rayleigh distribution curve (http://en.wikipedia.org/wiki/Rayleigh_distribution).
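The cubic relationship is easy to see numerically. A minimal sketch, in which the rotor area and power coefficient are illustrative assumptions rather than figures for any particular turbine:

```python
# Power available from the wind: P = 0.5 * rho * A * Cp * v^3.
# Rotor area and power coefficient below are illustrative assumptions.
AIR_DENSITY = 1.225  # kg/m^3, roughly sea-level air

def wind_power_watts(speed_m_s, rotor_area_m2=10.0, power_coeff=0.35):
    return 0.5 * AIR_DENSITY * rotor_area_m2 * power_coeff * speed_m_s ** 3

# Doubling the wind speed yields 2^3 = 8 times the power:
print(wind_power_watts(10.0) / wind_power_watts(5.0))  # 8.0
```

This is exactly why one windy day can offset several calm ones: a 20mph wind carries eight times the power of a 10mph wind.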
5. So, how do you know if you have enough wind to make wind power a feasible option?
The best way to know is to install an anemometer where you want to place your turbine.
You can get a very good anemometer for around $500 from www.madgetech.com.
If you don't want to wait a year, you can do shorter anemometer tests, but you need to realize that wind speeds change with the seasons. Not ready to invest $500 in your research?
Check out local weather sites which should provide data on average wind speeds. Local airports are also wonderful resources for this information.
The DOE provides wind maps, but these are measured at 50-meter heights (for use by the big wind guys) and are not always localized enough for small wind installations, which are very site specific.
You can also call a local wind turbine dealer and request a site visit.
Points 4 and 5 are briefly covered in my recent post on the importance of using an Anemometer!
6. What about a site? A "site" is the place on your property where you install your turbine.
Site location is a crucial element, and will have a major impact on which turbine you can consider. Turbines are best placed with enough open space to allow the wind direct access to the rotor. This does not necessarily require a specific lot size or a totally open and clear site.
Many small wind turbines are designed to work in various settings; for instance, HAWTs will work if you can put up a large tower and have consistent wind direction. A VAWT may be a better option if your wind changes direction and you cannot put up a structure taller than 30 feet.
Wind speed can also vary drastically on one piece of property due to structures and topography.
Always choose the site with the most access to wind.
7. You've done your research and know you have a good source of wind.
Now it's time to pick a turbine or turbines. It's very common for people to put up multiple turbines to meet more of their energy needs. Two big factors to initially consider are the expected power output and the cost of the unit fully installed.
Consult the energy curve of each wind turbine to determine how much energy it is likely to create with your average wind speeds over the course of a year. Compare the kWhs at the same wind speeds across wind turbines, while keeping in mind total cost of the unit. A wind turbine that generates 400kWhs for $2,000 is a lot more expensive than a wind turbine that generates 2,000kWhs for $6,500.
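A simple way to make that comparison concrete is installed cost per annual kilowatt-hour, using the hypothetical figures from the example above:

```python
# Installed cost per annual kWh: the number to compare across turbines,
# with both units evaluated at the same average wind speed.
def cost_per_annual_kwh(installed_cost_usd, annual_kwh):
    return installed_cost_usd / annual_kwh

turbine_a = cost_per_annual_kwh(2000, 400)    # $5.00 per annual kWh
turbine_b = cost_per_annual_kwh(6500, 2000)   # $3.25 per annual kWh
print(turbine_a > turbine_b)  # True: the pricier unit is the cheaper energy source
```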
Also, don't be fooled by energy curves that show amazing results at 30mph average winds. It is very unlikely that you live in an area with wind speeds of that level and will ever reach those energy levels.
Excellent point.
You have to compare initial costs vs. the expected power output.
And you MUST compare your expected wind speeds and the power output at those speeds! I'm looking forward to the Honeywell wind turbine for locations that require a small footprint and a smaller power output, and to learning more about the Windspire, which requires more room but has a higher power output.
8. The other big factors to consider are the look of the wind turbine and the sound it creates while operating at moderate wind speeds. Try to visit the wind turbines that you are considering (or at least view on video) so that you can see and hear what they sound like when they operate.
There are many designs on the market, all with varying looks and sounds. Find a local dealer or contact the manufacturer with questions. This is a big purchase decision, so you should feel comfortable working with the company and its local representative.
9. A final note on independent testing.
As I mentioned above, there are no standards for small wind at this time. AWEA is currently putting these together, but it will be at least a year before they are finalized.
Until then, we recommend focusing your search on independently tested wind turbines. Power curves, which turbine companies use to estimate power ratings and energy curves, can be derived from complex calculations, but the truest power curves come from units being independently tested in real-world scenarios. It is very easy for manufacturers to create their own power curves, so look for wind turbines whose data has been verified by an independent test facility.
Bottom Line:
It is extremely important to test your site and make sure you have enough wind.
Once you have the expected average wind speeds, use that value to compare several wind turbines. Also, don't forget about which direction the wind comes from.
If it changes direction constantly, then you might want to consider a VAWT. As Amy says, don't always trust the manufacturer's power curves, because those were probably done under ideal conditions in a wind tunnel.
Do your research!
If you have any questions, don't hesitate to ask!
Global Heating:
Why We Must Shift to Carbon-Free Fuel:
The Doers (Part II)
One of the best hands-on examples of far-sighted thinking is perhaps provided by the folks who are frequently denigrated as dragging their heels when it comes to innovations -- the car-makers. With billions of dollars in future sales and their very survival at stake, almost all the major ones believe hydrogen is the long-term key to reducing the global warming threat by avoiding putting CO2 into the atmosphere in the first place.
Manufacturers such as GM, BMW, Daimler, Chrysler, Honda and Toyota are starting to place fleets of around 100 hydrogen cars each -- both fuel cell and with internal-combustion engines - into the hands of ordinary drivers in the U.S., Europe and Asia (including China) to gain operational data, market experience -- and as public relations/public education exercises. Honda executives made what to date are the most emphatically positive statements during June launch ceremonies of the world's first dedicated fuel cell car plant in Japan:
"Basically, we can mass produce these now," Kazuaki Umezu, head of Honda's New Model Center, was quoted in the June 17 New York Times as saying.
"We're waiting for the infrastructure to catch up."
Added his boss, Honda president Takeo Fukui, "this is a must-have technology for the future of the earth [...] Honda will work hard to mainstream fuel cell cars."
Still, there's a way to go. En route, the manufacturers will have to pass through a financial "Valley of Death," as GM's fuel-cell chief, Byron McCormick, put it to a DOE advisory committee in January 2007.
The current batch of vehicles cost around $1 million each.
But that's already history, and costs are coming down.
Carmakers plan to produce around 500 vehicles in the 2010 to 2012 "pilot commercialization phase" at about $250,000 each, McCormick said.
The "early commercialization phase" starting around 2013 is expected to see each manufacturer produce perhaps 10,000 vehicles costing around $50,000 in the first year, and then dropping to, hopefully, much lower numbers by 2015.
Fuel and fuel-cell costs, and a lack of fueling infrastructure, are still problematical but solvable with strong political will and close industry-government cooperation.
Efforts to figure out these and other problems -- onboard hydrogen storage, for instance -- still need a lot of work.
One survey last year found that more than 160 hydrogen stations were likely to be up and running by the end of that year worldwide. A German web database, HyWeb, says the total number of stations existing, planned, or already shut down is now about 300. GM's research vice president Larry Burns has said the U.S. would need about 12,000 fueling stations to meet 70% of the country's hydrogen fueling needs -- a fairly small number considering there are an estimated 170,000 regular gas stations in the U.S.
As to fuel costs, GM said a couple of years ago that even at $5 per kilogram of hydrogen (a kg of hydrogen has about the same energy content as a gallon of gas), fuel cell cars potentially could provide transport at about 10 cents/mile, assuming the fuel cell has about 2.5 times the efficiency of a gasoline engine with comparable power. And fuel cell production costs, computed on the basis of producing 500,000 units annually, are now estimated by DOE to be about $94/kW -- a lot less than the baseline $275 estimate of 2002, but still far above the 2015 target of $30 -- very roughly the ballpark cost of today's internal-combustion engines.
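GM's roughly 10-cents-a-mile figure can be reproduced with a quick back-of-the-envelope calculation. A sketch in which the 25 mpg gasoline baseline is an illustrative assumption, not a number from the article:

```python
# Fuel cost per mile for a fuel cell car, following GM's stated figures:
# $5/kg H2, 1 kg H2 ~ the energy of 1 gallon of gasoline, and a fuel cell
# with ~2.5x the efficiency of a comparable gasoline engine.
H2_PRICE_USD_PER_KG = 5.00
BASELINE_GASOLINE_MPG = 25.0     # assumed comparison car (illustrative)
FUEL_CELL_EFFICIENCY_FACTOR = 2.5

miles_per_kg = BASELINE_GASOLINE_MPG * FUEL_CELL_EFFICIENCY_FACTOR  # 62.5 mi/kg
cost_per_mile_usd = H2_PRICE_USD_PER_KG / miles_per_kg
print(round(cost_per_mile_usd, 2))  # 0.08 -- in the "about 10 cents/mile" ballpark
```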
The Revolution Has Started
The good news is that the revolution has started, with budding, albeit still very expensive, examples sprouting Johnny Appleseed-style all over the globe. A few random examples:
-- The next three Olympic Games in Beijing this year, Vancouver in 2010, and London in 2012 will feature hydrogen-powered vehicles, including buses and probably some VIP cars, with both fuel cell and internal combustion engines, to help force hydrogen into the transport system;
-- A Caterpillar diesel mine loader is being converted to fuel cell power by Vehicle Projects, of Denver, CO. Vehicle Projects is also converting a 123-ton diesel-electric railyard switching locomotive to fuel cells;
-- A Caterpillar farm tractor has been modified by a student team at the University of North Dakota to run partially on hydrogen.
Adding hydrogen to the fuel stream cleans up the diesel fuel, reduces emissions and improves power and torque;
-- Japan's Railway Technical Research Institute is testing a 33-ton suburban railway train powered by a 125 kW Nuvera PEM fuel cell;
-- A fuel cell-powered tricycle, called a Cargobike, is being tried out by the German phone company Telekom as a service vehicle for its technicians in Berlin.
It can carry up to about 300 pounds of tools and equipment up to 156 miles on about 90 grams of hydrogen as fuel for its small 250 W PEM fuel cell.
Best of all, you can ride it on bicycle paths around the city, bypassing any traffic jams -- and you don't need a driver's license.
-- A solar hydrogen generator that may be truly revolutionary has been developed by a small Massachusetts startup company, Nanoptek.
Essentially the device splits water into hydrogen and oxygen with the help of titanium oxide, a phenomenon first discovered by a couple of Japanese scientists in the 1970s. But normal titanium oxide can use only the ultraviolet part of the solar spectrum for that, and that isn't very efficient.
Nanoptek has now applied advanced nanotechnology to "strain" the titanium oxide, with the result that it now also accepts visible light to break up water molecules, making the whole process much more efficient and economical.
Nanoptek hasn't divulged any details such as efficiencies and solar conversion rates, but it expects its technology will eventually produce hydrogen fuel that will be cost-competitive with gasoline.
-- An automated high-volume fuel cell assembly line -- the world's first -- has been developed by Altergy Systems, a fuel cell developer in Folsom, California, to produce small hydrogen fuel cells for telecommunication and utility companies and governments.
Altergy says automation and volume cut the cost of the units roughly in half;
-- In New Jersey, inventor and long-time hydrogen developer Mike Strizki is now showing visitors his house in Hopewell converted to grid-independent solar hydrogen operation (including hydrogen for his fuel cell car), the subject of a long article in the New York Times magazine in May 2007.
It's pricey:
the conversion cost about $500,000, but his next project in the Cayman Islands is expected to cost less than half of that.
His third client appears to be actor Johnny Depp who wants to convert his Caribbean island home to Strizki's system, and Strizki says several others are in the pipeline.
Other solar hydrogen houses have been built or converted on Long Island; near Wiscasset, ME; on Stuart Island, WA; in Indonesia; and, decades ago, in Freiburg, Germany, and Switzerland;
-- In Iceland, the New Frontier of the coming global hydrogen economy, Hertz is already offering hydrogen-fueled converted Priuses as rental cars for tourists. Both Iceland and Norway have ordered dozens of them from California's Quantum Fuel Systems Technologies Worldwide for fleet operations and testing, along with hydrogen cars -- both fuel-cell and internal-combustion engines - from Daimler and Mazda;
-- One truly exciting idea with a whiff of "disruptive technology" is the concept of a high-speed liquid hydrogen-powered cargo catamaran freighter presented at the 2006 World Hydrogen Energy Conference in Lyons, France by a young Dutch naval engineer, Ivo Veldhuis. His "H2Oceanjet" would be 175 meters (575 feet) long and would be powered by four huge liquid hydrogen-fueled turbines driving equally huge waterjets at speeds of up to 64 knots (73 miles/hour) -- more than twice the speed of current container ships. It would cover the 4,838 nautical miles (5,567 statute miles) from Yokohama to Long Beach, CA in about 76 hours.
-- Hydrogen and fuel cells are even taking to the air:
In Europe, two teams, one in Turin, Italy, sponsored by the European Union, and another organized by Boeing and headquartered in Madrid, Spain, are converting a very light Czech two-seater and an Austrian motorglider, respectively, to hybrid fuel cell/battery electric engines, with the fuel cells for both coming from a British company, Intelligent Energy. (Russian airplane manufacturer Tupolev flew one of its big three-engined TU-154 jetliners partially on liquid hydrogen in the late 1980s, and both Lockheed and Airbus made plans, never realized, for experimental liquid hydrogen-fueled passenger and transport jets.) Looking to the far future, the European Union is currently helping fund a European Space Agency study of a liquid hydrogen-fueled hypersonic transport plane quaintly named LAPCAT (for Long-Term Advanced Propulsion Concepts and Technologies) that could whisk 300 passengers from Brussels to Sydney, Australia, in about 2-4 hours;
-- And finally, a Motorola cell phone prototype powered with a tiny hydrogen fuel cell developed by a Canadian startup, Angstrom Power, Inc., was shown in January at the Consumer Electronics Show in Las Vegas. Angstrom says the fuel cell should roughly double the phone's run time, and refueling with hydrogen should take about 10 minutes.
The revolution has started, but it needs a much bigger push, and more convincing of many people, to really succeed.
Hydrogen isn't the only weapon in the fight against global heating, but it's an essential element.
USA: Long-term policies needed
With its enormous tracts of land and rich wind resources, the United States has outstanding potential to generate substantial amounts of power from wind.
According to a report recently published by the Department of Energy, the United States could realistically generate 20% of its electricity from wind by 2030—and some analysts believe that a higher penetration is possible.
This scenario will only be realized, however, if the U.S. enacts long-term policies to promote renewable energy at both the federal and state level.
Status at federal level
The Production Tax Credit is of vital importance to the wind industry.
In recent years, Congress has extended the PTC only one or two years at a time.
In the past, when Congress failed to pass the PTC before it expired, a boom-and-bust cycle resulted.
The PTC next expires in December 2008, and to avoid a momentum-killing bust, the industry needs an immediate extension.
A long-term extension will provide the industry with the stability it needs to continue recent growth.
A national renewable portfolio standard which requires utilities to meet a certain percentage of their electricity sales from renewables would also help drive the market.
The long-term predictability of a national RPS will enable the industry to attract investment capital and achieve manufacturing economies of scale that will lower costs.
Investment follows policy:
the states that have already adopted an RPS are experiencing growing investment from the wind industry.
Vestas applauds this commitment to renewable energy.
Initiatives at state level
Twenty-six states have already adopted an RPS in order to stimulate the market for renewable energy.
Texas and California, which have both adopted Renewable Portfolio Standards, have the most wind power installed, with 3,352 MW and 2,376 MW respectively.
A number of states across the U.S. are also implementing programmes focusing specifically on the climate issue.
The Regional Greenhouse Gas Initiative, for example, is an effort by Northeastern and Mid-Atlantic states to reduce CO2 emissions by implementing a multi-state cap and trade programme, i.e. a market-based emission trading system between these states.
U.S. transmission system in urgent need of modernization
In addition to adopting long-term incentives for renewables, the U.S. transmission system is in urgent need of modernization and expansion.
Authorities at federal and state levels must make long-term improvements to the power grid, reforming the way the nation plans and pays for new transmission assets, and the way those assets are managed.
"If we are to increase the pace at which wind power is expanding," says Vestas' CEO Ditlev Engel, "politicians must back their visions with action to improve the grid—and do so quickly."
Challenges ahead, but grounds for optimism
Despite the challenges that lie ahead, there are grounds for optimism.
With energy security and climate change placed high on the U.S. political agenda, there is more focus on providing a stable regulatory framework for the renewable industry than ever before.
The American people are calling for actions on critical energy issues.
Wind energy is poised to be a big part of the solution.
Climate, Oil, Reality and Delusion
Against a greater welter and flow of incoherence jerking the nation this way and that way en route to collapse comes "ClimateGate," the latest excuse for screaming knuckleheads to defend what has already been lost.
It is also yet another distraction from the emergency agenda that the United States faces – namely the urgent re-scaling, re-localizing, and de-globalizing of our daily activities.
What seems to be at stake for the knuckleheads is their identity, their idea of what it means to be an American, which boils down to being an organism so specially blessed and entitled that it is excused from paying attention to reality.
There were no doubt plenty of counterparts among the Mayans when the weather changed and their crops failed, and certainly the Romans had their share of identity psychotics who doubted reality even when Alaric the Visigoth was hoisting off their household treasure.
Reality doesn't care if we are on-board with its mandates or not.
The human race has to get with whatever program reality is serving up at a particular time.
Are we shocked to learn that scientists fight among themselves and cheat as much as congressmen?
Does that really change the relationships we understand about parts-per-million of carbon dioxide in the earth's atmosphere and the weather?
What the people of the world can do or will do about a change in climate is something else.
My guess is that the undertow of entropy is now too great to provoke any meaningful unified change in behavior. The collapse of the US economy is too close to the horizon, and the so-called developing nations will have problems equally severe.
In the meantime, it is unlikely that any of the major players will burn less coal and oil, or not cheat on each other even if they pledge to burn less. People who are not knuckleheads will make the practical arrangements that they can.
These will, by definition, be localized, small-scale, and non-global communities, doing what they would have to do anyway.
A parallel identity mania afflicts those who have decided that the Bakken shale oil deposits and the Marcellus gas play will allow the USA to cancel any modifications to our living arrangements. This cohort of knuckleheads wants to believe the public relations of the oil and gas industry, and in particular the bankers who are arranging the financing for these ventures. The facts are irrelevant to their identity-claims (that the USA has limitless energy resources). In fact, the Bakken shale formation is unlikely to produce more than a few hundred thousand barrels of oil a day in a nation used to burning about twenty million. A few hundred thousand might mean a lot if it were only used to light kerosene lamps, but it is unlikely to keep the faithful motoring off to WalMart and Walt Disney World – which is the exact expectation of the knuckleheads.
Shale gas is a similar story.
It will be too expensive to get out of the tight rock at a flow that will allow business as usual to continue.
It certainly won't be produced at under $10 a unit, and the nation's comprehensive bankruptcy accelerates every day, making it less likely that the public can pay premium prices within the framework of our current living arrangement.
The Green Shoots crowd – a sub-category of identity maniacs, who think the USA is immune to the laws of history and physics – has made common cause with the oil and climate knuckleheads to proclaim that we are returning to normal, back to the "consumer" orgy, the suburban sprawl nexus of McHousing and miracle mortgages, and new frontiers of corporate profit-raking.
They are tragically wrong.
Instead, we're headed into the wildest king-hell debt workout that the world has ever seen, which will propel a lot of people used to working in air-conditioned cubicles into a world made by hand.
We march day by day into the great holiday season with mortgages going unpaid and the credit cards getting cancelled and money disappearing and the fears and grievances mounting.
Pretty soon, the folks doing "God's work" at Goldman Sachs (and their tribal kin on Wall Street) will announce their annual bonuses (because they are publicly-held companies, which have to do so). Won't that be a galvanizing moment for us all?
Environment
Keeping ecosystems healthy makes economic sense and is vital to control climate change, says Commissioner Dimas
EU Environment Commissioner Stavros Dimas will today underline the importance of valuing the benefits we derive from nature in order to protect them more effectively.
Nature provides numerous services for free, like water regulation, flood protection and carbon storage.
Growing pressure on ecosystems around the world means that the provision of many of these services is now at risk.
While technological solutions can replace some of them, this usually comes at a significant cost.
Protecting and restoring biodiversity is therefore an essential step in the transition to a more sustainable economy, particularly because biodiversity will also play a vital role if we are to succeed in combating climate change.
Commissioner Dimas will launch the findings of a key report by The Economics of Ecosystems and Biodiversity (TEEB) project, a major independent global study co-funded by the European Commission.
Study leader Pavan Sukhdev will also present the TEEB for Policy Makers Report.
Commissioner Dimas said:
"The findings of this TEEB report are clear:
the failure to protect nature is often a consequence of our failure to understand its value.
We must learn to value nature to protect it more effectively.
Too often, nature is considered as having little if any economic relevance, but the truth is it sustains and underpins our economies and societies, and can offer effective protection against climate change.
We will not succeed in mitigating or adapting to climate change if we do not protect valuable ecosystems, and we will not manage to halt biodiversity loss if we fail to prevent dangerous levels of climate change."
TEEB Study leader Pavan Sukhdev said:
"The economic invisibility of ecosystems and biodiversity is a major reason for their alarming loss, despite their tremendous economic value to society.
Our stock of natural assets, or natural capital, is as important as man-made assets or physical capital.
Recognising and rewarding the value of benefits flowing to society from natural capital must become a policy priority."
Nature – a cost-effective investment
The report shows that investing in protecting ecosystems can be very cost-effective.
Nature supports a wide range of economic sectors and expands our options for long-term economic growth.
Nature's capacity to provide vital services such as fresh water and climate regulation is often cheaper than having to invest in technological solutions.
Protected areas – the cornerstone of our conservation policies – are not only good for nature; they can also generate significant benefits.
In Scotland, the public benefits of protecting the Natura 2000 network (the European network of protected areas) are estimated to be three times greater than its costs.
Nature is also the most cost-effective way of mitigating and adapting to climate change.
Despite mitigation efforts, risks of natural hazards are predicted to increase with climate change.
Investments in healthy ecosystems will be important for protection against events such as floods and storms.
They are even more attractive when the full range of services they provide is taken into account.
The findings further underline the need for agreement in Copenhagen on financing action to reduce carbon dioxide (CO2) emissions from deforestation and degradation in tropical countries (through a scheme known as REDD).
Deforestation and forest degradation are responsible for around 20% of global CO2 emissions – more than all forms of transport combined.
The EU wants the Copenhagen agreement to set the goal of at least halving tropical deforestation from current levels by 2020 and halting global forest cover loss completely by 2030.
The European Commission's proposal to scale up financing to developing countries to help fight climate change includes provisions to halt deforestation (see IP/09/1297).
Challenges ahead
The report highlights two main challenges for policy-makers.
The first is to understand the values of our ecosystems, biodiversity and natural resources and integrate them into decision-making across all policy areas at all levels – local, national and global.
The second is to respond efficiently and to tailor policy solutions to the needs of different economies and societies.
Better understanding and measuring of biodiversity and ecosystem values to support integrated policy assessments are a core part of the long-term solution.
Background
TEEB was launched by Germany and the European Commission in response to a proposal by the G8+5 Environment Ministers, meeting in Potsdam, Germany in 2007, for an independent global study on the economics of biodiversity loss.
It is led by the United Nations Environment Programme (UNEP) with financial support from the European Commission, Germany and the UK, joined more recently by Norway, the Netherlands and Sweden.
Scientists map speed of climate change
From beetles to barnacles, pikas to pine warblers, many species are already on the move in response to shifting climate regimes. But how fast will they—and their habitats—have to move to keep pace with global climate change over the next century?
In a new study, a team of scientists including Dr. Healy Hamilton from the California Academy of Sciences have calculated that on average, ecosystems will need to shift about 0.42 kilometers per year (about a quarter mile per year) to keep pace with changing temperatures across the globe.
Mountainous habitats will be able to move more slowly, since a modest move up or down slope can result in a large change in temperature.
However, flatter ecosystems, such as flooded grasslands, mangroves, and deserts, will need to move much more rapidly to stay in their comfort zone—sometimes more than a kilometer per year. The team, which also included scientists from the Carnegie Institute of Science, Climate Central, and U.C. Berkeley, will publish their results in the December 24 issue of Nature.
"One of the most powerful aspects of this data is that it allows us to evaluate how our current protected area network will perform as we attempt to conserve biodiversity in the face of global climate change," says Healy Hamilton, Director of the Center for Applied Biodiversity Informatics at the California Academy of Sciences. "When we look at residence times for protected areas, which we define as the amount of time it will take current climate conditions to move across and out of a given protected area, only 8% of our current protected areas have residence times of more than 100 years. If we want to improve these numbers, we need to both reduce our carbon emissions and work quickly toward expanding and connecting our global network of protected areas."
The team calculated the velocity of global climate change by combining data on current climate and temperature regimes worldwide with a large suite of climate model projections for the next century.
Their calculations are based on an "intermediate" level of projected greenhouse gas emissions over the next century (the A1B emissions scenario from The Intergovernmental Panel on Climate Change). Under these emissions levels, the velocity of climate change is projected to be the slowest in tropical and subtropical coniferous forests (0.08 kilometers per year), temperate coniferous forests (0.11 kilometers per year), and montane grasslands and shrublands (0.11 kilometers per year). The velocity of climate change is expected to be the fastest in flatter areas, including deserts and xeric shrublands (0.71 kilometers per year), mangroves (0.95 kilometers per year), and flooded grasslands and savannas (1.26 kilometers per year).
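The velocity index described above amounts to a simple ratio: how fast local temperatures are changing over time, divided by how fast temperature changes across the landscape. A minimal sketch of that arithmetic, with purely illustrative numbers rather than values from the study (the function names and inputs below are my own assumptions):

```python
# Sketch of a climate-velocity index: the temporal warming trend divided
# by the local spatial temperature gradient. Steep terrain has a large
# spatial gradient, so isotherms need only creep upslope; flat terrain
# has a shallow gradient, forcing climates to race across the landscape.
# All numeric inputs here are illustrative assumptions, not study data.

def climate_velocity(temp_trend_c_per_yr, spatial_gradient_c_per_km):
    """km/yr an isotherm must move to track the local warming trend."""
    return temp_trend_c_per_yr / spatial_gradient_c_per_km

def residence_time_years(reserve_diameter_km, velocity_km_per_yr):
    """Years for current climate conditions to traverse a protected area."""
    return reserve_diameter_km / velocity_km_per_yr

# A mountainside: temperature falls quickly with distance (elevation).
mountain = climate_velocity(0.03, 0.3)    # -> 0.1 km/yr

# A flat grassland: a shallow gradient forces a much faster shift.
flatland = climate_velocity(0.03, 0.025)  # -> 1.2 km/yr

print(f"mountain: {mountain:.2f} km/yr, flatland: {flatland:.2f} km/yr")
print(f"residence time of a 30 km reserve on flat terrain: "
      f"{residence_time_years(30, flatland):.0f} years")
```

The same ratio explains the study's "residence time" finding: a 30 km reserve on flat terrain is crossed by its climate in a few decades, which is why so few protected areas clear the 100-year mark.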
The vulnerability of these respective biomes depends not only on the average velocity of climate change they will experience, but also on the sizes of the protected areas in which they are found.
For instance, while the velocity of climate change is expected to be high in deserts, this threat is mediated by the fact that protected areas for deserts tend to be larger. On the other hand, the small size and fragmented nature of most protected areas in Mediterranean temperate broadleaf and boreal forest biomes makes these habitats particularly vulnerable.
What does this mean for beetles, barnacles, and other groups of species?
The researchers note that their index estimates the velocities and residence times of climates, not species. Individual species that have a wide tolerance for a range of temperatures may be able to adapt in place as the climate around them shifts. However, for species that can only tolerate a narrow band of temperatures, the velocity estimates in the study are a close approximation for the migration speeds needed to potentially avoid extinction.
Nearly a third of the habitats in the study have velocities higher than even the most optimistic plant migration estimates, suggesting that plants in many areas will not be able to keep up with the shifting climate.
Even more problematic is the fact that natural habitats have been extensively fragmented by human development, which will leave many species with "nowhere to go," regardless of their migration rates.
The team's results not only underscore the importance of lowering greenhouse gas emissions—they also provide data for conservation managers who must now plan for the impact of global climate change.
The research was funded by the Gordon and Betty Moore Foundation and the Stanford University Global Climate and Energy Project.
Glacier melt adds ancient edibles to marine buffet
Glaciers along the Gulf of Alaska are enriching stream and near shore marine ecosystems from a surprising source – ancient carbon contained in glacial runoff, researchers from four universities and the U.S. Forest Service report in the December 24, 2009, issue of the journal Nature.
In spring 2008, Eran Hood, associate professor of hydrology with the Environmental Science Program at the University of Alaska Southeast, set out to measure the nutrients that reach the gulf from five glaciated watersheds he can drive to from his Juneau office.
"We don't currently have much information about how runoff from glaciers may be contributing to productivity in downstream marine ecosystems. This is a particularly critical question given the rate at which glaciers along the Gulf of Alaska are thinning and receding" said Hood.
Hood then asked former graduate school colleague Durelle Scott, now an assistant professor of biological systems engineering at Virginia Tech, to help analyze the organic matter and nutrient (nitrogen and phosphorus) loads being exported from the Juneau-area study watersheds. "Because there are few reports of nutrient yields from glacial watersheds, Eran and I decided to compare the result from a non-glacial watershed with those of a watershed partially covered by a glacier and a watershed fully covered by a glacier," said Scott.
Hood and Scott's initial findings, reported in the September 2008 issue of the journal Nature Geoscience (http://www.nature.com/ngeo/journal/v1/n9/abs/ngeo280.html), presented something of a mystery.
As might be expected, there is more organic matter from a forested watershed than from a fully or partially glacier-covered watershed.
With soil development, organic matter is transported from the landscape during runoff events. However, there was still a considerable amount of organic carbon exported from the glaciated landscape.
How can a glacier be a source of the organic carbon?
His curiosity piqued, in spring 2009, Hood's Ph.D. student, Jason Fellman, collected samples from 11 watersheds along the Gulf of Alaska from Juneau to the Kenai Peninsula.
The samples were analyzed to determine the age, source, and biodegradability of organic matter derived from glacier inputs.
"We found that the more glacier there is in the watershed, the more carbon is bioavailable.
And the higher the percentage of glacier coverage, the older the organic material is – up to 4,000 years old," said Scott.
Hood and Scott hypothesize that forests that lived along the Gulf of Alaska between 2,500 and 7,000 years ago were covered by glaciers, and this organic matter is now coming out.
"The organic matter in heavily glaciated watersheds is labile, like sugar. Microorganisms appear to be metabolizing ancient carbon and as the microorganisms die and decompose, biodegradable dissolved organic carbon is being flushed out with the glacier melt," said Scott.
How much?
"Our findings suggest that runoff from glaciers may be a quantitatively important source of bioavailable organic carbon for coastal ecosystems in the Gulf of Alaska and, as a result, future changes in glacier extent may impact the food webs in this region that support some of the most productive fisheries in the United States," said Hood.
Peak oil vs. climate chaos
Time for a little critical thinking boot camp, thanks to a piece I just stumbled upon.
Oil And Environment:
A Contradiction:
Among the factors in systemic collapse that should be placed far down on the list are many that might be described as environmental:
pollution, global warming, and so on.
The fact is that the issue of peak oil and that of the environment are mutually exclusive problems. As oil and other fossil fuels disappear, the environmental problems will also go away, even if very slowly.
By trying to raise the alarm about both issues at once, we are placing ourselves in a self-contradictory position, and our credibility is rapidly undermined.
We cannot, on the one hand, wish that oil would go away so that the air will have a crystalline purity, and on the other hand complain because we have spent hours poring over the charts of global oil production and found that the cost of driving to the cottage is becoming prohibitive.
As oil is depleted, there will be fewer automobiles and factories, so the air and water will be less polluted.
As the air becomes cleaner, the man-made aspects of global warming will be reduced.
As fossil fuels disappear, in fact, all that goes with it will disappear or be reduced.
Above all, there will be no chance for 7 billion people to be living on the Earth.
As the human population goes down, so a great many other problems of this planet will recede, from the disappearance of fresh water to the extinction of species.
That is not to say that the reversal of the destruction will be a sudden process. On the contrary, even if our use of fossil fuels ended tomorrow, it would take decades for the planet to cleanse itself.
The greatest danger of fossil-fuel depletion, on the other hand, is that human life itself will come to an end.
This is not a topic to stir the patriotism of the sheltered souls of Middle America.
It is a nightmare.
The simplest arithmetic shows that 7 billion humans cannot be fed with the products of pre-industrial agriculture.
We can try to hide from that reality by planting a few rows of tomatoes and lettuce up at the cottage on a summer weekend, but deep in our hearts we know that human life requires far more than tomatoes and lettuce.
Even more frightening is the thought that those doomed human beings will not float up into the sky and enter some other dimension.
Their deaths will not be anesthetized.
Death by famine is slow and painful.
It is not just hunger, and it is not just fasting.
After a few weeks without food, the entire body starts to fall apart.
Not a very nice topic for a high-school essay.
It is far better that we allow our teenagers to continue their air-conditioned lives, to dwell in what Ibsen called a doll's house.
(Don't just read these excerpts; please go read it all, if only to wallow in more of the moral superiority as seen in the last graf I quoted.)
Before I explain why I think the viewpoint expressed above is ridiculously wrong, let me ask you, dear reader, to ponder it for just a few moments. Hint:
What assumptions is the author making that one could reasonably say are either wrong or on very shaky ground?
While you're thinking and mentally jotting down some notes, I'll go make a cup of tea.
OK, I'm back, steaming mug in hand.
Did you say that the key points of contention (in no particular order) are:
* Timing.
The author is assuming that peak oil will bite us hard before climate chaos does.
* The impact of climate chaos, as a function of the amount of CO2 we dump into the air. The assumption is not only that climate chaos is quickly reversible, but that it is reversible at all.
Right now, I think the most likely scenario is that climate chaos will start to bite soon enough and visibly enough, and we'll finally begin to restrain our fossil fuel consumption, so that far from "peak oil saving us from climate chaos", as many of my fellow enviros like to claim will happen, it will be just the opposite–we'll reduce our carbon emissions just slightly quicker than peaking production alone would have forced us. (It could well be true, as I've argued here before, albeit not recently, that policy makers will hear the peak oil message very clearly behind the scenes and it will give them the needed urgency to restrain our consumption while selling these changes to the public as a move to mitigate climate chaos impacts.)
As for the second point, I think the author of the above opinion is wildly off base in how long it will take "for the planet to cleanse itself" (and no, I'm not entirely sure what that means). Instead of "decades" it would be centuries, minimally, before the atmospheric level of CO2 dropped enough to curtail further warming; remember all that warming that's "in the pipeline" we're always talking about because the planet is in a state of radiative disequilibrium, in part because of feedbacks?
And that's assuming we're lucky and we haven't already jolted the Earth's environment so much with our emissions that we've awakened the monster under our bed, the 1.6 trillion tons of carbon locked up in permafrost in and near the Arctic, plus the vast worldwide methane hydrate deposits. With all we've learned lately (and continue to learn) about how twitchy the environment is, we simply don't know if we've disturbed enough of the small stones on the mountainside to set a gigantic boulder rolling downhill, a boulder we have almost no chance of stopping, slowing, or even deflecting once we see it beginning to move.
In no way should this be taken as an indication that I think peak oil "isn't a problem". I believe now, as I have since I started writing here nearly six years ago, that peak oil is a real, imminent, and gigantic problem.
In short, this ongoing war of words between the peak oil and climate chaos camps is ridiculous and counterproductive.
They're both enormous threats, and either one would be a hideous problem by itself, only made worse with the addition of the other problem to the toxic mix.
Generating Solar Power After Dark
Photo: SolarReserve. An artist's rendering of one of the two solar farms planned by SolarReserve of Santa Monica, Calif., which would store the sun's energy in molten salt, releasing the heat at night when it could be used to drive a turbine and generate electricity.
Solar farms that would serve two Western utilities are planning to use technology that will generate electricity after the sun goes down, a move that could be a potential game-changer for the industry.
The two farms being planned by SolarReserve of Santa Monica, Calif., would store the sun's energy in molten salt, releasing the heat at night when it could be used to drive a turbine and generate electricity.
Two utilities, NV Energy in Nevada and Pacific Gas and Electric, Northern California's biggest utility, would buy the power.
The sun's intermittent nature has made large-scale solar farms most useful as so-called peaker plants that supply electricity when demand spikes, typically in the late afternoon on hot days. But the ability of SolarReserve to store the sun's energy for use at night would be a step forward in technology.
"The energy storage characteristics were a key factor in our selection of the Tonopah solar energy project," NV Energy's chief executive, Michael Yackira, said in a statement.
The utility will be able to draw electricity from the solar farm more or less on demand, which makes it easier to balance the load on the power grid.
NV Energy would buy power from the 100-megawatt Crescent Dunes Solar Energy Project being planned on federal land near Tonopah, Nev., about 215 miles northwest of Las Vegas.
"We're expecting to put in 12 hours of storage, which allows us to move power within the day to meet peak requirements as well as to operate at full load," SolarReserve's chief executive, Kevin Smith, said of the Tonopah plant.
But the technology comes with its own set of problems.
The site of the Nevada project has been moved several times because of concerns from the Air Force that the project would interfere with advanced radar systems. The solar farm features a 538-foot-tall concrete tower topped by a 100-foot receiver that contains millions of gallons of molten salt.
The Rocketdyne division of United Technologies developed the molten salt technology and has licensed it to SolarReserve.
Huge fields of mirrors called heliostats focus the sun on the receiver, which heats the salt to 1,050 degrees. The liquefied salt flows through a steam-generating system to drive the turbine and is returned to the receiver to be heated again.
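As a rough sense of the scale implied by the article's figures, 100 megawatts delivered for 12 hours pins down the electrical energy the salt must supply; everything else in this back-of-envelope sketch (turbine efficiency, salt specific heat, hot-to-cold temperature swing) is an assumed round number, not a SolarReserve specification:

```python
# Back-of-envelope sizing for molten-salt storage at the Tonopah plant.
# The 100 MW output and 12 hours of storage come from the article;
# turbine efficiency, salt specific heat and the temperature swing are
# illustrative assumptions.

ELECTRIC_POWER_MW = 100.0   # plant output (from the article)
STORAGE_HOURS = 12.0        # hours of storage (from the article)
TURBINE_EFFICIENCY = 0.40   # assumed steam-cycle efficiency
SALT_CP_KJ_PER_KG_K = 1.5   # assumed specific heat of nitrate salt
DELTA_T_K = 275.0           # assumed hot-minus-cold salt temperature swing

# Electrical energy to be delivered from storage.
electric_mwh = ELECTRIC_POWER_MW * STORAGE_HOURS       # 1200 MWh

# Thermal energy the salt must hold, given the turbine efficiency.
thermal_mwh = electric_mwh / TURBINE_EFFICIENCY        # 3000 MWh

# Salt mass from E = m * cp * dT  =>  m = E / (cp * dT).
thermal_kj = thermal_mwh * 3.6e6                       # 1 MWh = 3.6e6 kJ
salt_tonnes = thermal_kj / (SALT_CP_KJ_PER_KG_K * DELTA_T_K) / 1000.0

print(f"{electric_mwh:.0f} MWh electric -> {thermal_mwh:.0f} MWh thermal")
print(f"~{salt_tonnes:,.0f} tonnes of salt")
```

Under these assumed values the tank holds on the order of tens of thousands of tonnes of salt, which is consistent with the article's description of a receiver containing millions of gallons.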
SolarReserve's other farm, the 150-megawatt Rice Solar Energy Project for P.G. and E. in California, is expected to have seven hours of energy storage.
The solar farm, to be built on private land near the desert ghost town of Rice, will provide electricity to the utility under a 25-year contract.
SolarReserve, which is also building a solar power plant in Spain, expects the California and Nevada projects to create 900 construction jobs.
CO2 disposal in the ocean is a dangerous distraction
The urgency of reducing emissions of CO2 has never been greater. The science of climate change has revealed that the risks are much higher and more imminent than we had estimated only a few years ago. But just as with a deadly emergency in a heavy passenger jet, the crew should never, ever rush into hasty actions that will ultimately make a very bad situation a lot worse.
Ocean disposal of CO2 is one such option.
A careful, rational and scientific analysis of the option of CO2 disposal in the ocean leads to the conclusion that it is not viable.
In 2006 the German government's scientific Advisory Council on Global Change (WBGU) came down against this option:
"introducing CO2 into seawater should be prohibited, because the risk of ecological damage cannot be assessed and the retention period in the oceans is too short."
The main arguments were "the largely incalculable ecological risk" and the fact that over longer timeframes a significant fraction of the stored CO2 would get back to the atmosphere.
In the long run (hundreds to thousands of years), a timeframe we must always keep in mind when devising climate policies given the very long atmospheric lifetime of CO2, this option would not help reduce CO2 below the levels that would otherwise have occurred.
As the IPCC Fourth Assessment Mitigation report, in which I was a lead author, has shown, conventional options to rapidly reduce emissions in the next few decades are available now.
What is lacking to deploy these at scale and quickly are the appropriate policy settings. The absence of these is most acute in Dr Broecker's own country, the USA.
It is not just Greenpeace and other environmental groups that think that ocean disposal is a bad option.
The decision by OSPAR, for example, to explicitly rule out the disposal of CO2 into the ocean and on to the sea bed, is by no means irrational, nor the result of "strong-arm tactics".
Neither is it a decision in which Greenpeace played any major role.
Turning to some of the details in Dr Broecker's arguments.
Deep water injection of CO2 would cause inevitable and potentially irrevocable damage to those deep-water ecosystems directly impacted (smothering, asphyxiation, acidification), and at scale would result in far more widespread effects in the abyssal zone over time as the clathrates dissolve.
Over far longer timescales it would result in changes to abyssal ecosystems which in turn feed back to the global carbon cycle.
To suggest that there is "no indication that the projected rise in upper ocean CO2 content will have adverse impacts on fish" and, on this basis, to argue that spread of CO2 through the deep sea would therefore also be benign, is misleading in the extreme.
This statement ignores the growing evidence that projected rises in upper ocean CO2 and consequent acidification are likely to have profound impacts on calcification rates and calcifying organisms. It is predicted that upper ocean pH levels will drop to levels lower than those recorded at any time over tens of millions of years, and at a rate orders of magnitude greater than any previous change.
There is also evidence that deep water crustacean species, sediment dwelling organisms and associated ecosystem processes could also be adversely affected, including changes to nutrient cycling thought to occur through impacts on sediment microflora.
The fact that deep water CO2 concentrations are currently lower than those of surface waters should not be taken as an indication of a vast unexploited capacity for CO2 disposal.
Our knowledge of the biogeochemical processes which have contributed to the current distribution of CO2 in the deep oceans remains limited, as does our capacity therefore to predict the consequences of multi-billion tonne injections of CO2 at depth.
To assume that uniformity of concentration is somehow an acceptable target, or one which will have minimal impact on marine ecosystems and the carbon cycle, is oversimplistic.
Dr Broecker argues that a series of experiments involving the release of one tonne quantities of CO2 at depths greater than 3,500m are the next logical step.
One tonne release experiments to observe behaviour and determine impacts is one thing.
However, one tonne experiments intended as proof of the concept for multi-gigatonne injections in the future is quite another.
An obvious critical aspect is the potential for cumulative impacts resulting from continuous injections over long periods, or a large number of injections, such as would be a necessary characteristic of any deep injection strategy for climate change mitigation.
The nature and likelihood of these cumulative impacts simply could not be assessed from the results of the experiments he suggests.
Existing ocean dumping laws are designed to protect the marine environment from irresponsible and unsustainable waste disposal operations. The London Convention and its 1996 Protocol, which are currently in force in parallel, preclude the disposal at sea of industrial waste, including CO2, with the specific exception to enable carbon capture and storage in sub-sea bed geological formations under strict conditions of operation, verification, monitoring and control.
By definition, injection of CO2 at the sea bed deliberately and immediately relinquishes any control over the waste.
Such disposal operations are effectively irreversible, and any adverse consequences, on whatever geographical and time scales they may occur, cannot be prevented or mitigated.
If the models Broecker suggests we should rely on prove to be inadequate or inaccurate, or both, what do we do?
In short, ocean disposal of CO2, in common with other proposals for geoengineering our way out of climate change, is simply a dangerous distraction and draws attention away from the real solutions. There is no alternative but to drastically reduce emissions and this is best done at source using renewable energy, energy efficiency, reducing deforestation and improving the efficiency of industry and agriculture.
Many in the scientific community and in environmental groups such as Greenpeace share Dr Broecker's deep sense of frustration at the lack of action to date and his great sense of foreboding over the fate of the planet if we do not succeed in getting emissions reduced quickly.
Dr Broecker's work and writings since at least the mid-1970s warning of the dangers of rising CO2 helped to inspire a generation of scientists such as myself to work on this subject, and moreover to work hard and long to develop a global agreement to reduce emissions and limit the risk of rapid human-induced and dangerous climate change.
As such his views are to be taken seriously and considered carefully.
But in this case we must agree to disagree.
By Jennifer Graham, THE CANADIAN PRESS, cp.org. Updated: December 29, 2009, 10:11 AM
Experts say road development in energy sector threatening Prairie songbird
REGINA - Experts say the increasing quest for energy is devouring a prairie songbird's habitat and putting the future of the species in doubt.
The Committee on the Status of Endangered Wildlife in Canada has added the chestnut-collared longspur to the threatened list.
That means it's likely to become endangered if nothing is done.
"What we were presented with was really some troubling information on the population trends for this bird," says Marty Leonard, a co-chair on the committee.
"From the late 1960s up until the present day, more than 90 per cent of their population has been lost.
More than 90 per cent of these birds that were in Alberta, Saskatchewan and Manitoba have declined and that's a lot of animals to lose just since the late '60s."
The chestnut-collared longspur is a small grassland songbird between 13 and 15.5 centimetres long.
The birds have a chestnut collar at the back of their head and their crown and breast are black.
Their tail has a distinctive black-and-white pattern, which they display during aerial acrobatics that accompany their song.
In Canada, the chestnut-collared longspur lives in grasslands in southern Alberta, Saskatchewan and Manitoba.
Leonard, a biology professor at Dalhousie University in Halifax, focuses his research on the behaviour and ecology of birds, as well as on the conservation of endangered birds. He says by crude estimates there are about 600,000 chestnut-collared longspurs left in Canada.
"People always say, 'Well, that's still a lot,' and that's true, but I guess one of the things we worry about is the fact that these sort of trends continue.
"The fact of it is, there's still a reasonable number of them, but if the trend continues there will come a time when there isn't, and then they might have difficulty finding another bird to breed with because their habitat gets chopped up.
"Numbers can get too small and it also makes them vulnerable."
Leonard says bird numbers usually fall because of habitat loss.
In the case of the longspur, that's partly due to invasion by shrubs and non-native plants. But the committee also says in its latest report that the bird's habitat is being lost and fragmented because of "road development associated with the energy sector."
"They need a particular size of habitat to breed in," explains Leonard.
"It gets sort of broken up into smaller and smaller chunks with roads, oil and gas development ... so you might see a patch of grassland and you'd say, 'Oh, well look, that looks OK,' but if the grass isn't the right height or there's too much woody area on it they won't breed there."
The Committee on the Status of Endangered Wildlife is to submit a report to the federal environment minister next summer on the chestnut-collared longspur. The minister needs to formally recognize the bird as threatened before it can be declared as such, although Leonard says the minister almost always accepts the committee's recommendations.
The designation would give the bird more protection and a recovery plan could then be put in place.
The chestnut-collared longspur isn't the first bird from the region to land on the committee's list.
The greater prairie chicken is considered extirpated, meaning it no longer exists in the wild in Canada.
The mountain plover was designated endangered in 1987.
Also on the list is the Eskimo curlew, a bird known to nest only in Canada.
It was designated endangered in 1978, but there have been no verified sightings of the species since 1963.
The committee says that indicates the curlew is on the brink of becoming the first Canadian bird to be declared extinct since the passenger pigeon nearly 100 years ago.
There is some good news, however.
The committee says the swift fox population could be on the rebound.
The swift fox was considered extirpated in 1978 and declared endangered in 1998.
But programs to reintroduce it in Alberta and Saskatchewan are paying off and its status has now been upgraded to threatened.
Climategate Investigator Is Member Of Vehemently Pro-Man Made Global Warming Organization
A civil servant who belongs to one of the most vehemently pro-man-made-global-warming advocacy organizations in Europe, one with direct ties to the IPCC, has been handed the job of whitewashing the investigation into the University of East Anglia, while absurdly billing himself as impartial and unconnected to climate science.
Meanwhile, the UN Intergovernmental Panel on Climate Change has ludicrously announced that it will conduct its own investigation into the climategate scandal, despite the fact that the suspects involved have intimate ties to the IPCC; one of the primary scientists accused of manipulating climate data was a lead author of the 1995, 2001 and 2007 IPCC reports.
"The UN's Intergovernmental Panel on Climate Change (IPCC) is the leading body for assessing climate change science," reports the Daily Mail.
IPCC chairman Dr Rajendra Pachauri told the BBC the claims were serious and he wanted them investigated.
"We will certainly go into the whole lot and then we will take a position on it," he said.
"We certainly don't want to brush anything under the carpet.
This is a serious issue and we will look into it in detail."
Having the IPCC investigate climategate would be like Ken Lay heading up the Enron enquiry.
One of the primary climategate suspects, Kevin Trenberth, is a lead IPCC author, having been influential in crafting the 1995, 2001, and 2007 IPCC reports.
Professor Phil Jones, who infamously wrote of the need to "hide the decline" in global warming, is also a primary IPCC contributor, having been in charge of the two key sets of data used by the IPCC to draw up its reports.
In addition, another climategate suspect, Michael Mann, was the creator of the debunked "hockey stick" graph, which was "given star billing by the IPCC, not least for the way it appeared to eliminate the long-accepted Mediaeval Warm Period when temperatures were higher than they are today."
There is little pretense about the fact that the UN will merely absolve its own scientists of blame. The larger scam is the notion that civil servant Sir (Alastair) Muir Russell, who was picked to head the investigation into East Anglia University after an earlier trial balloon to have the inquiry headed by warmist advocate Lord Rees was shot down, is impartial. Russell is a member of The Royal Society of Edinburgh, a vehemently pro-man-made global warming organization.
"As a measure of how out of touch UEA is, they apparently have little idea that the title "former civil servant" does not inspire much confidence from skeptics, since it has been "civil servants" who have been blocking access to the data and procedures all along," writes Anthony Watts.
Russell has pitched himself as someone with "no links to either the university or the climate science community," yet he is firmly a member of the academic establishment, being the former Principal and Vice-Chancellor of the University of Glasgow.
Russell is the quintessential establishment lackey, having been appointed a Knight Commander of the Order of the Bath in the Queen's Birthday Honours in 2001.
There's little doubt that Russell is an establishment insider who has been tasked with whitewashing the whole affair. He is a member of the Royal Society of Edinburgh, an offshoot of the same organization that Lord Rees is a part of. The Royal Society of Edinburgh "provides annual grants totaling over half a million pounds for research" in Scottish universities, a sizeable portion of which goes to research attempting to validate claims about man-made global warming.
The RSE has thrown its weight behind the global warming movement, lending its absolute support for legislation aimed at reducing carbon emissions by 80%, a process that will devastate the global economy and living standards.
This organization has been even more vehement than national governments in its advocacy of the man-made cause of global warming, calling for drastic CO2 cuts to be made in the short term rather than by the usual target date of 2050.
A February 2009 response to the Climate Change (Scotland) Bill outlines the organization's staunch advocacy for the hypothesis of anthropogenic (man-made) global warming.
Earlier this year, The Royal Society of Edinburgh elected Professor Peter Smith as one of its members. Smith just happens to be the Convening Lead Author of the Intergovernmental Panel on Climate Change (IPCC) Fourth Assessment Report.
He has also been a lead author for numerous other IPCC reports over the past thirteen years.
How can a proud member of an organization that aggressively pushes measures to cut CO2 in the name of halting alleged man-made global warming, and that has a direct relationship with the UN IPCC, bill himself as totally impartial and unconnected to climate science?
The climategate scandal has grown wings and taken flight.
This is a scandal within a scandal: the notion that the very crooks caught manipulating data can appoint their own allies to "investigate" their wrongdoing and think nothing of it, while claiming that such individuals are impartial and independent, is beyond belief.
The only real investigation of climategate has to occur at the congressional or parliamentary level, preferably both, and it has to be completely open and transparent – not sneakily wrapped up behind closed doors by organizations like the IPCC and members of the Royal Society of Edinburgh, both of which have a massive stake in protecting and upholding the entire climate change fraud.
Escalation a Bad Sign for Afghanistan Environment
Shipping off 30,000 more troops to the land of the Taliban may be infuriating to devoted antiwar activists, but the toll the Afghanistan war is having on the environment should also force nature lovers into the streets in protest.
Natural habitat in Afghanistan has endured decades of struggle, and the War on Terror has only escalated the destruction.
The lands most afflicted by warfare are home to critters that most Westerners only have a chance to observe behind cages in our city zoos:
gazelles, cheetahs, hyenas, Turanian tigers and snow leopards among others.
Afghanistan's National Environmental Protection Agency (NEPA), which was formed in 2005 to address environmental issues, has listed a total of 33 species on its Endangered list.
By the end of this year, NEPA's list may grow to over 80 species of plants and animals.
In 2003, the United Nations Environment Program (UNEP) released its evaluation of Afghanistan's environmental issues. Titled "Post-Conflict Environmental Assessment," the UNEP report claimed that war and long-standing drought "have caused serious and widespread land and resource degradation, including lowered water tables, desiccation of wetlands, deforestation and widespread loss of vegetative cover, erosion, and loss of wildlife populations."
Ammunition dumps, cluster bombs, B-52 bombers and land mines, which President Obama refuses to ban, serve as the greatest threat to the country's rugged natural landscape and the biodiversity it cradles.
The increasing number of Afghanis that are being displaced because of military conflict, UNEP's report warned, has compounded all of these problems. It was a sobering estimation.
However, it was an analysis that should not come as much of a surprise:
warfare kills not only humans, but life in general.
As bombs fall, civilians are not the only ones put at risk, and the lasting environmental impacts of the war may not be known for years, perhaps decades, to come.
For example, birds are killed and sent off their migratory course.
Tens of thousands of birds leave Siberia and Central Asia to find their winter homes to the south.
Many of these winged creatures have traditionally flown through Afghanistan to the southeastern wetlands of Kazakhstan, but their numbers have drastically declined in recent years.
Endangered Siberian cranes and two protected species of pelicans are the most at risk, say Pakistani ornithologists who study the area.
The war's true impact on these species is not yet known, but President Obama's escalating of the combat effort in the country is not a hopeful sign.
Back in 2001, Dr. Oumed Haneed, who monitors bird migration in Pakistan, told the British Broadcasting Corporation (BBC) that the country had typically witnessed thousands of ducks and other wildfowl migrating through Afghanistan to Pakistan.
Yet, once the US began its bombing campaigns, few birds were to be found.
"One impact may be directly the killing of birds through bombing, poisoning of the wetlands or the sites which these birds are using," said Haneed, who works for Pakistan's National Council for Conservation of Wildlife.
"Another impact may be these birds are derouted, because their migration is very precise.
They migrate in a corridor and if they are disturbed through bombing, they might change their route."
The White Mountains, where the US has hunted bin Laden, have seen some of the most intense fighting in Afghanistan and have been hit the hardest.
While the difficult-to-access ranges may serve as safe havens for alleged al-Qaeda operatives, the Tora Bora caves and steep topography also provide refuge for bears, Marco Polo sheep, gazelles and mountain leopards.
Every missile that is fired into these vulnerable mountains could potentially kill any of these treasured animals, all of which are on the verge of becoming extinct.
"The same terrain that allows fighters to strike and disappear back into the hills has also, historically, enabled wildlife to survive," Peter Zahler of the Wildlife Conservation Society (WCS) told New Scientist at the onset of the Afghanistan invasion.
But Zahler, who helped to open a field office for WCS in Kabul in 2006, also warned that these animals are at risk not only from bombing but from being killed by refugees. For instance, the pelt of a snow leopard, whose endangered population in the country is said to be fewer than 100, can fetch $2,000 on the black market. That money in turn can help displaced Afghans pay for safe passage into Pakistan.
Bombings, however, while having an initial direct impact, are really only the beginning of the dilemma.
As Zahler recently told Truthout, "The story in Afghanistan is not the actual fighting - it's the side effects - habitat destruction, uncontrolled poaching, that sort of thing."
Afghanistan has faced nearly 30 years of unfettered resource exploitation, even prior to the most recent war. This has led to a collapse of government systems and has displaced millions of people, all of which has led to the degradation of the country's habitat on a vast scale.
Forests have been ravaged to provide short-term energy and building supplies for refugees. Many of the country's arid grasslands have also been overgrazed and wildlife killed.
"Eventually the land will be unfit for even the most basic form of agriculture," explained Hammad Naqi of the World Wide Fund for Nature in Pakistan.
"Refugees - around four million at the last count [in 2001] - are also cutting into forests for firewood."
In early 2001, during the initial attacks, the BBC reported that the United States had been carpet bombing Afghanistan in numerous locations.
John Stufflebeem, deputy director of operations for the US Joint Chiefs of Staff, told reporters at the time that B-52 aircraft were carpet bombing targets "all over the country, including Taliban forces in the north."
"We do use [carpet bombing strategies]," said Stufflebeem.
"We have used it and will use it when we need to."
If Obama opts for carpet bombing, which the White House denies it will do, this could lead to even further environmental problems and swell the already high refugee numbers.
Additionally, Pakistani military experts and others have alleged that the United States has used depleted uranium (DU) shells to strike specific targets inside Afghanistan, most notably against the Taliban frontlines in the northern region of the country.
Using DU explosives is not far-fetched for the United States. The US-led NATO air force used DU shells when it struck Yugoslavia in 1999.
Once these deadly bombs strike, they rip through their target and then erupt into a toxic cloud of fire.
Many medical studies have shown that DU's radioactive vapors are linked to leukemia, blood cancer, lung cancer and birth defects.
"As US and NATO forces continue pounding Afghanistan with cruise missiles and smart bombs, people acquainted with the aftermaths of two recent previous wars fought by the US fear, following the Gulf and Balkan war syndromes, the Afghan War Syndrome," wrote Dr. Ali Ahmed Rind in the Baltimore Chronicle in 2001.
"This condition is marked by a state of vague ailments and carcinomas, and is linked with the usage of Depleted Uranium (DU) as part of missiles, projectiles and bombs in the battlefield."
France, Italy and Portugal have asked NATO to halt DU use, yet the Pentagon still does not admit that DU is harmful or that it has used such bombs during its assaults in the country.
Afghanistan's massive refugee crisis, lack of governmental stability, and extreme poverty, coupled with polluted water supplies, drought, land mines and excessive bombings, all contribute to the country's intense environmental predicament.
Experts seem to agree unanimously: there simply is no such thing as environmentally friendly warfare.
Crazy Weather From Coast to Coast
After an exceptionally hot, dry summer, the rain has finally come to Central Texas. Just how much rain will it take to alleviate the current drought conditions?
Considering this is the worst drought Central Texas has experienced since the 1950s, the answer to that is difficult to quantify.
Don't expect any let up in the current water restrictions.
According to the Lower Colorado River Authority, year-to-date rainfall in the area is 8 to 12 inches below normal.
However, this is the second consecutive year of below-normal rainfall, which puts the cumulative deficit closer to 16 to 20 inches.
There are five levels of drought, and a good portion of Travis County has been at the highest "exceptional" level for many months.
According to the Associated Press, drought conditions in Central Texas are improving, especially as the rain keeps coming.
Williamson and Burnet Counties, and even parts of northern Travis County are now considered to be in a moderate drought.
However, the southern half of Travis County and all of Hays County are now considered to be in a severe drought, an improvement from exceptional drought.
In the last year of this two-year drought alone, the drought is estimated to have caused more than $3.6 billion in crop and livestock losses.
What isn't changing significantly, even with all the recent rain, are the lake levels. The National Weather Service says it will take "several periods of sustained heavy, soaking rains to begin refilling lakes Buchanan and Travis."
While that kind of rainfall would get the lake levels and local water supplies back to where they are supposed to be, it would bring other consequences as well.
Just ask Atlanta what it's like to go from drought conditions to catastrophic flooding.
From the drought in Texas to the floods in Georgia, it's difficult these days to find a part of the country not experiencing some kind of unusual weather. According to the National Weather Service, September saw above-average temperatures for most of the country.
That is hardly the case in October, as states from Colorado to Minnesota are seeing early snow.
There is more heavy rain forecast in the south, something places like Atlanta don't need.
California has gone from wildfires to the danger of mudslides as that state braces for storms. And, then there is a heat wave in Florida.
Much of this crazy weather from coast to coast can be attributed to the climate phenomenon known as El Nino, which affects weather patterns around the globe.
According to the National Oceanic and Atmospheric Administration, we are in the midst of an El Nino pattern that will have both a positive and negative impact for the next several months. On the positive side, the hurricane season has been milder and the dry southwest will get some much needed rain.
"El Nino's negative impacts have included damaging winter storms in California and increased storminess across the southern United States."
Starting to sound familiar, isn't it?
Water and Oil Key Climate Issues in the Arab World
COPENHAGEN (IPS/TerraViva) – In outlining a position on climate change, the League of Arab States must somehow account for looming problems like water stress – a problem found from Morocco in the west to the Gulf states in the east – and the importance of oil to the economies of many of the league's members.
Dr Emad Adly, in Copenhagen representing the Arab Network for Environment and Development, says there was a time when countries like Saudi Arabia wouldn't listen to even a mention of worldwide emissions reductions, for fear it would affect revenues from the sale of fossil fuels.
There are still misgivings, but this has shifted.
The Arab ministers responsible for the environment met in Egypt in November and endorsed a call for reductions in emissions, though stopping short of setting a definite target.
The priority for the League is financing adaptation and the transfer of technology to cope with the impacts of global warming.
Water is a key issue throughout the Arab world, and drought is a threat to most countries. Rising sea levels are another anticipated problem, with implications for people living in coastal zones.
Roughly half of Egypt's more than 80 million people live in the Nile Delta – with a large part of the population found in coastal zones.
George Conway, of Imperial College, London, said earlier this week that our detailed understanding of exactly how climate change will unfold in a given location is still imprecise.
"There are twenty different models predicting what will happen to the Nile, but we don't know what is going to happen."
The Intergovernmental Panel on Climate Change estimates as many as six million people could be environmental refugees from the Nile Delta by 2050.
Assessing risks and vulnerability accurately is a priority.
Conway said better country weather data is needed for the adaptation of the global climate models to local levels.
Adly feels that the present deadlock in talks is not unexpected or even necessarily a bad thing.
"The crisis (in the negotiations) in itself is progress."
It was expected, he said, and through it the world will have to find ways to negotiate an agreement amongst nations.
We will accomplish nothing unless we agree on two key points, he stresses:
meaningful emissions reductions from industrialised countries and financing for those who will be most affected.
As to how this can be accomplished, he points to the G77/China group as a powerful bloc of interests.
But beyond this, Adly stresses public opinion around the world.
Politicians everywhere must ultimately answer to their constituencies, and there is enormous public pressure to reach a deal here.
He says the media has played a very important role in highlighting the challenge and framing the problems, leaving reluctant politicians with less room to maneuver.
"They have to show that they are not against humanity."
The long-time environmental activist – he organised his first event on climate change in 1991 – does not underestimate the difficulty of succeeding here.
Compromise will be key, he says, offering China as an example.
One can respect China's desire to continue to grow, but for the survival of us all, it must accept some limits.
Future growth need not repeat all the polluting mistakes of the past.
China is a superpower, he concludes, and it has the capacity to pursue development while reducing emissions.
OIL:
A Market Psychology of Fear?
VANCOUVER, Canada, Dec 8 (IPS/TerraViva) - With or without a binding deal at the climate talks in Copenhagen this month, it seems the world may have to cut its oil consumption, as emerging geological and economic trends limit the availability and affordability of petroleum.
Back in the 1970s, Saudi Arabia's flamboyant oil minister Sheik Ahmed Zaki Yamani articulated what has become conventional wisdom for policymakers around the planet:
''The Stone Age didn't end for lack of stone, and the oil age will end long before the world runs out of oil.''
Today, an increasing chorus of voices is challenging that prediction.
While the world isn't running out of oil in any absolute sense, a daunting picture is beginning to emerge of the availability, and thus affordability, of supply compared with expected increases in demand.
"In 2015, the world's consumption of oil will likely be closing in on 100 million barrels per day, roughly 22 percent higher than the current level - which is a relatively high annual growth for the oil industry," states a briefing marked "confidential" from Canada's Royal Canadian Mounted Police (RCMP), obtained by IPS through a Freedom of Information request.
The censored briefings, created in collaboration with other Canadian government agencies, paint a troubling picture of future energy security that has recently been corroborated by other sources.
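The arithmetic implicit in the RCMP projection can be checked with a short sketch (the 100 million bpd target and the 22 percent figure come from the briefing quoted above; the six-year horizon is an assumption, since the briefing appears to date from around 2009):

```python
# Back out the figures implied by the briefing: consumption "closing in on
# 100 million barrels per day" in 2015, "roughly 22 percent higher than the
# current level".
target_2015 = 100.0          # million barrels/day projected for 2015
growth_over_period = 0.22    # total growth implied by the briefing
years = 6                    # assumed horizon, roughly 2009 to 2015

# Implied current consumption, and the annualized growth rate it implies
current_level = target_2015 / (1 + growth_over_period)
annual_growth = (1 + growth_over_period) ** (1 / years) - 1

print(f"implied current consumption: {current_level:.1f} million bpd")
print(f"implied annual growth: {annual_growth:.1%} per year")
```

The implied current level of roughly 82 million bpd is broadly consistent with the 85 million bpd figure for 2008 cited elsewhere in this article, and growth of roughly 3.4 percent a year is indeed high by the oil industry's historical standards.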
In 2005, the International Energy Agency (IEA), the Paris-based multinational information centre created after the 1973 energy crises, predicted that world oil production could rise to 120 million barrels per day by 2030, up from 85 million bpd in 2008.
The IEA "was forced to reduce" its predictions on possible world supply "to 116 million and then 105 million last year," according to a senior official in the organisation, who spoke with the Guardian newspaper in early November on the condition of anonymity.
The U.S. Department of Energy, through its International Energy Outlook (IEO), has also been quietly scaling down its numbers on possible supply.
In 2007, the agency predicted that the world would be able to pump 107.2 million barrels per day in 2030.
In summer 2009, it drastically reduced its supply predictions to 93.1 million barrels per day.
In its latest forecast, released Nov. 10, the IEA predicted that world oil supply would hit 105 million barrels per day by 2030.
Even with those figures, which many analysts, including some inside the IEA, consider overly optimistic, there is likely to be a shortfall of some 11 million barrels per day by 2030.
"Every year we lose four million barrels a day [of production due to depletion]," said Jeff Rubin, the former chief economist with CIBC World Markets.
"Over the next five years, we are going to have to find 20 million barrels a day of new production, just so that we can [continue to] consume what we consume today," Rubin told IPS in June.
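Rubin's two numbers are consistent with each other, as a one-line calculation shows (a sketch using only the depletion rate and five-year horizon he cites):

```python
# Depletion arithmetic cited by Rubin: losing 4 million barrels/day of
# production every year means that over five years, 20 million barrels/day
# of new production must be found just to hold consumption flat.
depletion_per_year = 4   # million barrels/day lost annually to depletion
years = 5

new_production_needed = depletion_per_year * years
print(f"new production needed over {years} years: "
      f"{new_production_needed} million barrels/day")
```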
Rubin is a believer in the peak oil theory - the idea that oil production will reach a maximum point and then fall fairly sharply as demand outpaces possible supply.
M. King Hubbert, a geologist with Shell Oil in the United States, correctly predicted that U.S. domestic oil production would peak in the 1970s.
"Shell isn't a believer in the peak oil theory," said company spokesperson Janet Annesley during a 2008 interview with IPS at the company's Calgary office tower.
Other multinational oil companies, however, are beginning to disagree with the current position of Shell, M. King Hubbert's former employer.
Gasoline and transportation fuel can be manufactured from coal and other non-conventional sources, meaning the world will not run out in any absolute sense, but the costs - both economic and environmental - will be far higher than for conventional crude.
"Groups and individuals speaking out about forthcoming world oil supply challenges are frequently stereotyped as a fringe element with little knowledge about the oil industry," said the Sweden-based Association for the Study of Peak Oil and Gas in a Nov. 24 news release.
"But their warnings are increasingly supported by some surprising allies:
senior petroleum industry officials, consultants and analysts."
Christophe de Margerie, CEO of Total SA, Europe's third largest oil company, believes the world will never be able to produce more than 89 million barrels per day.
ConocoPhillips' chief executive Jim Mulva told a conference in London last month that he doubted producers would be able to meet long-term oil demand.
Both oil executives challenged IEA predictions.
The senior IEA official who blew the whistle on the organisation's tendency to overstate supply says the group is manipulating data in order to placate financial markets.
"Many inside the organisation (IEA) believe that maintaining oil supplies at even 90 million to 95 million barrels a day would be impossible, but there are fears that panic could spread on the financial markets if the figures were brought down further," a senior IEA official told the Guardian.
According to the confidential RCMP documents, "[censored]... a market psychology of fear will continue to place a 'geopolitical premium' on crude oil, keeping prices for oil products higher than market fundamentals alone would dictate."
It is this fear that the IEA is trying to placate.
However, many believe a binding deal at Copenhagen seems like a more reasonable approach to reduce oil dependency than the current policy of fudging the numbers.
(END/2009)
MADAGASCAR:
Worrying Lapse in Forest Management
ANTANANARIVO, Dec 23 (IPS/IFEJ) - The illegal logging of precious wood rose sharply during the political crisis that gripped Madagascar during 2009.
Forest communities, who could be part of the preservation of these resources, have been swept up in the rush for rosewood.
Rosewood, or Dalbergia baronii, is a very valuable hardwood, in great demand worldwide.
Along with related species from Latin America and Asia, it has been over-exploited, and is a protected species in Madagascar.
Sipping a beer in a bar in the rural district of Manompana, 200 kilometres from the port city of Toamasina, through which much of Madagascar's wood exports pass, is a strapping 30-year-old who today goes by the name "Christian". When one of his associates introduces a potential client for precious wood, he welcomes her with a broad smile.
"I can get you any quantity of wood you want," he tells the woman.
"I'll discuss it with my loggers."
Christian doesn't personally go into the forest.
"I pay locals to cut down trees," he says.
"People shut their eyes to certain illegalities, especially when the wood leaves the country with proper papers, provided by the local authorities in exchange for a few large bills," he says.
"The local population has only the forest to rely on, and in exchange for the wood that they find for me, I give them the means to survive and provide for their families' needs."
Charles Rakotondrainibe, deputy director general of Madagascar's National Parks service, says these are difficult times for conservation.
"While illegal logging goes on every year, in 2009 it saw exceptional growth, which has been catastrophic for the forests," says Rakotondrainibe.
"Because of poverty - aggravated by the political crisis - and tempted by the easy money offered for illegal logging, communities could do little other than go with the flow and participate in the pillage of the forests to extract rosewood."
He says the political crisis of 2009, which early in the year saw Marc Ravalomanana deposed as president by a populist movement headed by Andry Rajoelina, reversed progress made since a national environmental policy was put in place in the 1990s.
The situation is especially bad in the national parks of Masoala, Marojejy and Makira, in the northeast of the country, particularly known for rosewood.
To counter this, environmental NGOs are trying to involve forest communities in conservation.
"The transfer of management (of forests) is aimed at getting the local population to take charge against the illegal exploitation of the forest, as well as to sensitise them to the effects of their daily activities on the environment," says Etienne Andriamampandry.
He is head of programmes at the Association Intercoopération de Madagascar, a non-profit organisation working on collective solutions to poverty alleviation. AIM's efforts cover issues of local government, access to basic services, local economies and food security - forest management cuts across many of these themes.
Charline Toto, a member of a like-minded association in Manompana called "Ensemble pour la gestion de la forêt" (Together for the Management of the Forest) has learned the lessons well.
Recognising the importance of forest resources in the daily lives of locals, she also appreciates how trees must be protected.
"Only wood found in the exploitable zones can be cut down and used for our daily needs, such as for the construction of houses and canoes, or to cook food," she explains.
"But in the conservation zones, the forest must be protected because it ensures the balance of the environment and gives shelter to animal and plant species that are only found in Madagascar."
But for every Charline Toto there are too many others willing to work for the young men with big wallets like Christian.
Rakotondrainibe says the problem is compounded by the frequent corruption of local officials who authorise the shipment of timber.
"The continued uncertainty in the running of the country has led people to believe there's no longer any law and that they can do what was forbidden," adds Rakotondrainibe.
"The publication of an interministerial order allowing the exceptional export of precious woods has only made the situation worse."
For Ndranto Razakamanarina, president of Madagascar's association of foresters, this order, issued in January by the government of former president Ravalomanana, is "completely illegal and completely contrary to the charter on the environment and the legislation on forests, which forbids the export of precious wood."
"It was meant to permit the release of (existing) stocks of rosewood which had been blocked from export for many years, but illegal logging syndicates have taken advantage by cutting down new timber," he told IPS in an interview.
"Worse, the validity of the order, which was to have expired at the end of March, was extended until the end of November by the transitional authorities."
According to a report published by the Missouri Botanical Garden, a U.S.-based research institution, more than 1,200 containers of rosewood left via the ports of Vohémar and Toamasina in the east of the island between January and October 2009.
The report says these containers would hold about 120,000 logs - equivalent to the cutting down of some 60,000 rosewood trees.
"But, as the illegal loggers don't follow regulation techniques, the extraction of each rosewood means the cutting down of half a dozen trees of other species in order to make rafts (to float them down rivers out of the deep forest)," explains Razakamanarina.
The depredations in protected forests in the northeast of the island have created an international outcry, but they have also awakened the national conscience in Madagascar.
"Faced with the rapid degradation of the environment in 2009, we have gathered NGOs and associations working for the defence of the environment into an alliance that wants to see a better environmental governance," indicated Razakamanarina.
(*This article is part of a series on sustainable development produced by IPS - Inter Press Service - and IFEJ, the International Federation of Environmental Journalists.)
SRI LANKA:
Five Years after Tsunami, Many Still without Shelter
KALMUNAI, Sri Lanka, Dec 23 (IPS) - "We have been here for almost five years. So many promises have been made, but very few have been kept," complains Mohideen Nafia, 22, one of the survivors of the 2004 Asian tsunami still living in a temporary facility in the coastal town of Kalmunai, located 300 kilometres east of the capital, Colombo.
Newly married Nafia would have preferred a house of her own with her husband.
But at the moment she has to make do with what amounts to a shelter: a one-room unit in a government-provided disaster camp, located about one kilometre from the beach, which the couple shares with Nafia's family of five.
Nafia hails from the Sainathimaruthu village in Kalmunai, a major domestic fishing hub that bore the brunt of what has been described as one of the deadliest natural disasters in recorded history.
Three of its villages facing the sea – Maradamunai, Sainathimaruthu and Karathivu – suffered the heaviest damage at the time of the tsunami.
When the Asian tsunami, triggered by a 9.3-magnitude earthquake, hit the coasts of countries bordering the Indian Ocean on Dec. 26, 2004, hundreds of thousands of people across Asia were swept out to sea.
According to the International Federation of Red Cross and Red Crescent Societies, some 226,000 people in 13 countries were killed in the aftermath of the tsunami.
One of the hardest hit was Sri Lanka, along with India, Indonesia, and Thailand.
In the South Asian island state more than 35,000 people died, over one million were displaced, and some 100,000 houses were either damaged or destroyed by the tsunami.
At least one-third of the deaths, or some 10,000, were reported from the Ampara district that comprises Kalmunai, according to official government data.
In the same district, approximately 27,000 houses were destroyed by the tsunami, the bulk of them in Kalmunai.
Villagers estimate that some 8,500 lives were lost on the densely packed beach at the height of the disaster.
Overall, the unprecedented disaster left a reconstruction bill of 330 billion rupees (3.2 billion U.S. dollars). The reconstruction effort was spearheaded by a government agency set up soon after the tsunami and which received the support of dozens of United Nations and other international agencies.
Sri Lanka's Reconstruction and Development Agency has since wound down as has the massive reconstruction effort.
Still, many are without homes they could call their own.
"Getting land for the new houses has been a big problem; we have to first locate the land.
If it is privately owned, (we) buy it," says Ismail Thawfiek, the additional government agent for Sainathimaruthu village in Kalmunai, where Nafia hails from.
Most of the available land consists of paddy fields, which need to be filled in before building can start, putting more pressure on otherwise limited public funds, he says.
"The biggest delay (in rebuilding the affected houses) has been in finding land and preparing it so that we can build the houses," Thawfiek says.
The lack of land has been exacerbated by the government's imposition of the no-build buffer zone along the Kalmunai coast.
The then Sri Lankan government initially imposed a limit of 200 metres from the sea soon after the tragedy.
Owing to pressure from the homeless survivors, it was later reduced to 65 metres at Kalmunai and 100 metres elsewhere in the tsunami-affected parts of the country, according to government officials.
Just three days before the fifth anniversary of the 2004 Asian tsunami, some 1,300 families, including Nafia's, are still waiting for their houses to be built.
"Even after five years since the tsunami, there are still problems, there are still issues," admits Thawfiek.
Nafia's grief is understandable.
The sense of despair gripping her is matched only by her deplorable living conditions. Tin roofs are rusting, dirty water stagnates near the front door step and large pools of rainwater and garbage rot behind the tents. Chickens raised by families roam the compound, where small children play marbles.
"Look at this," Nafia says, pointing to her squalid surroundings. "It is like living in hell.
When it rains, it is all water; if it does not, it is all flies," she says, waving her hands to chase away the flies.
She adds that none of the international relief agencies that poured aid into tsunami-hit areas like Kalmunai helped her build a house, while others are still waiting for government promises to be fulfilled, notably the reconstruction of their tsunami-destroyed homes. "The life we knew before the tsunami is like a dream. I don't know why this happened to us."
"We will give them houses very soon next year," Thawfiek assures, arguing that the construction of new houses proceeds according to plan once land has been located.
At least 5,000 houses damaged by the tsunami in Kalmunai have either been reconstructed or repaired.
To date, there are at least 13 disaster camps – with at least 1,000 shelters out of an original 18,000 in the Ampara district – still spread through the coastal town, while hundreds more who were displaced by the tsunami are still living with relatives.
Quite apart from Nafia's complaint, the Kalmunai beach appears to have returned to what it was before the deadly tsunami waves left a path of destruction.
It is now a hive of activity – fishermen tend to their nets on the beach while others attend to the large multi-day trawlers anchored just offshore.
"We have returned to what (our lives were) before the waves struck, maybe even better," says Mohideen Ajimal, one of the first fish wholesalers to return to the beach after the tsunami.
Ajimal lost an infant son and a daughter to the disaster.
Pointing to the large boat repair yard that has been erected near his business premises, he says that it would never have been built had it not been for the reconstruction effort after the tsunami.
"We lost so much, but life has to go on, and it is better if life goes on better than before," he tells IPS.
Next to the new fishermen's society building is a tall red tower with loud-hailers pointing in all directions to warn the residents of any tsunami threat.
"That helps too," says Ajimal as his eyes dart toward the tower.
Among the houses that have been rebuilt since the 2004 tsunami disaster are swanky new structures, painted in bright colours that stand out amid the dull sun-baked cement facades of others. They have been rebuilt by owners who could afford to finance them.
New schools have also been constructed, replacing the damaged ones.
Yet, there are still remnants of the huge Asian tsunami waves' deadly foray inland in this predominantly Muslim town.
In place of wall-to-wall houses that used to stand next to the beach before the tsunami struck are large, empty sandy patches. Wooden poles sticking out of mounds mark off the spots where thousands were buried.
On the side of the road that runs alongside the beach are the occasional houses or fishing huts that have been deserted by owners after the tsunami.
They are bereft of roofs and window frames, having been washed away, decayed or carted away by thieves. Here goats seek shelter when the sun is too hot.
"We had a good house near the sea, but I lost two children and I don't want to go back," says Abdul Mannas, who has since moved to a new housing site about two km from the sea.
But at least the 35-year-old father of three is happy.
He now lives in a new housing complex just outside Kalmunai town.
"This house is smaller (than I had expected), but we are happier," he says. "We can build two-storey houses or extensions if we want to."
The houses at the French Friendship Village, where he lives, were built with the support of the French government.
Mannas says that he and others gladly vacated the protective zone.
"It is a death zone on the coast," he says. "I don't want to live there."
But for those living in small tin-roofed sheds like Nafia, where three or so families share the dimly lit units in the camp near the Jumma Mosque, the nightmare never ends, not since the tsunami struck the Indian Ocean.
"We have waited long enough; five years is a long time," she rues.
INDIA:
Despite Failed Climate Talks, More Green Awareness
NEW DELHI, Dec 21 (IPS) - The world supped on an alphabet soup of acronyms over the nearly two weeks of climate change talks that just ended – UNFCCC, COP-15, IPCC, CDM, LDCF, MEF, CCS. But did any of these filter down to reach the average citizen?
Does it bother people if their countries' "carbon sequestration" efforts are laudable, or if the "anthropogenic climate impacts" in their cities are being mitigated?
Would they come to grief if their carbon footprint outpaced that of others in the United States?
And if it did, well, would they activate "carbon offsets" to minimise the damage to the planet?
Indeed, experts say, a lay person's understanding of climate change issues becomes vital against the backdrop of the just-finished Copenhagen summit – bruising negotiations that ended with an agreement to keep global temperature rises to no more than two degrees, have developed countries cut greenhouse gases and developing countries take steps to limit theirs.
This, however, came without the comprehensive, legally binding international deal that was the Dec. 7-18 talks' original aim, as developed countries, including the United States, avoided being bound by specific deeper cuts in greenhouse gas emissions, and developing countries like China, India, Brazil and African states stayed away from firm commitments and a regime of international monitoring.
Activists called the deal no more than a political statement.
But as headlines spread around the world about the failure of the climate change summit in Copenhagen to reach a binding agreement, Mumbai-based environmental scientist Prabhu Goenka says that unlike in the past, the average person's sense of awareness about environment issues has heightened considerably.
"Earlier, people would happily leave such concerns to governments and policy experts. Public engagement was minimal.
But over the past decade, media's exponential growth and the urgency of the situation have led to a mass awakening," he explained.
It is no surprise then that a sundry cast of rag pickers, farmers, technicians and students was part of the 1,500-member Indian delegation in Copenhagen.
An Indian rag picker -- part of a sizeable 15-million strong community in South Asia -- even grilled a United Nations official on clean technology. A farmer from India's desert state of Rajasthan made a presentation on why rich countries should fund agricultural research in developing economies. He was representing India's 600 million-strong farming community, which is perhaps the most vulnerable to the impacts of climate change.
Fifty-six year-old Katori Devi is a farmer in Haryana state that borders Delhi.
Her family has tilled the land for seven generations. Climate change and global warming have come to pass, she believes, because her forefathers never bothered with things like organic pesticides and other eco-friendly farming methods.
"But now," Devi says, "my sons and I ensure that we follow farming practices which not only increase yield but also cause minimal damage to our animals and the fields."
Such awareness augurs well for an agri- and rain-dependent economy like India's, 70 percent of whose 1.2 billion people subsist on farming.
Agriculture contributes 17 percent to the nation's Gross Domestic Product and powers its economy.
But over 60 percent of Indian farmland is rain-dependent, and only about 40 percent of arable land is irrigated, leaving farmers exposed to the vagaries of the monsoon.
Due to drought and erratic rains, there has been a raft of suicides by debt-ridden farmers in the past few years.
Environmental concern among the Indian youth – who make up a sizeable two-thirds of the country's 1.2 billion population – is hard to miss.
Divyang Saxena, 17, a Class XII student at Delhi Public School, NOIDA, in India's northern state of Uttar Pradesh, feels that the next generation has no choice but to be environmentally responsible. "I have to care about climate change because I don't want temperatures to keep rising every year," stressed Saxena.
"So we recycle water at home and avoid turning the heater on by wearing thicker sweaters. We also buy organic produce because the farming processes are better for the environment."
M. S. Kohli, founder member and chairman of the non-government Himalayan Environment Trust, adds that greater environmental awareness reflects people's desperation.
"Earlier, when we tried telling people that global warming must stop as the glaciers were melting, nobody gave a damn," Kohli told IPS. "But now, when swelling rivers are flooding the plains, displacing millions and causing losses to property, people are keen to know how the damage can be reversed."
Private citizens are also involved in addressing climate damage.
Barefoot College, a Rajasthan-based NGO that won the Sierra Club Green Energy Award for 2009, is training semi-illiterate women as solar engineers. Though the work is traditionally perceived as a man's job in Rajasthan, women are creating and installing solar cookers on their own, and this has led to tremendous energy savings.
Sarita Bhan, 35, one of the solar engineers at Barefoot College says, "The time has come for us to look beyond gender stereotypes so that everybody can contribute to saving the environment.
Before I started doing this work, I had no clue about things like clean energy and power saving.
But now, I'm spreading the message by educating other people too."
The Indian government is also channeling money towards job training for green professions and clean-energy legislation.
To introduce a subtle shift in the job market – and channel interest in a ‘green economy' -- The Climate Project - India, an international non-profit organisation, and the Sierra Club have been organising "green job fairs" across India.
"Gradually, people are realising that fighting climate change creates new economic opportunities," says an official at the Sierra Club in New Delhi.
Green buildings use fewer resources and save more energy.
"The global climate scourge is creating a welter of new jobs in clean-energy industries, weatherisation and other areas," he added.
"People are conscious now that if they continue to abuse the environment, it is their bottom line that will get pinched," says a Delhi-based insurance agent.
As sea levels rise, he adds, flash floods and windstorms attributable to climate change have got the insurance industry on edge.
Insurance premiums are going up.
Recently, too, The Energy and Resources Institute (TERI) launched a specialised library on climate change here in New Delhi, the first ever such initiative. TERI Director General and Nobel laureate R. K. Pachauri, also chairman of the Intergovernmental Panel on Climate Change (IPCC), told IPS that this aims to create "a resource from where people can gain knowledge about climate change."
India's Ministry of New and Renewable Energy is preparing a ‘solar mission' that sets ambitious targets for grid and off-grid generation of solar power, alongside efforts to maximise hydropower, biomass and wind power.
"It is time to innovate and operate under a new paradigm with an emphasis on conservation and energy efficiency while embracing environment-friendly energy resources," says Kohli.
"The public needs to be in tune with such efforts. That's the only way to survive in the future."
New Pirate of the Caribbean Invades from Pacific
CARACAS, Dec 2 (IPS) - The red lionfish (Pterois volitans), a venomous coral reef fish from the Indian and western Pacific Oceans, has invaded the waters of the Caribbean and the Gulf of Mexico, threatening to wreak havoc on ecosystems, native fish populations and popular underwater diving areas.
The invasive, exotic-looking fish apparently reached the Caribbean in 1992 when Hurricane Andrew destroyed an aquarium in a restaurant in Florida and six of the ferocious predators got out.
"The lionfish is a beautiful, colourful fish that reaches 45 cm in length, with bold reddish or maroon and white stripes and long, showy pectoral and dorsal fins with dangerous venomous spikes," biologist Juan Posada, head of the organism biology department at the Simon Bolivar University in Caracas, told IPS.
The lionfish was found in the waters around the Bahamas in 2004, near Cuba and the Turks and Caicos Islands in 2007, in Haiti, the Dominican Republic, Puerto Rico, Belize and the Colombian island of San Andrés in 2008, and off the coasts of Mexico, Honduras, Costa Rica and Panama this year. It most recently turned up in the waters off the Dutch islands of Aruba and Bonaire in September and October, near Venezuela's northwest coast.
"It may also have reached this area in the ballast waters of a merchant ship, but that is less likely.
The lionfish populations probably descended from the specimens that escaped from the aquarium, and have flourished thanks to their specific characteristics and the fact that in the Atlantic Ocean they don't have natural predators, like the big fish in the Pacific," said Posada.
The lionfish "is extremely adaptable, and without predators it easily flourishes over other species," Oscar Lasso, an ichthyologist at the La Salle Foundation of Natural Sciences in Venezuela, told IPS. "First of all, it is cryptic, and it preferentially inhabits coral reefs, where it waits for other animals that do not even recognise it as a fish, opens its mouth and eats everything that goes by."
But it can live as deep as 175 metres below the surface, and even young lionfish are protected from other predators by their long, poisonous spines. And as a voracious predator - it eats smaller fish, crabs, shrimp and even young lobster - it poses a serious threat to local fisheries.
"A lionfish's stomach can expand up to 30 times its normal size after it has eaten," said Lasso. "In the stomach of one fish caught in the Bahamas, 17 young 3-cm long juvenile snappers were found.
While an adult can grow to 38 cm in the Pacific, 45 cm specimens have been found in the Caribbean, which gives you an idea of their success as an invasive species."
They are also prolific breeders, said the expert. A single female can lay up to one million eggs in her lifetime, and because of the warm climate in the Caribbean, they can spawn year-round, giving them a better chance at survival.
The fertilised eggs and newly hatched larvae then drift in the ocean currents, which carry them farther and farther away, fuelling the lionfish population explosion and expansion.
"Based on experience, it is practically impossible to eradicate a successful introduced species; they are here to stay, and we have to learn to live with the problem and try to manage it," said Posada.
Another challenge in fighting the spread of the invasive lionfish population is the venom - mainly consisting of a protein toxin - located in glands at the base of each dorsal spine.
Although the venom is rarely fatal to humans, the sting is extremely painful, causing vomiting, dizziness, breathing difficulties, headaches, "and sometimes arrhythmias (irregular heartbeats), and the effects can last for hours, or even days," said Posada.
Treatment for a lionfish sting includes immediate immersion in non-scalding hot water for 30 minutes, because the proteins in the venom are broken down by heat, which prevents them from spreading in the bloodstream.
In addition, any broken spines should be removed from the wound, and the victim can be given a painkiller, said the expert.
"The same principle of denaturing the venom operates in the case of someone who catches a fish and wants to eat it, because lionfish are edible, the meat is tasty," said Posada.
Complaining about the lionfish's destruction of native Atlantic and Caribbean fish species, he joked that "it even occurred to us that one way to get people to capture and eat more lionfish, to reduce the population, would be to spread the rumour that its meat is a powerful aphrodisiac."
Lasso pointed to the destruction caused by exotic species that are transported in the ballast tanks that allow ships to remain stable despite changes in the amount of cargo that they are carrying.
Once the ships unload their cargo, they fill up their tanks with water from the destination point - water that contains billions of organisms, including eggs and larvae from a broad range of sea species. When cargo is loaded, the ballast water is discharged, sometimes halfway across the globe.
That is how, for example, the muzzled blenny (Omobranchus punctatus) was apparently brought to Trinidad in the 19th century on ships from India.
The fish has been found in the Gulf of Paria, which separates that Caribbean island from Venezuela, and it competes for habitat with native species - although not with the voracity of the lionfish - from Mexico to Brazil.
Ichthyologists - zoologists who study fish - are trying to raise awareness among fisherpeople, divers, authorities, sailors, coastal communities and the public in general on the risks of introducing exotic species.
The tilapia (Oreochromis mossambicus), native to Africa, is a well-known problematic invasive species, in Venezuela and many other countries. It is on the International Union for Conservation of Nature's (IUCN) list of the 100 World's Worst Alien Invasive Species.
Just 12 years after some 800 tilapia were introduced in the Laguna de los Patos, a small lake in northeastern Venezuela, in 1964, only 10 of the 23 native species of fish were left.
The researchers who spoke to IPS underscored the serious harm to native ecosystems that can be caused by releasing non-native pet fish or farmed fish - like tilapia - into the wild, as demonstrated by the lionfish, which according to some experts has the potential to become one of the most catastrophic marine invasions in history. (END/2009)
Peak oil and the psychology of work
This is a preliminary attempt to explore the relationship between the current predicament facing humanity, arising out of an exploding population confronting planetary resource limitations - otherwise known as overshoot - and the psychology of work inherent in the human species. One reason to explore this connection is that the question of overshoot is normally framed in standard Darwinian terms. In the Darwinian framework overshoot begins with the availability of abundant resources that allows the population of a species to increase exponentially.
This exploding population eventually depletes irreversibly the very resources that sustain the population and this leads to a large scale die-off and a precipitous fall in the species population sometimes leading to extinction.
In this rise and fall, the behavior of the individuals of the species is often typical of any organism seeking to maximize its chances of survival and procreation.
However the human species, aided by a generalized intelligence, is perhaps unique in its ability to extensively craft its environment in order to garner a much larger portion of the ecological resource base to sustain itself. In the evolution of humans, there have been two signal revolutions that brought about a very large increase in humanity's ecological valence leading to profound changes in the human mode of existence and its environment.
The first was the agricultural revolution, now understood as having begun some 10,000-12,000 years ago. This allowed the hunter-gatherer humans to transition to a settled agrarian lifestyle, eventually paving the way for the rise of urban civilizations. The second revolution was the industrial revolution, a mere 200-300 years old, but which allowed humans to rapidly dominate the planet as perhaps no other species had managed to before.
There can be no doubt that the availability of ecological resources played a defining role in these transitions – in the case of the agricultural revolution the key resource was fertile top soil of river valley ecosystems. The nutrient laden silt deposited in the flood plains of riverine systems such as the Nile, the Euphrates and the Indus ensured the initial success and widespread replication of settled agriculture.
Similarly it was the availability of concentrated forms of different resources chiefly energy but also ores of various metals that were the principal enablers of the industrial revolution.
While the role of ecological resources in these signal revolutions is fairly well understood, the role of human mental faculties in their myriad manifestations is either unclear or the subject of severe controversies. But there can also be little doubt that human mental faculties – through innate predisposition and learnt skills and behavioral responses – must have played a fundamental role in these changes as well.
My interest lies in understanding how our mental faculties contributed to these fundamental transformations, with the hope that this understanding will enable us as individuals and collectives to be better prepared for the inevitable turmoil that results from the decline in the availability of concentrated energy resources. In particular in this essay I want to explore how the human mind views and deals with the concept of work – both as an idea in the mind and as a felt necessity of human existence.
In physics work is the same as energy.
In fact energy is defined as the ability to do work and therefore they are measured in the exact same units. In the biological world, all organisms have to do work in order to change and exploit their environment for their benefit.
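To make the standard physics definitions being invoked here explicit (textbook relations, not claims original to this essay), the work done by a constant force acting along its direction of motion is:

```latex
W = F \, d, \qquad
[\,W\,] = [\,E\,] = \text{joule} = \text{newton} \times \text{metre}
```

Energy is the capacity to do work, so the two quantities share the joule as their unit; this shared ledger is what allows human labour and fossil fuels to be compared on energetic terms later in the essay.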
But it is not uncommon in the animal kingdom to have sharply differentiated work burdens across different members of a species, e.g. the work differential between the worker ants vs the drones, or the lioness vs the lion.
However, what work means to the human mind is something quite different from both the physical concept, and the forms observed in other animal species. The intrinsic tendencies towards work in humans (like most other mental faculties) have always influenced and defined their cultural and political systems and thus contributed to the rise and fall of civilizations. It is not difficult to see that both the agricultural and industrial modes of human existence principally involve the organization and concentration of matter using energy to overcome the inevitable tendency towards disorganization and diffusion (in other words overcome the second law of thermodynamics). The main difference lies in the fact that in the agricultural mode human work is an integral part of the energy flow whereas in the industrial mode human energy is replaced to a large extent by energy obtained from burning fossil fuels.
It is normally acknowledged in peak-oil circles (at least amongst those who do see the decline in fossil fuels as leading to a decline in industrial civilization) that the aftermath of peak-oil would witness the come-back of human labour as a prominent source of energy for economic activities. And this may very well happen for the simple reason that individuals would have no other choice.
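To get a feel for the scale of that substitution, here is a rough back-of-the-envelope calculation; the round figures used for gasoline's energy content and a labourer's sustained mechanical power output are common textbook approximations, not numbers taken from this essay:

```python
# Illustrative comparison (assumed round figures): how many hours of
# sustained human labour are energetically equivalent to one litre of gasoline.
GASOLINE_MJ_PER_LITRE = 34.2   # approximate energy content of gasoline
HUMAN_POWER_WATTS = 75         # rough sustained mechanical output of a labourer

def human_hours_per_litre(litres: float = 1.0) -> float:
    """Hours of sustained human labour matching the energy in `litres` of gasoline."""
    joules = litres * GASOLINE_MJ_PER_LITRE * 1e6  # megajoules -> joules
    return joules / HUMAN_POWER_WATTS / 3600       # watts * seconds -> hours

print(f"{human_hours_per_litre():.0f} hours")  # roughly 127 hours
```

On these assumptions, a single litre of gasoline stands in for about three weeks of full-time manual labour, which is the kind of ratio that underlies the expectation of labour's "come-back" once concentrated fossil energy declines.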
But it is worth looking at the psychological context in which this might happen if for no other reason but that our sanity may depend on doing so. And history is a good place to begin doing that.
It appears to me that throughout history humans have always distinguished between physical and mental work.
It is a felt experience for most of us that we would rather be doing mental work as opposed to physical work.
One could argue that most of us would rather do no work at all if our sustenance and comforts are somehow guaranteed.
While that may be the case at the psychological level, at an empirical level it appears to me that a farmer would rather take up the job of a bank teller, given the same remuneration, than continue with farming.
Irrespective of why this might be the case, this phenomenon implies that it ought to be easier to find humans willing to do work involving less physical labour compared to more.
And yet, most human societies historically have privileged mental work over physical work.
Almost universally, work involving a greater mental component has led to greater surplus accumulation and a more comfortable life.
To me this is a conundrum and has serious implications for the coming post-peak world.
A clear indication of this preference can be seen in the themes found in the world's folk literature.
No matter which corner of the world one looks at, one is likely to find many folk tales that begin with a clever and intelligent weaver or woodcutter who uses his mental prowess to end up as the prince or the prime minister of his country.
On the other hand the chances of finding a tale in which the king ends up living happily as a labouring peasant are almost nil.
This relative popularity of mental work compared to physical work has been a tremendous force – a kind of psychological energy – that has fueled our transition from a hunter-gatherer to agrarian and then to industrial modes of existence.
A significant example of how the relative popularity of mental work compared to physical work has defined the very fabric of most societies is to look at India.
In India the principal form of social stratification, namely the caste system, appears to be based on the crucial distinction between mental and physical work.
For those unaware of its main elements, the caste system (or varnashram, as it was referred to in Sanskrit) divided humans into four varnas (categories), determined by their profession or the kind of work they did.
This division was hierarchical and defined (for as long as it was possible to move from one varna to another) a direction for human aspiration.
Thus at the top were the Brahmanas (the Brahmins) whose work was predominantly intellectual in nature, as teachers, priests, philosophers, etc.
In the next category were the Kshatriyas who had jobs in administration and governance.
At a lower level were the Vaishyas who were involved in business and trade.
At the lowest level were the Shudras, who consisted of artisans, farmers and members of other professions all involving a significant amount of manual labour.
It should be of interest that each of these varnas was further divided into several sub-castes, also organized in an internal hierarchy.
The relative position of the sub-caste within the varna had much to do with the manual labour component of the work that its members did.
So for instance, the priests involved in conducting the rituals in a temple had higher status than those who were tasked with keeping the temple premises in pristine condition.
Throughout the pre-industrial period various ecological and cultural limitations kept a lid on the natural human aspiration of moving away from physical labour and towards mental labour, and this contributed to maintaining societal homeostasis. It is well understood that in India the ossification of the caste system into a rigid and oppressive form determined by birth served to severely curb the aspirations of ordinary people for millennia, but that it also provided stability and continuity to the political economy of the country even in the face of various invasions and political upheaval.
Across the world, the fall of empires and civilizations resulted mostly from political overreach (as in Rome) or straightforward ecological overshoot (as in Easter Island) or some combination of these reasons. The relative role of physical and mental labour might have had only a marginal influence on the decline phase of pre-industrial civilizations.
Yet the industrial civilization has seen the most drastic change in the composition of people doing and willing to do physical work vis-a-vis mental work.
The proportion of America's population working in agriculture has declined from around 50% near the beginning of the 20th century to less than 5% towards its end, no doubt aided by the explosion of less labour-intensive employment in the secondary and tertiary sectors of the economy.
But in addition and most importantly, it has opened up newer aspirational possibilities to ordinary humans that one could not even dream of in the pre-industrial age.
A recent survey indicates that 40% of India's farmers are willing to quit farming since they find it unprofitable.
However in my own experience the number is closer to 100% when real alternatives are available, and economics is only part of the reason.
Aspirational changes brought about by education and mass media are at least as crucial a component as the economic crisis afflicting agriculture. A subtle version of this same phenomenon is the shift, amongst those who continue to be in agriculture, from food crops to cash crops. Even though cash crops are plagued by highly uncertain and volatile price swings, they are preferred because they involve less manual labour.
A deindustrialising society will therefore need to not only deal with the scarcity of material resources but also work against the prevailing cognitive current of privileging non-manual labour on a scale unprecedented in human history.
The problematic part is that this is not merely a political arrangement, but a manifestation of the individual's preference and is central to the aspirations of millions of humans today.
What this implies is that the breakdown of industrial civilization will also witness an unprecedented cognitive breakdown.
A variety of questions can be asked about how this will play out and what adaptive mechanisms we have at our disposal at both the individual and the collective levels. I hope to explore these and other issues concerning the relationship between our material and cognitive predicaments in future essays, and I hope that this will help the TOD readership to address these questions with much greater intensity than it has done so far.
WHAT LOWER CONSUMPTION MEANS
Most of the kids have a good laugh with the before/after comparison chart, and I laugh along with them.
The contrasts between the present and (likely) future presented in the chart are striking to the point of unbelievability to them, and their reactions are honest and humorous:
"So, Dr. Allen, where can I buy this mule I'll need?"
But I also laugh with some sadness and a touch of fear; sadness that prudent suggestions to prepare for a difficult future are still regarded as a joke; and fear for a possibly much darker future I don't think they yet comprehend -- a fear that we might not be able to pull this off.
Note that this is directed at high school kids as part of my ongoing series of "important side notes" to the regular Chemistry curriculum.
Even though topics such as EROEI and the "net energy curve" are very relevant to this discussion, I have not introduced them yet in this essay for the sake of simplicity.
For these topics, I highly recommend many related posts on www.theoildrum.com by Ugo Bardi, Charles Hall, and David Murphy, as well as the references contained therein.
Executive Summary:
The fevered frenzy of Industrial Civilization's resource consumption appears to have finally reached its apex and begun its decline in this, the first decade of the twenty-first century. A closer look at the physical realities of resource extraction reveals that the resource situation is, in fact, terminal for our high-consumin' civilization.
Resource depletion is a predicament requiring adaptation to an entirely new low-consumption paradigm, rather than a problem to be solved with technological or social solutions. As a country, we need to start the conversation about what a lower-consumption, resource-poor society would look like, and begin the appropriate preparations.
The Insatiable Hunger of Industrial Civilization
Over the past 150 years, the relentless combination of exponentially-increasing population and exponentially-increasing per-capita (i.e. per-person) consumption has significantly depleted a wide range of resources necessary for the continuation of our modern Industrial Civilization.
These include both non-renewable resources (ex: fossil fuels, metal ores, phosphate fertilizers, etc.) and theoretically-renewable resources that are being abused to such an extent that they are becoming essentially non-renewable on useful timescales (ex: fisheries, topsoil, freshwater, etc.).
Pick any of these key resources and the annual extraction rate data will likely show an exponential increase from the mid-1800's to the present.
Ask scientists about the resource and they will tell you the bad news:
the annual extraction rate curve is near, at, or past the point of collapse.
Ask conventional economists or politicians and they will tell you the good news:
"Everything's going to be OK; the market will take care of it; it always has."
So who do we believe?
Taking a quick look past the rhetoric, the situation becomes clear -- alarmingly so for those who wish the industrial party to continue, as well as for those who fear we are not properly prepared for what follows.
The Easy Stuff's Gone
As modern Industrial Civilization built momentum, the easiest resources, the "lowest hanging fruit," were logically picked first:
the high-purity coal, metal ores, and phosphate-bearing minerals at or near the surface; the light, sweet crude oil and gas that burst at great pressure from shallow wells; the huge, dense schools of protein-rich fish that practically jumped into the boats; the deep, rich topsoils that required minimal inputs to produce bountiful crop yields.
While the ease of extraction and high quality of these resources gave us great confidence as a civilization, ever-increasing consumption rates actually became ingrained as a necessity for the continuation of our industrial economies. As this consumptive frenzy gained momentum, however, these once-easy resources became "high-graded," meaning that as the easiest stuff was skimmed off every year, the resources that remained were of increasingly lower quality.
What remains now, of course, at our currently-advanced stage of depletion, are resources that are much more expensive, of much lower quality, and much more difficult to extract.
These are the low-purity metal ores thousands of feet underground; heavy crude oil and gas laced with toxins that must be coaxed with great effort from beneath thousands of feet of ocean, rock, and salt; sparse schools of lower-quality fish requiring monstrous nets and huge ships for their economical extraction; and the nutrient-depleted, thinned-out topsoil requiring significant inputs to obtain reasonable yields.
The Difficult Stuff's Too Difficult
Let's assume to a very rough (but not entirely unreasonable) approximation that half of all theoretically-extractable resources have been depleted as we begin the 21st century – fossil fuels, metal ores, phosphate fertilizer, fisheries, etc.
The industrial consumers say, "Wow, that still leaves half remaining to be extracted.
We still have another 150 years of fun.
Party on!"
There are, however, two key problems that will undermine their (understandable) exuberance.
First, due to much-increased population and per-capita consumption rates, we are burning through these resources at a significantly faster rate than at the start of the first 150 years. Even if the second half of the resources were easily obtained, they would likely be gone in a matter of a few decades. Secondly, the first half of the resources was the cheap, easy half. What remains is so increasingly difficult to access that it would require actual extra-terrestrial energy inputs for their complete extraction -- i.e. it's not gonna happen.
Not even close.
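The few-decades claim is simple exponential arithmetic. Here is a sketch with assumed numbers (the 2%/yr growth rate and the 150-year first half are illustrative, not figures from the essay):

```python
import math

r = 0.02          # assumed annual growth rate of extraction
T_first = 150     # assumed years it took to consume the first half

# If extraction keeps growing exponentially, cumulative use doubles every
# ln(2)/r years -- so the second half lasts about one doubling time:
print(f"with continued growth: ~{math.log(2) / r:.0f} years")

# Even frozen at today's rate R, the first half equals
# R * (1 - e^(-r*T_first)) / r years of current consumption, so the
# second half lasts that many years at a constant rate:
years_at_constant_rate = (1 - math.exp(-r * T_first)) / r
print(f"at today's constant rate: ~{years_at_constant_rate:.0f} years")
```

Under either assumption the answer lands in the "few decades" range, and that is before accounting for the rising difficulty of extraction.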
Here's the dark irony of our resource predicament:
The low-quality, difficult half of the resources that remain require an infrastructure for their extraction that can only exist in the presence of the high-quality, easy half of the resources -- the ones that no longer exist.
Please read that again.
In other words, a relatively large percentage of the low-quality, difficult resources that remain will likely never be extracted.
The age of cheap, easy, high-quality resources to power the current version of Industrial Civilization is over, and the age of expensive, difficult, low-quality resources to power a future version of Industrial Civilization will simply never occur.
Our beloved Industrial Civilization, this pinnacle of human ingenuity, this shining beacon of light in an otherwise backward Universe, (this destructive monster killing the biosphere) is just about out of fuel.
It's time to get out and start walking.
Lower Consumption Is the New Higher Consumption
So what does all this "bad" news mean for our everyday lives?
The short answer is that we can expect a rather drastic involuntary reduction in resource use in the not-too-distant future, gradually worsening, and extending into the distant future.
This coming resource supply-reduction may well proceed in a stair-step fashion -- unexpected drop, period of stability, unexpected drop, period of stability…etc, giving repeated temporary illusions of "the bottom."
The steady erosion of the resource pipeline will not only utterly cripple our growth-requiring Industrial economy, it will send ripple effects through every facet of our formerly-industrial lives, changing them almost beyond belief.
We will not only have less and less of the "primary" extractable resources available to us every year -- less oil, less coal, less natural gas, less phosphate fertilizer, fewer metals, etc.; but we will also have less and less of the "secondary" resources that the primary resources make possible:
less electricity, less nitrogen fertilizer, less water treatment, less transportation, fewer computers, less electronic communication, etc.
Again, it's important to state here that not only will this decline be involuntary, it will not be preventable by any combination of political, social, or technological solutions. It will simply occur, and we must simply respond to it.
How we respond, of course, will make a great deal of difference as to whether our predicament becomes disastrous or just very difficult.
Moral guidance will be greatly needed throughout.
The varied fields of Ecology, Biophysical Economics, Permaculture, and Natural Systems Agriculture (among others) have much to teach us about adapting to our changing resource situation, and we certainly should listen to them. (Note to Obama:
Please contact the Post Carbon Institute.
Invite Wendell Berry over for a beer. Heck, Derrick Jensen too.)
Also realize that there are many important facets of our lives which need not decline in the upcoming future – indeed, they may even increase:
personal connections with our families, communities, and the natural world; block parties and potlucks; tag football and pickup basketball; joking around and shooting the breeze; love in our hearts, etc.
In other words, it's quite possible we just may find a lot more important and fulfilling things than we're losing.
Much is still up to us.
What Lower Consumption Means
The following chart is meant to give a brief flavor of our coming lower-resource future. A quick read down the left column gives a pretty good overview of our current Industrial society, in all its fast-paced, consumptive glory.
I've been told by my students that the right column reads as suspiciously Amish-like.
That's really not an accident -- the Amish generally lead much less consumptive lives. Whatever you happen to think of their social structures, the physical lifestyles of the Amish will probably gradually become the lifestyles of a majority of the population.
Another accusation I get is that I'm predicting the 21st century will increasingly resemble the 18th century. I respond with this:
if that's what the Laws of Thermodynamics and the finite material limits of the Earth dictate, I don't see how we have a choice.
The Coal Question and Climate Change
I appreciate this opportunity to contribute a post to The Oil Drum.
This site is the most important forum for the discussion of oil production because of the vigor and depth of the debate. I would like to offer some calculations for coal production and climate change. I hope that you find coal as interesting as oil.
Coal is the most important fossil fuel for generating electricity, and it is a major source of atmospheric CO2.
Oil reserves are rightly viewed skeptically at The Oil Drum, in large part because of fraud by the OPEC countries. Coal reserves are compiled by the national geological surveys, and unlike oil reserves, they are honest.
However, Dr. Werner Zittel and Jorg Schindler and their Energy Watch Group have recently written an important paper, "Coal: Resources and Future Production," which shows that there are major problems with the reliability of coal reserves and indicates that the reserves may be too high.
Coal is different from oil, and much of the intuition that we may have developed about oil from nights pondering TOD posts is wrong for coal.
Finding oil is hard, and we have not found it all yet.
In contrast, people knew where the coal was a century ago. Once oil is found, it is likely to be produced quickly, so much so that discovery history is routinely used to predict future production.
On the other hand, there are large coal fields that are almost undeveloped.
As an example, Montana has larger coal reserves than Europe, Africa, or South America, but it is producing less than 0.1% of that coal each year. Our estimate of future coal production depends a lot on whether we think that the people of Montana will get into serious coal production.
Finally, in contrast to the situation for oil, the world market for coal is only partially developed.
Most coal is consumed in the country it is produced in, and there are large differences in prices, even in the same country.
For this reason, we will analyze production on a regional basis. I will apply the techniques to coal that are routinely used here for oil, and consider the consequences for future climate change.
People who are interested in more details can get the spreadsheets with the raw data at my web site, with lots of additional figures and source links.
The authoritative source of information on climate change is the UN Intergovernmental Panel on Climate Change (IPCC), which is releasing its 4th Assessment Report this year. This is a mammoth undertaking, with more than 1,000 authors and more than 1,000 reviewers. The fossil-fuel contribution to climate change is considered in terms of 40 scenarios, each considered to be equally valid.
In the assessment modeling, the factors for future fossil-fuel production are primarily population, policy, and GDP, and limitations in fossil-fuel supplies are not considered critically.
Parts of the scenarios would strike most readers at The Oil Drum as preposterous. For example, in 17 of the scenarios, world oil production is higher in 2100 than it was in 2000.
Even OPEC oil ministers do not make that claim.
Thinking about climate change also requires adjusting to the long time scales. At The Oil Drum, there is much discussion of whether the Ghawar field will decline next year. However, from the point of view of a temperature peak in the next century, it matters little whether we burn a ton of coal now or 50 years from now.
This means that a policy that results in a ton of coal being consumed next year instead of this year does little good.
Because of the long time horizon, we will use cumulative plots, which smooth out the year-to-year fluctuations. To start with a plot that you will probably recognize, let us consider the cumulative production for US crude oil, courtesy of the amazing data gnomes at the EIA. This is a terrific series that starts all the way back in 1859.
On the same graph, I have shown a normal curve, fit to the data.
This is the bell-shaped curve from statistics class, plotted in cumulative form.
The fit is done just by clicking the Solver button in Microsoft Excel, and it is absolutely perfect. I used 3-point symbols, which are the smallest ones I could see, and the symbols bury the fitted curve for over 100 years. We will see that we can also use cumulative normal fits for coal production.
Cumulative US crude-oil production from 1859, plotted from 1900 on, together with a normal curve that is the least mean square fit (ultimate 225Gb, 10% year 1939, 90% year 2011). The projected remaining production is 31 billion barrels. Given current production levels of 2 billion barrels per year, the prognosis for US oil production is grim.
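The same fit can be done without Excel's Solver. The sketch below (an assumption on my part that scipy is available; any least-squares routine works) fits a scaled normal CDF to synthetic cumulative-production data generated from the caption's fitted parameters (ultimate 225 Gb, 10% year 1939, 90% year 2011, which correspond to a midpoint near 1975 and a width near 28 years):

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.special import erf

def cum_normal(t, q_ult, t_mid, width):
    # Cumulative production modeled as a scaled normal CDF:
    # Q(t) = q_ult/2 * (1 + erf((t - t_mid) / (width*sqrt(2))))
    return 0.5 * q_ult * (1.0 + erf((t - t_mid) / (width * np.sqrt(2.0))))

# Synthetic stand-in for the EIA series (the real data is at www.eia.gov):
years = np.arange(1900, 2007)
clean = cum_normal(years, 225.0, 1975.0, 28.0)                  # Gb
data = clean + np.random.default_rng(0).normal(0, 0.3, clean.size)

(q_ult, t_mid, width), _ = curve_fit(cum_normal, years, data,
                                     p0=(200.0, 1970.0, 20.0))
z90 = 1.2816   # standard-normal quantile for the 10%/90% points
print(f"ultimate {q_ult:.0f} Gb, 10% year {t_mid - z90 * width:.0f}, "
      f"90% year {t_mid + z90 * width:.0f}")
```

On the real series the residuals are not this tidy, but the procedure is the same: three fitted parameters, with the ultimate read off directly.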
Often we do not have enough data to fit for remaining production this way.
In these situations, I will use a Hubbert linearization to estimate the remaining production, like we often do for oil.
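For readers new to the technique: a Hubbert linearization plots annual production P divided by cumulative production Q against Q; for logistic growth the points fall on a straight line whose x-intercept is the ultimate recovery. A minimal sketch on synthetic logistic data (the 27 Gt ultimate, growth rate, and peak year are assumed for illustration):

```python
import numpy as np

def hubbert_ultimate(annual):
    """Estimate ultimate recovery from an annual production series.

    For logistic growth, P/Q vs. Q is a straight line; its x-intercept
    is the ultimate recovery.
    """
    P = np.asarray(annual, dtype=float)
    Q = np.cumsum(P)
    ratio = P[1:] / Q[1:]              # skip the first point, where P/Q = 1
    slope, intercept = np.polyfit(Q[1:], ratio, 1)
    return -intercept / slope          # x-intercept of the trend line

# Synthetic logistic production history (hypothetical numbers):
t = np.arange(1860, 2007)
Q_inf, k, t_peak = 27.0, 0.08, 1913    # Gt, growth rate, peak year
cumulative = Q_inf / (1.0 + np.exp(-k * (t - t_peak)))
annual = np.diff(cumulative, prepend=0.0)

print(round(hubbert_ultimate(annual), 1))   # close to the assumed 27 Gt
```

In practice the early years scatter badly (as in the open symbols of the world-hydrocarbon plot below), so the trend line is usually fit only to the later, better-behaved points.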
Hubbert introduced this approach for modeling oil production in "Techniques of Prediction as Applied to the Production of Oil and Gas," in Saul I. Gass, ed., Oil and Gas Supply Modeling, pp. 16-141, National Bureau of Standards Special Publication 631 (Washington: National Bureau of Standards, 1982).
This is a great paper. It is difficult to find, but you can download it here (15MB file). Figure 2 shows a Hubbert linearization for world hydrocarbon production.
The trend line is for 3.2 trillion barrels of oil equivalent (Tboe) remaining.
We will use this number for our simulation of future atmospheric CO2 concentrations and temperature rise.
This is 20% larger than the reserves given by the German resources agency BGR, 2.7Tboe.
The BGR includes 500Gboe for unconventional sources. In contrast, the IPCC assumes that 11-15Tboe is available for production for its climate-change scenarios.
Hubbert linearization for world hydrocarbon production (total of oil, natural gas, and natural gas liquids), based on production data from the 2007 BP Statistical Review.
Open symbols 1960-1992, closed symbols 1993-2006.
For coal, we start with the United Kingdom.
The British production cycle is nearly complete, and it is substantial, equivalent in energy content to the cumulative Saudi oil production.
There are excellent production records back to 1854, and there is even a good cumulative production for 1853.
The Victorians were outstanding geologists, and there are good reserve estimates back to 1864.
British coal even had a Hubbert.
His name was William Stanley Jevons, and he was an economist.
In 1865, he wrote a book, The Coal Question; An Inquiry Concerning the Progress of the Nation, and the Probable Exhaustion of our Coal-Mines, which should be read by anyone who is interested in coal or oil.
Jevons wrote that even though the reserves-to-production (R/P) ratio was around 1,000 years, exponential growth would exhaust British coal in the 20th century.
Jevons was right.
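Jevons' point is worth a quick check. With an assumed growth rate (the 3.5%/yr figure here is illustrative, roughly the pace of Victorian coal expansion), a reserves-to-production ratio of 1,000 years collapses to about a century:

```python
import math

def exhaustion_time(rp_ratio, g):
    """Years until exhaustion when production grows exponentially at
    rate g, given a static reserves-to-production ratio rp_ratio.
    Solves  rp_ratio = (e^(g*T) - 1) / g  for T."""
    return math.log(1.0 + g * rp_ratio) / g

# R/P = 1,000 years at an assumed 3.5%/yr growth:
print(round(exhaustion_time(1000, 0.035)))   # about a century, as Jevons argued
```

This is the essential insight: under exponential growth, the R/P ratio wildly overstates how long a resource will last.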
In his time, there were more than 3,000 coal mines. Now the British are down to six major underground mines, with the last Welsh mine, the Tower Colliery, due to finish off its last seam next year. Figure 3 shows a Hubbert linearization for British coal.
There is a good trend line, and the very first point in 1854 is near the line.
We will see that the quality of the trend is in contrast to the reserves, which badly over-estimate remaining production throughout.
Hubbert linearization for British coal from 1854 to 2006, with a trend line for an ultimate of 27Gt.
The peak production was 292Mt in 1913.
The production for 2006 was only 19Mt.
The normal fit, shown in Figure 4, is more complicated than our normal fit for US oil.
There are two pieces, one for production before the Second World War, and one afterwards with a higher ultimate.
Why did this happen?
It could simply be that economic activity increased after the war. Another possibility is technical change; strip mining started in Britain during the war. Yet another possibility is that it is a result of the coal mines being nationalized in 1947.
This created strong political incentives to support coal production. I was an undergraduate student in England in the early 70's when the coal miners brought down the Heath government.
Even though the mines are privately owned now, the mining companies still receive government grants to help open up new seams.
Cumulative normal fits for British coal.
The pre-war fit gives an ultimate of 25.6Gt, with the 50% year in 1920.
The post-war fit gives an ultimate of 27.2Gt, with the 50% year in 1927.
Mercury emissions rise in Illinois, even as figures drop nationwide
Illinois' mercury pollution from coal-fired plants increased 7 percent in 2008, U.S. says
Mercury pollution from coal-fired power plants is increasing in Illinois even as it declines nationwide, a troubling trend for the state because emissions of the toxic metal tend to fall back to earth close to the source.
The amount of mercury blown into the air by the state's coal plants jumped by 7 percent last year, according to a Tribune analysis of newly released federal data on industrial pollution.
By contrast, mercury emissions from all U.S. power plants declined by 4 percent.
Only one other state, Michigan, recorded a larger increase in pounds released.
Texas tied Illinois for the second largest, but emissions declined in 27 other states, including Indiana, Ohio, Georgia and several others that rely heavily on coal to generate electricity.
The increases in Illinois and several other states can be attributed to power companies' burning more high-mercury coal in 2008, without equipment to filter out the poisonous byproduct.
That type of coal generally contains less sulfur, which helps companies meet federal limits on acid rain pollution.
There still are no national restrictions on mercury emissions from power plants, the largest man-made source of the toxic metal.
It takes only a small amount of mercury to pollute lakes and streams. Nearly half of the nation's lakes contain fish contaminated with harmful levels of mercury, according to a U.S. Environmental Protection Agency study released earlier this year. The problem is so pervasive that Illinois and 43 other states advise people, especially women of childbearing age and young children, to avoid or limit eating certain types of fish.
EPA scientists also have determined that Chicago is a "hot spot" where relatively large amounts of mercury fall.
Nearly two-thirds of the pollutant comes from sources within the state.
"This shows why it is so important to have enforceable limits in place nationwide," said Bruce Nilles, director of the Sierra Club's Beyond Coal Campaign.
Though Illinois adopted mercury limits on power plants three years ago, the regulations won't take full effect until midway through the next decade.
The Obama administration has proposed national limits, but power companies likely will get several years to comply, meaning emissions could keep rising in some states.
Mercury is one of dozens of toxic chemicals and heavy metals that billow out of the smokestacks of coal-fired power plants. The increases seen in Illinois and several other states are striking because most other pollutants from coal plants declined last year, including substances in lung-damaging soot, smog and acid rain.
Carbon dioxide pollution that contributes to global climate change also dipped last year, reflecting lower demand for electricity because of the recession.
The hodgepodge of results on mercury emissions occurred because levels of the metal can vary widely depending on the type of coal burned.
And most power plants still are not equipped to scrub mercury droplets out of smokestack exhaust.
In Illinois, where about half of the state's electricity comes from coal plants, most power companies have switched from coal mined in the state to sources in Wyoming and other Western states. Western coal has lower amounts of sulfur, an ingredient in acid rain, but it generally contains more mercury.
The amount of mercury emissions in Illinois in 2008 rose to 4,466 pounds, up about 7 percent from 4,181 pounds in 2007.
Nationally, mercury emissions dropped to 88,871 pounds in 2008, down about 4 percent from 92,907 pounds in 2007.
Mercury emissions rose at three coal-fired power plants in the Chicago area:
two in Will County and one in Chicago's Pilsen neighborhood.
Emissions fell at plants in Waukegan and Chicago's Little Village neighborhood.
Doug McFarlan, a spokesman for Midwest Generation, the company that owns all five plants, said mercury pollution should drop across the board this year. Mercury-scrubbing equipment was installed in mid-2008 at the Chicago and Waukegan plants and in July at the two Will County plants, one in Joliet, the other in Romeoville.
Under state rules, all Illinois coal plants must reduce mercury pollution by 90 percent by 2015.
"We're on track to meet those limits," McFarlan said.
This year, a federal appellate court threw out industry-friendly rules imposed by the Bush administration that would have given all U.S. power companies until the 2020s to reduce mercury emissions by 70 percent.
Those rules also would have let some plants keep releasing large amounts of mercury as long as emissions slowly declined nationwide.
The Obama administration is pushing its own rules that would require faster, deeper cuts in mercury pollution at each power plant, a move that would fulfill one of the president's campaign promises. Mercury controls are expected to be included in a package of rules that also would clamp down on soot, smog and acid rain pollution from coal plants.
"The agency is committed to following science and the law as it develops a strategy to reduce harmful emissions from these facilities, which threaten the air we all breathe," said Cathy Milbourn, an EPA spokeswoman.
Mercury is one of the last pollutants released by power plants to be targeted for limits by environmental regulators. The toxic fallout has become a lingering problem even as other smokestack emissions have declined, mostly because other chemicals are subject to federal limits.
How the Earth Works
Imagine coming upon Earth as a traveler from another galaxy.
It wouldn't take you long to discover Earth's fascinating variety.
The combination of continents, oceans, and atmosphere makes it unique among all the planets in the solar system, and perhaps in the galaxy.
These features also create the conditions for life in all its diversity.
But where did the land, water, and air come from?
And how do these systems work together to produce the complex phenomena that are evident everywhere on this beautiful planet?
Consider these cases:
* The steady, slow decay of radioactive elements deep inside Earth provides the heat that keeps our planet at a slow boil, moving massive amounts of mantle rock in a cyclic pattern of convection and creating plate tectonics.
* A mid-ocean ridge system snakes around Earth like the seam on a baseball.
The slow drift of Earth's tectonic plates—moving at the rate that hair grows—splits the ridges apart, pulling rock up to fill the gap.
* One of the most energetic environments on Earth is where land and ocean meet:
the seashore.
Waves crashing against the coasts of North America can be heard as background static on seismometers located at the center of the continent.
An Astonishing Journey
How the Earth Works takes you on an astonishing journey through time and space.
In 48 half-hour lectures, you will look at what went into making our planet—from the big bang, to the formation of the solar system, to the subsequent evolution of Earth.
You will travel to the center of our planet and out again, charting the geologic forces that churn beneath our feet to push the continents and seafloor around like so much froth on the surface of a pot of soup.
Earthquakes, volcanic eruptions, and tsunamis are byproducts of our planet's ceaseless activity, and you will focus on specific examples of each to learn why and when they occur. Volcanic activity has produced the atmosphere as a side effect, and you will learn how this sea of air functions at the global scale.
Earth's surface is mostly water, and you will explore the cycling of this vital substance throughout the planet, along with its role in climate, erosion, plate tectonics, and biology.
Not only are humans at the mercy of our planet's natural forces, but we ourselves have also become agents of change.
We are altering the Earth's land, water, and air faster than any other geologic process. This will be another theme of your journey:
how humans have transformed watersheds, leveled mountains, changed the balance of gases in the atmosphere, and caused the extinction of enough species to hasten the end of the 65-million-year-old Cenozoic era.
It is vitally important that we understand the nature of our geologic powers, if we are to have any hope of controlling them.
Your Guide:
A Scientific Sherlock Holmes
Professor Michael E. Wysession is the ideal guide for this expedition. A geophysicist with a specialty in seismology, he has developed techniques for using seismic waves from earthquakes to deduce the three-dimensional structure of the interior of the Earth.
Like a scientific Sherlock Holmes, Dr. Wysession uses this approach to "see" into a realm that was previously more mysterious than galaxies billions of light years away.
As a leader in geoscience education, Professor Wysession has wide experience teaching Earth science to students from high school through the advanced graduate level.
For this course, he assumes no prior background in science and introduces all the concepts you will need to understand how the Earth works—from basic physics, chemistry, astronomy, and biology to the fundamentals of geology, mineralogy, hydrology, and atmospheric science.
Each lecture builds upon preceding ones to deepen your understanding of key concepts. For example, Professor Wysession spends the first four lectures laying the groundwork for the introduction of plate tectonics in Lecture 5.
Almost everything he discusses thereafter relates in some way to this revolutionary theory, which is as instrumental to Earth science as the Copernican theory is to astronomy.
How the Earth Works is the perfect complement to The Teaching Company's in-depth geology series, Nature of Earth:
An Introduction to Geology.
Taught by Professor John J. Renton, that course covers Earth's minerals, rocks, soils, and the processes that operate on them through time.
How the Earth Works also touches on these subjects, but it ranges farther afield to investigate Earth as a system, as one might study a complicated machine.
Such a focus makes this course truly a user's manual to the planet.
Whether your interest is geology, cosmology, biology, climate science, or even human history, Professor Wysession shows how these perspectives fit into the comprehensive picture of our planet.
Fitting the Pieces Together
You don't have to travel far to realize that we live in a world of startling contrasts—in landforms, natural resources, flora and fauna, climates, vulnerability to natural disasters, and other characteristics. Professor Wysession shows that these different features are like interlocking puzzle pieces. Learning how the different pieces fit together gives you insight into some very interesting questions:
* Why is there gold in California and coal in Indiana, and not the other way around?
* What does the tilt of Earth's axis have to do with the evolution of deciduous trees?
* Why are volcanic eruptions predictable, but earthquakes (so far) are not?
* What is the link between the shape of Earth's orbit and the size of mammals?
* How does the movement of Earth's tectonic plates affect climate change?
* What does the mid-ocean ridge environment have to do with the origin of life?
Get "Under the Hood" of Earth
How the Earth Works includes many simple activities that make concepts clear. Whether through illustrating the viscosity of magma with day-old oatmeal or showing how a laptop computer can double as a seismometer, Dr. Wysession believes learning works best when you demonstrate and describe basic principles. You will marvel at the lessons he can impart—and that you can do yourself—with a chocolate bar, modeling clay, an orange, and even a piece of Afghan flatbread (which nicely reproduces the complex faulting seen along the mid-ocean ridge system). He also brings in intriguing rock samples with wondrous stories to tell about the history of our planet.
In addition to giving you the pleasure of looking under the hood of Earth and understanding how it is put together and how it works, this course provides a new context for understanding contemporary events and issues such as natural disasters, climate change, resource scarcity, and renewable and nonrenewable energy sources. You may also be surprised to learn the central role that Earth's ceaseless activity has played in historical events, from the origin of civilizations to the fall of Rome and the voyages of Leif Eriksson.
Finally, think back to that traveler from another galaxy.
The space voyager's first impression of Earth would doubtless be of its sheer beauty:
its intensely blue oceans, brilliant white polar caps, tan deserts, and deep greens of rainforest, overlain by graceful swirls of clouds. How did it all come about?
What was it like in the past, and what will it be in the future?
What is the role of life in this intricate system?
Top 10 environmental moments of the decade
An understanding of climate change was no longer limited to a small group of scientists or environmentalists, and concern started to change the way more of us live our lives. Eco Solutions looks back at 10 environmental developments that defined this as the greenest decade yet.
Do you agree with our list?
What do you think the most significant environmental moments of the decade were?
Have your say in the "Sound off" box at the bottom of this page.
The game-changer:
The Toyota Prius
In 2001, the Toyota Prius became the first mass-produced hybrid vehicle to be sold worldwide.
It heralded the beginning of an era of commercially viable, even successful, green goods. Industry insiders called it a game-changer, attracting buyers despite its higher-than-average cost and unusual look.
Buying a Prius wasn't simply about fuel efficiency; it was about making a statement on the environment.
To date, more than one million Priuses have been sold worldwide, and other major manufacturers have followed suit to develop their own hybrid and electric-only vehicles.
The summit:
COP15
December's U.N. summit on climate change held in Copenhagen, Denmark garnered unprecedented attention from around the world.
Intended to find a successor to the Kyoto Protocol, the fifteenth Conference of Parties (COP15) was meant to produce a definitive agreement for future emission cuts. Yet even before the summit began, leaders tried to temper expectations by saying that firm targets were "highly unlikely."
In the end, the accord that was reached fell short of the expectations of nearly every interested party gathered in the Danish capital. U.N. secretary-general Ban Ki-moon put a brave face on the conference, saying leaders were "united in purpose, but were not yet united in action."
Al Gore's star power:
"An Inconvenient Truth" and the Nobel Peace Prize
It might not be an exercise in cinematic artistry but Al Gore's 2006 "An Inconvenient Truth" has its place in history.
The film was instrumental in spreading the message of climate change and had the rare advantage of a widespread release (it was one of the highest-grossing documentaries of all time).
The 100-minute documentary was built on the simple premise of a PowerPoint presentation, but it was the content, not the form, that caught the world's attention.
Since its release, it has been maligned by some critics for "fear-mongering."
The most contentious assertion in the film described a six meter sea level rise as a realistic short term prospect - a projection that disagrees with U.N. findings.
Yet despite its doomsday scenarios, it explained the basics of climate change to an audience that until then had had little access to such information. The film brought Al Gore an unexpected accolade in 2007: the Nobel Peace Prize.
The Former U.S. Vice President was applauded by the Nobel Committee for "efforts to build up and disseminate greater knowledge about man-made climate change."
The Rise of CFLs
Compact Fluorescent Lightbulbs (CFLs) brought environmentalism into the home.
Despite some initial objections to their cost and less-than-ideal performance (an initial flicker and the inability to use them with dimmers), producers overcame those hurdles and in 2007, sales of CFLs reached record heights worldwide.
Australia has already implemented a ban on traditional incandescent light bulbs, while the European Union and Canada are also phasing out the old bulbs.
According to the U.S. Environmental Protection Agency, the average CFL uses 75 percent less energy than the traditional incandescent light bulb.
That efficiency translates into a saving of roughly $30 over a bulb's lifetime, meaning a CFL pays for its higher purchase price in about six months. It was an important coup for the environmental movement, but green groups now say much more needs to be done to establish recycling programs for CFLs so they don't end up in landfills.
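The payback arithmetic can be sanity-checked with a rough sketch. The wattages, electricity price, bulb cost and usage hours below are illustrative assumptions, not EPA figures:

```python
# Rough CFL payback estimate; all inputs are illustrative assumptions.
incandescent_w = 60
cfl_w = 15               # 75 percent less energy, per the EPA figure
price_per_kwh = 0.11     # assumed electricity price in dollars
cfl_cost = 3.00          # assumed purchase price of one CFL
hours_per_day = 3.5      # assumed daily use
cfl_life_hours = 8000    # assumed rated lifetime

kw_saved = (incandescent_w - cfl_w) / 1000
saving_per_day = kw_saved * hours_per_day * price_per_kwh
payback_months = cfl_cost / (saving_per_day * 30)
lifetime_saving = kw_saved * cfl_life_hours * price_per_kwh

print(f"payback: {payback_months:.1f} months, lifetime saving: ${lifetime_saving:.0f}")
```

Under these assumed inputs the bulb pays for itself in roughly six months and saves about $40 over its life, in the same ballpark as the figures quoted above.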
Heating up:
A decade of extreme weather
Hurricane Katrina in 2005, a string of deadly hurricanes devastating Haiti in 2008 and the heat wave in Europe in 2003 are just some examples of the wild weather that plagued this decade.
While scientists say it is not possible to make a direct link between extreme weather and man-made climate change, the U.S. Environmental Protection Agency says "climate change may increase the probability of some ordinary weather events reaching extreme levels or of some extreme events becoming more extreme."
Recently, the U.N.'s weather agency said that this decade was the hottest on record.
Noughties talk:
"Carbon Footprint"
This decade saw a flurry of green jargon enter our everyday vocabulary.
Growing awareness of the environment meant that noughties vocabulary included words like "carbon footprint," "carbon neutral" and "greenwashing."
Perhaps the most significant step came when such terms were accepted into the Oxford English Dictionary in 2007.
Carbon footprint was defined as "the amount of carbon dioxide emitted due to the activities, especially the consumption of fossil fuels, of a particular person, group, etc."
The devil is in the details:
The Intergovernmental Panel on Climate Change report
In 2007, climate scientists from around the world met in Paris to lay out what we knew about climate change.
It was a significant attempt to amalgamate decades of climate data from around the world.
The Intergovernmental Panel on Climate Change (IPCC) issued the first scientific consensus on climate change, which included an unprecedented acknowledgement that it was "very likely" that climate change was caused by human activity.
The report went on to project a temperature rise of 1.8 to 4 degrees Celsius and a sea-level rise of between 28 and 43 cm by the end of the century.
Ten years of growth:
Renewable Energy
Most scientists agree that there is no silver bullet for climate change but some believe that a healthy mix of existing technologies especially in the renewable energy sector will be enough to significantly cut emissions. Wind, solar, hydro all saw significant support in the past ten years.
In 2008, global power capacity from renewables topped 280,000 MW, according to the International Energy Agency.
That is three times what nuclear power plants in the United States currently produce.
There are also new players in the sector:
India and China are now among the leaders in the installation and manufacture of renewable energy.
In 2008, China's wind power capacity doubled for the fourth year running.
The technology itself has also developed significantly: solar power saw a few key breakthroughs, including improved conversion efficiency (which now tops 20 percent) and the creation of ultra-thin solar panels.
It's going to cost you:
The Stern Review on the Economics of Climate Change
Can you put a price on climate change?
Yes, according to British economist Sir Nicholas Stern.
In 2006, the former vice president of the World Bank issued a 700-page report calculating the cost of climate change to the world's economy.
Green groups called the report a wake-up call for governments who saw a concrete financial impact of climate change for the first time.
The report estimated that climate change would cost at least 5 percent of global GDP annually, now and forever; in the worst-case scenario the figure would be 20 percent a year ($7 trillion). The Stern Review was the first of many reports that tried to view climate change from an economic perspective.
The UN Framework Convention on Climate Change suggests that climate change could cost between $70 billion and $100 billion by 2030 -- roughly the cost of three Beijing Olympics.
Cap and trade
Carbon trading, also known as "cap and trade," became a hotly debated policy that many hope will help counter climate change.
Using free-market principles and government regulation, participants in cap and trade schemes buy and sell permits to emit carbon dioxide.
Governments limit the amount of emissions allowed and slap heavy fines on those who exceed those limits. Reducing the amount of permits issued over time should then reduce pollution levels.
The EU has the largest emissions trading market, set up in 2005, and more than 30 countries have adopted, or plan to adopt, similar models. But many critics point to the lack of a global market for carbon trading, which would make the approach more effective.
There are also questions over regulation and accounting for pollution offsets. Many see "offsetting" in cap and trade schemes as an escape hatch for businesses to avoid making real reductions in their polluting activities. It's a complex and controversial issue, but one that looks set to be a key feature of mainstream climate change solutions in years to come.
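The basic mechanics described above -- a fixed allocation of permits, trading between firms, and fines for uncovered emissions -- can be illustrated with a toy settlement round. This is a hypothetical sketch, not the rules of the EU scheme or any real market; all names and numbers are made up:

```python
# Toy cap-and-trade settlement round. Each firm gets an allocation of
# permits; over-emitters buy any spare permits at the market price, and
# anything still uncovered draws a fine. All figures are hypothetical.

def settle(firms, permit_price, fine_per_tonne):
    """firms: dict of name -> (allocation, actual_emissions), in tonnes."""
    costs = {}
    # Total spare permits offered for sale by under-emitters.
    spare = sum(max(alloc - emitted, 0) for alloc, emitted in firms.values())
    for name, (alloc, emitted) in firms.items():
        short = max(emitted - alloc, 0)
        bought = min(short, spare)      # buy what the market can supply
        spare -= bought
        uncovered = short - bought      # anything left draws the fine
        costs[name] = bought * permit_price + uncovered * fine_per_tonne
    return costs

# Firm A emits 20 tonnes under its cap; firm B is 30 tonnes over.
firms = {"A": (100, 80), "B": (100, 130)}
print(settle(firms, permit_price=15, fine_per_tonne=100))
# -> {'A': 0, 'B': 1300}: B buys A's 20 spare permits, pays the fine on 10
```

Shrinking the allocations each year is what drives total emissions down, since the same shortfalls then cost progressively more to cover.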
Small forests a big help in curbing carbon
Editor's note:
Chuck Leavell is a pianist who has spent 27 years playing keyboards for the Rolling Stones. He is a board member of the American Forest Foundation and recently founded Mother Nature Network, MNN (www.mnn.com), an environmental news and information site.
He owns and operates the 2,500-acre Charlane Plantation in Dry Branch, Georgia.
Atlanta, Georgia (CNN) -- Even if world leaders haven't finished the job with the global accord produced at the Copenhagen climate talks, the summit was not a total bust.
That's because negotiators there outlined a landmark deal aimed at making money grow on trees.
The tentative outline would allow countries -- especially developing countries -- to profit from preserving forests, and possibly even peat swamps, grasslands and other ecosystems that help soak up carbon dioxide and soften the blow of climate change.
But it's forests that are the main target, due to their knack for absorbing prodigious amounts of CO2 from the atmosphere, thereby preventing it from trapping the extra solar heat that causes global warming.
Unfortunately, the summit's forest initiatives provided little help to the more than 10 million people like me who manage family-owned forest land in the United States. Since such forests make up more than a third of all forests in the nation, they have the potential to play a huge -- and growing -- role in reducing carbon emissions. Any government action on climate change in this country needs to pay attention to them.
It may seem odd for a rock 'n roll piano player to be picking up this cause.
Let me explain:
Back in the mid-1980s, when I was between Rolling Stones tours, I found myself studying practical forestry.
My wife, Rose Lane, and I had inherited land from her grandparents and we were beginning to actively manage our family forest, Charlane Plantation, carrying on the tradition of good stewardship that her family started many generations ago.
In 1999, Rosie and I were rewarded for our efforts when the American Tree Farm System, a program of the non-profit conservation group the American Forest Foundation, named us National Outstanding Tree Farmers. We were selected from among some 65,000 other family forest owners, which was a great honor for us.
Through the years I have become fascinated by, and passionate about, trees and forests. But I never dreamed that some 25 years later I would be talking about carbon credits, cap and trade, and biomass for energy usage.
However, these were some of the hot topics being discussed at the recent Copenhagen climate talks, which I followed with great interest.
A full 10 percent of carbon emissions in the United States every year are absorbed by forests and forest products, according to the U.S. Environmental Protection Agency, but that number could increase if the right economic incentives were put in place.
Forests can supply us with clean, renewable energy, such as biofuels, that, if used to replace fossil fuels, also reduce carbon emissions.
They can serve as a "carbon offset" under a cap-and-trade system that allows forest owners to sell their enhanced carbon storage to an emitter when that emitter is unable to meet its limits on greenhouse gas pollution.
President Obama and the Senate will soon turn their attention again to proposed climate change legislation, which will no doubt continue to be controversial because of its complexity and broad impact on the environment and the economy.
Carbon trading is an important part of any approach, but more is needed to make environmentally-friendly steps worthwhile for private forest landowners.
Many smaller forest landowners in the United States may not be able to participate in a cap and trade program because the set-up costs for trading carbon -- such as verification of storage and monitoring costs -- are just too expensive for owners with lot sizes less than 100 acres. Yet together these smaller family-owned U.S. forests make up a significant "carbon sink" of 119 million acres.
The nation could immediately put this land to work in helping combat climate change, but first, when it considers new legislation, Congress must provide improved offset market opportunities for forest owners, and incentives for forest management practices that enhance carbon storage, like those included in pending legislation offered by Sen. Debbie Stabenow, D-Michigan, and six of her colleagues.
Such practices include replanting trees, changing timber rotations, and avoiding deforestation.
The carbon reduced would supplement the reductions achieved via cap and trade.
Congress should consider compensating landowners to undertake these strategies.
Family forest owners like me provide about 60 percent of all U.S. lumber supplies. With the slow housing construction market, and timber prices near record lows, it's a great time to direct forest owners toward conservation-minded forest practices.
Providing forest owners with a modest income stream for carbon-enhancing forest activities is the key.
When a developer comes knocking, it helps when a forest owner already has a satisfactory economic incentive in place to nudge that owner toward taking the step that helps the climate and protects forests, instead of, say, clear-cutting the land.
Of course, besides climate improvement, conservation brings the other forest benefits we cherish:
clean air, safe drinking water, wildlife, and recreation. I hope that, beyond Copenhagen, our lawmakers, and folks all over the United States, will support the creation of the economic incentives that will help us to attain our nation's environmental goals.
Midwest hurt by energy dithering
When we see the captains of industry waiting for the corrupt, inefficient, politically expedient government to make their decisions for them.
Instead of allowing the energy industry to find the most efficient and reliable sources, we have to wait for what is currently politically correct.
It may not be efficient, it may not be viable, it may not even be possible on a large scale, but we await the politicians' political posturing.
Who will get the subsidies and who will get regulated out of the market?
We cannot know until the 'central authority' makes up its mind who will sweeten the pot more.
They are waiting to see who will win the campaign-funds sweepstakes, not which is the best economic option.
We're entering a world without sense or reason.
Back in 2004, former Xcel Energy Chairman and Chief Executive Wayne H. Brunetti made this famous plea as political leaders contemplated carbon emission limits and other policies to fight global warming:
"Give us a date, tell us how much we need to cut, give us the flexibility to meet the goals, and we'll get it done.''
Five years later, policymakers still haven't set the rules Brunetti sought to unleash American industrial innovation -- the force that can transform the nation into one powered by clean, renewable fuels. The indecision isn't good for the planet or businesses. Passing strong energy and climate legislation -- in particular, finally setting a price on carbon -- must be a top congressional priority in 2010.
The economic and security advantages of doing so are just as compelling as the science.
The United States consumes about 25 percent of the world's annual oil production, yet has just 3 percent of its proven reserves. Consumers and industry will face higher energy prices as developing nations compete with us for dwindling supplies. The world's largest reserves are also found in the world's trouble spots:
Russia, West Africa and the Persian Gulf. Unfriendly regimes could easily threaten U.S. oil supplies and wreak economic havoc here.
Those who dismiss the science of global warming shouldn't be so careless about this national security threat.
Often lost in the global warming debate is that the Midwest especially has a stake in the energy policy decisions that lie ahead.
Minnesota, Iowa and North Dakota are among the top wind-energy states. The region is also home to promising energy start-up companies. These local firms' growth and hiring are on hold because of the dithering we've seen in Congress and elsewhere -- such as at Copenhagen's recent climate conference.
Although Minnesota explorer Will Steger has been the region's voice on climate change policy, leaders also need to listen to business-minded Midwesterners such as Gerald Groenewold at the University of North Dakota in Grand Forks.
Groenewold runs the university's respected Energy and Environmental Research Center, which develops clean energy technology and spins it off commercially.
Uncertainty over energy policy is jeopardizing the center's work and, by extension, the region's economic well-being.
The problem?
Limited venture capital.
"It's just hamstrung right now with respect to investments. There's plenty of money out there...but (investors) are not willing to invest in these technologies and new approaches to energy transmission until they have a sense of direction:
here's what we want, here's what's going to be required and here's what we have to do,'' Groenewold said.
"Until Congress gives guidance, we have gridlock.''
It's time to end the counterproductive holding pattern businesses are locked into. The U.S. House has already passed carbon legislation.
Senators wearied by the bruising health care debate nevertheless need to come back and tackle an equally complex undertaking:
energy policy.
In addition to carbon, congressional action is needed on nuclear power and waste, as well as the federal government's role in building power lines. Research funding for technology that cleans coal plant emissions must also be addressed.
"Bring this to an end and do something,'' said Groenewold.
"Give us a road map, for goodness' sake.''
Climate change:
Will our grandchildren revile the 'lost decade'?
PARIS, Dec 28, 2009 (AFP)
The first decade of the 21st century dawned with a global strategy to fight climate change but ended in chaos, the UN system in tatters, while greenhouse gases spewed with few constraints.
"Future generations will rue the years of inaction," Steve Sawyer, a veteran observer who heads the Global Wind Energy Council (GWEC), a Brussels green industry association, says grimly.
"Some generations will rue it very much -- those that survive."
In 2000, the world placed its faith in the UN Framework Convention on Climate Change (UNFCCC), the creation of the famous Rio summit.
But the following year, the vehicle started to shake and its wheels began to rattle when US President George W. Bush abandoned the Kyoto Protocol, the sole treaty to set down targets for curbing carbon gases.
Crippled by the walkout of the world's wealthiest economy and biggest carbon emitter, Kyoto limped along, failing to brake a relentless surge in heat-trapping gases.
In 2007, in its landmark Fourth Assessment Report, the Intergovernmental Panel on Climate Change (IPCC) issued a blunt warning.
Without swift action to slow, halt and reverse the growth in emissions, the world was on course for between 1.8 and 4.0 degrees Celsius (3.2-7.2 degrees Fahrenheit) of warming, the UN's top climate scientists said.
By century's end, hundreds of millions could be under threat from drought, flood, storms, rising seas, disease, malnutrition and homelessness.
The shock report reduced the lobby of climate skeptics to a rump, galvanised public opinion and nailed climate change to the top of the political agenda.
On December 18 2009 in Copenhagen, leaders' rhetoric -- and the UN format itself -- were put to the test.
The day had been billed as the moment when humankind would unite, each nation pledging a sacrifice towards a global pact that, from 2013, would shrink climate change from mortal peril to manageable risk.
Instead, it became a finger-pointing fiasco.
Terrified the talks would collapse, a couple of dozen leaders from the most powerful countries -- including the United States, the European Union, Japan, China, India and Brazil -- huddled over a so-called "Copenhagen Accord."
They gathered around a table, frenziedly adding or crossing out text on the planet's future before eventually settling on the lowest common denominator.
They set a goal of limiting warming to 2 C (3.6 F) -- but did not say when carbon emissions should peak, which scientists say has to be around the middle of the next decade.
Nor did they identify key staging points in the medium or long term, in 2020 and 2050.
And national pledges on carbon emissions were voluntary, carrying no penalty if breached.
Within hours, the document was savaged when it was put to the wider community of nations. The most outspoken lashed it as a "coup d'etat" against the UN system, a stitchup by an elite, a betrayal of the poor and a slap to expert opinion.
In the end, the critics were sidelined.
The conference gavelled the Accord through without even putting it to a vote. UN credibility lay in ruins and the blame game began.
Why did things go so wrong in Copenhagen?
And what can be salvaged?
Some say the fudge was inevitable and even argue the outcome is not so bad.
They note that Barack Obama, scrapping Bush's climate legacy and gingerly steering an emissions bill through Congress, had scant room to offer deeper US concessions in Copenhagen.
Then there are China, India and Brazil, hostile to anything resembling a binding emissions target, arguing they have the right to exploit fossil fuels to rise out of poverty.
Yet these high-population countries have become mega-polluters in their own right, accounting for the bulk of the 29-percent surge in carbon emissions between 2000 and 2008, the latest year for which figures are available.
Seeking an agreement in Copenhagen was like trying to square a circle, says WWF's Kim Carstensen, who blames "the lost years" of the Bush era for turning climate debate into trench warfare.
For UN chief Ban Ki-moon and others, the deal at least is the first to combine rich and poor nations in a single framework for emissions pledges.
They argue it provides a matrix for work that could lead to a fully-fledged treaty next December in Mexico City.
Other commentators are gloomier.
They blame the complex UNFCCC process, a spider's web of a thousand interlinked strands, where decision-making is driven by consensus among 194 nations. This offers plenty of room for delaying tactics or sabotage.
"In Copenhagen, we saw greater political will than ever, yet a handful of ideologically-driven countries nearly thwarted a deal," says Elliot Diringer, vice president for international strategies at the Pew Center on Global Climate Change, a US think tank.
"It's time to assess whose interests the process really serves. If the process can't be reformed, it may be time to forge coalitions of the willing."
If so, the toughest question of all emerges:
Can nation-states, built by definition to defend national interests, rise to the challenge of serving the global good?
Carstensen agrees the future for cutting deals on carbon emissions could lie with a smaller, nimbler arena.
It could be bilateral or a gathering of major emitters -- a format ironically proposed by Bush, a devil to green campaigners.
Even so, says Carstensen, "we will still need a global framework around climate action -- and we don't have anything better than the UN to do this. So I think the UNFCCC will continue to have a role."
Joris den Blanken of Greenpeace brands Copenhagen "a historic failure" by political leaders.
Yet he contends the setback should not mask a more positive, underlying change.
"This decade must be seen as the decade when people woke up to the challenge of climate change and began creating a climate movement," says den Blanken.
"Over the next few years, as awareness grows and the effects of climate change unfortunately become more apparent, politicians will have no other choice but to fall in line.
China's 'carbon intensity' commitment means nothing
There's been plenty of excitement over China and India's pledges to reduce the 'carbon intensity' of their economies. But without absolute limits, this is just business as usual
As we get closer to the climate-change negotiations in Copenhagen in December you can expect to hear a great deal more about carbon intensity.
At the pre-meeting in New York in September, President Hu Jintao of China committed his nation to 'continue its unremitting endeavours in boosting energy efficiency and by 2020, we should try to achieve a significant cut of carbon dioxide emissions per unit of gross domestic product'.
The flurry of excitement that greeted this announcement was not shared by ecologists or green economists. All Hu was really promising was that China's massive industrial production would be achieved with relatively less CO2 per unit of output -- cutting 'carbon intensity' rather than cutting carbon emissions, which can continue to grow. This carbon intensity will be calculated as a ratio of emissions to GDP output (a notoriously unhelpful measure of economic activity), and the promised improvement is only relative to the inefficient nature of China's current production.
So all that has really been promised is an attempt to move towards more 'renewable' energy sources, which include nuclear power.
How likely are we to achieve the sorts of improvements in carbon intensity that will allow us to maintain current consumption standards while making the CO2 cuts the planet needs?
Adam Barnes has calculated that, taking population growth into account, merely to keep emissions stable the carbon intensity of our productive systems must fall by 66 per cent.
If we build in the need for even a modest 60 per cent cut in emissions, and extrapolate from our current trajectories to assume a population increase of 50 per cent and a doubling of per capita GDP then we are looking at the need for emissions per unit of GDP to fall by 86.8 per cent over the next 40 years.
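The arithmetic behind both figures is easy to reproduce. Since carbon intensity is emissions divided by GDP, a tripling of GDP (population up 50 per cent, per capita GDP doubled) forces intensity down by two-thirds just to hold emissions level, and by roughly 87 per cent if emissions must also fall 60 per cent:

```python
# Carbon intensity = emissions / GDP, so the intensity ratio that hits an
# emissions target is simply emissions_target / gdp_growth.
gdp_growth = 1.5 * 2.0   # population +50%, per capita GDP doubled -> GDP triples

# Case 1: merely hold absolute emissions constant.
fall_stable = 1 - 1.0 / gdp_growth
print(f"intensity must fall {100 * fall_stable:.1f}%")    # 66.7%

# Case 2: also cut absolute emissions by 60 per cent.
fall_with_cut = 1 - 0.4 / gdp_growth
print(f"intensity must fall {100 * fall_with_cut:.1f}%")  # 86.7%
```

This back-of-the-envelope version lands within rounding of the 86.8 per cent quoted above; the small difference presumably reflects more precise growth assumptions in Barnes's calculation.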
Even if we could achieve a fraction of this greater efficiency, what would we get for our energy?
While we persist in measuring economic output in monetary terms, the incentive for other economies is to follow the UK-US route and rely on 'invisible' earnings that do not require fossil fuel burning to create value — international insurance and banking services, for example.
So China's commitment could boil down to nothing more than a threat to compete with Britain for the diminishing world demand for such services.
What this means is that both sides of the ratio that generates the measure of carbon efficiency are flawed.
On the one hand we have the relative reduction which really means an absolute increase — some kind of Faustian pact with a planet which, as the climate campers blazoned across their banners, doesn't do bailouts. On the other hand we have a measure of economic value - GDP - that has nothing to do with what we, as humans, value and only respects the accountant's yardstick of financial return.
What we need is a genuine indicator of the carbon efficiency of an economy — let's call it a Responsible Carbon Index.
It should take into account the emissions already embedded in the goods we consume but that are produced overseas; it should be measured in terms of absolute carbon emissions rather than relative ones; and it should relate to something we actually value, like human well-being, or species diversity, or some combination of such measures. A single-number index is always good — it works for the journalists — and once we have this in place environmental campaigners could use it to browbeat their national politicians to try to outperform Burkina Faso or Laos on climate responsibility, rather than competing with China or the US on some arcane monetary measure of the production of pointless and destructive stuff.
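As a rough illustration only: the article names the ingredients of such an index but no formula, so the functional form below, the function name, and all the numbers are hypothetical assumptions, sketching one way the proposal could be made concrete.

```python
# Hypothetical sketch of the proposed "Responsible Carbon Index" (RCI).
# The article only names the ingredients; the ratio below is an
# illustrative assumption, not a published formula.

def responsible_carbon_index(consumption_emissions_t, wellbeing_score):
    """Well-being delivered per tonne of absolute, consumption-based CO2.

    consumption_emissions_t: per-capita tonnes of CO2, including emissions
        embedded in imported goods (absolute, not relative to GDP).
    wellbeing_score: any agreed well-being measure, e.g. a 0-10 life
        satisfaction score, possibly combined with species-diversity data.
    Higher RCI = more of what we value per tonne emitted.
    """
    return wellbeing_score / consumption_emissions_t

# Made-up numbers: a low-emission country can outrank a high-emission
# one even with a somewhat lower well-being score.
print(responsible_carbon_index(1.5, 6.0))   # low-emissions economy
print(responsible_carbon_index(15.0, 7.5))  # high-emissions economy
```

The design choice is the point: because the denominator is absolute, consumption-based emissions, a country cannot improve its score merely by growing GDP or offshoring production.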
The media is full of talk of cuts this autumn, but please excuse me if I fail to get excited until I hear about the sorts of cuts that will give half of the humans alive on the planet today a chance of seeing out their natural span of life.
Why carbon trading cannot work
Carbon trading cannot work.
How do we know this?
Because economic theory tells us so.
The great benefit in having a theory that is so strictly defined – regardless of its inability to represent the real world – is that you can be certain when its assumptions and rules are not met, and in such cases you ought to be able to conclude fairly definitively that a market outcome will not be efficient.
Hence it is no surprise that, as Professor Peter Rayner concludes in his Foreword to FoE's recent critique of carbon trading, A Dangerous Obsession, 'Far from proving to be an economically efficient instrument, carbon trading and offsetting have been beset by inefficiency and, in places, corruption'.
Economics argues for its superiority on the basis of its impeccable theory:
in this case the relevant model is the perfectly competitive market.
So to what extent can the market for carbon approximate to this perfect market?
Before we can have a market we need to have a product – only then can the iron law of price-setting via supply-and-demand swing into operation.
What is carbon?
In the case of carbon trading it is pretty difficult to pin down what that product is. Is it a small piece of the global atmosphere?
If so, how is that defined?
And how much CO2 can be considered to fill it up?
With a slippery product like this, it is hard to see how a market could possibly work efficiently.
More fundamentally, the product has no existence independent of politicians who create permits which are later sold.
So, unlike a pair of shoes or a new car, it is a product with no real physical existence and is thus inevitably subject to the sort of political manipulation that results in corruption.
Carbon markets
The justification that a market system is better than a system of regulation – or just outright banning of further CO2 production – is that markets distribute products efficiently.
In this case, so the argument runs, the companies that can reduce emissions most cheaply will sell permits to those who would need to spend more to reduce their emissions. Permits are thus allocated in a neutral way to ensure the best outcomes for all by the miraculous invisible hand.
But according to the very same theory, markets only work efficiently when a number of fairly stringent conditions can be met.
The first condition is that there should be a large number of players in the market, none of whom is powerful enough to exert any significant influence over the prevailing price.
To what extent is this a fair description of the market for carbon permits?
No market at all
What we are dealing with is no sort of market at all, but rather a political system where the product is created by governments, a limited number of powerful corporations compete over it – and this is to say nothing of the financial intermediaries who are operating to distort the free operation of the market, as identified by the recent FoE report.
The whole theory of perfectly competitive markets specifically excludes certain categories of goods, which are defined as being inconsistent with a competitive system of supply and distribution, namely public goods and goods which enjoy a natural monopoly.
In the case of the carbon market, it could be argued that whatever the amorphous good might be, it would seem to fulfil both these criteria automatically, the global atmosphere being a clear example of a public good, and the right to pollute it being controlled monopolistically by the governments who have established the carbon market.
Impossible
The financial crisis has demonstrated that the commitment to free markets by powerful business interests has always been purely rhetorical.
Creating a carbon market is a policy being pushed by these same businesses not because it will work efficiently to reduce GHG emissions, but because they expect to gain from it.
They have no more interest in free competition than they do in sharing democracy with the nations of Africa or protecting the planet for future generations.
It is a shame that the economists who recite the neoclassical catechism have been less than assiduous in pointing out the fundamental impossibility of the concept of a carbon market.
How to stop doubting and love the climate models
Does human activity affect Earth's climate?
A simple question, no?
It's been settled with a ringing "yes" among the scientific community.
Yet, the so-called "climate debate" still pops up on editorial pages, political blogs and television talk shows. Apparently, we scientists have failed to explain to the entire public how we have come to understand the climate system.
For this, we owe readers another attempt to engage those who still feel there is some doubt about the role of human activity in Earth's climate.
What follows is a no-frills, nonpartisan explanation of how a group of scientists working on a particular problem establish knowledge, what we know, and what we need to learn.
Understanding the underlying science is important, and not just because our elected leaders in Congress are debating policy options to combat global warming, including a cap-and-trade program aimed at reducing greenhouse gas emissions. It's also important that we move the discussion to what we should be arguing about -- how to mitigate the effects of global warming -- once we get beyond the distraction of false debates over whether climate change is real and caused by humans.
A good place to begin:
On Saturday, the International Day of Climate Action will feature rallies and activities around the globe in an attempt to build momentum toward the U.N. Climate Negotiations in Copenhagen in December. See www.350oregon.org for a list of activities around Oregon.
And, Nov. 18, Al Gore, winner of the 2007 Nobel Peace Prize along with the U.N.'s Intergovernmental Panel on Climate Change, will speak at Keller Auditorium as part of the Portland Arts and Lectures Series.
How do scientists establish consensus, and what is the role of consensus documents like the IPCC reports?
The complexity of the climate system dictates that the details of its function are investigated by a large, interdisciplinary community of scientists, spanning fields from geology, hydrology and plant biology, to atmospheric dynamics and chemistry.
The big picture comes into focus only when combining the results of specialists in each discipline, synthesizing an enormous body of scientific literature.
The scale of this endeavor makes the establishment of consensus critical to the progress of science.
We select an interdisciplinary team to periodically dedicate themselves to the review of the state of climate science.
Supported by the United Nations, the Intergovernmental Panel on Climate Change Working Group I conducts a thorough review of the peer-reviewed scientific literature on climate change science.
This group is made up of experts in the various subfields of climate science, whose careers are built on their reputations as careful scientists. Their authorship amounts to further staking their reputation on the conclusions, encouraging the thoroughness and appropriate scientific skepticism of the review.
These consensus documents provide an internationally recognized assessment of what we know, but also, very importantly, help guide the science forward by clearly stating where the greatest uncertainties remain.
What is the role of computer models?
Why should we trust them?
Once the progression of research, scientific publication and peer review has arrived at an explanation of individual atmospheric processes, such as cloud formation, we need a testable framework for understanding the massively interconnected climate system.
For this, we develop computational climate models. Multiple research groups around the world have independently developed such models, providing an excellent means of testing their accuracy:
If 12 independently written computer models roughly agree on a prediction, it is highly unlikely to be due to a random error. Models are tested by running "hindcasts" (as opposed to forecasts) to determine their skill at predicting known past trends, from the well-measured 20th century climate to the ice ages. The evidence we have that human-induced emissions have contributed to the observed temperature increase over the 20th century is shown in the graphs below.
A set of climate models from research groups around the world was run with and without human-induced carbon dioxide emissions. The model results have some spread, corresponding to remaining uncertainties in the details of the climate system.
However, the observed temperature trend only falls within the model range when the human contribution is included.
Scientific progress occurs when observations (temperature increase) can be explained by a physical model (human-enhanced greenhouse effect). This is the same way that we understand gravity:
I can't accurately predict how rapidly an object will accelerate towards the Earth without a model that incorporates the mass of the Earth.
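The attribution logic described above can be sketched as a toy check. All the numbers below are invented for illustration; the point is only the structure of the argument: the observed trend falls inside the ensemble range only when the human contribution is included.

```python
# Toy illustration of the model-attribution logic: compare an observed
# warming trend against two model ensembles, run with and without the
# human CO2 contribution. All numbers are invented for illustration.

natural_only = [0.0, 0.1, -0.05, 0.15, 0.05]  # deg C/century, 5 toy models
with_human = [0.6, 0.8, 0.7, 0.9, 0.75]       # same models + human CO2
observed = 0.74                                # toy "observed" trend

def within_range(value, ensemble):
    """True if the value falls inside the ensemble's spread."""
    return min(ensemble) <= value <= max(ensemble)

print(within_range(observed, natural_only))  # False: natural forcing fails
print(within_range(observed, with_human))    # True: human term is needed
```

Real attribution studies are of course far more careful statistically, but the underlying comparison is this one.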
What aspects of climate science are firmly established?
The global average surface temperature has increased by about 1.8 degrees Fahrenheit since 1850, with most of the increase since 1950.
This warming can only be explained when including human contributions to atmospheric carbon dioxide (CO2). Before the Industrial Revolution, CO2 concentrations were less than 280 parts per million (ppm) for at least 1 million years; the current atmospheric concentration is 385 ppm.
Volcanic eruptions cause temporary cooling because of the particle haze they emit into the upper atmosphere; similarly, pollution emissions of particles from human activity have offset some CO2 warming over the past century, although the extent of this offset remains highly uncertain.
Warming is evident in other indicators as well, including rising sea level and decreasing polar sea ice and glaciers. Because CO2 remains in the atmosphere for centuries, and because oceans can store enormous amounts of heat, warming would continue for centuries even if emissions were to instantly cease.
What are some remaining uncertainties?
Here's what we should be arguing about:
What level of CO2 in the atmosphere is safe?
The specter of accelerating climate change and the possibility of crossing a tipping point are causing heightened concern among scientists. Doubling of pre-industrial CO2 levels (550 ppm) is one oft-cited target, for which average global temperatures are predicted to increase by 5.4 degrees Fahrenheit.
In contrast, some climate scientists promote aiming to preserve conditions similar to those under which life on Earth developed, which would require reducing CO2 levels from the current level of 385 ppm to 350 ppm.
These alternatives would require massively different carbon policy choices, and we must decide as a public how much risk we are prepared to take.
Is "geo-engineering" an option? A few governments around the world have recently begun discussing the possibility of offsetting global warming by introducing additional reflective particles into the atmosphere, to reduce sunlight reaching the surface.
At what point do we become sufficiently concerned about crossing tipping points that we should seriously consider this as a stopgap measure?
Almost no research has been conducted on geo-engineering, and many disconcerting open questions remain:
Will the UV-protecting ozone layer be damaged, as it is after volcanic eruptions inject natural particles into the upper atmosphere?
How far will the particles disperse, and how long will they remain aloft?
How will plant life respond to diminished and more diffuse sunlight?
If we wish to seriously consider such action as a "bridging" strategy while we work to reduce atmospheric CO2 to a safe level, urgent research is required to understand the consequences.
The ultimate question is, how do we get there from here?
Once we define a target CO2 level, we must begin the difficult discussions of how to get there in a way that is globally equitable and cost-effective.
This is the grand challenge, and one scientists are not equipped to address. We simply advocate for moving the "debate" from false distractions to the realm of finding solutions. We're all on this little spaceship called Earth together.
Juliane Fry has a Ph.D. in atmospheric chemistry from California Institute of Technology.
She teaches chemistry at Reed College.
The precious gifts beneath Oregon's trees
Sometime today, look outside.
There are gifts out there, presents that you, your children and future generations of Oregonians may discover and enjoy over and over again.
Yes, this is a lean Christmas for a lot of Oregonians. Of course, there is no softening the blow of persistent unemployment.
But for the state's forests, rivers and fish and wildlife -- and all of the Oregonians who love and treasure them -- this is still a time worth celebrating.
Look outside:
There's new wilderness, new hope for the shallow and sick Klamath River and new impetus for an agreement to restore forests and badly needed sawmill jobs in Oregon's rural eastside communities.
The 202,000 acres of new Oregon wilderness signed into law by President Obama in March is a gift that will endure forever in Oregon.
The new wilderness -- the first in Oregon in 25 years -- at Mount Hood, the Columbia Gorge, Copper Salmon, Soda Mountain, Spring Basin and the Badlands near Bend will provide lasting protection to some of Oregon's most beautiful and precious places.
The wilderness bill, the culmination of years of work by Oregon's congressional delegation and thousands of other people, will protect 315 square miles of Oregon, 90 miles of wild and scenic rivers, 7 million trees and 240 miles of trails.
Look outside, there's more:
Negotiators are closing in on a final agreement that would lead to the breaching of four Klamath River dams, the largest dam removal in the history of the world.
Moreover, the deal promises to resolve decades of bitter disputes over the allocation of Klamath Basin water.
All this will still require years of work and hundreds of millions of dollars, but history may show that 2009 was the year that Oregon, California and the federal government took the first big step to restore what was once among the most productive salmon rivers on the West Coast, bringing economic hope to coastal fishing communities while simultaneously providing certainty to farmers throughout the Klamath Basin.
This also may go down as the year Oregon managed to break the 20-year deadlock between environmentalists and the timber industry over the management of federal forests. Last week, Sen. Ron Wyden announced that many of the state's leading conservation groups and some of its most prominent timber executives had agreed on legislation to triple logging on the dry public forests of central and eastern Oregon while permanently protecting old-growth trees and streams.
That agreement, if enacted by Congress and signed by the president, could help restore overgrown, fire-prone forests and create hundreds of sawmill jobs across much of rural Oregon.
Even more important, it could spur negotiations for a similar agreement on westside forests to improve forest health, boost the timber industry and help rural communities.
There are still major unresolved environmental issues in Oregon, including the litigation over the federal recovery effort for salmon in the Columbia Basin.
But even with salmon, dam operations have improved, passage is much safer, fish habitat has been improved and the projected returns of chinook next spring are well above average.
Of course, there's much more work to be done on wilderness, the Klamath, logging and Columbia salmon.
But after decades of fighting about these issues and making little progress, this was a remarkable year.
Look outside again:
The rich biodiversity of Soda Mountain near Ashland, the clear-blue water of the Elk River near Port Orford, the twisted ancient junipers of the Badlands near Bend, all have won permanent protection.
There's hope for the Klamath and the tribes and farmers who depend on the river. There's real progress on eastside forests.
Baffin Island reveals dramatic scale of Arctic climate change
Study delves back into 200,000 years of history to demonstrate the devastating impact of global warming
A frozen lake on a remote island off Canada's northern coast has yielded remarkable insights into how the Arctic climate has changed dramatically over 50 years.
Muddy sediment from the bottom of the lake, some of it 200,000 years old, shows that Baffin Island, one of the most inhospitable places on Earth, has undergone an unprecedented warming over the past half-century.
Scientists believe the temperature rise is probably due to human-induced warming.
It has more than offset a natural cooling trend which began 8,000 years ago.
Instead of cooling at a rate of 0.2C every 1,000 years – a trend that was expected to continue for another 4,000 years because of well-known changes to the Earth's orbit around the Sun – Baffin Island, like the rest of the Arctic, has begun to get warmer, especially since 1950.
The Arctic is now about 1.2C warmer than it was in 1900, confirming that the region is warming faster than most other parts of the world.
Baffin Island, the largest island in the Arctic Canadian Archipelago, is subjected to prevailing northerly winds that keep average temperatures at about minus 8.5C, well below similar Arctic locations at a comparable latitude.
Polar bears, arctic fox and arctic hares walk the island's territory while narwhal, walrus and beluga whale patrol its coastline.
The island is dotted with lakes, the bottoms of which have been periodically scoured by glaciers with each passing ice age.
However, scientists have found that the sediments at the bottom of some of the lakes, which build up each year rather like tree rings, have survived this scouring process intact.
This has enabled the scientists to take core samples going back tens of thousands of years. One such lake on Baffin Island, known as CF8, has been found to have layers of sediment dating back 200,000 years, which makes it the oldest lake sediment bored from any glaciated parts of Canada or Greenland, according to the study published in the journal Proceedings of the National Academy of Sciences.
It is the CF8 lake that has provided scientists with the sediment core showing the unprecedented warming of Baffin Island over the past few decades, compared with a time span going back 200,000 years, a period which included two ice ages and three interglacial periods – and roughly the time that Homo sapiens has been on earth.
"The past few decades have been unique in the past 200,000 years in terms of the changes we see in the biology and chemistry recorded in the cores," said Yarrow Axford of the University of Colorado at Boulder. "We see clear evidence for warming in one of the most remote places on earth at a time when the Arctic should be cooling because of natural processes."
The scientists found that certain cold-adapted organisms in the layers of sediment have decreased in frequency since about 1950.
Larvae from species of Arctic midge, which only live in cold conditions, have abruptly declined and two species in particular have disappeared altogether.
Meanwhile, a species of lake alga or diatom that is better suited to warmer conditions has increased significantly over the same period, indicating longer periods when the lake's surface was free of ice, the scientists said.
Other sediment measurements support a dramatic reversal of the natural cooling trend, they said.
As part of a 21,000-year cycle, the Arctic has been receiving progressively less summertime energy from the Sun for the past 8,000 years because of a well-established "wobble" in the Earth's axis of rotation – the Earth is now 0.6 million miles further from the Sun during an Arctic summer solstice than it was in 1BC. This decline will not reverse for another 4,000 years, but changes to the climate of Baffin Island show that the cooling it should have caused has gone into reverse in the past few decades.
A separate team of scientists analysing Arctic lakes in Alaska found a similar warming trend in recent years compared to sediment records going back a few thousand years. They, too, concluded that the warming was unprecedented and could be explained by human activities, namely the build-up of man-made carbon dioxide in the atmosphere.
"The amount of energy we're getting from the Sun in the 20th century continued to go down, but the temperature went up higher than anything we've seen in the last 2,000 years," said Nicholas McKay of the University of Arizona in Tucson.
"The 20th century is the first century for which how much energy we're getting from the Sun is no longer the most important thing governing the temperature of the Arctic," said Dr McKay, when the study was published last month in the journal Science.
Baffin Island:
An ancient trading post
*Baffin Island lies between Greenland and the northern coast of Canada and, for all its remoteness and inhospitable climate, it may have played an important role as a staging post on the first-ever transatlantic trade route.
Archaeologists have found wooden items and a length of yarn at Nunguvik in the south which they believe indicate that visiting Vikings were interacting with the local natives, known as the Dorset people, who lived on Baffin Island between 500BC and AD1500.
The scientists believe that the Dorset, who dressed in animal skins, did not know how to spin yarn, unlike the Vikings. The three-metre strand, found frozen in the tundra, was spun from arctic hare fur mixed with goat hair, similar to yarn found at Viking settlements on Greenland.
There are no goats on Baffin Island.
Further evidence comes from one of the wooden carvings which shows two faces chin to chin.
One has the features of indigenous North Americans, whose ancestors had an Asian origin, while the other shows a long, narrow face and nose with a heavy beard – a portrait perhaps of a visiting Viking.
Earth-friendly dining:
Eat frozen salmon
Go local.
Eat organic.
Buy fresh.
Those food mantras continue to make waves among environmentally conscious consumers. But, as is often the case in these climate-conscious times, if the motivation is to truly make our diets more Earth-friendly, then perhaps we need a new mantra:
Buy frozen.
Several years ago, the three of us -- two ecological economists and one food-system researcher -- teamed up in an effort to understand how to develop sustainable food systems to feed a planet of 9 billion people by 2050.
As the focus of our study, we chose salmon, an important source of protein around the world and a food that is available nearly anywhere at any time, regardless of season or local supply.
We examined the salmon's life cycle:
how the fish are caught in the wild, what they're fed when farmed, how they're processed and transported and how they're consumed.
What did we find in our research?
When it comes to salmon and total carbon impact, the questions of organic versus conventional and wild versus farmed matter less than whether the fish is frozen or fresh.
In many cases, fresh salmon has about twice the environmental impact of frozen salmon.
The reason:
Most salmon consumers live far from where the fish was caught or farmed, and the majority of salmon fillets they buy are fresh and shipped by air, which is the world's most carbon-intensive form of travel.
Flying fillets from Alaska, British Columbia, Norway, Scotland or Chile so that 24 hours later they can be served "fresh" in New York adds an enormous climate burden, one that swamps the potential benefits of organic farming or sustainable fishing. (Disclosure:
A nonprofit subsidiary of Ecotrust, the North Pacific Fisheries Trust, lends money to sustainable fisheries.)
Fresh, wild fish is wonderful and healthful, and if it's driven a reasonable distance to market, then its relative environmental impact is low.
This is a great benefit of living in the Pacific Northwest.
Fortunately for conscientious diners far away from waters with wild salmon, when fish is flash-frozen at sea, its taste and quality is practically indistinguishable from fresh.
More important, it can be moved thousands of miles by container ship, rail or even truck at much lower environmental impact than when air-freighted.
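The scale of that difference can be illustrated with order-of-magnitude freight emission factors. The factors and the route below are illustrative assumptions (real values vary widely by aircraft, vessel, load factor and fuel), not figures from the study, but they show why mode of transport dominates the comparison.

```python
# Rough comparison of air versus sea freight for a salmon fillet, using
# order-of-magnitude emission factors. These factors are illustrative
# assumptions; real values vary by route, load factor and fuel.

AIR_G_PER_TONNE_KM = 500   # long-haul air freight, approx. g CO2/tonne-km
SHIP_G_PER_TONNE_KM = 15   # container ship, approx. g CO2/tonne-km

def freight_co2_kg(mass_kg, distance_km, g_per_tonne_km):
    """Kilograms of CO2 to move a given mass a given distance."""
    return (mass_kg / 1000) * distance_km * g_per_tonne_km / 1000

# 1 kg of fillet over a hypothetical ~5,400 km route (roughly
# Anchorage to New York).
air = freight_co2_kg(1, 5400, AIR_G_PER_TONNE_KM)
sea = freight_co2_kg(1, 5400, SHIP_G_PER_TONNE_KM)
print(f"air: {air:.2f} kg CO2, ship: {sea:.2f} kg CO2, ratio ~{air/sea:.0f}x")
```

Under these assumptions the air-freighted fillet carries tens of times the transport emissions of the same fillet moved by container ship, which is why freezing, not farming method, dominates the footprint.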
If seafood-loving Japanese consumers, who get most of their fish via air shipments, were to switch to 75 percent frozen salmon, it would have a greater ecological benefit than all of Europe and North America eating only locally farmed or caught salmon.
Unfortunately, we discovered that most current popular thinking about how to change world systems for the better often misses the point.
Rather than a singular focus on food miles or growing "organic" salmon in land-based farms, we can achieve greater environmental and community benefit by learning to work with nature, not against it.
For example, chasing fish in the open ocean with big diesel engines is harmful and expensive.
Salmon happily return to us each year -- letting them do it and reducing ocean traffic is a good idea.
Then, a simple production change such as flash-freezing carries the benefit even further.
Is the future full of fish sticks?
No. But when it comes to eating seafood from halfway around the world, we need to get over our fetish for fresh.
With the challenges facing the world's oceans mounting, buying frozen is a powerful choice that concerned eaters everywhere can make.
Copyright 2009, The New York Times
Astrid Scholz is the vice president of knowledge systems at Ecotrust in Portland.
Ulf Sonesson is a researcher at the Swedish Institute for Food and Biotechnology.
Peter Tyedmers is a professor at Dalhousie University in Halifax, Nova Scotia.
The Economic Case for Slashing Carbon Emissions
Amid a growing call for reducing atmospheric concentrations of CO2 to 350 parts per million, a group of economists maintains that striving to meet that target is a smart investment — and the best insurance policy humanity could buy.
by frank ackerman
The climate change news from Washington is cautiously encouraging.
No one in power is listening to the climate skeptics any more; the economic stimulus package included real money for clean energy; a bill capping U.S. carbon emissions emerged, battered but still standing, from the House of Representatives, and might even survive the Senate.
This, along with stricter emission standards in Europe and a big push for clean energy and efficiency standards in China, provides grounds for hope for genuine progress on emissions reduction.
But while climate policy is finally moving forward, climate science is moving faster. One discovery after another suggests the world is warming faster, and climate damages are appearing sooner, than anyone had expected.
Much of the policy discussion so far has been aimed at keeping the atmospheric concentration of CO2 below 450 parts per million (ppm) — which was until recently thought to be low enough to prevent dangerous levels of warming.
But last year, James Hansen, NASA's top climate scientist, argued that paleoclimatic evidence shows 450 ppm is the threshold for transition to an ice-free earth.
This would imply a catastrophic rise in sea levels, eventually flooding all coastal cities and regions.
To avoid reaching such a crisis stage, Hansen and a growing number of others now call for stabilizing CO2 concentrations at 350 ppm.
The world is now around 390 ppm and rising; since CO2 persists in the atmosphere for a long time, it is difficult to reduce concentrations quickly.
In Hansen's scenario, a phaseout of coal use, massive reforestation, and widespread use of carbon capture and storage could allow the world to achieve negative net carbon emissions by mid-century and reach 350 ppm by 2100.
Can we afford to reduce atmospheric concentrations of CO2 to 350 ppm by the end of this century?
To address this question, Economists for Equity and Environment (www.E3Network.org) — a group dedicated to applying and developing economic principles to protect human health and the environment — conducted a study of "The Economics of 350."
Why the wide range of cost estimates?
At first glance, there is a bewildering range of estimates of the costs of climate protection.
Look more closely, however, and there are just a few projections of economic disaster, out in right field by themselves. Other estimates range from modest costs to small net economic gains.
The outliers are the handful of private consultant studies funded by partisan lobbying groups such as the U.S. Chamber of Commerce and the National Association of Manufacturers. Using proprietary models (or their own adaptations of standard models), and pessimistic economic assumptions, these studies forecast that even mild U.S. proposals, such as last year's Lieberman-Warner bill, would cost many thousands of dollars per household and would cause widespread unemployment and economic dislocation.
An analysis by journalist Eric Pooley documents the excessive, often uncritical attention given to these studies by the media.
These projections of economic ruin have not been reproduced by any major academic or non-profit research group.
Many economic models find that the modest steps called for in recent U.S. proposals would have very small costs and virtually undetectable effects on total employment — as documented in a report by Nathaniel Keohane and Peter Goldmark for the Environmental Defense Fund.
But to reach 350 ppm, we will have to go far beyond the emission reductions considered in recent U.S. proposals. How much will it cost to reach this more ambitious target?
Until recently, most economic research focused on higher targets such as 450 ppm or more.
There are, however, four major climate economics modeling groups — all at European universities — that have analyzed the costs of reaching 350 ppm.
One group starts from the (realistic) assumption of high unemployment, and finds that long-run employment and economic growth would be increased by a program of public investment in green technology and emissions reduction that leads to 350 ppm.
The other three groups adopt the common assumption that short-run unemployment can be ignored in long-run models. They generally find that the needed emissions reductions will cost an average of 1 to 3 percent of world economic output, for some years to come.
Other studies have reached more optimistic conclusions about costs. McKinsey & Company, an international consulting firm, has carried out detailed studies of the costs of hundreds of emission-reducing technologies. They find that some emissions can be eliminated for no cost or even an economic savings; more than half of worldwide business-as-usual emissions in 2030 could be eliminated at very small total cost.
The net costs of reducing carbon emissions (i.e. investment costs, minus the value of energy saved) go down when the price of oil goes up, and vice versa.
McKinsey's entire package of reductions, eliminating more than half of world emissions, would have zero total cost if the price of oil were $90 per barrel.
Studies from major environmental groups, including Greenpeace and the Union of Concerned Scientists (UCS), have reached even more optimistic conclusions than McKinsey.
Both Greenpeace and UCS project substantial economic savings from emission reduction, with fuel savings much larger than the costs of investment.
Both assume high oil prices — up to $140 per barrel for Greenpeace — along with rapid change in emissions-reduction technologies.
Deciding whether it's worth the price
The range of cost estimates for reaching 350 ppm, combined with uncertainties about oil prices and future technologies, makes it difficult to choose a single estimate of the total economic cost.
Suppose that, for the sake of argument, 2.5 percent of world output must be spent on climate stabilization for years to come.
Is that an unacceptably large number?
Imagine an economy growing at 2.5 percent every year (a little slower than the recent U.S. average). Suppose it skips one year's growth — all too easy to imagine in 2009 — and then resumes growing.
That makes GDP 2.5 percent smaller than it would have been, forever. So the "skip year" has the same effect as spending 2.5 percent of output on climate protection every year. Household incomes would take 29 years to double, instead of 28.
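The "skip year" comparison above is easy to verify; a minimal sketch in Python, assuming steady compound growth at 2.5 percent a year:

```python
import math

growth = 0.025  # assumed steady annual growth rate

# Years for income to double at 2.5% compound growth:
# solve (1 + g)^t = 2  =>  t = ln(2) / ln(1 + g)
years_to_double = math.log(2) / math.log(1 + growth)
print(round(years_to_double, 1))  # ~28.1 years

# Skipping one year's growth leaves output 2.5% lower forever,
# which simply shifts the entire growth path back by one year.
print(round(years_to_double + 1, 1))  # ~29.1 years
```

Rounded to whole years, that is the "29 years instead of 28" comparison in the text.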
Alternatively, we know we can afford to devote 2.5 percent of income to protection against a remote but disastrous threat — because we already do, year after year. In 68 countries, military spending exceeds 2.5 percent of GDP. In the United States and China, the top greenhouse gas emitters, military spending absorbs more than 4 percent of GDP. Both countries would be safer, not more vulnerable, if they diverted half of their defense spending to defense against climate crisis.
The most important conclusion of our research involves what we did not find.
There are no reasonable studies saying that a 350 ppm stabilization target will destroy the economy.
This is not surprising.
The ominous recent research on potential climate damages does not examine the cost of doing something; instead, it looks at the cost of doing nothing about emissions.
If the worst happens, our grandchildren will inherit a degraded Earth that does not support anything like the life that we have enjoyed.
On the other hand, if we prepare for the worst but it does not quite happen, we will have invested more than was absolutely necessary — in perfect hindsight — in clean energy, conservation, and carbon-free technologies. Which extreme presents the greater danger?
Climate risk and insurance
Think about climate risk as an insurance problem.
You don't buy fire insurance because you're sure your house will burn down; rather, you are not, and cannot be, sure enough that it will not burn down.
Likewise, projections by Hansen and others of dangerous climate risk from staying above 350 ppm CO2 are not certainties; they are necessarily uncertain (although becoming more likely as temperatures rise).
The analogy to insurance is important but inexact; there is no climate insurance company to which the world can hand 2.5 percent of output, if that is what it costs. There is, however, a need for large-scale investment, both in proven emissions-reducing technologies and in research and development.
The role of government in climate policy is not only to set appropriate price signals through a carbon tax or cap-and-trade system; the public sector must also guide research on clean energy technologies. Despite free-market mythology to the contrary, this has worked well in the past.
Wind power is profitable today as a result of decades of government investment in the United States and Europe.
In another arena, the U.S. government essentially invented microelectronics in the 1950s and 1960s:
At first, almost all transistors, integrated circuits, and the like were bought by agencies such as the Pentagon and NASA, because no one else could afford them.
Just a few decades of massive government purchases of these items turned microelectronics into the premier private-sector success story of the late-20th century, transforming everyone's life in countless unexpected ways.
The climate crisis challenges us to do it again, to invent the new technologies and industries that will transform life in the mid-21st century and beyond.
We know it's possible:
We can afford to protect the climate, and leave a livable world to future generations.
Pulling CO2 from the Air:
A tiny pinwheel spins in the desert breeze atop the roof of the Global Research Technologies headquarters in Tucson.
For seven months, the pinwheel has endured the blazing Arizona sun, blistering heat, wind, dust, and — finally — torrential rains. At the end of it all, the synthetic resin that makes up this seeming child's toy has pulled carbon dioxide from the air that flowed through it and, with the rains, released it again.
The pinwheel is one of the first demonstrations of a technology that may one day be in great demand this century:
devices that can extract from the air some of the billions of tons of heat-trapping CO2 being generated by industrial society.
Known loosely as "artificial trees" for their ability to mimic a plant's own uptake of carbon, such "air capture" technology has been touted as one of the most promising of the many proposed geoengineering schemes that could be used to cool an overheated planet.
"If we really do get into a situation where we realize that we've changed the atmosphere too much for our own well-being, there are at least ways to back off of that," argues climate scientist Ken Caldeira of the Carnegie Institution of Washington at Stanford University, an expert on geoengineering.
"There's no fundamental limit on how much you could scale those activities up.
It's mostly a matter of how many resources you throw at it."
Recent reports from the U.K.'s Royal Society and the Institution of Mechanical Engineers singled out air capture as the safest and potentially most effective of proposed geoengineering technologies. Although air capture is certainly not without its environmental impacts, the two groups noted that other geoengineering schemes — such as seeding the oceans with iron to stimulate the growth of CO2-absorbing algae, mimicking a volcanic eruption to shade the planet, or launching mirrors into space to deflect the sun's energy away from Earth — could have far more unpredictable and potentially destabilizing effects.
Proponents of air-capture technology acknowledge it is far from a perfect solution and will not enable humankind to continue spewing CO2 into the atmosphere with impunity.
First, although it has been successfully tested on a small scale, air capture is at least five years away from being tested on a larger scale and, after that, could take at least two decades before it could be widely deployed.
Second, to set up enough artificial trees to make a dent in reducing the vast amounts of CO2 being produced by humanity would require massive production at enormous expense.
"The cost estimates for capturing CO2 from ambient air are gross underestimates," says principal research engineer Howard Herzog at the Massachusetts Institute of Technology.
"It's actually still a question whether it will take more energy to capture CO2 than the CO2 associated with [fossil fuel] energy in the first place."
Even if artificial trees do prove capable of pulling large amounts of CO2 from the air, scientists then face the problem of what to do with that carbon dioxide.
Underground sequestration — one possible solution — is still in the experimental stages. And deploying such artificial trees on a mass scale will have significant environmental costs, including producing the electricity needed to run them, the large land area the air capture devices would occupy, and the manufacture and installation of devices using resins, plastics, and other substances that could release air pollutants.
As the Royal Society report notes, air capture could "require the creation of an industry that moves material on a scale as large as (if not larger than) that of current fossil fuel extraction, with the risk of substantial local environmental degradation and significant energy requirements."
In short, to extract enough CO2 from the atmosphere to begin to lower temperatures would require decades of building millions of air-capture devices that have been refined to minimize their environmental impact.
Political scientist Roger Pielke, Jr. of the University of Colorado-Boulder estimates that 650 billion tons of carbon will need to be disposed of by 2100 to keep atmospheric concentrations of CO2 around 450 parts per million, a level that could easily lead to temperature rises of 2 degrees C (3.6 F) or higher.
"You need 30 years of development time and 100 years of deployment before you start to see the effect you're looking for," says oceanographer John Shepherd, who led the Royal Society study of air capture and other geoengineering technologies.
That said, if humanity fails to rein in its greenhouse gas emissions, the need for air capture technology could be urgent.
After all, concentrations of atmospheric carbon dioxide have reached 387 parts per million (ppm), more than 100 ppm higher than pre-industrial levels and quickly moving beyond what some consider to be a safe level of 350 ppm.
Since the establishment of the United Nations Framework Convention on Climate Change in 1992, fossil fuel CO2 emissions have grown by more than 30 percent and overall human-caused emissions have now reached roughly 30 billion tons per year.
"Unless future efforts to reduce greenhouse gas emissions are much more successful than they have been so far," the Royal Society wrote in its September report, "additional action may be required should it become necessary to cool the Earth this century."
Pulling CO2 from the air is simple chemistry.
After all, a bottle of sodium hydroxide — also known as lye, a key ingredient in everything from soap to pulp and paper — must be kept carefully sealed.
That's because the strong base — the chemical opposite of an acid — rapidly sucks up CO2 when exposed to air, neutralizing the lye and transforming it into sodium carbonate and, ultimately, baking soda.
The captured CO2 can then be extracted via the industrial process of heating the compound above 900 degrees C in a kiln, releasing the CO2, and enabling the sodium hydroxide to regenerate its ability to suck up yet more CO2.
The process works, but as physicist Klaus Lackner at Columbia University's Earth Institute — one of the scientists behind the GRT pinwheel — explains, "The energy to pry out the CO2 is very high."
That's why Lackner has moved in the direction of finding a strong base resin, such as Dow Chemical's Marathon A, typically used to produce purified water. The synthetic resin in the pinwheel absorbs CO2 to form bicarbonates when dry, but then spits out the CO2 when exposed to water. "Basically, we can swing between being dry and wet," Lackner says. "Let the resin sit in air, because air will dry it, and it will absorb CO2, taking an hour to load up.
Make it wet, and it's an hour to unload."
This type of device could be housed in an "oversized furnace filter," about three feet wide by eight feet long, loosely filled with sheets of the resin, constituting the leaves of this artificial tree.
Such a device could capture CO2 for less than $300 per metric ton, though it wouldn't function in cold climates or the humid tropics.
A number of experiments involving air capture technologies are underway, ranging from efforts to use solid amines — ammonia transformed into compounds capable of bonding with CO2 — to technologies now used to capture some flue gases from exhaust at fossil fuel-fired power plants. Scientists also are attempting to use algae — the workhorses of the Earth's natural carbon cycle — to cleanse the air of excess CO2.
That could have the benefit of creating a new source of fuel or power, since algae incorporate nearly as much energy per kilogram as coal.
But as the U.K.'s Institution of Mechanical Engineers put it, algae bioreactors "are a fledgling technology and at the moment are too expensive to be commercially viable."
Artificial trees, on the other hand, could be available as soon as next decade.
The mechanical engineers believe a demonstration could occur as soon as 2014, followed by a full-scale "artificial forest" by 2018 and global deployment by 2040.
In the long term, such air capture theoretically has the potential to cancel out human emissions of CO2, according to earth system scientist Tim Lenton of the University of East Anglia.
Assuming that CO2 can be pulled from ambient air, that still leaves the other half of the problem:
storing it safely somewhere.
Efforts to capture CO2 from coal-fired power plants have seized upon geologic sequestration as a potential solution.
The U.S. Department of Energy estimates that the continental U.S. alone has room for 3.9 trillion tons of CO2 underground, more than enough room for the 3.2 billion tons emitted every year by large industrial sources. Still, major questions remain about underground sequestration, including its impact on groundwater supplies, subterranean pressure, and the potential for the CO2 to leak back into the atmosphere.
Certain geologic formations may offer a solution by mimicking the chemical transformation of air capture itself. Basalt formations — a residue of volcanic activity — can absorb CO2 and, over decades, transform it into minerals. An experiment by Reykjavik Energy to prove the concept by injecting the CO2 from a geothermal power plant into basalt beneath the surface is underway in Iceland, which is primarily composed of the igneous rock.
Even if technology and storage issues are resolved, CO2 air capture will require significant amounts of new electricity to power the devices. Lackner proposes a new fleet of nuclear reactors or widespread solar power.
The Institution of Mechanical Engineers estimates that it could take as many as 10 million air-capture devices sucking up one metric ton of CO2 per day to absorb just 3.6 billion tons — about one-tenth of current global emissions. The costs of deploying these devices could be staggering.
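The scale implied by those figures is straightforward arithmetic. A rough sketch, using the Institution of Mechanical Engineers' device numbers and the approximately 30 billion tons of annual CO2 emissions cited earlier in the article:

```python
devices = 10_000_000      # proposed air-capture units
tons_per_day = 1          # metric tons of CO2 captured per unit per day
annual_emissions = 30e9   # rough global CO2 emissions, tons per year

annual_capture = devices * tons_per_day * 365
print(f"{annual_capture / 1e9:.2f} billion tons captured per year")   # 3.65
print(f"{annual_capture / annual_emissions:.0%} of annual emissions")  # 12%
```

Ten million devices, in other words, would offset only about a tenth of what humanity emits each year — before counting the energy needed to run them.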
Climatologist James Hansen estimates it would cost roughly $20 trillion per 50 ppm of CO2 removed.
"It's on the scale of the global military effort," the Carnegie Institution's Caldeira says. "The tragedy is there's no reason to be considering these options at all if we could just learn to cooperate [on reducing emissions], but the evidence that we are learning to cooperate is not very strong."
Still, Lackner remains undeterred.
By the end of the year, he hopes to have a small demonstration of his resin-based artificial tree — looking more like a mobile home with a large pinwheel on top — running at Columbia University.
Physicist David Keith of the University of Calgary will launch his air capture company, which uses amines to extract CO2, in October.
"If we had lots of money and things went really well, we could build a pilot plant in five years," Keith says. "I'm not saying we will be.
This field is filled with people's overconfidence."
The Royal Society's Shepherd said that, given the expense of air capture technology, "the first line of defense would be carbon capture and storage and taking it out at the point of emission."
But air capture could be effective in offsetting emissions from sources such as airlines, Shepherd said.
The challenges — and expense — of air capture also serve as a stark reminder to policy makers that the best tactic for combating climate change is to pursue energy efficiency and renewable energy programs and avoid emitting CO2 in the first place.
As the Royal Society report notes:
"The safest and most predictable method of moderating climate change is to take early and effective action to reduce emissions of greenhouse gases. No geoengineering method can provide an easy or readily acceptable alternative solution to the problem of climate change."
The Growing Specter of Africa Without Wildlife
On the road from Nairobi into the Great Rift Valley not long ago, a 48-year-old Kenyan taxi driver named Jagata Sospeter pointed out how the landscape had changed in his memory — here a soccer field where rhinos were once commonplace, there a river where hippos used to live, and everywhere, as Kenya's human population continues to boom, the endless sprawl of shambas, tin-roofed farmhouses surrounded by an acre or two of parched maize plants in place of open range.
The one consolation, in a nation where tourism accounts for 10 percent of the gross domestic product, was that wildlife was at least secure within Kenya's national parks and protected areas.
But a new study says that sense of reassurance is false, with wildlife disappearing just as fast inside Kenya's national parks as out.
According to an analysis by David Western and his co-authors, wildlife declined by 41 percent in national parks from 1977 to 1997, and the decline does not appear to have slowed since then.
The study, published in PLoS ONE, was commissioned by the Kenya Wildlife Service (KWS), which manages the country's national parks. Shortly after the Western study appeared, a KWS spokesman announced that Kenya's lion population, "the symbol of national strength," is now declining so fast that lions could be extinct there within 20 years.
The Western study has attracted little media attention inside Kenya or beyond, in part because control of the national government remains a more pressing issue in the aftermath of a controversial 2007 presidential election.
Reaction within the conservation community was also muted, in part because Kenya has long been notorious for mismanaging its wildlife — and also because Western himself is a long-time participant in the national bickering about how to fix the problem.
But the new study follows a raft of recent papers reporting similar declines in other protected areas across the continent.
There are notable exceptions to this trend, particularly in the southern African nations of Namibia, Botswana, and South Africa.
But together with continuing increases in human population, these studies raise the specter of an Africa without animals.
During the rampant poaching of elephants and rhinos in the 1980s, the outside world was shocked at the improbable idea that wildlife could survive in Africa only in parks patrolled by armed guards, and often behind fences. But if wildlife continues to decline as rapidly inside national parks as out, it could lead, biologists Tim Caro and Paul Scholte predicted in a 2007 article in the African Journal of Ecology, to "a continent containing isolated pockets of large mammal diversity living at low population sizes. Just like Europe."
The new Kenya study compiled data from 270 wildlife counts over 30 years, mostly focused on antelopes, the prey base on which lions, leopards, African wild dogs and other predators depend.
Western, who was director of KWS in the 1990s, described it as "the first time we've taken a good look at a national park system in one country, relative to all of the wildlife populations across the whole country."
The paper notes that the wildlife declines "raise grave concerns about the adequacy of parks and point to the need for a radical review of conservation policies."
But despite the trends revealed in his study, Western disputed the idea that Kenya could soon lose all its wildlife.
Elephants and rhinos seemed to be going down to extinction in the 1980s, he said in a telephone interview with Yale Environment 360, but they didn't "because people were alerted to this threat" and took action.
He argued that the same kind of shift is happening now, especially as studies provide hard statistical evidence of the decline.
Western, a long-time advocate of involving local communities in wildlife management, argued that the answer is to continue expanding the focus "from national parks only to parks plus private lands and communal lands."
In the past, the benefits of tourism generally flowed to tour operators and KWS, he said, leaving nothing for local people, who naturally came to regard wildlife as a threat rather than a benefit.
But wildlife populations are holding on, he said, in areas with "local participation."
Co-author Samantha Russell cited the example of the Shompole conservation area and tourist lodge managed since 2000 by the Masai community on the Tanzania border near Lake Natron.
"They've had wildlife increases and they're very proud of that fact," she said.
Asked if the project has produced the sort of community benefits that Western sees as the key to changing attitudes toward wildlife, she said, "In theory, there's a lot of money to be made."
But she conceded that "benefit sharing is always a tricky one to work out."
Despite considerable optimism and international support over the years, community management schemes have frequently failed. A 2000 paper by Alexander Songorwa of the Tanzania Wildlife Division recited a lengthy catalogue of impediments, including government reluctance to turn power back to locals, resistance from national park services, the inability of illiterate locals to handle new accounting systems, and lack of wildlife management expertise.
But Western argued that much has changed in the years since Songorwa wrote his article, with Kenya training hundreds of local wildlife scouts and, more recently, resource assessors to keep track of changes in the habitat.
"Once you give them a voice, you give them opportunity, you give them skills and training, that changes very rapidly.
They're not locked into backwardness, which really that view implies."
James Deutsch, director of the Wildlife Conservation Society's Africa program, praised the Western study for producing the first hard evidence of the "extremely depressing" changes in Kenya's national parks. But he also questioned the study's "black and white conclusions."
He noted that community management success stories rarely come from East Africa these days, but mainly from southern Africa, particularly Namibia, which has a stable national government and a low population density — unlike Kenya.
Deutsch accepted the study's argument that the wildlife decline is due in part to bad design and siting of national parks, which often include only a fraction of the migratory range of major species. But that doesn't explain, he said, why the largest parks in the study suffered the worst declines, while some small parks actually showed increases. He dismissed the study's rosy assessment of the security provided by KWS, and blamed poaching across the border from Somalia for the 78 percent decline in Meru and the 63 percent decline in Tsavo East and Tsavo West National Parks.
Deutsch also noted that the study implicitly re-plays a dispute that has raged in Kenyan wildlife circles for 30 years, "often generating more heat than light."
On one side, Western pushes his community-involvement approach.
On the other, Richard Leakey, another former KWS director, argues for "fences-and-fines."
"For me, the world is complicated," said Deutsch, adding that he'd be interested in a study "that doesn't have an axe to grind from the start."
He rattled off a series of challenges to the survival of wildlife in Kenya — badly-flawed parks, little or no benefits flowing to people living around parks, a lack of income from legal trophy hunting and other consumptive uses of wildlife, the bushmeat trade, political corruption, inadequate protection against poachers — and suggested that in any given situation, either Western's approach or Leakey's might be the right way to go.
Soon after the new study appeared, a columnist in Swara, the East African Wildlife Society quarterly, took the entire Kenyan conservation establishment, including Western and Leakey both, sharply to task:
"The cosy pretence that all is well within Kenya's Protected Areas has been simply blown out of the water...
Have the courage to admit that everything you have recommended, supported, funded and implemented over the last 30 years in Kenya to conserve wildlife has been a failure — or was it your intention to sit idly by while some 70 percent of wildlife vanished from under your very noses?"
Yet the prospects for reversing this grim trend seem small.
The human population of sub-Saharan Africa continues to boom, with a projected increase of a billion more people across the continent by 2050.
So both fences and community-friendly approaches will almost certainly need to work — along with some miraculous remedy still to be devised — if Africa's rich and potentially lucrative wildlife legacy is to last through this century.
A Total Ban on Whaling?
New Studies May Hold the Key
The International Whaling Commission is lurching from crisis to stasis, unable to impose a complete ban on whaling, yet equally unwilling to allow a formal resumption.
So the IWC meets every year and watches ever more whaling ships depart from Iceland, Japan, and Norway in contravention of the spirit — and sometimes the letter — of its 23-year-old whaling moratorium.
At one level, there is a crisis of rhetoric and perception.
Japan's annual hunt, sanctioned under a loophole allowing whaling in the interests of scientific research, is clearly bogus. But would-be whalers say that some proponents of a true moratorium — including former great whaling nations like the U.S. and Britain — are equally guilty of pretend science:
They deny that whale populations have recovered enough during the moratorium for whaling to resume, when in fact they oppose resumption of whaling under any circumstances.
But behind this rancid politics, there is a real unanswered scientific question:
Have some whale populations, such as Atlantic humpbacks, rebounded sufficiently to allow "sustainable" harvesting?
Or have numbers been so decimated by centuries of past hunting that any resumption would be dangerous both for whale populations themselves, and for wider marine ecosystems?
One of the IWC's main management objectives is to prohibit all catches where a particular population is at less than 54 percent of its original population level — sometimes referred to as the ocean's "carrying capacity."
Whalers say many stocks have now risen above this benchmark.
But have they?
How many whales once swam the world's seas?
At a recent meeting in Vancouver, scientists engaged in the decade-long Census of Marine Life agreed that, in the words of Irish delegate Poul Holm, "human pressure on marine life was much earlier, much larger and much more significant than previously thought."
That conclusion, they said, applied especially to marine mammals, including whales.
This is a bombshell for marine biologists. Their conventional view, held at the IWC and elsewhere, is that whale numbers were largely unchanged prior to the arrival of industrial whaling, generally defined as the advent of explosive harpoons in the mid-19th century and — at the start of the 20th century — factory ships with ramps that could load large numbers of slaughtered whales for processing.
But this, it increasingly appears, is nonsense.
Seventeenth and 18th century whalers — in the heyday of chasing whales for oil to make candles, light street lamps, and lubricate machinery — wrecked most of the world's whale stocks long before the arrival of industrial whaling.
It seems that old sailors' chronicles of oceans filled to the horizon with whales were not hyperbolic fantasies. They were often literal truth.
Why do the census historians conclude this?
First, because the sheer volume of historical evidence makes it ever harder to disbelieve.
And second, because new population modeling, some of it based on DNA evidence, provides strong corroboration for the chroniclers.
The History of Marine Animal Populations (HMAP) project, part of the Census of Marine Life, has so far collected records on 70,000 whale encounters that build up a picture of past super-abundance, says Andy Rosenberg of the University of New Hampshire.
Meanwhile, the genetic variation found in a handful of whale populations so far analyzed suggests that those that remain came from much larger populations than previously supposed.
The analysis is based on the fact that, with succeeding generations, DNA is altered through subtle mutations:
The larger the original population, the greater the genetic "drift" evident in the current population.
For instance, based on its own records, the IWC had concluded that the number of humpback whales swimming the North Atlantic before whalers began to reduce their numbers was around 20,000.
With the current population estimated at 10,000, the humpback population could soon be considered sufficiently large to allow the resumption of whaling.
But when Stanford University's Stephen Palumbi and Joe Roman analyzed the population's DNA in 2003, they concluded that the pre-exploitation figure was 12 times greater, with a population once numbering 240,000.
This suggested that the North Atlantic's original humpback whale population was more than 20 times larger than the present population, and any claims that the population has recovered sufficiently to allow a resumption of whaling are well wide of the mark.
Palumbi's findings also suggest that the global population of humpbacks may once have been around 1.5 million, rather than the 100,000 estimated by the IWC.
Among the best-documented stories of whale extermination is that of Arctic bowheads. England's most famous whaler, William Scoresby, was one of hundreds of Dutch and English whaling captains who headed into the iceberg-infested waters around Greenland in the late 18th century to catch bowhead whales. They had only wooden vessels powered by wind and sail, and hand-held harpoons. But, guided by Scoresby — a brilliant sailor who invented the crow's nest as a lookout for ice and whales — they sailed audaciously among the ice floes, where the whales congregated each spring, and brought back ever-larger hauls of blubber, to be boiled up on the dockside at ports like Scoresby's home town of Whitby.
According to Robert Allen of the University of British Columbia in Vancouver, who modeled rates of harvesting and the reproductive rates of the species, eastern Arctic bowheads numbered almost a million animals when the Greenland hunt began.
But within a few decades, the bowheads were virtually all gone.
Whitby whalers gave up going north in the 1830s.
Today, almost 200 years on, the population of bowheads west of Greenland is still only around 1,200, and that east of Greenland, once the biggest whaling ground in the world, has simply disappeared altogether.
Whaling before the explosive harpoon and the factory ship may have been technically primitive, but it was a huge industry, with thousands of ships setting sail.
The New England port of New Bedford, Massachusetts, was once known as "the city that lit the world" because of its production of lamp oil from whale blubber.
One whale population after another was decimated, before the fleets moved on through the oceans, heading finally for southern waters. HMAP researchers estimate that in the 18th century, the waters off New Zealand were home to around 27,000 southern right whales. But by 1925, they had been reduced to 25 reproducing females.
"The systematic destruction of the great whales was a stupendous act of modern ecological folly..."
Jeremy Jackson of Scripps Institution of Oceanography in San Diego writes in a recent paper in Whales, Whaling and Ocean Ecosystems. "(T)he ecological consequences of the removal of so many behemoths must have been profound."
He argues that the new findings on the past profusion of whales show our conventional view of marine food chains is upside down.
Modern oceans are dominated by small and lowly creatures. They comprise most of the biomass, with larger species further up the food chain comprising ever less biomass.
But, says Jackson, this "trophic pyramid" may be an artifact of human hunting over the centuries rather than the natural state.
In the natural state, he argues, the trophic pyramid was probably the other way around, with biomass dominated by megafauna.
Millions upon millions of whales were eating out the oceans as fast as they could go.
Most marine biologists, Jackson says, continue to dismiss this idea out of hand, despite "the most careful and detailed historical descriptions [which] are commonly dismissed as untestable anecdotes."
A wood carving from the 18th century shows Dutchmen hunting bowhead whales in the Arctic.
They do this, he says, because anecdotes of past huge whale numbers — though numerous — lack scientific rigor. And some argue that the DNA evidence relies on the assumption that distinct populations of whales did not interbreed in the past.
Jackson calls this "unbridled anti-historical determinism that flies in the face of everything we have learned about… ecosystems."
Jackson offers a startling new picture of life on Earth, a world in which giants ruled, and ruled in huge numbers. And if Jackson is right, then even the most "recovered" of today's whale populations are only a tiny fraction of their former numbers.
That is one cause for thinking very hard before allowing any resumption of whaling.
Another is that the world is only now learning the true scale of the abuse of science and wholesale flouting of IWC hunting quotas that has marked most of the six decades since the commission was established in 1946.
We may know how boats paid for by Unilever hunted the oceans with such vigor in the 1930s that an entire fleet was impounded by concerned Norwegian authorities in 1936 — one of the incidents that led to the commission's formation.
We may know how Greek shipping magnate Aristotle Onassis made a whaling fortune in the 1950s and reportedly covered the bar stools in his favourite yacht with the downy skin from the scrotums of sperm whales.
But it is only in the past two years that the Marine Fisheries Review has published in English the memoirs of Russian whaling scientists, supposedly monitoring fleets in the 1950s and 1960s to ensure they complied with IWC rules. Former inspector Alfred Berzin described how the most notorious flouter of IWC law, Alexei Solyanik, killed 25,000 humpback whales off Antarctica in two seasons from 1959 to 1961, almost all outside the five-day legal hunting season.
At the start, said Berzin, "there were so many whales that the helicopter pilots joked they could make an emergency landing on the backs of humpbacks that were close to each other."
By the end, they were mostly gone, and Solyanik moved on to take other species like sperm and minke whales. Meanwhile, Australian biologists reported a catastrophic loss of the humpback population, but it was decades before the truth emerged that Russian whalers had decimated it.
Such flagrant breaches of whaling law might not be possible today.
The story suggests, however, that when greed is involved on the high seas, the chances of tightly controlling the size of catches are small.
But the bigger story may be that whenever a harpoon enters a beleaguered whale, we are not harvesting a sustainable resource, we are messing with the remnant giants of deeply traumatized ecosystems about which we know staggeringly little.
To Make Clean Energy Cheaper, U.S. Needs Bold Research Push
Energy Secretary Steven Chu recently called for "Nobel-level" breakthroughs and a "second industrial revolution" in energy technology to overcome the world's interlinked energy and climate challenges.
Chu's implication:
We currently lack the technologies we need to fully avert catastrophic global warming.
His admonition:
America must dramatically accelerate the development of clean energy technology.
Chu has it right.
The task is clear:
To renew the U.S. economy, respond to global climate change, foster the nation's energy security, and help provide the energy necessary to sustainably power global development, America must transform its outdated energy policy.
Innovation and its commercialization must move to the center of energy system reform.
The nation must move urgently to develop and harness a portfolio of clean energy sources that are affordable enough to deploy on a mass scale throughout the U.S. and the world.
In short, we must make clean energy cheap.
Putting a price on carbon will take us part of the way, but not nearly far enough.
To make the revolutionary shift to a low-carbon economy, we propose a bold new research paradigm:
the creation, over time, of several dozen renewable energy research hubs around the nation.
These centers — known as energy discovery-innovation institutes, or e-DIIs — would be established with a combination of federal, state, university, and private funds and would take the lead in accelerating the development of reasonably priced alternative energy technologies and bringing them to the marketplace.
The Brookings Institution's Metropolitan Policy Program — joined by a number of leading universities, regional alliances, and corporate partners — has laid out a detailed plan for launching a network of energy innovation institutes around the country.
In the Southwest, the institutes might focus on advancing solar technologies. Centers in the Great Lakes region might speed development of advanced battery technology or hydrogen fuel cells. Energy innovation institutes in the Great Plains might focus on developing sustainable, non-food sources of biofuels.
Individual energy innovation institutes would vary not only by theme but also by size, with some centers operating with budgets as small as $10 million to $15 million per year, while others — the most successful and ambitious — might see their budgets grow to as large as $100 million to $200 million a year, making them as robust as large academic medical centers. Those energy research institutes that "delivered the goods" in terms of clean energy breakthroughs within 5 to 10 years would grow; those that did not would die.
Ultimately, total federal investment in the energy research institutes could grow to $6 billion per year.
Such an ambitious plan is needed to meet the enormous challenges we face:
Over the next four decades, global energy demand is expected to triple.
At the same time, global greenhouse gas emissions must fall rapidly, decreasing at least 50 to 85 percent by mid-century to avert disruptive climate change.
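Taken together, those two projections imply a dramatic cut in the carbon intensity of energy, that is, emissions per unit of energy used. A quick illustrative calculation, using only the figures cited above:

```python
# If global energy demand triples while total emissions fall 50-85 percent,
# emissions per unit of energy must fall to a small fraction of today's level.
# Simple arithmetic on the article's figures; nothing here is a projection.

def intensity_change(demand_factor, emissions_factor):
    """Required change in carbon intensity: future emissions / future demand,
    both expressed relative to today."""
    return emissions_factor / demand_factor

for cut in (0.50, 0.85):
    remaining = intensity_change(3.0, 1.0 - cut)
    print(f"{cut:.0%} emissions cut -> intensity must fall to "
          f"{remaining:.0%} of today's level")
```

On these numbers, the carbon intensity of the world's energy supply would have to drop to somewhere between roughly 5 and 17 percent of its current level, which is why incremental efficiency gains alone cannot close the gap.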
Most of the growth in energy demand will occur in the developing world, as nations like China, India, and Brazil continue to lift their citizens out of poverty and build modern societies. Overall, that's a very good thing:
Increased access to energy brings electricity to pump and treat potable water, lights to read and study by, access to modern health care, relief from backbreaking physical labor, and much more.
The problem, however, is that fossil fuels remain cheap and abundant.
That means that in the absence of similarly affordable and large-scale clean energy sources, the nations of the developing world will turn to coal and other fossil fuels to power their development, just as we in the United States have done.
And that would virtually assure massive climatic destabilization, regardless of what occurs in the developed nations of the world.
Hence the dominant climate policy agenda of our time:
Captivated by market logic and sophisticated regulatory schemes, mainstream climate policy advocates have focused above all on utilizing market-based mechanisms and price signals — such as carbon taxes and cap-and-trade plans — to make dirty energy more expensive.
According to this approach, the resulting price signals would spur private-sector investment and innovation in clean energy alternatives and secure the energy technology transformation we need.
But there is one complication:
Policymakers and the public alike are reluctant to increase the price of energy significantly through higher prices on carbon emissions. At a time of deep economic recession, public tolerance for higher energy prices wanes.
In the developing world, the message is even clearer, summed up by one Chinese official, Lu Xuedu, of the Office of Global Environmental Affairs. "You cannot tell people who are struggling to earn enough to eat that they need to reduce their emissions," declared Lu.
None of which means that putting a price on carbon is a bad idea.
Internalizing the many costs of burning dirty fuels is long overdue.
But in the midst of a recession, any carbon-pricing scheme the current Congress manages to enact will likely price carbon at levels far below those needed to adequately stimulate rapid deployment of clean energy technologies.
Moreover, price signals alone can only do some of the work, because even ambitious pricing will not fully address the many non-price-related market failures standing in the way of a cleaner and more efficient energy system.
Nowhere is this clearer than in the innovation game.
Private energy firms chronically under-invest in energy R&D because they cannot fully capture the future returns of their innovations, which spill over to their competitors. And in contrast to the consumer electronics, pharmaceutical, and computer industries, too few early adopters in the energy sector are willing to pay five times more to own the latest, greatest gadget, which means that emerging energy technologies must become competitive much more quickly to survive.
That makes private companies reluctant to make bets on new clean energy technologies. The energy industry remains one of the least research-intensive industries in the economy.
The upshot:
private-sector investment in energy R&D will almost certainly proceed more slowly than the climate challenge demands.
Which brings us to the crux of the matter:
If we can't rely solely on market signals to make clean energy cheap, the national government must play a more active role.
Once again, as it did in laying down the railroads and highways and stimulating the electronics and biotech explosions, the federal government must make significant investments in activities and infrastructure.
This includes supervising the construction of a modern, nationwide smart grid that will open the way for private investments in new energy sources by allowing the flexible distribution of renewable power. The government must also make direct investments to support the deployment of emerging clean energy technologies, driving economies of scale, and bringing down prices. And, finally, the government must increase the federal R&D budget by an order of magnitude while deploying some of its investments in radically new ways.
Current federal spending on non-defense, energy R&D remains far too small.
The current outlay of about $4 billion is less than half of the amount spent, in real dollars, in 1980.
New infusions from the recent economic stimulus package will boost the current figure, as will an additional $15 billion a year to be set aside for clean-tech investments from the cap-and-trade system projected by President Obama's recent budget outline.
Still, such investments will likely come in below the $20 billion to $30 billion per year in federal clean energy R&D that many analysts believe is needed to address the climate and security threats posed by the nation's fossil fuel dependence.
At the same time, federal energy research remains outdated and fragmented.
To begin with, the U.S. Department of Energy (DOE) really isn't even in the energy innovation business. The bulk of the department's funding and operational competence remain focused on managing — and cleaning up after — the nation's sprawling nuclear weapons system.
Otherwise, what energy research efforts do exist within DOE remain insular, focused largely on weapons development, and generally organized around an "energy technology of the year," instead of the integrated approach demanded by the nation's complex array of energy challenges. The bulk of DOE's research activities — despite excellence in certain areas — remain too removed from the marketplace to quickly develop and commercialize clean energy innovations.
To fully mobilize the entire national research enterprise — universities, federal laboratories, and corporate R&D centers — the nation needs to begin creating a network of energy innovation institutes. Developed by the National Academy of Engineering, the energy innovation concept is characterized by institutional partnerships, interdisciplinary research, technology commercialization, education, and outreach.
The institutes would place a high priority on collaboration, commercialization, and performance.
What sort of research would likely be conducted at the energy innovation institutes?
One or two would surely mount an aggressive push on new materials and designs for low-cost solar panels. Others might lead the charge on the development of massive, cost-effective energy storage; new battery chemistry that will lead to truly viable electric vehicles; better plant growth and biotechnology conversion to provide affordable biofuels; superconductive transmission lines for efficient electrical power distribution; new techniques for sequestering carbon or even capturing it from the air; and much more.
The institutes would be geared toward rapid technology transfer designed to maximize the economic and social impact of new renewable technologies. Breakthrough inventions could be protected by intellectual property laws. Depending on the region, energy innovation institutes could take shape at university campuses, through national labs, at other institutes, or even virtually. A competitive awards process would seek to target the best proposals from wherever they originate.
And the regional nature of the centers is key:
the institutes would draw on local workforces, scientists, businesses, and other resources to rapidly deploy new technologies that respond to local challenges and stimulate local economic development.
Successful examples of effective, practical collaboration among universities, government, and industry already exist.
One such enterprise is the Energy Biosciences Institute, a renewable energy research group involving the University of California at Berkeley, the University of Illinois at Urbana-Champaign, Lawrence Berkeley National Laboratory, and the oil giant, BP.
In important ways, the energy innovation institute concept represents a contemporary adaptation of the research paradigm created through the land-grant acts passed by Congress in the 19th century.
Then, federal investments established a network of university-based agricultural and engineering experiment stations, augmented by extension services capable of interacting directly with the marketplace.
That program was instrumental in developing and deploying the technologies necessary to build a modern industrial nation for the 20th century, while stimulating local economic growth.
Today, the U.S. needs a similarly bold campaign to enlist America's universities, laboratories, and companies in solving one of the most complex and important problems — the transition to a clean-energy economy — that the nation has ever faced.
As Rain Forests Disappear, A Market Solution Emerges
Environmentalists attempting to preserve the vanishing Amazon rain forest now confront a stark paradox:
Never before have they succeeded in protecting so much of the world's largest tropical forest, yet never before has so much of it simultaneously been destroyed.
The key question today is whether new models of conservation — including an increasingly popular, market-based program known as REDD — will be able to reverse the steady loss of tropical forests, not only in the Amazon, but also in Indonesia, Borneo, and Africa's Congo basin, where virgin woodlands continue to be razed at an unprecedented rate.
Since 2000, foreign donors, working with the Brazilian government, have spent hundreds of millions of dollars to place 386,000 square miles of the Amazon — an area nearly as large as France and Spain combined — in protected areas. Yet during that same period, logging, farming, ranching, and development in the Amazon have destroyed a forest area half the size of Norway.
With land prices fast appreciating, cattle ranching and industrial soy farms expanding, and billions of dollars of new infrastructure projects in the works, development pressure on the Amazon will only accelerate.
And the situation is worse in Indonesia's tropical forests, now being felled for timber and the creation of oil palm plantations. Indeed, since 2001, deforestation in Brazil and Indonesia alone has led to the loss of 116,000 square miles of forest.
If these trends continue, one of the world's greatest ecosystems — its tropical forests — will be whittled away piece by piece, with dire consequences not only for the diversity of life on earth but also for the world's climate, already warming at an alarming rate.
Hope for avoiding the worst outcomes in the Amazon, Indonesia, and other tropical forest regions increasingly rests today on the belief that markets will soon pay for the services provided by healthy rain forests, which include biodiversity maintenance, rainfall generation, carbon sequestration, and moderation of the world's climate.
This idea is quickly gaining momentum as a wide range of interests — including global banks and financiers, development experts, government policymakers, and environmentalists — have embraced the concept that the best way to keep forests intact is to pay rain forest nations, and the people who inhabit the forests, not to chop them down.
The concept is known as REDD — "reducing emissions from deforestation and degradation" — and it is more than an abstract theory:
* On Sumatra, which is suffering from an epidemic of deforestation, the global financial firm Merrill Lynch — now owned by Bank of America — is working with the local governor and international conservation groups to create a $432 million investment fund to preserve large swaths of the rain forest in Aceh province, while also encouraging sustainable development.
* In Guyana, a London investment firm has purchased the rights to the so-called "ecosystem services" in a 1,432-square-mile patch of tropical forest — rights it hopes to eventually market so that funds can be made available to the area's 7,000 inhabitants for limited development.
* In Borneo, an Australian investment outfit has established a global fund to reduce forest fires and restore the rain forest canopy in a forest reserve that has been illegally logged.
For the moment, these efforts are isolated and halting.
To create a critical mass, the world community must mandate sharp reductions in greenhouse gas emissions and place a price on carbon.
Meetings are taking place this week at a U.N. climate conference in Poznan, Poland, on these very topics, in advance of an effort next year to replace the 1997 Kyoto Protocol with new limits on greenhouse gases.
The goal is to get to the point where, thanks to caps on greenhouse gas emissions and a global carbon market, billions of dollars are raised each year to invest in REDD, or "avoided deforestation," projects. Corporations pursuing commercial interests could end up protecting forests worldwide, while poor countries could find a new way to capitalize on their natural assets without destroying them.
"Forests fall because they are worth more cut down than standing," says Andrew Mitchell, director of the Global Canopy Program, a tropical forest research and conservation group.
"This is a classic example of a market failure, but ecosystem services could change that."
Despite its promise, REDD remains controversial and faces many challenges, including ensuring land rights and financial benefits for forest inhabitants; establishing baselines to accurately measure reductions in deforestation rates; preventing "leakage," in which conservation measures in one area shift deforestation to another; and addressing concerns that developed countries would merely invest in REDD projects as a way of continuing to emit large quantities of greenhouse gases. But with the weakness of traditional conservation measures becoming glaringly obvious as deforestation rates increase, and with growing awareness that destruction of tropical forests is a major source of greenhouse gas emissions, there is growing support for the idea that a market-based approach to preserving forests may be the best hope.
The conservation group WWF once opposed REDD programs out of concern that they were a mechanism to absolve rich countries of the need to reduce greenhouse gas emissions. But in September, in the face of ongoing forest loss, WWF president and CEO Carter Roberts said his group would now support REDD projects as a critical component of addressing climate change, as roughly 20 percent of the world's greenhouse gas emissions come from deforestation.
Noting that if the Amazon were a country, deforestation there would place it in the top seven emitters of greenhouse gases worldwide, Roberts said, "Unless the world has policies that recognize the value of standing trees and forests, we will have failed."
The concept that developing countries should be compensated for reducing emissions from deforestation and forest degradation has been around for more than a decade, but it failed to gain traction, in part because of fears that wealthy nations could buy their way out of emissions cuts. Some also feared — and still worry — that the rights and interests of indigenous people would be ignored as governments, carbon traders, and speculators secured rights to the ecosystem services provided by tropical forests without the consent of the people who live there.
In places where land rights are poorly defined, such claims could be used to evict people from lands upon which they have been living for generations.
"REDD (projects) . . . will only achieve lasting results if they are adapted to conditions on the ground and help meet the needs of local people," the Forests Dialogue on Climate Change — a coalition of indigenous people, trade unions, governments, and others — said in a statement last month.
Emerging REDD efforts have been taking those interests into account.
And many tropical forest experts say that avoided deforestation programs may offer a better alternative than the status quo, which has long led to the displacement of native peoples from their lands at the hands of loggers and developers.
REDD advocates are also winning support from non-traditional partners, including humanitarian organizations, faith-based aid groups, governments, and the World Bank.
Research suggests that once a viable international carbon market exists, pure economics alone may boost REDD. In areas where infrastructure is poor and forests are abundant, REDD may offer attractive economic returns for rural communities. Several studies in Indonesia and Brazil have shown that locals could earn far more if they received REDD funds to support sustainable development than they would from conventional logging or conversion of forests to farming.
The world's only sizable carbon market, in the European Union, prices carbon at $20 per ton.
If investors seeking carbon offsets were to pay even a fraction of that amount to help preserve tropical forests in Indonesia, for example, it would dwarf the financial benefits the country receives from forestry – currently only $0.34 per ton of carbon released, one study has shown.
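The gap in those per-ton figures is the heart of the economic argument. A rough comparison, using only the numbers cited above (the 10 percent fraction is an arbitrary illustration, not a market forecast):

```python
# Comparing the article's two per-ton values: the EU carbon price versus the
# forestry return Indonesia reportedly earns per ton of carbon released.
eu_price = 20.00        # $/ton, EU carbon market price cited in the text
forestry_value = 0.34   # $/ton of carbon released, from the study cited

fraction = 0.10  # suppose offset buyers paid just 10% of the EU price
offset_value = fraction * eu_price
print(f"Offset value at 10% of EU price: ${offset_value:.2f}/ton "
      f"({offset_value / forestry_value:.0f}x the forestry return)")
```

Even at a tenth of the European price, standing forest would be worth roughly six times what cutting it currently earns per ton of carbon, which is what makes the "dwarf" claim plausible.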
Further, because REDD is compatible with sustainable harvesting of forest products and low-impact ecotourism, it could become an integral part of rural development schemes.
The Woods Hole Research Center estimates that using REDD programs to reduce deforestation in the Brazilian Amazon to nearly zero within a decade would cost $100 million to $600 million per year. The Eliasch Review, a British government-commissioned report on REDD, estimates that a cap-and-trade system that includes forest carbon could generate more than $7 billion per year — primarily through purchase of carbon offsets — to finance forest conservation.
Seeing the enormous potential of REDD, governments and investors are already positioning themselves for a forest carbon market.
Last December, Merrill Lynch became the first major U.S. bank to invest in an avoided deforestation project, putting $9 million towards rain forest conservation in Sumatra.
The bank hopes to lock up forestry carbon credits while they are cheap and sell them at a higher price if carbon markets emerge.
The deal — involving Australia-based Carbon Conservation, Merrill Lynch, Flora and Fauna International, and the provincial government of Aceh — could generate hundreds of millions in carbon financing over the next 30 years by preventing logging and conversion of forest to oil palm plantations.
Aceh Governor Irwandi Jusuf sees the initiative as a key step in the region's recovery from the devastating 2004 tsunami and three decades of civil war. To support the project, Irwandi has imposed a moratorium on logging, hired more than 1,000 former fighters as rangers, and laid out plans for the development of environmentally sustainable businesses.
In March a private equity firm took the unprecedented step of purchasing the rights to environmental services generated by a 1,432-square-mile rain forest reserve in Guyana.
London-based Canopy Capital is effectively betting that the services generated by a living rain forest will eventually see compensation in international markets. Eighty percent of the profit will go to local communities through micro-credit loans for sustainable economic activities.
"The only way we are going to turn this thing around is through a profit motive," said Hylton Murray-Philipson, director of Canopy Capital.
"This is what is needed to harness the power of markets. But it doesn't stop with making a profit—we are also going to have to deliver a better living for local people.
We need to start valuing the intrinsic parts of the forest as an intact entity rather than having to convert it for something else."
The World Bank is helping jumpstart projects in more than two dozen countries with its $300-million Forest Carbon Partnership Facility, designed to help nations earn compensation through REDD projects.
Brazil favors a different approach, outlined at this month's climate talks in Poznan, Poland.
Brazil plans to establish a voluntary fund into which
developed countries, companies, and other entities pay to reduce emissions from deforestation.
With complete control over how the funds are spent and no allocation of conventional carbon credits to contributors, the initiative maintains Brazil's sovereignty over the Amazon and gives it an unprecedented financial incentive to preserve the region's forest cover. The fund aims to raise $21 billion by 2021, and Norway has committed up to $1 billion to the scheme by 2015, contingent on Brazil's success in reducing deforestation.
What Brazil does is crucial, since it is home to more than 60 percent of the Amazon and accounts for nearly half of global tropical forest loss annually.
Brazil has been vague on how the funds will be used, but the Bolsa Floresta program in the state of Amazonas could serve as a model for compensating rural populations for avoiding deforestation.
The program, launched last year, pays forest families living near the Uatuma Reserve about $25 per month in return for not clearing and burning primary forest lands. Residents are also provided with health care, clean water, and greater access to education.
Though experts disagree on methods, few dispute that the world must try a new approach to tropical forest conservation — and fast.
Daniel Nepstad, a leading tropical forest ecologist who now heads up conservation at the Gordon and Betty Moore Foundation, suggests we are already approaching a critical tipping point in the Amazon where the world's largest rain forest will no longer be able to supply the vital ecological services it currently provides.
"The Amazon rain forest has already entered a dieback, in which the vicious cycle between land use, seasonal drought, and fire is rapidly degrading enormous swathes of rain forest each year," said Nepstad, noting that if present deforestation rates continue, half of the Amazon will be burned, cut, or degraded by 2030.
He maintains that, in addition to greatly reducing future cutting, 100,000 square miles of degraded Amazon land must be regenerated.
Those measures, says Nepstad, would help ensure that roughly three-quarters of the original Amazon rain forest remains intact, an important factor in stabilizing regional and global climate systems.
Brazil's target – a 70 percent reduction in net deforestation from the 1996-2005 baseline by 2018 – is less ambitious, but it acknowledges both the importance of maintaining substantial forest cover in the Amazon and the potential of forest carbon as an economic asset.
In the long run, improved governance, new market-based compensation systems that reward environmental performance, and continued expansion of protected areas are all key to saving forests like the Amazon.
Capturing the Ocean's Energy
Way back in Napoleonic Paris, a Monsieur Girard had a novel idea about energy:
power from the sea.
In 1799, Girard obtained a patent for a machine he and his son had designed to mechanically capture the energy in ocean waves. Wave power could be used, they figured, to run pumps and sawmills and the like.
These inventors would disappear into the mists of history, and fossil fuel would instead provide an industrializing world with almost all its energy for the next two centuries. But Girard et fils were onto something, say a growing number of modern-day inventors, engineers, and researchers. The heave of waves and the tug of tides, they say, are about to begin playing a significant role in the world's energy future.
In the first commercial-scale signal of that, last October a trio of articulated, cylinder-shaped electricity generators began undulating in the waves off the coast of Portugal.
The devices look like mechanical sea snakes. (In fact, their manufacturer, Scotland's Pelamis Wave Power Ltd., takes its name from a mythical ancient Greek sea serpent.) Each Pelamis device consists of four independently hinged segments. The segments capture wave energy the way the handle of an old-fashioned water pump captures the energy of a human arm:
as waves rock the segments to and fro, they pump a hydraulic fluid (biodegradable, in case of spills) powerfully through a turbine, spinning it to generate up to 750,000 watts of electricity per unit.
Assuming the devices continue to perform well, Portuguese utility Energis expects to purchase 28 more of the generators soon.
The completed "wave farm" would feed its collective power onto a single high voltage sea-floor cable, adding to the Portuguese grid about 21 megawatts of electricity.
That's enough to power about 15,000 homes.
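The wave-farm figures above are easy to sanity-check. A minimal back-of-envelope sketch, using only the numbers cited in the article (the per-home wattage is implied by those figures, not stated directly):

```python
# Sanity check on the Portuguese wave-farm figures cited above.
PELAMIS_RATED_W = 750_000   # peak output per Pelamis unit (from the article)
FARM_TARGET_MW = 21         # planned grid contribution (from the article)
HOMES_POWERED = 15_000      # homes served (from the article)

# How many units at rated output would supply 21 MW?
units_needed = FARM_TARGET_MW * 1e6 / PELAMIS_RATED_W

# Implied average load per home.
watts_per_home = FARM_TARGET_MW * 1e6 / HOMES_POWERED

print(f"units at rated power: {units_needed:.0f}")        # 28
print(f"implied load per home: {watts_per_home:.0f} W")   # 1400 W
```

The numbers hang together: 28 units at 750 kilowatts each is exactly 21 megawatts, and 21 megawatts spread over 15,000 homes implies a 1,400-watt average load per household.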
In a world where a single major coal or nuclear plant can produce more than 1,000 megawatts of electricity, it's a modest start.
But from New York's East River to the offshore waters of South Korea, a host of other projects are in earlier stages of testing.
Some, like Pelamis, rely on the motion of waves. Others operate like underwater windmills, tapping the power of the tides.
Ocean-powered technologies are in their infancy, still technologically well behind such energy alternatives as wind and solar. Necessarily designed to operate in an inherently harsh environment, the technologies remain largely unproven and — unless subsidized by governments — expensive. (Portugal is heavily subsidizing the Pelamis project, with an eye to becoming a major European exporter of clean green power in the future.) Little is known about the effects that large wave or tide farms might have on marine ecosystems in general.
Despite the uncertainties, however, proponents say the potential advantages are too striking to ignore.
Eight hundred times denser than air, moving water packs a huge energy wallop.
Like solar and wind, power from moving seas is free and clean.
But sea power is more predictable than either wind or solar. Waves begin forming thousands of miles from coastlines and days in advance; tides rise and fall as dependably as the cycles of the moon.
That predictability makes it easier to match supply with demand.
Roger Bedard, who leads ocean energy research at the U.S. utility-funded Electric Power Research Institute (EPRI) in Palo Alto, says there's plenty of reason for optimism about the future of what he calls "hydrodynamic" power. Within a decade, he says, the U.S. could realistically meet as much as 10 percent of its electricity needs from hydrodynamic power. As a point of reference, that's about half of the electricity the U.S. produces with nuclear power today.
Although he acknowledges that initial sea-powered generation projects are going to be expensive, Bedard believes that as experience grows and economies of manufacturing scale kick in, hydrodynamic power will follow the same path toward falling costs and improving technologies as other alternatives.
"Look at wind," he says. "A kilowatt hour from wind cost fifty cents in the 1980s. Now it's about seven cents." (That's about the same as producing electricity with natural gas, and only about three cents more than coal, the cheapest — and dirtiest — U.S. energy choice.
Any future tax on carbon emissions could narrow that gap even more, as would additional clean-power subsidies.)
For some nations, wave and tide power could pack an even bigger punch.
Estimates suggest, for instance, that the choppy seas surrounding the United Kingdom could supply as much as 25 percent of the nation's electricity.
British alternative energy analyst Thomas W. Thorpe believes that on a worldwide basis, waves alone could produce as much as 2,000 terawatt hours of electricity, as much as all the planet's major hydroelectric plants generate today.
Although none are as far along as Pelamis, most competing wave-power technologies rely not on the undulations of mechanical serpents, but instead on the power captured by the vertical bobbing of large buoys in sea swells. Ocean Power Technologies (OPT), based in New Jersey, drives the generators in its PowerBuoy® with a straightforward mechanical piston. A stationary section of the mostly submerged, 90-foot buoy is anchored to the ocean floor; a second section simply moves up and down with the movement of sea swells, driving pistons that in turn drive an electrical generator. The Archimedes Wave Swing, a buoy-based system developed by Scotland's AWS Ocean Energy, harnesses the up-and-down energy of waves by pumping air to spin its turbines. Vancouver-based Finavera Renewables uses seawater as its turbine-driving hydraulic fluid.
Although Pelamis beat all of these companies out of the commercialization gate, OPT appears to be right behind, with plans to install North America's first commercial-scale wave power array of buoys off the coast of Oregon as early as next year. That array — occupying one square-mile of ocean and, like other wave power installations, located far from shipping lanes — would initially produce 2 megawatts of power. OPT also announced last September an agreement to install a 1.4-megawatt array off the coast of Spain.
OPT's Australian subsidiary is in a joint venture to develop a 10-megawatt wave farm off the coast of Australia.
Meanwhile, Pelamis Wave Power plans to install more of its mechanical serpents — three megawatts of generating capacity off the coast of northwest Scotland, and another five-megawatt array off Britain's Cornwall coast.
The Cornwall installation will be one of four wave power facilities plugged into a single, 20-megawatt underwater transformer at a site called "Wave Hub."
Essentially a giant, underwater version of a socket that each developer can plug into, Wave Hub — which will be connected by undersea cable to the land-based grid — was designed as a tryout site for competing technologies. OPT has won another of the four Wave Hub berths for its buoy-based system.
Other innovators are trying to harness the power of ocean or estuarine tides. Notably, in 2007, Virginia's Verdant Power installed on the floor of New York's East River six turbines that look, and function, much like
stubby, submerged windmills, their blades — which are 16 feet in diameter — turning at a peak rate of 32 revolutions per minute.
The East River is actually a salty and powerful tidal strait that connects Long Island Sound with the Atlantic Ocean.
Although the "underwater windmills" began pumping out electricity immediately, the trial has been a halting one.
The strong tides quickly broke apart the turbines' first- (fiberglass and steel) and second- (aluminum and magnesium) generation blades, dislodging mounting bolts for good measure.
Undeterred, in September Verdant Power began testing new blades made of a stronger aluminum alloy.
If it can overcome the equipment-durability problems, the company hopes to install as many as 300 of its turbines in the East River, enough to power 10,000 New York homes.
A scattering of similar prototype "underwater windmill" projects has been installed at tidal sites in Norway, Northern Ireland, and South Korea. (In addition, interest in moving into freshwater sites is growing. Verdant itself hopes to install its turbines on the St. Lawrence River. At least one other company, Free Flow Power of Massachusetts, has obtained Federal Energy Regulatory Commission permits to conduct preliminary studies on an array of sites on the Mississippi River south of St. Louis.)
The environmental benefits of hydrodynamic power seem obvious:
no carbon dioxide or any other emissions associated with fossil-fuel-based generation.
No oil spills or nuclear waste.
And for those who object to wind farms for aesthetic reasons, low-profile wave farms are invisible from distant land; tidal windmill-style turbines operate submerged until raised for maintenance.
There are, however, environmental risks associated with these technologies.
New York state regulators required Verdant Power to monitor the effects of its turbines on fish and wildlife.
So far, sensors show that fish and water birds are having no trouble avoiding the blades, which rotate at a relatively leisurely 32 maximum revolutions per minute.
In fact the company's sensors have shown that fish tend to seek shelter behind rocks around the channel's banks and stay out of the central channel entirely when tides are strongest.
But a host of other questions about environmental effects remain unanswered.
Will high-voltage cables stretching across the sea from wave farms somehow harm marine ecosystems?
Will arrays of hundreds of buoys or mechanical serpents interfere with ocean fish movement or whale migrations?
What effect will soaking up large amounts of wave energy have on shoreline organisms and ecosystems?
"Environmental effects are the greatest questions right now," EPRI's Bedard says, "because there just aren't any big hydrodynamic projects in the world."
Projects will probably have to be limited in size and number to protect the environment, he says – that's a big part of the reason he limits his "realistic" U.S. estimate to 10 percent of current generation capacity.
But the only way to get definitive answers on environmental impact might be to run the actual experiment — that is, to begin building the water-powered facilities, and then monitor the environment for effects.
Bedard suggests that the way to get definitive answers will be to build carefully on a model like Verdant's:
"Start very small.
Monitor carefully.
Build it a little bigger and monitor some more. I'd like to see it developed in an adaptive way."
Revenge of the Electric Car
The recent high-profile unveiling of the Chevrolet Volt, the hybrid electric car that General Motors hopes will roll into dealer showrooms in late 2010 and rescue the automaker from near-bankruptcy, felt like the opening credits of a movie we've seen before.
After all, there's nothing new about electric cars, hybrid or otherwise – 100 years ago, there were more electric cars on the road than gas-powered ones. Henry Ford even bought an electric car for his wife, Clara.
But the story of the 20th century (or one chapter of it, anyway) is the story of the triumph of the internal combustion engine.
Periodic attempts to revive the plug-in cars have met with failure, or have been willfully squashed (check out Chris Paine's excellent 2006 documentary Who Killed the Electric Car?).
Shortly after the Volt was introduced, GM executive Bob Lutz nearly killed GM's born-again mojo when he admitted in a TV interview that when it comes to global warming, "I don't believe in the CO2 theory."
So much for enlightened corporate leadership.
But does that mean the Volt is just a repeat of the same old movie?
No. For one thing, GM – which lost $15 billion in a single quarter this year – isn't the only company betting its future on electric cars. Virtually every carmaker in the world, from Chrysler to Nissan to Chery, the upstart Chinese automaker, has announced plans to shift away from internal combustion engines toward electric drives.
Today's hybrids follow the model of the Toyota Prius, which uses batteries and an electric motor to assist the gas engine.
Tomorrow's plug-in hybrids – starting with the Volt – will flip this around, using the electric motor as the primary drive, with the gas engine on board simply as a range-extending generator to charge up the battery.
If you drive less than about 40 miles a day, you'll never need the engine – the gas station will be replaced by the outlet in your garage.
Some carmakers, including big players like Nissan and Silicon Valley start-ups like Tesla Motors, are moving straight to all-electric cars. Within the industry, there is much debate about the virtues of plug-in hybrids vs. all-electric cars, but either way, says Willett Kempton, who has been studying electric cars for more than a decade at the University of Delaware, "the transition from gasoline to electricity is now irreversible."
In the world today, electrons are easier to come by than hydrocarbons. To get oil, you have to drill thousands of feet below the surface of the earth – often in a hostile nation – pump it up, refine it, ship it (via pipeline or tanker), then store it until somebody comes along with a thirsty SUV. All in all, an expensive and rigid system.
Electrons, on the other hand, come from many places:
wind turbines, solar panels, hydroelectric dams, nukes and even burning coal.
This simple fact upends everything.
With electric cars, we're not dependent on sheiks in the Middle East.
We're dependent on our own ingenuity.
A secondary virtue of the shift from atoms to electrons is that an electric motor is three to four times better at converting energy into motion than an internal combustion engine.
In fact, the amount of electricity it takes to push an electric car down the road is surprisingly small.
"In a typical day, an electric car uses about as much electricity as four plasma TVs," says Mark Duvall, the head of the electric car program at the Electric Power Research Institute in Palo Alto, California.
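Duvall's four-plasma-TVs comparison is plausible on rough numbers. A quick sketch, where the per-mile consumption, daily mileage, TV wattage, and viewing hours are illustrative assumptions rather than figures from the article:

```python
# Rough check on the "four plasma TVs" comparison.
# All four constants below are illustrative assumptions, not article figures.
KWH_PER_MILE = 0.25   # assumed plug-in efficiency
MILES_PER_DAY = 40    # typical daily driving (matches the Volt's electric range)
TV_WATTS = 400        # assumed plasma TV power draw
TV_HOURS = 6          # assumed daily viewing time

car_kwh = KWH_PER_MILE * MILES_PER_DAY           # 10.0 kWh/day for the car
four_tvs_kwh = 4 * TV_WATTS * TV_HOURS / 1000    # 9.6 kWh/day for four TVs

print(f"car: {car_kwh:.1f} kWh/day, four TVs: {four_tvs_kwh:.1f} kWh/day")
```

Under those assumptions a day's driving and a day of four plasma TVs land within half a kilowatt-hour of each other, which is why utilities see little near-term strain on the grid.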
This means that, for the foreseeable future, we don't need to add a lot of generating capacity to the grid in order to meet the demand for electric cars. (Mike Morris, the chairman and CEO of American Electric Power, believes that up to 20 percent of the U.S. vehicle fleet could be switched over to plug-in hybrids without overtaxing the existing grid.) And higher efficiencies all along the energy supply chain mean that switching from gas to electric cars does not simply shift pollution from the tailpipe to a distant power plant.
Even on a grid that's 100 percent coal-fired, overall CO2 emissions – that is, including pollution from mining and burning coal – with a plug-in car would be lower than overall emissions from a similar-sized car with an internal combustion engine.
According to a recent study by the Electric Power Research Institute and the Natural Resources Defense Council, widespread adoption of plug-in hybrids could reduce annual emissions of greenhouse gases by more than 450 million tons by 2050, the equivalent of taking more than one-third of today's cars and light-duty trucks off the road.
Of course, the cleaner the grid, the lower the emissions. But electric cars can help here, too.
One of the big problems with renewables is intermittency:
The sun doesn't always shine, the wind blows at the wrong time.
For electric cars, this is not a problem.
In Texas, for example, the wind blows most strongly at night – exactly when the power isn't needed.
But what if that power could be used to charge electric cars sitting in the garage?
"Plug-in vehicles are a way of changing from Middle Eastern oil to west Texas wind as a transportation fuel," says Austan Librach, director of emerging transportation technologies at Austin Energy, a large utility in Austin, Texas, that has long promoted the use of electric vehicles.
In Denmark, 20 percent of the electricity is generated by wind – "it is a perfect match for electric cars," says Torben Holm, a consultant at DONG Energy, the country's largest electric power provider. DONG recently announced a deal with Renault and Better Place, a Silicon Valley start-up, to bring electric cars to Denmark.
According to Holm, a single 2-megawatt wind turbine generates enough electricity to power 3,000 cars.
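Holm's turbine-to-cars ratio also checks out roughly. A sketch, in which the capacity factor and per-mile consumption are assumptions (not figures from the article):

```python
# Rough check on the claim that one 2 MW turbine can power 3,000 cars.
TURBINE_MW = 2
CAPACITY_FACTOR = 0.35   # assumed average output fraction for a good wind site
CARS = 3_000
KWH_PER_MILE = 0.2       # assumed plug-in efficiency

# Annual energy from the turbine (8,760 hours in a year).
annual_kwh = TURBINE_MW * 1000 * 8760 * CAPACITY_FACTOR   # ~6.1 million kWh

kwh_per_car = annual_kwh / CARS
miles_per_car = kwh_per_car / KWH_PER_MILE

print(f"{miles_per_car:,.0f} electric miles per car per year")
```

Under those assumptions, each of the 3,000 cars gets roughly 10,000 miles of driving per year from a single turbine's output, which is in the neighborhood of typical annual mileage.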
In the near term, however, the most important benefit of electric cars is that they will accelerate the deployment of the so-called "smart grid."
In fact, plug-in vehicles may be to the smart grid what Halo was to the Xbox:
the killer app that drives everybody to want one.
The problem with our electric grid today is not just that it is big and dirty, but that it is big and dirty and dumb.
We burn energy and have no idea where it is coming from.
"The most revolutionary thing about the Prius," says Dan Dudek, chief economist for Environmental Defense Fund, "is not the hybrid engine.
It's the monitor on the dash that shows you your energy consumption in real time.
It turns us all into savvy consumers, because we see the reaction in real time to our driving.
Imagine if we had that for the rest of our energy consumption?"
If electric cars are going to work, the grid has to evolve into something that looks more like the Internet, with two-way communication and lots of data and context.
For car owners, a smart grid will help them track exactly how much electricity they're consuming and what it costs (just like at a gas pump). For utilities, a smart grid will help manage demand – preventing big power surges at 6 p.m., when everyone comes home and plugs their car in, is a major concern – as well as open the door to a variety of new services, from innovative pricing packages to energy management programs for your home and business.
But the real promise of a smart grid is the ability to turn electric cars into a rolling fleet of batteries that can be tapped on demand, feeding power back into the grid.
"For utilities, the economics of vehicle-to-grid are incredibly compelling," says Willett Kempton.
According to Kempton, the richest market is in frequency regulation of the grid – that is, feeding in small amounts of power to keep the balance between electricity production and demand steady – which he estimates could amount to a market of $10 billion a year in the United States alone.
There is also money to be made in supplying power for peak-load demands on hot summer days, when everyone cranks the A/C.
"In Sacramento, we have 400 megawatts of power that we use four days a year," says Bill Boyce, the transportation supervisor at Sacramento Municipal Utility District in California.
"Instead of keeping these power plants around, what if we could draw that 400 megawatts from parked cars?
This is an idea we're very interested in pursuing."
So are plenty of other progressive power companies. Southern California Edison, Austin Energy, Duke Energy, Wisconsin Power, Xcel Energy, and Pacific Gas & Electric – to name just a few – all have pilot programs to learn more about how to integrate plug-in vehicles with the grid.
Of course, all this is still a long way off.
"It's hard to overestimate the inertia of the old system, and how resistant many people are to change," says Tom Turrentine, head of the Plug-in Hybrid Electric Vehicle Research Center at the University of California at Davis. A recent MIT study on the future of the car suggested plug-in vehicles might capture, at best, 15 percent of the light-duty vehicle market (passenger cars and SUVs) by 2035.
One big uncertainty, obviously, is the price of oil – how high will it rise, and how fast?
Another is the cost and reliability of batteries. Much of the optimism about electric cars is based on assumptions that batteries will evolve like microprocessors, with rapidly declining costs and rising performance.
But what if that turns out to be a false analogy?
The revolution could also be derailed by a VHS-vs.-Betamax-type battle over plug and battery standards.
And in the long run, the happy vision of an OPEC-free world could be tarnished by the monopolistic impulses of electric power companies.
Clearly, dealing with the twin challenges of peak oil and global warming will require far deeper, more radical changes in our lives than simply jumping from gas cars to electric.
As for GM's Volt, despite all the hype it's getting today, it could look like yesterday's news by the time it finally rolls into dealer showrooms. But Chris Paine, director of Who Killed the Electric Car?, isn't betting against it.
In fact, he's already working on a sequel.
It's called Revenge of the Electric Car.
The Corn Ethanol Juggernaut
The huge corn ethanol mandates imposed by Congress a few years ago may be the single most misguided agricultural program in modern American history.
That's saying something, but consider the program's impact:
higher global food prices, increased air pollution from burning ethanol-spiked fuels, spreading dead zones in the Gulf of Mexico from a surge in fertilizer use, and strong evidence that producing a gallon of corn ethanol generates about as many greenhouse gases as burning a gallon of gasoline.
Why then, given these many problems, hasn't Congress rolled back the mandates and stopped this boondoggle?
The answer can be boiled down to a few salient realities of American politics and agricultural policy.
First, even in the subsidy-rich world of U.S. agriculture, corn is king.
Second, the power wielded by the farm state lobby remains enormous. Third, Iowa is Ground Zero for corn, and its pivotal presidential caucuses leave even supposed change agents like Barack Obama bowing before the altar of corn ethanol.
And, finally, once a juggernaut like corn ethanol gets rolling with massive federal support and mandated production levels, bringing it to a halt is enormously difficult — even when study after study shows that relying on corn ethanol as a cornerstone of an alleged renewable energy policy is folly.
The corn sector has long enjoyed staunch backing from Congress. According to the nonprofit Environmental Working Group, between 1995 and 2006, federal corn subsidies, which are provided through a myriad of programs, totaled $56.1 billion.
That's more than twice the amount given to any other commodity, including American mainstays like wheat and cotton, and 105 times more than was paid to tobacco farmers.
Corn ethanol production has long been a favorite of farm state legislators in Congress, who have promoted the fuel as an alternative to the evils of foreign oil.
Congress approved the first ethanol subsidies in 1978, just a few years after the Arab oil embargo of 1973.
"It makes for a good public image — supporting the farmer, supporting the rural economy," says Thomas Elam, an Indianapolis-based agricultural economist.
The problem, he says, is that "it's a special-interest program that spreads the cost of the program across the rest of the economy."
Elam says that the farm lobby collects tens of millions of dollars a year to lobby lawmakers at the state and national levels. States like Iowa and Ohio have their own ethanol associations, which work in tandem with national groups like the Renewable Fuels Association.
In 2006 alone, that group collected about $3.7 million in dues from its members and paid its president, Robert Dinneen, a salary of $300,000 to push the ethanol-is-good message on Capitol Hill.
Additional support for the ethanol mandates comes from groups like the American Corn Growers Association and its larger cousin, the powerful National Corn Growers Association (NCGA), which reported 2006 total revenue of $8.6 million.
The NCGA has some 33,000 dues-paying farmers spread among 48 of the 50 states. On its Web site, the NCGA makes it clear that it aims to "increase ethanol demand" by establishing a federal program that is "part of a comprehensive energy policy."
These interest groups will spend millions of dollars "to keep the mandate where it is," says Jan Kreider, a professor emeritus of engineering at the University of Colorado, who has been studying motor fuels for three decades. "It's a massive political battle to even slow it down," says Kreider. "Once the mandates are in, it's almost a one-way street.
It could take decades to whittle down the size of the mandates."
The staying power of the ethanol mandates is largely due to the decades-long influence of the farm state delegations on Capitol Hill.
As former Sen. Bob Dole of Kansas once explained to Texas oil baron T. Boone Pickens, "There are 21 farm states, and that's 42 senators. Those senators want ethanol."
And the influence of those senators — 15 states now have ethanol production capacity of at least 200 million gallons per year — will be hard to overcome.
Cutting the ethanol mandates will require jousting with two of the most powerful members of the Senate, Republican Charles Grassley and Democrat Tom Harkin.
Both are Iowans. Both are ardent ethanol boosters. And Harkin is the chairman of the Senate Agriculture Committee.
Harkin's position gives him tremendous leverage over any ethanol-related legislation that comes before the Senate.
Which brings us to the Iowa Imperative.
Any candidate who wants to win the White House must have a good showing in the very first presidential nominating contest – the Iowa caucuses. "Candidates have to come here and suspend all critical judgment," says David Swenson, an economist at Iowa State University.
"There is a knee-jerk reaction in Iowa that if you don't support our special interests then you don't love us and we won't vote for you – and that's true even though the vast majority of Iowans don't have anything to do with farming and wouldn't know a crop if it fell on them."
The imperative can be explained by looking at the numbers:
Iowa now has about one-third of the ethanol production capacity in the U.S., and those ethanol plants provide jobs for several thousand Iowans.
Barack Obama understood the Iowa Imperative.
And his strong support for ethanol helped him win the Iowa caucuses.
That win validated his campaign and was a key factor in assuring that he won the Democratic nomination.
And the farm lobby is rewarding Obama.
On Aug. 22, the American Corn Growers Association endorsed Obama.
On the Republican side, John McCain, a long-time ethanol critic, tied for third in Iowa.
In August 2006, six months before the Iowa vote, McCain switched sides in the ethanol debate, telling a crowd in Grinnell, Iowa, that ethanol "is a vital alternative energy source not only because of our dependency on foreign oil but its greenhouse gas reduction effects."
McCain has since switched sides again and is now co-sponsoring a bill — introduced in May by Texas' Kay Bailey Hutchison and 10 other Senate Republicans — to freeze the ethanol mandates. Hutchison argued that the ethanol mandates needed to be limited because they were driving up the price of corn and were "clearly causing unintended consequences on food prices for American consumers."
Her bill would limit the volume of corn ethanol to be blended into gasoline to no more than 9 billion gallons. But current federal rules mandate far greater production:
U.S. oil refiners must be using at least 15 billion gallons of ethanol per year in their gasoline by 2015 and 21 billion gallons by 2022.
Such a sharp increase by 2022 would principally be reached by making ethanol from other materials like switchgrass and wood chips. But this "cellulosic ethanol" has never been produced in commercial quantities.
Hutchison's bill, S. 3031, is stuck in the Senate Environment and Public Works Committee. A hearing has not even been scheduled.
In early August, the Environmental Protection Agency denied a request by Texas Gov. Rick Perry to allow his state to opt out of the federal ethanol mandates. Lower corn prices are a critical issue for livestock producers in Texas, who have been hit hard by soaring corn prices. In denying the request, EPA Administrator Stephen L. Johnson said that the ethanol requirements are "strengthening our nation's energy security and supporting American farming communities" and are not causing severe harm to the economy or the environment.
Furthermore, Congress has passed rules that make it hard to waive the mandates. The Environmental Working Group is one of several environmental groups that are fighting to slow or reverse the ethanol mandates. The group's Michelle Perez, a senior analyst for agriculture, was not overly surprised that the EPA rejected the Texas request.
"Congress set the bar pretty high for states to demonstrate environmental and economic harm in order to get the mandate waived," says Perez.
She points out that Texas applied for a waiver based only on the economic harm being done by the mandates. The state would likely have made a stronger case had it sought a waiver based on both economic and environmental harm, Perez says, noting that her organization has begun providing testimony to the EPA on the environmental impacts.
Any sustained attack on the ethanol mandates would have to counter the enormous amounts of capital that have been invested in the corn ethanol sector. The industry's momentum can be measured in the billions of dollars. According to the Renewable Fuels Association, the trade group, some 168 ethanol distilleries with an annual capacity of 9.9 billion gallons are now operating in the U.S. Those plants are spread among 26 states, and another 43 plants are under construction or are being expanded.
If you assume that each of those 200-plus plants costs $75 million to construct (a conservative estimate), the total cost of those distilleries is about $15 billion.
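The capital estimate above follows directly from the Renewable Fuels Association's plant counts. A minimal sketch of the arithmetic, using only the article's own figures:

```python
# The article's back-of-envelope estimate of ethanol industry capital.
OPERATING_PLANTS = 168     # distilleries now operating (RFA, per the article)
PLANTS_IN_PROGRESS = 43    # under construction or expanding (per the article)
COST_PER_PLANT = 75e6      # article's conservative per-plant construction cost

total_capital = (OPERATING_PLANTS + PLANTS_IN_PROGRESS) * COST_PER_PLANT
print(f"${total_capital / 1e9:.1f} billion")
```

That works out to roughly $15.8 billion across 211 plants, consistent with the article's "about $15 billion" figure — sunk capital that any rollback of the mandates would have to reckon with.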
If the federal mandates are eliminated or rolled back, the owners of the ethanol plants could seek compensation from the federal government.
So the mandates continue, despite at least 10 studies — including one this spring by the World Bank — showing that the surge in U.S. corn ethanol production is forcing up global food prices. On the environmental front, a spate of studies has shown that the production of corn ethanol likely creates more greenhouse gases than conventional gasoline.
Due to the energy-intensive nature of the cultivation and distillation processes, ethanol produced from corn yields very little, if any, benefit.
Clean-air advocates also contend that the growing use of ethanol in gasoline is increasing the amount of smog in America's cities. William Becker, executive director of the National Association of Clean Air Agencies, which represents air pollution control authorities across the U.S., said Congress "decided to mandate ethanol without first analyzing the air-quality impacts."
Gasoline that has been blended with 10 percent ethanol may be more volatile than conventional gasoline, which means more light hydrocarbons — and ground-level ozone — are emitted into the air. For Becker, the conclusion is crystal clear:
"More ethanol means more air pollution.
Period."
Corn ethanol production also takes a toll on water quality.
Researchers say that a key reason for the growing "dead zone" of oxygen-depleted water in the Gulf of Mexico is the increased planting of corn to meet the soaring demand from ethanol distilleries. That additional acreage has resulted in increased applications of fertilizers like nitrogen and phosphorus, which are then washed into the Mississippi, helping create the algal blooms that cause dead zones.
The controversy over the ethanol mandates will undoubtedly go on for months, or years, to come.
But even if Congress repeals the mandates and eliminates the subsidies for ethanol production, the ethanol industry will not shut down.
Even without federal supports, some distilleries will still be profitable.
And their profitability will be directly linked to the price of oil:
As the price of oil continues to rise, some of the most efficient ethanol producers will be able to compete with high-priced gasoline.
No matter what Congress decides to do in the future with regard to the ethanol mandates, it has birthed an industry that has an incentive to burn food in order to fuel cars. And the ramifications of that move — in food prices and environmental effects — are likely to reverberate throughout the global economy for years to come.
Solar's Time Has Finally Arrived
After years of optimistic predictions and false starts, it looks like solar's moment is here at last.
Analysts say a pattern of rapid growth, technological breakthroughs, and falling production costs has put solar power on the brink of becoming the world's dominant electricity source.
by Jon R. Luoma
Here's a thought about generating electricity in the future from a bona fide expert:
"I'd put my money on the sun and solar energy.
What a source of power!"
That was Thomas Edison.
The year was 1931.
For the better part of a century, Edison's forecast went largely unrealized.
But after years of optimistic talk and halting progress, solar is finally on the brink of going big-time.
Slowly but steadily, solar has been in the throes of a quiet revolution, with efficiencies rising, costs falling, and new technological breakthroughs — including the recent development of nano-thin photovoltaic materials — all dramatically changing the game.
Further driving a mass movement toward solar power are two economic imperatives:
stratospheric oil prices, and the inevitability of Europe, the United States, and other industrialized nations soon putting a price on carbon to slow global warming.
Today, solar photovoltaic (p.v.) output still represents far less than one percent of the world's four terawatts (four trillion watts) of electrical generating capacity.
But according to some industry analysts, today's size isn't the story.
The real story is what's already been happening for more than a decade:
a pattern of exponential p.v. growth that now promises to turn the world largely solar, at ever-accelerating speed.
Overall bullishness about solar is widespread. BCC Research, a Massachusetts-based market research firm specializing in technology and energy, expects that the global solar sector will grow from $13 billion today to $32 billion by 2012.
"The Gun Has Gone Off" is the title of a report from respected industry tracker Michael Rogol and colleagues at the consulting firm Photon Consulting, summing up what seems to be a consensus among industry analysts. Some long-term growth predictions are simply jaw dropping, including estimates that non-polluting, carbon dioxide-free solar p.v. will become fully price-competitive with coal and nuclear power in most of the world sometime in the next decade or two — without any government or utility incentives. (This goal of so-called "grid parity" is the holy grail of solar enthusiasts.) And with solar costs continuing to plummet into the future, these analysts say, solar will out-compete other forms of electric generation on its own terms until it is preeminent.
We've heard this kind of thing before.
As long ago as the early 1980s, some prognosticators were predicting that solar p.v. manufacturing would reach 1 million watts of annual capacity before the end of the decade.
But after President Ronald Reagan pulled the plug on federal solar subsidies initiated by the Carter administration, manufacturing slowed dramatically.
And even if today's prognosticators are correct, large hurdles remain, ranging from reducing manpower-intensive installation costs, to finding ways to store energy produced on sunny days for use at night or in cloudy weather.
So why the optimism surrounding solar today?
Trend-trackers say that with little fanfare, solar p.v. production has accelerated into a pattern of rapid growth, with solar cell and assembled panel production roughly doubling every two years beginning in the mid 1990s. Today, solar p.v. appears to be firmly in a pattern of exponential growth and falling overall costs, a pattern that echoes the early mass-production years of past hyper-growth industries:
automobiles in the early 20th century, televisions in mid-century, computers toward century's end.
"It started from a tiny base," says Ohio engineer David Heidenreich.
"But it's really incredible how long p.v. has been averaging a phenomenal growth rate.
Everything we see says there's no reason that can't go on for a very long time."
Heidenreich was lead author of a recent industry analysis, entitled Exponential Solar (www.exponentialsolar.com), which provides a detailed synthesis of studies that analyze solar p.v. growth trends and places them in the context of historical studies of typical manufacturing growth curves.
Growth curve analysis began in the 1930s, with a study at the Wright Airplane Co. that showed how airframe costs fell at an exponential and predictable rate as production increased.
Since then, a multitude of studies in a wide range of industries has shown a similar curve, with costs consistently dropping 20 to 30 percent with each doubling of manufacturing.
The pattern can prevail for decades.
Consider, as just one industrial precedent, the television set.
In 1946, RCA sold about 10,000 televisions, at a cost of $325 (nearly $4,000 in 2008 dollars). By 1955, with annual U.S. production of 7.4 million sets, a bigger, better TV set cost about one-quarter of that, and a trend of falling costs and improving quality (Living Color!) continued.
Ditto for computers, printers, cell phones.
Today, a fully installed solar p.v. system costs a homeowner $7 to $8 per watt of capacity, which means a total system cost of about $35,000 for a typical house.
Assuming an average experience curve, that means that with rising output, costs should fall to about $4 per watt by 2012, then $2 per watt by 2021, continuing to fall steadily until reaching a cheaper than cheap 50 cents by 2039, according to Exponential Solar's model.
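The projection above follows a standard experience-curve formula: unit cost falls by a fixed fraction with each doubling of cumulative production. The sketch below, in Python, assumes an illustrative 20 percent learning rate and a doubling of output every two years; both are stand-in parameters for illustration, not Exponential Solar's actual model inputs.

```python
def cost_after(years, c0=7.5, learning_rate=0.20, doubling_years=2.0):
    """Experience-curve cost in $/watt: each doubling of cumulative
    production cuts the unit cost by `learning_rate`.
    c0 is the assumed 2008 starting cost of $7.50 per watt."""
    doublings = years / doubling_years
    return c0 * (1 - learning_rate) ** doublings

for year in (2012, 2021, 2039):
    print(year, round(cost_after(year - 2008), 2))
```

Starting from $7.50 per watt in 2008, this toy model lands near $4.80 in 2012, under $2 by 2021, and a few dozen cents by 2039, close in shape (though not in exact figures) to the report's projections, since the report's real inputs differ.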
In my home U.S. state of New Jersey, where kilowatts are relatively expensive, a rooftop solar p.v. array folded into a 30-year mortgage would start to make sense without any utility or tax incentives at about $3 a watt, or sometime around 2015.
In most of the United States, where power is cheaper, grid parity would come a few years later.
Researchers at CalTech have calculated that covering an area of about 61,800 square miles in the central U.S. with solar panels would supply the entire country with electricity.
Paul Maycock, who directed the U.S. Department of Energy's photovoltaic office during the Carter Administration, has since run his own p.v. consulting firm.
He, too, is convinced that a pattern of exponential growth with falling costs is well in place.
But he cautions that rapid solar growth since the mid 1990s has been supported by solar incentives in, principally, Germany and Japan, and more recently in such U.S. states as California and New Jersey.
"The incentives worked their magic — prices went down dramatically," Maycock says, but he notes that some incentives will need to remain until p.v. reaches grid parity.
Maycock also warns that the trends only play out on average, over time, with inevitable bumps in the road.
For example, solar panel prices stopped dropping in the mid-2000s, as soaring demand outstripped supplies of polysilicon, key to the manufacture of conventional cells. Now new silicon supplies are coming on line and price declines should follow.
Maycock — who worked for Texas Instruments in the chip giant's early years — says the trend should more or less mimic the long pattern of growth and falling costs in the computer chip industry, which also relies on silicon wafers. The integrated circuit industry reduced costs largely by squeezing ever smaller circuits into chips. Solar can't do that.
But it can accomplish something similar by using ever-smaller amounts of material, since all of the electricity-generating action occurs near the surface.
Manufacturers of conventional cells in the past have found ways to slice them thinner; newer "thin-film" production technologies use even less photovoltaic material to produce each watt of electricity.
Maycock predicts that thin solar film will go from 10 percent of the market today to 30 to 35 percent in 2015.
That kind of growth doesn't seem at all outlandish to Ray Kurzweil, an inventor and modern-day Edison in his own right. (Kurzweil developed, among other things, voice synthesizers, flat bed scanners, and early speech recognition technology). Noted for some prescient predictions — he forecast years ago that a little-known niche phenomenon called the Internet would one day transform the world — Kurzweil actually strikes an even more optimistic note.
Early this year, in a report commissioned by the National Academy of Engineering, Kurzweil — working with a team of analysts that included Google founder Larry Page — concluded that innovations in areas such as nanotechnology will drive a robust growth in solar for years to come.
Saying that "we are not that far away from a tipping point where energy from solar will be competitive with fossil fuels," Kurzweil predicted that solar could meet 100 percent of the U.S.'s energy needs within 20 years.
The sort of innovation that could drive that explosion certainly appears to be underway.
Maycock points to U.S.-based First Solar, the first to mass produce ultra-thin solar cells not made of conventional silicon, but of cadmium telluride (CdTe). The company has rapidly become the lowest-cost solar producer in the world, with its cell prices approaching half those of conventional manufacturers. Essentially, its process vaporizes the solar materials and deposits them on inexpensive glass at atom-thin levels. This year, another CdTe plant will begin operating at start-up AVA Solar in Colorado, which plans to ramp up rapidly to 200 million annual watts of production.
Silicon has hardly been knocked out of the game.
Applied Materials, a leading supplier of equipment to the integrated circuit industry, has moved into automating solar cell production.
In just two years, the company has already built 10 complete assembly lines for solar manufacturers in China and India.
Early in 2008, Applied announced that an Indian customer had taken one of the biggest leaps yet, with a new order for a whopping 600 megawatts of p.v. manufacturing capacity, one-sixth of what the entire world produced last year.
California-based Nanosolar, meanwhile, last year began production of its own version of solar, using a material called copper indium gallium selenide (CIGS), which it grinds into infinitesimally tiny particles and prints as a "solar ink" on cheap foil panels. The company has a single printing press-like machine that it says can crank out a billion watts of solar power annually.
Other young companies have developed flexible thin-film solar panels with inexpensive metal or plastic substrates that could be simply and cheaply installed.
To date, most p.v. installations have been relatively small arrays attached to individual houses, schools, or commercial buildings. But utility scale installations are also growing.
The biggest p.v. technological promise may lie in systems that use mirrors or lenses to concentrate light on highly efficient multi-junction solar cells, vastly leveraging the power of the sun.
Companies are now working on deploying these unique cells in concentrator arrays that amplify the sun from 500 to more than 1000 times. SolFocus, for example, with operations in both California and Spain, claims that its system, which rotates on two axes to perfectly track the sun from dawn to dusk, uses only 1/1000th of the solar p.v. material per watt generated of standard solar panels, with most of the system made of cheap and abundant aluminum and glass.
Meanwhile, p.v. has a competitor in solar thermal, a technology that uses mirrors to concentrate solar heat to produce steam, which can drive a conventional generator. Solar thermal has not been growing at p.v.'s breakneck pace, but rising fossil fuel prices seem to be reviving it as well, with two new plants opening since last year in the American Southwest, another 10 on drawing boards for the United States, and well over a dozen planned elsewhere in the world.
Solar thermal today can provide a utility with power for about three times the cost of the cheapest fuel, coal, but analysts expect that with growing manufacturing efficiencies, it too will drop in price.
We've relied on combustion for energy since the first caveman rubbed a couple of sticks together. Now, however, we face a gathering global warming storm and a host of other problems brought on by combustion run amok.
But there is a solution right outside the door:
according to the U.S. Department of Energy, enough solar energy strikes the nation in a single day to power it for a full year. If, as it appears, solar power has at last reached a tipping point, maybe Edison got it right after all.
The Nitrogen Fix:
Breaking a Costly Addiction
Over the last century, the intensive use of chemical fertilizers has saturated the Earth's soils and waters with nitrogen.
Now scientists are warning that we must move quickly to revolutionize agricultural systems and greatly reduce the amount of nitrogen we put into the planet's ecosystems.
by Fred Pearce
A single patent a century ago changed the world, and now, in the 21st century, Homo sapiens and the world we dominate have an addiction.
Call it the nitrogen fix.
It is like a drug mainlined into the planet's ecosystems, suffusing every cell, every pore — including our own bodies.
In 1908, the German chemist Fritz Haber discovered how to make ammonia by capturing nitrogen gas from the air. In the process he invented a cheap new source of nitrogen fertilizer, ending our dependence on natural sources, whether biological or geological.
Nitrogen fertilizer fixed from the air confounded the mid-century predictions of Paul Ehrlich and others that global famine loomed.
Chemical fertilizer today feeds about three billion people.
But the environmental consequences of the massive amounts of nitrogen sent coursing through the planet's ecosystems are growing fast.
We have learned to fear carbon and the changes it can cause to our climate.
But one day soon we may learn to fear the nitrogen fix even more.
A major international survey published in September in Nature listed the nitrogen cycle as one of the three "planetary boundaries" that human interventions have disturbed so badly that they threaten the future habitability of the Earth.
The others — according to the study by Johann Rockstrom, of the Stockholm Environment Institute, and 27 other environmental scientists – are climate change and biodiversity loss.
Nitrogen affects more parts of the planet's life-support systems than almost any other element, says James Galloway of the University of Virginia, who predicts:
"In the worst-case scenario, we will move towards a nitrogen-saturated planet, with polluted and reduced biodiversity, increased human health risks and an even more perturbed greenhouse gas balance."
The problem is that we waste most of Haber's fertilizer. Of 80 million tons spread onto fields in fertilizer each year, only 17 million tons gets into food.
The rest goes missing.
This is partly because the fertilizer is wastefully applied, and partly because the new green-revolution crops developed to grow fat on nitrogen fertilizer are also wasteful of the nutrient.
The nitrogen efficiency of the world's cereals has fallen from 80 percent in 1960 to just 30 percent today.
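The waste figures above reduce to simple arithmetic: only about a fifth of the fertilizer nitrogen applied each year is recovered in food. A quick check in Python, using the article's own tonnages:

```python
applied = 80_000_000   # tons of fertilizer nitrogen spread on fields per year
in_food = 17_000_000   # tons of that nitrogen ending up in food
efficiency = in_food / applied          # fraction actually recovered
missing = applied - in_food             # nitrogen lost to air and water
print(f"{efficiency:.0%} recovered; {missing:,} tons go missing each year")
# prints: 21% recovered; 63,000,000 tons go missing each year
```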
Artificial nitrogen washes in drainage water from almost every field in the world.
It is as ubiquitous in water as man-made carbon dioxide is in the air. It is accumulating in the world's rivers and underground water reserves, choking waterways with algae and making water reserves unfit to drink without expensive clean-up.
Most of the man-made nitrogen fertilizer ever produced has been applied to fields in the last quarter-century.
Nature has some ability to reverse man-made fixing of nitrogen, converting it back into an inert gas — a process called denitrification.
But last year, Patrick Mulholland of the Oak Ridge National Laboratory in Tennessee reported that the system is being overwhelmed.
Many rivers in the U.S. are now so nitrogen-saturated that they are losing their ability to denitrify pollution.
Most of this excess nitrogen ends up in the oceans, where it is killing whole ecosystems. Excess nitrogen is the cause of the growing number of oxygen-depleted "dead zones" in the oceans, says Mulholland.
Why should a fertilizer kill?
It is just too much of a good thing.
It over-fertilizes the water, producing such large volumes of algae and other biomass that it consumes all the oxygen in the water, causing the ecosystem to crash.
Coastal bays, inlets and estuaries around the world are succumbing.
A study earlier this year found that algal blooms dump domoic acid, a neurotoxin, onto the ocean floor, where it persists for weeks. "The first signs are often birds washing up on the shore or seals acting funny, aggressive and twitching, looking as if they were drunk," says Claudia Benitez-Nelson of the University of South Carolina.
Dead Zone (NASA satellite image, 1999): Fertilizer running down the Mississippi and Missouri Rivers creates a so-called "dead zone" in the Gulf of Mexico, visible from space.
Notoriously, fertilizer running down the Mississippi-Missouri river system creates a "dead zone" in the Gulf of Mexico. Typically, around 20,000 square kilometers of ocean forms a layer without oxygen or fish – killed by the nitrogen fix.
The number of dead zones has "spread exponentially since the 1960s," says Robert Diaz of the Virginia Institute of Marine Science in Gloucester Point.
He counted more than 400 in a study for Science last year. They now cover a quarter of a million square kilometers, usually where rivers discharge large amounts of fertilizers and sewage into relatively enclosed oceans.
You find these dead zones in the waters between Japan and Korea; in the Black Sea, where an invasion of alien jellyfish in the 1980s wiped out most native species; off the tourist beaches of the northern Adriatic; in Chesapeake Bay and the ocean waters off Oregon; and in the semi-enclosed Baltic Sea, the largest dead zone in the world.
Nitrogen is a vital nutrient in soils, essential for growing crops. Soils recycle nitrogen in organic waste, including animal dung.
But before Haber's discovery, the only way of adding more atmospheric nitrogen to soils was through capture by the bacteria that live in a small number of nitrogen-fixing plants, including legumes like clover and beans.
In the 19th century, densely-packed countries like Germany and Britain began to improve the fertility of their soils by importing nitrogen in the form of guano from the Pacific islands of Peru, and saltpetre mined in Chile.
Geological nitrogen was a geopolitical resource as vital as oil today.
Appeals were made for science to come up with a new method of producing nitrogen in a form that plants could absorb.
Haber won the race, filing his patent for fixing ammonia, a molecule made of nitrogen and hydrogen atoms, from the inert nitrogen gas that makes up 78 percent of the air.
Now, ammonia could readily be turned into chemical fertilizer and added to the world's fields as easily as cow dung.
German industrialist Carl Bosch opened the first factory near Ludwigshafen in 1913.
It was in the nick of time for Germany.
During the First World War, unable to receive shipments of guano from South America because of a British naval blockade, Germany would quickly have starved but for the Haber-Bosch process.
Outside Europe, few initially took up chemical fertilizers to intensify their farming.
It was usually cheaper and easier to expand farming — draining marshes, ploughing prairies and clearing forests. But by the 1960s, as world population soared, fertilizer manufacture took off, and plant breeders developed new lines of high-yielding crops that responded best to the nitrogen fix.
During this "green revolution," there was an eight-fold increase in global production of nitrogen fertilizer from the 1960s to the 1980s.
Today, of 175 million tons of nitrogen applied to the world's croplands in a year, almost 50 percent is from chemical fertilizer. It has raised the "carrying capacity" of the world's soils from 1.9 people per hectare of farmland to 4.3 — and 10 in China, where applications reach twice anything seen in Europe.
This is a profound change to the biochemistry of life on Earth — and to our own bodies. Today, much of the nitrogen in our bodies comes not from biological sources but from giant chemical factories. We are, in a real sense, as much chemistry as biology.
Vaclav Smil, the distinguished Canadian researcher into food and the environment at the University of Manitoba, calls the nitrogen fix "an immense and dangerous experiment."
Besides fertilizer, we are also making biologically available nitrogen by burning fossil fuels. Power stations emit nitrogen oxides that create acid rain, the environmental scourge of industrialized countries in the 1980s and 1990s. Nitrogen oxides in the air are also potent greenhouse gases, adding to global warming, and even reach the stratosphere, where they join chlorine and bromine compounds in eating up the protective ozone layer.
"Most of the world's biodiversity hotspots are receiving doses of nitrogen from the air and in water at levels known to damage many species," according to Gareth Phoenix of the University of Sheffield in England.
Yet the issue has never been addressed by the UN Convention on Biological Diversity.
In temperate lands, this is turning heaths into grasslands, while grasslands typically lose a quarter of their species richness. Within nitrogen-flooded ecosystems, aggressive outside species outperform most natives. So nitrogen is the hidden force behind invasions of alien species around the world.
The prognosis is not good.
The scientists who wrote the Nature paper on planetary boundaries argued that human nitrogen releases to the natural environment should be cut by three quarters, to around 35 million tons. But on current trends, global nitrogen use on farmland is set to double to 220 million tons a year by 2050 – more than six times the safe threshold.
The danger is that nature's ability to process this excess nitrogen and return it to the atmosphere will be overwhelmed, and we will end up inhabiting a nitrogen-saturated planet, with nitrogen driving global warming, acidifying air, eating the ozone layer, reducing biodiversity, and killing the oceans.
Luckily, the potential for reducing this nitrogen waste is considerable.
In China, where nitrogen application to fields is among the highest in the world, a study by a group of scientists led by Wilfried Winiwarter and Tatiana Ermolieva of the International Institute for Applied Systems Analysis found that better on-farm management of nitrogen could cut nitrous oxide emissions to the environment by 25 percent without damaging farm output.
Galloway says the flow of nitrogen through the environment can also be reduced by decreased emissions from burning fossil fuels — perhaps as a byproduct of efforts against climate change.
And better sewage treatment in cities could convert nitrates that have passed through the human gut into safe gaseous nitrogen.
If anything exemplifies humanity's growing impact on the planet's life-support systems, it is the way we are overwhelming the nitrogen cycle.
There are solutions. But for now we are hooked.
As Smil put it:
"In just one lifetime, humanity has developed a profound chemical dependence."
Satellites and Google Earth Prove Potent Conservation Tool
Armed with vivid images from space and remote sensing data, scientists, environmentalists, and armchair conservationists are now tracking threats to the planet and making the information available to anyone with an Internet connection.
by Rhett Butler
In October 2008, scientists with the Royal Botanical Garden at Kew discovered a rich pocket of biodiversity, including several notable new species, in a remote highland forest in Mozambique.
Trekking into the inaccessible, 17,000-acre region, botanists and biologists found 200 types of butterflies, hundreds of plant species, and numerous animals and insects, including three new species of Lepidoptera butterfly and a new member of the poisonous Gaboon viper family.
What's significant about this find is that it was initiated not by some intrepid adventurer, but rather by a scientist sitting behind his computer. Three years prior, conservationist Julian Bayliss identified the site — Mount Mabu — using Google Earth.
Bayliss, a Tanzanian ecologist, then helped plan and lead the expedition.
The use of Google Earth to make a virtual discovery, which then led to an actual one, is just the latest example of how the spread of satellite technology — and related computer applications such as Google Earth — are changing the way scientists, conservationists, and ordinary citizens are monitoring the environment and communicating their findings to the public.
Once the exclusive domain of the military, government officials, and specialized scientists, satellite technology is being democratized and is fast becoming an indispensable tool for researchers across a wide spectrum of environmental fields. In the past several years, one of the chief uses for satellite imagery has been to accurately quantify the loss of tropical forests from the Amazon, to the Congo, to Indonesia.
In Brazil, scientists and state environmental protection officials can now monitor fires and forest clearing almost in real-time and take action to combat the deforestation.
But perhaps the most revolutionary advance in using satellites to monitor the planet has been the ever-widening use of remote sensing technology by
ordinary citizens. Google Earth has been instrumental in this development and represents a critical point in its evolution, allowing anyone with an Internet connection to attach data to a geographic representation of Earth.
Citizens and environmental groups are now using Google Earth to track threats to pristine rivers from hydroelectric projects, catalogue endangered species, help indigenous people in the Amazon protect their land, and alert citizens and government officials that boats are illegally fishing off the Canary Islands.
"A decade ago, high-resolution satellite imagery for the whole planet would have been accessible only to a handful of people working in government agencies, resource extraction, or as scientists," said David Tryse, an Internet technology specialist — and ordinary citizen — who has developed numerous Google Earth applications now being used by scientists and conservation groups. "Today it is in the hands of millions of people.
It's impossible to care about something if you don't know it exists, but now people can fly across the planet and zoom in to see for themselves the huge fires from Shell's gas-flaring operations in the Nigerian delta or follow the discolored toxic runoff along a hundred kilometers of rain forest river downstream from a goldmine in Peru or Indonesian Papua."
The first launch of a non-weather satellite for civilian use occurred in 1972, when NASA put Landsat into orbit to monitor the planet's landmasses, tracking everything from desertification to changes in agriculture.
Since then, ever-more sophisticated satellites have used cameras and a variety of sensors — including passive microwave, which can penetrate clouds to image the earth's surface, and infrared sensors that can measure temperatures — to monitor a host of physical processes. One of the key functions has been the use of passive microwave technology to chronicle the steady decline of Arctic sea ice over the past 30 years.
Today, many countries use satellites to monitor their environment, including Brazil, which has one of the world's most sophisticated systems
for tracking deforestation.
Brazil uses two systems that can rapidly identify where forest loss is occurring, giving the country's environmental protection agency the technical capacity — although not necessarily the political will — to combat deforestation as it happens. Those systems rely on optical sensors and thus cannot see through clouds, but Brazil will soon launch its own earth observation satellite with cloud-penetrating technology, known as LIDAR.
Greg Asner of the Carnegie Institution's Department of Global Ecology at Stanford University has used advanced LIDAR technology to scan a Hawaiian forest and identify alien plant species by their canopies and the amount of ground plants that grow under them.
A new frontier for remote sensing is the emergence of REDD (Reducing Emissions from Deforestation and Degradation), a mechanism for compensating tropical countries for conserving their forests. To date, one of the biggest hurdles for the concept has been establishing credible national baselines for deforestation rates — in order to compensate countries for "avoided deforestation," officials must first know how much forest the country has been clearing on a historical basis. For the remote sensing community, REDD presents an opportunity to showcase the power of remote sensing and generate a source of funding for countries to improve their sensing capabilities.
Introduced in 2005, Google Earth — which can be downloaded for free — aggregates and organizes satellite imagery, aerial photography, and 3-D geographic information system data from a range of sources and presents it in a format that is easily accessible to the general public.
Through KML, Google Earth's programming language, users "interact" with the planet, attaching images and other information to geospatial data.
This makes Google Earth an ideal tool for conservationists, such as the group Save the Elephants, which tracks the movement of elephants across Africa to see where they come into conflict with humans and where they forage.
To further such conservation goals, Google has developed its Outreach program, an initiative that works with nonprofits to develop tools using Google Earth.
Part of the inspiration for Google Earth Outreach came from within the company itself. Rebecca Moore, a programmer at Google, used Google
Earth to document a planned logging project near her home in Santa Cruz County, Calif. Working with members from her community, Moore created a virtual map of the area that would be affected.
Her subsequent data animation, which took users on a virtual flyover across the proposed logging zone, generated a firestorm of protest and led to the cancellation of the project.
Google Outreach was established shortly thereafter, in June 2007, with Moore in charge.
"Because Google Earth provides, for many areas, such a realistic model of the real earth, you almost feel as if you are on that mountaintop or looking over that valley," said Moore.
"This immersive experience enables conservation organizations to convey complex environmental issues more quickly and persuasively to busy decision-makers, the media and the general public."
Many scientists have begun to adapt Google Earth technology to their research and their communications with the public.
The technology also has emerged as an effective way to publish scientific results in an accessible and meaningful format.
While Google Earth is not going to replace scientific journals, it offers a concise, visual format for presenting research that can be more compelling than data points on a chart, rows in a spreadsheet, or a 4-color map.
Mark Mulligan of the Environmental Monitoring and Modeling Group at King's College London has capitalized on the power of Google Earth to create HealthyPlanet.org, an initiative that allows people to virtually see,
and sponsor, a specific piece of many of the planet's 77,000 protected areas. His group also worked on an application, Costing Nature, that allows users to trace stream flow in an urban area back to the protected area where it fell as rainfall, providing a potent example of the value of ecosystem services. In addition, Mulligan's team has developed Google Earth applications examining the impact of oil production in the Ecuadorean Amazon and the distribution of tropical cloud forests.
"Traditionally remote sensing data have been difficult to get hold of, difficult to process, and beyond the means of many of the smaller conservation organizations," said Mulligan.
"Google Earth allows these organizations to look at their projects from space and draw upon a wealth of environmental data, in addition to the imagery.
Clearly, conservation needs good professionals working with communities on the ground, but it also needs to harness the significant body of interested citizens who can do their bit."
Google Earth is also being used for original research.
One study, published in the Proceedings of the National Academy of Sciences last year, was based on an analysis of 8,510 cattle spotted in Google Earth images of 308 pastures and plains around the world.
Surprisingly, two-thirds of the cattle — as well as a majority of 3,000 grazing deer monitored in satellite photos from the Czech Republic — tended to align themselves with the Earth's magnetic field lines, in a north-south direction.
The research employed satellite technology to spot a phenomenon that literally had been hiding in plain sight for millennia:
that large, non-migratory land animals were affected by the Earth's magnetism. (Earlier studies had established that magnetism guided the long-distance migrations of birds, fish, butterflies, and other animals.)
Among the ordinary citizens who have been most active in marshaling the power of Google Earth for environmental work is David Tryse.
His interest in conservation led him to develop an application for the Zoological Society
of London's "EDGE of Existence" program, an initiative to promote awareness of — and generate funding for — 100 of the world's rarest species. His application allows people to surf the planet to see photos of endangered species, information about their habitat, and the threats they face.
Tryse also has used Google Earth to track deforestation worldwide, highlight hydroelectric threats to Borneo's rivers, map global biodiversity hot spots, and monitor encroachment on the lands of isolated, indigenous tribes around the world.
The Jane Goodall Foundation, a Google Earth Outreach partner, uses Google Earth's three-dimensional images to show Tanzanian villagers that forests are the source of their water and to enlist the villagers in identifying chimpanzee habitat and elephant paths.
One of the first Google Earth Outreach projects involved indigenous tribes in the Amazon rain forest.
Facing an onslaught of threats to their lands and culture, the tribes have embraced advanced technology as a means of protecting and better managing their homeland.
The tribes — including the Surui in western Brazil and the Wayana and Trio in Suriname — are using GPS to map their lands and to plot rivers, sites of spiritual significance, and resources such as medicinal plants and rich hunting grounds. The Rainforest Foundation UK and the Global Canopy Program are taking a similar approach in Congo and Cameroon, respectively, helping communities map their lands to protect against illegal logging and other forms of encroachment.
"Google Earth is used primarily for vigilance," said Vasco van Roosmalen, Brazil program director for the Amazon Conservation Team, an organization that has coordinated the Google Earth project with the tribes. "Indians log on to Google Earth and study images, inch by inch, looking to see where new gold mines are popping up or where invasions are occurring.
They can see river discoloration, which could be the product of sedimentation and pollution from a nearby mine.
They are able to use these images to find the smallest gold mine."
As Chief Almir Surui of the Surui tribe put it, "The Surui know little about the Internet, but Google knows little about the forest, so working together we will be stronger."
Regulators Are Pushing Bluefin Tuna to the Brink
The international commission charged with protecting the giant bluefin tuna is once again failing to do its job.
Its recent decision to ignore scientists' recommendations for reducing catch limits may spell doom for this magnificent – and endangered – fish.
by carl safina
It's one of the biggest, fastest, and most beautiful fish in the sea.
It has captured the imaginations of people from Homer to Salvador Dali.
But end-times loom for the giant bluefin tuna, whose chances of survival were greatly diminished in late November by the international commission charged with its care.
Once again, that body – the International Commission for the Conservation of Atlantic Tunas – refused to take strong action to prevent the runaway overfishing of the giant bluefin tuna in its sole remaining, yet rapidly disappearing, stronghold:
the Mediterranean.
One of the sea's few elite warm-blooded fish, bluefin tuna can reach three-quarters of a ton, swim at highway speeds, migrate across oceans, and visit coasts of distant continents. They're also the world's most valued fish (once they're dead), and therein they hang by the tail.
Too valuable everywhere to be allowed to live anywhere, the giant bluefin tuna may be worth more money to a person who kills one than any other animal on the planet, elephants and rhinos included. A few years ago, a single 444-pound bluefin tuna sold wholesale in Japan for $173,600.
One fish.
A 43-nation commission has public-trust management authority and a mandate to conserve.
Bluefin tuna can grow to more than 1,000 pounds and migrate the full length of the Atlantic.
But the International Commission for the Conservation of Atlantic Tunas has for its 40-year history merely acted as the fishing industry's official, tax-funded conglomerate.
Think of it as the International Conspiracy to Catch All the Tuna, and its record starts making sense.
The commission's résumé on bluefin tuna graphs as plunging populations, down more than half in the Mediterranean and now free-falling, and down more than 90 percent in the west Atlantic.
The increasing rarity of bluefin—and escalating worldwide sushi madness—has only intensified fishing efforts. And as the fish diminish, demand further drives up the price of bluefin meat.
As extinction nears, the fishing keeps escalating.
Fishing has already demolished bluefin populations. The last few decades have seen gold-rush bluefin fisheries disappear off Brazil, in the North Sea, and in the southwest Pacific.
Wherever they still swim, they are aggressively hunted.
Atlantic bluefin breed in only two places:
the Mediterranean and the Gulf of Mexico. From these two spawning areas they migrate throughout the whole North Atlantic, mingling in many fishing areas. But fish from the two populations do not interbreed; they are separate breeding stocks that mix only when not breeding.
Either population could thrive or go extinct.
Right now, both are in trouble.
European and Mediterranean fishing is out of control.
But the fish native to
the North American side of the ocean are actually faring far worse.
Catches reflect these different trajectories. While Europeans are killing almost twice their legal limit, U.S. fishermen are finding only a fraction of the fish they caught just ten years ago. So few fish remain in the west Atlantic's bluefin population that in the last few years U.S. commercial bluefin fishers have been able to catch less than 15 percent of what they were allowed.
In a word:
collapse.
So, what did fishery managers just do?
As they have for 20 years, they ignored their own scientists. For the east Atlantic and Mediterranean, the scientists had recommended drastic and immediate catch reduction from nearly 30,000 tons annually to 15,000 tons. Yet despite official warnings and calls for a catch ban, the sleepwalking ICCAT commissioners on Nov. 25 set the catch limit nearly half again as high:
22,000 tons. Having by incompetence, greed, and reckless industry interference caused the depletion of this magnificent and commercially important fish, the commissioners agreed to ensure further decline.
Excessive catch limits will ultimately prove self-defeating, yet short-term thinking prevails. For the Mediterranean and eastern Atlantic, ICCAT's managers have allowed fishermen to catch half again what their own scientists recommended.
And fishermen have ended up taking double what they're allowed (the actual catch in 2008 was 61,000 metric tons), making every second bite at the sushi bar illegal.
What else is going on here?
"The EU has bankrolled the decimation of bluefin stocks by subsidizing the new large fishing vessels that are responsible for overfishing, to the detriment of certain traditional fishing fleets," said European Parliament member Raül Romeva, who attended the tuna commission meeting.
"When the stocks are gone, the same ship owners who lobbied to overexploit bluefin tuna will come cap in hand for more EU money."
The U.S. is justifiably indignant over catches on the other side of the ocean.
Bluefin tuna originating in the Mediterranean migrate to American fishing grounds, and rampant overfishing on that side of the Atlantic hurts fishing in U.S. waters. So the U.S. has an immediate interest in trying to inject some sanity into the fishing there.
But the U.S. should have an interest in fixing its own problems at home, too. And it doesn't.
As commercial fishing continues, bluefin populations continue to plummet on both sides of the Atlantic.
The west Atlantic is fished by the U.S., Canada, Mexico, and Japan.
These countries agreed last month to cut the catch limit for the western Atlantic from the current 2,100 metric tons to 1,800 metric tons by the year 2010.
Noble, right?
No, not at all.
Here's the catch, literally:
1,600 tons. That's the amount caught by the countries fishing the west Atlantic in 2007, a number that is actually less than what the "limit" will be reduced to two years from now.
In other words, the limit is higher than the catch, so it's not a limit at all.
Effectively, it's catch-as-catch-can.
And yet in the U.S. National Oceanic and Atmospheric Administration's press release on the tuna commission meeting's outcome, America praised itself while expressing "extreme disappointment" at Europe.
Finger-pointing by the U.S. obscures the fact that its own Gulf-spawning bluefin tuna population is in worse shape than the one spawned in the Mediterranean.
Gulf-spawned fish migrate up the east coast and into Canadian waters, spend about 12 years growing up, and then — if they're still alive — return to the Gulf as giants to spawn.
The NOAA press release highlights Europe's failings (the dodge-tactic pioneered by U.S. fishermen), and most environmental groups join the chorus. But the fact is:
While our last few native giants trickle in to the Gulf of Mexico to breed, our own fisheries agency allows U.S. boats to catch spawning fish.
They're allowed to keep one bluefin per trip while ostensibly fishing for other kinds of fish.
But high prices mean the big tuna get targeted.
And because they fish with hundreds of hooks and can keep only one bluefin, they kill — and dump — numerous spawners. If one were designing an extermination campaign, it would work just like this.
The World Conservation Union lists west Atlantic bluefin tuna as "critically endangered."
More dithering could doom the population—and for what?
This kind of negligent management drives U.S. boats out of business anyway because ultimately the fish population is lost, and everyone loses. The United States could immediately fix this problem.
It has full control of the west Atlantic population's Gulf of Mexico spawning ground.
Yet the killing continues, while U.S. fishery managers stonewall all criticism on the matter.
The United States can — and must now — suspend tuna fishing on the bluefin tuna's U.S. spawning grounds during breeding season.
Then we need an Atlantic-wide bluefin tuna fishing moratorium.
After years of inaction and continuous decline, letting them spawn free of constant harassment and giving the populations a five-year cease-fire seems the only hope for saving the giant fish.
Conservationists must work to get the west Atlantic bluefin population listed under the Endangered Species Act.
Meanwhile, the U.S. should seek legal action against European fishing countries under the Pelly Amendment for undermining the efficacy of officially agreed-to fishing limits. And management of international trade in bluefin tuna should be turned over to the Convention on International Trade in Endangered Species [CITES], which enacted the ban on ivory that saved Africa's elephants. After all, it's the export to Japan that drives this whole bloody mess.
The Greenhouse Gas That Nobody Knew
When industry began using NF3 in high-tech manufacturing, it was hailed as a way to fight global warming.
But new research shows that this gas has 17,000 times the warming potential of carbon dioxide and is rapidly increasing in the atmosphere – and that's turning an environmental success story into a public relations disaster.
by richard conniff
Hypothetical question:
You're heartsick about global warming, so you've just paid $25,000 to put a solar power system on the roof of your home.
How do you respond to news that it was manufactured with a chemical that is 17,000 times stronger than carbon dioxide as a cause of global warming?
It may sound like somebody's idea of a bad joke.
But last month, a study from the Scripps Institution of Oceanography reported that nitrogen trifluoride (NF3), with a global warming potential of 17,000, is now present in the atmosphere at four times the expected level and rapidly rising.
Use of NF3 is currently booming, for products from computer chips and flat-screen LCDs to thin-film solar photovoltaics, an economical and increasingly popular solar power format.
Moreover, the Kyoto Protocol, which limits a half-dozen greenhouse gases, does not cover NF3.
The United Nations Framework Convention on Climate Change now lists it among five major new greenhouse gases likely to be included in the next phase of global warming regulation, after 2012.
And while that may be reassuring, it also suggests the complicated character of the global warming problem.
In fact, NF3 had become popular largely as a way to reduce global warming.
The U.S. Environmental Protection Agency began actively
encouraging use of NF3 in the 1990s, as the best solution to a widespread problem in making the components for everything from cell phones to laptop computers. Manufacturers in the electronics industry all use a vacuum chamber to etch intricate circuitry and to deposit a thin layer of chemical vapor on the surface of a product.
Some of the vapor inevitably builds up instead as glassy crud on the interior of the chamber.
To tear apart that layer of crud and clean the vacuum chamber, manufacturers were using powerful fluorinated greenhouse gases. The usual choice, hexafluoroethane, or C2F6, sounds better at first than NF3.
In global warming terms, it's only about 12,000 times worse than carbon dioxide.
But C2F6 is difficult to break down, and roughly 60 percent of what goes into the vacuum chamber ends up in the atmosphere.
With NF3, estimates suggested that under optimal conditions, roughly 98 percent of what goes into the vacuum chamber is destroyed there.
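The trade-off described above can be made concrete with a little arithmetic. The sketch below uses only figures stated in the article (GWP of about 12,000 for C2F6 with roughly 60 percent escaping; GWP of 17,000 for NF3 with roughly 2 percent escaping under optimal conditions, or about 20 percent when NF3 is simply dropped into older tools); the function name is a hypothetical label for illustration, not an industry metric.

```python
# Illustrative arithmetic only: figures are those quoted in the article.

def co2e_leaked_per_ton(gwp: float, escape_fraction: float) -> float:
    """Tons of CO2-equivalent leaked per ton of cleaning gas fed to the chamber."""
    return gwp * escape_fraction

# C2F6: GWP ~12,000, roughly 60 percent escapes the chamber.
c2f6 = co2e_leaked_per_ton(12_000, 0.60)          # ~7,200 t CO2e per ton used

# NF3 with remote plasma cleaning set up optimally: ~98 percent destroyed.
nf3_optimal = co2e_leaked_per_ton(17_000, 0.02)   # ~340 t CO2e per ton used

# NF3 used as a simple C2F6 substitute in older tools: ~20 percent escapes.
nf3_retrofit = co2e_leaked_per_ton(17_000, 0.20)  # ~3,400 t CO2e per ton used
```

On these numbers, the optimally run NF3 process leaks roughly one-twentieth of the warming burden of C2F6 per ton of gas used, which is why the EPA encouraged the switch; the retrofit case erodes most of that advantage.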
So when the semiconductor industry announced a voluntary partnership with the EPA to reduce greenhouse-gas emissions by 10 percent from 1995 levels between 1999 and 2010, NF3 became the replacement technology of choice.
Makers of flat-screen displays soon announced a similar program.
In 2002, the EPA gave a Climate Protection Award to the largest NF3 producer, Pennsylvania-based Air Products and Chemicals Inc., for its work in reducing emissions.
Then last summer, a paper calling NF3 "the greenhouse gas missing from Kyoto" attracted widespread press attention.
Co-authors Michael J. Prather and Juno Hsu of the University of California at Irvine noted that NF3 is one of the most potent greenhouse gases known and persists in the atmosphere for 550 years.
But back in the 1990s when the Kyoto Protocol was being negotiated, NF3 was a niche product of unknown global warming potential (GWP). [In calculating GWP, carbon dioxide is the basic unit, with a GWP of one.
For other gases, scientists measure infrared absorption, the spectral location of the absorbing wavelengths, and the atmospheric lifetime of the gas to determine its global warming effect relative to carbon dioxide.] So NF3 got left out, meaning no requirement for industry to track emissions, or even to report how much NF3 is actually being produced.
That left room for what felt to Prather like a "flimflam."
In an interview with Yale Environment 360, he estimated that 20 or 30 percent of total NF3 production ends up in the atmosphere — not
the two percent industry had seemed to suggest.
He and Hsu characterized Air Products, the same NF3 producer that the EPA had honored, as producing the annual global warming equivalent of one of the world's largest coal-fired power plants.
A new paper, published in Geophysical Research Letters in October, filled in gaps in this glum picture — and threatened to turn the NF3 emissions success story into a public relations disaster. Ray Weiss and his research team at the Scripps Institution of Oceanography reported that NF3 is now present in the atmosphere at four times the expected amount, with atmospheric concentrations rising 11 percent a year. Working from annual production estimates of 4,000 metric tons, Weiss figured that about 16 percent of current production is ending up in the atmosphere.
Corning Painter, a vice president at Air Products, praised the Weiss paper but argued that "in terms of order of magnitude the numbers are relatively close" to earlier estimates. In a letter to New Scientist magazine this summer, Painter had seemed to give the impression that overall emissions were in the two percent range.
"More than 20 years of research and work with our customers finds that less than 2 percent of NF3 is released into the atmosphere," he wrote.
But in an interview with Yale Environment 360, Painter said Air Products has a two percent emissions rate just in producing and packaging the gas, though he said that rate continues to go down.
He said global NF3 production is actually 7,300 tons annually.
Given Weiss's figures for atmospheric concentrations, he said, that would translate to an overall emissions rate closer to 8 percent, including manufacturing, transportation, and end-use.
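The competing estimates can be reconciled with the same back-of-the-envelope math. Everything below comes from figures in the text (Weiss: 4,000 metric tons of annual production with about 16 percent reaching the atmosphere; Painter: 7,300 tons with an overall rate closer to 8 percent; NF3's GWP of 17,000); the function name is ours, for illustration.

```python
# Back-of-the-envelope check, using only numbers quoted in the article.

GWP_NF3 = 17_000  # carbon dioxide has a GWP of 1 by definition

def co2_equivalent_tons(production_tons: float, emission_rate: float) -> float:
    """Annual CO2-equivalent emissions implied by a production figure
    and an overall fraction escaping to the atmosphere."""
    return production_tons * emission_rate * GWP_NF3

weiss = co2_equivalent_tons(4_000, 0.16)     # ~10.9 million t CO2e per year
painter = co2_equivalent_tons(7_300, 0.08)   # ~9.9 million t CO2e per year
```

Despite the very different headline percentages, both accounts imply roughly ten million tons of CO2-equivalent a year, consistent with Painter's remark that the numbers are "relatively close" in order of magnitude, and with the earlier comparison to one of the world's largest coal-fired power plants.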
Getting the advertised results with NF3 always hinged on an expensive new technology called remote plasma cleaning.
It breaks up the gas in a remote container, then injects the active ingredient, fluorine, together with nitrogen, into the vacuum chamber. With the optimal configuration, the process destroys almost all the NF3.
Bigger companies made the change to remote plasma cleaning when they switched to newer fabrication tools, often at great expense.
"You can hear guys saying, ‘I've gone from a Hummer to a Prius. I've met all my voluntary commitments,'" said Painter.
But other companies stuck with older tools, simply replacing C2F6 with NF3.
This Band-Aid approach still releases about 20 percent of the NF3 into the atmosphere.
Painter argued that the struggling economy will force manufacturers to shut down these less efficient production lines, reducing overall emissions. But in October, Global Industry Analysts estimated that over the next four years NF3 production will increase to almost 20,000 tons, because of growing demand in the electronics industry.
Moreover, even the latest equipment does not guarantee that a company will achieve the optimal emissions rates — for instance, in the solar cell industry.
Amorphous silicon thin-film solar photovoltaic cells, manufactured using NF3, are slightly less efficient than crystalline silicon solar cells, the dominant technology.
But they are cheaper to produce and expected to supply a rapidly increasing share of the solar market, for both large-scale and domestic applications.
Because thin-film is a new technology, manufacturers generally use the latest equipment.
But a knowledgeable source, who asked to remain unidentified, recently visited thin-film solar researchers in Asia.
"They were unaware of the NF3 issue.
They were using a remote plasma, but they were also using quite a bit of NF3.
They weren't sure they had it set up right for 98 percent destruction.
It wasn't really on their radar."
The bottom line, said UC Irvine's Prather, is that "industry really cannot be trusted for self-regulation."
We will not know the extent of the problem "until we have honest, legally required reporting."
The other important lesson from the NF3 case, according to Scripps's Weiss, is that the bottom-up measurements required by some global warming regulations aren't enough.
Figuring out how much methane a cow produces, then adding up the cows, may not give you ground truth when it comes to global warming.
"You have to measure from the top down, and see what's actually going into the air."
A practical alternative to NF3 already exists. According to Paul Stockman of Munich-based Linde Gas, fluorine has zero global warming potential and no atmospheric lifetime.
But it's also highly toxic and reactive.
So instead of being shipped in bottles like NF3, it must be generated on site using special equipment.
Stockman, whose company manufactures NF3, said fluorine will become essential in thin-film solar manufacturing, because faster cleaning times mean a substantial boost in productivity.
Meanwhile, Air Products says it supports adding NF3 to the list of regulated greenhouse gases in the Kyoto Protocol's second commitment period, beginning in 2012.
But Prather believes industry needs to get more honest about NF3 production and emissions before then.
Solar cells are like any other product, he said, in that the manufacturing process has a global warming footprint.
But solar buyers are likely to be particularly concerned with the size of that footprint — and not so pleased to find out that what they thought was a Prius is really just a Hummer on the inside.
The Arctic Resource Rush is On
As the Arctic's sea ice melts, energy and mining companies are moving into previously inaccessible regions to tap the abundant riches that lie beneath the permafrost and the ocean floor. The potential environmental impacts are troubling.
by ed struzik
In spring, the shores of the Arctic Ocean in Canada's Northwest Territories are buried deep in snow and ice, seemingly devoid of all life and resources. But not far under the surface, in the relatively shallow permafrost, lies what could be one of the largest sources of energy ever discovered, a slushy mix of water, ice, and natural gas known as methane hydrates. These days, Arctic geologists are scrambling to find methods to tap into this abundant store of energy.
A geologist at a site near the Beaufort Sea, where exploratory drilling is taking place for methane hydrates, a potentially enormous source of natural gas.
Gas hydrates — lattice-like ice structures that trap large quantities of methane, the major component of natural gas — are just one of a trove of natural resources in and around the Arctic Ocean.
Vast reserves of oil, natural gas, and minerals also lie beneath the frozen sea and land.
For centuries, these riches lay out of reach.
Indeed, as recently as five years ago, few companies dreamed of investing in the Canadian Arctic because there was no safe and economical way of extracting these resources and shipping them out.
In an area nine times as large as California, there was — and is — only one highway, a third of it gravel, which goes to the Arctic Ocean.
There is no seaport and no railway.
All that, however, is about to change, as a fast-moving confluence of events is turning the Canadian Arctic — and some northerly regions of Russia and other Arctic nations — into the next Klondike:
Just as the Arctic's summer sea ice is melting at an unprecedented rate, soon opening up the fabled Northwest Passage and other shipping lanes, the booming global economy has created a soaring demand for natural resources, sending prices sky-high.
The wealth that has lain untapped beneath the Arctic is now rapidly being opened for exploration.
Soon, not only will ships be steaming across the top of the world, shaving 7,000 kilometers off a Europe-to-Asia voyage that now takes them through the Panama Canal — these vessels will also be able to penetrate previously inaccessible expanses of the Arctic to explore for, extract, and transport natural resources.
These developments will have profound environmental, economic, and global security impacts. Unlike Antarctica, where exploitation of minerals has been indefinitely banned under the Antarctic Treaty System, the Arctic is ripe for exploitation.
And the environmental implications of this resource rush are sobering, with fleets of ships and oil tankers moving through a pristine marine environment and legions of workers on land drilling and digging for all manner of mineral wealth.
A Canadian Coast Guard icebreaker on the Arctic Ocean.
Neighboring countries are making conflicting territorial claims as the resource scramble intensifies.
"It's only a matter of time before a single tramp steamer takes a run through the Northwest Passage," says Michael Byers, who holds the Canada Research Chair in International Law and Politics at the University of British Columbia.
"Our ability to stop that ship or clean up if it runs aground and spills its load is severely lacking.
We have the longest coastline in the world in a region that is covered by ice for most of the year, and we don't even have an all-weather icebreaker."
With so much money at stake, the Arctic has become a hotbed of territorial disputes as the surrounding countries spar for control of resources. In a report issued earlier this year, two of the European Union's top foreign policy officials warned of the looming international struggle over this energy rich region.
The Bush administration, however, is more sanguine, arguing that existing maritime treaties and regular meetings of the five major nations bordering the Arctic Ocean — Canada, Russia, the United States, Norway, and Denmark — will ensure the Arctic's peaceful and responsible development.
This focus on the Far North comes as a result of the stunningly swift disappearance of ice in the Arctic Ocean.
Scientists estimate that summer sea ice has declined by about 50 percent since the 1950s. Last year, summer sea ice extent reached a record low, and thick, multi-year sea ice now covers less than 30 percent of the ocean, down from more than 50 percent in the mid-1980s. Experts who once believed that the Arctic Ocean would not be largely free of summer ice until mid-to-late century now concede that the ice could be gone within a decade.
As a result, in nearly every corner of this icy world, resource companies are investing heavily.
Energy and mining firms have announced a $17 billion Arctic exploration agreement with Russia.
Exxon's Canadian subsidiary, Imperial, has formed a consortium to construct a $16 billion pipeline to send Arctic natural gas to southern Canada and the United States. Last summer, Exxon spent nearly $600 million for the rights to explore a tiny lease in the Beaufort Sea, and this year BP paid $1.2 billion to explore in the same region.
French uranium-mining giant Areva, which is one of 40 companies looking for uranium in Nunavut, Canada's new Inuit territory, is considering developing a huge mine.
The mine is just one of 200 development and exploration proposals the once-neglected territory is grappling with these days.
An increasingly common sight in the Arctic Ocean:
open water. Soon, the ocean will be ice-free in summer, opening the way for development.
On Canada's Baffin Island — a huge expanse of snow and ice that is home to fewer than 10,000 people, mainly Inuit — plans are underway to develop a $4.6 billion iron mine and railroad.
At Bathurst Inlet in the central Arctic, the Inuit themselves are contemplating construction of a seaport and a new 160-mile-long highway that would open up the Slave Geological Province, widely regarded as the richest untapped mineral deposit in the world.
The road also would service three existing diamond mines, with more to come.
Recently, along the shores of the Beaufort Sea in Canada's western Arctic, I got a sense of both the magnitude of these resources and the forbidding terrain that has so far kept them locked up.
With the springtime temperature at -35 F (-37 C), I watched as scientists from the Geological Survey of Canada tapped into methane hydrates buried under the permafrost not far from the frozen shoreline.
The gas rose quickly in the well before being capped by technicians.
Although methane hydrates are found deep in marine shelves worldwide, the biggest and most accessible reserves lie in the Arctic's relatively shallow permafrost zones and on the Arctic Ocean's continental shelf. The U.S. Geological Survey estimates that there may be more untapped energy stored in gas hydrates than in all of the conventional oil, gas, and coal reserves in the world.
Cognizant of the wealth at stake, many countries, including the U.S. and those in the European Union, are continuing to deny Canada's long-standing claim of sovereignty over the Northwest Passage, through which most of these resources would be transported.
They insist that the passage is, like the Suez Canal, an international strait.
Competing claims over Arctic territory are escalating.
In 2002, Denmark planted a territorial flag on Hans Island, a tiny, treeless chunk of rock off the coast of Ellesmere Island, only to have Canada later send its own mission to reclaim the island.
In the Beaufort Sea, the U.S. is in a dispute with Canada over the maritime boundary off the coast of Alaska and the Yukon, where billions of barrels of oil could potentially be found.
Last summer, the Russians used a submersible to plant a flag on the ocean floor at the North Pole, claiming 460,000 square miles — an area nearly the size of Germany, France, and Italy combined.
Much of that seabed already has been claimed by other Arctic countries.
Sovereignty, however, is not the only challenge that Canada faces in the Arctic.
The environmental threats, on land and sea, are huge.
Both the road that the Inuit want to build and the uranium mine that Areva is contemplating will be situated in the calving grounds of two huge caribou herds. Ships sailing through the Northwest Passage may also disrupt bowhead, beluga, and narwhal migrations. And today only a small portion of the biologically important areas bordering the Arctic Ocean are protected in parks or reserves.
But the biggest threat may well come from the captain of a single-hulled, crude oil tanker who tries to save time and fuel by taking a shortcut through the Northwest Passage.
If that ship runs aground or is crushed by melting ice spilling out of the High Arctic, it could make the two-year, $2 billion clean-up of the Exxon Valdez look like a kitchen spill.
Experts all agree that Canada is not prepared to handle such a disaster. With no Arctic seaport, no roads, virtually no Arctic naval capability, and very few airports from which to stage a recovery and cleanup, the government would be hard-pressed to mount an effective response.
Gary Sergy — an Environment Canada expert who helped pioneer oil spill cleanup technology in the Arctic — questions the ability of anyone to effectively deal with an Exxon Valdez-like disaster in the Canadian Arctic, where the huge Beaufort Gyre is constantly spiraling, pushing enormous volumes of ice and water through dozens of channels in the archipelago.
"How would you get a cleanup crew on site with no port or airstrip?" asked Sergy. " We just don't have the infrastructure.
It all boils down to a logistical nightmare."
Carbon's Burden on the World's Oceans
The burgeoning amount of carbon dioxide in oceans is affecting a lot more than coral reefs. It is also damaging marine life and, most ominously, threatening the future survival of marine populations.
When the annual meeting of the American Association for the Advancement of Science (AAAS) convened in Boston this year, a handful of marine biologists and physiologists exposed a whole new side of climate change.
They offered no data about the atmosphere.
Their focus was not on temperature.
Their conversation referred to warming only as a secondary and confounding effect.
Instead, these scientists were talking about chemistry — not just of the ocean water but of ocean animals themselves:
their cells, tissues, and body fluids. They were noting the increased costs of living that come as a result of elevated carbon dioxide inside the body and how this added cost stresses marine life and has led to massive extinctions in the past.
They were issuing a warning about the potential for unabated elevated carbon dioxide to threaten directly the survival of all marine species.
The direct link between increased carbon dioxide concentrations in oceans and increased internal stress on marine creatures is largely absent from the "climate change" dialogue.
It used to be enough to say "global warming" was the problem.
But increasing concentrations of carbon dioxide in the atmosphere — and in oceans — have been causing more varied and faster effects than previously imagined.
In fact, massive changes underway in the ocean are not captured with the word "climate."
Certainly, warming itself is a colossal issue.
Record world temperatures, melting sea ice, thermally expanding ocean waters, sea level rise, engorged rain clouds in some regions and droughts in others — "climate change" has been used to encapsulate all these effects.
Yet now, that term is beginning to hamper our understanding — and our conversation.
Beyond the well-defined relationship between carbon dioxide and temperature lie various chemical reactions that have profound implications for life.
These other issues are starving for attention, partly because they are not about warming, the atmosphere, or the climate.
But they are about the same carbon dioxide.
Almost half of all the carbon dioxide emitted since industrialization has been absorbed by the ocean.
When carbon dioxide reacts with water, it forms carbonic acid, and releases more hydrogen ions into the sea, lowering pH and causing "acidification" of the ocean.
Further, these hydrogen ions quickly bind with carbonate ions. This deprives animals like hard corals and certain mollusks and plankton of the raw material for their calcium carbonate shells and skeletons. This may ultimately cause the world's oceans to become corrosive to such animals, and coral reefs to dissolve.
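The reactions described in the last two paragraphs can be written out explicitly (standard carbonate-system notation, not taken from the article):

```latex
% CO2 dissolving in seawater forms carbonic acid, which dissociates,
% releasing hydrogen ions and lowering pH:
\mathrm{CO_2 + H_2O \rightleftharpoons H_2CO_3 \rightleftharpoons H^+ + HCO_3^-}
% The freed hydrogen ions then bind carbonate ions, depriving shell
% builders of their raw material:
\mathrm{H^+ + CO_3^{2-} \rightleftharpoons HCO_3^-}
```

The second reaction is why acidification removes carbonate from seawater even as the first one adds carbon to it.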
Calcification rates (think of this as the rate at which a coral, say, can grow, based on its ability to construct its skeleton) decline in relation to carbonate concentrations. Models predict that coldwater corals may lose 70 percent of their habitat by 2100 with some waters becoming corrosive as early as 2020.
Calcification rates in tropical waters have already declined by 6 to 11 percent and are expected to decline by as much as 17 to 35 percent by the end of the century.
Some models predict concentrations of carbonate ions will be too low for reef growth by as early as 2065.
It turns out that carbon dioxide molecules not only penetrate the ocean; they also infiltrate the bodies of marine animals, permeating cell membranes and disrupting fundamental biological functions. Carbon dioxide is a small, uncharged gaseous molecule that in the ocean environment can rapidly cross cell membranes. Once inside the cell, the same acidification process that happens in ocean water occurs within the cell.
Higher concentrations of CO2 alter the acid-base balance within cells and disrupt many cellular functions, from oxygen transport to protein synthesis. The more CO2 inside body tissues, the higher the cost of living for an organism.
In its energy budget, the cost of dealing with CO2 comes directly out of energy that would otherwise have gone for other basic functions such as metabolism, growth, immune function, and making babies.
We call this cost "metabolic drag."
It has long-term consequences for survival because even if not acutely fatal, over time reduced growth, disease resistance, and reproductive output threaten the viability and resilience of populations. And they're going to need resilience.
Too much carbon in the ocean particularly threatens creatures living in the deep sea.
The deep ocean is one of Earth's most stable environments, and its animals are adapted to that stability.
They don't handle change well.
Scientists predict that the change in pH in the deep sea will be greater than in the ocean's other regions.
Experiments show that elevated carbon dioxide affects various cellular and bodily functions, including protein synthesis, oxygen transport, and growth rates. But how will the ever-increasing CO2 affect individual animals, populations, and species in coming decades?
No one knows. Significant harm appears possible, but evaluating long-term effects will require more work.
Research is also critically needed to evaluate large-scale carbon disposal (sequestration) that would cause very high concentrations in deep ocean water (in some experiments, pH declined by more than 1 unit).
Consideration of these known and potential effects of elevated carbon dioxide levels has not been part of the "climate" debate, and it will be difficult to raise or understand these effects if people continue to believe that the problem only involves climate.
The language lags behind the science, and needs to catch up.
No term in use captures the full array of issues from warming and climate to the chemistry changes throughout the ocean and inside every marine creature.
Not "climate change," certainly not the almost-quaintly catastrophic "global warming."
Those aren't even the problem; they're symptoms. Behind all these symptoms is the root of the problem.
We call it "the carbon burden."
So hear this:
It is not just about climate.
It is, and always has been, about the carbon.
We need to place carbon back in the center of the equation.
From atmosphere to ocean to cell, the carbon burden is the problem.
It's the heaviest load anyone's ever placed on an unsuspecting planet, and the more we learn, the more staggering its dimensions appear.
The Growing Specter of Africa Without Wildlife
On the road from Nairobi into the Great Rift Valley not long ago, a 48-year-old Kenyan taxi driver named Jagata Sospeter pointed out how the landscape had changed in his memory — here a soccer field where rhinos were once commonplace, there a river where hippos used to live, and everywhere, as Kenya's human population continues to boom, the endless sprawl of shambas, tin-roofed farmhouses surrounded by an acre or two of parched maize plants in place of open range.
The one consolation, in a nation where tourism accounts for 10 percent of the gross domestic product, was that wildlife was at least secure within Kenya's national parks and protected areas.
But a new study says that sense of reassurance is false, with wildlife disappearing just as fast inside Kenya's national parks as out.
According to an analysis by David Western and his co-authors, wildlife declined by 41 percent in national parks from 1977 to 1997, and the decline does not appear to have slowed since then.
The study, published in PLoS ONE, was commissioned by the Kenya Wildlife Service (KWS), which manages the country's national parks. Shortly after the Western study appeared, a KWS spokesman announced that Kenya's lion population, "the symbol of national strength," is now declining so fast that lions could be extinct there within 20 years.
The Western study has attracted little media attention inside Kenya or beyond, in part because control of the national government remains a more pressing issue in the aftermath of a controversial 2007 presidential election.
Reaction within the conservation community was also muted, in part because Kenya has long been notorious for mismanaging its wildlife — and also because Western himself is a long-time participant in the national bickering about how to fix the problem.
But the new study follows a raft of recent papers reporting similar declines in other protected areas across the continent.
There are notable exceptions to this trend, particularly in the southern African nations of Namibia, Botswana, and South Africa.
But together with continuing increases in human population, these studies raise the specter of an Africa without animals.
During the rampant poaching of elephants and rhinos in the 1980s, the outside world was shocked at the improbable idea that wildlife could survive in Africa only in parks patrolled by armed guards, and often behind fences. But if wildlife continues to decline as rapidly inside national parks as out, it could lead, biologists Tim Caro and Paul Scholte predicted in a 2007 article in the African Journal of Ecology, to "a continent containing isolated pockets of large mammal diversity living at low population sizes. Just like Europe."
The new Kenya study compiled data from 270 wildlife counts over 30 years, mostly focused on antelopes, the feedstock on which lions, leopards, African wild dogs and other predators depend.
Western, who was director of KWS in the 1990s, described it as "the first time we've taken a good look at a national park system in one country, relative to all of the wildlife populations across the whole country."
The paper notes that the wildlife declines "raise grave concerns about the adequacy of parks and point to the need for a radical review of conservation policies."
But despite the trends revealed in his study, Western disputed the idea that Kenya could soon lose all its wildlife.
Elephants and rhinos seemed to be going down to extinction in the 1980s, he said in a telephone interview with Yale Environment 360, but they didn't "because people were alerted to this threat" and took action.
He argued that the same kind of shift is happening now, especially as studies provide hard statistical evidence of the decline.
Western, a long-time advocate of involving local communities in wildlife management, argued that the answer is to continue expanding the focus "from national parks only to parks plus private lands and communal lands."
In the past, the benefits of tourism generally flowed to tour operators and KWS, he said, leaving nothing for local people, who naturally came to regard wildlife as a threat rather than a benefit.
But wildlife populations are holding on, he said, in areas with "local participation."
Co-author Samantha Russell cited the example of the Shompole conservation area and tourist lodge managed since 2000 by the Masai community on the Tanzania border near Lake Natron.
"They've had wildlife increases and they're very proud of that fact," she said.
Asked if the project has produced the sort of community benefits that Western sees as the key to changing attitudes toward wildlife, she said, "In theory, there's a lot of money to be made."
But she conceded that "benefit sharing is always a tricky one to work out."
Despite considerable optimism and international support over the years, community management schemes have frequently failed. A 2000 paper by Alexander Songorwa of the Tanzania Wildlife Division recited a lengthy
catalogue of impediments, including government reluctance to turn power back to locals, resistance from national park services, the inability of illiterate locals to handle new accounting systems, and lack of wildlife management expertise.
But Western argued that much has changed in the years since Songorwa wrote his article, with Kenya training hundreds of local wildlife scouts and, more recently, resource assessors to keep track of changes in the habitat.
"Once you give them a voice, you give them opportunity, you give them skills and training, that changes very rapidly.
They're not locked into backwardness, which really that view implies."
James Deutsch, director of the Wildlife Conservation Society's Africa program, praised the Western study for producing the first hard evidence of the "extremely depressing" changes in Kenya's national parks. But he also questioned the study's "black and white conclusions."
He noted that community management success stories rarely come from East Africa these days, but mainly from southern Africa, particularly Namibia, which has a stable national government and a low population density — unlike Kenya.
Deutsch accepted the study's argument that the wildlife decline is due in part to bad design and siting of national parks, which often include only a fraction of the migratory range of major species. But that doesn't explain, he said, why the largest parks in the study suffered the worst declines, while some small parks actually showed increases. He dismissed the study's rosy assessment of the security provided by KWS, and blamed poaching across the border from Somalia for the 78 percent decline in Meru and the 63 percent decline in Tsavo East and Tsavo West National Parks.
Deutsch also noted that the study implicitly re-plays a dispute that has raged in Kenyan wildlife circles for 30 years, "often generating more heat than light."
On one side, Western pushes his community-involvement approach.
On the other, Richard Leakey, another former KWS director, argues for "fences-and-fines."
"For me, the world is complicated," said Deutsch, adding that he'd be interested in a study "that doesn't have an axe to grind from the start."
He rattled off a series of challenges to the survival of wildlife in Kenya — badly-flawed parks, little or no benefits flowing to people living around parks, a lack of income from legal trophy hunting and other consumptive uses of wildlife, the bushmeat trade, political corruption, inadequate protection against poachers — and suggested that in any given situation, either Western's approach or Leakey's might be the right way to go.
Soon after the new study appeared, a columnist in Swara, the East African Wildlife Society quarterly, took the entire Kenyan conservation establishment, including Western and Leakey both, sharply to task:
"The cosy pretence that all is well within Kenya's Protected Areas has been simply blown out of the water...
Have the courage to admit that everything you have recommended, supported, funded and implemented over the last 30 years in Kenya to conserve wildlife has been a failure — or was it your intention to sit idly by while some 70 percent of wildlife vanished from under your very noses?"
Yet the prospects for reversing this grim trend seem small.
The human population of sub-Saharan Africa continues to boom, with a projected increase of a billion more people across the continent by 2050.
So both fences and community-friendly approaches will almost certainly need to work — along with some miraculous remedy still to be devised — if Africa's rich and potentially lucrative wildlife legacy is to last through this century.
The Challenge for Green Energy:
How to Store Excess Electricity
"Why are we ignoring things we know?
We know that the sun doesn't always shine and that the wind doesn't always blow."
So wrote former U.S. Energy Secretary James Schlesinger and Robert L. Hirsch last spring in the Washington Post, suggesting that because these key renewables produce power only intermittently, "solar and wind will probably only provide a modest percentage of future U.S. power."
Never mind that Schlesinger failed to disclose that he sits on the board of directors of Peabody Energy, the world's largest private-sector coal company — a business with much to lose if a solar- and wind-powered future arrives. But at least he and his co-author got it partly right.
The benefits from wind and solar are mostly intermittent — so far. But the pair somehow missed the fact that a furious search for practical, affordable electricity storage to beat that intermittence problem is well underway.
For decades, "grid parity" has been the Holy Grail for alternative energy.
The rap from critics was that technologies like wind and solar could not compete, dollar-for-dollar, with conventional electricity sources, such as coal and nuclear, without large government tax breaks or direct subsidies. But suddenly, with rapid technological advances and growing economies of manufacturing scale, wind power is now nearly at grid parity — meaning it costs roughly the same to generate electricity from wind as it does from coal.
And the days when solar power attains grid parity may be only a half-decade away.
So with grid parity now looming, finding ways to store millions of watts of excess electricity for times when the wind doesn't blow and the sun doesn't shine is the new Holy Grail.
And there are signs that this goal — the day when large-scale energy storage becomes practical and cost-effective — might be within reach, as well.
Some technologies that can store sizeable amounts of intermittent power are already deployed.
Others, including at least a few with great promise, lie somewhere over the technological horizon.
New storage approaches include improvements to existing lithium ion batteries and schemes to store energy as huge volumes of compressed air in vast geologic vaults. Another idea is to create a network of small, energy-dense batteries in tens of millions of homes. Under such a "distributed storage" scheme, utility computers could coordinate electricity flows over a "smart grid" that continually communicates with — and adjusts the flow of power to and from — local batteries. This would even include batteries in future plug-in hybrid or all-electric vehicles.
And one 2008 breakthrough could even fulfill chemists' long-held dreams of producing a squeaky-clean and storable fuel by using excess electricity generated from renewable sources to cheaply produce hydrogen, which could then be used in fuel cells to power homes and cars.
In a world run mainly on fossil fuels, finding ways to store electricity was not a pressing concern:
Power plants across a regional electrical grid could simply burn more fuel when demand was high.
But large-scale electricity storage promises to be an energy game-changer, unshackling alternative energy from the constraints of intermittence.
It would mean that if a wind or solar farm were the cheapest and cleanest way to generate power, it wouldn't matter when the sun shone or the wind blew.
One storage approach seems obvious:
to improve battery technologies. Picture efficient, enormous batteries that can store tens of millions of watt-hours of juice.
Today, the vast majority of new rooftop solar photovoltaic panels are connected to the grid, which serves as a giant battery: the building pushes surplus power onto the grid when the panels generate more than it needs, then draws power from the grid when the sun doesn't shine, its meter spinning backward and forward with the ebb and flow of power. With relatively few solar roofs yet in play, utilities manage any ebb and flow by drawing down and ramping up generation at conventional power plants designed to balance fluctuating supply and demand.
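The meter's back-and-forth described above amounts to simple bookkeeping; a minimal sketch, with invented hourly figures:

```python
# Net metering sketch: the grid acts as a giant battery. The meter runs
# backward when a rooftop array exports surplus power and forward when
# the building draws from the grid. Hourly numbers are illustrative only.

def net_meter(solar_kwh, load_kwh):
    """Return net grid draw (positive) or export (negative) per hour."""
    return [load - solar for solar, load in zip(solar_kwh, load_kwh)]

solar = [0.0, 2.5, 3.0, 0.0]   # kWh generated each hour
load  = [1.0, 1.0, 1.5, 1.0]   # kWh consumed each hour

hourly = net_meter(solar, load)
net_total = sum(hourly)        # what the meter reads at the end
print(hourly)     # [1.0, -1.5, -1.5, 1.0]
print(net_total)  # -1.0 -> the building exported more than it used
```

A negative total is what lets a solar roof "bank" midday surplus against evening demand without any battery on site.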
A more robust world of solar and wind power might be better served by some sort of giant battery — or, more likely, many of them, widely distributed.
The basic concept has been proven.
Since 2003, the world's largest battery backup has been storing energy for an entire city:
Fairbanks, Alaska.
Isolated as it is, and not part of any regional electricity grid, the metropolitan area of about 100,000 residents needs an electricity backstop more than most:
In its sub-zero winters, pipes can freeze solid in as little as two hours. Six years ago, the city installed a huge nickel-cadmium battery, the same technology used for years in laptop computers and other portable devices.
Housed in a giant warehouse, the 1,300-metric ton battery is larger than a football field, and can crank out 40 million watts of power. Still, the Fairbanks battery provides only enough electricity for about 12,000 residents for seven minutes. That was enough to prevent 81 blackouts in the city in the battery's first two years of operation.
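The Fairbanks figures above imply a fairly modest amount of stored energy; a quick back-of-envelope check, using only the numbers quoted in the text:

```python
# Back-of-envelope check on the Fairbanks battery: 40 MW of output,
# roughly seven minutes of supply for about 12,000 residents.

power_mw = 40                       # rated output quoted above
minutes = 7                         # duration quoted above
energy_mwh = power_mw * minutes / 60
print(round(energy_mwh, 2))         # 4.67 MWh of deliverable energy

# Implied average demand per resident while the battery carries the load:
residents = 12_000
kw_per_resident = power_mw * 1000 / residents
print(round(kw_per_resident, 2))    # 3.33 kW each
```

The point of the arithmetic: a battery "larger than a football field" holds only a few megawatt-hours, which is why it bridges outages rather than replacing generation.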
Yet effective storage of electricity from solar or wind arrays that generate power equivalent to one large coal plant implies batteries on a breathtaking scale — hundreds of units the size of the Fairbanks array.
One possible answer?
In Japan, so-called "flow" batteries have been used for years to store backup power at industrial plants. Conventional batteries store energy in chemical form.
With flow batteries, charged chemicals are pumped into storage tanks, allowing still more chemical to be charged and pumped away, then pumped back into the active portion of the battery and drawn down as needed.
One big advantage:
Battery "size" can be expanded by simply adding more chemicals and more storage tanks. In 2003, the local utility on small King Island, off the coast of Australia, installed a large flow battery to sop up and later release excess power from a wind farm.
As with the alternative generation technologies, cost will be key for determining which battery or other storage technologies might prevail.
Aside from such typical economic concerns as raw material and maintenance costs and durability, storage technologies all face some losses in "round-trip efficiency."
Inevitably, some energy is lost as it goes into storage, and more is lost as it comes out.
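Round-trip efficiency is just the product of the charge and discharge efficiencies; a minimal sketch with invented figures, not measurements of any particular technology:

```python
# Round-trip efficiency: some energy is lost going into storage and more
# coming out. The 90% figures below are illustrative assumptions.

def round_trip(energy_in_kwh, charge_eff, discharge_eff):
    """Energy recovered after one full store-and-release cycle."""
    return energy_in_kwh * charge_eff * discharge_eff

# e.g. 100 kWh stored with 90% charge and 90% discharge efficiency:
recovered = round_trip(100, 0.90, 0.90)
print(recovered)            # 81.0 kWh back out
print(100 - recovered)      # 19.0 kWh lost in the round trip
```

Losses compound multiplicatively, which is why two seemingly good 90 percent stages still leave nearly a fifth of the energy behind.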
Right now, hopes are riding high on lithium ion batteries, because they have impressive round-trip efficiencies, can pack in high densities of energy, and can charge and discharge thousands of times before becoming degraded.
Because of those attributes, lithium-ion battery technology has become increasingly dominant in laptop computers and cell phones. On a far larger scale, a powerful lithium ion battery pack powers the pricey all-electric Tesla Roadster, and is slated to power the plug-in hybrid Chevy Volt next year.
On the grid, lithium ion experiments are already underway.
One company, General Electric-backed A123 Systems, announced late in 2008 that it had been contracted to install a two-megawatt lithium ion storage unit at a California power plant owned by global utility giant AES.
Still, lithium ion remains a relatively expensive technology — 10 times more expensive than lead acid batteries with equivalent capacity.
Technological improvements and manufacturing scale should bring lithium costs down over time, but by the time that happens, the world could be beating a path to the door of someone who's found a way to build an even better battery.
Early this year, IBM revealed that it was launching a major research program into what looks like an even more promising technology — the lithium metal-air battery.
Last month, a company called PolyPlus announced that it had already succeeded in developing one.
The PolyPlus battery and the IBM technology deliver an astonishing 10 times more energy density than even today's best lithium ion technology.
That means that, pound for pound, they offer about the energy density of gasoline.
The key reason they can store so much energy is that they use oxygen, drawn from the air, in place of some of the chemical reactants used along with lithium in their lithium ion cousins.
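The gasoline comparison can be sanity-checked with rough textbook figures; the cell and engine numbers below are assumptions for illustration, not values from the article:

```python
# Sanity check on the "pound for pound, about the energy density of
# gasoline" claim. All figures are rough textbook assumptions.

li_ion_wh_per_kg = 150                      # typical lithium ion cell, assumed
li_air_wh_per_kg = li_ion_wh_per_kg * 10    # the tenfold claim above

gasoline_wh_per_kg = 12_700                 # thermal energy content, assumed
engine_efficiency = 0.15                    # assumed tank-to-wheels efficiency

usable_gasoline = gasoline_wh_per_kg * engine_efficiency
print(li_air_wh_per_kg)          # 1500 Wh/kg
print(round(usable_gasoline))    # 1905 Wh/kg -> the same ballpark
```

Gasoline holds far more raw energy per kilogram, but once an engine's losses are counted, a tenfold improvement over lithium ion does land in the same range.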
There's one big rub:
Air isn't just oxygen.
Notably, it also contains humidity, and the lithium has a bad habit of acting like ignited gasoline when exposed to moisture, creating a real risk of fire and explosion.
Chandrasekhar Narayan, manager of science and technology at IBM's Almaden Research Center near San Jose, Calif., has suggested that it will take five to 10 years to develop an effective membrane that will let oxygen into the battery while keeping moisture out.
Still in pie-in-the-sky mode, there's "vehicle to grid" storage, or "carbitrage."
This enticing notion relies on idled storage in the batteries of the millions of plug-in hybrid or all-electric automobiles that will be in use in the future.
There's reason to believe this scheme could work.
Cars sit idle more than 90 percent of the time, and aside from days when they're used for long trips, most of their full energy storage capacity goes unused.
A single idle, electric-powered car could supply as much as 10 kilowatts of power, enough to meet the average demand of 10 houses, according to Willett Kempton, director of the Center for Carbon-free Power Integration at the University of Delaware.
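Kempton's per-car figure scales up quickly; a rough sketch in which the fleet size and the share of cars plugged in at any moment are invented assumptions:

```python
# Vehicle-to-grid ("carbitrage") arithmetic, using the ~10 kW per
# plugged-in car quoted above. Fleet size and plugged-in share are
# hypothetical assumptions for illustration.

kw_per_car = 10            # per-car output quoted in the text
fleet = 1_000_000          # hypothetical regional plug-in fleet
plugged_in_share = 0.5     # assumed fraction connected at a given time

cars_online = fleet * plugged_in_share
available_mw = cars_online * kw_per_car / 1000
houses_served = cars_online * 10    # ~10 houses per car, per the article

print(available_mw)     # 5000.0 MW of dispatchable capacity
print(houses_served)    # 5000000.0 houses' average demand covered
```

Even with only half the fleet plugged in, a million cars would rival several large power plants' worth of dispatchable capacity.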
With vehicle-to-grid technology, controlled by an array of smart meters, car owners plugged in at home or work could allow the grid to draw off unused chunks of power at times when short-term demand is high.
Conversely, cars could be recharged when demand is low.
The stored power in those electric cars, or anywhere on the grid, might not come from batteries after all.
In March, Texas-based EEStor announced that it had received third-party verification of its "ultracapacitor" technology.
The company claims the lightweight device, which was awarded a U.S. patent last December, can bottle up huge amounts of electricity far more quickly than any battery and can do so at lower cost.
Like batteries, capacitors store and mete out electricity.
Small conventional capacitors have been ubiquitous in electronic devices as far back as the early days of radio. But capacitors, so far, haven't been able to store electricity for long enough to come close to competing with batteries. They have found use as devices that level out fluctuations in voltage or that briefly store power for near-instant release.
EEStor claims that its device, which is one-quarter the weight of a similar lithium ion battery, can hold a large charge for days. Its patent describes a 281-pound device that would hold almost the same charge as a half-ton lithium ion battery pack installed on the Tesla Roadster. The company's ultracapacitors have yet to prove themselves in commercial products. But industrial giant Lockheed Martin has already signed up with EEStor to use future ultracapacitors in defense applications, and Toronto-based Zenn Motors, which has also taken an ownership stake in EEStor, says it will have electric cars on the road using the technology in 2010.
If advanced batteries or ultracapacitors aren't the ultimate answer, maybe using excess electricity to make hydrogen that can be stored will do the trick.
Hydrogen can be produced through simple electrolysis, but technical and cost hurdles have made electrolysis impractical.
Today, industrial-scale hydrogen is produced using natural gas as a not-so-clean feedstock.
But that may have begun to change last summer when MIT announced that a team led by chemist Daniel Nocera had made a "major discovery" that employs a new kind of catalyst using cobalt and phosphate — abundant and non-toxic materials — to kick-start electrolysis.
Outside observers say the process could be revolutionary:
opening up the possibility that electricity made at any time by the sun or wind could be stored by simply splitting (and later recombining) abundant water molecules, perhaps even undrinkable sea water. The breakthrough has been hailed by British scientist James Barber of Imperial College London as having "enormous implications for the future prosperity of humankind."
The website Xconomy reported in April that Nocera had quietly formed a startup company called Sun Catalytics. Efforts to reach Nocera for comment were unsuccessful.
And there is progress being made on an entirely different front — using excess electricity to pump compressed air into caverns, salt domes, and old natural gas wells, and then releasing the air to help state-of-the-art natural gas power plants spin turbines, lowering the amount of fuel consumed by as much as 70 percent. A consortium of utilities in Iowa, Minnesota, and the Dakotas is already working with the U.S.'s Sandia National Laboratories to develop a giant, 268-megawatt compressed air system.
Called the Iowa Stored Energy Park, it would store excess energy from the region's burgeoning wind industry.
The Corn Ethanol Juggernaut
The huge corn ethanol mandates imposed by Congress a few years ago may be the single most misguided agricultural program in modern American history.
That's saying something, but consider the program's impact:
higher global food prices, increased air pollution from burning ethanol-spiked fuels, spreading dead zones in the Gulf of Mexico from a surge in fertilizer use, and strong evidence that producing a gallon of corn ethanol generates just as many greenhouse gases as burning a gallon of gas.
Why then, given these many problems, hasn't Congress rolled back the mandates and stopped this boondoggle?
The answer can be boiled down to a few salient realities of American politics and agricultural policy.
First, even in the subsidy-rich world of U.S. agriculture, corn is king.
Second, the power wielded by the farm state lobby remains enormous. Third, Iowa is Ground Zero for corn, and its pivotal presidential caucuses leave even supposed change agents like Barack Obama bowing before the altar of corn ethanol.
And, finally, once a juggernaut like corn ethanol gets rolling with massive federal support and mandated production levels, bringing it to a halt is enormously difficult — even when study after study shows that relying on corn ethanol as a cornerstone of an alleged renewable energy policy is folly.
The corn sector has long enjoyed staunch backing from Congress. According to the nonprofit Environmental Working Group, between 1995 and 2006, federal corn subsidies, which are provided through a myriad of programs, totaled $56.1 billion.
That's more than twice the amount given to any other commodity, including American mainstays like wheat and cotton, and 105 times more than was paid to tobacco farmers.
Corn ethanol production has long been a favorite of farm state legislators in Congress, who have promoted the fuel as an alternative to the evils of foreign oil.
Congress approved the first ethanol subsidies in 1978, just a few years after the Arab oil embargo of 1973.
"It makes for a good public image — supporting the farmer, supporting the rural economy," says Thomas Elam, an Indianapolis-based agricultural economist.
The problem, he says, is that "it's a special-interest program that spreads the cost of the program across the rest of the economy."
Elam says that the farm lobby collects tens of millions of dollars a year to lobby lawmakers at the state and national levels. States like Iowa and Ohio have their own ethanol associations, which work in tandem with national groups like the Renewable Fuels Association.
In 2006 alone, that group collected about $3.7 million in dues from its members and paid its president, Robert Dinneen, a salary of $300,000 to push the ethanol-is-good message on Capitol Hill.
Additional support for the ethanol mandates comes from groups like the American Corn Growers Association and its larger cousin, the powerful National Corn Growers Association (NCGA), which reported 2006 total revenue of $8.6 million.
The NCGA has some 33,000 dues-paying farmers spread among 48 of the 50 states. On its Web site, the NCGA makes it clear that it aims to "increase ethanol demand" by establishing a federal program that is "part of a comprehensive energy policy."
These interest groups will spend millions of dollars "to keep the mandate where it is," says Jan Kreider, a professor emeritus of engineering at the University of Colorado, who has been studying motor fuels for three decades. "It's a massive political battle to even slow it down," says Kreider. "Once the mandates are in, it's almost a one-way street.
It could take decades to whittle down the size of the mandates."
The staying power of the ethanol mandates is largely due to the decades-long influence of the farm state delegations on Capitol Hill.
As former Sen. Bob Dole of Kansas once explained to Texas oil baron T. Boone Pickens, "There are 21 farm states, and that's 42 senators. Those senators want ethanol."
And the influence of those senators — 15 states now have ethanol production capacity of at least 200 million gallons per year — will be hard to overcome.
Cutting the ethanol mandates will require jousting with two of the most powerful members of the Senate, Republican Charles Grassley and Democrat Tom Harkin.
Both are Iowans. Both are ardent ethanol boosters. And Harkin is the chairman of the Senate Agriculture Committee.
Harkin's position gives him tremendous leverage over any ethanol-related legislation that comes before the Senate.
Which brings us to the Iowa Imperative.
Any candidate who wants to win the White House must make a good showing in the very first presidential nominating contest – the Iowa caucuses. "Candidates have to come here and suspend all critical judgment," says David Swenson, an economist at Iowa State University.
"There is a knee-jerk reaction in Iowa that if you don't support our special interests then you don't love us and we won't vote for you – and that's true even though the vast majority of Iowans don't have anything to do with farming and wouldn't know a crop if it fell on them."
The imperative can be explained by looking at the numbers:
Iowa now has about one-third of the ethanol production capacity in the U.S., and those ethanol plants provide jobs for several thousand Iowans.
Barack Obama understood the Iowa Imperative.
And his strong support for ethanol helped him win the Iowa caucuses.
That win validated his campaign and was a key factor in assuring that he won the Democratic nomination.
And the farm lobby is rewarding Obama.
On Aug. 22, the American Corn Growers Association endorsed Obama.
On the Republican side, John McCain, a long-time ethanol critic, tied for third in Iowa.
In August 2006, six months before the Iowa vote, McCain switched sides in the ethanol debate, telling a crowd in Grinnell, Iowa, that ethanol "is a vital alternative energy source not only because of our dependency on foreign oil but its greenhouse gas reduction effects."
McCain has since switched sides again and is now co-sponsoring a bill — introduced in May by Texas' Kay Bailey Hutchison and 10 other Senate Republicans — to freeze the ethanol mandates. Hutchison argued that the ethanol mandates needed to be limited because they were driving up the price of corn and were "clearly causing unintended consequences on food prices for American consumers."
Her bill would limit the volume of corn ethanol to be blended into gasoline to no more than 9 billion gallons. But current federal rules mandate far greater production:
U.S. oil refiners must be using at least 15 billion gallons of ethanol per year in their gasoline by 2015 and 21 billion gallons by 2022.
Such a sharp increase by 2022 would principally be reached by making ethanol from other materials like switchgrass and wood chips. But this "cellulosic ethanol" has never been produced in commercial quantities.
Hutchison's bill, S. 3031, is stuck in the Senate Environment and Public Works Committee. A hearing has not even been scheduled.
In early August, the Environmental Protection Agency denied a request by Texas Gov. Rick Perry to allow his state to opt out of the federal ethanol mandates. Corn prices are a critical issue for livestock producers in Texas, who have been hit hard as the mandates helped push prices higher. In denying the request, E.P.A. Administrator Stephen L. Johnson said that the ethanol requirements are "strengthening our nation's energy security and supporting American farming communities" and are not causing severe harm to the economy or the environment.
Furthermore, Congress has passed rules that make it hard to waive the mandates. The Environmental Working Group is one of several environmental groups that are fighting to slow or reverse the ethanol mandates. The group's Michelle Perez, a senior analyst for agriculture, was not overly surprised that the EPA rejected the Texas request.
"Congress set the bar pretty high for states to demonstrate environmental and economic harm in order to get the mandate waived," says Perez.
She points out that Texas applied for a waiver based only on the economic harm being done by the mandates. The state would likely have made a stronger case had it sought a waiver based on both economic and environmental harm, Perez says, noting that her organization has begun providing testimony to the EPA on the environmental impacts.
Any sustained attack on the ethanol mandates would have to counter the enormous amounts of capital that have been invested in the corn ethanol sector. The industry's momentum can be measured in the billions of dollars. According to the Renewable Fuels Association, the trade group, some 168 ethanol distilleries with an annual capacity of 9.9 billion gallons are now operating in the U.S. Those plants are spread among 26 states, and another 43 plants are under construction or are being expanded.
If you assume that each of those 200-plus plants costs $75 million to construct (a conservative estimate), the total cost of those distilleries is about $15 billion.
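That back-of-the-envelope total checks out. A quick sketch using the article's own figures (168 operating plants plus 43 under construction or expansion, at the assumed $75 million apiece):

```python
# Rough capital estimate for U.S. ethanol distilleries, using the
# article's figures; $75M per plant is the article's conservative guess.
operating_plants = 168
under_construction = 43
cost_per_plant = 75e6  # dollars

total_plants = operating_plants + under_construction
total_cost = total_plants * cost_per_plant
print(f"{total_plants} plants x $75M = ${total_cost / 1e9:.1f} billion")
# 211 plants x $75M = $15.8 billion, i.e. "about $15 billion"
```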
If the federal mandates are eliminated or rolled back, the owners of the ethanol plants could seek compensation from the federal government.
So the mandates continue, despite at least 10 studies — including one this spring by the World Bank — showing that the surge in U.S. corn ethanol production is forcing up global food prices. On the environmental front, a spate of studies has shown that the production of corn ethanol likely creates more greenhouse gases than conventional gasoline.
Due to the energy-intensive nature of the cultivation and distillation processes, ethanol produced from corn yields very little, if any, benefit.
Clean-air advocates also contend that the growing use of ethanol in gasoline is increasing the amount of smog in America's cities. William Becker, executive director of the National Association of Clean Air Agencies, which represents air pollution control authorities across the U.S., said Congress "decided to mandate ethanol without first analyzing the air-quality impacts."
Gasoline that has been blended with 10 percent ethanol may be more volatile than conventional gasoline, which means more light hydrocarbons — and ground-level ozone — are emitted into the air. For Becker, the conclusion is crystal clear:
"More ethanol means more air pollution.
Period."
Corn ethanol production also has negative impacts on water quality.
Researchers say that a key reason for the growing "dead zone" of oxygen-depleted water in the Gulf of Mexico is the increased planting of corn to meet the soaring demand from ethanol distilleries. That additional acreage has resulted in increased applications of fertilizers like nitrogen and phosphorus, which are then washed into the Mississippi, helping create the algal blooms that cause dead zones.
The controversy over the ethanol mandates will undoubtedly go on for months, or years, to come.
But even if Congress repeals the mandates and eliminates the subsidies for ethanol production, the ethanol industry will not shut down.
Even without federal supports, some distilleries will still be profitable.
And their profitability will be directly linked to the price of oil:
As the price of oil continues to rise, some of the most efficient ethanol producers will be able to compete with high-priced gasoline.
No matter what Congress decides to do in the future with regard to the ethanol mandates, it has birthed an industry that has an incentive to burn food in order to fuel cars. And the ramifications of that move — in food prices and environmental effects — are likely to reverberate throughout the global economy for years to come.
U.S. Market Sees 50% Annual Growth
A hodgepodge of federal and state policies is favoring the growth of large-scale solar farms, which will help propel the U.S. closer to the No. 1 spot, says GTM Research.
Solar energy installations in the United States are poised to grow about 50 percent annually in the next three years as the country closes in on Germany, the largest solar market in the world.
The U.S. is likely to install 400 megawatts of new solar projects in 2009, and see the growth reach 1.5 gigawatts to 2 gigawatts of new installations in 2012, according to GTM Research's new report released Tuesday.
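A quick compound-growth check shows how those endpoints line up with the headline rate; the 400-megawatt base and the 1.5-to-2-gigawatt range are the report's figures, while the growth rates tried here are illustrative:

```python
# Project 2012 installations from a 400 MW base in 2009 at a few
# assumed annual growth rates over three years (2009 -> 2012).
base_mw = 400
years = 3

for growth in (0.50, 0.60, 0.70):
    projected = base_mw * (1 + growth) ** years
    print(f"{growth:.0%}/yr -> {projected:,.0f} MW in 2012")
```

Growth of roughly 50 to 70 percent a year spans the 1.5-to-2-gigawatt projection, consistent with the report's "about 50 percent annually" framing.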
The strong demand represents over $6.1 billion in investments per year and the creation of 50,000 jobs, GTM Research said.
The report, The United States PV Market Through 2013:
Project Economics, Policy, Demand and Strategy, analyzed the scope and financing of power projects by major developers such as Sempra Generation and Renewable Ventures. It also examined policies and demand of big solar states, and detailed the impact of the American Recovery and Reinvestment Act of 2009 (ARRA).
"One of the big conclusions in our report is that the United States is really a ridiculous complex of state markets and utility markets, and each functions pretty much independently from the other, except for some relatively loose common threads," said Shayle Kann, an energy analyst at GTM and co-author of the report.
As in other hot solar markets around the world, government incentives are a big reason for the expected growth in the next few years. Last October, Congress extended a 30 percent investment tax credit for solar installations for eight years. The legislation gets rid of a $2,000 cap for residential installations and allows utilities to take advantage of the tax credit.
Another booster shot is coming from the ARRA, which has created a host of grants, tax credits and loan guarantees for manufacturing solar energy equipment and installing it.
These federal subsidies, coupled with states' own incentives and mandates for renewable energy installations and consumption, will propel growth for residential and utility-scale projects, the report said.
Projects developed to service utility customers will likely grow the fastest, from installing nearly 91 megawatts in 2009 to adding 466 megawatts in 2012, under a conservative estimate.
Twenty-nine states and the District of Columbia are requiring utilities to serve up an increasing amount of renewable electricity.
Out of the 29 states, 16 of them (and D.C.) have specified the amount of solar electricity and/or distributed generation in the power mix, according to the renewable energy database DSIRE.
These state mandates have prompted utilities to sign renewable electricity power purchase agreements or start developing their own wind, solar and other renewable power plants.
California has the most aggressive goal, mandating 20 percent of renewable electricity by 2010 for its investor-owned utilities. The utilities aren't likely to meet that mandate by 2010, however (see Cal May Add 365MW in 2009, Still Short of 20% Mandate).
But half a dozen states are seeing rapid growth. GTM Research estimates that new installations in Arizona, New Jersey, New Mexico, New York, Nevada and Massachusetts will grow collectively from 54 megawatts in 2008 to 376 megawatts in 2012.
Increasingly, developers who are benefiting from these state policies are veterans of the power industry, not startup companies.
Sempra Generation, for example, belongs to Sempra Energy, which also owns the utility San Diego Gas & Electric.
Sempra Generation developed a 10-megawatt solar farm next to its natural-gas power plant in Arizona and sold the resulting solar electricity to the Pacific Gas and Electric Co. starting in January this year (see PG&E to Get Solar Power For the First Time). PG&E has since agreed to buy power from another, 48-megawatt project from Sempra.
"You really have to have access to large financing channels, and Sempra is an example of that," Kann said.
"The project development game is becoming more crowded quickly.
Anybody who enters has to be very deliberate about how their strategies line up with the dynamics of the markets themselves."
Although a growing number of utilities are buying or developing their own solar farms, that doesn't mean they are less inclined to buy solar electricity from independent power producers, he added.
"Look at natural gas and coal, where utilities don't necessary have the capital to own every asset they get power from," he said.
"You will see the same thing ultimately in solar."
Although many solar companies expect the U.S. to become the world's leading market one day, it remains to be seen when the country will reach that goal.
Germany has become a bright spot in the otherwise gloomy market this year thanks to its generous solar incentives.
The country has already added about 1.5 gigawatts of new solar projects from January through September this year, and it could add another gigawatt or more by the end of the year, according to Germany's solar industry association.
Surpassing Germany might take more time than some company executives had foreseen earlier this year. For one thing, the ARRA so far hasn't given solar the boost that some had hoped for.
Although the ARRA spells out a myriad of incentives for solar, many of the programs also award money to projects for other types of renewable energy generation, electricity transmission, energy storage and even biofuels.
Competing for these federal dollars has proven difficult and more time consuming than some companies had anticipated.
As a result, the Solar Energy Industries Association is now lobbying for new legislation that would sweeten and extend some of the incentives for solar only (see Solar Industry Lobbies for Manufacturing Tax Credit, Cash Grant).
Top Ten Greentech Misses in 2009
2009 had plenty of hits but there were also a number of misses
2009 had plenty of greentech hits:
technical breakthroughs, lots of VC and government funding, some interesting acquisitions and a few successful IPOs. But there were also a number of misses. Here's a list:
Spain Pays for Its Poorly Executed Solar Subsidy:
The mistakes happened in 2008 but the echoes were felt in 2009.
Spain went from being the largest PV market in 2008 to almost zero in 2009.
Spain had a lucrative feed-in tariff program that required utilities to buy solar electricity at high rates set by the government.
After seeing an explosive growth of solar projects that far surpassed its estimates, the government reduced the solar electricity rates for solar power plants installed after September 2008.
Additionally, a government investigation uncovered widespread fraud in the administration and rollout of the FIT program.
A rush to install solar energy systems led to reports of fraud by developers claiming they had finished their projects when they only installed some of the panels or, in some cases, put in fake solar panels to buy time.
Spain had been a great market for Chinese panel makers, who were able to sell their goods at premium prices in 2008.
Not so in 2009.
Optisolar Lands Hard:
Optisolar had raised more than $300 million based on a vision of the economies of scale of building a gigawatt-sized factory.
The vision was that the cost of solar could be radically dropped by building "Solar City" factory complexes capable of churning out 2.1 gigawatts to 3.6 gigawatts of solar cells a year. These factories would cost $500 million to $600 million and be composed of factories-within-factories focused on different tasks:
an onsite glass making outfit capable of cranking out 30 million square meters of glass a year; a solar cell unit with 100 identical manufacturing lines; and a full-fledged packaging facility.
In this ideal world modules would cost 60 cents to 52 cents per watt and fully installed solar power would cost $1.00 to 88 cents a watt.
As per Michael Kanellos' article:
The ideas from the Keshner NREL paper largely formed the company's business plan.
After building a factory in Hayward capable of producing 30 megawatts to 50 megawatts, it landed $20 million in tax breaks in 2007 to build a factory at McClellan Air Force Base in Sacramento County.
By 2011, the million square foot facility would employ 1,000 and put out over 600 megawatts worth of solar panels a year. Although the original paper discussed ways of making cheap solar panels out of CIGS, cadmium telluride or amorphous silicon, OptiSolar focused on silicon because, among other reasons, of its far wider availability.
Plans were also being laid to build an even larger factory after McClellan that would contain the in-house sub-factory for glass making as discussed in 2004.
But OptiSolar didn't stop there.
Once organized as a company, OptiSolar also incorporated other ideas for cutting costs. Instead of concentrating on manufacturing solar panels, the company planned on installing them itself (through a subsidiary called Topaz Solar) and selling the power to a utility.
By acting as a vertically integrated company performing several functions, the idea was that the cost could be reduced because profit margins wouldn't be split among several companies.
In 2007, it won contracts to install over 200 megawatts in Ontario. The crowning achievement came in 2008 when the company won a deal to build a 550-megawatt solar farm near San Luis Obispo for PG&E over several other bidders. Toward the end of 2008, California Governor Arnold Schwarzenegger showed up for a factory opening that 60 Minutes covered.
OptiSolar avoided VCs and won funding from private equity firms accustomed to the long slog of energy deals. Many investors like Robert Puchniak and chairman Geoff Cummings came out of Canada's oil industry.
So what went wrong?
Those closest to the company chalk it up to the credit crisis. When the crisis hit, the company was outfitting the McClellan factory and seeking another $200 million in funding.
Prior to the credit crisis, the plan was to hold an IPO in 2010.
Others, though, said that the expansive goals gave the company mission creep.
It was negotiating deals for tracts of land while also planning out factories and buying expensive equipment.
Besides the already announced deals, it had secured rights to build power plants on 136,000 acres, enough for 19 gigawatts of power.
The company's failure, ultimately, resulted in the further expansion of First Solar. The relentlessly efficient solar maker bought OptiSolar's utility and land deals for approximately $400 million earlier this year.
Optisolar was an audacious bet, smacked down by the grim realities of the economy, by mission-creep, and by the hubris of the founders and investors.
GreenFuel Closes Down:
GreenFuel Technologies, one of the earlier algae biofuel companies, closed its doors in 2009, a victim of the credit crunch.
"We are closing the doors. We are a victim of the economy," said Duncan McIntyre at Polaris Venture Partners, an investor in Greenfuel.
Although it had raised millions of dollars and landed a high-profile deal with Aurantia in Spain to erect test facilities, it could not get money to complete the project.
The company had also been chronically saddled with delays and technical problems. The company's plan was to pump carbon dioxide from smokestacks into bioreactors – i.e., sealed plastic bags or tubes filled with algae and water. The algae would grow fat on the carbon dioxide and later be harvested by GreenFuel to be turned into oil for biodiesel.
The firm had raised $13.9 million from VCs including Access Private Equity, Draper Fisher Jurvetson and Polaris Venture Partners.
SV Solar RIP:
As one of many solar start-ups destined for the dust bin, SV Solar raised $10.2M from Bessemer to build low-concentration PV technology and then quickly disappeared.
Bessemer had funded a low-concentration PV firm (strike one), with a staff that had very little solar experience (strike two), based on some amazing cutting-edge technology that they called -- a prism (strike three). The company's value proposition rested on the then-high price of silicon, backed by investors who didn't fully grasp supply and demand -- that the price of the silicon commodity was bound to drop as capacity was added.
2009 to 2011 will see enormous attrition for weak startups in solar power. It will be a gut-wrenching experience but it will leave the industry leaner and stronger in the 20-year solar boom to come.
Some Bad News in Geothermal – Whole Lotta Shaking Going On
Venture Capitalists are accustomed to technology risk, market risk, and policy risk.
However, they are not accustomed to seismic risk, and the possible threat of earthquakes has sort of shut down a project by VC-funded AltaRock.
In September of this year, AltaRock's enhanced geothermal systems (EGS) demo at The Geysers in Northern California suspended its drilling operations due, not to seismic activity, but to the borehole collapsing in unstable geologies. Previous permitting studies had not indicated a problem.
AltaRock continues to go after EGS, just not at The Geysers. AltaRock is funded by Google, Khosla Ventures, KPCB, ATV and Vulcan Capital.
But add that PR gaffe to the $9M in damages caused by actual seismic activity from the EGS project in Basel, Switzerland, and EGS boosters are going to have to move a little slower and more carefully in 2010.
Failure to Meet U.S. Renewable Fuel Standards (RFS) for Advanced Biofuels:
Under the EPA's guidelines, refiners are required to blend 100 million gallons of cellulosic biofuel in 2010, increasing to 250 million gallons per year in 2011 and 500 million gallons per year in 2012.
In a recent 1,000-page report justifying the advanced biofuel mandate, the EPA outlines 25 pilot and demonstration plants currently operating in the United States. The EPA maintains that in 2010, 100.7 million gallons of cellulosic ethanol and diesel will be produced.
But in looking at the specifics of the EPA's conclusions, it is interesting to note that 70 million gallons of the 100 million gallons in 2010 is expected to come from Cello Energy – a virtually unknown company that does not yet have an operating website.
The company has operated so far below the radar that Paul Winters, the spokesman for a cellulosic industry trade group BIO, was recently quoted as saying, "I have to admit that before the EPA [report], I hadn't heard of them. I don't even have them listed on our map of cellulosic facilities."
In recent months, Cello Energy was found liable for fraud, and it is unlikely that the company will be producing meaningful amounts of biofuel any time soon.
Cello was run by Alabama's former ethics chairman, Jack Boykin, and funded by Khosla Ventures, et al.
GTM Research forecasts that U.S. cellulosic ethanol capacity will reach 28 million gallons in 2010, with 4.4 million gallons of cellulosic ethanol actually being produced in the United States that year.
Uni-Solar's Troubles:
In early December, Uni-Solar (aka ECD) announced plans to cut 20 percent of its 1,900-employee workforce, the latest sign that the company is struggling to weather the economic downturn.
The company builds multi-junction amorphous silicon solar cells on flexible substrates.
Like many other companies in the solar industry, Uni-Solar has struggled amid an economic downturn that has strangled investments in building solar power projects worldwide over the past year. Earlier this year, solar panel manufacturers and their suppliers were reporting steep revenue cuts and even losses. Uni-Solar had shut down its factories in Greenville and Auburn Hills for about a month between May and June this year.
While other solar companies began to see a boost to their revenues and profit during the summer, Uni-Solar continued to post losses.
Uni-Solar's product is unique, but the reliability and cost model of its technology has been questioned on numerous fronts. According to a recent research note by Deutsche Bank's Steve O'Rourke:
"ECD has a solid technology and niche market position, but a higher cost basis and is beset by sharp ASP pressure; when combined with challenging end market conditions (demand and project financing) we expect quarterly losses looking into C2010."
ECD looks to have a challenging 2010.
Smart Grid Backlash:
Excerpts from an October 22 article in The Fresno Bee:
More than 100 people packed a town hall meeting in downtown Fresno to vent their frustration with PG&E's newest metering technology – SmartMeters – which customers say have led to faulty spikes in utility bills. "The meters, in my opinion, are not very smart," PG&E customer Joe Riojas told Senate Majority Leader Dean Florez, D-Shafter. The meeting lasted four-and-a-half hours. No one spoke in favor of the Smart Meters.
Many customers brought their PG&E bills to show Florez their skyrocketing costs. For example, Don Vercellini of Fresno said his bill recently went from $500 a month to $1,173.
"It's straight-out fraud. I want my money back," he said.
Florez complained that the technology for customers to check usage will not be in place for years.
Said Florez:
"People don't see the value [in this program].
They just see higher cost, and that makes them angry."
According to Jeff St. John's reporting:
Those complaints have focused attention on PG&E's $2.2 billion, 10 million smart meter deployment, with the California Public Utilities Commission demanding that PG&E find a third party to investigate.
But PG&E has already tested many customers' smart meters – made by General Electric and Landis+Gyr and networked by Silver Spring Networks – and has not found any problems with how they're working, according to PG&E spokesman Denny Boyles.
Rather than malfunctioning meters, PG&E thinks the higher bills have come from its two rate hikes in the past 12 months, plus a hot summer that led to many Central Valley residents cranking their air conditioners to beat the heat, Boyles said.
With the feds ready to launch another wave of smart grid funding – it would be helpful for the public to actually want these products and services. And to actually feel some immediate benefit and value from the smart grid.
It can't be just about benefits for the utilities.
Imara, Lithium-Ion Battery Firm, Runs Out of Juice:
"We have the best product entering the market place."
– Jeff Depew, CEO of Imara.
In September, lithium-ion battery startup Imara said it had begun to commercially produce lithium-ion batteries and might have a customer or two to announce in a few months. The company's secret sauce revolved around a cathode that would effectively allow a battery to store more lithium ions than standard lithium-ion batteries. The idea emerged from experiments conducted at SRI in 2000, funded by a Department of Energy program to develop electric cars (see Battery-Maker Imara Shuts Its Doors).
By December, unable to obtain its next round of funding (a problem many mid-stage start-ups are going to face), Imara was closing its doors and ceasing operations. The company had experienced a delay in ramping up operations and could not line up investors to build a factory.
"It certainly did not help that hundred of millions in DOE stimulus funds went to two Korean companies and one French company, Saft," wrote Neal Maguire, vice president of business development."
Solar Electricity Cost Likely to Fall 50% in 2009
The big drop in solar panel prices this year has contributed to a significant decline in the cost of building and operating solar power plants this year, says New Energy Finance.
The cost of solar electricity is likely to drop by 50 percent in 2009 from the previous year due largely to a big fall in solar panel prices, said New Energy Finance Monday.
The 50 percent drop refers to what's commonly called the "levelized cost of electricity," or the cost of producing power over the lifetime of a solar power plant, from construction through operation. Utilities and banks use this metric to gauge their investments and operational costs for these generation facilities over time.
The levelized cost for solar electricity fell to as low as $160 per megawatt-hour in 2009 globally, said Jenny Chase, head of solar research at New Energy Finance.
The $160 per megawatt-hour came from installations in sunny spots – such as the deserts in the western United States – that used the cheaper thin-film solar panels, like the ones produced by Tempe, Ariz.-based First Solar. The cost of building solar energy systems using thin films can be as low as $3 per watt, New Energy Finance said.
A solar power project that uses the cheapest thin films could achieve a levelized cost 25 percent lower than one using the cheapest crystalline silicon solar panels, New Energy Finance said.
For projects located in less sunny locations that use the more expensive crystalline silicon solar panels, their levelized cost could more than double, Chase said.
The levelized costs for other types of renewable electricity, such as wind and geothermal, are expected to drop 10 percent in 2009 from 2008.
These levelized costs don't take into account any government subsidies.
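For readers unfamiliar with the metric, here is a minimal levelized-cost sketch: discounted lifetime costs divided by discounted lifetime generation. Every plant parameter below (capital cost, output, O&M, lifetime, discount rate) is an illustrative assumption, not a figure from the New Energy Finance report:

```python
# Levelized cost of electricity (LCOE): present value of all costs
# divided by present value of all energy produced.
def lcoe(capex, annual_opex, annual_mwh, years, discount_rate):
    """Return lifetime cost per MWh."""
    pv_costs = capex + sum(annual_opex / (1 + discount_rate) ** t
                           for t in range(1, years + 1))
    pv_energy = sum(annual_mwh / (1 + discount_rate) ** t
                    for t in range(1, years + 1))
    return pv_costs / pv_energy

# Hypothetical 10 MW thin-film plant built at $3/W in a sunny spot:
# ~25% capacity factor (21,900 MWh/yr), 25-year life, 8% discount rate.
print(f"${lcoe(30e6, 250_000, 21_900, 25, 0.08):.0f}/MWh")
# prints $140/MWh, in the neighborhood of the figures quoted above
```

Lower panel prices cut the capex term directly, which is why a 30 to 50 percent module price drop translates into a large levelized-cost decline.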
"It's incredibly exciting.
The price of photovoltaics has plummeted this year, and we are seeing that opening up new markets that wouldn't have made sense before," Chase said.
"Governments around the world are going to cut subsidies, but they are still going to see a buoyant demand for solar."
Major solar energy markets are European countries, such as Germany, Spain and increasingly Italy, as well as the U.S. and Japan.
China and India recently announced or implemented solar incentive programs, which will help create demand for their domestic manufacturers and enable them to meet any pledge to reduce greenhouse gas emissions (see India Wants 20GW of Solar by 2020 and Here Comes China's $3B 'Golden Sun' Projects).
Without government incentives, however, solar electricity generally remains more expensive than power from coal and natural gas plants. Government subsidies are meant to help bridge the difference between the costs of installing and operating solar power plants and the corresponding costs for fossil fuel-based power, Chase said.
The levelized cost for electricity from fossil fuel power plants ranges between $55 and $105 per megawatt-hour, depending on whether a price for carbon emissions is included, she added.
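As a rough sketch of how a levelized cost is computed (simplified: a real LCOE model discounts year-by-year cash flows, and apart from the $3-per-watt thin-film capital cost cited above, every input and the function name below are illustrative assumptions, not figures from the article):

```python
def lcoe_per_mwh(capex_per_kw, fixed_om_per_kw_yr, capacity_factor,
                 lifetime_years, discount_rate):
    """Toy levelized cost of energy, in $ per MWh."""
    # Capital recovery factor spreads the upfront cost over the plant's life.
    crf = (discount_rate * (1 + discount_rate) ** lifetime_years) / \
          ((1 + discount_rate) ** lifetime_years - 1)
    annual_cost = capex_per_kw * crf + fixed_om_per_kw_yr  # $ per kW per year
    annual_mwh_per_kw = 8760 * capacity_factor / 1000      # MWh per kW per year
    return annual_cost / annual_mwh_per_kw

# Hypothetical thin-film plant at $3/W installed, in a sunny location:
sunny = lcoe_per_mwh(3000, 30, 0.25, 25, 0.08)
# The same plant with less sun (lower capacity factor) costs more per MWh:
cloudy = lcoe_per_mwh(3000, 30, 0.15, 25, 0.08)
```

With these made-up assumptions the sunny-site figure lands in the same ballpark as the $160/MWh the article quotes, and the sensitivity to sunshine shows why less sunny sites can see costs double.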
The costs of installing and operating solar power plants differ not only between countries but also within them; in the U.S., for example, individual states offer their own incentives and wield authority over electricity prices.
"Recently in the U.S., we've seen a number of rebate programs being reduced or eliminated.
These incentive changes often have just as large an impact on the ultimate cost of PV generation as falling module prices," said Shayle Kann, an energy analyst with GTM Research, in an email.
For consumers, how much they pay for solar electricity also varies with the amount of government incentives. That price, of course, also depends on whether they are paying for electricity from their own solar energy systems or through utilities that buy solar electricity from power producers. A recent report by the Lawrence Berkeley National Laboratory on the prices paid for installing solar energy systems in the U.S. showed the disparity between system prices in different states (see Solar Declines in Price by More Than 30%, Don't Credit the Panel). The lab relied primarily on data about small solar energy systems installed at homes or businesses.
The significant pricing decline in solar energy equipment over the past year – from silicon to cells to solar panels – has played a big role in the levelized cost reductions. Solar panel manufacturers have reported anywhere from a 30 percent to 50 percent drop in their products' pricing.
Supply outstripped demand as the recession dimmed the banks' interest in loaning money to solar power project developers.
The solar panel pricing decline has tapered off, while demand has picked up nicely in the third quarter, particularly in Germany, where generous government subsidies and banks' willingness to invest in solar power projects are making it this year's top market.
In fact, a solar energy industry association in Germany told Reuters that the country is likely to install between 2.5 and 3 gigawatts of solar energy systems in 2009, up from a forecast 2 gigawatts. Market research firm iSuppli also recently revised its forecast for German installations to 2.5 gigawatts from 1.53 gigawatts.
The German government has reported that the country saw 1.47 gigawatts of new solar installations from January to September this year (see Germany Installs 2.34GW, FIT to Decline 9-11%).
Many manufacturers who posted a big drop in profits or even losses earlier this year have reported a big surge in sales for the third quarter.
Why The Chemistry Industry Wants Carbon Caps
With the right policy signals, the industry can start trimming emissions and refashioning products, says Genomatica's CEO
For years, business and environmental leaders have been looking ahead to Copenhagen as the next big step in global climate change negotiations, and the anticipation has reached a fever pitch recently.
Although a binding accord remains elusive, it is still an important moment to clarify the way forward.
As the CEO of a sustainable chemicals start-up, I have two perspectives on the global climate negotiations.
As a business leader seeking sustainable growth for our customers and my company, I believe that global leaders should push to establish a global climate agreement that will provide clarity on the economic implications of delivering low carbon solutions. Without a global agreement, businesses are left to guess at the shape of future regulation.
Without national policy or global agreements in place, we cannot determine the specific impact on our regions, industries and businesses.
An emissions cap would provide clarity and allow for long-term planning, while allowing business to seek the most efficient ways to reduce emissions. As such, Genomatica supports a global emissions cap and long-term reduction pathway.
Along with global chemical companies like BASF, BP and Braskem, we signed the Copenhagen Communiqué calling for a stable and clear climate policy.
A policy pathway will unleash more low-carbon innovation and economic growth.
In clean tech circles, it is easy to get behind a price on carbon.
By definition, clean technologies become more cost-effective when regulations place a price on carbon.
But even for companies outside the cleantech sector, a clear policy framework will allow for strategic planning around research and development investment, carbon credits and other changes to operations. Even, perhaps especially, for large emitters of greenhouse gasses, an emissions cap will be good for business. The U.S. Climate Action Partnership has been leading on this issue, calling for "a mandatory economy-wide, market-driven approach to climate protection."
However, the global petrochemicals industry faces unique challenges in a carbon-constrained world.
According to the American Chemistry Council, the chemical industry is the largest energy consumer in the manufacturing sector. Chemical companies use more natural gas than the entire state of California, both to power facilities and as a raw material for thousands of common chemicals.
Industry leaders such as BASF, Dow and DuPont have publicly staked out forward-looking positions on climate change and have invested in reducing their environmental impacts. For example, BASF invests over €400 million per year in research and development for products and technologies for energy efficiency, climate protection, resource conservation and renewable raw materials. Dow has pledged to advocate for an international framework that establishes clear pathways to slow, stop, and reverse emissions by all major carbon dioxide-emitting countries.
Just as the telecom industry faced an existential challenge from technology disruption and recovered, I believe the chemical industry faces a tough challenge now but will innovate in multiple ways. Telephone companies of old adapted to offer text, data, video and voice services. Disruption spawned new business models and new technologies, but allowed them to open new markets and even shift consumer behavior.
The global chemicals industry must transform in two key ways to combat global warming.
First, the industry should seek renewable, responsible feedstocks to displace meaningful quantities of oil and natural gas used today as feedstocks. Extracting and processing fossil fuels is a major contributor to global warming, and a significant part of the industry's environmental impact.
Reducing our use of today's petroleum feedstocks with renewable resources will have a significant positive impact on climate change.
Even more importantly for the industry, diversifying our feedstock options will help stabilize costs and reduce the impact of any price on carbon.
According to a recent survey conducted by Genomatica and ICIS, 57 percent of chemical industry respondents believe their companies should reduce exposure to the petroleum-based commodity market.
Six in 10 companies are already engaged with sustainable chemical practices, and nearly all of these chemical companies (93 percent) are maintaining such programs during the economic downturn.
Using renewable raw materials is good business, and good for the climate.
Second, the industry should continue its energy efficiency gains by seeking innovative changes to reduce the energy consumption of manufacturing processes, and even consume greenhouse gasses. Some of the largest global chemical companies have already been leading in this effort, even before the Kyoto Protocol.
For example, in 1994, DuPont set a voluntary goal to reduce their global greenhouse gas emissions by 40 percent in just six years. They reached the 40 percent reduction on time in 2000, and then revised the goal, aiming to achieve a total 65 percent reduction by 2010.
In 2003, they achieved that goal seven years early.
Advances by large producers like this have far outpaced the national goals set by global agreements, and the industry can be proud of that.
Moving forward, bio-based processes can improve energy and greenhouse-gas efficiency even more.
Because we as an industry have picked the low-hanging fruit of improving energy efficiency in many cases, the next reductions will call for more innovation.
Bio-manufacturing already produces some important intermediate chemicals, and interesting new polymers as well.
Just as technology has lowered costs in other industries, new bio-based processes can use less energy for the same results.
In addition, the industry can do a better job of educating the public on the potential of improved chemical processes. Its entrepreneurs and scientists should explain possible solutions and what it will take to make them reality.
We have the chance and responsibility to demonstrate how scientific research can revolutionize industry, combat global warming and bring economic growth.
As negotiations wrap up in Denmark this week, the chemical industry is just one example of how business is taking a leadership role in remaking our economy. A stable and clear climate policy would spur growth, but innovation must continue in parallel to ultimately deliver transformational change for both the profitability of the industry and the environment.
Nuclear Industry Wish List
The industry is angling for 25 to 30 new plants, loan guarantees and fuel recycling.
To meet the current goals for greenhouse gas emissions, the U.S. would have to build 187 new nuclear plants by 2050, according to former New Jersey Governor Christine Todd Whitman, who now co-chairs the CASEnergy Coalition, which advocates increased nuclear power in the U.S.
But the industry will settle for 25 to 30 by 2030, she said.
That would be enough to meet the expected growth in demand for electricity in the U.S. while keeping nuclear around 20 percent of the mix.
The U.S. currently has 104 reactors.
Although a commercial reactor hasn't been built here in decades, a new wave of reactors appears to be becoming financially, technically and politically possible, she added. Thirty-two new nuclear plants at 21 sites have already been proposed for the U.S.
"It can be done.
They've done it in the past, building four to five a year," she said.
The first new reactor might go up in the U.S. in six to seven years.
"We are still going to need baseline power," she added.
While Whitman is clearly an advocate, the call to more seriously consider nuclear and even expand the nuclear footprint in the U.S. and Europe continues to grow.
Energy Secretary Steven Chu has come out in favor of expanded nuclear and so have a number of high-profile academics.
"I don't see a sensible solution [toward reducing carbon emissions] without having nuclear as part of the mix," said Dan Kammen, the UC Berkeley professor who also runs the Renewable and Appropriate Energy Lab at the school, at a recent event sponsored by Google. MIT's Ernie Moniz says nuclear might be the most practical way to decarbonize energy production in New England.
Skeptics like Amory Lovins of the Rocky Mountain Institute, however, have argued that centralized, large power plants are riskier and more costly than distributed power like solar and wind and energy efficiency techniques. Rising prices for steel and concrete, not to mention security and waste handling, will invariably hurt the rosy economics the industry touts, Lovins adds.
In any event, the debate is on.
So what has changed for the industry?
First, the technology has improved, according to Whitman.
In the past, nuclear reactors were largely one-of-a-kind creations: 95 of the 104 reactors in the U.S. are based on different designs. The industry has moved toward standardized reactors, which cuts down cost and certification time.
"There will be four [designs] or five at the most," she said.
Reprocessing has also made advances. Right now, the U.S. stores spent nuclear fuel.
Not only does that create a nightmare over where to store it, it's not efficient.
Approximately 95 percent of the energy remains in the fuel, she argued.
"Reprocessing cuts that down to two to three percent," she said.
"France, Russia and Japan all reprocess. In Japan, the remaining fuel is reprocessed in such a manner to prevent theft.
"You can't get to it without dying," she said.
These technological changes in turn could help curb one of the notorious historical problems for nuclear:
outrageous delays and cost overruns. "You are probably not going to see spikes" in costs with standardization, she argued.
And politically, Whitman asserts that nuclear can be an engine for jobs. Building a reactor can employ 1,200 to 4,000 people, she said, and operating one will employ 400 to 700.
The total economic output to a community from a single reactor can be as high as $430 million a year.
A big test for the industry lies in the climate bills. The nuclear industry wants to qualify for loan guarantees. Without them, Wall Street banks may not back new nuclear plants. If the ultimate law only allows loan guarantees for "renewable" energy, nuclear won't qualify.
"Uranium is not renewable," she said.
If the law allows guarantees to go to "clean" energy companies, nuclear will qualify.
It could be a long debate.
Whitman, a Republican, predicted that the Obama Administration may not get a climate bill passed until 2011.
"2010 is an election year," she said.
California Sheds Enough Water to Equal Lake Mead
The drought is real, and that could drive up the price of grapes.
SAN FRANCISCO -- California's Central Valley – one of the most productive agricultural regions in the U.S. – has lost enough water since late 2003 to fill up Lake Mead, a dire trend for the state and the country.
Between October 2003 and March 2009, the total amount of water in the Sacramento and San Joaquin basins, which support a 154,000-square-kilometer region, dropped by 31.3 cubic kilometers, or about enough to fill a basin the size of Lake Mead, according to Jay Famiglietti, a professor at the University of California, Irvine, speaking at the American Geophysical Union meeting taking place this week in San Francisco. The AGU meeting is an annual gathering of earth and space scientists.
Of that total, 20.3 cubic kilometers came from groundwater depletion, caused largely by a drought that started in 2006 and by restrictions on the allocation of surface water. Most of the losses are coming from the southern part of the region.
"Continued reliance on groundwater will deplete critical reserves and eliminate a buffer," he said.
"The numbers we are getting are pointing to groundwater use at unsustainable rates."
The trend poses "significant threats to food production in the U.S. and the state's economy," he added. Some 250 different crops are grown in the Central Valley.
The harvest comes to around $17 billion a year, about 8 percent of U.S. food production by revenue.
As a whole, the Central Valley constitutes one-sixth of the irrigated land in the U.S. and one-fifth of the demand for groundwater.
California isn't alone.
India, Australia and the southeastern U.S. have also seen groundwater losses. Three states in northern India have lost 17.7 cubic kilometers a year over the past decade. (A cubic kilometer holds 264.2 billion gallons – enough for 400,000 Olympic-sized pools.)
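The parenthetical conversions check out; a quick sketch of the arithmetic (the 2,500 m³ nominal Olympic pool volume is an assumption, though it is the figure the comparison implies):

```python
M3_PER_KM3 = 1_000_000_000   # 1 km³ = 10^9 cubic meters
GALLONS_PER_M3 = 264.172     # US gallons per cubic meter
OLYMPIC_POOL_M3 = 2_500      # nominal 50 m x 25 m x 2 m pool

km3 = 1.0
gallons = km3 * M3_PER_KM3 * GALLONS_PER_M3   # ≈ 264.2 billion gallons
pools = km3 * M3_PER_KM3 / OLYMPIC_POOL_M3    # = 400,000 pools
```

By the same math, the 31.3 km³ lost from the two basins is roughly 8.3 trillion gallons.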
The data comes from the Gravity Recovery and Climate Experiment (GRACE), conducted by NASA. GRACE takes space observations and cross-checks them against data on the Earth's gravitational field.
Although various startups have launched products for conserving or desalinating water, the market is largely dominated by large companies. General Electric and Siemens have bought several water companies in recent years while IBM is working in China, Iowa, the Netherlands, Ireland and other regions on water management strategies.
What It Costs Us
Underground coal miners work in the darkness, invisible to most of us, and when they die -- also in the darkness, from methane explosions or rock falls or any of the hundreds of other hazards they face every day -- their deaths usually merit just a few paragraphs in the local newspaper.
The attempted rescue of trapped coal miners, on the other hand, is often headline news. Networks love the real-time drama of the rescue efforts -- it's reality TV from the heartland, complete with anguished family members, heroic workers and dodgy mine owners. Sometimes, these stories have happy endings. In 2002, nine miners who were trapped in a coal mine in Quecreek, Pa., for 77 hours emerged as celebrities, feted by Oprah and photographed for Vanity Fair magazine.
But not every mine rescue turns out so well, as the Crandall Canyon mine disaster near Huntington, Utah, has reminded us over the past three weeks. When three rescuers were killed trying to dig out the six miners who've been trapped since Aug. 6, the story turned, as Gov. Jon Huntsman Jr. put it, "from a tragedy into a catastrophe."
In the coming months, tough questions will be asked about exactly what happened in the Crandall Canyon mine:
Did federal mine safety officials do everything they could to protect the miners?
Did Robert Murray, the co-owner of the mine, value profits over human life?
And why, at the beginning of the 21st century, when we can download real-time images from Mars onto our laptop computers, has no one figured out a way to track or communicate with coal miners underground?
"This is a defining moment for the history of mining," Huntsman said.
"We all expect to come out of this better and smarter and safer."
But if history is any guide, straightforward answers to what happened in Utah will be as rare as oxygen in the collapsed mine.
We can expect a hue and cry about mine safety on Capitol Hill, a lot of blame-shifting and finger-pointing and, most likely, some modest mine safety improvements. But you can bet that you won't hear much about the real issue, which is the high cost of the United States' dependence on coal, and whether it's worth the price we pay.
Many Americans think that coal went out with top hats and corsets. In fact, we burn more than a billion tons of coal each year in the United States -- about 20 pounds a day for every man, woman and child.
We don't burn it in coal stoves, of course, but in big power plants that generate about half the electric power in the country.
Politically, the war in Iraq has been a boon for coal, allowing coal-friendly politicians to tout America's 250-year supply as a substitute for our addiction to Middle Eastern oil -- even though, in the real world, there is no overlap between coal (used to generate electricity) and oil (used for transportation fuels, among other things). This is not to say that the coal industry would not dearly love to get into America's gas tank.
In recent months, it has pushed hard for subsidies and tax breaks that would accelerate the construction of coal-to-liquid plants, a technology developed by the Nazis during the 1930s that can transform coal into liquid fuels such as diesel (for technical reasons, it's very difficult to make gasoline from coal).
Coal boosters argue that today's industry is nothing like the industry of yore, and that many of the problems with the fuel -- like the fact that air pollution from power plants kills people -- have been solved by new technology.
Coal is cheap, plentiful and clean, they say.
What's not to like?
Mine disasters such as the one in Utah, however, don't exactly fit this script.
It's tough to argue that you've left the 19th century behind when you have Murray -- one of the most prominent coal barons in the United States, well known for his political connections and influence -- insisting that the collapse was caused by an earthquake, directly contradicting seismologists who say that their instruments clearly show that the seismic activity was the result of the collapse in the mine.
It may not surprise you that Murray also believes global warming is a hoax.
Claims about a 250-year supply of coal won't stand up to scrutiny for long, either. Yes, the United States has more coal than any other nation.
But we've been mining coal in this country for 150 years -- all the simple, high-quality, easy-to-get stuff is gone.
What's left is buried beneath towns and national parks, or places that are difficult, expensive and dangerous to mine.
The blunt truth is, if we're going to become more dependent on coal, more miners will die.
How many mining tragedies will we accept in the name of "cheap" electricity?
Digging up hard-to-get coal will also devastate Appalachia, where huge mountaintop-removal mines have already buried 700 miles of streams and 400,000 acres of forests. (Mountaintop-removal is a particularly destructive form of mining in which entire mountains are blasted apart to expose the coal seams inside; the rubble is typically dumped in nearby valleys.) Instead of strengthening oversight of this type of mining, the Bush administration proposed last week to loosen regulations and allow it to expand.
One recent study estimated that if this practice continues, within 40 years the region disemboweled by mining will be approximately the size of Rhode Island.
As for "clean coal," it's a nice advertising slogan, but it's not a statement of fact.
According to Americans for Balanced Energy Choices, a nonprofit group funded by coal companies and coal-burning electric utilities, emissions of conventional pollutants from coal plants have fallen by one-third between 1970 and 2000, even as the use of coal to generate electricity has tripled.
What they don't tell you is that a) the industry fought the laws that mandated many of those reductions; and b) the amount of pollution spewed out by a coal plant is still enormous.
According to the Union of Concerned Scientists, a scientific advocacy group, annual emissions from a typical coal plant include 10,000 tons of sulfur dioxide, the major cause of acid rain; 10,200 tons of nitrogen oxide, a major contributor to smog; 500 tons of small particles, which cause lung damage and other respiratory problems; 225 pounds of arsenic; 114 pounds of lead; and many other toxic heavy metals, including 170 pounds of mercury, which can cause birth defects, brain damage and other ailments.
But the big issue is global warming.
Burning coal accounts for more than one-third of U.S. emissions of carbon dioxide, the main greenhouse gas. In a single year, a big coal plant emits as much carbon dioxide as 1 million SUVs. Coal plants that are built today emit just as much CO2 as those that were built 50 years ago (there have been some marginal gains in efficiency, but not many). In the future, carbon dioxide might be captured from coal plants and pumped underground into abandoned oil wells or deep saline aquifers, but at the moment, these solutions are unproven and expensive.
The coal industry is soaking up billions of dollars in tax breaks and subsidies to develop technology and study the problem.
But according to climate scientists such as NASA's James Hansen, if we hope to have a chance of avoiding dangerous changes to Earth's climate, we don't have time to wait.
That's why Hansen, along with former vice president Al Gore and others, has called for a moratorium on new coal plants that do not capture and store carbon dioxide pollution.
And that's why Silicon Valley entrepreneurs are investing hundreds of millions of dollars into clean-energy technology -- because they know that confronting the problem of global warming is not just the biggest challenge that civilization has ever faced, but also the mother of all economic opportunities.
It may seem like a long way from the melting Arctic to the mine disaster in Utah, but it's not.
The lesson from Crandall Canyon is not just that we need stronger mine safety laws and better federal oversight of dangerous mines, but that as Americans, we need to be more conscious of the costs and consequences of what goes on behind the light switch.
Otherwise, instead of coming out of this disaster smarter, stronger and safer, we're likely to find ourselves repeating this story again and again.
Lasting troubles in America's coal community
We Americans don't see the faces of these people or hear their voices in the kitchen when we flick on the light in the morning and start the coffee.
But they are there nonetheless.
When we pull some "juice" from the wires, it has been supplied, about half the time, courtesy of the coal mining communities. Whenever we scoop a teaspoon of baking powder, drive down a street, shine a flashlight, or brush our teeth, we are using one of several thousand by-products generated by the coal economy.
We are all inextricably bound to these coal-mining communities and yet we all know so little about them.
The tragedies at West Virginia's Sago and Alma mines allow us to look more deeply into the rural region that has been at the heart of America's industrial power. The suffering of the men who died in the mines and the grief of their families should not be in vain.
Death is never far away in West Virginia's coal camps, and there have been other mine tragedies in rural West Virginia.
Nearly a century ago, Monongah was the site of the worst mine disaster in US history - a December 1907 explosion killed 361 miners. In 1940, a fire and explosion killed 91 miners in Bartley.
In 1968, the night crew of the Consol No. 9 Mine, near Farmington, suffered a disaster similar to the one in Sago. The bodies of 59 miners were brought to the surface, but 19 remain forever entombed.
In 1972, one of the deadliest floods in US history, caused by negligent strip mining, occurred at southern West Virginia's Buffalo Creek Hollow.
Heavy rain caused the coal slurry pond to break, and a raging torrent of water left 118 people dead and over 4,000 homeless.
For more than a century, there has been a constant battle between the government, the United Mine Workers, and the coal industry about mine safety.
Often, major disasters like these have prompted new legislation to protect the miners, as we have seen in the wake of this latest disaster. Indeed, mining is safer today, despite recent rollbacks of safety requirements. Fewer miners are killed now because there are far fewer of them.
In 1948, 168 million tons of coal was produced by 126,000 miners. Last year, 15,000 West Virginia coal miners produced about 128 million tons of coal, using high-tech machines. Yet coal mining remains the deadliest of the extractive industries, with about half of all US mining fatalities each year occurring in coal mines.
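The productivity shift behind those figures is worth spelling out; a quick calculation from the numbers cited (variable names are mine):

```python
# Coal output and workforce, per the figures cited in the text
tons_1948, miners_1948 = 168_000_000, 126_000
tons_recent, miners_recent = 128_000_000, 15_000

per_miner_1948 = tons_1948 / miners_1948        # ≈ 1,330 tons per miner
per_miner_recent = tons_recent / miners_recent  # ≈ 8,530 tons per miner
ratio = per_miner_recent / per_miner_1948       # roughly a 6x gain
```

Mechanization, in other words, let about an eighth of the workforce produce three-quarters of the tonnage, which is why fewer miners die even as safety rules are rolled back.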
While the swift actions of the state and federal government are laudable, the context in which these chapters of Appalachian coal mining occur should be noted.
There has been a long tradition in West Virginia and other Appalachian coal-producing communities in which the land and the people have not received adequate respect.
For example, there are plenty of laws on the books that are not rigorously enforced, and there is a long tradition of various governmental bodies turning a blind eye to infractions.
In addition to the constant threat of death, miners live with another crisis that is a result of this rather neglectful relationship.
Up close, West Virginia is a disturbing overlap of two parallel universes. One is the functioning universe of employed West Virginians we saw in the town of Tallmansville, where people hope to diversify their economy and state by insisting on education for their children and working hard to develop tourism and business parks.
The other West Virginia could be mistaken for a slum in some part of the third world.
Coal camps still line creeks like peas in the folds of an apron, but they are shrunken and dried out.
Dilapidated houses and trailers litter the hollows like piles of waste, mixed up with denuded forest, jagged abandoned swaths of strip mines, and toxic slurry ponds. Raw sewage flows down the creeks in some of the most beautiful mountains in our country.
Clumps of toilet paper still cling to tree roots, left from the last cycle of flooding.
Big cities like Welch or Mullens that once teemed with a hundred thousand people or more are now cavernous, disintegrating mazes. Aging and disabled miners, their widows, and a lost generation of people who have never lived in a viable economy hang on, passing time in front of the TV or "settin'" on the porch.
My husband, photographer Ken Light, and I spent four years in southern West Virginia gathering images and oral histories that illustrate the unique intersection of history, geography, and culture in this region.
These mountain people present a cautionary tale as an aggressive, take-no-prisoners brand of free enterprise is adopted around the world.
The very same dynamics were at play in this region long ago when outside investors invaded, treating the land and the people of Appalachia as commodities, available for use as long as they were needed.
Government wilted under pressure to accommodate the growth of the coal-mining industry, and the land and people were left without an advocate.
Natural Gas Is Often Stored Before It Is Delivered
Natural gas is moved by pipelines from the producing fields to consumers. Because natural gas demand is greater in the winter, it is stored along the way in large underground storage systems, such as old oil and gas wells or caverns formed in old salt beds. The gas remains there until it is added back into the pipeline when people begin to use more gas, such as in the winter to heat homes.
A generalized natural gas industry process flow diagram that goes from the well to the consumer.
When the gas gets to the communities where it will be used (usually through large pipelines), it flows into smaller pipelines called "mains."
Very small lines, called "services," connect to the mains and go directly to homes or buildings where it will be used.
Natural Gas Can Also Be Stored and Transported as a Liquid
LNG Transport Barge Unloading
When chilled to very cold temperatures, approximately -260°F, natural gas changes into a liquid and can be stored in this form.
Because it takes up only 1/600th of the space that it would in its gaseous state, liquefied natural gas (LNG) can be loaded onto tankers (large ships with several domed tanks) and moved across the ocean to other countries. When this LNG is received in the United States, it can be shipped by truck to be held in large chilled tanks close to users or turned back into gas when it's ready to put in the pipelines.
In this compact form, natural gas can be shipped in special tankers to receiving terminals in the United States and other importing countries. At these terminals, the LNG is returned to a gaseous form and transported by pipeline to distribution companies, industrial consumers, and power plants.
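The 600-to-1 expansion ratio makes it easy to estimate how much pipeline gas one cargo represents; a small sketch (the 145,000 m³ tanker capacity is an illustrative assumption, not a figure from the text):

```python
EXPANSION_RATIO = 600   # gaseous volume / liquid volume, per the text
CF_PER_M3 = 35.3147     # cubic feet per cubic meter

def gas_volume_bcf(lng_m3):
    """Gaseous volume, in billion cubic feet, regasified from LNG."""
    return lng_m3 * EXPANSION_RATIO * CF_PER_M3 / 1e9

# Hypothetical tanker cargo of 145,000 cubic meters of LNG:
cargo_bcf = gas_volume_bcf(145_000)   # on the order of 3 Bcf
```

Set against the roughly 23 Tcf the U.S. consumed in 2008, a single cargo of that size would supply about an hour of national demand, which is why LNG arrives as a steady stream of shipments rather than occasional deliveries.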
Liquefying natural gas provides a means of moving it long distances where pipeline transport is not feasible, allowing access to natural gas from regions with vast production potential that are too distant from end-use markets to be connected by pipeline.
Most of the Natural Gas Consumed in the United States Comes from Domestic Production
U.S. natural gas production and consumption were nearly in balance through 1986.
After that, consumption began to outpace production, and imports of natural gas rose to meet U.S. demand for the fuel.
In 2008, production stood at 20.6 trillion cubic feet (Tcf), net imports at 3.0 Tcf, and consumption at 23.2 Tcf.
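Those three figures almost, but not exactly, balance; the small residual goes to net storage additions and statistical balancing items. A quick check of the arithmetic, using only the Tcf figures cited above:

```python
# 2008 U.S. natural gas balance, trillion cubic feet (Tcf)
production = 20.6
net_imports = 3.0
consumption = 23.2

supply = production + net_imports   # 23.6 Tcf
residual = supply - consumption     # 0.4 Tcf to storage and balancing items
```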
Share of 2007 natural gas production:
* Texas (30%)
* Federal Offshore Gulf of Mexico (14%)
* Wyoming (10%)
* Oklahoma (9%)
* New Mexico (8%)
In 2008, 90% of net imports came by pipeline, primarily from Canada, and 10% came by liquefied natural gas (LNG) tankers carrying gas from five different countries.
What is the Federal Offshore Gulf of Mexico?
Some natural gas and oil wells are drilled into the ocean floor in waters off the coast of the United States. States have jurisdiction over any natural resources within three nautical miles of their coastline, except for Texas and the west coast of Florida, where State jurisdiction extends to nine nautical miles. The Federal government retains ownership of resources past those limits. There are around 4,000 oil and gas platforms producing in Federal waters up to roughly 7,500 feet deep and up to 200 miles from shore.
Most of them are in the Gulf of Mexico.
Natural Gas Is Stored Underground
There were about 400 active underground storage fields (salt fields, aquifers, or depleted fields) in the United States during 2008.
Natural gas is injected into these fields primarily during April through October and withdrawn primarily from November through March during the peak heating season.
The volume of working (withdrawable) gas in storage during 2008 ranged from 1.2 trillion cubic feet at the end of March to 3.4 trillion cubic feet at the end of October.
U.S. Underground Natural Gas Storage Facilities, Close of 2007
Aerial View of the Great Plains Synfuels Plant in Central North Dakota
Photo Credit:
Courtesy of Basin Electric Power Cooperative (Copyrighted)
Supplemental Gas Supplies
Supplemental gas supplies include blast furnace gas, refinery gas, propane-air mixtures, and synthetic natural gas (gas made from petroleum hydrocarbons or from coal). These supplemental supplies totaled 55 billion cubic feet (Bcf) in 2008.
The largest single source of synthetic gas is the Great Plains Synfuels Plant in Beulah, North Dakota, where coal is converted to pipeline-quality gas.
What Are Gas Shales?
Shale is a very fine-grained sedimentary rock that is easily broken into thin, parallel layers. Shales can contain a large amount of natural gas, but it's not necessarily mobile.
Extensive efforts such as horizontal drilling and creating artificial fractures in the rock are often needed to achieve satisfactory production rates.
Gas shale is one of a number of "unconventional" sources of natural gas; other unconventional sources include natural gas produced from coalbeds and from "tight" (impermeable) sandstone or chalk formations.
Underground Reservoirs Hold Oil and Gas
A "reservoir" is a place where large volumes of methane, the major component of natural gas, can be trapped in the subsurface of the Earth at places where the right geological conditions occurred at the right times. Reservoirs are made up of porous and permeable rocks that can hold significant amounts of oil and gas within their pore spaces.
What Are Proved Reserves?
Proved reserves of natural gas are estimated quantities that analyses of geological and engineering data have demonstrated to be economically recoverable in future years from known reservoirs.
Proved reserves are added each year with successful exploratory wells and as more is learned about fields where current wells are producing.
For this reason, proved reserves change constantly and should not be treated as a fixed measure of the resources available.
How Large Are U.S. Natural Gas Reserves?
As of December 31, 2007, estimated proved reserves of "dry natural gas" (consumer-grade natural gas) in the United States were 237.7 trillion cubic feet (Tcf). The United States consumed 23.2 Tcf of natural gas in 2007.
Record-high additions to U.S. dry natural gas proved reserves in 2007 totaled 46.1 Tcf. The dry natural gas reserve additions mostly reflected the rapid development of unconventional gas resources including shale, coalbed methane, and tight, low-permeability formations. Many of these unconventional resources are cost effective to develop because of advances in drilling technologies and in techniques to increase gas yields from these formations and because of increases in market prices for natural gas.
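Dividing the proved-reserves figure by annual consumption gives a rough sense of scale, though, as the text stresses, reserves grow with new discoveries, so this is a snapshot, not a countdown clock:

```python
# Reserves-to-consumption ratio from the figures in the text.
# Proved reserves are revised upward as new fields are found and
# existing fields are better understood.
proved_reserves_tcf = 237.7    # dry natural gas, end of 2007
annual_consumption_tcf = 23.2  # U.S. consumption, 2007

years_at_current_rate = proved_reserves_tcf / annual_consumption_tcf
print(round(years_at_current_rate, 1))  # about 10.2
```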
What Are Undiscovered Technically Recoverable Resources?
In addition to proved natural gas reserves, there are large volumes of natural gas classified as undiscovered technically recoverable resources. Undiscovered technically recoverable resources are expected to exist because the geologic settings are favorable despite the relative uncertainty of their specific location.
Undiscovered technically recoverable resources are also assumed to be producible over some time period using existing recovery technology.
As of January 1, 2007, EIA assumes that domestic natural gas undiscovered technically recoverable resources are approximately 1,536 trillion cubic feet. Almost half of all onshore undiscovered recoverable gas resources are believed to be located in the Alaska and Gulf Coast regions. Over one-third of all undiscovered gas resources are estimated to be in Federal offshore areas, primarily near Alaska, in the Gulf of Mexico, and along the Atlantic Coast.
Natural Gas Is a Major Energy Source for the United States
About 24% of energy used in the United States came from natural gas in 2008.
The United States used 23.8 trillion cubic feet (Tcf) of natural gas, matching the record high set in 2000.
How Natural Gas Is Used
Natural gas is used to produce steel, glass, paper, clothing, brick, and electricity, and it serves as an essential raw material for many common products, including paints, fertilizer, plastics, antifreeze, dyes, photographic film, medicines, and explosives.
Slightly more than half of the homes in the United States use natural gas as their main heating fuel.
Natural gas is also used in homes to fuel stoves, water heaters, clothes dryers, and other household appliances.
The major consumers of natural gas in the United States in 2008 included:
* Electric power sector — 6.7 trillion cubic feet (Tcf)
* Industrial sector — 7.9 Tcf
* Residential sector — 4.9 Tcf
* Commercial sector — 3.1 Tcf
Where Natural Gas Is Used
Natural gas is used throughout the United States, but the top natural gas consuming States in 2007 were:
* Texas
* California
* Louisiana
* New York
* Illinois
What Are the Environmental Impacts of Mountaintop Removal?
Mountaintop removal is widely recognized, even by government agencies that regulate it, as one of the most environmentally devastating practices allowed under U.S. law.
According to the U.S. Environmental Protection Agency:
"The impact of mountaintop removal on nearby communities is devastating.
Dynamite blasts needed to splinter rock strata are so strong they crack the foundations and walls of houses. Mining dries up an average of 100 wells a year and contaminates water in others. In many coalfield communities, the purity and availability of drinking water are keen concerns."
This is occurring right at the heart of one of the nation's main hotspots of biological diversity.
According to the Nature Conservancy, the mountain region including southwest Virginia, southern West Virginia, eastern Kentucky and northeastern Tennessee contains some of the highest levels of biological diversity in the nation.
This, as it turns out, is precisely the region where mountaintop removal is spreading the fastest.
Already, more than a quarter of the mountains in the southern West Virginia coalfields have been leveled.
Because there has been no significant effort to track the spread of mountaintop removal by government agencies, and the maps provided by coal companies to state agencies on the locations of valley fills and mined areas are so inaccurate as to be worthless for scientific purposes, there is very little scientific information on the effects of mountaintop removal.
The one major governmental report on mountaintop removal that has come out, while based on faulty data, did report significant environmental impacts from mountaintop removal.
Here are some of the impacts and concerns expressed in the final EPA report:
* More than 7 percent of Appalachian forests have been cut down and more than 1,200 miles of streams across the region have been buried or polluted between 1985 and 2001.
* Over 1,000 miles of streams have been permitted to be buried in valley fills. (For scale, this is a greater distance than the length of the entire Ohio River.)
* Mountaintop removal mining, if it continues unabated, will cause a projected loss of more than 1.4 million acres by the end of the decade—an area the size of Delaware—with a concomitant severe impact on fish, wildlife, and bird species, not to mention a devastating effect on many neighboring communities.
* More than 800 square miles of mountains are estimated to have already been destroyed. (This is equal to a quarter-mile-wide swath of destruction from New York to San Francisco, and it is also a significant underestimate.)
Other quotes from the 2003 report include:
* "… studies found that the natural return of forests to mountaintop mines reclaimed with grasses under hay and pasture or wildlife post-mining land uses occurs very slowly.
Full reforestation across a large mine site in such cases may not occur for hundreds of years."
* "Because it is difficult to intercept groundwater flow, it is difficult to reconstruct free flowing streams at mountaintop removal sites."
* "Stream chemistry monitoring efforts show significant increases in conductivity, hardness, sulfate, and selenium concentrations downstream of [mountaintop removal] operations."
The threat of coal slurry impoundments
In addition to the environmental impacts of blasting mountains into rubble and burying streams with mining waste, mountaintop removal coal mining requires the building of giant sludge dams, which can hold billions of gallons of toxic coal sludge behind un-reinforced earthen dams. These slurries are necessary because, unlike coal from underground mines, coal from mountaintop removal requires extensive washing to separate the coal from debris and residues from the blasting of bedrock.
As of 2000, there were more than 600 sludge impoundments across the Appalachian coalfields. Chemical analyses of this sludge indicate it contains large amounts of arsenic, mercury, lead, copper, and chromium, among other toxins, which eventually seep into the drinking water supply of nearby communities. Even worse than this seepage, however, is the threat of a dam break.
Several dam breaches have occurred, including one at Buffalo Creek in West Virginia in 1972 that took the lives of 125 people, many of whom were children.
The most recent sludge dam breach was in Martin County, Kentucky, in 2000, which the EPA called the worst environmental disaster in the history of the Southeast.
When the sludge dam breached, more than 300 million gallons of toxic sludge (about 30 times the amount of oil released in the Exxon Valdez oil spill) poured into tributaries of the Big Sandy River, killing virtually all aquatic life for 70 miles downstream of the spill.
Martin County Coal Co., a subsidiary of A.T. Massey Coal and the company responsible for the spill, argued in court that the event was "an act of God."
Yet not long after the disaster, the Kentucky Department of Surface Mining issued four citations to the company for unsafe waste storage practices and for allowing the 70-acre pond to weaken to the point of collapse.
At present, there are 45 impoundments in West Virginia alone that are considered at high risk for failure, and 32 are at moderate risk.
In addition, most local communities are dependent on groundwater, which could be fouled by mining waste.
Electricity from Burning Coal
Thirst for coal.
Since the 1930s the electricity-generating industry has thirsted for coal.
After World War II most markets for coal declined, leaving the coal industry increasingly dependent upon electric utilities for income.
From 1949 through 1977 utilities increased their share of total U.S. coal consumption every year. By 1969 electric utilities consumed sixty percent of U.S. coal production, a share that grew to seventy-six percent in 1977 and to eighty-five percent in 1987.
For 1999 the National Mining Association predicted that U.S. coal production would be 1.14 billion tons, of which electric utilities would use 935 million tons to produce 56.8 percent of their electricity.
King Coal has become the handmaiden to electric utilities.
The driving forces behind electric utilities' increasing hunger for coal are:
1. consumer demand;
2. ready availability of coal, particularly West Virginia's low-sulfur coal; and
3. air emissions requirements.
1. Demand for energy.
From 1977 to 1997 Americans' demand for electricity jumped seventy percent.
In no small part this results from a long-standing American preoccupation with the frontier. The 19th-century historian Frederick Jackson Turner opined that the frontier sets America apart from its European roots. For us there always has been more land, fortune, fuel, and food for the taking.
We don't like limits and act accordingly.
We like to do things in a big way.
We like quickness and convenience.
We are energy hogs but we are not alone.
During the twentieth century increases in the world's urban population, energy use, and industrial output, respectively, were 13x, 16x, and 40x.
We earthlings used ten times as much energy in that century as in the previous 1,000 years. Before the 20th century, humans had little impact on earth and rock compared with glaciers, tectonic movements, and oceanic volcanoes. By the 1990s we had surpassed all of them while moving 42 billion tons of rock and soil per year. In the last century we lost an amount of topsoil it took 1,000 years to form.
So hogs we are.
For more depth about our habits, J.R. McNeill's Something New Under the Sun: An Environmental History of the Twentieth-Century World provides historical perspective.
Those numbers are so big they lack personal relevance.
So, on average one American (by the mid-1990s) consumed 121 pounds daily:
* coal - 19
* stone and cement - 27
* miscellaneous minerals - 17
* oil - 16
* farm products - 12
* wood - 11
* range grass - 10
* metals - 8
* natural gas - 1
Source: Stuff: The Secret Lives of Everyday Things (January 1997, Ryan and Durning, Northwest Environment Watch).
Energy efficiency is an important strategy to deal with energy demand.
That approach rests upon one of our nation's greatest strengths -- technological innovation.
Historically, except for war shortages, energy efficiency played no role in U.S. public policy until the 1970s. Fossil fuels were available in essentially unlimited quantities at very low costs.
Americans have become more aware of the realities of energy production and consumption.
Supplies of fuels are finite.
Energy production and use have adverse effects upon our environment.
Our population increases and our uses of energy increase.
Energy security is threatened by the supply of oil in politically insecure regions of the world.
Government has an important role to play here.
Whether federal or state, public policy for energy efficiency affects us all.
Unfortunately, energy efficiency policy too often gets bogged down in the murky pool of politics.
For example, energy efficiency regulations proposed by President Clinton for air conditioners could reduce the projected increase in demand for electricity by one-eighth through 2020.
The new Bush administration decreased the efficiency requirements shortly after taking office.
A 2000 study by the federal Department of Energy found that market-based, energy efficiency policies such as tax credits for fuel efficient vehicles could reduce the growth of energy demand by a third through 2010.
2. Availability meets increasing demand.
By 1973 American energy consumption was the highest ever [240 million BTUs per person in 1973] and then something outside of our control produced shell shock and compelled us to change our ways, albeit grudgingly.
Commencing with the 1974 OPEC international oil cartel embargo which led to much higher crude oil prices, electric utilities accelerated consumption of U.S. coal - because America is the Saudi Arabia of coal.
The conflict between our energy-hungry lifestyle and our energy-saving impulses remains. By 1994 per-person annual consumption of energy per American had climbed back to 213 million BTUs. Americans continue to use more energy per person than residents of any other country except Canada.
A footnote to the 1974 OPEC oil embargo is a once-obscure 1979 tax credit favoring production of synfuels (synthetic fuels). Synfuel production techniques, hidden by a veil of corporate secrecy, seem to be relatively simple -- a dash of diesel, whether oil or asphalt, sprayed on coal.
Based on energy content, a ton of synfuel containing 12,500 BTUs per pound, typical of southern West Virginia coal, produces a $26.37 tax credit, at least until the end of July 2007, when the credits expire unless extended by Congress. Some synfuel plants are relocating near existing power generating plants. More than forty-four plants exist nationally, with ten or more in West Virginia.
The potential for disrupting coal markets, including importing of western coal, is real.
Criticism of synfuel tax credits has led to examination of justification for the tax break.
Outside the United States, high-grade, low-sulfur, metallurgical coal is desired.
West Virginia exports more coal than does any other state, roughly one-quarter of its annual tonnage, constituting about one-half of America's coal exports overseas. The state ships coal to Canada, Brazil, Italy, United Kingdom, Japan, France, Netherlands, Romania, Belgium, Luxembourg, Turkey, and to other countries.
Both the U.S. Commerce Department and the National Mining Association predict that demand for coal will continue to increase as will its production.
3. Emissions standards stimulate demand.
For West Virginia there exists an environmental irony.
Low-sulfur coal, available in abundance in southern West Virginia, is in high demand by utilities which must meet emission standards set in the 1990 Clean Air Act.
The coal industry's response was to increase tonnage from 131 million tons in 1986 to 174 million tons in 1996 (the latter worth 4.4 billion dollars), 182 million tons in 1998 [the all-time record], and 168 million tons in 1999.
So a desirable environmental statute has stimulated demand for West Virginia coal, thereby increasing coal production which destroys more mountains and streams and generates more high-stack emissions which pollute the air.
Deregulation of electricity production, transmission, and sale.
Another wrinkle in the coal-to-electricity picture, spawned by falling wholesale electricity prices in the early 1990s, has been electric utility "deregulation."
Unregulated electricity markets in the 1880s left a bad taste in the mouths of American consumers. It was an unmitigated disaster, a short-term free-for-all followed by an avalanche of corporate consolidation.
George Westinghouse and J.P. Morgan (General Electric) were among the largest of the remaining monopolistic behemoths. After the 1893 depression the populist response was to form "munis" -- municipally owned utilities. War raged between munis and private utilities. Prices came down.
The mayor of Cleveland, Ohio, Tom Johnson, said then:
"If we don't control utilities, they will control us."
Deregulation is a misnomer. Regulation remains but in a different form.
Ergo, the 1996 California deregulation bill was as thick as a Los Angeles phone book.
Because truly unregulated markets bring surprises, e.g. prices go up or down, political considerations come into play.
Stability and predictability, desired by consumers, are incompatible with pure deregulation.
With pure deregulation, such as with airlines and trucks, dramatic changes occur. Utilities are ill-fitted for the cowboy, risk-it-all, boom-or-bust mentality of real deregulation.
Deregulation is about money:
who gets it and who pays it.
It is about philosophy of government, too. Who should set electricity prices:
an unbridled market or a public utility commission?
Is there a compromise position which produces stability and reduces costs to consumers?
The players in the deregulation game are:
* power generators, which produce and trade kilowatts at free-market prices;
* wires companies, which transmit power and maintain lines and poles [now a regulated monopoly]; and
* marketers, which buy power wholesale and compete to sell power to customers.
In the traditional regulated system, vertical integration prevails. A single, heavily regulated power company produces, transmits, and distributes electricity and recovers its costs and a set profit.
The drivers of electricity restructuring are technology and industry.
New ways of making electricity, such as combined-cycle natural gas generators, allow industrial users to produce their own power. One by one they are slipping off the power grid.
Understandably, the electric utility industry wants to use deregulation to gain access to other markets, primarily in the northeast, for excess capacity and new capacity.
The utilities want to accomplish three goals: give large customers locked-in, low rates to keep them on the grid; pay off their debts (stranded costs) for their old power plants; and sell the deal to the public by promising lower rates.
Your electric bill.
Roughly one-third of an electric bill is for electricity.
The other two-thirds are fixed costs such as transmission fees and stranded costs. Stranded costs are for past industry investments. As deregulation rules are crafted, electric utilities lobby for recouping stranded costs from customers who switch to other electric providers. With just about one-third of each electric bill dedicated to the product, savings for individual consumers may be an illusion.
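The arithmetic behind that illusion is straightforward; a minimal sketch, where the 15% discount is a hypothetical figure for illustration:

```python
# If only ~1/3 of the bill is the electricity itself, a competitor's
# discount on the energy charge shrinks when spread over the whole bill,
# because fixed charges (transmission fees, stranded costs) are unchanged.
def total_bill_savings(energy_share: float, energy_discount: float) -> float:
    """Fraction saved on the total bill when only the energy
    portion of the bill is discounted."""
    return energy_share * energy_discount

# Hypothetical: a rival supplier offers 15% off the energy charge.
print(round(total_bill_savings(1 / 3, 0.15), 4))  # 0.05 -> only 5% off the total bill
```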
Nationally, deregulation has caught on. Dire predictions of doom from old-line utilities have not panned out; instead, rising electricity prices have added value to existing generation plants. See the lengthy article in the April 30, 2000, business section of the New York Times. Some large utilities are merging, as American Electric Power (AEP) did in 2000.
And deregulated utilities are selling older generation facilities and branching out into new businesses.
Momentum for deregulation throughout the U.S. suffered a near-knockout blow with the 2001 California electricity shortage.
Signs of trouble existed in the summers of 1998 - 2000 as residential electricity prices increased.
By Winter 2001 rolling blackouts were common.
As a consequence, dramatic changes in California "deregulation" were made in 2001.
Because West Virginia is a small state with a small population, the bargaining power of its citizens is suspect once the electric utility industry is deregulated and the sale of electricity is purely market-based.
In 1998 the state Legislature gave the state Public Service Commission full authority to prepare a deregulation plan.
In March 2000 the West Virginia Senate, by voice vote, approved the House's deregulation resolution.
The bill was pushed hard by the PSC which regulates the electric utility industry.
The bill ignored dirty-burning "grandfathered" generators and "green energy" (environmentally friendly) production.
What will happen in the long run?
We'll have to wait and see.
Implementation has been delayed until 2002.
In the meantime, there will be review of statutory changes needed to avoid an expected major loss of power-utility-paid state and local taxes, and the PSC will engage in rule making.
The bill calls for a 13-year transition to a fully market-based system.
States surrounding West Virginia have some form of electricity deregulation:
Virginia, Maryland, Pennsylvania, and Ohio.
Alternative sources of energy.
The prospect of deregulation of electricity production has stimulated consideration of alternative fuels. The cost of producing electricity from natural gas (3 to 4 cents per kWh) and from windmills (4 to 6 cents per kWh) has dropped substantially in recent years.
Natural gas is the source for new quick-start power plants in numerous areas of West Virginia.
Most of these "peaker" plants are designed to handle peak-capacity situations, such as hot summer days. Nationally most new power plants are fueled by natural gas. While a cleaner fuel than coal, natural gas produces unwelcome nitrogen oxide.
The National Energy Technology Center in Morgantown is at the forefront of "clean coal technology."
Co-firing, using coal and natural gas, is a likely outcome in West Virginia, too. Other materials, including a pellet (RVS-1) designed in Morgantown, may be suitable for co-firing and could substantially reduce sulfur-caused pollution.
And the state's coalbed methane, which is a source of danger, is easily recoverable but must be mixed with higher-BTU natural gas to upgrade its generation quality.
Renewable sources of energy.
The sun and the wind offer immense potential as clean sources of electricity.
Passive use of solar energy in buildings is becoming more common than previously.
Solar photovoltaic cells likewise are finding wider use.
Windmills are another way to produce electricity.
They present a paradox to environmental advocates. Windmills are clean energy producers and they are impositions on pristine landscapes. Their multi-hundred-foot blades rest atop mountains and form lines of ungainly mechanical creatures.
In Tucker County, Backbone Mountain Wind Power's project will be the largest windmill project in the eastern United States (up to 70 windmills), extending miles. In 2001 the West Virginia Highlands Conservancy and the power company agreed on changes to the project and on post-construction monitoring.
In Preston County, MegaEnergy, Inc. plans to construct 10 windmills in conjunction with two small natural-gas fired generators.
The Legislature in 2001 created a tax break for wind power companies by classifying wind turbines and towers as pollution control facilities.
Something Under the Ice Is Moving
In May 2006, Helen Amanda Fricker was doing the kind of work that left her perfectly open to a distraction. A geophysicist at Scripps Institution of Oceanography, Fricker didn't doubt that her research was important, but she admits it was a little mundane.
In an effort to improve Antarctic ice shelf maps and models of Antarctic tides, she was mapping the continent's coastline by looking at small changes in elevation detected from satellite.
Finding the precise outline of Antarctica can be tricky because the continent is fringed with ice shelves—thick slabs of ice, fed by glaciers, floating on the ocean surface.
The shelves may hide the continent, but they do offer clues about where the land ends. The clue is that the floating part of an ice shelf moves up or down with ocean tides while the land-based part sits still.
The coast of Antarctica is fringed by ice shelves that are thousands of feet thick in places. Fed by glaciers, these massive slabs of ice float on the surface of the ocean.
Their elevation rises and falls with the tides. (©2005 BrynJ.)
Fortunately for Fricker, she didn't have to stand shivering on an Antarctic ice shelf with a handheld GPS receiver to pick up the clues. Subtle changes in ice-shelf elevation from rising and falling tides are visible to NASA's GLAS (Geoscience Laser Altimeter System) sensor on the agency's ICESat satellite.
The altimeter bounces a laser pulse off the Earth's surface and times how long the signal takes to come back.
Differences in the signal's return times for the same location indicate changes in elevation.
ICESat makes measurements for 33 days roughly every 4 months, collecting data over 70-meter-wide "footprints" every 175 meters along the satellite's ground track.
"So we got the same patch of real estate—or strip of real estate—surveyed every four months, and we could see how its elevation changed through time," Fricker explains.
In late May 2006, Fricker was concentrating on West Antarctica, around the Whillans and Mercer Ice Streams on the Ross Ice Shelf. She looked for the small elevation changes that would mark grounding lines, the place where the ice shelves stopped resting on land and started floating on the ocean.
Marking the grounding lines would improve tidal models, which would improve understanding of ice shelf behavior, which influences glaciers, which influence sea level.
Important work with a long-term payoff, but not terribly exciting.
Then she found something she didn't expect.
Fricker found an elevation change, but two things about it struck her as weird.
For one, it was in the wrong place—near a feature known as Engelhardt Ice Ridge—inland from where the ice shelf grounding line should have been.
For another, the elevation change was far bigger than the typical tidal movement of 1 or 2 meters (3 to 6.5 feet). Between October 2003 and November 2005, the area she was examining had dropped roughly 9 meters (nearly 30 feet). "I wasn't expecting to find this at all," Fricker recalls. "I was shocked."
Something under the ice had to be moving.
To figure out what was going on, Fricker needed a detailed, seamless map of Antarctica's icy surface.
Although precise, GLAS has a narrow view. A map of Antarctica made out of GLAS observations would look like the crisscrossing strings of a tennis racket.
For a comprehensive view, she turned to the Mosaic of Antarctica (MOA), a detailed digital image of the continent based on data from the Moderate Resolution Imaging Spectroradiometer (MODIS) sensors on NASA's Terra and Aqua satellites. "I plotted the place where I thought the elevation drop was occurring," she says, "and lo and behold, it was sort of a flat region that showed up in MOA."
The Ross Ice Shelf flows from land onto sea, encasing Roosevelt Island.
In places where the thick ice hides the transition from land to floating ice shelf, scientists can locate the coastline by measuring changes in the ice shelf's elevation.
The floating part of the shelf rises and falls with the tides while the land-based part sits still. (Image courtesy National Snow and Ice Data Center, Mosaic of Antarctica.)
Fricker contacted Ted Scambos at the National Snow and Ice Data Center, the scientist who had spearheaded the development of the Mosaic of Antarctica.
Was he interested in helping her pin down the source of the elevation change?
He was.
What It Costs Us
Underground coal miners work in the darkness, invisible to most of us, and when they die -- also in the darkness, from methane explosions or rock falls or any of the hundreds of other hazards they face every day -- their deaths usually merit just a few paragraphs in the local newspaper.
The attempted rescue of trapped coal miners, on the other hand, is often headline news. Networks love the real-time drama of the rescue efforts -- it's reality TV from the heartland, complete with anguished family members, heroic workers and dodgy mine owners. Sometimes, these stories have happy endings. In 2002, nine miners who were trapped in a coal mine in Quecreek, Pa., for 77 hours emerged as celebrities, feted by Oprah and photographed for Vanity Fair magazine.
But not every mine rescue turns out so well, as the Crandall Canyon mine disaster near Huntington, Utah, has reminded us over the past three weeks. When three rescuers were killed trying to dig out the six miners who've been trapped since Aug. 6, the story turned, as Gov. Jon Huntsman Jr. put it, "from a tragedy into a catastrophe."
In the coming months, tough questions will be asked about exactly what happened in the Crandall Canyon mine:
Did federal mine safety officials do everything they could to protect the miners?
Did Robert Murray, the co-owner of the mine, value profits over human life?
And why, at the beginning of the 21st century, when we can download real-time images from Mars onto our laptop computers, has no one figured out a way to track or communicate with coal miners underground?
"This is a defining moment for the history of mining," Huntsman said.
"We all expect to come out of this better and smarter and safer."
But if history is any guide, straightforward answers to what happened in Utah will be as rare as oxygen in the collapsed mine.
We can expect a hue and cry about mine safety on Capitol Hill, a lot of blame-shifting and finger-pointing and, most likely, some modest mine safety improvements. But you can bet that you won't hear much about the real issue, which is the high cost of the United States' dependence on coal, and whether it's worth the price we pay.
Many Americans think that coal went out with top hats and corsets. In fact, we burn more than a billion tons of coal each year in the United States -- about 20 pounds a day for every man, woman and child.
We don't burn it in coal stoves, of course, but in big power plants that generate about half the electric power in the country.
Politically, the war in Iraq has been a boon for coal, allowing coal-friendly politicians to tout America's 250-year supply as a substitute for our addiction to Middle Eastern oil -- even though, in the real world, there is no overlap between coal (used to generate electricity) and oil (used for transportation fuels, among other things). This is not to say that the coal industry would not dearly love to get into America's gas tank.
In recent months, it has pushed hard for subsidies and tax breaks that would accelerate the construction of coal-to-liquid plants, a technology developed by the Nazis during the 1930s that can transform coal into liquid fuels such as diesel (for technical reasons, it's very difficult to make gasoline from coal).
Coal boosters argue that today's industry is nothing like the industry of yore, and that many of the problems with the fuel -- like the fact that air pollution from power plants kills people -- have been solved by new technology.
Coal is cheap, plentiful and clean, they say.
What's not to like?
Mine disasters such as the one in Utah, however, don't exactly fit this script.
It's tough to argue that you've left the 19th century behind when you have Murray -- one of the most prominent coal barons in the United States, well known for his political connections and influence -- insisting that the collapse was caused by an earthquake, directly contradicting seismologists who say that their instruments clearly show that the seismic activity was the result of the collapse in the mine.
It may not surprise you that Murray also believes global warming is a hoax.
Claims about a 250-year supply of coal won't stand up to scrutiny for long, either. Yes, the United States has more coal than any other nation.
But we've been mining coal in this country for 150 years -- all the simple, high-quality, easy-to-get stuff is gone.
What's left is buried beneath towns and national parks, or places that are difficult, expensive and dangerous to mine.
The blunt truth is, if we're going to become more dependent on coal, more miners will die.
How many mining tragedies will we accept in the name of "cheap" electricity?
Digging up hard-to-get coal will also devastate Appalachia, where huge mountaintop-removal mines have already buried 700 miles of streams and 400,000 acres of forests. (Mountaintop removal is a particularly destructive form of mining in which entire mountains are blasted apart to expose the coal seams inside; the rubble is typically dumped in nearby valleys.) Instead of strengthening oversight of this type of mining, the Bush administration proposed last week to loosen regulations and allow it to expand.
One recent study estimated that if this practice continues, within 40 years the region disemboweled by mining will be approximately the size of Rhode Island.
As for "clean coal," it's a nice advertising slogan, but it's not a statement of fact.
According to Americans for Balanced Energy Choices, a nonprofit group funded by coal companies and coal-burning electric utilities, emissions of conventional pollutants from coal plants fell by one-third between 1970 and 2000, even as the use of coal to generate electricity tripled.
What they don't tell you is that a) the industry fought the laws that mandated many of those reductions; and b) the amount of pollution spewed out by a coal plant is still enormous.
According to the Union of Concerned Scientists, a scientific advocacy group, annual emissions from a typical coal plant include 10,000 tons of sulfur dioxide, the major cause of acid rain; 10,200 tons of nitrogen oxide, a major contributor to smog; 500 tons of small particles, which cause lung damage and other respiratory problems; 225 pounds of arsenic; 114 pounds of lead; and many other toxic heavy metals, including 170 pounds of mercury, which can cause birth defects, brain damage and other ailments.
But the big issue is global warming.
Burning coal accounts for more than one-third of U.S. emissions of carbon dioxide, the main greenhouse gas. In a single year, a big coal plant emits as much carbon dioxide as 1 million SUVs. Coal plants that are built today emit just as much CO2 as those that were built 50 years ago (there have been some marginal gains in efficiency, but not many). In the future, carbon dioxide might be captured from coal plants and pumped underground into abandoned oil wells or deep saline aquifers, but at the moment, these solutions are unproven and expensive.
The coal industry is soaking up billions of dollars in tax breaks and subsidies to develop technology and study the problem.
But according to climate scientists such as NASA's James Hansen, if we hope to have a chance of avoiding dangerous changes to Earth's climate, we don't have time to wait.
That's why Hansen, along with former vice president Al Gore and others, has called for a moratorium on new coal plants that do not capture and store carbon dioxide pollution.
And that's why Silicon Valley entrepreneurs are investing hundreds of millions of dollars into clean-energy technology -- because they know that confronting the problem of global warming is not just the biggest challenge that civilization has ever faced, but also the mother of all economic opportunities.
It may seem like a long way from the melting Arctic to the mine disaster in Utah, but it's not.
The lesson from Crandall Canyon is not just that we need stronger mine safety laws and better federal oversight of dangerous mines, but that as Americans, we need to be more conscious of the costs and consequences of what goes on behind the light switch.
Otherwise, instead of coming out of this disaster smarter, stronger and safer, we're likely to find ourselves repeating this story again and again.
Aruba's New Windfarm
As Copenhagen ended, unsurprisingly, in confusion, I have the opportunity to tell you a more positive tale, and to show that it is possible for people - even bankers among them - to work towards a more sustainable future without necessarily endangering our way of life.
The Vader Piet 30MW wind farm on the island of Aruba.
In this case, it involves the construction of a windfarm in a place where it will directly replace fuel-oil-burning power plants. As you'll be able to see below, this wind farm is remarkable in a number of ways, which means the experience will not be easy to replicate everywhere, but it shows that there are many places and energy systems that renewable energy can materially improve by almost every criterion.
(part of the wind power series)
Full disclosure:
As indicated below, I financed the project discussed in this post last year.
Amongst notable features, one can find:
* at around 60%, it has one of the highest capacity factors in the world, with 50% more power output per turbine than European offshore windfarms. Located on the eastern coast of the island, it is exposed directly to the trade winds, which are highly regular and almost always blow from the same direction (allowing the turbines to be placed very close to one another); their near-constant strength also means that wear and tear is likely to be lower than usual, as there are very few abrupt changes in regime and torque;
it is a windy place...
* it is now providing 20% of the island's overall electricity needs, replacing dirty and expensive fuel-oil in the process. At night, it produces up to 60% of demand.
And thanks to the highly regular wind regime, this production is very stable and predictable (even though they pushed for this project to happen, the local power company was quite shocked to see 'for real' how big a portion of their system the windfarm had suddenly become - as is still frequent, utilities have trouble taking wind seriously, but in this case the reality was quite compelling);
* the utility will save money on fuel imports and, more importantly, will actually end up with cheaper power:
it buys the electricity from the wind farm at a fixed price over 15 years which is roughly equivalent to what it costs to produce electricity from their traditional oil-fired generators with oil prices at $45/bbl.
Who wants to bet on oil being consistently under $45 for the next 15 years?
In fact, the prime minister of the island, who was present at the inauguration, used the opportunity of that ceremony to announce lower power prices for the poorest households on the island...
* the windfarm is situated in a very isolated part of the island, invisible to almost everybody - but it adds to that area's spectacular vistas.
* and the reality is that the windfarm has received an enthusiastic welcome from the population of the island - the project team told me how people lined the road, clapping, as they transported the machinery to the site (not a trivial task, as the videos below show):
turbine unloading
* and, finally (and this is where I come in), the windfarm was financed at the height of the financial crisis last year. I told the story in a blog post then (How to keep on financing wind farms when banks have no money left), but it's worth underlining here that one of the most dangerous consequences of the crash is that traditional banking - lending to the economy - has been, and still is, directly impacted and curtailed, as the result of a lack of liquidity and heightened risk aversion by banks (which are just as stupid and gregarious in systematically cutting off credit as they were enthusiastic at shoving it onto clients before). So it was an especially proud moment for me to see this project completed, because we really made a difference at the time, saving the wind farm from a potentially damaging delay, and saving very real economic activity on the island and amongst the suppliers (which are mainly European).
erection works - same source.
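The 60% capacity factor is worth putting in energy terms. A back-of-the-envelope sketch (the ~40% European offshore figure is inferred from the "50% more output per turbine" claim, not stated directly):

```python
# Rough annual-output sketch for a 30 MW wind farm at a ~60% capacity factor.
RATED_MW = 30.0
CAPACITY_FACTOR = 0.60       # ~60%, per the post
HOURS_PER_YEAR = 8760

annual_mwh = RATED_MW * CAPACITY_FACTOR * HOURS_PER_YEAR
print(f"Annual output: {annual_mwh:,.0f} MWh")   # ~157,680 MWh

# Per-turbine comparison against an assumed ~40% offshore capacity factor:
offshore_mwh = RATED_MW * 0.40 * HOURS_PER_YEAR
print(f"Output advantage: {annual_mwh / offshore_mwh:.2f}x")  # 1.50x, i.e. 50% more
```

Every megawatt-hour of that output displaces fuel-oil generation on the island, which is what makes the fixed-price power contract viable.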
Wind is a capital-intensive but low-risk activity where simple and stable financing structures are both necessary and useful - construction costs need to be spread out over a number of years for power generation costs to make sense.
Technical and operational risks are understood and very small if you have a competent project company, and the revenue profile is highly predictable, thus making it possible for lenders to provide a large part of the initial cost at a fixed price without requiring any benefit sharing, making this cheaper than equity and keeping the ultimate power price down.
This, called "project finance," is the boring kind of banking that makes the economy run but is sadly seen as unsexy or useless whenever new funny products are invented in the capital markets and create opportunities for bonus-generating bubbles...
I've already been set aside as dreary three times in the past 15 years:
emerging market bonds were all the rage in the mid-90s (until the Asian crisis), then the dot-coms were 'it' (until the crash), then the grand multi-product bubble of the past decade, with its mortgage-backed securities, collateralised loan obligations, credit default swaps and the rest.
And having been bailed out, they're at it again, while project finance is still suffering - and wind or solar projects get built more slowly than they could as a result.
Technology will save us...or not
Technology will save us--it's the mantra heard around the world when it comes to climate change, fossil fuel depletion, and myriad other environmental and resource challenges. But, that mantra rarely comes with the proviso that technology often has unintended and even perverse consequences.
"Yes, yes," you will say, "we know that."
Then, why, may I ask, is this almost never mentioned in the same speeches, op-ed pieces, and journal articles that tout the efficacy of one or another technology to definitively solve or at least help solve critical environmental and resource problems?
It is because these pronouncements are polemics, or more properly, sermons meant to instruct us in the supposed invincibility of our technology.
Let us take just one example of a technology that is so ubiquitous that people rarely even think about a world without it:
the automobile.
The automobile was probably the signature technology of the 20th century, one that shaped culture and in turn shaped so many other technologies that serve our automobile-based culture.
If humans had understood ahead of time that automobiles would result directly or indirectly in the following, would society have chosen to allow their widespread use?
* Climate change
* Health problems and mortality due to air pollution
* Pollution of groundwater from decrepit gasoline storage tanks
* Urban sprawl
* Hollowing out of many American cities
* Massive traffic jams
* Dependence on unreliable foreign sources of oil
* Serial military conflicts involving access to and control over oil
* Paving and development of prime farmland and forest
* Mass death and disability due to accidents on the world's highways
* An obesity epidemic related to loss of walkable living environments
* Massive public expenditures for roads, parking and other purposes related to automobiles (to the exclusion of other priorities)
I have not tried to be exhaustive.
But, I think this list outlines just how deleterious the automobile has been not only environmentally, but also socially and economically.
Now, of course, it would have taken exceptional clairvoyance to have foreseen all the perverse consequences I list.
But that is just the point!
What allows those who are so confident about the salutary outcomes for their favorite technological fixes to pretend that there will be no perverse and even fatal consequences related to them?
How can the technological optimists be sure that their solutions will not lead either to the opposite of what they intend or to other problems perhaps even more intractable than the ones they purport to solve?
Perhaps the most readily obvious example is the notion that energy efficiency will result in a reduction of energy use.
But the Jevons Paradox tells us that just the opposite can happen: efficiency makes energy services effectively cheaper, and energy is thus subject to greater demand as more people take advantage of the lower prices.
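A toy constant-elasticity demand model makes the rebound mechanism concrete. The numbers here are illustrative assumptions, not empirical estimates:

```python
def energy_use_after_efficiency_gain(base_energy, efficiency_gain, elasticity):
    """Toy rebound-effect model: an efficiency gain lowers the effective
    price of the energy service, and service demand responds according to
    a constant price elasticity. Returns the new total energy use."""
    effective_price_ratio = 1.0 - efficiency_gain        # service now costs less
    service_ratio = effective_price_ratio ** elasticity  # demand response
    return base_energy * effective_price_ratio * service_ratio

# A 20% efficiency gain with modest demand response: some savings survive,
# but less than the naive 20% (about 89.4 units instead of 80).
modest = energy_use_after_efficiency_gain(100.0, 0.20, -0.5)

# With highly elastic demand, total energy use actually rises ("backfire"),
# which is the Jevons outcome (about 111.8 units).
backfire = energy_use_after_efficiency_gain(100.0, 0.20, -1.5)
```

The point of the sketch is simply that the sign of the outcome depends on a behavioral parameter the technological optimists rarely discuss.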
The technological optimists seem to be unaware of how complex the energy, climate, forest, and other systems with which they propose to tinker are.
They do not know the ecological dictum that you can never do just one thing.
Each action ramifies outward into any complex system resulting in multiple unforeseen and often unwelcome effects.
But, all of technological optimism can be summed up in one desire:
The desire not to have to change any of our current behaviors. And, yet it is our behavior that most of all needs changing.
To be sure, even changes in our behavior can have unforeseen and sometimes perverse consequences. But I would venture that these unforeseen consequences would be far less troublesome than those related to new technologies provided that any changes in behavior are guided by two principles:
1) To increase the long-term resilience of human society and 2) to increase the margin of safety in the way in which we exploit the environment.
For example, we could decide that the target for carbon dioxide in the atmosphere should be 350 ppm--below where it is today--as some advocate, instead of taking a chance that a much higher reading could put us past the tipping point that will lead to runaway global warming.
It is hubris to believe we can easily and precisely calculate the limits of extraction and pollution and then move right up to our artificially calculated limits and still achieve sustainability.
Instead, we should take a path of much greater humility that acknowledges that we must build greater resilience and wider margins of safety into our physical infrastructure and our everyday practices.
Ultimately, I believe, we will be forced to live in a much lower energy society.
And, that means that the set of behaviors that need to change most will be those that currently lead to overconsumption.
Reserves are bunk
Henry Ford is famous for having once said, "History is more or less bunk."
He was, in fact, attacking tradition in an age of rapid technological and social change.
Almost a century later we have a less ambitious observation which may not achieve the broad visceral appeal of Ford's statement, but one which may turn out to have a good deal of importance, to wit:
Oil and natural gas reserve numbers are more or less bunk.
Let me introduce you to B. J. Doyle, vice president of operations for a small Houston-based oil and natural gas exploration company.
Doyle's views on the oil and gas business have been on display for more than a year now at The Oil Drum, a site famous for its technical prowess and breadth of coverage when it comes to energy-related issues. On the site Doyle goes by the moniker Rockman, and through his frequent comments he has been trying to educate readers about the realities of the oil and gas business.
Now, he didn't actually say that oil and natural gas reserve numbers are more or less bunk.
Nevertheless, that is a fair summary of what he told me when I spoke with him recently.
To understand why an insider would cast aspersions on this sacred metric of the oil and gas industry, you need to know two things. First, Doyle doesn't have to please shareholders. The company he works for is privately held.
Second, reserve numbers are meaningless unless they are indexed to a price.
Doyle began his explanation with a seemingly astounding statement:
"One of the things we're least interested in is the amount of oil and gas that we are going to produce."
How can this possibly be true?
It turns out that the oil and gas industry uses a method common to nearly every modern business enterprise to evaluate its investments, namely, net present value analysis or NPV.
The concept is actually simple.
If you have the choice of receiving $1,000 now or $1,000 three years from now, naturally you'd take the $1,000 today.
That's because of what is called the time value of money.
If you can invest the $1,000 today, say, in a bank CD, you can at least earn some interest in the next three years. Also, if you were foolish enough to wait for your money, inflation might undermine the purchasing power of that $1,000.
The inflation calculator at the U. S. Bureau of Labor Statistics shows that it would take $1,072 in 2009 to equal the purchasing power of $1,000 received in 2006.
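That calculation can be sketched directly. The 2.35 percent annual inflation rate below is an assumption chosen only to roughly match the BLS figure, not a published rate:

```python
def future_dollars_needed(amount, annual_inflation, years):
    # Dollars needed in the future to match today's purchasing power,
    # compounding inflation annually.
    return amount * (1.0 + annual_inflation) ** years

# Assuming ~2.35% annual inflation over 2006-2009:
needed = future_dollars_needed(1000.0, 0.0235, 3)  # ~$1,072
```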
Every business knows that there are several ways in which it can invest its capital.
So, business owners take the amount of the initial investment in, say, a new factory or a new oil well, and subtract that amount from the present value of what they forecast will be the future cash flows from that investment.
If the amount is positive, then the project will be profitable and should be considered.
If the amount is negative, the project should be abandoned.
Of course, there are many factors when considering an investment, but a project that appears to be unprofitable will certainly not be considered.
Net present value analysis, however, doesn't describe the real world perfectly.
This flows from the obvious truth that no one can actually know the future.
One has to estimate the expected future cash flows. This is no easy feat when dealing with the uncertainties of yet-to-be drilled underground reservoirs, the challenges of operating producing wells, and the vagaries of the oil and natural gas markets. Then, one needs to apply a so-called discount rate.
This process assumes that future cash flows received years down the road must be "discounted" to reflect the time value of money as described above.
Doyle explains that discount rates applied in the oil and gas industry often range from 10 to 15 percent.
He admits it's an arbitrary number, but it's arbitrary in every industry, except insofar as it reflects the presumed risks involved in the venture and the cost of capital (such as interest on loans).
When you work out what this implies for cash flow generated from a well several years after production begins, it becomes clear why the ultimate amount of oil and gas recovered from a well has little relevance to the decision to drill it.
Let's do an example to see why.
If you invest $3 million to drill a well (not an unusual amount these days) and expect to get cash flow of $1 million per year from the well for 10 years, on the face of it that sounds as if you are reaping more than three times your investment.
But when you discount the cash flows appropriately, for example at 12 percent, you get an NPV of $5,650,223.
That's $2,650,223 more than you are investing, so it's still a positive number even after discounting.
And, it's 1.88 times the initial investment, a ratio that will become meaningful below.
But the present values of the $1,000,000 in yearly cash flow in years 8, 9 and 10 are as follows:
$403,883; $360,610; and $321,973.
If the well keeps producing in year 20, the present value of the cash flow in that year falls to just $103,667.
If it is a very long-lived well, the present value of the cash flow from year 40 is negligible, $10,747.
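The discounting itself is a few lines of arithmetic. A minimal sketch at the 12 percent rate used in this example, assuming year-end cash flows:

```python
def present_value(cash_flow, rate, year):
    # Present value of a single cash flow received at the end of `year`.
    return cash_flow / (1.0 + rate) ** year

def npv_of_annuity(cash_flow, rate, years):
    # Present value of a constant year-end cash flow received for `years` years.
    return sum(present_value(cash_flow, rate, t) for t in range(1, years + 1))

RATE = 0.12
NPV_10YR = npv_of_annuity(1_000_000, RATE, 10)   # ~$5,650,223
YEAR_8 = present_value(1_000_000, RATE, 8)       # ~$403,883
YEAR_20 = present_value(1_000_000, RATE, 20)     # ~$103,667
```

The steep decay of the late-year values is exactly why, as Doyle says, the ultimate amount of oil and gas recovered from a well matters far less than when the cash arrives.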
As it turns out, few companies would even bother drilling such a prospect.
Doyle says that right now his company won't even look at a prospect unless, based on seismic data and other information, it reasonably expects that the completed well will produce an NPV of six or more times the initial investment.
When there is keen competition for prospects, companies will drop their expectations to an NPV of three to four times the initial investment.
This is where things get interesting.
Doyle has seen some public companies drop their goal down to one.
That's right.
They will drill prospects that they believe have no reasonable chance of doing anything other than breaking even.
Why will they do this?
To boost stated reserves, a number by which Wall Street judges the value of oil and gas companies. They won't, however, make any true profit on these wells. But they will become what Wall Street calls an "asset play."
They will be valued on their assets, in this case stated reserves, rather than on their profitability.
This strategy has proven especially tempting to those engaged in the hunt for shale gas since drilling success rates are very high.
This is a risky strategy, however, that leaves little margin for error. Prices lower than those forecast by such an analysis could quickly bankrupt a company that drills too many wells based on an assumed one-to-one ratio of investment to net present value.
The claim that the United States has 100 years of recoverable natural gas as a result of the newly accessible shale basins has no meaning without attaching a price to it, Doyle contends. The fact that major shale gas producers have trimmed their active drilling fleets to a fraction of what they were during the 2008 boom in natural gas prices proves that price is a critical factor in determining whether to drill.
And, where there is no drilling, there are no additions to reserves. The natural gas market has shown itself to be highly volatile, which has, not surprisingly, led to wide swings in natural gas drilling.
The notion that somehow there will be a consistent accretion of natural gas reserves from year to year or that all discoveries from previous years will still be considered reserves in a low-price environment is pure bunkum.
The same logic applies to oil discoveries. But these days no one is claiming the United States has enough oil left to supply the entire country for 100 years. And, so hype about oil reserves is less of an issue.
The upshot is that expected cash flow determines what areas will be drilled, not the size of potential reserves. Most companies won't drill a prospect unless they believe they can get their money back within two to three years, Doyle says. If it takes four or five years, the prospect is not very attractive.
Cash flow is king.
It turns out that the NPV of the first three years of cash flow from my hypothetical well mentioned above is $2,401,831, still well short of the $3 million initial investment.
Most companies would or should pass on such a prospect, and it would therefore never become part of anyone's reserves, he explains. Part of the hype over shale gas has to do with the claim that the wells may be very long-lived, he adds. Even if that turns out to be true--not a certainty as of now--the low flow rates expected after the initial burst of production and the distant payoffs would actually work against any decision to drill such wells. No wells, no reserves.
Doyle says that given modern technology, oil and natural gas are easier to find than ever before.
But he doesn't believe that in North America at least, there is that much more to find.
He thinks that shale gas in North America may indeed prove to be plentiful.
But it will not be both plentiful and cheap.
And, of course, if we succeed at expanding natural gas production to meet the needs of a new natural gas-powered vehicle fleet--an idea advocated by one of the leading producers of shale gas--and expand other current uses such as the generation of electricity, we can expect that natural gas prices will soar. That may provide the necessary incentive (i.e. cash flow) to extract the shale gas that lies below the American landscape.
But it will also certainly mean that the 100 years of supply that has been so frequently touted in the media will rapidly shrink to perhaps 30 or 40, and that the peak in production will come much sooner.
A peak in natural gas production in, say, 20 years would not exactly be a useful talking point for those advocating the wholesale conversion of key parts of the U. S. economy to run on natural gas. Just as we would be finishing such a conversion, we could find ourselves on the downslope of the natural gas production curve and faced with the urgent need to adapt our costly and newly completed natural gas infrastructure to run on some other energy source.
Dairy industry milks innovations to cut greenhouse gases 25 per cent
US Department of Agriculture and dairy industry sign landmark partnership to slash emissions and accelerate roll-out of anaerobic digesters
GreenBiz Staff, BusinessGreen, 29 Dec 2009
The US Department of Agriculture and the Innovation Center for US Dairy have agreed to work jointly in support of the dairy industry's goal to reduce greenhouse gas emissions by 25 per cent over the next decade.
The landmark memorandum of understanding identified a variety of projects that can help the dairy industry achieve those greenhouse gas reduction goals and increase its financial and environmental sustainability.
"This historic agreement, the first of its kind, will help us achieve the ambitious goal of drastically reducing greenhouse gas emissions while benefiting dairy farmers," said US Agriculture Secretary Tom Vilsack.
"Use of manure to electricity technology is a win for everyone.
It provides an untapped source of income for farmers, it provides a source of renewable electricity, reduces our dependence on foreign fossil fuels, and provides a wealth of additional environmental benefits."
Under the agreement, USDA will take a number of steps to help farmers, including supporting a strategic research plan to help the industry further reduce environmental impacts. Other initiatives would help the industry develop future technologies, advance nutrient management, support renewable energy, and improve energy efficiency.
Potential outcomes of the MOU include accelerating opportunities to adopt livestock manure processing systems that capture methane gas from livestock manure and convert it into electricity, coordinating research information on life cycle assessments, and supporting the industry's efforts in energy audits, feed management and energy conservation.
The Innovation Center is nearing completion of the first-ever life-cycle assessment of fluid milk from farm to table.
Initial estimates by the Applied Sustainability Center at the University of Arkansas show that the entire dairy supply chain, from cattle feed ingredients through packaging and transportation to the consumer's table, accounts for less than two per cent of US greenhouse gas emissions.
"The dairy industry's on-going efforts to improve milk production efficiency over the past six decades have already reduced greenhouse gas emissions at the farm level by more than 60 per cent," said Indiana dairy producer Mike McCloskey, chairman of the Innovation Center's Sustainability Committee.
"To feed a growing world we must continue to develop new ideas, innovations and best practices to preserve natural resources and secure a healthy future for the next generation."
The agreement may also help accelerate adoption of methane gas digesters for all sizes of dairy farms, making it easier to connect digesters to electricity grids and help digester operators capture potential carbon offset payments. Additional support from the USDA could include research on how feed mixtures affect methane emissions from cows. Opportunities to reduce so-called enteric emissions have been identified by dairy stakeholders in the Innovation Center's industry-wide plan to cut greenhouse gas emissions.
Here Comes China's $3B, ‘Golden Sun' Projects
China's finance ministry has selected hundreds of projects totaling nearly $3 billion in costs for its subsidy plan to dramatically boost the country's solar energy production.
China has selected hundreds of projects for its Golden Sun initiative to subsidize solar power plant installations across the country.
The Ministry of Finance said it has selected 294 projects totaling 642 megawatts. The ministry estimated that the projects would require a total construction costs of roughly RMB 20 billion ($2.93 billion). The government previously said it would subsidize half of a project's installation and related transmission costs.
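The ministry's figures imply a rough installed cost per watt. Simple arithmetic, treating the RMB 20 billion estimate, the 642 MW total, and the announced 50 percent subsidy share as given:

```python
TOTAL_COST_RMB = 20e9    # ministry's estimated construction cost
TOTAL_WATTS = 642e6      # 642 MW of selected projects
SUBSIDY_SHARE = 0.5      # government pays roughly half of costs

cost_per_watt = TOTAL_COST_RMB / TOTAL_WATTS       # ~RMB 31.2 (~$4.56) per watt
subsidy_per_watt = cost_per_watt * SUBSIDY_SHARE   # ~RMB 15.6 per watt
```

Note that the implied installed cost per watt is well above the panel price ceilings quoted below, since it also covers inverters, mounting, labor, and transmission.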
Although the ministry first announced it had selected the projects last Friday, it wasn't until today that it posted on its website the project guidelines and provided the list of developers and the size and location of each project.
The list, which can be accessed via a link on this webpage, contains 275 projects, not 294, however (two projects involve building 10 systems each, so that might help to account for the difference). There also is a link to a Word document that spells out some technical guidelines, including pricing for crystalline and thin-film solar equipment.
The ministry first announced the Golden Sun initiative in July this year, characterizing it as a demonstration program to promote renewable energy generation and create a domestic market for its solar cell and panel manufacturers (see Chinese Gov't Will Pay to Install 500MW of Solar). These manufacturers, such as Suntech Power, Yingli Green Energy, JA Solar and Trina Solar, export most of their goods to Europe and North America.
In that July announcement, the government said it would subsidize no less than 500 megawatts of installations, which the projects would need to spread across the country and benefit both residents and businesses that are on and off the grid.
The ministry said it expects all of these projects to be completed within three years.
It also has spelled out some technology and pricing requirements for these projects. Prices to be paid for crystalline silicon panels should be no higher than RMB 14 ($2) per watt.
The ceiling for amorphous silicon panels is RMB 9 ($1.32) per watt.
Monocrystalline silicon panels must have at least 15 percent efficiency.
The minimum efficiency is 14 percent for multicrystalline panels and 6 percent for amorphous silicon thin films.
The government also wants guaranteed energy output of each solar energy system at the two-year, 10-year and 25-year mark.
Developers could use other types of thin films, concentrating photovoltaics and other solar technologies, but they must demonstrate successful prior deployments of those technologies.
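These thresholds can be read as a simple eligibility rule. The sketch below encodes the price ceilings and efficiency floors quoted above; the function and field names are hypothetical illustrations of the stated figures, not the ministry's actual guidelines.

```python
# Illustrative eligibility check built from the Golden Sun figures quoted above.
# The thresholds come from the article; everything else is an assumption.

PRICE_CEILING_RMB_PER_WATT = {"crystalline": 14.0, "amorphous": 9.0}
MIN_EFFICIENCY = {"monocrystalline": 0.15, "multicrystalline": 0.14, "amorphous": 0.06}

def panel_meets_guidelines(tech: str, price_rmb_per_watt: float, efficiency: float) -> bool:
    """Return True if a panel satisfies both the price ceiling and efficiency floor."""
    price_class = "crystalline" if tech in ("monocrystalline", "multicrystalline") else "amorphous"
    if price_rmb_per_watt > PRICE_CEILING_RMB_PER_WATT[price_class]:
        return False
    return efficiency >= MIN_EFFICIENCY[tech]

# A multicrystalline panel at RMB 13/W with 14.5% efficiency qualifies;
# a monocrystalline panel priced above the RMB 14/W ceiling does not.
print(panel_meets_guidelines("multicrystalline", 13.0, 0.145))  # True
print(panel_meets_guidelines("monocrystalline", 15.0, 0.16))    # False
```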
The majority of the projects would be installed at industrial and commercial operations, where the solar electricity would be used onsite.
The ministry described several of those as building-integrated photovoltaic projects. Another eighteen projects could be off-grid installations.
The remaining 35 projects would be large power plants that would feed the electricity to the grid.
The government also has discussed creating a feed-in tariff, which would allow solar power plant operators to sell electricity at government-set, premium prices.
Rainforest conservation:
a year in review
2009 may prove to be an important turning point for tropical forests.
Led by Brazil, which had the lowest extent of deforestation since at least the 1980s, global forest loss likely declined to its lowest level in more than a decade.
Critical to the fall in deforestation was the global financial crisis, which dried up credit for forest-destroying activities and contributed to a crash in commodity prices, an underlying driver of deforestation.
2009 was also notable for progress made on REDD, a proposed climate change mitigation mechanism that would pay tropical countries for protecting their forests. Over the course of the year, world business and political leaders, prominent scientists and conservation groups, celebrities, and other noted figures voiced support for the concept.
Momentum carried into climate talks in Copenhagen, where REDD was one of the only areas to see gains. Concerns over REDD now revolve mostly around the details (implementation, financing, governance, and equity) of the mechanism, rather than the underlying idea of compensating rainforest conservation as a means to reduce greenhouse gas emissions.
Drivers of Deforestation
Forest clearing in Mato Grosso. Photo by Rhett A. Butler.
2009 saw major developments reflecting the implications of the shift from poverty-driven deforestation to enterprise-driven deforestation, a trend that continues to accelerate with urbanization and abandonment of government-sponsored colonization projects. While corporations and large landowners have ever-increasing capacity to deforest, the recent shift seems to offer new opportunities for rainforest conservation in that it is easier for pressure groups to target corporations and enterprises rather than tens of millions of poor farmers who are simply trying to put food on the table for their families. Accordingly, major industrial drivers of deforestation — the palm oil, cattle ranching, and logging industries — were significantly affected by activist campaigns in 2009.
In Brazil, the cattle industry was walloped by a Greenpeace report that linked some of the world's most prominent brands — Nike, Toyota, Prada, and others — to destruction of the Amazon rainforest.
The fallout from the report was immediate.
Some of the world's largest beef and leather buyers suspended contracts with suppliers associated with Amazon forest clearing.
The Brazilian government announced a crackdown and fines, raided the offices of powerful cattle companies, and called for a review of loan programs. Government ministers joined the private sector in demanding new chain-of-custody controls for suppliers to ensure that cattle products were not contributing to deforestation.
The largest cattle producers and traders soon responded with a moratorium on Amazon deforestation and a promise to implement improved supply-chain tracking mechanisms. The Brazilian cattle industry may now be on the cusp of transitioning from being the world's largest single driver of deforestation to a critical component in helping slow climate change.
Since the 1990s deforestation has become increasingly concentrated.
Recently published research by Matt Hansen of South Dakota State University suggests an even more dramatic shift in recent years. His work, which is based on high resolution satellite imagery, shows that Brazil and Indonesia accounted for 61 percent of tropical deforestation between 2000 and 2005, rather than the 43 percent reported by the U.N. Food and Agriculture Organization (FAO).
In Southeast Asia, the palm oil industry was stung by the decision by Unilever, the world's largest buyer of palm oil, to suspend its contract with Sinar Mas, the world's second largest producer of palm oil, after an investigation commissioned by the consumer products giant found allegations made by Greenpeace about the palm oil producer's environmental record to be true.
The investigation's findings were a setback for the Roundtable on Sustainable Palm Oil (RSPO), of which Sinar Mas was a member (although its operations had not yet been certified as environmentally responsible). Several other prominent palm oil buyers, including Cadbury-New Zealand and Lush Cosmetics, announced they would stop using palm oil in their products following consumer concerns about deforestation.
Meanwhile an internal audit by the World Bank's International Finance Corporation (IFC) concluded it had violated its own environmental rules in lending to palm oil companies. Two palm oil companies announced they would forgo development of concessions in carbon-rich areas, instead opting to preserve forests on the land for carbon payments.
In the forestry sector, several large firms severed ties with timber and pulp and paper companies linked by reports from NGOs to questionable logging practices in Indonesia.
The United States and Europe stepped up enforcement of laws and regulations (the Lacey Act in the U.S. and FLEGT in Europe) meant to hold importers to environmental laws in timber-producing countries, while a Brazilian federal prosecutor launched an investigation into charges that illegal timber from the state of Pará is being laundered as "eco-certified" wood and exported to markets in the United States, Europe, and Asia.
China's Ministry of Environmental Protection drafted regulations that would require Chinese companies operating abroad to comply with the environmental laws of both China and the host country, although it is unclear whether these will actually be enforced.
More Good News
2009 was marked by a number of other hopeful developments for tropical forests. Brazil, Peru, and the Democratic Republic of Congo established massive new rainforest parks, while Papua New Guinea created its first nature reserve.
Norway continued to lead industrialized countries in funding rainforest conservation, contributing a quarter of a billion dollars to Guyana and reiterating its billion dollar pledge to Brazil.
The United States, Japan, Australia, France, and the United Kingdom also made large financial commitments towards tropical forests.
Indigenous rights in Brazil got a boost after a court victory in a dispute with farmers in Roraima and a legal opinion finding that the Surui tribe owns the carbon rights to the land it inhabits, perhaps paving the way for future indigenous-run forest carbon projects. The Surui also unveiled a partnership with Google to develop tools that will enable the tribe to better monitor its territory for encroachment by loggers, miners, and ranchers. Working with leading scientific institutions and NGOs, Google announced the Earth Engine platform, a system that combines its computing power with advanced monitoring and analysis technologies. The platform promises to enable near real-time monitoring of the world's forests and carbon at high resolution at selected sites before 2011.
Meanwhile the Woods Hole Research Center reported progress on a high resolution global forest map for tracking land cover change.
And Some Bad News
Deforested area and healthy forest in Borneo.
But the good news for tropical forests was tempered by other developments: Indonesia announced its intention to open up more than 2 million hectares of carbon-dense peatlands to oil palm development; the collapse of law enforcement in Madagascar contributed to an explosion of commercial timber (and lemur) harvesting in that country's spectacular rainforest parks; the RSPO meeting broke down over efforts to reduce greenhouse gas emissions from palm oil production; violent conflict erupted in Peru between government security forces and indigenous groups over land rights and resource extraction; foreign interests acquired massive tracts of land in the Congo Basin; dodgy REDD dealings surfaced in Indonesia and Papua New Guinea; and oil palm agriculture expanded on a large scale in the Amazon.
Brazil moved to grant amnesty to farmers and ranchers who illegally occupy or have illicitly cleared Amazon forest lands, a decision that some say legitimizes past deforestation (others maintain it is a critical step towards improved governance in the region).
Looking forward
While 2009's developments are likely to have long-term implications for conservation, the fate of tropical forests is far from determined.
Looking forward, things to watch include:
* the impact of economic recovery on commodity prices and agricultural expansion for food and biofuels production;
* large-scale land acquisition by foreign nations and corporations in tropical countries;
* climate negotiations and the REDD mechanism, including controversies over land rights, "offsetting", forest definitions, and sustainable forest management;
* the emergence of payments for ecosystem services beyond REDD;
* the debate between cap-and-trade and carbon tax schemes;
* efforts to address the demand side of deforestation, notably consumption;
* emerging certification systems for agricultural and forestry products (e.g., RSPO, Aliança da Terra, FSC);
* and Brazil's progress in meeting its deforestation reduction targets.
How satellites are used in conservation
In October 2008 scientists with the Royal Botanical Garden at Kew discovered a host of previously unknown species in a remote highland forest in Mozambique.
The find was no accident:
three years earlier, conservationist Julian Bayliss identified the site—Mount Mabu—using Google Earth, a tool that's rapidly becoming a critical part of conservation efforts around the world.
As the discovery in Mozambique suggests, remote sensing is being used for a bewildering array of applications, from monitoring sea ice to detecting deforestation to tracking wildlife.
The number of uses grows as the technology matures and becomes more widely available.
Google Earth may represent a turning point, bringing the power of remote sensing to the masses and allowing anyone with an Internet connection to attach data to a geographic representation of Earth.
Brief history of remote sensing for environmental applications
A lot of environmental monitoring is possible today only through remote sensing.
Detecting changes in sea ice across the sub-freezing Arctic is one example, but remote sensing also allows monitoring of hostile and sometimes war-torn deserts, vast expanses of ocean, the dense Amazon rainforest, and isolated mountain ranges—monitoring which would be cost-prohibitive or impossible without eyes from above.
Landsat image revealing "fishbone" deforestation along roads in the Brazilian Amazon
"Sending people in by foot to survey these places is costly, time-consuming, and potentially dangerous," Ruth DeFries, a GIS specialist at Columbia University, told mongabay.com.
DeFries authored a comprehensive review last year on the use of remote sensing for terrestrial environmental monitoring.
"Remote sensing is really the only way to do this work," she said.
Early earth observation satellites focused on weather, but scientists quickly devised ways to use their data to analyze vegetation cover. In 1972 Landsat became the first non-weather satellite for civilian use, giving scientists the ability to observe any place on Earth every 18 days. The satellite initially was used for crop analysis but today, seven satellite generations later, higher resolution and additional spectral bands have vastly expanded Landsat's functionality for a wide array of applications.
Since the 1990s, Landsat has become only one example of many sensing technologies. Landsat 7, the most recent Landsat satellite, carries several passive sensors using different wavelengths to decipher Earth's features from above. A passive optical sensor, for example, is much like a camera, operating off visible light reflected from Earth's surface.
This reliance on reflected light has its shortcomings: the sensor can only capture what it can see.
Clouds, smoke, and other factors can interfere with or block its sensing capabilities. Meanwhile, infrared detects the amount of heat emitted from an object at the earth's surface, making it effective for identifying fires and other sources of heat like cities.
But remote sensing isn't limited to passive sensing.
Active sensing—which sends out pulses of energy and reads the radiation that bounces back to the sensor—can provide detailed information about Earth's surface, including the structure of a forest or the distinction between secondary and primary forest.
These technical advances, which make remote sensing data more relevant and timely, have been accompanied by favorable political and economic trends. Landsat data is now freely available to the public.
At the same time a proliferation of commercial satellites offers a range of remote sensing products. Remote sensing data is no longer limited to the military, specialized institutions, or the academic world.
This image shows Arctic sea ice concentration on September 8, 2008, as observed by the Advanced Microwave Scanning Radiometer–Earth Observing System (AMSR-E) sensor on NASA's Aqua satellite.
The observations are collected on a pixel by pixel basis over the Arctic.
The percentage of a 12.5-square-kilometer pixel covered by ice is shown in shades of dark blue (no ice) to white (100 percent ice). The gray line around the Arctic basin shows the median minimum extent of sea ice from 1979-2000. NASA image created by Jesse Allen, using data obtained courtesy of the National Snow and Ice Data Center (NSIDC). Caption by Rebecca Lindsey.
One of the most prominent uses for earth observation satellites is monitoring sea ice.
With records dating back to the 1970s, remote sensing observations have established a baseline for tracking the rapid loss of sea ice in the Arctic.
Due to persistent cloud cover, which obscures optical sensors, scientists from the National Snow and Ice Data Center (NSIDC) at the University of Colorado, Boulder, use infrared (IR) sensors to infer the amount of heat emitted from the surface.
Because sea ice is much colder than surrounding water, the contrast between the two is stark—at least during the winter. In the summer, melting sea ice approaches the temperature of the surrounding ocean, making it difficult to distinguish between the two. NSIDC says the new generation of multispectral sensors has helped fine-tune monitoring.
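The winter-time contrast described above amounts to a temperature threshold. As a minimal sketch, assuming calibrated brightness temperatures in kelvin and using the freezing point of seawater as an illustrative cutoff (operational NSIDC algorithms are far more sophisticated):

```python
# Illustrative winter-time classification of IR brightness temperatures (kelvin)
# into ice vs. open water. The threshold is an assumption for this sketch only.

SEAWATER_FREEZING_K = 271.35  # approximate freezing point of seawater

def classify_pixel(brightness_temp_k: float) -> str:
    """Label a pixel 'ice' when it is colder than the freezing point of seawater."""
    return "ice" if brightness_temp_k < SEAWATER_FREEZING_K else "water"

pixels = [250.0, 268.0, 274.0, 280.0]
print([classify_pixel(t) for t in pixels])  # ['ice', 'ice', 'water', 'water']
```

This simple rule also shows why summertime detection is harder: as melting ice warms toward the temperature of the surrounding ocean, the two classes straddle any single threshold.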
What researchers have found hasn't been encouraging, at least for polar bears:
summertime sea ice extent in 2007 fell to nearly half the average of the past three decades.
Unlike sea ice, fires are relatively easy to detect using thermal infrared bands provided by MODIS sensors. Fire data is acquired at least daily, enabling researchers to monitor, in near real-time, fires burning anywhere in the world.
The Fire Alert System—developed by Madagascar's ministry of Environment, the International Resources Group, and Conservation International using data from the University of Maryland and NASA—has put this information to practical use.
The system alerts subscribers via e-mail whenever burning is detected, potentially enabling them to take action on the ground when fires threaten protected areas or human settlements. MODIS data is also regularly used to monitor burning in the world's tropical forests. In 2007 when commodity prices were peaking, MODIS revealed a surge in the number of hotspots burning across the Brazilian Amazon.
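The alerting logic can be sketched as a filter over daily hotspot fixes. Everything below (the coordinates, the bounding box, the message format) is invented for illustration; the actual Fire Alert System's data formats and delivery pipeline differ.

```python
# Hypothetical sketch of a fire-alert loop: filter daily MODIS-style hotspot
# fixes against a protected area's bounding box, then build a subscriber alert.

from typing import List, Tuple

Hotspot = Tuple[float, float]  # (latitude, longitude)

def hotspots_in_area(hotspots: List[Hotspot],
                     bbox: Tuple[float, float, float, float]) -> List[Hotspot]:
    """Keep hotspots inside (min_lat, min_lon, max_lat, max_lon)."""
    min_lat, min_lon, max_lat, max_lon = bbox
    return [(lat, lon) for lat, lon in hotspots
            if min_lat <= lat <= max_lat and min_lon <= lon <= max_lon]

def build_alert(area_name: str, hits: List[Hotspot]) -> str:
    """Return an alert message, or an empty string when there is nothing to report."""
    return f"{len(hits)} hotspot(s) detected in {area_name}" if hits else ""

# Illustrative reserve bounding box and two daily detections.
reserve = (-15.9, 49.9, -15.2, 50.5)
detections = [(-15.5, 50.1), (-14.0, 48.0)]
hits = hotspots_in_area(detections, reserve)
print(build_alert("the reserve", hits))  # 1 hotspot(s) detected in the reserve
```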
Wildlife tracking
Remote sensing via satellite isn't limited to relatively stationary objects—it is widely used to track wildlife.
The Tagging of Pacific Predators (TOPP) program uses satellite tags to track nearly two dozen species of marine predators, including whales, sharks, birds, squid, sea turtles, and fish. TOPP data has revealed migration patterns, feeding grounds, behavior, and oceanic deserts.
Antelopes, bears, big cats, and parrots are among the many land species whose movements are regularly tracked using satellites. However, one of the most interesting applications has been developed by Save the Elephants, a Kenya-based conservation group.
Iain Douglas-Hamilton's organization has fitted elephants with GPS-enabled collars, allowing researchers (and Google Earth watchers) to track African elephants as they move through the bush in Kenya.
The system includes an alert feature that automatically sends a text message to rangers when a collared elephant approaches a virtual "geofence," which is established to reduce human-elephant conflict.
The "warning" allows rangers to take preventive action before raids on croplands occur.
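The geofence check itself reduces to a proximity test between a GPS fix and the fence line. A minimal sketch, assuming sampled fence vertices and an equirectangular distance approximation; all coordinates and the warning distance are illustrative, not Save the Elephants' actual system:

```python
# Hypothetical geofence proximity test: alert when a collared animal's GPS fix
# comes within a warning distance of any sampled point on the virtual fence.

import math

def approx_distance_km(p: tuple, q: tuple) -> float:
    """Equirectangular approximation, adequate over the short ranges involved."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*p, *q))
    x = (lon2 - lon1) * math.cos((lat1 + lat2) / 2)
    y = lat2 - lat1
    return math.hypot(x, y) * 6371.0  # mean Earth radius in km

def should_alert(fix: tuple, fence_points: list, warn_km: float = 1.0) -> bool:
    """True when the fix is within warn_km of any sampled point on the fence."""
    return any(approx_distance_km(fix, fp) <= warn_km for fp in fence_points)

fence = [(0.30, 37.50), (0.31, 37.51), (0.32, 37.52)]  # illustrative fence vertices
print(should_alert((0.305, 37.505), fence))  # True -> send SMS to rangers
print(should_alert((0.50, 37.90), fence))    # False
```

A production system would test distance to the fence's line segments rather than its vertices, but the alerting flow is the same.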
Possibilities will only broaden as cost and size factors continue to fall, and batteries improve.
Accurate and timely monitoring of deforestation and forest degradation may present the next great frontier for remote sensing due to the potential emergence of REDD (Reducing Emissions from Deforestation and Degradation), a mechanism for compensating tropical countries for conserving their forests. To date, one of the biggest hurdles for the concept has been establishing credible national baselines for deforestation rates—in order to compensate countries for "avoided deforestation," it must first be known how much forest the country had been clearing on a historical basis. For the remote sensing community, REDD presents an opportunity to showcase the power of remote sensing and generate a source of funding for improved sensing capabilities.
Presently optical sensing can do a reasonably good job distinguishing between cleared forest and natural forest—assuming cloud cover is minimal, a big assumption in the tropics. It does less well identifying and distinguishing between recovering forests, selectively logged forest, tree plantations, and degraded forests. New active sensing technologies, like cloud-penetrating radar and LIDAR, may change this. Some of these technologies may allow scientists to directly measure biomass in dense forests—currently many sensing technologies are limited by their tendency to "saturate" at a threshold well below the actual biomass in such forests.
Josef Kellndorfer, an associate scientist at the Woods Hole Research Center in Massachusetts, says that a new JAXA (Japan Aerospace Exploration Agency) satellite—known as ALOS for the Advanced Land Observation Satellite—offers great promise for monitoring deforestation and degradation.
Radar image mosaic is a composite of nine individual scenes (45,000 km2) of Bali, Indonesia acquired by the PALSAR sensor carried on board ALOS. The image acquisitions were made between September 9 and October 10, 2007.
From New Eyes in the Sky
"ALOS features three sensors:
an optical sensor, a typical multi-spectral sensor with visible and infrared bands; a precise stereo mapper for generation of very high resolution elevation models and topographic information, which is also optical-based; and a radar sensor named PALSAR," he told mongabay.com.
PALSAR will give scientists an annual snapshot at 20-meter resolution of all the world's biomes, enabling them to establish a baseline for forest cover every year. The sensor also has a 100-meter resolution mode that provides near-global coverage every six weeks and can be used to detect illegal logging activities even under cloud cover.
"These capabilities are exciting because we have the complete pan-tropical forest cover for 2007.
We'll get the same on an annual basis for the life of the ALOS mission.
This will enable us to build a data record of forest cover on an annual basis no matter what the cloud conditions are," Kellndorfer said.
"Every year, within three months, we will have a full resolution pan-tropical assessment of forest cover," he continued. "ALOS and future missions with dedicated observation strategies can thus be used as a tool to complement the overall remote sensing and monitoring effort of forests."
Deforestation data from INPE
The sensing exercise will be particularly important in the Amazon—the world's largest tropical forest.
Brazil's use of satellite data for environmental monitoring is among the most sophisticated on the planet.
The country has two systems for tracking deforestation:
PRODES (Program to Calculate Deforestation in the Amazon) and DETER (Real-time Detection of Deforestation). Both presently rely on optical sensing—sometimes thwarted by cloud cover—but where it has a clear view, the country can rapidly identify where deforestation is occurring. PRODES, which has a sensitivity of 6.5 ha, provides Brazil's annual deforestation estimates (measured each August) while DETER, which has a coarser resolution of 25 ha, is a year-round alert system that updates IBAMA, Brazil's environmental protection agency, every two weeks. This gives authorities the technical capacity—although not necessarily the political will—to combat deforestation as it occurs. The system will be enhanced when Brazil launches the Amazon-1, its own earth observation satellite with cloud-penetrating LIDAR.
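The practical effect of the two systems' different minimum mapping units can be illustrated with a small filter. Only the 6.5 ha and 25 ha thresholds come from the text; the clearing sizes below are invented for illustration.

```python
# Sketch of how minimum mapping units constrain what each Brazilian system can
# report: PRODES resolves clearings down to 6.5 ha, while DETER's near-real-time
# alerts only capture clearings of 25 ha or more.

PRODES_MIN_HA = 6.5
DETER_MIN_HA = 25.0

def visible_to(system_min_ha: float, clearings_ha: list) -> list:
    """Return the clearings large enough for a system's minimum mapping unit."""
    return [c for c in clearings_ha if c >= system_min_ha]

clearings = [3.0, 8.0, 30.0, 120.0]  # hectares, hypothetical detections
print(visible_to(PRODES_MIN_HA, clearings))  # [8.0, 30.0, 120.0]
print(visible_to(DETER_MIN_HA, clearings))   # [30.0, 120.0]
```

The 3 ha clearing escapes both systems, which is one reason small-scale degradation is harder to police than large clear-cuts.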
Dark green trees are the highly invasive strawberry guava tree from Brazil.
This invasion is occurring in a remote rainforest reserve in Hawaii.
This CAO image shows the march of invasive albizia trees (pale reds and pinks) and highly invasive strawberry guava trees (intense red) in Hawaii.
This 3-D image shows an invasion (red-pink trees) into a protected forest reserve (blue-green trees) in lowland rainforest.
Images and captions courtesy of the Carnegie Airborne Observatory
But researchers are using remote sensing of forests for more than monitoring forest loss. Greg Asner of the Carnegie Institution's Department of Global Ecology at Stanford University, has developed advanced applications for processing data generated by a wide range of sensors, including Carnegie's state-of-the-art Airborne Observatory (CAO), which uses a combination of technologies aboard high-altitude aircraft to create high-resolution, three-dimensional maps of vegetation structure.
These maps can be used to detect small changes in forest canopy structure from selective logging, measure biomass in dense tropical rainforests (presently limited due to "saturation" in carbon-dense ecosystems), and even distinguish between individual plant species, including invasive species. Asner's group recently won grants from the Gordon and Betty Moore Foundation to expand upon CLASlite—a desktop monitoring application that enables conservationists anywhere to measure forest biomass and deforestation—and to develop its spectranomics project.
This project has the potential to inventory biodiversity across 40,000 acres of rainforest per day by detecting the chemical and spectral (light-reflecting) properties of individual plant species across a diverse landscape.
"Infrared reflectances of tropical forest canopies are often unique signatures for species," Asner said.
"This new technology will help us to capture previously hidden ‘chemical fingerprints' of rainforest species ...
It will be a new era in rainforest research."
Satellite data is also providing insights into the drivers of deforestation.
Researchers at the Woods Hole Research Center are using remote sensing to map logging roads and anticipate future forest disturbance in the Congo and the Amazon.
Meanwhile, Holly Gibbs, a researcher at Stanford University, recently used remote sensing data to evaluate shifting patterns of tropical deforestation.
She found that 80 percent of agricultural expansion since the 1980s came at the expense of forests.
"I recently analyzed the Landsat database created by the UN FAO to estimate the probable land sources for expanding croplands," she told mongabay.com. "I was also able to consider changing patterns of agricultural expansion during the 1980s and 1990s and demonstrate that the amount of cropland expanding into forested areas, rather than grassland or previously disturbed forests, is increasing.
Documenting changing patterns in land use is becoming increasingly important as we see mounting demands for global food, feed, and fuel, highlighting the importance of continuing the longtime history of Landsat into coming decades."
Gibbs says increased accessibility has been key to wider use of satellite data.
"Satellite data is becoming increasingly accessible to everyone from a local park ranger to scientists at major universities, opening the door to more diverse analyses," she said.
"The creation of ‘free' global Landsat mosaics, for example, is a major push for initiatives to Reduce Emissions from Deforestation and Degradation [REDD] across the tropics that rely on what used to be very costly data."
How to Save Tropical Rainforests
Today tropical rainforests are disappearing from the face of the globe.
Despite growing international concern, rainforests continue to be destroyed at a pace exceeding 80,000 acres (32,000 hectares) per day.
World rainforest cover now stands at around 2.5 million square miles (6 million square kilometers), an area about the size of the contiguous 48 United States or Australia and representing around 5 percent of the world's land surface.
Much of this remaining area has been impacted by human activities and no longer retains its full original biodiversity.
Five Basic Steps to Saving Rainforests
"TREES" is a concept originally devised for an elementary school audience but serves well as set of principles for saving rainforests and, on a broader scale, ecosystems around the world.
* Teach others about the importance of the environment and how they can help save rainforests.
* Restore damaged ecosystems by planting trees on land where forests have been cut down.
* Encourage people to live in a way that doesn't hurt the environment.
* Establish parks to protect rainforests and wildlife.
* Support companies that operate in ways that minimize damage to the environment.
Deforestation of tropical rainforests has a global impact through species extinction, the loss of important ecosystem services and renewable resources, and the reduction of carbon sinks. However, this destruction can be slowed, stopped, and in some cases even reversed.
Most people agree that the problem must be remedied, but the means are not as simple as fortifying fences around the remaining rainforests or banning the timber trade.
Economic, political, and social pressures will not allow rainforests to persist if they are completely closed off from use and development.
So, what should be done?
The solution must be based on what is feasible, not overly idealistic, and depends on developing a new conservation policy built on the principle of sustainable use and development of rainforests. Beyond the responsible development of rainforests, efforts to rehabilitate and restore degraded forest lands along with the establishment of protected areas are key to securing rainforests for the long-term benefits they can provide mankind.
Past efforts
Historic approaches to rainforest conservation have failed, as demonstrated by the accelerated rate of deforestation.
In many regions, closing off forests as untouchable parks and reserves has neither improved the quality of living or economic opportunities for rural poor nor deterred forest clearing by illegal loggers and developers. Corruption has only worsened the situation.
The problem with this traditional park approach to preserving wildlands in developing countries is that it fails to generate sufficient economic incentives for respecting and maintaining the forest.
Rainforests will only continue to survive as functional ecosystems if they can be shown to provide tangible economic benefits. Local people and the government itself must see financial returns to justify the costs of maintaining parks and forgoing revenue from economic activities within the boundaries of the protected area.
Limited resources
Countries with significant rainforest cover are generally among the world's poorest.
As such, people's day-to-day survival is dependent upon natural-resource use.
Most local people living in and around forests never have an option to become a doctor, sports star, factory worker, or secretary; they must live off the land that surrounds them, making use of whatever resources they can find.
Their poverty costs themselves, their country, and the world through the loss of biodiversity and ecosystem services like erosion prevention, flood control, water treatment, and fisheries protection.
Governments in these countries are in the unenviable position of having to balance the well-being of rural poor with the interests of industry, demands from foreign governments, and requirements from the international aid community.
In this climate, it can be easier to simply neglect the continued destruction and degradation of environmental assets than to come up with a long-term plan to ensure that economic development is ecologically sustainable.
Success in conserving wildlands in these countries will require reconciling the inevitable conflicts between short-term needs of local people and the long-term nature of the benefits that conservation can generate on a sustainable, ongoing basis.
Forces behind rainforest loss
Rainforests are being cut mostly for economic reasons, though there are political and social motivations as well. A significant portion of deforestation is caused by poor farmers simply trying to eke out a living on marginal lands. Beyond conversion for subsistence agriculture, activities like logging, clearing for cattle pasture and commercial agriculture are sizeable contributors to deforestation on a global scale.
Agricultural fires typically used for land-clearing are increasingly spreading outside cultivated areas and into degraded rainforest regions.
Addressing deforestation
Addressing deforestation will need to take the very different needs and interests of these groups into account.
Poor farmers:
Poor farmers are simply trying to put food on the table for their families. A better approach to addressing the needs of the rural poor may be improving and intensifying currently existing agricultural projects and promoting alternative cultivation techniques—notably permaculture.
Permaculture adds a mix of crops to the farmer's palette, enabling farmers to diversify their income streams while restoring nutrients to degraded soils. An added benefit of such techniques is that they maintain forest systems, soils, and biological diversity at a far higher level than do conventional agricultural approaches. As long as such fields are adjacent to secondary and old-growth forest, many species will continue to thrive.
One promising area of research looks at ancient societies that lived in the Amazon rainforest before the arrival of Europeans in the 15th century.
Apparently these populations were able to enrich the rainforest soil, which is usually quite poor, using charcoal and animal bones. By improving soil quality, large areas of the Amazon that have been deforested could be used to support agriculture.
This could help reduce pressure on rainforest areas for agricultural land.
Further, the "terra preta" soil could be used to help fight global warming since it absorbs carbon dioxide, an important greenhouse gas.
A second important part of aiding poor farmers is helping them gain formal title to their land.
Right now, in places where it is difficult to gain ownership rights to land and where land is relatively open and abundant, there is little incentive to maintain or improve holdings. Once local people have a stake in the land they are farming, they will have an interest in using it efficiently instead of moving on to a new area of forest once soils are prematurely exhausted.
The creation of credit facilities for poor farmers to both save their earnings and borrow in times of need is also important to improving their quality of life.
Micro-credit facilities can provide significant economic benefits to the local economy while bringing dignity to and promoting entrepreneurship among local people.
Finally, improved market access is important in enabling farmers to sell their agricultural products. It can be a double-edged sword, however: better access often means more road-building, which frequently spurs further deforestation.
Any infrastructure improvements should be carefully planned to minimize the future impact on remaining ecosystems.
Industrial/commercial developers:
Thus far it has proved difficult to apply the same permaculture agricultural techniques mentioned above to industrial operations. As currently practiced, large-scale agriculture is typically quite destructive of native ecosystems and does not maintain biodiversity at levels commensurate with adjacent forest areas. Incremental steps like the use of natural pest control and fertilizers can help reduce pollution caused by agricultural operations, while leaving strips of forest as corridors linking sections of forest helps moderate biodiversity losses.
Sustainable logging, while possible, has met resistance from the timber industry because it is less efficient than traditional harvesting methods, and its environmental impact remains a matter of debate among conservationists.
Illegal logging and counterfeit labeling are major obstacles facing sustainable forest management for timber, but in time the development of higher yielding timber plantations will help alleviate pressures on natural forests.
Restoring and rehabilitating ecosystems
There is no use bemoaning past deforestation of large areas. Today the concern is how to best utilize lands already cleared so they support productive activities, now and for future generations. Without improving the well-being of people living in and around forests, we cannot expect rainforests to persist as fully functional systems and continue to cater to our needs.
In addressing environmental problems in rainforest countries, it is important that decision makers not only be concerned with the transformation of existing natural ecosystems, but also the more rational utilization of already cleared and degraded areas. To lessen future forest loss, we must increase and sustain the productivity of farms, pastures, plantations, and scrub land in addition to restoring species and ecosystems to degraded habitats. By reducing wasteful land-use practices, consolidating gains on existing cleared lands, and improving already developed lands, we can diminish the need to clear additional forest.
Research and experience have shown that restoring entire ecosystems is most feasible in regions where parts, or at least remnants, of the original forest remain and human population pressures are low. Small clearings surrounded by forest recover quickly, and large sections may recover in time, especially if some assistance in the reforestation process is provided.
After several years, a once-barren field can again support vegetation in the form of pioneer species and secondary growth.
Although the secondary forest will be low in diversity and poorly developed, the forest cover will be adequate for some species to return (assuming they still exist). In addition, the newly forested patch can be used for the sustainable harvest of forest products and low-intensity logging and agriculture.
Funding rainforest conservation efforts
Conservation efforts and sustainable development programs are not going to be cost-free.
Even countries that already get considerable aid from foreign donors have trouble effectively making such initiatives work in the long term.
Since handouts, which in and of themselves have the tendency to breed dependency, are not going to last forever, funding these initiatives may require more creative sources of income to be truly successful.
Here are some other funding strategies for consideration:
* Ecotourism—Ecotourism can fund efforts both through park entrance fees and employing locals as guides and in the handicraft and service sectors (hotels, restaurants, drivers, boat drivers, porters, cooks).
* Bio-prospecting fees—Rainforest countries can earn revenue by allowing scientists to develop products from their native plant and animal species. The pioneer in this area was Costa Rica, which entered into an agreement with the American pharmaceutical company Merck to look for plants with potential pharmaceutical applications. Under the agreement, a portion of the proceeds from compounds that prove commercially valuable will go to the Costa Rican government, which has guaranteed that some of the royalties will be set aside for conservation projects. Similarly, in 2001 Givaudan, a Swiss fragrance and flavor company, sent a team to look for new exotic smells and flavors in Madagascar. Following their survey, Givaudan researchers "reconstituted" 40 aromas that could be used in commercial products. The company has agreed to share a portion of the profits from these products with local communities through conservation and development initiatives.
* Carbon credits—For setting aside forest for the purpose of atmospheric carbon mitigation, developing countries can receive payments from industrialized countries looking to offset their carbon emissions. Carbon-offset programs are popular in many circles, since they can "provide a mechanism for motivating wealthy countries to pay for a benefit of forest conservation that transcends national borders."
In effect, such programs promote "the transfer of funds from industrialized countries to tropical countries as a commercial transaction rather than an act of charity" (Costa, P.M., "Tropical forestry practices for carbon sequestration: a review and case study from Southeast Asia," Ambio Vol. 25 No. 4, June 1996).
* Corporate sponsorship—Corporations have been a bit slow in "adopting" parks, but they have the money and a marketing-driven interest in taking a closer look at such schemes. See below for more details on a potential plan.
* The Linden-Lovejoy-Phillips plan—One interesting idea proposed by Eugene Linden, Thomas Lovejoy, and J. Daniel Phillips for tropical rainforests consists of dividing natural areas into blocks and then soliciting funding commitments from international environmental groups, development institutions, corporations, and other credible donors. There would be a bidding process, after which an entity would take responsibility for maintaining forest cover and forest health in each block of the entire forest system.
This plan could be an avenue for corporations to become involved in conservation as a public-relations/marketing tool. A given percentage of the proceeds could be put into a trust fund with the payout earmarked for ongoing conservation and sustainable development programs.
Further steps once funding is in place
* Expand protected areas—As many areas as possible should be protected, as soon as possible.
If protected areas can be developed in such a manner to generate income for local communities, an increasing number of parks should theoretically create more economic benefits for a greater share of the population.
* Increase surveillance of and patrols in protected areas—This can be done at a reduced cost if local communities benefit from the success of the park.
If locals have a vested interest (i.e. are compensated via entrance fees, hired as guides, make handicrafts to sell to tourists, and learn to value their ecosystem for the services it can provide), they will want to watch the park so that the source of their income is not diminished.
Community surveillance is the most effective way to patrol a protected area, though it will probably be necessary to have park staff conduct patrols as well.
Guides should also be trained to keep watch for activities that damage the ecosystem and to report suspicious activities to park headquarters.
* Build research facilities for training local scientists and guides—The average rainforest country needs to build its intellectual capital to grow its economy and make the best use of the country's resources. There need to be further studies on endemic species (many just have a name and a location and new species are being discovered every year) for both pure-research reasons and potential commercial applications. Improved crop yields and reduced erosion could also be possible with future research.
* Establish programs that promote sustainable use—Programs that promote sustainable use are key to elevating the standard of living for people living around protected areas. Not all members of a community will see the direct benefits from employment in the service or production sector, and many people will still rely on traditional use of the natural resources around them.
These resources must be used in a more effective manner to maximize productivity and minimize the impact on the environment.
* Compensate displaced people—As more protected areas are set aside, it is inevitable that some people will be asked to move.
It is important that these people are compensated for abandoning their existing livelihoods and homes. While direct cash payouts are an option, a better strategy is providing displaced people with long-term income possibilities through training in better agricultural techniques or alternative crops.
* Involve indigenous people, where they still exist, in park management.
Indigenous people know more about the forest than anyone and have an interest in safeguarding it as a productive ecosystem that provides them food, shelter, and clean water. Research has found that in some cases, "indigenous reserves" may actually protect rainforest better than national parks in the Amazon.
* Promote ecotourism—Ecotourism is perhaps the best hope for developing the economy of some rainforest countries. Planners should seek to minimize the environmental impact and maximize the benefits for local communities.
* Ensure economic success does not result in increased deforestation—As rural populations begin to reap benefits from conservation-related activities, it is important that they not reinvest this income in activities that result in further deforestation.
Traditionally, in many villages, the more money someone made, the more money was put back into land clearing.
Rural banks and savings institutions are virtually unknown in many parts of the developing world.
Such facilities, which would enable both saving and lending, could rapidly change the lives of millions through increased entrepreneurship and the ability to put away money for the future.
* Encourage entrepreneurship—Encouraging entrepreneurship through such a micro-credit strategy could pay significant dividends for a country's economy as a whole.
Studies in developing countries have found that entrepreneurial skills among the poor are actually quite high when people are given access to capital.
Default rates are typically quite low as well (do the poor have a greater respect for money?). Stimulating entrepreneurship through small, low-cost loans is possibly a better approach than handouts, which may do little more than breed dependency and reduce human dignity.
Looking toward the future, tough choices
Simply banning the timber trade or establishing reserves will not be enough to salvage the world's remaining tropical rainforests. In order for the forest to be preserved, the underlying social, economic, and political reasons for deforestation must be recognized and addressed.
Once the issues are brought into the light, the decision can be made about what should be done.
If it is decided that rainforests must be saved, then the creation of multi-use reserves that promote sustainable development and education of local people would be a good place to start.
Currently about 6 percent of the world's remaining forests are protected, meaning the other 94 percent are still open for the taking.
However, even this 6 percent is not safe if the proper steps towards sustainable development are not taken.
Wherever possible, reforestation and restoration projects should be encouraged if humanity hopes to come out of this situation without serious long-term consequences.
Brazil to cut CO2 emissions
President Luiz Inacio Lula da Silva has signed a law requiring that Brazil cut greenhouse gas emissions by nearly 39 per cent by 2020, meeting a commitment made at the Copenhagen climate change summit.
Brazil announced at the summit a 'voluntary commitment' to reduce CO2 emissions by between 36.1 and 38.9 per cent in the next 10 years.
The new law, however, is subject to several decrees setting out responsibilities and regulations for the farming, industrial, energy and environmental sectors.
Lula is expected to sign the decrees in January after consulting scientists and other experts, officials said.
Despite the law's ambitious targets, Greenpeace's top representative in Brazil, Sergio Leitao, called it merely a list of good intentions and accused Lula of applying double standards to environmental issues.
'Brazil usually makes good speeches on the international stage, as in Copenhagen, but in practice it doesn't keep its word,' he told reporters.
Before signing the new law, in fact, Lula vetoed three of its provisions, including a reference to 'promoting the development of clean energy sources and the gradual phasing out of energy from fossil fuels'.
Environment Minister Carlos Minc said he was pleased with the new law because it showed Brazil's determination to respect the pledges it made in Copenhagen.
'It doesn't matter if the Copenhagen summit didn't get the results we wanted.
We will still meet our goals,' he told reporters.
The climate change conference held in the Danish capital ended last week with a non-binding agreement that exposed the stark divide between rich and developing nations.
A total of $US30 billion ($A33.57 billion) was pledged from 2010-2012 to help poor countries in the firing line of climate change, and rich nations set a goal of providing 100 billion dollars annually in aid by 2020.
It established a goal of limiting warming to two degrees Celsius, but did not impose binding targets to reduce the emissions of gases that scientists say are heating up the world's atmosphere to dangerous levels.
The Copenhagen agreement was put together by leaders of the United States, China, India, Brazil, South Africa and major European nations, after it became clear the 194-nation summit was in danger of failure.
Florida Developments Suggest Improving Environment for PV
Despite the approval of a solar PV deal in Tampa, Fla., PV and other alternatives still have a pricing disadvantage.
A single event doesn't equal a trend, but the recent approval by Florida regulators of a solar PV deal in Tampa – in combination with other factors at work in the Sunshine State – suggests an improving environment for developers there.
The Florida Public Service Commission on Dec. 15 voted 4–1 to let Tampa Electric Co. recover from its customers the costs of the 25-year solar contract that are above the costs of "avoided" conventional generation, or facilities that won't have to be built or contracted for as a result of the solar purchase.
The vote, in which Commissioner Nathan Skop dissented, was a rare instance of the commission overruling a formal recommendation of its staff.
The action comes in the context of renewable-resources cheerleading by Florida Governor Charlie Crist Jr., a Republican.
There also has been a general upswing of solar activity in a state that, despite its latitude and meteorology, has lagged in PV development.
TECO's March 9 announcement of its contract with Energy 5.0 LLC included unusual commentary from the governor, given that the deal was subject to approval by governor-appointed regulators:
"I applaud TECO and Energy 5.0 on this exciting partnership that moves Florida closer to our goal of increasing energy diversity and reducing greenhouse gas emissions. ...
Two years ago, I challenged Florida to find the 'gold in green,' and we continue to see companies investing in innovative solutions that promote the use of renewable energy while saving money for consumers."
Crist's pro-solar orientation helped push through the 2008 passage of House Bill 7135, which granted full cost recovery for 110 MW of new renewable projects. That incentive was snapped up in its entirety by Florida Power & Light Co., which has contracted for a 15-MW solar facility in DeSoto County; a 10-MW facility at the Kennedy Space Center; and a 75-MW solar/thermal facility at an existing natural-gas power plant in Martin County. FPL didn't respond to requests for information for this article.
The solar upswing, which has defied a severe economic pullback in the state tied to residential real estate, also has roots in the February 2009 installation of feed-in tariffs by Gainesville Regional Utilities. Rachel Meek, GRU's solar program coordinator, says officials of the municipal utility expect the FIT to add 4 megawatts of solar PV per year to Gainesville's portfolio going forward.
"The sentiment was that renewables have to get in the door, even if we know they'll be more expensive" than conventional generating technologies, said a staff member, who cautioned that she was "paraphrasing" the rationale behind the vote because an approving order hadn't yet been drafted.
The commission's Dec. 15 action is significant for two reasons:
First, it departs from previous practice by allowing for payment of a higher rate than would otherwise be approved for conventional generation.
According to a commission staff memorandum, the project carries a gross capital cost of $135 million prior to deduction of the federal ITC, resulting in a net capital cost of $94.5 million.
Staff estimated a total annual cost (O&M plus capital) of 21.4 to 23.52 cents per kilowatt-hour over the contract term, assuming level annual generation of 48.3 gigawatt-hours.
Second, in going against its staff recommendation to allow recovery of costs only up to the level of available alternatives, including fossil fuel, the PSC signaled that it "gets" the fact that solar and other non-fossil fuels are more expensive at this time.
Specifically, the commissioners overruled staff's advice that TECO be prohibited from recovering the costs of such non-power "attributes" as renewable energy credits.
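As a rough sanity check, the staff figures can be approximated with a back-of-envelope calculation. The sketch below is illustrative only, not the staff's actual methodology: it takes the memo's $94.5 million net capital cost and assumes annual generation of 48.3 gigawatt-hours, and spreads capital evenly over the 25-year contract without discounting.

```python
# Back-of-envelope check of the PSC staff figures (illustrative assumptions,
# not the staff's actual methodology).
NET_CAPITAL_COST = 94.5e6        # dollars, after the federal ITC (from staff memo)
CONTRACT_YEARS = 25
ANNUAL_GENERATION_KWH = 48.3e6   # assumed: 48.3 GWh/year, held level

# Simple undiscounted capital recovery per kilowatt-hour
capital_cents_per_kwh = (
    NET_CAPITAL_COST / (CONTRACT_YEARS * ANNUAL_GENERATION_KWH) * 100
)
print(f"capital-only cost: {capital_cents_per_kwh:.1f} cents/kWh")

# The staff's 21.4-23.5 cents/kWh estimate additionally reflects O&M,
# financing costs, and the utility's allowed return, which is why it sits
# well above this capital-only figure.
```

Under these assumptions the capital component alone comes to roughly 8 cents per kilowatt-hour, with the balance of the staff's estimate attributable to operating and financing costs.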
In a search for previous cases in which the PSC was asked by Florida utilities to approve contracts for renewable-energy purchases, six decisions were found from 2008 and 2009.
In all six decisions, the commissioners adopted every substantive recommendation made by its staff - including one case in which the staff explicitly recommended against the recovery of the cost of purchasing renewable attributes or certificates.
Voting with the majority on Dec. 15 was recently arrived Commissioner David Klement, who was seated on the panel in October. Klement filled a vacancy created by the resignation, under pressure from Crist, of Commissioner Katrina McMurrian.
Crist also announced that he won't renominate Commission Chairman Matthew Carter II to a new term.
The governor has nominated Benjamin "Steve" Stevens to fill the vacant panel seat in January 2010.
Stevens' orientation toward renewables probably won't be readable until then.
The PSC on Dec. 1 unanimously elected sitting commissioner Nancy Argenziano to succeed Carter as chairman.
While the appointments of both Klement, a policy-institute executive and former journalist, and Stevens, a Pensacola businessman and financial consultant, require approval by the state senate, Klement was seated early to finish the incomplete term created by McMurrian's departure.
While recent developments have been favorable, solar PV's momentum in Florida won't be unrestrained.
The state remains among those most hobbled by the recession.
The real-estate market is wallowing in oversupply, unemployment is rising, investment capital is only thinly available, and electricity sales are expected to decline in 2009 for the second straight year.
The PSC's Muir notes that the legislature has attempted to require statewide establishment of FITs, but so far no bills have made it through any committees. Although Florida has other sizable municipal utilities besides Gainesville's, to date none of them has taken the FIT plunge.
And the fact remains that PV and other alternatives retain a pricing disadvantage.
"It's a more costly source of generation," Muir says. "Unless there's a change in law, cost is the biggest obstacle."
World's largest solar project prompts environmental debate
Panoche Valley is known mostly for cattle and barbed wire, a treeless landscape in eastern San Benito County that turns green every spring but for much of the year looks like rural Nevada.
A posse of lawmen gunned down the famous Gold Rush bandit Joaquin Murrieta, an inspiration for the fictional character Zorro, near here in 1853.
Nothing that exciting has happened since.
But now the remote valley 25 miles south of Hollister is finding itself at the center of a new showdown. A Silicon Valley company is proposing to build here what would be the world's largest solar farm — 1.2 million solar panels spread across an area roughly the size of 3,500 football fields.
"This is renewable energy. It doesn't cause pollution, it doesn't use coal or foreign oil, and it emits no greenhouse gases," said Mike Peterson, CEO of Solargen Energy, the Cupertino company behind the $1.8 billion project.
But critics — including some environmentalists — say green energy isn't always green.
In a refrain being heard increasingly across California, they contend the plan to cover this ranch land with a huge solar project would harm a unique landscape and its wildlife.
From the Bay Area to the Mojave Desert, green energy supporters are frustrated that a state that wants to lead the green revolution is facing roadblocks.
Peterson, a former vice president of Goldman Sachs, looked across the Panoche Valley last week and noted its attributes.
It sits 20 miles from the nearest town.
It has 90 percent of the solar intensity of the Mojave Desert.
Five willing sellers, mostly longtime ranching families, have signed options to sell his company 18,000 acres. And huge transmission lines run through the site, negating the need to build the kind of costly and controversial new power lines that have stalled similar projects.
"From our standpoint, this is a perfect place," he said.
"If not here, where?"
Opposition mounts
The project would produce 420 megawatts of electricity, roughly the same as a medium-sized natural gas power plant, and enough to power 315,000 homes.
But in recent weeks, the Santa Clara Valley, Monterey Peninsula and Fresno chapters of the Audubon Society have opposed the project.
"One of our biggest worries is the size.
There are no other projects like it," said Shani Kleinhaus, an environmental advocate with the Santa Clara Valley Audubon Society.
"There is really very little information on how these sorts of projects impact the environment.
We really don't know."
Among their primary concerns:
Panoche Valley is home to several endangered species, including the San Joaquin kit fox, the blunt-nosed leopard lizard and the giant kangaroo rat.
Additionally, an estimated 130 species of birds have been observed in the valley, including the bald eagle, golden eagle and prairie falcon.
Kleinhaus said she supports renewable energy.
But not here.
"Put solar panels over parking lots. Put them along the freeways, in airports, landfills," she said.
"There's plenty of space.
In five years, with new technology, they may not even need this much space."
Several nearby residents also are fighting the project.
Kim Williams leases 300 acres along Panoche Road, where she raises 650 free-range chickens. Williams moved to the area from Concord three years ago, becoming a sustainable farmer after reading Michael Pollan's "The Omnivore's Dilemma."
"It looks desolate right now at first glance.
But come back in a month, and you'll see a green valley that looks like Ireland," she said.
"There are wildflowers. It's beautiful."
Williams said vast solar arrays would alter the character of the area.
She worries that Solargen, founded in 2006, has never built a solar farm, and is pursuing the project primarily for the huge federal subsidies now flowing to renewable energy.
Similar debates are playing out across California.
Two large solar proposals in San Luis Obispo County near the Carrizo Plain — a 250-megawatt project proposed by SunPower of San Jose, and a 550-megawatt proposal from First Solar of Arizona — also are facing environmental opposition.
Meanwhile, U.S. Sen. Dianne Feinstein, D-Calif., on Monday introduced a bill to establish two new national monuments on federal land in the Mojave Desert.
If approved, the measure would all but kill 19 large solar and wind farms proposed for the area.
Feinstein said she wants no large-scale solar or wind energy on former railroad lands that the federal government acquired a decade ago and that are prime habitat for bighorn sheep, desert tortoises and other wildlife.
In a statement, Feinstein said she supports solar energy, and her bill requires the Bureau of Land Management and other agencies to identify other desert areas suitable for solar.
But others argue that prohibiting solar developments in vast portions of California doesn't make sense.
"They say that we want renewable energy, but we don't want you to put it anywhere," said Gov. Arnold Schwarzenegger in a speech at Yale University last year. "I mean, if we cannot put solar power plants in the Mojave Desert, I don't know where the hell we can put it."
Demand for solar is hot.
Schwarzenegger this year signed an executive order requiring 33 percent of California's electricity to come from renewable sources such as solar and wind.
Meanwhile, President Barack Obama's stimulus plan contains billions in grants and tax credits for green power. It would pay for 30 percent of Solargen's project in the Panoche Valley, for example, if ground can be broken by Dec. 1, 2010.
Julia Levin, a member of the California Energy Commission and former Audubon California policy director, said large solar projects are needed because residential rooftop solar, while important, costs more and takes longer to ramp up than big commercial installations.
"There are some very real environmental challenges for renewable energy development," she said.
"But NIMBY challenges are slowing down some renewable projects. Our challenge is separating one from the other." NIMBY is an acronym for "Not In My Back Yard."
The Panoche Valley solar project could come to a final vote before the San Benito County Board of Supervisors by year's end.
If work started by next December, it would be finished by 2016.
"There is some opposition down there, and I can understand that," said retired schoolteacher Reb Monaco, the supervisor whose district includes Panoche Valley.
"But when you look at areas that make sense for solar, it is probably an area that makes sense."
Jobs, tax revenue
Solargen's Peterson said the solar panels would be on racks, 3 feet off the ground, so sheep could graze underneath, and wildlife could move under them.
The 4,717-acre installation would create jobs and tax revenue for the tiny county and give it an international reputation as a solar leader, he added.
Currently, the largest solar farm in the world, in Spain, is 60 megawatts, roughly one-seventh the size of Solargen's proposal.
"It was like everyone was in favor of renewable energy," Peterson said, looking out over a field of cow patties. "But the solar industry is finding the politics are complicated.
There's a lot of 'we love renewable, but not here, and not in my backyard.' "
Global warming likely to be amplified by slow changes to Earth systems
Researchers studying a period of high carbon dioxide levels and warm climate several million years ago have concluded that slow changes such as melting ice sheets amplified the initial warming caused by greenhouse gases.
The study, published in the journal Nature Geoscience, found that a relatively small rise in atmospheric carbon dioxide levels was associated with substantial global warming about 4.5 million years ago during the early Pliocene.
Coauthor Christina Ravelo, professor of ocean sciences at the University of California, Santa Cruz, said the study indicates that the sensitivity of Earth's temperature to increases in carbon dioxide in the atmosphere is greater than has been expected on the basis of climate models that only include rapid responses.
Carbon dioxide and other greenhouse gases trap heat in the atmosphere, leading to increased atmospheric and sea-surface temperatures. Relatively rapid feedbacks include changes in atmospheric water vapor, clouds, and sea ice.
These short-term changes probably set in motion long-term changes in other factors--such as the extent of continental ice sheets, vegetation cover on land, and deep ocean circulation--that lead to additional global warming, Ravelo said.
"The implication is that these slow components of the Earth system, once they have time to change and equilibrate, may amplify the effects of small changes in the greenhouse gas composition of the atmosphere," she said.
The researchers used sediment cores drilled from the seafloor at six different locations around the world to reconstruct carbon dioxide levels over the past five million years. They found that during the early and middle Pliocene (3 to 5 million years ago), when average global temperatures were at least 2 to 3 degrees Celsius warmer than today, the concentration of carbon dioxide in the atmosphere was similar to today's levels, about 30 percent higher than preindustrial levels.
"Since there is no indication that the future will behave differently than the past, we should expect a couple of degrees of continued warming even if we held carbon dioxide concentrations at the current level," said lead author Mark Pagani, an associate professor of geology and geophysics at Yale University.
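The study's quoted numbers allow a rough, illustrative estimate of the long-term sensitivity it implies. The sketch below assumes the standard logarithmic relationship between CO2 concentration and radiative forcing, uses the midpoint of the quoted 2 to 3 degree Pliocene warming, and, for simplicity, ignores the difference between today's temperature and the preindustrial baseline; it is a simplification, not the researchers' calculation.

```python
import math

# Illustrative "Earth-system sensitivity" implied by the Pliocene numbers
# quoted above, under a logarithmic CO2-forcing assumption.
co2_ratio = 1.30           # Pliocene CO2 ~30% above preindustrial (from the study)
pliocene_warming_c = 2.5   # midpoint of the quoted 2-3 C (a simplification)

doublings = math.log2(co2_ratio)                      # fraction of a CO2 doubling
implied_sensitivity = pliocene_warming_c / doublings  # warming per doubling
print(f"implied sensitivity: {implied_sensitivity:.1f} C per CO2 doubling")

# Fast-feedback sensitivity in climate models is commonly quoted near
# 3 C per doubling, so under these assumptions the slow feedbacks
# (ice sheets, vegetation, ocean circulation) roughly double the
# eventual response.
```

The point of the arithmetic is simply that the Pliocene warming is much larger than fast feedbacks alone would predict for so small a CO2 increase, which is the amplification the authors describe.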
How the Economics of Natural Gas Vehicles Works
Converting a gas-powered vehicle to run on natural gas can add upwards of $10,000 to the cost of a vehicle — depending on tank size, production volume and other factors. "It's not exactly cheap," BAF Technologies President John Bacon told us today.
So why are companies like AT&T — which just tapped BAF for what Bacon says is the company's largest-ever single order of natural gas vehicle conversions — taking the plunge and investing in this technology?
In the short term it comes down to fuel savings (based on current natural gas vs. gasoline and diesel prices), low enough emissions to let drivers use the HOV lane in California, and desire to use fuels produced in the U.S., said Bacon.
Longer term, explained Rich Kolodziej, president of the trade group Natural Gas Vehicles for America, fleet operators will continue to eye natural gas conversions as a way to expand their options for lower-emission vehicles beyond the models that roll off automakers' assembly lines.
AT&T's Natural Gas Vehicle Bet
AT&T's plan to invest up to $565 million deploying more than 15,000 alternative fuel vehicles — including 8,000 vehicles converted to run on natural gas — by 2019 means selected car makers and conversion providers will be able to sink their teeth into some juicy contracts in coming years.
BAF, a Clean Energy Fuels Corp. subsidiary, has just taken its second prime cut of the $350 million natural gas vehicle portion of AT&T's plan, announcing today that AT&T has awarded it the job of delivering 463 Ford E-250 vans converted to run on compressed natural gas, or CNG, in the first quarter of 2010 and another 463 converted vans the following quarter.
For the third quarter of next year, Bacon said BAF has gotten the go-ahead to purchase certain parts for an order of the same size.
In all, Bacon said BAF will supply more than 1,800 converted vans to AT&T in 2010.
This comes in addition to the 600 van conversions that AT&T ordered from the company earlier this year — a deal that BAF says remains on track for completion in 2009.
The Costs of Natural Gas Vehicle Conversions
Large fleet orders are key to making natural gas vehicle conversions economical, not only for high-mileage customers — which can reap savings on fuel — but also for the conversion provider. "Fleets generally operate a number of vehicles that are centrally maintained and fueled.
They also travel more miles daily than the average personal use vehicle and therefore can take better advantage of the lower price per gallon of natural gas," points out the NGVA.
Bacon said BAF's retail price for converting a single van or pickup truck to run on natural gas is about $16,500, but the per-vehicle cost is less than that for a fleet order (he also said BAF would never really convert just one vehicle). According to Kolodziej, the biggest costs in the process lie in environmental certification and parts, in addition to labor and purchasing the vehicle itself. An aftermarket conversion provider like BAF has to get certification from the EPA and the California Air Resources Board for "every engine family, every year," said Kolodziej — a process he called "burdensome and much of it unnecessary" and that he said can cost upwards of $100,000.
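To see why fleet mileage matters so much to the economics, a back-of-envelope payback calculation helps. All of the figures below except the $16,500 conversion price are illustrative assumptions, not numbers from BAF or NGVA:

```python
def simple_payback_years(conversion_cost, annual_miles, mpg,
                         gasoline_price, cng_price_gge):
    """Years for fuel savings to recoup the conversion premium.

    Prices are per gallon (gasoline) and per gasoline-gallon
    equivalent (CNG), so a single mileage figure applies to both.
    """
    gallons_per_year = annual_miles / mpg
    annual_savings = gallons_per_year * (gasoline_price - cng_price_gge)
    return conversion_cost / annual_savings

# Illustrative only: a fleet van driving 25,000 miles a year at
# 12 mpg, with gasoline at $2.60/gal and CNG at $1.40/GGE.
print(round(simple_payback_years(16500, 25000, 12, 2.60, 1.40), 1))
```

At lower annual mileage or a narrower fuel-price gap, the payback stretches well past a typical fleet holding period, which is why conversions tend to pencil out mainly for high-mileage, centrally fueled fleets.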
When it comes to parts, Kolodziej said fuel tanks represent one of the biggest costs. Bacon said AT&T's vans will have four cylinders, and "when you put that much fuel on a vehicle, it drives up the cost."
While a regular gasoline tank "is really a rigid plastic bag," said Kolodziej, natural gas fuel tanks are often made of metal, since the fuel has to be stored at a specific pressure.
That means more cost, compared to a gas vehicle.
There's a "big opportunity," however, to bring the cost down as the market grows, he said:
"You get the volume up, you get the economies of scale." OEMs (original equipment manufacturers), of course, have capacity for high volume production.
"If you make it on the assembly line, it's a lot cheaper," said Kolodziej.
While he expects car companies to eventually introduce more natural gas models to the U.S. market (where the Honda Civic GX is currently the only natural gas vehicle from an OEM), Kolodziej expects aftermarket conversion companies to play a significant role for another several decades. OEMs will start out with 1-2 natural gas models out of their total lineup, he predicted.
Carbon Reduction Benefits
For corporations looking to cut fleet costs in a time when the prospect of Congress putting a price on carbon still looms — and for the companies competing for the job — there's more to consider than direct costs:
Potential emission savings could also factor into decisions about what types of vehicles to deploy.
According to NGVA, natural gas vehicles emit only about 20 percent less carbon dioxide than a standard gasoline vehicle (BAF claims its fuel system reduces emissions by up to 25 percent).
Natural gas powered cars can qualify for a federal Alternative Fuel Vehicle tax credit of up to $4,000, but for some, the technology's carbon-cutting potential falls short.
Venture capitalist Vinod Khosla — a strong believer in supporting cleantech solutions that can scale quickly and cheaply enough to succeed in China and India's markets — has called natural gas vehicles a "dead end." AT&T holds a different view, citing estimates from the Center for Automotive Research that its 15,000-vehicle plan for the next decade will prevent as much as 211,000 metric tons of carbon emissions from the company's fleet.
Infotech Can Cut Carbon, But Should It Be In the Copenhagen Agreement?
COPENHAGEN — For a conference focused on negotiating an international agreement to fight climate change, there's an almost shocking number of information technology firms at the Copenhagen climate talks. There's a kiosk in the middle of the Bella Center where companies from Google to Cisco to Microsoft, along with various IT trade groups, lead daily discussions about how information technology can fight climate change.
While I have little doubt about IT's ability to reduce emissions (see The Climate Group's Smart2020 report), a representative from the International Telecommunication Union (ITU) told me today that the group is hoping the final Copenhagen agreement will reference IT as an important part of the shared vision to reduce emissions.
Information and communications technology (ICT) can cut global greenhouse gas emissions by 15 percent across sectors, said Arthur Levin, the head of the Telecommunication Standardization Policy division of the ITU, a UN group focused on ICT. The group believes this contribution should be recognized by negotiators. Eventually, Levin said, the group also hopes to discuss the possibility of ICT projects being used in the Clean Development Mechanism, a market structure that helps industrialized nations gain and trade credits from green projects created in developing nations.
The Smart2020 report found that ICT can cut emissions by adding efficiency to the power grid, buildings, transportation and logistics, and through dematerialization (replacing physical goods with virtual ones). The report says that even factoring in ICT's own energy and carbon footprint (about 2 percent of the world's greenhouse gas emissions), the sector can enable emissions savings roughly five times that footprint.
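The "factor of five" is simply the ratio of ICT-enabled abatement to ICT's own footprint. A quick check using the 2020 projections commonly cited from the Smart2020 report (the specific tonnage figures below are my assumptions, not taken from this article):

```python
# Back-of-envelope check of the "factor of five" claim, using
# 2020 projections commonly attributed to the Smart2020 report
# (assumed figures, for illustration only):
ict_footprint_gt = 1.43   # ICT's own emissions, GtCO2e in 2020
enabled_savings_gt = 7.8  # emissions ICT could help avoid elsewhere
ratio = enabled_savings_gt / ict_footprint_gt
print(round(ratio, 1))  # roughly five times its own footprint
```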
The IT companies at Copenhagen certainly know that data well.
The Chief Environment Strategist of Microsoft, Rob Bernard, who gave a presentation at the IT kiosk in Copenhagen on Thursday, said that ICT can "enable radical energy efficiency," "drive basic research," and deliver "responsible environmental leadership."
Bernard said ICT can help deliver research that can aid the issues that the attendees and delegates of the Copenhagen conference will be focusing on closely over the next eight days.
Google also has tools that will contribute to the climate dialogue.
Today the search engine giant launched a tool that helps collect and manage deforestation data, and which could come in very handy for groups backing these different forestry frameworks. Google has also been showing off its Google Earth layers that focus on climate change at Copenhagen this week.
Cisco has been highlighting its TelePresence (video conferencing) and web conferencing tools at Copenhagen all week as well.
The Ministry of Foreign Affairs of Denmark chose Cisco as "the official technology partner of COP15."
But should ICT get drafted into the language of the agreement itself?
Well, it clearly could cut emissions substantially in the near term, compared with other technologies still at the research-and-development or science-project stage.
And Tom Phillips, the chief of government and regulatory affairs for the GSM association, the mobile trade group, explained that it's crucial to "get ICT policies into government hands."
What do you think?
Should IT get a shout in the Copenhagen agreements?
4 Green Building Trends to Watch in 2010
The market for new construction is still struggling to pick itself up, but the growing trend of green building promises a sort of renaissance for the centuries-old industry.
That's the hope, anyway, and if you believe (as we do, though with a healthy pinch of skepticism) the mountain of reports and data pointing to the growth of the green building industry, then 2010 looks to be a pivotal year for transitioning the built environment into one that consumes significantly less energy, water and other resources.
Below we present four of the most important trends that we see shaping the industry in 2010.
Since energy use, at least so far, has been the primary focus of innovators and investors, we've largely limited our view into the green building crystal ball to that slice of the industry.
Modular Green Homes Go Mainstream:
When Warren Buffett makes a bet on energy-efficient modular homes, it's a good sign the market is set to grow.
Clayton Homes, one of the largest builders of manufactured housing in the U.S. and a subsidiary of Buffett's Berkshire Hathaway, launched its i-house earlier this year. The homes, which will be constructed as modules in a factory and then assembled in the field, are billed as "affordable luxury in a green, energy-efficient package."
Besides Clayton, a number of startups like Zeta Communities and Blu Homes are getting into the prefabricated market.
So far, these companies have built a relatively small number of "prefab" homes, but 2010 could be the year that this industry finally becomes a serious player. "It's going to change — there is no question," Michelle Kaufmann, whose firm, Michelle Kaufmann Studio, designs prefab homes, tells us. "The technology is there, it's just about embracing it."
The industry will really take off once the country's largest home builders start using modular construction.
That time is probably not too far off, as Kaufmann says she's been approached by two of the nation's five biggest home builders (she wouldn't give names because of nondisclosure agreements) to advise them on modular construction.
Besides cost savings in labor and materials compared with conventional building, modular construction can help developers reduce risk, Kaufmann says. A developer can build homes on a large site as sales come in rather than investing a large amount of money upfront to build all the planned homes at once and before most are sold.
This should prove attractive at a time when financing is hard to come by and the market for new construction is lagging.
Building Materials Get Smarter:
Tech-oriented innovators and investors are finally starting to embrace the building industry, and one of the most exciting areas is smarter, more energy-efficient building materials. Serious Materials, which raised a $60 million third round of venture funding in September, already has built a bustling business out of energy-saving windows and environmentally friendly substitutes for sheetrock.
But a host of "revolutionary innovations" are in the pipeline, according to a report earlier this year by venture firm Nth Power and the research firm Fraunhofer Center for Sustainable Energy Systems. High-efficiency insulation systems such as walls with micro-encapsulated phase change materials are being developed, according to the study.
These materials could help stabilize the indoor temperatures in buildings by, say, releasing heat absorbed during the day at night when the outside air cools.
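A rough sketch of why phase change materials store so much more heat than ordinary wallboard: melting absorbs latent heat, which dwarfs the sensible heat an equal mass absorbs over a few-degree temperature swing. All property values below are assumed, round-number figures for illustration:

```python
# How much heat a wall layer can absorb over a daytime swing.
# Property values are rough, assumed figures for illustration.
def sensible_heat_kj(mass_kg, specific_heat_kj_per_kg_k, delta_t_k):
    # Heat absorbed by warming the material (no phase change).
    return mass_kg * specific_heat_kj_per_kg_k * delta_t_k

def latent_heat_kj(mass_kg, latent_heat_kj_per_kg):
    # Heat absorbed by melting the material at near-constant temp.
    return mass_kg * latent_heat_kj_per_kg

# 1 kg of gypsum over a 5 K swing vs 1 kg of a paraffin-type PCM
# melting inside that range (~180 kJ/kg latent heat assumed):
gypsum = sensible_heat_kj(1.0, 1.0, 5.0)
pcm = latent_heat_kj(1.0, 180.0)
print(round(pcm / gypsum))  # tens of times more storage per kg
```

That stored heat is then released back as the material re-solidifies when the outside air cools at night, which is the stabilizing effect described above.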
A number of companies (such as the stealthy Soladigm) are using electrochromic technologies that darken or lighten the tint of a window when an electrical current is applied, managing the sunlight that passes through.
The study also points to the development of ventilated double-skin facades, systems that use inner and outer glass walls with a thin gas cavity in between for the exterior shell of a building.
The facades provide insulation, and heat absorbed within the cavity can be used to warm cooler areas of a building.
Double-skin facades have already found a fair amount of traction in Europe, but they'll need some tweaking before they're widely adopted in the U.S.
Energy Retrofits Become Big Business:
"Efficiency" may have been the most popular word for 2009, and nowhere was its meaning so loud and clear as in the building industry.
The country's building stock is largely old and wastes energy, and the measures needed to make structures run more efficiently — say by adding more insulation in the case of homes or replacing aging heating and cooling systems in office buildings — often pay for themselves in reduced energy bills in a handful of years. Add to that the buzz in Washington (and likely financial incentives to spur them on) about creating jobs through these projects, and you've got a powerful force driving this industry.
Geoff Chapin, chief executive of home energy retrofitter Next Step Living, tells us he expects his business to grow 300-400 percent next year. Chapin said electric utilities are giving the industry a boost by offering more rebates for homeowners on measures like energy audits, insulation and duct sealing.
The U.S. home energy retrofit market will grow about 15 percent per year to $35 billion by 2013, up from $20.7 billion in 2007, according to SBI Energy.
And look out in 2010 for movement at the national level to help correct what are widely seen as three major barriers to the industry:
limited information for consumers about the energy performance of homes, difficulties accessing finance for energy retrofits, and a lack of skilled workers in the field.
The market for nonresidential building retrofits is also set to explode.
Research and publishing firm McGraw-Hill Construction published a widely circulated report earlier this year that said nonresidential "green building retrofits" represent in the near-term a better opportunity for designers and builders than new construction.
The market for these retrofits — defined in this report as over $1 million in total cost and employing at least three aspects of green building such as energy, water and resource efficiency — could grow to as much as $15 billion by 2014 from less than $4 billion this year.
Retro-commissioning, the practice of optimizing a building's operation and maintenance, particularly of its heating and cooling systems, has become a sort of mantra for the industry.
While the practice should be seen as just one piece of a comprehensive energy retrofit, retro-commissioning's rise in popularity is for good reason since it can often lead to energy savings as high as 30 percent, says David Leathers, senior vice president of energy services for mechanical contractor Limbach.
Leathers says that any commercial building in the U.S. five years or older can likely benefit from a retrofit with payback for most measures taken in less than five years.
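The payback logic Leathers describes is simple division: project cost over annual bill reduction. A minimal sketch with illustrative numbers (not from Limbach):

```python
def retrofit_payback_years(retrofit_cost, annual_energy_bill,
                           savings_fraction):
    """Simple payback: cost divided by yearly bill reduction."""
    annual_savings = annual_energy_bill * savings_fraction
    return retrofit_cost / annual_savings

# Illustrative only: a $120,000 retro-commissioning project on a
# building with a $100,000/year energy bill, saving 30 percent.
print(round(retrofit_payback_years(120000, 100000, 0.30), 1))
```

At the 30 percent savings figure cited above, a project costing up to about 1.5x the annual energy bill would still pay back inside five years.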
Energy Codes Will Demand Greater Energy Efficiency:
Ever since the energy crisis of the 1970s faded out of memory, energy codes adopted by states and other jurisdictions across the country have been making small, incremental steps toward demanding more efficiency out of buildings. But a consensus is forming that stricter standards for building energy efficiency are needed, says Jim Edelson, who runs the codes program for the New Buildings Institute, a nonprofit that promotes improved building energy performance.
The new versions of the "model codes" currently under development — ASHRAE 90.1 and the International Energy Conservation Code (IECC), which are typically but not always adopted by jurisdictions — will likely require a 30 percent increase in energy efficiency, what Edelson calls the "most significant" increase in a generation. ASHRAE 90.1 is planned to be available in 2010, and the IECC is targeting a 2012 release.
But just because ASHRAE or IECC develop new codes doesn't mean jurisdictions have to adopt them, and that's led to a patchwork of energy standards across the country. (For a state-by-state map of adopted codes, see the Department of Energy's web site.)
One issue to look out for in 2010 is whether Congress decides to mandate that all states raise their standards to the newest codes. The American Clean Energy and Security Act passed by the House this year includes a provision that would effectively create a baseline national building energy code by mandating the adoption of a standard set by the Department of Energy, which would presumably point to ASHRAE or IECC. Consistent codes across the country would be good for anyone selling products or marketing services related to building energy efficiency, but it's unclear whether this provision will be part of the Senate's version of the bill or whether it will survive any compromise legislation.
The Green Building Sector Is Ripe for Water-Saving Innovation, Report Says
Water scarcity is becoming a hot-button issue in the U.S. (and globally), with water managers in 36 states saying they expect freshwater shortages to hit their states by early in the next decade.
But the coming shortages could present opportunities for entrepreneurs and investors to develop new water-saving technologies. One ripe area for innovation is the building sector, according to a report, titled "Green Buildings + Water Performance," released this week by publisher Building Design+Construction.
Buildings account for about 12 percent of water use in the country, according to the U.S. Geological Survey, and green building ratings systems like the U.S. Green Building Council's LEED encourage more efficient use of water, such as through low-flow toilets, drip irrigation and on-site water reuse.
Typically more water is consumed outside commercial buildings and homes (see charts below taken from the report) — for landscape irrigation and cooling towers — than is used inside by things like toilets, faucets and showers, according to the report.
With that in mind, we've summarized the three areas in green building design noted in the report as the most promising for reducing water use outside buildings:
[Chart: office building water use]
Smart Landscape and Irrigation:
Landscape irrigation can be as much as 60 percent of water use in homes in arid climates and more than a third in more water-rich areas, the report says. Newer technologies, like weather-based irrigation (Rockport Capital-backed HydroPoint is one example), are helping building owners reduce water use.
Instead of watering according to a preset schedule, these "smart" systems take into account weather conditions, current and historic evapotranspiration, and soil moisture levels to deliver water based on the needs of the plants. Other water-saving landscape features emerging (besides the low-tech solution of selecting drought-resistant plants) include bioswales and vegetated roofs.
[Chart: domestic water use]
Rainwater Reuse:
The bulk of U.S. building projects miss out on one of the most potentially significant water conservation opportunities by failing to put in place rainwater catchment and reuse systems, according to Building Design+Construction.
For every inch of rain that falls on 1,000 square feet of roof area, 600 gallons of water can be harvested — collected from the roof or the ground and diverted to storage tanks. If just 10 percent of the roof area in arid Texas were used for rainwater harvesting, 38 billion gallons of water would be conserved each year, says the report.
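The report's rule of thumb scales linearly with roof area and rainfall, so sizing a system is straightforward arithmetic. A minimal sketch (the capture-efficiency factor is my assumption, added because real systems lose some water to first-flush diversion and overflow):

```python
# The report's rule of thumb: every inch of rain on 1,000 sq ft
# of roof yields about 600 gallons of collectible water.
GALLONS_PER_INCH_PER_1000_SQFT = 600

def harvestable_gallons(roof_sqft, annual_rain_inches,
                        capture_efficiency=1.0):
    """Annual collectible rainwater. The efficiency factor (< 1)
    is an assumed allowance for first-flush diversion, overflow
    and other losses; it is not part of the report's figure."""
    return (roof_sqft / 1000) * annual_rain_inches \
        * GALLONS_PER_INCH_PER_1000_SQFT * capture_efficiency

# Illustrative: a 2,500 sq ft roof in a 30-inch-per-year climate,
# capturing 80 percent of what falls.
print(round(harvestable_gallons(2500, 30, 0.8)))
```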
Moreover, rainwater harvesting is relatively simple to execute, especially for irrigation and cooling tower applications.
While many rainwater harvesting systems are custom-engineered from various components, a growing number of packaged systems are now available, such as those from BRAC Systems and Watertronics. One emerging trend in rainwater reuse is the application of siphonic roof drainage technology, in which negative pressure is used to draw water along horizontal piping.
Proponents of siphonic roof tech say the process can be less expensive than conventional systems that depend on gravity and require more piping to move water.
Cooling Tower Water Recovery:
Cooling towers for chillers are often the largest consumers of water in commercial buildings. (They typically rely on water evaporation to provide cooling for air conditioning.) A large commercial building with 1,000 tons of refrigeration will use 3,000 gallons of water per minute, the report says.
Newer cooling technologies like variable refrigerant volume systems, which cool individual rooms of a building depending on need, show promise for reducing water (and energy) use, as do cooling tower water management techniques, such as automated controls. Water treatment technologies (such as Dolphin WaterCare's system) increase the recirculation rates in cooling towers before the need for a so-called blowdown, when water is removed from the system to reduce the mineral concentration and scaling that build up as water evaporates.
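The water treatment systems mentioned above save water by letting the tower run more "cycles of concentration" before blowdown is needed. A minimal sketch of the standard cooling-tower water balance (drift losses ignored; the flow figures are illustrative, not from the report):

```python
# Standard cooling-tower water balance, drift ignored. Raising
# the cycles of concentration C cuts blowdown, since
#   blowdown = evaporation / (C - 1)
def makeup_gpm(evaporation_gpm, cycles):
    """Total makeup water needed: evaporation plus blowdown."""
    blowdown = evaporation_gpm / (cycles - 1)
    return evaporation_gpm + blowdown

# Illustrative: ~10 gpm of evaporation. Treatment that lifts the
# tower from 3 to 6 cycles trims total makeup water noticeably.
low = makeup_gpm(10, 3)   # 10 + 10/2 = 15 gpm
high = makeup_gpm(10, 6)  # 10 + 10/5 = 12 gpm
print(round(low, 1), round(high, 1))
```

Evaporation itself is fixed by the cooling load, so blowdown is the only term treatment can shrink, which caps the achievable savings.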
There is a "potential opportunity" for whole building water savings, according to the report, in the reuse of wastewater (blowdown and condensate) from cooling towers and other mechanical equipment for irrigation.
But condensate recovery has not yet caught on all that well in the building industry.
A year after Tenn. disaster, fight over coal-ash rules just beginning
CHARLESTON, W.Va. -- A year ago Tuesday, at about 1 a.m., a coal-ash dike ruptured at the Tennessee Valley Authority's Kingston Plant west of Knoxville, Tenn.
More than a billion gallons of coal ash -- containing an estimated 2.9 million pounds of toxic pollutants -- poured into nearby streams, fields and homes. The spill covered more than 300 acres and made three homes uninhabitable.
It damaged 23 other homes, along with roads, rail lines and utilities. TVA estimates the cleanup will cost between $933 million and $1.2 billion and take two to three years to complete.
The disaster heated up a long-simmering controversy over major loopholes in the way the nation regulates the handling and disposal of millions of tons of ash generated by coal-fired power plants.
But today, as the anniversary of the Kingston mess approaches, the battle over potential new rules to protect coalfield communities and the environment from the dangers of toxic coal ash is just getting started.
"This will be a very, very hard political fight," said Eric Schaeffer, director of the Environmental Integrity Project, which advocates tougher rules. "We've got a long way to go before the finish line."
Last week, the Obama administration backed off Environmental Protection Agency Administrator Lisa Jackson's promise to publish proposed new coal-ash rules before the end of 2009.
The EPA offered no new timeline, saying only that the delay was for a "short period" and the proposed rules would be published "in the near future."
Agency officials blamed the "complexity of the analysis" involved and said staffers were "actively clarifying and refining parts of the proposal."
But, the EPA delay was announced just a week after a power industry official warned a congressional committee that tougher regulation could force nearly 200 power plants nationwide to close.
And lobbyists for coal-fired utilities, coal companies and other related industries met privately at least 10 times with White House officials in October and November to try to scuttle or weaken the EPA rules before they were even proposed.
"As the cost and benefits of the coal ash rule have become better known, the rule has become more controversial," said Scott Segal, director of the Electric Reliability Coordinating Council, an industry group.
"Almost every state has raised concern about the proposal.
Dozens and dozens of members of Congress have also made inquiry."
Coal-fired power plants generate more than 130 million tons of various ash wastes every year. The numbers have been on the rise as more plants install scrubbers and other equipment that controls air pollution, but shifts the toxic leftovers from burning coal into ash and other wastes. By 2015, the annual amount of coal ash generated at U.S. plants is expected to increase to 175 million tons, a jump of more than a third.
But no single national program sets up a concrete regulatory plan for the handling and disposal of these "coal combustion wastes," or CCW. Instead, the nation relies on a patchwork of state programs that vary in terms of their standards and their level of enforcement.
Environmental groups want to see the EPA issue a rule that would regulate coal ash as a "hazardous waste" under the federal Resource Conservation and Recovery Act.
Industry groups oppose this.
"We support regulations that promote sound management practices, protect ground and surface waters, assure structural integrity, and include other performance-based measures," said Pat Hemlepp, a spokesman for American Electric Power. "We have urged EPA to incorporate these techniques into regulations that can be adopted and administered by the states."
But that's just what environmental groups don't want.
If the EPA proposes to regulate coal ash as a "non-hazardous waste," then the agency can't force specific handling and disposal standards onto the states. Environmentalists worry that this approach would be little better than the state-by-state regulation that exists now.
Environmental groups are especially concerned about data that has come out since the Kingston disaster to confirm that coal-ash impoundments are leaking toxic pollution -- including arsenic, chromium, cadmium and other metals -- into streams and groundwater.
One EPA report, made public in May, concluded that residents near coal-ash dumps could have as much as a 1 in 50 chance of getting cancer from drinking water contaminated with arsenic.
Another agency study, issued in October, reported that coal-ash pollution of water is "of particular concern" because of the large quantities and high concentrations involved.
Still, the Government Accountability Project reported in late October that EPA was considering a "hybrid" approach to regulating coal-ash dumps. So-called "wet disposal" in impoundments would be considered a hazardous waste.
Other "dry" landfill facilities would be counted as non-hazardous waste.
"We don't believe this is a workable scheme, because of the substantial documentation that dry disposal has caused contamination at numerous, numerous sites," said Lisa Evans, an Earthjustice attorney and one of the environmental community's top experts on coal ash.
Along with EPA's rulemaking on coal ash handling and disposal, that agency is also developing new water pollution limits to govern discharges from coal-fired power plants and their ash dumps.
And the U.S. Office of Surface Mining Reclamation and Enforcement is expected to re-issue a rule to promote the dumping of coal ash into coal mines as part of reclamation projects.
Industry officials love this OSM idea, which they call making a "beneficial use" of power plant wastes. Environmental groups don't think coal ash really works in mine reclamation, and they're concerned that the practice is a pet project of Joe Pizarchik, a former Pennsylvania regulator appointed by President Obama to be director of OSM.
Jeff Stant, director of the Environmental Integrity Project's Coal Combustion Waste Project, said last week that the Bush administration OSM actually sent a rule promoting "beneficial use" of coal ash to the White House for approval near the end of 2008.
That rule was pulled after the Kingston spill, Stant said.
At the same time, there are growing concerns about the integrity of coal-ash dams across the country, and whether another Kingston-sized disaster is possible.
Six of 43 sites examined by EPA were given "poor" ratings for their structural integrity.
And EPA officials conceded last week that they have yet to assign separate hazard rankings -- which simply explain the likely consequences should a dam fail, but not necessarily the chances of such a failure -- to nearly 400 coal-ash impoundments across the country.
In West Virginia, state regulators who launched an inspection sweep of coal-ash dams actually found two that they didn't previously know about.
Both of them were in bad shape, prompting state Environmental Protection Secretary Randy Huffman to say his agency probably needs to schedule periodic inspections, something that isn't required by state or federal law.
Efforts to improve coal-ash regulation date back to before 1980, when Congress first told EPA to decide whether the material qualified as "hazardous waste."
The Clinton administration tried to move forward with such a designation, but the Bush administration reversed course.
Environmental groups say they're hoping to see something out of EPA sometime in January, and Senate Environment and Public Works Chairwoman Barbara Boxer said she's convinced the agency will propose tough reforms.
"I have spoken with the EPA administrator and I believe that she is committed to moving forward with a good rule," said Boxer, D-Calif. "The sooner this happens, the better."
Seminole Electric drops plan for plant
Tampa-based Seminole Electric Cooperative Inc. said Friday that it had withdrawn its application to build a new 750-megawatt coal-burning plant in Palatka for "business reasons."
The nonprofit company operates a 665-megawatt plant on the St. Johns River in Putnam County and applied in 2006 to add a third unit that would come on line in 2012.
Jeff Fella, spokesman for Seminole Electric, said, "We have cancelled the Unit 3 project because of the uncertain regulatory and legal environment related to the construction of new coal-fired units."
The Palatka plant was built in 1984.
This year, it was named 2009's "Top Plant" by Power, an energy industry magazine.
One criterion was the plant's "good corporate environmental policy."
Seminole Electric supplies wholesale power to 10 electric distribution cooperatives, which together serve 1.7 million customers in 46 Florida counties.
On Friday, Tallahassee-based environmental group Earthjustice, working with Florida Wildlife Federation, released a joint press release applauding Seminole Electric for its decision to cancel the plant application.
A statement from Earthjustice said, "(Its) demise represents a new direction for energy in the Sunshine State.
They did the right thing.
Instead of sticking with dirty coal, Seminole Electric is considering building a 1.5-megawatt solar energy project in southwest Florida."
Both environmental organizations, however, believe Seminole Electric's decision to withdraw the application was strongly influenced by a lawsuit they filed against the company in June.
Sarah Owen Gledhill, planning advocate for Florida Wildlife Federation, said coal-fired plants account for 41 percent of U.S. industrial emissions.
Also, 14 percent of the mercury emitted by coal plants is deposited within 30 miles.
"We're not going to have additional levels of mercury from the coal-burning plant in our waterways," Gledhill said.
"Those of us who like to fish and consume fish should celebrate this decision."
David Guest, attorney for Earthjustice, said, "Our claim was that the proposed facility was going to emit a large amount of air pollutants, above the level that expensive and sophisticated pollution control equipment is needed."
At a hearing, Seminole Electric moved to delay the case "in hope the matter can be settled."
Fella said the withdrawal of the application "doesn't have anything to do with any specific lawsuit. (The company) obviously has a need for that power. But we don't have any plans (for new facilities to announce) at this time."
Seminole is exploring creating what Fella calls a "smart grid," which would "facilitate integration of renewable energies (such as wind or solar) to base load units."
In addition, the company is exploring the use of "peaking units," natural gas-driven generators sited in places that allow them to stabilize the electricity harvested from wind farms.
"This is a really big step for them," Fella said.
Earthjustice in 2007 successfully challenged Florida Power & Light's proposal for what would have been the largest new proposed coal plant in the nation, near Everglades National Park.
Guest said coal is cheap but dirty.
"Each coal seam has a different concentration of impurities," he said.
"Everybody in Florida uses Appalachian coal, shipped down in barges. They'd have to build a Rolls Royce plant right from Jump Street.
Coal generates more of the greenhouse gases that cause global warming than other fuels, almost twice as many pollutants as natural gas."
The plant recently completed a $300 million investment in new environmental controls, the company said.
In the release Friday, Florida Wildlife Federation president Manley Fuller was quoted as saying, "It makes no sense to add new coal-generating units in Florida when we're finally moving to install renewable energy sources like solar."
Pentagon, CIA Eye New Threat: Climate Change
Global warming is now officially considered a threat to U.S. national security.
For the first time, Pentagon planners in 2010 will include climate change among the security threats identified in the Quadrennial Defense Review, the Congress-mandated report that updates Pentagon priorities every four years.
The reference to climate change follows the establishment in October of a new Center for the Study of Climate Change at the Central Intelligence Agency.
But the new attention to climate concerns among U.S. security officials does not mean the Pentagon and the CIA have taken sides in the debate over the validity of data on global warming.
As with nuclear terrorism, deadly pandemics or biological warfare, it only means they want to be prepared.
"I always look at the worst case," says one senior intelligence official who follows climate issues. "Whether it's global warming or the chance of Country A invading Country B, I just assume the most likely outcome is the worst one."
Military officials, accustomed to drawing up detailed plans for a wide variety of contingencies, have a similar view.
"The American people expect the military to plan for the worst," says retired Vice Adm. Lee Gunn, a 35-year Navy veteran now serving as president of the American Security Project.
"It's that sort of mindset, I think, that has convinced, in my view, the vast majority of military leaders that climate change is a real threat and that the military plays an important role in confronting it."
Among the scenarios that concern security planners is the melting of the massive Himalayan ice mass. In theory, the rivers fed by the Himalayan glaciers would flood at first, then dry up once the glaciers retreat.
That would endanger tens of millions of people in lowland Bangladesh.
Retired Air Marshal A.K. Singh, a former commander in India's air force, foresees mass migrations across national borders, with militaries soon becoming involved.
Boats vie for passengers at a flooded intersection in downtown Sirajganj, Bangladesh, in this 2007 photo. The flooding was caused in part by melting snow in the Himalayas. (David Greedy/Getty Images)
"It will initially be people fighting for food and shelter," Singh says. "When the migration starts, every state would want to stop the migrations from happening.
Eventually, it would have to become a military conflict.
Which other means do you have to resolve your border issues?"
The drafters of the Quadrennial Defense Review were instructed by Congress to accept the assessments of the Intergovernmental Panel on Climate Change (IPCC), the international body established by the United Nations and the World Meteorological Organization to gather and report world climate data.
Neither the Pentagon nor U.S. intelligence agencies make an independent effort to assess the planet's climate, and U.S. security officials have generally tried to distance themselves from any debate over the validity of the IPCC data.
Instead, they focus on the security repercussions.
"The [IPCC] projections lead us to believe that severe weather events will increase in intensity in the future, perhaps in frequency as well," says Amanda Dory, the deputy assistant secretary of defense overseeing the review process. "This is a mission area where the Department [of Defense] already responds on a regular basis in support of civil authorities, whether for floods, wildfires [or] hurricanes. We believe there's a possibility those types of requests will increase in the future."
Climate change could also have implications for ship and aircraft designers.
"When you talk about building ships that are going to last from 30 to 50 years or programming for aircraft that are not going to be put in the air for 20 years, you have to be thinking about the kinds of changed conditions into which you're going to throw them in the future," Gunn says.
Still, there is only so much military planners can do to prepare for the consequences of climate change.
The 2010 Quadrennial Defense Review, due to be delivered in February, is required to identify what global warming may mean for the Defense Department's "roles, missions and installations."
But Dory of the Pentagon says there won't be much change in that area.
"We don't anticipate that there are new mission areas as a result of climate change," Dory says. "Similarly, there may be changes in technical specifications for platforms, but not the need for new types of platforms that we don't already possess." (In Pentagon jargon, "platforms" are the things on which weapons are carried, like ships or aircraft.)
In the short term, climate change may be a more important subject for intelligence officials than for military planners.
Analysts at the National Intelligence Council are trying to develop a set of early warning signs that could suggest where the next famine might arise or which countries are in most danger of being destabilized as a result of dramatic climate changes. Intelligence officials put those countries on a "stability watch list."
But how far to go with such climate and security projections is a matter of dispute.
"We suck at predicting wars, and we're not very good at predicting peace," says James Carafano, a retired Army officer and former West Point instructor who now directs foreign policy and national security studies at the Heritage Foundation.
"These are huge, giant, complex systems, and people who take a linear approach to these things and say, 'Oh, well, if this happens, then we'll have to worry about that' — that's not how reality works out."
Perhaps not, but it's the job of national security officials at least to imagine future climate and security scenarios, whether they can do something about them or not.
Judge Halts Timber Sale in Alaska Roadless Area
U.S. District Judge John Sedwick ruled that the Forest Service must re-evaluate the sale due to changing economic conditions that have greatly reduced the revenue the proposed sale would bring.
Agriculture Secretary Tom Vilsack in July had approved the Orion North sale, the first logging allowed in an area covered by the 2001 roadless rule since the secretary took personal responsibility for such decisions earlier this year. Environmental groups had filed a lawsuit in March challenging the proposed sale, which would allow Pacific Log and Lumber to harvest about 4.4 million board feet of timber.
The sale also would involve almost 5 miles of new road construction, nearly 2 miles of road reconstruction and 1 mile of temporary road building.
Sedwick ruled that new information from the environmental groups shows that the costs to the public of the timber sale are significantly higher and the returns to the federal government are "very significantly lower" than anticipated.
The groups contend the sale would cost taxpayers nearly $1.6 million to build roads into the national forest and bring in about $141,000 for the trees.
Sedwick said the court injunction will "protect public resources just long enough to allow time for the preparation" of a supplemental environmental impact statement.
The original environmental analysis was issued in 1999, when the sale was first authorized.
The judge rejected the groups' arguments that the government failed to consider new information on deer habitat, wolf viability, invasive species and climate change effects on yellow cedar.
The groups include the Tongass Conservation Society, the Sierra Club, the Natural Resources Defense Council, the Center for Biological Diversity, Greenpeace and the Cascadia Wildlands Project.
"The court's ruling recognizes that timber sales like this one in a roadless area of the Tongass are a waste of taxpayer money," Kate Glover of Earthjustice Alaska said in a statement.
"This decision protects one of the last pristine areas in the Thorne Arm; the trees have more value standing than cut down."
When Vilsack approved the sale in July, department officials noted it was first offered before the Clinton roadless rule was adopted in 2001.
They also said the Forest Service had judged that the sale was critical to keep a local timber mill open and to protect jobs there (E&ENews PM, July 20).
The department did not answer requests for comment by deadline.
The roadless rule granted blanket protection to about 58 million acres of national forests nationwide but has been mired in legal battles ever since President Clinton put it in place just before leaving office.
In May, Vilsack signed a directive giving himself sole power to make decisions for one year on building roads and harvesting timber on nearly all of the areas covered by the 2001 rule.
No project will proceed without his personal approval while the Obama administration decides how to handle the roadless rule.
EIA: Total Greenhouse Gas Emissions in the US Down 2.2% in 2008; Transportation Sector Emissions Down 4.7%
Total US greenhouse gas emissions in 2008 were 2.2% below the 2007 total, according to the just-released report by the US Energy Information Administration, Emissions of Greenhouse Gases in the United States 2008.
The decline in total emissions—from 7,209.8 million metric tons carbon dioxide equivalent (MMTCO2e) in 2007 to 7,052.6 MMTCO2e in 2008—was largely the result of a 177.8-MMTCO2e drop in carbon dioxide emissions. There were small percentage increases in emissions of other greenhouse gases, but their absolute contributions to the change in total emissions were relatively small, with the increase in emissions of those gases being more than offset by the drop in CO2 emissions:
14.8 MMTCO2e growth for methane (CH4), or 2%.
Methane emissions totaled 737.4 MMTCO2e in 2008.
Most of the increase came from coal mining and from natural gas production and processing.
Emissions from petroleum systems decreased.
Emissions from stationary combustion—primarily from wood combustion for residential heating—increased.
0.4 MMTCO2e growth for nitrous oxide (N2O), or 0.1%.
5.3 MMTCO2e growth for the man-made gases with high global warming potentials (high-GWP gases). The increase resulted mainly from higher emissions levels for hydrofluorocarbons (HFCs, up by 5.0 MMTCO2e).
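These reported figures can be cross-checked with a few lines of arithmetic (a sketch; the variable names are ours, and the small residual reflects rounding in the published numbers):

```python
# Cross-check of the EIA figures quoted above (all values in MMTCO2e).
total_2007, total_2008 = 7209.8, 7052.6
co2_drop = 177.8                                     # decline in CO2 emissions
gas_increases = {"CH4": 14.8, "N2O": 0.4, "high-GWP": 5.3}

net_change = sum(gas_increases.values()) - co2_drop  # other gases up, CO2 down
reported_change = total_2008 - total_2007            # about -157.2

# The two agree to within rounding of the published figures,
# and match the reported 2.2% year-over-year decline.
assert abs(net_change - reported_change) < 0.2
assert abs(net_change / total_2007 + 0.022) < 0.001
```

The ~20.5-MMTCO2e combined rise in the other gases is indeed "more than offset" by the 177.8-MMTCO2e CO2 drop, as the report states.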
The decrease in US CO2 emissions in 2008 resulted primarily from three factors, according to the EIA report:
higher energy prices—especially during the summer driving season—that led to a drop in petroleum consumption; economic contraction in three out of four quarters of the year that resulted in lower energy demand for the year as a whole in all sectors except the commercial sector; and lower demand for electricity along with lower carbon intensity of electricity supply.
Petroleum remained the largest fossil fuel source for energy-related CO2 emissions, contributing 42% of the total, followed by coal with a 37% contribution.
Although coal produces more CO2 per unit of energy, petroleum consumption—in terms of British thermal units (Btu)—made up 44.6% of total fossil fuel energy consumption in 2008, as compared with coal's 26.8%.
The transportation sector has led all US end-use sectors in emissions of carbon dioxide since 1999; however, with higher fuel prices and slower economic growth in 2008, emissions from the transportation sector fell by 4.7% from their 2007 level.
Transportation sector carbon dioxide emissions in 2008 were 95.6 MMT lower than in 2007 but still 343.2 MMT higher than in 1990.
Climate Bill Can Benefit Farmers Despite Higher Costs
An expanded economic study, which USDA released yesterday, estimates that farmers with energy-intensive crops could see their per-acre production costs rise by nearly 10 percent over the next 50 years. But agriculture officials insist that higher prices for fuel or feed would be offset by the gains of participating in an offset market.
"The bottom line is, we think this is a net benefit for farmers and ranchers," said Agriculture Secretary Tom Vilsack.
USDA released the report at a House Agriculture Committee hearing yesterday. A second hearing today will focus on the offset market that could establish a new cash flow to farmers. Companies with carbon emissions could pay farmers and ranchers to sequester carbon through planting trees, practicing no-till farming or developing better nutrient management.
The new analysis of the House-passed climate bill (H.R. 2454 (pdf)) estimates that commodity crop farmers would see minimal price increases in the first four years, when a rebate in the bill would keep fertilizer prices lower.
Over the longer term, farmers would see bigger price increases, according to the agency.
Agriculture itself is not capped in the bill, but the price of energy and fertilizer is expected to go up because of caps on the industries that supply the inputs for fuel and fertilizer. Livestock producers would also be hit by higher commodity prices.
For instance, corn, one of the most energy-intensive crops, would see cost increases of $1.19 per acre in the short term but $25.19 per acre, or 9.6 percent, in the long term.
USDA projects fuel costs to rise between 2.6 percent and 5.3 percent from 2012 to 2018.
Fertilizer would go up an extra 0.3 percent to 1.7 percent per year.
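For scale, the corn figures imply a baseline production cost the article does not state: a $25.19-per-acre increase described as 9.6 percent points to roughly $262 per acre (our inference from the quoted numbers, not a figure USDA gives here):

```python
# Inference from the USDA corn figures quoted above (not a stated USDA number).
long_term_increase = 25.19   # dollars per acre, long-term cost increase
pct_increase = 0.096         # the same increase expressed as 9.6%

implied_baseline = long_term_increase / pct_increase
assert 260 < implied_baseline < 265   # roughly $262 per acre
```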
The cost estimates ignore potential benefits that farmers and ranchers might gain from selling carbon credits. Vilsack said in a teleconference with reporters yesterday that the offset market in the House bill could bring in $10 billion to $20 billion for the farm sector. USDA chief economist Joseph Glauber will expand on the benefits from offset markets in another round of testimony for the Agriculture Committee today.
Vilsack said all sectors -- including livestock and crops, like rice, that have less carbon-sequestration potential -- could have a chance to benefit under the system.
But he warned the transition would not be easy for all farm enterprises.
"It is fair to recognize that different producers will be affected differently," Vilsack said.
"But ... more farmers will benefit than not."
Differing interpretations
House Agriculture Committee members interpreted the economic results differently, depending on their stance on cap and trade.
"The conclusions of all the studies remain the same, that cap and trade has the potential to devastate the agriculture community with higher energy prices," said Rep. Bob Goodlatte (R-Va.).
If Congress does not pass a climate bill, EPA could move forward with its own regulations to curb emissions. Goodlatte recommended Congress block the effort by passing legislation that restricts EPA's authority to take that action, an idea that failed during the EPA spending bill debate this year.
But Rep. Tim Walz (D-Minn.) said that lawmakers, farm interests and economists are not paying enough attention to the potential harm to agriculture from a warming planet -- which could bring on more severe weather events, disease and pests.
"The fact of the matter is, what the bill won't do, climate will do," Walz said.
"It behooves us all to look at the evidence on all sides, not just the short-term view, climate change is not going to allow us to yield the food and fuel we need."
Other economists who testified at the hearing yesterday agreed that unmitigated global warming could also have serious economic effects for farmers.
"Studies have tended to underemphasize the costs of adaptation and of severe climate events," said John Antle, professor of economics at Montana State University.
"We need to think more carefully about where we are headed in the future."
"The potential yield decreases of doing nothing are extreme," said Richard Pottorff, chief economist of Doane Advisory Services. "We need to take some action, but what we do is the question."
Get on board
Most farm groups have been hesitant to support U.S. cap-and-trade efforts because of concern about how such a plan could raise energy and fertilizer prices.
The American Farm Bureau has launched a campaign against U.S. emission curbs and many commodity groups have either decried climate legislation or taken a neutral stance.
The left-leaning National Farmers Union is one of the few large farm groups advocating for climate change legislation.
Vilsack urged farm groups to change their stance, saying global warming could be devastating for agriculture.
"It is important for us to recognize no action is not a good option," he said.
Vilsack predicted farmers would embrace cap and trade as they see the advantages from a carbon sequestration market.
"Farmers were reluctant when fertilizer was first proposed ... or seed technology ... but they embraced the technology, and the result is agriculture production that is the envy of the world," Vilsack said. "I understand the concern in farm country, but if this is done properly, it is going to ultimately, in the long term, the short term and the medium term, it is going to be a benefit to farmers."
Low Carbon Prices: Just a Phase or an Indictment of Cap and Trade?
In the wake of the weak climate agreements reached in Copenhagen a little over a week ago, the price of carbon has dropped substantially.
The main exchange for the carbon emissions allowances traded under the European Union's Emissions Trading System saw the price of carbon dioxide allowances drop to €12.40 ($17.90) a metric ton Monday.
Prices have been volatile throughout the ETS's first five years. Permits had reached a high of €30 ($43) in summer 2008 before dropping to €8 ($12) earlier this year.
On the other side of the Atlantic, the U.S. House of Representatives passed a climate change bill in June based on creating a similar cap-and-trade system.
Like the ETS, the proposed U.S. system would limit industries' emissions and eventually force companies to pay for allowances to offset their emissions — or allow them to sell excess allowances if their emissions are lower than expected.
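The buy-or-sell mechanics described here reduce to simple arithmetic. The sketch below is an illustrative toy model, not a description of either system's actual compliance rules; the firm figures are hypothetical, and the €12.40 price is the ETS level quoted above:

```python
# Toy sketch of cap-and-trade settlement (illustrative only; the emission
# and allocation figures are hypothetical, the price is from the article).
PRICE_PER_TON = 12.40  # EUR, roughly the ETS allowance price quoted above

def settlement_cost(emissions_tons: float, allocated_tons: float,
                    price: float = PRICE_PER_TON) -> float:
    """Positive = firm must buy allowances; negative = revenue from selling excess."""
    return (emissions_tons - allocated_tons) * price

# A firm emitting 10,000 t against a 9,000 t allocation buys 1,000 allowances.
assert abs(settlement_cost(10_000, 9_000) - 12_400) < 1e-6
# A firm emitting 8,500 t can sell its 500 spare allowances.
assert abs(settlement_cost(8_500, 9_000) + 6_200) < 1e-6
```

The toy model makes the pricing problem discussed below concrete: the lower the allowance price, the smaller both the penalty for over-emitting and the reward for cutting emissions.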
The fact that allowance prices are so low in the cap-and-trade systems that already exist may trouble proponents who are counting on such systems to mitigate climate change.
"Price volatility is a problem in and of itself because it's not sending a clear market signal" to investors, says Daphne Wysham, a fellow at the Institute for Policy Studies and co-director of their Sustainable Energy and Economy Network.
She also points to offsets, such as the Certified Emissions Reduction credits created by the Kyoto Protocol's Clean Development Mechanism to pay for such projects as keeping forests unlogged, as "a drag on price because they're the cheapest option out there" for meeting the emissions targets on paper.
"You're essentially keeping the price low by allowing for carbon offsets," she says.
Other factors are at play as well.
The price of carbon, as with that of any commodity, is seen as a function of demand.
The current dip in prices, then, is a reflection of the low expectations for carbon dioxide regulation following the disappointing summit in Denmark and the stalling of climate legislation in the U.S. Senate.
It is also a product of the slow economy, as less economic activity has meant lower emissions, and thus a lower volume for buying emissions permits.
The EU system and any future U.S. system are supposed to encourage a movement away from dependence on high-emitting, fossil-fuel based production by making those activities more expensive relative to cleaner alternatives. It is all about motivation.
But for businesses to be sufficiently motivated to reduce their emissions they need a price incentive.
A carbon price of around 13 euros per ton is not expected to provide that push, and many analysts believe that, ultimately, prices will need to be much higher for cap and trade in carbon to achieve its ends. The International Energy Agency's World Energy Outlook for 2009 said carbon prices should reach $50 (€35) a ton in 2020 and $110 (€67) in 2030 to provide the incentives that will push companies to invest in cleaner technologies.
The non-binding international accord announced by the U.S., China, India, Brazil and South Africa and "taken note of" by the other countries is not likely to raise demand enough to push up that price.
Meanwhile, the Regional Greenhouse Gas Initiative, which caps emissions in 10 northeastern U.S. states, is creating a fragmented market for firms, and one that is very dependent on political decree — a major factor in the price volatility seen so far.
Prices in RGGI dropped to around $2 in the most recent quarterly auction of emission allowances, at the beginning of this month.
Analysts say supply is easily overpowering demand as that system tries to hit its stride after getting its start last September. The EU ETS also faced the early problem of having allocated too many allowances at its start.
What these low prices mean for the future of cap-and-trade systems is still a matter of debate. NASA climate scientist James Hansen has railed against a cap-and-trade system since before the lead-up to the House bill's passage.
"Because cap and trade is enforced through the selling and trading of permits, it actually perpetuates the pollution it is supposed to eliminate," Hansen says. His main issue with the system is not so much with the price level as with elements of the system like loopholes and offsets, which in turn lower the price of allowances.
Others have contended these low prices and other concessions to industry are simply unavoidable growing pains of implementing a brand new regulation.
Jill Duggan, a senior fellow with the World Resources Institute who has helped run and design emissions trading schemes in the UK, EU and U.S., has questioned whether the over-allocation that plagued the early years of the ETS and can be seen in RGGI is really over-allocation.
Perhaps the EU just "underestimated how cheap and easy it would be for companies to reduce their emissions," she suggests.
"Companies, once they started to implement greenhouse gas reduction measures, were quite effective at cutting back on emissions, and needed fewer allowances than predicted," Duggan wrote in November. She also says the price of allowances has been no more volatile than other energy commodities in the past year.
The Future
As for the future of cap and trade, the currently low prices are not expected to, themselves, slow the momentum toward a cap-and-trade system in the U.S. or elsewhere.
"A more 'realistic' market will soon come into play as early allowances begin to expire and the cap begins to lower," says Michael Clingan, who has researched the business side of climate regulation as a partner at Ascendant Consulting.
The ETS prices are low right now, he says, due to the "overly generous initial allowance" that "expedited the political process needed to launch cap and trade."
He says this both delayed emissions reductions — and stretched the patience of those hoping to see reductions — and allowed emitters and markets time "to develop the machinery" needed to operate under a cap.
Gregory Casas, an attorney who focuses on energy and natural resources law at Greenberg Traurig and who has written and lectured on carbon markets, expects emissions to grow as the world economy pulls out of the recession.
But he sees this increase bringing, paradoxically, an eventual decrease, as "this will lead to a renewed call for cuts in emission allowances, which will in turn bolster cap and trade programs and an increase in the price of allowances."
The biggest unanswered question remains what the U.S. Congress will do. Several conservative Democratic senators have told the White House and congressional leaders that legislation to cap greenhouse gas emissions would be too contentious for them to support when a climate bill comes up for a vote next year.
Others, like Hansen, have critiqued cap and trade for not being radical enough in its approach to saving the planet from the worst effects of climate change.
Many of them have advocated a direct carbon tax on emissions rather than a market-based trading system.
A carbon tax might not face the same difficulties in terms of price level, but it would have its own volatility problem: being more easily repealed than an established carbon emissions market, it would be more subject to the political whims of the moment — if it made it into law at all.
Paul Krugman, a Nobel laureate in economics, says a tax and a cap would have equivalent effects. The difference, he says, is "we have a real chance of getting a serious cap-and-trade program in place within a year or two. We have no chance of getting a carbon tax for the foreseeable future."
"My thoughts are that cap and trade will survive internationally and will be put into place here in the U.S.," says Casas, though he is less sure about the time frame for that U.S. legislation.
But for cap and trade to achieve its ends — that is, for emitting greenhouse gases to become more costly for the emitters in the short run — carbon offsets and derivatives based on the carbon markets need to be eliminated, says Wysham.
"But all that is implicit in any cap and trade system … the entire approach and the entire architecture of that approach is flawed," she says.
Duggan is more optimistic, at least in terms of the effect of low carbon prices.
"Companies do not need to know what the carbon price will be in 2020 (just as they do not know what the price of oil or coal will be in 2020)," she says. "They do need to know that there will be a carbon price in 2020, and in Europe at least, they know that the ETS is here to stay."
Environment in Jordan 2009: A Year in Review
From an environmental perspective, 2009 can best be described as the year when Jordan finally opened its eyes to the threat of climate change.
After years of neglect, denial and a lack of data, Jordan has realized, through a concerted effort backed by global momentum, that climate change is a clear and imminent danger to the environment and development in the country.
For many years, Jordan neglected the issue of climate change for three major reasons. First, Jordan faces many acute socio-economic and environmental problems that need to be rapidly addressed, which pushed the long-term impacts of climate change down the list of priorities. Second, Jordan lacked the political will to address climate change seriously and was mainly content with echoing the "denialist" approach of many other Arab countries, mainly oil-producing ones that wanted to marginalize climate change from the regional environmental agenda to protect the global demand for oil.
Third, Jordan lacked the scientific basis for documenting the actual and projected impacts of climate change, and the absence of well-researched data left a wide knowledge gap.
All three barriers were removed this year. Jordan finally published updated data on the current and projected impacts of climate change in its Second National Communication Report, submitted to the secretariat of the climate change convention. The report covers the sources of greenhouse gas emissions, potential mitigation measures, scenarios for the next 40 years and potential adaptation measures in the areas of water, agriculture, health and socio-economic dimensions. According to the report, Jordan is expected to witness a 1-2°C increase in temperatures by 2030-2050, resulting in diminished aquifers and surface water bodies, reduced vegetation cover, and the transformation of semi-arid lands, some 80 per cent of the country's total area, into arid deserts.
The political momentum on climate change kept building, with the Ministry of Environment playing an active role at both the national and regional levels to incorporate climate change into development planning. A series of events, workshops and campaigns helped raise the profile of climate change, especially as donors were actively involved in the process. The build-up to the Copenhagen summit increased public awareness, but the process received its greatest boost from the highest levels of political decision-making, culminating in a historic speech by HM King Abdullah II, delivered by HRH Prince Hamza at the summit, that laid out a roadmap for environmental planning and addressing climate change in Jordan.
This year also witnessed an important breakthrough in reducing carbon emissions from the transport sector: the introduction of hybrid cars in Jordan.
According to figures by the Jordan Customs Department (JCD), a total of 4,796 hybrid cars have been cleared from the free zone since the beginning of 2009, with 859, or 18 per cent of the total number, entering the market between December 1 and 21.
The rise was supported by the government's decision to abolish customs fees on hybrid cars for environmental reasons. Debate has arisen, however, over the "environmental value" of introducing big hybrid cars with engines larger than 2,000 cc.
Former Minister of Environment Khalid Irani has raised concerns that demand for luxury hybrid cars does not serve the environmental purpose, but the Ministry of Finance has taken no concrete action to introduce new levies on big-engine cars. For consumers, the cars can be both environmentally friendly and luxurious!
Jordan has started to reap the benefits of joining the Kyoto Protocol's Clean Development Mechanism (CDM). The country has received 1.5 million euros from selling carbon credits generated by the Aqaba Thermal Power Station, the first Jordanian venture registered under the CDM.
The plant is expected to generate 23 million euros over the next five years. In the same context, Jordan has approved five CDM programmes expected to cut around 3.5 million tonnes of carbon dioxide annually and generate 100 million euros over the next five years.
In biodiversity and ecosystem conservation, a decision was made to declare the Yarmouk River area Jordan's eighth national reserve.
The Yarmouk River nature reserve is rich in flora and fauna.
It is home to 59 plant species and 20 mammal species, some of which are endangered, as well as 58 species of birds. The nature reserve, to be managed by the Royal Society for the Conservation of Nature, will preserve trees and rare plants, as well as animals, particularly predators in danger of extinction both locally and regionally.
Moreover, Wadi Rum reserve is currently in the process of being inscribed as a natural heritage site by UNESCO.
The long-awaited Badia Restoration Programme, based on the environmental claims arising from the 1990-1991 Gulf War, has started operation with six baseline-assessment projects that will be followed by practical restoration of rangelands in specific areas of the Badia.
The crucial project for the integrated ecosystem management of the Jordan Valley has suffered delays, caused mainly by the Jordan Valley Authority's plans for the Red-Dead canal.
The project should create four additional national reserves in the Jordan Valley area.
Jordan has created a Green Building Council that has been registered as a non-governmental environmental organization.
The council will support the introduction and promotion of green building concepts in Jordan for conservation of water and energy resources in buildings. It is expected that this council will provide much needed guidelines for sustainable urbanization in Jordan.
This year has also witnessed the adoption of two important pieces of environmental legislation.
The environmental enforcement bylaw governs the mechanism by which the Ministry of Environment and the Royal Management for Environmental Protection (formerly known as the environmental police) will monitor environmental violations by industries and other development facilities, and how violators will be required to carry out remediation measures to stop pollution.
Another bylaw, establishing an environmental fund in Jordan, defines the principles and guidelines for providing financial incentives for environmental protection projects and initiatives, especially by small- and medium-scale industries.
At the borderline between water and environment, the environmental impact assessment studies for the Red-Dead canal are still underway, and still do not provide answers to the dispute over the environmental impacts of this large project.
The Disi water project will finally begin construction amid a controversy over the high concentration of radium isotopes found in a few wells by a team of Jordanian, Israeli and American scientists in February.
At the policy level, the new water strategy 2008-2022 was adopted this year. The document, entitled "Water for Life", outlines the country's vision for a strategic and integrated approach to the sustainable management of water resources, and reflects the adoption of water demand management strategies for:
greater understanding and more effective management of groundwater and surface water; sustainable use of water resources; fair, affordable and cost-reflective water charges; and adaptation to increased population growth and economic development across the water sector and water users.
On the ground, the conditions of the environmental hotspots in Jordan have not improved substantially.
The Zarqa River Basin is still subject to heavy pollution from domestic and industrial sources despite efforts from the Ministry of Environment and other agencies, while the Ekieder landfill in Irbid has benefited from the construction of an industrial waste treatment plant that will reduce the pressure on the landfill.
Water and wastewater infrastructure in Zarqa will undergo a huge improvement due to an influx of financial and technical support from the Millennium Challenge Corporation (MCC), as four contracts have been issued for the rehabilitation of sewage and domestic water networks, the further expansion of the Khirbet As Samra wastewater treatment plant and the rehabilitation of water wells in Zarqa.
There are opportunities that are still being missed.
Without the adoption of the ambitious renewable energy law, which spent four months in Parliament, investments in renewable energy (solar and wind) have not materialized at the appropriate level.
According to the National Energy Strategy, Jordan needs to increase the share of renewable energy in the primary energy mix from 1 per cent now to 10 per cent by 2020, which will require massive investments in solar and wind energy.
The draft law provides many tax and management incentives for investors but is still not adopted.
It is becoming an obvious necessity that this particular law should be enacted, even as a temporary law.
The opportunities for recycling and reuse of wastes have not yet been realized, due to the lack of an incentive-based legal and policy framework that supports reuse practices and makes such initiatives a mainstream economic activity.
Jordan badly needs a revolutionary approach to waste management, one that introduces the concept of waste reuse and provides networks for waste separation and collection, so that wastes can be incorporated as raw materials and production resources in recycling industries.
Generally speaking, the year 2009 can be considered another slow step forward in enhancing environmental management in Jordan.
The process has been slow but consistent over the past decade, and it is hoped that the impacts of the financial crisis and the reduction of public capital expenditure will not result in lower financial allocations for environmental initiatives that might derail the positive developments witnessed in the past few years.
Copenhagen: Triumph or Failure?
It's been over a week since the Copenhagen climate talks ended.
Most bloggers and pundits have taken a couple days off from 24-7 prognostication, so it feels safe for me to venture into the fold and submit my humble take on what happened at COP15, and what it means for the global climate movement.
First, I'd like to look back at the last couple years of the climate movement leading up to the Copenhagen talks.
The movement comes of age
Let's step into our time machine for a moment and take a trip back to Bali, Indonesia, where COP13 was held in December 2007.
Still logjammed by the Bush administration's oil-slicked representatives, delegates from other countries agreed to the Bali Roadmap, basically an agreement to keep moving towards an agreement, with vague topical goalposts along the way:
technology transfer, forests, financing, adaptation and, of course, the elusive carbon cuts.
Even with the US delegation blocking progress at every level, there was a sense of momentum, and with US elections coming up, a flickering light shone at the end of the tunnel.
In 2007, the climate movement also came of age.
All over the world, climate activists — in particular young people — realized that nobody would save the world from catastrophic climate change for them.
They would have to do it themselves. In Australia, India, the US, all over Africa, in Mexico, China, Europe and the Middle East, activists began forming coalitions of social justice, environmental, faith and other progressive groups to take on the fossil fuel interests and intransigent elected officials.
While many of these groups existed before 2007, a combination of innovative public campaigning, shocking scientific reports and natural disasters vaulted climate change into the center of the political arena.
In a sense, that year was when the global climate movement came of age – two major organizing pushes in the US (Step It Up 2007 and Powershift 07) inspired a rash of similar events around the world over the next two years. As the Bali negotiations came to a close, the movement learned about a new number that would set a goalpost for years to come:
350 — the safe level of carbon in the atmosphere in parts per million.
While the results of the Bali negotiations proved to be nothing more than a band-aid on the international climate negotiation process, the two weeks spent there helped activists forge relationships and ideas that would serve us well throughout the next two years.
Making our voices heard
Now cut back to 2009.
The movement, having weathered yet another year of climate inaction on the part of most major countries, but buoyed by the election of Barack Obama in the US, undertook the largest, most widespread collective campaign effort ever initiated in social movement history.
Globally, thousands of climate activist groups began to build constituencies, impact national elections significantly, target specific elected officials, take down fossil fuel interests, and come up with new and innovative ways of getting the word out.
There are too many efforts for me to list here, but suffice it to say that in every corner of the planet, from the smallest rural village to the largest metropolis, concerned citizens were successful in pushing for local change and linking it to a global movement, building political power from the ground up everywhere on the planet.
The 350.org campaign aggregated some of the most amazing work worldwide, raising those voices — many from the world's most vulnerable countries — to make sure that world leaders would notice.
And notice they did.
China and India, long considered to be inexorably tied to high-carbon growth, came forward with not insignificant proposals to reduce carbon emissions while still ensuring their citizens comfortable lives. The US House passed that country's first ever climate bill (albeit a very weak one). In Africa and small islands all over the world, activists helped educate and support Environment Ministers and delegates on the need for bold action on climate change.
For the first time, we had a truly global movement, and we influenced public policy all over the planet.
In just a few years, we had managed to build a movement that had a voice at the international table.
Copenhagen – triumph or failure?
After two years of light-speed movement building all over the world, towards the end of 2009 all eyes were on Copenhagen.
US legislators hurried to pass a(ny) climate legislation, using the now irrelevant excuse "we have to get something before Copenhagen" to weaken a cap and trade bill.
It didn't even come close to being passed in 2009, overshadowed by the circus-like debate over healthcare in that country.
China, India, Europe, Maldives, South Korea, Mexico, South Africa and a whole host of other countries submitted their own climate laws.
On the eve of Copenhagen, with more than 140 heads of state scheduled to arrive, it became clear that the world's largest carbon emitters were not ready to commit to significant reductions. Perhaps some of us were blinded by the flashy ads everywhere telling us to refer to the city as Hopenhagen rather than Copenhagen — but it became abundantly clear in the few weeks before COP15 that no significant deal would be made.
It should have been no surprise that at the end of a marathon negotiation session on the last night of the summit, the world would be presented with a half-baked, vague "accord" that a handful of countries wouldn't sign.
There have been various attempts at finger-pointing for the failure to make significant progress:
Obama is a favorite target, China has been accused of political plotting, and even Denmark didn't escape the witch-hunt.
It's abundantly clear that none of the world's leaders, save those from the smallest and most vulnerable countries, were able to muster the leadership needed to move the world into a prosperous new decade.
That alone is a stark failure of leadership, but it is not a reflection so much of the climate movement as it is the result of the influence of entrenched fossil fuel interests.
Politics is often binary; even if significant people power comes up against entrenched power, it might amount to 0 if the entrenched interests win.
We should recognize and understand that COP15 was a complete failure in terms of saving the world from climate catastrophe.
It makes no sense to call the Accord a step forward, when the only thing it does is to ensure that the broken UNFCCC process continues. Physics and Chemistry don't care that the UNFCCC process will continue into 2010 and beyond.
In the morbid math of climate change, every month our leaders spend debating, instead of reducing emissions, is a failure because it means millions more people will die of climate-related phenomena.
Despite the 0 that our leaders handed over, Copenhagen was a triumph for our movement.
It provided a focal moment, a frame through which we could explain to each other and to the global public the clear moral argument for taking action on climate change.
Most polling data shows that we were incredibly successful in doing so, but a better measure of how large and powerful the movement has become is how much we shaped the narrative of the negotiations.
From the opening days of COP15, when African delegates spoke vociferously about the need for hundreds of billions of dollars in clean energy funding for developing countries and bold targets that put us on the path to 350 parts per million CO2, right up until the last moments of tense debate, our power was at the negotiating table, propped up by our champions.
While small island state leaders and African delegates may have been forced to sign the Accord, in the end, it was our hard work that enabled President Nasheed of the Maldives to approach President Obama and demand stronger targets, or Lumumba Di-Aping of the Sudan to call out his colleagues in the G77 who would sign onto a weak deal if the price was right.
It was with our language and with the support of the global climate movement that a handful of delegates from vulnerable countries refused to sign the Copenhagen Accord.
It is our movement that fought off greenwash in the media worldwide, and mobilized citizens to redouble their efforts, even after a long year of sustained public pressure.
Even though we ended up with a 0 from our leaders, our movement significantly altered the narrative in Copenhagen.
What comes next?
"The arc of history is long, but it bends towards justice."
- Martin Luther King Jr.
Too often we look at the climate movement as a singular event in history.
True, there has never before been such a pressing global emergency that implicates everybody on the planet.
However, the rich history of social movements teaches us that a war is not won in a day (or even a couple weeks), and a movement takes time to build.
The paradox of the climate movement, of course, is that we have no time to waste.
Below are a few ideas I've been mulling over the past couple weeks – call them general recommendations for moving forward:
* Build robust international networks
2009 saw some of the most incredible international coalitions form — the World Council of Churches, Rising Tide direct action groups and everything in-between joined together to fight for bold and fair action on climate change.
For the first time, climate action groups built coordinated and strategic bottom-up pressure on world leaders everywhere. 2010 should be the year when the movement refocuses on national fights, while continuing to coordinate strategically.
In particular, we should focus pressure on the "C5," the five coal-dependent economies:
USA, EU, Australia, South Africa, China. A simple political analysis shows that by building strategic constituencies all over the world, we can change some positions and bolster others.
Lest we forget, movement-building does not just happen on its own.
At the same time as applying strategic pressure, we should encourage open-source, innovative campaigns, in particular those that empower civil society in developing countries to continue effecting change at home.
* Highlight the solutions
Everywhere in the world, entrepreneurs are making incredible strides towards cheap clean energy and efficiency.
India, China, Brazil and other countries are at the forefront.
Our movement must highlight and support these efforts, and work towards ramping up a massive subsidy shift from fossil fuels to clean technology.
Numerous thought-leaders have already begun to refer to the "Race China" paradigm, where the US and other countries would invest in WWII-style programs to jump-start investment and ramp up implementation.
2010 should be the year the international movement joins with entrepreneurs to show that clean energy and efficiency technologies exist and are economically feasible, but will require a shift in government subsidies and massive investment worldwide.
* Take down opponents of climate action
If the US Senate is any indication, even our so-called "champions" are far from what the science says is necessary.
While we should support our champions at the national and international levels, we should stop coddling fence-sitters, and focus on getting vulnerable climate deniers out of office.
In countries like Australia, the UK and Germany, activists have succeeded in making climate a voting issue.
We must laser-focus on the proponents of climate inaction, and work collaboratively to hit them where it hurts — the ballot box.
We should use our new media savvy to help shape public opinion during election campaigns.
* Make coal the enemy of prosperity
Coal is the dirtiest fuel on the planet, and everybody in the world should know that simple fact.
In the US, the Sierra Club, along with numerous state and local groups, has erased more than 90 new coal-fired power plants from the list of 170 proposed facilities. In the UK, Australia and India, activists have successfully shut down coal supply lines and power plants through direct action and grassroots protest.
In 2010, we must make it clear to the world that coal stands in the way of prosperity.
"After all," said President Nasheed of the Maldives in Copenhagen, "It is not carbon we want, but development.
It is not coal we want, but electricity."
Already, discussions about internationally coordinated anti-coal work have begun.
The international climate movement would do well to set its sights on the handful of coal barons and their politician pawns who have taken the world hostage.
* Ensure the sustainability of the movement
If the months leading up to COP15 taught us anything, it is that we will be in this fight for the long haul — because it is the fight for our lives and the human race.
We will not solve climate change this decade.
Indeed, in many places, we are already suffering its effects. In the weeks leading up to the Copenhagen climate talks, many activist groups released statements referring to COP15 as "the last bite of the apple," or "the end game."
While heightening drama around the climate talks is a useful PR tool, in the case of COP15, it was dishonest and misleading.
2010 must be a year of deepening commitment, honest reassessment and continued collaborative action.
This is no time to be distracted by internal politics, organizational ego or incrementalism.
We must find out what works, leave behind what doesn't, and provide resources and support to each other to get the job done.
Finally, 2010 must be the year when mutual trust and love ensure that moral voices from across the climate movement — from the most vulnerable countries to faith leaders, from youth activists to senior citizens — resound worldwide.
Only by forming truly loving relationships with each other, across boundaries of race, sex, class, language and religion will we build a movement strong enough to usher in a new era of clean energy prosperity.
Biofuels market to triple from $76 billion to $240 billion by 2020
In a positive outlook for the biodiesel industry, Pike Research, a new cleantech analysis firm headquartered in Boulder, Colo., has predicted robust growth over the long-term for the global renewables market.
"The study started out as an assessment of technologies in the biofuels industries, but then it grew into a more of a market analysis," said Robert McDonald, author of the report.
Growth on the supply side was linked to game changing technologies related to biofuels processing.
"In the biofuels world, feedstock is king and any technology that provides additional feedstock is a game changer to me," McDonald told Biodiesel Magazine. "I think the technology developed by Clayton McNeff [of Ever Cat Fuels in Isanti, Minn.] for making biodiesel from trap grease is one of the developments that is very exciting."
Recycled greases, however, do not have the same potential that algae or jatropha holds for the biodiesel industry over the long term, the Pike Report said.
"They're planting a lot of jatropha right now, but it will take four to five years to mature, so we're looking at 2013-2014 before it starts to make an impact," McDonald said.
Making algae oil for biodiesel production is still a long way off as well and sometimes seems unrealistic despite the intense amount of attention it has received from oil companies, government and the press. It's still five years out from being commercially viable, a timetable that causes skepticism from critics. "There will be several 18 month cycles before we start seeing biodiesel made from algae [at the commercial scale]," McDonald said.
"But there have been too many breakthroughs and too much investment, so it will happen."
In terms of methodology used to distinguish viable new technologies from the hype associated with renewables, McDonald said he looked for corroboration.
For instance, Aurora Biofuels in Florida announced it had found a way to harvest algae oil using the same methods as waste-water treatment plants. Then a few weeks later the research arm of the Australian government made a similar announcement.
"That's what we are looking for," McDonald said.
"When legitimate organizations make similar discoveries independently that seem to corroborate each other, I think it gives credence to the commercial development and growth of the technology."
While biodiesel growth on the supply side is seen as related to feedstock expansion from new processing capabilities, demand acceleration is linked to government mandates, McDonald said.
"Most of the major regional markets already have mandates for use and I think we will see more issued in the near future."
The Pike Report said that the biofuels market has the potential to triple within the next decade.
Healthcare fight makes passing energy and climate bill tougher
Sen. Lindsey Graham (R-S.C.) said Sunday that Republican anger over healthcare legislation makes it tougher to pass the energy and climate-change bill that he is working with Democrats to craft.
"I want to work with this administration, but this healthcare proposal has made it very hard for Republicans to sit down at the table with these guys, because of the way they have run over us. But at the end of the day we have more problems than just healthcare," Graham said on CNN's "State of the Union."
"I want to help solve hard problems, but this healthcare bill has made a hard problem worse," he added.
Graham has split with the bulk of his caucus to work with Sen. John Kerry (D-Mass.) and White House officials on a compromise global warming and energy bill that can reach 60 Senate votes.
Kerry and Sen. Joe Lieberman (I-Conn.) are planning legislation that would blend mandatory nationwide greenhouse gas emissions reductions with wider U.S. oil-and-gas drilling and expanded federal financing to build new nuclear power plants, among other measures.
President Barack Obama on Friday helped broker a limited international climate-change agreement at United Nations talks in Copenhagen.
Supporters of slow-moving climate legislation in the U.S. hope that pledges by nations including China and India to slow their emissions of heat-trapping gases will help propel the bill in the Senate.
Graham said his priority in working on the bill is to reduce U.S. reliance on oil imports. "When [Venezuelan President] Hugo Chavez got a standing ovation in Copenhagen it made me sick to my stomach, but the only way he is relevant is because of the oil revenues," he said.
Venezuela is a major oil producer.
On the same program, White House Senior Advisor David Axelrod defended the nonbinding agreement Obama reached after a frenzied day of ad-hoc meetings with world leaders at the summit.
"Let's understand that when the president arrived the talks were collapsing and there was a very real prospect of no progress out of Copenhagen," he said.
The agreement included compromise language that allows for outside analysis of nations' implementation of their pledged emissions reductions.
How to address "transparency" was a major sticking point at the fractious two-week Copenhagen talks because China, the world's largest emitter of greenhouse gases, resisted calls for external verification of its actions.
Graham called the accord limited progress. "I think in many ways it is going to be seen as ineffective, but it is some transparency that we don't have today," he said.
Under the overall accord, countries will commit to implementing their national emissions-cutting plans. It sets a global goal of keeping temperature increases below 2 degrees Celsius, the level that many scientists say is needed to prevent catastrophic and irreversible climatic changes.
Obama acknowledged Friday the accord would not bring about the needed reductions, but called it a major breakthrough that paves the way for further action.
But Sen. John McCain (R-Ariz.) on Sunday criticized the outcome in Copenhagen. "I think that the fact it has no binding provisions to it whatsoever is a rhetorical attempt to cover up what was obviously a serious failure," he said on "Fox News Sunday."
McCain in years past has called for limits on U.S. greenhouse gases and sponsored an early version of "cap-and-trade" plans with Lieberman.
But he has been sharply critical of current Democratic climate proposals.
Majority Leader Harry Reid (D-Nev.) hopes to bring a climate and energy package to the floor in the spring.
The House approved a sweeping bill in late June.
Kerry and others say China's pledge to slow its emissions and endorse the Copenhagen accord should help ease concerns that U.S. legislation would hand a competitive advantage to manufacturers overseas.
Majority Whip Richard Durbin (D-Ill.) on Sunday said he was hopeful that Democrats would be able to pass a bill in 2010.
"We're going to move forward on it. I hope we can get it done this coming year," he said on ABC's "This Week."
He called climate and energy legislation a way to provide U.S. jobs in green-energy industries.
As Polar Ice Turns to Water, Dreams of Treasure Abound
Still, the newest study of the Arctic ice cap - finding that it faded this summer to its smallest size ever recorded - is beginning to make Mr. Broe look like a visionary for buying this derelict Hudson Bay port from the Canadian government in 1997.
Especially at the price he paid: about $7.
By Mr. Broe's calculations, Churchill could bring in as much as $100 million a year as a port on Arctic shipping lanes shorter by thousands of miles than routes to the south, and traffic would only increase as the retreat of ice in the region clears the way for a longer shipping season.
With major companies and nations large and small adopting similar logic, the Arctic is undergoing nothing less than a great rush for virgin territory and natural resources worth hundreds of billions of dollars. Even before the polar ice began shrinking more each summer, countries were pushing into the frigid Barents Sea, lured by undersea oil and gas fields and emboldened by advances in technology.
But now, as thinning ice stands to simplify construction of drilling rigs, exploration is likely to move even farther north.
Last year, scientists found tantalizing hints of oil in seabed samples just 200 miles from the North Pole.
All told, one quarter of the world's undiscovered oil and gas resources lies in the Arctic, according to the United States Geological Survey.
The polar thaw is also starting to unlock other treasures:
lucrative shipping routes, perhaps even the storied Northwest Passage; new cruise ship destinations; and important commercial fisheries.
"It's the positive side of global warming, if there is a positive side," said Ron Lemieux, the transportation minister of Manitoba, whose provincial government is investing millions in Churchill.
If the melting continues, as many Arctic experts expect, the mass of floating ice that has crowned the planet for millions of years may largely disappear for entire summers this century.
Instead of the white wilderness that killed explorers and defeated navigators for centuries, the world would have a blue pole on top, a seasonally open sea nearly five times the size of the Mediterranean.
But if the Arctic is no longer a frozen backyard, the fences matter. For now it is not clear where those fences are.
Under a treaty called the United Nations Convention on the Law of the Sea, territory is determined by how far a nation's continental shelf extends into the sea.
Under the treaty, countries have limited time after ratifying it to map the sea floor and make claims.
In 2001, Russia made the first move, staking out virtually half the Arctic Ocean, including the North Pole.
But after challenges by other nations, including the United States, Russia sought to bolster its claim by sending a research ship north to gather more geographical data.
On Aug. 29, it reached the pole without the help of an icebreaker - the first ship ever to do so.
The United States, an Arctic nation itself because of Alaska, could also try to expand its territory.
But several senators who oppose any possible infringement on American sovereignty have repeatedly blocked ratification of the treaty.
Indeed, not everyone agrees that warming of the Arctic merits concern.
No one knows what share of the recent thawing can be attributed to natural cycles and how much to heat-trapping pollution linked to recent global warming, and some scientists and government officials, particularly in Russia, are dismissive of assertions that a permanent change is at hand.
"We are not going to have apple trees growing in Vorkuta," said the mayor of that coal-mining city, Igor L. Shpektor, who is also the president of Russia's union of Arctic cities and towns.
But the current thaw is already real enough for the four million people who live in the Arctic, including about 150,000 Inuit.
"As long as it's ice," said Sheila Watt-Cloutier, leader of a transnational Inuit group, "nobody cares except us, because we hunt and fish and travel on that ice.
However, the minute it starts to thaw and becomes water, then the whole world is interested."
Increasingly, big corporations, the eight countries with Arctic footholds and other nations farther south are betting on the possibility of a great transformation.
Energy-hungry China has set up a research station on the Norwegian island of Spitsbergen and twice deployed its icebreaker Snow Dragon, which normally works in Antarctica, to northern waters to conduct climate research.
Interest in Arctic-hardy vessels has picked up so much that in January, Aker Finnyards, a giant shipbuilder based in Helsinki, created a subsidiary just to develop ice-hardened ships. Its new double-ended tanker slips smoothly through open water bow first but can spin around and use an icebreaker-like stern to smash through heavy floes. A Finnish energy company bought two for about $90 million apiece, and after buying one, Russia licensed the design and is building two more.
In January, the State Department's Bureau of Intelligence and Research held a closed two-day meeting to hear from experts on the implications of a warming, opening Arctic.
"There are likely to be a number of foreign-policy issues that must be addressed by the United States and other nations" if the climate trends persist, said a summary of the meeting.
"These issues include the availability and potential for exploitation of energy, fisheries and other resources; access to new sea routes; new claims under the Law of the Sea; national security; and others."
A look at a map of the globe with the North Pole at its center explains why a new frontier matters. Some countries that one might think of as being half a world apart appear as startlingly close neighbors, and relatively speaking, they are.
In the days of empire, Rudyard Kipling called jockeying among world powers in Central Asia the Great Game.
Christopher Weafer, an energy analyst with Alfa Bank in Moscow, says this new Arctic rush is "the Great Game in a cold climate."
The Petroleum Rush
To understand the practical terms of this new competition for territory, opportunity and resources, a good place to begin is Hammerfest, Norway, one of the northernmost towns in the world and one of 12 Arctic settlements visited over six months by correspondents of The New York Times preparing this series of articles.
Hammerfest, once an austerely beautiful fishing village burned to the ground by the Nazis in World War II, is starting to swell with young people from other parts of Norway, Finland, Russia and Asia, as well as with highly trained technical workers from Europe and North America.
They are drawn by Snohvit (in English, Snow White), a mammoth complex being built to receive natural gas piped from the Barents Sea and liquefy the gas for shipping.
The Norwegian government, which controls Snohvit in part through its majority ownership of the energy company Statoil, is desperate for Snohvit to be a success and put the country in the forefront of Arctic energy exploration.
Being first, however, has had its challenges in the severe operating environment of the High North, as Arctic areas are called in Norway.
Overruns have put the price of Snohvit at $8.8 billion, almost 50 percent above its original estimate.
The project has a firm backer in John Doyle Ong, the blunt United States ambassador in Oslo. Snohvit is scheduled to start sending liquefied natural gas to the Cove Point port in Maryland in 2007, just as American imports of liquefied gas from competing sources in the Middle East and Africa are set to rise rapidly.
Importing natural gas from a stable country like Norway - already the world's third-largest oil exporter, after Saudi Arabia and Russia - is a rare option these days.
"Norway's importance to the United States in terms of our national energy policy is increasing with every passing year,"
But the United States' interests go beyond that - too far beyond for many in Norway.
In September, the opening of frontier areas in the Barents and Norwegian Seas emerged as a central issue in elections that brought a leftist coalition to power, with some coalition members favoring a ban on Arctic oil and gas exploration in environmentally sensitive areas.
And besides supporting Snohvit, Mr. Ong, a former energy executive, has stepped into disputes between Norway and Russia over a large gray zone in the Barents. His insistence that Arctic-related matters be "trilateral" rather than bilateral is viewed as belligerent by some Norwegians.
In private, Norwegian officials welcome the heft of the United States in its negotiations with Russia.
Norway is eager to resolve the territorial dispute so that some order, and Norwegian drilling expertise and environmental standards, can be imposed on Arctic exploration.
Because as large as Snohvit is, it is dwarfed by a far bigger gas field to the east in Russian waters. That field, called Shtokman, is being developed by Gazprom, Russia's gas behemoth.
In September, Gazprom selected five companies - Statoil and Norsk Hydro from Norway, Total from France and Chevron and ConocoPhillips - as finalists in a search for partners to develop Shtokman, in the Barents Sea, 350 miles north of Russia's Kola Peninsula.
The development costs are estimated at $15 billion to $20 billion.
The field is reported to hold more than double all of Canada's gas reserves.
"They're going to find more of them," Mr. Weafer, the Moscow-based energy analyst, said of Arctic gas deposits. "It's the next energy frontier."
And while natural gas is certainly valued, the prize that is generating the biggest interest is oil.
Virtually every large international energy company is studying how eventually to win permission from Norway and Russia to explore in the Barents, and the Norwegian Polar Institute has been contacted repeatedly by oil companies to explore the feasibility of drilling in the icier waters north of Spitsbergen.
Jan-Gunnar Winther, director of the institute, said the seasonal melting of the polar cap might allow access to more petroleum deposits but also create more challenges.
"A warmer climate in the north would mean more icebergs, rather than less," he said.
"There will be obstacles in getting to the petroleum, but if oil prices stay high there will be enticements as well."
A push into the Barents Sea could help redraw the politics of energy allegiances, and gas in particular puts Russia in a strong position.
"It has a good chance of becoming a more effective counterbalance to OPEC," Mr. Weafer said.
As for Norway, the warming world gives it the chance to seek influence far beyond its size.
Energy-hungry countries that might have written off the Arctic not long ago are showing considerable interest in Norway's opening of the Barents; one visitor to Oslo in September was India's oil minister, seeking a role in exploration.
And if a route farther north opens just four or five months of the year, Norway could even become a major supplier of oil and gas to China, said Sverre Lodgaard, director of the Norwegian Institute for International Affairs.
Norway is trying to position itself as "a dwarf among giants," Mayor Alf Jakobsen of Hammerfest said.
"We're attracting young people to Hammerfest instead of sending them away, for the first time in years. The opportunity to become a springboard into the Arctic is upon us."
Fisheries Head North
Charlie Lean easily recalls when he realized that big changes were sweeping the fish stocks along the northern shores of Alaska.
Just over 10 years ago, when Mr. Lean was the state's fisheries manager for the northwest region, a call came in from the tiny Eskimo outpost of Kivalina, on the Chukchi Sea 150 miles northeast of the Bering Strait. A village elder was reporting "a massive fish kill" in the Wulik River, Mr. Lean said.
Everyone assumed it was from some toxic spill upriver at the giant Red Dog zinc mine.
Weather vs. Climate
The temperature doesn't tell you much about climate change
Seattle and the Pacific Northwest are frying under a heat wave this summer. In New York, it's so cool that the New York Times has called it "the summer that isn't."
And Texas is suffering under the most severe drought since the 1950s.
What does this all mean for climate change?
Absolutely nothing.
Every time we write about climate change, someone writes in saying that they are shocked that Smithsonian would perpetuate such a myth.
Don't we know about the record cold/snow/rain/etc. in Minnesota/North Carolina/Utah/etc.?
Obviously, there are some people who do not understand the difference between weather and climate.
Let's start with the dictionary definitions:
Weather:
the state of the atmosphere with respect to wind, temperature, cloudiness, moisture, pressure, etc.
Climate:
the composite or generally prevailing weather conditions of a region, as temperature, air pressure, humidity, precipitation, sunshine, cloudiness, and winds, throughout the year, averaged over a series of years.
In short, weather is a data point.
Climate is a collection of data.
You can think of it like the economy. I can tell you that the Dow is up 112.61 as I write this, at 9,284.22.
This is the weather (partly sunny, 84 F). But it doesn't tell you anything useful about the economy as a whole (just as weather conditions don't tell you anything useful about climate). A graph of the Dow over the last year, showing a terrifying decline followed by a steady rise, begins to tell the story of the last year. But to get a true picture of the economy, we'll need to look at lots of other bits of data, like consumer confidence, unemployment rates and durable goods orders. It's complicated, messy and hard to understand.
That's climate.
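The data-point-versus-collection distinction can be put in plain arithmetic. The temperatures below are hypothetical, chosen purely for illustration:

```python
# A toy sketch of weather vs. climate (all numbers hypothetical).
# One day's reading is "weather"; the long-run average is "climate".
today_high_f = 84.0  # a single data point: today's weather

# Thirty years of July 4 highs for the same imaginary city.
thirty_years_of_july_4_highs = [
    78, 81, 85, 79, 83, 88, 76, 82, 84, 80,
    77, 86, 83, 81, 79, 84, 90, 78, 82, 85,
    80, 83, 87, 79, 81, 84, 82, 86, 80, 83,
]

# The climatological mean is what "climate" summarizes.
climatological_mean = sum(thirty_years_of_july_4_highs) / len(thirty_years_of_july_4_highs)
```

Swapping one 84-degree day in that list for a 90-degree scorcher moves the 30-year mean by just 0.2 degrees, which is the point: a single hot (or cold) day barely registers in the average, while a sustained shift across many years does.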
Now, if you make changes to the country's economic situation, for example, by raising taxes, that is going to have some effect on the economy as a whole.
Economists will crunch the numbers and come out with predictions. They won't all be the same, but they will probably trend toward some particular end.
Adding carbon dioxide to the atmosphere is akin to raising taxes. We've changed the climate situation.
And while these climate models—which are far simpler than economic models and more certain—may not agree on the specifics, the general trend is that temperatures are going to rise.
And they have been rising.
And more than that, we can already see the effects of that rise.
Just read the magazine:
We've featured melting glaciers, melting permafrost and changes in plant and animal distributions in the Andes and, closer to home, the Northeast, to name a few.
So please don't write to us to say that we're neglecting the latest weather superlative.
We're not.
We just have our eyes on the bigger picture—climate.
EARTH MEANDERS:
Ecological Overshoot:
Climate, Inequity and Corruption
A disease is ravaging Earth as ever more people consume ever more, destroying natural ecosystems that are our shared habitat.
In a few short centuries the violent, expansionist and deeply ecologically unsustainable Western mindset has become virtually universally accepted.
The meaning of life is more, ever more of everything, at the expense of a finite biosphere.
The emptiness of such a vacuous worldview is revealed through changing climate, devastating human inequities and an irredeemably corrupt economic system.
More than just a climate crisis, humanity is facing profound over-population and injustice that are spurring dozens of inter-related ecological and social crises. Billions suffer as their basic human needs go unmet, while billions more gorge themselves. Forests, prairies, streams, rivers, estuaries, wetlands, lakes, soil, oceans, air and all the rest are all life's flesh and blood.
Humanity, Earth and kindred species have entered the late stage condition of ecological overshoot -- whereby our cumulative demands upon ecosystems exceed their life-giving capacity and cause them to collapse.
We are eating creation.
Hardly anyone is thinking or acting at the necessary scale to avert global ecological Armageddon.
Market-based solutions are pervaded by corruption and inequity.
Nothing we do is going to maintain an affluent life, as it is now for some.
Widespread economic decline will certainly accompany abrupt climate change and global ecosystem collapse; indeed, it has begun.
If existing political systems are unable to deal with the inevitable collapse of the growth machine, at the same time as pursuing rigorous environmental policy-making, then new political structures will be necessary.
A stewardship revolution that maintains life of some worthy, habitable sort is possible.
Surely in a free country whose liberty came from such means, we can talk about revolutionary violence, as Thomas Jefferson said would continue to be necessary.
"The blood of tyrants and patriots must flow to renew the soil."
What could be more glorious than fighting, and perhaps dying, for the Earth, and maybe even succeeding in saving her (and us)?
It is time for a credible revolutionary threat to protect the biosphere.
What is needed is a steady ratcheting up of pressure – protests, sit-ins, sabotage, assassinations -- giving opponents every opportunity to respond to reasoned arguments – and culminating in guerrilla warfare and whatever else is necessary to save the Earth.
If a few thousand insurrectionists can tie up the American military in Iraq, think what dedicated, highly decentralized and autonomous groups of tens of thousands of Earth insurgents could do to bring down industrial capitalism and the Earth eating growth machine.
People power protest culminating in an Earth Revolution needs to be done urgently yet thoughtfully.
Not speaking of mob rule or rioting -- that is what is coming from the status quo. We are speaking of highly disciplined, targeted protests including the possible use of violence to bring down the equipment and individuals responsible for destroying global ecosystems, and herald in a new ecologically sustainable, just and equitable way of living with the land, water and sky.
Living must become a matter of what you can give to ecosystems, and others with whom you share being, rather than only being concerned with what you can take.
Economic growth cannot continue forever if greenhouse gases are to be curbed, and the myriad of other eco-crises solved.
Efforts to cap and trade, certify, sustainably manage and otherwise reform our way out of the situation are orders of magnitude inadequate and failing.
Free markets appear inherently unable to price carbon and other externalities. It is becoming increasingly unlikely (if not impossible) that current political and business growth systems can reform in time to maintain the ecosystems necessary for life.
The looming death of Gaia and most or all being is no one's fault, or rather, it is all our faults. As many species have done previously, we have collectively overgrazed our habitat.
We simply must immediately allow traditional ecological disturbance, regeneration and succession patterns to again operate.
The industrial growth machine must be powered off and we must herald in an era of ecological stewardship and restoration.
Even while we organize and pursue revolutionary action; each of us must plant, tend and restore our Earth's natural ecosystems and permaculture gardens, and help others to do so.
Only dramatic and immediate revolutionary action to destroy the growth machine offers any hope of maintaining a livable Earth.
We must commit to stopping burning and cutting -- antiquated means to make a living -- indeed killing those that refuse to stop.
Rich people are setting themselves up to be fine in geo-engineered comfort while sacrificing the poor who no longer have free ecosystem services to sustain them.
There can be no engineering of a biosphere; indeed, thinking we can has brought us to this moment.
We must return to nature.
We must hold onto our humanity as we collapse and renew ourselves. Earth Revolution is as much about helping those that want to reconnect to Earth as it is sabotaging equipment and killing people directly responsible for ecocide.
This means sharing food and water, shelter and clothing.
But bring those responsible for ecocide to justice, utterly destroying them, their institutions and their equipment.
There must be no indiscriminate terror, but if our warnings go unheeded, targeted violence against known ecological criminals is justified and warranted.
Given the momentum of nearly seven billion seeking to be super-consumers, I do not see any other way to stop the forces of destruction other than a revolution.
There is absolutely no way current energy and other resource use -- much less expected growth in population and per capita consumption -- can be produced either from agrofuels or more drilling.
Humans have hit the biogeochemical limits of a finite planet, and each of us must seek what is enough, rather than always more.
It is well past time to be men and women of fortitude, set aside our computers and amusements, and commit our minds and bodies to stopping the destruction of being.
We must demand more courage and less corruption from ourselves and our leaders. The Arctic has already been changed forever. Soon your neighborhood, ecosystem and bioregion will be too (if you really look, almost certainly it is already). Please, as I do, take the end of human being through needless habitat destruction personally.
Part of the solution is allowing people to get back to Earth on their own plot of land.
How we live in the future will be by necessity less urban.
We will be called upon to make do with what is in our bioregion.
Let me make some further suggestions to you.
Acquire land and seeds. Make or restore an Earth friendly shelter and plant trees and permaculture forest gardens. Prepare to live in your changing bioregion.
Go back to the land.
Ecologically farm and restore as you connect with like minded Earth revolutionaries to clandestinely carry out escalating protest, sabotage and guerrilla war.
I urge you to really think about what is necessary -- both personally and in terms of social change -- to sustain being, and to commit to it.
Token managerial reforms of the antiquated ecologically damaging activities of burning and cutting are not enough.
Technology is not going to save us. Market campaigns using glamorous celebrities are not enough.
Petitioning our leaders is not going to save us. Personal efforts will only get you and Gaia so far. Only escalating protest action targeting the destroyers, their equipment and their Earth eating worldview can still avert biosphere disintegration.
Set aside your best efforts at ecological denial, acknowledge the task before us, and join with others in becoming a reluctant revolutionary.
An Earth insurgency could topple the growth machine in a day, though it may take years. The sooner the better, as more ecological remnants will exist to serve as the basis of ecological restoration.
Even as we pursue revolutionary strategies and tactics to maintain a habitable Earth, commit to remaining free and humane.
The answer is neither tyranny of the left nor right.
Above all else we must achieve global ecological sustainability through just and equitable means.
Protect and restore natural ecosystems including old forests right now.
Work with others to destroy coal, tar sands, fishing trawlers, oil palm, industrial agriculture, pipelines and ancient forest loggers. Start today.
Now continued human existence depends upon your courage, ecological wisdom and taking direct lethal action in defense of our shared ecological heritage.
Each of us and together will transition to a state of ecological grace, quickly, and through action against the Earth destroyers, or we will all die a horrific and barbaric death together as being ends.
If we choose to fight for Earth there is hope, otherwise there is none.
I share the anguish of not knowing whether revolutionary violence is the answer or not.
But it has to be considered comprehensively, thoroughly and quickly.
Prove me wrong and demonstrate how converging eco-crises can be sufficiently addressed within current economic and political systems in a couple of years' time. Revolution is almost certainly the only possible way to sustain and restore healthy ecosystems as the basis of human civilization and all life.
Be strong, slay the growth machine, for Gaia.
On issues like global warming and evolution, scientists need to speak up
The battle over the science of global warming has long been a street fight between mainstream researchers and skeptics. But never have the scientists received such a deep wound as when, in late November, a large trove of e-mails and documents stolen from the Climatic Research Unit at Britain's University of East Anglia were released onto the Web.
In the ensuing "Climategate" scandal, scientists were accused of withholding information, suppressing dissent, manipulating data and more.
But while the controversy has receded, it may have done lasting damage to science's reputation:
Last month, a Washington Post-ABC News poll found that 40 percent of Americans distrust what scientists say about the environment, a considerable increase from April 2007.
Meanwhile, public belief in the science of global warming is in decline.
The central lesson of Climategate is not that climate science is corrupt.
The leaked e-mails do nothing to disprove the scientific consensus on global warming.
Instead, the controversy highlights that in a world of blogs, cable news and talk radio, scientists are poorly equipped to communicate their knowledge and, especially, to respond when science comes under attack.
A few scientists answered the Climategate charges almost instantly.
Michael Mann of Pennsylvania State University, whose e-mails were among those made public, made a number of television and radio appearances. A blog to which Mann contributes, RealClimate.org, also launched a quick response showing that the e-mails had been taken out of context.
But they were largely alone. "I haven't had all that many other scientists helping in that effort," Mann told me recently.
This isn't a new problem.
As far back as the late 1990s, before the news cycle hit such a frenetic pace, some science officials were lamenting that scientists had never been trained in how to talk to the public and were therefore hesitant to face the media.
"For 45 years or so, we didn't suggest that it was very important," Neal Lane, a former Clinton administration science adviser and Rice University physicist, told the authors of a landmark 1997 report on the gap between scientists and journalists. ". . .
In fact, we said quite the other thing."
The scientist's job description had long been to conduct research and to teach, Lane noted; conveying findings to the public was largely left to science journalists. Unfortunately, despite a few innovations, that broad reality hasn't changed much in the past decade.
Scientific training continues to turn out researchers who speak in careful nuances and with many caveats, in a language aimed at their peers, not at the media or the public.
Many scientists can scarcely contemplate framing a simple media message for maximum impact; the very idea sounds unbecoming.
And many of them don't trust the public or the press:
According to a recent Pew study, 85 percent of U.S. scientists say it's a "major problem" that the public doesn't know much about science, and 76 percent say the same about what they see as the media's inability to distinguish between well-supported science and less-than-scientific claims. Rather than spurring greater efforts at communication, such mistrust and resignation have further motivated some scientists to avoid talking to reporters and going on television.
They no longer have that luxury.
After all, global-warming skeptics suffer no such compunctions. What's more, amid the current upheaval in the media industry, the traditional science journalists who have long sought to bridge the gap between scientists and the public are losing their jobs en masse.
As New York Times science writer Natalie Angier recently observed, her profession is "basically going out of existence."
If scientists don't take a central communications role, nobody else with the same expertise and credibility will do it for them.
Meanwhile, the task of translating science for the public is ever more difficult:
Information sources are multiplying, partisan news outlets are replacing more objective media, and the news cycle is spinning ever faster.
Consider another failure to communicate from the global-warming arena:
the scientific fallout after a devastating trio of hurricanes -- Katrina, Rita and Wilma -- in the fall of 2005.
Just as these storms struck, a pair of scientific studies appeared in top journals suggesting, for the first time, that global warming was making hurricanes more intense and deadly.
Other scientists vociferously disagreed, and the two camps fell into combat.
So while public interest in hurricanes was at a high after Katrina, much of the science reporting at the time portrayed researchers bickering with one another ("Hurricane Debate Shatters Civility of Weather Science," announced a Wall Street Journal cover story). Judith Curry, a climate scientist at the Georgia Institute of Technology and a co-author of one of the contested studies, told me recently that the experience made her realize that "this was really the wrong way to do things, trying to fight these little wars and knock the other side down."
With the media distracted by the food fight, scientists weren't leading the public discussion, and other important findings that ought to have received attention in Katrina's wake -- for instance, that we had better tend to our overdeveloped coastlines, which are dangerously exposed to future storms -- were drowned out.
If the global-warming battle has any rival in its intensity, its nastiness and its risk to scientists if they do not talk to the public, it is the long-standing conflict over the teaching of evolution.
Science's opponents in this fight are highly organized, and they constantly nitpick evolutionary science to cast the field into disrepute.
The scientific response to creationists has long been to cite the extensive evidence for evolution.
In book after book, scientists have explained how DNA, fossil, anatomical and other evidence indisputably shows the interrelatedness of all species. Further, they have refuted creationist claims that evolution cannot explain the complexity of the eye or the intricacy of the bacterial flagellum.
Yet such down-in-the-weeds messages probably miss most of the public -- polls repeatedly show that a large portion of Americans have doubts about evolution.
For all these efforts, why haven't scientists made any inroads?
It's because at its core, the objection to evolution isn't about science at all, but about perceived threats to faith and moral values. The only way to defuse the conflict is to assuage these fundamental fears. Yet this drags many scientists out of their comfort zone:
They're not priests or theologians and don't know how to sound like them.
Many refuse to try; others go to the opposite extreme of advocating vociferous and confrontational atheism.
Ironically, to increase support for the teaching of evolution, scientists must join forces with -- and show more understanding of -- religion.
Scientists who are believers also need to be more vocal about how they reconcile science and faith.
"Many Christians, including fundamentalists, can accept evolution as long as it is not attached to the view that life has no purpose," Karl Giberson, a Christian physicist and the author of "Saving Darwin:
How to Be a Christian and Believe in Evolution," told me recently.
"Human life has value, and any scientific theory that even appears to deny this central religious affirmation will alienate people of faith and create opportunity for those who would rally believers against evolution."
In other words, what's needed is less "pure science" on its own -- although of course scientists must continue to speak in scientifically accurate terms -- and more engagement with the concerns of nonscientific audiences. In response to that argument, many researchers will say:
"Why target us?
We're the good guys. And if we become more media savvy, we'll risk our credibility."
There is only one answer to this objection:
"Look all around you -- at Climategate, at the unending evolution wars -- and ask, are your efforts working?"
The answer, surely, is no.
The precise ways in which scientists should change their communication strategies vary from issue to issue, but there are some common themes. Reticence is never a good thing, especially on a politically fraught topic such as global warming -- it just cedes the debate to the other side.
"If we come out of this with a more organized way of dealing with these attacks in the future, then it will have done some good," Mann said of Climategate.
On other topics, including evolution, scientists must recognize that more than scientific matters are at stake, and either address the moral and ethical issues themselves, or pair with those who can (in the case of evolution, religious leaders and scientists such as Giberson and National Institutes of Health chief Francis Collins, who in 2006 wrote a book called "The Language of God:
A Scientist Presents Evidence for Belief").
All this will require universities to do a better job of training young scientists in media and communication.
The good news is that this is beginning to happen:
At the Scripps Institution of Oceanography at the University of California-San Diego, for instance, marine biologist Jeremy Jackson's "Marine Biodiversity and Conservation" summer course introduces young scientists to the media, blogging and even filmmaking.
"Traditionally, scientists have been loathe to interact with the media," Jackson said in a recent interview.
But in his class, "the students understand that good science is only the beginning to solving environmental problems, and that nothing will be accomplished without more effective communication to the general public."
Scientists need not wait for former vice presidents to make hit movies to teach the public about their fields -- they must act themselves.
And in another sign that the times may be changing, a syllabus for such classes is already here. A spate of recent books, from Randy Olson's "Don't Be Such a Scientist:
Talking Substance in an Age of Style" to Cornelia Dean's "Am I Making Myself Clear?:
A Scientist's Guide to Talking to the Public," seem like perfect assigned reading.
To Save the Earth, Encourage Economic Freedom
The best way to create sustainable environmental policies around the world is to increase economic growth and the standard of living.
A shadow hung over the Copenhagen conference.
The credibility of sophisticated climate science has been tainted by allegations that key scientists and institutions manipulated data and access to publications to support the case for global warming.
Still, many around the world would support sensible, cost-effective strategies to minimize the risk of man-made global climate change.
The inconvenient truth, however, is that the mammoth government regulatory schemes discussed at Copenhagen are both prohibitively expensive and unlikely to work -- the worst possible combination in any cost/benefit analysis.
There is a better way.
For 15 years, The Heritage Foundation and The Wall Street Journal have been measuring economic freedom in countries worldwide.
Our historical evidence and volumes of supportive social science research demonstrate that economic freedom is good not only for individual economic advancement, but for the progressive values and public goods that people seek for society as a whole.
It's simply better to live in a free society.
Higher levels of economic freedom lead to higher living standards and healthier human development.
Greater economic freedom provides more choices and improves the quality of life by opening opportunities and promoting innovation.
The benefits of economic freedom also extend to environmental protection.
Proponents of cap-and-trade schemes or other massive government regulatory interventions assert that only a strong government can protect the environment.
In fact, the market forces unleashed in an economically free society are far more likely to drive economic results in the positive directions demanded by those concerned about the environment.
The most remarkable improvements in clean energy use and energy efficiency over the past decades haven't come as a result of government regulation.
The most progress was driven by advances in economic freedom and freer trade.
These unleash greater economic opportunity and increase prosperity, generating a virtuous cycle of investment, innovation, and dynamic economic growth.
The fundamental flaw of those who favor new government regulations is their belief that there is a trade-off between economic growth and environmental protection.
They seem to think that to get more of one, you have to have less of the other. The truth is just the opposite:
to get more environmental protection you need more growth, not less. And the surest path to economic growth is through greater economic freedom.
A recent study from the World Bank reports that freer trade is "a key factor in helping developing countries reduce their greenhouse gas emissions and adapt to climate change."
Other evidence abounds. The Environmental Performance Index (EPI), published by the World Economic Forum, the Center for International Earth Science Information Network (CIESIN), and the Yale Center for Environmental Law and Policy, provides "a composite index of current national environmental protection efforts."
Levels of economic freedom and the EPI are positively correlated at a statistically significant level.
The freer the economy, the higher -- and more sustainable -- the level of environmental protection.
Policy efforts aimed at imposing stricter environmental standards through a global regulatory body run great risk of being not only fruitless, but counterproductive as well.
They undercut the economic growth necessary for greater efforts to protect the environment.
Such regulations only serve as feel-good actions, without generating real "change" that could mitigate climate change and its possible negative impacts. Countries in general -- but developing countries in particular -- are able to protect their environment only if their economies prosper and the standard of living of their citizens improves.
The surest way to promote sustainable environmental policies around the world is to increase economic growth and the standard of living.
Increased government regulation would stifle that growth.
The compelling force of economic freedom, by contrast, has been proven over and over, in countries around the world, to empower people, create positive forces of opportunity and innovation, and nourish overall well-being, including through a cleaner environment.
Climate Change & Temperature
Apart from the great financial impacts, the human impact is already being felt.
Millions are starving throughout the globe, and with the world's population rising steadily, the situation will only continue to deteriorate if temperature and climate change are allowed to continue unimpeded.
To understand how mankind can counter global warming, we must understand what brought it about.
There are several holes in the ozone layer, caused by gases emitted at the Earth's surface and rising into the atmosphere.
These gases have been produced by the use of certain noxious chemicals, some of which were used in agriculture.
One of them in particular, Methyl Bromide, has largely been banned from use.
Another minor contributor to temperature and climate changes is the use of aerosol sprays, which, though it might seem petty and innocuous, has had some effect, scientists say.
However, the principal gases being emitted into the atmosphere come from industry and vehicle exhaust systems.
Only very recently has the United States Government begun to realize how much these factors have affected the situation.
However, the realization among all of us is that it will take years, if not decades, to put an end to the emission of these gases. This can only be achieved through a gradual transition to cleaner energy.
In the meantime, mankind will have to live with the catastrophic effects that these temperature and climate changes are bringing upon us.
Natural disasters are becoming much more commonplace, the most prominent example being the Indian Ocean tsunami of 2004, which caused the deaths of more than 300,000 people and destruction of property on a tremendous scale.
Hurricanes and typhoons over the last few years have been increasingly frequent and more violent, again at great cost to lives and property.
However, temperature and climate changes have also become a silent killer, as acute shortages of water are being felt all over the world. Drinking water is in increasingly short supply, as is water for the agriculture that provides the most basic foods for the starving millions.
Great efforts are being made to increase water supplies through desalination and wastewater treatment, but it may be a case of too little, too late for the many millions of people who are slowly starving to death.
If and when mankind succeeds in reversing the changes in our weather patterns caused by temperature and climate changes, only then can the situation improve.
The Greenhouse Effect
The process known as the greenhouse effect earns its name from a comparison between the temperature of the air inside a greenhouse and the air outside.
Known to many as global warming, it is one of the most frightening man-made phenomena of the 21st century.
At the end of 2006, scientists confirmed that the average temperature of the planet had increased by up to 1°C over the previous five years.
This increase has proved beneficial for just a few regions of the world, especially those at higher latitudes.
However, the negative impacts of the greenhouse effect on the world's climate are already being observed in the form of constant annual temperature increases.
Overall, the effects of global warming have proved disturbing. A prominent example is the increasing acidity being detected in seawater due to increased evaporation levels, which could trigger a long-term reaction with catastrophic effects on the marine food chain.
Changes to the polar ice packs and glaciers are occurring on an ongoing basis, and scientific evidence has shown that these areas are warming up at a much more rapid rate than the global average.
The frightening statistic is that temperatures in the far north have risen by between 5 and 7 degrees Celsius in the last 50 years. Needless to say, if the greenhouse effect continues at this level, we can expect a catastrophe of global proportions sometime within the next twenty to fifty years.
Rainfall has decreased dramatically over the last decade, with drought conditions being felt in Western Australia and the Deep South of America, where just a few years ago such a situation was unheard of.
Shortages of drinking water have become a way of life for billions throughout the world, and with populations rising, this will soon reach epidemic proportions.
Water for agriculture is also in increasingly short supply due to the greenhouse effect, bringing with it shortage of the most basic food supplies.
There is no doubt that the greenhouse effect has changed our lives, and it will affect the lives of our children and grandchildren even more. Awareness of the problem is on the increase, and technological advances may go a long way toward curtailing it in the future.
Let us hope, for our future, that mankind will be able to halt, if not reverse, the damage being done to the world's climate through the greenhouse effect.
Why Worry about Global Warming
There are many things that can't be neglected in our lives. People always worry about the basic and important things in life, but they do not realize that the environment is also one of the important aspects that cannot be neglected.
No human can survive in an atmosphere full of carbon dioxide.
So this point can't be taken for granted: we should worry about global warming because carbon dioxide is increasing in the earth's atmosphere.
The greenhouse effect intensifies as the amount of greenhouse gases in the earth's atmosphere increases.
As the earth warms up, snow cover decreases. This leads to more warming, and the amount of water vapor generated goes up. Water vapor is the most powerful greenhouse gas in existence.
The warmest year on record was 2005, according to the third assessment report. If no major steps are implemented, the temperature will keep rising and will become very difficult to control.
Scientists predict that even if we make efforts immediately, the climate would not be balanced as the gases released in atmosphere will stay there for many years. So chances become higher for irreversible changes in climate.
Some primary sources such as fossil fuels, agriculture and land clearing, and other activities increase the volume of greenhouse gases which are released in the atmosphere.
The large number of cars running on the roads makes a major contribution to the increasing level of CO2 in the atmosphere.
The United States is responsible for releasing 25 percent of the carbon dioxide put into the atmosphere every year.
Trees absorb carbon dioxide and release oxygen.
But due to deforestation, the concentration of carbon dioxide has increased, as there are no longer enough trees to recycle it and balance the atmosphere.
This effect will be seen all over the world where people will observe a drastic change in their environment.
As the earth's temperature is rising, it will melt the ice of the Arctic sea and Antarctica.
This will result in a rise of sea level due to global warming.
Small islands could experience sudden damage from the sea-level rise caused by global warming.
The melting of ice sheets and glaciers has become difficult to predict, even with data going back to when record-keeping began.
Hence it is very important to consider global warming as one of the most important problems of the world and we should take precautionary steps to control global warming.
My Story - Why I Went Green
So you and I are here day in and day out.
Well at least I know I am here and I have never told you my story.
The story about why I have gone green and why I am writing a blog about it.
Well grab some organic coffee, tea, or just a glass of water and sit back and enjoy.
My journey began 2 years ago. My daughter was a couple of months old and Sheamus was about 1 3/4.
These two are the reasons for my journey.
Of course I have my eldest son Celtic, but there was never really a time in the beginning of his life when the light bulb flashed on in my head like with my two youngest.
Sheamus at one year old, a couple of months before Guinevere was born, was admitted to the hospital with the beginnings of pneumonia.
It was an extremely stressful time. Celtic was 5 and I was getting heavily pregnant, so the only one who could stay there full time was my husband, and this required taking time off work.
Sheamus was scared to death as they had him hooked up to an oxygen machine that kept losing his reading and beeping really loudly.
He was there for 3 days, I think, and no one got any sleep.
After that he was never really quite the same; he now has seasonal allergies as well as exercise-induced asthma.
Whenever a cold hits, we immediately take him to the doctor to start him on some strong antibiotics. He usually goes from a runny nose to coughing so much he can't breathe in less than 24 hours. We have a nebulizer here that he has to use every 2-3 hours when he has a cold, and no one really sleeps; we have to make sure that he keeps getting enough oxygen or else it is off to the hospital.
I started noticing that his breathing would act up when I would light candles or incense in the house.
And worse after I would clean.
Then I noticed that after I would vacuum with the carpet freshener powder, Sheamus would get choked up and my husband would get a really bad headache.
That is when I started doing research online and looking for alternatives. See, I am what my husband would call an internet-searching guru.
If there is something he or a friend needs to find, they call on me, and within a matter of minutes, never more than about 10, I will have an answer for them along with a list of websites.
All of these causes were connected: the cleaning products, the candles, the carpet freshener. They all have horrible chemicals in them that stay in the air long after you have used them; you breathe them in, and they affect your mood and the way you are breathing. I found simple, easy, and natural alternatives to cleaning products. Vinegar, water and orange peel, anyone?
Soy candles with cotton wicks and baking soda with essential oil for my carpets. And I will tell you what, everyone's health improved.
Celtic stopped bringing home as many colds from school as usual, Sheamus went from using his nebulizer nightly to only when he was sick.
Even my husband who usually had a cold back to back to back went 6 months without getting sick!
And the cost of making these products was nickels and dimes compared to dollars and dollars every week.
The pride of providing my family with healthier living, and knowing that I was the one doing it, feels amazingly awesome.
July 26th 2007, Guinevere Rose Waldron is born.
Red hair and pale skin, our first daughter!
Rosey red cheeks!
It turns out Guinevere has eczema on her face, it is usually very dry and itchy.
The doctor gave us a prescription cream for her face. I started to use it every time her cheeks flared up or started to itch, which was about every 2-3 hours. A week went by and her skin was completely clear as long as I used the cream; if I missed a couple of hours, her cheeks were back to being red, scaly and itchy.
We went back to the doctor for a check-up and he asked how the cream was working. I told him it was working great, that I was using it often during the day and it seemed to be working.
He looked at me and asked how many times during the day. I responded with about 4-6 times. And he said, Oh, you can't do that, you can really only use it once a week.
If you use it more than that, it will change the composition of her skin.
I respond with What?! I am not sure I heard you correctly.....
"It will change the composition of the skin on her face if you use it more than once a week." I walked out with my kids with a look of complete shock on my face.
How dare they give me a cream that WILL. CHANGE. THE. COMPOSITION. OF. MY. BABY'S. SKIN!!!!
What are they thinking, not to mention once a week isn't going to do shit for her. So I go home and chuck the remaining cream in the tube.
My husband asks what I am doing and I tell him what the doctor said.
He is in shock, to say the least. I hop on the internet, determined to find a natural cure for my baby. I come across lavender essential oil, which is good for just about everything. I decide that from now on I will add one drop of essential oil to her bath (kitchen sink) water. This is just an experiment, but I consider the alternative and, well, do I really have a choice?
In my mind, I don't and essential oil is natural, from the Earth.
We dive in and try it, giving her a bath every night with the added essential oil to her bath water. There is a difference even the first night.
Her cheeks are still red, but they are not bothering her anymore; they are calm and not itchy.
The next night we see that the rosiness is fading ever so slightly.
By the end of one week she has clear skin!
And now she still has a rosy glow to her cheeks, but they are rarely ever scaly or itchy, and we still always add the lavender essential oil to her bath water.
This was the huge turning point.
If my pediatrician would prescribe something like that and think it is OK, what the hell was in everything else that my kids got a hold of, or that I was told was good for them? And what about myself and my husband?
That is how the blog started. I needed somewhere to organize and store the information I had found. I could do just that with a blog not to mention share what I have found with everyone else who stops by.
Well that is my story, my family is still evolving and learning new things everyday.
My children are growing up with these practices and will likely not know any different, I try to help friends and family as much as I can.
But I can't force them to take my advice no matter how much I tell them that it is healthier for them, it will save them money, it will extend the future of the Earth for their children, grandchildren, or even my children.
Some of them just don't get it.
And that is ok, I still throw out the information here and there when it is relevant.
Maybe one day.... I'll just have to wait and see and be patient.
At least I know my readers come back here to see what I've got to offer or contact me with questions. And that fact makes me feel accomplished.
Food Inc.
Review
Companies use false imagery in the marketing of their products such as farm houses next to green pastures that resemble the old school farming mentality.
In reality, today it is more of a factory system that has been standardized and controlled by a few businesses that have monopolized the industry for their own financial gain.
The expense often falls on the hardworking and underpaid people who keep getting sicker as a result of poor regulations and greedy businessmen.
These people put their lives at risk; immigrants working the fields and meat packers cutting thousands of dead animals. How have we allowed a select few to claim power and authority over such a basic necessity of life as our food?
How did we let it be tailor-engineered (as if anything was wrong with it before)?
Food, Inc. digs deep to the roots of each of these issues and more, calling on us to support organic agriculture and other forms of sustainable farming, and to create justice for honest workers and consumers.
These days you rarely see meat packaged with the bone still attached.
This simple tactic further disconnects consumers from where their food is coming from, how it is cultivated, and how it gets to them.
Farms these days are no longer farms but mass producing factories of food.
The many food factories that supply many different brands are actually all owned by just a few conglomerates, which control all the regulation and take all the profit while consumers suffer health, environmental, societal and economic problems. Meanwhile, they don't want us to know what they put in our food, so they leave out important information; we keep suffering, and they keep profiting.
How is this different from Bio-warfare? GMO vs Anthrax... which is worse?
In the 1930s the first drive-in fast food chain was born, also introducing America to a new factory-style supply system for the restaurant kitchen.
The chain's main priorities were uniformity, conformity, and cheapness. It is the largest purchaser of ground beef in the U.S., as well as potatoes, pork, chicken, apples, tomatoes, and lettuce.
The top 5 beef companies in 1970 controlled about 25% of the beef industry; now, in 2009, the top 4 beef companies control more than 80% of the entire industry.
Chickens and other birds are grown and slaughtered in half the time compared to 50 years ago, yet are twice as large.
Because more consumers prefer white meat, the industry re-engineered the chicken to grow breasts twice as large.
The supporting argument from one farmer was, "Why would you grow a chicken in 3 months if you can do it in 49 days?" in reference to farming genetically modified birds.
Out of dozens of farmers interviewed for Food, Inc., only one would speak up and let a film crew record footage of the chicken farming conditions at her farm, despite the regulations of the company she produced for.
Chickens die regularly, most days, as a result of their internal organs and bones not being able to keep up with the rapid growth of their muscles. When they get infections they are treated with antibiotics in their feed, and the bacteria then often become resistant.
The antibiotics in a chicken's body can be transmitted to the human body through consumption, causing allergies or greater antibiotic resistance in humans as well.
When the chickens are being transported to the plant, all the ones that make it, whether sick, filthy, or injured, get processed through the plant.
Meat packing companies suppress the farmers' power. A farmer typically takes on around $500,000 in loans, with a single poultry house costing approximately $280,000.
It is mandatory for the farmer to keep current on all new equipment and technologies, or else he or she loses the contract with the company.
On average, chicken farmers make about $18,000 a year. That's some discouraging math!
Big corporations try to hide these things from us, and Food, Inc. is a beacon pointing in the direction we should all turn our eyes and ears, to really pay attention to what is going on with our food today.
We have the right to look into our food!
We have the right to know where it comes from and in what ways its processing will affect us, positively or negatively, because we should have the right to choose that for ourselves.
30% of America's land is used for corn. 100 years ago farmers were growing 20 bushels of corn per acre, but now farmers have figured out how to grow up to 200 bushels per acre!
They are actually producing corn below the cost of production!
This makes corn a popular ingredient for many products, and is used as feed for animals, but this does not mean that corn is the healthiest choice.
You can find corn in products such as:
ketchup, cheese, twinkies, batteries, peanut butter, cheeze-its, salad dressing, Coca Cola, jelly, sweet n low, syrup, juice, kool-aid,
charcoal, diapers, Motrin, and more .
Farmers are even using corn to feed animals that are not meant to have that kind of diet.
Take cows, for example: farmers feed their cows corn because it is cheaper and more efficient when you have a large number of cows that eat grass faster than you can grow it.
Humans eat an average of 200 lbs of meat per year!
These cows end up developing acid-resistant E. coli bacteria as a result of their unusual corn diet, which can be harmful to the consumer who eats the meat from that cow.
They stand ankle deep in their own feces all day and then get processed at the plant without even being cleaned off, inevitably passing the bacteria on to the food.
One woman's child died from E. coli after eating a burger, and she worked with lobbyists to get a measure passed that they called Kevin's Law.
Kevin's Law would give back power to the USDA to shut down plants that consistently produce and deliver contaminated meat. Seven years after her son died, she still can't get the law passed or even get the meat company to apologize!
The Smithfield Hog Processing Plant is known for exploiting their workers. They slaughter up to 32,000 hogs a day!
Their employees get infections under their nails from the bacteria in the meat of the hogs, causing them to split away from their fingers. They get covered in blood, guts, urine and feces all day.
Their employers know that they can't afford to leave, and they prey on this fact to keep them in their positions.
In the 1950's, being a meat packer was like being an auto-mechanic.
It was thought of as a good job with a decent wage, benefits, and a pension plan.
Now, it is the most dangerous job and employs many immigrants. Many of these immigrants were corn farmers in Mexico whose market got flooded with an influx of American corn. 1.5 million Mexican workers made their way over the border to become meat packers. These meat packers were recruited by the meat companies themselves, and shuttled in to work by company owned vans. When the government cracks down on illegal immigrants working in America, they go after the workers and not the companies that invited them to work illegally in the first place.
How can we let people be trapped into that and not do anything to bring them justice?
Organics are the fastest-growing food commodity, and some big-name companies are jumping on the wagon.
Large stores like WalMart have introduced organic food lines and have implemented labels that are easily seen by customers, so that they have freedom of choice when they go to the store. Do I support local, or not?
Do I support organic, or not?
Do I support fighting animal cruelty, or not?
These options are important!
Because, face it, a $1,000,000 purchase by WalMart does affect the economy, so at least they are investing in healthier means as they put more money into circulation for taxes, employment, production of goods for the whole, etc.
Go Organic!
It's good for you!
2,000 years ago people gathered their seed from their crops and saved them to plant for the next year's harvest.
Today there are companies out there that have managed to control something as simple as a seed of life by putting a patent on it, which can now make it illegal to gather one's own seeds due to patent infringement!
This company is called Monsanto.
Monsanto is a chemical company that created Agent Orange back in the Vietnam War, and also created and manufactures Round-Up.
Round-Up was not only killing weeds as it was intended to do, but was also damaging farmers' crops, so Monsanto created a Round-Up Ready GMO (genetically modified) soybean seed that springs into life already resistant to the herbicide.
Food, Inc. offers some even more astounding information.
In 1996, 2% of U.S. soybeans contained this patented gene from Monsanto. In 2008, 90% contained Monsanto's patented gene. 90% of U.S. soy product is genetically modified and the industry is controlled by one company!
It is insanely unfair and unjust!
Would we have chosen this for ourselves given the choice?
Some farmers' crops become contaminated with this patented gene without their knowledge or doing.
Wind can carry seeds to farms where no GMO seeds are in circulation, and they start to grow and spread right under the farmers' noses!
When Monsanto finds out, it often starts an investigation, and the farmers have to prove that they are not violating Monsanto's patent.
One corn farmer was facing $25,000 in lawyer's fees and hadn't even had his court trial yet.
Eventually he ended up settling with Monsanto because fighting it was getting to be too expensive, and often this is the case.
A measure was presented in California to label cloned foods and GMO foods as such; it passed, but then Governor Arnold Schwarzenegger vetoed it!
Can you believe 75% of all supermarket food is GMO?!
They end the film with a positive message I would like to close with:
You can vote to change this system - three times a day.
Buy from companies that treat workers, animals, and the environment with respect.
When you go to the supermarket, choose foods that are in season.
Buy foods that are organic.
Know what is in your food. Read labels. Know what you buy.
The average meal travels 1500 miles from the farm to the supermarket.
Buy locally grown food. Shop at farmer's markets. Plant a garden (even a small one).
Cook a meal with your family and eat it together.
Everyone has a right to healthy food.
Make sure your farmer's market takes food stamps.
Ask the school board to provide healthy lunches.
The FDA and USDA are supposed to protect you and your family.
Tell Congress to enforce food safety standards and reintroduce Kevin's Law.
If you say grace, ask for food that will keep us, and the planet, healthy.
You can change the world with every bite.
Hungry for change? Go to takepart.com/foodinc
Guest Viewpoint
Myths cloud the truth about biomass energy generation
By Rodolfo Oliviera and Suzana Radivojevic
Appeared in print:
Monday, Oct 26, 2009
Claims made by Seneca Sawmill Co. that appeal to the public's hope for clean energy promote public misinformation. A tremendous backlash against biomass is building in the Northeastern states, where plants are already online; will Oregonians learn the same lessons, only too late?
Let's debunk some of the prominent myths of the biomass proponents:
Myth:
Biomass energy generation is carbon neutral and helps solve global warming.
Fact:
By Seneca's own admission, this one facility will generate more than 210,000 tons of greenhouse gases yearly, which trap heat by absorbing infrared radiation.
A biomass co-generation plant converts carbon sequestered over a tree's 50- to 60-year growing cycle to greenhouse gases. While existing greenhouse gas tracking and trading systems account only for emissions from fossil fuels, new tracking systems, which are already in development, will soon include emissions of biogenic carbon. A facility such as Seneca's will be held accountable for its carbon emissions.
The plant will also emit substantial amounts of nitrogen oxides, which are precursors to tropospheric ozone — the "bad" ozone that damages forests and crops and is a health threat to people who exercise outdoors or have breathing problems.
The Seneca plant will saturate our local climate with other trapped pollutants, including particulate matter in permanent suspension.
This has the effect of creating a local micro-climate capable of producing thermal inversions and acid rain.
Even if one accepts that biomass is carbon neutral, its emissions have a significant potential for changing the quality of life for the communities nearby.
Myth:
Seneca's plant is not a major source of air pollution when compared to cars and trucks.
Fact:
Comparing power plants to vehicle emissions is misleading.
Air pollution from a static, steady-state pollution source operating continuously is distinctly different compared with a multitude of personal, public and business vehicles and equipment.
That is one among many reasons that industrial point source emitters such as biomass plants require air pollution permits under the Clean Air Act.
In contrast, vehicles are intermittent, nonpoint sources that are regulated and monitored under a different set of rules, some of which are not even operating in Lane County.
Local pollution from cars and trucks would be considerably curtailed if the Lane Regional Air Protection Agency had a vehicle emissions testing program in place such as exists in Portland and Medford.
Myth:
Seneca will reduce pollution by burning forest slash in a cleaner way.
Fact:
A source of pollution that was dispersed in rural areas will be transported to a single facility within the city limits, burned 24 hours a day, and will release toxics known to cause lung and cardiovascular diseases upwind of homes, schools, hospitals, ball fields, playgrounds and parks.
Is open slash burning an unregulated source of pollution that needs to be significantly reduced?
Yes, but slash and renewable wood products can be reused by businesses (i.e., Rexius Sustainable Solutions or Lane Forest Products) to produce more carbon-sequestering products such as compost and material for landscaping and agriculture.
Local businesses that provide these wood products currently offer many dependable, family-wage local jobs.
Myth:
Seneca is doing all it can to control air pollution.
Fact:
The Seneca power plant will be among the top polluters in Lane County, even after the smoke passes through its pollution control equipment.
The company refused to install state-of-the-art emission control technology available for biomass boilers, which is significantly more efficient in reducing air pollutants than the one adopted by Seneca.
This superior technology, called Regenerative Selective Catalytic Reduction, is already being used at seven other biomass plants in the United States and is considered the best available control technology in two states. RSCR reduces the two main pollutants of concern, nitrogen oxides and carbon monoxide, by 75 percent and 65 percent respectively.
Cost estimates for RSCR suggest that Seneca substantially overstated the costs of using RSCR to LRAPA. For example, the annual RSCR operating costs are less than $500,000, instead of the $1.7 million stated by Seneca.
The total cost per megawatt of generated energy would amount to between $5 and $6.50, not the $10 claimed by the company.
LRAPA should have required an independent performance and cost evaluation of available technologies that weighed the benefits of diminishing greenhouse gases and reducing public exposure to harmful air pollutants. As it now stands, community health is sacrificed for cheaper, inadequate pollution control.
Other communities have discovered that biomass incineration is not a sensible solution to either waste or energy problems. If pollution control and location issues are not addressed at the beginning of the permitting process, a biomass co-generation plant will turn out to be a disappointing way to manage valuable biomass.
Rodolfo Oliviera, a chemical engineer, and Suzana Radivojevic, a wood engineer, are consultants for the Oregon Toxics Alliance in Eugene.
Climate change ethics to be explored | The UO's Wayne Morse Center for Law and Politics will spend two years looking at the moral implications
As people continue to debate the science behind climate change, a two-year program at the University of Oregon is moving beyond data to explore the moral and ethical implications of a warming planet.
The UO's Wayne Morse Center for Law and Politics is staging a series of events around the theme "Climate Ethics and Climate Equity."
The next event will be a public talk by New York University environmental studies and philosophy professor Dale Jamieson, who will discuss "The Moral and Political Challenges of Climate Change."
Jamieson's talk will be at 7 p.m. Tuesday in Room 110 of the Knight Law Center on the UO campus. It is free and open to the public.
Jamieson was among the first scholars to explore the ethical implications of climate change, said Margaret Hallock, the Morse Center director. He calls human-caused climate change "the greatest collective challenge humanity has ever faced."
For the most part the debate on global warming has centered on the science supporting it, but Jamieson believes the issue won't be addressed successfully until people look beyond data and frame it as a moral issue.
For people to care about climate change, he suggests they need to accept it as socially wrong.
"We're going to have to change our attitude so that for the next generation burning coal is going to be seen as behavior that's every bit as unrespectable as lighting up a cigar in a public place," said Jamieson, who holds this year's Wayne Morse Chair.
Hallock said the two-year program isn't about climate science and won't debate that issue.
She said the two-year exploration is based on an acceptance that climate change is happening and is largely the result of human activity.
Seen from that perspective, Hallock said, climate change presents some questions that are ripe for debate.
She said addressing climate change could require people to redefine the relationships between rich countries and poor countries, between consumers and producers and between the current generation and future generations.
"We have to think about new conceptions of justice and property and responsibility and obligation," Hallock said. "I think that people grappling with the equity issues demonstrates that it is a moral issue. And it's a motivator for people to think about future generations, to think about people in other countries and to think about issues of personal responsibility and global justice."
The Morse Center will hold more public events centered on the moral and ethical issues raised by climate change over the coming two academic years. A daylong symposium, "The Perfect Moral Storm: Ethical Challenges of Our Climate Crisis," will be held Nov. 13 at the Many Nations Longhouse.
For information on the program, visit the Morse Center Web site, waynemorsecenter.uoregon.edu.
Glimmers of climate hope | Nations made progress despite chaos in Copenhagen
Most early assessments of the Copenhagen climate summit were grim:
"Worse than useless," declared the Financial Times of London.
"Another frank lesson to the rest of the world on the limits of a U.S. president's power," said The Philadelphia Inquirer. Even the Swedish prime minister's office called it a flat-out "disaster."
Certainly no one can argue with a straight face that the climate conference was a grand success. But neither was it the categorical failure that many have called it since the talks concluded Saturday with a grudging agreement by participants to "take note" of an interim deal by the United States, China and three other nations to fight global warming.
Expectations were depressingly low going into the conference.
With a climate bill mired in the U.S. Senate and China balking at international verification and demanding that rich nations contribute hundreds of millions of dollars to help poor countries address climate change, it was clear there would be no binding agreement to reduce greenhouse gas emissions.
But the Copenhagen Accord may eventually prove to be the "breakthrough" that a weary President Obama declared it after 13 hours of nonstop negotiations at the conference.
If that is to happen, it will require intensive and fruitful negotiations by key participants before the next U.N. climate summit a year from now in Mexico City.
It also will require Obama to persuade Congress to approve cap-and-trade legislation, an effort that could prove as formidable and as fiercely partisan as the current debate over health care reform.
For all its limitations, the Copenhagen summit produced significant progress on several critical issues.
For example, there was broad agreement over a U.N. forest carbon scheme that would allow developed nations to pay countries to preserve their rain forests and earn carbon credits. That's important, because the logging and burning of tropical rain forests accounts for roughly 15 percent of global carbon emissions, and stopping deforestation is a relatively inexpensive way to slow greenhouse emissions.
Another key development was an offer by the United States and other developed countries to provide $100 billion in climate aid to poor countries to combat climate change.
Republican members of Congress immediately responded that they would refuse to agree to such payments — an indication of the resistance that Obama faces at home to a climate agreement.
Yet another development was China's surprising agreement to international verification of emission cuts — a position that China steadfastly resisted until Obama arrived in Copenhagen to meet with China's Premier Wen Jiabao. The Obama administration has said it will oppose any binding deal that does not include verification, in part because Congress is unlikely to approve a climate bill without a verification agreement.
The path to a binding international agreement only gets tougher after Copenhagen, which produced broad political agreements largely devoid of detail.
Those details, including specific emissions reductions, must still be negotiated.
That's a daunting prospect.
But the modest breakthroughs achieved at Copenhagen provide reason to hope that even larger ones can be achieved at Mexico City.
Environmental Benefits of Telecommuting
Telecommuting and outsourcing are really starting to take off as the worldwide economic crisis is forcing employers to find new ways to reduce costs and increase employee productivity.
At first, the thought of letting an employee work unsupervised at home may be a terrifying experience for managers. But if you trust your staff to be self-motivated in their work, the payoffs can be huge.
The following is a short list of environmental benefits of telecommuting:
# Eliminating the daily commute represents a huge fuel savings from an environmental perspective. And with rising gas prices and shrinking salaries, it can be a great benefit for attracting and retaining quality talent.
# Given that it's common for a worker to spend upwards of three hours driving to and from work every day, the option of working from home can also cut the length of their work day by 30%.
This means extra time to spend with family, taking courses or at the gym.
In the end, you wind up with a healthier, happier, and more productive employee.
# Telecommuting is also good for productivity since working from home eliminates distracting social contact that may occur in an office.
Once a worker is interrupted, it can sometimes take up to 30 minutes to fully regain their train of thought.
# Outsourcing is a great money-saving option for companies looking to reduce HR costs. The software industry understands this, and has been outsourcing to India for years. Now, the trend is starting to take off in other areas of business as well.
What's most exciting about this trend is that it shows how environmental stewardship and the free market economy can work closely together in a frictionless partnership.
These two ideologies don't have to be in conflict with each other.
Of course, the system isn't perfect.
From an IT and disaster-planning perspective, this move can be particularly challenging.
Most companies today manually back up their computer data to some sort of physical media on a nightly basis: backup tapes, DVDs, external hard drives, and so on.
This is fine for corporate servers and systems physically housed in the main office, but it can quickly become a logistical nightmare when it comes to managing remote systems such as laptops. In fact, the Ponemon Institute recently released a study showing that 42% of business travelers do not back up their data.
For this, and many other reasons, companies are turning to "cloud-based" or Software-as-a-Service (SaaS) systems for their data protection.
SaaS data protection services, such as online backup, allow companies to effortlessly manage multiple remote locations without having to invest in additional equipment, staffing, or licensing.
SaaS backup systems are 100% digital, and send data over the internet using a secure connection.
This means no more lost, stolen, or broken backup tapes. Also, because these systems are fully-automated, your IT staff can devote their time to more productive activities within the organization.
Of course, as your company grows, the task of managing these remote systems will eventually get much more complicated.
That's why it's important to choose a provider that offers a centralized management portal.
This way, you can manage accounts in bulk and group users into functional categories. This will ensure more efficient administration and fewer potential configuration errors.
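As a rough sketch of what that bulk management might look like (all names here, such as `BackupAccount` and `apply_policy`, are hypothetical illustrations, not any particular vendor's API):

```python
# Minimal sketch of a centralized management portal's core idea:
# group remote backup accounts into functional categories, then
# apply one policy to a whole group at once instead of configuring
# each remote laptop by hand.
from collections import defaultdict

class BackupAccount:
    def __init__(self, user, department):
        self.user = user
        self.department = department
        self.policy = None  # no backup policy assigned yet

def group_by_department(accounts):
    """Bucket accounts into functional categories for bulk management."""
    groups = defaultdict(list)
    for acct in accounts:
        groups[acct.department].append(acct)
    return groups

def apply_policy(accounts, policy):
    """Assign the same backup policy to every account in a group."""
    for acct in accounts:
        acct.policy = policy

accounts = [
    BackupAccount("alice", "sales"),
    BackupAccount("bob", "sales"),
    BackupAccount("carol", "engineering"),
]
groups = group_by_department(accounts)
apply_policy(groups["sales"], {"schedule": "nightly", "retention_days": 30})
```

Grouping first and then applying a policy per group is what keeps administration efficient as the number of remote users grows.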
For more information on telecommuting, I'd strongly recommend checking out COI.com's excellent guide, filled with up-to-date content and resources. And if your company is planning to take advantage of the telecommuting trend, make sure to first implement a disaster recovery plan that's able to handle it.
Eco-Friendly Skincare:
Boots Botanics Organic
I've recently tried some organic/eco beauty products called Boots Botanics Organic.
Boots, one of the UK's most trusted health and beauty brands, has recently been introduced to the US, so I thought I would share the word on how wonderful its products are.
Part of my regular routine after a shower is to lather up with body lotion (as most girls do), and I'm always searching for the latest cream on the market that can make my skin softer than the last. I usually don't buy designer creams at makeup counters due to their high cost (especially ones that claim to be organic), but I did come across some great stuff that is very affordable and happens to be really great quality.
The best part is that it's all organic, made from real plant extracts.
Also the containers and packaging are all made out of recyclable materials to help you reduce your impact on the environment by making your beauty routine eco-friendly.
Boots Botanics Organic has a wide range of organic beauty products, from eye creams to reduce dark circles and puffy eyes to body oils, creams and even makeup.
Pretty much anything a girl needs to pamper her face and body.
But don't worry, boys, there are some products available for you as well.
The facial smoothing polish is made from almond oil and leaves my face baby soft. And the lip balm is made with olive oil moisturizers, works great and looks like a light gloss. I absolutely love the results after using it for just one week.
The body cream smells refreshing and definitely helps with dry skin and it comes out feeling soft and silky. I will definitely be making some changes by purchasing more organic beauty products in the future.
This is good news for my mom because she is all for Living Green and purchasing green.
After I told my mom about Botanics Organic she is really excited to try it.
Here are some details of the products I've tried:
Organic Face Lip Balm is 95% certified organic and comes in a 100% recyclable package. Its rich balm keeps your lips moisturized all day long.
It contains Shea Butter to nourish and condition and Olive oil to moisturize.
The rich butter from the Shea or karite nut is widely used in West Africa both for cooking and skin care.
The nuts are gathered and the butter produced by local women who value it for nourishing and conditioning the skin and hair. It feels great on your lips and keeps them smooth and soft. A great value at $6.99.
The Organic Bathing Rich Body Butter is 81% certified organic and contains murumuru butter which is extracted from the seeds of the tropical palm.
It is a rich emollient that moisturizes and softens. Murumuru butter is extracted by cold-pressing from the seeds of the murumuru tree, which grows in Brazil.
It is anti-inflammatory, protective and emollient, and high in oleic and linoleic acids and vitamin A. This cream works wonders on dry skin and has a pleasant "earthy" aroma. A great value at $13.99.
The Organic Face Rosewater Toner is 100% certified organic and is used to gently remove makeup, leaving your skin smooth and silky.
It is derived from the damask rose that has been used for centuries as a toner and cleanser, particularly for fair to dry skin.
Its gentle action leaves skin toned and refreshed. A great value at $7.99.
Face Hydrating Day Cream is 85% certified organic and is made with almond oil, which nourishes the skin and is known to improve complexion and maintain a glow.
This is a wonderful face cream to use daily. A great value at $12.99.
Smoothing Face Polish is 87% certified organic and is a gentle exfoliating face scrub that you can use every day to remove impurities and dead skin cells. It contains cranberry and cupuaçu to smooth and soften, and almond oil to nourish.
This works great if you use it right before the day cream.
It smells very "planty" and light. I particularly love this to help smooth away blackheads and remove oils. A great value at $8.99.
Your skin is the largest organ of your body, and using only organic cosmetics could be just another step toward living greener and healthier. It's important to consider what we put in our bodies, and on them as well.
The powers of plants are evident when you try this line.
All Boots Botanics Organic products are now being sold in Target stores nationwide and online.
You can't beat the prices, and I guarantee you will never go back to anything else once you try them.
Oregon Green Jobs
According to the Portland Business Journal, Oregon reported a total of 51,402 green jobs for the year of 2008.
The number of jobs was determined by a survey of employers across a broad spectrum of industries and occupations.
These numbers account for roughly 3 percent of Oregon's employment base, and the total is expected to grow by 14 percent by 2010.
The green jobs surveyed in Oregon average above $20 an hour. In case you were wondering what defines a green job, the survey used the following criteria:
• Increases energy efficiency.
• Produces renewable energy.
• Prevents, reduces or mitigates environmental degradation.
• Cleans up and restores the natural environment.
• Provides education, consulting, policy promotion, accreditation, trading and offsets, or similar services supporting the other categories.
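As a quick sanity check on the headline figures above (the 3 percent share and 14 percent growth rate are the survey's numbers; the totals below are simply derived from them):

```python
# Quick arithmetic check of the survey figures. Inputs come from the
# article; the employment base and 2010 projection are derived values.
green_jobs_2008 = 51_402
share_of_base = 0.03       # "roughly 3 percent of Oregon's employment base"
growth_by_2010 = 0.14      # "expected to grow by 14 percent by 2010"

implied_employment_base = green_jobs_2008 / share_of_base
projected_green_jobs_2010 = green_jobs_2008 * (1 + growth_by_2010)

print(round(implied_employment_base))     # roughly 1.7 million jobs statewide
print(round(projected_green_jobs_2010))   # about 58,600 green jobs by 2010
```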
The three industries with the most green jobs were construction, wholesale and retail trade, and administrative and waste services. Combined, these industries accounted for 47 percent of Oregon's green jobs.
Oregon has managed to attract Vestas (a wind turbine company), SolarWorld, and several other solar manufacturing plants, making the state a hub for alternative energy companies. Another project that will sustain green jobs is the Oregon Sustainability Center, a concept building that recently completed a feasibility study to become the world's first living high-rise with net-zero energy and water usage.
The outcome of the study was that it is possible, and after attending the community meeting about it last week at Portland State University, it seems as though the project will move forward in the next few years.
5 Tips for a Green July 4th
Independence Day is right around the corner, and most of us love to celebrate it by gathering with friends, eating good food off the grill and watching fireworks at the local park.
If you're planning on throwing a party or hosting a red, white and blue event, maybe you can add a little green into the mix.
Here are some tips to keep in mind while celebrating this year:
1.) Buy Organic Food:
Make sure to support the local farmers markets and buy organic fruits and veggies, and don't forget your reusable shopping bag!
2.) Bust out the silverware:
Instead of adding waste to our landfills by purchasing a bunch of paper plates, napkins, and plastic utensils, why not use your own dishes? Instead of paper napkins, try using cloth ones, since they can be washed and used again and again.
It may mean more cleanup time for you, but it will do the environment a big favor. And if you must buy throwaway plates and cups, make sure they are biodegradable.
Here is an example of some great biodegradable products.
3.) Grill on the Green Side:
If you plan on using a spray-on cleaner on your grill, I would suggest a non-toxic one, such as Simple Green Heavy Duty BBQ Cleaner or Soy Green BBQ Cleaner.
4.) Buy Green Charcoal:
I would highly suggest a 100% all-natural hardwood lump charcoal.
It has many advantages: it doesn't contain any chemicals or fillers, burns faster than the regular stuff, and produces less ash.
You can then sprinkle the leftover ashes around your plants to help keep insects away.
5.) Substitute lighter fluid:
Forget pouring half a bottle of lighter fluid all over your coals, which subsequently gets cooked into the food.
Use a chimney starter instead. They are inexpensive and beat using toxic lighter fluid any day.
Another example of how Portland, Oregon is consistently rated the number one green city in the US is in the picture above.
These are designated green zones so bicyclists can get in front of cars at intersections safely.
Although the purpose of the green zones is safety, the green painted boxes also represent Portland's dedication to the bicycle community.
The city calls them "bike boxes," and they have received great reviews from bicyclists since they were painted. Cars are supposed to stop a little earlier than usual at some intersections, and most drivers comply.
The idea has been successfully used in Copenhagen.
The main goal is to prevent collisions between motorists turning right and cyclists going straight.
It's all about visibility and awareness. At a red light, cyclists are more visible to motorists by being in front of them.
At a green light, the green bike lane through the intersection reminds motorists and cyclists to watch for each other.
It's no wonder Travel & Leisure magazine recently rated Portland, Oregon, the number two Top Biking City in the world.
This is what it says for Portland:
As the only large U.S. city to receive the League of American Bicyclists' top rating, Portland leads the domestic charge to put two-wheel transportation on par with automobile travel.
Highlights in the Pacific Northwest enclave include 270 miles of on-street bike lanes and paved paths; hundreds of signs for bikers navigating their way; lock-up corrals for parking in the city; and mandates that give incentives for developers willing to provide showers and locker rooms for bike commuters.
Hooray, the clear summer skies are here!
Most of us enjoy all the activities that summer brings:
BBQing with friends, swimming, tanning, fourth of July, and camping.
Along with those wonderful things come some things I think we all hate and dread, like sweating profusely, getting into a hot baked car with leather seats (or any hot car, for that matter), bug bites, and finding it hard to fall asleep for lack of cool air. Well, there are ways to stay cool without running the energy-guzzling air conditioner all day long. Here are a few tips to stay cool and still have that green summer fun.
1.) Hang a dampened sheet over the window; as the moisture evaporates in the breeze, it will make the incoming air much more pleasantly cool.
2.) Switch on that underappreciated ceiling fan or a regular floor fan. I just moved into an apartment where the air conditioner is in the living room, and I noticed the cool air doesn't reach the bedroom at all.
And since heat rises, living on a third floor apartment really makes my bedroom like a sauna!
But I'm very grateful there is a ceiling fan.
If I just shut the curtains so no sunlight can bake my bed and switch on the fan it is much cooler.
3.) Keep windows open at night to let the cool night air circulate throughout the house.
4.) Splashing some cold water on your face will really help bring your core temperature down.
5.) Put some damp cloths in the freezer and press them on your wrists and neck to help you stay cool.
If you are about to go on a drive, bring along one of those cold cloths to put on your seat.
6.) Buy a chill misting fan.
They are cheap and really give you a splash of energy if you're feeling hot and lethargic.
You can buy them pretty much anywhere, and they really work great.
7.) Eat a Popsicle, or even better, freeze some fruit.
Frozen treats can refresh you in the relentless heat.
You can either eat the fruit by itself or make a delicious smoothie with some ice, yogurt, and a little milk.
On a sizzling hot summer day, heavy meals such as Mexican food or greasy burgers aren't so appetizing. Stick with lighter, smaller meals that don't require you to turn on that hot box (the oven), which radiates heat into your place.
You can fire up the grill instead and enjoy the summer chatting with friends, keeping the heat outside.
You can also make a sandwich or a great fresh salad.
8.) Watermelon is also a very tasty snack to keep you cool and hydrated. Here's a refreshing recipe to try on a hot summer day:
Watermelon Icee
2 cups watermelon, cubed, and seeded
4 large strawberries
1/2 cup guava juice
2 cups ice
Place all ingredients listed in a blender. Blend until smooth and ice is completely crushed.
Serve immediately or chill in the freezer for later use.
Number of Servings:
3
Prep/Cook Time:
5 Minutes
9.) Wear white or light colors, since dark colors absorb more heat, making you hotter. It's also great to wear natural fabrics like cotton, silk, and linen instead of synthetic polyester and rayon.
10.) Heck, if you live alone or with your significant other, why not throw on a bikini or walk around the house in your underwear, or even in the nude =) One suggestion, though: make sure to shut your blinds so your neighbors don't get a show!
Truck Stop for Dragon Power Station
The search for new ways to generate pollution-free electricity seems endless. A while back, Terry Kenney dreamed of a device (now called the Dragon Power Station) in the road that would generate energy from the vehicles driving over it.
It took some years to bring the idea to life.
According to an article at New American Media as trucks pass over the plates embedded in the asphalt "they compress a tank of hydraulic fluid under the road, which in turn creates a series of pumping actions that turns a generator to produce electricity."
By next month Kenney expects it to produce 5,000 to 7,000 kilowatt-hours every day.
That's enough to power 1,750 homes!
Some could argue that this is stealing energy from the vehicles and hurts fuel efficiency, and therefore isn't truly pollution-free.
In my opinion, this energy is being recovered that would otherwise inevitably be wasted.
The stations are normally placed where trucks already have to slow down and come to a stop, so they shouldn't hurt efficiency: the vehicles would otherwise waste all that energy in their brakes. This is nothing more than untapped kinetic energy.
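A rough back-of-the-envelope sketch shows how much kinetic energy a single braking truck gives up; the truck mass and speed below are illustrative assumptions, not figures from the article:

```python
# How much energy does one braking truck actually dissipate?
def kinetic_energy_kwh(mass_kg, speed_m_s):
    """Kinetic energy 1/2 * m * v^2, converted from joules to kWh."""
    joules = 0.5 * mass_kg * speed_m_s ** 2
    return joules / 3.6e6  # 1 kWh = 3.6 million joules

# Assumed example: a loaded semi (~36,000 kg) slowing from about
# 30 km/h (8.3 m/s) to a stop.
per_stop = kinetic_energy_kwh(36_000, 8.3)
print(round(per_stop, 2))  # ~0.34 kWh per stop, before conversion losses
```

At a few tenths of a kilowatt-hour per stop, reaching thousands of kilowatt-hours a day implies very heavy truck traffic over the plates, which is why the device is aimed at busy truck stops.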
This clever idea seems to be flourishing in the UK, where the same approach has been adopted to power a grocery store.
A Sainsbury's supermarket in Gloucester, UK, has these plates in its parking lot; as cars drive over them, the weight of each vehicle creates a rocking motion under the road surface that turns generators to power the checkouts. The amount of extra fuel needed is so tiny that the effect is equivalent to passing over a speed bump.
This is truly a green store, and not just for that reason: it also harvests rainwater to flush toilets, has floor-to-ceiling windows for natural light, and uses solar-powered hot water heaters.
Pets and Toxics
Our pets are like family to a lot of us here in America.
We strive for better well-being and to reduce our carbon footprint, so why not take our furry friends into consideration? I have a very cute chihuahua, and she doesn't have the ability to tell me, "You know, this doggy food has artificial flavors and preservatives in it; I think I would like to try some real chicken product."
So we as pet owners need to make better choices that help the environment as well as the health of our pets. Here are a handful of my top green pet tips you should consider:
Eating Well
Most dog or cat food kibble you see at the supermarket is very bad for our pets. The animals used to make many pet foods are classified as "4-D," which is really a polite way of saying "Dead, Dying, Diseased, or Down (Disabled)" when they line up at the slaughterhouse.
Unless that can of Friskies or Dog Chow is labeled as FDA food-grade meat, it's not fit for people to eat, yet for some reason it's okay for our pets? I don't think so!
Natural and organic pet foods use meats that are raised in sustainable, humane ways without added drugs or hormones, minimally processed, and preserved with natural substances, such as vitamins C and E. Certified-organic pet foods must meet strict USDA standards that spell out how ingredients are produced and processed, which means no pesticides, hormones, antibiotics, artificial preservatives, artificial ingredients or genetically engineered ingredients.
Poop Cleanup
Make sure to use biodegradable doggy bags. And for cats, avoid clumping clay litter at all costs. It's harmful to the planet, and if your cat were to ingest some, it could be very hazardous to her health.
Clay litter contains sodium bentonite, which acts as the clumping agent and can poison your cat through chronic ingestion during their constant grooming.
Because sodium bentonite acts like expanding cement (it's also used as a grouting, sealing, and plugging material), it can swell to 15 to 18 times its dry size and clog up your cat's insides. I would highly suggest an eco-friendly kitty litter such as Feline Pine.
Safe Cleaning Products
None of us wants to live in a household filled with toxic chemicals.
So I would eliminate all bleach products from your bathtubs and sinks. After a shower, one of my cats loves to lap up the leftover water in the tub, and if it were cleaned with bleach, that could be very harmful to her health.
And we know dogs are known for getting into the toilet water. So make sure you use cleaning products made from safe natural and non-toxic ingredients to prevent exposure to the hazardous synthetic chemicals conventional cleaners often contain.
Play Green
Many pet toys these days are dangerous and harmful. I came across a sad story at zootoo.com about just how dangerous they can be.
Not only can accidents happen through accidental ingestion and other mishaps, but a lot of pet toys are known to contain elevated levels of deadly toxins. Some tests have revealed high levels of arsenic, mercury and cadmium as well as lead, a combo platter of lethal chemicals. You wouldn't let your child put a toxic toy in their mouth, so why let your pets (which most people consider their babies) do so?
Shop for green pet toys at Shopgreenpets.com or find them in your local pet store.
What are farmers to do with 800 million pounds of deformed, misshapen and just plain ugly-looking watermelons?
Well, in 2007 they were simply left to rot.
That sounds like such a waste when they could instead be used to produce ethanol.
The usual ethanol crops are corn, switch grass, sugar cane, and sweet sorghum.
The sweet, succulent summer fruit might be added to that list, since a 20-pound watermelon can yield up to 1.4 pounds of sugar. From that, scientists can produce seven-tenths of a pound of ethanol from crop waste for energy.
While that may sound like a small amount of ethanol, consider that watermelons are not being grown as a biofuel crop.
We are benefiting from a little boost in oil independence from leftover melons that would have gone to waste.
This could be a great new market for watermelon farmers. Although research is still being done, it sounds like a wonderful and promising breakthrough.
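Scaling the per-melon figures above up to the 800 million pounds of culled fruit gives a rough sense of the potential; the ethanol density used for the gallon conversion is an assumed standard value, not a figure from the article:

```python
# Rough scale-up of the quoted per-melon yields: 800 million pounds
# of culled watermelons at ~20 lb each, ~0.7 lb of ethanol per melon.
culled_lbs = 800_000_000
melon_weight_lbs = 20
ethanol_per_melon_lbs = 0.7
ethanol_density_lbs_per_gal = 6.59  # assumed standard density of ethanol

melons = culled_lbs / melon_weight_lbs
ethanol_lbs = melons * ethanol_per_melon_lbs
ethanol_gallons = ethanol_lbs / ethanol_density_lbs_per_gal

print(round(ethanol_gallons))  # roughly 4.2 million gallons of ethanol
```

A few million gallons is tiny next to national fuel demand, but as the article notes, it comes from fruit that would otherwise rot in the fields.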
Ever go to your mailbox after a week or so and find it overstuffed with junk mail and useless catalogs?
And then you go through your mail and one by one toss it all in the garbage nearby.
Well suppose you could do the environment a big favor by reducing the amount of junk mail you receive by 90%.
Cut down on the junk mail you receive today to save energy, natural resources, landfill space, tax dollars and a lot of personal time.
Here are some startling facts.
* 5.6 million tons of catalogs and other direct mail advertisements end up in U.S. landfills annually.
* The average American household receives unsolicited junk mail equal to 1.5 trees every year—more than 100 million trees for all U.S. households combined.
* 44 percent of junk mail is thrown away unopened, but only half that much junk mail (22 percent) is recycled.
* Americans pay $370 million annually to dispose of junk mail that doesn't get recycled.
* On average, Americans spend 8 months opening junk mail in the course of their lives.
So if you decide to make the decision to cut junk mail out of your life there are a few ways to go about it.
One option is to register your name on a "Do Not Mail" list with the Direct Marketing Association (DMA). You can also go to OptOutPreScreen.com, which lets you remove your name from the lists that mortgage, credit card and insurance companies use to mail you offers and solicitations. Surely this won't get rid of all the junk mail, but you can find further information at JunkBusters.com to rid your life of the junk once and for all.
Environmental Portland Oregon
View of Portland City Center from Broadway Bridge [near Pearl District]
Portland, Oregon, also known as the City of Roses, is a great leader and example of a very environmentally aware city. According to the U.S. Census American Community Survey [2006], 4.2% of Portlanders commute to work by bike, but some estimates today say that 16% of the city's population bikes around (that's pretty good). Right now the city is leading the way nationally in the use of green roofs (also called eco roofs or planted roofs) in hopes of furthering a more livable urban environment.
Portland got its first green roof in 1996 when Tom Liptan, an Eco-roof expert for the Bureau of Environmental Services, topped his garage with a green roof. Today, Portland has hundreds of green roofs that cover about 19 acres of rooftops, and the city plans to add another 43 acres of green roofs in the next five years. The city wants to encourage other businesses and homeowners to take part in this great new movement.
These eco roofs have loads of benefits: they filter pollutants, absorb carbon dioxide, attract birds and insects, and emit oxygen.
However, you can't just plant anything on an eco roof: low-maintenance, self-sustaining plants, especially drought-tolerant sedums, are generally recommended. Shrubs and trees require much more care as well as room to grow, so they're better off in roof gardens, where residents can enjoy and tend them more easily.
These eco roofs are not just meant to be admired for the pretty plants and flowers growing on them; they also add insulation, making houses warmer in winter and cooler in summer. When the skies open up, eco roofs capture 10 percent to 100 percent of the rainfall, reducing runoff into the storm-water system.
And with life spans of about 40 years, eco roofs last longer than conventional roofs, which usually need to be replaced every 20 years or so. They can cost more than a standard roof, but the benefits definitely outweigh the cost. A little oasis upon the roofs in a completely green city, where the people are friendly and the community is constantly striving to be more environmentally correct; I'm surprised the whole country doesn't move to Portland, Oregon.
Dirty reusable bag
These days reusable bags are a pretty ordinary thing.
You will especially see them being used in your local health food store or market. I think it's great that this new eco fad has become so popular, but it could possibly be harmful.
Once you have used your bags a number of times, you can expect some stains, possibly from produce, meat, or fish.
Or maybe you use your reusable grocery bags for other purposes such as a diaper bag or gym bag.
I'm not sure who would do that, but apparently some people do. According to a microbiological study done this year (2009), unclean reusable bags can pose a public health risk due to the high levels of mold, bacteria, and yeast present in the samples. Some of the findings include:
* Sixty-four percent of the tested reusable bags were contaminated with some level of bacteria, and nearly 30 percent had bacterial counts higher than what is considered safe for drinking water.
* Forty percent of the bags contained the presence of yeast or mold.
* Some of the sampled bags contained unsafe levels of coliforms and fecal intestinal bacteria (eek, gross!)
This really shouldn't be a cause for concern if you just do something very simple...TOSS IT IN THE WASHER!
Susie Craig, a food safety expert, was quoted as saying, "Anytime you're toting around fresh produce, meat, poultry and fish, there's going to be bacteria. You should treat reusable bags as you would your cutting boards: clean and sanitize between uses."
Sounds pretty simple and easy right?
Well, a friend of mine works at a grocery store, and she says she is shocked at the dingy, stained, dirty bags some people bring in!
Just practice normal hygiene and I don't think it should be a problem.
Continue to use reusable bags, just stick them in the washer regularly.
After learning about this and subsequently writing about it, I've realized that I am guilty of never having washed my reusable bags since I bought them.
Off to the wash they go =)
Electric Charging Stations
Photo credit to John Tarantino, taken at OMSI in Portland, Oregon
As the fall of two of the Big Three Detroit automakers continues amid what some call the Great Recession, emerging niche markets are poised to benefit.
Electric vehicles, although not currently mainstream, stand a fighting chance now that the Obama administration has directed stimulus funds towards a greener economy.
While the restructuring of GM and Chrysler continues, electric vehicles will most likely continue to steadily gain attention as technology improves and prices decrease.
Electric vehicles also stand to get a boost from governments that are providing incentives for people to make the switch via tax credits.
Charging stations like the one pictured above at OMSI in Portland, Oregon, will soon begin to sprout up all over as the switch from fossil fuels to pure electric vehicles takes place.
This transition is starting with small electric vehicles, but the major push will be through plug-in electric hybrids that will start hitting the market very soon.
Once people start purchasing these plug-in hybrids for short trips around town, gasoline consumption will slowly decrease over time.
If you live close to where you work and play, the only reason to use gasoline would be for long trips out of town.
These are certainly exciting times.
Most Americans feel the need to have the latest electronics these days. Those pricey flat screens are a must-have for the modern American home.
About 10% of the average home's electricity usage comes from TVs and the DVRs that go along with them.
Plasmas in particular are energy-guzzling devices. Well, California is doing something about it and plans to cut TV power consumption by 50% within four years.
The California Energy Commission is proposing energy efficiency standards for newly manufactured TVs. The technology to meet them already exists in some TVs that are in stores now.
I'm sure you have heard of the Energy Star program before.
It is a voluntary program that encourages manufacturers to produce efficient televisions to achieve an Energy Star label.
The label, however, does not prevent the sale of energy inefficient televisions that will cost consumers money over time.
So the new standards are still necessary.
The new regulations will be beneficial in many ways. They will save consumers money on their electricity bills, conserve energy, and protect the environment.
Additionally, they will help reduce greenhouse gas emissions and decrease the need to build additional large power plants. Four million TV sets are sold in the state annually.
Plasma displays like Panasonic's, which can use up to 30 percent more energy per square inch than liquid crystal displays (LCD), would be hit hardest by the standard.
Meanwhile, the people still using old-school TVs (cathode-ray-tube sets) can have some peace of mind knowing that those models are among the most energy efficient TVs around =)
Some ways you can save power regardless of the TV you own -
Of course, turning off the TV when you're not watching (or even unplugging it, since many appliances still draw standby power when plugged in) will save you a bundle on your bill.
If possible, try watching less TV; maybe listen to the radio, or read a book and expand your mind!
If you have an LCD TV, try turning down the backlight.
This can save power and may not make a lot of difference in the television's picture quality.
Turn off the HDTV quick start option - many new HDTVs have a quick start option.
This feature draws more standby power when the TV is off. It only takes a few extra seconds to start the TV without it, and turning it off can save a lot of power.
Get a group of your friends together and maybe watch your favorite series or show as a group.
Or limit the number of TVs in your house and watch as a family.
I love Washington
Photo credit: danielglaserphotography.com
The State of Washington's governor, Chris Gregoire, signed a law on May 6, 2009, that has many environmental benefits. By the end of this year, all state agencies and state colleges will be required to purchase 100 percent recycled paper, and any building with 25 or more employees has to recycle all of the copy and printing paper used in the office.
In addition, the state is required to cut its printing and copying usage by 30 percent by July 2010.
These obligations aim to reduce overall paper and print waste.
If you have tried to do your part in helping the environment and gone to purchase recycled paper at the store, I'm sure you've noticed it's not quite as cheap as standard paper. So the increased cost of purchasing 100 percent recycled paper will be balanced out by paper-reduction strategies like double-sided printing and paper conservation programs. The new law is also said to save taxpayers $1 million per year and create more green jobs. "Washington has a proud tradition of producing the wood and paper products that the world needs," Representative Kessler, a sponsor of the new legislation, said in a statement.
"Our new paper conservation and recycling law helps open up new markets for green products created by blue-collar workers right here in our state -- jobs that won't be exported."
Here are some numbers to put into perspective exactly how much good this new law will do for the environment. The state will:
• Use 6,256 tons less wood -- the equivalent of about 43,000 trees.
• Produce 3.8 million pounds less in climate-changing greenhouse gases -- the equivalent of about 346 cars per year -- in producing the paper.
• Use 15.7 million gallons less water -- the equivalent of about 24 swimming pools -- in producing the paper.
• Create 2 million pounds less solid waste sent to landfills -- the equivalent of about 72 garbage truck loads.
It would be nice to see other states pass similar laws so that America can continue saving trees, reducing emissions, and cutting waste overall.
So a friend of mine has a beautiful home in the East Bay (of San Francisco, CA) with some incredible views and a lot of rolling green hills right behind his backyard. I usually go over there to swim or barbecue on a warm sunny day.
When I arrived there the other day, I noticed a herd of at least 100 goats eating grass so fast it was as if they'd never eaten before!
They were really cute to watch and surprisingly they let you get really close to them without running away.
My friend was saying that every year a few guys and their cattle dog come, put up a little orange fence, and bring in all the goats to act as lawn mowers. I thought it was a great eco-friendly way to mow the lawn.
Coincidentally, a recent article came up that Google is also utilizing this idea by "hiring" 200 goats to mow the campuses at Google headquarters in Mountain View, California.
Dan Hoffman (who heads the company's real estate services) wrote:
"Instead of using noisy mowers that run on gasoline and pollute the air, we've rented some goats from California Grazing to do the job for us (we're not 'kidding'). A herder brings about 200 goats and they spend roughly a week with us at Google, eating the grass and fertilizing at the same time."
This is also useful because California is known for raging wildfires, so keeping the brush to a minimum is important.
It's unlikely that every household will adopt a goat to mow the lawn every Sunday afternoon, but it's nice to see big companies do what they can to help the environment.
If I said that right beneath the ocean floors of the US alone lies a cheap, clean natural energy source that could power all of America's homes and cars for over 2,000 years, would you believe it?
Well, it's true!
There is an abundant and accessible supply of hydromethane, or "ice that burns."
It's basically an ice formation composed of natural gas (methane) and water that, if put to a flame, would burn.
Gas hydrates, also known as "clathrates," form when methane gas from the decomposition of organic material comes into contact with water at low temperatures and high pressures. Those cold, high-pressure conditions exist deep below the oceans and underground on land in certain parts of the world, including the ocean floor and permafrost areas of the Arctic.
Japan and India currently have the best-funded research programs to date.
In April and June 2008, the US department of energy signed agreements with India, Japan, and Korea to cooperate and share research information.
The US plans to drill sometime early next year. Methods of extracting the methane from the ice are similar to those used in the oil and coal industries: depressurization and thermal stimulation.
Although methane is a clean fuel when burned, more information is needed to ensure that producing it would be safe for the environment.
Tim Collett, a research geologist, states: "Gas hydrates are totally doable, but when and where we will see them depends on need, motivation, and our supply of other energy resources. In the next five to ten years, the research potential of gas hydrates will be more fully realized."
Scientists have known about hydromethane for a long time but only recently started trying to use it as an alternative energy source.
The problem is that although there is a lot of this stuff all around the world, they still need to find the best way to extract gas hydrate deposits and produce gas on an industrial scale.
This sounds to me like a great supplement for oil and coal.
So many great things could come out of this if it is further studied.
Electricity would be extremely cheap and globally available for underdeveloped countries. This could revolutionize the world.
Since we're being optimistic: agriculture could expand and help end world hunger, and our quality of life, and that of others, would greatly benefit.
It could cause a domino effect of peace and equality among all nations!
The government has shown interest in this and has invested millions in the past to drill in our oceans. Hopefully it will continue to plan and invest money in exploring and researching this promising discovery for the good of humanity and the well-being of our planet!
We have all heard that eating too much fish may be bad for your health, especially for pregnant women.
The fish you may love from your frequently visited sushi house, such as tuna, swordfish, king mackerel, and shellfish, could become even more dangerous in the years to come.
The administrator of the Environmental Protection Agency (EPA) states that a new study has given them a better understanding of how dangerous levels of mercury move into our air, our water, and the food we eat, and shines new light on a major health threat to Americans and people all across the world.
With this new information the EPA plans to continue working with our international partners to help cut mercury pollution in the years ahead and protect the health of millions of people.
This is a pretty alarming topic.
Not just for sushi lovers, but for ocean wildlife and ecosystems. Water samples taken in 2006 show that mercury levels were 30 percent higher than when measured in the 1990s. So this new study reveals a few new things that are kind of scary.
The first thing is that methyl mercury was found in the North Pacific ocean.
Methyl mercury is a soluble and highly toxic form of mercury. It is basically formed when algae floating around the surface of the ocean quickly die and trickle down into the deep ocean, where they decompose; that decomposition, mixed with mercury, forms methyl mercury.
The other unexpected finding was that a lot of this mercury is being swept in from the Asian coasts. Usually scientists look to the skies when there is a rise in mercury levels. They have long known how mercury deposited from the atmosphere into freshwater ecosystems can be transformed into methyl mercury, but they have now learned more about that transformation, and that a majority of the mercury is coming from the Asian coasts. Scientists predict that with increased mercury emissions from human sources, there could be a 50 percent increase in mercury in the Pacific by 2050.
I guess if you're a vegetarian this may not concern you all that much, but mercury is also in the air we breathe, and the biggest way we contribute to it is through coal-fired power plants, which account for 65% of the mercury released into our environment.
We can only hope that the EPA stays on top of its game and helps to regulate and limit emissions from power plants in the US, and that it continues to work with other countries to control the amount of mercury polluting our earth.
America's Renewable Energy Report Card
Obama said it well last week in Iowa on Earth Day: "I don't think we should be followers. I think it's time for us to lead."
He was referring to America becoming the leader in renewable energy sources. And currently, Denmark is the best example of where the US should be.
Here are some reasons why they are first:
* They get 20-25% of electricity and heating from renewable energy, mostly biomass, biodegradable waste and wind.
* That will be 36% by 2025.
* They build houses with green roofs made from seaweed...enough said.
* Since 1973, they've gone from a 99% dependence on Middle East oil to 0% today.
None.
Obama also mentioned that America currently produces less than 3% of its electricity from renewable sources like wind and solar. Danish companies manufacture 40 percent of the world's supply of wind turbines, and the country's extensive, decades-long research programs provide employment for a good chunk of the population.
The people of Denmark live rather simply. The average Dane owns only one vehicle, and a lot of them use public transportation. A majority of them don't own half the gadgets your everyday American has in the house.
They use less than half the energy Americans use annually. A fun fact about the Danes: they were reported, in a social study, to be the happiest people on earth.
They have total energy independence and a great attitude toward sustainability for future generations. I think the US may have a tough time beating them for the top spot!
But we can sure try!
Biofuel from coffee
Coffee is not only a great pick-me-up on a sluggish morning; new studies show it has a very profitable new benefit.
After brewing your favorite cup of coffee or cappuccino, the remaining or "spent" coffee grounds end up in your garbage can or possibly added to your soil as fertilizer. Now there is an even greater use for used grounds: biodiesel fuel!
The reason biodiesel isn't being used a whole lot these days is that there isn't a cheap, readily available, quality feedstock to make the new energy source from.
Used coffee grounds contain between 11 and 20 percent oil by weight, which is about the same as traditional biodiesel feedstocks such as rapeseed, vegetable, palm, and soybean oil.
Growers produce more than 16 billion pounds of coffee around the world each year. Scientists estimated that, out of that, 340 million gallons of biodiesel could be added to the world's fuel supply.
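As a rough sanity check on that figure, you can run the arithmetic yourself. The numbers below are illustrative assumptions, not values from the study: a mid-range 15% oil fraction and roughly 7.6 pounds per gallon for vegetable-type oil.

```python
# Back-of-the-envelope check of the coffee-to-biodiesel estimate.
# Assumptions (illustrative, not from the study):
#   - 15% oil by weight (mid-range of the 11-20% figure)
#   - ~7.6 lb per gallon of oil
#   - roughly 1 gallon of biodiesel per gallon of oil
coffee_lbs_per_year = 16e9        # world coffee production, pounds
oil_fraction = 0.15
lbs_per_gallon_oil = 7.6

oil_lbs = coffee_lbs_per_year * oil_fraction
biodiesel_gallons = oil_lbs / lbs_per_gallon_oil
print(f"~{biodiesel_gallons / 1e6:.0f} million gallons per year")
```

That lands around 316 million gallons, in the same ballpark as the study's 340 million gallon estimate, so the headline number checks out.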
The scientists verified it by testing grounds from a multinational coffeehouse chain.
They discovered a few things during the process: one, the coffee-based fuel actually smells like java (yum!), and two, since coffee has a high antioxidant content, the fuel turned out to be more stable, a big advantage over other biodiesels.
And after converting the grounds into oil, the leftover solids can simply be used as compost or converted into ethanol.
This new process is thought to be worth more than 8 million dollars a year to the US. The scientists plan to develop a small pilot plant to produce and test the experimental fuel within the next six to eight months.
So if you are a coffee lover this may just be another reason for you to love it more.
All in all, it's a cheap, environmentally friendly biodiesel to power our cars and trucks. I think the US should really start to utilize these wonderful low-cost sources of fuel. I mean, they are right at our fingertips!
Recycled toilet paper
Manufacturing regular old toilet paper has a big impact on the environment because it is made from virgin trees, with harsh chemicals used in the process. That nice soft toilet paper is worse for the environment than gas guzzlers, fast food restaurants, or huge homes. Ninety-eight percent of all toilet rolls in America come from virgin forests. Other countries are not spoiling their bums as much; in many European nations a rough sheet of paper gets the job done.
And many other countries use either partly or 100 percent recycled toilet tissue and don't complain about it.
A lot of people give in to the marketing of premium toilet paper. These days they have super soft, double strong, aloe, quilted, rippled, and even perfumed t.p. that I personally think we don't need!
And some may argue that the recycled stuff isn't as comfy, but in my opinion it's not bad at all!
Watch out when shopping for earth-friendly toilet paper: look for statements such as "unbleached," "processed chlorine-free," or "totally chlorine free."
If every household in America switched to green toilet paper that isn't chemically treated and is made from recycled materials, it would save over a million trees!
Even if it's plantation timber, those trees could be used for far better things, such as furniture or building materials, rather than for a temporary one-time use of wiping up poop.
Some people go as far as using cloths and just wash them.
Of course, the ideal solution would be to wipe out toilet paper altogether, which is an option with the new high-tech toilets on the market that can spray you clean.
And I might add, they do a much better job than TP could ever do!
India has adopted this concept, as has Japan... so America needs to get with the times and make these spray toilets more affordable.
But for now we can start by making a small change in our house with smarter consumer choices that would have a positive difference in the environment.
Happy wiping =)
Portland Eco Roof on 5th floor of Multnomah County Building
This isn't exactly breaking news... or news at all, for that matter. But if you live in Portland, or are visiting, you may want to check out a really cool eco roof on the fifth floor of the Multnomah County building at 501 SE Hawthorne Blvd.
Eco roofs or green roofs are simply roofs with plants and shrubs covering them.
It makes a nice addition to a roof and helps insulate the building it covers.
Eco roofs are sort of a new popular thing for trendy urban cities like Portland, and they are beginning to sprout up in random places. The cool thing about the eco roof on top of the Multnomah County building is the view.
The building is situated right on the east bank of the river and has a perfect view of the entire city.
If you are an eco-tourist, this is a spot to check out.
Fog and Pollution
So it's early morning, you're walking out your door on your way to work, and you notice it's one of those foggy mornings. You would hardly say to yourself, "Yay! It's foggy!" I know I hate driving in it, and I think most people think of it as a nuisance on their daily commute.
It's definitely a contributing factor to car accidents due to poor visibility.
So fog basically has gotten a bad rap.
But there is some good news about foggy mornings: scientists have recently revealed that fog may actually be helping clean up our polluted skies. This is especially good for people with respiratory health problems, since it helps clear out the airborne particles that are hazardous to them.
There are all different kinds of fog around the world, but the kind responsible for air pollution cleanup is called radiation fog, or ground fog, which mostly occurs in the morning.
So here's how it works:
on a clear night, heat from the ground radiates into space, leaving cool air just above the ground.
When the air reaches a specific temperature, water droplets form surrounding the pollution particles which creates radiation fog.
Later, when the sun rises, it evaporates the water droplets, cleaning the particles out of the air. Left behind on the ground are the dirt and dust.
I found out California is actually one of the 5 foggiest land areas in the world.
The bay area is pretty densely populated and air pollution is high due to cars, buses, airplanes, construction, and industry.
So this fogginess is actually a good thing for the area.
If you live in the city, the next time you see fog draping over the ground you can take a deep, cleansing breath and know it's as fresh as it's gonna get!
Cows have feelings too!
For many years, dairy farmers believed that naming your cows and giving them some individual attention would boost milk production, but only recently have scientists proven it's true.
The average cow produces 2000 gallons of milk a year and studies show that with just a little TLC they can produce up to 68 gallons more annually.
If a cow is stressed out and not given enough individual attention, its body releases a hormone called cortisol, which inhibits milk production.
Joe Bansen is the owner of an organic dairy farm in Portland, Oregon, and he knows all 165 of his cows by name!
He walks amongst them and talks to them, and even plays music during milking to keep them happy!
Now that is a devoted dairy farmer!
Some may call it crazy, but I truly respect that.
Cows have personalities and respond well to human interaction, just like any animal does. If more dairy farmers considered doing this, everyone would benefit, including the cows! I personally think all animals should be treated equally, and this is a great example of humane treatment of animals.
This kind of reminds me of the reason my mom's free-range chickens laid eggs all through the winter while her neighbor complained that her cooped-up chickens weren't laying at all.
It's because my mom's got to run around the property and enjoy life!
They were simply happier. I definitely would love to see a world of happy cows that each had a name!
Being slim and fit definitely has its perks. Your clothes fit better, you feel energized and healthy. But if someone told you that you could help the environment by putting down that super-sized burger and fries, you might react like I did: "Huh? How could giving up this delicious burger possibly help the environment?"
Well as you may know food production is a main contributing factor to global warming.
In Vietnam, the population consumes 20% less food and therefore produces fewer greenhouse gases than the UK, which has a 40% obesity rate, with the US trailing close behind at 32%.
Not only that, but transport-related emissions would be lower, because it takes less energy to transport slimmer people.
Just think how much less pollution would go into the air if every American were at a healthy weight.
Studies show that in many countries around the world the average body mass index (BMI) is rising.
That includes Australia, Argentina, Belgium, and Canada, to name a few.
Basically the world is getting bigger.
An author from the International Journal of Epidemiology said it well "When it comes to food consumption, moving about in a heavy body is like driving around in a gas guzzler. The heavier our bodies become the harder and more unpleasant it is to move about in them and the more dependent we become on our cars."
So being healthy doesn't just do our bodies good; it helps the environment too. I can admit that eating right and exercising can be hard at times, and it takes dedication, but think about all the positive things that come out of it.
Reducing emissions and living a longer healthier life!
The sun is out and it finally feels like spring has arrived.
Here in the Pacific Northwest where rain is too often despised...people can finally put their Prozac bottles away when the sun comes out.
Today I bought an all access day pass for the bus/streetcar/light rail and started a journey of exploration.
As I was walking to the bus stop, people from all walks of life were out enjoying the sun. I immediately noticed people were finally taking advantage of this really cool organic community garden near my apartment.
Community gardens were unheard of where I came from in California. But here in Oregon, they are probably the coolest thing ever if you are one of the thousands of people who live in an apartment or condo and don't have a backyard to grow stuff. Tending the garden is one of my favorite stress relievers, and it puts to use land that would otherwise be underutilized.
Community gardening is a great way to add culture and green space in a densely urban community, no matter what city or town you live in.
I'm pretty sure there is a local organization in Portland that helps people convert small plots of land into gardens too, but its name escapes me right now.
If you live by a community garden, you should really consider using it to grow whatever you like.
Or an even better project would be to contact your local neighborhood association and advocate for one if there isn't one close to you.
What would you grow if you had a garden?
Would you consider using a community garden if one was close to you?
Let us know, we'd like to hear your thoughts.
It's that time again for Saturday Market in Oregon, folks. Oregon is quite known for its lively Saturday Markets, which are full of local culture, handmade crafts, and farmers. Remember that supporting your local farmers stimulates the local economy, and purchasing food that is not shipped from hundreds or thousands of miles away helps prevent air emissions. Buying local literally spares the environment needless pollution from transportation by cars, trucks, ships, and jets. Don't forget to bring your reusable bag to the market... and sometimes free recipes are available to help you make a great dish from the produce that is currently in season.
It also never hurts to just ask someone about ideas for in season produce recipes.
The sun was shining for part of the day today as it marked the beginning of the Saturday Market season, despite the weatherman's grim forecast of rain, rain, rain.
We always love it when he's wrong...heh.
Oregon Saturday Market is a tradition that some Oregonians take for granted... but if it isn't one of yours yet, give it a try =)
Does the thought of this toddler playing in grass that was just sprayed with pesticides frighten you?
You might not think twice about when the last application was sprayed before letting your children loose at the park... but the fact is, parks, playgrounds, soccer fields, and nearby "weeds" are all regularly sprayed to control plants that are invasive or simply don't look pretty.
Organic lawn care would be the best bet for your own personal lawn.
It protects you, your loved ones, any people or animals that may lie in your grass, and the environment. I would also consider cutting your lawn with either a manual lawn mower or an electric one.
Gas powered lawn mowers are known to cause a significant amount of air pollution on their own and also contribute significantly to greenhouse gases.
In states that take pride in their salmon populations such as Oregon, pesticides applied to lawns or fields end up running off into nearby creeks and streams that can hurt the salmon populations as well as other aquatic life.
This is why there are strict rules about application of pesticides near bodies of water in Oregon...however, these protections are never enough.
We must all do our part.
Lawn care pesticides are often misused or improperly applied.
Precautions against the serious dangers associated with lawn care pesticides are often overlooked.
This may be part of the reason several parts of Canada have banned lawn care pesticides altogether.
Our beautiful environment and our health are worth the effort we put into protecting them.
Cheers and happy springtime.
Too often, daisies in the grass are thought of as an eyesore that must be treated with dangerous pesticides or herbicides. Springtime is around the corner, officially at 4:44 AM tomorrow morning, and I would like to convince people to view the daisies in their grass as a charming addition to their lawn.
Personally, I think lawns are a big waste of resources; however, if I must view a lawn while walking by a house, I would much prefer it to be blanketed in natural white flowers.
The pesticides and herbicides used to treat so-called weeds wind up polluting our local streams and groundwater. So don't be afraid to remind your neighbor that those yellow dandelions or white daisies are a welcome addition to the neighborhood.
We must keep in mind that pesticides are poisonous and can even harm our pets if we let them play on sprayed lawns. If you are a recent lawn to vegetable garden convert, leave us a comment and tell us how it's working for you.
Bus Rapid Transit System in Eugene, Oregon
Bus Rapid Transit or BRT is a transit system that tries to emulate light rail by design, concept, and functionality but with the cost savings of a bus system.
In Eugene, Oregon a relatively new BRT system called the EMX has been implemented as a public transportation option for Eugene and Springfield of Lane County in Oregon.
The EMX route serves as an alternative mode of transit via its arterial link between the interconnected cities.
While still in its infancy, some people complain that the system doesn't necessarily improve travel time.
However, Lane Transit District (LTD) argues that as congestion becomes a larger problem in Oregon's second largest urban area, the EMX will maintain its travel times with its dedicated bus lanes. Its low floor design allows for easy access for wheelchairs as well as people.
The 60-foot-long hybrid-electric buses offer a sleek look that well suits the size of Eugene and Springfield, all while sipping fuel and saving on costs. The EMX runs every ten minutes on weekdays to accommodate people's busy schedules.
Another prominent feature of bus rapid transit systems is their high-quality stations, which often feature sheltered or open platform stops. North America is starting to use BRT systems as the more economical alternative to light rail.
As the population center in Eugene-Springfield grows, hopefully the model of the EMX BRT system will be followed in similar population centers in the Northwest.
Clothes Drying Rack
This is my clothes drying rack. I am saving MONEY and ENERGY by not using a dryer to dry my clothes. Dryers are very energy intensive and switching to hang drying clothes can save you some money on that energy bill every month.
Home appliances usually account for an average of 17% of your energy bill, with refrigerators, washers, and dryers at the top of the list.
The formula to estimate your energy consumption is (according to the US Dept. of Energy):
Wattage × Hours Used Per Day ÷ 1000 = Daily Kilowatt-hour (kWh) consumption
1 kilowatt (kW) = 1,000 Watts
So, a clothes dryer may use anywhere from 1,800 to 5,000 watts depending on what you have in your home.
Say you run your dryer three hours per week:
1,800 watts × 3 hours × 52 weeks ÷ 1000 = 280.8 kWh for the year
Now take that value times the rate your utility provider charges you.
In this example we will use the 9.9 cents per kWh that I get charged by Portland General Electric.
280.8 kWh × $0.099 ≈ $27.80 per year
But suppose you have a high-powered 5,000-watt dryer because you have a large family and needed a bigger one, and you run it 6 hours per week:
5,000 watts × 6 hours × 52 weeks ÷ 1000 = 1,560 kWh for the year
1,560 kWh × $0.099 = $154.44 per year
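For anyone who wants to plug in their own numbers, here is a minimal Python sketch of the calculation above. The 9.9-cent Portland General Electric rate is just this post's example; substitute your own utility's rate.

```python
def annual_dryer_cost(watts, hours_per_week, rate_per_kwh=0.099):
    """Return (kWh/year, dollars/year) using the DOE formula:
    wattage x hours / 1000 = kWh, then kWh x rate = cost."""
    kwh_per_year = watts * hours_per_week * 52 / 1000
    return kwh_per_year, kwh_per_year * rate_per_kwh

kwh, cost = annual_dryer_cost(1800, 3)   # the small-dryer example
print(kwh, round(cost, 2))               # 280.8 kWh, about $27.80

kwh, cost = annual_dryer_cost(5000, 6)   # the big-family example
print(kwh, round(cost, 2))               # 1560.0 kWh, $154.44
```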
So as you can see, these are savings that could go towards your grocery bill, cell phone bill, or can be donated to your favorite organization.
The point is, once you start drying on a clothes drying rack...you'll feel good about being green and saving green.
Give it a try.
Global Warming Is Exaggerated?
storm surge gates, Port of Rotterdam
Photo:
Storm Surge Gates, Port of Rotterdam, taken by John Tarantino
According to an ongoing Gallup poll on the subject of global warming, new data suggests that in 2009, 41% of those surveyed (the most since the survey began in 1998) think global warming is exaggerated.
In general, 57% (down from a high of 66% in 2001 and 2006) of those surveyed still believe that global warming is either correctly portrayed by the media or underestimated.
However, this marks the highest level of skepticism on the subject of global warming since the polling began.
Other, more pressing issues, such as the global economic recession, could be the reason.
Still, we must not forget the pressing issues of climate change.
We can't forget that rising sea levels could eventually swallow low-lying countries like the Netherlands. The picture above is a storm surge barrier called the Maeslantkering.
It protects the port of Rotterdam, the 2nd largest port in the world.
It has been called a modern marvel and has been the feature of the Modern Marvels show on The History Channel.
These massive flood gates protect the Netherlands from nasty storm surges that would normally flood the densely populated Netherlands. But as sea levels rise in decades to come, these gates will get used more and more, and one day may become obsolete.
city bicycle lanes
How convenient would it be to get up in the morning and ride the good old bicycle freeway to work or around town?
Well in the Netherlands you can!
These bicycle lanes (painted in the clay-ish color) are just as wide as those car lanes next to them, and there is plenty of additional sidewalk space to line up bikes for parking too!
What's not in this picture is the center lane on this street is for buses only, and there is another wide bicycle freeway on the opposite side as well!
Talk about ease of use and plenty of room to cycle at speeds as fast as you can go without the worry of being hit by our gasoline based friends. These city bike lanes are the coolest thing I've seen while over on the other side of the world.
Back home in Oregon, cities like Eugene and Portland are trying to be more eco-minded about transportation, but the fact is they are not trying hard enough.
It's time we take back the pavement and promote boulevards where no cars are allowed.
Let's take a main artery in Eugene, Corvallis, or Portland and convert it into a car-free boulevard where alternative transportation can flourish.
Sounds like a good plan to me...and from what I can see, our Dutch friends have already got it figured out.
Why are we the ones always playing catch up with the rest of the world nowadays?
The train system in Europe simply dwarfs that of the United States. Riding the train in Europe is easy, convenient, and hassle free in comparison to flying.
Now that the stimulus bill has passed, thanks to the Obama administration, federal funding for high-speed rail may now begin to jump-start the nation's old railroad networks. California passed Proposition 1A last November, creating funds for an in-state high-speed rail line through the California High Speed Rail Authority.
The $40 billion project plans a line from Sacramento/San Francisco all the way down to Los Angeles/San Diego!
California will most likely get a large chunk of the $8 billion allocated for high-speed rail for this project.
But what about the Pacific Northwest?
The Cascade Corridor line from Eugene, Oregon to Vancouver, B.C. is a well established route that could benefit from federal stimulus money as well.
Currently, Oregon and Washington stand a chance to win $500 million in funding to increase ridership and raise speeds to 110 mph on some segments, all while reducing carbon emissions. Sounds like a good plan to me.
The remainder of the competitive bids can be seen at:
http://thetransportpolitic.com/2009/02/26/competitors-for-high-speed-rail-grants
Since I live in the Northwest, having a faster connection between Seattle and Eugene would make life easier and more convenient.
We'll know more about who gets the funding in the coming months...
FREEZING POINT , coming October 2008 from Berkley
Freezing Point, Karen Dionne
When a crisis arises, there are generally two types of people who come to the forefront of the situation:
Those who wish to solve the crisis, and those who would simply profit from it.
In the ongoing debate between altruism-versus-capitalism there never seems to be a winner, hence the continued split.
Some people are born to put themselves first, while others are predisposed to putting the needs of others above their own.
And in smaller situations, this is all fine and well, but what happens when the crisis is global and when selfish or selfless decisions will affect billions rather than a handful of people?
This is one of the issues facing the players in debut novelist Karen Dionne's environmentally-charged Antarctic adventure novel, Freezing Point.
Environmental activist Ben Maki has come up with a way to help ease the world's problem with contaminated drinking water:
melting Antarctic icebergs. With microwave technology that can safely melt icebergs, tens of millions of people can benefit and be spared the injustice of being unable to drink clean water. It's a grand idea.
But Donald Gillette, head of Soldyne Corporation, the company that is funding Ben's work, has other plans in mind for the water harvested from these efforts. Why use this water for the good of others, say, those in Third World countries, when you can bottle it up and make a pretty penny off the billions of gallons of water you'll be able to sell?
The ideological differences between Ben and Donald are apparent from the moment they appear together in the novel, and this is one of the novel's greatest sources of tension.
Unbeknownst to these two, however, there are two larger, very unexpected threats. One of them is found in the person of Rebecca Sweet and her ecoterrorist group, POP. To them, icebergs are part of nature and should not be trifled with, so when she and her fellow environmental enthusiasts hear about Ben's expedition and learn that a tanker will be transporting the water to Los Angeles for processing, they hatch a plan that aims to have explosive consequences for everyone involved.
Save the water, save the world, as it were.
The second problem, however, is even worse.
All the members of a research team on the Antarctic Peninsula have recently contracted a disease that is killing them off one by one, and the source of that problem is found in the water: the same water that came from the iceberg that split from the peninsula at the start of the novel and is now en route to Los Angeles. Of course, the reason the water there is contaminated is even more sinister.
But because of their isolation no one knows about this. Contaminated water is bad enough, but what happens if the tanker reaches L.A. and some of the contaminants don't get filtered out of the water before others get to drink it?
And what if POP is able to complete their mission of blowing the tanker sky high, sending the contaminated contents back into the ocean, thinking they are doing Mother Nature a favor?
Like any good yarn, this novel is not about the people involved, but the greater issues at stake.
Dionne wastes little time fleshing out the novel's characters (and in fact really doesn't give readers much of a reason to like, root for, or otherwise connect with any of them beyond their need to survive) and instead lets the action move the story along.
Can you be altruistic and still make a profit?
Is it evil or somehow unethical to do so?
To want to do so?
And when you are faced with life or death, which will you choose?
Will you do anything to keep yours?
To take that of another?
Freezing Point shines light on a very real problem—the lack of sufficient drinking water in the world for those in need of it—and the issues that surround the efforts of others to either address or ignore the issue.
Even the well-intentioned efforts of some can be used for evil, and even those that are meant for a sort of good can go awry because of unknowns. Sometimes people aren't the bad guys they are made out to be, but sometimes people aren't really heroes either.
One thing is clear in this novel:
everyone cares about the environment, just for different reasons. In the end, the question we are left to ponder is this:
Why do you care about the environment?
# Dutch Energy Neutral Homes
100 years ago there were approximately 10,000 traditional windmills in the Netherlands, used primarily for grinding grain.
Even back then, the Dutch had learned to transform wind energy into a more useful form.
Today, only about 10% are left, of which roughly two thirds are still in working order.
Still, these beautiful old-style machines remain an iconic symbol of history, nestled next to the modern turbines used for electricity.
While the European Union has set goals for reducing greenhouse gases, the Netherlands is sure to continue to use wind energy as an alternative to fossil fuels. My hope is that the traditional windmills in rural Netherlands will continue to be maintained for years to come.
Why We Should Conserve Water
Top Reasons To Conserve Water
It's long been assumed that the fall of the ancient Mayans was due to lack of food and water, among other things. The peak population of the Mayans was estimated at around 2,000 people per square mile near the cities (similar to what LA is today).
Persistent drought in the Southwestern and Southeastern United States is proving to be a larger issue every year. Even the water-rich Pacific Northwest is starting to feel the pinch from farmers demanding more water and from a growing population.
As a society, if we continue to mindlessly use as much water as we please without careful water management practices, we too could end up like the Mayan civilization.
Wars over resources are something we are beginning to understand are possible, given the war in the Middle East over oil (no matter what they say). This is why we should conserve water.
Everything You Need To Know About
Downspout Rainwater Collection Barrels!
DIY:
Do It Yourself
Build Your Own Rainwater Catchment System:
Step by Step Pictures for what to do.
What you need:
(1) 45 gallon drum cost ~ $10.00 check your local recycling center
(1) 3/4 inch NPT tap valve
(1) Boiler drain valve, total cost ~$10
(4) Fence posts ~ $10 each
(2) Eaves elbows, cost ~ $2 each
(1) Galvanized cable, cost ~ $4.50
(1) Clamp, cost ~ $3.50
Total cost ~ $71.50
End Results to the Right:
Costs are estimates.
Thanks to Durgan.org for the Guide.
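Since every price in the list above is a rough estimate, a quick sum makes a handy sanity check. This sketch just totals the parts (four posts at ~$10 each, two elbows at ~$2 each, both valves at ~$10 together):

```python
# Itemized parts from the guide above (all prices approximate).
parts = {
    "45 gallon drum": 10.00,
    "tap valve + boiler drain valve": 10.00,  # ~$10 for both
    "fence posts (4 @ ~$10)": 4 * 10.00,
    "eaves elbows (2 @ ~$2)": 2 * 2.00,
    "galvanized cable": 4.50,
    "clamp": 3.50,
}
print(sum(parts.values()))  # 72.0 -- in the same ballpark as the quoted total
```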
Buy Downspout Rainwater Collection Barrels
Do you want to be eco-friendly without all the hassle?
Aquabarrel is a huge product resource site for rain barrels/cisterns including different options for different needs. Below are some products they list:
# Rain Barrel Kits
# Rain Barrel Parts
# Downspout Diverters
# Filters
# Hoses
# etc.
Government Rebates for Downspout Rainwater Collection Barrels
Some cities and counties in drought stricken areas across the country offer rebates for the purchase of rain barrels. Below is a list I've compiled:
Santa Monica, California:
$40 rebate for rain gutter downspout redirect.
$100 rebate for a rain barrel. (limit 2)
$500 rebate for cistern (must be over 500 gallons, limit 2).
Palo Alto, California:
$50 rebate on rain barrels
$0.15 per gallon on cisterns (up to $1k for residential, or $10k for commercial)
Olympia, Washington (PDF form):
$20 rebate per rain barrel, 3 max for a total of $60 in savings.
Albuquerque, Bernalillo County, New Mexico:
Rainwater Harvesting rebates will be based upon the amount of rain that can be stored:
$25 for 50 - 149 gallons
$50 for 150 – 299 gallons
$75 for 300 – 499 gallons
$100 for 500 – 999 gallons
$125 for 1000 – 1499 gallons
$150 for 1500 gallons and over
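Albuquerque's rebate is a simple threshold lookup on storage capacity. Here is an illustrative sketch of the tier table above (check the county's current program before relying on it):

```python
def abq_rebate(gallons_stored):
    """Rebate for rainwater storage capacity, per the tier list above."""
    tiers = [(1500, 150), (1000, 125), (500, 100), (300, 75), (150, 50), (50, 25)]
    for threshold, rebate in tiers:
        if gallons_stored >= threshold:
            return rebate
    return 0  # under 50 gallons: no rebate listed

print(abq_rebate(200))   # 50 (falls in the 150-299 gallon tier)
print(abq_rebate(60))    # 25
```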
Austin, Texas:
Purchase a rain barrel through your water utility bill at a discount for $61; that's a $36 savings.
Springfield, Missouri:
$25 rebate for rain barrels, (limit 2)
Williamsburg, Virginia area:
Note:
*Must be James City Service Authority customer (JCSA)*
$50 rebate for rain barrels (limit 4)
Florida residents of Daytona Beach Shores, DeBary, DeLand, Deltona, Edgewater, Holly Hill, Ponce Inlet, Port Orange, Orange City, Ormond Beach, South Daytona, or unincorporated Volusia County:
$20 rebate on rain barrels (must be at least 50 gallons)
Allen, Texas:
$25 rebate for rain barrels, (limit 2).
Rebate of 50% of cost for SMART irrigation technology ($125 limit)
$50 rebate for rain/freeze sensor
More on other water saving devices
If you know of any more cities/counties/or states that offer rebates please email me or post...thanks.
Lawn:
Environmental Impact
The question bears asking: do we really need lawns?
Sure, they look aesthetically pleasing to some; others see their front yard as a potential space for use instead of looks. In my opinion, lawns are fine for some purposes, I suppose; however, they can easily waste water if they are not properly cared for. If you live in an area prone to drought, a lawn is downright wrong.
There are also negative environmental impacts to lawn care such as using a gas powered lawn mower, fertilizers, pesticides, and other chemicals. These can lead to adverse health impacts on children who play on lawns as well as pets, and can adversely affect the environment due to runoff.
There are many alternatives to lawns that look just as pleasing and require far less work and fewer resources, perhaps native species or drought-resistant plants.
Of course, this would require some research on the part of homeowners or the price of a professional horticulturist or landscaper.
One other alternative to lawns would be a vegetable garden.
You could plant different vegetables and legumes all year round depending on your climate region.
There is nothing more therapeutic for me than tending the garden. I love watching the plants grow, pulling the weeds, and most of all the harvesting. I once read that some Californians with decent-sized backyards grew mini vineyards and sold the grapes to nearby wineries.
Some facts you might be interested in on the Lawn Environmental Impact:
Studies show that, depending on age and model, a gas-powered lawn mower can emit as much pollution in one hour as a car driven 20 to 200 miles.[1] In one year, the average gas-powered lawn mower can emit the same amount of PM2.5[2] as the average car traveling about 3,300 km.[3] PM2.5 is a key component of smog and can have negative health effects on humans and the environment.[4]
What are your thoughts on the traditional suburban lawn?
1. Environment Canada, 2006, Clean Air Online: Compiled List of Quick Facts, www.ec.gc.ca/cleanair-airpur/default.asp?lang=En8n=2309FEF9-1 (accessed June 4, 2007).
2. Particulate matter under 2.5 microns in size.
3. Environment Canada, 2007, Criteria Air Contaminant Inventory.
4. Environment Canada, Statistics Canada and Health Canada, 2006, Canadian Environmental Sustainability Indicators, Statistics Canada Catalogue no. 16-251-XIE, Ottawa.
Gravity Feed Rainwater
The subject of drought has caught a number of headlines recently along with the proclamation that climate change is the culprit.
Drought doesn't care how it was caused or why it lingers.
Californians must begin to think of new ways to quench their thirst for water. Unlike the Pacific Northwest, awash with rain, California is all dried up. Rainbud, a company local to the Los Angeles region, is trying to convince Californians to gravity-feed rainwater by purchasing rainwater harvesting barrels. The barrels are made from recycled plastic and go for $125 for a 60-gallon barrel.
Check their site, Rainbud.com, for more details.
Rainbud cites several statistics that Californians should know about:
# Los Angeles has an average rainfall of 14 inches per year.
# The City of Los Angeles is planning to restrict homeowners' irrigation water use to two days per week.
# One inch of rain falling on a 1,000 sq. ft. roof produces over 500 gallons of water.
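Rainbud's third statistic is easy to verify. A cubic foot of water is about 7.48 US gallons (a standard conversion, not a figure from the post), so a back-of-the-envelope calculation looks like this:

```python
GALLONS_PER_CUBIC_FOOT = 7.48  # standard US conversion

def roof_gallons(roof_sqft, rain_inches):
    """Gallons of runoff from a roof for a given rainfall depth."""
    cubic_feet = roof_sqft * (rain_inches / 12)
    return cubic_feet * GALLONS_PER_CUBIC_FOOT

print(round(roof_gallons(1000, 1)))   # 623 -- "over 500 gallons" checks out
print(round(roof_gallons(1000, 14)))  # 8727 gallons in an average LA rain year
```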
If you live in Santa Monica, California the city is offering $100 rebates for the purchase of a rainwater harvesting barrel.
Taking a stance to use nature's water for your outdoor needs is much more efficient and environmentally friendly.
Due to drought and water shortages, more and more cities are considering building or are building costly desalination plants. Do you or anyone you know use rainwater barrels?
How are they working out for you?
The Netherlands and Bicycle Heaven
I will preface this post with the fact that I am on a work trip in the Netherlands for a few weeks, and what I have seen here is quite amazing. I honestly have never seen such efficient transportation systems in my life.
Carefully planned bicycle lanes have almost as much room as the car lanes beside them, on both sides of the road. The bicycle lanes even have their own traffic lights, integrated with the normal vehicle traffic lights.
During rush hour in the morning and evening, alongside moderate car traffic, there is also something I have never seen before...bicycle traffic.
When cyclists wait at their stop lights, the trail of bicycles starts to back up, and voila...bicycle traffic.
Motor scooters are also allowed in the bicycle lanes if their engines are below a certain displacement (cc).
The motor scooters pass by frequently but not as much as commuter bicyclists.
I have been staying in southern Netherlands, in a city called Eindhoven.
It reminds me of Eugene, Oregon, and what Eugene could be if it somehow got its act together. Eindhoven, in my opinion, is the utopian vision many people candidly dream of in the many town hall meetings I've attended in Eugene on its downtown development, new city hall plans, and new transportation objectives.
I believe the population of Eindhoven hovers around 200,000 so perhaps one day Eugene will blossom into a better city in the decades to come.
However, Eugene needs to promote economic stimulus/growth and not be afraid of smart development.
Smart Grid in our Future
Google is working on a prototype tool called Google PowerMeter to help energy consumers monitor and track their energy consumption.
It would require a smart meter, which utility companies are now upgrading to, along with Google's PowerMeter software. This would save the utility company from sending a worker to read your power meter every month.
Instead, your power reading would be sent over a secure wireless signal to the utility company.
A smart grid will enable more efficient energy delivery by calculating delivery routes through a decentralized energy network.
If you live in the Portland, Oregon metro area...Portland General Electric claims that once the system is fully up and running (estimated to be in September 2010), the benefits would be:
Saves money:
PGE anticipates millions in savings per year with the installation of smart meters. Controlling costs helps keep electric rates as low as possible.
Saves the environment:
Fewer meter-reading vehicles will cut 1.2 million miles of driving, save 80,000 gallons of gasoline and reduce CO2 emissions by 1.5 million pounds every year.
Speeds power restoration:
Further down the road, the new meters will be able to tell PGE if you're experiencing a power outage.
That can help us dispatch repair crews more efficiently and speed the restoration of service.
Helps you save energy:
When the entire system is up and running, you'll be able to access detailed information about your power consumption.
Together, we can help you plan energy-saving strategies.
The last point is where Google hopes to work with utility companies to provide a service for energy consumers who want to view a breakdown of their electricity usage.
If you know what, when, and how you are consuming energy (as seen in the picture above), you are likely to save 5%-15% on your energy bill every month.
Google claims that if the entire United States were to really utilize a program like the one they are implementing...the energy savings would be equivalent to taking 50 million cars off the road!
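To put that 5%-15% range in household terms, here is a tiny sketch. The $100 monthly bill is my own hypothetical figure, not one from Google or PGE:

```python
def annual_savings(monthly_bill, savings_rate):
    """Yearly dollars saved at a given fractional savings rate."""
    return monthly_bill * savings_rate * 12

# A hypothetical $100/month bill at the low and high ends of the range.
for rate in (0.05, 0.15):
    print(f"{rate:.0%} savings -> ${annual_savings(100, rate):.2f}/year")
```

That works out to $60 to $180 a year on such a bill.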
Looks like our energy future is bright =)
Eco-friendly printing uses soy and other vegetable inks instead of petroleum-based inks. Soy ink is created from lightly refined soybean oil combined with pigment, resins, and waxes. Soybean oil is naturally clearer than its petroleum-based counterparts and therefore requires less pigment to create brightly colored inks. This reduces the amount of chemicals put into the ink.
Soy ink can also be more easily removed from paper which reduces the amount of damage done to paper fibers during recycling and increases the yield of reusable product.
Furthermore, the waste produced by recycling soy and vegetable inks is not hazardous and can be treated much more easily.
The paper used is 100% recycled and processed chlorine-free, which has obvious benefits for the environment.
The recycling process will always require some amount of virgin pulp, which is pulp that has never been used in the production of products before.
Eco-friendly printers use sustainable tree farms to collect virgin pulp unlike other printers who rely on old-growth forests.
There are even more benefits regarding how "green" printing facilities are operated including being run on 100% wind power and investments in renewable energy credits. This dramatically reduces their impact on the environment.
By combining the advantages of soy inks, recycled paper, and environmentally friendly facilities you can make a big difference on how you're impacting the Earth.
"Green" printing is also very affordable when compared to traditional printing and the products produced have a very ethereal and professional look and feel.
There is really no downside to involving yourself in eco-friendly printing practices.
Congress is now deciding which federal programs will be funded in 2009.
Among those programs are the Green Jobs Act, which would invest $125 million in green-collar job training programs, and the Energy Efficiency and Conservation Block Grant, which would authorize grants to local communities to help improve their energy efficiency and increase renewable energy.
Now we must make sure that Members of Congress keep their promise and fully fund these programs.
Please ask your Senators and Representative to support full funding for the Green Jobs Act and the Energy Efficiency and Conservation Block Grant in this year's Appropriations Bill.
Take a moment to personalize your letter:
Using your own words will deliver the most powerful message!
Go to this link for a sample letter to send to your congressmen.
Personalize the letter and tell them why it is so important to have legislation that will move our country forward.
People are always looking for more inventive and unique ways to reduce their carbon footprint, from hybrid cars to electric bikes. It's always a treat when you find a way to help the environment that costs no more than traditional methods would, which is why I was pleased to learn about another company that can help you make a positive impact on the way the world works.
I've recently been introduced to the Director of Design at www.ShaverDesign.com, which utilizes eco-friendly printing methods that use 100% recycled paper and only water based coatings and soy and vegetable inks. The printing factory itself is run on 100% wind power so you are not only saving the environment in the products you buy, but the companies you choose to do business with are supporting the environment as well.
It may not seem like much but the statistics show how real the impact can be.
By ordering, for example, just 500 "green" business cards instead of ones printed using traditional techniques, you will save 0.02 fully grown trees, 4.4 gallons of water, 1 pound of solid waste, and 1.6 pounds of greenhouse gases. Numbers don't lie.
The fact of the matter is that they add up to make a very real impact in the environment.
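Those per-500-card figures scale to any order size. Here is a sketch using only the numbers quoted above; the linear scaling is my own assumption:

```python
# Savings per 500 "green" business cards, from the figures above.
PER_500_CARDS = {"trees": 0.02, "water_gallons": 4.4,
                 "solid_waste_lb": 1.0, "greenhouse_gas_lb": 1.6}

def print_savings(cards):
    """Scale the per-500-card figures to an arbitrary order size."""
    for what, amount in PER_500_CARDS.items():
        print(f"{what}: {amount * cards / 500:g}")

print_savings(5000)  # ten times the per-500 figures
```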
If you are ever in need for business cards, brochures, flyers, or any other printed material make sure to check out a company that offers eco-friendly printing along with their custom design services such as Shaver Design.
Sumatran Tigers are being sold into Extinction
The Sumatran Tiger is found only on the Indonesian island of Sumatra.
If some conservation effort doesn't go into effect quickly, these majestic creatures will go extinct, joining three other tiger subspecies: the Javan tiger, the Caspian tiger, and the Bali tiger. Each of these became extinct due to habitat loss and poaching.
Laws to protect the Sumatran tiger are failing to prevent body parts of the critically endangered animal from being sold openly in Indonesia, according to a report released February 14th 2008.
Tiger body parts, including canine teeth, claws, skin pieces, whiskers, and bones, were on sale in 10 percent of the 326 retail outlets surveyed during 2006 in 28 cities and towns across Sumatra.
Outlets included goldsmiths, souvenir shops, traditional Chinese medicine shops, and shops selling antique and precious stones.
"Because of poor enforcement, the Sumatran tiger is slipping through our fingers," said Leigh Henry, program officer for TRAFFIC North America.
There are only about 400 Sumatran tigers left and such a small population can't sustain this level of poaching.
If enforcement and political will are not bolstered, the Sumatran tiger will be wiped out just as the Javan and Bali tigers were.
Sumatra's few remaining tigers are also under threat from rampant deforestation by the pulp, paper, and palm oil industries. The combined threats of habitat loss and illegal trade, unless tackled immediately, will be the death of Indonesia's tigers.
"The Sumatran tiger is already listed as Critically Endangered on IUCN's Red List of Threatened Species, the highest category of threat before extinction in the wild," said Jane Smart, head of IUCN's Species Program.
"We cannot afford to lose any more of these magnificent creatures.
All in all, tigers are a beautiful and rare species and we should do everything we can to stop them from becoming extinct. I think it would be a tragedy if these animals were wiped off the earth.
The more people who are aware of these senseless killings, the more we can come together and take action to stop them.
Why is the species important?
The tiger is a powerful symbol of admiration among the variety of cultures that live across its range.
They command respect, awe, or fear from their human neighbors. Even in places where tigers have become extinct or never existed in the wild, they live on in myth and legend. As top predators, they keep populations of wild ungulates in check, thereby maintaining the balance between prey herbivores and the vegetation upon which they feed. A myriad of other life-forms are essential to support a healthy tiger population.
A reader of this blog was wondering why algae hasn't been a topic of discussion here. I had heard about the efficiency of algae for biofuels before, but never looked at it seriously because the technology was in its infancy at the time.
To my amazement, reading about algae as a scrubber and fuel source makes perfect sense.
Now people are beginning to actively promote algae as a carbon dioxide scrubber and biofuel producer. An article from New Scientist has shown that CO2 can act as a feedstock rather than a waste byproduct.
In what some call an algae bioreactor, algae are fed carbon dioxide, which, in conjunction with sunlight, helps them flourish; the resulting algae can then be used to make biodiesel or even hydrogen for local and domestic transportation.
This project has already been successfully piloted and should be getting more attention than proposals for carbon sequestration (the storage of carbon dioxide in the Earth).
My friend who is promoting this concept through his blog has a great take on the future of bio algae.
There is also a link to a do-it-yourself algae reactor which gives the amateur a chance to build a mini reactor and begin transforming carbon dioxide into a potential fuel source.
The reasons to go with algae, as stated by oilgae.com:
1. The yields of oil from algae are significantly higher than those from traditional oilseeds
2. Algae can grow in places far from farmlands & forests, minimizing damage caused to the environment and food chain.
3. Algae can be grown in sewages and next to power-plant smokestacks where they convert the pollutants and give us oil!
The innovation can be better appreciated by watching the following YouTube video.
Humpback Whales making a comeback!
A new study released May 22, 2008 shows that the population of humpback whales in the Pacific Ocean is improving dramatically.
There are now nearly 20,000 swimming in the Pacific Ocean, compared to the 1,400 counted in 1966.
Thankfully, the International Whaling Commission introduced a ban on commercial humpback whaling in 1966 to prevent extinction.
Conservation programs deserve much of the thanks for these dramatically improved numbers. Although humpback whales off the coast of Asia are still endangered, conservationists are very optimistic they can bounce back.
The world needs to see more examples like this to prove that it is possible to make a difference.
Wanna be a Green Author?
Chelsea Green Publishing is a green publishing company that is a sponsor for some of the books on this site that The Environmental Blog team has reviewed.
They are currently hosting a contest to get more authors with a focus on the politics and practice of sustainable living.
For complete contest rules, check out their blog.
If I can ever get a full proposal done I would love to submit one myself. For those of you who are English teachers or freelance writers out there, this could be your chance.
I believe the winner of the contest will get a $1000 royalty advance and a publishing contract with Chelsea Green Publishing.
Who wouldn't want to be published?
Check it out and good luck. =)
Endangered Giant Pandas
One of the most adored and rare animals of the whole world is the Giant Panda.
Their total population is alarmingly low, at around 3,000.
It is one of the most critically endangered species in the world, with only about 1,000 left in the wild.
They are threatened by poaching, human encroachment, habitat loss and trouble breeding in captivity.
This is why it is important for people to learn about them, so that they do not disappear forever. Many of them are in captivity, yet very few are displayed at zoos in the US. Sustaining these animals is very difficult because pandas have little desire to mate once in captivity.
Chinese pandas have even been given Viagra in hopes of boosting their desire to mate.
Many other strategies have been attempted, including cloning panda embryos, so-called "panda porn" (explicit video of pandas mating), traditional herbs and artificial insemination.
So far researchers haven't made much progress with any of those methods.
I found it interesting that these animals are so popular that US zoos pay the Chinese government $2 million a year essentially to rent a pair of giant pandas. The pandas come on a 10-year contract, and any babies born can increase the payment by up to $600,000 more.
Giant Pandas are on loan to zoos in Atlanta, Washington D.C., San Diego, and Memphis. There is an impression in China that US zoos are very rich, because the zoos visitors see here are beautiful, so we pay much more than the $300,000 annual rate that Australia and Thailand pay.
Now that doesn't seem fair, does it? US zoos are barely breaking even with such high costs and maintenance.
Not only is the annual payment pocket-breaking, the upkeep for their 100% vegetarian diet is very high.
Pandas eat 84 pounds of bamboo daily which costs five times more than that of the next most expensive animal, an elephant.
Luckily, in Atlanta 400 volunteers grow bamboo in their backyards to help out.
These animals are on the "red list", which means they are critically endangered.
Many people in the world love Giant Pandas; there is even such a thing as "pandaholics."
Not only that, but there are live web cams on zoo websites that people watch just to keep an eye on their favorite animals. With so much love going around for these docile animals, there has got to be a way to keep them around for good.
To help keep Giant Pandas around, you can Adopt A Panda through the World Wildlife Foundation.
Electric Bicycles - Eco Transportation
I wanted people to know that I am on a mission to boycott driving my nice Honda Civic Hybrid.
Even though I drive a hybrid, the price of oil on the market and the rising cost of gasoline have pushed me to boycott the oil companies. I have been experimenting with the best eco-friendly way to get to my work, which is 7 miles away.
It is not that far, so I tried an old electric scooter I had which only got me about half way to work. I had to call my co-worker to come pick me up.
Then I thought about riding my bike, but I would need to buy a city cruiser, because my mountain bike is not suitable for city bicycling.
But a recent eco-tip from Scooters N' Bikes opened my eyes to the electric bicycle. I guess in the back of my head I always knew they existed, but I have never seen or heard of anyone actually using one.
So, I challenge all my readers to think long and hard about how close you live to work, and the viability of using one of these electric bicycles instead of driving.
These are the stats for this innovative mode of transportation:
# In California, PG&E's baseline rate is $0.1156 per kilowatt-hour.
# At baseline CA PG&E prices, you can fully charge for only $0.18.
# That's like paying less than a penny a mile!
# Compare that to a car getting 20 mpg, with gas costing $4.50 a gallon in CA, which comes to about $0.22 a mile.
# By that comparison, the electric bikes get the equivalent of 440 MPG!
# Max Speed: 18.8 mph
# Range: 17.6 miles with light pedaling
Obviously you would have to compare the prices of your own local energy provider to find out how much money you would actually be saving.
But no matter what your energy costs are, this mode of eco transportation is sure to be cheaper than driving a car.
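The per-mile math above can be sketched in a few lines of Python, so you can plug in your own utility's rate. The $0.18 charge cost and 17.6-mile range are the quoted figures for this bike:

```python
# Cost-per-mile comparison: electric bicycle vs. a 20 mpg car.
rate_per_kwh = 0.1156   # PG&E baseline rate (USD/kWh)
charge_cost = 0.18      # quoted cost of a full charge (USD)
range_miles = 17.6      # quoted range with light pedaling

gas_price = 4.50        # USD/gallon (CA, at time of writing)
car_mpg = 20

bike_cost_per_mile = charge_cost / range_miles
car_cost_per_mile = gas_price / car_mpg
mpg_equivalent = gas_price / bike_cost_per_mile

print(f"e-bike: ${bike_cost_per_mile:.3f}/mile")
print(f"car:    ${car_cost_per_mile:.3f}/mile")
print(f"e-bike gasoline-equivalent: {mpg_equivalent:.0f} MPG")
```

Swap in your own provider's rate (and a battery size, if you know it) to see how the numbers work out where you live.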
I am placing my order after the completion of this blog. I seriously don't understand why more people aren't using this as a primary mode of transportation, especially if you live in a downtown area like I do.
A Closer Look at Seal Hunting
Seal hunting is the personal or commercial hunting of seals for their pelts and blubber. Hunting also ensures the population does not reach levels that would threaten other species. But is it really necessary to put these animals through extreme pain and suffering just for a measly waterproof jacket? I recently learned that high-end designers such as Gucci, Prada, Dolce & Gabbana and Versace sell fur coats, accessories and trinkets made from seal pelts. The main targets of these inhumane killings are baby seals that have just grown out of their white coats, usually 12 days to 4 months old. I say inhumane because it TRULY is a brutal killing.
The most common weapon used is called a Hakapik which is a heavy wooden club with a hammer head and a metal hook at the end.
Some people may argue that the weapon is designed for a quick painless death but that is far from the truth.
Reports in 2001 showed that 79% of the clubbers did not check to see if the animal was dead before skinning it.
They also showed that 40% of seal hunters had to strike the seal a second time, presumably because it was still conscious after the first blow; 42% of killed seals examined were found to have minimal or no fractures, suggesting a high probability that these seals were conscious when skinned.
There are regulations for these killings and a lot of them are not being followed.
They state that:
"Every person who strikes a seal with a club or hakapik shall strike the seal on the forehead until its skull has been crushed," and that "No person shall commence to skin or bleed a seal until the seal is dead," which occurs when it "has a glassy-eyed, staring appearance and exhibits no blinking reflex when its eye is touched while it is in a relaxed condition.
It seems to be a losing battle to save these animals from this painful death, especially now that world demand for seal pelts is growing.
Pelt prices are at their highest in years and fashion industries are even thinking of new ways to market these products to change what was once a struggling industry.
If you want to help make sure seal killings don't continue, you can do so by not purchasing such brands. So if you ever find yourself shopping at one of these designer stores for a new wallet, you might think twice about where it came from.
It's up to us to make sure these industries aren't profitable!
The Presidential Candidates and Where They Stand on the Environment
Now that Barack Obama has secured the Democratic Party's nomination, the list of presidential front runners becomes increasingly shorter; this also means the list of possible policies that affect the environment becomes that much more definitive.
But where does each candidate stand exactly?
In order to help you be more informed come November, the very helpful website OnTheIssues.org attempts to list "Every Political Leader on Every Issue," including the environment.
The site displays a verbose list of stances, quotes and even past voting records on legislation all pertaining to environmental issues. So let's see how your favorite candidate stacks up:
Sen. Barack Obama (Democrat, IL):
* Regulate animal feeding operations for pollution. (Feb 2008)
* Will end the notion of Yucca Mountain nuclear storage. (Jan 2008)
* Promote green technologies and fuel efficiency standards. (Dec 2007)
* Protect the Great Lakes & our National Parks and Forests. (Aug 2007)
* Voted YES on including oil & gas smokestacks in mercury regulations. (Sep 2005)
Sen. John McCain (Republican, AZ):
* Support much tougher regulations on emission requirements. (Jan 2008)
* Economic &environmental interests not mutually exclusive. (Sep 2007)
* 1996: Put 3.5B acres of land into wilderness protection. (Jan 2004)
* Voted NO on reducing funds for road-building in National Forests. (Sep 1997)
* End commercial whaling and illegal trade in whale meat. (Jun 2001)
For the list in its entirety along with context compared to other presidential candidates, you can visit the site's environment section.
But why stop at presidential candidates?
Check out where your Senators and the Congressman in your district stand on the environment.
WHO:
BigCarrot (www.bigcarrot.com), the Web's premier resource for creating, funding and claiming inducement prizes, is leading the way for consumers and organizations alike to take advantage of them to facilitate innovation and change.
BigCarrot was founded to allow those with common interests to collaborate and pool their resources to create an inducement prize—a method for innovation previously only available to those with substantial fiscal capital.
WHAT:
BigCarrot is offering bloggers an opportunity to advocate for their causes through the creation of an inducement prize on the BigCarrot Web site.
Bloggers who are looking to spur innovation or promote change are invited to submit their inducement prize concept(s). Once concepts are received, BigCarrot will choose ten prizes, from among the submissions, for inclusion on the BigCarrot site.
As part of the promotion, BigCarrot will waive the initial inducement prize fee of $100.
In addition, BigCarrot will help launch the prizes with a minimum initial contribution of $250.
The value of the prizes will increase as interest and demand for a prize grow among the site's visitors.
BigCarrot welcomes inducement prize concepts in the areas of:
computers, electronics, environment, mathematics, household, medicine, science, society and sports.
Chinese Pollution Plan Includes Tougher Fines
BEIJING (Reuters) - China's water and air are straining to cope with the country's industrial take-off, the government said on Monday, vowing steps to make polluters pay more for environmental damage.
The cabinet gave a stark picture of ecological damage when belatedly releasing the country's plan for environmental protection from 2006 to 2010.
"The conflict between our country's economic and social development and environment and resources grows starker by the day," said the plan, released on the government's Web site (www.gov.cn). "Environmental protection faces severe challenges."
The five-year plan urges officials to abandon a fixation on economic growth and turn to cutting pollution that the document says threatens citizens' health and long-term prosperity.
At the end of the previous five-year plan that ended in 2005, water in 26 percent of "key" lakes and rivers targeted for clean-up was so contaminated that it was classified as unfit even to touch or to irrigate crops.
Emissions of sulphur dioxide, the industrial pollutant that causes acid rain, grew by almost a third, despite a goal set in 2000 to cut emissions by 10 percent.
The new plan brings together a set of policy promises that have mostly been announced before:
cutting two key pollution measures by 10 percent between 2006 and 2010; healing stretches of spoiled lakes and rivers; and recycling much more of the waste and domestic run-off from growing towns and cities.
Officials are urged to do more to rein in greenhouse gas emissions by embracing energy-saving technology and policies and growing more forests to absorb rising carbon dioxide levels that scientists say are dangerously heating the atmosphere.
Beijing has set similar goals before, only to be overwhelmed by the break-neck economic growth that many local officials see as vital to more jobs and revenue.
But the new plan spells out that the government wants to tie environmental goals to policy measures that may hit polluters through tougher fines and revenue restrictions.
"Clean-ups of industrial pollution must follow the principle that the polluter bears responsibility," the plan states, also urging tax reforms to discourage waste and pollution.
"Give scope to the role of price levers and establish a pollution emissions pricing and fees mechanism that reflects the costs of cleaning up pollution," it says.
Trading in emissions quotas for sulfur dioxide may also be introduced in areas where "feasible."
http://beturtle.com is a website that offers various resources for going green.
It offers "green" zones where you can learn about topics such as pollution and global warming. I thought it was funny how the pollution section mentions that the Hollywood sign is sometimes not visible due to L.A.'s horrible smog and pollution problem.
Well over the summer when I was in Hollywood, this was the case.
You could barely see the famous Hollywood sign. I tried to take a picture, but of course the smog was too thick to see the sign.
The information zone is a great resource for those eager to pick up a little knowledge about the environmental problems currently facing the world.
The main attraction of the site is that it is a green social networking site.
Much like MySpace, it is a place where green users can come together, make profiles about themselves and browse other profiles. I personally think this is a great idea; now I can find other users with like-minded interests. If you sign up, be sure to add me as your friend; my username is tecknopuppy. I hope to see you there.
The site has a lot of fun little sections like polls, discussions on green topics, and a little green celebrity information.
It also looks like it is getting ready to ramp up with new features that will surely be a hit, including a section on environmental bloggers and a beTurtle board of advisers to help the site along.
Check the site out and sign up, and don't forget to add me as your friend (username:tecknopuppy).
at 12:35 PM, 7 comments
Sunday, November 18, 2007
Honda Fuel Cell Cars Released in Limited Quantity
As mentioned in my previous post back in February, Honda is releasing its first Fuel Cell Vehicle for lease in the L.A. area.
They are only letting 100 Honda FCX Clarities go for now, to be leased at $600 a month.
Pretty pricey, but the only by-product of the vehicle is water vapor. In a hydrogen fuel-cell vehicle, hydrogen combines with oxygen in the vehicle's fuel-cell stack, and energy from the reaction is converted into electricity to power the vehicle.
Pretty neat, huh?
The only question is: where are the people leasing these new Fuel Cell Vehicles going to get hydrogen?
Honda has an answer for that too!
Now, if you're an environmentalist you may not be very happy with the source, because Honda has developed a home fueling station that converts natural gas (methane) into hydrogen.
So essentially you have the convenience of your own fueling station at home, but you need existing natural gas lines in your home for it to work.
Furthermore, it seems a bit ironic that the new Fuel Cell Vehicle is supposed to run on hydrogen, a very abundant element... but gets it by converting a depleting resource: natural gas. Natural gas is in just as much jeopardy as oil.
And the United States imports a lot of natural gas from Canada as it is.
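Honda hasn't published the station's exact chemistry here, but converting natural gas to hydrogen is typically done by steam methane reforming, which still releases the gas's carbon as carbon dioxide:

```latex
\mathrm{CH_4} + \mathrm{H_2O} \rightarrow \mathrm{CO} + 3\,\mathrm{H_2} \qquad \text{(steam reforming)}
\mathrm{CO} + \mathrm{H_2O} \rightarrow \mathrm{CO_2} + \mathrm{H_2} \qquad \text{(water-gas shift)}
```

Net: one CO2 molecule for every four H2. So even though the car's only tailpipe by-product is water vapor, the fuel production step is not carbon-free.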
Eventually, the next stages of the Honda Fuel Cell project will be rolled out in the Washington DC and New York areas, beginning next year. Hydrogen fuel cell vehicles are a step in the right direction and very futuristic, but we need to develop a cheap and reliable way to extract hydrogen that doesn't consume an already depleting resource.
at 7:34 AM, 7 comments
Labels:
energy
Sunday, November 11, 2007
Oil Spill in San Francisco Bay Area
The San Francisco Bay Area has recently experienced its worst oil spill in over twenty years. Rescue workers hurried to save any wildlife that may have been harmed.
Oil cleanup crews raced to clean the 58,000 gallons of bunker fuel that leaked out.
The cause was a collision with a support tower of the San Francisco-Oakland Bay Bridge, which left a 100-foot gash in the vessel and caused the leak.
It's sad because this is where I grew up, and hearing about dozens of beaches closing and rescue workers scurrying to minimize environmental damage is hard.
The governor even declared a state of emergency in the area as the investigation continues. Someone has to be blamed for this accident.
The cleanup will continue on for weeks and even months, and there will be long lasting effects.
In case you missed the news, watch the video below.
Global Carbon Tax?
Would you support the first ever worldwide tax called the Global Carbon Tax?
It doesn't exist, but it is being talked about among some EU members and some corporations around the world.
How would such a tax be implemented?
Is it ethical?
Would the tax be justified?
As much as I love the environment and believe that something must be done before we destroy our planet, I am completely against a worldwide global carbon tax.
It is especially unfair to third-world countries, who are the victims of first-world pollution and greenhouse gases. Nevertheless, there are many supporters of a carbon tax of some sort, whether citywide, industry-wide, or statewide.
Some prominent global leaders are awaiting a successor to the Kyoto Treaty, which hasn't done very much to mitigate the harmful effects of climate change.
One could argue that the effects of climate change are now being seen with odd and extreme weather across the world.
Most companies, including Barclays, advocate a global carbon tax, and are, somewhat optimistically, waiting for a successor to the Kyoto Protocol that includes such a move. -climatechangecorp.com
Establishing a global market for carbon by setting a value on carbon might be a way to handle the issue.
Europe already has a system like this: the European Union's Emissions Trading Scheme (ETS). The scheme covers almost 30 countries, and 40 billion euros worth of emissions are allocated to it.
The scheme is being influenced by transactions that have occurred in the US and in Asia, so I speculate that this will truly become an international system given enough time.
My guess is that if there were a global carbon tax, it would be governed by the United Nations. The research it conducts on climate change with the Intergovernmental Panel on Climate Change (IPCC) includes findings that suggest global turmoil by the end of the next century if nothing is done.
Their most probable excuse for trying to implement such a tax would be "A global problem requires a global tax", or something to that end.
Once again, I cannot ever support something like this. Would there even be a vote?
There are far too many questions surrounding the administration of such a tax. I think that if people do care about something like this, then they should vote on it state by state in the US.
For now, there is no such thing as a global carbon tax, but perhaps this issue will get more media attention in the near future.
Hybrid Vehicle Quotes
There is a website that has Honda car quotes, as well as quotes for other cars if you are in the market to buy. I wish I had used this service before I purchased my 2006 Honda Civic Hybrid last year; I am sure I could have saved a buck or two. For those environmentalists out there who are planning on purchasing a new hybrid or flex-fuel vehicle, check out this site.
You'll get up to 2 quotes, and it's fast and FREE. Please remember that the 2008 models include a lot of hybrids to choose from, as well as cars that will run on bio-fuel.
Two hot new hybrids among the 2008 models are the Saturn Aura and the Saturn Vue. They are the best-priced hybrids, and they have also been getting a lot of attention in the media.
Take your time and make sure you are happy with your new eco-friendly car. The Toyota Prius is still a very popular choice.
Check out the site and be merry.
at 5:36 PM, 2 comments
Labels:
hybrids
Saturday, November 3, 2007
Recycled Materials used for Home Building
A project in Bolivia has taken materials that are rampant in the area, such as plastic bottles and bags as well as glass bottles, and filled them with sand to create sturdy, well-reinforced walls.
The buildings are set to become a tourist attraction that will educate visitors about the possibility of re-using materials we already have to create plausible living conditions for all.
Ideas like these are necessary to further reduce the worst effects of climate change.
It also helps alleviate the costs of traditional building by using readily available materials for new purposes. I'm truly excited when I see found goods put to good use; it inspires me with hope for the future and also really makes me want to build my own version.
Leave any comments you may have.
at 1:39 AM, 2 comments
Labels:
energy
Energy Efficient Home Improvements
Anglian Home Improvements is a company, established in 1966, that is dedicated to improving homes. Their website offers numerous methods of improving all aspects of your home, including double glazing of windows for energy efficiency; their windows are among the best selling and are in fact rated Category B for energy efficiency.
By improving your home with these double-glazed windows, you trap heat inside your home and thus reduce the amount of energy consumed.
In fact, according to the energy efficiency section of their website, nearly one fifth of household heat can be lost through single-glazed windows. So if you're in the market to make some "green" upgrades, make sure to check these guys out.
Cheers.
at 12:09 AM, 0 comments
Friday, November 2, 2007
PG&E and Kaiser Permanente go Green
PG&E and Kaiser Permanente in Northern California have gone green by installing solar panels on their buildings to reduce the amount of energy pulled from carbon emitting sources.
The move will cut costs in the long run and qualifies them for $1.6 million in rebates from the state.
It makes you wonder why every business is not investing in some form of solar energy as a supplement to their current energy.
It cuts costs, and businesses can actually see returns on the investment over time and through state-financed incentives.
Kaiser Permanente will be saving 734,354 kWh per year, which adds up to an annual savings of about $95,000. It is also the equivalent of taking 56 cars off the road.
The more businesses that go green in California the better; hopefully it will lead to other businesses following suit.
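As a quick sanity check on those figures, the quoted savings imply an electricity rate of roughly 13 cents per kWh, which is plausible for Northern California:

```python
# Back-of-the-envelope check of the Kaiser Permanente figures above.
annual_kwh_saved = 734_354
annual_dollars_saved = 95_000

implied_rate = annual_dollars_saved / annual_kwh_saved
print(f"implied electricity rate: ${implied_rate:.3f}/kWh")
```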
at 11:52 PM, 2 comments
Carbon Neutral Search Engine
Well, here is a first that I have seen in the environmental online world: a green, or rather a Carbon Neutral, Search Engine, carbonsquash.co.uk. I thought that it was a good idea.
From what I understand, the site utilizes the Google search engine and is in fact no different.
However, according to the site, the energy used by each individual computer to run a search query equates to about one gram of carbon dioxide.
So the premise of the website is that any revenue generated from Google queries on their 'Carbon Neutral Search Engine' will go to purchase carbon offsets. The website has openly chosen to use ClimateCare.org, a UK-based company that allows people from all over the world to purchase carbon offsets and puts the money towards funding sustainable energy projects. The idea is a great one and should be remembered by all those self-proclaimed tree huggers out there. I for one declare that I will bookmark and use this site more often to do my part.
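The arithmetic behind the premise is simple. Here is a sketch; the one-gram figure is the site's own estimate, and the offset price is a hypothetical placeholder, not ClimateCare's actual rate:

```python
# How much CO2 a pile of searches adds up to, and what offsetting
# it might cost. The per-query figure is the site's own estimate;
# the offset price is a hypothetical placeholder for illustration.
grams_co2_per_query = 1.0
queries = 1_000_000

tonnes_co2 = queries * grams_co2_per_query / 1_000_000
offset_price_per_tonne = 10.0  # hypothetical USD/tonne
cost_to_offset = tonnes_co2 * offset_price_per_tonne

print(f"{queries:,} queries ~ {tonnes_co2:.1f} tonne(s) of CO2")
print(f"offset cost at the assumed price: ${cost_to_offset:.2f}")
```

In other words, a million searches is on the order of a single tonne of CO2, which is why ad revenue can plausibly cover the offsets.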
As for other environmental bloggers out there: if you write about the website and let them know you've done so, they will enhance your listing in their search engine.
It could result in more traffic to your environmental website if you have one.
Besides that, make sure you check it out the next time you have a search; it goes to a great cause and will provide an extra boost to battling this thing we call climate change.
U.S. crude hit a peak of $130.47 before easing to $129.71, up 73 cents. Billionaire T. Boone Pickens, whom I've mentioned in a previous post, said on Tuesday he expected oil to hit $150 a barrel this year. Among the reasons the price of oil is so high are a weak U.S. dollar, long-term production worries and a near-term focus on tight fuel stocks.
I have resorted to drastic measures including refusing to drive my car on my days off from work even though I own a hybrid. I am voluntarily boycotting gas on my days off. I am walking, biking, and riding my electric scooter around town a lot more.
In fact, I feel liberated and free from the hassles of driving. I think other people would feel the same if they just tried it. I refuse to buy gas for leisure if the prices of gas are going to remain this high.
So far studies have suggested that Americans have not changed their driving habits even though prices have gone up.
But studies have shown an increased demand for hybrids once again.
If you have changed your driving habits, I would love to hear about it.
What are you doing about these insane gas prices?
LA Plans a Massive Water Conservation Plan
A $2 billion proposal to conserve water for the City of Los Angeles, California has reached city officials, who will take a serious look at the future of water. The plan proposes to conserve about 32 billion gallons of water each year!
Part of the plan includes reclaiming or recycling water from sewage back into the drinking water supply.
It also includes building systems to capture and treat rainwater and runoff. The proposal would also restrict homes from watering lawns and washing cars except on certain days of the week.
According to the LA Times:
Financial incentives and building code changes would be used to incorporate high-tech conservation equipment in homes and businesses. Builders would be pushed to install waterless urinals, weather-sensitive sprinkler systems and porous parking lot paving that allows rain to percolate into groundwater supplies.
So I guess it's time for everyone in the LA area to start doing their part. LA needs to do this in order to support a 15% increase in demand for water by 2030.
If nothing is done, water restrictions could end up as serious as in Georgia, where extreme drought has forced the state to take drastic measures.
Easy things can be done to conserve water:
# Don't leave the water running when you brush your teeth
# Put a brick in the toilet so less water is needed to flush
# Take shorter showers or bathe once every other day (it's not going to kill you)
# Purchase a water barrel and capture rain runoff from your roof for your summer gardens
# Re-use a dish or cup to prevent overuse of the dishwasher
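Under some assumed (hypothetical) fixture flow rates, the tips above can add up quickly; adjust the numbers for your own home:

```python
# Rough estimate of daily water savings from the tips above,
# using assumed fixture flow rates (hypothetical, not measured).
shower_gpm = 2.5        # typical showerhead flow (gal/min)
minutes_shaved = 4      # shorter shower, per day
toilet_brick_gal = 0.5  # water displaced per flush by a brick
flushes_per_day = 5
faucet_gpm = 2.0
brushing_minutes = 2    # tap off while brushing, twice a day

daily_savings = (
    shower_gpm * minutes_shaved
    + toilet_brick_gal * flushes_per_day
    + faucet_gpm * brushing_minutes
)
print(f"~{daily_savings:.1f} gallons/day, ~{daily_savings * 365:,.0f} gallons/year")
```

Even with conservative assumptions, a single household can plausibly save thousands of gallons a year.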
Believe it or not, doing these things conserves water, which is great for the environment, and it can also save you money on your water bill.
We really need to be aware of issues like water conservation so there is plenty for everyone and for the generations to come.
Billionaire Oilman to Invest in World's Largest Wind Farm
Multi-billionaire T. Boone Pickens has decided to invest billions of dollars into what will most likely be the largest wind farm in the world.
Pickens plans on setting up 600 wind turbines in Central Texas with the hopes of producing "enough power for the equivalent of 1.3 million homes."
Not only has Pickens considered alternative energy, but he has also considered land usage: he plans on using other people's land to host the turbines, which will generate $20,000 in royalties for the landowners. As ironic as it sounds, Pickens has accumulated his wealth as an oil businessman since his early years.
He currently chairs the hedge fund BP Capital Management, a fund focused mainly on oil, natural gas, and other energy holdings.
In his appearance on CNN, Pickens explains his justifications for choosing wind energy.
In light of the rising cost per barrel and the lack of production to meet demand, he was asked if wind energy is the future for the United States, to which he simply responded, "It's got to be part of it because we don't have much oil."
It is interesting to see, as the proverbial spigot for oil dries up, how energy companies heavily reliant on oil will try to use alternative energy sources to turn a profit.
Could this be the beginning of a trend for other oilmen? [Source via Slashdot]
Written By:
AJ Papa
Solar Industry Booming
A survey conducted by two college researchers in California indicates that the solar industry is doing very well in the most populous state.
They surveyed 212 solar companies in the state, of which 77 of them were in the Bay Area.
The solar energy industry has created jobs for installers, engineers, and other workers. These are jobs that had been lost over the years, and the conclusion drawn from the industry survey is that there aren't enough workers: an estimated 5,000 jobs could be created by next year.
Salaries vary with level of experience.
Entry-level solar installers make a median salary of $31,200 per year, while more experienced installers earn up to $60,000 a year. Experienced solar designers and engineers earn a median salary of $83,000.
These are great salaries for people with different skill sets, and most of the jobs, like solar installer, do not require a degree.
The two researchers estimate that California already employs between 16,500 and 17,500 people in the industry.
California will require energy companies to diversify their portfolios to include 20% renewable energy, like solar, wind, and geothermal.
But some California legislators are thinking about increasing the renewable portfolio standard to 33% by 2020.
If California does enact such a law, it will truly be a model for the rest of the country.
The California solar industry has even expanded into nearby states in search of cheaper manufacturing. With this industry on the rise, it may be wise for people to position themselves to catch the wave of this new and exciting market.
The War on Bugs
Don't let the title of Will Allen's book, The War on Bugs, fool you.
Before I picked it up, I thought it would be a book specifically about bugs and what types of horrible pesticides have been sold to farmers to destroy them.
My assumptions were partially true, but thankfully instead of merely repeating what we all know about the dangers of noxious chemicals, Allen puts pesticide use into a 160 year historical context, showing how early chemical fertilizers gave rise to pesticides, and pesticides gave rise to genetically modified foods and animal hormone treatment.
He discusses how the anti-personnel chemicals, developed during war-time research, have become the chemicals that are still on the food we eat, and how pharmaceutical and even oil companies have a huge stake in the continuation of pesticide use (as if we environmentalists needed another reason to hate the oil industry).
Rather than re-telling this story from the consumer standpoint with which I'm sure all of the readers of this blog are familiar, Allen tells this story from a farmer's perspective.
He makes it clear that the real tragedy in the story is that no one has the economic interests of small and medium sized farmers in mind.
The government, advertisers, and large-scale farms have all sided with the chemical companies, and have been driving a nurturing and natural ethic out of the agricultural industry.
Sure, advertisers used war rhetoric to advocate pest extermination, but the subtext of this book is that the real war going on is between organic and chemical factions, and unless consumers and farmers realize that chemicals and nature don't mix, our society will have some serious, fatal consequences to deal with.
One of the great things about Allen's book is his extensive documentation of over a century and a half of advertising graphics that have been printed in farm journals and elsewhere, in order to lull American farmers into feeling comfortable with spraying enormous amounts of toxins on the land. I can imagine this combination of Americana-type prints and narrative style being very popular with those who are both American History buffs and environmentalists.
There is only one qualification I might make about this book, however:
My father's family comes from Missouri, which he (among others) endearingly refers to as "the Show Me State".
If you plan on giving this book to a sibling, friend, or parent who is unconvinced that chemicals are bad for you (hah; you have a "unique" person on your hands), or if you come from Missouri yourself, this book may not be what you're looking for. It's heavier on the narrative and a little skimpy on the citations.
However, if you want to hear the story of a man who had the unique position of having grown up on a pesticide-using farm, then served in the Marine Corps studying war chemicals and treatment, and then became a successful organic farmer, surprise! I found a book for you.
~Megan Geuss
Thoughts, Comments, Questions...
Thursday, May 29, 2008
Should We Bring Back the National 55 MPH Speed Limit?
Wired's Autopia blog asks its readers an interesting question: "Is It Time To Drive 55 Again?"
It's an intriguing question indeed as gasoline prices continue to soar past $4.00 a gallon, air pollution continues to rise, and our little ol' blue planet slowly starts to warm up.
Taking all these factors into consideration, is it time to bring back the national 55 mph speed limit?
As a response to the oil crisis of the '70s, Congress and President Nixon imposed a nationwide 55 mph speed limit in order to conserve energy.
However, the national limit was repealed in 1995, once again allowing states to set their own speed limits. Though today's oil "crisis" is not as severe as the rationing of the '70s -- at least not yet -- it is interesting to see the national speed limit debate resurface.
Regardless of whether a national speed limit law is passed, it is still good practice to drive at a speed that is safe but not excessively fast.
Start using the slower right lanes in traffic and plan accordingly -- waking up earlier, for instance -- in anticipation of a longer commute.
Doing so will not only help out your wallet at the pump, but it will also reduce the impact on the environment by lowering emissions.
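For the curious, here's a rough back-of-the-envelope sketch in Python of what slowing down can save. The mpg figures, gas price, and commute length below are illustrative assumptions on my part, not measured data; actual savings depend heavily on the vehicle.

```python
# Back-of-the-envelope fuel savings from driving 55 mph instead of 70 mph.
# The numbers below are assumptions for illustration: many cars lose
# roughly 10-15% fuel economy for every 5 mph driven above about 50 mph.

def trip_cost(miles, mpg, price_per_gallon):
    """Fuel cost of a trip at a given fuel economy."""
    return miles / mpg * price_per_gallon

commute_miles = 30   # assumed round-trip commute
price = 4.00         # dollars per gallon (2008-era price)
mpg_at_70 = 25.0     # assumed highway economy at 70 mph
mpg_at_55 = 30.0     # assumed economy at 55 mph (~20% better)

cost_70 = trip_cost(commute_miles, mpg_at_70, price)
cost_55 = trip_cost(commute_miles, mpg_at_55, price)
daily_savings = cost_70 - cost_55

print(f"At 70 mph: ${cost_70:.2f} per day")
print(f"At 55 mph: ${cost_55:.2f} per day")
print(f"Savings:   ${daily_savings:.2f} per day, "
      f"${daily_savings * 250:.2f} over ~250 commuting days")
```

Under these assumed numbers, the slower commute saves about 80 cents a day, or roughly $200 a year at the pump.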
[source]
Photo credit: Consumer Guide Automotive
Written by AJ Papa
Wednesday, May 28, 2008
The Summer of the Shark
I recently read that a second shark attack ended in a fatality in Mexico, on a beach that hadn't had a shark attack in years. There haven't been any large sharks at this beach due to irresponsible fishing that specifically targeted larger sharks to relieve the fears of the tourists who are vital to the community's economy.
The resort mentioned is Zihuatanejo, Mexico, which, if it really cared about tourism, would promote healthy ecological practices, because once the biodiversity disappears, so goes the attraction of the ocean.
Every year more people are surfing and using the ocean for recreation, and every year we kill more and more sharks and their food sources. It's simple math: the more people in the ocean and the less the sharks have to eat, the more attacks there will be on people.
The dynamics of ecology are more complicated than that, of course, but some common sense can be applied whether you're a biologist or not.
As the top predators of the ocean go extinct, an ecological collapse will be inevitable. This type of devastation to the earth's oceans cannot be completely predicted, but some inferences can be made.
Please educate yourself when you buy any products that come from the ocean, and please don't fall for the media's scare tactics. I know it's trite and everyone has heard this, but you really are more likely to die driving to the beach than from a shark attack at the beach.
More specifically, if you are in Mexico, you are more likely to die from food poisoning than from a shark.
Ty Spaulding
Associate degree in Biology.
Biology student with an emphasis in ecology.
Current research assistant for climate change in the Arctic at the University of Alaska.
Monday, May 26, 2008
Green Industry Booming
With all the hype about being green, a new industry has finally arrived and is growing every year. The alternative energy industry is hiring solar panel workers, wind power installers, maintenance workers, engineers, researchers, scientists, and all the white-collar roles found in any industry, like financing and logistics. The green industry stands out as one that is growing year over year, while other industries, like manufacturing, are contracting in the slowing economy.
Personally, as a young engineer in the semiconductor field, I am extremely worried that the economy could force my company to cut back spending and possibly lay off workers. I would be the first to go, since I only have two years' experience versus coworkers with 10 years or more.
States like California have the majority of these so-called "green collar" workers, but other parts of the country like Oregon, Washington, and Texas have green niches as well.
You really just have to be lucky enough to already live near an existing green hub in order to transition into this new field.
Thursday, May 22, 2008
Oil Prices hit a Record $130 a Barrel
The price of oil hit a new record high today: an all-time high of over $130 a barrel!
What this sadly means for all of us is that gas prices will continue to climb, and everything else will go up in price as well.
If transportation costs keep rising, most businesses will have to raise prices to maintain a profit. These are issues we should all be seriously worried about.
At what point do the trucking and cab industries go on strike over how much money they are losing to the high prices of gasoline and diesel?
Ethical Treatment of Animals:
If you live in a progressive community like I do, then you know that people who eat eggs would much rather purchase them from free-range chickens. This is because we know the animal had a reasonably decent life and probably was healthier. PETA, People for the Ethical Treatment of Animals, has stepped up its awareness campaigns to show how inhumanely and unethically the animals headed for McDonald's are being raised. To read about how serious the problem is, check out Why Animals Matter.
Good animal welfare can also help to protect the safety of our nation's food supply.
Scientists have long recognized that food safety is linked to the health of the animals that produce the meat, dairy and egg products that we eat.
In fact, scientists have found modern intensive confinement production systems can be stressful for food animals, and that stress can increase pathogen shedding in animals.
Conclusion:
I am a vegetarian because I choose to be.
By no means am I trying to force other people into not eating meat.
Educate yourself about the risks and other issues surrounding meat consumption.
Even lowering, not eliminating, meat consumption makes a difference to the planet, environment and animal welfare.
Bat Deaths in Northeast
Thousands of bat deaths are occurring in the northeastern United States.
Scientists have been struggling to find out the reason why.
Last year, thousands of dead bats were found in 4 separate caves that were 7 miles apart from each other.
This year, at least 25 caves and mines spread across 135 miles were found to have sick or dying bats. Homeowners from Hanover, N.H., to East Canaan in northwest Connecticut have reported dead bats on lawns, decks, and roofs, a sign the animals might be affected in an even wider area.
But so far, no one has found an infectious agent or any other cause.
Some people and scientists worry that the die-off might be similar to the mysterious honeybee die-off.
Bats are now migrating as far as 250 miles to their summer roosts, where they will mix with bats from other far-off caves and mines. By fall, they will travel back to their hibernation site to mingle and mate with still other bats. If the sickness is contagious, bat deaths in the Northeast could be in the millions as more of the mammals around the country could be at risk next year.
Scientists are also worried that the die-off might be attributable to a changing natural world.
If millions of bats die off, it would seriously disrupt the food chain and affect the ecosystem in unforeseen ways. Bats consume thousands of pounds of insects per year. If that balance is thrown off, the growing insect population could further damage crops and spread more diseases among humans.
Now, dozens of pathologists, immunologists, toxicologists, wildlife biologists, and other researchers in more than 15 government, university, and private labs are methodically working to unravel the bat mystery.
Grant proposals are being written to fund more in-depth work.
Scientists are using cutting-edge technology, from heat-detecting cameras in muddy bat caves to DNA analysis in sterile labs. Even a Columbia University molecular epidemiologist who discovered a possible contributor to the bee colony collapse has joined the sleuthing.
Scientists have called this sickness "white nose syndrome" because most of the bats that died or are sick have white noses as a symptom.
Please spread the word about the bats in the Northeast to create awareness so that more funding can go toward solving this serious issue.
Thanks.
MIT Tracks the Carbon Footprints of Americans
"Going green" seems to be the trendy slogan these days in the United States, with companies and individuals making both economic and lifestyle changes to reduce their impact on the environment.
As such, we've seen an increase in mass media coverage, with celebrities setting a public example, such as Julia Louis-Dreyfus, who recently spoke on The Tonight Show with Jay Leno about her active environmentally friendly lifestyle.
Despite the slowly increasing awareness and lifestyle changes of the American public, how much exactly do different American lifestyles contribute to environmental impact? A recent study at MIT decided to tackle the tall order of tracking the environmental impact, specifically the carbon footprints, of different American lifestyles. It's well known that the United States is the leading country in carbon emissions, so it is no surprise that even the lowest-energy-using Americans still contribute significantly more than the global average. A recent EurekAlert article discusses the situation in the United States:
An MIT class has estimated the carbon emissions of Americans in a wide variety of lifestyles--from the homeless to multimillionaires, from Buddhist monks to soccer moms--and compared them to those of other nations. The somewhat disquieting bottom line is that in the United States, even the people with the lowest usage of energy are still producing, on average, more than double the global per-capita average.
And those emissions rise steeply from that minimum as people's income increases:
The class estimated Bill Gates' impact as about 10,000 times the average.
With the increasing trendiness in the United States of going green, and as an American myself, it is certainly food for thought that still shows the disparity of lifestyles in a global context, regardless of environmental awareness. [Source]
Why Animals Matter
Most people would agree that animal cruelty is wrong, but how many people in the general public comprehend how cruel the meat, fur, animal experimentation, pet, entertainment, and hunting industries can really be?
For most Americans, it is easier to turn a blind eye to animal cruelty than take some simple steps to eradicate it.
It was with a great deal of trepidation that I undertook Why Animals Matter:
The Case for Animal Protection, by Erin E. Wells and Margo DeMello. I am notoriously soft hearted when it comes to animals—companion, farmed, or wild—but like many people in this country, I often turn a blind eye to the aforementioned industries in favor of convenience.
While I certainly realize that the juicy steak I ate last week came from a living breathing cow, to this point I have managed to detach myself enough to focus on the end result as opposed to the process by which the meat found its way to my plate.
I can no longer plead ignorance of animal cruelty no matter the realm in which it occurs. Of the various industries that Wells and DeMello write about, the meat industry stands out as the most striking and troubling.
For instance, cows, pigs, chickens--and any other variety of animal that comes of age on a factory farm--live a ridiculously miserable life.
For me, it is near impossible to ruminate at length on a crate crammed full of sick and injured animals with open sores and broken limbs, often resorting to cannibalism to stay alive for one more miserable day.
Underfed, deprived of water and medical attention, and living a strikingly abbreviated life, an animal at the mercy of the meat industry faces a horrible plight.
However, it is increasingly important—for animals and the environment at large—to think over issues of animal cruelty and the accompanying ripple effect of pollution, violence, and injustice.
Besides the obvious animal cruelty, Wells and DeMello make a compelling case for sweeping changes in the meat industry to positively impact the environment.
For example, the meat industry is one of the largest causes of deforestation and water pollution in the world.
As the meat industry reaches into the depths of South America, for example, millions of acres of rainforest are stripped to make room for factory farms.
My initial fear in reading and reviewing this book was the emotional impact it would have on me.
Admittedly, the section on the pet industry brought me to tears, and I had to hug my dog for a good bit afterwards. To counteract the overwhelming nature of their subject, Wells and DeMello wisely include success stories about animals saved from lives of cruelty.
For example, in 2000, a retired greyhound named Fever was adopted by a neglectful owner. After Fever aged beyond her racing career, her first adopter allowed her to dwindle to a startling 28 pounds. Luckily for Fever, a second adopter took over her care, got her immediate veterinary attention, and Fever lived a life of love and fulfillment until her death in 2004.
Many such happy stories populate the pages of Why Animals Matter, giving the reader a sense of the goodness that exists in the world and of the many people who are willing to go the distance to make an animal's life luxurious and fulfilling. I am extremely pleased to have tackled this book, as it was richly educational and, I dare say, life-changing. I plan to be much more diligent about issues of animal cruelty, including seeking out options besides factory-farmed meat, joining the ASPCA, and volunteering at a local animal aid organization.
Reviewed by Andi Miller
West Coast Sea Bird Endangered?
The ashy storm-petrel is a small, smoke-gray seabird that nests and forages almost exclusively on the offshore islands and waters of California near San Francisco, Los Angeles, and San Diego.
These are major metropolitan population centers with high development that has contributed to the degradation of all local ecosystems. This bird is not officially on the endangered list, but could be by the end of the year.
Now the federal government is going to launch a full status review due to a scientific petition filed by the Center for Biological Diversity.
Several different factors could be contributing to the decline of these benign birds: climate change and overdevelopment, and even last year's S.F. Bay Area oil spill certainly didn't help. Many other reasons for the birds' decline can be read about on the website of the Center for Biological Diversity.
From the Press Release:
The Center for Biological Diversity petitioned for federal protection of the ashy storm-petrel on October 15, 2007, triggering a requirement that the U.S. Fish and Wildlife Service make an initial finding within 90 days. On March 27, 2008, the Center formally notified the Fish and Wildlife Service that the finding was overdue and threatened to take legal action if the decision was not immediately forthcoming.
Following today's finding that protection of the ashy storm-petrel under the Endangered Species Act may be warranted, the Fish and Wildlife Service must now conduct a status review and issue a proposed rule to list the species by October 15, 2008.
A Fine Line Between Safety and Panic:
A Sheep Story
My grandmother's first husband died of Creutzfeldt-Jakob disease, the human illness related to mad cow disease, years before my grandmother married my grandfather. Since that time, this disease has always been on my mind – when I studied in England in 1996, I was glad to be largely vegetarian, eating only one hamburger during my time there at the height of the mad cow scare. I hear about cows being slaughtered because their herd has been contaminated. I wonder how these things happen.
Now, I've read Mad Sheep:
The True Story Behind the USDA's War on a Family Farm, and my understanding of this disease is further complicated by seeing that the battle against it has wandered into hysteria and panic.
It did so in the case of the Faillace family, a shepherding family whose entire flock of sheep was taken away and slaughtered by the USDA because of supposed "susceptibility" to mad cow disease.
Sheep cannot actually contract this disease, and despite the fact that the Faillace family's flock had been tested and was awaiting certification showing it had no susceptibility to scrapie ("mad sheep" disease), their flock was still slaughtered.
The book chronicles this family's beautiful and painful story while also educating its readers about the actual dangers of infected animals – many of whose flocks are not quarantined and who may end up in the food marketplace – and the ways that these dangers can get contorted and twisted into panic, delusions, and ineffectual solutions.
This book is worth the read both for the heart-wrenching, anger-inspiring story it tells and for the lovely writing and the inspiring commitment this family has to living a life about promoting health and family rather than government or big business. Linda Faillace's story helps us understand that the USDA has ways of preventing mad cow disease but isn't using them; instead, it sometimes goes to extremes to make a "point" that really has no point.
One note: this book does itself a disservice, at least for readers like me, by including a foreword by Ronnie Cummins, National Director of the Organic Consumers Association, that reads as alarmist and of an almost "conspiracy theory" bent.
While that idea of a conspiracy certainly plays out to be true in the book, this foreword sets a paranoid tone that belies the fair and personal nature of the tragedy that the book details.
This book has once again made me glad that I'm a vegetarian who doesn't have concerns about ingesting animals that have been fed bonemeal or other meat products that may have come from mad cow-infected animals. But now that I've read this book, I will be even more cautious about what I encourage my friends and family to partake of. I will look for ways to help farmers like the Faillaces, even if it's just by encouraging other people to read this book and spread the word about the ways our government can be dangerous and about the dangers of the meat industry itself. Oh, and I'll probably start eating a lot more sheep's milk cheese.
$4 Gas and Still no more Bicyclists?
According to an article by Reuters, Americans don't seem to be picking up the habit of riding their bikes regularly to work, despite the fact that gas is over $4 a gallon in some parts of the country.
The article points out that some who can ride their bikes to work might do so.
But the majority of people still live too far from work to consider it.
There is also the danger of mixing bicyclists with traffic-congested roads, which can get ugly if the streets weren't designed to be shared.
The article also mentions that the number of people who regularly ride their bikes is down compared to previous years.
Here in Oregon, however, bicyclists seem to be a staple characteristic of what makes Oregon so green.
College towns like Corvallis and Eugene are notorious for bicyclists, but even some local studies suggest that ridership is down compared to previous years. The cited reasons for the decline are poor city planning and the car's monopoly on the street.
The local studies say that if more bicycle routes and paths were developed, ridership would go up.
Either way, I am personally making a commitment to ride my bike to work from June 1st until September. My work is only 7 miles away, and I have the luxury of taking a beautiful bike path along the Amazon Creek all the way there.
If you have the ability to ride your bike due to the proximity of your home and work, please consider riding your bike.
It cuts emissions and saves you money at the pump given our outrageous gas prices.
Gas prices are skyrocketing, and now more than ever we can agree we're sick of it.
Fortunately, so are the people at Aptera, who not only created a new hybrid but one unlike any other. Want a full electric version?
They have that too.
If fifty miles per gallon sounds good to you, how does bumping the mileage up to 230 miles per gallon sound?
The new Aptera Hybrid
can take you further than any hybrid available today.
Its space-age design is not only aesthetically pleasing but is also receiving high ratings in simulated crash testing.
The Aptera will be available for purchase this year starting in October; however, it will initially be released only to California residents. Check out the website at Aptera.com or the video below for more details.
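What does a jump from 50 mpg to 230 mpg mean for your wallet? Here's a quick Python sketch; the gas price and annual mileage are assumptions for illustration, not Aptera's figures.

```python
# Rough annual fuel cost comparison: a typical 50-mpg hybrid vs. the
# claimed 230 mpg. Gas price and yearly mileage below are assumed values.

price = 4.00           # dollars per gallon (assumed)
annual_miles = 12_000  # assumed yearly driving

def annual_fuel_cost(mpg):
    """Yearly gasoline cost at a given fuel economy."""
    return annual_miles / mpg * price

hybrid_cost = annual_fuel_cost(50)
aptera_cost = annual_fuel_cost(230)

print(f"50 mpg hybrid: ${hybrid_cost:.2f}/year")
print(f"230 mpg:       ${aptera_cost:.2f}/year")
print(f"Difference:    ${hybrid_cost - aptera_cost:.2f}/year")
```

Under these assumptions, that's roughly $750 a year in fuel savings over an ordinary hybrid.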
20% Wind Energy by 2030?
A report just came out from the U.S. Department of Energy at 20percentwind.org called 20% Wind Energy by 2030:
Increasing Wind Energy's Contribution to U.S. Electricity Supply.
As the name suggests, the report lays out the feasibility of the U.S. getting 20% of its electricity from wind power!
Under the 20% wind scenario, installations of new wind power capacity would increase to more than 16,000 megawatts per year by 2018, and continue at that rate through 2030.
The U.S. didn't even have 16,000 MW of total wind capacity installed until recently -- we passed that level in 2007.
We're at about 18,000 MW now.
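To put that build rate in perspective, here's a crude Python sketch of where it leads. The ramp-up profile and starting capacity below are simplifying assumptions on my part (the report models the trajectory in far more detail), but they show the scale involved:

```python
# Crude projection of cumulative U.S. wind capacity under the 20% scenario.
# Simplifying assumptions (mine, not the report's): annual additions ramp
# linearly from roughly 5,000 MW/yr in 2008 to 16,000 MW/yr by 2018, then
# hold at 16,000 MW/yr through 2030, starting from ~18,000 MW installed.

capacity_mw = 18_000.0  # approximate installed capacity in 2008

for year in range(2009, 2031):
    if year <= 2018:
        # linear ramp from 5,000 MW/yr (2008) to 16,000 MW/yr (2018)
        annual_add = 5_000 + (16_000 - 5_000) * (year - 2008) / 10
    else:
        annual_add = 16_000  # steady build rate through 2030
    capacity_mw += annual_add

print(f"Projected installed capacity in 2030: {capacity_mw / 1000:.1f} GW")
```

Even this rough sketch lands around 300 GW by 2030 -- more than fifteen times today's installed base, which is why transmission, siting, and manufacturing are the report's big concerns.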
The report identifies the problems with actually achieving 20% wind energy which includes transmission, siting, and manufacturing.
However, the report gives solutions to overcome these road blocks while highlighting its potential to stave off global warming.
The American Wind Energy Association states:
as of the end of 2006, the United States had an estimated 11,603 MW of wind generating capacity. Even though this figure seems low, there have been improvements and gains over the years. However, wind energy still accounts for only 0.7% of the U.S. energy mix.
If the government helped stimulate the renewable energy market, imagine where we could be today.
We need to continue to press our elected officials to secure our energy through renewable means. If we can enact a plan to achieve 20% by 2030, the United States will be recognized around the world as an environmental leader instead of for its current backwards thinking.
Let's create change, let's create jobs, let's help our country evolve.
With the price of conventionally grown food continuing to rise, the price of organic food is rising as well.
In some cases, the price of organic food is now going beyond the reach of the lower and middle classes. If organic food costs nearly twice as much as conventional products, then many people will just buy what they can afford.
Most people buy organic food to avoid ingesting potentially carcinogenic toxins from chemical fertilizers, pesticides, and synthetic hormones.
As a single guy, I spend more money buying organic groceries than my coworker spends feeding an entire family of four with conventional products. Someone who can't afford to buy organic food for an entire family will buy what makes financial sense to them.
Some farmers are actually switching back to growing conventional foods because the prices of certain crops are so high, there is no need to go through USDA organic certification.
This can eventually reduce the amount of selection for organics and cause prices to go up even further.
Due to these economic hard times with inflation and a looming recession, it feels as if a serious problem could arise if the economy doesn't stabilize soon.
Simply put, organic food will only be affordable to the wealthy.
What kind of world do we live in?
The Right Thing
Review of Javatrekker by Dean Cycon
Six shelves of coffee, at least ten feet long.
French Roast, Sumatran, Decaf, Whole Bean, canisters, bags . . . so much choice for a little hot liquid in the morning.
And yet, that choice could help change the lives of coffee farmers around the world.
This is what I have come to understand from reading Javatrekker: Dispatches from the World of Fair Trade Coffee by Dean Cycon of Dean's Beans. For me, now, the only choice will be one that says "fair trade" on the bag or canister. No other choice would be a right one.
Cycon's book chronicles his travels around the world as he meets with coffee growers and processors, trying to help them establish or improve fair systems for selling their coffee.
He visits the Oromo in Ethiopia, the Ashaninkas in Peru, the Asaro Mudmen in Papua New Guinea, and other indigenous groups, combining a belief in the power of fair trade cooperatives to improve the lives of these people with a deep knowledge of and respect for the people and their cultures. His trips are not evangelical; he's not out to save the world by making it like his. He's traveling to help people help themselves, living up to the old "teach a man to fish" adage.
The writing here is personal; Cycon tells us not only of the coffee, but of the rigorous treks to reach the coffee.
Not just of the trees or the beans, but of the cultural customs that dominate each village he visits. Not just of the money, but of the flesh and life that has been given for what is sometimes simply a few cents a day.
This book is not a chronicle of the life of a coffee bean; it is the epic struggle of people to be treated justly and honestly for their hard work.
Cycon's words are words of a witness, a clumsy, occasionally inappropriate (as when he snaps a picture of a group of Arhuaco farmers, much to their chagrin), but always caring witness who has taken the time to know these people, to see them, to share their stories with us.
This book could have been simply another dry collection of tales from an activist who doesn't understand the complexities of culture and who doesn't see beyond his own goals, but it isn't.
Cycon notes, often, that what he thought would be so simple is simply not.
From the realization that the Kenyan government has corrupted even the fair trade system, to the knowledge that the self-educated Mamos of Colombia may know more about saving the earth from humans than he (or I, for that matter) will ever know, Cycon grants that his wisdom is not unlimited, his guidance not perfect.
As he says, "Our understanding of justice, in trade and society in general, cannot be confined to a formula.
Fair trade, or any movement that is intended to improve the quality of life for people, is more accurately seen as a process. The more we work with the peoples in this book and beyond, the deeper we plunge into the dynamics of their societies, their ecologies, and their economies. Each layer reveals a more profound set of relationships that we must consider as we evolve toward more human and just relationships. Being open to the experiences of each culture not only makes us more aware but also makes our lives richer."
This book records a man's journey into learning, one he undertakes because it is the right – but not always easy – thing to do.
The next time I return to those shelves of coffee I will remember Dean Cycon and the people he has introduced me to through his words. Now, I will only buy fair trade coffee, trusting that while it is the more expensive and therefore harder thing, it is the right thing to do.
If you have ever seen a sea turtle, then you may know and understand the beauty of these creatures. Unfortunately, the death rate among them is increasing, and without help to stop it, these precious animals will go extinct. Sea turtles are not very fast or quick on land, which leaves them vulnerable to poachers who harm them and steal their eggs, a delicacy in many parts of the world.
There are many other reasons for these senseless and brutal killings. Calipee is a cartilage that is literally cut out of the live turtle from the bones under the shell.
This sadly means a slow painful death for the turtle as the helpless animal is left struggling on the beach.
Unfortunately, eliminating turtle trade completely would be hard because many cultures rely on them.
The Japanese support some of their economy with turtle shell products. German soup makers depend on calipee for turtle soup and in Nicaragua turtle hunting is a way to make a living.
Also, men in Mexico have been buying turtle eggs from locals in hopes of getting a Viagra-like effect, which is a myth.
Recently, Mexican authorities announced they will use campaign posters of scantily dressed young women to promote the protection of endangered sea turtles. The poster reads in Spanish, "My man doesn't need turtle eggs." I am not a feminist by any means, so in my opinion that is a great way to get some attention for this problem.
Death is part of the natural world, although when humans heavily contribute to the demise of a species, we can do our part to make sure they can survive for generations to come by making changes. This is why we must save the sea turtles!
DDT is a pesticide that was banned decades ago. However, traces of DDT are still found to this day!
But how in the world is a pesticide showing up in Antarctica, you ask?
According to seabird experts, the most likely cause is buildup released from melting glaciers. DDT has long been known to be present in the fatty tissues of these birds, though not at levels high enough to harm them.
The surprise is that these levels of DDT haven't gone down.
The United States banned DDT in the 1970s, and other countries have as well.
DDT, and other pesticides, actually travel through the atmosphere toward the polar regions by a process of evaporation and then condensation in cooler climates. Penguins feed off tiny sea creatures called krill that live in melted glacier water, and DDT is brought up the food chain directly to the penguins.
Although DDT is banned, the United States and the rest of the world should adopt a policy of extensive research before approving toxic chemicals that can leave long lasting damage to our environment.
Road Kill Hurting Ecosystems
This article is by no means suggesting that we should stop driving or building new roads, rather it is merely raising awareness of the fact that our ecosystems are continuing to degrade.
Sometimes there are very few practical solutions to some of the problems we have, but it does not mean we should be ignorant of the problems.
Scientists estimate that one-third of amphibian species are threatened, and hundreds of species have gone extinct in the past two decades alone.
The reasons are habitat loss, disease, pollution, competition from introduced exotic species and climate change.
Well, now road kill can be added to that list.
Frogs, toads, and salamanders are all amphibians that serve vital roles in many ecosystems. They are consumers of insects and a food source for carnivores. To maintain healthy ecosystems it is important to limit the deaths of amphibians.
A main location of these incidents is a one-mile stretch of road surrounded by wetlands in West Lafayette, Indiana.
During a 17-month study, researchers found 10,500 dead animals along 11 miles of roads. Of those, 7,600 were frogs of unidentifiable species and another 1,700 were bullfrogs.
A recently published study showed that the number of animals killed was significantly underestimated, because carcasses were scavenged by other animals, destroyed beyond recognition, or moved.
About five times more animals died than could be recorded.
The dead included 142 road-killed eastern tiger salamanders. That doesn't seem like a lot, but most of them were up to 10 years old, or were egg-bearing females making their annual trip to breeding grounds where they often lay 500 to 1,000 eggs. This could make a big difference for the population.
Researchers also found 74 dead northern leopard frogs, a species of special conservation concern in Indiana.
Other animals were also recorded in this study.
These roads are literally having an environmental impact!
Scientists believe there are options to help with this problem, such as underpasses, viaducts, and overpasses to allow wildlife safe crossing, along with special fences.
Habitats like wetlands and rain forests are declining and this is just one more problem that isn't going to fix itself unless some action is taken.
Although you can't actually hear a cow burp, cows are constantly releasing methane.
When they digest grass, microflora in their gut break the plant matter down.
Methane makes up the majority of what is burped up, a contributing factor in greenhouse gases. Scientists from a plant-breeding research center in Australia are developing a new kind of grass that has been tested in the lab and in glasshouses, and they are now planning field trials. Farmers should be able to maintain their dairy herds, productivity, and profitability while cutting down on gassy burps and reducing their contribution to global warming.
The dairy industry definitely has an impact on greenhouse gas emissions and Britain's target is to cut them by 20% by 2010.
The goal is to have 20-30% of milk producers trying out new technology to cut greenhouse gas emissions in the year 2015.
There is some controversy about the breakdown process of the new grass and if it in fact will reduce methane. A lecturer in farm animal health says that a diet too rich in highly digestible carbs can actually increase the amount of methane a cow belches out.
And a professor of animal science says more digestible forage could push up a cow's absolute methane emissions, but productivity gains would mean less methane per unit of milk.
As long as this new technology remains new, I am sure there are some kinks that need to be worked out.
If it pans out, it could really help the UK, and countries around the world, reach their greenhouse gas reduction goals.
Veal:
Causes Greenhouse Gases?
Producing one kilo of veal cutlet produces as much greenhouse gas emissions as traveling 137 miles by car. However, meat from non-ruminants turns out to be more environmentally friendly.
According to a 2007 study by a Swiss environmental ratings agency, the US group Tyson Foods produces nearly as much greenhouse gas emissions as a major car manufacturer. The full life cycle of veal production was taken into account, including items like forest clearance for grazing land and emissions from cattle feed.
VEAL IS NOT GOURMET!
Veal may be more appealing to some while looking over a menu because it's known to be a "gourmet" meat and also it's low in fat.
During their brief lives, the young calves never see the sun or touch the earth.
They never see or taste grass. Their bodies are kept anemic to achieve the desired pale look.
Their muscles ache for freedom and exercise.
The real issue is the calves' experience. I want people to be aware of these innocent baby cows that are put through these conditions and then show up in everyday life as tender cutlets. As a former waitress at an Italian restaurant, I noticed veal was an important menu item and a popular choice for many people.
I'm ashamed to have once served it, after learning about all the drawbacks of producing and eating it.
Written by our New Contributor:
Angie
A popular chicken feed additive containing arsenic, in use since the 1960s to produce pinker, healthier, bigger-breasted birds, could cause disease in humans, according to a study headed by a Duquesne University researcher.
The feed additive is used by commercial chicken producers to control intestinal parasites, reduce stress, stimulate growth and improve the color of chicken meat.
Laboratory analysis shows that the antibiotic arsenic compound roxarsone, which promotes the growth of blood vessels in chickens to produce pinker meat, does the same in human cell lines. This is a critical first step in many human diseases, including cancer. It's actually hard to find any other information on this besides the study and the published report in Environmental Health Perspectives.
The technical term for growth of blood vessels is angiogenesis, and roxarsone was found to promote it in humans. That means people who eat chicken sandwiches from McDonald's every day could be exposing themselves through chronic exposure in small amounts. Chronic exposure to inorganic arsenic is known to cause cancer and has been linked to heart disease, diabetes, and declines in brain function. The study doesn't focus on the risks of eating chickens that were fed roxarsone, but there are reasons to be cautious.
U.S. chicken producers use a total of 2.2 million pounds of roxarsone each year. About 70 percent of the 9 billion fryer chickens grown annually nationwide eat feed containing the additive, which also is used in turkey and pig feed.
More than 95 percent of the roxarsone fed to chickens is excreted unchanged in chicken waste, which is regularly applied as fertilizer on surrounding farm fields or used in commercial fertilizers. The arsenic from those applications could leach into surface and ground water supplies. Just another reason to be worried about our meat supply.
Farm Animal Waste:
An Environmental Hazard!
Just to be clear, I am not talking about your simple family farm, or rural America for that matter. This week's theme is the large-scale industrial production of livestock for meat and other products that end up in your daily supermarket, fast food restaurant, or what have you.
This picture is of a "lagoon" that is quite literally a lake of animal waste.
Storage and disposal of manure and animal waste are among the most significant challenges for industries that handle livestock production.
By any estimate, the amount of farm animal waste produced annually in the United States is enormous; the United States Department of Agriculture (USDA) estimates around 500 million tons of manure are produced annually by operations that confine livestock and poultry.
That is more than three times the EPA's estimate of 150 million tons of human sanitary waste produced annually in the US. Yet in comparison to human waste, the management and disposal of animal waste are poorly regulated.
The lagoons sometimes rupture after heavy rains, and the fields on which waste is sprayed leak polluted runoff into streams and rivers.
Most animal waste produced at factory farms is spread on the ground untreated, according to the Pew Commission.
In addition to threatening drinking water sources, nutrients within animal waste often seep into groundwater or waterways, where they contribute to reduced oxygen levels in aquatic ecosystems. This is thought to have contributed to the increasing problem of dead zones along the coasts of the US.
If structures are built properly to handle this waste, technology can actually capture the gases released from the waste to produce energy.
There are certain farms that are testing this new technology and it will only be a matter of time before this becomes the standard way of doing things.
Is Your Meat Certified Humane?
When you have to put up signs that say "this animal's handling and raising is certified humane," that's sad.
However, signs like that raise awareness of animal abuse and neglect while reassuring consumers that the animals were treated with decency.
Certified Humane was a program developed by Humane Farm Animal Care which is described below:
Certified Humane products meet the Humane Farm Animal Care Program standards, which include a nutritious diet without antibiotics or hormones, and animals raised with shelter, resting areas, sufficient space, and the ability to engage in natural behaviors.
The program is run by the non-profit, which certifies participating businesses through annual inspections by trained individuals with backgrounds in animal science, veterinary medicine, or other relevant fields.
With all the choices and labeling we have in the supermarket, if a product stands out against the regular ones we are all used to, then go for it. USDA certified organic, natural, no hormones added: they all indicate a growing trend of consumers demanding more humane and healthier choices. Wouldn't you like to know that your hamburger from Burger King was certified humane... that would be the day =P
Sick or injured cattle are being handled with forklifts or bulldozers and shoved or dragged along the floor. A video from the Humane Society shows that these sickened cows were being fed to school children through the National School Lunch Program.
Downed cattle may pose a higher risk of contamination from E. coli, salmonella or mad cow disease because they typically wallow in feces and their immune systems are often weak.
The health risks are there and the USDA cannot regulate this. The only thing that can be done is to strengthen animal abuse laws to scare these people into treating cattle better.
These types of conditions are simply unacceptable.
The people that raise these cows obviously don't care about the animals and only want to make as much profit as possible.
It is my hope that more videos come out to shame other farmers that raise livestock into treating the animals with respect.
Just because these animals are going to the slaughterhouse does not mean they can be tortured.
Decide for yourself what you consider is ethical.
Abuse on the Egg Farm
An animal rights group called Mercy For Animals uncovered a shocking undercover video that showed animal abuse at an egg farm in California.
The chickens are clearly crammed into cages so small that they can't even spread their wings. Such crowded conditions can also make public health a more serious issue.
If any more incidents like this continue to come out, then legislative action could follow suit.
The video was shot this year in a facility that supplies to a distributor called NuCal Foods Inc.
NuCal Foods is one of the largest egg suppliers on the west coast which supplies eggs to Trader Joe's, Raley's, and Savemart.
The footage from the undercover video shows workers kicking and stomping on chickens and snapping the necks of sick hens. It also shows birds left with untreated wounds and crowded into cages, sometimes amid rotting corpses. And it shows how, after 1 to 2 years of being caged through their egg-laying prime, the hens are thrown into a gas chamber and disposed of.
Is there any humanity for these innocent beings? If you still want to support the egg industry after watching this video, then I suggest you only buy eggs that come from free-range hens. I personally get my eggs from my mother's own free-range hens or from local farmers at the Saturday Market. I know the price of food, including eggs, is going up, but getting eggs from a local supplier is possible.
Check around for farmers markets, they usually sell eggs from free range hens. Beware of the video below, it can be graphic and somewhat disturbing.
Meat is Bad For the Environment, Part I
I have mentioned in previous posts that growing animals for food or meat is very unsustainable.
For every pound of meat produced, it takes 8 pounds of grain that has to be grown to feed the animals. When will we realize our unsustainable ways?
Below is an outline of the problems associated with the meat industry; Part 2 will outline ways in which we can begin to solve them.
Health Risks:
The Pew Commission on Industrial Farm Animal Production released a study on April 29, 2008, which found that the number of farms producing animals has dramatically dropped over the past five decades. It also found a concentration of farm animals in larger and larger numbers in close proximity to one another, which is creating unacceptable risks to human health.
Animals in such close confinement, along with some of the feed and animal management methods employed in the system, increase pathogen risks and magnify opportunities for transmission from animals to humans. In other words, this creates more sicknesses in people from eating meat.
Do you have stomach aches after eating that Jack in The Box burger?
Symptoms can go unnoticed, but the problems are there.
Environmental Risks:
According to the EPA, the annual production of manure produced by animal confinement facilities exceeds that produced by humans by at least three times. Unlike most human sewage, the majority of Industrial Farm Animal Production waste is spread on the ground untreated.
Manure in such large quantities carries excess nutrients and farm chemicals that find their way into waterways, lakes, groundwater, soils and airways. Our improper methods of raising animals and their waste is creating an environmental disaster for future generations.
Not only that, but the waste from animals creates runoff that also carries antibiotics and hormones, pesticides, and heavy metals. Antibiotics are used to prevent and treat bacterial infections and as growth promoters. Pesticides are used to control insect infestations and fungal growth.
Heavy metals, especially zinc and copper, are added as micronutrients to the animal diet.
Tracking an Unexpected Storm
The 2009 Atlantic hurricane season seemed as though it would be active, with Tropical Storm Ana, Tropical Storm Claudette and Hurricane Bill all forming and moving closer to the United States by the middle of August.
These were followed by an additional six tropical systems. Thankfully, only one of these nine systems, Tropical Storm Claudette, made a U.S. landfall.
As November rolled around and the season neared its end, the Texas Tech University Hurricane Research Team (TTUHRT) had hung up its hat, accepting that there would be no research opportunities this year. And then came Ida, an odd, late-season storm that formed off the coast of Nicaragua.
As Ida moved northward and her threat to the Gulf Coast became apparent, TTUHRT hit the road with weather-monitoring StickNet probes and the new TTUKA-1 radar. Ida was the StickNet fleet's fourth tropical cyclone deployment, and the very first for TTUKA-1.
The StickNet probes were deployed on both sides of Mobile Bay in Alabama, and were primarily concentrated along Gulf Shores, within 10 miles of TTUKA-1, to get overlapping coverage.
The StickNets are portable weather stations that measure temperature, humidity, pressure, wind speed and direction, and precipitation.
They were designed and built by Texas Tech students and have been used for four years during severe thunderstorm seasons, including VORTEX 2 last spring, and were deployed last year for Hurricanes Dolly, Gustav, and Ike.
StickNet has become my own personal "Flat Stanley."
In the words of the late Johnny Cash, "I've been everywhere, man," with StickNet by my side.
Last year for Hurricane Dolly, my team deployed StickNet on the sand dunes of South Padre Island, in the shadow of spring break hotels. We deployed StickNet at the Morton Salt Plant at Weeks Island and at Avery Island, home of Tabasco, for Hurricane Gustav.
Hurricane Ike brought a deployment at Fort Travis at the tip of the devastated Bolivar Peninsula.
We were fearful that the probe would be swept away and lost forever; it survived despite the fact that nearly every home was destroyed.
The elevated fort made such a good deployment site that we decided to try other forts, making deployments at Fort Gaines on Dauphin Island on the west side of Mobile Bay, as depicted in the image, and at Fort Morgan on the east side of Mobile Bay for Tropical Storm Ida.
Buried Volcano Discovered in Antarctica
A volcano beneath Antarctica's icy surface has been detected for the first time.
Under the frozen continent's western-most ice sheet, the volcano erupted about 2,300 years ago yet remains active, according to a study published Sunday in an online issue of the journal Nature Geoscience.
"We believe this was the biggest eruption in Antarctica during the last 10,000 years," said study co-author Hugh Corr, a glaciologist for the British Antarctic Survey (BAS). "It blew a substantial hole in the ice sheet, and generated a plume of ash and gas that rose around 12 kilometers (7.5 miles) into [the] air."
Brooding giant
Although ice buried the unnamed volcano, molten rock is still churning below.
David Vaughan, a glaciologist with the BAS and a co-author of the new study, said the discovery might explain the speeding up of historically slow-moving glaciers in the region.
"This eruption occurred close to Pine Island Glacier on the West Antarctic Ice Sheet," Vaughan said.
"The flow of this glacier towards the coast has speeded up in recent decades, and it may be possible that heat from the volcano has caused some of that acceleration."
The effect is similar to a person gliding down a Slip 'n Slide:
Volcanically melted water beneath the colossal ice sheet lubricates its movement, assisting its gravity-powered journey toward the Antarctic Ocean.
Vaughan noted, however, that the hidden volcano doesn't explain widespread thinning of Antarctic glaciers.
"This wider change most probably has its origin in warming ocean waters," he said, which most scientists attribute to global warming resulting from human activity, such as the use of fossil fuels.
Hide and seek
Corr and Vaughan used ice-penetrating radar to locate the volcano just west of the expansive Pine Island Glacier. Specifically, they detected a New Jersey-sized layer of ash, covering more than 8,000 square miles (20,700 square kilometers), beneath the ice.
The debris is a hallmark of an ancient eruption.
"The discovery of a 'subglacial' volcanic eruption from beneath the Antarctic ice sheet is unique in itself," Corr said.
"But our techniques also allow us to put a date on the eruption, determine how powerful it was and map out the area where ash fell."
Scientists like Corr have used radar and other technologies to find other features, such as lakes, tucked beneath the Antarctic ice.
Researchers also think that magma-heated rock beneath Greenland's massive ice sheet is accelerating its melting, but whether a volcano or just a pool of magma is responsible is still a matter of debate.
Submersible Robot Runs on Sea's Heat
A green robotic glider runs on energy absorbed from the ocean.
Credit:
Dave Fratantoni, Woods Hole Oceanographic Institution
Scientists have invented the Prius of ocean-going submersibles — a new "green" robotic glider that runs on energy absorbed from the heat of the sea, rather than batteries.
The new underwater glider can stay at sea at least twice as long as previous submersibles that used battery power. It is the first autonomous underwater vehicle to travel great distances for extended periods running on green energy, according to the Woods Hole Oceanographic Institution (WHOI).
Submersibles gained fame in 1985 when WHOI's remotely-operated underwater vehicle, "Argo," discovered the wreck of the RMS Titanic near Newfoundland.
Built by the Webb Research Corporation in Falmouth, Mass., the new submersible has successfully traveled back and forth between two of the U.S. Virgin Islands, St. Thomas and St. Croix, more than 20 times. WHOI researchers plan to use the data gathered by the craft to study ocean currents in the area.
To power its propulsion, the submersible gathers thermal energy from the ocean.
When it moves from cooler water to warmer areas, internal tubes of wax heat up and expand, compressing the gas in surrounding tanks and increasing its pressure.
The compressed gas stores potential energy, like a squeezed spring, that can be used to power the vehicle.
"This glider allows longer missions than previous [battery-run] versions," said Ben Hodges, a physical oceanographer at WHOI. "It could be out there for a year or two years. None of the old ones could go beyond six months. And producing fewer batteries is good for the environment."
The torpedo-shaped glider moves through the ocean by changing its buoyancy to dive and surface, unlike motorized, propeller-driven undersea vehicles. To rise, oil is pushed from inside the vehicle to external bladders, thus increasing the glider's volume without changing its mass, making it less dense.
The oil can be shifted inside to increase the density and sink the vehicle. A vertical tail rudder allows the glider to be steered horizontally.
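The buoyancy trick described above, changing displaced volume while mass stays fixed, can be sketched numerically. The numbers below are illustrative assumptions for the sketch, not specifications of the actual WHOI glider:

```python
# Illustrative sketch of buoyancy-driven diving: pumping oil between an
# internal reservoir and external bladders changes the glider's displaced
# volume (and hence its density) while its mass stays fixed.
# All figures here are made-up assumptions, not the real glider's specs.

RHO_SEAWATER = 1025.0  # kg/m^3, typical seawater density

def glider_density(mass_kg, hull_volume_m3, external_bladder_m3):
    """Density = fixed mass / total displaced volume."""
    return mass_kg / (hull_volume_m3 + external_bladder_m3)

mass = 52.0          # kg (assumed)
hull_volume = 0.050  # m^3 (assumed)

# Oil pushed out to external bladders -> more displaced volume -> less dense -> rises
rising = glider_density(mass, hull_volume, external_bladder_m3=0.002)
# Oil pulled back inside -> less displaced volume -> denser -> sinks
sinking = glider_density(mass, hull_volume, external_bladder_m3=0.000)

# Buoyant with the bladders inflated, negatively buoyant with them deflated
assert rising < RHO_SEAWATER < sinking
```

The same fixed mass is either lighter or heavier than the surrounding water depending only on where the oil sits, which is why the glider needs battery power only for instruments and the rudder.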
Technically, the new vehicle is a hybrid, like Toyota's Prius, because it uses a small amount of battery power to run the onboard instruments and to move the rudder.
Gliders of this type are perfect for long-term, long-distance journeys that humans can't make, Hodges said.
"They can be very helpful in getting measurements that would be too expensive to get otherwise — any kind of study that requires long-term measurements from multiple locations," Hodges told LiveScience.
"If you had to be there in a ship, it would cost millions of dollars."
Aruba's New Windfarm
As Copenhagen ended, unsurprisingly, in confusion, I have the opportunity to tell you a more positive tale, and to show that it is possible for people, even including bankers, to work towards a more sustainable future without necessarily endangering our way of life.
The Vader Piet 30MW wind farm on the island of Aruba.
In this case, it involves the construction of a windfarm in a place where it will directly replace fuel-oil-burning power plants. As you'll see below, this wind farm is quite remarkable in a number of ways, which means the experience will not be as easily replicable everywhere; but it shows there are many places and energy systems which renewable energy can materially improve under almost all criteria.
(part of the wind power series)
Full disclosure:
As indicated below, I financed the project discussed in this post last year.
Amongst notable features, one can find:
* at around 60%, it has one of the highest capacity factors in the world, with 50% more power output per turbine than European offshore windfarms; located on the eastern coast of the island, it is exposed directly to the trade winds, which are highly regular and almost always blow in the same direction (allowing the turbines to be placed very close to one another); their almost constant strength also means that wear and tear is likely to be less than usual, as there are very few brutal changes in regime and torque;
it is a windy place...
* it is now providing 20% of the island's overall electricity needs, replacing dirty and expensive fuel-oil in the process. At night, it will produce up to 60% of the demand.
And thanks to the highly regular wind regime, this is very stable and predictable production. (Even though they pushed for this project to happen, the local power company had quite a shock to see 'for real' how big a portion of their system the windfarm has suddenly become; as is still frequent, utilities have trouble taking wind seriously, but in this case the reality was quite compelling.)
see how the blades bend under the strength of the wind
(the second one was switched off temporarily for visits,
thus its different orientation in this picture).
* the utility will save money on fuel imports and, more importantly, will actually end up with cheaper power:
it buys the electricity from the wind farm at a fixed price over 15 years, roughly equivalent to what it costs to produce electricity from its traditional oil-fired generators with oil prices at $45/bbl.
Who wants to bet on oil being consistently under $45 for the next 15 years?
In fact, the prime minister of the island, who was present at the inauguration, used the opportunity of that ceremony to announce lower power prices for the poorest households on the island...
the windfarm is situated in a very isolated part of the island, invisible to everybody
but it adds to that area's spectacular vistas.
* and the reality is that the windfarm has received an enthusiastic welcome from the population of the island - the project team told me about people all along the road applauding them as they transported the machinery to the site (not a trivial task, as the videos below show):
* and, finally (and this is where I come in), the windfarm was financed at the height of the financial crisis last year. I told the story in a blog post then (How to keep on financing wind farms when banks have no money left), but it's worth underlining here that one of the most dangerous consequences of the crash is that traditional banking - lending to the economy - has been, and still is, directly impacted and curtailed, as a result of lack of liquidity and heightened risk aversion by banks (which are just as stupid and gregarious in systematically cutting off credit as they were enthusiastic in shoving it onto clients before). So it was an especially proud moment for me to see this project, because we really made a difference at the time, saving the wind farm from a potentially damaging delay, and saving very real economic activity on the island and amongst the suppliers (which are mainly European).
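The output figures in the list above can be roughly sanity-checked from the 30 MW rating and the roughly 60% capacity factor. This is a back-of-the-envelope sketch; the implied island demand is inferred from the stated 20% share, not given in the source:

```python
# Back-of-the-envelope check of the Vader Piet figures quoted above.
rated_mw = 30.0
capacity_factor = 0.60   # "around 60%", per the post
hours_per_year = 8760

# Energy = power rating x capacity factor x hours in a year
annual_gwh = rated_mw * capacity_factor * hours_per_year / 1000.0
print(f"Expected annual output: {annual_gwh:.0f} GWh")  # ~158 GWh

# If that covers ~20% of the island's electricity, the implied total demand:
implied_island_demand_gwh = annual_gwh / 0.20
print(f"Implied island demand: {implied_island_demand_gwh:.0f} GWh/yr")
```

For comparison, a typical European offshore windfarm at a ~40% capacity factor would need the same rated power to produce only about two-thirds of that energy, which matches the "50% more output per turbine" claim.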
Wind is a capital-intensive but low risk activity where simple and stable financing structures are both necessary and useful - construction costs need to be spread out over a number of years for power generation costs to make sense.
Technical and operational risks are understood and very small if you have a competent project company, and the revenue profile is highly predictable, thus making it possible for lenders to provide a large part of the initial cost at a fixed price without requiring any benefit sharing, making this cheaper than equity and keeping the ultimate power price down.
This, called "project finance," is the boring kind of banking that makes the economy run but is sadly seen as unsexy or useless whenever new funny products are invented in the capital markets and create opportunities for bonus-generating bubbles...
I've already been dismissed as dreary three times in the past 15 years:
emerging-market bonds were all the rage in the mid-90s (until the Asian crisis), then the dot-coms were 'it' (until the crash), then came the grand multi-product bubble of the past decade, with its mortgage-backed securities, collateralised loan obligations, credit default swaps and the rest.
And having been bailed out, they're at it again, while project finance is still suffering - and wind and solar projects get built more slowly than they could as a result.
Researchers Develop New Bacterial Strain with Higher Butanol Tolerance;Potential to Double the Output of Biobutanol from Conventional Bacterial Fermentation
20 August 2009
Researchers at Ohio State University and Shanghai Institute of Plant Physiology and Ecology, Chinese Academy of Sciences, have developed a method that can double the output of biobutanol compared to conventional bacterial fermentation through the use of a new strain of bacteria and a new bioreactor. They reported their results at the 238th national meeting of the American Chemical Society in Washington, DC.
Under conventional production in a bacterial fermentation tank, the high toxicity of butanol results in a low butanol titer (about 15 grams per liter); i.e., as the butanol concentration in the tank increases, the environment becomes too toxic for the bacteria to survive.
This heavily affects the economics of biobutanol production.
Biobutanol has a number of advantages over ethanol for use as a biofuel: it is more hydrophobic; has a higher energy density; can be transported through existing pipeline infrastructure; and can be mixed with gasoline at any ratio.
To improve butanol tolerance of the organisms, Shang-Tian Yang, professor of chemical and biomolecular engineering at Ohio State, and his colleagues developed a mutant strain of the bacterium Clostridium beijerinckii in a bioreactor containing bundles of polyester fibers.
They immobilized cells of an asporogenous (non spore-producing) mutant of C. beijerinckii in the fibrous bed bioreactor (FBB). Culture medium and process conditions were optimized to facilitate the butanol production.
Cells immobilized in the FBB were used as seeds for each subsequent batch of fermentation; thus, these cells were intermittently challenged with butanol produced by themselves.
After adaptation over several fermentation batches, a mutant strain was isolated from the FBB. Compared with the original strain, the mutant showed higher butanol tolerance and reduced autolysis (self-destruction). At the same time, the fermentation conditions the mutant needed to realize the metabolic shift from acidogenesis to solventogenesis differed from those of its parent strain.
Under subsequently optimized fermentation conditions to maximize the butanol production potential of the mutant, the maximal butanol titer was improved to up to 30 grams per liter.
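The headline "double the output" claim is consistent with the two titer figures given in the article (about 15 g/L conventionally versus 30 g/L for the adapted mutant); a trivial check:

```python
# Titer figures from the article: conventional fermentation vs. the
# butanol-tolerant mutant grown in the fibrous bed bioreactor (FBB).
conventional_titer_g_per_l = 15.0
mutant_titer_g_per_l = 30.0

improvement = mutant_titer_g_per_l / conventional_titer_g_per_l
print(f"Butanol titer improved {improvement:.0f}x")  # 2x, i.e. "double the output"
```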
In addition, they constructed a recombinant C. beijerinckii strain with enhanced mutation frequency.
They suggest that the mutator strain, in combination with the FBB-based adaptation method, will help more rapid evolution of the solvent-producing Clostridium beijerinckii towards higher butanol tolerance.
The engineers are applying for a patent on the mutant bacterium and the butanol production methodology, and will work with industry to develop the technology.
This research is funded by the Ohio Department of Development.
15 Air Carriers Sign Non-Binding MOUs on Synthetic Jet Fuel Purchases with Two Providers: AltAir and Rentech
A core group of 15 commercial airlines has signed non-binding memoranda of understanding (MOUs) for negotiating purchases from two different producers of synthetic jet fuel: AltAir Fuels LLC and Rentech Inc.
These alternative fuels will be more environmentally friendly, on a life cycle basis, than today's petroleum-based jet fuels.
The MOU with AltAir contemplates the production and purchase of up to 750 million gallons of jet fuel and diesel fuel over 10 years (75 million gallons per year) derived from camelina oil.
The Rentech MOU contemplates the production and purchase of approximately 250 million gallons per year of synthetic jet fuel derived principally from coal or petroleum coke, with the resultant carbon dioxide sequestered and the carbon footprint potentially further reduced by integrating biomass as a feedstock.
Twelve airlines from the United States, Canada, Germany and Mexico—Air Canada, American Airlines, Atlas Air, Delta Air Lines, FedEx Express, JetBlue Airways, Lufthansa German Airlines, Mexicana Airlines, Polar Air Cargo, United Airlines, UPS Airlines and US Airways—have signed MOUs with both producers. In addition, Seattle-based Alaska Airlines and Honolulu-based Hawaiian Airlines signed the MOU with AltAir Fuels, and Orlando-based AirTran Airways signed the MOU with Rentech.
Today's announcement reinforces the proactive steps that airlines are taking to stimulate competition in the aviation fuel supply chain, contribute to the creation of green jobs, and promote energy security through economically viable alternatives that also demonstrate environmental benefits.
Our intention as an airline industry is to continue to do our part by supporting the use of alternative fuels. We urge the US government and the investment community also to do their part to further support this critical energy opportunity.
—Glenn Tilton, Air Transport Association of America, Inc. (ATA) Board Chairman and UAL Corporation and United Airlines Chairman, President and CEO
Tilton also noted that discussions with a number of additional alternative-fuel producers about other projects are underway, as are discussions with the US military regarding other cooperative opportunities.
ASTM Certification.
In June, the ASTM International Aviation Fuels Subcommittee having responsibility for jet fuel (D02.J0.01) formally voted upon and passed a new fuel specification that will enable use of synthetic fuels in commercial aviation.
The new specification constructs a framework to enable the use of multiple alternative fuels (including both non-renewable and renewable blends) for aviation, and targets complete interchangeability with conventional fuels produced to specification D1655.
The initial issue of the specification will enable use of fuels from the Fischer-Tropsch (FT) process up to a 50% blend with conventional Jet A. It is expected that the FT approval will be followed by approvals for hydrotreated renewable Jet (HRJ) blends and other alternatives as data from technical evaluations is obtained. (Earlier post.)
AltAir. AltAir Fuels LLC was formed in 2008 to develop projects for the production of jet fuel from renewable and sustainable oils. The company and its partners are designing and building a network of renewable jet-fuel production facilities, with the first plant expected to be running before the end of 2012.
The AltAir Fuels project will initially produce bio-jet fuel (and diesel) at a plant to be located at the Tesoro refinery in Anacortes, Wash.
Bio-jet fuel produced from camelina by AltAir will be piped into the Tesoro refinery facility, where it will be blended with petroleum-based jet fuel, and delivery and servicing issues will be handled at Seattle-Tacoma International Airport.
The jet fuel must meet the then-appropriate ASTM specification for hydrotreated renewable jet (HRJ) fuels. The first delivery is expected in the fourth quarter of calendar year 2012.
The facility will have a nameplate capacity of 100 million gallons per year, and is slated to begin operations in 2012.
The camelina oil will be sourced from Montana-based Sustainable Oils, which has the largest camelina research program in North America and production contracts with numerous farmers and grower cooperatives.
AltAir has chosen refining technology developed by UOP, LLC, a Honeywell company, which has already produced biojet fuel for various test flights and US military contracts in 2009.
Rentech.
Rentech proposes using petroleum coke as feedstock with its Rentech Fischer-Tropsch process. Its Natchez Project in Mississippi as currently contemplated would produce approximately 400 million gallons per year of synthetic fuels and chemicals (including approximately 250 million gallons of synthetic jet fuel) and more than 120 MW of power.
The facility is designed to capture approximately 80% of the carbon dioxide generated in the production process, which will be sold under a long-term agreement with Denbury Resources for enhanced oil recovery to produce otherwise unrecoverable oil at Denbury's Cranfield oil field in Southwest Mississippi as well as at the company's oil fields within the greater Gulf Coast area.
The Cranfield oil field is currently hosting a US Department of Energy (DOE)-sponsored carbon dioxide sequestration project that is the first in the nation to inject more than 1 million tons of carbon dioxide into an underground rock formation followed by additional injections into the saline portion of the reservoir, more than 10,000 feet below the surface.
In February 2009, a report from the US Department of Energy (DOE) National Energy Technology Laboratory (NETL) concluded that coupling a Coal to Liquids (CTL) process with carbon capture and sequestration (CCS) yields a fuel with 5-12% less lifecycle greenhouse gas (GHG) emissions compared to the average emissions profile of petroleum-derived diesel, based on the US national average in 2005.
Adding biomass to the coal in the CTL process (Coal and Biomass to Liquids, CBTL) can reduce the GHG emissions further, according to the study. A mixture of 8% (by weight) biomass and 92% coal can produce fuels which have 20% lower life cycle GHG emissions than petroleum-derived diesel. (Earlier post.)
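The NETL blending figures above can be turned into a rough interpolation. The linear biomass-credit model below is an illustrative assumption anchored on the study's two reported data points (a CTL+CCS fuel at roughly 5-12% below the diesel baseline, and a 20% reduction at 8 wt% biomass); it is a sketch, not NETL's own methodology:

```python
# Back-of-the-envelope lifecycle GHG estimate based on the NETL figures
# quoted above. Emissions are expressed relative to a petroleum-diesel
# baseline of 1.0. The linear biomass credit is an illustrative assumption,
# not the study's own model.

DIESEL_BASELINE = 1.0
CTL_CCS = 0.92               # midpoint of the reported 5-12% reduction
CBTL_8PCT = 0.80             # 20% reduction at 8 wt% biomass, per the study

# Assumed linear credit per weight-percent of biomass, anchored on the two
# data points above (0% biomass -> 0.92, 8% biomass -> 0.80).
CREDIT_PER_PCT = (CTL_CCS - CBTL_8PCT) / 8.0

def cbtl_relative_ghg(biomass_wt_pct: float) -> float:
    """Rough linear estimate of lifecycle GHG relative to petroleum diesel."""
    return CTL_CCS - CREDIT_PER_PCT * biomass_wt_pct

for pct in (0, 4, 8, 12):
    rel = cbtl_relative_ghg(pct)
    print(f"{pct:>2} wt% biomass: {rel:.2f} of diesel baseline "
          f"({(DIESEL_BASELINE - rel) * 100:.0f}% reduction)")
```

Under this crude linear extrapolation, higher biomass shares continue to lower the footprint, but the real relationship depends on feedstock logistics and process details the study models explicitly.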
A separate third-party lifecycle assessment of the carbon footprint of synthetic fuels to be produced at Rentech's Natchez plant, which includes the use of CCS, concluded that the fuels from the facility would produce 11% to 23% fewer carbon dioxide emissions than fuels produced from conventional crude refining.
Will Dolphins Need Noise-Canceling Headphones?
Rising CO2 Could Lead to Louder Oceans
Researchers at the University of Hawaii have discovered a strange and potentially damaging result of increased CO2 in the atmosphere:
louder oceans. More CO2 in the atmosphere leads to more acidic seas, and more acidic seas support fewer sound-absorbing chemical reactions, meaning noise will travel farther and sound louder.
The researchers used ocean models and CO2 projections to predict regional changes in sound absorption. They found that seas at high latitudes and in areas of deepwater formation, which face the worst acidification, will be most dramatically affected, with sound absorption falling by up to 60 percent.
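The pH dependence behind such predictions can be sketched with the boric-acid term of a widely used simplified formula for seawater sound absorption (Ainslie and McColm, 1998). Only this term depends on pH, which is why acidification matters most at low frequencies. The constants follow that formula, but the pH values chosen below are illustrative assumptions, not figures from this study:

```python
import math

# Sketch of the boric-acid (pH-dependent) term from the Ainslie & McColm
# (1998) simplified seawater absorption formula. Only this term depends on
# pH, which is why ocean acidification changes low-frequency absorption.
# The pH values below are illustrative assumptions.

def boric_acid_absorption(f_khz: float, ph: float,
                          temp_c: float = 10.0, salinity: float = 35.0) -> float:
    """Boric-acid contribution to sound absorption, in dB/km."""
    # Relaxation frequency of the boric acid reaction, in kHz.
    f1 = 0.78 * math.sqrt(salinity / 35.0) * math.exp(temp_c / 26.0)
    return (0.106 * (f1 * f_khz**2) / (f_khz**2 + f1**2)
            * math.exp((ph - 8.0) / 0.56))

f = 1.0  # 1 kHz, in the band discussed above
today = boric_acid_absorption(f, ph=8.1)
future = boric_acid_absorption(f, ph=7.7)  # an assumed acidified ocean
print(f"pH 8.1: {today:.4f} dB/km, pH 7.7: {future:.4f} dB/km")
print(f"drop in this term: {(1 - future / today) * 100:.0f}%")
```

A pH drop of 0.4 cuts this term roughly in half, which is the same order as the absorption declines the researchers project for the worst-affected regions.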
Low-frequency sounds (1,000-5,000 hertz), like those from ship propellers and military sonar, will be louder, but scientists aren't yet sure what the impact on marine life will be.
The worst-case scenario is that those louder man-made sounds will interfere with marine mammal communication and echolocation sounds. On the other hand, the animal sounds will be louder and travel farther too, so their calls might not be drowned out.
The researchers are continuing their studies to find out.
Can Bio Coal Turn Coal Power Plants into Green Energy Producers?
Is it possible to turn coal power plants into renewable energy producers?
According to the supporters of bio coal, the answer is yes – coal plants can burn bio coal, created from biomass, without retooling their storage, handling, or burning systems.
Image: Fossil Coal (credit: psd)
Bio coal is somewhat similar to charcoal, but created from biomass harvested from plantations, urban areas, or the forest (brush and dead or diseased trees) and treated by torrefaction to provide a much better fuel quality than straight biomass. Torrefaction is a process carried out at atmospheric pressure (and in the absence of oxygen) at temperatures between 200-300°C, with the biomass partially decomposing during the process.
Bio coal has almost twice the energy density of wood sawdust pellets and is equivalent to the energy density of the coal used by coal fired power plants. According to advocates, bio coal meets all renewable fuel and CO2 reduction regulations and is carbon neutral – the original biomass would naturally release its carbon back into the atmosphere as CO2 through decomposition or fire.
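To make that density comparison concrete, here is a small sketch of how much of each fuel a plant would need for a fixed amount of heat. The heating values are ballpark illustrative assumptions, not figures from the article; actual values vary by feedstock and torrefaction process:

```python
# Rough, illustrative comparison of fuel mass needed to deliver the same
# heat, using ballpark lower heating values (MJ/kg). These figures are
# assumptions for illustration only.

HEATING_VALUES_MJ_PER_KG = {
    "raw wood chips (wet)": 10.0,
    "wood pellets": 17.0,
    "bio coal (torrefied)": 21.0,
    "steam coal": 24.0,
}

TARGET_HEAT_GJ = 1_000  # deliver 1,000 GJ of heat

for fuel, hv in HEATING_VALUES_MJ_PER_KG.items():
    # GJ -> MJ (x1000), then kg -> tonnes (/1000)
    tonnes = TARGET_HEAT_GJ * 1_000 / hv / 1_000
    print(f"{fuel:>22}: {tonnes:7.1f} t for {TARGET_HEAT_GJ} GJ")
```

The point of the comparison: under these assumed values, torrefied bio coal approaches coal's energy density closely enough that handling and transport volumes stay in the same range, which is why co-firing needs little retooling.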
Bio coal is also said to have a significantly greater energy conversion efficiency than biomass used directly as a fuel source.
Renewable Fuel Technologies, a bio coal energy company, says that because coal power plants are the largest source of CO2 emissions in the US, if coal plants switched to bio coal, they could be green energy producers and comply with recent regulations calling for reduced CO2 emissions from power plants.
Bio coal could also be blended with fossil coal in power plants, reducing greenhouse gas emissions, says Verdant Energy Solutions. This means that power plants don't have to switch over to 100% bio coal in order to reap some of the benefits of it.
A study earlier this year found that biomass is best used for generating electricity rather than for making ethanol or other liquid biofuels, as doing so is about 80% more efficient. Any technology that can efficiently turn biomass into bio coal could change the face of renewable energy as we know it, especially if it were a mobile processing unit, capable of going to where the feedstock is rather than relying on transportation to a central facility for conversion.
Emissions Reporting Deadline Draws Pundits Out of the Woodwork
With large emitters scheduled to begin collecting their emissions data Jan. 1, companies are gearing up to ensure their compliance.
But that hasn't stopped critics from ramping up their rhetoric.
On Sept. 22, the Environmental Protection Agency announced it would require about 10,000 facilities that emit about 85 percent of the nation's greenhouse gases to begin collecting their emissions data.
The largest emitters will have to submit annual reports of their emissions, starting in 2011, with information from the 2010 calendar year. Vehicle and engine manufacturers are getting a one-year reprieve.
They don't have to start reporting until model year 2011.
Some politicians are making last-ditch efforts to stop the reporting rule.
Louisiana Gov. Bobby Jindal (R), in a letter to EPA Administrator Lisa Jackson, said that the rule would negatively impact oil refining and production in his state.
He has asked Jackson to back off for now, reports The Hill.
Companies that emit 25,000 metric tons or more of carbon dioxide equivalent per year are subject to the reporting rule.
Failure to measure and report emissions to the EPA can result in fines of up to $37,500 per day, per violation.
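The penalty math is straightforward, and the exposure compounds quickly. In the sketch below, the violation count and duration are hypothetical inputs for illustration; only the per-day cap comes from the rule as described above:

```python
# Hypothetical illustration of maximum penalty exposure under the rule's
# stated cap of $37,500 per day, per violation. The violation count and
# duration below are made-up inputs, not from the EPA rule.

MAX_FINE_PER_DAY = 37_500  # USD, per violation

def max_exposure(violations: int, days: int) -> int:
    """Upper-bound penalty for `violations` violations running `days` days."""
    return violations * days * MAX_FINE_PER_DAY

# e.g. two unreported sources caught 90 days after the deadline
print(f"${max_exposure(2, 90):,}")  # $6,750,000
```

Even a short lapse across a handful of facilities reaches into the millions, which is why law firms are urging early compliance.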
The reporting is intended to set a baseline for any future cap-and-trade program, as well as ensure compliance if, or when, cap-and-trade gets rolling.
Law firms are encouraging companies to get ahead of the rolling wheels of bureaucracy.
"Companies need to understand that from the standpoint of government regulation and public opinion, the debate about global warming is over. That means it's time for them to develop sustainability plans and carbon reduction strategies before regulators, environmental advocates, shareholders and other groups force them to act," said Saulius Mikalonis, Senior Attorney at law firm Plunkett Cooney.
Because the reporting rules will put companies' emissions data in the public eye, activist groups will be able to do company-to-company and plant-to-plant comparisons, Mikalonis said.
"They will create public relations issues and potential legal problems for some companies, especially if they have been marketing themselves as 'green' when the emissions report says otherwise.
But they also may speed up the adoption of energy-saving technologies, which can flow straight to the bottom line," he said.
Yet other environmental lawyers say the EPA is "walking through a legal minefield" if it plans to truly regulate GHGs, reports Truth About Trade & Technology, via the Wall Street Journal.
Strictly defined under the Clean Air Act, the EPA's endangerment finding means that emitters above a 100-ton to 250-ton per year threshold will be subject to emissions standards. Despite the wording in the 1970s-era Clean Air Act, however, the EPA has said that facilities emitting less than 25,000 metric tons annually would be exempt, at least for the first few years.
But that hasn't kept the critics at bay.
EPA's so-called "tailoring rule" that would apply to large emitters only could come under challenge, said Patrick Traylor, a partner at the Washington office of Hogan & Hartson, in the Wall Street Journal article.
"The EPA is on a tightwire without a net with this tailoring rule," Traylor said.
"There's a very real risk a court could vacate the rule and a higher-than-normal risk they could stay it."
The EPA's plan to study the permitting process for facilities under 25,000 metric tons over five years is a positive, said Jason Schwartz, a Legal Fellow at the Institute for Policy Integrity at New York University School of Law, in the article.
However, lower thresholds can be passed at the state level, and individuals, unions and business competitors could use the Clean Air Act to expose emitters to regulation and lawsuits, Schwartz said.
The National Cattlemen's Beef Association, fearing potential penalties related to greenhouse gas emissions, filed a challenge to EPA's endangerment finding on GHG gases. The petition, filed Dec. 23 in the Washington, D.C., Circuit Court of Appeals, said that EPA climate regulations would harm the profitability of large farms.
Without a cap-and-trade law in place, some companies are turning to voluntary emissions reporting as a means of being ready for any such change, and to let investors know that they won't be exposed if such changes come, reports the New York Times.
Boeing is one such company.
Earlier this year it took steps to improve insulation and other HVAC properties at a Seattle-area IT facility, and five similar sites. The company expects to save $55,000 a year and 685,000 kilowatt hours of electricity because of the effort, reports NYT.
Corporations participating in voluntary emissions reporting programs such as the Carbon Disclosure Project can gain a modicum of investor support, but because emissions figures filed through the project do not have to be third-party verified, even this measure is open to skepticism, the NYT reports.
Report from CS3 Symposium Highlights Work Toward Artificial Photosynthesis For Direct Solar Production of Liquid Transportation Fuels
Scientists are making progress toward development of an "artificial leaf" that mimics photosynthesis, but that converts sunlight and water into a liquid fuel such as methanol for cars and trucks, according to a new report summarizing the discussions from the 1st Annual Chemical Sciences and Society Symposium (CS3). However, much work remains to be done in all the component areas, as well as in the integration of the components to a viable artificial leaf.
The three-day symposium, which took place in Germany this past summer, included 30 chemists from China, Germany, Japan, the United Kingdom and the United States. It was organized through a joint effort of the science and technology funding agencies and chemical societies of each country, including the US National Science Foundation and the American Chemical Society (ACS).
The symposium series was initiated though the ACS Committee on International Activities in order to offer a unique forum whereby global challenges could be tackled in an open, discussion-based setting, fostering innovative solutions to some of the world's most daunting challenges.
The symposium focused on four main topics:
* Mimicking photosynthesis using synthetic materials such as the "artificial leaf"
* Production and use of biofuels as a form of stored solar energy
* Developing innovative, more efficient solar cells
* Storage and distribution of solar energy
Photosynthesis is the conversion of water (H2O) and carbon dioxide (CO2) into carbohydrates (CH2O) and oxygen (O2) through a combination of several separate reactions, the two main ones being the splitting of water into hydrogen (H2) and oxygen and the reduction of CO2 using electrons released during the water splitting.
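Written out, the component reactions just described (using (CH2O) as shorthand for one carbohydrate unit) are:

```latex
% Water oxidation (the water-splitting half-reaction):
\[ 2\,\mathrm{H_2O} \;\longrightarrow\; \mathrm{O_2} + 4\,\mathrm{H^+} + 4\,e^- \]
% CO2 reduction, consuming the electrons released above:
\[ \mathrm{CO_2} + 4\,\mathrm{H^+} + 4\,e^- \;\longrightarrow\; \mathrm{(CH_2O)} + \mathrm{H_2O} \]
% Net photosynthetic reaction, driven by light (h\nu):
\[ \mathrm{CO_2} + \mathrm{H_2O} \;\xrightarrow{\;h\nu\;}\; \mathrm{(CH_2O)} + \mathrm{O_2} \]
```

An artificial leaf must drive both half-reactions with light and couple them, so that the electrons from water oxidation feed CO2 reduction directly.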
Chemists have in fact mimicked most of what is required to make these separate reactions proceed.
In essence, they have replicated photosynthesis. But it hasn't been easy, and chemists have yet to replicate the separate reactions in an integrated fashion and in a way that can be commercially applied on a wide scale. ... Scientists have only recently developed an experimental O2 production reaction that is potentially affordable enough for widespread use; most current commercial O2 production methods rely on the use of expensive platinum catalysts.
Most of the CS3 discussion on direct solar fuels focused on current research efforts to develop affordable catalysts for driving reactions, particularly in the areas of hydrogen production, oxygen production, and CO2 reduction.
Dr. Kazuhito Hashimoto of the University of Tokyo commented during this session that two desirable aspects of any next-generation PV technology, indeed of any type of solar energy technology, are:
(1) the ability to self-assemble or self-organize and (2) the ability to self-heal.
Noting that natural photosynthesis is typically only 4½-5% efficient at best, Hashimoto argued that most artificial systems are already "better" than natural photosynthesis with respect to solar energy conversion efficiency.
Clearly, however, artificial systems are lacking something that nature has—specifically, the ability to self-assemble and the ability to self-heal.
The participants at the CS3 meeting suggested that before artificial photosynthesis can become an affordable, sustainable solution for widespread use, chemists must:
* Develop chemical catalysts for the two major processes of artificial photosynthesis (water splitting and CO2 reduction) that can be applied commercially and are made of affordable, earth-abundant materials; and
* Create an "artificial leaf" by coupling water splitting and CO2 reduction in a way that eliminates the need for an external, sacrificial electron donor. (A reduction reaction requires an electron donor. In natural photosynthesis, water serves as the electron donor.)
Among the key overarching messages of the report:
* There's no single best solution to the energy problem.
Scientists must seek more affordable, sustainable solutions to the global energy challenge by considering all the options.
* Investing in chemistry is investing in the future.
Strong basic research is fundamental to realizing the potential of solar energy and making it affordable for large-scale use.
* Society needs a new generation of energy scientists to explore new ways to capture, convert, and store solar energy.
Breaking News:
China's Yellow River may be Threatened by Oil Spill
Last week, an oil pipeline burst in northern China, causing a major oil spill.
Thousands of gallons of diesel flowed into a main tributary of the Yellow River.
The pipeline, operated by CNPC (China National Petroleum Corporation), runs through Shaanxi province and is used to transport diesel from Gansu province in the northwest to more central parts of the country.
Investigations show that the rupture was caused by a local construction project.
To make matters worse, oil has already been found at least 20.5 miles (33 km) downstream from the spill, making the cleanup a daunting task.
So far, 23 containment belts have been set up downstream from the spill and hundreds of people have been trying to clean things up.
China's rivers and lakes are already extremely polluted due to the amount of economic growth that's happened in the past 30 years. This has left over 200 million Chinese citizens without proper drinking water. China also ranks as one of the top polluters in the world, especially when it comes to air pollution.
The Yellow River, or Huang He, is known as the "cradle of Chinese civilization", since northern Chinese civilizations were founded along its basin.
It was also once one of the most prosperous regions in all of China.
However, extreme flood risk and increasing pollution have stripped away some of its former glory, leaving some areas of the river with unfortunate nicknames like "China's Sorrow".
It is the second longest river in China and the sixth longest in the world.
The river owes its colored name to the loess, a silt-like sediment suspended in the water. If the oil from the spill reaches the Yellow River itself, it could mean a lot of trouble for the wildlife living in and along the river, as well as for people and domesticated animals.
Updates will be posted as they happen.
Why is our planet so heavily polluted?
Do we really care about our planet?
If we take a look at how much we're polluting our planet, then the answer is certainly no, we don't.
Pollution is taking a heavy toll in many countries around the world, as if our planet were some sort of dumping ground where we can dump all our trash, with no thought for how this affects not only the planet but ourselves. We keep forgetting that Earth is still our only home, and that by polluting our planet we are really hurting ourselves.
Ever since the industrial revolution, people have been dumping more and more garbage into our rivers, seas, and oceans, and various types of emissions have created significant air pollution in many countries across the world.
Take China, for instance: in achieving its super-fast economic boom, China also experienced super-fast air and river pollution.
Pollution is not only hurting our planet, it is hurting us as well, because pollution often leads to disease; air pollution, for instance, leads to respiratory diseases, while river pollution can lead to poisoning.
The pollution problem is present in many parts of the world, but the countries most affected are China and India.
China not only has a tremendous air pollution problem because of its heavy coal-fired industry, but is also facing serious pollution in its main rivers, the Yangtze and the Yellow River. These rivers are no longer safe for normal use and are a source of many waterborne diseases. China's air pollution problem became notorious during the Beijing Olympics, when China's government made extreme efforts to clean the air as much as possible in a short time.
Of course, now that the Games are over, and once the recession ends, industry will again create even more air pollution.
Despite this serious situation, China's government is still deciding whether to adopt a stronger environmental policy that would ensure cleaner air.
Why?
Because then industry would suffer and China isn't ready to give up on its huge economic boom.
Conditions in India are also more than worrying.
It is enough to visit India's holy river, the Ganges, and you'll realize right away what I'm talking about.
This river is in some parts so heavily polluted that looking at it feels like some kind of nightmare. Around 300 million gallons of waste flow into the Ganges each day, and what is even worse, the river is still considered a symbol of spiritual purity in India.
Some estimates say that one third of all deaths in India result from various waterborne diseases.
So is industry the main factor responsible for pollution?
Well, the answer is both yes and no. Yes, industry creates extreme amounts of waste, but the question we should ask ourselves is: who stands behind the industry?
We, of course.
We are the ones most responsible for pollution, through our greedy nature and lack of ecological conscience.
As long as the world remains driven by high profits, things will not change, and the planet will remain heavily polluted.
We can have adequate laws but what good is the law if nobody obeys it?
What we really need to do is change our current perspective, in which everything is oriented toward money.
Money is important, but it is definitely not more important than our planet; at least it shouldn't be.
We need to realize this as soon as possible, because even the almighty dollar won't mean anything without a planet to spend it on.
China - Reasons for air pollution
China suffers seriously from air pollution.
This huge country has lately been experiencing large economic growth, and it owes almost all of it to coal. Coal is a fossil fuel with very harmful environmental effects, since it not only releases dangerous greenhouse gases into the atmosphere but also contributes heavily to air pollution.
It's a terrible irony: the country with the most people in the world has the worst air quality, which often results in respiratory and other diseases that, in the end, cost the country more than renewable energy research would.
Renewable energy is the only hope for countries like China, India, and even the USA to maintain air quality at an acceptable level.
While the USA has enough funds to invest in the renewable energy sector, the situation is quite different for China and India, which lack the necessary funds and therefore stick with dirty fuels like coal to maintain their rapid economic growth.
This has its price, and once again an ecological one, since in many countries ecology is acceptable only if the economy gives it a green light.
For instance, in 1999 sulfur dioxide emissions reached 18.57 million tons, soot emissions reached 11.59 million tons, and industrial dust emissions reached 11.75 million tons, according to statistics released by the State Environmental Protection Administration of China.
Thankfully, China is aware of its air pollution problem and is trying to fight it, if for nothing else than the 2008 Olympics: the country is investing more funds in the renewable energy sector, particularly wind energy. But as long as coal-based power plants dominate China's industry, there is really no room for significant ecological improvement.
Last year, the European Space Agency identified Beijing and its surrounding area as having the world's highest levels of nitrogen dioxide, a key smog gas originating from power plants, heavy industry, and vehicle emissions. Inhalable particles had reached a dangerous 300 micrograms per cubic meter, meaning that outdoor activities had become hazardous to human health.
There is also a big problem with acid rain in China, although not as big as air pollution.
Acid rain affects more than 30% of China's area, and in some southern areas its frequency even exceeds 80%, causing great environmental damage and altering the pH of surface waters.
China is in a very delicate situation, as its government tries to balance economic needs with environmental protection. This is very tough to achieve, especially now that China has become a major player in the global industrial market yet is still not in a position to set its sights away from coal.
There are some investments in the renewable energy sector, but coal is still dominant. There is also an increase in the number of vehicles (a result of higher living standards), and this will affect China's air and environment in years to come.