CCNet 102/2002 - 4 September 2002

"A doomsday asteroid could wipe out humanity if it collided with
Earth, but scientists said on Tuesday there was little support for
investigating how to turn aside a big deadly space rock. Part of the
problem is the lack of an actual hazard: Yeomans and others at the
workshop agreed that there is no present danger of an asteroid strike big
enough to cause cataclysmic damage to Earth. But if one were detected in the
future, there is no plan in place to deal with the threat, Yeomans and others said."
--Deborah Zabarenko, Reuters, 3 September

    Reuters Science News, 3 September 2002

    Isaac Newton Group of Telescopes, 4 September 2002 

    Aviation Week & Space Technology, 2 September 2002

    Ron Baalke <>

    John Michael Williams <>

    Andrew Yee <>


>From Reuters Science News, 3 September 2002

By Deborah Zabarenko

WASHINGTON (Reuters) - A doomsday asteroid could wipe out humanity if it
collided with Earth, but scientists said on Tuesday there was little support
for investigating how to turn aside a big deadly space rock.

The United States spends between $3.5 million and $4 million a year to track
asteroids and comets that might hit Earth at some point, but very little on
strategies to get our planet out of the way, said astronomer Don Yeomans of
NASA's Jet Propulsion Laboratory in Pasadena, California.

"There's been very little money spent on mitigation studies," Yeomans said
at a scientific workshop on what it would take to get out of the way of an
incoming Near Earth Object, or NEO, as these potential cosmic hazards are known.

Part of the problem is the lack of an actual hazard: Yeomans and others at
the workshop agreed that there is no present danger of an asteroid strike
big enough to cause cataclysmic damage to Earth.

But if one were detected in the future, there is no plan in place to deal
with the threat, Yeomans and others said.

"What if you do find one with our name on it, then whose responsibility is
it?" Yeomans said. "You assume it would be the military's, but which one?
... NASA's charter is to find them and track them. That's it."

Since 1998, the National Aeronautics and Space Administration has been
identifying NEOs as part of a decade-long search for the biggest ones, those
with diameters of 0.6 miles or more.


An asteroid this size could eradicate humans as a species, or send them back
to the dark ages, said Clark Chapman of the Southwest Research Institute, and
for this reason, these big NEOs should be the top priority.

But smaller rocks, about 1,000 feet across, could destroy cities, spur
monstrous tsunamis and flatten an area the size of New Jersey, according to
Erik Asphaug of the University of California at Santa Cruz.

While the bigger asteroids would disrupt society most profoundly and have
the longest-lasting effect, the probability of them ever striking Earth is
extremely low, Asphaug said in an interview.

However, he said, "from the point of economic harm and lives lost today,
it's probably the 300 meter asteroid that is the worst," because the
likelihood of one of these hitting Earth is 10 times higher than that of a
0.6-mile asteroid.

Compared to a natural disaster such as a massive earthquake or volcanic
eruption, where national agencies are prepared to mobilize and communicate
with those in harm's way, Asphaug said there is little or no preparation for
what to do in the event a NEO is headed for Earth.

But he reckoned a mitigation program to nudge the incoming threat off-course
could be put in place in 10 years at a cost of perhaps $10 billion, which
might be made available if a credible NEO threat were detected.

"Once you identify the thing, I'm sure that money will be no problem,"
Asphaug said.

As of last week, NASA's NEO tracking program had identified 2,027 asteroids
and comets that might pass through Earth's neighborhood.

Of the 36 ranked of highest risk, only one merited scientific attention, and
even so, its chance of a collision with Earth was deemed as low as the
chance that some random object would hit our planet in the next few decades.

Copyright 2002, Reuters

MSNBC, 3 September 2002


>From the Isaac Newton Group of Telescopes, 4 September 2002

The Isaac Newton Group of Telescopes is an establishment of the Particle
Physics and Astronomy Research Council (PPARC) of the United Kingdom, the
Nederlandse Organisatie voor Wetenschappelijk Onderzoek (NWO) of the
Netherlands and the Instituto de Astrofísica de Canarias (IAC) in Spain.


Press Release ING 3/2002
Date: Wednesday 4 September 2002
Embargo: For immediate release
The Near Earth Asteroid 2002 NY40 was observed with the William Herschel
Telescope on La Palma, Canary Islands, on the night of August 17 to 18,
2002. The asteroid was imaged just before its closest approach to Earth,
using the Adaptive Optics system NAOMI. These are the first images of a Near
Earth Asteroid obtained with an Adaptive Optics system.

During these observations the asteroid was 750,000 kilometres away, twice
the distance to the Moon, and moving very rapidly across the sky (crossing a
distance similar to the diameter of the Moon in 6 minutes or at 65,000
kilometres per hour). Despite the technical difficulties introduced by this
rapid motion, very high quality images were obtained in the near-infrared with a
resolution of 0.11 arcseconds. This resolution is close to the theoretical
limit of the telescope, and sets an upper limit to the size of the asteroid:
only 400 metres across at the time of the observations.
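As a cross-check on the figures above (a back-of-the-envelope sketch, not part of the press release), the quoted 0.11-arcsecond resolution at 750,000 km reproduces the 400-metre size limit, and the quoted transverse speed reproduces the six-minute Moon-diameter crossing time:

```python
import math

ARCSEC_TO_RAD = math.pi / (180 * 3600)  # 1 arcsecond in radians

def size_from_angle(angle_arcsec, distance_km):
    """Physical extent (km) subtended by a small angle at a given distance."""
    return angle_arcsec * ARCSEC_TO_RAD * distance_km

# 0.11 arcsec resolution at 750,000 km corresponds to the quoted size limit:
limit_m = size_from_angle(0.11, 750_000) * 1000
print(f"upper size limit: {limit_m:.0f} m")         # ~400 m

# 65,000 km/h transverse speed at that distance, compared with the Moon's
# ~0.5 degree apparent diameter:
deg_per_min = math.degrees(65_000 / 750_000) / 60
print(f"one Moon diameter crossed in {0.5 / deg_per_min:.1f} minutes")
```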

Measuring the size of asteroids helps astronomers understand their nature
and formation history as well as the potential threat they pose.

Near Earth Asteroids are a small population of asteroids that periodically
approach or intersect the orbit of our planet, and have the possibility of
colliding with the Earth as probably happened 65 million years ago, ending
the dinosaur era. However, the probability that such an impact could happen
is very low and in particular Near Earth Asteroid 2002 NY40 represents no
danger to human live on Earth.

Close encounters of large Near Earth Asteroids such as 2002 NY40 on August
18 happen approximately every 50 years. The last known case was NEA 2001
CU11 which passed just outside the Moon's orbit on August 31, 1925. Nobody
saw that approach because that asteroid was not discovered until 77 years
later. 2002 NY40 was discovered on July 14, 2002 by the LINEAR telescope in
New Mexico (USA), providing a unique opportunity to obtain observations of
the asteroid from the Earth during its flyby.

Several observers have reported variations in the brightness of 2002 NY40,
suggesting that it is highly elongated and tumbling. Further monitoring of
these variations will tell us whether the asteroid was viewed end-on or
side-on, allowing its size and shape to be determined more accurately.
NAOMI is the WHT's Adaptive Optics system, built by a team from the
University of Durham and the Astronomy Technology Centre, UK. It
incorporates a system of fast-moving mirror elements which correct in
real-time for the defocusing of stars caused by the Earth's turbulent
atmosphere. In good conditions, NAOMI delivers images as sharp as those from
the Hubble Space Telescope.

The Isaac Newton Group of Telescopes (ING) is an establishment of the
Particle Physics and Astronomy Research Council (PPARC) of the United
Kingdom, the Nederlandse Organisatie voor Wetenschappelijk Onderzoek (NWO)
of the Netherlands and the Instituto de Astrofísica de Canarias (IAC) in
Spain. The ING operates the 4.2 metre William Herschel Telescope, the 2.5
metre Isaac Newton Telescope, and the 1.0 metre Jacobus Kapteyn Telescope.
The telescopes are located in the Spanish Roque de Los Muchachos Observatory
on La Palma, which is operated by the Instituto de Astrofísica de Canarias.

The ING NAOMI team consists of Dr. Chris Benn, Dr. Sebastian Els, Dr. Tom
Gregory, Dr. Roy Østensen and Dr. Francisco Prada.


Caption: Movie of asteroid 2002 NY40 on the night of August 17 to 18, 2002.
Every frame of this movie is a 0.5 second exposure in H-band (1.63 microns).
Stars are trailed because the telescope was tracking at the asteroid's proper
motion.
Picture credit: The ING NAOMI team.

Caption: H-band (1.63 microns) image of asteroid 2002 NY40 taken on the
night of August 17 to 18, 2002.
Picture credit: The ING NAOMI team.

Caption: The NAOMI Adaptive Optics system at the Nasmyth focus of the
William Herschel Telescope.
Picture credit: The NAOMI team.
Caption: Composite picture showing the William Herschel Telescope (WHT) with
the Milky Way in the background. The WHT is the most powerful
optical/infrared telescope in Europe.
Picture credit: Nik Szymanek and Ian King.



Dr. Javier Licandro
Isaac Newton Group of Telescopes
Phone: +34 922 425 431
Fax: +34 922 425 401

Dr. Francisco Prada
Isaac Newton Group of Telescopes
Phone: +34 922 425 430
Fax: +34 922 425 401

Dr. Sebastian Els
Isaac Newton Group of Telescopes
Phone: +34 922 425 441
Fax: +34 922 425 401

Mr. Javier Méndez
Public Relations Officer
Isaac Newton Group of Telescopes
Phone: +34 922 425 464, 616 464 111
Fax: +34 922 425 442


>From Aviation Week & Space Technology, 2 September 2002

Scientists and engineers from the Comet Nucleus Tour (Contour) spacecraft
team have started planning a replacement for the probe that was almost
certainly lost at the end of a critical solid-fuel rocket burn on Aug. 15,
targeting a 2006 launch date to one and perhaps more of the same comets.

Contour lifted off July 3 on a Delta II Med Lite booster with only four solid
rocket motors. A replacement for the lost probe would use nine solids on its
Delta and forgo the onboard rocket suspected in the failure.

Cornell University's Joseph Veverka, the Contour principal investigator,
said Aug. 26 that if the lost spacecraft isn't recovered during a
last-chance attempt in December, "we're going to proceed aggressively with a
Contour 2."

"The science objectives remain unique," Veverka said. "Contour is the only
kind of mission that allows you to study the diversity of comets with the
same instrumentation looking at different comets. Equally importantly, it's
the only type of mission that gives you the opportunity to intercept a new
comet if one of those happens to come in from the Oort Cloud at the right
time."

Veverka said his team probably would go back to NASA for more funding under
the Discovery program that spent $159 million to develop and operate Contour
1. Edward Reynolds, Contour project manager at the Johns Hopkins University
Applied Physics Laboratory (APL), said a Contour 2 spacecraft probably would
be less expensive than the $97-million original because instruments and
other hardware custom-developed for the mission can simply be rebuilt. At a
very rough guess, the savings would be on the order of $20 million, he said.

SOME OF that savings would be offset by a redesign of the spacecraft
propulsion system to eliminate the onboard solid rocket motor that is a
prime suspect in the failure. Robert W. Farquhar, the Contour mission
director, said a Contour 2 probably would shift to a more powerful (and
expensive) version of the Delta II launch vehicle and use more hydrazine
fuel instead of the one-shot Star 30 solid-fuel motor on Contour 1.

About 9% of the fuel in that ATK Tactical Systems Co. Star 30BP was
off-loaded to give Contour 1 the right kick for a series of Earth-return
loops designed to take it to the comet Encke on Nov. 12, 2003, and to
Schwassmann-Wachmann-3 (SW3) on June 19, 2006. That mission design also
required about six weeks of phasing orbits to position the probe in exactly
the right spot over the Indian Ocean for the 50-sec. injection burn of the
solid-fuel motor (AW&ST Aug. 19, p. 27).

Farquhar said a Contour 2 mission would probably use a direct-injection
launch instead, with a Delta II carrying nine strap-on boosters instead of
four. Rather than following an all-or-nothing injection profile, the launch
window would be adjusted day-to-day, and controllers would use the extra
hydrazine fuel to make the larger trajectory adjustments that would be
required as a result. The first target of a 2006 launch would be SW3, and
Encke, d'Arrest, or a new comet could be potential follow-ons.

Loss of Contour 1 is a setback to the APL. The Laurel, Md., facility scored
a big success in its drive to be a new supplier of deep space probes with
NASA's successful Near Earth Asteroid Rendezvous (NEAR) mission, and it
plans to use Contour electronics in the nuclear-powered New Horizons mission
to Pluto (AW&ST Feb. 18, p. 38). Congress has funded that mission in the
face of opposition from the Bush administration, and the Contour 1 failure
will give critics like NASA Administrator Sean O'Keefe ammunition as future
New Horizons funding is debated on Capitol Hill this fall (AW&ST July 29).

Reynolds said Contour engineers were working with two preliminary failure
scenarios as they prepared for the methodical fault-tree analysis that will
try to pinpoint the cause under a mishap review board set up by NASA
headquarters. In one, the spacecraft itself failed, either as a result of
acceleration from the rocket motor or because heat from the burning solid
rocket fuel triggered a "systemic failure."

In the other scenario, the solid rocket motor itself failed and caused an
explosion, perhaps when a piece of fuel broke loose near the end of the burn
and blocked the nozzle. Tracking data during and after the injection burn,
which happened while Contour was out of range of Deep Space Network
telemetry antennas, indicated the mishap occurred about 2 sec. before the
motor was scheduled to shut down.

REYNOLDS SAID the spacecraft was tested at APL using an inert Star 30 motor
that weighed 70 lb. more than the flight motor. With the inert motor in
place, the spacecraft went through vibration, spin balance and acoustic
tests. The inert motor was replaced with heaters designed to simulate a Star
30's thermal effects for thermal vacuum testing at APL. Once the flight
motor was installed at Kennedy Space Center, final checkout was limited to
spin balance testing without and with the 70 kg. (154 lb.) of hydrazine fuel
the probe's tanks carried at launch, he said.

Ground telescopes have since spotted three fragments slightly behind
Contour's planned trajectory away from Earth that have been designated
Contour A, B and C (AW&ST Aug. 26, p. 68). Reynolds said when the first two
fragments were found engineers calculated that Contour A was the majority of
the spacecraft and Contour B was perhaps the solid rocket nozzle or a piece
of thermal blanket, based on the brightness of the objects and the known
reflectivity of the spacecraft components. After the third fragment was
found, "basically it was anybody's guess," he said.

A FAINT HOPE remains that the spacecraft will respond in mid-December when
controllers at APL try one last time to contact it. The spacecraft's
"pancake beam" antenna, at the opposite end of the spacecraft from the
rocket motor and so perhaps still functioning, will be in a better position
for Earth transmissions then, but Farquhar has put odds of 10,000 to 1
against establishing contact. Without thruster firings that would at least
enable the Encke encounter in November 2003, the spacecraft will be too far
off trajectory by mid-January 2003 to salvage with the hydrazine fuel that
remains on board, he said.

"December is our best hope," he said.

Copyright 2002. All Rights Reserved.


>From Ron Baalke <>


>From Lori Stiles, UA News Services, 520-621-1877
September 3, 2002

Global wildfires ignited by high-velocity debris from the catastrophic
impact of an asteroid or comet with Earth 65 million years ago spread over
southern North America, the Indian subcontinent and most of the equatorial
part of the world one to three days after impact, according to a new study.

But northern Asia, Europe, Antarctica and possibly much of Australia may
have been spared, David A. Kring of the University of Arizona and Daniel D.
Durda of the Southwest Research Institute report in the Journal of
Geophysical Research - Planets.

UA planetary scientist H. Jay Melosh in 1990 and others modeled global
wildfire scenarios from the horrific impact that is thought to have led to
one of the greatest mass extinctions in Earth history, including dinosaur
extinction. The impact that blasted the immense Chicxulub crater near
Yucatan, Mexico, marked the end of the Age of Reptiles, the Mesozoic, and
heralded the Age of Mammals, the Cenozoic.

"We've added more detail in re-evaluating the extent of the wildfires,"
Kring said. "Our new calculations show that the fires were not ignited in a
single pulse, but in multiple pulses at different times around the world. We
also explored how the trajectory of the impacting object, which is still
unknown, may affect the distribution of these fires."

Their more detailed modeling suggests pulses of misery for life on Earth
during days after impact. More than 75 percent of the planet's plant and
animal species did not survive to see the Cenozoic.

"The fires were generated after debris ejected from the crater was lofted
far above the Earth's atmosphere and then rained back down over a period of
about four days. Like countless trillions of meteors, the debris heated the
atmosphere and surface temperatures so intensely that ground vegetation
spontaneously ignited."

The collision was so energetic -- 10 billion times more energetic than the
nuclear bombs that flattened Hiroshima and Nagasaki in 1945 -- that 12
percent of the impact debris was launched beyond Earth into the solar
system, Kring said.
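To put a rough joule figure on that comparison (an illustrative sketch only: the article gives no explicit yields, so a Hiroshima-class yield of about 15 kilotons is assumed here):

```python
KT_TNT_JOULES = 4.184e12   # energy released by one kiloton of TNT
BOMB_YIELD_KT = 15         # assumed Hiroshima-class yield (not from the article)

# "10 billion times more energetic" than one such bomb:
impact_energy = 1e10 * BOMB_YIELD_KT * KT_TNT_JOULES
print(f"impact energy: {impact_energy:.1e} J")   # on the order of 1e23-1e24 J
```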

About 25 percent of the debris rained back through the atmosphere within two
hours of impact. Fifty-five percent fell back to Earth within 8 hours of
impact, and 85 percent showered down within 72 hours of impact, according to
Kring's and Durda's calculations.

Both physics and Earth's rotation determined the global wildfire pattern.
High-energy debris would have concentrated both around the Chicxulub crater
in Mexico and its global antipode -- which corresponded to India and the
Indian Ocean 65 million years ago. "The way to think of this is, the
material was launched around Earth and headed on a return trajectory to its
launch point," he explained.

"Then, because the Earth rotates, it turned beneath this returning plume of
debris, and the fires migrated to the west. That's what causes the wildfire
pattern."

Durda has turned the simulations into a movie that can be viewed at the
Lunar and Planetary Lab Space Imagery Center Web site.

Kring and Durda noted, not in this paper but in an unrefereed abstract, that
post-impact wildfires generated as much carbon dioxide as, and perhaps more
than, the limestone vaporized at the impact site. Wildfires
played at least as big a role as the limestone target site in disrupting the
carbon cycle and in greenhouse warming.

The team proposes to model other impact events using the code they developed
for these simulations.

Contact Information

David A. Kring

Daniel D. Durda




>From John Michael Williams <>

Hi Benny.

This was an interesting issue, as contrasted with the fear-mongering usually
provided in answer to questions about it.

However, why is it "MITIGATION"?

"Mitigation" would imply spraying something on it to reduce the odor or cut
down the dust. Or, maybe, grinding something down a little finer. 

Then again, maybe mitigating the fear by mitigating the threat really is the
answer.

                     John Michael Williams


>From Andrew Yee <>


Tuesday, September 3, 2002, 8:05 AM EDT

NASA to select new space telescope
By IRENE BROWN, UPI Science News

CAPE CANAVERAL, Fla. (UPI) -- Hoping to peer much farther than ever into
space and back in time, NASA this week is expected to choose the design and
builder of its Next Generation Space Telescope (NGST).

The two competitors for the project are Lockheed Martin Aeronautics Company
of Fort Worth, Texas, and the tandem of TRW's Space Systems division, of
Redondo Beach, Calif., and Ball Aerospace & Technologies Corp. of Boulder, Colo.

NASA plans to build and launch the successor to the Hubble Space Telescope
for about $1.5 billion, project manager Bernard Seery, with the agency's
Goddard Space Flight Center in Greenbelt, Md., told United Press International.

Unlike Hubble, the new observatory will not be placed in low-Earth orbit and
serviced by teams of space shuttle astronauts. Instead, it will be launched
on an expendable rocket out to an orbit about 940,000 miles from Earth. From
that locale, the telescope's infrared sensors can be kept cold enough to
detect light from extremely distant galaxies -- radiation that has been
shifted into infrared wavelengths due to the universe's expansion.

The NGST will need a much larger light-collecting surface to pick up the
faint emissions of far-away galaxies. Its primary mirror is expected to be
at least six meters -- or 20 feet -- in diameter, compared to Hubble's
2.4-meter -- 7.8-foot -- mirror.
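Since light-gathering power scales with the square of the mirror diameter, the sizes quoted above imply roughly a six-fold gain in collecting area (a quick sketch using the article's figures, ignoring obstructions and segment gaps):

```python
import math

def collecting_area(diameter_m):
    """Area of a filled circular aperture of the given diameter (square metres)."""
    return math.pi * (diameter_m / 2) ** 2

ngst_d, hubble_d = 6.0, 2.4   # primary mirror diameters in metres
ratio = collecting_area(ngst_d) / collecting_area(hubble_d)
print(f"NGST would collect about {ratio:.2f}x the light of Hubble")  # 6.25x
```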

NASA and its contractors turned to the military to figure out how to launch
a mirror that size on a rocket with a payload container measuring just five
meters in diameter. The container also must hold a sun shield roughly the
size of a tennis court when fully extended.

"All the deployables have heritage in the (Department of Defense) world,"
said Seery. "We've had to push the state-of-the-art."

Borrowing from de-classified spy satellite technology, Lockheed and TRW both
are proposing fold-up mirrors that would unfurl in space after the telescope
reaches its intended orbit. Lockheed's design is for a glass mirror with its
panels folded alternately up and down like a fan. TRW's mirror design has
two hinges so its panels can fold like a table with drop-down leaves.

"At first blush, the designs seem similar," Seery said. "The purpose is the
same, but in the fine details there are differences."

NASA has paid both Lockheed and TRW for initial design work, some details of
which remain proprietary. In addition to a deployable mirror, the telescope
needs a shield to prevent sunlight and reflected light from Earth from reaching its mirror.
The shield also must be resistant to micro-meteoroid impact.

"They based their designs on what they've done for the DoD for deploying
antennas and other large structures," Seery said.

The military contributions also include mathematical algorithms and computer
programming techniques to coordinate the individual segments of the mirror
into a single large reflective surface.

NGST will have three science instruments: a near-infrared camera, to be
built by the University of Arizona in Tucson in partnership with scientists
sponsored by the Canadian Space Agency; a near-infrared spectrometer,
provided by the European Space Agency using detectors and a micro-shutter
provided by NASA; and a mid-infrared camera/spectrometer to be built by
NASA's Jet Propulsion Laboratory in Pasadena, Calif., in partnership with a
consortium of European institutions overseen by the European Space Agency.

The Canadian Space Agency has agreed to provide the telescope's fine
guidance sensor.

NASA hopes to launch NGST in 2010, the same year the Hubble Space Telescope
is scheduled to be taken out of service, said NASA's space sciences chief Ed
Weiler.

Copyright © 2002 United Press International. All rights reserved.

CCNet is a scholarly electronic network. To subscribe/unsubscribe, please
contact the moderator Benny J Peiser < >. Information
circulated on this network is for scholarly and educational use only. The
attached information may not be copied or reproduced
for any other purposes without prior permission of the copyright holders.
The fully indexed archive of the CCNet, from February 1997 on, can be found
at DISCLAIMER: The opinions,
beliefs and viewpoints expressed in the articles and texts and in other
CCNet contributions do not necessarily reflect the opinions, beliefs and
viewpoints of the moderator of this network.


CCNet TERRA 2/2002 - 4 September 2002

"The effects of urbanization are typically far larger than the
background signal being sought. Therefore, since the true
temperature history of the non-urbanized area of the planet over the past
couple of centuries is so much smaller than the warming produced by
growing urban heat islands over this period, it must be admitted that the
contemporary natural temperature trend of the planet cannot be accurately
determined from the near-surface air temperature record."
--CO2 Science Magazine, 4 September 2002

"FORGET transatlantic rifts about trade, Iraq, Kyoto, or the
International Criminal Court. These have been thoroughly ventilated. One
area of difference has not got the attention it deserves:
demography. It may prove the most important of all."
--The Economist, 22 August 2002

    CO2 Science Magazine, 4 September 2002

    CO2 Science Magazine, 4 September 2002

    Andrew Yee <>

    Mark Hess <>

    Paal Brekke <>

    CO2 Science Magazine, 4 September 2002

    Michael Martin-Smith <>

    The Economist, 22 August 2002


>From CO2 Science Magazine, 4 September 2002

How much land can ten billion people spare for nature? This provocative
question was posed by Waggoner (1995) in the title of an essay designed to
illuminate the dynamic tension that exists between the need for land to
support the agricultural enterprises that sustain mankind and the need for
land to support the natural ecosystems that sustain all other creatures.  As
noted by Huang et al. (2002), human populations "have encroached on almost
all of the world's frontiers, leaving little new land that is cultivatable."
And in consequence of humanity's ongoing usurpation of this most basic of
natural resources, Raven (2002) notes that "species-area relationships,
taken worldwide in relation to habitat destruction, lead to projections of
the loss of fully two-thirds of all species on earth by the end of this
century."

If one were to pick the most significant problem currently facing the
biosphere, this would probably be it: a single species of life, Homo
sapiens, is on course to completely annihilate fully two-thirds of the ten
million or so other species with which we share the planet within a mere
hundred years, simply by taking their land. Global warming, by comparison,
pales in significance. Its impact is nowhere near as severe, being possibly
nil or even positive. In addition, its root cause is highly debated; and
actions to thwart it are much more difficult, if not impossible, to both
define and implement. Furthermore, what many people believe to be the cause
of global warming, i.e., anthropogenic CO2 emissions, may actually be a
powerful force for preserving land for nature.

What parts of the world are likely to be hardest hit by this human
land-eating machine? Tilman et al. (2001) note that developed countries are
expected to actually withdraw large areas of land from farming over the next
fifty years, leaving developing countries to shoulder essentially all of the
growing burden of feeding our expanding species. In addition, they calculate
that the loss of these countries' natural ecosystems to cropland and pasture
will amount to about half of all potentially suitable remaining land, which
"could lead to the loss of about a third of remaining tropical and temperate
forests, savannas, and grasslands," along with the many unique species they
contain.

What can be done to alleviate this bleak situation? In a new analysis of the
problem, Tilman et al. (2002) introduce a few more facts before suggesting
some solutions. They note, for example, that by 2050 the human population of
the globe is projected to be 50% larger than it is today and that global
grain demand could well double, due to expected increases in per capita real
income and dietary shifts toward a higher proportion of meat. Hence, they
merely state the obvious when they conclude that "raising yields on existing
farmland is essential for 'saving land for nature'."
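The arithmetic behind that conclusion is worth making explicit: a doubled grain demand spread over a population only 50% larger still implies a substantial rise in output per person (a quick sketch of the projections quoted above):

```python
population_factor = 1.5   # projected 50% population growth by 2050
demand_factor = 2.0       # projected doubling of global grain demand

# Per-capita demand must rise by the ratio of the two factors:
per_capita_factor = demand_factor / population_factor
print(f"per-capita grain demand rises by about "
      f"{(per_capita_factor - 1) * 100:.0f}%")   # ~33%
```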

So how is it to be done? Tilman et al. (2002) suggest a strategy that is
built around three essential tasks: (1) increasing crop yield per unit of
land area, (2) increasing crop yield per unit of nutrients applied, and (3)
increasing crop yield per unit of water used.

With respect to the first of these requirements, Tilman et al. note that in
many parts of the world the historical rate of increase in crop yields is
declining, as the genetic ceiling for maximal yield potential is being
approached. This observation, they say, "highlights the need for efforts to
steadily increase the yield potential ceiling."  With respect to the second
requirement, they note that "without the use of synthetic fertilizers, world
food production could not have increased at the rate it did [in the past]
and more natural ecosystems would have been converted to agriculture."
Hence, they say the ultimate solution "will require significant increases in
nutrient use efficiency, that is, in cereal production per unit of added
nitrogen, phosphorus," and so forth. Finally, with respect to the third
requirement, Tilman et al. note that "water is regionally scarce," and that
"many countries in a band from China through India and Pakistan, and the
Middle East to North Africa either currently or will soon fail to have
adequate water to maintain per capita food production from irrigated land."
Increasing crop water use efficiency, therefore, is also a must.

Although the impending biological crisis and several important elements of
its potential solution are thus well defined, Tilman et al. (2001) report
that "even the best available technologies, fully deployed, cannot prevent
many of the forecasted problems." This is also the conclusion of the study
of Idso and Idso (2000), who - although acknowledging that "expected
advances in agricultural technology and expertise will significantly
increase the food production potential of many countries and regions" - note
that these advances "will not increase production fast enough to meet the
demands of the even faster-growing human population of the planet."

Fortunately, we have a powerful ally in the ongoing rise in the air's CO2
content that can provide what we can't. Since atmospheric CO2 is the basic
"food" of essentially all terrestrial plants, the more of it there is in the
air, the bigger and better they grow. For a nominal doubling of the air's
CO2 concentration, for example, the productivity of earth's herbaceous
plants rises by 30 to 50% (Kimball, 1983; Idso and Idso, 1994), while the
productivity of its woody plants rises by 50 to 80% or more (Saxe et al.
1998; Idso and Kimball, 2001). Hence, as the air's CO2 content continues to
rise, so too will the land use efficiency of the planet rise right along
with it (see also Plant Growth Data on our website). In addition,
atmospheric CO2 enrichment typically increases plant nutrient use efficiency
and plant water use efficiency (see Nitrogen Use Efficiency and Water Use
Efficiency in our Subject Index). Thus, with respect to all three of the
major needs noted by Tilman et al. (2002), increases in the air's CO2
content pay huge dividends, helping to increase agricultural output without
the taking of new lands from nature.

In conclusion, it would appear that the extinction of two-thirds of all
species of plants and animals on the face of the earth is essentially
assured within the next century, if world agricultural output is not
dramatically increased. This unfathomable consequence will occur simply
because we will need more land to produce what is required to sustain
us and, in the absence of the needed productivity increase, we will
simply take that land from nature to keep ourselves alive. It is also the
conclusion of scientists who have studied this problem in depth that the
needed increase in agricultural productivity is not possible, even with
anticipated improvements in technology and expertise. With the help of the
ongoing rise in the air's CO2 content, however, Idso and Idso (2000) have
shown that we should be able - but just barely - to meet our expanding food
needs without bringing down the curtain on the world of nature.

That certain forces continue to resist this reality is truly incredible.
More CO2 means life for the planet; less CO2 means death ... and not just
the death of individuals, but the death of species. And to allow, nay, cause
the extinction of untold millions of unique and irreplaceable species has
got to rank close to the top of all conceivable immoralities.

We humans, as stewards of the earth, have got to get our priorities
straight. We have got to do all that we can to preserve nature by helping to
feed humanity; and to do so successfully, we have got to let the air's CO2
content rise. Any policies that stand in the way of that objective are truly

Huang, J., Pray, C. and Rozelle, S.  2002.  Enhancing the crops to feed the
poor.  Nature 418: 678-684.

Idso, C.D. and Idso, K.E.  2000.  Forecasting world food supplies: The
impact of the rising atmospheric CO2 concentration.  Technology 7S: 33-55.

Idso, K.E. and Idso, S.B.  1994.  Plant responses to atmospheric CO2
enrichment in the face of environmental constraints: a review of the past 10
years' research.  Agricultural and Forest Meteorology 69: 153-203.

Idso, S.B. and Kimball, B.A.  2001.  CO2 enrichment of sour orange trees: 13
years and counting.  Environmental and Experimental Botany 46: 147-153.

Kimball, B.A.  1983.  Carbon dioxide and agricultural yield: an assemblage
and analysis of 430 prior observations.  Agronomy Journal 75: 779-788.

Raven, P.H.  2002.  Science, sustainability, and the human prospect.
Science 297: 954-959.

Saxe, H., Ellsworth, D.S. and Heath, J.  1998.  Tree and forest functioning
in an enriched CO2 atmosphere.  New Phytologist 139: 395-436.

Tilman, D., Cassman, K.G., Matson, P.A., Naylor, R. and Polasky, S.  2002.
Agricultural sustainability and intensive production practices.  Nature 418:

Tilman, D., Fargione, J., Wolff, B., D'Antonio, C., Dobson, A., Howarth, R.,
Schindler, D., Schlesinger, W.H., Simberloff, D. and Swackhamer, D.  2001.
Forecasting agriculturally driven global environmental change.  Science 292:

Waggoner, P.E.  1995.  How much land can ten billion people spare for
nature?  Does technology make a difference?  Technol. Soc. 17: 17-34.

Copyright © 2002.  Center for the Study of Carbon Dioxide and Global Change


>From CO2 Science Magazine, 4 September 2002

The warming of near-surface air over non-urban areas of the planet during
the past one to two centuries is believed to have been less than 1°C.
Simultaneous warming in many growing cities, on the other hand, may have
been a full order of magnitude greater. Thus, since nearly all near-surface
air temperature records of this period have been obtained from sensors
located in population centers that have experienced significant growth, it
is absolutely essential that urbanization-induced warming be removed from
all original temperature records when attempting to accurately assess what
has truly happened in the natural non-urban environment.
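One conceptually simple form of such a correction, differencing an urban
record against a nearby rural reference, can be sketched as follows. This is
a minimal illustration with invented anomaly series, not the adjustment
procedure of any particular temperature-record compilation.

```python
# Minimal sketch: estimate the urban warming bias as the difference
# between urban and rural least-squares trends, then remove it from the
# urban record. All station anomalies below are invented.

def linear_trend(values):
    """Least-squares slope per time step for an evenly spaced series."""
    n = len(values)
    t_mean = (n - 1) / 2.0
    v_mean = sum(values) / n
    num = sum((t - t_mean) * (v - v_mean) for t, v in enumerate(values))
    den = sum((t - t_mean) ** 2 for t in range(n))
    return num / den

urban = [0.00, 0.15, 0.32, 0.46, 0.61]   # deg C anomalies, urban site
rural = [0.00, 0.05, 0.11, 0.14, 0.21]   # deg C anomalies, rural site

bias_per_step = linear_trend(urban) - linear_trend(rural)
adjusted = [u - bias_per_step * t for t, u in enumerate(urban)]
# The adjusted series now carries the rural (background) trend only.
```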

One popular method of minimizing the potentially huge spurious effect of
urbanization on global or regional temperature trends is to only consider
data from small towns; yet even modest congregations of people can generate
heat of significant magnitude. Oke (1973), for example, demonstrated that
settlements with as few as 1,000 inhabitants typically exhibit heat islands
on the order of 2 to 2.5°C - values that could be as much as four times
greater than the mean increase in global air temperature that is believed to
have occurred since the end of the Little Ice Age.

Changnon (1999) also emphasized this point in a study of soil temperatures
at a rural site in Illinois that spanned the period 1889 to 1952. His
results revealed the existence of a significant urban-induced warming bias
in nearby air temperature records that had not previously been detected, or
even suspected. Specifically, data from the rural setting revealed a
warming 0.17°C smaller than the 0.57°C increase determined from three
benchmark stations with populations of less than 2,000 people. Changnon
underscored the significance of his finding by stating that "this could be
significant because the IPCC (1995) indicated that the global mean
temperature increased 0.3°C from 1890 to 1950."

In addition to air and soil temperatures, water temperatures have sometimes
been used to detect urbanization effects on nearby air temperatures over
land.  Maul and Davis (2001), for example, analyzed air and seawater
temperature data obtained over the past century at sites of primary tide
gauges of the U.S. Coast and Geodetic Survey, calculating trends for the 14
longest records.  The mean century-long warming of these 14 sites was
0.74°C.  The result for Boston (a 100-year warming of 3.6°C), however, was
almost seven times greater than the mean of the other 13 stations (0.52°C)
and probably suffers from some unspecified error.

With respect to the results obtained for the 13 more consistent stations,
0.52°C of warming does not seem unusual for the 20th century, when the
planet experienced the bulk of its recovery from the global chill of the
Little Ice Age.  Nevertheless, the authors note that "each site has
experienced significant population growth in the last 100 years, and ...
with the increase in maritime traffic and discharge of wastewater one would
expect water temperatures to rise," which implies that some unspecified
portion of the half-degree C warming is due to a maritime analogue of the
urban heat island.  And since Maul and Davis note that on the time scale
they investigated "one would expect the water temperatures to equilibrate to
the air," we are left with the conclusion that the true 20th century change
in air temperature over the 13 U.S. coastal sites was likely significantly
less than 0.52°C.

A study of 51 watersheds in the eastern United States also reveals the
potential for an urban warming bias in climatic records that may previously
have been thought to be unaffected by this phenomenon.  In this study, Dow
and DeWalle (2000) report that a complete transformation from 100% rural to
100% urban characteristics results in a 31% decrease in watershed
evaporation and a 13 W/m2 increase in sensible heating of the atmosphere.
Based upon their results, we have calculated that, to a first approximation,
a transformation from a totally rural regime to a mere 2% urbanization
regime could increase the near-surface air temperature by as much as a
quarter of a degree Centigrade (See The Urbanization of America's
Watersheds: Climatic Implications).  This powerful anthropogenic but
non-greenhouse effect of urbanization on the energy balance of the watershed
and the temperature of the boundary-layer air above it begins to express
itself with the very first hint of urbanization and, hence, may be most
difficult to remove from instrumental air temperature records that are used
in attempts to identify any greenhouse warming that may be present.  Indeed,
the warming effects of urbanization may already be present in many
temperature records that have been considered "rural enough" to be devoid of
all human influence, when such is really not the case.
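The first-approximation scaling described above can be written out
explicitly. Two loud assumptions in this sketch: the sensible-heating
perturbation is taken as linear in urban fraction, and it is converted to a
temperature change with an assumed sensitivity of 1°C per W/m2 chosen purely
for illustration; only the 13 W/m2 figure comes from the Dow and DeWalle
results as quoted.

```python
# Linear-in-urban-fraction scaling of the watershed heating effect.
# FULL_URBAN_FORCING is the 13 W/m2 quoted above; SENSITIVITY is an
# assumed placeholder, not a value from the study.

FULL_URBAN_FORCING = 13.0   # W/m2 of added sensible heating at 100% urban
SENSITIVITY = 1.0           # deg C per (W/m2), illustrative assumption

def warming_from_urban_fraction(fraction):
    """Near-surface warming implied by a given urban fraction (0..1)."""
    return FULL_URBAN_FORCING * fraction * SENSITIVITY

print(round(warming_from_urban_fraction(0.02), 2))  # 2% urbanization
```

With these assumptions, 2% urbanization yields roughly a quarter of a
degree, matching the figure in the text.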

Much the same thing has been observed at other places around the world.  In
China, for example, Weng (2001) used remotely-sensed Landsat Thematic Mapper
data in a Geographic Information System to determine the temperature
consequences of the urban development of the third largest river delta in
China (Zhujiang Delta) that followed economic reforms instituted there in
1978.  Between 1989 and 1997, it was determined that cropland area declined
by almost 50%, while the area of urbanized land increased by nearly the same
amount, with the result, in the words of Weng, that "urban development
between 1989 and 1997 has given rise to an average increase of 13°C in
surface radiant temperature."

In Australia, Torok et al. (2001) observed that urban-rural temperature
differences scale linearly with the logarithms of city populations, as is
true for cities in Europe and North America.  They also learned that
Australian city heat islands are generally of lesser magnitude than those of
similar-size European cities, which are typically of lesser magnitude than
those of similar-size North American cities.  The regression lines for all
three continents, however, converge in the vicinity of a population of
1,000 people, where the urban-rural temperature difference is approximately
2.2 ± 0.2°C, essentially the same as what Oke (1973) had reported nearly
three decades earlier.
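The linear-in-log(population) relationship reported by Oke and by Torok et
al. is easy to express directly. The slopes and intercepts below are not the
published regression coefficients; they are chosen only so that the three
lines converge near a population of 1,000 at about 2.2°C, as the text
describes, while preserving the reported ordering (North America > Europe >
Australia) at larger populations.

```python
import math

# Heat island magnitude modeled as dT = a * log10(population) + b.
# Coefficients are illustrative, not the published values.
COEFFS = {
    "North America": (3.0, -6.8),
    "Europe":        (2.2, -4.4),
    "Australia":     (1.8, -3.2),
}

def heat_island(region, population):
    """Urban-rural temperature difference (deg C) for a given region."""
    a, b = COEFFS[region]
    return a * math.log10(population) + b

for region in COEFFS:
    print(region, round(heat_island(region, 1000), 2))   # all near 2.2
```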

In addition to city temperatures increasing in response to increases in city
population, it is possible for urban-rural temperature differences to
intensify in cities experiencing no change in population, as demonstrated by
Bohm (1998), who analyzed urban, suburban and rural temperature records in
and around Vienna, Austria over the 45-year period 1951 to 1996.  During
this time, Vienna experienced zero population growth.  However, there was a
20% decrease in woodland area and a 30% decrease in grassland area within
the city, as well as a doubling of the number of buildings, a ten-fold
increase in the number of cars, a 60% increase in street, pavement and
parking area, and a 2.5-fold increase in energy consumption.  As a
consequence of these changes, suburban stations exhibited temperature
increases ranging from 0.11 to 0.21°C over the 45-year period, while urban
stations experienced temperature increases ranging from zero, in the
historic center of the city, to 0.6°C in the area of most intensive urban
development.
In light of these several findings, there is clearly ample opportunity for
large errors to occur in attempts to reconstruct non-urban temperature
trends of the past century or so.  Given the magnitudes of these potential
errors, which often rival or vastly exceed the magnitude of the purported
non-urban global temperature trend, it appears that more detailed analyses
of urban population and development characteristics are needed before we can
be confident that the global temperature record of the past century or so is
properly corrected for these phenomena.  And until this is done, it would be
premature to put too much faith in that record as it stands today.

A case in point is provided by Hasanean (2001), who investigated
near-surface air temperature trends in eight Eastern Mediterranean cities.
All of the stations with sufficiently long records exhibited similar uniform
warming trends that began about 1910; but only some of them exhibited less
coherent and discontinuous warming trends in the 1970s.  In view of what we
know about urban effects on temperature, the first of these warming trends
may well represent, at least partially, a true background warming of the
entire region; but the divergent temperature histories of the cities in the
1970s and beyond may well be expressions of differences in their rates of
urban growth and development.  Of the eight cities studied, in fact, four of
them exhibited overall warming trends while four of them exhibited overall
cooling trends, allowing one to say little about the mean temperature trend
of the entire region.

A study from Norway raises similar concerns.  Nordli (2001) developed a
number of summer temperature histories for rural areas of the central and
southern parts of the country based on relationships established between
modern instrumental temperature records and dates of local grain harvests of
barley and oats, finding that summer temperatures during the last part of
the Little Ice Age (1813-1880) were about 1°C lower than those of the last
70 years, while "the warmest decade of the series is the 1930s."  This
rural-based assessment of climate change suggests there has been no net
warming of the Norwegian countryside for the past seven decades, in contrast
to what is claimed for the world by the Intergovernmental Panel on Climate
Change on the basis of almost exclusively urban records.

Holmgren et al. (2001) have observed much the same thing in South Africa.
Based on a 3000-year temperature history for the southern part of the
continent - which they derived from a correlation between color variations
in the annual growth layers of a stalagmite and an area-averaged regional
temperature series - they were able to clearly delineate the existence of
both the Medieval Warm Period and Little Ice Age.  As for the late
nineteenth century warming that appears in the instrumental surface record,
however, they could find "little evidence" of it in the stalagmite data,
which are not affected by the phenomena that produce the urban warming that
typically contaminates near-surface air temperature records of even small
towns.

With respect to this latter topic, i.e., causes of urban heat islands,
Balling et al. (2002) recently investigated the role played by the buildup
of carbon dioxide in the air over cities that results from the copious
quantities of CO2 emitted to the atmosphere by vehicular traffic and
industrial processes (see Urban CO2 Dome in our Subject Index).  Using a
one-dimensional infrared radiation simulation model supplied with data on
atmospheric CO2 concentration, temperature and humidity obtained from
vertical profiles of these parameters measured during daily aircraft flights
that extended through and far above the urban CO2 dome of Phoenix, Arizona,
they calculated that the excess CO2 over the city produced by the burning of
fossil fuels creates a maximum warming that is one to two orders of
magnitude less than that produced by the other urban heat island-producing
phenomena of the city.

What is the bottom line with respect to all of these observations?  Hegerl
et al. (2001) indicate that all instrumental long-term near-surface air
temperature data sets are plagued by systematic errors that "cannot be
assessed at present," one of which they say is "urban warming." In this
assessment they are correct. Nevertheless, they go on to do precisely what
they say cannot be done, claiming that "the effect of urbanization on
estimates of global-scale signals should be small." From the several reports
discussed in this brief summary, however, it is clear that nothing could be
further from the truth; the effects of urbanization are typically far larger
than the background signal being sought. Therefore, since the true
temperature change of the non-urbanized area of the planet over the past
couple of centuries is so much smaller than the warming produced by growing
urban heat islands over this period, it must be admitted that the
contemporary natural temperature trend of the planet cannot be accurately
determined from the near-surface air temperature record.

Balling Jr., R.C., Cerveny, R.S. and Idso, C.D.  2002.  Does the urban CO2
dome of Phoenix, Arizona contribute to its heat island?  Geophysical
Research Letters 28: 4599-4601.

Bohm, R.  1998.  Urban bias in temperature time series - A case study for
the city of Vienna, Austria.  Climatic Change 38: 113-128.

Changnon, S.A.  1999.  A rare long record of deep soil temperatures defines
temporal temperature changes and an urban heat island.  Climatic Change 42:

Dow, C.L. and DeWalle, D.R.  2000.  Trends in evaporation and Bowen ratio on
urbanizing watersheds in eastern United States.  Water Resources Research
36: 1835-1843.

Hasanean, H.M.  2001.  Fluctuations of surface air temperature in the
Eastern Mediterranean.  Theoretical and Applied Climatology 68: 75-87.

Hegerl, G.C., Jones, P.D. and Barnett, T.P.  2001.  Effect of observational
sampling error on the detection of anthropogenic climate change.  Journal of
Climate 14: 198-207.

Holmgren, K., Tyson, P.D., Moberg, A. and Svanered, O.  2001.  A preliminary
3000-year regional temperature reconstruction for South Africa.  South
African Journal of Science 97: 49-51.

Maul, G.A. and Davis, A.M.  2001.  Seawater temperature trends at USA tide
gauge sites.  Geophysical Research Letters 28: 3935-3937.

Nordli, P.O.  2001.  Reconstruction of nineteenth century summer
temperatures in Norway by proxy data from farmers' diaries.  Climatic Change
48: 201-218.

Oke, T.R.  1973.  City size and the urban heat island.  Atmospheric
Environment 7: 769-779.

Torok, S.J., Morris, C.J.G., Skinner, C. and Plummer, N.  2001.  Urban heat
island features of southeast Australian towns.  Australian Meteorological
Magazine 50: 1-13.

Weng, Q.  2001.  A remote sensing-GIS evaluation of urban expansion and its
impact on surface temperature in the Zhujiang Delta, China.  International
Journal of Remote Sensing 22: 1999-2014.
Copyright © 2002.  Center for the Study of Carbon Dioxide and Global Change


>From Andrew Yee <>

University of Washington
Seattle, Washington

FROM: Sandra Hines, 206-543-2580,


Scientists zero in on Arctic, hemisphere-wide climate swings

In the late 1990s, as scientists were reaching consensus that the Arctic had
gone through 30 years of significant climate change, they began reading the
first published papers about the Arctic Oscillation, a phenomenon reported
to have hemisphere-wide effects.

In short order the arctic-science and the global-change communities were
galvanized, says Richard Moritz, polar oceanographer with the UW's Applied
Physics Laboratory and lead author of a review of recent Arctic climate
change in the Aug. 30 special polar-science issue of Science.

"We've learned more about the dynamics of post-glacial arctic climate
change in the last five years than in the 50 years previous," Moritz says.
"For example, the recent trend in the Arctic Oscillation explains the
warming observed in the Arctic better than anything else."

Advances in understanding arctic climate change are particularly timely,
with some studies indicating that the recent trend in the Arctic Oscillation
results partly from human activities that generate greenhouse gases and
sulfate particles, and deplete stratospheric ozone. Scientists, planners and
policymakers need to know what the changes of the last 30 years portend.

Thus climate modelers have redoubled their efforts to determine the physics
behind the patterns of change. Although their models portray realistic
day-to-day and month-to-month variations in the Arctic Oscillation, they
fail to capture the magnitude of the longer-term trend in the Arctic
Oscillation that was observed from 1970 to 2000.
While paleoclimatologists studying the climate record of the past 1,000
years have not reached a consensus on the importance of the Arctic
Oscillation pattern over this longer period, some surprising findings
indicate that past Arctic warmings and coolings tended to coincide with
low-frequency El Nino-Southern Oscillation events in the tropical Pacific.

The review by Moritz and co-authors Cecilia Bitz, a sea-ice expert with the
UW's Applied Physics Laboratory, and Eric Steig, assistant professor with
the UW's Quaternary Research Center, refers to more than 80 published
papers, most appearing in just the last two years. The co-authors say that
warming of the surface from 1970 to 2000 in the Northern Hemisphere was
greatest in the Arctic, causing changes in precipitation, snow cover and the
extent of sea ice.

The Arctic Oscillation is a seesaw pattern in which atmospheric pressure at
the polar and middle latitudes fluctuates between positive and negative
phases. The wind patterns associated with the Arctic Oscillation affect the
surface temperature over North America and Eurasia, as well as the Arctic.
The Arctic Oscillation was first described in a 1998 article by David
Thompson, then a graduate student at the UW and now an assistant professor
at Colorado State University, and John M. Wallace, a UW professor.

"The Arctic Oscillation provides a very fruitful framework and the result is
that a tremendous amount of work has been done in a relatively short period
of time," Moritz says. "Attempts to model the pattern of recent Arctic and
global warming have to come to grips with the problem of the Arctic
Oscillation." Climate modelers have benefited from a growing understanding
of sea-ice physics and the best-ever measurements of how heat from the sun
and the atmosphere affects the pack ice that covers the Arctic Ocean.
Moritz, for example, is director of the SHEBA (Surface Heat Budget of the
Arctic Ocean) Project Office funded by the National Science Foundation and
Office of Naval Research. Now in its analysis phase, SHEBA locked an
icebreaker into the pack ice for a full year in the late '90s to measure the
interactions of the ice, atmosphere and the ocean during all four seasons.

Because so many climate modelers worldwide are working on the Arctic
Oscillation, Moritz says it's conceivable that in a year or two we will
understand the fundamental physics of the Arctic Oscillation, and be able to
account for its recent trend. "If we can't, it won't be for lack of trying."

Richard Moritz, 206-543-8023,

Cecilia Bitz, 206-543-1339,

Eric Steig is on vacation in Massachusetts but is checking e-mail, or you
can try the family residence where he is staying, 413-458-4967

Images for use by news media only.
Photo credit required: University of Washington

Efforts in the late 1990s contributing to a better understanding of climate
dynamics in the Arctic included the SHEBA (Surface Heat Budget of the Arctic
Ocean) project sponsored by the National Science Foundation and Office of Naval
Research to learn how the ocean, pack ice and atmosphere interact during the
course of an entire year. Richard Moritz, lead author of an Aug.
30 review of the dynamics of recent climate change in the Arctic, is
director of the SHEBA project office based at the University of Washington.
The project is now in its analysis phase.

[Image 1: (331KB)]
Two researchers walk to their research hut from the Canadian icebreaker that
served as floating hotel, power source, communications base and science
headquarters for Ice Station SHEBA.

[Image 2: (407KB)]
The Canadian icebreaker that served as floating hotel, power source,
communications base and science headquarters for Ice Station SHEBA is
surrounded by huts and tents used for research,
storage and as garages for snowmobiles.

[Image 3: (182KB)]
Life preservers were a necessity during summer melting as Ice Station SHEBA
researchers and crew skirted open water, melt ponds and slushy snow, all
part of the summer melting of the ice pack.

[Image 4: (495KB)]
As summer melting of the ice pack progressed, Ice Station SHEBA was riddled
with rivers of open water and melt ponds that had to be crossed to reach
research huts and equipment.

[Image 5: (147KB)]
As summer melting of the ice pack progressed, Ice Station SHEBA research huts
and logistics structures were floated on barrels when the ice melted out
from under them.


>From Mark Hess <>

Krishna Ramanujan                       August 28, 2002
Goddard Space Flight Center, Greenbelt, Md.
(Phone: 301/286-3026)

Release No: 02-132

A NASA researcher finds that the amount of sea ice that moves between
Greenland and Spitsbergen, a group of islands north of Norway, is dependent
upon a "wave" of atmospheric pressure at sea level. By being able to
estimate how much sea ice is exported through this region, called Fram
Strait, scientists may develop further insights into how the ice impacts
global climate.

This export of sea ice helps control the thermohaline circulation, a deep
water ocean conveyor belt that moves warm, salty water poleward and cold,
fresh water toward the equator. The thermohaline circulation is one of the
primary mechanisms that maintains the global heat balance.

Don Cavalieri, a researcher at NASA's Goddard Space Flight Center in
Greenbelt, Md., discovered a link between the transport of sea ice through
this region and the position or phase of the longest sea level pressure wave
circling the Earth at polar latitudes.

Until now, scientists have had inconsistent results when trying to identify
the mechanism behind this transport of sea ice. The North Atlantic
Oscillation, in particular, was unable to explain the changes in sea ice
transport through Fram Strait.

"The significance of this work is the connection between the phase of the
longest atmospheric wave called 'wave 1' in sea level pressure and long-term
Arctic Ocean and sea ice variability," said Cavalieri.

Sea level pressure is made up of high and low pressure systems, as any
weather map will show. The large-scale semi-permanent highs and lows define
the longest pressure waves, which are called planetary waves because they
extend thousands of miles and circle the world. The longest wave, called
wave 1, is made up of one ridge (high pressure) and one trough (low
pressure). It turns out that wave 1 is the dominant pattern at polar
latitudes. Because these planetary waves are so dominant in wintertime
atmospheric circulation, their amplitudes (strength) and phases (position)
provide useful information on large-scale wind patterns and thus on sea ice
transport.

The Icelandic Low is the primary weather system in the North Atlantic. At
times this low pressure system extends northeastward into the Barents Sea.
When this happens a secondary low pressure system may develop in the Barents
Sea region. It is the counterclockwise circulation around this secondary low
pressure system in the Barents Sea that drives sea ice through the Fram
Strait. Whenever this secondary low pressure system exists, wave 1 shifts
eastward and is said to be in its eastward phase, as opposed to a westward
phase.

When wave 1 is in its westward mode, the Icelandic Low is more intense and
localized, no longer extending to the Barents Sea. Because of the position
of the Low relative to the Strait, the winds are more westerly and less ice
is forced southward through Fram Strait.

Variations in the phase of wave 1 between these two extreme modes also seem
to control the cycle of Arctic Ocean circulation which reverses from
clockwise to counterclockwise (or anticyclonic to cyclonic, respectively)
every 6 or 7 years.

Cavalieri used simulations for the 40-year period (1958-1997) from two
computer models to obtain a record of the volume of sea ice that moved
through Fram Strait. The two models each showed a similar correlation
between the eastward phase of wave 1 and movement of sea ice through the
strait, with the exception of the two anomalous years 1966 and 1967.
When those years were removed, one ice-ocean model, using monthly surface
wind and air temperature data, found that the wave 1 eastward phase
explained 70 percent of Arctic ice export through Fram Strait, while the
other model, which used daily surface wind and air temperature data,
accounted for 60 percent of the sea ice export.
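Statements of the form "explained 70 percent of Arctic ice export"
correspond to the squared correlation (r²) between predictor and response.
A minimal sketch, using invented series for the wave 1 phase and ice export
rather than the model output described above:

```python
# r^2 (fraction of variance explained) between two series.
# Both series below are invented for illustration.

def variance_explained(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var_x = sum((a - mx) ** 2 for a in x)
    var_y = sum((b - my) ** 2 for b in y)
    return cov * cov / (var_x * var_y)   # r squared, in [0, 1]

phase = [10, 25, 5, 40, 30, 15]            # wave 1 eastward phase (arbitrary)
export = [2.1, 3.0, 1.8, 3.9, 3.4, 2.5]    # ice export (arbitrary units)
r2 = variance_explained(phase, export)
```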

Cavalieri also used Northern Hemisphere monthly sea level pressure grids to
obtain phase and amplitude information for wave 1.

The paper appeared in a recent issue of Geophysical Research Letters.

The study was funded by NASA's Cryospheric Sciences Research Program.

For more information, please see:


>From Paal Brekke <>

Meteorologists can no longer view the Earth as an isolated system. Both
long-term climate changes and day-to-day weather show links with the Sun's
activity. Scientists therefore study the nature of those links intensely.
With data from ESA's spaceprobes SOHO, Cluster, and Ulysses, we now have the
information we need to solve the mystery of how the Sun's activity affects
the climate here on Earth.

Full story:


Paal Brekke,
SOHO Deputy Project Scientist  (European Space Agency - ESA)

NASA Goddard Space Flight Center,      Email:
Mail Code 682.3, Bld. 26,  Room 001,   Tel.:  1-301-286-6983 /301 996 9028
Greenbelt, Maryland 20771, USA.        Fax:   1-301-286-0264


>From CO2 Science Magazine, 4 September 2002

A 232-Year History of High Water Levels at Liverpool

Woodworth, P.L. and Blackman, D.L. 2002. Changes in extreme high waters at
Liverpool since 1768.  International Journal of Climatology 22: 697-714.

What was done
The authors analyzed four discontinuous sets of high-water data from the
Liverpool waterfront that span the period 1768-1999, looking for changes in
annual maximum high water (tide plus surge), surge at annual maximum high
water (surge component of annual maximum high water), and annual maximum
surge-at-high-water.
What was learned
There were no significant trends in the first two parameters over the period
of study.  However, the annual maximum surge-at-high-water declined at a
rate of 0.11 ± 0.04 meters per century.
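A figure such as "0.11 ± 0.04 meters per century" is typically an ordinary
least-squares slope, rescaled to a century, with the standard error of the
slope. A minimal sketch with invented annual maxima (the Liverpool data
themselves are not reproduced here):

```python
# OLS slope and its standard error, expressed per century.
# The annual values below are invented, not the Liverpool record.

def trend_per_century(years, values):
    n = len(years)
    my, mv = sum(years) / n, sum(values) / n
    sxx = sum((y - my) ** 2 for y in years)
    sxy = sum((y - my) * (v - mv) for y, v in zip(years, values))
    slope = sxy / sxx                                   # units per year
    resid = [v - mv - slope * (y - my) for y, v in zip(years, values)]
    se = (sum(r * r for r in resid) / (n - 2) / sxx) ** 0.5
    return slope * 100.0, se * 100.0                    # per century

years = list(range(1900, 1910))
surge = [1.20, 1.18, 1.17, 1.19, 1.15, 1.16, 1.13, 1.14, 1.12, 1.11]
slope_c, se_c = trend_per_century(years, surge)          # declining trend
```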

What it means
The results of this study run counter to two frequent GCM predictions: (1)
that CO2-induced global warming should be causing an increase in sea level,
and (2) that CO2-induced global warming should be increasing the frequency
and/or severity of extreme weather events such as storms.  Contrary to these
predictions, real-world data from Liverpool indicate a stable maximum sea
level height since 1768.  Furthermore, the observed decline in the annual
maximum surge-at-high-water over the past 232 years suggests that winds that
are responsible for producing high storm surges were much stronger and/or
more common during the early part of the record (Little Ice Age) than the
latter part (Modern Warm Period).
Copyright © 2002.  Center for the Study of Carbon Dioxide and Global Change



>From Michael Martin-Smith <>

Dear Benny,

The UK Sunday Telegraph (Sep 1) , p23 International News, carries a report
on the divergent views of Greens and numerous scientists on the present and
future prospects for our planetary ecosystems.

Inter alia, Greens at the WWF are quoted as saying that, by 2050, we will
need the resources of two planets to maintain our current ways of life,
while scientists are quoted as saying that this is too pessimistic a view of
our ability to manage our affairs here with Earth's resources.

If the Earth were indeed an isolated system unthreatened by factors from
neighbouring Outer Space, we could accept the scientists' statement as
valid.
As it is, the Greens have, quite unintentionally, hit the nail on the head -
we will need at least two planets to ensure our survival for the longer
term.
Fortunately our solar system contains nine planets, several dozen moons, and
countless thousands of asteroids and comets - all awaiting the creative hand
of our cosmic gardeners.

Let us take the Greens at their careless words, and go forth!

Yours sincerely

Michael Martin-Smith


>From The Economist, 22 August 2002

FORGET transatlantic rifts about trade, Iraq, Kyoto, or the International
Criminal Court. These have been thoroughly ventilated. One area of
difference has not got the attention it deserves: demography. It may prove
the most important of all.

For 50 years, America and the nations of Western Europe have been lumped
together as rich countries, sharing the same basic demographic features:
stable populations, low and declining fertility, increasing numbers of old
people. For much of that period, this was true. But in the 1980s, the two
sides began to diverge. The effect was muted at first, because demographic
change is slow. But it is also remorseless, and is now beginning to show up.

America's census in 2000 contained a shock. The population turned out to be
rising faster than anyone had expected when the 1990 census was taken. There
are disputes about exactly why this was (more on that shortly). What is not
in doubt is that a gap is beginning to open with Europe. America's fertility
rate is rising. Europe's is falling. America's immigration outstrips
Europe's and its immigrant population is reproducing faster than native-born
Americans. America's population will soon be getting younger. Europe's is
ageing.

By 2040, perhaps earlier, America will overtake Europe in population

Unless things change substantially, these trends will accelerate over coming
decades, driving the two sides of the Atlantic farther apart. By 2040, and
possibly earlier, America will overtake Europe in population and will come
to look remarkably (and, in many ways, worryingly) different from the Old
World.

In 1950, Western Europe was exactly twice as populous as the United States:
304m against 152m. (This article uses the US Census Bureau's definition of
"Europe", which includes all countries that were not communist during the
cold war. The 15 countries that make up the European Union are a slightly
smaller sample: they had a population of 296m in 1950.) Both sides of the
Atlantic saw their populations surge during the baby boom, then grow more
slowly until the mid-1980s. Even now, Europe's population remains more than
100m larger than America's.

In the 1980s, however, something curious began to happen. American fertility
rates-the average number of children a woman can expect to bear in her
lifetime-suddenly began to reverse their decline. Between 1960 and 1985, the
American fertility rate had fallen faster than Europe's, to 1.8, slightly
below European levels and far below the "replacement level" of 2.1 (the rate
required to keep the population steady). By the 1990s American fertility had
rebounded, rising back to just below the 2.1 mark.

Nobody quite knows why. Some of the recovery was the result of
higher-than-average fertility among immigrants. But not all of it: fertility
rose among native-born whites and blacks as well. Perhaps the most
plausible, if unprovable, explanation is that higher fertility was the
product of the economic boom of the 1990s combined with what one might call
"social confidence": America was a good country to bring more children into.

America is the world's great demographic outlier

America is not unique: a few north-European countries, like Norway, have
followed a similar trajectory. But it is highly unusual. Nearly every
country in the world has seen its fertility rate fall recently and, by and
large, the richer the country, the greater the fall. "America", says Hania
Zlotnik of the United Nations Population Division, "is the world's great
demographic outlier."

Meanwhile, Europe's fertility continues to fall. Having been just below 1.9
in the mid-1980s, the rate is now less than 1.4 and it is projected to
continue declining for at least another ten years. In some countries-Spain,
Italy and Greece-the fertility rate has fallen to between 1.1 and 1.3.

It is fair to say that these numbers probably exaggerate the long-term
demographic difference between the two sides. Remember that between 1970 and
1985 American fertility rates were slightly lower than Europe's. What seems
to have happened then was not that Americans were having fewer children
overall, but that a generation of women was postponing motherhood. That
depressed America's birth rate in 1970-85, shifted a surge of births by half
a generation, and produced an unusually high rate in the 1990s. That same
population shift is happening in parts of Europe now, especially in those
Mediterranean countries with the lowest fertility rates. There too, many
women are merely postponing child-bearing. Later, after about 2010, when
they have their children, Europe's fertility rate will nudge back up (see
chart 1).

But what is striking about the American rate is not that it rose and fell,
but that it rose so much-to within a whisker of the replacement level. And
what is striking about the European rate is that it fell so far, to a much
lower level than America's. That is also a reason for thinking it may not
recover as strongly as America's did.

The UN reckons that the differences in fertility between America and Europe
will continue over the next few decades. America's high rate is expected to
remain relatively stable. Europe's should recover a bit-but it will not
close the gap. The result of these differences, already evident in the
census of 2000, will then become starker.

America's population should have been 275m in 2000. At least, that is what
the central projection of the 1990 census predicted. The 2000 census showed
it was actually 281m, higher even than the "high series" projection from
1990. Some of this may have been caused by things other than population
change: improvements in counting, for instance. But not all. The new census
showed that immigration was higher than expected, and that the birth rate of
native-born Americans was up too.

Those higher fertility rates will have a bigger impact as time goes on. By
2040, using the new census's "middle series" projection, America's
population will overtake Europe's. This forecast has already proved too low.
On the "high-series projection", the crossing point occurs before 2030 (see
chart 2). Admittedly, this projection is based on high assumptions about
fertility rates-over 2.5 in 2025-50. But if this proves correct, Europe's
population in 2050 would be 360m and falling, America's would be over 550m
and rising. Half a billion people: in other words, America would be twice
the size it is now. Europe would be smaller. Obviously, straight-line
projections over 50 years need to be taken with plenty of salt. All the
same, the numbers are startling.
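The mechanics of such a crossover can be sketched with a toy compound-growth projection. The starting populations below come from the article (America's 2000 census count of 281m; Europe "more than 100m larger", taken here as roughly 390m), but the annual growth rates are purely illustrative assumptions, not Census Bureau projections:

```python
# Toy exponential projection of the US/Europe population crossover.
# Growth rates are illustrative assumptions, not official projections.

def crossover_year(pop_a, pop_b, rate_a, rate_b, start=2000, horizon=2100):
    """Return the first year in which pop_a reaches pop_b, assuming
    constant annual growth rates, or None if it never happens."""
    for year in range(start, horizon + 1):
        if pop_a >= pop_b:
            return year
        pop_a *= 1 + rate_a
        pop_b *= 1 + rate_b
    return None

# America: 281m growing; Europe: ~390m slowly shrinking (assumed rates).
year = crossover_year(281e6, 390e6, rate_a=0.009, rate_b=-0.001)
```

Under these assumed rates the crossover lands in the early 2030s, inside the article's 2030-2040 window; the point is only that modest, persistent differences in growth rates compound into a definite crossing date.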

European commissioners are fond of boasting that the European Union (EU) is
the largest market in the world. They claim an equal status with the United
States in trade negotiations as a result. Some also think that, because of
this parity, the euro will one day become an international reserve currency
to rival the dollar.

The balance of global economic power would be tilted in fundamental ways

But assume, for a minute, that Americans remain, as they are now, about
one-third richer per head than Europeans. The high-series projection implies
that America's economy in 2050 would still be more than twice the size of
Europe's-and something like that preponderance would still be there even if
you assume that by then much of Central and Eastern Europe will have joined
the EU. The balance of global economic power would be tilted in fundamental
ways. With 400m-550m rich consumers, the American market would surely be
even more important to foreign companies than it is today. And if so,
American business practices-however they emerge from the current
malaise-could become yet more dominant.
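The "more than twice the size" claim is simple arithmetic: relative GDP is the population ratio times the income-per-head ratio. A quick check using the article's own figures (550m against 360m people, Americans about one-third richer per head):

```python
# Relative economy size = (population ratio) x (income-per-head ratio),
# using the article's high-series figures for 2050.
us_pop, eu_pop = 550e6, 360e6   # projected 2050 populations (article)
income_ratio = 4 / 3            # Americans about one-third richer per head

us_gdp_over_eu_gdp = (us_pop / eu_pop) * income_ratio
# (550/360) * (4/3) is roughly 2.04 - an economy about twice Europe's size
```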

And still they come

Part of the rise in the population has been driven by higher fertility. The
rest comes from immigration. In the past decade, America has taken in over
11m immigrants. That compares with 6m in the 1970s and 7m in the 1980s. The
real number of new immigrants may be even higher, since these figures may
not account for all the estimated 8m-9m people who have slipped illegally
into the country. Some may return home. Others may be thrown out. But those
that remain contribute to the growing population both directly and
indirectly (that is, through their children). The fertility rate for
non-Hispanic whites is just over 1.8, for blacks 2.1. For Latinos, it is
nearly 3.0-higher than in many developing countries. From the point of view
of the overall population, therefore, higher immigration counts twice over.

That is a reason why America's total population will continue to rise,
though perhaps not as much as the highest estimates suggest. Many of the new
migrants come from Mexico. If Mexico's own demographic pattern, with
fast-falling fertility, is anything to go by, the fertility rate for
American Latinos is also bound to fall, though it will still be quite high.

Europe has had an immigration boom as well, of course. Indeed, in 1985-95,
there were slightly more immigrants going into Europe than into America
(though since Europe's population is larger, America's rate was higher). But
more recently, the European numbers have fallen-presumably reflecting
increased barriers to entry-and overall, since 1950, Europe has taken in far
fewer people. Most demographers forecast that immigration will be much lower
in Europe than in America during the next few decades (see chart 3).

The difference in immigration not only increases America's population
compared with Europe's, it also makes it look increasingly different.
Compare the different shapes of chart 4, which maps the age distribution of
America's whites, on the left, and other groups, on the right. Whites form a
pear shape: they are preponderant among adults. This is also the shape of
Europe's population. Blacks and browns form a pyramid: children account for
most of the population. Even now, in the parts of America with the highest
immigration, such as Los Angeles and Houston, Latinos account for half of
all children under 14. This is the characteristic shape of developing
countries. As the bulge of Latinos enters peak child-bearing age in a decade
or two, the Latino share of America's population will soar.

That could have an impact both on economics and on geopolitics. The economic
impact is clear enough. Kenneth Prewitt, the former head of the US Census
Bureau, argues that "in the struggle to find workers to support growing
economies, nations that are hospitable to immigrants will have an
advantage." Immigrants go where there are friends and family to welcome them
and help them get jobs. Where will they find a more hospitable
welcome-Europe or America?

The geopolitical impact is fuzzier, but still powerful. At the moment,
America's political connections and shared values with Europe are still
strong, albeit fraying. But over time, America's ties of family and culture
will multiply and strengthen with the main sources of its immigration-Latin
America chiefly, but also East and South Asia. As this happens, it is
probable that it will also pull American attention further away from Europe.

The young world and the old

Higher fertility rates and immigration produce not only a larger population
but a society that is younger, more mixed ethnically and, on balance, more
dynamic. The simplest expression of this is median age (by definition, half
of the population is older than the median age, and half younger). According
to Bill Frey, a demographer at the University of Michigan, the median age in
America in 2050 will be 36.2. In Europe it will be 52.7. That is a stunning
difference, accounted for almost entirely by the dramatic ageing of the
European population. At the moment, the median age is 35.5 in America and
37.7 in Europe. In other words, the difference in the median age is likely
to rise from two to 17 years by 2050.

Behind this change lie demographic patterns with big policy implications.
The percentage of children in the population is falling as populations age.
But in America it is falling more slowly than elsewhere. In 1985, America
and Europe had more or less the same proportion of the population under 14
years of age: around 20%. By 2020, the proportion of children in Europe will
have slumped to 13.7%. In America it will still be 18.6%-not only higher
than in Europe but higher than in China and Japan, as well.

From a fiscal point of view, more children are not necessarily a blessing:
their education is a burden on the public finances. Because America has
relatively more children than Europe, its "dependency ratio", or the number
of children and elderly people for each working-age person, does not stay
low. It is slightly higher than in Europe now-51% against 47%-and will stay
just above European levels as the ratio rises in both places until about
2035 (see top panel of chart 5). But note the difference: a higher
proportion of Europe's "dependency costs" comes from the old. The number of
people over 65 will be equivalent to 60% of the working-age population in
Europe in 2050, compared with only 40% in America. There, most of the
overall burden will come from the cost of educating children (chart 5,
bottom panel).

You see the significance after 2035. In America, the dependency ratio will
start to fall as the bulge of children turns into a bulge of adults. But in
Europe there will be no such change, and the ratio will continue to rise.
This is where the implications for public policy become large.
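The dependency ratio used above is straightforward to compute: dependants (children plus the elderly) divided by the working-age population. The population shares in this sketch are illustrative assumptions chosen only to land near the article's figure of roughly 51% for America today; they are not census data:

```python
def dependency_ratio(children, working_age, elderly):
    """Dependants (children + elderly) per working-age person."""
    return (children + elderly) / working_age

# Illustrative shares of total population (assumed, not census figures):
# 21% under 15, 66% working age, 13% over 65.
ratio = dependency_ratio(0.21, 0.66, 0.13)   # roughly 0.515, i.e. about 51%
```

The same formula makes the article's point about composition: two populations can share one overall ratio while one carries its burden mostly in pensions and health care for the old and the other mostly in schooling for the young.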

Both Europe and America face fiscal problems in providing pensions and
health care as their baby-boomers retire. On some estimates, by 2050,
government debt could be equivalent to almost 100% of national income in
America, 150% in the EU as a whole, and over 250% in Germany and France. So
while the burden is growing on both sides of the Atlantic, it is much
heavier in Europe.

That is a problem in its own right, and a source of another long-term
difficulty in the transatlantic relationship. Since the end of the cold war,
Europe and America have made different calculations about where to spend
public money. America has put more into defence; Europe has spent more on
social programmes.

The long-term logic of demography seems likely to entrench America's power
and to widen existing transatlantic rifts

The result is a familiar military imbalance. America spends about twice as
much on defence as the entire European Union ($295 billion in 2000, or 3% of
GDP, compared with the EU's $153 billion), thus maintaining its preponderant
military might. Europeans intermittently promise to spend more in order to
narrow the military gap, recognising the dangers to the NATO alliance if
they fail to pull their weight, but population trends will sap their
resolve.

If Europeans are unwilling to spend what is needed to be full military
partners of America now, when the over-65s amount to 30% of the working-age
population, they will be even less likely to do more in 2050, when the
proportion of old people will have doubled. In short, the long-term logic of
demography seems likely to entrench America's power and to widen existing
transatlantic rifts.

Perhaps none of this is altogether surprising. The contrast between
youthful, exuberant, multi-coloured America and ageing, decrepit,
inward-looking Europe goes back almost to the foundation of the United
States. But demography is making this picture even more true, with long-term
consequences for America's economic and military might and quite possibly
for the focus of its foreign policy.

Copyright 2002, The Economist

CCNet is a scholarly electronic network. To subscribe/unsubscribe, please
contact the moderator Benny J Peiser < >. Information
circulated on this network is for scholarly and educational use only. The
attached information may not be copied or reproduced for any other purpose
without prior permission of the copyright holders. The fully indexed archive
of the CCNet, from February 1997 on, can be found at

DISCLAIMER: The opinions,
beliefs and viewpoints expressed in the articles and texts and in other
CCNet contributions do not necessarily reflect the opinions, beliefs and
viewpoints of the moderator of this network.
