CCNet 134/2002 - 20 November 2002
--------------------------------
"Based on these new estimates the average
chances the Earth will be
hit by an asteroid impact capable of causing serious
regional damage
(roughly one megaton TNT equivalent energy) is close
to once per
century."
--The University of Western
Ontario, 20 November 2002
"Because there are fewer of these objects than
originally believed,
we can all worry a little less about the risk of the
next hazardous
impact."
--Robert Jedicke, University
of Arizona, 20 November 2002
"It is important to realize the impact
estimates we have measured are
averages from the last eight and a half years. Based
on past
observations, it seems likely there is also a
non-random component to
the impact flux at these smaller sizes which would
suggest our
estimates are lower bounds to the true impact
risk."
--Peter Brown, 20 November
2002
"So what should be done? While the
Spaceguard program seems to be on
track, another coordinated, cohesive federal
program should be
considered to catalogue the smaller NEOs, even
if it means adding a
digit or two to the budget deficit."
--The Washington
Times, 20 November 2002
(1) SATELLITE STUDY ESTABLISHES FREQUENCY OF MEGATON-SIZED ASTEROID IMPACTS
    The University of Western Ontario, 20 November 2002

(2) STUDY DOWNPLAYS EARTH-ASTEROID PERIL
    Associated Press, 20 November 2002

(3) SCIENTISTS REVISE ODDS OF ASTEROID COLLISION
    Reuters (London), 20 November 2002

(4) KEEPING A WARY EYE ON THE SKIES
    The Washington Times, 20 November 2002
=================
(1) SATELLITE STUDY ESTABLISHES FREQUENCY OF MEGATON-SIZED ASTEROID IMPACTS

From The University of Western Ontario, 20 November 2002
http://www.newswire.ca/releases/November2002/19/c3822.html
Media Release
Satellite study establishes frequency of megaton-sized asteroid impacts

/Embargoed until November 20, 2002, 2 p.m. E.S.T./
TORONTO, Nov. 19 /CNW/ - In Hollywood films such as "Armageddon" and "Deep Impact," Earth is threatened by enormous asteroids. New research at The University of Western Ontario establishes a better baseline for the frequency of large impacts that may cause serious damage on the ground. Based on these new estimates, the average chance that the Earth will be hit by an asteroid impact capable of causing serious regional damage (roughly one megaton TNT equivalent energy) is close to once per century.
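A once-per-century average rate is easiest to interpret as a Poisson process. Below is a minimal Python sketch of that reading; the 100-year recurrence interval comes from the release, while the 80-year lifetime is an illustrative assumption.

    import math

    recurrence_years = 100.0   # megaton-class impact interval, per the release
    lifetime_years = 80.0      # illustrative human lifetime (assumption)

    # For a Poisson process, P(at least one event in t years) = 1 - exp(-t/T)
    p = 1.0 - math.exp(-lifetime_years / recurrence_years)
    print(f"Chance of a megaton-class impact in {lifetime_years:.0f} years: {p:.0%}")
    # -> roughly 55%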
The study, led by Peter Brown, Canada Research Chair in Meteor Science and Assistant Professor in the Department of Physics & Astronomy at Western, appears in the November 21 issue of the prestigious journal Nature.

United States Department of Defense and Department of Energy satellites scanning the Earth for evidence of nuclear explosions over the last eight years detected nearly 300 optical flashes caused by small asteroids (one to 10 metres in size) exploding in the upper atmosphere. This provided Brown and his research team with a new estimate of the flux of near-Earth objects colliding with the Earth.

The revised estimate suggests Earth's upper atmosphere is hit about once a year by asteroids that release energy equivalent to five kilotons of TNT. The object that exploded above Tunguska, Siberia in 1908 was considered 'small' (30 to 50 metres across), yet its energy was big enough to flatten 2,000 square kilometres of forest. It would have completely destroyed a city the size of New York. Brown and his colleagues calculate that Tunguska-like events may occur as frequently as once every 400 years.
"It is important to realize the impact estimates we have
measured are
averages from the last eight and a half years. Based on past
observations, it seems likely there is also a non-random
component to
the impact flux at these smaller sizes which would suggest our
estimates
are lower bounds to the true impact risk," says Brown.
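The rates quoted in this release trace out a power law: cumulative impact frequency falls off as a straight line against energy on log-log axes. A sketch in Python, fitted only to the two rate/energy pairs stated above (the power-law form is the conventional way such flux curves are summarized, not a detail taken from the paper):

    import math

    # Anchor points quoted in the release: (energy in kt TNT, events per year)
    e1, n1 = 5.0, 1.0        # ~5 kt about once a year
    e2, n2 = 1000.0, 0.01    # ~1 Mt about once a century

    # Cumulative flux N(>E) = k * E^(-b); fit slope and constant in log space
    b = math.log10(n1 / n2) / math.log10(e2 / e1)
    k = n1 * e1**b

    def recurrence_years(energy_kt):
        """Mean interval between impacts of at least this energy."""
        return energy_kt**b / k

    print(f"slope b = {b:.2f}")                              # ~0.87
    print(f"5 Mt: every {recurrence_years(5000):.0f} yr")    # ~400 yr
    print(f"10 Mt: every {recurrence_years(10000):.0f} yr")  # ~700 yr

Evaluated at about 5 Mt, the fitted line reproduces the "once every 400 years" Tunguska figure above; at 10 Mt it approaches the once-per-millennium estimate quoted in the wire reports below.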
"We use Earth's atmosphere as a detector of small asteroids
or comets by
watching for the bright flashes produced as they impact the upper
layers
of the atmosphere. This is an ideal way to see smaller objects
(one to
10 metres) too small to be detected while still in space by
ground-based
telescopic surveys, but too large to be detected after they
become
bright fireballs by camera networks that watch the skies,"
says Brown.
"Ultimately, this new method of obtaining information
redefines our
range of knowledge about how and when asteroids may hit the
Earth.
Eventually, this will help us also better determine their
origins,
effects, and orbits."
Co-authors of the Nature paper are Richard E. Spalding, Sandia National Laboratories in Albuquerque, New Mexico; Douglas O. ReVelle, Los Alamos National Laboratory in Los Alamos, New Mexico; Edward Tagliaferri, ET Space Systems in Camarillo, California; and Brigadier General Simon "Pete" Worden, formerly of the United States Space Command in Colorado Springs, Colorado and now Director of Transformation, Air Force Space Command.

Note to broadcasters: Western has a VideoRoute service and can arrange broadcast interviews until tomorrow, November 20, at 10:30 a.m. Please call (519) 661-2111 ext. 85468 for more information.

For further information: Peter Brown will be available for interviews beginning today at about 4:30 p.m. He can be reached at (519) 661-2111 ext. 86458 (office), (519) 642-0924 (home) or pbrown@uwo.ca. For copies of the Nature paper please contact Marcia Daniel, Communications & Public Affairs, at (519) 661-2111 ext. 85468, or mdaniel@uwo.ca.
==============
(2) STUDY DOWNPLAYS EARTH-ASTEROID PERIL
From Associated Press, 20 November 2002
http://story.news.yahoo.com/news?tmpl=story&u=/ap/20021120/ap_on_sc/asteroid_threat_1
By WILLIAM McCALL, Associated Press Writer
Medium-size asteroids that could flatten a city the size of New York strike Earth less frequently than previously believed, possibly only about once a millennium, according to a study aided by military satellites.

Rocky space debris created by collisions in the asteroid belt between Mars and Jupiter, or chunks that break away from comets, rains down on the Earth every day as meteoroids, but most of the asteroid or comet pieces are tinier than grains of rice and quickly burn up in the upper atmosphere as meteors.
In 1908, however, a meteor estimated to be up to 50 yards wide nearly hit the ground before it burned up over Russia, causing an explosion that flattened hundreds of square miles of forest in Tunguska, Siberia. The blast was estimated to be the equivalent of about 10 megatons of TNT - or 10 million tons.

By comparison, the nuclear bomb that exploded over Hiroshima in World War II unleashed about 13 kilotons of explosive power - or 13,000 tons.
In the new study, satellite data taken over the past eight years suggest that intermediate-size asteroid strikes like the one over Siberia occur an average of only once every 1,000 years - not every couple of centuries as previously believed, said Peter Brown, a University of Western Ontario astrophysicist.
His study, to be published Thursday in the journal Nature, was based on measurements of the flashes of light created when the debris burns after hitting the upper atmosphere.

Even chunks larger than a yard wide are too small to be easily detected with camera networks or telescopes on the ground, so Brown and his fellow researchers - including Gen. Simon "Pete" Worden of the Air Force Space Command - turned to military satellites used to detect the flash of a nuclear explosion.

By measuring the intensity of the flash of light with highly sensitive instruments, the researchers were able to estimate the size of the asteroids and their explosive power.
They tracked about 300 meteor flashes caused by debris ranging from 1 to 10 yards wide from February 1994 to last September. The incoming debris typically packed an explosive punch of no greater than one ton of TNT, leading Brown to conclude the chances of a Tunguska-class asteroid damaging Earth are lower than previously estimated.
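Turning a measured yield back into an object's size is kinetic-energy bookkeeping: E = (1/2)mv^2, with the mass spread over a sphere. A Python sketch under assumed typical values (the 20 km/s entry speed and 3,000 kg/m3 stony density are illustrative assumptions, not figures from the study):

    import math

    KT_TNT_JOULES = 4.184e12    # one kiloton of TNT in joules

    def diameter_m(energy_kt, speed_ms=20_000.0, density=3_000.0):
        """Diameter of a sphere whose kinetic energy 0.5*m*v^2 equals the yield."""
        mass = 2.0 * energy_kt * KT_TNT_JOULES / speed_ms**2
        volume = mass / density
        return (6.0 * volume / math.pi) ** (1.0 / 3.0)

    # On these assumptions a ~1 t TNT flash corresponds to a body well under
    # a meter across, while a 5 kt event needs one a few meters across.
    print(f"1 t TNT  -> {diameter_m(0.001):.2f} m")   # ~0.24 m
    print(f"5 kt TNT -> {diameter_m(5.0):.1f} m")     # ~4 m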
Scientists who did not participate in the study were impressed by the analysis.

"It's a darned cool approach to this," said Timothy Spahr of the Harvard-Smithsonian Center for Astrophysics, who specializes in studying asteroids near the Earth.

"I'm sure the military has got other things to do but it's really nice to see things that are used for other purposes help out in this way," Spahr said.
Brown also compared his results to telescope data on larger asteroids from the Lincoln Near Earth Asteroid Research project in New Mexico, run by the Lincoln Laboratory at the Massachusetts Institute of Technology.

"When you draw a line from our data down to their data, they intersect, which is remarkable because they're two completely different techniques," Brown said. "And that gives us confidence that numbers on both sides are reasonable."
However, other researchers said Brown's estimate may be subject to unexpected changes, such as an uncharted comet moving closer to Earth and showering the atmosphere with fragments of varying sizes.

"The study assumes the flux of asteroids and comets that we have been observing over the last 20 to 30 years always remains the same, a basic assumption that is regarded among some astronomers with some skepticism," said Benny Peiser, an anthropologist at John Moores University in Liverpool, England, who leads an international forum on the threat posed by asteroids.

Peiser said the study should help reassure the public that scientists are developing better ways to assess the asteroid threat, leading to ways to prevent it or at least minimize it.

Peiser said the study was valuable in another way because it helped show the U.S. military can detect the difference between a nuclear explosion and a meteor that sets off a similar flash - a capability he said could help governments avoid mistaking a meteor blast for a nuclear weapon.
Copyright 2002, AP
==============
(3) SCIENTISTS REVISE ODDS OF ASTEROID COLLISION

From Reuters (London), 20 November 2002
http://story.news.yahoo.com/news?tmpl=story&u=/nm/20021120/sc_nm/space_asteroids_dc_1

By Patricia Reaney
LONDON (Reuters) - Scientists have revised estimates of the frequency of small asteroids colliding with the Earth, saying a hazardous impact would occur on average every 1,000 years and not 200 to 300 as previously thought.

If a massive near-Earth object measuring more than a half-mile in diameter slammed into the planet it would cause global devastation, but scientists estimate an event of that size would only occur about every 700,000 years.

A small near-Earth object, similar to the one that flattened 800 square miles of forest when it exploded over Tunguska, Siberia, in 1908, would be harder to detect and could kill millions if it happened over a populated area.

But scientists in Canada and the United States have now created a baseline for the likelihood of smaller asteroids colliding with the Earth's upper atmosphere.

"We estimate...that Tunguska-like events occur about once every 1,000 years," said Peter Brown, of the University of Western Ontario.
In research reported in the science journal Nature, the scientists studied more than eight years of data from U.S. Department of Defense satellites which scan the Earth for evidence of nuclear explosions.

Small asteroids that collide with the Earth usually detonate in the atmosphere.

"They are a little bit too big to be detected with any frequency from ground-based networks and too small to be detected in large numbers by telescopic surveys," Brown explained.

According to the satellite information, there were nearly 300 flashes of small asteroids ranging in size from less than one to 10 yards exploding in the upper atmosphere during that time.
Brown emphasized the importance of studying large near-Earth objects but added that small asteroids hit more frequently, and if an explosion in the atmosphere was big enough it could cause damage on the ground. The explosion from the Tunguska event was the equivalent of about ten megatons of TNT.
A mile-wide asteroid raised concern worldwide in July when astronomers warned it could collide with the Earth in about 17 years, before later saying the asteroid would pass the planet by. "The median estimate for recurrence of a 10 megaton impact is now a millennium based on the average background numbers we have seen in the last eight and a half years," Brown said.
"Because there are fewer of these objects than originally
believed, we
can all worry a little less about the risk of the next hazardous
impact," Robert Jedicke, of the Lunar and Planetary
Laboratory at the
University of Arizona, said in a commentary on the research in
Nature.
Copyright 2002, AP
============
(4) KEEPING A WARY EYE ON THE SKIES

From The Washington Times, 20 November 2002
http://www.washtimes.com/op-ed/20021120-74834174.htm
Area residents who strained to see shooting stars from the recent Leonid meteor shower may not have realized it, but those streaks of light signify a constant threat from outer space. Since its formation, the Earth has been bombarded by streams of comets and asteroids. Most of them (like the cometary debris that produces the Leonids) burn up in the atmosphere, but larger objects can impact with enough energy to destroy cities, countries, and even civilizations.

Since 1994, NASA scientists have been surveying and cataloguing the civilization-killing (1 kilometer or more in diameter) near-Earth objects (NEOs). There are believed to be between 900 and 1,300 such objects, 90 percent of which should be identified by 2008.
However, the real danger might lie in the smaller NEOs (between 200 and 500 meters in diameter), which are far harder to detect and far more numerous (estimated at 50,000). One of those objects could devastate a city or a state; the NEO that exploded over Tunguska, Siberia, in 1908 went off with a force of about 10 megatons and devastated 2,000 or so square kilometers of Siberian forest. In this year alone, there were at least three near-misses by such objects.

Notwithstanding such close calls, it's not completely clear how great a threat such objects pose. Estimates of the recurrence of Tunguska-sized events range from 200 to 1,000-plus years, while NEO explosions in the five-kiloton range are thought to occur about once a year.
Although mostly harmless in themselves, atmospheric detonations have the potential to be mistaken for nuclear attacks and so set off far larger man-made explosions. Last June, with the governments of India and Pakistan on hair-trigger alert, a Hiroshima-sized NEO explosion occurred over the Mediterranean. Had it detonated over Kashmir, it might have been enough to start a war. That threat seems likely to become greater with time, as more and more nations join the nuclear club.
So what should be done? That question was discussed last month at a hearing into the matter at the House Subcommittee on Space and Aeronautics, led by chairman Rep. Dana Rohrabacher. While the Spaceguard program seems to be on track, another coordinated, cohesive federal program should be considered to catalogue the smaller NEOs, even if it means adding a digit or two to the budget deficit. At some point, scientists and engineers (other than Bruce Willis and Ben Affleck) will also figure out how to divert those NEOs that are found to be on a collision course with the Earth.

In testimony to the subcommittee, David Morrison, a senior scientist at NASA's Ames Research Center, pointed out that searching for NEOs is somewhat akin to buying a fire-insurance policy. Given the potential consequences, it seems wise to keep a wary eye on the skies.
Copyright 2002, The Washington Times
--------------------------------------------------------------------
CCNet TERRA 11/2002 - 20 November 2002
------------------------------------
"Abrupt changes in water temperatures occurring over
intervals of up
to 25 years suggest that global warming may result as much from
natural
cyclical climate variations as from human activity. The results
suggest
that as much as one-half of all global surface warming since the
1970's may
be part of natural variation as distinct from the result of
greenhouse
gases,"
--Benjamin Giese, Texas A&M University, 13 November 2002
"In science, conflict between predictions and measurements
is
progress. It serves to focus scientists' activities. This will
take
patient work, if science is to be the basis for deciding energy
and
environmental policy. According to the accumulated measurements,
and even
the exaggerated forecasts, time for study is available before
making sharp
emission cuts because temperature measurements do not show
evidence
for catastrophic human-made warming trends."
--Sallie Baliunas, 15 November 2002
(1) PACIFIC OCEAN TEMPERATURE CHANGES POINT TO NATURAL CLIMATE VARIABILITY
    ScienceDaily, 13 November 2002

(2) PAN EVAPORATION
    John-Daly.com, 19 November 2002

(3) NO NATURAL PARADOX
    Tech Central Station, 15 November 2002

(4) THIRTEEN THOUSAND YEARS OF STORMINESS IN THE NORTHEASTERN UNITED STATES
    CO2 Science Magazine, 20 November 2002

(5) SIMULATING DISASTERS TO SAVE LIVES
    National Science Foundation, 18 November 2002
===============
(1) PACIFIC OCEAN TEMPERATURE CHANGES POINT TO NATURAL CLIMATE VARIABILITY

From ScienceDaily, 13 November 2002
http://www.sciencedaily.com/releases/2002/11/021113070418.htm
COLLEGE STATION, November 12, 2002 - Analysis of long-term changes in Pacific Ocean temperatures may provide additional data with which to evaluate global warming hypotheses.

"Abrupt changes in water temperatures occurring over intervals of up to 25 years suggest that global warming may result as much from natural cyclical climate variations as from human activity," said Benjamin Giese, oceanography professor in the College of Geosciences. "Climate models constructed here at Texas A&M University were used to analyze ocean surface temperature records in the tropical Pacific since 1950. The results suggest that as much as one-half of all global surface warming since the 1970s may be part of natural variation as distinct from the result of greenhouse gases."

Giese and graduate student Amy J. Bratcher published the results of their analysis in the Oct. 8 issue of Geophysical Research Letters.
Surface air temperature records maintained over the past 120 years serve as the main evidence for hypotheses linking global warming to increased greenhouse gases generated by manmade (anthropogenic) causes. These records show the average global air temperature has risen by about one-half degree Centigrade over the last 50 years. But while the general air temperature trend seems to be undisputedly upward, this upward trend varies considerably.

"How much of this variability is attributable to natural variations and how much is due to anthropogenic contributions to atmospheric greenhouse gases has not yet been resolved," Giese said. "Recent studies indicate that it is difficult to separate intrinsic natural variance from anthropogenic forcing in the climate system."
Giese believes their analysis of tropical Pacific Ocean data indicates long-term upward changes in ocean temperatures precede global surface air temperature changes by about four years. These ocean temperature fluctuations are in turn preceded by an increase in subsurface water temperatures by about seven years.
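A lead of this kind is what a lagged cross-correlation between two annual time series picks out. A minimal Python sketch on synthetic data (the series and the four-year shift are invented for illustration; this is not the Texas A&M model output):

    import numpy as np

    rng = np.random.default_rng(0)
    years = np.arange(120)
    # Synthetic ocean-temperature index with a decadal-scale oscillation
    ocean = np.sin(2 * np.pi * years / 30.0) + 0.2 * rng.standard_normal(years.size)
    lag_true = 4
    air = np.roll(ocean, lag_true) + 0.2 * rng.standard_normal(years.size)
    air[:lag_true] = 0.0    # drop wrapped-around samples

    def best_lag(leader, follower, max_lag=15):
        """Lag (in years) at which the leader best predicts the follower."""
        corrs = [np.corrcoef(leader[:len(leader) - k], follower[k:])[0, 1]
                 for k in range(max_lag + 1)]
        return int(np.argmax(corrs)), max(corrs)

    lag, r = best_lag(ocean, air)
    print(f"ocean leads air by ~{lag} years (r = {r:.2f})")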
"Thus, the results suggest that much of the decade to decade
variations in
global air temperature may be attributed to tropical Pacific
decadal
variability," Giese observed. "The results also suggest
that subsurface
temperature anomalies in the southern tropical Pacific can be
used as a
predictor of decadal variations of global surface air
temperature."
For example, in 1976 an abrupt change in the temperature of the
tropical
Pacific Ocean preceded a rise of two-tenths of a degree in global
air
temperatures.
"This phenomenon looks like El Nino, but with a much longer
time scale - El
Nino occurs over a period of from nine to 12 months, but this
fluctuation
lasts for about 25 years," he continued. "In 1976, the
ocean temperature
change in question occurred very quickly, moving from cooler than
normal to
warmer than normal in about a year."
Bratcher and Giese report that now conditions in the tropical
Pacific are
similar to those prior to the 1976 climate shift, except with the
opposite
sign. If conditions develop in a similar way, then the tropical
Pacific
could cool back to pre-1976 conditions.
"The subsurface tropical Pacific has shown a distinct
cooling trend over the
last eight years, so the possibility exists that the warming
trend in global
surface air temperature observed since the late 1970's may soon
weaken,"
Giese observed.
"This natural variation would help to counter the greenhouse
gas warming
effect. In fact, careful study reveals that global warming and
cooling has
occurred in the past in cyclical patterns."
Giese's work involves constructing computer models that incorporate years of weather data to reveal recurring patterns of oscillation and help identify mechanisms that may affect climate. He focuses on climate oscillations that are not directly forced by such things as changing amounts of sunlight, but instead are mechanisms of internal climatic variation for which scientists have as yet isolated no particular cause.

"Our model results terminated at the end of 2001," he said. "Now we're waiting to see what their long-term effects may be on global temperatures.

"Our results don't preclude the possibility that anthropogenic sources of greenhouse gases have contributed to global warming. We're just suggesting that the human-forced portion of global warming may be less than previously described."
===========
(2) PAN EVAPORATION

From John-Daly.com, 19 November 2002
http://www.john-daly.com
At many weather stations in the world, the usual temperature and rainfall measurements are supplemented by `pan evaporation' measurements. Simple water-filled open pans are used, and the loss of water to evaporation over a specified period of time provides a measure of the evaporation rate.

Evaporation is important in predicting climate change because if the atmosphere warms as claimed by the global warming theory, then the rate of evaporation should increase. The evaporated water vapour is itself a powerful greenhouse gas, and so any greenhouse warming from CO2 would then be supplemented by further warming from the evaporated water vapour. This knock-on effect, or `positive feedback', is then claimed to warm the atmosphere much more than CO2 on its own could do. In fact, current theory promoted by the greenhouse industry suggests that CO2 alone can manage about 0.8°C of warming if its atmospheric concentration is doubled from its pre-industrial level. At recent rates of increase, it will take about 130 years or so from now to reach that doubling point.
A warming of 0.8°C by around 2130 does not sound like too much to get excited over, but the theory all along has claimed that added water vapour in the atmosphere, plus a few other minor feedbacks, will blow out that warming to many degrees Celsius, as much as 5°C or so.
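The arithmetic behind that blow-out is the textbook linear-feedback relation dT_total = dT_CO2 / (1 - f), where f is the combined feedback factor. A Python sketch using only the temperatures in this article (the formula is the standard idealization, not something from the Roderick and Farquhar paper, and the CO2 concentrations below are illustrative 2002-era values, not the article's):

    # Implied feedback factor from the article's 0.8°C (CO2 alone) vs ~5°C claim
    dT_co2 = 0.8     # deg C for doubled CO2 alone, per the article
    dT_total = 5.0   # deg C with feedbacks, per the article
    f = 1.0 - dT_co2 / dT_total
    print(f"implied feedback factor f = {f:.2f}")    # ~0.84

    # Doubling-time check on the article's '130 years or so'
    ppm_now, ppm_preindustrial, ppm_per_year = 373.0, 280.0, 1.5
    years = (2 * ppm_preindustrial - ppm_now) / ppm_per_year
    print(f"years to doubling at the current rate: {years:.0f}")   # ~125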
A new study by Roderick and Farquhar (Science, v.298, p.1410, 15 Nov 02) appears to blow that idea out the window. They show that 50 years of pan evaporation data reveal a steady decrease in the evaporation rate over the northern hemisphere - this in spite of industry claims that the world has warmed over the same period. The authors are a bit uncertain about the reason for this decline, citing cloudiness, pollution, and aerosols as possible contributors, but the bottom line is the observed decline in the evaporation rate, whatever the reasons.

With declining evaporation, there is no possibility of the expected `positive feedback' from water vapour to add to the small theorised warming from CO2. Indeed, the reverse may be the case: a fall in water vapour in the atmosphere would actually reverse what little warming was exacted by CO2, a conclusion consistent with what we see from the global atmospheric temperature record measured by satellites.
================
(3) NO NATURAL PARADOX

From Tech Central Station, 15 November 2002
http://www.techcentralstation.com/1051/envirowrapper.jsp?PID=1051-450&CID=1051-111502A

By Sallie Baliunas
The U.S. climate science community has been asked by the Bush Administration to help plan its research agenda for the next few years. The goal is to define, reliably, the human effect on global climate change resulting from anthropogenic emissions of, for instance, carbon dioxide, methane, aerosols and soot that would interfere with natural climate change. Present estimates of the human impact abound with uncertainty, and the objective of continued research is to reduce that uncertainty.

Despite the uncertainty of climate forecasts, calls have been made for reductions in human-made emissions. Only steep cuts would make a difference in the climate forecasts. But even accepting the uncertain temperature forecasts for the next 100 years, delaying deep cuts in emissions would result in extra warming of just a few tenths of a degree C above the middle-value forecasts of 2-3°C warming by 2100. The extra warming incurred by a delay in steep cuts would be much less than natural ups and downs in temperature.
Do We Know Enough to Act?
The question remains: do we currently know enough scientifically to justify large reductions in emissions? The tremendous research investment of about $20 billion the U.S. has made since 1990 does not support rapid and intense cuts in anthropogenic emissions. Here are several reasons.

Major tools for forecasting the impact of the air's increased greenhouse gases are the highly complex climate simulations calculated on supercomputers. The simulations should be tested against measurements of recent and past climate. The output of computer simulations says that significant warming trends near the surface and in the low layer of air should already have occurred in the last few decades. But averaged over the globe, there has been only modest warming at the surface, and much of it predates the rise in the air's concentration of human-produced greenhouse gases. Very little warming has been seen in the low layer of air.

Also, the simulations expect a strong human-made surface warming trend at high latitudes - more pronounced than the global human-made trend. Yet the observed temperature trends at high latitude show little recent warming, contradicting the output of simulations. Thus the models erroneously exaggerate expected warming trends compared to measured trends.
Cloudy Science
In another case, the Third Assessment Report of the United Nations Intergovernmental Panel on Climate Change, issued in 2001, compares the average percentage of present-day cloudiness over the globe for December, January and February (p. 485 of Working Group I). The ten models give a range of output from 0% to around 90% average winter cloud coverage at high latitudes. Near the equator, the different models yield values ranging from 25% to 65% or so. Yet not one model agrees with the state-of-the-art measurements from satellites.

Understanding cloud physics is critical to improving the models and their results. The climate has a natural greenhouse created predominantly by water in different forms - water vapor in the air, and water and ice droplets in clouds. Yet the climate effects of changes in water vapor and clouds remain poorly known.

Thus, a high priority for upcoming research should be targeted measuring programs lasting for several years in order to demystify the role of clouds and water vapor in climate change.
Good Work, More to Do
Since the UN report was written, researchers have continued to attack the unknowns in climate science - for example, with newly available measurements of the upper layers of air. Researchers at the University of Illinois and the National Center for Atmospheric Research determined the air's temperatures at 30 to 110 km height, and found that the models give wrong results by as much as 20 degrees C (and different model results are equally wrong in opposite directions).

In science, conflict between predictions and measurements is progress. It serves to focus scientists' activities. This will take patient work, if science is to be the basis for deciding energy and environmental policy. According to the accumulated measurements, and even the exaggerated forecasts, time for study is available before making sharp emission cuts because temperature measurements do not show evidence for catastrophic human-made warming trends.

The climate simulations do not reproduce well-observed aspects of the climate system. To recognize that the forecasts of warming are deeply flawed while calling, on the basis of science, for intense cuts in carbon dioxide and other human-made emissions makes for a paradox. Ultimately there are no paradoxes in nature. Any paradox is a signpost of ignorance - and a very valuable sign for scientists venturing into the unknown, where illumination comes only by science.
Copyright 2002, Tech Central Station
======
(4) THIRTEEN THOUSAND YEARS OF STORMINESS IN THE NORTHEASTERN UNITED STATES

From CO2 Science Magazine, 20 November 2002
http://www.co2science.org/journal/2002/v5n47c1.htm

Reference
Noren, A.J., Bierman, P.R., Steig, E.J., Lini, A. and Southon, J. 2002. Millennial-scale storminess variability in the northeastern United States during the Holocene epoch. Nature 419: 821-824.
Background
Climate alarmists loudly proclaim that anthropogenic emissions of atmospheric greenhouse gases may lead to increases in the frequency of severe storms. Hence, there is a tendency for the general public, including the media, to wonder if humanity is to blame for periodic occurrences of extreme weather; while radical environmentalists make no bones about it, claiming we are indeed responsible. But are we?

To correctly answer this question, it is necessary to know where we currently reside in the suite of natural cycles that characterize earth's climate; for as the authors of this important study describe the situation, "the existence of natural variability in storminess confounds reliable detection of anthropogenic effects." This being the case, it was their intention to provide an historical context for addressing this question.
What was done
Sediment cores were extracted from thirteen small lakes distributed across a 20,000-km2 region in Vermont and eastern New York, after which several techniques were used to identify and date terrigenous in-wash layers that depict the frequency of storm-related floods.

What was learned
The authors' data indicate that "the frequency of storm-related floods in the northeastern United States has varied in regular cycles during the past 13,000 years (13 kyr), with a characteristic period of about 3 kyr." There were four major storminess peaks during this period; they occurred approximately 2.6, 5.8, 9.1 and 11.9 kyr ago, with the most recent upswing in storminess beginning "at about 600 yr BP [Before Present], coincident with the beginning of the Little Ice Age."
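A characteristic period in an irregularly dated sediment record is typically identified with a Lomb-Scargle periodogram, which tolerates uneven sampling. A Python sketch on synthetic data (the 3 kyr period mimics the paper's result, but the series itself is invented for illustration; this is not the authors' analysis code):

    import numpy as np
    from scipy.signal import lombscargle

    rng = np.random.default_rng(1)
    # Synthetic, unevenly spaced sample ages over 13 kyr (years before present)
    t = np.sort(rng.uniform(0, 13_000, 300))
    period = 3_000.0
    signal = np.sin(2 * np.pi * t / period) + 0.5 * rng.standard_normal(t.size)

    # Scan candidate periods from 500 yr to 6.5 kyr (angular frequencies)
    periods = np.linspace(500, 6_500, 2_000)
    ang_freqs = 2 * np.pi / periods
    power = lombscargle(t, signal - signal.mean(), ang_freqs)

    print(f"strongest period: ~{periods[np.argmax(power)]:.0f} yr")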
The authors say that the pattern they observed "is consistent with long-term changes in the average sign of the Arctic Oscillation [AO], suggesting that modulation of this dominant atmospheric mode may account for a significant fraction of Holocene climate variability in North America and Europe." They also note that several "independent records of storminess and flooding from around the North Atlantic show maxima that correspond to those that characterize our lake records [Brown et al., 1999; Knox, 1999; Lamb, 1979; Liu and Fearn, 2000; Zong and Tooley, 1999]."

The authors also report that "during the past ~600 yr, New England storminess appears to have been increasing naturally [our italics]," and they suggest that "changes in the AO, perhaps modulated by solar forcing, may explain a significant portion of Holocene climate variability in the North Atlantic region." They further state that their explanation is appealing "because it makes a specific prediction that New England storminess should be at its greatest when Europe is cold (characteristic of the low-phase AO)," such as during Little Ice Age conditions; and they report that "comparison of our results with the other climate records [cited below], including European glacier fluctuations, suggest that, as predicted, intense storms in New England tend to occur more frequently during periods that are cooler than average in Europe [Mayewski et al., 1994; O'Brien et al., 1995; Hormes et al., 2001; Karlen and Kuylenstierna, 1996; Matthews et al., 2000]."
What it means
This study indicates that: (1) climate fluctuates significantly on a millennial timescale in a reasonably well-defined oscillatory fashion that is independent of the atmosphere's CO2 concentration but is "perhaps modulated by solar forcing," as we and a host of others have long suggested [see Climate Oscillations and Solar Effects (Climate) in our Subject Index], and (2) relatively colder climates are typically characterized by relatively stormier weather, as we also have long suggested [see Extreme Weather in our Subject Index]. The first of these observations tends to discredit the climate-alarmist claim that recent global warming is CO2-induced, in that the warming of the last century or so is but a normal part of earth's naturally-recurring millennial-scale climatic oscillation; while the second observation tends to discredit the climate-alarmist claim that global warming will lead to increases in storminess, in that increases in real-world storminess are typically associated with global cooling.
References
Brown, P., Kennett, J.P. and Ingram, B.L. 1999. Marine evidence for episodic Holocene megafloods in North America and the northern Gulf of Mexico. Paleoceanography 14: 498-510.

Hormes, A., Muller, B.U. and Schluchter, C. 2001. The Alps with little ice: evidence for eight Holocene phases of reduced glacier extent in the Central Swiss Alps. The Holocene 11: 255-265.

Karlen, W. and Kuylenstierna, J. 1996. On solar forcing of Holocene climate: evidence from Scandinavia. The Holocene 6: 359-365.

Knox, J.C. 1999. Sensitivity of modern and Holocene floods to climate change. Quaternary Science Reviews 19: 439-457.

Lamb, H.H. 1979. Variation and changes in the wind and ocean circulation: the Little Ice Age in the northeast Atlantic. Quaternary Research 11: 1-20.

Liu, K.B. and Fearn, M.L. 2000. Reconstruction of prehistoric landfall frequencies of catastrophic hurricanes in northwestern Florida from lake sediment records. Quaternary Research 54: 238-245.

Matthews, J.A., Dahl, S.O., Nesje, A., Berrisford, M. and Andersson, C. 2000. Holocene glacier variations in central Jotunheimen, southern Norway based on distal glaciolacustrine sediment cores. Quaternary Science Reviews 19: 1625-1647.

Mayewski, P.A., Meeker, L.D., Whitlow, S., Twickler, M.S., Morrison, M.C., Bloomfield, P., Bond, G.C., Alley, R.B., Gow, A.J., Grootes, P.M., Meese, D.A., Ram, M., Taylor, K.C. and Wumkes, W. 1994. Changes in atmospheric circulation and ocean ice cover over the North Atlantic during the last 41,000 years. Science 263: 1747-1751.

O'Brien, S.R., Mayewski, P.A., Meeker, L.D., Meese, D.A., Twickler, M.S. and Whitlow, S.E. 1995. Complexity of Holocene climate as reconstructed from a Greenland ice core. Science 270: 1962-1964.

Zong, Y. and Tooley, M.J. 1999. Evidence of mid-Holocene storm-surge deposits from Morecambe Bay, northwest England: A biostratigraphical approach. Quaternary International 55: 43-50.

Copyright © 2002. Center for the Study of Carbon Dioxide and Global Change
==========
(5) SIMULATING DISASTERS TO SAVE LIVES

From National Science Foundation, 18 November 2002
http://www.nsf.gov/od/lpa/news/02/tip021118.htm#fourth
In the past year, civil engineers have begun to change the way they look at buildings. A team of civil engineers and computer scientists at Purdue University has developed a new high-performance computing tool that will help to improve the design of critical buildings, such as hospitals and fire stations, which may save lives in the event of a disaster.

The research team, as part of several NSF computer science and engineering projects, used software common in automobile crash testing to create highly realistic simulations of the September 11, 2001, attack on the Pentagon, in which an airliner was crashed into the building. The results were then used to create a vivid re-enactment of the moment of impact.
The most detailed version of the simulation used a mesh of 1 million nodes to represent the airliner and the steel-reinforced columns of the building. Simulating a quarter-second of real time required close to 68 hours on a Purdue supercomputer.
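Those figures give a feel for the cost of this kind of explicit finite-element run. A back-of-envelope Python sketch (only the quarter-second and 68-hour numbers come from the article; the rest is arithmetic):

    simulated_s = 0.25
    wall_clock_s = 68 * 3600

    slowdown = wall_clock_s / simulated_s
    print(f"wall-clock / simulated time: ~{slowdown:,.0f}x")   # ~979,200x

    # At this rate, each simulated second costs about 11 days of computing
    print(f"days per simulated second: {slowdown / 86_400:.1f}")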
"Using this simulation I can do the so-called 'what-if'
study, testing
hypothetical scenarios before actually building a
structure," said project
leader Mete Sozen, the Kettelhut Distinguished Professor of
Structural
Engineering.
Generating detailed and accurate models as well as combining
commercial
software with the special models needed to simulate an airliner
hitting a
building provided the team with its greatest challenges. The
results of the
simulation showed that the reinforced concrete support columns in
the
Pentagon probably saved many lives.
Team members Christoph Hoffmann, professor of computer science,
and Sami
Kilic, a civil engineering research associate, will present the
team's work
in the Research in Indiana exhibit (R1951) at SC2002 on Monday,
Nov. 18, and
Tuesday, Nov. 19.
For more details, see http://www.cs.purdue.edu/homes/cmh/simulation/.
--------------------------------------------------------------------
CCNet is a scholarly electronic network. To subscribe/unsubscribe, please contact the moderator Benny J Peiser <b.j.peiser@livjm.ac.uk>. Information circulated on this network is for scholarly and educational use only. The attached information may not be copied or reproduced for any other purposes without prior permission of the copyright holders. The fully indexed archive of the CCNet, from February 1997 on, can be found at http://abob.libs.uga.edu/bobk/cccmenu.html.

DISCLAIMER: The opinions, beliefs and viewpoints expressed in the articles and texts and in other CCNet contributions do not necessarily reflect the opinions, beliefs and viewpoints of the moderator of this network.
----------------------------------------------------------------------------