CCNet TERRA 15/2003 - 2 April 2003
----------------------------------
"A review of more than 200 climate studies led by
researchers at the Harvard-Smithsonian Center for Astrophysics
has determined that the 20th century is neither the warmest
century nor the century with the most extreme weather of the past
1000 years. While 20th century temperatures are much higher than
in the Little Ice Age period, many parts of the world show the
medieval warmth to be greater than that of the 20th
century."
--Christine Lafon, Harvard-Smithsonian Center for Astrophysics
"Official Science is not made up of working research
scientists, but rather staffs of scientific bureaucracies,
national and international science panels, and so on. These
members of Official Science aren't appointed by scientists to
speak on their behalf, but are middlemen who control the
distribution of research money and define "scientific
truth" for the public. They have the job of striking "a
mad compromise between the realities of politics and the
realities of nature," writes McKitrick. "So while
scientists are skeptical of their own work and that of others,
Official Science speaks with the simple confidence that good
politics requires and journalism demands, but which science
abhors."
--Paul Georgia, Tech Central Station, 28 March 2003
(1) 20TH CENTURY CLIMATE NOT SO HOT
Christine Lafon <clafon@cfa.harvard.edu>
(2) NASA FINDS WIDE ANNUAL FLUCTUATIONS IN ARCTIC OZONE LOSS
NASA News <NASANews@hq.nasa.gov>
(3) ARE STORM TRENDS IN EUROPE DOING WHAT CLIMATE MODELS SAY THEY
SHOULD?
CO2 Science Magazine, 2 April 2003
(4) ABRUPT CLIMATE CHANGE: NOT ON THE HUMAN-INDUCED AGENDA
CO2 Science Magazine, 2 April 2003
(5) STORM FRONT: THE SUBJUGATION OF SCIENCE TO POLITICS
Tech Central Station, 28 March 2003
(6) APRIL FOOL OR APRIL SCARE? "TABLE MOUNTAIN AN ISLAND IN
30 YEARS"
Franz Dullaart <jfd@tenet.ac.za>
(7) DEPLETED URANIUM: SAFETY AND LEGALITY
John Michael Williams <jwill@AstraGate.net>
(8) HOMO SAPIENS AND THE ROOTS OF HUMAN CATASTROPHES, WITH
COMPLIMENTS TO GEORGE ORWELL
Andrew Glikson <geospec@webone.com.au>
(9) AND FINALLY: "SUPER-PNEUMONIA" OR SUPER SCARE?
Scripps Howard News Service, 26 March 2003
==========
(1) 20TH CENTURY CLIMATE NOT SO HOT
From Christine Lafon <clafon@cfa.harvard.edu>
Harvard-Smithsonian Center for Astrophysics
Release No: 03-10
For Immediate Release: March 31, 2003
Cambridge, MA -- A review of more than 200 climate studies led by
researchers at the Harvard-Smithsonian Center for Astrophysics
has determined that the 20th century is neither the warmest
century nor the century with the most extreme weather of the past
1000 years. The review also confirmed that the Medieval Warm
Period of 800 to 1300 A.D. and the Little Ice Age of 1300 to 1900
A.D. were worldwide phenomena not limited to the European and
North American continents. While 20th century temperatures are
much higher than in the Little Ice Age period, many parts of the
world show the medieval warmth to be greater than that of the
20th century.
Smithsonian astronomers Willie Soon and Sallie Baliunas, with
co-authors Craig Idso and Sherwood Idso (Center for the Study of
Carbon Dioxide and Global Change) and David Legates (Center for
Climatic Research, University of Delaware), compiled and examined
results from more than 240 research papers published by thousands
of researchers over the past four decades. Their report, covering
a multitude of geophysical and biological climate indicators,
provides a detailed look at climate changes that occurred in
different regions around the world over the last 1000 years.
"Many true research advances in reconstructing ancient
climates have occurred over the past two decades," Soon
says, "so we felt it was time to pull together a large
sample of recent studies from the last 5-10 years and look for
patterns of variability and change. In fact, clear patterns did
emerge showing that regions worldwide experienced the highs of
the Medieval Warm Period and lows of the Little Ice Age, and that
20th century temperatures are generally cooler than during the
medieval warmth."
Soon and his colleagues concluded that the 20th century is
neither the warmest century over the last 1000 years, nor is it
the most extreme. Their findings about the pattern of historical
climate variations will help make computer climate models
simulate both natural and man-made changes more accurately, and
lead to better climate forecasts especially on local and regional
levels. This is especially true in simulations on timescales
ranging from several decades to a century.
Historical Cold, Warm Periods Verified
Studying climate change is challenging for a number of reasons,
not the least of which is the bewildering variety of climate
indicators - all sensitive to different climatic variables, and
each operating on slightly overlapping yet distinct scales of
space and time. For example, tree ring studies can yield yearly
records of temperature and precipitation trends, while glacier
ice cores record those variables over longer time scales of
several decades to a century.
Soon, Baliunas and colleagues analyzed numerous climate
indicators including: borehole data; cultural data; glacier
advances or retreats; geomorphology; isotopic analysis from lake
sediments or ice cores, tree or peat celluloses (carbohydrates),
corals, stalagmite or biological fossils; net ice accumulation
rate, including dust or chemical counts; lake fossils and
sediments; river sediments; melt layers in ice cores;
phenological (recurring natural phenomena in relation to climate)
and paleontological fossils; pollen; seafloor sediments;
luminescent analysis; tree ring growth, including either ring
width or maximum late-wood density; and shifting tree line
positions plus tree stumps in lakes, marshes and streams.
"Like forensic detectives, we assembled these series of
clues in order to answer a specific question about local and
regional climate change: Is there evidence for notable climatic
anomalies during particular time periods over the past 1000
years?" Soon says. "The cumulative evidence showed that
such anomalies did exist."
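For readers who want a concrete picture of that kind of record-by-record bookkeeping, here is a minimal Python sketch. The proxy names, windows and yes/no flags are invented for illustration only and do not reproduce the authors' data or published method.

# Hypothetical tally, for illustration only: for each proxy record, note
# whether it shows a notable warm anomaly inside the Medieval Warm Period
# window (800-1300 A.D.) and a cold anomaly inside the Little Ice Age
# window (1300-1900 A.D.). These names and flags are invented.
RECORDS = {
    "tree_rings_scandinavia": {"mwp_warm": True,  "lia_cold": True},
    "ice_core_greenland":     {"mwp_warm": True,  "lia_cold": True},
    "lake_sediment_africa":   {"mwp_warm": False, "lia_cold": True},
    "coral_western_pacific":  {"mwp_warm": True,  "lia_cold": False},
}

n = len(RECORDS)
mwp = sum(r["mwp_warm"] for r in RECORDS.values())
lia = sum(r["lia_cold"] for r in RECORDS.values())
print(f"{mwp} of {n} records show a Medieval Warm Period anomaly")
print(f"{lia} of {n} records show a Little Ice Age anomaly")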
The worldwide range of climate records confirmed two significant
climate periods in the last thousand years, the Little Ice Age
and the Medieval Warm Period. The climatic notion of a Little Ice
Age interval from 1300 to 1900 A.D. and a Medieval Warm Period
from 800 to 1300 A.D. appears to be rather well confirmed and
widespread, despite some differences from one region to another
as measured by other climatic variables like precipitation,
drought cycles, or glacier advances and retreats.
"For a long time, researchers have possessed anecdotal
evidence supporting the existence of these climate
extremes," Baliunas says. "For example, the Vikings
established colonies in Greenland at the beginning of the second
millennium that died out several hundred years later when the
climate turned colder. And in England, vineyards had flourished
during the medieval warmth. Now, we have an accumulation of
objective data to back up these cultural indicators."
The different indicators provided clear evidence for a warm
period in the Middle Ages. Tree ring summer temperatures showed a
warm interval from 950 A.D. to 1100 A.D. in the northern high
latitude zones, which corresponds to the "Medieval Warm
Period." Another database of tree growth from 14 different
locations over 30-70 degrees north latitude showed a similar
early warm period. Many parts of the world show the medieval
warmth to be greater than that of the 20th century.
The study -- funded by NASA, the Air Force Office of Scientific
Research, the National Oceanic and Atmospheric Administration,
and the American Petroleum Institute -- will be published in the
Energy and Environment journal. A shorter paper by Soon and
Baliunas appeared in the January 31, 2003 issue of the Climate
Research journal.
NOTE TO EDITORS: Photos of key climate indicators are available
online at
http://cfa-www.harvard.edu/press/pr0310image.html
Headquartered in Cambridge, Massachusetts, the
Harvard-Smithsonian Center for Astrophysics (CfA) is a joint
collaboration between the Smithsonian Astrophysical Observatory
and the Harvard College Observatory. CfA scientists organized
into six research divisions study the origin, evolution, and
ultimate fate of the universe.
For more information, contact:
David Aguilar, Director of Public Affairs
Harvard-Smithsonian Center for Astrophysics
617-495-7462
daguilar@cfa.harvard.edu
Christine Lafon, Public Affairs Specialist
Harvard-Smithsonian Center for Astrophysics
617-495-7463
clafon@cfa.harvard.edu
===============
(2) NASA FINDS WIDE ANNUAL FLUCTUATIONS IN ARCTIC OZONE LOSS
From NASA News <NASANews@hq.nasa.gov>
Elvia H. Thompson
Headquarters, Washington March 28, 2003
(Phone: 202/358-1696)
Alan Buis
Jet Propulsion Laboratory (JPL), Pasadena, Calif.
(Phone: 818/354-0474)
RELEASE: 03-125
NASA FINDS WIDE ANNUAL FLUCTUATIONS IN ARCTIC OZONE LOSS
Ozone depletion over Earth's Arctic region varies widely from
year to year in its amount, timing and pattern of loss. That's
the conclusion of a research team using data from the Microwave
Limb Sounder (MLS) on NASA's Upper Atmosphere Research Satellite.
The findings, published in the current issue of the Journal of
Geophysical Research, provide the first consistent,
three-dimensional picture of ozone loss during multiple Arctic
winters. The findings also confirm the year-to-year variations
seen in previous estimates of Arctic ozone loss.
"This work provides a consistent picture of how Arctic ozone
loss varies between winters," said lead researcher Dr.
Gloria Manney, a senior research scientist with NASA's Jet
Propulsion Laboratory, Pasadena, Calif. "Scientists will
have a better understanding of current Arctic ozone conditions
and be better able to predict variations in the future."
Manney said NASA's unique vantage point in space provides data
needed by policy makers. "They need accurate data to show
whether current regulations on ozone-depleting substances are
having the desired effect," she said. "In this way,
NASA is providing a vital piece of the puzzle needed to
understand this global phenomenon."
Ozone is a form of oxygen that shields life on Earth from harmful
ultraviolet radiation. Earth's stratospheric ozone layer is
thinning around the world outside of the tropics. This thinning
is a result of chlorofluorocarbons produced by industrial
processes, which form reactive compounds like chlorine monoxide
in the stratosphere during winter. To date, ozone loss has been
most pronounced over Antarctica, where colder conditions
encourage greater ozone loss and result in an ozone
"hole."
Higher temperatures and other differences in atmospheric
conditions in the Arctic have thus far prevented similarly large
depletions. Nevertheless, as Manney and her colleagues validated
in 1994, widespread Arctic ozone loss also occurs, and scientists
are eager to understand it better, since formation of an Arctic
ozone "hole" could negatively affect populations in Earth's far
northern latitudes.
Many uncertainties remain regarding ozone depletion. Scientists
want to know what is causing ozone decreases in Earth's mid
latitudes. They also wish to assess effects of climate change on
future ozone loss, especially in the northern hemisphere high
latitudes.
In the new study, Manney's team reanalyzed MLS observations
during seven Arctic winters (1991 - 2000) to estimate chemical
ozone loss. To yield accurate estimates, the team developed a
model to account for naturally occurring ozone variations
resulting from atmospheric transport processes
such as wind variability. Their results show large year-to-year
variability in the amount, timing and patterns of Arctic ozone
loss. Ozone depletion was observed in the Arctic vortex each year
except 1998, when temperatures were too high for chemical ozone
destruction. This vortex is a band of strong winds encircling the
North Pole in winter like a giant whirlpool. Inside the vortex,
temperatures are low and ozone-destroying chemicals are confined.
Ozone loss was most rapid near the vortex edge, with the biggest
losses in 1993 and 1996. The greatest losses occurred in
February and March.
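To make that observed-minus-transport bookkeeping concrete, here is a minimal Python sketch. The ozone values and the transport-only series below are invented for illustration and are not the MLS data or the team's analysis code.

import numpy as np

# Hypothetical vortex-averaged ozone mixing ratios (ppmv) through a winter,
# alongside a "passive" series from a transport-only model run. Chemical
# loss is estimated as the shortfall of the observations relative to what
# transport alone would have produced.
observed  = np.array([3.00, 2.90, 2.70, 2.50, 2.30])  # measured averages
transport = np.array([3.00, 3.00, 2.95, 2.90, 2.85])  # transport-only model

chemical_loss = transport - observed
print("estimated cumulative chemical loss (ppmv):",
      round(float(chemical_loss[-1]), 2))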
The variability in the size, location and duration of the Arctic
vortex is driven by meteorological conditions. High mountains and
land-sea boundaries in the northern hemisphere
interact with wind variations to generate vast atmospheric
undulations that displace air as they travel around Earth. These
waves form in the troposphere (the lowest atmospheric layer),
where they produce our winter storms, and propagate upward,
depositing their energy in the stratosphere. The energy from
these waves warms the stratosphere, suppressing formation of
polar stratospheric clouds necessary for ozone destruction.
Arctic ozone loss tends to be greatest in years when these wave
motions are unusually weak.
NASA's MLS experiments measure naturally occurring microwave
thermal emissions from the limb of Earth's atmosphere to remotely
sense vertical profiles of selected atmospheric gases,
temperature and pressure. These data are unique in their ability
to show the three-dimensional evolution of
ozone loss over time. The Microwave Limb Sounder on the Upper
Atmosphere Research Satellite was the first such experiment in
space. A next-generation MLS, developed and built at JPL for the
Aura mission of NASA's Earth Observing System, is scheduled for
launch in 2004. That instrument will provide simultaneous
observations of ozone and one or more long-lived trace gases,
substantially advancing future studies of ozone loss. The
California Institute of Technology in Pasadena manages JPL for
NASA.
For more information about the Microwave Limb Sounder, see: http://mls.jpl.nasa.gov
===========
(3) ARE STORM TRENDS IN EUROPE DOING WHAT CLIMATE MODELS SAY THEY
SHOULD?
From CO2 Science Magazine, 2 April 2003
http://www.co2science.org/subject/s/summaries/stormseuro.htm
Among the highly publicized doom-and-gloom scenarios that are
alleged to attend the ongoing rise in the air's CO2 content are
predicted increases in the frequency and severity of storms. As a
result, and in an effort to determine if these predictions have
any validity, many scientists are examining historical and proxy
storm records in an attempt to determine how temperature changes
of the past millennium have impacted this aspect of earth's
climate. This summary reviews what some of them have learned
about storm trends in Europe.
A number of studies have reported increases in North Atlantic
storminess over the past two decades (Jones et al., 1997; Gunther
et al., 1998; Dickson et al., 2000). Since climate alarmists
claim this period to have been one of the warmest -- if not the
warmest -- of the entire past millennium, this observation might
appear to vindicate their view of the subject. When a much longer
time period is considered, however, just the opposite is found to
be true.
Dawson et al. (2002), for example, scoured daily meteorological
records of the Royal Meteorological Society held in the archives
of the Society's Scottish Office in Edinburgh for Stornoway
(Outer Hebrides), Lerwick (Shetland Islands), Wick (Caithness)
and Fair Isle (west of the Shetland Islands), recovering all data
pertaining to gale-force winds over the period 1876-1996, which
enabled them to reconstruct a history of storminess for that
period for northern and northwestern Scotland.
Analysis of the data showed that although North Atlantic
storminess and associated North Atlantic wave heights have indeed
increased over the past two decades, "storminess in the
North Atlantic region was considerably more severe during parts
of the nineteenth century than in recent decades." In
addition, whereas the modern increase in storminess appears to be
associated with a recent spate of substantial positive values of
the North Atlantic Oscillation (NAO) index, they say "this
was not the case during the period of exceptional storminess at
the close of the nineteenth century." During that earlier
period, the conditions that determine modern storminess were
apparently overpowered by something even more potent, i.e., cold
temperatures. The cold temperatures, in the view of the authors,
led to an expansion of sea ice in the Greenland Sea, which
expanded and intensified the Greenland anticyclone, which then
led to the North Atlantic cyclone track being displaced farther
south. A similar hypothesis has been expressed by Clarke et al.
(2002), who postulate that a southward spread of sea ice and
polar water results in an increased thermal gradient between
50°N and 65°N that intensifies storm activity in the North
Atlantic and supports dune formation in the Aquitaine region of
southwest France.
The results of the Dawson et al. analysis indicate that increased
storminess and wave heights observed in the North Atlantic Ocean
over the past two decades do not appear to be the result of
global warming. Rather, they are associated with the most recent
periodic increase in the NAO index. Furthermore, a longer
historical perspective reveals that North Atlantic storminess was
even more severe than it is now in the latter part of the
nineteenth century, when it was significantly colder than it is
now. In fact, the storminess of that much colder period was so
great that it was actually decoupled from the NAO index. Hence,
the long view of history suggests that the global warming of the
past century or so has actually led to an overall decrease in
North Atlantic storminess.
Additional evidence for a century-long decrease in storminess in
and around Europe comes from the study of Bijl et al. (1999), who
analyzed long-term sea level records from several coastal
stations in northwest Europe. According to these authors,
"although [the] results show considerable natural
variability on relatively short (decadal) time scales,"
there is "no sign of a significant increase in storminess
... over the complete time period of the data sets."
In the southern portion of the North Sea, however, where natural
variability was more moderate, they found "a tendency
towards a weakening [our italics] of the storm activity over the
past 100 years."
Much the same results were obtained by Pirazzoli (2000), who
analyzed tide-gauge, wind and atmospheric pressure data over the
period 1951-1997 for the northern portion of the Atlantic coast
of France. In that study, the number of atmospheric depressions
(storms) and strong surge winds were found to be decreasing in
frequency. In addition, it was reported that "ongoing trends
of climate variability show a decrease in the frequency and hence
the gravity of coastal flooding."
Tide-gauge data have also been utilized as a proxy for historic
storm activity in England. Using high-water data from the
Liverpool waterfront over the period 1768-1999, Woodworth and
Blackman (2002) report that the annual maximum
surge-at-high-water declined at a rate of 0.11 ± 0.04 meters per
century, suggesting that the winds that are responsible for
producing high storm surges were much stronger and/or more common
during the early part of the record (colder Little Ice Age) than
the latter part (Modern Warm Period).
Lastly, Bielec (2001) analyzed thunderstorm data from Cracow,
Poland for the period 1896-1995, finding an average of 25 days of
such activity per year, with a non-significant
linear-regression-derived increase of 1.6 storm days from the
beginning to the end of the record. From 1930 onward,
however, the trend was negative, revealing a similarly-derived
decrease of 1.1 storm days. It was also determined there was a
decrease in the annual number of thunderstorms with hail over the
entire period and a decrease in the frequency of storms producing
precipitation in excess of 20 mm.
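For readers unfamiliar with how such linear-regression trends are derived, here is a minimal Python sketch using synthetic annual storm-day counts; the numbers are invented for illustration and are not Bielec's Cracow series.

import numpy as np

# Synthetic annual thunderstorm-day counts, invented for illustration; the
# point is only how a linear-regression trend over the full record, and
# over a sub-period after a chosen break year, would be computed.
rng   = np.random.default_rng(0)
years = np.arange(1896, 1996)                       # 100 years of record
days  = 25 + 0.016 * (years - 1896) + rng.normal(0, 4, years.size)

slope_full, _ = np.polyfit(years, days, 1)          # storm days per year
late = years >= 1930
slope_late, _ = np.polyfit(years[late], days[late], 1)

print(f"change over full record: {slope_full * (years[-1] - years[0]):+.1f} storm days")
print(f"change from 1930 onward: {slope_late * (years[-1] - 1930):+.1f} storm days")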
In conclusion, as the earth has warmed over the past hundred or
so years during its recovery from the global chill of the Little
Ice Age, there appears to have been no significant increase in
either the frequency or intensity of stormy weather in Europe. In
fact, most studies suggest just the opposite. This observation,
coupled with the fact that storminess in many other regions of
the world has also decreased as local or regional temperatures
have risen, suggests there is no real-world-data-based reason to
believe that storms will suddenly get worse if the earth were to
warm somewhat more in the future.
References
Bielec, Z. 2001. Long-term variability of
thunderstorms and thunderstorm precipitation occurrence in
Cracow, Poland, in the period 1896-1995. Atmospheric
Research 56: 161-170.
Bijl, W., Flather, R., de Ronde, J.G. and Schmith, T.
1999. Changing storminess? An analysis of long-term
sea level data sets. Climate Research 11: 161-172.
Clarke, M., Rendell, H., Tastet, J-P., Clave, B. and Masse,
L. 2002. Late-Holocene sand invasion and North
Atlantic storminess along the Aquitaine Coast, southwest
France. The Holocene 12: 231-238.
Dawson, A.G., Hickey, K., Holt, T., Elliott, L., Dawson, S.,
Foster, I.D.L., Wadhams, P., Jonsdottir, I., Wilkinson, J.,
McKenna, J., Davis, N.R. and Smith, D.E. 2002.
Complex North Atlantic Oscillation (NAO) Index signal of historic
North Atlantic storm-track changes. The Holocene 12:
363-369.
Dickson, R.R., Osborn, T.J., Hurrell, J.W., Meincke, J.,
Blindheim, J., Adlandsvik, B., Vinje, T., Alekseev, G. and
Maslowski, W. 2000. The Arctic Ocean response to the
North Atlantic Oscillation. Journal of Climate 13:
2671-2696.
Gunther, H., Rosenthal, W., Stawarz, M., Carretero, J.C., Gomez,
M., Lozano, I., Serrano, O. and Reistad, M. 1998. The
wave climate of the northeast Atlantic over the period 1955-1994:
the WASA wave hindcast. The Global Atmosphere and Ocean
System 6: 121-163.
Jones, P.D., Jonsson, T. and Wheeler, D. 1997.
Extension to the North Atlantic Oscillation using early
instrumental pressure observations from Gibraltar and South-West
Iceland. International Journal of Climatology 17:
1433-1450.
Pirazzoli, P.A. 2000. Surges, atmospheric pressure
and wind change and flooding probability on the Atlantic coast of
France. Oceanologica Acta 23: 643-661.
Woodworth, P.L. and Blackman, D.L. 2002. Changes in
extreme high waters at Liverpool since 1768. International
Journal of Climatology 22: 697-714.
Copyright © 2003. Center for the Study of Carbon Dioxide
and Global Change
===========
(4) ABRUPT CLIMATE CHANGE: NOT ON THE HUMAN-INDUCED AGENDA
From CO2 Science Magazine, 2 April 2003
http://www.co2science.org/edit/v6_edit/v6n14edit.htm
"Large, abrupt, and widespread climate changes with major
impacts have occurred repeatedly in the past, when the Earth
system was forced across thresholds." Thus begins the
abstract of a major new review of the subject of rapid climate
change (Alley et al., 2003), which was written by the authors of
an earlier National Research Council (NRC) report dealing with
the same topic (Alley et al., 2002).
The new review article and the NRC report that inspired it
contain both reasonable and illogical proposals. On the sensible
side of the ledger is Alley et al.'s (2003) suggestion that
policy-makers should consider "improving monitoring systems,
and taking actions designed to enhance the adaptability and
resilience of ecosystems and economies." Since no one can
accurately predict the future trajectory of world climate -- and
many cannot even agree what has happened in the past, most
notably with respect to the temperature history of the planet --
it should be clear that reliable meteorological monitoring
systems are definitely needed. And since we cannot yet actually
do anything about either weather or climate, adaptation to what
nature brings our way is the only viable option for making the
best of whatever climate surprises might possibly confront us in
the future.
On the logically-deficient side of the ledger is the authors'
suggestion that "human forcing of climate change is
increasing the probability of large, abrupt events." It is
our contention that not only is this suggestion incorrect, it is
fully 180 degrees out of phase with reality.
Consider, for example, the study of Helmke et al. (2002), who
developed a history of late Pleistocene climate variability from
an analysis of a deep-sea sediment core retrieved from a
well-studied ice-rafted debris belt in the northeast Atlantic
Ocean. This exercise led them to detect and quantify three
distinct levels of climate variability that have been operative
over the past half-million years. Their findings? Maximum climate
variability occurred during times of either ice sheet growth or
ice sheet decay, medium climate variability was the norm during
glacial maxima, while minimum climate variability was observed
during what Helmke et al. call "peak interglaciations,"
which are essentially periods of greatest warmth.
Similar conclusions were reached by Oppo et al. (1998) and
McManus et al. (1999), also as a result of analyzing real-world
data. What is more, Alley et al.'s main model-based scenario of
possible human-induced abrupt climate change -- which the NRC
authors link to the exceeding of some threshold value of the
ocean's thermohaline circulation, which they say could be caused
by "warming and associated changes in the hydrological
cycle" -- has been challenged by another model study. Based
on a set of sensitivity analyses of the response of the ocean's
thermohaline circulation to the freshening of North Atlantic
surface water that could be caused by the predicted warming of
the 21st century, Rind et al. (2001) concluded that one of the
two major driving forces of the thermohaline circulation, i.e.,
North Atlantic deep water formation, "decreases linearly
with the volume of fresh water added," and that it does so
"without any obvious threshold effects," noting
additionally that "the effect is not rapid."
Clearly, if we are dispassionate in the application of logic, and
if we really want to do something to reduce the likelihood of
abrupt climate change, real-world data pertaining to the planet's
palaeoclimatic history (as well as some climate model work) tell
us we should be attempting to prevent global cooling. And, in
fact, that is precisely what we are doing via our burning of
massive quantities of coal, gas and oil.
So how is this grand -- but unplanned -- endeavor progressing?
Actually, not very well, for the planet's temperature is running
well below "normal." Based on the Antarctic ice-core
study of Petit et al. (1999), the present interglacial is more
than 2°C cooler than all four of the interglacials that preceded
it. Although earth's temperature may have risen half a degree C
over the last century or so, we can take little credit for that
development, as the bulk of the warming was likely a result of
the most recent natural upswing of the probably-solar-induced
millennial-scale climatic oscillation that is a persistent
feature of both glacial and interglacial epochs alike, the two
prior upswings of which resulted in the establishment of the
Medieval Warm Period and the antecedent Roman Warm Period. In
addition, the planet is still considerably cooler than it was
throughout the great "climatic optimum" of the
mid-Holocene, which itself defined much of the mean temperature
of the current interglacial - a mean that falls far short of the
corresponding mean temperature of the prior four interglacials.
Where does all of this leave us? On the one hand, logic and
real-world data suggest that the potentially catastrophic abrupt
climate changes discussed by Alley et al. would best be avoided
by either maintaining the climatic status quo or by warming. On
the other hand, the predictions of manifestly imperfect climate
models -- which sometimes predict nearly opposite outcomes, as in
the case of the response of the ocean's thermohaline circulation
to global warming -- are used by climate alarmists and a host of
politicians to mandate measures to resist warming.
These two positions are about as different from each other as
night and day. Hence, it should not be too difficult for a
reasonably rational person to decide which is the more correct.
The question of paramount importance for each of us, therefore,
is this: Am I a reasonably rational person?
Sherwood, Keith and Craig Idso
Copyright © 2003. Center for the Study of Carbon Dioxide
and Global Change
===========
(5) STORM FRONT: THE SUBJUGATION OF SCIENCE TO POLITICS
From Tech Central Station, 28 March 2003
http://www.techcentralstation.com/1051/envirowrapper.jsp?PID=1051-450&CID=1051-032803E
By Paul Georgia
Does the greenhouse effect really work like a greenhouse? Does
the average global temperature provide any meaningful climatic
information? Is there even a theory of climate? These are some of
the questions asked and answered in a new book, Taken by Storm:
The Troubled Science, Policy and Politics of Global Warming,
written by Christopher Essex, a professor in the Department of
Applied Mathematics at the University of Western Ontario, and
Ross McKitrick, an associate professor in the Department of
Economics at the University of Guelph.
As the title notes, the book addresses both science and politics.
As we shall see, the science underlying global warming alarmism
is flimsier than most people, even many scientists, suspect. How
the world has come to the verge of putting into force a treaty -
the Kyoto Protocol - that would stifle economic growth in the
developed countries and preclude it in the third world, all in
the absence of scientific evidence, demands an answer. The answer
lies in the perverse incentives that arise from the subjugation
of science to politics.
No Physical Meaning
Essex, who studies the underlying mathematics, physics and
computation of complex dynamic processes, raises some very
fundamental scientific issues with regard to global warming.
Take, for instance, the "average global temperature,"
which is the primary statistic offered as evidence of global
warming. The problem with this statistic is that it has no
physical meaning. Temperature is not the kind of thermodynamic
variable that can simply be averaged across systems, so the
resulting statistic does not measure a physical quantity.
Thermodynamic variables are of two types, says Essex, extensive
and intensive. Extensive variables, like energy or mass, occur in
amounts. Intensive variables, such as temperature, refer to
conditions of a system. A cup of hot coffee, for example,
contains an amount of energy and has a temperature. If you add an
equal amount of coffee with the same amount of energy and the
same temperature to the cup, the amount of energy doubles, but
not the temperature. The temperature remains the same. Thus,
while you can add up the energy from two separate systems and get
total energy, it is physically meaningless to add up the two
systems' temperatures. And dividing that number by two doesn't
give you the average temperature either. Such an exercise results
in a statistic that has no physical meaning. Yet that is exactly
what occurs when the average global temperature is computed.
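A small worked example may make the point concrete. The masses and temperatures below are invented for illustration; the point is simply that energies add while temperatures do not, so a plain arithmetic mean of temperature readings is not, in general, the temperature of anything.

# Worked example with invented numbers: mix two water masses and compare
# the physically meaningful equilibrium temperature with the simple
# arithmetic mean of the two readings.
c = 4186.0            # specific heat of water, J/(kg K)
m1, T1 = 1.0, 90.0    # 1 kg of hot water at 90 C
m2, T2 = 4.0, 10.0    # 4 kg of cold water at 10 C

E_total = c * m1 * T1 + c * m2 * T2        # energies (relative to 0 C) add
T_mix   = E_total / (c * (m1 + m2))        # heat-capacity-weighted mean: 26 C
T_naive = (T1 + T2) / 2.0                  # arithmetic mean of readings: 50 C

print(f"equilibrium temperature {T_mix:.1f} C vs arithmetic mean {T_naive:.1f} C")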
Moreover, temperature and energy aren't the same thing. The
internal energy of a system can change without changing the
temperature and the temperature can change while the internal
energy of the system remains the same. In fact, this occurs all
the time in the climate because the two variables are
fundamentally different classes of thermodynamic variables and
there is no physical law that requires that they move together.
The next time somebody informs you that the planet's
"average temperature" has increased, you can rest
assured that they have told you exactly nothing.
Flawed Metaphor
Taken by Storm also takes on the greenhouse metaphor. The
so-called greenhouse effect does not work like a greenhouse.
Incoming solar radiation adds energy to the Earth's climate. To
restore radiative balance, the energy must be transported back to
space in roughly the same amounts in which it arrived. The energy
is transported via two processes - infrared radiation (heat
transfer) and fluid dynamics (turbulence). "When you add up
the net amount of energy flow away from the surface by pure
(infrared) radiation, it turns out to be roughly the same as that
carried away by wind, air movements and evaporation," says
Essex.
A greenhouse, however, works by preventing fluid motions, such as
wind, by enclosing an area with plastic or glass. To restore
balance, infrared radiation must increase, thereby causing the
temperature to rise. Predicting the resulting temperature
increase is a relatively straightforward process. If you think of
the climate system as a gigantic greenhouse, it's easy to assume
that we can calculate how much the temperature will rise from an
increase in greenhouse gases. But the greenhouse picture ignores
the fluid dynamics half of the story.
The "greenhouse effect" works differently. Greenhouse
gases slow down outgoing infrared radiation, which causes changes
in turbulence. But what will happen cannot be predicted, because
the equations which govern turbulence cannot be solved!
"In the case of turbulence," writes Essex, "we
can't even forecast from first principles the average flow in a
simple pipe." The climate system is a vastly more complex
turbulent system than a pipe. It is impossible to determine from
first principles whether an increase in greenhouse gases will
lead to warming, cooling or no change.
This is why Essex argues that there is no theory of climate. We
do have equations governing turbulence, but not on the enormous
scale of climate. "There is no one living on climate scales
to observe structures, do experiments or establish physically
meaningful structure for us," he says. "We are little
better than bacteria in a test tube trying to deduce from first
principles what the laboratory ought to look like."
Model Problems
Essex also takes on the practice of parameterization in climate
modeling. Parameters are simple numbers that stand in place of
tremendously complex climatic processes that we really don't
understand. Parameters used in climate models are not derived
from theory, which doesn't exist at the level of climate, or from
observation, says Essex. "Everything [in the climate models]
from convection to clouds, rain and the general cycle of water
into and out of the system - everything that has to do with
moving energy from the surface of the Earth to space - is made
up." This is very important because a slight change in a
model parameter can lead to totally different projections.
Climate models, for example, assume that temperature decreases by
6.5 degrees Celsius per kilometer of altitude. At a certain
altitude the temperature is such that the amount of radiation
entering the climate system equals the amount leaving it. This
altitude is known as the "characteristic emission
level" (CEL). A doubling of atmospheric carbon dioxide
concentrations causes the CEL to move out about 300 meters. To
figure out how this affects the surface, one can simply calculate
the temperatures from the CEL by adding 6.5 degrees C per
kilometer all the way down to the surface, which shows a warming
of about 2 degrees C at the surface.
The problem with this exercise is that there is no single rate at
which temperature changes with altitude. It varies from 4 to 10
degrees C per kilometer. Nor is there a single level for the CEL, but an
infinite number of levels. The 6.5 degree C figure is just a
parameter based on a simplistic model of the atmosphere. If you
change that parameter to 6.2 degrees C the projected warming
disappears. At 6.1 degrees C the model projects cooling.
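A minimal arithmetic sketch of this parameterized calculation is given below. The emission temperature (about 255 K) and the height of the characteristic emission level (taken here as 6 km) are illustrative assumptions chosen so the arithmetic roughly matches the figures quoted above; they are not values taken from the book.

# Sketch of the parameterized calculation described above. T_CEL (the
# radiative-balance temperature) and Z_CEL (the assumed height of the
# characteristic emission level) are illustrative assumptions.
T_CEL = 255.0    # K, fixed by radiative balance
Z_CEL = 6.0      # km, assumed characteristic emission level
DZ    = 0.3      # km, assumed rise of that level for doubled CO2

def surface_temp(lapse_rate, z):
    """Extrapolate from the emission level down to the surface."""
    return T_CEL + lapse_rate * z

baseline = surface_temp(6.5, Z_CEL)                    # model's reference state
for lapse in (6.5, 6.2, 6.1):                          # post-doubling lapse rates
    change = surface_temp(lapse, Z_CEL + DZ) - baseline
    print(f"lapse rate {lapse} C/km -> surface change {change:+.2f} C")

Run as written, the three cases give roughly +2 C, about zero, and a slight cooling, mirroring the sensitivity described in the text.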
"The models show surface warming from adding carbon dioxide
to the atmosphere because of their programming," says Essex.
"They could yield surface cooling with different
programming, without violating any physical law. All that is
required is to allow things to change in the model that do in
fact change in the atmosphere."
'Official Science'
Surely scientists are aware of these issues (and many others
discussed in the book). If so, then how have we come to a place
where the media and politicians repeatedly state that there is a
scientific consensus that the planet is warming up, it is caused
by man, and the effects will be catastrophic? Dr. McKitrick
offers a very convincing explanation. He discusses several
relevant groups, but we'll focus on politicians and what
McKitrick calls "Official Science."
Politicians need big issues around which they can form winning
coalitions. Global warming is great because it is a complex and
baffling scientific issue that can be reduced to a simple matter
of "warming" without the public being the wiser, and it
allows politicians the prospect of becoming global statesmen and
proposing heroic planet-saving initiatives. However, such
initiatives can be very costly, so politicians need a high degree
of scientific support.
This is where Official Science comes in. Official Science is not
made up of working research scientists, but rather staffs of
scientific bureaucracies, national and international science
panels, and so on. These members of Official Science aren't
appointed by scientists to speak on their behalf, but are
middlemen who control the distribution of research money and
define "scientific truth" for the public. They have the
job of striking "a mad compromise between the realities of
politics and the realities of nature," writes McKitrick.
"So while scientists are skeptical of their own work and
that of others, Official Science speaks with the simple
confidence that good politics requires and journalism demands,
but which science abhors."
The United Nations' Intergovernmental Panel on Climate Change is
the premier representative of Official Science. Its governing
principles state that it shall concentrate its activities on
"actions in support of the UN Framework Convention on
Climate Change process." This is Official Science in the
service of politics. The IPCC has repeatedly proclaimed that it
represents "the consensus" on global warming science.
Former IPCC Chairman Robert Watson stated in 2001 that the IPCC's
summary report added "impetus for governments of the world
to find ways to live up to their commitments ... to reduce
emissions of greenhouse gases."
So what can be done to fix this problem of science shilling for
politics? How do we "make sure science is free to
investigate [climate change], without having to prove constantly
that this or that is relevant to policy issues?" Essex and
McKitrick offer a modest proposal. Instead of one IPCC report,
there should be two reports. One report would be written by
global warming alarmists and the other by global warming
skeptics, each drawn from the ranks of scientists. Each group
would make the best case for its position and provide a rebuttal
of the other's position. There would be no summaries written by
Official Science middlemen to "interpret" the meaning
of the two reports. If politicians wanted to know the state of
the science, they would have to read the reports themselves and
come to their own conclusions.
Will it happen? Not likely. Official Science isn't about to
relinquish its monopoly on defining scientific truth, nor the
perquisites (good pay, travel to exotic locations) of its
position.
Paul Georgia is an environmental policy analyst with the
Competitive Enterprise Institute in Washington, D.C.
Copyright 2003, Tech Central Station
============================
* LETTERS TO THE MODERATOR *
============================
(6) APRIL FOOL OR APRIL SCARE? "TABLE MOUNTAIN AN ISLAND IN
30 YEARS"
From Franz Dullaart <jfd@tenet.ac.za>
Hi Benny
You may be interested in this story which appeared in this
morning's (1 April) Cape Times. I hope it's not intended to be an
April Fool joke! However, IF the sea level were to rise, the dire
predictions would come true.
Abstract:
Table Mountain: an island in 30 years
Robben Island and much of Cape Town permanently submerged...
Table Mountain an island within 30 years - that's the shocking
picture painted in a report by the Institute for Global Warming
and Sea Level Changes. The report, which was leaked to the Cape
Times, had been kept under wraps for fear of creating panic and
causing havoc with property prices.
Full story here:
http://www.iol.co.za/index.php?click_id=13&art_id=vn20030401064128675C194430&set_id=1
Regards
Franz
===========
(7) DEPLETED URANIUM: SAFETY AND LEGALITY
From John Michael Williams <jwill@AstraGate.net>
Hi Benny.
You published several arguments in "TERRA: ENVIRONMENTAL RISK
ASSESSMENT OF THE WAR AGAINST SADDAM" (Tue, 1 Apr 2003) which
concluded that depleted uranium (DU) was not toxic as used in
warfare.
I would like to point out that uranium is a heavy metal and a
cumulative poison that damages the kidneys in particular. Merely
because one dose doesn't obviously kill or sicken those nearby is
not valid evidence that it is safe. The same biopollyannaism was
the cause of the asbestos catastrophe, and it allowed tobacco
companies to market cigarettes for years.
Furthermore, unlike tungsten, which is about equally toxic,
uranium is radioactive. Thus, local damage because of (mainly
alpha) radiation has to be factored in. Both chemically and
radiologically, uranium is cumulatively harmful. Veterans with DU
shrapnel in their bodies have been found to be excreting uranium
for years after being wounded.
The second issue is the way it is delivered in combat: The
projectile essentially disintegrates into a blob of energized
dust, somewhat like a high-energy interaction in a particle
accelerator. It is converted to a fine dust which then may
be inhaled by persons in the target (usually a tank or ship).
It would be illegal under the 1925 Geneva Protocol to pulverize
lead or uranium and deliver it as a cloud of dust over enemy
trenches; so, what is the argument that DU dust from a projectile
is not a violation of this law? Aren't exploding bullets
(dumdums) illegal? Is it legal because we
have a lot of it and it is effective? This argument would
work for chlorine gas, too... All the WWI veterans are dead now,
so I guess it's OK. Who's going to complain?
Reviews of the toxicity issues of DU in combat may be found at:
http://ccnr.org/du_hague.html
http://www.antenna.nl/wise/uranium/index.html
Specific issues of damage during the 1991 Gulf War and the Balkan
War are at:
http://www.deploymentlink.osd.mil/du_library/
http://www.ngwrc.org/Dulink/du_link.htm
http://ccnr.org/index_uranium.html#dir
The actual effectiveness of DU vs tungsten is studied at http://arXiv.org/abs/physics/0301059
So far as I know, the only nontoxic heavy metal is bismuth. I
would advocate this metal as a replacement for DU in ammunition.
It currently is used in bird shot, and, like DU, it is more
effective than the mildly toxic lead it replaces.
--
John
jwill@AstraGate.net
John Michael Williams
==============
(8) HOMO SAPIENS AND THE ROOTS OF HUMAN CATASTROPHES, WITH
COMPLIMENTS TO GEORGE ORWELL
From Andrew Glikson <geospec@webone.com.au>
Preamble
Attempts at reaching high ENDS through the use of the lowest
MEANS, including murder and genocide, reduce any ENDS to a mere
banner under which men devoid of a sense of humanity and justice
rise yet again to harness the gullible masses for wars against
each other and the destruction of the biosphere.
----------
In the beginning was the WORD. Ludwig Wittgenstein, George
Orwell, Noam Chomsky and (to make a distinction) Joseph Goebbels
knew how inexorably catch cries, buzz words and superlatives
rush the collective adrenaline into the collective brain,
translating fear into hate into aggression and war, that cyclic
ritual mass sacrifice unique to the species "Homo
Sapiens".
With nauseating predictability, as memories of the vast carnage
of WWII fade, once again drums are beating, heralding a clash
between true believers in a desert sky god and disciples of the
global market force. An examination of the language, slogans and
aphorisms used to fuel the looming conflicts reveals but empty
lies on every side. Examples abound.
Traditionally, followers of ancient vengeful gods never cease to
dehumanize the "infidel", the alien and the weak, not
least women - veiled and subjected to "honor killing"
or stoning, whether by the Taliban or the 16th-century French
Catholic church (where six million women were burnt at the stake
as "witches"). Religion seems to have made little
difference to slavery and mass murder, whether of Africans by
Moslem slave traders or of Indians by Spanish gold seekers,
leaving entire continents in ruin.
Under the banner of Aryan supremacy, Teutonic warriors of the 3rd
Reich carried out a cowardly genocide of helpless populations,
defiling any claims for "honor". Revolutions eat their
own sons - under pretexts for social justice Stalin's secret
police annihilated the idealists and betrayed their own troops.
Under the banner of democracy - the "people's rule" -
the Athenians legitimized the privileges of land and ship owners
at the expense of war captives and slaves. In more recent history
the democratic ideal was corrupted by the rise of the likes of
Adolf Hitler, who never hid his genocidal plans, as manifested in
"Mein Kampf"! More than 50 million lost their lives as
a result of the 1933 German elections. Three million Vietnamese
were napalm bombed in their rice paddies in the name of freedom.
Once banners are placed ahead of human lives and basic human
needs - it's all the same for the victims.
Nor can salvation be found in Milton Friedman's market force, the
antithesis of Keynesian liberalism, a force that chops the hand
of a hungry child for "stealing" a loaf of bread in an
oriental marketplace, or enslaves peasants to cash crops. Greed
as an automatic economic panacea run by snake oil merchants is
but the latest myth and a guise for the re-emergence of the
"survival of the fittest" paradigm.
That much of the world is run by subterranean networks of drug
rings, arms dealers and intelligence agencies is more than a
conspiracy theory. That the west is ruled by big corporations as
much as by elected governments is a truism.
How shocked were the Romans at Spartacus' uprising, the victories
of slaves to which the Romans accorded no humanity. How little
can affluent suburbia-international comprehend the despair of
starving masses in Africa and in Asia. Little effort is
undertaken to heal the roots of conflict. With this
failure, fundamentalist forces whose simple minded
"solutions" hinge on racial genocide are once again on
the rise.
Whether the next carnage is rationalized by the arms trade, oil
or just the innate destructiveness of the species, the very
re-invention of George Orwell's 'Newspeak' with its
self-congratulatory superlatives heralds the demise of ideals
generations fought and died for - peace, social justice, the
United Nations. Nowadays "winners" never tire of
writing themselves into history, boasting a "competitive
edge" or "world's best practice" - including the
practice of planetary ecocide.
There is little point in blaming "leaders"; they would
not have been able to pursue their murderous goals had it not
been for the acquiescing masses, brainwashed and corrupted by
effective bread-and-circuses strategies, forever re-electing the
same elites through one political party or the other. At the root
is a universal primeval double standard - one rule for
"us" and one for "them", for example one rule
for the white-skinned and another for the dark skinned. Arthur
Koestler suggested the only solution lies in "sanity
drugs", as a substitute for the multitude of poisons humans
otherwise consume, a suggestion unlikely to be accepted...
And while what has been done in Auschwitz and Bergen Belsen, to
the American Indian, to the Australian aboriginal, to African
herdsmen, or to rice farmers of southeast Asia can hardly be
undone, knights in shiny armor never die, not until the last
non-believer, the last tree and the last drop of water are gone.
The criminal insanity of ecocide and mass destruction is not
unique to any particular political or religious system; its
universality suggests the species "Homo Sapiens" is
very ill and will take no remedies.
When Seneca - the Roman 'Arbiter Elegantiae' - heard of his death
sentence, he wrote to Nero with his last breath: "Dear Caesar -
go on and kill, rape and pillage, but, I beseech you, spare us
your obscene poetry and dull songs!"
On reading this letter, the Emperor collapsed.
Andrew Glikson
25.3.03
==============
(9) AND FINALLY: "SUPER-PNEUMONIA" OR SUPER SCARE?
From Scripps Howard News Service, 26 March 2003
http://www.fumento.com/disease/sars.html
By Michael Fumento
"It is the worst medical disaster I have ever seen,"
the Dean of Medicine at the Chinese University of Hong Kong told
a prominent Asian newspaper. This irresistible quote was then
shot 'round the world by other media, seeking desperately to hype
the "mysterious killer pneumonia" or
"super-pneumonia." But a bit of knowledge and
perspective will kill this panic.
Start with those scary tags. "Mysterious" in modern
medicine usually means we haven't yet quite identified the cause,
although it appears we have now done so here. What's been
officially named Severe Acute Respiratory Syndrome (SARS) appears
to be one or more strains of coronavirus, commonly associated
with colds.
"Killer pneumonia" is practically a redundancy, since
so many types of pneumonia (there are over 50) do kill.
The real questions are: How lethal, how transmissible, and how
treatable is this strain? And the answers leave no grounds for
excitement, much less panic. Super?
At this writing, SARS appears to have killed 49 people out of
1323 afflicted according to the World Health Organization, a
death rate of less than four percent. In Hong Kong, that alleged
"worst medical disaster" has killed ten people out of
316 known victims. But since this only takes into account those
ill enough to seek medical help, the actual ratio of deaths to
infections is certainly far less.
In contrast, the 1918-1919 flu pandemic killed approximately a
third of the 60 million afflicted.
Further, virtually all of the deaths have been in countries with
horrendous health care, primarily mainland China. In the U.S., 40
people have been hospitalized with SARS. Deaths? Zero.
Conversely, other forms of pneumonia kill about 40,000 Americans
yearly.
Transmissibility?
Each year millions of Americans alone contract the flu. Compare
that with those 40 SARS cases and - well - you can't compare
them. Further evidence that SARS is hard to catch is that health
care workers and family members of victims are by far the most
likely to become afflicted.
Treatability?
"There are few drugs and no vaccines to fight this
pathogen," one wire service panted breathlessly. But there
are also few drugs to fight any type of viral pneumonia, because
we have very few antiviral medicines. Nevertheless, more become
available each year and one of the oldest, ribavirin, appears
effective against SARS.
So why all the fuss over this one strain of pneumonia?
First, never ignore the obvious: It does sell papers.
But an added feature to this scare is the cottage industry that's
grown up around so-called "emerging infectious
diseases." Some diseases truly fit the bill, with AIDS the
classic example. Others, like West Nile Virus in North America,
are new to a given area.
But there's fame, fortune, and big budgets in sounding the
"emerging infection" alarm and warning of our terrible
folly in being unprepared. The classic example is Ebola virus,
which is terribly hard to catch, remains in Africa where it's
always been, is now usually non-fatal, and - despite what
reporters love to relate - does not turn the victims' internal
organs "into mush."
Yet you'd almost swear that every outbreak of Ebola in Africa is
actually taking place in Chicago. Laurie Garrett rode Ebola onto
the bestseller list and talk show circuit with her book The
Coming Plague: Newly Emerging Diseases in a World out of Balance.
Since then, the U.S. government and various universities have
also seen these faux plagues as budget boosters. The CDC
publishes a journal called Emerging Infectious Diseases, though
in any given issue it's hard to find an illness that actually
fits the definition.
The U.S. Institute of Medicine just issued a report warning that
we're grossly unprepared to deal with emerging pathogens.
Soothingly, however, it adds that it's nothing that an injection
of lots of tax dollars can't cure.
Meanwhile, a disease that emerged eons ago called malaria kills
up to 2.7 million people yearly. Another, tuberculosis, kills
perhaps three million more. Both afflict Americans, albeit at
very low rates.
The big money and headlines may be in the so-called
"emerging diseases," but the cataclysmic illnesses come
from the same old (read: boring) killers. How do our priorities
get so twisted? There's your mystery.
Copyright 2003 Scripps Howard News Service
--------------------------------------------------------------------
CCNet is a scholarly electronic network. To subscribe/unsubscribe,
please contact the moderator Benny J Peiser
<b.j.peiser@livjm.ac.uk>. Information circulated on this network
is for scholarly and educational use only. The attached
information may not be copied or reproduced for any other
purposes without prior permission of the copyright holders. The
fully indexed archive of the CCNet, from February 1997 on, can be
found at http://abob.libs.uga.edu/bobk/cccmenu.html.
DISCLAIMER: The opinions, beliefs and viewpoints expressed in the
articles and texts and in other CCNet contributions do not
necessarily reflect the opinions, beliefs and viewpoints of the
moderator of this network.
--------------------------------------------------------------------