CCNet 111/2003 - 25 November 2003

Guided by Japanese writings from an era of shoguns, an international team of
scientists today reported new evidence that an earthquake of magnitude 9 struck
the northwestern United States and southwestern Canada three centuries ago.
Their findings are likely to affect the region's precautions against future
earthquakes and tsunamis.
    --Science Daily, 21 November 2003

Remember a few years ago when killer asteroids were all the rage? News magazine
cover stories went into gruesome detail about the death and destruction an
asteroid collision with Earth could wreak. Movies and television delighted in
the special effects of cities being incinerated. Then we found more pressing
matters to worry about, and asteroids were relegated to the mental shelf where
such remote but entertaining threats as a second Ice Age, alien landings and
mutant killer viruses are stored, ready if need be but of no immediacy.
     --Dale McFeatters, Scripps Howard News Service, 21 November 2003

(1) Science Daily, 21 November 2003
(2) Andrew Yee
(3) Dale McFeatters
(4) Zurab Silagadze
(5) Space Daily, 24 November 2003
(6) Tech Central Station, 21 November 2003
(7) Tech Central Station, 19 November 2003
(8) Space Daily, 24 November 2003
(9) BBC News Online, 24 November 2003
(10) PC Magazine, 19 November 2003
(11) Andrew Yee
(12) AND FINALLY: BLOWING IN THE WIND, 22 November 2003


Science Daily, 21 November 2003

WASHINGTON - Guided by Japanese writings from an era of shoguns, an international team of scientists today reported new evidence that an earthquake of magnitude 9 struck the northwestern United States and southwestern Canada three centuries ago. Their findings are likely to affect the region's precautions against future earthquakes and tsunamis.

Writing in the Journal of Geophysical Research-Solid Earth, published by the American Geophysical Union, scientists from Japan, Canada and the United States summarize old reports of flooding and damage by a tsunami in 1700 on the Pacific coast of Japan. With the aid of computer simulations, they conclude that this tsunami must have been generated by a North American earthquake of close to magnitude 9. Such an earthquake would, in a few minutes, release about as much energy as the United States now consumes in a month.
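The energy comparison can be sanity-checked with the standard Gutenberg-Richter relation for radiated seismic energy, log10(E) = 1.5 Mw + 4.8 (E in joules). The US consumption figure below is an assumed round number for the early 2000s, not taken from the article; this is a back-of-the-envelope sketch, not the authors' calculation.

```python
import math

def seismic_energy_joules(magnitude: float) -> float:
    """Radiated seismic energy from the Gutenberg-Richter relation:
    log10(E) = 1.5 * Mw + 4.8, with E in joules."""
    return 10 ** (1.5 * magnitude + 4.8)

quake_energy = seismic_energy_joules(9.0)  # ~2e18 J for a magnitude 9 event

# Assumed figure (not from the article): US primary energy use around 2003
# was roughly 1.0e20 J per year, i.e. ~8e18 J per month.
us_monthly_consumption = 1.0e20 / 12

print(f"M9 radiated energy: {quake_energy:.2e} J")
print(f"US monthly consumption (assumed): {us_monthly_consumption:.2e} J")
print(f"ratio: {quake_energy / us_monthly_consumption:.2f}")
```

The radiated energy is only a fraction of the total energy released by the rupture, so the press release's month-scale comparison is of the right order even though the radiated figure alone comes out somewhat smaller.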

The report's authors are Kenji Satake, of the Geological Survey of Japan; Kelin Wang, of the Geological Survey of Canada; and Brian Atwater, of the United States Geological Survey, based at the University of Washington in Seattle.

The earthquake apparently ruptured the full length of an enormous fault, known as the Cascadia subduction zone, which extends more than 1,000 kilometers [600 miles] along the Pacific coast from southern British Columbia to northern California. Until the early 1980s, this fault was thought benign by most scientists, said Atwater. But then a swift series of discoveries in North America showed that the fault produces earthquakes of magnitude 8 or larger at irregular intervals, averaging about 500 years. The most recent of the earthquakes, dated by radiocarbon methods, occurred between 1680 and 1720.

These discoveries raised a further question: Can the fault produce earthquakes of magnitude 9? Such a giant earthquake would produce low-frequency shaking, lasting minutes, that might now threaten tall buildings from Vancouver, British Columbia, to northern California. A giant Cascadia earthquake would also warp large areas of seafloor, thereby setting off a train of ocean waves -- a tsunami -- that could prove destructive even on the far side of the Pacific Ocean.

Such international concerns motivated the research described today. "At issue for North Americans," said Atwater, "is how to adjust building codes and tsunami-evacuation plans to reduce losses of life and property in the event of a future magnitude 9 earthquake in southern British Columbia, Washington, Oregon and northern California."

Few scientists took that threat in the Cascadia region seriously until 1996, when Japanese researchers, in a letter to the journal Nature, stunned their North American colleagues by linking a tsunami in Japan to geologic reports of an earthquake and tsunami at the Cascadia subduction zone.

From the tsunami's arrival time in Japan, the Japanese researchers assigned the earthquake to the evening of Tuesday, January 26, 1700. In addition, from preliminary estimates of the tsunami's height in Japan, they guessed that it was too large to explain by a Cascadia earthquake of less than magnitude 9.

That guess was on target, according to today's report in the Journal of Geophysical Research-Solid Earth. The researchers begin by showing that the 1700 tsunami crested as much as five meters [16 feet] high in Japan. They then use recent findings about the Cascadia subduction zone to relate earthquake size to plausible areas of fault rupture and seafloor displacement. Finally, they employ computer simulations of trans-Pacific tsunamis to tune the estimates of earthquake size at Cascadia to the estimated tsunami heights in Japan.
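The timing argument behind the January 26, 1700 date rests on how fast a tsunami crosses the Pacific. In the open ocean a tsunami behaves as a shallow-water wave, whose speed depends only on depth: c = sqrt(g h). The depth and distance below are assumed round numbers for illustration, not the study's values, which come from real bathymetry.

```python
import math

g = 9.81                # gravitational acceleration, m/s^2
mean_depth_m = 4000.0   # assumed mean Pacific depth (round number)
distance_m = 7.5e6      # assumed Cascadia-to-Japan path, ~7,500 km

# Tsunami wavelengths far exceed the ocean depth, so the wave "feels"
# the bottom everywhere and travels at the shallow-water speed.
speed = math.sqrt(g * mean_depth_m)          # ~200 m/s, jetliner speed
travel_hours = distance_m / speed / 3600.0   # roughly ten hours

print(f"wave speed: {speed:.0f} m/s")
print(f"crossing time: {travel_hours:.1f} hours")
```

A crossing time of order ten hours is what lets the arrival times recorded in Japanese documents pin the Cascadia rupture to a particular evening three centuries ago.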

The findings, said Atwater, justify precautions taken recently by engineers and emergency personnel. Under construction standards adopted since 1996, engineers have sought to design buildings to withstand giant earthquakes in the northwestern United States. At the same time, state and local officials have devised evacuation routes from areas believed subject to a tsunami from a Cascadia earthquake of magnitude 9. In Canada, buildings constructed in Vancouver and Victoria since 1985 are designed to resist stronger shaking from local earthquakes than is expected from the next Cascadia earthquake. Canada's 2005 building code will explicitly include the hazard from the subduction zone, said Wang.

Wang also noted that the giant fault responsible for this earthquake is currently "locked," accumulating energy for a future destructive event. "Scientists in the United States, Canada, and Japan are carefully monitoring the fault's activities using seismological and geodetic methods and making comparisons with a similar fault in southwestern Japan," said Wang. "With a combination of a better understanding of the previous earthquake and modern observations, we hope to better define the potential rupture area of the future event."

Lead author Satake noted that since their first report in 1996 about the possible relationship between the Japanese documents and the American earthquake, the Geological Surveys of the three countries have conducted a joint project on the Cascadia earthquake. "As a result of this international collaboration," he said, "we have collected more evidence, made rigorous interpretation of it, and have modeled the earthquake source and tsunami propagation by using the latest techniques. Consequently, we have confirmed that the 1700 earthquake was magnitude 9."

An animation, prepared by Kenji Satake, which shows hourly snapshots of the simulated tsunami moving across the Pacific Ocean for a full day, may be viewed at

This story has been adapted from a news release issued by American Geophysical Union.

Copyright 1995-2003 ScienceDaily Magazine  

Andrew Yee


Tuesday, September 9, 2003

About 300,000 tonnes of space garbage orbiting Earth

NALCHIK (Interfax-South) -- About 300,000 tonnes of space garbage, including
boosters, carrier rockets and satellites whose service life has expired, are
orbiting the Earth, Lyudmila Rykhlova, a researcher at the Russian Academy of
Sciences' Institute of Astronomy, has stated.

Since the start of the space era, over 4,000 space launches have been made. Only
5% of the almost 1,000 artificial space objects orbiting the Earth are in
operation, Rykhlova told an international conference on near-earth astronomy in
the mountain village of Terskol in Russia's internal republic of Kabardino-Balkaria.

This factor must be taken into account when new launches are planned. Russian
researchers have drawn up charts of objects observed from Earth, including about
10,000 "satellites" with a diameter of over one meter, she said.

The conference brought together more than 100 researchers from Russia, Ukraine,
Poland and Bulgaria, who will discuss the origin and migration of small bodies
in the solar system, the influence of the interplanetary medium on space and
Earth objects, the problem of celestial bodies' collisions with the Earth and
the pollution of the atmosphere.

2003 Interfax. All rights reserved.



Sunday, August 17, 2003

Russian, European experts to monitor space garbage jointly

MOSCOW (Interfax) -- Russian researchers have proposed establishing a
Trans-European monitoring system to prevent satellite collisions with asteroids
and space garbage.

"Over 200,000 objects in space that could be described as space garbage are in
near Earth orbits. Asteroids are also dangerous. A network of telescopes and
radars needs to be created to monitor and tackle these problems," said Igor
Molotov, an expert from the Russian Academy of Sciences' Pulkovo Observatory.

"The equipment available in Europe is not sufficient. Therefore a new project
involving Europe's means of surveillance and the optical facilities and radars
of former Soviet republics has been launched," he said.

"The new system will be able to warn of small pieces of space garbage and
monitor them round-the-clock in any weather. There are telescopes and radars
located from Spain to the Far East, covering several time zones," Molotov said.

"The system will be capable of finding new asteroids and measuring their orbits
and determining their physical properties, which will help make long-term
forecasts on dangerous space collisions and evaluate the consequences of
possible impacts," he said.

2003 Interfax. All rights reserved.



By Dale McFeatters

(SH) - Remember a few years ago when killer asteroids were all the rage?

News magazine cover stories went into gruesome detail about the death and destruction an asteroid collision with Earth could wreak. Movies and television delighted in the special effects of cities being incinerated.

Then we found more pressing matters to worry about, and asteroids were relegated to the mental shelf where such remote but entertaining threats as a second Ice Age, alien landings and mutant killer viruses are stored, ready if need be but of no immediacy.

It is considered a reasonable scientific certainty that 65 million years ago the impact of a six-mile-wide meteor killed off the dinosaurs and might have killed us off, too, had we been around.

Now scientists say that there is evidence of an earlier mass extinction due to a meteor.

Fragments found in Antarctica suggest that a mountain-sized meteor hit the Earth 251 million years ago, killing off 90 percent of all life. "There were no large animals then, but there were lots of species living on the land and in the seas, and there were plants," said University of Rochester professor Asish Basu, a co-author of the meteor study.

It shows something about the resilience and determination of our ancestral single cells that life on Earth survived two meteor-caused mass extinctions.

There are two ways to look at this death-from-outer-space.

Either the meteors encountered the Earth at random, in which case there's no point in worrying.
Or they strike at specific intervals, meaning the next one isn't due for another 186 million years, in which case there's no reason to worry.

But it never hurts to look up once in a while.



Zurab Silagadze

Dear Dr. Peiser,

Maybe the following article will be of interest to the CCNet audience:
(Tunguska genetic anomaly and electrophonic meteors)

I wanted to submit it to Astronomy & Astrophysics but the editor answered
that "the manuscript cannot be considered for publication in Astronomy and
Astrophysics, as the journal is dedicated to publishing results of
astrophysical research. While quite fascinating, your article is based
mainly on biological evidence. We therefore suggest that you submit it to
a journal in that discipline."

I would appreciate it if you could advise which journal I should submit it to.

With best regards, Zurab Silagadze.  

Tunguska genetic anomaly and electrophonic meteors
Authors: Z.K. Silagadze
Comments: 15 pages, LaTeX, A&A style

One of the great mysteries of the Tunguska event is its genetic impact. Some genetic
anomalies have been reported in the plants, insects and people of the Tunguska region.
Remarkably, an increased rate of biological mutation was found not only within the
epicenter area, but also along the trajectory of the Tunguska Space Body (TSB). Yet no
traces of radioactivity were found that could be reliably associated with the Tunguska
event. The main hypotheses about the nature of the TSB (a stony asteroid, a comet
nucleus or a carbonaceous chondrite) readily explain the absence of radioactivity but
give no clue as to how to deal with the genetic anomaly. As far as the genetic anomaly
is concerned, a choice between these hypotheses is like the choice between ``blue
devil, green devil and speckled devil'', to quote the late Academician N.V. Vasilyev.
However, if another mysterious phenomenon, electrophonic meteors, is invoked, the
origin of the Tunguska genetic anomaly becomes less obscure.


Space Daily, 24 November 2003

New York - Nov 24, 2003
Buyers looking for that perfect piece of real estate for their retirement sanctuary or vacation getaway have grown accustomed to rising property prices. But one company claims to have discovered a place where bargains still exist. There's just one catch: the property is not on Earth.

For a modest price (about $30 an acre), the Lunar Registry offers lunar land claims for sale over the Internet. The organization, which also advocates lunar exploration and settlement, donates 95 percent of each sale to the Kennedy II Lunar Exploration Project, a partnership between investors and aerospace contractors that hopes to develop permanent communities on the moon by 2015.

"Space law experts agree that actual occupation of the moon is the only legal method for ownership of lunar property," said David Ferrell Jackson, managing director of the Lunar Registry. "Through our partnership with the Kennedy Project, we plan to make lunar settlement a reality."

Property demand has driven prices upward at such an alarming rate that many investors are expanding their horizons to include the moon, which represents one of the last remaining outposts of peace and quiet for those looking to get away from it all, says Jackson.

Peace and quiet may be an understatement, as no human has set foot on the moon since Apollo 17 landed there in 1972. But numerous international organizations seek to return people to the moon over the next two decades, making settlement a distinct possibility.

Those who purchase land claims through the Lunar Registry would enjoy full rights to their property should the Kennedy II Lunar Exploration Project prove successful, says Jackson.

Buyers may select property from a list of specific lunar regions, such as the Sea of Tranquility, the Bay of Rainbows and the Sea of Dreams. Ownership packages include a personalized deed and a satellite photograph of the property.

According to Jackson, the Lunar Registry has already sold more than 400,000 acres of lunar property. The organization will only make two percent of the moon's nine billion acres available for land claims, meaning plenty of space is still available for that perfect vacation home.


Tech Central Station, 21 November 2003
By Sallie Baliunas
If human activities were having a dramatic effect on globally-averaged temperature, then the temperature in the low atmosphere should be rising at a rate faster than at the Earth's surface. A flurry of recent studies continues to round out the picture and suggests that alarmism about catastrophic anthropogenic global warming is more hype than scientific fact.

The best analysis of air temperature over the last 25 years is based on measurements made from satellites and checked with information from weather balloons. That work, conducted by J. Christy and R. Spencer at the University of Alabama at Huntsville (UAH), shows a small global warming trend. Even if the small trend were entirely human-caused -- an unlikely possibility because temperature exhibits many naturally-caused changes -- it contradicts the forecasts of extreme, human-made global warming.

It's not surprising, then, that the satellite measurements are intensely studied and debated.

Let's start with some background. There is concern that the air's increasing content of carbon dioxide and other greenhouse gases, mainly from the burning of fossil fuels, may cause substantial global warming. Because naturally-caused temperature changes have always occurred and will continue to occur, the human effect must be searched for against that varying temperature backdrop, preferably in areas of the climate system especially sensitive to human-caused warming.

Nearly all computer simulations of climate say that the air layer extending from just above the surface to a height of about five miles -- called the troposphere -- is very sensitive to human-made global warming. Thus, one important test of the human effect -- a stronger warming trend, especially in the low troposphere compared to the surface -- should be obvious in the reliable balloon and satellite observations that have been made since 1978.

Specifically, near the surface, the globally-averaged temperature constructed from thermometer readings scattered across the globe rose about 0.4 C during the last 25 years. That is the period in which the air's concentration of human-produced greenhouse gases has been rapidly increasing. The low troposphere should show a greater warming trend -- about 0.5 C over the last 25 years -- if the human-made global warming effect is pronounced.

Three new analyses of troposphere temperatures have appeared in the publications Science and Journal of Atmospheric and Oceanic Technology. They all start with the same set of measurements made from satellites, but find different results. Because not one but a series of satellites has collected the data, corrections need to be made to the measurements from each instrument to produce a precise record of temperature going back over two decades. How to find the best result?

The first of the three recent analyses of the satellite measurements appeared in Journal of Atmospheric and Oceanic Technology. It continues the UAH work by Christy and Spencer. It is solidly based because its results were checked by careful comparison with good measurements from the weather balloons. The UAH team finds a small warming trend of approximately 0.07 C per decade in the low troposphere over the last 25 years, with an uncertainty of a few hundredths of a degree C. In an independent study by NOAA researchers published earlier this year in Journal of Climate, good weather balloon information agrees with the UAH satellite analysis.
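A decadal trend like the one quoted is an ordinary least-squares slope fitted to the monthly temperature series. A minimal sketch of that fitting step, on synthetic anomalies whose trend and noise level are invented for illustration (the real work lies in the inter-satellite corrections, not the fit itself):

```python
import random

random.seed(42)

# 25 years of synthetic monthly temperature anomalies: a linear trend of
# 0.07 C per decade buried in 0.2 C month-to-month noise (values invented).
months = list(range(300))
trend_per_month = 0.07 / 120.0
anomalies = [trend_per_month * m + random.gauss(0.0, 0.2) for m in months]

# Ordinary least-squares slope: cov(x, y) / var(x).
n = len(months)
mean_x = sum(months) / n
mean_y = sum(anomalies) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(months, anomalies))
         / sum((x - mean_x) ** 2 for x in months))

trend_per_decade = slope * 120.0
print(f"fitted trend: {trend_per_decade:.3f} C/decade")
```

With 300 points the fit recovers a 0.07 C/decade trend to within a few hundredths of a degree, which is why, as the article notes, the contentious part of the satellite record is the splicing of instruments, checked here against balloons, rather than the trend estimation.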

That observed trend is much cooler than estimates of the human-made trend, according to the computer simulations.

The remaining two studies consider the same satellite measurements and find results consistent with computer-based forecasts of globally-averaged human warming. But those two studies also produce contradictory results, indicating the small temperature trend from UAH is the most reliable.

The second of the three analyses of satellite data, developed by a team led from Lawrence Livermore Labs and appearing in Science, claims to find a substantial rise in the height of the top of the troposphere, called the tropopause. The increase in height of the tropopause nearly matches that predicted by computer simulations for human-made global warming resulting from a significant warming in the troposphere below.

But that second analysis contains within itself its own counterargument, which wasn't mentioned in the study. The temperature of the troposphere -- presumed to be the cause of the observed tropopause rise -- is an easier and more direct measurement to make than the difficult-to-measure (and, for the computer simulations, to estimate) changing height of the tropopause. The temperature trend derived from the second analysis of the satellite data shows a small warming trend in tropospheric temperature, which agrees with UAH's trend within the uncertainties. The conclusion is that the measurement and modeling of changes in the tropopause height may be too uncertain to use in the question of evaluating the size of the human-made warming trend.

The third analysis of the satellite data, made by a team led from the University of Maryland and also published in Science, claims to see a significant warming trend in the troposphere, consistent with forecasts of a human-made enhanced greenhouse effect. But it, too, seems uncertain, as evidenced by the following.

The satellite instruments sample the air temperature often enough to see afternoon, sun-heated warmth and evening, sun-absent cooling. This third analysis incorrectly shows cool temperatures in the early to mid-afternoon and warm temperatures after sunset. That odd result seems to arise from the omission of an important temperature correction owing to calibration errors among satellite instruments, noted five years ago by the UAH team, and independently verified by Remote Sensing Systems in California, whose work is just appearing in Journal of Climate. Also, no independent balloon comparisons for cross-checking results were provided to readers in that second Science paper.

Thus, the UAH analysis, which has been thoroughly scrutinized by many independent researchers and measurements, shows a small temperature trend in the low troposphere. It has been checked by good weather balloon measurements, and may be the most reliable indicator of the temperature of the low troposphere of the last 25 years.

The small troposphere temperature trend indicates that the human-made part of the warming trend at the surface has been exaggerated by at least a factor of two to three. Adjusting forecasts downward by the same amount suggests a human-made global warming trend of less than 1 degree C over the next 100 years, an amount that would be lost in the background of natural change, thereby answering panic with scientific facts.

Copyright 2003, Tech Central Station


Tech Central Station, 19 November 2003
By David R. Legates

The conventional wisdom has been that temperatures during the early years of the last millennium (~A.D. 800 to 1300) were relatively warmer -- in what was known as the Medieval Warm Period -- while temperatures decreased during the middle years of the millennium (~A.D. 1400 to 1850) -- during what was known as the Little Ice Age. During the 1900s, temperatures increased as a result of a number of factors, including the demise of the Little Ice Age. Both introductory scientific texts and the extensive scientific literature confirm these facts.

But in 1999, Dr. Michael Mann of the University of Virginia and his colleagues produced what has now become known as the 'hockey stick' curve -- a representation of the annual temperature for the Northern Hemisphere over the last millennium. This curve, compiled by averaging a number of proxy records (secondary or inferred sources from which assumptions about temperature can be drawn), shows a very slight cooling trend from A.D. 1000 to 1900 with a dramatic warming during the 1900s. This led Dr. Mann, the Intergovernmental Panel on Climate Change (IPCC), and the US National Assessment of Climate Change to assert that the 1990s were the warmest decade of the last millennium with 1998 being the warmest year.

But is the "hockey stick" assumption consistent with the observations? Harvard astrophysicists Dr. Willie Soon and Dr. Sallie Baliunas and their colleagues contend that it isn't. After examining more than 240 individual proxy records analyzed by nearly 1000 researchers, they concluded that taken individually, proxy records offer strong support for the widespread existence of both the Medieval Warm Period and the Little Ice Age and that they do not support the claim that the climate of the 20th Century is unusual when compared to the variability over the last millennium.

So why does Mann's "hockey stick" representation of average Northern Hemisphere temperature fail to retain the fidelity of individual proxy records? Many reasons involve detailed statistical issues, although some are rather obvious and fundamental. For example, Mann contends that the curve represents Northern Hemisphere temperature trends. So why is it that four of the twelve proxy sources used for the pre-A.D. 1400 analysis are from the Southern Hemisphere? Mann also simply affixed thermometer-based estimates for the 1900s to the end of his proxy averages -- a classic apples-versus-oranges comparison -- thereby producing the characteristic 'hockey stick' shape. But the thermometer-based record shows more variability than the proxy records during the 1900s, and Mann represents it without any assignment of uncertainty. If the thermometer-based record were not included, or if a satellite-based temperature record (where only a small warming trend exists for the late 1900s) were used instead, the claim that the 1990s were the warmest decade becomes unfounded. Even if a reasonable estimate of the error in the thermometer-based record were provided, the claim becomes questionable. Moreover, the range of uncertainty for the pre-A.D. 1400 analysis depends on a single proxy source for western North America; and Mann admits that his entire millennial reconstruction hinges on that single source.
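One statistical point in this dispute can be illustrated directly: when proxy series are averaged, any noise that is independent between proxies partially cancels, so a mean curve is flatter than typical individual records. A toy demonstration with entirely invented series (12 proxies, echoing the pre-1400 network size, but with arbitrary signal and noise levels), showing only the variance-damping effect and proving nothing about any specific reconstruction:

```python
import math
import random
import statistics

random.seed(1)
n_proxies, n_years = 12, 200

# Hypothetical common climate signal: a slow oscillation of amplitude 0.3 C.
signal = [0.3 * math.sin(2.0 * math.pi * t / n_years) for t in range(n_years)]

# Each proxy records the signal plus large independent non-climatic noise
# (in the tree-ring case: competition, fire, pests, drought).
proxies = [[s + random.gauss(0.0, 0.5) for s in signal]
           for _ in range(n_proxies)]

# Average the proxies year by year, as a simple composite would.
mean_series = [statistics.mean(p[t] for p in proxies) for t in range(n_years)]

indiv_sd = statistics.mean(statistics.pstdev(p) for p in proxies)
avg_sd = statistics.pstdev(mean_series)
print(f"typical single-proxy variability: {indiv_sd:.2f} C")
print(f"variability of the 12-proxy mean: {avg_sd:.2f} C")
```

Independent noise shrinks in the mean roughly as 1/sqrt(N) while the shared signal does not, which is why individual records can show pronounced warm and cold episodes that look muted in a composite.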

But do proxy records really represent air temperature fluctuations? Most of the analyses on which the "hockey stick" relies are taken from tree-ring cores. Trees, however, respond not only to temperature fluctuations but also to species competition, fire episodes, pest infestations, and droughts. For example, if rainfall is limited, as often is the case in western North America (where the preponderance of data for Mann's pre-A.D. 1400 analysis is located), tree growth is severely restricted, regardless of the temperature conditions. It is impossible under such conditions to discriminate between a cold period and a dry period -- which is why Soon and Baliunas correctly characterized their assessments as "climate anomalies" rather than boldly asserting that they reflect air temperature fluctuations, as Mann does. Moreover, Dr. Jan Esper of the Swiss Federal Research Institute and colleagues demonstrated that their careful analysis of tree-ring chronologies yields an annual temperature curve for a large portion of the Northern Hemisphere that, unlike the "hockey stick," clearly shows the existence of the Medieval Warm Period and that temperatures during the early years of the millennium were commensurate with those of the 1900s.

These and other more complex issues are fundamental reasons the "hockey stick" is being challenged on scientific grounds by a number of serious scientists. But the IPCC and the US National Assessment of Climate Change continue to demand that policy be based on this flawed and biased research. We must take a closer look at the "science" behind the IPCC and, in this case, ask the question, "How much of the warming of the 20th Century was 'man-induced' and how much of it is 'Mann-induced'?"

David R. Legates is Associate Professor and Director of the Center for Climatic Research at the University of Delaware and Research Fellow at the Independent Institute in Oakland, Calif., publisher of the new report, New Perspectives in Climate Change: What the EPA Isn't Telling Us.

Copyright 2003, Tech Central Station


Space Daily, 24 November 2003

Moffett Field - Nov 24, 2003
A NASA scientist has discovered that future solar-power satellite systems designed to harvest sunlight, convert solar electric energy into weak microwaves and beam them down to Earth to make electricity, are not harmful to green plants.

During the simple experiment, the scientist bathed a tray of alfalfa plants in weak, 2.45 GHz microwaves for seven weeks with no ill effects. The microwaves were about 1 million times weaker than those an average kitchen microwave oven makes.

The test took place in a laboratory at NASA Ames Research Center in California's Silicon Valley, and is the first of many experiments scientists plan to conduct to see if an array of solar-power satellites designed to send microwave power to Earth could affect plant life.

"A tray of growing plants was illuminated with microwaves while control plants were grown behind a microwave-opaque shield. Test plants and the control plants were subjected to the same environment otherwise," said NASA Ames scientist Jay Skiles, who designed the experiment and recently presented its results at the 54th International Astronautical Congress in Bremen, Germany.

"In all measured variables, there was no difference between the control and the microwave treatment plants," Skiles added. A 'control' is a parallel experiment in which the factor being tested in the main experiment is left out in order to provide a way for scientists to judge that factor.

In 1968, scientists proposed putting solar-power satellites into orbit about 22,000 miles above the ground, where these spacecraft could harvest sunlight for its energy. While the satellites would collect sunshine to make direct current (DC), they also would be converting the DC to some form of radiation, most likely microwaves. The satellites then would broadcast the microwave energy to the Earth's surface, where power plants would reconvert it into electricity for distribution.

"Over the ensuing decades, the space-power satellite concept has been studied from the view of engineering feasibility and cost per kilowatt, with only little attention paid to the biological consequences to organisms exposed to continuous microwave radiation," Skiles said. "The hypothesis of my experiment was that plants exposed to microwaves would be no different from those plants not exposed to microwaves," he said.

Skiles used off-the-shelf equipment to conduct the experiment. He used the same nutrients and watering techniques on two sets of plants, only one of which was exposed to microwaves.

A microwave generator with an antenna and a parabolic reflector beamed microwaves onto the test plants from the side so as not to block lights placed above the plants. A sheet metal microwave shield protected the 'control' plants from the microwaves so Skiles could compare the non-microwaved plants with the microwaved plants.

Skiles measured the chlorophyll concentration of the alfalfa leaves in the microwaved and non-microwaved plants. He measured the plants' stem lengths, and also harvested, dried and weighed the plants. He found there were no significant differences in the microwave-treated plants and the untreated control plants. Skiles chose to test alfalfa because it is an important crop that animals and people eat. Alfalfa also represents a broad class of economically important plants, he added.

Unlike radioactive materials, microwaves cannot burn living things, but microwaves do generate heat. However, Skiles reported, "Even though I tested microwaves on alfalfa, I didn't see any increase in plant or soil temperatures."

Skiles plans to conduct additional experiments to test plants outdoors, as well as under other conditions. "I want to test plants growing in a glasshouse to determine the effects of microwaves on the plants during daily changes of light and temperature," he said.

"Another experiment will be to grow cereal plants, including wheat and oats, to determine the effect of microwaves on the kinds of plants that humankind depend on for food," Skiles continued.

He also plans to test whether or not microwaves provide a competitive advantage for some kinds of plants when several different species are growing in the same area. In another experiment, he is planning to examine the genes of one plant species to learn the effects of weak microwaves on that plant. Additional experiments to test effects of climate change, watering and other conditions also may be conducted, according to Skiles.

BBC News Online, 24 November 2003

A famously polluted canal is now supporting one of the fastest growing fish populations in the UK, experts say.

The water quality is so good in the Manchester Ship Canal after a three-year clean-up initiative that it has gone from supporting five species of marine life to more than 30.

Before the operation began the canal was so polluted it was reportedly in danger of catching fire and was "virtually lifeless and hazardous to human life", researchers said.

However, with 15 tonnes of liquid oxygen pumped into the canal every day since 2000, its "biodiversity is booming", the Mersey Basin Campaign (MBC) revealed on Monday....


PC Magazine, 19 November 2003

By John C. Dvorak

Blogs, or Web logs, are all the rage in some quarters. We're told that blogs will evolve into a unique source of information and are sure to become the future of journalism. Well, hardly. Two things are happening to prevent such a future: The first is wholesale abandonment of blog sites, and the second is the casual co-opting of the blog universe by Big Media.

Let's start with abandoned blogs. In a white paper released by Perseus Development Corp., the company reveals details of the blogging phenomenon that indicate its foothold in popular culture may already be slipping. According to the survey of bloggers, over half of them are not updating any more. And more than 25 percent of all new blogs are what the researchers call "one-day wonders." Meanwhile, the abandonment rate appears to be eating into well-established blogs: Over 132,000 blogs are abandoned after a year of constant updating.
Perseus thinks it had a statistical handle on over 4 million blogs, in a universe of perhaps 5 million. Luckily for the blogging community, there is still evidence that the growth rate is faster than the abandonment rate. But growth eventually stops.

The most obvious reason for abandonment is simple boredom. Writing is tiresome. Why anyone would do it voluntarily on a blog mystifies a lot of professional writers. This is compounded by a lack of feedback, positive or otherwise. Perseus thinks that most blogs have an audience of about 12 readers. Leaflets posted on the corkboard at Albertsons attract a larger readership than many blogs. Some people must feel the futility.

The problem is further compounded by professional writers who promote blogging, with the thought that they are increasing their own readership. It's no coincidence that the most-read blogs are created by professional writers. They have essentially suckered thousands of newbies, mavens, and just plain folk into blogging, solely to get return links in the form of the blogrolls and citations. This is, in fact, a remarkably slick grassroots marketing scheme that is in many ways awesome, albeit insincere.

Unfortunately, at some point, people will realize they've been used. This will happen sooner rather than later, since many mainstream publishers now see the opportunity for exploitation. Thus you find professionally written and edited faux blogs appearing on MSNBC's site, the Washington Post site, and elsewhere. This seems to be where blogging is headed: Big Media. So much for the independent thinking and reporting that are supposed to earmark blog journalism.

So now we have the emergence of the professional blogger working for large media conglomerates and spewing the same measured news and opinions we've always had, except with a fake edginess that suggests some sort of independent, counterculture, free-thinking observers. But who signs the checks? The faux blog will replace the old personality columns that were once the rage in newspaperdom. Can you spell retro? These are not the hard-hitting independent voices we were promised. They are just a new breed of columnist with a gimmick and a stern corporate editor.
This trend is solid. A look at Columbia Journalism Review's recent listing of traditional-media blogs shows everyone getting into the act: ABC News, FOX, National Review, The New Republic, The Christian Science Monitor, The Boston Globe, The Wall Street Journal, and so on. The blogging boosters, meanwhile, are rooting like high-school cheerleaders over this development. To them, it's some sort of affirmation. In fact, it's a death sentence. The onerous Big Media incursion marks the beginning of the end for blogging. Can you spell co-opted?

I'm reminded of the early days of personal computing, which began as a mini-revolution with all sorts of idealism. Power to the people, dude. IBM was portrayed as the antithesis of this revolution. But when IBM jumped on board in 1981 and co-opted the entire PC scene, it was cheered. Welcome, brother! Apple even took out a semiflippant full-page national newspaper ad welcoming IBM. Actually, the ad reflected Apple's neediness and low self-esteem. IBM represented affirmation about as much as Big Media is affirmation for the hopeless bloggers.
Another so-called revolution bites the dust. Big surprise.
Copyright 2003, PC Magazine.


Andrew Yee <>

Monday, November 24, 2003; Page A08

On the Web, Research Work Proves Ephemeral

Electronic Archivists Are Playing Catch-Up in Trying to Keep Documents From
Landing in History's Dustbin

By Rick Weiss, Washington Post Staff Writer

It was in the mundane course of getting a scientific paper published that
physician Robert Dellavalle came to the unsettling realization that the world
was dissolving before his eyes.

The world, that is, of footnotes, references and Web pages.

Dellavalle, a dermatologist with the Veterans Affairs Medical Center in Denver,
had co-written a research report featuring dozens of footnotes -- many of which
referred not to books or journal articles but, as is increasingly the case these
days, to Web sites that he and his colleagues had used to substantiate their findings.

Problem was, it took about two years for the article to wind its way to
publication. And by that time, many of the sites they had cited had moved to
other locations on the Internet or disappeared altogether, rendering useless all
those Web addresses -- also known as uniform resource locators (URLs) -- they
had provided in their footnotes.

"Every time we checked, some were gone and others had moved," said Dellavalle,
who is on the faculty at the University of Colorado Health Sciences Center. "We
thought, 'This is an interesting phenomenon itself. We should look at this.' "

He and his co-workers have done just that, and what they have found is not
reassuring to those who value having a permanent record of scientific progress.
In research described in the journal Science last month, the team looked at
footnotes from scientific articles in three major journals -- the New England
Journal of Medicine, Science and Nature -- at three months, 15 months and 27
months after publication. The prevalence of inactive Internet references grew
during those intervals from 3.8 percent to 10 percent to 13 percent.

"I think of it like the library burning in Alexandria," Dellavalle said,
referring to the 48 B.C. sacking of the ancient world's greatest repository of
knowledge. "We've had all these hundreds of years of stuff available by
interlibrary loan, but now things just a few years old are disappearing right
under our noses really quickly."

Dellavalle's concerns reflect those of a growing number of scientists and
scholars who are nervous about their increasing reliance on a medium that is
proving far more ephemeral than archival. In one recent study, one-fifth of the
Internet addresses used in a Web-based high school science curriculum
disappeared over 12 months.

Another study, published in January, found that 40 percent to 50 percent of the
URLs referenced in articles in two computing journals were inaccessible within
four years.

"It's a huge problem," said Brewster Kahle, digital librarian at the Internet
Archive in San Francisco. "The average lifespan of a Web page today is 100 days.
This is no way to run a culture."

Of course, even conventional footnotes often lead to dead ends. Some experts
have estimated that as many as 20 percent to 25 percent of all published
footnotes have typographical errors, which can lead people to the wrong volume
or issue of a sought-after reference, said Sheldon Kotzin, chief of
bibliographic services at the National Library of Medicine in Bethesda.

But the Web's relentless morphing affects a lot more than footnotes. People are
increasingly dependent on the Web to get information from companies,
organizations and governments. Yet, of the 2,483 British government Web sites,
for example, 25 percent change their URL each year, said David Worlock of
Electronic Publishing Services Ltd. in London.

That matters in part because some documents exist only as Web pages -- for
example, the British government's dossier on Iraqi weapons. "It only appeared on
the Web," Worlock said. "There is no definitive reference where future
historians might find it."

Web sites become inaccessible for many reasons. In some cases individuals or
groups that launched them have moved on and have removed the material from the
global network of computer systems that makes up the Web. In other cases the
sites' handlers have moved the material to a different virtual address (the URL
that users type in at the top of the browser page) without providing a direct
link from the old address to the new one.

When computer users try to access a URL that has died or moved to a new
location, they typically get what is called a "404 Not Found" message, which
reads in part: "The page cannot be displayed. The page you are looking for is
currently unavailable."

So common are such occurrences today, and so iconic has that message become in
the Internet era, that at least one eclectic band has named itself "404 Not
Found," and humorists have launched countless knockoffs of the page -- including
one site that looks like a standard error page but scolds people for spending
too much time on their computers ("This page cannot be displayed because you
need some fresh air ...") and another that offers political commentary about the
U.S. war in Iraq ("The weapons you are looking for are currently unavailable.").

Not all apparently inaccessible Web sites are really beyond reach. Several
organizations, including the popular search engine Google and Kahle's Internet
Archive, are taking snapshots of Web pages and archiving them
as fast as they can so they can be viewed even after they are pulled down from
their sites. The Internet Archive already contains more than 200 terabytes of
information (a terabyte is a million million bytes) -- equivalent to about 200
million books. Every month it is adding 20 more terabytes, equivalent to the
number of words in the entire Library of Congress.

"We're trying to make sure there's a good historical record of at least some
subsets of the Web, and at least some record of other parts," Kahle said. "We're
injecting the past into the present."

But with an estimated 7 million new pages added to the Web every day, archivists
can do little more than play catch-up. So others are creating new indexing and
retrieval systems that can find Web pages that have wandered to new addresses.

One such system, known as DOI (for digital object identifier), assigns a virtual
but permanent bar code of sorts to participating Web pages. Even if the page
moves to a new URL address, it can always be found via its unique DOI.

Standard browsers cannot by themselves find documents by their DOIs. For now, at
least, users must go through intermediary "registration agencies" -- such as one called
CrossRef -- and "handle servers," which together work like digital switchboards
to lead subscribers to the DOI-labeled pages they seek. A hodgepodge of other
retrieval systems is cropping up, as well -- all part of the increasingly
desperate effort to keep the ballooning Web's thoughts accessible.

If it all sounds complicated, it is. But consider the stakes: The Web contains
unfathomably more information than did the Alexandria library. If our culture
ends up unable to retrieve and use that information, then all that knowledge
will, in effect, have gone up in smoke.

[Research editor Margot Williams contributed to this report.]

2003 The Washington Post Company

(12) AND FINALLY: BLOWING IN THE WIND, 22 November 2003

The senior senator from Massachusetts, Edward Kennedy, now finds himself in something of a political dilemma.

At stake is a proposal to build a massive wind turbine farm - right in the middle of historic Nantucket Sound near Cape Cod, the so-called 'Cape Wind' project. As usual, such a project will bring ruination to the landscape and the seascape but this is the logical outcome arising from the pro-Kyoto policies that Kennedy himself has promoted. So first, Kennedy the environmentalist speaks -

"I strongly support renewable energy, including wind energy as a means of reducing our dependence on foreign oil and protecting the environment."

All very motherhood. Then a bit of family history and a eulogy about his responsibilities to the 'treasures' of Cape Cod and Nantucket Sound -

"My family has a long history on Cape Cod. After growing up and raising my children here, I understand the enormous national treasure we have in the Cape. We have an obligation to preserve it for future generations, which requires us to know the impact of our decisions on the landscape, seascape, and environment."

More motherhood. But what if these lofty aims are in conflict? At that point, Kennedy quickly remembers where his votes come from:

"I'm concerned that we are rushing to implement the Cape Wind proposal (for Nantucket Sound) - the world's largest proposed wind farm, 130 turbines, 400 feet tall in the waters between the Cape and the Islands - with little understanding of its likely impacts."

It's a bit late for Kennedy suddenly to discover virtue, now that one of his pet policies is going to be built right in his own back yard. He is partly to blame for the political climate which brought the Cape Wind project into existence in the first place, something his political rivals may well remind the voters about.

CCNet is a scholarly electronic network. To subscribe/unsubscribe,
please contact the moderator Benny Peiser <>.
Information circulated on this network is for scholarly and educational
use only. The attached information may not be copied or reproduced for
any other purposes without prior permission of the copyright holders.
DISCLAIMER: The opinions, beliefs and viewpoints expressed in the
articles and texts and in other CCNet contributions do not necessarily
reflect the opinions, beliefs and viewpoints of the moderator of this network.
