[plug] Microsoft researchers recommend Linux!

Leon Brooks leon at brooks.fdns.net
Mon Jun 2 23:53:37 WST 2003


    http://www.nytimes.com/2003/06/02/technology/02SUPE.html

Highlight:

    The researchers, Gordon Bell and Jim Gray, scientists at
    Microsoft's Bay Area Research Center, presented the argument
    last month in a meeting of the National Research Council's
    Computer Science and Telecommunications Board at Stanford
    University.

    [...]

    By rewriting existing scientific programs, they say, researchers
    will be able to get powerful computing from inexpensive clusters
    of personal computers that are running the free Linux software
    operating system. Many scientists are now adapting their work to
    these parallel computing systems, known as Beowulfs, which make
    it possible to cobble together tremendous computing power at low
    cost.

    "The supercomputer vendors are adamant that I am wrong," Dr.
    Bell said. "But the Beowulf is a Volkswagen and these people are
    selling trucks."

<quote>
In Computing, Weighing Sheer Power Against Vast Pools of Data

By JOHN MARKOFF

SAN FRANCISCO, June 1 - For almost two decades the federal government 
has heavily underwritten elaborate centers to house the world's fastest 
supercomputers. The policy has been based on the assumption that only 
government money could ensure that the nation's research scientists had 
the computing power they needed to pursue projects like simulating the 
flow of air around a jet airplane wing, mimicking the way proteins are 
folded inside cells or modeling the global climate.

But now two leading American computer researchers are challenging that 
policy. They argue that federal money would be better spent directly on 
the scientific research teams that are the largest users of 
supercomputers, by shifting the financing to vast data-storage systems 
instead of building ultrafast computers.

Innovation in data-storage technology is now significantly outpacing 
progress in computer processing power, they say, heralding a new era 
where vast pools of digital data are becoming the most crucial element 
in scientific research.

The researchers, Gordon Bell and Jim Gray, scientists at Microsoft's Bay 
Area Research Center, presented the argument last month in a meeting of 
the National Research Council's Computer Science and Telecommunications 
Board at Stanford University.

"Gordon and I have been arguing that today's supercomputer centers will 
become superdata centers in the future," said Dr. Gray, an expert in 
large databases who has been working with some of the nation's 
leading astronomers to build a powerful computer-based telescope.

The policy challenge spelled out by the Microsoft researchers comes as a 
quiet national policy debate over the future of supercomputing is 
taking place among experts in scientific, industrial and military 
computing.

In February the National Science Foundation Advisory Panel on 
Cyberinfrastructure issued a report calling on the nation to spend more 
than $1 billion annually to modernize its high-performance computing 
capabilities.

Separately, a study completed last year by a group of military agencies 
was released in April. Titled "Report on High Performance Computing for 
National Security," it calls for spending $180 million to $390 million 
annually for five years to modernize supercomputing for a variety of 
military applications.

Computer scientists added that the construction of the Japanese Earth 
Simulator, which is now ranked as the world's fastest supercomputer, 
has touched off alarm in some parts of the United States government, 
with some officials advocating even more resources for the nation's 
three national supercomputer centers, located in Pittsburgh, at the 
University of Illinois at Urbana-Champaign and at the University of 
California at San Diego.

Whatever decisions the government makes could have vast implications for 
computing.

The decision in 1985 to build a group of what were then five 
supercomputer centers linked together by a 56-kilobit-per-second 
computer network was a big impetus for development of the modern 
high-speed Internet, said Larry Smarr, an astrophysicist who is 
director of the California Institute for Telecommunications and 
Information Technology.

He said that Dr. Bell and Dr. Gray were correct about the data-centric 
technology trend and that the role of the nation's supercomputer centers 
would increasingly shift toward becoming vast archives. Rapidly increasing 
network speeds would make it ever more practical to distribute computing 
tasks.

Central to the Bell-Gray argument is the vast amount of data now being 
created by a new class of scientific instruments that integrate sensors 
and high-speed computers.

While the first generation of supercomputing involved simulating 
physical processes with relatively small data sets, the tremendous 
increase in data storage technology has led to a renaissance in 
experimental science, Dr. Gray said.

The nation should forget about financing the world's fastest computers, 
he said, and instead turn its attention back toward science.

"The core of our argument is to give money back to the sciences and let 
them do the planning," he said.

Dr. Gray and Dr. Bell, a legendary computer designer who oversaw the 
national supercomputer centers for two years during the 1980's as a 
director for the National Science Foundation, call their current 
approach to computing "information centric" and "community centric." By 
rewriting existing scientific programs, they say, researchers will be 
able to get powerful computing from inexpensive clusters of personal 
computers that are running the free Linux software operating system. 
Many scientists are now adapting their work to these parallel computing 
systems, known as Beowulfs, which make it possible to cobble together 
tremendous computing power at low cost.

"The supercomputer vendors are adamant that I am wrong," Dr. Bell said. 
"But the Beowulf is a Volkswagen and these people are selling trucks."

The Bell-Gray proposal has been greeted with skepticism from the 
supercomputer centers and in some cases from scientists, too.

"I believe the dramatic increase in data the scientific community is 
producing will lead to the increasing importance of the scientific 
computing centers," said Horst D. Simon, a mathematician who is the 
director of the National Energy Research Scientific Computing Center in 
Berkeley, Calif.

He said that scientific research projects were turning increasingly to 
his computing center to take advantage of its professional management 
and technical support for managing their experiments' data.

Some other computer scientists say that Dr. Bell and Dr. Gray have 
correctly identified a fundamental technology trend, but that they are 
wrong in stating that the United States no longer needs to focus on 
developing the most powerful computers.

"Beowulf clusters are an attractive alternative," said Daniel A. Reed, 
director of the National Center for Supercomputing Applications at the 
University of Illinois at Urbana-Champaign. "However, we still need a 
national-scale capability at the very high end."

A number of other scientists said they believed that Dr. Bell and Dr. 
Gray were overstating the power of the inexpensive Beowulf computing 
clusters.

"I'm not sure I agree with them on which is the cheap commodity computer 
and which is the specialized system," said Erich Bloch, a physicist at 
the Washington Advisory Group, a Washington-based science consulting 
group, who is a former director of the National Science 
Foundation.

He said that supercomputer centers were still vital because they 
integrated systems that could be made available to scientific 
communities that might use the world's fastest computer if it were 
available.
</quote>
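
For anyone wondering what "rewriting existing scientific programs" for a 
Beowulf actually involves, here is a minimal sketch of my own (not from 
the article): the classic numerical estimate of pi, with the work split 
across the nodes of a cluster using MPI, the message-passing library 
typically used on Beowulf systems. Assuming a working MPI install, you 
would build it with something like mpicc pi.c -o pi and launch it with 
mpirun -np 4 ./pi.

#include <mpi.h>
#include <stdio.h>

/* Each node integrates 4/(1+x^2) over its own share of [0,1];
 * the partial sums are combined on node 0 to approximate pi. */
int main(int argc, char **argv)
{
    long i, n = 10000000;          /* number of integration intervals */
    double h, x, local_sum = 0.0, pi = 0.0;
    int rank, size;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);   /* this node's id */
    MPI_Comm_size(MPI_COMM_WORLD, &size);   /* total nodes in the run */

    h = 1.0 / (double) n;
    for (i = rank; i < n; i += size) {      /* interleave work across nodes */
        x = h * ((double) i + 0.5);
        local_sum += 4.0 / (1.0 + x * x);
    }
    local_sum *= h;

    /* Sum the per-node results on node 0. */
    MPI_Reduce(&local_sum, &pi, 1, MPI_DOUBLE, MPI_SUM, 0, MPI_COMM_WORLD);

    if (rank == 0)
        printf("pi is approximately %.12f\n", pi);

    MPI_Finalize();
    return 0;
}

Parallelising anything non-trivial (climate models, protein folding) is 
where the real rewriting effort goes, but the pattern is the same: split 
the work, compute locally, combine the results.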

Cheers; Leon


