
Supercomputing for the Masses

By ASHLEE VANCE
PORTLAND, Ore. — For decades, the world’s supercomputers have been the tightly guarded property of universities and governments. But what would happen if regular folks could get their hands on one?

The price of supercomputers is dropping quickly, in part because they are often built with the same off-the-shelf parts found in PCs, as a supercomputing conference here last week made clear. Just about any organization with a few million dollars can now buy or assemble a top-flight machine.

Meanwhile, research groups and companies like I.B.M., Hewlett-Packard, Microsoft and Intel are finding ways to make vast stores of information available online through so-called cloud computing.

These advances are pulling down the high walls around computing-intensive research. A result could be a democratization that gives ordinary people with a novel idea a chance to explore their curiosity with heavy computing firepower — and maybe find something unexpected.

The trend has spurred some of the world’s top computing experts and scientists to work toward freeing valuable stores of information. The goal is to fill big computers with scientific data and then let anyone in the world with a PC, including amateur scientists, tap into these systems.

“It’s a good call to arms,” said Mark Barrenechea, the chief executive of Silicon Graphics, which sells computing systems to labs and businesses. “The technology is there. The need is there. This could exponentially increase the amount of science done across the globe.”

The notion of top research centers sharing information is hardly new. Some of the earliest incarnations of what we now know as the World Wide Web came to life so that physicists and other scientists could tap into large data stores from afar.

In addition, universities and government labs were early advocates of what became popularized as grid computing, where shared networks were created to shuttle data about.

The current thinking, however, is that the labs can accomplish far more than was previously practical by piggybacking on some of the trends sweeping the technology industry. And, this time around, research bodies big and small, along with brainy individuals, can participate in the sharing agenda.

For inspiration, scientists are looking at cloud computing services like Google’s online office software, photo-sharing sites and Amazon.com’s data center rental program. They are trying to bring that type of Web-based technology into their labs and make it handle enormous volumes of data.

“You’ve seen these desktop applications move into the cloud,” said Pete Beckman, the director of the Argonne Leadership Computing Facility in Illinois. “Now science is on that same track. This helps democratize science and good ideas.”

With $32 million from the Energy Department, Argonne has set to work on Magellan, a project to explore the creation of a cloud-computing infrastructure that scientists around the globe can use. Mr. Beckman argued that such a system would reduce the need for smaller universities and labs to spend money on their own computing infrastructure.

Another benefit is that researchers would not need to spend days downloading huge data sets so that they could perform analysis on their own computers. Instead, they could send requests to Magellan and just receive the answers.
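In other words, the analysis runs where the data lives, and only the small answer travels back over the network. A minimal sketch of that request-and-answer pattern, assuming a hypothetical HTTP endpoint (the article does not describe Magellan's actual interface, so the URL, dataset name, and payload here are invented for illustration):

```python
# Sketch of the "send a request, get back an answer" pattern.
# The endpoint, dataset name, and payload are hypothetical; Magellan's
# real interface is not described in the article.
import json
import urllib.request

query = {
    "dataset": "climate/global-surface-temps",  # data never leaves the cluster
    "operation": "monthly_mean",                # computation to run remotely
    "years": [1990, 2009],
}

req = urllib.request.Request(
    "https://magellan.example.gov/api/analyze",  # placeholder URL
    data=json.dumps(query).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

# Only the compact result crosses the network -- not terabytes of raw data.
with urllib.request.urlopen(req) as resp:
    print(json.load(resp))
```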

Even curious individuals on the fringe of academia may have a chance to delve into things like climate change and protein analysis.

“Some mathematician in Russia can say, ‘I have an idea,’ ” Mr. Beckman said. “The barrier to entry is so low for him to try out that idea. So, this really broadens the number of discoverers and, hopefully, discoveries.”

The computing industry has made such a discussion possible. Historically, the world’s top supercomputers relied on expensive, proprietary components. Government laboratories paid vast sums of money to use these systems for classified projects.

But, over the last 10 years, the vital innards of supercomputers have become more mainstream, and a wide variety of organizations have bought them.

At the conference, undergraduate students competed in a contest to build affordable mini-supercomputers on the fly. And a supercomputer called Jaguar at the Oak Ridge National Laboratory in Tennessee officially became the world’s fastest machine. It links thousands of mainstream chips from Advanced Micro Devices.

Seven of the world’s top 10 supercomputers use standard chips from A.M.D. and Intel, as do about 90 percent of the 500 fastest machines. “I think this says that supercomputing technology is affordable,” said Margaret Lewis, an A.M.D. director. “We are kind of getting away from this ivory tower.”

While Magellan and similar projects are encouraging signs, researchers have warned that much work lies ahead to free what they consider valuable information for broader analysis.

At the Georgia Institute of Technology, for example, researchers have developed software that can evaluate scans of the brain and heart, and identify anomalies that might indicate problems. To advance such techniques, the researchers need to train their software by testing it on thousands of body scans.

But it is hard to find a repository of such scans that a hospital or a government organization like the National Institutes of Health is willing to share, even if personal information can be stripped away, said George Biros, a professor at the Georgia Institute of Technology. “Medical schools don’t make this information available,” he said.

Bill Howe, a senior scientist at the eScience Institute at the University of Washington, has urged research organizations to reveal their information. “All the data that we collect in science should be accessible, and that’s just not the way it works today,” he said.

Mr. Howe said high school students and so-called citizen scientists could make new discoveries if given the chance.

“Let’s see what happens when classrooms of students explore this information,” he said.

With Cheaper Supercomputers, an Entry for Citizen Scientists - NYTimes.com
 
I think, though, that there is still a lot of uncertainty about the cloud. Mostly this is because once you hand your data over to a cloud service, it is not guaranteed to be yours alone. It may also be difficult to move your data should you choose to switch to another cloud service in the future.
 
This is really exciting, and one of the ways technology changes our lives.
We at QuantNetwork have already moved our static files and images over to Amazon's cloud servers, spread across many of our virtual domains for faster page-loading times. For example, if our members in Asia connect to QuantNet, Amazon's servers will detect this and serve them the images, CSS, and JavaScript from one of their servers in Hong Kong.
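For anyone curious what that looks like in practice, here is a minimal sketch of pushing a folder of static assets to S3. The bucket name and local paths are made up for illustration, and this shows the general pattern rather than our actual deployment script:

```python
# Minimal sketch: upload a folder of static assets (images, CSS, JS) to S3
# so Amazon's infrastructure can serve them instead of our own box.
# boto3, the bucket name, and the paths are illustrative assumptions.
import mimetypes
from pathlib import Path

import boto3

s3 = boto3.client("s3")
BUCKET = "static.example-forum.com"  # hypothetical bucket name

for path in Path("static").rglob("*"):
    if not path.is_file():
        continue
    content_type = mimetypes.guess_type(path.name)[0] or "application/octet-stream"
    s3.upload_file(
        str(path),
        BUCKET,
        str(path.relative_to("static")),     # key mirrors the folder layout
        ExtraArgs={
            "ContentType": content_type,
            "CacheControl": "max-age=86400",  # let browsers and edge caches hold it
        },
    )
```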

I know a few friends who have already moved their entire music collections to Amazon S3. Somewhere out there, people may have already moved their trading engines to one of these supercomputers for rent.
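Renting that kind of compute is just an API call these days. As a hedged sketch, assuming boto3 and placeholder values for the machine image, instance type, and key pair, spinning up a compute server on EC2 looks something like this:

```python
# Sketch: rent a compute server on Amazon EC2.
# The AMI id, instance type, and key pair name are placeholders --
# substitute your own; this is an illustration, not a recommendation.
import boto3

ec2 = boto3.resource("ec2")

instances = ec2.create_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder machine image
    InstanceType="c5.xlarge",         # a compute-optimized instance type
    KeyName="my-keypair",             # placeholder SSH key pair
    MinCount=1,
    MaxCount=1,
)
instances[0].wait_until_running()  # block until the box is up
instances[0].reload()              # refresh attributes like the public DNS name
print(instances[0].public_dns_name)  # then SSH in and deploy the engine
```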

As for data security, I'm sure it was worked out before they could attract institutional clients. Unlike free services like Gmail and Facebook, this is a paid service, so it gets a different level of scrutiny.
 