Peter Sayer, IDG News Service (June 24, 2003), writes:

Intel and Hewlett-Packard are joining a consortium that aims to build a platform on which to develop “disruptive” Internet technologies, they announced Tuesday. These technologies are needed because the Internet is becoming “ossified,” according to HP and Intel researchers.

As the Internet grows, it is becoming harder to change and easier to break, according to the researchers from the PlanetLab consortium. New, distributed technologies could be used to identify denial-of-service (DoS) attacks before they cause damage, or to enable the Internet to heal itself more quickly after a cable break, they say.

One of the biggest problems they face is how to develop and test such services, given that building a prototype requires a testbed on the same scale as the Internet itself, with a volume of traffic to match.

One group of researchers has been quietly building just such a testbed for a little over a year now, and is about to transform its project, PlanetLab, into an academic-industrial consortium with support from Intel and HP.

Of course, the researchers haven’t really built a test network the size of the Internet: They’ve piggybacked their project on the real thing. PlanetLab, as the testbed is called, is a network of computers, linked by the Internet, on which researchers can test distributed, networked applications and services such as search engines, or new routing and naming protocols.

The goal of the project is to provide a space to test the disruptive technologies they hope will enable the Internet to develop further.

“The research community is full of ideas about new services they want to deploy. … The problem is, there is a very high barrier to entry,” said Larry Peterson, one of the project’s founders and a professor of computer science at Princeton University in New Jersey, who also works for Intel Research. Peterson and other project members spoke in a conference call with journalists.

That barrier is the cost of deploying a planet-wide network. No one university research group could hope to achieve it, but by pooling resources and agreeing to share processor time and disk space, researchers from 62 institutions have built a network of 160 servers at 65 locations across the Internet. And it’s not finished yet: the goal is to link around 1,000 computers.

“If I have the viewpoint of 1,000 machines, I have the ability to see the beginnings of a distributed denial of service [DDoS] attack, or to route traffic differently than the route I would have chosen looking in from the edge of the network,” Peterson said.
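To make Peterson’s point concrete, here is a minimal sketch, in Python, of the kind of aggregation a network of vantage points enables: each node reports per-target traffic counts, and a coordinator flags targets that many independent nodes see at once, a pattern invisible from any single point at the network’s edge. The node names, report format, and thresholds are illustrative assumptions, not anything PlanetLab actually provides.

```python
# Sketch: pooling observations from many vantage points to spot an
# emerging DDoS attack. All names and thresholds are hypothetical.

from collections import Counter, defaultdict

def flag_ddos_candidates(node_reports, min_nodes=50, min_total_pkts=100_000):
    """node_reports: mapping of node_id -> Counter of {target_ip: packet_count}.

    A target is suspicious when many independent vantage points each see
    traffic toward it and the aggregate volume is high.
    """
    totals = Counter()
    witnesses = defaultdict(set)
    for node_id, counts in node_reports.items():
        for target, pkts in counts.items():
            totals[target] += pkts
            witnesses[target].add(node_id)
    return [
        target
        for target, pkts in totals.items()
        if pkts >= min_total_pkts and len(witnesses[target]) >= min_nodes
    ]

if __name__ == "__main__":
    # Simulate 1,000 vantage points; one target draws modest traffic from
    # nearly every node -- unremarkable at any single edge, obvious in aggregate.
    reports = {
        f"node-{i}": Counter({"198.51.100.7": 150, "203.0.113.9": 2})
        for i in range(1000)
    }
    print(flag_ddos_candidates(reports))  # -> ['198.51.100.7']
```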

Applications running on PlanetLab share resources on some or all of the computers in the network, executing in many places at once. While this might sound like a grid computing system, it’s not the same, according to another of the project’s founders, David Culler, a professor in computer science at the University of California at Berkeley and also academic director of Intel Research at Berkeley.

“We have a number of grid researchers using PlanetLab as a way to try their ideas out, but there’s a big difference. Grid is a way to get access to cycles that happen to be in other places, but if they happened to be in the same machine room, that would be better. With PlanetLab, the Internet between the nodes is what it’s all about,” Culler said in the same conference call.
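Culler’s distinction can be illustrated with a toy overlay-routing example: because the nodes sit at different points in the real Internet, latency measurements between them can reveal a one-hop detour that beats the direct path, something co-located grid machines could never show. The latency figures below are fabricated for illustration and are not PlanetLab measurements.

```python
# Sketch: choosing between the direct Internet path and a one-hop
# detour through an overlay node. All numbers are hypothetical.

DIRECT_MS = {("berkeley", "princeton"): 95}
# Pairwise latencies measured between overlay nodes (illustrative).
OVERLAY_MS = {
    ("berkeley", "seattle"): 25, ("seattle", "princeton"): 55,
    ("berkeley", "houston"): 40, ("houston", "princeton"): 60,
}

def best_one_hop(src, dst):
    """Return (path, latency_ms), comparing the direct path with every
    single-relay detour the overlay has measurements for."""
    best = ((src, dst), DIRECT_MS[(src, dst)])
    relays = {b for (a, b) in OVERLAY_MS if a == src}
    for relay in relays:
        second_leg = OVERLAY_MS.get((relay, dst))
        if second_leg is None:
            continue
        total = OVERLAY_MS[(src, relay)] + second_leg
        if total < best[1]:
            best = ((src, relay, dst), total)
    return best

print(best_one_hop("berkeley", "princeton"))
# -> (('berkeley', 'seattle', 'princeton'), 80): the detour beats the
#    95 ms direct path -- visible only with measurements between nodes.
```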

Fields of research that could be opened up by PlanetLab are plentiful, according to Culler.

“People are looking at distributed search engines. Distributed storage is particularly interesting as you start to integrate that with various kinds of sensors streaming information from the physical world,” he said.

Enhancing massively multiplayer games is another potential application but, said Culler, “We believe the most important ones are the ones we haven’t seen yet because people haven’t had the tools available.”

Those tools are starting to come online. Peterson expects the network will almost double in size to 300 nodes by the end of this year, and hopes for 1,000 nodes within two years.

Intel Research has already donated around 100 servers to the project, Culler said. HP is donating 30, according to Rick McGeer of HP Labs. McGeer is HP’s liaison officer with CITRIS, the Center for Information Technology Research in the Interest of Society at Berkeley.

The servers PlanetLab uses are based on Intel’s Pentium III processor and run a customized version of Red Hat Linux, Peterson said. “It’s convenient to get things running quickly, and a lot of the research community uses Linux,” he said.

There are still holes in PlanetLab’s net, though. “We don’t have nodes in Japan. We are very interested in seeing sites come up there,” Culler said.