Saturday, February 16, 2008

Bandwidth on Demand

An academic Internet provides clues about ways to improve the commercial Internet.

Big sender: Internet2’s dynamic circuit network will help provide channels for large quantities of information to flow to and from academic research projects, such as CERN’s Large Hadron Collider, above. In the future, the technology may find commercial applications, such as fast transfer of high-definition online video.
Credit: CERN


Internet2, a nonprofit advanced networking consortium in the United States, is designing a new network intended to open up large amounts of dedicated bandwidth as needed. For example, a researcher wanting to test telesurgery technologies--for which a smooth, reliable Internet connection is essential--might use the network to temporarily create a dedicated path for the experiment. Although the immediate applications of the dynamic circuit network are academic, its underlying technologies could one day filter into the commercial Internet and be used, for example, to carry high-definition video to consumers.

"The idea here is to basically look at the network in a different way," says Rick Summerhill, CTO of Internet2. The Internet Protocol (IP) currently used for the Web breaks data into packets that are sent through fiber-optic cables to their ultimate destination. The packets don't have to take a common path through the network; routers act like way stations along the network, examining every packet individually and deciding where it should be sent next. The problem with this system is that large data transfers can clog the routers with packets waiting for direction, and if the packets don't make it to their final destination at the same time, the receiver may experience jitter--interruptions to the data stream that can produce skips in online video, for example.

Summerhill says that, using the dynamic circuit network, a researcher could set up a temporary connection to another location that would work like a phone call: the user's data would be carried directly to that other location, uninterrupted by the traffic of others sharing the network. The result is that large quantities of information could be transferred quickly and clearly.

The dynamic circuit network is really an enhancement of a traditional network, rather than a replacement. Internet2 still has a backbone that uses the standard IP common across the Web. What makes the dynamic circuit network different is that it is circuit switched: connections can be set up so that all the packets follow the same path, and those circuits don't have to be in place permanently. Lachlan Andrew, a researcher at Netlab at the California Institute of Technology, explains that a circuit-switched network determines a pathway for the entire stream of packets, so that at every way station, they can be sent on without having to be individually examined. "Internet2 is developing technology to communicate between nodes, find a path, and construct it," he says.
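The article doesn't describe Internet2's actual signaling protocols, but the node-to-node path setup Andrew describes can be sketched in a few lines. The topology below is entirely hypothetical; the point is that the path is computed once, and each way station then needs only a simple forwarding entry rather than a per-packet routing decision.

```python
import heapq

# Hypothetical topology: node -> {neighbor: link cost}. Invented for illustration.
TOPOLOGY = {
    "CERN":     {"NYC": 5, "DC": 7},
    "NYC":      {"CERN": 5, "Chicago": 3},
    "DC":       {"CERN": 7, "Chicago": 4},
    "Chicago":  {"NYC": 3, "DC": 4, "Nebraska": 2},
    "Nebraska": {"Chicago": 2},
}

def find_path(src, dst):
    """Dijkstra's algorithm: pick one explicit path for the whole circuit."""
    queue = [(0, src, [src])]
    visited = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == dst:
            return path
        if node in visited:
            continue
        visited.add(node)
        for neighbor, weight in TOPOLOGY[node].items():
            if neighbor not in visited:
                heapq.heappush(queue, (cost + weight, neighbor, path + [neighbor]))
    return None

def build_circuit(path):
    """Install one forwarding entry per node; afterward, each way station
    passes the stream along without examining individual packets."""
    return {here: nxt for here, nxt in zip(path, path[1:])}

path = find_path("CERN", "Nebraska")
print("circuit path:", " -> ".join(path))
print("forwarding entries:", build_circuit(path))
```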

The idea of the dynamic circuit network, Summerhill says, is that these circuits can be set up on demand, so that traffic needing excellent quality of service can step out of the regular flow. Because data is sent down fiber-optic cables at different frequencies of light, he explains, data from the dynamic circuit network can coexist with IP data and wouldn't require new cable to be laid. Summerhill says that Internet2 is working on software that could eventually be built into network devices to control these different flows and to set up circuits when and where they are needed.
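Scheduling those on-demand circuits amounts to an admission-control problem: grant a circuit only if capacity is free on every link for the requested window. Here is a minimal sketch under stated assumptions--the four-wavelengths-per-link figure and the reservation API are invented, not Internet2's design; the idea is only that circuit traffic claims a wavelength while IP traffic continues on others.

```python
WAVELENGTHS_PER_LINK = 4  # hypothetical capacity set aside for circuits

class Link:
    def __init__(self):
        self.reservations = []  # (start_hour, end_hour) per active circuit

    def free_at(self, start, end):
        overlapping = sum(1 for s, e in self.reservations if s < end and start < e)
        return overlapping < WAVELENGTHS_PER_LINK

    def reserve(self, start, end):
        self.reservations.append((start, end))

def request_circuit(links, start, end):
    """Admit the circuit only if every link on the path has a free
    wavelength for the whole window; otherwise the traffic stays on IP."""
    if all(link.free_at(start, end) for link in links):
        for link in links:
            link.reserve(start, end)
        return True
    return False

path = [Link(), Link(), Link()]
print(request_circuit(path, start=9, end=17))  # True: circuit granted
```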


Among the current applications for the dynamic circuit network, Internet2 expects to facilitate the transfer of data from CERN's Large Hadron Collider to researchers at other institutions, and it has done trials in which circuits are opened between the collider and the University of Nebraska. In the future, Summerhill says, the researchers hope that commercial applications will develop from the technology. "Think of a network that provided hundreds or thousands of high-definition channels and also provided on-demand video capabilities," he says. He foresees a commercial network that needs both high bandwidth and high quality of service, like some current academic requirements. "The methods for supporting that network are under investigation," Summerhill says. Although there are no commercial implementations right now, he notes that Internet2 works with commercial partners that might eventually be a conduit for bringing the technology into the ordinary Internet.

Clive Davenhall worked on software for academic circuit-switched networks in the United Kingdom as part of his role as an engineer at the National e-Science Centre, in Edinburgh, which works to improve methods for conducting large-scale science research over the Internet. Davenhall says that, although people have been talking about dynamic circuit networks for a long time, this type of network hasn't had much of an impact on the commercial Internet, partly because of concerns about how it might function in an environment less controlled than academia. For example, if the average person could set up a dedicated circuit on demand, a single user might hog resources and degrade other users' experience.

Summerhill says that the dynamic circuit network is still in its early stages, and "still has some evolution to do." He recalls a time when IP wasn't considered ready for commercial applications. So far, four universities in four different regional networks are connected to the dynamic circuit network, says Lauren Rotman, public relations manager for Internet2. Rotman adds that it will be easy to add universities in regions that are already connected. The organization hopes to increase the dynamic circuit network's reach significantly in the coming year.


http://www.technologyreview.com/Infotech/20277/

Mobile Carriers See Gold in Femtocells

If consumers buy in to private wireless phone networks, the industry could save money.

Can you hear me now? Airvana's HubBub femtocell (above) could provide better cellular reception inside homes and offices.
Credit: Airvana


On its face, it sounds like a company's technological fantasy: a product sold to customers that will also save the business itself money.

That's roughly the attraction of femtocells, a young wireless phone technology that promises to give homes and businesses their own private wireless phone networks.

Similar in concept to the Wi-Fi routers that many people use to blanket their homes with wireless Internet access, these little boxes instead provide a network for carrying the voice and high-speed data services of mobile phones. They're designed to give bandwidth-hungry cell-phone subscribers the strongest possible connections at home. But by keeping those customers off the main mobile network and using home broadband connections to transfer data, they could wind up saving the phone companies money, too.

It's no wonder, then, that equipment vendors say that mobile phone companies are rushing into this market--with technical and even commercial trials beginning on both sides of the Atlantic--even before standards have been set or final technological hurdles cleared.

"Usually in the networking business, you build equipment, and then drum up demand," says Paul Callahan, vice president of business development for Airvana, a femtocell equipment vendor. "This time, demand is already really strong."

The femtocell buzz is part of a broader, years-long push by mobile phone companies to persuade their customers to use cell phones instead of landlines for all their communications needs, and increasingly to use their cells for third-generation (3G) applications such as Web surfing, downloading music, and watching videos.

One hurdle, phone companies say, is that mobile phone coverage inside homes and businesses often isn't as good as it is outside. Some homes are in coverage shadows or have thick apartment walls that impede transmissions. In addition, the Wideband Code Division Multiple Access (W-CDMA) technology used for 3G services by T-Mobile and AT&T in the United States transmits at a higher frequency than does its predecessor, so it has a harder time penetrating walls.
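The frequency penalty can be made concrete with the standard free-space path-loss formula. The sketch below assumes a legacy band near 850 MHz and a W-CDMA band near 1900 MHz, and the 50-meter distance is arbitrary; wall losses, which also tend to grow with frequency, come on top of this.

```python
import math

def free_space_path_loss_db(distance_m, freq_hz):
    """Friis free-space path loss: 20*log10(4*pi*d*f/c) in dB."""
    c = 3e8  # speed of light, m/s
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

d = 50  # meters from the base station (arbitrary)
print(f" 850 MHz: {free_space_path_loss_db(d, 850e6):.1f} dB")
print(f"1900 MHz: {free_space_path_loss_db(d, 1900e6):.1f} dB")
# The higher band loses 20*log10(1900/850) ~= 7 dB more in free space
# alone, before wall attenuation is even counted.
```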

A femtocell would relieve this problem--in theory. Instead of relying on the mobile phone's nearest cellular tower (known in the industry as a base station), which might also be serving scores of other callers at the same time, a customer would have her own private, high-quality cell-phone connection.

"Our goal is to get to a place where our services are available to all users at all times," says John Carvalho, head of core network innovation for Telefónica O2 Europe, which announced femtocell trials this week.

Boosters paint femtocells as a technology that benefits everyone. Customers get a fast, reliable broadband phone connection at home, and the mobile phone companies get to offload a small piece of their infrastructure investments to their customers.

In effect, every customer who buys and installs his own home femtocell would reduce the load on the carrier's local macro network. The femtocell itself serves as an alternative base station, broadcasting and receiving ordinary wireless signals from cell phones that the femtocell owner permits. This is a strikingly attractive idea, particularly to carriers in big cities that find their networks often overloaded, and find that local regulations or public opinion makes it difficult and costly to set up new antennas.

By using a femtocell, customers will send their voice and data traffic over their own DSL, cable, or fiber connection to the Internet, and then to the carrier's network. This will also reduce the load on the land-based data networks that carry voice and data traffic from the mobile phone companies' base stations to their own central switching facilities. That, in turn, could translate into less infrastructure investment.
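The savings argument can be illustrated with back-of-envelope arithmetic. Every figure below is an invented assumption--the article cites no adoption or traffic numbers--but it shows how adoption and the share of traffic generated at home combine into a macro-network offload fraction.

```python
# Hypothetical inputs, purely for illustration.
subscribers_per_macro_cell = 1000
femtocell_adoption = 0.20      # fraction of subscribers who install one
home_traffic_share = 0.60      # fraction of a user's traffic generated at home

offloaded = femtocell_adoption * home_traffic_share
print(f"macro-network traffic offloaded: {offloaded:.0%}")
print(f"subscribers served at home by femtocells: "
      f"{int(subscribers_per_macro_cell * femtocell_adoption)} "
      f"of {subscribers_per_macro_cell}")
```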

Yet all of this will happen only if customers see enough benefit to buying themselves a femtocell--and for now, that's the biggest flaw in this rosy scenario, analysts say.

"What's in it for the user?" asks Keith Nissen, an analyst with the In-Stat research firm. "That's the big question. Right now, there isn't enough."


Broadband subscribers already have fast Internet surfing at home, by definition. Carriers may well offer cheaper cell-phone calls for femto customers using their home connection--but broadband subscribers can already do this using Skype, Vonage, or other voice over Internet protocol (VoIP) services. Strong cell signals at home are certainly a plus, but it's not clear how much consumers will pay for this, analysts say.

Without an obvious consumer must-have attraction, demand will likely be tied closely to price, Nissen says. If a femtocell is cheap enough, consumers will latch on to the idea, assuming (and this can be a big assumption) that carriers are able to explain and market it clearly. But this price may be a sticking point for some time.

Today, the equipment cost for femtocells runs in the range of $250 to $300. Sprint, one of the first companies to start commercial trials of the products, is offering them to consumers in Denver and Indianapolis for $50 apiece, along with an offer of lower-priced calling plans--altogether a substantial subsidy.

O2's Carvalho says that he expects equipment costs to come down to between 50 and 80 British pounds (about $100 to $160) once standards are set and mass-manufacturing begins. That's an acceptable price range for consumers used to buying products such as Wi-Fi modems, he says.

The standards process may take several years, however. Different equipment vendors use different techniques for aspects such as security, or for letting the femtocells talk to the carrier's core network. Femtocells have been developed for both rival 3G mobile phone standards--W-CDMA and CDMA2000--but different standards-setting bodies are separately at work on rules for each.

In the long term, analysts expect femtocells to be a fast-growing, successful market. In-Stat forecasts that 40.6 million femtocells will be distributed around the world by 2011. ABI Research is even more optimistic, projecting 70 million in use by 2012.

By that time or shortly afterward, analysts say, femtocell technology may be built into other devices, such as Internet routers for consumers.

Vodafone, T-Mobile, and O2 all announced trials early this year. Equipment vendors say that many other carriers are in undisclosed trials as well. Commercial deployment--in which the phone companies or their retail partners distribute the products to consumers beyond the limited scale of Sprint's two-city experiment--is expected by early next year.

That's all assuming that consumers react positively when they actually get a chance to see how the technology works.

"If it winds up being more expensive, but it provides better data rates, it's probably worth the investment for us," says O2's Carvalho. "If it's more expensive but slower, and it annoys customers, we probably wouldn't take that on."


http://www.technologyreview.com/Biztech/20293/

Improving Toxicity Tests

A new initiative will work on cell-based toxicity tests for chemicals.

Credit: Technology Review


As chemical companies develop more pesticides, cleaners, and other potentially toxic compounds, traditional methods of safety testing can hardly keep up. Animal tests, the gold standard for decades, are slow and expensive, and they are increasingly socially unacceptable. What's more, the results of animal testing sometimes don't translate to humans, so researchers are eager for better alternatives.

This week, at the annual meeting of the American Association for the Advancement of Science in Boston, the U.S. Environmental Protection Agency and the National Institutes of Health (NIH) announced a multiyear research partnership to develop a cell-based approach that they hope can replace animal testing in toxicity screening. Work has already begun, although it will take years to refine the techniques.

Using systems that are already employed in the search for new drugs, researchers hope to develop quick, accurate methods of toxicity testing for chemicals that are carried out on cells, rather than on whole animals.

That way, instead of spending weeks dosing and dissecting roomfuls of rabbits or rats, researchers could test thousands of chemicals in a matter of hours using automated systems and human cells grown in a lab. Different kinds of cells could be used as proxies for particular tissues, providing a way to test the effects of a chemical on the liver, for example, and, ultimately, to predict toxic effects.

The approach "really has the potential to revolutionize the way toxic chemicals are identified," says Francis Collins, director of the National Human Genome Research Institute. Automated cell-based tests could screen many thousands of chemicals in a single day, compared with the decades spent so far gathering detailed information on a few thousand toxic chemicals.

"We need to be able to test thousands of compounds in thousands of conditions much faster than we did before," says Elias Zerhouni, director of the NIH. The new approach repurposes a technique that's a mainstay in pharmaceutical labs, where high-throughput screening is used to help identify new drugs. Automated systems can test hundreds of thousands of candidate compounds in a single day and identify those that have any effect on cells, and hence may have therapeutic value. The aim of the toxicity-testing research is "to try to turn that around to find compounds that might be toxic," Collins says. Their effects could be assessed according to the number of cells they kill, or by using markers that indicate whether certain functions in a cell are affected.

Because high-throughput screening can handle many thousands of tests at a time, a given chemical can be tested at different concentrations and for different exposure times during a single screening process, producing comprehensive and reliable data that's "not a statistical approximation," says Christopher Austin, director of the NIH Chemical Genomics Center. "It's pharmacology."


"In order to get the answers you want, you need to do all the concentrations, all the times, and that's why you need to have a high-throughput system," Austin says.

Researchers at the NIH have already used high-throughput screening to test several thousand chemicals over a range of 15 concentrations varying by several orders of magnitude, and for exposure times ranging from minutes to days. The chemicals they picked have well-known toxic effects, gleaned from animal studies. By comparing data from high-throughput tests with that from animals, researchers should be able to fine-tune cell-based tests so that they're at least as reliable and as informative as animal experiments.
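The scale of that screen is easy to picture as a test matrix. The sketch below uses the article's figures--15 log-spaced concentrations and exposure times from minutes to about a week--while the chemical names and the viability readout are hypothetical placeholders for what an automated plate reader would supply.

```python
import itertools
import numpy as np

chemicals = [f"chem_{i:04d}" for i in range(3000)]   # placeholder IDs
concentrations_molar = np.logspace(-9, -4, 15)       # 15 doses, 1 nM to 100 uM
exposure_hours = [0.25, 1, 4, 24, 72, 168]           # minutes to days

def viability(chemical, conc, hours):
    """Stub readout: fraction of cells surviving; in a real screen this
    comes from the automated assay, not a formula."""
    return 1.0

conditions = list(itertools.product(concentrations_molar, exposure_hours))
print(f"{len(chemicals)} chemicals x {len(conditions)} conditions "
      f"= {len(chemicals) * len(conditions):,} wells")
```

Each chemical's full dose-and-time response can then be compared against its known animal-study profile to calibrate the cell-based readouts.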

"Animals are not always giving us the right answer," says John Bucher, associate director of the National Toxicology Program, "so we need to use all the information we can get from different systems."

In a sense, Austin says, this new approach turns the animal-testing procedure "upside down." Rather than giving a rat a chemical and then dissecting the animal and examining its tissues to see the effect of the compound, metaphorically, "we are dissecting the rat first into its component cells, then computationally putting the rat back together."

However, it will take years for researchers to prove--if they can--that cell-based toxicity screening can supersede animal tests, so "you cannot abandon animal testing overnight," Zerhouni says. "It will have to be intertwined for a few years."


http://www.technologyreview.com/Biotech/20294/