Saturday, February 16, 2008

Bandwidth on Demand

An academic internet provides clues about ways to improve the commercial Internet.

Big sender: Internet2’s dynamic circuit network will help provide channels for large quantities of information to flow to and from academic research projects, such as CERN’s Large Hadron Collider, above. In the future, the technology may find commercial applications, such as the fast transfer of high-definition online video.
Credit: CERN


Internet2, a nonprofit advanced networking consortium in the United States, is designing a new network intended to open up large amounts of dedicated bandwidth as needed. For example, a researcher wanting to test telesurgery technologies--for which a smooth, reliable Internet connection is essential--might use the network to temporarily create a dedicated path for the experiment. Called the dynamic circuit network, the system's immediate applications are academic, but its underlying technologies could one day filter into the commercial Internet, where they could be used, for example, to carry high-definition video to consumers.

"The idea here is to basically look at the network in a different way," says Rick Summerhill, CTO of Internet2. The Internet Protocol (IP) currently used for the Web breaks data into packets that are sent through fiber-optic cables to their ultimate destination. The packets don't have to take a common path through the network; routers act like way stations along the network, examining every packet individually and deciding where it should be sent next. The problem with this system is that large data transfers can clog the routers with packets waiting for direction, and if the packets don't make it to their final destination at the same time, the receiver may experience jitter--interruptions to the data stream that can produce skips in online video, for example.

Summerhill says that, using the dynamic circuit network, a researcher could set up a temporary connection to another location that would work like a phone call: the user's data would be carried directly to that other location, uninterrupted by the traffic of others sharing the network. The result is that large quantities of information could be transferred quickly and clearly.

The dynamic circuit network is really an enhancement of a traditional network, rather than a replacement. Internet2 still has a backbone that uses the standard IP common across the Web. What makes the dynamic circuit network different is that it uses a circuit-switched network, which can be set up so that all the packets follow the same path. Also, those circuits don't have to be in place permanently. Lachlan Andrew, a researcher at Netlab, at the California Institute of Technology, explains that a circuit-switched network determines a pathway for the entire stream of packets, so that at every way station, they can be sent on without having to be individually examined. "Internet2 is developing technology to communicate between nodes, find a path, and construct it," he says.
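
To make the bookkeeping concrete, here is a minimal sketch of what a dynamic circuit controller has to do: find a path whose links all have enough spare capacity, reserve that capacity for the duration of the transfer, and release it afterward. The topology, node names, and reservation functions below are illustrative assumptions, not Internet2's actual control software.

```python
from collections import deque

# Remaining spare capacity, in Gbit/s, on each link between way stations (assumed topology).
links = {
    ("chicago", "kansas_city"): 10.0,
    ("kansas_city", "houston"): 10.0,
    ("chicago", "atlanta"): 10.0,
    ("atlanta", "houston"): 2.0,
}

def neighbors(node):
    for (a, b), capacity in links.items():
        if a == node:
            yield b, (a, b), capacity
        elif b == node:
            yield a, (a, b), capacity

def find_path(src, dst, demand):
    """Breadth-first search for a path whose every link still has
    at least 'demand' Gbit/s of spare capacity."""
    frontier = deque([(src, [])])
    seen = {src}
    while frontier:
        node, path = frontier.popleft()
        if node == dst:
            return path
        for nxt, link, capacity in neighbors(node):
            if nxt not in seen and capacity >= demand:
                seen.add(nxt)
                frontier.append((nxt, path + [link]))
    return None

def set_up_circuit(src, dst, demand):
    """Reserve 'demand' Gbit/s on every link of one path, like placing a
    phone call: the user's packets will all follow this single path."""
    path = find_path(src, dst, demand)
    if path is not None:
        for link in path:
            links[link] -= demand      # hold capacity for this circuit
    return path

def tear_down_circuit(path, demand):
    for link in path:
        links[link] += demand          # release capacity when the transfer ends

circuit = set_up_circuit("chicago", "houston", demand=4.0)
print("circuit path:", circuit)        # only links with 4+ Gbit/s to spare qualify
tear_down_circuit(circuit, demand=4.0)
```

In a real network the reservation would be signaled between domains and pinned to a wavelength, but the essential step--holding a path end to end so that packets need no per-hop routing decisions--is the same.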

The idea of the dynamic circuit network, Summerhill says, is that these circuits can be set up on demand, so that traffic needing excellent quality of service can step out of the regular flow. Because data is sent down fiber-optic cables at different frequencies of light, he explains, data from the dynamic circuit network can coexist with IP data and wouldn't require new cable to be laid. Summerhill says that Internet2 is working on software that could eventually be built into network devices to control these different flows and to set up circuits when and where they are needed.


Among the current applications for the dynamic circuit network, Internet2 expects to facilitate the transfer of data from CERN's Large Hadron Collider to researchers at other institutions, and it has run trials in which circuits are opened between the collider and the University of Nebraska. In the future, Summerhill says, the researchers hope that commercial applications develop from the technology. "Think of a network that provided hundreds or thousands of high-definition channels and also provided on-demand video capabilities," he says. He foresees a commercial network that needs both high bandwidth and high quality of service, much as some academic networks do today. "The methods for supporting that network are under investigation," Summerhill says. Although there are no commercial implementations right now, he notes that Internet2 works with commercial partners that could eventually bring the technology into the ordinary Internet.

Clive Davenhall worked on software for academic circuit-switched networks in the United Kingdom as part of his role as an engineer at the National e-Science Centre, in Edinburgh, which works to improve methods for conducting large-scale science research over the Internet. Davenhall says that, although people have been talking about dynamic circuit networks for a long time, this type of network hasn't had much of an impact on the commercial Internet, partly because of concerns about how it might function in an environment less controlled than academia. For example, if the average person could set up a dedicated circuit on demand, it might be possible to hog resources and degrade other users' experience.

Summerhill says that the dynamic circuit network is still in its early stages, and "still has some evolution to do." He recalls a time when IP itself wasn't considered ready for commercial applications. So far, four universities in four different regional networks are connected to the dynamic circuit network, says Lauren Rotman, public relations manager for Internet2. Rotman adds that it will be easy to add universities in regions that are already connected. The organization hopes to increase the dynamic circuit network's reach significantly in the coming year.


http://www.technologyreview.com/Infotech/20277/

Mobile Carriers See Gold in Femtocells

If consumers buy in to private wireless phone networks, the industry could save money.

Can you hear me now? Airvana's HubBub femtocell (above) could provide better cellular reception inside homes and offices.
Credit: Airvana


On its face, it sounds like a company's technological fantasy: a product sold to customers that will also save the business itself money.

That's roughly the attraction of a young wireless phone technology called femtocells, which promise to give homes and businesses their own private wireless phone networks.

Similar in concept to the Wi-Fi routers that many people use to blanket their homes with wireless Internet access, these little boxes instead provide a network for carrying the voice and high-speed data services of mobile phones. They're designed to give bandwidth-hungry cell-phone subscribers the strongest possible connections at home. But by keeping those customers off the main mobile network and using home broadband connections to transfer data, they could wind up saving the phone companies money, too.

It's no wonder, then, that equipment vendors say that mobile phone companies are rushing into this market--with technology and even commercial trials beginning on both sides of the Atlantic--even before standards have been set or final technological hurdles cleared.

"Usually in the networking business, you build equipment, and then drum up demand," says Paul Callahan, vice president of business development for Airvana, a femtocell equipment vendor. "This time, demand is already really strong."

The femtocell buzz is part of a broader, years-long push by mobile phone companies to persuade their customers to use cell phones instead of landlines for all their communications needs, and increasingly to use their cells for third-generation (3G) applications such as Web surfing, downloading music, and watching videos.

One hurdle, phone companies say, is that mobile phone coverage inside homes and businesses often isn't as good as it is outside. Some homes are in coverage shadows or have thick apartment walls that impede transmissions. In addition, the Wideband Code Division Multiple Access (W-CDMA) technology used for 3G services by T-Mobile and AT&T in the United States transmits at a higher frequency than does its predecessor, so it has a harder time penetrating walls.

A femtocell would relieve this problem--in theory. Instead of relying on the mobile phone's nearest cellular tower (known in the industry as a base station), which might also be serving scores of other callers at the same time, a customer would have her own private, high-quality cell-phone connection.

"Our goal is to get to a place where our services are available to all users at all times," says John Carvalho, head of core network innovation for Telefónica O2 Europe, which announced femtocell trials this week.

Boosters paint the femtocell as a technology that benefits everyone. Customers get a fast, reliable broadband phone connection at home, and the mobile phone companies get to offload a small piece of their infrastructure investments to their customers.

In effect, every customer who buys and installs his own home femtocell would reduce the load on the carrier's local macro network. The femtocell itself serves as an alternative base station, broadcasting and receiving ordinary wireless signals from cell phones that the femtocell owner permits. This is a strikingly attractive idea, particularly to carriers in big cities whose networks are often overloaded and where local regulations or public opinion make it difficult and costly to set up new antennas.

By using a femtocell, customers will send their voice and data traffic out their own DSL, cable, or fiber connection to the Internet, and then to the carrier's network. This will also reduce the load on the land-based data networks that carry voice and data traffic from the mobile phone companies' base stations to their own central switching facilities. That, in turn, could translate into less infrastructure investment.
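
The offload logic is simple enough to sketch. The toy routine below, with assumed phone numbers and path labels rather than any vendor's real API, shows the decision a handset and network effectively make: permitted phones in range of the home femtocell send traffic over the owner's broadband line; everyone else stays on the macro network.

```python
# Phones the femtocell owner permits (illustrative numbers, not a real access list).
ALLOWED_HANDSETS = {"555-0101", "555-0102"}

def route_traffic(handset_id, in_femtocell_range):
    """Return which path carries the handset's voice and data traffic."""
    if in_femtocell_range and handset_id in ALLOWED_HANDSETS:
        # Femtocell acts as a private base station; traffic exits over the
        # customer's DSL/cable/fiber connection to the carrier's core network.
        return "femtocell -> home broadband -> carrier core network"
    # Otherwise the call uses the shared macro base station and the
    # carrier's own backhaul.
    return "macro base station -> carrier backhaul -> carrier core network"

print(route_traffic("555-0101", in_femtocell_range=True))   # offloaded at home
print(route_traffic("555-0199", in_femtocell_range=True))   # not permitted: macro
```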

Yet all of this will happen only if customers see enough benefit to buying themselves a femtocell--and for now, that's the biggest flaw in this rosy scenario, analysts say.

"What's in it for the user?" asks Keith Nissen, an analyst with the In-Stat research firm. "That's the big question. Right now, there isn't enough."


Broadband subscribers already have fast Internet surfing at home, by definition. Carriers may well offer cheaper cell-phone calls for femto customers using their home connection--but broadband subscribers can already do this using Skype, Vonage, or other voice over Internet protocol (VoIP) services. Strong cell signals at home are certainly a plus, but it's not clear how much consumers will pay for this, analysts say.

Without an obvious consumer must-have attraction, demand will likely be tied closely to price, Nissen says. If a femtocell is cheap enough, consumers will latch on to the idea, assuming (and this can be a big assumption) that carriers are able to explain and market it clearly. But this price may be a sticking point for some time.

Today, the equipment cost for femtocells runs in the range of $250 to $300. Sprint, one of the first companies to start commercial trials of the products, is offering them to consumers in Denver and Indianapolis for $50 apiece, along with an offer of lower-priced calling plans--altogether a substantial subsidy.

O2's Carvalho says that he expects equipment costs to come down to between 50 and 80 British pounds (about $100 to $160) once standards are set and mass-manufacturing begins. That's an acceptable price range for consumers used to buying products such as Wi-Fi modems, he says.

The standards process may take several years, however. Different equipment vendors use different techniques for aspects such as security, or for letting the femtocells talk to the carrier's core network. Femtocells have been developed for both rival 3G mobile phone standards--W-CDMA and CDMA2000--but different standards-setting bodies are separately at work on rules for each.

In the long term, analysts expect femtocells to be a fast-growing, successful market. In-Stat forecasts that 40.6 million femtocells will be distributed around the world by 2011. ABI Research is even more optimistic, projecting 70 million in use by 2012.

By that time or shortly afterward, analysts say, femtocell technology may be built into other devices, such as Internet routers for consumers.

Vodafone, T-Mobile, and O2 all announced trials early this year. Equipment vendors say that many other carriers are in undisclosed trials as well. Commercial deployment--distribution of the products to consumers by the phone companies or their retail partners, beyond the limited scale of Sprint's two-city experiment--is expected by early next year.

That's all assuming that consumers react positively when they actually get a chance to see how the technology works.

"If it winds up being more expensive, but it provides better data rates, it's probably worth the investment for us," says O2's Carvalho. "If it's more expensive but slower, and it annoys customers, we probably wouldn't take that on."


http://www.technologyreview.com/Biztech/20293/

Improving Toxicity Tests

A new initiative will work on cell-based toxicity tests for chemicals.

Credit: Technology Review


As chemical companies develop more pesticides, cleaners, and other potentially toxic compounds, traditional methods of safety testing can hardly keep up. Animal tests, which have been the gold standard for decades, are slow and expensive, and these sorts of tests are increasingly socially unacceptable, too. What's more, the results of animal testing sometimes don't translate to humans, so researchers are eager for better alternatives.

This week, at the annual meeting of the American Association for the Advancement of Science in Boston, the U.S. Environmental Protection Agency and the National Institutes of Health (NIH) announced a multiyear research partnership to develop a cell-based approach that they hope can replace animal testing in toxicity screening. Work has already begun, although it will take years to refine the techniques.

Using systems that are already employed in the search for new drugs, researchers hope to develop quick, accurate methods of toxicity testing for chemicals that are carried out on cells, rather than on whole animals.

That way, researchers could test thousands of chemicals in a matter of hours using automated systems and human cells grown in a lab, instead of spending weeks dosing and dissecting roomfuls of rabbits or rats. Different kinds of cells could be used as proxies for particular tissues, providing a way for researchers to test the effects of a chemical on the liver, for example, and, ultimately, to predict toxic effects.

The approach "really has the potential to revolutionize the way toxic chemicals are identified," says Francis Collins, director of the National Human Genome Research Institute. Automated cell-based tests could screen many thousands of chemicals in a single day, compared with the decades spent so far gathering detailed information on a few thousand toxic chemicals.

"We need to be able to test thousands of compounds in thousands of conditions much faster than we did before," says Elias Zerhouni, director of the NIH. The new approach repurposes a technique that's a mainstay in pharmaceutical labs, where high-throughput screening is used to help identify new drugs. Automated systems can test hundreds of thousands of candidate compounds in a single day and identify those that have any effect on cells, and hence may have therapeutic value. The aim of the toxicity-testing research is "to try to turn that around to find compounds that might be toxic," Collins says. Their effects could be assessed according to the number of cells they kill, or by using markers that indicate whether certain functions in a cell are affected.

Because high-throughput screening can handle many thousands of tests at a time, a given chemical can be tested at different concentrations and for different exposure times during a single screening process, producing comprehensive and reliable data that's "not a statistical approximation," says Christopher Austin, director of the NIH Chemical Genomics Center. "It's pharmacology."


"In order to get the answers you want, you need to do all the concentrations, all the times, and that's why you need to have a high-throughput system," Austin says.

Researchers at the NIH have already used high-throughput screening to test several thousand chemicals over a range of 15 concentrations varying by several orders of magnitude, and for exposure times ranging from minutes to days. The chemicals they picked have well-known toxic effects, gleaned from animal studies. By comparing data from high-throughput tests with that from animals, researchers should be able to fine-tune cell-based tests so that they're at least as reliable and as informative as animal experiments.
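
As a rough illustration of the scale involved, the sketch below lays out a screen of the kind described: every chemical crossed with 15 log-spaced concentrations and a range of exposure times, with a viability readout per well. The assay function is a placeholder assumption standing in for the automated plate reader, not the NIH's actual protocol.

```python
import itertools
import math

chemicals = ["chemical_A", "chemical_B"]
# 15 concentrations spanning several orders of magnitude (molar), 1 nM up to ~10 mM.
concentrations = [10 ** (-9 + 0.5 * i) for i in range(15)]
exposure_hours = [0.25, 1, 24, 96]        # from minutes to days

def measure_viability(chemical, conc, hours):
    """Placeholder for the plate reader: returns the fraction of cells
    still alive. A real screen would measure this optically."""
    return 1.0 / (1.0 + (conc / 1e-5) * math.log1p(hours))

results = {}
for chem, conc, hrs in itertools.product(chemicals, concentrations, exposure_hours):
    results[(chem, conc, hrs)] = measure_viability(chem, conc, hrs)

print(len(results), "wells screened")     # 2 chemicals x 15 doses x 4 times = 120
```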

"Animals are not always giving us the right answer," says John Bucher, associate director of the National Toxicology Program, "so we need to use all the information we can get from different systems."

In a sense, Austin says, this new approach turns the animal-testing procedure "upside down." Rather than giving a rat a chemical and then dissecting the animal and examining its tissues to see the effect of the compound, metaphorically, "we are dissecting the rat first into its component cells, then computationally putting the rat back together."

However, it will take years for researchers to prove--if they can--that cell-based toxicity screening can supersede animal tests, so "you cannot abandon animal testing overnight," Zerhouni says. "It will have to be intertwined for a few years."


http://www.technologyreview.com/Biotech/20294/

Friday, February 15, 2008

Power from Fabrics

Nanowires that convert motion into current could lead to textiles that can generate power.

Power suit: Gold-plated zinc oxide nanowires (yellow), each about 3.5 micrometers tall, are grown on a flexible polymer fiber. The gold-plated nanowires brush against untreated nanowires (green), which flex and generate current. Yarn spun from the fibers could lead to fabrics that convert body movements into electric current.
Credit: Z. L. Wang and X. D. Wang, Georgia Tech


Georgia Tech researchers have taken an important step toward creating fabrics that could generate power from the wearer's walking, breathing, and heartbeats. The researchers, led by materials-science professor Zhong Lin Wang, have made a flexible fiber coated with zinc oxide nanowires that can convert mechanical energy into electricity. The fibers, the researchers say, should be able to harvest any kind of vibration or motion for electric current.

The zinc oxide nanowires grow vertically from the surface of the polymer fiber. When one fiber brushes against another, the nanowires flex and generate electric current. The researchers described a proof-of-concept yarn in a paper published this week in the journal Nature. They show that the output current increases by entwining multiple fibers to make the yarn.

By the researchers' calculations, a square meter of fabric made from the fibers could put out as much as 80 milliwatts--enough to power portable electronics. The development could lead to shirts and shoes that power iPods and medical implants, curtains that generate electricity when they flap in the wind, and tents that power portable electronic devices.
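
For a sense of scale, here is a back-of-envelope calculation based on the researchers' 80-milliwatt-per-square-meter figure; the garment area is an illustrative assumption.

```python
power_density_mw_per_m2 = 80    # from the researchers' calculation
shirt_area_m2 = 0.5             # rough fabric area of a shirt (assumed)

available_mw = power_density_mw_per_m2 * shirt_area_m2
print(f"~{available_mw:.0f} mW available")   # ~40 mW, in the range of small portable electronics
```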

In 2007, Wang and his colleague Xudong Wang (no relation), a 2007 TR35 winner, built a zinc oxide nanowire array that generated direct current when exposed to ultrasonic vibrations. The piezoelectric nanowires stood on an electrically conducting substrate that acted as an electrode. The other electrode was a platinum-coated silicon plate with parallel peaks and trenches carved on its surface. (See "Nanogenerator Fueled by Vibrations.") When the ultrasonic waves pushed the electrodes together, the nanowires bent and produced current.

In the new work, the researchers have replaced the rigid, zigzag electrode with a flexible one: they convert some of the bendable fibers into electrodes by applying a thin layer of gold to them.

The researchers entangle a gold-coated fiber with an uncoated fiber. When the fibers are pulled back and forth with respect to each other, the individual gold-plated nanowires push and bend the uncoated nanowires, generating current.

The flexibility of the fibers brings the idea of wearable, foldable energy sources closer to fruition, says Charles Lieber, a chemistry professor at Harvard University. The flexibility is also crucial for harvesting energy from extremely small ambient motion, says Thomas Thundat, who studies nanoscale biological sensors at Oak Ridge National Laboratory. Entwining the flexible fibers, he explains, leads to very close contact between the gold-coated and the uncoated nanowires. As a result, small motions, such as a light wind or walking movements, make the coated and uncoated nanowires brush against each other and generate current.


"The idea is ingenious," says Min-Feng Yu, a mechanical-science and engineering professor at the University of Illinois at Urbana-Champaign. "It's like you have millions of nanogenerators outputting electricity simultaneously, each at maximum performance.".

The generator's ability to capture small movements makes it especially useful for powering biological sensors, Thundat says. Microscale sensors can be implanted in the body to measure such things as cancer biomarkers and glucose. But chemical batteries are bulky compared with the tiny sensors, and they have a limited lifetime. "Implanted sensors based on [the fiber nanogenerator] concept could use blood pressure or muscle movement for operation," Thundat says.

The Georgia Tech advance would not be possible without the simple but highly innovative process the researchers have used to make the fibers, Lieber points out. Zhong Lin Wang and his colleagues first cover a polymer fiber with a 100-nanometer-thick zinc oxide layer. They immerse the fiber in a reactant solution at 80 °C, which results in nanowires growing vertically from the surface. Then the researchers use a final trick to keep the nanowires firmly attached to the fibers while keeping the fibers flexible. They dip the fibers in tetraethoxysilane, a liquid used in weatherproofing and protective coatings. The tetraethoxysilane forms two coatings: one between the fiber and the zinc oxide layer, and another on top of the zinc oxide layer.

This tetraethoxysilane coating makes the fiber robust. The zinc oxide layer did not crack or peel off even when the fiber was twisted. The nanowires also stayed put after the researchers continuously brushed two fibers against each other for 30 minutes. The fibers will have to last even longer and have higher output power in order to be used practically, Wang says.

Power-generating shirts might still be out of reach for most. At this point, the fabric might be affordable for the military for use in tents and shoes, says Wang, but "it is probably too expensive for you and me to buy."


http://www.technologyreview.com/Nanotech/20278/

Plucking Cells out of the Bloodstream

A new implantable device can extract stem cells for therapeutic transplant or program cancer cells to die.

Cell catcher: University of Rochester bioengineer Michael King holds up a section of plastic microtubing lined with proteins that trap cancer and stem cells.
Credit: Richard Baker, University of Rochester


Bioengineers have developed an implantable device that captures very pure samples of stem cells circulating in the blood. The device, a length of plastic tubing coated with proteins, could lead to better bone-marrow transplants and stem-cell therapies, and it also shows promise as a way to capture and reprogram cancer cells roaming the bloodstream. The company CellTraffix is commercializing the technology.

When patients get bone-marrow transplants, what they're really receiving are infusions of a type of adult stem cell. Bone-marrow-derived stem cells play a crucial role in renewing the blood throughout adulthood, creating new cells to carry oxygen and fight infections. These adult stem cells can be sampled using the new device.

The new device mimics a small blood vessel: it's a plastic tube a few hundred micrometers in diameter that's coated with proteins called selectins. The purpose of selectins in the body seems to be to slow down a few types of cells so that they can receive other chemical signals. A white blood cell, for instance, might be instructed to leave the circulation and enter a wound, where it would protect against infection. "Selectins cause [some] cells to stick and slow down," says Michael King, a chemical engineer at the University of Rochester who's developing the cell-capture devices. Different types of selectins associate with different kinds of cells, including platelets, bone-marrow-derived stem cells, and immune cells such as white cells.

In an upcoming publication in the British Journal of Haematology, King reports that selectin-coated microtubes implanted in rats can capture very pure samples of active stem cells from circulating blood. He gave a similar demonstration of stem-cell purification with samples taken from human bone marrow last year. Cancer patients often require bone-marrow transplants following harsh chemotherapy and radiation treatments that kill adult stem cells in the blood.

The purity of these transplants can be a matter of life or death. When the transplant is derived from the patient's own bone marrow--extracted before treatment--it's critical that it not contain any cancer cells. When it comes from another person, there's a chance that the donor's immune cells will attack the recipient if they're not filtered out. But current purification methods are slow and inefficient, King says. Those that rely on antibody recognition or cell size and shape typically extract only a small fraction of the stem cells in a blood sample; the rest go to waste.

Twenty-eight percent of the cells captured by King's implants were stem cells. "This is astounding given how rare they are in the bloodstream," says King. Implants would probably not be able to capture enough stem cells for transplant. But King believes that filtering a donor's blood through a long stretch of selectin-coated tubing outside the body, in a process similar to dialysis, would be very efficient. "This technique will clearly be useful outside the body" as a means of purifying bone-marrow-derived stem cells, says Daniel Hammer, chair of bioengineering at the University of Pennsylvania.

Hammer believes that King's devices will also have broader applications as implants that serve to mobilize a person's own stem cells to regenerate damaged tissues. By slowing down cells with selectins and then exposing them to other kinds of signals, says Hammer, King's devices "could capture stem cells, concentrate them, and differentiate them, without ever having to take the cells out of the body." There might be a way to use selectins to extract neural stem cells, too.

"This is a very broad-reaching discovery," says Hammer. Indeed, King says that he has already had some success using selectin coatings to reprogram cancer cells.

Cancer cells appear to hijack selectin pathways in order to spread to other parts of the body, a process known as metastasis. Tumors shed cells into the bloodstream. Some of those cells seem to exit with the help of selectins; ensconced in new tissue, they then establish new tumors. These secondary tumors cause more cancer deaths than initial tumors do.

King says he has unpublished work demonstrating that leukemia cells that roll along a coating of selectins and a cancer-specific signaling molecule will go through a process called programmed cell death. Healthy stem cells also roll across the device because they're attracted to the selectins, but the death signal doesn't affect them. Leukemia is a blood cancer, but King expects that the anticancer coating would work for solid tumors as well. Devices lined with these coatings might be implanted into cancer patients to prevent or slow metastasis.

King hopes to test antimetastasis implants in animals this year. He's collaborating with Jeffrey Karp, a bioengineer at the Harvard-MIT Division of Health Sciences and Technology, and Robert Langer, an MIT Institute Professor, to develop selectin coatings that are stable over months rather than days.

CellTraffix CEO Tom Fitzgerald says that the company's first product, a kit that will enable researchers to capture large numbers of stem and cancer cells in the lab, will likely reach the market early next year. The company hopes to begin clinical testing of the anticancer coatings by early 2010.


http://www.technologyreview.com/Biotech/20204/

Wiring Up DNA

Measuring the conductivity of DNA could provide a way to detect mutations.

Hot-wired: By placing a double-stranded DNA segment in a gap in a single-walled carbon nanotube, researchers have measured the electrical properties of the biological molecule. Since even a single mismatch in the DNA letters affects the conductivity of the segment, the system could eventually be the basis of chemical sensors to detect mutations in DNA.
Credit: Colin Nuckolls


By wiring up DNA between two carbon nanotubes, researchers have measured the molecule's ability to conduct electricity. Introducing just a single letter change can drastically alter the DNA's resistance, the researchers found, a phenomenon that they plan to exploit with a device that can rapidly screen DNA for disease-linked mutations.

Measuring the electrical properties of DNA has proved tricky because the molecule and its attachments to electrodes tend to be very fragile. But in the new study, Colin Nuckolls, a professor of chemistry at Columbia University, in New York, teamed up with Jacqueline Barton, a professor of chemistry at Caltech, in Pasadena, CA, who's an expert in DNA charge transport. Nuckolls's group had previously developed a method for securely hooking up biological molecules to single-walled carbon nanotubes, which act as the electrodes in a minuscule circuit.

The researchers used an etching process to slice a gap in a carbon nanotube; they created a carboxylic acid group on the nanotube at each end of the gap. They then reacted these groups with DNA strands whose ends had been tagged with amine groups, creating tough chemical amide links that bond together the nanotubes and DNA. The amide linkages are robust enough to withstand enormous electrical fields.

The team estimated that DNA strands around 15 base pairs (about 6 nanometers) long had a resistance roughly equivalent to that of a similar-sized piece of graphite--a finding the researchers might have expected, since the chemical base pairs that constitute DNA form a stack of aromatic rings similar to those in graphite.

"In my opinion, the results of this work will survive, in contrast to many other publications on this topic," says chemist Bernd Giese, of the University of Basel, Switzerland. Previous estimates of DNA's conductivity have varied dramatically, Giese says, partly because it was unclear if the delicate DNA or its connection to electrodes had become damaged by the high voltages used. "One thinks one has burned the DNA to charcoal," Giese says. "It's extremely complicated experimentally."

Barton and Nuckolls performed two tricks with their wired-up DNA. For their first, they introduced a restriction enzyme that bound and cut the DNA at a specific sequence. When the DNA was severed, the current running through it vanished. "It's a way of biochemically blowing a fuse," Nuckolls says. It also demonstrates that the DNA keeps its native structure in the circuit; if it had not, the enzyme would not have recognized and cut the molecule.


For their second trick, the researchers introduced a single base-pair mismatch into the DNA so that, for example, a C was paired up with an A (rather than its normal partner, G). This tweak boosted the molecule's resistance some 300-fold, probably because the mismatch distorts the double-helical structure. They could do this easily by connecting only one of DNA's two strands into the circuit. The second strand--which can either be a perfect match to the first or contain a mismatch--can lift on or off.

Showing the electrical effect of such sequence mismatch and enzyme cutting is the real strength of the experiments, says Danny Porath, of Hebrew University, in Jerusalem, Israel, who has also measured current through DNA. "They play with the parameters and show that conductivity of DNA clearly depends on them, and that's beautiful," he says.

Nuckolls is now working to exploit this discovery to detect single nucleotide polymorphisms (SNPs), the one-letter variations in DNA that are linked to, for example, susceptibility to Alzheimer's, diabetes, and many other major diseases. Nuckolls hopes that his method can be used to identify SNPs more rapidly and with greater sensitivity than existing methods. In such a device, a reference strand of DNA is wired into the circuit, and other strands are allowed to pair up with it. If the second strand carries a different base at the position of the SNP, that would be enough to trigger a change in the current through the nanoscale circuit, just as the base-pair mismatch did. Nuckolls says that he is already working with electrical engineers to create a sensor that can slot into existing semiconductor chips, making it cheap and readily available. "It's one of our big focuses, and we're pretty close," he says.
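
In such a sensor, the readout could be as simple as comparing the measured resistance against a matched-duplex baseline. The sketch below illustrates that thresholding logic using the reported ~300-fold jump; the absolute resistance scale and the detection threshold are assumptions for illustration, not values from the paper.

```python
MATCHED_RESISTANCE_OHMS = 1.0e6   # baseline for a perfectly paired duplex (assumed scale)
MISMATCH_FACTOR = 300             # reported resistance increase for one base-pair mismatch

def carries_snp(measured_resistance_ohms, threshold_factor=10):
    """Flag a SNP when resistance is far above the matched baseline.
    A 10x threshold comfortably separates ~1x (match) from ~300x (mismatch)."""
    return measured_resistance_ohms > threshold_factor * MATCHED_RESISTANCE_OHMS

print(carries_snp(1.1e6))                          # False: perfect match
print(carries_snp(MISMATCH_FACTOR * 1.0e6))        # True: single-base mismatch
```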

The team is likely to have competition. Late last year, a group led by Wonbong Choi, of Florida International University, in Miami, reported that it had strung 80 base pairs of DNA between two carbon nanotubes and sent current through the DNA. Choi says that he is working to create a sensor that can rapidly reveal the presence of specific genetic sequences--such as the avian influenza virus--by looking at changes in current through the tiny circuit.

Barton, meanwhile, is intent on finding out whether the conductivity of DNA serves any biological purpose in the cell. She has evidence that proteins bound to DNA may detect DNA damage by changes in its electrical properties, perhaps triggering repair of the damage. "We think it's something nature takes advantage of," she says. "It's a radical idea, but I think as we get more and more evidence, the case will be built."


http://www.technologyreview.com/Nanotech/20205/

A Better Way to Capture Carbon

New materials provide a potentially cheaper way to reduce carbon dioxide emissions from power plants.

Carbon-capturing crystals: This is an optical micrograph of a new material that can pull carbon dioxide from a stream of gases, making it possible to sequester the greenhouse gas.
Credit: Omar Yaghi


Researchers have developed porous materials that can soak up 80 times their volume of carbon dioxide, offering the tantalizing possibility that the greenhouse gas could be cheaply scrubbed from power-plant smokestacks. After the carbon dioxide has been absorbed by the new materials, it could be released through pressure changes, compressed, and, finally, pumped underground for long-term storage.

Such carbon dioxide capture and sequestration could be essential to reducing greenhouse-gas emissions, especially in countries such as the United States that depend heavily on coal for electricity. The first stage, capturing the carbon, is particularly important, since it can account for 75 percent of the total costs, according to the Department of Energy.

The new materials, described this week in Science, were created by researchers at UCLA led by Omar Yaghi, a chemist known for producing materials with intricate microscopic structures. They absorb large amounts of carbon dioxide but do not absorb other gases.

Techniques already exist for capturing carbon dioxide from smokestacks, but they use large amounts of energy--15 to 20 percent of the total electricity output of a power plant, according to one estimate, Yaghi says. That is because existing materials, known as amines, need to be heated to release the carbon dioxide they've absorbed. Indeed, capturing and compressing carbon dioxide through these existing methods can add 80 to 90 percent to the cost of producing electricity from coal, says Thomas Feeley, a project manager at the National Energy Technology Laboratory.

Feeley says that Yaghi's materials "compare favorably" with other experimental materials that absorb carbon dioxide that are being developed to help bring down these costs. Yaghi says that his materials could lower costs considerably since they use less energy, although exactly how much will require testing the materials at power plants.

Beyond being potentially useful in smokestacks, the materials could be employed in coal gasification plants. In these plants, coal is first processed to produce a mixture of carbon dioxide and hydrogen gas, and the hydrogen is then used to generate electricity. The carbon dioxide can be captured using a solvent, but that increases energy consumption; as in the smokestack-based process, the new UCLA materials could require less energy.


The materials belong to a class called zeolitic imidazolate frameworks (ZIFs). They're made of metal atoms bridged by one of a number of ring-shaped organic molecules called imidazolates. Prior to Yaghi's research, 24 types of ZIFs had been developed over the course of 12 years. Yaghi made 25 new versions in just three months. These materials can be extremely versatile, since the metal atoms can act as powerful catalysts, and the organic molecules can serve as anchors for a number of functional molecules.


ZIF proliferation: New automated techniques allow researchers to quickly synthesize dozens of new materials called zeolitic imidazolate frameworks (ZIFs). Credit: Omar Yaghi

The new materials absorb carbon dioxide in part because they're extremely porous, which gives them a high surface area that can come into contact with carbon dioxide molecules. The most porous of the materials that Yaghi reports in Science pack nearly 2,000 square meters of surface area into one gram of material. One liter of one of Yaghi's materials can store all of the molecules of carbon dioxide that, at 0 °C and ambient pressure, would occupy a volume of 82.6 liters.
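
That storage figure is easy to sanity-check with the ideal gas law: at 0 °C and one atmosphere, a mole of gas occupies about 22.4 liters, so 82.6 liters of carbon dioxide works out to roughly 3.7 moles, or about 160 grams, held per liter of material.

```python
gas_volume_l = 82.6     # CO2 volume held by one liter of material (from the paper)
molar_volume_l = 22.4   # liters per mole for an ideal gas at 0 degrees C, 1 atm
molar_mass_g = 44.0     # grams per mole of CO2

moles = gas_volume_l / molar_volume_l   # about 3.7 mol
mass_g = moles * molar_mass_g           # about 162 g
print(f"{moles:.2f} mol, {mass_g:.0f} g of CO2 per liter of material")
```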

While the exact mechanisms are not fully understood, Yaghi thinks that the slightly negative charge of organic molecules in his material attracts carbon dioxide molecules, which have a slightly positive charge. As a result, carbon dioxide is held in place, while other gases move through the material. This method of trapping carbon dioxide is better than some other methods because it does not involve strong covalent bonds, so it doesn't take much energy to release the gas.

The next step for the materials is commercialization. This means scaling up production and incorporating the materials into a system at a power plant, such as by packing the materials into canisters that can be filled with pressurized exhaust gases--something that the UCLA group says could be possible in two to three years. Yaghi estimates that the materials could easily be made in large quantities, since they are similar to other materials he has developed that can now be made by the ton by BASF, the giant chemical company. "Now it's in the hands of industry," Yaghi says. And he has developed automated techniques that could lead to more materials that could have even better properties.


http://www.technologyreview.com/Energy/20295/

Thursday, February 14, 2008

Information technology governance

Problems with IT governance

Nicholas Carr has emerged as a prominent critic of the idea that information technology confers strategic advantage.[5] This line of criticism might imply that significant attention to IT governance is not a worthwhile pursuit for senior corporate leadership. However, Carr also expresses a counterbalancing concern for effective IT risk management.

The manifestation of IT governance objectives through detailed process controls (e.g., in the context of project management) is a frequently controversial matter in large-scale IT management. See Agile methods. The difficulties in achieving a balance between financial transparency and cost-effective data capture in IT financial management (i.e., to enable chargeback) are a continual topic of discussion in the professional literature[6][7] and can be seen as a practical limitation of IT governance.

Relationship to other IT disciplines

IT governance is supported by disciplines such as:

Frameworks

There are quite a few supporting mechanisms developed to guide the implementation of information technology governance. Some of them are:

  • Control Objectives for Information and related Technology (COBIT) is an approach to standardizing good information technology security and control practices. It provides tools to assess and measure the performance of 34 IT processes of an organization. The IT Governance Institute (ITGI) is responsible for COBIT.
  • The ISO/IEC 27001 (ISO 27001) is a set of best practices for organizations to follow to implement and maintain a security program. It started out as British Standard 7799 (BS7799), which was published in the United Kingdom and became a well-known industry standard used to guide organizations in the practice of information security.
  • The Information Security Management Maturity Model (ISM3) is a process-based maturity model for information security management.
  • AS8015-2005 Australian Standard for Corporate Governance of Information and Communication Technology

Others include:

  • BS7799 - focus on IT security
  • CMM - The Capability Maturity Model - focus on software engineering

Non-IT specific frameworks of use include:

  • The Balanced Scorecard (BSC) - method to assess an organization’s performance in many different areas.
  • Six Sigma - focus on quality assurance

http://en.wikipedia.org/wiki/Information_technology_governance

Rethinking the Cell Phone

An Israeli startup has made a modular mobile phone that can work on its own or slip into other electronic devices. Will it catch on?

Mini mobile: The Modu, a cell phone slightly larger than a domino, is designed to slip into other electronic devices, such as picture frames, stereo systems, and bigger phones.
Credit: Modu Mobile


If you could reduce a mobile phone to its essence, it would look like the Modu. This tiny phone, which is slightly larger than a domino, is capable of sending and receiving calls and text messages. It can store contacts and MP3s with up to 16 gigabytes of storage capacity, and it has a small but usable screen and a sparse keypad that lacks numbers. Launched this week at the Mobile World Congress in Barcelona, the Modu can be used as a stand-alone phone. But more important, it can also be slipped into a variety of "jackets," such as in-car MP3 players, GPS navigation units, and larger cell phones, that expand the Modu's functions and change its look.

Modu Mobile, the Israeli startup that launched the phone, is hoping to change the way that consumers think about their handhelds, explains Itay Sherman, the company's chief technology officer. Today, people generally have one phone that they use all the time, and they use it for a year or two because it's too expensive to buy a new model more frequently. But Sherman says that the idea of one phone for all occasions doesn't mesh with people's lifestyle. Sometimes you want to walk around with the smallest possible phone, he says; other times you want a good messaging device with a large keyboard, or a media player with a large screen. "Instead of buying a completely new phone, the jacket enables you to switch."

In making the Modu, Sherman says, there were a number of technical considerations. While semiconductor technology is at the point where chips are small enough to easily fit into the mini mobile, his team also had to shrink the phone's other features, such as the screen, keypad, and battery. The display, for instance, needed to be specially designed: it uses organic light-emitting diodes and is a mere one millimeter thick. (See "Super-Vivid, Super-Efficient Displays.") Knowing that it would be impractical to put a full, numbered keypad on the Modu, Sherman says, his team designed a simpler keypad that lets people access menus on the screen, similar to those of MP3 players. The lithium-ion polymer battery, which uses the same basic technology as traditional phone batteries, was customized to be thin and long, while still providing about 3 hours of talk time and 100 hours of standby.

Once a user plugs the Modu into a jacket, however, the features improve. "The jacket may also have a battery," says Sherman, and the combined device shares the load between the two batteries. "It extends the talk time and standby time."

One of the main innovations, says Sherman, is that the software that runs the Modu automatically reconfigures when it is put in another device. A resource file defines the way the Modu and jacket will work together. "Every jacket you plug into, you'll get a completely different experience, yet it keeps the basic functionality in all cases so that it's familiar to the user," he says.
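
Sherman doesn't detail the resource-file format, but the idea can be sketched: the phone reads a descriptor from whatever jacket it is docked in and reshapes its interface around it, while calling and messaging stay constant. The field names and jacket profiles below are invented for illustration, not Modu's actual format.

```python
# Hypothetical jacket descriptors: what each jacket adds to the bare phone.
JACKET_RESOURCES = {
    "music_player": {"screen_inches": 2.8, "controls": "playback wheel",
                     "extra_battery_mah": 600, "apps": ["mp3_player"]},
    "messaging":    {"screen_inches": 2.4, "controls": "qwerty keyboard",
                     "extra_battery_mah": 400, "apps": ["email", "sms"]},
}

def dock(jacket_name):
    base = {"calls": True, "sms": True}    # basic functionality kept in all cases
    profile = JACKET_RESOURCES.get(jacket_name)
    if profile is None:
        return base                        # stand-alone: bare phone features only
    base.update(apps=profile["apps"], controls=profile["controls"])
    # The combined device shares load across both batteries.
    print(f"docked: +{profile['extra_battery_mah']} mAh shared with phone battery")
    return base

print(dock("music_player"))
print(dock(None))   # out of any jacket, the Modu still calls and texts
```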

Beyond cell-phone jackets, Modu Mobile will offer other consumer-electronics devices in which the phone module can be inserted, improving the basic functions of the device. For instance, a camera with the Modu could wirelessly send pictures to other phones, and a car entertainment system designed for the Modu could let a user access his MP3s while enabling hands-free calling.

This isn't the first time that consumer-electronics companies have tried to build modular phones, says Avi Greengart, the research director for mobile devices at Current Analysis, a market research firm. He points to IXI Mobile, the maker of the Ogo mobile messenger. "It had the notion of connecting multiple devices together via Bluetooth," he explains. A user would have a basic storage module and then connect to a large display or media player. However, the technology didn't catch on because few people think to buy a shell of a media player and then the other pieces to make it work, Greengart says.


Greengart is skeptical that the Modu will take off. "It makes sense on paper, but in the past, every effort to create modular types of devices has failed because [the companies] miss the way consumers actually buy products," he says. "It requires a change in consumer behavior ... Consumers don't buy [multiple] modules at once or have the foresight to know that they're going to want more ... down the road."

Modu Mobile hopes to buck the trend by getting people used to thinking in terms of jackets and the Modu. "We want to educate the market on the flexibilities and offerings," says Sherman. The company's first products will be available in October in Italy, Russia, and Israel. The initial package, which will include the Modu and two phone jackets, will cost 200 euros, an amount that the company expects will be subsidized by cell-phone carriers. In 2009, the company will extend to operators in the rest of Europe and in the United States, Sherman says.

Greengart admits that by inking deals with major carriers in the three initial countries, Modu Mobile has overcome one of the hurdles in making a marketable phone. "Oftentimes, the biggest challenge with a mobile device is just getting it in front of the consumers," he says. "They have carriers in Israel, Italy, and Russia. We'll see how much weight they put behind it."

The Modu is a different idea, and "the industry could use more 'different,'" Greengart says. But it will be hard for the company to gain traction in the mobile market and, especially, compete with Apple's popular iPhone. "I hate to say it because it sounds cliché," admits Greengart, "but no matter what jacket you slip this thing into, it's not going to be an iPhone."


http://www.technologyreview.com/Infotech/20276/?a=f

Discovering Novel Pathogens

Next-generation sequencing uncovers disease-causing microbes.

Mystery microbes: A next-generation sequencing technique allowed researchers to identify the virus that likely killed three transplant patients who received organs from the same donor. Because the technique is "unbiased," it could pick up the virus even though it was highly dissimilar at the nucleotide level to its nearest viral kin, lymphocytic choriomeningitis virus (LCMV). In the top panel, cells infected with the new virus have been stained using antibodies against LCMV and its relatives. In the bottom panel, individual virus particles (denoted with arrows) are revealed by electron microscopy.
Credit: New England Journal of Medicine


The next-generation sequencing technology that was harnessed to assemble the entire sequence of James Watson's genome has been put to a new and potentially life-saving use: identifying novel pathogens. After several other identification techniques failed, the new sequencing approach was used to discover a never-before-seen virus that was likely responsible for the deaths of three transplant patients who received organs from the same donor.

The technique, called unbiased high-throughput pyrosequencing, or 454 sequencing, was developed by 454 Life Sciences, owned by Roche. This is the first time it has been used to probe for the cause of an infectious-disease outbreak in humans, and experts say that it could ultimately usher in a new era in discovering and testing for agents of infectious disease.

"This is going to begin to allow us to understand the etiology of infections that had previously gone undiagnosed," says Richard Whitley, professor of medicine at the University of Alabama at Birmingham, who was not involved with the research.

Last spring, several weeks after receiving organs from a single donor, three Australian transplant patients became ill with fever and encephalitis; within six weeks of the operation, all three had died. When traditional methods failed to identify the cause of the patients' deaths, the Victorian Infectious Disease Reference Laboratory turned to W. Ian Lipkin, director of the Laboratory for Immunopathogenesis and Infectious Diseases at Columbia University's Mailman School of Public Health, for assistance.

To find the mystery pathogen responsible for the deaths, Lipkin's team extracted RNA from the tissues of two of the patients and prepared the sample by treating it with an enzyme that removed all traces of human DNA; this enriched the sample for viral sequences. The researchers then amplified the RNA into millions of copies of the corresponding DNA using reverse-transcription polymerase chain reaction (RT-PCR). Usually, PCR requires some advance knowledge of the sequence in question because it relies on molecular primers that match the string of code to be amplified. But 454 sequencing avoids that problem by using a large number of random primers.

The resulting strands of DNA were sequenced using pyrosequencing, which determines the sequence of a piece of DNA by adding new complementary nucleotides one by one in a reaction that gives off a burst of light. Pyrosequencing allows for fast, simultaneous analysis of hundreds of thousands of DNA fragments. Although traditional pyrosequencing generally produces relatively short chunks of sequence compared with earlier sequencing techniques, 454 Life Sciences has improved upon the technology such that longer reads are possible.
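
Conceptually, turning those bursts of light into sequence works like the sketch below: nucleotides are flowed over the template in a fixed cycle, and the intensity of each burst is roughly proportional to how many identical bases were incorporated in a row. The flow order and intensity values are illustrative, not 454's production base-caller.

```python
FLOW_ORDER = "TACG"                    # one common cyclic flow order

def call_bases(intensities):
    """Round each flow's light intensity to a homopolymer length and
    emit that many copies of the flowed nucleotide."""
    sequence = []
    for i, signal in enumerate(intensities):
        nucleotide = FLOW_ORDER[i % len(FLOW_ORDER)]
        count = round(signal)          # e.g., a signal near 2 means two bases in a row
        sequence.append(nucleotide * count)
    return "".join(sequence)

# Flows:       T    A    C    G    T    A    C    G
intensities = [1.1, 0.0, 2.1, 0.9, 0.1, 1.0, 0.0, 1.0]
print(call_bases(intensities))         # TCCGAG
```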

When 454 Life Sciences used this technique to sequence James Watson's genome, its approach was nearly identical. Lipkin's modification was to eliminate human DNA so that only the mystery pathogen's genetic material would remain.


Once the sequences were generated, Lipkin used computational techniques developed in his laboratory to filter out any remaining human sequences (which sometimes linger due to the presence of human RNA) and to piece together the many sequence fragments into longer strings. Of the more than 100,000 sequences initially produced, a mere 14 matched viral proteins in a database of all known microbes' sequences.
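
In spirit, that computational step looks like the sketch below: throw away reads that match the human reference, then keep the reads that match a database of known microbial sequences. A real pipeline uses alignment tools such as BLAST and protein-level comparisons; the exact-match lookups and tiny sequence sets here are simplifying assumptions.

```python
HUMAN_SEQUENCES = {"ACGTACGT", "TTGGCCAA"}              # stand-in human reference fragments
VIRAL_DATABASE = {"GGGCCCTT": "LCMV-like arenavirus"}   # stand-in database of known microbes

def analyze(reads):
    hits = []
    for read in reads:
        if read in HUMAN_SEQUENCES:
            continue                           # filter out lingering human RNA
        if read in VIRAL_DATABASE:             # compare against known microbial sequences
            hits.append((read, VIRAL_DATABASE[read]))
    return hits

reads = ["ACGTACGT", "GGGCCCTT", "AAAATTTT"]   # one human, one viral, one unknown
print(analyze(reads))                          # [('GGGCCCTT', 'LCMV-like arenavirus')]
```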

"If we had used a different sequencing strategy--one that gives you shorter reads--or if we had not used the sample preparation to enrich [for viral sequences], we would never have captured those," says Lipkin.

The virus from the patients' tissues was most closely related to a pathogen called lymphocytic choriomeningitis virus (LCMV), which is known to cause meningitis in humans. While LCMV has been implicated in transplant-associated illness before, the sequence of the new virus was different enough that existing methods could not have detected its presence. The results of the analysis were published online last week in the New England Journal of Medicine (NEJM).

Once it had characterized the LCMV-like virus, the group was able to design probes to test specifically for its presence. The group found evidence of the virus in several tissue samples from all three transplant recipients.

Unbiased high-throughput pyrosequencing has become a critical tool in Lipkin's lab, which collaborates with the World Health Organization and helps train and equip public-health workers around the world. Lipkin has successfully used the technique to identify 20 viruses to date, including the Israeli acute paralysis virus thought to be responsible for colony collapse disorder in bees. "There are all sorts of things that we've been able to identify using this approach," says Lipkin. "It's really quite powerful."

Because the sequencing technique is not biased toward known organisms, it is ideally poised to track down previously unknown pathogens. "We're finding the needle in the haystack, even without knowing what the needle looks like a priori," says Michael Egholm, vice president of research and development at 454 Life Sciences and a coauthor of the NEJM report.

"There's an enormous amount of uncharted territory in microbiology," says Lipkin. As many as 40 percent of cases of central nervous system disease cannot be traced back to a specific culprit. For respiratory illness, the figure is 30 to 60 percent. In the United States alone, 5,000 deaths each year result from unidentified food-borne infections. "The advent of molecular tools like the one we've described here will be important in identifying the pathogenesis of a wide variety of diseases, acute and chronic," says Lipkin.

According to Whitley, understanding the microorganisms that cause these diseases could lead to more effective treatments.

As powerful as 454 sequencing is for discovering new pathogens, it is not fast or cost-efficient enough for routine screening of transplant tissue. But microbes discovered using this technique could be incorporated into existing screening tests. "As we do more and more transplantation medicine," says Lipkin, "it's going to become critical that we find faster, more efficient, less expensive ways to screen to ensure safety."


http://www.technologyreview.com/Biotech/20191/

Wednesday, February 13, 2008

Information technology governance

Information Technology Governance (IT Governance or ICT Governance) is a subset discipline of Corporate Governance focused on information technology (IT) systems and their performance and risk management. The rising interest in IT governance is partly due to compliance initiatives (e.g., Sarbanes-Oxley in the United States and Basel II in Europe), as well as the acknowledgment that IT projects can easily get out of control and profoundly affect the performance of an organization.

A characteristic theme of IT governance discussions is that the IT capability can no longer be a black box. Traditionally, board-level executives deferred key IT decisions to IT professionals because of their own limited technical experience and the complexity of IT. IT governance implies a system in which all stakeholders, including the board, internal customers, and related areas such as finance, have the necessary input into the decision-making process. This prevents a single stakeholder, typically IT, from being blamed for poor decisions. It also prevents users from later complaining that the system does not behave or perform as expected:

A board needs to understand the overall architecture of its company's IT applications portfolio … The board must ensure that management knows what information resources are out there, what condition they are in, and what role they play in generating revenue… [1]

Definitions

There are narrower and broader definitions of IT governance. Weill and Ross focus on "Specifying the decision rights and accountability framework to encourage desirable behaviour in the use of IT."[2]

In contrast, the IT Governance Institute expands the definition to include underpinning mechanisms: "… the leadership and organisational structures and processes that ensure that the organisation's IT sustains and extends the organisation's strategies and objectives."[3]

AS8015, the Australian Standard for Corporate Governance of ICT, defines corporate governance of ICT as "The system by which the current and future use of ICT is directed and controlled. It involves evaluating and directing the plans for the use of ICT to support the organisation and monitoring this use to achieve plans. It includes the strategy and policies for using ICT within an organisation."

Background

The discipline of information technology governance derives from corporate governance and deals primarily with the connection between business focus and IT management of an organization. It highlights the importance of IT related matters in contemporary organizations and states that strategic IT decisions should be owned by the corporate board, rather than by the chief information officer or other IT managers.

The primary goals for information technology governance are to (1) assure that the investments in IT generate business value, and (2) mitigate the risks that are associated with IT. This can be done by implementing an organizational structure with well-defined roles for the responsibility of information, business processes, applications, infrastructure, etc.

Decision rights are a key concern of IT governance and the primary topic of Weill and Ross's book of that name.[4] Depending on an organization's size, business scope, and IT maturity, Weill and Ross suggest centralized, decentralized, or federated models of responsibility for dealing with strategic IT matters. In this view, well-defined control of IT is the key to success.

After the widely reported collapse of Enron in 2001, and the alleged problems within Arthur Andersen and WorldCom, the duties and responsibilities of the boards of directors for public and privately held corporations were questioned. In response, and in an attempt to prevent similar problems from happening again, the US Sarbanes-Oxley Act was written to stress the importance of business control and auditing. Sarbanes-Oxley, along with Basel II in Europe, has been a catalyst for the development of the discipline of information technology governance since the early 2000s. However, the concerns of Sarbanes-Oxley (in particular Section 404) have less to do with IT decision rights as discussed by Weill and Ross, and more to do with operational control processes such as change management.

Following corporate collapses in Australia around the same time, working groups were established to develop standards for corporate governance. A series of Australian Standards for Corporate Governance was published in 2003:

  • Good Governance Principles (AS8000)
  • Fraud and Corruption Control (AS8001)
  • Organisational Codes of Conduct (AS8002)
  • Corporate Social Responsibility (AS8003)
  • Whistle Blower protection programs (AS8004)

In 2005, AS8015 Corporate Governance of ICT was published.


http://en.wikipedia.org/wiki/Information_technology_governance

Options Trading Technology

The Optioneer strategy uses proprietary technology that has been formulated to give you two valuable indicators that identify trade entry and exit points.

  • Probability or "P" Factor: The probability of the market being outside the Strike price by contract expiration date.
  • Risk or "R" Factor: The risk associated with entering the market.

The proven formulas that determine the "R" and "P" factors are calculated daily by Optioneer Systems and posted on the website. Without these two indicators, we believe it is very difficult to assess one's position on a daily basis.
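Optioneer's actual formulas are proprietary and not described here, but as a hedged sketch, one conventional way to estimate a "P"-factor-style number is to assume lognormal price moves and compute the chance the underlying finishes beyond the strike by expiration:

    # Sketch only: probability the underlying ends above a strike at expiry,
    # assuming geometric Brownian motion. All inputs below are invented.
    from math import log, sqrt, erf

    def prob_above_strike(spot, strike, vol, t_years, drift=0.0):
        d = (log(spot / strike) + (drift - 0.5 * vol ** 2) * t_years) \
            / (vol * sqrt(t_years))
        return 0.5 * (1.0 + erf(d / sqrt(2.0)))  # standard normal CDF via erf

    # e.g. index at 1350, call strike 1450, 25% volatility, 45 days out
    print(prob_above_strike(1350, 1450, 0.25, 45 / 365))

For a put, the mirror-image probability of finishing below the strike is simply one minus this value with the same inputs.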

Regular practice on smaller trades hones skills while building confidence and market knowledge.


http://www.optioneer.com.au/options-trading-technology

Trading Technology

Internet Trading

Internet Trading unleashes the potential of the Internet by giving the broking members of an exchange the ability to grant limited or full access to any of their clients.

When making this connection, the broker is guaranteed complete confidentiality, and the rules of the exchange are strictly adhered to. The exchange receives bids and offers from the broker, acting as an agent on behalf of a client. Every such bid or offer is checked against limits set up by the broker, and when any bid or offer is satisfied, all other bids and offers are re-checked. Order authorization is unnecessary if the order is within the limits set by the broker; the facility is nonetheless available.
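As a hedged sketch of that routing logic, with invented field names and a simple notional-exposure limit standing in for the exchange's real checks:

    # Sketch: within-limit orders go straight through; others are queued
    # for the optional broker authorization facility. Names are invented.
    from dataclasses import dataclass

    @dataclass
    class Order:
        client_id: str
        quantity: int
        price: float

    def route_order(order: Order, broker_limits: dict[str, float]) -> str:
        exposure = order.quantity * order.price
        if exposure <= broker_limits.get(order.client_id, 0.0):
            return "forward_to_exchange"      # no authorization needed
        return "queue_for_broker_approval"    # the authorization facility

    limits = {"CLIENT001": 100_000.0}
    print(route_order(Order("CLIENT001", 2, 25_000.0), limits))  # forwarded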

The client has a restricted set of ATS functions; for example, they cannot make a request for a double, or an RFQ. In real time, however, they can bid, offer, hit a bid or offer, and view their orders, trades, positions, and margin requirement.

Because the Internet ATS server behaves just as a dealer receiving a call from a client would, a dealer who is logged in will see all the client orders as they are created and as they become trades and positions.

The Internet ATS software package consists of three separate modules, each performing specific functions.

ATS Inet server

Provides the interface between the client, the broker, and the appropriate exchange through which all deals take place, and stores a client database. The ATS Inet server runs at the broker. The broker can add new clients, delete clients, or modify existing client data. Adding a client grants them access to the relevant exchange system via the internet, provided the client has the ATS Client Interface installed on their PC and has access to the internet. When adding a client, the broker can choose the degree to which the client is restricted in dealing by setting the client's margin limits and deciding whether the client can only hit, whether they can view depth, and whether they need authorization to make deals. A client may also be denied dealing altogether, in which case they will only be able to view the live data transmitted from the relevant exchange. If a client is deleted from the database, they will no longer be able to access the relevant financial market via the internet until they are added to the system again. The client's particulars may also be modified so that more or fewer restrictions are placed on them, according to the current wishes of the broker.
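The actual schema is not published; the sketch below merely models the restrictions listed above (margin limit, hit-only, depth view, authorization, dealing denied) as a per-client record:

    # Sketch of a per-client record; field names are assumptions.
    from dataclasses import dataclass

    @dataclass
    class ClientProfile:
        client_code: str
        margin_limit: float
        hit_only: bool = False         # may only hit existing bids/offers
        can_view_depth: bool = True
        needs_authorization: bool = False
        dealing_denied: bool = False   # view-only access to live data

    def may_place_order(p: ClientProfile, is_new_quote: bool) -> bool:
        if p.dealing_denied:
            return False
        if p.hit_only and is_new_quote:
            return False               # cannot post fresh quotes
        return True

    p = ClientProfile("CLIENT001", margin_limit=100_000.0, hit_only=True)
    print(may_place_order(p, is_new_quote=True))   # False
    print(may_place_order(p, is_new_quote=False))  # True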

ATS Client Interface

Provides the means through which a client of the broker can view live exchange data and make deals over the internet. The ATS Client Interface runs at the client. The client must be a registered exchange client, have access to the internet, and have the ATS Client Interface installed on their PC. The client must supply their personal password before they are allowed to connect to the ATS Inet server. They will then see live data streaming in from the relevant exchange on their terminal and be able to perform whatever functions their broker has allowed.

Monitor

The main function of the Monitor is to enable the broker to authorize the deals that a client wishes to make. The Monitor runs at the broker. Brokers will be able to see all those deals for which they wish to deny or grant approval, according to the criteria set up in the ATS Inet server for each client. When a client attempts to make a bid or offer that requires such approval, the client's code and details of the transaction will appear on the screen. The broker can then accept or decline the proposed deal at the click of a button.

Security and data integrity

A number of security measures have been built into the Internet ATS:

Encryption / Decryption

Deals (bids/offers) and password changes made by the client are sensitive data that must be secured. The sensitive data is encrypted at the Client Interface and decrypted at the Internet server using a complex encryption/decryption algorithm. The data is encrypted using an untraceable key, which includes random elements and changes daily. The key is calculated independently at both the Client Interface and the Internet server, using identical formulae, and is therefore never transmitted with the sensitive data.
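The actual algorithm is proprietary, but the idea of two parties independently deriving the same daily key from material that never crosses the wire can be sketched. HMAC-SHA256 over the date with a pre-shared secret is a stand-in here; the "random elements" mentioned above would in practice come from a seed agreed out of band.

    # Sketch only: both ends derive the same daily key; the key itself
    # is never transmitted. The secret below is a hypothetical placeholder.
    import hashlib, hmac
    from datetime import date

    PRE_SHARED_SECRET = b"exchange-issued secret"  # set up out of band

    def daily_key(shared_secret: bytes, day: date) -> bytes:
        return hmac.new(shared_secret, day.isoformat().encode(),
                        hashlib.sha256).digest()

    # Client and server each compute this and obtain the same 32-byte key.
    print(daily_key(PRE_SHARED_SECRET, date.today()).hex())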

Time encapsulation

Potentially, a hacker can intercept a sensitive message and re-send it a number of times to the Inet server without tampering with the message itself. Any number of identical transactions, unwanted by the client, can be performed at the exchange in this way as long as the margin limit of the client is not transgressed. In order to prevent this, the current time is recorded as part of the sensitive message, and is subsequently encrypted at the Client Interface and decrypted at the Internet server.

When a valid sensitive message has been sent (for example, a client has made a valid bid), the time encapsulated within the message is stored at the Internet server. When the client sends a subsequent sensitive message, the time encapsulated in the second message is compared to the time stored previously. The time in the second message must be later than the stored time; if it is not, the message must be a copy of an earlier message and is rejected. Since the encapsulated time is itself encrypted within the sensitive message, it cannot be tampered with, and the scenario described above cannot occur.
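A minimal sketch of that replay check, with invented names, keeps the last accepted timestamp per client and rejects anything not strictly newer:

    # Sketch: reject any message whose encapsulated time is not newer
    # than the last one accepted for that client.
    last_seen: dict[str, float] = {}   # client code -> last accepted time

    def accept_message(client: str, msg_time: float) -> bool:
        previous = last_seen.get(client)
        if previous is not None and msg_time <= previous:
            return False               # replayed or stale copy: reject
        last_seen[client] = msg_time
        return True

    print(accept_message("CLIENT001", 1203500000.0))  # True: first message
    print(accept_message("CLIENT001", 1203500000.0))  # False: replay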

Password Issues

Passwords must meet a minimum length and contain at least 5 different characters. The client must change their password regularly. Recent passwords are stored at the Internet server, and new passwords are checked against them so that passwords are not re-used often. If consecutive logins are unsuccessful, it is assumed that someone is tampering with the client's system.

In this case, the Internet server changes the client's password to a random number, and a message is sent to the client to contact their broker, who will be able to tell them what the password has been changed to. The client can then log in again and change the password should they wish to do so. Each time a sensitive message is sent (for example, on making a bid), the client must provide their password. In this way, tampering by other people is minimized when the client is away from their computer.
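The text does not give the exact minimum length, history depth, or lockout threshold, so the numbers in this sketch are placeholders; only the shape of the checks comes from the description above.

    # Sketch of the password rules; all thresholds are placeholders
    # except the 5-distinct-characters rule, which the text states.
    import secrets

    MIN_LENGTH = 8
    MIN_DISTINCT = 5
    HISTORY_DEPTH = 5

    def password_acceptable(candidate: str, recent: list[str]) -> bool:
        return (len(candidate) >= MIN_LENGTH
                and len(set(candidate)) >= MIN_DISTINCT
                and candidate not in recent[-HISTORY_DEPTH:])

    def lock_account() -> str:
        # After too many failed logins, reset to a random value that the
        # broker can relay to the client, as described above.
        return str(secrets.randbelow(10 ** 8)).zfill(8)

    print(password_acceptable("aB3dE6gH", ["oldPass12"]))  # True
    print(lock_account())                                  # e.g. "04719320"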

Set out below are our technological specifications for each module of our Internet Automated Trading System:

Internet SERVERS:

  • 100% IBM-compatible 350MHz Pentium
  • 64MB RAM
  • 1.4MB stiffy (3.5-inch diskette) drive
  • 1GB hard drive
  • VGA
  • Windows NT Server
  • Microsoft IIS or equivalent
  • Novell 32 Client

Internet CLIENTS and MONITOR:

  • 100% IBM-compatible 200MHz Pentium
  • 32MB RAM
  • 1.4MB stiffy (3.5-inch diskette) drive
  • 1GB hard drive
  • VGA
  • Windows 95/98/NT

Margin Monitor

The Margin Monitor is an add-on tool that monitors margins on Yield-X in real time. It facilitates what-if inputs that let you check margin requirements prior to actual trade entry. Margin requirements are calculated on a trade-for-trade basis throughout the day, and margin contributions are calculated at trade level.

The Margin Monitor calculates the Risk Margin and the Settlement Margin for each participant of the market (Derivatives and Spot) that has Positions for the trading day. In addition, the Margin Monitor calculates the Margin for every Contract in the deals file per participant.

A "What If" option calculates the margin for selected contracts, a number of positions, and a strike (relevant only for option contracts). These margins are calculated either with offset, if a market participant is selected, or without offset, i.e. only the margin for the selected contracts.
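Yield-X's real margin methodology is not described here, so the sketch below uses invented flat per-contract rates and a flat offset factor purely to show the shape of a what-if check:

    # Rough sketch of a "what if" margin check; rates and the 30% offset
    # are invented placeholders, not exchange parameters.
    MARGIN_PER_CONTRACT = {"R153_FUT": 1_500.0, "R157_OPT": 900.0}

    def what_if_margin(positions: dict[str, int], with_offset: bool) -> float:
        gross = sum(MARGIN_PER_CONTRACT[c] * abs(n)
                    for c, n in positions.items())
        # With a market participant selected, offsetting positions reduce
        # the total; a flat reduction stands in for a portfolio offset.
        return gross * (0.7 if with_offset else 1.0)

    print(what_if_margin({"R153_FUT": 10, "R157_OPT": -4}, with_offset=True))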


Tuesday, February 12, 2008

Only in Japan: The Best Technologies You Can't Buy

True mobile TV. Connected cars. Personal robots. The coolest new gadgets and services are still found in the Land of the Rising Sun.


Just a few years ago Japan's lead in all things digital was easy to see. Japanese consumers could buy new domestic gadgets from companies like Sony, Toshiba, and Panasonic, often a year or two before they hit the market in other countries. But now things have changed. With gadgets increasingly coming out at the same time around the world, it's no longer the hardware that makes something cool, but what you can do with it.

[Note: To see some of the technology and services described here in action, watch our video, "Made in Japan: Future Tech Today."]

Mobile Digital TV

Take OneSeg, Japan's mobile digital TV system. The entire electronics industry, TV broadcasters, and the government all agreed on a single broadcasting standard, eliminating the technical competition that's holding back such services in the United States and Europe.

The result is a popular service that features all the regular terrestrial channels at no cost. Already, 14 million cell phones with the service have been sold, and the sight of people watching TV is becoming more common on trains and in cafes across Japan.

The latest phones also allow you to record TV shows. And if you're in a public space but forgot to bring your headphones, it's no problem. A couple of button presses brings up the subtitles so you can enjoy the show with the volume turned down. In addition, a companion data service provides information about the current show, promotions from the broadcaster, and, often, a link to the TV station's mobile Internet home page.

Mobile Wallet Service

A customer uses her cell phone as a so-called mobile wallet.

Something else that's popular in cell phones these days is the "Osaifu keitai," the mobile wallet service. Phones have smart cards embedded inside, and these cards let you add applications like electronic money, your commuter pass, an airline mileage card, or a credit card just by downloading some software.

The strength of Japan's mobile wallet system is that the industry has settled on a single smart card, Sony's Felica. Once a person's phone has this hardware, he or she can add more functionality with software.
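As a purely illustrative sketch of that architecture, one embedded card hosting many installable applications might look like this; the names are invented, and the real FeliCa applet model is Sony's, not detailed in the article.

    # Sketch: one smart card, many downloadable applications.
    class MobileWallet:
        def __init__(self):
            self.applets: dict[str, dict] = {}   # installed applications

        def install(self, name: str, config: dict) -> None:
            self.applets[name] = config          # added via software download

        def tap(self, name: str, amount_yen: int) -> str:
            app = self.applets[name]
            app["balance"] -= amount_yen
            return f"{name}: charged {amount_yen} yen, {app['balance']} left"

    wallet = MobileWallet()
    wallet.install("Edy", {"balance": 5000})     # hypothetical starting balance
    print(wallet.tap("Edy", 120))                # e.g. a vending machine buy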

To use a cell phone as a credit card, pass it over a reader.

NTT DoCoMo, Japan's largest cellular carrier, gives all its customers an electronic credit card application called DCMX Mini. It has a 10,000-yen ($94) credit limit, and charges appear on the phone bill. Big spenders can apply for more credit and use it just like a regular credit card. All you have to do is bring your phone within an inch of the reader and the transaction can be completed.

Electronic money--something that was tried many times but failed during the dot-com bubble--is now becoming very popular, thanks to "Osaifu keitai."

Of the electronic money systems in Japan, Edy from BitWallet is the market leader, accepted in more than 71,000 convenience stores, bookshops, and coffee chains, and at vending machines. More than 37 million cards and cell phones that support Edy are on the market, and the network handles close to a million transactions per day on average.

Connected Cars

In Japan, car navigation systems have been a must-have accessory in automobiles for years. Streets in cities like Tokyo often don't have names, so a navigation system can really save you time. But the latest systems, offered by companies like Nissan, come with something extra.

Hook your navigation system to your cell phone, and you have a connection through which you can get the latest road and traffic data. The navigation system already knows where the nearest gas station is, but with the network link it can also tell you where the cheapest station is, thanks to daily updates on gas prices.

When you're driving, the phone can connect you to an operator who will help you on your journey and even remotely reprogram your navigation system so that you never have to take your hands off the wheel.

About 10 percent of streets are covered with sensors that provide information on traffic. Nissan is experimenting with a new service that collects data about the roads you've driven and the speeds you've achieved, and feeds it to a central computer that adds the information to the traffic database for a more complete picture of jams.

Round View Monitor car safety system.

High tech is also being employed in car safety systems like the Round View Monitor. Video from four cameras around the vehicle is processed and combined into a single image, giving you the illusion of seeing your car from above. It makes backing into tight spaces easy and is a big step beyond the single cameras now found on some large cars and trucks.
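A conceptual sketch of that composite, assuming precomputed homographies from a one-time calibration (the identity matrices below are placeholders) and using OpenCV's perspective warp:

    # Sketch: warp each camera frame onto a common ground plane, then
    # overlay the warped views. Real systems blend seams and mask the car.
    import numpy as np
    import cv2

    CANVAS = (400, 400)  # output "view from above", width x height in pixels

    def birds_eye(frames: list, homographies: list) -> np.ndarray:
        canvas = np.zeros((CANVAS[1], CANVAS[0], 3), dtype=np.uint8)
        for img, H in zip(frames, homographies):
            warped = cv2.warpPerspective(img, H, CANVAS)
            mask = warped.any(axis=2)        # keep pixels the warp produced
            canvas[mask] = warped[mask]
        return canvas

    # Placeholder inputs: four blank frames and identity homographies.
    frames = [np.zeros((240, 320, 3), np.uint8) for _ in range(4)]
    top_view = birds_eye(frames, [np.eye(3)] * 4)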

Warning of the Big One

One area that people in Japan take very seriously is earthquakes and disaster prevention. The problem is, you never know when a quake could strike, right? Well, not necessarily.

A new warning system has just gone into operation that seeks to quickly detect the weak but fast-moving primary waves from a quake and use them to estimate when the slower-moving but destructive secondary waves will hit.

The system won't help people living at the epicenter of an earthquake, since both kinds of waves arrive there virtually simultaneously. But in the event of a major quake, warnings of anywhere from a few seconds up to a minute can be supplied almost instantaneously.
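The arithmetic behind the warning window is simple: the farther you are from the epicenter, the bigger the gap between the two wave arrivals. The speeds below are typical crustal values, not the system's actual parameters.

    # Rough numbers: P waves outrun the destructive S waves, so the
    # warning window grows with distance from the epicenter.
    P_WAVE_KM_S = 6.0   # approximate primary-wave speed
    S_WAVE_KM_S = 3.5   # approximate secondary-wave speed

    def warning_seconds(distance_km: float) -> float:
        return distance_km / S_WAVE_KM_S - distance_km / P_WAVE_KM_S

    for d in (10, 50, 150):
        print(f"{d:>4} km out: ~{warning_seconds(d):.0f} s of warning")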

That's enough time to halt trains and bring factory equipment to an emergency stop, and for homeowners to switch off the gas. Many deaths in the Kobe quake of 1995 were from fires that started after the quake, so preventing flame-ups is important.

Robots

Honda's Asimo.

No discussion of cool tech in Japan would be complete without robots. Japanese researchers are leading the world in robot technology, and humanoid bots like Honda's Asimo are especially impressive.

The latest version of Asimo can serve drinks on a tray and has gained the ability to work intelligently with other Asimo robots in the vicinity to get jobs done faster. Two of the robots have spent most of January working at Honda's Tokyo offices, bringing tea or coffee to guests--and almost certainly entertaining the visitors at the same time.

Rival car-maker Toyota has a clutch of robots including one unveiled in December that plays the violin. (It follows a trumpet-playing robot created a year earlier, so perhaps a robot orchestra is in the making?) The company also has Robina, which is intended to serve as a guide in a public space. Toyota put it into use last year at a public hall in Japan and expects robots like Robina will be commercially realistic in the middle of the next decade.

Taking on a much more serious role is Twendy One, a home-help robot developed by Tokyo's Waseda University. It can do many of the basic tasks that a frail person may need help with, such as assisting people out of bed and serving up toast and drinks.

The robot is still under development but could have a bright future. Japan's population is aging fast--already, 22 percent of people are over 65--and the birth rate is slowing. That likely means a future shortage of workers. It's one of the reasons money is being poured into robot technology in this already technology-saturated nation.


http://www.pcworld.com/article/id,142120/article.html#