Tuesday, January 15, 2008

Google Maps Gaining on MapQuest

Google's Maps application is making headway against the number-one ranked map Web site.


Google Inc.'s Maps application is making headway against the number-one ranked map Web site, MapQuest, according to an analyst from Hitwise Pty. Ltd., which examines Internet usage data.

A year ago, MapQuest, which is owned by AOL LLC, received 429% more U.S. visits than Google Maps (more than five times as many), said Hitwise analyst Heather Hopkins in a blog post. Now, however, MapQuest's lead has dropped to 126% more visits than the second-ranked Google Maps. Yahoo Maps ranked third and MSN's Local Live was fourth.

Hopkins said traffic to MapQuest is flat year over year and is down 20% in the past six months. Traffic to Google Maps, meanwhile, is up 135% from the same time last year and is up 7% in the past six months. She attributed the growth for Google Maps to traffic from Google's search engine.
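The "percent lead" figures Hitwise quotes are just ratios of visit counts. A quick sketch of that arithmetic; the visit numbers below are invented for illustration (Hitwise publishes shares, not raw counts), and only the ratios matter:

```python
def lead_pct(leader_visits: float, runner_up_visits: float) -> float:
    """How many percent more visits the leader gets than the runner-up."""
    return (leader_visits / runner_up_visits - 1.0) * 100.0

# A 429% lead means the leader gets 5.29x the runner-up's visits...
year_ago = lead_pct(529, 100)    # ~429% (the five-plus-fold gap a year ago)
# ...while a 126% lead is only 2.26x as many.
now = lead_pct(226, 100)         # ~126% (the current gap)
print(f"a year ago: {year_ago:.0f}% lead; now: {now:.0f}% lead")
```

The same function makes clear why a shrinking percentage lead can coexist with MapQuest's flat absolute traffic: Google Maps' visits grew while MapQuest's did not.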

In addition, Hopkins said despite the increase in the use of GPS systems, U.S. visits to maps Web sites are up 10% from last year.

Google sends more of its own traffic to Google Maps than to MapQuest, a change that occurred last March, Hopkins said, but that doesn't account for the increase in consumers looking for Google Maps.

"We can measure this through Internet searches," Hopkins said in her blog. "Searches for 'google maps' have increased but the term 'mapquest' receives nearly 10 times the search volume."

Hopkins said Google is now sending more traffic to Google Maps for high-volume generic terms such as "map" and "driving directions" and for variations on the Mapquest brand name than it did last year.

"I am sure many will jump to the conclusion that Google is favoring its own property. I can't say whether that's true but I can say that Google Maps is receiving more of its traffic from paid search listings than MapQuest," Hopkins said in her blog. "In the past four weeks, Google Maps received 19% of its search traffic from paid listings compared with 10% for MapQuest."

Neither Google nor MapQuest could be reached for comment.


Computerworld
For more enterprise computing news, visit Computerworld. Story copyright © 2007 Computerworld Inc. All rights reserved.

Apple Introduces Time Capsule Companion Hardware

Taking the keynote stage at Macworld Expo on Tuesday, Apple CEO Steve Jobs wasted no time in introducing Time Capsule, a new product designed as companion hardware to Mac OS X v10.5 "Leopard's" Time Machine backup technology.


Taking the keynote stage at Macworld Expo on Tuesday, Apple CEO Steve Jobs wasted no time in introducing Time Capsule, a new product designed as companion hardware to Mac OS X v10.5 "Leopard's" Time Machine backup technology.

While Time Machine enables you to maintain persistent backups to a local hard disk drive, Time Capsule combines an 802.11n network access point and a hard disk drive. The device is a "full AirPort Extreme base station" combined with "a server-grade hard drive," according to Jobs.

Initially, Apple will make Time Capsule in two versions: one with a 500GB hard disk drive, for US$299, and a 1 terabyte model for $499. The new device is expected to be released in February.

Jobs also noted during the opening moments of his keynote speech that Leopard has been Apple's most successful Mac OS X release to date, racking up over 5 million unit sales in three months. Apple estimates that about 20 percent of its installed base has migrated to Leopard.

More to follow.


Macworld
For more Macintosh computing news, visit Macworld. Story copyright © 2007 Mac Publishing LLC. All rights reserved.

Researchers Control Robot With Brain Signals

Scientists in Japan have succeeded in controlling a humanoid robot with signals from a monkey's brain.


Scientists in Japan have succeeded in controlling a humanoid robot with signals picked up in the U.S. from a monkey's brain and transmitted across the Internet, they said Tuesday.

The research, which represents a world first according to the Japan Science and Technology Agency (JST), could be a first step toward giving doctors the ability to restore motor functions in severely paralyzed patients. It could also contribute to the development of robots that move more like humans, JST said in a statement Tuesday.

In the tests, scientists led by Miguel Nicolelis at Duke University in North Carolina trained two monkeys to walk upright on a treadmill. The activity of neurons in the leg area of each monkey's brain was recorded while it walked and decoded into predictions of the positions of its leg joints.

These predictions were then sent across the Internet to Kyoto where they were used to control a robot. A live video signal of the robot was relayed back to the monkey to provide feedback.
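The pipeline described above (record neural activity, decode it into joint-angle predictions, stream those to the robot controller while video feedback travels back) can be sketched roughly as follows. Everything here is an illustrative assumption: the article does not describe the actual decoder, and a simple linear readout of firing rates is only a common baseline in brain-machine-interface work, not the Duke/JST implementation.

```python
import random

random.seed(0)

# Hypothetical sizes -- the article gives no such details.
N_NEURONS, N_JOINTS = 64, 6

# A linear decoder: weights would be fit offline by regressing recorded
# firing rates against measured joint angles during treadmill walking.
W = [[random.gauss(0, 1) for _ in range(N_NEURONS)] for _ in range(N_JOINTS)]

def decode(rates):
    """Map one time-step of firing rates to predicted joint angles."""
    return [sum(w * r for w, r in zip(row, rates)) for row in W]

def control_loop(n_steps):
    """Each step: 'record' firing rates, decode them, and collect the
    joint commands that would be streamed over the network to the robot."""
    commands = []
    for _ in range(n_steps):
        rates = [random.randint(0, 10) for _ in range(N_NEURONS)]  # fake spike counts
        commands.append(decode(rates))
    return commands

cmds = control_loop(10)
print(len(cmds), "time-steps of", len(cmds[0]), "joint angles each")
```

In the real experiment the decoded predictions crossed the Internet from North Carolina to Kyoto, so network latency and the returning video feed become part of the control loop, which is one reason the feedback side of such systems is an active research question.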

The robot, called CBi for Computational Brain interface, is about the same size as a human at 155 centimeters tall and weighs 85 kilograms. It has 51 degrees of freedom of motion and was developed by JST and Christopher Atkeson of Carnegie Mellon University's Robotics Institute to enable such neuroscience research. The hardware side of the robot was developed by Sarcos, a Salt Lake City robotics company.

The results of the work are groundbreaking, according to JST, although much remains to be done before it can be worked into something useful. As part of the ongoing research, the teams are looking at sending back more complex feedback to the brains of the monkeys.

In recent years, robotics researchers have been increasingly studying how to make the movements of robots more lifelike. Robots like Asimo, developed by car-maker Honda, are being positioned as future companion robots that could either work alongside humans or carry out tasks for them. One of the many issues that needs to be tackled before such a dream can be realized is increasing the mechanical complexity of the robot while simultaneously developing more advanced control systems.


http://www.pcworld.com/article/id,141367-c,futuretechnology/article.html

BlackBerry Entering Chinese Market

HONG KONG (Reuters) - TCL Corp began sending BlackBerry handsets to partner China Mobile in 2007's fourth quarter, suggesting the launch of the popular email device in China is just around the corner.

Chinese newspapers have reported that the smartphones could go on sale this month, marking a breakthrough for Research in Motion Ltd's years-long effort to launch its device in the world's top telecoms arena and the final major Asian market left untapped.

RIM had expected sales to commence in 2007. But executives said just last week it was now for service partner China Mobile to decide when the BlackBerry would hit store shelves.

TCL Multimedia and TCL Communication, both units of parent TCL Corp, said on Tuesday they started dispatching handsets to China Mobile in the fourth quarter as the device's sole manufacturer and distributor in that country.

"They've shipped the handsets, yes, but the decision (on when they go on sale) lies with China Mobile," said Lorna Wong, an appointed spokeswoman for the TCL group, reaffirming RIM's stance.

TCL would not forecast sales or take-up, she added. China Mobile executives declined to comment.

When the BlackBerry handset is officially launched in China, it will face stiff competition from low-cost rivals, including a popular local service nicknamed RedBerry.

RIM says no handsets have been sold officially in mainland China, though BlackBerry devices are widely available on auction sites and on the black market.

The firm said in October that a concurrent deal with Alcatel-Lucent to help distribute the BlackBerry smartphones in China, and its partnership with China Mobile, gave it a powerful platform in a market that has more wireless users than there are people in the United States.

But analysts say delays in the device's launch may have arisen because Ontario-based RIM needed to satisfy Beijing that its handsets posed no security threat to Chinese communication networks.

The BlackBerry is available now from more than 300 carriers in about 120 countries.

(Reporting by Edwin Chan; Editing by Louise Ireland)

Panasonic TH-42PX700 Review

42in Plasma

Rated on: Picture, Sound, Features, Usability, Value

The jury is out on whether the extra outlay is justified. What is not in doubt is that this plasma is an excellent all-round performer.

HD Ready: yes
Resolution: 1,024 x 768
Rating: 92%

Reviewed: 14 January 2008

Design

The TH42PX700, like its younger brother the TH42PX70, can sit on a pedestal, hang from wall-mounting brackets, or stand on a Panasonic cabinet. The panel itself has a stylish, minimalist glossy black finish with a distinguishing silver strip at the bottom of the screen. For real impact and presence, go for the Panasonic cabinet, which is perfectly matched to the panel.

Features

The TH42PX700 is an upgrade to the TH42PX70 upon which it is largely based. A very useful extra HDMI input has been added (taking the total to three), along with Panasonic's 'Advanced Smart Sound Speaker system', which incorporates twin passive radiator woofers. Finally, a card slot has been added which takes SD or SDHC cards for the display of JPEG stills.

Screen: 42in 16:9
Tuner: Digital
Sound System: Nicam
Resolution: 1,024 x 768
Contrast Ratio: 10,000:1
Other Features: Vreal Picture Processing, Deep Black Filter, Real Black Drive.
Sockets: 3 HDMI, 2 SCART, Component Video, Composite Video, PC input.

The TH42PX700 is one of Panasonic's 10th generation plasma screens, and as such has a completely new panel, the G10.

Although there are some 'added extras', the TH42PX700 still forms part of the base range of new plasmas from Panasonic and comes with a 1,024 x 768 resolution.

At the heart of the TH42PX700 is the latest incarnation of Panasonic's picture processing technology, Vreal2. Vreal2 brings together an impressive range of technological picture processing wizardry adapted for the 10th generation screens.

A Digital Optimiser has been designed to reduce digital noise originating from the video source itself, and also motion pattern noise which produces false contouring during motion.

Complementing Vreal technology are Panasonic's 'Deep Black Filter' and 'Real Black Drive' technologies, which build upon the company's already legendary black-level performance capabilities.

Another feature of the TH42PX700, Viera Link, uses high-definition multimedia interface (HDMI) connections to automatically switch between different devices, put them in standby and take control of them with full menu access using a single remote.

Performance

The more powerful speakers on the PX700 certainly make their presence felt. Bass sounds are much deeper and overall clarity is much improved from the PX70.

Over the years, Panasonic have gained a legendary reputation for black-level performance, and the TH42PX700 carries on this tradition. In fact, you could make a good argument for buying the TH42PX700 on the strength of its capabilities in this area alone. Sit this screen alongside any LCD to appreciate just how stunning its black-level performance is.

Panasonic's TH42PX700, with its 1,024 x 768 resolution, will downscale 1080i content (such as Sky) and will inevitably lose a degree of clarity as a result. The results of this downscaling, however, are hardly noticeable. Put this screen next to a Full HD 1080p alternative and on close inspection you will notice a difference, but are you prepared to pay a substantial amount more for the improvement?
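The clarity trade-off here is simple arithmetic: the panel has fewer lines than a 1080i source carries, so a chunk of the detail is resampled away before it ever reaches the glass. Using the figures from the spec list above:

```python
# Panel resolution from the spec list; standard 1080i source dimensions.
panel_w, panel_h = 1024, 768
src_w, src_h = 1920, 1080    # 1080i broadcast, e.g. Sky HD

v_kept = panel_h / src_h     # ~0.71: roughly 29% of lines resampled away
h_kept = panel_w / src_w     # ~0.53: nearly half the horizontal detail lost
print(f"vertical:   {src_h} -> {panel_h} lines  ({v_kept:.0%} kept)")
print(f"horizontal: {src_w} -> {panel_w} pixels ({h_kept:.0%} kept)")
```

A Full HD 1080p panel keeps 100% in both directions, which is where the close-inspection difference comes from; at normal viewing distances the loss is far less visible.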

Colour on the TH42PX700 is superb, with a level of accuracy and saturation that is a match for some more expensive plasmas. Colours appear deep and natural, and skin tones are always subtle and believable.

As expected, motion handling is one of the TH42PX700's strengths, with no evidence of smearing at any time, even with the quickest movie action sequences or sporting action.

High Definition (HD) on the TH42PX700 is almost faultless, and for a screen in this price bracket this is quite an achievement. Worthy again of particular mention are the superb black levels and the greyscale gradation, which seems to pick up every subtlety of shadowy scenes in any situation.

With Standard Definition (SD) the TH42PX700 suffers, like all plasmas and LCDs, in that the technology was just not built for SD sources. Not as accomplished as its smaller 37in brother, the TH42PX700 is nevertheless one of the better SD performers. Sharpness and clarity generally remain intact even with fast motion sequences. The SD picture is not perfect, but easier to live with than on most flat panels.

Conclusion

The TH-42PX700 will cost you at least £200 more than its TH-42PX70 sibling. The improved sonic capabilities will be especially welcome for those of you with larger living areas, and the extra HDMI input will almost certainly come in handy. Otherwise, performance is identical to the TH-42PX70, and you might just want to save yourself that £200 ...

http://www.hdtvorg.co.uk/reviews/plasma/panasonic_th-42px700.htm

Blu-ray gains advantage over HD DVD

With the announcement last week that Warner Brothers is dropping support for the HD DVD high-definition format, the number of big Hollywood studios favoring the Blu-ray camp stands at five against just two for HD DVD.

A Warner Bros spokesman cited sales running 3:1 in favor of Blu-ray as one of the reasons the studio is giving the format its exclusive support. If the two remaining studios supporting HD DVD, Paramount and Universal, are seeing similar figures, it is hard to see them not taking the same route.

With rumors that Microsoft will offer support for the Blu-ray format on its Xbox 360, it certainly looks as if the writing is on the wall for HD DVD.

If Blu-ray emerges as the HD format champion, it will be bad news for Toshiba and Microsoft, which backed HD DVD, and, more importantly, for the millions of consumers who have invested in HD DVD players and film titles.

A single format will be good news for consumers who up until this point have been reluctant to commit to one of the two formats, and for the industry in general which will surely see an explosion in public interest. Barry Meyer, chairman of Warner Bros had previously warned that "the window of opportunity for high-definition DVD could be missed if format confusion continues to linger."

http://hdtvorg.co.uk/news/articles/2008011401.htm

Samsung's Rose-Black 650 LCD

With a unique and innovative 'Crystallized' Rose-Red finish, Samsung have produced a desirable, high-style LCD TV with an impressive feature list.

The 650 series is due out this spring and employs an innovative manufacturing process to produce Samsung's trademark black gloss finish with a subtle hint of deep red.

Adding to the stylistic mix, the 650 series scores another industry first as the only flat panel TV to dispense with the use of either glue or screws in the manufacturing process.

With an impressive feature list, the 650 series is certainly no mere show boat: 120Hz processing is joined by four HDMI (v1.3) inputs along with Full HD (1920 x 1080) resolution. A claimed contrast ratio of 30,000:1 should make the 650 series a very interesting piece of kit.

No word yet on pricing, but Samsung's 650 series will be available in 40in, 46in and 52in models.

http://hdtvorg.co.uk/news/articles/2008011301.htm

A New Treatment for Alzheimer's?

Neurologists urge caution upon reports of a successful therapy.

A puzzling disease: Alzheimer's patients given an anti-inflammatory drug show rapid improvement, according to a report from physicians in California. These doctors believe that the treatment improves the connections in the brain.
Credit: Technology Review


A drug commonly used to treat arthritis caused a dramatic and rapid improvement in patients with Alzheimer's disease, according to physicians in California. However, scientists and others not involved in the work worry that the report, which was based on trials in a few patients and hasn't been independently confirmed, may offer little more than false hope for Alzheimer's sufferers and their families.

Alzheimer's patients injected with the anti-inflammatory drug etanercept--marketed as Enbrel--showed dramatic improvements in their functioning within minutes, according to Edward Tobinick, director of the Institute for Neurological Research, a private medical facility in Los Angeles where the patients were treated, and an assistant clinical professor of medicine at the University of California, Los Angeles.

"The patients improve literally before your eyes," says Tobinick, who began using etanercept in Alzheimer's patients three years ago. He uses an unconventional method to administer the drug; he injects it near patients' spines. In 2006, he reported success with weekly treatments given to 15 people over the course of six months. In a case study in the latest issue of the Journal of Neuroinflammation, Tobinick and Hyman Gross, who practices in Santa Monica, describe how a patient improved within 10 minutes of treatment, and how cognitive tests performed two hours after the treatment showed a marked improvement over tests given before the injection. Tobinick says that the rapid improvement is typical in patients he has injected with etanercept. He treats them weekly, or, in some cases, less often.

"In each case, the person was more alert, calm, attentive, and they stayed on track," says Sue Griffin, director of research at the Donald W. Reynolds Institute on Aging at the University of Arkansas for Medical Sciences, who watched Tobinick treat several patients. Griffin says that she was skeptical when she first heard about Tobinick's approach, but having witnessed the effect firsthand, she says, "It was just completely amazing, like nothing I'd ever seen for an Alzheimer's person."

Minutes before the treatment, the patient in the case study couldn't recall the year or which state he was in. Ten minutes after the injection, he answered these questions correctly. As part of a cognitive assessment performed the day before the treatment, the patient was asked to draw a clock face showing a certain time. He sketched a square. Two hours after the injection, he drew a round face with two hands in approximately the correct positions.

The case report on this patient's rapid improvement is "interesting," says William Thies, vice president of medical and scientific relations for the Alzheimer's Association, but he adds that "we're going to need more information before it's something that people should get wildly excited about."

"There are some kernels of good science here," says David Standaert, director of the Center for Neurodegeneration and Experimental Therapeutics at the University of Alabama at Birmingham, but he cautions that "this is not enough evidence that we would start treating people outside of a trial." Standaert was not involved in the work.

Etanercept has been used since 1998, primarily to treat rheumatoid arthritis. It's usually injected into the thigh, stomach, or upper arm, but Tobinick says that by injecting it into the neck, near the spine, the drug can reach the brain. It's a method that requires considerable skill. "It would be incorrect for anyone to think that they could just take Enbrel if they have Alzheimer's, and they'll get better," he says.

For decades, scientists have been trying to figure out how Alzheimer's disease does its damage so that they can determine how it might be treated. A hallmark of the disease is the globs of protein that form in the brain. There's mounting evidence that "part of what damages the brain is the body's own immune responses to these abnormal proteins," Standaert says. Various anti-inflammatory drugs have been trialed in Alzheimer's patients, but with disappointing results.

Etanercept reduces inflammation by blocking a protein called tumor necrosis factor (TNF), which plays an important role in immune responses. TNF occurs naturally in the brain, but studies have found elevated levels in people suffering from Alzheimer's disease.

Recent evidence suggests that TNF regulates the activity of synapses, which connect brain cells and enable electrical signals to travel around the brain. In Alzheimer's patients, an excess of TNF may wreak havoc on those connections, Tobinick says. "Even though the neurons may be working, the connections between the neurons and between the different lobes of the brain may not be working properly."

By using etanercept to reduce levels of TNF in the brains of Alzheimer's patients, Tobinick thinks he may have normalized those connections, leading to an immediate improvement in cognitive functioning. He says that he's working with academic partners to design larger-scale trials of the treatment. However, Sonia Fiorenza, a spokeswoman for Amgen, which markets Enbrel, says that the company won't be sponsoring trials because it doesn't believe there's enough evidence that it may be useful in Alzheimer's disease.

Some researchers want to see independent studies carried out, in part because Tobinick has disclosed that he has stock in Amgen and holds patents on the use of the drug and other anti-TNF agents to treat Alzheimer's disease.

It's not unusual for researchers to have a financial interest in something they're studying, Thies says, and "it doesn't stop them from doing good science." However, "you're going to have to have some independent confirmation in the hands of others."

"You have to do these [studies] double blind, placebo controlled, by people who don't have a financial interest," says J. Wesson Ashford, a senior research scientist at the Stanford University/VA Aging Clinical Research Center. (He was not involved in Tobinick's work.) "I'd really like to believe it, but I've seen it so many times, when people say something and it doesn't turn out to be anything."

"This is something that's got to be looked at," Griffin says. "I hope that scientists will pay attention to this, and the funding agencies will pay attention to this."

"This is not a cure," she adds, but if there's a person who can't dress or feed himself, is arrogant, mean, and up all night, "and you can take them to the point where they can feed themselves, they're calmer, attentive, conversational--in other words, you can stand them--that's great."


http://www.technologyreview.com/Biotech/20066/

The New CAFE Standards

Fuel standards will likely be achievable but won't encourage innovation.

Small changes: New fuel-economy standards won’t drive plug-in electric vehicles, such as this three-door hatchback from ZENN Motor Company, into the market. But lower-profile R&D incentives and manufacturing supports in the same legislation could jump-start the process.
Credit: Electric Drive Transportation Association


The 40 percent increase in the U.S. fuel-economy standard to 35 miles per gallon by 2020, which Congress passed last month, could be a significant step toward trimming U.S. drivers' increasing greenhouse-gas emissions and dependence on imported oil. But energy experts say that little new technology is required to meet the new standards. Instead, they say that lesser-known provisions in the Energy Independence and Security Act could have a far greater impact on spurring the development of new technologies, such as plug-in electric vehicles.

The new law tightens Corporate Average Fuel Economy (CAFE) standards that regulate the average fuel economy in the vehicles produced by each major automaker. The current CAFE standard for cars, set in 1984, requires manufacturers to achieve an average of 27.5 miles per gallon, while a second CAFE standard requires an average of 22.2 miles per gallon for light trucks such as minivans, sport utility vehicles, and pickups. The new rules require that these standards be increased such that, by 2020, the new cars and light trucks sold each year deliver a combined fleet average of 35 miles per gallon.
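One detail worth knowing about how a "combined fleet average" is computed: CAFE averages are sales-weighted harmonic means of fuel economy, not simple arithmetic means, so low-mpg models drag the fleet figure down disproportionately. A sketch with invented fleet numbers (the model mix below is purely illustrative):

```python
def cafe_average(fleet):
    """Sales-weighted harmonic mean of mpg, the form CAFE uses:
    total vehicles sold divided by the total gallons-per-mile weight."""
    total_sales = sum(sales for sales, _ in fleet)
    return total_sales / sum(sales / mpg for sales, mpg in fleet)

# Hypothetical model mix: (units sold, mpg) -- illustrative numbers only.
fleet = [(400_000, 42), (350_000, 33), (250_000, 26)]
print(f"CAFE average: {cafe_average(fleet):.1f} mpg")
# The arithmetic sales-weighted mean of the same mix is ~34.9 mpg; the
# harmonic mean comes out lower because the 26-mpg model burns
# disproportionately more fuel per mile driven.
```

This is why an automaker can't offset one gas-guzzler with one hybrid of equally extreme mpg: the guzzler's extra gallons per mile outweigh the hybrid's savings.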

Raising fuel economy by 10 miles per gallon nationwide will deliver real benefits. The Union of Concerned Scientists, for example, estimates that it will save 1.1 million barrels of oil per day in 2020--about half of what the United States imports from the Persian Gulf. That should deliver a reduction in greenhouse gases equivalent to taking 28 million of today's cars and trucks off the road. Nevertheless, Jim Kliesch, a senior engineer with the Union of Concerned Scientists, in Washington, DC, projects that these savings will be largely negated in 2020 by increased driving.

"Fuel-economy policy in this country had been stagnating for decades, and getting a minimum of 35 mpg by 2020 is a critical first step, but if we want to achieve a sustainable transportation system, it's going to take much more," says Kliesch.

Analysts like Kliesch do expect automakers to produce more advanced diesel vehicles and hybrids in the coming decade, both in response to tightening fuel-economy rules and for strategic and marketing reasons. But reports by the U.S. National Academy of Sciences, think tanks, and activists show that a combination of existing efficiency options, such as continuously variable transmissions and better tires, can cheaply and easily deliver a 35-miles-per-gallon fleet. (See "Why Not a 40-MPG SUV?")

Indeed, Europe currently requires 40 miles per gallon average fuel economy and will soon push up to 49 miles per gallon, while Japan is expected to reach 47 miles per gallon in its 2015 standard. Greenhouse-gas regulations developed by California (and adopted by many other states) may soon eclipse CAFE in the United States. Last month, the Environmental Protection Agency rejected California's petition to impose its own standards, arguing that the state rules delivered the equivalent of just 33.5 miles per gallon, but California officials shot back with their own analysis early this month. They estimated that the state standard would yield 35 miles per gallon from new cars by about 2016--four years ahead of CAFE.

While the tighter CAFE standards will have a minimal effect on spurring the development of new technologies, other measures in the new federal energy law could stimulate more transformative, long-term change. Genevieve Cullen, vice president of the Electric Drive Transportation Association, in Washington, DC, points to the law's support for research on, and demonstration and manufacturing of, electric-vehicle technologies such as lithium batteries and advanced motors. The law authorizes, for example, $450 million in grants to companies and state and local governments to demonstrate the use of plug-in hybrid vehicles, and up to $25 billion for direct loans to support manufacturing. "Congress needs to now provide the money for these programs," says Cullen, referring to the separate process in which funds are appropriated for each fiscal year.

A hoped-for $3,000-to-$5,000-per-vehicle tax credit for buyers of plug-in hybrids could have further stimulated demand for advanced vehicles but was struck from the bill at the last minute, along with a tax hike for oil producers that would have funded it. Therese Langer, transportation program director for the American Council for an Energy-Efficient Economy, a Washington-based think tank, says that thanks to the high cost of batteries, the tax credit would probably have only cut in half the incremental cost of a plug-in vehicle. "That's really nice for someone who's prepared to shell out thousands of bucks for a plug-in," she says, "but it's not going to cause plug-ins to make a dent in total vehicle sales."


http://www.technologyreview.com/Energy/20067/

Google's Answer to Wikipedia

Faces and names: In a bid to make Internet content more credible and profitable, Google has created Knol, an online encyclopedia whose articles will feature author attributions and advertisements.
Credit: Google


Google recently announced Knol, a new experimental website that puts information online in a way that encourages authorial attribution. Unlike articles for the popular online encyclopedia Wikipedia, which anyone is free to revise, Knol articles will have individual authors, whose pictures and credentials will be prominently displayed alongside their work. Currently, participation in the project is by invitation only, but Google will eventually open up Knol to the public. At that point, a given topic may end up with multiple articles by different authors. Readers will be able to rate the articles, and the better an article's rating, the higher it will rank in Google's search results.

Google coined the term "knol" to denote a unit of knowledge but also uses it to refer to an authoritative Web-based article on a particular subject. At present, Google will not describe the project in detail, but Udi Manber, one of the company's vice presidents of engineering, provided a cursory sketch on the company's blog site. "A knol on a particular topic is meant to be the first thing someone who searches for this topic for the first time will want to read," Manber writes. And in a departure from Wikipedia's model of community authorship, he adds that "the key idea behind the Knol project is to highlight authors."

Noah Kagan, founder of the premier conference about online communities, Community Next, sees an increase in authorial attribution as a change for the better. He notes the success of the review site Yelp, which has risen to popularity in the relatively short span of three years. "Yelp's success is based on people getting attribution for the reviews that they are posting," Kagan says. "Because users have their reputation on the line, they are more likely to leave legitimate answers." Knol also has features intended to establish an article's credibility, such as references to its sources and a listing of the title, job history, and institutional affiliation of the author. Knol may thus attract experts who are turned off by group editing and prefer the style of attribution common in journalistic and academic publications.

Manber writes that "for many topics, there will likely be competing knols on the same subject. Competition of ideas is a good thing." But Mark Pellegrini, administrator and featured-article director at Wikipedia and a member of its press committee, sees two problems with this plan. "I think what will happen is that you'll end up with five or ten articles," he says, "none of which is as comprehensive as if the people who wrote them had worked together on a single article." These articles may be redundant or even contradictory, he says. Knol authors may also have less incentive to link keywords to competitors' articles, creating "walled gardens." Pellegrini describes the effect thus: "Knol authors will tend to link from their articles to other articles they've written, but not to articles written by others."

Google also faces the difficult task of generating a useful body of knowledge from scratch. According to Wikipedia, it has taken more than seven years to generate its 9.25 million articles. "There's really no shortcut to getting this kind of coverage," says Pellegrini.

But Google is well positioned to provide a monetary incentive for content generation through its advertising programs, such as AdSense. If Knol attracts the number of users Wikipedia currently enjoys, Google has an opportunity to publish an equivalent number of ads. And some of that revenue would find its way to content providers. Manber writes, "If an author [of a Knol article] chooses to include ads, Google will provide the author with substantial revenue share from the proceeds of those ads."

These payments are likely to be modest, however, especially when the site is newly launched and doesn't yet have enough content to attract many readers. And Kagan believes that for many online content contributors, small payments from revenue-sharing programs will prove less of an incentive than the desire to share something they are passionate about. He points to the example of the revenue-sharing video website Revver, which has yet to approach the popularity of YouTube. "Many times, paying users to do things they wouldn't genuinely do proves not to work," Kagan says.

Google is betting that, if it can generate enough content, its expertise in search--and the effectiveness of peer review--will give it a competitive advantage. But while reader rating systems are common on sites that review goods and services, such as Epinions and Amazon.com, it's unclear how effective they will be as a means of promoting user-generated content. Manber writes, "Google will not serve as an editor in any way, and will not bless any content." Wikipedia and peer-reviewed journals, by contrast, have mechanisms for preventing the proliferation of inaccurate content. Peer-reviewed journals publish only those articles deemed worthy by a group of the author's academic contemporaries. Wikipedia articles are constantly edited by numerous authors, so bogus information is typically removed quickly. In 2005, Nature found that there was not a substantial difference between the accuracy of scientific articles on Wikipedia and those in the Encyclopedia Britannica.

http://www.technologyreview.com/Biztech/20065/