  The Crowd Is Wise (When It’s Focused)
Walter Watts
The Crowd Is Wise (When It’s Focused)
« on: 2009-07-19 01:18:04 »

The New York Times
July 19, 2009
Unboxed

The Crowd Is Wise (When It’s Focused)

By STEVE LOHR

FEW concepts in business have been as popular and appealing in recent years as the emerging discipline of “open innovation.” It is variously described as crowdsourcing, the wisdom of crowds, collective intelligence and peer production — and these terms apply to a range of practices.

The overarching notion is that the Internet opens the door to a new world of democratic idea generation and collaborative production. Early triumphs like the Linux operating system and the Wikipedia Web encyclopedia are seen as harbingers.

In the new model, innovation is often portrayed as a numbers game. The more heads, the better — all weighing in, commenting, offering ideas. Collective knowledge prevails, as if by a force of egalitarian inevitability.

But a look at recent cases and new research suggests that open-innovation models succeed only when carefully designed for a particular task and when the incentives are tailored to attract the most effective collaborators. “There is this misconception that you can sprinkle crowd wisdom on something and things will turn out for the best,” said Thomas W. Malone, director of the Center for Collective Intelligence at the Massachusetts Institute of Technology. “That’s not true. It’s not magic.”

The Netflix Prize is a stellar example of crowdsourcing. In October 2006, Netflix, the movie rental company, announced that it would pay $1 million to the contestant who could improve the movie recommendations made by Netflix’s internal software, Cinematch, by at least 10 percent. In other words, the company wanted recommendations that were at least 10 percent closer to the preferences of its customers, as measured by their own ratings.

(Cinematch analyzes each customer’s film-viewing habits and recommends other movies that the customer might enjoy. More accurate recommendations increase Netflix’s appeal to its audience.)
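The contest's 10 percent yardstick was measured by root mean squared error (RMSE): a winning submission had to cut the RMSE of Cinematch's rating predictions by at least 10 percent on a held-out set of customer ratings. A minimal sketch of that arithmetic, using made-up ratings in place of the real (hidden) test data:

```python
import math

def rmse(predicted, actual):
    """Root mean squared error between predicted and true ratings."""
    return math.sqrt(sum((p - a) ** 2 for p, a in zip(predicted, actual)) / len(actual))

# Made-up ratings on Netflix's 1-to-5 star scale (the real test set is hidden).
actual     = [4, 3, 5, 2, 4]
cinematch  = [3.5, 3.4, 4.0, 2.9, 3.6]  # stand-in for the baseline's predictions
contestant = [3.7, 3.2, 4.3, 2.7, 3.8]  # stand-in for a challenger's predictions

# The prize criterion: reduce the baseline RMSE by at least 10 percent.
improvement = 1 - rmse(contestant, actual) / rmse(cinematch, actual)
print(f"RMSE improvement over Cinematch: {improvement:.1%}")
```

The percentage here is whatever the toy numbers produce; on the real data, shaving even single percentage points off Cinematch's error took contestants years of work.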

The contest will end next week because a contestant finally surpassed the 10 percent hurdle on June 26, and, according to the rules of the competition, rivals have 30 days from that date to try to beat the leader. The frontrunner is a seven-person team, and its members are statisticians, machine learning experts and computer engineers from the United States, Austria, Canada and Israel. It is led by statisticians at AT&T Research.

The leading team is a very elite crowd, indeed, but it is also one that was made possible by the Internet. The original three AT&T researchers (one has since joined Yahoo Research, but remains on the contest team) made good strides in the first year of the contest. But to make further progress, they went looking for people with other skills and perspectives. So they eventually reached out to two two-person teams that were among the leaders in the rankings posted on the contest Web site.

“The leader board was right there,” said Chris Volinsky, director of statistics research at AT&T. “It was pretty obvious who the top teams were.”

Though leading, his team may not win. But the teams in close pursuit are similar collaborations of skilled researchers and engineers.

The Netflix contest has lured experts worldwide not only because of the prize money but also because it offered a daunting challenge. The contestants’ algorithms must find patterns nestled in a collection of more than 100 million movie ratings. What is learned in tackling such a large-scale data analysis and predictive-modeling problem could well be applied in many industries, like Web commerce or telecommunications. “It made sense for us both from the perspective of AT&T and scientific research,” Mr. Volinsky explained.
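Latent-factor (matrix factorization) models were among the techniques contestants applied to the rating data. The sketch below, with hypothetical toy data and no claim to match any team's actual method, shows the core idea: learn a small vector of factors per user and per movie so that their dot product approximates each observed rating.

```python
import random

def factorize(ratings, n_users, n_items, k=2, lr=0.05, reg=0.02, epochs=500):
    """Fit user and item factor vectors by stochastic gradient descent so
    that dot(U[u], V[i]) approximates each observed rating r."""
    random.seed(0)
    U = [[random.uniform(-0.1, 0.1) for _ in range(k)] for _ in range(n_users)]
    V = [[random.uniform(-0.1, 0.1) for _ in range(k)] for _ in range(n_items)]
    for _ in range(epochs):
        for u, i, r in ratings:
            err = r - sum(U[u][f] * V[i][f] for f in range(k))
            for f in range(k):
                uf, vf = U[u][f], V[i][f]
                U[u][f] += lr * (err * vf - reg * uf)  # gradient step on user factor
                V[i][f] += lr * (err * uf - reg * vf)  # gradient step on item factor
    return U, V

# Toy (user, movie, rating) triples standing in for the 100 million real ones.
ratings = [(0, 0, 5), (0, 1, 3), (1, 0, 4), (1, 2, 1), (2, 1, 4), (2, 2, 2)]
U, V = factorize(ratings, n_users=3, n_items=3)
predict = lambda u, i: sum(U[u][f] * V[i][f] for f in range(2))
```

Unobserved cells of the matrix (for example, predict(0, 2)) are where the recommendations come from: the model fills them in from rating patterns shared across users.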

In the Netflix contest, the winning idea is simply the one with the highest score. But often, companies rely on a contributing crowd for ideas, and management then chooses among them. I.B.M., for example, conducts online brainstorming sessions it calls Jams — 13 over the last seven years.

I.B.M. used one session to guide its strategy for investing in new growth fields, starting in 2006. An estimated 150,000 employees, clients, business partners and academics participated. Management sifted through the ideas and committed $100 million to invest in several opportunities to apply technology innovations to energy saving, health care and smart electricity grids.

“It starts out as crowdsourcing and it is culled to a set of action items,” said Jeffrey T. Kreulen, a researcher at the I.B.M. Almaden Research Center in San Jose, Calif.

Open-innovation models are adopted to overcome the constraints of corporate hierarchies. But successful projects are typically hybrids of ideas flowing from a decentralized crowd and a hierarchy winnowing and making decisions. In Linux’s case, anyone can submit code, but Linus Torvalds and a few lieutenants decide what code will be included in the operating system, noted Mr. Malone of M.I.T. Even Wikipedia — produced by collaborating clusters of contributors focused on particular areas of interest — relies on administrators to make final judgments on whether to delete a challenged article, he added.

“Most of the interesting examples of collective intelligence contain many different design patterns,” Mr. Malone said.

In a recent paper, “Harnessing Crowds: Mapping the Genome of Collective Intelligence,” Mr. Malone and his two co-authors, Robert Laubacher, a research scientist at M.I.T., and Chrysanthos Dellarocas, a professor at the University of Maryland, use a biological analogy in calling the design patterns of collective intelligence systems “genes.” They studied the genelike building blocks in more than 250 examples of collective intelligence enabled by the Web. The intent, they write, is to provide a systematic framework for thinking about collective intelligence, so “managers can do more than just look at examples and hope for inspiration.”

OPENING the corporate doors to ideas and inspiration from the collective crowd holds great potential, but there are pitfalls, warns Henry Chesbrough, executive director of the Center for Open Innovation at the University of California, Berkeley. To succeed, Mr. Chesbrough said, a company must have a culture open to outside ideas and a system for vetting and acting on them.

“In business, it’s not how many ideas you have,” he observed. “What matters is how many ideas you translate into products and services.”


Walter Watts
Tulsa Network Solutions, Inc.


No one gets to see the Wizard! Not nobody! Not no how!