home >> blog >>
February 2004 archives

February 28, 2004

pixies tour dates

Pixies tour dates have been announced, and it looks like I won't be able to make it :( The tour starts in Canada, then comes to the west coast before heading to Europe. I will be a few days out of the states come April 29, when the Pixies are scheduled to play in Davis. For those of you in the area, get ready to act quick - tickets for the Davis show go on sale Monday, March 1. I could possibly make the shows in Barcelona or the Netherlands, but I'd have to prolong my Europe stay a couple days and do some strategic flight routing. Or maybe I should just schedule another Europe trip altogether, and see the Pixies in Paris on my birthday :). Too cool.

Posted by jheer at 10:46 AM | Comments (1)

February 24, 2004

if you weren't pissed off enough already

From an SF Chronicle article: Sixty-two of the nation's top scientists, including a dozen Nobel laureates, denounced the Bush administration Wednesday for "misrepresenting and suppressing scientific knowledge for political purposes."

In an unusually harsh critique of White House policy-making, the scientists signed a joint statement accusing the administration of systematically distorting research findings, disbanding scientific advisory panels, ignoring or demoting its own staff experts and misleading the public on issues ranging from lead poisoning to climate change.

More at the Union of Concerned Scientists website.

Posted by jheer at 02:32 PM | Comments (0)

February 23, 2004

vienna waits for me

When it rains, it pours. Now with three papers accepted to the CHI conference in Vienna, I was just notified that our AVI submission (to be presented in Italy in May) has also gotten in. This has an important implication: in order to reduce air fare costs I must make the sacrifice of staying in Europe for a month between conferences. So I will be a bum, but only out of consideration to the financial concerns of my advisor and the good folks at PARC. Looking at the preliminary CHI conference schedule, it looks like Nathan and I are scheduled as the first talk in the first listed session (entitled "Keeping Safe"). This means that not only might we be the very first paper in the proceedings (yay visibility), but we get our talk out of the way early (translation: more time to enjoy Austrian beer halls free of all worries).

To make things even better, I just got back from STA travel on Telegraph and was able to get a one-way ticket for under $500. I'm not sure how much more round-trip costs. The only catch was that you had to be a full time student and be 25 or younger (at least for the Northwest flight I got, I'm not sure what availability for other airlines is like). One-way tickets through other carriers (ITA, Orbitz, Expedia) were coming up ridiculously expensive, with the exception of a previously unknown-to-me Polish airline that routes you through Warsaw and includes a segment in a prop plane. I'll save that adventure for another day. So if you're a student going to CHI, check them out (www.sta.com). Of course, the sweetest part will likely be the layover in Amsterdam. Good times :)

Posted by jheer at 06:33 PM | Comments (1)

animals underground

Too cute... riding the subway just got more enjoyable. "Now rounding the elephant's ass... next stop Paddington station"

Posted by jheer at 03:58 PM | Comments (0)

February 11, 2004


Ever feel this way?

Better to be untethered and open to possibility: living for the exhilaration of meeting someone new, of not knowing what the night will bring.

There is a bittersweet fondness for silence. All those nights alone—they bring insight.

...there is no patience for dating just for the sake of not being alone. We want a miracle. Out of millions, we have to find the one who will understand.

or perhaps you are enamored of Rilke?

If so, you may be a quirkyalone. You can even take the quiz to find out. I scored an 84, which makes me "somewhat quirkyalone" aka "a quirkytogether".... and I had always thought I was just picky. If you score highly, perhaps you can go party with your peers.

Posted by jheer at 11:47 AM | Comments (4)

book: computers and cognition

It seems we've all been getting phenomenological these days, so now is no time to stop. I just finished reading one of the better things to come out of the 1980's (my little brother and Metallica being two other notable exports) -- Winograd and Flores' monograph "Understanding Computers and Cognition".

The book is the retelling of an intellectual journey, philosophically examining the failure of Artificial Intelligence to achieve its lofty goals and directing the insights gained from this exploration towards a new approach to the design of computer systems. Or, more simply, how Heidegger and friends led an AI researcher to the study of human-computer interaction.

The authors begin by challenging what they call the "rationalistic" tradition (what today might be referred to as positivism?) stretching throughout most of Western thought. This tradition's problem solving approach consists of identifying relevant objects and their properties, and then finding general rules that act upon these. The rules can then be applied logically to the situation of interest to determine desired conclusions. Under this tradition, the question of achieving true artificial intelligence on computers, while daunting, holds the glimmer of possibility.

Winograd and Flores instead argue for a phenomenological account of being. The authors pull from a variety of sources to make their claims, but rest primarily on Heidegger's Being and Time and the work of biologist Humberto Maturana. One of the important implications is the notion of a horizon, background, or pre-understanding, making it impossible to completely escape our own prejudices or interpretations. Much of our existence is ready-to-hand, operating beneath the level of recognition and subject-object distinction, and this can not, in its entirety, be brought into conscious apprehension (i.e. made present-at-hand). AI programs at the time, however, were largely representational. The program's "background" is merely the encoding of the programmer's apprehension and assumptions of the program's domain. While this approach can certainly create useful programs, they are characteristic of the decontextualized, desituationalized nature commonly attributed to computer interaction and are a far cry from human intelligence.

The authors further delve into the issue of language, arguing that "...the essence of language as a human activity lies not in its ability to reflect the world, but in its characteristic of creating commitment. When we say a person understands something, we imply that he or she has entered into the commitment implied by that understanding." Thus, the authors argue that computers, by their very nature, are incapable of commitment and therefore prevented from entering into language on the same terms as humans.

The authors' conclusion? Move from AI to HCI. There is an error in assuming that success will follow the path of artificial intelligence. The key to design lies in understanding the readiness-to-hand of the tools being built, and in anticipating the breakdowns that will occur in their use. A system that provides a limited imitation of human facilities will intrude with apparently irregular and incomprehensible breakdowns. On the other hand, we can create tools that are designed to make the maximal use of human perception and understanding without projecting human capacities onto the computer.

Other thoughts and notes are in the extended entry.

The design section at the end of the book discusses the Coordinator system, which explicitly represents different speech acts as a way of attempting better coordination of organizational communication, in particular supporting the formation and evaluation of commitments. I'm not familiar with the literature on this system, but colleagues of mine have referred to it as a known failure of early CSCW (computer-supported cooperative work). The explicit encoding of otherwise "ready-to-hand" communication seems potentially dangerous and limiting of social nuance. For example, if a commitment is encoded formally, how much room for ambiguity (or delaying, or weaseling, or whatever) is left without making it present-at-hand? It is similar to one of the projects discussed in my friend Scott's thesis, in which by trying to leverage a theory of human behavior (in this case Goffman's notion of different fronts or faces), he encoded formally what people practice unconsciously with high degrees of nuance, thus creating a disconnect between actual human behavior and the well-intentioned mechanisms of the interface.
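The worry about explicit encoding can be made concrete with a toy sketch (entirely hypothetical — this is my illustration, not the Coordinator's actual data model): once a reply to a request must be one of a fixed set of formal values, the tracked commitment becomes unambiguous, but the graceful middle ground of ordinary conversation has nowhere to live.

```python
from enum import Enum

# Hypothetical encoding of a Coordinator-style commitment.
# Once a reply must be one of these values, the "ready-to-hand"
# repertoire of human responses ("I'll try", "let me see",
# polite silence) has no place to live.
class Reply(Enum):
    PROMISE = "promise"        # commit to the request
    DECLINE = "decline"        # explicitly refuse
    COUNTER = "counter-offer"  # propose different terms

def respond(request: str, reply: Reply) -> str:
    # The system can now track and audit commitments precisely...
    return f"{request!r} -> {reply.value}"

# ...but there is no enum member for ambiguity or weaseling.
print(respond("review my draft by Friday", Reply.COUNTER))
```

The disconnect the thesis project illustrates falls out of the same move: whatever nuance isn't a member of the enum simply cannot be expressed without dragging it into present-at-hand negotiation.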

How would more recent AI developments be treated through the lens of this book? Modern statistical techniques can incorporate probabilistic logic and learning from example data, but they still revolve around the statistical model (e.g. specific graphical models) and training techniques (e.g. the EM algorithm) used. These are still representational (primarily in the choice of statistical model), but less strictly so. How far can we extrapolate this, loosening the representation?

Do we have any of our own 'hard-coded' models (e.g. Chomskian grammar)? Where do our own representational structures lie on the spectrum of nature (genetics, evolution) and nurture (socially learned and negotiated meaning)?

The question here is at the heart of modern cognitive neuroscience - at what representational level, if any, can we understand human functioning, cognition, and experience (at varying levels of consciousness)? Physics? Chemistry? Neuronal interaction? At what level should we look for the organization (or perhaps better stated, embodiment) of a structure-determined, autopoietic system that allows for experience, intelligence and a background to arise? In short, where and how do science and phenomenology dovetail?

In the meantime, it is argued that the design of computer programs should steer clear of these pretensions. The lesson from above teaches us that even as we understand mechanisms of thought, language, experience, etc, the way we naturally perceive and act in the world is not experienced or conceptualized in the terms of these mechanisms.

The big challenge left for us after reading this book: How do we determine the readiness-to-hand of the tools being built (or the desired 'invisibility' of ubiquitous computing environments)? How do we design for it, how do we measure it, evaluate it, and value it? Furthermore, how do we look beyond just 'tools'? How do we build things that appropriately shift between ready-to-hand and present-at-hand, and that are designed to evoke emotional as well as rational responses? (e.g. a nuclear missile launch control interface should be anything BUT ready-to-hand, requiring conscious deliberation). We've had almost 20 years of HCI research since this book was published, with numerous successes in various (often constrained) domains, but these are still the core theoretical and methodological motivations pushing us forward.


Heideggerian Philosophy
- Our implicit beliefs and assumptions cannot all be made explicit
- Practical understanding is more fundamental than theoretical understanding
- We do not relate to things primarily by having representations of them
- Meaning is fundamentally social and cannot be reduced to the meaning-giving activity of individual subjects.

Ready-to-hand: the world in which we are always acting unreflectively. The ready-to-hand is taken as part of the background, taken for granted without explicit recognition or identification.

Present-at-hand: the world in which we are consciously reflective, identifying, labeling, and recognizing artifacts and ideas as such.

Breakdown: the event of the ready-to-hand becoming present-at-hand

Thrownness: the condition of understanding in which our actions find some resonance or effectiveness in the world

Properties of thrownness
- You can not avoid acting
- You can not step back and reflect on your actions
- The effects of actions can not be predicted
- You do not have a stable representation of the situation
- Every representation is an interpretation
- Language is action


The Biology of Cognition: Humberto Maturana

The structure of the organism at any moment determines a domain of perturbations--a space of possible effects the medium could have on the sequence of structural states that it could follow.

Autopoiesis. An autopoietic system is defined as: "...a network of processes of production (transformation and destruction) of components that produces the components that: (i) through their interactions and transformations continuously regenerate the network of processes (relations) that produced them; and (ii) constitute it (the machine) as a concrete unity in the space in which they (the components) exist by specifying the topological domain of its realization as such a network..." -Maturana and Varela, Autopoiesis and Cognition (1980), p.79

A plastic, structure-determined system (i.e., one whose structure can change over time while its identity remains) that is autopoietic will by necessity evolve in such a way that its activities are properly coupled to its medium.

Structural coupling is the basis not only for changes in an individual during its lifetime (learning) but also for changes carried through reproduction (evolution). In fact, all structural change can be viewed as ontogenetic (occurring in the life of an individual). A genetic mutation is a structural change to the parent which has no direct effect on its state of autopoiesis until it plays a role in the development of an offspring.

A cognitive explanation is one that deals with the relevance of action to the maintenance of autopoiesis. It operates in a phenomenal domain (domain of phenomena) that is distinct from the domain of mechanistic structure-determined behavior.

For Maturana the cognitive domain is not simply a different (mental) level for providing a mechanistic description of the functioning of an organism. It is a domain for characterizing effective action through time. It is essentially temporal and historical.

The sources of perturbation for an organism include other organisms of the same and different kinds. In the interaction between them, each organism undergoes a process of structural coupling due to the perturbations generated by the others. This mutual process can lead to interlocked patterns of behavior that form a consensual domain.


Speech Acts

Five categories of illocutionary point:
- Assertives
- Directives
- Commissives
- Expressives
- Declarations
Each can be applied with varying illocutionary force.


The failures of AI

[A] program's claim to understanding is based on the fact that the linguistic and experiential domains the programmer is trying to represent are complex and call for a broad range of human understanding. As with the other examples however, the program actually operates within a narrow micro-world that reflects the blindness of that representation.

...the essence of language as a human activity lies not in its ability to reflect the world, but in its characteristic of creating commitment. When we say a person understands something, we imply that he or she has entered into the commitment implied by that understanding. But how can a computer enter into a commitment?

In order to produce a set of rules for [an] ... 'expert' system, it is first necessary to pre-select the relevant factors and thereby cut out the role of the background. But as we have been arguing throughout this book, this process by its very nature creates blindness. There is always a limit set by what has been made explicit, and always the potential of breakdowns that call for moving beyond this limit.

There is an error in assuming that success will follow the path of artificial intelligence. The key to design lies in understanding the readiness-to-hand of the tools being built, and in anticipating the breakdowns that will occur in their use. A system that provides a limited imitation of human facilities will intrude with apparently irregular and incomprehensible breakdowns. On the other hand, we can create tools that are designed to make the maximal use of human perception and understanding without projecting human capacities onto the computer.

Posted by jheer at 10:15 AM | Comments (0)

February 02, 2004

google + friendster = eurekster?

With all the hoopla surrounding social network sites these days, I was a bit surprised to see that this one, perhaps overshadowed by the launch of orkut, seemed to fly under most folks' radar: Eurekster. Eurekster is a personalized search engine that uses the search activity of your social network to help rank your search results... like if Google took your friendsters into account when computing a personalized PageRank. The site got a feature in ZDNet a few weeks ago.

So I signed up a couple fake accounts to see how it works. The graphic design of the site left me feeling bored and uninspired, so fortunately the registration process was simple and quick, with no long list of interests or demographic data to fill out... just a name, e-mail, and password. Like friendster and any other YASNS they then ask you for a list of friends' e-mails to invite into your social network. You then use Eurekster like any search engine, except that search hits previously clicked-through by people in your social network climb up in the ratings, and are marked with the Eurekster 'e' logo to indicate the previous traffic. Additionally, a side bar on the right of the browser presents the most recent and most recurrent search queries and followed links.
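As a sketch of how this kind of re-ranking might work (my own guess at the mechanism, not Eurekster's actual algorithm): hits previously clicked by anyone in your network get a score boost before presentation, and get flagged the way Eurekster marks them with its 'e' logo.

```python
# A guessed-at sketch of social re-ranking, not Eurekster's real code:
# hits clicked before by anyone in your network float toward the top.

def rerank(results, network_clicks, boost=1.0):
    """results: list of (url, base_score) pairs; network_clicks: set of
    urls clicked by people in the searcher's social network."""
    def score(hit):
        url, base = hit
        return base + (boost if url in network_clicks else 0.0)
    ranked = sorted(results, key=score, reverse=True)
    # Flag boosted hits, analogous to Eurekster's 'e' logo marker.
    return [(url, url in network_clicks) for url, _ in ranked]

results = [("a.com", 0.9), ("b.com", 0.7), ("c.com", 0.5)]
print(rerank(results, network_clicks={"c.com"}))
# c.com jumps from last to first
```

Note that in this sketch the boost is a flat constant, which is exactly the failure mode I worry about below: one friend chasing crappy links moves your results just as much as a genuinely relevant click.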

Their business model for the moment is to collect money through paid search results provided by Overture. It's an interesting idea, but I don't expect them to do well. First, these kinds of sites need a critical mass to be successful, and I don't think the cost of building up yet another social network will give you a reasonable benefit over what you can already get from Google. Besides, do I really want my friends' search results to influence mine? The voyeur in me kind of likes the idea of seeing the "footprints in the sand" of previous searches, but does it make my search results much more relevant? Perhaps if I could segment my friends into different groups, and apply these groups depending on topic, it might add some relevance, but that incurs even more work on the part of the user. There is also the possibility, especially in smaller networks, of spying on others' search queries. Your social network can then play the game of "which one of us has been hunting for porn!?", though to be fair, Eurekster includes a "private search" checkbox that hides your searches from your network. Still, what about people following crappy links, both intentionally (search spamming!) and unintentionally, changing your results? Finally, before I get any further carried away, how big does your social network need to be for it to make any real, significant impact on the majority of your searches?

I think using actual user behavior to improve relevance rankings is a fruitful avenue to explore.... but why even bother with the social network? Why not just use the traffic patterns of everyone, or maybe just yourself? The most useful feature may actually be to have a social network consisting of just yourself, and then all the sites you visited before will rise in the rankings, allowing you to revisit them faster. In the case of one global network, with a little analysis or clustering you could even pick out global patterns and groupings which you could use to refine your search upon demand (though I suppose this is akin to adding user interaction data to a search engine like Vivisimo, which already does clustering). In the end, though, how will any of this successfully differentiate a competitor from Google? That's certainly a tough act to beat today, let alone in the future when Google rolls out their own personalization technology, based on the work they acquired from Outride and Kaltix.
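The global-pattern idea could be prototyped crudely; here's a toy illustration (my own, and far simpler than whatever Vivisimo actually does) that groups a query log into clusters by term overlap:

```python
# A crude illustration of mining a global query log for groupings:
# queries sharing enough terms fall into the same cluster.

def cluster_queries(queries, threshold=0.5):
    clusters = []
    for q in queries:
        terms = set(q.lower().split())
        for c in clusters:
            rep = set(c[0].lower().split())
            # Jaccard similarity against the cluster's first member
            overlap = len(terms & rep) / len(terms | rep)
            if overlap >= threshold:
                c.append(q)
                break
        else:
            clusters.append([q])  # no match: start a new cluster
    return clusters

log = ["pixies tour dates", "pixies tour tickets",
       "cheap flights vienna", "flights vienna cheap"]
print(cluster_queries(log))
# two clusters: the pixies queries and the vienna queries
```

Even something this naive surfaces the kind of recurring groupings you could offer back to users as refinements on demand.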

Posted by jheer at 04:17 PM | Comments (1)

February 01, 2004

search wars: a prelude

Microsoft is taking on Google, but don't say I didn't warn you. Nice NYTimes article documenting the nascent competition between Google and Microsoft (and don't forget Yahoo!).

I particularly enjoyed the anecdotes regarding hiring competition:

Last year, Rick Rashid, a Microsoft vice president in charge of the company's research division, came to its outpost in Silicon Valley to give a demonstration of an experimental Microsoft Research search engine. Shortly afterward, however, Mike Burrows, one of the original pioneers of Internet search at Digital Equipment who later helped design Microsoft's experimental search engine, quietly defected. He joined Google.

... Microsoft has already begun a recruitment campaign aimed at demoralizing Google employees, several Google executives said. Microsoft recruiters have been calling Google employees at home, urging them to join Microsoft and suggesting that their stock options will lose value once Microsoft enters the search market in a serious way.

...hmmm, seems a bit duplicitous for my tastes.

Here's a copy of the article:

Microsoft Is Taking On Google

Published: February 1, 2004


AT the World Economic Forum in Switzerland last week, Microsoft, the software heavyweight, and Google, the scrappy Internet search company, eyed each other like wary prizefighters entering the ring.

Bill Gates, the chairman of Microsoft, stated his admiration for the "high level of I.Q." of Google's designers. "We took an approach that I now realize was wrong," he said of his company's earlier decision to ignore the search market. But, he added pointedly, "we will catch them."

The four top Google executives attending the forum, at the ski resort of Davos, were no less obsessed with Mr. Gates's every move. "We had many opportunities to see Bill and Microsoft here in Davos," Eric E. Schmidt, Google's chief executive, wrote in an e-mail message to a colleague that was distributed to employees through an internal company mailing list.

Microsoft is intently poring over Google's portfolio of patents, hunting for potential vulnerabilities, Mr. Schmidt contended. And because Google is running its business using Linux - the free open source software that has become the biggest challenger of Windows - Microsoft is concerned that it may be at a competitive disadvantage. "Based on their visceral reactions to any discussions about 'open source,'" Mr. Schmidt wrote in his e-mail message, "they are obsessed with open source as a business model."

Get ready for Microsoft vs. Silicon Valley, Round 2.

The last time around, in the mid-1990's, Netscape Communications, another brash, high-tech start-up from the Bay Area, commercialized the Web browser, touching off the dot-com gold rush. The company told anyone who would listen that its newfangled software program would reduce Microsoft's flagship Windows operating system to a "slightly buggy set of device drivers."

As it turned out, Microsoft - based in the Seattle suburb of Redmond, far from Silicon Valley, the heart of the nation's technology industry - was listening.

Mr. Gates, belatedly waking up to the threat that the Internet posed to his business, aimed Microsoft's firepower at Netscape and flattened his rival, which was later acquired by America Online and is now a shadow of its former self in an obscure corner of Time Warner.

As a consequence, however, he brought a federal antitrust lawsuit down upon his company, raising the specter of a Microsoft breakup. In the end, Microsoft escaped with little more than a requirement that it operate under a relatively mild court-ordered consent decree.

Today, nearly everyone in Silicon Valley, from venture capitalists and chip engineers to real estate agents and restaurateurs, has begun to ask: Will Google become the next Netscape?

Mr. Gates, who for more than a decade has promised - but not yet delivered - "information at your fingertips" for his customers, has decided that the Internet search business is both a serious threat and a valuable opportunity.

The co-founder and now the chief software architect of his company, Mr. Gates readily acknowledges these days that Microsoft "blew it" in the market for Internet search. Despite his early grand vision, he displayed little inclination to deploy software that would improve the ability of computer users to find information - until he saw the dollars in the business.

THAT opportunity fell to two Stanford computer science graduate students, Sergey Brin and Larry Page, who disregarded the industry's common wisdom that search technology would become an inexpensive, marginal commodity.

While the Internet's dominant companies fought one another over Web portals, the promise of e-commerce and access to providers like America Online, Google developed a speedy search engine that soon became almost a universal first step onto the Internet. It displaced earlier search engines because the technology invented by Mr. Brin and Mr. Page did a measurably better job in returning results that satisfied Web surfers' requests.

As a result, Google now has an immense number of users, with 200 million searches on an average day. That gives it a great advantage over its competitors, which are now trying to catch up.

"The system that has the most users benefits the most," said Nancy Blachman, a computer scientist and author of an independent guide to using Google (www.googleguide.com). "Microsoft faces a tremendous challenge because Google fine-tunes its system by watching how users adjust their queries."

But Google has done more than develop a smart new technology. Unlike many dot-com flameouts of the 1990's, it has also figured out how to turn it into a highly profitable business. The company demonstrated that focused ads based on key words related to Web surfers' search requests are the most effective form of online advertising.

That has ignited a three-way battle among Microsoft and its two Silicon Valley rivals: Yahoo, based in Sunnyvale, Calif., and Google, whose headquarters are nearby, in Mountain View. Underscoring the importance of search engines to Internet advertising, Yahoo recently said it planned to end its exclusive reliance on Google for search results and had established its own research lab to try to cut its new rival's lead.

Google's financial success is clear. In 2001, the company had virtually no revenue; in the past year, it recorded sales of almost $1 billion and profits of about $350 million, according to several executives familiar with the company's private financial figures.

As for Microsoft, its executives have already begun boasting about sharp revenue growth from Internet advertising from its MSN partnership with Overture, now a Yahoo division, which also pioneered Web search advertising. In its second fiscal quarter that ended on Dec. 31, Microsoft reported $292 million in online advertising, an increase of 47 percent from the corresponding period a year earlier. The company has said that its overall online advertising revenue, which includes sources beyond search ads, reached $1 billion in the past year.

Later this year, Microsoft is expected to unveil its own search technology, which Mr. Gates says will help Microsoft catch up with Google. Last week, Microsoft released a test version of a special set of software buttons for its browser designed to direct users to its MSN search and related services. For Google, though, the greater threat is that Microsoft will decide that Internet search, like the Web browser before it, should be an integral part of future versions of the Windows operating system.

For the moment, though, Google's lead seems formidable. Last year, Rick Rashid, a Microsoft vice president in charge of the company's research division, came to its outpost in Silicon Valley to give a demonstration of an experimental Microsoft Research search engine. Shortly afterward, however, Mike Burrows, one of the original pioneers of Internet search at Digital Equipment who later helped design Microsoft's experimental search engine, quietly defected. He joined Google.

But even if it can protect its technological lead, will Google still succumb to Microsoft's marketing muscle?

Google shares the intense Silicon Valley work ethic that characterized companies like Netscape. Its new headquarters, on a spacious campus once occupied by SGI, a computer maker, are just across the freeway from Netscape's original base.

But many veteran Silicon Valley executives are skeptical about Google's ability to hold its corporate culture together once it goes public later this year. The initial public offering, much anticipated, is expected to create hundreds of instant multimillionaires among its regular employees, but will leave many others hired as contractors without significant gains. As a result, some people fret that Google is fostering a class society in its ranks.

So far, though, the disaffection is limited largely to the company's Adwords business, which is aimed at creating and placing its focused search advertising. That operation has grown rapidly with temporary workers. "The Adwords environment is brutal," one Google executive said.

Clearly, though, keeping its ebullient esprit de corps so robust after the I.P.O. will be difficult, say those who have gone through similar roller-coaster rides in Silicon Valley.

"The challenge Google faces is figuring out how to retain a high rate of innovation" in the face of a disruptive event like the I.P.O., said a former Netscape executive, who also worries that the two young founders, for all their brilliance, may not fit well into the kind of management team needed to run Google as a fast-growing public company.

Although Google has clear vulnerabilities, Microsoft is seen in Silicon Valley as a powerful but not particularly creative competitor. Beyond its core business in Office and Windows, Microsoft has no major recent successes to point to - but it has a growing list of disappointments. These include its Xbox video game player and Ultimate TV set-top box.

In other words, rivals have fought Microsoft and lived to tell about it. "At TiVo, we managed to stare down that $40 billion barrel," said Stewart Alsop, a venture capitalist who helped finance the creation of TiVo's digital video recorder, which allows TV viewers to easily record hours of video programming for viewing at other times. "We dodged that particular bullet," Mr. Alsop said, when Microsoft "shut down Ultimate TV and got out of the business."

Other executives who compete with Microsoft said Google's position might be more defensible than Microsoft executives believe.

"The good news for Google is that what they do has many branches," said Rob Glaser, the chief executive of RealNetworks, which competes with Microsoft in the software for playing video and digital audio on personal computers. "It's not easily replicable in one step."

OTHERS say that even though the Justice Department consent decree is weak, it may still be enough of a barrier to prohibit Microsoft from making Internet search an integral part of the operating system in the same way it absorbed the Web browser.

"They can't undercut Google on price, and I don't think they can get away with integrating search," said S. Jerrold Kaplan, an industry executive who competed against Microsoft while at Lotus, the spreadsheet maker that is now part of I.B.M.

As it prepares its public offering, Google is trying to avoid Netscape's fate by remaining focused on its own measures of customer satisfaction. On computers at Google headquarters, the home page constantly displays a graph reflecting how well Google does on searches, compared with its competitors. Even the slightest dip in performance creates alarm, a company executive said.

Google has also brought in a Silicon Valley veteran, William V. Campbell, the chairman of Intuit, to serve as a consultant. His gospel for Googlers, as employees refer to themselves, is this: Ignore Microsoft's impending arrival as a competitor and focus on the customer.

Good luck. Microsoft has already begun a recruitment campaign aimed at demoralizing Google employees, several Google executives said. Microsoft recruiters have been calling Google employees at home, urging them to join Microsoft and suggesting that their stock options will lose value once Microsoft enters the search market in a serious way.

"Our approach has been to seek out the best and brightest talent," said Lisa Gurry, a lead product manager at MSN. "Beyond that, I can't add anything."

Google executives also say they believe that Microsoft is systematically pursuing Web sites downgraded by Google, which punishes companies for trying to manipulate their rankings. Microsoft, they say, is striking partnerships with those unhappy Google customers.

Microsoft is currently relying on Overture for its paid search listings, Ms. Gurry said.

But Google is hardly standing still. As Mr. Gates himself has acknowledged, it has marshaled a remarkable collection of technologists. They are focused both on keeping the company's lead in search technology and on developing a range of new services.

To help that work, Google has been quietly developing what industry experts consider to be the world's largest computing facility. Last spring, Google had more than 50,000 computers distributed in over a dozen computer centers around the world. The number topped 100,000 by Thanksgiving, according to a person who has detailed knowledge of the Google computing data center. The company is placing a significant bet that Microsoft will be hard pressed to match its response time to the ever-increasing torrent of search requests.

Besides the additional computing firepower, Google has a wide-ranging list of new services that it will roll out as competition with Microsoft and Yahoo dictates. For example, it recently introduced Orkut, a social networking service intended to compete with Friendster, LinkedIn and others. Still under wraps is an electronic mail service that will have an advertising component.

The company has also been pushing hard to find new sources of information to index, beyond material that is already stored in a digital form. In December, it began an experiment with book publishers to index parts of books, reviews and other bibliographic information for Web surfers.

And Google has embarked on an ambitious secret effort known as Project Ocean, according to a person involved with the operation. With the cooperation of Stanford University, the company now plans to digitize the portion of Stanford's vast library collection published before 1923, which is no longer limited by copyright restrictions. The project could add millions of digitized books that would be available exclusively via Google.

ON the marketing side, the company is racing to build its strengths overseas. Wayne Rosing, vice president for engineering at Google, has been chosen to travel the world, weaving the company's search engine into local economies and local technologies. It is concentrating initially on 12 countries.

Mr. Page, the Google co-founder, is even trying to persuade Mr. Schmidt, the veteran Silicon Valley executive recruited from Novell Inc. to run Google, and others in the company to market a phone with a built-in custom personal digital assistant intended to let Web surfers use Google from anywhere.

For all of Google's hyperactivity, there is still a lingering sense among many Silicon Valley veterans that they have seen this movie before. The company may not have Netscape's arrogance, but it is still not clear that all of its clever marketing, technology and brand identification can withstand Microsoft's onslaught when it arrives.

After all, just as Silicon Valley has learned from some of its errors, so has Mr. Gates. In Davos, Mr. Gates ruefully acknowledged that Google "kicked our butts," reminding him of what Microsoft itself was like two decades ago.

"Our strategy was to do a good job on the 80 percent of common queries and ignore the other stuff," he said. But "it's the remaining 20 percent that counts," he added, "because that's where the quality perception is."

He promised not to make that mistake again.

Posted by jheer at 05:19 PM | Comments (0)