December 03, 2003
Scott's musings on life and learning point (rather orthogonally) to a somewhat mind-warping piece from 2000 by virtual reality pioneer Jaron Lanier. The article was also the subject of a more substantive talk Lanier gave at Berkeley last year, in which he attacks the reliance of current computing and information-theory paradigms on shared, predetermined protocols, and instead advances pattern matching and the spatial organization and processing of information as a means of establishing successful communication, or "discovering" protocols in situ.
The paper is ultra-speculative, and imho a bit unfair to the fathers of computing, but thought-provoking nonetheless, as are the reader responses... though Dylan Evans, once again, misses the point. I'm left wondering, though, if some level of base protocol is not only desirable but unavoidable. In our world, at the lowest level (gotta be careful with statements like this, but I'll do it anyway), isn't this the laws of physics and chemistry? And we certainly use protocols (language, social protocol, etc.) to navigate daily life, though they're clearly of a much more flexible form than current computing interfaces, and, more importantly, they aren't necessarily ingrained but learned. Lanier's thesis might then be interpreted as the claim that "soft" methods can give rise to protocols but not vice versa. I'm not sure I would accept that (though, to be fair, I'm not sure that's what's being advanced either). Without some base-level, fundamental established protocol (even if very low-level), it seems it's turtles all the way down.
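To make the contrast concrete, here's a toy sketch (entirely my own illustration, not anything from Lanier's article — the message format and function names are invented) of the difference between a rigid, predetermined protocol, where any deviation is a hard failure, and a "softer" approach that matches incoming messages against known patterns by similarity:

```python
import difflib

def strict_parse(message):
    """A rigid, predetermined protocol: the sender and receiver must
    agree in advance on the exact format, and any deviation fails."""
    if not message.startswith("HELLO "):
        raise ValueError("protocol violation")
    return message[len("HELLO "):]

def fuzzy_match(message, known_greetings=("HELLO", "HELO", "HI")):
    """A pattern-matching alternative in the spirit of Lanier's 'soft'
    methods: recognize the message by similarity to known patterns,
    tolerating noise and variation instead of demanding exactness."""
    token = message.split(" ", 1)[0]
    best = difflib.get_close_matches(token, known_greetings, n=1, cutoff=0.6)
    return bool(best)
```

Here `strict_parse("HELO world")` raises an error, while `fuzzy_match("HELO world")` still recognizes the greeting — which is roughly the kind of error tolerance the essay is gesturing at, though of course at a vastly simplified scale.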
Posted by jheer at December 3, 2003 01:55 AM
just read the comments. fascinating exchange. once again, i'm in awe of lanier's graceful ability to shoot down his critics with respect when they fail to criticize what he's really saying. i recall a similar thing happening when he wrote his Half a Manifesto.
i was at his talk at berkeley, too. in the "turtles all the way down" vein, i asked him if it weren't possible that there are potentially infinite levels alternating between linear protocols and spatial recognition, each one recasting the previous. fuzzy quantum actions become deterministic laws of physics become chaotic processes become cosmological variables become god. something like that. he seemed to like that idea.
evans did miss the point, and with unnecessary egotism. i think steve grand hit on the real point: in the end, it comes down to metaphor, not efficiency and technical feasibility. it's about how we as a culture of human technologists can recast our conceptual notion of, in effect, interprocess communication such that WE can understand it and scale it, and so can the machines, with plenty of room for error.
I agree, it is more speculation than anything, but interesting nonetheless.
In contrast to your response, I was much more sympathetic to the readers who "missed the point" than to the writer who simply pointed out that grepping the original text for certain words yielded no results.
I won't say that this is hypocritical, but part of writing as a human has to do exactly with this non-linear communication problem, this issue of the learnability of interprocess chit-chat.
When one writes, I don't think one should fault one's readers for "missing the point" and seeing things that simply "are not there". In fact, perhaps that's the whole point of writing and of IPC on a broader scale, this kind of "metered ambiguity" that I like to speak of.
I won't fault technologists for believing that there exists such a thing as "what I was really saying"; it's not endemic to techies per se. But I do hope that my own work (to come) will bring us closer to some of these goals. So who knows, maybe phenotropic computing is not too far away :)