Below is the text Barney sent us of his talk, which he said he was going to post in his blog, but I couldn't find it online. (no editing has been done)
Bringing NLP to Market: 30 years in the making
By Ron Kaplan and Barney Pell
For Danny Bobrow’s Festschrift, March 26, 2008
Barney gave the talk on behalf of himself and Ron Kaplan, who was in Turkey at the time of this event.
I first met Danny almost 40 years ago, in 1969, when I was a first-year graduate student and he was heading up the AI Center at BBN. I was hired to develop an English grammar within the Augmented Transition Network system that Bill Woods had created. And it was my understanding of the benefits and disadvantages of that grammar and framework that eventually led to many of the concepts and insights that underlie today's practical NLP technologies (hierarchical attribute-value structures, Lexical Functional Grammar...). So Danny was there at the beginning.
In fact, he was there before the beginning. Most people in AI are aware of the Student program that he developed for his thesis, one of the milestones in the early history of the field. But what has been mostly forgotten is a seminal paper from the late 60's that he wrote with Bruce Fraser. It was a very brief paper, but it outlined the basic ideas of network grammars that evolved into Woods's ATN, which evolved into LFG, which evolved into...(Powerset)
This early example illustrates one of Danny's wonderful qualities, his ability to cut to the core of complex problems and to find the essence of a solution. I've seen this many times over the years, how he is able to abstract from the details of a situation and figure out what really matters and what doesn't, what two problems have in common and how they differ, which directions are promising and which will dead end. This has always impressed me, but I have to say that it has also annoyed me at times. I can remember several occasions when I was struggling to overcome a really hard obstacle. I would have a whiteboard full of diagrams and formulas, things crossed out and overwritten, marked up in different colors--basically, totally confused. Danny would come by, pop his head in, take a quick look, and say something like: "Oh yes, the answer is 5". And he would be exactly right: the answer was 5. Damn! The annoying thing was that he could come up with the solution without experiencing the pain of the problem itself--and I needed someone to share the pain just as much as I needed the answer.
Barney: I got to see this first-hand, and it’s funny how many people make the same observation. I have also had some people, including Ron, make a similar observation about me. So if I may be annoying at times, at least I have in Danny a good role model that shows this is ok… at least if you’re right.
Ron: Danny left BBN for Parc in the early 70's (and in the early days of Parc), and he set about creating a Language Understanding group. I was still a graduate student (finishing my thesis "any day now"), and I was lucky that he invited me to interview and to become a member of the group (I arrived about 2 years after I accepted the job, still about to finish my thesis "any day now"--he pushed (he's very good at that) and I did finally finish!). He also attracted other key people to work in and with the group--Martin Kay, Terry Winograd, and later Don Norman, Richard Fikes, Ira Goldstein, and others.
Those were great years. We were excited about integrating existing technologies and inventing new ones to build a real language-understanding system--we were smart, we were well funded, we had great computing infrastructure, how hard could it be? We pulled everything together in what was (and what still may be) a state-of-the-art dialog system, called GUS (Genial Understander System). If you wanted to take a trip from Palo Alto to San Diego, GUS was the reservation agent you wanted to talk to--best fares and most convenient schedule on that route!
Barney: It is interesting to note Ron’s statement that a system built 30 years ago was, and still may be, a state-of-the-art NLP system. For how many fields is that even possible? I worked at SRI in NLP in the 80’s and had the fortune to be part of another system that was state of the art in its time, and which might still be a state-of-the-art NLP system today. I saw the potential of NLP to change the user experience, but I also saw that it was going to take a long time before it was really ready. I decided to go work on other things for a while (say, 20 years), and come back when hopefully the technology could be ready for broad deployment. Somewhat surprisingly, that’s pretty much what happened.
Ron: That was our first attempt at integration. What we realized is that there was more work to be done on the individual components, so we decided to back away from integration until we got better individual pieces. Martin and I worked on syntax--and LFG and the concepts of unification grammar came out of that--while Danny and Terry focused on knowledge representation and processing, producing one of the earliest knowledge-representation systems. We decided we would make another run at a full-fledged system as soon as we locked down the pieces.
And that's what we did. The only thing, as everyone knows, is that it took a little bit longer to get the pieces done than we had thought. And there were shifts and changes: I forked off a separate group, Parc's Natural Language Theory and Technology area, to work on the language side, while Danny continued on the knowledge side with Mark Stefik and others. And oh, we all worked on building the Lisp system and launching Xerox's AI Lisp machine business.
We made a lot of progress. Every now and then, over the years, Danny and I would get together for a brief status check--are the pieces done, are we ready to integrate? The answers were always No and No, but Soon and Soon. This went on for 25 years or so--and some in management might have heard the answers as Never and Never.
Barney: Most people in the field gave up during this period. During the nineties, there was an AI winter, and really a Semantic winter, during which most people just stopped working on semantics. Instead, statistical approaches emerged, and a generation of people entered the field who did not know or care about deeper knowledge or linguistics. Many people now think the deeper approaches were tried and failed. But the trying was still in progress, and since the project hadn’t been finished, concluding midway through that it had failed was premature. This room contains perhaps 1/3 of all the people who stuck around for the long haul. Luckily the people here did not give up.
Ron: But we were actually advancing, and in fact, 7 or 8 years ago, the answer changed. We were ready--we agreed that finally the integration experiment was worth running again, there was a good chance that we could create something useful, and not simply learn that more work was needed.
And that's what we did. We defined a common project for the knowledge and language groups (extracting knowledge from textual documents, with an eye towards detecting entailment and contradiction). It was challenging and fun, required a lot of strategic thinking (Danny is great at that), required picking and organizing the right kinds of expertise (Danny was always great at that too).
And how did this chapter (at least) end? Powerset.
Barney: I knew Danny from his days as an advisor at NASA, where he helped us think about human interaction with intelligent systems. I ran into him again at the AAAI conference in July 2005, at a time when I was developing plans for a natural language search engine. Danny told me the story of 35 years of concluding PARC NLP wasn’t yet ready, and then finally concluding it was ready. You almost never hear about researchers working on something for such a long period of time at all. Much less do they have the humility and perspective to say that this is going to be really hard and take a long time, and that despite progress it continues to require more fundamental work. It then seems completely unheard of for such researchers to finally come back and say it now is ready. Knowing that this story came from the PARC team, given the pedigree and history in NLP, I followed up immediately. Danny connected me with Ron, and for better or for worse (remains to be seen, in short order), the combined linguistic and semantic technologies seemed just right for the search application.
Ron: The story is still unfolding, and Danny is still in the thick of it. 40 years ago, or even 10 years ago, or 5, it wasn't clear what the killer app for natural language technology would be. But we all believed that somewhere, somehow we would find a pony. And it wasn't clear how we would structure things to go after the application when it became apparent. We didn't predict Powerset or search, but here we are, just about to launch a broad-scale search engine--an engine based on the deep NL and KR technology that emerged from all those years of research. What a ride! And what a great collaboration!
And search may be just the beginning. In the back of our minds, and with Powerset's success, there is still the possibility of building that Genial Understander System, what we would now call a Conversational User Interface (a CUI).
GUS2--how hard could it be? Let's get started...