The most fundamental problem with natural language search engines, to me, is that the "natural language" part is more a limitation than a feature. Natural language is meant for people to communicate with other people, not with computers. I believe that a well-designed keyword/tag-based search, combined with factual auto-suggestions extracted from formal/semantic sources (similar to Wikipedia), could be far more efficient for people to use and for computers to run.
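To make that concrete, here's a toy sketch in Python of what I mean: a tag index with AND semantics, plus auto-suggestions prefix-matched against a structured fact table. All the names here (FACTS, add_doc, suggest) are invented for illustration, and the fact table stands in for a real semantic source.

    from collections import defaultdict

    # Stand-in for a structured/semantic source (e.g. Wikipedia infobox data).
    FACTS = {
        "everest": "Mount Everest (height: 8,848 m)",
        "nile": "Nile (length: 6,650 km)",
    }

    index = defaultdict(set)   # tag -> set of doc ids
    docs = {}                  # doc id -> text

    def add_doc(doc_id, text, tags):
        docs[doc_id] = text
        for tag in tags:
            index[tag.lower()].add(doc_id)

    def search(*tags):
        """AND-intersect the posting sets for all given tags."""
        sets = [index[t.lower()] for t in tags]
        hits = set.intersection(*sets) if sets else set()
        return [docs[d] for d in hits]

    def suggest(prefix):
        """Factual auto-suggestions: prefix-match against the fact table."""
        p = prefix.lower()
        return [fact for key, fact in FACTS.items() if key.startswith(p)]

    add_doc(1, "Climbing routes on Everest", ["everest", "climbing"])
    add_doc(2, "History of Nile navigation", ["nile", "history"])

    print(search("everest", "climbing"))  # ['Climbing routes on Everest']
    print(suggest("ev"))                  # ['Mount Everest (height: 8,848 m)']

No parsing of the user's sentence at all: the tags do the retrieval and the fact table does the "answering."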
The article argues that having a successful technology is one thing, and having a successful business is another. This is, of course, true, but there's a significant correlation between the two. Relational databases, Google's search algorithm, and the light bulb were all influential technologies that founded hugely successful companies.

There's a rule of thumb saying that your solution to a problem has to be 200-300% better than the existing state of the art for it to be adopted successfully without artificial help (marketing dollars, monopolies, etc.), and Google certainly lived up to that when it went live. It remains to be seen whether Wolfram Alpha will.
You can't beat Google only by developing a superior technology. You'll still be left with the Herculean task of drilling your search engine deep into the minds of hundreds of millions of users around the world. Google the brand is far mightier than Google the technology. It's Google's redoubtable omnipresence and visibility that make it tick.
No. Better search does mean beating Google.

But you don't have to play within Google's rules.
There's lots of territory between text search [=cool] and natural language [=sucks, to a first-order approximation].

For example, just treat data as a graph of tagged pieces of text and give it a good web interface; see the sketch below. Bypass all the RDF and semantic web hype and just make something workable and usable. A wiki for data.

Anyone looking for a co-founder? I'm working on this.
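Something like this, as a back-of-the-envelope sketch (the class, tags, and edge labels are all made up; nodes are just tagged text, edges are plain labeled links, no ontology anywhere):

    from collections import defaultdict

    class Node:
        """A tagged piece of text; edges are labeled links to other nodes."""
        def __init__(self, text, tags):
            self.text = text
            self.tags = set(tags)
            self.edges = defaultdict(set)   # label -> set of linked Nodes

        def link(self, label, other):
            self.edges[label].add(other)

    def find(nodes, tag):
        """All nodes carrying a given tag."""
        return [n for n in nodes if tag in n.tags]

    paris = Node("Paris", ["city", "france"])
    france = Node("France", ["country"])
    paris.link("capital-of", france)

    print([n.text for n in find([paris, france], "city")])   # ['Paris']
    print([o.text for o in paris.edges["capital-of"]])       # ['France']

The whole "schema" is just whatever tags and edge labels people type in, exactly like a wiki; the web interface's job is browsing and editing that graph.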