I'm skeptical about Siri. My wife got an iPhone 4S on opening weekend, and we both did the requisite amount of playing with the new feature. It was amusing, but after the first couple of days she pretty much abandoned it.

There are some reliability problems, but I think the main problem is that Siri still lives firmly in the AI uncanny valley, which is exacerbated by the way Apple presents Siri, i.e. as AI.

With a clearly defined set of commands, you can be confident about what will work and what won't. And if you try something that isn't a command, you chalk the failure up to "oh, that's not a command." But because Siri is presented as an anthropomorphized, intelligent agent, its failures feel a lot more brittle and frustrating.

For example, "What's my next appointment?" works, but "When's my dentist appointment?" doesn't. Why not? Well, I know why not, and you probably do too. It's because it's really, really, really not AI, and unless we make some kind of breakthrough on strong AI, it's not going to be for a long time. But my wife doesn't know that. All she knows is that Siri is cool when it works, but is actually pretty stupid a lot of the time. Which means Siri isn't *reliable*, and that matters because Siri is most useful precisely when it needs to be *most reliable*, like when you're in a rush.

Apple will certainly continue to add commands and make Siri smarter and smarter, but that progress will necessarily be incremental, and those failures will always feel brittle to lay users.

[edit: btw, John Siracusa talks a bit about this on the most recent Hypercritical: http://5by5.tv/hypercritical/39-quasimodo-backpack]