"Make it possible for programmers to write in English and you will find the programmers cannot write in English."<p>I teach computer science and have a particular fondness for introductory CS. The reason Stephen Wolfram is wrong, wrong, wrong about this is that people that have never been taught programming can't express themselves precisely enough in their native language, either; and even among those of us that have been programming for decades, when we express ourselves in natural language we <i>can</i> be very precise but it takes a lot more work and becomes a lot more unwieldy than just writing out our instructions in [pseudo]code.<p>CS educators have been wishing for a long time that "intro to CS" didn't equate to "intro to programming". And it doesn't have to, not quite, but the reason it always seems to revert there is that the prerequisite for <i>every other thing in CS</i> is, not programming itself, but a certain precision of thought that is easiest to teach just by teaching students to program. In a programming language. Because if you try to make them write out instructions in a natural language, and you notice that they aren't being precise and therefore deliberately misinterpret that instruction, they just think you're being a dick about it. They sometimes even think this if you honestly misinterpret them. (This is true even in a non-CS context.)<p>Saying that we will soon be "generating code from natural language" is, at best, misleading. It implies that people who couldn't learn a programming language will be able to program, which is quite untrue---I promise that with the possible rare exception of a few pathological edge cases, when people can't learn to program, the language is the least of their problems. 
And for those of us that <i>can</i> and <i>do</i> learn programming languages, all but the simplest sorts of programs will probably be easier to write in a programming language (which was designed for that sort of thing) than in a natural language (which was not).<p>(And holding up Mathematica as an exemplar is particularly egregious; it is so loaded with syntax that "just works" that you need to either have a deep familiarity with traditional mathematical notation or else a degree-level CS background in programming language theory if you want to have a good shot at learning the language in anything more than a pattern-matching fill-in-the-blank way.)
This is in the class of "demoware", projects that are easy to program fancy demos for but are very difficult to bring to production status. (See also: "fully visual programming".) It's only really interesting if they escape from that. We'll have to wait and see.
I would be fascinated to see several hundred years down the road how natural languages and computer languages have commingled and evolved into something new. I'd be inclined to believe that bringing natural language to computers won't just be a one-way street.<p>You already see this in places like Hacker News here, where people often use constructs like "s/thing/other thing/" because it's more concise and useful than writing out the natural language version.
While it would be neat to give the power of programming to everyone, I'm not convinced coding in a natural language would necessarily be better/easier than writing code in Ruby or Lisp or Python.<p>Sure, you eliminate the first big hurdle in programming, but learning the syntax of a programming language is usually one of the easier parts of software development.
Awesome, Stephen Wolfram has duplicated Terry Winograd's 1971 PhD thesis, which ran in 256K of memory on a PDP-6.<p>Winograd: <a href="https://hci.stanford.edu/~winograd/shrdlu" rel="nofollow">https://hci.stanford.edu/~winograd/shrdlu</a>
Wolfram: <a href="http://blog.stephenwolfram.com/data/uploads/2010/11/conesphere_11.jpg" rel="nofollow">http://blog.stephenwolfram.com/data/uploads/2010/11/conesphe...</a>
Well, since it isn't possible (natural language isn't precise enough to communicate efficiently even with other humans that share the same hardware as you, much less with an unthinking automaton such as a computer), that would put the time frame at around never. If it ever actually happens, that would by definition of 'never' be sooner than I think, so all he has to do is actually accomplish it instead of talking shit his entire life, and he'll have proved his statement correct.<p>Maybe he'll call it 'A New Kind of Programming Language'.
Sure, right after physics is adequately expressed in natural language.<p>There's a reason physicists express their concepts in mathematics, and that's because math is the language humans devised to express those things, having found natural language inadequate.<p>Programming is similar in that regard.
This is clearly demoware, good enough to impress the general public. Going from "Draw a red circle" to "Draw a red circle overlapped one third with a blue square with white dots over the lower left corner" would be real progress. And then I wonder if this will also work: "Paint a white-dotted blue square that intersects over a third of a red circle in the lower-left corner." Or will it give a "compile error"?<p>Why not just create a DSL (e.g. in Scala) with a simple standardized NL-like syntax that can give meaningful "compile errors"? There is no need to impress the general public.
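To make the idea concrete, here's a minimal sketch of such a DSL (in Python rather than Scala, purely for illustration; the grammar, the `parse` function, and the error message are all made up): a fixed NL-like grammar that fails with a meaningful error instead of silently guessing.

```python
import re

# Hypothetical fixed grammar: "draw a <color> <shape>".
# Commands outside the grammar produce a meaningful "compile error".
GRAMMAR = re.compile(r"draw a (?P<color>red|blue|green) (?P<shape>circle|square)$")

def parse(command):
    """Parse an NL-like drawing command, or raise a descriptive error."""
    match = GRAMMAR.match(command.strip().lower())
    if match is None:
        raise SyntaxError(
            f"cannot parse {command!r}: expected 'draw a <color> <shape>'"
        )
    return match.group("color"), match.group("shape")
```

The point is that a small, standardized grammar gives the user a predictable contract and an actionable error, rather than the take-a-guess behavior of a full natural-language front end.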
Two reactions to this piece.<p>1. I'm not sure what problem so-called natural language programming is trying to solve.<p>2. Though I admire this man's building of Mathematica and the company that sells it, I'm generally not a fan of what I perceive as his history of "discovering the obvious" and self-promotion. Or rather, rediscovering things or making it sound like he invented them or came up with them for the first time. Cellular automata and their implications in his book "A New Kind of Science", and now this piece, sound like more of the same. I give him a little slack because he's in business and so there's the self-promotion angle, but not much.
First off, did he really have to plug ANKS? Honestly, it seems like every piece of writing I come across from him has to mention it.<p>Secondly, I think programming will eventually have to become more mainstream, and this will be done through some form of natural language interface. Programming is, at its best, a tool for solving problems. It's far too limited now, with only an elite group controlling its use.<p>Eventually programming will have to become a tool as common as mathematics. The problems we face will only continue to grow in complexity, to the point where the only way to get a handle on them is through automated computation. The user will need the instant, iterative feedback that only a self-made program can provide. For that to happen, the interface to programming will need a radical change.
This will end exactly like Infocom adventures. They were mostly nice, but quite often you hunted for the exact phrase the parser understood (though to be fair, that happened more often in Magnetic Scrolls games).<p>But would you call using a text adventure ("put blue ball in red box") programming?
I think that the title should be: Generating code that can be described in a short phrase is closer than you think.<p>We can create a database of the different phrases that people use to suggest a command. For example:<p>"Calculate the limit", "find the limit" ... all of these map to Limit, and so on.<p>To sell a product, this kind of ability is well received. It's like telling your telephone: "Please call this number for me, the number of my friend Alfred," and the computer looks up Alfred in its database and connects to that number. That is not deep AI, but it is a useful trick for selling products.
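The phrase database described above can be sketched in a few lines (Python; the phrases, command names, and `canonical_command` helper are hypothetical, not anything from Wolfram's system):

```python
# Hypothetical phrase table: several synonymous user phrasings all map
# to one canonical command name.
PHRASE_TABLE = {
    "calculate the limit": "Limit",
    "find the limit": "Limit",
    "compute the limit": "Limit",
    "call this number": "Dial",
    "phone this number": "Dial",
}

def canonical_command(phrase):
    """Return the canonical command for a phrase, or None if unrecognized."""
    return PHRASE_TABLE.get(phrase.strip().lower())
```

No deep AI involved: it's a lookup, which is exactly why it works reliably for the short, common phrasings a demo (or a product pitch) tends to use.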
Sure, by the time you enter all the specifics of the Circle, you've got something more unwieldy than a succinct programmatic description.<p>But for most people unfamiliar with Mathematica syntax, typing, "draw a red circle" and having the computer choose sensible (or any) defaults yields a template of the exact code one would have needed to type. Which saves Googling or reading the manual for circles, and teaches the syntax in a very natural way.
Not the same thing, but similar, from 1983:<p>"A natural language query system for a prolog database" (Hadley)
<a href="http://ir.lib.sfu.ca/handle/1892/7254" rel="nofollow">http://ir.lib.sfu.ca/handle/1892/7254</a><p>Not general code generation I suppose, but there's been quite some work on natural language to database queries systems in AI research elsewhere.
Natural language can lead to contradictions and ambiguities. Just look at that Star Trek episode where they supposedly brought down androids built by a very advanced ancient race just by saying stupid and self-contradictory things.<p>Which, by the way, was a stupid premise, but hey :)
A programmer can code in English (though a programming language is easier, more concise, and easier to understand) or in a programming language.<p>A non-programmer cannot write code, neither in English nor in a programming language.
It's over-hyped as usual, but this looks like it could evolve into a handy way of generating snippets of working code that you can then modify. Think of it as an alternative to doing a Google search for a useful example to start from, or a better kind of Rails scaffolding.
I'm suspicious of the usefulness of such an approach.<p>What's easier: "5 + 10", "five plus 10", or even worse, "five added to ten"?<p>Dijkstra had an article about this, titled: On the foolishness of "natural language programming"[1].<p>[1]: <a href="http://www.cs.utexas.edu/users/EWD/transcriptions/EWD06xx/EWD667.html" rel="nofollow">http://www.cs.utexas.edu/users/EWD/transcriptions/EWD06xx/EW...</a>