Something about this whole thing made me think of how prompts are similar to laws and contracts.<p>Just like with prompts, these are never really followed exactly either (in the way that we never read the full TOS), and are mostly there as guidance, as well as justification for punishment/recourse (or at least exemption of responsibility) if not followed.<p>The tweet about prompts accumulating technical debt also mirrors similar thoughts I have about law - it's very easy to add another law covering special cases, but how all these special cases interact with each other is often unpredictable (the simplest example I think is unintentional tax loopholes).<p>I'm not sure I have a point here, but this parallel must say something about human cultural and societal structures and how writing/text shapes them. What a fascinating trend to observe.
Does Gary do any AI research these days, or is he a full-time Tweet-reactor?<p>It's got to be exhausting having to write about why each new mind-blowing advance in deep learning isn't actually impressive. The best recent example was Sora. His financial backers would've absolutely creamed themselves if his line of work had produced that, but he was nitpicking this insane development with a magnifying glass.
Prompts are horribly non-deterministic, that's true;
However, programming <i>through</i> English is the perfect input mode:
if I tell the bot "convert these files to this format," that's the most convenient way to express my intent, and often the chatbot is even smart enough to guess what I really mean and fill in what was left ambiguous. If the produced code doesn't do exactly what's desired, one can iterate, and maybe more importantly: see the precise reason and deterministically resolve it.<p>So yes, one still needs mathematical programming languages, but interacting with them through natural language is a huge leap forward. Plus (optimistic prediction) it will also result in a more precise usage or version of English.
Yes, well, unfortunately people are very bad at communicating in English. Humanities departments in schools have been cut to the bone, and English professor is now a part-time job you get in addition to waiting tables or working retail.<p>So the old tradition of analyzing what sentences and words mean, so that you can more clearly state your ideas, has itself been attacked as "useless liberal arts" and replaced by coding camps. So now we have a bunch of people who know how to "code" in a JavaScript framework that will become obsolete in 18 months, but if you ask them to write a short English essay they will look at you with a blank stare.
Lawyers use English (other languages are available) to express things which have to be very precise. They often end up being argued over in court. Software needs to be massively more precise. I am not hopeful.
I always think this when people talk about programming in English. Jargon, symbols, technical terms, and programming languages exist to make it easier to communicate. If you listen to the internet, the only reason mathematical notation exists is to make ivory tower elites feel smart. But there is a lot of value in assigning precise, context-free meaning to words or symbols.<p>I think there will be a wave of "code as English" tools, then people will come full circle and develop programming shorthand for interfacing with LLMs when the limitations start to present themselves.
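To make the point about context-free meaning concrete, here's a toy sketch (all names and numbers invented for illustration): the English instruction "increase the price by 10 and discount large orders" leaves every quantity ambiguous, while code forces each choice to be explicit.

```python
# English: "increase the price by 10, then discount large orders"
# Ambiguous: 10 units or 10 percent? What counts as "large"? Discount by how much?
# Code has to pin down every one of those choices, context-free:

def order_total(price: float, quantity: int) -> float:
    price += 10.0            # explicit: 10 currency units, not 10%
    subtotal = price * quantity
    if quantity >= 100:      # explicit: "large" means 100+ units
        subtotal *= 0.95     # explicit: a 5% discount
    return subtotal
```

Whatever a reader assumed the English meant, the code admits exactly one interpretation.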
Why are we not expressing our intent in programming with human-readable languages?<p>Look into computer history. Hint: Charles Petzold wrote a book about it, called Code.
Honestly, this has been predictable since the beginning of GPT-3 and Copilot. Programmers tend to overestimate how much time we spend sitting down banging on the keyboard, and management <i>severely</i> overestimates it. Even when we are typing code, much of the time we are maintaining code already written rather than creating greenfield code out of whole cloth. So a system that is by design oriented toward creating new things out of whole cloth will be missing a lot of what programmers must do.<p>GPT-4 makes enough errors when I use it to generate code that my chats with it become excessively long, with me effectively feeding it error messages until it makes something that will at least parse or compile. Having to maintain those over time seems like it would be a nightmare, much worse than the often mediocre tooling programmers are used to.<p>I'm not anti-LLM, but the hype around them has been unbelievable, and I suspect that's part of the reason for these immense layoffs over the past 18 months. LLMs make me more productive, but I would honestly be terrified to depend on a product that relied on LLMs in its critical path.
There are several ways in which something like this could be implemented, but I can't help thinking that anyone who already knows enough to give a good description might as well just write the code anyway.<p>In any case, only formal languages for me, please.