Since I started coding with Cursor, I’ve spent a lot of time thinking about how much computing interfaces have changed in such a short period of time.<p>Just as we saw a paradigm shift from imperative to declarative interfaces, LLMs have opened the door to new kinds of interfaces.<p>Intent-driven interfaces are already everywhere - from AI assistants processing raw data, to Arazzo for defining how APIs map to human-scale workflows, to agents capable of handling complex tasks.<p>While the doomer view is “robots will take all our jobs”, my optimistic take is that we finally have a human-centric way of interacting with machines, and it’s going to supercharge our abilities.
<a href="https://www.paradigm.xyz/2023/06/intents" rel="nofollow">https://www.paradigm.xyz/2023/06/intents</a><p>Check this out
If I understand correctly, this means mapping a chat to a choice of clear API calls. I buy it.<p>The only issue I can think of is the lack of reporting without asking. When I log in to a site, I already want to see the most relevant information without having to ask for it. If I exclusively have to ask a chatbot for basic info, I may miss out on a lot of what I need to know.
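To make that concrete, here is a minimal TypeScript sketch of "chat mapped to a choice of clear API calls". The Intent type, resolveIntent(), and the endpoint paths are hypothetical placeholders; in a real system an LLM (e.g. via function calling) would pick the intent and fill in its parameters rather than the regex stub used here.

    // Minimal sketch: a chat message maps to one of a small, fixed set of API calls.
    // Intent, resolveIntent() and the endpoint paths are hypothetical placeholders.
    type Intent =
      | { kind: "getBalance"; accountId: string }
      | { kind: "listTransactions"; accountId: string; limit: number }
      | { kind: "unknown"; originalText: string };

    // Placeholder: a real system would ask an LLM to choose one of the declared
    // intents and fill in its parameters, instead of matching keywords.
    async function resolveIntent(message: string): Promise<Intent> {
      if (/balance/i.test(message)) return { kind: "getBalance", accountId: "default" };
      if (/transaction/i.test(message)) return { kind: "listTransactions", accountId: "default", limit: 10 };
      return { kind: "unknown", originalText: message };
    }

    // Each intent resolves to exactly one well-defined API call.
    async function handleChat(message: string): Promise<string> {
      const intent = await resolveIntent(message);
      switch (intent.kind) {
        case "getBalance":
          return `GET /accounts/${intent.accountId}/balance`;
        case "listTransactions":
          return `GET /accounts/${intent.accountId}/transactions?limit=${intent.limit}`;
        case "unknown":
          return `No matching API call for: "${intent.originalText}"`;
      }
    }

    handleChat("what's my balance?").then(console.log); // GET /accounts/default/balance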
Is this basically how to do GUIs with functional programming, and someone saw the light and it's now the future?<p>People are always getting hung up on "how do you make a GUI if variables are immutable?".<p>It takes a bit of getting used to, but once over the hump, GUIs seem easier.<p>See TypeScript, Fabulous (F#), Elm, etc.
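For anyone wondering what "a GUI with immutable variables" looks like in practice, here is a minimal Elm-style model/update/view sketch in TypeScript. It is not any particular framework's API, just the pattern: update() returns a fresh model instead of mutating the old one, and the only assignment is swapping in that new model.

    // Minimal Elm-style model/update/view loop: state is never mutated,
    // update() returns a brand-new model each time.
    type Model = { count: number };
    type Msg = { type: "increment" } | { type: "decrement" };

    const init: Model = { count: 0 };

    // Pure: (old model, message) -> new model
    function update(model: Model, msg: Msg): Model {
      switch (msg.type) {
        case "increment": return { ...model, count: model.count + 1 };
        case "decrement": return { ...model, count: model.count - 1 };
      }
    }

    // Pure: model -> what to render (a string here, instead of a real DOM tree)
    function view(model: Model): string {
      return `Count: ${model.count} [+] [-]`;
    }

    // Tiny runtime: the only "mutation" is swapping in the model update() returns.
    let current = init;
    function dispatch(msg: Msg): void {
      current = update(current, msg);
      console.log(view(current));
    }

    dispatch({ type: "increment" });
    dispatch({ type: "increment" });
    dispatch({ type: "decrement" }); // Count: 1 [+] [-]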