I'm confused by the website. The intro has a clear example that reads like natural language processing, but the FAQ goes out of its way to stress
that the language does not do NLP. At that stage it gets a bit ranty, vague and dense, and I kind of lost interest. Perhaps I'm not smart enough to get what they're trying to do. "Developing a domain-appropriate lexicon and phraseology". Is this a DSL? Regardless, I don't see this setting the world on fire.<p>It's interesting that they've called this paradigm Articulate Programming, because articulation of the domain is where the problem both starts and ends.<p>How many times have you worked in a company with staff who start off exasperated with how complex IT makes solving a business problem, only to be surprised at just how many details are in their day-to-day processes once you've spent time covering all the edge cases and writing tests around exceptions.<p>Code becomes complicated because the domain it models <i>is</i> complicated. Hence a good engineer's most important skill is gaining an understanding of the real-world problem domain and expressing it as code. And also why I'm not worried about AI taking my job any time soon.
While I believe 'plain English' type programming languages are a great concept, the reality is that they introduce far more ambiguity into the mix, and you are left trying to guess at the language designer's vagaries.<p>I came across this years ago when trying to get to grips with the then-newfangled Ruby language. I kept having to go back to the documentation to remember the right way to convert a string to all uppercase... was it:<p><pre><code> str.upper
str.uppercase
str.ucase
str.upcase
str.capitalize (<- Don't even get me started on the regional differences of 'ise' vs 'ize' between US and UK English variants)
???
</code></pre>
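For what it's worth, Python settled on yet another set of names for the same operations, which rather proves the point about guessing the designer's vagaries:

```python
s = "hello world"

# Python's names for the operations Ruby spells differently:
print(s.upper())       # uppercases every character (Ruby calls this upcase)
print(s.capitalize())  # uppercases only the first character
```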
Even here, I would probably start using Avail, then in a few weeks I would be scratching my head and asking, was it:<p><pre><code> Print 1 to 10, as a comma-separated list
</code></pre>
or<p><pre><code> Display 1-10, in CSV format</code></pre>
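For comparison, either phrasing collapses into one short, unambiguous line in a conventional language; a Python sketch:

```python
# Print the integers 1 through 10 as a comma-separated list.
print(",".join(str(n) for n in range(1, 11)))  # 1,2,3,4,5,6,7,8,9,10
```

There's nothing to remember beyond the standard library's one join idiom, which is exactly the property the 'plain English' versions lose.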
IMO, the landing page of any programming language site should include some code samples demonstrating what makes the language different from the crowd.
> An infinite algebraic type lattice that confers a distinct type upon every distinct value. Intrinsic support for a rich collection of immutable types, including tuples, sets, maps, functions, and continuations.<p>Ok, I consider myself OK at type theory but I'm still lost as to what this claim actually means. And if it is what I think it is (that all values have types), I wonder how this doesn't run afoul of the decidability problems of fancy dependent type systems (perhaps 1 has a type, 2 has a type, but 1 + 2's type isn't 3?).
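If the claim really is that every distinct value gets its own (singleton) type, Python's `typing.Literal` gives a rough analogy for the worry above. To be clear, this is only an illustrative sketch, not how Avail actually works, and Python's checkers don't perform this refinement:

```python
from typing import Literal

# Hypothetical sketch of a type-per-value system: 1 has type Literal[1],
# 2 has type Literal[2]. The decidability question above is whether a
# checker can soundly conclude that 1 + 2 has type Literal[3].
def add_one_and_two(a: Literal[1], b: Literal[2]) -> Literal[3]:
    return a + b  # a real Python checker infers plain int here, not Literal[3]

print(add_one_and_two(1, 2))  # 3
```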
I think there's actually a very fundamental difference between natural and formal languages that makes this kind of project wrongheaded.<p>Formal languages, at root, have exact reference. In a programming language, a symbol ultimately refers to a block of memory, or an operation. The problems of writing a formal language are ones of trying to express a given concept when the relation between symbols and references is known, but the relationship between concept and symbol is not.<p>In natural language, a symbol ultimately refers to nothing. Its meaning is derived from context, convention, intention. As such, the relationship between concept and symbol is basically known - we know we are talking about red things when we use the word red. The relationship between concept and reference is absolutely unknown - we can never know for sure whether our concept 'red' is adequate to real red objects.<p>As such, natural languages are a poor model for formal ones. The problems are essentially different. In one, you know how the symbol 'red' relates to operations and memory. In the other, you know how the symbol 'red' relates to intention and meaning. Each has different challenges associated.
Print 1 to 10, as a comma-separated list.<p>In e.g. Scala, you can do that:<p>print( 1 to 10 mkString "," )<p>It's not 100% human language/grammar, but close (and you get auto-completion in an IDE). Why would you need another DSL?<p>(Not trying to bash Avail, nor promote Scala, just curious about its use cases)
It'd be better if the tutorials were rewritten in a more concise manner. Do you really need that many words to explain Guess The Number? <a href="http://www.availlang.org/about-avail/learn/tutorials/guess-the-number.html" rel="nofollow">http://www.availlang.org/about-avail/learn/tutorials/guess-t...</a>
I find the syntax hard to follow for the express reason that variable names and function names have no distinguishing features. If variable names were decorated somehow (a symbol, or color) it would be much easier to visually parse a function call. As is, my brain must remember exactly the (complex) names of functions and variables in scope to determine how to parse a function call. But I like the idea and am using something similar in a language I'm developing.
Here is some code from their examples page, if anyone else (like me) is interested in what the actual code looks like.<p>To be honest, I'm not sure how much clearer this is to read than, for example, Python.<p><a href="http://www.availlang.org/_examples/guess-the-number/Guess%20the%20Number.avail" rel="nofollow">http://www.availlang.org/_examples/guess-the-number/Guess%20...</a>
It looks like GitHub is active [1], but documentation hasn't kept up. The blogs don't seem to have been updated since 2014, and the links to the mailing lists are broken.<p>[1] <a href="https://github.com/AvailLang/Avail" rel="nofollow">https://github.com/AvailLang/Avail</a>
Someone smart needs to explain what an infinite algebraic lattice is, because it sounds awesome. Potentially.<p>Edit: (I just googled "algebraic type lattice" and while ymmv, I don't recommend it unless you're well versed in scary black mathic)<p>I didn't get too in depth with reading the docs, but any language that goes for non-ASCII symbols a la APL is going to be fighting an uphill battle right from the get-go.<p>Maybe it was a bit easier even for APL because there were interfaces more immersive than what we have now for non-ASCII input, especially when mixed with regular ASCII.<p>Type type type, oh wait, backslash, dropdown, there's my symbol, enter, type type type. That's not very fun. It's even less fun when you're dividing your cognition between the things you're actually trying to accomplish and the things you have to type.<p>Just my two cents, no ill will
The example at the beginning strikes me as a bit over the top.
Something like 'String.Join(",", Range(1, 10))' (pseudocode, but you get the picture) would be better, and avoid all the ambiguities of the plain English version.
I really hate programming in AppleScript, which also attempted a similar syntax, because it's in the uncanny valley of semi-structured English. It's too close to the language I speak such that remembering all the special cases (which prepositions link which operations, sentence structure, etc.) becomes really difficult.<p>I <i>like</i> some well-structured separation in my coding languages. It's not a downside for me at all.
Reminds me a bit of intentional programming, something that Charles Simonyi has been pushing for a few decades. As far as I know he might still be pushing this but I haven't seen much progress since 2002.
> But there are many career programmers who would rather say:
> Print 1 to 10, as a comma-separated list.<p>No, I would not. Don't make assumptions on behalf of others.
To be honest, this project is going the wrong direction.<p>Rather than trying to get programming languages to look like human language, we need to get human language closer to computer language.<p>By this I mean that every argument I've ever been in has turned out to either be an intrinsic disagreement about definitions (fixable, and usually we agree) or an intrinsic argument about god (probably not fixable, we will probably not agree).<p>If the average person understood the beauty of a solid (and unambiguous) definition, I dunno, world peace and rainbows and butterflies? Probably not, but I'd definitely not have to rage-quit socializing so often.<p>Still, with that said, from a purely intellectual curiosity standpoint this is neat. I hope that the general saltiness of the internet doesn't discourage the devs from working on this some more.