The tutorial of the recently posted BQN variant of APL (<a href="https://news.ycombinator.com/item?id=33180842" rel="nofollow">https://news.ycombinator.com/item?id=33180842</a>) is <i>super</i> approachable and so nicely written that I couldn't resist being drawn in - I <i>highly</i> recommend checking it out if you're even slightly "APL-curious". The first part is at: <a href="https://mlochbaum.github.io/BQN/tutorial/expression.html" rel="nofollow">https://mlochbaum.github.io/BQN/tutorial/expression.html</a>
There is a lot of development happening in the area of APL-inspired programming languages.<p>I've spent far too much time working on an APL dialect that lets you combine APL with imperative structures. I really need to document it better, though. <a href="https://aplwiki.com/wiki/KAP" rel="nofollow">https://aplwiki.com/wiki/KAP</a><p>Then there is April, a very neat version of APL implemented in Common Lisp. It allows you to mix Lisp arrays with APL arrays, giving you the best of both worlds. It's already very functional: <a href="https://github.com/phantomics/april" rel="nofollow">https://github.com/phantomics/april</a><p>And of course there is BQN, a new language that takes a lot of the good ideas from APL but also changes a lot of the symbols. It's a very nice language: <a href="https://mlochbaum.github.io/BQN/" rel="nofollow">https://mlochbaum.github.io/BQN/</a>
The only time I see APL is each year during <i>Advent of Code</i> – <a href="https://adventofcode.com/" rel="nofollow">https://adventofcode.com/</a> – APL users always pop up and manage to solve the most complex of tasks with about ten symbols versus everyone else's 50 lines of Python :-)
APL is from an era when impressively functional code that looks like line noise (another obsolete concept) was considered heroic. Other examples of such code are text-editing and text-formatting macro languages. Recall that EMACS stands for "Editor Macros" because that was its original implementation.<p>Back then, computers were so limited relative to human capability that this was still a good thing. Those days have gone. The bizarre syntax of APL has had its day. Its powerful array-manipulation primitives can easily be brought forward into a language with modern syntax, and haven't they been? There is a lot of Matlab usage in my company, though I'm not personally knowledgeable about it.
Suppose you have a couple of graphs, represented by adjacency matrices (A and B) whose cells represent travel times, distances, or some other notion of cost. Assuming their dimensions are compatible, and that the columns of A correspond to the same nodes as the rows of B, then<p><pre><code> A ⌊.+ B
</code></pre>
will produce a new adjacency matrix (call it C) where each c_ij is the minimum traversal cost from node i to node j via some intermediate node. For example, if A represented driving times from all cities in NY state to all airports in NY, and B represented direct flight times from all NY airports to all CA airports, then our result would be the minimum travel time (ignoring parking and security) between any NY town and any CA airport.<p>The sequence `⌊.+` is an inner product chosen specifically to achieve this effect, with addition where multiplication would normally be, and ⌊ (min) where addition would normally be.<p>Notice that it took me longer to give this under-caffeinated explanation of what is going on than to write the code.
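To make that concrete, here's a hypothetical Dyalog-style session with tiny invented matrices:<p><pre><code>      A ← 2 2⍴1 5 2 3    ⍝ costs from 2 towns to 2 hubs
      B ← 2 2⍴4 1 0 2    ⍝ costs from those 2 hubs onward
      A ⌊.+ B            ⍝ cheapest two-leg cost for each pair
5 2
3 3
</code></pre>Each entry is the minimum over intermediate nodes k of A[i;k]+B[k;j]; the top-right 2, for instance, is the smaller of 1+1 and 5+2.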
Allow me to disagree. The page presents toy examples, and those are already unreadable. APL lacks data structures, modules, exception/error handling, and typing. Well, since almost everything is a matrix of floats, perhaps typing isn't needed, but that idiom degrades readability all the same. You might think of it as a "bc" on steroids, but a rebirth for APL is uncalled for.
You can get everything there is in APL, minus the weird symbols, in the J programming language.<p>> <i>learning all the alien symbols is a one-time investment</i><p>No such investment is needed if you learn J.<p>And yes, it does teach you to think in a new way. I am not kidding.<p>You can see the plethora of (mostly free) books available at the website [0].<p>Learning J has given me "enlightenment" at the same level <i>The Little Schemer</i> has.<p>(I don't recommend APL to anyone, as you need to learn those symbols and then painstakingly slowly insert them. If you want to learn array thinking, go straight to J. Why waste time and headspace on APL symbols?)<p>[0]: <a href="https://code.jsoftware.com/wiki/Books" rel="nofollow">https://code.jsoftware.com/wiki/Books</a>
«APL <i>has</i> flourished, as a DSL embedded in a language with excellent I/O. It's just got weird syntax and is called Numpy. Or Tensorflow. Or PyTorch...»<p>(<a href="https://news.ycombinator.com/item?id=17176147" rel="nofollow">https://news.ycombinator.com/item?id=17176147</a> 4 years ago)
I intend to learn an APL at some point. I'm an avid listener of the Array Cast podcast [0]. Can't say I understand all of what they talk about, but I definitely feel like APL has its place for number crunching.<p>[0] <a href="https://www.arraycast.com/episodes" rel="nofollow">https://www.arraycast.com/episodes</a>
> Even Haskell not only gained popularity on its own but deeply influenced F# and Scala.<p>The influences came primarily from Standard ML and OCaml.<p>F# started its life as OCaml.NET.
APL looks like a lot of fun, and I've tried to get into learning it for a while, but it's a shame that the most popular implementation is proprietary. GNU APL is a separate, less featureful dialect unto itself, so it's not as though you can write a reasonably sized APL program in Dyalog and then expect it to work in GNU APL.
KDB+, using Q, is heavily inspired by APL. Great for looking smart; it might make you productive, but it's horrible for cooperation, a maintenance nightmare, and heavy technical debt.<p>I'll pass, but you do you.
I've dabbled in APL, and a couple of jobs ago I worked extensively in kdb/q. I found it to be an immensely powerful tool in its natural environment of time-series processing, and extraordinarily painful to use outside that domain. Typically I'd write the core logic in q with a small wrapper in Java or Python to do string formatting and other tasks that kdb falls down on.<p>Which is fine. Not every language needs to do everything. Horses for courses.
Fast.ai steals a lot of ideas from APL:<p><a href="https://forums.fast.ai/t/apl-array-programming/97188" rel="nofollow">https://forums.fast.ai/t/apl-array-programming/97188</a><p>I programmed in APL last millennium. It is hell.
The practical working set of English is about 60,000 words. In theory, no word would need to be more than four characters (26^4 = 456,976).<p>But we don't do that, because the brain needs redundancy. Ditto source code.
I like Ivy's[0] approach of using keywords instead of symbols. I feel like something like APL should either be embedded in some host language, be given lots of functionality for I/O, or be given a static type system with rank polymorphism.<p>[0]: <a href="https://github.com/robpike/ivy" rel="nofollow">https://github.com/robpike/ivy</a>
Well, it won't happen. The language is too weird, and for a language to make it these days, it has to target the lowest common denominator of developers.<p>Let's say you're at work, and you casually mention something about APL, ATS or Agda to your team. How many were excited about it? How many rolled their eyes? Yeah, I thought so.
Too much magic. I don't feel any fulfilment from knowing quirks of a specific language that few others know. A waste of time, IMHO.<p>Is it still pretty cool? Yes, I think so.
and has for a while:<p><i>APL deserves its renaissance too</i> - <a href="https://news.ycombinator.com/item?id=17173283" rel="nofollow">https://news.ycombinator.com/item?id=17173283</a> - May 2018 (118 comments)<p>(Reposts are fine after a year or so - <a href="https://news.ycombinator.com/newsfaq.html" rel="nofollow">https://news.ycombinator.com/newsfaq.html</a> - especially about APL.)
There are a lot of comments of the type "it's unreadable/write-only/unmaintainable" here. It's a natural reaction; I know, I was there. It looks different. But readability is in the eyes of the beholder, not the language. I made this point in my APL book (<a href="https://xpqz.github.io/learnapl" rel="nofollow">https://xpqz.github.io/learnapl</a>) -- just because I can't read Japanese does not make Japanese unreadable. Since I wrote that book, I've spent a few years immersing myself in APL, gradually rewriting thousands of lines of Python into APL, and it's such a productivity boost. My code is an order of magnitude smaller. More, sometimes. Fast, too.<p>For the right use cases, it's unbeatable. Sure, you probably wouldn't want to write an OS kernel in it, but anything that reads a bunch of data, mashes it up, and spits out some result, APL is a hand-in-glove fit. And with modern SIMD processors, APL really screams.<p><a href="https://xpqz.github.io/learnapl" rel="nofollow">https://xpqz.github.io/learnapl</a><p><a href="https://xpqz.github.io/cultivations" rel="nofollow">https://xpqz.github.io/cultivations</a><p>Drop in on <a href="https://apl.chat" rel="nofollow">https://apl.chat</a> if you're interested.
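To give a flavour of that read-mash-spit pattern, here's a minimal Dyalog-style sketch (the data is invented):<p><pre><code>      data ← 3 1 4 1 5 9 2 6    ⍝ some readings
      +/data>4                  ⍝ how many exceed 4?
3
      (+/data)÷≢data            ⍝ their mean
3.875
</code></pre>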
What about using something like a Wacom drawing tablet to interact with APL?<p>I've never tried any of the Iverson-verse languages, but the non-ASCII input seems daunting and cumbersome.
That's a cute writeup, but the author overlooked some important bits of history.<p>For example, the original use of APL (before it was called APL, and before anyone had implemented it as a programming language - the reason Iverson joined IBM) was to specify the IBM System/360 machine architecture.<p>In other words, it was being used to document how the CPU worked. And it was impressively successful there, distilling a large body of documentation into two pages.<p>This reduction in cognitive load when describing machine structure is what made it popular back then, and what motivated the effort to implement it as a programming language.<p>(If I understand correctly, there's also a relatively short and direct step from APL to the initial implementation of SQL.)
There are implementations/dialects of APL that use "ASCII-friendly" operation names. The "J" and "K" languages are examples.
A lot of people in finance like to use q, an APL derivative with some more functional programming facilities, so in that sense it's had a mini-renaissance.
I had to use APL once back in the 1970s, because I needed the graphics capabilities of an IBM 5100. I managed, but it wasn't a pleasant experience, and I've felt no compulsion to repeat it.<p>I admit it was kind of cool to be able to operate on whole arrays at a time, but nowadays I can do that with Python's numpy.
Aaron Hsu tried with some level of success.<p><a href="https://duckduckgo.com/?q=aaron+hsu+apl&t=ffab&iax=videos&ia=videos" rel="nofollow">https://duckduckgo.com/?q=aaron+hsu+apl&t=ffab&iax=videos&ia...</a>
I particularly liked this principle from the article: "learning all the alien symbols is a one-time investment, and expressiveness — the leverage you as a programmer gain — is for life."
No, it does not.<p>I used APL professionally for about ten years (early '80s to early '90s) for a range of uses spanning business applications, industrial automation and DNA sequence analysis for the Human Genome Project. The language is/was fantastic. It truly is a tool for thought.<p>How about the funny symbols?<p>I see this type of comment all the time. If you think this way, you lack context. The notation is an important element of APL's value proposition. You cannot understand this by watching an APL video on YouTube or running through a tutorial.<p>I equate it to something like using vim. People who have casual contact with it absolutely hate it. Those who commit and develop the skills and mental automation have a very different perspective. From that context, someone saying "vim's cryptic keyboard commands and states are horrible" sounds, well, I'll be kind, uninformed.<p>And yet, my first sentence implies I think APL does not deserve a renaissance. Which is it?<p>I firmly believe the future of advanced types of software engineering will require a form of notation in order to effectively communicate and describe computational solutions. What comes to mind is AI/ML. I think APL needs to mutate and evolve into the tool that is best suited for solving complex AI/ML problems. If this can happen, the notation will be critical and just as important as it is in mathematics or music.<p>I think the issue might be that we don't yet know enough about this evolutionary stage of AI/ML to understand what kind of programming notation we might need to invent. This isn't well defined at all. For the most part, AI/ML is exactly where it was in the '80s and '90s. We have faster computers with massive resources. Yet, if you go back to books on AI dating back to the '80s, you will be surprised to learn we haven't really invented much in the last few decades.<p>This three-volume set on my bookshelf is a good reference dating back to the early '80s:<p><a href="https://www.sciencedirect.com/book/9780865760899/the-handbook-of-artificial-intelligence" rel="nofollow">https://www.sciencedirect.com/book/9780865760899/the-handboo...</a><p>In fact, if you read through it you'll discover just how much was done in the '70s and '60s.<p>So, yes, I would not use or recommend APL for any modern project. It's a liability from an ROI perspective. In addition, the pool of capable APL software engineers is microscopic. That is a huge problem. APL does not make business or technical sense and cannot compete with modern tools, their libraries, and the large pool of capable programmers who can use them.
I disagree. The terseness of APL is the worst part about it. Its arcane symbols are a close second.<p>I don't want my programming language to be as terse as possible. I want it to be easy to read and maintain. I don't see how APL is a step in the right direction at all, and IMO should just be relegated to the dustbin of history as a failed experiment.
> life←{↑1 ⍵∨.∧3 4=+/,¯1 0 1∘.⊖¯1 0 1∘.⌽⊂⍵}<p>Nope, it should be burned with fire.<p>> Roughly 86% of all the fun you get from APL programming comes from the mysterious symbols and the magic behind them. It’s not that APL is alien to computers, it’s just the computers were alien to APL for quite a while.<p>That's being <i>alien to humans</i>, my dude. If understanding the syntax is the heaviest mental exercise in a programming language, it is inexcusably terrible. It should be considered torture to even require someone to read it.
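For reference, a hedged gloss of what that one-liner does (Dyalog-style; the annotations are mine), which you may take as evidence for either side:<p><pre><code>life ← {↑1 ⍵∨.∧3 4=+/,¯1 0 1∘.⊖¯1 0 1∘.⌽⊂⍵}
⍝ ⊂⍵           enclose the boolean grid
⍝ ¯1 0 1∘.⌽    three copies, columns rotated by ¯1 0 1
⍝ ¯1 0 1∘.⊖    rows likewise: a 3×3 array of shifted grids
⍝ +/,          ravel and sum: neighbour counts, self included
⍝ 3 4=         where does the count equal 3? where 4?
⍝ 1 ⍵∨.∧       (count=3) ∨ (alive ∧ count=4): birth or survival
⍝ ↑            mix the result back into a simple matrix
</code></pre>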
APL died for the same reason Perl is fading: it is a write-only language.<p>Those symbols look cool. But they're not very far away from Brainfuck.