Conor Hoekstra's code_report[0] YouTube channel is a great way to learn about APL. The "1 problem N languages" series of videos is especially good.<p>[0] <a href="https://www.youtube.com/c/codereport/featured" rel="nofollow">https://www.youtube.com/c/codereport/featured</a>
Gilad Bracha is working on Shaperank, an APL-inspired language for reactively calculating with multi-dimensional arrays, i.e. vectors and matrices. <a href="https://twitter.com/Gilad_Bracha/status/1450149734325256193" rel="nofollow">https://twitter.com/Gilad_Bracha/status/1450149734325256193</a>
Is anyone on HN using APL or J for work? I've used it as a hobby and think it's really cool, but I've never used it professionally.<p>Edit: this has been posted on HN before.
Nothing dies.<p>Like, if it has actual uses and implementations on modern machines and isn't abandonware, someone is going to be using it somewhere.<p>But I would say it's niche.
Few languages ever die completely. Even JOVIAL is still used in some places (I learned it in 1982). I have not seen any mention of PL/I in decades, however. I learned APL in 1979 and found it an amazing language at the time, though I never used it again. Every other language I have learned in my life is still in use in some form.
There was a good CoRecursive episode I listened to just last week that was partly about APL. <a href="https://corecursive.com/065-competitive-coding-with-conor-hoekstra/" rel="nofollow">https://corecursive.com/065-competitive-coding-with-conor-ho...</a>
Context: professional APL user for ten years back in the '80s.<p>I firmly believe languages like APL, Forth, and LISP should be taught in a single-quarter course on programming languages. The perspective you gain is invaluable. These ideas help you think about computational problem solving in a different way.<p>That said, attempting to use any of these languages today for anything other than a fun hobby would be a mistake. While APL isn't dead --paid and free distributions are still actively maintained--, it is, in my opinion, deader than a doornail when it comes to the reality of using it for any real work, particularly at scale. In this sense it isn't any different from Forth, LISP, COBOL, or FORTRAN. Can you imagine Facebook announcing a move to FORTRAN? Neither can I.<p>I often find the comments on HN about APL terribly misinformed. Claims like "read-only language", "needs a custom keyboard", "needs a custom machine", etc. are, from the perspective of someone who actually knows APL, just silly. People truly should stop for a second and ask whether their opinions are based on enough data to support even having an opinion. A simple parallel example:<p>Dabbling in music does not make you a musician. Declaring that you need a custom machine to type musical notation, or that this notation is impossible to read, would sound terribly ignorant to someone who has devoted sufficient time to actually learning and internalizing it.<p>I can still, to this day, decades later, touch-type APL. Do you look at your keyboard when you type anything in your spoken language(s)? No? Same with APL. The learning curve isn't any worse than learning to type on an ASCII keyboard. Do you have to look at the keyboard when you type any of the shifted symbols on the top row? No? Well, imagine that's APL. Different symbols. No problem.<p>Yes, APL is dead as a sensible, informed choice for non-trivial projects.<p>No, APL is not dead as it pertains to learning some amazing things about what computing could --and arguably, should-- look like at some undefined point in the future.<p>I have always thought that the future of AI will require us to communicate with the computer in code in a form far closer to the symbolic APL approach than to typing words in English. I can't put my finger on what form that would take. Iverson's "Notation as a Tool of Thought" paper goes over the reasons that support this idea. I can't offer anything more than to say I believe this to be true based on ten years using an amazing symbolic language for real work at scale. One of my APL applications was part of the Human Genome Project: it helped analyze, classify, and visualize sequencing data.
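To make that "different way of thinking" concrete, here is a minimal sketch of the loop-versus-whole-array shift these languages teach. It uses Python with NumPy rather than APL, since that's what most readers can run, and the data is a made-up example, not from any of my old applications:<p>
  import numpy as np

  x = np.array([3, 1, 4, 1, 5, 9, 2, 6])

  # Scalar thinking: spell out the iteration yourself.
  total = 0
  for v in x:
      total += v
  mean_loop = total / len(x)

  # Array thinking (the habit APL drills in): say what, not how.
  # APL writes the mean as (+/x) ÷ ≢x -- sum of x divided by tally of x.
  mean_array = x.sum() / x.size

  assert mean_loop == mean_array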
I feel like we're quibbling about semantics.<p>Is APL an interesting language that most people would benefit from picking up and building something with? Sure.<p>Is there a small and passionate community around it? Absolutely.<p>Is it still possible to get up and running with APL in 2021? Yes.<p>Given the variety of choices in the developer ecosystem, is APL the best choice for the types of problems the vast majority of developers are solving today? No.<p>And, maybe most interestingly to me, this conversation highlights how important ecosystems, frameworks, and communities are to modern development, over and above pure language-semantics benefits.
The article describes interest in APL, and constructing tools for it, but does not note any modern practical or commercial use of APL. As such, it seems little more than an academic interest akin to Latin or Esperanto, where even Klingon gets more actual use (<a href="https://smile.amazon.com/s?k=klingon+shakespeare" rel="nofollow">https://smile.amazon.com/s?k=klingon+shakespeare</a>).
APL is alive and well and widely used. It’s just evolved into more verbose forms known as NumPy, R, and other Iverson Ghosts [0].<p>Turns out people love array programming but hate terse syntax.<p>[0] <a href="https://dev.to/bakerjd99/numpy-another-iverson-ghost-9mc" rel="nofollow">https://dev.to/bakerjd99/numpy-another-iverson-ghost-9mc</a>
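To see the terseness trade-off concretely, here is a hedged sketch of two tiny computations, with the APL spelling in comments next to the NumPy spelling (the readings array is made up for illustration):<p>
  import numpy as np

  temps = np.array([12.0, 15.5, 9.0, 21.0, 18.5])

  # APL: +/temps>15          (how many readings exceed 15?)
  count_above = (temps > 15).sum()

  # APL: (⌈/temps)-⌊/temps   (spread: max minus min)
  spread = temps.max() - temps.min()

  print(count_above, spread)   # 3 12.0
<p>The operations are identical; NumPy just names them in words and method calls where APL uses single glyphs.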
Anyway, the iota operator isn't wholly dead. It gets name-checked in libraries for other languages, including, recently, C++. Which isn't dead.
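The C++ name-check is std::iota in the &lt;numeric&gt; header (since C++11), which fills a range with sequentially increasing values and takes its name from APL's ⍳. A minimal sketch of the lineage, in NumPy for consistency with the examples above (the APL output shown assumes the default index origin of 1):<p>
  import numpy as np

  # APL: ⍳5  evaluates to 1 2 3 4 5 (default index origin 1).
  print(np.arange(1, 6))   # [1 2 3 4 5]

  # The C++ namesake, shown as a comment since this sketch is Python:
  #   std::vector<int> v(5);
  #   std::iota(v.begin(), v.end(), 1);   // fills v with 1 2 3 4 5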