I'm sorry, but as someone who used APL professionally for about ten years, J simply fills me with an urge to defecate. It is an absolute abomination. It left behind the power of notation as a tool for thought and turned the concept of APL into a pile of gibberish on a page.<p><a href="http://www.eecg.toronto.edu/~jzhu/csc326/readings/iverson.pdf" rel="nofollow">http://www.eecg.toronto.edu/~jzhu/csc326/readings/iverson.pd...</a>
Where are APL and J? For the most part, dead. Sure, there are people using them. But if someone like me, who used the language almost exclusively for ten years, hasn't touched it for two decades, I think it is safe to say: great language to learn, powerful and fantastic concepts, but no thanks. Happy with C, C++ and Python.<p>The failure to open source a good APL didn't help either.<p>If J-Software wants to make a last valiant effort to bring this back, my opinion is simple: read the paper linked above and go back to the roots. Extend the language to make it object oriented and create interfaces to C, C++ and Python. That's a start. I could write a paper on how to improve APL. It has to start with a commitment to moving software engineering into the realm of communicating through notation rather than cryptic text.
I've been struggling to learn J for a few weeks, and there are several things that make it difficult.<p>1) The lack of spaces in the majority of code makes it difficult to lex, let alone parse, meaning I can't tell what the operators are. When people use spaces it makes a difference.<p>2) Every operator has at least two meanings. Some have three. That's more than 100 meanings to memorize, and 50 operators share a spelling with the other 50.<p>3) The parsing rules as described strike me as extraordinarily complicated.<p>The ideas in APL are worth learning. For example, striping objects and using grade up to make a concordance or annotate any other data structure. Another example is having the primitives auto-sort collections when searching to get n log n complexity. Almost any language can use these ideas. If Python is one step up from Standard ML because of dicts and out-of-the-box primitives, then APL is one step up from Python.<p>If anyone who knows APL well is around, can I ask some simple questions?<p>1) Do user functions and if statements vectorize? Meaning, if I write a recursive function to implement Euclid's GCD and then pass it two arrays, will it give an array of pairwise GCDs?<p>gcd(a,b) = if b=0 then a else gcd(b,a mod b)<p>where = and mod are both vectorized APL primitives.<p>2) Why do I see so few user functions being used? I would expect application code to consist mostly of specific functions, but instead I see line after line of unbroken primitive operators.<p>(EDIT: fixed formatting.)
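For what it's worth, the pairwise-GCD question can be sketched outside APL. This is a hedged NumPy illustration, not APL semantics: it vectorizes the same recurrence by iterating only on the lanes where b is still nonzero. (NumPy also ships a built-in `np.gcd` ufunc that does exactly this pairwise.)

```python
import numpy as np

def vgcd(a, b):
    """Vectorized Euclid: gcd(a, b) = a if b == 0 else gcd(b, a mod b),
    applied pairwise across two integer arrays."""
    a, b = np.asarray(a).copy(), np.asarray(b).copy()
    while np.any(b):
        active = b != 0                      # lanes still recursing
        # (a, b) -> (b, a mod b) on the active lanes only
        a[active], b[active] = b[active], a[active] % b[active]
    return a

print(vgcd([12, 18, 100], [8, 27, 75]))  # [ 4  9 25]
```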
This is what Fibonacci looks like, apparently<p><pre><code> f1a=: 3 : 0
{. y f1a 0 1x
:
if. *x do. (x-1) f1a +/\|.y else. y end.
)
f1b=: {.@($:&0 1x) : ((<:@[ $: +/\@|.@])^:(*@[))
</code></pre>
I think I have PTSD already just from reading this stuff.
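If I'm reading the J correctly (no guarantees), both definitions thread a length-2 vector through a loop: `+/\ |. y` reverses the pair and sum-scans it, i.e. (a, b) becomes (b, a+b), and `{.` takes the head after n steps. A rough Python equivalent:

```python
def fib(n):
    # Mirrors the J pair-stepping: (a, b) -> (b, a + b), repeated n times;
    # the answer is the head of the pair, as with {. in the J version.
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

print([fib(n) for n in range(8)])  # [0, 1, 1, 2, 3, 5, 8, 13]
```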
I think the key idea behind APL and J is that they take one fundamental data structure, namely arrays of arbitrary rank containing characters, numbers or boxes, and define a number of operations on them, each of which has a natural mathematical definition and satisfies a number of laws. For example, APL has at its core primitives such as ι ρ φ ⊤ , . \ / and a number of binary and unary operators.<p>To give an example, / is like fold in a functional programming language, so
+/ would be foldr (+) 0 in Haskell, with type foldr (+) 0 :: (Foldable t, Num a) => t a -> a. However you would have a hard time implementing J-like arbitrary-rank arrays in Haskell efficiently while supporting all the operations it does.<p>The philosophy of J and K is to isolate the essential operations and laws of the data structure you are trying to manipulate and to implement an interpreter for those operations that is as efficient as possible. This is in contrast to a conventional general purpose language, which tries to provide a sufficiently smart compiler and the means to create new data types from a set of primitive ones. The problem is that it is quite hard, and sometimes impossible, to teach the compiler retroactively about all the nice properties your newly defined data structure has.<p>I believe several such domain specific interpreters/compilers could be "stitched together" into a general purpose language, maybe integrated by a system specialised in symbolic computation. Instead of compilation to machine code, the top level system would compile down to the base operations for each data structure and simplify according to the laws the base operations satisfy. Haskell and many others do that by compiling down to a Core language and then applying a huge number of simplification passes on that intermediate language.
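The fold correspondence is easy to see in any language with a reduce. A quick Python sketch of what J's `+/` and `*/` are doing:

```python
from functools import reduce
from operator import add, mul

# J's  +/ 1 2 3 4  folds + over the array, like foldr (+) 0 in Haskell.
total = reduce(add, [1, 2, 3, 4])
print(total)    # 10

# Likewise  */ 1 2 3 4  folds multiplication.
product = reduce(mul, [1, 2, 3, 4])
print(product)  # 24
```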
Now I've gone and installed J and started playing around. I'm doomed.<p>Just the most basic stuff is pretty nifty: <a href="http://www.jsoftware.com/help/learning/01.htm" rel="nofollow">http://www.jsoftware.com/help/learning/01.htm</a>
Perhaps the hope with J was that symbols could create a grammar for concepts. In maths we use intuition to define an integral as a certain kind of summation, a progressive difference as a certain kind of derivative, and so on. I was thinking that perhaps J symbols could create some form of grammar that suggests new concepts from primitive ones. But for me the language doesn't provide that kind of thinking; in the end it is simply a way of writing code that is cryptic and difficult to understand. The idea of a language with automatic vectorization is orthogonal to the syntax of J.<p>Haskell is different: the concepts of monad, category and so on allow you to obtain a higher-order view of your ideas, to frame them in a different context. That is an experience that expands your concept of programming. I don't experience that feeling with J; I paid the price to learn the vocabulary and grammar, but I don't think the language is worth the effort.
A question. I've played with J a bit, but I haven't had the "burst of enlightenment" that some suggest will come (à la Lisp/Haskell/etc.).<p>My hypothesis, based on a few conversations: the "burst of enlightenment" is treating problems like applied math rather than computer science. I.e., "first represent as a vector, then do linear algebra, iterate this process until < epsilon". Since I already do this in Python/Julia/Mathematica, J just feels like a variant with esoteric syntax.<p>Possibly my problem is that I'm simply programming Python in J. Can anyone with deeper knowledge of J confirm/deny this?
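The "represent as a vector, do linear algebra, iterate until < epsilon" loop the comment describes can be sketched concretely. A minimal, hypothetical example (power iteration for a dominant eigenvector, in NumPy rather than J):

```python
import numpy as np

def power_iterate(A, eps=1e-10):
    # Represent the state as a vector, apply a linear map,
    # and iterate until the change falls below epsilon.
    v = np.ones(A.shape[0])
    while True:
        w = A @ v
        w /= np.linalg.norm(w)
        if np.linalg.norm(w - v) < eps:
            return w
        v = w

A = np.array([[2.0, 1.0], [1.0, 2.0]])
v = power_iterate(A)
print(v)  # dominant eigenvector, roughly [0.7071, 0.7071]
```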
I thought it was a J library for C when I saw this title. Anyway, I think APL/J/K is the real geek code (and its syntax will be forgotten sooner than Perl syntax if you don't use it often. ;)
Greg Wilson (from the 'Making Software' book) has been claiming the cost of building software is roughly constant, regardless of the programming language. The formula seems to be
"one can build and maintain 10 lines of code per hour".<p>If this holds for J, K, and APL, we should all be using them.
> Writing code in a scalar language makes you rather like a general who gives orders to his troops by visiting each one and whispering in his ear.<p>Well, actually, I use for-loops for that, thank you :)
Interestingly, we just had an article on HN that complained about the amount of mental RAM that C++ takes nowadays. I can't imagine that J takes less.