(Apologies to the OpenDylan folk, I find I wrote this in the past tense. Just my POV.)<p>A Historical View of Dylan<p>Back in the day we used Dylan heavily for generating enterprise middleware for the C++ programmers, writing X11 applications, and generating web pages when the web finally arrived.<p>It was like a Lisp with a world-class object system, neatly disguised in a Pascal-feeling syntax so you could grab commodity programmers and get them up and running. Any programmer with eyeballs could read the code. To many eyes it looks verbose, but with a nice editor mode it typed quickly, e.g. you never typed "define variable", you hit a key, expanded the statement, filled in the blanks, and headed on.<p>Multimethods, value-based dispatch, multiple inheritance, and open ancestor classes all together without warts is a lot of power. Of course, don't point that fully armed and operational object system at your foot.<p>Ah yes, you say, but how do you do eval and all those Lisp macro things from a Pascal syntax? We didn't. While many smart people struggled to make spectacular Dylan compilers, we happily used Mindy, the tiny byte-code interpreter. It was the mid '90s. 32MB RAM in a 66MHz i386. What we lost in CPU efficiency we picked back up in memory footprint. In those days you did something else while building your software (not surf the web of course, maybe poll ftp sites to see if there was a new "init" to compile for your Linux machine, since distributions weren't really invented yet); g++ had a 1 in the first version digit. If C++ had templates, they were so ludicrously bloated that you couldn't afford the RAM to use them. egcs hadn't yet forked off and eventually killed the abomination that gcc 1 had become. With Mindy our builds were instantaneous and it ran pretty well too.
Even the "stop the world and copy everything" garbage collector worked well in our GUI programs once we figured out how to detect and force impending collections at times when the user wouldn't notice, like right after they selected a menu item, rather than letting the GC trigger automatically in the middle of a button draw.<p>Sadly, after a promising start, Dylan languished. Apple refocusing away from Dylan was a big part of it, but I think there was a bigger problem, perhaps part of the reason Apple moved away (plus timing issues; Dylan missed the Newton moment). The macro system was so insanely difficult to understand that I don't think more than 5% of our programmers could have fully understood it. I'm being quite literal here. It was a language that anyone could read and understand combined with a macro system so byzantine that statistically speaking no one could comfortably use it. You could define whole new Pascal-looking syntax constructs, flow control, or whatever you wanted with it… probably… after somehow altering your brain into some transhuman construct capable of understanding the macro system. Notice how in the linked book, "Macros" is the last chapter? Clearly the author went mad and was no longer able to communicate with humans.<p>Edit: Added this little forward-looking section<p>Given a “do over” on Dylan, I would whisper “packrat parser” in the ear of the language designers. Replace the macro system with the ability to dynamically augment the parser. This gives you a model that a reasonable number of programmers will understand, an opportunity for meaningful error messages, and essentially unlimited potential, for both good and evil. It wouldn't have worked in the '90s, too much memory required, but we aren't there anymore. Want to add SQL support? Add a module. Give your language a "select" statement. Make it a loop construct if you want to stream-process the results. Your SQL statements get syntax checked at compile time.
Screw pasting up commands and substituting placeholders. Write SQL right there in the middle of your code and use your language variables willy-nilly.<p>Edit Edit: Ok, don't repeat all the text twice.
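<p>To make the "augment the parser" idea above concrete: not Dylan, and purely hypothetical, but here's a toy sketch in Python of a packrat-style parser (memoized recursive descent) whose statement grammar lives in a table that modules can extend at runtime. Registering a rule for "select" is the "add a module, get a statement" step; a malformed statement fails at parse time, which is where the syntax checking comes from. All names here are invented for illustration.

```python
# Hypothetical sketch: a packrat-ish parser with a runtime-extensible
# statement grammar, illustrating "replace macros with parser augmentation".
import re

class Grammar:
    """Maps statement keywords to parse functions; user modules extend it."""
    def __init__(self):
        self.statements = {}

    def register(self, keyword, rule):
        # The "add a module" step: the language grows a new statement form.
        self.statements[keyword] = rule

class Parser:
    def __init__(self, grammar, text):
        self.g = grammar
        self.text = text
        self.memo = {}  # (rule, pos) -> result: the packrat memo table

    def skip_ws(self, pos):
        while pos < len(self.text) and self.text[pos].isspace():
            pos += 1
        return pos

    def word(self, pos):
        pos = self.skip_ws(pos)
        m = re.match(r'\w+', self.text[pos:])
        return (m.group(), pos + m.end()) if m else (None, pos)

    def statement(self, pos):
        key = ('statement', pos)
        if key in self.memo:          # packrat: each (rule, pos) parsed at most once
            return self.memo[key]
        kw, after = self.word(pos)
        rule = self.g.statements.get(kw)
        result = rule(self, after) if rule else None
        self.memo[key] = result
        return result

# A built-in statement:  print <word>
def parse_print(p, pos):
    arg, pos = p.word(pos)
    return ({'stmt': 'print', 'arg': arg}, pos)

# A user module drops in a "select" statement:  select <cols> from <table>
def parse_select(p, pos):
    cols, pos = p.word(pos)
    kw, pos = p.word(pos)
    if kw != 'from':
        return None                   # syntax error caught at parse time
    table, pos = p.word(pos)
    return ({'stmt': 'select', 'cols': cols, 'table': table}, pos)

g = Grammar()
g.register('print', parse_print)
g.register('select', parse_select)    # the language just grew a statement

node, _ = Parser(g, 'select name from users').statement(0)
print(node)  # -> {'stmt': 'select', 'cols': 'name', 'table': 'users'}
```

A real version would track source positions for error messages and thread the host language's variable scope into the embedded statement's parse tree, which is what makes the "use your language variables willy-nilly" part work.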