It's a little thing, and not uncontroversial, but I'm excited to see another language besides Python adopt significant whitespace. In general I wish more languages provided a way to enforce code layout.<p>After using Python professionally (and managing a team of developers) for a few years, I've found whitespace is a great way to keep things readable as more people touch the code.<p>Whitespace isn't the only way to do it, of course. Go's "gofmt" is a good example of a different way to enforce code layout. [1]<p>[1] <a href="https://blog.golang.org/go-fmt-your-code" rel="nofollow">https://blog.golang.org/go-fmt-your-code</a>
I notice they have now included the Aporia IDE and Nimble package manager in the Windows installer. This seems like a good step toward getting newcomers up and running quickly, and getting more feedback from the community about the core tools. Many new languages seem to be going this route of core-dev-supported tooling, such as package management; Go shipping its code formatting tool seems like a good idea, for instance. Supporting an IDE seems more questionable: Python never seemed to have much success with IDLE, and IDE preference is very personal. Anyway, as someone who primarily writes Python at work, I find Nim very usable because the syntax and style are so familiar, and I really like most of the decisions they have made. I prefer Nim's static typing and compiled executables for redistribution, and they have introduced nice new concepts around concurrency and more advanced language features. Definitely looking forward to using this language more and more as it moves toward 1.0.
It's very exciting that Nim has generator syntax in a compiled, statically typed language. To me, recursive generators with "yield" and "yield from" expressions are the most natural way to express iterative algorithms.<p>The Ranges proposal for C++17 is great for <i>consuming</i> generic algorithms, but it's far behind generators for <i>creating</i> generic algorithms. Simple example: try to write an iterator for a tree that doesn't have parent pointers. With ranges or iterators, you must maintain a stack as a data member. With recursive generators, the variables on the stack are captured implicitly in the language. I only know a little about compilers, but it seems like a compiler with syntax tree knowledge of the entire program should be able to optimize generators into machine code that's just as good as iterators - i.e. not copying all the caller-save registers onto the stack every time a generator recurses.<p>I think generator syntax is truly a killer feature for any kind of serious algorithmic work. C++11 made it a lot easier to write and use generic algorithms, but writing iterators is still a big task that reluctant C++ programmers avoid. IMO, fear (and actual difficulty) of writing iterators is the main reason most people don't enjoy C++. If C++ had shipped with generators from the beginning, I think the programming language landscape would be very different today.<p>Nim seems to have a small community and limited promotion, so I do not feel especially hopeful that it will compete with C++ at the same level as Rust. (I am not sure if Rust will upset C++ in any significant way either!) Perhaps a well curated set of Nim/Rust/C++ comparisons could convince the C++ standards committee or the Rust team to add generator syntax.
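A minimal Python sketch of the contrast described above, since the "yield"/"yield from" syntax referenced is Python's; the Node type and the function names are made up for illustration:

    # A binary tree without parent pointers, traversed in order.
    class Node:
        def __init__(self, value, left=None, right=None):
            self.value = value
            self.left = left
            self.right = right

    # Hand-written iterator: the traversal stack must live as a data member.
    class InorderIterator:
        def __init__(self, root):
            self._stack = []
            self._push_left(root)

        def _push_left(self, node):
            while node is not None:
                self._stack.append(node)
                node = node.left

        def __iter__(self):
            return self

        def __next__(self):
            if not self._stack:
                raise StopIteration
            node = self._stack.pop()
            self._push_left(node.right)
            return node.value

    # Recursive generator: the per-node state is captured implicitly in the
    # suspended stack frames, so no explicit stack bookkeeping is needed.
    def inorder(node):
        if node is not None:
            yield from inorder(node.left)
            yield node.value
            yield from inorder(node.right)

    tree = Node(2, Node(1), Node(3))
    assert list(InorderIterator(tree)) == list(inorder(tree)) == [1, 2, 3]

Both produce the same sequence; the generator version is the one that scales to more involved traversals without extra bookkeeping.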
Congratulations to the Nim team!<p>I did a quick comparison between the up-and-coming compiled languages (D, Rust, Nim, Go) and C++ a week or so back. My main aim was to assess the final statically linked binary size for a simple hello world program.<p>Here are the results on x64 Mint:<p>1. C++: 1.6 MB<p>2. Go: 1.9 MB<p>3. Rust: 550 KB with dependencies (I was not able to figure out how to pass a static option to rustc)<p>4. D: 710 KB, same as Rust<p>5. Nim: 970 KB, statically linked<p>Of the statically linked binaries, Nim wins by a large margin. I did not remove any of the runtime checks, by the way; removing them actually saves another 100 KB+, depending on the program. Furthermore, the Nim program was actually a naive Fibonacci number calculator, not a simple hello world, meaning that Nim should have been at a disadvantage! Amazing stuff.
Congratulations to the developers for the impressive list of improvements and fixes! Nim is becoming my first choice of programming language in a large number of situations.<p>I hope to see version 1.0 soon. Keep up the good work!
Good to see the Nim project moving closer to a solid 1.0.<p>Now all we need is a cross-platform GUI library that uses native widgets on each platform, such as a set of bindings for wxWidgets or (better yet) a translation of SWT from Java to Nim, and we'd have an excellent foundation for self-contained, cross-platform desktop apps. Yes, I know desktop apps aren't trendy (at least on platforms other than Mac), but they're still important.
I hadn't seen this language before, but it looks neat. Coming from a Ruby background, I find it very interesting to see Nim's pragmas and templates; they look rather like type-checked, compiled cousins of Ruby's metaprogramming methods.
> starting with version 1.0, we will not be introducing any more breaking changes to Nim.<p>I wonder whether they will stay committed to that. Pretty much all developers of other "living" languages I know who do not have strong industry support and thus pressure not to break working code introduce breaking changes all the time e.g. see D.<p>Nim could distinguish itself from the crowd with such a commitment to stability. However I expect overwhelming pressure from the enthusiast crowd which currently utterly dominates among Nim users to introduce breaking changes even post-1.0. And with little pressure from the other direction..
Maybe off-topic, but why exactly do we have hard and soft real-time? I think this distinction is terrible: either you have deadlines or you don't. In games that lag, you are missing deadlines, and the result is terrible.<p>Currently we seem to distinguish between hard real-time, soft real-time, and no real-time, but perhaps we should just have hard real-time everywhere, and if we do not care, we simply specify very large deadlines. Even if you are not doing hard real-time, you are probably expecting results before the heat death of the universe, so maybe it would be a nice idea NOT to abstract time away in programming languages but to let the programmer annotate timing constraints in, e.g., function definitions. Such information could be crucial to the GC.<p>Maybe we could build such a mechanism into a fancy type theory?
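A toy Python sketch of what per-function timing annotations might look like; the "deadline" decorator is entirely hypothetical, not a feature of any existing language or runtime, and it can only detect a miss after the fact rather than rule it out:

    import functools
    import time

    # Hypothetical per-function deadline annotation. A compiler or GC that
    # understood such annotations could in principle schedule around them;
    # a plain decorator can only report a miss after it has happened.
    def deadline(ms):
        def decorate(fn):
            @functools.wraps(fn)
            def wrapper(*args, **kwargs):
                start = time.perf_counter()
                result = fn(*args, **kwargs)
                elapsed_ms = (time.perf_counter() - start) * 1000
                if elapsed_ms > ms:
                    print(f"deadline missed: {fn.__name__} took "
                          f"{elapsed_ms:.1f} ms (limit {ms} ms)")
                return result
            return wrapper
        return decorate

    @deadline(ms=16)        # roughly one frame at 60 fps
    def render_frame():
        pass

    @deadline(ms=10**12)    # "don't care": just a very large deadline
    def batch_report():
        pass

In this view, "soft real-time" is just "hard real-time plus a policy for what happens on a miss"; the annotation itself is the same either way.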
I really wish I had a use-case for this powerful static language, but alas I am no ninja, guru, zen-master, sensei or rocket-scientist who is working on a problem that really needs this level of power/performance.<p>Python will always be "good enough".