It was an interesting choice to include the Compucolor II emulation. It was a very quirky machine with some interesting design choices, but it was never very popular.<p>The OP's website references <a href="https://compucolor.org" rel="nofollow">https://compucolor.org</a>, which I created. I'm glad someone was able to find something useful in it. The site has an in-browser JavaScript emulation of the Compucolor II. I've written one JavaScript program in my life, and that emulator is it. It has languished since 2014 other than a bug fix here and there. Eventually I'll refresh the code and hopefully replace the display generation logic with WebGL.<p>The core emulator was quite simple to write, but 90% of the time was spent getting the code to work across browsers and dealing with the infuriating differences in keyboard handling. Maybe things are better now.
I am having trouble understanding the usefulness of all these new pseudo-HDL languages.<p>In all the projects I've worked on, the choice of HDL (Verilog 95% of the time, VHDL for the rest) was never actually 'important'; the language features were never critical to the completion of the project. Verilog is fully adequate for any kind of serious HDL development.<p>What mattered was everything around the language: the tooling, IDEs, debuggers, timing analysis tools, verification infrastructure, the IP ecosystem, and so on.<p>Perhaps I am getting old, but I just cannot see how these new languages can be a serious alternative to Verilog/VHDL.
I had so much fun a decade ago in university creating a game that ran on an FPGA. We were generating the RGB signals in the FPGA, so we only had a little time between screen refreshes for any game logic. It was very challenging but very fun. We made a little Scorched Earth clone, and it worked amazingly well given the constraints of the system.<p>For anyone wanting to learn these systems, I'd highly recommend developing a game as a learning project.
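For context on that "little time between screen refreshes" constraint: in a standard 640x480@60Hz VGA mode there are 45 non-visible lines per frame, which works out to roughly 1.4 ms of vertical blanking. Below is a minimal, purely illustrative Verilog sketch of a sync generator exposing that window (module and signal names are invented for the example and are not from the commenter's project):

    // Standard 640x480@60Hz VGA timing. With the FPGA generating every
    // pixel, the `vblank` window is roughly the only time budget left
    // for game logic each frame.
    module vga_sync (
        input  wire       clk_25mhz,   // ~25.175 MHz pixel clock assumed
        output wire       hsync,
        output wire       vsync,
        output wire       vblank,      // high during the 45 non-visible lines
        output reg  [9:0] hcount,      // 0..799
        output reg  [9:0] vcount       // 0..524
    );
        localparam H_TOTAL   = 800;    // 640 visible + 16 fp + 96 sync + 48 bp
        localparam V_TOTAL   = 525;    // 480 visible + 10 fp + 2 sync + 33 bp
        localparam V_VISIBLE = 480;

        initial begin
            hcount = 0;
            vcount = 0;
        end

        always @(posedge clk_25mhz) begin
            if (hcount == H_TOTAL - 1) begin
                hcount <= 10'd0;
                vcount <= (vcount == V_TOTAL - 1) ? 10'd0 : vcount + 10'd1;
            end else begin
                hcount <= hcount + 10'd1;
            end
        end

        // Negative-polarity sync pulses for this mode.
        assign hsync  = ~(hcount >= 656 && hcount < 752);
        assign vsync  = ~(vcount >= 490 && vcount < 492);
        assign vblank = (vcount >= V_VISIBLE);
    endmodule

Game-state updates would then typically be gated on `vblank` so they never contend with pixel generation during the visible part of the frame.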
By the way, I recently had an idea while taking a shower: could there be a CPU (e.g. FPGA-based) that supports algebraic types at the hardware level?
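One concrete way to read "algebraic types at the hardware level" is as tagged unions: a tag field plus a payload, with logic that pattern-matches on the tag. A purely hypothetical Verilog sketch of an Option-style value (names invented for illustration):

    // Hypothetical illustration: an Option-style sum type encoded as
    // {1-bit tag, 8-bit payload}. The adder "pattern-matches" on the tags
    // and propagates None when either input is None.
    module option_add (
        input  wire [8:0] a,        // a[8] = Some/None tag, a[7:0] = payload
        input  wire [8:0] b,
        output wire [8:0] result    // None (all zeros) unless both inputs are Some
    );
        wire a_some = a[8];
        wire b_some = b[8];

        assign result = (a_some && b_some) ? {1'b1, a[7:0] + b[7:0]}
                                           : 9'd0;   // None
    endmodule

This only shows the encoding; whether a CPU's instruction set could enforce and propagate such tags at runtime is the harder part of the question.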