These languages are fun. "Look ma, no verilog!" But the underlying problem with all of these DSLs is the fact that the EDA[0] industry interoperates on verilog. Period. Worse, at some point in the design cycle, post-synthesized gate-level verilog becomes the codebase.<p>No upstream verilog changes are allowed because it can be difficult to get a precise edit (e.g. turning a 2-input NAND into a 2-input AOI by changing a verilog function) and you just don't have 3 weeks of runtime to go from verilog to GDSII again. Or you want to make a metal-only respin that changes only one $0.xM mask layer and requires 8 weeks of fab time, instead of changing multiple metal layers including the base, which needs 16 weeks and a $xM payment.<p>Programming language design is quite rich because new languages used to cross-compile to C and now generally generate LLVM IR. It doesn't matter what the bug is in the final binary; you're not going to hex-edit the binary like you would a single metal layer of a 300mm wafer. You're just going to recompile, and it generally doesn't matter whether one machine instruction changes or 1M do, because unlike verilog, not even GHC needs 3 weeks to compile a program.<p>source: I've been on chip design teams for 2 decades and finally gave up on fighting verilog.<p>[0]: Electronic Design Automation. Synopsys, Cadence, Siemens, Ansys, etc.
Every time I look at the examples, coming from a Verilog background, it's strange to see that the clock and reset are implicit rather than explicit. The blinking LED example, for instance, is readable, but the link to the clock and reset in the generated Verilog is not clear. How are multiple clock domains and async CDCs handled? I've never used Chisel, so maybe this is all well managed, but not being explicit about the clock domain seems strange.
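For reference, Chisel does let you override the implicit clock and reset per scope. A minimal sketch, assuming Chisel 3 and with made-up signal names (clkB, rstB, bitA), showing a second clock domain and a two-flop synchronizer; the language only scopes the domain, real CDC handling still needs constraints and review:

```scala
import chisel3._

class TwoDomains extends Module {
  val io = IO(new Bundle {
    val clkB = Input(Clock())
    val rstB = Input(Bool())
    val bitA = Input(Bool())
    val bitB = Output(Bool())
  })

  // This register uses the Module's implicit clock and reset.
  val regA = RegNext(io.bitA, false.B)

  // Everything inside this block is clocked by clkB and reset by rstB.
  withClockAndReset(io.clkB, io.rstB.asAsyncReset) {
    // Two-flop synchronizer for the single-bit crossing.
    io.bitB := RegNext(RegNext(regA, false.B), false.B)
  }
}
```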
Not being a hardware person, when I heard “Hardware Design Language” I was thinking more along the lines of Snow White[1] - the idea of an open source industrial design language would be pretty interesting, something along the lines of Material UI but for hardware.<p>1. <a href="https://en.wikipedia.org/wiki/Snow_White_design_language" rel="nofollow">https://en.wikipedia.org/wiki/Snow_White_design_language</a>
I don’t understand how this improves upon VHDL, even after reading their own explanation[0]. Just why they think object orientation makes hardware design easier isn’t really explained. After a quick look at it I much prefer VHDL’s entities (though their syntax is rather too wordy for my tastes), which at least make the direction of signals clearer. The problem with libraries could have been easily solved by extending/fixing VHDL instead of going through all this effort.<p>0. <a href="https://stackoverflow.com/questions/53007782/what-benefits-does-chisel-offer-over-classic-hardware-description-languages" rel="nofollow">https://stackoverflow.com/questions/53007782/what-benefits-d...</a>
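For comparison, signal direction in Chisel is stated per field of the IO Bundle, roughly the counterpart of a VHDL entity's in/out ports. A small, hypothetical sketch assuming Chisel 3:

```scala
import chisel3._

// Directions are declared per field in the IO Bundle,
// roughly like `in`/`out` in a VHDL entity declaration.
class Counter8 extends Module {
  val io = IO(new Bundle {
    val enable = Input(Bool())
    val count  = Output(UInt(8.W))
  })
  val cnt = RegInit(0.U(8.W))
  when(io.enable) { cnt := cnt + 1.U }
  io.count := cnt
}
```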
There's a similar project at Intel: <a href="https://github.com/intel/rohd">https://github.com/intel/rohd</a><p>It uses Dart instead of Scala.
I've played with this and while I much prefer it to Verilog (and even migen) it is still too implicit for me.<p>Most of my time building a toy VGA core was wasted debugging incorrect counter wrapping and similar issues. I would like to try out something where register widths and operation semantics (wrap, extend, saturate) always have to be explicit.<p>Maybe it would turn out to be too annoying?
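For what it's worth, Chisel has explicit-width types and separate wrapping/expanding add operators, though neither is forced on you by default. A sketch assuming Chisel 3, with invented names:

```scala
import chisel3._

class WidthDemo extends Module {
  val io = IO(new Bundle {
    val a        = Input(UInt(10.W))
    val b        = Input(UInt(10.W))
    val wrapped  = Output(UInt(10.W))
    val expanded = Output(UInt(11.W))
  })
  io.wrapped  := io.a +% io.b // +% keeps the operand width, so it wraps
  io.expanded := io.a +& io.b // +& grows the result by one bit, no silent wrap
}
```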
I've said as much before, but I find the issue with alternative HDLs vs SystemVerilog is that they concentrate on fixing annoying and frustrating things but don't address the really hard issues in hardware design, and can actually make them harder.<p>For example, SystemVerilog has no real typing, which sucks, so a typical thing to do is to build a massively improved type system for a new HDL. However, in my experience good use of Verilog style guides and decent linting tools solves most of the problem. You do still get bugs caused by type issues that slip through, but they're usually quickly caught by simple tests. It's certainly <i>annoying</i> to have to deal with all of this, but fundamentally, even if it's all made easier, it's not significantly improving your development time or final design quality.<p>Another typical improvement you'll find in an alternative HDL is vastly improved parameterization and generics (sketched below). Again this is great to have, but it mostly makes tedious and annoying tasks simpler rather than producing major impact. The reason is that writing good HDL that works across a huge parameterization space is very hard. You have to verify every part of the parameter space you're using, and you need to ensure you get good power/performance/area results out of it too. That can require very different microarchitectural decisions (e.g. single-, dual- and triple-issue CPUs all need to be built differently; improved parameterization doesn't save you from this). Ultimately you often only want to use a small portion of the parameter space anyway, so just doing it in SystemVerilog, possibly with some auto-generated code using Python, works well enough even if it's tedious.<p>So if the practical benefits turn out to be minor, why not take all the nice quality-of-life improvements anyway? Because there's a large impact on the hard things. From a strictly design perspective these are things like clock domain crossing, power, area and frequency optimization. Here you generally need a good understanding of what the actual circuit is doing, and to be able to connect tool output (e.g. the gates your synthesis tool has produced) back to your HDL. The typical flow of HDL -> SystemVerilog -> tool output can become a big problem here. The HDL-to-SystemVerilog step can produce very hard-to-read code that's hard to connect to your input HDL. This adds a new and tricky mental step when you're working with the design: first understand the circuit issue, then map that to the hard-to-read SystemVerilog, then map that to your HDL and work out what you need to change.<p>Outside of design alone, a major cost of building silicon is verification. Alternative HDLs generally don't address this at all and, again, can make it harder. Either you simulate the HDL itself entirely, which can be fine, but then you're banking on minimal bugs in that simulator and no bugs in the HDL -> SystemVerilog step. Alternatively you simulate the SystemVerilog directly with an existing simulator, but then you've got the HDL-to-SystemVerilog mapping problem all over again.<p>I think my ideal HDL at this point is a stripped-down SystemVerilog with a good type system and better generative capability that crucially produces plain SystemVerilog that's human readable (maintaining comments, signal and module names, and module hierarchy as much as possible).
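To make the parameterization point above concrete, here is the kind of generics an alternative HDL gives you. Illustrative only, assuming Chisel 3 and its util Queue; the module name is made up:

```scala
import chisel3._
import chisel3.util._

// Width and depth are ordinary Scala parameters; the instantiated
// hardware is a synchronous FIFO wrapping the library Queue.
class ParamFifo(width: Int, depth: Int) extends Module {
  val io = IO(new Bundle {
    val enq = Flipped(Decoupled(UInt(width.W)))
    val deq = Decoupled(UInt(width.W))
  })
  io.deq <> Queue(io.enq, depth)
}
```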
I tried to use Verilog for a DIY project and found no way to control a Verilog model from Python. Why is it like this? Do people really write tests directly in this awful outdated language instead of using Python?<p>I tried to use cocotb, but it's not what I want. It runs a Verilog simulator and launches a Python script from it, but I want it the other way around: I want to create a Verilog model instance and access it from Python.<p>Also, I found that Verilog seems to have no built-in feature for modeling realistic delays. Let's say we have a gate: when an input signal changes, the output should instantly go to the "undefined" (x) state and only after a delay switch to a valid 0 or 1 value. Verilog seems to have no such realistic delay built in. I found only "transport delay" and "inertial delay", both unrealistic. I had to create my own "delay gate" to simulate this (and made a lot of mistakes doing so).
One thing I really like about Verilog is explicit register widths. I want to be able to work at the individual bit level, something that Python (and even C) are not very good at. Is Chisel decent for efficiency?
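A short sketch of what that looks like in Chisel (hypothetical module, assuming Chisel 3): widths are written out explicitly and bit slicing maps onto the familiar Verilog part-selects.

```scala
import chisel3._

class BitTwiddle extends Module {
  val io = IO(new Bundle {
    val in  = Input(UInt(12.W))
    val nib = Output(UInt(4.W))
    val msb = Output(Bool())
  })
  val r = RegInit(0.U(12.W)) // register width is stated explicitly
  r := io.in
  io.nib := r(7, 4)          // bit slice, like in[7:4] in Verilog
  io.msb := r(11)            // single-bit select
}
```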
It's been like 20 years since I did anything with an FPGA, but back then you basically had to use whatever tools your vendor provided you with. Have things improved to the point where an open-source HDL is usable with a large fraction of the FPGAs available?
The goal is worthy, the effort is commendable, but the underlying language (Scala) is an absolute turn-off AFAIC, and I suspect I'm far from being the only one.