Chisel: A Modern Hardware Design Language

156 points by nairboon over 1 year ago

17 comments

BooneJS over 1 year ago
These languages are fun. "Look ma, no verilog!" But the underlying problem with all of these DSLs is the fact that the EDA[0] industry interoperates on verilog. Period. Worse, at some point in the design cycle, post-synthesized gate-level verilog becomes the codebase.

No upstream verilog changes are allowed, because it can be difficult to get a precise edit (e.g. turning a 2-input NAND into a 2-input AOI by changing a verilog function) and you just don't have 3 weeks of runtime to go from verilog to GDSII again. Or you want to make a metal-only respin that changes only one $0.xM mask layer and requires 8 weeks of fab time, instead of changing multiple metal layers including the base, which needs 16 weeks and a $xM payment.

Programming language design is quite rich because languages used to cross-compile to C, and now they generally generate LLVM IR. It doesn't matter what the bug is in the final binary; you're not going to hex-edit the binary the way you would a single metal layer of a 300mm wafer. You're just going to recompile, and it generally doesn't matter whether one machine instruction changes or 1M do, because unlike verilog, not even GHC needs 3 weeks to compile a program.

Source: I've been on chip design teams for 2 decades and finally gave up on fighting verilog.

[0]: Electronic Design Automation. Synopsys, Cadence, Siemens, Ansys, etc.
tails4e over 1 year ago
Every time I look at the examples, coming from a Verilog background, it's strange to see that the clock and reset are implicit rather than explicit. The blinking LED example, for instance, is readable, but the link between it and the clock and reset in the generated Verilog is not clear. How are multiple clock domains and async CDCs handled? I've never used Chisel, so maybe this is all well managed, but not being explicit about the clock domain seems strange.
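For readers wondering what this looks like in practice, here is a minimal sketch (module and signal names are mine, not from the article): a register that picks up the module's implicit clock/reset, and a second block elaborated under an explicitly supplied pair via withClockAndReset.

```scala
import chisel3._

class Blinky extends Module {
  val io = IO(new Bundle {
    val led = Output(Bool())
  })
  // This register implicitly uses the Module's clock and reset.
  val cnt = RegInit(0.U(24.W))
  cnt := cnt + 1.U
  io.led := cnt(23)
}

class TwoDomains extends Module {
  val io = IO(new Bundle {
    val otherClock = Input(Clock())
    val otherReset = Input(Bool())
    val led        = Output(Bool())
  })
  // Hardware elaborated inside withClockAndReset uses the supplied
  // clock/reset instead of the implicit pair. Note that driving io.led
  // from the other domain without a synchronizer is exactly the CDC
  // hazard the comment above is asking about.
  io.led := withClockAndReset(io.otherClock, io.otherReset) {
    val cnt = RegInit(0.U(24.W))
    cnt := cnt + 1.U
    cnt(23)
  }
}
```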
donatj over 1 year ago
Not being a hardware person, when I heard "Hardware Design Language" I was thinking more along the lines of Snow White[1] - the idea of an open-source industrial design language would be pretty interesting, something along the lines of Material UI but for hardware.

1. https://en.wikipedia.org/wiki/Snow_White_design_language
irdc over 1 year ago
I don't understand how this improves upon VHDL, even after reading their own explanation[0]. Just why they think object orientation makes hardware design easier isn't really explained. After a quick look at it I much prefer VHDL's entities (though their syntax is rather too wordy for my tastes), which at least make the direction of signals clearer. The problem with libraries could have been easily solved by extending/fixing VHDL instead of going through all this effort.

0. https://stackoverflow.com/questions/53007782/what-benefits-does-chisel-offer-over-classic-hardware-description-languages
modulovalue over 1 year ago
There's a similar project at Intel: https://github.com/intel/rohd

It uses Dart instead of Scala.
progbits over 1 year ago
I've played with this, and while I much prefer it to Verilog (and even migen) it is still too implicit for me.

Most of my time building a toy VGA design was wasted debugging wrong counter wrapping and similar. I would like to try something where the register width and the operation semantics (wrap, extend, saturate) always have to be explicit.

Maybe it would turn out to be too annoying?
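To make the wrapping complaint concrete, a minimal sketch (mine, assuming Chisel's non-expanding + truncates to the operand width): the width is explicit in the type, but wrap-around is still the silent default, and saturation has to be written out by hand.

```scala
import chisel3._

class Counter8 extends Module {
  val io = IO(new Bundle {
    val wrapped   = Output(UInt(8.W))
    val saturated = Output(UInt(8.W))
  })
  val wrapping   = RegInit(0.U(8.W))
  val saturating = RegInit(0.U(8.W))

  // Non-expanding +: the carry is dropped, so 255 silently wraps to 0.
  wrapping := wrapping + 1.U

  // Saturation is not built in; it must be spelled out explicitly.
  saturating := Mux(saturating === 255.U, 255.U, saturating + 1.U)

  io.wrapped   := wrapping
  io.saturated := saturating
}
```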
gchadwick over 1 year ago
I've said as much before, but I find the issue with alternative HDLs vs SystemVerilog is that they concentrate on fixing annoying and frustrating things but don't address the really hard issues in hardware design, and can actually make them harder.

For example, SystemVerilog has no real typing, which sucks, so a typical thing to do is to build a massively improved type system for a new HDL. However, in my experience good use of verilog style guides and decent linting tools solves most of the problem. You do still get bugs caused by missed typing issues, but they're usually quickly caught by simple tests. It's certainly *annoying* to have to deal with all of this, but fundamentally, even if it's all made easier, it's not significantly improving your development time or final design quality.

Another typical improvement you'll find in an alternative HDL is vastly improved parameterization and generics. Again this is great to have, but it mostly makes tedious and annoying tasks simpler rather than producing major impact. The reason is that writing good HDL that works across a huge parameterization space is very hard. You have to verify every part of the parameter space you're using, and you need to ensure you get good power/performance/area results out of it too. Doing this can require very different micro-architectural decisions (e.g. single-, dual- and triple-issue CPUs all need to be built differently; improved parameterization doesn't save you from this). Ultimately you often only want to use a small portion of the parameter space anyway, so just doing it in SystemVerilog, possibly with some auto-generated code using Python, works well enough even if it's tedious.

So if the practical benefits turn out to be minor, why not take all the nice quality-of-life improvements anyway? Because there's a large impact on the hard things. From a strictly design perspective these are things like clock domain crossing and power, area and frequency optimization. Here you generally need a good understanding of what the actual circuit is doing, and to be able to connect tool output (e.g. the gates your synthesis tool has produced) to your HDL. The typical flow of HDL -> SystemVerilog -> tool output can become a big problem here. The HDL-to-SystemVerilog step can produce very hard-to-read code that's difficult to connect to your input HDL. This adds a new and tricky mental step when you're working with the design: first understand the circuit issue, then map that to the hard-to-read SystemVerilog, then map that to your HDL and work out what you need to change.

Outside of design alone, a major cost of building silicon is verification. Alternative HDLs generally don't address this at all, and again can make it harder. Either you entirely simulate the HDL itself, which can be fine, but then you're banking on minimal bugs in that simulator and no bugs in the HDL-to-SystemVerilog step. Alternatively you simulate the SystemVerilog directly with an existing simulator, but then you've got the HDL-to-SystemVerilog mapping problem all over again.

I think my ideal HDL at this point is a stripped-down SystemVerilog with a good type system and better generative capability that crucially produces plain SystemVerilog that's human readable (maintaining comments, signal and module names, and module hierarchy as much as possible).
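To make the "improved parameterization" point concrete, a minimal sketch (names and structure are mine): in Chisel the parameters are ordinary Scala values and elaboration is ordinary Scala code, which is the convenience being discussed. It does nothing about the comment's deeper point that every configuration you actually ship still needs its own verification and timing/area closure.

```scala
import chisel3._

// Hypothetical example: a configurable-depth pipeline built by plain Scala iteration.
class PipelinedXor(width: Int, stages: Int) extends Module {
  val io = IO(new Bundle {
    val a   = Input(UInt(width.W))
    val b   = Input(UInt(width.W))
    val out = Output(UInt(width.W))
  })
  // foldLeft inserts `stages` register stages at elaboration time,
  // where SystemVerilog would use parameters plus generate blocks.
  io.out := (0 until stages).foldLeft(io.a ^ io.b)((v, _) => RegNext(v))
}
```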
codedokode over 1 year ago
I tried to use Verilog for a DIY project and found no way to control a Verilog model from Python. Why is it like this? Do people really write tests directly in this awful, outdated language instead of using Python?

I tried to use cocotb, but it is not what I want. It runs a Verilog simulator and launches a Python script from it, but I want it the other way around: I want to create a Verilog model instance and access it from Python.

Also, I found that Verilog seems to have no built-in feature for modeling realistic delays. Let's say we have a gate: when an input signal changes, the output should instantly go to the "undefined" (x) state and only after a delay switch to a valid 0 or 1 value. Verilog seems to have no such type of realistic delay built in. I found only "transport delay" and "elastic delay", both unrealistic. I had to create my own "delay gate" to simulate this (and made a lot of mistakes doing so).
programjames over 1 year ago
One thing I really like about Verilog is explicit register widths. I want to be able to work at the individual bit level, something that Python (and even C) are not very good at. Is Chisel decent for efficiency?
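For what it's worth, a minimal sketch (mine; not a benchmark of "efficiency") of bit-level work in Chisel: widths live in the type, and individual bits or slices are extracted with apply and reassembled with Cat.

```scala
import chisel3._
import chisel3.util.Cat

class ByteSwap extends Module {
  val io = IO(new Bundle {
    val in  = Input(UInt(16.W))
    val out = Output(UInt(16.W))
  })
  val lo = io.in(7, 0)   // bit slice [7:0]
  val hi = io.in(15, 8)  // bit slice [15:8]
  io.out := Cat(lo, hi)  // reassemble with the bytes swapped
}
```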
cmrx64 over 1 year ago
I much prefer SpinalHDL, having used both.
physPop over 1 year ago
Anyone have experience with both who can compare this with Clash?
aidenn0 over 1 year ago
It's been like 20 years since I did anything with an FPGA, but back then you basically had to use whatever tools your vendor provided you with. Have things improved to the point where an open-source HDL is usable with a large fraction of the FPGAs available?
throw10920 over 1 year ago
I tried to use Chisel a few years ago, and gave up quickly because of the absolutely abysmal tooling. sbt is a nightmare.
dilawar over 1 year ago
Nice. BlueSpec (Haskell-based) was open-sourced a few years ago. I wonder how Chisel compares with it.
klysm over 1 year ago
Has anybody made this for Factorio?
tmitchel2 over 1 year ago
I wish there was one in TypeScript; I just can't get on with Python.
ur-whale over 1 year ago
The goal is worthy, the effort is commendable, but the underlying language (Scala) is an absolute turn-off AFAIC, and I suspect I'm far from being the only one.