dalance · 2 years ago
Hi, I'm the developer of Veryl, so I'll try to answer the question "why a new HDL?".

I have used SystemVerilog for 10+ years and have developed many large ASICs. But the development environment for SystemVerilog was poorer than that for software development, so I wrote some SystemVerilog tools to improve it, like this one: https://github.com/dalance/svlint

After writing them, I felt that further improvement would be difficult because the SystemVerilog specification is too complicated. (For example, even commercial EDA tool vendors can't cover the whole specification...)

Therefore I decided to develop a new language to replace SystemVerilog. My focus is that the new language can be used for production ASIC development; I plan to develop part of a new project at my company using Veryl.

fayalalebrun · 2 years ago
I probably don't have nearly as much experience as you do, but I have used VHDL, Verilog, and modern HDLs like Chisel and SpinalHDL. I think the main advantage of a modern HDL is having the full power of a traditional programming language when it comes to generating hardware. This especially helps when making deeply parameterizable and reusable hardware in a fraction of the lines SystemVerilog would take, and in ways that are sometimes impossible in plain Verilog.

From a first impression, your language doesn't look all that different from SystemVerilog. Does it have any features that make parameterization easier than SystemVerilog? Can I, for example, easily generate hardware using higher order functions and other functional programming features like those available in Rust and Scala?
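For readers unfamiliar with what "higher-order" hardware generation buys you, here is a sketch in plain Python (an illustration of the idea only, not Veryl, Chisel, or any real HDL's API): a generator parameterized by a combining function emits a balanced reduction tree as SystemVerilog text, the kind of structure that is tedious to express with Verilog `generate` blocks.

```python
# Toy higher-order generator: build a balanced reduction tree over a
# list of signal names, emitting one SystemVerilog assign per tree node.
# The combining operator is passed in as a function (higher-order style).

def reduce_tree(signals, combine):
    """Pair signals up level by level until one root signal remains."""
    lines = []
    level = 0
    while len(signals) > 1:
        nxt = []
        for i in range(0, len(signals) - 1, 2):
            name = f"n{level}_{i // 2}"
            lines.append(f"assign {name} = {combine(signals[i], signals[i + 1])};")
            nxt.append(name)
        if len(signals) % 2:      # odd signal out carries over unchanged
            nxt.append(signals[-1])
        signals = nxt
        level += 1
    return signals[0], lines

# Generate a 5-input AND tree; swapping the lambda changes the operator.
root, body = reduce_tree([f"in[{i}]" for i in range(5)],
                         lambda a, b: f"{a} & {b}")
print("\n".join(body))
print(f"assign out = {root};")
```

Swapping the lambda for `lambda a, b: f"{a} + {b}"` would emit an adder tree instead; that interchangeability is the point of the question.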

gchadwick · 2 years ago
Powerful generative capabilities aren't always as useful as you might think. There are two major issues:

1. Verification - You can verify one particular part of the configuration space but verifying the full generic component is something else entirely. As far as I'm aware there's no new HDL which seriously tries to address this point.

2. Implementation - If you're generating something sufficiently advanced, you likely want a different micro-architecture for different configurations to reach the optimal design (in terms of power, timing and area). As an example, take a CPU: single, dual and triple issue cores will need to be designed in very different ways. You could aim to build something which can generate all of these wrapped up as a nice CPU module with an 'IssueWidth' parameter, but that's going to be harder than just writing separate 1, 2 and 3 issue width CPUs.

Certainly for more mechanical things like interconnects, interrupt controllers, pin multiplexers etc yes it can work well. However building those things in System Verilog is often done with separate generator programs anyway and overall doesn't consume much of the total project engineering time, it's just tedious work.

It does seem a lot of new HDLs focus on eradicating the annoyances and tedium you get developing with System Verilog, but increase the difficulties in points 1 and 2, which are the actual hard bits that take up the bulk of the time.

Early days for Veryl, but it's taking a different direction from most (i.e. just building a saner System Verilog). I shall be watching with interest!

dalance · 2 years ago
I plan to introduce generics so that a module/interface/package can be used as a type parameter. But I'm not aiming for Chisel-like programmability.

I tried to use Chisel in a large codebase to judge whether it could be used as a SystemVerilog alternative at my company, and I found many problems that make it difficult to fit into an ASIC development flow. I think semantic differences from SystemVerilog cause some of these problems.

So I aim for Veryl to have almost the same semantics as SystemVerilog. I think this also eases interoperation with SystemVerilog codebases.

Rochus · 2 years ago
> because the specification of SystemVerilog is too complicated

Right, too big and complicated, too much stuff to decently cover in one language (more than e.g. Ada covers), and the majority of people use only a fraction of it (different identifiable user groups use different subsets).

Do you try to add the full SV feature set in your language, or do you focus on a user group/use-case?

dalance · 2 years ago
I'm focusing on ASIC designers who write synthesizable SystemVerilog, so I choose language features that avoid mismatches between synthesis and simulation. Verification engineers are a second group, but verification features like SVA and UVM are too large, so I think it may be better to provide them through interoperation with SystemVerilog instead of adding them to Veryl.
polalavik · 2 years ago
I was going through all the alternative HDLs the other day to see what's out there. So many one-off projects appealing to this language or that language, but none solving real problems in the space.

The problem? Modeling large systems with modules with large I/O (DSPs or something else) and verifying those things is hard and not well suited for a complex verbose language.

Fight me, but I really think a Mathworks Simulink-type visual modeling environment is a better way to approach this problem (and I hate Mathworks with a passion, this isn't a plug for Matlab). That type of environment, coupled with a way to generate high-quality HDL, is what I really want. I know Mathworks has HDL Coder and it's OK, but it still doesn't quite do everything, and you have to pay the enormous Mathworks fee and can't crowdsource platform updates as well as a well-managed open source project might be able to offer.

So when I see stuff like this I think “cool” but also “why???”

omgJustTest · 2 years ago
If you are doing hardware verification (beyond just using IP blocks in the space), verification tasks are typically oriented around:

1. Design : architectures need to be well constructed

2. Well-derived tests on case-elements/state-conditions : asserting that a specific signal value achieves a specific state at a specific time is not hard, but coming up with the conditions is.

3. Placement- and technology-specific details: delays, glitch management, etc.

Event-driven simulators are essentially doing a lot of simple things correctly, so the language doesn't need to be full-featured; ideally it is as computationally efficient as possible, since running the verification takes the most time for complex systems.
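The "lot of simple things done correctly" can be sketched in a few lines of Python (a toy illustration, not how any production simulator is implemented): a time-ordered event queue dispatches signal updates, and only genuine value transitions wake the processes sensitive to them.

```python
# Toy event-driven simulation kernel: a heap orders pending events by
# time, and callbacks registered on a signal fire only on real changes.
import heapq

class Sim:
    def __init__(self):
        self.now = 0
        self.queue = []          # (time, seq, signal, value); seq breaks ties
        self.seq = 0
        self.values = {}         # current value of each signal
        self.watchers = {}       # signal -> list of callbacks (sensitivity)

    def schedule(self, delay, signal, value):
        heapq.heappush(self.queue, (self.now + delay, self.seq, signal, value))
        self.seq += 1

    def on_change(self, signal, cb):
        self.watchers.setdefault(signal, []).append(cb)

    def run(self):
        while self.queue:
            self.now, _, sig, val = heapq.heappop(self.queue)
            if self.values.get(sig) != val:      # only real transitions fire
                self.values[sig] = val
                for cb in self.watchers.get(sig, []):
                    cb()

sim = Sim()
# Model a NOT gate with a 2 time-unit delay: "out" follows ~"a".
sim.on_change("a", lambda: sim.schedule(2, "out", 1 - sim.values["a"]))
sim.schedule(0, "a", 0)
sim.schedule(5, "a", 1)
sim.run()
print(sim.now, sim.values)
```

Everything a real simulator adds (delta cycles, four-state logic, scheduling regions) layers on top of this same loop, which is why the core can stay simple while the checking around it dominates runtime.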

Additionally after logic has been verified, there are many physical implementation details that have to be layered on top of this, including placement & route optimizations.

Veryl to me looks like a SystemVerilog competitor in Rust-like language aspects.

bb88 · 2 years ago
> Additionally after logic has been verified, there are many physical implementation details that have to be layered on top of this, including placement & route optimizations

Software designed chip fabrication. Please someone make this happen in my lifetime.

stefanpie · 2 years ago
Funnily enough, Matlab offers a high-level synthesis (HLS) tool to lower Matlab code and Simulink models to Verilog [1]. I have not personally used it, but the last I heard, it works well enough for those who need it, and Matlab puts in the work to support it. However, HLS is another whole can of worms, and it can be confusing when discussing new HDLs and the distinction between HLS tools and high-level HDL dialects.

[1] https://www.mathworks.com/discovery/high-level-synthesis.htm...

pclmulqdq · 2 years ago
Matlab HLS is fantastic due to the restricted problem space. The C-to-hardware HLS stuff often has more mixed results.
ijuz · 2 years ago
I do not see the point; there is not much syntactic sugar.

Basically all non-SystemVerilog HDLs suffer from the issue that they maybe make development simpler, but on the other hand make the code much harder to test, because the generated SystemVerilog output is what has to be simulated. Additionally, of course, synthesis and timing reports will give you line numbers in the generated SystemVerilog files instead of line numbers in the randomHDL source.

kimmeld · 2 years ago
System Verilog is the C++ of chip design. It sucks, but it’s everywhere and tied into everything. Like C++ there are so many ways to write dangerous code. We do need a new chip design language but it’s going to take something special to move the needle.
bb88 · 2 years ago
30 years or so ago, there were a handful of programming languages in regular use: C/C++/Assembly/Ada/Fortran/Pascal. Now there are 100x more that people use professionally.

I always wondered what would happen if hardware engineers bothered to learn compiler theory. Maybe this could have been a solved problem decades ago.

vrinsd · 2 years ago
This is a pretty heavy handed statement -- there are plenty of "hardware engineers" who know plenty about compiler theory and/or have contributed significantly to it. A similarly flippant comment might be "if software engineers only understood hardware better we might have smartphones that last a month on a charge and never crash".

The challenge with hardware is that unlike "traditional" software which compiles to a fixed instruction set architecture, with hardware you might literally be defining the ISA as part of your design.

In hardware you can go from LEGO style gluing pre-existing building blocks to creating the building blocks and THEN gluing it together, with everything in-between.

The real crux of the problem is likely our modern implementation of economics -- a CS graduate with base-level experience can command a crazy salary, while some guy with a BSEE, MSEE and PhD in Electrical Engineering ("hardware") will be lucky to get a job offer that's even enough to cover the cost of his education.

Until the "industry" values hardware and those who want to improve it, you'll likely see slow progress.

P.S.

VHDL (a commonly-used hardware description language) is more or less Ada. Personally I think the choice of Ada syntax was NOT a positive for hardware design, even though its type safety and verbosity are a very apt fit for software.

wbl · 2 years ago
This is really a list of the survivors. Tcl was big in the early 1990s, and Objective-C was around on NeXT and other boxes. Perl was rapidly becoming the glue that holds the web together. Lisp was embedded into a lot of programs for extensibility, as it is now. It's also overstating Fortran's role, which was much like today's niche.

It's very hard to have an objective view of what programming language success and popularity look like over that long a time, but I think that today there is a much narrower happy path. Either you're a dynamically typed multiparadigm language that's mostly imperative and OO in practice (Ruby, Python, JavaScript), a statically typed object-oriented imperative language with brackets (C#, Go, Java), or Rust (where a lot of people don't realize how good GCs got instead). Not a ton of new Haskell- or SML-inspired languages.

By contrast, in the 1980s-1990s there were serious questions about which of Pascal, C, Objective-C, Smalltalk, and C++ was going to win out as the dominant language systems are built in. Stuff like Display PostScript depended in a deep way on exposing pretty alien programming languages to the engineers who had to work with it.

imtringued · 2 years ago
You mean if compiler theorists learned about parallel processing...
fpgamlirfanboy · 2 years ago
> It sucks, but it’s everywhere and tied into everything.

meh lots of places aren't actually programming system verilog - they're using python/perl to generate it. that's not the same thing.
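To make the "python/perl generating SystemVerilog" workflow concrete, here is a minimal sketch (module name and coding style invented for illustration): a short script stamps out an N-deep register pipeline, exactly the kind of mechanical boilerplate these scripts absorb.

```python
# Toy generator script: emit SystemVerilog for an N-deep register
# pipeline of a given width, rather than writing the stages by hand.

def gen_pipeline(name, width, depth):
    regs = [f"  logic [{width - 1}:0] stage{i};" for i in range(depth)]
    # Each stage registers the previous one; stage0 registers the input.
    ff = [f"    stage{i} <= " + (f"stage{i - 1};" if i else "din;")
          for i in range(depth)]
    return "\n".join([
        f"module {name} (",
        "  input  logic clk,",
        f"  input  logic [{width - 1}:0] din,",
        f"  output logic [{width - 1}:0] dout",
        ");",
        *regs,
        "  always_ff @(posedge clk) begin",
        *ff,
        "  end",
        f"  assign dout = stage{depth - 1};",
        "endmodule",
    ])

print(gen_pipeline("pipe3", 8, 3))
```

The downside the parent comments point out applies directly: the simulator and the timing reports will reference this generated text, not whatever description produced it.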

kimmeld · 2 years ago
Mangling the quote about regexes… http://regex.info/blog/2006-09-15/247

Some people when confronted with a problem, think “I know, I’ll use a different language to generate another language.” Now they have two problems.

fpgamlirfanboy · 2 years ago
i'll hop on the dismissive bandwagon but not for the same reason as everyone else: not a single one of these HDLs ever (ever) amounts to more than perl scripts.

why? how? how could that possibly be? this one is written in rust...?

because not a single one of them, ever (ever) emits a netlist. they all emit verilog (or vhdl). hence: overengineered perl scripts (which actual employed RTL engineers use heavily...).

vrinsd · 2 years ago
That's because in order to turn your "design" into an ASIC or an FPGA, you need to go from whatever high-level language (insert new-HDL-du-jour) into a netlist compatible with your physical part.

The only scalable/vendor/device-neutral way to do that is to back-end your "new" HDL with Verilog/SystemVerilog/VHDL, especially if you want to simulate your design.

A long time ago people used to design chips and CPLDs/FPGAs in schematic capture (which actually does have its place), which is basically one step removed from being a netlist.

Among other challenges, if you wanted to change to a new device (say a bigger part) or a part from a different vendor, there wasn't really a "neutral" way to do this until HDLs came around.

To really change the status quo, you'd have to own EVERYTHING from the point of design entry to the physical device and all steps between.

Even with both "major" FPGA vendors now having their own synthesis and sometimes simulation capability, they can barely make that minimum bar work reliably.

fpgamlirfanboy · 2 years ago
> The only scalable/vendor/device-neutral way to do that is to back-end your "new" HDL with Verilog/SystemVerilog/VHDL

i mean like the guy below says, but in the opposite tone, that's like saying building a C emitter is the same as building a compiler. like i just don't agree (for C, because C isn't some abstract device model and isn't (easily) optimizable) because you are forever yoked to the assumptions/affordances of the target language (which is ultimately a language and not an IR/netlist/whatever).

> The only scalable/vendor/device-neutral way

blif is a thing, eblif, xml, etc. are there points of ingress in vivado or intel's thing for any of these? i don't remember but regardless i agree they're probably not well-supported.

> To really change the status quo, you'd have to own EVERYTHING from the point of design entry to the physical device and all steps between.

nah that's not true. LLVM/GCC/etc. FOSS compilers that grew up in the last ~20-30 years prove you don't need to own the last mile to change the status quo. if you build it, device manufacturers will come.

the truth is simply that performant logic synthesis, tech mapping, place and route, etc are just all much harder to implement than XYZ compiler pass. but it could be done given maybe 20*5 highly competent engineer years? i dunno pulling it out of my ass but i know i could write some of that stuff and i'm not that smart. maybe i'm underestimating by 2x (40*5) but that's still less than most "exciting" startups that raise series A today. it just takes diligence and "hard" technical skills (familiarity with the relevant optimization literature).

> Even with both "major" FPGA vendors now with their own synthesis and sometimes simulation capability they can barely and/or reliably make that minimum bar work.

not sure what's the minimum bar you're referring to but sure i agree - i work at one of em (the one that didn't file divorce papers recently...) and i still readily admit the toolchain is trash. which is why i wish some consortium would get together and replace it (because i would love to be able to program/design on our own parts using a modern tool suite). and no, circt/yosys/vtr/whatever aren't there and probably won't ever be (yosys in particular is terribly disappointing for all the waves/claims it makes).

fayalalebrun · 2 years ago
Chisel does emit FIRRTL, which can be made into a bitstream directly by Yosys.
zem · 2 years ago
for a software analogy, a lot of programming languages compile to C; that doesn't make them perl scripts!
fpgamlirfanboy · 2 years ago
> that doesn't make them perl scripts!

but it doesn't make them compilers either :shrug:

whateveracct · 2 years ago
I highly recommend looking into Clash as an alternate HDL. It's a GHC Haskell backend - exceedingly cool.

And there's a book: "Retrocomputing in Clash"