gluggymug commented on FPGAs becoming more SoC-like   semiengineering.com/fpgas... · Posted by u/chclau
planteen · 7 years ago
Yeah, I know you can use vim. Text editing is not even close to where the major pain points are with Vivado. There's more to the workflow of developing an FPGA than adding files in Vivado.

My experience went something like this: a hardware engineer needs to do a routine task like adding a peripheral, swapping some pin assignments, and modifying the Verilog/VHDL. So they do all their synthesis and have an export ready to hand off to the software engineer. They commit their changes and it probably causes differences in dozens of files, but so is life. It seems like this could be reduced to differences in a few human-readable files, except for the bitstream, which obviously is binary.

The SW engineer then needs to update the FSBL and BSP for the board. I never found a way to automate this on the command line; you needed to update the FSBL using their horrible Eclipse-based import tool. In my case, I had to make some manual modifications to the FSBL: I think I needed to flip some GPIO pins early in the boot process and also do some RSA validation on the bitstream. Well, all those modifications would get wiped out. I never found a way to template them and preserve them across new imports.

So I had a bunch of differences that had to be manually merged every time. I had notes about it, but come on. What a pain. At the end of all of this, many dozens more files were changed. Once again, it seems like this should reduce to just a handful of human-readable files: an FSBL configuration header / C file, the new U-Boot config, and the new kconfig. But instead you had two massive changesets in version control for some very routine work.

gluggymug · 7 years ago
I believe Tcl scripts can help automate your FSBL and BSP changes. Pretty much everything you do in the GUI translates to Tcl commands.
gluggymug commented on Ask HN: Where do I get started on ASICs, FPGA, RTL, Verilog et. al?    · Posted by u/bharatkhatri14
zerohp · 8 years ago
My experience agrees with yours. Many big-budget teams use a hardware emulator like the Palladium XP or the similar Synopsys device. Both are built from FPGAs.

Hardware emulators are expensive, but a single mask respin at 7, 10, or 16nm is even more expensive.

gluggymug · 8 years ago
There is a distinction between hardware emulators and FPGAs. Though hardware emulators such as Palladiums may use FPGAs inside them, they don't work the same way in terms of validation. The two tools are very different to use.

See myth 7 here: http://www.electronicdesign.com/eda/11-myths-about-hardware-...

gluggymug commented on Ask HN: Where do I get started on ASICs, FPGA, RTL, Verilog et. al?    · Posted by u/bharatkhatri14
spear · 8 years ago
That's a false dichotomy -- you can do FPGA verification in addition to simulation-based verification. And yes, there are ASIC teams that have successfully done that.
gluggymug · 8 years ago
At the SoC level, I don't think so.

The reasons are numerous. I already gave a few. I will give another. Once you have to integrate hard IP from other parties, you cannot synthesise it to FPGA. Which means you won't be able to run any FPGA verification with that IP in the design. You can get a behavioural model that works in simulation only. In fact it is usually a requirement for Hard IP to be delivered with a cycle accurate model for simulation.

I'll give another reason. If you are verifying on FPGA, you will be running a lot faster than simulation. The Design Under Test requires test stimulus at the speed of the FPGA. That means you have to generate that stimulus at speed and then check all the outputs of the design against expected behaviour at speed. This means you have to create additional HW to form the testbench around the design. That is a lot of additional work to gain verification speed, and it is not reusable once the design is synthesised for ASIC.

I can go on and on about this stuff. Maybe there are reasons for a particular product, but I am talking about general ASIC SoC work. I've got nothing against FPGAs; I am working on FPGAs right now. But real ASIC work uses simulation first and foremost. It is a dominant part of the design flow and FPGA validation just isn't. On an "Ask HN", you would be leading a newbie the wrong way by pointing to FPGAs. It is not done a lot.

gluggymug commented on Ask HN: Where do I get started on ASICs, FPGA, RTL, Verilog et. al?    · Posted by u/bharatkhatri14
phkahler · 8 years ago
I find the above pair of comments really interesting. I'm guessing there are parallels with differences of opinion and approach in other areas of engineering. There are always reasons for the differences, and those are usually rooted in more than just opinion or dogma.

In this case, I'd guess it's got a lot to do with cost vs relevance of the simulation. If you're Intel or AMD making a processor, I bet FPGA versions of things are not terribly relevant because it doesn't capture a whole host of physical effects at the bleeding edge. OTOH for simpler designs on older processes, one might get a lot of less formal verification by demonstrating functionality on an FPGA. But this is speculation on my part.

gluggymug · 8 years ago
"If you're Intel or AMD making a processor, I bet FPGA versions of things are not terribly relevant because it doesn't capture a whole host of physical effects at the bleeding edge."

Exactly. When you verify a design via an FPGA, you are essentially only testing the RTL for correctness. Once you synthesise for FPGA rather than the ASIC process, you diverge. In ASIC synthesis I have a lot more ability to meet timing constraints.

So given that FPGA validation only proves the RTL is working, ASIC projects don't focus on FPGA. We know we have to get the back-annotated gate-level simulation test suite passing. This is a major milestone for any SoC project. So, planning backwards from that point, we focus on building simulation testbenches that can work at both gate level and RTL.

I am not saying FPGAs are useless but they are not a major part of SoC work for a reason. Gate level simulation is a crucial part of the SoC design flow. All back end work is.

gluggymug commented on Ask HN: Where do I get started on ASICs, FPGA, RTL, Verilog et. al?    · Posted by u/bharatkhatri14
pslam · 8 years ago
As a veteran from the chip industry, I can tell you my experience is completely the opposite.

Nobody in their right mind would produce an ASIC without going through simulation as a form of validation. For anything non-trivial, that means FPGA.

gluggymug · 8 years ago
I don't agree. If it's non-trivial, I don't have the more advanced verification tools such as UVM if I prototype via FPGA.

The ability to perform constrained randomised verification is only workable via UVM or something like it. For large designs that is arguably the best verification methodology. Without visibility through the design to observe and record the possible corner cases of transactions, you can't be assured of functional coverage.

While FPGAs can run a lot more transactions, the ability to observe coverage of them is limited.
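To make the contrast concrete, here is a minimal SystemVerilog sketch of constrained-random stimulus with functional coverage, the kind of thing UVM builds on. The transaction class, address window and coverage bins are all invented for illustration, not taken from any real project:

```systemverilog
// Hypothetical bus transaction: rand fields, a constraint, and a covergroup.
class bus_txn;
  rand bit [31:0] addr;
  rand bit [1:0]  burst;

  // Constrain addresses to a made-up peripheral window, word-aligned.
  constraint c_addr {
    addr inside {[32'h4000_0000 : 32'h4000_FFFF]};
    addr[1:0] == 2'b00;
  }

  covergroup cg;
    cp_burst: coverpoint burst;                            // hit every burst type?
    cp_page:  coverpoint addr[15:12] { bins page[] = {[0:15]}; }  // pages touched
  endgroup

  function new();
    cg = new();
  endfunction
endclass

module tb;
  initial begin
    bus_txn t = new();
    repeat (1000) begin
      void'(t.randomize());  // solver picks values satisfying c_addr
      t.cg.sample();         // record which corner cases the stimulus reached
    end
    $display("coverage = %0.1f%%", t.cg.get_coverage());
  end
endmodule
```

The point is the `sample()`/`get_coverage()` loop: the simulator can tell you which corner cases your random stimulus actually exercised, which is exactly the visibility that is hard to get out of an FPGA running at speed.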

I have worked on multiple SoCs for Qualcomm, Canon and Freescale. FPGAs don't play a role in any SoC verification that I've worked on.

gluggymug commented on Ask HN: Where do I get started on ASICs, FPGA, RTL, Verilog et. al?    · Posted by u/bharatkhatri14
gluggymug · 8 years ago
As a veteran from the chip industry, I should warn you that all these suggestions about FPGAs for prototyping are not really done that much in the ASIC industry.

The skills to do front end work are similar but an ASIC design flow generally doesn't use an FPGA to prototype. They are considered slow to work with and not cost effective.

IP cores in ASICs come in a range of formats. "Soft IP" means the IP is not physically synthesised for you. "Hard IP" means it has been. The implications are massive for all the back end work. Once the IP is hard, I am restricted in how the IP is tested, clocked, reset and powered.

For front end work, IP cores can be represented by cycle accurate models. These are just for simulation. During synthesis you use a gate level model.

gluggymug commented on Clocks for Software Engineers   zipcpu.com/blog/2017/09/1... · Posted by u/mr_tyzic
teraflop · 8 years ago
I'm not surprised that software engineers find these concepts difficult to understand at first -- it's a very different way of thinking, and everyone has to start somewhere. But I do find it kind of odd that someone would jump straight into trying to use an HDL without already knowing what the underlying logic looks like. (My CS degree program included a bit of Verilog programming, but it only showed up after about half a semester of drawing gate diagrams, Karnaugh maps and state machines.)

Does this confusion typically happen to engineers who are trying to teach themselves hardware design, or is it just an indication of a terribly-designed curriculum?

gluggymug · 8 years ago
I say "terribly-designed curriculum".

Maybe engineers need to be introduced to the synthesis tools at the same time as the simulator tools.

Simulating RTL is only an approximation of reality, so emphasizing RTL simulation is bad. You see it over and over, though: people teach via RTL simulation.

Synthesis is the main concern. Can the design be synthesised into HW and meet the constraints? Because all the combinational logic gets transformed into something quite different in an FPGA.
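A classic example of that simulation/synthesis gap, sketched in plain Verilog (module and signal names are mine):

```verilog
// A combinational block with a missing branch. In RTL simulation "q" simply
// holds its old value when en is low, which can look intentional. Synthesis,
// however, must infer a level-sensitive latch to implement that behaviour,
// which on most FPGA fabrics is a timing and portability headache.
module latchy (input en, input d, output reg q);
  always @(*) begin
    if (en)
      q = d;
    // no else: q is not assigned on every path => inferred latch
  end
endmodule

// The fix: assign on every path (or give a default), so synthesis produces
// pure combinational logic that matches what simulation showed.
module no_latch (input en, input d, output reg q);
  always @(*) begin
    q = 1'b0;        // default keeps the logic combinational
    if (en) q = d;
  end
endmodule
```

Both modules simulate plausibly; only when you run them through synthesis and check the warnings and timing do you see the difference, which is why synthesis needs to be taught alongside simulation.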

gluggymug commented on Facebook recruiting and Unix systems   imgur.com/hw2pnDt... · Posted by u/abhisuri97
aspyrx · 8 years ago
Hey folks, I'm the student writing the emails in the post here. Thanks to everyone for their criticisms. While I was initially kind of shocked by the recruiter's response, I've had a lot of time to think about it today and have realized that I was being pretty damn condescending and spoke out of line without regard to the context. It's been a hard lesson learned. I honestly regret the whole exchange, and posting it online was inappropriate as well. I briefly debated deleting the image, but decided to leave it up for the sake of posterity and accountability.

Also, just to be clear, I do not (and never did) hold any hard feelings towards the recruiter; in fact, it was very kind of them to point out why I was not qualified in the first place. This has been probably the most reflective of how I let my ego get the best of me at times, and I hope it might serve as a warning to those who might be tempted to do the same "devsplaining" in similar situations.

Please let me know if you have any other criticisms beyond the ones already voiced in this thread. I'm reading through the comments here as I can, and it's been a lot of good advice. Thanks again.

gluggymug · 8 years ago
If you are not an old fart like myself, you probably haven't used an actual Unix system, but back in the days before the popularity of Linux you'd see a lot of Solaris.

You stuck to your guns and didn't just lie about Unix experience, so I commend you.

But if you really want the job, next time just lie and set them straight once you've gotten an interview. It is splitting hairs to make a big deal out of actual Unix experience vs Linux experience.

gluggymug commented on Rules for new FPGA designers   zipcpu.com/blog/2017/08/2... · Posted by u/jsnell
lvoudour · 8 years ago
I don't agree with teaching students experimental high-level languages in lieu of proven industry standards just because those standards are archaic and/or unintuitive. It's a great academic endeavor, but the FPGA (and ASIC) landscape is driven by industry, not by academia.

If you're aiming for an FPGA job after school you'll need to be proficient in verilog or vhdl (ideally both), there's no shortcut. The sooner you learn how to deal with their quirks and pitfalls (I agree they have a lot), the better. Sprinkle some good-ol' TCL in there and you're good to go. Yes python is better and more feature/library rich but the industry is still using TCL (which is not bad, just not modern).

Don't get me wrong, I'd like to see a standardized higher-level approach to hardware description, but unless the vendors agree on and support it there's very little chance it will be useful. The current trend in high-level synthesis is non-portable, vendor-specific tools. The only way I see the trend changing is when FPGAs become more mainstream (already happening in the server/deep learning sectors) and there's a critical mass of customers asking for FPGA tools on par with software tools (i.e. high-level languages, open source, etc.)

PS. You forgot the python based myHDL :)

gluggymug · 8 years ago
It's the experimental part of the high-level language that is the problem. I agree you shouldn't teach it to students. It just leads them down a divergent path away from what is done in industry. It isn't addressing the needs of the student, only their short-term "wants".

But the language is just a small part of the design process. You have to learn to design HW. The HW engineering project tailors the tool choices around the requirements of the product. It is assumed that engineers know the fundamentals; they can adapt to any high-level synthesis tool.

Vendors' training courses for all the fancy HLS tools are done in a few days at most. They don't have a semester for newbies to learn Verilog/VHDL or C/C++ first. It's assumed you know them.

gluggymug commented on Rules for new FPGA designers   zipcpu.com/blog/2017/08/2... · Posted by u/jsnell
FullyFunctional · 8 years ago
Good list for starters, but obviously the reset question has nuances (not opening that can of worms).

One thing that bit me when I was a complete n00b: assigning registers from within more than a single always block. On my simulator (at the time) it worked perfectly but the synthesis tool silently ignored one of the blocks.

EDA tools suck. There, I said it. Coming from a software background, it's truly shocking how poorly errors/warnings are handled. My "favorite" part is that you cannot enforce a "0 warnings" discipline, as the libraries and examples from the vendors provoke thousands of warnings and the only workaround is to filter the individual instances of the messages.

gluggymug · 8 years ago
"One thing that bit me when I was a complete n00b: assigning registers from within more than a single always block. On my simulator (at the time) it worked perfectly but the synthesis tool silently ignored one of the blocks."

It's tool dependent but I believe you should see a warning that two drivers are assigned to the same net.
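For illustration, a minimal Verilog version of that situation (module and signal names are invented):

```verilog
// Two always blocks assigning the same variable. Most simulators accept this
// (within a time step, the last write wins), but synthesis tools treat it as
// multiple drivers: some error out, others keep only one of the blocks.
module bad_multidrive (input clk, input a, input b, output reg q);
  always @(posedge clk) q <= a;
  always @(posedge clk) q <= b;  // may be silently discarded by synthesis
endmodule

// Keep all of a register's assignments in a single always block instead:
module good_single (input clk, input sel, input a, input b, output reg q);
  always @(posedge clk) q <= sel ? b : a;
endmodule
```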

I am guessing this is where you mistakenly thought you were creating a register in Verilog with the keyword "reg". Synthesis tools don't work like that and haven't for quite a while.

Taken from https://blogs.mentor.com/verificationhorizons/blog/2013/05/0... :

"Initially, Verilog used the keyword reg to declare variables representing sequential hardware registers. Eventually, synthesis tools began to use reg to represent both sequential and combinational hardware as shown above and the Verilog documentation was changed to say that reg is just what is used to declare a variable. SystemVerilog renamed reg to logic to avoid confusion with a register – it is just a data type (specifically reg is a 1-bit, 4-state data type). However people get confused because of all the old material that refers to reg."
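In other words, the data type doesn't decide what hardware you get; the style of the process does. A small SystemVerilog sketch of my own to show both cases side by side:

```systemverilog
// Neither "reg" nor "logic" implies a hardware register by itself.
module demo (input clk, input a, input b,
             output logic y_comb, output logic y_ff);
  // Combinational: driven from always_comb => gates, no register.
  always_comb y_comb = a & b;

  // Sequential: same data type, but the clocked block infers the flip-flop.
  always_ff @(posedge clk) y_ff <= a ^ b;
endmodule
```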

A lot of people here on HN seem to be self taught and not keeping up with tool and language developments. If you use tools and techniques from the 90s, don't expect wonderful results.
