Readit News
JoachimS commented on The FPGA turns 40   adiuvoengineering.com/pos... · Posted by u/voxadam
maxdamantus · 2 months ago
Yes, performance (per watt, or per mass of silicon).

Profit is dependent on scale. FPGAs are useful if the scale is so small that an ASIC production line is more expensive than buying a couple of FPGAs.

If the scale is large enough that ASIC production is cheaper, you reap the performance improvements.

Think of it this way: FPGAs are programmed using ASIC circuitry. If you programmed an FPGA using an FPGA (using ASIC circuitry), do you think you'll achieve the same performance as the underlying FPGA? Of course not (assuming you're not cheating with some "identity" compilation). Same thing applies with any other ASIC.

Each layer of FPGA abstraction incurs a cost: more silicon/circuitry/resistance/heat/energy and lower clock speeds.

JoachimS · 2 months ago
Yes, profit depends on scale. But far from everything sells in millions of units, and scale is not everything. Mobile base stations sell in the thousands and sometimes benefit from ASICs. But the ability to adapt the base station to regional requirements, and to support several generations of systems with one design, makes FPGAs very attractive. So in this case, the scale makes FPGAs the better fit.
JoachimS commented on The FPGA turns 40   adiuvoengineering.com/pos... · Posted by u/voxadam
15155 · 2 months ago
> The ability to reprogram an FPGA to implement a new digital circuit in milliseconds would be a game changer for many workloads

Someone has to design each of those reconfigurable digital circuits and take them through an implementation flow.

Only certain problems map well to easy FPGA implementation: anything involving memory access is quite tedious.

JoachimS · 2 months ago
The ability to optimize memory access and memory configuration is sometimes a game changer. And modern FPGA tools have functionality to make memory access quite easy. Not as easy as on an MCU/CPU, but basically the same as for an ASIC.

I would also question the premise that memory access is less tedious and easier on MCUs/CPUs. Especially if you need deterministic performance and response times: most CPUs have memory hierarchies.

The more practical attempts at dynamic, partial reconfiguration involve swapping out accelerators for specific functions: encoders and decoders for different wireless standards, or different curves in crypto, for example. And yes, somebody has to implement those.

JoachimS commented on The FPGA turns 40   adiuvoengineering.com/pos... · Posted by u/voxadam
fake-name · 2 months ago
> Synthesis results can vary from run to run on the exact same code with the same parameters, with real world impacts on performance.

This is because some of the challenges in the synthesis/routing process are effectively NP-hard. Instead, the compiler uses heuristics and a random process to try to find a valid solution that meets the timing constraints, rather than the best possible solution.

I believe you can control the synthesis seed to make things repeatable, but the stochastic nature of the process means that any change to the input can substantially change the output.

JoachimS · 2 months ago
Yes, you can control the seeds and get deterministic bitstreams. Depending on the device and tools, you can also assist the tools by providing floorplanning constraints. And one can of course try out seeds to find designs that meet the results you need. Tillitis uses this to find seeds that generate implementations that meet the timing requirements; it's in their custom tool flow.
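To make the seed point concrete, here is a toy, deliberately simplified stand-in for place & route: a random initial placement followed by greedy swap improvement. It is not how real tools work internally, but it shows why a fixed seed gives a reproducible result and why sweeping seeds can find a better one (real P&R tools such as nextpnr expose a seed option for the same reason). All names and the tiny netlist below are invented for illustration.

```python
import random

def toy_pnr(cells, nets, grid, seed, iters=200):
    """Toy stand-in for FPGA place & route: random initial placement,
    then greedy swap improvement. Deterministic for a given seed."""
    rng = random.Random(seed)
    slots = [(x, y) for x in range(grid) for y in range(grid)]
    rng.shuffle(slots)
    place = dict(zip(cells, slots))  # each cell gets one grid slot

    def wirelength():
        # Manhattan distance summed over all two-pin nets
        return sum(abs(place[a][0] - place[b][0]) + abs(place[a][1] - place[b][1])
                   for a, b in nets)

    best = wirelength()
    for _ in range(iters):
        a, b = rng.sample(cells, 2)               # candidate swap
        place[a], place[b] = place[b], place[a]
        cost = wirelength()
        if cost <= best:
            best = cost                           # keep improving swaps
        else:
            place[a], place[b] = place[b], place[a]  # revert worse swaps
    return best

cells = list("abcdef")
nets = [("a", "b"), ("b", "c"), ("c", "d"), ("d", "e"), ("e", "f"), ("f", "a")]

# Same seed, same result: the run is reproducible.
assert toy_pnr(cells, nets, 4, seed=1) == toy_pnr(cells, nets, 4, seed=1)

# Sweeping seeds is a crude way to hunt for a result that "meets timing".
results = {s: toy_pnr(cells, nets, 4, seed=s) for s in range(10)}
print("best seed:", min(results, key=results.get),
      "wirelength:", min(results.values()))
```

The swap-and-revert loop is the essence of the heuristic nature the parent comment describes: the answer depends on the random trajectory, so fixing the seed fixes the bitstream.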
JoachimS commented on The FPGA turns 40   adiuvoengineering.com/pos... · Posted by u/voxadam
avidiax · 2 months ago
FPGAs are an amazing product that almost shouldn't exist if you think about the business and marketing concerns. They are a product that is too expensive at scale. If an application takes off, it is eventually cheaper and more performant to switch to ASICs, which is obvious when you see the 4-digit prices of the most sophisticated FPGAs.

Given how ruinously expensive silicon products are to bring to market, it's amazing that there are multiple companies competing (albeit in distinct segments).

FPGAs also seem like a largely untapped domain in general purpose computing, a bit like GPUs used to be. The ability to reprogram an FPGA to implement a new digital circuit in milliseconds would be a game changer for many workloads, except that current CPUs and GPUs are already very capable.

JoachimS · 2 months ago
You also need to factor in time to market, product lifetime, the need for upgrades, fixes and flexibility, risks, and R&D costs including skillset and NRE when comparing FPGAs and ASICs. Most, basically all, ASICs start out as FPGAs, either in labs or in real products.

Another aspect where FPGAs are an interesting alternative is security. Open up a fairly competent HSM and you will find FPGAs. FPGAs, especially ones that can be locked to a bitstream - for example anti-fuse or Flash-based FPGAs from Microchip - are used in high-security systems. The machines can be built in a less secure setting, while the injection and provisioning of a machine can be done in a high-security setting.

Dynamically reconfigurable systems were a very interesting idea. Support for partial reconfiguration, which allowed you to change accelerator cores connected to a CPU platform, seemed to bring a lot of promise. Xilinx was an early provider with the C6x family, IIRC through a company they bought. AMD also provided devices with support for partial reconfiguration. There were also some research devices and startups around this in the early 2000s. I planned to do a PhD around this topic. But tool and language support, and the added cost in the devices, seem to have killed it. At least for now.

Today, in for example mobile phone systems, FPGAs provide the compute power CPUs can't match, with the added ability to add new features as the standards evolve and as regional market requirements affect the HW. But this is more like FW upgrades.

JoachimS commented on The FPGA turns 40   adiuvoengineering.com/pos... · Posted by u/voxadam
inamberclad · 2 months ago
The problem is that the tools are still weak. The languages are difficult to use; nobody has made something more widely adopted than Verilog or VHDL. In addition, the IDEs are proprietary and the tools are fragile and not reproducible. Synthesis results can vary from run to run on the exact same code with the same parameters, with real-world impacts on performance. This all conspires to make FPGA development only suitable for bespoke products with narrow use cases.

I would love to see the open source world come to the rescue here. There are some very nice open source tools for Lattice FPGAs and Lattice's lawyers have essentially agreed to let the open source tools continue unimpeded (they're undoubtedly driving sales), but the chips themselves can't compete with the likes of Xilinx.

JoachimS · 2 months ago
The competitiveness between Lattice and Xilinx is also not a universal truth. It totally depends on the application. For small to medium designs, Lattice has very competitive offerings. Hard ARM cores, not so much. Very large designs, not at all. But if you need internal config memory (on some devices), a small footprint etc., Lattice is really a good choice. And then there's support in the open source tools to boot.
JoachimS commented on The FPGA turns 40   adiuvoengineering.com/pos... · Posted by u/voxadam
inamberclad · 2 months ago
The problem is that the tools are still weak. The languages are difficult to use; nobody has made something more widely adopted than Verilog or VHDL. In addition, the IDEs are proprietary and the tools are fragile and not reproducible. Synthesis results can vary from run to run on the exact same code with the same parameters, with real-world impacts on performance. This all conspires to make FPGA development only suitable for bespoke products with narrow use cases.

I would love to see the open source world come to the rescue here. There are some very nice open source tools for Lattice FPGAs and Lattice's lawyers have essentially agreed to let the open source tools continue unimpeded (they're undoubtedly driving sales), but the chips themselves can't compete with the likes of Xilinx.

JoachimS · 2 months ago
The non-deterministic part of the toolchain is not a universal truth. Most, if not all, tools allow you to set and control the seeds, and you can get deterministic results. Tillitis uses this fact to allow you to verify that the FPGA bitstream used is the exact one you get from the source: just clone the design repo, install the tkey-builder docker image for the release and run 'make run-make'. And of course all tools in tkey-builder are open source, with known versions, so that you can verify the integrity of the tools.

And all this is thanks to the actually very good open source toolchain, including synthesis (Yosys), P&R (nextpnr, Trellis etc.), Verilator, Icarus, Surfer and many more. Lattice, being more friendly than other vendors, has seen an uptake in sales because of this. They make money on the devices, not their tools.

And even if you move to ASICs, open source tools are being used more and more, especially for simulation and front-end design. As an ASIC and FPGA designer for 25-odd years, I spend most of my time in open source tools.

https://github.com/tillitis/tillitis-key1
https://github.com/tillitis/tillitis-key1/pkgs/container/tke...

JoachimS commented on The FPGA turns 40   adiuvoengineering.com/pos... · Posted by u/voxadam
inamberclad · 2 months ago
The problem is that the tools are still weak. The languages are difficult to use; nobody has made something more widely adopted than Verilog or VHDL. In addition, the IDEs are proprietary and the tools are fragile and not reproducible. Synthesis results can vary from run to run on the exact same code with the same parameters, with real-world impacts on performance. This all conspires to make FPGA development only suitable for bespoke products with narrow use cases.

I would love to see the open source world come to the rescue here. There are some very nice open source tools for Lattice FPGAs and Lattice's lawyers have essentially agreed to let the open source tools continue unimpeded (they're undoubtedly driving sales), but the chips themselves can't compete with the likes of Xilinx.

JoachimS · 2 months ago
SystemVerilog (SV) is the dominant language for both ASIC and FPGA development. SV is evolving, and the tools are updated quite quickly. SV allows you to build abstractions through interfaces, enums, types etc. The verification part of the language contains a lot of modern-ish constructs and support for formal verification. The important thing is really to understand that what is being described is hardware: your design must be implementable on a die, with physical wires, gates, registers, I/Os etc. There will be clocks and wire delays. That is actually one of the problems one encounters when more SWE people try to implement FPGAs and ASICs. The language and tools may help you, but you also need to understand that it is not programming but design you are doing.

https://en.wikipedia.org/wiki/SystemVerilog
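As a small sketch of the abstractions mentioned above (interfaces, enums, typed state), here is a minimal, hypothetical SystemVerilog fragment. All names are invented for illustration; note how everything still describes physical hardware: a clocked register holding the state, and wires bundled by the interface.

```systemverilog
// Named states instead of magic bit patterns.
typedef enum logic [1:0] {IDLE, LOAD, BUSY, DONE} state_t;

// An interface bundles related wires and gives each side a view (modport).
interface stream_if #(parameter int W = 32);
  logic         tvalid;
  logic         tready;
  logic [W-1:0] tdata;
  modport master (output tvalid, tdata, input  tready);
  modport slave  (input  tvalid, tdata, output tready);
endinterface

module consumer (input logic clk, input logic rst_n, stream_if.slave s);
  state_t state;

  // Clocked process: this synthesizes to flip-flops, not "code that runs".
  always_ff @(posedge clk or negedge rst_n) begin
    if (!rst_n) state <= IDLE;
    else unique case (state)
      IDLE: if (s.tvalid) state <= LOAD;
      LOAD: state <= BUSY;
      BUSY: state <= DONE;
      DONE: state <= IDLE;
    endcase
  end

  // Combinational wire: ready only when idle.
  assign s.tready = (state == IDLE);
endmodule
```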

JoachimS commented on Saab achieves AI milestone with Gripen E   saab.com/newsroom/press-r... · Posted by u/fnordsensei
Hamuko · 2 months ago
Saab Automobile AB went bankrupt in 2012.
JoachimS · 2 months ago
Which has nothing to do with SAAB AB, the defence company.

SAAB Automobile was split out from SAAB AB in 1990, with General Motors (GM) taking a 51% stake, and it became fully part of GM ten years later. GM then tried to build SAABs on GM platforms, which meant the quality tanked - and tanked the company too. And as another poster noted, what was left became NEVS.

Wikipedia has a good writeup on SAAB, with its many divisions. It's a bit like Volvo: both companies have had divisions that make automobiles, heavy vehicles and other types of products. Volvo Cars was sold off from Volvo AB to Ford. The heavy vehicle division of SAAB (Scania) is now part of Volkswagen.

https://en.wikipedia.org/wiki/Saab_AB

JoachimS commented on Apple Exclaves   randomaugustine.medium.co... · Posted by u/todsacerdoti
fedxc · 5 months ago
I see exclaves as a significant but intermediate step. Apple is making XNU less of a liability, but they're still playing defense instead of fully embracing a microkernel architecture.

If I had to bet, exclaves will be a bridge to something bigger, either a more modular OS (like Fuchsia) or a CHERI-inspired security model where memory safety is enforced at the hardware level.

Apple is leading the pack in consumer OS security, but exclaves are a patchwork improvement rather than a total rethinking of system design. That said, this is probably the biggest security shift in mainstream OS design in the last decade, and it will take years before we see its full impact.

JoachimS · 5 months ago
I see it as a way to move back to the microkernel it once was - with modern solutions and new requirements. Security was much less of a concern when Mach was created. With the insane performance we now get from machines, the overhead caused by microkernel process communication may well be negligible.

u/JoachimS

Karma: 5642 · Cake day: June 19, 2013
About
Embedded security expert at Assured AB https://www.assured.se/