We ran many experiments with nanoimprint lithography at the university 20 years ago. The resolution was poor and the durability even worse: after dozens of imprints the “stamps” degraded heavily. I'm curious whether 20 years were enough to fix all the issues and whether it's really competitive today.
Came to the comments to see this. Stamping has all sorts of problems with alignment, stamp resolution, contact quality, etc. It's not clear whether this so-called "simplicity" still holds after scaling the resolution up even further.
You're probably thinking of contact lithography, where a 1:1 mask is placed directly on the wafer and illuminated. This would've been used for the earliest IC processes, where you'd still be able to see the structures with the naked eye or a loupe.
AFAIK the main cost of EUV is the cost of machines that will be obsolete in a few years, so you want to produce as many chips with them as possible during that timeframe. A lot of the fab is designed around maintaining near-100% uptime of those machines. This includes buffers before and after the machines (so any unplanned stalls are mitigated) and technicians trained to do maintenance in F1-pitstop fashion.
(source: a tour of some chipmaker I saw online; I no longer remember the details)
They only have about 80% uptime. They need to be stopped, calibrated, and maintained frequently.
They do not go obsolete quickly. They are constantly upgraded. 10- to 15-year-old fabs and machines are still running all over the world. There are 1000 nm, 90 nm, 40 nm, and 14 nm fabs still in operation. The high end is not all of the semiconductor industry.
These machines will not be obsolete for a very long time. They are extremely rare and expensive. And most semiconductors are fabricated on mature nodes anyway.
"Obsolete" which I guess for you means for the bleeding edge? Larger nanometer processes will still be in use since their cost will come down. For example when automakers stopped their orders for chips during COVID they pivoted (ported?) to higher nanometer designs because it wasn't a core requirement.
AFAIK, 250 W is the net power of the light arriving at the wafer after it has reflected off many mirrors, with a very inefficient process to generate that light from the tin plasma on top of that.
Weird article. The power consumption of an EUV machine is about 1 MW; that's why it's interesting to have an efficient alternative, not because of the actual useful power of the source.
The EUV light is produced by firing a pulsed laser at tin droplets.
You already lose most of the input power in the pulsed laser. Then only a fraction of the energy of the light hitting the tin is converted to EUV light with the correct wavelength.
Finally the EUV light has to be focused on the mask through complicated optics, which is notoriously difficult for EUV light.
I imagine there are other sources of inefficiency that I'm forgetting.
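To make the chain concrete, here's a back-of-the-envelope sketch in Python. Every efficiency figure is an illustrative assumption on my part (not a vendor spec), but it shows how a megawatt at the wall shrinks to something on the order of a hundred watts at the wafer:

    # Rough EUV power-chain estimate; all numbers are order-of-magnitude assumptions.
    wall_power_w        = 1_000_000  # ~1 MW drawn by the whole tool
    laser_efficiency    = 0.05       # CO2 drive laser wall-plug efficiency (assumed)
    plasma_conversion   = 0.05       # laser energy -> in-band 13.5 nm EUV (assumed)
    mirror_reflectivity = 0.70       # per multilayer mirror (assumed)
    num_mirrors         = 10         # mirrors between source and wafer (assumed)

    at_wafer_w = (wall_power_w * laser_efficiency * plasma_conversion
                  * mirror_reflectivity ** num_mirrors)
    print(f"power at wafer: ~{at_wafer_w:.0f} W")  # ~71 W with these numbers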
I’d just like to comment on how batshit insane the technology is.
“We pulse lasers in sync with dispensing droplets of molten tin to produce light that doesn’t exist outside of stars, then we use mirrors with a sub-angstrom surface roughness to precisely direct it onto wafers.”
Not to mention the fact that this is happening, IIRC, thousands of times per second, and the tool has to take the wafer’s topography into account to focus the beam. Honestly, EUV litho makes every other technology you could describe sound like child’s play.
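For a sense of scale, a quick calculation assuming the roughly 50,000 droplets per second often quoted for these sources, together with the ~250 W figure from elsewhere in the thread:

    source_power_w = 250      # in-band EUV power (figure from the thread)
    pulses_per_s   = 50_000   # droplet/pulse rate (assumed, commonly quoted)
    mj_per_pulse   = source_power_w / pulses_per_s * 1000
    print(f"~{mj_per_pulse:.0f} mJ of EUV per pulse")  # ~5 mJ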
OP states that this can go down to 14 nm. What I'm interested in is whether older, larger processes (say ~50 nm) can be done at a much lower cost than with traditional methods.
A lot of stuff simply does not require the most advanced chips.
The answer is almost certainly no. While lithography is one of the largest single contributors to manufacturing costs, its contribution to overall cost is still far below 10%.
And one cannot simply substitute a nanoimprint machine for optical lithography without redesigning part of the process (etch, metrology, etc.).
Investing R&D resources for a (best-case) 10% reduction in costs, while still carrying a decent probability of failure, in a big but declining node is not worth it.
Note that 14nm processes (which are quite old by now) are not the same as 14nm feature sizes. I'm not sure what these machines are capable of, since some details may well be lost in translation in this kind of publication. And I'm only an interested enthusiast, I don't work in the field directly.
But towards the end of the article they talk of targeting 8nm line width in 2028, which is impressive. Maybe this time around NIL actually becomes real for high-end processes?
These numbers are all mind-boggling. I understand that modern node names for EUV don't mean wire width, but if with this future NIL we truly get down to 8 nm wide wires, perhaps we should start counting the number of atoms across the width of the wire (around 30).
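A quick sanity check on that atom count, assuming an interatomic spacing of roughly 0.27 nm in a silicon crystal:

    wire_width_nm   = 8.0
    atom_spacing_nm = 0.27   # rough Si interatomic spacing (assumption)
    print(round(wire_width_nm / atom_spacing_nm))  # ~30 atoms across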
For everyone interested in the technical details of the TSMC EUV process, I would highly recommend this CCC talk [1] (From Silicon to Sovereignty: How Advanced Chips are Redefining Global Dominance).
I knew the process was complex, especially the light source, but I didn't realize that diffraction was something they also use, which is absolutely insane.
(It's licensed CC-BY so this should be allowed, and I like having videos like this on YouTube where I can easily watch them from anywhere and add them to my playlists.)

[1] https://news.ycombinator.com/item?id=42546231
The "transistors shipped" in the history of computing was an interesting number. In 2024 it is now over 10^24. That's a massive number, more than estimate number of stars in the universe. But, in another sense, still quite small. It finally surpassed Avogadro's number, or 6*10^23 particles. This is the equivalent of a small shot glass filled with water (molecules).
This is fascinating and looks promising! I'd never heard of this, but I expect we'll hear more in the near future, especially if they meet that 2028 target.
I wonder what the environmental impact of this is versus extreme ultraviolet. They mention "cost of ownership" and throughput, but I wonder if there are any hidden implications.
Why should we care about the environmental impact of EUV machines? I think it's probably better to focus on things which have a real environmental impact. EUV machines are estimated to consume 54 000 GWh per year by 2030 [1]. This is an extremely high estimate, because current usage is much lower (10 GWh per tool annually according to the same article, and in 2020 ASML shipped their 100th EUV system, so the current total is about 1 000 GWh). This is sold as being "power hungry". Let's put these numbers in perspective.
The United States alone consumes about 25 000 TWh of "primary energy" per year (this includes electricity, transport, and heating) [2]. So even in the extreme case, EUV machines would consume 54 TWh / 25 000 TWh = 0.2% of total energy! In comparison, 27% of total U.S. energy consumption goes to transporting people and goods around the US [3].
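The arithmetic, using the rough figures above (the tool count and per-tool consumption are the estimates from the previous paragraph, not measured totals):

    tools          = 100      # ~100 EUV systems shipped by 2020
    gwh_per_tool   = 10       # annual consumption per tool, per [1]
    current_twh    = tools * gwh_per_tool / 1000       # ~1 TWh today
    projected_twh  = 54_000 / 1000                     # 2030 worst case from [1]
    us_primary_twh = 25_000                            # US primary energy, per [2]
    print(f"today: ~{current_twh:.0f} TWh; worst case: "
          f"{projected_twh / us_primary_twh:.2%} of US primary energy")  # ~0.22%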
And I've made this example here before: if you are considering turning off your phone to save battery, at the risk of taking an accidental detour, then the decision is simple. Keep the phone on. Driving one extra kilometer consumes multiple orders of magnitude more energy than powering a phone for hours. I think this idea holds in many more cases. Video meetings, for example, can save people from traveling all over the world, which saves energy and time as well.
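A rough comparison with assumed but typical numbers (the fuel economy, phone draw, and three-hour window are all my assumptions):

    fuel_l_per_km = 0.07                  # ~7 L/100 km gasoline car (assumed)
    kwh_per_litre = 9.5                   # approximate energy content of gasoline
    detour_kwh    = fuel_l_per_km * kwh_per_litre   # ~0.67 kWh for one extra km
    phone_kwh     = 2 / 1000 * 3          # ~2 W phone running for 3 hours
    print(f"detour uses ~{detour_kwh / phone_kwh:.0f}x the phone's energy")  # ~110x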
So I would say please go full power on chip manufacturing. It's way better for the environment (and often saves people time) than halting innovation and instead keeping everything moving around physically. I'm not saying transport is bad. I'm saying that standing in the way of innovation in the name of "environmental impact" is nonsensical.

[1]: https://www.techinsights.com/blog/euv-lithography-power-hung...
[2]: https://ourworldindata.org/energy/country/united-states
[3]: https://www.eia.gov/kids/using-and-saving-energy/transportat...
> So I would say please go full power on chip manufacturing. It's way better for the environment
The flip side of this is that chips becoming so cheap has caused a huge increase in e-waste. Basically everything has a computer inside it (think smart toothbrushes, fridges, toys...), and it usually leads to shorter product lifetimes. Manufacturers drop support for their apps and shut down cloud services sometimes as quickly as two years after manufacture, so things get thrown away. Smart gadgets are also generally more prone to breaking, since they have more, more complex, and more sensitive parts (no way that 10c MCU in a smart toaster is surviving 10 years of hot-cold cycles).
If chips were more expensive, we wouldn't waste machine time on dual-core MediaTek SoCs for 100 € smartphones with a life expectancy of less than two years. Manufacturers would make expensive, quality phones, and those who can't afford them (I've been there) would buy older models used or refurbished. Longer product lifespans, more reuse, less waste.
I feel like this tech would be better suited to flexible circuitry, because flexible can be continuous-feed: why limit your stamp to the surface area of a wafer when you could size it to the width of a spool? Also, flexible circuits tend to have much larger feature sizes, so it's okay if they're a couple of generations behind; this would still be far ahead of printed circuitry.
I wonder how big the wafers can be in the NIL system. It definitely sounds like the larger the wafer, the more problems you will have with deformation, alignment, etc. If they have to reduce the wafer size, that would also affect their ability to compete with EUV.
I'm not an expert on this, but I feel like a 250 W light source is not the major driver of cost in EUV? Or am I misunderstanding this?
The FPA-1200NZ2C came out in 2015-2016. Press release from a 2017 sale: https://global.canon/en/news/2017/20170720.html