I'd be interested in other takes on this article, as I'm struggling to accept the arguments presented here.
> turning instead to software to handle functions that had historically been done in hardware
That generally means doing things less efficiently. Devices will get slower and battery lifetimes shorter.
> There’s room for new startups and new ways of doing things. More of the car’s features are accessed using a touchscreen as opposed to a knob or button, for example, which turns components that once were hardware into a software program.
This has got to be one of the worst times to do a hardware startup. Doing R&D at an established company is already painful, between skyrocketing costs and 6-month-plus lead times on standard microcontrollers (not to mention more specialized ICs). If you're just starting out, instead of "only" facing trouble keeping production going, you're going to have to hope that the chips you're prototyping with will still be available when you go into production. If things keep going the way they have this year, that's a very risky bet no matter what components you select.
> And now, in light of the chip shortage, suppliers to the automotive industry are going to be looking for more ways to turn something that used to be an electronic part into software.
I'm not in the automotive field, but from what I've heard it's the exact opposite issue: the lack of the processors they'd like to use in those fancy touchscreen infotainment systems is a big part of their problem.
The central thesis is that the shortage will lead to innovation, but in my experience it has put a HUGE strain on everyone doing R&D. I suspect it'll instead lead to rising costs and stagnation as people wait for supplies. This is already basically what we have in the consumer hardware market, where last year's consoles still aren't consistently on store shelves.
> The chip shortage could lead to an era of hardware innovation
It can, but people currently busy "innovating" are at least half a decade too late. The 200mm capacity shortage and year-long backlogs have been common for at least 5 years now.
The opportunity to either exploit the shortage or ameliorate it is now long gone.
> That’s because electric cars require a lot more computer and chip-based components than those powered by gas
Electric cars require a lot fewer electronic components!
> “I saw a design the other day of a lightbulb with 12 components,” said Carrieres. “[My team] asked me if this was an idea, and I had to tell them that it was a real product.”
Recently we had a rush project to redesign a control panel for a certain piece of machinery. The task was to replace or remove the MCUs currently experiencing shortages.
After looking at the project for a few minutes, we realised the MCU was used basically just for blinky LEDs and an interlock.
Our engineer slapped together a few 555s, flip-flops, and registers in under an hour. The client engineers' jaws dropped.
While the real reason the client was at the mercy of their supply chain was not that individual MCU, it still shows just how much blind following of "one chip to rule them all" can eat away at supply-chain resilience.
If you have a part uniquely fitted to your unique needs, the likelihood of it being some rare widget with poorer availability increases.
I can confirm that this is an extremely challenging time to do a hardware startup. In addition to the supply and freight shortages that are deadly to startups stuck at the back of the line:
1. Travel restrictions make it extremely challenging to evaluate suppliers, do joint development, and bring up manufacturing.
2. A bunch of recent high profile failures have caused VCs to cool on the space.
3. Google, Facebook, Amazon, and Microsoft are all hiring armies of hardware people, making the job market more competitive than it once was.
All that said:
1. There’s still no shortage of interesting and worthwhile problems to solve with hardware.
2. Like web companies that survived the dot com bubble bursting, the hardware companies that make it through 2020 and 2021 are going to have been strengthened by it.
Definitely, chip and other component shortages are getting increasingly serious, and costs are going through the roof; voltage regulators that normally cost 50 cents have been selling for as much as $70.
There is indeed a lot of demand for chips and electronic components of all kinds, but I understand that manufacturers claim it is not real demand, just a spike caused by uncertainty in international supply chains.
https://www.taipeitimes.com/News/biz/archives/2021/04/22/200...
Just like we all saw a lot of people buying toilet paper like crazy at the beginning of the pandemic, there is also a lot of hoarding by big tech companies, which makes this shortage likely to last well into 2022.
So, they're not exactly willing to expand capacity for a demand that is most likely to go away eventually.
So indeed, it is a challenging time to design and manufacture hardware; all the big players are buying it all.
The only advice I could give to anyone starting to design their electronics is to make the design as flexible as possible: leave room on the PCB for two or more alternative components.
You could even try to make two different PCB layouts and choose one or the other based on price and availability of components. Oh, and try to delay component selection as much as possible; that's the most useful advice I've read: https://titoma.com/blog/reduce-component-lead-time
> A bunch of recent high profile failures have caused VCs to cool on the space.
VCs have been "cool" on the hardware space since 2001.
Why fund hardware which can take 5-10 years to go somewhere when you can fund the latest social garbage fad and it will take off or fail completely within 36 months?
Yeah, I'm with you. To give the author the benefit of the doubt, maybe they were thinking of "turning to software innovation" as in optimization / longer dev cycles that result in less resource wastage, i.e. rather than needing the latest chip to run Electron-based applications everywhere, they'd spend a bit more time developing software that can run well on older chips.
For integrated hardware, like IoT, cars and so on, it makes sense to try to reduce the number of chips needed and then glue the remaining bits together using software. I don't see how there's much else you can do to deal with a shortage, other than perhaps starting to remove many of the "smart" features and accepting that perhaps we didn't need all of them to begin with.
For computers (desktops, laptops, servers) and phones, we could start focusing on more efficient software. Replacing laptops every two or three years is no longer cost effective. We should start looking at 5 to 8 year lifespans for phones and laptops, meaning that developers need to focus more on efficiency.
It's two different ways of dealing with rising chip prices: Use fewer chips, and use them longer. It's also a great way of shifting to greener computing.
> which turns components that once were hardware into a software program
this is a fun one, because there is no shortage of knobs and buttons, just chips, and knobs don't need chips to work. Even if you do need a chip for it to work (e.g. rotary encoder to CAN bus), most of those are still available, because the "old tech" process nodes are good enough for them (compared to the newer, smaller transistor sizes needed for chips that process a lot of data: graphics cards, modern CPUs, and even touchscreen radios for cars).
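To put numbers on how little a knob actually asks of a chip: a complete quadrature decode is a handful of lines. A minimal sketch in C, where read_pin_a/read_pin_b are hypothetical GPIO reads; basically any microcontroller ever made (or a bit of discrete logic) can keep up with it:

    /* Minimal quadrature decoder sketch for a rotary encoder.
     * read_pin_a()/read_pin_b() are hypothetical GPIO reads, not any
     * particular vendor's API.
     */
    #include <stdint.h>

    extern int read_pin_a(void);  /* returns 0 or 1 */
    extern int read_pin_b(void);

    static int32_t position = 0;  /* running knob position */

    /* Call on every edge of pin A (or poll fast enough to catch edges). */
    void encoder_poll(void)
    {
        static int last_a = 0;
        int a = read_pin_a();
        int b = read_pin_b();

        if (a != last_a) {      /* edge on A */
            if (a == b)
                position--;     /* one rotation direction */
            else
                position++;     /* the other */
            last_a = a;
        }
    }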
I stopped reading after the car bit. Anyone who's advocating more eye-requiring touchscreen controls for cars instead of less hasn't put enough thought into their opinions for anything else they've written to be worth reading.
You're talking about two different things: eliminating hardware by doing more in software (what the article was stating) vs consolidating discrete logic into an ASIC. Both the IWM and the discrete logic it replaced performed the same task in hardware; the IWM just reduced it to one chip, allowing them to eliminate the controller PCB.
> The central thesis is that the shortage will lead to innovation, but in my experience it has put a HUGE strain on everyone doing R&D. I suspect it'll instead lead to rising costs and stagnation as people wait for supplies. This is already basically what we have in the consumer hardware market, where last year's consoles still aren't consistently on store shelves.
I think there's some truth to creativity being enhanced by constraints. Certainly, if supplies are limited, and especially if the limits are uneven, there's going to be incentive to design around the chips that are limited, and some of that might be innovation that could be useful even after supply goes back to normal. Of course, CPU shortages are hard to design around, especially if all CPUs are in short supply; but some applications might be able to make use of different CPUs that might have more availability.
Technically, sure. But the extent of the innovation I've seen so far is people going back to using inferior chips because that was all they could get within their budget.
Microcontrollers aren't exactly interchangeable, even within the same product line. You could design for flexibility and use the arduino framework to run your code on most microchip/atmel, ST, and a million other chips but that comes at enormous cost -- to put it nicely that's an incredibly inefficient library if you're not doing anything demanding, and damn near worthless if you are. Any multiplatform framework that abstracts away the inner workings of microcontrollers is going to be too heavy to work for a huge percentage of people's power and complexity profiles.
It's not just MCUs and firmware either, any time you replace a component due to shortages you need to revalidate your design. Constantly redesigning and revalidating boards based on available stock is what people are doing right now to keep the lights on. It's hell.
If you don't need a microcontroller to do whatever you do, then sure. Pop it out and save a few bucks. But that's hardly innovation, it's more rectifying a mistake made when doing the initial design.
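To make the earlier point about multiplatform frameworks being too heavy concrete: on an AVR, the same "set a pin" operation looks roughly like this through a generic pin-number API versus a direct register write. hal_pin_write is a hypothetical stand-in here, not any specific vendor call; Arduino's digitalWrite does something broadly similar internally (pin-to-port lookup tables plus a read-modify-write).

    /* Sketch of the overhead gap on an AVR (e.g. ATmega328): the same
     * "set a pin high" through a generic abstraction vs. directly.
     * Register names come from <avr/io.h>; PB5 is the usual LED pin.
     */
    #include <avr/io.h>

    /* Hypothetical generic HAL call: translates a pin number to a port
     * and bit mask via lookup tables before doing the write. */
    void hal_pin_write(unsigned char pin, unsigned char level);

    void blink_portable(void)
    {
        hal_pin_write(13, 1);   /* many cycles: lookups, masking, branches */
    }

    void blink_direct(void)
    {
        DDRB  |= _BV(DDB5);     /* pin as output */
        PORTB |= _BV(PORTB5);   /* compiles to a single sbi instruction */
    }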
>> turning instead to software to handle functions that had historically been done in hardware
>That generally means doing things less efficiently. Devices will get slower and battery lifetimes shorter.
I'm personally not at all convinced hardware-accelerated GUI toolkits are more power efficient than software-rendered ones. Whether it's due to the abstraction or some weird idea that drawing is now "free", you end up with way more drawing being done once something is hardware accelerated, and it tends to more than offset the efficiency gains. The only place it really works is video games, because they're actually paying attention to where the limits are (and they don't really care about power efficiency).
> I'm personally not at all convinced hardware accelerated GUI toolkits are more power efficient than software rendered ones.
It really depends on what is being drawn and how. If a GUI toolkit is all flat colors and basic shapes and doesn't use antialiasing, then it can be very power efficient on modern CPUs, even low-power embedded ones. If it instead uses gradients, alpha blending, antialiasing, and complex shapes, it's going to be far less efficient all in software. The difficulty increases with increased demands for fill rate (display size x frame rate).
On modern CPUs (even embedded ones) software rendering to a QVGA display would be no problem and likely no less efficient than hardware acceleration. However, as fill-rate demand and drawing complexity increase, software rendering will quickly hit a wall.
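Rough fill-rate arithmetic behind that wall; a small sketch assuming 16-bit (2-byte) pixels, with purely illustrative display configurations:

    /* Back-of-the-envelope fill-rate arithmetic: bytes that must be
     * touched per second just to repaint every pixel once per frame.
     * Assumes 2-byte pixels; figures are illustrative only.
     */
    #include <stdio.h>

    int main(void)
    {
        struct { const char *name; long w, h, fps; } cfg[] = {
            { "QVGA  320x240 @ 30",  320,  240, 30 },  /* small embedded panel */
            { "WVGA  800x480 @ 60",  800,  480, 60 },  /* infotainment-class   */
            { "FHD 1920x1080 @ 60", 1920, 1080, 60 },  /* desktop/phone class  */
        };

        for (int i = 0; i < 3; i++) {
            long pixels_per_s = cfg[i].w * cfg[i].h * cfg[i].fps;
            double mb_per_s = pixels_per_s * 2.0 / 1e6;   /* 2 bytes/pixel */
            printf("%s: %8.1f MB/s just to touch every pixel once\n",
                   cfg[i].name, mb_per_s);
        }
        return 0;
    }

QVGA at 30 fps works out to roughly 4.6 MB/s, while 1080p at 60 fps is around 250 MB/s before any blending or overdraw, which is why the same software renderer that is fine on a small panel falls over on a big one.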
The same goes for the argument about power-management chips not benefiting from Moore’s law: yes, but that’s been the case for decades. Thermal and current requirements demand a certain amount of silicon regardless of how small you can make the features in a low-power chip.
>> turning instead to software to handle functions that had historically been done in hardware
> That generally means doing things less efficiently. Devices will get slower and battery lifetimes shorter.
Not necessarily. For example, the original idea behind RISC was to move lots of complicated functions (and instructions) from hardware into software, so that the hardware could concentrate on doing the most common and simple operations faster.
Those common computationally complex functions generally get abstracted away via CMSIS or the like, I've yet to beat the hardware implementations of common signal processing algorithms in software despite actively trying. Anyone trying to fill in for missing DSP hardware is going to have a very bad time.
I can't imagine it's much better for other hardware-to-software transitions; given the cost of hardware-specific implementations, they're generally only used when actually necessary.
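For reference, this is roughly what "abstracted away via CMSIS" looks like in practice: a hand-rolled loop versus the CMSIS-DSP dot product, which routes to the core's MAC/SIMD paths where they exist. A sketch assuming the CMSIS-DSP library is available:

    /* A naive dot product vs. the CMSIS-DSP equivalent. arm_dot_prod_f32
     * is part of CMSIS-DSP and uses the core's DSP/MAC instructions where
     * present; beating it with a plain C loop is hard, as noted above.
     */
    #include "arm_math.h"   /* CMSIS-DSP header */

    float32_t dot_naive(const float32_t *a, const float32_t *b, uint32_t n)
    {
        float32_t acc = 0.0f;
        for (uint32_t i = 0; i < n; i++)
            acc += a[i] * b[i];   /* one multiply-accumulate per iteration */
        return acc;
    }

    float32_t dot_cmsis(const float32_t *a, const float32_t *b, uint32_t n)
    {
        float32_t result;
        arm_dot_prod_f32(a, b, n, &result);   /* optimized library path */
        return result;
    }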
Or, we could just ban crypto exchanges in the USA (and allied nations), watch prices tank 95%, and all of the fab capacity which is now going towards Bitcoin and Ethereum mining could be restored to proper productive use (alongside a big reduction in energy expenditure and ransomware).
I think the biggest pitfall in your plan is thinking that prices would respond by falling, let alone by 95%.
There’s a long history of countries imposing capital controls. And it’s rarely favorable to the home country’s exchange rate. What you are proposing is basically just capital controls towards the crypto economy.
Yes it will make it harder for new buyers to bid up the price of Bitcoin. But it almost certainly will induce current holders to hoard any Bitcoin they own, thus reducing supply and putting upward pressure on the price. Without a liquid market, miners are likely to hoard their rewards, creating even more Bitcoin scarcity.
It may also create panic buying before the law comes into effect, or by people in foreign jurisdictions fearing the export of the policy.
It can also just as likely create panic selling as speculators look to shed an asset that's dead in the water. Remember when Robinhood banned GME shares from being purchased, and the share price cratered as speculators tried to wash their hands of the asset?
Maybe a better strategy is to discourage crypto through fiscal policy: Imagine 90% capital gains tax on crypto profits, no credits for capital losses on crypto, 10% crypto wealth tax. Put those taxes to use fighting climate change.
And watch decentralised exchanges that are even less efficient replace them. There's as much chance of successfully banning crypto as there is of successfully banning drugs (in a big part because of how useful crypto is for online drug transactions).
The US could simply ban US banks and financial institutions from doing business with crypto, and from doing business with any foreign entity that does business with crypto. That would make it mostly unusable except as just another aspect of the criminal underworld shadow banking systems. That's based on the current state of crypto though. If it becomes a common medium of exchange with many people having some or all of their transactions denominated in crypto, that option kind of goes off the table.
It's wild how many people on a tech forum don't understand the next major innovation of the Internet. Crypto is up there with Netscape and smartphones.
People said similar things about: AI, torrents, VR/AR, quantum computers, memristors, drones, chatbots, distributed social media, the Internet of things, self-driving cars, robots, dapps.
Some of these technologies managed to produce something useful. On the other hand, I cannot use crypto for anything but speculation.
The real answer, I think, is not to ban the nebulous idea of "cryptocurrency" but to ban (or highly tax) proof-of-work coins or similar schemes where the network requires a large amount of some physical resource to maintain its function.
That will spur innovation on approaches that don't have that requirement.
There’s already rapid innovation in this direction without government intervention being necessary. Ethereum is switching to proof-of-stake by the end of the year, and partially because of that is already 50% of the way to passing Bitcoin by market cap.
Automakers cancelled their orders during the early days of COVID-19 and lost their place in the queue (and now are crying foul).
Also many companies have double-booked orders from multiple suppliers making the shortage look worse.
There's a huge amount of semiconductor capacity coming online in the next few years, much of which will be barely profitable once this current shortage dissipates.
Double booking is a consequence of shortage. You can't separate it out. Longer lead times make it harder to predict future demand, and perception of scarcity leads to overbooking that exacerbates the negative consequences of underbooking.
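A toy order-up-to illustration of that feedback loop, with made-up numbers: end demand never changes, but every stretch in the perceived lead time shows up at the supplier as a demand spike.

    /* Toy illustration: buyers keeping "lead time's worth" of flat demand
     * on order. When the perceived lead time stretches, the orders placed
     * balloon even though real consumption never changed. Numbers made up.
     */
    #include <stdio.h>

    int main(void)
    {
        const int real_demand = 100;               /* units consumed per period  */
        const int lead_time[] = { 1, 1, 2, 4, 8 }; /* perceived periods to delivery */
        int prev_lt = lead_time[0];

        for (int t = 0; t < 5; t++) {
            /* order-up-to policy: replace this period's demand, plus extra
               pipeline coverage whenever the perceived lead time grows */
            int orders = real_demand + real_demand * (lead_time[t] - prev_lt);
            prev_lt = lead_time[t];
            printf("period %d: demand %d, lead time %d -> orders placed %d\n",
                   t, real_demand, lead_time[t], orders);
        }
        return 0;
    }

With these numbers the supplier sees orders of 100, 100, 200, 300, 500 against a constant real demand of 100, which is the sense in which you can't separate double-booking from the shortage itself.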
LOL, no. It's hard to implement hardware functions in software if it's impossible to source the MCUs my software stack was built on.
What's really happening is that I (and likely many others) realised that Chinese-brand chips (which are not affected by the shortage) work just as well as the western ones.
The other thing I realised was that non-authorized sellers work just as well too. Yep, I had to implement more thorough testing/QC, but I would not call that extra work innovation.
I think most assembly-houses self-source some parts, and they do it in large quantities (like 1M+ parts/year). As such, they buy at extremely good prices. They overbuy themselves, and in order to get the same good pricing next year, they keep overbuying. Some of their inventory is then sold locally, and ends up in the Shenzhen market or at unauthorized resellers.
For the quantities I buy (sub-1000 pcs typically) the pricing is usually 0.2-2.0 times that of known-good parts from known (authorized) sellers.
The problem is that I can not be sure what I get, and I have to implement really extensive testing for each batch.
Shortage is a driver of creativity. It's part of what made Picasso famous and outstanding: he intentionally constrained himself to only lines, very simple shapes, and single-colored areas as his toolbox. Although he could paint photorealistic portraits as a teenager, he deliberately limited himself in order to unleash creativity.
It's the limited set of tools we have, also when programming (a few constructs like if/then, loops, booleans, integers, etc.), that makes us ask "how do I best express this idea?"
If you have boundless resources, there's no pressure to be creative.
Anecdotally, I've seen a lot more job postings for firmware engineers lately. I assume these are coming from companies that cannot source their preferred microcontrollers and need to port code to a new chip before inventory runs out. Hopefully this leads to some modernization of embedded software.
Modernization? The deadlines are tight and the focus will be all about getting done as fast as possible before the inventory of replacement parts dries up too.
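One pattern that does make those forced ports faster is keeping application code against a tiny board-support interface, so only one file changes per MCU. A minimal sketch; all names here are hypothetical, not any particular vendor's HAL:

    /* Minimal board-support interface sketch: application code talks to
     * this struct, and each MCU gets its own small implementation file,
     * so a forced part swap means rewriting one file, not the codebase.
     */
    #include <stdint.h>
    #include <stdbool.h>

    typedef struct {
        void     (*gpio_write)(uint8_t pin, bool level);
        bool     (*gpio_read)(uint8_t pin);
        void     (*uart_send)(const uint8_t *buf, uint32_t len);
        uint32_t (*millis)(void);
    } board_ops_t;

    /* Provided by one file per MCU, e.g. stm32_board.c, avr_board.c, ... */
    extern const board_ops_t board;

    /* Application code stays the same across parts. */
    void heartbeat_task(uint8_t led_pin)
    {
        static uint32_t last = 0;
        if (board.millis() - last >= 500) {   /* toggle every 500 ms */
            last = board.millis();
            board.gpio_write(led_pin, !board.gpio_read(led_pin));
        }
    }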
I have ended up raiding the warranty return bin for fixable boards in my small startup. It's easier to repair the returned boards right now than to order new ones.
Why aren't chips recycled more? (Not that recycling could address the immediate problem.)
Or perhaps there is already a lot of recycling, but it is not apparent to me and probably not apparent to most people.
Is pulling and sorting chips from e-waste inefficient? Damages too many chips?
In China a lot of reverse engineered info on proprietary chips floats around among engineers. Are hardware engineers in the west too busy to adapt to recycled chips? Is the supply too precarious?
Chips are recycled, at least the valuable ones. Go on AliExpress and you'll find lots of expensive FPGAs reballed, iPhone chips for repair, classic computer chips that are no longer made, etc.
Ftlog not another round of 'let software do it', please.
How about, when a couple of dead simple switches will do it, DO NOT stuff an embedded Linux SoC in. A toaster, for example, should not require one single bit of software.
Cars. Omg. Do we have to turn them into rolling smart phones? DO we? I dread the day when my car has to OTA update every 3 weeks. There should be no need for that. Yeah, I can see you might want something to handle engine timing or active suspension. But I do not need or want all these 'apps'.
We seriously need some sanity in product design these days. Don't blame engineers. Blame the product designers.
Cars have been increasingly more computerized during the last two decades. If car firmwares were more open it would be possible to deploy software improvements to reduce emissions and increase performance.
I agree that having to update too frequently would be a hassle, but we could make far better use of existing vehicles if they were more easily upgradeable and updateable.
> If car firmwares were more open it would be possible to deploy software improvements to reduce emissions and increase performance.
Better yet - have the actual "car" layer (steering, acceleration, brakes, safety critical parts like airbags and ABS) entirely self-contained on a separate system that's written in a safety-critical language. Force all the "extras" (AC, seats, radio, windshield wipers, maybe blinkers etc.) to be controllable via a user-supplied module that might as well run on a consumer-grade electronics motherboard.
That way the part of the car that gets you from A to B gets to benefit from being self-contained and as optimised for emissions/performance as possible for the time the car was manufactured. The part of the car that merely provides comfort and convenience is something the users are free to monkey about with because it doesn't compromise any regulatory or safety standards.
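A hypothetical sketch of what that boundary could look like: the comfort/infotainment module only ever gets a small, fixed message set, and the safety-critical side validates everything before acting. Message IDs and ranges below are made up for illustration.

    /* Hypothetical boundary between the comfort/infotainment module and
     * the safety-critical side: a small whitelist of requests, validated
     * before anything is acted on. IDs and fields are invented here.
     */
    #include <stdint.h>
    #include <stdbool.h>

    enum comfort_msg_id {
        MSG_SET_CABIN_TEMP   = 0x10,  /* HVAC setpoint */
        MSG_SET_WIPER_MODE   = 0x11,
        MSG_SET_AUDIO_SOURCE = 0x12,
        /* deliberately no message for steering, braking or throttle */
    };

    struct comfort_msg {
        uint8_t id;
        uint8_t value;
    };

    /* Runs on the safety-critical side; returns whether the request is accepted. */
    bool handle_comfort_msg(const struct comfort_msg *m)
    {
        switch (m->id) {
        case MSG_SET_CABIN_TEMP:
            return m->value >= 16 && m->value <= 30;  /* accept only a sane range */
        case MSG_SET_WIPER_MODE:
            return m->value <= 3;                     /* off/intermittent/low/high */
        case MSG_SET_AUDIO_SOURCE:
            return true;                              /* never safety relevant */
        default:
            return false;                             /* unknown request: drop it */
        }
    }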
Having a dumb Bluetooth receiver wired into your stereo system is not very complicated, and mainly foolproof. Let the end user (moreover, force them to) pick which end device of their own they want to use to stream music to the car.
> Our engineer slapped together a few 555s, flip-flops, and registers in under an hour.
Sounds like it's more an issue of this work being outside the client engineers' competence than any kind of engineering breakthrough.
I suspect replacing the MCU with discrete components would have resulted in a more expensive product if there were no shortage.
To add to the list of challenges for hardware startups above:
5. China-West tensions, including the HK/mainland border being locked down, are creating additional difficulties.
6. The current China anti-platform/monopoly drive has cooled domestic VC activity and appetite for vertical integration.
(Full disclosure: cross-border hardware venture founder in China.)
> VCs have been "cool" on the hardware space since 2001.
Could you expand on this?
If we think it's bad now, wait until the dozen or so AI chip startups have their moments of reckoning.
> we could start focusing on more efficient software
I hope it does lead to debloating though; when mouse config software weighs in at over 200MB we're long overdue for some trimming of the proverbial fat.
> That generally means doing things less efficiently. Devices will get slower and battery lifetimes shorter.
Possibly, but the point is to reduce the number of expensive integrated circuits. A classic example:
https://en.wikipedia.org/wiki/Integrated_Woz_Machine
> Or, we could just ban crypto exchanges in the USA (and allied nations)
While we're at it:
1) ban computer games to free up people's time and grow the economy,
2) ban ML research to improve overall privacy.
All these will also affect GPU prices favorably.
No computers, no chip shortage.
Fixed :D
It's a squeeze, no different from what happens to rental prices at a 1% vacancy rate versus a 0.5% vacancy rate.
This video covers it quite well: https://www.youtube.com/watch?v=Z7QkIECEkVc
> Why aren't chips recycled more?
The only one I know of working in this area is https://en.wikipedia.org/wiki/Andrew_Huang_(hacker)
I wonder if he gets lonely.