PaulHoule · a year ago
The biggest problem I've seen with Intel is that they are "getting high on their own supply" and the whole tech press enables them with the sole exception of Charlie Demerjian.

They should pick a few random people out of the phone book and pay them handsomely for their advice; that way they might start to "see themselves as others see them". Most of all they need to recognize the phenomenon of "brand destruction", which has manifested in clueless campaigns such as "Ultrabooks" and in brands that should have been killed off years ago ("Celeron", sorry, low-performance brands have a shelf life; there's a reason Honda is still making the Civic but GM is not making the Chevette). They should also write into their bylaws that they are getting out of the GPU business permanently (they've made so many awful products that it will take 15 years for people to get the idea that an Intel GPU adds value; instead it's this awful thing you have to turn off so it won't screw up the graphics in your web browser when you're using a discrete GPU). People might get their heads around the idea that an Intel CPU is a premium brand if it had a GPU chiplet from a reputable manufacturer like AMD or NVIDIA.

(Oddly, when Intel has had a truly premium brand, as in SSDs, they've ignored it. Hardly anybody noticed the great 95th-percentile latency of Intel SSDs, because hardly anybody realizes that 95th-percentile latency is what you feel when your computer feels slow. Intel SSDs were one Intel product I would seek out by name, at least until they sold their SSD division. Most people who've run a lot of Intel SSDs swear by them.)

Miraste · a year ago
A lot of these are symptoms of the root cause: a bad product. Ultrabooks aren't a terrible concept; they're the Wintel version of the MacBook Air. Making a non-Apple Apple device is a fine business strategy, and Samsung has made plenty of money off of it. The problem was that Intel chips continually crippled them, making them hot and slow with no battery life. People don't like Celerons and iGPUs because they all run at the speed of molasses.

Any of these branding decisions would have worked fine, if the products did. But no amount of marketing will fix bad engineering.

Panzer04 · a year ago
I don't think I agree at all with your assertions about ultrabooks and with blaming Intel CPUs.

To be honest, all x86 processors are reasonably efficient if you pick the right power points to run them at. My experience with my laptops is that Windows will arbitrarily use 2-3x idle power draw while on battery for no apparent reason. There is no reason for a CPU to be running higher than idle the vast majority of the time on a laptop, for the tasks you would be using a laptop for on battery.

I think most laptops are just poorly designed and Windows isn't reliable about using minimal power. Manufacturers make big laptops with 60 Wh batteries (instead of 100 Wh), they put bloatware on them that consumes excess power at idle, Windows doesn't reliably ensure low power consumption on battery (seriously, I have to hibernate or shut down my laptop when I stop using it, otherwise it will somehow consume its entire battery in 8 hours of sleep mode), and so on. Most problems with Windows laptops aren't Intel's fault, IMO.

I agree that Intel really should clarify, at least, the difference between their "Celeron" class (E-core only) and "Core" class processors (with P-cores), since there's such a massive difference in performance capability between them.

adrian_b · a year ago
While the Intel laptop CPUs may not have been good enough, that is not where Intel has lost money.

The current financial results show decent profits for the consumer CPUs, which could have been greater if Intel had succeeded in producing more Meteor Lake CPUs. Intel could not produce enough Meteor Lake CPUs to meet demand because their new Intel 4 manufacturing process has low production yields.

While Intel lost money in their foundry, that was unavoidable given the high expenses of catching up with TSMC.

Where Intel took big losses was in server CPUs. The reason for these losses must have been that Intel had to grant very large discounts to the big buyers of server CPUs in order to convince them not to buy superior AMD server CPUs.

These losses in server CPUs were easily predictable, because even if Intel hits, on time, every publicly announced milestone of its roadmap, that roadmap itself does not aim to reach parity with AMD's server CPUs before H2 2025 (and that assumes AMD will not introduce Zen 6 during 2025, which would move the target).

It looks like Intel has made a PR mistake. Even though their public roadmap has implied all along that they will continue to have big losses for at least one more year, that was obvious only to technical people, who could compare the future Intel products with AMD's, because Intel has not said a word in their roadmap about what the competition will have at the same time. To non-technical people, the Intel press releases sounded more optimistic than they should have, leading to disappointment over financial results that should not have been surprising.

soulbadguy · a year ago
A bad product is a symptom of an even deeper cause: lack of competition. The decline of Intel started long ago; the financial results here are just a very, very lagging indicator of that. The lack of competition, combined with very aggressive anti-competitive practices, allowed them to survive on a pretty bad product line. In a healthy market, competition would have forced Intel to improve well before the point we are at now.
jakobson14 · a year ago
Ultimately it was a focus on marketing that led them to kill engineering.

(If Itanium, and the many failures and lucky breaks that led up to it, hadn't already convinced you that Intel might never have had good engineering.)

einpoklum · a year ago
> Ultrabooks aren't a terrible concept; they're the Wintel version of Macbook Airs.

I'd say both are a terrible concept. I mean, it sells, but it's still terrible. A laptop being thin is not really such a virtue, and people's lives would be better with a decent-travel keyboard, better heat dissipation, and more ports than with 5-10 mm less laptop height.

impossiblefork · a year ago
But don't they need to get into the GPU business?

Surely GPUs or similarly very parallel machines for things like ML training are very needed and will remain very needed. Seeing as firms such as Groq have done fine, surely Intel can do something of that sort without much difficulty?

Since their GPU business has been unsuccessful, perhaps they can go for whatever makes sense, as there's nothing of this sort they could release that would compete with their own products.

pradn · a year ago
The AI boom is, in theory, a godsend for Intel and AMD. You can focus on creating good tensor computation hardware, without having to worry about getting gamers on board. No need for "Game ready drivers" or compatibility with complex legacy graphics APIs.

Of course, there's the elephant in the room for general-purpose tensor machines, which is CUDA, famously closely guarded by Nvidia. But with the new wave of "above-CUDA" APIs like TensorFlow, PyTorch, and Keras, there's an opportunity to skip the CUDA layer altogether.
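
This is already how most model code is written in practice: you target the framework's device abstraction rather than CUDA itself. A minimal sketch of what that looks like in PyTorch (hedged: the XPU branch for Intel GPUs only exists in newer PyTorch builds or with Intel's extension, so treat that part as an assumption):

  # Backend-agnostic device selection: the model code never mentions CUDA directly.
  import torch

  def pick_device() -> torch.device:
      """Prefer whatever accelerator backend this build exposes, else fall back to CPU."""
      if torch.cuda.is_available():                            # NVIDIA (or ROCm builds)
          return torch.device("cuda")
      if hasattr(torch, "xpu") and torch.xpu.is_available():   # Intel GPUs (XPU backend), if present
          return torch.device("xpu")
      if torch.backends.mps.is_available():                    # Apple silicon
          return torch.device("mps")
      return torch.device("cpu")

  device = pick_device()
  model = torch.nn.Linear(128, 10).to(device)
  x = torch.randn(32, 128, device=device)
  print(device, model(x).shape)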

bee_rider · a year ago
Intel getting into the GPU business is good, for sure, and I hope they don't give up on it.

It is a tiny thing, but Intel really seems to charge a premium for PCIe lanes. Maybe if they sell a lot of dGPUs they'll be less stingy with the sockets to plug them into…

immibis · a year ago
I have an Arc A770 and it seems to work fine so far... though I've only had it for a month. There are some tolerable teething problems: I can't set the fan curve from Linux, so I had to duct-tape on some extra fans, and it's incompatible with BIOS boot (it requires UEFI), which should be considered acceptable in 2024. For presumably the same reason, there's no graphics output in early Linux boot until the GPU driver is loaded.

The iGPU in my previous machine's i7-6700K (Skylake) was also just fine. The Intel Graphics Media Accelerator, yeah, that really sucked, but that was like 15 years ago?

Jochim · a year ago
The main issue with Arc seems to be driver support in specific games, DX11 being particularly problematic.

Intel basically have to catch up on the ~10-20 years of kludges that AMD, Nvidia, and game devs had already implemented when the games were released.

They've been making strong improvements to their drivers though.

Their iGPU is great for home media servers: low power draw, QuickSync can handle multiple 4K transcodes, and it's one less part to buy.
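
As a rough illustration of the media-server case, the usual setup just hands decode and encode to the iGPU through ffmpeg's QuickSync (QSV) path. A minimal sketch, assuming an ffmpeg build with QSV support; the file names are placeholders:

  # Offload a transcode to the Intel iGPU via ffmpeg's QuickSync (QSV) codecs.
  # Assumes ffmpeg was built with QSV support; file names are hypothetical.
  import subprocess

  def qsv_transcode(src: str, dst: str) -> None:
      cmd = [
          "ffmpeg",
          "-hwaccel", "qsv",        # decode on the iGPU
          "-c:v", "h264_qsv",       # QSV H.264 decoder
          "-i", src,
          "-c:v", "hevc_qsv",       # QSV HEVC encoder
          "-preset", "veryfast",
          "-c:a", "copy",           # pass the audio through untouched
          dst,
      ]
      subprocess.run(cmd, check=True)

  qsv_transcode("movie_4k.mkv", "movie_4k_hevc.mkv")

Media servers like Jellyfin and Plex do essentially this under the hood when hardware transcoding is enabled.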

iGPUs still aren't a great choice for gaming on desktops; even the latest AMD APUs perform poorly compared to cheaper dGPUs.

Sparkyte · a year ago
Intel has always been an echo chamber of fiscal earnings and cutting corners to appease investors. For the longest time they stagnated on Mac hardware improvements because it would have cost money; they would not have lost the Mac as a business customer if they had continued to innovate.
PaulHoule · a year ago
Also, they should have had a policy of "not one transistor for the national labs" and "not one transistor for hyperscalers", particularly because anything they do to appease hyperscalers just saves the hyperscalers money that they'll spend on their custom-silicon transition.

There's something prestigious about HPC but the story of how commercial data processing went parallel in the oughts shows how out of touch the HPC community is.

In the meantime Intel has botched the deployment of SIMD, slow-playing it to the point where maybe 7 or 8 years from now some mainstream software might support AVX-512, maybe. Somebody should be tarred and feathered for introducing the idea that features should be fused off for all but the highest-paying customers. If the customer is not paying for the fused-off functionality, the shareholders are. It sounds really ruthless and avaricious, and might impress some business analysts as a form of vice signalling, but it's pure waste.
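
The practical consequence is that any software that wants AVX-512 has to probe for it at runtime and ship a fallback path, because the feature can be fused off even on current parts. A rough, Linux-only sketch of the probe (reading /proc/cpuinfo, as an illustration rather than a portable check):

  # Illustration only: does this machine advertise AVX-512? This is the kind of
  # runtime check software is forced to do because the feature may be fused off.
  def cpu_flags() -> set[str]:
      with open("/proc/cpuinfo") as f:
          for line in f:
              if line.startswith("flags"):
                  return set(line.split(":", 1)[1].split())
      return set()

  flags = cpu_flags()
  has_avx512 = "avx512f" in flags   # AVX-512 Foundation subset
  print("AVX-512:", "available" if has_avx512 else "absent; fall back to AVX2/SSE")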

jzb · a year ago
"they would not have lost Mac as a business customer if they continued to innovate"

I'm not entirely sure this is true. I mean, I guess it depends on whether one expected Intel to be able to make not just a decent chip for PCs/laptops, but also one for phones and tablets. Once Apple started dabbling in its own chips for the iPhone and iPad, it seemed inevitable that they'd expand that to their macOS systems too. Apple has been a rough customer to please for chip designers/manufacturers; I'm not sure any company could've satisfied them with a general-purpose chip.

AtlasBarfed · a year ago
Companies that are built on engineering eventually become subsumed by soul-sucking, profit-sucking, bonus-sucking middle-management types who just plop themselves into their big huge hierarchy and destroy the company long-term with inertia and apathy.

See also: Boeing, Medtronic, GE, AT&T.

spookie · a year ago
Having a third competitor in the GPU market is great, for both Intel and the consumer. If one starts blaming newcomers for their shortcomings, then we are failing to see the bigger picture.

As for web browsers failing to display content because of the iGPU, I've never even heard of such a case. Maybe the performance is bad, but that's it. Either way, that's more a failure of the OS than of Intel. Windows is particularly biased toward not using your dGPU on a laptop, but I blame the kernel for making badly calculated assumptions based on nothing but "if not a game or a 3D modelling tool, then iGPU", not the hardware.

christkv · a year ago
The CEO has only been in charge since 2021, bringing an engineer back to the helm. How long does it take to turn a big ship like Intel around and undo decades of internal rot? It will be interesting to see who is let go in the coming months.
code_biologist · a year ago
Prior to Gelsinger, though, YouTuber and chip leaker/rumor monger "Moore's Law is Dead" mentioned that Jim Keller's stint at Intel was so short (April 2018 - June 2020) because of internal cultural toxicity: in particular, things like leaders of major groups at Intel trying to get employees of other groups fired, so as to make their own relative progress look better. Take it with a pound of salt, of course, but MLiD's sources have been pretty decent.

You only resolve that kind of badness by firing a bunch of SVPs and VPs and Gelsinger hasn't done that, for many possible reasons.

akira2501 · a year ago
> How long does it take to bring a big ship like Intel around and undo decades of internal rot?

If it takes that long, then this is your actual problem, and not how many engineers are in the loop at the board level.

gorgoiler · a year ago
Your second quote has always resonated very strongly with me, both as a concept and also from the original-ish verse:

  Oh would some power
    the gift He gives us
  To see ourselves
    as others see us
  It would from many
    a blunder free us
  And foolish notion
  What airs in dress
    and gait would leave us
  And even in CPU fabrication!
— adapted from “To a Louse”, R. Burns

roydivision · a year ago
Intel should never be directing its branding at the end user in the first place. Joe Public doesn't give a rat's ass what is powering his device. Ask anyone off the street if they know who AMD are and what they produce. Concentrate on making a good product and the device manufacturers will come to you.
Rinzler89 · a year ago
>Ask anyone off the street if they know who AMD are and what they produce.

That's why Intel, AMD, and Nvidia are super obnoxious about plastering laptop keyboards with their stickers. They want consumers to know what's inside.

jakobson14 · a year ago
Intel has spent the last 25 years training consumers to look for the "intel inside" sticker. It's the only thing that's kept Intel afloat while they butcher their engineering talent.

People shouldn't care, but they do.

greybox · a year ago
I'm not so sure nobody cares. Because of the battery life of Apple's M1-M3 machines, people already associate ARM chips with power efficiency.

People will always want their laptop/tablet/phone batteries to last longer, and if makers of those devices know that consumers associate ARM with power efficiency, they will want to take advantage of that.

thaumasiotes · a year ago
> instead it's this awful thing you have to turn off so that it won't screw up graphics in your web browser when you're using a Discrete GPU.

You can disable integrated graphics chips?

larodi · a year ago
from https://genius.com/Top-cat-a-friend-in-need-panik-and-m-rode...

""" A Friend In Need Is A Friend Indeed

But a friend with weed is better

So if you want to get high

Bring your own supply

Or we will know you as a joker smoker """

samstave · a year ago
Disclaimer: I used to work at Intel.

Specifically, I was the second person on the planet to game on a Celeron proc. How? My best friend and I ran the Intel Developer Relations Group Gaming Test Lab in the latter half of the '90s.

Our job was specifically to test all PC games being developed against the Intel Celeron procs and the AMD alternatives... with a focus on the subjective gaming performance of the Celeron as an "optimized" feature leader, whereby developers were given handsome marketing monies to optimize their games against the latest SIMD instructions so the games would be more performant on the Celeron.

The goal: show the world that a $1,000 fully capable gaming PC was possible, in their budget, and desirable.

---

The issue at the time was graphics bottlenecks; all the pieces had yet to come to the fore: AGP, Unreal, OpenGL, ~~NERDS~~ NURBS!, games, graphics chips (Voodoo, 3dfx, blah blah).

The Celeron should have died long ago, but certainly when the first actual GPUs came along and did the heavy lifting.

I have a lot of thoughts about Intel's mess (Transmeta really fucked Intel up, in the way an abusive step-relative would, and caused them to lose focus)...

Then, just the ridiculous amount of marketing over engineering....

(If anyone worked at Intel in the DRG circles in SC5 - and has access to emails from that era - search my name and the thread I started asking why we can't just physically stack CPUs on top of each other... (this was prior to the Voxel timeline) and was laughed at. It wasn't until several years later, on a hike with a labs head, that I found out about the 64-core test CPUs that were only then coming out.)

---

I was just having shower thoughts about Intel's future as, effectively, computing grout: a mechanism to get data from the real world into NVIDIA GPUs and then displayed back to the real world. And that's it. That's the only thing Intel may be doing in the future: being the grout for data whose compute is done solely elsewhere, as NVIDIA's CEO stated himself, "All compute will be done on NVIDIA chips". The data is delivered (minimally) through Intel's grout, then handed again to an NVIDIA desktop GPU...

Intel is like the maintenance staff of a home. NVIDIA is the architect, interior designer, party planner and end user.

(Intel was my first ever stock package: $70 with a $125 option price. 10K shares. I left before ever vesting... it was the late 90s and I had to chase the dream intc: https://i.imgur.com/U9PWURv.png )

gdiamos · a year ago
Even then, why is Intel better grout than AMD today?
is_true · a year ago
Does the new owner provide the same performance?
Gettingolderev · a year ago
They were the leader for 10-15 years. With Nehalem they wiped the floor with the competition left and right.

The tooling of Intel is still better than AMD's.

Celerons and other such CPUs are in plenty of the notebooks owned by people who buy laptops below $1k.

The SSD business happened and went away; I guess it was a margin issue. Low-margin, ultra-mass-market product, with lots of other companies happily building them.

I think it's critical for Intel to have GPU knowledge for a stable long-term strategy. It was critical 10 years ago, when they tried and failed hard with Larrabee, but they need to do that or buy AMD/Nvidia. With AI and modern workloads you can't just have CPUs anymore, and with CUDA and ML that was clear a long time ago.

The market itself is also totally bonkers. When AMD surpassed Intel, Intel still sold like hotcakes, and still does. Fab capacity in the world is limited. If you can't get an Nvidia GPU for $200-300 because they completely ignore this price range, then you have only AMD and Intel left.

Intel has already taken the low-end GPU market through their iGPUs.

Intel knows how to build GPUs and things other than CPUs. It would be a total waste for them to stop.

Intel should have bought Nvidia or done something else, but the Intel CEO is an idiot. He stated publicly that Nvidia's position is just luck. That is such a weird, idiotic take; it's bonkers that someone who would say something like that is the CEO of Intel.

The amount of R&D Nvidia did, and still does, left and right is crazy. He totally ignores how much Nvidia was doing the right things at the right time.

I'm not sure where Intel is heading, but with their fab knowledge and capacity, their cash flow, and their core tech, they should be able to swing back. I was hoping that would have happened already; it's unclear to me why they struggle so, so hard.

kjellsbells · a year ago
The classic moat metaphor that the OP article and others use needs to be fleshed out to match Intel's predicament.

A moat protects a castle that adversaries want to take over. The presence of the castle defines what can and cannot be done with the surrounding landscape. But if the castle ceases to protect what people care about, or make meaningful additions to the environment, it becomes irrelevant and the presence of the moat makes no difference.

Intel's problem isn't that competitors want to storm the castle and achieve domination over the landscape that x86 controls. It's that the competition have built their own castles on the other side of the river, and a lot of the peasants are tilling the lands around Castle ARM and Chateau NVIDIA.

To put it another way, Intel thought the castle was "control of computing" and the moat was "leadership in x86", but irrelevance comes a little closer with each passing chip generation. It is fortunate for Intel that corralling an ecosystem into existence around the alternatives to x86 is an insanely difficult task, but it has been done with ARM, it has been done (albeit for a niche) with NVIDIA, and it can be done again with whatever comes next.

soulbadguy · a year ago
>Intel's problem isn't that competitors want to storm the castle and achieve domination over the landscape that x86 controls.

IMO it's both. While the importance of x86 is declining, AMD is aggressively eating whatever part of it is left. I also think that in the long term, as Intel and AMD build better x86 chips, the value proposition of ARM will slowly fade in favor of something like RISC-V.

apantel · a year ago
Asking because I don’t know: what is the value proposition of ARM?
moffkalast · a year ago
AMD's stormed the castle with Ryzen long ago and planted their flag, but since everyone's still asleep at Castle Intel they haven't really noticed.
JonChesterfield · a year ago
There's sorrow and confusion at intel that the server and desktop markets are shrinking. People just aren't buying as many computers any more. Not to worry though, they'll probably want more in the future.

The rest of the world has noticed that people are buying lots of computers. This doesn't seem to cause any cognitive dissonance for Intel, though. Perhaps "computer" means "thing Intel sells" and excludes the work of the heathen outsiders. It makes for some quite confused reporting from the financial press.

BadHumans · a year ago
We are coming up on 7 years since the first Ryzen chip. In 7 years AMD went from very behind, to a little behind, to on par, and finally now to market leader. The fact that Intel let this happen in such a short time frame is a bit mind-boggling.
userbinator · a year ago
AMD and Intel have always been racing each other closely. Remember when the Athlons outperformed NetBurst P4s while Intel was trying to push the latter to higher clock speeds? Then the Core series put Intel in the lead again, and now they're losing to AMD once more. Here are some 15-year-old benchmarks for your amusement:

https://www.tomshardware.com/reviews/athlon-64-power,2259-9....

layer8 · a year ago
AMD is not the market leader in any segment, by any approximation: https://www.theregister.com/2024/05/10/amd_gains_on_intel/
adrian_b · a year ago
In server CPUs, Intel still has a larger market share than AMD, but judging by the published financial results, where most of the loss is in server CPUs, Intel has managed to keep that diminishing market share only by accepting a great loss caused by huge discounts, which is what drove the stock price fall, so perhaps trying so hard to retain market share was not an optimal decision.

So AMD definitely leads Intel in terms of the profits obtained in the server CPU segment.

There are also various small markets where AMD leads comfortably over Intel by volume; e.g. Amazon currently sells many more AMD CPUs than Intel CPUs.

BadHumans · a year ago
My phrasing was confusing and I apologize for that. I do not mean market leader in the sense that there are more AMD CPUs than Intel CPUs. I'm speaking about hardware performance.
nasdaq-txn · a year ago
Market leader in what? Intel's Q2 revenue is over double AMD's. Intel still controls well over 60% of the x86 space. Intel's and AMD's most performant x86 offerings are fairly close to each other.
BadHumans · a year ago
Performance. Intel still manages to squeak out some wins against AMD when it comes to single-threaded CPU tasks, but on every other metric they are chasing AMD.
sargun · a year ago
I feel like Intel has embarked upon a bunch of good ideas like SDI (https://www.intel.com/content/www/us/en/manufacturing/softwa...), IoT / Edison (https://ark.intel.com/content/www/us/en/ark/products/84572/i...), Silicon Photonics (https://www.intel.com/content/www/us/en/products/details/net...), and SDN (https://www.intc.com/news-events/press-releases/detail/640/i...).

But they manage to fail to capitalize on any of these. What's wrong with them?

bn-l · a year ago
Ask them why they sold their SSD division. It's almost intentionally bad.
timschmidt · a year ago
A new https://en.wikipedia.org/wiki/Xeon_Phi would be killer for AI right about now.
gumby · a year ago
The "paradox of the x86" is simply the classic Innovator's Dilemma. In fact the history of the last 20 years against ARM could be a case study right out of the book.

Worse for Intel, AMD flubbed it time and time again, but now Intel is too weak to defend against a resurgent AMD.

Meanwhile they are cutting headcount deeply, yet have only now suspended the dividend. Madness!

They should take a page out of AMD's book and spin off the fab. That new company can then be flooded with "strategic" government aid, and maybe the rest of Intel can catch up (or not), but it would at least give the shareholders a better chance. Right now the combination is acting like two anchors tied together.

Wytwwww · a year ago
> They should take a page out of AMD's book and spin off fab.

It's way too late for that now. And even with the benefit of hindsight, going back a few years and abandoning their main potential advantage just to compete with everyone else for limited capacity at TSMC wouldn't have been the best decision, IMHO.

> flooded with "strategic" government aid, and maybe the rest of intel can catch up (or not)

How could that ever possibly help Intel's foundries catch up if Intel itself switched to TSMC? The "government" doesn't need leading-edge nodes, so they'd just end up in the same spot as GlobalFoundries.

To be fair, they did outsource their last-gen low-power/laptop chips to TSMC, which is probably why those now seem very competitive with AMD/Qualcomm.

gumby · a year ago
One of Intel's strengths in roughly the '90s and '00s was the tight coupling between its strong fab technology and its strong design team. This was the model for Motorola, IBM, TI, and everybody else.

But by now both sides are suffering, and management has to try to fix them both simultaneously.

As with other industries, semiconductors evolved as an integrated industry. But now both parts are fiendishly complex, and rather than integration being a strength, it’s more like a conglomerate.

You can't just move your design from one process to another, so a spun-out fab would start with mostly Intel jobs, just as AMD, IBM, etc. did when they sold off their fabs. But the standalone fab would possibly find it easier to hire customer-oriented people and change its culture, and the two parts could concentrate better on their own needs. It would give the shareholders a better chance too.

It’s not a great solution, but the current situation is dire.

theparanoid · a year ago
Intel's decline was obvious when I worked for them in 2010: their compensation package wasn't competitive with FANG, and they consistently missed out on or lost their better engineers.
causal · a year ago
The article quotes a mention of Intel getting into the foundry business - this seems like the most obvious good move to make, even if a little late, right?

Being able to operate a general fab for chips seems far more important now than the design of the chips, since design has been somewhat commoditized by ARM. Any big downside to this move?

barkingcat · a year ago
The downside is that the foundry process has to work properly, and by all reports, it hasn't started working properly yet.
causal · a year ago
The only report I saw on it was that Intel said they have tested working 18A chips.
JonChesterfield · a year ago
There was lots of talk at the last earnings call about a "clean sheet redesign" to achieve a "world class foundry" and a "world class semiconductor" business. Sounds to me like getting the organisation ready to split in two.

That probably kills the vertical integration that made Intel the world leader in the past, though.

BoredPositron · a year ago
They just slept for a decade... I had an overclocked i7 3970k (2012) and there was no need to upgrade for 8 years. The performance increase was always marginal. I finally pulled the trigger when AVX happened.
nikanj · a year ago
And then you learn AVX is only available on some cores, lowers the operating frequency of the whole CPU when you use it, and generally seems like an unstable prototype.
amiga-workbench · a year ago
I was in the same boat with an overclocked 3570K. I only just replaced it last year and grabbed a 5800X3D. I hope this thing lasts me a decade too.