No one appears to have mentioned the important meta game going on: Intel bidding as a credible alternative supplier.
For Intel, by bidding they get to undercut AMD's profits.
For Sony, they get a credible alternative they can pretend would be a viable choice, which forces a slightly better deal out of AMD.
We saw similar articles around the Switch 2. That time it was AMD acting as the spoiler to Nvidia; Nvidia reportedly got the contract, and again we got news articles lamenting the loss for AMD.
As a gamedev I have a different perspective: Sony and Nintendo would be fools to give up backwards compatibility just for savings on chips.
Switching vendors does not just invalidate old games' compatibility, it also requires retooling their internal libraries. Console games, outside of small or open-source engines, use proprietary graphics APIs, and those APIs are tied to the hardware. With this coming generation from Nintendo, and the "current gen" from Sony and Xbox, they've been able to reuse most of their software investment. I'd say more but this is obviously under NDA; other devs should be able to confirm.
Thus I don't think AMD for the Switch 2 or Intel for the PS6 was ever a credible path. Those bids existed to keep the incumbent vendor from getting overly greedy and ruining the parade for everyone. This matters: famously, the original Xbox got hamstrung in the market by Nvidia's greed and refusal to lower prices as costs went down.
+1. An important non-obvious detail for AMD is that they (at least in the past, I assume for this as well) have kept the instruction timings similar from generation to generation of consoles.
Different x86 micro-architectures benefit from writing the machine code in slightly different ways. Games are highly optimized to the specific micro-architecture of the console, so keeping that stable helps game developers optimize for the console. If you suddenly changed the micro-architecture (say, by switching to Intel), old games could become janky and slow even though both systems are x86.
(This would only matter if you were pushing performance to the edge, which is why it rarely matters for general software development, but console game dev pushes to the edge)
So it isn't just the graphics APIs that would change going from AMD to Intel, but the CPU performance as well.
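To make that concrete, here is a minimal sketch of the kind of thing that shifts between micro-architectures (my own illustration, nothing from a console SDK; it assumes an AVX2+FMA target, e.g. built with -mavx2 -mfma):

    // Two equivalent dot products (tail elements omitted for brevity). Which one is
    // faster depends on the core's FMA latency, issue width and load ports - i.e. on
    // the exact micro-architecture - even though both are plain x86-64 + AVX2 code.
    #include <immintrin.h>
    #include <cstddef>

    // Single accumulator: the loop is bound by the FMA latency chain.
    float dot_single(const float* a, const float* b, std::size_t n) {
        __m256 acc = _mm256_setzero_ps();
        for (std::size_t i = 0; i + 8 <= n; i += 8)
            acc = _mm256_fmadd_ps(_mm256_loadu_ps(a + i), _mm256_loadu_ps(b + i), acc);
        alignas(32) float tmp[8];
        _mm256_store_ps(tmp, acc);
        float s = 0.f;
        for (float v : tmp) s += v;
        return s;
    }

    // Two accumulators: hides FMA latency, but only pays off if the core can
    // actually keep both dependency chains in flight.
    float dot_dual(const float* a, const float* b, std::size_t n) {
        __m256 acc0 = _mm256_setzero_ps(), acc1 = _mm256_setzero_ps();
        std::size_t i = 0;
        for (; i + 16 <= n; i += 16) {
            acc0 = _mm256_fmadd_ps(_mm256_loadu_ps(a + i),     _mm256_loadu_ps(b + i),     acc0);
            acc1 = _mm256_fmadd_ps(_mm256_loadu_ps(a + i + 8), _mm256_loadu_ps(b + i + 8), acc1);
        }
        acc0 = _mm256_add_ps(acc0, acc1);
        alignas(32) float tmp[8];
        _mm256_store_ps(tmp, acc0);
        float s = 0.f;
        for (float v : tmp) s += v;
        return s;
    }

Tune that choice (and the unroll factor) against one console's core and it can be mildly wrong for the next one, even with identical ISA support.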
> Different x86 micro-architectures benefit from writing the machine code in slightly different ways. Games are highly optimized to the specific micro-architecture of the console, so keeping that stable helps game developers optimize for the console.
While that can be true, very few gamedev companies these days optimize to that degree. They almost all use off-the-shelf middleware and game engines that are built to support all of the platforms. The companies that do go through that effort tend to have very notable releases.
Nobody is hand-tuning assembly these days to fit into tight instruction windows, outside of some very specific hot fragments. Instead they're all writing generic, interrupt-driven logic, which is fine, as that's what the newer CPUs expect and optimize for internally.
In addition, the gap between Zen generations is about as large as a switch to Intel would be. We're talking fairly different cache coherency, memory hierarchies, CCX layouts, micro-op and instruction timings, iGPU configurations, etc.
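If anyone wants to see how visible that is, here's a rough, Linux-only sketch (mine; the core numbers are assumptions you'd adjust for your own part) that ping-pongs a flag between two pinned threads. Run it for a pair of cores in the same CCX and then for a cross-CCX pair and compare:

    // Build with: g++ -O2 -pthread pingpong.cpp
    #include <atomic>
    #include <chrono>
    #include <cstdio>
    #include <thread>
    #include <pthread.h>

    static void pin_to_core(int core) {
        cpu_set_t set;
        CPU_ZERO(&set);
        CPU_SET(core, &set);
        pthread_setaffinity_np(pthread_self(), sizeof(set), &set);
    }

    int main() {
        constexpr int kIters = 1000000;
        std::atomic<int> flag{0};

        std::thread t([&] {
            pin_to_core(1);                 // assumption: second core of the pair under test
            for (int i = 0; i < kIters; ++i) {
                while (flag.load(std::memory_order_acquire) != 1) {}
                flag.store(0, std::memory_order_release);
            }
        });

        pin_to_core(0);                     // assumption: first core of the pair
        auto start = std::chrono::steady_clock::now();
        for (int i = 0; i < kIters; ++i) {
            flag.store(1, std::memory_order_release);
            while (flag.load(std::memory_order_acquire) != 0) {}
        }
        auto ns = std::chrono::duration_cast<std::chrono::nanoseconds>(
                      std::chrono::steady_clock::now() - start).count();
        t.join();
        std::printf("avg round trip: %.1f ns\n", double(ns) / kIters);
    }

The same-CCX vs cross-CCX ratio (and the absolute numbers) move around between parts, which is exactly the kind of thing a console title's job scheduler may have been tuned around.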
That all being said, AMD was going to beat Intel regardless, because of established business relationships, and because Intel's current internal struggles (both business-wise and in R&D) make it fairly difficult for them to provide an equivalent alternative.
> An important non-obvious detail for AMD is that they (at least in the past, I assume for this as well) have kept the instruction timings similar from generation to generation of consoles.
What? The Jaguar-based CPU in the PS4 has both a much lower clock and substantially lower IPC than the Zen 2 based one in the PS5. The timings are not remotely the same and the micro-architectures are quite different. Jaguar was an evolution of the Bobcat core which was AMD's answer to the Intel Atom at the time (i.e. low cost and low-power, though it was at least an out-of-order core unlike contemporary Atoms).
Going from GCN to RDNA on the GPU side is also a pretty significant architectural change, though definitely much less than going from AMD to Intel would be.
Developers taking care over CPU timings is 10-15 years out of date. Most of them these days don't even know what a dot product is, or how to find the distance from a point to a straight line between two others... and the people they rely on to do this for them make horrendous meals of it. But yeah, sure, CPU instruction timings matter.
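(For the record, the math being waved at there is tiny - the distance from a point to the line through two other points is a couple of dot products. Toy sketch, my own:)

    #include <cmath>
    #include <cstdio>

    struct Vec2 { float x, y; };

    float dot(Vec2 u, Vec2 v) { return u.x * v.x + u.y * v.y; }

    // Distance from p to the infinite line through a and b.
    float distance_point_to_line(Vec2 p, Vec2 a, Vec2 b) {
        Vec2 ab{b.x - a.x, b.y - a.y};
        Vec2 ap{p.x - a.x, p.y - a.y};
        float t = dot(ap, ab) / dot(ab, ab);           // project ap onto ab
        Vec2 closest{a.x + t * ab.x, a.y + t * ab.y};  // nearest point on the line
        Vec2 d{p.x - closest.x, p.y - closest.y};
        return std::sqrt(dot(d, d));
    }

    int main() {
        // Point (0,1) against the x-axis through (0,0) and (1,0): distance is 1.
        std::printf("%f\n", distance_point_to_line({0, 1}, {0, 0}, {1, 0}));
    }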
Given the size of such a contract, wouldn't it be reasonable for Sony to just request equal or better instruction latency for everything relevant from the old CPU?
Nvidia got the bid for Switch when they were basically dumping those unwanted Tegra to Nintendo for an incredibly low price.
Xbox and PlayStation don't earn AMD much profit at all. AMD had this custom-processor segment to barely keep them surviving; people may forget AMD was only worth ~$3B in market cap in 2016. They are now at ~$250B.
On the subject of software compatibility though, one thing I got wrong was my prediction that having AAA titles on Xbox and PS would have helped AMD's market share on PC, given those titles are already optimised on Xbox and PS anyway. That didn't happen at all. And Nvidia continue to dominate.
Sometimes a low-margin business is all you need to keep the lights on, avoid hemorrhaging too many people, and stay afloat until you get better winds.
Traditional MBA thinking is sometimes too short-sighted. For example, PCs might not have been a cash cow for IBM, but the ThinkPad brand, the distributor relationships, and the customers may have helped IBM more than the cash from selling that business to Lenovo. Maybe having a healthy bridgehead with a popular brand of laptops could have helped IBM come up with some innovative way of selling the overhyped Watson.
The same goes for AMD and video games: it paid the bills, paid salaries, and left a little profit on the table to be invested. It probably helped them bridge from their hell days to what they are today.
There's a lot of intangibles and hidden symmetries, serendipitous opportunities that are frequently overlooked by our bean-counting master race overlords.
> Xbox and PlayStation don't earn AMD much profit at all
It doesn't cost them much either. Lisa Su, in an interview that was posted to HN a few months ago, said it is a deliberate strategy to repackage IP AMD has already developed. They are willing to pull designs from the shelf and customize them to meet partners' needs. Having a long tail of products adds up, and sets you up to get first dibs on higher-margin partnerships in the future.
> Nvidia got the bid for Switch when they were basically dumping those unwanted Tegra to Nintendo for an incredibly low price.
This seems pretty well aligned with Gunpei Yokoi’s strategy of “Lateral Thinking [with] Withered Technology”. It worked out pretty well for Nintendo in the past (e.g., Gameboy) and seems to be working out with the Switch. Even though he has passed, his Wikipedia page alleges that this philosophy has been passed on to others at Nintendo.
A few of the Playstation titles that made their way to PC do seem to have a little home field advantage on AMD chips, but not enough to sway people over to them.
> having AAA titles on Xbox and PS would have helped AMD's market share on PC, given those titles are already optimised on Xbox and PS anyway. That didn't happen at all. And Nvidia continue to dominate.
My impression is that console ports have insufficient popularity with PC gamers for them to alter their hardware purchasing habits for those games.
> Sony and Nintendo would be fools to give up backwards compatibility just for savings on chips.
But would they really?
Staying on x86-64 would take care of CPU compatibility (unless there's some exotic AMD-only instruction set extension heavily used by PS4/5 games), and a GPU emulation stack seems at least somewhat plausible.
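On the "exotic AMD-only extension" caveat: there is at least one real example - SSE4a exists on the Jaguar and Zen cores used in the PS4/PS5 but not on Intel parts. A quick check is easy; a hedged sketch using the GCC/Clang builtin (the feature names come from their documented list, worth double-checking for your toolchain):

    #include <cstdio>

    int main() {
        __builtin_cpu_init();
        std::printf("sse4a: %s\n", __builtin_cpu_supports("sse4a") ? "yes" : "no");
        std::printf("avx2:  %s\n", __builtin_cpu_supports("avx2")  ? "yes" : "no");
    }

Anything like this that old binaries actually use would have to be handled by a recompile or by the emulation layer.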
Sony has pulled this off multiple times before with seemingly completely incompatible architectures:
The PS2 came with the PS1 CPU (repurposed as an IO controller, but fully available for previous-gen games) and emulated the GPU on its own. The PS3 did the reverse in its second iteration (i.e. it included the PS2's GPU and emulated the CPU). The PS Vita's SoC had the PSP MIPS CPU included on-die, which in turn is similar enough to the PS1's to allow running those games too.
For GPU emulation, I'm not super knowledgeable, but I would think the shaders are a big issue; older systems don't have that problem. Console games come with precompiled shaders, and you won't be able to reuse those between AMD and Nvidia. Certainly you can get around it - emulators for modern consoles do just that - but it's not without its issues, which might be considered unacceptable.
That's still fixable if you're willing to ship newly compiled shaders and such, but that's a lot more work if you're talking about needing some kind of per-game fix to be downloaded. This is how Xbox 360 backwards compatibility works on newer consoles, and this approach means it only works with a subset of Xbox 360 games, not all of them. It's much better than nothing, but it's not a hardware-level fix that makes the original game binaries "just work".
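A minimal sketch of what that per-game-fix approach amounts to (all names here are hypothetical, not how any real console SDK spells it): a lookup from (title, original shader hash) to a shader rebuilt offline for the new GPU, with a slower runtime-translation fallback when no entry exists.

    #include <cstdint>
    #include <map>
    #include <string>
    #include <tuple>
    #include <vector>

    using Blob = std::vector<std::uint8_t>;

    struct ShaderKey {
        std::string   title_id;       // hypothetical title identifier
        std::uint64_t original_hash;  // hash of the shipped, old-GPU shader binary
        bool operator<(const ShaderKey& o) const {
            return std::tie(title_id, original_hash) < std::tie(o.title_id, o.original_hash);
        }
    };

    class CompatShaderCache {
    public:
        void install_fix(ShaderKey key, Blob rebuilt) {
            fixes_[std::move(key)] = std::move(rebuilt);   // shipped as a per-title download
        }

        // Returns a shader usable on the new GPU: a pre-built fix if one exists,
        // otherwise whatever a (slower, less reliable) runtime translator produces.
        Blob resolve(const ShaderKey& key, const Blob& original) const {
            if (auto it = fixes_.find(key); it != fixes_.end()) return it->second;
            return translate_at_runtime(original);
        }

    private:
        static Blob translate_at_runtime(const Blob& original) {
            // Placeholder: a real translator would parse the old GPU ISA and emit new ISA.
            return original;
        }
        std::map<ShaderKey, Blob> fixes_;
    };

The per-title table is why this style of backwards compatibility only ever covers the games someone actually did the work for.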
For packaging the old GPU with the new system, I think that's not really realistic anymore, since prices for them simply don't drop enough and the system design would be a mess (the chips are huge and you'd need cooling for both; I guess if only one is running at a time it's not as bad, but...). Separately, if you're swapping from Nvidia to AMD, then you're talking about trying to convince one of them to make a batch of old chips for you while you use their competitor's chip as the main one; they might not be willing to do it.
The whole article seems unfair to Intel. They didn't "lose" the contract, because they didn't have it in the first place. I think your analysis is correct: they win a little if they don't get the contract and they win a lot if they do. It was a no-brainer to bid on it.
This is all true. Xbox always threatens to leave their current vendors only to end up signing a renewal at the final hours of the contract.
>As a gamedev I have a different perspective: Sony and Nintendo would be fools to give up backwards compatibility just for savings on chips.
In your view, is this issue worse with modern consoles now that the Playstation (and possibly Nintendo) online store purchases persist across generations? Imagine a scenario where someone has a PS4 and PS5, they buy many games through the Playstation Store, then Sony selects a different chip supplier for the PS6. I'm guessing this would cause issues with games that were designed for the older consoles, breaking backwards compatibility.
I'd imagine that if the console manufacturers cared about backwards compatibility, which I think they do, the likelihood of them switching chip providers would decrease with each generation.
Microsoft maintained backwards compatibility across Intel+Nvidia, IBM+ATI, and AMD+AMD so it's possible. Sony hasn't invested as much in compatibility, instead just keeping the same architecture for PS4/5.
There was no backwards compatibility between the PS3 and PS4 whatsoever (except for PS Plus allowing cloud-based game streaming of some PS3 titles), and Sony survived that as well.
What they did was offer some old PS2 games for purchase, though, which allowed them to tap into that very large back catalog. I could see something like this happen for a hypothetical Intel PS6 as well (i.e. skipping PS5 backwards compatibility and tapping into the large catalog of PS4 and PS4/PS5 games). So I'm not buying that going Intel would lose backwards compatibility.
I am quite sure the PS5 doesn't do Vulkan at all, and you don't even need NDA access for that; there are enough GDC talks and SCEE presentations on what APIs PlayStations do support.
It’s not clear to me that the PS5 supports Vulkan at all (excluding third-party translation layers). I would be happy to see any evidence. In any case I’m confident the large majority of PS5 games use its native API, GNM.
GNM could certainly be implemented for Intel GPUs, but it’s an additional cost to account for.
Yeah, this was rigged from the start. If Sony did want to take up Intel next gen, they'd need to do a lot of work on backwards compatibility with the PS5 on the PS6. Whereas I imagine the PS6 being a "PS5 Pro Pro" at this rate.
I suppose it can be seen as controlling rampant greed (especially for Nvidia), but it feels like the consoles dealt the cards here. There would have needed to either be some revolutionary tech or an outright schism to make a business steer an otherwise smooth ship that way.
>As a gamedev I have a different perspective: Sony and Nintendo would be fools to give up backwards compatibility just for savings on chips.
I agree that both are probably playing it safe this time. But as a devil's advocate: both Sony and Nintendo are not strangers to ditching the previous gen if they don't want to compromise their next gen. At this point Nintendo is skewed towards ditching (SNES/N64/Gamecube/Switch vs. Wii/WiiU).
Sony tried and almost failed hard with the PS3 (it had PS2 compatibility at first, through the whole SKU debacle, and then ditched it) but is otherwise usually consistent on BC. Well, that and the Vita. But I don't think anyone missed the UMD (it was still backwards compatible digitally, though).
> At this point Nintendo is skewed towards ditching (SNES/N64/Gamecube/Switch vs. Wii/WiiU).
Ultimately, a company is its people. And the management class at Nintendo is famously new. Everybody is expecting them to focus on robust backwards compatibility as part of their new, exciting development.
> Yeah, this was rigged from the start. If Sony did want to take up Intel next gen, they'd need to do a lot of work on backwards compatibility with the PS5 on the PS6. Whereas I imagine the PS6 being a "PS5 Pro Pro" at this rate.
Why would they need to do a lot more work on compatibility if they'd picked Intel vs AMD?
Either CPU is presumably going to be x86_64. The GPU is almost certainly going to be much different between AMD's PS5 GPU and AMD's PS6 GPU, so the graphics driver will need work either way.
I would assume if Intel can make ARM and x86, it can do whatever Sony needs.
Or is AMD's architecture THAT special? My assumption is that the PS3's streaming processor was so different that it would have mattered, but with the PS4 and 5?
You could also patch PS5 games if you need to. The ecosystem is closed.
> Switching vendors does not just invalidate old games' compatibility, it also requires retooling their internal libraries.
This is a red herring. The hardware is x86-64, all the game engines are made on x86-64, and all the games are compiled on, you guessed it, x86-64. That's why they stopped using PowerPC, or Motorola, or other non-x86 architectures: to simplify backwards compatibility, and actually get comparable value to a decently performing system.
So when they tell you there is a cost overhead associated with switching vendors, that is BS. However long it takes to port your desktop driver package is how long it would take to get all of this working on different hardware.
Seriously, if someone in a basement in Arkansas can get Windows to run on a PowerPC PS3, Sony can figure out how to make x86-64 AMD games work on x86-64 Intel chips. Anyone saying otherwise has an incentive to not make it happen.
I'm not convinced; this feels like those "actually this is good for Bitcoin" replies that are popular with cryptobros any time some bad news hits. Intel have lost out on a big, high-profile contract - this cannot be something they are happy with, and any explanation to the contrary is, as the kids say, "cope".
Maybe I'm misinformed, but I could never see Intel getting this contract.
AMD has extensive experience with high-performing APUs, something Intel, at least in my memory, does not have. The chips on modern high-end consoles are supposed to compete with GPUs, not with integrated graphics. Does Intel even have any offerings that would indicate they could accomplish this? Intel has ARC, which presumably could be put in a custom "APU"; however, their track record with that is not stellar.
> Their iGPU performance is actually getting good now.
I've only been waiting for Intel to ship a compelling iGPU since, I dunno, their "Extreme Graphics" in 2001? What on earth have their iGPU teams been doing over there for the last 20+ years?
I guess the OEMs were blinkered enough not to demand it, and Intel management was blinkered enough not to see the upside on their own.
The "Intel Core Ultra 7 258V" is at least 2-3x slower than the GPU within the PlayStation 5. It is not even close, and that's last gen. Again, the APUs within modern consoles compete with desktop grade GPUs. In the case of the PS5 its roughly comparable to an RTX 2070 or Rx 6700 (better analog).
Well, Nvidia has powered a much more popular console... the Nintendo Switch, and Nvidia looks set to power the Switch 2 when it launches next year. So, AMD is clearly not the only choice.
> Intel has ARC, which presumably could be put in a custom "APU"; however, their track record with that is not stellar.
I wouldn't exactly agree with that. Arc GPUs aren't really bad; sure, when they were new there were driver issues for quite some time, but those have mostly been ironed out and were more in "expected issues with a first non-iGPU" territory than "Intel being very bad at their job" territory.
Also, GPUs in consoles (ignoring the Switch) are in the lower mid-range today, and that's unlikely to change with future consoles, so that is a segment Intel should be able to compete in. I mean, console GPUs are more like big iGPUs than dedicated GPUs.
The main issue would be that whether it's Intel, Nvidia, or AMD, their drivers have subtle but sometimes quite important differences in performance characteristics, meaning that optimizations for one are sometimes de-optimizations for the other, plus similar interoperability issues. And those seem more likely with Intel, as there is just much less history between the larger game engines and Arc GPUs.
So IMHO Intel would have had to offer a lower price to be viable, to compensate for more backwards-compatibility issues. But if they were in a much better financial situation at the moment, I believe they would have had a high chance of getting it by subsidizing it a bit, so that they get a foothold in the market and can compete without that drawback next generation.
And they rightly deserve to lose the business to AMD.
Intel to Apple: "We're too big to deliver what you want for cell phones." Apple: "Ok. We'll use ARM."
Intel to Sony: "We're too big to commit to pricing, compatibility and volume." Sony:" Ok. We'll keep using AMD."
It's interesting that Intel keeps trying to ship "features", some of arguable utility but others decently helpful, like AVX-512, which AMD now delivers and Intel does not. I'm sure Sony didn't want a processor that can't properly and performantly run older and current titles.
>Intel to Apple: "We're too big to deliver what you want for cell phones." Apple: "Ok. We'll use ARM."
Reality:
“We ended up not winning it or passing on it, depending on how you want to view it. And the world would have been a lot different if we’d done it. The thing you have to remember is that this was before the iPhone was introduced and no one knew what the iPhone would do… At the end of the day, there was a chip that they were interested in that they wanted to pay a certain price for and not a nickel more and that price was below our forecasted cost. I couldn’t see it. It wasn’t one of these things you can make up on volume. And in hindsight, the forecasted cost was wrong and the volume was 100x what anyone thought.”
This is from the horse's mouth, and reliable as such. However, it does give the impression that they weren't sufficiently interested to think more creatively about cost optimization, because they were riding the gravy train of Wintel ruling the world. So I think the root comment isn't too far off.
IMO, more interesting than Intel not doing the iPhone is Intel ending Atom for phones right before Microsoft demoed Continuum for Windows 10 Mobile. That would have been a much different product on an x86 phone, IMHO. Maybe it would have been enough of an exciting feature that Microsoft would not have botched the Windows 10 Mobile release.
Otellini was not a dispassionate observer at the time he said this and there are very good reasons to believe that isn’t an accurate portrayal of what happened - including the fact that Otellini had just sold Intel’s smartphone SoC business and no x86 design was remotely suitable.
Backward compatibility guarantees is a significant one, I think. A lot of the QA process for console games is predicated on testing against a fixed set of hardware configurations, and various race conditions and other weirdness can start crawling out of the woodwork even with modest changes. This has been seen on many games running on emulators, on hacked console firmwares that allow overclocking (e.g. by running the CPU at the "native" clock speed in backward compatibility mode), or with framerate unlocking patches.
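A contrived example of the class of bug being described (mine, and deliberately broken): code that only "works" because the shipped hardware always resolved a race the same way.

    #include <cstdio>
    #include <thread>

    int  result = 0;      // plain int, no atomics, no join before the read:
    bool ready  = false;  // a data race that one fixed console happened to hide

    int main() {
        std::thread worker([] {
            result = 42;   // on the original hardware this always finished during
            ready  = true; // the main thread's setup loop below
        });

        volatile int sink = 0;
        for (int i = 0; i < 5000000; ++i) sink += i;  // "setup work" whose duration was
                                                      // effectively tuned on the old CPU
        if (ready) std::printf("result = %d\n", result);
        else       std::printf("race lost: worker not done yet\n");

        worker.join();
    }

QA against one fixed CPU will essentially never see the losing interleaving; change clocks, cache behaviour or scheduling and it starts showing up.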
Intel's Arc GPUs are quite competent (especially with the highly necessary driver updates). If Battlemage fixed the hardware scheduling design flaw, Intel has a decent shot at competing with AMD.
If AMD continues to lose ground on the desktop market and Intel continues to advance with Arc, there's a chance the PS6/Xbox Series 360 will run on Intel instead of AMD.
AMD (via ATI) also has a track record with Sony and with consoles in general, dating back to the GameCube, of delivering success. Maybe not the fastest thing, but one that works and is reliable. Nvidia, IBM and Intel don't exactly deliver on the full suite either.
> It bothers me and makes it difficult to take the article seriously.
But if you’re in the chip game, AI is the big thing of the last 10 years. It’s driven a huge chunk of new sales and more demand for upgrades than they likely would have seen otherwise.
Having missed out on AI in many ways (Nvidia was perfectly positioned, AMD better than Intel), they need stuff to keep growing.
Their current business is looking shakier than at any time in recent history. ARM is getting pretty realistic on the desktop: Apple proved it, and now Samsung and Qualcomm have parts for Windows users that perform well enough (compared to the failure of early ARM on Windows).
They’re behind on selling silicon for AI to business and it’s not clear consumers care enough to upgrade their PCs. And when consumers upgrade they have not only great options from AMD, doing better than ever, but the ARM threat.
They’re being squeezed on all sides. The PS6 wouldn’t make them dominant but it would have been a very steady and reliable revenue stream for years and a chance at parlaying that into additional business. “See what we did for Sony? We can do that for you.”
The article seemed rather well done to me. I think you’re being too dismissive in this case.
I think there will be sufficient time between now and PS6 release that they will be able to support full RTRT.
https://www.extremetech.com/computing/intel-to-offer-panther...
And with a sticker on the front, of course.
https://en.wikipedia.org/wiki/XScale
It was incredibly bad timing. If Intel had continued making ARM chips, they could be in an entirely different position today.
https://thechipletter.substack.com/p/how-intel-missed-the-ip...
100% intel screwup.
It's hard to compete with AMD, which is the only tech company offering both x86 and solid GPU technology to go with it.
On top of that you have backwards-compatibility woes and the uncertainty around Intel being able to deliver on its foundry.
All in all, this win would've been a great deal for Intel's foundry in PR terms, but money-wise those were never going to be huge sums.
If only Project Denver had kept its original goal
AMD (and ATI before it) has done: GameCube, Wii, Xbox 360 (GPU, not CPU), Xbox One, PS4, PS5 ...
> Similar to how big tech companies like Google and Amazon rely on outside vendors to help design and manufacture custom AI chips
> Having missed the first wave of the AI boom dominated by Nvidia and AMD, Intel reported a disastrous second quarter in August.
IMHO, AMD having done well despite being woefully unprepared for the recent AI wave suggests that AI is not the only big thing
(edit: grammar)
I guess Intel lost the bidding process but they never had the 'Playstation business' in the first place.
Nevertheless, an interesting read.
This kind of thing is probably part of the motivation behind Intel splitting out a "Partner Security Engine."