I'm a former Jaguar developer too, and reading the article was a strange experience, remembering things from ages ago. I'd forgotten most of the details, as well as how it felt.
Also the Atari ST mention. I'm getting all sorts of 80s and 90s flashbacks from that story, of what it "felt" like to program on those machines.
All those fiendishly clever timing and register hacks to make the video and audio hardware do things it wasn't designed to do, well outside their design parameters, and other tricks to make 8-bit and 16-bit machines generally do things nobody thought possible, until it was.
That feeling is something I miss a bit, as it's difficult to find tricks that make the "impossible" possible on modern hardware.
[Edit: I suppose Spectre and friends qualify :-)]
Some of the stuff that was done on these machines is jaw-dropping - realtime raytracing on the Falcon for instance: https://www.youtube.com/watch?v=_cKRZ8QgH5o
Doom on the Atari ST (not Wolfenstein, more sophisticated with heightmaps, lighting, 64 colours etc): https://www.youtube.com/watch?v=QCvx2O5M69E
Complex texturemapping on Atari ST: https://www.youtube.com/watch?v=gHwUchzEG8k
Polygon landscapes on the Lynx: https://www.youtube.com/watch?v=DPexnRsNJDs
As a former Jaguar developer, I'm getting PTSD just looking at the diagram with the four different CPUs, all on the same memory bus in constant contention, all with their own unique machine instruction set.
I didn't know much about this, but was surprised to read "a Motorola 68000 […] running at 14 MHz […] two 32-bit RISC processors running at 26.59 MHz".
I wondered how those could share a bus, with that weird 1:1.899 ratio between CPU frequencies. I think https://en.wikipedia.org/wiki/Atari_Jaguar#Processors answers that. It says the 68000 ran slightly slower, at 13.295 MHz. That makes the other clocks exactly twice as fast.
https://en.wikipedia.org/wiki/Clock_domain_crossing
http://ai.eecs.umich.edu/people/conway/VLSI/VLSIText/PP-V3/2...
One of the motivators behind GOAL (the Lisp dialect used for the Jak & Daxter games) was to allow programming the PS2's various processors very close to the metal in a single language.
The Jaguar might benefit from a GOAL-like language of its own.
You can see some GOAL here: https://web.archive.org/web/20070127022728/http://lists.midn...
Andy Gavin has a great interview on the technical side of developing Crash Bandicoot [0], which they wrote in GOOL, the precursor to GOAL. In fact, they wrote GOOL explicitly for Crash Bandicoot. Later, when writing Jak & Daxter, Naughty Dog developed GOAL. Sony ended up buying ND -- among other things, they were hoping to use ND's code, and were quite surprised to discover that it was all written in a homebrewed Lisp dialect.
[0] https://www.youtube.com/watch?v=izxXGuVL21o
While we are on the subject of Lisp-likes for game systems, there is a recent effort for the NES which is intriguing. It has been discussed here on HN, of course. Just search the archives.
http://www.dustmop.io/blog/2019/09/10/what-remains-technical...
Yes, I think some of that complexity was due to interference by Atari, they definitely took a dump on the Handy/Lynx by changing the power circuitry and the size of the unit.
From what I've read, the joypads were lifted straight from the Panther, as were the launch games, rather than being redesigned.
The CPU was always meant to be 32-bit with a cache; Atari chose the 68000 for ease of porting games over, but cheaped out rather than choosing an EC68020, which would have increased the bus bandwidth by 4 times.
The same goes for the lack of a CD unit - shipping with one would have helped attract developers, since the cost of ROM carts made the platform very unattractive for most dev houses.
Additionally, having the designers do a proper SDK would have helped, especially with the direct memory access bug, which could have been mitigated with an official workaround.
Lastly, the chipset was designed to be able to use 2 memory banks, so having 1 MB + 512 KB would've been better than 2 MB on one bank.
Obviously having more time to develop the chipset would have been beneficial. I believe Atari insisted on the object processor in addition to the DSP and blitter, which the designers were not keen on, and which by that time had been shown to be a bit of a dead end (Atari GTIA->MARIA->Copper vs tile/sprite-based VDPs or blitter-based processors). If they'd not put one in, maybe the chips would've been ready earlier/better, or could've had some cache/registers for the blitter.
Even without the chipset improvements, a handful of changes would've made things a lot easier for developers, and faster to boot.
> EC68020 which would have increased the bus bandwidth by 4 times.
I thought it was only 2x (16-bit bus on the straight 68k). It did also have a cache though, which likely would have helped a lot. Wikipedia does indicate the EC68020 was considered for the Jaguar 2. My gut would be that the EC68020 was considered 'too expensive'; back then 130k extra transistors still meant a few bucks from a cost standpoint, and Atari was already cutting odd corners in a desperate stupor.
That bus is terrifyingly bad, IMO - I remember developers lamenting it in interviews I've read over the years.
One of the -smartest- things Sony did with the PlayStation was picking an architecture that was known to be capable of 'pushing graphics' and licensing/reusing what they could from SGI's system designs. Probably a big part of why, while still challenging for its age, it was at least -sane- compared to what was involved on the Saturn (lol, quads) or the Jaguar.
Nintendo, while late to the 32-bit party, also went with an SGI-based architecture, but stuck with ROM carts, which still cost them a bit of business.
Funny, the TG-16 suffered from many of the same problems as the Jaguar (specifically, questionable CPU bus sizes and quirky hardware design). Alas, it seems nobody from Atari paid attention to that lesson.
I find the Amiga graphics the nicest, then the Jaguar's; the 3DO's are so terrible and overwrought that I never would have liked this game if that had been the first version I played.
Nostalgia? Maybe. But I find the minimalist polygon graphics of the original a big part of what made Another World a big hit. At least to me. In my mind this is related to the concept of "closure" as described in "Understanding Comics": the more abstract, more symbolic the graphics, the more my mental processor can give it hyperrealistic meaning in my mind; conversely, the more detailed the graphics, the less interesting they become and the less room for my mind to "complete" them. The beast as a black outline with eyes and fangs is terrific (and terrifying). The beast as a more detailed rendition would probably just look silly.
Plus, you know, the polygon graphics look objectively cool ;)
I was an Amiga user, and I certainly can see why you prefer it, but the Jaguar ones look great as well (something I'd never have admitted to back in my Amiga days when Atari was "the enemy") - it feels like pixel art made for the resolution.
The problem with the 3DO one, to me, is that it looks like it was just downsampled from a high resolution image rather than drawn for the resolution it was rendered at.
> back in my Amiga days when Atari was "the enemy"
To the best of my recollection that was the first major fanboi war in computers, also amplified by the advent of pre-internet online communities in the form of BBSs (Bulletin Board Systems). I fondly remember accessing GEnie[0] from my trusty Atari 1040ST (my first home computer, since it had MIDI built in).
However, I have to admit, the whole fanboi thing struck me as silly, since Amigas were graphically superior, while Atari ended up attracting MIDI developers and users. Not until much later did I realize that, in marketing terms, a fanboi war may very well benefit both sides. So while I still shake my head at fanboi posts, I also see why manufacturers like those wars.
[0] https://en.wikipedia.org/wiki/GEnie
I think it's the same as the effect of a really good book and a really good TV or film adaptation.
I can count on one hand the TV/film adaptations that lived up to my imagination - most recently The Expanse and Altered Carbon, both of which absolutely nailed the source material.
I'm torn in the middle, but I don't totally agree. I had never seen the 3DO version before now, but I really like how it evolves on the original layout. I think the artwork is really interesting and "other worldly" while still retaining some of the abstractness. Part of this might be due to being familiar with the original and having an exciting alternate version to see; I'm not sure the game would have been as notable without the polygon aesthetic.
The link skips over the part explaining that this is part of a series about how Another World was implemented on many platforms: http://fabiensanglard.net/
Also, in this video https://www.youtube.com/watch?v=tiq0OL8rzso&t=2m the person who implemented the SNES version tells the story first-hand.
Funny how Sony did a similar thing (and had similar problems) with the Cell architecture of the PS3.
The developers who took the time to really target the platform (Naughty Dog? Kojima Productions?) managed to extract a lot of performance from it, and the PS3 really felt like something from the future for its time, but it made developing and porting to the platform much harder.
(The difference is that the Cell architecture was more like a cluster architecture, focused on multiprocessing, while the Jaguar seems to have unique components, each with its own quirks.)
It's more analogous to the PS2 and its pipeline of arcane custom chips. All of them had their own flavor of assembly, including a few chips that issued two opcodes per instruction word, each operation with different cycle latencies. These chips had to be driven by the DMA and synchronized with double buffers to ensure you had a decent fill rate of polygons. The CPU had a 'scratchpad' which you could access in 1 cycle (as opposed to 30-200 from main memory depending on cache misses), and which would also be fed by the DMA. Also, one of its vector processors was accessible (slowly) directly from the main CPU as a coprocessor (the so-called macro mode), but could also be driven by the DMA (micro mode). The main processor and the input/output processor were two flavors of MIPS, the latter being the same as the PS1's, presumably to provide backward compatibility.
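To make the double-buffering point concrete, here is a rough sketch of the pattern in plain C++ - not Sony's API, just the shape of it; dma_kick(), dma_busy() and build_draw_commands() are invented stand-ins:

    // Sketch of "build one packet while the DMA engine drains the other".
    // Nothing here is the real PS2 interface; the three functions below are
    // hypothetical stand-ins for whatever the hardware/SDK actually exposes.
    #include <cstdint>
    #include <cstddef>

    struct Packet {
        uint64_t words[4096];
        std::size_t count = 0;
    };

    void dma_kick(const Packet&);              // hypothetical: start an async transfer
    bool dma_busy();                           // hypothetical: previous transfer still running?
    void build_draw_commands(Packet&, int);    // hypothetical: CPU-side transform + packing

    void render_frame(int num_batches) {
        static Packet buffers[2];
        int building = 0;
        for (int batch = 0; batch < num_batches; ++batch) {
            Packet& p = buffers[building];
            p.count = 0;
            build_draw_commands(p, batch);     // CPU works on this buffer...
            while (dma_busy()) { /* wait, or do more CPU work */ }
            dma_kick(p);                       // ...then hands it to the DMA
            building ^= 1;                     // and starts filling the other one
        }
    }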
I'm certain the PS2 architecture was designed by a mad genius - and it took one to use the system to the max. If it hadn't been for the popularity of the PS1, it might have gone the way of the 3DO. The PS3 was a dream to work with in comparison.
My approach to squeezing the most out of the Jaguar was to start by getting a fairly complex fused tiled-polygon-and-texture rendering algorithm working on the 68000 first, written mostly in C++ for GCC, with some of the classes generated in Lisp.
The aim was to get all the geometry and other details right, focused on feeding calls to a texture-scan-line inner loop.
Then to switch the inner part to a fast, asynchronous command pipe, pushing commands to the GPU/blitter et al. which would read and execute those as fast as possible. And then as needed optimise any bottlenecks on the 68000 using assembly.
The idea was to keep the GPU/blitter as busy as possible filling pixels from simple texture-line commands, with minimal logic to fetch and set up each new blit (i.e. no higher level geometry calculations during which the texturing would be idle), while the 68000 ran in parallel generating those commands and doing the higher level geometry, which was quite complex in our game.
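A very rough sketch of the shape of that command pipe, in C++ rather than the mix of 68000 and GPU code it really was - the command layout, ring size and draw_textured_span() are invented for illustration, not the actual interface:

    // Single-producer/single-consumer ring of tiny "draw one textured span"
    // commands in shared RAM. The 68000 pushes, the GPU/blitter side pops and
    // fires blits. All names and layouts here are illustrative only.
    #include <cstdint>

    struct TexLineCmd {
        int16_t  x0, x1, y;      // screen span
        uint32_t tex_addr;       // source texture row
        int32_t  u, du;          // 16.16 fixed-point texture coord + per-pixel step
    };

    void draw_textured_span(const TexLineCmd&);  // hypothetical: set up + fire one blit

    constexpr int RING = 256;
    TexLineCmd queue[RING];
    volatile uint16_t head = 0;   // advanced by the 68000 (producer)
    volatile uint16_t tail = 0;   // advanced by the GPU   (consumer)

    // 68000 side: only stalls if it has run a full ring ahead of the GPU.
    void push_cmd(const TexLineCmd& c) {
        while (((head + 1) % RING) == tail) { /* GPU behind; wait */ }
        queue[head] = c;
        head = (head + 1) % RING;
    }

    // GPU side (in reality GPU-local code): drain whatever is queued.
    void gpu_drain() {
        while (tail != head) {
            draw_textured_span(queue[tail]);
            tail = (tail + 1) % RING;
        }
    }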
I got everything running just right on the 68000 for a nice demo, with the graphics and maps we had ready by then. It was visually perfect, and play speed was just about usable but not smooth like we were aiming for.
Unfortunately perhaps, that's when Atari called my boss in to do a milestone demo, so that version was shown.
When my boss returned, I was told whichever Tramiel was head of Atari at the time was "angry" as the game looked "too slow", and the project was cancelled. :-(
Just as I was getting the GPU/blitter accelerated mode working, which was projected to be two weeks' work (since everything up to that point had been aimed at this).
I never did get to complete or show off the fast version. So close!
Atari was in trouble and collapsed very soon after, so maybe that wasn't the real reason.
We then started moving the project to the Fujitsu FM Towns: https://en.wikipedia.org/wiki/FM_Towns
And then the PC using DJGPP.
The geometry, texturing and physics systems could be moved over because they were mostly GCC-compatible C++. (At the time different C++ compilers accepted different dialects.)
But all the effort spent optimising the engine around pushing pixels as fast as possible out of the Jaguar's hardware subsystems had to be abandoned.
Fortunately the pipeline architecture translated quite well to x86 texturing, first using awesome integer register tricks (very few registers on x86, but because you could address individual bytes and words inside dword registers you could "vectorise" some of the texturing and address arithmetic); later the FPU became faster.
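As an illustration of the kind of register trick I mean (not the actual engine code): pack u and v as 8.8 fixed point into one 32-bit register so a single add steps both per pixel, at the cost of keeping an eye on carries from the low half:

    // Two 8.8 fixed-point texture coordinates packed into one 32-bit word:
    //   uv = (u << 16) | v,  duv = (du << 16) | dv
    // One add per pixel advances both. Real code kept guard bits or masked,
    // since a carry out of v's fraction bleeds into u; this sketch ignores that.
    #include <cstdint>

    static inline uint8_t sample(const uint8_t* tex, uint32_t uv) {
        uint32_t ui = (uv >> 24) & 0xFF;   // integer part of u (top byte)
        uint32_t vi = (uv >>  8) & 0xFF;   // integer part of v
        return tex[(vi << 8) | ui];        // 256x256 texture assumed
    }

    void texture_span(uint8_t* dst, const uint8_t* tex,
                      uint32_t uv, uint32_t duv, int count) {
        for (int i = 0; i < count; ++i) {
            dst[i] = sample(tex, uv);
            uv += duv;                     // steps u and v together
        }
    }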
This was in the days of Doom, Quake and Descent, when extremely hand-optimised assembly software rendering was normal on PCs, and consoles had very strange "not quite right for 3D" blitters. (Ask about the Sega Saturn sometime.) GPUs as we now know them were just starting to be created for PCs, and Microsoft launched DirectX a year or so later.
--
The Jaguar's CRY colour space was not great. We had a very large graphics asset pipeline at the time, derived from hundreds or thousands of photos of real physical models made by the artists (a very different look than drawing), and all of it had to be scaled, alpha-clipped and quantized into CRY, which was not kind to the colours. Adequate, but not great use of the colour bits available.
Somewhere I'm pretty sure I still have a "pnmtocry" executable in Linux a.out format :-)
I just wanted to say that I enjoyed this comment so much. As a Jag owner at the time I remember being taken by the production shots of that game that popped up in magazines, and couldn't wait to see it in motion. The craftiness necessary to pull off a compelling rendering pipeline on the hardware of that time period, combined with the aesthetic directions the constraints pushed designs in, still inspires me. (And somewhat indirectly led me down my own career path.) Thanks!
If we had published on the Jaguar, maybe that would have led to a whole different career for me too :-)
It's lovely to think someone outside the company enjoyed it. Because it was never published, it felt like multiple years put into something that just disappeared, and I didn't get much feedback from outside the company. That happened with the next game I worked on too, Mr Tank, which was more technically advanced and a lot of fun to play in the demos.
Bearing in mind I didn't work on the Saturn, this is what I recall from others about 3D rendering on the Sega Saturn:
- Could only draw quadrilaterals with any corner coordinates, but not triangles. You need triangles to draw general 3D polygon shapes. Triangles could be simulated by setting two corners to the same position, but that doubled the amount of rendering due to the next points:
- Drew quads in a strange way that overwrote lots of pixels repeatedly if they weren't screen-aligned rectangles. Instead of drawing horizontal spans on the screen to the bounds of the shape, as almost anyone else's renderer would do, so that each pixel was rendered once, it drew lots of sloping lines, starting with a line between points A and B at the "top" of the quad, working its way towards points C and D at the "bottom" of the quad, interpolating the endpoints and slopes appropriately.
- Drawing the sloping lines would have been ok (though not as efficient for RAM access) if it had managed to still draw each pixel once by careful choice of line pixels. But instead, it just drew sloping lines on top of each other, with the effect that you got a combination of some pixels being drawn over more than once inside the quad (wasted time), and also gaps where no pixels were drawn inside the quad (holes!). (See the sketch after this list.)
- To avoid the holes you had to ensure the line density was enough that you would definitely also have significant pixel duplication. The line density depended on the polygon's longest edge and its orientation after projecting from 3D to screen space.
- When drawing a highly distorted quad that meant a lot of pixel duplication at the more "pinched" end.
- Calculated texture coordinates by linear interpolation along these lines, instead of 3D projection. This made the texture look strongly distorted on quads not facing the viewer in 3D space, needing a workaround in software.
- And then there was the difficulty figuring out how to get the hardware to actually do these things from the poor documentation and an enormous list of some 200 registers, without useful sample code or libraries.
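Very roughly, the sweep looks like this (a geometry-only sketch based on what I was told, not the real hardware behaviour; plot() is a hypothetical framebuffer write). A conventional rasteriser would instead walk horizontal spans so each interior pixel is written exactly once:

    // "Sweep lines from edge AB to edge CD" fill. With too few lines you get
    // holes; with enough lines you get guaranteed overdraw on pinched quads.
    #include <cmath>

    struct P { float x, y; };

    void plot(int x, int y);   // hypothetical

    void draw_line(P a, P b) {                 // naive DDA line
        float span = std::fmax(std::fabs(b.x - a.x), std::fabs(b.y - a.y));
        int steps = (int)std::ceil(span) + 1;
        for (int i = 0; i < steps; ++i) {
            float t = (steps > 1) ? (float)i / (float)(steps - 1) : 0.0f;
            plot((int)std::lround(a.x + (b.x - a.x) * t),
                 (int)std::lround(a.y + (b.y - a.y) * t));
        }
    }

    // A,B = "top" edge, C,D = "bottom" edge, as in the description above.
    void fill_quad_sweep(P a, P b, P c, P d, int lines) {
        for (int i = 0; i < lines; ++i) {
            float t = (lines > 1) ? (float)i / (float)(lines - 1) : 0.0f;
            P p = { a.x + (c.x - a.x) * t, a.y + (c.y - a.y) * t };   // A -> C
            P q = { b.x + (d.x - b.x) * t, b.y + (d.y - b.y) * t };   // B -> D
            draw_line(p, q);   // pixels may be hit zero, one or many times
        }
    }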
Drawing a good looking 3D textured polygon was tricky on this thing. Even when you had it drawing polys they looked wrong in all sorts of ways until compensated for, and still looked a bit wrong no matter what you did.
And then you just knew it was much slower than it could have been given the rest of the hardware, bus speeds etc. It seems like the hardware designers had put a lot of effort into a complex chip, but unfortunately didn't understand much about 3D graphics.
The color space is probably one of the stranger things I've seen. I wonder what led the designers to choose that scheme. I'd love to see the design considerations for this whole system.
It's not conceptually that different from YUV[1], the color spaces used by PAL television and most JPEG images. I can't speak to the Jaguar, but both of those decisions were motivated by the desire to completely separate the luminance/brightness dimension from the 2 color dimensions. For PAL this was to maintain compatibility with black and white televisions during the slow adoption of color, for JPEG it's a compression trick since the chroma channels can be downscaled more than the grayscale without humans noticing as readily.
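For anyone who hasn't seen it, the separation looks roughly like this (BT.601-style full-range coefficients, rounded; the exact constants vary by standard):

    // Y carries brightness; Cb/Cr carry "how far from grey, and in which direction".
    // In 4:2:0 video only the Cb/Cr planes are stored at reduced resolution.
    #include <cstdint>
    #include <algorithm>

    struct YCbCr { uint8_t y, cb, cr; };

    YCbCr rgb_to_ycbcr(uint8_t r, uint8_t g, uint8_t b) {
        float y  =  0.299f * r + 0.587f * g + 0.114f * b;
        float cb = -0.169f * r - 0.331f * g + 0.500f * b + 128.0f;  // blue difference
        float cr =  0.500f * r - 0.419f * g - 0.081f * b + 128.0f;  // red difference
        auto c8 = [](float v) { return (uint8_t)std::clamp(v, 0.0f, 255.0f); };
        return { c8(y), c8(cb), c8(cr) };
    }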
In any case I'd argue the ability to reason about this space is a bit more natural than RGB (and a bit less than HSV) too.
But I'm sure there were some other specific requirements they were trying to satisfy...
[1] https://en.wikipedia.org/wiki/YUV
YCbCr (often called YUV for... reasons) and 4:2:0 chroma subsampling (half the horizontal and vertical resolution for each plane of color information compared to the brightness) are still the dominant color encoding used for digital video, HDTV, Blu-ray, etc.
The "CRY" model used by the Jaguar is fairly different than (though as you say, related conceptually in its separation of brightness/darkness information, of course). I think it basically just arose from a combination of big focus on smooth shading while keeping the color values workably small and fast to calculate on.
It is fundamentally different from YUV (PAL)/YIQ (NTSC). In those systems, the U/V and I/Q axes are monotonic: they mean the same thing everywhere in the plane. In the Jaguar color space, the axes have different meanings depending on which quadrant you're in. In the (-1,-1),(0,0),(1,-1) triangle the vertical dimension appears to have no effect.
When Nintendo used the YIQ color space natively for the NES, that made sense: it simplified the hardware, since YIQ is the color space used by NTSC. It's not apparent to me that there's an easy way to convert this non-monotonic color space into YUV/YIQ.
Thanks for the explanation and the link! I'd imagine it would make shading a little more computationally quick, now that you mention separating color from brightness.
The idea was to use a lot of Gouraud-shaded polygons (i.e. polygons with a gradient that goes to black), and the color space was designed, together with the blitter chip, to make those kinds of polygons fast.
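A rough sketch of why that was cheap, assuming the usual description of CRY (a 16-bit pixel with a chroma byte made of two 4-bit "C"/"R" nibbles plus an 8-bit intensity byte; treat the details as approximate):

    // Shading a span from bright to dark in CRY: chroma stays fixed, only the
    // intensity byte is interpolated, so the inner loop is one add per pixel -
    // exactly the kind of work a blitter can do in hardware.
    #include <cstdint>

    void shade_span(uint16_t* dst, int count,
                    uint8_t chroma,              // packed C/R nibbles (constant across the span)
                    int32_t i0, int32_t i1)      // start/end intensity, 8.8 fixed point
    {
        int32_t i  = i0;
        int32_t di = (count > 1) ? (i1 - i0) / (count - 1) : 0;
        for (int x = 0; x < count; ++x) {
            dst[x] = (uint16_t)(((uint16_t)chroma << 8) | ((i >> 8) & 0xFF));
            i += di;                             // one add per pixel
        }
    }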