On a personal note, this is also part of why I exist at all.
My father came from England very young, worked his way through agriculture jobs in California, and somehow found himself with a job offer at the U of U as an electron microscopist. Met my mom there. Details are fuzzy and not retrievable from the dead. For some reason this job required him to have early access to the “internet” in the 80s.
In 1994, we were, as I was told at the time, one of the first families to “have internet” in Utah. We had a dial-up connection that lasted 15 minutes before needing to reconnect. As anyone my age remembers, that was about as long as it took to load one webpage with one image.
It was a massive influence on me and my neighborhood friends in countless ways and I’m eternally grateful for what came of it.
East High School is the closest public high school to the University of Utah. Because of this proximity the school was fortunate to get a direct T1 (1.5 Mbps) connection in 1992 (93?).
The original domain was east.east-slc.edu before it was standardized to east.k12.ut.us circa 1995.
After school every day for a few hours the East High CS room would be full of students exploring the new online world: surfing gopher, playing MUDs, reading Usenet, and using NCSA Mosaic on the DECstation. This was when Yahoo! was all hand-curated.
Students could even dial in to one of 2 modems and connect to the Internet from home. It was glorious.
Skyline High School had a teletype terminal by 1978. I think it connected to the University of Utah, though I am not absolutely certain. But that was a long way below a T1 line...
1994 was way, way after dial-up Internet access was mainstream (both Yahoo and Amazon were founded that year).
Any first access in the state would have been sometime in the 80s. By 1993 there were already national-level dial-up ISPs.
> that was about as long [15 minutes] as it took to load one webpage with one image.
Very hyperbolic. A simple webpage with text would load in seconds on a 28.8k modem. A single image was usually tens of kB in those days, so maybe a few seconds, not even a minute.
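Rough numbers, assuming a clean line and a responsive server: a 30 kB image is 240 kbit, and 240 / 28.8 ≈ 8 seconds of transfer time; call it 10-15 seconds with protocol overhead. Even at 9600 bps it's about 25 seconds, still well under a minute.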
This is not correct for access open to the general public. The first commercial ISP in Utah, Xmission, was founded in 1993. Yes, many of us had internet access through the University of Utah before that (Pete Ashdown, the founder, had worked at Evans & Sutherland, which had quite good internet connectivity). But most people did not if they weren't associated with a university.
(I helped create the third public ISP in Utah (ArosNet), in 1995).
V.34 (28.8k) was only ratified in 1994, and many ISPs were still at 14.4k at that time. Many customers still used much slower modems - 9600 bps remained quite common.
The commercial Internet really only started taking off in 1993. Not by coincidence, that was the same year NCSA Mosaic was released.
Right, unless the backhaul was massively overcommitted... which was normal at the time because it was really difficult to sufficiently provision backhaul, even a few months in advance. The internet was certainly not fast in 1994.
I'd argue it wasn't until 96/97 when "everyone" started using it and membership didn't quite peak with services like AOL until 2001.
The internet was still the land of the nerds until the early 2000's.
Anyway, since when does the truth need to get in the way of a good story?
You assume 28.8, good phone lines, and a responsive server. Sometimes it was an old 9600 because it was all you could easily get, noisy lines, and the server on the other end being slow because the picture was popular. Then a page could load for a minute, with pictures and all.
Not 15 minutes though.
Dialup services were available. Most had little to no internet connectivity. The few services dedicated to internet access were not mainstream yet. MS was preparing to deploy MSN 1.0 with no internet because that was just a hippie fad.
There's a rivalry between the NBA fan bases of the Utah Jazz and the Houston Rockets. Commonly this manifests as little jabs Rockets fans will make, mocking Utah for not having internet.
Always funny to see this brought up when Utah was one of the first places on the planet to have the "Internet".
You inherited your dad's British humour :)
The U of U still has a large block of public IPs that they use for all their "internal" systems. No need for NAT there!
That was at least 100% true as of around 15 years ago. I know the CS and engineering departments are still using that block. Hard to imagine that it still holds true for WiFi APs, etc and the proliferation of devices.
I can only imagine how cool it must be to be raised by early adopters of internet technology.
I can somewhat relate to this story on a couple of coincidental points. First, my father was a relatively early adopter of the PC in his professional life as a civil engineer in the early 80s, though an internet connection was still about twenty years in his future. Second, I learned electron microscopy on a JEOL microscope from 1994 when I was doing my master's in materials science.
And this is my first comment on HN. I hope everyone is doing wonderfully well.
Most of the discussion on this thread is about my timeline of saying we were one of the first families with internet in 1994. First of all, I did try to qualify it with “as I was told” - I was a kid and all I knew was what my dad and the internet told me - but I was writing that comment in haste, and on second thought the timing was certainly more like 1992. Also, I didn't say that's when we got it, just when he told me that. As I said before, he had it in the 80s. Just to throw a wrench in the discussion. Also, yes, it was hyperbole to say it took 15 min to download an image, but just barely, and just for the fun of the story. Love you all and your pedantic asses.
1994 already feels a tad late in my memory; everyone at my high school already had email in 1993, and we knew several families with internet in their house by 1992.
During my internship, I worked at Evans and Sutherland, which was just off the University of Utah tech park and was a major part of the UoU graphics world. I worked on what I think was the last generation of E&S image generators. It was pretty cool, especially the real-time Phong shading (this was the 1990s!) and calligraphic lighting (sort of like a vector display that overdrew the normal displays and made things like point lights look incredible!) in the actual flight simulators.
The company was visibly dying at the time and I didn't take their offer to join, I just had other plans, but the engineers were incredibly welcoming. One of the people I worked with went on to be an Imagineer. I wish I had not lost touch with them; the guy I reported to was a great guy.
It was just an amazing place, but the writing was on the wall. GPUs were just starting and I looked at the rack full of gear that was the image generator and thought: yeah, this is all going away soon. That was, in fact, exactly what happened.
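For anyone who hasn't run into the term: Phong shading means interpolating the surface normal across each polygon and evaluating a lighting model at every pixel, which is why doing it in real time in the 1990s was such a big deal. Here's a toy Python version of the classic Phong reflection model, just to show the math - it has nothing to do with how the E&S hardware actually did it:

    import numpy as np

    def phong(normal, light_dir, view_dir, base_color,
              ambient=0.1, diffuse=0.7, specular=0.5, shininess=32):
        # Classic Phong reflection model: ambient + diffuse + specular terms.
        n = normal / np.linalg.norm(normal)
        l = light_dir / np.linalg.norm(light_dir)   # surface -> light
        v = view_dir / np.linalg.norm(view_dir)     # surface -> eye
        r = 2.0 * np.dot(n, l) * n - l              # light direction reflected about the normal
        diff = max(np.dot(n, l), 0.0)
        spec = max(np.dot(r, v), 0.0) ** shininess
        return base_color * (ambient + diffuse * diff) + specular * spec

    # Example: white surface, light and viewer both roughly overhead.
    print(phong(np.array([0.0, 0.0, 1.0]),
                np.array([0.3, 0.2, 1.0]),
                np.array([0.0, 0.0, 1.0]),
                np.array([1.0, 1.0, 1.0])))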
Speaking of E&S vector displays: as a young volunteer at our local children's museum[1] in the early '90s, I spent many pleasant hours screwing around with a first-generation Digistar planetarium system, which was essentially a vector projection display with an enormous dome-shaped screen driven by scripts running on a MicroVAX in the back room that could be interacted with via a variety of controllers — joysticks, faders, etc. — on the planetarium floor.
Sadly, this interactivity was never used in an official capacity while I was there; all public shows were fully automated affairs. But this didn't stop twelve-year-old me from simulating starship combat in between shows and after hours!
Computer-wise, this was one of the highlights of my childhood.
[1] https://en.wikipedia.org/wiki/The_Children's_Museum_of_India...
https://en.wikipedia.org/wiki/Utah_teapot
sort of like the great-great-grandparent of 3dbenchy:
https://en.wikipedia.org/wiki/3DBenchy
History of computer animation: https://en.wikipedia.org/wiki/History_of_computer_animation
https://en.wikipedia.org/wiki/Stanford_bunny
Or, more distantly related, the test photo Lena, which recently fell victim to political correctness:
https://en.wikipedia.org/wiki/Lenna
The photo does have Martin Newell in it who created the famous model.
The duo in 1969 developed the line-drawing system displays LDS-1 and LDS-2, the first graphics devices with a processing unit. They then built the E&S Picture System—the next generation of LDS displays.
I'm working with my limited and stereotyped knowledge of Utah, but is "line-drawing system" a easter egg reference to latter day saints?
Recently Utah had a small commemorative conference where the photo in the article was taken. Ivan Sutherland attended, and gave what I thought was the best talk, which had nothing to do with graphics. He's actively doing research on Single Flux Quantum (SFQ) circuits at 85 years old! [1] He's trying to spread the word that the US is about to miss out on the next hardware revolution because we're not paying attention to SFQ. [2] The talk is awesome and worth watching. Anyway, I feel like this was a sneak peek into one of the biggest reasons why that early graphics team was so successful: Ivan's amazing. Combined with some other magic ingredients, their environment was special and, I think, sadly, unreproducible.
[1] https://www.youtube.com/live/LUFp6sjKbkE?feature=share (For Ivan's talk, scroll to 5:50:00 - that's 5 hours and fifty minutes in.)
[2] https://www.nytimes.com/2023/04/19/technology/ivan-sutherlan...
Evans and Sutherland also developed CDRS (Conceptual Design and Rendering System) which was later bought by PTC and developed into ISDX. This is a 3D CAD tool for high quality surfacing used by industrial designers and surfacing experts to develop exteriors for automotive and consumer products. It was originally written in Lisp.
Unlike many other CAD tools, especially for its time, it described the relationships between surfaces as a bunch of rules. So surfaces 'A' and 'B' could influence 'C', and then the user could change their mind and decide that 'C' should drive 'A' and 'B' instead. Unlike other parametric tools which would come later, you didn't set this up by reordering a bunch of features; you just changed the connection icons between the surfaces. It was up to CDRS to figure out what order to solve the constraints and build the surfaces.
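For anyone wondering what "figure out what order to solve the constraints" looks like in practice: at its simplest it's dependency ordering. This is only a toy Python sketch with made-up surface names (CDRS certainly did something far more sophisticated, and in Lisp), but it shows the flavor - flip who drives whom, and a topological sort produces the new rebuild order:

    from graphlib import TopologicalSorter  # stdlib, Python 3.9+

    # Hypothetical rule set: each surface maps to the surfaces that drive it.
    rules = {
        "C": {"A", "B"},  # A and B influence C
        "A": set(),
        "B": set(),
    }

    def rebuild_order(rules):
        # Topological sort: every surface is built after the surfaces that drive it.
        return list(TopologicalSorter(rules).static_order())

    print(rebuild_order(rules))  # e.g. ['A', 'B', 'C']

    # The user changes their mind: now C drives A and B instead.
    rules = {"A": {"C"}, "B": {"C"}, "C": set()}
    print(rebuild_order(rules))  # e.g. ['C', 'A', 'B']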
Right out of high school I started as an intern at PTC in Research Park shortly after they acquired CDRS from E&S.
CDRS was originally started by some of the researchers from the University of Utah that created the Alpha_1 NURBS modeler.
After PTC's org-wide rebranding, it was called Pro/Concept.
I worked on the other product in the group called Pro/3D Paint. It was the first product to use projective texture mapping to allow industrial designers to draw directly on 3D models instead of in texture space.
So many more memories I wouldn’t know where to begin…
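Since "projective texture mapping" may be unfamiliar: the idea is to project points on the model through the current camera into screen space, and copy whatever the designer just painted there back into the model's texture. A very rough numpy sketch of that idea - hypothetical function and array layouts, no depth test, and nothing like the actual Pro/3D Paint code:

    import numpy as np

    def paint_projectively(texel_positions, texture, brush_image, view_proj):
        """Copy a screen-space brush stroke onto a model's texture.

        texel_positions: (H, W, 3) world-space surface point for each texel
        texture:         (H, W, 3) the texture being painted
        brush_image:     (SH, SW, 3) what the user just drew on screen
        view_proj:       (4, 4) world -> clip space camera matrix
        """
        sh, sw, _ = brush_image.shape
        h, w, _ = texel_positions.shape
        pts = np.concatenate([texel_positions.reshape(-1, 3),
                              np.ones((h * w, 1))], axis=1)        # homogeneous coords
        clip = pts @ view_proj.T
        ndc = clip[:, :3] / clip[:, 3:4]                           # perspective divide
        sx = ((ndc[:, 0] + 1) * 0.5 * (sw - 1)).astype(int)        # NDC -> pixel coords
        sy = ((1 - ndc[:, 1]) * 0.5 * (sh - 1)).astype(int)
        on_screen = (sx >= 0) & (sx < sw) & (sy >= 0) & (sy < sh) & (clip[:, 3] > 0)
        out = texture.reshape(-1, 3).copy()
        out[on_screen] = brush_image[sy[on_screen], sx[on_screen]]  # paint those texels
        # A real tool also needs a depth test so hidden surfaces aren't painted.
        return out.reshape(h, w, 3)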
I started using ISDX in ‘96 soon after PTC acquired it and managed to port it and make it work with Pro/E. I’ve worked on a number of products which are still in production using it, but it’s been a while since I was using Pro/E (now Creo). I don’t miss Creo, but I do miss ISDX. There was a while when I was using Solidworks and Pro/E (with ISDX) depending on the project/client, but then I transitioned to Solidworks for many years. Now I’m using Onshape (started by the key people from Solidworks) which recently was acquired by PTC. Both Solidworks and Onshape use the Parasolid kernel which was first developed for Unigraphics. It seems like it’s a small world in CAD tool development.
I mentioned in another thread I used alpha_1 in high school at the U (2002). I had actually had Autocad (R16) 3D CAD experience at the time, but alpha_1 was a lot of fun. I ended up making a 3D Model of Escher’s Belvedere. One of the guys was going to “print” it down in the lab in the bottom of MEB (they did have some form of 3D printer) but thought it might be hard and unstable. I ended up getting it rendered with ray tracing on one of the brand new UltraSPARCs they had in the lab, and it was printed on a T shirt that everyone got. We also got a copy of VisualStudio 6 from Microsoft (we toured their studio near the airport) which I sold on ebay for $375 at the time.
I used CDRS while at Caterpillar in 1997 and purchased it in the same year not knowing it was slated for the graveyard. I made it my mission to be the expert ... luckily STYLE became the cool new thing in 2001 and I was able to be expert in that right away.
Bart Brejcha, Design Engine
Computer graphics has always been a world apart in computing. Even today we still have the somewhat arcane world of GPUs versus the CPUs for the common folk.
Programming-wise too, computer graphics, especially 3D, has this pronounced mathematical / geometric aspect to it that is not shared with normal applications. That has been inherited by the game industry but remains a thing apart.
In some sense it is only now, half a century later, with the increased emphasis on data science and ML/AI that this fundamental cleavage is being repaired:
Both on the hardware side, with the repurposing and utilization of GPU architectures for non-graphical compute, and on the software side, with the elevation of more mathematical objects like tensors (i.e., n-dimensional arrays) into first-class citizens.
Some of that disjointness might have been unavoidable, but maybe these guys (remarkably 100% male btw) in the family photo had something to do with it.
The counterfactual would have been something like graphics-enabled CPUs and more mainstream support in programming languages for the associated mathematical operations.
I've been writing 3D graphics software since '84, and a few years ago when I started formally learning machine learning I found it had so many parallels with 3D graphics, the transition was quite a bit easier than I expected.
If you haven't read Creativity, Inc. by Ed Catmull of Pixar, he (and his ghostwriter) do a great job talking about his work on early animation tech at Utah. Highly recommend.
Another great book that covers the early days of computer graphics work is Droidmaker [1]
For a book that is purportedly about George Lucas and his contributions, it goes into a great amount of detail about what the computer graphics pioneers were doing before they worked for Lucas.
I didn't realize the group was so small, and so influential. Highly recommended.
[1] https://www.droidmaker.com/
I've read "A Biography of the Pixel" by Alvy Ray Smith. It's an amazing and educational book about the history of computer graphics and Smith's definition of a pixel.