I was webmaster for this site (and thousands of others at WB) back in 2001! I believe this was when we did the great www -> www2 migration, which was of course supposed to be temporary. In fact I think that was when we migrated from our own datacentre to AOL's but I could be getting the timing wrong.
Back then it was served from a Sun E4500 running Solaris (7?) and Netscape Enterprise Server. Netscape had been acquired by AOL which had also just bought Time Warner (that's why we moved to their datacentre) but somehow we couldn't make the internal accounting work and still had to buy server licenses.
Fun fact, unlike Apache, NES enabled the HTTP DELETE method out of the box and it had to be disabled in your config. We found that out the hard way when one of the sysadmins ran a vulnerability scanner which deleted all the websites. We were forbidden from running scans again by management.
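For what it's worth, the non-destructive way to check for that kind of exposure today is to ask the server what it advertises before a scanner ever sends a DELETE. A rough sketch (Node 18+; the hostname is just a placeholder, and plenty of servers won't answer OPTIONS honestly):

    // Probe which HTTP methods a server advertises, without invoking any of
    // them. The hostname is a placeholder, not the real setup described above.
    async function probeMethods(target: string): Promise<void> {
      const res = await fetch(target, { method: "OPTIONS" });
      const allowed = res.headers.get("allow") ?? "(no Allow header returned)";
      console.log(`${target} -> status ${res.status}, advertised methods: ${allowed}`);
      if (/\bDELETE\b/i.test(allowed)) {
        console.warn("DELETE is advertised; make sure that is intentional.");
      }
    }

    probeMethods("https://www.example.com/").catch(console.error);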
Another fun fact about NES - they were really pushing server side Javascript as the development language for the web (and mostly losing to mod_perl). Also back in 2001 but at a different place I worked with the person who had just written a book on server side js for O'Reilly - he got his advance but they didn't publish it because by the time he had finished it they considered it a "dead technology".
Our job was basically to maintain an enormous config file for the webserver which was 99% redirects because they would buy every conceivable domain name for a movie which would all redirect to the canonical one. Famously they couldn't get a hold of matrix.com and had to use whatisthematrix.com. We sysadmins ran our own IRC server and "302" was shorthand for "let's go" - "302 to a meeting". "302" on its own was "lunchtime".
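These days that giant redirect file would probably be a tiny host map in front of the canonical site. A minimal sketch (domain names invented), issuing the 302s that gave us our slang:

    import { createServer } from "node:http";

    // Hypothetical vanity-domain map: every extra domain 302s to the canonical site.
    const canonical: Record<string, string> = {
      "somemovie-the-film.example": "https://www.somemovie.example/",
      "somemovie2001.example": "https://www.somemovie.example/",
    };

    createServer((req, res) => {
      const host = (req.headers.host ?? "").split(":")[0].toLowerCase();
      const target = canonical[host];
      if (target) {
        res.writeHead(302, { Location: target }); // "302 to a meeting"
        res.end();
      } else {
        res.writeHead(404);
        res.end("unknown host");
      }
    }).listen(8080);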
I still mention maintaining this site on my CV and LinkedIn - disappointingly I've never been asked about it in an interview. I suspect most of the people doing the interviewing these days are too young to remember it.
> I still mention maintaining this site on my CV and LinkedIn - disappointingly I've never been asked about it in an interview. I suspect most of the people doing the interviewing these days are too young to remember it.
This is astonishing to me. I check back to see if this site is still up once every year or two just to have a smile. If you were sitting across from me in an interview I am quite sure I'd lose all pretense of professionalism and ask you about nothing else for the hour.
"We were forbidden from running scans again by management"
Seems like management believed it was better to wait for a real bad actor to purposefully destroy your site than to have it done by your honest employees by accident.
I know security in 2001 was much more lax (I started in 2000), but this still shows ignorance on management's part.
The right way to handle this would be to ask your staff to make sure services can be restored, and to make sure you know what the tests are doing, before you run them.
>Seems like management believed it was better to wait for a real bad actor to purposefully destroy your site than to have it done by your honest employees by accident.
From a politics standpoint that is completely true. Which would you rather tell your boss:
Q: Why is the website offline?
A: One of our sys admins accidentally deleted it.
OR
Q: Why is the website offline?
A: Some nation-state/teenager launched a sophisticated cyber attack; we need to increase the cybersecurity budget. It's the wild west out there!
I don't really agree with that decision, but maybe that phrasing is not 100% accurate. Maybe they just meant that they should run the scan in a local test environment and not in the production deployment. Obviously there's value in running it against live, but at least for this particular issue, they could probably have caught it in a testing environment.
It used to be super slow. Like, you cannot imagine how slow it was.
The usual example I trot out is from when I was writing a client-side pivot table creator in the mid-2000s: as far as I can remember, with just 100 items the JS version took 20-30 seconds. I then tried it using XML/XSLT instead[1] and it was instant.
I haven't checked recently, but even a few years ago JavaScript was extremely slow with large datasets. I was mucking around with Mandelbrot generators, and JS compared to C# was like a bicycle vs a jet engine; they weren't even vaguely in the same league performance-wise. I just had a quick look at some JS-based versions and it looks like they've got a bit faster, but still slow.
[1] Awesome performance, super hard for others to understand the code. XSLT was great in some ways, but the learning curve was high.
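For anyone who never wrote one: the heart of a pivot table is just grouping rows by two keys and aggregating, roughly the shape of the computation described above. A small sketch (field names invented) of the kind of loop that took tens of seconds on a mid-2000s JS engine and is instantaneous now:

    interface Row { region: string; product: string; sales: number; }

    // Pivot: group rows by region (down) and product (across), summing sales.
    function pivot(rows: Row[]): Map<string, Map<string, number>> {
      const table = new Map<string, Map<string, number>>();
      for (const r of rows) {
        const across = table.get(r.region) ?? new Map<string, number>();
        across.set(r.product, (across.get(r.product) ?? 0) + r.sales);
        table.set(r.region, across);
      }
      return table;
    }

    // Example: 100 rows, the size mentioned above.
    const demo: Row[] = Array.from({ length: 100 }, (_, i) => ({
      region: `region-${i % 5}`,
      product: `product-${i % 7}`,
      sales: i,
    }));
    console.log(pivot(demo));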
Having done it NES-style, I'm kind of glad that server-side JS didn't catch on.
Node was an unlikely event. It hit the sweet spot precisely when async started being needed: there was a runtime that made it viable (V8, thanks to massive investment by Google in Chrome), and a language that made async feel kind of natural.
Server side javascript didn't catch on the first time around because you couldn't create threads. Because in the early 2000's, all technical interviews asked about Java style multi-threading. At some companies, the first round of technical interviews was a very, very detailed discussion about the Java threading model. If you didn't pass, you didn't get to round two. So everybody wanted to use threads.
I like how it loads instantly! :-) I remember staring at this thing for about 2 minutes, watching it load over dial-up in a third-world country, on a laptop my dad somehow smuggled into the country from Dubai without paying import duties. Yeah.
Funny you mentioned matrix.com; back between Matrix 1 and Matrix Reloaded I knew someone who knew the guy who owned thematrix.com, who apparently was not offered enough to sell it. I guess they must have agreed at some point, because I also recall that around the time of Reloaded the domain started redirecting to the canonical Matrix site (which it sort of does now). Wonder what the price turned out to be.
:) This made my day. Were you based in LA at the time? I’m out in LA now and love hearing stories like this from colleagues who’ve done engineering work in the entertainment industry.
Here's another one - Solaris had a 2GB file size limit (this was before ZFS). Which isn't as crazy as it sounds now - hard drives were 9GB at the time. So ordinarily this wasn't a problem, but when the first Harry Potter movie came out, harrypotter.com (which was being served off the same server as spacejam.com) was the most popular website on the internet and the web server log would hit the 2GB limit every couple hours and we would have to frantically move it somewhere and restart the process.
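The babysitting amounts to something like this sketch (path and threshold hypothetical): watch the log size and rotate it before it hits the ceiling, except back then the rotate-and-restart step was a human.

    import { statSync, renameSync, existsSync } from "node:fs";

    // Rotate the access log before it reaches the old 2GB file-size ceiling.
    // Path and threshold are illustrative; the web server still needs to be
    // told to reopen its log (back then: restart the process by hand).
    const LOG = "/var/log/webserver/access.log";
    const LIMIT = 1.8 * 1024 ** 3; // rotate with some headroom below 2 GiB

    if (existsSync(LOG) && statSync(LOG).size > LIMIT) {
      const stamp = new Date().toISOString().replace(/[:.]/g, "-");
      renameSync(LOG, `${LOG}.${stamp}`);
      console.log(`rotated ${LOG} at ${stamp}; now restart/reopen the web server`);
    }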
> Another fun fact about NES - they were really pushing server side Javascript as the development language for the web
I started “web stuff” with Apache and Perl CGI, and I knew NES existed but never used or saw it myself. I had no idea “server side JavaScript” was a thing back then. That’s hilarious.
I submitted this and thought, "this is neat." I didn't expect to return to find it on the front page, nor the former webmaster showing up. The internet is a fun place.
That background story is fascinating. I wonder how many full-circles server side JavaScript has made up until now.
> Fun fact, unlike Apache, NES enabled the HTTP DELETE method out of the box and it had to be disabled in your config. We found that out the hard way when one of the sysadmins ran a vulnerability scanner which deleted all the websites. We were forbidden from running scans again by management.
Oh man, the early days were so exciting. Like that time I told my boss not to use Alexa on our admin page out of general paranoia... and a few days later a bunch of content got deleted from our mainpage because they spidered a bunch of [delete] links. I learned my lesson; secured the admin site a lil better and upgraded to delete buttons. Boss kept on using Alexa on the admin page tho.
> they were really pushing server side Javascript as the development language for the web (and mostly losing to mod_perl).
Enterprise server-side JavaScript was the first stage of dynamic web servers that connected to relational databases. Netscape LiveWire, Sybase PowerDynamo, and Microsoft Active Server Pages (with interpreted LiveScript or VBScript) were early. The enterprise software industry switched to three tiered app server architectures with JVM/.net bytecode runtimes. Single-process, multi-threaded app/web servers were a novelty and none of the client drivers for the popular relational databases were thread safe initially.
It took some time for RESTful architectures to shake out.
Apache (and mod_perl) sidestepped the thread-safety problem by being multi-process and single-threaded; you were always limited by how many Perl interpreters could fit in RAM simultaneously. Then came the era of the Java app server (WebLogic and WebSphere).
Everyone has mostly forgotten AOLserver (and TCL!). Almost everyone of that generation was influenced to use a RDBMS as a backend by Philip and Alex's Guide to Web Publishing which came out in 1998.[0] Although I never actually met anyone who used AOLserver itself! Everyone took the idea and implemented it in ASP or perl, until the world succumbed to EJBs.
[0] https://philip.greenspun.com/panda/
Well, the issue seems to be that the scan not only detected the vulnerability but actually realized the risk from it, which is exactly what scans are meant to help you avoid.
>the person who had just written a book on server side js for O'Reilly - he got his advance but they didn't publish it because by the time he had finished it they considered it a "dead technology".
Makes sense. This server-side Javascript thing will never take off. The Internet in general is really just a passing fad!
Your comment made me remember netcraft.com which was the tool we always used to see which web server and OS was running a site. There was a running joke back in the day on slashdot about "BSD is dying" as Linux started to take over, based around Netcraft which used to publish a report showing which servers were most popular. I'm glad to see they haven't changed their logo in 20 years!
Their site report used to show an entire history so you could see every time they changed servers but it doesn't look like it does anymore. Now it's running in AWS so certainly not the same Solaris server. Although those E4500s were built like tanks so it could be plugged in somewhere...
https://sitereport.netcraft.com/?url=https://www.spacejam.co...
I once went to an industry presentation where someone on that project described the workflow. The project got into a cycle where the animators would animate on first shift, rendering was on second shift, and printing to film was done on third shift. The next morning, the director, producer, and too many studio execs would look at the rushes from the overnight rendering. Changes would be ordered, and the cycle repeated.
The scene where the "talent" is being sucked out of players had problems with the "slime" effect. Production was stuck there for weeks as a new thing was tried each day. All the versions of this, of which there were far too many, were shown to us.
Way over budget. Cost about $80 million to make, which was huge in 1996. For comparison, Goldeneye (1995) cost $60 million.
Awful film. I was listening to the radio the other day and they had a 10 year old on saying Toy Story was cringe and he preferred Space Jam. I've never been so appalled just listening to radio.
Why are you bothered by a child’s opinion on children’s movies? You’ll likely never meet the kid, and one’s sense of taste at that age is hardly definitive.
For someone that young, visuals tend to take precedence over story. I, for one, am glad to see a younger generation appreciate 2D animation (which still looks acceptable in Space Jam) over 3D (which looks dated in Toy Story).
It’s such a nostalgic feeling of the earlier web, back when interest groups, universities, fan pages, and web-rings ruled it - before it became commercialized by greedy folks who threw ads all over the place, tracked everything you did and spammed the hell out of your inbox.
I miss the good ol’ days and what the web was intended for.
One of my first projects was maintaining the site for: Looney Tunes Teaches the Internet.
The best thing about the early web was that nobody knew what it was for. So people just did things, without considering if it was "right."
Nowadays, you'd never get Bob's Labrador Page. Because "Hi! I'm Bob. I live in Lebanon, Kansas. I like Labrador dogs. Here are some pictures of my favorite Labradors!"
I was surprised to find a great example of a site like this recently, "How to Care for Jumping Spiders": https://kozmicdreams.com/spidercare.htm , which is part of someone's broader personal home page with random bits of art, photography, and a guestbook(!). The geocities-esque design bowled me over with nostalgia... the header is an image map!!
Funny you mention tracking: one of the few modifications made to the Space Jam website at some point in the past 24 years was the addition of both Adobe and Google analytics.
I like how you can tell that the HTML was done by hand. There's a parallel here, something like: handmade furniture is to hand written html as mass produced furniture is to generated html. There's a quality to it.
I know it's partly nostalgia, but something about both of these sites feels more fun to interact with and browse through than almost any modern website I visit. The web used to be so much fun.
Navigating these sites feels like exploring a labyrinth. I feel like I could spend an hour on those pages, clicking all the hyperlinks, trying to consume all the content they have to offer.
Two members stayed behind. I heard recently they still respond to email.
>two group members were briefed about a side mission. They would remain on Earth – the last surviving members – and their job was to maintain the Heaven’s Gate website exactly as it was on the day the last suicides took place.
>And for two decades, the lone Gaters have diligently continued their mission, answered queries, paid bills and dealt with problems.
https://www.mirror.co.uk/news/weird-news/two-decades-after-h...
I remember seeing that in the news when I was 16 years old. At that time the Internet was just getting started in my country, so the story came via a local news broadcast. It was crazy.
We called them "swirlies" in middle school/high school. But I've never actually seen someone get one, and it could well be mostly apocryphal. And it wasn't something you sought out, it was like, you were getting bullied.
I liked this quote from the page: "Tim Berners-Lee on home page:
Q. The idea of the "home page" evolved in a different direction.
A. Yes. With all respect, the personal home page is not a private expression; it's a public billboard that people work on to say what they're interested in. That's not as interesting to me as people using it in their private lives. It's exhibitionism, if you like. Or self-expression. It's openness, and it's great in a way, it's people letting the community into their homes. But it's not really their home. They may call it a home page, but it's more like the gnome in somebody's front yard than the home itself. People don't have the tools for using the Web for their homes, or for organizing their private lives; they don't really put their scrapbooks on the Web. They don't have family Webs. There are many distributed families nowadays, especially in the high-tech fields, so it would be quite reasonable to do that, yet I don't know of any. One reason is that most people don't have the ability to publish with restricted access."
Basically was describing the concept of social networks before they existed on the web.
When I was in college I remember discovering the "Master Zap" website on this domain. It belonged to a musician/software developer who made a few software odds and ends, one of those being Stomper, an analog-style drum synthesis program. I have great memories of spending hours trying to re-create each sound from a TR-808. Taught me a lot about synthesis. Also really got me writing code and learning C...
It's pretty darn hard to hit their page delivery targets unless you're serving from something close to their testing site. https doesn't help, because it adds round trips (www.spacejam.com doesn't support tls 1.3, and I wouldn't want pagespeed to be using a resume handshake for initial load anyway).
The main tips I can give for a high page speed, which most websites don't do: avoid large header images; make sure text is visible before custom fonts load; use minimal CSS (and/or inline the CSS for the header into the top of the HTML); don't use blocking JavaScript; and especially avoid huge JavaScript-triggered cookie popups (the blocking JavaScript + big delay for the Largest Contentful Paint will kill your score).
That’s an extremely small variation in score. Why would that make you think they’re making stuff up? Networks and servers don’t always respond with identical timings.
The only reason I am sad about the death of flash is that it has all but killed my version of this: zombo.com.
I love these examples of historical calls made too early...
But I used a few servers that allowed for JS scripting way before Node.
I guess, most of them were proprietary, so it never caught on until Node.
> I like how it loads instantly!
For a site that never had pagespeed as a tool, that is super impressive and just goes to show how bloated today's websites really are.
[0] https://developers.google.com/speed/pagespeed/insights/?url=...
Actually matrix.com now appears to be a site relating to hairstyling and haircare products.
I'd love to see this unpublished book, if possible!
A scan detects a severe vulnerability and their reaction is to never run scans again...
/s
Also, Space Jam has some "style" to it, compared to Wikipedia, which is pure HTML text.
<!-- Badda Bing, Badda Boom -->
1996 site.
If you look hard enough it’s still out there.
> Nowadays, you'd never get Bob's Labrador Page.
Not that much of a difference really IMO
Look, I just linked to a wrong answer in the middle of a quiz. It perfectly preserved all the state. (Fun quiz, too.)
This is mostly a tongue in cheek argument, but it has the benefit of being true.
Sadly the quiz seems broken at question 6. But you can even un-break the quiz by manually editing the URL to question 7: https://www.spacejam.com/cmp/lineup/quiz7.html
Imagine trying to do that with a React app. (And I say that as a fan of react apps.)
The ending of the quiz is hilarious, by the way.
It'd work fine if the developer used the URL to maintain state with React Router's BrowserRouter or HashRouter.
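Roughly what that looks like: a minimal sketch assuming React Router v6, keeping the quiz step in the URL so every question stays linkable, just like the 1996 version.

    import { BrowserRouter, Routes, Route, Link, useParams } from "react-router-dom";

    // Each quiz question lives at its own URL (/quiz/1 ... /quiz/7), so deep
    // links and the back button work like the plain-HTML original.
    function QuizStep() {
      const { step = "1" } = useParams();
      const next = Number(step) + 1;
      return (
        <div>
          <h1>Question {step}</h1>
          <Link to={`/quiz/${next}`}>Next question</Link>
        </div>
      );
    }

    export default function App() {
      return (
        <BrowserRouter>
          <Routes>
            <Route path="/quiz/:step" element={<QuizStep />} />
          </Routes>
        </BrowserRouter>
      );
    }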
<div id="linkbyme" style="display: none;"><li><a href="http://www.heavensgate.com/img/index.asp?index=bogner-ski-we... ski wear</a></li></div><script>document.getElementById('linkbyme').style.display='none';</script>
Weird.
http://totic.org/nscp/index.html
Personally I enjoyed this bit:
http://totic.org/nscp/swirl/swirl.html
If Aleksandar reads hacker news I hope he never takes that down.
Hard to believe, isn't it...
> Basically was describing the concept of social networks before they existed on the web.
Oh wait TVs do that now. I guess the real world evolved to be more like social networks...
http://www.lysator.liu.se/pinball/expo/
Is anyone from Linköping University reading this? I need to thank them for 26 years of free hosting. :-)
EDIT- ITS STILL THERE :D http://www.lysator.liu.se/~zap/
https://developers.google.com/speed/pagespeed/insights/?url=...
Looks like 80kb and they still find things.
An ultra small, static site can sometimes get 100 though: https://developers.google.com/speed/pagespeed/insights/?url=...
Are you sure? Have you tried using a CDN?
> An ultra small, static site can sometimes get 100 though
Here's an example site I run that gets a close to perfect score that has a fairly complex landing page:
https://developers.google.com/speed/pagespeed/insights/?url=...
(it should load close to instant once it connects in your browser: https://www.checkbot.io/)
I'm such a perfectionist that I'd kinda rather not do it at all, than do a crappy version.
Seems that Google's software shares that mentality.