mprovost · 5 years ago
I was webmaster for this site (and thousands of others at WB) back in 2001! I believe this was when we did the great www -> www2 migration, which was of course supposed to be temporary. In fact I think that was when we migrated from our own datacentre to AOL's but I could be getting the timing wrong.

Back then it was served from a Sun E4500 running Solaris (7?) and Netscape Enterprise Server. Netscape had been acquired by AOL which had also just bought Time Warner (that's why we moved to their datacentre) but somehow we couldn't make the internal accounting work and still had to buy server licenses.

Fun fact, unlike Apache, NES enabled the HTTP DELETE method out of the box and it had to be disabled in your config. We found that out the hard way when one of the sysadmins ran a vulnerability scanner which deleted all the websites. We were forbidden from running scans again by management.
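
A hedged sketch of the lesson in modern terms (using Node's built-in http module; NES's actual config syntax isn't shown here): treat HTTP methods as an allow-list rather than trusting server defaults.

    // Sketch: allow-list HTTP methods instead of trusting server defaults.
    const http = require('http');

    const ALLOWED = new Set(['GET', 'HEAD']);

    http.createServer((req, res) => {
      if (!ALLOWED.has(req.method)) {
        // Reject DELETE, PUT, etc. before they can touch anything.
        res.writeHead(405, { Allow: 'GET, HEAD' });
        return res.end('Method Not Allowed');
      }
      res.writeHead(200, { 'Content-Type': 'text/plain' });
      res.end('hello\n');
    }).listen(8080);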

Another fun fact about NES - they were really pushing server-side JavaScript as the development language for the web (and mostly losing to mod_perl). Also back in 2001, but at a different place, I worked with the person who had just written a book on server-side JS for O'Reilly - he got his advance, but they didn't publish it because by the time he had finished it they considered it a "dead technology".

Our job was basically to maintain an enormous config file for the webserver which was 99% redirects, because they would buy every conceivable domain name for a movie, all of which would redirect to the canonical one. Famously they couldn't get hold of matrix.com and had to use whatisthematrix.com. We sysadmins ran our own IRC server, and "302" was shorthand for "let's go" - "302 to a meeting". "302" on its own was "lunchtime".
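
For illustration only (the real NES config is long gone, and the hostnames here are made up), the shape of that job in Node terms is essentially a host-to-canonical-URL map driving 302s:

    // Sketch: map every vanity domain bought for a movie to the canonical site.
    const http = require('http');

    const CANONICAL = {
      'matrix-the-movie.example': 'https://www.whatisthematrix.com/',
      'spacejam-movie.example': 'https://www.spacejam.com/',
    };

    http.createServer((req, res) => {
      const host = (req.headers.host || '').replace(/^www\./, '');
      const target = CANONICAL[host];
      if (target) {
        res.writeHead(302, { Location: target }); // "302 to a meeting"
        return res.end();
      }
      res.writeHead(404);
      res.end('Unknown host\n');
    }).listen(8080);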

I still mention maintaining this site on my CV and LinkedIn - disappointingly I've never been asked about it in an interview. I suspect most of the people doing the interviewing these days are too young to remember it.

steverit · 5 years ago
> I still mention maintaining this site on my CV and LinkedIn - disappointingly I've never been asked about it in an interview. I suspect most of the people doing the interviewing these days are too young to remember it.

This is astonishing to me. I check back to see if this site is still up once every year or two just to have a smile. If you were sitting across from me in an interview I am quite sure I'd lose all pretense of professionalism and ask you about nothing else for the hour.

slg · 5 years ago
>This is astonishing to me. I check back to see if this site is still up once every year or two just to have a smile.

The only reason I am sad about the death of Flash is that it has all but killed my version of this: zombo.com.

lmilcin · 5 years ago
"We were forbidden from running scans again by management"

Seems like management believes it is better to wait for a real bad actor to purposefully destroy your site than to have it done by your honest employees by accident.

I know security in 2001 was much more lax (I started in 2000), but this still shows ignorance on management's part.

The right way to handle this would be to ask your staff to ensure it is possible to restore services and to ensure you know what the tests are doing before you run them.

EthanHeilman · 5 years ago
> Seems like management believes it is better to wait for a real bad actor to purposefully destroy your site than to have it done by your honest employees by accident.

From a politics standpoint that is completely true. Which would you rather tell your boss:

Q: Why is the website offline?

A: One of our sys admins accidentally deleted it.

OR

Q: Why is the website offline?

A: Some nation-state/teenager launched a sophisticated cyber attack, so we need to increase the cybersecurity budget. It's the wild west out there!

aflag · 5 years ago
I don't really agree with that decision, but maybe that phrasing is not 100% accurate. Maybe they just meant that the scan should be run in a local test environment and not in the production deployment. Obviously there's value in running it against the live site, but at least for this particular issue, they could probably have caught it in a testing environment.
newshorts · 5 years ago
Server-side JavaScript? That'll never catch on.

I love these examples of historical calls made too early...

mattmanser · 5 years ago
It used to be super slow. Like, you cannot imagine how slow it was.

The usual example I trot out is from when I was writing a client-side pivot table creator in the mid-2000s: as far as I can remember, with just 100 items the JS version took 20-30 seconds. I then tried it using XML/XSLT instead [1] and it was instant.

I haven't checked recently, but even a few years ago JavaScript was extremely slow with large datasets. I was mucking around with Mandelbrot generators, and JS compared to C# was like a bicycle vs a jet engine; they weren't even vaguely in the same league performance-wise. I just had a quick look at some JS-based versions and it looks like it's gotten a bit faster, but it's still slow.

[1] Awesome performance, super hard for others to understand the code. XSLT was great in some ways, but the learning curve was high.
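
For anyone who never saw the technique, here's a rough sketch of the browser-side XSLT approach in modern syntax (the XSLTProcessor API is real; the URLs and element IDs are hypothetical):

    // Sketch: let the browser's native XSLT engine build the pivot table
    // instead of hand-rolled JS loops.
    async function renderPivot() {
      const parser = new DOMParser();
      const [xmlText, xslText] = await Promise.all([
        fetch('/data/sales.xml').then(r => r.text()),
        fetch('/xsl/pivot.xsl').then(r => r.text()),
      ]);
      const xml = parser.parseFromString(xmlText, 'application/xml');
      const xsl = parser.parseFromString(xslText, 'application/xml');

      const proc = new XSLTProcessor();
      proc.importStylesheet(xsl);
      document.getElementById('pivot')
        .replaceChildren(proc.transformToFragment(xml, document));
    }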

rbanffy · 5 years ago
Having done it NES-style, I'm kind of glad that server-side JS didn't catch on.

Node was an unlikely event. It hit the sweet spot precisely when async started being needed, there was a runtime that made it viable (V8, thanks to Google's massive investment in Chrome), and the language made it feel kind of natural.

pjdemers · 5 years ago
>> Server-side JavaScript? That'll never catch on.

Server-side JavaScript didn't catch on the first time around because you couldn't create threads, and in the early 2000s all technical interviews asked about Java-style multi-threading. At some companies, the first round of technical interviews was a very, very detailed discussion of the Java threading model. If you didn't pass, you didn't get to round two. So everybody wanted to use threads.

k__ · 5 years ago
Haha

But I used a few servers that allowed for JS scripting way before Node.

I guess most of them were proprietary, so it never caught on until Node.

zakki · 5 years ago
But Node.js is alive now.
systemvoltage · 5 years ago
Made my day too! Wow.

I like how it loads instantly! :-) I remember staring at this thing for about 2 mins watching it load over dial-up in a third-world country, on a laptop Dad somehow smuggled into the country from Dubai without paying import duties. Yeah.

michaelbrooks · 5 years ago
97 on mobile and 99 on desktop. [0]

For a site built long before PageSpeed existed as a tool, that is super impressive, and it just goes to show how bloated today's websites really are.

[0] https://developers.google.com/speed/pagespeed/insights/?url=...

palad1n · 5 years ago
Funny you mentioned matrix.com; back between Matrix 1 and Matrix Reloaded I knew someone who knew the guy who owned thematrix.com, and apparently he was not offered enough to sell it. I guess they must have agreed at some point, because I also recall that matrix.com at around the time of Reloaded then started to redirect to the canonical Matrix site (which it sort of does now). I wonder what the price turned out to be.
bartread · 5 years ago
> matrix.com at around the time of Reloaded then started to redirect to the canonical Matrix site (which it sort of does now)

Actually matrix.com now appears to be a site relating to hairstyling and haircare products.

jameshush · 5 years ago
:) This made my day. Were you based in LA at the time? I’m out in LA now and love hearing stories like this from colleagues who’ve done engineering work in the entertainment industry.
mprovost · 5 years ago
Yes, Burbank.

Here's another one - Solaris had a 2GB file size limit (this was before ZFS), which isn't as crazy as it sounds now - hard drives were 9GB at the time. So ordinarily this wasn't a problem, but when the first Harry Potter movie came out, harrypotter.com (which was being served off the same server as spacejam.com) was the most popular website on the internet, and the web server log would hit the 2GB limit every couple of hours, and we would have to frantically move it somewhere and restart the process.
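
A hedged sketch of that workaround (the real thing was presumably a shell script; the path and interval here are made up):

    // Sketch: rotate the access log before it hits the 2GB ceiling.
    const fs = require('fs');

    const LOG = '/logs/harrypotter-access.log'; // hypothetical path
    const LIMIT = 1.9 * 1024 ** 3;              // rotate just under 2GB

    setInterval(() => {
      fs.stat(LOG, (err, stats) => {
        if (err || stats.size < LIMIT) return;
        // In 2001 the next step was "restart the web server process";
        // modern servers reopen their logs on a signal instead.
        fs.rename(LOG, `${LOG}.${Date.now()}`, () => {});
      });
    }, 60 * 1000);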

digitaltrees · 5 years ago
Posts like this make HN amazing. Thanks.
acheron · 5 years ago
> Another fun fact about NES - they were really pushing server-side JavaScript as the development language for the web

I started “web stuff” with Apache and Perl CGI, and I knew NES existed but never used or saw it myself. I had no idea “server side JavaScript” was a thing back then. That’s hilarious.

mywittyname · 5 years ago
I entered around the time LAMP took over, and the old-timers would always trash on server-side JavaScript.
olingern · 5 years ago
I submitted this and thought, "this is neat." I didn't expect to return to it being on the front page, nor for the former webmaster to show up. The internet is a fun place.

That background story is fascinating. I wonder how many full circles server-side JavaScript has made up until now.

donut · 5 years ago
> the person who had just written a book on server-side JS for O'Reilly

I'd love to see this unpublished book, if possible!

mprovost · 5 years ago
Unfortunately I can't even remember the author's last name and LinkedIn isn't helping. Let me ask around.
jacobush · 5 years ago
It would sell like hot cakes now. An interesting artefact.
ngcc_hk · 5 years ago
It will be great!
klyrs · 5 years ago
> Fun fact, unlike Apache, NES enabled the HTTP DELETE method out of the box and it had to be disabled in your config. We found that out the hard way when one of the sysadmins ran a vulnerability scanner which deleted all the websites. We were forbidden from running scans again by management.

Oh man, the early days were so exciting. Like that time I told my boss not to use Alexa on our admin page out of general paranoia... and a few days later a bunch of content got deleted from our main page because it spidered a bunch of [delete] links. I learned my lesson: secured the admin site a lil better and upgraded to delete buttons (crawlers follow plain GET links, but they don't submit forms). Boss kept on using Alexa on the admin page tho.

sradman · 5 years ago
> they were really pushing server-side JavaScript as the development language for the web (and mostly losing to mod_perl).

Enterprise server-side JavaScript was the first stage of dynamic web servers that connected to relational databases. Netscape LiveWire, Sybase PowerDynamo, and Microsoft Active Server Pages (with interpreted JScript or VBScript) were early examples. The enterprise software industry then switched to three-tier app server architectures with JVM/.NET bytecode runtimes. Single-process, multi-threaded app/web servers were a novelty, and none of the client drivers for the popular relational databases were thread-safe initially.

It took some time for RESTful architectures to shake out.

mprovost · 5 years ago
Apache (and mod_perl) was thread-safe by being multi-process, single-threaded. You were always limited by how many Perl interpreters could fit in RAM simultaneously. Then came the era of the Java app server (WebLogic and WebSphere).

Everyone has mostly forgotten AOLserver (and Tcl!). Almost everyone of that generation was influenced to use an RDBMS as a backend by Philip and Alex's Guide to Web Publishing, which came out in 1998. [0] Although I never actually met anyone who used AOLserver itself! Everyone took the idea and implemented it in ASP or Perl, until the world succumbed to EJBs.

[0] https://philip.greenspun.com/panda/

Deleted Comment

janpot · 5 years ago
> We were forbidden from running scans again by management.

A scan detects a severe vulnerability and their reaction is to never run scans again...

sealthedeal · 5 years ago
No politics, but this is like the "if you stop testing, the numbers go down" thing, lol.
dragonwriter · 5 years ago
Well, the issue seems to be that the scan not only detected the vulnerability but also realized the risk from it, which is exactly what scans are supposed to help you avoid.
peterpost2 · 5 years ago
The website is still alive and kicking more than 20 years later, so I guess it worked.

/s

kinard · 5 years ago
Bit like testing for COVID-19: if you don't test for it, how can you have it?
Deestan · 5 years ago
They might give you a call now, as the site just died from traffic.
ramraj07 · 5 years ago
Which is weird - this site comes up on reddit's front page often and it has never gone down. Is HN higher traffic??
combatentropy · 5 years ago
Wow, every paragraph has a great story!
SmokeyHamster · 5 years ago
> the person who had just written a book on server-side JS for O'Reilly - he got his advance, but they didn't publish it because by the time he had finished it they considered it a "dead technology".

Makes sense. This server-side Javascript thing will never take off. The Internet in general is really just a passing fad!

nstj · 5 years ago
This is a wonderful post. Thanks for your insight.
mindentropy · 5 years ago
What machine is this being hosted on now? Is it the same Solaris machine?
mprovost · 5 years ago
Your comment made me remember netcraft.com, which was the tool we always used to see which web server and OS a site was running. There was a running joke back in the day on Slashdot about "BSD is dying" as Linux started to take over, based around Netcraft, which used to publish a report showing which servers were most popular. I'm glad to see they haven't changed their logo in 20 years!

Their site report used to show the entire history, so you could see every time a site changed servers, but it doesn't look like it does anymore. It's now running in AWS, so it's certainly not the same Solaris server. Although those E4500s were built like tanks, so one could still be plugged in somewhere...

https://sitereport.netcraft.com/?url=https://www.spacejam.co...

lucasverra · 5 years ago
I used this site as teaching material for young adults, to illustrate the concept of a webpage that is not an app.

Also, Space Jam has "style", which makes a nice contrast with Wikipedia, which is "pure HTML text".

laputan_machine · 5 years ago
Thanks for sharing this -- I love stories like these
carldaddy · 5 years ago
Thanks for the story! Also, I wish 'webmaster' was still a title used in the industry.
ralphael · 5 years ago
Thanks for sharing, mate - great to know what was going on behind the scenes during this time.
lenartowski · 5 years ago
Thanks for sharing. Is this your comment in the HTML source?

<!-- Badda Bing, Badda Boom -->

iagovar · 5 years ago
<!-- Google Tag Manager -->

1996 site.

Animats · 5 years ago
Oh, that turkey. The movie, not the web site.

I once went to an industry presentation where someone on that project described the workflow. The project got into a cycle where the animators would animate on first shift, rendering was done on second shift, and printing to film was done on third shift. The next morning, the director, producer, and too many studio execs would look at the rushes from the overnight rendering. Changes would be ordered, and the cycle repeated.

The scene where the "talent" is being sucked out of players had problems with the "slime" effect. Production was stuck there for weeks as a new thing was tried each day. All the versions of this, of which there were far too many, were shown to us.

Way over budget. It cost about $80 million to make, which was huge in 1996. For comparison, GoldenEye (1995) cost $60 million.

trengorilla · 5 years ago
Awful film. I was listening to the radio the other day and they had a 10-year-old on saying Toy Story was cringe and he preferred Space Jam. I've never been so appalled just listening to the radio.
latexr · 5 years ago
Why are you bothered by a child’s opinion on children’s movies? You’ll likely never meet the kid, and one’s sense of taste at that age is hardly definitive.

For someone that young, visuals tend to take precedence over story. I, for one, am glad to see a younger generation appreciate 2D animation (which still looks acceptable in Space Jam) over 3D (which looks dated in Toy Story).

bananamerica · 5 years ago
What are you talking about? Space Jam is great fun!
deckarep · 5 years ago
Needs moar stars!

It's such a nostalgic feeling of the earlier web, back when interest groups, universities, fan pages, and web-rings ruled it. Back before it became commercialized by greedy folks who threw ads all over the place, tracked everything you did, and spammed the hell out of your inbox.

I miss the good ol' days and what the web was intended for.

One of my first projects was maintaining the site for: Looney Tunes Teaches the Internet.

If you look hard enough it’s still out there.

ethbro · 5 years ago
The best thing about the early web was that nobody knew what it was for. So people just did things, without considering if it was "right."

Nowadays, you'd never get Bob's Labrador Page. Because "Hi! I'm Bob. I live in Lebanon, Kansas. I like Labrador dogs. Here are some pictures of my favorite Labradors!"

kibwen · 5 years ago
I was surprised to find a great example of a site like this recently, "How to Care for Jumping Spiders": https://kozmicdreams.com/spidercare.htm , which is part of someone's broader personal home page with random bits of art, photography, and a guestbook(!). The geocities-esque design bowled me over with nostalgia... the header is an image map!!
AndrewKemendo · 5 years ago
Those pages still exist, except instead of being hosted on geocities.com/area51/bobslabs it's on instagram.com/bobslabs

Not that much of a difference really IMO

ChrisSD · 5 years ago
Though it is somewhat ironic that this post is prompted by an advertisement for a Warner Bros movie.
toomanybeersies · 5 years ago
Funny you mention tracking: one of the few modifications made to the Space Jam website at some point in the past 24 years was the addition of both Adobe and Google Analytics.
sillysaurusx · 5 years ago
I love that 1996 HTML solved deep linking. It works flawlessly: https://www.spacejam.com/cmp/lineup/quiz1a.html

Look, I just linked to a wrong answer in the middle of a quiz. It perfectly preserved all the state. (Fun quiz, too.)

This is mostly a tongue-in-cheek argument, but it has the benefit of being true.

Sadly the quiz seems broken at question 6. But you can even un-break the quiz by manually editing the URL to question 7: https://www.spacejam.com/cmp/lineup/quiz7.html

Imagine trying to do that with a React app. (And I say that as a fan of React apps.)

The ending of the quiz is hilarious, by the way.

onion2k · 5 years ago
Imagine trying to do that with a React app.

It'd work fine if the developer used the URL to maintain state with React Router's BrowserRouter or HashRouter.
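
A minimal sketch of what that looks like (assuming React Router v6; the component and path names are made up):

    // Sketch: keep quiz progress in the URL so every question is
    // deep-linkable, just like quiz7.html.
    import { BrowserRouter, Routes, Route, useParams } from 'react-router-dom';

    function Question() {
      const { n } = useParams(); // state lives in the URL, not in memory
      return <h1>Question {n}</h1>;
    }

    export default function QuizApp() {
      return (
        <BrowserRouter>
          <Routes>
            <Route path="/quiz/:n" element={<Question />} />
          </Routes>
        </BrowserRouter>
      );
    }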

GrumpyNl · 5 years ago
All that's wrong with the modern web is in this answer.
recursive · 5 years ago
In other words, not by default.
vesche · 5 years ago
I like how you can tell that the HTML was done by hand. There's a parallel here, something like: handmade furniture is to hand-written HTML as mass-produced furniture is to generated HTML. There's a quality to it.
5Qn8mNbc2FNCiVV · 5 years ago
Next.js wants to talk to you xd
aquabeagle · 5 years ago
As is Heaven's Gate's website - http://heavensgate.com/
libraryatnight · 5 years ago
I know it's partly nostalgia, but something about both of these sites feels more fun to interact with and browse through than almost any modern website I visit. The web used to be so much fun.
chrisco255 · 5 years ago
It's not just nostalgia: the design invites exploration. The web was truly built for surfing back then. Now only The Feed exists.
spideymans · 5 years ago
Navigating these sites feels like exploring a labyrinth. I feel like I could spend an hour on those pages, clicking all the hyperlinks, trying to consume all the content they have to offer.
_sbrk · 5 years ago
Indeed.
mattlondon · 5 years ago
Weird thing from the bottom of that page's source code (apart from the black-on-black keyword stuffing that was done a lot at the time):

<div id="linkbyme" style="display: none;"><li><a href="http://www.heavensgate.com/img/index.asp?index=bogner-ski-we... ski wear</a></li></div><script>document.getElementById('linkbyme').style.display='none';</script>

Weird.

madrox · 5 years ago
This is actually wild to me. How is it being powered? Did they just pay a hosting provider decades in advance?
andai · 5 years ago
Two members stayed behind. I heard recently they still respond to email.

>two group members were briefed about a side mission. They would remain on Earth – the last surviving members – and their job was to maintain the Heaven’s Gate website exactly as it was on the day the last suicides took place.

>And for two decades, the lone Gaters have diligently continued their mission, answered queries, paid bills and dealt with problems.

https://www.mirror.co.uk/news/weird-news/two-decades-after-h...

xtracto · 5 years ago
I remember seeing that in the news when I was 16 years old. At that time the Internet was just getting started in my country, so the news came via a local news broadcast. It was crazy.
racl101 · 5 years ago
Yikes.
IBCNU · 5 years ago
hot tip!
godzillabrennus · 5 years ago
Aleksandar Totic from the original Mosaic team still has his website up.

http://totic.org/nscp/index.html

Personally I enjoyed this bit:

http://totic.org/nscp/swirl/swirl.html

If Aleksandar reads Hacker News, I hope he never takes that down.

fearingreprisal · 5 years ago
"So far, I haven't received any swirl pictures from the outside world. I find this hard to believe that we are the only ones enjoying this activity."

Hard to believe, isn't it...

rconti · 5 years ago
We called them "swirlies" in middle school/high school. But I've never actually seen someone get one, and it could well be mostly apocryphal. And it wasn't something you sought out, it was like, you were getting bullied.
chrisco255 · 5 years ago
I liked this quote from the page: "Tim Berners-Lee on home page: Q. The idea of the "home page" evolved in a different direction.

A. Yes. With all respect, the personal home page is not a private expression; it's a public billboard that people work on to say what they're interested in. That's not as interesting to me as people using it in their private lives. It's exhibitionism, if you like. Or self-expression. It's openness, and it's great in a way, it's people letting the community into their homes. But it's not really their home. They may call it a home page, but it's more like the gnome in somebody's front yard than the home itself. People don't have the tools for using the Web for their homes, or for organizing their private lives; they don't really put their scrapbooks on the Web. They don't have family Webs. There are many distributed families nowadays, especially in the high-tech fields, so it would be quite reasonable to do that, yet I don't know of any. One reason is that most people don't have the ability to publish with restricted access."

He was basically describing the concept of social networks before they existed on the web.

infinity0 · 5 years ago
Except it's not really private; some random big company sees everything you do inside your house.

Oh wait, TVs do that now. I guess the real world evolved to be more like social networks...

rconti · 5 years ago
The Oasis closed a couple of years back :(
koz1000 · 5 years ago
1994 checking in!

http://www.lysator.liu.se/pinball/expo/

Is anyone from Linköping University reading this? I need to thank them for 26 years of free hosting. :-)

caf · 5 years ago
You should really get around to finishing those pages that link to http://www.lysator.liu.se/pinball/expo/unfinished.html any decade now.
koz1000 · 5 years ago
We're still uploading photos from that QuickTake 100...
mister_hn · 5 years ago
Sounds like side projects get forgotten.
S_A_P · 5 years ago
When I was in college I remember discovering the "Master Zap" website on this domain. It belonged to a musician/software developer who made a few software odds and ends, one of them being Stomper, an analog-style drum synthesis program. I have great memories of spending hours trying to re-create each sound from a TR-808. It taught me a lot about synthesis, and also really got me writing code and learning C...

EDIT - IT'S STILL THERE :D http://www.lysator.liu.se/~zap/

rovr138 · 5 years ago
I wondered how well it would rank on those Insights scores:

https://developers.google.com/speed/pagespeed/insights/?url=...

Looks like it's 80 KB and they still find things.

toast0 · 5 years ago
It's pretty darn hard to hit their page delivery targets unless you're serving from something close to their testing site. HTTPS doesn't help, because it adds round trips (www.spacejam.com doesn't support TLS 1.3, and I wouldn't want PageSpeed to be using a resumed handshake for the initial load anyway).

An ultra small, static site can sometimes get 100 though: https://developers.google.com/speed/pagespeed/insights/?url=...

seanwilson · 5 years ago
> It's pretty darn hard to hit their page delivery targets unless you're serving from something close to their testing site.

Are you sure? Have you tried using a CDN?

> An ultra small, static site can sometimes get 100 though

Here's an example site I run that gets a close-to-perfect score despite having a fairly complex landing page:

https://developers.google.com/speed/pagespeed/insights/?url=...

(it should load close to instant once it connects in your browser: https://www.checkbot.io/)

The main tips I can give for a high PageSpeed score, which most websites don't follow: avoid large header images; make sure text is visible before custom fonts load; use minimal CSS (and/or inline the CSS for the header at the top of the HTML); and don't use blocking JavaScript, especially huge JavaScript-triggered cookie popups (the blocking JavaScript plus a big delay to the Largest Contentful Paint will kill your score).
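
On the blocking-JavaScript point specifically, a minimal sketch of the non-blocking pattern (the script URL is hypothetical):

    // Sketch: inject third-party scripts after first paint instead of
    // blocking the parser with a synchronous <script> tag.
    window.addEventListener('load', () => {
      const s = document.createElement('script');
      s.src = 'https://example.com/cookie-banner.js';
      s.async = true;
      document.head.appendChild(s);
    });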

weare138 · 5 years ago
Google can't even meet their own metrics. The Gmail landing page gets a PageSpeed score of 24.

https://developers.google.com/speed/pagespeed/insights/?url=...

pcurve · 5 years ago
98 on first try, 97 on refresh. 99 on 2nd refresh. Are they just making stuff up?
dafman · 5 years ago
There's a bit about the variability here: https://github.com/GoogleChrome/lighthouse/blob/master/docs/...
hbosch · 5 years ago
If they were making stuff up, you'd think at least their own Google.com would score higher than a 76.
baddox · 5 years ago
That’s an extremely small variation in score. Why would that make you think they’re making stuff up? Networks and servers don’t always respond with identical timings.
racl101 · 5 years ago
Reminds me of that quote from Ryan from The Office:

I'm such a perfectionist that I'd kinda rather not do it at all, than do a crappy version.

Seems that Google's software shares that mentality.