I was one of the developers responsible for implementing the netcode on Serious Sam. We often slept under the desks in the offices at Croteam after lurking Usenet. One post in particular described the QuakeWorld prediction system, which inspired us. That night we coded a simplistic MVP while a colleague (hi, Dan) tested it over an old 486 *nix machine acting as a router that we could simulate lag with. This was well before the actual game was built around it.
Why wouldn't they? It's called great sound design. Games of the 90s and early 2000s put a lot of effort into that; see Thief: The Dark Project. Sound design can bring more immersion than graphics alone. It's something forgotten in many games of today that keep beating the ray-traced graphics horse.
I love how there's an article about how some legendary game was made, and someone in the comments casually goes "oh yeah, I built that, fun times". It's great.
I once commented something to the effect of “that must have sucked” on a story about debugging a weird error in Crash Bandicoot, and in comes the developer to tell me “yes, it did suck.”
I work at a big company and commented once that some decision was stupid and one of the top two engineers at the company dropped in to tell me I was wrong. I felt so honored. (And he's wrong.)
HN is wonderful for that. I once commented on an article about Brave Browser that it was an ad extortion racket, and Brendan Eich showed up to call me an asshole. Good times!
Serious Sam was one of my uncle’s favorite games (along with Duke Nukem 3D). He was a hugely important figure in my life, and playing through his favorites is a good way to keep connected to my memories of him. So, thanks to you and your coworkers! Excellent game, excellent multiplayer, and very good memories.
Serious Sam was always a killer LAN party game. Not necessarily because it was the sexiest title of the day, or what anyone had planned. Serious Sam won the LAN party because even when every other game died under some driver issue, thermal issue, update problem, whatever, you’d load Serious Sam and it would just work. This continued through the later series too; hell, if someone’s machine was completely dead in the water, it would also reliably run split screen and handle input well enough for split peripherals. The systemic parts of the game were truly exceptional on the reliability axis.
Counter-Strike was/is popular because the gameplay is awesome. Players stuck around long enough that even "toaster PCs" could run the engine. It was never "cutting edge," even after major upgrades.
I remember getting through some battle and coming upon a room FULL of ammo of every kind. The first time, it was awesome. But gradually the ammo caches started making me say "Yikes!"
In the late 90s, I managed the EA Tech Support website. The support/QA* teams would have massive after-hours gaming sessions playing Serious Sam. It was the only FPS that ran consistently on our work PCs, and it was a ton of fun.
* At least at that time, EA QA and Tech Support had a lot of overlap; support guys would become in-house beta testers in the summer, when they were trying to get games done for the holidays, and do Tech Support in the winter around Christmas, when call volume went up.
We used deterministic game play to implement multiplayer on the GB Color port of Vigilante 8.
The GBC Link Cable would pass 1 byte in each direction at the same time. It's a pair of shift registers filling each other up across the cable.
The game was locked to the GBC's frame rate. There was a lot of work to update the screen that had to happen in each (effective) V-Blank, and if it was missed the smooth scrolling stuttered.
At multiplayer startup we passed our seed. At runtime it looked like this:
On frame A it reads the controls, packs them into a byte, and puts that in the transfer buffer. The transfer occurs while it renders frame B. At the start of frame C it has the local controls encoded in the byte sent on frame A, and it has the other side's controls in the byte received during frame B.
It applies the controls to the game state and renders frame C. Local and remote controls are applied with one frame delay.
There was no frame delay on the controls in local play, so if you ever lost in multiplayer, feel free to blame lag, and me specifically if you need to.
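Roughly, in C, the per-frame loop might look like the sketch below. The helper names (read_pad, read_serial_byte, start_serial_transfer, etc.) are my stand-ins for illustration, not the shipped code, which talked to the GBC's serial registers directly:

    #include <stdint.h>

    /* Hypothetical hooks standing in for hardware access. */
    uint8_t read_pad(void);                 /* pack d-pad + buttons into one byte */
    uint8_t read_serial_byte(void);         /* byte shifted in during the last frame */
    void    start_serial_transfer(uint8_t); /* kick off the link-cable shift */
    void    step_simulation(uint8_t local, uint8_t remote);
    void    render(void);

    static uint8_t queued_local; /* our controls, sent on the previous frame */

    void game_frame(void) {
        /* The byte the hardware shifted in while the last frame
           rendered: the other side's controls. */
        uint8_t remote = read_serial_byte();

        /* Both sides now hold the identical (local, remote) input
           pair, so the deterministic simulation stays in sync. */
        step_simulation(queued_local, remote);

        /* Read this frame's pad and start the next exchange; the
           shift registers swap bytes while we render. */
        queued_local = read_pad();
        start_serial_transfer(queued_local);

        render();
    }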
I just bought this cart a few weeks ago (love the old GBC rumble carts!) and I was impressed that it had link-cable multiplayer. It's a really good game!
Thank you for the technical note on the multiplayer implementation -- that's really cool!
Croteam is such a talented team of game developers. I really enjoyed The Talos Principle (1 and 2), and they were some of the early pioneers of a completely custom Vulkan game engine with the first game.
Talos 2 was a great game. Except the awful TAA. Made some of the meta puzzles almost impossible for me, because I literally couldn't see certain things.
I found this disappointing too. I know they don’t have the resources to build their own equivalent of Nanite and such, but… On top of the performance issues, there were a number of things that actually looked so much better in the 2014 game. For example, the forcefields and water ripples.
Yes. Both are “deterministic lockstep” systems. Many, many games have used such a system over the years, although it’s probably less common these days for a variety of reasons.
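For anyone who hasn't seen the pattern: only inputs cross the wire, and every peer advances an identical simulation from a shared seed. A minimal sketch, with all names hypothetical:

    #include <stdint.h>

    #define MAX_PLAYERS 4

    typedef struct { uint8_t buttons; } Input;

    /* Hypothetical networking/simulation hooks. */
    Input sample_local_input(void);
    void  send_input(uint32_t tick, Input in);
    void  wait_for_all_inputs(uint32_t tick, Input out[MAX_PLAYERS]);
    void  simulate(const Input inputs[MAX_PLAYERS]);

    void lockstep_tick(uint32_t tick) {
        /* Broadcast just our input for this tick: a few bytes,
           no matter how many entities the simulation holds. */
        Input local = sample_local_input();
        send_input(tick, local);

        /* Stall until every peer's input for this tick arrives;
           this is why one slow connection slows everyone down. */
        Input inputs[MAX_PLAYERS];
        wait_for_all_inputs(tick, inputs);

        /* Same inputs + same code + same seed = same state everywhere. */
        simulate(inputs);
    }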
Yet games with 10x the bandwidth struggle to support that many enemies. It only just occurred to me: an increase in technological resources actually has a reversing effect on the efficiency and creativity of computer science. As bandwidth, storage, memory, and compute have gotten bigger, software has responded inversely, becoming slower, more bloated, and less capable per discrete unit of resource. Call it the Benjamin Button effect of software design.
Do you have a number for "that many enemies"? If the article gives one, I couldn't find it. There are plenty of modern games that are multiplayer and support enemy numbers I would consider "massive". Or a massive number of players, if that's what matters.
I played it last night. There would be maybe a couple dozen enemies on screen. Which, for a dial-up connection, is a massive amount to communicate about in real time. The fights are structured such that it feels like 5x that amount, though.
Haha, they even made a game about it, called "I Hate Running Backwards". It's on Steam. I don't know if it's from the same makers, but it is in the Serious Sam universe.
Their Friday Facts posts are well worth reading, not just for people who play computer games; they're interesting for any developer. If I remember correctly, they had issues with parts of the game behaving slightly differently on different platforms, which then led to sync issues in multiplayer.
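A common way to surface that kind of divergence (a sketch of the general technique, not their actual code) is to hash the simulation state every N ticks and compare checksums across peers:

    #include <stdint.h>
    #include <stddef.h>

    /* Placeholder state; the real thing would be the full sim state. */
    typedef struct { uint32_t rng_state; int32_t positions[256]; } GameState;
    GameState game_state;

    /* Hypothetical hooks for exchanging checksums between peers. */
    void     broadcast_checksum(uint32_t tick, uint32_t sum);
    uint32_t peer_checksum(uint32_t tick);
    void     report_desync(uint32_t tick);

    /* FNV-1a: any cheap, portable hash works here. */
    static uint32_t fnv1a(const void *data, size_t len) {
        const uint8_t *p = data;
        uint32_t h = 2166136261u;
        while (len--) { h ^= *p++; h *= 16777619u; }
        return h;
    }

    void check_sync(uint32_t tick) {
        if (tick % 64 != 0) return; /* hash every 64 ticks */
        uint32_t mine = fnv1a(&game_state, sizeof game_state);
        broadcast_checksum(tick, mine);
        if (mine != peer_checksum(tick))
            report_desync(tick); /* the simulations have diverged */
    }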
I have no experience here, but I guess the difficulty of such a strategy comes from ordering "events" in a game loop.
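If so, the usual answer is to make the ordering canonical. A sketch (names hypothetical): tag every event with its tick plus a deterministic tie-breaker like player ID, and sort before applying, so every peer processes events identically regardless of packet arrival order:

    #include <stdint.h>
    #include <stdlib.h>

    typedef struct {
        uint32_t tick;      /* simulation tick the event belongs to */
        uint8_t  player_id; /* deterministic tie-breaker */
        uint8_t  buttons;
    } Event;

    static int cmp_events(const void *a, const void *b) {
        const Event *x = a, *y = b;
        if (x->tick != y->tick) return x->tick < y->tick ? -1 : 1;
        return (int)x->player_id - (int)y->player_id;
    }

    void apply_events(Event *events, size_t n) {
        qsort(events, n, sizeof *events, cmp_events);
        /* events[0..n-1] can now be fed to the simulation in the
           same canonical order on every peer. */
    }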
https://youtu.be/C71U4CxD_J8?si=dvaDyhYCKtb5f069
I loved that game.
I was a huge fan of co-op games and shooters, but all my friends wanted to play Counter-Strike all the time.
With Serious Sam I could at least sometimes motivate them to play something I liked.
Thanks for your service! :D
On a similar note, Counter-Strike never looked good, but was popular for a long time because it ran great on toaster PCs
https://www.gamedeveloper.com/programming/1500-archers-on-a-...
Either you have the resources, or you don't. Don't cap for reasons of capping.