Walls are rendered as PCB_TRACK traces, and entities (enemies, items, player) are actual component footprints - SOT-23 for small items, SOIC-8 for decorations, QFP-64 for enemies and the player.
How I did it:
Started by patching DOOM's source code to extract vector data directly from the engine. Instead of trying to render 64,000 pixels (which would be impossibly slow), I grab the geometry DOOM already calculates internally - the drawsegs[] array for walls and vissprites[] for entities.
Added a field to the vissprite_t structure to capture entity types (MT_SHOTGUY, MT_PLAYER, etc.) during R_ProjectSprite(). This lets me map 150+ entity types to appropriate footprint categories.
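The mapping itself can be as simple as a lookup table. A minimal sketch — the MT_* names are real DOOM entity types from info.h, but the specific grouping into packages here is illustrative, not the project's actual table:

```python
# Hypothetical mapping of DOOM mobjtype names to footprint classes.
# The grouping follows the categories described above (QFP-64 for
# enemies/player, SOT-23 for small items, SOIC-8 for decorations),
# but the exact assignments are my guess.
FOOTPRINT_FOR_ENTITY = {
    "MT_PLAYER":    "QFP-64",   # player and enemies get the big packages
    "MT_SHOTGUY":   "QFP-64",
    "MT_POSSESSED": "QFP-64",
    "MT_CLIP":      "SOT-23",   # small pickups (ammo, health, ...)
    "MT_MISC0":     "SOT-23",
    "MT_BARREL":    "SOIC-8",   # decorations
}

def footprint_for(entity_type: str) -> str:
    # Unknown or unmapped types fall back to the smallest package.
    return FOOTPRINT_FOR_ENTITY.get(entity_type, "SOT-23")
```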
The DOOM engine sends this vector data over a Unix socket to a Python plugin running in KiCad. The plugin pre-allocates pools of traces and footprints at startup, then just updates their positions each frame instead of creating/destroying objects. Calls pcbnew.Refresh() to update the display.
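The receiving side can be sketched roughly like this. The wire format (one JSON object per frame) and the names are assumptions for illustration; in the real plugin the pool entries would be actual pcbnew footprint handles whose positions get set before calling pcbnew.Refresh():

```python
import json

class Pool:
    """Pre-allocated pool: entries are reused every frame instead of
    being created and destroyed, which keeps KiCad's scene stable."""
    def __init__(self, size: int):
        self.positions = [None] * size   # stand-ins for footprint handles
        self.in_use = 0

    def update(self, points):
        # Reposition the first len(points) entries; the rest stay parked.
        for i, p in enumerate(points):
            self.positions[i] = p
        self.in_use = len(points)

def handle_frame(line: str, pool: Pool) -> None:
    # Hypothetical frame format: {"sprites": [[x, y], ...]}.
    frame = json.loads(line)
    pool.update([tuple(p) for p in frame["sprites"]])
    # In the real plugin, this is where each footprint's position
    # would be pushed into the board and pcbnew.Refresh() called.
```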
Runs at 10-25 FPS depending on hardware. The bottleneck is KiCad's refresh, not DOOM or the data transfer.
Also renders to an SDL window (for actual gameplay) and a Python wireframe window (for debugging), so you get three views running simultaneously.
Follow-up: ScopeDoom
After getting the wireframe renderer working, I wanted to push it somewhere more physical. Oscilloscopes in X-Y mode are vector displays - feed X coordinates to one channel, Y to the other. I didn't have a function generator, so I used my MacBook's headphone jack instead.
The sound card is just a dual-channel DAC at 44.1kHz. Wired 3.5mm jack → 1kΩ resistors → scope CH1 (X) and CH2 (Y). Reused the same vector extraction from KiDoom, but the Python script converts coordinates to ±1V range and streams them as audio samples.
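The scaling step can be sketched like so. The actual streaming would go through an audio library (e.g. sounddevice); the function below just maps level coordinates into the [-1.0, 1.0] sample range that the DAC turns into roughly ±1 V at full volume. Names and ranges are assumptions:

```python
def to_samples(points, x_range, y_range):
    """Map (x, y) vector points into [-1.0, 1.0] per channel.
    x_range / y_range are (min, max) extents of the level geometry;
    left channel drives the scope's X input, right channel drives Y."""
    def scale(v, lo, hi):
        return 2.0 * (v - lo) / (hi - lo) - 1.0 if hi > lo else 0.0
    left, right = [], []
    for x, y in points:
        left.append(scale(x, *x_range))
        right.append(scale(y, *y_range))
    return left, right
```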
Each wall becomes a wireframe box, and the scope traces along each line. With ~7,000 points per frame at 44.1kHz, the refresh rate is about 6 Hz - slow enough to be a slideshow, but level geometry is clearly recognizable. A 96kHz audio interface or an analog scope would improve it significantly (digital scopes do sample-and-hold instead of continuous beam tracing).
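The refresh rate falls straight out of the sample budget:

```python
sample_rate = 44_100        # sound-card DAC rate, Hz
points_per_frame = 7_000    # approximate points to trace one frame

refresh_hz = sample_rate / points_per_frame
print(round(refresh_hz, 1))            # 6.3 -> the ~6 Hz figure above
print(round(96_000 / points_per_frame, 1))  # 13.7 with a 96 kHz interface
```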
Links:
KiDoom GitHub: https://github.com/MichaelAyles/KiDoom, writeup: https://www.mikeayles.com/#kidoom
ScopeDoom GitHub: https://github.com/MichaelAyles/ScopeDoom, writeup: https://www.mikeayles.com/#scopedoom
That got me thinking “I wonder if anyone has done this on an oscilloscope” and oddly I can't find anyone who quite has. Since DOOM objects are sprites rather than actual 3D models, the fidelity would be limited, but the scenery could be rendered at least. There are several examples of using a high-speed scope as a low-res monochrome raster device (scanning like a CRT monitor does and switching the beam on & off as needed).
I did find an example of Quake being done on a scope the way I was imagining: https://www.youtube.com/watch?v=aMli33ornEU - since all objects in Quake are actual 3D models, it even manages to give them some presence & shape.
EDIT: then I read the second half of this post and saw ScopeDoom! I'm surprised there are no earlier examples that are easy to find.
I pulled inspiration from this port to a Vectrex: https://www.youtube.com/watch?v=VuVnoqFF3II; a Wayback Machine link to the author's writeup is credited in my writeup.
I have seen a lot of ports of DOOM to overpowered Windows-based test equipment, particularly the Keysight MXAs, but those just use the instrument as a computer. Spectrum DOOM, though - could it be done by taking snapshots of a waterfall plot?
I was never quite sure whether I should drive it as raw X-Y or soft-modem it so I could decode it on a web page on any handy device.
How about analog raster scan? a.k.a. slow-scan TV? [0] Like how they returned the live television images from the Apollo missions. (They only had 1 MHz of bandwidth for everything - voice, computer up and downlink, telemetry, and TV. Standard analog broadcast TV was 6 MHz. So they reduced the scan rate to 10 frames per second instead of 60, and halved the horizontal line resolution -- that could fit in 500 kHz.)
Most modern SSTV standards are super-narrowband, designed to fit into just a few hundred Hertz for amateur radio. But what if you had the full 20 kHz of bandwidth of a nice audio channel? With 100 horizontal lines per frame, and 1 frame per second -- that is about 200 cycles per horizontal line, or enough to resolve, in theory, 100 vertical lines on each horizontal line. I.e., 100 x 100 pixels (ish) at 1 fps.
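The arithmetic above, spelled out (the 100-pixels-per-line figure is the comment's own conservative estimate of ~2 cycles per resolvable pixel):

```python
audio_bandwidth_hz = 20_000   # full bandwidth of a decent audio channel
lines_per_frame = 100
frames_per_sec = 1

line_rate = lines_per_frame * frames_per_sec      # 100 lines/s
cycles_per_line = audio_bandwidth_hz / line_rate  # cycles available per line
print(cycles_per_line)  # 200.0 -> roughly 100 resolvable pixels per line
```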
[0] https://en.wikipedia.org/wiki/Slow-scan_television
It looks like a web page with audio-input permissions can be expected to sample at 48 kHz. I wonder what the quality would be like from a cable bodged off a spare pin.
A little webapp running on your phone could actually do some nifty on-the-fly display.