otterstack · 2 days ago
Hi, author here. I heard it got posted here and decided to make an account, so I can hop in here. Thanks for sharing!

I'm also looking into simplifying it a bit more with environment maps, which I shared on my Bsky: https://bsky.app/profile/dannyspencer.bsky.social/post/3mecu...

Agentlien · a day ago
This was so much fun to read! Very neat solutions using spherical coordinates and logarithms.

How did you actually get the idea to do this in the first place?

otterstack · 15 hours ago
Thanks!

It's fuzzy, but I think it was because I was learning GB assembly while working on shaders in Houdini or something (I'm a tech artist). The two worlds collided in my head: I saw that there's no native multiplication on the GB and figured it'd be a fun problem.

glouwbug · 2 days ago
It’s nice getting real hacker material on hackernews
andix · 2 days ago
It wasn't just a prompt to an AI? How did they do it? ;)
FeteCommuniste · a day ago
The lost, dark art of using one's brain to implement something line by line.
jama211 · a day ago
Genuinely curious, what’s your goal here? Disparage those who use LLMs? Or just express your unhappiness at the amount of ai content on the HN front page? Or just want to throw shade on LLM use in general?

This is impressive and cool but I don’t understand the bitterness here.

speps · 2 days ago
Awesome-looking results. As far as I understand, it's a "3D" shader in the sense that it looks 3D, but it's a prerendered 2D normal map which is then lit using the resulting world-space normal.

Here are the frames: https://github.com/nukep/gbshader/tree/main/sequences/gbspin...
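
For anyone unfamiliar with the technique, lighting a prerendered normal map boils down to a per-pixel dot product between the stored normal and the light direction. Here's a rough sketch in C (floating point and made-up buffer names, purely for readability; the real shader reportedly avoids floats entirely and works in spherical coordinates with log tables):

    #include <stdint.h>

    #define W 160  /* GB screen width  */
    #define H 144  /* GB screen height */

    /* Hypothetical buffers: a prerendered world-space normal per pixel
       (from frames like the ones linked above) and an output shade index. */
    static float   normals[H][W][3];
    static uint8_t shade[H][W];

    /* Lambert shading: brightness = max(0, N . L), bucketed into 4 shades.
       Assumes the normals and the light direction are unit length. */
    static void light_frame(const float light[3]) {
        for (int y = 0; y < H; y++) {
            for (int x = 0; x < W; x++) {
                const float *n = normals[y][x];
                float d = n[0]*light[0] + n[1]*light[1] + n[2]*light[2];
                if (d < 0.0f) d = 0.0f;
                shade[y][x] = (uint8_t)(d * 3.999f);
            }
        }
    }

Rotating the light then just means feeding in a different light vector each frame; nothing about the geometry has to be recomputed.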

Karliss · 2 days ago
It's not that different from "real 3D" renderers. Especially in deferred rendering pipelines, the rasterizer creates a bunch of buffers for the depth map, normal map, color, etc., but the main shaders run on those 2D buffers. That's the beauty of it: the parts operating on 3D triangles are kept simple, and the expensive lighting shaders run once on flat 2D images with zero overdraw. The shaders don't care whether the normal map buffer came from 3D geometry that was rasterized just now, was prerendered some time ago, or is a mix of the two. And even in forward rendering pipelines, the fragment shader operates on implicit 2D pixels created by the vertex shaders and rasterizer from "real 3D" data.

The way I look at it, if the input and the math in the shader work with 3D vectors, it's a 3D shader. Whether there is also a 3D rasterizer is a separate question.

Modern 3D games exploit this in many different ways. Prerendering a 3D model from multiple views might sound like cheating, but imposters are a real technique used by proper 3D engines.
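
As a toy illustration of the imposter idea (the names and the frame count here are mine, not the project's): prerender N views of the object around one axis, then at runtime just snap the current rotation angle to the nearest prerendered frame.

    #include <math.h>

    /* Hypothetical imposter atlas: N_FRAMES prerendered views, one per
       (360 / N_FRAMES)-degree step of rotation around a single axis. */
    #define N_FRAMES 64
    #define TWO_PI   6.2831853f

    /* Map a rotation angle in radians to the nearest prerendered frame. */
    static unsigned imposter_frame(float angle_rad) {
        float t = angle_rad / TWO_PI;                 /* rotation in turns  */
        t -= floorf(t);                               /* wrap into [0, 1)   */
        unsigned i = (unsigned)(t * N_FRAMES + 0.5f); /* round to nearest   */
        return i % N_FRAMES;                          /* 63.5+ wraps to 0   */
    }

With 64 frames that's one view every 5.625 degrees around one axis; covering arbitrary 3D orientations multiplies that out into a whole grid of views, which is the storage blow-up mentioned below.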

araes · 2 days ago
There's a GBDK demo that actually does something similar (spinning 2D imposters). It doesn't handle the lighting, though, which is the impressive part here.

https://github.com/gbdk-2020/gbdk-2020/tree/develop/gbdk-lib...

Unfortunately, the 2D imposter mode has pretty significant difficulties with arbitrarily rotated 3D. The GBDK imposter rotation demo needs a 256k cart just to handle 64 rotation frames in a circle for a single object. Expanding that out to fully 3D views and rotations gets quite prohibitive.

Haven't tried downloading RGBDS to compile this yet. However, I suspect the final file is probably similar, pushing the upper limits of GB cart sizes.

antidamage · a day ago
It's not that different from how some creative Mac games were doing 3d lighting on 2d textures prior to 3d accelerated hardware being available. The neat part here is that it runs on a Gameboy Colour.
bulbar · a day ago
On a device that apparently supports neither floating-point operations nor hardware multiplication. Super cool.
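
Neither limitation is fatal, to be fair: without a hardware multiply the textbook fallback is shift-and-add, and fixed-point arithmetic stands in for floats. A generic sketch in C of both ideas (not the routine this shader actually uses; per the write-up it apparently leans on log tables instead):

    #include <stdint.h>
    #include <stdio.h>

    /* Software multiply by shift-and-add: the classic fallback on CPUs
       without a MUL instruction. */
    static uint16_t mul8(uint8_t a, uint8_t b) {
        uint16_t acc = 0;
        uint16_t addend = a;
        while (b) {
            if (b & 1) acc += addend;  /* add when the current bit is set */
            addend <<= 1;              /* shift the multiplicand */
            b >>= 1;                   /* move to the next bit */
        }
        return acc;
    }

    /* 8.8 fixed point stands in for floats: the integer 0x0180 is 1.5.
       (On the GB the `*` below would itself compile to a software routine.) */
    typedef int16_t fx8;
    #define FX(x) ((fx8)((x) * 256))

    static fx8 fx_mul(fx8 a, fx8 b) {
        return (fx8)(((int32_t)a * (int32_t)b) >> 8);  /* rescale the product */
    }

    int main(void) {
        printf("13*17    = %u\n", (unsigned)mul8(13, 17));               /* 221 */
        printf("1.5*0.25 = %d/256\n", (int)fx_mul(FX(1.5), FX(0.25)));   /* 96, i.e. 0.375 */
        return 0;
    }
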
Someone · a day ago
It’s a shader, not a renderer. The images are pre-rendered, but the shading is done in real time.

⇒ I think they’re correct in calling this a 3D shader.

wasmainiac · 2 days ago
> An overall failed attempt at using AI

> I attempted to use AI to try out the process, mostly because 1) the industry won't shut up about AI, and 2) I wanted a grounded opinion of it for novel projects, so I have a concrete and personal reference point when talking about it in the wild. At the end of the day, this is still a hobbyist project, so AI really isn't the point! But still...

> I believe in disclosing all attempts or actual uses of generative AI output, because I think it's unethical to deceive people about the process of your work. Not doing so undermines trust, and amounts to disinformation or plagiarism. Disclosure also invites people who have disagreements to engage with the work, which they should be able to. I'm open to feedback, btw.

Thank you for your honesty! Also tremendous project.

otterstack · 2 days ago
The funny thing is the phrasing used to be more neutral, but I changed the tone to be slightly more skeptical because people thought I was just glazing AI in my post. Another guy on Reddit seemed annoyed that I didn't love AI enough.

I just wanted to document the process for this type of project. shrug

ekipan · a day ago
It seems to me that AI is mostly optimized for tricking suits into thinking they don't need people to do actual work. If I hear "you're absolutely right!" one more time my eyes might roll all the way back into my head.

Still, even though they suck at producing specific artifacts or copy, I've had success asking an LLM to poke holes in my documentation. Things that need concrete examples, knowledge assumptions I didn't realize I was making, that sort of thing.

Sweet Gameboy shader!

wileydragonfly · a day ago
Just… ignore Reddit.
jama211 · a day ago
I dunno about the need for disclosure in this way. In my working life I’ve copied a lot of code from Stack Overflow, or a forum or something, when I’ve been stuck. I’ve understood it (or at least tried to) when implementing it, but I didn’t technically write it. It was never a problem, though, because everybody did this to some degree, and no one would demand others disclose such a thing, at least in hobby projects or low-stakes professional work (obviously it’s different if you’re making, like, autopilot software for a passenger plane or something mission critical).

If it’s the norm to use LLMs, which I honestly believe is the case now or at least very soon, why disclose the obvious? I’d do it the other way around: if you made it by hand, disclose that it was entirely handmade, without any AI or Stack Overflow or anything, and we can treat it with respect and ooh and ahh accordingly. But otherwise it’s totally reasonable to assume LLM usage. At the end of the day the developer is still responsible for the final result and how it functions, just like a company is responsible for its products even if it contracted out their development, or how a filmmaker is responsible for how a scene looks even if they used Adobe After Effects to content-aware remove an object.

otterstack · 15 hours ago
I disclosed AI because I think it's important to disclose it. I also take pride in the process. Mind you, I also cite Stack Overflow answers in my code if I use one. Usually with a comment like:

    // Source: https://stackoverflow.com/q/11828270
With any AI code I use, I've adopted this style (at least for now):

    // Note: This was generated by Claude 4.5 Sonnet (AI).
    // Prompt: Do something real cool.

spacebacon · 2 days ago
This GBC shader reveals a key truth: all computation is approximation under constraint. Multiplication becomes table lookups plus addition, while precision yields to what the eye actually sees.
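
The "table lookups plus addition" part is the old log/antilog identity: a*b = exp(ln a + ln b), with both functions baked into lookup tables. A host-side sketch in C (the table sizes and scaling here are my guesses, not the shader's actual tables, which on real hardware would be precomputed into ROM rather than built with libm):

    #include <math.h>
    #include <stdint.h>
    #include <stdio.h>

    #define LOG_SCALE 32.0
    static uint8_t  log_tab[256];  /* round(LOG_SCALE * ln(x)), index 0 unused */
    static uint16_t exp_tab[355];  /* round(exp(i / LOG_SCALE)), covers all sums */

    static void build_tables(void) {
        for (int x = 1; x < 256; x++)
            log_tab[x] = (uint8_t)(LOG_SCALE * log((double)x) + 0.5);
        for (int i = 0; i < 355; i++)
            exp_tab[i] = (uint16_t)(exp((double)i / LOG_SCALE) + 0.5);
    }

    /* Approximate a*b with two lookups and one addition. */
    static uint16_t mul_approx(uint8_t a, uint8_t b) {
        if (a == 0 || b == 0) return 0;
        return exp_tab[log_tab[a] + log_tab[b]];
    }

    int main(void) {
        build_tables();
        printf("13*17   ~= %u (exact 221)\n",   (unsigned)mul_approx(13, 17));
        printf("200*150 ~= %u (exact 30000)\n", (unsigned)mul_approx(200, 150));
        return 0;
    }

With these particular tables the products come out slightly off (13*17 lands on about 223 instead of 221), but for shading quantized to a handful of levels that error is invisible.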
VimEscapeArtist · a day ago
I bow before the master. Genuinely outstanding work.

Since you're already doing what's essentially demoscene-grade hacking, have you thought about putting together a short demo and entering it at a demoparty? There's a list of events at demoparty.net - this kind of thing would absolutely shine there.

Waterluvian · 2 days ago
I’m incredibly impressed by this, largely because it actually is running on a CGB. What I see often are hacks where the Game Boy is just being used as a terminal and the cartridge has been packed with a far more powerful processor.
giancarlostoro · 2 days ago
I lowkey wish Nintendo would rerelease the GBC or GBA; I would buy one. They could bake some games into a few cartridges and make it 100% worth the buy, too.
gkhartman · 2 days ago
You can pick up a used one for pretty cheap. Add a flash cartridge and you're done. I think the cheap Android handhelds of the same form factor are a better option, though.

I've still got my Gameboy collection, but rarely use it. It's just so much easier to fire up an emulator these days.

giancarlostoro · a day ago
I still have my 90s one but would love a modern, brand-new one, similar to how they did the SNES Mini.
VimEscapeArtist · 2 days ago
You can buy the ModRetro Chromatic from the Oculus VR creator. It's better than anything Nintendo could ever produce.
wileydragonfly · a day ago
Doesn’t he use your money to blow people up or something?
giancarlostoro · a day ago
I've seen those, but I don't like the aesthetic. My GBC from the 90s is dirty but sturdy as heck despite my carelessness over 28-plus years.
vel0city · 15 hours ago
This is pricey but pretty awesome. Very well built, high quality. Hit up a local used game store and have a more modern hardware experience with legit copies of the actual games.

https://www.analogue.co/pocket

I went with getting a GBA SP and replacing the screen with a more modern panel. The kids love it.