dualogy · a year ago
Also discussed 30 days ago, 33 comments: https://news.ycombinator.com/item?id=42485423
draven · a year ago
Watching the video I thought "No Man's Sky as a python lib."
amlib · a year ago
Also terragen but for everything
darknavi · a year ago
I miss terragen! What a fun way to waste an afternoon on a rainy day as a kid.

I just looked it up and WOW it has come a long way (and wasn't it free before? Maybe I misremember).

feverzsj · a year ago
I like the "zero AI" part.
markisus · a year ago
This project generates synthetic computer vision training data. The arXiv paper has more detail, including some cool pictures of random creatures it can generate. The images are nice, but they are all nature settings, so I assume one would have to supplement this type of data with another data set for training a computer vision model.
kannonboy · a year ago
The same authors also created Infinigen Indoors[1] to generate indoor scenes for computer vision applications such as robotics & AR.

[1] https://arxiv.org/abs/2406.11824

culi · a year ago
If only the demoscene were still as prominent today as it used to be. They'd be having a field day with this.
tandr · a year ago
Maybe this IS the demoscene of today? I saw more insanely beautiful computer generated pictures in the last couple years than I saw in previous 10, AI or no AI.
w_for_wumbo · a year ago
This feels like we've got all the pieces of the puzzle to create a reality experience. I'm pretty sure that with visuals like this and haptic feedback, your brain will fill in any gaps once you've adapted to it, given enough time.

You could use this with a VR headset, monitor heart rate and temperature, and adapt the environment based on the experiencer's desires.

It feels like we're on the precipice of recreating an experience of reality, which may reveal more about our existing reality than we ever expected.

janalsncm · a year ago
This seems extremely cool. I’m wondering if it can be used to create procedural video game assets.
kannonboy · a year ago
From the homepage it sounds like they've prioritised geometry fidelity for CV research rather than performance:

> Infinigen is optimized for computer vision research, particularly 3D vision. Infinigen does not use bump/normal-maps, full-transparency, or other techniques which fake geometric detail. All fine details of geometry from Infinigen are real, ensuring accurate 3D ground truth.

So I suspect the assets wouldn't be particularly optimised for video games. Perhaps a good starting point though!

jeffhuys · a year ago
Well, we've come a long way. Look at Nanite - it might actually be compatible...
cma · a year ago
I doubt it was a matter of prioritization. To get normal maps you usually first need a high-resolution mesh, and then extra steps to get good decimation for LODs and a normal bake. That's mostly extra work, not alternative work that was deprioritized. If by transparency they mean faking aggregates, you also need the full geometry there before sampling and baking it down into planes or some other impostor technique.
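The bake step described above can be sketched in miniature: detail from a high-resolution surface is captured as a normal map that a decimated, low-resolution surface can sample at render time. Below, a 1D heightfield stands in for a mesh; this is purely an illustrative sketch of the concept, not Infinigen's pipeline or any engine's actual baking tool.

```python
import numpy as np

def surface_normals(heights, dx):
    """2D unit normals of a 1D heightfield (slope -> outward normal)."""
    slope = np.gradient(heights, dx)
    normals = np.stack([-slope, np.ones_like(slope)], axis=1)
    return normals / np.linalg.norm(normals, axis=1, keepdims=True)

# High-res surface: a coarse shape plus fine bumps (the "real" geometry,
# which Infinigen keeps; a game pipeline would bake it away).
x_hi = np.linspace(0.0, 1.0, 1024)
heights_hi = 0.2 * np.sin(2 * np.pi * x_hi) + 0.01 * np.sin(60 * np.pi * x_hi)

# "Bake": store the high-res normals as a texture. Note this step is only
# possible because the high-res geometry exists first.
normal_map = surface_normals(heights_hi, x_hi[1] - x_hi[0])

# The decimated surface keeps only 16 vertices; its own normals lose the
# bumps, but looking up the baked map recovers the fine shading detail.
idx_lo = np.linspace(0, 1023, 16).astype(int)
baked_at_lo = normal_map[idx_lo]

print(normal_map.shape)   # (1024, 2): one normal per high-res sample
print(baked_at_lo.shape)  # (16, 2): normals sampled by the low-res mesh
```

The point of the sketch is the ordering: the high-resolution data is a prerequisite of the bake, so producing "real geometry only" is the first half of a game-asset pipeline, not an alternative to it.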
ghfhghg · a year ago
That's actually a fairly ideal fit for Nanite meshes.
ANewFormation · a year ago
This looks remarkably similar to something Unreal already supports natively. Here is a demo video from them: https://youtube.com/watch?v=8tBNZhuWMac
fireant · a year ago
It looks great, but I'm missing what's innovative about this. AAA procedural foliage has been done for 20 years, and terrain too. Blender has had procedural geometry nodes for a long time as well. What is so interesting here?