You want a killer app for VR/AR goggle-style things? You're right, this would be amazing.
Apple demoed some kind of volumetric video to the press with the Vision Pro. There was a short clip of a concert and an NBA game (Nuggets?), among other things. I heard a number of people say it was like being there.
This is a step past that. Apple's was recorded from some kind of special camera rig (I assume), but I seriously doubt it was full volumetric video captured from a large number of angles. It sounded more like what volumetric video would feel like if you were stuck in one (very good) seat in the venue.
I’d be curious to know just how much horsepower it takes to play these back.
From the paper, it seems an RTX 3060 is enough for 60 fps on the DNA-Rendering dataset. On the full-screen datasets it manages around 25 fps; an RTX 4090 might be needed to stay above 60 fps.
Still pretty heavy, I'd say, but it has certainly come a long way and shows that real volumetric video is doable.
Does the cache size scale linearly with the length of the video? The 0013_01 sequence is only 150 frames. And how long does the cache take to generate?
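For scale, if the cache really is just a fixed-size precomputed blob per frame, a back-of-envelope says it grows linearly with sequence length. The per-frame size below is a made-up placeholder, not a number from the paper:

    def cache_size_gb(per_frame_mb, num_frames):
        # Linear model: total cache = per-frame blob size * number of frames.
        return per_frame_mb * num_frames / 1024

    PER_FRAME_MB = 20  # hypothetical placeholder, not from the paper
    print(cache_size_gb(PER_FRAME_MB, 150))           # a 150-frame clip
    print(cache_size_gb(PER_FRAME_MB, 60 * 60 * 30))  # an hour at 30 fps

If it does scale like that, short clips are cheap but anything feature-length gets painful fast, which is why the answer matters.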
I think volumetric video should be treated like regular video, where decoding and playback happen at the same time. A few papers down the line, this could easily be implemented; a rough sketch of what I mean is below.
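A minimal sketch of "decode while you play back", assuming the per-frame representation (points/Gaussians plus appearance) can be loaded one frame at a time. decode_frame() and render_frame() are hypothetical placeholders, not anything from the paper:

    import queue
    import threading
    import time

    NUM_FRAMES = 150                    # e.g. the 0013_01 sequence length
    TARGET_FPS = 60
    buffer = queue.Queue(maxsize=8)     # small look-ahead of decoded frames

    def decode_frame(i):
        # Placeholder: load/decompress frame i's scene representation.
        time.sleep(0.005)
        return {"index": i}

    def render_frame(frame):
        # Placeholder: splat/rasterize the decoded representation.
        pass

    def decoder():
        # Runs ahead of playback, blocking whenever the buffer is full.
        for i in range(NUM_FRAMES):
            buffer.put(decode_frame(i))

    threading.Thread(target=decoder, daemon=True).start()

    frame_budget = 1.0 / TARGET_FPS
    for _ in range(NUM_FRAMES):
        start = time.perf_counter()
        render_frame(buffer.get())      # stalls only if decoding falls behind
        time.sleep(max(0.0, frame_budget - (time.perf_counter() - start)))

The point is just that you only ever need a few frames of decoded data in memory, the same way a video player streams rather than decoding the whole file up front.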