A bit more context - the cloth sim is part of my app, Lungy (https://www.lungy.app). It's designed to be an active meditation / relaxation app, so you can play relaxing instruments in space and do immersive breathing exercises. The original Lungy is a breathing app for iOS that pairs real-time breathing with interactive visuals.
The cloth sim uses Verlet integration running on a regular grid. So far I have tried a couple of different cloth scenes - a touch-reactive 'pad', where different parts of the cloth are mapped to different sounds, and a cloth that blows in sync with breathing. The collision detection is a little tricky with the deforming mesh, but it seems to work OK. Overall, it feels like a cool interaction to explore.
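(Not my actual code, but for anyone curious, a minimal sketch of the kind of Verlet update a grid cloth uses - the `Particle` / `Constraint` names, the damping value and the iteration count are all assumptions:)

```swift
import simd

// One cloth vertex: position-based Verlet stores current and previous position,
// so velocity is implicit in (position - previousPosition).
struct Particle {
    var position: SIMD3<Float>
    var previousPosition: SIMD3<Float>
    var isPinned: Bool = false
}

// A distance constraint between two grid neighbours (structural / shear / bend).
struct Constraint {
    var a: Int
    var b: Int
    var restLength: Float
}

func step(particles: inout [Particle],
          constraints: [Constraint],
          gravity: SIMD3<Float> = [0, -9.8, 0],
          damping: Float = 0.99,
          dt: Float = 1.0 / 90.0,
          solverIterations: Int = 4) {
    // 1. Verlet integration: x' = x + (x - xPrev) * damping + a * dt^2
    for i in particles.indices where !particles[i].isPinned {
        let p = particles[i]
        let velocity = (p.position - p.previousPosition) * damping
        particles[i].previousPosition = p.position
        particles[i].position = p.position + velocity + gravity * (dt * dt)
    }

    // 2. Constraint relaxation: push each pair back toward its rest length.
    for _ in 0..<solverIterations {
        for c in constraints {
            let delta = particles[c.b].position - particles[c.a].position
            let dist = max(length(delta), 1e-6)
            let correction = delta * ((dist - c.restLength) / dist) * 0.5
            if !particles[c.a].isPinned { particles[c.a].position += correction }
            if !particles[c.b].isPinned { particles[c.b].position -= correction }
        }
    }
}
```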
The cloth sim is live on the app store now (and free) - would love to hear feedback from anyone with a Vision Pro.
App Store (Vision Pro): https://apps.apple.com/app/id6470201263
Lungy, original for iOS - https://apps.apple.com/app/id1545223887
The limiting factor is actually not the sim, but the initial load time for the procedural geometry. I might try adding some subdivisions though...
Can't believe this is what a $4-5K piece of tech looks like. Wild.
Apple is very much not having an iPhone moment with Vision Pro: there is no explosion of creativity or a new market rush. And there may never be one.
Apple treated devs like trash for years and now they have a new platform nobody wants to build on. Oops.
It's been fun. It's not too dissimilar to iOS - a lot of the spatial capabilities are tied closely to RealityKit, so it's worth looking at that API if you're interested. I was expecting to use Metal for rendering, but because of what I think are privacy restrictions on accessing the raw camera data, it's only supported in 'full' spaces, not the mixed mode with the camera feed + overlay.
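For anyone who hasn't looked at visionOS yet, the full vs mixed distinction shows up right where you declare the immersive space in SwiftUI. A rough sketch (the `ClothScene` view and app name are placeholders, not from the actual app):

```swift
import SwiftUI
import RealityKit

@main
struct ClothApp: App {
    // Fully immersive (.full) hides passthrough; .mixed keeps the camera feed
    // and composites RealityKit content over the room.
    @State private var immersionStyle: ImmersionStyle = .full

    var body: some Scene {
        ImmersiveSpace(id: "cloth") {
            ClothScene()   // placeholder RealityKit view
        }
        .immersionStyle(selection: $immersionStyle, in: .mixed, .full)
    }
}

struct ClothScene: View {
    var body: some View {
        RealityView { content in
            // Build and add the cloth entities here.
        }
    }
}
```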
In the video I am looking at where I want to interact and then using a pinch gesture, and the sounds are mapped to different cells of the cloth. By either looking and tapping, or by playing directly, you can hopefully play the sound you intended.
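Roughly, the look-and-pinch side of it is just RealityKit's targeted gestures - something along these lines, where the cell entities, sizes and the `playSound` mapping are made-up placeholders:

```swift
import SwiftUI
import RealityKit

struct ClothPadView: View {
    var body: some View {
        RealityView { content in
            // Each cloth cell is an entity; it needs collision + input-target
            // components to be hittable by gaze-and-pinch.
            for (index, cell) in makeClothCells().enumerated() {
                cell.components.set(InputTargetComponent())
                cell.components.set(CollisionComponent(shapes: [.generateBox(size: [0.05, 0.05, 0.01])]))
                cell.name = "cell-\(index)"
                content.add(cell)
            }
        }
        // A pinch while looking at an entity arrives as a spatial tap on it.
        .gesture(
            SpatialTapGesture()
                .targetedToAnyEntity()
                .onEnded { value in
                    playSound(for: value.entity.name)   // map cell -> note (placeholder)
                }
        )
    }

    func makeClothCells() -> [ModelEntity] { [] }   // placeholder
    func playSound(for cellName: String) { }        // placeholder
}
```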
Is that how most vision pro interfaces work?
Say you slide it over so it's hovering over the edge of your sofa and then you turn gravity on. It seems like you're just a few steps away from that!
You can detect planes and get the mesh geometry of the room.
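On visionOS that goes through ARKit's data providers - something like this sketch, which just runs plane detection and scene reconstruction and reads the mesh anchors (what you do with them, e.g. building colliders for the cloth, is up to you):

```swift
import ARKit

// Runs ARKit's plane detection and scene reconstruction providers and
// consumes the room-mesh anchor updates.
func trackRoomGeometry() async throws {
    let session = ARKitSession()
    let planes = PlaneDetectionProvider(alignments: [.horizontal, .vertical])
    let sceneMesh = SceneReconstructionProvider()

    try await session.run([planes, sceneMesh])

    for await update in sceneMesh.anchorUpdates {
        let meshAnchor = update.anchor
        // meshAnchor.geometry holds the room's mesh (vertices, normals, faces).
        print("mesh anchor \(meshAnchor.id), faces: \(meshAnchor.geometry.faces.count)")
    }
}
```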