Posted by u/lukko 2 years ago
Show HN: I built an interactive cloth solver for Apple Vision Pro (youtube.com/watch?v=kpD2J...)
A bit more context - the cloth sim is part of my app, Lungy (https://www.lungy.app). It's designed to be an active meditation / relaxation app, where you can play relaxing instruments in space and do immersive breathing exercises. The original Lungy is a breathing app for iOS that pairs real-time breathing with interactive visuals.

The cloth sim uses Verlet integration, running on a regular grid. For now, I have tried a couple of different cloth scenes - a sort of touch-reactive 'pad', where different parts of the cloth are mapped to different sounds, and a cloth that blows in sync with breathing. The collision detection is a little bit tricky with the deforming mesh, but seems to work OK. Overall, it seems like a cool interaction to explore.
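
For anyone curious what the Verlet step looks like, here's a simplified CPU sketch for illustration only - the real sim runs on the GPU, and the names here are made up:

    import simd

    struct ClothPoint {
        var position: SIMD3<Float>
        var previous: SIMD3<Float>
        var pinned: Bool
    }

    // Verlet position update: x_new = x + (x - x_prev) + a * dt^2.
    // Velocity is implicit in the difference between the two stored positions.
    func verletStep(points: inout [ClothPoint], gravity: SIMD3<Float>, dt: Float) {
        for i in points.indices where !points[i].pinned {
            let current = points[i].position
            points[i].position = current + (current - points[i].previous) + gravity * (dt * dt)
            points[i].previous = current
        }
    }

    // One relaxation pass on a distance constraint between two neighbouring
    // grid points; a few iterations per frame keep the cloth from stretching.
    func satisfyConstraint(_ a: inout ClothPoint, _ b: inout ClothPoint, restLength: Float) {
        let delta = b.position - a.position
        let dist = simd_length(delta)
        guard dist > 0 else { return }
        let correction = delta * ((dist - restLength) / dist) * 0.5
        if !a.pinned { a.position += correction }
        if !b.pinned { b.position -= correction }
    }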

The cloth sim is live on the App Store now (and free) - would love to hear feedback from anyone with a Vision Pro.

App Store (Vision Pro): https://apps.apple.com/app/id6470201263

Lungy, original for iOS - https://apps.apple.com/app/id1545223887

v1sea · 2 years ago
Really neat, are you running the cloth simulation on the CPU or GPU? How many elements are in the simulated cloth? Good luck on future AR projects!
lukko · 2 years ago
The cloth sim runs on the GPU. It's not that heavy currently, around 40k quads. You could go much heavier, easily 500k, but it also changes the cloth behaviour.

The limiting factor is actually not the sim, but the initial load time for the procedural geometry. I might try adding some subdivisions though..
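
(In case it's useful: a 200 x 200 grid is about 40k quads. A rough, illustrative sketch of that kind of procedural grid generation - not the app's actual code:)

    import simd

    // Build a flat (columns x rows) grid: shared vertices plus two triangles per quad.
    func makeGrid(columns: Int, rows: Int, size: Float) -> (vertices: [SIMD3<Float>], indices: [UInt32]) {
        var vertices: [SIMD3<Float>] = []
        var indices: [UInt32] = []
        for y in 0...rows {
            for x in 0...columns {
                vertices.append(SIMD3(Float(x) / Float(columns) * size,
                                      Float(y) / Float(rows) * size,
                                      0))
            }
        }
        let rowStride = UInt32(columns + 1)
        for y in 0..<rows {
            for x in 0..<columns {
                let i = UInt32(y) * rowStride + UInt32(x)
                indices += [i, i + 1, i + rowStride,
                            i + 1, i + rowStride + 1, i + rowStride]
            }
        }
        return (vertices, indices)
    }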

xyst · 2 years ago
Is the choppy movement of the cloth because of the limitations of the device or something else?

Can’t believe this is what a $4-5K piece of tech looks like. Wild.

lukko · 2 years ago
It's because of the simultaneous recording / screen capture - it's smoother on device with less lag.
hyko · 2 years ago
Is the video captured using the dev kit?
spaceman_2020 · 2 years ago
Not to disparage OP or this product - which sure looks cool - but man, the idea of buying a $3500 device to simulate something I can do by opening my wardrobe seems…absurd.
dorkwood · 2 years ago
Did you watch the video? I don't think anyone has anti-gravity cloth that lights up on touch in their wardrobe.
elliottkember · 2 years ago
You’ll be delighted to know that the $3500 device can actually run many other applications!
hehdhdjehehegwv · 2 years ago
Can it though? What apps are people actually using?

Apple is very much not having an iPhone moment with Vision Pro: there is no explosion of creativity or a new market rush. And there may never be one.

Apple treated devs like trash for years and now they have a new platform nobody wants to build on. Oops.

superamit · 2 years ago
This is really cool! I use AVP for work but have been eager to find new meditation interfaces, because it feels like AR has so much potential for this. Will try it out!
throwaway115 · 2 years ago
Congrats! What has it been like developing on the AVP?
lukko · 2 years ago
Thanks!

It's been fun. It's not too dissimilar to iOS - a lot of the spatial capabilities are linked very closely with RealityKit, so it's worth looking at that API if you're interested. I was thinking we'd use Metal for rendering, but I think due to privacy issues around accessing the raw camera data, it's only supported in 'full' spaces, not the mixed mode (camera feed + overlay).
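
As a rough illustration of the RealityKit route (a generic visionOS example, not the app's actual code), a scene is basically a SwiftUI RealityView that you add entities to:

    import SwiftUI
    import RealityKit

    struct ClothSpace: View {
        var body: some View {
            RealityView { content in
                // A procedurally generated cloth mesh would be added here;
                // a sphere stands in for it in this sketch.
                let placeholder = ModelEntity(
                    mesh: .generateSphere(radius: 0.1),
                    materials: [SimpleMaterial(color: .white, isMetallic: false)])
                placeholder.position = [0, 1.2, -0.5]
                content.add(placeholder)
            }
        }
    }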

goeiedaggoeie · 2 years ago
The APIs are a lot more limited than on iOS though, from my experience.
jncfhnb · 2 years ago
It seems weird that the interface encourages interacting with it from afar and has no indicators as to where your actions would affect it (like a highlight on a pickupable node). Is there some sort of intuitive reason for that that's not obvious in the video? Not a complaint, just curious.
lukko · 2 years ago
Good point - I probably should have shown the direct interaction too (touching the actual fabric in space) - there's a gif here: https://jmp.sh/s/lHqJm6NEvqMqkXMPyUpZ. It was a little bit laggy when also screen recording on device.

In the video I am looking at where I want to interact and then using a pinch gesture; the sounds are mapped to different cells of the cloth. By either looking and tapping, or playing directly, you can hopefully play your intended sound.
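
For reference, look-and-pinch input in RealityKit on visionOS typically comes through a SpatialTapGesture targeted to an entity; the hit point can then be mapped to a cloth cell and its sound. A generic sketch (not the app's actual code):

    import SwiftUI
    import RealityKit

    struct ClothTapView: View {
        var body: some View {
            RealityView { content in
                // Cloth entity (with CollisionComponent + InputTargetComponent) added here.
            }
            .gesture(
                SpatialTapGesture()
                    .targetedToAnyEntity()
                    .onEnded { value in
                        // Convert the tap location into scene space, then map it
                        // to the nearest grid cell to pick the sound.
                        let hit = value.convert(value.location3D, from: .local, to: .scene)
                        print("Tapped cloth at \(hit)")
                    }
            )
        }
    }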

jncfhnb · 2 years ago
So it looks for gesture events and maps them to locations where your eyes are looking rather than where the gesture was done?

Is that how most vision pro interfaces work?

Falimonda · 2 years ago
Are you able to have it interact with the environment?

Say you slide it over so it's hovering over the edge of your sofa and then you turn gravity on. It seems like you're just a few steps away from that!

raincole · 2 years ago
A few really big steps.
lukko · 2 years ago
haha, yes - it's definitely possible, it just may be a bit tricky.

You can detect planes and the mesh geometry of the room.
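
For what it's worth, on visionOS the plane / room-mesh data comes from ARKit data providers run in an ARKitSession (an open immersive space and world-sensing permission are required). A minimal, generic sketch of the plane-detection side:

    import ARKit

    func watchPlanes() async throws {
        let session = ARKitSession()
        let planes = PlaneDetectionProvider(alignments: [.horizontal, .vertical])
        try await session.run([planes])

        for await update in planes.anchorUpdates {
            // Each PlaneAnchor carries a transform and extent the cloth could collide with.
            print("Plane \(update.anchor.id): \(update.event)")
        }
    }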
