Readit News
COGlory · a year ago
I still routinely use Nvidia 3D Vision to build protein structures into electron density maps. Nvidia hasn't updated the driver in generations, and no new compatible displays or parts have been made in eons, but it's an invaluable tool. My concern is that Wayland is going to kill it entirely. There is definitely a market for this.
vrinsd · a year ago
What display are you using? 120 Hz LCD with shutter glasses? I assume a Quadro video card with the 3-pin stereo output connector.
COGlory · a year ago
I'm using a Dell S2716DG, which is a 27" 1440p TN panel at 120 Hz. As far as I'm aware, only a few 1440p models ever got 3D Vision (officially). I've tried running 3D Vision on an OLED ultrawide I have, and it works, but only on the bottom of the screen. I assume it's something to do with the refresh rate (144 Hz) or pixel response time (which I think isn't great with OLED).

I'm driving it with an A4500, on Linux (openSUSE mainly), 3-pin to the USB emitter, with glasses. The A4500 is somewhat gimped because only the 470.xx driver works. With the newer drivers, X11 detects the display and emitter but shows both frames simultaneously. I think it might have something to do with the stereo declaration in xorg.conf being different with the newer drivers, but I'm still chasing down that lead.
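
For reference, the stereo declaration in question is an Option in the Screen section of xorg.conf. A minimal sketch (the value is the part to experiment with: per the NVIDIA driver README, "3" is onboard 3-pin DIN stereo and "10" is the 3D Vision USB emitter):

    Section "Screen"
        Identifier   "Screen0"
        Device       "Device0"
        Monitor      "Monitor0"
        DefaultDepth 24
        # 3 = onboard 3-pin DIN stereo, 10 = NVIDIA 3D Vision USB emitter
        Option       "Stereo" "3"
        SubSection "Display"
            Depth 24
        EndSubSection
    EndSection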

sfmz · a year ago
I keep waiting for a commercial version of motion-tracking, perspective-changing 3D on a TV, but nobody seems interested in this space...

Wii Sensor Bar VR For A 3D Window Like Display https://www.youtube.com/watch?v=LC_KKxAuLQw
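
The trick behind those demos is head-coupled perspective: track the viewer's eye and render with an asymmetric (off-axis) frustum so the screen behaves like a window into the scene. A minimal sketch of the projection math (variable names and screen conventions are mine, not from the video):

    import numpy as np

    def off_axis_projection(eye, screen_w, screen_h, near=0.1, far=100.0):
        """Asymmetric view frustum for head-coupled perspective.

        The physical screen is modeled as a screen_w x screen_h rectangle
        centred on the origin in the z = 0 plane; `eye` is the tracked eye
        position (x, y, z) with z > 0, in the same units. Returns a 4x4
        OpenGL-style projection matrix.
        """
        ex, ey, ez = eye
        # Frustum edges, scaled from the screen plane back to the near plane.
        left   = (-screen_w / 2 - ex) * near / ez
        right  = ( screen_w / 2 - ex) * near / ez
        bottom = (-screen_h / 2 - ey) * near / ez
        top    = ( screen_h / 2 - ey) * near / ez
        return np.array([
            [2 * near / (right - left), 0, (right + left) / (right - left), 0],
            [0, 2 * near / (top - bottom), (top + bottom) / (top - bottom), 0],
            [0, 0, -(far + near) / (far - near), -2 * far * near / (far - near)],
            [0, 0, -1, 0],
        ])

    # Each frame: read the tracked head position (Wii remote, webcam, ...),
    # rebuild the projection, and translate the scene by -eye.
    proj = off_axis_projection(eye=(0.05, 0.02, 0.6), screen_w=0.6, screen_h=0.34)

The catch is that everything is rendered for one tracked eye position, so it only looks right to a single viewer at a time.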

codegrappler · a year ago
The big change here for Looking Glass is GROUP 3D. Everything shown above tracks a single person at a time.
krenzo · a year ago
LOL, what? There are indeed 3D displays with eye tracking built in that do this. I was just at the Display Week expo last week and saw a handful of new models, and they've been around for years.

Innolux booth at Display Week: https://youtu.be/Tapm05Zwokc?t=268

A Chinese OEM also at Display Week: https://www.youtube.com/watch?v=Y_ZBIC4VydI

Here's one built into an ASUS laptop: https://www.asus.com/content/asus-spatial-vision-technology/

stubish · a year ago
The real use case is eye-catching advertising, distracting passers-by and stealing attention.

Price is getting more competitive with VR and 3D TVs, though, once you include the cost of the PC needed to drive them. Might start seeing them in a few specialist places, like your more exclusive dental surgery. Is 32" big enough for architects to sell designs to customers?

bee_rider · a year ago
One of the examples they have is a museum display, which seems nice and unobjectionable. The ads will be annoying though.
jasonvorhe · a year ago
How long is that actually going to work? At some point the novelty will wear off and people will just walk past it, leaving yet another noise generator in our urban environment.

This reminds me of the fad of placing beacons everywhere in shopping centers that sent "helpful" (annoying) notifications to the phone of anyone who walked by.

dartos · a year ago
Some businesses still use neon signs. It'll probably work for a long time, as long as the display holds up.
fortran77 · a year ago
It's lenticular with eye tracking?

I hate it when they use the word "holographic" for something that has nothing to do with holograms.

dwallin · a year ago
It's lenticular, but with 100 different possible angles, so no eye tracking is needed and it works with multiple viewers. The tradeoff, it seems, is that you need to pump in a lot of data for all those views, and you probably need a screen with pretty high resolution and brightness. There's a good description in their docs:

https://docs.lookingglassfactory.com/keyconcepts/how-it-work...
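
To make the data point concrete: the approach for this kind of multi-view lenticular display is to render the scene from many camera positions swept across the view cone and tile them into one big texture (Looking Glass's docs call this a "quilt"). A rough sketch of the idea (the view count, tile size, and layout below are stand-ins, not their actual spec):

    import numpy as np

    def render_quilt(render_view, n_views=45, cols=5, rows=9,
                     view_w=420, view_h=560, view_cone_deg=40.0):
        """Tile many horizontal viewpoints into one 'quilt' image.

        render_view(angle_deg) is assumed to return a (view_h, view_w, 3)
        uint8 array: the scene rendered from a camera swung angle_deg about
        the focal plane. The display's optics then route each tile to a
        different slice of the viewing cone, so no eye tracking is needed.
        """
        quilt = np.zeros((rows * view_h, cols * view_w, 3), dtype=np.uint8)
        for i in range(n_views):
            # Sweep the camera across the view cone, leftmost view first.
            angle = -view_cone_deg / 2 + view_cone_deg * i / (n_views - 1)
            row, col = divmod(i, cols)
            y0 = (rows - 1 - row) * view_h   # fill left-to-right, bottom-to-top
            x0 = col * view_w
            quilt[y0:y0 + view_h, x0:x0 + view_w] = render_view(angle)
        return quilt

Even at these modest numbers that's 45 views of roughly 235k pixels each, about 10.6 MP per frame, which is where the "pump in a lot of data" part comes from.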

_moof · a year ago
Hard to tell from the video what the quality is like, but this concept will be huge for CAD.
zimpenfish · a year ago
I've got the Portrait and it's pretty good. Definitely not "this is ACTUALLY 3D!" but certainly in the "huh, that's got some depth" zone. It's much more impressive if you have "real" depth maps (i.e. you're rendering the content, or using stereoscopic cameras, LIDAR, iPhone Portraits, etc.) - most of the stuff on mine is ML'd depth maps from old photos.
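
("ML'd depth maps" here just means running a monocular depth model over the photo. A minimal sketch with MiDaS via torch.hub, which is one common way to do it and not necessarily what Looking Glass's own tooling uses:

    import cv2
    import torch

    # Small MiDaS model plus its matching input transform, from PyTorch Hub.
    midas = torch.hub.load("intel-isl/MiDaS", "MiDaS_small")
    midas.eval()
    transforms = torch.hub.load("intel-isl/MiDaS", "transforms")

    img = cv2.cvtColor(cv2.imread("old_photo.jpg"), cv2.COLOR_BGR2RGB)
    with torch.no_grad():
        pred = midas(transforms.small_transform(img))
        # Resize the prediction back to the photo's resolution.
        depth = torch.nn.functional.interpolate(
            pred.unsqueeze(1), size=img.shape[:2],
            mode="bicubic", align_corners=False,
        ).squeeze().cpu().numpy()

The output is relative (inverse) depth rather than metric depth, so it only ever gives you the "some depth" effect.)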
dagmx · a year ago
Their products have existed for the better part of a decade now, with easy-to-use SDKs, etc.

They haven't seen much adoption, however.

dartos · a year ago
I've had the Portrait for a while, and the drivers are a HUGE pain in the ass.
deckar01 · a year ago
I have been working on a portable camera array to put real video on these displays. The video demos out there are either projected depth maps or very expensive, stationary, indoor camera rigs. I tried to stay away from AI generation, but making the array sparse and then synthesizing the missing angles and dropped frames actually solved a lot of problems better than more hardware did.

https://github.com/deckar01/holocam-bilinear-interpolation
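
Stripped way down, the idea is to treat the captures as a sparse grid over (camera angle, time) and fill in any missing sample from its nearest captured neighbours. A toy sketch of that interpolation step (not the actual code in the repo):

    import numpy as np

    def synthesize_view(frames, angle, t):
        """Fill in a missing (angle, time) sample from a sparse array.

        frames[i][j] is an HxWx3 uint8 capture from camera i at frame j;
        `angle` and `t` are fractional indices into those axes. Blending
        the four nearest captures with bilinear weights is a crude stand-in
        for real view synthesis, but it holds up when cameras are close.
        """
        i0, j0 = int(np.floor(angle)), int(np.floor(t))
        i1 = min(i0 + 1, len(frames) - 1)
        j1 = min(j0 + 1, len(frames[0]) - 1)
        a, b = angle - i0, t - j0
        blend = ((1 - a) * (1 - b) * frames[i0][j0].astype(np.float64)
                 + a * (1 - b) * frames[i1][j0]
                 + (1 - a) * b * frames[i0][j1]
                 + a * b * frames[i1][j1])
        return blend.astype(np.uint8)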

ipsum2 · a year ago
It's just a lenticular display. Same as the Nintendo 3DS back in the day.
mensetmanusman · a year ago
When something improves by an order of magnitude, the prior descriptors may be misleading. (Wealthy people take advantage of this oversight all the time, in my experience.)
sandspar · a year ago
Elaborate on that part about wealthy people?
rowanG077 · a year ago
I don't get it. What does that mean? A state-of-the-art processor now is just the same bucket of transistors invented in 1954. But the two are incomparable in complexity and functionality.
bee_rider · a year ago
It’s all just carefully arranged sand used to send photons at our eyeballs, maybe with some bits of metal to spice things up. A laptop is really the same thing as a stained glass window from hundreds of years ago.
