neoteo · 10 years ago
As always Douglas Adams had some keen, if slightly cynical, insight: "The machine was rather difficult to operate. For years radios had been operated by means of pressing buttons and turning dials; then as the technology became more sophisticated the controls were made touch-sensitive - you merely had to brush the panels with your fingers; now all you had to do was wave your hand in the general direction of the components and hope. It saved a lot of muscular expenditure of course, but meant that you had to sit infuriatingly still if you wanted to keep listening to the same programme." Hitchhiker's Guide to the Galaxy.
guelo · 10 years ago
At Google I/O a couple weeks ago they said they had miniaturized the Soli chip and they demoed a smartwatch with the chip in the wrist band, as well as a gesture-controlled speaker. They announced the Soli beta dev kit coming "next year". You can watch starting around 21:40 here https://www.youtube.com/watch?v=8LO59eN9om4
awesomerobot · 10 years ago
I'm sure a lot of consumer implementations come to mind pretty quickly for most, but this could be really huge for accessibility... if you could, say, train gestures tailored to the very specific nuances of a person's available range of motion. Avoiding the limitations tied to physical hardware would be huge.
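To make that concrete, here's a rough sketch of what per-user gesture training could look like (the class name, feature vectors, and threshold are all invented for illustration; this isn't the Soli API): each person records a handful of whatever motions they can comfortably make, and later input is matched against those templates.

    import math

    def distance(a, b):
        # Euclidean distance between two equal-length feature vectors
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    class PersonalGestureRecognizer:
        # Nearest-template matcher trained only on a single user's own motions
        def __init__(self):
            self.templates = []  # list of (label, feature_vector) pairs

        def record(self, label, features):
            # Calibration: the user performs whatever motion they can manage,
            # and it becomes the reference for that command
            self.templates.append((label, list(features)))

        def classify(self, features, threshold=1.0):
            # Match an incoming frame to the closest recorded template;
            # return None if nothing is within the rejection threshold
            if not self.templates:
                return None
            label, template = min(self.templates,
                                  key=lambda t: distance(t[1], features))
            return label if distance(template, features) <= threshold else None

    # Hypothetical calibration: the feature vectors stand in for whatever the
    # sensor actually reports (velocity, range, micro-motion energy, ...)
    recognizer = PersonalGestureRecognizer()
    recognizer.record("select", [0.2, 0.1, 0.0])
    recognizer.record("scroll", [0.0, 0.6, 0.3])
    print(recognizer.classify([0.22, 0.12, 0.05]))  # -> select

The point is that nothing in the templates assumes a "standard" hand or range of motion; the reference gestures are whatever the user was actually able to perform during calibration.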
melling · 10 years ago
It could also be huge for ergonomics. Many people develop RSI issues through repeated keyboard and mouse usage. People have gone as far as using their nose as an input device:

http://www.looknohands.me

There are lots of other RSI stories here: https://github.com/melling/ErgonomicNotes/blob/master/README...

awesomerobot · 10 years ago
Exactly.

An unintended result of the advent of reliable, portable touch devices was a reduced reliance on physical input hardware.

Instead of clamping a joystick to a desk and moving it around with your face... or looking away from the content to type on a keyboard with your nose... you can buy an off-the-shelf product and use it on just about any solid surface. The iPad was huge for people with limited limb control.

azinman2 · 10 years ago
It'll just shift to getting RSI from doing these gestures instead. They still involve fine motor control, which is what's implicated in RSI. Notice how runners don't get RSI in their legs but pianists do.

Background: I have RSI and have battled it for 15 years now. A lot of it is mental and in the nervous system, and it can easily shift elsewhere in the body (I almost gave myself RSI in my throat and eyes by doing voice recognition and eye tracking to avoid typing).

Aelinsaar · 10 years ago
I wonder if this could be used for some kind of modified eye/eyelid tracking?
awesomerobot · 10 years ago
That could be interesting; eye tracking is often expensive and can require a "just-right" kind of setup. Reducing that barrier could be huge for many people.

I think outside of a single solution like eye-tracking there's a lot of room for a device like this to be much more adaptable to a broad range of conditions. There's so much variation in how physical symptoms can present, even within a single disease, that it can be really difficult (and expensive) to customize hardware for an individual... and degenerative diseases can require constant readjustment and new hardware.

The hardware barrier seems endlessly frustrating — you have an off-the-shelf device like a joystick/keyboard/mouse that was originally designed for hands being used with feet and mouths...

...a good example of hardware reduction making a meaningful impact was the advent of reliable portable touchscreens — people could actually start directly touching the interface without some of the ergonomic constraints of a mouse and keyboard. You can reliably surf the web on an iPad using your nose. Imagine trying to do the same with an off-the-shelf joystick or mouse.

dizzydot · 10 years ago
It is truly amazing to see the photograph of the huge, complicated prototype and realize they managed to put it on a chip in under two years.
alex_duf · 10 years ago
I may be old school but for every example given in the video, I would rather have an actual button to press...
elsewhen · 10 years ago
The physical size of buttons/knobs/sliders requires devices to be large enough to accommodate those elements. With a chip like Soli, designers can create devices that are dramatically smaller.

As Soli gets smaller (which future versions likely will), you can imagine tiny devices that maintain rich interactivity.

0x6c6f6c · 10 years ago
Exactly. I also prefer physical buttons but imagine what could be done in terms of reducing space taken by controls alone. This is in many ways vastly better than touchscreen interfaces since it will allow for many different kinds of interaction in the same space.
kevincox · 10 years ago
However, if you have a button, you have a button. With this you can have a button, or a dial, or a slider, or even sliders in the other two directions. And which one you have can switch with the context.
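A tiny sketch of that idea (the context names and motion features here are hypothetical, just to show the same sensed motion being read as different controls):

    from enum import Enum

    class Control(Enum):
        BUTTON = "button"
        DIAL = "dial"
        SLIDER = "slider"

    # Hypothetical mapping from the active UI context to the virtual control
    # that the same hand motion should drive
    CONTEXT_CONTROLS = {
        "alarm": Control.BUTTON,     # tap to dismiss
        "volume": Control.DIAL,      # twist to adjust
        "timeline": Control.SLIDER,  # slide to scrub
    }

    def interpret(context, motion):
        # motion is a dict of made-up features, e.g. {"tap": True} or {"delta": 0.25}
        control = CONTEXT_CONTROLS.get(context)
        if control is Control.BUTTON:
            return "press" if motion.get("tap") else None
        if control in (Control.DIAL, Control.SLIDER):
            return ("adjust", motion.get("delta", 0.0))
        return None

    print(interpret("alarm", {"tap": True}))     # -> press
    print(interpret("volume", {"delta": 0.25}))  # -> ('adjust', 0.25)

The hardware never changes; only the interpretation of the motion does.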
neves · 10 years ago
I want one of these in my glasses so my desktop computer would put the focus on the screen I'm looking at.
kevincox · 10 years ago
This can be done today, and probably more accurately with a webcam.
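For the focus-follows-gaze part, the mapping itself is simple; a hypothetical sketch (estimate_gaze_x() is a stand-in for a real webcam gaze-estimation step, and focus_monitor() for whatever the window manager actually needs):

    MONITORS = ["left", "center", "right"]  # physical arrangement, left to right

    def estimate_gaze_x():
        # Placeholder: a real implementation would analyze a webcam frame and
        # return a normalized horizontal gaze position in [0.0, 1.0]
        return 0.72

    def monitor_under_gaze(gaze_x, monitors=MONITORS):
        # Split the [0, 1] gaze range evenly across the monitors
        index = min(int(gaze_x * len(monitors)), len(monitors) - 1)
        return monitors[index]

    def focus_monitor(name):
        # Placeholder for the window-manager call that actually moves focus
        print(f"focusing the {name} monitor")

    focus_monitor(monitor_under_gaze(estimate_gaze_x()))  # -> right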
sourthyme · 10 years ago
The video suggests that using radar they can more accurately track hand motions than using a camera.
