For those curious, Meta actually acquired the company that originally pioneered this idea (a wrist controller), CTRL-labs, in 2019. Here is a Verge article with photos of the CTRL-labs prototype: https://www.theverge.com/2018/6/6/17433516/ctrl-labs-brain-c...
Another company doing this (mentioned in the Verge article) was Thalmic Labs, a YC company from 2013, which was acquired by Google in 2020. I remember seeing their presentation at YC Demo Day and it was jaw-dropping stuff; one of the only demos I still remember, 12 years on.
It's sad to see they didn't make it a commercial success, and it's a grim reminder that brilliant innovation doesn't assure a successful outcome.
Pretty sure Thalmic sold the tech to CTRL-labs. I’ve still got one of the bands knocking around somewhere. It was cool tech, but it really wasn’t ready for a product.
Thalmic then became North, making smart glasses, and was later sold to Google.
Was skeptical (especially because it's Meta) until it said it's designed for accessibility. Reminds me of the Xbox Adaptive Controller. A lot of devices designed for accessibility end up leading to cool user design discoveries.
I would say such inventions are as old as 30 years; that's when I first heard of startups and inventors trying to do this kind of thing. Obviously the tech must be much more mature now. Still, it never took off back then because typing was orders of magnitude faster than whatever you could do with your hands alone. Learning such hand motions met a similar fate as alt keyboard layouts, which have always stayed niche: most people have no patience to learn something that complicated when they've already learned something early on that works.
Why is this just now news? They already built a similar device for their Project Orion glasses. As far as I can tell, this is just the same thing but with a PC driver.
Having tried prototypes at neuroscience conferences their team attended, I can tell you that the device was incredibly brittle (e.g. a damp wrist, or interference from even a metal table or a nearby computer).
As the article says, the device now seems more robust and ready for market soon, after ML was used to tune the decoding model on EMG data contributed by many participants.
You’re correct that this was publicly announced last fall along with Orion. This is back in the news now because of the recent Nature paper demonstrating the performance of general models on new participants without additional training data. It has nothing to do with PC drivers.
(I helped with their release and last month gave a presentation on the project's original research infrastructure, but I'm no longer on the team and I definitely never was allowed to talk about final products.)
From mine, certainly. Funny how fast we forget about tech that was pretty common for years and then completely disappeared once it turned out to be a fad.
https://github.com/facebookresearch/emg2pose
https://github.com/facebookresearch/emg2qwerty
Infer what you will.
It became iPhone Face ID.