A genuine intellectual question: what's driving the (hardware) cost of these humanoid robots? Dynamixel smart servo motors? Exotic sensors?
I do robotics research but not humanoid. Almost every component used in production came down drastically in price in recent years. Lidar sensors are a few hundred dollars now.
The vast majority of the BOM is actuators, although at low volume, if you need to CNC a bunch of parts, that can hike the price as well. You can use exotic sensors but they're not particularly necessary. It's a death-by-a-thousand-cuts thing with many of the humanoids you see out there right now - hence our approach, which is to get the hardware cost down as much as we could and then start building the software to make up for it
Not sure Dynamixel or other smart servos are a good choice for dynamic locomotion. At the high end they use large brushless motors with minimal gearing. Just look at the latest dog and humanoid robots from Boston Dynamics, Unitree, and the no-name Chinese makers. The reason is that for dynamic motion the actuators have to be both fast and strong. For sensors, the main one is a gyro/accelerometer in the body. Advanced models add pressure sensors in the feet, which seem to be needed for precision acrobatics. And rotary encoders in all joints, of course.
The reason you want quasi-direct drive with low gear reduction has to do with the kinetic energy stored in the gears. It's difficult to get an intuition for this as a human, because we don't have rapidly spinning disks inside our arms.
The moment your high-gear-reduction actuator makes contact with anything, it has to either decelerate instantly, push the object away, or deform it. If your robot arm is moving at speed and hits a wall, it has to shed all of its velocity in a few milliseconds. That's fine for the arm itself, but with a 100:1 reduction the motor rotor and input gear are spinning at 100 times the joint speed, and since kinetic energy scales with velocity squared, the energy stored in that spinning mass is significant. Stop the arm dead and you'll shear the teeth off your gears!
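A quick back-of-the-envelope sketch of that scaling, in Python. The rotor inertia and joint speed here are made-up but plausible numbers, not specs for any particular actuator; the point is just that the stored energy grows with the square of the gear ratio.

```python
# Rough sketch: energy stored in a spinning motor rotor behind a gearbox.
# Numbers are illustrative assumptions, not from any specific actuator.

def rotor_kinetic_energy(rotor_inertia_kgm2: float,
                         joint_speed_rad_s: float,
                         gear_ratio: float) -> float:
    """E = 1/2 * I * (N * w)^2.

    The rotor spins gear_ratio (N) times faster than the joint, so both the
    stored energy and the reflected inertia (N^2 * I) grow with the square
    of the reduction ratio.
    """
    rotor_speed = gear_ratio * joint_speed_rad_s
    return 0.5 * rotor_inertia_kgm2 * rotor_speed ** 2

rotor_inertia = 2e-5   # kg*m^2, small brushless rotor (assumed)
joint_speed = 3.0      # rad/s, arm swinging at a modest speed (assumed)

for ratio in (6, 100):  # quasi-direct drive vs. a high-reduction gearbox
    e = rotor_kinetic_energy(rotor_inertia, joint_speed, ratio)
    print(f"{ratio:>3}:1 reduction -> {e:.3f} J stored in the rotor")

# 6:1  -> ~0.003 J, 100:1 -> ~0.9 J. On a hard stop, all of that energy
# has to go through the gear teeth in a few milliseconds.
```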
For an open-source robot design, the website seems quite light on hardware details. All I was able to find from the link were the robots' physical dimensions in Imperial (?) units. There seems to be a GitHub repository for the robots, but all the folders are empty. Nothing in the docs either. Does anyone have more details about the hardware?
Oh, sorry about that - we are still putting together the hardware specs for the robot; we just launched the software SDK today: https://news.ycombinator.com/item?id=44022106 We're planning to make an announcement about the hardware later
An older version of their repo (or some section I cannot find anymore) indicated they were at the time using a few different Robstride actuators - https://www.robstride.com/products/robStride04
Yes, that's correct, we're working with Robstride. We're gonna release the full hardware spec in a bit; it will live in this repo -> https://github.com/kscalelabs/kbot
We have specific industrial tasks to train and we’re taking a closer look at this as an alternative to the hard-to-reach bigcorps that have their eyes too far down the road. We want to start now and push the current tech as far as it can go
Pretty cool. One suggestion: any bot for home use (and many industrial uses) needs to be able to wash itself. No one wants a bot that cleans the toilet and then spreads dirt all around the house. This will be hard to retrofit (just covering it with silicone will give you massive thermal issues), so have a think about it early.
When I see a video of one of these things taking a shower, I'll think about buying it :-)
What about having the frame be relatively open, ship the robot with a sort of temporary shelter and have some disinfectant spray? Basically to "shower" the robot, you cover it in the shelter, press a button, remove shelter. Like fumigating a house, but on a smaller scale.
Would that still be too manual? I guess you could have another robot do it for you/it :)
I think that only works for rather specific cases, like if your robot is contaminated but not physically dirty. Usually people care about the mess of dirt as well. Also if the frame is open I suspect it will get grit etc inside.
Perhaps a more general way to put this is, think about having this bot in a house with a small child. It would need to clean itself after the kid gets it dirty, and it shouldn't stop working if the kid sticks a Lego brick in some gap.
These open-framed robots are fine for development purposes, but for general use they don't seem practical.
I'm quite interested in the Z-bot [1]. I assume the actuators are good enough to allow the robot to be mobile, but it's missing any information about the sensors, compute, battery, etc. It's difficult to know what you are getting into.
There are at least 18 humanoid robots far enough along to have YouTube videos.[1] That's from February 2024. As far as I can tell, none have an order page where you just enter a credit card number and get the product. It's all "pre-order", "contact sales", or just plain vagueness.
There are a lot of "really great, real soon now" humanoid robot startups.[2][3] As far as I can tell, nobody has yet deployed one in a production environment.
On the mechanical engineering side, it's likely that a drone company will have the first big low-cost product. Drone companies have people who understand sensors, balance, navigation, reliability, and weight/cost/strength tradeoffs.
Yea, we built everything ourselves and tried to stay as low-cost as possible. In my opinion, humanoid robots are not very capable right now, so we sell it for a bit above cost for the time being. As the software capabilities improve we will increase the price.
The Unitree robots look great in videos, kind of suck in reality. The motors in the humanoid (G1) overheat after shaking hands a few times, and the wheeled dog (GO2W) drifts like a broken RC car and constantly topples over in motion.
They also patched the known jailbreak methods early this year, so all newer models lack sensor access unless you pay Unitree massive $$$ for SDK access.
The base Go2 is a fun toy, though. There’s a high level web SDK you can use for free.
Heh. This landing page takes me to somewhere between deepmind circa 2014 and tesla's AI Day press decks.
I mean if you're actually training humanoids in under an hour with sim-to-real transfer that "just works" then congrats, you've solved half of embodied AI
the vertical integration schtick (from "metal to model") echoes early apple, but in the robotics space that usually means either 1) your burn rate is brutal and you're ngmi, or 2) you're hiding how much is really off-the-shelf
Clearly the real play here, assuming it's legit, is the RL infra. K-Sim is def interesting if it's not just another wrapper over Brax/Isaac. Until we see actual benchmarks re say, dexterous manipulation tasks trained zero-shot on physical hardware, it's hard to separate "open-source humanoid stack" from the next pitch that ends in "-scale"
Actually, we use COTS components for basically everything, that's how the price is so low. It's just that we do a lot to make sure we understand how everything works together from software to hardware
IMO humanoid companies do make a lot of big claims which is why it's important to make everything open-source. Don't have to take my word for it, can just read the code
IME the COTS angle cuts both ways. It brings costs down and makes iteration faster, but what's the moat then?
if the value is in integration, that’s fine, but integration is fairly fragile IP. Open source is good reputationally but accelerates the diffusion of your edge unless the play is towards community+ecosystem lock-in or being the canonical reference impl (cf. ROS, HuggingFace)?
We do have an entirely 3D printable robot with build guide here, if you're interested: https://docs.kscale.dev/docs/zeroth-bot-01#/
Looking forward to helping however we can
[1] https://shop.kscale.dev/products/zbot
There are a lot of "really great, real soon now" humanoid robot startups.[2][3] As far as I can tell, nobody has yet deployed one in a production environment.
On the mechanical engineering side, it's likely that a drone company will have the first big low-cost product. Drone companies have people who understand sensors, balance, navigation, reliability, and weight/cost/strength tradeoffs.
[1] https://james.darpinian.com/blog/you-havent-seen-these-real-...
[2] https://personainc.ai/
[3] https://gotokepler.com/
Just like RepRap worked a treat, I hope initiatives like this bring experimental bots into people's workshops.