Google’s New Robot Just SHOCKED The Entire INDUSTRY (MOBILE ALOHA)
In a groundbreaking development, student researchers from Google DeepMind and Stanford University have unveiled the Mobile ALOHA robot, setting a new standard for the capabilities of robotic technology. This creation showcases the potential for future advances in the robotics industry.
The Mobile Aloha robot is a versatile and highly dexterous machine capable of performing a wide range of tasks with remarkable precision and accuracy. From cooking a three-course meal to handling heavy objects and even completing complex tasks like opening cabinets and using elevators, the capabilities of this robot are truly awe-inspiring.
One of the most remarkable aspects of the Mobile ALOHA robot is its ability to learn and adapt through a process called supervised behavior cloning. By imitating human-teleoperated demonstrations and co-training on data from new mobile tasks together with existing static ALOHA tasks, the robot achieves success rates of up to 90% on complex tasks.
The robot’s innovative design includes movement at the speed of a walking person, stability when handling heavy loads, and the ability to be teleoperated by a human. The combination of capable, low-cost hardware and advanced learning makes the Mobile ALOHA robot a game-changer in the field of robotics.
The potential applications of this technology are broad, with implications for industries ranging from manufacturing and logistics to personal assistance and healthcare. The capabilities demonstrated by the Mobile ALOHA robot are a testament to what robotics may achieve in the near future, and its impact on the industry is sure to be far-reaching.
Watch the video by TheAIGRID
Video Transcript
So we actually have something that is absolutely incredible. Some student researchers from Google DeepMind and Stanford University have built something amazing. It’s called Mobile ALOHA, and it’s a really incredible robot that shows us what’s to come in the near future. You can see right here that this is the robot that they created. These are some of the demos that they’ve showcased on their page, and you can see that this robot here is being used to make a three-course meal. Now, I do want to say that there are many things that will be discussed later on in this video, because this is a very fascinating project that you definitely want to pay attention to. It highlights the dexterity and the way in which these robots move, which is much more accurate than many people had assumed.

This initial demo was posted on Twitter, and it took the internet by storm for very obvious reasons. One of the main reasons is that robots currently haven’t really been able to do many tasks at this scale, in terms of cheapness, where you can actually do a lot of tasks which people think only humans can do. And trust me when I say that a lot of the demos later on in the video only get more intense and more amazing.
So let’s actually take a look at the team and this entire project to see exactly how things work. This is the team: we have Zipeng Fu, a PhD student in AI and robotics at Stanford and a student researcher at Google DeepMind; we have Tony Zhao, a PhD student at Stanford and a student researcher at Google DeepMind; and we have Chelsea Finn, CS faculty at Stanford and a research scientist at Google DeepMind. These are the people behind this. Now, you’re going to want to read the abstract, which is essentially just the introduction to
the research paper, in which they document how they managed to do everything. It essentially starts out by saying that imitation learning, where robots learn by watching humans, works really well for tasks done on a table, like picking up objects, but that isn’t enough for tasks that need moving around and using the whole body. In this project they made a system called Mobile ALOHA, which is like an upgraded version of an older system, ALOHA. This new system can move around and is designed to control its whole body, making it more versatile. They use Mobile ALOHA to gather data by having it mimic human actions, then they teach it to perform tasks by itself through a process called supervised behavior cloning, and then they improve its learning with a method called co-training, which I’ll dive into later on, where it learns from both the new mobile tasks and the older ALOHA data, which is more static. This approach is super effective: by watching and learning from about 50 examples of each task, the robot gets really good at doing complex things like cooking shrimp, opening cabinets, using an elevator, and washing dishes. In fact, with this method the robot’s success rate on these tasks can go up to 90%.
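To make the idea of supervised behavior cloning a bit more concrete, here is a minimal, hypothetical sketch in Python/PyTorch of the core training loop: a policy network is trained to map observations (camera and joint-state features) to the actions a human demonstrator took. The class names, network architecture, feature sizes, and learning rate below are illustrative assumptions, not the authors’ actual code.

```python
# Minimal sketch of supervised behavior cloning (illustrative only).
# Assumes a dataset of (observation, action) pairs recorded while a human
# teleoperates the robot; the policy simply learns to imitate those actions.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, Dataset

class DemoDataset(Dataset):
    """Hypothetical wrapper around recorded demonstrations."""
    def __init__(self, observations, actions):
        self.observations = observations  # e.g. flattened camera + joint features
        self.actions = actions            # e.g. target joint positions + base velocity

    def __len__(self):
        return len(self.actions)

    def __getitem__(self, idx):
        return self.observations[idx], self.actions[idx]

# A simple MLP stands in for whatever policy architecture is actually used.
policy = nn.Sequential(
    nn.Linear(512, 256), nn.ReLU(),
    nn.Linear(256, 16),  # e.g. 14 arm joint targets + 2 base velocities
)

def train_bc(dataset: DemoDataset, epochs: int = 10):
    loader = DataLoader(dataset, batch_size=64, shuffle=True)
    optimizer = torch.optim.Adam(policy.parameters(), lr=1e-4)
    loss_fn = nn.MSELoss()  # regress predicted actions onto demonstrated actions
    for _ in range(epochs):
        for obs, act in loader:
            pred = policy(obs)
            loss = loss_fn(pred, act)
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()
```

At deployment time the trained policy is simply queried at each control step to produce the next action, with no human in the loop.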
So let’s take a look at some of the tasks that they were able to do autonomously, because this is absolutely incredible. You can see here a demonstration of some of these tasks, and what’s even crazier is that this is 100% autonomous. Later on in the video I will tell you about teleop mode and how it actually works, but here you can see that this robot is completely going about its daily tasks on its own. Now, we did see some other examples of this from other robots, like the PaLM-E demonstration, in which there was a robot, but I think this one is far more intriguing because of the way it’s able to do certain tasks which we really only thought were possible with human hands. It was able to do many of these tasks, and some of them, like this one with the chairs, weren’t even in the original data set. So this is definitely fascinating stuff, because I also remember seeing a video where there was actually some interference with the task, meaning that I think someone was throwing some stuff at the robot, and it still was able to complete the task. A robot being able to cook shrimp at this level is pretty crazy if you ask me.
So, some things that you do need to know about this robot. One of the first things is the movement: this robot can actually move as fast as a person walking; in the paper they mention that it can move at a decent speed. In addition to that, it’s steady even when handling heavy things like pots, so the stability of this robot is very effective; it’s not going to just be tipped over. In addition, you can control its arms and the base it moves on at the same time, and what’s crazy about this is that it is self-powered, so it has its own battery and computer. For the moving part, they chose a mobile base called Tracer, which is used in warehouses: it’s fast, can carry heavy loads, can move over small obstacles and slopes, and is also cheaper than similar products. They designed the robot so a person can control it while tethered to it; this way the person can move the robot around while using its arms, which is definitely helpful for tasks like opening cabinets. It also has cameras on its arms and one in front for seeing, and it can sense and record its movements.
Now, like I said before, this is essentially version two of the original ALOHA, and you can see exactly how accurate this robot is at doing certain things. I mean, it’s able to put a RAM stick into a computer; then it’s able to take this box, grab a knife, open it up and make some very precise cuts to open that package; then it’s able to get the part out and apply it so that thing can move. I’m not exactly sure what I’m looking at, but I know those are bike gears. So this is very impressive stuff, because as you can see, even here this is in teleop mode. Essentially, teleop mode, which I’ll talk about more in just a second, is where a human controls the robot via another arm, so the human is giving the input and the robot reproduces it. Now, the thing about teleop is that it allows you to control the robot in a very precise way, and why it’s good is because we can then see exactly how well these robots could do if they were given the right data set, or if we were able to program them in the correct way, which shows us how effective these robots are. So, autonomous versus VR teleop versus standard teleop: there are many different ways you can do it; you can do it in VR, or you can do it the standard way. For example, if you don’t know the difference, this right here is VR teleop from Pollen Robotics, where you have someone in VR who is able to control that robot.
Teleoperation, also known as remote operation, refers to the operation of a system or a machine from a distance. It’s most commonly associated with robotics and mobile robots, but it can be applied to a wide range of systems, and the term is often used in research, academic, and technology contexts. In the context of robotics, teleoperation involves a human operator controlling a robot from a distance, and this control can be achieved through various communication channels such as radio links, satellite connections, or cellular networks. The distance involved can be as little as across a room or as great as across continents. It’s particularly useful in situations where it’s unsafe or impractical for humans to be physically present; for example, teleoperated robots are used in search and rescue operations, in the nuclear and pharmaceutical industries, and in maintenance and research sectors where on-site operation would put people at risk or where a clean environment needs to be maintained. Other common applications include space exploration, underwater vehicles, forestry, mining, agriculture, surveillance, rescue, and surgery.
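To make the leader-arm style of teleoperation described earlier a bit more concrete, here is a minimal, hypothetical control-loop sketch: joint positions are read from a leader arm that the operator moves by hand and are streamed as position targets to the follower arm at a fixed rate, while each observation/action pair is logged as demonstration data. The function names, the arm object methods, and the 50 Hz rate are assumptions for illustration, not the actual Mobile ALOHA code.

```python
# Illustrative leader-follower teleoperation loop (not the authors' actual code).
# The operator back-drives a lightweight "leader" arm; its joint angles are read
# and sent as position targets to the "follower" arm that touches the world,
# while every (observation, action) pair is logged as demonstration data.
import time

CONTROL_HZ = 50  # assumed control rate

def read_joint_positions(arm):
    """Placeholder: query the arm's current joint angles (radians)."""
    return arm.get_joint_positions()

def set_joint_targets(arm, targets):
    """Placeholder: command the arm to track the given joint angles."""
    arm.set_joint_positions(targets)

def teleop_loop(leader_arm, follower_arm, recorder):
    period = 1.0 / CONTROL_HZ
    while True:
        start = time.time()
        targets = read_joint_positions(leader_arm)   # what the human is doing
        set_joint_targets(follower_arm, targets)     # mirror it on the robot
        recorder.log(observation=follower_arm.observe(), action=targets)
        time.sleep(max(0.0, period - (time.time() - start)))
```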
This one here is, of course, VR teleop, which is definitely some fascinating stuff, but in the actual paper we did get some of the physical teleop, which is right there. You can see he’s able to clean the restroom with this robot, and some people might be asking what the point is of him even doing this if he could do it himself, but like I just stated, there are many different applications for this, and of course you also get to see what the robot is actually capable of. Once the robot is trained, you can then get it to do similar tasks autonomously. So I do think this shows us how far robot dexterity is coming and how effective these robots might be in the future at certain tasks.
Now, here was a demonstration where they had the robot at home, and you can see here once again it’s the teleop system. Remember, they have two ways to control the robot: the teleop system and the autonomous system. You can see here that it’s able to do pretty much any task that I can honestly think of. A lot of this is stuff that you look at and think: a robot doesn’t have hands, it won’t be able to do that, it won’t be able to, say, cook me a three-course meal as we saw at the start, it won’t be able to control its hands well enough, it won’t be fast or quick enough. But of course, there are some things that you should know: some of these clips, I’m not sure exactly which ones, will say in the left-hand or right-hand corner how much they are sped up.
I will show you the real-time speeds of these robots in around two minutes, once we finish analyzing this clip, because it also is very effective at honestly controlling a lot of stuff. One of the things I find really interesting is that this robot isn’t actually shaped like human hands; it’s shaped like two claws, kind of like a crab, I guess you could say, and it just goes to show how many different ways there are to interact with the environment that are still very effective and that aren’t 100% human. That is something I also found pretty fascinating, because some people would argue that human hands are simply the best, that they have the most dexterity and unmatched grip strength, and I guess you could argue that, but you can see that with these kinds of robots you are definitely accomplishing quite a lot. Even that right there, being able to zip that up, is pretty hard stuff; even for some humans, I would argue, that is quite hard. What we also need to take a look at is, of course, the autonomous nature of this project, because I do believe that, if scaled correctly, it could definitely be something that could be applied to many different things.
So let’s take a look at the rest of this clip, and then I’m going to show you the actual autonomous stuff in real time. Let’s take a look at some of the real-time autonomous tasks. This right here is, of course, the real-time autonomous task of this robot cooking shrimp. You can see that it isn’t as fast, but it still isn’t as painstakingly slow as some of the previous tasks we have seen from other robots and earlier generations. Now, some people could argue that a human is way faster, but like I’ve always said, this is ongoing development; what we are seeing is the continuous evolution and progression of these robots. They might not be lightning fast right now, but remember, year after year, month after month, day after day, as researchers and many different students delve into the industry, I’m sure there will be more discoveries and more efficient ways of making them faster, which will eventually lead to robots that are probably going to be too fast for our liking. So there are many different tasks that this robot was able to do autonomously, and it will most certainly be fascinating to see what other tasks it is able to successfully accomplish. I cannot be the only one surprised by how effective this robot is at doing certain tasks.
So here we have the push-chairs task, and this is of course completely autonomous and in real time, so you can see how fast it actually is. I did want to show this because I wanted to be transparent, not just saying that this robot is so amazing. It definitely, absolutely is amazing, but it’s important to know where we currently are, because even at this speed I still think it is pretty impressive. Although some humans could do it faster, this still looks to be very effective at not only completing tasks but also gripping objects very precisely. What we also know about this task is that the robot was only trained on the first three chairs; the last two chairs are outside the training data, so it’s extrapolating. Then, of course, we have this one here, and like I said, there was an interesting example on Twitter that I unfortunately wasn’t able to find, but in that video it did actually show some adversarial disturbance, in which someone put some things in the pot or moved the pot multiple times. This goes to show that you might come home one day and your robot has managed to clean up everything, not just like a standard Roomba, which just glides around and picks up dust, but a real, actionable robot that moves around your kitchen, or wherever, and is able to fold your clothes and move your stuff.
I think this kind of technology is something that people should be paying attention to, not only because it’s fascinating, but because of the elephant in the room. One of the things that many people have been concerned about recently when looking at these autonomous robots (and, for example, you can see the robot here rinsing out this pan in real time) is whether or not these robots are going to be taking people’s jobs. Of course, as you know, any task on a computer could, in the near future, eventually be automated by AI agents, but for a while now many people have said that robotics is going to progress rather more slowly than the software advancements we have seen with software like ChatGPT. With videos like this, though, and with projects like this from Stanford and Google DeepMind, it’s showing us that robots won’t be that far behind, as long as this level of research keeps taking place. And then, of course, here we have something completely autonomous and in real time: we can see the robot coming over here and wiping this down.
Now, there has definitely been an influx of robotics research across the globe, based on what I’ve seen, and I do think we are going to be getting a stellar level of research and development in 2024, not just from students, not just from companies, but honestly from across the globe, because of the many things that are now going on. For example, they actually did open-source this project, and one thing they also talked about that was absolutely incredible was the price: this entire rig can be had for only $32,000. You might be thinking, well, for the average person $32,000 is a lot of money, and you’d be correct in saying that, but remember that a lot of the other humanoid robots that can do pretty basic stuff cost a lot of money. I’m talking six figures, mid six figures, around $200,000, which is the price of a supercar or a medium-sized family home depending on where you are, and that’s definitely not something that people can afford. But if we have something that’s open source and people can figure out how to make it cheaper and cheaper, just as technology has done before (if you look back, you can remember how giant our TVs and computers were, and how expensive and slow they were), I think the trajectory is definitely clear.
Now, something that I also want to talk about is the co-training. Co-training is like teaching someone to cook using two different cookbooks: one cookbook has recipes for basic dishes, and the other has recipes for more complex meals that involve moving around the kitchen to use various appliances. In the Mobile ALOHA project, co-training means the robot learns from two sets of examples, or cookbooks. The first set shows the robot how to do tasks that require using both hands at the same time, like cracking eggs with one hand while stirring a bowl with the other, but these tasks are done in one place, without moving around. The second set of examples teaches the robot how to move around and do tasks in different parts of a room, like going to the fridge to get ingredients and then moving to the stove to cook. By learning from both sets, the robot gets really good at tasks that need both hand coordination and moving around, like cooking a complete meal where it needs to fetch ingredients, prepare them, and cook them on the stove. This way the robot becomes more skilled and can do more complex tasks than if it had learned from just one set of examples, and we can see time and time again that co-training outperforms no co-training.
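As a rough illustration of what co-training on two datasets can look like in practice, here is a hypothetical sketch that mixes samples from a static (tabletop) demonstration set and a mobile demonstration set into each training batch. The 50/50 sampling ratio, dataset names, and the train_step function are assumptions for illustration, not the authors’ actual recipe.

```python
# Illustrative co-training loop: each batch mixes demonstrations from a static
# bimanual dataset and a mobile-manipulation dataset, so one policy learns from
# both. Sampling ratio and all names are assumptions, not the paper's exact setup.
import random

def sample_batch(static_demos, mobile_demos, batch_size=64, mobile_fraction=0.5):
    """Draw a mixed batch: some examples from each dataset."""
    n_mobile = int(batch_size * mobile_fraction)
    n_static = batch_size - n_mobile
    batch = random.sample(mobile_demos, n_mobile) + random.sample(static_demos, n_static)
    random.shuffle(batch)
    return batch

def co_train(policy, static_demos, mobile_demos, train_step, steps=10_000):
    """Run behavior-cloning updates on mixed batches.

    `train_step(policy, batch)` is a placeholder for one supervised update,
    e.g. the regression loop sketched earlier.
    """
    for _ in range(steps):
        batch = sample_batch(static_demos, mobile_demos)
        train_step(policy, batch)
```

The intuition, as described above, is that manipulation skills learned from the static data carry over to the mobile setting, so fewer mobile demonstrations are needed for each new task.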
There is also a tweet from Chelsea Finn, one of the people on the team behind this awesome project, talking about how the hardware only cost $32,000 including all peripherals, and how they were actually surprised by how capable it was; and of course they left a link so that you can build it yourself. Now, something that is also important to talk about is the conclusions, limitations, and future directions. Essentially, I read the conclusions in the paper, and in simple terms the project is about making a robot that can move around and use both hands to do complex tasks, like a human would in the kitchen. The robot system, as we know, was built on top of the existing ALOHA system, and it was able to do many of these tasks very well with just 20 to 50 demonstrations, while costing less than $32,000, and it’s freely available for anyone to use. However, there are some things that they do want to improve; for example, the robot is a bit large for some spaces, and its arms can’t reach high or low enough for tasks like using an oven. Of course, different houses have different spaces and sizes, so ideally, for any large-scale robot that’s going to be used in a home, we want it to be essentially as small as possible, and we want it to have as many degrees of freedom as possible in order to work in the widest range of places.
We’d also want it to be able to grab things that most humans can’t grab and to get into places that most humans can’t. Now, they also plan to make this robot smaller, which is of course what we need, and to give its arms a greater range of movement, which is something that we do expect. On the software side, the robot is currently only learning from tasks demonstrated by expert users, and the team wants to improve this so that the robot can learn from a wider range of examples, including less perfect ones. In the research paper they actually show that an average user can become an expert within only around five examples, or five training runs; I’m not sure exactly what the indicator was, but it definitely didn’t take that long.
So I think this project, this entire thing to start off 2024, is definitely a very good look into the future of robotics and what is going to be done at scale. I mean, think about it like this: if three students, and admittedly three very smart and very capable students, can produce this, what could a team of 300 at a large corporation do? It definitely goes to show that if people can do this at such a small scale, imagine what is going to be available at a bigger scale, and the fact that it is open source means that this once again raises the bar for the base level of what we’re able to see in robotics. So let me know if you found this intriguing. Are you going to be taking a further look at this? All links will be in the description. And are you excited for robots, even if some of them can do these autonomous tasks, which may actually take some of our jobs but at the same time should make some of our lives easier?
Video “Google’s New Robot Just SHOCKED The Entire INDUSTRY (MOBILE ALOHA)” was uploaded on 01/05/2024 to YouTube channel TheAIGRID