Rabbits New AI AGENT Device Just SHOCKED The Entire INDUSTRY (Rabbit R1 Device)
The recent release of the Rabbit R1 device powered by a revolutionary new foundational model called the Large Action Model (LAM) has taken the tech industry by storm. With a focus on creating a simple and intuitive computer interface, the Rabbit R1 aims to streamline daily tasks and interactions for users. Unlike traditional smartphones with app-based operating systems, the Rabbit R1 utilizes natural language processing to understand and execute user commands seamlessly.
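The core idea described here — a language model that understands intent, paired with an action layer that replays learned UI steps — can be made concrete with a toy sketch. Everything below (`UIAction`, `ActionModel`, the skill names) is hypothetical illustration, not Rabbit's actual implementation or API:

```python
# Toy sketch of the "large action model" idea: an intent parsed from natural
# language is mapped onto a recorded sequence of UI actions. All class and
# method names here are invented for illustration.
from dataclasses import dataclass

@dataclass
class UIAction:
    kind: str       # e.g. "tap", "type", "scroll"
    target: str     # a UI element, located by its on-screen label
    value: str = ""

class ActionModel:
    """Maps a structured intent to a stored sequence of UI actions."""
    def __init__(self):
        self.skills: dict[str, list[UIAction]] = {}

    def learn(self, intent_name, actions):
        # "Teach mode" analogue: store a demonstrated sequence under a name.
        self.skills[intent_name] = actions

    def act(self, intent_name, **slots):
        # Replay the stored steps, filling slots taken from the parsed intent.
        return [UIAction(a.kind, a.target, a.value.format(**slots))
                for a in self.skills[intent_name]]

model = ActionModel()
model.learn("play_music", [
    UIAction("tap", "search box"),
    UIAction("type", "search box", "{song}"),
    UIAction("tap", "first result"),
])
steps = model.act("play_music", song="Pocket Calculator")
print([(s.kind, s.value) for s in steps])
```

The point of the sketch is the division of labor: understanding ("play some Kraftwerk") is the language model's job, while acting is a replay of interface steps that works on any app with a UI.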
The Rabbit R1 features a touch screen, push-to-talk button, analog scroll wheel, microphone, speakers, and a 360-degree rotational camera known as the Rabbit Eye. This device is designed to respond to user queries within 500 milliseconds, making it significantly faster than other voice AI projects. Additionally, the Rabbit R1 can interact with various applications and services, thanks to the Rabbit Hole web portal.
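The login model the demo describes for the Rabbit Hole portal — redirecting you to each service's own login page and never storing your credentials — matches the standard OAuth 2.0 authorization-code pattern. A minimal sketch of that pattern follows; the endpoints and client ID are placeholders, not Rabbit's actual infrastructure:

```python
# Sketch of the OAuth 2.0 authorization-code redirect flow: the device never
# sees your password, only a one-time code (later exchanged for a token)
# issued after you log in on the service's own page.
from urllib.parse import urlencode, urlparse, parse_qs
import secrets

def build_login_redirect(authorize_url, client_id, redirect_uri):
    """Step 1: send the user to the provider's own login page."""
    state = secrets.token_urlsafe(16)  # anti-CSRF nonce
    query = urlencode({
        "response_type": "code",
        "client_id": client_id,
        "redirect_uri": redirect_uri,
        "state": state,
    })
    return f"{authorize_url}?{query}", state

def extract_auth_code(callback_url, expected_state):
    """Step 2: after login, the provider redirects back with a one-time code."""
    params = parse_qs(urlparse(callback_url).query)
    if params.get("state", [None])[0] != expected_state:
        raise ValueError("state mismatch - possible CSRF")
    return params["code"][0]  # exchanged server-side for an access token

url, state = build_login_redirect(
    "https://accounts.example.com/authorize",
    "rabbit-portal", "https://hole.example.com/cb")
code = extract_auth_code(
    f"https://hole.example.com/cb?code=abc123&state={state}", state)
print(code)
```

Authentication happens entirely on the provider's page, so the portal only ever holds a revocable token rather than a username and password — exactly the privacy property the demo emphasizes.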
Overall, the Rabbit R1 device represents a major step forward in the world of consumer electronics, offering a more intuitive and efficient way to interact with technology. With its innovative approach to human-machine interfaces, the Rabbit R1 is set to revolutionize the way we use technology in our daily lives.
Watch the video by TheAIGRID
Video Transcript
So the company rabbit just released their demo for their new AI powered device and it is honestly no word of a lie absolutely incredible and I’m pretty sure this actually did shock the entire industry because there was no previous hype around this product and now it’s sending shock waves through the entire
Industry in terms of what we knew about consumer devices take a look at the amazing demo because you’re likely to be blown away hi everyone my name is Jesse and I’m the founder and CEO of rabbit I’m so excited to be here today to present you two things we’ve been
Working on a revolutionary new foundation model and a groundbreaking consumer mobile device powered by it our mission is to create the simplest computer something so intuitive that you don’t need to learn how to use it the best way to achieve this is to break away from the app based operating systems currently used by
smartphones instead we envision a natural language centered approach the computer we’re building which we call a companion should be able to talk to you understand you and more importantly get things done for you the future of human machine interfaces should be more intuitive now before we get started
let’s take a look at the existing mobile devices that we use daily the one device that’s in your pocket the smartphone like iPhone and Android phones these have been here for years and we’ve grown tired of them the problem with these devices however is not the hardware form factor it’s
what’s inside the app based operating system want to get a ride to the office there’s an app for that want to buy groceries there’s another app for that each time you want to do something you fumble through multiple pages and folders to find the app you want to use
and there are always endless buttons that you need to click add to the cart go to the next page check the boxes and jump back and forth and so on the smartphone was supposed to be intuitive but with hundreds of apps on your phone today that don’t work together it no
longer is if you look at the top ranking apps on app stores today you’ll find that most of them focus on entertainment our smartphones have become the best devices to kill time instead of saving it it’s just harder for them to do things many people before us have tried
to build simpler and more intuitive computers with AI a decade ago companies like Apple Microsoft and Amazon made Siri Cortana and Alexa with these smart speakers often they either don’t know what you’re talking about or fail to accomplish the tasks we ask for recent achievements in large language models
however or LLMs a type of AI technology have made it much easier for machines to understand you the popularity of LLM chatbots over the past years has shown that the natural language based experience is the path forward however where these assistants still struggle is getting things
done for example if you go to ChatGPT and use the Expedia plugin to book a ticket it can suggest options but ultimately cannot assist you in completing the booking process from start to finish things like ChatGPT are extremely good at understanding your intentions but could be better at triggering actions
another hot topic is the field of research around what they call agents it has caught the eye of many open-source projects and productivity software companies what remains to be solved is for these agents to perform tasks end to end accurately and speedily the problem is
forcing a model to perform a task it is not designed for whether for a language model to reason about web pages using super prompts or screenshots we have yet to produce an agent as good as users simply clicking the buttons to fulfill our vision of a delightful intuitive
companion we must go beyond a piece of complex software we want it to be in the hands of everyone so we first set out to fundamentally understand how computer apps are structured and more importantly how humans interact with them we wanted to find a way for our AI to trigger
actions on behalf of users across all environments we want it to be universal not just a Chrome plugin or a limited set of apps but everything iOS Android and desktop these applications share something in common the interface they all have a user interface so at a philosophical level if we can make an AI
trigger actions on any kind of interface just like a human would we will solve the problem this insight led us to create the large action model or LAM as we call it it is a new foundational model that understands and executes human intentions on computers driven by our research in neurosymbolic
systems with a large action model we fundamentally find a solution to the challenges that apps APIs or agents face we solve it with interfaces LAM can learn any interface from any software regardless of which platform it’s running on in short the large language model understands what
you say but a large action model gets things done we use LAM to bring AI from words to action finally we can build a computer that in addition to understanding what you’re trying to say can actually help you do things on your behalf we pack the large action model
into the advanced Rabbit OS operating system it is built for real-time interactions between you and the rabbit powered by LAM the large action model’s concept and test results are so powerful that we decided to make a one-of-a-kind mobile device introducing R1 your pocket companion this is the rabbit R1 designed in collaboration with Teenage Engineering the R1 is a fully standalone device primarily driven by natural language the accumulation of countless hours of work from our research design and software teams it is simple to use R1 is equipped with everything it needs to interact with
you and perceive your surroundings it comes with a touchscreen a push to talk button and an analog scroll wheel a microphone and speakers and a computer vision enabled 360 degree rotational camera we call the Rabbit Eye it is Bluetooth and Wi-Fi capable and it has a SIM card slot
that supports global 4G LTE networks and now let’s see how it works I can ask rabbit for anything just like if I’m using ChatGPT the difference though is the speed with Rabbit OS I get a response 10 times faster than most of the voice AI projects rabbit answers my questions
within 500 milliseconds with the push to talk button you don’t need to say anything to wake it up just press and hold the button and talk like a walkie-talkie now let’s give it a shot what’s the nature of reality the nature of reality is a topic that has captivated the minds of
philosophers and thinkers throughout history according to Bertrand Russell reality is composed of hard data derived from sensory experiences and logical analysis it is the amalgamation of our perceptions thoughts and the external world I can speak or type with R1 if you shake R1 a keyboard will pop up you will
also notice that the response time is much faster than using some of the LLM assistant apps with web browsing such as ChatGPT what’s the stock price of Coca-Cola searching for the stock price of Coca-Cola the stock price of Coca-Cola KO is $59.7 who played the role of Oppenheimer in Christopher Nolan’s latest movie in
Christopher Nolan’s most recent film Oppenheimer the role of J. Robert Oppenheimer is played by Cillian Murphy the film is a biographical thriller that explores the story of American scientist J. Robert Oppenheimer and his involvement in the development of the atomic bomb now let’s see how the large action
model works with LAM rabbit R1 can interact with all kinds of applications introducing the Rabbit Hole web portal I can log into different services and unlock functionalities for R1 it’s just like iCloud let’s say I want to listen to some music with the Rabbit Hole web portal I have complete control over which
services I want to activate on my R1 and which provider I prefer for music I use Spotify so let’s go ahead and connect my Spotify to the Rabbit Hole now this is the Spotify login page before I start the login process I want to emphasize our commitment to privacy the
way we design the infrastructure for apps that we interact with we don’t create fake users or spam users we don’t hack their infrastructure and of course we do not store any of your third-party credentials instead we redirect you to their login page we never save your username and password and we don’t track
your data the authentication happens on the respective apps now I’m logging in with Spotify Rabbit OS recognizes that Spotify has been authenticated on my device before this presentation I pre-authenticated some other apps like Uber and DoorDash just choose your preferred services and log in through them to get authenticated on your device it’s that
easy now let’s play some music play Kraftwerk’s Pocket Calculator playing Pocket Calculator by Kraftwerk for you play another song from the album sure playing Computer Love for you who wrote the lyrics for this song Ralf Hütter and Emil Schult wrote the lyrics for the song Computer Love by Kraftwerk which band sampled this song it seems that Coldplay’s Talk is the only known example of a song that sampled Computer Love what do you think about this song Computer Love by Kraftwerk is widely regarded as a groundbreaking and influential song in the realm of electronic music its fusion of catchy melodies robotic vocals and innovative use of synthesizers has left a lasting impact on the genre I can also use R1 to call a ride get me a ride from my office home now of course I will book
an Uber ride for you from your office to your home please confirm the ride I have six people with three pieces of luggage find me an Uber that can fit all of us for six people and three pieces of luggage I recommend booking an Uber XL as it provides ample space for all passengers and
luggage please confirm the ride the ride shows up I just hit confirm Uber’s on its way I can also use R1 to order food get me a 12-inch pizza from Pizza Hut delivered here the most ordered option on the app is fine ordering a 12-inch pizza from Pizza Hut since you mentioned
that the most ordered option is fine I will select that for you I just created an order for a 12-inch pizza it’s going to be hand tossed with a classic marinara sauce and topped with regular cheese please confirm your order that sounds really good I just confirmed the order
here now I’m waiting for my pizza I can also ask R1 to do a set of more complex actions that may take multiple apps to finish on a smartphone for example if I want to plan a trip to London I can just describe what I wish in full to
R1 I want to take my family to London it’s going to be two of us and a child of age 12 we’re thinking of January 30th to February 5th can you plan the entire trip for me with like cheap non-stop flights grouped seats a cool SUV and a nice hotel that has
Wi-Fi exploring ticketing options to make your trip a reality for your trip I found various flight options a range of hotels to choose from and car rentals available please confirm each option individually for further details and booking so it’s all been planned out I just confirm confirm
confirm and that’s it could you come up with a schedule for fun things to do while I’m over there on it I have prepared a detailed travel plan for your trip in summary you will be exploring London’s iconic landmarks visiting museums enjoying delicious meals at local restaurants and
experiencing the vibrant atmosphere of different neighborhoods navigation details are also prepared and I can help you book tickets for any events you’d like to attend R1 just planned the entire trip for me that’s awesome but it seems like this is a little bit too intense can you
plan an easy schedule for us sure I’m working on it please take a look at it and let me know what you think it gave me a more relaxed schedule every day with all the details sounds really good and I just confirmed it I can foresee a wonderful trip
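The trip-planning demo above fans one spoken request out to several services (flights, hotel, car rental) and then asks the user to confirm each piece individually. A minimal sketch of that decomposition pattern follows; the service names and task structure are illustrative guesses, not Rabbit's actual planner:

```python
# Sketch of one-request-to-many-services decomposition, with every sub-task
# held for explicit user confirmation as in the demo. All names are invented.
def plan_trip(request):
    """Split one structured trip request into per-service tasks."""
    tasks = []
    if "flights" in request:
        tasks.append(("flight_search", request["flights"]))
    if "hotel" in request:
        tasks.append(("hotel_search", request["hotel"]))
    if "car" in request:
        tasks.append(("car_rental", request["car"]))
    # nothing is booked automatically: each task awaits confirmation
    return [{"service": s, "criteria": c, "confirmed": False} for s, c in tasks]

plan = plan_trip({
    "flights": {"route": "home->LHR", "nonstop": True, "seats_together": True},
    "hotel":   {"wifi": True},
    "car":     {"type": "SUV"},
})
print([t["service"] for t in plan])
```

The confirm-confirm-confirm moment in the demo corresponds to flipping each task's `confirmed` flag, which keeps the human in the loop before any money is spent.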
isn’t that cool huh R1 can help me reach others just like a phone but smarter it comes with a built-in real-time translator and a note-taker removing the final barriers to communication R1 automatically detects the language spoken in the environment and provides bidirectional translation R1 has an eye an onboard camera designed for
advanced computer vision it can analyze surroundings and take actions in real time to activate the eye just double tap the button oh funny seeing you here Rick let me take a look this is what I got in the fridge can you make me a nice dish that’s low in calories let me
see how about a green garden omelette it’s a delicious and low calorie dish that combines the freshness of broccoli and cabbage with the creaminess of eggs here’s a simple recipe for you nice it recognized all the stuff and gave me actual recipes the Rabbit Eye can also help you with your
documents here’s the table I’ve been working on for a job questionnaire can you create an additional column that matches candidates who mentioned rabbit when answering the questions about how they found us sure let me take a look at the table and add the matching column for you
I’ve processed the table and sent you an email with the results okay now let’s check our email I can continue to interact with Rabbit OS even beyond R1 let’s reply to this email directly can you add another column that matches candidates who have included rabbit in their answer and are LA
based I just replied to the email and I got a refined version from Rabbit OS through my email let’s say I have a unique routine or task I cannot do on my phone R1 can do that too we are experimenting with what we call the teach mode just like
how I can teach my friend how to skateboard I can show R1 how to do it and it will learn from me this means that any user regardless of technical background can teach R1 to learn new skills so you go to teach mode you start a new
session today I will show you how to generate an image of a puppy using Midjourney from a prompt using Discord first I will go to the servers page and click on my own server since this is only a general image generation I’ll go to the Midjourney text channel then I will use the imagine
command along with the prompt here I’m putting a cute baby wild dog with big eyes animated cartoon Unreal 8K let’s wait for a minute for the engine to start generating the images once it’s done let’s click on the image to get a link I will then explain to Rabbit OS how to
use this rabbit and annotate it so that I can generate anything not just puppies so let’s go back to our web portal submit request it takes seconds for the web portal to finish processing and that’s it it’s that simple now once we finish the training I
can go back to my R1 now let’s use Midjourney as I told you to generate a picture of a bunny in pixel art style certainly Jesse I will use Midjourney to generate a picture of a bunny in pixel art style for you please give me a moment to create the image
now here you go you got an image generated on Midjourney through teach mode watch learn and repeat that’s teach mode it’s that simple that’s all the demos for today with LAM fast evolving my R1 will eventually help me to do things that can never be achieved on an app
based phone speaking of the current app based phones the first question we ask ourselves is why would I need a new device if I already have an iPhone my iPhone can’t do any of this at all we did not build rabbit R1 to replace your phone it’s just a different generation
of devices the app based systems were introduced more than 15 years ago and a new generation of native AI powered devices is just getting started here’s a quick recap R1 is our companion that hosts the large action model with natural language I can use it for a wide
range of tasks ask anything direct actions complex actions AI enhanced video calls note-taker translator with the Rabbit Eye computer vision and experimental teach mode on the hardware side we’ve got a 360 rotational camera a global 4G LTE SIM card a push to talk button and an analog scroll
Wheel one last thing what about the price now before we reveal our price I want to do a quick comparison here are some of the best phones on the market right now you got iPhone you got latest version of Android phones we’re looking at somewhere around
$700 to $1,000 for a top phone with an app based system I bought my new iPhone 15 Pro Max last year and it’s the same experience as my previous ones here are the not so smart smart speakers they’re asking roughly around $200 but they’re all outdated and
finally here are a couple of the new things with only large language models you got the AI Pin asking for $699 plus monthly subscriptions for their base model you got Tab asking for $600 and you got Meta Ray-Ban glasses asking for roughly $300 remember these are the things with only large language
models we still think these were too expensive we priced the rabbit R1 at $199 no subscription no hidden fees you can order the R1 now at rabbit. and we are shipping Easter 2024 I can’t wait for you to experience the R1 for yourself thank you so one of the things
I want to talk about from this is the teach mode the teach mode that we saw and I mean I honestly just can’t hold my excitement for this product because teach mode is absolutely insane like how is that even a thing I mean that is absolutely crazy so you can see here
that like a lot of people are going to be like okay this product is not that good you know yada yada yada and I mean I don’t think a lot of people are gonna say the product is not that good I mean you know some critics might say that you
know the product can only do certain things but they literally have a teach mode so all you need to do is record your computer interface interactions stop it then just queue it to upload and then it immediately understands what you want to do which means that this has a very
unique feature which means that users can customize this device to do pretty much whatever you want it to do and I think this is exactly what Bill Gates was talking about when he said that AI agents are going to be the future now in the video they did actually talk about how
it was in fact of course their language agent framework which essentially means that they are not like large language models and they’re not like agents I don’t know if this is like you know just I guess you could say um some cool wordplay but I think a large
action model um might be an entirely new um framework for how uh agents are going to interact with the web um I don’t know if they’re going to be releasing their research maybe they do want to because if you do release you know new AI research and it completely
takes the field by storm there are definitely quite a lot of good things for your career I’m guessing and potentially the researchers behind this may want to do that but I do think that this product is definitely going to become somewhat mainstream as long as the demo holds up in real life because I’m
definitely going to be ordering one of these things and this of course isn’t a sponsored video at all it’s just based on the fact that from what I’ve seen here the applications for this are simply mind-boggling I mean the demo itself was pretty pretty impressive and
I mean some of the stuff that you were able to do like for example book an Uber was simply absolutely crazy and I mean I got to be honest the device is small enough it looks cool and finally finally finally finally we get a device that finally takes advantage of a vision
model that was something that I was really waiting for and one of the key things about the vision model is that I’m sure that with future updates it’s going to be updated I’m not sure what they’re actually using in their vision model because I know and the video is
actually kind of stuck I know there are many different things that you could potentially use but I think this right here is definitely going to be something that could potentially replace the iPhone and I know that it might not seem like it at the moment because we’ve used
our phones for such a long time but honestly this is pretty incredible so I honestly can’t wait to use this because there are many times that I see a few ingredients and I don’t know what’s there sometimes like I said the vision capabilities are truly universal if it’s
a really good vision system I think they um didn’t really exploit it too much because maybe there aren’t too many capabilities cuz I think I remember I was looking at a few different vision papers and recently Apple released a new open source Ferret model and essentially it was better than GPT-4 at classifying
certain things and yeah I think this combined with this vision model is definitely going to be used in many different ways because like I said maybe they could have a fine-tuned R1 device for like researchers that are able to quickly you know take pictures of stuff
or maybe just for whatever kind of industries are out there but I do think that this is a pretty fascinating area because I mean one of the key things that I do really really love about this was the fact that the latency is so quick one of the problems that I
always talked about with AI devices and with AI systems is the latency like how long it takes to get that response and sometimes it does feel quite frustrating because you know you’re speaking to an AI system and you have to wait 5 seconds for a response this one is 500
milliseconds guys that is blazingly fast and I did previously mention that in the future we are going to get that and when we do it’s going to change absolutely everything so um what’s crazy about this as well guys is the price they actually talked about how expensive AI devices
are you know the Humane pin is $700 uh the Tab is $600 Ray-Ban’s AI glasses and you know I don’t even think they’re AI glasses they’re just you know standard glasses that you can record with but this is $200 guys which makes it so competitive on price that is crazy
guys that is absolutely insane guys I’m going to be ordering this cuz I think for the $200 that gives you access to you know a 360 camera with privacy mode a camera that can you know analyze any image and give you some really cool stuff something that can do stuff for
you I mean for example think about the older population people who aren’t that good with phones and technology they could use natural language to get things done and it would effectively work guys because as you know every year like literally like every two months or so we
get new updated models that are better at speech recognition and with that now we have a device where you know the older population could just say oh can you order me some food or could you book this could you do that and it could be
done and imagine you go to your you know grandma’s house you go to your grandpa’s house or whatever and then of course you could essentially update it with certain things that they need so for example let’s say you need them to get certain pieces of medication you could you know
teach it based on your computer then all your grandma or grandpa has to do is say hey can you get me my new medication hey can you book me this and it completely does it they just press confirm like this is a really new and intuitive way to
interact with technology so this is going to be absolutely incredible because if this is even remotely as good as this demo guys I think we have something incredible on our hands because not only is this going to be one of the first products that people probably use
in you know mass adoption worldwide this is going to just set off a whole new industry and a whole new niche in terms of AI wearables and AI devices because this is going to be something that people really really want as long as it is effective and yeah I guess
we’ll have to see if this takes off because one of the things that people will obviously need is this you know something that’s quote unquote cool because a lot of technology trends do tend to follow what is technically cool like the iPhone isn’t exactly the best
thing in terms of the software but it does have an extraordinary brand reputation which leads people to continually buy their products so overall I think this demo was very effective it was 25 minutes of showing some really cool stuff um and I think that this is honestly a really really
Effective thing that they’ve managed to do here so congrats to the team because I was following the page for quite some time and I was really excited and what’s crazy was that these guys didn’t like overhype the product or anything they just said just wait until January the
10th there’s going to be a product coming and yeah here we have it and yeah hopefully you guys did enjoy their demo but I’ll leave links in the description where you can go ahead and purchase it like I said it honestly is not a sponsored video but this is definitely
Something I am excited to purchase
Video “Rabbits New AI AGENT Device Just SHOCKED The Entire INDUSTRY (Rabbit R1 Device)” was uploaded on 01/10/2024 to Youtube Channel TheAIGRID