Rabbit R1’s Device Just GOT UPGRADED (Even BETTER!)
Watch the video by TheAIGRID
Video Transcript
So the new rabbit R1 device recently got a major upgrade, and I need to show you all why this is really, really cool, plus some things you didn't know, because there are some exclusive videos that show us exactly what's going on with this device. So let's actually take a look, because if you're buying the rabbit, or looking forward to it, you'll want to know this.

Number one: these things are selling fast, like really fast. The fifth batch of 10,000 rabbit R1 devices has sold out. Pre-orders for the sixth batch, totaling 50,000, are now available at rabbit.tech. The expected delivery date for the sixth batch is June to July 2024, and for all addresses in the EU and UK, batches one to six will be shipped by the end of July 2024 on a first-come, first-served basis. So if you're one of the people who ordered this when you first saw it, you're likely to get it earlier than someone who orders it now. And remember, this is only $200, which is around £170.
Now, recently there was a new announcement that they dived into, and it was an announcement with Perplexity. Many people actually don't know what Perplexity is, which is why I'm making this video, so you guys can understand. Essentially, Perplexity is basically like Google Search, but it combines AI to be more effective, and honestly, you can't knock it until you try it, because it is really, really effective. So take a look at the Perplexity trailer so you can understand exactly what's going on, and then I'm going to explain their announcement.

"Whether you're navigating the maze of headphone options, drowning in news noise, or stalled on your Japan trip plans, Perplexity Copilot is your guided search assistant. Just turn on Copilot and ask: it takes a deep dive into anything you want to know and delivers tailor-made, concise answers. Forget about diving into a sea of links; Copilot does the legwork by grasping the essence of your question. To fine-tune your search, Copilot engages you with clarifying questions. This ensures you get what you're actually after. Once it gets what you're asking, Copilot scours a vast array of sources to ensure relevance and quality. Want to know more? Every source is just a click or tap away for deeper exploration. Let's say you asked a quick question, but the answer wasn't what you were looking for. Easy: at the bottom of your quick answer, hit rewrite and select Copilot to turn your quick search into a guided search experience. With Perplexity Copilot, you're not just searching, you're gaining a new window into the internet. From the simplest questions to your deepest inquiries, this is where knowledge begins."
So Perplexity is really effective at what it does, and those of you who know and use Perplexity will know exactly how true that is. That's why this announcement is so cool: rabbit actually partnered with Perplexity to give everyone who buys their rabbit device a complete year of this. Usually I think this is around $10 or $20 a month, but currently they're going to give you a full year free, and trust me when I say this is going to make the rabbit device so much better. That's why I said it has been completely supercharged. Later on in the video you're going to see some videos of the actual rabbit device, because the founder shared some clips on Twitter, things like size references and other cool stuff, and there was also a mention of rabbit's device by the CEO of Microsoft.
But first, we have the founders of both companies here on a Twitter Space talking about this announcement, and I think you guys should listen to this clip; it is really, really cool.

"So I'm pretty excited to share that Perplexity and rabbit are partnering together. We are excited to power real-time, precise answers for the rabbit R1 using our Perplexity online LLM APIs, which have no knowledge cutoff and are always plugged into our search index. And the first 100,000 rabbit R1 purchases will also get one year of Perplexity Pro."

"Where did that come from? I didn't know we'd have that feature. But yeah, continue."

"Okay, yeah, the first 100,000 rabbit R1 purchases are going to get one year free of Perplexity Pro. Perplexity Pro for one year is 200 bucks, so if you pay 200 bucks to purchase a rabbit R1, you're getting twice the value."

"Yeah, so we had an interaction on X a couple of days ago, and over the following couple of days the teams worked really hard together to make this happen. To me it's a no-brainer if you think about it: the rabbit R1 is priced at $199, not even $200 but $199, with no subscription, fair enough, and Perplexity's Aravind is generous enough to offer Perplexity Pro for a whole year, which is actually worth 200 bucks."
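For a sense of what those "online LLM APIs with no knowledge cutoff" look like in practice, here is a minimal sketch of calling Perplexity's API, which is OpenAI-compatible. The model name is an assumption from that period, and the exact setup rabbit uses is not something the announcement confirms.

```python
# A minimal sketch (not rabbit's actual integration) of calling Perplexity's
# OpenAI-compatible API with one of its "online" models: the search-grounded
# kind described in the announcement, with no knowledge cutoff and always
# plugged into Perplexity's search index. The model name is an assumption.
from openai import OpenAI

client = OpenAI(
    api_key="YOUR_PERPLEXITY_API_KEY",     # issued in your Perplexity account settings
    base_url="https://api.perplexity.ai",  # Perplexity's OpenAI-compatible endpoint
)

response = client.chat.completions.create(
    model="pplx-70b-online",  # assumed online model; answers are grounded in live search
    messages=[
        {"role": "user", "content": "When does the sixth batch of rabbit R1 pre-orders ship?"}
    ],
)
print(response.choices[0].message.content)
```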
So yeah, there was that announcement, which was really cool, but there was also some other stuff. Like I said, the Microsoft CEO Satya Nadella actually talked about just how good rabbit is, and I can't imagine how this must feel as the rabbit founder: seeing the CEO of what is now, I think, the world's largest company talk about the product you created.

"You see, I thought the demo of rabbit OS and the device was fantastic. I must say, after Jobs' launch of the iPhone, it's probably one of the most impressive presentations I've seen in capturing the vision of what is possible going forward for an agent-centric operating system and interface. And I think that's what everybody's seeking: which device will make it, and so on. It's unclear, but I think it's very, very clear that, going back to the computer, if you have a breakthrough in natural interfaces, where this idea that you have to go one app at a time, with all of the cognitive load on you as a human, there does seem like there can be a real breakthrough. Because in the past, when we had the first generation, whether it was Cortana or Alexa or Siri or what have you, it was just too brittle; we didn't have these Transformers, these large language models. Whereas now I think we have the tech to come up with a new app model, and once you have a new interface and a new app model, I think new hardware is also possible."
"And is that an opportunity for Microsoft, or are you moving away from hardware?"

"I mean, look, it's always an opportunity."

That talk right there was really fascinating, because Microsoft seemed to be eyeing up the hardware market. And you have to remember, it wasn't just a couple of years ago, it was, I think, around 15 years ago that Microsoft brought out their own device, the Windows Phone, before pulling the plug on it; some of you don't even know what that is, and rightly so, because it just didn't go well. It goes to show how hard it is to make a consumer hardware device that actually succeeds. It will be interesting to see if Microsoft jumps back into this, but I don't think they will, unless it's because OpenAI are going to be working on a device too. And if you watched some of my other videos, where I talked about Ray-Ban's AI glasses that are coming in the future, I think that's going to be an interesting point.
Now, something many people missed is how rabbit actually works. In my original video discussing rabbit's amazing tech, I didn't show this video from their website, where they talk about large action models, essentially their new proprietary system for using agents to, I guess you could say, interact with the web. LLMs are good, but they are text-based; that's essentially their purpose, and while they can be repurposed for other things, that's not what they were made for. So rabbit essentially made LAMs, and in this demo they talk about how large action models are pretty much better than anything we've ever seen: a new foundation model that understands human intentions on computers. I think this is a really interesting watch, and after it I want to show you some videos of the rabbit actually being used, some more in-person demos, because I know everyone who's ordered it probably wants to know how big it is and how certain capabilities work, so I'm going to show you that in a second.
"We can teach rabbit OS how to use specific applications. In this video, I'm teaching rabbit OS how to book an Airbnb. While I'm operating normally as a human on the left screen, watch closely on the right as the large action model learns all my inputs and imitates my behavior in real time. I'm trying to plan a trip to Barcelona with my wife and my daughter. The first thing I'm going to do is navigate to the 'anywhere' option, and I'm going to type Barcelona in the search field. The system suggests Barcelona, Spain, which is exactly where we want to go. Using the website's calendar tool, I'm going to mark our check-in on the 15th and check-out on the 21st. Now I'll click 'add guests' and adjust the numbers accordingly. Now let's hit the search button and see what pops up. Since we love the beach, let's make sure to select the beachfront option, and for a more private experience I'm going to select 'entire home' so we have the whole place to ourselves. For the budget, I'll set a maximum and a minimum so that all the options are within our price range. We're going to need at least two bedrooms to make sure we all have our own space. Finally, with all of our preferences set, we've got plenty of options that fit the bill; I'll just start browsing for the perfect one."
"Each training only takes a few minutes and does not require access to an application programming interface, also known as an API, nor do you need anything installed on your device. You only need to train each workflow once. Let's try using rabbit OS and instead book a room in London."

"My extended family is going to London. It's going to be eight of us and four kids. We're thinking of December 30th to January 5th. It's not set in stone yet, so I just want some general options. Can you look it up for me?"

"Sure, I can help you with that. The first option is a home in Portobello Mews, priced at 1,348 per night, with a rating of 4.8."

"The large action model supports mobile apps, web apps, and professional desktop apps. It learns directly on the user interfaces and acts on them. We have already started the training process for the most popular apps, and as you're watching this video, rabbit OS is learning fast and adapting to hundreds of applications. The ultimate goal of rabbit is to define the first natural-language operating system that replaces apps on your device. It's time for the machines to do some serious homework."

So yeah, I think you can understand why this product sold the way it did, because if what they're saying is even remotely true (training takes minutes, no API is required, nothing needs to be installed, and you only train each workflow once), then they are definitely breaking new ground here, and that is a bold claim. I would say that is absolutely incredible.
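To make the "teach once, replay with new parameters" idea concrete, here is a toy sketch in plain browser-automation terms using Playwright. This is not rabbit's LAM, whose internals are proprietary and which learns from demonstrations rather than scripted steps; the workflow and its CSS selectors below are hypothetical stand-ins.

```python
# A toy "teach once, replay with new parameters" sketch (NOT rabbit's LAM):
# a recorded workflow is stored as parameterized steps and replayed against
# the live UI, with no API access to the target site required.
from playwright.sync_api import sync_playwright

# One "learned" workflow: the actions observed during the human demo,
# with the demo-specific value (the destination) lifted out as a parameter.
airbnb_search = [
    ("goto",  "https://www.airbnb.com"),
    ("fill",  "input[name='query']", "{destination}"),  # hypothetical selector
    ("click", "button[type='submit']"),                 # hypothetical selector
]

def replay(steps, **params):
    """Replay a recorded workflow, substituting the caller's parameters."""
    with sync_playwright() as p:
        page = p.chromium.launch(headless=False).new_page()
        for action, *args in steps:
            # Dispatch each recorded step: page.goto / page.fill / page.click
            getattr(page, action)(*[a.format(**params) for a in args])
        page.wait_for_timeout(5000)  # keep the result visible briefly

# Taught once on the Barcelona demo, the same workflow now serves London:
replay(airbnb_search, destination="London")
```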
But that's just some understanding of how it works. Then, of course, we had the benchmarks, which I found really cool, because they actually compared it to GPT-4, GPT-3.5, Flan-T5-XL, and some others, and you can see just how good their LAM, their new proprietary neuro-symbolic model, is.

Then of course we get to the size references. This is where the founder talks about just how big this is, because some people might want to see how it works and what the size is, just some cool stuff like that, and then he also shows two other videos. I want to show you guys this because I think it's important to see just how big it is, and I kind of wish he had compared it to an iPhone, because I feel like this might not replace the iPhone, but it's still a similar handheld device. Nonetheless, it's definitely worth a watch.

"The idea is, seven years ago when I designed the Raven H, I had this magnetic, detachable, pixelated controller that just docks on the main device like that, and the idea was that you can carry it around and kind of just hold it and talk. But the R1 is actually smaller than that: if you put it on top, it's smaller than the footprint. Width-wise it's exactly like an iPhone Pro Max model, but 50% of the footprint. That's kind of the idea."

So yeah, he said it's pretty much the same size as an iPhone 15 Pro Max, but just half of it. Then, of course, there's something for, I guess you could say, accessibility: he talks about why you don't need a left-handed version.
"Hey, this is Jesse, and here's my R1. A lot of people on Twitter have been asking, hey, can you guys make an L1 for left-handed users, because they think all these controls, the button and the scroll wheel on the right side, are specifically designed for the right hand. Well, that's actually not the case. I'm actually a left-hander, so this is how I feel most comfortable holding the R1, using my left hand. If I hold it like this in my hand, to push the push-to-talk button, my middle finger just naturally lands right here on the PTT button, and for scrolling I basically scroll from the back like that, without breaking the gesture. So I just hold it like this, and I press: play Get Lucky by Daft Punk."

Okay, so that was really effective, so for any of you who are left-handers, this is not going to be a problem. Then of course he shows another sneaky demo, where he talks about the rotational camera, so this is obviously worth a look.
"Hey, this is Jesse, and here's my R1. Let's have a close look at the rotational camera. The camera by default points down, which is a physical block for privacy, but if you're about to use it, you go to vision, double-click, and it just rotates. Let's try that one more time: go back, it points down; enter vision, it rotates. And obviously you can flip it to the other side as well. Cheers."

So yeah, I think it's really fascinating how they managed to make this device.
On the Twitter Space, I did get a few clips; I managed to listen to the entire thing, which was around 48 minutes of definitely fascinating stuff, and there were three things from it that I want to show you. They talked about the future of AI assistants, they talked about how they achieved a 500 millisecond response time, and they talked about how they reduced latency. Those are the three things I think are most important for the future, because reduced latency makes us, I guess you could say, enjoy our AI systems more: they respond quicker, so they sound more realistic. And of course the future of AI assistants is important because these guys developed a proprietary model which seems to be better than anything currently on the market. So this talk right here is how they achieved that 500 millisecond response time, and I think it's an interesting listen.
"If you press this button, the microphone starts recording. You're recording into an audio file, and that audio file needs to be converted and sent to the dictation engine, the TTS, text-to-speech... sorry, speech-to-text, actually, the speech-to-text engine, and converted to text. Then that text goes to the OpenAI API, or a proprietary API, or whatever large language model, for intention understanding, and then it starts generating at its own speed. But we made a streaming model, where we basically cut the sequence into very, very small time-stamped chunks and make the entire pipeline streaming. We're not necessarily accelerating GPT's or the proprietary model's speed at the moment, but with this streaming mechanism, if you ask for something that isn't an up-to-date information search, we're constantly hitting the benchmark, which is 500 milliseconds per response. Because again, whatever we push is going to be the industry standard; right now, absolutely right now, it is what it is."

"Yeah, absolutely."

So that right there is how they talk about what they're going to push.
Then of course they dive into some more details.

"Where would you consider whatever latency you have today? How do you compare it with other similar apps, like ChatGPT voice-to-voice? Have you tried comparing the two?"

"Yeah, so we have a technology we call kernel that we started working on pretty early, more than two years ago, where we basically established a streaming model. Because if you think about why there's latency: you press the button, the microphone records an audio file, that audio file is converted to text by the speech-to-text engine, that text goes to the OpenAI API or a proprietary API or whatever large language model for intention understanding, and then it starts generating at its own speed. And then it's a round trip, right? That's a single trip, and everything reverses again. If you add all of this together, if you just go and build a voice AI with no optimization based on GPT-4, we know for a fact that for a single dialogue you're looking at probably five to six seconds. But we made a streaming model where we basically cut the sequence into very, very small time-stamped chunks and make the entire model streaming. I think I'm not the best guy to talk about this; maybe our CTO can write something about it later. We're not necessarily accelerating GPT's or the proprietary model's speed at the moment, but with this streaming mechanism, if you ask for something that isn't an up-to-date information search, we're constantly hitting the benchmark, which is 500 milliseconds per response. But I wish, for everyone, for our team, you and me, that we can do something on up-to-date information search and maybe push this a little bit further, because again, whatever we push is going to be the industry standard; right now it is what it is."

"Yeah, absolutely. We are certainly at the cutting edge here, and in fact, the fact that you wanted to do it through streaming already makes the perceived latency a lot better than waiting for the full response, and I think there are so many more things we can do to speed it up."

So that's where they talk about how they compare to ChatGPT, and it seems like it's only going to get better.
And this is the final clip, where they talk about how the future of AI assistants is going to transpire.

"So maybe I want to lead from there to your thoughts on the whole voice-to-voice form factor, right? Because the rabbit device is definitely taking us beyond just consuming screens and text in the form of pixels to interacting more naturally. So what are your thoughts on the next stage of how people consume and interact with all these AI chatbots and assistants?"

"Yeah, so I think, you know, being our age, we grew up, unfortunately, before the dictation engine was invented, and then it was invented and put to use in a horrible way. I think our current generation are victims of the early days of the dictation engine, the early days of natural language processing, before large language models and Transformers and all that. So, me personally, I identify myself, probably along with everyone here, as having PTSD from the early versions of the dictation engine. I guess that's why it created such a strong impression on our minds that maybe voice is not the right way to go, that we'd rather type. But our principle is very simple: what is the most intuitive way to communicate? Think about it: if we converted this Twitter Space into a typed Twitter Space, or, even worse, a faxed Twitter Space, a non-instant-message Twitter Space, I don't think we could deliver all this information in a relatively short period of time. So if you think about how humans communicate with humans, at least before the Neuralink stuff gets put to use, natural language, and especially conversation in voice, is still the most efficient way. Now the problem becomes easy, because we just need to fix the PTSD.

I think if you look at the past three, four years, especially the past three years, a lot of the fundamental infrastructure around this has been significantly improved. I'm not sure how many of the listeners here have a five-, six-, or seven-year-old kid, but among the younger generation, the kids born after, I guess, 2010, I see that they actually prefer the dictation icon on the keyboard rather than starting to type. So user behavior across generations has already started shifting, and the fundamental reason is that a lot of the infrastructure is good enough, is redundant enough.

So for us, we are not saying you can only talk to the R1; if you shake the R1, a keyboard will pop up. But if you think about the most intuitive way, and if you're in a rush, there's nothing better than finding that analog button, pressing and holding, and starting to talk. I guess that's our design principle. We understand the current challenges and the difficulties, but we want to push just a little bit further, because the method is not wrong, the approach is not wrong; it feels wrong because the technology wasn't ready. But like I said, in the past three, four years, a lot of infrastructure has been significantly improved."
Video “Rabbit R1’s Device Just GOT UPGRADED (Even BETTER!)” was uploaded on 01/20/2024 to the YouTube channel TheAIGRID.