Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.
ICRA 2026: 1–5 June 2026, VIENNA, AUSTRIA
Enjoy today’s videos!
The Suzumori Endo Lab at Science Tokyo has developed a musculoskeletal dog robot using thin McKibben muscles. The robot mimics the dog’s flexible, “hammock-like” shoulder structure to investigate the biomechanical functions of canine musculoskeletal systems.
[ Suzumori Endo Robotics Laboratory ]
HOLEY SNAILBOT!!!
We present a system that transforms speech into physical objects using 3D generative AI and discrete robotic assembly. By leveraging natural language, the system makes design and manufacturing more accessible to people without expertise in 3D modeling or robotic programming.
[ MIT ]
Meet the next generation of edge AI: a fully self-contained vision system built for robotics, automation, and real-world intelligence. Watch how OAK 4 brings compute, sensing, and 3D perception together in one device.
[ Luxonis ]
Thanks, Max!
Inspired by vines’ twisty tenacity, engineers at MIT and Stanford University have developed a robotic gripper that can snake around and lift a variety of objects, including a glass vase and a watermelon, offering a gentler approach than conventional gripper designs. A larger version of the robo-tendrils can also safely lift a human out of bed.
[ MIT ]
The paper introduces an automatic limb attachment system using soft actuated straps and a magnet-hook latch for wearable robots. It enables fast, secure, and comfortable self-donning across various arm sizes, supporting clinical-level loads and precise pressure control.
[ Paper ]
Thanks, Bram!
Autonomous driving is the ultimate challenge for AI in the physical world. At Waymo, we’re solving it by prioritizing demonstrably safe AI, where safety is central to how we engineer our models and AI ecosystem from the ground up.
[ Waymo ]
Built by Texas A&M engineering students, this AI-powered robotic dog is reimagining how robots operate in disaster zones. Designed to climb through rubble, avoid hazards, and make autonomous decisions in real time, the robot uses a custom multimodal large language model (MLLM) combined with visual memory and voice commands to see, remember, and plan its next move like a first responder.
[ Texas A&M ]
So far, aerial microrobots have only been able to fly slowly along smooth trajectories, far from the swift, agile flight of real insects — until now. MIT researchers have demonstrated aerial microrobots that can fly with speed and agility comparable to those of their biological counterparts. A collaborative team designed a new AI-based controller for the robotic bug that enabled it to follow gymnastic flight paths, such as executing continuous body flips.
[ MIT ]
In this audio clip, generated from data recorded by the SuperCam microphone aboard NASA’s Perseverance rover, the sound of an electrical discharge can be heard as a Martian dust devil passes over the rover. The recording was collected on Oct. 12, 2024, the 1,296th Martian day, or sol, of Perseverance’s mission on the Red Planet.
[ NASA Jet Propulsion Laboratory ]
In this episode, we open the archives on host Hannah Fry’s visit to our California robotics lab. In footage filmed earlier this year, Hannah interacts with a new set of robots—those that don’t just see, but think, plan, and do. Watch as the team goes behind the scenes to test the limits of generalization, challenging robots to handle unseen objects autonomously.
[ Google DeepMind ]
This GRASP on Robotics seminar is given by Parastoo Abtahi of Princeton University, on “When Robots Disappear – From Haptic Illusions in VR to Object-Oriented Interactions in AR”.
Advances in audiovisual rendering have led to the commercialization of virtual reality (VR); however, haptic technology has not kept up with these advances. While a variety of robotic systems aim to address this gap by simulating the sensation of touch, many hardware limitations make realistic touch interactions in VR challenging. In my research, I explore how, by understanding human perception through the lens of sensorimotor control theory, we can design interactions that not only overcome the current limitations of robotic hardware for VR but also extend our abilities beyond what is possible in the physical world.
In the first part of this talk, I will present my work on redirection illusions that leverage the limits of human perception to improve the perceived performance of encountered-type haptic devices in VR, such as the position accuracy of drones and the resolution of shape displays. In the second part, I will share how we apply these illusory interactions to physical spaces and use augmented reality (AR) to facilitate situated and bidirectional human-robot communication, bridging users’ mental models and robotic representations.
[ University of Pennsylvania GRASP Laboratory ]
The post “Video Friday: Robot Dog Shows Off Its Muscles” by Evan Ackerman was published on 12 December 2025 at spectrum.ieee.org.