Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.
RoboCup German Open: 12–16 March 2025, NUREMBERG, GERMANY
German Robotics Conference: 13–15 March 2025, NUREMBERG, GERMANY
RoboSoft 2025: 23–26 April 2025, LAUSANNE, SWITZERLAND
ICUAS 2025: 14–17 May 2025, CHARLOTTE, NC
ICRA 2025: 19–23 May 2025, ATLANTA, GA
IEEE RCAR 2025: 1–6 June 2025, TOYAMA, JAPAN
RSS 2025: 21–25 June 2025, LOS ANGELES
IAS 2025: 30 June–4 July 2025, GENOA, ITALY
ICRES 2025: 3–4 July 2025, PORTO, PORTUGAL
IEEE World Haptics: 8–11 July 2025, SUWON, KOREA
IFAC Symposium on Robotics: 15–18 July 2025, PARIS
RoboCup 2025: 15–21 July 2025, BAHIA, BRAZIL
Enjoy today’s videos!
Are wheeled quadrupeds going to run out of crazy new ways to move anytime soon? Looks like maybe not.
[ DEEP Robotics ]
A giant eye and tiny feet make this pipe inspection robot exceptionally cute, I think.
[ tmsuk ] via [ Robotstart ]
Agility seems to be one of the few humanoid companies talking seriously about safety.
[ Agility Robotics ]
A brain-computer interface, surgically implanted in a research participant with tetraplegia (paralysis of all four limbs), provided an unprecedented level of control over a virtual quadcopter, just by thinking about moving their unresponsive fingers. In this video, you'll see how the study participant controlled the virtual quadcopter, using their brain's thought signals to move a virtual hand controller.
Hair styling is a crucial aspect of personal grooming, significantly influenced by the appearance of front hair. While brushing is commonly used both to detangle hair and for styling purposes, existing research primarily focuses on robotic systems for detangling hair, with limited exploration into robotic hair styling. This research presents a novel robotic system designed to automatically adjust front hairstyles, with an emphasis on path planning for root-centric strand adjustment.
[ Paper ]
Thanks, Kento!
If I’m understanding this correctly, it’s possible, with some care, to introduce chaos into a blind juggling robot to switch it from synchronized juggling to alternating juggling.
[ ETH Zurich ]
Drones with beaks? Sure, why not.
[ GRVC ]
Check out this amazing demo preview video we shot in our offices here at OLogic prior to CES 2025. OLogic built this demo robot for MediaTek to show off all kinds of cool things running on a MediaTek Genio 700 processor. The robot is a Create3 base with a custom tower (similar to a TurtleBot) using a Pumpkin Genio 700 EVK, plus a lidar and an Orbbec Gemini 335 camera. The robot is running ROS 2 Nav and finds colored balls on the floor using an NVIDIA TAO model running on the Genio 700, adding them to the map so the robot can find them. You can direct the robot through RViz to go pick up a ball and move it to wherever you want on the map.
[ OLogic ]
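If you’re curious what the “put the balls on the map” plumbing could look like in ROS 2, here’s a minimal sketch of the idea (mine, not OLogic’s actual code): a node that takes ball detections, republishes them as RViz markers, and can send a chosen ball’s pose out as a navigation goal. The topic names and message choices are assumptions.

```python
# Minimal sketch (not OLogic's code): remember detected balls in the map frame,
# show them in RViz as markers, and publish one as a Nav2-style goal pose.
import rclpy
from rclpy.node import Node
from geometry_msgs.msg import PoseStamped
from visualization_msgs.msg import Marker, MarkerArray


class BallMapper(Node):
    def __init__(self):
        super().__init__('ball_mapper')
        self.balls = []  # remembered ball poses, assumed already in the map frame
        # Topic names here are assumptions, not OLogic's real interfaces.
        self.create_subscription(PoseStamped, '/ball_detections', self.on_ball, 10)
        self.marker_pub = self.create_publisher(MarkerArray, '/ball_markers', 10)
        # Nav2's default RViz setup accepts goals on /goal_pose.
        self.goal_pub = self.create_publisher(PoseStamped, '/goal_pose', 10)

    def on_ball(self, msg: PoseStamped):
        # Store the detection and refresh the marker array RViz displays.
        self.balls.append(msg)
        markers = MarkerArray()
        for i, pose in enumerate(self.balls):
            m = Marker()
            m.header = pose.header
            m.id = i
            m.type = Marker.SPHERE
            m.action = Marker.ADD
            m.pose = pose.pose
            m.scale.x = m.scale.y = m.scale.z = 0.1
            m.color.r, m.color.a = 1.0, 1.0
            markers.markers.append(m)
        self.marker_pub.publish(markers)

    def go_to_ball(self, index: int):
        # Send a remembered ball pose as the robot's next navigation goal.
        self.goal_pub.publish(self.balls[index])


def main():
    rclpy.init()
    rclpy.spin(BallMapper())


if __name__ == '__main__':
    main()
```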
We explore the potential of multimodal large language models (LLMs) for enabling autonomous trash pickup robots to identify objects characterized as trash in complex, context-dependent scenarios. By constructing evaluation datasets with human agreement annotations, we demonstrate that LLMs excel in visually clear cases with high human consensus, while performance is lower in ambiguous cases, reflecting human uncertainty. To validate real-world applicability, we integrate GPT-4o with an open-vocabulary object detector and deploy it on a quadruped with a manipulator arm using ROS 2, showing that it is possible to use this information for autonomous trash pickup in practical settings.
[ University of Texas at Austin ]
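The interesting part here is the context dependence: the same crumpled cup might be trash on a lawn but not on a picnic table. Below is a rough sketch (mine, not the authors’ pipeline) of how a detector crop plus a scene hint might be handed to a multimodal LLM for a pick-up-or-leave decision; the prompt wording, topic of the crop, and yes/no parsing are all assumptions.

```python
# Rough sketch (not the UT Austin code): ask a multimodal LLM whether a
# detected object, given some scene context, should be picked up as trash.
# Assumes the OpenAI Python client; prompt and decision logic are made up.
import base64
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment


def is_trash(crop_jpeg_path: str, scene_hint: str) -> bool:
    with open(crop_jpeg_path, 'rb') as f:
        b64 = base64.b64encode(f.read()).decode()
    resp = client.chat.completions.create(
        model='gpt-4o',
        messages=[{
            'role': 'user',
            'content': [
                {'type': 'text',
                 'text': f'This object was seen {scene_hint}. '
                         'Should a cleanup robot pick it up as trash? '
                         'Answer only YES or NO.'},
                {'type': 'image_url',
                 'image_url': {'url': f'data:image/jpeg;base64,{b64}'}},
            ],
        }],
    )
    # Treat anything starting with YES as a pick-up decision.
    return resp.choices[0].message.content.strip().upper().startswith('YES')


# Example: the scene hint is what lets the model use context.
# print(is_trash('crop.jpg', 'lying on the grass in a park'))
```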
The post “Video Friday: Hottest On The Ice” by Evan Ackerman was published on 01/24/2025 by spectrum.ieee.org