Four Essential Steps to Improve Robot Safety at Festivals: How to Prevent Near Misses Like the Recent Incident in China

Humanoid robots are supposed to be our loyal assistants, but we saw another side to them the other day. Chinese robot manufacturer Unitree was demonstrating its latest H1 robots at a lantern festival in the city of Taishan, Guangdong province, when one walked up to the crowd barrier and seemed to lunge at an elderly woman, nearly headbutting her.

The incident quickly went viral and sparked a fierce debate about whether the robot actually attacked the woman or simply tripped. Mostly overlooked in that debate is that we’re a long way from having robots that could intentionally attack someone – machines like these are often remote controlled – but the danger to the public is clearly real enough.

With sales of humanoid robots set to skyrocket over the next decade, the public will increasingly be at risk from these kinds of incidents. In our view as robotics researchers, governments have put very little thought into the risks.

Here are some urgent steps that they should take to make humanoid robots as safe as possible.

1. Increase owner requirements

The first important issue is to what extent humanoid robots will be controlled by users. Whereas Tesla’s Optimus can be remotely operated by people in a control centre, others such as the Unitree H1s are controlled by the user with a handheld joystick.

Currently on sale for around £90,000, they come with a software development kit with which you can build your own artificial intelligence (AI) system, though only to a limited extent. For example, it could make the robot say a sentence or recognise a face, but not take your kids to school.

Who is to blame if someone gets hurt or even killed by a human-controlled robot? It’s hard to know for sure – any discussion about liability would first involve proving whether the harm was caused by human error or a mechanical malfunction.

This came up in a Florida case where a widower sued medical robot-maker Intuitive Surgical Inc over his wife’s death in 2022. Her death was linked to a heat burn to her intestine, caused by a fault in one of the company’s machines during an operation.

The case was dropped in 2024 after being partially dismissed by a district judge. But the fact that the widower sued the manufacturer rather than the medics demonstrated that the robotics industry needs a legal framework for preventing such situations as much as the public does.

While for drones there are aviation laws and other restrictions to govern their use in public areas, there are no specific laws for walking robots.

So far, the only place to have put forward governance guidelines is the Chinese city of Shanghai. Published in summer 2024, these include stipulating that robots must not threaten human security, and that manufacturers must train users on how to use these machines ethically.

Drone regulations are an indication of where the robot equivalent needs to go.

For robots controlled by owners, in the UK there is currently nothing preventing someone from taking a robot dog out for a stroll in a busy park, or a humanoid robot to the pub for a pint.

As a starting point, we could ban people from controlling robots under the influence of alcohol or drugs, or when they are otherwise distracted such as using their phones. Their use could also be restricted in risky environments such as confined spaces with lots of members of the public, places with fire or chemical hazards, and the roofs of buildings.
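In software terms, restrictions like these could be enforced as a pre-operation check before a robot accepts any commands. The sketch below is purely illustrative: the conditions and the set of risky environments are our assumptions, not requirements from any existing law.

```python
# Illustrative pre-operation check for a remote-controlled robot.
# Each rule encodes one of the proposed restrictions above; none of
# these conditions is mandated by any current regulation.
def may_operate(operator_sober: bool,
                operator_attentive: bool,
                environment: str) -> bool:
    """Return True only if every proposed safety condition is met."""
    risky_environments = {"crowded_confined_space", "fire_hazard",
                          "chemical_hazard", "rooftop"}
    if not operator_sober:          # no operation under the influence
        return False
    if not operator_attentive:      # no operation while distracted
        return False
    if environment in risky_environments:
        return False
    return True
```

A real system would need reliable ways to establish these inputs (for instance, attention monitoring, discussed under step 3), but the gating logic itself can be this simple.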

2. Improve design

Robots that look sleek and can dance and flip are fun to watch, but how safe are their audiences? Safe designs would consider everything from reducing cavities where fingers could get caught to waterproofing internal components.

Protective barriers or exoskeletons could further reduce unintended contact, while cushioning mechanisms could reduce the effect of an impact.

Robots should be designed to signal their intent through lights, sounds and gestures. For example, they should arguably make a noise when entering a room so as not to surprise anyone.

Even drones can alert their user if they lose signal or battery and need to return home, and such mechanisms should also be built into walking robots. There are no legal requirements for any such features at present.
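A drone-style failsafe of this kind could be sketched as a watchdog that checks battery and link status on every tick and escalates to an alert or a return-home command. All names and thresholds below are hypothetical; real robot SDKs differ.

```python
# Hypothetical safety watchdog for a walking robot: checks battery and
# radio link each tick and escalates, mirroring drone-style failsafes.
from dataclasses import dataclass

@dataclass
class RobotStatus:
    battery_pct: float  # remaining battery, 0-100
    link_ok: bool       # True while the control link is alive

def watchdog_action(status: RobotStatus) -> str:
    """Decide what the robot should do given its current status."""
    if not status.link_ok:
        return "return_home"     # lost signal: go back autonomously
    if status.battery_pct < 10:
        return "return_home"     # critically low battery
    if status.battery_pct < 25:
        return "alert_operator"  # warn early, like a blinking drone LED
    return "continue"
```

The point is not the specific thresholds but that the escalation logic is explicit and testable, which is what a safety standard could require manufacturers to demonstrate.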

A robot opening a door: ‘I am now exiting the room.’

It’s not that manufacturers are entirely ignoring these issues for walking robots. Unitree’s quadruped Go2, for instance, blinks and beeps when the battery is low or if it is overheating.

It also has automatic emergency cut-offs in these situations, although they must be triggered by a remote operator when the robot is in “telemetric mode”. Crucially, however, there are no clear regulations to ensure that all manufacturers meet a certain safety standard.

3. Train operators

Clearly there will be dangers with robots using AI features, but remote-operated models could be even more dangerous. Mistakes could result from operators’ lack of training and real-world experience.

There appears to be a major skills gap in operator training, and robotics companies will need to prioritise this to ensure operators can control machines efficiently and safely.

In addition, humans can have delayed reaction times and limited concentration, so we also need systems that can monitor the attention of robot operators and alert them to prevent accidents. This would be similar to the HGV-driver distraction-detection systems that were installed in vehicles in London in 2024.
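A monitoring system of the kind described might, in its simplest form, time how long an operator has gone without giving any control input and raise an alert past a threshold. The sketch below is illustrative only; the class name and timeout are our assumptions.

```python
# Minimal operator-attention monitor: raises an alert if no control
# input has been seen for longer than a threshold (in seconds).
class AttentionMonitor:
    def __init__(self, timeout_s: float = 3.0):
        self.timeout_s = timeout_s
        self.last_input_t = 0.0

    def record_input(self, t: float) -> None:
        """Call whenever the operator moves the joystick or controls."""
        self.last_input_t = t

    def check(self, now: float) -> str:
        """Return 'ok' or 'alert' depending on how long the operator
        has been idle."""
        idle = now - self.last_input_t
        return "alert" if idle > self.timeout_s else "ok"
```

Production systems, like the HGV distraction detectors mentioned above, typically add camera-based gaze tracking, but an input-idle timer is the simplest building block.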

4. Educate the public

The incident in China has highlighted current misconceptions about humanoid robots, with much of the media once again blaming AI even though AI was not the issue. This risks causing widespread mistrust and confusion among the public.

If people understand to what extent walking robots are owner-operated or remote-operated, it will change their expectations about what the robot might do, and make everyone safer as a result.

Also, understanding the owner’s level of control is vital for managing buyers’ expectations and forewarning them about how much they’ll need to learn about operating and programming a robot before they buy one.

The post “A robot nearly headbutted a festival spectator in China – here are four urgent steps to make the tech safer” by Carl Strathearn, Lecturer in Computer Science, Edinburgh Napier University was published on 02/27/2025 by theconversation.com