
‘Walking looks natural, but it’s a very complicated thing for robots’: Dr Carlos Mastalli, National Robotarium

Joseph Flaig

Dr Carlos Mastalli with the Nadia humanoid robot

Dr Carlos Mastalli wants to “enable robots to move everywhere”. As humanoid robots mature and enter commercial applications, that is starting to happen – but many challenges remain.

This article is part of a special series on humanoid robots, which also includes a feature on how they are stepping out of research into commercial applications and a Q&A with Sanctuary AI CEO Geordie Rose.

We spoke to the head of the Robot Motor Intelligence (RoMI) lab at the National Robotarium in Edinburgh about how those challenges could be solved, and how widely the resulting machines could be used.

What are some of the most significant challenges that need to be solved for robots to be able to move everywhere?

We need robots that can make decisions quickly, reliably and robustly in an environment that is uncertain and, more importantly, in very, very short times – making a decision on the order of milliseconds. That makes this problem very challenging.

There are a lot of uncertainties. For example, when the robot is perceiving its environment, we know that the measurements – what it perceives – are not completely perfect. The sensors have limitations, there's a lot of uncertainty…

When you make a decision about how to move, one of the fundamental problems is when to break or make a contact. The reason it's one of the most fundamental and difficult problems to solve is that there are discontinuities in the dynamics, and the whole process of deciding where and when to break the contact is very complicated from a mathematical perspective.
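
To see why making and breaking contact is mathematically awkward, here is a minimal sketch (not Dr Mastalli's code) of a one-dimensional mass whose dynamics switch abruptly between a flight mode and a contact mode. The stiffness and damping values are arbitrary assumptions, chosen only for illustration.

```python
import numpy as np

# Illustrative only: a 1D point mass that is either in flight (ballistic
# dynamics) or in contact with the ground (spring-damper). The right-hand
# side of the dynamics changes discontinuously when contact is made or
# broken, which is what makes planning footsteps mathematically hard.

M, G = 10.0, 9.81          # mass [kg], gravity [m/s^2]
K, D = 5000.0, 50.0        # assumed ground stiffness and damping

def dynamics(height, velocity):
    """Acceleration of the mass; the expression switches with the contact mode."""
    if height > 0.0:                      # flight mode: gravity only
        return -G
    # contact mode: a ground reaction force appears abruptly
    ground_force = -K * height - D * velocity
    return (ground_force - M * G) / M

# Integrate with a simple Euler scheme to show the mode switches.
h, v, dt = 0.5, 0.0, 1e-3
for step in range(2000):
    a = dynamics(h, v)
    v += a * dt
    h += v * dt
print(f"final height: {h:.3f} m, final velocity: {v:.3f} m/s")
```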

You need to do all this decision making within a few milliseconds, and you need to do it considering your partial information, observations of the environment, delays and uncertainties.

The environment is dynamic as well. When, for example, you are hiking in the mountains, you might decide where to make a footstep. You can step on a rock, that rock can roll, and that might lead to a situation that you cannot recover from – you cannot avoid falling down. These are the types of information – semantic information – that are very complicated for a robot to understand. But they are important for being robust and reliable, and for making the right decision within three milliseconds.

What is easy for us can be hard for a robot, and the other way around – what is hard for us is easy for robots and computers. Something like walking looks natural, but actually it is a very complicated thing. And that is one of the reasons why robots cannot yet do as many things autonomously as we will need them to.

What are some advances in the last 10 years that have enabled much quicker decisions?

A lot of the ground-breaking work being done in robotics worldwide… is based on research from a few decades back. Examples are ideas like reinforcement learning and machine learning, but also algorithms for perception and for control, such as model predictive control. All of these are well-known areas – but you're completely right about these quick computations.

There have been advances in hardware – better GPUs, better CPUs, etcetera. But to be able to exploit this hardware we need to take these fundamental ideas developed decades ago and scale them up. A lot of our research is about this. Scaling things up, most of the time, is not trivial… this is a very challenging task from an engineering perspective, but also a scientific one.
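
The "quick computation" constraint he describes is the defining feature of model predictive control: re-solve a short-horizon optimisation at every control step, within a hard time budget, and apply only the first action before re-planning. The toy loop below sketches that structure for a one-dimensional system; the model, cost and random-shooting solver are illustrative placeholders, not the algorithms used at the RoMI lab.

```python
import time
import numpy as np

# Illustrative receding-horizon (MPC) loop for a toy 1D double integrator,
# x = [position, velocity]. At every control step we re-solve a short-horizon
# problem under a time budget and apply only the first action.

DT, HORIZON, SAMPLES = 0.01, 20, 128   # 10 ms steps, 0.2 s lookahead
BUDGET_S = 0.005                       # pretend we must answer within 5 ms

def rollout_cost(x0, actions):
    """Simulate a candidate action sequence and return its accumulated cost."""
    x = x0.copy()
    cost = 0.0
    for u in actions:
        x[0] += DT * x[1]
        x[1] += DT * u
        cost += x[0] ** 2 + 0.1 * x[1] ** 2 + 0.01 * u ** 2
    return cost

def solve_mpc(x0, rng):
    """Random-shooting solver: cheap and easy to stop when the budget runs out."""
    t_start = time.perf_counter()
    best_u, best_cost = 0.0, np.inf
    while time.perf_counter() - t_start < BUDGET_S:
        candidates = rng.uniform(-1.0, 1.0, size=(SAMPLES, HORIZON))
        costs = [rollout_cost(x0, c) for c in candidates]
        i = int(np.argmin(costs))
        if costs[i] < best_cost:
            best_cost, best_u = costs[i], candidates[i, 0]
    return best_u   # apply only the first action, then re-plan next step

rng = np.random.default_rng(0)
x = np.array([1.0, 0.0])               # start 1 m away from the target
for _ in range(100):
    u = solve_mpc(x, rng)
    x[0] += DT * x[1]
    x[1] += DT * u
print(f"position after 1 s of control: {x[0]:.3f} m")
```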

From a biomechanical point of view, humanoid forms have the obvious advantage of being well-suited to the environment. In your mathematical approach, do you find those same advantages?

The idea that a human shape is the best, because environments have been developed by humans, is one of the hypotheses used by many in the community… But you can also frame this in a more rigorous way. That's the problem we call co-design.

So we design tasks and robot morphologies all together in an optimisation process… you want to really quantify what is the best robot you need to do [a task in a particular way].
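
As a rough illustration of the co-design idea – searching over the robot's morphology and its behaviour in a single optimisation – the sketch below jointly picks a link length and an actuation effort for a toy reaching task. The model and cost terms are invented purely for illustration.

```python
import numpy as np

# Toy co-design sketch: jointly search over a morphology parameter (link
# length L) and a control parameter (actuation effort u) so that a simple
# reaching task succeeds at minimum cost. Real co-design couples full
# dynamics, task constraints and morphology in one optimisation.

TARGET_REACH = 0.8                        # task requirement [m] (assumed)
lengths = np.linspace(0.3, 1.2, 50)       # candidate link lengths [m]
efforts = np.linspace(0.1, 5.0, 50)       # candidate joint efforts [arbitrary]

def task_cost(L, u):
    """Penalise failing the task, heavy limbs and large actuation effort."""
    reach = L * np.tanh(u)                # crude model: effort saturates reach
    task_error = max(0.0, TARGET_REACH - reach)
    mass_penalty = 0.5 * L ** 2           # longer links are heavier
    return 100.0 * task_error + mass_penalty + 0.2 * u

cost, L_opt, u_opt = min((task_cost(L, u), L, u) for L in lengths for u in efforts)
print(f"best design: link length {L_opt:.2f} m, effort {u_opt:.2f}, cost {cost:.2f}")
```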

Does the integration of more senses, such as tactile information or thermal imaging, offer another way to make more useful machines?

When you touch my arm, I don't have to see – I know you’re touching me there, and you’re pushing me. I can use that information. My brain can use that information, together with other sensing capabilities, to make the best decision.

Better sensing gives us a better understanding of the world, so we can make better decisions. But we also need algorithms that can make this decision process better as well, so it's not a trivial solution.
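
One simple way to picture this kind of multi-sensor decision making is inverse-variance weighting, where each sensor's estimate is trusted in proportion to how certain it is. The sketch below fuses a hypothetical vision-based force estimate with a tactile one; the numbers and sensor models are assumptions, not any particular robot's pipeline.

```python
# Minimal sensor-fusion sketch (illustrative only): combine two noisy
# estimates of the same quantity - say, a contact force inferred from vision
# and one measured by a tactile sensor - by weighting each with the inverse
# of its assumed variance (a one-step Kalman-style fusion).

def fuse(estimate_a, var_a, estimate_b, var_b):
    """Inverse-variance weighted fusion of two scalar estimates."""
    w_a, w_b = 1.0 / var_a, 1.0 / var_b
    fused = (w_a * estimate_a + w_b * estimate_b) / (w_a + w_b)
    fused_var = 1.0 / (w_a + w_b)
    return fused, fused_var

# Vision guesses 12 N with large uncertainty; the tactile sensor says 8 N
# with much smaller uncertainty, so the fused estimate leans towards touch.
force, variance = fuse(12.0, 4.0, 8.0, 0.5)
print(f"fused contact force: {force:.2f} N (variance {variance:.2f})")
```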

Machine learning can help, definitely. It’s part of the solution, but I don't embrace the idea that it's the entire solution. That's what we are trying to look at, how to build a complex system.

What about augmenting humanoid robots with non-biological features, such as wheels?

Yes, definitely. That's why I don't exactly embrace the idea of bioinspiration. 150 years ago, someone might have looked to horses for inspiration when designing a vehicle – but actually wheeled vehicles are far more practical, and are now our primary mode of transportation. So nature is a great source of inspiration, but it’s not necessarily what we need to do.

It's the same with a robot that has limbs and wheels. I see much more application for those robots than for a humanoid right now, because in a lot of industrial environments, like Amazon warehousing, they are all that you need. [Using both] will save energy, it will allow the robot to move faster through corridors, and it will make the task more efficient.

How widespread could the use of humanoid robots be in 10 years?

I think there's a lot of potential in healthcare, but… we need a robot that is not just capable of doing the task, but can do it safely and can interact pleasantly with people. So I see that possibility, but not within the next 10 years.

In the next 10 years, I see humanoid robots… in the manufacturing industry, robots in the construction industry, in the energy sector inspecting things, doing things like checking valves and doing maintenance on offshore platforms. And robots doing warehouse work, and last-mile deliveries as well.


