

Watch University of Cambridge's 3D-printed robotic hand as it plays Jingle Bells




Scientists have developed a 3D-printed robotic hand which can play simple musical phrases on the piano by just moving its wrist.

And while the robot is no virtuoso, it demonstrates just how challenging it is to replicate all the abilities of a human hand, and how much complex movement can still be achieved through design.

University of Cambridge robot plays the piano. Picture: Department of Engineering

The robot hand, developed by researchers at the University of Cambridge, was made by 3D-printing soft and rigid materials together to replicate all of the bones and ligaments – but not the muscles or tendons – in a human hand. Even though this limited the robot hand’s range of motion compared to a human hand, the researchers found that a surprisingly wide range of movement was still possible by relying on the hand’s mechanical design.

Using this ‘passive’ mechanical movement – in which the fingers cannot move independently – the robot was able to mimic different styles of piano playing without changing the material or mechanical properties of the hand. The results, reported in the journal Science Robotics, could help inform the design of robots that are capable of more natural movement with minimal energy use.
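
To make the idea of passive mechanics concrete, here is a minimal sketch – a hypothetical model with illustrative parameters, not the researchers’ actual design – of a finger joint that has no actuator of its own and simply behaves as a damped torsional spring dragged along by the wrist:

```python
# Minimal sketch of 'passive' joint dynamics: a finger modelled as a damped
# torsional spring that is driven only by wrist motion, with no actuator of
# its own. All parameters are illustrative, not taken from the paper.
import math

K = 4.0      # stiffness of the printed 'ligament' (N·m/rad)
C = 0.5      # damping of the soft material (N·m·s/rad)
I = 0.01     # moment of inertia of the finger segment (kg·m²)
DT = 0.001   # integration time step (s)

def simulate(wrist_angle, duration=2.0):
    """Integrate the finger angle while the wrist follows wrist_angle(t)."""
    theta, omega = 0.0, 0.0          # finger joint angle and velocity
    trace = []
    for n in range(int(duration / DT)):
        t = n * DT
        # The only 'control input' is the wrist pose: the spring pulls the
        # finger toward the current wrist angle, mimicking passive coupling.
        torque = -K * (theta - wrist_angle(t)) - C * omega
        omega += (torque / I) * DT
        theta += omega * DT
        trace.append((t, theta))
    return trace

# The same mechanism responds very differently to slow and fast wrist motion,
# which is how varied behaviour emerges without changing the hand itself.
slow = simulate(lambda t: 0.3 * math.sin(2 * math.pi * 1.0 * t))
fast = simulate(lambda t: 0.3 * math.sin(2 * math.pi * 6.0 * t))
print(f"peak finger angle, slow wrist: {max(th for _, th in slow):.3f} rad")
print(f"peak finger angle, fast wrist: {max(th for _, th in fast):.3f} rad")
```

The point of the sketch is only that a single driven input, filtered through fixed springs and dampers, already yields a range of finger behaviour without any per-finger control.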

PhD student Josie Hughes, from the Department of Engineering and the paper’s first author, says: “We can use passivity to achieve a wide range of movement in robots: walking, swimming or flying, for example.

“Smart mechanical design enables us to achieve the maximum range of movement with minimal control costs: we wanted to see just how much movement we could get with mechanics alone.

“Piano playing is an ideal test for these passive systems, as it’s a complex and nuanced challenge requiring a significant range of behaviours in order to achieve different playing styles.”

By altering the movement of the robot’s wrist, the researchers were able to programme it to play short musical phrases with clipped (staccato) or smooth (legato) notes.
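
As a rough illustration of how a single wrist signal could encode articulation – a hypothetical sketch, not the published control code – the profile below presses each key for a long or short fraction of its beat:

```python
# Illustrative sketch (not the authors' code): encoding articulation in one
# wrist trajectory. Each note is a downward wrist 'dip'; staccato uses short
# dips separated by gaps, legato holds the key for almost the whole beat.
def wrist_profile(notes, style, dt=0.01):
    """Return (time, wrist_height) samples for a sequence of note lengths."""
    duty = 0.4 if style == "staccato" else 0.95  # fraction of beat key is held
    samples, t = [], 0.0
    for beat in notes:                           # beat = note length in seconds
        hold = beat * duty
        for n in range(int(beat / dt)):
            phase = n * dt
            height = -1.0 if phase < hold else 0.0   # -1.0 means key pressed
            samples.append((t + phase, height))
        t += beat
    return samples

def key_down_fraction(profile):
    return sum(1 for _, h in profile if h < 0) / len(profile)

# Opening rhythm of 'Jingle Bells': same notes, two articulations.
jingle = [0.4, 0.4, 0.8, 0.4, 0.4, 0.8]          # note lengths in seconds
print(f"key-down fraction, staccato: {key_down_fraction(wrist_profile(jingle, 'staccato')):.2f}")
print(f"key-down fraction, legato:   {key_down_fraction(wrist_profile(jingle, 'legato')):.2f}")
```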

“It’s just the basics at this point, but even with this single movement, we can still get quite complex and nuanced behaviour,” adds Josie.

“We can extend this research to investigate how we can achieve even more complex manipulation tasks: developing robots which can perform medical procedures or handle fragile objects, for instance.

“This approach also reduces the amount of machine learning required to control the hand; by developing mechanical systems with intelligence built in, it makes control much easier for robots to learn.”

University of Cambridge robot plays the piano. Picture: Department of Engineering

Dr Fumiya Iida, who led the research funded by the Engineering and Physical Sciences Research Council (EPSRC), says: “The basic motivation of this project is to understand embodied intelligence, that is, the intelligence in our mechanical body.

“Our bodies consist of smart mechanical designs such as bones, ligaments, and skins that help us behave intelligently even without active brain-led control.

“By using state-of-the-art 3D printing technology to print human-like soft hands, we are now able to explore the importance of physical designs, in isolation from active control, which is impossible to do with human piano players as the brain cannot be ‘switched off’ like our robot.”

Piano playing is an example of Moravec’s paradox, which states that tasks that are easy for humans are difficult for robots and vice versa.

“Robots can go all the way to Mars, but they can’t pick up the groceries,” Dr Iida points out.

At Amazon’s enormous warehouses, robots have been busy this Christmas picking up and moving huge shelves that weigh more than 1,000kg when an order comes in, taking them to an area where the required item is removed and placed in a plastic bin, ready for packing and sending. Removing the item from the shelf is the challenging part.

“An Amazon order could be anything from a pillow, to a book, to a hat, to a bicycle,” says Dr Iida. “For a human, it’s generally easy to pick up an item without dropping or crushing it – we instinctively know how much force to use. But this is really difficult for a robot.”

The ‘last metre’ problem is something that vexes industry - but Dr Iida's lab is working on solutions.

“It’s the frontier in robotics, because so many things we do in our lives, from cooking to care to picking things up, are last metre problems, and that last metre is the barrier to robots really being able to help humanity,” says Dr Iida.

The lab is working with British Airways to help complete the automation of baggage handling, a process which is currently automated only up to the point where suitcases of different shapes, sizes and weights need to be loaded onto the aircraft.

And it has spent two summers working with fruit and vegetable group G’s Growers, based near Ely, to design robots that can harvest lettuces without destroying them.

As we solve these problems, the issue of human-robot interaction becomes key.

Dr Hatice Gunes, of Cambridge’s Department of Computer Science and Technology, has just completed a three-year EPSRC-funded project that brought together computer vision, machine learning, public engagement, performance and psychology.

She says: “Robots are not sensitive to emotions or personality, but personality is the glue in terms of how we behave and interact with each other.

“So how do we improve the way in which robots and humans understand one another in a social setting?”

Dr Gunes’ project explored the use of artificial emotional intelligence - robots that could express emotions, read cues and respond appropriately.

Computer vision techniques were used to help robots recognise emotion, micro-expressions and human personalities, and a robot was programmed to be either introverted or extroverted.
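
As a toy sketch of that idea – the RobotPersona class below is invented for illustration and is not the project’s software – the same recognised emotion can trigger very different behaviour depending on the programmed personality:

```python
# Toy sketch (hypothetical, not the project's system): conditioning a robot's
# response style on a programmed personality, given a recognised user emotion.
from dataclasses import dataclass

@dataclass
class RobotPersona:
    extroverted: bool

    def respond(self, user_emotion: str) -> dict:
        """Map a recognised emotion to speech volume, gesture size and phrasing."""
        if self.extroverted:
            return {"volume": 0.9, "gesture_scale": 1.0,
                    "utterance": f"You seem {user_emotion}! Tell me more!"}
        return {"volume": 0.4, "gesture_scale": 0.3,
                "utterance": f"I sense you may be feeling {user_emotion}."}

# In a real system the emotion label would come from a vision pipeline
# (face detection plus an expression classifier); here it is hard-coded.
for persona in (RobotPersona(extroverted=True), RobotPersona(extroverted=False)):
    print(persona.respond("happy"))
```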

Dr Gunes says: “A robot that can adapt to a human’s personality is more engaging, but the way humans interact with robots is also highly influenced by the situation, the physicality of the robot and the task at hand.

“For me, it was more interesting to observe the people rather than to showcase what we’re doing, mostly because people don’t really understand the abilities of these robots.

“But as robots become more available, hopefully they’ll become demystified.”

Dr Gunes now intends to focus on the potential of robots and virtual reality technology for applications such as coaching and cognitive training.

Meanwhile, Josie Hughes, fellow PhD students Luca Scimeca and Kieran Gilday, and undergraduate Soham Garg – from the Department of Engineering’s Biologically Inspired Robotics Lab – won a special innovation award at the World Robot Challenge in Tokyo.

The team, called CambridgeARM, competed against 16 groups from around the world and came second in the Kitting and Adaptive Assembly challenges, part of the Industrial Robotics Category.

Their robot was required to perform quick and accurate assembly of technical components used in industrial products and other goods. CambridgeARM used grease so that Allen keys could hold bolts temporarily, while putty was used to pick up single items, such as a washer, from a box of parts.


