Research into human-robot cooperative piano playing shown at AI conference

A musical robot that can play the piano alongside a human, creating a harmonic accompaniment in real time, has won an award at the Centre for Human-Inspired Artificial Intelligence (CHIA) Conference 2024.

Cambridge Engineering PhD student Huijiang Wang from the Bio-Inspired Robotics Laboratory (BIRL), Department of Engineering, won the Best Demo Award for his ‘Harmony Robot’ – a custom anthropomorphic hand attached to a robot arm, with its ‘fingers’ preset into a chord-hitting pose.

An accompanying research paper, accepted for publication in the journal IEEE Transactions on Robotics, also formed part of the award win.

In the paper, titled ‘Human-robot cooperative piano playing with learning-based real-time music accompaniment’, Huijiang and his co-authors Dr Xiaoping Zhang and Professor Fumiya Iida present a collaborative robot that uses machine learning to predict appropriate piano chord progressions based on a human’s piano playing. In experiments, in which the robot learned chords inspired by pop music, the system predicted chords with a 93% accuracy rate.
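
The announcement does not detail the model behind the chord prediction. As a rough, hypothetical illustration of the idea only, the sketch below predicts the next chord from the most recent melody note using simple bigram counts; the training pairs are invented stand-ins for the pop-inspired chords mentioned above, not the researchers’ data or method.

```python
# Hypothetical sketch of next-chord prediction from a melody context.
# A simple bigram (count-based) model stands in for the paper's learned
# predictor, purely to illustrate the prediction step.
from collections import Counter, defaultdict

class ChordPredictor:
    def __init__(self):
        # counts[context][chord] = how often `chord` followed `context`
        self.counts = defaultdict(Counter)

    def train(self, examples):
        """examples: iterable of (melody_pitch_class, chord_label) pairs."""
        for context, chord in examples:
            self.counts[context][chord] += 1

    def predict(self, melody_pitch_class):
        """Return the chord most often seen after this melody note."""
        candidates = self.counts.get(melody_pitch_class)
        if not candidates:
            return "C"  # fall back to the tonic for unseen contexts
        return candidates.most_common(1)[0][0]

# Toy usage with an invented pop-style I-V-vi-IV vocabulary.
predictor = ChordPredictor()
predictor.train([(0, "C"), (7, "G"), (9, "Am"), (5, "F"), (0, "C")])
print(predictor.predict(9))  # -> "Am"
```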

Meanwhile, a behaviour-adaptive controller keeps the two players temporally synchronised, so that the collaborative robot can generate harmonic chord accompaniment for the human-played melody in real time. In effect, human and robot are playing a duet.
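
The controller’s internals are not described here. A minimal sketch of the underlying idea, assuming a simple linear phase-correction rule rather than the paper’s actual controller, shows how a robot can nudge its next note onset toward the human’s timing.

```python
# Hedged sketch of temporal synchronisation via linear phase correction
# (an assumption for illustration; not the paper's controller).

def next_robot_onset(robot_onset, human_onset, beat_period, alpha=0.5):
    """Schedule the robot's next note onset, in seconds.

    robot_onset / human_onset: the last observed onset times.
    beat_period: current beat duration.
    alpha: correction gain in (0, 1]; higher means faster adaptation.
    """
    asynchrony = robot_onset - human_onset   # > 0: robot is running late
    return robot_onset + beat_period - alpha * asynchrony

# Example: the robot was 40 ms late on the last beat at 120 BPM
# (0.5 s beats); the next onset is pulled 20 ms earlier to close the gap.
t_next = next_robot_onset(robot_onset=10.04, human_onset=10.00, beat_period=0.5)
print(round(t_next, 3))  # 10.52 rather than 10.54
```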

The robotic setup involved moving the custom anthropomorphic hand vertically via the robot arm to strike the piano keys, while a separate horizontal motion enabled chord switching, expanding the robot’s expressive capabilities in piano-playing tasks.
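
As a hypothetical sketch of this two-axis scheme (the `arm` interface and all coordinates below are invented for illustration, not BIRL’s actual API), a horizontal move selects the chord position and a vertical move presses the preset fingers onto the keys.

```python
# Illustrative two-axis motion: horizontal translation switches chords,
# vertical motion strikes the keys. Interface and offsets are assumptions.
import time

CHORD_POSITIONS_M = {"C": 0.00, "F": 0.12, "G": 0.19, "Am": 0.21}  # invented offsets

def play_chord(arm, chord, press_depth_m=0.01, hold_s=0.3):
    """Slide horizontally to the chord's position, then strike vertically."""
    arm.move_horizontal(CHORD_POSITIONS_M[chord])  # chord switching
    arm.move_vertical(-press_depth_m)              # press preset fingers down
    time.sleep(hold_s)                             # sustain the chord
    arm.move_vertical(press_depth_m)               # release

class StubArm:
    """Stand-in for a real robot-arm driver; just logs the commands."""
    def move_horizontal(self, x):
        print(f"horizontal -> {x:.2f} m")
    def move_vertical(self, dz):
        print(f"vertical   -> {dz:+.3f} m")

play_chord(StubArm(), "G")
```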

“We have developed a musical robot that can play the piano alongside a human, creating good duets in real time,” said Huijiang Wang. “Our robot ‘listens’ to the melody the human plays and uses machine learning techniques to predict the next chords, ensuring the music stays harmonious.”

He added: “By using a special control system, the robot matches its timing with the human player, making the performance synchronised. We also have a system that checks how well the robot and human are playing together, ensuring high-quality collaboration.”
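
The article does not specify how the quality of the collaboration is scored. One plausible stand-in, sketched below as an assumption, is the mean absolute onset asynchrony between matched human and robot notes, where smaller values mean tighter ensemble timing.

```python
# Sketch of one possible "how well are we playing together" check:
# mean absolute onset asynchrony over paired notes (an illustrative
# metric, not necessarily the paper's evaluation method).

def mean_asynchrony_ms(human_onsets, robot_onsets):
    """Average |human - robot| onset gap in milliseconds over paired notes."""
    gaps = [abs(h - r) * 1000.0 for h, r in zip(human_onsets, robot_onsets)]
    return sum(gaps) / len(gaps)

# Example: robot trails by 30, 10 and 20 ms -> a 20 ms average gap.
print(mean_asynchrony_ms([1.00, 1.50, 2.00], [1.03, 1.51, 2.02]))  # 20.0
```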

Future work is set to include broadening the diversity of the robot’s musical accompaniment by incorporating more chords, and improving the dexterity of the anthropomorphic hand so that its position and orientation can be adapted further, allowing it to press a wider range of chords.

Professor Iida said: “With advancements in AI, we are now aiming to push the boundaries of robotics into emotionally aware scenarios, such as artistic and entertainment performances, where robots must be capable of capturing, interpreting and responding to the emotional cues of their human counterparts.

“Developing an improvisation algorithm capable of managing intensive computational resources – while preserving chord prediction accuracy – is also essential for future endeavours in this field.”

Funded by the European Union’s MSCA-ITN ‘SMART’ project, this study is positioned within the broader field of interactive robotics.

“Our research on the Harmony Robot advances the integration of soft robotics in creative and collaborative roles,” said Huijiang. “By leveraging cutting-edge machine learning and soft robotics, we aim to achieve seamless human-robot cooperation in the future, enabling robots to respond adaptively in dynamic, real-world scenarios.”