Inside the Mind of the Cyborg Pianist

London-based pianist and Ensemble Offspring alumnus Zubin Kanga is unlocking new possibilities in music through interactions with AI and machine learning. He joined the Ensemble Offspring team for an interview about the upcoming Lumen Machine concert, taking place in Sydney on Saturday, 12 April (ACO on the Pier) and in Newcastle on Sunday, 13 April (Newcastle Conservatorium of Music). This is what he had to say about interactive visuals and VR, motion and biosensors, new hybrid instruments, and composer collaborations.

Cyborg Pianist Zubin Kanga. Photo by Raphael Neal

You’ve been described as a ‘cyborg pianist’. How do you define that term in relation to your performances?

The ‘cyborg pianist’ descriptor captures how the music I play either extends the instrument, or extends me and my body, through new technologies. More broadly, it captures the way the experience in the concert hall is being extended by technology into an immersive multimedia experience. These technologies range from hybrid keyboard instruments to new immersive audio-visual technologies, motion and bio-sensors, and AI.

I’d been playing contemporary music for many years before focusing on works that use tech. By the end of my PhD, I’d commissioned dozens of composers who had written works for me that pushed the limits of traditional piano virtuosity as well as extended techniques inside the piano, so using tech seemed like the logical next step. I commissioned a few works in 2014, and the first big project was an Australian tour, Dark Twin, in 2015, which included new works by Julian Day, Ben Carey and Cat Hope. I loved the results so much that it’s been the focus of my work ever since.

My academic and artistic careers have also worked hand in hand in this exploration of music technologies. I was a postdoc on a research project based at IRCAM in Paris (one of the first major music technology research institutions, established by Boulez), and after that held a Leverhulme Fellowship at Royal Holloway, University of London, focusing on how musicians collaborate around new technologies. I’m now a Senior Lecturer at Royal Holloway, as well as holding a UKRI Future Leaders Fellowship, which has funded my current seven-year music technology research project, Cyborg Soloists.

What kinds of technology do you incorporate in your performances? How does using sensors, AI, or interactive electronics change your relationship with the instrument?

Over the past four years of Cyborg Soloists, I’ve had the opportunity to work with many industry partners and researchers on cutting-edge music technologies. These include new and hybrid instruments (including new types of keyboards and sensors attached to the piano), glove sensor instruments like the MiMU gloves, biosensors (including EEG brain sensors), AI (from custom neural networks trained on datasets we created, to the new commercial platforms), and live audio-visuals (including holograms) interacting with these instruments and sensors.

The funding is from a prestigious UK Research and Innovation Future Leaders Fellowship – these are awarded to researchers across all disciplines (although mainly to science and engineering researchers) to allow them to develop from early-career researchers into established leaders in their field. I was awarded the original funding in 2021, and it has just been renewed, taking the project to 2028 – a total of seven years and over £1.5 million in funding, in collaboration with 26 industry partners. The project has commissioned more than 50 works and given 60 performances, including at the Paris Autumn Festival, Huddersfield Contemporary Music Festival, November Music (Netherlands), Time of Music Festival (Finland), Music Current Festival (Ireland), Transit Festival (Belgium) and Modulus Festival (Canada).

Have there been moments where technology surprised you or forced you to adapt in unexpected ways?

Using these other instruments and devices requires constant development of new skills that go far beyond traditional skills on an instrument. Besides understanding the engineering of the devices and learning many new mapping and coding platforms and languages, I also need to master the specific techniques afforded by each device. Some people spend years on just one of these new instruments, but I need to be constantly learning, mastering and juggling more than two dozen devices and software platforms.

The biggest adaptation for me as a performer was in using brain sensors. These required some hardcore programming by one of the UK’s leading brain-computer interaction experts (Serafeim Perdikis), but they also posed an artistic challenge. Any fast movement creates too much noise in the system, so I can’t play the piano while using them – instead, the piece requires me to focus on particular visuals on a hologram screen to create a response, with my choice of which area of the screen to focus on affecting both the resulting sound and the visuals. Although this seems limiting, Alexander Schubert managed to create one of the most spectacular works I’ve ever commissioned out of it, linking the sensors to a hologram screen. It is still a challenge on stage, as the piece requires extreme focus to ‘play’ the brain sensors.

Kanga performs as a member of Ensemble Offspring for Future Retro in 2015 at the Sydney Conservatorium of Music.

Can you tell us about your relationship to Ensemble Offspring and how being a part of the group impacted your musical development?

I first played with Ensemble Offspring 20 years ago (in January 2005) and performed pretty regularly with the ensemble for the next 15 years. When I first joined, I was still an undergraduate student – although I’d done a lot of work with student and staff composers, and a lot of performances of contemporary music, working with Ensemble Offspring was my first real professional experience of the contemporary music scene.

Being in the ensemble was really vital for my musical development – I learnt so much from working with this group of amazing musicians, who all had many more years of experience. I have so many great memories with the ensemble: the Australian premiere of Grisey’s Vortex Temporum in 2006, a tour to the Northern Territory where the ensemble got stuck in Darwin during a cyclone, performing with Steve Reich at the Sydney Opera House, the 20th birthday Future-Retro concerts, performing the Australian premiere of Fausto Romitelli’s Professor Bad Trip, countless premieres of amazing new works by Australian composers, and a big European tour in 2019 to Germany, the Netherlands and the UK.

What are you looking forward to most about the Lumen Machine program?

Lumen Machine is one of the major outputs of Cyborg Soloists, and features three concerti I commissioned for myself to play with Offspring. I’ve known the two Australian composers, Amanda Cole and Tristan Coelho, for years and have worked with them previously on solo or chamber works, and the concert also features one of Germany’s leading composers working with new technologies, Brigitta Muntendorf. As the soloist in these works, I play the piano, keyboards, a motion sensor ring, a microtonal keyboard, infrared sensors and lasers!

It’s been five years since my last project with Offspring, during COVID, and this will be my first concert in Australia since 2019, so it’s also a comeback concert for my work in Australia. I’m really looking forward to playing with all the Offspring musicians again, as we have such a highly developed chamber music rapport.

Could you share your experience of collaborating with the composers and insights on the repertoire?

I went to work with Brigitta Muntendorf in Cologne a few weeks ago and together we tried out the infrared sensors and lasers she’s using in her piece. Using these, it’s like I’m playing a virtual piano by moving my hands through the lasers. 

Amanda Cole’s Dream Garden uses the Lumatone, a keyboard of hexagonal keys that’s ideally suited to microtonal tuning. You can play 24-note, 32-note, 51-note or 64-note scales, or create your own bespoke scales and tunings, which is what Amanda has done. She’s been working with this instrument for several years herself, so she has developed a lot of expertise in both playing it and using it in really imaginative ways, allowing her to create ethereal and otherworldly sounds in this piece.
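
As a rough illustration of what a ‘24-note scale’ means here: in an equal division of the octave, every step is the same frequency ratio, so the pitches can be computed directly. The short sketch below is purely illustrative – the 440 Hz reference and the choice of 24 equal divisions are assumptions for the example, not details of Amanda’s bespoke tuning or the Lumatone’s mapping.

    # Illustrative sketch: pitches of one octave in an n-EDO
    # (equal-division-of-the-octave) tuning, e.g. a 24-note scale.
    # The 440 Hz reference and 24-EDO are assumptions for this example.
    def edo_frequencies(divisions: int, base_hz: float = 440.0) -> list[float]:
        """Return the pitches of one octave split into `divisions` equal steps."""
        return [base_hz * 2 ** (step / divisions) for step in range(divisions + 1)]

    for step, freq in enumerate(edo_frequencies(24)):
        print(f"step {step:2d}: {freq:8.2f} Hz")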

Tristan Coelho’s Hot Take combines his signature driving, funky, rhythmic and virtuosic writing style with the Genki Wave ring. This device fits onto one finger but provides a remarkable amount of control over the electronics. The ring triggers sounds when I use a particular finger, and it also shapes sounds through movement in the air (using a motion sensor). The result is that I’m a kind of hyper-pianist, playing the notes on the piano and keyboard as well as shaping these sounds in the air like a conductor.
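
To make the ‘shaping sounds in the air’ idea a little more concrete, the sketch below maps a tilt reading from a motion sensor onto a MIDI control-change message, one common way such gestures reach the electronics. The tilt range, controller number and use of the mido library are assumptions for the example, not the Genki Wave ring’s actual protocol or Tristan’s patch.

    # Illustrative sketch: map a motion-sensor tilt reading to a MIDI
    # control-change message. The tilt range and CC number are assumptions,
    # not the Genki Wave ring's actual protocol.
    import mido

    def tilt_to_cc(tilt_degrees: float, control: int = 1) -> mido.Message:
        """Scale a tilt of -90..+90 degrees onto the 0..127 MIDI CC range."""
        clamped = max(-90.0, min(90.0, tilt_degrees))
        value = round((clamped + 90.0) / 180.0 * 127)
        return mido.Message("control_change", control=control, value=value)

    for tilt in (-90.0, 0.0, 45.0, 90.0):
        print(tilt, tilt_to_cc(tilt))
    # To drive a synth, each message would be sent to an open MIDI port,
    # e.g. mido.open_output(port_name).send(msg).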

My own piece is a little exploration of the virtual synthesisers I’ve been experimenting with, with effects and digital arpeggiators disguising some references to well-known works from the piano repertoire (some easier to spot than others). Like a lot of my music, it uses new instruments and technologies to reflect on all the canonical music I’ve played and internalised in the past.

Why should people come along to the Lumen Machine performance? What will they get out of the experience?

Audiences really love how engaging and immersive the concerts can be. I often get audiences from outside classical music – people who are interested in film, performance art, internet art and AI. And classical audiences can still find really wonderful connections to the canon – a lot of the new works are inspired by experiments and ideas from the past (even from centuries ago) which have only now become possible with the latest technologies.


Don't miss out! Access this link now for tickets to Ensemble Offspring’s Lumen Machine!
