Real-time AI can generate lyric lines for live instrumental music
Researchers at the University of Waterloo’s Natural Language Processing Lab have developed a real-time artificial intelligence (AI) system that generates lyric lines for live instrumental music. The new system, called LyricJam, went live in June 2021 and has been tested by over 1,500 users since then.
The team will present its research at the International Conference on Computational Creativity in September.
The lab is headed by Olga Vechtomova, an engineering professor cross-appointed in computer science at the university. Vechtomova has been developing AI applications for years, and the lab’s work first led to a system that learns the musical expressions of particular artists and then generates lyrics in their style.
Vechtomova and Waterloo graduate students Gaurav Sahu and Dhruv Kumar have also developed technology that analyzes different aspects of music, such as chord progressions, tempo, and instrumentation, and synthesizes lyrics that reflect the mood and emotions expressed by live music.
The neural network
While a musician or group performs instrumental music, the system continuously receives raw audio clips. A neural network then processes each clip and generates new lyric lines, which artists can use to compose their own song lyrics.
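The pipeline described above can be sketched in miniature. The following is an illustrative toy only, not the actual LyricJam implementation: it converts a raw audio buffer into a spectrogram and then maps a simple audio statistic (the spectral centroid) to words, whereas the real system conditions a trained neural language model on learned audio representations. The `spectrogram`, `generate_line`, and vocabulary names are all hypothetical.

```python
import numpy as np

def spectrogram(audio, frame_size=256, hop=128):
    """Short-time magnitude spectrogram of a mono audio buffer."""
    frames = [audio[i:i + frame_size]
              for i in range(0, len(audio) - frame_size + 1, hop)]
    window = np.hanning(frame_size)
    # rfft over each windowed frame -> (num_frames, frame_size // 2 + 1)
    return np.abs(np.fft.rfft(np.array(frames) * window, axis=1))

def generate_line(spec, vocabulary):
    """Toy stand-in for the lyric generator: keys word choice off the
    spectral centroid of the clip. A real system would feed a learned
    audio representation into a neural text generator instead."""
    freqs = np.arange(spec.shape[1])
    centroid = float((spec * freqs).sum() / (spec.sum() + 1e-9))
    idx = int(centroid) % len(vocabulary)
    return " ".join(vocabulary[(idx + k) % len(vocabulary)] for k in range(4))

# Simulate one incoming clip: a 440 Hz tone sampled at 8 kHz.
sr = 8000
clip = np.sin(2 * np.pi * 440 * np.arange(sr) / sr)
words = ["night", "river", "glass", "echo", "slow", "light", "falling", "blue"]
print(generate_line(spectrogram(clip), words))
```

In a live setting, the spectrogram-and-generate step would run repeatedly on a rolling buffer of incoming audio, so each new clip can yield fresh lines while the performance continues.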
“The purpose of the system is not to write a song for the artist,” explains Vechtomova. “Instead, we want to help artists realize their own creativity. The system generates poetic lines with new metaphors and expressions, potentially leading artists in creative directions they have never explored before.”
The newly developed neural network learns which lyrical themes and words are associated with different aspects of music, and, most impressively, it does so for each incoming audio clip in real time.
The team performed a user study in which musicians performed live music while using the system.
“An unexpected finding is that the participants felt encouraged by the generated lines to improvise,” Vechtomova said. “For example, the lines inspired the artists to structure the chords a little differently and to take their improvisation in a different direction than initially intended. Some musicians also used the lines to check whether their improvisation was having the desired emotional effect.”
Partnership with AI
Another major aspect of this research was its demonstration of collaboration and co-creativity between humans and AI. According to participants, the system acted as a non-judgmental musical partner, which allowed musicians to play unhindered. They also said they felt encouraged to keep playing even when they were not actively working on lyrics.
The new LyricJam system is the latest example of artificial intelligence making its way into creative work. Discussions of the connection between humans and AI often focus on areas like health; with advancements like this, we are also getting closer to collaborating with these machines creatively.
The LyricJam system is available to try online.