Berkeley researchers write programs that jam.
From autoworkers to telephone operators, many employees have lost their jobs to robots. Could session musicians be next? It’s more likely than you might think. A trio of Berkeley researchers is developing computer programs that can hold their own with the best jazz artists.
The study is part of a broader effort to better understand how machines can adapt to unforeseen changes in their environments, according to Sanjit Seshia, Associate Professor of Electrical Engineering and Computer Sciences. “If you look at the dictionary meaning of ‘improvise,’ it means you’re performing something without preparation,” says Seshia, who is working with Music and Technology professor David Wessel and postdoctoral researcher Alexandre Donzé. “Music is a very nice way to investigate these ideas.”
A human musician might learn to improvise by listening to other artists, taking note of which rhythms and riffs work well together and which ones clash, forming musical preferences. To create a program that can improvise, Seshia, Wessel, and Donzé “translate” such preferences into rules that govern pitch and rhythm, specifying what notes the computer can play next at any given time.
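To make the idea of rules that "govern pitch and rhythm" concrete, here is a minimal sketch (not the team's actual system) in which the learned preferences are a hypothetical table mapping each pitch to the pitches allowed to follow it, and improvisation is a walk through that table:

```python
import random

# Hypothetical rule table standing in for learned musical preferences:
# each MIDI pitch maps to the pitches allowed to follow it.
NEXT_ALLOWED = {
    60: [62, 64],
    62: [60, 64, 65],
    64: [62, 65],
    65: [64, 67],
    67: [65],
}

def generate(start, length, seed=None):
    """Walk the rule table, always choosing an allowed next note."""
    rng = random.Random(seed)
    notes = [start]
    for _ in range(length - 1):
        notes.append(rng.choice(NEXT_ALLOWED[notes[-1]]))
    return notes

solo = generate(start=60, length=8, seed=1)
```

Every transition in the output obeys the table, so the result is random yet never "clashes" by the rules' own definition.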
To learn these rules, the program reads a form of digital sheet music, then generates a database of musical fragments, called sub-phrases, that make up longer melodies. The program builds a repertoire from these sub-phrases (the team prefers the traditional jazz-cat term “licks”), which it can then string together in various combinations much as a human player would.
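The extract-and-recombine step can be sketched as follows; this is an illustrative simplification, with the melody given as hypothetical MIDI note numbers rather than the digital sheet music the program actually reads:

```python
import random

def extract_licks(melody, length=4):
    """Slide a window over a note sequence to collect sub-phrases ('licks')."""
    return [tuple(melody[i:i + length]) for i in range(len(melody) - length + 1)]

def improvise(licks, n_licks=4, seed=None):
    """String randomly chosen licks together into a new melody."""
    rng = random.Random(seed)
    phrase = []
    for _ in range(n_licks):
        phrase.extend(rng.choice(licks))
    return phrase

# Toy source melody as MIDI note numbers (a C-major fragment).
melody = [60, 62, 64, 65, 67, 65, 64, 62, 60]
licks = extract_licks(melody)   # 6 overlapping four-note licks
solo = improvise(licks, seed=1)  # 16 notes built from those licks
```

Because every lick comes from the source material, the recombined phrase stays in the original's vocabulary even though its ordering is new.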
It’s not just a theoretical exercise. This summer, the team fine-tuned a program to improvise along with Duke Ellington’s jazz standard, “It Don’t Mean a Thing.” Donzé said the result “sounds good” (though it’s perhaps too early for an encore performance). The team can explore different styles by controlling how far an improvisation can stray from the original. The more it strays, the more unconventional the composition. A piece of music that stays within musical boundaries while keeping its distance from the starting point hits the sweet spot, according to Wessel: a familiar but creative result.
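One way to picture the "how far it can stray" dial is as a bound on how many notes a substituted lick may change, segment by segment; this is a hypothetical sketch of that idea, not the team's method:

```python
import random

def constrained_improvise(original, licks, max_dist, seed=None):
    """Rebuild the tune segment by segment, substituting only licks that
    differ from the original segment in at most max_dist notes.
    A larger max_dist permits a more unconventional result."""
    rng = random.Random(seed)
    L = len(licks[0])
    out = []
    for i in range(0, len(original) - L + 1, L):
        segment = original[i:i + L]
        options = [lick for lick in licks
                   if sum(a != b for a, b in zip(segment, lick)) <= max_dist]
        out.extend(rng.choice(options) if options else segment)
    return out

# Toy tune and lick database (MIDI note numbers).
tune = [60, 62, 64, 65, 67, 65, 64, 62]
licks = [tuple(tune[i:i + 4]) for i in range(len(tune) - 4 + 1)]
```

With `max_dist=0` the program can only reproduce the original; raising the bound widens the space of acceptable variations, matching the sweet spot Wessel describes.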
The study is one of the many activities of the TerraSwarm Research Center, an interdisciplinary collaboration among nine universities and Intel. Funded by corporate donors and DARPA, TerraSwarm aims to develop novel technologies for a future in which the world is filled with trillions of remote sensors that generate information.
It’s in this kind of world—one awash in data that must be managed—where machines will need to think on their feet. Wessel is optimistic about the potential impact of the team’s research. He and Seshia both see their work someday used in a highway control system that tackles traffic jams or accidents without human input, a daunting task given the variables involved. For now, the humble program remains in a Berkeley lab, working on its groove.