Seated on steeply inclined risers and packed in like sardines, a community of concertgoers gazes down expectantly, if less than comfortably, on a floor-level stage.
Its wide expanse is eerily dark, save for the faint outline of unmanned musical instruments, a bank of digital hardware and a large movie screen on which the partial image of an otherworldly molten sphere brightly glows.
Suddenly, the sphere begins to rise and sounds of arpeggiated chords emerge as if from nowhere. Each chord resonates throughout the concert hall, a big black box of a space. And as it does, the image pulsates.
Then, to a burst of applause, a compact, white-clad figure lithely crosses the stage, parks himself before the vibraphone and starts striking the tone bars with a touch so dazzlingly deft that, with each ringing note, the hitherto bloodless chords seem infused with a life force.
That scenario played out on Feb. 18 in the Victoria Theater, the Apollo Theater’s brand-new venue just down 125th Street from its main stage in Harlem. And it only hinted at the multi-modal, multi-sensory experience to come.
The scenario’s author, Stefon Harris — vibraphonist, visionary, veritable force of nature — was soon joined by the members of his close-knit band, Blackout, with Christian Sands on piano, James Francies on keys, Dezron Douglas on bass, Terreon Gully on drums and Casey Benjamin on saxophones and vocals. Working in harmony with the chords and images — both computer-generated — they fashioned a singularly empathetic testimonial to collective invention, the first in a series of similar creations that, over the course of the evening, introduced a potentially transformative tool to the jazz arsenal: artificial intelligence.
To be sure, Blackout, after 20 years on the scene, did not need AI to demonstrate its ability to honor and update jazz tradition. But when activated — via an app trained on musical data that Harris painstakingly created over the course of a decade — the colors, contours and, ultimately, very nature of the onstage interplay suggested a measure of true innovation.
“I’ve actually created a new instrument,” said the 50-year-old Harris in a post-performance interview.
The app, which Harris has dubbed Harmony Cloud, employs technology similar to that which powers ChatGPT. It was deployed onstage by the musicians, each of whom, armed with an iPad loaded with seemingly every chord known to Western harmony, could, with a touch of his screen, trigger a new harmonic pathway — one that was unpredictable yet thoroughly rational in its adherence to principles of tonal centers and voice leading.
The pathway would appear as chord symbols on all the players’ iPads, so that every player was informed and could make musical choices accordingly. The triggering musician could also dictate tempo and rate of arpeggiation. And, unless otherwise directed, any musician could refresh the screen and take control.
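In software terms, what the musicians describe amounts to a piece of shared state that any player may overwrite. The following is only a minimal sketch of that scheme, assuming nothing about Harmony Cloud’s unpublished internals: the names (Session, trigger_pathway), the toy chord list and the random chord choice are hypothetical stand-ins for the app’s actual tonal-center and voice-leading logic.

```python
import random
from dataclasses import dataclass, field
from typing import Optional

# Toy chord vocabulary; the real app reportedly holds "seemingly every
# chord known to Western harmony."
CHORDS = ["Cmaj7", "Dm7", "Em7", "Fmaj7", "G7", "Am7", "Bm7b5"]

@dataclass
class Session:
    """Hypothetical shared state mirrored on every player's iPad."""
    progression: list = field(default_factory=list)
    tempo_bpm: int = 120
    arpeggiation_rate: int = 4           # arpeggiated notes per beat
    controller: Optional[str] = None     # whoever triggered last holds control

    def trigger_pathway(self, player: str, length: int = 4) -> list:
        """Any player may take control and generate a new harmonic pathway.

        The real engine constrains its choices by tonal center and voice
        leading; this toy version just samples the chord vocabulary.
        """
        self.controller = player
        self.progression = random.sample(CHORDS, k=length)
        self.broadcast()
        return self.progression

    def set_feel(self, player: str, tempo_bpm: int, arpeggiation_rate: int) -> None:
        """Only the triggering musician dictates tempo and arpeggiation rate."""
        if player != self.controller:
            raise PermissionError(f"{player} does not hold control")
        self.tempo_bpm = tempo_bpm
        self.arpeggiation_rate = arpeggiation_rate

    def broadcast(self) -> None:
        """Stand-in for pushing the chord symbols to every screen onstage."""
        print(f"[all screens] {' | '.join(self.progression)} "
              f"({self.tempo_bpm} bpm, controller: {self.controller})")

session = Session()
session.trigger_pathway("pianist")
session.set_feel("pianist", tempo_bpm=96, arpeggiation_rate=3)
session.trigger_pathway("keyboardist")   # control passes with the trigger
```

The one behavior the sketch preserves from the article’s account is the hand-off: whoever triggers last becomes the controller, only the controller sets tempo and arpeggiation, and any player may seize control by triggering anew.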
While the concert’s opening improvisation was a free-form exercise constrained only by imagination and the app’s input — data that was translated into on-screen images visible to the musicians by way of a TV monitor — the starting points for the evening’s subsequent interplay were originals and covers from Harris’ upcoming albums on Motéma, Sonic Creed Volume II: Life Signs (out June 14 with singles out April 12 and May 17) and Legacy Dances (to be released early next year). The tunes had specific chord changes laid out within conventional structures to which the musicians in the recording sessions largely hewed. But at the Apollo, the addition of the AI materially changed the quality of the collaboration.
“It forces the musicians to listen and engage on a deeper level,” said Kevin Jiang, the project’s lead engineer and a musician himself. “You can’t just rely on your memory of the chord progression to guide you through it. You have to be listening and hearing what’s going on. It almost elevates the level of empathy that’s going on onstage.”
Harris has made a lifetime study of the science of empathy, and, in a sense, the Apollo event was one more experiment in his ongoing research. But it was perhaps his most ambitious, its scope suggested by the varied group gathered for the event. In addition to the band, the assemblage included actor Dwayne Clark, who delivered Harris’ spoken-word paeans to cultural awareness in tones as lush as the harmonies emanating from the app. Also on hand was singer Christie Dashiell, a recent recruit of Harris’, whose soulful purity of tone complemented Benjamin’s digitally altered vocals. Visual artists Alexander Baumann and Robert Ruth, integral to the presentation, lent the technology a visual identity to match its aural one.
Though Harris had given the app a whirl in public on a limited basis, the Apollo event’s elaborate setting and multitude of moving parts — made possible with the support of an Apollo New Works commission five years in the making — constituted its biggest test.
“It was a major risk in terms of the technology, just making sure it functions,” Harris said, noting that internet problems delayed the loading of information both at the rehearsal the day before the show and on show day. “On top of that, there was the artistic risk, where many of the musicians onstage were using the AI for the first time. I didn’t really have a great sense of what they were going to do with it, but I had total confidence and trust that they were brilliant musicians and would come up with something amazing, which they did.
“And there was the other layer of risk, where we had never seen all of it together with music and AI and the visuals, so that was pretty special seeing it come together.”
The rehearsal, a stop-and-start affair, offered little indication that it would in fact come together. But whatever snags had come before — there had also been an early show at the Apollo — had been ironed out for the evening performance. It was seamless, from the AI-heavy opener to the AI-light closer, “What A Wonderful World.”
Blackout’s penchant for recontextualization took center stage with its tributes to the early jazz masters. “What A Wonderful World,” popularized by Louis Armstrong, drew liberally on modern jazz reharmonization and Benjamin’s echo-laden vocoder stylings. Stride titan Willie “the Lion” Smith’s “Echoes Of Spring,” delicately rendered by Sands with fealty to the original, yielded to a more kinetic “Echoes Of The Lion” that featured the intricate swooping and swaying of Francies’ synthesizer.
The shifts in the visuals paralleled those in the music, with the pixelated image of Smith that accompanied Sands dissolving and reassembling on-screen as swirls of color accompanying Francies. The images, created on Harris’ order, demonstrated the visual team’s ability to coordinate sight with sound in real time. “At any arbitrary moment, I can do anything,” Ruth said. The on-screen evidence allowed for only a bit of hyperbole in his statement.
But as impressive as his maneuvers were, even he, sitting behind his imposing control panel, had to acknowledge the power of AI to assert itself. “The thing that’s not visible now is when the chords change, it generates a change inside our system,” he said.
Visually, the AI-generated changes began with the rhythmic pulsations on the sphere in the concert’s opening. Similar effects appeared throughout the performance. But perhaps the most striking impact of the AI was its ability to enrich the head-to-head musical exchanges, nowhere more clearly than in the interaction of Sands and Francies.
Ironically, Harris said, the pairing of the pianists almost didn’t happen. Initially, he had enlisted only Francies, whose synesthetic sensibility he hoped would lead to colorful interplay. Francies also brought to the endeavor a history of pursuing technological avenues of expression with scientific rigor.
But as Harris was putting the event together, he said, “I realized it’s pretty complicated to control the AI, play the piano and look at the music, so I ended up adding Christian to the gig two weeks before the show.” Sands, who had been on Legacy Dances, already knew the basic tunes, so he was able to handle much of the written music and free up Francies to add whatever he wanted without worrying about that aspect of the proceedings.
Years before the Apollo gig, Sands, who was once Harris’ ear-training student at the Manhattan School of Music, had encountered an early version of the app while participating in Harris’ presentations before corporate audiences. It was only on arriving at the rehearsal that Sands learned that the app, now in a more advanced version, would be part of the show.
From the first moment at the rehearsal, he was playfully sparring with the app. “I thought, ‘Let’s try selecting a chord,’” he later said. “I wanted to surprise myself. I didn’t look, just pressed. From there it mapped out the comping. I’m directing it, and it’s directing me.”
By show time, he was fully engaged with the app, not only as a foil but as a means of building a wider conversation with Francies. The resulting colloquy reached a peak on “I Know Love,” a soaring ballad Harris wrote for the family of a cherished patron who had died.
Harris, Sands said, had told him to trigger the harmonies for Francies to solo on. He opted to wrest partial control from the machine, creating his own pathway from the panoply of chords the program laid out before him.
“I chose any sequence I wanted, for any number of bars,” Sands explained. “I could make it complicated or not. I chose organically, the way the emotion came to me and by listening to the way James was playing.” Francies returned the favor.
“We were following each other,” Sands said. “The challenge was to make something that sounded and felt good. It made us pull our punches and be present.”
As the back-and-forth, both man-to-man and man-to-machine, grew more intense — each musician working the iPad with his left hand and the keyboard with his right — the app became a presence that altered both the performing dynamic and the music’s shape.
“It’s almost like playing with another person,” Francies said. “The songs are what they are. But we’re writing new songs with the information.”
In the end, Harris said, the app’s potential as a tool to enhance creativity started to become clearer. “What they can do with one hand on their iPads would take two hands to pull off. It’s almost like we’ve expanded their ability through the invention of this instrument.”
He likened the result to a “beautiful dance between the two of them” that, as real-time composing, was very much in the spirit of jazz — but with the added dimension of creative tension fostered by the uncertainty of interfacing with a machine.
“It’s a learning curve,” Francies said, noting that, while the technology was still very new, he believed that his talents as an improvising musician would help flatten the curve as he integrated the app into his playing. “It’s a natural progression.”
Natural or not, Harris was surprised. “I couldn’t have predicted in advance that that would have been an outcome of the use of AI. You create the technology and put it in the hands of brilliant people and pay attention to how they decide to use it, and its ultimate function is going to be revealed to you by the use, not by the inventor of the technology.”
The technology’s origins trace back to a time 17 years ago, when Harris, suffering from profound ennui, cut back his touring and entered a period of contemplation and concentration. Pen in hand, he filled pages of notebooks with innumerable permutations of chord combinations and long series of adjectives applied to them. In those notebooks, a memo he wrote to himself has special resonance: “I need to design an algorithm for my harmonization.”
Building the algorithm began in earnest about 10 years ago. It was hardly a preordained success, especially with the technology available at the time. “I wouldn’t say I had doubts,” he said. “I definitely had anxiety because of the amount of work it was going to take. But I knew that it was possible.”
The original intent was to create an ear-training tool, the harmonic equivalent of a metronome. That effort continues. As a professor at Rutgers University’s Newark, New Jersey, campus, he has converted a room in a former department store near the campus into a lab ringed by electric keyboards. Standing in the lab on a December afternoon, he noted that, from the moment students enter class, they are immersed in chords generated by the app. Much of the class communication is sung to its harmonies.
Harris also uses the app, in conjunction with some brilliantly conceived low-tech supporting material, to teach the occasional master class at the New Jersey Performing Arts Center. There, as an advisor to the program TD Jazz for Teens, he had a group of aspiring musicians standing, singing — and learning to hear on a deep level.
“What he brings to it,” said saxophonist Mark Gross, the program’s director of jazz instruction, “is a very clear and tangible way for students to understand and eventually recognize, very quickly and efficiently, harmony. He’s also inspiring to students because he makes them feel like, ‘I can do this.’”
The potential educational market for the app includes classrooms, studios and the home. Though the possibilities for the app in live performance might at this early point be less well defined, they got a serious boost with the success of the Apollo show, which, to players and patrons alike, offered a glimpse into its role in advancing the music.
“Stefon’s performance was very now but also very future,” said an elated Leatrice Ellzy, the Apollo’s senior director of programming.
Harris praised Jiang for his efforts in upgrading the capabilities of the app, which he said was likely to be released in its current form this year. But Jiang was quick to identify Harris’ articulation of its key principles as the critical piece. Harris, he said, focused on defining where notes can go and which chords can do what — information parsed by the program to generate progressions that are compelling.
“The way Stefon’s brain works is crazy,” Jiang said. “In a sense he’s programming without actually writing code. The program would do nothing of substance if the rule set that drives the AI were not sophisticated and extremely well thought out.”
From Harris’ point of view, getting the software to play chords was not incredibly difficult. “It was really, really difficult to get the software to play chords in a key center,” he said. “It was really hard to make it sound like it was playing a song — to have it generate music in a way that felt logical, intuitive, emotional.”
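Harris’ point about key centers is easy to appreciate with a toy example: picking chords at random is trivial, while making them gravitate around a tonic requires encoding exactly the kind of rules Jiang credits Harris with articulating. The sketch below is a generic, textbook-style functional-harmony walk, offered for illustration only; Harris’ actual rule set is far richer and unpublished.

```python
import random

# Diatonic seventh chords in C major, indexed by scale degree.
DEGREE_CHORDS = {1: "Cmaj7", 2: "Dm7", 3: "Em7", 4: "Fmaj7",
                 5: "G7", 6: "Am7", 7: "Bm7b5"}

# "Which chords can do what": weighted moves between scale degrees, so that
# dominants tend to resolve and the line keeps gravitating toward the tonic.
TRANSITIONS = {
    1: [(4, 3), (6, 2), (2, 2)],
    2: [(5, 4), (7, 1)],
    3: [(6, 3), (4, 1)],
    4: [(5, 3), (2, 2), (1, 1)],
    5: [(1, 5), (6, 1)],     # V7 resolves home most of the time
    6: [(2, 3), (4, 2)],
    7: [(1, 4), (3, 1)],
}

def generate(bars, start=1):
    """Walk the weighted graph to yield a tonally anchored progression."""
    degree, out = start, []
    for _ in range(bars):
        out.append(DEGREE_CHORDS[degree])
        moves, weights = zip(*TRANSITIONS[degree])
        degree = random.choices(moves, weights=weights, k=1)[0]
    return out

print(" | ".join(generate(8)))   # e.g. Cmaj7 | Am7 | Dm7 | G7 | Cmaj7 | ...
```

Strip out the weights and the degree graph — the “rule set” in Jiang’s phrase — and the same loop would emit chords that are individually valid but collectively directionless, which is essentially the distinction Harris draws.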
In the weeks leading up to the Apollo show, Harris had an experience that helped affirm his progress toward achieving that goal. It came during a talk he gave at the theater, when he invited someone up from the audience to trigger the app as he improvised.
“At some point Harmony Cloud made a really interesting movement and the audience clapped. What the AI did in that moment was really beautiful and elicited an emotional reaction from the audience. They just had a reaction based on something a computer did for them — and for me that was a powerful moment.” DB