The Awesome Science of Music Technology

Close-up photo of a hand adjusting dials on a music mixer.

Photo by Drew Patrick Miller on Unsplash

By Amin Ghane

Musicians spend countless hours tweaking, calibrating, and arranging sounds and ideas to create your favorite jams. And then, of course, there are the instruments themselves that provide a conduit through which creativity can be expressed. The field of music technology develops and improves how musicians create – and how we interact with – music.

18th-century music technology

Dating from 1720, this grand piano was invented by Bartolomeo Cristofori.

Breakthroughs in music technology have always led to booms in musical innovation. Consider that most ubiquitous of all Western instruments: the piano. Invented around the turn of the 18th century by Bartolomeo Cristofori, the piano was an advancement on the harpsichord that gave musicians the ability to adjust the volume and duration of the notes they played. Classical and Romantic composers took advantage of these attributes to create some of the most enduring pieces of all time.

And the work of music technologists hasn’t stopped. In fact, advances in computation have given musicians the power to produce, edit, and play their music with more flexibility and expression than ever before. Whereas recording and editing music was once reserved for experts who charged a fortune for their services, now anyone with an iPad and some curiosity can learn the basics. And the latest top-of-the-line synthesizers produce sounds that grip our attention.

From the underground artist slapping together beats in their garage to the latest number one hit from Taylor Swift, music lovers of all styles benefit from this innovation. To take a look at what’s happening on the front lines at the intersection of technology and artistic expression, I ventured to Georgia Tech’s School of Music, which houses one of the country’s foremost academic and research programs in music technology.

The future of music

Innovation on display at Georgia Tech’s annual Guthman Musical Instrument Competition

As I walk into the J. Allen Couch Building, I’m immediately greeted by the sound of violins filling the sonic space between gentle scales played on the cello. These instruments are centuries old, but I’m here for what is emerging from the cutting-edge music labs on the second floor. As he leads me upstairs, Avneesh Sarwate, a master’s student in music technology, explains, “Most of those kids also spend a lot of time building synthesizers and coding. Half of them are also in the laptop orchestra.” In a discipline as ancient as music, it is eye-opening to see firsthand how intimately the old and new are joined in this hub of experimental music.

In a discipline as ancient as music, it is eye-opening to see firsthand how intimately the old and new are joined in this hub of experimental music.

We walk down a long hallway, passing rooms and labs that each have their own character. One room has gigantic speakers surrounding a sole computer monitor, while the next holds what looks to be a newly invented instrument, built from scrap metal and sitting amid piles of tools and screwdrivers. Avneesh notices my confusion. “Oh yeah, that lab is working with acoustical engineers. I don’t think it’s ready yet.” I intuit that acoustical engineers, who study the control of sound and noise, must play a central role in the development and design of any instrument, because they study how different materials and design elements determine how the sound, well, sounds.

Finally, we arrive at his lab at the end of the hall on the left. As Avneesh swipes his ID card and the door swings open, I’m greeted by what looks, at first, like a traditional computer lab. On second glance, I see some clues about what goes on in here: stacks of keyboards and mixing boards around the room, for example, or the piles of sheet music strewn about on the desks.

“I was a very creative kid growing up. I played guitar and my dad did whatever he could to help me grow creatively,” Avneesh tells me as his computer boots up. “I went to school for software engineering, but I never really stopped playing music. At some point, I just figured I’d use software to help my music and that’s how I got into interactive art.” 

“At some point, I just figured I’d use software to help my music and that’s how I got into interactive art.”

When I ask what he means by interactive art, Avneesh opens a program on his computer and starts typing code. Just when I feel I can’t take the anticipation any longer, the speakers on either side of his desk come alive with an ethereal melody looping around a drumbeat that, while simple, demands my attention. “This is a module I wrote that lets you generate melodies with algorithms.” Then, with another line of code, the melody shifts: whereas before the synthesizer was coming out of each speaker equally, the melody now fades between the speakers, almost oscillating, creating a gripping effect. I’m reminded of the way thoughts swirl together when one is on the cusp of sleep.
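For the curious, here is a rough sketch of what an algorithmic melody module like this might do under the hood. Everything below is a hypothetical illustration, not Avneesh’s actual code: a random walk over a pentatonic scale chooses the notes, and a slow sine wave computes a left-right pan value that makes the sound drift between the speakers.

```python
import math
import random

# Hypothetical sketch of algorithmic melody generation with oscillating
# stereo panning -- illustrative only, not the module from the article.

C_MINOR_PENTATONIC = [60, 63, 65, 67, 70]  # MIDI note numbers

def generate_melody(scale, length, seed=None):
    """Random-walk melody: each note moves at most one scale degree."""
    rng = random.Random(seed)
    index = rng.randrange(len(scale))
    melody = []
    for _ in range(length):
        melody.append(scale[index])
        step = rng.choice([-1, 0, 1])
        index = max(0, min(len(scale) - 1, index + step))
    return melody

def pan_position(t_seconds, rate_hz=0.25):
    """Slow sine LFO: -1.0 is fully left, +1.0 is fully right."""
    return math.sin(2 * math.pi * rate_hz * t_seconds)

melody = generate_melody(C_MINOR_PENTATONIC, 8, seed=1)
# Sample the pan once per half-second note:
pans = [pan_position(i * 0.5) for i in range(len(melody))]
```

In a live-coding environment (SuperCollider, TidalCycles, Sonic Pi, and similar systems), re-evaluating a line like the pan function while the loop keeps playing is what produces the kind of shift described above.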

Bringing art to life with technology

Live coding in action

What Avneesh is doing is called live coding: creating art in real time with computer code. Whereas software is traditionally written, tested, debugged, and only then run, live coding lets you use programming itself as an instrument, the way you would a keyboard or a guitar. And it doesn’t stop at music.

“Yeah, I’ve actually done this with performance art as well.” He’s referring to Paradise Lost, an “immersive musical theater performance for which I developed interactive graphics.” He shows me videos of performers dancing while he live-coded various graphical effects onto the performance. In one, the dancers are given a wave-like effect, while in another, a mesmerizing pixelation effect reminds one of a faded memory. Bravely, Avneesh displays the code on the screen the whole time for the audience to see. “That would’ve been a bad time for a typo!” he says with a smile.

Avneesh is especially interested in developing interfaces – ways to foster connection between live-coding musicians and musicians playing physical instruments. In one paper, he prototypes tools that let traditional performers see and understand what the coder is doing. One method uses colorful balls to represent notes, moving in patterns that correspond to the code. “A big part of my work is making everything as accessible as possible,” he says. And it’s true – Avneesh has also been part of an effort to create an entirely new platform for computer music aimed at non-technologically inclined musicians, opening new doors for creativity.

“A big part of my work is making everything as accessible as possible.”

I peer into another room on our way out, expecting some more engineering and computer equipment. Instead, there sits a lone baby grand piano in the corner facing a few rows of chairs. I smirk, thinking of where we’d be if the harpsichord had never been improved on. Thanks to the work of people like Mr. Cristofori in the 1700s – and Avneesh today – the bounds of music will never stop expanding.

Thank you to the Georgia Tech School of Music and Avneesh Sarwate for sharing their interactive music technology! Follow Science ATL on Facebook, Twitter, and Instagram for more Awesome Science of Everyday Life features and other science updates.