MIT Music Technology Students Learn Design Through the Power of Play

From the classroom to expanding research opportunities, students in MIT's Music Technology Program use design to push the frontier of digital instruments and software for human expression and empowerment.
By Michelle Luo
May 31, 2025
The MIT Music Technology Program carves out a space to explore new sounds, tunes, and experiences. From the classroom to the community, students in music tech grapple with developing both creative technology and their creative selves.
In the course 21M.080 (Intro to Music Technology), it dawned on Thelonious Cooper ’25 that he had the skills to create his own instruments.
“I can literally make a new instrument. I don't think most people consider that as an option. But it totally is,” Cooper says.
Similar to how the development of photography contributed to a radical shift in the priorities of painting, Cooper identifies the potential of new music tools to “[pave] the way to find new forms of creative expression.”
Thelonious Cooper ’25 develops digital instruments and music software.
Image: Adélaïde Zollinger
For Matthew Caren ’25, his parallel interests in computer science, mathematics, and jazz performance found an intersection in design. Caren explains, “The process of creating music doesn't actually start when you, for instance, sit at a piano. It really starts when someone goes out and designs that piano and lays out the parameters for how the creation process is going to go.” When it is the tool that defines the parameters for creating art, Caren reasons, “You can tell your story only as well as the technology allows you to.”
What purposes can music technology serve? In holding both technical and artistic questions simultaneously, makers of music technology uncover new ways to approach engineering problems alongside human notions of community and beauty.
Taught by Professor of the Practice Eran Egozy, 21M.385 (Interactive Music Systems, or IMS) focuses on the creation of musical experiences that include some element of human-computer interaction (HCI) through software or a hardware interface.
In their first assignment, students program a digital synthesizer: a piece of software that generates and manipulates pitches with desired qualities. While laying this foundation of applying technical skills to music, students contemplate their budding aesthetic and creative interests.
“How can you use it creatively? How can you make it make music in a way that's not just a bunch of random sounds, but actually has some intention? Can you use the thing you just made to perform a little song?” prompts Egozy.
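The article doesn't describe the assignment's actual code, but the kernel of a digital synthesizer fits in a few lines. A minimal sketch in Python with NumPy, where the function name, the sine oscillator, and the fade-out envelope are illustrative choices rather than details from the course:

    import numpy as np

    SAMPLE_RATE = 44100  # samples per second

    def synthesize(freq_hz, duration_s, amp=0.5):
        """Generate a sine tone shaped by a simple fade-out envelope."""
        t = np.linspace(0, duration_s, int(SAMPLE_RATE * duration_s), endpoint=False)
        tone = amp * np.sin(2 * np.pi * freq_hz * t)  # the raw pitch
        envelope = np.linspace(1.0, 0.0, t.size)      # fade to silence
        return tone * envelope

    # A tiny "song": an ascending A-minor arpeggio, ready to write to a
    # sound device or audio file as floating-point samples in [-1, 1].
    notes = [440.00, 523.25, 659.25]  # A4, C5, E5
    audio = np.concatenate([synthesize(f, 0.4) for f in notes])

Everything Egozy asks about, from intention to performance, lives in the choices layered on top of a kernel like this: which pitches, what envelope, what timbre.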
Eran Egozy, Professor of the Practice in Music Technology at MIT, is an entrepreneur, musician and technologist. He was the co-founder and chief technical officer of Harmonix Music Systems, which developed the video game franchises Guitar Hero and Rock Band.
Image: Adélaïde Zollinger
In the spirit of MIT’s mens et manus, students of IMS propose, design, implement, playtest, and present a creative musical system of their own during the last stretch of the semester. Students develop novel music games, tools, and instruments alongside an understanding of the principles of user interface, user experience (UI/UX), and HCI.
Once students implement their ideas, they can evaluate their design. Egozy stresses it is important to develop a “working prototype” quickly. “As soon as it works you can test it. As soon as you test it, you find out whether it's working or not, then you can adjust your design and your implementation,” he explains.
Though students receive feedback at multiple milestones, a day of playtesting is the “most focused and concentrated amount of learning [students] get in the entire class.” Students might find their design choices affirmed or their assumptions broken as peers test the limits of their creations. “It’s a very entertaining experience,” Egozy says.
Immersed in music tech since his graduate studies at the MIT Media Lab and as co-founder of Harmonix, the original developers of popular music game titles Guitar Hero and Rock Band, Egozy aims to empower more people to engage with music more deeply by creating “delightful music experiences.”
By the same token, developers of music technology deepen their understanding of music and hone their technical skills. For Cooper, understanding the “causal factors” behind changes in sounds has helped him to “better curate and sculpt the sounds [he uses] when making music with much finer detail.”
Music technologies mark milestones in history. From the earliest acoustic instruments to the electrified realm of synthesizers and digital audio workstations, design decisions reverberate throughout the ages.
“When we create the tools that we use to make art, we design into them our understanding and our ideas about the things that we're interested to explore,” says Ian Hattwick, lecturer in music technology.
Hattwick brings his experience as a professional musician and creative technologist to his teaching of Intro to Music Technology and 21M.370 (Digital Instrument Design).
For Hattwick, identifying creative interests, expressing those interests by creating a tool, using the tool to create art, and then developing a new creative understanding is a generative and powerful feedback loop for an artist. But even if a tool is carefully designed for one purpose, creative users can put it to unexpected uses, generating new and cascading creative possibilities on a cultural scale.
In the case of many important music hardware technologies, “the impact of the decisions didn’t play out for a decade or two,” says Hattwick. Over time, he notes, people shift their understanding of what is possible with the available instruments, pushing their expectations of technology and of what music can sound like. One notable example is the relationship between drummers and drum machines: human drummers took inspiration from programmed drum beats to learn unique, challenging rhythms.
Though designers may feel an impulse for originality, Hattwick stresses that design happens “within a context of culture.” Designers extend, transform, and are influenced by existing ideas. On the flip side, if a design is too unfamiliar, the ideas expressed risk limited impact and propagation. The current understanding of what sounds are even considered musical is in tension with the ways new tools can manipulate and generate them.
This tension leads Hattwick to put tools and the thoughtful choices of their human designers back in focus. He says, “When you use tools that other people have designed, you're also adopting the way that they think about things. There's nothing wrong with that. But you can make a different choice.”
Grounding his interests in the physical hardware that has backed much of music history, EECS undergraduate Evan Ingoldsby ’27 builds guitar pedals and audio circuits that manipulate signals through electronic components. “A lot of modern music tech is based off of taking hardware for other purposes, like signal filters and saturators and such, and putting music and sounds through them and seeing how [they] change,” says Ingoldsby.
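Ingoldsby's pedals are analog circuits, but the two building blocks he names, filters and saturators, have simple digital counterparts. A sketch of each in Python, with coefficient and drive values chosen only for illustration and not drawn from his designs:

    import numpy as np

    def one_pole_lowpass(signal, alpha=0.1):
        """Dull the highs by mixing each sample with the previous output."""
        out = np.zeros_like(signal)
        prev = 0.0
        for i, x in enumerate(signal):
            prev = alpha * x + (1.0 - alpha) * prev
            out[i] = prev
        return out

    def saturate(signal, drive=4.0):
        """Soft-clip with tanh: loud peaks flatten, adding harmonics."""
        return np.tanh(drive * signal)

    # Push a plain sine wave through both and listen to how it changes.
    t = np.linspace(0, 1, 44100, endpoint=False)
    shaped = one_pole_lowpass(saturate(np.sin(2 * np.pi * 110 * t)))

Chaining the two in the opposite order produces a different sound, which is exactly the kind of experiment Ingoldsby describes.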
For Cooper, learning from history and the existing body of knowledge, both artistically and technically, unlocks more creativity. “Adding more tools to your toolbox should never stop you from building something that you want to. It can only make it easier,” he says.
Ingoldsby is most inspired by the unexpected, emergent effects of pushing hardware tools such as modular synthesizers to their limits. “It increases in complexity, but it also increases in freedom,” he says.
Music has always been a collective endeavor, fostering connection, ritual, and communal experiences. Advancements in music technology can both expand creative possibilities for live performers and foster new ways for musicians to gather and create.
Cooper draws a direct link between his research in high-performance, low-latency computing and his work developing real-time music tools. Many music tools function well only “offline,” Cooper observes.
“For example, you'll record something into your digital audio workstation on your computer, and then you'll hit a button, and it will change the way it sounds. That's super cool. But I think it's even cooler if you can make that real-time. Can you change what the sound is coming out as you're playing?” asks Cooper.
Speeding up the processing of sound, so that the time difference between input and output — latency — is imperceptible to human hearing, is a technical problem. Cooper takes an interest in real-time timbre transfer, which could, for example, transform the sound of a saxophone so that it seems to come from a cello. The problem intersects with common techniques in artificial intelligence research, he notes. Cooper’s work to improve the speed and efficiency of music software could give digital performers new effects for manipulating audio in a live setting.
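The article doesn't give Cooper's numbers, but the basic latency budget is easy to state: a processing block of N samples at sample rate f takes at least N/f seconds to fill before any computation can begin. A back-of-envelope check in Python, using a roughly 10-millisecond threshold of perceptibility that is a common rule of thumb rather than a figure from Cooper:

    SAMPLE_RATE = 48000      # samples per second
    PERCEPTIBLE_S = 0.010    # ~10 ms; a common rule-of-thumb threshold

    for block_size in (64, 256, 1024, 4096):
        latency_s = block_size / SAMPLE_RATE  # time just to fill the block
        verdict = "fine for live use" if latency_s < PERCEPTIBLE_S else "noticeably laggy"
        print(f"{block_size:5d} samples -> {1000 * latency_s:6.2f} ms ({verdict})")

Smaller blocks mean lower latency but leave less time per block for the processing itself, which is why the speed of the code matters.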
With the rise of personal computing in the 2010s, Hattwick recounts, “laptop ensembles” emerged, prompting new questions about live music performance in a digitizing era. “What does it mean to perform music with a laptop? Why is that fun? Is a laptop an instrument?” he asks.
In the Fabulous MIT Laptop Ensemble (FaMLE), directed by Hattwick, MIT students pursue music performance in a “living laboratory.” Driven by the interests of its members, FaMLE explores digital music, web audio, and live coding, an improvisational practice that exposes the process of writing code to generate music. As a member of FaMLE, Ingoldsby has found a place to situate his practice of sound design in a broader context.
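The article doesn't name the tools FaMLE performs with, and live-coding environments vary widely. But the flavor of the practice, a loop that keeps playing while the performer rewrites it, can be sketched in plain Python (the pattern and timing here are purely illustrative):

    import time

    # One bar of a drum pattern. In a live-coding set, the performer
    # rewrites lines like this one while the music keeps running.
    PATTERN = ["kick", ".", "snare", "."]

    def trigger(step):
        """Stand-in for sending a note to a synthesizer or sampler."""
        print(step, end=" ", flush=True)

    step = 0
    while True:                    # the loop never stops; the code changes
        trigger(PATTERN[step % len(PATTERN)])
        step += 1
        time.sleep(0.25)           # four steps per second

Because the audience watches the code being edited, the act of programming itself becomes the performance.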
When emerging digital technologies interface with art, challenging questions arise about human creativity. Multidisciplinary communities allow an exchange of ideas that generates novel approaches to complex problems. “Engineers have a lot to offer performers,” says Cooper. “As technology progresses, I think it's important we use that to further develop our abilities for creative practice instead of substituting it.”
Hattwick emphasizes, “The best way to explore this is together.”