
Spectralism Explores Physiology

Musicians and panelists discuss the cross-disciplinary musical movement

By Charlene C. Lee, Contributing Writer

Emily Dolan, Assistant Professor of Music at the University of Pennsylvania, describes Spectralism as a renaissance. “[Spectral music] is the rebirth of musical aesthetics—the return of music to human perception and sensation,” says Dolan, who was a panelist for “Sensations of Tone.” Held in the exhibition space of the Collection of Historical Scientific Instruments on October 27 and 28, the event showcased Spectralism, a movement that uses the spectrum of sound to shape musical compositions. Covering an array of topics from the history of Spectralism to its various components, “Sensations of Tone” also featured vocalist Jane Sheldon and the Firebird Ensemble, who performed pieces by prominent Spectralists.

Sheldon and Peter Godfrey-Smith, Professor of Philosophy, envisioned “Sensations of Tone” about seven months ago. They approached Peter L. Galison ’77, Professor of History of Science and of Physics, with the concept. “This sounds like a perfect opportunity to mix science and art,” said Galison, recalling the proposal.

According to Sheldon, the Spectral movement began in the 20th century as a reaction to the mathematical, formulaic methods by which composers were producing music. Music theory originated as a collection of loose conventions, typically sounds that had been observed to please the ear. Later, however, it came to be a series of equations that dictated what music should be.

“The aim of the [Spectral] movement was to go back to how sound was experienced by any human ear. The aim was to create things human ears found interesting,” said Sheldon.

According to Sheldon, Spectralists started producing music that subtly altered a note’s dominant chord so that the ear could perceive the slight changes. These shifts exposed tones, pitches, and timbre more fully. With the advent of computers in the 1970s, Spectralists could use graphical and scientific representations of sound waves to portray slight modifications in different components of music.

Dolan began the panel by defining timbre as “everything that remains after pitch and dynamic level” are extracted from the music. The term was coined by Jean-Jacques Rousseau to describe a vague but perceivable difference between the qualities of sounds. A sound’s timbre conveys its harshness, softness, darkness, or brightness. Over time, as new instruments were invented and bands and orchestras expanded, the growing diversity of instruments gave conductors a variety of timbres to choose from. Conductors could envision an overall sound and pick instruments according to the timbres they could contribute to it.

Pitch, another component of sound, also became a field of study in Spectralism. “Pitch is a more subtle experience than it may seem,” said Eric Heller, Professor of Chemistry and of Physics. “It masquerades as an objective attribute of sounds… not measurable in the sense of frequency.” Ears summarize sound the same way eyes process what they receive before a person perceives the image she “sees,” Heller said.

“What makes Spectralism unusual is that diverse parameters—timbre, pitch, tone, sounds—are all very closely related and often beautifully connected,” said panelist Alexander Rehding, Professor of Music.

Michael A. Einziger, guitarist and lead composer of the rock band Incubus, had a musical background that neglected certain intellectual aspects of music. He was accustomed to, as he says, “picking up instruments and banging on them.”

“Rock musicians tend to frown upon the intellectualization of music,” he said. “They think that it’s got to come from some raw emotional place.” Einziger, who is taking courses at Harvard, says experiencing an “intellectualized” environment has challenged his impressions of the purpose music serves in his life.

“I felt like I was seeing music through a specific lens and had a desire to expand upon that. There are underlying structures to things like sound. If you just hear them with an unaided ear, it will sound a certain way,” he said.

Einziger notes that technological developments have allowed further exploration of music by dissecting sound into tone, timbre, and pitch. According to Einziger, just as magnifying lenses and microscopes aided biological understandings of life, new technologies in music are aiding a physiological understanding of music.

Panelist Jimena Canales, Associate Professor of History of Science, discussed how graphs and mathematics can be used to analyze sound. According to Canales, how waves stimulate a particular organ influences the sensation felt by a person. “The differences between seeing and hearing, and hearing and touch, are caused by how they excite different nerve apparatus,” Canales said.

Throughout the performances by Sheldon and the Firebird Ensemble, the sounds they produced were represented on a graph. The x-axis depicted time, while the y-axis depicted frequency: the faster the vibrations, the higher the point on the y-axis. The color of the waves represented loudness, ranging from black to blue to green to yellow, with black indicating no sound and yellow the loudest.
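The display described here resembles a standard spectrogram. The article does not say what software the event used, but a rough Python sketch along these lines would produce a similar time-frequency picture; the synthetic signal, sampling rate, and color map below are assumptions made for illustration only.

# A rough sketch of the time-frequency display described above.
# Assumptions (not from the article): a mono signal `x` sampled at `fs` Hz;
# a synthetic two-partial tone stands in for the live performance audio.
import numpy as np
from scipy.signal import spectrogram
import matplotlib.pyplot as plt

fs = 22050                          # sampling rate in Hz (assumed)
t = np.arange(0, 3.0, 1.0 / fs)     # three seconds of audio
# A fundamental plus a softer overtone whose pitch drifts slightly,
# the kind of subtle shift Spectral pieces foreground.
x = np.sin(2 * np.pi * 220 * t) + 0.4 * np.sin(2 * np.pi * (440 + 5 * t) * t)

# Short-time Fourier analysis: frequencies (y-axis), times (x-axis),
# and power at each time-frequency point (mapped to color).
f, times, Sxx = spectrogram(x, fs=fs, nperseg=2048, noverlap=1024)

plt.pcolormesh(times, f, 10 * np.log10(Sxx + 1e-12),
               shading="gouraud", cmap="viridis")  # dark = quiet, yellow = loud
plt.ylim(0, 2000)                   # zoom in on the audible partials
plt.xlabel("Time (s)")
plt.ylabel("Frequency (Hz)")
plt.colorbar(label="Level (dB)")
plt.show()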

Galison hopes the Collection of Historical Scientific Instruments can use its exhibition space to showcase similar performances and bring exposure to its collection. “After all,” he said, “we want to do things with the instruments—we want to bring together scientists, musicians, all types of people.”

