Ever wondered what a seizure sounds like?

As we found when investigating the world of music psychology for our latest multimedia feature, Synapse, the speedy advance of science is making such things commonplace.

To answer the question, Stanford professors Chris Chafe and Josef Parvizi created 'Seizures', the title track to our feature. Formed by converting the electrical spikes in the EEG of a seizure patient into sound, it captures the three stages of a seizure.
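For readers curious about the mechanics, the simplest form of this conversion is audification: play the EEG trace back fast enough that its slow rhythms shift up into the audible range. The sketch below is our own toy illustration of that idea, not Chafe and Parvizi's actual pipeline; the function name and the synthetic 3 Hz 'spike-and-wave' signal are invented for the example.

```python
import numpy as np

def audify_eeg(eeg, eeg_rate=256, audio_rate=44100):
    """Audification (toy version): reinterpret EEG samples as audio samples.

    Playing the trace back at audio_rate instead of eeg_rate speeds it up
    by audio_rate / eeg_rate, shifting 1-40 Hz brain rhythms into the
    audible range. Returns normalised samples and the speed-up factor.
    """
    eeg = np.asarray(eeg, dtype=float)
    peak = np.max(np.abs(eeg))
    samples = eeg / peak if peak > 0 else eeg  # scale into [-1, 1]
    speedup = audio_rate / eeg_rate
    return samples, speedup

# Demo: a synthetic 3 Hz "spike-and-wave" discharge, 10 s at 256 Hz.
t = np.arange(0, 10, 1 / 256)
eeg = np.sin(2 * np.pi * 3 * t) + 0.5 * np.sign(np.sin(2 * np.pi * 3 * t))
samples, speedup = audify_eeg(eeg)
# Played at 44.1 kHz, ten seconds of EEG lasts well under a tenth of a
# second, and the 3 Hz rhythm is heard as a tone of roughly 517 Hz.
```

A real sonification would map spike events to synthesised notes rather than the raw waveform, but the time-compression trick above is the historical starting point for listening to EEG.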

As for the audio, it sounds... well... a little disturbing, as you would expect, but at the same time absolutely fascinating. And this is the point of Synapse, and of music psychology in general: to highlight these discoveries and to think about our brains and music in new ways.

Rendering the output of an EEG as sound first arose in the 1930s and is considered a legitimate scientific method of representing brain data. In 1965 the technique began to inspire the creation of music, when physicist Edmond Dewan (1931–2009) and composer Alvin Lucier (b. 1931) helped create Music for Solo Performer, the first time science and artistic creation had overlapped in such a way.*

EEG machines eventually became portable and affordable enough to be used by artists such as Lora-Faye Ashuvud, who took part in an experiment organised by Artlab in which she essentially sang a duet with her own mind.

This leads us nicely on to Professor Eduardo Miranda who, as both a composer and a scientist, is constantly seeking to push artistic boundaries. He previously held a research position at Sony's Computer Science Laboratory, where he made discoveries in speech synthesis, evolutionary music and cognitive neural modelling.

Recently he used a type of living mould to make a living musical instrument. Biocomputer Music (2015) and Biocomputer Rhythms (2016) saw Prof Miranda performing live with a piano augmented by the organism, grown on a circuit board. The mould responded to the electrical currents sent to it and, in this way, became part of the sound.

The piece you can hear in Synapse, Raster Plot, was composed with rhythms generated by a computer simulation of a network of neurones (each neurone corresponding to an instrument), mimicking the way the brain encodes information. It features the mezzo-soprano Juliette Pochin, who has appeared on the soundtracks to Lord of the Rings, Harry Potter and Star Wars.

As Prof Miranda explains: "In a nutshell, each instrument of the orchestra corresponds to a neurone. Scientists refer to the graph plotting the activity of the neurones as 'raster plots', hence the title of the movement."
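To make the neurone-per-instrument idea concrete, here is a toy sketch of our own. It assumes nothing about Prof Miranda's actual simulation: random spike trains stand in for the neural network, and each row of the resulting raster becomes the rhythm for one (illustrative) instrument.

```python
import random

def simulate_raster(n_neurones=4, steps=32, p_fire=0.2, seed=1):
    """Toy raster plot: each row is one neurone, each column one time step.

    A 1 marks a spike (note onset), a 0 a silence. A fixed seed keeps the
    'composition' reproducible; a real model would use spiking-neurone
    dynamics rather than coin flips.
    """
    rng = random.Random(seed)
    return [[1 if rng.random() < p_fire else 0 for _ in range(steps)]
            for _ in range(n_neurones)]

raster = simulate_raster()
instruments = ["flute", "oboe", "viola", "cello"]  # illustrative names
for name, spikes in zip(instruments, raster):
    # '|' marks a note onset, '.' a rest: one line of rhythm per neurone
    print(f"{name:>6} " + "".join("|" if s else "." for s in spikes))
```

Read across a row and you have one instrument's rhythm; read down a column and you have the whole orchestra at one instant, which is exactly how a scientist reads a raster plot of neural activity.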

It would be somewhat ironic if music, one of the most emotional of the creative arts, ended up being the key to helping science finally unravel the mysteries of the brain.

* Bart Lutters and Peter J. Koehler, 'Brainwaves in concert: the 20th century sonification of the electroencephalogram' (2016)