Brain Activity Translated Into Snippets Of Classic Rock Song In Breakthrough Study

August 15, 2023 by Deborah Bloomfield

Music is such a central part of what it means to be human, yet there’s so much scientists don’t know about what goes on in our brains when we listen to our favorite tunes. Now, a study has broken new ground by showing that it is possible to reconstruct a song that someone was hearing from only their brain activity patterns – and if you think this sounds like sci-fi, you can take a listen for yourself.

Beyond a greater understanding of how the brain perceives music, there’s another strand to this research. Brain-computer interfaces are advancing all the time. For people who have lost the ability to speak due to a brain injury or illness, there are devices that can help them to communicate, such as the one used by the late Stephen Hawking. 


Versions of these devices, sometimes referred to as neuroprostheses, have been developed to allow people with paralysis to type text by imagining handwriting it, or to spell out sentences using just their thoughts. But when it comes to speech, one thing that has been notoriously hard to capture is the rhythm and emotion behind the words, called prosody. The best we’ve been able to do comes out sounding distinctly robotic.

“Right now, the technology is more like a keyboard for the mind,” said lead author Ludovic Bellier in a statement. “You can’t read your thoughts from a keyboard. You need to push the buttons. And it makes kind of a robotic voice; for sure there’s less of what I call expressive freedom.”

The team behind the new study looked to music, which naturally includes rhythmic and harmonic components, to try to create a model for decoding and reconstructing a more prosodic sound. And luckily, there was a perfect dataset just waiting to be analyzed.

Over a decade ago, 29 patients with treatment-resistant epilepsy took part in a study in which recordings of their brain activity were taken – using electrodes inside their brains – while they listened to a three-minute segment of the Pink Floyd classic Another Brick in the Wall, Part 1. 


At that time, in 2012, UC Berkeley professor Robert Knight was part of a team that was the first to reconstruct words that a person was hearing from their brain activity alone. Things in the field had moved on apace since then, and now Knight was leading the study with Bellier on the new problem of music perception.

Bellier reanalyzed the recordings and used artificial intelligence to come up with a model that could decode the brain activity recorded from the auditory cortex, and use it to reconstruct a sound waveform that aimed to reproduce the music the person had been listening to at the time.


The left panel shows the spectrogram of the original song the patients listened to, and the center demonstrates a typical neural activity pattern. The researchers used only these patterns to decode and reconstruct a spectrogram like that on the right, which is recognizable as the original song.

Image credit: Ludovic Bellier, PhD (CC BY 4.0)
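
For readers curious about what this kind of decoding can look like in practice, here is a minimal sketch in Python. It is not the authors’ actual pipeline: it assumes a simple ridge regression as the decoding model, uses simulated arrays in place of the real intracranial recordings, and treats the song’s spectrogram as the reconstruction target, which a separate step would then have to convert back into audio.

# Minimal sketch of decoding a spectrogram from neural activity.
# Assumptions (not from the article): ridge regression as the decoder,
# random placeholder data instead of real ECoG recordings.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

n_timepoints = 1800   # e.g. ~3 minutes of the song at ~10 samples per second
n_electrodes = 128    # intracranial recording channels
n_freq_bins = 32      # frequency bins of the target spectrogram

# Placeholder data: real inputs would be band-limited activity per electrode,
# and the target would be the spectrogram of the song the patient heard.
neural_activity = rng.standard_normal((n_timepoints, n_electrodes))
spectrogram = rng.random((n_timepoints, n_freq_bins))

X_train, X_test, y_train, y_test = train_test_split(
    neural_activity, spectrogram, test_size=0.2, shuffle=False
)

# Fit one regularized linear map from all electrodes to all frequency bins.
decoder = Ridge(alpha=1.0)
decoder.fit(X_train, y_train)

# Reconstruct the held-out portion of the spectrogram from brain data alone;
# an inverse transform or vocoder would then turn this back into sound.
reconstructed = decoder.predict(X_test)
print("Reconstructed spectrogram shape:", reconstructed.shape)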

For Bellier, a lifelong musician himself, the prospect was compelling: “You bet I was excited when I got the proposal.”

And the results are impressive.




In the reconstructed audio, the rhythm and tune are recognizable, and even the words, “All in all it was just a brick in the wall,” can just be made out.

The research also allowed the team to identify new areas of the brain involved in detecting rhythm – in this case, the thrumming of the guitar. The most important seemed to be part of the right superior temporal gyrus, which sits in the auditory cortex just behind and above the ear.

They also discovered that, while language perception happens more on the left side of the brain, music perception has a bias towards the right. 


Bellier and Knight, along with their co-authors, are hopeful the project could lead to an improvement in brain-computer interface technology.

“As this whole field of brain machine interfaces progresses, this gives you a way to add musicality to future brain implants for people who need it,” explained Knight. “It gives you an ability to decode not only the linguistic content, but some of the prosodic content of speech, some of the affect. I think that’s what we’ve really begun to crack the code on.”

It would be particularly useful to be able to make the brain recordings noninvasively, but Bellier explained that we’re not there yet: “Noninvasive techniques are just not accurate enough today. Let’s hope, for patients, that in the future we could, from just electrodes placed outside on the skull, read activity from deeper regions of the brain with a good signal quality. But we are far from there.”

One of These Days, that might be possible. But hearing music decoded only from brain activity still left us Lost for Words. And, as the authors concluded in their paper, they have certainly added “another brick in the wall of our understanding of music processing in the human brain.” 


The study is published in PLOS Biology.
