That's no longer the case – researchers have eavesdropped on our
internal monologue for the first time. The achievement is a step towards
helping people who cannot physically speak communicate with the outside world.
"If you're reading text in a newspaper or a
book, you hear a voice in your own head," says Brian Pasley at
the University of California, Berkeley. "We're trying to decode the brain
activity related to that voice to create a medical prosthesis that can allow
someone who is paralyzed or locked in to speak."
When you hear someone speak, sound waves activate sensory
neurons in your inner ear. These neurons pass information to areas of the brain
where different aspects of the sound are extracted and interpreted as words.
In a previous study, Pasley and his colleagues
recorded brain activity in people who already had electrodes implanted in their
brains to treat epilepsy, while they listened to speech. The team found that
certain neurons in the brain's temporal lobe were active only in response to
particular aspects of sound, such as a specific frequency.
One set of neurons
might react only to sound waves with a frequency of 1000 hertz, for
example, while another set responds only to those at 2000 hertz. The team
hypothesized that hearing speech and thinking to oneself might spark some of
the same neural signatures in the brain. They supposed that an algorithm
trained to identify speech heard out loud might also be able to identify words
that are thought.
Each participant was asked to read a passage of text aloud,
read it silently in their head and then do nothing. While they read the text
out loud, the team worked out which neurons were reacting to what aspects of
speech and generated a personalized decoder to interpret this information.
The
decoder was used to create a spectrogram – a visual representation of the
different frequencies of sound waves heard over time. Because each frequency
corresponds to specific sounds in each spoken word, the spectrogram can be used
to recreate what was said.
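The article does not give implementation details, but a personalized decoder of this kind can be sketched as a set of regressions that map electrode activity onto the power in each spectrogram frequency band. The Python sketch below is purely illustrative: the array sizes, random placeholder data and the choice of ridge regression are assumptions, not the researchers' actual model.

```python
# Illustrative sketch only: ridge regression stands in for whatever decoder the
# study used, and all data here are random placeholders with hypothetical shapes.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)

n_samples, n_electrodes, n_freq_bands = 5000, 64, 32        # assumed sizes
neural_heard = rng.standard_normal((n_samples, n_electrodes))  # activity while reading aloud
spectrogram_heard = rng.random((n_samples, n_freq_bands))      # spectrogram of the spoken audio

# Fit one regression per frequency band: each band's power is predicted
# from the activity of all electrodes at that moment in time.
decoders = []
for band in range(n_freq_bands):
    model = Ridge(alpha=1.0)
    model.fit(neural_heard, spectrogram_heard[:, band])
    decoders.append(model)

# Stack the per-band predictions back into a (time x frequency) spectrogram.
predicted_spectrogram = np.column_stack(
    [m.predict(neural_heard) for m in decoders]
)
print(predicted_spectrogram.shape)  # (5000, 32)
```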
They then applied the decoder to the brain
activity that occurred while the participants read the passages silently to
themselves.
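In code terms, that step amounts to running the same fitted models on the brain activity recorded during silent reading and comparing the reconstruction against a reference spectrogram, for example by correlating them band by band. Again, this continues the hypothetical sketch above rather than reproducing the study's actual analysis.

```python
# Apply the decoders, fitted on heard speech, to activity recorded during
# silent reading, then score each frequency band by correlation with the
# spectrogram of the same passage read aloud (a stand-in evaluation).
neural_silent = rng.standard_normal((n_samples, n_electrodes))  # silent-reading activity

reconstructed = np.column_stack(
    [m.predict(neural_silent) for m in decoders]
)

band_correlations = [
    np.corrcoef(reconstructed[:, band], spectrogram_heard[:, band])[0, 1]
    for band in range(n_freq_bands)
]
print(f"mean band correlation: {np.mean(band_correlations):.3f}")
```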
The algorithm isn't perfect, says Stephanie Martin,
who worked on the study with Pasley. "We got significant results but it's
not good enough yet to build a device."
