Decoding senses: video and audio decoded from the brain by UC Berkeley scientists

On September 22, 2011, a group of researchers at UC Berkeley published an article titled “Reconstructing visual experiences from brain activity evoked by natural movies”. [1]
Scientists used brain imaging to reveal the movies in our minds. [2]
Using functional Magnetic Resonance Imaging (fMRI) and computational models, UC Berkeley researchers succeeded in decoding and reconstructing people’s dynamic visual experiences, in this case while they watched Hollywood movie trailers.

Decoding for dummies
On January 31, 2012, a researcher from the same group reported similarly successful results, applying the same approach to auditory experience. [3]
“This research is based on sounds a person actually hears, but to use it for reconstructing imagined conversations, these principles would have to apply to someone’s internal verbalizations,” said first author Brian N. Pasley, a post-doctoral researcher in the center. “There is some evidence that hearing the sound and imagining the sound activate similar areas of the brain. If you can understand the relationship well enough between the brain recordings and sound, you could either synthesize the actual sound a person is thinking, or just write out the words with a type of interface device.” [4]
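The “relationship between the brain recordings and sound” Pasley describes can be illustrated with a toy linear-decoding sketch. All data below is simulated and the ridge-regression decoder is a stand-in, not the authors’ actual pipeline (which used much richer encoding models of real fMRI and electrode data); it only shows the general idea of learning a mapping from recorded responses back to stimulus features.

```python
# Toy sketch of stimulus reconstruction by linear decoding.
# Everything here is simulated; this is NOT the published method.
import numpy as np

rng = np.random.default_rng(0)
n_samples, n_features, n_voxels = 500, 10, 100

# Hypothetical stimulus features (e.g. spectrogram bins of a sound).
stimulus = rng.normal(size=(n_samples, n_features))

# Simulated brain responses: each "voxel" is a noisy linear mix of features.
encoding = rng.normal(size=(n_features, n_voxels))
responses = stimulus @ encoding + 0.1 * rng.normal(size=(n_samples, n_voxels))

# Fit a ridge-regression decoder mapping responses back to stimulus features.
lam = 1.0
W = np.linalg.solve(responses.T @ responses + lam * np.eye(n_voxels),
                    responses.T @ stimulus)

# Reconstruct the stimulus from the responses and check fidelity.
reconstructed = responses @ W
corr = np.corrcoef(stimulus.ravel(), reconstructed.ravel())[0, 1]
print(f"reconstruction correlation: {corr:.2f}")
```

With clean simulated data the reconstruction correlates almost perfectly with the original features; real neural recordings are far noisier, which is why the published reconstructions are blurry approximations rather than exact replays.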

A nice short documentary about this research was produced in the ScienceBytes series, funded by the Alfred P. Sloan Foundation: https://vimeo.com/42863899

[1] https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3326357/

[2] http://news.berkeley.edu/2011/09/22/brain-movies/

[3] http://journals.plos.org/plosbiology/article?id=10.1371/journal.pbio.1001251

[4] https://www.newscientist.com/article/dn21408-telepathy-machine-reconstructs-speech-from-brainwaves/
