Facebook Research Aims to Read Minds With Neuroscience

Facebook is at work on a project that would enable users to control virtual reality and augmented reality experiences with their thoughts. The company unveiled the research in April at its annual F8 conference, and more details have since emerged about a technology that could shape the next era of computing. Rather than the brain implants some companies propose, Facebook's solution could be a simple headband. The technology remains a long shot, however: neuroscientists and engineers outside the company are dubious that it can succeed.

The Wall Street Journal reports that, “Facebook has enlisted a small in-house team, supplemented by 60 scientists and engineers from research institutions across the U.S., all receiving funding from Facebook.” Physicist and neuroscientist Mark Chevillet is technical lead of the project, which is housed in Facebook’s Building 8 incubator for moonshots.

The team is focusing on updating “an obscure, largely abandoned technology” called “fast optical scattering” or “event-related optical signal,” based on shining light through the head, into the brain, and measuring the light reflected back.

Facebook’s researchers have been “developing sensors to identify the small number of photons that, after penetrating the skull and bouncing off a neuron, return to the detector instead of scattering in every direction.”
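The scale of that signal-to-noise problem can be illustrated with a toy Monte Carlo sketch: a photon must wander deep enough to reach a neuron and then wander back out to the surface before being absorbed, and only a tiny fraction manage both. All parameters below (step sizes, depths, absorption probability) are invented for illustration and bear no relation to real tissue optics or to Facebook's sensors.

```python
import random

def detected_fraction(n_photons=50_000, neuron_depth=15, max_steps=400,
                      p_absorb=0.01, seed=1):
    """Toy 1-D random walk for photons entering tissue.

    A photon counts as 'detected' only if it reaches the neuron layer
    (depth >= neuron_depth) and then wanders back to the surface
    (depth <= 0) before being absorbed or running out of steps.
    Purely illustrative -- not a real optical model.
    """
    rng = random.Random(seed)
    detected = 0
    for _ in range(n_photons):
        depth, reached_neuron = 1, False   # first scattering event, one step in
        for _ in range(max_steps):
            if rng.random() < p_absorb:
                break                      # absorbed in tissue
            depth += rng.choice((-1, 1))   # scatter deeper or shallower
            if depth >= neuron_depth:
                reached_neuron = True
            if depth <= 0:                 # back at the surface
                if reached_neuron:
                    detected += 1          # the rare useful photon
                break
    return detected / n_photons
```

Even in this generous one-dimensional toy, only a small percentage of photons complete the round trip, which is why the article describes sensors built to pick out "the small number of photons" that return rather than scattering in every direction.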

The Johns Hopkins University Applied Physics Laboratory, funded by the Department of Defense, has developed sensor technology that “could in theory accomplish this,” and Chevillet hopes to “train a machine-learning algorithm that correlates neural activity with language to extract words from our heads.”
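The kind of decoder Chevillet describes can be sketched with a minimal nearest-centroid classifier over synthetic data: each word in a hypothetical vocabulary gets a fixed "neural signature," noisy readings are averaged into per-word centroids, and a new reading is decoded as the nearest centroid. Everything here (the vocabulary, the feature vectors, the classifier choice) is an assumption for illustration, not Facebook's actual algorithm.

```python
import random

WORDS = ["yes", "no", "stop", "go"]   # hypothetical vocabulary

def make_samples(rng, n_per_word=30, dim=8, noise=0.3):
    """Simulate optical-signal feature vectors: each word gets a fixed
    random 'neural signature' plus Gaussian noise. Entirely synthetic."""
    signatures = {w: [rng.gauss(0, 1) for _ in range(dim)] for w in WORDS}
    data = []
    for w, sig in signatures.items():
        for _ in range(n_per_word):
            data.append((w, [x + rng.gauss(0, noise) for x in sig]))
    return data, signatures

def centroid(vectors):
    dim = len(vectors[0])
    return [sum(v[i] for v in vectors) / len(vectors) for i in range(dim)]

def train(data):
    """Nearest-centroid 'decoder': average the feature vectors per word."""
    by_word = {}
    for w, vec in data:
        by_word.setdefault(w, []).append(vec)
    return {w: centroid(vs) for w, vs in by_word.items()}

def decode(model, vec):
    """Return the word whose centroid is closest to the new reading."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(model, key=lambda w: dist2(model[w], vec))

rng = random.Random(0)
data, signatures = make_samples(rng)
model = train(data)
```

A real system would face readings that are far noisier and far less separable than these synthetic signatures, which is exactly the gap the skeptics quoted below point to.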

But Stanford University neuroprosthetics researcher Krishna Shenoy says he sees no evidence that “Facebook’s team could sense brain activity from outside the skull nearly as accurately as with implants.” Even if they did, the technology would have to transform the brain readings into actual words.

At the University of California, Berkeley, neuroscience postdoc Alexander Huth’s research has shown that “words — and the concepts that underlie them — are spread across the surface of our brains,” and might be decoded by “observing which parts of a brain are active.” But, Huth argues, the biggest challenge is that “we still know so little about how language works.”
