Meta Researchers Are Building New AI To Study Brainwaves

Meta researchers are working on an algorithm that studies brainwaves. The company announced on August 31 that research scientists in its AI lab have developed a model capable of "hearing" what a person is hearing by analyzing that person's brain activity.

The New AI To Study Brainwaves Is Still In Its Early Stages

According to the latest reports, the research is still in its very early stages. It is intended as a building block for technology that could help people with traumatic brain injuries who cannot communicate by talking or typing. Notably, the researchers are trying to record this brain activity without implanting electrodes in the brain, which would require surgery.

The reports claim that the Meta AI study looked at 169 healthy adult participants who heard stories and sentences read aloud while researchers recorded their brain activity with various devices. The scientists then fed that data into an AI model to find patterns. The goal was for the algorithm to "hear", that is, to determine, what participants were listening to, based on the electrical and magnetic activity in their brains.

Jean-Remi King, a research scientist at the Facebook Artificial Intelligence Research (FAIR) lab, said that a range of conditions, from traumatic brain injury to anoxia, can leave people unable to communicate. One of the paths identified for these patients over the past couple of decades is brain-computer interfaces: by placing an electrode on the motor areas of a patient's brain, researchers can decode activity and help the patient communicate with the rest of the world.

Biggest Challenges in Conducting This Experiment:

  • The signals picked up from brain activity are extremely "noisy." The sensors sit relatively far from the brain, separated by the skull and skin, which degrades the signal they need to capture. Picking up those signals with a sensor requires highly advanced technology.
  • The other big issue is more conceptual: to a large extent, no one actually knows how the brain represents language.
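To illustrate the noise problem, here is a minimal sketch (not Meta's code; the numbers are invented for illustration) of a classic remedy used in EEG/MEG research: averaging many repeated trials so that random sensor noise cancels while the underlying brain response survives.

```python
import numpy as np

rng = np.random.default_rng(42)
t = np.linspace(0, 1, 500)
true_response = np.sin(2 * np.pi * 10 * t)  # idealized 10 Hz evoked brain response

# Each trial contains the same response, buried in sensor noise that is
# several times stronger than the signal itself.
trials = true_response + rng.normal(scale=3.0, size=(200, t.size))

# A single trial correlates only weakly with the underlying response...
single_corr = np.corrcoef(trials[0], true_response)[0, 1]

# ...but averaging across trials cancels the noise (it shrinks roughly
# by 1/sqrt(N)), so the average correlates strongly with the response.
avg_corr = np.corrcoef(trials.mean(axis=0), true_response)[0, 1]
```

Averaging helps in controlled experiments with repeated stimuli, but real-time decoding cannot wait for hundreds of repetitions, which is part of why the problem is so hard.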

Meta's main approach is to hand these challenges to an AI system that learns to align representations of speech with representations of the brain activity recorded in response to that speech.
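The alignment idea can be sketched with a contrastive objective: matched (speech, brain) embedding pairs should score higher than mismatched ones. The following is a purely illustrative numpy version of a CLIP-style InfoNCE loss, not Meta's actual model; the embedding sizes and temperature are assumptions.

```python
import numpy as np

def contrastive_alignment_loss(speech_emb, brain_emb, temperature=0.1):
    """Matched (speech, brain) pairs sit on the diagonal of the similarity
    matrix; the loss rewards the model when they outscore mismatched pairs."""
    # L2-normalize both sets of embeddings
    s = speech_emb / np.linalg.norm(speech_emb, axis=1, keepdims=True)
    b = brain_emb / np.linalg.norm(brain_emb, axis=1, keepdims=True)
    logits = s @ b.T / temperature  # pairwise cosine similarities, scaled

    # softmax cross-entropy with the diagonal (matched pairs) as targets
    logits = logits - logits.max(axis=1, keepdims=True)  # numerical stability
    probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
    n = len(s)
    return -np.log(probs[np.arange(n), np.arange(n)]).mean()

# Toy check: perfectly aligned embeddings give a much lower loss
# than random, mismatched ones.
rng = np.random.default_rng(0)
speech = rng.normal(size=(8, 16))
aligned_loss = contrastive_alignment_loss(speech, speech.copy())
random_loss = contrastive_alignment_loss(speech, rng.normal(size=(8, 16)))
```

In a trained system, the speech and brain embeddings would come from learned encoders; minimizing a loss of this shape pushes the brain encoder to produce representations that line up with what the person was hearing.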


Laiba Mohsin

Laiba is an Electrical Engineer seeking a placement to gain hands-on experience in relevant areas of telecommunications. She likes to write about tech and gadgets. She loves shopping, traveling and exploring things.
