Karachi Chronicle

Meta announces AI models that convert brain activity into text with unparalleled accuracy

By Adnan Mahar · February 11, 2025 · 3 Mins Read


What happened? In collaboration with international researchers, Meta has presented major milestones in understanding human intelligence through two groundbreaking studies. The researchers created AI models that can read, interpret, and reconstruct typed sentences, and mapped the neural processes that translate thoughts into spoken or written words.

The first study, conducted by Meta's Fundamental AI Research (FAIR) lab in Paris in collaboration with the Basque Center on Cognition, Brain and Language in San Sebastián, Spain, demonstrates the ability to decode the production of typed sentences from non-invasive brain recordings. Using magnetoencephalography (MEG) and electroencephalography (EEG), the researchers recorded brain activity while 35 healthy volunteers typed sentences.

The system employs a three-part architecture consisting of an image encoder, a brain encoder, and an image decoder. The image encoder builds a rich set of representations of an image independently of the brain. The brain encoder then learns to align MEG signals with these image embeddings. Finally, the image decoder generates a plausible image from these brain representations.
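The three stages described above can be sketched in toy form. Everything here is an illustrative assumption, not Meta's actual model: the dimensions, the linear brain encoder, and the retrieval-style "decoder" (which picks the nearest candidate image embedding rather than generating pixels) are all stand-ins chosen to show how the pieces connect.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: 208 MEG channels x 100 time samples per window,
# mapped into a 64-dimensional shared embedding space.
N_CHANNELS, N_TIMES, EMBED_DIM = 208, 100, 64

def image_encoder(image):
    """Stand-in for a pretrained vision model: maps an image to a fixed embedding."""
    return image.reshape(-1)[:EMBED_DIM]  # toy projection, not a real encoder

def brain_encoder(meg_window, weights):
    """Linear read-out that aligns an MEG window with the image-embedding space."""
    return meg_window.reshape(-1) @ weights  # (channels*times,) @ (channels*times, embed)

def image_decoder(embedding, gallery_embeddings):
    """Toy 'decoder': retrieve the gallery image whose embedding is most similar
    (cosine similarity) to the brain-derived embedding."""
    sims = gallery_embeddings @ embedding / (
        np.linalg.norm(gallery_embeddings, axis=1) * np.linalg.norm(embedding) + 1e-9)
    return int(np.argmax(sims))

# Wire the three stages together on random stand-in data.
gallery_images = rng.normal(size=(10, 8, 8))           # 10 toy candidate "images"
gallery = np.stack([image_encoder(im) for im in gallery_images])
weights = rng.normal(size=(N_CHANNELS * N_TIMES, EMBED_DIM))
meg_window = rng.normal(size=(N_CHANNELS, N_TIMES))
predicted = image_decoder(brain_encoder(meg_window, weights), gallery)
print(predicted)  # index of the gallery image best matching the brain embedding
```

In the real system the brain encoder's weights would be learned by training on paired MEG/image data; here they are random, so the sketch only demonstrates the data flow, not meaningful decoding.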

The results are impressive: the AI models can decode up to 80% of the characters typed by participants whose brain activity was recorded with MEG, at least twice as well as a traditional EEG system. This study opens up new possibilities for non-invasive brain-computer interfaces that could help restore communication for individuals who have lost the ability to speak.
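The 80% figure is a character-level accuracy. The article does not specify the exact evaluation protocol, so the position-wise metric below is an assumption for illustration; real evaluations typically use a character error rate based on edit distance, which also handles insertions and deletions.

```python
def char_accuracy(predicted: str, target: str) -> float:
    """Fraction of positions where the decoded character matches the typed one.
    Assumes equal-length sequences; a simplification of character error rate."""
    if len(predicted) != len(target):
        raise ValueError("sequences must be the same length for this simple metric")
    hits = sum(p == t for p, t in zip(predicted, target))
    return hits / len(target)

# A decoder recovering 8 of 10 typed characters scores 0.8, i.e. the ~80% above.
print(char_accuracy("brainwavxz", "brainwaves"))  # → 0.8
```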

The second study focuses on understanding how the brain transforms thoughts into language. By interpreting MEG signals with AI, the researchers could identify the precise moments at which thoughts are converted into words, syllables, and individual letters while participants typed sentences.

This study reveals that the brain starts at the most abstract level (the meaning of a sentence) and progressively generates a series of representations that resolve into concrete actions, such as the finger movements used to type. It also demonstrates that the brain uses a "dynamic neural code" to chain consecutive representations together while maintaining each one over time.

While this technology is promising, several challenges remain before it can be applied in a clinical setting. Decoding performance is still imperfect, and MEG requires subjects to remain stationary inside a magnetically shielded room: the scanner is large and expensive, and because the Earth's magnetic field is roughly a trillion times stronger than the brain's, recordings must be shielded from it.

Meta plans to address these limitations in future research by improving the accuracy and reliability of the decoding process, exploring alternative non-invasive brain-imaging techniques that are more practical for everyday use, and developing more sophisticated AI models that can better interpret complex brain signals. The company also aims to expand its research to a wider range of cognitive processes and to explore potential applications in areas such as healthcare, education, and human-computer interaction.

Further research is needed before these developments can help people with brain damage, but they bring us closer to building AI systems that can learn and reason like humans.



Source link

Adnan Mahar

Adnan is a passionate doctor from Pakistan with a keen interest in exploring the world of politics, sports, and international affairs. As an avid reader and lifelong learner, he is deeply committed to sharing insights, perspectives, and thought-provoking ideas. His journey combines a love for knowledge with an analytical approach to current events, aiming to inspire meaningful conversations and broaden understanding across a wide range of topics.
