The ability to communicate using only the thoughts in your mind might sound like the stuff of science fiction. But for people who’ve lost the ability to speak or move due to injury or disease, there’s now great hope that it may one day be possible using brain-computer interfaces (BCIs) that can “read” the relevant brain signals and translate them directly into written or spoken words. An NIH-supported team has now made an important but preliminary advance in this direction by showing for the first time that a computer can decode silent, internal speech with little training.
The NIH Brain Research Through Advancing Innovative Neurotechnologies® (BRAIN) Initiative has expanded scientists’ understanding of the human brain in recent years, offering fascinating insights into the ways that individual cells and complex neural circuits interact dynamically to enable us to think, feel, and act. But neuroscientists still have much more to learn about how our brains are put together at the most fundamental, subcellular level.
As a step in that direction, in a new study supported in part by the NIH BRAIN Initiative and reported in the journal Science, researchers have created the most detailed nanoscale-resolution map ever produced of a cubic millimeter of brain tissue, a volume about the size of half a grain of rice.
Human consciousness requires a person to be both awake and aware. While neuroscientists have learned a great deal from research about the underlying brain networks that sustain awareness, surprisingly little has been known about the networks that keep us awake.
Now, an NIH-supported team of researchers has mapped the connectivity of a neural network they suggest is essential for wakefulness, or arousal, in the human brain. According to the researchers, this advance, reported in Science Translational Medicine, is essential for understanding human consciousness. It may also lead to new ways of understanding what happens in the brain when people lose consciousness, with potentially important implications for treating those who have entered a coma or vegetative state.
Millions of children around the world are born each year with severe genetic disorders. Many of these are Mendelian disorders, rare genetic conditions caused by mutations in a single gene. But pinpointing the specific gene responsible in order to reach a clear diagnosis for an individual can be labor-intensive, and reanalyzing undiagnosed cases is also difficult. As a result, only about 30% of people with a rare genetic disorder receive a definitive diagnosis, and on average it takes six years from symptom onset to diagnosis.
Progress is needed to get accurate diagnoses to individuals and families more often and faster, and to create more efficient ways to update genetic diagnoses as new discoveries are made. As an important step in this direction, a team funded in part by NIH has developed a new artificial intelligence (AI) system called AI-MARRVEL (AI-Model organism Aggregated Resources for Rare Variant ExpLoration).