Scientists find how brain encodes speech, moving closer to brain-speech machine interface

    Source: Xinhua | 2018-09-27 02:34:47 | Editor: yan

    WASHINGTON, Sept. 26 (Xinhua) -- American researchers have unlocked new information about how the brain encodes speech, moving closer to developing a speech-brain machine interface that can decode the commands the brain sends to the tongue, palate, lips and larynx.

    The study published on Wednesday in the Journal of Neuroscience revealed that the brain controls speech production in a similar manner to how it controls the production of arm and hand movements.

    Northwestern University researchers recorded signals from two parts of the brain and decoded what these signals represented.

    They found that the brain represented both the goals of what we are trying to say (speech sounds like "pa" and "ba") and the individual movements that we use to achieve those goals (how we move our lips, palate, tongue and larynx). The different representations occur in two different parts of the brain.

    The discovery could eventually help people like the late Stephen Hawking, as well as people with speech disorders such as apraxia of speech, communicate more intuitively through an effective brain machine interface (BMI).

    "This can help us build better speech decoders for BMIs, which will move us closer to our goal of helping people that are locked-in speak again," said the paper's lead author Marc Slutzky, associate professor of neurology and of physiology at Northwestern.

    Speech is composed of individual sounds, called phonemes, which are produced by coordinated movements of the lips, tongue, palate and larynx. However, scientists didn't know exactly how these movements, called articulatory gestures, are planned by the brain.

    Slutzky and his colleagues found speech motor areas of the brain had a similar organization to arm motor areas of the brain.

    The scientists recorded brain signals from the cortical surface using electrodes placed on the brains of patients undergoing surgery to remove brain tumors. The patients were kept awake during surgery and asked to read words aloud.

    After the surgery, the scientists marked the times when the patients produced phonemes and gestures. They then used the brain signals recorded from each cortical area to decode which phonemes and gestures had been produced, and measured the decoding accuracy.
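
    The article does not spell out the decoding method, but the step described above is essentially a classification problem. Below is a minimal illustrative sketch, assuming a simple linear classifier and synthetic data (the features, label sets and classifier choice are assumptions, not the authors' pipeline), of how decoding accuracy for phonemes and gestures could be estimated from cortical recordings.

    # Minimal sketch of the decoding step, using synthetic data and a linear
    # classifier; the study's actual features and decoder are not specified here.
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)

    # Hypothetical feature matrix: one row per speech event, one column per
    # electrode-derived feature (e.g., signal power around event onset).
    n_events, n_features = 200, 64
    X = rng.normal(size=(n_events, n_features))

    # Hypothetical labels: which phoneme (or gesture) was produced at each event.
    phoneme_labels = rng.integers(0, 4, size=n_events)   # e.g., "pa", "ba", ...
    gesture_labels = rng.integers(0, 6, size=n_events)   # e.g., lip closure, ...

    # Cross-validated decoding accuracy for each label type; in the study this
    # would be computed separately for precentral and inferior frontal signals.
    for name, labels in [("phonemes", phoneme_labels), ("gestures", gesture_labels)]:
        accuracy = cross_val_score(LinearDiscriminantAnalysis(), X, labels, cv=5).mean()
        print(f"Decoding accuracy for {name}: {accuracy:.2f}")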

    Signals from the precentral cortex decoded gestures more accurately than phonemes, while signals from the inferior frontal cortex, a higher-level speech area, decoded phonemes and gestures equally well, according to the study.

    This finding helps support linguistic models of speech production and could guide engineers in designing brain machine interfaces to decode speech from these brain areas.

    The next step for the research is to develop an algorithm for brain machine interfaces that would not only decode gestures but also combine those decoded gestures to form words.
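
    As a rough illustration of that idea (a hedged sketch, not the authors' algorithm), decoded gestures could be mapped to phonemes and the resulting phoneme string looked up in a lexicon; the gesture names, mapping and lexicon below are all hypothetical.

    # Hypothetical sketch: combine decoded articulatory gestures into a word by
    # mapping gesture frames to phonemes and looking the result up in a lexicon.
    from typing import List, Tuple

    # Hypothetical mapping from decoded gesture frames to phonemes.
    GESTURE_TO_PHONEME = {
        ("lip_closure", "voicing"): "b",
        ("lip_closure", "no_voicing"): "p",
        ("tongue_low", "jaw_open"): "a",
    }

    # Hypothetical lexicon mapping phoneme strings to words.
    LEXICON = {"ba": "ba", "pa": "pa"}

    def gestures_to_word(gesture_frames: List[Tuple[str, str]]) -> str:
        """Translate a sequence of decoded gesture frames into a word, if known."""
        phonemes = [GESTURE_TO_PHONEME.get(frame, "?") for frame in gesture_frames]
        return LEXICON.get("".join(phonemes), "<unknown>")

    # Example: two decoded frames forming the syllable "ba".
    print(gestures_to_word([("lip_closure", "voicing"), ("tongue_low", "jaw_open")]))

    A working interface would, of course, have to handle noisy, probabilistic gesture estimates rather than clean symbols, which is part of what makes this next step challenging.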
