Published: April 19, 2022, 18:40
Researchers at the USC Viterbi School of Engineering are using generative adversarial networks (GANs) -- technology best known for creating deepfake videos and photorealistic human faces -- to improve brain-computer interfaces for people with disabilities.
In a paper published in Nature Biomedical Engineering, the team successfully taught an AI to generate synthetic brain-activity data. The data, specifically neural signals called spike trains, can be fed into machine-learning algorithms to improve the usability of brain-computer interfaces (BCIs).
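At its core, a GAN pits two networks against each other: a generator that produces candidate spike trains from random noise, and a discriminator that tries to tell them apart from real recordings. The sketch below is a minimal, hypothetical illustration of that idea in PyTorch -- the network sizes, the binary (neurons x time bins) spike representation, and the training loop are assumptions for clarity, not the architecture reported in the paper.

```python
# Minimal GAN sketch for synthetic spike trains (illustrative assumptions only).
import torch
import torch.nn as nn

N_NEURONS, N_BINS, LATENT_DIM = 32, 100, 64  # assumed data shape and noise size

class Generator(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(LATENT_DIM, 256), nn.ReLU(),
            nn.Linear(256, N_NEURONS * N_BINS), nn.Sigmoid(),  # firing probabilities in [0, 1]
        )
    def forward(self, z):
        return self.net(z).view(-1, N_NEURONS, N_BINS)

class Discriminator(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Flatten(),
            nn.Linear(N_NEURONS * N_BINS, 256), nn.LeakyReLU(0.2),
            nn.Linear(256, 1),  # real-vs-fake logit
        )
    def forward(self, x):
        return self.net(x)

def train_step(gen, disc, real_spikes, g_opt, d_opt):
    """One adversarial update on a batch of real spike trains."""
    bce = nn.BCEWithLogitsLoss()
    batch = real_spikes.size(0)
    fake = gen(torch.randn(batch, LATENT_DIM))

    # Discriminator: push real toward 1, generated toward 0.
    d_loss = bce(disc(real_spikes), torch.ones(batch, 1)) + \
             bce(disc(fake.detach()), torch.zeros(batch, 1))
    d_opt.zero_grad(); d_loss.backward(); d_opt.step()

    # Generator: try to make the discriminator label its output as real.
    g_loss = bce(disc(fake), torch.ones(batch, 1))
    g_opt.zero_grad(); g_loss.backward(); g_opt.step()
    return d_loss.item(), g_loss.item()

if __name__ == "__main__":
    gen, disc = Generator(), Discriminator()
    g_opt = torch.optim.Adam(gen.parameters(), lr=2e-4)
    d_opt = torch.optim.Adam(disc.parameters(), lr=2e-4)
    # Placeholder "real" data: random sparse binary matrices stand in for recordings.
    real = (torch.rand(16, N_NEURONS, N_BINS) < 0.1).float()
    for _ in range(5):
        print(train_step(gen, disc, real, g_opt, d_opt))
```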
BCI systems work by analyzing a person's brain signals and translating that neural activity into commands, allowing the user to control digital devices like computer cursors using only their thoughts. These devices can improve quality of life for people with motor dysfunction or paralysis, even those with locked-in syndrome -- when a person is fully conscious but unable to move or communicate.
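In practice, the "translation" step is typically a decoder trained during calibration to map features of neural activity (for example, spike counts per time bin) to an intended command such as cursor velocity. The snippet below sketches that idea with a simple ridge-regression decoder on made-up data; the feature choice, shapes, and synthetic signals are illustrative assumptions, not the decoders used in the study.

```python
# Sketch of a linear BCI decoder: binned spike counts -> 2-D cursor velocity.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
n_samples, n_neurons = 500, 32

# Synthetic calibration data: spike counts per time bin and the cursor
# velocity the user intended while those spikes were recorded.
spike_counts = rng.poisson(lam=3.0, size=(n_samples, n_neurons)).astype(float)
true_weights = rng.normal(size=(n_neurons, 2))
intended_velocity = spike_counts @ true_weights + rng.normal(scale=0.5, size=(n_samples, 2))

decoder = Ridge(alpha=1.0).fit(spike_counts, intended_velocity)

# At run time, each new bin of neural activity is decoded into a cursor command.
new_bin = rng.poisson(lam=3.0, size=(1, n_neurons)).astype(float)
vx, vy = decoder.predict(new_bin)[0]
print(f"move cursor by ({vx:.2f}, {vy:.2f})")
```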
Various forms of BCI are already available, from caps that measure brain signals to devices implanted in brain tissue. New use cases are being identified all the time, from neurorehabilitation to treating depression. But despite all of this promise, it has proved challenging to make these systems fast and robust enough for the real world.
Specifically, to make sense of their inputs, BCIs need huge amounts of neural data and long periods of training, calibration and learning.
"getting enough data for the algorithms that power bcis can be difficult, expensive, or even impossible if paralyzed individuals are not able to produce sufficiently robust brain signals," said laurent itti, a computer science professor and study co-author.