[2603.27492] Copilot-Assisted Second-Thought Framework for Brain-to-Robot Hand Motion Decoding
Computer Science > Robotics
arXiv:2603.27492 (cs)
[Submitted on 29 Mar 2026]

Title: Copilot-Assisted Second-Thought Framework for Brain-to-Robot Hand Motion Decoding
Authors: Yizhe Li (1), Shixiao Wang (1), Jian K. Liu (1) ((1) University of Birmingham, Birmingham, United Kingdom)

Abstract: Motor kinematics prediction (MKP) from electroencephalography (EEG) is an important research area for developing movement-related brain-computer interfaces (BCIs). While traditional methods often rely on convolutional neural networks (CNNs) or recurrent neural networks (RNNs), Transformer-based models have shown a strong ability to model long sequential EEG data. In this study, we propose a CNN-attention hybrid model for decoding hand kinematics from EEG during grasp-and-lift tasks, achieving strong performance in within-subject experiments. We further extend this approach to EEG-EMG multimodal decoding, which yields substantially improved results. Within-subject tests achieve PCC values of 0.9854, 0.9946, and 0.9065 for the X, Y, and Z axes, respectively, computed on the midpoint trajectory between the thumb and index finger, while cross-subject tests yield 0.9643, 0.9795, and 0.5852. The decoded trajectories from both modalities are then used to control a Franka Panda robotic arm in a MuJoCo simulation. To enhance trajectory f...
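The abstract names a CNN-attention hybrid but gives no architectural details. Below is a minimal PyTorch sketch of the general idea, temporal convolution for local EEG features followed by self-attention for long-range structure, with a linear head regressing per-timestep 3-D kinematics. Channel counts, kernel sizes, and layer depths are illustrative assumptions, not the authors' configuration.

import torch
import torch.nn as nn

class CNNAttentionDecoder(nn.Module):
    """Sketch of a CNN-attention hybrid: a temporal convolution extracts
    local EEG features, a Transformer encoder models long-range temporal
    dependencies, and a linear head regresses X/Y/Z hand kinematics."""
    def __init__(self, n_channels: int = 32, d_model: int = 64, n_heads: int = 4):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv1d(n_channels, d_model, kernel_size=7, padding=3),
            nn.BatchNorm1d(d_model),
            nn.GELU(),
        )
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, batch_first=True)
        self.attn = nn.TransformerEncoder(encoder_layer, num_layers=2)
        self.head = nn.Linear(d_model, 3)  # X, Y, Z coordinates per time step

    def forward(self, eeg: torch.Tensor) -> torch.Tensor:
        # eeg: (batch, n_channels, time)
        x = self.conv(eeg).transpose(1, 2)  # -> (batch, time, d_model)
        x = self.attn(x)
        return self.head(x)                 # -> (batch, time, 3)

model = CNNAttentionDecoder()
out = model(torch.randn(2, 32, 500))  # e.g. 500-sample EEG windows
print(out.shape)  # torch.Size([2, 500, 3])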
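The reported PCC values are Pearson correlation coefficients between decoded and ground-truth coordinates, computed per axis on the thumb-index midpoint trajectory. A minimal sketch of that evaluation follows, using synthetic stand-in data; the array shapes and variable names are assumptions, not the paper's data format.

import numpy as np

def pcc(pred: np.ndarray, true: np.ndarray) -> float:
    """Pearson correlation coefficient between two 1-D trajectories."""
    pred = pred - pred.mean()
    true = true - true.mean()
    return float(np.dot(pred, true) / (np.linalg.norm(pred) * np.linalg.norm(true)))

# thumb, index: (T, 3) arrays of X/Y/Z positions over T time steps (synthetic here).
rng = np.random.default_rng(0)
thumb = rng.normal(size=(200, 3))
index = rng.normal(size=(200, 3))
midpoint = (thumb + index) / 2.0                              # target kinematics
decoded = midpoint + 0.1 * rng.normal(size=midpoint.shape)    # stand-in model output

for axis, name in enumerate("XYZ"):
    print(f"PCC ({name} axis): {pcc(decoded[:, axis], midpoint[:, axis]):.4f}")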
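For the final stage, the decoded trajectories drive a Franka Panda arm in MuJoCo. A minimal sketch of such a simulation loop is shown below, assuming the MuJoCo Python bindings and a locally available Panda model XML (the path is hypothetical; the official model is distributed separately, e.g. via MuJoCo Menagerie). The abstract does not specify how Cartesian targets map to joint commands, so that controller is left as a placeholder.

import mujoco
import numpy as np

# Hypothetical asset path; point this at your local Panda model XML.
model = mujoco.MjModel.from_xml_path("franka_emika_panda/panda.xml")
data = mujoco.MjData(model)

# Stand-in for the decoded thumb-index midpoint trajectory, shape (T, 3).
decoded_xyz = np.tile(np.array([0.4, 0.0, 0.5]), (500, 1))

for target in decoded_xyz:
    # A real setup would convert each Cartesian target to joint commands
    # (e.g. via inverse kinematics) before actuation; here the command is
    # a placeholder and the simulation is simply stepped forward.
    data.ctrl[:] = 0.0
    mujoco.mj_step(model, data)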