Multimodal sensing and shared control are rapidly evolving areas that are pivotal to advancing wearable robotics. This half-day workshop will examine cutting-edge technologies that integrate multiple sensing modalities, such as visual, tactile, and electrophysiological inputs, into wearable devices, improving their adaptability and functionality. We will also explore shared control, in which users and robotic systems collaborate to enhance both user experience and performance across applications ranging from rehabilitation to personal assistance.
By integrating novel sensing modalities and biosignals within unified control architectures, researchers have a unique opportunity to explore how to increase device autonomy while enabling users to maintain a sense of agency and ownership. Participants will gain insight into the latest innovations and emerging trends in wearable robotics, with a focus on multimodal sensing and shared control, and will see how these technologies can improve device utility and thereby enhance users' quality of life.