Invention Title:

ASYNCHRONOUS BRAIN COMPUTER INTERFACE IN AR USING STEADY-STATE MOTION VISUAL EVOKED POTENTIAL

Publication number:

US20240288940

Section:

Physics

Class:

G06F3/015

Smart overview of the Invention

A method and system utilize steady-state motion visual evoked potential (SSMVEP) stimuli within an augmented reality (AR) environment. The system begins by receiving stimuli data from a user application on a smart device, along with sensor and context data. This information is processed to create modified stimuli, which are then mixed with environmental stimuli and presented to the user through a rendering device. The user's biosignals, generated in response to these rendered stimuli, are collected via a wearable biosignal sensing device for classification and feedback.
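The overall flow can be summarized as a short pipeline sketch. The Python below only illustrates the steps described above; the object names (user_app, sensors, renderer, biosignal_device, classifier) and the adapt_stimuli helper are hypothetical placeholders, not interfaces defined in the application.

```python
# Illustrative sketch of the described data flow; all names are hypothetical.
from dataclasses import dataclass, field

@dataclass
class StimuliData:
    targets: list = field(default_factory=list)              # selectable AR items
    motion_frequencies: list = field(default_factory=list)   # SSMVEP frequency (Hz) per target

def adapt_stimuli(stimuli, context):
    # Placeholder processing step: a real system would use the sensor and
    # context data (lighting, head pose, scene content) to modify the stimuli.
    return stimuli

def run_cycle(user_app, sensors, renderer, biosignal_device, classifier):
    stimuli = user_app.get_stimuli_data()       # stimuli data from the user application
    context = sensors.get_context_data()        # sensor and context data from the smart device

    modified = adapt_stimuli(stimuli, context)  # create modified stimuli

    renderer.render(modified)                   # mix with environmental stimuli on the rendering device

    eeg_window = biosignal_device.read_window(seconds=2.0)  # biosignals evoked by the rendered stimuli

    decision = classifier.predict(eeg_window)   # classify the response
    user_app.apply_feedback(decision)           # feedback to the user application
```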

Background of Brain-Computer Interfaces

Electroencephalography (EEG)-based brain-computer interfaces (BCIs) enable direct communication between the brain and external devices, bypassing the peripheral nerve and muscle pathways. Existing steady-state visual evoked potential (SSVEP) BCIs rely on detecting EEG responses to repetitive flickering stimuli, which often cause visual fatigue and discomfort. These limitations hinder performance and interactivity, especially when the stimuli are presented on screens that are not integrated with the user's real-world environment.

Challenges in Current BCI Systems

Conventional SSVEP systems present flickering stimuli on fixed displays, which can cause discomfort and degrade signal quality. Users often have to shift their attention between the computer screen and their surroundings, introducing distractions that negatively affect the EEG readings. The limited field of view of existing head-mounted displays (HMDs) further restricts the effectiveness of AR applications in this context.

Innovative Features of the Proposed System

The proposed SSMVEP-based BCI addresses these challenges by using visually perceived motion rather than flashing stimuli, thereby reducing discomfort and improving interaction. By employing an optical see-through (OST) HMD, the system overlays the virtual stimuli directly onto the user's view of the real world, so attention does not have to leave the environment. The system operates asynchronously: it continuously analyzes the user's EEG and acts only when an intention is detected, rather than constraining interaction to fixed trial timing.
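As a rough illustration of these two ideas, the sketch below models an SSMVEP target as a ring whose radius oscillates at a tagging frequency (perceived motion instead of flicker) and an asynchronous loop that continuously classifies overlapping EEG windows, emitting a command only when confidence is high. The eeg_stream and classifier objects, the window parameters, and the threshold value are assumptions for illustration, not elements specified in the application.

```python
import numpy as np

def ssmvep_radius(t, base_radius=1.0, depth=0.2, freq_hz=7.5):
    # Motion-based stimulus: a ring expands and contracts sinusoidally at
    # freq_hz instead of flashing on and off, which is gentler on the eyes.
    return base_radius * (1.0 + depth * np.sin(2.0 * np.pi * freq_hz * t))

def asynchronous_loop(eeg_stream, classifier, confidence_threshold=0.8):
    # Continuously analyze overlapping EEG windows; emit a selection only
    # when the classifier is confident, otherwise treat the state as idle
    # so the user can look around freely without triggering commands.
    for window in eeg_stream:                       # e.g. 2 s windows, 0.5 s hop
        label, confidence = classifier.predict(window)
        if confidence >= confidence_threshold:
            yield label                             # detected user intention
```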

Implementation and User Interaction

The process involves generating perceptible stimuli that modulate the user's EEG, which the system then interprets as decisions or actions. By applying classification methods such as convolutional neural networks (CNNs), the system aims for high accuracy in decoding user intentions. This approach improves user comfort and broadens the applicability of BCIs to practical, real-world scenarios.
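A minimal example of such a CNN classifier is sketched below in PyTorch; the layer sizes, sampling rate, window length, and number of classes are assumptions chosen for illustration and do not reproduce the specific network described in the application.

```python
import torch
import torch.nn as nn

class EEGConvNet(nn.Module):
    """Small 1-D CNN over (channels x time) EEG windows; illustrative only."""
    def __init__(self, n_channels=8, n_samples=500, n_classes=4):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(n_channels, 16, kernel_size=32, padding=16),  # temporal filtering
            nn.BatchNorm1d(16),
            nn.ELU(),
            nn.AvgPool1d(4),
            nn.Conv1d(16, 32, kernel_size=16, padding=8),
            nn.BatchNorm1d(32),
            nn.ELU(),
            nn.AvgPool1d(4),
        )
        # Infer the flattened feature size from a dummy pass.
        with torch.no_grad():
            n_flat = self.features(torch.zeros(1, n_channels, n_samples)).numel()
        self.classifier = nn.Linear(n_flat, n_classes)

    def forward(self, x):                 # x: (batch, n_channels, n_samples)
        return self.classifier(self.features(x).flatten(1))

# Example: one 2-second window from 8 electrodes sampled at 250 Hz,
# classified into 4 SSMVEP targets.
model = EEGConvNet(n_channels=8, n_samples=500, n_classes=4)
logits = model(torch.randn(1, 8, 500))
predicted_target = logits.argmax(dim=1)
```

In this sketch each output class would correspond to one SSMVEP target, and the argmax over the logits is taken as the user's intended selection, which can then be fed back to the application.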