
Mobile robot navigation with a self-paced brain–computer interface based on high-frequency SSVEP

Published online by Cambridge University Press:  27 November 2013

Pablo F. Diez*
Affiliation:
Gabinete de Tecnología Médica (GATEME), Facultad de Ingeniería, Universidad Nacional de San Juan (UNSJ), San Juan, Argentina Instituto de Automática (INAUT), Facultad de Ingeniería, Universidad Nacional de San Juan (UNSJ), San Juan, Argentina
Vicente A. Mut
Affiliation:
Instituto de Automática (INAUT), Facultad de Ingeniería, Universidad Nacional de San Juan (UNSJ), San Juan, Argentina
Eric Laciar
Affiliation:
Gabinete de Tecnología Médica (GATEME), Facultad de Ingeniería, Universidad Nacional de San Juan (UNSJ), San Juan, Argentina
Enrique M. Avila Perona
Affiliation:
Instituto de Automática (INAUT), Facultad de Ingeniería, Universidad Nacional de San Juan (UNSJ), San Juan, Argentina
*Corresponding author. E-mail: pdiez@gateme.unsj.edu.ar

Summary

A brain–computer interface (BCI) is a system for commanding a device by means of brain signals, without any muscular movement. One class of BCI is based on steady-state visual evoked potentials (SSVEP), which are visual cortex responses elicited by a flickering light source. Such stimuli can cause visual fatigue; however, it is well established that high-frequency SSVEP stimulation (>30 Hz) does not. In this paper, a mobile robot is remotely navigated through an office environment by means of an asynchronous, high-frequency SSVEP-based BCI, with visual feedback from a video camera. The BCI uses only three electroencephalographic channels and a simple signal-processing method. The robot's velocity-control and obstacle-avoidance algorithms are also described. Seven volunteers were able to drive the mobile robot to two different destinations, evading desks and shelves, passing through a doorway, and navigating a corridor. The system was designed to let the subject steer without restrictions, since he/she retained full control of the robot's movements. It is concluded that the developed system allows remote mobile-robot navigation in real indoor environments using brain signals. The proposed system is easy to use and requires no special training. The user's visual fatigue is reduced because high-frequency stimulation is employed and, furthermore, the user gazes at the stimulus only when a command must be sent to the robot.
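The abstract does not detail the "simple signal-processing method", but SSVEP detection is commonly done by comparing spectral power at each candidate flicker frequency against the background spectrum; returning no command when no frequency dominates gives the self-paced (asynchronous) behavior described above. The sketch below is a minimal illustration of that idea, not the authors' method; the function name, the threshold scheme, and the synthetic signal are all assumptions for demonstration.

```python
import numpy as np

def detect_ssvep(eeg, fs, stim_freqs, band=0.5, threshold=2.0):
    """Return the stimulus frequency whose spectral peak dominates the
    EEG segment, or None (idle state) if no peak is strong enough.

    eeg: 1-D array, one EEG channel (or channels averaged beforehand).
    fs: sampling rate in Hz.
    stim_freqs: candidate flicker frequencies in Hz.
    band: half-width (Hz) of the search window around each frequency.
    threshold: minimum ratio of peak power to mean spectral power.
    """
    freqs = np.fft.rfftfreq(len(eeg), 1.0 / fs)
    # Windowed power spectrum of the segment.
    psd = np.abs(np.fft.rfft(eeg * np.hanning(len(eeg)))) ** 2
    baseline = psd.mean()
    scores = []
    for f in stim_freqs:
        mask = (freqs >= f - band) & (freqs <= f + band)
        scores.append(psd[mask].max() / baseline)
    best = int(np.argmax(scores))
    return stim_freqs[best] if scores[best] > threshold else None

# Synthetic 2-s segment: a 37 Hz SSVEP response buried in noise.
fs = 256
t = np.arange(2 * fs) / fs
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 37 * t) + 0.5 * rng.standard_normal(t.size)
print(detect_ssvep(eeg, fs, [37, 38, 39, 40]))  # -> 37
```

Because the detector returns `None` when no stimulus stands out, the robot receives commands only while the user is actually gazing at a flickering target, matching the self-paced operation described in the summary.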

Information

Type
Articles
Copyright
Copyright © Cambridge University Press 2013 
