Roadmap

Milestone: Initial Public Release

Completed 4 years ago (02/06/10 00:00:00)

Initial Public Release

Puzzlebox Brainstorms version 0.10 released for download.

This version permits control of a LEGO Mindstorms robot using the standard Emotiv EPOC headset (non-developer version).

Detections are made through the Emotiv Control Panel, then sent via EmoKey to Puzzlebox Brainstorms' client GUI running on a Windows PC. From there control commands are sent over a TCP/IP connection to the Brainstorms server application, and finally to the LEGO NXT brick via Bluetooth through the "remote control" component.
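The client-to-server hop in this chain can be sketched as a plain TCP client. The host, port, and newline-terminated command format below are illustrative assumptions, not the actual Brainstorms wire protocol:

```python
import socket

def send_command(command, host="127.0.0.1", port=9999):
    """Send a single control command string to the Brainstorms server.

    The wire format (a newline-terminated ASCII command) and the port
    number are assumptions for illustration; the real Brainstorms
    protocol may differ.
    """
    with socket.create_connection((host, port), timeout=5) as sock:
        sock.sendall(command.encode("ascii") + b"\n")
```

The server side would accept such connections, parse each command line, and relay the corresponding drive instruction to the NXT brick over Bluetooth.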

Milestone: Windows Installation Package

Completed 4 years ago (06/10/10 20:02:48)

Windows Installation Package

Currently all software is distributed as raw source files.

This milestone is focused on producing a self-extracting, automatic installation package which includes all software components (those which are freely redistributable) for a Windows PC based installation.

NeuroSky Mindset Hardware Support

http://store.neurosky.com/products/mindset

The "Mindset" EEG headset from NeuroSky provides fewer sensors, and therefore supports less complex detection algorithms, than the EPOC headset from Emotiv, but it comes at a lower cost and should still provide at least single-direction control.

The hardware is also easier for a general user to put on and take off, as it does not require saline solution or cleaning of sensors, which might make it more appropriate for a classroom environment.
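NeuroSky's ThinkGear Connector software streams headset data as JSON packets over a local TCP socket, so single-direction control could key off the 0-100 "attention" eSense value. The sketch below parses one line of that stream; the exact fields available depend on the headset model and connector settings:

```python
import json

def parse_attention(line):
    """Extract the 0-100 "attention" eSense value from one line of the
    ThinkGear Connector's JSON stream, or return None if it is absent.

    The packet layout follows NeuroSky's published ThinkGear socket
    protocol; treat field names and availability as assumptions until
    verified against the actual connector output.
    """
    try:
        packet = json.loads(line)
    except ValueError:
        return None
    esense = packet.get("eSense", {})
    return esense.get("attention")
```

A Brainstorms client could then issue a "drive forward" command whenever the parsed attention value stays above a chosen threshold.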

Milestone: Variable Control Duration

Completed 4 years ago (07/11/10 05:08:50)

Variable Control Duration

Currently direction control is discrete - that is, directional movements occur one at a time, for a set time period.

This milestone will permit users to control the amount of time movement occurs by pressing or holding down a button (or when controlled via EEG for the entire time a detection occurs, not merely once per detection).
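The press-and-hold behaviour described above amounts to edge detection on a continuously sampled input: send a start command when the detection (button or EEG) becomes active, and a stop command when it ends. A minimal sketch, with hypothetical command names and an injected transport callable:

```python
class HoldToMoveController:
    """Translate a continuously sampled detection (button held down, or
    EEG detection active) into start/stop drive commands, so movement
    lasts as long as the detection does rather than one fixed burst.

    `send` is any callable that delivers a command string to the robot;
    the command names here are illustrative, not the real protocol.
    """

    def __init__(self, send):
        self.send = send
        self.active = False

    def update(self, detection_active):
        if detection_active and not self.active:
            self.send("start_forward")   # detection began: start moving
        elif not detection_active and self.active:
            self.send("stop")            # detection ended: halt
        self.active = detection_active
```

Calling `update()` on every poll of the button or detection state produces exactly one start and one stop command per hold, regardless of how often the state is sampled.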

GUI Configuration Menu

Currently all user-configurable options are stored in the text file "puzzlebox_brainstorms_configuration.ini" and must be hand-edited by the user to change settings such as the IP address of the server application.

This milestone will permit these settings to be changed by a user at will in a more convenient manner.
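Since the settings already live in an .ini file, a GUI configuration menu could read and write them through Python's standard library parser. The section and key names below are assumptions for illustration; the real layout of puzzlebox_brainstorms_configuration.ini may differ:

```python
from configparser import ConfigParser

CONFIG_FILE = "puzzlebox_brainstorms_configuration.ini"

def set_server_address(host, port, path=CONFIG_FILE):
    """Update the server address in the configuration file.

    The [server] section and host/port key names are assumptions made
    for this sketch; adapt them to the actual file layout.
    """
    config = ConfigParser()
    config.read(path)
    if not config.has_section("server"):
        config.add_section("server")
    config.set("server", "host", host)
    config.set("server", "port", str(port))
    with open(path, "w") as handle:
        config.write(handle)
```

A settings dialog would call such a function when the user clicks "Save", so the same file keeps working for users who prefer to edit it by hand.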

P300 EEG Control

Currently EEG control is handled entirely via proprietary software from Emotiv, which is specific to their hardware.

One form of EEG control which is not supported by Emotiv's Developer SDK but which is popular for BCI control in research environments is referred to as "P300":

http://en.wikipedia.org/wiki/P300_(neuroscience)

http://www.bci2000.org/wiki/index.php/User_Tutorial:Introduction_to_the_P300_Response

For P300-based control the Brainstorms software will randomly highlight a direction for approximately half a second before highlighting another direction at random. When the user sees the direction in which they wish the robot to drive being highlighted, the P300 response is generated automatically in their brain approximately 300 ms after the moment recognition occurs (hence the name "P300").

Such control is envisioned to be easier for the majority of users.

This level of control will require access to raw EEG signals, as well as implementation of P300 algorithms either from scratch or from a third-party Open Source library such as OpenViBE: http://openvibe.inria.fr/
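The core of a from-scratch approach is epoch averaging: for each direction, average the EEG segments following that direction's highlight onsets, then pick the direction whose average shows the strongest positive deflection around 300 ms. The sketch below is deliberately minimal (one channel, raw amplitude comparison, assumed 128 Hz sampling rate); practical P300 classifiers add filtering, baseline correction, and trained classifiers:

```python
def detect_p300_direction(eeg, onsets_by_direction, fs=128):
    """Pick the direction whose highlight onsets evoke the strongest
    average positive deflection roughly 300 ms after stimulus.

    eeg: sequence of samples from a single (e.g. parietal) channel.
    onsets_by_direction: {"forward": [sample indices], ...}.
    """
    epoch_len = int(0.6 * fs)                # 600 ms epoch per highlight
    w0, w1 = int(0.25 * fs), int(0.40 * fs)  # score the 250-400 ms window
    scores = {}
    for direction, onsets in onsets_by_direction.items():
        epochs = [eeg[t:t + epoch_len] for t in onsets
                  if t + epoch_len <= len(eeg)]
        # Average the epochs sample-by-sample, then score the window.
        mean_epoch = [sum(samples) / len(epochs) for samples in zip(*epochs)]
        scores[direction] = sum(mean_epoch[w0:w1]) / (w1 - w0)
    return max(scores, key=scores.get)
```

Averaging across several highlights of each direction is what makes the tiny P300 deflection stand out from background EEG noise.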

One significant form of EEG-based BCI control which is not currently supported is referred to as Steady-State Visual Evoked Potential (SSVEP):

http://en.wikipedia.org/wiki/Steady_state_visually_evoked_potential

http://iopscience.iop.org/1741-2552/2/4/008

http://www.ncbi.nlm.nih.gov/pmc/articles/PMC1880883/

For SSVEP based control the Brainstorms software will present a set of four checkerboard patterns which correspond to four directions the robot can be instructed to drive (forward, backward, turn left, and turn right). By having the computer display rapidly flash the checkerboard patterns at different frequencies (between 3 and 40 Hz) it is possible to measure the harmonic resonance in the visual cortex (at the back of the head) which matches the specific pattern the user is looking at, and therefore determine the direction they wish to steer.

Such control is envisioned to be easy for the majority of users and to require little or no training.

This level of control will require access to raw EEG signals, as well as implementation of SSVEP algorithms either from scratch or from a third-party Open Source library such as OpenViBE: http://openvibe.inria.fr
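A simple from-scratch SSVEP detector measures the power of the occipital channel at each of the four flicker frequencies and picks the strongest. The sketch below uses the Goertzel algorithm for single-frequency power; the four frequencies are illustrative values within the 3-40 Hz range mentioned above, and the 256 Hz sampling rate is an assumption:

```python
import math

# Illustrative flicker frequencies; the real stimuli could use any
# distinct values in the 3-40 Hz range.
STIMULUS_HZ = {"forward": 7.0, "backward": 9.0, "left": 11.0, "right": 13.0}

def goertzel_power(samples, fs, freq):
    """Power of `samples` at one frequency via the Goertzel algorithm."""
    k = int(0.5 + len(samples) * freq / fs)  # nearest DFT bin
    w = 2.0 * math.pi * k / len(samples)
    coeff = 2.0 * math.cos(w)
    s_prev = s_prev2 = 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev2 ** 2 + s_prev ** 2 - coeff * s_prev * s_prev2

def detect_ssvep_direction(samples, fs=256, stimuli=STIMULUS_HZ):
    """Return the direction whose flicker frequency dominates the
    occipital-channel spectrum."""
    return max(stimuli, key=lambda d: goertzel_power(samples, fs, stimuli[d]))
```

Because each checkerboard flashes at a fixed known frequency, a second or two of occipital EEG is usually enough for the matching frequency's power to dominate, which is why SSVEP requires so little user training.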
