Brain-Computer Interface Controlled Role-Playing Games

An open-source project for an online, multiplayer, turn-based, cross-platform role-playing game controlled through brain-computer interface (BCI) technologies based on electroencephalogram (EEG) signals, aiming for the ultimate in accessibility for people with locked-in syndrome (LIS), complete locked-in state (CLIS), and other disabilities.

About the BCI RPG Open-Source Community


The goal, and the real-world test of success, is whether someone with profound disabilities, including locked-in syndrome (LIS) or complete locked-in state (CLIS), can be freed from their physical prison to engage socially in complex, cooperative, multiplayer problem-solving role-playing games.


The creator of the Brain-Computer Interface Role-Playing Game (BCIRPG) project, Hawke Robinson, was strongly motivated to find a solution for people suffering from locked-in syndrome (LIS) and complete locked-in state (CLIS). This began with his caring for a young adult CLIS patient (close to his age at the time) in 1990 as a nurse's aide and LPN trainee at Doxie Hatch Medical Center, and additionally caring for many others with various injuries and neurological disorders as a habilitation therapist at Hillcrest Care Center. Ever since, he has wanted to find a way to use technology to help them reconnect socially, free the active minds trapped in their bodies from their physical prisons, and regain some of their lives.

LIS & CLIS - Locked-in Syndrome and Complete Locked-in State

Hawke Robinson started this open-source program. He was first involved with role-playing games in 1977. He began software development and engaging in the online community in 1979 (through the University of Utah) and various BBSes. He has been involved with Virtual Reality (VR) equipment since the late 1980s, using Amiga computers by Commodore (he still has a working Amiga 2000) and other equipment over the decades. He developed Virtual Reality Modeling Language (VRML) websites in the mid-to-late '90s. He experimented with early AR in the late '90s and early 2000s using early PDAs and "smartphones" (Nokia, Palm phones, etc.), long before iOS and Android existed, combined with early location and GPS technologies. He has been experimenting with biofeedback and bio-controlled devices since 1996 (for example, a mouse cursor controlled by a single fingertip clip).

Since about 2005 he has been working on integrating electroencephalogram (EEG) and, later, brain-computer interface (BCI) equipment with computers, mobile devices, VR, and AR, experimenting with a wide range of VR and AR hardware and software, including use in educational, artistic, social, and therapeutic programs.

The BCIRPG project is an open-source community on GitHub started by Hawke Robinson. It is developed by anyone wishing to be involved in open-source software development on GitHub, and anyone interested in advocating for and creating solutions for accessibility and inclusiveness in gaming.

The ultimate in Role-Playing Game Accessibility and Immersion 

The ultimate accessibility potential lies in brain-computer interface (BCI) technologies based on electroencephalogram (EEG) signals, integrated with AR and VR (and eventually reaching back into the physical world through robotics), literally allowing a person to interface with computer systems purely through thought.

Also, when linked with haptics, robotics, AR, VR, and other technologies, this offers the potential for the ultimate immersive experience.

Furthermore, the "loot" that player characters acquire could potentially be linked to crypto-style non-fungible tokens (NFTs) in the virtual world of the game.

The BCIRPG open-source community's software developers, under Hawke's technology leadership, have been hard at work experimenting with EEG and BCI equipment with music and RPGs since 2004.

Since 2019 the BCI RPG open-source community developers have been working on Project Ilmatar, an open-source, online, multiplayer, turn-based, cooperative-play electronic role-playing game designed from the ground up for maximal accessibility, including BCI control support.

Update, Fall 2020: project content was migrated to GitHub from the previous Git server.

As is common, these kinds of projects are little-known, long-running, and have almost no funding. It is thanks to wonderful volunteers worldwide that such open-source projects accomplish anything (Linus Torvalds's Linux, for example).

This project hopes that some day it might have more financial support, but for now it keeps leading in innovation thanks to the wonderful efforts of generous people helping a few hours every week, sharing openly with the world in the truest form of open source, following Hawke Robinson's philosophy that "the rising tide of shared knowledge raises all boats," helping improve the human condition globally.

This project is intended to function with a wide range of equipment, but for development purposes in Phase 2 and beyond it specifically targets OpenBCI equipment (which typically costs from hundreds of dollars to over $2,500.00 USD for the full R&D bundle).

This means there are two major teams for the project:

  • The gameplay side

  • The game control (BCI) side


The project developers also hope, down the road, to loop the game back from the virtual world and integrate with robotic devices controlled through BCI equipment, enabling participants with disabilities to engage more fully in all role-playing game formats.

High-level Roadmap Development Phases

Phase 0 - EEG I/O controllers in NWN on Linux

Learn the basics of EEG, bio- and neuro-feedback, and computer interaction, then attempt to replace the regular keyboard or other I/O controls of a PC-based video game (preferably on Linux, due to its more flexible I/O controls) and play the game with only EEG and/or bio-monitoring inputs. Start with Neverwinter Nights on Linux. - Complete
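The core Phase 0 idea, mapping an EEG feature (such as alpha-band power) onto a single game keypress, can be sketched roughly as follows. This is an illustrative, stdlib-only sketch, not the project's actual code: the frequency band, threshold, and key binding are assumptions, and a real setup would read live samples from the OpenEEG headset and inject the key event into the game via an input-injection tool.

```python
import math

def band_power(samples, fs, f_lo, f_hi):
    """Estimate signal power in the [f_lo, f_hi] Hz band with a naive DFT."""
    n = len(samples)
    power = 0.0
    for k in range(n // 2):
        if f_lo <= k * fs / n <= f_hi:
            re = sum(s * math.cos(2 * math.pi * k * i / n) for i, s in enumerate(samples))
            im = sum(s * math.sin(2 * math.pi * k * i / n) for i, s in enumerate(samples))
            power += (re * re + im * im) / n
    return power

def eeg_to_key(window, fs=250, threshold=5.0):
    """Map strong alpha-band (8-12 Hz) activity in one EEG window to a
    hypothetical 'move forward' keypress; return None below threshold."""
    return "w" if band_power(window, fs, 8, 12) > threshold else None
```

In practice the window would slide over the live EEG stream; the window length and threshold choice are a large part of the latency and error-rate trade-off noted in the Phase 0 results.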

Phase 1 - Game design prototype in NWN:EE on Linux with EEG/BCI for I/O

Train the new development team on the desired game development concepts, with prototype use of the Aurora Toolset for NWN.

Create a new adventure module that will be used as a test bed for future prototypes.

Choose Shakespeare's The Tempest.

Experiment with controls using EEG and BCI equipment with Neverwinter Nights: Enhanced Edition on Linux.

Phase Completed 2020.

Phase 2 - Game from scratch: TUI version, BCI I/O, online turn-based multiplayer

Create from scratch a text-only, multiplayer, online, turn-based, cooperative RPG that can be played with only OpenBCI (or similar) equipment. - Currently nearing the end of the design and documentation steps, and starting the actual coding steps.

Will use GUI tools with the Godot engine for the GM tools, but play must be all TUI and able to run with the simple inputs of OpenBCI.

Use the same adventure blueprint as Phase 1. - IN PROGRESS.

Limited chat features based on pregenerated responses from selected menus controlled by BCI.
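A common accessibility pattern for this kind of single-signal menu control is "scanning" selection: the highlight advances automatically through the menu, and one binary BCI trigger (e.g. a threshold crossing) picks the highlighted entry. A minimal sketch with hypothetical names, not the project's actual design:

```python
def scan_select(options, triggers):
    """Single-switch 'scanning' selection.

    options  -- menu entries (e.g. pregenerated chat responses)
    triggers -- one boolean per scan step: True means the BCI signal fired

    The highlight advances one entry per step (wrapping around); the first
    True trigger selects the currently highlighted entry.
    """
    for step, fired in enumerate(triggers):
        if fired:
            return options[step % len(options)]
    return None  # the player never triggered a selection
```

For example, with the options ["Yes", "No", "Wait"], a trigger on the third scan step selects "Wait". The scan speed trades selection latency against error rate, which matters for the high-latency signals noted in the Phase 0 results.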

Game adventure modules are game-system and genre agnostic.

For early prototyping, include 3 genres and 3 RPG systems, but design the actual adventure modules generically (a la "Mad Libs"-style variable replacement of the adventure's key descriptors).
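The "Mad Libs"-style approach described above can be sketched with simple template substitution; the slot names and descriptors here are made up for illustration:

```python
from string import Template

# One genre-agnostic scene, with key descriptors left as substitution slots.
scene = Template("You enter the $location, where a $adversary guards the $treasure.")

# The same adventure blueprint rendered for two different genres.
fantasy = scene.substitute(location="ruined keep", adversary="troll",
                           treasure="rune-etched blade")
scifi = scene.substitute(location="derelict station", adversary="rogue drone",
                         treasure="encrypted data core")
```

A real module would presumably keep descriptor tables per genre and per RPG system, so one adventure blueprint could serve all combinations.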

Phase 3 - Add GUI and Add Optional Integration Support with Other Technologies

Add basic real-time graphics, or pre-recorded action videos triggered like Dirk the Daring in Dragon's Lair, on top of the existing TUI-based game.

Add integration of the chat features to operate with Matrix (such as matrix-synapse federated home servers, for example).

Incorporate BCI-controlled chat pick-list features for creating custom text messages, not just the pregenerated responses from the selection menus of Phase 2.

Incorporate Automatic Speech Recognition (ASR).

Incorporate Machine Learning (ML), Deep Neural Networks (DNN), and Artificial Intelligence (AI) into the game, bots, and other features to improve accessibility and gameplay.

Consider adding blockchain authentication support to validate identities of players for improved security as relationships build over time through the game.

Consider adding optional non-fungible token (NFT) support to potentially allow movement of virtual "real" loot between player characters, adventures, campaigns, game systems, etc.

Phase 4 - Add VR

Add a Virtual Reality (VR) option to the graphical interface.

Phase 5 - Add AR+ / IR / xR /etc

Add Augmented Reality (AR), Immersive Reality (IR), or other xR variants.

Incorporate optional GPS and global world-location play into the game.

If the technology is viable, include "holographic" UI options so no glasses, etc., are required.

Phase 6+ - Add physical local realm interaction

Migrate the BCI tools to work in the physical world through robotics and wireless triggers, so players can interact with a combination of AR and physical devices: manipulating dice-roller devices, using robotic arms to pick up and move objects, and moving miniature tokens around (enhanced by AR-overlaid animation). This way, LIS/CLIS players can play at the same physical table, or LARP, with other players in the same physical location, not just online.

Phase 0 (completed) 2006 - 2014 OpenEEG

Neverwinter Nights (NWN) Diamond Edition on Linux, controlled by OpenEEG equipment.

Basic control of PC movement. Routing of I/O through an OpenEEG 5-channel headset.

Began 2006, functional by 2010, stopped working on it by 2014.

Very rudimentary movement, and an on/off menu (or other on/off-assigned hotkey), mapped and working. Not sufficient for LIS/CLIS, but helpful for some disabilities. Very high latency and error rate, however.

Need greater signal resolution, wider frequency ranges, and more CPU power. OpenEEG and civilian-grade systems still leave something to be desired, but are getting closer. Not sufficient for AR/VR additions, however.

Phase 1 (completed) 2019 - 2020

Neverwinter Nights (NWN) Enhanced Edition (NWN:EE).

Using Windows for Aurora Toolset module development.

Not using EEG or BCI equipment at this phase. Creating prototype adventure that will be used in Phase 2 onward for baseline R&D.

Development prototyping and team training/team building: use the NWN:EE Aurora Toolset to create a custom adventure. Began August 2019; completed successfully August 2020 (still adding enhancements over time). Later, see whether newer-generation OpenBCI equipment can be used.

Phase 2

In progress.

Details on the GitHub site.