Brain-Computer Interface Controlled Role-Playing Games

RPG Research is a strong advocate for accessibility and inclusiveness in gaming, not only through our training, advocacy, and accessible mobile facilities, but also through our active projects to make gaming accessible to all.

Background History

Our founder was first involved with role-playing games in 1977. He began software development and engaging in online communities in 1979, through the University of Utah and various BBSes.

He has been involved with VR equipment since the late 1980s, using Amiga computers (and still has a working Amiga 2000).

He developed Virtual Reality Modeling Language (VRML) websites in the mid-to-late 1990s.

He experimented with early AR in the late 1990s and early 2000s, using early PDAs and "smartphones" (Nokia, Palm, etc.) long before iOS and Android existed, combined with early location and GPS technologies.

He has been experimenting with biofeedback and bio-controlled devices since 1996 (for example, a mouse cursor controlled by a single fingertip clip).

Since about 2005 he has worked on integrating electroencephalogram (EEG) and later brain-computer interface (BCI) equipment with computers, mobile devices, VR, and AR, experimenting with a wide range of VR and AR hardware and software used in educational, artistic, social, and therapeutic programs.


Our founder was strongly motivated to find a solution for people suffering from Locked-in Syndrome (LIS) and Complete Locked-in State (CLIS) while caring for a young adult in 1990, as a nurse's aide and LPN trainee at Doxie Hatch Medical Center and as a habilitation therapist at Hillcrest Care Center. He wanted to find a way to use technology to help them reconnect socially, set them free of their physical prisons of the mind, and gain back some of their lives.

The Ultimate in RPG Accessibility and Immersion

The ultimate accessibility potential lies in Brain-Computer Interface (BCI) technologies based on electroencephalogram (EEG) sensing, integrated with AR and VR (and eventually back into the physical world through robotics), literally allowing a person to interface with computer systems purely through thought.

When linked with haptics, robotics, AR, VR, and other technologies, it also offers the potential for the ultimate immersive experience.

We have been hard at work experimenting with EEG and BCI equipment with music and RPGs since 2004.

Since 2019 our research and development team has been working on Project Ilmatar: an opensource, online, multiplayer, turn-based, cooperative-play electronic role-playing game designed from the ground up for maximal accessibility, including BCI control support.

Update, Fall 2020: we migrated our content from the previous individual Git account to an organization GitHub account.

As usual, our projects are little-known, long-running, and have almost no funding. It is thanks to our wonderful volunteers worldwide that we accomplish anything. We hope someday to have more financial support, but for now we keep leading in innovation thanks to the generous people who help a few hours every week. We share openly with the world in the truest form of opensource, following our philosophy that "the rising tide of shared knowledge floats all boats" to help improve the human condition globally.

While the software is free, the hardware needed to fully realize this technology is expensive. We are currently using OpenBCI, which still costs from hundreds to thousands of dollars per headset. The RPG Research board approved $2,500 USD to purchase the full R&D bundle from OpenBCI. We are now working from two directions, software and hardware, toward full integration.
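To illustrate where the software and hardware sides meet: one common EEG-based control paradigm (steady-state visual evoked potentials, SSVEP) decodes which flickering on-screen target a user is attending to by finding the stimulus frequency with the most power in a short EEG window. The sketch below is hypothetical and not taken from the Project Ilmatar code; it uses the Goertzel algorithm on a synthetic signal, with a 250 Hz sample rate matching the OpenBCI Cyton's default.

```python
import math

def goertzel_power(samples, sample_rate, target_hz):
    """Power of a single frequency component (Goertzel algorithm)."""
    n = len(samples)
    k = round(n * target_hz / sample_rate)
    coeff = 2.0 * math.cos(2.0 * math.pi * k / n)
    s_prev, s_prev2 = 0.0, 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev2 ** 2 + s_prev ** 2 - coeff * s_prev * s_prev2

def detect_command(window, sample_rate, stimulus_freqs):
    """Map an EEG window to the stimulus frequency with the most power."""
    powers = {f: goertzel_power(window, sample_rate, f) for f in stimulus_freqs}
    return max(powers, key=powers.get)

# Synthetic demo: a pure 10 Hz "evoked" signal, one second at 250 Hz.
rate = 250
window = [math.sin(2 * math.pi * 10 * t / rate) for t in range(rate)]
print(detect_command(window, rate, [8, 10, 12]))  # → 10
```

Real EEG is far noisier than this synthetic signal, so a production pipeline would add filtering, artifact rejection, and averaging over multiple windows before committing to a command.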

We also hope to loop back from the virtual world and integrate robotic devices controlled through BCI equipment to enable participants with disabilities to more fully engage in all role-playing game formats.

We stream the development meetings live each week on Saturdays from 10 a.m. to noon Pacific Time (PST8PDT).

If you miss the live stream, Patreon supporters get access to the recorded videos at least a month before the general public as a thank you for their donation support.

The goal, and the real-world metric of success, is whether someone with profound disabilities, including locked-in syndrome (LIS) or complete locked-in state (CLIS), can be freed from that prison to engage socially in complex, multiplayer, cooperative problem-solving role-playing games.

Program phases (with updates on our live streams and GitHub):

Phase 0 (completed)

Viability prototyping with Linux, Neverwinter Nights, and OpenEEG equipment to control player-character movement. Began 2006; completed successfully in 2014.

Phase 1 (completed)

Development prototyping and team training/team building, using the NWN:EE Aurora Toolset to create a custom adventure. Began August 2019; completed successfully August 2020 (still adding enhancements over time). Later, evaluate whether newer-generation OpenBCI equipment can be used.

Phase 2 (in-progress)

Scope and build, from the ground up, an electronic role-playing game (ERPG) that can be controlled by accessibility equipment, especially newer-generation EEG/BCI equipment such as the OpenBCI hardware (among others).

The game is:

  • online

  • multi-player

  • team-focused

  • turn-based

  • initially text-only

  • menu-driven

  • non-chat (not a MUD/MUSH/MOO; use other chat solutions as needed)

  • an electronic role-playing game that can be played with many different adaptive devices, but MUST be fully playable (without chat) using only the player's brain via BCI

  • opensource
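The menu-driven, turn-based requirements above are what make brain-only play feasible: every action reduces to a discrete selection, which even a single decoded "select" event can drive via switch scanning (the highlight cycles through options, and the selection signal picks the highlighted one). This is a hypothetical sketch of that idea, not the actual Project Ilmatar input code.

```python
def scanning_select(options, select_signals):
    """Single-switch scanning: the highlight advances one option per scan
    tick; a truthy signal (e.g. one decoded BCI event) picks the
    currently highlighted option. Returns None if nothing was selected."""
    i = 0
    for selected in select_signals:
        if selected:
            return options[i]
        i = (i + 1) % len(options)  # advance highlight, wrapping around
    return None

menu = ["Attack", "Defend", "Use item", "Talk"]
# Simulated decoded BCI output: no-select, no-select, select on the third tick.
print(scanning_select(menu, [0, 0, 1]))  # → Use item
```

Because turns have no time pressure, the scan rate can be slowed arbitrarily to match each player's reliable BCI signaling rate.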

Began scoping August 2020, currently actively in progress.

Meeting weekly (broadcast live).

We estimate this phase will be completed around August 2021 (potentially sooner if the team maintains its current development momentum).

  • Supports Game Master (GM) tools so a GM can participate with the players in DM roles during the live game.

  • Ability to pause game, save game, restore game.

  • Supports multiple genres (player selected).

  • Multiple underlying RPG systems (player selected).

  • A toolset is provided for prospective game masters to create new adventures (the toolset itself does not need to meet the BCI requirements).

More scope details and full documents in our Github repository.

Phase 3 

Add a graphical interface (instead of a text-only interface), with pre-recorded audio and video events tied into the text-based events (a la Dragon's Lair). Later, add more dynamic graphics/audio/video for a smoother experience. Early hooks are planned to begin around late 2021.

Phase 4 

Add VR integration and motion integration, while still fully supporting BCI play. Include VR accessibility settings for people with very limited or no physical mobility of their own (hopefully integrating various third-party drivers; we do not want to have to develop those drivers ourselves).

Phase 5 

Add AR integration and GPS integration with mobile devices and headsets, while still fully supporting BCI play.

Phase 6 

Bring the accessible BCI RPG back to the tabletop:

  • integrate robotics equipment controlled through the game's BCI controls

  • enable rolling physical dice

  • moving physical miniatures

  • and other features through the BCI control

This will take place in a TRPG environment rather than the online ERPG, but will use the tools and code created for the now more well-rounded, continually evolving ERPG to enable TRPG play.

Your donations mean that more people can have access to our free programs.

Please donate today to help us with these efforts.

Or Volunteer Today!