Projects at KU
A brain-computer interface (BCI) controlled speech synthesizer
This project aims to develop and test a motor imagery BCI that converts changes in the brain’s sensorimotor rhythm into speech formant frequencies for instantaneous, continuous speech synthesis.
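The core idea can be sketched in a few lines of code. This is a minimal illustration, not the lab's actual system: the mapping function, formant ranges, bandwidth, and the use of normalized mu-band power from two electrode sites are all assumptions made for the example.

```python
import numpy as np

FS = 8000  # audio sample rate in Hz (assumed for this sketch)

def bandpower_to_formants(mu_left, mu_right):
    """Hypothetical linear mapping: normalized sensorimotor (mu-band)
    power at two electrode sites drives F1 and F2 within typical
    vowel formant ranges."""
    f1 = 300.0 + 500.0 * np.clip(mu_left, 0.0, 1.0)    # F1: 300-800 Hz
    f2 = 900.0 + 1500.0 * np.clip(mu_right, 0.0, 1.0)  # F2: 900-2400 Hz
    return f1, f2

def synthesize_frame(f1, f2, f0=120.0, dur=0.05):
    """Crude formant synthesis for one frame: pass an impulse train
    at pitch f0 through two two-pole resonators centered at f1, f2."""
    n = int(FS * dur)
    src = np.zeros(n)
    src[::int(FS / f0)] = 1.0  # impulse-train glottal source
    out = src
    for fc in (f1, f2):
        bw = 80.0  # resonator bandwidth in Hz (assumed)
        r = np.exp(-np.pi * bw / FS)
        a1 = -2.0 * r * np.cos(2.0 * np.pi * fc / FS)
        a2 = r * r
        y = np.zeros(n)
        for i in range(n):  # direct-form IIR: y[i] = x[i] - a1*y[i-1] - a2*y[i-2]
            y[i] = out[i] \
                - a1 * (y[i - 1] if i >= 1 else 0.0) \
                - a2 * (y[i - 2] if i >= 2 else 0.0)
        out = y
    return out / (np.abs(out).max() + 1e-9)  # normalize to [-1, 1]

# One decoding step: band power in, a 50 ms audio frame out.
f1, f2 = bandpower_to_formants(0.3, 0.7)
audio = synthesize_frame(f1, f2)
```

In a running system this loop would repeat every frame, so formant trajectories follow the user's modulation of their sensorimotor rhythm in real time rather than selecting from a fixed set of sounds.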
Controlling augmentative and alternative communication devices with BCIs
A major goal of the lab is to provide BCI control to commercial AAC devices through collaborations with industry partners, federal agencies, foundations and current users of AAC.
Neural correlates of vowel identification
Motivated by our work on a BCI-controlled formant-frequency speech synthesizer, this project investigates how auditory event-related potentials (ERPs) change as adults listen to vowel sounds of varying quality and distinctiveness. A second goal is to determine whether these ERP changes can be used in a real-time BCI for formant synthesis.
The readiness potential and its application for BCI
Motor-based BCIs must not only translate neural activity related to motor execution or imagery into control signals, but also do so only when the user intends to act. We are investigating the readiness potential as a possible neural signal of upcoming movement intention.
The Prosodic Marionette
The Prosodic Marionette, developed in collaboration with the CADLAB at Northeastern University, is a novel visual-spatial graphical interface for manipulating the acoustic and temporal cues involved in linguistic prosody. We are investigating prosodic knowledge in two populations: 1) typically developing children and 2) adults with congenital and acquired neuromotor impairment.
Electrocorticography of continuous speech production
In collaboration with the Schalk Lab, we use electrocorticography (ECoG) to sample the brain’s neuroelectrical activity at the fine spatial and temporal resolution necessary for studying the neural dynamics of continuous speech production.
The Unlock Project
In collaboration with CELEST at Boston University.