Behavioral user models provide accurate predictions of user intent in on-screen interactions, effectively reducing system lag. The adoption of wearable devices, such as neural sensors, has enabled psychophysiological models that resolve further uncertainty about the user's intent. Today, movement sensors combined with wearables enable immediate, natural interactions in extended realities without lag or proxy control devices. My research is motivated by the following question: How can we design novel interactions with extended realities that still feel natural?


In this project, we will build a prototype to monitor skill acquisition in piano playing 🙂. Novice piano players will learn to play a simple song while wearing a haptic glove.

Video: Dexta Robotics - Touch the Untouchable

Sometimes, the learning experience will be manipulated by triggering wrong hand movements with the haptic glove. At these time points, we hypothesize that the brain generates a prediction error: a discrepancy between what was predicted to happen and what actually happened.

EEG and other physiological data will be recorded and analyzed offline. In earlier work, we showed that such prediction errors carry task-relevant information about the user. We will investigate the physiological correlates of these prediction errors with reference to the participants' piano learning experience and progress.
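To give a flavor of the offline analysis, here is a minimal sketch of epoching continuous EEG around the glove-triggered mismatch events and averaging those epochs into an event-related potential (ERP). All values (sampling rate, channel count, event samples) are made-up placeholders, and the random array stands in for real recorded data; the actual project pipeline is not specified here.

```python
import numpy as np

fs = 250                          # assumed sampling rate in Hz (placeholder)
n_channels, n_samples = 8, fs * 60  # one minute of 8-channel stand-in data
rng = np.random.default_rng(0)
eeg = rng.standard_normal((n_channels, n_samples))  # stand-in for recorded EEG

# Sample indices at which the glove triggered a wrong movement (hypothetical)
mismatch_events = np.array([2500, 5000, 7500, 10000])

def epoch(data, events, fs, tmin=-0.2, tmax=0.8):
    """Cut fixed-length windows (tmin..tmax seconds) around each event sample."""
    start, stop = int(tmin * fs), int(tmax * fs)
    return np.stack([data[:, e + start:e + stop] for e in events])

epochs = epoch(eeg, mismatch_events, fs)  # shape: (n_events, n_channels, n_times)
erp = epochs.mean(axis=0)                 # average over events -> (n_channels, n_times)
print(epochs.shape, erp.shape)
```

In practice a dedicated toolbox (e.g., MNE-Python) would handle filtering, artifact rejection, and baseline correction, but the core epoch-and-average step looks like the above.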

What you will do

Who we are

Our department (Biopsychology and Neuroergonomics), headed by Prof. Dr. Klaus Gramann, comprises an interdisciplinary team of researchers from a variety of scientific fields and is hosted within the Institute of Psychology and Ergonomics, with teaching obligations in the Master’s Program “Human Factors”. We are situated on the main campus of TU Berlin in the west of the city, with good access to public transport (Berlin Zoo railway station). You will be seated in our main office space with your own desk and PC. The university is known for its engineering programmes as well as its high number of international students.

About me (Lukas Gehrke - project lead)

I research how to leverage neural interface technologies for human-computer interaction (HCI). Previously, I worked on a joint project between TU Berlin and the Swartz Center for Computational Neuroscience at UC San Diego, researching spatial learning. Recently, I interned at Chatham Labs (acquired by Facebook). In the future, I want to realize novel experiences in AR/VR that feel natural. My goal is to design adaptive technologies in which computer and user coexist and together create truly connected experiences for the user.

How to apply

Please contact me at [email protected]. If you have the time, add a brief (<200 words) cover letter or a link to something interesting you did and are proud of.

Reference Papers

Detecting Visuo-Haptic Mismatches in Virtual Reality using the Prediction Error Negativity of Event-Related Brain Potentials

Resting-state cortical connectivity predicts motor skill acquisition

Spectral modulation of frontal EEG during motor skill acquisition: A mobile EEG study

Synchronizing MIDI and wireless EEG measurements during natural piano performance

The Mismatch Negativity: An Indicator of Perception of Regularities in Music

Feedback-based error monitoring processes during musical performance: An ERP study

Processing Expectancy Violations during Music Performance and Perception: An ERP Study