The face and the mind... together, they express our mood to the world.
Following in the time-honored tradition of appropriating new technologies for artistic expression, this project uses two inputs for interaction. One device, designed for meditation training, tracks electrical activity within the user's brain. The other assists artists in capturing realistic facial activity for animation. Because both are intimately tied to emotion, either can give users a visualization that is, at least in a small way, reflective of their state of mind.
The goal of this project was to create an artistic visualization showing the harmony or disagreement between the user's brain and their facial expression. These visualizations are procedurally generated and controlled by inputs from both a consumer-grade EEG (electroencephalogram) and facial-tracking technology, which drive the elements within the scenes. The EEG is the commercially available Interaxon Muse. The second input is a web camera capturing facial activity, processed by Faceware software integrated into the project. Both devices broadcast their data over the network, where it is received by Digital Emotes. Together, these devices, in conjunction with the software I have developed, capture and export an artistic visualization representative of the user's emotion.
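The exact mapping from the two network streams to scene parameters is not described here. As a minimal sketch only (the function names, value ranges, and the harmony formula are illustrative assumptions, not the project's actual code), one approach is to normalize each incoming signal into a common 0–1 range and derive a single "harmony" score from their agreement, which can then drive a visual element:

```python
def normalize(value, lo, hi):
    """Clamp and scale a raw sensor reading into the range [0, 1].

    `lo` and `hi` are assumed calibration bounds for that sensor
    (e.g. an EEG band-power floor/ceiling, or a blendshape weight range).
    """
    if hi == lo:
        return 0.0
    return max(0.0, min(1.0, (value - lo) / (hi - lo)))

def harmony(eeg_calm, face_positive):
    """Hypothetical harmony score: 1.0 when the normalized brain and
    face signals fully agree, 0.0 at maximum disagreement."""
    return 1.0 - abs(eeg_calm - face_positive)

# Example: a calm EEG reading alongside a neutral facial expression.
calm = normalize(0.75, 0.0, 1.0)       # normalized EEG-derived value
smile = normalize(0.25, 0.0, 1.0)      # normalized face-derived value
score = harmony(calm, smile)           # partial agreement -> 0.5
```

A score like this could modulate one parameter of the procedural scene (color, motion, density), with the two raw streams each driving their own elements independently.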
This creates a transdisciplinary platform that allows users to manipulate artistic compositions through physiologically driven inputs, drawing on data visualization, cognitive science, art, and aesthetics. Aside from its artistic purposes, recent findings have shown that, with training on biofeedback devices, people can learn to better regulate physiological states. The simple act of exploring and maintaining these states to produce desired visual results could assist users in better regulating their emotions.