Emote (2017) is an ongoing series of real-time experimental animation, comprising 50 minutes of animation organized into four chapters of five sections each. It resulted from an interdisciplinary collaboration between Media Art Nexus and the NTU School of Computer Science and Engineering (SCSE). The project is about feelings, brain waves, and real-time experimental animation. EEG brain signals were recorded and analyzed for emotional patterns, then used to generate the 20 animated sections in real time in response to non-lyrical music clips of known emotional class (happy, sad, exciting, scary, etc.). NTU SCSE contributed its scientific expertise in neural networks and EEG signal analysis: emotional states manifest in EEG signals, and machine-learning techniques can learn the patterns that particular emotions produce. These learned patterns can then be used both to recognize emotion and to generate external stimuli that elicit a desired emotional response.
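The project does not publish its implementation, but the recognition step it describes can be sketched as a standard supervised-learning pipeline: extract features from EEG epochs and train a small neural network to predict the emotional class. Everything below is an illustrative assumption, not the project's actual code; the feature layout, channel count, class labels, and classifier choice are all hypothetical stand-ins.

```python
# A minimal sketch (NOT the Emote project's pipeline): classifying
# emotional states from EEG-style features with scikit-learn.
# All shapes, labels, and model choices here are assumptions.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Synthetic stand-in data: one row per EEG epoch; columns would be
# band powers (delta, theta, alpha, beta, gamma) for each channel.
n_epochs, n_channels, n_bands = 400, 14, 5
X = rng.normal(size=(n_epochs, n_channels * n_bands))
labels = ["happy", "exciting", "sad", "melancholy"]
y = rng.choice(labels, size=n_epochs)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

# A small feed-forward neural network, echoing the neural-network
# approach the SCSE team contributed.
clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000,
                    random_state=0).fit(X_train, y_train)

acc = accuracy_score(y_test, clf.predict(X_test))
print(f"held-out accuracy: {acc:.2f}")
```

With real EEG recordings in place of the synthetic matrix, the predicted class per epoch could then drive the selection of animation parameters in real time.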
We used 20 anime-derived, non-lyrical music clips of known emotional class (happy, exciting, sad, melancholy), provided by the Computer Science and Engineering team.
In this work, the language of music serves both as a means of learning how emotions manifest in the brain and as an external stimulus responding to the brain's emotional state. A future study would examine whether visuals elicit the same feedback.