1st "Virtual Reality as a Transformative Technology to Develop Empathy" Conference.
By the "empathic Reactive MediaLab Coalition" (eRMLab Coalition).
Wednesday, June 20 • 12:00pm - 12:30pm
(From Panel Discussion: "Machine Learning for Emotional Detection with the MASK technology") (English)


MindMaze builds intuitive human-machine interfaces through its breakthrough neuro-inspired computing platform, which captures brain activity upon intent, and a new generation of brain-computer interface products using virtual reality (VR), augmented reality (AR) and machine learning. Based on a decade of rigorous testing in the healthcare industry, the company has designed intuitive mind-machine interfaces that decode brain signals in real time and track fine movements and gestures. Its innovations are poised to transform industries, starting with healthcare and gaming.
Recently, MindMaze introduced MASK, the first bio-signal approach to translating facial expressions into VR and AR experiences for gaming and social networking across media. MASK brings emotion and a novel input mechanism, with the goal of evolving VR/AR beyond novelty and into mainstream applications.
Humans love to communicate, joke and share experiences together, but effective communication involves more than the voice: we also transmit a great deal of non-verbal information. Facial expressions are a major part of that non-verbal communication, so MindMaze introduced MASK as the gateway for carrying it into any virtual world, adding a human touch to VR and AR.
The patented MASK technology, embedded in any head-mounted display (HMD), allows instantaneous detection of facial expressions. Expressions are identified not from their visual appearance but from the underlying activity of the facial muscles that trigger them. To do so, tiny electrodes placed in the foam of the HMD capture the muscular activity, which is then analyzed with sophisticated signal processing and machine learning techniques to identify a set of expressions. Finally, the detected expressions can be used to animate the avatar's face and convey the participant's emotions in VR.
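The pipeline described above (muscle signals → features → classifier → expression label) can be illustrated with a minimal sketch. This is not MindMaze's actual algorithm, which is proprietary: the RMS features, the two-channel setup, the nearest-centroid classifier and the calibration values below are all illustrative assumptions.

```python
import math

def rms_features(window):
    """Root-mean-square amplitude per EMG channel: a standard
    muscle-activation feature (hypothetical choice; MASK's real
    feature set is not public)."""
    return [math.sqrt(sum(s * s for s in ch) / len(ch)) for ch in window]

def classify(features, centroids):
    """Nearest-centroid classifier mapping a feature vector to an
    expression label (a stand-in for MASK's machine learning model)."""
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: sq_dist(features, centroids[label]))

# Toy per-expression calibration: mean activation per channel
centroids = {
    "smile": [0.9, 0.1],  # strong cheek activity, weak brow activity
    "frown": [0.1, 0.8],  # the reverse
}

# One short window of 2-channel signal, dominated by the first channel
window = [[0.8, -0.9, 0.85], [0.05, -0.1, 0.08]]
label = classify(rms_features(window), centroids)
print(label)  # -> smile
```

The detected label would then drive the avatar's facial rig, closing the loop from muscle activity to expressive VR presence.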


Fabien Bourban

Software Engineer, MindMaze. Biosignal and Emotional Detection for VR
Fabien Bourban is a computer science engineer. He completed his master's degree at EPFL in Lausanne, focusing on VR and computer graphics, and joined MindMaze in 2015 to work on VR. He now works with the biosignal team at MindMaze on the MASK project.

Wednesday June 20, 2018 12:00pm - 12:30pm CEST
Sala Exterior