1st "Virtual Reality as a Transformative Technology to Develop Empathy" Conference.
By the "empathic Reactive MediaLab Coalition" (eRMLab Coalition).
Wednesday, June 20 • 11:45am - 1:30pm
PANEL DISCUSSION: "AI & Machine Learning for Human Development" (English)


  • "Artificial Intelligence for Human Development". Ángeles Manjarrés, PhD.
Since the birth of AI, healthcare has been one of its main application domains, and AI is currently taking on a greater role in technologies for mental health and personal development. Notable examples include expert systems, virtual assistants, and robots (used for diagnosis, treatment and therapy, clinical training, and the robotization of care), as well as computational psychiatry and mobile applications for coaching or stress management.
These applications draw on the most relevant paradigms of modern Artificial Intelligence: decision systems, ambient intelligence based on biometrics, data mining, user modeling and personalization, virtual agents, natural language processing, computer vision, robotics... Along with their potential benefits, these technologies pose serious philosophical, ethical and socio-political dilemmas.
Integrating Artificial Intelligence techniques enriches Virtual Reality applications, enabling, among other things, interactive and customized immersive experiences.
  • "Machine Learning for Emotional Detection with the MASK technology". Fabien Bourban.
MindMaze builds intuitive human-machine interfaces through its breakthrough neuro-inspired computing platform, which captures brain activity upon intent, and a new generation of brain-computer interface products using virtual reality (VR), augmented reality (AR) and machine learning. Based on a decade of rigorous testing in the healthcare industry, the company has designed intuitive mind-machine interfaces that decode brain signals in real time and track fine movements and gestures. Its innovations are poised to transform industries, starting with healthcare and gaming.
Recently, MindMaze introduced MASK, the first-ever bio-signal approach to translating expressions into VR and AR experiences for gaming, social networking and trans-media consumption. MASK brings emotion and a novel input mechanism, with the goal of evolving VR/AR beyond novelty and into mainstream applications.
Humans love to communicate, joke and share experiences together, but effective communication is more than voice alone: we also transmit a great deal of non-verbal information. Facial expressions are a major part of that non-verbal communication, so MindMaze introduced MASK as the gateway for carrying it into any virtual world, adding a human touch to VR and AR.
The patented MASK technology, embedded in any head-mounted display (HMD), allows the instantaneous detection of facial expressions. Expressions are identified not from visual appearance but from the underlying activity of the facial muscles that trigger them. To do so, tiny electrodes placed on the foam of the HMD capture the muscular activity, which is then analyzed with sophisticated signal processing and machine learning techniques to identify a set of expressions. Finally, the detected expressions can be used to animate the face of an avatar and convey the participant's emotions in VR.
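The pipeline described above (electrode capture, feature extraction, expression classification) can be sketched as follows. This is a minimal illustration, not MindMaze's actual MASK implementation: the electrode count, the RMS features, and the nearest-centroid classifier are all assumptions chosen for brevity.

```python
# Minimal sketch of an EMG-based expression pipeline. All names,
# channel counts, and the classifier choice are illustrative
# assumptions, not the proprietary MASK technology.
import numpy as np

N_ELECTRODES = 8          # hypothetical electrode count in the HMD foam
WINDOW = 256              # samples per analysis window

def rms_features(window: np.ndarray) -> np.ndarray:
    """Root-mean-square amplitude per electrode channel: a common
    first feature for surface-EMG signals."""
    return np.sqrt(np.mean(window ** 2, axis=1))

class ExpressionClassifier:
    """Nearest-centroid classifier over RMS feature vectors."""
    def __init__(self):
        self.centroids = {}   # expression label -> mean feature vector

    def fit(self, labelled_windows):
        feats = {}
        for label, window in labelled_windows:
            feats.setdefault(label, []).append(rms_features(window))
        self.centroids = {l: np.mean(v, axis=0) for l, v in feats.items()}

    def predict(self, window: np.ndarray) -> str:
        f = rms_features(window)
        return min(self.centroids,
                   key=lambda l: np.linalg.norm(f - self.centroids[l]))

# Synthetic demo: "smile" activates the lower channels, "frown" the upper.
rng = np.random.default_rng(0)
def fake_emg(active):
    sig = rng.normal(0, 0.05, (N_ELECTRODES, WINDOW))
    sig[active] += rng.normal(0, 1.0, (len(active), WINDOW))
    return sig

clf = ExpressionClassifier()
clf.fit([("smile", fake_emg([0, 1, 2])) for _ in range(5)] +
        [("frown", fake_emg([5, 6, 7])) for _ in range(5)])
print(clf.predict(fake_emg([0, 1, 2])))
```

In a real system the predicted label would then drive the avatar's facial rig; here it is simply printed.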

  • "Automatic Content Creation for Virtual Environments based on Emotions". Luis Peña Sánchez, PhD.
A user's interaction with a virtual environment is a rich source of information from which intelligent systems can derive new content. Applying findings from cognitive psychology and emotion analysis to interactive virtual environments will allow content to be adapted to the emotions the user is experiencing, maintaining a level of tension, attention or entertainment appropriate to the environment and achieving a richer experience.
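One simple way to realize the adaptation loop sketched in the abstract is a proportional controller that nudges a content-intensity parameter toward a target tension level. The tension signal, the user-response model, and the gains below are hypothetical, chosen only to illustrate the feedback idea, not anything presented in the talk.

```python
# Illustrative emotion-adaptive content loop: a proportional controller
# keeps an estimated "tension" signal near a target by adjusting a
# content-intensity parameter. All values here are assumptions.
def adapt_intensity(intensity, tension, target=0.6, gain=0.5,
                    lo=0.0, hi=1.0):
    """Raise intensity when the user is under-stimulated, lower it
    when tension overshoots the target; clamp to [lo, hi]."""
    intensity += gain * (target - tension)
    return max(lo, min(hi, intensity))

# Simulate a user whose tension lags the delivered intensity.
intensity, tension = 0.2, 0.1
for _ in range(20):
    tension += 0.5 * (intensity - tension)   # crude user-response model
    intensity = adapt_intensity(intensity, tension)
print(round(tension, 2))   # tension settles near the 0.6 target
```

In practice the tension estimate would come from biometric or interaction signals rather than a simulated model, and the controlled parameter could be pacing, difficulty, or scene content.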

Speakers

Fabien Bourban

Software Engineer, MindMaze. Biosignal and Emotional Detection for VR
Fabien Bourban is a computer science engineer. He earned his master's degree at EPFL in Lausanne, focusing on VR and computer graphics, and joined MindMaze in 2015 to work on VR. He now works with the biosignal team at MindMaze on the MASK project.

Luis Peña Sánchez, PhD.

CEO & Co-Founder, Lurtis Rules. AI Characters and Environments for VR
CEO & Co-Founder of Lurtis Rules, a technology startup awarded by ActuaUPM and focused on developing tools for digital content creation. Professor at Rey Juan Carlos University and at U-tad, teaching subjects related to Artificial Intelligence applied to the environment...

Ángeles Manjarrés, PhD.

Professor, Artificial Intelligence for Human Development. Universidad Nacional de Educación a Distancia (UNED)


Wednesday June 20, 2018 11:45am - 1:30pm CEST
Sala Exterior