Data-Driven Immersive Analytics (DDIA)

With the results of the COMET Module DDIA (Data-Driven Immersive Analytics), it will be possible to walk into environments offering a fully immersive experience, while intelligent technology composes a cohesive presentation of real and digital elements and delivers personalized feedback to local and remote participants.

Vision and Strategy

Digital representations of real-world entities of varying complexity are available in digital twins. However, interaction and visual analytics metaphors were simply not built to appear in the context of the physical world, and when operating remotely, a user has a restricted sense of space. It is therefore necessary to offer a unified, holistic experience of data and analytics anchored in the real world for local and remote participants.

The COVID-19 pandemic has highlighted the importance of human workers and the need to create a more resilient society. Outbreaks in workplaces, such as industrial plants and factories, have emphasized the importance of maintaining a healthy workforce. While lockdowns and travel bans continue to disrupt what was taken for granted as normal operation, the crisis has made us aware that many activities can be carried out remotely, with various levels of synchronicity in collaboration. Technology has facilitated communication, yet the crisis has made people painfully aware of its limitations: many activities require an awareness of space, a sense of presence, and freedom of movement. This project aims to investigate and develop technological solutions that counter these limitations and strengthen the ability to act from a safe distance.

The need for a new approach to immersive interaction in the real world goes beyond the recent crisis. Huge sums invested in digitalization have led to complex “cyber-physical systems”, i.e., physical systems monitored and controlled through networks of sensors and computational cores. Various data representing the physical entity are collected in digital twins comprising 3D structural data, semantic information, and sensor data. Nonetheless, despite major recent advances, the sensory experience offered through immersive technologies remains rather constrained. Data is available, yet access, interaction, and analytics are separate from the physical entity and are cumbersome to carry out in the physical world. Visualization and interaction metaphors, as well as analytics, were simply not built to appear in the context of the physical world. Yet, in various areas of expertise we need digital information about physical entities in real time. For entities with both a physical and a digital existence, how should the information referring to them best be combined into a coherent presentation? And what are appropriate interaction paradigms for analytics processes in immersive environments?

Objectives

This project builds new computational methods for immersive analytics, relying on i) data originating from a physical entity through a digital twin, ii) data originating in the user, obtained with physiological sensors, and iii) data originating in the immersive experience, so-called traces, recorded with software sensors. In a nutshell, the challenges we aim to address include how to present and interact with immersive data views, which physiological traits lead to perceptual optimization and personalization, and how to use traces of interactions in the environment to create recommendations and tutorials.

The ultimate goal is to offer a personalized experience of data and analytics anchored in the real world, for individual participants or in collaboration with peers, locally or remotely.

  • Investigate models of embodied interaction with immersive analytics.
  • Develop paradigms for remote collaborative immersive analytics.
  • Investigate personalization based on physiological sensing and social immersive training.

Project Structure

This project aims to finance seven (7) researchers: a post-doctoral researcher and a team of six (6) PhD candidates. Their work is organized along three (3) phases: analysis, collaboration, and learning. These phases cut across six research topics, one for each PhD candidate (see the figure below).

T1- Immersive interaction with the digital twin: T1 aims to investigate and advance interaction with the digital twin, including analytics methods and the use of historic data. Along these lines, we investigate the mental model that the user gains of the digital twin through interaction. This cognitive model will be used to propose novel paradigms for interacting with analytics processes, including the manipulation and comparison of the situation presented by the data at various timesteps in the past (from historic data) or in the future, if simulated data are available.

T2- Embodied interaction with data visualizations in space: T2 investigates the organization of space to deploy various forms of data views, and how the user's motion and actions in space should influence the type of presentation. Data views can be created in 2D or 3D. They can be anchored in space around an object, attached to planar surfaces, or laid out around the user. We investigate various forms of presentation and organization, aiming to make immersive analytics an extension of intuitive motion.

T3- Mediated perception for remote immersive analytics: For a remote user to perform analytics as if present at the location where the data originates, it is necessary to capture the location and convey it with sufficient fidelity to the user's senses. T3 covers the research and development of a “mediated perception platform” that delivers a first-person immersive experience to remote participants, for example using a semi-autonomous robot.

T4- Effects of prior knowledge on experiencing remote work and collaboration in data analytics: We investigate how prior domain knowledge affects the experience of, and performance in, remote and collaborative data analytics. Knowing the effect and relative importance of prior knowledge allows 1) making informed decisions about whether to initiate collaboration in specific circumstances, 2) choosing additional participants for collaboration, and 3) taking countermeasures, such as providing upfront information or training.

T5- Physiological models for personalized immersive analytics: We investigate the effects that manifest themselves in the user during the immersive experience. We will collect physiological data with sensors, including eye tracking, heart rate, galvanic skin response, electroencephalography, and electromyography. We will model the effects on human response and propose personalizations and/or optimizations for the immersive experience.

T6- Social immersive training and support: This topic aims at investigating methods to exploit knowledge gained while a user performs the immersive experience. To do so, activity traces will be logged, including the user's actions in the virtual environment. We will investigate the use of activity traces in two ways: for personalized recommendations, and to create workflows for tutorials.
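As an illustration, an activity trace could be recorded as a sequence of timestamped events. The following sketch is a minimal, hypothetical model: the field names and the workflow-extraction step are assumptions for illustration, not the project's final design.

    # Hypothetical trace record for T6; field names are illustrative assumptions.
    from dataclasses import dataclass
    from typing import List, Tuple

    @dataclass
    class TraceEvent:
        timestamp: float                      # seconds since session start
        user_id: str                          # anonymized participant identifier
        action: str                           # e.g. "select", "rotate", "annotate"
        target: str                           # id of the digital-twin element acted upon
        position: Tuple[float, float, float]  # user position in the immersive environment

    def workflow_from_traces(events: List[TraceEvent]) -> List[str]:
        """Order a session's events by time and reduce them to an action/target
        sequence, a possible starting point for generating tutorial workflows."""
        ordered = sorted(events, key=lambda e: e.timestamp)
        return [f"{e.action} -> {e.target}" for e in ordered]

Personalized recommendations could then be derived by comparing such sequences across users; the concrete modelling remains an open research question within T6.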

Funding

The COMET Module proposes research on groundbreaking topics aiming to enable immersive analytics (tele-)collaboration, bringing two perspectives so far rarely considered in immersive computing: personalization and paradigms for social learning.

This is a four-year project with a funding rate of 80%. It requires a commitment from private-sector partners totalling EUR 140,000 per year, distributed among five (5) partners. For each partner, this amounts to a maximum of EUR 112,000 over the four years (i.e., EUR 28,000 per year).
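Assuming an equal split among the five partners, the per-partner figures follow directly:

    EUR 140,000 per year / 5 partners = EUR 28,000 per partner per year
    EUR 28,000 per year x 4 years = EUR 112,000 per partner in total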