Predoc seminar: Luis Quintero

Seminar

Date: Thursday 1 June 2023

Time: 14.00 – 17.00

Location: Room M20, DSV, Nod building, Borgarfjordsgatan 12, Kista

Welcome to a predoc seminar on adaptive virtual reality experiences! Luis Quintero, PhD student at DSV, is the respondent.

On 1 June 2023, PhD student Luis Quintero will present his ongoing work on “User Modeling for Adaptive Virtual Reality Experiences: Personalization from behavioral and physiological time series”. The seminar takes place at the Department of Computer and Systems Sciences (DSV), Stockholm University.

Respondent: Luis Quintero, DSV
Opponent: Jefrey Lijffijt, Ghent University, Belgium
Main supervisor: Uno Fors, DSV
Supervisors: Panagiotis Papapetrou and Jaakko Hollmén, DSV
Professor closest to the subject: Rahim Rahmani, DSV

Contact Luis Quintero


Abstract

Research in human-computer interaction (HCI) has focused largely on designing technological systems that serve a beneficial purpose, offer intuitive interfaces, and adapt to a person’s expectations, goals, and abilities. Nearly all digital services in our daily lives now offer personalization capabilities, mainly due to the ubiquity of mobile devices and the progress of machine learning (ML) algorithms.

Web, desktop, and smartphone applications inherently gather metrics on the system’s functioning and users’ activity to improve the attractiveness of their products and services. In the near future, interactive systems will be designed around the hardware, interfaces, and algorithms currently in development, which may one day become pervasive in society.

This thesis presents research on the technical and human aspects of interactive technology that goes beyond the traditional assembly of a 2D display, touchscreen, keyboard, and mouse. The publications in the thesis provide insights into how human behavior can be measured more precisely through portable body sensors and incorporated into applications that use modern visualization media such as immersive virtual reality (VR).

The papers contribute frameworks and algorithms that harness multimodal time-series data and state-of-the-art ML classifiers in user-centered VR applications. The data multimodality comprises continuous motion trajectories and body measurements that capture subjective factors related to user experience. Additionally, the ML algorithms exploit the temporality and size of such datasets, performing automatic data analysis and providing interpretable insights into user-specific responses related to subjective cognitive factors such as emotions or skills.

This thesis gives an outlook on how the combination of recent hardware and algorithms may unlock unprecedented opportunities to create 3D experiences tailored to each user and help them attain specific goals with VR-based systems. The discussion shows the pertinence of interpretable ML models in context-aware VR systems for overcoming the limitations of accurate but opaque ML models, and comments on the potential ethical implications of personalization based on behavioral and physiological time-series data in immersive VR experiences.