Stockholm University

Research project: Mixed Reality Shared Engagement in Cultural Events (SECE)

The project creates opportunities for a new form of public performance where the line between artists and audience is blurred. We work with mixed and augmented reality together with immersive participation and new mobile communication technology. We will demonstrate this in an interactive performance in a public environment.

Real and virtual objects and actors interact in mixed reality (MR). Screenshot from the project.

The SECE project is a collaboration between the Department of Computer and Systems Sciences (DSV) at Stockholm University, Ericsson Research and Kulturhuset Stadsteatern.

We focus on creating a completely new form of public performance where physical actors can interact with virtual actors, and with physical and virtual objects. The project develops and studies both artistic creative processes and new wireless communication technology such as Wi-Fi 7 and 6G. We also use a number of prototypes developed in collaboration between Stockholm University and Ericsson Research during 2023–2024.

Central to the SECE project is the use of mixed reality (MR), which enables us to mix physical and virtual actors, objects and environments. The goal is to enable performances outdoors as well, which is a major challenge with today’s technology.

The project draws on expertise from many different areas, such as mobile communication, augmented and mixed reality, artistry and choreography.

A first public demo will take place at Kulturhuset in Stockholm in June 2025. A final outdoor performance, involving a number of dancers and actors, is planned for late spring 2026.

Hear two of the project members, Pernilla Jonsson and Michael Björn from Ericsson Research, discuss the “Internet of Senses”.

Project description

Project aims:

  • Creating a completely novel arena for immersive, participatory and creative cultural events using mobile communications and 3D/Mixed Reality (MR) innovations
  • Allowing artists and audience to participate in real-time performative events like theatre, music and dance, both indoors and outdoors in public spaces
  • Developing an experimental mobile AR/MR visual and auditory platform that allows the creation of new types of participatory artistic performances, utilizing cutting-edge co-location and spatial map/digital twin technologies

Project vision:

  • Develop a platform for new forms of theatre, music and other performances that attract new audiences
  • Offer audiences the possibility to be actively involved in and/or interact with the performance, as people can in digital games or social media
  • Support theatres and other cultural organizations in attracting new audiences in today’s competitive world
  • Support industry in realizing the massive business opportunities that 5G/6G, Wi-Fi 6E/7 and AR/MR offer
  • Drive a paradigm shift for participatory mobile communication and collaborative AR/MR environments
  • Support the entertainment industry and other players in reaching new groups of audiences and participants, young and old
  • Combine real places, real and virtual participants, and real and virtual objects
  • Co-locate both people and objects
  • Develop never-before-seen events where the borders between actors, audience, places and objects are torn down
  • Seamlessly integrate real and virtual people, objects and scenery in real time, using state-of-the-art mobile communications and MR technology

Project members

Project managers

Uno Fors

Researcher

Department of Computer and Systems Sciences

Pernilla Jonsson

Ericsson Research

Charlotte Nelson Prag

Kulturhuset Stadsteatern

Henricus Verhagen

Professor, Unit Head IDEAL

Department of Computer and Systems Sciences

Members

Jordi Solsona Belenguer

Senior Lecturer

Department of Computer and Systems Sciences

Tobias Falk

Senior Lecturer

Department of Computer and Systems Sciences

Noak Petersson

PhD student

Department of Computer and Systems Sciences

António Miguel Beleza Maciel Pinheiro Braga

Research Engineer

Department of Computer and Systems Sciences

Luis Velez Quintero

Associate Senior Lecturer

Department of Computer and Systems Sciences

Michael Björn

Ericsson Research

Huani Yao

Ericsson Research

Andreas Eriksson

Kulturhuset Stadsteatern

Alexander Ragge

Kulturhuset Stadsteatern

Robin Jonsson

Freelance XR/MR choreographer

More about this project

In this research project, we experiment with real-time interactions between physical and virtual environments, actors and objects. The short videos below illustrate, among other things, how virtual objects can be used in physical rooms, and how physical actors can collaborate in a hybrid context.
