Interactive floor for visitors using projector-based AR

Transform movement into interaction — this study explores intuitive foot-based control in projector-based AR spaces leveraging body tracking


Summary

Our project designed and developed an interactive floor where your feet become the controller in a projector-based AR experience. We explored how intuitive foot gestures and visual feedback create a seamless, engaging, and hardware-free way to interact with digital spaces.

Features

  • Real-time image processing for seamless interaction.
  • Immediate visual feedback and immersive gameplay for high usability.
  • Visually appealing and responsive game environment.
  • Optimized gestures and design for projector-based AR interaction floors.

Gameplay

In our interactive floor experience, players use their feet to navigate through a projector-based AR environment. In the first layer, they walk through stardust, attracting and deflecting stars to familiarize themselves with the interaction. As they progress to the second layer, they collect stars to earn points while dodging asteroids, aiming for a high score. In the final layer, players can choose to return to the stardust or replay the game to improve their performance. The intuitive foot-based controls and real-time feedback create an engaging and immersive experience.

Layer 1: Attention


In the first layer, players walk through stardust, discovering how their movements attract and deflect stars. As they explore, they gradually learn the interaction mechanics, eventually reaching the spawn point to enter the next layer.
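The attract behaviour of this layer can be sketched as a simple pull on each stardust particle toward a nearby foot. This is an illustrative sketch with invented names and values (`attract`, `strength`, `max_dist`), not the project's Unity code:

```python
# Illustrative sketch (hypothetical values): a stardust particle drifts
# toward a nearby foot, implementing the "attract" behaviour of layer one.
import math

def attract(particle, foot, strength=0.05, max_dist=1.5):
    """Return the particle nudged toward the foot if it is within range."""
    dx, dy = foot[0] - particle[0], foot[1] - particle[1]
    dist = math.hypot(dx, dy)
    if dist == 0 or dist > max_dist:
        return particle  # out of range: no pull
    # Move a fixed step along the unit vector pointing at the foot
    return (particle[0] + strength * dx / dist,
            particle[1] + strength * dy / dist)

print(attract((0.0, 0.0), (1.0, 0.0)))  # (0.05, 0.0)
```

Deflection would work the same way with the sign of the step inverted, pushing particles away instead of pulling them in.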

Layer 2: Interaction


In the second layer, players collect stars to earn points while dodging incoming asteroids. The goal is to set a high score and stay in the game as long as possible, balancing risk and reward with precise movements.
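At its core, collecting a star comes down to a proximity check between the tracked foot and the star on the floor plane. A minimal sketch, with hypothetical names and an assumed collection radius:

```python
# Minimal sketch (illustrative, not the project's code): decide whether a
# tracked foot position collects a star, the basic check behind the
# "collect stars while dodging asteroids" layer.
import math

def is_collected(foot_xy, star_xy, radius=0.25):
    """True if the foot is within `radius` metres of the star on the floor."""
    dx = foot_xy[0] - star_xy[0]
    dy = foot_xy[1] - star_xy[1]
    return math.hypot(dx, dy) <= radius

print(is_collected((1.0, 2.0), (1.1, 2.1)))  # True: ~0.14 m away
print(is_collected((1.0, 2.0), (2.0, 2.0)))  # False: 1.0 m away
```

The same distance test, with a different radius, can drive the asteroid collision that ends a run.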

Layer 3: Repeat or quit


In the final layer, players master the interaction and choose their path—return to the stardust of layer one or replay the game for a higher score.

Technologies

Our project combines several technologies to create an intuitive and immersive interactive floor experience. The Azure Kinect camera, together with the Azure Kinect Body Tracking SDK, enables precise motion detection. Unity serves as the foundation for gameplay, with the interaction logic written in C#. OpenCvSharp handles the projection calculations, ensuring accurate visual alignment and responsiveness.
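The projection calculation maps a tracked position from camera coordinates to projector coordinates, typically via a planar homography estimated during calibration. A minimal sketch of applying such a 3x3 homography, in plain Python with illustrative matrix values (the project itself uses OpenCvSharp in C#):

```python
# Minimal sketch (assumed workflow, illustrative values): map a tracked
# foot position from camera space to projector/floor space with a 3x3
# planar homography, as computed during projector-camera calibration.

def apply_homography(H, point):
    """Apply a 3x3 homography H (nested lists) to a 2D point (x, y)."""
    x, y = point
    # Homogeneous multiplication: [x', y', w'] = H @ [x, y, 1]
    xp = H[0][0] * x + H[0][1] * y + H[0][2]
    yp = H[1][0] * x + H[1][1] * y + H[1][2]
    w  = H[2][0] * x + H[2][1] * y + H[2][2]
    # Perspective divide back to 2D
    return (xp / w, yp / w)

# Translation-only homography (illustrative calibration result)
H = [[1.0, 0.0, 50.0],
     [0.0, 1.0, 20.0],
     [0.0, 0.0, 1.0]]

print(apply_homography(H, (100.0, 200.0)))  # (150.0, 220.0)
```

In practice the matrix would be estimated once from matched camera and projector points (e.g. with OpenCV's `findHomography`) and then applied every frame.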

Key terms

  • Real-Time Motion Tracking: uses Azure Kinect to detect foot movements instantly
  • Foot-Based Interaction: enables intuitive control without external hardware
  • Projection Mapping: uses OpenCvSharp to align visuals seamlessly with the floor
  • Immersive AR Experience: transforms the IIT entrance into an interactive space
  • Gesture Optimization: designed for natural and intuitive movement
  • High Usability: combines clear visual cues, responsive feedback, and intuitive controls
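Responsive feedback from noisy skeleton data usually needs some filtering. A common approach, sketched here with assumed values (this is not the project's implementation), is exponential smoothing of the tracked foot position:

```python
# Illustrative sketch: exponential smoothing of noisy tracked foot
# positions, a common way to keep on-floor feedback responsive
# without visible jitter. The blend factor `alpha` is an assumption.

def smooth(prev, new, alpha=0.5):
    """Blend the previous filtered position with a new raw sample."""
    return tuple(alpha * n + (1 - alpha) * p for p, n in zip(prev, new))

pos = (0.0, 0.0)
for raw in [(1.0, 0.0), (1.0, 0.0), (1.0, 0.0)]:
    pos = smooth(pos, raw)  # converges toward the raw position
print(pos)  # (0.875, 0.0)
```

A higher `alpha` favours responsiveness; a lower one favours stability, which is the trade-off behind tuning for high usability.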

Customer

Institut für Interaktive Technologien, Hochschule für Informatik FHNW
Elif Gurcinar
Bahnhofstrasse 6
5210 Windisch

Team

Caspar Mücke
Lukas Tadeu

Advisors
Kevin Kim
Yves Simmen