DoG

DoG explores the perception of animacy, intelligence, and embodiment in non-anthropomorphic avatars through a speculative scenario: a Living Space. By conceptualizing a virtual environment as a social entity, the project challenges conventional ideas of what constitutes a body and of how we perceive agency and intelligence. The scenario is tested through a carefully designed two-player VR experience.

The project was developed within the AIRLab at Politecnico di Milano as part of ongoing research. It was later exhibited at Milan Design Week 2025 and NAMA Nuovo Anfiteatro Martesana.

Virtual Reality

User Testing

Unity

Brief

Design a VR experiment that explores whether a radically non-anthropomorphic entity can be perceived as animate and intelligent through non-verbal interaction.

Objective

To investigate how humans attribute agency to a living space within a two-player VR experience, where the environment functions as an interactive social avatar.

Role

Research and design of experiment, interaction and gesture designs, interaction flows, sound design, 3D visuals, tutorial UI, user testing execution and analysis, and idea development.

Concept

To test a speculative scenario that challenges conventional ideas of what defines a body, we imagined a virtual space as the entity itself, drawing inspiration from literature such as Stanisław Lem’s Solaris, which explores the concept of a living environment. Unlike familiar avatars, a space does not occupy a single point, has no personal boundaries, and lacks a face or clear focal point of interaction. Yet it can respond socially through light, sound, and movement.

Experiment Setup

The scenario is tested through a two-player VR experience, where one participant embodies the environment and controls its reactions, while the other engages with it through nonverbal interaction. This setup allows us to test the following research questions.

Methodology

Surveys

To construct a large-scale organism that could move fluidly and be perceived as a single body, we designed the Space as a swarm: a coordinated system of repeated elements, inspired by flocks of birds. To define how this entity expresses itself, we identified key messages and assembled a moodboard of visual references.
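Bird-flock coordination of this kind is commonly modeled with the classic boids rules: cohesion, separation, and alignment. Below is a minimal 2D Python sketch of one simulation step under those three rules; the project itself was built in Unity, and every parameter and name here is illustrative, not taken from the actual implementation.

```python
import math

def step(boids, cohesion=0.01, separation=0.05, alignment=0.05, min_dist=1.0):
    """One boids update. boids: list of dicts with 'pos' and 'vel' (x, y) tuples."""
    n = len(boids)
    cx = sum(b["pos"][0] for b in boids) / n
    cy = sum(b["pos"][1] for b in boids) / n
    updated = []
    for b in boids:
        vx, vy = b["vel"]
        # Cohesion: steer toward the swarm's centre of mass.
        vx += (cx - b["pos"][0]) * cohesion
        vy += (cy - b["pos"][1]) * cohesion
        for o in boids:
            if o is b:
                continue
            dx = b["pos"][0] - o["pos"][0]
            dy = b["pos"][1] - o["pos"][1]
            d = math.hypot(dx, dy)
            if 0 < d < min_dist:
                # Separation: push away from crowded neighbours.
                vx += dx / d * separation
                vy += dy / d * separation
            # Alignment: drift toward neighbours' headings.
            vx += (o["vel"][0] - b["vel"][0]) * alignment / (n - 1)
            vy += (o["vel"][1] - b["vel"][1]) * alignment / (n - 1)
        updated.append({"pos": (b["pos"][0] + vx, b["pos"][1] + vy),
                        "vel": (vx, vy)})
    return updated

flock = [{"pos": (0.0, 0.0), "vel": (0.1, 0.0)},
         {"pos": (3.0, 1.0), "vel": (0.0, 0.1)},
         {"pos": (1.0, 4.0), "vel": (-0.1, 0.0)}]
flock = step(flock)
```

Because many identical elements obey the same local rules, the swarm reads as one continuous body even though no single element carries the identity.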


Drawing on Disney’s animation principles, we translated each message into abstract motion using biomechanical movement parameters. Then, through iterative surveys, we tested whether these movements alone could convey animacy and meaning, refining the design in each round.
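A mapping of this kind can be sketched as a small parameter table plus an easing curve (Disney's "slow in, slow out"). All message names and parameter values below are invented for illustration; they are not the project's actual tuning.

```python
import math

# Hypothetical expressive messages mapped to biomechanical motion parameters.
MESSAGES = {
    "curiosity":  {"speed": 0.4, "amplitude": 0.8},
    "excitement": {"speed": 0.9, "amplitude": 1.0},
    "caution":    {"speed": 0.2, "amplitude": 0.3},
}

def ease_in_out(t):
    """Slow in / slow out: cosine easing over t in [0, 1]."""
    return 0.5 - 0.5 * math.cos(math.pi * t)

def offset_at(message, t):
    """Swarm displacement at normalized time t for a given message."""
    p = MESSAGES[message]
    phase = ease_in_out(t) * p["speed"] * 2 * math.pi
    return p["amplitude"] * math.sin(phase)
```

Varying only speed and amplitude per message keeps the motion abstract, so surveys can isolate whether movement alone conveys the intended meaning.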

Initial Prototype Testing

Following insights from the surveys, we developed an interactive VR prototype, controllable through defined hand gestures. The first testing phase included 25 participants, each experiencing both the Visitor and Controller roles. Qualitative and quantitative feedback was collected through surveys and observation.

Play as Interaction


Fragmented Perception


Unclear Controls


Final Design

The final prototype introduced vocal sound feedback, a new embodiment system, refined gesture controls, and a redesigned body structure that unified all previously detached components. Swarm elements were redesigned for a more organic appearance while ensuring computational efficiency and avoiding unintended associations. Two distinct models were created to reduce visual repetition, both responsive to the Visitor’s touch through illumination.

Controls

The updated control system allowed the Controller to interact through:


  • Bump: Deform the boundary, creating limb-like protrusions that remain attached to the body; protrusion width is controlled by hand gesture.

  • Head Orientation: Steer the direction of the swarm elements with head orientation.

  • Lightwave: Trigger a light that spreads from the point where the boundary is touched, also used as a reward system.

  • Wind by Voice: Generate wind particles toward the Visitor using the voice, modulated by loudness and duration.

  • Environmental Light: Adjust the ambient light level using the angle of the left-hand thumb.

  • Portal: Toggle the visibility of a portal that enables switching between roles and also serves playful interaction.
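The six controls above amount to a mapping from recognized Controller inputs to reactions of the Space. A dispatch table is one simple way to sketch that mapping; every function name and formula here is hypothetical (the real system runs on Unity gesture and voice input), shown only to make the input-to-reaction structure concrete.

```python
# Hypothetical reactions of the Space; return values stand in for visual/audio effects.
def bump(width):
    return f"boundary deforms, width={width:.2f}"

def head_orientation(yaw):
    return f"elements steer toward yaw={yaw:.0f} deg"

def lightwave(point):
    return f"light spreads from {point}"

def wind_by_voice(loudness, duration):
    # Wind strength scales with voice loudness and how long it is held.
    return f"wind strength={loudness * duration:.2f}"

def environmental_light(thumb_angle):
    # Map a 0-90 degree thumb angle onto a 0-1 ambient light level.
    return f"ambient level={thumb_angle / 90:.2f}"

def portal(visible):
    return "portal shown" if visible else "portal hidden"

# Dispatch table: recognized input name -> reaction of the Space.
CONTROLS = {
    "bump": bump,
    "head_orientation": head_orientation,
    "lightwave": lightwave,
    "wind_by_voice": wind_by_voice,
    "environmental_light": environmental_light,
    "portal": portal,
}

def handle(name, *args):
    """Route a recognized Controller input to its reaction."""
    return CONTROLS[name](*args)
```

Keeping the mapping in one table makes it easy to refine individual gestures between testing rounds without touching the rest of the interaction flow.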

Icons

Tutorial panels introduced each role, and dynamic hand icons were added to provide constant visual guidance on available controls.

Final Testing

The final prototype was tested with 48 participants over four days. A seamless role-shift mechanism enabled a continuous experience, with the initial Controllers of each session following a defined interaction flow.

Future Improvements

Full body tracking


Tactile Feedback


Refining Vocal Input


Defne Kmo

DIGITAL DESIGNER & ARCHITECT

defnekmo@gmail.com
