ARTEC
ARTEC (Augmented Reality Through Emotional Computing) envisions a two-way dialogue between art and audience, celebrating the subjectivity of each visitor’s emotional response. It transforms the traditional museum experience into an interactive, personalized encounter, where artworks subtly change in AR based on the viewer’s emotions. Behind the scenes, the system combines machine learning, facial recognition, and biometric sensors to detect and interpret emotional states.
Augmented Reality
Machine Learning
Physical Prototype
Unity
Brief
Design an AR-based museum experience that captures and responds to visitors’ emotions, personalizing the experience for each viewer.
Objective
Encourage people to visit museums by offering an emotionally responsive experience that enhances the artwork.
Role
Dataset training, ML–Unity integration, Vuforia development, and research on emotion categorization and elicitation.
Concept
The system collects heart rate data from a wearable device and facial expression data from the phone’s camera. These inputs are processed by a machine learning model trained to recognize emotional states. Based on the detected emotion, the system generates a personalized augmented reality version of the artwork on the user’s screen, including interactive visual and auditory elements.
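A minimal sketch of this sensing pipeline, assuming heart-rate features from the ring and precomputed facial-expression features from the camera are concatenated and passed to a trained classifier; the feature names, label set, and choice of classifier are illustrative, not the project’s actual implementation.

```python
# Illustrative sketch of the ARTEC emotion-detection pipeline (assumed details):
# physiological features from the wearable + facial features from the camera
# are fused into one vector and classified into a discrete emotion label,
# which the AR layer then uses to select a scene preset.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

EMOTIONS = ["happy", "sad", "calm", "tense"]  # hypothetical label set

def build_feature_vector(heart_rate_bpm: float, hrv_ms: float,
                         facial_features: np.ndarray) -> np.ndarray:
    """Concatenate physiological and facial-expression features."""
    physio = np.array([heart_rate_bpm, hrv_ms])
    return np.concatenate([physio, facial_features])

def train_emotion_model(X: np.ndarray, y: np.ndarray) -> RandomForestClassifier:
    """Train on a labelled dataset of (feature vector, emotion) pairs."""
    model = RandomForestClassifier(n_estimators=100, random_state=0)
    model.fit(X, y)
    return model

def detect_emotion(model: RandomForestClassifier,
                   heart_rate_bpm: float, hrv_ms: float,
                   facial_features: np.ndarray) -> str:
    """Return the emotion label that drives the personalized AR scene."""
    x = build_feature_vector(heart_rate_bpm, hrv_ms, facial_features)
    return model.predict(x.reshape(1, -1))[0]
```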
Measuring Emotions
The design was guided by Russell’s Circumplex Model, mapping emotions by arousal and valence. We translated emotions into visual and sensory cues: color [1], musical intervals [2] [3], and environmental elements like weather and light [4]. These subtle changes shaped each AR scene to mirror the user’s emotional state.
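The sketch below illustrates how a (valence, arousal) reading could be mapped to the cue categories above; the specific colors, musical intervals, and weather effects are placeholders standing in for the project’s actual palette.

```python
# Hedged sketch: map a point on Russell's Circumplex Model (valence and arousal
# both in [-1, 1]) to the visual and auditory cues that shape the AR scene.
# The concrete cue values below are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class SceneCues:
    color: str     # dominant tint applied to the AR artwork [1]
    interval: str  # musical interval coloring the audio layer [2][3]
    weather: str   # ambient weather/light effect in the scene [4]

def cues_for(valence: float, arousal: float) -> SceneCues:
    """Pick a cue preset from the Circumplex quadrant of the reading."""
    if valence >= 0 and arousal >= 0:   # excited / joyful
        return SceneCues("warm yellow", "major third", "bright sunlight")
    if valence >= 0 and arousal < 0:    # calm / content
        return SceneCues("soft green", "perfect fifth", "clear dusk")
    if valence < 0 and arousal >= 0:    # tense / angry
        return SceneCues("saturated red", "minor second", "storm clouds")
    return SceneCues("desaturated blue", "minor third", "light rain")  # sad

print(cues_for(0.4, -0.2))  # SceneCues(color='soft green', ...)
```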

Design Solution

Scenario

Visual Cues
Lunapark, Giacomo Balla
Audio Cues
Natura morta con violino, Georges Braque
Prototype
A mid-fidelity prototype was developed in the shape of a ring, aiming for a non-invasive design. The device communicated wirelessly through a low-power chip to minimize battery consumption.
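As a rough sketch of how the phone could read the ring’s data, the snippet below streams heart-rate notifications over Bluetooth Low Energy using the standard GATT Heart Rate Measurement characteristic; the BLE assumption, device address, and the bleak library are choices made here for illustration, not details confirmed by the prototype.

```python
# Assumed setup: the ring exposes the standard Bluetooth LE Heart Rate service.
# This sketch subscribes to its notifications with the bleak library and parses
# the beats-per-minute value according to the GATT flags byte.
import asyncio
from bleak import BleakClient

HR_MEASUREMENT_UUID = "00002a37-0000-1000-8000-00805f9b34fb"  # Heart Rate Measurement

def handle_hr(_sender, data: bytearray) -> None:
    """Parse one Heart Rate Measurement notification."""
    if data[0] & 0x01:                       # flags bit 0 set -> 16-bit value
        bpm = int.from_bytes(data[1:3], "little")
    else:                                    # otherwise 8-bit value
        bpm = data[1]
    print(f"Heart rate: {bpm} bpm")

async def stream(address: str) -> None:
    async with BleakClient(address) as client:
        await client.start_notify(HR_MEASUREMENT_UUID, handle_hr)
        await asyncio.sleep(30.0)            # stream for 30 seconds
        await client.stop_notify(HR_MEASUREMENT_UUID)

asyncio.run(stream("AA:BB:CC:DD:EE:FF"))     # hypothetical device address
```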

Testing
The ARTEC prototype was tested with 12 participants over one week to assess the system’s functionality and its ability to engage users emotionally. The setup simulated a museum environment, with three printed artworks from Museo del 900 displayed on the walls. Participants were given a phone with the ARTEC app and a working prototype of the ring. Each session included a pre-test questionnaire on participants’ background, a hands-on trial of the three AR experiences, and post-test surveys evaluating emotional reactions and overall feedback.


Key Insights
Emotional Accuracy
Engaging Potential
Model Inclusivity
References
[1] P. Valdez and A. Mehrabian. 1994. Effects of color on emotions. Journal of Experimental Psychology: General, 123, 4 (1994), 394–409.
[2] B. Schuller, J. Dorfner, and G. Rigoll. 2010. Determination of nonprototypical valence and arousal in popular music: Features and performances. EURASIP Journal on Audio, Speech, and Music Processing, 2010, 1 (2010), 1–19.
[3] I. Lahdelma and T. Eerola. 2014. Single chords convey distinct emotional qualities to both naïve and expert listeners. Psychology of Music, 44, 1 (2014), 37–54.
[4] L. Venz and A. Pundt. 2021. Rain, rain go away! A diary study on morning weather and affective well-being at work. Applied Psychology, 70, 4 (2021), 1856–1871.