Module 3 Activity Research
Weekly Activity Template
Xinyu Lu
Project 3
Module 3
This module is an extension of the research and explorations from Project 1 and Project 2, continuing our focus on how real-time data can shape interactive visual experiences. While the previous projects used environmental inputs such as images and light, Project 3 shifts toward sensing the user’s physical state through a force sensor and a pulse sensor, creating a more personal and responsive system.
In this project, we built a prototype that combines Arduino and TouchDesigner to turn body-based data into visual feedback. When the user leans on the cushion, the force sensor detects pressure changes, and the pulse sensor on the wristband reads heart-rate values. These signals are sent to TouchDesigner, where they control different visual modes. Fast and intense visuals appear when the user’s heart rate is high, while slow and calm visuals show when the user is relaxed.
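The mode switch described above can be sketched as a simple threshold rule on the heart-rate reading. This is an illustrative sketch in Python (the language TouchDesigner scripts use); the BPM thresholds and mode names are assumptions for demonstration, not values taken from the project.

```python
def select_visual_mode(bpm, calm_max=75, active_min=95):
    """Map a heart-rate reading (BPM) to a visual mode name.

    calm_max and active_min are hypothetical thresholds: readings at or
    below calm_max trigger the calm visuals, readings at or above
    active_min trigger the fast visuals, and anything in between blends.
    """
    if bpm >= active_min:
        return "fast_intense"
    if bpm <= calm_max:
        return "slow_calm"
    return "transition"
```

In practice a hysteresis band like the middle "transition" zone helps keep the visuals from flickering between modes when the heart rate hovers near a single threshold.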
Through this module, we explored sensor integration, data processing, and media output mapping. We learned how to connect multiple sensors, stabilize their readings, and translate live data into animated particle systems and geometric effects. The final outcome is an interactive relaxation device that supports self-awareness and emotional regulation, continuing the overall design goal established in earlier projects—using real-time data to help users feel more present, calm, and connected to their environment.
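The reading-stabilization step mentioned above is commonly done with a moving-average filter. A minimal Python sketch, assuming raw sensor values arrive one at a time (the window size of 10 is an illustrative choice):

```python
from collections import deque


class MovingAverage:
    """Smooth noisy sensor readings with a fixed-size moving average."""

    def __init__(self, window=10):
        # deque with maxlen automatically discards the oldest sample
        self.samples = deque(maxlen=window)

    def update(self, value):
        """Add a new raw reading and return the current smoothed value."""
        self.samples.append(value)
        return sum(self.samples) / len(self.samples)
```

A larger window gives steadier visuals but reacts more slowly to real changes in pressure or pulse, so the window size is a tuning trade-off.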
Additional Research or Workshops
Project 2
Project 3 Final Prototype
Final Prototype – Video
Project 3 builds on the directions from Projects 1 and 2, exploring how real-time data from the user's body can shape visual output to help users relax. In this stage, we replaced the previous light sensor with a force sensor and a pulse sensor to create a more personal and responsive media experience.
The prototype uses an Arduino to read pressure changes from a force sensor in the cushion and heart-rate data from a wristband pulse sensor. When the user leans on the cushion or wears the wristband, the sensors send data to TouchDesigner, which then triggers different visuals. High heart rates activate fast and intense animations, while stable heart rates produce calm and gentle motions. The system can remind users to slow down, or support calming and meditation practices.
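One common way to move the two readings from the Arduino into TouchDesigner is to send them as a comma-separated line over serial (e.g. `"312,78"` for force and BPM) and parse it on the TouchDesigner side. This is a hypothetical sketch of that parsing step, assuming that line format; the actual project's wire format is not specified in this write-up.

```python
def parse_sensor_line(line):
    """Parse an assumed 'force,bpm' serial line from the Arduino.

    Returns a (force, bpm) tuple of floats, or None if the line is
    malformed (partial reads and noise are common on serial links).
    """
    parts = line.strip().split(",")
    if len(parts) != 2:
        return None
    try:
        return float(parts[0]), float(parts[1])
    except ValueError:
        return None
```

Returning `None` for bad lines lets the visual layer simply keep its last good values instead of jumping when a corrupted reading arrives.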
Through this project, we learned how to connect multiple sensors and transform real-time data into interactive visuals. By experimenting with particle effects and mapping sensor values to visual behaviors, we created a system that turns physical input into meaningful feedback for relaxation.
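The mapping of sensor values to visual behaviors described above typically comes down to a clamped linear range remap (the same idea as Arduino's `map()` function). A small illustrative sketch; the input and output ranges below are placeholder examples, not the project's actual calibration:

```python
def map_range(value, in_min, in_max, out_min, out_max):
    """Linearly remap a sensor reading into a visual parameter range.

    The result is clamped so out-of-range sensor spikes cannot push a
    visual parameter (e.g. particle speed) beyond its intended bounds.
    """
    t = (value - in_min) / (in_max - in_min)
    t = max(0.0, min(1.0, t))
    return out_min + t * (out_max - out_min)
```

For example, a 10-bit force reading (0 to 1023) could be remapped to a particle-emission rate between 0 and 500 with `map_range(reading, 0, 1023, 0, 500)`.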