
Gaia's Gaze
Parnavi Dinkar | Prem Sai P | Aditi Sharma | Sayali Junagade | Tina Soni | Vivek Pal
Gaia’s Gaze is an immersive art installation developed to bridge the gap between human actions and their impact on Earth’s environment. The project offered participants a sensory journey of dynamic visuals and soundscapes, prompting them to reflect on their environmental responsibility.
The experience was co-created as a part of Team Pixels and was recognized for its innovation at the HMI Horizon showcase, held at Hive Lab, IIT Kanpur.

Project at a Glance
👩🏽‍🔧
Problem & Context
People understand climate change intellectually, but rarely feel the impact of their actions in the moment. Environmental damage often feels distant, abstract, and disconnected from everyday behavior, making accountability easy to ignore.
Gaia's Gaze was created to close this gap by turning human movement into an immediate, emotional reflection of planetary health.
💡
Solution & Iteration
Gaia’s Gaze is an immersive, motion-responsive installation where visitors influence the state of a digital Earth in real time. Using motion sensors, generative visuals, and adaptive soundscapes, the system translates human activity into visible planetary flourishing or decay.
Calm, mindful movement sustains balance, while chaotic motion triggers deterioration, making cause and effect instantly perceptible through sight and sound.
🎯
Impact & Outcome
The installation transforms environmental awareness from passive observation into embodied experience. By linking physical behavior to real-time planetary response, Gaia’s Gaze encourages reflection, restraint, and responsibility.
Exhibited at the HMI Horizon showcase at IIT Kanpur, the project was recognized for its innovative use of technology to drive emotional engagement and public understanding of environmental impact.
Scope of the Project
🌏
Design and develop a visually rich Earth model that dynamically responds to audience interactions, symbolizing flourishing and decay as a direct result of motion.
🎧
Compose two distinct soundtracks, serene and ominous, adapted to visitor behavior, making environmental change immediately perceptible through auditory cues.
🎛️
Integrate real-time motion detection using Azure Kinect sensors to trigger changes in both the visual representation and the accompanying soundscapes, crafting a multisensory feedback loop (see the motion-energy sketch after this list).
👾
Deliverables included a fully functional installation, a video demonstration showcasing visitor interaction, and a user guide to facilitate meaningful engagement.
👩🏽‍💻
Employ advanced creative tools, including TouchDesigner for generative visuals and audio, and Blender for 3D modeling and animation.
🖼️
Target audiences ranged from exhibition visitors and art gallery patrons to students and the environmentally conscious public, broadening awareness through experience-driven learning.
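
The input that drives this feedback loop is a single "activity" value describing how much visitors are moving. The snippet below is a minimal sketch of one way to compute it: average frame-to-frame displacement of tracked body joints, smoothed over time. The joint data here is simulated; in the installation it would come from the Azure Kinect body-tracking stream, and all names are illustrative rather than taken from the project code.

```python
import numpy as np

def motion_energy(prev_joints: np.ndarray, curr_joints: np.ndarray) -> float:
    """Average per-joint displacement between two frames.

    Both arrays are shaped (num_joints, 3); in the installation these
    would be Azure Kinect body-tracking joint positions.
    """
    return float(np.linalg.norm(curr_joints - prev_joints, axis=1).mean())


# Exponential smoothing keeps the signal from flickering frame to frame.
SMOOTHING = 0.9
activity = 0.0

rng = np.random.default_rng(0)
prev = rng.normal(size=(32, 3))           # 32 joints, simulated here

for frame in range(300):                  # stand-in for the sensor loop
    curr = prev + rng.normal(scale=0.01, size=prev.shape)
    raw = motion_energy(prev, curr)
    activity = SMOOTHING * activity + (1 - SMOOTHING) * raw
    prev = curr

print(f"smoothed activity: {activity:.4f}")
```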
Technical Highlights
Motion Tracking
Leveraged the Azure Kinect to capture user motion, integrating data to drive both visual and auditory responses.
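The write-up does not specify how sensor data reached the visual system; a common pattern, assumed here purely for illustration, is to publish a single activity channel from the Kinect-side process into TouchDesigner over OSC. A minimal sketch using the python-osc package (the port and address are placeholders):

```python
from pythonosc.udp_client import SimpleUDPClient

# Assumed setup: TouchDesigner listens with an OSC In CHOP on this port.
client = SimpleUDPClient("127.0.0.1", 9000)

def publish_activity(activity: float) -> None:
    """Send the smoothed activity value to the visual/audio engine."""
    client.send_message("/gaia/activity", activity)

publish_activity(0.12)   # example value from the motion-energy stage
```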
Real-time Processing
TouchDesigner handled interactive visuals and synchronized audio feedback, while Blender was used to animate transitions between planetary health and decay.
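Inside TouchDesigner, a small Python callback can fan the incoming health channel out to the operators that control the Earth's appearance. The snippet below sketches a CHOP Execute DAT callback; the operator names ('decay_cross', 'earth_geo') are placeholders, not the project's actual network.

```python
# CHOP Execute DAT callback (sketch): runs whenever the incoming
# 'health' channel changes. Operator names are placeholders.

def onValueChange(channel, sampleIndex, val, prev):
    health = max(0.0, min(1.0, val))      # clamp to 0..1

    # Blend between the flourishing and decayed Earth renders
    # (assumed here to be the two inputs of a Cross TOP).
    op('decay_cross').par.index = 1.0 - health

    # Subtly shrink the globe as health drops
    # (uniform scale on a Geometry COMP).
    op('earth_geo').par.scale = 0.9 + 0.1 * health
    return
```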
Creative Audio
Custom soundtracks merged ambient nature with distressing environmental cues, enhancing immersion and emotional response.
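One straightforward way to realise the serene-to-ominous shift, sketched here as an assumption rather than the installation's actual method, is an equal-power crossfade between the two stems weighted by the current planetary-health value:

```python
import numpy as np

def mix_soundscape(serene: np.ndarray, ominous: np.ndarray, health: float) -> np.ndarray:
    """Equal-power crossfade between two equal-length audio buffers.

    health = 1.0 -> only the serene stem; health = 0.0 -> only the ominous one.
    """
    health = float(np.clip(health, 0.0, 1.0))
    angle = health * np.pi / 2
    w_serene = np.sin(angle)      # equal-power weights: w1^2 + w2^2 == 1
    w_ominous = np.cos(angle)
    return w_serene * serene + w_ominous * ominous


# Tiny demo with synthetic buffers standing in for the two soundtracks.
t = np.linspace(0, 1, 48000)
serene = 0.2 * np.sin(2 * np.pi * 220 * t)
ominous = 0.2 * np.sin(2 * np.pi * 55 * t)
blended = mix_soundscape(serene, ominous, health=0.3)
```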
Collaboration
As part of a six-member team, each contributor brought unique skills for addressing technical and creative challenges, from 3D modeling to user experience design.


Concept and Experience
The installation centered on a vividly rendered Earth model that responded in real time to visitors’ movements. Using TouchDesigner for real-time visual and audio interactivity and Blender for realistic 3D modeling, the artwork demonstrated the ecosystem's delicate balance:
During moments of stillness, the Earth's visual state flourished with lush, vibrant details.
Intense activity or erratic movement by participants triggered visible decay, with the Earth deteriorating before their eyes.
Complementing audio shifted from serene, harmonious nature sounds to ominous, urgent soundscapes, reinforcing the emotional impact.
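This cause-and-effect rule can be captured by a single health state that recovers during stillness and erodes under erratic movement. A minimal sketch of such an update rule follows; the thresholds and rates are invented for illustration, not the installation's actual tuning.

```python
def update_health(health: float, activity: float, dt: float,
                  calm_threshold: float = 0.02,
                  recovery_rate: float = 0.05,
                  decay_rate: float = 0.5) -> float:
    """Advance planetary health by one time step.

    Low activity lets the Earth recover toward 1.0 (flourishing);
    activity above the threshold pushes it toward 0.0 (decay).
    All constants here are illustrative.
    """
    if activity <= calm_threshold:
        health += recovery_rate * dt
    else:
        health -= decay_rate * (activity - calm_threshold) * dt
    return min(1.0, max(0.0, health))


# Example: sustained stillness slowly restores a damaged ecosystem.
health = 0.4
for _ in range(600):                 # ~10 s at 60 updates per second
    health = update_health(health, activity=0.01, dt=1 / 60)
print(round(health, 2))              # -> 0.9
```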

Outcome and Recognition
Gaia’s Gaze successfully delivered an immersive, introspective experience that not only captivated audiences but also encouraged a deeper understanding of environmental issues.
The project was exhibited at the HMI Horizon showcase by Hive Lab, IIT Kanpur, where it garnered attention for its innovative use of technology in art and public engagement.

My Contributions
Co-developed the real-time motion-driven ecosystem visualization and audio system.
Designed user interaction mechanics to create meaningful feedback loops that support environmental reflection.
Collaborated in both creative ideation and technical implementation, adapting to challenges and iterating on design and experience quality.