



















Lucid Dream



Collaborative Project / Interactive Digital Installation









Lucid Dream

A Real-Time Interactive Digital Installation
(Code-Based)



Collaborative Project (2024)
Contributors: Choi Yerin, Min Kyu-Won


-

Theme: Data Spaces

Software: p5.js, Visual Studio Code, TouchDesigner, Scaniverse, Photoshop


-

I played a leading role in the development process, managing coding tasks, time coordination, and workflow integration to ensure the project's success.

-

For full documentation, 
visit lucidreamyrkw.netlify.app
(structured from a provided template)





3D visual created by Min Kyu-Won (Experiment 2 / Blender)




Project Overview

Lucid Dream is an interactive digital installation that immerses users in a dream-like experience through real-time audio recognition, generative visuals, and interactive elements. 

Developed as part of a group project under the theme of Data Spaces, the project investigates the intersection of perception and digital interaction through five different experiments.



























Lucid Dream Main Poster | Designed by Yerin

    My Core Contributions

    Experiment 1
    Body Rhythm

    Developed the first interactive sequence to introduce users to the dream-state concept.




    Experiment 01

    Experiment 3
    Haziness

    Independently developed this experiment, focusing on coding and visual execution using p5.js.




    Experiment 03

    Experiment 5
    Audio Recognition

    Developed the interaction system connecting real-time audio recognition to generative visuals in TouchDesigner (visuals created by a teammate).

    Experiment 05

    Main Poster Design

    Designed the primary promotional poster for Lucid Dream, visually encapsulating the project's themes.




    Website Documentation

    lucidreamyrkw.netlify.app

    Documented outcomes, process, and observations; built in Visual Studio Code.









    Collaboration

    - Led the coding and development process, ensuring seamless integration between interaction and visual elements.

    - Managed time constraints, setting realistic goals and keeping the project on track.

    - Worked closely with teammates to coordinate tasks and align the final output.





    While this was a collaborative project, my contributions in coding, interaction logic, workflow management, and visual design played a key role in shaping the final experience.










    Experiment 1
    Hidden Body Rhythm



    p5.js Links
    Experiment1_Exercise3_Ver01

    Experiment1_Exercise3_Ver02



    Pulse
    Breath
    Blink



    This experiment visualizes the body's unconscious rhythms—breath, pulse, and blink—through p5.js. By collecting data from my teammate and me, the visuals reflect differences in bodily rhythms; my teammate’s readings were nearly half of mine. Circles represent these rhythms, with sizes varying based on individual data. Lines connecting the circles adjust in opacity, weight, and visibility, dynamically shifting as neighboring circles reach specific thresholds. This organic visual system highlights the body's continuous yet unnoticed activity.
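
    As a rough sketch of this logic in p5.js (the values in rhythms are made-up placeholders, not our recorded data):

    // Minimal sketch: circles sized by body-rhythm values, connected by
    // lines whose opacity and weight shift as distances cross a threshold.
    let rhythms = [72, 36, 16, 8, 22, 11]; // illustrative readings only
    let circles = [];

    function setup() {
      createCanvas(600, 600);
      for (let r of rhythms) {
        circles.push({
          x: random(width),
          y: random(height),
          size: map(r, 0, 80, 10, 90), // larger reading, larger circle
          seed: random(1000),          // per-circle noise offset for drift
        });
      }
    }

    function draw() {
      background(10);
      // Slow noise-driven drift keeps the system organically in motion.
      for (let c of circles) {
        c.x += map(noise(c.seed, frameCount * 0.01), 0, 1, -1, 1);
        c.y += map(noise(c.seed + 99, frameCount * 0.01), 0, 1, -1, 1);
      }
      // Lines appear only below a distance threshold, fading with distance.
      let threshold = 250;
      for (let i = 0; i < circles.length; i++) {
        for (let j = i + 1; j < circles.length; j++) {
          let a = circles[i], b = circles[j];
          let d = dist(a.x, a.y, b.x, b.y);
          if (d < threshold) {
            stroke(255, map(d, 0, threshold, 255, 0));
            strokeWeight(map(d, 0, threshold, 3, 0.2));
            line(a.x, a.y, b.x, b.y);
          }
        }
      }
      stroke(255);
      strokeWeight(1);
      noFill();
      for (let c of circles) ellipse(c.x, c.y, c.size);
    }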








    Experiment 3
    Haziness (몽롱함)



    The Korean word "몽롱하다" describes a state of haziness or dazedness, often experienced in dreams or moments of drowsiness. This exploration aimed to translate that sensation into fluid, textural visuals using p5.js. By leveraging body rhythm data (breath, pulse, and blink patterns) collected from fellow students, we generated abstract, evolving textures that mimic the drifting, dreamlike nature of "몽롱함."



    p5.js Links
    Experiment3_Ver01

    Experiment3_Ver02



    Generative Visuals & Concept

    The visuals represent the blurred boundary between consciousness and dreams, where clarity fades and surreal logic takes over. 

    Using generative design techniques, the textures continuously shift—mirroring the unpredictable flow of dreams during REM sleep.

    *REM Sleep & Lucid Dreams

    Lucid dreams primarily occur during REM (Rapid Eye Movement) sleep, a sleep stage characterized by rapid eye movements, increased brain activity, and irregular breathing. During this state, heart rate can fluctuate based on dream intensity, further reinforcing the connection between physiological rhythms and dreamlike visuals.









    Visual Variations




    Base Visuals:

    Core representations of body rhythm data

    Long-Exposure Effect:

    Background transparency accumulates over time, creating layered noise and an intensified dreamlike atmosphere.
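
    The long-exposure effect itself is a small p5.js trick: instead of clearing the canvas each frame, a low-alpha rectangle is drawn over the previous one, so every shape leaves a slowly fading trail. A minimal sketch, where all constants are illustrative guesses rather than the values used in the final piece:

    // Long-exposure sketch: a translucent background accumulates trails.
    function setup() {
      createCanvas(600, 600);
      background(0);
      noStroke();
    }

    function draw() {
      // The lower the alpha, the longer the trails linger.
      fill(0, 12);
      rect(0, 0, width, height);

      // A noise-driven form standing in for the rhythm-driven texture.
      let t = frameCount * 0.01;
      let x = width * noise(t);
      let y = height * noise(t + 100);
      let d = 40 + 30 * sin(t * 4); // a slow "breathing" pulse
      fill(220, 200, 255, 60);
      ellipse(x, y, d);
    }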












    Reflections

    This project challenged me to balance technical precision with artistic abstraction, especially in mapping body rhythms to dynamic visuals. 

    After multiple iterations, we refined a process that effectively conveys the dazed, fluid quality of dreams—a rewarding outcome that brought our concept to life through data-driven design.









    Experiment 5
    Voice, Space, and Dream Control



    In this experiment, we explored how voice commands and body rhythms could manipulate digital environments, simulating the surreal experience of lucid dreaming. Inspired by films like Inception and Ant-Man, we used speech recognition to alter scanned 3D spaces, dynamically distorting familiar environments like studios and bedrooms. 


    Tools: Scaniverse, TouchDesigner, p5.js


    p5.js Link

    Experiment5_Audio_Recognition
    Experience It Now!






    I was mainly responsible for implementing the speech recognition system in p5.js and ensuring seamless video playback and dynamic interactions between voice commands and visual transformations. To achieve this, I adapted and refined an initial speech recognition script provided by my supervisor, Andreas, modifying it to better integrate with our project’s requirements.
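
    The core of that script can be sketched as follows, using the p5.speech library's SpeechRec object; the clip filenames are hypothetical stand-ins for the TouchDesigner renders we actually played:

    // Sketch of the interaction loop (requires the p5.speech library).
    let videos = {};
    let current;
    let rec;

    function setup() {
      createCanvas(960, 540);
      // Pre-load one clip per totem word so switching feels seamless.
      for (let word of ['scatter', 'swell', 'split', 'distort', 'surround']) {
        let v = createVideo(word + '.mp4'); // hypothetical filenames
        v.hide();
        videos[word] = v;
      }
      rec = new p5.SpeechRec('en-US', gotSpeech);
      rec.continuous = true;      // keep listening after each result
      rec.interimResults = false; // act only on final results
      rec.start();
    }

    function gotSpeech() {
      if (!rec.resultValue) return;
      let said = rec.resultString.toLowerCase();
      // If a totem word appears anywhere in the phrase, switch clips.
      for (let word in videos) {
        if (said.includes(word)) {
          if (current) current.stop();
          current = videos[word];
          current.loop();
        }
      }
    }

    function draw() {
      background(0);
      if (current) image(current, 0, 0, width, height);
    }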




    Interactive Dreamscapes: 
    Sound, Control, and Visual Transformation


    By integrating subtle body rhythms—pulse, breath, and blink—we captured the subconscious movements that occur in dreams, driving immersive, surreal transformations. 

    This experiment highlighted the deep connection between sound, control, and visual manipulation, offering insight into how abstract data can shape digital landscapes in real time.






    Altering Familiar Environments 
    / Speech Recognition


    We scanned real-world spaces such as bedrooms and design studios using Scaniverse and Polycam, then processed them in TouchDesigner to introduce dreamlike distortions based on voice commands:





    / Bedroom


    Room
    Scatter
    Swell




    / Studio


    Split
    Distort
    Surround

    Studio
    Noise




    Each transformation was designed to reflect the fluid, shifting nature of lucid dreams, where control and perception blend seamlessly.

    • Scatter – Fragments the space, dispersing elements into floating clusters.
    • Swell – Expands the environment as if it’s breathing.
    • Split – Divides the room into multiple shifting sections.
    • Distort – Warps structures, twisting them into surreal forms.
    • Surround – Encloses the viewer, enhancing immersion.
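
    The distortions themselves were built on the scanned meshes in TouchDesigner, but the underlying geometry operations are simple to illustrate. A rough p5.js sketch of Scatter and Swell acting on a point grid standing in for the scanned room:

    // Illustration only: the real distortions run on scanned meshes in
    // TouchDesigner. Press 's' to toggle between Swell and Scatter.
    let pts = [];
    let mode = 'swell';

    function setup() {
      createCanvas(600, 600, WEBGL);
      // A flat grid of points stands in for the room geometry; each point
      // also gets a random offset used by the Scatter mode.
      for (let x = -200; x <= 200; x += 20) {
        for (let y = -200; y <= 200; y += 20) {
          pts.push({ x, y, ox: random(-80, 80), oy: random(-80, 80), oz: random(-80, 80) });
        }
      }
      stroke(255);
      strokeWeight(3);
    }

    function draw() {
      background(0);
      rotateX(PI / 3);
      let t = frameCount * 0.02;
      for (let p of pts) {
        if (mode === 'swell') {
          // Swell: the whole space scales in and out, as if breathing.
          let s = 1 + 0.15 * sin(t);
          point(p.x * s, p.y * s, 30 * sin(t));
        } else {
          // Scatter: points disperse toward their random offsets and back,
          // fragmenting the space into floating clusters.
          let amt = 0.5 + 0.5 * sin(t);
          point(p.x + p.ox * amt, p.y + p.oy * amt, p.oz * amt);
        }
      }
    }

    function keyPressed() {
      if (key === 's') mode = mode === 'swell' ? 'scatter' : 'swell';
    }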








    Visual Sound Score

    Expressing data as a narrative


    I designed a Totem Phrases Chart to represent the corresponding actions triggered by voice commands. This score maps the totem words to their visual changes and serves as a guide for interaction.
































    © 2025 Choi Yerin. ALL RIGHTS RESERVED.