Surfing Photons
Surfing Photons is an interactive installation designed and built in TouchDesigner. It connects viewers to a visualization of light through a motion-tracking interface, recorded and deployed via a Python module.
Conceptual Basis
I have long been fascinated by water and its metaphoric relationship to much of life (at least mine). Recently, I have begun thinking about light in much the same way. As the arbiter of perception, light modulates everything in our conscious world. It allows us to see up close (microscopes), far away (stars), and enjoy the breathtaking beauty of the world at normal zoom as well.
Depending on your point of view, light can be measured like a wave, or like a particle. A particle representation of a unit of light is known as a photon. This duality is a function of the observer and their limitations, or constraints.
Surfing Photons adopts a particle representation of light distorted by waves of a different type: a shallow water simulation, like the ocean surface (or the top of a pool). These waves refract rays of light, distorting them and creating caustic patterns on objects beneath the surface.
This concept inspired Surfing Photons, connecting two of my favorite metaphors for life: light and water. Try to get as much of both as you can.
Project Description

Role: Designer and Developer
Technology: TouchDesigner, Arduino, Kinect, Point Cloud
A (Brief) Description of Caustics
The caustic simulation takes in a surface (e.g. the water in a pool) and a map of incoming light direction. A simulated photon of incoming light is refracted by the surface, changing its direction of travel (represented as a ray). Photons are visualized via instanced point geometry, with each instance position calculated by intersecting the refracted ray with a bounding object (e.g. the bottom of the pool). In this project, the bounding object is either a plane or a hemisphere.
I explain the technical concepts and implementation in TouchDesigner here.
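For readers who want the underlying math, here is a minimal NumPy sketch of the per-photon calculation described above: Snell's-law refraction of an incoming ray at the water surface, followed by intersection with a flat "pool floor". Function names, the refractive index, and the specific vectors are illustrative only; the installation itself performs this work inside TouchDesigner for many photons at once.

```python
import numpy as np

def refract(incident, normal, eta):
    # Snell's law in vector form; eta = n1 / n2 (about 1.0 / 1.33 for air -> water).
    # `incident` points toward the surface, `normal` points back toward the light.
    cos_i = -np.dot(normal, incident)
    sin2_t = eta**2 * (1.0 - cos_i**2)
    if sin2_t > 1.0:
        return None  # total internal reflection: no transmitted ray
    cos_t = np.sqrt(1.0 - sin2_t)
    return eta * incident + (eta * cos_i - cos_t) * normal

def intersect_plane(origin, direction, plane_point, plane_normal):
    # Where does the refracted ray hit the "pool floor"?
    denom = np.dot(direction, plane_normal)
    if abs(denom) < 1e-8:
        return None  # ray parallel to the plane
    t = np.dot(plane_point - origin, plane_normal) / denom
    return origin + t * direction if t > 0 else None

# One simulated photon: straight-down light hits a slightly tilted patch of surface.
surface_point = np.array([0.0, 0.0, 0.0])
surface_normal = np.array([0.1, 1.0, 0.0])
surface_normal = surface_normal / np.linalg.norm(surface_normal)
light_dir = np.array([0.0, -1.0, 0.0])

bent = refract(light_dir, surface_normal, 1.0 / 1.33)
if bent is not None:
    floor_hit = intersect_plane(surface_point, bent,
                                np.array([0.0, -2.0, 0.0]),  # floor 2 units down
                                np.array([0.0, 1.0, 0.0]))
    print(floor_hit)  # this point becomes the instance position for one photon
```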
Project States
With this setup, many visual results emerge from the selection of a few surface/light maps and the choice of bounding object. Three parametrized vector fields were defined for use as maps in this project (each with independent parameter controls):
- Fluid simulation driven by optical flow
- Customized Simplex noise field
- Customized Perlin noise field
The "state" relevant in this context is which input each map is assigned to; the project state at any given point in time can therefore be encoded as an integer where each digit represents the map selected for a given input. For example, a state of 102 would mean:
- Map 1 -> Input 0
- Map 0 -> Input 1
- Map 2 -> Input 2
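A minimal Python sketch of this encoding, assuming for illustration that map indices follow the list above (0 = fluid, 1 = Simplex, 2 = Perlin) and input indices follow surface, light direction, distortion; both orderings and names are assumptions, not taken from the project:

```python
# Assumed orderings for illustration only.
MAPS   = ["fluid", "simplex", "perlin"]                # maps 0, 1, 2
INPUTS = ["surface", "light_direction", "distortion"]  # inputs 0, 1, 2

def decode_state(state: int) -> dict:
    # Digit position = input index, digit value = map index (e.g. 102 -> Map 1 on Input 0).
    digits = str(state).zfill(len(INPUTS))
    return {INPUTS[i]: MAPS[int(d)] for i, d in enumerate(digits)}

print(decode_state(102))
# {'surface': 'simplex', 'light_direction': 'fluid', 'distortion': 'perlin'}
```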
As shown in the diagram above, these maps are interchangeably used as the surface map, light direction map, or an optional distortion map (applied after refraction), depending on the project's current state. The resulting visuals vary significantly across permutations (order matters), creating a cohesive, rich environment that invites exploration.
Project Inputs
Users explore (interact) in this project via a Kinect sensor and an Arduino:
- The Kinect depth map output is used to isolate the viewer and provide an input to the optical flow algorithm driving the fluid simulation (allowing the viewer's actions to affect the simulated surface as if they were in water).
- Kinect skeleton tracking maps user gestures to parameters affecting one of the three input vector fields (e.g. raising the right hand corresponds to increasing the amplitude of the Perlin noise field being used as the light direction map).
- The Arduino monitors heart rate information used for subtle, ambient dynamics.
Users can thus move their bodies to tweak the parameters of whichever map is currently in use as the surface map, light direction map, or distortion map. While this creates a dense space of possible interactions, the setup is quite chaotic and required some refinement to make the interactions legible.
Designing Interaction
To provide some structure and support an iterative experience design workflow, an abstract Interaction construct was created, associating an input CHOP channel (e.g. the position of a user's hand) on the range [a,b] with a target Parameter on the range [c,d]. A custom Python extension saves Interactions in a Table DAT and provides functionality to easily "activate" any set of Interactions during a performance.
This establishes a one-to-many relationship where a single input CHOP channel can be used to control any number of parameters at any depth in the project, each mapped to a unique range.
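A rough sketch of what an Interaction boils down to, with hypothetical channel names and parameter paths; the actual project persists these rows in a Table DAT and reads live CHOP values through the custom extension inside TouchDesigner:

```python
from dataclasses import dataclass

@dataclass
class Interaction:
    # One row of the table: input channel on [a, b] -> target parameter on [c, d].
    channel: str      # e.g. 'skeleton/hand_r:ty' (hypothetical channel name)
    in_low: float     # a
    in_high: float    # b
    target: str       # e.g. 'perlin1/Amplitude' (hypothetical parameter path)
    out_low: float    # c
    out_high: float   # d

    def remap(self, value: float) -> float:
        # Linearly remap the live channel value from [a, b] onto [c, d], clamped.
        span = self.in_high - self.in_low
        t = 0.0 if span == 0 else (value - self.in_low) / span
        t = min(max(t, 0.0), 1.0)
        return self.out_low + t * (self.out_high - self.out_low)

# One-to-many: the same channel drives two different parameters over different ranges.
interactions = [
    Interaction('skeleton/hand_r:ty', 0.0, 1.8, 'perlin1/Amplitude', 0.0, 2.0),
    Interaction('skeleton/hand_r:ty', 0.0, 1.8, 'post1/Bloom',       0.2, 0.6),
]
for ix in interactions:
    print(ix.target, ix.remap(0.9))  # hand halfway up -> each target at mid-range
```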
All interactions in this project have state-specific targets, e.g. the same gesture ("raise left hand") will automatically target different maps / parameters depending on project state. A given interaction must be defined relative to its target, e.g. we may want raising the right hand to increase Gain if the Perlin noise field is used as the surface map, but to decrease Amplitude if the same noise field is used as the light direction map.
An interaction's impact should be state-dependent.
As a corollary, each state contains a preset with default parameter values and a state-specific array of interactions ("When Map 1 -> Input 0, raising the right hand -> increase Map 1 amplitude") that together define the state's visual signature.
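A sketch of how such per-state presets and interaction sets might be organized; the state number, parameter paths, channel names, and ranges are hypothetical, and in the project this data presumably lives in DATs and is applied through the extension described above:

```python
# Hypothetical per-state configuration: default parameter values plus the
# interaction rows (channel, input range, target, output range) for that state.
STATES = {
    102: {
        "preset": {"simplex1/Amplitude": 0.8, "fluid1/Dissipation": 0.97},
        "interactions": [
            ("skeleton/hand_r:ty", (0.0, 1.8), "simplex1/Amplitude", (0.0, 2.0)),
            ("skeleton/head:tx", (-1.0, 1.0), "light1/Directionx", (-0.5, 0.5)),
        ],
    },
}

def activate_state(state, set_parameter):
    # Apply the state's defaults, then hand its interaction set to the input loop.
    # `set_parameter(path, value)` stands in for writing a TouchDesigner parameter.
    config = STATES[state]
    for path, value in config["preset"].items():
        set_parameter(path, value)
    return config["interactions"]

active = activate_state(102, lambda path, value: print(path, "<-", value))
```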
Finally, state-agnostic parameters are applied to refine the ultimate output, such as:
- Simulation bounding object
- Color palette
- Background palette
- Post processing

Interaction Design
In order to foster user learning and subsequent exploration, maximizing interaction legibility is essential. Legibility comes from:
- Understanding the trigger of the interaction
- Visually identifying the target or impact of the interaction
- Intuitively understanding how to control the interaction trigger
Understanding the trigger or source of an interaction can be difficult. To reduce confusion and promote learning, this project uses a small set of user gestures as the source channel in all project interactions. These gestures are:
- Hand (R/L) moving horizontally
- Hand (R/L) moving vertically
- Body moving horizontally (left to right), measured from the user's head
- Body moving forward/backward, measured from the user's head
By keeping the triggers of interaction constant, the user builds context while viewing the work and intuitively understands how to begin interacting with any new project state they are exposed to. Introducing slight variation (or removing one or two triggers from some states) keeps things interesting while remaining consistent enough for users to make directionally correct inferences about how to interact, leading to exploration through parameter space.
Identifying the impact of an interaction is made easier by proper transformations and filtering: making sure the interaction traverses the full range of the target parameter, and does so smoothly.
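As one example of such filtering, a one-pole low-pass keeps a remapped value from snapping when the tracked input jumps. In TouchDesigner this kind of smoothing is typically done with a Filter or Lag CHOP (which the project uses is not specified here); a plain-Python equivalent looks like this:

```python
class Smoother:
    # One-pole low-pass: eases the remapped value toward its target each frame,
    # so the driven parameter sweeps its range smoothly instead of snapping.
    def __init__(self, alpha=0.15, initial=0.0):
        self.alpha = alpha      # closer to 0 = heavier smoothing
        self.value = initial

    def update(self, target):
        self.value += self.alpha * (target - self.value)
        return self.value

smoother = Smoother(alpha=0.15)
for raw in [0.0, 1.0, 1.0, 1.0, 1.0]:        # a hand jumping to full extension
    print(round(smoother.update(raw), 3))    # eases upward: 0.0, 0.15, ~0.28, ~0.39, ~0.48
```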