
Meta, Reality Labs
Timeline
Fall 2022
Tools
RunwayML, Unity, Blender, HoloLens, Arduino, C#, JS
Team
Meta Reality Labs
Context
Swimming in the light
This mixed reality prototype explores light qualities as environmental input: how might objects in mixed reality take on the behaviors and responses of daily circadian rhythms?



Brief
Everything as input
Computer vision systems are actively transforming our visual field into new forms of machine sensing and control, turning everything within their field of view into an input.
As environments are increasingly observed by autonomous cameras, there is an emerging algorithmic point of view to interact with.
How can we build and influence a technological value system as it expands into the social sphere?

Outcome
Light as input
Informed by ambient environmental signals, I turned to light signatures as inputs that influence object interactions in mixed reality. Research shows that when lighting conditions in mixed reality degrade, virtual objects are known to “swim”.
Expanding this boundary of expected and emergent interactions, I created a library of swimming primitive behaviors.
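As a rough illustration of the idea, the sketch below shows one way a swimming primitive could be driven by a light-degradation signal in Unity. The class and parameter names (SwimPrimitive, lightDegradation, maxDriftMeters) are hypothetical, not the project's actual code.

```csharp
// Hypothetical sketch: a "swim" primitive whose drift amplitude scales with a
// 0-1 light-degradation signal. Names and values are illustrative only.
using UnityEngine;

public class SwimPrimitive : MonoBehaviour
{
    [Range(0f, 1f)] public float lightDegradation = 0f; // 0 = stable light, 1 = fully degraded
    public float maxDriftMeters = 0.05f;                 // drift amplitude at full degradation
    public float driftSpeed = 0.7f;                      // how quickly the object wanders

    Vector3 anchor;

    void Start()
    {
        anchor = transform.position;
    }

    void Update()
    {
        // Perlin noise gives a smooth, organic wander rather than jitter.
        float t = Time.time * driftSpeed;
        Vector3 offset = new Vector3(
            Mathf.PerlinNoise(t, 0.0f) - 0.5f,
            Mathf.PerlinNoise(t, 7.3f) - 0.5f,
            Mathf.PerlinNoise(t, 13.1f) - 0.5f);

        // Scale the wander by how degraded the lighting signal is.
        transform.position = anchor + offset * (maxDriftMeters * 2f * lightDegradation);
    }
}
```

Smoothed noise keeps the drift reading as “swimming” rather than glitching, which is the quality of the emergent behavior the library builds on.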

Swimming in the light
Implementing a light meter in Unity, I simulated various lighting conditions and their light signatures of color temperature (K) and illuminance (lux).
Then I built a library of swimming primitives that respond to these conditions.
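A minimal sketch of what such a light meter could look like, assuming a single simulated Unity Light with color temperature enabled; the mapping from intensity to lux (peakLux) is an illustrative assumption, not a calibrated measurement.

```csharp
// Hypothetical sketch: sampling a simulated light signature in Unity.
// Assumes the scene light has "Use Color Temperature" enabled.
using UnityEngine;

public class LightMeter : MonoBehaviour
{
    public Light sceneLight;        // the simulated environment light
    public float peakLux = 10000f;  // assumed illuminance when intensity = 1

    public float TemperatureK => sceneLight.colorTemperature;
    public float IlluminanceLux => sceneLight.intensity * peakLux;

    void Update()
    {
        // Report the current light signature so swimming primitives can react to it.
        Debug.Log($"Light signature: {TemperatureK:F0} K, {IlluminanceLux:F0} lux");
    }
}
```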





RunwayML Process
Contextual computation in RunwayML with crowd dynamics
To understand environmental computation, I explored algorithmic context recognition on a public street feed.
Using negative space formation amongst throngs of tourists, I trained an ML model in RunwayML to recognize when a picture was being taken, based only on crowd spacing.
This environmental signal, which points to the action of a camera being raised, showed me that ambient signals can be traced, interpreted, and computed.
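The classification itself happened in the trained RunwayML model, so none of that code appears here; the toy heuristic below only illustrates how “negative space” in crowd spacing could be read as a signal. All names and thresholds are hypothetical.

```csharp
// Toy heuristic only, not the trained model: flag a possible "photo being
// taken" formation when the crowd leaves an unusually wide empty corridor.
using System.Linq;

static class PhotoFormationHeuristic
{
    // positionsX: horizontal positions (meters) of detected people in the frame.
    public static bool LooksLikePhotoBeingTaken(float[] positionsX, float gapThreshold = 2.5f)
    {
        if (positionsX.Length < 3) return false;

        float[] sorted = positionsX.OrderBy(x => x).ToArray();

        // Largest empty interval between neighboring people: the negative space.
        float largestGap = 0f;
        for (int i = 1; i < sorted.Length; i++)
        {
            largestGap = Mathf.Max(largestGap, sorted[i] - sorted[i - 1]);
        }

        // By construction there is at least one person on each side of the gap
        // (a photographer on one side, a subject on the other).
        return largestGap > gapThreshold;
    }
}
```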



