05 experiments across the full spatial pipeline —
capture · reconstruction · synthesis · navigation
scene_06 · MEMORIA
● live
Memory Palace for Music
React Three Fiber · Three.js · Bun → Annotated .ply / .glb
A mix is a temporal fragment — a moment of curation that captures a mood, a context, a way of listening that will never quite repeat. Memoria treats that fragment as something worth preserving spatially: photogrammetric scans become spatial containers, with tracks, images, and notes pinned as positional nodes in 3D space. Closer to an archive than a queue. Closer to a walk than a stream.
Four Boiler Room / Cercle sets decomposed into audio, video frames, video native, and text molecules — 1,788 atoms embedded at 768 dimensions, rendered as an orbitable 3D globe. Query a concept and cosine similarity lights up matches across all 16 molecules. An assemblage engine cuts a ~60s sequence from the highest-matched clips, edited by meaning, not by time.
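The query step above can be sketched in a few lines. This is a minimal, hedged illustration of cosine-similarity ranking over embedded atoms, not the project's actual code; the `Atom` shape, `cosine`, and `rank` names are assumptions for this sketch.

```typescript
// Hypothetical shape for one embedded atom (real project: 768 dims).
type Atom = { id: string; molecule: string; embedding: number[] };

// Cosine similarity between two embedding vectors.
function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

// Rank every atom against a query embedding; the top matches are what
// lights up on the globe and what an assemblage engine would cut from.
function rank(query: number[], atoms: Atom[], topK = 5): Atom[] {
  return atoms
    .map(a => ({ atom: a, score: cosine(query, a.embedding) }))
    .sort((x, y) => y.score - x.score)
    .slice(0, topK)
    .map(x => x.atom);
}
```

Editing "by meaning, not by time" then reduces to sorting clips by this score rather than by timestamp before concatenating them into the ~60s sequence.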
All frames of a video rendered simultaneously as stacked semi-transparent planes — treating time as a navigable spatial axis. Orbit the full timeline as a physical object. Gemini AI search lets you query frames by natural language.
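The time-as-space layout is simple to state precisely: each frame index maps to a Z offset, and a uniform low opacity lets the stack read as a volume. A minimal sketch, with `frameLayout` and its parameters assumed for illustration:

```typescript
// Hypothetical sketch: place N video frames as planes stacked along Z,
// so scrubbing the timeline becomes orbiting a physical object.
function frameLayout(frameCount: number, spacing: number, opacity: number) {
  return Array.from({ length: frameCount }, (_, i) => ({
    frame: i,
    z: -i * spacing, // earlier frames sit deeper along the Z axis
    opacity,         // uniform transparency so overlapping planes stay legible
  }));
}
```

Each entry would then drive one textured, semi-transparent plane mesh at its `z` position.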
Three.js · WebXR · GLSL Vertex Shader → Particle Field
A source video and depth map are displaced by a custom GLSL vertex shader — extruding a dense particle grid along Z per pixel to create an orbitable 3D point cloud. A WebXR companion anchors the same depth mesh to real-world surfaces via plane detection and hand tracking on Quest 3.
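The core of the displacement is one line of math per pixel: sample the depth map and push that vertex along Z. In the project this runs on the GPU inside the GLSL vertex shader; the CPU sketch below shows the same mapping, with `depthToPositions` and its parameter names assumed for illustration.

```typescript
// Hypothetical sketch: build point-cloud positions from a depth map.
// `depth` holds one value in [0, 1] per pixel; `zScale` sets extrusion strength.
function depthToPositions(
  depth: Float32Array,
  width: number,
  height: number,
  zScale: number
): Float32Array {
  const positions = new Float32Array(width * height * 3);
  for (let y = 0; y < height; y++) {
    for (let x = 0; x < width; x++) {
      const i = y * width + x;
      positions[i * 3] = x / (width - 1) - 0.5;      // X centered in [-0.5, 0.5]
      positions[i * 3 + 1] = 0.5 - y / (height - 1); // Y flipped from image space
      positions[i * 3 + 2] = depth[i] * zScale;      // Z extruded per pixel
    }
  }
  return positions;
}
```

The resulting array could feed a `position` buffer attribute on a points geometry; the shader version does the same per-vertex sample without ever leaving the GPU.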
Reference photos of David Chipperfield's brutalist SSENSE flagship reconstructed into a navigable 3D world via World Labs' Marble model. Outputs .spz at 100k–full resolution with collider meshes, explorable in-browser.
A single photograph enters a feedforward neural network and exits as a full 3D Gaussian Splat in under a second. No multi-view capture. No photogrammetry pipeline. Pure inference.