03 experiments across the full spatial pipeline —
capture · reconstruction · synthesis · navigation
scene_03 · DEPTHSHIFT
● live
Depth Reprojection Hologram
Three.js · WebXR · GLSL Vertex Shader → Particle Field
A custom GLSL vertex shader samples a source video and its depth map, extruding a dense particle grid along Z per pixel to create an orbitable 3D point cloud. A WebXR companion build anchors the same depth mesh to real-world surfaces via plane detection and hand tracking on Quest 3.
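The per-pixel Z extrusion can be sketched on the CPU in plain JavaScript — a minimal sketch of the displacement math, not the project's actual shader; the grid dimensions and `depthScale` uniform are assumed parameters:

```javascript
// Build a gridW x gridH particle grid and push each vertex along Z by its
// depth sample — the same displacement the GLSL vertex shader performs
// per pixel on the GPU before the positions feed a Three.js point cloud.
function displaceGrid(depth, gridW, gridH, depthScale) {
  const positions = new Float32Array(gridW * gridH * 3);
  let i = 0;
  for (let y = 0; y < gridH; y++) {
    for (let x = 0; x < gridW; x++) {
      // Map grid coordinates into a [-1, 1] plane for X/Y.
      positions[i++] = (x / (gridW - 1)) * 2 - 1;
      positions[i++] = (y / (gridH - 1)) * 2 - 1;
      // Depth sample in [0, 1] drives the Z extrusion.
      positions[i++] = depth[y * gridW + x] * depthScale;
    }
  }
  return positions;
}

// 2x2 grid, depth values 0 and 1, scale 0.5 → Z of 0 or 0.5.
const pos = displaceGrid(Float32Array.from([0, 1, 1, 0]), 2, 2, 0.5);
console.log(pos[2], pos[5]); // → 0 0.5
```

In the real pipeline this runs per vertex on the GPU, with the depth map bound as a texture and sampled in the vertex shader, so the full grid displaces every frame as the video plays.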
Reference photos of David Chipperfield's brutalist SSENSE flagship reconstructed into a navigable 3D world via World Labs' Marble model. Outputs .spz at 100k–full resolution with collider meshes, explorable in-browser.
A single photograph enters a feedforward neural network and exits as a full 3D Gaussian Splat in under a second. No multi-view capture. No photogrammetry pipeline. Pure inference.