"Light Leaks" by Kyle McDonald and Jonas Jongejan is an immersive installation built from a pile of mirror balls and a few projectors, created originally for CLICK Festival 2012 in Elsinore, Denmark.
Technical notes:
- First version of the installation
- Running on Mac Mini
- 1024x768 (native resolution) with TripleHead2Go
Technical notes:
- iMac
- 1280x1024 with TH2G on a projectiondesign F32 sx+ (native 1400x1050), inset on the sensor
- When calibrating, create a network from the calibration computer that shares its Ethernet connection (and therefore provides DHCP).
- BlackMagic grabber and SDI camera for interaction
Technical notes:
- Mac Pro (the bin)
- 3x Mitsubishi UD8350U 6,500 Lumens, 1920x1200
- Two projectors running through the TripleHead2Go, the third directly from the Mac Pro
- One monitor hidden in the back closet
Technical notes: 20x15x4.4m room with 42 mirror balls on floor and 4 projectors at 3.25m high, 7.2m away from center of mirror balls.
Temporarily used a laptop and Mac Pro together for calibration, leading to the "Primary" and "Secondary" ProCamSample instances. This solution doesn't really work because lost OSC messages to the Secondary machine cause irrecoverable problems.
- Mac Pro with 4x Apple Mini DisplayPort to Dual-Link Display Adapter, with USB Extension
- 4x Christie D12HD-H 1DLP projectors at 10,500 ANSI Lumens, all running at 1920x1080.
- Additional monitor for debugging. Display arrangement was set so the monitor was the last screen to the right.
- Focusrite Scarlett 18i8 audio interface.
- 4 projectors connected to a Linux machine running headless
Installation at Lights All Night festival, using 4 x 30K projectors
Outdoor installation in courtyard of Can Framis Museum
Long-term installation at Wonderspace in Scottsdale, Arizona.
Thousand Deep (US, 2021)
The repo is meant to be used with openFrameworks 0.10.0 (a0bd41a75).
- https://github.com/kylemcdonald/ofxCv @ c171afa
- https://github.com/kylemcdonald/ofxControlPanel @ c45e93856ba9bab2a70b7e1f6e44255399b91637
- https://github.com/kylemcdonald/ofxEdsdk @ a40f6e4d85b11585edb98ccfc0d743436980a1f2
- https://github.com/dzlonline/ofxBiquadFilter @ 87dbafce8036c09304ff574401026725c16013d1
- https://github.com/mazbox/ofxWebServer @ 6472ba043075c685977ecca36851d51db1ec4648
- https://github.com/HalfdanJ/ofxGrabCam
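The pinned commits above can be checked out with a short script. A sketch: the `openFrameworks/addons` destination path is an assumption based on the usual openFrameworks layout, and the script prints the git commands rather than running them.

```python
# The addon list and pinned commits from the dependency list above.
# NOTE: the "openFrameworks/addons" destination is an assumption based on
# the usual openFrameworks layout; adjust to your checkout.
ADDONS = [
    ("https://github.com/kylemcdonald/ofxCv", "c171afa"),
    ("https://github.com/kylemcdonald/ofxControlPanel", "c45e93856ba9bab2a70b7e1f6e44255399b91637"),
    ("https://github.com/kylemcdonald/ofxEdsdk", "a40f6e4d85b11585edb98ccfc0d743436980a1f2"),
    ("https://github.com/dzlonline/ofxBiquadFilter", "87dbafce8036c09304ff574401026725c16013d1"),
    ("https://github.com/mazbox/ofxWebServer", "6472ba043075c685977ecca36851d51db1ec4648"),
    ("https://github.com/HalfdanJ/ofxGrabCam", None),  # no pinned commit
]

def pin_commands(addons, dest="openFrameworks/addons"):
    """Build the git commands that clone each addon and check out its pin."""
    cmds = []
    for url, commit in addons:
        name = url.rstrip("/").split("/")[-1]
        cmds.append(["git", "clone", url, f"{dest}/{name}"])
        if commit:
            cmds.append(["git", "-C", f"{dest}/{name}", "checkout", commit])
    return cmds

if __name__ == "__main__":
    # Print instead of executing; swap in subprocess.run(cmd, check=True) to run.
    for cmd in pin_commands(ADDONS):
        print(" ".join(cmd))
```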
At TodaysArt we ported ProCamScan and BuildXyzMap from openFrameworks to Python.
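The core of that port is a per-pixel decode of the structured light captures. A minimal NumPy sketch of the idea, assuming one inverted image pair per Gray-code bit; the function and data layout here are illustrative, not the actual port.

```python
import numpy as np

def decode_graycode(on_images, off_images):
    """Decode paired structured-light captures into a per-camera-pixel
    projector coordinate and a confidence map.
    on_images / off_images: lists of float arrays (H, W), one pair per bit,
    where off_images holds the inverted patterns."""
    bits = []
    confidence = np.zeros_like(on_images[0])
    for on, off in zip(on_images, off_images):
        bits.append(on > off)           # which half of the stripe saw light
        confidence += np.abs(on - off)  # strong on/off contrast = reliable pixel
    confidence /= len(on_images)
    # Gray code -> binary: MSB passes through, each next bit XORs the previous.
    binary = [bits[0]]
    for b in bits[1:]:
        binary.append(np.logical_xor(binary[-1], b))
    value = np.zeros_like(confidence, dtype=np.int64)
    for b in binary:
        value = (value << 1) | b.astype(np.int64)
    return value, confidence
```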
Before doing any calibration, it's essential to measure the room and produce a model.dae
file that includes all the geometry you want to project on. We usually build this file in SketchUp with a laser rangefinder for measurements, then save with "export two-sided faces" enabled, and finally load the model into MeshLab and save it again. MeshLab changes the order of the axes, and saves the geometry in a way that makes it easier to load into OpenFrameworks. (Note: at BLACK we ignored the "export two-sided faces" step and the MeshLab step, and camamok was modified slightly to work for this situation.)
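If you want to skip the MeshLab round trip, the axis reorder can be reproduced on the vertex data directly. A rough sketch assuming a Y-up to Z-up swap; the exact permutation and sign depend on your exporter, so verify against a known model.

```python
import numpy as np

def swap_yz(vertices):
    """Convert Y-up vertices (x, y, z) to Z-up (x, z, -y).
    ASSUMPTION: this particular permutation/sign matches the SketchUp ->
    MeshLab behavior described above; check against your own exporter."""
    v = np.asarray(vertices, dtype=float)
    return v[:, [0, 2, 1]] * np.array([1.0, 1.0, -1.0])
```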
1. Capture multiple structured light calibration patterns using `ProCamSample` with `EdsdkOsc`. Make sure the projector size and OSC hosts match your configuration in `settings.xml`. If the camera has image stabilization, make sure to turn it off.
2. Place the resulting data in a folder called `scan/cameraImages/` in `SharedData/`. Run `ProCamScan`, which will generate `proConfidence.exr` and `proMap.png`.
3. Place your `model.dae` and a `referenceImage.jpg` of the well-lit space in `camamok/bin/data/`. Run `camamok` on your reference image. Hit the 'o' key to generate the normals, then press the `saveXyzMap` button to save them.
4. Place the resulting `xyzMap.exr` and `normalMap.exr` inside `SharedData/scan/`.
5. Run `BuildXyzMap` and drag `SharedData/scan` into the app. This will produce `SharedData/scan/camConfidence.exr` and `SharedData/scan/xyzMap.exr`. Repeat this step for multiple scans, then hit 's' to save the output, which will produce `SharedData/confidenceMap.exr` and `SharedData/xyzMap.exr`.
6. Run `LightLeaks`.
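The structured light patterns captured with ProCamSample are binary Gray codes. For reference, a minimal sketch of generating vertical stripe patterns (and their inverses) for a projector of a given size; this mirrors the idea, not ProCamSample's exact output.

```python
import numpy as np

def graycode_patterns(width, height):
    """Generate vertical Gray-code stripe patterns (and their inverses)
    covering `width` projector columns, as uint8 images."""
    n_bits = int(np.ceil(np.log2(width)))
    cols = np.arange(width)
    gray = cols ^ (cols >> 1)            # binary -> Gray code per column
    patterns = []
    for bit in reversed(range(n_bits)):  # MSB first
        row = ((gray >> bit) & 1).astype(np.uint8) * 255
        img = np.tile(row, (height, 1))
        patterns.append(img)
        patterns.append(255 - img)       # inverted pattern for robust decode
    return patterns
```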
- Run the CalibrationCapture app from a laptop that is on the same network as the computer that is connected to the projectors.
- On the computer connected to the projectors, run the Calibration app.
- Plug the camera into the laptop, and position it in a location where you can see at least N points. The app will tell you whether the image is over or underexposed.
- Transfer the images to the Calibration app; it will decode all the images and report how accurately it could reconstruct the environment.
- In the Calibration app, select control points until the model lines up.
- Start the LightLeaks app.
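The over/underexposure warning mentioned above can be approximated from the pixel histogram. A rough sketch with arbitrary thresholds, not the app's actual logic:

```python
import numpy as np

def exposure_status(gray_img, clip_frac=0.05, dark_frac=0.95):
    """Classify an 8-bit grayscale capture: 'overexposed' if more than
    clip_frac of pixels clip near white, 'underexposed' if more than
    dark_frac of pixels are nearly black, else 'ok'.
    Thresholds are illustrative guesses."""
    img = np.asarray(gray_img, dtype=float) / 255.0
    if (img >= 0.98).mean() > clip_frac:
        return "overexposed"
    if (img <= 0.05).mean() > dark_frac:
        return "underexposed"
    return "ok"
```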
- Auto-calibrate from video taken with an iPhone, using BMC to encode the Gray code signal.
- Auto-create a point mesh from the captured images so a 3D model is not needed and camamok is not required.
- Change ProCamScan into a CLI that is automatically triggered by ProCamSample, giving live feedback on progress (highlighting/hiding points that already have good confidence).
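Assuming BMC in the first item means biphase mark coding, the idea would be to make each Gray-code bit recoverable from level transitions between video frames, which survives unknown polarity and varying brightness. A minimal encode/decode sketch; names and framing are ours, not an implemented feature.

```python
def bmc_encode(bits, start_level=0):
    """Biphase mark coding: every bit cell begins with a level transition,
    and a 1 bit adds a second transition mid-cell. Returns two half-cell
    levels (0/1) per input bit, e.g. for driving brightness per video frame."""
    level = start_level
    out = []
    for b in bits:
        level ^= 1          # transition at every cell boundary (the clock)
        out.append(level)
        if b:
            level ^= 1      # extra mid-cell transition encodes a 1
        out.append(level)
    return out

def bmc_decode(levels):
    """Invert bmc_encode: a transition inside a cell means 1, none means 0.
    Polarity-free, so it tolerates unknown camera gain or inversion."""
    return [int(levels[i] != levels[i + 1]) for i in range(0, len(levels), 2)]
```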