Location-based Light Painting

Mapping Geotagged Photos in Public Spaces

For any city, thousands of geotagged photos are available online. This project maps these photos back onto the places where they were taken.

A custom-built camera flash and smartphone setup queries the Flickr and Panoramio APIs for photos taken at the current geographic position. Whenever a photo is available, the flash is triggered.

Long-exposure photography captures the individual flashes, each representing one geotagged photo, and situates them in the place of their origin.

Each light dot represents a geotagged photo taken at that location.

After a few test shots, a new question arose: what would it look like if all the photographers who uploaded their pictures had been there at the same time?

By pointing the flash at a second person instead of the ground, I was able to place photographers' ghosts where they must have stood when shooting their picture.

Process

The number of existing photos is extraordinary. According to Wikipedia, Google's Panoramio alone archives more than 65,000,000 geotagged images. Popular landmarks are so thoroughly documented by photographers that Photosynth can create 3D models of these places from pictures alone.

As a first step, I wanted to visualize all the photos I could get for a given place in a plain map view.
The visualizations are generated by querying the Flickr and Panoramio APIs for a geographic area and placing the returned photos (or rather their geotags) on a map.
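As a rough illustration, such a query could look like the sketch below. It assumes the public Flickr REST API (flickr.photos.search with a bounding box) and a placeholder API key; Panoramio offered a comparable bounding-box endpoint before it was shut down. The function and type names are illustrative, not taken from the project's code.

    // Sketch: fetch geotagged photos for a rectangular area via the Flickr API.
    // The API key is a placeholder; names are illustrative only.

    interface GeotaggedPhoto {
      id: string;
      latitude: number;
      longitude: number;
    }

    async function photosInArea(
      apiKey: string,
      minLon: number, minLat: number,
      maxLon: number, maxLat: number
    ): Promise<GeotaggedPhoto[]> {
      const params = new URLSearchParams({
        method: "flickr.photos.search",
        api_key: apiKey,
        bbox: `${minLon},${minLat},${maxLon},${maxLat}`,
        has_geo: "1",
        extras: "geo",                 // ask Flickr to include latitude/longitude
        min_taken_date: "1990-01-01",  // geo queries require a limiting parameter
        per_page: "250",
        format: "json",
        nojsoncallback: "1",
      });
      const res = await fetch(`https://api.flickr.com/services/rest/?${params}`);
      const data = await res.json();
      return data.photos.photo.map((p: any) => ({
        id: p.id,
        latitude: parseFloat(p.latitude),
        longitude: parseFloat(p.longitude),
      }));
    }

Each coordinate pair returned this way becomes one marker on the map view.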

Map view of New York City: more than 50,000 geotagged photos are available for the selected rectangular area. Pictures accumulate around popular sites (not all pictures shown). I also made a version that only listed photos within sight, as well as a couple of mobile versions to explore the data on the go.

My visualizations only take into account pictures that have a geotag. Since most cameras are not yet equipped with a GPS sensor, I am working with just a fraction of the existing material.

Painting Data with Light

I was intrigued by the idea of visualizing photography-related data using photography itself. Inspired by Immaterials, I decided to use light painting as a data visualization method.

Light painting device: iPhone + web app, camera flash with custom-built trigger circuit.

I wrote an iPhone web application that retrieves my current geolocation and queries the web for pictures taken at that position. The app plays a short sound impulse whenever there is a photo at that location.
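The core loop of such a web app could look roughly like the following sketch, which uses the standard Geolocation API and the Flickr flickr.photos.search endpoint with a small search radius. The API key, function names, and the exact radius are assumptions for illustration; the project's actual code is on GitHub.

    // Sketch: watch the device position and check whether any geotagged photo
    // exists near the current coordinates. Names and parameters are
    // illustrative, not taken from the project's repository.

    const API_KEY = "YOUR_FLICKR_API_KEY";  // placeholder

    async function photoExistsAt(lat: number, lon: number): Promise<boolean> {
      const params = new URLSearchParams({
        method: "flickr.photos.search",
        api_key: API_KEY,
        lat: lat.toString(),
        lon: lon.toString(),
        radius: "0.01",                // search radius in kilometres (~10 m)
        has_geo: "1",
        min_taken_date: "1990-01-01",  // geo queries require a limiting parameter
        format: "json",
        nojsoncallback: "1",
      });
      const res = await fetch(`https://api.flickr.com/services/rest/?${params}`);
      const data = await res.json();
      return Number(data.photos.total) > 0;
    }

    // Re-check on every position update and run a callback on a hit.
    function watchForPhotos(onPhotoFound: () => void): number {
      return navigator.geolocation.watchPosition(async (pos) => {
        const { latitude, longitude } = pos.coords;
        if (await photoExistsAt(latitude, longitude)) {
          onPhotoFound();  // in the real app: play the short audio impulse
        }
      }, undefined, { enableHighAccuracy: true });
    }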

To trigger a flash from my smartphone, I soldered a circuit board built around an ATtiny85 microcontroller. The board is attached to the iPhone's headphone jack and listens for audio impulses sent from the web application.
When a signal is detected, the microcontroller triggers the camera flash via its sync cable.
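On the web-app side, the impulse can be generated as a short, loud tone burst on the headphone output, which the board only has to register as a signal crossing its detection threshold. Below is a sketch of that sending side using the Web Audio API; the frequency, duration, and waveform are guesses rather than the project's actual values, and older iOS Safari required the webkit-prefixed AudioContext plus a user gesture before audio could play.

    // Sketch: generate the audio impulse that the trigger board listens for.
    // Frequency, duration, and waveform are assumptions; the real values
    // depend on the detection circuit.

    const audioCtx = new AudioContext();  // older iOS Safari: webkitAudioContext

    function playImpulse(durationMs = 50, frequencyHz = 1000): void {
      const osc = audioCtx.createOscillator();
      const gain = audioCtx.createGain();

      osc.type = "square";         // sharp edges make the pulse easy to detect
      osc.frequency.value = frequencyHz;
      gain.gain.value = 1.0;       // full volume so the pulse clears the threshold

      osc.connect(gain);
      gain.connect(audioCtx.destination);

      osc.start();
      osc.stop(audioCtx.currentTime + durationMs / 1000);
    }

    // Wiring it to the polling loop from the previous sketch:
    // watchForPhotos(playImpulse);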

The whole setup is open source. More information, all code, and the circuit design are available on GitHub.

Process

To map the photos onto their places of origin, I made long-exposure photographs at night while systematically walking around each location with my light painting device. The results are night shots with light-painted data in them.

If you look closely, you can see me being lit by a flash.

Due to GPS inaccuracies, limited exposure times, and interfering light sources (street lights, cars), many of the final pictures were stitched together from multiple exposures. Still, I kept post-production to a minimum in order to keep the shots looking natural.

Conclusion

For me, the interesting bit about location-based data is how it helps us understand the way we use and interact with our surroundings. All too often, this data is taken out of its original context and visualized in bar charts and line graphs.
Using just light on a sensor, Location-Based Light Painting blends together imagery of the world around us with the invisible data that describe it.

This project won the Nachwuchspreis Neue Medien 2014 award and was featured on WIRED, TIME, PetaPixel, The Dish, Hyperallergic, Vantage, and The Creators Project, among others.

Credits

Concept/Design/Development: Philipp Schmitt
Consulting: Benedikt Gross
University: Hochschule für Gestaltung Schwäbisch Gmünd