Viewfinder: tool for "Flickrizing" Google Earth

Media artist Michael Naimark and his colleagues have developed a system to "Flickrize" Google Earth. The Viewfinder tool not only places photos at the right geolocation in a 3D model like Google Earth, but also "poses" them at the correct angle. The video demo is fantastic. From the Viewfinder project page:
“Geotagged” photos, geographically indexed on a world map, either manually or via GPS, are an increasingly popular phenomenon. However, current implementations treat maps, and particularly 3D models, in fundamentally different modalities than photographs. The result is that photos tend to hover like playing cards, seemingly suspended over the world, remaining 2D objects in a 3D environment, and negating the transformative experience that we think should occur when combining images and a 3D world.

We can do better. We believe we can craft an experience that is as visceral as Google Earth and as accessible as Flickr by integrating photos into corresponding 3D models (such as Google Earth) so that they appear as perfectly aligned overlays; this could be called “situated,” “dimensionalized” or “seamless” alignment. Using appropriate interactive methods that combine human and machine intelligence, we believe that it will be possible to open up the process to geo-locate any and all photos that correspond to real-world places.
Link to Viewfinder, Link to New York Times article on Viewfinder
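"Posing" a photo amounts to recovering the camera's position and orientation so the image plane lines up with the 3D model. As a rough illustration of the forward problem, here is a minimal pinhole-camera sketch (the function, parameter names, and numbers are illustrative assumptions, not anything from the Viewfinder code):

```python
import math

def project(point, cam_pos, yaw, pitch, focal, cx, cy):
    """Project a 3D world point into pixel coordinates for a pinhole
    camera at cam_pos, oriented by yaw/pitch (radians). Returns None
    if the point is behind the camera."""
    # Translate into camera-centred coordinates.
    x, y, z = (p - c for p, c in zip(point, cam_pos))
    # Undo the camera's yaw (rotation about the vertical axis).
    cyaw, syaw = math.cos(-yaw), math.sin(-yaw)
    x, z = cyaw * x + syaw * z, -syaw * x + cyaw * z
    # Undo the camera's pitch (rotation about the horizontal axis).
    cp, sp = math.cos(-pitch), math.sin(-pitch)
    y, z = cp * y - sp * z, sp * y + cp * z
    if z <= 0:
        return None  # behind the image plane
    # Perspective divide, then shift to the principal point.
    return (cx + focal * x / z, cy - focal * y / z)

# A point 10 m straight ahead of the camera lands on the image centre.
print(project((0, 0, 10), (0, 0, 0), 0.0, 0.0, 1000, 320, 240))
# → (320.0, 240.0)
```

Aligning a photo then runs this in reverse: search for the pose parameters (position, orientation, focal length) under which known landmarks in the 3D model project onto their observed pixel locations in the photograph — which is where the project's mix of "human and machine intelligence" comes in.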


  1. There is no reason to believe we won’t eventually do this for every inch of the surface of the Earth. And then take it live. You’ll be able to walk down the street and have detailed information on every object, man-made or not, you encounter. The boundary between the real and the virtual will blur until they are nearly inseparable from one another.

    Of course, this will only be for the select few. The rest of us will be living in mud huts.

  2. mud huts? Mud Huts!? Gor! Bloody posh are’t we! When I were a wee ‘un we at the Aleph Singularity, all we had were a drain in the middle of the galaxy! Mud huts! A drain it were, all manky with stray neutrinos and scraps of other folk’s dark matter – and bloody grateful we were for it too! Mud huts! I never……..

  3. Reminds me of the virtual artwork from William Gibson’s Spook Country.

    Didn’t like this, though: all it did was put some pictures in a very, very boring context. It drags your focus away from the scene in the picture instead of enhancing it.

  4. This is really fascinating and could be useful too. I work in the mapping department of an electric co-op near Fort Worth, and we constantly use either Google Earth’s street view or Live Search Maps’ (MSN) bird’s-eye view. This would be another useful resource for locating electric lines and the like.

    Live Search Maps website:

  5. I wonder what results this will generate if you go looking for CIA black sites or other politically sensitive information. Intriguing. . .

  6. Since the pictures are sourced from Flickr, it would be fun to subtly photoshop real pictures of scenery to include fantastic elements, and then geo-code them to appear naturally in this app.

    like, a normal picture of a block in manhattan, only look, there’s osama bin laden buying a hot dog!

    i could see a new version of photoshop tennis developing, where artists show off by uploading pictures which modify a section of the virtual environment according to a supposed scenario or theme – essentially creating a photographic alternate universe of a given location.

  7. “We’re definitely impressed by photosynth”

    No shit, kids, they beat you to the punch, totally.

    All that trendy marketing talk and their pre-rendered video isn’t a fraction as cool as the photosynth working demo.

  8. I totally agree about photosynth..

    Do a search for “photosynth tedcom” on YouTube and you’ll find the original vid on photosynth and seadragon or whatever it’s called :p

  9. There’s some Free software that sort of does something a bit like the first half of this for raw images coming down from the MER Mars rovers: Midnight Mars Browser (screenshots on Flickr).

    (No 3D world model though, just multiple images automatically composited into perspective views.) And of course the images come with precise metadata on filters used, alt/az, time of day, etc. I can’t see it being computationally feasible to do this without that metadata.

  10. >”We’re definitely impressed by photosynth”
    >No shit, kids, they beat you to the punch, totally.
    It’s way beyond being “beaten to the punch”: photosynth is a real and advanced technology; this thing is just “me too” hot air.

  11. Well, there is NO way a project bought (licensed, whatever) by MICRO$OFT would ever find its way into a GOOGLE app. So yeah, Viewfinder does have a legit purpose other than being just “me too” copyware.

  12. Yes, the photosynth stuff is clearly “real” technology and isn’t “hot air.” But take a look at the paper and you’ll notice that the methods it uses to pose photographs will actually throw out a large percentage of “normal” photos. This isn’t to say that photosynth isn’t completely awesome, but it’s approaching what amounts to a similar problem in a very different way.

Comments are closed.