This morning's roundup of this month's Wired missed a fantastic article by David Weinberger on the coming infocalyptic disaster when we all have a squillion photos with no metadata.
The metadata most of us attach to our photos is pretty pathetic. We can name them when we transfer them to a computer, but most people don't bother and end up with a hard disk full of photos with names like DSC00012.jpg and DSC00234.jpg. As the years go on, DSC00234.jpg will become an archaeological artifact that might as well be labeled Don't_Know_Don't_Care.jpg. If we're to have any hope of preserving our memories, we'll need to be more clever than that. Much more clever.
What do you do if you're too lazy – or overburdened or preoccupied – to tag your photos? Let a machine do it. Digital cameras already capture critical data points at the moment the shutter clicks. Most models record – in the image file itself – not only the date and time a photo was taken but also the focal length, the aperture setting, and whether the flash fired. These tidbits can provide clues about whether the photo was taken indoors or out, during the day or at night, focusing on something close up or far away. Scanty metadata, but potentially helpful.
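To make the idea concrete, here's a rough sketch of how software could turn those camera-recorded tidbits into guesses about a photo. The field names and thresholds below are my own illustrative assumptions, not a real camera API; actual cameras store this data in the EXIF block of the image file (tags like DateTimeOriginal, Flash, and FocalLength).

```python
# Illustrative sketch: infer context clues from EXIF-style fields.
# Field names and cutoffs are assumptions for demonstration only.

def describe_shot(exif):
    """Guess shooting context from a few EXIF-style data points."""
    clues = []

    hour = exif.get("hour")  # would come from EXIF DateTimeOriginal
    if hour is not None:
        clues.append("daytime" if 7 <= hour < 19 else "nighttime")

    flash = exif.get("flash_fired")  # EXIF Flash tag
    if flash is not None:
        clues.append("probably indoors" if flash else "probably outdoors")

    focal = exif.get("focal_length_mm")  # EXIF FocalLength tag
    if focal is not None:
        clues.append("telephoto, likely a distant subject" if focal >= 85
                     else "wide or normal field of view")

    return clues

print(describe_shot({"hour": 21, "flash_fired": True, "focal_length_mm": 120}))
# prints ['nighttime', 'probably indoors', 'telephoto, likely a distant subject']
```

Scanty inferences from scanty metadata, but it shows how even a few automatic data points could be mined without anyone typing a single tag.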
But why limit the possibilities to what today's cameras can do? The image file format most cameras use includes fields for longitude and latitude, in anticipation of the day when global positioning systems are built in. That day could be soon. Cell phones already gather some positioning information, and by the end of 2005 all new cell phones in the US will be locatable to within 500 feet or so. Establish a Bluetooth wireless connection between phone and camera and the camera will know where it is. Web sites already exist that use GPS data to let you upload photos pegged to spots on maps, and a Stanford research project compares photos with shots of known locations, automatically annotating snaps with information about where they were taken.
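Those longitude and latitude fields are real: the EXIF format's GPS section stores each coordinate as degrees, minutes, and seconds expressed as rational numbers, plus a hemisphere reference ("N"/"S" or "E"/"W"). Here's a small sketch of the conversion a GPS-aware camera or phone would perform; the function name is my own.

```python
# Sketch: convert a decimal coordinate into the shape the EXIF GPS
# fields expect -- (degrees, minutes, seconds) as (numerator,
# denominator) rational pairs, plus a hemisphere reference letter.

def to_exif_gps(decimal_deg, is_latitude=True):
    """Return (ref, ((deg,1),(min,1),(sec*100,100))) for an EXIF GPS tag."""
    if is_latitude:
        ref = "N" if decimal_deg >= 0 else "S"
    else:
        ref = "E" if decimal_deg >= 0 else "W"

    value = abs(decimal_deg)
    degrees = int(value)
    minutes_full = (value - degrees) * 60
    minutes = int(minutes_full)
    # Keep seconds as a rational pair to two decimal places,
    # matching EXIF's RATIONAL data type.
    seconds_numerator = round((minutes_full - minutes) * 60 * 100)
    return ref, ((degrees, 1), (minutes, 1), (seconds_numerator, 100))

ref, dms = to_exif_gps(37.4275)  # roughly Stanford's latitude
print(ref, dms)
# prints N ((37, 1), (25, 1), (3900, 100))  -- i.e. 37 deg 25' 39" N
```

Once a phone or camera can fill in these two fields at the moment the shutter clicks, the map-pegging and location-matching services the article describes have something solid to work with.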