Forget geo-tagging: researchers at Carnegie Mellon University can tell where in the world a photo was taken, no metadata required.
The new technique, developed by computer science graduate student James Hays and Alexei A. Efros, an assistant professor of computer science and robotics, compares a photograph against millions of GPS-tagged images in Flickr’s massive online library. It doesn’t require “clues” such as signage; instead it operates on the statistical distribution of texture, color and line in the image.
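The researchers' actual features are richer than this, but the core idea of "find GPS-tagged photos that look similar, and borrow their location" can be sketched with a plain color histogram and nearest-neighbor search. This is an illustrative toy, not the CMU code; every function name here is hypothetical:

```python
import numpy as np

def color_histogram(image, bins=8):
    """Flattened 3-D RGB histogram of an (H, W, 3) uint8 image,
    normalized to sum to 1 -- a crude stand-in for the richer
    texture/color/line statistics the CMU system uses."""
    hist, _ = np.histogramdd(
        image.reshape(-1, 3).astype(float),
        bins=(bins, bins, bins),
        range=((0, 256),) * 3,
    )
    hist = hist.ravel()
    return hist / hist.sum()

def estimate_location(query_hist, database, k=5):
    """Find the k GPS-tagged images whose histograms are closest to
    the query and average their coordinates as a location estimate.
    `database` is a list of (histogram, (lat, lon)) pairs."""
    dists = [np.linalg.norm(query_hist - h) for h, _ in database]
    nearest = np.argsort(dists)[:k]
    coords = np.array([database[i][1] for i in nearest])
    return coords.mean(axis=0)
```

With millions of reference photos instead of a toy database, images of the same kind of place (beach, alpine valley, dense city) tend to cluster, which is why overall appearance correlates with geography at all.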
“We’re not asking the computer to tell us what is depicted in the photo but to find other photos that look like it,” Efros said in a press release. “It was surprising to us how effective this approach proved to be. Who would have guessed that similarity in overall image appearance would correlate to geographic proximity so well?”
It’s not a magic bullet: only 16 percent of the photos in a test set were located correctly. That’s still far better than chance, however, and even the failures often landed close enough to be useful. From the project page at CMU:
We quantitatively evaluate our approach in several geolocation tasks and demonstrate encouraging performance (up to 30 times better than chance). We show that geolocation estimates can provide the basis for numerous other image understanding tasks such as population density estimation, land cover estimation or urban/rural classification.
Hays will present the work at June’s IEEE Computer Society Conference on Computer Vision and Pattern Recognition in Anchorage, Alaska. Here’s a direct link to their paper. The link below also includes the test set of photos, and the source code.
Project Page [CMU]