Terrapattern
Carnegie Mellon University, 2016
We present the alpha version of Terrapattern: "similar-image search" for satellite photos. It is an open-source tool for discovering "patterns of interest" in unlabeled satellite imagery: a prototype for exploring the unmapped, and the unmappable. The alpha launch features a fully functional prototype that can search high-resolution satellite imagery of seven cities: Pittsburgh, San Francisco, New York City, Berlin, Austin, Detroit, and Miami.
There has never been a more exciting time to observe human activity and understand the patterns of humanity's impact on the world. We aim to help people discover such patterns in satellite imagery, with the help of deep learning machine vision techniques. Technically, the project uses a Deep Convolutional Neural Net (DCNN) to assist with image recognition.
Terrapattern provides an open-ended interface for visual query-by-example: an interface for finding "more like this, please" in satellite photos. Simply click an interesting spot on Terrapattern's map, and it will find other locations that look similar. Our tool is ideal for locating specialized 'nonbuilding structures' and other forms of soft infrastructure that aren't usually indicated on maps.
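The core idea behind this kind of query-by-example can be sketched simply: represent each map tile as a feature vector (e.g., an embedding produced by a DCNN), then answer "more like this, please" with a nearest-neighbor search over those vectors. The snippet below is a minimal illustration of that search step using cosine similarity; it is not Terrapattern's actual pipeline, and the function and variable names are hypothetical.

```python
import numpy as np

def nearest_tiles(query_vec, tile_vecs, k=5):
    """Return indices of the k tiles whose feature vectors are most
    similar to the query vector, ranked by cosine similarity."""
    q = query_vec / np.linalg.norm(query_vec)
    t = tile_vecs / np.linalg.norm(tile_vecs, axis=1, keepdims=True)
    sims = t @ q                     # cosine similarity of each tile to the query
    return np.argsort(-sims)[:k]     # indices of the k most similar tiles

# Toy example: 6 tiles with 4-dimensional "embeddings" (random stand-ins
# for real DCNN features).
rng = np.random.default_rng(0)
tiles = rng.normal(size=(6, 4))
query = tiles[2] + 0.01 * rng.normal(size=4)  # a near-duplicate of tile 2
print(nearest_tiles(query, tiles, k=3))       # tile 2 should rank first
```

In a real system the brute-force dot product would be replaced by an approximate nearest-neighbor index so that clicking a spot on the map returns matches over millions of tiles in interactive time.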
Terrapattern is an open-source, open-access, open-ended project created by a collaborative team of artists, creative technologists, and students. Terrapattern is not a company or startup; it is an experimental research prototype, developed in a university setting, whose purpose is to present a new way of exploring, understanding, and organizing the world. We hope it will help citizen scientists, humanitarians, journalists, and other curious people to discover new "patterns of interest".
Developed at the Frank-Ratchye STUDIO for Creative Inquiry at Carnegie Mellon University with support from the John S. and James L. Knight Foundation Prototype Fund.