In recent weeks I’ve been thinking that a confluence of innovations could begin to usher in an era of mixed reality and augmented reality applications…
- Together, Google’s APIs for mobile maps and mobile search provide a ubiquitous substrate for locative media.
- Phones & cell networks are now capable of multiple methods of locating themselves – GPS, cell-ID and even SMS commands.
- Gestural interfaces are becoming widely accepted thanks to the iPhone and Nintendo’s Wii.
Though producers of actual reality games, such as area/code, gestural handset manufacturers like GeoVector and researchers such as Markus Kähäri have been exploring mixed reality platforms for many years, I believe the Android platform and the upcoming iPhone SDK are where we’ll see some action in the next few months.
Rafael Spring and Max Braun have already taken up the Google Android developer challenge with Enkin (thanks Aaron), a ‘link between maps and reality’ that uses positioning data from GPS, accelerometers for orientation and gestures, and a number of web services to overlay data onto a 3D map or a live camera feed. In essence, Enkin can alternately provide a God’s Eye View of your immediate environment or a ‘head-up display’ for whatever you happen to be looking at.
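The core of this kind of camera overlay is simple geometry: take the handset’s GPS fix and its compass/accelerometer-derived azimuth, compute the bearing to each point of interest, and map that bearing onto the camera’s horizontal field of view. A minimal sketch in Python — the function names, field-of-view value and screen width here are illustrative assumptions on my part, not Enkin’s actual API:

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from the observer to a point of interest, in degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(y, x)) % 360

def screen_x(poi_bearing, device_azimuth, fov_deg, screen_width):
    """Horizontal pixel position of a POI label on the camera view,
    or None if the POI falls outside the field of view."""
    # Signed angle between where the camera points and the POI, folded into (-180, 180].
    delta = (poi_bearing - device_azimuth + 180) % 360 - 180
    if abs(delta) > fov_deg / 2:
        return None  # POI is not currently visible
    return (delta / fov_deg + 0.5) * screen_width

# A POI due east of the observer, with the camera also facing east,
# lands in the centre of a hypothetical 480px-wide, 60-degree view.
b = bearing_deg(0.0, 0.0, 0.0, 1.0)   # 90 degrees (due east)
print(screen_x(b, 90.0, 60.0, 480))   # centre of the view
```

Everything else — the 3D map mode, gesture input, web-service lookups — is plumbing around this projection step.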
Though Enkin is ergonomically clunky, it points the way towards multimodal mixed reality; there’s no hardware in Spring & Braun’s work that isn’t already in current and forthcoming handsets.
A couple of years ago, I was mesmerised by the possibilities of my friend Victor’s Herescan project at IDII – he playfully describes it as Exploring Deep Place. It looks like Mixed Reality is about to join the fabric of Actual Reality 🙂
UPDATE: One step closer with Evolution Robotics’ ViPR visual search technology for the iPhone…check out the video demonstration on YouTube.