The key facet of his engine is that it is markerless. While I couldn’t quite pick up the exact method behind his markerless technology, I’m going to make a couple of guesses. I would assume it uses the GPS in the iPhone to get the location, then determines the direction and identifies the buildings using object recognition and data from Google Earth.
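To make my guess concrete: if you know your GPS position and compass heading, you can at least narrow down which known buildings the camera might be pointed at, before any image recognition runs. Here is a minimal sketch of that idea. To be clear, this is my speculation, not Kanemura-san's actual method: the landmark list, coordinates, and field-of-view value below are all made up for illustration.

```python
import math

# Hypothetical landmark database; a real app might pull this from a map
# service such as Google Earth. Coordinates are (latitude, longitude) in degrees.
LANDMARKS = {
    "Tokyo Tower": (35.6586, 139.7454),
    "Zojoji Temple": (35.6575, 139.7485),
    "Atago Shrine": (35.6654, 139.7494),
}

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2, in degrees [0, 360)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(y, x)) % 360

def candidate_buildings(lat, lon, heading, fov_deg=60):
    """Return landmarks whose bearing falls inside the camera's field of view."""
    hits = []
    for name, (blat, blon) in LANDMARKS.items():
        # Signed angular difference in (-180, 180] between landmark bearing and heading.
        diff = (bearing_deg(lat, lon, blat, blon) - heading + 180) % 360 - 180
        if abs(diff) <= fov_deg / 2:
            hits.append(name)
    return hits
```

Standing just west of Tokyo Tower and facing east, `candidate_buildings(35.6595, 139.7400, 90.0)` returns the landmarks ahead of you; the recognition step would then only need to distinguish among those few candidates instead of searching the whole database.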
The beauty of a markerless AR engine is that it can build the information-object link from data that already exists. Once his SREngine is fully functional, I imagine other ideas could be linked to it. I’d love to tell my phone the address of my destination, hold up the phone so it can see where I am, then follow arrows on the iPhone screen the rest of the way. Hopefully by the next time I’m in Japan visiting the Toyota mother plants, his app will be up and running so I won’t have to struggle to find my favorite noodle shop again.
Gambatte kudasai, Kanemura-san! (Good luck, Kanemura-san!)