
Yesterday UgoTrade posted a discussion of Thomas Wrobel’s proposal called, “Everything Everywhere: A proposal for an Augmented Reality Network system based on existing protocols and infrastructure.”  If you’ve been following the recent AR debates, you’ll know that Thomas is a frequent contributor to the discussion under his moniker, Darkflame.

Thomas has been offering his insight that AR should move in the direction of IRC-like systems for some time, so I wasn’t surprised at all when I saw the interview.  I’m glad he took the time to put his thoughts down in a comprehensive paper to help guide the industry forward.

Personally, I find this discussion fascinating, especially given the potential of augmented reality.  It feels like sitting at the dawn of the Universe, at the original singularity, debating how quarks should interact to form atoms.  Maybe my metaphor is a bit overdone, but future events hinge on these little details.


Back to Thomas’ paper, which Tish has covered quite nicely.  I do want to try to answer a question he posed in the latter half of the interview:

I think that, just like the remote channels, local software should also be blended into the same list of layers.  People shouldn’t have to “Alt+Tab” out of one view of the world to see another.
They should be able to see both at once, if they wish.

For instance, if you’re playing an AR game, why shouldn’t your chat window be viewable at the same time?

If you have skinned your environment with a custom view of the world, why shouldn’t you also see mapping or restaurant recommendations?

So local data and remote data should be blended in the same view.
How can AR software – of which, I hope, there will be thousands – be expected to seamlessly layer their graphics, not only with the real world, but with each other, and with online data too? Will games and software makers need to co-operate to allow their graphics to be integrated with correct occlusion taken into account? A tall order, no?

I must confess though, my technology knowledge fails me here.

I would offer that these types of applications have already been worked out in the modifiable user interfaces found in many online games, especially World of Warcraft.  While that isn’t a true 3D environment, I believe the way we interface with the AR world can be customized to suit our needs.  We won’t be able to control occlusion between layers, but we can control the way our personal data looks in relation to that world.  I explained as much in a post about the Human User Interface (HUI) a few months back, so hopefully it adds something to the discussion.
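To make the WoW analogy a little more concrete, here is a minimal sketch of what user-side layer control might look like.  It assumes a hypothetical compositor where each application registers an overlay and the user, not the app, decides its draw order, visibility, and opacity.  None of these class or function names come from any real AR framework; they are purely illustrative.

from dataclasses import dataclass
from typing import Callable, List


@dataclass
class Layer:
    """One source of overlay content (a game, a chat window, restaurant data)."""
    name: str
    render: Callable[[], str]   # stand-in for the app's draw call
    visible: bool = True        # the user, not the app, flips this
    opacity: float = 1.0        # user-controlled styling
    priority: int = 0           # user-controlled draw order


class LayerCompositor:
    """Blends every registered layer into one view, under the user's control."""

    def __init__(self) -> None:
        self.layers: List[Layer] = []

    def register(self, layer: Layer) -> None:
        self.layers.append(layer)

    def compose_frame(self) -> List[str]:
        # Draw lowest-priority layers first; skip anything the user has hidden.
        ordered = sorted(self.layers, key=lambda l: l.priority)
        return [
            f"{layer.name} (opacity {layer.opacity:.1f}): {layer.render()}"
            for layer in ordered
            if layer.visible
        ]


if __name__ == "__main__":
    view = LayerCompositor()
    view.register(Layer("ar_game", lambda: "zombies on Main St", priority=10))
    view.register(Layer("chat", lambda: "Darkflame: nice paper!", priority=20))
    view.register(Layer("restaurants", lambda: "3 reviews nearby", priority=5))

    # The user decides how their personal data sits on top of the world.
    for line in view.compose_frame():
        print(line)

The point of the sketch is simply that blending local and remote layers is a presentation problem the user can own, much like a WoW addon reskinning the interface, rather than something every game and data provider has to negotiate among themselves.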

Overall, I think a lot of the ideas proposed by Thomas are valid discussion points for the unfolding AR world.  The question is how we move ideas like this from concept to reality.  Hopefully, ISMAR and the AR Consortium will help facilitate this discussion, and when they do, I hope they include Thomas Wrobel.

{"email":"Email address invalid","url":"Website address invalid","required":"Required field missing"}
>