<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>UgoTrade &#187; augmented reality on the iphone</title>
	<atom:link href="http://www.ugotrade.com/tag/augmented-reality-on-the-iphone/feed/" rel="self" type="application/rss+xml" />
	<link>http://www.ugotrade.com</link>
	<description>Augmented Realities at the Edge of the Network</description>
	<lastBuildDate>Wed, 25 May 2016 15:59:56 +0000</lastBuildDate>
	<language>en-US</language>
		<sy:updatePeriod>hourly</sy:updatePeriod>
		<sy:updateFrequency>1</sy:updateFrequency>
	<generator>https://wordpress.org/?v=3.9.40</generator>
	<item>
		<title>Games, Goggles, and Going Hollywood&#8230;How AR is Changing the Entertainment Landscape: Talking with Brian Selzer, Ogmento</title>
		<link>http://www.ugotrade.com/2009/08/30/games-goggles-and-going-hollywood-how-ar-is-changing-the-entertainment-landscape-talking-with-brian-selzer-ogmento/</link>
		<comments>http://www.ugotrade.com/2009/08/30/games-goggles-and-going-hollywood-how-ar-is-changing-the-entertainment-landscape-talking-with-brian-selzer-ogmento/#comments</comments>
		<pubDate>Mon, 31 Aug 2009 03:38:38 +0000</pubDate>
		<dc:creator><![CDATA[Tish Shute]]></dc:creator>
				<category><![CDATA[Ambient Devices]]></category>
		<category><![CDATA[Android]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Augmented Reality]]></category>
		<category><![CDATA[Carbon Footprint Reduction]]></category>
		<category><![CDATA[digital public space]]></category>
		<category><![CDATA[Ecological Intelligence]]></category>
		<category><![CDATA[home energy monitoring]]></category>
		<category><![CDATA[home energy monitors]]></category>
		<category><![CDATA[Instrumenting the World]]></category>
		<category><![CDATA[internet of things]]></category>
		<category><![CDATA[iphone]]></category>
		<category><![CDATA[mirror worlds]]></category>
		<category><![CDATA[Mixed Reality]]></category>
		<category><![CDATA[mobile augmented reality]]></category>
		<category><![CDATA[mobile meets social]]></category>
		<category><![CDATA[Mobile Reality]]></category>
		<category><![CDATA[nanotechnology]]></category>
		<category><![CDATA[new urbanism]]></category>
		<category><![CDATA[Smart Planet]]></category>
		<category><![CDATA[social gaming]]></category>
		<category><![CDATA[sustainable living]]></category>
		<category><![CDATA[ubiquitous computing]]></category>
		<category><![CDATA[Virtual Meters]]></category>
		<category><![CDATA[Web Meets World]]></category>
		<category><![CDATA[websquared]]></category>
		<category><![CDATA[World 2.0]]></category>
		<category><![CDATA[alternate reality RPG]]></category>
		<category><![CDATA[ambient intelligence]]></category>
		<category><![CDATA[AMEE]]></category>
		<category><![CDATA[AR Network]]></category>
		<category><![CDATA[AR spam]]></category>
		<category><![CDATA[ARBalloon]]></category>
		<category><![CDATA[ARN]]></category>
		<category><![CDATA[augmented reality baseball cards]]></category>
		<category><![CDATA[augmented reality development]]></category>
		<category><![CDATA[augmented reality eyewear]]></category>
		<category><![CDATA[augmented reality hotspots]]></category>
		<category><![CDATA[augmented reality industry]]></category>
		<category><![CDATA[augmented reality network]]></category>
		<category><![CDATA[augmented reality on the iphone]]></category>
		<category><![CDATA[augmented reality search]]></category>
		<category><![CDATA[augmented reality toys]]></category>
		<category><![CDATA[Blockade]]></category>
		<category><![CDATA[Brad Foxhoven]]></category>
		<category><![CDATA[Brian Selzer]]></category>
		<category><![CDATA[Bruce Sterling]]></category>
		<category><![CDATA[Cyberpunk]]></category>
		<category><![CDATA[Evolutionary Reality]]></category>
		<category><![CDATA[EyeToy]]></category>
		<category><![CDATA[eyewear for AR]]></category>
		<category><![CDATA[Games Alfresco]]></category>
		<category><![CDATA[Green Tech AR]]></category>
		<category><![CDATA[jim purbrick]]></category>
		<category><![CDATA[Kensuke Tanabe]]></category>
		<category><![CDATA[Layar]]></category>
		<category><![CDATA[Layar Developer Conference]]></category>
		<category><![CDATA[location based RPGs]]></category>
		<category><![CDATA[Lumus]]></category>
		<category><![CDATA[markerless AR]]></category>
		<category><![CDATA[markerless mobile augmented reality]]></category>
		<category><![CDATA[markerless natural feature tracking]]></category>
		<category><![CDATA[Masunaga]]></category>
		<category><![CDATA[Metroid]]></category>
		<category><![CDATA[Metroid Prime]]></category>
		<category><![CDATA[Mirrorshades]]></category>
		<category><![CDATA[multiperson mobile AR experiences]]></category>
		<category><![CDATA[Nano Air Vehicles]]></category>
		<category><![CDATA[near field object recognition]]></category>
		<category><![CDATA[new augmented reality trade jargon]]></category>
		<category><![CDATA[Ogmento]]></category>
		<category><![CDATA[Ori Inbar]]></category>
		<category><![CDATA[Pachube]]></category>
		<category><![CDATA[Pentagon's Robot Hummingbirds]]></category>
		<category><![CDATA[Project Natale]]></category>
		<category><![CDATA[Put a Spell]]></category>
		<category><![CDATA[Robert Rice]]></category>
		<category><![CDATA[Sekai camera]]></category>
		<category><![CDATA[social gaming platforms]]></category>
		<category><![CDATA[sticky light]]></category>
		<category><![CDATA[The Dawn of the Augmented Reality Industry]]></category>
		<category><![CDATA[Tonchidot]]></category>
		<category><![CDATA[Topps AR baseball cards]]></category>
		<category><![CDATA[Total Immersion]]></category>
		<category><![CDATA[Vuzix]]></category>
		<category><![CDATA[Wikitude]]></category>
		<category><![CDATA[Yoshio Sakamoto]]></category>

		<guid isPermaLink="false">http://www.ugotrade.com/?p=4334</guid>
		<description><![CDATA[Picture on the left Mirrorshades, picture on the right a Metroid Hud. &#8220;Augmented Reality is like a Philip K Dick novel torn off its paperback rack and blasted out of iPhones,&#8221; Bruce Sterling in Beyond the Beyond &#8220;a techno visionary dream come true &#8211; those are rare, really rare, you have to be patient, it&#8217;s [&#8230;]]]></description>
				<content:encoded><![CDATA[<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/08/mirrorshadespost3.jpg"><img class="alignnone size-full wp-image-4349" title="mirrorshadespost3" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/08/mirrorshadespost3.jpg" alt="mirrorshadespost3" width="124" height="204" /></a><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/08/metroid_hud1post2.jpg"><img class="alignnone size-medium wp-image-4350" title="metroid_hud1post" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/08/metroid_hud1post2-300x204.jpg" alt="metroid_hud1post" width="300" height="204" /></a></p>
<p><em>Picture on the left <a href="http://www.amazon.com/Mirrorshades-Cyberpunk-Anthology-Greg-Bear/dp/0441533825" target="_blank">Mirrorshades</a>, picture on the right a <a href="http://en.wikipedia.org/wiki/Metroid" target="_blank">Metroid Hud</a>.</em></p>
<p><strong>&#8220;Augmented Reality is like a Philip K Dick novel torn off its paperback rack and blasted out of iPhones,&#8221; <a href="http://www.wired.com/beyond_the_beyond/2009/08/the-key-take-aways-for-investors-interested-in-the-augmented-reality-field/" target="_blank">Bruce Sterling in Beyond the Beyond</a></strong></p>
<p><strong>&#8220;a techno visionary dream come true &#8211; those are rare, really rare, you have to be patient, it&#8217;s super cyberpunk&#8221;&#8230; Bruce Sterling, <a href="http://vimeo.com/6189763" target="_blank">&#8220;At the Dawn of the Augmented Reality Industry.&#8221; </a></strong></p>
<p>The Dawn of the Augmented Reality Industry continues to brighten, and now we have two augmented reality companies, <a href="http://www.t-immersion.com/" target="_blank">Total Immersion</a> and <a href="http://ogmento.com/" target="_blank">Ogmento</a>, firmly established in Hollywood &#8211; the dream mother of so many of our augmented realities.</p>
<p><a href="http://ogmento.com/" target="_blank">Ogmento</a> is the more recent of these two pioneering augmented reality companies to set up shop in LA. <a href="http://www.t-immersion.com/" target="_blank">Total Immersion&#8217;s</a> CEO Bruno Uzzan moved to LA from France two years ago, although he still has a fifty-person R&D team in France. Total Immersion began 10 years ago in the quiet, lonely hours before the dawn of an AR industry. But <a href="http://gamesalfresco.com/2009/07/23/mattel-launches-augmented-toys-at-comic-con/" target="_blank">Total Immersion&#8217;s AR toys for Mattel</a> and augmented reality for <a href="http://www.youtube.com/watch?v=I7jm-AsY0lU" target="_blank">Topps baseball cards</a> fired up CNet writer Daniel Terdiman enough for him to say, &#8220;I have seen the future of toys, and it is augmented reality&#8221; (<a href="http://news.cnet.com/8301-13772_3-10317117-52.html" target="_blank">see the full post here on CNet</a>).</p>
<p>Recently, I talked with <a href="http://www.ugotrade.com/2009/07/28/augmented-realitys-growth-is-exponential-ogmento-reality-reinvented-talking-with-ori-inbar/" target="_blank">Ori Inbar, one of the founders of Ogmento</a> and of the premier augmented reality blog <a href="http://gamesalfresco.com/" target="_blank">Games Alfresco</a>, about his new venture in Hollywood. Bruce Sterling, <a href="http://twitter.com/bruces" target="_blank">@bruces</a>, had some fun with my invention of <a href="http://www.wired.com/beyond_the_beyond/2009/08/augmented-reality-ogmento/" target="_blank">brand new augmented reality trade jargon here</a>! Ori pointed out that Ogmento brings two important new facets to the rapidly growing augmented reality field: first, they are bringing leadership from veterans of the entertainment industry into augmented reality development. <a id="squu" title="Brad Foxhoven" href="http://www.blockade.com.nyud.net:8080/about/about-blockade" target="_blank">Brad Foxhoven</a> and <a id="odvk" title="Brian Selzer" href="http://brianselzer.com/">Brian Selzer</a> from <a id="xow_" title="Blockade" href="http://www.blockade.com/" target="_blank">Blockade</a> have partnered with Ori on Ogmento. And, in another important step forward for a young industry, Ogmento announced they will act as publishers for a fast-growing cohort of augmented reality application developers, helping AR development teams bring their concepts to market.</p>
<p>So I was very happy also to have the opportunity to talk with Brian Selzer. As Bruce Sterling pointed out in his seminal <a href="http://eurekadejavu.blogspot.com/2009/08/augmented-realitys-sermon-on-flatlands.html" target="_blank">sermon from the flatlands</a> at the <a href="http://layar.com/" target="_blank">Layar</a> Developer Conference, AR is kind of a &#8220;Hollywood scene.&#8221; We have seen the web early adopter/developer/blogger community embrace augmented reality browser experiences in recent weeks in an awesome wave of enthusiasm. Are Hollywood creatives equally smitten? For the answers, see the full interview with Brian Selzer below.</p>
<p>Brian Selzer (<a href="http://brianselzer.com/" target="_blank">www.brianselzer.com</a> and <a href="http://twitter.com/brianse7en" target="_blank">twitter &#8211; brianse7en</a>) has extensive involvement with emerging platforms:</p>
<p><strong>&#8220;from launching dot com entertainment sites in the late 90&#8242;s to creating early versions of social gaming platforms, or bringing big brands like Spider-Man and X-Men into the mobile space for the first time. Last year I was focused on bringing video game characters and worlds into the online space as UGC [user generated content] projects (<a href="http://www.mashade.com/" target="_blank">mashade.com</a>, <a href="http://www.instafilms.com/" target="_blank">instafilms.com</a>).&#8221;</strong></p>
<p>I began my own career in Hollywood doing motion control photography and creating software that bridged the language of robotics and servo motors with the visions of film directors. Eventually our little company, NPlus1, moved on to 3D vision systems and image recognition work. So yes, I have been really, really patient waiting for this particular techno visionary dream. And, while I have been waiting for augmented reality to manifest, I have grown to love the internet. But now, how awesome: <a href="../../2009/01/17/is-it-%E2%80%9Comg-finally%E2%80%9D-for-augmented-reality-interview-with-robert-rice/" target="_blank">it is OMG finally for mobile AR!</a></p>
<p>Augmented reality is busting out all over &#8211; through our laptops, our phones, on the streets, toys, baseball cards, art installations, <a href="http://www.youtube.com/watch?v=9noMfsg486Y" target="_blank">sticky light calligraphy</a> and more.</p>
<p>Many of my questions to Brian were directed at how and when we will see augmented realities with near field object recognition, image recognition and tracking and, of course, the elusive eyewear. As Bruce Sterling points out, we are just at the very, very beginning &#8211; the dawn of an industry. I created the photomontage below on the right to complement <em><a href="http://www.tonchidot.com/">Tonchidot&#8217;s</a></em> illustration suggesting the evolutionary inevitability of holding our phones up (below on the left). The Evolutionary Reality of AR will not end there. It is just a step toward eyewear, hummingbirds or <a href="http://gizmodo.com/5306679/pentagons-robot-hummingbird-christened-nano-air-vehicle" target="_blank">Nano Air Vehicles</a>, and more&#8230;</p>
<h3>The Evolutionary Reality of AR</h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/08/Picture-96.png"><img class="alignnone size-medium wp-image-4359" title="Picture 96" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/08/Picture-96-300x97.png" alt="Picture 96" width="300" height="97" /></a></p>
<p><em>Cartoon on the left by <a href="http://www.tonchidot.com/">Tonchidot</a>; on the right, a collage of a stock photo and the <a href="http://gizmodo.com/5306679/pentagons-robot-hummingbird-christened-nano-air-vehicle" target="_blank">Pentagon&#8217;s robot hummingbirds &#8211; &#8220;Nano Air Vehicles&#8221;</a>.</em></p>
<p>We finally have, in the iPhone, an affordable mediating device with the horsepower, mindshare and business model to bring AR mainstream. The much anticipated Apple 3.1 Beta SDK to be released in September will not, I am sure, open up the Video API at the levels that augmented realities with near field object recognition and tracking require (I would love to be proved wrong though). But the magic wand to deliver even tightly registered AR graphics/media (which require a lot of CPU and GPU) to a wide audience is in our hands, so full access may not be far off. And others, of course, can and will try to knock the iPhone off its current pedestal. AR made its mobile phone debut on Android, after all.</p>
<p>Like everyone else who loves AR, I wish that Apple would open up faster (and I wish Android would manifest on some rocking hardware). But we will see enough of the iPhone Video API open up for the next generation of mobile augmented reality games and applications to emerge in the coming months.</p>
<p>One of these will be Ogmento&#8217;s. Although Ogmento is in stealth mode, they have released <a href="http://www.youtube.com/watch?v=EB45O7-6Xrg&amp;eurl=http%3A%2F%2Fogmento.com%2F&amp;feature=player_embedded" target="_blank">a teaser for their first game, &#8220;Put A Spell,&#8221;</a> developed by ARBalloon &#8211; screenshot below. Ori did reveal to me in <a href="../../2009/07/28/augmented-realitys-growth-is-exponential-ogmento-reality-reinvented-talking-with-ori-inbar/" target="_blank">this interview</a> that they are doing image recognition and using the Imagination AR engine.</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/08/Picture-95.png"><img class="alignnone size-medium wp-image-4356" title="Picture 95" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/08/Picture-95-300x177.png" alt="Picture 95" width="300" height="177" /></a></p>
<p>As Brian notes, Hollywood has had the AR bug for a long time. AR has been everywhere in science fiction movies and video games. Nintendo&#8217;s SPD3 head Kensuke Tanabe, &#8220;effectively the man in charge of overseeing all the <em>Metroid</em> franchise underneath original co-creator Yoshio Sakamoto,&#8221; explains the story of <em>Metroid</em> to Brandon Boyer of <a href="http://www.offworld.com/2009/08/retro-effect-a-day-in-the-stud.html" target="_blank">Offworld here</a> (an image of a Metroid HUD opens this post, on the right):</p>
<p><strong>&#8220;the idea of the different visors you use in the <em>Prime</em> games to interact with the world: the scan visor, for instance, set the game apart from other first person shooters in that the player was using it to proactively collect information from the world, rather than having the story come to them passively, in the form of cut-scenes or narration. &#8220;<em>Prime</em> could have adventure elements with the introduction of this visor,&#8221; says Tanabe, &#8220;That&#8217;s how we came up with the genre &#8212; first person adventure, instead of shooter.&#8221;</strong></p>
<p>But as Brian points out:</p>
<p><strong>&#8220;the light bulb has been lit and Hollywood is seeing that the software and hardware are here today to deliver these types of AR experiences in real life (to a lesser extent of course, but the path is getting clear).&#8221;</strong></p>
<h3>Talking with Brian Selzer</h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/08/me.jpg"><img class="alignnone size-full wp-image-4363" title="me" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/08/me.jpg" alt="me" width="188" height="227" /></a></p>
<p><strong>Tish Shute: </strong>Bruce Sterling&#8217;s sermon at the Layar Developer conference, <a href="http://www.wired.com/beyond_the_beyond/2009/08/at-the-dawn-of-the-augmented-reality-industry/" target="_blank">&#8220;At the Dawn of the Augmented Reality Industry,&#8221;</a> was absolutely awesome. He spread the future feast/orgy of augmented reality before us &#8211; and described many of the dishes we will be tasting, both delectable and diabolical. One of the many things he points out is that AR is kind of a &#8220;Hollywood scene.&#8221; And, as Ogmento is one of only two augmented reality companies in Hollywood, I am interested to hear how it looks from your neck of the woods. We have seen the web early adopter/developer/blogger community embrace augmented reality browsers in recent weeks in an awesome wave of enthusiasm &#8211; are Hollywood creatives catching the buzz?</p>
<p><strong>Brian Selzer: It was a thrill to hear Bruce Sterling mention Ogmento. I devoured all of his Cyberpunk books back in the 80&#8242;s, along with writers like Gibson, Rucker, Shirley&#8230; To me, sci-fi writers are the visionaries who define and influence our technological paths into the future. They make science and tech sexy enough to want to manifest those experiences in the real world. Clearly Bruce sees the AR industry as being sexy. I love that he called it &#8220;a techno-visionary dream come true&#8230; and super-cyberpunk.&#8221; And yes, kind of a Hollywood scene.</strong></p>
<p><strong>Hollywood creatives caught the AR bug before they knew what AR was. Look at science fiction movies and video games to see AR everywhere. Terminator, The Matrix, Minority Report, Iron Man&#8230; the list goes on. Look at any video game with an integrated heads-up display. It&#8217;s clear Hollywood loves AR. It&#8217;s only been in the past few months though that the light bulb has been lit and Hollywood is seeing that the software and hardware are here today to deliver these types of AR experiences in real life (to a lesser extent of course, but the path is getting clear). So yes, the buzz is here and it&#8217;s strong. With that, we all have to be prepared for the good, the bad and the ugly as AR goes mainstream.</strong></p>
<p><strong>It certainly goes to show how young this industry is when Ogmento and Total Immersion are currently the only AR companies based in Los Angeles. It&#8217;s very exciting to be the only company right now demonstrating a natural feature tracking (markerless) iPhone experience in Hollywood. We are in talks to bring some very big brands and properties to the mobile AR space. The goal is to deliver experiences that create added engagement and value for the consumer.</strong></p>
<p><strong>Tish Shute:</strong> Also in his landmark sermon, Bruce Sterling noted that augmented reality has been around for 17 years and now at last we are seeing the dawning of an augmented reality industry. What inspired you to take up the challenge of launching an augmented reality company in Hollywood? Oh, and congrats that Bruce Sterling name-checked Ogmento in his list of companies that prove that this really is the dawn of an industry!</p>
<p><strong>Brian Selzer: I&#8217;ve always been involved in emerging platforms&#8230; from launching dot com entertainment sites in the late 90&#8242;s to creating early versions of social gaming platforms, or bringing big brands like Spider-Man and X-Men into the mobile space for the first time. Last year I was focused on bringing video game characters and worlds into the online space as UGC projects (mashade.com, instafilms.com). Working with all these great CG game assets, I continued to think about what&#8217;s next, and that&#8217;s when I started to follow AR very closely and started engaging with those who were pioneering in the space.</strong></p>
<p><strong>I remember swapping instant messages with <a href="http://curiousraven.squarespace.com/" target="_blank">Robert Rice</a> (<a href="http://twitter.com/robertrice" target="_blank">@robertrice</a>) right after the 2008 Super Bowl. We were not chatting about the football game, but rather about some of the commercials that aired during the event as a sign that AR was making its way into the mainstream. A lot of people became aware of AR for the first time when the <a href="http://ge.ecomagination.com/smartgrid/" target="_blank">GE SmartGrid commercial</a> aired. There were all these YouTube videos popping up of people blowing on holographic wind turbines.</strong></p>
<p><strong>The commercial that really got me excited though was the <a href="http://www.youtube.com/watch?v=Kwke0LNardc" target="_blank">Coke Avatar commercial</a>. In that commercial, people in the city were sporadically being portrayed as their digital personas, avatars, gaming characters, etc. For me that spot did a great job showing how many of us already have these &#8216;alter egos&#8217; that live in cyberspace, and how the line between these worlds can sometimes be blurred. I remember watching that commercial and thinking that is exactly the type of experience I&#8217;d like to create with mobile AR. I want to overlap the virtual world into our everyday reality. Why can&#8217;t I bring my World of Warcraft or Second Life persona with me into the real world?</strong></p>
<p><strong>I am big on the notion of &#8220;Games and Goals.&#8221; I believe that games have the power to motivate people in a very powerful way. By challenging ourselves while playing a game we can climb mountains. Augmented Reality is the perfect platform to bring gaming into the real world. By mixing the virtual world with the physical world, this added layer of perception provides a very powerful experience for something like a role-playing game.</strong></p>
<p><strong>One of my earlier social-gaming projects was a website called Superdudes. This was a &#8220;Be Your Own Superhero&#8221; concept that celebrated and motivated kids to create superhero avatars/personas online, and we gave members all sorts of games, challenges, and rewards, some of which carried into the real world. The site recognized members for teamwork, creativity, volunteer work and things like that. So the Superdudes were often involved in charity events and benefits to help children. Everybody called each other by their superhero names, and the line between fantasy and reality was being blurred. This project really got me thinking about what happens when you take positive role-playing like this and mix it into the real world. I started to work on a plan for location-based activist missions for points and rewards, but never got to complete that. So I have some unfinished business here.</strong></p>
<p><strong>I think it would be fantastic to be able to show up to some type of fun event with friends, and everybody could see each other&#8217;s alter-ego personas standing before them. When you can turn the world into a playground, and use the power of gaming to make a positive impact on the planet&#8230; well, I don&#8217;t think there is anything better than that. These are the types of projects that drive me, and I think AR is the best platform to support these types of social gaming experiences.</strong></p>
<p><strong>Tish:</strong> Does Ogmento have any RPGs under development? I noticed in the Google Wave on RPGs that someone has been working on doing something with the Dungeons&amp;Dragons API. I am interested in exploring the web of protocols underlying Wave as a transport mechanism for multi-person, mobile AR experiences (not requiring downloads) on an open global outdoor AR network. If not Wave, what do you see as the potential infrastructure and protocols we could harness for an open augmented reality network?</p>
<p><strong>Brian: Ogmento has a deep background in video games and we interact regularly with most of the major game publishers. As a company we are not so much developing our own RPGs right now, but rather exploring what mobile AR extensions make sense for existing brands. There are many limitations to location-based gaming, but a global AR network is exactly along the lines we are thinking. Lots of discussions are taking place on protocols, platforms, APIs, and there are numerous ways to approach this. We need to be able to use what&#8217;s available now and continue to refine and customize for AR&#8217;s specific needs and issues as we progress.</strong></p>
<p><strong>In general though, Ogmento is focused on what types of experiences can be had today and over the next couple of years. I still think we are several years out from a truly open augmented reality network. We are certainly looking at launching our own &#8220;Ogmented Network&#8221; which would support some fun treasure hunt type experiences, or add an entertainment layer on top of traditional outdoor marketing campaigns.</strong></p>
<p><strong>Tish:</strong> I don&#8217;t know whether you have read Thomas Wrobel&#8217;s ideas for an open augmented reality network that I just <a href="http://www.ugotrade.com/2009/08/19/everything-everywhere-thomas-wrobels-proposal-for-an-open-augmented-reality-network/" target="_blank">published here on UgoTrade</a>. The principles he talks about are very important for augmented reality to become a major part of our lives. Considering the difficulty open networks can pose for emerging business models, how can we fund the development of an open framework for augmented reality?</p>
<p>&#8220;<em>a future AR Network, I mean one as universal and as standard as the internet. One where people can connect from any number of devices, and without additional downloads, experience the majority of the content.<br />
Where people can just point their phone, webcam, or pair of AR glasses anywhere were a virtual object should be, and they will see it. The user experience is seamless, AR comes to them without them needing to â€œprepareâ€ their device for it.&#8221;</em></p>
<p><strong>Brian: I think funding for these types of projects will definitely come from Venture Capital groups in the near future. It&#8217;s early in AR, but the VCs are watching and deciding which horses to bet on. Until that time, it&#8217;s about service work, and developing AR experiences for others with what is possible today. That work will help fund internal development of original AR products and platform development.</strong></p>
<p><strong>Tish:</strong> How did you get started with Ogmento?</p>
<p><strong>Brian: My first conversation with Ori was actually about my interest in location-based RPG concepts. We had a long conversation about the possibilities with AR, and it was clear that we shared similar interests, but were coming from different, complementary backgrounds. The idea of collaboration was exciting, so we just kept talking until the timing felt right. Now, with Ogmento we bring a unique blend of AR development experience with deep backgrounds in AR technology, animation, video games, entertainment, social media, etc. I think this is a powerful mix that will allow us to do some great things.</strong></p>
<p><strong>It&#8217;s still so early, and things are just getting started in AR. There are only so many webcam magic tricks you can enjoy before you are ready for something else. The location-based apps have the most potential in my opinion, which is why we are really focused on mobile AR. We have some board-game type projects, which do not instantly scream location-based gaming, but if you look at something like the ARhrrr board game, you can see how much more compelling it can be when the game invites the player to be actively moving around during the experience.</strong></p>
<p><strong>Tish:</strong> I am interested in your perspective on how we can create the kind of AR experiences that really embody what has always been so exciting about AR &#8211; the tight alignment of graphics and media with real world objects and, ultimately, a rich immersive 3D experience. So I am going to hit you with a bunch of those &#8220;Is this really eyewear or vaporware?&#8221; questions. The real deal eyewear changes everything!</p>
<p>While eyewear is a big challenge technically and aesthetically, I am pretty sure that there are several outfits out there that can pull off the optics and projection. Will the entertainment industry get excited enough to put a major push into delivering the eyewear in short order, instead of the 5 to 10 year project that some people still think it is? Is the business development challenge perhaps bigger than the technical obstacles? What is your view on this?</p>
<p>And, perhaps, the eyewear is a clear example of a need for partnerships. For example, we have seen efforts from companies like <a href="http://www.vuzix.com/home/index.html" target="_blank">Vuzix</a> and <a href="http://www.lumus-optical.com/" target="_blank">Lumus</a>, and recently a Japanese Company, <a href="http://www.masunaga1905.jp/brand/teleglass/">Masunaga</a>.</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/08/Picture-97.png"><img class="alignnone size-medium wp-image-4386" title="Picture 97" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/08/Picture-97-300x80.png" alt="Picture 97" width="300" height="80" /></a></p>
<p>I have no reports from people who have tried the Masunaga eyewear yet. But current eyewear offerings at a reasonable price point, limited by a small field of view and tethered, are not workable solutions for augmented reality experiences. The problems are not insurmountable, though. What will facilitate the real deal? It seems critical to start creating hardware relationships now. The industry is costly and slow moving and, as Robert Rice put it to me in a recent conversation, &#8220;once the software cat is out of the bag, it&#8217;s going to go wild and if the hardware isn&#8217;t there, it&#8217;s going to stutter.&#8221;</p>
<p>As Ori notes, some of the hardware companies like Intel and others don&#8217;t seem to be paying enough attention to AR. Ori points out they don&#8217;t see the demand yet. But in order to create an awesome AR experience and demand from a mass audience, don&#8217;t we need to work in conjunction with hardware designers?</p>
<p><strong>Brian: It&#8217;s fun to think about who will eventually deliver a great hardware solution for AR glasses. It will happen. It would be cool to see somebody like an Oakley or Nike partnered up with a company like Vuzix to deliver something people actually might wear in public. Perhaps a hardware manufacturer like Apple or Nokia will bring us something like the iSight or the NGaze down the line. I&#8217;d love to see a set of glasses designed by Ideo. Microsoft and Sony are already playing with technologies like Project Natal and the EyeToy, so I think it&#8217;s only a matter of time before they deliver an eyewear solution. I would even look to the toy companies to eventually make an investment here.</strong></p>
<p><strong>Gamers will be the early adopters, and in a few years we may start to see people running around in the park wearing glasses with headsets, but it will be acceptable because it&#8217;s clear they are using them for a game. It&#8217;s going to take a very sexy and stylish piece of hardware for everyday people to be willing to wear AR glasses in public while going about their everyday business. It&#8217;s like the recent cover of Wired magazine where Brad Pitt is wearing a mobile headset in his ear, and the editors point out that even he can&#8217;t pull that look off, so why do you think you can? When AR glasses come in designer frames, and you can&#8217;t tell them from non-AR glasses, to me that&#8217;s when things get really interesting from a mass-adoption perspective. Compare how many people were carrying around a mobile phone in the 80s to now. I think it will be the same thing with glasses.</strong></p>
<p><strong>I was in an AR pitch meeting the other week at a very significant media company, and brought up the point that today&#8217;s handheld Smartphones will eventually evolve into tomorrow&#8217;s Smartglasses. My comment was quickly shrugged off as a sort of sci-fi notion that was irrelevant to the business at hand. Probably true, but I think it is important to understand where digital media and entertainment is going, so you can adapt quickly, and evolve into those spaces more naturally. The more we see people walking around with their Smartphones in front of their face (like a camera), the sooner it will be that we make the jump to eyeglasses as a key hardware device for AR experiences.</strong></p>
<p><strong>At Ogmento, we definitely are working on AR experiences with the hardware and software available today. We will get some product out this year, and 2010 will be a banner year for markerless mobile AR in general. I think the entire AR community is looking forward to bringing this technology to the mainstream in the form of games, marketing campaigns, virtual docent apps, and much more. It might not be the full experience we are all dreaming about for some time, but we can see the path and the true potential, and it&#8217;s pretty spectacular.</strong></p>
<p><strong>You mention the tight alignment of graphics and media with real world objects. That is really our focus. A lot of well-deserved attention is going to the browser overlay &#8220;post-it&#8221; approach right now, which uses compass and GPS. We are focused on markerless natural feature tracking, so once you identify something that is AR enhanced in your environment, you can interact with that integrated experience. On an iPhone that can be as simple as using your touch screen to interact. When you are wearing glasses, it becomes more about visual tracking. There are lots of smart people thinking through these issues, many of whom you have interviewed. It is my hope that there are exciting collaborative efforts to be had in the coming months to get us all there together and faster.</strong></p>
<p><strong>Tish:</strong> Bruce touched on some of the hard problems that have to be solved for augmented reality &#8211; he noted, for instance, that security needs to be tackled in the early stages. Robert made a nice list: <em>&#8220;privacy, media persistence, spam, creating UI conventions, security, tagging and annotation standards, contextual search, intelligent agents, seamless integration and access of external sensors or data sources, telecom fragmentation, privilege and trust systems, and a variety of others.&#8221;</em> Will Ogmento be leading the way in solving some of these hard problems?</p>
<p>And won&#8217;t trying to solve these hard problems for networked AR in walled garden scenarios, one company at a time, lead to a lot of wasted energy reinventing the wheel?</p>
<p><strong>Brian: These are all important issues, and again there are a lot of smart people thinking about solutions to these problems on a daily basis. Ogmento is interested in partnering with developers and supporting their efforts as a publisher of mobile AR experiences. While we intend to roll up our sleeves in these areas, we are currently more focused on taking AR mainstream with the hardware and software available today. As the industry evolves, so will Ogmento. As the opportunities evolve, our ability to make a greater impact tackling these issues will be realized.</strong></p>
<p><strong>Tish: </strong>Another area of development that could really kick AR into high gear might be creating augmented reality hotspots, where we can deliver the kind of location accuracy/instrumentation necessary to create interesting AR experiences (a partnership with Starbucks, perhaps?!). Augmented reality hotspots could deliver the kind of high quality AR experience that isn&#8217;t possible ubiquitously at the moment, and may be a real way to get people really exploring the potential of AR now, rather than later?</p>
<p><strong>Brian: Agreed. I see a great opportunity here with this approach.</strong></p>
<p><strong>Tish:</strong> There are many obstacles to Green AR &#8211; the energy-hogging servers at the backend for starters! Last week I had a conversation with Gavin Starks of <a href="http://www.amee.com/?page_id=289" target="_blank">AMEE</a>, <a href="http://curiousraven.squarespace.com/" target="_blank">Robert Rice</a> and <a href="http://jimpurbrick.com/" target="_blank">Jim Purbrick</a> about how to work with AMEE and the technology available, and encourage Green Tech AR development (<a href="http://blog.pachube.com/2009/06/pachube-augmented-reality-demo-with.html" target="_blank">see an early exploration of green tech AR from Pachube here</a>).</p>
<p>We came up with the idea of holding a competition, perhaps centered around a targeted instrumented space. But I would really love to hear your thoughts on the topic of Green Tech AR (the energy-hogging servers at the back end being the first cloud on the horizon!). Cool GreenTech AR imaginings, social gaming ideas, RPGs, not necessarily even tied to the immediately practical, would be like rain in a drought!</p>
<p><strong>Brian: I go back to &#8220;Games and Goals&#8221;&#8230; If you make environmental and other activist efforts fun and rewarding, more people are likely to be motivated and participate. Can you imagine having a personal &#8220;carbon footprint stat&#8221; floating over yourself at all times? Or over your home or factory? How would that change your behavior? We all love stats. Look at how the Nike+ campaign has used technology and gaming to motivate people to run. I think there is a lot that can be done to make being green fun. It starts with the individual, and spreads from there. Keep me posted on that one!</strong></p>
<p><strong>Tish:</strong> I would also like to explore further the <a href="http://www.readwriteweb.com/archives/augmented_reality_human_interface_for_ambient_intelligence.php" target="_blank">RRW suggestion that ambient intelligence is both the Holy Grail of AR and possibly snake oil</a>:</p>
<p><em>&#8220;The holy grail of the mobile AR industry is to find a way to deliver the right information to a user before the user needs it, and without the user having to search for it. This holy grail is likely in a ditch somewhere beside a well-traveled road in the district of the semantic Web, ambient intelligence and the Internet of things. Be wary of any hyped-up invitation to invest in a company that claims to have gotten the opportunity right. What we&#8217;ve seen in the commercial industry to date is a rather complex version of a keyboard, mouse, and monitor.&#8221;</em></p>
<p>So Holy Grail, Snake Oil, or a ditch somewhere&#8230;.?</p>
<p><strong>Brian: I instantly think of Minority Report, where Tom Cruise&#8217;s character is being bombarded with holographic ads personalized with his name and to his current situation. In the future, spam is a nightmare, especially when it knows who you are. I think the key thing here is delivering &#8220;the right information&#8221;, and we still don&#8217;t have that down. I do see a day where we can truly customize what comes to us, how we want it, when we want it. My future vision of ambient intelligence is the ability to &#8220;turn everything off&#8221; if I want to&#8230; block out the stimuli and replace it with images of nature, or natural surroundings, etc. Where I live in Los Angeles, we have those digital billboards everywhere, so it&#8217;s like advertising overload wherever you look (hints of Blade Runner). I personally don&#8217;t mind them, but I know there is great debate about there simply being too many billboards everywhere. So AR would only add to the noise of life by adding yet another digital overlay of information, right? </strong></p>
<p><strong>Perhaps the holy grail is to use technology to filter things out. AR might become a solution to leading a simpler life, or a perfectly customized life if you want that. Ultimately the control needs to be with the individual. I guess I am talking about something like TiVo taken to the extreme.</strong></p>
<p><strong>Tish:</strong> And then that other biggy &#8211; augmented reality search! I am asking this next question of <a href="http://www.wikitude.org/" target="_blank">Wikitude</a> and <a href="http://sekaicamera.com/" target="_blank">Sekai Camera</a> too, and now I must also ask <a href="http://www.acrossair.com/" target="_blank">Acrossair</a> and several others I guess! Obviously a huge area of opportunity in this broader landscape that uses location-awareness, barcode scanners, image recognition and augmented reality is to harness the collective intelligence &#8211; a whole new field of search. There is the beginning of a discussion on this <a href="http://www.ugotrade.com/2009/08/19/everything-everywhere-thomas-wrobels-proposal-for-an-open-augmented-reality-network/" target="_blank">in the comments here</a>.</p>
<p>What will it take, in your view, to become a leader in augmented reality search?</p>
<p><strong>Brian: I&#8217;m more of a content guy, so I tend to focus on things like UI, quality of creative, etc. From that perspective, I am looking forward to evolving beyond the &#8220;post-it&#8221; text overlay user-experience we see now in AR search. I was impressed with the TAT Augmented ID concept and hope we start seeing more smart design solutions like that emerging in the space. There are some great new design approaches coming out of the location-aware space that should be applied to AR search. I&#8217;ve been studying the heads-up display designs being used in video games, and re-watching movies like Iron Man for ideas. This is another example where Hollywood has painted a polished picture of what AR can and should look like, and the masses have already accepted these design approaches. So from that perspective, the leaders in search will be delivering sexy, smart and simple solutions. It&#8217;s all about the S&#8217;s.</strong></p>
]]></content:encoded>
			<wfw:commentRss>http://www.ugotrade.com/2009/08/30/games-goggles-and-going-hollywood-how-ar-is-changing-the-entertainment-landscape-talking-with-brian-selzer-ogmento/feed/</wfw:commentRss>
		<slash:comments>7</slash:comments>
		</item>
		<item>
		<title>Mobile Augmented Reality and Mirror Worlds: Talking with Blair MacIntyre</title>
		<link>http://www.ugotrade.com/2009/06/12/mobile-augmented-reality-and-mirror-worlds-talking-with-blair-macintyre/</link>
		<comments>http://www.ugotrade.com/2009/06/12/mobile-augmented-reality-and-mirror-worlds-talking-with-blair-macintyre/#comments</comments>
		<pubDate>Fri, 12 Jun 2009 05:07:01 +0000</pubDate>
		<dc:creator><![CDATA[Tish Shute]]></dc:creator>
				<category><![CDATA[Ambient Devices]]></category>
		<category><![CDATA[Ambient Displays]]></category>
		<category><![CDATA[Augmented Reality]]></category>
		<category><![CDATA[Instrumenting the World]]></category>
		<category><![CDATA[internet of things]]></category>
		<category><![CDATA[mirror worlds]]></category>
		<category><![CDATA[Mixed Reality]]></category>
		<category><![CDATA[MMOGs]]></category>
		<category><![CDATA[mobile meets social]]></category>
		<category><![CDATA[new urbanism]]></category>
		<category><![CDATA[online privacy]]></category>
		<category><![CDATA[Smart Planet]]></category>
		<category><![CDATA[social gaming]]></category>
		<category><![CDATA[social media]]></category>
		<category><![CDATA[ubiquitous computing]]></category>
		<category><![CDATA[Virtual Realities]]></category>
		<category><![CDATA[Virtual Worlds]]></category>
		<category><![CDATA[Web 2.0]]></category>
		<category><![CDATA[Web Meets World]]></category>
		<category><![CDATA[3D mirror world]]></category>
		<category><![CDATA[Android]]></category>
		<category><![CDATA[Android and augmented reality]]></category>
		<category><![CDATA[ARhrrrr]]></category>
		<category><![CDATA[Art of Defense]]></category>
		<category><![CDATA[augmented reality on the gphone]]></category>
		<category><![CDATA[augmented reality on the iphone]]></category>
		<category><![CDATA[augmented reality shooter games]]></category>
		<category><![CDATA[Aware Home Research]]></category>
		<category><![CDATA[Blair Macintyre]]></category>
		<category><![CDATA[Bragfish]]></category>
		<category><![CDATA[Dark Star]]></category>
		<category><![CDATA[geolocation]]></category>
		<category><![CDATA[geotagging]]></category>
		<category><![CDATA[google earth]]></category>
		<category><![CDATA[handheld AR games]]></category>
		<category><![CDATA[handheld augmented reality]]></category>
		<category><![CDATA[Immersive augmented reality]]></category>
		<category><![CDATA[Information Landscapes]]></category>
		<category><![CDATA[instrumented homes]]></category>
		<category><![CDATA[instrumented world]]></category>
		<category><![CDATA[iphone 3Gs]]></category>
		<category><![CDATA[iphone games]]></category>
		<category><![CDATA[ISMAR]]></category>
		<category><![CDATA[ISMAR 2009]]></category>
		<category><![CDATA[location aware applications]]></category>
		<category><![CDATA[minimally immersive augmented reality]]></category>
		<category><![CDATA[MMO of the real world]]></category>
		<category><![CDATA[mobile augmented reality]]></category>
		<category><![CDATA[MS Virtual Earth]]></category>
		<category><![CDATA[NVidia Tegra devkits]]></category>
		<category><![CDATA[Open Sim]]></category>
		<category><![CDATA[OpenSim and Augmented Reality]]></category>
		<category><![CDATA[Ori Inbar]]></category>
		<category><![CDATA[outdoor tracking and markerless AR]]></category>
		<category><![CDATA[parallel mirror worlds]]></category>
		<category><![CDATA[persistent immersive mirror worlds]]></category>
		<category><![CDATA[photosynth]]></category>
		<category><![CDATA[Sun's Wonderland]]></category>
		<category><![CDATA[Texas Instrument's OMAP3 devkits]]></category>
		<category><![CDATA[the shape of alpha]]></category>
		<category><![CDATA[ubicomp]]></category>
		<category><![CDATA[Unity3D]]></category>
		<category><![CDATA[Unity3D and Augmented Reality]]></category>
		<category><![CDATA[virtual pets]]></category>
		<category><![CDATA[Wikitude]]></category>

		<guid isPermaLink="false">http://www.ugotrade.com/?p=3691</guid>
		<description><![CDATA[Blair MacIntyre is one of the original pioneers ofÂ  augmented reality and an extraordinary amount of creative work is coming out of his Augmented Environments Laboratory at Georgia Tech &#8211; see YouTube videos here.Â  The screenshot below is from, ARhrrrr, a very impressive augmented reality shooter game created at Georgia Tech Augmented Environments Lab and [&#8230;]]]></description>
				<content:encoded><![CDATA[<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/06/arf.jpg"></a></p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/06/arf2.jpg"><img class="alignnone size-full wp-image-3732" title="arf2" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/06/arf2.jpg" alt="arf2" width="259" height="239" /></a><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/06/droppedimage1.jpg"><img class="alignnone size-full wp-image-3725" title="droppedimage1" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/06/droppedimage1.jpg" alt="droppedimage1" width="271" height="240" /></a></p>
<p><a href="http://www.cc.gatech.edu/~blair/home.html" target="_blank">Blair MacIntyre</a> is one of the original pioneers of augmented reality and an extraordinary amount of creative work is coming out of his <a href="http://www.cc.gatech.edu/ael/" target="_blank">Augmented Environments Laboratory</a> at Georgia Tech &#8211; see <a href="http://www.youtube.com/user/AELatGT" target="_blank">YouTube videos here</a>. The screenshot below is from <strong>ARhrrrr</strong>, a very impressive augmented reality shooter game created at the Georgia Tech <span class="description">Augmented Environments Lab </span>and the <span class="description">Savannah College of Art and Design </span>(SCAD-Atlanta), and produced on the <strong>NVidia Tegra devkits</strong> &#8211; <a href="http://www.youtube.com/watch?v=cNu4CluFOcw" target="_blank">watch the demo here</a>.</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/06/picture-63.png"><img class="alignnone size-medium wp-image-3799" title="picture-63" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/06/picture-63-300x169.png" alt="picture-63" width="300" height="169" /></a></p>
<p>Blair has spent much of his career working on immersive augmented reality and more recently the integration of augmented reality with mirror worlds. Blair explains:</p>
<p><strong>&#8220;I am interested in the intersection of mobile devices &#8211; whether they are head mounts or handhelds &#8211; and parallel mirror worlds&#8230; I think that parallel mirror worlds are a direct manifestation of the intersection of the virtual world we now live in (the web) and geotagging. As more and more information is tied to place, and as more of our searching becomes place-based, we will want to do those searches about places we are not at. A 3D mirror world may provide one interface to that data. Want to plan your trip to London? Go there virtually and look around, see what is there (both physically and virtually), teleport between areas you want to learn about, and so on. More interestingly, talk to people who are there now, and retrieve your location-based notes when you are on your trip.&#8221;</strong></p>
<p>But, at a time when many augmented reality developers are focusing on AR apps for smart phones, including Blair (the picture on the left opening this post is from Blair&#8217;s augmented reality <a href="http://www.youtube.com/watch?v=_0bitKDKdg0&amp;feature=channel_page" target="_blank">iPhone app ARf</a>), I was interested in finding out from Blair what the state of play was for the real deal Rainbow&#8217;s End style AR, as well as the potential he sees in smart phones to mediate meaningful AR experiences.</p>
<p>There is an enormous amount of innovation in mapping our world &#8211; see my post, <a href="http://www.ugotrade.com/2009/06/02/location-becomes-oxygen-at-where-20-wherecamp/" target="_blank">&#8220;Location Becomes Oxygen at Where 2.0 and WhereCamp,&#8221;</a> and <a href="http://gamesalfresco.com/2009/05/26/where-2-0-the-world-is-mapped-now-use-it-to-augmented-our-reality/" target="_blank">Ori Inbar&#8217;s Where 2.0 conference roundup</a>. But as Ori notes, to move augmented reality forward:</p>
<p><strong>My point is not a shocker: all we need is to tap into this information and bring it, in context, into people&#8217;s field of view.</strong></p>
<p>And this is what Blair MacIntyre&#8217;s work is all about.</p>
<h3>Talking With Blair MacIntyre</h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/06/picture-62.png"><img class="alignnone size-medium wp-image-3728" title="picture-62" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/06/picture-62-300x257.png" alt="picture-62" width="300" height="257" /></a></p>
<p><strong>Tish Shute:</strong> There do seem to be broader implications to augmented reality today than when this term was first coined. I am interested to have your perspective on how augmented reality may go beyond some of our early definitions?</p>
<p><strong>Blair MacIntyre: I still think the original definition of the term is useful: media (typically graphics) tightly registered (aligned) with the physical world, in real time. Many people talk about many things that relate virtual worlds to places, spaces, objects and people. There is room for many of them, and they don&#8217;t all have to &#8220;be&#8221; augmented reality. I like using Milgram&#8217;s definition of Mixed Reality as everything from the physical world (at one end) to the virtual world at the other; it&#8217;s a spectrum, and augmented reality just sits at one point.</strong></p>
<p><strong>The reason I like the old definition is I believe there is something special about graphics that are tightly, rigidly aligned with the physical world. When things appear to stick to the world, and an obviously identifiable location, people can start leveraging their natural perceptual, physical and social abilities and interact with the mixed world as they do the physical world. We&#8217;ve found this with the two studies we&#8217;ve done of tabletop AR games (<a href="http://www.augmentedenvironments.org/lab/research/handheld-ar/artofdefense/" target="_blank">Art of Defense</a> and <a href="http://www.youtube.com/watch?v=w3iBrj_zfTM&amp;feature=channel_page" target="_blank">Bragfish</a>); one key to those games is that the graphics were tightly aligned with identifiable landmarks in the physical world (gameboard).</strong></p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/06/aod-sandbox-video-15.png"><img class="alignnone size-medium wp-image-3729" title="aod-sandbox-video-15" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/06/aod-sandbox-video-15-300x225.png" alt="aod-sandbox-video-15" width="300" height="225" /></a><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/06/imgp0782-2.jpg"><img class="alignnone size-medium wp-image-3782" title="imgp0782-2" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/06/imgp0782-2-300x225.jpg" alt="imgp0782-2" width="300" height="225" /></a></p>
<p><em><a href="http://www.augmentedenvironments.org/lab/research/handheld-ar/artofdefense/" target="_blank">Art of Defense</a> (pic on left) <a href="http://www.youtube.com/watch?v=w3iBrj_zfTM&amp;feature=channel_page" target="_blank">Bragfish</a> (pic on right)<br />
</em></p>
<p><strong>Tish:</strong> I know that you are involved with <a id="b-c6" title="ISMAR 2009" href="http://www.ismar09.org/" target="_blank">ISMAR 2009</a>, which is the key US augmented reality conference. What do you think will be the hot themes, applications, and innovations at this year&#8217;s conference? Do you think this will be the year that AR really breaks out of eye candy into truly useful and sustained experiences?</p>
<p><strong>Blair: Unfortunately, I won&#8217;t be involved this year. I was supposed to be helping run the technical program, as well as the art/media program, but sickness in my family prevented me from having the time, so I am not helping this year.</strong></p>
<p><strong>First, I would not agree with the implication of the last question &#8212; I don&#8217;t think AR has just been eye candy up to now. I do agree that the &#8220;high profile&#8221; uses of it have largely been that, which is mostly because of the limits of the technology. I don&#8217;t think we&#8217;ll see huge changes in that regard by ISMAR this year. However, we will hopefully see a mixing of communities that hasn&#8217;t happened at ISMAR before, and I do believe that this year (independent of ISMAR) we will see more and more AR apps. Whether they go beyond eye candy is still a question. I&#8217;m hoping that some folks (including myself and other ISMAR folks!) will help push AR in new directions. But I also expect many folks new to ISMAR and AR to play a big role, because it is this new blood, especially those folks with real problems to solve, new art and game ideas, and a fresh perspective, that will open new doors.</strong></p>
<p><strong>Tish:</strong> You have been working on integrating augmented reality with virtual worlds. You mentioned that the way you use <a href="https://lg3d-wonderland.dev.java.net/" target="_blank">Sun&#8217;s Wonderland</a> is really about pulling the virtual world into the real world, i.e., Wonderland &#8220;is just a place to put data.&#8221; How is your use of the persistent virtual space different from what we have become accustomed to call virtual worlds?</p>
<p><strong>Blair: The approach we are taking in our project at Georgia Tech is to use the virtual world as the central hub of the information space, and allow the virtual world to be the element that enables distributed workers to collaborate more smoothly. This is work we are doing with Sun and Steelcase (and the NSF), and is an outgrowth of a project (the InSpace project) that&#8217;s been going on for a few years.</strong></p>
<p><strong>What we are trying to do is use mixed reality and ubicomp techniques to pull as much of the physical activity into the virtual world, and then reflect that activity back out to the different participants as best suits their situation. So, folks in highly instrumented team rooms will collaborate in one way, and their activity will be reflected in the virtual world; remote participants (e.g., those at home, or in a cafe or hotel) may control their virtual presence in different ways, but the presence of all participants will be reflected back out to the other sides in analogous ways. We may see ghosts of participants at the interactive displays, or hear their voices in 3D space around us; everyone will hopefully be able to manipulate content on all displays and tell who is making those changes.</strong></p>
<p><strong>A secondary benefit, I hope, is that by putting the data in the virtual world and making that the place that gives you more powerful and flexible access to the data (e.g., by leveraging space and giving access to history), distributed teams will begin to have the virtual space become a place they go to work, bump into each other and have those casual contacts co-located workers take for granted.</strong></p>
<h3><strong>Creating the Information Landscape of the Future</strong></h3>
<p><strong>Tish: </strong>At the end of <a href="http://www.ugotrade.com/2009/05/06/composing-reality-and-bringing-games-into-life-talking-with-ori-inbar-about-mobile-augmented-reality/" target="_blank">my interview with Ori Inbar</a> he said that, in order to have a ubiquitous experience, <em>&#8220;you&#8217;ll need to 3d map the world. Google earth like apps are going to help but it is not going to be sufficient. So let&#8217;s leverage people. Google became successful in part by making people work with them. Each time you create a link from your blog to my blog their search engines learn from it. So let&#8217;s find ways to make people create information that can be used for AR.&#8221;</em> What ways do you think people can create information that can be used for AR?</p>
<p><strong>Blair: I think the big part of that is the creation of models and environments, the necessary &#8220;baseline&#8221; for specifying experiences. Google and Microsoft are clearly working toward this; recent videos from Microsoft show them starting to move the photosynth work toward Virtual Earth. Similarly, I came across a page where people are finally starting to mine geotagged Flickr images [see my post, <a href="http://www.ugotrade.com/2009/06/02/location-becomes-oxygen-at-where-20-wherecamp/" target="_blank">&#8220;Location Becomes Oxygen,&#8221;</a> and <a href="http://www.ugotrade.com/2009/05/17/creating-the-information-landscapes-of-the-future-locative-media-and-the-shape-of-alpha/" target="_blank">here</a> for more on the <a href="http://code.flickr.com/blog/2008/10/30/the-shape-of-alpha/" target="_blank">&#8220;The Shape of Alpha&#8221;</a> project from Flickr] to create models. It&#8217;s that kind of thing that will be useful first; using the data we all create to enable modeling and (eventually) vision-based tracking in the real world.</strong></p>
<p><strong>After that, it&#8217;s a matter of time till more of what we &#8220;create&#8221; (e.g., Tweets and blog posts and so on) is all geo-referenced; these will become the information landscape of the future, the kinds of things people think about when they read &#8220;Rainbow&#8217;s End&#8221;. The big problem will be filtering, searching and sorting. And, of course, safety and security.</strong></p>
<p><strong>Tish: </strong>You are working with <a href="http://unity3d.com/" target="_blank">Unity3D</a> to research the integration of mobile location based AR with persistent mirror world like spaces. What has attracted you to Unity? What is the difference between this and your Wonderland project? I know you mentioned you will be using head-mounted displays as part of this Unity project. What are your goals for this project?</p>
<p><strong>Blair:</strong> <strong>We started to use <a href="http://unity3d.com/" target="_blank">Unity3D</a> because it gave us what we wanted in a game engine. Most importantly, it&#8217;s very open and lets us trivially expose AR technologies into the editor. Similarly, it can target the iPhone, so we can begin to work with it on that platform, too. The biggest problem with creating compelling experiences is content; and a show stopper for creating content is not getting it into your engine. Unity has a nice content workflow.</strong></p>
<p><strong>Unity3D is a front end engine, for creating the game; Wonderland is both a front end and a backend. We are actually looking into using the Wonderland backend with Unity as well. Wonderland also has growing support for doing &#8220;real work&#8221; in a virtual world, which is key to our other projects.</strong></p>
<p><strong>Eventually, we&#8217;ll be using HMDs. The goal for the Unity3D project, initially, was to explore what you can do with an AR/VR mirror-world; this is a project we are working on with Alcatel-Lucent, and demo&#8217;d at CTIA this year. It&#8217;s continuing to grow, though, and now includes a number of our projects, including some work on mobile social AR and, soon, some performance and experience design projects in the area of AR ARGs. It&#8217;s really quite interesting to imagine what you can do when you have an &#8220;MMO of the real world&#8221; (which we now have for part of campus) that supports VR-style desktop access simultaneously with mobile AR access.</strong></p>
<p><strong>Tish: </strong>Have you taken another look at <a href="http://opensimulator.org/wiki/Main_Page" target="_blank">OpenSim</a> as a possible backend for augmented reality? Recently I talked to David Levine of IBM, and he is thinking about some possibilities to optimize OpenSim to dynamically load a large number of objects at once (i.e., how fast OpenSim can bulk-load into an existing sim) and make it better suited to augmented reality/mirror world type projects.</p>
<p><strong>Blair: I haven&#8217;t looked at OpenSim recently. We will probably look at it this summer.</strong></p>
<p><strong>Tish:</strong> Why did you select Unity as a good client for augmented reality?</p>
<p><strong>Blair: Unity is a 3D game authoring environment, so at some level it is no different from using Ogre, if all the associated stuff were just as well done. It has integrated physics, scripting, debugging, etc. &#8211; you can write code in JavaScript or C# or whatever. It has a good content pipeline as well, and supports a range of platforms.</strong></p>
<p><strong>It has simple networking built in, so multiple Unity engines can talk to each other, but it is not a virtual world platform out of the box &#8211; there is no back end &#8230;</strong></p>
<p><strong>Tish: </strong>Someone described Unity to me as a great client waiting for a great backend. So what are you going to use as a back end?</p>
<p><strong>Blair: There is no real processing except in the client right now. We will eventually have to create a back end. We are thinking of using Darkstar, because someone on the Sun Wonderland community forums has already built a set of scripts connecting Unity to Darkstar.</strong></p>
<p><strong>But for us, we are not proposing right now to build a real product. This is research to demonstrate what you could do if you actually had the back end.</strong></p>
<p><strong>Tish:</strong> What are the most important aspects of the backend from your POV?</p>
<p><strong>Blair: We want to simulate a variety of the interesting aspects of the back end. So I very much care about notions of privacy and security and how these sorts of AR/VR Mirror Worlds would work in practice. But I care about those things as they impact user experience, not really about how we would actually implement them.</strong></p>
<p><strong>Tish:</strong> So you are looking at some of the big problems from the perspective of user experience? Are we going to go through the same growing pains that the web and VWs have seen &#8211; for example, will we have to type in passwords to get into everyone&#8217;s little worlds&#8230;.</p>
<p><strong>Blair: Well you know the SciFi background to this; you&#8217;ve mentioned it in other posts on your blog. Because when you look at the Rainbow&#8217;s End model, where you have security certificates flying around, that is in effect what cookies and so on are now. You can authenticate yourself once and then have those certificates hang around. So you can easily imagine how it could be done. But the big question is how that changes the user experience. There are all kinds of things that start coming into play &#8211; like what happens if nearby people see different things &#8211; it goes on and on!</strong></p>
<p><strong>Tish:</strong> Sounds like this is very valuable research. It seems to me that there will be a lot of investment soon in putting the pieces together to do location-based markerless AR, and it would be nice if we knew more about it from the user experience POV.</p>
<p>Isn&#8217;t it vital for a productive intersection between mobile AR and persistent mirror world spaces for us to have markerless AR? Aren&#8217;t we right at the beginning of people really saying, yeah, markerless AR is doable now? But it seems to me not many people are researching or working on fully immersive AR and its integration with mirror worlds?</p>
<p><strong>Blair: I think some of the AR community is thinking about this. There are probably people doing stuff in some other, non-technical communities. It wouldn&#8217;t surprise me to find out that people in the digital performance or Ars Electronica world are thinking a little bit about these sorts of things. Although not necessarily at the level of actually trying to build it, because they probably can&#8217;t right now. But experimenting with the precursors. My colleagues in digital media like to point out that this is often the purpose of digital art, to point out new directions and push the boundaries.</strong></p>
<p><strong>Obviously Science Fiction has explored the possibilities because that is what Rainbow&#8217;s End and the Matrix were all about.</strong></p>
<p><strong>Tish:</strong> and <a href="http://en.wikipedia.org/wiki/Denn%C5%8D_Coil" target="_blank">Denno Coil</a>&#8230;</p>
<p><strong>Blair: There has been some research &#8211; people like my adviser Steve Feiner up at Columbia, Mark Billinghurst in New Zealand, myself, and people at Graz University in Austria. But partly it has been so hard to do mobile AR up to now &#8211; so many people mock head-worn displays and can&#8217;t get past current technology &#8211; you have had to be willing to ignore the bulky backpacks and cables and batteries and so on. That is changing, which is good.</strong></p>
<p><strong>My current response to the anti-head-mounted-display people is: if five years ago you had told me that fabulously dressed people who care about their looks and wear stylish clothes would have big things hanging from their ears that blink bright blue light, so they could talk on the phone, many of us would have said you were crazy, because it would be ugly and so on. But there is an intersection of demonstrable need and benefit &#8211; Bluetooth headsets are really useful &#8211; and the sort of early gestalt feeling that grew up around them &#8211; that people who use them are so important that they always have to be in touch &#8211; so people accept them.</strong></p>
<p><strong>It will likely be a similar thing with head mounted displays. And I don&#8217;t know if it will be that people wear them so that they can read their mail while driving, god forbid. But it will be something. And when we get the 2nd generation of the wrap glasses that look more like sunglasses and are not bulky and so on, we will have the potential for them catching on, because you will look at them and you will think that the person is wearing them because they are doing x&#8230;</strong></p>
<p><strong>X might be surfing a virtual world or reading their email or keeping in touch, or being aware. It will happen. But they have to get unbulky enough, and there has to be more than one important application, not just watching TV.</strong></p>
<p><strong><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/06/karmablair-fix.jpg"><img class="alignnone size-medium wp-image-3787" title="karmablair-fix" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/06/karmablair-fix-300x227.jpg" alt="karmablair-fix" width="300" height="227" /></a><br />
</strong></p>
<p><em>Picture above shows an outside view of the KARMA AR system, the knowledge-based maintenance system Blair built in his first year of grad school (<strong>&#8220;the first AR system Steve Feiner, Doree Seligmann, and I worked on&#8221;</strong>). Blair noted, &#8220;<strong>The Communications of the ACM paper on it (from 1993) is a pretty widely cited AR paper.&#8221;</strong></em></p>
<p><strong>Tish:</strong> I think the need for full-on transparent, immersive, wraparound, Gucci-stylish eyewear with a decent field of view is the elephant in the room in terms of realizing the full potential of augmented reality. There are a few new players in the field &#8211; <a href="http://www.sbglabs.com/" target="_blank">Digilens</a>, <a href="http://www.vuzix.com/home/index.html" target="_blank">Vuzix</a>, others? What is the progress in this area, and what do you hope for in terms of near-term solutions?</p>
<p><strong>Blair: I agree with that sentiment. I think that, in the near term, there is a lot we can do with handhelds, as we&#8217;ve been doing in the lab. However, because it&#8217;s awkward and tiring to hold up a device, even a small one, for any length of time, handhelds will only be good for &#8220;focused&#8221; uses of AR, such as the table-top games we&#8217;ve been doing, or the constellation viewing app that I heard came out recently for the Android G1. I don&#8217;t even see something like Wikitude as that compelling (beyond the &#8220;gee whiz&#8221; factor) for a handheld form factor. Many proposed AR apps only really become compelling when users have constant awareness of them, and that requires a see-through head-worn display.</strong></p>
<p><strong>I&#8217;ve seen the mockups of the Vuzix ones; they seem pretty interesting, and are getting to where early adopters could use them (they will be cheap enough, and will hopefully be good enough). Microvision&#8217;s virtual retinal display is also promising; the contact lens displays will be the most interesting, if anyone can ever make them work. I don&#8217;t know of anything else out there.</strong></p>
<h3><strong>&#8220;it&#8217;s not really a killer app you care about, it is the killer existence that all of the technology and small applications taken together facilitate&#8221;</strong></h3>
<p><strong>Tish:</strong> While location based services are accepted now and people are understanding that it is something that opens up a new relationship to everything, we still haven&#8217;t found the experience that will get everyone holding up their mobile devices?</p>
<p><strong>Blair: Well that is actually the killer problem. Gregory Abowd is one of my colleagues who does ubiquitous computing research here at Tech. Way back when we started the Aware Home project (<a href="http://www.awarehome.gatech.edu/">Aware Home Research Institute at Georgia Tech</a>), when I first got here about ten years ago, there was always this question of what is the killer app. So Gregory commented in a meeting once that it&#8217;s not really a killer app you care about, it is the killer existence that all of the technology and small applications taken together facilitate. It is not that any one of these AR demos we see &#8211; whether it is seeing your photos in the world or whatever &#8211; is important. It&#8217;s that, taken together, there is enough of a benefit that you would use the whole environment.</strong></p>
<p><strong>In the original context we were talking about an instrumented home, but it is the same thing here with AR.</strong></p>
<p><strong>The problem with the mobile phone as an AR device is that problem of awareness. If I have a head mount on and I walk down the street and there is a bunch of probably-not-useful-but-potentially-useful information floating by me, that&#8217;s a good thing, because I may see something that is useful or makes me think of something else. But if I have to hold up my phone to see if something might be interesting nearby, I will never hold up my phone, because at any given time there is a high probability that there won&#8217;t be anything particularly important there. You might imagine you can get around this by using alerts or something like that, but then you overload whatever alert channel you use. For example, I forward maybe 5 or 6 people&#8217;s updates from Facebook to my phone &#8211; it started with my wife, a few friends, my brother &#8211; and the net result is I never get SMSs anymore, because when my phone buzzes, I usually ignore it; it is probably just somebody&#8217;s random Facebook update. So if we start overloading channels like that with &#8220;oh, there might be something useful here in the real world; if you pick up the phone and look through it you will see it &#8230; and I will buzz you,&#8221; people just start ignoring the buzzes.</strong></p>
<p><strong>So it is a very hard problem if you think about the kinds of applications that people always imagine with global AR &#8212; names over people&#8217;s heads and other random information floating in the world &#8212; until you have a head mount and all that information is around you all the time. That is when those sorts of applications will actually happen.</strong></p>
<p><strong>Tish:</strong> <a href="http://curiousraven.squarespace.com/" target="_blank">Robert Rice</a> notes: <strong>&#8220;AR is inherently about who YOU are, WHERE you are, WHAT you are doing, WHAT is around you, etc.&#8221;</strong> (see my interview with Robert, <a href="http://www.ugotrade.com/2009/01/17/is-it-%E2%80%9Comg-finally%E2%80%9D-for-augmented-reality-interview-with-robert-rice/" target="_blank">&#8220;Is it &#8216;OMG Finally&#8217; for Augmented Reality?&#8221;</a>). And I think the iphone experience has laid the foundation for the increasing desire to experience the network wherever we are &#8211; and not be stuck behind a PC. We cannot perhaps do all we want to do yet. But even in the range of things we can do now, we are not even sure exactly what it is we want to do where yet, is it?</p>
<h3><strong>&#8220;imagine your iphone Facebook client supports AR and that all data on Facebook might be georeferenced &#8211; pictures, status updates etc&#8230;&#8230;.&#8221;</strong></h3>
<p><strong>Blair: Yes that is a huge problem. I have been lucky to be able to teach two fun classes this year that let the students and me start to explore some of the potential that handheld AR might bring. Last fall I taught a handheld AR game design class &#8212; coordinated with a class at the Savannah College of Art and Design&#8217;s Atlanta campus &#8212; and we had the students build a sequence of prototype handheld AR games, which was a lot of fun. This spring I taught a mixed reality/augmented reality design class with Jay Bolter (a professor in the School of Literature, Communication, and Culture here at GT). Jay and I have been teaching this class off and on for about 9 years; this semester we decided to say to the students &#8220;imagine your iphone Facebook client supports AR and that all data on Facebook might be georeferenced &#8211; pictures, status updates etc&#8230;&#8230;.&#8221; and have them do projects aimed at such an environment.</strong></p>
<p><strong>Tish: </strong>Not many of our favorite social media today have much sense of location, do they? But Flickr is utilizing geo-referenced pictures to create vernacular maps&#8230; The Shape of Alpha.</p>
<p><strong>Blair: Yes, that is because lots of cameras put geolocation data into the EXIF data, so they can extract it&#8230;</strong></p>
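<p><em>As a rough illustration of the EXIF extraction Blair mentions: cameras store GPS coordinates in EXIF as degrees/minutes/seconds plus a hemisphere reference, and services like Flickr convert them to decimal degrees. A minimal, hedged Python sketch of that conversion (the sample coordinates are invented for illustration; real extraction would read the GPSInfo block with a library such as Pillow, not shown here):</em></p>

```python
def dms_to_decimal(dms, ref):
    """Convert an EXIF-style (degrees, minutes, seconds) triple plus a
    hemisphere reference ('N'/'S'/'E'/'W') into signed decimal degrees."""
    degrees, minutes, seconds = dms
    value = degrees + minutes / 60.0 + seconds / 3600.0
    # Southern and western hemispheres are negative in decimal notation.
    return -value if ref in ("S", "W") else value

# Hypothetical tag values, as a photo's EXIF GPS block might report them.
lat = dms_to_decimal((40, 44, 54.36), "N")   # roughly 40.7484
lon = dms_to_decimal((73, 59, 8.36), "W")    # roughly -73.9857
```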
<p><strong>Some mobile Twitter clients, like the one I use on my iphone, will let you add your location. But in general Facebook and other sites don&#8217;t have any notion of location. But if you look at all the things people do in Facebook, such as sending gifts and other games, it&#8217;s easy to imagine what these might look like with geo-referenced data. So, the high-level project for the class is the groups have to design experiences people might have using mobile AR Facebook. We told them to assume Facebook as it stands now, but add geolocation and AR to the client. The class boiled down to &#8220;What would you imagine people doing?&#8221; So it has been kind of fun.</strong></p>
<p><strong>And we are using Unity for the class too &#8211; the same infrastructure I am working on in my research linking mobile AR to persistent immersive mirror-world-type spaces &#8211; and we are having the students mock up what a mobile AR Facebook experience would be like.</strong></p>
<p><strong>Tish: </strong>Can you describe some of the ideas your class came up with that you think have potential? I know Ori mentioned that from the games class he liked <a href="http://www.youtube.com/watch?v=Rqcp8hngdBw&amp;feature=channel_page" target="_blank">Candy Wars.</a></p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/06/candywars-6.png"><img class="alignnone size-medium wp-image-3693" title="candywars-6" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/06/candywars-6-300x225.png" alt="candywars-6" width="300" height="225" /></a></p>
<p><em>Candy Wars</em></p>
<p><strong>Blair: In the end, they had a nice range of projects in the Spring class. One created tag clouds out of status messages over spaces, others looked at analogies to virtual pets and gift giving out in the world, one looked at leveraging geolocation to help with crowd-sourced cultural translation, and three groups did straight-up social games.</strong></p>
<p><strong>[See <a href="http://www.youtube.com/user/AELatGT" target="_blank">all of the projects from the handheld AR games class on YouTube here</a>]</strong></p>
<h3><strong>iphone, Android, NVidia Tegra devkits, or Texas Instruments OMAP3 devkits?</strong></h3>
<p><strong>Tish:</strong> Is anyone in the class working on Android?</p>
<p><strong>Blair: Nobody is using Android because no one in the class has the phones. We have AT&amp;T microcell infrastructure on campus. Some AT&amp;T people joke that we are better off than they are, because we have a head office on campus, so we can build in-network applications that even people at AT&amp;T research can&#8217;t. And because we have this infrastructure on campus, and a great relationship with AT&amp;T and the other sponsors, we have the ability to provision our own phones without having to pay for long-term contracts, which is vital for research and teaching.</strong></p>
<p><strong>Tish:</strong> So does this lock you into the iphone?</p>
<p><strong>Blair: Well the G1 is of course not AT&amp;T but it is GSM so we could probably buy them unlocked and put them on our AT&amp;T network. But the students I work with are much more interested in the iphone right now.</strong></p>
<p><strong>Tish:</strong> Is that because the iphone has the market?</p>
<p><strong>Blair: For me the reason I am not interested in the G1 is because you can&#8217;t do AR on it &#8211; there is <a href="http://www.mobilizy.com/wikitude.php" target="_blank">Wikitude</a> and a few other apps, but it is all hideously slow. Worse, because the Java code isn&#8217;t compiled like it would be on the desktop, you can&#8217;t do computer vision with it, so you can&#8217;t do anything particularly interesting on the current commercial G1s. We could probably take the NVidia Tegra devkits or the Texas Instruments OMAP3 devkits (both are chipsets for next gen phones &#8212; high end graphics, fast processing), and install Android on those, and we may actually do that yet. But it seems like a lot of work right now, for not much benefit.</strong></p>
<p><strong><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/06/pastedgraphic.jpg"><img class="alignnone size-medium wp-image-3730" title="pastedgraphic" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/06/pastedgraphic-300x166.jpg" alt="pastedgraphic" width="300" height="166" /></a><br />
</strong></p>
<p><em>Augmented Reality shooter game <strong>ARrrrr</strong> from<strong> </strong></em><em>Georgia Tech and SCAD Atlanta on the <strong>NVidia Tegra devkits</strong></em><em> &#8211; <a href="http://www.youtube.com/watch?v=cNu4CluFOcw" target="_blank">watch the demo on YouTube here</a></em><em>. </em><strong> </strong></p>
<p><strong>Tish: </strong>Everyone seems very excited about the iphone OS 3.0 and the addition of a compass. A compass is pretty essential for AR, right?</p>
<p><strong>Blair: It is necessary if you can&#8217;t do other forms of outdoor tracking, but the problem is that the compass on the G1 isn&#8217;t very good, relatively speaking, and the iPhone one probably won&#8217;t be much better. It does not have very high accuracy, nor is it very fast (compared to, say, the high end 3D orientation sensors we use, from Intersense and MotionNode). As far as I can tell, it doesn&#8217;t even give full 3D orientation. I don&#8217;t have a G1 (although I have pre-ordered an iPhone 3Gs), but people have told me it only has absolute 2D orientation, so you can only line things up if you are careful. You can&#8217;t look around arbitrarily&#8230;</strong></p>
<p><strong>Tish: </strong>You can&#8217;t sweep your phone?</p>
<p><strong>Blair: You can look left and right, but if it doesn&#8217;t have full 3D orientation, you can&#8217;t go up and down. You can&#8217;t tilt it in weird directions. It is not fast enough to let you look around quickly. So it is a nice demo. And it is good for what the Android people use it for, which is to let you browse Google Street View by looking around, which is actually really useful.</strong></p>
<p><strong>I think there are lots of really useful things you can do with such a compass.</strong></p>
<p><strong>And, it is clear that a compass is a necessary feature if we want to do AR. It&#8217;s just not sufficient.</strong></p>
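<p><em>To make concrete why even a 2D compass plus GPS is enough for Wikitude-style overlays, here is a minimal, hedged sketch: compute the bearing from the user to a point of interest, then compare it with the device heading to place a label on screen. All names and the 60&#176; field of view are illustrative assumptions, not any shipping API:</em></p>

```python
import math

def bearing_to(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing, in degrees clockwise from true north,
    from the observer (lat1, lon1) to a point of interest (lat2, lon2)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(x, y)) % 360

def screen_offset(poi_bearing, device_heading, fov=60.0):
    """Horizontal position of the POI label as a fraction of screen width
    (0.5 = centre), or None when the POI is outside the camera's field of view."""
    # Signed angular difference, wrapped into [-180, 180).
    delta = (poi_bearing - device_heading + 180) % 360 - 180
    if abs(delta) > fov / 2:
        return None
    return 0.5 + delta / fov
```

<p><em>With only a 2D compass this places labels left/right; full 3D orientation would be needed to move them up and down as you tilt the phone, which is exactly the limitation Blair describes.</em></p>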
<h3><strong>Outdoor Tracking and Markerless AR<br />
</strong></h3>
<p><strong>Tish:</strong> Isn&#8217;t it essential for markerless AR? I guess not &#8211; I just saw this post about <a href="http://artimes.rouli.net/2009/04/srengine-in-english.html" target="_blank">SREngine on Augmented Times</a>!</p>
<p>This wasn&#8217;t up when we spoke so perhaps you have some comments about what it brings to the table?</p>
<p><strong>Blair: Maybe. The folks at Nokia are working on outdoor tracking; they demoed some stuff at ISMAR last year on the N95 handsets that is all image-based. We are trying to do some work with them; one of my students is working on it. And probably Microsoft is going to do more on this as well &#8211; they had a video up showing that they are also working on vision-based techniques. If you give the phone the equivalent of those panoramic Google Street View images (assuming they are up-to-date) and you are standing at the right place, you don&#8217;t really need a compass; you can figure out which way you are looking by looking at the camera video. Ulrich Neumann (USC) did some work on tracking from panoramas years ago; I don&#8217;t know what ever became of it.</strong></p>
<p><strong>Regarding SREngine, that project appears to be a pretty simple first step, but is probably just a demo at this point, and limitations like &#8220;only works on static scenes&#8221; and &#8220;doesn&#8217;t work for simple scenes&#8221; mean it&#8217;s probably extracting some simple features out of the image and then matching those to some database. The trick would be getting this to work on a large scale, where the world changes a lot. It&#8217;s not obvious how to get there.</strong></p>
<p><strong>Tish:</strong> So forget RFID for AR&#8230;</p>
<p><strong>Blair: RFID is not really useful.</strong></p>
<p><strong>Tish:</strong> not at all?</p>
<p><strong>Blair: RFID is useful for telling you what things are near you. The problem is it doesn&#8217;t give you any directional information &#8211; it just tells you you&#8217;re in range of the tag. So you can use it to tell you when you are near a certain product, for example. It is useful in terms of telling you what thing you are near, and then you can load up a vision system or something else that will recognize that thing.</strong></p>
<p><strong>In that way, it could be useful as a good starting point.</strong></p>
<p><strong>Similarly for computer vision, the compass and the gps are very useful for giving you an initial guess at what you may be looking at, which can then speed up the rest of the process. But computer vision by itself will not be a complete solution, because if I have my panoramic Google Street View (or whatever image database I use for tracking) and you are standing between me and the building, I am not going to see what I expect to see; I am going to see you.</strong></p>
<p><strong>So I think it is all going to be part of one big package &#8211; you are going to see accelerometers, digital compasses, and gps combined with computer vision and other sensors, and then maybe we are going to start getting the things that we have always dreamed about. I like to show <a href="http://mi.eng.cam.ac.uk/~gr281/outdoortracking.html" target="_blank">this video</a> from the U. of Cambridge (work done by Gerhard Reitmayr and Tom Drummond) of an outdoor tracking demo, because it gives a sense of what will be possible. Techniques like this will be an ingredient in the future of things. It becomes especially interesting when you have these highly detailed mirror worlds. It is one of those chicken-and-egg problems: if I have a highly detailed model of the world, then techniques like theirs can be used to track. But that mirror world needs to be accurate or you can&#8217;t use it for tracking, and why would you create the mirror world if you couldn&#8217;t track?</strong></p>
<p><strong>Tish:</strong> I noticed in your comment to <a href="http://www.ugotrade.com/2009/01/17/is-it-%E2%80%9Comg-finally%E2%80%9D-for-augmented-reality-interview-with-robert-rice/" target="_blank">&#8220;my interview with Robert Rice&#8221;</a> that you said you thought it was important not to collapse AR into ubicomp &#8211; &#8220;forgetting what originally inspired us about AR&#8221; is, if I remember correctly, the suggestion you made. But aren&#8217;t ubiquitous computing and AR basically coextensive?</p>
<p>The <a href="http://www.ugotrade.com/2009/03/18/dematerializing-the-world-shadows-subscriptions-and-things-as-services-talking-with-mike-kuniavsky-at-etech-2009/" target="_blank">vision of ubicomp Mike Kuniavsky describes</a> &#8211; &#8220;sharing data through open APIs and the promise of embedded information processing and networking distributed through the environment&#8221; &#8211; demonstrates how much can be done with very little processing power. In its most immersive form, augmented reality requires a lot of processing power, and I think we have all become very conscious about trying to minimize levels of consumption. Can you explain why you think people shouldn&#8217;t see AR as the Hummer (energy-squandering indulgence) of ubiquitous computing?</p>
<p><strong>Blair: I think there will be a hierarchy of interfaces. You are going to have the rich Rainbow&#8217;s End-like experience &#8211; you are totally submerged in a mixed environment &#8211; if you have a head mount on (it&#8217;s not going to be Rainbow&#8217;s End for a while), but if you don&#8217;t have the head mount on, that information might be available to you in other ways, whether it is a 3D overlay using your handheld or just a 2D mashup with Google Maps. But there will be some circumstances and people that will want the compelling experience you can only get with the head mount.</strong></p>
<p><strong>Tish:</strong> Are you doing any research on how all these hierarchies of experiences will fit together &#8211; what aspects of this are you looking at?</p>
<p><strong>Blair: The thing that really needs to happen is you need to have this backend architecture that allows you to collect your data from different sources and aggregate it, much like the web. Right now Google Earth and Microsoft&#8217;s Virtual Earth are much like the old pre-web hypertext systems that were all centralized. What we really need is the web equivalent, where Georgia Tech can publish their building models and I.B.M. can publish their building and campus models, and your client can aggregate them &#8211; as opposed to Microsoft or I.B.M. putting their building models into Google Earth and then somehow you get them out with Google&#8217;s Google Earth browser. That&#8217;s just not going to fly.</strong></p>
<p><strong>Tish:</strong> So what does it take then to get us to this backend architecture? Because I&#8217;m in total agreement.</p>
<p><strong>Blair: The nice thing about augmented reality versus virtual reality is that you don&#8217;t need everything modeled. You can do interesting AR apps like <a href="http://www.mobilizy.com/wikitude.php" target="_blank">Wikitude</a> with absolutely no world model.</strong></p>
<p><strong>Tish:</strong> So that means we can start with what we have &#8211; utilize cloud services without a full blown backend architecture?</p>
<p><strong>Blair: It may very well be that Google Earth and MS Virtual Earth act as a portal, because people go and build models and link them with KML; they can see them in Google Earth, but they can also download the KMLs through some other channel. So it may be that those things end up being something that feeds some of this along. Then people start seeing a benefit to having these highly accurate models, so then you start integrating the Microsoft Photosynth stuff and leveraging photographs to generate models.</strong></p>
<p><strong>It&#8217;s just that keeping up with it and building it in real time is the challenge. A lot of folks think it will be tourist applications where there are models of Times Square and models of Central Park and models of Notre Dame and the big square around that area in Paris and along the river and so on, or the models of Italian and Greek history sites &#8211; the virtual Rome. As those things start happening and people start building onto the edges, and when Microsoft Photosynth and similar technologies become more pervasive, you can start building the models of the world in a semi-automated way from photographs and more structured, intentional drive-bys and so on. So I think it&#8217;ll just sort of happen. And as long as there&#8217;s a way to have the equivalent of Mosaic for AR, the early web browser &#8211; something that allows you to aggregate all these things. It&#8217;s not going to be a Wikitude; it&#8217;s not going to be a thing that lets you get a certain kind of data from a specific source. Rather, it&#8217;s the browser that allows you to link through into these data sources.</strong></p>
<p><strong>So it&#8217;s that end that interests me. It&#8217;s questions like &#8220;what is the user experience&#8221;, how do we create an interface that allows us to layer all these different kinds of information together such that I can use it for all my things. I imagine that I open up my future iphone and I look through it. The background of the iphone, my screen, is just the camera and it&#8217;s always AR.</strong></p>
<p><strong>I want the camera on my phone to always be on, so it&#8217;s not just that when I hold it a certain way it switches to camera mode, but literally it&#8217;s always in video mode so whenever there&#8217;s an AR thing it&#8217;s just there in the background.</strong></p>
<p><strong>When we can do that I can have little alerts, so when I have my phone open I can look around and see it independent of the buttons and things that I&#8217;m tapping and pushing to use the phone. That&#8217;ll be a really different kind of experience.</strong></p>
<p><strong>Of course, it is not known yet if the next-gen iPhone will have an open video API. And of course, the current camera is pretty low quality, so why would they give it an open API until they put in a better camera? I am not expecting anything one way or the other until the 3GS comes out and people start using it.</strong></p>
<p><strong>But there are many things about the iPhone 3.0 OS that are hugely important, like the discovery API that allows people to play games with other people nearby, that don&#8217;t have much to do with AR.</strong></p>
<p><strong>Tish:</strong> You have an iPhone AR virtual pet application, ARf.</p>
<p><a href="http://www.macrumors.com/2009/04/08/video-in-and-magnetometers-could-introduce-interesting-iphone-app-possibilites/" target="_blank">Macrumors wrote it up</a> and suggested that the next-gen iPhone will have a compass and an open video API. What are your plans for ARf?</p>
<p><strong>Blair: ARf is just a demo right now. I know what we&#8217;d like to do with it, but it would require tons of work; imagine what it would take to do a multiplayer, social version of Nintendogs. It&#8217;s not clear what we&#8217;d really learn by doing that, but there are lots of other game ideas we have that we want to explore.</strong></p>
<p><strong>Tish:</strong> I think it was on Twitter where Tim O&#8217;Reilly said, &#8220;saying everything must have an RFID tag is like saying we can&#8217;t recognize each other unless we wear name tags. Look at what&#8217;s happening with speech recognition, image recognition et al. and tell me you really think we need embedded metadata.&#8221; What would you say to that?</p>
<p><strong>Blair: I think that whatever extra data is there will be used. So if we put machine-readable labels on some objects, they&#8217;ll be used if they make the identification and tracking problem easier. But it&#8217;s pretty clear that people are already working on tracking and so on.</strong></p>
<p><strong>A lot of these mobile AR apps are clearly putting ideas in people&#8217;s minds about things that won&#8217;t really be doable in the near future, like being able to look down the aisle of a store and have it recognize all of the products. Given the distances and complexity of the scene, the number of pixels devoted to each of those objects, and so on, you just can&#8217;t recognize things in that context. But suppose I&#8217;m standing in front of a small set of objects, or looking at one thing, or standing in front of a building. Or imagine an enhanced location API that can tell me within a few feet where I am in the store, combined with some use of the discovery API that allows the store to tell your device you&#8217;re in the toothpaste section. Now you only have to look for different brands of toothpaste, so you can recognize the big letters &#8220;Crest&#8221; or whatever. It&#8217;s all about constraining the problem.</strong></p>
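<p>The &#8220;constraining the problem&#8221; idea can be sketched in a few lines: once a discovery API has told the device which store section it is in, the recognizer only has to match whatever text it sees against a handful of candidate brands. (The section-to-brands table and the <code>recognize</code> helper below are hypothetical illustrations, not a real store or recognition API.)</p>

```python
# Hypothetical sketch: context from a location/discovery API narrows the
# label set a recognizer must consider, turning open-ended product
# recognition into a small matching problem.

STORE_SECTIONS = {
    "toothpaste": ["Crest", "Colgate", "Sensodyne"],
    "cereal": ["Cheerios", "Kellogg's", "Quaker"],
}

def recognize(detected_text, section):
    """Match text read off the camera image only against brands the
    store says are nearby, instead of every product in existence."""
    candidates = STORE_SECTIONS.get(section, [])
    return [b for b in candidates if b.lower() in detected_text.lower()]

# With the section known, the big letters "CREST" resolve immediately;
# without that context, the same text would have to be matched against
# an unbounded set of possible products.
print(recognize("CREST Whitening", "toothpaste"))
```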
<p><strong>That&#8217;s why I like that particular piece of Drummond&#8217;s work, the tracking web site I mentioned above. The general problem of looking around, recognizing objects, and tracking them is still impossible. But if I know roughly what direction I&#8217;m looking in, I have a good estimate of my position, and I have models of what I should be seeing when I look in that direction, then it becomes a tractable problem. So it&#8217;s not that a compass and GPS are 100% necessary, but if you have them they certainly make things possible that you wouldn&#8217;t otherwise be able to do.</strong></p>
<p><strong>Imagine, for example, if there&#8217;s a new version of GPS. I just noticed that some of the new satellites going up have this new L5 channel. There are the L1 &amp; L2 signals that military and civilian receivers use, and they&#8217;ve added this civilian L5 signal, which should make GPS more accurate. I haven&#8217;t found anything online that says how much more accurate.</strong></p>
<p><strong>But someday, hopefully, all GPS will get to the quality of survey-grade GPS. Right now, if you get an RTK GPS from one of the companies that make survey-grade GPS systems, they give you position estimates in the range of two centimeters, updated 10 to 20 times a second. When you have that kind of positional accuracy combined with the kind of orientation accuracy you get from the sensors we use in the lab from Intersense and MotionNode, everything is easier because you&#8217;ve pretty much got absolute position. Put that into a phone, and when I look up it&#8217;s still not perfectly aligned, because there will still be errors (especially in orientation, since compasses are affected by metal and other magnetic noise). But it does mean that if you and I are standing 5 feet apart and look at each other, I can pretty much put a little smiley face above your head. Whereas now, with GPS, if I look at you when we&#8217;re 5 feet apart, our GPS units might think we&#8217;re on opposite sides of each other, because they&#8217;re only accurate to two to five meters.</strong></p>
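<p>The difference Blair describes can be put into rough numbers: when each fix can be off by a few meters, the worst-case angular error of a label anchored on someone five feet (about 1.5 m) away is enormous, while 2 cm RTK-grade fixes keep it to a degree or two. (A back-of-envelope sketch; the function name and the doubling of the error to cover both receivers are illustrative assumptions.)</p>

```python
import math

def overlay_angle_error_deg(separation_m, position_error_m):
    """Worst-case angular error, in degrees, when anchoring a label on a
    person `separation_m` away, if each device's GPS fix can be off by
    `position_error_m` (the two errors can add, hence the factor of 2)."""
    worst_offset = 2 * position_error_m
    return math.degrees(math.atan2(worst_offset, separation_m))

five_feet = 1.52  # meters
consumer = overlay_angle_error_deg(five_feet, 3.0)   # 2-5 m class GPS
rtk = overlay_angle_error_deg(five_feet, 0.02)       # ~2 cm RTK GPS
print(f"consumer GPS: ~{consumer:.0f} deg; RTK: ~{rtk:.1f} deg")
```

With consumer-grade error the label can land behind you rather than above your head, which is exactly the &#8220;opposite sides of each other&#8221; failure described above; with RTK-grade error the misplacement is barely visible.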
<p><strong>And that&#8217;s depending on the time of day and the weather!</strong></p>
<p><strong>Putting RFID tags everywhere is easy; the problem is the readers &#8211; they currently require lots of power and they have limited range. Sprinkling RFID tags everywhere is fine, but you have to be able to activate those tags and read back the signal. In certain contexts it works.</strong></p>
<p><strong>Tish:</strong> And one final question! What do you think can be done to begin thinking about standards for AR? Is there a meaningful discussion going on yet? Thomas Wrobel left this comment on my blog recently, and I was wondering what your position was on some of the ideas he raises.</p>
<p>Wrobel wrote, <em>&#8220;The AR has to come to the users; they can&#8217;t keep needing to download unique bits of software for every bit of content! We need an AR browsing standard that lets users log into and out of channels (like IRC) and toggle them as layers on their visual view (like Photoshop). Channels need to be public or private, hosted online (making them shared spaces) or offline (private spaces). They need to be able to be both open (a chat channel) or closed (a city map channel) as needed. Created by anyone, anywhere. Really, IRC itself provides a great starting point. Most data doesn&#8217;t need to be persistent, after all. I look forward to seeing the world through new eyes. I only hope I will be toggling layers rather than alt-tabbing and only seeing one &#8220;reality addition&#8221; at a time.&#8221;<br />
</em></p>
<p><strong>Blair: I agree with him, in principle. But I&#8217;m not sure there&#8217;s a point yet. It can&#8217;t hurt to try, of course, from a research perspective, and I&#8217;m interested in the experience such an infrastructure would enable (as we&#8217;ve talked about already).</strong></p>
]]></content:encoded>
			<wfw:commentRss>http://www.ugotrade.com/2009/06/12/mobile-augmented-reality-and-mirror-worlds-talking-with-blair-macintyre/feed/</wfw:commentRss>
		<slash:comments>7</slash:comments>
		</item>
	</channel>
</rss>
