<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>UgoTrade &#187; augmented reality standards</title>
	<atom:link href="https://www.ugotrade.com/tag/augmented-reality-standards/feed/" rel="self" type="application/rss+xml" />
	<link>https://www.ugotrade.com</link>
	<description>Augmented Realities at the Edge of the Network</description>
	<lastBuildDate>Wed, 25 May 2016 15:59:56 +0000</lastBuildDate>
	<language>en-US</language>
		<sy:updatePeriod>hourly</sy:updatePeriod>
		<sy:updateFrequency>1</sy:updateFrequency>
	<generator>https://wordpress.org/?v=3.9.40</generator>
	<item>
		<title>Vision Based Augmented Reality (AR) in Smart Phones &#8211; Qualcomm&#8217;s AR SDK: Interview with Jay Wright</title>
		<link>https://www.ugotrade.com/2010/08/05/vision-based-augmented-reality-ar-in-smart-phones-qualcomms-ar-sdk-interview-with-jay-wright/</link>
		<comments>https://www.ugotrade.com/2010/08/05/vision-based-augmented-reality-ar-in-smart-phones-qualcomms-ar-sdk-interview-with-jay-wright/#comments</comments>
		<pubDate>Thu, 05 Aug 2010 22:56:11 +0000</pubDate>
		<dc:creator><![CDATA[Tish Shute]]></dc:creator>
				<category><![CDATA[Android]]></category>
		<category><![CDATA[Augmented Reality]]></category>
		<category><![CDATA[Instrumenting the World]]></category>
		<category><![CDATA[internet of things]]></category>
		<category><![CDATA[Mixed Reality]]></category>
		<category><![CDATA[mobile augmented reality]]></category>
		<category><![CDATA[mobile meets social]]></category>
		<category><![CDATA[Mobile Reality]]></category>
		<category><![CDATA[social gaming]]></category>
		<category><![CDATA[Web Meets World]]></category>
		<category><![CDATA[Anselm Hook]]></category>
		<category><![CDATA[AR eyewear]]></category>
		<category><![CDATA[AR HMDs]]></category>
		<category><![CDATA[AR standards]]></category>
		<category><![CDATA[AR version of Rock'em Sock'em]]></category>
		<category><![CDATA[AR Wave]]></category>
		<category><![CDATA[are2010]]></category>
		<category><![CDATA[ARWave]]></category>
		<category><![CDATA[augmented reality event]]></category>
		<category><![CDATA[augmented reality standards]]></category>
		<category><![CDATA[Blair Macintyre]]></category>
		<category><![CDATA[Chokkan Nabi]]></category>
		<category><![CDATA[Christian Doppler Handheld AR LAB in Graz]]></category>
		<category><![CDATA[Davide Carnovale]]></category>
		<category><![CDATA[Gene Becker]]></category>
		<category><![CDATA[going beyond compass/gps based AR]]></category>
		<category><![CDATA[google goggles]]></category>
		<category><![CDATA[InsideAR]]></category>
		<category><![CDATA[Junaio]]></category>
		<category><![CDATA[Junaio glue]]></category>
		<category><![CDATA[Karma Augmented Reality Mobile Architecture]]></category>
		<category><![CDATA[Kooaba]]></category>
		<category><![CDATA[Layar]]></category>
		<category><![CDATA[Maarten Lens-FitzGerald]]></category>
		<category><![CDATA[markerless tracking]]></category>
		<category><![CDATA[Markus Strickler]]></category>
		<category><![CDATA[Metaio]]></category>
		<category><![CDATA[Ogmento]]></category>
		<category><![CDATA[open Android JPCT 3D engine]]></category>
		<category><![CDATA[Ori Inbar]]></category>
		<category><![CDATA[Patrick O'Shaughnessey]]></category>
		<category><![CDATA[point and find]]></category>
		<category><![CDATA[Qualcomm]]></category>
		<category><![CDATA[Qualcomm AR Competition]]></category>
		<category><![CDATA[Qualcomm Augmented Reality Competition]]></category>
		<category><![CDATA[Qualcomm Augmented Reality Developer Challenge]]></category>
		<category><![CDATA[Qualcomm Augmented reality SDK]]></category>
		<category><![CDATA[Qualcomm Developer Challenge]]></category>
		<category><![CDATA[Simulation3D]]></category>
		<category><![CDATA[Snapdragon]]></category>
		<category><![CDATA[Thomas Alt]]></category>
		<category><![CDATA[Thomas Wrobel]]></category>
		<category><![CDATA[Total Immersion]]></category>
		<category><![CDATA[Unifeye Mobile SDK]]></category>
		<category><![CDATA[Unifeye SDK]]></category>
		<category><![CDATA[Unity for AR]]></category>
		<category><![CDATA[Unity for augmented reality]]></category>
		<category><![CDATA[Unity3D]]></category>
		<category><![CDATA[Upliq 2010]]></category>
		<category><![CDATA[vision based AR]]></category>
		<category><![CDATA[vision based augmented reality]]></category>
		<category><![CDATA[visual search]]></category>
		<category><![CDATA[Yohan Baillot]]></category>

		<guid isPermaLink="false">http://www.ugotrade.com/?p=5593</guid>
		<description><![CDATA[Recently, Qualcomm announced an SDK for vision based augmented reality &#8211; currently in private beta and open to the public this fall. The Qualcomm augmented reality (AR) bonanza will launch with a $200,000 developer challenge and an SDK that will put vision based augmented reality into the hands of developers without licensing fees. This is [&#8230;]]]></description>
				<content:encoded><![CDATA[<p><a href="http://www.qualcomm.com/videos/explore?search=mattel&amp;sort=&amp;channel=All" target="_blank"><img class="alignnone size-medium wp-image-5616" title="Screen shot 2010-08-05 at 6.07.36 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/08/Screen-shot-2010-08-05-at-6.07.36-PM-300x212.png" alt="Screen shot 2010-08-05 at 6.07.36 PM" width="300" height="212" /></a></p>
<p>Recently, <a href="http://www.qualcomm.com/" target="_blank">Qualcomm</a> announced <a href="http://qdevnet.com/ar" target="_blank">an SDK for vision based augmented reality</a> &#8211; currently in <a href="http://qdevnet.com/dev/augmented-reality/private-beta-program" target="_blank">private beta</a> and open to the public this fall.  The Qualcomm augmented reality (AR) bonanza will launch with a <a href="http://qdevnet.com/dev/augmented-reality/developer-challenge" target="_blank">$200,000 developer challenge</a> and an SDK that will put vision based augmented reality into the hands of developers without licensing fees.</p>
<p>This is a big step forward for augmented reality and a very important move by an industry giant to support the rapidly evolving AR industry.  Innovation at all levels of the AR stack, particularly at the hardware level (CPU/GPU optimization), is vital for the full vision of augmented reality &#8211; media tightly registered to physical space &#8211; to take center stage.  Vision based AR takes mobile AR beyond compass/GPS based AR post-its, which are only loosely connected to the world (but the staple of most current AR apps), towards the holy grail of AR &#8211; markerless tracking with the whole world as the platform.</p>
<p>Click on the image above or <a href="http://www.qualcomm.com/videos/explore?search=mattel&amp;sort=&amp;channel=All" target="_blank">see here</a> for a video demo of an AR version of the Rock&#8217;em Sock&#8217;em Robots game.  <a href="http://www.mattel.com/">Mattel</a>, one of the first companies working with the SDK, demoed AR Rock&#8217;em Sock&#8217;em at the <a href="http://uplinq.com/">Uplinq 2010</a> conference (see <a href="http://www.readwriteweb.com/archives/qualcomm_launching_mobile_sdk_for_vision-based_ar_on_android_this_fall.php" target="_blank">Chris Cameron&#8217;s ReadWriteWeb write-up</a> on <a href="http://uplinq.com/">Uplinq 2010</a>).</p>
<p>The Qualcomm AR stack, which reaches from the metal to developer APIs, will give Android developers an important edge in AR development.   And, when vision based AR starts getting integrated with visual search capabilities, and combined with cool tools like <a href="http://unity3d.com/" target="_blank">Unity</a>, we will start to see the augmented world get really interesting.</p>
<p>Visual search is already an area of AR getting a lot of attention, with <a href="http://www.google.com/mobile/goggles/#text" target="_blank">Google Goggles</a>, <a href="http://europe.nokia.com/services-and-apps/nokia-point-and-find" target="_blank">Point and Find</a>, <a href="http://www.cnet.com.au/augmented-reality-taking-off-on-japanese-smartphones-339304998.htm" target="_blank">Japan&#8217;s NTT DoCoMo set to launch &#8220;chokkan nabi,&#8221;</a> or &#8220;intuitive navigation,&#8221; in September, and the <a href="http://www.layarnews.com/2010/07/kooaba-meets-layar.html" target="_blank">recent partnership between Layar and Kooaba</a>.  <a href="http://www.metaio.com/" target="_blank">Metaio&#8217;</a>s mobile augmented reality platform <a href="http://www.metaio.com/products/junaio/" target="_blank">Junaio</a> is already integrated with <a href="http://www.kooaba.com/" target="_blank">Kooaba&#8217;s</a> computer vision capabilities.</p>
<p>And, of course, I am particularly excited about including open distributed real time communications for AR in this stack, which is why I asked a group of developers who have been contributing to the <a href="http://arwave.org/" target="_blank">ARWave</a> project if they had questions for Jay Wright, Qualcomm.  Thank you <a href="http://www.linkedin.com/in/yohanbaillot" target="_blank">Yohan Baillot</a>, <a href="http://lightninglaboratories.com/" target="_blank">Gene Becker</a>, <a href="http://www.hook.org/" target="_blank">Anselm Hook</a>, <a href="http://patchedreality.com/about/" target="_blank">Patrick O&#8217;Shaughnessey</a>, <a href="http://www.lostagain.nl/" target="_blank">Thomas Wrobel</a>, <a href="http://twitter.com/kusako" target="_blank">Markus Strickler</a>, and <a href="http://twitter.com/need2revolt" target="_blank">Davide Carnovale</a> for your input.  [Note: see my upcoming post about the future of <a href="http://arwave.org/">ARWave</a> and real time distributed communications for AR following <a href="http://googleblog.blogspot.com/2010/08/update-on-google-wave.html" target="_blank">this Google announcement</a>.]</p>
<p><a href="http://www.linkedin.com/in/jaywright" target="_blank">Jay Wright</a> &#8220;is responsible for developing and driving Qualcomm&#8217;s augmented reality commercialization strategy.&#8221; He &#8220;handles partnerships with leading innovators in industry and academia and leads Qualcomm&#8217;s efforts in enabling augmented reality within the mobile ecosystem.&#8221;  In the interview below, Jay very generously answers our questions in detail.</p>
<p>A key contributor of questions for this interview is Yohan Baillot.  Yohan is working on a full vision of AR &#8211; integrating computer vision, visual search, open distributed real time communications and AR eyewear.  Yohan Baillot is founder of <a href="http://www.simulation3d.biz/" target="_blank">Simulation3D</a>, a consulting and system integration company specializing in interactive visualization systems and eyewear-based AR systems.  (I hope to bring you an interview with Yohan soon!).</p>
<p>Qualcomm was the title sponsor for <a href="http://augmentedrealityevent.com/" target="_blank">are2010, Augmented Reality Event</a>, and  played a vital role in making this event an historic gathering of the talent and creative minds at the heart of the emerging AR industry.  Watch out for the videos of the are2010 sessions to be posted at the end of August.  My are2010 co-chair, <a href="http://ogmento.com/team" target="_blank">Ori Inbar</a>, is preparing them to go online while kicking his newly funded start up, <a href="http://ogmento.com/" target="_blank">Ogmento</a>, into high gear! Ogmento is also one of the start ups pioneering vision based AR.</p>
<p><a href="http://www.metaio.com/" target="_blank">Metaio</a> (with <a href="http://www.t-immersion.com/" target="_blank">Total Immersion</a>, one of the first augmented reality companies) has played a key role in bringing a vision component to smart phone augmented reality apps with their <a href="http://www.metaio.com/products/" target="_blank">Unifeye mobile SDK</a>.  Junaio, Metaio&#8217;s own mobile augmented reality platform, has gone beyond location based AR with &#8220;junaio glue&#8221; &#8211; &#8220;the camera&#8217;s eye is now able to identify objects and &#8216;glue&#8217; object specific real-time, dynamic, social and 3D information onto the object itself&#8221; (see my upcoming interview with Metaio founder, Thomas Alt).  Also, recently, Layar &#8211; who continue to innovate at a breathtaking pace &#8211; announced a partnership with the computer vision company Kooaba.</p>
<p>Both Maarten Lens-FitzGerald, Layar, and Thomas Alt, Metaio, when I spoke to them recently, saw the Qualcomm SDK as a very positive development for AR, and they look forward to exploring its capabilities and integrating it where appropriate with their AR tools.  See more about <a href="http://site.layar.com/company/blog/layar-will-visit-the-us/" target="_blank">Layar&#8217;s upcoming visit to the US here &#8211; August 10th NYC, and August 12th SF</a>.  Also save the date, Sept 27th, Munich, for <a href="http://www.metaio.com/index.php?id=1103" target="_blank">InsideAR</a>, Metaio&#8217;s upcoming conference.</p>
<p>It is clear that vision based AR will be driving the next wave of AR apps.  And, as Maarten and Thomas both pointed out, it will be interesting to see which use cases capture the imagination of users the most.  Having more tools freely available to AR developers will certainly be a boost to creativity.  And Qualcomm&#8217;s SDK is going to give Android developers, in particular, a big opportunity to take the lead.</p>
<h3>Interview with Jay Wright, Director, Business Development, Qualcomm</h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/08/JayWright.jpg"><img class="alignnone size-medium wp-image-5598" title="JayWright" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/08/JayWright-300x255.jpg" alt="JayWright" width="300" height="255" /></a></p>
<p><strong>Tish Shute:</strong> Before I start with questions on the new Qualcomm vision based augmented reality SDK, I want to briefly look ahead to what many people feel is vital for the full realization of augmented reality &#8211; head mounted displays, or more specifically, comfortable, sexy AR eyewear.  Is Qualcomm going to be involved in the development of augmented eyewear and wearable displays?</p>
<p><strong>Jay Wright:   I think there&#8217;s some core technology that needs to come together so we can have what we think needs to be a see-through head mounted display with a decent field of view.  And that looks like something that is quite possibly further than a three to five year horizon.</strong></p>
<p><strong>Tish Shute:</strong> Gene Becker asked some interesting general questions about the Qualcomm AR initiatives.  He said, &#8220;I&#8217;m unclear exactly what Qualcomm&#8217;s goal is.&#8221;  It would be interesting to hear the Qualcomm view from you, from the top down.</p>
<p><strong>Jay Wright:</strong> <strong> Our largest revenue stream comes from sales of chipsets.    And we see augmented reality as a technology that drives demand for increasing amounts of processing power.  So we want to create demand for chips, higher-end chips, and augmented reality does that.  Specifically vision based augmented reality because it is so computationally intensive.</strong></p>
<p><strong>Tish Shute:</strong> Yes.  And I think that is why people are very excited by the Qualcomm SDK.  It is not only the first free toolkit for developers to build vision apps from &#8211; there&#8217;s been nothing freely available before this, has there? &#8211; but Qualcomm is also paying attention to the complete AR stack to support vision based AR development, from the chips to game/app development tools like Unity.</p>
<p><strong>Jay Wright:  That&#8217;s really the goal.  We&#8217;re not here to be in the augmented reality applications business.  Qualcomm&#8217;s role in the ecosystem has been to serve as an enabler.  And that&#8217;s what we want to do with augmented reality: provide the enabling technology that allows the entire ecosystem to flourish.</strong></p>
<h3>&#8220;Augmented Reality has a number of attributes that make it a great fit for Qualcomm&#8217;s core competencies&#8221;</h3>
<p><strong>Augmented Reality has a number of attributes that make it a great fit for Qualcomm&#8217;s core competencies.  It&#8217;s very computationally intensive, algorithmically complex, requires tight integration of hardware and software, and benefits from tight integration of multiple hardware components.  And that&#8217;s the kind of problem we like here, where we can apply our core competence of really optimizing complex systems for performance, while at the same time minimizing power consumption.</strong></p>
<p><strong>And as you know, Tish, mobile AR is really extremely power sensitive.  We sometimes talk about it as a battery&#8217;s worst nightmare.  It&#8217;s roughly equivalent to playing a 3D game and recording a video all at the same time.</strong></p>
<p><strong>Whenever there is something that takes a lot of power, that&#8217;s a definite opportunity for us to optimize it.</strong></p>
<p><strong>Tish Shute:</strong> Right.  One of the core businesses is chips, right, but for Qualcomm there&#8217;s basically a lot of profit in licensing.  When I talked to the developer community about the Qualcomm SDK, developers&#8217; first question was, &#8220;What&#8217;s the licensing?  What&#8217;s this going to cost us in the long run to develop on this SDK?&#8221;  And they all had different takes on this &#8211; everyone had different ideas about what your approach to licensing might or might not be.  Could you clarify the approach to licensing, as I think this is a core concern for developers?</p>
<p><strong>Jay Wright:   Anytime you see something for free, you kind of say, &#8220;Hey, what&#8217;s the hook?&#8221;  So yes, it&#8217;s definitely a logical question.  Our intent is not to generate licensing revenue from application developers using the SDK.  So the SDK will be made available free of charge for development, and it will also be free of charge for developers to deploy applications.</strong></p>
<p><strong>Tish Shute:</strong> Now, this is another question.  You also include not just image recognition capabilities but Unity in the package you are offering developers.  Unity products usually involve a license.  They do have some free products too, I think.  But how does this work?  And how do you separate your part from their part, or donâ€™t you?</p>
<p><strong>Jay Wright:  That&#8217;s a good question.  What we&#8217;re trying to do with the platform is incorporate it into tools that people already know how to use.  So we&#8217;re actually going to have the SDK support two different tool chains.  One of them is the Android SDK and NDK.  And the other one is Unity.</strong></p>
<p><strong>We&#8217;re working with Unity to create an extension to the Unity environment that will be available as part of the Unity installer when you install Unity from the Unity website.  Developers will still be paying whatever license fees are associated with Unity&#8217;s products on their existing pricing schedule.</strong></p>
<p><strong>Tish Shute:</strong> One of Thomas Wrobel&#8217;s questions is whether developers can just use the image recognition without Unity.  Your answer is yes, you can work with the computer vision component of the SDK separately from Unity?</p>
<p><strong>Jay Wright:  Yes, you can.</strong></p>
<p><strong>Tish Shute:</strong> Good, because we would like to build a completely open Android client for ARWave, and not tie it to Unity unless people choose to.  He&#8217;s using the <a href="http://www.jpct.net/" target="_blank">open Android JPCT 3D engine</a>, which he&#8217;s adapting for AR.  So he could actually use the part of the SDK that does image recognition and association with that, right?</p>
<p><strong>Jay Wright:  That&#8217;s correct.  You are not required to use Unity.  Unity is just one option for building the application.</strong></p>
<p><strong>Tish Shute:</strong> Great! That&#8217;s very good.  I&#8217;m sure many developers are going to jump at the chance to use Unity.  But it&#8217;s nice to be flexible, because it&#8217;s so early for AR that people have different ideas and new use cases coming up all the time.  I think it&#8217;s excellent you&#8217;ve divided that.</p>
<p>Another of Thomas&#8217;s questions was, &#8220;Can developers use their own positioning data sharing solution?&#8221;  He&#8217;s really talking about AR blips.</p>
<p><strong>Jay Wright:  With data sharing solutions, I am assuming that by data he means augmentation data or graphics?</strong></p>
<p><strong>Tish Shute:</strong> Yes, and I&#8217;ll ask him to elaborate.  But, at the moment, everyone is using different ideas for POI, aren&#8217;t they?</p>
<h3>&#8220;The goal with our platform is to make it just as easy for a developer to create 3D content for the real world as it is for a game world or a virtual world.&#8221;</h3>
<p><strong>Jay Wright:  Yes.  So let me answer it this way, Tish.  The goal with our platform is to make it just as easy for a developer to create 3D content for the real world as it is for a game world or a virtual world.  So all we&#8217;re really trying to do is provide the computer vision piece that makes the real world look like a bunch of geometric surfaces, and potentially some meta data that is associated with this so you know what you are looking at.</strong></p>
<p><strong>So that means from a developer&#8217;s perspective, you are still doing all of the 3D content, all of the animations, all of the game logic, all of the rendering.  You are still doing that all yourself.  So if you think about doing an AR game, you are doing everything you used to do, except you are not creating a virtual terrain.  You are just going to map it in the real world.</strong></p>
<p><strong>So if you want to do a browser that is doing POIs, your POI data, or augmentation, or meta data, or whatever it is &#8211; that can be in your application, it can be in the cloud, it can be wherever you want to put it.  We&#8217;re not putting any constraints on what that content is or where it&#8217;s stored.</strong></p>
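<p>The division of labor Jay describes &#8211; the vision layer reports a pose for each detected surface, and the application supplies all content and rendering &#8211; can be sketched in a few lines.  This is an illustrative toy in Python, not Qualcomm&#8217;s API; the matrix layout and function name are my own assumptions.</p>

```python
# Hypothetical sketch: the tracker reports a 6-DOF pose for a detected
# image target; the application then places its own 3D content relative
# to that pose. Names and conventions here are illustrative only.

def apply_pose(pose, point):
    """Transform a point from target-local space into camera space.

    pose: 4x4 row-major matrix as a flat list of 16 floats
          (rotation + translation for a detected target).
    point: (x, y, z) in the target's local coordinate frame.
    """
    x, y, z = point
    return tuple(
        pose[4 * r] * x + pose[4 * r + 1] * y + pose[4 * r + 2] * z + pose[4 * r + 3]
        for r in range(3)
    )

# Identity rotation, target sitting 0.5 m in front of the camera;
# a game piece placed 0.1 m along the target's x-axis:
pose = [1, 0, 0, 0,
        0, 1, 0, 0,
        0, 0, 1, 0.5,
        0, 0, 0, 1]
print(apply_pose(pose, (0.1, 0.0, 0.0)))  # (0.1, 0.0, 0.5)
```

The point is that the application never touches pixels: it consumes a pose and renders its own content, exactly as it would against a game world's virtual camera.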
<p><strong>Tish Shute:</strong> Right, and that&#8217;s what I hoped for.  And I think that does answer the question.  People are interested to know how far Qualcomm is going with this.  For instance, Gene Becker asked: &#8220;do they see a business at a certain level in the AR stack?&#8221;  As you said, AR development basically feeds into the core business of chip development, right?  But does Qualcomm also see some new business models developing?</p>
<p><strong>Jay Wright:   I think it&#8217;s foreseeable that Qualcomm could identify other business opportunities down the line.  But we&#8217;re certainly not there today.  Today, our motivation for the investment in AR is to create technology that is going to advance the chipset business.</strong></p>
<p><strong>Tish Shute:</strong> When the news came out about Qualcomm&#8217;s support of a game development studio at Georgia Tech at the same time as the SDK, I wondered about the scope of Qualcomm&#8217;s interest [for more on using Unity for AR development see the <a href="http://www.qualcomm.com/partials/service/video/14230?primary=0x319cb5&amp;secondary=0xffffff&amp;simple_endScreen=true&amp;disable_embed=false&amp;disable_send=false&amp;send_mailto=http://www.uplinq.com/&amp;disable_embedViewMore=true&amp;simple_infoPanel=true" target="_blank">Vision-Based Augmented Reality Technical Super Session video</a> from <a href="http://uplinq.com/">Uplinq 2010</a>].  For example, I am interested to know how the Qualcomm initiative in developing an AR stack connects to the effort to introduce an AR browser based on web standards, i.e., the <a href="https://research.cc.gatech.edu/polaris/content/home" target="_blank">Kharma/Kamra KML/HTML Augmented Reality Mobile Architecture from Blair MacIntyre and the Georgia Tech team</a> (image below)?  Are you supporting the open standards based browser development too?</p>
<p><strong>Jay Wright:   Blair is going to continue to work on the browser effort.  And it&#8217;s our expectation that he will use our SDK and technologies for vision pieces of the browser effort where appropriate.  So they are certainly not mutually exclusive.  I would just think about our technology as one element of what may be used in that browser, as I expect it would be an element of what any other app developer would put in their application, whether it be browser, or game, or whatever.</strong></p>
<p><strong>Tish Shute:</strong> Yes.  Now, this is an interesting question, which is sort of connected&#8230; I&#8217;m trying to keep some form of narrative for this!  It follows from the question about Blair&#8217;s web standards based browser.  A few people have asked me why we haven&#8217;t heard more from Qualcomm in all these various standards discussions that are starting to come up.  I mean, is it just too early, or are you too busy, or what?</p>
<p><strong>Jay Wright:  No, let me explain.  The type of standards that have come up so far have been around how HTML should be extended for geo-browser type applications.  And while that&#8217;s interesting, I think the standards efforts that Qualcomm would be more likely to be associated with in the near term are those related to APIs that are hardware accelerated.</strong></p>
<p><strong>So one of the things that we are in the process of doing right now, Tish &#8211; because as you know, Qualcomm is a company that adheres to standards and strives to produce a leading implementation of those standards on our hardware and software &#8211; is determining what API set within the existing SDK should be standardized.</strong></p>
<p><strong>Tish Shute:</strong> Right.</p>
<p>Now, my next question is, &#8220;Who are the other players at this level of the AR stack in the standards conversation?  Who else is working at that level?&#8221;  Obviously, the AR Lab in Graz was, but now they are Qualcomm, right?</p>
<p><strong>Jay Wright:   They are still independent.  Qualcomm is the exclusive industrial partner of the Christian Doppler Handheld AR LAB in Graz.</strong></p>
<p><strong>Tish Shute:</strong> Does this compete with, say, the work that other AR start ups are doing?</p>
<p><strong>Jay Wright:  Our intent is not to compete with companies that have done augmented reality technology.  Our intent is to enable the entire ecosystem.  So we would like to work with both Metaio and Total Immersion to find ways that they can benefit from our technology.  That would be the hope &#8211; that our technology can kind of lift and float all boats in the ecosystem.</strong></p>
<p><strong>Tish Shute: </strong>There are not many implementations of vision based AR right now?  I mean, obviously Microsoft is doing stuff because they have <a href="http://www.robots.ox.ac.uk/~gk/" target="_blank">Georg Klein</a> now, right, and there is Google Goggles, Total Immersion, Metaio, and it will be interesting to see where Layar&#8217;s partnership with Kooaba will lead?</p>
<p><strong>Jay Wright:  Yes.  I think there are relatively few commercial implementations of vision based AR stacks.</strong></p>
<p><strong>Tish Shute:</strong> One of Patrick O&#8217;Shaughnessey&#8217;s questions is that he wants to understand, very specifically, what features are going to be in the vision component.  Patrick O&#8217;Shaughnessey, of <a href="http://patchedreality.com/" target="_blank">Patched Reality</a>, working with <a title="Circ.us" href="http://circ.us/" target="_blank">Circ.us</a>, <a title="Edelman" href="http://edelman.com/" target="_blank">Edelman</a>, and <a title="metaio" href="http://metaio.com/" target="_blank">Metaio</a>, used the Unifeye SDK to do <a href="http://mashable.com/2010/07/09/ben-and-jerrys-iphone-app/" target="_blank">a vision based AR app for Ben and Jerry&#8217;s</a> that&#8217;s been getting all the attention lately.  He was a speaker at are2010.</p>
<p>He very specifically wants to know what features will be included in the computer vision component.  He says, &#8220;I&#8217;m most interested in understanding what features are going to be in the vision component.  Is it marker based?&#8221;  Well, I know it&#8217;s more than marker based.  I saw some of it in <a href="http://www.readwriteweb.com/archives/qualcomm_launching_mobile_sdk_for_vision-based_ar_on_android_this_fall.php" target="_blank">Chris Cameron&#8217;s ReadWriteWeb write-up</a> on <a href="http://uplinq.com/">Uplinq 2010</a>.  Is it &#8220;NFT?  PTAM? other?  Also, are you integrating any backend services.&#8221;  That is an interesting question!</p>
<p><strong>Jay Wright:  So let&#8217;s get to the features on the client side, the vision based features.  There&#8217;s support for what AR aficionados would know as natural feature targets, or image based targets.  And we use those to represent, obviously, 2D planar surfaces.</strong></p>
<p><strong>The other thing that we are trying to do to set expectations, Tish, about where these can be used is to let people know that they work best in what we&#8217;re calling near-field environments.  So the idea isn&#8217;t that you use the system to create a large scale AR system that can recognize buildings indoors and outdoors.  It&#8217;s the idea where I can recreate 3D experiences that take place on surfaces that are in my immediate field of view, whether that be on the table in front of me, or on the floor, or on the wall, or on the shelf.</strong></p>
<p><strong>Also, when you talk about near field experiences, there are some other constraints that are implied.  Like, if it&#8217;s in front of me, my immediate field of view is probably going to be pretty well lit.  And lighting, of course, is an important requirement.</strong></p>
<p><strong>So we&#8217;ll support these natural feature targets, or image targets.  And we also have support for sort of a hybrid marker image type.  It&#8217;s something called a frame marker, which has kind of a black border with some dots on it.</strong></p>
<p><strong><a href="http://www.qualcomm.com/partials/service/video/14230?primary=0x319cb5&amp;secondary=0xffffff&amp;simple_endScreen=true&amp;disable_embed=false&amp;disable_send=false&amp;send_mailto=http://www.uplinq.com/&amp;disable_embedViewMore=true&amp;simple_infoPanel=true" target="_blank"><img class="alignnone size-medium wp-image-5610" title="Screen shot 2010-08-05 at 5.13.50 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/08/Screen-shot-2010-08-05-at-5.13.50-PM-300x166.png" alt="Screen shot 2010-08-05 at 5.13.50 PM" width="300" height="166" /></a><br />
</strong></p>
<p>Click on the image above or <a href="http://www.qualcomm.com/partials/service/video/14230?primary=0x319cb5&amp;secondary=0xffffff&amp;simple_endScreen=true&amp;disable_embed=false&amp;disable_send=false&amp;send_mailto=http://www.uplinq.com/&amp;disable_embedViewMore=true&amp;simple_infoPanel=true" target="_blank">here to view Vision-Based Augmented Reality Technical Super Session video</a> from <a href="http://uplinq.com/">Uplinq 2010</a></p>
<p><strong>Jay Wright:  So there&#8217;s this additional type.  And the reason for this additional hybrid marker type is that it has a lower computational requirement than a natural feature target.  So the idea is these things can be used as game pieces or elements of play where I want to have a large number of them detected and tracked simultaneously.</strong></p>
<p><strong>So you can have, for example, one big natural feature target that serves as a game board or game surface, and you can use these other things as smaller game pieces.  And when you put them out, different types of content can appear on them and do different things.</strong></p>
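<p>To see why frame markers are computationally cheap, note that once the black border is found, the dots just encode a bit pattern that maps to an integer ID &#8211; no feature matching is needed.  The sketch below is a toy decoder; the real marker layout and decoding belong to Qualcomm&#8217;s SDK, and this bit-packing scheme is invented purely for illustration.</p>

```python
# Toy illustration of a frame-marker-style ID: dots along the border
# encode bits, so many distinct markers can be detected and told apart
# cheaply. This layout is hypothetical, not the SDK's actual format.

def decode_marker_id(dots):
    """dots: list of 0/1 values read clockwise from the border pattern.
    Packs them into an integer ID, most significant bit first."""
    marker_id = 0
    for bit in dots:
        marker_id = (marker_id << 1) | bit
    return marker_id

print(decode_marker_id([1, 0, 0, 1]))  # marker id 9
```

With, say, 16 dot positions, this scheme would distinguish 65,536 game pieces from a single cheap border scan, which is why many such markers can be tracked simultaneously.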
<p><strong>Tish Shute:</strong> Yes, that&#8217;s nice!  And the other thing I noticed was the virtual buttons.  How well developed is that?</p>
<p><strong>Jay Wright:  The idea behind virtual buttons is, in addition to supporting augmentation, we want to support interaction.  And we think there are going to be different types of user interaction with augmented reality content.  It may be hand tracking and finger tracking, but another compelling form we&#8217;ve identified so far is the ability for me to touch particular surfaces and have an event fire within the application.</strong></p>
<p><strong>So virtual buttons are rectangular areas on image targets that a developer can define, and they serve as buttons.  So you can create a target that is a game board, for example, and define certain regions.  And when the user covers that region with his hand, like pushing a button, your application can detect that event and take some action.</strong></p>
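<p><em>Jay&#8217;s description suggests a simple way to prototype the mechanism: define a rectangle on the image target, remember how bright it normally looks, and fire an event when something (like a hand) covers it. Below is a minimal Python sketch of that idea &#8211; a hypothetical illustration of the concept, not Qualcomm&#8217;s actual SDK API:</em></p>

```python
import numpy as np

class VirtualButton:
    """Hypothetical sketch of a 'virtual button': a rectangular region on a
    tracked image target that "fires" when the user covers it with a hand.
    Illustrative only -- not the Qualcomm AR SDK's API."""

    def __init__(self, x, y, w, h, drop_threshold=0.4):
        self.rect = (x, y, w, h)
        # Fractional brightness drop that counts as a "press".
        self.drop_threshold = drop_threshold
        self.baseline = None

    def _region_mean(self, gray_frame):
        x, y, w, h = self.rect
        return float(gray_frame[y:y + h, x:x + w].mean())

    def calibrate(self, gray_frame):
        # Record the unoccluded brightness of the button region.
        self.baseline = self._region_mean(gray_frame)

    def is_pressed(self, gray_frame):
        # True when the region is dark enough relative to the baseline
        # to be considered covered, i.e. the button event should fire.
        if not self.baseline:
            return False
        drop = (self.baseline - self._region_mean(gray_frame)) / self.baseline
        return drop >= self.drop_threshold

# Usage: calibrate on an unoccluded frame, then poll each camera frame.
frame = np.full((120, 160), 200, dtype=np.uint8)  # bright, uncovered scene
button = VirtualButton(40, 30, 20, 20)
button.calibrate(frame)
assert not button.is_pressed(frame)   # nothing covering the region yet
frame[30:50, 40:60] = 20              # a hand shadows the button region
assert button.is_pressed(frame)       # the application would fire its event
```

<p><em>A real implementation would test the rectified target region reported by the tracker and would need to be robust to gradual lighting changes; this sketch captures only the basic occlusion test.</em></p>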
<p><strong>Tish Shute:</strong> Nice!  And what documentation on these capabilities is offered by Qualcomm&#8230;For example, Yohan Baillot, who is interested in integrating eyewear-based AR systems with smartphones, asked: How deep does this go?  Will there be full documentation on <a href="http://www.qualcomm.com/products_services/chipsets/snapdragon.html" target="_blank">Snapdragon</a> for people who want to work at that level? Is there a chip SDK?</p>
<p><strong>Jay Wright:  Qualcomm&#8217;s model is to work with providers of the operating systems and deliver the functionality of the chip through the operating system. So many operating system APIs will take advantage of functionality that&#8217;s in the chip. But there is no separate chip SDK per se.</strong></p>
<p><strong>Tish Shute:</strong> I suppose that does come up a little bit with one of Anselm Hook&#8217;s questions, because there is some overlap with Google Goggles here, isn&#8217;t there, in terms of what you&#8217;re doing, right? Are you going to work closely with Google Goggles?</p>
<p><strong>Jay Wright: Google Goggles is performing what we&#8217;ve described as &#8216;visual search.&#8217; So the idea is you take a picture, send it to the cloud to identify it, and the results come back. I think if we see Google Goggles go in a direction where there&#8217;s an AR experience, that would be a good area for us to collaborate with Google.</strong></p>
<p><strong>Tish Shute:</strong> <a href="http://www.ugotrade.com/2010/01/17/visual-search-augmented-reality-and-a-social-commons-for-the-physical-world-platform-interview-with-anselm-hook/" target="_blank">Anselm Hook</a> is very interested in having some kind of open standard around this physical tagging of the world, right &#8211; the physical world as a platform. I suppose that&#8217;s down the road, but is there a plan to start talking about open standards here &#8211; visual search with image recognition? That&#8217;s a very powerful combination. (See my interview with Anselm Hook linked above.)</p>
<p><strong>Jay Wright:    I think it is. And we&#8217;re very interested to hear from developers and others who have ideas about how they would want to integrate with the functionality that we have, to best enable those kinds of combined experiences.</strong></p>
<p><strong>Tish Shute:</strong> Well, I know Anselm has a lot of very important ideas on that.</p>
<p><strong>Jay Wright: I&#8217;d be very interested in hearing those, because we want to do everything we can to enable the maximum number of applications and the best user experience for anything that people want to do.</strong></p>
<p><strong>Tish Shute:</strong> Let&#8217;s go back to some specific questions about the platform, right? For example, Yohan Baillot asked, &#8220;Is arbitrary image/tag recognition supported? Is the tag/image specifiable by the user? Is face recognition supported?&#8221;  Not yet, face recognition, right?</p>
<p><strong>Jay Wright:    Not yet.</strong></p>
<p><strong>Tish Shute:</strong> What are the plans with that?</p>
<p><strong>Jay Wright:    I think we&#8217;ve identified it as an interesting area and something that there&#8217;s some interest in, but we have not made a decision on a particular technology direction.</strong></p>
<p><strong>Tish Shute:</strong> You&#8217;ve answered some of these, but 3D model based vision tracking. Yohan&#8217;s question was, &#8220;Is 3D model based vision tracking supported (that is, recover the pose of the camera using a known 3D model and a 2D camera view of this model)?&#8221;</p>
<p><strong>Jay Wright:    That&#8217;s something we&#8217;re looking at very closely, but again, we don&#8217;t have a plan, or don&#8217;t have a future date for it.</strong></p>
<p><strong>Tish Shute:</strong> And you said that natural landmark tracking is not supported, right?</p>
<p><strong>Jay Wright:    I don&#8217;t know if I know what that means, Tish. But we don&#8217;t have any APIs that provide compass or GPS functionality other than what already exists in the operating system. So if you want to take advantage of the compass or other sensors, you can absolutely do that, but the SDK does not currently provide anything different or anything more than what already exists in the OS.</strong></p>
<p><strong>Tish Shute:</strong> This is an interesting question: &#8220;Is Snapdragon offloading some processing to the GPU, if any?&#8221;</p>
<p><strong>Jay Wright:    Certainly, rendering functionality that utilizes OpenGL is being offloaded to the GPU. We&#8217;re currently in the process of determining multiple methods for offloading functionality between both symmetric and heterogeneous cores on Snapdragon, which would include the GPU, the apps processor, and the DSPs.</strong></p>
<p><strong>Tish Shute: </strong> No one has truly solved optimizing the GPU/CPU for mobile AR yet, have they?</p>
<p><strong>Jay Wright:    That really gets to the heart of the optimization here: which pieces ought to be operating on which cores, and when, and why? And that&#8217;s something that we&#8217;re looking at very closely.</strong></p>
<p><strong>Tish Shute: </strong> Right.  The only AR that is truly 3D media tightly registered to the physical world has been done for military and medical applications (and that has often been with a locked-off camera!).  But to take mobile AR to the next level, I think many developers would like access to the CPU/GPU &#8211; for example, a developer interested in the future of eyewear, like Yohan?</p>
<p><strong>Jay Wright:     We&#8217;re very interested in hearing what kinds of tools developers would like to see.</strong></p>
<p><strong>Tish Shute:</strong> What is the best forum for discussing feature specifics?</p>
<p><strong>Jay Wright:    To provide feature requests to us?</strong></p>
<p><strong>Tish Shute:</strong> Yes. And discuss them.</p>
<p><strong>Jay Wright:    If people go to <a href="http://qdevnet.com/ar" target="_blank">qdevnet.com/ar</a>, there&#8217;s an application up there for the private beta program. So if people do have ideas about features or other things they would like to see, they&#8217;re welcome to submit [their requests and ideas] there.</strong></p>
<p><strong>Tish Shute:</strong> I also have some questions about the specifics of the competition.  Some people are a little confused about some things.  Yohan asked, &#8220;What is the expected form of the project?  Lab demonstration?  Specific capability?  Complete end-to-end system?&#8221;</p>
<p><strong>Jay Wright:  The only requirement is that they submit an Android application that we can then get running on a device.  So if it has a backend component or backend server that it works against, great.  If it does, it does.  If it doesn&#8217;t, it doesn&#8217;t.  But that&#8217;s really it. There&#8217;s no limit to the application category.  It can be a game, it can be a museum tour, it can be a children&#8217;s learning game or learning experience.  It can really be anything.  The idea is we want to find experiences for which AR delivers some unique value. We&#8217;ll be announcing more specifics about the competition in the near future.</strong></p>
<p><strong>Tish Shute:</strong> Right, because some people weren&#8217;t sure, with Unity being called out separately, whether the competition was biased towards games.  And it&#8217;s not really, is it?</p>
<p><strong>Jay Wright:  Unity is a bias toward just rapid development for 3D, I think.  It&#8217;s most commonly associated with games, but there are also a lot of Unity customers that use it for medical simulations and other types of applications that aren&#8217;t really games at all.</strong></p>
<p><strong>Tish Shute:</strong> Yes.  It&#8217;s very flexible, I know.  You did bring up the backend services again.  Are you thinking of offering any of that?</p>
<p><strong>Jay Wright:  There is a backend tool that we offer.  And the backend tool is what you use to generate your targets.  So if you want to create or use a particular image for a target in your application, you upload it to our target management application, and then it will evaluate that target and tell you how well it will work.  So as you know, certain images are more likely to be recognizable than others.  And so there are metrics in that application that will give you some feedback.</strong></p>
<p><strong>And then you can download your target resource from the website that you can then incorporate into your application project.</strong></p>
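<p><em>Qualcomm&#8217;s actual evaluation metrics are not public, but the intuition behind &#8220;certain images are more likely to be recognizable than others&#8221; can be sketched: natural feature trackers want plenty of high-contrast detail, spread across the whole image. Here is a hypothetical Python illustration of that idea (not the real tool&#8217;s metric):</em></p>

```python
import numpy as np

def target_quality_score(gray):
    """Rough, hypothetical 'trackability' score in [0, 1] for a candidate
    image target. Not Qualcomm's metric -- just the underlying idea:
    reward dense, well-distributed high-contrast detail."""
    gray = gray.astype(np.float32)
    # Strong gradients (edges, corners) are what feature detectors latch onto.
    gy, gx = np.gradient(gray)
    magnitude = np.hypot(gx, gy)
    # Feature density: fraction of pixels with significant local contrast.
    density = (magnitude > 20.0).mean()
    # Spatial spread: a target whose detail is bunched in one corner tracks
    # poorly, so compare the sparsest quadrant against the overall density.
    h, w = gray.shape
    quadrants = [magnitude[:h // 2, :w // 2], magnitude[:h // 2, w // 2:],
                 magnitude[h // 2:, :w // 2], magnitude[h // 2:, w // 2:]]
    spread = min((q > 20.0).mean() for q in quadrants) / max(density, 1e-6)
    return float(min(1.0, density * 5.0) * min(1.0, spread))

# A flat gray image gives the tracker nothing; a coarse checkerboard is rich
# in well-spread features and scores much higher.
flat = np.full((64, 64), 128, dtype=np.uint8)
checker = ((np.indices((64, 64)).sum(axis=0) // 8) % 2 * 255).astype(np.uint8)
assert target_quality_score(flat) == 0.0
assert target_quality_score(checker) > 0.5
```

<p><em>A busy photograph or textured game board would score well here, while a logo with large flat areas would score poorly &#8211; the same kind of feedback the target management application gives developers.</em></p>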
<p><strong>Tish Shute:</strong> So all of this &#8211; the tools, information, and documentation &#8211; is available at the moment only to people who are in the private beta, right?</p>
<p><strong>Jay Wright:  That&#8217;s correct.</strong></p>
<p><strong>Tish Shute: </strong>So that&#8217;s an incentive to encourage people to submit to the private beta.  Now, the other thing that people seem confused about: in one part you say 25 developers.  And some people have thought that meant it was limited to 25 individuals.  And some people have maybe four people on their team, so they were asking, &#8220;Well, are we going to be accepted because we have four developers, or do we count as one because we are all working on the same project?&#8221;</p>
<p><strong>Jay Wright:   It&#8217;s just 25 companies.</strong></p>
<p><strong>Tish Shute: </strong> OK.  I think we&#8217;ve gone through the questions.  Just to clarify, and maybe give some incentive for people to apply to the private beta&#8230;the big advantage of getting in the private beta, aside from getting a month&#8217;s head start on the competition, is that you get a chance to provide input, right?</p>
<p><strong>Jay Wright:  Yes.  A chance to provide feedback, get early access to the technology.  And then we are also providing a free HTC phone.</strong></p>
<p><strong>Tish Shute:</strong> Oh, yes.  I forgot the phone.  Yes, right.  In the requirements, though, you basically seem to be asking for sort of a full app&#8230;some people get reticent about revealing their full application plan, right?</p>
<p><strong>Jay Wright:  Yes.  I understand that.  People should just reveal what they are comfortable talking about.  Just so you understand the constraint on this end: this is early technology, and we&#8217;re trying to understand exactly what the support requirement is going to be.  And we have limited support resources at this time, so we want to make sure that we can focus the resources that we have on folks who are really going to use the technology and have a sound plan to actually build something.  So that&#8217;s really the motivation behind limiting the size of the private beta.</strong></p>
<p><strong>Tish Shute:</strong> OK.  Yes, it&#8217;s good to reiterate that.  We&#8217;re down to the last question that I have, and then I&#8217;ll ask you if there is anything that I missed.  You say you are partnering with Mattel.  Who are the developers?  Because Mattel isn&#8217;t an augmented reality development team.</p>
<p><strong>Jay Wright:  Mattel used a subcontractor, <a href="http://www.aura.net.au/">Aura Interactive</a>.</strong></p>
<p><strong>Tish Shute: </strong> Nice.  But that&#8217;s your only partner that I saw, right?  Why Mattel?</p>
<p><strong>Jay Wright:  Well, to launch a new technology, companies will often find showcase partners to demonstrate compelling uses of it.  And we thought Mattel and the Rock&#8217;em Sock&#8217;em&#8482; toy was a great example of combining augmented reality with an existing toy.</strong></p>
<p><strong>Tish Shute:</strong> And I think people agree with you on Rock&#8217;em Sock&#8217;em (see <a href="http://www.readwriteweb.com/archives/qualcomm_launching_mobile_sdk_for_vision-based_ar_on_android_this_fall.php" target="_blank">Chris Cameron&#8217;s RWW post</a>).</p>
<p><strong>Jay Wright:  And there are other showcase partners and applications that we will continue to work on to kind of spur the ecosystem and show what is possible.</strong></p>
<p><strong>Tish Shute: </strong>OK.  Now, is there anything I&#8217;ve left out that you think is important?  What&#8217;s the core of this narrative that we need to get across &#8211; is there a key piece I&#8217;ve missed?</p>
<p><strong>Jay Wright:  I think you&#8217;ve done an excellent job of covering all the bases, Tish.</strong></p>
<p><strong>Tish Shute: </strong> [laughs]</p>
<p><strong>Jay Wright:  I think the important overriding message to get across is that we really see ourselves in an enablement role here, and that we are trying to provide&#8230;we&#8217;d like to provide fundamental technology that helps all developers build content for the real world.</strong></p>
]]></content:encoded>
			<wfw:commentRss>https://www.ugotrade.com/2010/08/05/vision-based-augmented-reality-ar-in-smart-phones-qualcomms-ar-sdk-interview-with-jay-wright/feed/</wfw:commentRss>
		<slash:comments>3</slash:comments>
		</item>
		<item>
		<title>Total Immersion and the &#8220;Transfigured City:&#8221; Shared Augmented Realities, the &#8220;Web Squared Era,&#8221; and Google Wave</title>
		<link>https://www.ugotrade.com/2009/09/26/total-immersion-and-the-transfigured-city-shared-augmented-realities-the-web-squared-era-and-google-wave/</link>
		<comments>https://www.ugotrade.com/2009/09/26/total-immersion-and-the-transfigured-city-shared-augmented-realities-the-web-squared-era-and-google-wave/#comments</comments>
		<pubDate>Sun, 27 Sep 2009 04:42:42 +0000</pubDate>
		<dc:creator><![CDATA[Tish Shute]]></dc:creator>
				<category><![CDATA[Ambient Devices]]></category>
		<category><![CDATA[Ambient Displays]]></category>
		<category><![CDATA[Android]]></category>
		<category><![CDATA[Augmented Reality]]></category>
		<category><![CDATA[digital public space]]></category>
		<category><![CDATA[Instrumenting the World]]></category>
		<category><![CDATA[internet of things]]></category>
		<category><![CDATA[iphone]]></category>
		<category><![CDATA[mirror worlds]]></category>
		<category><![CDATA[Mixed Reality]]></category>
		<category><![CDATA[mobile augmented reality]]></category>
		<category><![CDATA[mobile meets social]]></category>
		<category><![CDATA[Mobile Reality]]></category>
		<category><![CDATA[new urbanism]]></category>
		<category><![CDATA[sustainable living]]></category>
		<category><![CDATA[sustainable mobility]]></category>
		<category><![CDATA[ubiquitous computing]]></category>
		<category><![CDATA[Web Meets World]]></category>
		<category><![CDATA[websquared]]></category>
		<category><![CDATA[3D Interactive Live Show]]></category>
		<category><![CDATA[Acrossair]]></category>
		<category><![CDATA[AMEE]]></category>
		<category><![CDATA[Amphibious Architecture]]></category>
		<category><![CDATA[anime]]></category>
		<category><![CDATA[Apple iPhone]]></category>
		<category><![CDATA[AR baseball cards for Topps]]></category>
		<category><![CDATA[AR Consortium]]></category>
		<category><![CDATA[AR eyewear]]></category>
		<category><![CDATA[AR goggles]]></category>
		<category><![CDATA[Architectural League of New York]]></category>
		<category><![CDATA[ARML]]></category>
		<category><![CDATA[ARN]]></category>
		<category><![CDATA[Augmented City]]></category>
		<category><![CDATA[augmented city lab]]></category>
		<category><![CDATA[augmented reality books]]></category>
		<category><![CDATA[augmented reality entrpreneurship]]></category>
		<category><![CDATA[augmented reality goggles]]></category>
		<category><![CDATA[augmented reality making visible the invisible]]></category>
		<category><![CDATA[augmented reality mark-up language]]></category>
		<category><![CDATA[augmented reality pollution meter]]></category>
		<category><![CDATA[augmented reality standards]]></category>
		<category><![CDATA[augmented reality toys]]></category>
		<category><![CDATA[augmented virtuality]]></category>
		<category><![CDATA[Bionic Eye]]></category>
		<category><![CDATA[Blair Macintyre]]></category>
		<category><![CDATA[Bruce Sterling]]></category>
		<category><![CDATA[Bruno Uzzan]]></category>
		<category><![CDATA[Conflux]]></category>
		<category><![CDATA[cross platform compatibility for augmented reality]]></category>
		<category><![CDATA[D'Fusion]]></category>
		<category><![CDATA[Daniel Wagner]]></category>
		<category><![CDATA[Denno Coil]]></category>
		<category><![CDATA[distributed]]></category>
		<category><![CDATA[elements of networked urbanism]]></category>
		<category><![CDATA[Elizabeth Goodman]]></category>
		<category><![CDATA[everyware]]></category>
		<category><![CDATA[Fish 'n Microchips]]></category>
		<category><![CDATA[Flickr]]></category>
		<category><![CDATA[Gavin Starks]]></category>
		<category><![CDATA[Gene Becker]]></category>
		<category><![CDATA[geo spatial web]]></category>
		<category><![CDATA[geoAR]]></category>
		<category><![CDATA[geoaugmentation]]></category>
		<category><![CDATA[Google Wave]]></category>
		<category><![CDATA[Google Wave Protocol]]></category>
		<category><![CDATA[Gov 2.0 Expo Showcase]]></category>
		<category><![CDATA[Gov 2.0 Summit]]></category>
		<category><![CDATA[Graz University of Technology]]></category>
		<category><![CDATA[Imagination]]></category>
		<category><![CDATA[Incheon Free Economic Zone]]></category>
		<category><![CDATA[information shadows]]></category>
		<category><![CDATA[Int13]]></category>
		<category><![CDATA[Interaction Design for Augmented Reality]]></category>
		<category><![CDATA[ISMAR 2009]]></category>
		<category><![CDATA[Jeremy Hight]]></category>
		<category><![CDATA[Joe Lamantia]]></category>
		<category><![CDATA[Jonathan Laventhol]]></category>
		<category><![CDATA[Korea's u-Cities]]></category>
		<category><![CDATA[Layar]]></category>
		<category><![CDATA[Layar 3D]]></category>
		<category><![CDATA[magic lens augmented reality]]></category>
		<category><![CDATA[manga]]></category>
		<category><![CDATA[Mark Shepard]]></category>
		<category><![CDATA[Mark Weiser]]></category>
		<category><![CDATA[markerless mobile augmented reality]]></category>
		<category><![CDATA[Metaio]]></category>
		<category><![CDATA[Microsoft Bing]]></category>
		<category><![CDATA[Mike Kuniavsky]]></category>
		<category><![CDATA[Mobilizy]]></category>
		<category><![CDATA[multiuser augmented reality]]></category>
		<category><![CDATA[Natalie Jeremijenko]]></category>
		<category><![CDATA[Natural Fuse]]></category>
		<category><![CDATA[near-field object rcognition and tracking]]></category>
		<category><![CDATA[Networked City]]></category>
		<category><![CDATA[networked urbanism]]></category>
		<category><![CDATA[newer urbanism]]></category>
		<category><![CDATA[open]]></category>
		<category><![CDATA[open augmented reality framework]]></category>
		<category><![CDATA[open augmented reality network]]></category>
		<category><![CDATA[Orange Cone]]></category>
		<category><![CDATA[Ori Inbar]]></category>
		<category><![CDATA[Pachube]]></category>
		<category><![CDATA[realtime panorama mapping on mobile phones]]></category>
		<category><![CDATA[RobotVision]]></category>
		<category><![CDATA[sensor networks]]></category>
		<category><![CDATA[Sentient City Survival Kit]]></category>
		<category><![CDATA[Shangri La]]></category>
		<category><![CDATA[shared augmented realities]]></category>
		<category><![CDATA[Sky Writer]]></category>
		<category><![CDATA[Steven Feiner]]></category>
		<category><![CDATA[symbiosis between augmented reality and brands]]></category>
		<category><![CDATA[the internet of things]]></category>
		<category><![CDATA[the LAN of things]]></category>
		<category><![CDATA[the shape of alpha]]></category>
		<category><![CDATA[the web squared era]]></category>
		<category><![CDATA[ThingM]]></category>
		<category><![CDATA[things as services]]></category>
		<category><![CDATA[Thomas Wrobel]]></category>
		<category><![CDATA[Tim O'Reilly]]></category>
		<category><![CDATA[Tod E. Kurt]]></category>
		<category><![CDATA[Total Immersion]]></category>
		<category><![CDATA[Toward the Sentient City]]></category>
		<category><![CDATA[Transfigured City]]></category>
		<category><![CDATA[twitter]]></category>
		<category><![CDATA[u-City]]></category>
		<category><![CDATA[ubiquitous computing and augmented reality]]></category>
		<category><![CDATA[uCity]]></category>
		<category><![CDATA[Usman Haque]]></category>
		<category><![CDATA[Wave Federation Protocol]]></category>
		<category><![CDATA[Weisarian Ubiquitous Computing]]></category>
		<category><![CDATA[Wikitude]]></category>
		<category><![CDATA[xClinic]]></category>
		<category><![CDATA[XMPP versus HTTP]]></category>
		<category><![CDATA[Yocahi Benkler]]></category>

		<guid isPermaLink="false">http://www.ugotrade.com/?p=4439</guid>
		<description><![CDATA[Above is an image from Total Immersion&#8217;s augmented reality experience developed for the &#8220;Networked City&#8221; exhibition in South Korea &#8211; &#8220;a fun scenario created for a u-City&#8217;s infrastructure and city management service&#8221; &#8220;To the naked eye, the exhibit looks like a bare bones model of a city. But when visitors put on the special [&#8230;]]]></description>
				<content:encoded><![CDATA[<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/09/dhj5mk2g_338cwpzntgp_b.jpg"><img class="alignnone size-medium wp-image-4440" title="dhj5mk2g_338cwpzntgp_b" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/09/dhj5mk2g_338cwpzntgp_b-300x170.jpg" alt="dhj5mk2g_338cwpzntgp_b" width="300" height="170" /></a></p>
<p><em>Above is an image from <a href="http://www.t-immersion.com/" target="_blank">Total Immersion&#8217;s</a> augmented reality experience developed for the <a id="winm" title="&quot;Networked City&quot; exhibition in South Korea, &quot;" href="http://www.tomorrowcity.or.kr/sv_web/en_US/space.SpaceInfo.web?targetMethod=DoUe04Sub1" target="_blank">&#8220;Networked City&#8221; exhibition in South Korea</a> &#8211; &#8220;a fun scenario created for a <a href="http://www.koreaittimes.com/story/4371/leading-global-u-city" target="_blank">u-City&#8217;s</a> infrastructure and city management service&#8221; </em></p>
<p><strong>&#8220;To the naked eye, the exhibit looks like a bare bones model of a city. But when visitors put on the special AR goggles a whole new world unfolds &#8211; as graphics overlaid on the city model.</strong><em><strong>&#8221; </strong>(<a href="http://gamesalfresco.com/2009/09/14/total-immersion-brings-augmented-reality-to-tomorowcity-todaytomorrow/" target="_blank">Games Alfresco)</a></em></p>
<p>&#8220;The Networked City,&#8221; is a large scale augmented virtuality of a scenario for a networked city. But my guess, reading the &nbsp; &nbsp;    <em><a href="http://www.koreaittimes.com/story/4371/leading-global-u-city" target="_blank">Korea IT Times</a></em>, is the plan is to move from an augmented virtuality to an augmented reality as Incheon Free Economic ZoneÂ  (IFEZ) realizes its vision to become a leading u-City &#8211; where reality is turned &#8220;inside out&#8221; (see <a id="x:2w" title="Inside Out Reality" href="http://www.uxmatters.com/mt/archives/2009/08/inside-out-interaction-design-for-augmented-reality.php">Inside Out: Interaction Design for Augmented Reality )</a>.Â <a id="x:2w" title="Inside Out Reality" href="http://www.uxmatters.com/mt/archives/2009/08/inside-out-interaction-design-for-augmented-reality.php"> </a>If you are not familiar with South Korea&#8217;s u-Cities, <a href="http://www.koreaittimes.com/story/4371/leading-global-u-city" target="_blank">check out this post</a> for a short primer (and note<a href="http://www.google.com/trends?q=augmented+reality&amp;ctab=1986817859&amp;geo=all&amp;date=all" target="_blank"> Google Trends search on Augmented Reality </a>showsÂ  South Korea leaving everyone else in the dust).<a href="http://www.koreaittimes.com/story/4371/leading-global-u-city" target="_blank"></p>
<p></a></p>
<h3>Ubiquitous computing and augmented reality are like adenine and thymine &#8211; a DNA base pair.</h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/09/Screen-shot-2009-09-24-at-11.34.35-PM.png"><img class="alignnone size-medium wp-image-4442" title="Screen shot 2009-09-24 at 11.34.35 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/09/Screen-shot-2009-09-24-at-11.34.35-PM-300x256.png" alt="Screen shot 2009-09-24 at 11.34.35 PM" width="300" height="256" /></a></p>
<p><em>A sky view of Incheon Free Economic Zone (<a href="http://www.koreaittimes.com/story/4371/leading-global-u-city" target="_blank">from Korean IT Times</a>). For more on the IFEZ vision to become a leading u-City <a href="http://www.koreaittimes.com/story/4371/leading-global-u-city" target="_blank">see here</a>.</em></p>
<p><a href="http://www.koreaittimes.com/story/4371/leading-global-u-city" target="_blank">Korea IT Times</a> writes about the u-city concept:</p>
<p><strong>&#8220;Korea began using the term u-City after accepting the concept of ubiquitous computing, a post-desktop model of human-computer interaction created by Mark Weiser, the chief technologist of the Xerox Palo Alto Research Center in California, in 1998. There have been a lot of research in this field since 2002. As a result, many local governments in Korea have applied this concept to various development projects since 2005 based on a practical approach to it.&#8221;</strong></p>
<p>The back story to many of my recent posts, including this one, is an understanding of a relationship between ubiquitous computing and augmented reality that emerged, for me, in a February conversation with Adam Greenfield, <a title="Permanent Link to Towards a Newer Urbanism: Talking Cities, Networks, and Publics with Adam Greenfield" rel="bookmark" href="../../2009/02/27/towards-a-newer-urbanism-talking-cities-networks-and-publics-with-adam-greenfield/">Towards a Newer Urbanism: Talking Cities, Networks, and Publics with Adam Greenfield</a>. In case you missed it, here is the link again, because I think it holds up very well considering the rapid developments of recent months. Also, importantly for this post, it includes a discussion of moving on from Weiserian visions.</p>
<p><a href="http://speedbird.wordpress.com/" target="_blank">Adam Greenfield&#8217;s Speedbird</a> is one of my key sources for understanding &#8220;networked urbanism,&#8221; and the list he makes of <a href="http://speedbird.wordpress.com/2009/03/22/the-elements-of-networked-urbanism/" target="_blank">the elements of networked urbanism here</a> (also see the comments) &#8211; is my mantra for thinking about the DNA base pair relationship of augmented reality and ubiquitous computing.</p>
<p>Adam Greenfield&#8217;s, <strong>&#8220;summary of what those of us who are thinking, writing and speaking about networked urbanism seem to be seeing&#8221;</strong> is:</p>
<p><strong>1. From <em>latent</em> to <em>explicit</em>; 2. From <em>browse</em> to <em>search</em>; 3. From <em>held</em> to <em>shared</em>; 4. From <em>expiring</em> to <em>persistent</em>; 5. From <em>deferred</em> to <em>real-time</em>; 6. From <em>passive</em> to <em>interactive</em>; 7. From <em>component</em> to <em>resource</em>; 8. From <em>constant</em> to <em>variable</em>; 9. From <em>wayfinding</em> to <em>wayshowing</em>; 10. From <em>object</em> to <em>service</em>; 11. From <em>vehicle</em> to <em>mobility</em>; 12. From <em>community</em> to <em>social network</em>; 13. From <em>ownership</em> to <em>use</em>; 14. From <em>consumer</em> to <em>constituent</em>.</strong></p>
<h3>Augmented Reality &#8211; Making Visible the Invisible</h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/09/Screen-shot-2009-09-26-at-2.44.27-PM.png"><img class="alignnone size-medium wp-image-4509" title="Screen shot 2009-09-26 at 2.44.27 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/09/Screen-shot-2009-09-26-at-2.44.27-PM-300x229.png" alt="Screen shot 2009-09-26 at 2.44.27 PM" width="300" height="229" /></a></p>
<p>The screenshot above is one of the coolest &#8220;making visible the invisible&#8221; AR applications. It was developed at Columbia University Graphics and User Interface Lab, where <a href="http://www1.cs.columbia.edu/%7Efeiner/" target="_blank">Steven Feiner</a> is Director (see the deep list of projects from the lab <a href="http://graphics.cs.columbia.edu/top.html" target="_blank">here</a>). This app &#8220;shows carbon monoxide levels projected over New York City. The height of each ball reflects concentrations of the pollutant.&#8221; Credit: Sean White and Steven Feiner (<a href="http://www.technologyreview.com/computing/23515/page2/" target="_blank">via Technology Review</a>).</p>
<p>The recent emergence of &#8220;magic lens&#8221; augmented reality apps for our smart phones &#8211; <a href="http://www.wikitude.org/" target="_blank">Wikitude</a>, <a href="http://layar.com/" target="_blank">Layar</a>, <a href="http://www.acrossair.com/" target="_blank">Acrossair</a>, <a href="http://support.sekaicamera.com/">Sekai Camera</a>, and many others now &#8211; has given us a new window into our cities. But we have yet to realize the full potential of the AR/ubicomp base pair that can &#8220;make visible the invisible&#8221; and give us new opportunities to relate to the invisible data ecosystems of our cities, not merely as a spectator experience, but as an interactive, in-context, real-time opportunity to reimagine social relations.</p>
<p><a href="http://www.sentientcity.net/exhibit/?p=3" target="_blank">Mark Shepard</a> says in <a href="http://www.sentientcity.net/exhibit/?p=3" target="_blank">his curatorial statement</a> for, <a href="http://www.sentientcity.net/exhibit/" target="_blank">&#8220;Toward the Sentient City:&#8221;</a> (Much more soon on this very significant exhibit which runs from Sept. 17th to Nov. 7th, 2009.)</p>
<p><strong>&#8220;In place of natural weather systems, however, today we find the dataclouds of 21st century urban space increasingly shaping our experience of this city and the choices we make there.&#8221;</strong></p>
<p>Augmented Reality, as Joe Lamantia points out, is becoming the great &#8220;<a id="o0mh" title="ambassador of ubiquitous computing" href="http://www.uxmatters.com/mt/archives/2009/08/inside-out-interaction-design-for-augmented-reality.php">ambassador of ubiquitous computing</a>.&#8221; AR is &#8220;<strong>&#8230;mak[ing] it possible to experience the new world of ubiquitous computing by reifying the digital layer that permeates our inside-out world,&#8221; </strong>and we are only just glimpsing the razor thin end of the wedge in this regard.</p>
<p>I am still working on my <a href="http://www.gov2summit.com/" target="_blank">Gov 2.0 Summit</a> write-up and, amongst other things, I will talk about how an emerging new social contract around open data, here in the US, will put augmented reality apps center stage &#8211; &#8220;doing stuff that matters.&#8221; At <a href="http://www.gov2expo.com/gov2expo2009" target="_blank">Gov 2.0 Expo Showcase</a> Tim O&#8217;Reilly tweeted:</p>
<p><a id="i23q" title="Tim O'Reilly" href="http://twitter.com/timoreilly">Tim O&#8217;Reilly</a> Really enjoyed @capttaco (Digital Arch Design) @ #gov20e: &#8220;Augmented Reality could be a new public infrastructure&#8221; <a href="http://bit.ly/18iCx" target="_blank">http://bit.ly/18iCx</a></p>
<p>Also see Tim O&#8217;Reilly and Jennifer Pahlka on Forbes.com discussing <a href="http://www.forbes.com/2009/09/23/web-squared-oreilly-technology-breakthroughs-web2point0.html" target="_blank">the &#8220;Web Squared&#8221; Era</a> &#8211; <strong>&#8220;the Web Squared era is an era of augmented reality arriving (like the sensor revolution) stealthily, in more pedestrian clothes than we expected&#8230; our world will have &#8216;<a href="http://www.orangecone.com/archives/2009/02/smart_things_an.html" target="_blank">information shadows</a>.&#8217; Augmented reality amounts to information shadows made visible.&#8221;</strong></p>
<p>Again, there is a back story to how I came to think about information shadows in relation to augmented reality. So in case you missed it the first time, here is the link to a conversation that began in a hallway meeting between Tim O&#8217;Reilly; Mike Kuniavsky, <a href="http://thingm.com/" target="_blank">ThingM</a>; Usman Haque, <a href="http://www.pachube.com/" target="_blank">Pachube</a>; and Gavin Starks, <a href="http://www.amee.com/" target="_blank">AMEE</a>, at <a href="http://en.oreilly.com/et2009/" target="_blank">ETech earlier this year</a>: <a title="Permanent Link to Dematerializing the World, Shadows, Subscriptions and Things as Services: Talking With Mike Kuniavsky at ETech 2009" rel="bookmark" href="../../2009/03/18/dematerializing-the-world-shadows-subscriptions-and-things-as-services-talking-with-mike-kuniavsky-at-etech-2009/">&#8220;Dematerializing the World, Shadows, Subscriptions and Things as Services: Talking With Mike Kuniavsky at ETech 2009.&#8221;</a></p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/09/Screen-shot-2009-09-26-at-9.32.09-PM.png"><img class="alignnone size-medium wp-image-4547" title="Screen shot 2009-09-26 at 9.32.09 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/09/Screen-shot-2009-09-26-at-9.32.09-PM-300x225.png" alt="Screen shot 2009-09-26 at 9.32.09 PM" width="300" height="225" /></a></p>
<p><a href="http://www.slideshare.net/rlenz/augmented-city-lab-picnic-09" target="_blank">Slide from Augmented City Lab</a> @ <a href="http://www.picnicnetwork.org/" target="_blank">Picnic &#8217;09</a></p>
<h3>So What&#8217;s Next for Mobile Augmented Reality?</h3>
<p><a href="http://www.youtube.com/watch?v=434zw201iN8&amp;feature=player_embedded" target="_blank"><img class="alignnone size-medium wp-image-4513" title="Screen shot 2009-09-26 at 3.45.45 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/09/Screen-shot-2009-09-26-at-3.45.45-PM-300x186.png" alt="Screen shot 2009-09-26 at 3.45.45 PM" width="300" height="186" /></a></p>
<p>These videos from Daniel Wagner&#8217;s team at Graz University of Technology, showing <a href="http://www.youtube.com/watch?v=434zw201iN8&amp;feature=player_embedded" target="_blank">Realtime Panorama Mapping and Tracking on Mobile Phones</a> and <a href="http://www.youtube.com/watch?v=W-mJG3peIXA&amp;feature=player_embedded" target="_blank">Creating an Indoor Panorama in Realtime</a>, indicate, as Rouli from Games Alfresco points out, that there is a lot in store for us at <a href="http://www.icg.tugraz.at/Members/daniel/MultipleTargetDetectionAndTrackingWithGuaranteedFrameratesOnMobilePhones/inproceedings_view">ISMAR09</a>.</p>
<p>We may not be so impressed by directory-style/&#8220;post it&#8221; AR anymore, as these applications have become commonplace so quickly! But while these early mobile AR apps may be disappointing in relation to some futurist visions of AR &#8211; merely AR/ubicomp appetizers &#8211; there are still good implementations of this model coming out (see newcomers to the app store <a id="tzvf" title="Bionic Eye" href="http://mashable.com/2009/09/24/bionic-eye/" target="_blank">Bionic Eye</a> and <a href="http://www.readwriteweb.com/archives/robotvision_a_bing-powered_iphone_augmented_realit.php" target="_blank">RobotVision</a>). And <a href="http://layar.com/" target="_blank">Layar</a>, always on the ball, has upped the ante for the new cohort of AR browsers with <a href="http://layar.com/3d/" target="_blank">Layar 3D</a>.</p>
<p>But as Bruce Sterling <a href="http://www.wired.com/beyond_the_beyond/2009/09/augmented-reality-robotvision/" target="_blank">notes here</a>:</p>
<p><strong>&#8220;In AR, everybody wants to be the platform and the browser, and nobody wants to be the boring old geolocative database. Look how Tim [creator of RobotVision] here, who is like one guy working on his weekends, can boldly fold in the multi-billion dollar, multi-million user empires of Apple iPhone, Microsoft Bing, Flickr, and Twitter, all under his right thumb.&#8221;</strong></p>
<p> (watch <a id="qxek" title="video here" href="http://www.youtube.com/watch?v=hWC9gax7SCA&amp;feature=player_embedded">video here</a>)</p>
<p>But if you are looking for something more from AR, you probably won&#8217;t have to wait too long. The two pioneering companies in AR, <a href="http://www.t-immersion.com/" target="_blank">Total Immersion</a> &#8211; founded in 1999 &#8211; and <a href="http://www.metaio.com/" target="_blank">Metaio</a> &#8211; founded in 2003 &#8211; are both coming out with &#8220;mobile augmented reality platforms&#8221; in a matter of weeks (see press releases <a href="http://augmented-reality-news.com/2009/09/14/bringing-its-augmented-reality-to-mobile-applications-total-immersion-partners-with-smartphones-app-provider-int13/" target="_blank">here</a> and <a href="http://gamesalfresco.com/2009/09/18/metaio-announcing-mobile-augmented-reality-platform-junaio/" target="_blank">here</a>). And both companies, it seems, will deploy much more sophisticated AR rendering and tracking than we have seen to date.</p>
<p>I approached Bruno Uzzan, founder and CEO of Total Immersion, for an interview as part of my look at the new industry of augmented reality through the eyes of the founding members of the <a href="http://www.arconsortium.org/" target="_blank">AR Consortium</a>. These consortium members are some of the first commercial augmented reality companies.</p>
<p><a href="#jumpto">The interview below</a> with Bruno began early this summer; then we both went on vacation, and it picks up after the announcement of the <a href="http://www.int13.net/blog/en/" target="_blank">partnership between Total Immersion and Int13</a>.</p>
<p>The significance of this announcement is that Total Immersion is now positioned to take the augmented reality experiences they have developed for a number of top brands onto multiple mobile platforms with &#8220;<strong>Int13&#8217;s very clever embedded solution that allows our [Total Immersion's] solutions to work across many [mobile] platforms,&#8221;</strong> while Int13 gets to extend their reach.</p>
<p>Total Immersion has a 50-person R&amp;D team, and their two main focuses have been, firstly, getting:</p>
<p><strong>&#8220;Augmented Reality to work with as many platforms as possible &#8211; PC, Mac, Mobile, Game Consoles, all those are the platforms that we are targeting. We are currently doing lot of work in the R &amp; D team in cross platform compatibility&#8230;.&#8221;</strong></p>
<p>and, secondly:</p>
<p><strong>&#8220;Our R&amp;D guys are working on the real world interacting more with the virtual world. And I have started seeing some results which are pretty much crazy and this will be ready for next year.&#8221;</strong></p>
<h3>Pandora&#8217;s Box &#8211; Shared Augmented Realities</h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/09/Screen-shot-2009-09-25-at-1.18.15-AM.png"><img class="alignnone size-medium wp-image-4450" title="Screen shot 2009-09-25 at 1.18.15 AM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/09/Screen-shot-2009-09-25-at-1.18.15-AM-186x300.png" alt="Screen shot 2009-09-25 at 1.18.15 AM" width="186" height="300" /></a></p>
<p>Spes or &#8220;Hope&#8221;; <a title="Engraving" href="http://en.wikipedia.org/wiki/Engraving">engraving</a> by <a title="Sebald Beham" href="http://en.wikipedia.org/wiki/Sebald_Beham">Sebald Beham</a>, German c1540 (see <a href="http://en.wikipedia.org/wiki/Pandora%27s_box" target="_blank">Wikipedia article on Pandora&#8217;s Box</a>)</p>
<p>There are many weaknesses to the mobile smart phone AR experiences we have now, and the lack of near-field object recognition (to date) and difficulties with accurate positioning aren&#8217;t the only ones. Note, regarding solving positioning problems in mobile AR: we have yet to see AR leverage public libraries for analyzing scenes, like Flickr&#8217;s geo-tagged photos &#8211; see Aaron Straup Cope&#8217;s work on <a href="http://code.flickr.com/blog/2008/10/30/the-shape-of-alpha/" target="_blank">&#8220;The Shape of Alpha,&#8221;</a> and, for more on this, <a href="http://www.ugotrade.com/2009/06/02/location-becomes-oxygen-at-where-20-wherecamp/" target="_blank">my post here</a>.</p>
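<p>As a toy illustration of the kind of positioning prior such public photo libraries could supply &#8211; a minimal sketch, assuming only a list of latitude/longitude pairs for photos tagged with one place (the coordinates below are invented; the real &#8220;Shape of Alpha&#8221; work computes far more sophisticated alpha shapes):</p>

```python
# Toy sketch only: turn geo-tagged photos for one place into a crude
# location prior (centroid + bounding box). Flickr's "Shape of Alpha"
# derives proper alpha shapes; this is just the simplest stand-in.

def location_prior(points):
    """points: list of (lat, lon) tuples for photos tagged with one place."""
    lats = [lat for lat, _ in points]
    lons = [lon for _, lon in points]
    centroid = (sum(lats) / len(lats), sum(lons) / len(lons))
    bbox = (min(lats), min(lons), max(lats), max(lons))  # (S, W, N, E)
    return centroid, bbox

# Invented coordinates for photos tagged with one landmark:
photos = [(40.7075, -73.9908), (40.7068, -73.9921), (40.7082, -73.9899)]
centroid, bbox = location_prior(photos)
```

An AR browser could use such a prior to decide which annotations are even candidates for a given camera frame before any fine-grained tracking runs.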
<p>But, as Joe Lamantia points out:</p>
<p><strong>&#8220;One of the weakest aspects of the existing interaction patterns for augmented reality is their reliance on single-person, socially disconnected user experiences.&#8221;</strong></p>
<p>In my view, <strong>The Pandora&#8217;s Box of Augmented Realities</strong> is an open, distributed, multiuser augmented reality framework, fully integrated with the internet and world wide web.</p>
<p>As Yochai Benkler has pointed out many times, and argues again in <a href="Capital, Power, and the Next Step in Decentralization" target="_blank">Capital, Power, and the Next Step in Decentralization</a>, it is &#8220;open, collaborative, distributed practices that have been at the core of what made the Internet.&#8221; We have to try to make sure that open, collaborative, distributed practices are at the core of mobile augmented reality.</p>
<h3>Can Google Wave be the basis for an Open, Distributed, Multiuser Augmented Reality Framework?</h3>
<p><a href="http://www.lostagain.nl/tempspace/PrototypeDiagram.html" target="_blank"><img class="alignnone size-medium wp-image-4492" title="Screen shot 2009-09-25 at 11.51.20 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/09/Screen-shot-2009-09-25-at-11.51.20-PM-300x141.png" alt="Screen shot 2009-09-25 at 11.51.20 PM" width="300" height="141" /></a></p>
<p>I have been exploring the idea of using the <a href="http://wave.google.com/" target="_blank">Google Wave</a> protocol as the basis for a distributed, multiuser, open augmented reality framework with a small group of AR enthusiasts and developers. And I am happy to say the proposal is beginning to get fleshed out a little. New collaborators are welcome, both for &#8220;gear heady&#8221; input and use case suggestions (but re the latter, you can&#8217;t just say everything you see in <a href="http://en.wikipedia.org/wiki/Denn%C5%8D_Coil" target="_blank">Denno Coil</a>!).</p>
<p>This effort started with Thomas Wrobel&#8217;s proposal for an Open AR Framework prototyped on IRC &#8211; see <a id="s336" title="here" href="../../2009/08/19/everything-everywhere-thomas-wrobels-proposal-for-an-open-augmented-reality-network/">here</a>, and click to enlarge the image above of <a href="http://www.lostagain.nl/tempspace/PrototypeDiagram.html" target="_blank">&#8220;Sky Writer: Basic Concept for an Open Multi-source AR Framework.&#8221;</a></p>
<p>But recently we began looking at the <a href="Wave Federation Protocol" target="_blank">Wave Federation Protocol</a>. And, if you check out <a id="ogbq" title="this post" href="http://www.jasonkolb.com/weblog/2009/09/why-google-wave-is-the-coolest-thing-since-sliced-bread.html#more" target="_blank">this post</a> and <a id="c0ep" title="this post" href="http://reuvencohen.sys-con.com/node/980762" target="_blank">this post</a>, you may get a glimpse of why the Google Wave protocol might be a good basis for an open, distributed AR framework. You will notice, if you study what Google Wave has done with the XMPP protocol, that many of <a href="http://speedbird.wordpress.com/2009/03/22/the-elements-of-networked-urbanism/" target="_blank">the elements of networked urbanism</a> that Adam Greenfield describes resonate strongly with what is being attempted in Wave.</p>
<p>But enough said for now! Regardless of the details of implementation &#8211; Google Wave or an AR protocol built from scratch (phew! the latter does seem like a lot of work) &#8211; an open, distributed, multiuser AR framework integrated with the internet and web would explode the potential of AR, creating new possibilities for data flows, mashups, and shared augmented realities.</p>
<p>And we are excited by Google Wave because, as Thomas puts it:</p>
<p><strong>&#8220;The really great thing Wave does&#8230; (aside from being an open standard backed by a major player, hopefully leading to thousands of worldwide servers)&#8230; is that it allows anyone to create any number of waves, set precisely who can view or edit them, and for them to be able to be updated quickly and continuously (and even simultaneously!) Better yet, changes will (if necessary) propagate to all the other servers sharing that wave. It does all this right now. From my eyes this does a lot of the work of an AR infrastructure already.</strong></p>
<p><strong>I can&#8217;t see any other protocol actually doing anything like this at the moment, although correct me if I&#8217;m wrong, as alternatives are always welcome :)&#8221;</strong></p>
<p>Also, Thomas notes, <strong>&#8220;even the playback system (that is, the ability to play back the changes made to a wave since its creation)&#8230; this could give us automatically some of the ideas Jeremy Hight has mentioned in <a href="http://piim.newschool.edu/journal/issues/2009/01/pdfs/ParsonsJournalForInformationMapping_Hight-Jeremy.pdf" target="_blank">his visionary work here</a> and <a href="http://piim.newschool.edu/journal/issues/2009/02/pdfs/ParsonsJournalForInformationMapping_Hight-Jeremy.pdf" target="_blank">here</a>&#8221;</strong> on &#8220;the geospatial web, interlinked locations and data, immersive augmentation and open source geo augmentation.&#8221;</p>
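<p>To make the properties Thomas lists concrete &#8211; per-participant view/edit permissions, continuous updates, and history playback &#8211; here is an illustrative toy model in Python. Every name and structure here is invented for the sketch; the real Wave protocol achieves this with operational transforms federated over XMPP, not an in-memory list:</p>

```python
# Illustrative toy only: an in-memory "wave" capturing the properties the
# quote describes -- per-participant view/edit permissions, continuous
# updates, and playback of the change history. None of these names come
# from the actual Wave Federation Protocol.

class ToyWave:
    def __init__(self, creator):
        self.ops = []                      # ordered change history
        self.editors = {creator}
        self.viewers = {creator}

    def grant(self, user, can_edit=False):
        self.viewers.add(user)
        if can_edit:
            self.editors.add(user)

    def apply(self, user, op):
        if user not in self.editors:
            raise PermissionError(user)
        self.ops.append((user, op))        # in Wave this would also federate
                                           # to every server sharing the wave

    def playback(self, upto=None):
        """Replay the wave's history -- the 'playback' feature above."""
        return self.ops[:upto]

wave = ToyWave("thomas")
wave.grant("tish", can_edit=True)
wave.grant("lurker")                       # view-only participant
wave.apply("thomas", "place marker at 40.7075,-73.9908")
wave.apply("tish", "attach note to marker")
```

In an AR setting, each op would be an annotation anchored in space rather than a text edit, but the permission and playback machinery is exactly what a shared augmented reality layer needs.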
<p>One of the many reasons an open, distributed AR framework would be so cool is that it would open up all kinds of possibilities for <span>GeoAR</span>, by providing the over-arching standard protocol for communicating the updates needed by the sub-standards that will facilitate <span>GeoAR</span>.</p>
<p>Also important to note is that the <a id="o0is" title="Wave Federation Protocol docs which are all publicly available here" href="http://www.waveprotocol.org/" target="_blank">Wave Federation Protocol</a> allows anyone:</p>
<p><strong>&#8220;to run wave servers and become wave providers, for themselves, or as services for their users, and to &#8216;federate&#8217; waves, that is, to share waves with each other and with Google Wave&#8221;</strong> &#8211; via &#8220;the federation gateway and a federation proxy&#8230; based on an open extension to the <a href="http://www.waveprotocol.org/draft-protocol-spec#RFC3920">XMPP core</a> [RFC3920] protocol to allow near real-time communication between two wave servers.&#8221; See Reuven Cohen&#8217;s blog for more <a id="rmr3" title="here" href="http://reuvencohen.sys-con.com/node/980762" target="_blank">here</a> and <a id="mqxr" title="&quot;HTTP is Dead, Long Live the Real Time Cloud.&quot;" href="http://www.elasticvapor.com/2009/05/http-is-dead-long-live-realtime-cloud.html" target="_blank">here: &#8220;HTTP is Dead, Long Live the Real Time Cloud.&#8221;</a></p>
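<p>For a flavor of what a federated AR update riding on such an XMPP extension might look like, here is a purely indicative stanza. The element and attribute names below are invented for illustration and are not taken from the Wave specification; only the outer XMPP-style message envelope follows the real pattern of server-to-server stanzas:</p>

```xml
<!-- Invented illustration of a federated AR update on an XMPP-style
     stanza; the ar-delta elements are made up, not from the Wave spec. -->
<message from="wave.example.org" to="wave.acmewave.com">
  <ar-delta wave-id="w+geo-notes-42">
    <op author="thomas@example.org">
      <annotate lat="40.7075" lon="-73.9908" text="beaver sighted here"/>
    </op>
  </ar-delta>
</message>
```

The point of the sketch: once updates are just stanzas between servers, any provider can host augmented annotations and federate them, exactly as the quote describes for waves.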
<p>Still, some people have expressed concern that an AR framework using the Google Wave protocol would give Google disproportionate influence. Will Google-specific functionality be an issue? How much stuff is Google-specific just because no one else is using it (yet)? And how much is Google-specific because it holds no value to anyone but Google? These are some of the questions that have come up.</p>
<p>You are going to see a variety of suggestions for standards and specs for open AR coming out in the next few months, which, as Robert Rice of the <a href="http://www.arconsortium.org/" target="_blank">AR Consortium</a> points out, is <strong>&#8220;a good thing, we need that competition early on to settle down on best case.&#8221;</strong> Recently, <a href="http://www.mobilizy.com/" target="_blank">Mobilizy</a> offered up ARML (&#8220;an augmented reality mark-up language specification based on the OpenGIS&#174; KML Encoding Standard (OGC KML) with extensions&#8221;) for consideration &#8211; see <a href="http://www.mobilizy.com/enpress-release-mobilizy-proposes-arml" target="_blank">here</a>.</p>
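<p>Since ARML builds on OGC KML, a point of interest in such a file would look roughly like a standard KML Placemark. The snippet below shows only plain KML (namespace and coordinate order per the OGC KML 2.2 standard); Mobilizy&#8217;s actual extension elements are defined in their proposal, and the coordinates here are invented:</p>

```xml
<!-- Plain OGC KML 2.2 Placemark; ARML layers its AR extensions on top
     of documents shaped like this. Coordinates are lon,lat,alt. -->
<kml xmlns="http://www.opengis.net/kml/2.2">
  <Placemark>
    <name>Manhattan Bridge</name>
    <description>Amphibious Architecture sensor buoy</description>
    <Point>
      <coordinates>-73.9908,40.7075,0</coordinates>
    </Point>
  </Placemark>
</kml>
```

Building on KML means every existing geodata toolchain can already read the location layer, and only the AR-specific extensions are new.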
<p>So it is, perhaps, also important to note that an Open AR Framework should be neutral/transparent to techniques of &#8220;reality recognition&#8221; and methodologies of registration/tracking, allowing various ones to work on the system as new techniques evolve, and supporting as many evolving standards as possible.</p>
<p>Augmented reality developers, like Total Immersion and others with powerful AR rendering/tracking software, should be able to use an Open AR Framework to exchange the data their tracking will use. And the tracking/rendering problems they and other researchers have solved are much harder than figuring out data exchange on a standard infrastructure or protocol!</p>
<p>So I pricked up my ears when I heard Bruno Uzzan, CEO of <a href="http://www.t-immersion.com/" target="_blank">Total Immersion</a> &#8211; the first and currently the largest augmented reality company, with a 50-person R&amp;D team in France and offices in LA, where Bruno himself is now based &#8211; say: <strong>&#8220;Total Immersion is only months away from launching shared mobile augmented reality experiences using near-field object recognition/tracking across multiple platforms&#8221;</strong> (for more details read my conversation with Bruno Uzzan <a href="#jumpto">below</a>).</p>
<p>I was happy when I asked Bruno about the possibilities for developing an open, distributed, multiuser augmented reality framework fully integrated with the internet and world wide web (possibly using Google Wave protocols), and he replied:</p>
<p><span id="pnk:" title="Click to view full content"><strong>&#8220;I think this is feasible. I think that&#8217;s doable, that&#8217;s just in my opinion. I mean some people might have another kind of opinion, but I think that that&#8217;s definitely doable.&#8221;</strong></span></p>
<h3>Total Immersion &#8211; working with the &#8220;symbiosis between augmented reality and brands&#8221;</h3>
<p><a href="http://www.youtube.com/watch?v=I7jm-AsY0lU" target="_blank"><img class="alignnone size-medium wp-image-4457" title="dhj5mk2g_344g64g96cq_b" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/09/dhj5mk2g_344g64g96cq_b-300x224.png" alt="dhj5mk2g_344g64g96cq_b" width="300" height="224" /></a></p>
<p>Total Immersion has created many of the best known and most ambitious augmented reality experiences for major brands to date, including Mattel&#8217;s <a title="new toys" href="http://www.readwriteweb.com/archives/mattels_new_web-enabled_avatar_toys_will_offer_augmented_reality.php">new AR toys</a> to be released in conjunction with the James Cameron film Avatar, <a id="dmas" title="AR baseball cards for Topps" href="http://www.youtube.com/watch?v=I7jm-AsY0lU">AR baseball cards for Topps</a> (<a href="http://www.youtube.com/watch?v=I7jm-AsY0lU" target="_blank">video here</a>, or click the screenshot above), and the <a href="http://www.publishersweekly.com/article/CA6698612.html?industryid=47152" target="_blank">UK&#8217;s first augmented reality books</a>.</p>
<p>Bruno founded Total Immersion 10 years ago when he was just 27. And the kind of conviction it took to survive as an augmented reality business in the decade before augmented reality captured the world&#8217;s attention is remarkable.</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/09/dhj5mk2g_343dbsph2fz_b1.png"><img class="alignnone size-medium wp-image-4456" title="dhj5mk2g_343dbsph2fz_b" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/09/dhj5mk2g_343dbsph2fz_b1-300x225.png" alt="dhj5mk2g_343dbsph2fz_b" width="300" height="225" /></a></p>
<p>AR&#8217;s first steps out into the world, after 17 years as predominantly a lab science, may be &#8220;wobbly&#8221; (what new technology isn&#8217;t?), and sometimes gloriously kitsch &#8211; check out <a id="d_eu" title="the riotous video of the 3D Interactive Live Show Total Immersion produced in Korea" href="http://www.t-immersion.com/en,video-gallery,36.html" target="_blank">this riotous video of the 3D Interactive Live Show Total Immersion produced in Korea</a> (also see the <a href="http://augmented-reality-news.com/2009/09/15/entertainment-first-interactive-3d-live-show-now-open-in-south-korea/" target="_blank">Total Immersion Augmented Reality Blog</a> for more on TI&#8217;s turnkey Interactive 3D Live Show Solution).</p>
<p>As Lamantia points out <a id="eo6x" title="here" href="http://www.uxmatters.com/mt/archives/2009/08/inside-out-interaction-design-for-augmented-reality.php" target="_blank">here</a>, &#8220;projecting mixed realities into public, common, or social spaces makes them social by default.&#8221;</p>
<p>However, the potential for shared, location-based augmented reality experiences is as yet untapped. So I see the entry of the most experienced commercial augmented reality company into mobile as pretty interesting. While smart phone AR still has significant limitations, and it certainly does differ from some of the futurist dreams of AR (see <a id="x3:y" title="Mok Oh's post here on his disappointment in this regard" href="http://allthingsv.com/2009/09/03/you-know-what-really-grinds-my-gears-augmented-reality/">Mok Oh&#8217;s post here on his disappointment in this regard</a>), it is significant that Total Immersion is committing to becoming a leader in mobile AR.</p>
<p>Our smart phones, the powerful networked sensor devices that so many people carry in their pockets, have proved themselves a &#8220;good enough for now&#8221; mediating device for early manifestations of the ubiquitous computing and augmented reality base pair. And now that AR and ubicomp are mixed into the rich, messy soup of everyday life, commerce, business, marketing, art, entertainment, and government, we should get ready to see these technologies grow up fast, and unfold in some surprising ways that lab science didn&#8217;t necessarily predict.</p>
<p>And, perhaps, the new dialogue between scientists and entrepreneurs may spur both communities to outdo themselves.</p>
<p>Particularly, as <a href="http://programmerjoe.com/" target="_blank">Joe Ludwig</a> notes: &#8220;It seems to me that the biggest disconnect between the academics and the entrepreneurs is that they disagree on how far we are from the finish line.&#8221;</p>
<p>See the comments on Ori Inbar&#8217;s post, <a title="Augmented Reality Entrepreneurship: Natural Evolution or Intelligent Design?" rel="bookmark" href="http://gamesalfresco.com/2009/09/22/augmented-reality-entrepreneurship-natural-evolution-or-intelligent-design/">Augmented Reality Entrepreneurship: Natural Evolution or Intelligent Design?</a>, for a courteous but spirited discussion on the potential benefits and frictions of the newly expanded AR community of researchers and entrepreneurs.</p>
<p>As <a href="http://www.cc.gatech.edu/~blair/home.html" target="_blank">Blair MacIntyre </a>(see my long conversation with Blair<a href="http://www.ugotrade.com/2009/06/12/mobile-augmented-reality-and-mirror-worlds-talking-with-blair-macintyre/" target="_blank"> here</a>) notes:</p>
<p><strong>&#8220;not all academics and researchers are only interested in the traditional models of impact. Case in point: I wouldn&#8217;t be building unpublishable games, nor investing so much time talking to the press, entrepreneurs and VCs, if I did not believe strongly in the value of the impact I am having by doing that &#8211; and I know others with the same attitude.&#8221;</strong></p>
<p>In this vein, check out the Marble Game (<a href="http://www.youtube.com/watch?v=6AKgH4On65A&amp;feature=player_embedded" target="_blank">video here</a>) developed by Steve Feiner and his team at Columbia. It&#8217;s enabled by Goblin XNA, an open source AR framework built on top of Microsoft&#8217;s XNA, which powers Xbox Live games, Zune games, and some Windows games. For more about Goblin XNA and AR from Columbia, <a href="http://graphics.cs.columbia.edu/projects/goblin/index.htm" target="_blank">see here</a>. (Hat tip to <a href="http://www.oreillynet.com/pub/au/125" target="_blank">Brian Jepson</a> for this link.)</p>
<p><a href="http://www.youtube.com/watch?v=6AKgH4On65A&amp;feature=player_embedded" target="_blank"><img class="alignnone size-medium wp-image-4528" title="Screen shot 2009-09-26 at 5.16.56 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/09/Screen-shot-2009-09-26-at-5.16.56-PM-300x182.png" alt="Screen shot 2009-09-26 at 5.16.56 PM" width="300" height="182" /></a></p>
<p>We are still waiting for the kind of sexy AR specs that might get wide adoption &#8211; nothing totally game-changing in <a href="http://gigantico.squarespace.com/336554365346/2009/9/20/eye-for-an-iphone.html" target="_blank">Gigantico&#8217;s AR eyewear roundup</a> (though <a href="http://appft1.uspto.gov/netacgi/nph-Parser?Sect1=PTO1&amp;Sect2=HITOFF&amp;d=PG01&amp;p=1&amp;u=%2Fnetahtml%2FPTO%2Fsrchnum.html&amp;r=1&amp;f=G&amp;l=50&amp;s1=%2220080088937%22.PGNR.&amp;OS=DN/20080088937&amp;RS=DN/20080088937" target="_blank">maybe note this Apple patent</a>). But at least researchers are not afraid to explore the possibilities of AR goggles.</p>
<p>But how far are we now, with or without sexy goggles, from a fuller expression of the base pair DNA of ubiquitous computing and augmented reality?</p>
<h3>We may have a LAN of things before we have an Internet of Things</h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/09/dhj5mk2g_345g9bxbwd3_b1.jpg"><img class="alignnone size-medium wp-image-4534" title="dhj5mk2g_345g9bxbwd3_b" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/09/dhj5mk2g_345g9bxbwd3_b1-300x199.jpg" alt="dhj5mk2g_345g9bxbwd3_b" width="300" height="199" /></a></p>
<p><em>The picture above is from a workshop I attended at <a href="http://confluxfestival.org/2009/about/" target="_blank">Conflux</a> last weekend &#8211; <a href="http://confluxfestival.org/2009/events/workshops/natalie-jeremijenko/" target="_blank">Fish &#8217;n microChips</a>, with <a href="http://www.environmentalhealthclinic.net/people/natalie-jeremijenko/" target="_blank">Natalie Jeremijenko</a>. We are at the site of the <a href="http://www.sentientcity.net/exhibit/?p=5" target="_blank">Amphibious Architecture</a> project (a commissioned work for <a href="http://www.sentientcity.net/exhibit/?cat=3" target="_blank">Toward the Sentient City</a>) and &#8220;a collaborative project with <a href="http://www.environmentalhealthclinic.net/environmental-health-clinic/" target="_blank">xClinic</a>, The Living and other intelligent creatures.&#8221;</em></p>
<p>We are probably as far off some grand futurist visions of ubiquitous computing as we are off some of the futurist visions of augmented reality. But as it turns out, that may not be a bad thing! Recently, <a href="http://twitter.com/mikekuniavsky" target="_blank">@mikekuniavsky</a> noted in a tweet:</p>
<p><span><span>&#8220;Another argument for the LAN of Things before the Internet of Things: <a rel="nofollow" href="http://tinyurl.com/lgp9uq" target="_blank">http://tinyurl.com/lgp9uq&#8221;</a></span></span></p>
<p>Bert Moore, <a href="http://www.aimglobal.org/members/news/templates/template.aspx?articleid=3553&amp;zoneid=24" target="_blank">in the article Mike linked to</a>, points out that the grand vision of an &#8220;internet of things&#8221; with everything connected to everything can &#8220;distract people from thinking about the benefits of RFID in smaller, more easily implemented and cost-justified applications.&#8221; The same argument, I think, applies to sensor networks and augmented reality.</p>
<p>In New York City, a series of commissioned works for the <a href="http://www.archleague.org/" target="_blank">Architectural League of New York&#8217;s</a> exhibit <a href="http://www.sentientcity.net/exhibit/?cat=3" target="_blank">&#8220;Toward the Sentient City&#8221;</a> is giving us the opportunity to dip our toes into the ocean of a &#8220;networked urbanism.&#8221; On only a small budget, two of the <a href="http://www.sentientcity.net/exhibit/?cat=4" target="_blank">five commissioned works</a>, <a href="http://www.sentientcity.net/exhibit/?p=5" target="_blank">Amphibious Architecture</a> and <a href="http://www.sentientcity.net/exhibit/?p=43" target="_blank">Natural Fuse</a>, demonstrate how sensor networks can allow us to explore new kinds of communities &#8211; connecting people to environments in interesting ways to create new forms of social agency.</p>
<p><a href="http://www.sentientcity.net/exhibit/?p=5" target="_blank">&#8220;Amphibious Architecture&#8221;</a> &#8211; from The Living Architecture Lab at Columbia University Graduate School of Architecture, Planning and Preservation (directors David Benjamin and Soo-in Yang) and Natalie Jeremijenko, Environmental Health Clinic at New York University &#8211; uses a skillfully built (electronics and water are notoriously hard to mix) array of partially submerged sensors to pierce the blinding, reflective surfaces of the rivers surrounding Manhattan and to create a new two-way relationship with the ecosystem below &#8211; the water, our neighbors the fish, and even a beaver that lives in the water surrounding Manhattan.</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/09/Screen-shot-2009-09-26-at-6.34.56-PM.png"><img class="alignnone size-medium wp-image-4536" title="Screen shot 2009-09-26 at 6.34.56 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/09/Screen-shot-2009-09-26-at-6.34.56-PM-300x125.png" alt="Screen shot 2009-09-26 at 6.34.56 PM" width="300" height="125" /></a></p>
<p><em>Image from <a href="http://www.sentientcity.net/exhibit/?p=5" target="_blank">Toward the Sentient City</a></em></p>
<p>In a similar spirit, &#8220;<a href="http://www.sentientcity.net/exhibit/?p=43" target="_blank">Natural Fuse</a>&#8221; &#8211; Usman Haque, creative director; Nitipak &#8216;Dot&#8217; Samsen, designer; Ai Hasegawa, designer; Cesar Harada, designer; Barbara Jasinowicz, producer &#8211; creates a network of people and electronically assisted plants to explore what it takes to work together on energy consumption, and to experience the consequences of &#8220;selfish&#8221; and &#8220;unselfish&#8221; behavior interactively, before it is too late to modify our actions.</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/09/Screen-shot-2009-09-26-at-6.55.29-PM.png"><img class="alignnone size-thumbnail wp-image-4537" title="Screen shot 2009-09-26 at 6.55.29 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/09/Screen-shot-2009-09-26-at-6.55.29-PM-150x150.png" alt="Screen shot 2009-09-26 at 6.55.29 PM" width="150" height="150" /></a><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/09/Screen-shot-2009-09-26-at-9.37.06-PM.png"><img class="alignnone size-thumbnail wp-image-4548" title="Screen shot 2009-09-26 at 9.37.06 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/09/Screen-shot-2009-09-26-at-9.37.06-PM-150x150.png" alt="Screen shot 2009-09-26 at 9.37.06 PM" width="150" height="150" /></a></p>
<p><em>The &#8220;Greedy Switch&#8221; from <a href="http://www.sentientcity.net/exhibit/?p=43" target="_blank">Natural Fuse</a> on the left. On the right, &#8220;The System&#8221; &#8211; click to enlarge.</em></p>
<p>Much more to come in another post on these works and on &#8220;Toward the Sentient City.&#8221; I will also have an update on how <a href="http://www.pachube.com/">Pachube</a> has matured since my interview with its founder, Usman Haque, <a href="http://www.ugotrade.com/2009/01/28/pachube-patching-the-planet-interview-with-usman-haque/" target="_blank">&#8220;Pachube, Patching the Planet,&#8221;</a> in January this year. Pachube is an important part of both these projects, and a very important contribution to ubiquitous computing, because it lets you connect environments and create mashups from diverse sensor data feeds.</p>
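<p>Pachube exposes each connected environment as a feed of &#8220;datastreams&#8221; that any application can pull and remix. The sketch below is illustrative only &#8211; the feed structure and sensor names are hypothetical stand-ins loosely modeled on the JSON that Pachube-style services return, not Pachube&#8217;s actual API.</p>

```python
# Illustrative sketch only: parsing a Pachube-style JSON feed.
# The structure and sensor names below are hypothetical stand-ins
# (an "environment" holding a list of "datastreams"), not a real feed.
import json

sample_feed = """{
  "title": "East River water quality (hypothetical)",
  "datastreams": [
    {"id": "dissolved_oxygen", "current_value": "7.2"},
    {"id": "water_temp_c", "current_value": "18.4"}
  ]
}"""

def latest_readings(feed_json):
    """Map each datastream id to its most recent numeric value."""
    environment = json.loads(feed_json)
    return {ds["id"]: float(ds["current_value"])
            for ds in environment["datastreams"]}

print(latest_readings(sample_feed))
# e.g. {'dissolved_oxygen': 7.2, 'water_temp_c': 18.4}
```

<p>A mashup in the sense described above would simply combine readings like these from several independent feeds.</p>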
<p>In the picture above, <a href="http://www.environmentalhealthclinic.net/people/natalie-jeremijenko/" target="_blank">Natalie Jeremijenko</a> and <a id="r_oi" title="Jonathan Laventhol, Imagination" href="http://www.laventhol.com/about" target="_blank">Jonathan Laventhol</a> give the <a href="http://www.sentientcity.net/exhibit/?p=5" target="_blank">Amphibious Architecture</a> sensor array a last look over before it is lowered into the East River. Jonathan was on a busman&#8217;s holiday, helping out at the pre-launch of Amphibious Architecture near the Manhattan Bridge, NYC.</p>
<p>I was very happy to get a chance to talk to <a id="r_oi" title="Jonathan Laventhol, Imagination" href="http://www.laventhol.com/about" target="_blank">Jonathan Laventhol</a> &#8211; more on our conversation in another post. Jonathan is CTO of <a href="http://www.laventhol.com/about" target="_blank">Imagination</a>, one of the world&#8217;s leading design, events, and branding agencies. We talked about the importance of <a href="http://www.pachube.com/" target="_blank">Pachube</a>, which Jonathan called &#8220;the Facebook of data,&#8221; and about how the <strong>symbiosis between brands and augmented reality</strong>, along with healthcare applications, would be key to augmented reality emerging into the mainstream.</p>
<p><em><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/09/dhj5mk2g_340djvd2thc_b.jpg"><img class="alignnone size-medium wp-image-4453" title="dhj5mk2g_340djvd2thc_b" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/09/dhj5mk2g_340djvd2thc_b-235x300.jpg" alt="dhj5mk2g_340djvd2thc_b" width="235" height="300" /></a></em></p>
<p>Natalie Jeremijenko&#8217;s workshop at Conflux &#8211; on the social negotiation of technology, and how <a href="http://speedbird.wordpress.com/my-book-everyware-the-dawning-age-of-ubiquitous-computing/" target="_blank">&#8220;everyware&#8221;</a> can give us the chance to experience new forms of agency and connection &#8211; was totally inspiring. I will cover this too in another post. I have so much awesome stuff to write about at the moment!</p>
<p>None of the projects in, &#8220;Toward the Sentient City,&#8221; included a mobile augmented reality, or &#8220;magic lens&#8221; component, but they all pointed to why &#8220;enchanted windows into our newly inside-out reality&#8221; are going to be so important. And why the DNA base pair of ubicomp and augmented reality can really do stuff that matters.</p>
<h3>Shangri-La &#8211; &#8220;Transfigured City&#8221;</h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/09/dhj5mk2g_342g43n6w7k_b.png"><img class="alignnone size-medium wp-image-4452" title="dhj5mk2g_342g43n6w7k_b" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/09/dhj5mk2g_342g43n6w7k_b-300x249.png" alt="dhj5mk2g_342g43n6w7k_b" width="300" height="249" /></a></p>
<p><em>Screenshot from the <a href="http://en.wikipedia.org/wiki/Shangri-La_%28novel%29" target="_blank">Shangri-La</a> episode <a id="cwnc" title="The Transfigured City," href="http://www.kazeebo.com/view/17506/shangrila-episode-14-transfigured-city/" target="_blank">&#8220;Transfigured City&#8221;</a></em></p>
<p>In my AR Consortium founding member interview series, I have found that, understandably, the visionary founders of these first augmented reality companies are a little reticent about sharing their full vision. They are basically in stealth mode in this regard. So, since you will not get a fully drawn scenario of a next generation of shared augmented reality experiences from my interview with <a href="http://www.t-immersion.com/" target="_blank">Total Immersion</a> founder and CEO Bruno Uzzan, here is a really interesting episode from the anime Shangri-La, <a id="cwnc" title="The Transfigured City," href="http://www.kazeebo.com/view/17506/shangrila-episode-14-transfigured-city/" target="_blank">&#8220;Transfigured City,&#8221;</a> to mull over instead.</p>
<p>As you can tell from this rather long and circuitous intro to my conversation with Bruno Uzzan, I have been investigating shared augmented realities pretty intensively recently. Mike Kuniavsky pointed me to <em><a href="http://en.wikipedia.org/wiki/Shangri-La_%28novel%29" target="_blank">Shangri-La</a></em> and <a id="cwnc" title="The Transfigured City," href="http://www.kazeebo.com/view/17506/shangrila-episode-14-transfigured-city/" target="_blank">&#8220;Transfigured City&#8221;</a> in a conversation with Mark Shepard, after Mark&#8217;s presentation at Conflux, <a href="http://confluxfestival.org/2009/events/workshops/mark-shepard/" target="_blank">Sentient City Survival Kit</a>.</p>
<p><a href="http://thingm.com/about-us/team/mike-kuniavsky.html">Mike Kuniavsky</a>, with <a href="http://thingm.com/about-us/team/tod-e-kurt.html">Tod E. Kurt</a>, is a founder of <a href="http://thingm.com/home.html" target="_blank">ThingM</a>, a ubiquitous computing device studio. Mike also researches, designs, and writes about people&#8217;s experiences at the intersection of technology and everyday life &#8211; see his blog, <a href="http://www.orangecone.com/" target="_blank">Orange Cone</a>. I interviewed Mike at ETech &#8211; see <a href="../../2009/03/18/dematerializing-the-world-shadows-subscriptions-and-things-as-services-talking-with-mike-kuniavsky-at-etech-2009/" target="_blank">here</a>.</p>
<p>In <a id="cwnc" title="The Transfigured City," href="http://www.kazeebo.com/view/17506/shangrila-episode-14-transfigured-city/" target="_blank">&#8220;Transfigured City,&#8221;</a> the &#8220;Metal Age&#8221; group has to figure out how to share and communicate in a city transfigured by augmented realities/virtualities, where no one sees the same place in the same way. Only one character can figure out, from her previous experience of the city, the relationship between the transfigured city and how it used to be.</p>
<p>The conversation I had with <a href="http://www.orangecone.com/" target="_blank">Mike Kuniavsky</a> on <a id="cwnc" title="The Transfigured City," href="http://www.kazeebo.com/view/17506/shangrila-episode-14-transfigured-city/" target="_blank">&#8220;The Transfigured City&#8221;</a> continued at a picnic in Washington Square Park the next day with Elizabeth Goodman, whom I met at ETech when she gave a brilliant presentation, <a id="eag1" title="Designing for Urban Green Space" href="http://en.oreilly.com/et2009/public/schedule/detail/5562" target="_blank">Designing for Urban Green Space</a>. We covered so many areas related to ubiquitous computing and augmented realities that this conversation probably deserves a post of its own (my writing to-do list is growing longer!).</p>
<p><a id="on28" title="The Plot Synopsis for Shangri La" href="http://en.wikipedia.org/wiki/Shangri-La_%28novel%29" target="_blank">The Plot Synopsis for Shangri La</a>:</p>
<p><strong>&#8220;In the mid-21st century, the international committee decided to forcefully reduce CO2 emission levels to mitigate the global warming crisis. As a result, the economic market was transferred mainly into the trade of carbon. A great earthquake destroys much of Japan, yet the carbon tax placed on the country is not lifted, so Tokyo is turned into the world&#8217;s largest &#8220;jungle-polis&#8221; that absorbs carbon dioxide. Project Atlas is commenced to plan the rebuilding of Tokyo and oversee the government organization, which the Metal Age group opposes due to its oppressive nature. However, Atlas is only built with enough room for 3,500,000 people and most people are not allowed to migrate into the city. The disparity between the elite within Atlas and the refugees living in the jungles outside of its walls set up the background of the story.&#8221;</strong></p>
<p><a name="jumpto"><span style="font-size: medium;"><strong> Talking With Bruno Uzzan</strong></span></a></p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/09/BrunoUzzanpost.jpg"><img class="alignnone size-medium wp-image-4494" title="BrunoUzzanpost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/09/BrunoUzzanpost-225x300.jpg" alt="BrunoUzzanpost" width="225" height="300" /></a></p>
<p><strong>Tish Shute:</strong> We won&#8217;t have fully opened the Pandora&#8217;s Box of Augmented Realities until we have ubiquitous, shared augmented realities, will we?</p>
<p><span id="p-xo" title="Click to view full content"><strong>Bruno Uzzan: Yes. The most important thing for augmented reality is the experience we want to share. Now that we are working on the cell phone, we can potentially bring over some marketing components that we have already developed. Done. It&#8217;s working.</strong></span></p>
<p><strong>But the most interesting part is how these new components [cell phone AR] will be used for marketing campaigns by brands. And we are also well positioned to take some of the AR that we currently have working on Mac and PC and transform it into applications working on mobile devices.</strong></p>
<p><strong>Tish Shute:</strong> We haven&#8217;t really experienced yet what it means to actually share mobile AR experiences?</p>
<p><strong>Bruno Uzzan: It&#8217;s hard &#8212; we did a Facebook app. It&#8217;s a first try; it has a way to go. But </strong><span id="c8ek" title="Click to view full content"><strong>to go more and more into social is the way forward for us &#8211; to share and expand AR experiences. But yes, what you&#8217;re seeing is how two people on two different applications can share that same experience. For sure we are going in that direction. We are currently working on those kinds of solutions &#8211; how people can share and experience together at the same time. That&#8217;s how we start creating excitement in augmented reality, and it&#8217;s coming up.</strong></span></p>
<p><strong>It&#8217;s a new market, and there&#8217;s so much more in store for augmented reality. You know, some people ask me, don&#8217;t you believe that augmented reality is a gimmick? That it will be a trend for a few weeks or a few months and then gone? I say, you&#8217;re kidding me. This is only the beginning. I can assure you that the applications on the market today are one percent of what we will have five years from now.</strong></p>
<p><strong>Tish Shute: </strong>I agree.</p>
<p><strong>Bruno Uzzan: And I&#8217;m sure that augmented reality will become part of a lot of components that we are using today &#8211; GPS, web browsers, glasses. There are so many applications that will come up shortly. This is only the beginning. I&#8217;m completely convinced that augmented reality will be, three years from now, what virtual reality is today, which is a billion-dollar market. I know that it&#8217;s not just a gimmick of a few weeks or a few months, because so many brands are jumping into it, spending money, exploring solutions. I know that what they are willing to do, and what we are willing to do, is not just short term, but also middle and long term. And that&#8217;s what makes this adventure pretty much unique, and what makes creating a cutting-edge technology very, very exciting for us.</strong></p>
<p><span id="pb9s" title="Click to view full content"><strong>Tish Shute:</strong> First, could you explain more about your partnership with Int13? I am not sure I understand the arrangement from Total Immersion&#8217;s point of view. What happens with your own mobile software development? Haven&#8217;t you licensed the Int13 SDK only for a limited period of time, with limited access to all its power? </span><span id="p_2y" title="Click to view full content"><a href="http://gamesalfresco.com/2009/09/15/why-int13-got-in-bed-with-total-immersion/" target="_blank">Stephane from Int13 said to Ori on Games Alfresco, here,</a> &#8220;we have licensed the SDK4 for two years,&#8221; and then Ori asks, &#8220;but you have basically kept the power to yourselves, right?&#8221; So if they are the only ones who can enhance and develop the software, where will TI be in mobile in two years, if you haven&#8217;t really had the chance to develop your own software?</span></p>
<p><span id="j5co" title="Click to view full content"><strong>Bruno Uzzan: Actually, it&#8217;s a real win-win situation. Int13 is a very small company, and they have so many requests they can&#8217;t possibly fulfill them all. So this is a way for both of us to be, as quickly as possible, the first mobile provider for all the requests we have. Also, they give us exclusivity, so nobody else can use the Int13 SDK for such applications. I think it is a good partnership.</strong></span></p>
<p><strong>And concerning our own mobile application&#8230; First of all, we currently have some mobile applications working. But with Int13 we have a mobile solution that can work on many different devices. That&#8217;s a fact, and that&#8217;s working. And, believe me, you will hear a lot more from us about this soon. We are fully independent in our mobile development. The reason we closed the partnership with Int13 is to be able to deploy mobile in a broad way.</strong></p>
<p><strong>I mean, you know that the difficulty with mobile AR is that each separate device needs some customization. Working on the iPhone is different from working on the Nokia, different from working on the Palm, different from working on the Samsung. Each of them has its own operating system inside, and so we were interested in Int13&#8217;s very clever embedded solution that allows our solutions to work across many platforms.</strong></p>
<p><strong>The reason we are working with Int13 is that, thanks to them, we are able to work on so many mobile devices. And in the mobile AR race that we are currently in, the next two years will be extremely important to us&#8230;</strong></p>
<p><span id="z_5s" title="Click to view full content"><strong>Tish Shute:</strong> OK, that definitely clarifies it a lot. So Int13 has done an embedded solution that allows TI-developed AR solutions to work easily across many devices?</span></p>
<p><span id="y.wt" title="Click to view full content"><strong>Bruno Uzzan: Yes, they have a kind of embedded solution, a way to address new cell phones extremely quickly&#8230; But, currently, on our side, we are in discussions with a mobile company&#8230; and that only concerns some very specific mobile devices. What they have is also a way to embed our technology deeper into mobile, so that we can more quickly have&#8230; applications that work on a large number of cell phones.</strong></span></p>
<p><strong>Tish Shute:</strong> So, basically it means you don&#8217;t have to go through some complicated negotiations with each of the cell phone companies, is what you are saying?</p>
<p><strong>Bruno Uzzan: Not only negotiations, but also hard development. You know? Working on Windows Mobile is completely different from working on Palm OS. That&#8217;s different! It&#8217;s a lot of work to have a mobile application working on many different devices. So Int13 provides a way for us to save some time, and some development cost too.</strong></p>
<p><strong>Tish Shute:</strong> And Int13 doesn&#8217;t have powerful AR development tools like <a href="http://www.t-immersion.com/en,interactive-kiosk,32.html" target="_blank">D&#8217;fusion</a> right?</p>
<p><strong>Bruno Uzzan: Right! That&#8217;s right. That&#8217;s why we say it&#8217;s a true win-win solution. They can benefit from our work too. And we can benefit from their work, in order to deploy mobile solutions quicker and faster.</strong></p>
<p><strong>Tish Shute:</strong> Now, the second thing is&#8230; there is a lot of debate and disagreement about how far mobile augmented reality is from delivering something more than the &#8220;post-it&#8221; approach that has been much publicized in recent months via all the AR browser apps.</p>
<p>But from my understanding from the conversation we had earlier this summer (see below), Total Immersion is targeting a much higher level of mobile augmented reality than we&#8217;ve seen to date?</p>
<p><strong>Bruno Uzzan: Yes, the browser apps we have seen are a kind of augmented reality, but not exactly the way we see it. Let me explain why. With this kind of application, it&#8217;s true that you can overlay 3D information and video. That&#8217;s a fact. So, in a sense, that&#8217;s augmented reality. But the way they work out the position of the 3D on that video is by using compass and GPS information&#8230; so it means that this kind of AR solution will work only on buildings and physical objects that are FIXED &#8211; in a fixed and known position.</strong></p>
<p><strong>So you want to go to a theater?</strong></p>
<p><span id="a9qv" title="Click to view full content"><strong>The theater is here; for sure it will not move. So you know the position of the theater, and it&#8217;s a fact that you can superimpose an object on the theater. That&#8217;s what can be done currently. What we are achieving and what we are doing on mobile is more than that. We want to be able to port our solution &#8211; with trading cards, with brands &#8211; onto a smart phone.</strong></span></p>
<p><strong>I&#8217;m assuming that you want a can, a drink can, to be able to trigger an experience. The only way you can do that is to be able to understand what the can is. And the current solutions that are out there can&#8217;t do that; it&#8217;s impossible.</strong></p>
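<p>For contrast, here is a minimal sketch of the compass-and-GPS placement Bruno describes: the app computes the bearing from the user to a fixed POI and offsets it by the device heading to decide where the overlay sits on screen. Function names and coordinates are illustrative; this is the sensor-based approach, not Total Immersion&#8217;s vision-based tracking.</p>

```python
# Sketch of sensor-based AR placement (compass + GPS), as described above.
# Function names and coordinates are illustrative.
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from the user (1) to the POI (2), degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return math.degrees(math.atan2(y, x)) % 360

def screen_offset_deg(user_lat, user_lon, poi_lat, poi_lon, heading_deg):
    """POI angle relative to where the camera points, in [-180, 180)."""
    relative = bearing_deg(user_lat, user_lon, poi_lat, poi_lon) - heading_deg
    return (relative + 180) % 360 - 180

# A POI due east of the user, with the camera already facing east,
# lands in the center of the view (offset 0).
print(screen_offset_deg(0.0, 0.0, 0.0, 1.0, 90.0))  # 0.0
```

<p>Because GPS fixes and compass headings are both noisy, an offset computed this way jitters from frame to frame, which is one reason overlays in these early browser apps shake.</p>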
<p><strong>Tish Shute:</strong> Right, yes. There&#8217;s no near-field object recognition at all in these early browser apps.</p>
<p><strong>Bruno Uzzan: And the solution we have is that we can recognize a can &#8212; and then, in a very, very precise way, that activates the geo-location, so we can superimpose 3D. In that case, it opens up all the applications that we currently have, so they could work on mobile.</strong></p>
<p><strong>Tish Shute:</strong> So for example, if youâ€™re working with a soft drink company, people can trigger that experience wherever they see that can?</p>
<p><strong>Bruno Uzzan: Correct. </strong></p>
<p><strong>Tish Shute:</strong> Yes. Yes, I assumed that was what you&#8217;re doing.</p>
<p><strong>Bruno Uzzan: We believe &#8212; and maybe that&#8217;s not the case, but we believe that our marker-less tracking technology is pretty much unique on mobile devices.</strong></p>
<p><strong>I haven&#8217;t yet seen, from anyone, a full augmented reality mobile solution working.</strong></p>
<p><span id="rzqr" title="Click to view full content"><strong>I really see AR being part of the next generation, Web 3.0. The vision I have is that today, when you want information, you go to a website and then you find your information. With AR &#8211; and this is the future, I think &#8211; it will be the opposite. You want information about a product, you just show it to your computer, and the information will automatically pop up. I see here a new way to market some key messages, a new way to get information: physical products by themselves could be a way to get information, and you don&#8217;t have to search for it anymore; it comes out to you.</strong></span></p>
<p><strong>AR is definitely, for me, one of these components. I&#8217;m not saying AR is the whole solution, or that AR by itself will create this change in how information is displayed. But I&#8217;m seeing here something that could be part of a new way to have access to information. And that&#8217;s part of the vision I have. Whether it is through mobile phone or web or PC, Mac, whatever, I really believe that this new generation of receiving information will come shortly, and could be part of the new 3.0 generation of the web.</strong></p>
<p><strong>Tish Shute:</strong> My friend <a id="evae" title="Gene Becker" href="http://www.genebecker.com/" target="_blank">Gene Becker</a> did <a href="http://www.genebecker.com/2009/09/thinking-about-design-strategies-for-magic-lens-ar/" target="_blank">an interesting post recently on some of the current limitations of mobile AR</a>, where he pointed out the problem of:</p>
<p><em><strong>&#8220;Simplistic, non-standard data formats</strong> &#8211; POIs, the geo-annotated data that many of these apps display, are mostly very simple one-dimensional points of lat/long coordinates, plus a few bytes of metadata. Despite their simplicity there has been no real standardization of POI formats; so far, data providers and AR app developers are only giving lip service to open interoperability. Furthermore, they are not looking ahead to future capabilities that will require more sophisticated data representations. At the same time, there is a large community of GIS, mapping and Geoweb experts who have defined open formats such as <a href="http://georss.org/" target="_blank">GeoRSS</a>, <a href="http://geojson.org/" target="_blank">GeoJSON</a> and <a href="http://code.google.com/apis/kml/documentation/" target="_blank">KML</a> that may be suitable for mobile AR use and standardization.&#8221;</em></p>
<p><span id="v68s" title="Click to view full content"><strong>Bruno Uzzan: That&#8217;s interesting. I know exactly what he is referring to. He is mainly referring to localization, and how you can have quick, accurate localization. If you look at current solutions, and you look at the 3-D superimposed on the video, the 3-D is shaking a lot. I don&#8217;t know if you have seen that in some of these early efforts.</strong></span></p>
<p><strong>It&#8217;s hard to use, because the 3-D, you know, is part of the magic of augmented reality &#8211; that is, when the 3-D is inserted in a very easy and smooth way into your solution. Here, when you see this 2-D or 3-D overlay on the video, it&#8217;s shaking a lot. One reason for this is that the GPS and compass are not accurate enough to work out the precise location of the user. And here, what Gene says is interesting. I think we are addressing this localization issue in a pretty smart way.</strong></p>
<p><strong>But to be frank with you, I don&#8217;t believe mobile augmented reality in the extremely short term &#8212; I&#8217;m talking about three weeks, one, two months &#8212; is mature enough for good AR applications. It will be shortly. But for now it is more proof of concept than a true and easy-to-use application.</strong></p>
<p><strong>We are starting to see a lot of new applications coming out, but I really believe that marketing and entertainment are the two key markets for AR right now.</strong></p>
<p><strong>I&#8217;ve been working ten years in augmented reality. And eight years ago, when I talked about augmented reality, I was E.T., you know? Nobody understood what I said, and people thought it was crazy. Now, today, it&#8217;s completely different.</strong></p>
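<p>To make Gene Becker&#8217;s point above concrete, here is what one of the open formats he lists &#8211; GeoJSON &#8211; looks like for a single POI. The coordinates and properties are invented for illustration; GeoJSON standardizes the geometry, and AR-specific metadata would live in the free-form &#8220;properties&#8221; object.</p>

```python
# A single AR point of interest as a GeoJSON Feature (illustrative values).
# GeoJSON standardizes the geometry; AR-specific metadata goes in "properties".
import json

poi = {
    "type": "Feature",
    "geometry": {
        "type": "Point",
        # GeoJSON orders coordinates as [longitude, latitude]
        "coordinates": [-73.9857, 40.7484],
    },
    "properties": {
        "title": "Example theater",
        "description": "Anchor point for an AR overlay",
    },
}

# Round-trip through JSON text, as a data provider and an AR browser would.
decoded = json.loads(json.dumps(poi))
print(decoded["geometry"]["coordinates"])  # [-73.9857, 40.7484]
```

<p>An AR browser that consumed Features like this, rather than an ad hoc POI format, could draw on the whole existing Geoweb toolchain.</p>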
<p><strong>Tish Shute:</strong> The Pandora&#8217;s Box of Augmented Realities, in my view, is an open, universal, standard, distributed, multiuser augmented reality framework fully integrated with the internet and world wide web. I have been looking into Google Wave protocols as a basis for this &#8211; would you be interested in this? Do you think it is feasible?</p>
<p><span id="vz68" title="Click to view full content"><strong>Bruno Uzzan: I think this is feasible. I think that&#8217;s doable &#8211; that&#8217;s just my opinion. Some people might have another kind of opinion, but I think that&#8217;s definitely doable.</strong></span></p>
<p><strong>Tish Shute:</strong> Yes, I suppose an open AR framework involves cooperation and collaboration; it is more about business and politics than technological problems.</p>
<p><strong>Bruno Uzzan: Yes! Actually, the Web is politics. Business is politics.</strong></p>
<p><span id="yeg4" title="Click to view full content"><strong>Tish Shute: </strong>I would be interested if anyone on your R&amp;D team would like to look at some of the ideas that are emerging in our little discussion of Google Wave and an Open AR Framework, and offer feedback. It is an interesting time to give input on the Wave Federation Protocol docs, because nothing is set in stone right now.</span></p>
<p><span id="hzrf" title="Click to view full content"><strong>Bruno Uzzan: Just shoot me an email. I&#8217;ll try to put you in touch with the right person &#8211; a team member who can give input on this.</strong></span></p>
<p><span id="hbcd" title="Click to view full content"><strong>Tish Shute: </strong>For mobile augmented reality, the best thing we&#8217;ve got now is the phone, right?</span></p>
<p><strong>Bruno Uzzan: Right. </strong></p>
<p><strong>Tish Shute:</strong> And the only way we can use the phone is by holding it up, right? Isn&#8217;t this a bit of an obstacle as you introduce better object recognition and tracking? People are going to have to stop moving to use their phone. What do you feel about that experience? Isn&#8217;t AR eyewear an essential part of a tightly registered AR experience?</p>
<p><strong>Bruno Uzzan:</strong> We don&#8217;t do hardware, and we don&#8217;t have a current eyewear solution that would do all we need for a good mobile AR experience, so I guess we don&#8217;t have the answer for that right now. But we are beginning to see the next generation of these glasses.</p>
<p><strong>Tish Shute:</strong> But you&#8217;re happy enough with the experience of augmented reality on smart phones that you&#8217;re investing in this next generation of software for it?</p>
<p><strong>Bruno Uzzan: Yes, I know. We know that some applications will not work on the iPhone. And yes, whatever you do, you still need to hold the iPhone, so it means that you can&#8217;t play with your hands anymore. So we know that, partially, some AR solutions we have on other platforms will lose their magic on the iPhone.</strong></p>
<p><strong>But I&#8217;m starting to see on the market some glasses that could perhaps be not too expensive &#8212; that&#8217;s a challenge! And easy to use &#8212; that&#8217;s another big challenge. And that could fit on anybody&#8217;s face and head &#8212; there&#8217;s another big challenge. So yes, I&#8217;m starting to see that, but so far AR glasses are only applicable for some very, very specific applications, like design or theme parks, or, you know, some specific locations where it makes sense to move forward with glasses.</strong></p>
<p><strong>I don&#8217;t believe that kids will use glasses in our toys and for games in the next months, or maybe the next one or two years. But maybe something will come out shortly that could be a big breakthrough and enable us to think another way. From what we have seen so far, and from what we know of this hardware market, I don&#8217;t believe that currently there is a workable solution.</strong></p>
<p><span style="font-size: medium;"><strong>Note: The following section of the interview took place earlier in the summer.</strong></span></p>
<p><strong>Tish Shute:</strong> You are the first commercial AR company &#8211; you started in 1999, right?</p>
<p><span id="yvdi" title="Click to view full content"><strong>Bruno Uzzan: Yes, you are right. We started extremely early in this augmented reality market. We were the first company worldwide to start doing augmented reality and to start promoting it. So it&#8217;s true, we are a pretty old player, although the market has only been getting bigger and bigger for the last year and a half. For a long time we were alone in the market, and the market was not really there.</strong></span></p>
<p><strong>But for the past 8 months, the company has been growing really fast.</strong></p>
<p><strong>Tish Shute:</strong> Yes, I&#8217;m sure. Congratulations for hanging in there long enough to get the payoff!</p>
<p><strong>Bruno Uzzan: You know, my background is in finance. So I have been driving the company for many years in a very cash-efficient way. We have been waiting for the market to reach maturity before starting to make investments. That&#8217;s the reason we are still here, and that&#8217;s the reason I think we managed pretty smartly the cash that we raised for the company.</strong></p>
<p><strong>Tish Shute:</strong> Yes, there is a saying that when a market takes off you can tell the pioneers, because they are the ones with the arrows in their backs. But I am glad you are dodging the arrows!</p>
<p><strong>Bruno Uzzan: You know, I&#8217;ve always driven the company with revenue. And because revenue was not there at the beginning I was extremely cautious about the cash. So now that the company is getting some revenue, for sure we are making more and more investments, and taking advantage of our situation as a worldwide leader of augmented reality.</strong></p>
<p><strong>This situation is not as easy as it appears today, but it&#8217;s now getting better. As you can see, AR, augmented reality, has very good momentum, and we are benefiting a lot from all this momentum right now.</strong></p>
<p><strong>Tish Shute:</strong> You&#8217;ve been very involved in researching and developing augmented reality tools. Are you still as active in the research area, or are you too busy keeping up with work for hire now to be working on research and building new technology for augmented reality?</p>
<p><strong>Bruno Uzzan: Both. First of all, we are part of a lot of projects, either directly with clients like Mattel or with partners that are using our technology to promote and develop other AR projects. From what we have seen, many, many augmented reality projects are currently being done with our solutions.</strong></p>
<p><strong>To continue with your previous question: we are perceived as the leader in this space, and we have some pretty heavy demand for our services. But we are coming up with new technology, of course, still connected to augmented reality. Our R&amp;D is working in two different directions, which of course also bind together.</strong></p>
<p><strong>The first one is platform development. We want Augmented Reality to work with as many platforms as possible &#8211; PC, Mac, mobile, game consoles; all of those are the platforms we are targeting. The R &amp; D team is currently doing a lot of work on cross-platform compatibility.</strong></p>
<p><strong>Tish Shute:</strong> Robert Rice said recently, &#8220;markers and webcams equal Photoshop page curls&#8230;&#8221;</p>
<p><strong>Bruno Uzzan: Yes. There are so many concerns with markers. The quality is extremely bad. As soon as you hide a part of the marker &#8211; even a slight part of the marker &#8211; you&#8217;re dead. You can&#8217;t track the object any more. Compare that to our solution, where you play with, say, cards or a Mattel toy: even if you hide a part of the toy, it&#8217;s still working.</strong></p>
<p><strong>Tish Shute:</strong> But you haven&#8217;t offered the public an SDK for your engine, right? Basically, the way people get access to your tools is by working in a partnership with Total Immersion, right?</p>
<p><strong>Bruno Uzzan: Correct. </strong></p>
<p><strong>Tish Shute:</strong> Do you think in the future you might open your SDK? Are you considering that?</p>
<p><strong>Bruno Uzzan: Yes, it would be interesting.</strong></p>
<p><strong>Tish Shute:</strong> So that is something we can see coming soon?</p>
<p><strong>Bruno Uzzan: Maybe, because it&#8217;s true that Total Immersion is starting to be mature enough for these kinds of tools. The only thing is that we have to respect good timing for that. It&#8217;s a big decision. You know what I mean? It is a big, big decision. We would then compete with others using our technology.</strong></p>
<p><strong>Tish Shute:</strong> Oh I know, it is a big decision when you have so much skin in the game! But it would be nice to have your SDK being THE platform for AR, wouldn&#8217;t it?</p>
<p><strong>Bruno Uzzan: It is a really big decision that we can&#8217;t just take like that, you know. A lot of friends have told me you have to be extremely careful about timing. This timing is pretty much connected to the maturity of the market. For sure, we see the market becoming more and more mature. But there is a lot of low-hanging fruit we still want to address, to get the best value possible from all the publicity we have and all the clients we have now.</strong></p>
<p><strong>Tish Shute:</strong> Yes, I know &#8211; you&#8217;ve been in this game so long. Now, there is an interesting question here about tools and platforms, because AR, augmented reality, has already expanded beyond its original, purist definition. When I talk to people about augmented reality, there are actually a lot of different ideas and priorities for where the tools should go right now. Obviously we have these browser-like applications, but those browser-like applications are not dealing with recognizing near-field objects yet. What are your priorities for tool development, and what are your priorities for AR development in the future? What areas are you going to focus on? Oh dear, that is a rambling question!</p>
<p><strong>Bruno Uzzan: [laughter] So, one of our first priorities is to create our software with one development, one installer, one piece of software that can be spread across different platforms. The same application, the same software, can be used on a PC, Mac, phone, or console. That&#8217;s a lot of work, because it means that our platform has to address many, many different devices, and that&#8217;s a big priority for us because we received this request from our clients: they want to be able to use one application on many different platforms and devices. So, that&#8217;s the first one.</strong></p>
<p><strong>And the second one is to add more and more interactivity between the real and the virtual world. So, we are working on some improvements to add real components that will interact with the virtual, and that is also part of our big strategy and direction: these two worlds can be more and more bridged together, linked together, so they can interact with each other.</strong></p>
<p><strong>Our R&amp;D guys are working on the real world interacting more with the virtual world. I have started seeing some results which are pretty crazy, and this will be ready for next year.</strong></p>
<p><strong>There are so many different directions for interaction between the real world and the virtual world to develop. I&#8217;m sure ten years from now you&#8217;re going to have AR applications everywhere. It&#8217;s not just a temporary fashion or a gimmick for a few months. I mean, we are getting there; it&#8217;s getting stronger and stronger, and we are getting a good adoption rate from consumers. They like it, they test it, they play with it, and brands want more, people want more, and it&#8217;s getting bigger and bigger.</strong></p>
<p><strong>Tish Shute:</strong> Yes, and I totally agree &#8211; it&#8217;s not a gimmick, because the interaction between &#8220;virtual&#8221; and &#8220;real&#8221; enhances the magic of both. Another question, about your R&amp;D operation: is your R&amp;D still in France, or have you moved totally out to LA?</p>
<p><strong>Bruno Uzzan: We are 50 people in France, and I started this LA office two years ago and moved permanently to LA at that time. So I&#8217;m now permanently located in the US to take care of the US office, knowing that revenues are really getting bigger and bigger in the US. It means that we are getting a lot of traction, working with large companies, and now I&#8217;m currently located in the US.</strong></p>
<p><strong>Tish Shute:</strong> My sister lives in Paris. Could I visit your R&amp;D lab at some point? I&#8217;d love to visit!</p>
<p><strong>Bruno Uzzan: Yeah, sure, sure, sure. I mean, if you want to come, you won&#8217;t have access to all the research, but if you want to come out and meet the team, please do.</strong></p>
<p><strong>Tish Shute:</strong> I&#8217;d love to.</p>
<p><strong>Bruno Uzzan: No problem. Shoot me an email and I will introduce you to Eric Gehl; he is the COO of the French team. He can definitely take care of that.</strong></p>
<p><strong>Tish Shute:</strong> That would be fun. Thank you!</p>
<p>Recently, AR browser applications like Layar and Wikitude have really caught the imagination of the web community. Where do you think the most important market for AR is at the moment &#8211; entertainment, green tech, business, education?</p>
<p><strong>Bruno Uzzan: I think that all the areas you mention will be important. The first one that grabbed my attention is entertainment, particularly digital marketing, because they are always searching for new ways to interact with players or consumers. But it&#8217;s just the tip of the iceberg, you know. I mean, medical applications could be huge using augmented reality. Education and edutainment are definitely using more and more augmented reality components, and we are talking with big companies that are considering using augmentation for education. Museums are very important too. Also augmentation as a kind of free sales tool &#8211; you know, there are so many applications, design, architecture &#8211; so many directions that it&#8217;s hard to say today which one will take the lead.</strong></p>
<p><strong>But I do believe that in the short term, the ones that are really, really moving fast are the entertainment business and the digital marketing business.</strong></p>
<p><strong>Tish Shute:</strong> What do you think are the biggest shortcomings with current augmented reality and what are the obstacles that no one has solved yet?</p>
<p><strong>Bruno Uzzan: I think the cell phone is not fully ready for augmented reality &#8211; a lot of people are working on that, but there are still a lot of constraints to getting augmented reality working on a cell phone. From what I have heard, a lot of manufacturers and a lot of companies are working in directions that are going to help us a lot to develop some great cell phone applications.</strong></p>
<p><strong>And I think that&#8217;s one of the biggest parts of the game. All the applications that you see on cell phones so far are just gimmicks &#8211; the next big key is how to transform a gimmicky cell phone application into a real, industrial, robust application that&#8217;s going to work on a cell phone. So I think that&#8217;s a big challenge for this year.</strong></p>
<p><strong>Most of what we see now is just matching and overlaying some 2D components on a video. This is not what I call AR. With this kind of application, you are far away from doing the registration that we need to do &#8211; you can&#8217;t do it. So here&#8217;s the challenge: how can you get a Topps application working on a cell phone? That&#8217;s the big challenge &#8211; how we can make that work! You can&#8217;t today get a real AR Topps application working on a cell phone, because there&#8217;s no cell phone that&#8217;s actually ready. But we are working on it, and for the first one that can make that work, it&#8217;s going to be huge.</strong></p>
<p><strong>When you are working with good AR components, you need a lot of CPU and GPU power. Today, new cell phones have started to be more and more ready for augmented reality, but you need a really good cell phone to make it work. You can&#8217;t choose an old cell phone, because you have some recognition, some tracking, some rendering &#8211; so you can&#8217;t choose a two-year-old Nokia cell phone to make that work. For sure the newest iPhone is the one that can make it work, but that&#8217;s it for now. There is a lot of research &#8211; from large cell phone companies &#8211; to get more CPU and GPU into their cell phones. But so far we are also waiting for these devices to be released to consumers.</strong></p>
<p><strong>Tish Shute: </strong>And the current economic climate has put a damper on MIDs, hasn&#8217;t it? But who can tell? It depends on what price point a new MID comes out at, right?</p>
<p><strong>Bruno Uzzan: Correct.</strong></p>
<p><strong>Tish Shute:</strong> Yes, I agree. But basically, the interesting thing is that the iPhone can deliver so much of what is necessary, and even if Apple hasn&#8217;t given AR developers access to the full power of the iPhone yet, there is really no going back now &#8211; the mobile augmented reality cat is out of the bag!</p>
<p><strong>Bruno Uzzan: You&#8217;re right, you&#8217;re fully right.</strong></p>
]]></content:encoded>
			<wfw:commentRss>https://www.ugotrade.com/2009/09/26/total-immersion-and-the-transfigured-city-shared-augmented-realities-the-web-squared-era-and-google-wave/feed/</wfw:commentRss>
		<slash:comments>36</slash:comments>
		</item>
	</channel>
</rss>
