<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>UgoTrade &#187; Denno Coil</title>
	<atom:link href="http://www.ugotrade.com/tag/denno-coil/feed/" rel="self" type="application/rss+xml" />
	<link>http://www.ugotrade.com</link>
	<description>Augmented Realities at the Edge of the Network</description>
	<lastBuildDate>Wed, 25 May 2016 15:59:56 +0000</lastBuildDate>
	<language>en-US</language>
		<sy:updatePeriod>hourly</sy:updatePeriod>
		<sy:updateFrequency>1</sy:updateFrequency>
	<generator>https://wordpress.org/?v=3.9.40</generator>
	<item>
		<title>Total Immersion and the &#8220;Transfigured City:&#8221; Shared Augmented Realities, the &#8220;Web Squared Era,&#8221; and Google Wave</title>
		<link>http://www.ugotrade.com/2009/09/26/total-immersion-and-the-transfigured-city-shared-augmented-realities-the-web-squared-era-and-google-wave/</link>
		<comments>http://www.ugotrade.com/2009/09/26/total-immersion-and-the-transfigured-city-shared-augmented-realities-the-web-squared-era-and-google-wave/#comments</comments>
		<pubDate>Sun, 27 Sep 2009 04:42:42 +0000</pubDate>
		<dc:creator><![CDATA[Tish Shute]]></dc:creator>
				<category><![CDATA[Ambient Devices]]></category>
		<category><![CDATA[Ambient Displays]]></category>
		<category><![CDATA[Android]]></category>
		<category><![CDATA[Augmented Reality]]></category>
		<category><![CDATA[digital public space]]></category>
		<category><![CDATA[Instrumenting the World]]></category>
		<category><![CDATA[internet of things]]></category>
		<category><![CDATA[iphone]]></category>
		<category><![CDATA[mirror worlds]]></category>
		<category><![CDATA[Mixed Reality]]></category>
		<category><![CDATA[mobile augmented reality]]></category>
		<category><![CDATA[mobile meets social]]></category>
		<category><![CDATA[Mobile Reality]]></category>
		<category><![CDATA[new urbanism]]></category>
		<category><![CDATA[sustainable living]]></category>
		<category><![CDATA[sustainable mobility]]></category>
		<category><![CDATA[ubiquitous computing]]></category>
		<category><![CDATA[Web Meets World]]></category>
		<category><![CDATA[websquared]]></category>
		<category><![CDATA[3D Interactive Live Show]]></category>
		<category><![CDATA[Acrossair]]></category>
		<category><![CDATA[AMEE]]></category>
		<category><![CDATA[Amphibious Architecture]]></category>
		<category><![CDATA[anime]]></category>
		<category><![CDATA[Apple iPhone]]></category>
		<category><![CDATA[AR baseball cards for Topps]]></category>
		<category><![CDATA[AR Consortium]]></category>
		<category><![CDATA[AR eyewear]]></category>
		<category><![CDATA[AR goggles]]></category>
		<category><![CDATA[Architectural League of New York]]></category>
		<category><![CDATA[ARML]]></category>
		<category><![CDATA[ARN]]></category>
		<category><![CDATA[Augmented City]]></category>
		<category><![CDATA[augmented city lab]]></category>
		<category><![CDATA[augmented reality books]]></category>
		<category><![CDATA[augmented reality entrepreneurship]]></category>
		<category><![CDATA[augmented reality goggles]]></category>
		<category><![CDATA[augmented reality making visible the invisible]]></category>
		<category><![CDATA[augmented reality mark-up language]]></category>
		<category><![CDATA[augmented reality pollution meter]]></category>
		<category><![CDATA[augmented reality standards]]></category>
		<category><![CDATA[augmented reality toys]]></category>
		<category><![CDATA[augmented virtuality]]></category>
		<category><![CDATA[Bionic Eye]]></category>
		<category><![CDATA[Blair Macintyre]]></category>
		<category><![CDATA[Bruce Sterling]]></category>
		<category><![CDATA[Bruno Uzzan]]></category>
		<category><![CDATA[Conflux]]></category>
		<category><![CDATA[cross platform compatibility for augmented reality]]></category>
		<category><![CDATA[D'Fusion]]></category>
		<category><![CDATA[Daniel Wagner]]></category>
		<category><![CDATA[Denno Coil]]></category>
		<category><![CDATA[distributed]]></category>
		<category><![CDATA[elements of networked urbanism]]></category>
		<category><![CDATA[Elizabeth Goodman]]></category>
		<category><![CDATA[everyware]]></category>
		<category><![CDATA[Fish 'n Microchips]]></category>
		<category><![CDATA[Flickr]]></category>
		<category><![CDATA[Gavin Starks]]></category>
		<category><![CDATA[Gene Becker]]></category>
		<category><![CDATA[geo spatial web]]></category>
		<category><![CDATA[geoAR]]></category>
		<category><![CDATA[geoaugmentation]]></category>
		<category><![CDATA[Google Wave]]></category>
		<category><![CDATA[Google Wave Protocol]]></category>
		<category><![CDATA[Gov 2.0 Expo Showcase]]></category>
		<category><![CDATA[Gov 2.0 Summit]]></category>
		<category><![CDATA[Graz University of Technology]]></category>
		<category><![CDATA[Imagination]]></category>
		<category><![CDATA[Incheon Free Economic Zone]]></category>
		<category><![CDATA[information shadows]]></category>
		<category><![CDATA[Int13]]></category>
		<category><![CDATA[Interaction Design for Augmented Reality]]></category>
		<category><![CDATA[ISMAR 2009]]></category>
		<category><![CDATA[Jeremy Hight]]></category>
		<category><![CDATA[Joe Lamantia]]></category>
		<category><![CDATA[Jonathan Laventhol]]></category>
		<category><![CDATA[Korea's u-Cities]]></category>
		<category><![CDATA[Layar]]></category>
		<category><![CDATA[Layar 3D]]></category>
		<category><![CDATA[magic lens augmented reality]]></category>
		<category><![CDATA[manga]]></category>
		<category><![CDATA[Mark Shepard]]></category>
		<category><![CDATA[Mark Weiser]]></category>
		<category><![CDATA[markerless mobile augmented reality]]></category>
		<category><![CDATA[Metaio]]></category>
		<category><![CDATA[Microsoft Bing]]></category>
		<category><![CDATA[Mike Kuniavsky]]></category>
		<category><![CDATA[Mobilizy]]></category>
		<category><![CDATA[multiuser augmented reality]]></category>
		<category><![CDATA[Natalie Jeremijenko]]></category>
		<category><![CDATA[Natural Fuse]]></category>
		<category><![CDATA[near-field object recognition and tracking]]></category>
		<category><![CDATA[Networked City]]></category>
		<category><![CDATA[networked urbanism]]></category>
		<category><![CDATA[newer urbanism]]></category>
		<category><![CDATA[open]]></category>
		<category><![CDATA[open augmented reality framework]]></category>
		<category><![CDATA[open augmented reality network]]></category>
		<category><![CDATA[Orange Cone]]></category>
		<category><![CDATA[Ori Inbar]]></category>
		<category><![CDATA[Pachube]]></category>
		<category><![CDATA[realtime panorama mapping on mobile phones]]></category>
		<category><![CDATA[RobotVision]]></category>
		<category><![CDATA[sensor networks]]></category>
		<category><![CDATA[Sentient City Survival Kit]]></category>
		<category><![CDATA[Shangri La]]></category>
		<category><![CDATA[shared augmented realities]]></category>
		<category><![CDATA[Sky Writer]]></category>
		<category><![CDATA[Steven Feiner]]></category>
		<category><![CDATA[symbiosis between augmented reality and brands]]></category>
		<category><![CDATA[the internet of things]]></category>
		<category><![CDATA[the LAN of things]]></category>
		<category><![CDATA[the shape of alpha]]></category>
		<category><![CDATA[the web squared era]]></category>
		<category><![CDATA[ThingM]]></category>
		<category><![CDATA[things as services]]></category>
		<category><![CDATA[Thomas Wrobel]]></category>
		<category><![CDATA[Tim O'Reilly]]></category>
		<category><![CDATA[Tod E. Kurt]]></category>
		<category><![CDATA[Total Immersion]]></category>
		<category><![CDATA[Toward the Sentient City]]></category>
		<category><![CDATA[Transfigured City]]></category>
		<category><![CDATA[twitter]]></category>
		<category><![CDATA[u-City]]></category>
		<category><![CDATA[ubiquitous computing and augmented reality]]></category>
		<category><![CDATA[uCity]]></category>
		<category><![CDATA[Usman Haque]]></category>
		<category><![CDATA[Wave Federation Protocol]]></category>
		<category><![CDATA[Weisarian Ubiquitous Computing]]></category>
		<category><![CDATA[Wikitude]]></category>
		<category><![CDATA[xClinic]]></category>
		<category><![CDATA[XMPP versus HTTP]]></category>
		<category><![CDATA[Yochai Benkler]]></category>

		<guid isPermaLink="false">http://www.ugotrade.com/?p=4439</guid>
		<description><![CDATA[Above is an image from Total Immersion&#8217;s augmented reality experience developed for the &#8220;Networked City&#8221; exhibition in South Korea &#8211; &#8220;a fun scenario created for a u-City&#8217;s infrastructure and city management service.&#8221; &#8220;To the naked eye, the exhibit looks like a bare bones model of a city. But when visitors put on the special [&#8230;]]]></description>
				<content:encoded><![CDATA[<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/09/dhj5mk2g_338cwpzntgp_b.jpg"><img class="alignnone size-medium wp-image-4440" title="dhj5mk2g_338cwpzntgp_b" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/09/dhj5mk2g_338cwpzntgp_b-300x170.jpg" alt="dhj5mk2g_338cwpzntgp_b" width="300" height="170" /></a></p>
<p><em>Above is an image from <a href="http://www.t-immersion.com/" target="_blank">Total Immersion&#8217;s</a> augmented reality experience developed for the <a id="winm" title="&quot;Networked City&quot; exhibition in South Korea" href="http://www.tomorrowcity.or.kr/sv_web/en_US/space.SpaceInfo.web?targetMethod=DoUe04Sub1" target="_blank">&#8220;Networked City&#8221; exhibition in South Korea</a> &#8211; &#8220;a fun scenario created for a <a href="http://www.koreaittimes.com/story/4371/leading-global-u-city" target="_blank">u-City&#8217;s</a> infrastructure and city management service&#8221;</em></p>
<p><strong>&#8220;To the naked eye, the exhibit looks like a bare bones model of a city. But when visitors put on the special AR goggles a whole new world unfolds &#8211; as graphics overlaid on the city model.&#8221;</strong> <em>(<a href="http://gamesalfresco.com/2009/09/14/total-immersion-brings-augmented-reality-to-tomorowcity-todaytomorrow/" target="_blank">Games Alfresco</a>)</em></p>
<p>&#8220;The Networked City&#8221; is a large scale augmented virtuality of a scenario for a networked city. But my guess, reading the <em><a href="http://www.koreaittimes.com/story/4371/leading-global-u-city" target="_blank">Korea IT Times</a></em>, is that the plan is to move from an augmented virtuality to an augmented reality as the Incheon Free Economic Zone (IFEZ) realizes its vision to become a leading u-City &#8211; where reality is turned &#8220;inside out&#8221; (see <a id="x:2w" title="Inside Out Reality" href="http://www.uxmatters.com/mt/archives/2009/08/inside-out-interaction-design-for-augmented-reality.php">Inside Out: Interaction Design for Augmented Reality</a>). If you are not familiar with South Korea&#8217;s u-Cities, <a href="http://www.koreaittimes.com/story/4371/leading-global-u-city" target="_blank">check out this post</a> for a short primer (and note that a <a href="http://www.google.com/trends?q=augmented+reality&amp;ctab=1986817859&amp;geo=all&amp;date=all" target="_blank">Google Trends search on Augmented Reality</a> shows South Korea leaving everyone else in the dust).</p>
<h3>Ubiquitous computing and augmented reality are like adenine and thymine &#8211; a DNA base pair.</h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/09/Screen-shot-2009-09-24-at-11.34.35-PM.png"><img class="alignnone size-medium wp-image-4442" title="Screen shot 2009-09-24 at 11.34.35 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/09/Screen-shot-2009-09-24-at-11.34.35-PM-300x256.png" alt="Screen shot 2009-09-24 at 11.34.35 PM" width="300" height="256" /></a></p>
<p><em>A sky view of Incheon Free Economic Zone (<a href="http://www.koreaittimes.com/story/4371/leading-global-u-city" target="_blank">from Korean IT Times</a>). For more on the IFEZ vision to become a leading u-City <a href="http://www.koreaittimes.com/story/4371/leading-global-u-city" target="_blank">see here</a>.</em></p>
<p><a href="http://www.koreaittimes.com/story/4371/leading-global-u-city" target="_blank">Korea IT Times</a> writes about the u-city concept:</p>
<p><strong>&#8220;Korea began using the term u-City after accepting the concept of ubiquitous computing, a post-desktop model of human-computer interaction created by Mark Weiser, the chief technologist of the Xerox Palo Alto Research Center in California, in 1998. There have been a lot of research in this field since 2002. As a result, many local governments in Korea have applied this concept to various development projects since 2005 based on a practical approach to it.&#8221;</strong></p>
<p>The back story to many of my recent posts, including this one, is an understanding of a relationship between ubiquitous computing and augmented reality that emerged, for me, in a February conversation with Adam Greenfield, <a title="Permanent Link to Towards a Newer Urbanism: Talking Cities, Networks, and Publics with Adam Greenfield" rel="bookmark" href="../../2009/02/27/towards-a-newer-urbanism-talking-cities-networks-and-publics-with-adam-greenfield/">Towards a Newer Urbanism: Talking Cities, Networks, and Publics with Adam Greenfield</a>. In case you missed it, here is the link again, because I think it holds up very well considering the rapid developments of recent months. Also, importantly for this post, it includes a discussion of moving on from Weiserian visions.</p>
<p><a href="http://speedbird.wordpress.com/" target="_blank">Adam Greenfield&#8217;s Speedbird</a> is one of my key sources for understanding &#8220;networked urbanism,&#8221; and the list he makes of <a href="http://speedbird.wordpress.com/2009/03/22/the-elements-of-networked-urbanism/" target="_blank">the elements of networked urbanism here</a> (also see the comments) is my mantra for thinking about the DNA base pair relationship of augmented reality and ubiquitous computing.</p>
<p>Adam Greenfield&#8217;s <strong>&#8220;summary of what those of us who are thinking, writing and speaking about networked urbanism seem to be seeing&#8221;</strong> is:</p>
<p><strong>1. From <em>latent</em> to <em>explicit</em>; 2. From <em>browse</em> to <em>search</em>; 3. From <em>held</em> to <em>shared</em>; 4. From <em>expiring</em> to <em>persistent</em>; 5. From <em>deferred</em> to <em>real-time</em>; 6. From <em>passive</em> to <em>interactive</em>; 7. From <em>component</em> to <em>resource</em>; 8. From <em>constant</em> to <em>variable</em>; 9. From <em>wayfinding</em> to <em>wayshowing</em>; 10. From <em>object</em> to <em>service</em>; 11. From <em>vehicle</em> to <em>mobility</em>; 12. From <em>community</em> to <em>social network</em>; 13. From <em>ownership</em> to <em>use</em>; 14. From <em>consumer</em> to <em>constituent</em>.</strong></p>
<h3>Augmented Reality &#8211; Making Visible the Invisible</h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/09/Screen-shot-2009-09-26-at-2.44.27-PM.png"><img class="alignnone size-medium wp-image-4509" title="Screen shot 2009-09-26 at 2.44.27 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/09/Screen-shot-2009-09-26-at-2.44.27-PM-300x229.png" alt="Screen shot 2009-09-26 at 2.44.27 PM" width="300" height="229" /></a></p>
<p>The screenshot above is one of the coolest &#8220;making visible the invisible&#8221; AR applications. It was developed at the Columbia University Graphics and User Interface Lab, where <a href="http://www1.cs.columbia.edu/%7Efeiner/" target="_blank">Steven Feiner</a> is Director (see the deep list of projects from the lab <a href="http://graphics.cs.columbia.edu/top.html" target="_blank">here</a>). This app &#8220;shows carbon monoxide levels projected over New York City. The height of each ball reflects concentrations of the pollutant.&#8221; Credit: Sean White and Steven Feiner (<a href="http://www.technologyreview.com/computing/23515/page2/" target="_blank">via Technology Review</a>).</p>
<p>The recent emergence of &#8220;magic lens&#8221; augmented reality apps for our smart phones &#8211; <a href="http://www.wikitude.org/" target="_blank">Wikitude</a>, <a href="http://layar.com/" target="_blank">Layar</a>, <a href="http://www.acrossair.com/" target="_blank">Acrossair</a>, <a href="http://support.sekaicamera.com/">Sekai Camera</a>, and many others now &#8211; has given us a new window into our cities. But we have yet to realize the full potential of the AR/ubicomp base pair that can &#8220;make visible the invisible&#8221; and give us new opportunities to relate to the invisible data ecosystems of our cities, not merely as a spectator experience, but as an interactive, in-context, real-time opportunity to reimagine social relations.</p>
<p><a href="http://www.sentientcity.net/exhibit/?p=3" target="_blank">Mark Shepard</a> says in <a href="http://www.sentientcity.net/exhibit/?p=3" target="_blank">his curatorial statement</a> for, <a href="http://www.sentientcity.net/exhibit/" target="_blank">&#8220;Toward the Sentient City:&#8221;</a> (Much more soon on this very significant exhibit which runs from Sept. 17th to Nov. 7th, 2009.)</p>
<p><strong>&#8220;In place of natural weather systems, however, today we find the dataclouds of 21st century urban space increasingly shaping our experience of this city and the choices we make there.&#8221;</strong></p>
<p>Augmented Reality, as Joe Lamantia points out, is becoming the great &#8220;<a id="o0mh" title="ambassador of ubiquitous computing" href="http://www.uxmatters.com/mt/archives/2009/08/inside-out-interaction-design-for-augmented-reality.php">ambassador of ubiquitous computing</a>.&#8221; AR is <strong>&#8220;&#8230;mak[ing] it possible to experience the new world of ubiquitous computing by reifying the digital layer that permeates our inside-out world,&#8221;</strong> and we are only just glimpsing the razor thin end of the wedge in this regard.</p>
<p>I am still working on my <a href="http://www.gov2summit.com/" target="_blank">Gov 2.0 Summit</a> write-up and, amongst other things, I will talk about how an emerging new social contract around open data, here in the US, will put augmented reality apps center stage &#8211; &#8220;doing stuff that matters.&#8221; At <a href="http://www.gov2expo.com/gov2expo2009" target="_blank">Gov 2.0 Expo Showcase</a> Tim O&#8217;Reilly tweeted:</p>
<p><a id="i23q" title="Tim O'Reilly" href="http://twitter.com/timoreilly">Tim O&#8217;Reilly</a> Really enjoyed @capttaco (Digital Arch Design) @ #gov20e: &#8220;Augmented Reality could be a new public infrastructure&#8221; <a href="http://bit.ly/18iCx" target="_blank">http://bit.ly/18iCx</a></p>
<p>Also see Tim O&#8217;Reilly and Jennifer Pahlka on Forbes.com discussing <a href="http://www.forbes.com/2009/09/23/web-squared-oreilly-technology-breakthroughs-web2point0.html" target="_blank">the &#8220;Web Squared&#8221; Era</a> &#8211; <strong>&#8220;the Web Squared era is an era of augmented reality arriving (like the sensor revolution) stealthily, in more pedestrian clothes than we expected&#8230; &#8230;our world will have &#8216;<a href="http://www.orangecone.com/archives/2009/02/smart_things_an.html" target="_blank">information shadows</a>.&#8217; Augmented reality amounts to information shadows made visible.&#8221;</strong></p>
<p>Again, there is a back story to how I came to think about information shadows in relation to augmented reality. So in case you missed it the first time, here is the link to a conversation that began in a hallway meeting between Tim O&#8217;Reilly, Mike Kuniavsky, <a href="http://thingm.com/" target="_blank">ThingM</a>, Usman Haque, <a href="http://www.pachube.com/" target="_blank">Pachube</a>, and Gavin Starks, <a href="http://www.amee.com/" target="_blank">AMEE</a>, at <a href="http://en.oreilly.com/et2009/" target="_blank">ETech earlier this year</a>: <a title="Permanent Link to Dematerializing the World, Shadows, Subscriptions and Things as Services: Talking With Mike Kuniavsky at ETech 2009" rel="bookmark" href="../../2009/03/18/dematerializing-the-world-shadows-subscriptions-and-things-as-services-talking-with-mike-kuniavsky-at-etech-2009/">&#8220;Dematerializing the World, Shadows, Subscriptions and Things as Services: Talking With Mike Kuniavsky at ETech 2009.&#8221;</a></p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/09/Screen-shot-2009-09-26-at-9.32.09-PM.png"><img class="alignnone size-medium wp-image-4547" title="Screen shot 2009-09-26 at 9.32.09 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/09/Screen-shot-2009-09-26-at-9.32.09-PM-300x225.png" alt="Screen shot 2009-09-26 at 9.32.09 PM" width="300" height="225" /></a></p>
<p><a href="http://www.slideshare.net/rlenz/augmented-city-lab-picnic-09" target="_blank">Slide from Augmented City Lab</a> @ <a href="http://www.picnicnetwork.org/" target="_blank">Picnic &#8217;09</a></p>
<h3>So What&#8217;s Next for Mobile Augmented Reality?</h3>
<p><a href="http://www.youtube.com/watch?v=434zw201iN8&amp;feature=player_embedded" target="_blank"><img class="alignnone size-medium wp-image-4513" title="Screen shot 2009-09-26 at 3.45.45 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/09/Screen-shot-2009-09-26-at-3.45.45-PM-300x186.png" alt="Screen shot 2009-09-26 at 3.45.45 PM" width="300" height="186" /></a></p>
<p>These videos from Daniel Wagner&#8217;s team at Graz University of Technology, showing <a href="http://www.youtube.com/watch?v=434zw201iN8&amp;feature=player_embedded" target="_blank">Realtime Panorama Mapping and Tracking on Mobile Phones</a> and <a href="http://www.youtube.com/watch?v=W-mJG3peIXA&amp;feature=player_embedded" target="_blank">Creating an Indoor Panorama in Realtime</a>, indicate, as Rouli from Games Alfresco points out, that there is a lot in store for us at <a href="http://www.icg.tugraz.at/Members/daniel/MultipleTargetDetectionAndTrackingWithGuaranteedFrameratesOnMobilePhones/inproceedings_view">ISMAR09</a>.</p>
<p>We may not be so impressed by directory-style/&#8220;post it&#8221; AR anymore, as these applications have become commonplace so quickly! But while these early mobile AR apps may be disappointing in relation to some futurist visions of AR &#8211; merely AR/ubicomp appetizers &#8211; there are still good implementations of this model coming out (see newcomers to the app store <a id="tzvf" title="Bionic Eye" href="http://mashable.com/2009/09/24/bionic-eye/" target="_blank">Bionic Eye</a> and <a href="http://www.readwriteweb.com/archives/robotvision_a_bing-powered_iphone_augmented_realit.php" target="_blank">RobotVision</a>). And <a href="http://layar.com/" target="_blank">Layar</a>, always on the ball, has upped the ante for the new cohort of AR browsers with <a href="http://layar.com/3d/" target="_blank">Layar 3D</a>.</p>
<p>But as Bruce Sterling <a href="http://www.wired.com/beyond_the_beyond/2009/09/augmented-reality-robotvision/" target="_blank">notes here</a>:</p>
<p><strong>&#8220;In AR, everybody wants to be the platform and the browser, and nobody wants to be the boring old geolocative database. Look how Tim [creator of RobotVision] here, who is like one guy working on his weekends, can boldly fold-in the multi-billion dollar, multi-million user empires of Apple iPhone, Microsoft Bing, Flickr, and Twitter, all under his right thumb.&#8221;</strong></p>
<p> (watch <a id="qxek" title="video here" href="http://www.youtube.com/watch?v=hWC9gax7SCA&amp;feature=player_embedded">video here</a>)</p>
<p>But if you are looking for something more from AR, you probably won&#8217;t have to wait too long. The two pioneering companies in AR, <a href="http://www.t-immersion.com/" target="_blank">Total Immersion</a> &#8211; founded in 1999 &#8211; and <a href="http://www.metaio.com/" target="_blank">Metaio</a> &#8211; founded in 2003 &#8211; are both coming out with &#8220;mobile augmented reality platforms&#8221; in a matter of weeks (see press releases <a href="http://augmented-reality-news.com/2009/09/14/bringing-its-augmented-reality-to-mobile-applications-total-immersion-partners-with-smartphones-app-provider-int13/" target="_blank">here</a> and <a href="http://gamesalfresco.com/2009/09/18/metaio-announcing-mobile-augmented-reality-platform-junaio/" target="_blank">here</a>). And both companies, it seems, will deploy much more sophisticated AR rendering and tracking than we have seen to date.</p>
<p>I approached Bruno Uzzan, founder and CEO of Total Immersion, for an interview as part of my look at the new industry of augmented reality through the eyes of the founding members of the <a href="http://www.arconsortium.org/" target="_blank">AR Consortium</a>. These consortium members are some of the first commercial augmented reality companies.</p>
<p><a href="#jumpto">The interview below</a> with Bruno began early this summer and then we both went on vacation and it picks up after the announcement of the <a href="http://www.int13.net/blog/en/" target="_blank">partnership between Total Immersion and Int13</a>.</p>
<p>The significance of this announcement is that Total Immersion is now positioned to take the augmented reality experiences they have developed for a number of top brands onto multiple mobile platforms with, &#8220;<strong>Int13&#8242;s very clever embedded solution that allows our [Total Immersion's] solutions to work across many [mobile] platforms,&#8221; </strong>while Int13 gets to extend their reach.</p>
<p>Total Immersion has a 50-person R&amp;D team, and their two main focuses have been, first, getting:</p>
<p><strong>&#8220;Augmented Reality to work with as many platforms as possible &#8211; PC, Mac, Mobile, Game Consoles, all those are the platforms that we are targeting. We are currently doing lot of work in the R &amp; D team in cross platform compatibility&#8230;.&#8221;</strong></p>
<p>and, second:</p>
<p><strong>&#8220;Our R&amp;D guys are working on the real world interacting more with the virtual world.Â  And I have started seeing some results which are pretty much crazy and this will be ready for next year.&#8221;</strong></p>
<h3>Pandora&#8217;s Box &#8211; Shared Augmented Realities</h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/09/Screen-shot-2009-09-25-at-1.18.15-AM.png"><img class="alignnone size-medium wp-image-4450" title="Screen shot 2009-09-25 at 1.18.15 AM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/09/Screen-shot-2009-09-25-at-1.18.15-AM-186x300.png" alt="Screen shot 2009-09-25 at 1.18.15 AM" width="186" height="300" /></a></p>
<p>Spes or &#8220;Hope&#8221;; <a title="Engraving" href="http://en.wikipedia.org/wiki/Engraving">engraving</a> by <a title="Sebald Beham" href="http://en.wikipedia.org/wiki/Sebald_Beham">Sebald Beham</a>, German c1540 (see <a href="http://en.wikipedia.org/wiki/Pandora%27s_box" target="_blank">Wikipedia article on Pandora&#8217;s Box</a>)</p>
<p>There are many weaknesses to the mobile smart phone AR experiences we have now, and the lack of near field object recognition (to date) and difficulties with accurate positioning aren&#8217;t the only ones. Note, regarding solving positioning problems in mobile AR: we have yet to see AR leverage public libraries for analyzing scenes, like Flickr&#8217;s geotagged photos &#8211; see Aaron Straup Cope&#8217;s work on <a href="http://code.flickr.com/blog/2008/10/30/the-shape-of-alpha/" target="_blank">&#8220;The Shape of Alpha,&#8221;</a> and for more on this, <a href="http://www.ugotrade.com/2009/06/02/location-becomes-oxygen-at-where-20-wherecamp/" target="_blank">my post here</a>.</p>
<p>But, as Joe Lamantia points out:</p>
<p><strong>&#8220;One of the weakest aspects of the existing interaction patterns for augmented reality is their reliance on single-person, socially disconnected user experiences.&#8221;</strong></p>
<p>In my view, <strong>The Pandora&#8217;s Box of Augmented Realities</strong> is an open, distributed, multiuser augmented reality framework, fully integrated with the internet and world wide web.</p>
<p>As Yochai Benkler has pointed out many times, and argues again in <a href="Capital, Power, and the Next Step in Decentralization" target="_blank">Capital, Power, and the Next Step in Decentralization</a>, it is &#8220;open, collaborative, distributed practices that have been at the core of what made the Internet.&#8221; We have to try to make sure that open, collaborative, distributed practices are at the core of mobile augmented reality.</p>
<h3>Can Google Wave be the basis for an Open, Distributed, Multiuser Augmented Reality Framework?</h3>
<p><a href="http://www.lostagain.nl/tempspace/PrototypeDiagram.html" target="_blank"><img class="alignnone size-medium wp-image-4492" title="Screen shot 2009-09-25 at 11.51.20 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/09/Screen-shot-2009-09-25-at-11.51.20-PM-300x141.png" alt="Screen shot 2009-09-25 at 11.51.20 PM" width="300" height="141" /></a></p>
<p>I have been exploring the idea of using the <a href="http://wave.google.com/" target="_blank">Google Wave</a> protocol as the basis for a distributed, multiuser, open augmented reality framework with a small group of AR enthusiasts and developers. And I am happy to say the proposal is beginning to get fleshed out a little. New collaborators are welcome, both for &#8220;gear heady&#8221; input and use case suggestions (but re the latter, you can&#8217;t just say everything you see in <a href="http://en.wikipedia.org/wiki/Denn%C5%8D_Coil" target="_blank">Denno Coil</a>..!).</p>
<p>This effort started with Thomas Wrobel&#8217;s proposal for an Open AR Framework prototyped on IRC &#8211; see <a id="s336" title="here" href="../../2009/08/19/everything-everywhere-thomas-wrobels-proposal-for-an-open-augmented-reality-network/">here</a>, and click to enlarge the image above of <a href="http://www.lostagain.nl/tempspace/PrototypeDiagram.html" target="_blank">&#8220;Sky Writer: Basic Concept for an Open Multi-source AR Framework.&#8221;</a></p>
<p>But recently we began looking at the <a href="http://www.waveprotocol.org/" target="_blank">Wave Federation Protocol</a>. And, if you check out <a id="ogbq" title="this post," href="http://www.jasonkolb.com/weblog/2009/09/why-google-wave-is-the-coolest-thing-since-sliced-bread.html#more" target="_blank">this post</a> and <a id="c0ep" title="this post" href="http://reuvencohen.sys-con.com/node/980762" target="_blank">this post</a>, you may get a glimpse of why the Google Wave protocol might be a good basis for an open, distributed AR framework. You will notice, if you study what Google Wave has done with the XMPP protocol, that many of <a href="http://speedbird.wordpress.com/2009/03/22/the-elements-of-networked-urbanism/" target="_blank">the elements of networked urbanism</a> that Adam Greenfield describes resonate strongly with what is being attempted in Wave.</p>
<p>But enough said for now! Regardless of the details of implementation &#8211; Google Wave or an AR protocol built from scratch (phew! the latter does seem like a lot of work) &#8211; an open, distributed, multiuser AR framework integrated with the internet and web would explode the potential of AR, creating new possibilities for data flows, mashups, and shared augmented realities.</p>
<p>And we are excited by Google Wave because, as Thomas puts it:</p>
<p><strong>&#8220;The really great thing wave does &#8230;.(aside from being an open standard backed by a major player&#8230;hopefully leading to thousands of worldwide servers )&#8230;.is that it allows anyone to create any number of waves, set precisely who can view or edit them, and for them to be able to be updated quickly and continuously (and even simultaneously!)</strong><strong> Better yet, changes will (if necessary) propagate to all the other servers sharing that wave. It does all this right now. From my eyes this does a lot of the work of an AR infrastructure already.</strong></p>
<p><strong>I can&#8217;t see any other protocol actually doing anything like this at the moment, although correct me if I&#8217;m wrong, as alternatives are always welcome :)&#8221;</strong></p>
<p>Also, Thomas notes, <strong>&#8220;even the playback system (that is, the ability to playback the changes made to a wave since its creation)&#8230;this could give us automatically some of the ideas Jeremy Hight has mentioned in <a href="http://piim.newschool.edu/journal/issues/2009/01/pdfs/ParsonsJournalForInformationMapping_Hight-Jeremy.pdf" target="_blank">his visionary work here</a> and <a href="http://piim.newschool.edu/journal/issues/2009/02/pdfs/ParsonsJournalForInformationMapping_Hight-Jeremy.pdf" target="_blank">here</a> on &#8220;the geo spatial web, interlinked locations and data, immersive augmentation and open source geo augmentation.&#8221;</strong></p>
<p>One of the many reasons an open, distributed AR framework would be so cool is that it would open up all kinds of possibilities for <span>GeoAR</span>, by providing the overarching standard protocol for communicating updates on top of which <span>GeoAR</span> substandards can be built.</p>
<p>Also important to note is that the <a id="o0is" title="Wave Federation Protocol docs which are all publicly available here" href="http://www.waveprotocol.org/" target="_blank">Wave Federation Protocol</a> allows anyone:</p>
<p><strong>&#8220;to run wave servers and become wave providers, for themselves, or as services for their users, and to &#8220;federate&#8221; waves, that is, to share waves with each other and with Google Wave&#8221; &#8211; via &#8220;the federation gateway and a federation proxy,&#8221; which are &#8220;based on an open extension to the <a href="http://www.waveprotocol.org/draft-protocol-spec#RFC3920">XMPP core</a> [RFC3920] protocol to allow near real-time communication between two wave servers.&#8221; See Reuven Cohen&#8217;s blog for more <a id="rmr3" title="here" href="http://reuvencohen.sys-con.com/node/980762" target="_blank">here</a> and <a id="mqxr" title="&quot;HTTP is Dead, Long Live the Real Time Cloud.&quot;" href="http://www.elasticvapor.com/2009/05/http-is-dead-long-live-realtime-cloud.html" target="_blank">here, &#8220;HTTP is Dead, Long Live the Real Time Cloud.&#8221;</a></strong></p>
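<p>To make Thomas&#8217;s points concrete, here is a minimal Python sketch of wave-like semantics for shared AR annotations &#8211; per-participant view/edit permissions, deltas that propagate to every federated server sharing the wave, and playback of the full change history. All the class and method names, and the delta data shape, are invented for illustration; this is not the real Google Wave or XMPP federation API.</p>

```python
# Hypothetical sketch of wave-like semantics for shared AR annotations.
# All names (ARWave, FederatedServer, ...) are invented for illustration;
# this is not the real Google Wave / XMPP federation API.

class FederatedServer:
    """A wave provider holding its own copy of every wave it shares."""
    def __init__(self, domain):
        self.domain = domain
        self.copies = {}  # wave_id -> list of deltas applied so far

    def receive(self, wave_id, delta):
        self.copies.setdefault(wave_id, []).append(delta)


class ARWave:
    """One wave: a shared, editable set of AR annotations."""
    def __init__(self, wave_id, creator):
        self.wave_id = wave_id
        self.roles = {creator: "edit"}  # participant -> "view" | "edit"
        self.history = []               # ordered deltas, kept for playback
        self.servers = []               # federated servers sharing this wave

    def add_participant(self, user, role):
        self.roles[user] = role

    def apply_delta(self, user, delta):
        """Apply a change and push it to every federated server."""
        if self.roles.get(user) != "edit":
            raise PermissionError(f"{user} may not edit {self.wave_id}")
        self.history.append(delta)
        for server in self.servers:  # near real-time propagation
            server.receive(self.wave_id, delta)

    def playback(self):
        """Replay every change since the wave's creation, in order."""
        return list(self.history)


# Usage: one user anchors an AR object; a viewer's server receives it.
server_b = FederatedServer("waveprovider-b.example")
wave = ARWave("ar-graffiti-42", creator="thomas@waveprovider-a.example")
wave.servers.append(server_b)
wave.add_participant("tish@waveprovider-b.example", "view")
wave.apply_delta("thomas@waveprovider-a.example",
                 {"op": "add", "object": "skywriting",
                  "lat": 40.73, "lon": -73.99})
```

<p>The playback method is what would support the Hight-style time views mentioned above: because every delta is retained, any participant&#8217;s client can replay a location&#8217;s annotations from the beginning.</p>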
<p>Still, some people have expressed concern that an AR framework using the Google Wave protocol would give Google disproportionate influence. Will Google-specific functionality be an issue? How much is Google-specific just because no one else is using it (yet)? And how much is Google-specific because it holds no value to anyone but Google? These are some of the questions that have come up.</p>
<p>You are going to see a variety of suggestions for standards and specs for open AR coming out in the next few months, which, as Robert Rice of the <a href="http://www.arconsortium.org/" target="_blank">AR Consortium</a> points out, is <strong>&#8220;a good thing, we need that competition early on to settle down on best case.&#8221;</strong> Recently, <a href="http://www.mobilizy.com/" target="_blank">Mobilizy</a> offered up ARML (&#8220;an augmented reality mark-up language specification based on the OpenGIS&#174; KML Encoding Standard (OGC KML) with extensions&#8221;) for consideration; see <a href="http://www.mobilizy.com/enpress-release-mobilizy-proposes-arml" target="_blank">here.</a></p>
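<p>Since ARML builds on OGC KML, a minimal augmented reality point of interest might look like the sketch below. The Placemark, Point, and coordinates elements are standard KML; the <code>ar:</code> namespace and its elements are placeholders of my own, not quoted from Mobilizy&#8217;s actual specification.</p>

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Standard OGC KML carrying one AR point of interest.
     The ar: namespace and elements below are illustrative
     placeholders, not Mobilizy's actual ARML vocabulary. -->
<kml xmlns="http://www.opengis.net/kml/2.2"
     xmlns:ar="http://example.org/arml-placeholder">
  <Document>
    <Placemark>
      <name>Brooklyn Bridge annotation</name>
      <description>Text shown when the camera points at this spot.</description>
      <ar:provider>example-ar-browser</ar:provider> <!-- placeholder extension -->
      <Point>
        <coordinates>-73.9969,40.7061,0</coordinates> <!-- lon,lat,alt -->
      </Point>
    </Placemark>
  </Document>
</kml>
```

<p>The appeal of this approach is that any existing KML consumer can still read the file, while AR browsers that understand the extension namespace get the extra rendering hints.</p>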
<p>So it is perhaps also important to note that an Open AR Framework should be neutral and transparent with respect to techniques of &#8220;reality recognition&#8221; and methodologies of registration/tracking, allowing new techniques to plug into the system as they evolve, and supporting as many evolving standards as possible.</p>
<p>Augmented reality developers like Total Immersion and others with powerful rendering/tracking AR software should be able to use an Open AR Framework to exchange the data their tracking will use. And the tracking/rendering problems they and other researchers have solved are much harder than figuring out data exchange on a standard infrastructure or protocol!</p>
<p>So I pricked up my ears when I heard Bruno Uzzan, CEO of <a href="http://www.t-immersion.com/" target="_blank">Total Immersion</a> &#8211; the first and currently the largest augmented reality company, with a 50-person R&amp;D team in France and offices in LA, where Bruno himself is now based &#8211; say: <strong>&#8220;Total Immersion is only months away from launching shared mobile augmented reality experiences using near field object recognition/tracking across multiple platforms&#8221;</strong> (for more details read my conversation with Bruno Uzzan <a href="#jumpto">below</a>).</p>
<p>I was happy when I asked Bruno about the possibilities for developing an open, distributed, multiuser augmented reality framework fully integrated with the internet and world wide web (possibly using Google Wave protocols), and he replied:</p>
<p><span id="pnk:" title="Click to view full content"><strong>&#8220;I think this is feasible. I think that&#8217;s doable &#8211; that&#8217;s just my opinion. I mean, some people might have another kind of opinion, but I think that&#8217;s definitely doable.&#8221;</strong></span></p>
<h3>Total Immersion &#8211; working with the &#8220;symbiosis between augmented reality and brands&#8221;</h3>
<p><a href="http://www.youtube.com/watch?v=I7jm-AsY0lU" target="_blank"><img class="alignnone size-medium wp-image-4457" title="dhj5mk2g_344g64g96cq_b" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/09/dhj5mk2g_344g64g96cq_b-300x224.png" alt="dhj5mk2g_344g64g96cq_b" width="300" height="224" /></a></p>
<p>Total Immersion has created many of the best known and most ambitious augmented reality experiences for major brands to date, including Mattel&#8217;s <a title="new toys" href="http://www.readwriteweb.com/archives/mattels_new_web-enabled_avatar_toys_will_offer_augmented_reality.php">new AR toys</a> to be released in conjunction with the James Cameron film Avatar, <a id="dmas" title="AR baseball cards for Topps" href="http://www.youtube.com/watch?v=I7jm-AsY0lU">AR baseball cards for Topps</a>, <a href="http://www.youtube.com/watch?v=I7jm-AsY0lU" target="_blank">video here</a> (or click screenshot above), and the <a href="http://www.publishersweekly.com/article/CA6698612.html?industryid=47152" target="_blank">UK&#8217;s first augmented reality books</a>.</p>
<p>Bruno founded Total Immersion 10 years ago when he was just 27. And the kind of conviction it took to survive as an augmented reality business in the decade before augmented reality captured the world&#8217;s attention is remarkable.</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/09/dhj5mk2g_343dbsph2fz_b1.png"><img class="alignnone size-medium wp-image-4456" title="dhj5mk2g_343dbsph2fz_b" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/09/dhj5mk2g_343dbsph2fz_b1-300x225.png" alt="dhj5mk2g_343dbsph2fz_b" width="300" height="225" /></a></p>
<p>AR&#8217;s first steps out into the world, after 17 years as predominantly a lab science, may be &#8220;wobbly&#8221; (what new technology isn&#8217;t?), and sometimes gloriously kitsch &#8211; check out <a id="d_eu" title="the riotous video of the 3D Interactive Live Show Total Immersion produced in Korea" href="http://www.t-immersion.com/en,video-gallery,36.html" target="_blank">this riotous video of the 3D Interactive Live Show Total Immersion produced in Korea</a> (also see the <a href="http://augmented-reality-news.com/2009/09/15/entertainment-first-interactive-3d-live-show-now-open-in-south-korea/" target="_blank">Total Immersion Augmented Reality Blog</a> for more on TI&#8217;s turnkey Interactive 3D Live Show Solution).</p>
<p>As Joe Lamantia points out <a id="eo6x" title="here" href="http://www.uxmatters.com/mt/archives/2009/08/inside-out-interaction-design-for-augmented-reality.php" target="_blank">here</a>, &#8220;projecting mixed realities into public, common, or social spaces makes them social by default.&#8221;</p>
<p>However, the potential for shared location based augmented reality experiences is as yet untapped. So I see the entry of the most experienced commercial augmented reality company into mobile as pretty interesting. While smart phone AR still has significant limitations, and it certainly differs from some of the futurist dreams of AR (see <a id="x3:y" title="Mok Oh's post here on his disappointment in this regard" href="http://allthingsv.com/2009/09/03/you-know-what-really-grinds-my-gears-augmented-reality/">Mok Oh&#8217;s post here on his disappointment in this regard</a>), it is significant that Total Immersion is committing to becoming a leader in mobile AR.</p>
<p>Our smart phones, the powerful networked sensor devices that so many people carry in their pockets, have proved themselves a &#8220;good enough for now&#8221; mediating device for early manifestations of the ubiquitous computing and augmented reality base pair. And now that AR and ubicomp are mixed into the rich, messy soup of everyday life, commerce, business, marketing, art, entertainment, and government, we should get ready to see these technologies grow up fast, and unfold in surprising ways that lab science didn&#8217;t necessarily predict.</p>
<p>And, perhaps, the new dialogue between scientists and entrepreneurs may spur both communities to outdo themselves.</p>
<p>Particularly, as <a href="http://programmerjoe.com/" target="_blank">Joe Ludwig</a> notes: &#8220;It seems to me that the biggest disconnect between the academics and the entrepreneurs is that they disagree on how far we are from the finish line.&#8221;</p>
<p>See the comments on Ori Inbar&#8217;s post, <a title="Augmented Reality Entrepreneurship: Natural Evolution or Intelligent Design?" rel="bookmark" href="http://gamesalfresco.com/2009/09/22/augmented-reality-entrepreneurship-natural-evolution-or-intelligent-design/">Augmented Reality Entrepreneurship: Natural Evolution or Intelligent Design?</a>, for a courteous but spirited discussion on the potential benefits and frictions of the newly expanded AR community of researchers and entrepreneurs.</p>
<p>As <a href="http://www.cc.gatech.edu/~blair/home.html" target="_blank">Blair MacIntyre</a> (see my long conversation with Blair <a href="http://www.ugotrade.com/2009/06/12/mobile-augmented-reality-and-mirror-worlds-talking-with-blair-macintyre/" target="_blank">here</a>) notes:</p>
<p><strong>&#8220;not all academics and researchers are only interested in the traditional models of impact. Case in point: I wouldn&#8217;t be building unpublishable games, nor investing so much time talking to the press, entrepreneurs and VCs if I did not believe strongly in the value of the impact I am having by doing that &#8211; and I know others with the same attitude.&#8221;</strong></p>
<p>In this vein, check out the Marble Game (<a href="http://www.youtube.com/watch?v=6AKgH4On65A&amp;feature=player_embedded" target="_blank">video here</a>) developed by Steve Feiner and his team at Columbia University. It&#8217;s enabled by Goblin XNA, an open source AR framework built on top of Microsoft&#8217;s XNA, which powers Xbox Live games, Zune games, and some Windows games. For more about Goblin XNA and AR from Columbia, <a href="http://graphics.cs.columbia.edu/projects/goblin/index.htm" target="_blank">see here</a>. (Hat tip to <a href="http://www.oreillynet.com/pub/au/125" target="_blank">Brian Jepson</a> for this link.)</p>
<p><a href="http://www.youtube.com/watch?v=6AKgH4On65A&amp;feature=player_embedded" target="_blank"><img class="alignnone size-medium wp-image-4528" title="Screen shot 2009-09-26 at 5.16.56 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/09/Screen-shot-2009-09-26-at-5.16.56-PM-300x182.png" alt="Screen shot 2009-09-26 at 5.16.56 PM" width="300" height="182" /></a></p>
<p>We are still waiting for the kind of sexy AR specs that might get wide adoption &#8211; nothing totally game changing in <a href="http://gigantico.squarespace.com/336554365346/2009/9/20/eye-for-an-iphone.html" target="_blank">Gigantico&#8217;s AR eyewear roundup</a> (though <a href="http://appft1.uspto.gov/netacgi/nph-Parser?Sect1=PTO1&amp;Sect2=HITOFF&amp;d=PG01&amp;p=1&amp;u=%2Fnetahtml%2FPTO%2Fsrchnum.html&amp;r=1&amp;f=G&amp;l=50&amp;s1=%2220080088937%22.PGNR.&amp;OS=DN/20080088937&amp;RS=DN/20080088937" target="_blank">maybe note this Apple patent</a>). But at least researchers are not afraid to explore the possibilities of AR goggles.</p>
<p>But how far are we now, with or without sexy goggles, from a fuller expression of the base pair DNA of ubiquitous computing and augmented reality?</p>
<h3>We may have a LAN of things before we have an Internet of Things</h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/09/dhj5mk2g_345g9bxbwd3_b1.jpg"><img class="alignnone size-medium wp-image-4534" title="dhj5mk2g_345g9bxbwd3_b" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/09/dhj5mk2g_345g9bxbwd3_b1-300x199.jpg" alt="dhj5mk2g_345g9bxbwd3_b" width="300" height="199" /></a></p>
<p><em>The picture above is from a workshop I attended at <a href="http://confluxfestival.org/2009/about/" target="_blank">Conflux</a> last weekend &#8211; <a href="http://confluxfestival.org/2009/events/workshops/natalie-jeremijenko/" target="_blank">Fish &#8216;n microChips</a>, with <a href="http://www.environmentalhealthclinic.net/people/natalie-jeremijenko/" target="_blank">Natalie Jeremijenko</a>. We are at the site of the <a href="http://www.sentientcity.net/exhibit/?p=5" target="_blank">Amphibious Architecture</a> project (a commissioned work for <a href="http://www.sentientcity.net/exhibit/?cat=3" target="_blank">Toward the Sentient City</a>) and &#8220;a collaborative project with <a href="http://www.environmentalhealthclinic.net/environmental-health-clinic/" target="_blank">xClinic</a>, The Living and other intelligent creatures.&#8221;</em></p>
<p>We are probably as far from some grand futurist visions of ubiquitous computing as we are from some of the futurist visions of augmented reality. But as it turns out, that may not be a bad thing! Recently, <a href="http://twitter.com/mikekuniavsky" target="_blank">@mikekuniavsky</a> noted in a tweet:</p>
<p><span><span>&#8220;Another argument for the LAN of Things before the Internet of Things: <a rel="nofollow" href="http://tinyurl.com/lgp9uq" target="_blank">http://tinyurl.com/lgp9uq&#8221;</a></span></span></p>
<p>Bert Moore, <a href="http://www.aimglobal.org/members/news/templates/template.aspx?articleid=3553&amp;zoneid=24" target="_blank">in the article Mike linked to</a>, points out that the grand vision of an &#8220;internet of things&#8221; with everything connected to everything can &#8220;distract people from thinking about the benefits of RFID in smaller, more easily implemented and cost-justified applications.&#8221; The same argument, I think, applies to sensor networks and augmented reality.</p>
<p>In New York City, a series of commissioned works for the <a href="http://www.archleague.org/" target="_blank">Architectural League of New York&#8217;s</a> exhibit, <a href="http://www.sentientcity.net/exhibit/?cat=3" target="_blank">&#8220;Toward the Sentient City,&#8221;</a> are giving us the opportunity to dip our toes into the ocean of a &#8220;networked urbanism.&#8221; On only a small budget, two of the <a href="http://www.sentientcity.net/exhibit/?cat=4" target="_blank">five commissioned works</a>, <a href="http://www.sentientcity.net/exhibit/?p=5" target="_blank">Amphibious Architecture</a> and <a href="http://www.sentientcity.net/exhibit/?p=43" target="_blank">Natural Fuse</a>, demonstrate how sensor networks can allow us to explore new kinds of communities &#8211; connecting people to environments in interesting ways to create new forms of social agency.</p>
<p><a href="http://www.sentientcity.net/exhibit/?p=5" target="_blank">&#8220;Amphibious Architecture&#8221;</a> &#8211; from The Living Architecture Lab at Columbia University Graduate School of Architecture, Planning and Preservation (directors David Benjamin and Soo-in Yang) and Natalie Jeremijenko of the Environmental Health Clinic at New York University &#8211; uses a skillfully built array of partially submerged sensors (electronics and water are notoriously hard to mix) to pierce the blinding, reflective surfaces of the rivers surrounding Manhattan, creating a new two-way relationship with the ecosystem below: the water, our neighbors the fish, and even a beaver.</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/09/Screen-shot-2009-09-26-at-6.34.56-PM.png"><img class="alignnone size-medium wp-image-4536" title="Screen shot 2009-09-26 at 6.34.56 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/09/Screen-shot-2009-09-26-at-6.34.56-PM-300x125.png" alt="Screen shot 2009-09-26 at 6.34.56 PM" width="300" height="125" /></a></p>
<p><em>Image from <a href="http://www.sentientcity.net/exhibit/?p=5" target="_blank">Toward the Sentient City</a></em></p>
<p>In a similar spirit, &#8220;<a href="http://www.sentientcity.net/exhibit/?p=43" target="_blank">Natural Fuse</a>&#8221; &#8211; Usman Haque, creative director; Nitipak &#8216;Dot&#8217; Samsen, Ai Hasegawa, and Cesar Harada, designers; Barbara Jasinowicz, producer &#8211; creates a network of people and electronically assisted plants to explore what it takes to work together on energy consumption, and to experience the consequences of &#8220;selfish&#8221; and &#8220;unselfish&#8221; behavior interactively before it is too late to modify our actions.</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/09/Screen-shot-2009-09-26-at-6.55.29-PM.png"><img class="alignnone size-thumbnail wp-image-4537" title="Screen shot 2009-09-26 at 6.55.29 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/09/Screen-shot-2009-09-26-at-6.55.29-PM-150x150.png" alt="Screen shot 2009-09-26 at 6.55.29 PM" width="150" height="150" /></a><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/09/Screen-shot-2009-09-26-at-9.37.06-PM.png"><img class="alignnone size-thumbnail wp-image-4548" title="Screen shot 2009-09-26 at 9.37.06 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/09/Screen-shot-2009-09-26-at-9.37.06-PM-150x150.png" alt="Screen shot 2009-09-26 at 9.37.06 PM" width="150" height="150" /></a></p>
<p><em>The &#8220;Greedy Switch&#8221; from <a href="http://www.sentientcity.net/exhibit/?p=43" target="_blank">Natural Fuse</a> on the left. On the right, &#8220;The System&#8221; &#8211; click to enlarge.</em></p>
<p>Much more to come in another post on these works and &#8220;Toward the Sentient City.&#8221; Also an update on how <a href="http://www.pachube.com/">Pachube</a> &#8211; an important part of both these projects, and a significant contribution to ubiquitous computing because it makes it possible to connect environments and create mashups from diverse sensor data feeds &#8211; has matured since my interview with Pachube founder Usman Haque, <a href="http://www.ugotrade.com/2009/01/28/pachube-patching-the-planet-interview-with-usman-haque/" target="_blank">&#8220;Pachube, Patching the Planet,&#8221;</a> in January this year.</p>
<p>In the picture above, <a href="http://www.environmentalhealthclinic.net/people/natalie-jeremijenko/" target="_blank">Natalie Jeremijenko</a> and <a id="r_oi" title="Jonathan Laventhol, Imagination" href="http://www.laventhol.com/about" target="_blank">Jonathan Laventhol</a> give the <a href="http://www.sentientcity.net/exhibit/?p=5" target="_blank">Amphibious Architecture</a> sensor array a last look over before it is lowered into the East River. Jonathan was on a busman&#8217;s holiday, helping out at the pre-launch of Amphibious Architecture near the Manhattan Bridge, NYC.</p>
<p>I was very happy to get a chance to talk to <a title="Jonathan Laventhol, Imagination" href="http://www.laventhol.com/about" target="_blank">Jonathan Laventhol</a> &#8211; more on our conversation in another post. Jonathan Laventhol is <a title="Jonathan Laventhol, Imagination" href="http://www.laventhol.com/about" target="_blank">CTO of Imagination &#8211; one of the world&#8217;s leading design, events, and branding agencies.</a> We talked about the importance of <a href="http://www.pachube.com/" target="_blank">Pachube</a>, which Jonathan called &#8220;The Facebook of Data,&#8221; and how the <strong>symbiosis between brands and augmented reality</strong>, along with healthcare applications, would be key to augmented reality emerging into the mainstream.</p>
<p><em><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/09/dhj5mk2g_340djvd2thc_b.jpg"><img class="alignnone size-medium wp-image-4453" title="dhj5mk2g_340djvd2thc_b" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/09/dhj5mk2g_340djvd2thc_b-235x300.jpg" alt="dhj5mk2g_340djvd2thc_b" width="235" height="300" /></a></em></p>
<p>Natalie Jeremijenko&#8217;s workshop at Conflux, on the social negotiation of technology and how <a href="http://speedbird.wordpress.com/my-book-everyware-the-dawning-age-of-ubiquitous-computing/" target="_blank">&#8220;everyware&#8221;</a> can give us the chance to experience new forms of agency and connection, was totally inspiring. I will cover this too in another post. I have so much awesome stuff to write about at the moment!</p>
<p>None of the projects in &#8220;Toward the Sentient City&#8221; included a mobile augmented reality, or &#8220;magic lens,&#8221; component, but they all pointed to why &#8220;enchanted windows into our newly inside-out reality&#8221; are going to be so important &#8211; and why the DNA base pair of ubicomp and augmented reality can really do stuff that matters.</p>
<h3>Shangri-La &#8211; &#8220;Transfigured City&#8221;</h3>
<p><a href="http://www.kazeebo.com/view/17506/shangrila-episode-14-transfigured-city/"><img class="alignnone size-medium wp-image-4452" title="dhj5mk2g_342g43n6w7k_b" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/09/dhj5mk2g_342g43n6w7k_b-300x249.png" alt="dhj5mk2g_342g43n6w7k_b" width="300" height="249" /></a></p>
<p><em>Screenshot from the <a href="http://en.wikipedia.org/wiki/Shangri-La_%28novel%29" target="_blank">Shangri-La</a> episode <a id="cwnc" title="The Transfigured City," href="http://www.kazeebo.com/view/17506/shangrila-episode-14-transfigured-city/" target="_blank">Transfigured City</a></em></p>
<p>In my AR Consortium founder member interview series, I have found that, understandably, the visionary founders of these first augmented reality companies are a little reticent about sharing their full vision. They are basically in stealth mode in this regard. So since you will not get, from my interview with <a href="http://www.t-immersion.com/" target="_blank">Total Immersion</a> founder and CEO Bruno Uzzan, a fully drawn scenario of his vision for a next generation of shared augmented reality experiences, here&#8217;s a really interesting episode from the anime Shangri-La, called <a id="cwnc" title="The Transfigured City," href="http://www.kazeebo.com/view/17506/shangrila-episode-14-transfigured-city/" target="_blank">Transfigured City</a>, to mull over instead.</p>
<p>As you can tell from this rather long and circuitous intro to my conversation with Bruno Uzzan, I have been investigating shared augmented realities pretty intensively recently. Mike Kuniavsky pointed me to <em><a href="http://en.wikipedia.org/wiki/Shangri-La_%28novel%29" target="_blank">Shangri-La</a></em> and <a id="cwnc" title="The Transfigured City," href="http://www.kazeebo.com/view/17506/shangrila-episode-14-transfigured-city/" target="_blank">Transfigured City</a> in a conversation with Mark Shepard, after Mark&#8217;s presentation at Conflux, <a href="http://confluxfestival.org/2009/events/workshops/mark-shepard/" target="_blank">Sentient City Survival Kit</a>.</p>
<p><a href="http://thingm.com/about-us/team/mike-kuniavsky.html">Mike Kuniavsky</a>, with <a href="http://thingm.com/about-us/team/tod-e-kurt.html">Tod E. Kurt</a>, is a founder of <a href="http://thingm.com/home.html" target="_blank">ThingM</a>, a ubiquitous computing device studio. Mike also researches, designs, and writes about people&#8217;s experiences at the intersection of technology and everyday life &#8211; see his blog, <a href="http://www.orangecone.com/" target="_blank">Orange Cone</a>. I interviewed Mike at ETech &#8211; see <a href="../../2009/03/18/dematerializing-the-world-shadows-subscriptions-and-things-as-services-talking-with-mike-kuniavsky-at-etech-2009/" target="_blank">here</a>.</p>
<p>In <a id="cwnc" title="The Transfigured City," href="http://www.kazeebo.com/view/17506/shangrila-episode-14-transfigured-city/" target="_blank">Transfigured City</a>, the &#8220;Metal Age&#8221; group has to figure out how to share and communicate in a city transfigured by augmented realities/virtualities, where no one sees the same place in the same way. Only one character can work out, from her previous experience of the city, the relationship between the transfigured city and how it used to be.</p>
<p>The conversation I had with <a href="http://www.orangecone.com/" target="_blank">Mike Kuniavsky</a> on <a id="cwnc" title="The Transfigured City," href="http://www.kazeebo.com/view/17506/shangrila-episode-14-transfigured-city/" target="_blank">The Transfigured City</a> continued at a picnic in Washington Square Park the next day with Elizabeth Goodman, whom I met at ETech when she gave a brilliant presentation, <a id="eag1" title="Designing for Urban Green Space" href="http://en.oreilly.com/et2009/public/schedule/detail/5562" target="_blank">Designing for Urban Green Space</a>. We covered so many areas related to ubiquitous computing and augmented realities that this conversation probably deserves a post of its own (my writing to-do list is growing longer!).</p>
<p><a id="on28" title="The Plot Synopsis for Shangri La" href="http://en.wikipedia.org/wiki/Shangri-La_%28novel%29" target="_blank">The Plot Synopsis for Shangri La</a>:</p>
<p><strong>&#8220;In the mid-21st century, the international committee decided to forcefully reduce CO2 emission levels to mitigate the global warming crisis. As a result, the economic market was transferred mainly into the trade of carbon. A great earthquake destroys much of Japan, yet the carbon tax placed on the country is not lifted, so Tokyo is turned into the world&#8217;s largest &#8220;jungle-polis&#8221; that absorbs carbon dioxide. Project Atlas is commenced to plan the rebuilding of Tokyo and oversee the government organization, which the Metal Age group opposes due to its oppressive nature. However, Atlas is only built with enough room for 3,500,000 people and most people are not allowed to migrate into the city. The disparity between the elite within Atlas and the refugees living in the jungles outside of its walls sets up the background of the story.&#8221;</strong></p>
<p><a name="jumpto"><span style="font-size: medium;"><strong> Talking With Bruno Uzzan</strong></span></a></p>
<p><span style="font-size: medium;"><strong><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/09/BrunoUzzanpost.jpg"><img class="alignnone size-medium wp-image-4494" title="BrunoUzzanpost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/09/BrunoUzzanpost-225x300.jpg" alt="BrunoUzzanpost" width="225" height="300" /></a></strong></span></p>
<p><strong>Tish Shute:</strong> We won&#8217;t have fully opened the Pandora&#8217;s Box of Augmented Realities until we have ubiquitous, shared augmented realities, will we?</p>
<p><span id="p-xo" title="Click to view full content"><strong>Bruno Uzzan: Yes. The most important thing for augmented reality is the experience we want to share. Now that we are working on the cell phone, we can potentially do some marketing components that we have already developed on the cell phone. Done. It&#8217;s working.</strong></span></p>
<p><strong>But the most interesting part of it is how these new components [cell phone AR] will be used for marketing campaigns by brands. And we are also pretty much well positioned to transform some of the AR that we currently have working on Mac and PC and to transform these to applications working on mobile devices. </strong></p>
<p><strong>Tish Shute:</strong> We haven&#8217;t really experienced yet what it means to actually share mobile AR experiences?</p>
<p><strong>Bruno Uzzan: It&#8217;s hard &#8211; we did a Facebook app. It&#8217;s a first try, it has a way to go. But </strong><span id="c8ek" title="Click to view full content"><strong>to go more and more into social is the way forward for us &#8211; to share and expand AR experiences. But yes, I mean what you&#8217;re seeing is how two people on two different applications can share that same experience. For sure we are going in that direction. We are currently working on those kinds of solutions &#8211; how people can share and experience together at the same time. That&#8217;s how we start creating excitement in augmented reality, and it&#8217;s coming up.</strong></span></p>
<p><strong>It&#8217;s a new market and there&#8217;s so much more in store for augmented reality. You know, some people are telling me, don&#8217;t you believe that augmented reality is a gimmick? It will be a trend for a few weeks or a few months and then gone? I say, you&#8217;re kidding me. This is only the beginning. I mean I can assure you that the applications that are on the market today are one percent of what we will have five years from now.</strong></p>
<p><strong>Tish Shute: </strong>I agree.</p>
<p><strong>Bruno Uzzan: And I&#8217;m sure that augmented reality will be a part of a lot of components that we are currently using today &#8211; GPS, web browser, glasses; I mean, there are so many applications that will come up shortly. This is only the beginning. I&#8217;m completely convinced that augmented reality will be in three years from now what virtual reality is today, which is a billion dollar market. I know that it&#8217;s not just a gimmick of a few weeks or a few months, because so many brands are jumping into it, spending money, exploring solutions. I know that it&#8217;s not just short term &#8211; what they are willing to do and we are willing to do &#8211; but also middle and long term. And that&#8217;s what makes this adventure pretty much unique, and what makes creating a cutting edge technology very, very exciting for us.</strong></p>
<p><span id="pb9s" title="Click to view full content"><strong>Tish Shute:</strong> First, could you explain more about your partnership with Int13? I am not sure I understand what is in the arrangement from Total Immersion&#8217;s POV. I mean, what happens with your own mobile software development? Haven&#8217;t you only been licensed the Int13 SDK for a limited period of time, with limited access to all its power? </span><span id="p_2y" title="Click to view full content"><a href="http://gamesalfresco.com/2009/09/15/why-int13-got-in-bed-with-total-immersion/" target="_blank">Stephane from Int13 said to Ori on Games Alfresco, here,</a> &#8220;we have licensed the SDK4 for two years,&#8221; and then Ori asks, &#8220;but you have basically kept the power to yourselves, right?&#8221; So if they are the only ones that can enhance it and develop the software, where will TI be in two years in mobile if you haven&#8217;t really had the chance to develop your own software?</span></p>
<p><span id="j5co" title="Click to view full content"><strong>Bruno Uzzan: Actually it&#8217;s a real win-win situation. Int13 is a very small company and they have so many requests they can&#8217;t possibly fulfill them all. So this is a way for both of us to be, as quickly as possible, the first mobile provider for all the requests we have. Also, they give us exclusivity, so nobody else can use the Int13 SDK for such applications. I think that it is a good partnership.</strong></span></p>
<p><strong>And concerning our own mobile applications&#8230; First of all, we currently have some mobile applications working. But with Int13 we have a mobile solution that can work on many different devices. That&#8217;s a fact and that&#8217;s working. And, believe me, you will hear a lot more from us about this soon. We are fully independent in our mobile development. The reason we closed the partnership with Int13 is to be able to deploy mobile in a broad way.</strong></p>
<p><strong>I mean, you know that the difficulty with mobile AR is that each separate device needs some customization. Working on the iPhone is different from working on a Nokia, different from working on a Palm, different from working on a Samsung. Each of them has its own operating system inside, and so we were interested in Int13&#8217;s very clever embedded solution that allows our solutions to work across many platforms.</strong></p>
<p><strong>The reason we are working with Int13 is that we are able to work on so many mobile devices, thanks to Int13. And in the mobile AR race that we are currently in, the next two years will be extremely important to us&#8230;</strong></p>
<p><span id="z_5s" title="Click to view full content"><strong>Tish Shute:</strong> OK, that definitely clarifies it a lot. So Int13 has done an embedded solution to allow TI-developed AR solutions to work easily across many devices?</span></p>
<p><span id="y.wt" title="Click to view full content"><strong>Bruno Uzzan: Yes, they have a kind of embedded solution, a way to address new cell phones extremely quickly&#8230; But, currently on our side, we are in discussions with a mobile company&#8230; and that only refers to some very specific mobile devices. And what they have is also a way to embed our technology deeper into mobile, so that we can more quickly have applications that work on a large number of cell phones.</strong></span></p>
<p><strong>Tish Shute:</strong> So, basically it means you don&#8217;t have to go through some complicated negotiations with each of the cell phone companies, is what you are saying?</p>
<p><strong>Bruno Uzzan: Not only negotiations, but also hard development. You know? Working on Windows Mobile is completely different from working on Palm OS. That&#8217;s different! It&#8217;s a big job to have a mobile application working on many other devices. So Int13 provides a way for us to save some time and some development cost too.</strong></p>
<p><strong>Tish Shute:</strong> And Int13 doesn&#8217;t have powerful AR development tools like <a href="http://www.t-immersion.com/en,interactive-kiosk,32.html" target="_blank">D&#8217;fusion</a> right?</p>
<p><strong> Bruno Uzzan: Right! That&#8217;s right. That&#8217;s why we say it&#8217;s a true win-win solution. They can benefit from our work too. And we can benefit from their work, in order to deploy quicker and faster mobile solutions. </strong></p>
<p><strong>Tish Shute:</strong> Now, the second thing is&#8230; there is a lot of debate and disagreement about how far mobile augmented reality is from delivering something more than the &#8220;post-it&#8221; approach that has been much publicized in recent months via all the AR browser apps.</p>
<p>But from my understanding from the conversation we had earlier this summer (see below), Total Immersion is targeting a much higher level of mobile augmented reality than we&#8217;ve seen to date?</p>
<p><strong>Bruno: Yes, the browser apps we have seen are a kind of augmented reality, but not exactly the way we see it. Let me explain why. With this kind of application it&#8217;s true that you can overlay 3D information and video. That&#8217;s a fact. So, in a sense, that&#8217;s augmented reality. But the way they work out the position of the 3D on that video is by using compass and GPS information, so it means that this AR solution will work only on buildings and physical objects that are FIXED &#8211; in a fixed and known position.</strong></p>
<p><strong>So you want to go to a theater?</strong></p>
<p><span id="a9qv" title="Click to view full content"><strong>The theater is here; for sure it will not move. So you know the position of the theater, and it&#8217;s a fact that you can superimpose an object on the theater. That&#8217;s what can be done currently. What we are achieving and what we are doing on mobile is more than that. We want to be able to port our solution with trading cards, with brands, into a smart phone.</strong></span></p>
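The fixed-position approach Bruno describes here (placing an overlay from GPS and compass readings alone) can be sketched roughly as follows. This is not Total Immersion's or any browser app's actual code; it is a minimal illustration, and the linear angle-to-pixel mapping and function names are assumptions.

```python
import math

def bearing_to(lat1, lon1, lat2, lon2):
    """Initial compass bearing in degrees from the user to a POI."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(y, x)) % 360

def overlay_x(poi_bearing, heading, fov_deg, screen_w):
    """Horizontal pixel position for the overlay, or None if off-screen."""
    # Signed angle from the camera centerline to the POI, in (-180, 180].
    delta = (poi_bearing - heading + 180) % 360 - 180
    if abs(delta) > fov_deg / 2:
        return None  # the POI is outside the camera's field of view
    # Simplified linear mapping of angle to pixels.
    return screen_w / 2 + delta / (fov_deg / 2) * (screen_w / 2)
```

Note that this only works when the POI's coordinates are known and fixed, which is Bruno's point about the theater. And because placement depends entirely on the compass heading, a few degrees of sensor noise moves the overlay by a large fraction of the screen, one source of the on-screen jitter such apps show.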
<p><strong>I&#8217;m assuming that you want a can, a drink can, to be able to trigger an experience. The only way you can do it is to be able to recognize what the can is. And the current solutions that are out there can&#8217;t do that; it&#8217;s impossible.</strong></p>
<p><strong>Tish Shute:</strong> Right, yes. There&#8217;s no near-field object recognition at all in these early browser apps.</p>
<p><strong>Bruno Uzzan: And the solution we have is that we can recognize a can &#8211; in a very, very precise way &#8211; and that activates geo-location, so we can superimpose 3D. In that case, it opens up all the applications that we currently have, so they could work on mobile.</strong></p>
<p><strong>Tish Shute:</strong> So for example, if you&#8217;re working with a soft drink company, people can trigger that experience wherever they see that can?</p>
<p><strong>Bruno Uzzan: Correct. </strong></p>
<p><strong>Tish Shute:</strong> Yes. Yes, I assumed that was what you&#8217;re doing.</p>
<p><strong>Bruno Uzzan: We believe &#8212; and maybe that&#8217;s not the case, but we believe that our marker-less tracking technology is pretty much unique on mobile devices.</strong></p>
<p><strong>I haven&#8217;t yet seen, from anyone, a full augmented reality mobile solution working.</strong></p>
<p><span id="rzqr" title="Click to view full content"><strong>I really see AR being part of the Web 3.0 next generation. The vision I have is that today, when you want information, you go to a website and then you find your information. With AR &#8212; and the future &#8212; I think it will be the opposite. You want information about a product, you just show it to your computer and the information will automatically pop up. I see here a new way to market some key messages, a new way to get information: some physical products by themselves could be a way to get information, and you don&#8217;t have to search for them anymore; it comes out to you.</strong></span></p>
<p><strong>AR is definitely, for me, one of these components. AR is a solution in itself, and AR itself will create these kinds of results in how information is displayed. But I&#8217;m seeing here something that could be part of a new way to have access to information. And that&#8217;s part of the vision I have. Whether it is through mobile phone or web or PC, Mac, whatever, I really believe that this new generation of receiving information will come shortly and could be part of the new 3.0 generation of the web.</strong></p>
<p><strong>Tish Shute:</strong> My friend <a id="evae" title="Gene Becker" href="http://www.genebecker.com/" target="_blank">Gene Becker</a> did <a href="http://www.genebecker.com/2009/09/thinking-about-design-strategies-for-magic-lens-ar/" target="_blank">an interesting post recently on some of the current limitations of mobile AR</a> where he pointed out the problem of:</p>
<p><em>&#8220;<strong>Simplistic, non-standard data formats</strong> &#8211; POIs, the geo-annotated data that many of these apps display, are mostly very simple one-dimensional points of lat/long coordinates, plus a few bytes of metadata. Despite their simplicity there has been no real standardization of POI formats; so far, data providers and AR app developers are only giving lip service to open interoperability. Furthermore, they are not looking ahead to future capabilities that will require more sophisticated data representations. At the same time, there is a large community of GIS, mapping and Geoweb experts who have defined open formats such as <a href="http://georss.org/" target="_blank">GeoRSS</a>, <a href="http://geojson.org/" target="_blank">GeoJSON</a> and <a href="http://code.google.com/apis/kml/documentation/" target="_blank">KML</a> that may be suitable for mobile AR use and standardization.&#8221;</em></p>
<p><span id="v68s" title="Click to view full content"><strong>Bruno Uzzan: That&#8217;s interesting. I know exactly what he is referring to. He is mainly referring to localization and how you can have quick, accurate localization. If you look at current solutions, and you look at the 3-D superimposed on the video, the 3-D is shaking a lot. I don&#8217;t know if you have seen that in some of these early efforts.</strong></span></p>
<p><strong>It&#8217;s hard to use, because part of the magic of augmented reality is when the 3-D is inserted in a very easy and smooth way into your solution. Here, when you see this 2-D or 3-D overlay on the video, it&#8217;s shaking a lot. One reason for this is that the GPS compass is not accurate enough to compute the precise location of the user. And here, what Gene says is interesting. I think we are addressing this localization issue in a pretty smart way.</strong></p>
<p><strong>But to be frank with you, I don&#8217;t believe mobile augmented reality in the extremely short term &#8212; I&#8217;m talking about three weeks, one, two months &#8212; is mature enough for good AR applications. It will be shortly. But for now it is more proof of concept than a true, easy-to-use application.</strong></p>
<p><strong>We are starting to see a lot of new applications coming out, but I really believe that marketing and entertainment are the two key markets for AR right now.</strong></p>
<p><strong>I&#8217;ve been working ten years in augmented reality. And, eight years ago, when I was talking about augmented reality, I was E.T., you know? Nobody understood what I said, and they thought I was crazy. And now, today, it&#8217;s completely different.</strong></p>
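The open geodata formats Gene Becker lists above (GeoRSS, GeoJSON, KML) all express the same minimal POI: a lat/long point plus a few bytes of metadata. As a rough illustration, not any vendor's actual schema (the coordinates and property names here are made up), a GeoJSON Feature for a POI looks like this:

```python
import json

# A minimal GeoJSON Feature for a point of interest. Note that GeoJSON
# orders coordinates [longitude, latitude], not [lat, long].
poi = {
    "type": "Feature",
    "geometry": {"type": "Point", "coordinates": [-73.9857, 40.7484]},
    "properties": {"name": "Example Theater", "category": "venue"},
}

encoded = json.dumps(poi)       # what a data provider would publish
decoded = json.loads(encoded)   # what an AR browser would parse
```

A shared, open format like this is what would let two different AR browsers display the same annotation for the same place, which is the interoperability gap the quote describes.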
<p><strong>Tish Shute:</strong> The Pandora&#8217;s Box of Augmented Realities, in my view, is an open, universal and standard, distributed, multiuser augmented reality framework fully integrated with the internet and world wide web. I have been looking into Google Wave protocols as a basis for this. Would you be interested in this? Do you think it is feasible?</p>
<p><span id="vz68" title="Click to view full content"><strong>Bruno Uzzan: I think this is feasible. I think that&#8217;s doable &#8211; that&#8217;s just my opinion. Some people might have another kind of opinion, but I think that that&#8217;s definitely doable.</strong></span></p>
<p><strong>Tish Shute:</strong> Yes, I suppose an open AR framework involves cooperation and collaboration; it is more about business and politics than technological problems.</p>
<p><strong>Bruno Uzzan: Yes! Actually the Web is politics. Business is politics.</strong></p>
<p><span id="yeg4" title="Click to view full content"><strong>Tish Shute: </strong>I would be interested if anyone on your R&amp;D team would like to look at some of the ideas that are emerging in our little discussion of Google Wave and an open AR framework, and offer feedback. It is an interesting time to give input on the Wave Federation Protocol docs, because nothing is set in stone right now.</span></p>
<p><span id="hzrf" title="Click to view full content"><strong>Bruno Uzzan: Just shoot me an email. I&#8217;ll try to put you in touch with the right person, a team member who can give input on this.</strong></span></p>
<p><span id="hbcd" title="Click to view full content"><strong>Tish Shute: </strong>For mobile augmented reality the best thing we&#8217;ve got now is the phone, right?</span></p>
<p><strong>Bruno Uzzan: Right. </strong></p>
<p><strong>Tish Shute:</strong> And the only way we can use the phone is by holding it up, right? Isn&#8217;t this a bit of an obstacle as you introduce better object recognition and tracking? People are going to have to stop moving to use their phone. What do you feel about that experience? Isn&#8217;t AR eyewear an essential part of a tightly registered AR experience?</p>
<p><strong>Bruno Uzzan: </strong>We don&#8217;t do hardware, and we don&#8217;t have a current solution for eyewear that would do all we need for a good mobile AR experience, so I guess we don&#8217;t have the answer for that right now. But we are beginning to see the next generation of these glasses.</p>
<p><strong>Tish Shute:</strong> But you&#8217;re happy enough with the mobile experience of augmented reality on smart phones that you&#8217;re investing in this next generation of software for it.</p>
<p><strong>Bruno Uzzan: Yes, I know. We know that some applications will not work on the iPhone. And yes, whatever you do, you still need to hold the iPhone, so it means that you can&#8217;t play with your hands anymore. So we know that, partially, some AR solutions we have on other platforms will lose their magical effect on the iPhone.</strong></p>
<p><strong>But I&#8217;m starting to see on the market some glasses that could perhaps be not too expensive &#8212; that&#8217;s a challenge! And easy to use &#8212; that&#8217;s another big challenge. And that could fit on anybody&#8217;s face and head &#8212; there&#8217;s another big challenge. So yes, I&#8217;m starting to see that, but so far AR glasses are only applicable for some very, very specific applications, like design or theme parks or, you know, some specific location where it makes sense to move forward with glasses.</strong></p>
<p><strong>I don&#8217;t believe that kids will use glasses in our toys and for games in the next months, or maybe the next one or two years. But maybe something will come out shortly that could be a big breakthrough and enable us to think another way. But from what we have seen so far and from what we know of this hardware market, I don&#8217;t believe that there is currently a workable solution.</strong></p>
<p><span style="font-size: small;"><strong>Note: The following section of the interview took place earlier in the summer.</strong></span></p>
<p><span id="yvdi" title="Click to view full content"><strong>Tish Shute:</strong> You are the first commercial AR company &#8211; you started in 1999, right?</span></p>
<p><strong>Bruno Uzzan: Yes, you are right. We started extremely early in this augmented reality market. We were the first company worldwide to start doing augmented reality and to start promoting augmented reality. So it&#8217;s true, we are a pretty old player, although the market has only been getting bigger for the last year and a half. For a long time we were alone in the market, and the market was not really there.</strong></p>
<p><strong>But for the past 8 months, the company has been growing really fast.</strong></p>
<p><strong>Tish Shute:</strong> Yes, I&#8217;m sure. Congratulations for hanging in there long enough to get the payoff!</p>
<p><strong>Bruno Uzzan: You know, my background is financial. So I have been driving the company for many years in a very cash-efficient way. We have been waiting for the market to reach maturity before starting to make investments. That&#8217;s the reason we are still here, and that&#8217;s the reason I think we managed pretty smartly the cash that we raised for the company.</strong></p>
<p><strong>Tish Shute:</strong> Yes, there is a saying that when a market takes off you can tell the pioneers because they are the ones with the arrows in their backs. But I am glad you are dodging the arrows!</p>
<p><strong>Bruno Uzzan: You know, I&#8217;ve always driven the company with revenue. And because revenue was not there at the beginning I was extremely cautious about the cash. So now that the company is getting some revenue, for sure we are making more and more investments, and taking advantage of our situation as a worldwide leader of augmented reality.</strong></p>
<p><strong>This situation is not as easy as it appears today, but it&#8217;s now getting better. As you can see, augmented reality has very good momentum, and we are benefiting a lot from that momentum right now.</strong></p>
<p><strong>Tish Shute:</strong> You&#8217;ve been very involved in researching and developing augmented reality tools. Are you still as active in the research area, or are you too busy keeping up with work-for-hire now to be working on research and building new technology for augmented reality?</p>
<p><strong>Bruno Uzzan: Both. First of all, we are part of a lot of projects, either directly with clients like Mattel or with partners that are using our technology to promote and develop other AR projects. From what we have seen, many, many augmented reality projects are currently being done with our solutions.</strong></p>
<p><strong>To continue with your previous question: we are perceived as the leader in that space, and we have some pretty heavy demand for our services. But we are also coming up with new technology, of course, still connected to augmented reality. Our R&amp;D is working in two different directions, which of course also tie together.</strong></p>
<p><strong>The first one is platform development. We want augmented reality to work with as many platforms as possible &#8211; PC, Mac, mobile, game consoles; those are the platforms we are targeting. The R&amp;D team is currently doing a lot of work on cross-platform compatibility.</strong></p>
<p><strong>Tish Shute:</strong> Robert Rice said recently, &#8220;markers and webcams equal Photoshop page curls&#8230;&#8221;</p>
<p><span id="dulu" title="Click to view full content"><strong>Bruno Uzzan: Yes. There are so many concerns with markers. The quality is extremely bad. As soon as you hide a part of the marker, even a slight part, you&#8217;re dead &#8211; you can&#8217;t track the object any more. Compare that to our solution, where, say, you play with cards or with a Mattel toy: even if you hide a part of the toy, it&#8217;s still working.</strong></span></p>
<p><strong>Tish Shute:</strong> But you haven&#8217;t offered the public an SDK for your engine, right? Basically the way people get access to your tools is by working in a partnership with Total Immersion, right?</p>
<p><strong>Bruno Uzzan: Correct. </strong></p>
<p><strong>Tish Shute:</strong> Do you think in the future you might open your SDK? Are you considering that?</p>
<p><strong>Bruno Uzzan: Yes, it would be interesting.</strong></p>
<p><strong>Tish Shute:</strong> So that is something we can see coming soon?</p>
<p><span id="short_transcription0" title="Click to view full content"><strong>Bruno Uzzan: Maybe, because it&#8217;s true that Total Immersion is starting to be mature enough for that kind of tool. The only thing is that we have to get the timing right. It&#8217;s a big decision. You know what I mean? It is a big, big decision. We would then compete with others using our technology.</strong></span></p>
<p><strong>Tish Shute:</strong> Oh, I know, it is a big decision when you have so much skin in the game! But it would be nice to have your SDK be THE platform for AR, wouldn&#8217;t it?</p>
<p><strong>Bruno Uzzan: It is a really big decision that we can&#8217;t just take like that, you know. A lot of friends have told me you have to be extremely careful about timing. The timing is pretty much connected to the maturity of the market. For sure, we see the market becoming more and more mature. But there is a lot of low-hanging fruit we still want to address, to get the best value possible from all the publicity we have and all the clients we have now.</strong></p>
<p><strong>Tish Shute:</strong> Yes, I know. You&#8217;ve been in this game so long. Now, there is an interesting question here about tools and platforms, because, you know, augmented reality has already expanded beyond its original purist definition. And when I talk to people about augmented reality, there are actually a lot of different ideas and priorities about where the tools should go right now. Obviously we have these browser-like applications, but these browser-like applications are not dealing with recognizing near-field objects yet. What are your priorities for tool development, and what are your priorities for AR development in the future? What areas are you going to focus on? Oh dear, that is a rambling question!</p>
<p><strong>Bruno Uzzan: [laughter] So, one of our first priorities is that we need to create our software with one development, one installer, one piece of software that can be spread across different platforms. The same application, the same software, can be used either on a PC, Mac, phone or console. That&#8217;s a lot of work, because it means that our platform has to address many, many different devices, and that&#8217;s a big priority for us because we receive this request from our clients: they want to be able to use one application on many different platforms and devices. So, that&#8217;s the first one.</strong></p>
<p><strong><span id="hk3z" title="Click to view full content">And the second one is to add more and more interactivity between the real and the virtual world. So we are working on some improvements to add real components that will interact with the virtual, and that is also part of our big strategy and direction: these two worlds can more and more be bridged together, linked together, so they can interact with each other.</span></strong></p>
<p><strong>Our R&amp;D guys are working on the real world interacting more with the virtual world. And I have started seeing some results which are pretty crazy; this will be ready for next year.</strong></p>
<p><span id="b1qt" title="Click to view full content"><strong>There are so many different directions for interaction between the real world and virtual world to develop. I&#8217;m sure that ten years from now you&#8217;re going to have AR applications everywhere. It&#8217;s not just a temporary fashion or a gimmick for a few months. We are getting there; it&#8217;s getting stronger and stronger, and we are getting a good adoption rate from consumers. They like it, they test it, they play with it, and brands want more, people want more, and it&#8217;s getting bigger and bigger.</strong></span></p>
<p><strong>Tish Shute:</strong> Yeah, and I totally agree; it&#8217;s not a gimmick, because the interaction between &#8220;virtual&#8221; and &#8220;real&#8221; enhances the magic of both. Another question about your R&amp;D operation: is your R&amp;D still in France, or have you moved totally out to LA?</p>
<p><strong>Bruno Uzzan: We are 50 people in France, and I started this LA office two years ago and moved to LA permanently at that time. So I&#8217;m now permanently located in the US to take care of the US office, knowing that revenues are really getting bigger and bigger in the US. It means that we are getting a lot of traction, working with large companies.</strong></p>
<p><strong>Tish Shute:</strong> My sister lives in Paris. Could I visit your R&amp;D lab at some point? I&#8217;d love to visit!</p>
<p><span id="bt1e" title="Click to view full content"><strong>Bruno Uzzan: Yeah, sure, sure, sure. I mean, if you want to go &#8211; you won&#8217;t have access to all the research, but if you want to go and meet the team, please do.</strong></span></p>
<p><strong>Tish Shute:</strong> I&#8217;d love to.</p>
<p><strong>Bruno Uzzan: No problem. Shoot me an email and I will introduce you to Eric Gehl; he is the COO of the French team, and he can definitely take care of that.</strong></p>
<p><strong>Tish Shute:</strong> That would be fun. Thank you!</p>
<p><strong>Tish Shute:</strong> Recently, AR browser applications have really caught the imagination of the web community, e.g., Layar and Wikitude. Where do you think the most important market for AR is at the moment<span id="k6fx" title="Click to view full content"> &#8211; entertainment, green tech, business, education?</span></p>
<p><strong>Bruno Uzzan: I think that all of those you mention will be important. The first ones that grabbed my attention are entertainment and particularly digital marketing, because they are always searching for new ways to interact with players or consumers. But it&#8217;s just the tip of the iceberg. Medical applications of augmented reality could be huge. Education and edutainment are definitely using more and more augmented reality components, and there are big companies considering using augmentation for education. Museums are very important too. Also augmentation as a kind of free sales tool &#8211; there are so many applications, design, architecture &#8211; so many directions that it&#8217;s hard to say today which one will take the lead.</strong></p>
<p><strong>But I do believe that in the short term the ones that are really, really moving fast are the entertainment business and the digital marketing business.</strong></p>
<p><strong>Tish Shute:</strong> What do you think are the biggest shortcomings with current augmented reality and what are the obstacles that no one has solved yet?</p>
<p><strong>Bruno Uzzan: I think the cell phone is not fully ready for augmented reality &#8211; a lot of people are working on that, but there are still a lot of constraints to getting augmented reality working on a cell phone. From what I have heard, a lot of manufacturers and companies are working in directions that are going to help us a lot to develop some great cell phone applications.</strong></p>
<p><strong>And I think that&#8217;s one of the biggest parts of the game. All the applications that you see on cell phones so far are just gimmicks &#8211; the next big key is how to transform a gimmick cell phone application into a real, industrial, robust application that&#8217;s going to work on a cell phone. So I think that&#8217;s a big challenge for this year.</strong></p>
<p><strong>Most of what we see now is just matching and overlaying some 2D components on a video. This is not what I call AR. With this kind of application, you are far away from doing the registration that we need to do &#8211; you can&#8217;t do it. So here&#8217;s the challenge: how can you get an application like Topps working on a cell phone? That&#8217;s the big challenge &#8211; how we can make that work! You can&#8217;t today get a real AR Topps application working on a cell phone, because there&#8217;s no cell phone that&#8217;s actually ready. But we are working on it, and for the first one that can make that work, it&#8217;s going to be huge.</strong></p>
<p><span id="b9-2" title="Click to view full content"><strong>When you are working with good AR components you need a lot of CPU and GPU power. Today, new cell phones have started to be more and more ready for augmented reality, but you need a really good cell phone to make it work. You can&#8217;t choose an old cell phone, because you have recognition, you have tracking, you have rendering &#8211; you can&#8217;t choose a two-year-old Nokia cell phone to make that work. For sure the newest iPhone is one that can make it work, but that&#8217;s it for now. There is a lot of research &#8211; from large cell phone companies &#8211; on getting more CPU and GPU into their cell phones. But so far we are still waiting for these devices to be released to consumers.</strong></span></p>
<p><strong>Tish Shute: </strong>And the current economic climate has put a damper on MIDs, hasn&#8217;t it? But who can tell? It depends what price points some new MID comes out at, right?</p>
<p><strong>Bruno Uzzan: Correct.</strong></p>
<p><strong>Tish Shute:</strong> Yes, I agree. But basically the interesting thing is, the iPhone can deliver so much of what is necessary, and even if Apple hasn&#8217;t given AR developers access to the full power of the iPhone yet, there is really no going back now &#8211; the mobile augmented reality cat is out of the bag!</p>
<p><strong>Bruno Uzzan: You&#8217;re right, you&#8217;re fully right.</strong></p>
]]></content:encoded>
			<wfw:commentRss>http://www.ugotrade.com/2009/09/26/total-immersion-and-the-transfigured-city-shared-augmented-realities-the-web-squared-era-and-google-wave/feed/</wfw:commentRss>
		<slash:comments>36</slash:comments>
		</item>
		<item>
		<title>Tonchidot: Taking Augmented Reality Beyond Lab Science with Fearless Creativity and Business Savvy</title>
		<link>http://www.ugotrade.com/2009/09/17/tonchidot-taking-augmented-reality-beyond-lab-science-with-fearless-creativity-and-business-savvy/</link>
		<comments>http://www.ugotrade.com/2009/09/17/tonchidot-taking-augmented-reality-beyond-lab-science-with-fearless-creativity-and-business-savvy/#comments</comments>
		<pubDate>Thu, 17 Sep 2009 21:56:49 +0000</pubDate>
		<dc:creator><![CDATA[Tish Shute]]></dc:creator>
				<category><![CDATA[Android]]></category>
		<category><![CDATA[Augmented Reality]]></category>
		<category><![CDATA[digital public space]]></category>
		<category><![CDATA[Instrumenting the World]]></category>
		<category><![CDATA[internet of things]]></category>
		<category><![CDATA[iphone]]></category>
		<category><![CDATA[Mixed Reality]]></category>
		<category><![CDATA[mobile augmented reality]]></category>
		<category><![CDATA[mobile meets social]]></category>
		<category><![CDATA[Mobile Reality]]></category>
		<category><![CDATA[new urbanism]]></category>
		<category><![CDATA[ubiquitous computing]]></category>
		<category><![CDATA[Web 2.0]]></category>
		<category><![CDATA[Web Meets World]]></category>
		<category><![CDATA[websquared]]></category>
		<category><![CDATA[air tagging]]></category>
		<category><![CDATA[air tags]]></category>
		<category><![CDATA[Android developers in Japan]]></category>
		<category><![CDATA[Android phone by NTT DoCoMo]]></category>
		<category><![CDATA[anime]]></category>
		<category><![CDATA[AR Commons]]></category>
		<category><![CDATA[AR Consortium]]></category>
		<category><![CDATA[augmented reality apps on symbian phones]]></category>
		<category><![CDATA[augmented reality as a new public infrastructure]]></category>
		<category><![CDATA[augmented reality eyewear]]></category>
		<category><![CDATA[Bruce Sterling]]></category>
		<category><![CDATA[Carl Malamud]]></category>
		<category><![CDATA[Denno Coil]]></category>
		<category><![CDATA[Google Wave]]></category>
		<category><![CDATA[Google Wave Protocol]]></category>
		<category><![CDATA[Gov 2.0 Summit]]></category>
		<category><![CDATA[Japanese augmented reality eyewear]]></category>
		<category><![CDATA[japanese iphone use]]></category>
		<category><![CDATA[japanese mobile culture]]></category>
		<category><![CDATA[Japanese mobile market]]></category>
		<category><![CDATA[Ken Inoue]]></category>
		<category><![CDATA[manga]]></category>
		<category><![CDATA[markerless augmented reality]]></category>
		<category><![CDATA[Mitsuo Ito]]></category>
		<category><![CDATA[Sekaicamera]]></category>
		<category><![CDATA[Takahito Iguchi]]></category>
		<category><![CDATA[Tonchidot]]></category>
		<category><![CDATA[Wikitude]]></category>

		<guid isPermaLink="false">http://www.ugotrade.com/?p=4410</guid>
		<description><![CDATA[Sekai Camera has a slick new demo video out that is already causing a stir in the Japanese press (see Beyond the Beyond). This video shows a ton of stuff going on! (A friend who lives in Tokyo pointed out to me that, in Japan, people are used to working with &#8220;busier&#8221; mobile UIs.) Takahito [&#8230;]]]></description>
				<content:encoded><![CDATA[<p><a href="http://www.youtube.com/watch?v=ORRZgEx0_Lc&amp;feature=player_embedded" target="_blank"><img class="alignnone size-medium wp-image-4411" title="Screen shot 2009-09-17 at 12.57.03 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/09/Screen-shot-2009-09-17-at-12.57.03-PM-300x243.png" alt="Screen shot 2009-09-17 at 12.57.03 PM" width="300" height="243" /></a></p>
<p><a href="http://support.sekaicamera.com/" target="_blank">Sekai Camera</a> has a <a href="http://www.youtube.com/watch?v=ORRZgEx0_Lc&amp;feature=player_embedded" target="_blank">slick new demo video</a> out that is already causing a stir in the Japanese press (<a id="w1av" title="see Beyond the Beyond" href="http://www.wired.com/beyond_the_beyond/2009/09/augmented-reality-sekaicamera-demo/">see Beyond the Beyond</a>). This video shows a ton of stuff going on! (A friend who lives in Tokyo pointed out to me that, in Japan, people are used to working with &#8220;busier&#8221; mobile UIs.)</p>
<p>Takahito Iguchi, founder of <a href="http://translate.google.com/translate?hl=en&amp;sl=ja&amp;u=http://www.tonchidot.com/&amp;ei=TJ6ySvupL4LVlAfEnPjvDg&amp;sa=X&amp;oi=translate&amp;resnum=1&amp;ct=result&amp;prev=/search%3Fq%3DTonchidot%26hl%3Den%26client%3Dfirefox-a%26rls%3Dorg.mozilla:en-US:official%26hs%3DPW3" target="_blank">Tonchidot</a>, the company that has created Sekai Camera, is ultra cool. Coming to augmented reality from the worlds of anime and manga culture, he is already a successful entrepreneur with excellent sartorial taste (as <a href="http://www.wired.com/beyond_the_beyond/2009/09/augmented-reality-sekaicamera-demo/" target="_blank">Bruce Sterling notes</a>). Before turning his attention to AR, Iguchi-san was founder of Digitao, where he pioneered a blogging + social networking service, &#8220;chibikki (Little Diary).&#8221; Iguchi-san also spent time at JUST Systems and Scitron &amp; Art, where he developed innovative multimedia platforms and web services.</p>
<p>But Takahito Iguchi doesn&#8217;t give interviews in English. So recently, as part of my series of interviews with members of the <a href="http://www.arconsortium.org/" target="_blank">AR Consortium</a>, I found myself talking to the brilliant <a href="http://www.tonchidot.com/corporate-profile.html" target="_blank">CFO of Tonchidot</a>, <a href="http://www.linkedin.com/ppl/webprofile?action=vmi&amp;id=499984&amp;pvs=pp&amp;authToken=r8TF&amp;authType=name&amp;trk=ppro_viewmore&amp;lnk=vw_pprofile" target="_blank">Ken Inoue.</a> Inoue-san&#8217;s specialties include the Japanese mobile market, start-up finance, alliances, new business development, and international expansion.</p>
<p>And while, perhaps, I would have liked to learn more about how cool Japanese sub-cultures are informing the future of AR, with every business analyst under the sun opining on the future of this young industry, it is good to hear directly from an augmented reality CFO who is actually shaping business development on the ground. And Tonchidot is one of AR&#8217;s most interesting start-ups.</p>
<p>With Tonchidot, I think we are beginning to taste a magic brew as augmented reality, long nurtured only in lab scientist cultures, meets business savvy and fearless creativity.</p>
<p>Bruce Sterling <a href="http://www.wired.com/beyond_the_beyond/2009/09/augmented-reality-sekaicamera-demo/" target="_blank">posted the video below</a>, noting:</p>
<p><strong> &#8220;Tonchidot tearin&#8217; it up at the department store.  Check out that exceedingly weird and/or clever AR-iPhone <em>pistol grip device</em> that kicks in around 2:20.&#8221;</strong></p>
<p><strong><a href="http://www.youtube.com/watch?v=FiVFVdl3EA4&amp;feature=player_embedded#t=115" target="_blank"><img class="alignnone size-medium wp-image-4414" title="Screen shot 2009-09-17 at 2.34.42 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/09/Screen-shot-2009-09-17-at-2.34.42-PM-300x181.png" alt="Screen shot 2009-09-17 at 2.34.42 PM" width="300" height="181" /></a></strong></p>
<h3>The AR Commons</h3>
<p>In the interview below, Ken Inoue also describes an important organization that Tonchidot has helped create &#8211; the <a href="http://translate.google.com/translate?client=tmpg&amp;hl=en&amp;u=http%3A%2F%2Fwww.arcommons.org%2F&amp;langpair=ja|en" target="_blank">AR Commons</a>.</p>
<p><strong>Ken Inoue:</strong> <strong>We feel that public data, such as landmarks, government facilities, and public transport should be shared. We see an AR world where people can readily and easily access information by just seeing &#8211; quick, easy, and efficient. And because of this ease and intuitiveness, children, the elderly and handicapped will surely benefit. AR could help create a safer society. Warnings, alerts, and safety information could save lives and avoid disasters. These are what we, and <a href="http://translate.google.com/translate?client=tmpg&amp;hl=en&amp;u=http%3A%2F%2Fwww.arcommons.org%2F&amp;langpair=ja|en" target="_blank">AR Commons</a> would like to tackle in the not so distant future.</strong></p>
<p>An AR Commons is something we should all be thinking about. <strong>&#8220;Augmented reality could be a new public infrastructure,&#8221;</strong> as <a href="http://twitter.com/timoreilly" target="_blank">Tim O&#8217;Reilly noted on Twitter</a>. I will discuss this more in my upcoming post on the recent <a href="http://www.gov2summit.com/" target="_blank">Gov 2.0 Summit</a>, which was an extraordinary event &#8211; an historic manifestation of the current wave of transformation in the nature of government that <a href="http://en.wikipedia.org/wiki/Carl_Malamud" target="_blank">Carl Malamud</a> described in his address, &#8220;By The People,&#8221; available as <a href="http://public.resource.org/people/" target="_blank">video, audio and text here</a>. Carl Malamud received a standing ovation at the Summit.</p>
<p>Malamud pointed out:</p>
<p><strong>&#8220;We are now witnessing a third wave of change &#8211; an Internet wave &#8211; where the underpinnings and machinery of government are used not only by bureaucrats and civil servants, but by the people.&#8221;</strong></p>
<h3>Talking with Ken Inoue, CFO, Tonchidot</h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/09/tonchidot.png"><img class="alignnone size-thumbnail wp-image-4416" title="tonchidot" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/09/tonchidot-150x150.png" alt="tonchidot" width="150" height="150" /></a></p>
<p><span style="background-color: #ffffff;"><strong>Tish Shute:</strong> There has been some skepticism lately that augmented reality experiences will live up to the recent hype (</span><a id="fk:l" style="background-color: #ffffff;" title="see this post for example" href="http://www.genebecker.com/2009/09/thinking-about-design-strategies-for-magic-lens-ar/">see this post for example</a><span style="background-color: #ffffff;">). But Tonchidot has a reputation for creativity, as you pointed out, &#8220;we are not &#8220;AR lab scientists&#8221; &#8211; we are from the worlds of multimedia, visual arts, publishing, lovers of manga and anime and Japanese sub-culture&#8230;. &#8221; What is Tonchidot&#8217;s approach to designing AR experiences that can deliver wonder, curiosity and discovery &#8211; the emotions of AR, despite the limitations of GPS+compass implementations of mobile AR? </span><br style="background-color: #ffffff;" /> <strong><br />
Ken Inoue:</strong> <strong>We have been facing skepticism ever since we started! It doesn&#8217;t really bother us, it never has. As for the opposite, the recent hype, well, we will have to live with that too. We are aware of the hype cycle, the obstacles that lie ahead. We are not rejoicing, and we will be prepared. By the way, I didn&#8217;t think the said article was skeptical at all &#8211; in fact, I took it as great advice.<br />
</strong></p>
<p><strong>Tish Shute:</strong> Most augmented reality experiences at the moment are about one person experiencing multiple streams of content; we haven&#8217;t seen any multiuser realtime interaction in augmented reality yet &#8211; for example, people teaming up to accomplish some goal. What do you think will be the most exciting aspects of shared augmented reality experiences? And we have yet to see a mobile augmented reality game reach a mass audience. Pong was a landmark game for the PC. It really excited people because there was a &#8220;Wow! my physical action is changing what is seen?&#8221; What would be an equivalent Wow! experience for augmented reality?<br />
<a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/09/Screen-shot-2009-09-17-at-3.38.57-PM.png"><img class="alignnone size-medium wp-image-4417" title="Screen shot 2009-09-17 at 3.38.57 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/09/Screen-shot-2009-09-17-at-3.38.57-PM-300x148.png" alt="Screen shot 2009-09-17 at 3.38.57 PM" width="300" height="148" /></a></p>
<p><strong>Ken Inoue: We wish we had an answer to that! <img src="http://www.ugotrade.com/wordpress/wp-includes/images/smilies/icon_smile.gif" alt=":)" class="wp-smiley" /> We are talking to many game developers, and everyone has different ideas&#8230; we want to test them all! We are striving to be a social application, and we are thinking hard. But oftentimes, users find new uses and come up with really unexpected but ingenious ideas&#8230; that&#8217;s the nature of social experiences, I guess.</strong><br /><br /> <strong>Tish Shute: </strong>It has been a year since you demoed at TC50. What have been the most exciting developments in augmented reality this year, and what have been the biggest disappointments?</p>
<p><span style="background-color: #ffffff;"><strong>Ken Inoue: We&#8217;re definitely excited about what other start-ups in the field are doing across the ocean.Â  We get a lot of stimulation, and we see it as something close to a great sporting rivalry, but only, we aren&#8217;t that great yet&#8230;.Â  Our disappointment was that we weren&#8217;t able to release our app this summer&#8230;. </strong></span></p>
<p><strong>Tish Shute:</strong> I know you can&#8217;t give too many details about your upcoming iPhone launch because you are in &#8220;stealth mode&#8221; and because of Apple&#8217;s NDA. But I will start with a general question: &#8220;Do you think that Apple is going down the right path with what they are or aren&#8217;t making available to developers?&#8221;</p>
<p><strong>Ken Inoue:</strong> <strong>Looking back at Apple&#8217;s short history in iPhone and AppStore, they&#8217;ve slowly but steadily headed in the path of more openness. And what with the FCC making an inquiry to Apple about the rejected Google Voice application, they&#8217;re forced to be more friendly and open to developers, whether they want to or not&#8230;</strong></p>
<p><strong>Tish Shute:</strong> How is Tonchidot going to differentiate itself in an exploding field of new augmented reality companies?</p>
<p><strong>Ken Inoue:</strong> <strong>Well, I feel that the market for augmented reality is still in such a nascent stage that the priority for many of us is cooperation, rather than cut-throat competition. That&#8217;s the rationale for the <a href="http://www.arconsortium.org/" target="_blank">AR Consortium</a> that was founded by Robert Rice and others very recently, and something that we completely subscribe to. In Japan, Tonchidot is the central proponent of <a href="http://translate.google.com/translate?client=tmpg&amp;hl=en&amp;u=http%3A%2F%2Fwww.arcommons.org%2F&amp;langpair=ja|en" target="_blank">AR Commons</a>, an organization which has already started building a social database for AR.</strong></p>
<p><strong>Tish Shute:</strong> What do you think of the augmented reality applications released recently?</p>
<p><strong>Ken Inoue: There are now so many cool AR apps out there &#8211; we&#8217;d like to think that our presentation at TC50 back in September 2008 stimulated fellow developers just a little bit. Many AR applications and services seem to capture the benefits of AR in some way or another very well. I think maybe the difference between our service and what some others are doing is that we are initially focused on UGC (user generated content) &#8211; not on business applications and tools. However, it&#8217;s just a matter of prioritization, I think &#8211; it seems we all share the same dream!</strong></p>
<p><strong>Tish Shute:</strong> Yes, I like the way you have taken the concepts of &#8220;world camera&#8221; and &#8220;air tagging&#8221; and focused on the social aspects &#8211; social tagging. Wikitude now has a way for users to create tags &#8211; <a href="http://www.wikitude.org/add-content" target="_blank">wikitude.me</a> &#8211; which is a big step forward too, I think.</p>
<p><strong>Ken Inoue:</strong> <strong>Yes, indeed! It seems they have done a great job. Their success, and the success of everyone else helps us too, since it generates media attention, and also ideas for how it can be applied to the real world and real businesses.</strong></p>
<p><strong>Tish Shute:</strong> There is a growing number of AR browser-like experiences &#8211; Wikitude, Layar, and Sekai Camera &#8211; but they are not true browser experiences (in the sense that we experience web browsers) as they don&#8217;t share AR data across browsers. How can we move towards a situation of sharing augmented reality data? What are the obstacles to sharing AR data across browsers now? I guess these obstacles are mainly business obstacles, not technical obstacles. But what do you think?<br /><br />
<strong>Ken Inoue:</strong> <strong>Because AR is in many ways location dependent, geographic coverage will always be a challenge for anyone. This means that collaboration makes sense. I don&#8217;t think there are many technical obstacles, and some things can already be shared through open APIs. The issue of sharing AR data cannot be solved by any one company &#8211; we believe we must make collaborative efforts.</strong><br />
<br />
As I mentioned, we helped create an organization called <a href="http://www.arcommons.org/" target="_blank">AR Commons</a> which has already started building a social database for AR in Japan. However, sharing ALL data on this platform will be a challenge, since so many interests will need to be aligned. Not all info is shared on the internet, and some prefer closed and secure environments.</p>
<p><strong>Tish Shute: </strong>What is your vision for AR Commons in the next 12 months?</p>
<p><strong>Ken Inoue:</strong> <strong>We feel that public data, such as landmarks, government facilities, and public transport should be shared. We see an AR world where people can readily and easily access information by just seeing &#8211; quick, easy, and efficient. And because of this ease and intuitiveness, children, the elderly and handicapped will surely benefit. AR could help create a safer society. Warnings, alerts, and safety information could save lives and avoid disasters. These are what we, and AR Commons, would like to tackle in the not so distant future.</strong></p>
<p><strong>Tish Shute: </strong>What is the business model for Sekai Camera? Do you have to subscribe to create? Otherwise just view?</p>
<p><strong>Ken Inoue:</strong> <strong>All users can create AirTags &#8211; we want to allow all users to start AirTagging and add value to our service. We wanted everybody to make tags, and we didn&#8217;t want to put a hurdle on it.</strong></p>
<p><strong>So, users can create text, voice, image/photo tags and can add comments on the tags &#8211; much like blogging and twitter. We will also mash up with many other social services which will strengthen the &#8220;social&#8221; aspect of our app.</strong></p>
<p><strong>Tish Shute:</strong> Are you aiming for something close to the real time experience of Twitter? And what will attract users over other social location based apps like Brightkite using 2 dimensional maps?</p>
<p><strong>Ken Inoue:</strong> <strong>Our service is very close to real-time already &#8211; only, because of the location specific aspect, it will be different. It will definitely be something new. Maps will also be integrated.</strong></p>
<p><strong>Tish Shute:</strong> And Sekai Camera will work anywhere in the world?</p>
<p><strong>Ken Inoue: We have named and designed it to be global! <img src="http://www.ugotrade.com/wordpress/wp-includes/images/smilies/icon_smile.gif" alt=":)" class="wp-smiley" /> </strong></p>
<p><strong>However, it&#8217;s definitely easier for any company to focus on its home market first. Being a Japanese company, we are initially concentrating on the Japanese market. It&#8217;s still the second largest economy in the world, one of the leaders in the mobile internet market, full of geeks and early adopters of new technologies. And what&#8217;s more, we already have a great buzz here, and it&#8217;s easier to talk and collaborate with local partners. For any company building AR apps, geography and platform may be the difficult decisions to make, since first-mover advantage may become quite significant&#8230; We are lucky to have such a large and hungry home market.</strong></p>
<p><strong>Tish Shute: </strong>Yes you have Denno Coil too. One of my big inspirations!</p>
<p><strong>Ken Inoue: Oh, you know about it!Â  How surprising!</strong></p>
<p><strong>Tish Shute:</strong> You mentioned your CEO <a href="http://ascii.jp/elem/000/000/409/409940/" target="_blank">did a talk session with the creator of Denno Coil recently</a>.</p>
<p><strong>Ken Inoue: Yes, &#8220;Denno Coil&#8221; shows us what the future could be, and is very inspiring. We actually didn&#8217;t know about Denno Coil until afterwards, although it was broadcast on national TV.</strong></p>
<p><strong>There is a picture on the web article of our talk session &#8211; on the right, you see our CEO, Takahito Iguchi; on the left, Mitsuo Iso, creator of Denno Coil. Iso-san knew about Sekai Camera and, in fact, gave us lots of hints and advice on how to make it better. He is a real technology lover &#8211; a Mac lover and an iPhone lover.</strong></p>
<p><strong><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/09/Screen-shot-2009-09-17-at-4.55.06-PM.png"><img class="alignnone size-medium wp-image-4423" title="Screen shot 2009-09-17 at 4.55.06 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/09/Screen-shot-2009-09-17-at-4.55.06-PM-300x201.png" alt="Screen shot 2009-09-17 at 4.55.06 PM" width="300" height="201" /></a><br />
</strong></p>
<p><strong>Tish Shute: </strong>Iguchi-san is a very inspiring and charismatic thinker, and I would love to know about some of his imaginings for augmented realities. What are his AR imaginings for the next step after air tagging? What does Iguchi-san see Tonchidot doing in 2010? And then beyond that? And what are some augmented realities he would like to see even beyond the limitations of current technologies?</p>
<p><strong>Ken Inoue: We believe the possibilities are infinite! There are so many things we can and would like to do, but such limited resources&#8230; So here again, what we and other fellow AR pioneers will be doing will depend on how we prioritize. We would like to keep our plans secret for now. <img src="http://www.ugotrade.com/wordpress/wp-includes/images/smilies/icon_smile.gif" alt=":)" class="wp-smiley" /> </strong></p>
<p><span style="background-color: #ffffff;"><strong>Tish Shute:</strong> Does Iguchi-san see Tonchidot doing more with image recognition and the tight alignment of graphics with physical objects in the near future?</span></p>
<p><strong>Ken Inoue: Yes, definitely! We are already in talks with potential partners. There are some great technologies here in Japan, which were just waiting for us!</strong></p>
<p><strong>Tish Shute:</strong> And when will we get the kind of eyewear that would really change everything? (I noticed <a href="http://www.masunaga1905.jp/brand/teleglass/" target="_blank">one Japanese company that is producing eyewear</a> &#8211; what is their potential?) Are there other eyewear initiatives in Japan? What does Tonchidot think will be key to pushing this kind of hardware development for AR forward?</p>
<p><strong>Ken Inoue: Yes indeed, the world of Denno Coil is not too far away&#8230;. There are actually many projects going on in Japan, and we are definitely interested in hardware development. We are not short of world-class hardware developers here in Japan, and we have been approached by quite a few.</strong></p>
<p><strong>Tish Shute:</strong> I know you got some criticism for showing a concept video at last year&#8217;s TechCrunch50 which people felt didn&#8217;t show the technology you had actually developed. Do you have all the functionality shown in your video working now?</p>
<p><strong>Ken Inoue: Hmm&#8230; We did get criticism, and so, it seems, did TechCrunch &#8211; but we got far more praise and support! I guess we really felt we needed to get the idea out there &#8211; as Robert said in your interview, it&#8217;s hard to make people understand the full potential of AR. And unless you show something like that in video form, it&#8217;s difficult to make people understand. We showed TC50 our working prototype on the iPhone, and made it clear that the video was a vision of the future. Because of the language barrier, we used simple phrases like &#8220;Look up, not down&#8221; and &#8220;AirTag&#8221;. TC50 let us make the presentation, for which we are very, very thankful.</strong></p>
<p><strong>Tish Shute:</strong> Oh, I love the term Air Tagging. It is a brilliant term! Robert Rice noted it has the ring of terms like Xerox and Kleenex &#8211; i.e. a brand that becomes the &#8220;thing&#8221; and no longer a brand &#8211; congrats! Sekai (World) Camera is a really nice concept too!</p>
<p><strong>Ken Inoue: Thanks!</strong></p>
<p><strong>Tish Shute:</strong> Recently @rhymo of SPRXMobile tweeted that Samsung NL was calling #augmentedreality the Optical Internet. The resulting Twitter discussion gave a pretty resounding thumbs down to the term Optical Internet, with no&#8217;s from @bruces and my friend Gene Becker.</p>
<p><em><strong>RT @genebecker: No @Rhymo, Optical Internet misses the point that #AR will be multimodal, multisensory, social, contextual</strong></em><br /> <br /> I tweeted that I thought Tonchidot may be able to improve on the term augmented reality, considering your great track record with wordsmithing. Has the Tonchidot team got any ideas for a better term?</p>
<p><strong>Ken Inoue: Good question &#8211; the term &#8220;AR&#8221; is too techy/difficult&#8230; we agree.<br />
But we haven&#8217;t thought of a good alternative term yet&#8230;</strong><br />
<br />
<strong>Tish Shute:</strong> Who came up with the term &#8220;air tags?&#8221;</p>
<p><strong>Ken Inoue: Our CEO, Takahito Iguchi did. He has a talent for creating names, phrases&#8230; and the future, we hope. <img src="http://www.ugotrade.com/wordpress/wp-includes/images/smilies/icon_smile.gif" alt=":)" class="wp-smiley" /><br />
Our members are not &#8220;AR lab scientists&#8221; &#8211; we are from the worlds of multimedia, visual arts, publishing, lovers of manga and anime and Japanese sub-culture&#8230;.</strong></p>
<p><strong>Tish Shute:</strong> You mentioned Tonchidot has been very involved in the Android development community in Japan. Can you tell me more about this, and what areas has Tonchidot been most interested in? What do the Tonchidot developers think have been the most exciting new developments with Android?</p>
<p><strong>Ken Inoue: Yes, core members of our tech team are key members of the Android movement in Japan, and we are influenced greatly by what&#8217;s happening there. Their openness is very, very attractive indeed! It was a tough decision whether to choose Android or iPhone as our first application platform. There are pros and cons.</strong></p>
<p><strong>The Android dev community is unofficial, of course, but we have been invited to speak and do demos very often &#8211; <a href="http://www.mobilecrunch.com/2009/03/20/sekai-camera-mobile-social-tagging-is-coming-to-android-phones-too/" target="_blank">one of our demos is in the media</a> &#8211; shooting games on Android. It was quite a while back, and our app is now far ahead.</strong></p>
<p><strong>Tish Shute:</strong> But Sekai Camera will be released on the iPhone?</p>
<p><strong>Ken Inoue: YES, if all goes well &#8211; as many have pointed out, iPhone is not PERFECT &#8211; no device is, at least currently.</strong></p>
<p><strong>Tish Shute:</strong> Yes, and how is iPhone uptake in Japan? The big plus in the US is the big user base.</p>
<p><strong>Ken Inoue: Yes that&#8217;s the big difference. In Japan, Softbank, the #3 carrier is marketing it &#8211; for now. They don&#8217;t release numbers, but I think there are 1M handsets already sold. Still very small compared to other markets. BTW, in Japan, roughly 35M handsets were sold last year, dropping from 50M in previous years.</strong></p>
<p><strong>Tish Shute:</strong> Yes, it seems at the moment application developers are forced to choose between the US market and the rest of the world! So what is the status of Android in the Japanese mobile market &#8211; the iPhone is pretty tiny?</p>
<p><strong>Ken Inoue: We just had a release of the first Android phone by NTT DoCoMo a couple of months ago, so still very very early.</strong></p>
<p><strong>Tish Shute:</strong> So the Android phone market is even smaller than the iPhone&#8217;s?</p>
<p><strong>Ken Inoue: Yes, and so our decision to release on the iPhone &#8211;<br />
We haven&#8217;t provided our app for Android yet &#8211; just demos. It&#8217;s too small a market, at least for now.</strong><br />
<br /> <strong>Tish Shute:</strong> Robert put out an interesting question: <em>&#8220;Are we letting the short term glitz of Apple and the iPhone fad pull us in the wrong direction? Shouldn&#8217;t we be focusing on symbian devices that have the lion&#8217;s share of the market? or should we be looking more at either other OSs (winmobile, android) or not at all and trying to create a new platform that is more MID and less smart phone with a hardware partner?&#8221;</em><br /><br />
<strong>Ken Inoue: Good point. We certainly don&#8217;t wish to be Apple-dependent, or dependent on anyone. As much as we like Apple and the iPhone, we will surely create apps for other platforms. We always get questions/requests to create Symbian apps, and we would like to do that &#8211; but in order of prioritization &#8211; we&#8217;re a small start-up.</strong></p>
<p><strong>Tish Shute:</strong> There are obstacles to creating AR apps on Symbian devices, aren&#8217;t there?</p>
<p><strong>Ken Inoue: The AR experience we can provide on iPhone and Android cannot be replicated on conventional phones. However, we haven&#8217;t examined the possibilities on Symbian in detail yet, so we can&#8217;t say much.</strong></p>
<p><strong>Tish Shute: </strong>iPhone adoption in the US has really put augmented reality on the map.</p>
<p><strong>Ken Inoue: It certainly has!<br />
</strong></p>
<p><strong>In Japan, it is rumored that iPhone will soon be marketed by multiple carriers, in addition to Softbank. That will be a boost for us. Apple is moving gradually to a multi-carrier strategy, I believe. With content getting richer, Apple will be required to partner with carriers with strong infrastructure.</strong></p>
<p><strong>Tish Shute:</strong> Recently I have been exploring the strengths of the <a href="http://www.waveprotocol.org/" target="_blank">Google Wave protocol</a> for some aspects of mobile augmented reality.</p>
<p>And this is, perhaps, a question for the Tech team? Do the Tonchidot developers think Google Wave would be an interesting jumping-off point for some augmented reality standards?<br /><br />
<strong>Ken Inoue: Our tech members haven&#8217;t been able to examine this in detail yet &#8211; but we are definitely excited!</strong></p>
<p><strong><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/09/Screen-shot-2009-09-17-at-5.10.09-PM.png"><img class="alignnone size-medium wp-image-4425" title="Screen shot 2009-09-17 at 5.10.09 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/09/Screen-shot-2009-09-17-at-5.10.09-PM-300x199.png" alt="Screen shot 2009-09-17 at 5.10.09 PM" width="300" height="199" /></a><br />
</strong></p>
]]></content:encoded>
			<wfw:commentRss>http://www.ugotrade.com/2009/09/17/tonchidot-taking-augmented-reality-beyond-lab-science-with-fearless-creativity-and-business-savvy/feed/</wfw:commentRss>
		<slash:comments>8</slash:comments>
		</item>
		<item>
		<title>Everything Everywhere: Thomas Wrobel&#8217;s Proposal for an Open Augmented Reality Network</title>
		<link>http://www.ugotrade.com/2009/08/19/everything-everywhere-thomas-wrobels-proposal-for-an-open-augmented-reality-network/</link>
		<comments>http://www.ugotrade.com/2009/08/19/everything-everywhere-thomas-wrobels-proposal-for-an-open-augmented-reality-network/#comments</comments>
		<pubDate>Thu, 20 Aug 2009 03:58:57 +0000</pubDate>
		<dc:creator><![CDATA[Tish Shute]]></dc:creator>
				<category><![CDATA[Ambient Devices]]></category>
		<category><![CDATA[Ambient Displays]]></category>
		<category><![CDATA[Android]]></category>
		<category><![CDATA[architecture of participation]]></category>
		<category><![CDATA[Augmented Reality]]></category>
		<category><![CDATA[digital public space]]></category>
		<category><![CDATA[Instrumenting the World]]></category>
		<category><![CDATA[internet of things]]></category>
		<category><![CDATA[iphone]]></category>
		<category><![CDATA[mobile augmented reality]]></category>
		<category><![CDATA[mobile meets social]]></category>
		<category><![CDATA[Mobile Reality]]></category>
		<category><![CDATA[privacy and online identity]]></category>
		<category><![CDATA[Smart Planet]]></category>
		<category><![CDATA[ubiquitous computing]]></category>
		<category><![CDATA[Web Meets World]]></category>
		<category><![CDATA[alternate reality games]]></category>
		<category><![CDATA[alternate reality games and augmented reality]]></category>
		<category><![CDATA[AR Consortium]]></category>
		<category><![CDATA[AR Network]]></category>
		<category><![CDATA[ARG games]]></category>
		<category><![CDATA[ARN]]></category>
		<category><![CDATA[augmented reality and privacy]]></category>
		<category><![CDATA[augmented reality browser wars]]></category>
		<category><![CDATA[Augmented Reality Browsers]]></category>
		<category><![CDATA[augmented reality concepts]]></category>
		<category><![CDATA[augmented reality filters]]></category>
		<category><![CDATA[augmented reality games]]></category>
		<category><![CDATA[augmented reality permissions]]></category>
		<category><![CDATA[Bertine van Hovell]]></category>
		<category><![CDATA[Bruce Sterling]]></category>
		<category><![CDATA[Dark Flame]]></category>
		<category><![CDATA[Denno Coil]]></category>
		<category><![CDATA[distributed augmented reality]]></category>
		<category><![CDATA[Elan Lee]]></category>
		<category><![CDATA[Fourth Wall Studios]]></category>
		<category><![CDATA[future of augmented reality]]></category>
		<category><![CDATA[Google Wave]]></category>
		<category><![CDATA[Google Wave Protocols]]></category>
		<category><![CDATA[Google Wave Web of Protocols]]></category>
		<category><![CDATA[Internet Relay Photoshop]]></category>
		<category><![CDATA[IRC paradigm]]></category>
		<category><![CDATA[IRC protocols and augmented reality]]></category>
		<category><![CDATA[J Aaron Farr]]></category>
		<category><![CDATA[Lost Again]]></category>
		<category><![CDATA[Mez Breeze]]></category>
		<category><![CDATA[Mitsuo Iso]]></category>
		<category><![CDATA[Open Augmented Reality Netwrok System]]></category>
		<category><![CDATA[open standards for augmented reality]]></category>
		<category><![CDATA[protocols for augmented reality]]></category>
		<category><![CDATA[real time communications protocols]]></category>
		<category><![CDATA[real time web]]></category>
		<category><![CDATA[res-nova]]></category>
		<category><![CDATA[Robert Rice]]></category>
		<category><![CDATA[social tesseracting]]></category>
		<category><![CDATA[Thomas Wrobel]]></category>
		<category><![CDATA[XMPP]]></category>
		<category><![CDATA[XMPP and presence]]></category>

		<guid isPermaLink="false">http://www.ugotrade.com/?p=4228</guid>
		<description><![CDATA[Today, I was very excited when Thomas Wrobel sent me a draft of, &#8220;Everything Everywhere: A proposal for an Augmented Reality Network system based on existing protocols and infrastructure.&#8221; Thomas has kindly agreed to let me publish his draft, to open a discussion on this topic. The diagram opening this post (click image to enlarge) [&#8230;]]]></description>
				<content:encoded><![CDATA[<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/08/Image1.jpg"><img class="alignnone size-medium wp-image-4277" title="Image1" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/08/Image1-300x162.jpg" alt="Image1" width="300" height="162" /></a></p>
<p>Today, I was very excited when <a href="http://www.darkflame.co.uk/">Thomas Wrobel</a> sent me a draft of, <strong>&#8220;Everything Everywhere: A proposal for an Augmented Reality Network system based on existing protocols and infrastructure.&#8221;</strong></p>
<p>Thomas has kindly agreed to let me publish his draft, to open a discussion on this topic. The diagram opening this post (click image to enlarge) shows <strong>&#8220;An example of how collaborative 3D-spaces could be shared over existing IRC networks.&#8221;</strong> It is from Thomas&#8217; proposal. The full text of his paper is included later in this post.</p>
<h3>&#8220;Can we try to avoid a browser war this time?&#8221;</h3>
<p>Thomas notes in the closing remark to his paper:</p>
<p><strong>&#8220;I am absolutely confident in my belief AR will become at least as important as the web has, and probably a lot more so. It will also face much the same hurdles and challenges getting established as that medium did. But, speaking as a web-developer, can we try to avoid a browser war this time?&#8221;</strong></p>
<p><a href="http://www.darkflame.co.uk/">Thomas Wrobel</a> has consistently posted insightful comments on how existing standards could be used for creating open augmented reality networks. But he expressed concern to me that his work and this paper not be overplayed:</p>
<p><strong>&#8220;I&#8217;m hardly a leader, I&#8217;m just an amateur with a load of ideas on AR-related topics, some which might be useful, others might become unworkable. I don&#8217;t want anyone to get the impression this is how I think it has to, or should be done.&#8221;</strong></p>
<p>I have been bringing up this topic of using existing standards and infrastructure, where possible, for open augmented reality networks in all my interviews with members of the <a href="http://www.arconsortium.org/" target="_blank">AR Consortium</a>.</p>
<p>And I am finding agreement on a point that <a href="http://curiousraven.squarespace.com/" target="_blank">Robert Rice</a> makes, <strong>&#8220;there is no perfect, ultimate solution *now*, but we have to do *something* to work from and refine/evolve.&#8221;<br />
</strong></p>
<p>Thomas Wrobel makes what I consider some crucial opening suggestions. I take my hat off to him for thinking about this early, coming up with some clear, elegant, and practical ideas, and doing the work to articulate these ideas so others can participate in evolving them. Massive props for that, many times over.</p>
<p>Good ideas on standards at an early stage of a developing industry like augmented reality are like spring sunshine and April showers for new crops. No one knows what storms and pests the growing season will bring &#8211; but water and sunshine (open standards) are always a good start. And, personally, I can&#8217;t wait to see how this new industry unfolds (see Bruce Sterling&#8217;s awesome Layar Conference keynote: <a href="http://layar.com/video-bruce-sterlings-keynote-at-the-dawn-of-the-augmented-reality-industry/" target="_blank">&#8220;At the Dawn of the Augmented Reality Industry.&#8221;</a>)</p>
<p>Thomas Wrobel is:</p>
<p><strong> &#8220;a web developer working for a small, brand-new company called <a href="http://www.lostagain.nl/" target="_blank">Lost Again</a>, which mostly works on ARGs (That is, the alternate reality games, not the augmented reality games, although there&#8217;s probably going to be big overlap there in the future). We developed two educational ARG games for the Netherlands with <a href="http://www.res-nova.nl/">a company called res-nova</a>.&#8221;</strong></p>
<p>I have been following Alternate Reality Games through the amazing work of Elan Lee and <a href="http://www.fourthwallstudios.com/">Fourth Wall Studios</a>. Like Thomas, I think the intersection of ARGs and augmented realities is going to be very interesting. Thomas wanted me to point out that the website for his company with Bertine van Hövell, http://www.lostagain.nl/, is just a placeholder for now.<br />
<strong><br />
&#8220;[It will] probably be up fully within a week or two.&#8221; And, &#8220;despite the logo, we aren&#8217;t an AR company [yet], or a travel firm. The logo&#8217;s supposed to represent being lost in our minds.&#8221;</strong></p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/08/logolostagainsmall.png"><img class="alignnone size-full wp-image-4250" title="logolostagainsmall" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/08/logolostagainsmall.png" alt="logolostagainsmall" width="162" height="56" /></a></p>
<p>Thomas has been thinking about the topic of an open augmented reality network for a while now. He is an artist also known as <a href="http://www.renderosity.com/mod/gallery/index.php?image_id=1221354&amp;member">DarkFlame</a> and his ARN network is included in this augmented reality concept for 2086 he did in 2006 (click on image below to enlarge).</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/08/Picture-78.png"><img class="alignnone size-medium wp-image-4254" title="Picture 78" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/08/Picture-78-300x218.png" alt="Picture 78" width="300" height="218" /></a></p>
<h3>Beyond IRC</h3>
<p>Both Thomas and <a href="http://arsvirtuafoundation.org/research/">Mez Breeze</a> made extensive and insightful comments on my last post, <a href="http://www.ugotrade.com/2009/08/03/augmented-reality-bigger-than-the-web-second-interview-with-robert-rice-from-neogence-enterprises/">&#8220;Augmented Reality &#8211; Bigger Than the Web: Second Interview with Robert Rice.&#8221; </a>And in particular they both picked up on something I am very interested in &#8211; the potential use of the Google Wave Web of protocols in creating open augmented reality networks.</p>
<p>Mez in her brilliant brainsplosion on social tesseracting takes on the very definition of information:</p>
<p><strong>&#8220;Tish, when you ask Robert &#8216;&#8230;what is your approach to delivering a massively shared real time [augmented reality] experience that is like Wave, not confined to a walled garden?&#8217; that&#8217;s an extremely relevant question + one that needs to be addressed while considering the entirety of the Reality-Virtual Continuum. I&#8217;ve recently finished a series of articles addressing this: the framework I&#8217;ve developed is termed <a href="http://arsvirtuafoundation.org/research/2009/03/01/_social-tesseracting_-part-1/" target="_blank">&#8216;Social Tesseracting.&#8217;</a>&#8221;</strong></p>
<p>I have recently begun exploring the Google Wave Web of Protocols which are nicely outlined in <a href="http://cubiclemuses.com/cm/articles/2009/08/09/waves-web-of-protocols/">this post</a> by J Aaron Farr which includes the very interesting diagram below (so more on Google Wave in another post).</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/08/wave_protocols.png"><img class="alignnone size-medium wp-image-4255" title="wave_protocols" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/08/wave_protocols-300x293.png" alt="wave_protocols" width="300" height="293" /></a></p>
<p>But, as Thomas notes, while he demonstrates his ideas using IRC (Internet Relay Chat) they reach<strong> Beyond IRC</strong>:</p>
<p><strong>&#8220;As mentioned before, IRC has some drawbacks, which are due to its age or method of working. As such, future systems might yet prove better alternatives for an open AR network. One example of such a system is Google Wave. It shares many of the advantages of IRC (open, anyone can create a channel of data, different permission levels can be set, and it&#8217;s free), while avoiding some critical restrictions (the data can be persistent). I believe some of the ideas I&#8217;ve mentioned, and possibly even the proposed protocol string, could be adapted for Google Wave or other future systems. I believe overall the principles are more important than any specific implementation used to get to them.</strong>&#8221;</p>
<p>Also, Thomas pointed out that while he uses markers to illustrate some of his examples, they are just one method of tracking. What he is presenting is meant to be transparent to the registration/tracking methodology.</p>
<p><strong>Tish Shute: You mostly use marker-based examples, but there is no reason why the principles you are suggesting will not be just as relevant as we move into using more sophisticated image recognition tools, is there?</strong><br />
<br />
<strong>Thomas Wrobel: No reason whatsoever. I mostly chose familiar markers as something that could be used now, with a lot of coding libraries already established for them. I think for most future AR use, markers will go away completely &#8230; especially outside. Things will be done purely by GPS or object recognition, or (in the case of advertising) the markers will look like normal posters.</strong></p>
<p><strong>However, I do think traditional markers might &#8220;cling on&#8221; for non-geographically-specific stuff at home. After all, if you need some reference points for moving meshes about in real time &#8230; (say, when playing a board game with a friend on the other side of the world) &#8230; then there&#8217;s probably nothing that&#8217;s going to be more practical than some simple bits of paper or card.</strong></p>
<h3>Everything Everywhere</h3>
<h4>-Â  A proposal for an Augmented Reality Network system based on existing protocols and infrastructure.</h4>
<h3>by Thomas Wrobel</h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/08/darkflame2.jpg"><img class="alignnone size-medium wp-image-4260" title="darkflame" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/08/darkflame2-199x300.jpg" alt="darkflame" width="199" height="300" /></a></p>
<p>The following paper is my vision of an open AR Network and potential methods to implement it with existing technologies. Specifically, I&#8217;ll be focusing on the potential for a global outdoor AR network, although the ideas aren&#8217;t limited to that.</p>
<p>Of course I call it &#8220;my&#8221; vision, but I&#8217;m obviously not the first to have many of these ideas. I have been influenced and inspired by many things&#8230;</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/08/AR_paper_img_0new1.jpg"><img class="alignnone size-medium wp-image-4232" title="AR_paper_img_0new" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/08/AR_paper_img_0new1-140x300.jpg" alt="AR_paper_img_0new" width="140" height="300" /></a></p>
<p><em>[Some of Thomas Wrobel&#8217;s influences &#8211; watched and played. Images from Mitsuo Iso&#8217;s<a href="http://en.wikipedia.org/wiki/Denn%C5%8D_Coil" target="_blank"> Denno Coil</a> (Click to enlarge) top, below from the game &#8220;Metroid Prime,&#8221; and Terminator, and the last from Buffy the Vampire Slayer!]</em></p>
<p><strong>The AR Network.</strong></p>
<p>When I speak of a future AR Network, I mean one as universal and as standard as the internet. One where people can connect from any number of devices and, <em>without additional downloads</em>, experience the majority of the content.<br />
Where people can just point their phone, webcam, or pair of AR glasses anywhere a virtual object should be, and they will see it. The user experience is seamless; AR comes to them without them needing to &#8220;prepare&#8221; their device for it.</p>
<p>From this point forward, I will refer to this future AR Network simply as the <strong>&#8220;Arn&#8221;.</strong></p>
<p>The Arn should be an inclusive, open platform that any number of devices can connect to, and where anyone can make and host their own location-specific models or data.<br />
It should allow people to communicate both publicly and privately, and not have their vision constantly cluttered with things they don&#8217;t want to see.</p>
<p>There are two old, existing paradigms that I think, when combined, can help reach this goal.</p>
<p><strong>The Internet Relay Photoshop.</strong></p>
<p>IRC, or Internet Relay Chat, is a chat system that was designed by Jarkko Oikarinen in the late &#8217;80s.</p>
<p>It&#8217;s a system where people meet on &#8220;channels&#8221;; they can talk in groups or privately. Channels can be read-only, or open for all to contribute to. There is no restriction on the number of people that can participate in a given discussion, or on the number of channels that can be formed. All servers are interconnected and pass messages from user to user over the network.</p>
<p>To me, this relatively old internet technology is a great template, or even foundation, for how the Arn could operate. Rather than text being exchanged, it would be mesh data (or links to mesh data), but other than that, many of the same principles could apply.</p>
<p>People could join channels of information to view or contribute to. Families could leave messages to each other scribbled in mid-air on private channels. Strangers could watch AR games being played between people in parks. People going into a restaurant could see the comments from recent guests hovering by the menu items.<br />
None of this would have to be called up specially; if they are on the right channel when it is broadcast, they will see it.</p>
<p>The IRC paradigm becomes particularly powerful when combined with another one familiar to many computer users: that of a &#8220;layer&#8221; in an art program, such as Photoshop or Paint Shop Pro.<br />
As most of us know, layers allow us to separate out different components of a piece of art while editing, either to focus our attention on one piece, or to make future editing easier.</p>
<p>Now what if we simply have each &#8220;channel&#8221; of information represented as a layer?</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/08/AR_paper_img_1.jpg"><img class="alignnone size-medium wp-image-4265" title="AR_paper_img_1" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/08/AR_paper_img_1-300x206.jpg" alt="AR_paper_img_1" width="300" height="206" /></a></p>
<p><em>Click to enlarge image above.</em></p>
<p>Having channels correspond to layers is an easy and intuitive way for the Arn to operate. The user can log in and contribute data to any channel, as with IRC, as well as adjust the desired opacity and visual range of each layer, as they would a layer in Photoshop.</p>
<p>In this way they can get a custom view of the world, with both shared and personal AR elements visible at the same time.<br />
They would not have to switch between various overlays of their world view, as they could see many at the same time.</p>
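The channel-as-layer idea can be sketched in a few lines. This is a minimal illustration, not part of the proposal, and all names are hypothetical: each channel maps to a layer with a user-set opacity and visual range, and the client composites only what the user has chosen to see.

```python
# Hypothetical sketch: IRC channels as Photoshop-style layers.
from dataclasses import dataclass, field

@dataclass
class Layer:
    channel: str                 # IRC channel this layer mirrors, e.g. "#citymap"
    opacity: float = 1.0         # 0.0 = hidden, 1.0 = fully visible
    max_range_m: float = 500.0   # ignore objects further away than this
    objects: list = field(default_factory=list)  # (object_id, distance_m) pairs

def visible_objects(layers):
    """Return (channel, object_id, opacity) for everything the user should see."""
    out = []
    for layer in layers:
        if layer.opacity <= 0.0:
            continue  # layer switched off, like a hidden Photoshop layer
        for object_id, distance_m in layer.objects:
            if distance_m <= layer.max_range_m:
                out.append((layer.channel, object_id, layer.opacity))
    return out

chat = Layer("#family-notes", opacity=0.8, objects=[("note1", 20.0)])
ads = Layer("#ar-ads", opacity=0.0, objects=[("billboard", 5.0)])  # muted layer
print(visible_objects([chat, ads]))  # only the family note is composited
```

Turning a layer off is just setting its opacity to zero; the client never renders it, but the channel subscription can stay open.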
<p><strong>Persistence of Data</strong></p>
<p>With an IRC or IRC-like system for communication, the data sent is mostly temporary &#8230; broadcast on the fly from user to user and device to device. It is retained in users&#8217; local logs, but not &#8220;hosted&#8221; anywhere.</p>
<p>I think for the majority of day-to-day purposes this is not so much a drawback as actually desirable for AR. Most casual communication doesn&#8217;t need to be recorded permanently in 3D space and, indeed, if it was, the cost of running such a service would increase exponentially with users and with time. Not to mention, our visual view of the world would get very cluttered very quickly. Imagine what your monitor would be like if it kept a history of every window you have ever opened and their positions!</p>
<p>So for most cases AR space should be treated like a 3D monitor &#8211; letting us display many pieces of data from remote and local sources, and even share them with others, but not being, by default, a permanent record of it all.</p>
<p>Most data will be analogous to pixels on a display; if it is kept in records at all, it is only on the clients&#8217; devices, not on the network itself.</p>
<p>However, occasionally we do want 3D data analogous to a web page, such as (in the example above) the map layer. Data here should be persistent and visible to all that have that layer turned on. I see no reason why hosting this data needs to use anything but standard web hosting, with the (read-only) #channel on the Arn merely providing a route to the data.</p>
<p>As the user logs onto the channel, the server, using a chat-bot, can send them a list of meshes with location data attached, and the Arn browser can simply pick the data to display that&#8217;s local to them. (Note 1: By doing it this way around, it allows some degree of anonymity, rather than the server knowing exactly where you are and feeding the specific correct string to you.)</p>
<p>We simply need to establish standards so this data can be pulled up and interpreted.</p>
<p>For instance, this standard could be as simple as an XML string pointing to a KML file on a server. This could then be displayed in the user&#8217;s field of view at the co-ordinates specified.</p>
<p>In this way permanent data tied to locations, such as historical overlays or maps, could co-exist on the same protocol as temporary data such as mid-air chats or gaming-related meshes.</p>
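The client-side filtering behind Note 1 can be sketched as follows. This is only an illustration under assumptions not in the paper: the bot is assumed to send plain (url, lat, lon) entries, and the URLs and radius are hypothetical. The key point is that the position comparison happens on the client, so the server never learns where the user is.

```python
# Hypothetical sketch: the bot sends the whole mesh list for a channel;
# the client keeps only entries near the user, preserving anonymity.
import math

def distance_m(lat1, lon1, lat2, lon2):
    """Approximate ground distance in metres (equirectangular; fine at short range)."""
    dx = math.radians(lon2 - lon1) * math.cos(math.radians((lat1 + lat2) / 2))
    dy = math.radians(lat2 - lat1)
    return 6371000 * math.hypot(dx, dy)

def local_meshes(mesh_list, user_lat, user_lon, radius_m=1000):
    # Filtering locally: the user's position never leaves the device.
    return [url for url, lat, lon in mesh_list
            if distance_m(user_lat, user_lon, lat, lon) <= radius_m]

meshes = [("http://example.org/church.kml", 49.5000, -123.5000),
          ("http://example.org/tower.kml", 51.5000, -0.1000)]
print(local_meshes(meshes, 49.5001, -123.5001))  # only the nearby church
```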
<p>There is also no reason why these shared/personal spaces based on channels of data have to be restricted to things given absolute co-ordinates.</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/08/AR_paper_img_2.jpg"><img class="alignnone size-medium wp-image-4266" title="AR_paper_img_2" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/08/AR_paper_img_2-226x300.jpg" alt="AR_paper_img_2" width="226" height="300" /></a></p>
<p>(Different ways to access the same mesh)</p>
<p>It could work just as well with markers, and thus relative co-ordinates.</p>
<p>This would be mostly useful for indoor use, letting people logged onto a channel see the same meshes as everyone else on the markers, thus allowing multi-player AR games, or AR games with observers, very easily.</p>
<p>For example, games like Chess could be played between people with no additional code needed: you simply have a set of markers for your own pieces only, and as you move them the channel updates with the new positions, which are displayed in place in your opponent&#8217;s field of view.</p>
<p>This sort of game comes &#8220;free&#8221; with just having a generic system of shared space supporting markers.</p>
<p>It would also allow AR adverts down the street or in magazines to be viewed by simply logging onto the right AR channel.</p>
<p>If markers are designed with URL data in them, this could even be a prompted or automatic process:<br />
&#8220;There is visual data in this area on the following channel: #ABCD. Would you like to view this channel?&#8221;</p>
<p><strong>Pros and Cons of using IRC or IRC-like systems</strong></p>
<p><strong>Pros:</strong></p>
<p><strong>&#8226; Anyone can write IRC interface software.<br />
&#8226; Anyone can create new IRC channels without cost.<br />
&#8226; Channels can have read and write permissions set.<br />
&#8226; Users can easily have multiple channels open at once.<br />
&#8226; Already established, with thousands of servers worldwide.</strong></p>
<p><strong>Cons:</strong></p>
<p><strong>&#8226; A 500-or-so character limit; 3D data must be linked to, not sent.<br />
&#8226; Slow update rate; lines of data can take a whole second or more to send.<br />
&#8226; Non-persistent; good for a 3D view, not good for storage.</strong></p>
<p><strong>An example of how collaborative 3D-spaces could be shared over existing IRC networks:</strong></p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/08/Image1.jpg"><img class="alignnone size-medium wp-image-4277" title="Image1" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/08/Image1-300x162.jpg" alt="Image1" width="300" height="162" /></a></p>
<p><strong><em>Click on the image to enlarge </em><br />
</strong><br />
While in the long run I would hope for a dedicated AR network to be developed, with greater flexibility in the persistence of data, there is a lot that can be done with the existing IRC system to implement the ideas mentioned above.</p>
<p>Below I will show an example of a simple, crude pseudo-protocol that could fairly easily be implemented to create shared AR spaces broadcast across IRC channels.</p>
<p>It&#8217;s important to note that the goal here isn&#8217;t to exchange the mesh data itself on IRC; it&#8217;s to exchange links to the data.</p>
<p>Exchanging the mesh data directly within the 500-character IRC limit would be very hard, and liable to errors.</p>
<p>It&#8217;s also a waste of network bandwidth, as many people logged onto the channel might not have that object in their field of view, so their clients should not bother downloading it. (It should be up to the client browsers to decide when to anticipate and cache mesh data.)</p>
<p><strong>Proposed Basic XML link exchange for AR:</strong></p>
<p>Principle:<br />
As a user creates or changes an object, the client&#8217;s software posts a simple XML-formatted string to the IRC channel.<br />
Anyone logged into that channel then sees that mesh displayed at the specified location.</p>
<p>This string could be formatted as follows:</p>
<p>&lt;Mesh<br />
ID="DARKFLAME:1"<br />
Obj="http://www.darkflame.co.uk/mesh/church/chuch.kml"<br />
Loc="(49.5000123,-123.5000123)"<br />
Permissions="None"<br />
LastUpdate="12/12/0000,2012:12"<br />
/&gt;</p>
<p>This string allows other users&#8217; clients logged into the channel to automatically load the object from the URL and display it at the correct position in their field of view.<br />
If the permissions are set to allow it, they could then move the object themselves, with the update being fed back seamlessly to the other users on the channel.</p>
<p>The objects posted are given an ID, which can be just the poster&#8217;s name followed by a unique object number for that name. These unique IDs would allow clients to track different instances of the same mesh, as well as making it easy to implement permissions. (If only the poster should be allowed to move an object, the clients simply check whether the ID matches the user name posting the update; if it doesn&#8217;t, they can ignore it.)</p>
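Since the proposed string is valid XML, a client could handle it with an off-the-shelf parser. A minimal sketch, using Python's standard library; the dictionary keys and the `parse_mesh` name are my own, not part of the proposal, and the owner is derived from the "owner:number" ID convention described above.

```python
# Hypothetical sketch: parsing the proposed <Mesh .../> string from a channel.
import xml.etree.ElementTree as ET

def parse_mesh(xml_string):
    elem = ET.fromstring(xml_string)
    assert elem.tag == "Mesh"
    # IDs are "owner:number", e.g. "DARKFLAME:1" -> owner "DARKFLAME".
    owner, _, number = elem.get("ID").partition(":")
    return {
        "owner": owner,
        "number": number,
        "obj": elem.get("Obj"),               # URL of the mesh, fetched separately
        "loc": elem.get("Loc"),               # GPS, marker, or default-relative
        "permissions": elem.get("Permissions"),
        "last_update": elem.get("LastUpdate"),
    }

msg = ('<Mesh ID="DARKFLAME:1" '
       'Obj="http://www.darkflame.co.uk/mesh/church/chuch.kml" '
       'Loc="(49.5000123,-123.5000123)" '
       'Permissions="None" LastUpdate="12/12/0000,2012:12" />')
mesh = parse_mesh(msg)
print(mesh["owner"])  # DARKFLAME
```

The whole message stays well inside the ~500-character IRC line budget, which is exactly why only a link to the mesh, not the mesh itself, travels over the channel.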
<p>Next the objects need to be linked to a mesh.</p>
<p>The location of the object&#8217;s mesh doesn&#8217;t have to be a fixed, remotely hosted URL; it could be the IP address and port number of the user posting the mesh, hosted by the application posting the link to the channel.</p>
<p>Obj=â€www.darkflame.co.uk/mesh/church/chuch.kmlâ€<br />
Obj=â€123,223,14,23::3030â€</p>
<p>The object&#8217;s co-ordinates, likewise, need not be specified as absolute GPS co-ordinates, but could instead refer to a generic marker.</p>
<p>Loc=â€(49.5000123,-123.5000123)â€<br />
Loc=â€Marker1â€<br />
Or relative to a marker;<br />
Loc=â€Marker4 (+0.0023,-0.0023)â€<br />
Or relative to a default plane;<br />
Loc=â€Default(+0.213,-0.123)â€</p>
<p>The AR browsers could then handle the association between a marker&#8217;s pattern and its name.<br />
This way the markers are reusable; users do not need unique markers to be printed for every new bit of AR they want to look at.<br />
Users could just keep a set of generic markers handy, which they could simply assign to be Marker1, Marker2, etc. for any AR use. (Note 2: As mentioned above, specific markers could also contain a default ID name and channel built into their data, letting the Arn browser simply prompt the user if they want to see the model even if they aren&#8217;t in the right channel. This set-up would be most useful for paper and even billboard advertising.)</p>
<p>The Default location could be a settable region, or marker, in the client&#8217;s browser that defines a playable/usable area in the field of view. Mostly useful for home use, this could typically be a square region on a user&#8217;s desk.</p>
<p>So, in the chess-game example, the client of the person making the moves simply updates the position relative to the Default every time they move their marker (which is tied to a chess-piece mesh).<br />
Then the (non-owner&#8217;s) client software could automatically display it relative to their own Default plane. This would make games like Chess, Checkers, Go, or any other game involving merely moving objects about very intuitive and easy to set up.</p>
<p>So having meshes settable to absolute-GPS, marker-relative, or default-relative locations reduces the bother necessary to experience AR content quite considerably, and makes &#8220;non-geo-specific&#8221; AR applications and games trivial to implement.</p>
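The three Loc= forms can be told apart syntactically, so a client could dispatch on the string alone. A sketch of such a parser (the function name and tagged-tuple return format are my own, not part of the proposal):

```python
# Hypothetical sketch: classifying the Loc= forms from the examples above.
import re

def parse_loc(loc):
    # Absolute GPS: "(49.5000123,-123.5000123)"
    m = re.fullmatch(r"\(([-+0-9.]+),([-+0-9.]+)\)", loc)
    if m:
        return ("gps", float(m.group(1)), float(m.group(2)))
    # Relative to a named frame: "Marker4 (+0.0023,-0.0023)" or "Default(+0.213,-0.123)"
    m = re.fullmatch(r"(\w+)\s*\(([-+0-9.]+),([-+0-9.]+)\)", loc)
    if m:
        return ("relative", m.group(1), float(m.group(2)), float(m.group(3)))
    # Bare marker name: "Marker1"
    return ("marker", loc)

print(parse_loc("(49.5000123,-123.5000123)"))  # ('gps', 49.5000123, -123.5000123)
print(parse_loc("Marker4 (+0.0023,-0.0023)"))  # ('relative', 'Marker4', 0.0023, -0.0023)
print(parse_loc("Marker1"))                    # ('marker', 'Marker1')
```

The "relative" branch covers both marker-relative and Default-relative offsets; it is then up to the browser to resolve the named frame (a marker pattern or the user's Default plane) to a pose.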
<p>Next is permissions.</p>
<p>Mesh-permissions would be a simple string saying who else can update the data, if anyone.</p>
<p>e.g.:<br />
Permissions="None"<br />
Permissions="RandomPerson1, RandomPerson2"<br />
Permissions="All"</p>
<p>By default you could only update or move your own meshes (identified by the ID of the first posting). If you attempt to update anyone else&#8217;s, their clients would just ignore it.</p>
<p>Thus in a game of chess, you can only move your own pieces. If you attempted to move your opponent&#8217;s (by reassigning your own marker to their pieces&#8217; IDs), the clients would just ignore that assignment. You&#8217;d only be fooling your own system.<br />
Likewise, when pinning a message in mid-air for your friends to read, no one else can change that message without your permission, although copying it would be easy. (Note 3: It&#8217;s important to note this sort of object-specific permission system is in addition to the global permissions, or &#8220;user modes,&#8221; it&#8217;s possible to set for the IRC channels and users as a whole.)</p>
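<p>The check a client would apply before accepting an update can be sketched in a few lines. The Permissions strings follow the examples above; the function name and the owner/sender arguments are hypothetical:</p>

```python
# Hypothetical sketch of the client-side permission check.
def may_update(sender, owner, permissions):
    """permissions is the raw string: "None", "All", or a name list."""
    if sender == owner:
        return True              # you can always move your own meshes
    if permissions == "All":
        return True
    if permissions == "None":
        return False             # everyone else's updates are ignored
    allowed = [name.strip() for name in permissions.split(",")]
    return sender in allowed

print(may_update("Bob", "Alice", "None"))                                    # False
print(may_update("RandomPerson2", "Alice", "RandomPerson1, RandomPerson2"))  # True
```

<p>In the chess example, each piece would carry its owner&#8217;s ID, so a rival&#8217;s move request simply fails this check and is dropped client-side.</p>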
<p>Finally, as object data could change within all sorts of time-scales, the easiest way to keep everyone logged in up to date is to just have a time-stamp of when each model was last updated.</p>
<p>LastUpdate=&#8220;12/12/0000,2012:12&#8221;</p>
<p>This would not necessarily be the same as the XML string&#8217;s post date, because the model&#8217;s mesh might not have been updated, but merely moved; in that case the Arn browser shouldn&#8217;t redownload the mesh.</p>
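<p>A client&#8217;s update logic would then compare stamps before fetching. The sketch below uses ISO timestamps for clarity; the exact fields of the example stamp format above are something a real spec would need to pin down:</p>

```python
from datetime import datetime

# Hypothetical sketch: only re-download a mesh when the incoming
# LastUpdate stamp is newer than the cached copy's. A mere move keeps
# the cached mesh and just updates its transform.
def needs_redownload(cached_stamp, incoming_stamp):
    return datetime.fromisoformat(incoming_stamp) > datetime.fromisoformat(cached_stamp)

print(needs_redownload("2009-09-01T10:00:00", "2009-09-01T10:05:00"))  # True
print(needs_redownload("2009-09-01T10:05:00", "2009-09-01T10:05:00"))  # False
```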
<p>This sort of arrangement could be used as a standard today, and users wouldn&#8217;t have to constantly download special AR programs to view a single AR mesh.</p>
<p>In the long term I would hope for more advanced methods to manipulate Arn content online, analogous to DOM manipulation in web pages. But for now, we should at least establish standard methods for devices to pull up meshes and overlay them in the correct position.</p>
<p>So, having a layered system could give the user a seamless blend of dynamic and static data with which to paint their world.<br />
I believe this is all relatively easy to achieve using modifications of existing web technology, combined with some basic graphics systems.<br />
<strong><br />
Local Data:</strong></p>
<p>However, so far I have only talked about remote data.<br />
What of programs originating on the device itself? This is, after all, how most AR software we have at the moment works.</p>
<p>I think that, just like the remote channels, local software should also be blended into the same list of layers. People shouldn&#8217;t have to &#8220;Alt+Tab&#8221; out of one view of the world to see another.<br />
They should be able to see both at once, if they wish.</p>
<p>For instance, if you&#8217;re playing an AR game, why shouldn&#8217;t your chat window be viewable at the same time?</p>
<p>If you have skinned your environment with a custom view of the world, why shouldn&#8217;t you also see mapping or restaurant recommendations?</p>
<p>So local data and remote data should be blended in the same view.<br />
How can AR software &#8211; of which, I hope, there will be thousands &#8211; be expected to seamlessly layer its graphics, not only with the real world, but with each other, and with online data too? Will games and software makers need to co-operate to allow their graphics to be integrated together with correct occlusion taken into account? A tall order, no?</p>
<p>I must confess though, my technology knowledge fails me here.</p>
<p>I can only guess that special graphics drivers, or 3D APIs, will have to be developed to let programs share their 3D world with that of an Arn browser.<br />
Maybe programs should simply treat themselves as a local server which the browser can connect to, and let the Arn browser handle all the rendering itself (although I imagine many games designers would find this quite limiting).<br />
So I leave it as an exercise to the readers to discuss and propose the best methods by which this vision of a layered world could be realised.</p>
<p><strong>Beyond IRC:</strong></p>
<p>As mentioned before, IRC has some drawbacks, due to its age and method of working.<br />
As such, future systems might yet prove better alternatives for an open AR network.<br />
One example of such a system is Google Wave.<br />
It shares many of the advantages of IRC (it&#8217;s open, anyone can create a channel of data, different permission levels can be set, and it&#8217;s free), while avoiding some critical restrictions (the data can be persistent).<br />
I believe some of the ideas I&#8217;ve mentioned, and possibly even the proposed protocol string, could be adapted for Google Wave or other future systems.<br />
Overall, I believe the principles are more important than any specific implementation of them.<br />
<strong><br />
Summary;</strong></p>
<p>âƒÂ Â  Â In order for AR to flourish the user shouldn&#8217;t need to download a separate application for each mesh they want to see.<br />
âƒÂ Â  Â  Having url&#8217;s embedded into QRCoded markers which point to standard mesh files like dxf or kml would be a way to do this right now.Â  The QR code would only have to be seen preciselyÂ  in shot once, then its borders could be used like a standard marker.</p>
<p>âƒÂ Â  Â An augmented view of the world needs to support visual multitasking, and havingÂ  layers of information is the best way to do that.<br />
âƒ<br />
âƒÂ Â  Â Methods need to be devised to allow drastically different software to contribute to these layers, without restricting either the software&#8217;s rendering ability&#8217;s, or the users ability to pick and choose what layers of information he wants to see.<br />
<strong><br />
Last point;</strong></p>
<p>I am absolutely confident in my belief that AR will become at least as important as the web has become, and probably a lot more so. It will also face much the same hurdles and challenges getting established as that medium did.<br />
But, speaking as a web-developer, can we try to avoid a browser war this time?</p>
<p>Everything Everywhere, draft.<br />
by Thomas Wrobel<br />
Darkflame a t gmail</p>
]]></content:encoded>
			<wfw:commentRss>http://www.ugotrade.com/2009/08/19/everything-everywhere-thomas-wrobels-proposal-for-an-open-augmented-reality-network/feed/</wfw:commentRss>
		<slash:comments>34</slash:comments>
		</item>
		<item>
		<title>Is it &#8220;OMG Finally&#8221; for Augmented Reality?: Interview with Robert Rice</title>
		<link>http://www.ugotrade.com/2009/01/17/is-it-%e2%80%9comg-finally%e2%80%9d-for-augmented-reality-interview-with-robert-rice/</link>
		<comments>http://www.ugotrade.com/2009/01/17/is-it-%e2%80%9comg-finally%e2%80%9d-for-augmented-reality-interview-with-robert-rice/#comments</comments>
		<pubDate>Sun, 18 Jan 2009 01:03:32 +0000</pubDate>
		<dc:creator><![CDATA[Tish Shute]]></dc:creator>
				<category><![CDATA[3D internet]]></category>
		<category><![CDATA[Ambient Devices]]></category>
		<category><![CDATA[Ambient Displays]]></category>
		<category><![CDATA[architecture of participation]]></category>
		<category><![CDATA[Augmented Reality]]></category>
		<category><![CDATA[digital public space]]></category>
		<category><![CDATA[Ecological Intelligence]]></category>
		<category><![CDATA[Energy Saving]]></category>
		<category><![CDATA[home automation]]></category>
		<category><![CDATA[home energy monitoring]]></category>
		<category><![CDATA[home energy monitors]]></category>
		<category><![CDATA[Instrumenting the World]]></category>
		<category><![CDATA[Metaverse]]></category>
		<category><![CDATA[mirror worlds]]></category>
		<category><![CDATA[Mixed Reality]]></category>
		<category><![CDATA[Mobile Technology]]></category>
		<category><![CDATA[nanotechnology]]></category>
		<category><![CDATA[open metaverse]]></category>
		<category><![CDATA[OpenSim]]></category>
		<category><![CDATA[Paticipatory Culture]]></category>
		<category><![CDATA[Second Life]]></category>
		<category><![CDATA[smart appliances]]></category>
		<category><![CDATA[Smart Devices]]></category>
		<category><![CDATA[Smart Planet]]></category>
		<category><![CDATA[social gaming]]></category>
		<category><![CDATA[social media]]></category>
		<category><![CDATA[sustainable living]]></category>
		<category><![CDATA[sustainable mobility]]></category>
		<category><![CDATA[virtual communities]]></category>
		<category><![CDATA[virtual economy]]></category>
		<category><![CDATA[virtual goods]]></category>
		<category><![CDATA[Virtual Meters]]></category>
		<category><![CDATA[virtual world standards]]></category>
		<category><![CDATA[Web 2.0]]></category>
		<category><![CDATA[Web 3D]]></category>
		<category><![CDATA[Web Meets World]]></category>
		<category><![CDATA[Web3.D]]></category>
		<category><![CDATA[World 2.0]]></category>
		<category><![CDATA[Android]]></category>
		<category><![CDATA[AR Geisha Doll]]></category>
		<category><![CDATA[compass in the android]]></category>
		<category><![CDATA[Denno Coil]]></category>
		<category><![CDATA[EEML]]></category>
		<category><![CDATA[hybrid augmented/virtual reality]]></category>
		<category><![CDATA[immersive mobile augmented reality]]></category>
		<category><![CDATA[markerless augmented reality]]></category>
		<category><![CDATA[massively multiuser augmented reality]]></category>
		<category><![CDATA[minimally immersive augmented reality]]></category>
		<category><![CDATA[mobile augmented reality]]></category>
		<category><![CDATA[Neogence]]></category>
		<category><![CDATA[next generation transparent wearable displays]]></category>
		<category><![CDATA[NYC Tech Meetup]]></category>
		<category><![CDATA[Pachube]]></category>
		<category><![CDATA[Robert Rice]]></category>
		<category><![CDATA[socializing sensor data]]></category>
		<category><![CDATA[Unreal 3]]></category>
		<category><![CDATA[Web Alive]]></category>
		<category><![CDATA[Wikitude]]></category>

		<guid isPermaLink="false">http://www.ugotrade.com/?p=2620</guid>
		<description><![CDATA[Neogence is in stealth mode with an immersive mobile augmented reality platform &#8211; &#8220;tools, sdk, and infrastructure plus some applications.&#8221; They are probably six months away from YouTubing anything according to CEO, Robert Rice. But Robert rustled up this pic for me &#8211; a Google street view of Neogence R&#38;D labs: &#8220;the patio on the [&#8230;]]]></description>
				<content:encoded><![CDATA[<p><img class="alignnone size-full wp-image-2557" title="neogencesekrithqpost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/neogencesekrithqpost.jpg" alt="neogencesekrithqpost" width="450" height="412" /></p>
<p><a id="zd89" title="Neogence" href="http://www.neogence.com/sekrets.html" target="_blank">Neogence</a> is in stealth mode with an immersive mobile augmented reality platform &#8211; &#8220;tools, sdk, and infrastructure plus some applications.&#8221; They are probably six months away from YouTubing anything, according to CEO <a id="rzgp" title="Robert Rice" href="http://curiousraven.squarespace.com/about-me/" target="_blank">Robert Rice</a>. But Robert rustled up this pic for me &#8211; a Google Street View of the Neogence R&amp;D labs: &#8220;the patio on the lower left is where I do a lot of pacing and smoking my pipe, and the porch and office upstairs is where a lot of meetings have been held.&#8221;</p>
<p><a id="rzgp" title="Robert Rice" href="http://curiousraven.squarespace.com/about-me/" target="_blank">Robert Rice</a> (<a id="x_:i" title="@RobertRice" href="http://twitter.com/RobertRice" target="_blank">@RobertRice</a> ), CEO of <a id="zd89" title="Neogence" href="http://www.neogence.com/sekrets.html" target="_blank">Neogence</a>, recently tweeted:</p>
<p><em><strong>I&#8217;m changing my name to Robert Mobile Ubiquitous Geospatial Augmented Rice. I&#8217;m betting on radical changes in next 18 months.</strong></em></p>
<p>Although Robert&#8217;s new AR platform is still under wraps, I think you will get a good idea of what direction he is going in from this interview (full text at the end of this post). Robert is the author of &#8220;<a id="c:rr" title="MMO Evolution" href="http://books.google.com/books?id=dkZ-6C5utz8C&amp;dq=MMO+Evolution&amp;printsec=frontcover&amp;source=bn&amp;hl=en&amp;sa=X&amp;oi=book_result&amp;resnum=4&amp;ct=result" target="_blank">MMO Evolution</a>&#8221; and is a key developer and thought leader in persistent immersive environments, simulations, virtual worlds and massively multiplayer games, as well as large-scale communities and social networking.</p>
<h3>It is OMG finally, at least, for minimally immersive but truly useful AR.</h3>
<p>Since the launch of Android a new generation of useful augmented reality applications like <strong><a href="http://www.mobilizy.com/wikitude.php" target="_blank">Wikitude</a></strong> are emerging.</p>
<p>After the last <a href="http://www.meetup.com/ny-tech/calendar/9466657/" target="_blank">NYC Tech Meetup</a>, my friend <a title="Nat Mobile Meets Social DeFreitas" href="http://openideals.com/" target="_blank">Nathan Freitas</a> (<a title="@NatDefreitas" href="http://twitter.com/natdefreitas" target="_blank">@NatDefreitas</a>), or rather Nathan Mobile Meets Social Freitas, demoed for me a cool graffiti app he has developed on Android. You leave a marker for your graffiti so other people can find, view, and add their own &#8211; a nice primal experience, like pissing on the lamp post to let your pack know where you&#8217;ve been. The graffiti app also taps into a long history of NYC street culture around tagging and graffiti art. For more cool mobile projects Nathan is working on &#8211; <a href="http://blog.twittervotereport.com/" target="_blank">Vote Report</a> and data collection for mass events, a guide to pubs and nightlife in New York City, and more &#8211; see his blog, &#8220;<a href="http://openideals.com/" target="_blank">Nathan&#8217;s OpenIdeals</a>.&#8221; With camera, GPS, compass, and accelerometer, and APIs on Android for temperature and light meters (no hardware yet), Nathan says Android:</p>
<p><em><strong>&#8220;seems to be the platform most likely to socialize the idea that sensor data could be a piece of every application.&#8221;</strong></em></p>
<p>As Nathan is fond of saying:</p>
<p><strong><em>The compass is a killer app enabler!</em></strong></p>
<p><a href="http://openideals.com/" target="_blank">Also see </a><a id="ixwx" title="OpenIntents" href="http://code.google.com/hosting/search?q=label:sensors" target="_blank">OpenIntents</a> for some interesting Android Sensor projects.</p>
<p><img class="alignnone size-full wp-image-2558" title="wikitudepost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/wikitudepost.jpg" alt="wikitudepost" width="450" height="356" /></p>
<p><strong><a href="http://www.mobilizy.com/wikitude.php" target="_blank">Wikitude</a></strong> was one of <em><strong><a href="http://www.mobilizy.com/wikitude.php" target="_blank">Thomas Wrobel</a>&#8217;s</strong></em> two top AR milestones for 2008 (see <a id="vwuu" title="Gamesalfreso" href="http://gamesalfresco.com/" target="_blank">Gamesalfresco</a>):</p>
<p><em><strong><a href="http://www.mobilizy.com/wikitude.php" target="_blank">Wikitude</a> I think. Seems the first released, useful, AR software.</strong></em></p>
<p><a href="http://gamesalfresco.com/2008/07/20/want-your-own-augmented-reality-geisha/" target="_self">AR Geisha doll</a> is also a remarkable breakout for AR &#8211; but useful, nah.</p>
<p>I asked Robert if he also regarded <a href="http://www.mobilizy.com/wikitude.php" target="_blank">Wikitude</a> and the <a href="http://gamesalfresco.com/2008/07/20/want-your-own-augmented-reality-geisha/" target="_self">AR Geisha doll</a> as significant breakthroughs:</p>
<p><em><strong>Yes, these are among the first attempts to get away from the novelty of simply rendering a 3D object based on a marker and making it interesting.</strong></em></p>
<p><em><strong>Remember, one of the biggest risks that AR has is being branded as &#8220;novelty,&#8221; which means &#8220;cool for five minutes but ultimately a waste of time.&#8221; I think we have a ways to go before something is truly useful, but as 2009 progresses we should start seeing some effort here. I&#8217;d guess 2010 before something really useful comes out&#8230;at least something practical.</strong></em></p>
<p><em><strong>Now, having said that, I should say that I expect entertainment and games to take the lead (as usual), although there are a few companies really trying to leverage AR and video/graphics compositing for marketing (brochures) and location-based methods (kiosks, large screen projections, etc.)</strong></em></p>
<h3>So when is it &#8220;OMG finally!&#8221; for massively multiuser augmented reality?</h3>
<p><img class="alignnone size-full wp-image-2559" title="ar-guipost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/ar-guipost.jpg" alt="ar-guipost" width="450" height="360" /></p>
<p>The picture above is from <a id="kzm2" title="benjapo's portfolio" href="http://www.istockphoto.com/file_closeup/technology/computers/3919295-futuristic-computer-panel.php?id=3919295" target="_blank">benjapo&#8217;s portfolio</a> on iStockphoto &#8211; also see the <a id="cqhi" title="istock video here" href="http://www.istockphoto.com/file_closeup/technology/computers/3919295-futuristic-computer-panel.php?id=3919295" target="_blank">iStock video here</a>.</p>
<p><a id="ylpn" title="Alex Soojung-Kim Pang considers" href="http://www.endofcyberspace.com/2006/11/royal_college_o.html" target="_blank">Alex Soojung-Kim Pang</a> (who weighed in recently on the <a id="vr8o" title="twitter-baby" href="http://www.endofcyberspace.com/2008/12/twitter-baby.html" target="_blank">twitter-baby</a> debates &#8211; see my <a href="http://tishshute.com/twitter-baby-debates" target="_blank">KickBee Posterous</a> blog) challenges design assumptions for augmented reality that take as a given the user&#8217;s desire for numerous private enhancements to their reality.</p>
<p>Alex points out that less will probably be more, so that enhancements do not impinge on shared experience. See his write-up of a talk he gave at the Royal College of Art, <a id="bxx1" title="&quot;and the end of my own private Shibuya.&quot;" href="http://www.endofcyberspace.com/2006/11/royal_college_o.html" target="_blank">&#8220;and the end of my own private Shibuya.&#8221;</a> Photo below by <em>Stéfan</em>, &#8220;<em><a href="http://www.flickr.com/photos/st3f4n/130889444/in/pool-84787688@N00">Karaoke in Shibuya</a></em>&#8221;</p>
<p><em><strong>Part of the pleasure of these streetscapes is precisely that they&#8217;re collectively experienced, rather than individual visions: for even a brief period, we share with other postmodern, globe-hopping flaneurs and expatriates and temporary natives the light of the ABC-Mart sign and storefront.</strong></em></p>
<p><em><strong><img class="alignnone size-full wp-image-2560" title="karaokepost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/karaokepost.jpg" alt="karaokepost" width="450" height="338" /><br />
</strong></em></p>
<p>It is collective experience of enhanced, augmented, virtual or real experiences that interests me too. This is one of the reasons I find <strong><em><a href="http://www.pachube.com/" target="_new">Pachube</a></em></strong> and the <a href="http://www.eeml.org/" target="_blank">EEML project </a>of Haque Design and Research so interesting.</p>
<p><strong><em>Extended Environments Markup Language (EEML), a protocol for sharing sensor data between remote responsive environments, both physical and virtual. It can be used to facilitate </em><em>direct connections between any two environments; it can also be used to facilitate many-to-many connections as implemented by the web service <a href="http://www.pachube.com/" target="_new">Pachube</a>, which enables people to tag and share real time sensor data from objects, devices and spaces around the world.</em></strong></p>
<h3>&#8220;Distinctions between virtual and real are as quaint and outmoded as distinctions between mind and body&#8221; (Usman Haque)</h3>
<p><img class="alignnone size-full wp-image-2603" title="chair1post1" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/chair1post1.jpg" alt="chair1post1" width="150" height="150" /><img class="alignnone size-full wp-image-2602" title="remotechair-slpost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/remotechair-slpost.jpg" alt="remotechair-slpost" width="150" height="150" /><img class="alignnone size-full wp-image-2604" title="chair2post" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/chair2post.jpg" alt="chair2post" width="150" height="150" /></p>
<p>Usman Haque (founder of <a href="http://www.haque.co.uk/pachube.php" target="_blank">Pachube</a> and <a href="http://www.haque.co.uk/" target="_blank">Haque Design and Research</a>) points out this is an underlying premise of his work &#8211; and augmented reality (full interview coming up soon!).</p>
<p>The pictures above show the Haque Design project, <a href="http://www.haque.co.uk/remote.php" target="_blank">Remote</a>:</p>
<p>&#8216;<em><strong>Remote&#8217; connects together two spaces, one in Boston, the other in Second Life, and treats them as a single contiguous environment, bound together by the internet so that things that occur in one space affect things that happen in the other and vice versa &#8211; remotely controlling each other.</strong></em></p>
<p>There was a discussion on Twitter recently about how terms like Second Life, Exit Reality, and Virtual Worlds are misleading and outmoded. As Robert pointed out, we need:</p>
<p><em><strong>one word please&#8230;that sums up virtual and/or augmented reality, interactive, immersive, virtual worlds, mmorpgs, simulations, etc&#8230; also, I really don&#8217;t like the term &#8220;augmented reality&#8221; or &#8220;mixed reality&#8221;. Neither is all that great. And NO &#8220;matrix&#8221; or &#8220;metaverse&#8221; please</strong></em></p>
<p>Robert argues strongly that there is a stultification in virtual world technology &#8211; much of what we call virtual world technology was already, basically, where it is now in the mid-90s. And MMOGs have devolved into gameplay design &#8220;that emphasizes the single player experience and does nothing to take advantage of the potential of the massively connected internet.&#8221;</p>
<p>Robert suggested I take a cruise through a new virtual space &#8211; <a href="http://www.cooliris.com/">CoolIris</a> &#8211; to find some good pictures for this post (note the partnership between <a href="http://blog.cooliris.com/2009/01/14/cooliris-and-seesmic-streamline-video-blogging/" target="_blank">CoolIris and Seesmic to streamline video blogging</a>). I added the CoolIris plugin to Firefox, typed Augmented Reality into search, and soon I was cruising a highway of images and links. The Road Map image grabbed my attention (see below). It shows the continua that <a href="http://www.metaverseroadmap.org/" target="_blank">the Metaverse Roadmap</a> authors thought are likely to influence the ways in which the Metaverse unfolds. It is &#8220;a map of the spectrum of technologies and applications ranging from augmentation to simulation; and the spectrum ranging from intimate (identity-focused) to external (world-focused).&#8221;</p>
<p><img class="alignnone size-full wp-image-2561" title="metaverseroadmap" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/metaverseroadmap.jpg" alt="metaverseroadmap" width="452" height="427" /></p>
<p>Quite to my surprise, when I clicked out of <a href="http://www.cooliris.com/">CoolIris</a> to the source for the image, I found it had been drawn from a post I wrote in May 2007, <em><strong><a id="jv.r" title="Hybridized Digital/Physical Worlds: Where Pop and Corporate Cultures Mingle." href="../../2007/05/22/hybridized-digitalphysical-worlds-where-pop-and-corporate-cultures-mingle/" target="_blank">Hybridized Digital/Physical Worlds: Where Pop and Corporate Cultures Mingle.</a> </strong></em>My post talks about a number of hybridization experiments that were bringing together lifelogging, sensors everywhere, simulation, virtual worlds, and augmentation.</p>
<p>The striking difference from 2007 to now is that we have definitely moved on from mere experimentation. And the poles of the continua <em><strong>intimate/external, augmentation/simulation</strong></em> as expressed in the Metaverse Roadmap are now becoming entwined (note the picture above seems to be slightly different from the one used in the roadmap as <a id="vdcf" title="posted here" href="http://www.metaverseroadmap.org/overview/" target="_blank">published here</a> &#8211; perhaps I had an early version?)</p>
<h3>&#8220;Augmented Reality is not just about overlaying data&#8230;&#8221; (Robert Rice)</h3>
<p><img class="alignnone size-full wp-image-2562" title="totalimmersion" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/totalimmersion.jpg" alt="totalimmersion" width="450" height="332" /></p>
<p>The screenshot above is from <a id="c7vm" title="TotalImmersions video" href="http://www.t-immersion.com/en,video-gallery,36.html#">Total Immersion&#8217;s video</a> demoing augmented reality with 3D cell phones. <em>Also see <a id="tvca" title="video of their immersive games" href="http://www.t-immersion.com/en,video-gallery,36.html#" target="_blank">video of their immersive games</a>, and FutureScope kiosks <a id="eje0" title="here" href="http://www.t-immersion.com/en,video-gallery,36.html#" target="_blank">here</a> and <a id="h-:s" title="here" href="http://www.t-immersion.com/en,video-gallery,36.html#" target="_blank">here</a>.</em><br />
<br />
<a id="vwuu" title="Gamesalfreso" href="http://gamesalfresco.com/">Gamesalfresco</a> noted that Will Wright delivered the best <a href="http://www.pocketgamer.co.uk/r/Various/Spore+Origins/news.asp?c=8725" target="_blank">augmented reality quote</a> of the year. When describing AR as the way of the future for games, Will Wright said:</p>
<p><em><strong>&#8220;Games could increase our awareness of our immediate environment, rather than distract us from it.&#8221;</strong></em></p>
<p>Robert points out in this interview that the term Augmented Reality itself has become associated with a very limited understanding of what &#8220;enhancing your specific reality&#8221; is really about. Robert notes:</p>
<p><em><strong>it is inherently about who YOU are, WHERE you are, WHAT you are doing, WHAT is around you, etc.</strong></em></p>
<p><em><strong>When I talk about AR, I try to expand the definition a little bit. Usually, when you talk to someone about augmented reality, the first thing that comes to mind is overlaying 3D graphics on a video stream. I think though, that it should more properly be any media that is specific to your location and the context of what you are doing (or want to do)&#8230;augmenting or enhancing your specific reality.</strong></em></p>
<p><strong><em>In this sense, anything that at least knows who you are (your ID, mobile phone #, etc.), where you are (GPS coord or a specific place like a cafe), and gives you relevant data, information, or media = augmented reality. Sure, you can make things more interactive or immersive, but that is the minimum.</em></strong></p>
<p><strong><em>So, in this case, yes, I think there will be networked applications in the next 18 months&#8230;mostly things that are enhanced by friends lists (you are here, your friend is over there). These will be *application specific*. My team at Neogence is already going beyond this, building a platform and infrastructure for other applications to be developed on&#8230;all networked through the same backbone. Now, in this context (the science fiction AR that we all dream about), no I do not see anyone else trying to leap a generation or two ahead of the industry to build a massively multiuser shared AR space. Expect to see things like multi-user AR games, virtual pets, kiosk marketing, magic book, &#8220;gee whiz&#8221; presentations (tradeshow booths, entertainment parks, etc.), and so forth.</em></strong></p>
<p><strong><em><br />
</em></strong></p>
<h3>Goggles Are Not the Secret Sauce&#8230;</h3>
<p><strong><em><img class="alignnone size-full wp-image-2563" title="ar-catpost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/ar-catpost.jpg" alt="ar-catpost" width="137" height="150" /><img class="alignnone size-full wp-image-2564" title="goggles-avatarpost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/goggles-avatarpost.jpg" alt="goggles-avatarpost" width="150" height="150" /><br />
</em></strong></p>
<p>AR Cat left and Robert Rice right</p>
<p>What has come to be associated with the term Augmented Reality in the popular imagination is an idea of 3D graphics projected over markers, forever waiting for the advent of &#8220;wicked next generation transparent wearable displays.&#8221; While such displays may be nirvana for AR (and they could be with us in less than twenty-four months), goggles are not the &#8220;secret sauce&#8221; of AR, as Robert points out.<strong><em><br />
</em></strong></p>
<p><em><strong>All the glasses are, is another display device. At the end of the day, it doesn&#8217;t matter if you are looking at an LCD monitor, an iPhone, a head-mounted display, or a pair of wicked next generation transparent wearable displays that magically draw directly on your retinas.</strong></em><br />
<em><strong><br />
The real tricky stuff is what happens on the backend&#8230;making it all persistent, massively multiuser, intelligent, interoperable, realistic, etc. etc.</strong></em></p>
<p><em><strong><img class="alignnone size-full wp-image-2585" title="vuzix" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/vuzix.jpg" alt="vuzix" width="450" height="318" /><br />
</strong></em></p>
<p>There has been quite <a href="http://www.realwire.com/release_detail.asp?ReleaseID=10934" target="_blank">a buzz going around</a> about the new <a href="http://www.vuzix.com/iwear/products_wrap920av.html" target="_blank">Vuzix eyewear</a>, and recently Robert talked with Vuzix and checked out the Wrap 920AV eyewear:</p>
<p><em><strong>Vuzix is not alone in pursuing the ultimate in hardware, at least as far as wearable displays. However, I think they are much farther than the rest of the pack in vision, roadmap, and execution. They have put together a team that has a sense of urgency and ambition that will blow the industry away. After talking to them, I got the feeling that they really know what they are doing and there is a lot of mind blowing stuff in their pipeline. I&#8217;m sure they are one of the few companies that really gets it and has a clear vision of the future. Definitely my first choice to work with.</strong></em></p>
<p><em><strong><br />
</strong></em></p>
<h3>Hybrid Augmented/Virtual Reality</h3>
<p><img class="alignnone size-full wp-image-2566" title="qa_2post" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/qa_2post.jpg" alt="qa_2post" width="450" height="347" /></p>
<p><a id="va0_" title="Cory Ondrejka posted" href="http://ondrejka.blogspot.com/2009/01/anybots-telepresence-robot.html" target="_blank">Cory Ondrejka posted</a> this picture of the anybots telepresence robot and &#8220;congrats to <a href="http://www.tlb.org/">Trevor Blackwell</a> and the rest of the <a href="http://anybots.com/">Anybots</a> team on the launch of <a href="http://anybots.com/abouttherobots.html">QA at CES</a>.&#8221; Cory (one of the founders and former CTO of Second Life) also made some predictions for Virtual Worlds, some optimistic and some less so, including &#8220;the increasing need to be able to diversify the Second Life product offering to begin truly rebuilding the code base.&#8221;</p>
<p>Robert is unabashedly irritated with the state of play in Virtual Worlds and MMOGs:<br />
<em><strong><br />
</strong><strong>Unless both industries (Virtual Worlds and MMOGs) have some serious upheaval or radical new approaches, they will quickly be eclipsed by AR, which will eventually evolve into something hybrid&#8230;AR/VR, depending on your level of access and hardware.</strong></em></p>
<p><em><strong></strong><strong>Iâ€™d like to see someone grab an engine like Offset, Crytek, HERO, or Unreal 3, and smack on a fat MMO server infrastructure (Eve or Bigworld)â€¦toss in the right tools, and you would see a revolution and renaissance occur at the same time in the virtual world space. All the puzzle pieces are there, just no one is putting them together the right way.</strong></em></p>
<p>I did just find out that Nortel's <a id="qkxv" title="WebAlive is powered by the Unreal 3 engine" href="http://www2.nortel.com/go/news_detail.jsp?cat_id=-8055&amp;oid=100251105&amp;locale=en-US" target="_blank">WebAlive is powered by the Unreal 3 engine</a>. You <a id="xqbw" title="can try WebAlive" href="http://www.lenovo.com/elounge" target="_blank">can try WebAlive</a> out here.</p>
<p>Robert points out how rare it has become to see people really push virtual worlds technology and MMOGs in entirely new directions. Although, of course, there are exceptions. I managed to engage some interest from Robert in the possibilities the <a href="http://opensimulator.org/wiki/Main_Page" target="_blank">opensource modular architecture of OpenSim</a> opens up, and <a id="vx_i" title="the augmented reality experiments from Georgia Tech with Second Life" href="http://arsecondlife.gvu.gatech.edu/" target="_blank">the augmented reality experiments from Georgia Tech with Second Life</a> (screenshot below) got praise from Robert for trying to do something new. (Georgia Tech have also put out a <a id="kfzj" title="virtual pet app for the iphone" href="http://uk.youtube.com/watch?v=_0bitKDKdg0" target="_blank">virtual pet app for the iPhone</a>.)</p>
<p><img class="alignnone size-full wp-image-2567" title="picture-4" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/picture-4.png" alt="picture-4" width="321" height="245" /></p>
<p>But while Robert clearly has zero patience for virtual world technology, which he sees as stuck in the mid-nineties, he notes:</p>
<p><em><strong>the innovative and wonderful stuff about SL isn't SL, it is what people are doing and creating on their own with terrible tools *IN* SL</strong></em> [Second Life].</p>
<p>Robert hopes the immersive mobile augmented reality platform he is building will generate this kind of user creativity, but with 21st-century tools.</p>
<h3>So is it "OMG" finally for the Augmented Reality we have dreamed about?</h3>
<p>According to Robert:</p>
<p><em><strong>It really boils down to a markerless solution and a good application.</strong></em></p>
<p>In the interview below we cover a number of topics, including business models for Augmented Reality, e.g., how models based on micro-transactions and virtual goods will translate to Augmented Reality.</p>
<p>Many of the challenges virtual worlds face in going mainstream are similar to those AR must overcome. Robert discusses these, including the interface/GUI that is a critical element for AR, solving the riddle of one world or many, patent wars in Virtual Worlds and Augmented Reality, the role of Augmented Reality in the future of sustainable computing, and what interoperability is about.</p>
<h3>The Back Story for AR/VR…</h3>
<p>In case you want to get up to speed on the required background reading for Augmented Reality, this is Robert's required reading list, and Denno Coil is an absolute <strong>must</strong> see (feel free to add to this list in the comments, please).</p>
<p>"If you want to see the things that have inspired our vision of what we want to build, check out:</p>
<p>* Dream Park by Larry Niven and Steven Barnes<br />
* Rainbows End by Vernor Vinge<br />
* Spook Country by William Gibson<br />
* Halting State by Charles Stross<br />
* The Diamond Age by Neal Stephenson<br />
* Donnerjack by Roger Zelazny and Jane Lindskold<br />
* Otherland by Tad Williams<br />
* Neuromancer by William Gibson<br />
* Idoru by William Gibson<br />
* Cryptonomicon by Neal Stephenson</p>
<p>and watch the whole anime of Denno Coil (subbed NOT dubbed!)"</p>
<p><img class="alignnone size-full wp-image-2568" title="dennoucoil" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/dennoucoil.jpg" alt="dennoucoil" width="450" height="256" /></p>
<p>Screenshot from Denno Coil from<a id="yic5" title="Concrete Badger" href="http://www.concretebadger.net/blog/2007/12/17/dennou-coil-full-series-2007-in-12-day-4/" target="_blank"> Concrete Badger</a>.</p>
<h3>Interview With Robert Rice</h3>
<p><strong>Tish Shute:</strong> I am glad to hear that you are working on this [an immersive mobile augmented reality platform]!</p>
<p><strong>Robert Rice:</strong> We switched gears from MMO stuff about a year ago and we are finally getting some traction. It is very hard doing anything in this economy right now, but we found an opportunity to take AR to a new level beyond what you see on YouTube. AR is still too "cute" and novelty. We don't want to play around.</p>
<p><strong>Tish Shute:</strong> I like Wikitude 'cos it even manages to do something useful!</p>
<p><strong>Robert Rice:</strong> Yeah, useful = traction. Now that we are getting near a prototype we are starting to get a lot of interest even though we are still technically way under the radar.</p>
<p><strong>Tish Shute:</strong> r u funded?</p>
<p><strong>Robert Rice:</strong> Privately funded, some revenues from an early license, and ongoing discussions with several institutional investors. So, we have some funding, but nothing spectacular just yet.</p>
<p><strong>Tish Shute:</strong> are you just developing an AR platform?</p>
<p><strong>Robert Rice:</strong> hrm, sort of, but not just that. By platform I mean tools, SDK, and infrastructure plus some applications. The idea is to build something that facilitates everyone else making cool things and useful applications for different industries/sectors.</p>
<p><strong>Tish Shute:</strong> Yes, that is the cool thing to do, but isn't that hard to fund!</p>
<p>(Robert grins) Well, that depends on the business model. We've got that figured out. I'd be absolutely happy if everyone and their brother were making applications on our stuff; that gives us an edge on market penetration/saturation. There are plenty of examples that prove the model. If you give people free and easy-to-use tools, they will run with it. ARToolKit, for example, has tons of people making nifty things and posting videos on YouTube, which has pushed it to the forefront as THE AR middleware to use right now. Or heck, look at YouTube's free service, and how they dominate video sharing. Sure there will be a lot of "noise", but there will also be a lot of "signal" that will rise to the top; facilitating and enabling is creating value in its own right.</p>
<p><strong>Tish Shute:</strong> But how do you expect to monetize?</p>
<p><strong>Robert Rice:</strong> There are a good half a dozen ways to monetize AR or an AR platform.</p>
<p><strong>Tish Shute:</strong> What are your top 3?</p>
<p><strong>Robert Rice:</strong> hrm, microtransactions, localized mobile advertising, and enterprise solutions (visualization).</p>
<p><strong>Tish Shute:</strong> Do you think the consumer market will give the lead?</p>
<p><strong>Robert Rice:</strong> I'm not sure. We are getting people from academia, intelligence, defense, border security, and some corporate types knocking on our door already, and pretty aggressively. It may be that those sectors push AR before consumer entertainment really kicks off.</p>
<p>But going back to a discussion we had earlier &#8211; yes, working with "no markers" is a big deal.</p>
<p><strong>Tish Shute:</strong> Can you talk about what you are doing there or is it still under wraps?</p>
<p><strong>Robert Rice:</strong> I can say that between some university tech transfer and some of our own proprietary stuff, we are using some fairly common visual tracking technology. If you are really plugged into the AR scene, you will know there are probably half a dozen visual tracking methods out there. We just looked for the best one, licensed it for commercial use, and then started working our magic. This is a very small piece of the overall effort, but worth noting.</p>
<p>The downside of working with university tech is that it is usually based on research, incomplete, and not wrapped up in a nice commercial package. On the upside, it can be a good start to build on.</p>
<p><strong>Tish Shute:</strong> As you know I am very interested in "technology that matters", in particular tech that can help us accomplish the urgent goal of sustainable living.</p>
<p><strong>Robert Rice:</strong> Oh, I'm pretty keen on sustainable living as well…after I sell off a few companies and have money of my own, I'm going to get into arcologies.<br />
…<br />
Robert grins</p>
<p>The interesting thing with the visual stuff combined with our other tech is that we can make things multiuser, persistent, dynamic, and mobile.<br />
The markers (fiducials) are really, really limiting outside of basic applications. You can't really plaster everyone and everything with a marker. And they are, by nature, static (even if they are animated or whatever).</p>
<p>Also… our stuff works indoors and outdoors even without a GPS connection.<br />
…<br />
Robert grins</p>
<p><strong>Tish Shute:</strong> Now that does sound interesting!</p>
<p><strong>Robert Rice:</strong> Yeah, with visual, you don't need a compass or accelerometers either. Less hardware : )</p>
<p>You start with Wi-Fi triangulation or GPS coordinates to get a "brute" location, and then you use the visual stuff for down-to-the-meter accuracy, and that, by nature, gives you your orientation and positioning.</p>
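<p>The coarse-to-fine pipeline Robert describes can be sketched in a few lines. This is a minimal illustration with hypothetical helpers (<code>radio_fix</code>, <code>visual_refine</code>) and made-up landmark coordinates, not his actual system:</p>

```python
# Sketch of coarse-to-fine localization: a cheap radio fix (Wi-Fi/GPS)
# narrows the search area, then visual tracking against known features
# refines it to meter-level pose. All names and numbers are illustrative.
from dataclasses import dataclass
from math import dist

@dataclass
class Pose:
    x: float          # meters, in a local map frame
    y: float
    heading_deg: float
    error_m: float    # estimated accuracy radius

def radio_fix() -> Pose:
    """Coarse 'brute' location from Wi-Fi triangulation or GPS."""
    return Pose(x=120.0, y=45.0, heading_deg=0.0, error_m=30.0)

def visual_refine(coarse: Pose, landmarks: dict) -> Pose:
    """Stand-in for a visual tracker: snap to the nearest known landmark.
    A real system would match camera features, not do a table lookup."""
    nearest = min(landmarks.values(), key=lambda p: dist(p, (coarse.x, coarse.y)))
    # Matching a known landmark pins down position and orientation together.
    return Pose(x=nearest[0], y=nearest[1], heading_deg=87.0, error_m=1.0)

landmarks = {"storefront": (118.5, 44.2), "fountain": (300.0, 10.0)}
pose = visual_refine(radio_fix(), landmarks)
print(pose.error_m)  # coarse ~30 m fix refined to ~1 m
```

<p>The point of the design is that the radio fix only narrows the search; the visual match is what yields meter-level position and, as a side effect, orientation.</p>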
<p><strong>Tish Shute: </strong>Wow this is beginning to sound very interesting!</p>
<p><strong>Robert Rice:</strong> Once you have that, it doesn't matter where you go; it continues to track and continually refines areas you have been before. We've spent the last year figuring all this out. There are so many problems and obstacles ahead for anyone trying to do what we are, but we have already discovered solutions.</p>
<p>Oh, visual tracking = gesture-based interfaces too. That's going to take some work, but it's doable. The real pain in the ass there isn't the actual tracking, it is the interface design.</p>
<p>That's something that almost every AR company, venture, and research program is missing out on entirely. They are so focused on making cute things with markers. They are missing the larger problems of AR spam, interface, iconography, GUI, metaphor, interoperability, privacy, identity.</p>
<p><strong>Tish Shute:</strong> So how are you dealing with all that!!</p>
<p><strong>Robert Rice:</strong> We took the backwards approach of trying to think where we want things to be in ten years (and we read all the cool books…Vinge, Stephenson, Gibson, etc.) and then we spent time trying to think of what the potential problems are…like AR spam. It's bad enough when a giant penis flies by in Second Life; we don't want that to happen in a global wireless AR platform.</p>
<p><strong>Tish Shute: </strong>Do you have a prototype yet?</p>
<p><strong>Robert Rice:</strong> hrm, 6 months away from YouTubing something. The problem has been slow funding, which equals slow development. We also don't want to show our cards too soon…too many potential competitors out there.</p>
<p>…<br />
Robert grins</p>
<p><strong>Tish Shute:</strong> when you say microtransactions what is the business potential there?</p>
<p><strong>Robert Rice: </strong>hrm, last year I think $1.5B was spent on virtual items. That's games and virtual worlds. That should hit $5B in a couple of years. That's basically people buying and selling things like WoW gold or items in SL or whatever. Microtransactions are basically the same thing, but in AR space.</p>
<p>Why couldn't a 3D artist make a wicked animated 3D dragon, and then sell it to someone else? With AR, you could sit it on your shoulder. With a good scripting engine, you could train it to do stuff. That's what I want to enable.</p>
<p>tools + sdk + platform = enabling people to make and create. Add in a commerce level (microtransactions) and voilà.</p>
<p><strong>Tish Shute:</strong> At the moment all of these virtual goods are very platform specific, is that a problem for you?</p>
<p><strong>Robert Rice:</strong> Not at all. This is at a higher level. You have to switch mental models when you talk about what AR could or should be. For example, let's contrast the web and virtual worlds. For every virtual world you go to, you have to download a whole new client. Imagine if that model was applied to the web… you would need a brand new browser for every website you went to. That is just so…wrong.</p>
<p>It's the same thing for AR…people are thinking about it with the same mental and business models and development philosophies as virtual worlds or the web. There are some things and aspects that work fine, but not everything.</p>
<p>Virtual worlds are, by nature, necessarily different and walled gardens. The idea of 100% open and interoperable virtual worlds is a red herring…it sounds good, but in practice it is a really dumb idea.</p>
<p><strong>Tish Shute: </strong>I was wondering if you had a way to leverage all the 3D content already created, 'cos that would jump-start things in AR, wouldn't it?</p>
<p><strong>Robert Rice:</strong> Oh yeah, that's easy. They all use the same polygons. Any virtual item in any game or virtual world was likely created with 3D Studio or Maya or something similar, and would be easy to convert and use.</p>
<p><strong>Tish Shute:</strong> So people could bring their WoW weapons into your system?</p>
<p><strong>Robert Rice: </strong>Not legally, but sure. It's just a 3D model with a texture. It doesn't matter if you use Corel Draw or Photoshop or Paint Shop Pro…or one screwdriver or another. Part of my team's advantage is that we are all experienced in MMORPG and virtual world design and development. We know the tools, the tech, and what works and what doesn't.</p>
<p><strong>Tish Shute:</strong> But some of the 3D content created in the social worlds is what has most value to people.</p>
<p><strong>Robert Rice: </strong>Right, and that can be exported out easily.</p>
<p><strong>Tish Shute: </strong>But back to "real" life applications. Is your platform really markerless?</p>
<p><strong>Robert Rice:</strong> Yes. A marker = a printed icon or glyph, also known as a fiducial.</p>
<p><strong>Tish Shute:</strong> But u must have some marker?</p>
<p><strong>Robert Rice:</strong> hrm, more accurately, you need a point of reference.</p>
<p>Visual tracking has been around for more than a decade. Lots of work for robots and other sectors.</p>
<p><strong>Tish Shute:</strong> But isn't the specificity of reference in terms of RL applications a vital key, for example, for a database of things?</p>
<p>(Robert grins) That is a different problem…tracking, registration, mapping, positioning, etc. That question has to do with mapping, which is related to visual tracking, but not the same thing. We have a rather unique approach to some of this that I can't discuss (patent pending).</p>
<p><strong>Tish Shute: </strong>But, for example, to create an augmented natural history of food &#8211; say I want to point at the slab of meat on my plate and know where that cow came from, what feedlot, how it was treated, etc.</p>
<p><strong>Robert Rice: </strong>That is not possible without ubiquitous nanotechnology. Shall I explain?</p>
<p><strong>Tish Shute:</strong> Yes please!</p>
<p><strong>Robert Rice:</strong> OK, let's step back a minute and turn that burger back into a cow… the first problem (of this particular situation) is differentiating one cow from another. Since most cows look alike, you can either attempt to discriminate visually (cow patterns) or use a much simpler option, like giving each cow an RFID chip in its bell or hoof.</p>
<p>Now, most people would try to figure out how to jam all sorts of info into the RFID chip, which sounds like a good idea, but isn't. The trick would be to simply use the RFID to store a unique identifier which is then linked to a database elsewhere.</p>
<p>That database should continually be updated with whatever relevant information you need, so as you get close with your AR laptop, wearable displays, or embedded brain chip, you get the identifier broadcast, then you get the info downloaded to you, and it "sticks" to the cow with the generic visual tracking (object following; even a simple bounding box is sufficient for a slow-moving cow).</p>
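<p>The "dumb tag, smart database" pattern Robert outlines (the chip stores only an identifier; everything mutable lives server-side) can be sketched as follows. The database, record fields, and function names are all illustrative assumptions:</p>

```python
# "Dumb tag, smart database": the RFID chip carries only a unique ID;
# all mutable data lives in a server-side record keyed by that ID.
# Records and fields here are made up for illustration.

cow_db = {
    "COW-0042": {
        "breed": "Hereford",
        "farm": "a hypothetical farm in Utah",
        "feed": "grass",
        "last_checkup": "2009-01-10",
    }
}

def on_rfid_read(tag_id: str) -> dict:
    """Called when the AR client reads a tag: resolve the ID against
    live data rather than trusting anything baked into the chip."""
    record = cow_db.get(tag_id)
    if record is None:
        return {"tag": tag_id, "status": "unknown tag"}
    return {"tag": tag_id, "status": "ok", **record}

info = on_rfid_read("COW-0042")
print(info["feed"])  # fetched from the database, not from the chip
```

<p>Because the chip only carries the key, the record can be updated continuously without ever rewriting the tag.</p>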
<p>So, up to that point, you can get tons of information about that specific cow, or that cow population (remember, AR is not just about overlaying data…it is inherently about who YOU are, WHERE you are, WHAT you are doing, WHAT is around you, etc.). Tie in data visualisation and some farmer tools, and all sorts of other things happen. Now, let's move the timeline ahead a bit.</p>
<p>The butcher gets the cow and does his handiwork…because we know all the info about the cow, all of the meat can be properly labeled and marked, ideally with a UPC code or a unique glyph (somewhat problematic depending on how many unique glyphs you can create). So, while you are in the grocery store, you can access the relevant shopping data…age of cow, state of origin, type of feed, how many spots, how much body fat, which butcher, whatever. Not because of what is inside the package, but the package itself.</p>
<p>Getting back to your hamburger, the problem is that it is a burger…there is nothing to distinguish that burger from another one at the table…unless you stuck an RFID chip in it or splattered it with ink and a unique glyph, or maybe a special one-of-a-kind plate.</p>
<p>However, a properly designed AR system could say "hey! that's a hamburger! and I know I am at Fat Daddy's Burger Joint in Raleigh, North Carolina on Glenwood Avenue, and I know that they cook their burgers this particular way, and their meat supplier is those guys over there, and they usually get their cow meat from a farm out in Utah."</p>
<p>With ubiquitous nanomites or whatever, then it's not that far out to consider edible nanos that are in the meat, so a slab of meat can tell you about itself and broadcast that to the general public.</p>
<p><strong>Tish Shute:</strong> What useful scenarios can we create without the nanomites?</p>
<p><strong>Robert Rice:</strong> If it wasn't a burger or a consumable organic, the scenario changes.</p>
<p><strong>Tish Shute: </strong>What is the time scale on nanomites?</p>
<p><strong>Robert Rice:</strong> ehhhhhhh, 20 years minimum if we are lucky. They sound good on paper, but there is a whole book's worth of problems and reasons why they are so far off…as consumer-grade, all-over-the-place type of stuff.</p>
<p><strong>Tish Shute:</strong> Did you see the Nokia Home Control center?</p>
<p><strong>Robert Rice:</strong> Yes, I saw the Nokia stuff.</p>
<p>AR for sensors, like security systems, temperature control, etc.: they all become "sources of data" that an AR system can visualize. So yes, that's easily doable. You could do that in a short period of time with some half-decent engineers.</p>
<p>The trick of what Nokia is doing is aggregating sensor data from a building/home/facility, mashing it together, and sending the mobile device alerts and data visualization. Conceptually it is rather simple, but no one has done it right or well yet.</p>
<p>It wouldn't surprise me if Nokia pulled it off.</p>
<p><strong>Tish Shute:</strong> Yes, and if they do, and someone does an AR interface to it, that would be an inflection point for AR?</p>
<p><strong>Robert Rice:</strong> In a roundabout way, yes. You could get data directly from your house, or get it through your mobile device, and in either case use the AR for visualization and control.</p>
<p>The interface/GUI is a critical element for AR. That is one of the areas where it, as an industry, risks doing a bad job and turning into just a fad or another novelty like VR. Virtual worlds have been struggling with that for a while, but MMORPGs have had the effect of extending their life cycle.</p>
<p><strong>Tish Shute: </strong>Yes VWs have not solved the interface problem.</p>
<p><strong>Robert Rice: </strong>The interface is one of their problems, yes. Most virtual worlds are stuck in 1996/98.</p>
<p><strong>Tish Shute:</strong> If AR is inherently about who YOU are, WHERE you are, WHAT you are doing, WHAT is around you, etc., it seems that it is the ideal interface for home control?</p>
<p><strong>Robert Rice:</strong> Well, for home control, you must know:</p>
<p>1) Who am I? Am I authorized to know this information? Am I a guest?</p>
<p>2) Where am I? Is this my house? Or someone else's?</p>
<p>3) What am I doing? Do I want to make all the doors lock? Turn on or off lights? Open the garage door? Trigger the security alarm?</p>
<p>So the same questions apply.</p>
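<p>Robert's three questions amount to a context check before any command runs. A minimal sketch, assuming made-up roles, locations, and permission sets rather than any real home-automation API:</p>

```python
# Who am I / where am I / what am I doing, reduced to a gate in front
# of every home-control command. Roles and permissions are invented
# for illustration.

PERMISSIONS = {
    "owner": {"lock_doors", "lights", "garage", "alarm"},
    "guest": {"lights"},
}

def authorize(role: str, location: str, action: str) -> bool:
    # Who am I? -> resolve my role to a permission set
    allowed = PERMISSIONS.get(role, set())
    # Where am I? -> only control the house you are actually in
    if location != "home":
        return False
    # What am I doing? -> check the requested action against my permissions
    return action in allowed

print(authorize("owner", "home", "garage"))  # True
print(authorize("guest", "home", "garage"))  # False
```

<p>An AR client already knows the user, their position, and the gesture or command they just made, so it can supply all three arguments for free.</p>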
<p>I'd say that all virtual worlds are stuck in the mid-90s. They are at least a decade behind the game worlds…in technology, design, implementation, architecture, etc. In my opinion, things like Second Life are shameful in how they are presented as state of the art, innovative, ground breaking, new, wonderful, and world changing.</p>
<p>But that's another topic of conversation : )</p>
<p><strong>Tish Shute: </strong> Well, for me the contribution of VWs is the presence-enabled real-time interaction with the application (as a 3D info machine) and the context with other people.</p>
<p><strong>Robert Rice: </strong>Oh, there is no doubt that they are greatly useful and have a phenomenal amount of potential.</p>
<p>They *could* be all those things I just said that SL isn't…the problem is that they are either just existing, or they are meandering around without any real focus or direction. They aren't evolving.</p>
<p>Even MMORPGs are losing their way and beginning to stagnate terribly.</p>
<p><strong>Tish Shute:</strong> Yes, I agree.</p>
<p><strong>Robert Rice: </strong>But, AR has the potential to change a lot of things.</p>
<p>I'm sure you have seen <a id="n_22" title="the yellowbook commercials" href="http://www.youtube.com/watch?v=zdPFBTQpk-U" target="_blank">the yellowbook commercials</a>? The technologies you are seeing here are doable in, hrm, a year or less maybe. The tricky part is the interactivity and AI…that is, the content. Everything else isn't a problem. The avatar there could be photorealistic or stylized like a WoW character.</p>
<p>You could do that to some degree with markers for registration, but dynamically changing the content linked to those markers is a little weird.</p>
<p>(By the way, for the record, I like markers just fine, I just don't think they are useful for real-world mobile applications.)</p>
<p>I also think that the guys that want to dust the planet with miniature RFID chips are on crack and are going about it the wrong way.</p>
<p><strong>Tish Shute: </strong>A high level of interactivity is hard though. Isn't it? Even in VWs it is very limited.</p>
<p><strong>Robert Rice:</strong> It depends if you can track what the user is doing, and interpret that properly. Interactive is also a very loose term.</p>
<p>Clicking a button and making a light blink could be considered interactive.</p>
<p><strong>Tish Shute: </strong>In VWs a high level of interactivity would be to wield a virtual hammer and have a real nail go in! Is physics part of the problem?</p>
<p><strong>Robert Rice:</strong> Physics isn't difficult; there is plenty of middleware out there for it. The problem with that isn't so much the physics as much as it is the scale and purpose.</p>
<p><strong>Tish Shute:</strong> Well, for robotics?</p>
<p><strong>Robert Rice:</strong> That gets into a conversation about meshes, textures, and volumetric collision detection and stuff.</p>
<p><strong>Tish Shute:</strong> virtual robotics?</p>
<p><strong>Robert Rice:</strong> You mean teleremote/telepresence of real robots?</p>
<p><strong>Tish Shute: </strong>yes!</p>
<p><strong>Robert Rice:</strong> Ah, for that, you need some tactile feedback and some other stuff &#8211; doable, but insanely difficult. That's why you don't see a whole lot of remote-controlled surgery robots all over the place.</p>
<p>They do exist…</p>
<p><strong>Tish Shute: </strong> Will AR contribute to sustainable living by freeing us from some of our energy-hogging devices?</p>
<p><strong>Robert Rice: </strong>AR will ultimately encourage energy saving and recycling. Where did I leave a light on? Where is the nearest trash can? What is the UV index outside today?</p>
<p>Yes, computers are energy hogs, but as we start seeing larger SSD drives, more efficient CPUs (even if the number of cores increases in multiples), and so on, the power will go down.</p>
<p>Also, think about this…wearable displays potentially use less energy than LCD monitors on your desk.</p>
<p><strong>Tish Shute: </strong>Yes, I should pick the brains of my Intel chums on energy saving!</p>
<p><strong>Robert Rice: </strong>Getting rid of the monitor and switching to solid state drives will save an assload of power. Yes, I said assload.</p>
<p>Tell your Intel chums to quit screwing around with single-core mobile CPUs. We need multiple cores that are smaller, faster, and use less power.</p>
<p><strong>Tish Shute: </strong>Is AR the sustainable future of VWs and MMOGs?</p>
<p><strong>Robert Rice: </strong>The fun stuff will happen when they are both integrated in some fashion.</p>
<p><strong>Tish Shute:</strong> So perhaps this is why the Georgia Tech guys are trying to combine AR and SL (<a id="boum" title="see video here" href="http://uk.youtube.com/watch?v=O2i-W9ncV_0&amp;feature=related" target="_blank">see video here</a>).</p>
<p><strong>Robert Rice:</strong> That first video was pretty damn cool. It just pains me that they are using SL for it. And omg, all those markers on the table.</p>
<p>Although, I could care less about seeing my SL avatar on my coffee table. I would rather see an avatar representing ME in the real world, moving around in a virtual world that is a "to scale" replica of the real world. That is MUCH more interesting and innovative.</p>
<p>But even if I donâ€™t like where they are going, or that they are using SL, the important thing is that they are doing something and forging ahead. I have a massive amount of respect for anyone, private, government, or academic, that is doing that.</p>
<p>And yes, the door (or window, or looking glass) has to work both ways for maximum potential; at least, that's what I'd like to see. They don't *have* to, but it would be rather cool.</p>
<p>And going back to sustainability, AR has the potential to make monitors generally obsolete, laptops too. That's a lot of power-hungry devices with all sorts of metals and batteries inside.</p>
<p>But, even if the tech was absolutely crazy awesome right this minute, it would take a little while for consumer adoption.</p>
<p><strong>Tish Shute:</strong> But AR unleashes the mobile device?</p>
<p><strong>Robert Rice: </strong>Yes, AR is going to be built on powerful mobile devices for the near future, eventually embedded comps in clothing and whatnot. But that is a ways off.</p>
<p>Entertainment is going to be the first huge driver.</p>
<p><strong>Tish Shute:</strong> So people will get used to having a pet virtual dragon on their shoulder first?</p>
<p><strong>Robert Rice: </strong>Yes, a virtual dragon is way cool, easy tech for games, and can eventually be leveraged into a smart agent, which becomes a practical application…agent-based contextual search, etc. Yes, entertainment will also drive people to get used to the tech.</p>
<p><strong>Tish Shute: </strong>Oh, thanks for turning me on to <a id="kzbv" title="gamesalfresco" href="http://gamesalfresco.com/" target="_blank">gamesalfresco</a>!</p>
<p><strong>Robert Rice: </strong>I've noticed that the good stuff usually gets linked to there. They don't list my blog, but that's what I get for staying under the radar and not posting often. But anyway, gamesalfresco is the first place I send people that need a crash course in AR. Great site, great owner.</p>
<p><strong>Tish Shute:</strong> So are you in agreement with Thomas Wrobel's positioning of <em><strong><a href="http://www.mobilizy.com/wikitude.php" target="_blank">Wikitude</a></strong></em> and the <em><strong><a href="http://gamesalfresco.com/2008/07/20/want-your-own-augmented-reality-geisha/" target="_self">AR Geisha doll</a></strong></em> as significant milestones for AR?</p>
<p><strong>Robert Rice: </strong>Yes, these are among the first attempts to get away from the novelty of simply rendering a 3D object based on a marker, and to make it interesting.</p>
<p class="MsoNormal">Remember, one of the biggest risks that AR has is being branded as "novelty", which means "cool for five minutes but ultimately a waste of time." I think we have a ways to go before something is truly useful, but as 2009 progresses we should start seeing some effort here. I'd guess 2010 before something really useful comes out…at least something practical.</p>
<p>Now, having said that, I should say that I expect entertainment and games to take the lead (as usual), although there are a few companies really trying to leverage AR and video/graphics compositing for marketing (brochures) and location based methods (kiosks, large screen projections, etc.)</p>
<p><strong>Tish Shute:</strong> Many people would say Snow Crash (metaverse) is now and Halting State (AR) is ten years from now. But you are seeing a development timeline for some popular AR apps in the next 18 months?</p>
<p><strong>Robert Rice: </strong> Anyone that says Snow Crash is -now- is living in a box. Virtual Worlds, Virtual Reality, and immersive tech in general stopped innovating in the mid 90s. I'm continually flabbergasted at the number of people that think things like Second Life are state-of-the-art or innovative. You might as well try to market a Walkman as cutting edge, even though we have iPods out there.</p>
<p>I'd like to see someone grab an engine like Offset, Crytek, HERO, or Unreal 3, and smack on a fat MMO server infrastructure (Eve or BigWorld)…toss in the right tools, and you would see a revolution and renaissance occur at the same time in the virtual world space. All the puzzle pieces are there, just no one is putting them together the right way.</p>
<p><strong>Tish Shute:</strong> Why doesnâ€™t anyone do that?<strong></strong></p>
<p><strong>Robert</strong><strong> Rice</strong><strong>: </strong>Its not cheap, people will only fund a copy of something that exists already, people fear change and innovation, etc, The list goes on. The right money goes to the wrong people all the time.</p>
<p>Alternatively stated, there is a lot of â€œright idea, wrong implementationâ€</p>
<p>MMORPGs carried the torch and have made huge strides on the technology front, but have devolved in design. More often than not the gameplay emphasizes the single player experience and does nothing to take advantage of the potential of the massively connected internet.</p>
<p>Unless both industries have some serious upheaval or radical new approaches, they will quickly be eclipsed by AR, which will eventually evolve into something hybrid…AR/VR depending on your level of access and hardware.</p>
<p>But yes, I'd say that the next 18 months are going to be very interesting, with a lot of money being thrown around, new ventures, and plenty of content/applications. I expect most of this will be centered on single-user AR experienced through a mobile device with a screen (iPhone, Android, etc.). I expect that there will be a significant boost after Vuzix releases some of their wearable *transparent* displays, putting Microvision back into the "has potential but is too quiet" position.</p>
<p><strong>Tish Shute:</strong> AR conjures an image in many people's minds of dreadful head gear!</p>
<p><strong>Robert</strong><strong> Rice</strong><strong>: </strong>Yes, it is either transparent wearable displays (in an eyeglass form factor) or nothing. HMDs with miniature LCD or OLED displays are good for streaming video, but for the mobile ubiquitous AR we all dream about, it has to be something that looks and feels like a pair of Oakleys.</p>
<p>I should also mention that several different types and modes of AR are going to find themselves being defined and refined over the next two years as we continue to blaze new trails, establish a lexicon (we keep borrowing terms from games, VR, virtual worlds, mmorpgs), and really work out the how as well as the why.</p>
<p>Even though the idea of AR has been around for a long time, the technology is just beginning to emerge, and very few people are even looking far enough ahead to figure out the problems and solutions that the tech creates. Really, who is thinking about how to deal with AR spam right now?</p>
<p><strong>Tish Shute: </strong>Do you see any successful networked AR applications emerging in the next 18 months?</p>
<p><strong>Robert</strong><strong> Rice</strong><strong>:</strong> Yes and no.</p>
<p>When I talk about AR, I try to expand the definition a little bit. Usually, when you talk to someone about augmented reality, the first thing that comes to mind is overlaying 3D graphics on a video stream. I think, though, that it should more properly be any media that is specific to your location and the context of what you are doing (or want to do)…augmenting or enhancing your specific reality.</p>
<p>In this sense, anything that at least knows who you are (your ID, mobile phone number, etc.), where you are (GPS coordinates or a specific place like a cafe), and gives you relevant data, information, or media = augmented reality. Sure, you can make things more interactive or immersive, but that is the minimum.</p>
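<p>That minimum bar (know who the user is, know where they are, return relevant media) can be sketched in a few lines. This is a hypothetical illustration, not Neogence's implementation; the Poi type, the augment function, and the 0.5 km radius are all invented for the example:</p>

```python
from dataclasses import dataclass
from math import radians, sin, cos, asin, sqrt

@dataclass
class Poi:
    """A point of interest: a place plus the media attached to it."""
    name: str
    lat: float
    lon: float
    info: str

def distance_km(lat1, lon1, lat2, lon2):
    """Haversine great-circle distance between two GPS coordinates, in km."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 6371 * 2 * asin(sqrt(a))

def augment(user_id, lat, lon, pois, radius_km=0.5):
    """The AR minimum: given identity and location, return media relevant to that spot."""
    nearby = [p for p in pois if distance_km(lat, lon, p.lat, p.lon) <= radius_km]
    return ["%s: %s" % (p.name, p.info) for p in nearby]
```

<p>Feeding it a user position and a list of places returns only the media within range; everything fancier (3D overlays, interactivity, immersion) layers on top of this basic lookup.</p>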
<p>So, in this case, yes, I think there will be networked applications in the next 18 months…mostly things that are enhanced by friends lists (you are here, your friend is over there). These will be *application specific*. My team at Neogence is already going beyond this, building a platform and infrastructure for other applications to be developed on…all networked through the same backbone. Now, in this context (the science fiction AR that we all dream about), no, I do not see anyone else trying to leap a generation or two ahead of the industry to build a massively multiuser shared AR space. Expect to see things like multi-user AR games, virtual pets, kiosk marketing, magic books, "gee whiz" presentations (tradeshow booths, entertainment parks, etc.), and so forth.</p>
<p>The big thing I'm worried about is AR becoming the next Silicon Valley trend…once they realize the potential, an enormous amount of capital will flow to a bunch of startups with half-baked ideas, weak business models, ten-year-old tech, and a lot of overhyped marketing. That is the very thing that will kill this technology as something that has true power and potential to literally change the way we interact with each other, our surroundings, information, and media.</p>
<p><strong>Tish Shute: </strong>Do you think AR has value for a project like Pachube that helps us connect data from lots of different environments and sensors/actuators?</p>
<p><strong>Robert</strong><strong> Rice</strong><strong>:</strong> I think that AR has value as an interface to this data (essentially data visualization based on information streaming from a sensor or source that is interpreted in some dynamic graphical manner that has meaning). This is one of the "big areas" where ubiquitous augmented reality and wearable computing will really shine. I'll definitely be keeping an eye on Pachube.</p>
<p><strong>Tish Shute:</strong> I can't help it! I am really interested to hear more about the Vuzix glasses.</p>
<p><strong>Robert</strong><strong> Rice</strong><strong>:</strong> Yeah, everyone is getting hung up on the glasses as the end-all, be-all, and on having markers everywhere too.</p>
<p>All the glasses are is another display device. At the end of the day, it doesn't matter if you are looking at an LCD monitor, an iPhone, a head-mounted display, or a pair of wicked next-generation transparent wearable displays that magically draw directly on your retinas.</p>
<p>The real tricky stuff is what happens on the backend…making it all persistent, massively multiuser, intelligent, interoperable, realistic, etc., etc.</p>
<p>I think that we are within 24 months of the magic wearables (these new ones by Vuzix are probably the real first-generation attempt at doing it right). They won't be perfect, but I expect they will be functional…and once we have functional, we can start doing the good stuff.</p>
<p><strong>Tish Shute:</strong> You mentioned your disappointment with VWs and MMORPGs earlier. Could you tell me more about that?</p>
<p><strong>Robert</strong><strong> Rice</strong><strong>: </strong> Yeah, there was an evolutionary divergence between virtual worlds and MMORPGs a while back. One stagnated almost completely, and the other leapt ahead in one sense and devolved horribly in the other. Neither is where the state of the art should be. That is a whole other conversation, and probably a second book.</p>
<p><strong>Tish Shute:</strong> So making AR persistent, massively multiuser, intelligent, interoperable, realistic, and so on: that is where your efforts are going?</p>
<p><strong>Robert</strong><strong> Rice</strong><strong>: </strong>Yes. I fully expect that the hardware is almost ready for it. You can cobble together some amazing things in the lab right now, and I think commercial viability is imminent. The real value (as far as I'm concerned) is in making it mobile, wireless, persistent, and massively multiuser. You could argue that augmented reality will take over where virtual reality failed and become Internet 3, Internet 1 being the internet, Internet 2 being the web…</p>
<p>MMORPGs are nothing more than single-player games in a multiuser environment these days. I'm more than a bit bitter about it. All the right money went to the wrong people, and the best games we have are barely shadows of what we could have had by now.</p>
<p><strong>Tish Shute:</strong> Are there any open source AR platform dev projects?</p>
<p><strong>Robert</strong><strong> Rice</strong><strong>: </strong>Open source? Hrm, I'm sure there are multiple ones out there.</p>
<p>If not entirely open source, there are plenty of things to experiment with that are generally free if you aren't trying to sell something; DART and ARToolKit come to mind as very accessible applications.</p>
<p>Marker-based AR is very important right now…it is easy, low tech, understandable, highly customizable, and most importantly, accessible to the average joe. Ultimately, though, we need a method of pure tracking…no markers glued to everything on the planet, no "billions of RFIDs" embedded in every square inch of every object on the planet, etc.</p>
<p><strong>Tish Shute:</strong> What do you mean by interoperability in AR? And what do you think about the development of standards?</p>
<p><strong>Robert</strong><strong> Rice</strong><strong>: </strong> Ooh, good question.</p>
<p>Ok, so the internet is basically computers communicating with computers, and the web is mostly pages linking to other pages (I'm greatly oversimplifying here). Hold this thought for a minute.</p>
<p>Switch over to MMORPGs. If you want to play in one (or a virtual world), you need to download a client that is specific to that world. One client does not work with another world. There are plenty of efforts to change this, but they are all barking up the wrong tree. The specific uniqueness of each world defeats the need and purpose of true interoperability, unless you completely reinvent the whole thing with a common backbone, features, functionality, etc. The very nature of virtual worlds and MMORPGs rebels against this. You absolutely do not want an avatar from Second Life running around in World of Warcraft (for reasons that should be obvious).</p>
<p>On the other hand, with the web, you can use just about any client (browser) to access nearly any website (some requiring plugins or whatever).</p>
<p>The thing with augmented reality is, how do we go about making this? I've seen a few people thinking about this from the wrong perspective. There was a question at the last TechCrunch to the Sekai Camera guys (a conceptual AR application for the iPhone) where someone on the panel wanted to know how website owners would convert their content for augmented reality. BZZZZZT! That is a fundamental misunderstanding of what AR is, or could be, and it falls into the same trap I see a lot of people falling into…and that is looking at AR through the web 2.0 lens or the virtual world lens. It is absolutely fundamentally different at the core…sure, there are similarities: it has social networking/media applications and properties, and it has 3D graphics, but it stops there.</p>
<p>Ubiquitous augmented reality will be dramatically different depending on which standards, approaches, and philosophies get the most traction first. Will you walk down the street with your AR glasses and have a pop up every 30 feet asking you if you want to access the AR content on another server? Will you then have to register, subscribe, or whatever?</p>
<p>Or will all AR content be mediated by one sole master control server deep in the bowels of google? What about some other option? Will you need different sets of glasses to access different features and content from multiple sources?</p>
<p>At the end of the day, it should not matter what brand of glasses you are wearing; you should never have to deal with AR server popups to join/subscribe, and so forth.</p>
<p>Interoperability, in the context of what I was saying earlier, is the sense of how to build the infrastructure so all of this is seamless to the end user, while still maintaining the features/functionality necessary for all of what augmented reality promises us…I don't want to see everything in AR space, I want to be able to tune in or filter out some things, and I want to customize the snot out of what I see (perhaps changing metaphors or "holoscapes"), and so on. It all has to work together and simplify the end-user experience, or it won't get anywhere.</p>
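<p>The tuning and filtering described here can be pictured as client-side selection over tagged content "layers." This is a speculative sketch, not any real AR platform's API; visible_layers and the source/tags fields are names invented for the example:</p>

```python
def visible_layers(layers, subscribed_tags, blocked_sources):
    """Keep only layers the user has tuned in, dropping blocked publishers."""
    return [
        layer for layer in layers
        if layer["source"] not in blocked_sources
        and subscribed_tags & layer["tags"]  # at least one tag the user wants
    ]

# Example: three layers from different publishers in the user's field of view.
layers = [
    {"name": "transit overlay", "source": "city", "tags": {"transit"}},
    {"name": "ad billboard", "source": "spammy-ads", "tags": {"retail"}},
    {"name": "friend markers", "source": "social-net", "tags": {"friends"}},
]
tuned_in = visible_layers(layers, {"transit", "friends"}, {"spammy-ads"})
```

<p>Here tuned_in keeps the transit overlay and friend markers while dropping the blocked ad layer; the interoperability question is whether every AR server publishes its content in a shape that a client-side filter like this can consume.</p>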
<p><strong>Tish Shute: </strong>So what caused the stagnation of new development and the devolution of MMOGs, in your opinion?</p>
<p><strong>Robert</strong><strong> Rice</strong><strong>: </strong>Yes, look at all the hope and hype for the MMORPGs released in the last 12 months. Really, what is different or better? Now, what is worse?</p>
<p>I bet any decent MMORPG gamer could give you a list of 2 or 3 things for the first question and 20-30 things for the second.</p>
<p>And VWs seem to be stuck in a feedback loop.</p>
<p><strong>Tish Shute: </strong>Feedback loop?</p>
<p><strong>Robert</strong><strong> Rice</strong><strong>:</strong> Imagine nailing one of your feet to the ground and then trying to run 'round and 'round and 'round.</p>
<p><strong>Tish Shute:</strong> Why do you think this happened to VWs?</p>
<p><strong>Robert</strong><strong> Rice</strong><strong>: </strong>Men in suits and flashy watches.</p>
<p>Actually, hang on…</p>
<p>I saw a video clip the other day from a conference about using various virtual and game technologies for simulations and other real-world applications. Several people were talking about "avatar technology" and how theirs was better than their competitors' and whatnot.</p>
<p>Now, can you tell me what "avatar technology" is? Avatar technology is a red herring. Avatar technology is the same thing as calling a toaster a new "fire technology."</p>
<p><strong>Robert</strong><strong> Rice</strong><strong>: </strong> The problem is that a lot of people that don't have a clue about what they are doing are selling the tech to other people that have no clue what they are buying, but they feel like they should for some unknown reason.</p>
<p>That is happening all over the government, academic, and industrial sectors now, with a few companies selling virtual worlds (again, mid-90s tech) as the ultimate solution to all problems.</p>
<p>Anyway, getting back to your question…</p>
<p>Once virtual reality started getting some buzz, some people got greedy and jumped into the avatar/virtual world thing and tried making it commercial too soon. Half of the 3D chat worlds were being jammed into platforms for virtual shopping malls.</p>
<p>Most of the money funding tech R&amp;D started funneling towards VRML, and doing 3D in web pages, etc.</p>
<p><strong>Tish Shute: </strong>Yes, horrible idea trying to make web pages 3D, IMHO.</p>
<p><strong>Robert</strong><strong> Rice</strong><strong>: </strong> The money people got involved too soon, and then the greedy people jumped in and tried patenting everything possible. Take a look at the worlds.com patent for 3D worlds.</p>
<p>They filed it back in 2000 or so and it was awarded in '07 (it shouldn't have been, in my opinion). Now they are suing everyone they can.</p>
<p><strong>Tish Shute: </strong>Will there be patent wars in AR?</p>
<p><strong>Robert</strong><strong> Rice</strong><strong>:</strong> Yes, the AR patent wars will be legendary once people start waking up to the real potential here.</p>
<p>The only solution is for everyone to band together and pre-emptively patent or make public domain every possible patentable concept, technology, or implementation for AR. Otherwise, you haven't seen anything yet.</p>
<p><strong>Tish Shute:</strong> Is the AR community organized enough to do that yet?</p>
<p><strong>Robert</strong><strong> Rice</strong><strong>:</strong> That depends on how my company fares in the next six months.</p>
<p><strong>Tish Shute:</strong> Will you patent or make your tech public domain?</p>
<p><strong>Robert</strong><strong> Rice</strong><strong>:</strong> I plan on patenting the snot out of everything we can possibly think of, and then giving away our content creation tools and SDK stuff for free. The whole goal of what we are trying to build is to empower the end user and facilitate the creation of a wonderful world of augmented reality.</p>
<p>There are some things we will make public domain for sure, on top of that.</p>
<p><strong>Tish Shute:</strong> So back to my question on networked real-time experience. Will we have networked real-time AR experiences in the next 18 months?</p>
<p><strong>Robert</strong><strong> Rice</strong><strong>:</strong> It is possible, yes. Other than what we are doing, I am not aware of anyone else taking the same approach we are, but the potential for an "under the radar venture" (much like my own company) is definitely there.</p>
<p><strong>Tish Shute: </strong>Will you use cloud computing?</p>
<p><strong>Robert</strong><strong> Rice</strong><strong>: </strong>I think that's overrated and probably another attempt at the whole "thin client" model that some companies have been pushing for the last 20 years.</p>
<p>It sounds good on paper, but ultimately takes power and control away from the end user.</p>
<p><strong>Tish Shute:</strong> Cloud computing?</p>
<p><strong>Robert</strong><strong> Rice</strong><strong>: </strong>Yes. You know, we aren't playing around. We are totally building "THE AR" that everyone keeps dreaming about. None of this cute stuff you see on YouTube. Actually, if you want to see the things that have inspired our vision of what we want to build, check out:</p>
<p>* Dream Park by Larry Niven and Steven Barnes<br />
* Rainbows End by Vernor Vinge<br />
* Spook Country by William Gibson<br />
* Halting State by Charles Stross<br />
* The Diamond Age by Neal Stephenson<br />
* Donnerjack by Roger Zelazny and Jane Lindskold<br />
* Otherland by Tad Williams<br />
* Neuromancer by William Gibson<br />
* Idoru by William Gibson<br />
* Cryptonomicon by Neal Stephenson</p>
<p>and watch the whole anime of Denno Coil (subbed NOT dubbed!).</p>
<p><strong>Tish Shute:</strong> So scaling the real-time experience won't be a problem in your project, hehe.</p>
<p>'Cos no sharding is allowed in AR, right?</p>
<p>And if you have lots of API calls?</p>
<p><strong>Robert</strong><strong> Rice</strong><strong>:</strong> Haha, sharding is one of the dumbest things to happen to the VW/MMO industry.</p>
<p>It is a solution to a technical problem that was relevant 15 years ago.</p>
<p><strong>Tish Shute:</strong> So why did it stick? (I know, men in suits.)</p>
<p><strong>Robert</strong><strong> Rice</strong><strong>:</strong> It stuck because "that's what the other guys did," and the MMO designers are too lazy to reconcile gameplay for PvP and RP gamers.</p>
<p>However, there is a curious problem between dealing with "one world" and "anyone can start their own custom AR server."</p>
<p><strong>Tish Shute: </strong>Now that is a very interesting problem, the one world versus your own AR server.</p>
<p><strong>Robert</strong><strong> Rice</strong><strong>:</strong> It took me a few weeks of not sleeping to figure that one out. It gets back to the interoperability issue.</p>
<p><strong>Tish Shute:</strong> What did you come up with?</p>
<p><strong>Robert</strong><strong> Rice</strong><strong>:</strong> A solution. That's all I can say for now on that.</p>
<p><strong>Tish Shute</strong>: eeextra seeekrit!</p>
<p>Well, I will definitely have to bug you on that.</p>
<p>The problem has produced some creativity in OpenSim, with people coming up with hybrids of p2p and one-world.</p>
<p><strong>Robert</strong><strong> Rice</strong><strong>:</strong> As far as virtual worlds are concerned, they need to look at the problem from a different perspective. They are trying to make all existing virtual worlds interoperable instead of creating a new model for interoperable worlds that new ones will be created to adhere to.</p>
<p><strong>Tish Shute: </strong>Well, some people are. I would say most OpenSim developers see their modular approach doing this. And you choose to interoperate based on what modules you have activated and then social agreements…</p>
<p><strong>Robert</strong><strong> Rice</strong><strong>:</strong> Hrm, that's a start, but that only works on a functional and social level &#8211; it doesn't account for content (story, mythos, game rules), unique data (my +3 sword), or the concepts of commerce, inherent value, and intellectual property.</p>
<p>Enabling my WoW avatar to run around in SL and vice versa creates more problems than it solves.</p>
<p>It's like two alien races working hard to make sure that their two spaceships can dock, but no one is paying any attention to the fact that race A breathes nitrogen and race B breathes sulfur.</p>
<p>It's technically possible, but they are missing the boat on the content side of the problem.</p>
<p><strong>Tish Shute:</strong> Yes, but don't you think that when a modular open-source tech for virtual worlds becomes pervasive, those interested in a similar genre will increasingly use the modules in ways that allow their content to interoperate if they want it to?</p>
<p><strong>Robert</strong><strong> Rice</strong><strong>: </strong>Everyone has to use the same backend tech, and the front-end clients need to adhere to the same standards. But I have to admit, I haven't been paying much attention to the VW space in the last 9 months or so.</p>
<p>Oh, I have to run now. But download and install <a id="vsnt" title="cooliris" href="http://www.cooliris.com/" target="_blank">cooliris</a>. I promise you will be blown away and will start using it to search for images and videos.</p>
<p>It's frigging awesome.</p>
<p><strong>Tish Shute:</strong> Will do! Thanks so much, great talking to you. I can't wait for your launch.</p>
]]></content:encoded>
			<wfw:commentRss>http://www.ugotrade.com/2009/01/17/is-it-%e2%80%9comg-finally%e2%80%9d-for-augmented-reality-interview-with-robert-rice/feed/</wfw:commentRss>
		<slash:comments>27</slash:comments>
		</item>
	</channel>
</rss>
