<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>UgoTrade &#187; Virtual Meters</title>
	<atom:link href="http://www.ugotrade.com/category/instrumenting-the-world/virtual-meters/feed/" rel="self" type="application/rss+xml" />
	<link>http://www.ugotrade.com</link>
	<description>Augmented Realities at the Edge of the Network</description>
	<lastBuildDate>Wed, 25 May 2016 15:59:56 +0000</lastBuildDate>
	<language>en-US</language>
		<sy:updatePeriod>hourly</sy:updatePeriod>
		<sy:updateFrequency>1</sy:updateFrequency>
	<generator>https://wordpress.org/?v=3.9.40</generator>
	<item>
		<title>ARE is now AWE &#8211; Augmented World Expo!</title>
		<link>http://www.ugotrade.com/2012/12/20/are-is-now-awe-%e2%80%93-augmented-world-expo/</link>
		<comments>http://www.ugotrade.com/2012/12/20/are-is-now-awe-%e2%80%93-augmented-world-expo/#comments</comments>
		<pubDate>Thu, 20 Dec 2012 22:31:34 +0000</pubDate>
		<dc:creator><![CDATA[Tish Shute]]></dc:creator>
				<category><![CDATA[Ambient Devices]]></category>
		<category><![CDATA[Ambient Displays]]></category>
		<category><![CDATA[Ambient Findability]]></category>
		<category><![CDATA[Android]]></category>
		<category><![CDATA[architecture of participation]]></category>
		<category><![CDATA[Artificial general Intelligence]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Artificial Life]]></category>
		<category><![CDATA[Augmented Data]]></category>
		<category><![CDATA[Augmented Reality]]></category>
		<category><![CDATA[Big Data]]></category>
		<category><![CDATA[data science]]></category>
		<category><![CDATA[digital public space]]></category>
		<category><![CDATA[evolutionary technologies]]></category>
		<category><![CDATA[GeoFencing]]></category>
		<category><![CDATA[GeoMessaging]]></category>
		<category><![CDATA[gestural interface]]></category>
		<category><![CDATA[Hadoop]]></category>
		<category><![CDATA[home energy monitors]]></category>
		<category><![CDATA[Instrumenting the World]]></category>
		<category><![CDATA[internet of things]]></category>
		<category><![CDATA[ipad]]></category>
		<category><![CDATA[iphone]]></category>
		<category><![CDATA[Linked Data]]></category>
		<category><![CDATA[mirror worlds]]></category>
		<category><![CDATA[Mixed Reality]]></category>
		<category><![CDATA[mobile augmented reality]]></category>
		<category><![CDATA[mobile meets social]]></category>
		<category><![CDATA[Mobile Reality]]></category>
		<category><![CDATA[nanotechnology]]></category>
		<category><![CDATA[New Interfaces]]></category>
		<category><![CDATA[new urbanism]]></category>
		<category><![CDATA[NoSQL]]></category>
		<category><![CDATA[Open Data]]></category>
		<category><![CDATA[Rainbows End]]></category>
		<category><![CDATA[Real Time Big data]]></category>
		<category><![CDATA[Smart Planet]]></category>
		<category><![CDATA[social gaming]]></category>
		<category><![CDATA[social media]]></category>
		<category><![CDATA[technological singularity]]></category>
		<category><![CDATA[ubiquitous computing]]></category>
		<category><![CDATA[Virtual Meters]]></category>
		<category><![CDATA[Web Meets World]]></category>
		<category><![CDATA[3D printing]]></category>
		<category><![CDATA[AR eyewear]]></category>
		<category><![CDATA[augmented humans]]></category>
		<category><![CDATA[augmented perception]]></category>
		<category><![CDATA[augmented reality eyewear]]></category>
		<category><![CDATA[augmented world]]></category>
		<category><![CDATA[augmentedworldexpo]]></category>
		<category><![CDATA[bone mics]]></category>
		<category><![CDATA[brain computer interface]]></category>
		<category><![CDATA[digital fabrication]]></category>
		<category><![CDATA[DIO]]></category>
		<category><![CDATA[DIY]]></category>
		<category><![CDATA[fab at home]]></category>
		<category><![CDATA[fabbers]]></category>
		<category><![CDATA[gesture interfaces]]></category>
		<category><![CDATA[hacker spaces]]></category>
		<category><![CDATA[hardware]]></category>
		<category><![CDATA[location based services]]></category>
		<category><![CDATA[maker]]></category>
		<category><![CDATA[maker hardware]]></category>
		<category><![CDATA[mobile social computing]]></category>
		<category><![CDATA[neural interface]]></category>
		<category><![CDATA[perceptual computing]]></category>
		<category><![CDATA[pro-maker]]></category>
		<category><![CDATA[project glass]]></category>
		<category><![CDATA[robotics]]></category>
		<category><![CDATA[smart things]]></category>
		<category><![CDATA[spimes]]></category>
		<category><![CDATA[toys everywhere]]></category>
		<category><![CDATA[wearables]]></category>
		<category><![CDATA[World as a Platform]]></category>

		<guid isPermaLink="false">http://www.ugotrade.com/?p=6575</guid>
		<description><![CDATA[I&#8217;m really excited that we opened a call for proposals today for Augmented World Expo (registration opens in February!). Our edgy conference on augmented reality has morphed into the world&#8217;s first Expo about the augmented world. If you loved ARE you are going to find Augmented World Expo the most important event of 2013, and if you [&#8230;]]]></description>
				<content:encoded><![CDATA[<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2012/12/AWE2013.jpg"><img class="alignnone size-medium wp-image-6576" title="AWE2013" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2012/12/AWE2013-300x187.jpg" alt="" width="300" height="187" /></a></p>
<p>I&#8217;m really excited that we opened a call for proposals today for <a href="http://augmentedworldexpo.com/cfp/"><strong>Augmented World Expo</strong></a> (registration opens in February!). Our edgy conference on augmented reality has morphed into the world&#8217;s first Expo about the augmented world. If you loved ARE you are going to find <strong><a href="http://augmentedworldexpo.com/cfp/" target="_blank">Augmented World Expo</a></strong> the most important event of 2013, and if you never got a chance to attend before, register early to reserve your spot!</p>
<p>&#8220;The way we experience the world will never be the same. We no longer interact with computers. We interact with the world. A set of emerging technologies such as augmented reality, gesture interaction, eyewear, wearables, smart things, cloud computing, and ambient computing are completely changing the way we interact with people, places and things. These technologies create a digital layer that empowers humans to experience the world in a more advanced, engaging, and productive way.</p>
<p>Augmented World Expo will bring together the best in augmented experiences from all aspects of life: health, education, emergency response, art, media and entertainment, retail, manufacturing, brand engagement, travel, automotive, and urban design. It will be the largest ever exposition demonstrating how these technologies come together to change our lives and change the world.</p>
<p><strong>Registration will open in February.&#8221;</strong></p>
]]></content:encoded>
			<wfw:commentRss>http://www.ugotrade.com/2012/12/20/are-is-now-awe-%e2%80%93-augmented-world-expo/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>Games, Goggles, and Going Hollywood&#8230;How AR is Changing the Entertainment Landscape: Talking with Brian Selzer, Ogmento</title>
		<link>http://www.ugotrade.com/2009/08/30/games-goggles-and-going-hollywood-how-ar-is-changing-the-entertainment-landscape-talking-with-brian-selzer-ogmento/</link>
		<comments>http://www.ugotrade.com/2009/08/30/games-goggles-and-going-hollywood-how-ar-is-changing-the-entertainment-landscape-talking-with-brian-selzer-ogmento/#comments</comments>
		<pubDate>Mon, 31 Aug 2009 03:38:38 +0000</pubDate>
		<dc:creator><![CDATA[Tish Shute]]></dc:creator>
				<category><![CDATA[Ambient Devices]]></category>
		<category><![CDATA[Android]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Augmented Reality]]></category>
		<category><![CDATA[Carbon Footprint Reduction]]></category>
		<category><![CDATA[digital public space]]></category>
		<category><![CDATA[Ecological Intelligence]]></category>
		<category><![CDATA[home energy monitoring]]></category>
		<category><![CDATA[home energy monitors]]></category>
		<category><![CDATA[Instrumenting the World]]></category>
		<category><![CDATA[internet of things]]></category>
		<category><![CDATA[iphone]]></category>
		<category><![CDATA[mirror worlds]]></category>
		<category><![CDATA[Mixed Reality]]></category>
		<category><![CDATA[mobile augmented reality]]></category>
		<category><![CDATA[mobile meets social]]></category>
		<category><![CDATA[Mobile Reality]]></category>
		<category><![CDATA[nanotechnology]]></category>
		<category><![CDATA[new urbanism]]></category>
		<category><![CDATA[Smart Planet]]></category>
		<category><![CDATA[social gaming]]></category>
		<category><![CDATA[sustainable living]]></category>
		<category><![CDATA[ubiquitous computing]]></category>
		<category><![CDATA[Virtual Meters]]></category>
		<category><![CDATA[Web Meets World]]></category>
		<category><![CDATA[websquared]]></category>
		<category><![CDATA[World 2.0]]></category>
		<category><![CDATA[alternate reality RPG]]></category>
		<category><![CDATA[ambient intelligence]]></category>
		<category><![CDATA[AMEE]]></category>
		<category><![CDATA[AR Network]]></category>
		<category><![CDATA[AR spam]]></category>
		<category><![CDATA[ARBalloon]]></category>
		<category><![CDATA[ARN]]></category>
		<category><![CDATA[augmented reality baseball cards]]></category>
		<category><![CDATA[augmented reality development]]></category>
		<category><![CDATA[augmented reality eyewear]]></category>
		<category><![CDATA[augmented reality hotspots]]></category>
		<category><![CDATA[augmented reality industry]]></category>
		<category><![CDATA[augmented reality network]]></category>
		<category><![CDATA[augmented reality on the iphone]]></category>
		<category><![CDATA[augmented reality search]]></category>
		<category><![CDATA[augmented reality toys]]></category>
		<category><![CDATA[Blockade]]></category>
		<category><![CDATA[Brad Foxhoven]]></category>
		<category><![CDATA[Brian Selzer]]></category>
		<category><![CDATA[Bruce Sterling]]></category>
		<category><![CDATA[Cyberpunk]]></category>
		<category><![CDATA[Evolutionary Reality]]></category>
		<category><![CDATA[EyeToy]]></category>
		<category><![CDATA[eyewear for AR]]></category>
		<category><![CDATA[Games Alfresco]]></category>
		<category><![CDATA[Green Tech AR]]></category>
		<category><![CDATA[jim purbrick]]></category>
		<category><![CDATA[Kensuke Tanabe]]></category>
		<category><![CDATA[Layar]]></category>
		<category><![CDATA[Layar Developer Conference]]></category>
		<category><![CDATA[location based RPGs]]></category>
		<category><![CDATA[Lumus]]></category>
		<category><![CDATA[markerless AR]]></category>
		<category><![CDATA[markerless mobile augmented reality]]></category>
		<category><![CDATA[markerless natural feature tracking]]></category>
		<category><![CDATA[Masunaga]]></category>
		<category><![CDATA[Metroid]]></category>
		<category><![CDATA[Metroid Prime]]></category>
		<category><![CDATA[Mirrorshades]]></category>
		<category><![CDATA[multiperson mobile AR experiences]]></category>
		<category><![CDATA[Nano Air Vehicles]]></category>
		<category><![CDATA[near field object recognition]]></category>
		<category><![CDATA[new augmented reality trade jargon]]></category>
		<category><![CDATA[Ogmento]]></category>
		<category><![CDATA[Ori Inbar]]></category>
		<category><![CDATA[Pachube]]></category>
		<category><![CDATA[Pentagon's Robot Hummingbirds]]></category>
		<category><![CDATA[Project Natal]]></category>
		<category><![CDATA[Put a Spell]]></category>
		<category><![CDATA[Robert Rice]]></category>
		<category><![CDATA[Sekai camera]]></category>
		<category><![CDATA[social gaming platforms]]></category>
		<category><![CDATA[sticky light]]></category>
		<category><![CDATA[The Dawn of the Augmented Reality Industry]]></category>
		<category><![CDATA[Tonchidot]]></category>
		<category><![CDATA[Topps AR baseball cards]]></category>
		<category><![CDATA[Total Immersion]]></category>
		<category><![CDATA[Vuzix]]></category>
		<category><![CDATA[Wikitude]]></category>
		<category><![CDATA[Yoshio Sakamoto]]></category>

		<guid isPermaLink="false">http://www.ugotrade.com/?p=4334</guid>
		<description><![CDATA[Picture on the left Mirrorshades, picture on the right a Metroid Hud. &#8220;Augmented Reality is like a Philip K Dick novel torn off its paperback rack and blasted out of iPhones,&#8221; Bruce Sterling in Beyond the Beyond &#8220;a techno visionary dream come true &#8211; those are rare, really rare, you have to be patient, it&#8217;s [&#8230;]]]></description>
				<content:encoded><![CDATA[<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/08/mirrorshadespost3.jpg"><img class="alignnone size-full wp-image-4349" title="mirrorshadespost3" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/08/mirrorshadespost3.jpg" alt="mirrorshadespost3" width="124" height="204" /></a><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/08/metroid_hud1post2.jpg"><img class="alignnone size-medium wp-image-4350" title="metroid_hud1post" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/08/metroid_hud1post2-300x204.jpg" alt="metroid_hud1post" width="300" height="204" /></a></p>
<p><em>Picture on the left <a href="http://www.amazon.com/Mirrorshades-Cyberpunk-Anthology-Greg-Bear/dp/0441533825" target="_blank">Mirrorshades</a>, picture on the right a <a href="http://en.wikipedia.org/wiki/Metroid" target="_blank">Metroid Hud</a>.</em></p>
<p><strong>&#8220;Augmented Reality is like a Philip K Dick novel torn off its paperback rack and blasted out of iPhones,&#8221; <a href="http://www.wired.com/beyond_the_beyond/2009/08/the-key-take-aways-for-investors-interested-in-the-augmented-reality-field/" target="_blank">Bruce Sterling in Beyond the Beyond</a></strong></p>
<p><strong>&#8220;a techno visionary dream come true &#8211; those are rare, really rare, you have to be patient, it&#8217;s super cyberpunk&#8221;&#8230; Bruce Sterling, <a href="http://vimeo.com/6189763" target="_blank">&#8220;At the Dawn of the Augmented Reality Industry.&#8221;</a></strong></p>
<p>The Dawn of the Augmented Reality Industry continues to brighten, and now we have two augmented reality companies, <a href="http://www.t-immersion.com/" target="_blank">Total Immersion</a> and <a href="http://ogmento.com/" target="_blank">Ogmento</a>, firmly established in Hollywood &#8211; the dream mother of so many of our augmented realities.</p>
<p><a href="http://ogmento.com/" target="_blank">Ogmento</a> is the more recent of these two pioneering augmented reality companies to set up shop in LA. <a href="http://www.t-immersion.com/" target="_blank">Total Immersion&#8217;s</a> CEO Bruno Uzzan moved to LA from France two years ago, although he still has a fifty-person R&D team in France. Total Immersion began 10 years ago in the quiet, lonely hours before the dawn of an AR industry. But <a href="http://gamesalfresco.com/2009/07/23/mattel-launches-augmented-toys-at-comic-con/" target="_blank">Total Immersion&#8217;s AR toys for Mattel</a> and augmented reality for <a href="http://www.youtube.com/watch?v=I7jm-AsY0lU" target="_blank">Topps baseball cards</a> fired CNet writer Daniel Terdiman up enough to say, &#8220;I have seen the future of toys, and it is augmented reality&#8221; (<a href="http://news.cnet.com/8301-13772_3-10317117-52.html" target="_blank">see the full post on CNet</a>).</p>
<p>Recently, I talked with <a href="http://www.ugotrade.com/2009/07/28/augmented-realitys-growth-is-exponential-ogmento-reality-reinvented-talking-with-ori-inbar/" target="_blank">Ori Inbar, one of the founders of Ogmento</a> and of the premier augmented reality blog <a href="http://gamesalfresco.com/" target="_blank">Games Alfresco</a>, about his new venture in Hollywood. Bruce Sterling, <a href="http://twitter.com/bruces" target="_blank">@bruces</a>, had some fun with my invention of <a href="http://www.wired.com/beyond_the_beyond/2009/08/augmented-reality-ogmento/" target="_blank">brand new augmented reality trade jargon here</a>! Ori pointed out that Ogmento brings two important new facets to the rapidly growing augmented reality field: first, they are bringing leadership from veterans of the entertainment industry into augmented reality development. <a id="squu" title="Brad Foxhoven" href="http://www.blockade.com.nyud.net:8080/about/about-blockade" target="_blank">Brad Foxhoven</a> and <a id="odvk" title="Brian Selzer" href="http://brianselzer.com/">Brian Selzer</a> from <a id="xow_" title="Blockade" href="http://www.blockade.com/" target="_blank">Blockade</a> have partnered with Ori on Ogmento. And, in another important step forward for a young industry, Ogmento announced they will be acting as publishers for a fast-growing cohort of augmented reality application developers, helping AR development teams bring their concepts to market.</p>
<p>So I was very happy also to have the opportunity to talk with Brian Selzer. Bruce Sterling pointed out in his seminal <a href="http://eurekadejavu.blogspot.com/2009/08/augmented-realitys-sermon-on-flatlands.html" target="_blank">sermon from the flatlands</a> at the <a href="http://layar.com/" target="_blank">Layar</a> Developer Conference that AR is kind of a &#8220;Hollywood scene.&#8221; We have seen the web early adopter/developer/blogger community embrace augmented reality browser experiences in recent weeks in an awesome wave of enthusiasm. Are Hollywood creatives equally smitten? For the answers, see the full interview with Brian Selzer below.</p>
<p>Brian Selzer (<a href="http://brianselzer.com/" target="_blank">www.brianselzer.com</a> and <a href="http://twitter.com/brianse7en" target="_blank">twitter &#8211; brianse7en</a>) has extensive involvement with emerging platforms:</p>
<p><strong>&#8220;from launching dot com entertainment sites in the late &#8217;90s to creating early versions of social gaming platforms, or bringing big brands like Spider-Man and X-Men into the mobile space for the first time. Last year I was focused on bringing video game characters and worlds into the online space as UGC [user generated content] projects (<a href="http://www.mashade.com/" target="_blank">mashade.com</a>, <a href="http://www.instafilms.com/" target="_blank">instafilms.com</a>).&#8221;</strong></p>
<p>I began my own career in Hollywood doing motion control photography and creating software that bridged the language of robotics and servo motors with the visions of film directors. Eventually our little company, NPlus1, moved on to 3D vision systems and image recognition. So yes, I have been really, really patient waiting for this particular techno-visionary dream. And, while I have been waiting for augmented reality to manifest, I have grown to love the internet. But now, how awesome: <a href="../../2009/01/17/is-it-%E2%80%9Comg-finally%E2%80%9D-for-augmented-reality-interview-with-robert-rice/" target="_blank">it is OMG finally for mobile AR!</a></p>
<p>Augmented reality is busting out all over &#8211; through our laptops, our phones, on the streets, toys, baseball cards, art installations, <a href="http://www.youtube.com/watch?v=9noMfsg486Y" target="_blank">sticky light calligraphy</a> and more.</p>
<p>Many of my questions to Brian were directed at how and when we will see augmented realities with near field object recognition, image recognition and tracking and, of course, the elusive eyewear. As Bruce Sterling points out, we are just at the very, very beginning &#8211; the dawn of an industry. I created the photomontage below on the right to complement <em><a href="http://www.tonchidot.com/">Tonchidot&#8217;s</a></em> illustration suggesting the evolutionary inevitability of holding our phones up (below on the left). The Evolutionary Reality of AR will not end there. It is just a step to eyewear, hummingbirds or <a href="http://gizmodo.com/5306679/pentagons-robot-hummingbird-christened-nano-air-vehicle" target="_blank">Nano Air Vehicles</a>, and more&#8230;</p>
<h3>The Evolutionary Reality of AR</h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/08/Picture-96.png"><img class="alignnone size-medium wp-image-4359" title="Picture 96" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/08/Picture-96-300x97.png" alt="Picture 96" width="300" height="97" /></a></p>
<p><em>Cartoon on the left by <a href="http://www.tonchidot.com/">Tonchidot</a>; on the right, a collage of a stock photo and the <a href="http://gizmodo.com/5306679/pentagons-robot-hummingbird-christened-nano-air-vehicle" target="_blank">Pentagon&#8217;s robot hummingbirds &#8211; &#8220;Nano Air Vehicles&#8221;</a>.</em></p>
<p>We finally have, with the iPhone, an affordable mediating device with the horsepower, mindshare and business model to bring AR mainstream. The much anticipated Apple 3.1 Beta SDK to be released in September will not, I am sure, open up the Video API at the levels that augmented realities with near field object recognition and tracking require (I would love to be proved wrong though). But the magic wand to deliver even tightly registered AR graphics/media (which require a lot of CPU and GPU) to a wide audience is in our hands, so full access may not be far off. And others, of course, can/will/might knock the iPhone off its current pedestal. AR made its mobile phone debut on Android, after all.</p>
<p>Like everyone else who loves AR, I wish that Apple would open up faster (and I wish Android would manifest on some rocking hardware). But we will see enough of the iPhone Video API open for the next generation of mobile augmented reality games and applications to emerge in the coming months.</p>
<p>One of these will be Ogmento&#8217;s. Although Ogmento is in stealth mode, they have released <a href="http://www.youtube.com/watch?v=EB45O7-6Xrg&amp;eurl=http%3A%2F%2Fogmento.com%2F&amp;feature=player_embedded" target="_blank">a teaser for their first game, &#8220;Put A Spell,&#8221;</a> developed by ARBalloon &#8211; screenshot below. Ori did reveal to me in <a href="../../2009/07/28/augmented-realitys-growth-is-exponential-ogmento-reality-reinvented-talking-with-ori-inbar/" target="_blank">this interview</a> that they are doing image recognition and using the Imagination AR engine.</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/08/Picture-95.png"><img class="alignnone size-medium wp-image-4356" title="Picture 95" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/08/Picture-95-300x177.png" alt="Picture 95" width="300" height="177" /></a></p>
<p>As Brian notes, Hollywood has had the AR bug for a long time. AR has been everywhere in science fiction movies and video games. Nintendo&#8217;s SPD3 head Kensuke Tanabe, &#8220;effectively the man in charge of overseeing all the <em>Metroid</em> franchise underneath original co-creator Yoshio Sakamoto,&#8221; explains the story of <em>Metroid</em> to Brandon Boyer of <a href="http://www.offworld.com/2009/08/retro-effect-a-day-in-the-stud.html" target="_blank">Offworld here</a> (an image of a Metroid Hud is on the right at the opening of this post):</p>
<p><strong>&#8220;the idea of the different visors you use in the <em>Prime</em> games to interact with the world: the scan visor, for instance, set the game apart from other first person shooters in that the player was using it to proactively collect information from the world, rather than having the story come to them passively, in the form of cut-scenes or narration. &#8220;<em>Prime</em> could have adventure elements with the introduction of this visor,&#8221; says Tanabe, &#8220;That&#8217;s how we came up with the genre &#8212; first person adventure, instead of shooter.&#8221;</strong></p>
<p>But as Brian points out:</p>
<p><strong>&#8220;the light bulb has been lit and Hollywood is seeing that the software and hardware are here today to deliver these types of AR experiences in real life (to a lesser extent of course, but the path is getting clear).&#8221;</strong></p>
<h3>Talking with Brian Selzer</h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/08/me.jpg"><img class="alignnone size-full wp-image-4363" title="me" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/08/me.jpg" alt="me" width="188" height="227" /></a></p>
<p><strong>Tish Shute: </strong>Bruce Sterling&#8217;s sermon at the Layar Developer Conference, <a href="http://www.wired.com/beyond_the_beyond/2009/08/at-the-dawn-of-the-augmented-reality-industry/" target="_blank">&#8220;At the Dawn of the Augmented Reality Industry,&#8221;</a> was absolutely awesome. He spread the future feast/orgy of augmented reality before us &#8211; and described many of the dishes we will be tasting, both delectable and diabolical. One of the many things he points out is that AR is kind of a &#8220;Hollywood scene.&#8221; And, as Ogmento is one of only two augmented reality companies in Hollywood, I am interested to hear how it looks from your neck of the woods. We have seen the web early adopter/developer/blogger community embrace augmented reality browsers in recent weeks in an awesome wave of enthusiasm &#8211; are Hollywood creatives catching the buzz?</p>
<p><strong>Brian Selzer: It was a thrill to hear Bruce Sterling mention Ogmento. I devoured all of his Cyberpunk books back in the &#8217;80s, along with writers like Gibson, Rucker, Shirley&#8230; To me, sci-fi writers are the visionaries who define and influence our technological paths into the future. They make science and tech sexy enough to want to manifest those experiences in the real world. Clearly Bruce sees the AR industry as being sexy. I love that he called it &#8220;a techno-visionary dream come true&#8230; and super-cyberpunk.&#8221; And yes, kind of a Hollywood scene.</strong></p>
<p><strong>Hollywood creatives caught the AR bug before they knew what AR was. Look at science fiction movies and video games to see AR everywhere. Terminator, The Matrix, Minority Report, Iron Man&#8230; the list goes on. Look at any video game with an integrated heads-up display. It&#8217;s clear Hollywood loves AR. It&#8217;s only been in the past few months though that the light bulb has been lit and Hollywood is seeing that the software and hardware are here today to deliver these types of AR experiences in real life (to a lesser extent of course, but the path is getting clear). So yes, the buzz is here and it&#8217;s strong. With that, we all have to be prepared for the good, the bad and the ugly as AR goes mainstream.</strong></p>
<p><strong>It certainly goes to show how young this industry is when Ogmento and Total Immersion are currently the only AR companies based in Los Angeles. It&#8217;s very exciting to be the only company right now demonstrating a natural feature tracking (markerless) iPhone experience in Hollywood. We are in talks to bring some very big brands and properties to the mobile AR space. The goal is to deliver experiences that create added engagement and value for the consumer.</strong></p>
<p><strong>Tish Shute:</strong> Also in his landmark sermon, Bruce Sterling noted that augmented reality has been around for 17 years and now at last we are seeing the dawning of an augmented reality industry. What inspired you to take up the challenge of launching an augmented reality company in Hollywood? Oh, and congrats that Bruce Sterling name-checked Ogmento in his list of companies that prove this really is the dawn of an industry!</p>
<p><strong>Brian Selzer: I&#8217;ve always been involved in emerging platforms&#8230; from launching dot com entertainment sites in the late &#8217;90s to creating early versions of social gaming platforms, or bringing big brands like Spider-Man and X-Men into the mobile space for the first time. Last year I was focused on bringing video game characters and worlds into the online space as UGC projects (mashade.com, instafilms.com). Working with all these great CG game assets, I continued to think about what&#8217;s next, and that&#8217;s when I started to follow AR very closely and started engaging with those who were pioneering in the space.</strong></p>
<p><strong>I remember swapping instant messages with <a href="http://curiousraven.squarespace.com/" target="_blank">Robert Rice</a> (<a href="http://twitter.com/robertrice" target="_blank">@robertrice</a>) right after the 2008 Super Bowl. We were not chatting about the football game, but rather about some of the commercials that aired during the event as a sign that AR was making its way into the mainstream. A lot of people became aware of AR for the first time when the <a href="http://ge.ecomagination.com/smartgrid/" target="_blank">GE SmartGrid commercial</a> aired. There were all these YouTube videos popping up of people blowing on holographic wind turbines.</strong></p>
<p><strong>The commercial that really got me excited though was the <a href="http://www.youtube.com/watch?v=Kwke0LNardc" target="_blank">Coke Avatar commercial</a>. In that commercial, people in the city were sporadically being portrayed as their digital personas, avatars, gaming characters, etc. For me that spot did a great job showing how many of us already have these &#8216;alter egos&#8217; that live in cyberspace, and how the line between these worlds can sometimes be blurred. I remember watching that commercial and thinking that is exactly the type of experience I&#8217;d like to create with mobile AR. I want to overlap the virtual world into our everyday reality. Why can&#8217;t I bring my World of Warcraft or Second Life persona with me into the real world?</strong></p>
<p><strong>I am big on the notion of &#8220;Games and Goals.&#8221; I believe that games have the power to motivate people in a very powerful way. By challenging ourselves while playing a game, we can climb mountains. Augmented Reality is the perfect platform to bring gaming into the real world. By mixing the virtual world with the physical world, this added layer of perception provides a very powerful experience for something like a role-playing game.</strong></p>
<p><strong>One of my earlier social-gaming projects was a website called Superdudes. This was a &#8220;Be Your Own Superhero&#8221; concept that celebrated and motivated kids to create superhero avatars/personas online, and we gave members all sorts of games, challenges, and rewards, some of which carried into the real world. The site recognized members for teamwork, creativity, volunteer work and things like that. So the Superdudes were often involved in charity events and benefits to help children. Everybody called each other by their superhero names, and the line between fantasy and reality was being blurred. This project really got me thinking about what happens when you take positive role-playing like this and mix it into the real world. I started to work on a plan for location-based activist missions for points and rewards, but never got to complete that. So I have some unfinished business here.</strong></p>
<p><strong>I think it would be fantastic to be able to show up to some type of fun event with friends, and everybody could see each other&#8217;s alter-ego personas standing before them. When you can turn the world into a playground, and use the power of gaming to make a positive impact on the planet&#8230; well, I don&#8217;t think there is anything better than that. These are the types of projects that drive me, and I think AR is the best platform to support these types of social gaming experiences.</strong></p>
<p><strong>Tish:</strong> Does Ogmento have any RPGs under development? I noticed in the Google Wave on RPG someone has been working on doing something with the Dungeons&amp;Dragons API. I am interested in exploring the web of protocols underlying Wave as a transport mechanism for multi-person, mobile AR experiences (not requiring downloads) on an open global outdoor AR network. If not Wave, what do you see as the potential infrastructure and protocols we could harness for an open augmented reality network?</p>
<p><strong>Brian: Ogmento has a deep background in video games and we interact regularly with most of the major game publishers. As a company we are not so much developing our own RPGs right now, but rather exploring what mobile AR extensions make sense for existing brands. There are many limitations to location-based gaming, but a global AR network is exactly along the lines we are thinking. Lots of discussions are taking place on protocols, platforms, and APIs, and there are numerous ways to approach this. We need to be able to use what&#8217;s available now and continue to refine and customize for AR&#8217;s specific needs and issues as we progress.</strong></p>
<p><strong>In general though, Ogmento is focused on what types of experiences can be had today and over the next couple of years. I still think we are several years out from a truly open augmented reality network. We are certainly looking at launching our own &#8220;Ogmented Network,&#8221; which would support some fun treasure-hunt type experiences, or add an entertainment layer on top of traditional outdoor marketing campaigns.</strong></p>
<p><strong>Tish:</strong> I don&#8217;t know whether you have read Thomas Wrobel&#8217;s ideas for an open augmented reality network that I just <a href="http://www.ugotrade.com/2009/08/19/everything-everywhere-thomas-wrobels-proposal-for-an-open-augmented-reality-network/" target="_blank">published here on UgoTrade</a>. The principles he talks about are very important for augmented reality to become a major part of our lives. Considering the difficulty open networks can pose for emerging business models, how can we fund the development of an open framework for augmented reality?</p>
<p><em>&#8220;a future AR Network, I mean one as universal and as standard as the internet. One where people can connect from any number of devices, and without additional downloads, experience the majority of the content.<br />
Where people can just point their phone, webcam, or pair of AR glasses anywhere where a virtual object should be, and they will see it. The user experience is seamless; AR comes to them without them needing to &#8220;prepare&#8221; their device for it.&#8221;</em></p>
<p><strong>Brian: I think funding for these types of projects will definitely come from venture capital groups in the near future. It&#8217;s early in AR, but the VCs are watching and deciding which horses to bet on. Until that time, it&#8217;s about service work, and developing AR experiences for others with what is possible today. That work will help fund internal development of original AR products and platform development.</strong></p>
<p><strong>Tish:</strong> How did you get started with Ogmento?</p>
<p><strong>Brian: My first conversation with Ori was actually about my interest in location-based RPG concepts. We had a long conversation about the possibilities with AR, and it was clear that we shared similar interests, but were coming from different, complementary backgrounds. The idea of collaboration was exciting, so we just kept talking until the timing felt right. Now, with Ogmento, we bring a unique blend of AR development experience with deep backgrounds in AR technology, animation, video games, entertainment, social media, etc. I think this is a powerful mix that will allow us to do some great things.</strong></p>
<p><strong>It&#8217;s still so early, and things are just getting started in AR. There are only so many webcam magic tricks you can enjoy before you are ready for something else. The location-based apps have the most potential in my opinion, which is why we are really focused on mobile AR. We have some board-game type projects, which do not instantly scream location-based gaming, but if you look at something like the ARhrrr board game, you can see how much more compelling it can be when the game invites the player to be actively moving around during the experience.</strong></p>
<p><strong>Tish:</strong> I am interested in your perspective on how we can create the kind of AR experiences that really embody what has always been so exciting about AR &#8211; the tight alignment of graphics and media with real-world objects and ultimately a rich immersive 3D experience. So I am going to hit you with a bunch of those &#8220;Is this really eyewear or vaporware?&#8221; questions. The real-deal eyewear changes everything!</p>
<p>While eyewear is a big challenge technically and aesthetically, I am pretty sure that there are several outfits out there that can pull off the optics and projection. Will the entertainment industry get excited enough to put a major push into delivering the eyewear in short order, instead of the 5-to-10-year project that some people still think it is? Is the business development challenge perhaps bigger than the technical obstacles? What is your view on this?</p>
<p>And, perhaps, the eyewear is a clear example of a need for partnerships. For example, we have seen efforts from companies like <a href="http://www.vuzix.com/home/index.html" target="_blank">Vuzix</a> and <a href="http://www.lumus-optical.com/" target="_blank">Lumus</a>, and recently a Japanese company, <a href="http://www.masunaga1905.jp/brand/teleglass/">Masunaga</a>.</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/08/Picture-97.png"><img class="alignnone size-medium wp-image-4386" title="Picture 97" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/08/Picture-97-300x80.png" alt="Picture 97" width="300" height="80" /></a></p>
<p>I have no reports from people who have tried the Masunaga eyewear yet. But, limited by small fields of view and tethered designs, current eyewear offerings available at a reasonable price point are not workable solutions for augmented reality experiences. But the problems are not insurmountable. What will facilitate the real deal? It seems critical to start creating hardware relationships now. The industry is costly and slow-moving, and as Robert Rice put it to me in a recent conversation, &#8220;once the software cat is out of the bag, it&#8217;s going to go wild, and if the hardware isn&#8217;t there, it&#8217;s going to stutter.&#8221;</p>
<p>As Ori notes, some of the hardware companies like Intel and others don&#8217;t seem to be paying enough attention to AR. Ori points out they don&#8217;t see the demand yet. But in order to create an awesome AR experience and demand from a mass audience, don&#8217;t we need to work in conjunction with hardware designers?</p>
<p><strong>Brian: It&#8217;s fun to think about who will eventually deliver a great hardware solution for AR glasses. It will happen. It would be cool to see somebody like an Oakley or Nike partnered up with a company like Vuzix to deliver something people actually might wear in public. Perhaps a hardware manufacturer like Apple or Nokia will bring us something like the iSight or the NGaze down the line. I&#8217;d love to see a set of glasses designed by Ideo. Microsoft and Sony are already playing with technologies like Project Natal and the EyeToy, so I think it&#8217;s only a matter of time before they deliver an eyewear solution. I would even look to the toy companies to eventually make an investment here.</strong></p>
<p><strong>Gamers will be the early adopters, and in a few years we may start to see people running around in the park wearing glasses with headsets, but it will be acceptable because it&#8217;s clear they are using them for a game. It&#8217;s going to take a very sexy and stylish piece of hardware for everyday people to be willing to wear AR glasses in public while going about their everyday business. It&#8217;s like the recent cover of Wired magazine where Brad Pitt is wearing a mobile headset in his ear, and the editors point out that even he can&#8217;t pull that look off, so why do you think you can? When AR glasses come in designer frames, and you can&#8217;t tell them from non-AR glasses, to me that&#8217;s when things get really interesting from a mass-adoption perspective. Compare how many people were carrying around a mobile phone in the 80s to now. I think it will be the same thing with glasses.</strong></p>
<p><strong>I was in an AR pitch meeting the other week at a very significant media company, and brought up the point that today&#8217;s handheld smartphones will eventually evolve into tomorrow&#8217;s smartglasses. My comment was quickly shrugged off as a sort of sci-fi notion that was irrelevant to the business at hand. Probably true, but I think it is important to understand where digital media and entertainment are going, so you can adapt quickly, and evolve into those spaces more naturally. The more we see people walking around with their smartphones in front of their faces (like a camera), the sooner we will make the jump to eyeglasses as a key hardware device for AR experiences.</strong></p>
<p><strong>At Ogmento, we definitely are working on AR experiences with the hardware and software available today. We will get some product out this year, and 2010 will be a banner year for markerless mobile AR in general. I think the entire AR community is looking forward to bringing this technology to the mainstream in the form of games, marketing campaigns, virtual docent apps, and much more. It might not be the full experience we are all dreaming about for some time, but we can see the path and the true potential, and it&#8217;s pretty spectacular.</strong></p>
<p><strong>You mention the tight alignment of graphics and media with real-world objects. That is really our focus. A lot of well-deserved attention is going to the browser-overlay &#8220;post-it&#8221; approach right now, which uses compass and GPS. We are focused on markerless natural feature tracking, so once you identify something that is AR-enhanced in your environment, you can interact with that integrated experience. On an iPhone that can be as simple as using your touch screen to interact. When you are wearing glasses, it becomes more about visual tracking. There are lots of smart people thinking through these issues, many of whom you have interviewed. It is my hope that there are exciting collaborative efforts to be had in the coming months to get us all there together and faster.</strong></p>
<p><strong>Tish:</strong> Bruce touched on some of the hard problems that have to be solved for augmented reality &#8211; he noted, for instance, that security needs to be tackled in the early stages. Robert made a nice list: <em>&#8220;privacy, media persistence, spam, creating UI conventions, security, tagging and annotation standards, contextual search, intelligent agents, seamless integration and access of external sensors or data sources, telecom fragmentation, privilege and trust systems, and a variety of others.&#8221;</em> Will Ogmento be leading the way in solving some of these hard problems?</p>
<p>And won&#8217;t trying to solve these hard problems for networked AR in walled-garden scenarios, one company at a time, lead to a lot of wasted energy reinventing the wheel?</p>
<p><strong>Brian: These are all important issues, and again there are a lot of smart people thinking about solutions to these problems on a daily basis. Ogmento is interested in partnering with developers and supporting their efforts as a publisher of mobile AR experiences. While we intend to roll up our sleeves in these areas, we are currently more focused on taking AR mainstream with the hardware and software available today. As the industry evolves, so will Ogmento. As the opportunities evolve, our ability to make a greater impact tackling these issues will be realized.</strong></p>
<p><strong>Tish: </strong>Another area of development that could really kick AR into high gear might be creating augmented reality hotspots, where we can deliver the kind of location accuracy/instrumentation necessary to create interesting AR experiences (a partnership with Starbucks, perhaps?!). Augmented reality hotspots could deliver the kind of high-quality AR experience that isn&#8217;t possible ubiquitously at the moment, and may be a real way to get people exploring the potential of AR now, rather than later?</p>
<p><strong>Brian: Agreed. I see a great opportunity here with this approach.</strong></p>
<p><strong>Tish:</strong> Although there are many obstacles to Green AR &#8211; the energy-hogging servers at the backend for starters! Last week I had a conversation with Gavin Starks of <a href="http://www.amee.com/?page_id=289" target="_blank">AMEE</a>, and <a href="http://curiousraven.squarespace.com/" target="_blank">Robert Rice</a> and <a href="http://jimpurbrick.com/" target="_blank">Jim Purbrick</a>, about how to work with AMEE and the technology available and encourage Green Tech AR development (<a href="http://blog.pachube.com/2009/06/pachube-augmented-reality-demo-with.html" target="_blank">see an early exploration of green tech AR from Pachube here</a>).</p>
<p>We came up with the idea of holding a competition, perhaps centered around a targeted instrumented space. But I would really love to hear your thoughts on the topic of Green Tech AR (the energy-hogging servers at the back end being the first cloud on the horizon!). Cool GreenTech AR imaginings, social gaming ideas, RPGs &#8211; not even necessarily tied to the immediately practical &#8211; would be like rain in a drought!</p>
<p><strong>Brian: I go back to &#8220;Games and Goals&#8221;&#8230; If you make environmental and other activist efforts fun and rewarding, more people are likely to be motivated and participate. Can you imagine having a personal &#8220;carbon footprint stat&#8221; floating over yourself at all times? Or over your home or factory? How would that change your behavior? We all love stats. Look at how the Nike+ campaign has used technology and gaming to motivate people to run. I think there is a lot that can be done to make being green fun. It starts with the individual, and spreads from there. Keep me posted on that one!</strong></p>
<p><strong>Tish:</strong> I would also like to explore further the <a href="http://www.readwriteweb.com/archives/augmented_reality_human_interface_for_ambient_intelligence.php" target="_blank">RRW suggestion that ambient intelligence is both the Holy Grail of AR and possibly snake oil</a>:</p>
<p><em>&#8220;The holy grail of the mobile AR industry is to find a way to deliver the right information to a user before the user needs it, and without the user having to search for it. This holy grail is likely in a ditch somewhere beside a well-traveled road in the district of the semantic Web, ambient intelligence and the Internet of things. Be wary of any hyped-up invitation to invest in a company that claims to have gotten the opportunity right. What we&#8217;ve seen in the commercial industry to date is a rather complex version of a keyboard, mouse, and monitor.&#8221;</em></p>
<p>So Holy Grail, Snake Oil, or a ditch somewhere&#8230;?</p>
<p><strong>Brian: I instantly think of Minority Report, where Tom Cruise&#8217;s character is being bombarded with holographic ads personalized with his name and to his current situation. In the future, spam is a nightmare, especially when it knows who you are. I think the key thing here is delivering &#8220;the right information,&#8221; and we still don&#8217;t have that down. I do see a day when we can truly customize what comes to us, how we want it, when we want it. My future vision of ambient intelligence is the ability to &#8220;turn everything off&#8221; if I want to&#8230; block out the stimuli and replace it with images of nature, or natural surroundings, etc. Where I live in Los Angeles, we have those digital billboards everywhere, so it&#8217;s like advertising overload wherever you look (hints of Blade Runner). I personally don&#8217;t mind them, but I know there is great debate about there being simply too many billboards everywhere. So AR would only add to the noise of life by adding yet another digital overlay of information, right?</strong></p>
<p><strong>Perhaps the holy grail is to use technology to filter things out. AR might become a solution to leading a simpler life, or a perfectly customized life if you want that. Ultimately the control needs to be with the individual. Â I guess I am talking about something like TiVo taken to the extreme.</strong></p>
<p><strong>Tish:</strong> And then that other biggy &#8211; augmented reality search! I am asking this next question of <a href="http://www.wikitude.org/" target="_blank">Wikitude</a> and <a href="http://sekaicamera.com/" target="_blank">Sekai Camera</a> too, and now I must also ask <a href="http://www.acrossair.com/" target="_blank">Acrossair</a> and several others, I guess! Obviously a huge area of opportunity in this broader landscape that uses location-awareness, barcode scanners, image recognition and augmented reality is to harness the collective intelligence &#8211; a whole new field of search. There is the beginning of a discussion on this <a href="http://www.ugotrade.com/2009/08/19/everything-everywhere-thomas-wrobels-proposal-for-an-open-augmented-reality-network/" target="_blank">in the comments here</a>.</p>
<p>What will it take, in your view, to become a leader in augmented reality search?</p>
<p><strong>Brian: I&#8217;m more of a content guy, so I tend to focus on things like UI, quality of creative, etc. From that perspective, I am looking forward to evolving beyond the &#8220;post-it&#8221; text-overlay user experience we see now in AR search. I was impressed with the TAT Augmented ID concept and hope we start seeing more smart design solutions like that emerging in the space. There are some great new design approaches coming out of the location-aware space that should be applied to AR search. I&#8217;ve been studying the heads-up display designs used in video games, and re-watching movies like Iron Man for ideas. This is another example where Hollywood has painted a polished picture of what AR can and should look like, and the masses have already accepted these design approaches. So from my view, the leaders in search will be delivering sexy, smart and simple solutions. It&#8217;s all about the S&#8217;s.</strong></p>
]]></content:encoded>
			<wfw:commentRss>http://www.ugotrade.com/2009/08/30/games-goggles-and-going-hollywood-how-ar-is-changing-the-entertainment-landscape-talking-with-brian-selzer-ogmento/feed/</wfw:commentRss>
		<slash:comments>7</slash:comments>
		</item>
		<item>
		<title>Composing Reality and Bringing Games into Life: Talking with Ori Inbar about Mobile Augmented Reality</title>
		<link>http://www.ugotrade.com/2009/05/06/composing-reality-and-bringing-games-into-life-talking-with-ori-inbar-about-mobile-augmented-reality/</link>
		<comments>http://www.ugotrade.com/2009/05/06/composing-reality-and-bringing-games-into-life-talking-with-ori-inbar-about-mobile-augmented-reality/#comments</comments>
		<pubDate>Wed, 06 May 2009 14:50:30 +0000</pubDate>
		<dc:creator><![CDATA[Tish Shute]]></dc:creator>
				<category><![CDATA[Ambient Devices]]></category>
		<category><![CDATA[Ambient Displays]]></category>
		<category><![CDATA[architecture of participation]]></category>
		<category><![CDATA[Augmented Reality]]></category>
		<category><![CDATA[Carbon Footprint Reduction]]></category>
		<category><![CDATA[CurrentCost]]></category>
		<category><![CDATA[digital public space]]></category>
		<category><![CDATA[Ecological Intelligence]]></category>
		<category><![CDATA[Energy Awareness]]></category>
		<category><![CDATA[Energy Saving]]></category>
		<category><![CDATA[home automation]]></category>
		<category><![CDATA[home energy monitoring]]></category>
		<category><![CDATA[home energy monitors]]></category>
		<category><![CDATA[HomeCamp]]></category>
		<category><![CDATA[Instrumenting the World]]></category>
		<category><![CDATA[internet of things]]></category>
		<category><![CDATA[Kids With Cameras]]></category>
		<category><![CDATA[Mixed Reality]]></category>
		<category><![CDATA[MMOGs]]></category>
		<category><![CDATA[mobile meets social]]></category>
		<category><![CDATA[Mobile Technology]]></category>
		<category><![CDATA[new urbanism]]></category>
		<category><![CDATA[open source]]></category>
		<category><![CDATA[Participatory Culture]]></category>
		<category><![CDATA[smart appliances]]></category>
		<category><![CDATA[Smart Devices]]></category>
		<category><![CDATA[Smart Planet]]></category>
		<category><![CDATA[social gaming]]></category>
		<category><![CDATA[social media]]></category>
		<category><![CDATA[sustainable living]]></category>
		<category><![CDATA[sustainable mobility]]></category>
		<category><![CDATA[ubiquitous computing]]></category>
		<category><![CDATA[Virtual Meters]]></category>
		<category><![CDATA[Virtual Realities]]></category>
		<category><![CDATA[Web 2.0]]></category>
		<category><![CDATA[Web Meets World]]></category>
		<category><![CDATA[World 2.0]]></category>
		<category><![CDATA[Adam Greenfield]]></category>
		<category><![CDATA[alternate reality games]]></category>
		<category><![CDATA[alternative reality gaming]]></category>
		<category><![CDATA[AMEE]]></category>
		<category><![CDATA[AR]]></category>
		<category><![CDATA[AR eyewear]]></category>
		<category><![CDATA[AR goggles]]></category>
		<category><![CDATA[ARToolkit]]></category>
		<category><![CDATA[augmented reality games]]></category>
		<category><![CDATA[augmented times]]></category>
		<category><![CDATA[Better Place]]></category>
		<category><![CDATA[Blair Macintyre]]></category>
		<category><![CDATA[Bruce Sterling]]></category>
		<category><![CDATA[Caryatids]]></category>
		<category><![CDATA[Come Out and Play]]></category>
		<category><![CDATA[composing reality]]></category>
		<category><![CDATA[Cory Doctorow]]></category>
		<category><![CDATA[eyewear for augmented reality]]></category>
		<category><![CDATA[game development conference]]></category>
		<category><![CDATA[Games Alfresco]]></category>
		<category><![CDATA[games for preschoolers on the iphone]]></category>
		<category><![CDATA[games on the iphone]]></category>
		<category><![CDATA[GDC 2009]]></category>
		<category><![CDATA[GE augmented reality ad]]></category>
		<category><![CDATA[google earth]]></category>
		<category><![CDATA[green technology]]></category>
		<category><![CDATA[image recognition]]></category>
		<category><![CDATA[Immersive augmented reality]]></category>
		<category><![CDATA[Int 13]]></category>
		<category><![CDATA[iphone]]></category>
		<category><![CDATA[iphone games]]></category>
		<category><![CDATA[iPhone OS 3]]></category>
		<category><![CDATA[iphone versus the android]]></category>
		<category><![CDATA[ISMAR]]></category>
		<category><![CDATA[ISMAR 2009]]></category>
		<category><![CDATA[jane mcgonigal]]></category>
		<category><![CDATA[julian Bleeker]]></category>
		<category><![CDATA[Kati London]]></category>
		<category><![CDATA[Kweekies]]></category>
		<category><![CDATA[Loopt]]></category>
		<category><![CDATA[markerless AR]]></category>
		<category><![CDATA[markerless augmented reality]]></category>
		<category><![CDATA[Microsoft Tag]]></category>
		<category><![CDATA[mobile augmented reality]]></category>
		<category><![CDATA[mobile gaming]]></category>
		<category><![CDATA[Mobile Reality]]></category>
		<category><![CDATA[Netweaver]]></category>
		<category><![CDATA[open source augmented reality]]></category>
		<category><![CDATA[Ori Inbar]]></category>
		<category><![CDATA[Pookatak]]></category>
		<category><![CDATA[Pookatak Games]]></category>
		<category><![CDATA[reality experiences]]></category>
		<category><![CDATA[RFID]]></category>
		<category><![CDATA[Robert Rice]]></category>
		<category><![CDATA[Rouli Nir]]></category>
		<category><![CDATA[sensor networks]]></category>
		<category><![CDATA[Shai Agassi]]></category>
		<category><![CDATA[smart environments]]></category>
		<category><![CDATA[smart objects]]></category>
		<category><![CDATA[The End of Hardware]]></category>
		<category><![CDATA[the Pong for augmented reality]]></category>
		<category><![CDATA[the shape of alpha]]></category>
		<category><![CDATA[Tish Shute]]></category>
		<category><![CDATA[Tonchidot]]></category>
		<category><![CDATA[ubicomp]]></category>
		<category><![CDATA[ubiquitous augmented reality]]></category>
		<category><![CDATA[ubiquitous experience]]></category>
		<category><![CDATA[virtual reality]]></category>
		<category><![CDATA[WARM 09]]></category>
		<category><![CDATA[Wattzon]]></category>
		<category><![CDATA[Where 2.0]]></category>
		<category><![CDATA[WikiMouse]]></category>
		<category><![CDATA[Wikitude]]></category>

		<guid isPermaLink="false">http://www.ugotrade.com/?p=3448</guid>
		<description><![CDATA[Recently, I talked to Ori Inbar (above), formerly senior vice-president at SAP. Ori is on a mission to make augmented reality commercially successful not in 5, 10, or 15 years, but now. Ori is the founder of Pookatak Games &#8211; a video game company, &#8220;with a vision to upgrade the way people experience the [&#8230;]]]></description>
				<content:encoded><![CDATA[<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/05/oriinbarpost.jpg"><img class="alignnone size-medium wp-image-3449" title="oriinbarpost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/05/oriinbarpost-300x199.jpg" alt="oriinbarpost" width="300" height="199" /></a></p>
<p>Recently, I talked to <a href="http://gamesalfresco.com/">Ori Inbar</a> (above), formerly senior vice-president at <a href="http://www.sap.com/">SAP</a>. Ori is on a mission to make augmented reality commercially successful not in 5, 10, or 15 years, but now. Ori is the founder of <a href="http://gamesalfresco.com/about/" target="_blank">Pookatak Games</a> &#8211; a video game company <strong>&#8220;with a vision to upgrade the way people experience the world.&#8221;</strong> Ori will be participating May 20th in <a href="http://en.oreilly.com/where2009/public/schedule/detail/7197" target="_blank">O&#8217;Reilly&#8217;s Where 2.0 panel, &#8220;Mobile Reality&#8221;</a> &#8211; an event not to be missed, IMO.</p>
<p>The taste for computing anywhere, anytime has entered human culture via the iPhone and is spreading like chocolate cake and pizza at a preschool party (see <a href="http://gamesalfresco.com/2009/03/23/gdc-2009-why-the-iphone-just-changed-everything/" target="_self">why the iPhone just changed everything</a>). And while the full flowering of the next step is yet to come &#8211; computing anywhere, anytime, by anyone and <strong>anything</strong> (<a href="http://en.wikipedia.org/wiki/Internet_of_Things" target="_blank">&#8220;the internet of things&#8221;</a>) &#8211; our love for these first devices capable of being <strong>mediating artifacts for ubiquitous computing</strong> (Adam Greenfield) is a vital first step to free us from our tethers to computer screens, and fulfill the promise of augmented reality.</p>
<p>If you need more convincing on the pivotal role augmented reality will play as the web moves into the world, check out Tim O&#8217;Reilly&#8217;s recent comments in <a id="iz1_" title="this video clip on Augmented Times" href="http://artimes.rouli.net/2009/04/tim-oreilly-on-recognition-rfid-and-web.html" target="_blank">this video clip posted on Augmented Times</a> and <a id="wtf4" title="here" href="http://radar.oreilly.com/2008/02/augmented-reality-a-practical.html" target="_blank">here</a> early last year.</p>
<p>From another perspective, the gloomy specter of economic and environmental catastrophe is driving a movement to &#8220;<a id="h5pf" title="infuse intelligence into the way the world works" href="http://news.bbc.co.uk/2/hi/technology/7992480.stm" target="_blank">infuse intelligence into the way the world works.&#8221;</a> But the challenge for a smart planet is not just about making environments smart; it is about using smart environments to enable people to act smarter (<a href="http://www.ugotrade.com/2009/02/27/towards-a-newer-urbanism-talking-cities-networks-and-publics-with-adam-greenfield/" target="_blank">see my interview with Adam Greenfield</a>).</p>
<p>We need a rapid upgrade in both the way the world works, and the way we experience the world.</p>
<p>(Note: It is time to read, if you haven&#8217;t already, <a href="http://search.barnesandnoble.com/The-Caryatids/Bruce-Sterling/e/9780345460622" target="_blank">Bruce Sterling&#8217;s Caryatids</a> (Cory Doctorow&#8217;s book of the year for 2009) &#8220;as a software design manual&#8221; (<a href="http://www.nearfuturelaboratory.com/2009/03/17/design-fiction-a-short-essay-on-design-science-fact-and-fiction/" target="_blank">see Julian Bleecker</a>), because Caryatids reveals the Gordian knots of human folly, greed, compassion and desire entwined in near-future designs for technologies to save the world.)</p>
<p>Ori Inbar worked with Shai Agassi (Shai is now leading the world-changing <a id="v5ow" title="Better Place" href="http://www.betterplace.com/" target="_blank">Better Place</a>), driving <a id="gf_5" title="Netweaver" href="http://en.wikipedia.org/wiki/NetWeaver" target="_blank">Netweaver</a> from a mere concept to a &#8220;major, major business for SAP.&#8221; So Ori has already been through the cycle of working in a very small startup and growing it into a billion-dollar business. He has both the experience and the passion to realize his vision for augmented reality.</p>
<p>At Pookatak, he explains:</p>
<p><strong>&#8220;We design &#8216;reality experiences&#8217; that make users&#8217; immediate environments more significant to them. We wish to free young and old from getting lost in front of the screen. By delivering the world&#8217;s information to people&#8217;s field of view, and by weaving real world objects into interactive narratives, we help people rediscover the real world.&#8221;</strong></p>
<p>Pookatak will release their first game this summer. Currently it is under wraps. But Ori gives us some glimpses of what is to come in the interview below.</p>
<p>In addition to founding Pookatak, Ori is involved in a broader effort to move augmented reality forward. On his blog, <a id="ie5s" title="Games Alfresco" href="http://gamesalfresco.com/" target="_blank">Games Alfresco</a> &#8211; where he recently welcomed <a href="http://gamesalfresco.com/about/" target="_blank">a new partner, Rouli Nir</a> &#8211; Ori has focused his eye of wisdom on every significant recent advance in augmented reality (check out <a id="zr9y" title="this essence of Ori's thinking in a fast paced video" href="http://gamesalfresco.com/2009/03/09/augmented-reality-today-ori-inbar-speaks-at-warm-2009/" target="_blank">this essence of Ori&#8217;s thinking in a fast-paced video</a> presentation for <a href="http://gamesalfresco.com/2009/02/12/live-from-warm-09-the-worlds-best-winter-augmented-reality-event/" target="_blank">WARM &#8216;09</a>).</p>
<p>Ori is also one of the organizers of the interactive media track at <a id="b-c6" title="ISMAR 2009" href="http://www.ismar09.org/" target="_blank">ISMAR 2009</a>. At ISMAR this year, Ori explained, <strong>&#8220;we are trying to bring in people that develop interactive experiences for consumers, beyond the traditional attendees coming from a research perspective.&#8221;</strong></p>
<p>In the interview below, Ori explains much of his thinking on how augmented reality will become commercially successful. Enjoy it, think about it, and share it. And most importantly, if you can, get involved with ISMAR 2009.</p>
<p>Ori has inspired me to participate in <a id="seky" title="ISMAR" href="http://www.ismar09.org/" target="_blank">ISMAR</a> this year. Ori pointed out:</p>
<p><strong>The </strong> <a href="http://campwww.informatik.tu-muenchen.de/ismar09/lib/exe/fetch.php?id=ismar09%253Astart&amp;cache=cache&amp;media=ismar09:ismar09-cfp_090211_final.pdf" target="_blank">call for papers</a> <strong>is on, and this year it targets well beyond the typical research papers audience and into interactive media and art folks. </strong></p>
<p><strong>There are plenty of opportunities such as:</strong></p>
<p><strong>Art Gallery</strong></p>
<p><strong>Demonstrations</strong></p>
<p><strong>Tutorial</strong></p>
<p><strong>Workshops</strong></p>
<p>It&#8217;s a huge opportunity to shape the emergence of augmented reality.</p>
<h2><strong>Interview With Ori Inbar</strong></h2>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/05/picture-41.png"><img class="alignnone size-full wp-image-3479" title="picture-41" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/05/picture-41.png" alt="picture-41" width="107" height="146" /></a></p>
<h3>Making Augmented Reality Commercially Successful</h3>
<p><strong>Tish Shute: </strong>You are considered a key trail blazer in AR, and you have the go-to blog for augmented reality! What are the most important lessons you have learned researching, writing, and developing AR in the last couple of years?</p>
<p><strong>Ori Inbar: You need to have a vision. You need to know where this is going to go in ten or fifteen or twenty years. But you&#8217;ve got to start with something really simple that makes use of the technology you have on hand. And do something that is practical, that people will like, and something they would actually want to buy. It&#8217;s as simple as that. I&#8217;m currently looking at what we could do with existing technology. First of all, you have to put it in front of people. Right now most people have never heard of the term augmented reality. Go into the street, and ask 100 people about it, maybe 2 would know about it. So you need to put it in front of people because most people think it&#8217;s still science fiction or a special effect you see in movies, not something you can experience in real life. </strong></p>
<p><strong>Tish: </strong>It seems to me that for augmented reality applications to become popular with existing technology, the key breakthrough would be getting people to hold up their phones. What are the obstacles to getting people to use their mobile devices like this?</p>
<p><strong>Ori: There&#8217;s a really nice cartoon by <a href="http://www.tonchidot.com/">Tonchidot</a> (below) &#8211; the Japanese company behind the Sekai Camera. It&#8217;s an illustration showing the evolution of man, from ape to man (holding a cell phone, looking down), to the developed man holding a device like a camera &#8211; in front of his eyes.</strong></p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/05/picture-37.png"><img class="alignnone size-medium wp-image-3454" title="picture-37" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/05/picture-37-300x221.png" alt="picture-37" width="300" height="221" /></a></p>
<p><strong>Which is exactly what you&#8217;re talking about. People ask, &#8220;are people going to walk with this like that all day long?&#8221; Probably not. I mean you have to build it in a way that doesn&#8217;t require them to hold it like that all the time. People are used to this gesture with the ubiquitous digital cameras. I tested one of my prototypes on a two and a half year old girl. She had no problem holding it just like she holds a camera.</strong></p>
<p><strong>Tish:</strong> <a href="http://www.cc.gatech.edu/~blair/home.html" target="_blank">Blair MacIntyre</a> mentioned, &#8220;The problem with the mobile phone as an AR device is a problem of awareness,&#8221; i.e., you have to have a way of letting people know when there&#8217;s something interesting wherever they are. One of the issues regarding this is that if you get too many alerts, you tune them out.</p>
<p><strong>Ori: First of all, Blair is one of the people in academia that get it, because he looks at it from an experience perspective, not just as an interesting technical problem to solve. Let&#8217;s start with getting people to enjoy this new experience. The AR demos so far were mostly eye candy, and mostly for advertising &#8211; the <a href="http://ge.ecomagination.com/smartgrid/#/landing_page" target="_blank">GE AR ad</a> created a lot of buzz, but you look at it for 10 seconds and you forget about it. You need to build something that people would want to experience over time and would be willing to pay for. I think that&#8217;s the big test, right?</strong></p>
<p><strong>Now in terms of having a ubiquitous experience where you&#8217;re continuously connected, it doesn&#8217;t have to be an overwhelming experience. Just like with some of the social media tools we&#8217;re using today, we decide when to connect, and we filter out the trash. You could get alerts only for things that really matter to you, not for everything that happens in your immediate environment. </strong></p>
<p><strong>There will be many layers of information, and it&#8217;ll be up to you to pick the ones you want to experience. The real benefit is that you get the information in your own field of view and in context of where you are or what you do.</strong></p>
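<p>The opt-in layers Ori describes can be pictured as a simple filter: the user subscribes to some layers, and only annotations on those layers, within a sensible range, ever become alerts. The sketch below is purely illustrative &#8211; the layer names, the <code>Annotation</code> type, and the 200-metre cutoff are all made up for the example, not any real AR API.</p>

```python
from dataclasses import dataclass

@dataclass
class Annotation:
    layer: str         # e.g. "restaurants", "history", "friends" (illustrative names)
    title: str
    distance_m: float  # distance from the user

def visible_annotations(annotations, subscribed_layers, max_distance_m=200.0):
    """Keep only annotations on layers the user opted into, within range."""
    return [a for a in annotations
            if a.layer in subscribed_layers and a.distance_m <= max_distance_m]

pois = [
    Annotation("restaurants", "Cafe", 50.0),
    Annotation("ads", "Billboard", 20.0),
    Annotation("history", "Old theater", 500.0),
]
# A user subscribed only to "restaurants" and "history" never sees the ad,
# and the far-away theater is dropped by the distance cutoff.
nearby = visible_annotations(pois, {"restaurants", "history"})
```

<p>Everything in the environment is still there; the user only experiences the layers they chose, which is what keeps the ubiquitous experience from becoming overwhelming.</p>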
<p><strong>Tish:</strong> So what are you working on these days?</p>
<p><strong>Ori: We are working on a little app that targets a very different audience than what you&#8217;d expect: pre-schoolers. We think we can encourage them to get away from a PC or TV screen and learn something while playing &#8211; in the real world. You&#8217;ll hear more about it as soon as this summer. Nuff said.</strong></p>
<p><strong>But, it is a small application that will run on the iPhone. People ask how many pre-schoolers own iPhones? Well, their parents do. </strong></p>
<p><strong>Tish:</strong> Yes, there are certainly many New York kids with iPhones &#8211; my kid now has my old iPhone. He has pretty much switched from playing games on his DS to the iPhone. I noticed in your WARM video you place a big emphasis on AR as something that will get kids away from screens and engaged with reality. This is something parents will approve of!</p>
<p><strong>Ori: Yes, I saw something really interesting at my kids&#8217; party one day; they were all sitting around the room &#8211; looking down at their own DS screens. You could play the DS anywhere, but kids would usually play it on the sofa, looking at the screen, isolated from the world. With an iPhone and a camera, and the application we&#8217;re producing, reality becomes part of the game. Yes, that makes it all of a sudden much more interesting for parents. Because kids are spending so much time in front of the screen, all of a sudden there&#8217;s something that will encourage them to interact with real objects, real things. Every parent I&#8217;ve talked to loves that idea.</strong></p>
<p><strong>Tish:</strong> Yes that is what is cool about the work of <a href="http://www.katilondon.com/" target="_blank">Kati London</a> &#8211; I think I saw someone say this on Twitter, &#8220;Kati puts the computer in the game not the game in the computer.&#8221;</p>
<p><strong>Ori: Yes, kids are spending more time in front of games and the computer because it&#8217;s more interesting. It captivates them with &#8220;<a id="x_z0" title="game pleasures" href="http://8kindsoffun.com/">game pleasures</a> &#8221; that tap into their brain&#8217;s dopamine circuitry &#8211; constantly seeking reward and satisfaction. So you&#8217;re not going to be able to tell them to go back to playing in reality without these pleasures. We have to study these mechanics from games and bring them into reality. It&#8217;s about programming real life; and augmented reality helps you achieve that.</strong></p>
<p><strong>Here&#8217;s an example: cause and effect; in a game when you do something you always get an immediate effect. You&#8217;re good, you get a reward. You&#8217;re not good, you get a cue to improve. In real life you do things and you could wait 2 or 3 years until you actually get feedback (if you&#8217;re lucky). Augmented Reality allows you to bring these mechanics into the real world. I think that&#8217;s going to help kids rediscover reality, in a new sense, which is what every parent is dreaming about.</strong></p>
<p><strong>Tish:</strong> I don&#8217;t know how much you can say about your app. But in regard to doing augmented reality on the iPhone&#8230; there&#8217;s no compass. Is this a limitation?</p>
<p><strong>Ori: True, no compass yet. But the camera gives you a lot of information that you can interact with. When you run the application, you see the world in front of you, and if the app can recognize real life objects &#8211; it can put virtual elements on top of it.</strong></p>
<p><strong>Tish:</strong> But not with any accuracy unless you&#8217;re using markers. Are you using markers?</p>
<p><strong>Ori: We&#8217;re using natural feature recognition. It doesn&#8217;t have to be an ugly looking marker. It can be any image.</strong></p>
<p><strong>Tish:</strong> So you&#8217;re using image recognition. Are you working with one of these image recognition startup companies (<a id="nws6" title="list here" href="http://www.educatingsilicon.com/2008/11/25/a-round-up-of-mobile-visual-search-companies/" target="_blank">list here</a> )?</p>
<p><strong>Ori: We&#8217;re working with one of those. What&#8217;s unique about it is it runs very nicely on any cell phone, and on the iPhone it works the best. For this first app, it doesn&#8217;t really matter where you are physically; the geolocation is not part of the experience.</strong></p>
<p><strong>Tish:</strong> For a truly engaging AR experience, will we need more of a backend than is currently available?</p>
<p><strong>Ori: I call the backend the cloud, where you have all this information and ways to access it from anywhere. Actually I think it&#8217;s become pretty mature today. If you look at the different elements required to enable an augmented reality experience to work, you have &#8211; first &#8211; the user, who&#8217;s always in the center. Then you have the lens. The lens can be an iPhone, or glasses, even a projector. The lens allows you to watch, sense and track information in the real world: people, places, things. Then in the backend you have the cloud where you store and retrieve information.</strong></p>
<p><strong>So if you look at the maturity of these different elements, I think the cloud is in pretty good shape, because there&#8217;s so much information we&#8217;re collecting and storing. Anything from Google, Wikipedia, Facebook, all that kind of stuff &#8211; it&#8217;s a lot of useful information you can access from anywhere using APIs. And a lot of it is also starting to include geolocation information. Take <a id="zhag" title="Loopt" href="http://www.loopt.com/" target="_blank">Loopt</a> or Google&#8217;s <a href="http://www.google.com/latitude/intro.html" target="_blank">friends service</a> that allows you to see where your friends are and what they&#8217;re doing. There&#8217;s tons of information out there and it&#8217;s pretty easy to access. Now the question is: what do you do with it?</strong></p>
<p><strong><a href="http://www.mobilizy.com/wikitude.php" target="_blank">Wikitude</a> is such a simple and brilliant application and nobody thought about doing it until this guy from Salzburg did. It doesn&#8217;t have any sophisticated visual tracking. It knows your position and it&#8217;s simply looking at the angle you&#8217;re pointing to. Based on these parameters it brings information from Wikipedia that pertains to your field of view. So most of it was already there. It&#8217;s just a matter of connecting the pieces in an experience that is valuable for people.</strong></p>
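<p>The Wikitude recipe Ori outlines &#8211; position plus pointing angle, no visual tracking &#8211; can be sketched in a few lines. This is a minimal illustration of the idea, not Wikitude&#8217;s actual code: compute the bearing from the user to each geotagged point and keep the ones that fall inside the camera&#8217;s horizontal field of view (the 60&#176; default here is an assumed value).</p>

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from observer to target, degrees in [0, 360)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return (math.degrees(math.atan2(y, x)) + 360.0) % 360.0

def in_view(observer, heading_deg, target, fov_deg=60.0):
    """True if target (lat, lon) falls inside the camera's horizontal field of view."""
    b = bearing_deg(observer[0], observer[1], target[0], target[1])
    diff = (b - heading_deg + 180.0) % 360.0 - 180.0  # signed angle in [-180, 180)
    return abs(diff) <= fov_deg / 2.0

# Facing due east (heading 90 degrees), a point directly east is in view;
# a point due north is not, with a 60-degree field of view.
```

<p>The GPS supplies the observer position, the compass supplies the heading, and each article that passes the check gets drawn at a screen offset proportional to its signed angle &#8211; which is essentially &#8220;connecting the pieces&#8221; as Ori puts it.</p>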
<p><strong>Tish: </strong>It is the uptake of even a very simple technology that puts the magic in it.</p>
<p><strong>Ori: Yes, take Twitter. If you go to its homepage it looks like a very simple, boring app, but it is something that is both enjoyable and very useful to people.</strong></p>
<h3><strong>Why you should participate in ISMAR 2009</strong></h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/05/picture-40.png"><img class="alignnone size-medium wp-image-3478" title="picture-40" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/05/picture-40-222x300.png" alt="picture-40" width="222" height="300" /></a><br />
<strong>Tish: </strong>I know that you are involved in organizingÂ  <a id="seky" title="ISMAR" href="http://www.ismar09.org/" target="_blank">ISMAR</a> (picture above from Ori&#8217;s post on <a href="http://gamesalfresco.com/2009/02/23/ismar-2009-the-worlds-best-augmented-reality-event-wants-you-to-contribute/" target="_blank">&#8220;ISMAR 2009: The World&#8217;s Best Augmented Reality Event&#8230;,</a>&#8220;) and there is a call out for papers and for volunteers, can you tell me more about it?</p>
<p><strong>Ori: Yes, we hope to have the first ISMAR where we practice what we have just discussed: let&#8217;s build on all the research invested so far and, instead of thinking only about 5-10 years from now, let&#8217;s see what we can do today. So we are bringing people in from other disciplines &#8211; artists, interactive media developers and people from the entertainment industry. The goal is to use the technology to make something interesting for people &#8211; again, something that people would buy &#8211; and to make it commercially successful. Many people don&#8217;t know about ISMAR because in the past it was a pure engineering-oriented event, and people coming from a commercial perspective on AR weren&#8217;t attracted to it. The chair of the event this year is based in Florida and he is going to bring in a lot of people from the entertainment industry, such as Disney. I think this will transform this event into something more like SIGGRAPH &#8211; more of an industry event. As one of the organizers of the interactive media track, we are trying to bring in people that want to build applications for consumers.</strong></p>
<p><strong>Tish:</strong> In terms of AR applications what are the flagships today?</p>
<p><strong>Ori: There are very few because it&#8217;s just the beginning. There&#8217;s one tiny studio in France called <a id="z1ln" title="Int 13" href="http://www.int13.net/en/" target="_blank">Int 13</a>. They&#8217;ve created maybe the first commercial game running on a mobile device using AR technology. It&#8217;s called <a href="http://www.youtube.com/watch?v=Te9gj22M_aU" target="_blank">Kweekies</a>. It was one of the contenders for the Nokia Mobile Innovation Awards. They were one of the ten finalists, but they didn&#8217;t win it. It looks really cool. It&#8217;s something that runs on your desk, with a marker. Many AR folks say markers are the past, markers are ugly. But it&#8217;s still a cool experience. I think people will go for it.</strong></p>
<p><strong>Tish:</strong> Yes, I think we will have to look to small companies that are free to think creatively to lead the way. It seems many games companies are tied up pulling off huge, big-budget projects, and enterprise is still catching up on how to use social media!</p>
<p><strong>Ori: Yes, last year at the Game Developers Conference (GDC) there was no mention of augmented reality &#8211; not on the exhibition floor, not in any of the sessions; nobody talked about it. I was stunned. Then this year, there was a little change. There were like three demos on the exhibition floor: <a href="http://www.metaio.com/" target="_blank">Metaio</a>, <a href="http://www.vuzix.com/home/index.html" target="_blank">Vuzix</a> and a Dutch company called <a href="http://www.augmented-reality-games.com/" target="_blank">Beyond Reality</a>. And then there was Blair&#8217;s talk, which was very, very cool. The room was packed with people. And after the talk there were dozens of people lining up to talk with him about the topic. There was definitely interest, but still on the very edge. The video game industry is still a hit-driven business, and publishers spend upward of 20-30 million dollars to create the best AAA game possible. They just can&#8217;t take the risk. So it&#8217;s going to come from smaller companies, from outsiders coming in with a vision and understanding of how to put the AR pieces together to create a totally new experience.</strong></p>
<p><strong>Tish:</strong> But the basic tool set is there isn&#8217;t it?</p>
<p><strong>Ori: I talked to some folks at the Game Developers Conference, many folks with an MMO background, and they have great ideas about AR. It&#8217;s great to see different people with different views on what&#8217;s needed first. &#8220;Joe the Programmer&#8221; had this idea of creating a small piece of hardware that you can put in every house to provide accurate geospatial information in your home. That could open up many opportunities for AR experiences in homes.</strong></p>
<p><strong>Tish:</strong> Don&#8217;t you think we have enormous resources in terms of image databases that provide a great basis for augmented reality? I was talking to Aaron Cope at ETech about <a href="http://code.flickr.com/blog/2008/10/30/the-shape-of-alpha/" target="_blank">The Shape of Alpha</a> &#8211; Flickr&#8217;s vernacular mapping project using all the geotagged photos in Flickr. That is such a cool project. <a href="http://en.oreilly.com/where2009/public/schedule/speaker/43824" target="_blank">Aaron will be speaking at Where 2.0</a> as well.</p>
<p><strong>Ori: Think of Google Earth. Google Earth leveraged communities to basically map all the major cities around the world into 3D models. And that is an essential step to be able to do augmented reality outdoors. Because if you had to model everything from scratch, it wouldn&#8217;t be realistic.</strong></p>
<h3><strong>Augmented Reality and Becoming Greener.</strong></h3>
<p><strong>Tish:</strong> I am really interested in how AR interfaces might be useful to some of the emerging energy identity/metering projects like <a href="http://www.amee.com/" target="_blank">AMEE</a> and <a href="http://www.wattzon.com/" target="_blank">WATTZON</a> because I think it is very important that people have very intuitive, immediate, and enjoyable ways to relate to energy data so they can make greener choices.</p>
<p><strong>Ori: Back in the day I had an idea to build an augmented reality application to help people become greener. You look at things around your home with the camera and it recognizes their greenhouse gas footprint and makes recommendations to reduce it. I guess it was a bit too early to do that based on visual recognition alone&#8230; you&#8217;d need additional sensors that would provide related information about what you are looking at.</strong></p>
<p><strong>Tish:</strong> Well as there is more interest in Green technology do you think we may see VC interest in some green AR projects now?</p>
<p><strong>Ori: I talked to some of the investment folks, Angels as well as VC&#8217;s about AR and they had no clue what it is. There&#8217;s a need for a whole lot of education. And there are no proof points (as in successful investments in this domain), and counter to popular belief &#8211; they don&#8217;t like risk so much&#8230;</strong></p>
<p><strong>Tish:</strong> And consumer adoption must lead the way, right?</p>
<p><strong>Ori: Just like with every emerging technology in history, people never bought the technology; they bought the content, the apps, the benefits that came on top of the technology. Whether it was VHS winning over Betamax, or Blu-ray winning over HD DVD &#8211; it&#8217;s always because of more/better content. Look at the video game console war: Xbox and Nintendo did better than Sony just because they had more and better games. Even Windows was a success thanks to its applications. People bought it for the applications, not the OS. The content is the first to drive demand.</strong></p>
<p><strong>Tish:</strong> One of the challenges to giving people new ways to relate to their energy consumption is that you can just have them looking at graphs of how bad they have been in the past &#8211; that may make them feel bad, but it doesn&#8217;t necessarily give them ways or motivation to change. There perhaps needs to be a more immediate relationship to the data to facilitate change. I think the mantra for optimization of anything from energy usage to supply chains is timely, actionable data?</p>
<p><strong>Ori: There are a lot of ideas about measuring information and displaying it to people. For example, the Prius hybrid car, one of its interesting features &#8211; which is kind of game like &#8211; is a constant display of your current fuel consumption. That alone changes how people drive because they try to beat the &#8220;Score&#8221; and as a result conserve more fuel. That model can be applied to our homes&#8230;</strong></p>
<p><strong>Tish:</strong> Yes, that is something I am very interested in. I have been following several projects in this area &#8211; one of my favorites is the <a href="http://www.arduino.cc/" target="_blank">Arduino</a>, <a href="http://www.currentcost.com/" target="_blank">Current Cost</a>/<a href="http://www.ladyada.net/make/tweetawatt/" target="_blank">Tweetawatt</a>, <a href="http://www.pachube.com/" target="_blank">Pachube</a> integrations <a href="http://www.ugotrade.com/2009/04/24/homecamp-2-home-energy-management-and-distributed-sustainability/" target="_blank">I saw at Homecamp</a>.</p>
<p>You joined a startup with Shai Agassi which was bought out by SAP, right? He has a brilliant approach with Better Place.</p>
<p><strong>Ori: I think what&#8217;s really unique about Better Place&#8217;s approach is that he doesn&#8217;t require people to change their behavior. People are still going to have their own cars. They&#8217;ll be able to drive as far as they want, and for the same (or lower) cost. It&#8217;s not necessarily about a new technology; electric cars have been around for a long time, but there was no way people were going to be limited by the 50 or 70 mile range, and Better Place is solving that problem. With its infrastructure of charging spots and battery switching stations, drivers are going to be able to drive anywhere. And it&#8217;ll be similar to having to stop once in a while to refuel your car. The price may be even lower than what you pay today for your transportation needs &#8211; and you&#8217;ll stop generating greenhouse gas. It&#8217;s a clever way of taking technology to a whole new level without changing the behavior of people.</strong></p>
<p><strong>Tish: </strong>Better Place is a classic example of things as a service, isn&#8217;t it? It is basically a utility company.</p>
<p><strong>Ori: It is similar to a phone carrier model. You pay for a membership that gives you access to the car (equivalent to the phone) and electricity (equivalent to the phone line) for the same price as fuel costs today. And as a bonus, you get to save the world.</strong></p>
<h3><strong>How the iPhone changed the game for AR &#8211; and the iPhone versus Android</strong></h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/05/picture-38.png"><img class="alignnone size-medium wp-image-3472" title="picture-38" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/05/picture-38-300x198.png" alt="picture-38" width="300" height="198" /></a></p>
<p><em>Picture from Ori&#8217;s post</em><strong><em>, <a href="http://gamesalfresco.com/2009/03/23/gdc-2009-why-the-iphone-just-changed-everything/" target="_blank">&#8220;GDC 2009: Why the iPhone just changed everything&#8221;</a></em></strong></p>
<p><strong>Ori: And back to AR, you have to take the same approach, because nobody wants to don those huge head mounted displays or backpacks. You have to take advantage of people&#8217;s current behavior: they already carry their iPhones or similar devices.</strong></p>
<p><strong>Tish:</strong> As we discussed, you just have to get people raising up their phones and looking through them when that is a useful thing to do. Both Wikitude and Nathan Freitas&#8217;s graffiti app were enough to get me interested in the evolutionary step of raising my phone! Nathan&#8217;s graffiti app is nice. You leave a marker for your graffiti so other people can find, view, and add their own &#8211; a nice primal experience, like pissing on the lamp post to let your pack know where you&#8217;ve been. Also the graffiti app taps into a long history of NYC street culture around tagging and graffiti art (see my interview, <a href="http://www.ugotrade.com/2009/01/17/is-it-%E2%80%9Comg-finally%E2%80%9D-for-augmented-reality-interview-with-robert-rice/" target="_blank">&#8220;Is it OMG finally for Augmented Reality?&#8221;</a>).</p>
<p><strong>Ori: The app store has fundamentally changed the mobile gaming industry. Last year it was in shambles. There was no growth. Everybody was complaining, &#8220;we can&#8217;t handle it, there&#8217;s a million phones, and you have to test it on each phone. And carriers suck, they don&#8217;t care about sharing and promoting your content.&#8221; Everything was bad. This year mobile gaming is the hottest thing. And it&#8217;s all because of the iPhone. It changed the game.</strong></p>
<p><strong>Tish: </strong>How do you think Android is going to get traction against the iPhone?</p>
<p><strong>Ori: Well, the number one thing is the form factor &#8211; the iPhone is just much cooler than the G1. The G1 is OK, but it doesn&#8217;t have the same feel. People thought it was going to be easy to clone the iPhone, but none of the attempts have succeeded so far.</strong></p>
<p><strong>Tish: </strong>How much does it matter for AR not being able to run things persistently in the background on the iPhone?</p>
<p><strong>Ori: Actually they have added such a capability in OS 3. You can now make use of a background service.</strong></p>
<p><strong>Tish:</strong> OS 3 will open up new possibilities for AR?</p>
<p><strong>Ori: The access to the video API is still not public. But there is a new Microsoft application &#8211; Microsoft Tag &#8211; that makes use of that API, which means it is probably OK to use it.</strong></p>
<p><strong>Tish: </strong>(I ask Ori for his card and he shows me how to read it with my iPhone.) Oh nice, you have an AR card, of course!</p>
<h3><strong>In Search of Pong for Augmented Reality</strong></h3>
<p><strong>Tish: </strong>So how will AR begin to, as Blair&#8217;s friend puts it, &#8220;facilitate a killer existence,&#8221; particularly as we are probably looking at some new and perhaps pricey hardware?</p>
<p><strong>Ori: You could take the Better Place approach. We&#8217;re going to give you a great experience, and we&#8217;ll include the devices as part of that experience for the same price. Let&#8217;s say you subscribe to an AR experience which offers multi-user support and all the information you need wherever you go &#8211; exactly according to the vision. You pay for a subscription on a monthly basis, and included in that cost we give you a better device that offers a better AR experience. It&#8217;s following the phone carrier approach, but in a good way.</strong></p>
<p><strong>But first of all we do need our Pong! I was sitting with a couple of AR game enthusiasts at the GDC and we were asking ourselves, &#8220;how do we create the first pong for AR?&#8221;</strong></p>
<p><strong>Was Pong a multiplayer game? Not necessarily! Did it connect to the network? No! We have to create the first dot in a long line of dots that will bring us to our destination.</strong></p>
<p><strong>Tish: </strong>You haven&#8217;t seen a Pong yet, have you?</p>
<p><strong>Ori: Not yet. I mean there&#8217;s maybe a handful of games and apps out there, but I don&#8217;t think any of them is a Pong yet. Still, it&#8217;s getting closer.</strong></p>
<p><strong>Tish: </strong>Kati London is doing some very interesting work on bringing games into reality, isn&#8217;t she?</p>
<p><strong>Ori: Yes, she works with Frank Lantz at <a href="http://playareacode.com/" target="_blank">Area/Code</a>. He teaches at NYU and has designed games for the <a href="http://www.comeoutandplay.org/" target="_blank">&#8220;Come Out and Play&#8221;</a> festival here in Manhattan. And a lot of these games are actually low tech.</strong></p>
<p><strong>Tish:</strong> Yes I have a big alternate reality game blog brewing that I haven&#8217;t had time to write yet!</p>
<p><strong>Ori: &#8220;The city is the gameboard&#8221; is their slogan. It&#8217;s going to be a great playground for AR games. The city becomes a theme park. The city could become an even bigger tourist attraction. People will come to the city to be part of these games. So you&#8217;re having thousands of people running around the city playing all sorts of games, from laser-tag style to history adventures, to treasure hunts.</strong></p>
<h3><strong>Composing Reality</strong></h3>
<p><strong>Tish: </strong>So why haven&#8217;t you focused on one of these kinds of games with your company?</p>
<p><strong>Ori: We have a couple of scenarios along these lines that we&#8217;re planning for 2010-11. But first, we&#8217;re focusing on what&#8217;s possible today.</strong></p>
<p><strong>Tish: </strong>And what&#8217;s stopping you from doing those kinds of games today?</p>
<p><strong>Ori: Many things. The devices are not there yet, location services are not accurate enough, ubiquitous sensors are not there yet.</strong></p>
<p><strong>Tish: </strong>You think alternate reality gaming needs more &#8220;ubiquity&#8221; than is currently available?</p>
<p><strong>Ori: Not necessarily. People are doing alternate reality games with no &#8220;ubiquity&#8221; at all. But my interest is to add the visual aspect. I believe humans are mostly driven visually.</strong></p>
<p><strong>Jane McGonigal said in a talk at GDC that AR would allow us to program reality, which is exactly how I look at it. You can recognize things in different ways &#8211; some of it with WiFi and RFID and all sorts of sensors &#8211; but visual sensing is always going to be the ultimate way to recognize things. And once you recognize things and know what they are, and can pull information about those things (or people and places) from the internet, you can program it (visually). You could program it to be fictional, like in a video game, or it could be programmed as non-fictional, like a documentary. And that allows you to do things that before were unimaginable.</strong></p>
<p><strong>Tish: </strong>But you can&#8217;t forget the visual; it is the primary connection to people&#8217;s sensory relationships.</p>
<p><strong>Ori: Yes, it&#8217;s like you go to a grocery store and you pick your vegetables, a lot of it is by sight and by touch. And what if you could also see just by looking at it that it&#8217;s from a local store, and that it&#8217;s organic?</strong></p>
<p><strong>Tish:</strong> It goes beyond overlays really?</p>
<p><strong>Ori: By the way, I don&#8217;t like the term &#8220;overlay.&#8221; I know that&#8217;s how it looks &#8211; you either overlay or superimpose &#8211; but I&#8217;m still searching for a better term. A term I prefer is &#8220;composing reality.&#8221; Just as painters use brushstrokes and colors to compose a painting, we need to take the real element and the virtual element and compose them into something new. It&#8217;s not just about slapping one on top of the other.</strong></p>
<p><strong>Tish: </strong>Yes, I think the idea of dashboards is not so appealing.</p>
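<p>The &#8220;programming reality&#8221; loop Ori describes &#8211; recognize a thing, pull information about it from the network, compose a virtual layer onto it &#8211; can be sketched roughly as follows. Every function name and data value here is a hypothetical stand-in, not part of any real AR toolkit:</p>

```python
# Hypothetical sketch of Ori's loop: recognize -> look up -> compose.
# All names and catalog data are invented for illustration.

def recognize(image_region):
    """Stand-in recognizer: map raw pixels to an object label.
    A real system would run a vision model here."""
    return "tomato"

def lookup(label):
    """Pull metadata about the recognized thing (stubbed local catalog
    standing in for information fetched from the internet)."""
    catalog = {"tomato": {"organic": True, "origin": "local farm"}}
    return catalog.get(label, {})

def compose(label, info):
    """Blend real and virtual: build the annotation to render on the object."""
    notes = []
    if info.get("organic"):
        notes.append("organic")
    if "origin" in info:
        notes.append(info["origin"])
    return f"{label}: {', '.join(notes)}" if notes else label

label = recognize(None)
annotation = compose(label, lookup(label))
```

<p>The same skeleton covers both of Ori&#8217;s cases: swap the catalog for fictional game data and the composition becomes a game layer; keep it factual and it becomes the documentary layer.</p>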
<h3><strong>Pookatak Games</strong></h3>
<p><strong>Tish: </strong>Do you want to explain the evolution of your company? You have an interesting history of success with high-end enterprise applications.</p>
<p><strong>Ori: Since I was a kid I wanted to invent and create things. When I discovered software, it was a really cool way of actually creating things from nothing &#8211; from thin air &#8211; and you can do it very quickly. That&#8217;s what brought me into software. But I was always looking for the intersection between technology and art, looking for ways to bring these things together. In the early nineties virtual reality was doing it. It had the appeal of cutting-edge technology that could be combined with art. But then, as we all know, it crashed. So I joined Shai Agassi&#8217;s startup (Agassi is now doing Better Place) back in the early nineties. I was one of the first employees in his startup, which was developing multimedia products, and I led the development of one of its flagship products. At some point we realized the technology could be great for an enterprise environment.</strong></p>
<p><strong>It was a really great experience: first going through this cycle from a very small startup and growing into this multi-billion-dollar business. I was responsible for defining and marketing SAP&#8217;s platform, which was called NetWeaver. It was just an idea when we joined SAP, and by the time I left it was a major, major business for SAP. I learned about the challenges of building a platform. No matter what purpose you&#8217;re building it for, it typically has similar rules. It&#8217;s definitely not just about the technology; the content that comes with it is really key to making a platform successful.</strong></p>
<p><strong>The third part of this platform trifecta is the community. If you don&#8217;t build a community, you won&#8217;t get the critical mass required for adoption. It may be your own platform, but it&#8217;s not necessarily the people&#8217;s platform. That experience is key to what we&#8217;re doing today. Now a new industry is being born on the basis of a remarkable technology. But to drive adoption, first we&#8217;ll need good content. The content will be created using today&#8217;s technology, with internal tools developed to simplify the process. The next step would be to make the tools used internally available to other developers &#8211; to help scale the industry and enable innovation on a larger scale. That way we have a chance to create a platform. So it isn&#8217;t really just about my company. I&#8217;m so passionate about augmented reality, I want it to become a healthy and successful industry for the next 5, 10, 15 years.</strong></p>
<p><strong>Tish: </strong>Yes, I am so ready to be liberated from sitting behind a computer screen! And I know that all this hardware is murdering the environment.</p>
<p><strong>Ori: There&#8217;s a book by Rolf Hainich called &#8220;<a id="ba8p" title="The End Of Hardware" href="http://www.theendofhardware.com/">The End of Hardware</a>.&#8221; It&#8217;s about hardware for augmented reality. Once you use goggles or other AR interfaces, you eliminate the need for screens, laptops, etc. It&#8217;s going to be great for the environment. You have read Rainbows End, right? According to the book, in a few years there will barely be any (visible) hardware. At least it&#8217;ll have a much smaller environmental footprint. And it&#8217;ll touch every aspect of life, everything you do. It&#8217;ll change the way you interact with the world.</strong></p>
<h3><strong>The Elusive Eyewear for Immersive AR</strong></h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/05/retroar-googlespost.jpg"><img class="alignnone size-medium wp-image-3469" title="retroar-googlespost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/05/retroar-googlespost-300x225.jpg" alt="retroar-googlespost" width="300" height="225" /></a><br />
<em>Friend of Ori&#8217;s in San Francisco wearing retro AR goggles (from <a href="http://gamesalfresco.com/2009/05/04/gdc-2009-roundup-a-tiny-spark-of-augmented-reality/" target="_blank">Games Alfresco, Ori&#8217;s roundup of GDC 2009</a>)</em></p>
<p><strong>Tish:</strong> OK, let&#8217;s talk about goggles.</p>
<p><strong>Ori: Goggles are going to happen; we want to be hands-free.</strong></p>
<p><strong>It&#8217;s going to happen because it&#8217;s just a more intuitive way to use this technology. But above all it has to look cool, because if it&#8217;s not &#8211; if it&#8217;s a big headset &#8211; then maybe a small percentage of the population might use it, but most people won&#8217;t. It has to look like an accessory, like cool new eyeglasses that you just must wear.</strong></p>
<p><strong>I recently talked to a friend who runs an industrial design firm and has experience designing such glasses for companies like Microvision and Lumus. He says that when you try to bring the image that close to the eyes, there are some really hard problems to solve. Otherwise it can become really annoying and cause dizziness.</strong></p>
<p><strong>But I&#8217;m optimistic. I believe it&#8217;s going to happen 3 to 5 years from now. It&#8217;s already starting: Vuzix announced goggles that will be available this year, and some AR apps are going to take advantage of them next year. Initially only a fraction of the population will use them, and that&#8217;s going to help advance the technology and make it better and better. But it&#8217;s going to take time until it reaches the mass market.</strong></p>
<p><strong>Tish:</strong> In virtual worlds we have seen, I think, a lot of mistakes: reinventing the wheel, producing too many proprietary versions of the same thing, and not enough concerted effort on standards and open platforms that could create a vibrant ecosystem. How can augmented reality avoid making the same mistakes?</p>
<p><strong>Ori: There are some early AR open source efforts &#8211; ARToolKit, ARTag &#8211; but it is not a movement yet. One of the things we&#8217;re trying to do at ISMAR this year is to put together discussions around key industry issues, such as standards. Some people say it&#8217;s too early, that you have to have a de facto standard to start from. But pretty soon it&#8217;s going to be too late. Just like with virtual worlds, all of a sudden you have all these islands that don&#8217;t talk to each other. Why get to that point if we can plan to avoid it? Let&#8217;s start thinking about it right now. On the other front there are devices. There are pockets of people working on adapting devices for AR, second-guessing the hardware companies. Why not get them together with the Intels and Nvidias of the world and discuss what this device should be able to do? And then compete to make it happen.</strong></p>
<p><strong>Tish: </strong>How much luck are you having with this discussion part?</p>
<p><strong>Ori: People are very interested in doing this. We proposed these panels for ISMAR. And I&#8217;ve got some key people already on board. They have tons of input, they want to get involved. We&#8217;ll see how much we can actually get out of it.</strong></p>
<p><strong>Tish: </strong>In virtual worlds it was a while before vibrant open source communities developed. OpenSim has, I think, been the breakthrough community in this regard.</p>
<p><strong>Ori: You have to think about the elements up front. The dream job is to architect the industry. Say we agree on the required pieces. Then we could help the right companies succeed in delivering those pieces. Next, we have to collaborate so that these pieces talk to each other. And eventually these communication methods will become de facto standards that most developers adopt.</strong></p>
<p><strong>Tish: </strong>So I&#8217;m going to put you in the role. You&#8217;ve got your dream job. You&#8217;re going to architect this community. So what are the key pieces and where would you like to see the open source communities take hold first?</p>
<p><strong>Ori: Open source will not be exclusive. It&#8217;s going to live side by side with proprietary technology.</strong></p>
<p><strong>The key pieces? You have the user at the center. And the user interacts with a lens. The lens includes both the hardware and the software. And then the lens senses and interacts with the world, which includes people, things and places. And these people-things-places emit information &#8211; about who they are, where they are, what they&#8217;re doing, etc. &#8211; which is then stored in the cloud.</strong></p>
<p><strong>And then you have the content providers &#8211; the people and companies, composers, who weave AR experiences through the pieces we mentioned before. These composers need a platform that glues the pieces together. Pieces of the platform will be on the lens, and in the world, and in the cloud. If you manage to remove the friction and connect these pieces into an experience that people like &#8211; then you have a platform. What the platform does is reduce the overhead and accelerate innovation.</strong></p>
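<p>The pieces Ori names &#8211; entities in the world that emit information, a cloud that stores it, and a lens that senses an entity and pulls its information back &#8211; could be modeled minimally like this. Every class and field is an illustrative assumption, not an actual platform API:</p>

```python
# Minimal model of Ori's pieces: world entities publish to the cloud;
# the lens senses an entity and pulls what it has published.
from dataclasses import dataclass, field

@dataclass
class Entity:
    """A person, thing, or place in the world."""
    name: str
    kind: str                                  # "person", "thing", or "place"
    info: dict = field(default_factory=dict)   # what it emits about itself

class Cloud:
    """Stores the information entities emit."""
    def __init__(self):
        self._store = {}
    def publish(self, entity):
        self._store[entity.name] = entity.info
    def query(self, name):
        return self._store.get(name, {})

class Lens:
    """The hardware-plus-software the user looks through."""
    def __init__(self, cloud):
        self.cloud = cloud
    def sense(self, entity):
        # recognize the entity, then merge in what it has published
        return {"name": entity.name, "kind": entity.kind,
                **self.cloud.query(entity.name)}

cloud = Cloud()
cafe = Entity("Corner Cafe", "place", {"open": True})
cloud.publish(cafe)
view = Lens(cloud).sense(cafe)
```

<p>A composer, in this toy model, is any code that takes what the lens returns and builds an experience out of it; the platform Ori wants would standardize exactly these publish/sense seams.</p>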
<p><strong>Tish: </strong>Another problem virtual worlds faced in their development was their isolation from the World Wide Web. Will augmented reality avoid this plight?</p>
<p><strong>Ori: Yes, I believe the key, like you said before, is not to reinvent the wheel. The cloud is already there. Take Wikitude for example: all <a href="http://www.mobilizy.com/" target="_blank">Mobilizy</a> had to do was build a relatively simple client app, connected to Wikipedia, and all of a sudden it offered a wealth of information in your field of view.</strong></p>
<p><strong>I think we can learn a lot from Web 2.0. For example, in order to have a ubiquitous experience like <a href="http://www.curiousraven.com/" target="_blank">Robert Rice</a> and others are striving for, you&#8217;ll need to 3D-map the world. Google Earth-like apps are going to help, but they are not going to be sufficient. So let&#8217;s leverage people. Google became successful in part by making people work with them. Each time you create a link from your blog to my blog, their search engine learns from it. So let&#8217;s find ways to make people create information that can be used for AR.</strong></p>
<p><object width="425" height="344" data="http://www.youtube.com/v/GTXtW3W8mzQ&amp;hl=en&amp;fs=1" type="application/x-shockwave-flash"><param name="allowFullScreen" value="true" /><param name="allowscriptaccess" value="always" /><param name="src" value="http://www.youtube.com/v/GTXtW3W8mzQ&amp;hl=en&amp;fs=1" /><param name="allowfullscreen" value="true" /></object></p>
<p><em>Ori Inbar directed <a title="Wiki Mouse" href="http://www.youtube.com/watch?v=GTXtW3W8mzQ" target="_blank">Wiki Mouse</a> &#8211; a WIKI Film co-created by a swarm of movie makers around the world.</em></p>
]]></content:encoded>
			<wfw:commentRss>http://www.ugotrade.com/2009/05/06/composing-reality-and-bringing-games-into-life-talking-with-ori-inbar-about-mobile-augmented-reality/feed/</wfw:commentRss>
		<slash:comments>12</slash:comments>
		</item>
		<item>
		<title>HomeCamp 2: Home Energy Management and Distributed Sustainability</title>
		<link>http://www.ugotrade.com/2009/04/24/homecamp-2-home-energy-management-and-distributed-sustainability/</link>
		<comments>http://www.ugotrade.com/2009/04/24/homecamp-2-home-energy-management-and-distributed-sustainability/#comments</comments>
		<pubDate>Fri, 24 Apr 2009 19:14:16 +0000</pubDate>
		<dc:creator><![CDATA[Tish Shute]]></dc:creator>
				<category><![CDATA[Ambient Devices]]></category>
		<category><![CDATA[Ambient Displays]]></category>
		<category><![CDATA[Bar Camp]]></category>
		<category><![CDATA[Carbon Footprint Reduction]]></category>
		<category><![CDATA[culture of participation]]></category>
		<category><![CDATA[CurrentCost]]></category>
		<category><![CDATA[Ecological Intelligence]]></category>
		<category><![CDATA[Energy Awareness]]></category>
		<category><![CDATA[Energy Saving]]></category>
		<category><![CDATA[home automation]]></category>
		<category><![CDATA[home energy monitoring]]></category>
		<category><![CDATA[home energy monitors]]></category>
		<category><![CDATA[HomeCamp]]></category>
		<category><![CDATA[Instrumenting the World]]></category>
		<category><![CDATA[internet of things]]></category>
		<category><![CDATA[mirror worlds]]></category>
		<category><![CDATA[OpenSim]]></category>
		<category><![CDATA[Paticipatory Culture]]></category>
		<category><![CDATA[smart appliances]]></category>
		<category><![CDATA[Smart Devices]]></category>
		<category><![CDATA[Smart Planet]]></category>
		<category><![CDATA[sustainable living]]></category>
		<category><![CDATA[sustainable mobility]]></category>
		<category><![CDATA[Virtual HomeCamp]]></category>
		<category><![CDATA[Virtual Meters]]></category>
		<category><![CDATA[Virtual Worlds]]></category>
		<category><![CDATA[Web Meets World]]></category>
		<category><![CDATA[distributed sustainability]]></category>
		<category><![CDATA[electricity 2.0.]]></category>
		<category><![CDATA[green technology]]></category>
		<category><![CDATA[home energy management]]></category>
		<category><![CDATA[intelligent energy management]]></category>
		<category><![CDATA[living greener]]></category>
		<category><![CDATA[Pachube]]></category>
		<category><![CDATA[sustainable interaction design]]></category>
		<category><![CDATA[TweetaWatt]]></category>

		<guid isPermaLink="false">http://www.ugotrade.com/?p=3423</guid>
<description><![CDATA[HomeCamp is a home hacking, automation and green technology community that will be gathering in London tomorrow, Saturday 25th April 2009, 10am until 6pm BST (GMT + 1), and in an OpenSim event running alongside for virtual participation, to brainstorm new possibilities for distributed sustainability, creative smart meters, monitoring, graphing and visualizing energy usage. More [&#8230;]]]></description>
				<content:encoded><![CDATA[<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/04/picture-31.png"><img class="alignnone size-medium wp-image-3424" title="picture-31" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/04/picture-31-299x300.png" alt="picture-31" width="299" height="300" /></a></p>
<p><a rel="nofollow" href="http://homecamp.org.uk/">HomeCamp</a> is a home hacking, automation and green technology community that will be <a href="http://maps.google.co.uk/maps?f=q&amp;source=s_q&amp;hl=en&amp;geocode=&amp;q=65+-+71+Scrutton+Street,+London,+EC2A+4PJ&amp;sll=51.509912,-0.129361&amp;sspn=0.100214,0.30899&amp;ie=UTF8&amp;ll=51.524379,-0.080895&amp;spn=0.006582,0.019312&amp;z=16&amp;iwloc=addr" target="_blank">gathering in London</a> tomorrow, Saturday 25th April 2009, 10am until 6pm BST (GMT + 1), and in an <a href="http://homecamp.pbwiki.com/Virtual-Home-Camp">OpenSim event running alongside for virtual participation</a>, to brainstorm new possibilities for distributed sustainability, creative smart meters, monitoring, graphing and visulaizing energy usage.</p>
<p class="MsoNormal">More details and videos are on the <a href="http://homecamp.org.uk" target="_blank">blog</a>. <a href="http://homecamp.pbwiki.com/" target="_blank">The wiki, which includes signup</a>, is the main portal to all the online activity.</p>
<p>As James Governor notes <a href="http://www.redmonk.com/jgovernor/2009/04/24/homecamp-returns/" target="_blank">here</a>:</p>
<blockquote><p><span lang="EN-GB">there has been a huge amount of code and applications released focused purely on using technology for home energy monitoring and automation. We have an active google group and quite a few videos and content showcasing the various applications and hardware currently being used by geeks to save money and live greener.</span></p></blockquote>
<p><span lang="EN-GB">Now the challenge is to see how this seedling home energy management movement</span><span lang="EN-GB"> can </span><span lang="EN-GB">really grow into widely adopted distributed sustainability solutions that </span><span lang="EN-GB">everyone can use, and participate in.</span></p>
<p>Both <a href="http://www.yellowpark.net/cdalby/index.php/about/" target="_blank">Chris Dalby</a> (<a href="http://www.yellowpark.net/cdalby/index.php/2009/04/23/homecamp-2-is-this-saturday/" target="_blank">see here</a>), <a href="http://andypiper.wordpress.com/2009/04/24/home-camp-mark-2/" target="_blank">Andy Piper</a>, James Governor of <a href="http://www.redmonk.com/jgovernor/" target="_blank">Monkchips</a> (<a href="http://www.redmonk.com/jgovernor/2009/04/24/homecamp-returns/" target="_blank">see here</a>), and Tom Raftery of <a href="http://greenmonk.net/" target="_blank">GreenMonk</a> (<a href="http://greenmonk.net/homecamp-ii/" target="_blank">see here</a>) have posted on tomorrow&#8217;s <a href="http://homecamp.pbwiki.com/" target="_blank">HomeCamp</a> event. So I am just going to add some quick notes, especially to highlight some of what will be going on virtually for those of you who, like me, can&#8217;t make it to London.</p>
<p>You can tune in either on the live video ustream, or sign up on <a href="http://reactiongrid.com/">ReactionGrid</a> and join the <a href="http://homecamp.pbwiki.com/Virtual-Home-Camp">OpenSim event</a>. Also, you can keep up with what is happening on Twitter via #homecamp. I highly recommend that you catch Tom Raftery&#8217;s talk, which will be streamed from Spain live into the London meeting, the OpenSim event on ReactionGrid, and Ustream. Tom Raftery, a leading green technology analyst at <a href="http://redmonk.com/" target="_blank">RedMonk</a> (<a href="http://greenmonk.net/" target="_blank">see also GreenMonk</a>), will be picking up, in depth, on some themes raised in his brilliant ETech 2009 presentation, <a href="http://en.oreilly.com/et2009/public/schedule/detail/5655" target="_blank">&#8220;Electricity 2.0: Applying the Lessons of the Web to Our Energy Networks.&#8221;</a></p>
<p class="MsoNormal"><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/04/tweetawatt.jpg"><img class="alignnone size-medium wp-image-3425" title="tweetawatt" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/04/tweetawatt-300x162.jpg" alt="tweetawatt" width="300" height="162" /></a></p>
<p class="MsoNormal">There will be homecampers dropping in to virtual HomeCamp in ReactionGrid throughout the day, including <a href="http://blogs.ipona.com/chris/" target="_blank">Chris Hart (the awesome &#8220;girl-geek&#8221; @dstrawberrygirl)</a>, <a href="http://mikethebee.mevio.com/" target="_blank">MiketheBee</a>, and <a href="http://www.cminion.com/wordpress/" target="_blank">Cminion</a>, who has a number of cool projects to demo, including <a href="http://www.cminion.com/wordpress/?p=43" target="_blank">his energy turbines</a>. <a href="http://www.gomaya.com/glyph/" target="_blank">Dave Pentecost</a> (pictured above with his <a href="http://twitter.com/tweetawatt" target="_blank">TweetaWatt</a> and <a href="http://www.pachube.com/" target="_blank">Pachube</a> Orb) and I (<a href="http://docs.google.com/Presentation?id=dhj5mk2g_214g48q37hj" target="_blank">see our presentation for EarthWeek SL here</a>) plan to be at Virtual HomeCamp on ReactionGrid between 9 am and 10:30 am EST. Dave has done a number of cool energy monitoring hacks, including a <a href="http://www.pachube.com/" target="_blank">Pachube</a> link to and from <a href="http://opensimulator.org/wiki/Main_Page" target="_blank">OpenSim</a>.</p>
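<p>For flavor, a TweetaWatt-style power reading can be pushed to a Pachube feed over plain HTTP. The sketch below only builds the request without sending it; the feed ID, datastream name, and API-key header are placeholders reflecting my understanding of Pachube&#8217;s v2 CSV API, not verified details:</p>

```python
# Build (but don't send) a Pachube-style datapoint update for a power
# reading. FEED_ID, the "power" datastream, and the header name are
# assumptions/placeholders, not confirmed API details.
import urllib.request

API_KEY = "YOUR_PACHUBE_API_KEY"   # placeholder credential
FEED_ID = 12345                    # placeholder feed id

def build_update(watts):
    payload = f"power,{watts}".encode()    # CSV: "datastream,current value"
    return urllib.request.Request(
        url=f"http://api.pachube.com/v2/feeds/{FEED_ID}.csv",
        data=payload,
        method="PUT",
        headers={"X-PachubeApiKey": API_KEY},
    )

req = build_update(87.5)   # a watt reading, e.g. from a TweetaWatt
```

<p>Sending the request (e.g. with <code>urllib.request.urlopen(req)</code>) would update the feed; everything downstream &#8211; graphs, OpenSim objects, Twitter bots &#8211; just reads the same feed.</p>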
<p><span class="title">Also keep your eye on Dave&#8217;s blog, <a href="http://www.gomaya.com/glyph/" target="_blank">The Daily Glyph</a>, for what&#8217;s new in distributed sustainability. Dave just posted some great links on sustainable interaction design</span> and work by ITP researchers and others on the sustainable use of technology.</p>
<p><a title="Sustainable Interaction | Main / Papers" href="http://itp.nyu.edu/sustainability/interaction/Main/Papers">Sustainable Interaction | Main / Papers</a></p>
<p><a title="Sustainable interaction design | Sustainable Minds" href="http://www.sustainableminds.com/category/categories/sustainable-interaction-design">Sustainable interaction design | Sustainable Minds</a></p>
<p><a title="Design For the Other 90% | Cooper-Hewitt, National Design Museum" href="http://other90.cooperhewitt.org/">Design For the Other 90% | Cooper-Hewitt, National Design Museum</a></p>
<p class="MsoNormal">If you are in London, look out for Oliver Goh of <a href="http://www.shaspa.com/" target="_blank">Shaspa</a>, who will be at HomeCamp in London. As I mentioned in <a href="http://www.ugotrade.com/2009/04/19/sensor-networks-and-sustainability-connecting-real-virtual-mobile-and-augmented-reality/" target="_blank">my previous post</a>, Oliver will soon be launching both Shaspa community and enterprise hardware and software packages for &#8220;Intelligent Energy Management.&#8221;</p>
<p class="MsoNormal"><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/04/picture-35.png"><img class="alignnone size-medium wp-image-3428" title="picture-35" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/04/picture-35-300x229.png" alt="picture-35" width="300" height="229" /></a></p>
<p>For a bit of HomeCamp history, James Governor (picture below from <a href="http://chinposin.com/home/monkchips" target="_blank">Chinposin</a>) recaps some of the successes of the first HomeCamp <a href="http://www.redmonk.com/jgovernor/2009/04/24/homecamp-returns/" target="_blank">here</a>.</p>
<p>And last but not least, a big thanks to sponsors <a href="http://currentcost.co.uk/">CurrentCost</a>, <a href="http://greenmonk.net/">Greenmonk</a>, <a href="http://www.pachube.com/">Pachube</a>, <a href="http://www.onzo.co.uk/" target="_blank">Onzo</a>, and <a href="http://reactiongrid.com/">ReactionGrid</a>, and media partner <a href="http://theattick.tv/" target="_blank">theattick.tv</a>, who are making the London and virtual HomeCamp events possible.</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/04/picture-33.png"><img class="alignnone size-medium wp-image-3426" title="picture-33" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/04/picture-33-294x300.png" alt="picture-33" width="294" height="300" /></a></p>
]]></content:encoded>
			<wfw:commentRss>http://www.ugotrade.com/2009/04/24/homecamp-2-home-energy-management-and-distributed-sustainability/feed/</wfw:commentRss>
		<slash:comments>2</slash:comments>
		</item>
		<item>
		<title>Sensor Networks and Sustainability: &#8220;Connecting Real, Virtual, Mobile and Augmented Spaces&#8221;</title>
		<link>http://www.ugotrade.com/2009/04/19/sensor-networks-and-sustainability-connecting-real-virtual-mobile-and-augmented-reality/</link>
		<comments>http://www.ugotrade.com/2009/04/19/sensor-networks-and-sustainability-connecting-real-virtual-mobile-and-augmented-reality/#comments</comments>
		<pubDate>Sun, 19 Apr 2009 06:32:59 +0000</pubDate>
		<dc:creator><![CDATA[Tish Shute]]></dc:creator>
				<category><![CDATA[3D internet]]></category>
		<category><![CDATA[Ambient Devices]]></category>
		<category><![CDATA[Ambient Displays]]></category>
		<category><![CDATA[architecture of participation]]></category>
		<category><![CDATA[Augmented Reality]]></category>
		<category><![CDATA[Carbon Footprint Reduction]]></category>
		<category><![CDATA[CurrentCost]]></category>
		<category><![CDATA[Ecological Intelligence]]></category>
		<category><![CDATA[Energy Awareness]]></category>
		<category><![CDATA[Energy Saving]]></category>
		<category><![CDATA[home automation]]></category>
		<category><![CDATA[home energy monitoring]]></category>
		<category><![CDATA[home energy monitors]]></category>
		<category><![CDATA[HomeCamp]]></category>
		<category><![CDATA[Instrumenting the World]]></category>
		<category><![CDATA[internet of things]]></category>
		<category><![CDATA[message brokers and sensors]]></category>
		<category><![CDATA[Mixed Reality]]></category>
		<category><![CDATA[mobile meets social]]></category>
		<category><![CDATA[MQTT and RSMB]]></category>
		<category><![CDATA[new urbanism]]></category>
		<category><![CDATA[OpenSim]]></category>
		<category><![CDATA[Paticipatory Culture]]></category>
		<category><![CDATA[realXtend]]></category>
		<category><![CDATA[smart appliances]]></category>
		<category><![CDATA[Smart Devices]]></category>
		<category><![CDATA[Smart Planet]]></category>
		<category><![CDATA[sustainable living]]></category>
		<category><![CDATA[sustainable mobility]]></category>
		<category><![CDATA[ubiquitous computing]]></category>
		<category><![CDATA[Virtual HomeCamp]]></category>
		<category><![CDATA[Virtual Meters]]></category>
		<category><![CDATA[Virtual Realities]]></category>
		<category><![CDATA[Web Meets World]]></category>
		<category><![CDATA[AMEE]]></category>
		<category><![CDATA[arduino]]></category>
		<category><![CDATA[Carbon Goggles]]></category>
		<category><![CDATA[distributed sustainability]]></category>
		<category><![CDATA[home energy management]]></category>
		<category><![CDATA[open data]]></category>
		<category><![CDATA[Pachube]]></category>
		<category><![CDATA[sensor networks]]></category>
		<category><![CDATA[sensor networks and sustainability]]></category>
		<category><![CDATA[SHASPA]]></category>
		<category><![CDATA[the internet of things]]></category>
		<category><![CDATA[TweetaWatt]]></category>
		<category><![CDATA[Virtual Worlds]]></category>

		<guid isPermaLink="false">http://www.ugotrade.com/?p=3381</guid>
<description><![CDATA[Today, I did a presentation, on connecting real, virtual, mobile, and augmented spaces to support sustainability, for Earth Week SL, with Dave Pentecost and Jim Purbrick, who presented on Carbon Goggles. Dave and I focused on sensor networks, open data, Pachube, OpenSim, and sustainability from the perspective of &#8220;hack local, think global.&#8221; Dave and I will [&#8230;]]]></description>
				<content:encoded><![CDATA[<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/04/picture-21.png"><img class="alignnone size-medium wp-image-3382" title="picture-21" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/04/picture-21-300x225.png" alt="picture-21" width="300" height="225" /></a></p>
<p>Today, I did a presentation, on <a href="http://docs.google.com/Presentation?id=dhj5mk2g_214g48q37hj" target="_blank">connecting real, virtual, mobile, and augmented spaces to support sustainability,</a> for <a href="http://slearthweek.wordpress.com/2009/04/10/earth-week-press-release-see-schedule-also/" target="_blank">Earth Week SL</a>, with <a href="http://www.gomaya.com/glyph/" target="_blank">Dave Pentecost</a> and <a href="http://jimpurbrick.com/" target="_blank">Jim Purbrick</a>, who presented on <a href="http://carbongoggles.org/" target="_blank">Carbon Goggles</a>.</p>
<p>Dave and I focused on sensor networks, open data, <a href="http://www.pachube.com/" target="_blank">Pachube</a>, <a href="http://opensimulator.org/wiki/Main_Page" target="_blank">OpenSim</a>, and sustainability from the perspective of &#8220;hack local, think global.&#8221; Dave and I will be picking up on some of these themes of sensor networks and sustainability next week in our presentation with <a href="http://www.darleon.com/" target="_blank">Dimitri Darras</a> at ITP, NYU, April 24th, 6:30 pm to 8 pm &#8211; <a href="http://itp.nyu.edu/sigs/news/special-event-open-sim/" target="_blank">details here</a>. If you are in New York City, I hope to see you there.</p>
<p>We got some interesting insights into augmented reality from <a href="http://jimpurbrick.com/" target="_blank">Jim Purbrick</a>, whose <a href="http://carbongoggles.org/" target="_blank">Carbon Goggles</a> project prototypes how we can use augmented reality to read carbon identity, combining well-organized, verified data from <a href="http://www.amee.com/" target="_blank">AMEE</a> &#8211; a neutral aggregation platform that measures the &#8220;carbon footprint&#8221; of everything on earth &#8211; with crowd-sourced tagging and linking.</p>
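<p>The Carbon Goggles combination &#8211; verified per-category footprint data joined with crowd-sourced tags that map in-world objects to those categories &#8211; reduces to a simple lookup. The sketch below uses invented numbers and names, not real AMEE data or the actual Carbon Goggles code:</p>

```python
# Invented data illustrating the Carbon Goggles join: a verified
# category -> footprint table (AMEE-like, figures made up) plus
# crowd-sourced tags mapping in-world object ids to categories.

CARBON_KG_PER_YEAR = {"fridge": 300.0, "kettle": 50.0}   # made-up figures
CROWD_TAGS = {"object-42": "fridge"}                      # object id -> category

def carbon_label(object_id):
    """Return the overlay text for an object, or None if it's untagged
    or tagged with a category we have no verified data for."""
    category = CROWD_TAGS.get(object_id)
    if category not in CARBON_KG_PER_YEAR:
        return None
    return f"{category}: {CARBON_KG_PER_YEAR[category]} kg CO2/yr"
```

<p>The crowd supplies the cheap, scalable part (tagging objects), while the hard part (verified footprint numbers) stays centralized &#8211; the same division of labor Ori advocated for AR content in the interview above.</p>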
<h3>Shaspa &#8211; &#8220;the sensor network system that has it all&#8221;</h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/04/picture-22.png"><img class="alignnone size-medium wp-image-3391" title="picture-22" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/04/picture-22-300x224.png" alt="picture-22" width="300" height="224" /></a></p>
<p>We also discussed the recently launched <a href="http://www.shaspa.com/" target="_blank">Shaspa</a>. Shaspa&#8217;s energy management packages connect spaces &#8211; real, virtual, mobile and augmented. Shaspa has been blogged by <a href="http://www.maxping.org/business/real-life/virtual-management-of-energy-consumption-in-the-home.aspx/" target="_blank">Maxping</a> and <a href="http://www.virtualworldsnews.com/2009/04/shaspa-launches-home-energy-organizer-on-opensim.html" target="_blank">Virtual World News</a>, so you can read all about it, but the Shaspa device kit won&#8217;t be available until next week. Some key features of the Home Energy package are listed on the slide above. However, this evening Dave Pentecost and I got a sneak preview of both the Shaspa community and enterprise hardware and software packages from Shaspa founder Oliver Goh. We were pretty impressed.</p>
<p><strong>Dave:</strong> &#8220;<strong>It&#8217;s the ultimate hackable device for energy management!&#8221;</strong></p>
<p><strong>Oliver:</strong> <strong>&#8220;Bring us any sensor device &#8211; with documentation, and within three days we will put a driver into Shaspa.&#8221;</strong></p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/04/daveandoliverpost.jpg"><img class="alignnone size-medium wp-image-3392" title="daveandoliverpost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/04/daveandoliverpost-300x178.jpg" alt="daveandoliverpost" width="300" height="178" /></a></p>
<p>Oliver is on the right and Dave on the left in the picture above. The picture below shows Shaspa in OpenSim. Oliver and I will be attending the <a href="http://www.3dtlc.com/"><span style="color: #810081;">3D Training, Learning and Collaboration</span></a> Conference in Washington, DC, next week.</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/04/picture-23.png"><img class="alignnone size-medium wp-image-3412" title="picture-23" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/04/picture-23-300x208.png" alt="picture-23" width="300" height="208" /></a></p>
<h3>Links</h3>
<p>Here are some of the links that came up in the presentation as many people asked for them to be published. Dave also has them on <a href="http://www.gomaya.com/glyph/archives/002520.html#002520" target="_blank">his blog</a>.</p>
<p>SLIDES on GOOGLE DOCS:<br />
<a title="Earth Week SL Presentation, April 18th, 2009 - Google Docs" href="http://docs.google.com/Presentation?id=dhj5mk2g_214g48q37hj">Earth Week SL Presentation, April 18th, 2009 &#8211; Google Docs</a></p>
<p><a href="http://www.ugotrade.com/2009/01/28/pachube-patching-the-planet-interview-with-usman-haque/" target="_blank">Pachube, sensor networks</a></p>
<p><a href="http://www.gomaya.com/glyph" target="_blank">Dave&#8217;s blog covering Maya archaeology, jungle ecology, and technology</a></p>
<p><a href="http://www.gomaya.com/glyph/archives/001914.html" target="_blank">Maya Frontier, Usumacinta River videos</a></p>
<p><a href="http://en.wikipedia.org/wiki/Collapse_(book)" target="_blank">Collapse</a></p>
<p><a href="http://arduino.cc/" target="_blank">Arduino (microcontrollers)</a></p>
<p><a href="http://community.pachube.com/tutorials" target="_blank">Pachube &#8211; tutorials</a></p>
<p><a href="http://apps.pachube.com/" target="_blank">Pachube Apps</a></p>
<p><a href="http://www.pachube.com/feeds/1284" target="_blank">Arduino-SL-Pachube data site</a></p>
<p><a href="http://www.pachube.com/feeds/1505" target="_blank">SL to Pachube site</a></p>
<p><a href="http://www.zachhoeken.com/connecting-to-the-world" target="_blank">Dave&#8217;s Danger Shield &#8211; Pachube  tutorial</a></p>
<p><a href="http://www.ladyada.net/make/tweetawatt/" target="_blank">TweetaWatt site (LadyAda)</a></p>
<p><a href="http://www.gomaya.com/glyph/archives/002505.html" target="_blank">Dave&#8217;s post on TweetaWatt to Opensim/SL</a></p>
<p><a href="http://peterquirk.wordpress.com/2008/12/22/tutorial-using-the-streamlined-tool-chain-for-importing-sketchup-models-into-realxtend-04/" target="_blank">Peter Quirk&#8217;s post on Importing Sketchup into RealXtend</a></p>
<p><a href="http://opensimulator.org/wiki/Main_Page" target="_blank">Opensim</a></p>
<p><a href="http://www.realxtend.org/" target="_blank">RealXtend</a></p>
<p><a href="http://reactiongrid.com/" target="_blank">ReactionGrid</a></p>
<p><a href="http://homecamp.pbwiki.com/" target="_blank">homecamp</a></p>
<p><a href="http://www.cminion.com/wordpress/" target="_blank">cminion &#8211; wind turbines in OpenSim</a></p>
<p><a href="http://mikethebee.mevio.com/" target="_blank">MiketheBee</a></p>
<p><a href="http://www.ugotrade.com/2009/01/17/is-it-%E2%80%9Comg-finally%E2%80%9D-for-augmented-reality-interview-with-robert-rice/" target="_blank">Is it &#8220;OMG finally&#8221; for Augmented Reality?</a></p>
<p><a href="http://www.ugotrade.com/2008/12/15/smart-planetinterview-with-andy-stanford-clark/" target="_blank">Smart Planet: Interview with Andy Stanford-Clark</a></p>
<p><a href="http://www.orangecone.com/" target="_blank">Orange Cone &#8211; Information Shadows and Things as Services</a></p>
]]></content:encoded>
			<wfw:commentRss>http://www.ugotrade.com/2009/04/19/sensor-networks-and-sustainability-connecting-real-virtual-mobile-and-augmented-reality/feed/</wfw:commentRss>
		<slash:comments>2</slash:comments>
		</item>
		<item>
		<title>Pachube, Patching the Planet: Interview with Usman Haque</title>
		<link>http://www.ugotrade.com/2009/01/28/pachube-patching-the-planet-interview-with-usman-haque/</link>
		<comments>http://www.ugotrade.com/2009/01/28/pachube-patching-the-planet-interview-with-usman-haque/#comments</comments>
		<pubDate>Wed, 28 Jan 2009 16:31:41 +0000</pubDate>
		<dc:creator><![CDATA[Tish Shute]]></dc:creator>
				<category><![CDATA[Ambient Devices]]></category>
		<category><![CDATA[Ambient Displays]]></category>
		<category><![CDATA[architecture of participation]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Augmented Reality]]></category>
		<category><![CDATA[Carbon Footprint Reduction]]></category>
		<category><![CDATA[culture of participation]]></category>
		<category><![CDATA[CurrentCost]]></category>
		<category><![CDATA[Ecological Intelligence]]></category>
		<category><![CDATA[Energy Awareness]]></category>
		<category><![CDATA[Energy Saving]]></category>
		<category><![CDATA[home automation]]></category>
		<category><![CDATA[home energy monitoring]]></category>
		<category><![CDATA[home energy monitors]]></category>
		<category><![CDATA[HomeCamp]]></category>
		<category><![CDATA[Instrumenting the World]]></category>
		<category><![CDATA[message brokers and sensors]]></category>
		<category><![CDATA[mirror worlds]]></category>
		<category><![CDATA[Mixed Reality]]></category>
		<category><![CDATA[MQTT and RSMB]]></category>
		<category><![CDATA[open metaverse]]></category>
		<category><![CDATA[OpenSim]]></category>
		<category><![CDATA[Paticipatory Culture]]></category>
		<category><![CDATA[Second Life]]></category>
		<category><![CDATA[smart appliances]]></category>
		<category><![CDATA[Smart Devices]]></category>
		<category><![CDATA[Smart Planet]]></category>
		<category><![CDATA[sustainable living]]></category>
		<category><![CDATA[sustainable mobility]]></category>
		<category><![CDATA[Virtual HomeCamp]]></category>
		<category><![CDATA[Virtual Meters]]></category>
		<category><![CDATA[Virtual Realities]]></category>
		<category><![CDATA[Web 2.0]]></category>
		<category><![CDATA[Web Meets World]]></category>
		<category><![CDATA[arduino]]></category>
		<category><![CDATA[connecting environments]]></category>
		<category><![CDATA[dynamic environments]]></category>
		<category><![CDATA[electronically assisted plants]]></category>
		<category><![CDATA[Extended Environment Markup Language]]></category>
		<category><![CDATA[Pachube]]></category>
		<category><![CDATA[sensor technology]]></category>
		<category><![CDATA[smart buildings]]></category>
		<category><![CDATA[smart spaces]]></category>
		<category><![CDATA[social networking sensor data]]></category>
		<category><![CDATA[software of space]]></category>
		<category><![CDATA[sustainable real estate]]></category>
		<category><![CDATA[the street as a platform]]></category>
		<category><![CDATA[ubicomp]]></category>

		<guid isPermaLink="false">http://www.ugotrade.com/?p=2686</guid>
		<description><![CDATA[Usman Haque (architect and director, Haque Design + Research) and founder of Pachube pointed me to this image from T.R. Oke&#8217;s book, &#8220;Boundary Layer Climates&#8221; (original photo source Prof. L. E. Mount&#8217;s The Climatic Physiology of the Pig) to explain his approach to the &#8220;software&#8221; of space. My focus as an architect has always been [&#8230;]]]></description>
				<content:encoded><![CDATA[<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/pigletspachubepost.jpg"></a><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/dcfwgkt_8g2dvxgdg_b2.jpg"><img class="alignnone size-full wp-image-2835" title="piglets" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/dcfwgkt_8g2dvxgdg_b2.jpg" alt="piglets" width="614" height="407" /></a></p>
<p>Usman Haque (architect and director, <a id="o.td" title="Haque Design + Research" href="http://www.haque.co.uk/" target="_blank">Haque Design + Research</a>) and founder of <a id="cpbp" title="Pachube" href="http://www.pachube.com/">Pachube</a> pointed me to this image from <a href="http://www.geog.ubc.ca/~toke/Profile.htm" target="_blank">T.R. Oke&#8217;s</a> book, <a href="http://www.amazon.com/Boundary-Layer-Climates-T-Oke/dp/0415043190" target="_blank">&#8220;Boundary Layer Climates&#8221;</a> (original photo source Prof. L. E. Mount&#8217;s <a href="http://www.alibris.com/booksearch?qwork=1137594&amp;matches=1&amp;author=Mount%2C+Laurence+Edward&amp;browse=1&amp;cm_sp=works*listing*title" target="_blank">The Climatic Physiology of the Pig</a>) to explain his approach to the &#8220;software&#8221; of space.</p>
<p><em>My focus as an architect has always been to consider what I&#8217;ve called the &#8220;software&#8221; of space (sounds, smell, light, temperature, electromagnetic fields, social relationships, etc.) rather than the &#8220;hardware&#8221; (floors, walls, roof, etc.) as it has traditionally been considered. The image (above) really sums up why I think this is important.</em></p>
<p><em>It&#8217;s the same piglets, in the same box, but on the right hand side the temperature has been increased. This small change in how the space is &#8220;programmed&#8221; has dramatically changed the way the &#8216;inhabitants&#8217; relate to each other and how they relate to their space. This approach to architecture became my challenge: how to translate such strategies into the general architectural discourse and how to bring into reality such possibilities for the construction industry.</em></p>
<h3>&#8220;Connecting Environments, Patching the Planet&#8221;</h3>
<p>Pachube is the culmination of 12 years of work.</p>
<p><em>&#8220;It is now occupying pretty much all my time and will do for the foreseeable future,&#8221; </em>Usman told me.</p>
<p>Haque Design + Research is not foregrounded on the <a id="q51:" title="Pachube site" href="http://www.pachube.com/" target="_blank">Pachube site</a>, and I did not make the connection at first. But when I followed a small link at the bottom, I was soon delving into the <a id="n4ku" title="work of Usman Haque" href="http://www.haque.co.uk/" target="_blank">work of Usman Haque</a>. Then the penny dropped and I realized that Pachube is not only:</p>
<p><em>A web service that enables people to tag and share real time sensor data from objects, devices and spaces around the world, facilitating interaction between remote environments, both physical and virtual.</em></p>
<p>Pachube is also a really big idea.</p>
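<p>To make the &#8220;web service&#8221; part concrete, here is a minimal Python sketch of pulling a feed over Pachube&#8217;s HTTP API. The endpoint path, the <code>X-PachubeApiKey</code> header and the one-line CSV layout are assumptions based on the public API of the time, and the feed ID and key are placeholders.</p>

```python
import urllib.request

# Placeholder credentials -- substitute your own feed ID and API key.
FEED_ID = 1284
API_KEY = "YOUR_PACHUBE_API_KEY"

def fetch_feed_csv(feed_id: int, api_key: str) -> str:
    """Fetch the current values of a Pachube feed as CSV (assumed endpoint)."""
    url = f"http://www.pachube.com/api/{feed_id}.csv"
    req = urllib.request.Request(url, headers={"X-PachubeApiKey": api_key})
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode("utf-8")

def parse_csv_values(body: str) -> list[float]:
    """Parse the assumed one-line CSV body into one float per datastream."""
    return [float(v) for v in body.strip().split(",")]
```

<p>With a feed exposing, say, a voltage and a current reading, <code>parse_csv_values("230.5,12.0")</code> returns the two datastream values as floats.</p>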
<h3><strong>Ubicomp and the &#8220;Software of Space&#8221;?</strong></h3>
<p>Usman suggested that, if I really wanted to go back to the beginning of the Pachube vision, I should check out the work of Dutch architect Constant Nieuwenhuys and his 1956 proposal for a visionary society, <a id="y-7j" style="font-weight: normal;" title="New Babylon" href="http://www.artfacts.net/index.php/pageType/exhibitionInfo/exhibition/15904" target="_blank">New Babylon</a>.</p>
<p>Usman explained:</p>
<p><em>Constant Nieuwenhuys is certainly an inspiration for Pachube. He envisages a globally connected architecture, built by its inhabitants &#8211; configured, reconfigured, reappropriated&#8230;</em></p>
<p>For a more contemporary reference, Usman noted there are lots of overlapping concepts with <a id="d21o" title="Adam Greenfield (head of design direction for service and user-interface design at Nokia)" href="http://speedbird.wordpress.com/about/" target="_blank">Adam Greenfield&#8217;s work</a>. Adam is head of design direction for service and user-interface design at Nokia. See <a id="spz5" title="The dawning age of ubiquitous computing" href="http://www.amazon.com/exec/obidos/ASIN/0321384016/v2organisa/" target="_blank">Everyware: The Dawning Age of Ubiquitous Computing</a> and <a href="http://www.lulu.com/content/1554599">Urban Computing and its Discontents</a> to understand more about the vision Adam Greenfield has been developing.</p>
<p>Pachube is right in the zone with the ideas outlined in <a id="pxeu" title="The project description for Adam Greenfield's upcoming book, The City Is Here For You To Use" href="http://speedbird.wordpress.com/2008/01/01/new-day-rising/" target="_blank">the project description</a> for Adam Greenfield&#8217;s upcoming book, <a id="pxeu" title="The project description for Adam Greenfield's upcoming book, The City Is Here For You To Use" href="http://speedbird.wordpress.com/2008/01/01/new-day-rising/" target="_blank">The City Is Here For You To Use</a>:</p>
<p><em>The City&#8230; takes everything explored in Everyware as a given, and a point of departure.</em></p>
<p><em>It assumes that emergent technologies like RFID, mesh networking and shape-memory actuators&#8230; will simply be part of how cities will be made from now on&#8230;</em></p>
<h3 style="text-align: left;">The Pachube Team</h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/pachubeteamfull.jpg"><img class="alignnone size-full wp-image-2764" title="pachubeteamfull" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/pachubeteamfull.jpg" alt="pachubeteamfull" width="480" height="344" /></a></p>
<p>The Pachube Team &#8211; Usman Haque (creative director), Chris Leung (EEML developer), photoshopped into the laptop: Chris Burman (&#8220;example-maker&#8221;, e.g. SL code and Google SketchUp plugin), Ai Hasegawa (graphic designer), Sam Mulube (technical producer and website development).</p>
<p>Also, with Bruce Sterling as a &#8220;visionary&#8221; adviser and other luminaries involved, Pachube has some brilliant guiding lights. Usman pointed out that many people <em>&#8220;have helped, prodded, nudged and advised along the way!&#8221;</em></p>
<div><em>Gavin Starks and also Dopplr&#8217;s Matt Biddulph have been sort of &#8220;friendly neighbours&#8221; to Pachube: they&#8217;ve made some great introductions and I turn to them often for advice on being a London start-up. What&#8217;s been really useful for me is that they are active in a related area and have directly useful advice: Gavin, of course, since he&#8217;s involved in metering the world&#8217;s energy; and Matt perhaps less tangibly in his day job as Dopplr&#8217;s CTO but more so in his active Arduino-enabled social life!</em></div>
<div><em>One very important Pachube advisor has been Dr. Paul Pangaro, who has previously been CTO at a number of technology startups, and brings vital experience from his time at Sun Microsystems as Senior Director and Distinguished Market Strategist. Oh, and he&#8217;s also a former student and collaborator of Gordon Pask&#8217;s! He has been very helpful in developing a viable business model in conjunction with my brother Yusuf Haque, who, with his experience in raising capital for startups, has led the fundraising process.</em></div>
<div><em>Of course, direct daily input from the Pachube team has been vital to the development of the project, and without Chris Leung (EEML development) and Sam Mulube (backend development) it would be a very different thing indeed!</em></div>
<div>
<h3>Pachube is not just a social networking project for sensor data.</h3>
<p>Pachube evolved out of three strands of thought:</p>
<p><em>1) the geographical non-specificity of architecture these days as people live their lives in constant connection with people in remote spaces </em></p>
<p><em>2) a desire to open up the production process of &#8220;smart homes&#8221; in reaction to current trends for placing the design and construction process solely in the hands of knowledgeable others.</em></p>
<p><em>3) an emphasis on contextually specific &#8220;environments&#8221; rather than object-centric &#8220;sensors&#8221;</em></p>
<p>Sensor/actuator integrations are a part of what Pachube is about (also see Peter Quirk&#8217;s in-depth post on <a id="ai70" title="the strong connection between virtual worlds and sensor networks" href="http://peterquirk.wordpress.com/2009/01/21/sensor-networks-and-virtual-worlds/" target="_blank">the strong connection between virtual worlds and sensor networks</a>), and an interest in home automation and energy management is giving a lot of early momentum to Pachube.</p>
<p>But Usman makes clear Pachube is about &#8220;environments&#8221; rather than &#8220;sensors.&#8221; &#8220;An &#8216;environment&#8217; has dynamic frames of reference, all of which are excluded when simply focusing on devices, objects or mere sensors&#8221; (Usman explains this in depth in the interview below). A central part of Pachube is the development of the <a id="f0b2" title="Extended Environments Markup Language." href="http://www.eeml.org/" target="_blank">Extended Environments Markup Language</a>.</p>
<h3>Extended Environment Markup Language</h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/eeml.jpg"><img class="alignnone size-full wp-image-2765" title="eeml diagram" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/eeml.jpg" alt="eeml diagram" width="520" height="159" /></a></p>
<p><em>Pachube came about as a direct attempt to enable the production of dynamic, responsive, conversant &#8216;environments&#8217;. </em></p>
<p><em>The <a id="gv6y" style="color: #551a8b;" title="Extended Environments Markup Language (EEML)" href="http://www.eeml.org/" target="_blank">Extended Environments Markup Language (EEML)</a> (which is the protocol around which much of Pachube is based) is being developed to make the idea of &#8220;dynamic, responsive and conversant environments&#8221; a reality. It works with existing construction standards like <a id="l7sl" style="color: #551a8b;" title="Industry Foundation Classes (IFC)" href="http://en.wikipedia.org/wiki/Industry_Foundation_Classes" target="_blank">Industry Foundation Classes (IFCs)</a>, but exists to extend them to account for dynamic, responsive and, dare I say it, conversant buildings. </em></p>
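<p>To give a feel for the format, here is a toy EEML document and a Python sketch that reads it with the standard library. The namespace URI and element names follow the EEML schema published at eeml.org; treat the exact details as assumptions rather than a definitive reference.</p>

```python
import xml.etree.ElementTree as ET

# A minimal EEML "environment" with a single datastream (illustrative only).
EEML_SAMPLE = """\
<eeml xmlns="http://www.eeml.org/xsd/005">
  <environment>
    <title>Office</title>
    <data id="0">
      <tag>temperature</tag>
      <value>22.5</value>
    </data>
  </environment>
</eeml>"""

NS = {"eeml": "http://www.eeml.org/xsd/005"}

def read_datastreams(doc: str) -> dict[str, float]:
    """Map each datastream id in an EEML document to its current value."""
    root = ET.fromstring(doc)
    return {
        data.get("id"): float(data.find("eeml:value", NS).text)
        for data in root.findall(".//eeml:data", NS)
    }
```

<p>Note that the unit of description is the <em>environment</em>, with datastreams nested inside it &#8211; which is exactly the environment-centric (rather than sensor-centric) emphasis Usman describes.</p>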
<p>A key member of the Pachube team doing EEML development is <a id="h3n5" title="Chris Leung" href="http://www.chrisleung.org/" target="_blank">Chris Leung</a>. Haque Design + Research is the industry sponsor of Chris&#8217;s doctorate, which:</p>
<p><em>investigates how Architectural and Engineering consultancies can use advanced imaging, sensing and visualisation technology to capture, record and playback the responsive behaviour of built Architecture in response to its environment as a decision-support tool to meet this unique challenge.</em></p>
<p><strong><a href="http://www.chrisleung.org/CaseStudy1.htm">Case-Study I &#8211; Kielder Forest</a></strong></p>
<p><em><strong><img class="alignnone size-medium wp-image-2707" title="kielderforest" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/kielderforest-300x225.jpg" alt="kielderforest" width="300" height="225" /></strong></em></p>
<p>Usman explained to me that the full vision for Pachube is not yet fleshed out on the web site (so read the full interview!), and this is in part because the focus has been on building a backend capable of handling millions of users.</p>
<h3>The business model for Pachube</h3>
<p>Usman explained his commitment to an ethically driven business model to allow a diverse group of companies and individuals to transition to the internet of things. Usman emphasizes that one of his chief concerns is to make sure that these technologies of &#8220;extreme connectivity,&#8221; which will soon be part of every aspect of our lives, are in the hands of all who want to use them.</p>
<p><em>Pachube is here to make it easier to participate in what I expect to be a vast &#8216;eco-system&#8217; of conversant devices, buildings &amp; environments.</em></p>
<p><em>Pachube will facilitate the development of a huge range of new products and services that will arise from extreme connectivity. It&#8217;s relatively easy for large technology companies like Nike and Apple to transition into the Internet of Things, but Pachube will be particularly helpful for that huge portion of smaller scale industry players that *want* to become part of it, but which are only now waking up to the potentials of the internet &#8212; small and medium scale designers, manufacturers and developers who are very good at developing their products but don&#8217;t have the resources to develop in-house a massive infrastructure for their newly web-enabled offerings. </em></p>
<p><em>Basically, having built a generalized data-brokering backend to connect physical (and virtual) entities to the web, others can now start to build the applications that make the connections really useful. </em></p>
<h3>An Inspired Community of Early Adopters and Business Visionaries</h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/monkchipsathomecamp1.jpg"><img class="alignnone size-full wp-image-2766" title="monkchipsathomecamp1" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/monkchipsathomecamp1.jpg" alt="monkchipsathomecamp1" width="462" height="308" /></a></p>
<p>James Governor (<a id="qd8i" title="@monkchips" href="http://twitter.com/monkchips" target="_blank">@monkchips</a>) of <a href="http://redmonk.com/">Redmonk</a> has Pachube, <a href="http://currentcost.co.uk/">Current Cost</a>, <a id="g.i:" title="using MQTT" href="http://mqtt.org/" target="_blank">MQTT</a> and RSMB (<a id="h0is" title="IBM AlphaWorks" href="http://alphaworks.ibm.com/tech/rsmb" target="_blank">IBM AlphaWorks</a>), and <a href="http://www.arduino.cc/" target="_blank">Arduino</a> on the board at <a id="h4a0" title="HomeCamp '08" href="http://homecamp.pbwiki.com/homecamp08" target="_blank">HomeCamp &#8217;08</a>. Photo from the <a href="http://www.flickr.com/photos/tags/homecamp08/" target="_blank">Flickr</a> <a href="http://www.flickr.com/search/?q=homecamp&amp;w=29034542%40N00" target="_blank">stream</a> of <a href="http://benjaminellis.co.uk/" target="_blank">Benjamin Ellis</a>.</p>
<p>What attracted my attention to Pachube, at first, was the small but highly energized community of early adopters I noticed experimenting with Pachube. <a id="x2vv" title="Nigel Crawley" href="http://www.nigelcrawley.co.uk/" target="_blank">Nigel Crawley</a> (<a id="nf4y" title="@ni" href="http://twitter.com/ni" target="_blank">@ni</a>) and <a id="zjcv" title="James Taylor" href="http://jtlog.wordpress.com/" target="_blank">James Taylor</a> (<a id="ie4m" title="@jtonline" href="http://twitter.com/jtonline" target="_blank">@jtonline</a>) were some of the first to plunge in. <a id="o0.i" title="Rick Bullotta" href="http://www.automation.com/content/wonderware-appoints-rick-bullotta-vp-and-cto" target="_blank">Rick Bullotta</a>, Usman noted, has been very active in the community forum bringing much-needed automation expertise to the conversation. <a id="ny-t" title="Pam Broviak" href="http://www.publicworksgroup.com/" target="_blank">Pam Broviak</a> (<a id="xkmo" title="@pbroviak" href="http://twitter.com/pbroviak" target="_blank">@pbroviak</a>) is an early Second Life adopter. And <a id="ugu0" title="Matt Biddulph" href="http://www.hackdiary.com/about/" target="_blank">Matt Biddulph</a> (CTO of <a href="http://www.dopplr.com/">Dopplr</a>) was the first non-Pachube person to get a feed up!</p>
<p>A very active early adopter, <a id="q54j" title="Carl Johan Rosen" href="http://carljohanrosen.com/" target="_blank">Carl Johan Rosen</a>, wrote an <a href="http://www.openframeworks.cc/" target="_blank">openFrameworks</a> addon (<a id="ljuh" title="for more see here" href="http://carljohanrosen.com/?p=42" target="_blank">see here</a>) for <a href="http://www.pachube.com/" target="_blank">Pachube</a> that he presented at the <a href="http://www.aec.at/en/festival2008/program/project.asp?parent=14439&amp;iProjectID=14447" target="_blank">OFLab at Ars Electronica Festival</a>.</p>
<p>After the inaugural <a id="h4a0" title="HomeCamp '08" href="http://homecamp.pbwiki.com/homecamp08" target="_blank">HomeCamp</a>, where Usman and Chris Burman from Pachube were presenters (<a id="diae" title="see slides here" href="http://www.slideshare.net/tag/pachube" target="_blank">see slides here</a>), I began to notice that people were sending their Current Cost feeds into Pachube. And recently, it was announced that Pachube has a <a href="http://apps.pachube.com/carbon_footprint.php" target="_new">carbon footprint calculation app</a> which:</p>
<p><em>makes it very easy to take any Pachube feed that measures electricity consumption in watts or kilowatts and convert it into a Pachube feed that shows a realtime estimated carbon footprint for the last 15 minutes, the last hour and the last 24 hours.</em></p>
<p><em>The app makes use of international data provided by <a href="http://www.amee.cc/" target="_new">&#8216;AMEE &#8211; The world&#8217;s energy meter&#8217;</a>. AMEE provides figures that are specific to electricity suppliers in UK &amp; Ireland and specific to country in the rest of the world.</em></p>
<p><em>This app, combined with the <a href="http://community.pachube.com/?q=node/100">Current Cost app</a> makes it simple to monitor your carbon footprint on a day to day basis!</em></p>
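<p>The conversion such an app performs can be sketched in a few lines of Python: integrate the power readings over the sampling window, then multiply by a grid carbon-intensity factor. The factor below is purely illustrative &#8211; the real app looks up supplier- and country-specific figures via AMEE.</p>

```python
def carbon_footprint_kg(samples_watts: list[float], interval_s: float,
                        kg_co2_per_kwh: float = 0.5) -> float:
    """Estimate kg of CO2 for power readings taken at a fixed interval.

    kg_co2_per_kwh is an illustrative grid-intensity factor, not an AMEE figure.
    """
    # Watts * seconds -> watt-hours -> kilowatt-hours, then apply the factor.
    energy_kwh = sum(samples_watts) * interval_s / 3600.0 / 1000.0
    return energy_kwh * kg_co2_per_kwh
```

<p>For example, a single 1000&#160;W reading held for an hour is 1&#160;kWh, i.e. 0.5&#160;kg of CO2 at the illustrative factor.</p>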
<p>I still haven&#8217;t found out what <a id="kmt8" title="@yellowpark" href="http://twitter.com/yellowpark" target="_blank">@yellowpark</a> was doing last Saturday to produce so much CO2&#8230;? (The perils of going public with your energy consumption, as <a id="am8t" title="@epachube" href="http://twitter.com/pachube" target="_blank">@epachube</a> pointed out.)</p>
<p>But perhaps Chris Dalby <a id="kmt8" title="@yellowpark" href="http://twitter.com/yellowpark" target="_blank">(@yellowpark</a>) can be excused a day of CO2 excess as he has just released <a id="qf:l" title="Pachube Air" href="http://www.yellowpark.net/cdalby/index.php/2009/01/10/pachube-air-the-first-release/" target="_blank">Pachube Air</a>.</p>
<p>While enterprise and government projects are on the near horizon, Pachube is designed to introduce a DIY approach to ubicomp. Usman said he is &#8220;concerned by developments in ubiquitous computing whereby &#8216;making technology invisible&#8217; equates to placing the design and construction process solely in the hands of knowledgeable others.&#8221;</p>
<p>DIY City (see the <a id="zwms" title="Do-It-Yourself-City Project" href="http://diycity.org/diycity-main-group/call-work-first-diycity-project" target="_blank">Do-It-Yourself-City Project</a>) is developing a similar vision here in NYC.</p>
<h3>Natural Fuse: &#8220;A city wide network of electronically-assisted plants.&#8221;</h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/naturalfusenetwork1.jpg"><img class="alignnone size-full wp-image-2779" title="naturalfusenetwork1" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/naturalfusenetwork1.jpg" alt="naturalfusenetwork1" width="405" height="305" /></a></p>
<p><em>&#8220;I think we&#8217;ve really not even begun to imagine the kinds of applications that will be important,&#8221;</em> says Usman Haque.</p>
<p>Haque Design + Research, which continues with a separate team, will be involved mostly in the kinds of things it has done in the past, but it is <em>&#8220;also pushing development of things that *use* Pachube,&#8221;</em> such as the project Natural Fuse, by Usman Haque, <a id="y5x7" title="Nitipak Samsen (Designer)" href="http://www.dotmancando.info/" target="_blank">Nitipak Samsen (Designer)</a>, <a id="d.p2" title="Cesar Harada (Designer)" href="http://www.cesarharada.com/" target="_blank">Cesar Harada (Designer)</a> and Barbara Jasinowicz (Producer), which was commissioned by <a href="http://www.archleague.org/index-dynamic.php?show=757" target="_new">the Architecture League</a> &amp; <a href="http://www.situatedtechnologies.net/?q=node/89" target="_new">Situated Technologies: Toward the Sentient City</a> and will open to the public in Autumn 2009.</p>
<p><em>Natural Fuse harnesses the carbon-sinking capabilities of plants to create a city-wide network of electronically-assisted plants that act both as energy providers and as shared &#8220;carbon sink&#8221; circuit breakers. By sharing resources and information between the plants, energy expenditure can be collectively monitored and managed.</em></p>
<p><em>The purpose is to create a collective &#8220;carbon sink&#8221; that offsets the amount of energy consumed by the plant owners &#8211; a natural &#8220;circuit breaker&#8221;. If people cooperate on their energy expenditure then the plants thrive (and they can all use more energy); but if they don&#8217;t then the network starts to kill plants, thus diminishing the network&#8217;s energy capacity</em> (a full description of Natural Fuse is in the interview below).</p>
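<p>The cooperative &#8220;fuse&#8221; logic described above can be sketched as a toy round-based model in Python. All numbers and the kill rule here are illustrative, not a description of the actual installation:</p>

```python
def step_network(num_plants: int, consumption_kg: float,
                 sink_per_plant_kg: float = 1.0) -> int:
    """One round of the collective circuit breaker: plants alive afterwards.

    If the round's carbon spend fits within the collective sink, everyone
    survives; otherwise enough plants die to cover the deficit.
    """
    capacity = num_plants * sink_per_plant_kg
    if consumption_kg <= capacity:
        return num_plants  # cooperation: spend stays within the shared sink
    deficit = consumption_kg - capacity
    killed = int(-(-deficit // sink_per_plant_kg))  # ceiling division
    return max(0, num_plants - killed)
```

<p>Ten plants comfortably absorb a 5&#160;kg round, but a 12.5&#160;kg round kills three of them, shrinking the next round&#8217;s capacity &#8211; the runaway dynamic Usman describes.</p>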
<h3>The Street As Platform</h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/streetasaplatform1.jpg"><img class="alignnone size-full wp-image-2780" title="streetasaplatform1" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/streetasaplatform1.jpg" alt="streetasaplatform1" width="450" height="301" /></a></p>
<p>Image courtesy of <a id="k0g3" title="Timo Arnall" href="http://www.elasticspace.com/" target="_blank">Timo Arnall</a>, who is an awesome photographer and mover and shaker in ubicomp. <em>&#8220;The way the street feels may soon be defined by what cannot be seen with the naked eye,&#8221;</em> writes Dan Hill in his post <a href="http://www.cityofsound.com/blog/2008/02/the-street-as-p.html" target="_blank">&#8220;The Street as Platform.&#8221;</a> Usman comments on Dan Hill&#8217;s other &#8220;must read&#8221; post:</p>
<p><em><a id="doow" title="&quot;the personal well-tempered environment,&quot;" href="http://www.cityofsound.com/blog/2008/01/the-personal-we.html" target="_blank">&#8220;The Personal Well-Tempered Environment&#8221;</a> is full of &#8220;fascinating propositions&#8230; &#8230;they&#8217;re relevant to things I&#8217;m interested in&#8230;</em></p>
<p>In a summary of his ideas on the personal well-tempered environment, Dan Hill writes:</p>
<p><em>A real-time dashboard for buildings, neighbourhoods, and the city, focused on conveying the energy flow in and out of spaces, centred around the behaviour of individuals and groups within buildings.</em></p>
<p><em>A form of &#8216;BIM 2.0&#8242; that gives users of buildings both the real-time and longitudinal information they need to change their behaviour and thus use buildings, and energy, more effectively. An ongoing post-occupancy evaluation for the building, the neighbourhood and the city.</em></p>
<p><em>A software service layer for connecting things together within and across buildings.</em></p>
<p><em>As information increasingly becomes thought of as a material within buildings, it makes sense to consider it holistically as part of the built fabric, like glass, steel, ETFE etc.</em></p>
<h3>Interview With Usman Haque</h3>
<p><strong>Tish Shute:</strong> You have been involved in many awesome projects but Pachube seems to be quite a new direction. What are the key influences in your career and the development of your thinking? And could you tell me more about how your previous work brought you to creating Pachube? Is Pachube a central focus for you and Haque Design now?</p>
<p><strong>Usman Haque:</strong><em> To me Pachube is the logical culmination of everything I&#8217;ve worked on for the last 12 years since finishing my post-grad architecture studies.</em></p>
<p><em>A lot of my work until now has centered around large-scale mass-collaboration interactive &#8220;spectacles&#8221; involving many thousands of members of the public at once. I found this a good medium in which (a) to explore strategies for collaboration that take account of the granularity of participation (i.e. the fact that different people have different interests, skills and intentions in any participative act); and (b) to work at an urban scale; i.e. in a way that has an effect at the scale of buildings, parks, and streetscapes etc.</em></p>
<p><em> <a id="kr8h" title="Open Burble" href="http://www.haque.co.uk/openburble.php" target="_blank">Open Burble</a> was a good example of this approach: essentially a framework, composed of 2m carbon-fibre modules, it had electronics embedded in 1000 helium balloons. Members of the public could configure and assemble these, inflate them and then unfurl the complex structure up to the scale of a 15-storey building. Finally, by shaking, rowing, twisting and bending a handlebar embedded with sensors (the same as in the Wii controller as it happens), dozens of people at once could have an effect on the Burble&#8217;s position and the colours streaming through it.</em></p>
<p><em><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/openburble2.jpg"><img class="alignnone size-full wp-image-2832" title="openburble2" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/openburble2.jpg" alt="openburble2" width="509" height="338" /></a><br />
</em></p>
<p><a href="http://www.haque.co.uk/openburble.php" target="_blank">Open Burble, Singapore Biennale 2006</a></p>
<p><em>Along the way I became interested at times in what an &#8220;operating system&#8221; might mean in the context of architecture (see the paper <a id="cxpf" title="Hardspace, Softspace and the possibilities of open source architecture, 2002" href="http://www.haque.co.uk/papers/hardsp-softsp-open-so-arch.PDF" target="_blank">Hardspace, Softspace and the possibilities of open source architecture, 2002 (PDF)</a>), particularly an &#8220;open source&#8221; operating system (Urban Versioning System, <a id="yvjc" title="http://uvs.propositions.org.uk/" href="http://uvs.propositions.org.uk/" target="_blank">http://uvs.propositions.org.uk/</a>). I was also interested in developing tools for supposedly &#8220;non-technical&#8221; people to start building their own interactive systems or environments, hence the release of <a id="zv:-" title="The &quot;Low Tech Sensors &amp; Actuators for Artists and Architects&quot;" href="http://lowtech.propositions.org.uk/" target="_blank">The &#8220;Low Tech Sensors &amp; Actuators for Artists and Architects&#8221;</a> pamphlet, co-authored with an old friend, <a id="w-ad" title="Adam Somlai-Fischer" href="http://www.aether.hu/" target="_blank">Adam Somlai-Fischer</a>, back in 2005.</em></p>
<p><em>An off-shoot of this has been an obsession with <a id="ahue" title="trying to rescue the concept of &quot;interaction&quot;" href="http://mags.acm.org/interactions/20090102/?pg=71" target="_blank">trying to rescue the concept of &#8220;interaction&#8221;</a> from oblivion &#8211; I say oblivion because I think the really exciting possibilities of the concept of interaction are being lost because we&#8217;re being sold a billion so-called &#8220;interactive&#8221; devices and gadgets that are, in fact, merely &#8220;reactive&#8221;. In this, <a id="t5h7" title="I turn often to the work of cybernetician Gordon Pask" href="http://www.haque.co.uk/papers/architectural_relevance_of_gordon_pask.pdf" target="_blank">I turn often to the work of cybernetician Gordon Pask</a>, who was particularly active in the 50s, 60s and 70s in the development of truly interactive systems (and was also a collaborator with <a id="gt4p" title="Cedric Price" href="http://en.wikipedia.org/wiki/Cedric_Price" target="_blank">Cedric Price</a>, one of my favourite architects).</em></p>
<p><em>Which brings me to Pachube, which is now occupying pretty much all my time and will do for the foreseeable future. (<a id="qdfj" title="Haque Design + Research" href="http://www.haque.co.uk/" target="_blank">Haque Design + Research</a> still continues, and has a separate team &#8212; it will be involved mostly in the kinds of things it has done in the past, but also in pushing development of things that *use* Pachube, such as the project <a id="h:9w" title="Natural Fuse" href="http://www.haque.co.uk/naturalfuse.php" target="_blank">Natural Fuse</a>).</em></p>
<p><em>Pachube came about as a direct attempt to enable the production of dynamic, responsive, conversant &#8216;environments&#8217;. It basically evolved out of three strands of thought.</em></p>
<p><em>The first was the notion of the <strong>geographical non-specificity of architecture</strong> these days. By this I mean that, for many of us now, &#8220;home&#8221; is an idea constructed from several places &#8211; we live and work in environments composited by networked technology from fragments that bridge huge geographical distances. These environments are resolutely &#8220;human&#8221; (in the sense of being inhabited, designed and determined by people) yet context-free (because they do not privilege geographical location). I wanted to find a way to &#8220;connect&#8221; up remote spaces, much like <a id="ubie" title="Remote Home" href="http://www.tobi.net/remotehome/remotehome.htm" target="_blank">Remote Home</a> and a whole range of other projects had done, but in a generalized way so that it would be possible to keep adding to the ecosystem of connected environments on an ad hoc basis; a global architecture if you will.</em></p>
<p><em>The second strand of thought came from the <strong>desire to open up the production process of &#8220;smart homes.&#8221;</strong> I&#8217;m concerned by developments in ubiquitous computing whereby &#8220;making technology invisible&#8221; equates to placing the design and construction process solely in the hands of knowledgeable others. Whereas it&#8217;s still possible more or less to do DIY on your home, if many ubicomp technologists had their way it would become less and less possible, simply because of the complexity of reverse-engineering such closed systems. It&#8217;s already a problem with larger buildings: service companies go out of business, proprietary skills or tools disappear and complex lighting and sensor systems remain unused. So, with Pachube I wanted to help foster a more open way of developing the discipline: to embrace the concept of the maker, and to help people negotiate their technological future.</em></p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/reconfigurablehouse.jpg"><img class="alignnone size-full wp-image-2781" title="reconfigurablehouse" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/reconfigurablehouse.jpg" alt="reconfigurablehouse" width="419" height="107" /></a></p>
<p><em><a id="ex31" title="Reconfigurable House" href="http://haque.co.uk/reconfigurablehouse.php" target="_blank">Reconfigurable House</a>, an environment constructed from thousands of low tech components that can be &#8220;reconfigured&#8221; by its occupants.</em></p>
<p><em>The final strand of thought relates to Pachube&#8217;s emphasis on <strong>&#8220;environments&#8221; rather than &#8220;sensors.&#8221; </strong>I believe that one of the major failings of the usual ubicomp approach is to consider the connectivity and technology at the object-level, rather than at the environment-level. It&#8217;s built into much of contemporary Western culture to be object-centric, but at the level of &#8220;environment&#8221; we talk more about context, about disposition and subjective experience. An &#8216;environment&#8217; has dynamic frames of reference, all of which are excluded when simply focusing on devices, objects or mere sensors. If one really studies deeply what an &#8216;environment&#8217; is (by this I mean more than simply saying that &#8220;it&#8217;s what things exist in&#8221;), one begins to understand that an environment is a construction process and not a medium; nor is it a state or an entity. In this I would refer to Gordon Pask&#8217;s phenomenally important text &#8220;Aspects of Machine Intelligence&#8221; in Nicholas Negroponte&#8217;s <a id="hlcg" title="Soft Architecture Machine" href="http://www.amazon.com/Soft-Architecture-Machines-Nicholas-Negroponte/dp/0262140187" target="_blank">Soft Architecture Machine</a>, though it makes for extremely tough reading (Negroponte compared it in importance to Alan Turing&#8217;s contributions to the computer science discipline).</em></p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/pachube1.jpg"><img class="alignnone size-full wp-image-2782" title="pachube1" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/pachube1.jpg" alt="pachube1" width="411" height="275" /></a></p>
<p><em>Ultimately, though, Pachube is here to make it easier to participate in what I expect to be a vast &#8216;eco-system&#8217; of conversant devices, buildings &amp; virtual environments. Pachube will facilitate the development of a huge range of new products and services that will arise from extreme connectivity. It&#8217;s relatively easy for large technology companies like <a id="ps11" title="Nike and Apple" href="http://www.apple.com/ipod/nike/" target="_blank">Nike and Apple</a> to transition into the Internet of Things, but Pachube will be particularly helpful for that huge portion of smaller scale industry players that *want* to become part of it, but which are only now waking up to the potentials of the internet &#8212; small and medium scale designers, manufacturers and developers who are very good at developing their products but don&#8217;t have the resources to develop in-house a massive infrastructure for their newly web-enabled offerings. Basically, now that we have built a generalized data-brokering backend to connect physical (and virtual) entities to the web, others can start to build the applications that make the connections really useful.</em></p>
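<p>The &#8220;generalized data-brokering backend&#8221; idea can be pictured as a hub that input apps push readings into and output apps read back out of. The sketch below is purely illustrative: the class, method and feed names are invented for this example and are not Pachube&#8217;s actual architecture or API.</p>

```python
# Toy sketch of a generalized data broker: input apps push readings
# into named feeds; output apps read them back. Illustrative only,
# not Pachube's real API (names here are invented).
from collections import defaultdict

class DataBroker:
    """Minimal hub connecting data producers to data consumers."""

    def __init__(self):
        # feed_id -> ordered list of (tag, value) readings
        self.feeds = defaultdict(list)

    def update(self, feed_id, tag, value):
        """An input app (e.g. an electricity meter) pushes a reading."""
        self.feeds[feed_id].append((tag, value))

    def latest(self, feed_id, tag):
        """An output app (e.g. a phone viewer) pulls the newest value."""
        for t, v in reversed(self.feeds[feed_id]):
            if t == tag:
                return v
        return None

broker = DataBroker()
broker.update("home-meter", "watts", 350)
broker.update("home-meter", "watts", 420)
print(broker.latest("home-meter", "watts"))  # -> 420
```

<p>The point of the hub shape is that neither side needs to know about the other: each new input or output app can combine with all the existing ones.</p>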
<p><strong>Tish Shute:</strong> You mentioned that both Bruce Sterling and Gavin Starks (AMEE) have given input on Pachube. Can you describe any specific ways they (and others?) have influenced the evolution of Pachube? You mentioned the concept of &#8220;engaged responsible spime wrangling&#8221; when we talked on Skype?</p>
<p><strong>Usman Haque:</strong> <em>Yes, I am very grateful to a whole bunch of people who have helped, prodded, nudged and advised along the way!</em></p>
<p><em>I asked Bruce to be a &#8220;visionary&#8221; adviser because he was one of the people early on to envisage the concepts and ramifications of <a id="v5w3" title="&quot;spimes&quot;Â Â (his neologism for 'space-time objects')" href="http://www.boingboing.net/images/blobjects.htm" target="_blank">&#8220;spimes&#8221; (his neologism for &#8216;space-time objects&#8217;)</a>. While I agree that &#8220;spimes&#8221; are directly relevant, what I found most important from his conception was the concept of &#8220;wrangling&#8221; &#8211; being actively and productively engaged and responsible in the development of spimed environments. I think it was a crucial leap: to talk about &#8220;wranglers&#8221; rather than &#8220;end-users&#8221;. So the kinds of questions I&#8217;ve turned to him for regard how to nudge people away from being &#8220;end users&#8221; and towards being &#8220;wranglers&#8221;; and about how to transition from being a &#8220;hacker toy&#8221; to &#8220;major infrastructure&#8221;. He had some great (and invaluable) responses, of which one of the most important to me was something he said in email: &#8220;&#8230;I think total openness is fatal. It&#8217;s like lying in a blazing sun under a sky full of vultures, naked. It&#8217;s also rather rude, like babbling anything or anything that flies into your head and still expecting people to pay attention.&#8221;</em></p>
<p><em><a id="qrs7" title="Gavin Starks" href="http://www.amee.cc/" target="_blank">Gavin Starks</a> and also <a id="bbd." title="Dopplr's" href="http://www.dopplr.com/" target="_blank">Dopplr&#8217;s</a> <a id="aqy:" title="Matt Biddulph" href="http://www.hackdiary.com/" target="_blank">Matt Biddulph</a> have been sort of &#8220;friendly neighbours&#8221; to Pachube: they&#8217;ve made some great introductions and I turn to them often for advice on being a London start-up. What&#8217;s been really useful for me is that they are active in a related area and have directly useful advice: Gavin, of course, since he&#8217;s involved in <a id="lzoi" title="metering the world's energy" href="http://www.amee.cc/" target="_blank">metering the world&#8217;s energy</a>; and Matt perhaps less tangibly in his day job as Dopplr&#8217;s CTO but more so in his <a id="jav_" title="active Arduino-enabled social life" href="http://tinker.it/now/2009/01/20/toy-hacking-workshop-09/" target="_blank">active Arduino-enabled social life</a>!</em></p>
<p><em>One very important Pachube advisor has been <a id="qjz0" title="Dr. Paul Pangaro" href="http://www.pangaro.com/" target="_blank">Dr. Paul Pangaro</a>, who has previously been CTO at a number of technology startups, and brings vital experience from his time at Sun Microsystems as Senior Director and Distinguished Market Strategist. (Oh, and he&#8217;s also a former student and collaborator of Gordon Pask&#8217;s!) He has been very helpful in developing a viable business model in conjunction with my brother Yusuf Haque, who, with his experience in raising capital for startups, has led the fundraising process.</em></p>
<p><em>Of course, direct daily input from the Pachube team has been vital to the development of the project, and without <a id="nyoj" title="Chris Leung" href="http://www.chrisleung.org/" target="_blank">Chris Leung</a> (EEML development) and <a id="xr8l" title="Sam Mulube" href="http://twitter.com/smazero" target="_blank">Sam Mulube</a> (backend development) it would be a very different thing indeed!</em></p>
<p><strong>Tish Shute:</strong> Now the emerging internet is the world as a networked, enhanced virtual/reality environment &#8211; sorry about the inadequate terminology, but as you said &#8220;the distinction between real and virtual is becoming as quaint as the distinction between mind and body&#8221;. You are participating in the <a id="k7s8" title="Sentient City" href="http://www.situatedtechnologies.net/?q=node/89" target="_blank"><strong>Sentient City</strong> exhibition organized by the </a><a href="http://www.archleague.org/" target="_blank">Architectural League of New York for September 2009.</a></p>
<p>Could you explain more about the Sentient City project, and about your contribution Natural Fuse, which uses common house plants, energy-monitoring sensors, and Pachube to create &#8220;a city-wide network of electronically-assisted plants that act as carbon-cycle circuit-breakers in much the same way as conventional electrical circuit-breakers do&#8230;&#8221;?</p>
<p><strong>Usman Haque: </strong><em>Situated Technologies, founded to explore the impact of &#8220;situated&#8221; technologies (i.e. locative media, etc.) in urban spaces, kicked off with a <a id="b77z" title="symposium organised by Mark Shepard, Omar Khan and Trebor Scholz" href="http://www.situatedtechnologies.net/?q=node/1" target="_blank">symposium organised by Mark Shepard, Omar Khan and Trebor Scholz</a> and supported by the <a id="o7a4" title="Architecture League of New York" href="http://www.archleague.org/" target="_blank">Architecture League of New York</a> a couple of years ago, and continued through <a id="o5o6" title="a series of pamphlets" href="http://www.situatedtechnologies.net/?q=node/75" target="_blank">a series of pamphlets</a> (the first by Adam Greenfield &amp; Mark Shepard; the second by me and Matthew Fuller; the third and fourth by Benjamin Bratton &amp; Natalie Jeremijenko and Laura Forlano &amp; Dharma Dailey). This is now culminating in an exhibition, &#8220;Toward the Sentient City&#8221;, opening in September 2009, as a public manifestation of many of the concepts raised over the years.</em></p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/plantcircuit1.jpg"><img class="alignnone size-full wp-image-2783" title="plantcircuit1" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/plantcircuit1.jpg" alt="plantcircuit1" width="400" height="289" /></a></p>
<p><em><a id="k48e" title="Natural Fuse" href="http://www.haque.co.uk/naturalfuse.php" target="_blank">Natural Fuse</a>, a project funded by the Architecture League to be part of that exhibition, is really a Haque Design + Research project rather than a Pachube project alone. It came about for two reasons. The first was because we had been investigating for several months many different ways to use plants and vegetation in interactive architectural design: as living walls, as responsive systems, as visual and olfactory indicators, as passive ventilation &#8212; fantastic research undertaken predominantly by my invaluable production assistant Barbara Jasinowicz. We were particularly interested in energy creation and monitoring and had made a number of (unsuccessful) proposals to develop building systems based on plant interaction. The second was because I wanted to have a good demonstration project for Pachube: a system that was not just end-to-end single-point communication, but one in which the system increased its efficiency over time through more and more geographically-dispersed connections. So Natural Fuse developed through a series of conversations with a very intelligent and witty designer, <a id="ed_l" title="Nitipak (Dot) Samsen" href="http://www.dotmancando.info/" target="_blank">Nitipak (Dot) Samsen</a>, who was then an intern and who will now lead design work along with <a id="w9.y" title="Cesar Harada" href="http://www.cesarharada.com/" target="_blank">Cesar Harada</a> (similarly intelligent and witty!).</em></p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/plantfusecare1.jpg"><img class="alignnone size-full wp-image-2784" title="plantfusecare1" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/plantfusecare1.jpg" alt="plantfusecare1" width="400" height="322" /></a></p>
<p><em>Briefly, the point of Natural Fuse is to harness the carbon-sinking capabilities of house plants, networked on the Arduino ethernet platform, to create a city-wide network of electronically-assisted plants that act both as energy providers and as shared &#8220;carbon sink&#8221; circuit breakers. By sharing resources and information between the plants, energy expenditure can be collectively monitored and managed. The purpose is to create a collective &#8220;carbon sink&#8221; that offsets the amount of energy consumed by the plant owners &#8211; a natural &#8220;circuit breaker&#8221;. If people cooperate on their energy expenditure then the plants thrive (and they can all use more energy); but if they don&#8217;t then the network starts to kill plants, thus diminishing the network&#8217;s energy capacity. Of course, the network functionality is enabled by Pachube. The plan is to distribute these to some households in New York and offer plans and downloads for people to build their own as well.</em></p>
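<p>The circuit-breaker behaviour described here can be sketched roughly as follows. All names and numbers are hypothetical and purely illustrative; the real system runs on networked Arduino units coordinated through Pachube.</p>

```python
# Illustrative sketch of the Natural Fuse "carbon circuit-breaker" logic:
# consumption is only permitted while the pooled offsetting capacity of
# all living plants covers it. Numbers are invented for the example.

def network_allows(loads_watts, plants, sink_watts_per_plant=14):
    """Return True while the shared carbon sink covers total demand."""
    capacity = len(plants) * sink_watts_per_plant
    return sum(loads_watts) <= capacity

plants = ["basil", "fern", "ivy"]
print(network_allows([10, 12, 15], plants))  # cooperative use, within capacity
print(network_allows([30, 30, 30], plants))  # overuse, the breaker trips

# If users keep overdrawing, the network sacrifices a plant,
# shrinking everyone's shared capacity:
if not network_allows([30, 30, 30], plants):
    plants.pop()  # one plant is killed off
print(len(plants))  # -> 2
```

<p>The interesting property is the feedback loop: overuse shrinks the sink, which makes further overuse trip the breaker even sooner, so cooperation is the only stable strategy.</p>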
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/plantfusesystem1.jpg"><img class="alignnone size-full wp-image-2785" title="plantfusesystem1" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/plantfusesystem1.jpg" alt="plantfusesystem1" width="432" height="214" /></a></p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/plantfuseunit1.jpg"><img class="alignnone size-full wp-image-2786" title="plantfuseunit1" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/plantfuseunit1.jpg" alt="plantfuseunit1" width="443" height="197" /></a></p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/naturalfusenetwork2.jpg"><img class="alignnone size-full wp-image-2787" title="naturalfusenetwork2" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/naturalfusenetwork2.jpg" alt="naturalfusenetwork2" width="462" height="348" /></a></p>
<p><strong>Tish Shute:</strong> You describe Pachube as linking environments, not just sensor to sensor (as sensorbase.org does) &#8211; an environment for Pachube could be a web page. An essential concept in Pachube is that anything could be an environment, and such environments are treated equivalently with EEML. You describe EEML as a protocol that sits comfortably with existing building protocols: &#8220;what it brings to the picture is the ability to describe buildings that change.&#8221;</p>
<p>How will EEML change our understanding of architecture and enable the view of architecture that &#8220;includes smells, sounds, light, electromagnetic fields &#8211; buildings as dynamic and changing?&#8221; (Prasad Passive House?)</p>
<p>You describe EEML as straddling, and designed to work alongside, the IFC construction industry format. Who is involved in the creation of EEML? Could you explain a little bit how it is different from SensorML? You mentioned little has been done regarding post-construction evaluation of buildings. How will EEML enable buildings to share strategies (for example on energy consumption), as you put it?</p>
<p><strong>Usman Haque:</strong> <em>The <a id="gv6y" style="color: #551a8b;" title="Extended Environments Markup Language (EEML)" href="http://www.eeml.org/" target="_blank">Extended Environments Markup Language (EEML)</a> (which is the protocol around which much of Pachube is based) is being developed to make the idea of &#8220;dynamic, responsive and conversant environments&#8221; a reality. It works with existing construction standards like <a id="l7sl" style="color: #551a8b;" title="Industry Foundation Classes (IFC)" href="http://en.wikipedia.org/wiki/Industry_Foundation_Classes" target="_blank">Industry Foundation Classes (IFCs)</a>, but exists to extend them to account for dynamic, responsive and, dare I say it, conversant buildings. In the perhaps prosaic world of construction, this helps to facilitate a number of architectural requirements such as <a id="i2_j" style="color: #551a8b;" title="post-occupancy evaluation" href="http://www.google.com/search?hl=en&amp;client=safari&amp;rls=en&amp;defl=en&amp;q=define:post+occupancy+evaluation&amp;sa=X&amp;oi=glossary_definition&amp;ct=title" target="_blank">post-occupancy evaluation</a>, realtime site-based environmental feedback at the design phase and simulations that synchronise with real-world installation. With <a id="hxs4" style="color: #551a8b;" title="EEML" href="http://www.eeml.org/" target="_blank">EEML</a> and Pachube you&#8217;ll be able to start working with, say, an Autocad model at the design phase, and include *real time* environmental data from the site, as well as to model expected sensor and assumed energy consumption data of the design; use the same model during the construction phase (because it will translate fine to standard modelling descriptions), and keep working with the same set of information even after the building is occupied and running &#8212; making it a whole lot easier to learn from the design and maintenance processes than it is currently.</em></p>
<p><em>At the same time this does not exclude the possibility of talking about &#8220;sensors&#8221; (as <a id="swia" title="SensorML" href="http://en.wikipedia.org/wiki/SensorML" target="_blank">SensorML</a> wants to), but we are more easily able to consider, say, the dozens of different ways that different clients will want to address, access or search for those sensors; the changing contextual motivations for actually processing sensor information; and the capacity for flexible sensor ontologies &#8212; where you don&#8217;t need to know from the beginning everything you&#8217;ll be looking for once you&#8217;ve recorded mountains of data.</em></p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/environmentsconnected.jpg"><img class="alignnone size-full wp-image-2792" title="environmentsconnected" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/environmentsconnected.jpg" alt="environmentsconnected" width="454" height="151" /></a></p>
<p><em>We can consider, equally as &#8216;environments&#8217;, a mountainside, the interior of a building, the context of a webpage, the internal status and external context of a mobile device, the interactions within something like Second Life.</em></p>
<p><em>As a result of this conception of &#8220;environment&#8221; we remove the need for a distinction between &#8220;real&#8221; and &#8220;virtual&#8221;. We can consider, equally as &#8216;environments&#8217;, a mountainside, the interior of a building, the context of a webpage, the internal status and external context of a mobile device, the interactions within something like Second Life &#8212; all these are environments and can communicate with each other on equivalent terms. More importantly, a single &#8220;environment&#8221; can be expressed as a snapshot in time; or it can be expressed as a sequence of many snapshots over several years.</em></p>
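<p>The idea that a single environment can be read either as one snapshot in time or as a sequence of snapshots spanning years can be sketched as a small data structure. This is illustrative only, with invented names; it is not EEML&#8217;s actual schema.</p>

```python
# Sketch of an "environment" as an ordered sequence of timestamped
# snapshots: the same object answers both "what is it like now?" and
# "what was it like over this span of time?". Names are invented.

class Environment:
    def __init__(self, title):
        self.title = title
        self.snapshots = []  # list of (timestamp, {tag: value}), kept sorted

    def record(self, timestamp, readings):
        """Add one snapshot of the environment at a moment in time."""
        self.snapshots.append((timestamp, dict(readings)))
        self.snapshots.sort(key=lambda s: s[0])

    def latest(self):
        """The environment as a single snapshot: its current state."""
        return self.snapshots[-1][1] if self.snapshots else {}

    def between(self, start, end):
        """The environment as a sequence: every snapshot in a time span."""
        return [s for s in self.snapshots if start <= s[0] <= end]

office = Environment("studio interior")
office.record(1, {"temperature": 21.0, "light": 300})
office.record(2, {"temperature": 22.5, "light": 280})
print(office.latest()["temperature"])  # -> 22.5
print(len(office.between(1, 2)))       # -> 2
```

<p>Because nothing in the structure refers to physical sensors, the same shape serves a mountainside, a webpage or a Second Life region equally well.</p>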
<p><em>One very important thing we&#8217;re looking at now is how to transition the protocol from something that is status-based to something that can express transactions, goals and processes. We&#8217;ve just started looking at how <a id="e7.0" title="RDF" href="http://en.wikipedia.org/wiki/Resource_Description_Framework" target="_blank">RDF</a> and <a id="khn." title="machine tags" href="http://en.wikipedia.org/wiki/Machine_tag" target="_blank">machine tags</a> might help in this, largely spurred on by perceptive comments from one of my favourite designers, <a id="mit9" title="Toxi, a.k.a. Karsten Schmidt" href="http://postspectacular.com/" target="_blank">Toxi, a.k.a. Karsten Schmidt</a>.</em></p>
<p><strong>Tish Shute:</strong> You mentioned that you see &#8220;smart&#8221; buildings and &#8220;smart&#8221; cities as environments, not just a collection of devices? On the Pachube web page there is a chart describing potential interactions between entities (one to one, one to many, etc.) but you do not give many pointers to how two unrelated objects that are connected would derive any value out of the connection&#8230; Could you give me some examples of the kinds of use cases (Natural Fuse is one of course!) and interesting new opportunities to create shared value that Pachube will enable?</p>
<p><strong>Usman Haque:</strong> <em>Yes, I recognize that the Pachube website information leaves a lot to be desired&#8230;! Apart from a whole lot of conceptual information that&#8217;s missing, there are a number of undocumented API features that nobody has yet uncovered!</em></p>
<p><em>Well, in answer to your question: much of it is intuition &#8211; I don&#8217;t know exactly _how_ it will be valuable but I do expect the community to find ways to make such seemingly disparate interoperability valuable.</em></p>
<p><em>To take a prosaic example: say (once privacy options are introduced) that a manufacturer creates a <a id="s53b" title="Pachube input application" href="http://community.pachube.com/?q=node/100" target="_blank">Pachube input application</a>, like an electricity meter that automatically charts on Pachube. There is a certain benefit to its customers in being able to monitor their usage over time and to compare their usage to the aggregation of others in a similar class, but anonymised. Say that someone else has produced a Pachube output application like a <a id="fhjs" title="mobile phone Pachube viewer" href="http://www.rcreations.com/freeandroidgphoneg1applications" target="_blank">mobile phone Pachube viewer</a>. Now the electricity meter users can use this new output application as an extension to be able to monitor their consumption on a mobile phone. Now, imagine if someone else develops a new product, a <a id="j.l-" title="networked lamp" href="http://www.goodnightlamp.com/" target="_blank">networked lamp</a> &#8212; it would now be very easy for that designer to write a little app to make the networked lamp switch on (or change brightness) according to the electricity consumption, even remotely. The point is that the more input and output apps are added the more valuable they each become.</em></p>
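<p>The meter-to-lamp chain described here can be sketched in a few lines. The feed readings are faked below, and the mapping function is invented for illustration; a real output app would fetch live values over Pachube&#8217;s HTTP API (details omitted here).</p>

```python
# Sketch of a third-party "networked lamp" output app that maps an
# electricity-meter feed onto lamp brightness. Readings are faked;
# the function name and scaling are invented for this example.

def brightness_from_watts(watts, max_watts=3000):
    """Map household consumption onto a 0-255 lamp brightness,
    so the lamp glows brighter the more electricity is in use."""
    watts = max(0, min(watts, max_watts))  # clamp out-of-range readings
    return round(255 * watts / max_watts)

# Pretend these readings arrived from a remote meter feed:
for reading in (150, 1500, 3000):
    print(reading, "W ->", brightness_from_watts(reading))
```

<p>Neither the meter manufacturer nor the lamp designer had to know about each other; the shared feed is the only coupling between them, which is exactly why each new app adds value to all the others.</p>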
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/scatteredhouse.jpg"><img class="alignnone size-full wp-image-2791" title="scatteredhouse" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/scatteredhouse.jpg" alt="scatteredhouse" width="443" height="109" /></a></p>
<p><em><a id="tzsq" title="Scattered House" href="http://www.haque.co.uk/scatteredhouse.php" target="_blank">Scattered House</a>, like Reconfigurable House, but spread throughout various cities in the world to demonstrate the implications of designing environments and buildings in the context of family diasporas and ubiquitous ad hoc networked connectivity.</em></p>
<p><em>Part of Pachube&#8217;s emphasis, in not making specific connections more important than others, is that the community can develop new types of connection. Of course, it makes it relatively simple to create remote control connections between seemingly unrelated entities (like mobile phones and houses; or web pages and furniture); it makes it relatively simple to connect up environmental conditions from the physical world to the seemingly distant Second Life (or, more interestingly to me, <a id="iqkx" title="OpenSim" href="http://opensimulator.org/wiki/Main_Page" target="_blank">OpenSim</a>), which can make it a more viable interactive environment; and it makes data aggregation and comparison possible between wide ranges of energy consumers to facilitate aggregation analysis. But the point really is to make it easy for people and companies to build in this kind of connectivity and invent new uses.</em></p>
<p><em>Through my close association with <a id="sin8" title="The Bartlett, University College London's architecture school" href="http://www.bartlett.ucl.ac.uk/" target="_blank">The Bartlett, University College London&#8217;s architecture school</a>, I hope to develop some particularly relevant use-case scenarios for the architectural industry. I think we&#8217;ve really not even begun to imagine the kinds of applications that will be important, though I guess Natural Fuse exemplifies the kind of approach I would like to see in Pachube-enabled applications: one in which the collective/hive experience contributes towards some end goal, to make it possible to create a &#8220;wikipedia of environments&#8221; as opposed to a web-based Wikipedia &#8211; it&#8217;s not that I necessarily want to create these things myself, but rather I want to make it possible to create such things.</em></p>
<p><strong>Tish Shute:</strong> You mentioned that you hope Pachube will be the place to connect smart products &#8211; product to product communication? Also, you mentioned that you would like to have a way for smart products to self-register with Pachube. While all feeds are public now, you are going to create groups with different levels of privacy. Both of the aforementioned features would enable more business applications for Pachube. But could you describe the business model for Pachube?</p>
<p><strong>Usman Haque:</strong> Essentially, there are three facets to the business model. The first takes a cue from <a id="irzp" title="Flickr" href="http://www.flickr.com/upgrade/" target="_blank">Flickr</a> in recognising that there are those who would like a more sophisticated set of services, offered as &#8220;professional&#8221; accounts. The second is to be able to provide a set of tools and applications for medium scale manufacturers and developers who want to web-enable their offerings, who will be able to take advantage of the growing repository of Pachube.Apps and add-ons, and who want the convenience, security and economy that Pachube will be able to offer. The third approach is to become more directly involved in large-scale urban infrastructure projects. There is a fourth facet, but we consider it the killer so I&#8217;m keeping quiet for the moment&#8230;.</p>
<p>So yes, in order to make all these things more useful we&#8217;ll soon be introducing a range of privacy options on feeds, the ability to create &#8220;aggregates&#8221; from collections of feeds, and the possibility of groups, organised around feeds. Another thing we&#8217;re hoping to introduce soon is open environment-level tagging, so that anyone will be able to tag environments, though there will be a way of evaluating the importance of any given tag.</p>
<p><strong>Tish Shute: </strong>I know you mentioned that you are trying to find tools that allow people to contribute to their environment. There are a number of projects aimed at providing tools that help people and businesses reduce their carbon footprint &#8211; <a id="a2qc" title="The Carbon Account," href="http://www.thecarbonaccount.com/" target="_blank">The Carbon Account</a>, AMEE, Wattzon, <a id="f8y3" title="Onzo" href="http://www.onzo.co.uk/" target="_blank">Onzo</a>. Is Pachube working with any of these projects and how?</p>
<p>What are the most interesting ideas in this area of changing our relationship to energy consumption emerging from Pachube?</p>
<p><strong>Usman Haque: </strong><em>The carbon footprint calculating industry is getting quite crowded&#8230;! So far I&#8217;ve particularly appreciated AMEE&#8217;s API (which is also used by the Carbon Account, I believe). So we have just released a Pachube.App &#8216;plugout&#8217; which will take a feed from an electricity meter tagged &#8220;watts&#8221; or &#8220;kilowatts&#8221; and convert it into a realtime carbon footprint calculation (driven by AMEE&#8217;s international and region- and supplier-specific carbon conversion factors). So it should be really easy to discover how many kilograms of CO2 you generated in the last 15 minutes&#8230; the last hour&#8230; the last 24 hours. Here&#8217;s a list of some of the feeds that are already making use of this: http://www.pachube.com/tag/co2_last_15_mins</em><br />
<strong><br />
Tish Shute:</strong> I know the Arduino community has really taken an interest in Pachube. Who are the early adopters on Pachube? What are the most prevalent use cases you have seen so far?</p>
<p><strong>Usman Haque:<em> </em></strong><em>It has actually been more difficult than I thought it would be to get the Arduino community interested. This has partly been due to the difficulty of internet-enabling Arduino (until recently adding ethernet access has been a bit of a tough chore). Now that it&#8217;s easier to connect up Arduinos, some of the early adopters have been interfacing Arduino to Current Cost meters (alleviating the need for a computer in between); and others have been doing things like tracking temperature, humidity and light level in their homes and offices. <a id="ohbg" title="Pachube user C4C" href="http://www.gomaya.com/glyph/" target="_blank">Pachube user C4C</a> has been pretty active from early on: http://www.pachube.com/feeds/1284</em><br />
<strong><br />
Tish Shute:</strong> Pachube is input heavy at the moment &#8211; you mentioned not many actuators are plugged into Pachube yet. You said this is in part because you have focused on making the backend robust and stable before taking a lot of hits. What new directions for Pachube will emerge from enabling the dynamic relationship between sensors and actuators?</p>
<p><strong>Usman Haque:</strong> <em>This will be a crucial evolution in Pachube, when we make actuators more evident. It&#8217;s input heavy at the moment, basically in the sense of being easy to see the inputs &#8212; you add &#8220;inputs&#8221; rather than &#8220;outputs&#8221;, so at the moment we have no idea of what&#8217;s actually plugged into the outputs unless people tell us! However, we know that there are plenty of outputs because they&#8217;re making API requests, we just don&#8217;t know what they&#8217;re being used for! Once the concepts of actuators and output environments get built into the system then I think we&#8217;ll know a lot more about how people are using the system.</em></p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/currentcost.jpg"><img class="alignnone size-full wp-image-2794" title="currentcost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/currentcost.jpg" alt="currentcost" width="444" height="150" /></a></p>
<p><em>To make this easier in the meantime, we recently announced the <a id="zp60" title="Pachube.apps" href="http://apps.pachube.com/" target="_blank">Pachube.apps</a> site, where people can start contributing Pachube &#8216;plugins&#8217; and &#8216;plugouts&#8217; &#8212; things that can be used by others without needing to code or hack, to create, generate or modulate Pachube inputs and outputs. One of these was <a id="htj9" title="Status2Pachube" href="http://apps.pachube.com/online-status.html" target="_blank">Status2Pachube</a>, which turns the online status of AIM, MSN Messenger, Skype or Yahoo! Messenger users into a Pachube input feed (to make it easy to create &#8220;remote presence&#8221; orbs and such); another was the <a id="wjey" title="CurrentCost2Pachube" href="http://community.pachube.com/?q=node/100" target="_blank">CurrentCost2Pachube</a> app to make it easy to connect up Current Cost electricity meters as input feeds; all of which can then be used by Pachube output apps, like the <a id="xki1" title="G1 Android phone Pachube viewer" href="http://www.rcreations.com/freeandroidgphoneg1applications" target="_blank">G1 Android phone Pachube viewer</a> by Pachube user N4Spd or in the soon-to-launch <a id="pd2x" title="Pachube2SketchUp" href="http://apps.pachube.com/" target="_blank">Pachube2SketchUp</a> plugout which will direct Pachube outputs into Google SketchUp (and by extension Google Earth) in order to generate or modulate 3-d models in response to realtime environmental/sensor data. (Pachube2SketchUp is pretty much finished for Mac OS X &#8212; but we&#8217;re having difficulty getting it to work on Windows, because of its sometimes pigheaded security measures&#8230; we&#8217;ll probably release it for Mac OS X alone soon anyway.)</em></p>
<p><strong>Tish Shute:</strong> Do you and Haque Design expect to go beyond just providing a platform? Will you be producing more interesting applications like Natural Fuse on Pachube? If so, can you tell me more about what you have in mind?</p>
<p><strong>Usman Haque:</strong> <em>I keep a clear distinction between my work as creative director of Pachube.com and my work as director of Haque Design + Research. Basically, while Pachube.com continue development of the platform in general, I hope that Haque Design + Research will separately continue creating pioneering interactive experiences, some using Pachube and others not. We have some things in mind, such as the idea of creating an open source building management platform, but that&#8217;s all to come later&#8230;</em></p>
<p><strong>Tish Shute:</strong> One very interesting project you have been involved in is the creation of &#8220;Urban Versioning System 1.0&#8221;, which asks &#8220;What lessons can architecture learn from software development, and more specifically, from the Free, Libre, and Open Source Software (FLOSS) movement?&#8221; Can you tell me more about this project, its goals, and its progress? How does UVS 1.0 relate to Pachube?</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/urbanvs.jpg"><img class="alignnone size-full wp-image-2795" title="urbanvs" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/urbanvs.jpg" alt="urbanvs" width="277" height="386" /></a></p>
<p><strong>Usman Haque: </strong><em>The <a id="xujn" title="Urban Versioning System" href="http://uvs.propositions.org.uk/" target="_blank">Urban Versioning System</a> was essentially an attempt to understand what lessons the &#8220;open source&#8221; approach in software might provide to the collaborative development of environments and cities. It&#8217;s a sort of quasi-license &#8212; not yet quite ready to have the status of something like Creative Commons (which nicely suits media and software based creations, but doesn&#8217;t suit hardware and physical things beyond their design files quite so well). It&#8217;s more of a challenge, a series of constraints that might be applied. It has a link to Pachube, in the sense of encouraging conception at the environment and systemic level &#8212; you might call it the manifesto that connects Constant&#8217;s New Babylon hypothesis to the reality of Pachube!</em></p>
<p><strong>Tish Shute:</strong> I know that you imagine Pachube scaling up to millions (billions???) of users. But scaling the real time web has proved a challenge (e.g. the frequent surfacings of the Twitter failwhale during big events). What are the key points of Pachube&#8217;s architecture and design that will enable successful scaling?</p>
<p>How do you see Pachube itself fitting into the FLOSS movement?</p>
<p><strong>Usman Haque: </strong><em>This is a really important question. There are a couple of things we are doing. The first is constantly to assume that we have 20 to 50 times more connections than we actually have&#8230; I put a lot of pressure on Sam about making sure of this, so he&#8217;s constantly developing, thinking about and testing little things weeks in advance while at the same time fighting the usual daily little fires that arise <img src="http://www.ugotrade.com/wordpress/wp-includes/images/smilies/icon_smile.gif" alt=":)" class="wp-smiley" />  The second is that we&#8217;re trying to learn from strategies being developed by <a id="fq2y" title="Vlad Trifa" href="http://vladtrifa.com/" target="_blank">Vlad Trifa</a> and his group at the <a id="zjfb" title="Institute for Pervasive Computing at ETH Zurich" href="http://www.pc.inf.ethz.ch/" target="_blank">Institute for Pervasive Computing at ETH Zurich</a> in Switzerland regarding the development of infrastructures for millions or more entities.</em></p>
<p><em>Regarding the connection to the FLOSS movement, there is no specific technical part of Pachube that is currently open source (apart from all the example apps and tutorials, of course). However, I find the approach taken by OpenSim and Hypergrid really fascinating: I haven&#8217;t given enough thought to how it might be implemented, but I find quite appealing the idea of a multitude of open source and geographically dispersed Pachube-enabled servers with seamless transfer of data connections between them as necessary&#8230;..</em></p>
<p><strong>Tish Shute: </strong>I know you have an <a id="ttbg" title="Android Viewer for Pachube" href="http://en.androidwiki.com/wiki/Pachube_Viewer" target="_blank">Android Viewer for Pachube</a>. Android is a landmark for extended/augmented reality, as <a id="x-.a" title="Wikitude" href="http://www.mobilizy.com/wikitude.php" target="_blank"><span style="color: #0000ff;"><strong>Wikitude</strong></span></a> proved, because with its compass mode Android brings together the essential ingredients for extended/augmented reality &#8211; knowing who YOU are, WHERE you are, WHAT you are doing, WHAT is around you. It seems Pachube could be a powerful backend to a number of multi-user, mobile augmented/enhanced reality Android applications? Do you have any ideas/thoughts on this?</p>
<p><strong>Usman Haque:</strong> <em>That&#8217;s right &#8212; the Android viewer was created by rcreations.com, a Pachube user &#8212; this new platform brings amazing opportunities to mobile devices. I would be really interested to see what I would consider the obvious next step: an app that becomes both a Pachube input and an output feed, one that overlays existing Pachube data with new context-based, site-specific data.</em></p>
<p><em>If I was to make a parallel to a Japanese anime: I&#8217;m fascinated by <a id="ht3b" title="Dennou Coil" href="http://en.wikipedia.org/wiki/Dennou_Coil" target="_blank">Dennou Coil</a>, an anime set 20 years in the future where children take for granted the overlay of the digital world with the physical world. BUT, I&#8217;d say that Pachube somehow relates more closely to <a id="zg78" title="Furi Kuri" href="http://www.adultswim.com/shows/flcl/index.html" target="_blank">Furi Kuri</a> in its <a id="gko_" title="pataphysical" href="http://en.wikipedia.org/wiki/%E2%80%99Pataphysics" target="_blank">pataphysical</a> stance and because one of the main characters has a portal to another galaxy in his head&#8230;&#8230;.</em></p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/furikuri.jpg"><img class="alignnone size-full wp-image-2793" title="furikuri" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/furikuri.jpg" alt="furikuri" width="420" height="320" /></a></p>
<p><strong> Tish Shute:</strong> Do you see Haque Design picking up on the challenge of creating some cool next-generation interfaces/GUIs for extended/enhanced/augmented (sorry, no perfect term) reality?</p>
<p><strong>Usman Haque:</strong> <em>Actually, no, I don&#8217;t see this as Haque Design + Research&#8217;s core focus going forward. We did some of this early on, getting involved in, for example, the development of a <a id="ty:5" title="3d smell interface" href="http://www.haque.co.uk/scentsofspace.php" target="_blank">3d smell interface</a>; and exploring the <a id="ykap" title="role of electromagnetic fields on perception of haunted spaces" href="http://www.haque.co.uk/haunt.php" target="_blank">role of electromagnetic fields on perception of haunted spaces</a>. But these days, in the context of HDR, I&#8217;m less interested in making seamless interfaces and more interested in exploring what authentic interaction actually is (whether technologically based or not). I think it&#8217;s challenge enough for me to make a light-switch engaging, dynamic and conversant before getting to the perceptual infrastructure that goes on top of it all! HDR will also spend more time exploring <a id="p2v5" title="passive systems, phase-change materials and plants" href="http://www.haque.co.uk/climateclock.php" target="_blank">passive systems, phase-change materials and plants</a> in the context of the built environment.</em></p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/scentsofspace.jpg"><img class="alignnone size-full wp-image-2796" title="scentsofspace" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/scentsofspace.jpg" alt="scentsofspace" width="550" height="197" /></a></p>
<p><strong>Tish Shute: </strong>I know there have been some interesting integrations with Pachube lately &#8211; <a href="http://www.ugotrade.com/2008/12/15/smart-planetinterview-with-andy-stanford-clark/" target="_blank">Andy Stanford-Clark mentioned using MQTT as the feed to get EML data into and out of Pachube</a> rather than over HTTP. He said that&#8217;s interesting because MQTT is a much more lightweight protocol, designed for small sensors and low bandwidth / expensive (e.g. cellular) networks&#8230; and it&#8217;s also true push, i.e. data is pushed to you directly from the broker (the hub in the middle), rather than you having to ask for it constantly (polling).</p>
<p>Have you opted for MQTT over HTTP polling?</p>
<p><strong>Usman Haque:</strong> <em>We haven&#8217;t yet implemented an MQTT bridge, in part because it has proved pretty difficult. HTTP is quite important for us right now because there&#8217;s a whole universe out there using it; from your average web browser, to mobile devices, to ethernet devices and a whole range of languages and platforms &#8212; they all work, pretty much out of the box, with HTTP. However, what we are exploring instead is being able to interface with <a id="a4w." title="Oliver Goh" href="http://www.eolusone.com/cms/website.php" target="_blank">Oliver Goh</a>&#8217;s Shaspa project &#8212; they&#8217;re already in the middle of solving the MQTT-Pachube bridge problem, and so that should hopefully provide Pachube access to and from MQTT devices.</em></p>
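<p>The polling-versus-push distinction Haque and Stanford-Clark are discussing can be sketched with a toy broker. This is an illustration only &#8211; a real deployment would use an actual MQTT client library and broker; the classes below are hypothetical stand-ins.</p>

```python
# Toy illustration of polling (HTTP-style) vs push (MQTT-style) delivery.
# Nothing here talks to a real network; the classes are invented stand-ins.

class ToyBroker:
    """Minimal hub that pushes new values to subscribers (MQTT-style)."""
    def __init__(self):
        self.subscribers = []
        self.latest = None

    def subscribe(self, callback):
        self.subscribers.append(callback)

    def publish(self, value):
        # Push: the broker calls every subscriber as soon as data arrives.
        self.latest = value
        for cb in self.subscribers:
            cb(value)

class PollingClient:
    """HTTP-style client: it must keep asking, even when nothing changed."""
    def __init__(self, broker):
        self.broker = broker
        self.requests_made = 0

    def poll(self):
        self.requests_made += 1
        return self.broker.latest

broker = ToyBroker()
received = []
broker.subscribe(received.append)   # push: no requests needed at all

poller = PollingClient(broker)
for _ in range(5):                  # poll: five requests...
    poller.poll()                   # ...made before any data even exists

broker.publish(250)                 # one sensor reading arrives (e.g. watts)
print(received)                     # [250] — delivered immediately via push
print(poller.requests_made)         # 5 — requests wasted while waiting
```

On a cellular link, those wasted polls are exactly the bandwidth and battery cost that makes a true-push protocol like MQTT attractive for small sensors.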
<p><strong>Tish Shute:</strong> Chris Dalby just released <a id="qcm6" title="Pachube Air" href="http://www.yellowpark.net/cdalby/index.php/2009/01/10/pachube-air-the-first-release/" target="_blank">Pachube Air.</a> Have you had a chance to play with that yet?</p>
<p><strong>Usman Haque:</strong> <em>I have indeed! It&#8217;s still early days yet, and I know he did it partly just to test the AIR development process rather than solely solving a desperate Pachube need but I&#8217;m looking forward to future iterations!</em></p>
<p><strong>Tish Shute:</strong> Peter Quirk felt that the Pachube web page positions Pachube as a social networking site focused on data exchange, inviting anyone with an interest in sharing environmental or other data to publish data or construct interesting uses for it.</p>
<p>What is your response to that?</p>
<p><strong>Usman Haque:</strong> <em>Hmm&#8230; I don&#8217;t really see Pachube as a social networking site. Yes, it perhaps enables the creation of social-networking objects and environments, but in itself and in terms of networking of people that has barely begun yet. Certainly Pachube exists quite comfortably in facilitating mashups and visualisations and other web 2.0 based social applications but I don&#8217;t see that as a driving force. I think it would be a mistake also to conceive of Pachube solely as being the storage of machine communication that then gets experienced by people; rather, it can transition quite easily to being solely useful for machine-to-machine communication. </em></p>
<p><em>In fact, with recent API releases (which as it happens as of this writing we haven&#8217;t announced&#8230; <img src="http://www.ugotrade.com/wordpress/wp-includes/images/smilies/icon_smile.gif" alt=":)" class="wp-smiley" />), it&#8217;s now possible to use most of Pachube&#8217;s features without ever going to the website: i.e. your Arduino can create feeds, search feeds, edit feeds, delete feeds. Over time, as direct machine-to-machine communication becomes more prominent, it&#8217;s quite likely that the website itself will become less and less important, while the backend becomes the focus of everything.</em><br />
<strong><br />
Tish Shute:</strong> I am interested in some of the differences between <a href="http://sensorbase.org/" target="_blank">SensorBase.org&#8217;s project</a> and Pachube. Is SensorBase more of a data repository (environmental data in particular)?</p>
<p><strong>Usman Haque</strong>: <em>The difference I see between Pachube and SensorBase is that while (from what I know) SensorBase is mostly about &#8220;write&#8221; operations, with later &#8220;read&#8221; operations (i.e. it&#8217;s about being a data repository), Pachube is really &#8220;read-write&#8221; (i.e. it&#8217;s about being both a data repository _and_ a quasi-realtime proxy). Pachube will be able to handle potentially millions of connections, both incoming and outgoing, and as we&#8217;ll soon start storing every data point ever recorded, the data repository aspect will of course be crucial. However, the fact that it *also* facilitates one-to-many realtime broadcasts of that data (and facilitates conversion to a number of different formats: EEML, CSV and JSON now, more in the future) means that the two-way connectivity aspect of it is just as important.</em></p>
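<p>As a rough illustration of the format conversion mentioned above, the same datapoints can be served as CSV or JSON. The field names and timestamps below are invented for the sketch, not Pachube&#8217;s actual schema.</p>

```python
import csv
import io
import json

def csv_feed_to_json(csv_text):
    """Convert 'timestamp,value' CSV rows into a JSON datastream.

    Field names ("at", "value", "datapoints") are illustrative only.
    """
    rows = csv.reader(io.StringIO(csv_text))
    datapoints = [{"at": ts, "value": float(v)} for ts, v in rows]
    return json.dumps({"datapoints": datapoints})

# Two hypothetical electricity-meter readings, 15 minutes apart:
feed = "2009-01-28T10:00:00Z,230.5\n2009-01-28T10:15:00Z,231.0"
print(csv_feed_to_json(feed))
```

The point of the sketch is the "quasi-realtime proxy" role Haque describes: the store holds one canonical series, and each outgoing connection can ask for whichever representation suits it.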
<p><strong>Tish Shute</strong>: I know you mentioned something that sounded a lot like Pachube facilitating buildings&#8217; and products&#8217; ability to benchmark and optimize themselves against/with each other?</p>
<p><strong>Usman Haque:</strong> <em>Further down the line, I would like to see Pachube able to help two particular processes:</em></p>
<p><em>1) to make it straightforward for developers and manufacturers to web-enable their products and services; and 2) to help building and environment designers create their buildings (by providing access to realtime site data) and also to help in the post-occupancy evaluation process &#8212; where buildings will be able to talk with each other, share information on energy consumption, resource management or occupancy rates and even &#8220;learn&#8221; from each others&#8217; strategies. This type of approach has a parallel at the level of individuals (for example, networked electricity meter users who are able to compare and contrast their usage and strategies for conservation). I don&#8217;t want Pachube to become the application; rather I want to make it easier for other people and companies to create such applications. So in that sense, yes, perhaps Pachube can be considered an enabler of social networking applications&#8230;!</em></p>
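<p>A minimal sketch of the cross-building benchmarking Haque envisages might look like the following. The building names and kWh figures are invented for illustration; this is not anything Pachube shipped.</p>

```python
# Hypothetical sketch: rank buildings by energy use so each can see
# its gap to the best performer (the comparison step of benchmarking).

def benchmark(buildings):
    """Rank buildings by energy use; report each one's excess vs the best.

    `buildings` maps a building name to its consumption in kWh.
    Returns a list of (name, kwh, excess_vs_best) sorted best-first.
    """
    ranked = sorted(buildings.items(), key=lambda kv: kv[1])
    best_name, best_kwh = ranked[0]
    return [(name, kwh, round(kwh - best_kwh, 1)) for name, kwh in ranked]

# Invented example readings for three buildings:
usage_kwh = {"office_a": 420.0, "office_b": 365.5, "office_c": 510.2}
for name, kwh, excess in benchmark(usage_kwh):
    print(f"{name}: {kwh} kWh (+{excess} vs best)")
```

The "learning" Haque mentions would then be buildings (or their operators) adopting the strategies of whoever tops such a ranking.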
]]></content:encoded>
			<wfw:commentRss>http://www.ugotrade.com/2009/01/28/pachube-patching-the-planet-interview-with-usman-haque/feed/</wfw:commentRss>
		<slash:comments>64</slash:comments>
		</item>
		<item>
		<title>Is it &#8220;OMG Finally&#8221; for Augmented Reality?: Interview with Robert Rice</title>
		<link>http://www.ugotrade.com/2009/01/17/is-it-%e2%80%9comg-finally%e2%80%9d-for-augmented-reality-interview-with-robert-rice/</link>
		<comments>http://www.ugotrade.com/2009/01/17/is-it-%e2%80%9comg-finally%e2%80%9d-for-augmented-reality-interview-with-robert-rice/#comments</comments>
		<pubDate>Sun, 18 Jan 2009 01:03:32 +0000</pubDate>
		<dc:creator><![CDATA[Tish Shute]]></dc:creator>
				<category><![CDATA[3D internet]]></category>
		<category><![CDATA[Ambient Devices]]></category>
		<category><![CDATA[Ambient Displays]]></category>
		<category><![CDATA[architecture of participation]]></category>
		<category><![CDATA[Augmented Reality]]></category>
		<category><![CDATA[digital public space]]></category>
		<category><![CDATA[Ecological Intelligence]]></category>
		<category><![CDATA[Energy Saving]]></category>
		<category><![CDATA[home automation]]></category>
		<category><![CDATA[home energy monitoring]]></category>
		<category><![CDATA[home energy monitors]]></category>
		<category><![CDATA[Instrumenting the World]]></category>
		<category><![CDATA[Metaverse]]></category>
		<category><![CDATA[mirror worlds]]></category>
		<category><![CDATA[Mixed Reality]]></category>
		<category><![CDATA[Mobile Technology]]></category>
		<category><![CDATA[nanotechnology]]></category>
		<category><![CDATA[open metaverse]]></category>
		<category><![CDATA[OpenSim]]></category>
		<category><![CDATA[Paticipatory Culture]]></category>
		<category><![CDATA[Second Life]]></category>
		<category><![CDATA[smart appliances]]></category>
		<category><![CDATA[Smart Devices]]></category>
		<category><![CDATA[Smart Planet]]></category>
		<category><![CDATA[social gaming]]></category>
		<category><![CDATA[social media]]></category>
		<category><![CDATA[sustainable living]]></category>
		<category><![CDATA[sustainable mobility]]></category>
		<category><![CDATA[virtual communities]]></category>
		<category><![CDATA[virtual economy]]></category>
		<category><![CDATA[virtual goods]]></category>
		<category><![CDATA[Virtual Meters]]></category>
		<category><![CDATA[virtual world standards]]></category>
		<category><![CDATA[Web 2.0]]></category>
		<category><![CDATA[Web 3D]]></category>
		<category><![CDATA[Web Meets World]]></category>
		<category><![CDATA[Web3.D]]></category>
		<category><![CDATA[World 2.0]]></category>
		<category><![CDATA[Android]]></category>
		<category><![CDATA[AR Geisha Doll]]></category>
		<category><![CDATA[compass in the android]]></category>
		<category><![CDATA[Denno Coil]]></category>
		<category><![CDATA[EEML]]></category>
		<category><![CDATA[hybrid augmented/virtual reality]]></category>
		<category><![CDATA[immersive mobile augmented reality]]></category>
		<category><![CDATA[markerless augmented reality]]></category>
		<category><![CDATA[massively multiuser augmented reality]]></category>
		<category><![CDATA[minimally immersive augmented reality]]></category>
		<category><![CDATA[mobile augmented reality]]></category>
		<category><![CDATA[Neogence]]></category>
		<category><![CDATA[next generation transparent wearable displays]]></category>
		<category><![CDATA[NYC Tech Meetup]]></category>
		<category><![CDATA[Pachube]]></category>
		<category><![CDATA[Robert Rice]]></category>
		<category><![CDATA[socializing sensor data]]></category>
		<category><![CDATA[Unreal 3]]></category>
		<category><![CDATA[Web Alive]]></category>
		<category><![CDATA[Wikitude]]></category>

		<guid isPermaLink="false">http://www.ugotrade.com/?p=2620</guid>
		<description><![CDATA[Neogence is in stealth mode with an immersive mobile augmented reality platform &#8211; &#8220;tools, sdk, and infrastructure plus some applications.&#8221; They are probably six months away from YouTubing anything, according to CEO Robert Rice. But Robert rustled up this pic for me &#8211; a Google street view of Neogence R&#38;D labs: &#8220;the patio on the [&#8230;]]]></description>
				<content:encoded><![CDATA[<p><img class="alignnone size-full wp-image-2557" title="neogencesekrithqpost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/neogencesekrithqpost.jpg" alt="neogencesekrithqpost" width="450" height="412" /></p>
<p><a id="zd89" title="Neogence" href="http://www.neogence.com/sekrets.html" target="_blank">Neogence</a> is in stealth mode with an immersive mobile augmented reality platform &#8211; &#8220;tools, sdk, and infrastructure plus some applications.&#8221; They are probably six months away from YouTubing anything, according to CEO <a id="rzgp" title="Robert Rice" href="http://curiousraven.squarespace.com/about-me/" target="_blank">Robert Rice</a>. But Robert rustled up this pic for me &#8211; a Google street view of Neogence R&amp;D labs: &#8220;the patio on the lower left is where I do a lot of pacing and smoking my pipe, and the porch and office upstairs is where a lot of meetings have been held.&#8221;</p>
<p><a id="rzgp" title="Robert Rice" href="http://curiousraven.squarespace.com/about-me/" target="_blank">Robert Rice</a> (<a id="x_:i" title="@RobertRice" href="http://twitter.com/RobertRice" target="_blank">@RobertRice</a> ), CEO of <a id="zd89" title="Neogence" href="http://www.neogence.com/sekrets.html" target="_blank">Neogence</a>, recently tweeted:</p>
<p><em><strong>I&#8217;m changing my name to Robert Mobile Ubiquitous Geospatial Augmented Rice. I&#8217;m betting on radical changes in next 18 months.</strong></em></p>
<p>Although Robert&#8217;s new AR platform is still under wraps, I think you will get a good idea of what direction he is going in from this interview (full text at end of this post). Robert is the author of &#8220;<a id="c:rr" title="MMO Evolution" href="http://books.google.com/books?id=dkZ-6C5utz8C&amp;dq=MMO+Evolution&amp;printsec=frontcover&amp;source=bn&amp;hl=en&amp;sa=X&amp;oi=book_result&amp;resnum=4&amp;ct=result" target="_blank">MMO Evolution</a>&#8221; and is a key developer and thought leader in persistent immersive environments, simulations, virtual worlds and massively multiplayer games, as well as large scale communities and social networking.</p>
<h3>It is OMG finally, at least, for minimally immersive but truly useful AR.</h3>
<p>Since the launch of Android, a new generation of useful augmented reality applications like <strong><a href="http://www.mobilizy.com/wikitude.php" target="_blank">Wikitude</a></strong> is emerging.</p>
<p>After the last <a href="http://www.meetup.com/ny-tech/calendar/9466657/" target="_blank">NYC Tech Meetup</a>, my friend <a title="Nat Mobile Meets Social DeFreitas" href="http://openideals.com/" target="_blank">Nathan Freitas</a> (<a title="@NatDefreitas" href="http://twitter.com/natdefreitas" target="_blank">@NatDefreitas</a>), or rather Nathan Mobile Meets Social Freitas, demoed for me a cool graffiti app he has developed on Android. You leave a marker for your graffiti so other people can find, view, and add their own &#8211; a nice primal experience, like pissing on the lamp post to let your pack know where you&#8217;ve been. The graffiti app also taps into a long history of NYC street culture around tagging and graffiti art. For more cool mobile projects Nathan is working on &#8211; <a href="http://blog.twittervotereport.com/" target="_blank">Vote Report</a> and data collection for mass events, a guide to pubs and nightlife in New York City, and more &#8211; see his blog, &#8220;Nathan&#8217;s <a href="http://openideals.com/" target="_blank">OpenIdeals</a>.&#8221; With camera, GPS, compass, and accelerometer, plus APIs on Android for temperature and light meters (no hardware yet), Nathan says Android:</p>
<p><em><strong>&#8220;seems to be the platform most likely to socialize the idea that sensor data could be a piece of every application.&#8221;</strong></em></p>
<p>As Nathan is fond of saying:</p>
<p><strong><em>The compass is a killer app enabler!</em></strong></p>
<p>Also see <a id="ixwx" title="OpenIntents" href="http://code.google.com/hosting/search?q=label:sensors" target="_blank">OpenIntents</a> for some interesting Android sensor projects.</p>
<p><img class="alignnone size-full wp-image-2558" title="wikitudepost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/wikitudepost.jpg" alt="wikitudepost" width="450" height="356" /></p>
<p><strong><a href="http://www.mobilizy.com/wikitude.php" target="_blank">Wikitude</a></strong> was one of <em><strong><a href="http://www.mobilizy.com/wikitude.php" target="_blank">Thomas Wrobel</a>&#8217;s</strong></em> two top AR milestones for 2008 (see <a id="vwuu" title="Gamesalfresco" href="http://gamesalfresco.com/" target="_blank">Gamesalfresco</a>):</p>
<p><em><strong><a href="http://www.mobilizy.com/wikitude.php" target="_blank">Wikitude</a> I think. Seems the first released, useful, AR software.</strong></em></p>
<p><a href="http://gamesalfresco.com/2008/07/20/want-your-own-augmented-reality-geisha/" target="_self">AR Geisha doll</a> is also a remarkable breakout for AR &#8211; but useful, nah.</p>
<p>I asked Robert if he also considered <a href="http://www.mobilizy.com/wikitude.php" target="_blank">Wikitude</a> and the <a href="http://gamesalfresco.com/2008/07/20/want-your-own-augmented-reality-geisha/" target="_self">AR Geisha doll</a> significant breakthroughs:</p>
<p><em><strong>Yes, these are among the first attempts to get away from the novelty of simply rendering a 3D object based on a marker and making it interesting.</strong></em></p>
<p><em><strong>Remember, one of the biggest risks that AR has is being branded as &#8220;novelty&#8221;, which means &#8220;cool for five minutes but ultimately a waste of time.&#8221; I think we have a ways to go before something is truly useful, but as 2009 progresses we should start seeing some effort here. I&#8217;d guess 2010 before something really useful comes out&#8230; at least something practical.</strong></em></p>
<p><em><strong>Now, having said that, I should say that I expect entertainment and games to take the lead (as usual), although there are a few companies really trying to leverage AR and video/graphics compositing for marketing (brochures) and location based methods (kiosks, large screen projections, etc.)</strong></em></p>
<h3>So when is it &#8220;OMG finally!&#8221; for massively multiuser augmented reality?</h3>
<p><img class="alignnone size-full wp-image-2559" title="ar-guipost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/ar-guipost.jpg" alt="ar-guipost" width="450" height="360" /></p>
<p>The picture above is from <a id="kzm2" title="benjapo's portfolio" href="http://www.istockphoto.com/file_closeup/technology/computers/3919295-futuristic-computer-panel.php?id=3919295" target="_blank">benjapo&#8217;s portfolio</a> on istockphoto &#8211; also see the <a id="cqhi" title="istock video here" href="http://www.istockphoto.com/file_closeup/technology/computers/3919295-futuristic-computer-panel.php?id=3919295" target="_blank">istock video here</a>.</p>
<p><a id="ylpn" title="Alex Soojung-Kim Pang considers" href="http://www.endofcyberspace.com/2006/11/royal_college_o.html" target="_blank">Alex Soojung-Kim Pang</a> (who weighed in recently on the <a id="vr8o" title="twitter-baby" href="http://www.endofcyberspace.com/2008/12/twitter-baby.html" target="_blank">twitter-baby</a> debates &#8211; see my <a href="http://tishshute.com/twitter-baby-debates" target="_blank">KickBee Posterous</a> blog) challenges design assumptions for augmented reality that take as a given the userâ€™s desire for numerous private enhancements to their reality.</p>
<p>Alex points out that less will probably be more, so that enhancements do not impinge on shared experience. See his write-up of a talk he gave at the Royal College of Art, <a id="bxx1" title="&quot;and the end of my own private Shibuya.&quot;" href="http://www.endofcyberspace.com/2006/11/royal_college_o.html" target="_blank">"and the end of my own private Shibuya."</a> Photo below by <em>Stéfan, "</em><em><a href="http://www.flickr.com/photos/st3f4n/130889444/in/pool-84787688@N00">Karaoke in Shibuya</a></em><em>"</em></p>
<p><em><strong>Part of the pleasure of these streetscapes is precisely that they're collectively experienced, rather than individual visions: for even a brief period, we share with other postmodern, globe-hopping flaneurs and expatriates and temporary natives the light of the ABC-Mart sign and storefront.</strong></em></p>
<p><em><strong><img class="alignnone size-full wp-image-2560" title="karaokepost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/karaokepost.jpg" alt="karaokepost" width="450" height="338" /><br />
</strong></em></p>
<p>It is the collective experience of enhanced, augmented, virtual, or real experiences that interests me too. This is one of the reasons I find <strong><em><a href="http://www.pachube.com/" target="_new">Pachube</a></em></strong> and the <a href="http://www.eeml.org/" target="_blank">EEML project</a> of Haque Design and Research so interesting.</p>
<p><strong><em>Extended Environments Markup Language (EEML), a protocol for sharing sensor data between remote responsive environments, both physical and virtual. It can be used to facilitate </em><em>direct connections between any two environments; it can also be used to facilitate many-to-many connections as implemented by the web service <a href="http://www.pachube.com/" target="_new">Pachube</a>, which enables people to tag and share real time sensor data from objects, devices and spaces around the world.</em></strong></p>
<h3>"Distinctions between virtual and real are as quaint and outmoded as distinctions between mind and body" (Usman Haque)</h3>
<p><img class="alignnone size-full wp-image-2603" title="chair1post1" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/chair1post1.jpg" alt="chair1post1" width="150" height="150" /><img class="alignnone size-full wp-image-2602" title="remotechair-slpost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/remotechair-slpost.jpg" alt="remotechair-slpost" width="150" height="150" /><img class="alignnone size-full wp-image-2604" title="chair2post" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/chair2post.jpg" alt="chair2post" width="150" height="150" /></p>
<p>Usman Haque (founder of <a href="http://www.haque.co.uk/pachube.php" target="_blank">Pachube</a> and <a href="http://www.haque.co.uk/" target="_blank">Haque Design and Research</a>) points out that this is an underlying premise of his work &#8211; and of augmented reality (full interview coming up soon!).</p>
<p>The pictures above show the Haque Design project, <a href="http://www.haque.co.uk/remote.php" target="_blank">Remote</a>:</p>
<p><em><strong>'Remote' connects together two spaces, one in Boston the other in Second Life, and treats them as a single contiguous environment, bound together by the internet so that things that occur in one space affect things that happen in the other and vice versa &#8211; remotely controlling each other.</strong></em></p>
<p>There was a discussion on Twitter recently about how terms like Second Life, Exit Reality, and Virtual Worlds are misleading and outmoded. As Robert pointed out, we need:</p>
<p><em><strong>one word please… that sums up virtual and/or augmented reality, interactive, immersive, virtual worlds, mmorpgs, simulations, etc… also, I really don't like the term "augmented reality" or "mixed reality". Neither is all that great. And NO "matrix" or "metaverse" please</strong></em></p>
<p>Robert argues strongly that there is a stultification in virtual world technology &#8211; much of what we call virtual world technology was already, basically, where it is now in the mid '90s. And MMOGs have devolved into gameplay design "that emphasizes the single player experience and does nothing to take advantage of the potential of the massively connected internet."</p>
<p>Robert suggested I take a cruise through a new virtual space &#8211; <a href="http://www.cooliris.com/">CoolIris</a> &#8211; to find some good pictures for this post (note the partnership between <a href="http://blog.cooliris.com/2009/01/14/cooliris-and-seesmic-streamline-video-blogging/" target="_blank">CoolIris and Seesmic to streamline video blogging</a>). I added the CoolIris plugin to Firefox, typed "augmented reality" into search, and soon I was cruising a highway of images and links. The Road Map image grabbed my attention (see below). It shows the continua that <a href="http://www.metaverseroadmap.org/" target="_blank">the Metaverse RoadMap</a> authors thought are likely to influence the ways in which the Metaverse unfolds. It is "a map of the spectrum of technologies and applications ranging from augmentation to simulation; and the spectrum ranging from intimate (identity-focused) to external (world-focused)."</p>
<p><img class="alignnone size-full wp-image-2561" title="metaverseroadmap" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/metaverseroadmap.jpg" alt="metaverseroadmap" width="452" height="427" /></p>
<p>Quite to my surprise, when I clicked out of <a href="http://www.cooliris.com/">CoolIris</a> to the source for the image, I found it had been drawn from a post I wrote in May 2007, <em><strong><a id="jv.r" title="Hybridized Digital/Physical Worlds: Where Pop and Corporate Cultures Mingle." href="../../2007/05/22/hybridized-digitalphysical-worlds-where-pop-and-corporate-cultures-mingle/" target="_blank">Hybridized Digital/Physical Worlds: Where Pop and Corporate Cultures Mingle.</a> </strong></em>My post talks about a number of hybridization experiments that were bringing together lifelogging, sensors everywhere, simulation, virtual worlds, and augmentation.</p>
<p>The striking difference from 2007 to now is that we have definitely moved on from mere experimentation. And the poles of the continua <em><strong>intimate/extimate, augmentation/simulation</strong></em> as expressed in the Metaverse Roadmap are now becoming entwined (note the picture above seems to be slightly different from the one used in the road map as <a id="vdcf" title="posted here" href="http://www.metaverseroadmap.org/overview/" target="_blank">published here</a> &#8211; perhaps I had an early version?)</p>
<h3>&#8220;Augmented Reality is not just about overlaying data…&#8221; (Robert Rice)</h3>
<p><img class="alignnone size-full wp-image-2562" title="totalimmersion" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/totalimmersion.jpg" alt="totalimmersion" width="450" height="332" /></p>
<p>The screenshot above is from <a id="c7vm" title="TotalImmersions video" href="http://www.t-immersion.com/en,video-gallery,36.html#">Total Immersion's video</a> demoing Augmented Reality with 3D cell phones. <em>Also see the <a id="tvca" title="video of their immersive games" href="http://www.t-immersion.com/en,video-gallery,36.html#" target="_blank">video of their immersive games</a>, and FutureScope kiosks <a id="eje0" title="here" href="http://www.t-immersion.com/en,video-gallery,36.html#" target="_blank">here</a> and <a id="h-:s" title="here" href="http://www.t-immersion.com/en,video-gallery,36.html#" target="_blank">here</a>.</em></p>
<p><a id="vwuu" title="Gamesalfreso" href="http://gamesalfresco.com/">Games Alfresco</a> noted that Will Wright delivered the best <a href="http://www.pocketgamer.co.uk/r/Various/Spore+Origins/news.asp?c=8725" target="_blank">augmented reality quote</a> of the year. When describing AR as the way of the future for games, Will Wright said:</p>
<p><em><strong>"Games could increase our awareness of our immediate environment, rather than distract us from it".</strong></em></p>
<p>Robert points out in this interview that the term Augmented Reality itself has become associated with a very limited understanding of what "enhancing your specific reality" is really about. Robert notes:</p>
<p><em><strong>it is inherently about who YOU are, WHERE you are, WHAT you are doing, WHAT is around you, etc.</strong></em></p>
<p><em><strong>When I talk about AR, I try to expand the definition a little bit. Usually, when you talk to someone about augmented reality, the first thing that comes to mind is overlaying 3D graphics on a video stream. I think though, that it should more properly be any media that is specific to your location and the context of what you are doing (or want to do)… augmenting or enhancing your specific reality.</strong></em></p>
<p><strong><em>In this sense, anything that at least knows who you are (your ID, mobile phone #, etc.), where you are (GPS coord or a specific place like a cafe), and gives you relevant data, information, or media = augmented reality. Sure, you can make things more interactive or immersive, but that is the minimum.</em></strong></p>
<p><strong><em>So, in this case, yes, I think there will be networked applications in the next 18 months… mostly things that are enhanced by friends lists (you are here, your friend is over there). These will be *application specific*. My team at Neogence is already going beyond this, building a platform and infrastructure for other applications to be developed on… all networked through the same backbone. Now, in this context (the science fiction AR that we all dream about), no I do not see anyone else trying to leap a generation or two ahead of the industry to build a massively multiuser shared AR space. Expect to see things like multi-user AR games, virtual pets, kiosk marketing, magic book, "gee whiz" presentations (tradeshow booths, entertainment parks, etc.), and so forth.</em></strong></p>
<h3>Goggles Are Not The Secret Sauce…</h3>
<p><strong><em><img class="alignnone size-full wp-image-2563" title="ar-catpost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/ar-catpost.jpg" alt="ar-catpost" width="137" height="150" /><img class="alignnone size-full wp-image-2564" title="goggles-avatarpost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/goggles-avatarpost.jpg" alt="goggles-avatarpost" width="150" height="150" /><br />
</em></strong></p>
<p>AR Cat (left) and Robert Rice (right)</p>
<p>In the popular imagination, Augmented Reality has come to mean 3D graphics projected over markers &#8211; an idea forever waiting for the advent of "wicked next generation transparent wearable displays" &#8211; nirvana for augmented reality. While such displays may be nirvana for AR (and they could be with us in less than twenty-four months), goggles are not the "secret sauce" of AR, as Robert points out.</p>
<p><em><strong>All the glasses are is another display device. At the end of the day, it doesn't matter if you are looking at an LCD monitor, an iPhone, a head mounted display, or a pair of wicked next generation transparent wearable displays that magically draw directly on your retinas.</strong></em></p>
<p><em><strong>The real tricky stuff is what happens on the backend… making it all persistent, massively multiuser, intelligent, interoperable, realistic, etc. etc.</strong></em></p>
<p><em><strong><img class="alignnone size-full wp-image-2585" title="vuzix" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/vuzix.jpg" alt="vuzix" width="450" height="318" /><br />
</strong></em></p>
<p>There has been quite <a href="http://www.realwire.com/release_detail.asp?ReleaseID=10934" target="_blank">a buzz going around</a> about the new <a href="http://www.vuzix.com/iwear/products_wrap920av.html" target="_blank">Vuzix Eyewear</a>, and recently Robert talked with Vuzix and checked out the Wrap 920AV eyewear:</p>
<p><em><strong>Vuzix is not alone in pursuing the ultimate in hardware, at least as far as wearable displays. However, I think they are much farther than the rest of the pack in vision, roadmap, and execution. They have put together a team that has a sense of urgency and ambition that will blow the industry away. After talking to them, I got the feeling that they really know what they are doing and there is a lot of mind blowing stuff in their pipeline. I'm sure they are one of the few companies that really gets it and has a clear vision of the future. Definitely my first choice to work with.</strong></em></p>
<h3>Hybrid Augmented/Virtual Reality</h3>
<p><img class="alignnone size-full wp-image-2566" title="qa_2post" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/qa_2post.jpg" alt="qa_2post" width="450" height="347" /></p>
<p><a id="va0_" title="Cory Ondrejka posted" href="http://ondrejka.blogspot.com/2009/01/anybots-telepresence-robot.html" target="_blank">Cory Ondrejka posted</a> this picture of the anybots telepresence robot and â€œcongrats to <a href="http://www.tlb.org/">Trevor Blackwell</a> and the rest of the <a href="http://anybots.com/">Anybots</a> team on the launch of <a href="http://anybots.com/abouttherobots.html">QA at CES</a>.â€Â  Cory (one of the founders and former CTO of Second Life) also made some predictions for Virtual Worlds, some optimistic and some less so, including â€œthe increasing need to be able to diversify the Second Life product offering to begin truly rebuilding the code base.â€</p>
<p>Robert is unabashedly irritated with the state of play in Virtual Worlds and MMOGs:</p>
<p><em><strong>Unless both industries (Virtual Worlds and MMOGs) have some serious upheaval or radical new approaches, they will quickly be eclipsed by AR, which will eventually evolve into something hybrid AR/VR depending on your level of access and hardware.</strong></em></p>
<p><em><strong>I'd like to see someone grab an engine like Offset, Crytek, HERO, or Unreal 3, and smack on a fat MMO server infrastructure (Eve or Bigworld)… toss in the right tools, and you would see a revolution and renaissance occur at the same time in the virtual world space. All the puzzle pieces are there, just no one is putting them together the right way.</strong></em></p>
<p>I did just find out that Nortel's <a id="qkxv" title="WebAlive is powered by the Unreal 3 engine" href="http://www2.nortel.com/go/news_detail.jsp?cat_id=-8055&amp;oid=100251105&amp;locale=en-US" target="_blank">WebAlive is powered by the Unreal 3 engine</a>. You <a id="xqbw" title="can try WebAlive" href="http://www.lenovo.com/elounge" target="_blank">can try WebAlive</a> out here.</p>
<p>Robert points out how rare it has become to see people really push virtual worlds technology and MMOGs in entirely new directions. Although, of course, there are exceptions. I managed to engage some interest from Robert in the possibilities the <a href="http://opensimulator.org/wiki/Main_Page" target="_blank">open-source modular architecture of OpenSim</a> opens up, and <a id="vx_i" title="the augmented reality experiments from Georgia Tech with Second Life" href="http://arsecondlife.gvu.gatech.edu/" target="_blank">the augmented reality experiments from Georgia Tech with Second Life</a> (screenshot below) got praise from Robert for trying to do something new. (Georgia Tech have also put out a <a id="kfzj" title="virtual pet app for the iphone" href="http://uk.youtube.com/watch?v=_0bitKDKdg0" target="_blank">virtual pet app for the iPhone</a>.)</p>
<p><img class="alignnone size-full wp-image-2567" title="picture-4" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/picture-4.png" alt="picture-4" width="321" height="245" /></p>
<p>But while Robert clearly has zero patience for virtual world technology which he sees stuck in the mid nineties, he notes:</p>
<p><em><strong>the innovative and wonderful stuff about SL isn't SL, it is what people are doing and creating on their own with terrible tools *IN* SL</strong></em> [Second Life].</p>
<p>The immersive mobile augmented reality platform Robert is building, he hopes, will generate this kind of user creativity but with 21st century tools.</p>
<h3>So is it "OMG" finally for the Augmented Reality we have dreamed about?</h3>
<p>According to Robert:</p>
<p><em><strong>It really boils down to a markerless solution and a good application.</strong></em></p>
<p>In the interview below we cover a number of topics including business models for Augmented Reality, e.g., how business models based on micro-transactions and virtual goods will translate to Augmented Reality.</p>
<p>Many of the challenges to becoming mainstream faced by virtual worlds are similar to the challenges AR must overcome. Robert discusses these, including the interface/GUI that is a critical element for AR, solving the riddle of one world or many, patent wars in Virtual Worlds and Augmented Reality, the role of Augmented Reality in the future of sustainable computing, and what interoperability is about.</p>
<h3>The Back Story for AR/VR…</h3>
<p>In case you want to get up to speed on the required background reading for Augmented Reality, this is Robert's required reading list, and Denno Coil is an absolute <strong>must</strong> see (feel free to add to this list in the comments, please).</p>
<p>"If you want to see the things that have inspired our vision of what we want to build, check out:</p>
<p>* Dream Park by Larry Niven and Steven Barnes<br />
* Rainbows End by Vernor Vinge<br />
* Spook Country by William Gibson<br />
* Halting State by Charles Stross<br />
* The Diamond Age by Neal Stephenson<br />
* Donnerjack by Roger Zelazny and Jane Lindskold<br />
* Otherland by Tad Williams<br />
* Neuromancer by William Gibson<br />
* Idoru by William Gibson<br />
* Cryptonomicon by Neal Stephenson</p>
<p>and watch the whole anime of Denno Coil (subbed NOT dubbed!)"</p>
<p><img class="alignnone size-full wp-image-2568" title="dennoucoil" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/dennoucoil.jpg" alt="dennoucoil" width="450" height="256" /></p>
<p>Screenshot from Denno Coil from <a id="yic5" title="Concrete Badger" href="http://www.concretebadger.net/blog/2007/12/17/dennou-coil-full-series-2007-in-12-day-4/" target="_blank">Concrete Badger</a>.</p>
<h3>Interview With Robert Rice</h3>
<p><strong>Tish Shute:</strong> I am glad to hear that you are working on this [an immersive mobile augmented reality platform]!</p>
<p><strong>Robert Rice:</strong> We switched gears from MMO stuff about a year ago and we are finally getting some traction. It is very hard doing anything in this economy right now, but we found an opportunity to take AR to a new level beyond what you see on YouTube. AR is still too "cute" and novelty. We don't want to play around.</p>
<p><strong>Tish Shute:</strong> I like Wikitude 'cos it even manages to do something useful!</p>
<p><strong>Robert </strong><strong> Rice</strong><strong>:</strong> Yeah, useful = traction. Now that we are getting near a prototype we are starting to get a lot of interest even though we are still technically way under the radar.</p>
<p><strong>Tish Shute:</strong> r u funded?</p>
<p><strong>Robert </strong><strong> Rice</strong><strong>:</strong> privately funded, some revenues from an early license, and ongoing discussions with several institutional investors. So, we have some funding, but nothing spectacular just yet.</p>
<p><strong>Tish Shute:</strong> are you just developing an AR platform?</p>
<p><strong> Robert Rice:</strong> hrm, sort of, but not just that. By platform I mean tools, sdk, and infrastructure plus some applications. The idea is to build something that facilitates everyone else making cool things and useful applications for different industries/sectors</p>
<p><strong>Tish Shute:</strong> Yes that is the cool thing to do but isn't that hard to fund!</p>
<p>(Robert grins) Well, that depends on the business model. We've got that figured out. I'd be absolutely happy if everyone and their brother were making applications on our stuff; that gives us an edge on market penetration/saturation. There are plenty of examples that prove the model. If you give people free and easy to use tools, they will run with it. ARToolKit, for example, has tons of people making nifty things and posting videos on YouTube, which has pushed it to the forefront as THE AR middleware to use right now. Or heck, look at YouTube's free service, and they dominate video sharing. Sure there will be a lot of "noise", but there will also be a lot of "signal" that will rise to the top; facilitating and enabling is creating value in its own right.</p>
<p><strong>Tish Shute:</strong> But how do you expect to monetize?</p>
<p><strong>Robert</strong><strong> Rice</strong><strong>:</strong> There are a good half a dozen ways to monetize AR or an AR platform.</p>
<p><strong>Tish Shute:</strong> What are your top 3?</p>
<p><strong>Robert</strong><strong> Rice</strong><strong>:</strong> hrm, microtransactions, localized mobile advertising, and enterprise solutions (visualization)</p>
<p><strong>Tish Shute:</strong> Do you think the consumer market will give the lead?</p>
<p><strong>Robert Rice:</strong> I'm not sure. We are getting people from academia, intelligence, defense, border security, and some corporate types knocking on our door already, and pretty aggressively. It may be that those sectors push AR before consumer entertainment really kicks off.</p>
<p>But going back to a discussion we had earlier &#8211; yes, working with "no markers" is a big deal.</p>
<p><strong>Tish Shute:</strong> Can you talk about what you are doing there or is it still under wraps?</p>
<p><strong>Robert Rice:</strong> I can say that between some university tech transfer and some of our own proprietary stuff, we are using some fairly common visual tracking technology. If you are really plugged into the AR scene, you will know there are probably half a dozen visual tracking methods out there. We just looked for the best one, licensed it for commercial use, and then started working our magic. This is a very small piece of the overall effort, but worth noting.</p>
<p>The downside of working with university tech is that it is usually based on research, incomplete, and not wrapped up in a nice commercial package. On the upside, it can be a good start to build on.</p>
<p><strong>Tish Shute:</strong> As you know I am very interested in "technology that matters", in particular tech that can help us accomplish the urgent goal of sustainable living.</p>
<p><strong>Robert Rice:</strong> Oh, I'm pretty keen on sustainable living as well… after I sell off a few companies and have money of my own, I'm going to get into arcologies.<br />
…<br />
Robert grins</p>
<p>The interesting thing with the visual stuff combined with our other tech is that we can make things multiuser, persistent, dynamic, and mobile.<br />
The markers (fiducials) are really, really limiting outside of basic applications. You can't really plaster everyone and everything with a marker. And they are, by nature, static (even if they are animated or whatever).</p>
<p>Also… our stuff works indoors and outdoors even without a GPS connection.<br />
…<br />
Robert grins</p>
<p><strong>Tish Shute:</strong> Now that does sound interesting!</p>
<p><strong>Robert Rice:</strong> Yeah, with visual, you don't need a compass or accelerometers either. Less hardware : )</p>
<p>You start with Wi-Fi triangulation or GPS coordinates to get a "brute" location, and then you use the visual stuff for down-to-the-meter accuracy, and that, by nature, gives you your orientation and positioning.</p>
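<p>The coarse-to-fine idea Robert describes &#8211; a rough GPS/Wi-Fi fix first, then visual matching against known features near that fix &#8211; can be sketched in a few lines. This is only an illustration of the pipeline's first stages; the landmark names, coordinates, and radius are hypothetical, not Neogence's actual system.</p>

```python
import math

# Hypothetical landmark database: id -> (lat, lon) of visually recognizable features.
LANDMARKS = {
    "cafe_sign": (35.7796, -78.6382),
    "statue":    (35.7801, -78.6390),
}

def coarse_fix():
    """Stage 1: a 'brute' location from GPS or Wi-Fi triangulation (tens of meters off)."""
    return (35.7798, -78.6385)  # hypothetical reading

def nearby_landmarks(fix, radius_m=100):
    """Stage 2 setup: only landmarks near the coarse fix need visual matching,
    which keeps the expensive image-matching search small."""
    lat, lon = fix
    out = []
    for name, (llat, llon) in LANDMARKS.items():
        # Small-area approximation: 1 degree of latitude is ~111,000 m.
        dy = (llat - lat) * 111_000
        dx = (llon - lon) * 111_000 * math.cos(math.radians(lat))
        if math.hypot(dx, dy) <= radius_m:
            out.append(name)
    return out

print(nearby_landmarks(coarse_fix()))
```

<p>Matching the camera image against those candidate landmarks is what would then yield meter-level position and, as Robert notes, orientation for free.</p>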
<p><strong>Tish Shute: </strong>Wow this is beginning to sound very interesting!</p>
<p><strong>Robert Rice: </strong>Once you have that, it doesn't matter where you go; it continues to track and continually refines areas you have been before. We've spent the last year figuring all this out. There are so many problems and obstacles ahead for anyone trying to do what we are, but we have already discovered solutions.</p>
<p>Oh, visual tracking = gesture-based interfaces too. That's going to take some work, but it's doable. The real pain in the ass there isn't the actual tracking, it is the interface design.</p>
<p>That's something that almost every AR company, venture, and research program is missing out on entirely. They are so focused on making cute things with markers that they are missing the larger problems: AR spam, interface, iconography, GUI, metaphor, interoperability, privacy, identity.</p>
<p><strong>Tish Shute:</strong> So how are you dealing with all that!!</p>
<p><strong>Robert Rice:</strong> We took the backwards approach of trying to think where we want things to be in ten years (and we read all the cool books… Vinge, Stephenson, Gibson, etc.) and then we spent time trying to think of what the potential problems are… like AR spam. It's bad enough when a giant penis flies by in Second Life; we don't want that to happen in a global wireless AR platform.</p>
<p><strong>Tish Shute: </strong>Do you have a prototype yet?</p>
<p><strong>Robert Rice:</strong> hrm, 6 months away from youtubing something. Problem has been slow funding, which equals slow development. We also don't want to show our cards too soon… too many potential competitors out there.</p>
<p>â¦<br />
Robert grins</p>
<p><strong>Tish Shute:</strong> when you say microtransactions what is the business potential there?</p>
<p><strong>Robert Rice: </strong>hrm, last year I think $1.5B was spent on virtual items. That's games and virtual worlds. That should hit $5B in a couple of years. That's basically people buying and selling things like WoW gold or items in SL or whatever. Microtransactions is basically the same thing, but in AR space.</p>
<p>Why couldn't a 3D artist make a wicked animated 3D dragon, and then sell it to someone else? With AR, you could sit it on your shoulder. With a good scripting engine, you could train it to do stuff. That's what I want to enable.</p>
<p>Tools + SDK + platform = enabling people to make and create. Add in a commerce level (microtransactions) and voilà.</p>
<p><strong>Tish Shute:</strong> At the moment all of these virtual goods are very platform specific, is that a problem for you?</p>
<p><strong>Robert Rice:</strong> Not at all. This is at a higher level. You have to switch mental models when you talk about what AR could or should be. For example, let's contrast the web and virtual worlds. For every virtual world you go to, you have to download a whole new client. Imagine if that model was applied to the web… you would need a brand new browser for every website you went to. That is just so… wrong.</p>
<p>It's the same thing for AR… people are thinking about it with the same mental and business models and development philosophies as virtual worlds or the web. There are some things and aspects that work fine, but not everything.</p>
<p>Virtual worlds are, by nature, necessarily different and walled gardens. The idea of 100% open and interoperable virtual worlds is a red herring… it sounds good, but in practice it is a really dumb idea.</p>
<p><strong>Tish Shute: </strong>I was wondering if you had a way to leverage all the 3D content already created 'cos that would jump start things in AR wouldn't it?</p>
<p><strong>Robert Rice:</strong> Oh yeah, that's easy. They all use the same polygons. Any virtual item in any game or virtual world was likely created with 3D Studio or Maya or something similar, and would be easy to convert and use.</p>
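<p>Robert's "same polygons" point is easy to see with a toy example: the widely exported Wavefront OBJ text format stores vertices as <code>v x y z</code> lines and faces as <code>f i j k</code> lines (1-indexed), and reading that into a neutral in-memory mesh takes only a few lines. This is a minimal sketch of the idea, not any particular tool's converter.</p>

```python
def parse_obj(text):
    """Parse a minimal Wavefront OBJ string into a neutral (vertices, faces) mesh."""
    vertices, faces = [], []
    for line in text.splitlines():
        parts = line.split()
        if not parts:
            continue
        if parts[0] == "v":        # vertex record: v x y z
            vertices.append(tuple(float(p) for p in parts[1:4]))
        elif parts[0] == "f":      # face record: f i j k (1-indexed, may be i/t/n)
            faces.append(tuple(int(p.split("/")[0]) - 1 for p in parts[1:]))
    return vertices, faces

# A single triangle, as it might be exported from any modeling tool.
OBJ = """
v 0.0 0.0 0.0
v 1.0 0.0 0.0
v 0.0 1.0 0.0
f 1 2 3
"""

verts, faces = parse_obj(OBJ)
print(verts, faces)  # three vertices and one zero-indexed triangle
```

<p>Once the geometry sits in a neutral structure like this, re-emitting it for another engine is bookkeeping, which is what makes the content portable (legal questions aside, as the next exchange notes).</p>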
<p><strong>Tish Shute:</strong> So people could bring their WoW weapons into your system?</p>
<p><strong>Robert Rice: </strong>Not legally, but sure. It's just a 3D model with a texture. It doesn't matter if you use Corel Draw or Photoshop or Paint Shop Pro… or one screwdriver or another. Part of my team's advantage is that we are all experienced in MMORPG and virtual world design and development. We know the tools, the tech, and what works and what doesn't.</p>
<p><strong>Tish Shute:</strong> But some of the 3D content created in the social worlds is what has most value to people.</p>
<p><strong>Robert</strong><strong> Rice</strong><strong>: </strong>Right, and that can be exported out easily.</p>
<p><strong>Tish Shute: </strong>But back to "real" life applications. Is your platform really markerless?</p>
<p><strong>Robert Rice:</strong> Yes. Marker = printed icon or glyph, also known as a fiducial.</p>
<p><strong>Tish Shute:</strong> But u must have some marker?</p>
<p><strong>Robert</strong><strong> Rice</strong><strong>:</strong> hrm, more accurately, you need a point of reference.</p>
<p>Visual tracking has been around for more than a decade. Lots of work for robots and other sectors.</p>
<p><strong>Tish Shute:</strong> But isn't the specificity of reference in terms of RL applications a vital key, for example, for a database of things?</p>
<p>(Robert grins) That is a different problem… tracking, registration, mapping, positioning, etc. That question has to do with mapping, which is related to visual tracking but not the same thing. We have a rather unique approach to some of this that I can't discuss (patent pending).</p>
<p><strong>Tish Shute: </strong>But for example, to create an augmented natural history of food &#8211; say I want to point at the slab of meat on my plate and know where that cow came from, what feedlot, how it was treated, etc.</p>
<p><strong>Robert</strong><strong> Rice</strong><strong>: </strong>That is not possible without ubiquitous nanotechnology. Shall I explain?</p>
<p><strong>Tish Shute:</strong> Yes please!</p>
<p><strong>Robert Rice:</strong> OK, let's step back a minute and turn that burger back into a cow… The first problem (of this particular situation) is differentiating one cow from another. Since most cows look alike, you can either attempt to discriminate visually (cow patterns) or use a much simpler option, like giving each cow an RFID chip in its bell or hoof.</p>
<p>Now, most people would try to figure out how to jam all sorts of info into the RFID chip, which sounds like a good idea but isn't. The trick would be to simply use the RFID to store a unique identifier, which is then linked to a database elsewhere.</p>
<p>That database should be continually updated with whatever relevant information you need, so as you get close with your AR laptop, wearable displays, or embedded brain chip, you get the identifier broadcast, then you get the info downloaded to you, and it "sticks" to the cow with the generic visual tracking (object following; even a simple bounding box is sufficient for a slow moving cow).</p>
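<p>The pattern Robert describes &#8211; the tag stores only an opaque identifier, and everything else lives in a continually updated back-end record &#8211; can be sketched in a few lines. The IDs, fields, and in-memory "database" here are illustrative, not any real livestock or AR system.</p>

```python
# Hypothetical back-end records keyed by the RFID tag's unique identifier.
CATTLE_DB = {
    "COW-0042": {"breed": "Hereford", "farm": "Utah", "feed": "grass", "weight_kg": 540},
}

def on_rfid_read(tag_id, db=CATTLE_DB):
    """The tag broadcasts only its ID; the AR client fetches the live record."""
    record = db.get(tag_id)
    if record is None:
        return {"error": f"unknown tag {tag_id}"}
    return record

# Updating the database changes what every AR viewer sees,
# without ever rewriting the tag itself.
CATTLE_DB["COW-0042"]["weight_kg"] = 545
print(on_rfid_read("COW-0042")["weight_kg"])  # 545
```

<p>This is why jamming data into the chip is the wrong design: the chip would go stale, while an identifier pointing at a live record never does.</p>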
<p>So, up to that point, you can get tons of information about that specific cow, or that cow population (remember, AR is not just about overlaying data… it is inherently about who YOU are, WHERE you are, WHAT you are doing, WHAT is around you, etc.). Tie in data visualisation and some farmer tools and all sorts of other things happen. Now, let's move the timeline ahead a bit.</p>
<p>The butcher gets the cow and does his handiworkâ€¦because we know all the info about the cow, all of the meat can be properly labeled and marked. Ideally, with a UPC code or a unique glyph (somewhat problematic depending on how many unique glyphs you can create) so, while you are in the grocery store, you can access the relevant shopping dataâ€¦age of cow, state of origin, type of feed, how many spots, how much body fat, which butcher, whatever not because of what is inside the package, but the package itself.</p>
<p>Getting back to your hamburger, the problem is that it is a burgerâ€¦there is nothing to distinguish that burger from another one at the tableâ€¦unless you stuck a rfid chip in it or splattered it with ink and a unique glyph, or maybe a special one of a kind plate.</p>
<p>However, a properly designed AR system could say â€œhey! that/s a hamburger! and I know I am at Fat Daddyâ€™s Burger Joint in Raleigh North Carolina on Glenwood Avenue, and I know that they cook their burgers this particular way, and their meat supplier is those guys over there, and they usually get their cow meat from a farm out in Utahâ€</p>
<p>With ubiquitous nanomites or whatever, then its not that far out to consider edible nanos that are in the meat and that broad cast info so a slab of meat can tell you about itself and broadcast that to the general public.</p>
<p><strong>Tish Shute:</strong> What useful scenarios can we create without the nanomites?</p>
<p><strong>Robert Rice:</strong> If it wasn't a burger or a consumable organic, the scenario changes.</p>
<p><strong>Tish Shute: </strong>What is the time scale on nanomites?</p>
<p><strong>Robert Rice:</strong> Ehhhhhhh, 20 years minimum if we are lucky. They sound good on paper, but there is a whole book's worth of problems and reasons why they are so far off… as consumer-grade, all-over-the-place type of stuff.</p>
<p><strong>Tish Shute:</strong> Did you see the Nokia Home Control center?</p>
<p><strong>Robert</strong><strong> Rice</strong><strong>:</strong> Yes, I saw the Nokia stuff.</p>
<p>AR for sensors, like security systems, temperature control, etc.: they all become "sources of data" that an AR system can visualize. So yes, that's easily doable. You could do it in a short period of time with some half-decent engineers.</p>
<p>The trick of what Nokia is doing is aggregating sensor data from a building/home/facility, mashing it together, and sending the mobile device alerts and data visualizations. Conceptually it's rather simple, but no one has done it right or well yet.</p>
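<p>The aggregation idea sketched here (collect readings from around a building, mash them together, alert the mobile device) can be illustrated with a toy rule check. All sensor names and thresholds below are invented, not from the Nokia product.</p>

```python
# Hedged sketch of home-sensor aggregation: pull readings from several
# sensors, evaluate each against a rule, and collect the failures as
# alerts to push to a mobile device. Everything here is illustrative.

SENSORS = {
    "living_room_temp_c": 31.5,
    "front_door_locked": 0,        # 0 = unlocked, 1 = locked
    "basement_humidity_pct": 44.0,
}

RULES = {
    "living_room_temp_c": lambda v: v <= 28.0,   # too hot above 28 C
    "front_door_locked": lambda v: v == 1,       # door should be locked
    "basement_humidity_pct": lambda v: v < 70.0, # damp basement warning
}

def build_alerts(readings: dict) -> list:
    """Return the names of all sensors whose rule currently fails."""
    return [name for name, ok in RULES.items()
            if name in readings and not ok(readings[name])]
```

<p>An AR front end would then render each failed rule as an overlay on the offending room or device rather than as a plain list.</p>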
<p>It wouldn't surprise me if Nokia pulled it off.</p>
<p><strong>Tish Shute:</strong> Yes, and if they do, and someone does an AR interface to it, that would be an inflection point for AR?</p>
<p><strong>Robert Rice:</strong> In a roundabout way, yes. You could get data directly from your house, or get it through your mobile device, and in either case use the AR for visualization and control.</p>
<p>The interface/GUI is a critical element for AR. That is one of the areas where it, as an industry, risks doing a bad job and turning into just a fad or another novelty like VR. Virtual worlds have been struggling with that for a while, but MMORPGs have had the effect of extending their life cycle.</p>
<p><strong>Tish Shute: </strong>Yes, VWs have not solved the interface problem.</p>
<p><strong>Robert Rice: </strong>The interface is one of their problems, yes. Most virtual worlds are stuck in 1996&#8211;98.</p>
<p><strong>Tish Shute:</strong> If AR is inherently about who YOU are, WHERE you are, WHAT you are doing, WHAT is around you, etc., it seems that it is the ideal interface for home control?</p>
<p><strong>Robert Rice:</strong> Well, for home control, you must know:</p>
<p>1) Who am I? Am I authorized to know this information? Am I a guest?</p>
<p>2) Where am I? Is this my house, or someone else's?</p>
<p>3) What am I doing? Do I want to make all the doors lock? Turn on or off lights? Open the garage door? Trigger the security alarm?</p>
<p>So the same questions apply.</p>
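<p>The three questions above map naturally onto an authorization check. A minimal sketch, with the roles, location strings, and action lists all invented for illustration:</p>

```python
# Toy authorization for AR home control, following the WHO / WHERE / WHAT
# checks listed above. Roles, actions, and locations are made up.

GUEST_ACTIONS = {"lights_on", "lights_off"}
OWNER_ACTIONS = GUEST_ACTIONS | {"lock_doors", "open_garage", "arm_alarm"}

def authorize(user_role: str, location: str, action: str) -> bool:
    """Allow an action only for a known role, on the premises, within its rights."""
    if location != "home":
        return False  # WHERE: commands only work at the house itself
    allowed = {"owner": OWNER_ACTIONS, "guest": GUEST_ACTIONS}.get(user_role)
    return allowed is not None and action in allowed  # WHO and WHAT
```

<p>The same shape works for someone else's house: the visitor simply resolves to a role with a smaller action set, or to no role at all.</p>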
<p>I'd say that all virtual worlds are stuck in the mid-90s. They are at least a decade behind the game worlds… in technology, design, implementation, architecture, etc. In my opinion, things like Second Life are shameful in how they are presented as state of the art, innovative, ground-breaking, new, wonderful, and world-changing.</p>
<p>But that's another topic of conversation :)</p>
<p><strong>Tish Shute: </strong>Well, for me the contribution of VWs is presence-enabled real-time interaction with an application (as a 3D info machine) and context with other people.</p>
<p><strong>Robert Rice: </strong>Oh, there is no doubt that they are greatly useful and have a phenomenal amount of potential.</p>
<p>They *could* be all those things I just said that SL isn't… the problem is that they are either just existing, or they are meandering around without any real focus or direction. They aren't evolving.</p>
<p>Even MMORPGs are losing their way and beginning to stagnate terribly.</p>
<p><strong>Tish Shute:</strong> yes I agree</p>
<p><strong>Robert</strong><strong> Rice</strong><strong>: </strong>But, AR has the potential to change a lot of things.</p>
<p>I'm sure you have seen <a id="n_22" title="the yellowbook commercials" href="http://www.youtube.com/watch?v=zdPFBTQpk-U" target="_blank">the yellowbook commercials</a>? The technologies you are seeing here are doable in, hrm, a year or less maybe. The tricky part is the interactivity and AI… that is, the content. Everything else isn't a problem. The avatar there could be photorealistic or stylized like a WoW character.</p>
<p>You could do that to some degree with markers for registration, but dynamically changing the content linked to those markers is a little weird.</p>
<p>(By the way, for the record, I like markers just fine; I just don't think they are useful for real-world mobile applications.)</p>
<p>I also think that the guys who want to dust the planet with miniature RFID chips are on crack and are going about it the wrong way.</p>
<p><strong>Tish Shute: </strong>A high level of interactivity is hard though, isn't it? Even in VWs it is very limited.</p>
<p><strong>Robert Rice:</strong> It depends on whether you can track what the user is doing and interpret that properly. "Interactive" is also a very loose term.</p>
<p>Clicking a button and making a light blink could be considered interactive.</p>
<p><strong>Tish Shute: </strong>In VWs a high level of interactivity would be to wield a virtual hammer and have a real nail go in! Is physics part of the problem?</p>
<p><strong>Robert Rice:</strong> Physics isn't difficult; there is plenty of middleware out there for it. The problem with that isn't so much the physics as the scale and purpose.</p>
<p><strong>Tish Shute:</strong> well for robotics?</p>
<p><strong>Robert Rice:</strong> That gets into a conversation about meshes, textures, and volumetric collision detection and stuff.</p>
<p><strong>Tish Shute:</strong> virtual robotics?</p>
<p><strong>Robert</strong><strong> Rice</strong><strong>:</strong> You mean teleremote/telepresence of real robots?</p>
<p><strong>Tish Shute: </strong>yes!</p>
<p><strong>Robert Rice:</strong> Ah, for that you need some tactile feedback and some other stuff &#8211; doable, but insanely difficult. That's why you don't see a whole lot of remote-controlled surgery robots all over the place.</p>
<p>They do exist…</p>
<p><strong>Tish Shute: </strong>Will AR contribute to sustainable living by freeing us from some of our energy-hogging devices?</p>
<p><strong>Robert Rice: </strong>AR will ultimately encourage energy saving and recycling: where did I leave a light on? Where is the nearest trash can? What is the UV index outside today?</p>
<p>Yes, computers are energy hogs, but as we start seeing larger SSD drives, more efficient CPUs (even if the number of cores increases in multiples), and so on, the power will go down.</p>
<p>Also, think about this… wearable displays potentially use less energy than the LCD monitors on your desk.</p>
<p><strong>Tish Shute: </strong>Yes, I should pick the brains of my Intel chums on energy saving!</p>
<p><strong>Robert</strong><strong> Rice</strong><strong>: </strong>Getting rid of the monitor and switching to solid state drives will save an assload of power. Yes, I said assload.</p>
<p>Tell your Intel chums to quit screwing around with single-core mobile CPUs. We need multiple cores that are smaller, faster, and use less power.</p>
<p><strong>Tish Shute: </strong>Is AR the sustainable future of VWs and MMOGs?</p>
<p><strong>Robert</strong><strong> Rice</strong><strong>: </strong>The fun stuff will happen when they are both integrated in some fashion.</p>
<p><strong>Tish Shute:</strong> So perhaps this is why the Georgia guys are trying to combine AR and SL (<a id="boum" title="see video here" href="http://uk.youtube.com/watch?v=O2i-W9ncV_0&amp;feature=related" target="_blank">see video here</a>).</p>
<p><strong>Robert</strong><strong> Rice</strong><strong>:</strong> That first video was pretty damn cool. It just pains me that they are using SL for it. And omg, all those markers on the table.</p>
<p>Although, I couldn't care less about seeing my SL avatar on my coffee table. I would rather see an avatar representing ME in the real world, moving around in a virtual world that is a "to scale" replica of the real world. That is MUCH more interesting and innovative.</p>
<p>But even if I don't like where they are going, or that they are using SL, the important thing is that they are doing something and forging ahead. I have a massive amount of respect for anyone, private, government, or academic, who is doing that.</p>
<p>And yes, the door (or window, or looking glass) has to work both ways for maximum potential; at least, that's what I'd like to see. They don't *have* to, but it would be rather cool.</p>
<p>And going back to sustainability, AR has the potential to make monitors generally obsolete, laptops too. That's a lot of power-hungry devices with all sorts of metals and batteries inside.</p>
<p>But, even if the tech was absolutely crazy awesome right this minute, it would take a little while for consumer adoption.</p>
<p><strong>Tish Shute:</strong> But AR unleashes the mobile device?</p>
<p><strong>Robert Rice: </strong>Yes, AR is going to be built on powerful mobile devices for the near future, and eventually on embedded computers in clothing and whatnot. But that is a ways off.</p>
<p>Entertainment is going to be the first huge driver.</p>
<p><strong>Tish Shute:</strong> So people will get used to having a pet virtual dragon on their shoulder first?</p>
<p><strong>Robert Rice: </strong>Yes, a virtual dragon is way cool, easy tech for games, and can eventually be leveraged into a smart agent, which becomes a practical application… agent-based contextual search, etc. Yes, entertainment will also drive people to get used to the tech.</p>
<p><strong>Tish Shute: </strong>Oh, thanks for turning me on to <a id="kzbv" title="gamesalfresco" href="http://gamesalfresco.com/" target="_blank">gamesalfresco</a>!</p>
<p><strong>Robert Rice: </strong>I've noticed that the good stuff usually gets linked there. They don't list my blog, but that's what I get for staying under the radar and not posting often. But anyway, gamesalfresco is the first place I send people who need a crash course in AR. Great site, great owner.</p>
<p><strong>Tish Shute:</strong> So are you in agreement with Thomas Wrobel's positioning of <em><strong><a href="http://www.mobilizy.com/wikitude.php" target="_blank">Wikitude</a></strong></em> and the <em><strong><a href="http://gamesalfresco.com/2008/07/20/want-your-own-augmented-reality-geisha/" target="_self">AR Geisha doll</a></strong></em> as significant milestones for AR?</p>
<p><strong>Robert Rice: </strong>Yes, these are among the first attempts to get away from the novelty of simply rendering a 3D object based on a marker, and to make it interesting.</p>
<p>Remember, one of the biggest risks AR faces is being branded as "novelty", which means "cool for five minutes but ultimately a waste of time." I think we have a ways to go before something is truly useful, but as 2009 progresses we should start seeing some effort here. I'd guess 2010 before something really useful comes out… at least something practical.</p>
<p>Now, having said that, I should say that I expect entertainment and games to take the lead (as usual), although there are a few companies really trying to leverage AR and video/graphics compositing for marketing (brochures) and location-based methods (kiosks, large-screen projections, etc.)</p>
<p><strong>Tish Shute:</strong> Many people would say Snow Crash (metaverse) is now and Halting State (AR) is ten years from now. But you are seeing a development timeline for some popular AR apps in the next 18 months?</p>
<p><strong>Robert Rice: </strong>Anyone who says Snow Crash is -now- is living in a box. Virtual worlds, virtual reality, and immersive tech in general stopped innovating in the mid-90s. I'm continually flabbergasted at the number of people who think that things like Second Life are state-of-the-art or innovative. You might as well try to market a Walkman as cutting edge, even though we have iPods out there.</p>
<p>I'd like to see someone grab an engine like Offset, Crytek, Hero, or Unreal 3, and smack on a fat MMO server infrastructure (Eve or BigWorld)… toss in the right tools, and you would see a revolution and a renaissance occur at the same time in the virtual world space. All the puzzle pieces are there; no one is just putting them together the right way.</p>
<p><strong>Tish Shute:</strong> Why doesn't anyone do that?</p>
<p><strong>Robert Rice: </strong>It's not cheap, people will only fund a copy of something that already exists, people fear change and innovation, etc. The list goes on. The right money goes to the wrong people all the time.</p>
<p>Alternatively stated, there is a lot of "right idea, wrong implementation".</p>
<p>MMORPGs carried the torch and have made huge strides on the technology front, but have devolved in design. More often than not the gameplay emphasizes the single-player experience and does nothing to take advantage of the potential of the massively connected internet.</p>
<p>Unless both industries have some serious upheaval or radical new approaches, they will quickly be eclipsed by AR, which will eventually evolve into something hybrid… AR/VR depending on your level of access and hardware.</p>
<p>But yes, I'd say that the next 18 months are going to be very interesting, with a lot of money being thrown around, new ventures, and plenty of content/applications. I expect most of this will be centered on single-user AR experienced through a mobile device with a screen (iPhone, Android, etc.). I expect that there will be a significant boost after Vuzix releases some of their wearable *transparent* displays, putting Microvision back into the "has potential but is too quiet" position.</p>
<p><strong>Tish Shute:</strong> AR conjures an image in many people's minds of dreadful headgear!</p>
<p><strong>Robert Rice: </strong>Yes, it is either transparent wearable displays (in an eyeglass form factor) or nothing. HMDs with miniature LCD or OLED displays are good for streaming video, but for the mobile ubiquitous AR we all dream about, it has to be something that looks and feels like a pair of Oakleys.</p>
<p>I should also mention that several different types and modes of AR are going to find themselves being defined and refined over the next two years as we continue to blaze new trails, establish a lexicon (we keep borrowing terms from games, VR, virtual worlds, mmorpgs), and really work out the how as well as the why.</p>
<p>Even though the idea of AR has been around for a long time, the technology is just beginning to emerge, and very few people are even looking far enough ahead to figure out the problems and solutions that the tech creates. Really, who is thinking about how to deal with AR spam right now?</p>
<p><strong>Tish Shute: </strong>Do you see any successful networked AR applications emerging in the next 18 months?</p>
<p><strong>Robert</strong><strong> Rice</strong><strong>:</strong> Yes and no.</p>
<p>When I talk about AR, I try to expand the definition a little bit. Usually, when you talk to someone about augmented reality, the first thing that comes to mind is overlaying 3D graphics on a video stream. I think, though, that it should more properly be any media that is specific to your location and the context of what you are doing (or want to do)… augmenting or enhancing your specific reality.</p>
<p>In this sense, anything that at least knows who you are (your ID, mobile phone number, etc.) and where you are (GPS coordinates or a specific place like a cafe), and gives you relevant data, information, or media = augmented reality. Sure, you can make things more interactive or immersive, but that is the minimum.</p>
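<p>That minimum definition (know who and where, return relevant media) fits in a few lines. The places, interests, and content strings below are invented; a fallback "*" entry stands in for generic per-location content.</p>

```python
# Toy reading of the minimal AR definition: given WHERE you are and WHO
# you are (here reduced to one declared interest), return the media most
# relevant to that context. The content index is entirely illustrative.

CONTENT_INDEX = {
    ("cafe", "coffee_lover"): "Today's single-origin pour-over reviews",
    ("cafe", "*"): "Menu and wifi password",
    ("museum", "*"): "Audio guide for the current exhibit",
}

def augment(place: str, interest: str) -> str:
    """Prefer user-specific content, fall back to generic place content."""
    return (CONTENT_INDEX.get((place, interest))
            or CONTENT_INDEX.get((place, "*"))
            or "No contextual content here")
```
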
<p>So, in this case, yes, I think there will be networked applications in the next 18 months… mostly things that are enhanced by friends lists (you are here, your friend is over there). These will be *application specific*. My team at Neogence is already going beyond this, building a platform and infrastructure for other applications to be developed on… all networked through the same backbone. Now, in this context (the science fiction AR that we all dream about), no, I do not see anyone else trying to leap a generation or two ahead of the industry to build a massively multiuser shared AR space. Expect to see things like multi-user AR games, virtual pets, kiosk marketing, magic books, "gee whiz" presentations (tradeshow booths, entertainment parks, etc.), and so forth.</p>
<p>The big thing I'm worried about is AR becoming the next Silicon Valley trend… once they realize the potential, an enormous amount of capital will flow to a bunch of startups with half-baked ideas, weak business models, ten-year-old tech, and a lot of overhyped marketing. That is the very thing that will kill this technology as something that has true power and potential to literally change the way we interact with each other, our surroundings, information, and media.</p>
<p><strong>Tish Shute: </strong>Do you think AR has value for a project like Pachube that helps us connect data from lots of different environments, and sensor and actuator data?</p>
<p><strong>Robert Rice:</strong> I think that AR has value as an interface to this data (essentially data visualization based on information streaming from a sensor or source, interpreted in some dynamic graphical manner that has meaning). This is one of the "big areas" where ubiquitous augmented reality and wearable computing will really shine. I'll definitely be keeping an eye on Pachube.</p>
<p><strong>Tish Shute:</strong> I can't help it! I am really interested to hear more about the Vuzix glasses.</p>
<p><strong>Robert Rice:</strong> Yeah, everyone is getting hung up on the glasses as the end-all, be-all, and on having markers everywhere too.</p>
<p>All the glasses are is another display device. At the end of the day, it doesn't matter if you are looking at an LCD monitor, an iPhone, a head-mounted display, or a pair of wicked next-generation transparent wearable displays that magically draw directly on your retinas.</p>
<p>The real tricky stuff is what happens on the backend… making it all persistent, massively multiuser, intelligent, interoperable, realistic, etc.</p>
<p>I think that we are within 24 months of the magic wearables (these new ones by Vuzix are probably the real first-generation attempt at doing it right). They won't be perfect, but I expect they will be functional… and once we have functional, we can start doing the good stuff.</p>
<p><strong>Tish Shute:</strong> You mentioned your disappointment with VWs and MMORPGs earlier. Could you tell me more about that?</p>
<p><strong>Robert Rice: </strong>Yeah, there was an evolutionary divergence between virtual worlds and MMORPGs a while back. One stagnated almost completely, and the other leapt ahead in one sense and devolved horribly in another. Neither is where the state of the art should be. That is a whole other conversation, and probably a second book.</p>
<p><strong>Tish Shute:</strong> So making AR persistent, massively multiuser, intelligent, interoperable, realistic, etc.: that is where your efforts are going?</p>
<p><strong>Robert Rice: </strong>Yes. I fully expect that the hardware is almost ready for it. You can cobble together some amazing things in the lab right now, and I think commercial viability is imminent. The real value (as far as I'm concerned) is in making it mobile, wireless, persistent, and massively multiuser. You could argue that augmented reality will take over where virtual reality failed and become internet three (internet one being the internet, internet two being the web)…</p>
<p>MMORPGs are nothing more than single-player games in a multiuser environment these days. I'm more than a bit bitter about it. All the right money went to the wrong people, and the best games we have are barely shadows of what we could have had by now.</p>
<p><strong>Tish Shute:</strong> Are there any open source AR platform dev projects?</p>
<p><strong>Robert Rice: </strong>Open source? Hrm, I'm sure there are multiple ones out there.</p>
<p>If not entirely open source, there are plenty of things to experiment with that are generally free if you aren't trying to sell something; DART and ARToolKit come to mind as very accessible applications.</p>
<p>Marker-based AR is very important right now… it is easy, low tech, understandable, highly customizable, and most importantly, accessible to the average joe. Ultimately, though, we need a method of pure tracking… no markers glued to everything on the planet, no "billions of RFIDs" embedded in every square inch of every object on the planet, etc.</p>
<p><strong>Tish Shute:</strong> What do you mean by interoperability in AR? And what do you think about the development of standards?</p>
<p><strong>Robert</strong><strong> Rice</strong><strong>: </strong> Ooh, good question.</p>
<p>Ok, so the internet is basically computers communicating with computers, and the web is mostly pages linking to other pages (I'm greatly oversimplifying here). Hold that thought for a minute.</p>
<p>Switch over to MMORPGs. If you want to play in one (or in a virtual world), you need to download a client that is specific to that world. One client does not work with another world. There are plenty of efforts to change this, but they are all barking up the wrong tree. The specific uniqueness of each world defeats the need and purpose of true interoperability, unless you completely reinvent the whole thing with a common backbone, features, functionality, etc. The very nature of virtual worlds and MMORPGs rebels against this. You absolutely do not want an avatar from Second Life running around in World of Warcraft (for reasons that should be obvious).</p>
<p>On the other hand, with the web, you can use just about any client (browser) to access nearly any website (some requiring plugins or whatever).</p>
<p>The thing with augmented reality is, how do we go about making this? I've seen a few people thinking about this from the wrong perspective. There was a question at the last TechCrunch to the Sekai Camera guys (a conceptual AR application for the iPhone) where someone on the panel wanted to know how website owners would convert their content for augmented reality. BZZZZZT! That is a fundamental misunderstanding of what AR is, or could be, and it falls into the same trap I see a lot of people falling into… looking at AR through the web 2.0 lens or the virtual world lens. It is absolutely fundamentally different at the core… sure there are similarities: it has social networking/media applications and properties, and it has 3D graphics, but it stops there.</p>
<p>Ubiquitous augmented reality will be dramatically different depending on which standards, approaches, and philosophies get the most traction first. Will you walk down the street with your AR glasses and have a pop-up every 30 feet asking you if you want to access the AR content on another server? Will you then have to register, subscribe, or whatever?</p>
<p>Or will all AR content be mediated by one sole master control server deep in the bowels of Google? What about some other option? Will you need different sets of glasses to access different features and content from multiple sources?</p>
<p>At the end of the day, it should not matter what brand of glasses you are wearing, you should never have to deal with AR server popups to join/subscribe, and so forth.</p>
<p>Interoperability, in the context of what I was saying earlier, is the sense of how to build the infrastructure so all of this is seamless to the end user, while still maintaining the features/functionality necessary for all of what augmented reality promises us… I don't want to see everything in AR space, I want to be able to tune in or filter out some things, and I want to customize the snot out of what I see (perhaps changing metaphors or "holoscapes"), and so on. It all has to work together and simplify the end-user experience or it won't get anywhere.</p>
<p><strong>Tish Shute: </strong>So what caused the stagnation of new development and the devolution of MMOGs, in your opinion?</p>
<p><strong>Robert Rice: </strong>Yes, look at all the hope and hype for the MMORPGs released in the last 12 months. Really, what is different or better? Now, what is worse?</p>
<p>I bet any decent mmorpg gamer could give you a list of 2 or 3 things for the first question and 20-30 things for the second.</p>
<p>And, VWs seem to be stuck in a feedback loop.</p>
<p><strong>Tish Shute: </strong>feedback loop?</p>
<p><strong>Robert Rice:</strong> Imagine nailing one of your feet to the ground and then trying to run 'round and 'round and 'round.</p>
<p><strong>Tish Shute:</strong> Why do you think this happened to VWs?</p>
<p><strong>Robert</strong><strong> Rice</strong><strong>: </strong>Men in suits and flashy watches.</p>
<p>Actually, hang on…</p>
<p>I saw a video clip the other day from a conference about using various virtual and game technologies for simulations and other real-world applications. Several people were talking about "avatar technology" and how theirs was better than their competitors' and whatnot.</p>
<p>Now, can you tell me what "avatar technology" is? Avatar technology is a red herring. Avatar technology is the same thing as calling a toaster a new "fire technology."</p>
<p><strong>Robert Rice: </strong>The problem is that a lot of people who don't have a clue about what they are doing are selling the tech to other people who have no clue what they are buying, but who feel like they should for some unknown reason.</p>
<p>That is happening all over the government, academic, and industrial sectors now, with a few companies selling virtual worlds (again, mid-90s tech) as the ultimate solution to all problems.</p>
<p>Anyway, getting back to your question…</p>
<p>Once virtual reality started getting some buzz, some people got greedy, jumped into the avatar/virtual world thing, and tried making it commercial too soon. Half of the 3D chat worlds were being jammed into platforms for virtual shopping malls.</p>
<p>Most of the money funding tech R&amp;D started funneling towards VRML, and doing 3D in web pages, etc.</p>
<p><strong>Tish Shute: </strong>Yes, horrible idea trying to make web pages 3D, IMHO.</p>
<p><strong>Robert Rice: </strong>The money people got involved too soon, and then the greedy people jumped in and tried patenting everything possible. Take a look at the worlds.com patent for 3D worlds.</p>
<p>They filed it back in 2000 or so and it was awarded in '07 (it shouldn't have been, in my opinion). Now they are suing everyone they can.</p>
<p><strong>Tish Shute: </strong>Will there be patent wars in AR?</p>
<p><strong>Robert</strong><strong> Rice</strong><strong>:</strong> Yes, the AR patent wars will be legendary once people start waking up to the real potential here.</p>
<p>The only solution is for everyone to band together and pre-emptively patent, or make public domain, every possible patentable concept, technology, or implementation for AR. Otherwise, you haven't seen anything yet.</p>
<p><strong>Tish Shute:</strong> Is the AR community organized enough to do that yet?</p>
<p><strong>Robert</strong><strong> Rice</strong><strong>:</strong> That depends on how my company fares in the next six months.</p>
<p><strong>Tish Shute:</strong> Will you patent or make your tech public domain?</p>
<p><strong>Robert</strong><strong> Rice</strong><strong>:</strong> I plan on patenting the snot out of everything we can possibly think of, and then giving away our content creation tools and SDK stuff for free. The whole goal of what we are trying to build is to empower the end user and facilitate the creation of a wonderful world of augmented reality.</p>
<p>There are some things we will make public domain for sure, on top of that.</p>
<p><strong>Tish Shute:</strong> So back to my question on networked real-time experience. Will we have networked real-time AR experiences in the next 18 months?</p>
<p><strong>Robert Rice:</strong> It is possible, yes. Other than what we are doing, I am not aware of anyone else taking the same approach we are, but the potential for an "under the radar" venture (much like my own company) is definitely there.</p>
<p><strong>Tish Shute: </strong>Will you use cloud computing?</p>
<p><strong>Robert Rice: </strong>I think that's overrated and probably another attempt at the whole "thin client" model that some companies have been pushing for the last 20 years.</p>
<p>It sounds good on paper, but ultimately takes power and control away from the end user.</p>
<p><strong>Tish Shute:</strong> cloud computing?</p>
<p><strong>Robert Rice: </strong>Yes. You know, we aren't playing around. We are totally building "THE AR" that everyone keeps dreaming about. None of this cute stuff you see on YouTube. Actually, if you want to see the things that have inspired our vision of what we want to build, check out:</p>
<p>* Dream Park by Larry Niven and Steven Barnes<br />
* Rainbows End by Vernor Vinge<br />
* Spook Country by William Gibson<br />
* Halting State by Charles Stross<br />
* The Diamond Age by Neal Stephenson<br />
* Donnerjack by Roger Zelazny and Jane Lindskold<br />
* Otherland by Tad Williams<br />
* Neuromancer by William Gibson<br />
* Idoru by William Gibson<br />
* Cryptonomicon by Neal Stephenson</p>
<p>and watch the whole anime of Denno Coil (subbed NOT dubbed!).</p>
<p><strong>Tish Shute:</strong> So scaling the real time experience won&#8217;t be a problem in your project hehe</p>
<p>Cos no sharding allowed in AR right</p>
<p>And if you have lots of API calls?</p>
<p><strong>Robert</strong><strong> Rice</strong><strong>:</strong> haha, sharding is one of the dumbest things to happen to the VW/MMO industry.</p>
<p>It is a solution to a technical problem that was relevant 15 years ago.</p>
<p><strong>Tish Shute:</strong> So why did it stick? (I know &#8211; men in suits)</p>
<p><strong>Robert</strong><strong> Rice</strong><strong>:</strong> it stuck because &#8220;that&#8217;s what the other guys did&#8221; and the MMO designers are too lazy to reconcile gameplay for PvP and RP gamers.</p>
<p>However, there is a curious problem between dealing with &#8220;one world&#8221; and &#8220;anyone can start their own custom AR server&#8221;.</p>
<p><strong>Tish Shute: </strong>Now that is a very interesting problem the one world and own AR server</p>
<p><strong>Robert</strong><strong> Rice</strong><strong>:</strong> It took me a few weeks of not sleeping to figure that one out. It gets back to the interoperability issue.</p>
<p><strong>Tish Shute:</strong> What did you come up with?</p>
<p><strong>Robert</strong><strong> Rice</strong><strong>:</strong> a solution. That&#8217;s all I can say for now on that.</p>
<p><strong>Tish Shute</strong>: eeextra seeekrit!</p>
<p>Well I will definitely have to bug you on that.</p>
<p>The problem has produced some creativity in OpenSim, with people coming up with hybrids of p2p and one world.</p>
<p><strong>Robert</strong><strong> Rice</strong><strong>:</strong> As far as virtual worlds are concerned, they need to look at the problem from a different perspective. They are trying to make all virtual worlds interoperable instead of creating a new model for interoperable worlds that new ones will be created to adhere to.</p>
<p><strong>Tish Shute: </strong>well some people are. I would say most OpenSim developers see their modular approach doing this. And you choose to interoperate based on what modules you have activated and then social agreements&#8230;</p>
<p><strong>Robert</strong><strong> Rice</strong><strong>:</strong> hrm, that&#8217;s a start, but that only works on a functional and social level &#8211; it doesn&#8217;t account for content (story, mythos, game rules), unique data (my +3 sword), or the concepts of commerce, inherent value, and intellectual property.</p>
<p>Enabling my WoW avatar to run around in SL and vice versa creates more problems than it solves.</p>
<p>It&#8217;s like two alien races working hard to make sure that their two spaceships can dock, but no one is paying any attention to the fact that race A breathes nitrogen and race B breathes sulfur.</p>
<p>It&#8217;s technically possible, but they are missing the boat on the content side of the problem.</p>
<p><strong>Tish Shute:</strong> Yes, but don&#8217;t you think that when a modular open source tech for virtual worlds becomes pervasive, those interested in a similar genre will increasingly use the modules in ways that allow their content to interoperate if they want it to?</p>
<p><strong>Robert</strong><strong> Rice</strong><strong>: </strong>everyone has to use the same backend tech, and the front end clients need to adhere to the same standards. But I have to admit, I haven&#8217;t been paying much attention to the VW space in the last 9 months or so.</p>
<p>Oh, I have to run now. But download and install <a id="vsnt" title="cooliris" href="http://www.cooliris.com/" target="_blank">cooliris</a>. I promise you will be blown away and will start using it to search for images and videos.</p>
<p>It&#8217;s frigging awesome.</p>
<p><strong>Tish Shute:</strong> Will do! Thanks so much, great talking to you. I can&#8217;t wait for your launch.</p>
]]></content:encoded>
			<wfw:commentRss>http://www.ugotrade.com/2009/01/17/is-it-%e2%80%9comg-finally%e2%80%9d-for-augmented-reality-interview-with-robert-rice/feed/</wfw:commentRss>
		<slash:comments>27</slash:comments>
		</item>
		<item>
		<title>Hacking the World in 2009: Google Street View, &#8220;Smart Stuff,&#8221; and Wikiculture.</title>
		<link>http://www.ugotrade.com/2008/12/29/hacking-the-world-in-2009-google-street-view-smart-stuff-and-wikiculture/</link>
		<comments>http://www.ugotrade.com/2008/12/29/hacking-the-world-in-2009-google-street-view-smart-stuff-and-wikiculture/#comments</comments>
		<pubDate>Mon, 29 Dec 2008 19:20:11 +0000</pubDate>
		<dc:creator><![CDATA[Tish Shute]]></dc:creator>
				<category><![CDATA[3D internet]]></category>
		<category><![CDATA[Ambient Devices]]></category>
		<category><![CDATA[Ambient Displays]]></category>
		<category><![CDATA[architecture of participation]]></category>
		<category><![CDATA[Architecture Working Group]]></category>
		<category><![CDATA[Carbon Footprint Reduction]]></category>
		<category><![CDATA[culture of participation]]></category>
		<category><![CDATA[CurrentCost]]></category>
		<category><![CDATA[Ecological Intelligence]]></category>
		<category><![CDATA[Energy Awareness]]></category>
		<category><![CDATA[Energy Saving]]></category>
		<category><![CDATA[home automation]]></category>
		<category><![CDATA[home energy monitoring]]></category>
		<category><![CDATA[home energy monitors]]></category>
		<category><![CDATA[HomeCamp]]></category>
		<category><![CDATA[Instrumenting the World]]></category>
		<category><![CDATA[interoperability of virtual worlds]]></category>
		<category><![CDATA[Linden Lab]]></category>
		<category><![CDATA[message brokers and sensors]]></category>
		<category><![CDATA[Metaverse]]></category>
		<category><![CDATA[mirror worlds]]></category>
		<category><![CDATA[Mixed Reality]]></category>
		<category><![CDATA[MQTT and RSMB]]></category>
		<category><![CDATA[Open Grid]]></category>
		<category><![CDATA[open metaverse]]></category>
		<category><![CDATA[open protocols for virtual worlds]]></category>
		<category><![CDATA[open source]]></category>
		<category><![CDATA[Open Source Virtual Worlds]]></category>
		<category><![CDATA[open standards for virtual worlds]]></category>
		<category><![CDATA[OpenSim]]></category>
		<category><![CDATA[Participatory Culture]]></category>
		<category><![CDATA[smart appliances]]></category>
		<category><![CDATA[Smart Devices]]></category>
		<category><![CDATA[Smart Planet]]></category>
		<category><![CDATA[social media]]></category>
		<category><![CDATA[sustainable living]]></category>
		<category><![CDATA[sustainable mobility]]></category>
		<category><![CDATA[virtual communities]]></category>
		<category><![CDATA[Virtual HomeCamp]]></category>
		<category><![CDATA[Virtual Meters]]></category>
		<category><![CDATA[virtual world standards]]></category>
		<category><![CDATA[Virtual Worlds]]></category>
		<category><![CDATA[Web 2.0]]></category>
		<category><![CDATA[Web Meets World]]></category>
		<category><![CDATA[World 2.0]]></category>

		<guid isPermaLink="false">http://www.ugotrade.com/?p=2463</guid>
		<description><![CDATA[Google Street View Hacking This Google Street View Hack (via @timoreilly) will get my nomination for a Hacking the World Award this year, if there is such an award. A parade (the screenshot opening this post), a marathon, a mad-scientist's laboratory, a sword fight, and more (see The Infonaut Blog) were staged all along the route [&#8230;]]]></description>
				<content:encoded><![CDATA[<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2008/12/sampsoniawaypost.jpg"><img class="alignnone size-full wp-image-2475" title="sampsoniawaypost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2008/12/sampsoniawaypost.jpg" alt="" width="450" height="274" /></a></p>
<h3>Google Street View Hacking</h3>
<p><a href="http://www.wikio.com/video/576734" target="_blank">This Google Street View Hack</a> (via<a href="http://twitter.com/timoreilly" target="_blank"> @timoreilly</a>) will get my nomination for a Hacking the World Award this year, if there is such an award.</p>
<p><a href="http://maps.google.com/maps?cbp=1,262.96388206761037,,0,16.58444579096093&amp;cbll=40.456878,-80.01196&amp;layer=c&amp;ie=UTF8&amp;ll=40.458499,-80.009319&amp;spn=0.00569,0.012918&amp;z=17&amp;panoid=zHdES6mj-vBrH2nF-K9ROQ" target="_blank">A parade</a> (the screenshot opening this post), <a href="http://maps.google.com/maps?cbp=1,260.87215088682916,,0,8.64102186979147&amp;cbll=40.457046,-80.011085&amp;layer=c&amp;ie=UTF8&amp;ll=40.458671,-80.00845&amp;spn=0.00569,0.012918&amp;z=17&amp;panoid=81ALq0NpV6uyLEF5S5ENhw" target="_blank">a marathon</a>, <a href="http://maps.google.com/maps?cbp=1,160.10914016686365,,0,33.949139944215034&amp;cbll=40.456949,-80.011593&amp;layer=c&amp;ie=UTF8&amp;ll=40.458573,-80.008954&amp;spn=0.00569,0.012918&amp;z=17&amp;panoid=C4I-QLkZJoT1SHXslK5f7Q" target="_blank">a mad-scientist&#8217;s laboratory</a>, <a href="http://maps.google.com/maps?cbp=1,9.995045624107206,,0,10.698194796922357&amp;cbll=40.457636,-80.00767&amp;layer=c&amp;ie=UTF8&amp;ll=40.459103,-80.006486&amp;spn=0.00569,0.012918&amp;z=17&amp;panoid=W_ox0QPcWyPqWGNPiK91Nw" target="_blank">a sword fight</a>, and more (see <a href="http://www.infonaut.ca/blog/?p=290" target="_blank">The Infonaut Blog</a>) were staged all along the route of the Google Street View truck by artists Robin Hewlett and Ben Kinsley, working in conjunction with the local community and Google Street View.</p>
<p>The Google Street View Hack suggests a myriad of possibilities for anyone with their eye on the prize for a great world hack for 2009. In my mind&#8217;s eye, I imagine the Google Street View truck&#8217;s trek across the planet triggering local environmental street action carnivals wherever it goes.</p>
<p>Local energy conservationists,<a href="http://www.nytimes.com/2008/12/27/world/europe/27house.html?_r=1&amp;pagewanted=all" target="_blank"> &#8220;passive house&#8221; architects</a>, and retrofitters could turn the arrival of Google Street View into an occasion to create projects for a sustainable future &#8211; a traveling StreetCamp (see <a href="http://www.ugotrade.com/2008/12/15/smart-planetinterview-with-andy-stanford-clark/" target="_blank">my post on HomeCamp &#8217;08 here</a>). As Google Street View intends, surely, to go everywhere, this would be a global hack for sustainable living that crossed the bounds of the physical and the virtual. And the vast public record of Google Street View would become a generative engine and global resource for sustainable living.</p>
<h3>Working together on the noble aim of sustainable living</h3>
<p>- this is my (and many other people&#8217;s) big theme for 2009.</p>
<p>A Hacking the World award should also go to <a href="http://www.pachube.com/">Pachube</a> &#8211; &#8220;patching the planet&#8221; &#8211; for demonstrating that instrumenting the world is not merely a Sci Fi fantasy anymore. By facilitating &#8220;interaction between remote environments, both physical and virtual,&#8221; Pachube demonstrates (see <a href="http://community.pachube.com/?q=node/1" target="_blank">diagram here</a>) how we have only just begun to dip our toes into the many new opportunities we have to work together to save energy, rethink our culture of consumption, and reboot our failing economy under a new sustainable operating system.</p>
<p>Energy awareness, unlike the glut of information we have in entertainment and games, suffers from a dearth of information. We really have very little idea about what we are consuming and the waste we are producing. So more Hacking the World Awards should go to projects like <a href="http://www.amee.com/" target="_blank">AMEE</a> &#8211; creating the world&#8217;s energy meter &#8211; and <a href="http://www.wattzon.com/" target="_blank">Wattzon</a> &#8211; your personal energy meter &#8211; for giving us new ways to understand and work with energy data.</p>
<p>Many people and organizations, given the information, will change their behaviours. But the cultural changes necessary for sustainable living are deep and old habits die hard (see <a href="http://www.nytimes.com/2008/12/27/opinion/27sat1.html" target="_blank">this disturbing report</a> on the recent return to SUV buying in November as soon as gas prices fell!).</p>
<h3>A Small Community of Volunteers Can Bring Change on a Global Scale</h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2008/12/homecampthethrongpost.jpg"><img class="alignnone size-full wp-image-2535" title="homecampthethrongpost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2008/12/homecampthethrongpost.jpg" alt="" width="450" height="153" /></a></p>
<p>Picture above by <a href="http://benjaminellis.co.uk/" target="_blank">Benjamin Ellis</a>, &#8220;HomeCamp &#8211; The Throng,&#8221; from his <a href="http://www.flickr.com/photos/tags/homecamp08/" target="_blank">Flickr</a><a href="http://www.flickr.com/search/?q=homecamp&amp;w=29034542%40N00" target="_blank"> stream.</a></p>
<p>One of my favorite &#8220;instrumenting the world&#8221; projects to date and another top contender for a Hacking the World Award is <span class="entry-content"><a id="h4a0" title="HomeCamp '08" href="http://homecamp.pbwiki.com/homecamp08" target="_blank">HomeCamp &#8217;08</a></span> (see my <a href="http://www.ugotrade.com/2008/12/15/smart-planetinterview-with-andy-stanford-clark/" target="_blank">previous post</a>). HomeCamp brings together a community of creators and enthusiasts of &#8220;smart stuff,&#8221; creating <a href="http://meta.wikimedia.org/wiki/Wikiculture" target="_blank">a wikiculture</a> for the noble cause of sustainable living.</p>
<p>The key to whether &#8220;instrumenting the world&#8221; empowers people and changes our lives for the better will be the capacity our systems of instrumentation have for what Jonathan Zittrain, in <em>&#8220;<a href="http://futureoftheinternet.org/" target="_blank">The Future of the Internet: And How To Stop It,&#8221;</a></em> defines as generativity, i.e. &#8220;the system&#8217;s capacity to produce unanticipated change through unfiltered contributions from broad and varied audiences&#8221; (Zittrain, 2008).</p>
<p>Generativity is the &#8220;secret sauce&#8221; that makes the difference between, for example, <a href="http://www.wikipedia.org/" target="_blank">Wikipedia</a> and its all but forgotten predecessor &#8211; the &#8220;written by experts&#8221; <a href="http://en.wikipedia.org/wiki/Nupedia" target="_blank">Nupedia</a>.</p>
<p>Jonathan Zittrain writes:</p>
<p><em><strong>Wikipedia stands for more than the ability of people to craft their own knowledge and culture. It stands for the idea that people of diverse backgrounds can work together on a common project with, whatever its other weaknesses, a noble aim &#8211; bringing such knowledge to the world. (p.147)</strong></em></p>
<p>At <a href="http://en.oreilly.com/web2008/public/content/home" target="_blank">Web 2.0 Summit</a>, Jonathan Hochman (<em><strong><a href="http://en.oreilly.com/web2008/public/schedule/detail/6952" target="_blank">Known as </a><a href="http://en.wikipedia.org/wiki/User:Jehochman">Jehochman</a> on Wikipedia</strong></em>), shared with me his insider perspective as a Wikipedia administrator. The <a href="http://www.ugotrade.com/2008/12/26/wikipedia-houdini-google-street-view-instrumenting-sustainable-living#link_1">full interview</a> with Jonathan is later in this post.</p>
<p>Jonathan comments on the role of wikiculture in sustainable living:</p>
<p><em><strong>&#8220;Sustainable Living requires everything to become more efficient. Incentives need to line up with conservation priorities. This requires a radical change to the way we govern ourselves. Command economies, whether commanded by politicians or capital, lead to huge inefficiencies.&#8221;</strong></em></p>
<p>And surely, if we have learned anything in 2008, we have learned that very bad things happen when the complex systems of modern life are left in the hands of a few people motivated solely by the urge to make profit.</p>
<h3>Hacking Design and Planning Processes for Real Estate and Transportation with Virtual Worlds</h3>
<p><object width="400" height="302" data="http://vimeo.com/moogaloop.swf?clip_id=2326434&amp;server=vimeo.com&amp;show_title=1&amp;show_byline=1&amp;show_portrait=0&amp;color=&amp;fullscreen=1" type="application/x-shockwave-flash"><param name="allowfullscreen" value="true" /><param name="allowscriptaccess" value="always" /><param name="src" value="http://vimeo.com/moogaloop.swf?clip_id=2326434&amp;server=vimeo.com&amp;show_title=1&amp;show_byline=1&amp;show_portrait=0&amp;color=&amp;fullscreen=1" /></object></p>
<p>This great machinima by Azwaldo Vilotta shows the progress so far on the <a href="http://studiowikitecture.wordpress.com/2008/12/12/now-is-an-ideal-time-to-join-wikitecture-40/" target="_blank">Wikitecture 4.0 project</a>, &#8216;Re-Inventing the Virtual Classroom&#8217; for the University of Alabama.</p>
<p>Though still a niche market, virtual worlds are growing at a steady pace. As I mentioned in my previous post, energy hungry avatars themselves will be a target for optimization in 2009. But as my personal power usage breakdown from <a href="http://www.wattzon.com/" target="_blank">Wattzon</a> shows, cutting down the amount of flying I do in 2009 would be far more effective in reducing my carbon footprint than deciding not to log into virtual worlds!</p>
<p>Note: Read Write Web&#8217;s recent post, &#8220;<a href="http://www.readwriteweb.com/archives/enterprise_virtual_worlds.php" target="_blank">Report: Enterprise Virtual Worlds More Effective Than Web Conferencing</a>,&#8221; is worth a read. Also check out <a href="http://www.projectchainsaw.com/" target="_blank">Web.Alive</a> and <a href="http://immersivespaces.com/" target="_blank">Immersive WorkSpaces</a>, and Dusan Writer&#8217;s post on &#8220;<a href="http://dusanwriter.com/index.php/2008/12/20/thinkbalm-the-immersive-internet-and-collaborative-culture/" target="_blank">ThinkBalm, The Immersive Internet and Collaborative Culture</a>.&#8221;</p>
<p>My friend Melanie Swan points out in her Top Ten Computing Trends for 2009 that virtual worlds not only have the power of the 3 Cs (communication, collaboration and commerce) but are fast expanding into <a href="http://www.3pointd.com/20070406/rapid-architectural-prototyping-in-second-life/">rapid prototyping</a>, <a href="http://your2ndplace.com/node/926">simulation</a> and <a href="http://sldataviz.pbwiki.com/">data visualization</a>.</p>
<p>My Hacking the World 2008 Awards for Virtual World innovation would go to three potentially world-changing projects for sustainable living:</p>
<p>1) <a href="http://studiowikitecture.wordpress.com/" target="_blank">Studio Wikitecture</a>, (see <a href="http://studiowikitecture.wordpress.com/" target="_blank">&#8220;Reinventing the Virtual Classroom&#8221;</a> for The University of Alabama).</p>
<p>2) Oliver Goh&#8217;s work on &#8220;<a href="http://www.shaspa.com/cms/website.php" target="_blank">The Path to Sustainable Real Estate.&#8221;</a></p>
<p>3) Encitra, a company recently co-founded by <a href="http://www.ics.uci.edu/informatics/research/research_highlight_view.php?id=52" target="_blank">Crista Lopes</a> and <a href="http://www.podcar.org/uppsalaconference/christerlindstrom.htm" target="_blank">Christer Lindstrom</a>, focused on improving urban planning processes, starting with transportation, using virtual worlds (<a href="http://www.ugotrade.com/2008/11/25/web-meets-world-participatory-culture-and-sustainable-living/" target="_blank">see my previous post here for more</a>).</p>
<p>The latter two projects are being developed in <a href="http://opensimulator.org/wiki/Main_Page" target="_blank">OpenSim</a> &#8211; the open source project that should also get a Hacking The World Award for creating an open modular architecture for virtual worlds that is unleashing all these new possibilities for integrating physical and virtual worlds.</p>
<p>The 2008 code contributions to OpenSim of special note re world hacking are Crista Lopes&#8217;<a href="http://opensimulator.org/wiki/Hypergrid"> OpenSim Hypergrid</a> &#8211; see Justin CC&#8217;s blog for full details in <a href="http://justincc.wordpress.com/2008/12/19/what-is-the-hypergrid/" target="_blank">&#8220;What is the hypergrid?&#8221;</a> &#8211; and David Levine&#8217;s work (IBM), in collaboration with Linden Lab (see <a href="http://wiki.secondlife.com/wiki/Architecture_Working_Group" target="_blank">Architecture Working Group</a>), on interoperability (see <a href="http://www.ugotrade.com/2008/07/" target="_blank">my earlier post here</a>).</p>
<p>Both these projects expand the frontiers of interoperability for virtual worlds, although they &#8220;slice the problem from different ends,&#8221; as David Levine put it. The emphasis in the LL/IBM approach is on security, so assets are not moving yet. In Crista&#8217;s solution you can have assets, but the security issues are not addressed yet. This work is vital to expanding the usefulness of virtual worlds, and both projects should get Hacking the World Awards IMHO.</p>
<p>I asked <a href="http://archsl.wordpress.com/" target="_blank">Jon Brouchoud </a>(full interview upcoming) what he thought were Studio Wikitecture&#8217;s most important successes to date:</p>
<p><strong><em>&#8220;I think the greatest success has been in proving, on some level, that everyone has important knowledge that can inform and improve the design of a building, not just architects.Â  If we can continue building on that success, I hope we can eventually start to hack the traditional design process, and find ways to harness the wealth of knowledge held by the general public, instead of ignoring or avoiding it, as is most often the case.&#8221;</em></strong></p>
<h3>Harnessing the &#8220;Smart Stuff&#8221; to the Noble Cause of Sustainable Living</h3>
<p>Robert Scoble&#8217;s <a href="http://scobleizer.com/2008/12/27/the-interview-of-the-year-tim-oreilly/" target="_blank">The Interview of the Year: Tim O&#8217;Reilly</a> is not to be missed. Tim O&#8217;Reilly discusses the key trends for 2009 that are bubbling up at O&#8217;Reilly Media. And, yes, Tim O&#8217;Reilly, as the guru of Hacking the World, gets the &#8220;Distinguished Thinker &#8211; Hacking The World Award of 2008!&#8221;</p>
<p>Tim O&#8217;Reilly&#8217;s trend list includes:</p>
<p>1) Big data &#8211; vast peer-produced databases in the cloud, accessible by mobile devices</p>
<p>2) &#8220;Smart stuff&#8221; &#8211; sensors, robotics, and hacking on stuff for fun and not for profit</p>
<p>3) Green Tech</p>
<p>4) Advances in Biological/Life Sciences.</p>
<p>And, in Robert Scoble&#8217;s interview, there is a nice titbit of history re his attendance of early <a href="http://en.wikipedia.org/wiki/Foo_Camp" target="_blank">Foo Camps</a>. Foo Camp is the wiki of O&#8217;Reilly conferences and a lineage holder to my favorite Hacking the World event of 2008, <span class="entry-content"><a id="h4a0" title="HomeCamp '08" href="http://homecamp.pbwiki.com/homecamp08" target="_blank">HomeCamp &#8217;08</a></span>.</p>
<p>But what will be the &#8220;secret sauce&#8221; for these big ideas &#8211; the generative engines that harness these vast peer-produced databases, and all the creative &#8220;smart stuff&#8221; hackers across the globe are creating, to the noble cause of sustainable living? What will motivate the mass adoption of Green Tech and sustainable living?</p>
<p>What can Wikipedia teach us about how generative systems and bottom up approaches can change the world?</p>
<p>Jimmy Wales (interview coming soon!) writes in his recent <a href="http://wikimediafoundation.org/wiki/Donate/Letter/en?utm_source=2008_jimmy_letter_r&amp;utm_medium=sitenotice&amp;utm_campaign=fundraiser2008#appeal" target="_blank">personal appeal</a> for support for Wikipedia:</p>
<p><em><strong>At its core, Wikipedia is driven by a global community of more than 150,000 volunteers &#8211; all dedicated to sharing knowledge freely. Over almost eight years, these volunteers have contributed more than 11 million articles in 265 languages. More than 275 million people come to our website every month to access information, free of charge and free of advertising.</strong></em></p>
<p>To answer questions on how to create a successful wikiculture for sustainable living, an insider&#8217;s view of Wikipedia may be a good place to start.</p>
<h3>Interview With Jonathan Hochman on Wikipedia.</h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2008/12/gammapostjon.jpg"><img class="alignnone size-full wp-image-2477" title="gammapostjon" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2008/12/gammapostjon.jpg" alt="" width="223" height="158" /></a><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2008/12/jonathanwikikpost.jpg"><img class="alignnone size-full wp-image-2473" title="jonathanwikikpost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2008/12/jonathanwikikpost.jpg" alt="" width="224" height="158" /></a></p>
<p>The picture on the left is from the Wikipedia article, <a href="http://en.wikipedia.org/wiki/Gamma-ray_burst" target="_blank">Gamma-ray Burst</a>, that Jonathan Hochman is currently working on. It is a drawing of a massive <a title="Star" href="http://en.wikipedia.org/wiki/Star">star</a> collapsing to form a <a title="Black hole" href="http://en.wikipedia.org/wiki/Black_hole">black hole</a>. Energy released as jets along the axis of rotation forms a gamma-ray burst. <em>Credit: Nicolle Rager Fuller/NSF</em></p>
<p>The picture on the right, Jonathan at Web 2.0 Summit, was taken by me. Jonathan was part of the <em><a href="http://en.oreilly.com/web2008/public/schedule/detail/6952" target="_blank">Defending Web 2.0 from Virtual Blight</a></em> panel.</p>
<p><em><strong><a href="http://en.oreilly.com/web2008/public/schedule/detail/6952" target="_blank">Known as </a><a href="http://en.wikipedia.org/wiki/User:Jehochman">Jehochman</a> on Wikipedia, he serves as an administrator and as a leader in addressing online harassment, disruption and sock puppetry. He is also the founder of <a href="http://www.hochmanconsultants.com/">Hochman Consultants</a>, an Internet marketing consultancy, and the director of <a href="http://www.semne.org/">Search Engine Marketing New England</a>, a regional conference series.</strong></em></p>
<p><strong>Tish:</strong> Second Life and Wikipedia are the two great experiments in collaborative co-creation &#8211; what do they have to teach us about the future of the internet?</p>
<p><strong>Jonathan:</strong> Yes, Wikipedia and Second Life are key social spaces. Some people have been seeing Second Life as the beginning of Web 3.0 &#8211; a wrap around environment where you can almost experience another life. Wikipedia is sort of another example of this.</p>
<p>All the problems that exist in the real world are mirrored right into that little universe. For example, the Armenians and the Turks are at each other&#8217;s throats, the Japanese and the Koreans are going at it, the Palestinians and the Israelis, and the &#8220;Troubles&#8221; &#8230; all the conflicts are imported into Wikipedia. People are fighting over the content of these articles. They want to have it their way because these are first ranked in Google and they have a big impact on public opinion.</p>
<p>There was a huge fight on the waterboarding article a while back. Some guys from Little Green Footballs &#8211; a very conservative, reactionary type of media &#8211; were trying to change the article to say that waterboarding might not be torture &#8211; change it to say it is probably not so bad. Crazy stuff. They were trying to water it down. And it is very clear, from every source out there, that waterboarding is torture. We did a study, and there are 115 sources that say waterboarding is torture. You simulate drowning &#8211; you simulate killing someone &#8211; that is a violation of the Geneva Convention and everything else. People were fighting, fighting, fighting!</p>
<p>One of the things I did was to try and clear people out who were being disruptive. We actually had to go to arbitration over that article. It is like the supreme court of Wikipedia. There is a panel of 15 arbitrators. They hear the case. There is evidence, arguments and decisions. It is really like a simulated lawsuit. You get all the experience of a simulated lawsuit with the real threat that you could be banned. If they don&#8217;t like what you are doing they can actually ban you or restrict you from topics.</p>
<p>So it is really fascinating how this social space, Wikipedia, becomes a very real platform &#8211; though it is in a virtual world &#8211; for real world disputes. Most disputes are over the definition of things. If you have a lawsuit, most disputes are about how things are defined. And Wikipedia has become the de facto definition of things in the real world. People want to know what &#8220;The Troubles&#8221; are. If you go to Wikipedia you find out The Troubles are a dispute over Northern Ireland. What the article says has a profound impact on public opinion.</p>
<p><strong>Tish:</strong> So who is on the court of Wikipedia?</p>
<p><strong>Jonathan:</strong> They are volunteers. These people work two or three hours a day to run this court. There are all kinds of projects. There is a WikiProject Spam, which has people who can write computer programs to statistically analyze wiki projects &#8211; not only Wikipedia. All of them are looking at the links, reporting them, and banning those people who are abusing or gaming the system.</p>
<p><strong>Tish:</strong> You were on the Stopping Virtual Blight Panel at Web 2.0 Summit &#8211; what are the most important things to think about on this topic?</p>
<p><strong>Jonathan:</strong> Yes, we were talking about how to defend the web against virtual blight. The thing I find interesting about Wikipedia is that it is the eighth largest web site, and possibly the second largest web site comprised of user generated content after YouTube. The problems that exist in Wikipedia are larger and more detailed than on any other site. Whatever problem someone has on their social media site or their Web 2.0 site, these problems already exist in Wikipedia, and the solutions are there and they are transparent. You can actually see the history of what&#8217;s been done.</p>
<p>If there is, for example, a problem on Digg &#8211; some problem with sock puppetry or vote stacking &#8211; it happens, it goes away. You don&#8217;t get full disclosure. With Wikipedia you can actually go in and look at a dispute and watch it unfold. You can watch the arbitration cases that are filed, the arguments, the decisions, the logic, the rationale. You can see the successes and the failures and the different things people have tried to control blight. For example, we tried to resolve this dispute one way but it was a disaster, so we tried something else and that worked.</p>
<p>Wikipedia is a large laboratory for social media &#8211; Wikipedia and the large universe of wiki and Wikimedia projects around it that individuals and enterprises put together, like Commons. Wikimedia Commons is a repository of publicly licensed images that anyone can take and reuse. They have sound and they have video, and all of this stuff is being stitched together now.</p>
<p>So if you go to the article on Obama you can probably now hear his acceptance speech, because that is public domain &#8211; it&#8217;s been stitched into the article. If you go to the article on Richard Nixon &#8211; his resignation speech &#8211; you may even hear his conversation with the astronauts when they landed on the moon. So this becomes a giant repository of all our culture and knowledge. When I design a website, a lot of times I go to Commons to find images I can use for free. I don&#8217;t want to pay for an image I can get for free.</p>
<p><strong>Tish: </strong>And the Commons images get contextualized in Wikipedia too.</p>
<p><strong>Jonathan:</strong> Some of these articles are fascinatingly detailed. If you want a quick summary of Dr. Strangelove, the article is fantastic. It is enjoyable, a pleasure to read. I was reading about S.A. Andree&#8217;s North Pole balloon expedition of 1897. Some guys from Sweden decided to fly a balloon over the North Pole. They managed to get aloft, flew over the icepack for 24 hours, and then crashed.</p>
<p>They unloaded their stuff and hiked back across the ice toward the island they had launched from. They ended up being on the ice pack for three months before they finally holed up in an ice cave and starved to death. They weren&#8217;t found until thirty years later! The men had a camera with them, and the film &#8211; frozen for thirty years &#8211; survived. It was developed, and those pictures are now on Wikipedia. It is just a fascinating thing!</p>
<p><strong>Tish: </strong>Do you see real time collaboration beginning to play more of a role in Wikipedia &#8211; whether virtual worlds or just voice/IM? How could real time collaboration change the Wikipedia editing process?</p>
<p><strong>Jonathan:</strong> The Presidential candidate articles were being edited very rapidly yesterday. There are certain real time problems. Some of the more interesting problems arise when two administrators &#8220;get into it.&#8221; One administrator says, &#8220;I am blocking this user,&#8221; the other says, &#8220;I am unblocking him,&#8221; and the first says, &#8220;No, I am blocking him!&#8221; And so on&#8230; Everyone says, &#8220;Stop fighting. You are not allowed to do that!&#8221; And they both get their powers stripped. People do get very heated over the silliest things. Wikipedia does have some mailing lists attached and there are some IRC channels, so there are some real time elements.</p>
<p><strong>Tish: </strong>What is the role of avatars in Wikipedia?</p>
<p><strong>Jonathan:</strong> In Wikipedia you have a user page, and many users are anonymous. They create an avatar, personalize it, and show themselves in the ways they want to be seen. In many ways it is a lot like Second Life.</p>
<p>Some users have created second accounts &#8211; or a humorous second account. Bishzilla is the creation of a Swedish lady who is in tremendous command of the English language and has a razor-sharp wit. She has created this secondary account that writes almost in a baby language. Her avatar is a dinosaur, not very bright, that goes around frying people. It is bizarre what people do! People may be editing a topic of personal interest &#8211; e.g. Pokemon &#8211; that they don&#8217;t want associated with their professional avatar. Or people may be editing topics about hot political issues. There have actually been death threats issued to people over material they have been putting into the encyclopedia.</p>
<p><strong>Tish: </strong>So avatars are important in Wikipedia.</p>
<p><strong>Jonathan:</strong> Absolutely, because people may be editing articles that they don&#8217;t want their friends and family to know they are editing. One editor may say to another, &#8220;Stop putting stuff in or I will come and kill you!&#8221; Well, then we have to ban them. We have to call the police.</p>
<p><strong>Tish:</strong> Can you build reputations on multiple avatars?</p>
<p><strong>Jonathan: </strong>You are allowed to use multiple avatars as long as they don&#8217;t cross paths. You can&#8217;t have two avatars editing in the same area, because you would be giving yourself double weight commenting on a discussion.</p>
<p><strong>Tish:</strong> How do you know when this is happening?</p>
<p><strong>Jonathan:</strong> You can watch the style of a user&#8217;s editing. You have to watch behavior. If you have enough behavioral evidence to suggest that accounts are controlled by one person, you can go and request a technical check. There are some users, called Checkusers, who are able to access information from the server logs and check the technical characteristics of those accounts to see if they are using the same IP address.</p>
<p><strong>Tish:</strong> So if you want to understand avatar interaction on the web it helps to understand Wikipedia.</p>
<p><strong>Jonathan:</strong> Yes, it is a fantastic way to understand how avatars work in some respects, and also how to deal with community dynamics. We have some very strong-willed people &#8211; people in their 40s, 50s, and 60s &#8211; who are very successful in business. They have plenty of money and spare time and they are doing this as a hobby. Some of these people can really butt heads. You can have a problem when an editor has been writing fantastic articles but also happens to be rude, chews other people out, and tells them to f**k off if they are not behaving. What do you do?</p>
<p><strong>Tish:</strong> Sounds a bit like Second Life!</p>
<p><strong>Jonathan:</strong> The person is a great contributor to the community, but they are telling newbies to f**k off, so you can&#8217;t allow that. What do you do? Vested contributors are a major problem for some of these sites. They are invested in the community but they start misbehaving. You can&#8217;t block them, because if you block them there is a huge uproar from all their friends and it causes a cataclysm. It requires very careful diplomacy to deal with some of these situations.</p>
<p><strong>Tish:</strong> How many Wikipedia volunteers are there now?</p>
<p><strong>Jonathan:</strong> Think of a Venn diagram &#8211; a big circle. In total, about one million different people have contributed. But there are probably about 5,000 active editors who are consistently and regularly contributing. Within that kernel there are fifteen hundred people who have administrator access, and probably only eight hundred of them are active. People have a natural life span with the community. People come and typically stay for six months to three years. Usually after that they become bored, disillusioned, or get into a conflict with someone. There is a natural tendency for people to stay for a while and move on. Some people stay longer, a few, but the majority will move on at some point. So there are a lot of fresh faces moving in.</p>
<p><strong>Tish:</strong> What lessons of trust does Wikipedia have to teach us about new projects like AMEE, which aims to aggregate the world&#8217;s energy data?</p>
<p><strong>Jonathan:</strong> Well, you have to know who is releasing the data. Who is creating the data? The beauty of Wikipedia is that you have an edit history, so you can see exactly who has done what, and you can judge whether that person is trustworthy or not. That&#8217;s a huge problem on the web today. We don&#8217;t have enough identification information. When you see a web page you don&#8217;t necessarily know when that page was created, by whom, or how many revisions it has had. Sometimes you can glean information by checking it. If you see typos and errors, you may decide that the page probably didn&#8217;t receive as much attention as it should have, and probably is not that good.</p>
<p>Typos are an interesting thing. People always try to figure out how Google ranks web pages. <a id="uy3s" title="Matt Cutts" href="http://www.mattcutts.com/">Matt Cutts</a> was here from Google today, and he was talking about spam. But Matt also did a <a id="e4lo" title="blog post" href="http://www.mattcutts.com/blog/2006-pubcon-in-vegas-getting-there-and-back/">blog post</a> about how he was in an airport once. He has a policy: when you are reading a document, as soon as you come to the first error, just stop &#8211; if the author hasn&#8217;t taken the care to make everything correct, you don&#8217;t need to read it. So he was in the airport, there was a sign, he came to a typo and stopped reading it. Somehow he got in trouble for not reading the sign and not having the information. But it is interesting to wonder whether Google is looking for typos, misspellings, and broken links, and using that as a signal of quality to rank pages.</p>
<p><strong>Tish:</strong> Aaaagh, typos might bring down your page rank!!! That certainly is a scary thought for a blogger like me who likes to write impossibly long posts that are hard to check&#8230;</p>
]]></content:encoded>
			<wfw:commentRss>http://www.ugotrade.com/2008/12/29/hacking-the-world-in-2009-google-street-view-smart-stuff-and-wikiculture/feed/</wfw:commentRss>
		<slash:comments>7</slash:comments>
		</item>
	</channel>
</rss>
