<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>UgoTrade &#187; message brokers and sensors</title>
	<atom:link href="http://www.ugotrade.com/category/smart-planet/message-brokers-and-sensors/feed/" rel="self" type="application/rss+xml" />
	<link>http://www.ugotrade.com</link>
	<description>Augmented Realities at the Edge of the Network</description>
	<lastBuildDate>Wed, 25 May 2016 15:59:56 +0000</lastBuildDate>
	<language>en-US</language>
		<sy:updatePeriod>hourly</sy:updatePeriod>
		<sy:updateFrequency>1</sy:updateFrequency>
	<generator>https://wordpress.org/?v=3.9.40</generator>
	<item>
		<title>Augmented Twitter at Jeff Pulver&#8217;s #140conf</title>
		<link>http://www.ugotrade.com/2010/04/23/augmented-twitter-at-jeff-pulvers-140conf/</link>
		<comments>http://www.ugotrade.com/2010/04/23/augmented-twitter-at-jeff-pulvers-140conf/#comments</comments>
		<pubDate>Fri, 23 Apr 2010 14:25:03 +0000</pubDate>
		<dc:creator><![CDATA[Tish Shute]]></dc:creator>
				<category><![CDATA[Ambient Devices]]></category>
		<category><![CDATA[Ambient Displays]]></category>
		<category><![CDATA[Android]]></category>
		<category><![CDATA[architecture of participation]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Augmented Reality]]></category>
		<category><![CDATA[digital public space]]></category>
		<category><![CDATA[Instrumenting the World]]></category>
		<category><![CDATA[internet of things]]></category>
		<category><![CDATA[iphone]]></category>
		<category><![CDATA[message brokers and sensors]]></category>
		<category><![CDATA[mirror worlds]]></category>
		<category><![CDATA[Mixed Reality]]></category>
		<category><![CDATA[mobile augmented reality]]></category>
		<category><![CDATA[mobile meets social]]></category>
		<category><![CDATA[Mobile Reality]]></category>
		<category><![CDATA[Mobile Technology]]></category>
		<category><![CDATA[nanotechnology]]></category>
		<category><![CDATA[new urbanism]]></category>
		<category><![CDATA[online privacy]]></category>
		<category><![CDATA[Paticipatory Culture]]></category>
		<category><![CDATA[privacy and online identity]]></category>
		<category><![CDATA[social gaming]]></category>
		<category><![CDATA[social media]]></category>
		<category><![CDATA[ubiquitous computing]]></category>
		<category><![CDATA[Web 2.0]]></category>
		<category><![CDATA[Web Meets World]]></category>
		<category><![CDATA[websquared]]></category>
		<category><![CDATA[World 2.0]]></category>
		<category><![CDATA[#140conf]]></category>
		<category><![CDATA[#ashtag. TEDxVolcano]]></category>
		<category><![CDATA[3D mailbox]]></category>
		<category><![CDATA[Alon Nir]]></category>
		<category><![CDATA[Anselm Hook]]></category>
		<category><![CDATA[AR Wave]]></category>
		<category><![CDATA[are2010]]></category>
		<category><![CDATA[ARWave]]></category>
		<category><![CDATA[augmented reality event]]></category>
		<category><![CDATA[augmented reality games]]></category>
		<category><![CDATA[augmented twitter]]></category>
		<category><![CDATA[Dancing Ink Productions]]></category>
		<category><![CDATA[EComm]]></category>
		<category><![CDATA[Evolutionary Reality]]></category>
		<category><![CDATA[Farmville]]></category>
		<category><![CDATA[federation protocol]]></category>
		<category><![CDATA[Foure Square]]></category>
		<category><![CDATA[Gamepocalypse]]></category>
		<category><![CDATA[gowalla]]></category>
		<category><![CDATA[Jeff Pulver]]></category>
		<category><![CDATA[Jerry Paffendorf]]></category>
		<category><![CDATA[Joshua Fouts]]></category>
		<category><![CDATA[Latitude]]></category>
		<category><![CDATA[Loveland]]></category>
		<category><![CDATA[micro-real estate]]></category>
		<category><![CDATA[mobial social]]></category>
		<category><![CDATA[mobile social augmented reality]]></category>
		<category><![CDATA[mobile social games]]></category>
		<category><![CDATA[Open AR]]></category>
		<category><![CDATA[Open AR Web]]></category>
		<category><![CDATA[open standard federated protocol]]></category>
		<category><![CDATA[Rita J. King]]></category>
		<category><![CDATA[Rouli Nir]]></category>
		<category><![CDATA[social games]]></category>
		<category><![CDATA[The Kotel]]></category>
		<category><![CDATA[Tish Shute]]></category>
		<category><![CDATA[tishshute]]></category>
		<category><![CDATA[wave federation prtocol]]></category>
		<category><![CDATA[WhereCamp]]></category>

		<guid isPermaLink="false">http://www.ugotrade.com/?p=5406</guid>
		<description><![CDATA[Augmented Twitter &#8211; open, mobile, social augmented reality via ARwave. View more presentations from Tish Shute. Augmented Twitter Presenting Augmented Twitter (see video and slides above) at Jeff Pulver&#8217;s 140 Characters Conference (#140conf) was super fun, and great video makes this a conference that you can enjoy catching up on after the fact. Jeff Pulver [&#8230;]]]></description>
				<content:encoded><![CDATA[<p><a href="http://www.ippio.com/view_video.php?viewkey=da6ab5c15dd856998e4b" target="_blank"><img class="alignnone size-full wp-image-5407" title="Screen shot 2010-04-22 at 9.52.22 AM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/04/Screen-shot-2010-04-22-at-9.52.22-AM.png" alt="Screen shot 2010-04-22 at 9.52.22 AM" width="458" height="368" /></a></p>
<div id="__ss_3817428" style="width: 425px;"><strong style="display:block;margin:12px 0 4px"><a title="Augmented twitter - open, mobile social augmented reality via ARwave" href="http://www.slideshare.net/TishShute/augmented-twitter">Augmented Twitter &#8211; open, mobile, social augmented reality via ARwave</a></strong><object classid="clsid:d27cdb6e-ae6d-11cf-96b8-444553540000" width="425" height="355" codebase="http://download.macromedia.com/pub/shockwave/cabs/flash/swflash.cab#version=6,0,40,0"><param name="allowFullScreen" value="true" /><param name="allowScriptAccess" value="always" /><param name="src" value="http://static.slidesharecdn.com/swf/ssplayer2.swf?doc=augmentedtwitter-100422085925-phpapp01&amp;stripped_title=augmented-twitter" /><param name="allowfullscreen" value="true" /><embed type="application/x-shockwave-flash" width="425" height="355" src="http://static.slidesharecdn.com/swf/ssplayer2.swf?doc=augmentedtwitter-100422085925-phpapp01&amp;stripped_title=augmented-twitter" allowscriptaccess="always" allowfullscreen="true"></embed></object>View more <a href="http://www.slideshare.net/">presentations</a> from <a href="http://www.slideshare.net/TishShute">Tish Shute</a>.</div>
<p><br /></p>
<h3>Augmented Twitter</h3>
<p>Presenting <a href="http://www.ippio.com/view_video.php?viewkey=da6ab5c15dd856998e4b" target="_blank">Augmented Twitter</a> (see video and slides above) at <a href="http://140conf.com/" target="_blank">Jeff Pulver&#8217;s 140 Characters Conference</a> (#140conf) was super fun, and <a href="http://www.ippio.com/140conf" target="_blank">great video</a> makes this a conference that you can enjoy catching up on after the fact. Jeff Pulver does an excellent job of keeping people to a challengingly short format. Even I managed to bring my talk in under 5 mins!</p>
<p>#140conf is a real time mobile social crowd, and pretty attuned to Augmented Reality. Everyone in the audience had heard of Augmented Reality, and while most had never tried an AR app, nearly everyone used a mobile social app like <a href="http://foursquare.com/" target="_blank">Foursquare</a>, <a href="http://gowalla.com/" target="_blank">Gowalla</a>, or <a href="http://www.google.com/latitude/intro.html" target="_blank">Latitude</a>. As Dan Harple (@dharple) &#8211; Executive Chairman, <a href="http://www.gypsii.com/" target="_blank">GyPSii</a> &#8211; said in his interesting presentation, <a href="http://www.ippio.com/view_video.php?viewkey=44143e1f2f13b2b729ab"><strong>Evolution of Location and Places</strong></a>, &#8220;everyone gets connection, and that connection in real time is the thing if we can get it, and that real time connection is innately mobile.&#8221;</p>
<p><a href="http://www.arwave.org/" target="_blank">ARwave</a> aims to push mobile, social, real time connection even further with augmented reality. As Anselm Hook puts it so brilliantly in his <a href="http://www.slideshare.net/anselm/20100421-ecomm-pressy" target="_blank">presentation at EComm</a>, &#8220;AR is about publishing &#8216;verbs&#8217; &#8211; interactive, actionable, digital agents &#8211; not publishing 3D models.&#8221; I have some mega posts brewing on this topic. Augmented Reality will need to support publishing game-like behavior, and digital agents that can embody a set of actions and reactions.</p>
<p>These needs &#8211; for augmented reality to publish behavior, and to share and integrate multiple real time data streams in one view &#8211; are just some of the reasons <a href="http://www.arwave.org/" target="_blank">AR Wave</a> uses <a href="http://www.waveprotocol.org/" target="_blank">an open federated protocol</a>. Federation is also particularly important for augmented reality because, as Anselm pointed out at <a href="http://wherecamp.org/" target="_blank">WhereCamp</a>, AR will certainly demand very efficient distribution of state change at the systems level &#8211; to move the computation to its lowest latency.</p>
<p>The only other cloud over our Augmented Reality party at #140conf was that #ashtag kept our co-panelist and panel chair from joining us. Rita J. King, @ritajking, who is Innovator-in-Residence at IBM&#8217;s Analytics Virtual Center, the &#8220;General of the Imagination Age,&#8221; and <a href="http://dancinginkproductions.com/" target="_blank">Dancing Ink Productions</a>, and Joshua Fouts, @josholalia, &#8220;Cultural Attach&#233;,&#8221; and Chief Global Strategist of Dancing Ink, were on a 5-day trek out of #ashcloud, and, sadly, not there for our panel.</p>
<p>But Twitter, once again, was a life line in a time of crisis, connecting them to <a href="TEDxVolcano">TEDxVolcano</a>, an impromptu unconference with must-see presentations from Rita and others; see <a href="http://www.theimaginationage.net/" target="_blank">Rita&#8217;s blog for more</a>.</p>
<p>So the two of us carried the flag for Augmented Twitter: myself and Jerry Paffendorf &#8211; futurist, artist, entrepreneur and swell guy, and the co-inventor of the most famous real time social web system you have never heard of (actually I tried and loved it in alpha testing, before it was, quote, &#8220;shut down by blood thirsty investors&#8221;).</p>
<p>Now Jerry lives in Detroit, Michigan, where he works on the <a href="http://makeloveland.com/" target="_blank">Loveland micro-real estate project</a>, which is the simplest, cheapest, funnest way to become a land owner. At a dollar a square inch it mixes video games and real estate, like Farmville for urban development.</p>
<p>Joshua and Rita, our very virtual panel mates, are the first and largest inchvestors, creating their own micro city within the project. Jerry is one of the most creative and original thinkers on the planet, so treat yourself to a glimpse of what is on his mind in the video above &#8211; <a href="http://makeloveland.com/" target="_blank">Loveland</a>, <a href="http://www.3dmailbox.com/" target="_blank">3D Mailbox</a>, canned augmented reality, and the relationship of virtual worlds to the real time social web.</p>
<p>Jerry also hat tipped one of the most captivating projects and presentations of the conference, Alon Nir&#8217;s <a href="http://www.ippio.com/view_video.php?viewkey=510442f2fd40f2100b05"><strong>The Story Behind @TheKotel</strong></a>, &#8220;Tweet Yr Prayers!&#8221; What a great story about the power of Twitter to reach out into the world, and beyond! I got a chance to chat with Alon at #140conf, and I found out he is the brother of augmented reality guru Rouli Nir, @augmented. Rouli is known for his sharp and comprehensive AR commentary on <a href="http://artimes.rouli.net/" target="_blank">Augmented Times</a> and <a href="http://gamesalfresco.com/2010/04/22/the-future-of-ar-browser/" target="_blank">Games Alfresco</a>. Cool family!</p>
<p>Before I close this post, I want to mention @AndyDixn&#8217;s talk on the prison system, <a href="http://www.ippio.com/view_video.php?viewkey=7bc562a711ef96884a38"><strong>A conversation with Andy Dixon: What the prison yard &amp; Twitter have in common</strong></a>. This conversation, I think, is a great example of what makes #140conf special. As @nwjerseyliz pointed out, we &#8220;hear few voices from those who&#8217;ve experienced that side of the issue.&#8221;</p>
<p>Thank you @jeffpulver for creating such a cool staging for so many diverse voices.</p>
<p>And before I close, here is what the only slide I didn&#8217;t have time to show said!</p>
<h3><strong>If you liked &#8220;Augmented Twitter&#8221;<br />
Don&#8217;t miss Augmented Reality Event! </strong></h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/04/are234x60augmented_w.png"><img class="alignnone size-full wp-image-5424" title="are234x60augmented_w" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/04/are234x60augmented_w.png" alt="are234x60augmented_w" width="234" height="60" /></a></p>
<p><strong>2 days, 3 tracks, 40 AR companies, 76 speakers. Art! Magic! Competitions! Awards! Bruce (the Prophet) Sterling, Will (The Sims) Wright, Jesse (Gamepocalypse) Schell, Blaise Aguera y Arcas (Microsoft Bing) and You! The <a href="http://augmentedrealityevent.com/2010/04/10/sneak-preview-of-are-2010-schedule-packed-with-augmented-reality-goodness/">sneak preview of the schedule is here</a>.</strong><br />
<strong>Register today at <a href="http://augmentedrealityevent.com/" target="_blank">AugmentedRealityEvent.com</a></strong></p>
<p><strong>Discount code for @140 attendees (and readers of this post!): <a href="https://register03.exgenex.com/GcmRegister/Index.Aspx?C=70000088&amp;M=50000500" target="_blank">TISH245</a> activates the $245 price for the full conference.</strong></p>
<p><strong>See you there!</strong></p>
]]></content:encoded>
			<wfw:commentRss>http://www.ugotrade.com/2010/04/23/augmented-twitter-at-jeff-pulvers-140conf/feed/</wfw:commentRss>
		<slash:comments>2</slash:comments>
		</item>
		<item>
		<title>The Next Wave of AR: Mobile Social Interaction Right Here, Right Now!</title>
		<link>http://www.ugotrade.com/2009/11/19/the-next-wave-of-ar-mobile-social-interaction-right-here-right-now/</link>
		<comments>http://www.ugotrade.com/2009/11/19/the-next-wave-of-ar-mobile-social-interaction-right-here-right-now/#comments</comments>
		<pubDate>Fri, 20 Nov 2009 04:53:07 +0000</pubDate>
		<dc:creator><![CDATA[Tish Shute]]></dc:creator>
				<category><![CDATA[Ambient Devices]]></category>
		<category><![CDATA[Ambient Displays]]></category>
		<category><![CDATA[architecture of participation]]></category>
		<category><![CDATA[Artificial general Intelligence]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Augmented Reality]]></category>
		<category><![CDATA[digital public space]]></category>
		<category><![CDATA[Ecological Intelligence]]></category>
		<category><![CDATA[Instrumenting the World]]></category>
		<category><![CDATA[internet of things]]></category>
		<category><![CDATA[message brokers and sensors]]></category>
		<category><![CDATA[Mixed Reality]]></category>
		<category><![CDATA[mobile augmented reality]]></category>
		<category><![CDATA[mobile meets social]]></category>
		<category><![CDATA[Mobile Reality]]></category>
		<category><![CDATA[Mobile Technology]]></category>
		<category><![CDATA[new urbanism]]></category>
		<category><![CDATA[online privacy]]></category>
		<category><![CDATA[open source]]></category>
		<category><![CDATA[Paticipatory Culture]]></category>
		<category><![CDATA[privacy and online identity]]></category>
		<category><![CDATA[smart appliances]]></category>
		<category><![CDATA[Smart Devices]]></category>
		<category><![CDATA[Smart Planet]]></category>
		<category><![CDATA[sustainable living]]></category>
		<category><![CDATA[sustainable mobility]]></category>
		<category><![CDATA[ubiquitous computing]]></category>
		<category><![CDATA[Web Meets World]]></category>
		<category><![CDATA[websquared]]></category>
		<category><![CDATA[World 2.0]]></category>
		<category><![CDATA[AR browsers]]></category>
		<category><![CDATA[AR Dev camp]]></category>
		<category><![CDATA[AR Wave]]></category>
		<category><![CDATA[calo]]></category>
		<category><![CDATA[mobile social]]></category>
		<category><![CDATA[mobile social interaction utility]]></category>
		<category><![CDATA[open distributed augmented reality]]></category>
		<category><![CDATA[pygowave]]></category>
		<category><![CDATA[real time internet]]></category>
		<category><![CDATA[siri]]></category>
		<category><![CDATA[smart things]]></category>
		<category><![CDATA[social augmented experiences]]></category>
		<category><![CDATA[social augmented reality]]></category>
		<category><![CDATA[The Copenhagen Wheel]]></category>
		<category><![CDATA[the internet of things]]></category>
		<category><![CDATA[the outernet]]></category>
		<category><![CDATA[the sentient city]]></category>
		<category><![CDATA[Wave Federation Protocol]]></category>
		<category><![CDATA[Web Squared]]></category>

		<guid isPermaLink="false">http://www.ugotrade.com/?p=4869</guid>
		<description><![CDATA[The Next Wave of AR: Mobile Social Interaction, Right Here, Right Now! View more presentations from Tish Shute. Click on the image below or here to watch this presentation and others from Momo13]]></description>
				<content:encoded><![CDATA[<div id="__ss_2542526" style="width: 425px; text-align: left;"><a style="font:14px Helvetica,Arial,Sans-serif;display:block;margin:12px 0 3px 0;text-decoration:underline;" title="The Next Wave of AR: Mobile Social Interaction, Right Here, Right Now!" href="http://www.slideshare.net/TishShute/the-next-wave-of-ar-mobile-social-interaction-right-here-right-now-2542526">The Next Wave of AR: Mobile Social Interaction, Right Here, Right Now!</a><object style="margin:0px" classid="clsid:d27cdb6e-ae6d-11cf-96b8-444553540000" width="425" height="355" codebase="http://download.macromedia.com/pub/shockwave/cabs/flash/swflash.cab#version=6,0,40,0"><param name="allowFullScreen" value="true" /><param name="allowScriptAccess" value="always" /><param name="src" value="http://static.slidesharecdn.com/swf/ssplayer2.swf?doc=thenextwaveofar2-091120000046-phpapp01&amp;stripped_title=the-next-wave-of-ar-mobile-social-interaction-right-here-right-now-2542526" /><param name="allowfullscreen" value="true" /><embed style="margin:0px" type="application/x-shockwave-flash" width="425" height="355" src="http://static.slidesharecdn.com/swf/ssplayer2.swf?doc=thenextwaveofar2-091120000046-phpapp01&amp;stripped_title=the-next-wave-of-ar-mobile-social-interaction-right-here-right-now-2542526" allowscriptaccess="always" allowfullscreen="true"></embed></object>
<div style="font-size: 11px; font-family: tahoma,arial; height: 26px; padding-top: 2px;">View more <a style="text-decoration:underline;" href="http://www.slideshare.net/">presentations</a> from <a style="text-decoration:underline;" href="http://www.slideshare.net/TishShute">Tish Shute</a>.</div>
</div>
<p>Click on the image below or <a href="http://www.mobilemonday.nl/talks/tish-shute-the-next-wave-of-ar/" target="_blank">here to watch</a> this presentation and others from <a href="http://www.mobilemonday.nl/">Momo13</a>.</p>
<p><a href="http://www.mobilemonday.nl/talks/tish-shute-the-next-wave-of-ar/" target="_blank"><img class="alignnone size-medium wp-image-4876" title="Screen shot 2009-11-20 at 1.32.24 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/11/Screen-shot-2009-11-20-at-1.32.24-PM-300x167.png" alt="Screen shot 2009-11-20 at 1.32.24 PM" width="300" height="167" /></a></p>
]]></content:encoded>
			<wfw:commentRss>http://www.ugotrade.com/2009/11/19/the-next-wave-of-ar-mobile-social-interaction-right-here-right-now/feed/</wfw:commentRss>
		<slash:comments>4</slash:comments>
		</item>
		<item>
		<title>Toward the Sentient City: The Future of the Outernet and How to Imagine it?</title>
		<link>http://www.ugotrade.com/2009/11/09/toward-the-sentient-city-the-future-of-the-outernet-and-how-to-imagine-it/</link>
		<comments>http://www.ugotrade.com/2009/11/09/toward-the-sentient-city-the-future-of-the-outernet-and-how-to-imagine-it/#comments</comments>
		<pubDate>Mon, 09 Nov 2009 21:09:00 +0000</pubDate>
		<dc:creator><![CDATA[Tish Shute]]></dc:creator>
				<category><![CDATA[Ambient Devices]]></category>
		<category><![CDATA[Ambient Displays]]></category>
		<category><![CDATA[architecture of participation]]></category>
		<category><![CDATA[Augmented Reality]]></category>
		<category><![CDATA[Carbon Footprint Reduction]]></category>
		<category><![CDATA[digital public space]]></category>
		<category><![CDATA[Ecological Intelligence]]></category>
		<category><![CDATA[Energy Awareness]]></category>
		<category><![CDATA[Instrumenting the World]]></category>
		<category><![CDATA[internet of things]]></category>
		<category><![CDATA[message brokers and sensors]]></category>
		<category><![CDATA[Mixed Reality]]></category>
		<category><![CDATA[mobile augmented reality]]></category>
		<category><![CDATA[mobile meets social]]></category>
		<category><![CDATA[Mobile Reality]]></category>
		<category><![CDATA[Mobile Technology]]></category>
		<category><![CDATA[new urbanism]]></category>
		<category><![CDATA[smart appliances]]></category>
		<category><![CDATA[Smart Devices]]></category>
		<category><![CDATA[Smart Planet]]></category>
		<category><![CDATA[social gaming]]></category>
		<category><![CDATA[social media]]></category>
		<category><![CDATA[sustainable living]]></category>
		<category><![CDATA[sustainable mobility]]></category>
		<category><![CDATA[ubiquitous computing]]></category>
		<category><![CDATA[Web Meets World]]></category>
		<category><![CDATA[websquared]]></category>
		<category><![CDATA[3rd cloud]]></category>
		<category><![CDATA[Adam Greenfield]]></category>
		<category><![CDATA[aesthetics of distributed participation]]></category>
		<category><![CDATA[Amphibious Architecture]]></category>
		<category><![CDATA[architectures of participation]]></category>
		<category><![CDATA[asynchronous city]]></category>
		<category><![CDATA[Benjamin H. Bratton]]></category>
		<category><![CDATA[Breakout!]]></category>
		<category><![CDATA[Conflux 2009]]></category>
		<category><![CDATA[Dan Hill]]></category>
		<category><![CDATA[Dharma Dailey]]></category>
		<category><![CDATA[distributed open AR]]></category>
		<category><![CDATA[Enrique Ramirez]]></category>
		<category><![CDATA[everyware]]></category>
		<category><![CDATA[Google Wave]]></category>
		<category><![CDATA[human electric hybrid]]></category>
		<category><![CDATA[hybrid social netoworks]]></category>
		<category><![CDATA[julian Bleeker]]></category>
		<category><![CDATA[Laura Forlano]]></category>
		<category><![CDATA[location aware applications]]></category>
		<category><![CDATA[Mark Shepard]]></category>
		<category><![CDATA[Martijn de Waal]]></category>
		<category><![CDATA[Matthew Fuller]]></category>
		<category><![CDATA[Mimi Zeiger]]></category>
		<category><![CDATA[Natalie Jeremijenko]]></category>
		<category><![CDATA[Natural Fuse]]></category>
		<category><![CDATA[new architectures of participation]]></category>
		<category><![CDATA[Nicolas Nova]]></category>
		<category><![CDATA[Omar Khan]]></category>
		<category><![CDATA[Open AR]]></category>
		<category><![CDATA[outernet]]></category>
		<category><![CDATA[Philip Beesley]]></category>
		<category><![CDATA[real time communication]]></category>
		<category><![CDATA[real time web]]></category>
		<category><![CDATA[real-time database enable city]]></category>
		<category><![CDATA[sensor networks]]></category>
		<category><![CDATA[Sentient City Survival Kit]]></category>
		<category><![CDATA[Situated Technologies]]></category>
		<category><![CDATA[smart things]]></category>
		<category><![CDATA[social mobility]]></category>
		<category><![CDATA[social mobility and the 3rd cloud]]></category>
		<category><![CDATA[synchronous internet of things]]></category>
		<category><![CDATA[The Copenhagen Wheel]]></category>
		<category><![CDATA[The Living Architecture Lab]]></category>
		<category><![CDATA[the social negotiation of Technology]]></category>
		<category><![CDATA[Too Smart City]]></category>
		<category><![CDATA[Toward the Sentient City]]></category>
		<category><![CDATA[Trash Track]]></category>
		<category><![CDATA[urban sustainability]]></category>
		<category><![CDATA[urbanware]]></category>
		<category><![CDATA[Usman Haque]]></category>
		<category><![CDATA[Web Squared]]></category>

		<guid isPermaLink="false">http://www.ugotrade.com/?p=4758</guid>
		<description><![CDATA[Amphibious Architecture &#8211; &#8220;submerges ubiquitous computing into the water &#8211; that 90% of the Earth&#8217;s inhabitable volume that envelops New York City but remains under-explored and under-engaged.&#8221; Toward the Sentient City, brought &#8220;architects and urban designers into a conversation that until now has been limited largely to technologists,&#8221; and created an extraordinary opportunity to investigate distributed architectures [&#8230;]]]></description>
				<content:encoded><![CDATA[<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/11/Screen-shot-2009-11-06-at-12.03.40-AM.png"><img class="alignnone size-medium wp-image-4783" title="Screen shot 2009-11-06 at 12.03.40 AM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/11/Screen-shot-2009-11-06-at-12.03.40-AM-300x200.png" alt="Screen shot 2009-11-06 at 12.03.40 AM" width="300" height="200" /></a><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/11/dhj5mk2g_404g3prc6dc_b.jpg"><img class="alignnone size-medium wp-image-4759" title="dhj5mk2g_404g3prc6dc_b" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/11/dhj5mk2g_404g3prc6dc_b-300x199.jpg" alt="dhj5mk2g_404g3prc6dc_b" width="300" height="199" /></a></p>
<p><em><a href="http://www.sentientcity.net/exhibit/?p=5" target="_blank">Amphibious Architecture</a> &#8211; &#8220;submerges ubiquitous computing into the water &#8211; that 90% of the Earth&#8217;s inhabitable volume that envelops New York City but remains under-explored and under-engaged.&#8221;</em></p>
<p><a href="http://www.sentientcity.net/exhibit/">Toward the Sentient City</a> brought &#8220;architects and urban designers into a conversation that until now has been limited largely to technologists,&#8221; and created an extraordinary opportunity to investigate distributed architectures of participation of what we might call the &#8220;outernet.&#8221; This is a timely conversation as &#8220;web squared,&#8221; &#8220;smart things,&#8221; the &#8220;internet of things,&#8221; or the &#8220;outernet,&#8221; and their popular &#8220;ambassador&#8221; augmented reality are rapidly becoming everyone&#8217;s &#8220;business.&#8221; From &#8220;evil&#8221; marketers, to global corporations, environmentalists, artists and community activists &#8211; everyone, it seems, is interested in the possibilities of this new frontier.</p>
<p>It is a challenging task to respond to <a href="http://www.sentientcity.net/exhibit/">Toward the Sentient City</a>, an exhibition whose backdrop includes a series of conversations on Situated Technologies &#8211; published by the Architectural League &#8211; from a circle of people who have been thinking, writing, and speaking on networked urbanism for many years now, including: Adam Greenfield, Mark Shepard, Matthew Fuller, Usman Haque, Benjamin H. Bratton, Natalie Jeremijenko, Laura Forlano, Dharma Dailey, Philip Beesley, Omar Khan, Julian Bleeker, and Nicolas Nova. And the exhibition itself has a very thoughtful group of respondents; see posts from: <a href="http://www.sentientcity.net/exhibit/?p=595" target="_blank">Dan Hill</a>, <a href="http://www.sentientcity.net/exhibit/?p=659" target="_blank">Martijn de Waal</a>, <a href="http://www.sentientcity.net/exhibit/?p=622" target="_blank">Enrique Ramirez</a>, and <a href="http://www.sentientcity.net/exhibit/?p=603" target="_blank">Mimi Zeiger</a>.</p>
<p>But one of Toward the Sentient City&#8217;s key accomplishments was to go beyond the rhetorical, and to put practical examples out into the world to organize a discussion on some of the ideas and possibilities of ubiquitous computing that have barely begun to emerge from academic research and entrepreneurial blue skying. As curator <a href="http://www.andinc.org/v3/" target="_blank">Mark Shepard</a> explained:</p>
<p><strong>&#8220;The aim is to provide concrete examples in the present around which to organize a discussion about just what kind of future we might want. Whether they&#8217;re prototypes or not, these commissions are concrete examples. They&#8217;re not abstract ideas. And we can go stand next to each other and look at and interact with something which is out there in the world behaving in the way it behaves, performing as it does, and we can then begin to have a discussion about it that is less dependent upon powers of rhetoric. So it&#8217;s not about me persuading you about an idea but it&#8217;s about us evaluating something that&#8217;s living and existing in this world. And that was really the intention of the show.&#8221;</strong></p>
<p>The commissioned works &#8211; <a href="http://www.sentientcity.net/exhibit/?p=5" target="_blank">Amphibious Architecture</a>, <a href="http://www.sentientcity.net/exhibit/?p=53" target="_blank">Breakout!</a>, <a href="http://www.sentientcity.net/exhibit/?p=43" target="_blank">Natural Fuse</a>, <a href="http://www.sentientcity.net/exhibit/?p=59" target="_blank">Too Smart City</a>, and <a href="http://www.sentientcity.net/exhibit/?p=31" target="_blank">Trash Track</a> &#8211; that were the hub of Toward the Sentient City&#8217;s events, themes and texts, provided a unique glimpse at some of the possible dystopian and utopian futures of a &#8220;smart&#8221; city. But, most importantly, all the works questioned what might be new architectures of participation for a sentient city.</p>
<h3>New Architectures of Participation: Hybrid Social Networks with Human and Non-human Participants</h3>
<p>Of the five works, Amphibious Architecture and Natural Fuse were particularly fascinating to me because they explored the possibilities of sensor networks to create new forms of distributed participation in networked ecosystems that connected the experience/trajectories of human and non-human actors &#8211; fish, plants, and people.</p>
<p>Both Amphibious Architecture and &#8220;Natural Fuse&#8221; &#8211; the latter from Usman Haque and <a href="http://www.haque.co.uk/" target="_blank">Haque Design + Research</a> &#8211; gave exhibition attendees the chance to experience, at a personal level, our relationships with our non-human neighbors.</p>
<p><a href="http://www.sentientcity.net/exhibit/?p=5" target="_blank"><span id="it_d" title="Click to view full content">Amphibious </span>Architecture</a>, from The Living Architecture Lab at Columbia University Graduate School of Architecture, Planning and Preservation (Directors David Benjamin and Soo-in Yang) and Natalie Jeremijenko, Environmental Health Clinic at New York University, <span id="w.m9" title="Click to view full content">used a sensor array to &#8220;pierce the reflective </span><span id="ud4u" title="Click to view full content">surface of the water&#8221; that</span> separates us from the underwater ecosystem below. <span id="kfwr" title="Click to view full content">The sensor arrays just below the surface of the East River and a floating light array</span> (see picture on left opening this post) create a new interface between people and fish, whose movements and water quality are transmitted in light.</p>
<p>One could also SMS the fish and the single beaver that lives in the rivers surrounding NYC to find out the conditions they were experiencing.<span id="cehj" title="Click to view full content"> But t</span><span id="y9m6" title="Click to view full content">urning the city&#8217;s &#8220;back stories,&#8221; like the movements of &#8220;Yo beaver,&#8221; and the oxygen levels and water quality of the rivers into &#8220;fore stories,&#8221; is only one of the many ways Natalie Jeremijenko explores how we can engender the empathy necessary for humans and non-humans to live in harmony and mutual benefit.</span></p>
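<p>You can picture the SMS interface as a simple keyword autoresponder that maps a texted recipient to the latest readings from that site&#8217;s sensor array. The sketch below is my own illustration of that idea, not the actual Amphibious Architecture system; the keywords, readings, and thresholds are all invented.</p>

```python
# Hypothetical sketch of an SMS autoresponder in the spirit of Amphibious
# Architecture: text a site keyword, get back the current water conditions.
# All keywords, readings, and thresholds here are invented for illustration.

SENSOR_SITES = {
    "FISH": {"water_temp_c": 14.5, "dissolved_oxygen_mgl": 6.2},
    "BEAVER": {"water_temp_c": 13.0, "dissolved_oxygen_mgl": 7.8},
}

def reply_to_sms(body: str) -> str:
    """Build a reply for an incoming SMS addressed to a river site."""
    site = SENSOR_SITES.get(body.strip().upper())
    if site is None:
        return "Unknown recipient. Text FISH or BEAVER."
    # A crude habitability check: dissolved oxygen below ~5 mg/L stresses fish.
    quality = "good" if site["dissolved_oxygen_mgl"] >= 5.0 else "poor"
    return (f"Conditions here: {site['water_temp_c']} C, "
            f"{site['dissolved_oxygen_mgl']} mg/L dissolved oxygen ({quality}).")
```

<p>The point of such an interface is that the sensor data answers in the voice of the non-human neighbor, which is exactly the &#8220;back story into fore story&#8221; move described above.</p>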
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/11/nataliefishandmicrochips.jpg"><img class="alignnone size-medium wp-image-4802" title="nataliefishandmicrochips" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/11/nataliefishandmicrochips-300x199.jpg" alt="nataliefishandmicrochips" width="300" height="199" /></a><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/11/fishfoodpost.jpg"><img class="alignnone size-medium wp-image-4803" title="fishfoodpost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/11/fishfoodpost-300x199.jpg" alt="fishfoodpost" width="300" height="199" /></a></p>
<p><span id="y9m6" title="Click to view full content"> </span>Toward the Sentient City also held workshops/presentations in conjunction with <a href="http://confluxfestival.org/2009/" target="_blank">Conflux 2009</a>. After her Conflux presentation, Natalie Jeremijenko of Amphibious Architecture (which is also a collaborative project between <a href="http://www.environmentalhealthclinic.net/">xClinic</a>, <a href="http://www.thelivingnewyork.com/">The Living</a><span id="wz9v" title="Click to view full content">, </span>&#8220;and other intelligent creatures on the East River&#8221;) invited participants to enjoy a lunch of cross-species foods at the East River site.<span id="k2u." title="Click to view full content"> </span></p>
<p><span id="k2u." title="Click to view full content">The cross-species lunch takes</span><span id="x0h." title="Click to view full content"> an existing interaction pattern through which people and fish are already communicating, </span><span id="tkk5" title="Click to view full content">i.e., people going to the river &#8211; the waterfront &#8211; and feeding the fish</span><span id="vct4" title="Click to view full content"> Wonder Bread (which is bad for both humans and fish), and transforms this desire to feed the fish into something that can actually remove mercury from the fish and our bodies by removing it from the food chain. A previously inharmonious connection between people and fish is redirected into a productive interaction benefitting both species. As it turns out, food that is good for fish (see pictures above), and removes mercury from their bodies, can also be nutritious and tasty for humans. </span></p>
<p><a href="http://www.sentientcity.net/exhibit/?p=43" target="_blank">Natural Fuse</a>, from team members Usman Haque, creative director, Nitipak &#8216;Dot&#8217; Samsen, designer, Ai Hasegawa, designer, Cesar Harada, designer, and Barbara Jasinowicz, producer, used sensors to<span id="oenx" title="Click to view full content"> link humans and plants in a network where we are accountable for how our behavior affects others in our ecosystem. </span></p>
<p><span id="oenx" title="Click to view full content">If you brought an ordinary plant to the exhibition, you could take home an electronically assisted plant and become part of a social network of humans and plants. This network of humans and electronically assisted plants is also a carbon sink, and if more energy is consumed than the total number of plants in the social network can offset, plants begin to die, giving immediate feedback and consequences for being greedy about energy consumption. </span><span id="ijgh" title="Click to view full content">For more about joining the Natural Fuse network, see <a href="http://www.naturalfuse.org" target="_blank">here</a>.<br />
</span></p>
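<p>To make that feedback loop concrete, here is a toy model of the accounting as I understand it &#8211; shared offset capacity, three lives per plant, consequences for the greediest consumer. This is my own hypothetical sketch, not the project&#8217;s actual implementation; all names and numbers are invented.</p>

```python
# Toy model of the Natural Fuse accounting described above: each plant in the
# network offsets a fixed amount of energy per cycle, members draw energy, and
# when total demand exceeds the network's offset capacity the greediest
# member's plant loses one of its three "lives". Numbers are illustrative.

OFFSET_PER_PLANT_WH = 20.0   # energy each living plant can offset per cycle (assumed)
LIVES_PER_PLANT = 3

class NaturalFuseNetwork:
    def __init__(self):
        self.plants = {}          # owner -> remaining lives

    def join(self, owner):
        """Bring a plant into the network with a full set of lives."""
        self.plants[owner] = LIVES_PER_PLANT

    def capacity(self):
        """Total energy the surviving plants can offset this cycle."""
        return OFFSET_PER_PLANT_WH * sum(1 for lives in self.plants.values() if lives > 0)

    def settle(self, demands):
        """Apply one cycle of energy demands (owner -> Wh used).

        If demand exceeds capacity, the greediest member's plant loses a life.
        Returns the owners whose plants are now dead (no lives left).
        """
        if sum(demands.values()) > self.capacity():
            greediest = max(demands, key=demands.get)
            if self.plants.get(greediest, 0) > 0:
                self.plants[greediest] -= 1
        return [owner for owner, lives in self.plants.items() if lives == 0]
```

<p>Even this toy version shows the dynamic Usman Haque describes below: the more members join, the larger the shared capacity, and the more any one person&#8217;s selfishness is buffered &#8211; until it isn&#8217;t.</p>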
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/11/naturalfusepres.jpg"><img title="naturalfusepres" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/11/naturalfusepres-300x199.jpg" alt="naturalfusepres" width="300" height="199" /></a><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/11/naturalfusetakehome.jpg"><img title="naturalfusetakehome" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/11/naturalfusetakehome-300x199.jpg" alt="naturalfusetakehome" width="300" height="199" /></a></p>
<p><span id="pa9i" title="Click to view full content"> </span><span id="w-ed" title="Click to view full content">We are in the pre-dawn of sensor networks like those Natural Fuse and Amphibious Architecture created &#8211; social</span><span id="n.6p" title="Click to view full content"> networks that link human and non-human participants in entirely new ways are largely uncharted territory. </span><span id="o7yp" title="Click to view full content">(Note: T</span><span id="zr9t" title="Click to view full content">he upcoming <a href="http://www.situatedtechnologies.net/" target="_blank">Situated Technologies</a> Pamphlet 6</span><span id="ijgh" title="Click to view full content"> &#8211; <strong>&#8220;Micro Public Places,&#8221;</strong> by Marc Bohlen and Hans Frei &#8211; indicates it will continue the journey with an investigation of &#8220;transparent and distributed participation.&#8221;)</span></p>
<h3>Where Does the Social Negotiation of Technology Happen?</h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/11/markshepardpost.jpg"><img class="alignnone size-medium wp-image-4825" title="markshepardpost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/11/markshepardpost-199x300.jpg" alt="markshepardpost" width="199" height="300" /></a></p>
<p>Frequent questions that came up at the presentations given by the teams that produced the commissioned works were: Does this idea scale? Does it close the loop, in that you <span>get answers to the questions asked? How does the conversation gain agency? And where does the social negotiation of technology happen? (These last two questions were asked by <a href="http://www.orangecone.com/" target="_blank">Mike Kuniavsky</a> at Mark Shepard&#8217;s presentation at Conflux: &#8220;</span><a id="ktb-" title="Sentient City Survival Kit" href="http://survival.sentientcity.net/" target="_blank"><span>Sentient City Survival Kit</span></a><span>.&#8221; &#8211; see picture above.) I think it is fair to say that these questions for the most part remain unanswered. But Toward the Sentient City was alive with ideas and practical examples about ways we can explore these questions more deeply.</span></p>
<p><span id="oenx" title="Click to view full content">Usman Haque, in response to the question &#8220;Does this experiment scale?&#8221;, replied:</span></p>
<p><strong>&#8220;It would, but at an individual level, because it has to remain at the individual level &#8211; it is about the individual in relationship to the wider social context. As opposed to building a forest to offset a city, it is about each individual making choices of their own about what they do, and having some kind of knowledge about the effect they are having on other people, because most of the time we are quite complacent &#8211; we are able to do whatever we want because we are not necessarily aware how our intrusions affect both human and non-human neighbors&#8230;.&#8221;</strong></p>
<p>So how does this close the loop? Usman explains that one of the key aspects for him is that if you do take home a plant you become part of a system in which you are no longer anonymous, and if a plant is threatened (plants get three lives) you have the opportunity to email the person in the system who has threatened your plant. Usman noted that one of the interesting things that happened in the context of the exhibition, where there was a single unit, was that 90% of the time people switched it on to selfish mode &#8211; presumably because they were anonymous. Another aspect of Natural Fuse that raises interesting questions is that as more people decide to join the network, the risk of a plant being harmed by any particular individual&#8217;s selfishness lessens. As <a href="http://www.sentientcity.net/exhibit/?p=659" target="_blank">Martijn de Waal</a>,<span id="gi2_" title="Click to view full content"> in his response that unpacks some of the deeper philosophical, epistemological, and ethical questions that Natural Fuse addresses, observes:</span></p>
<p><strong>&#8220;The concept of a commons thus assumes cooperation and mutual accommodation. Could Sentient Technology play a role in the allocation of limited resources between citizens? Could it lead to the emergence of some sort of peer-to-peer governance model, that could prevent overusage of scarce resources?&#8221;</strong></p>
<h3><strong>New Aesthetics of Distributed Participation</strong></h3>
<p><span id="nqx:" title="Click to view full content">The works of </span><span id="nqx:" title="Click to view full content"><span>&#8220;Toward the Sentient City&#8221; point to possibilities for a new aesthetics of distributed participation, in which users and system are no longer separated but instead &#8220;develop joint forms of observing and knowing that neither [...] is capable on its own&#8221; (quote from the upcoming <a href="http://www.situatedtechnologies.net/" target="_blank">Situated Technologies Pamphlets</a></span> 6: Micro Public Places, by Marc Bohlen and Hans Frei). Natural Fuse and Amphibious Architecture examine the new transactional realities of the Sentient City.</span></p>
<p><span id="po-s" title="Click to view full content"> But there are many questions left unanswered. We know a lot about the power of generativity from the </span>internet (see Zittrain) &#8211; the ur-<strong>&#8220;architecture of participation.&#8221;</strong> <span id="hri-" title="Click to view full content">As Zittrain points out, the &#8220;generativity&#8221; of the internet is &#8220;the engine that has catapulted the internet from backwater to ubiquity.&#8221; </span>Tim O&#8217;Reilly coined the phrase &#8220;architecture of participation&#8221; to &#8220;describe the nature of systems that are designed for user contribution,&#8221;<span id="o7et" title="Click to view full content"> such that &#8220;participants extend the reach/increase the value of the system.&#8221; But as Tim O&#8217;Reilly put it in his recent talk, &#8220;<a href="http://www.slideshare.net/timoreilly/state-of-the-internet-operating-system" target="_blank">State of the Internet Operating System</a>&#8221;:</span></p>
<p><span title="Click to view full content"><strong>&#8220;Web 2.0 is about finding meaning in user-generated data, and turning that meaning into real-time user facing services. &#8220;Web Squared&#8221; takes that same concept to real-time sensor data.&#8221;</strong></span></p>
<p><span id="o7et" title="Click to view full content">We know little yet about what constitutes generativity for the &#8220;outernet,&#8221; particularly for the kind of hybrid social networks that Natural Fuse and Amphibious Architecture present. Social networks that connect people and place, humans and non-humans, challenge dichotomies of man and nature, and machine and user, in new and unexpected ways.</span></p>
<p>At the moment, the internet is going through a metamorphosis with the emergence of real-time technologies like XMPP, PubSubHubbub and Google Wave, and the coming of age of mobile computing. While these shifts were not investigated specifically in any of the commissioned works, I think all the works begged the question: what is a common platform for social interaction in the &#8220;outernet,&#8221; or sentient city? I was not entirely satisfied, from this point of view, with a web interface for Natural Fuse or SMS as a mobile interface for Amphibious Architecture.</p>
<p><a href="http://www.media.mit.edu/people/dpreed" target="_blank">David P. Reed</a> points to the relationship between social mobility &#8211; what he describes as the 3rd cloud &#8211; and the need for a common platform (see <a href="http://www.slideshare.net/venicesessions/david-reed-social-mobility-and-the-3rd-cloud" target="_blank">David Reed &#8211; Social Mobility and the 3rd Cloud</a>. Hat tip to <a href="http://twitter.com/srenan" target="_blank">@srenan</a> for pointing me to David&#8217;s presentation).</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/11/Screen-shot-2009-11-06-at-11.11.25-PM.png"><img class="alignnone size-medium wp-image-4826" title="Screen shot 2009-11-06 at 11.11.25 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/11/Screen-shot-2009-11-06-at-11.11.25-PM-300x222.png" alt="Screen shot 2009-11-06 at 11.11.25 PM" width="300" height="222" /></a><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/11/Screen-shot-2009-11-06-at-11.16.59-PM1.png"><img class="alignnone size-medium wp-image-4828" title="Screen shot 2009-11-06 at 11.16.59 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/11/Screen-shot-2009-11-06-at-11.16.59-PM1-300x222.png" alt="Screen shot 2009-11-06 at 11.16.59 PM" width="300" height="222" /></a></p>
<p><em>Slides above are from David P. Reed&#8217;s presentation, <a href="http://www.slideshare.net/venicesessions/david-reed-social-mobility-and-the-3rd-cloud" target="_blank">Social Mobility and the 3rd Cloud</a></em></p>
<p>What is an architecture of participation for mobile, social interaction? This is something I am very interested in.</p>
<p>Recently I began a project with a small group of augmented reality developers and enthusiasts to use the Google Wave Federation Protocol as a transport system for open, distributed, social augmented experiences (lots more to come on this soon &#8211; you can see the back story in my posts <a href="http://www.ugotrade.com/2009/10/13/ar-wave-layers-and-channels-of-social-augmented-experiences/" target="_blank">here</a> and <a href="http://www.ugotrade.com/2009/09/26/total-immersion-and-the-transfigured-city-shared-augmented-realities-the-web-squared-era-and-google-wave/" target="_blank">here</a>). Wave has introduced an open, federated architecture of participation that <strong style="font-weight: normal;">combines asynchronous &amp; synchronous data, bringing together the advantages of real-time communication with the persistent hosting of collaborative data (like wikis).</strong></p>
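<p>What makes Wave&#8217;s combination of the synchronous and the asynchronous interesting is easy to illustrate in miniature: keep every edit as an operation in a persistent log, apply operations as they arrive (real-time), and replay the log to any point (playback, persistence). The sketch below is my own toy illustration of that idea, not the Google Wave Federation Protocol itself.</p>

```python
# Toy model of a shared document that is both real-time (ops apply as they
# arrive) and persistent (the full op history is kept, enabling
# Wave-style playback). Purely illustrative; not the actual Wave protocol.

class SharedDoc:
    def __init__(self):
        self.ops = []                      # persistent history of (author, text)

    def append(self, author, text):
        """Apply an edit in real time and record it in the history."""
        self.ops.append((author, text))

    def snapshot(self):
        """Current state: the concatenation of all applied ops."""
        return "".join(text for _, text in self.ops)

    def playback(self, upto):
        """Replay history to the state after the first `upto` ops."""
        return "".join(text for _, text in self.ops[:upto])
```

<p>Because the history is data, late joiners can catch up asynchronously while present participants see edits live &#8211; the wiki and the chat room in one structure.</p>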
<p>Augmented Reality puts who you are, where you are, and what you are doing center stage, and is an interface for &#8220;communications embedded in context&#8221; and &#8220;enabled by identity&#8221; &#8211; two key qualities of what David <span>P. Reed calls the 3rd cloud. An open, distributed framework for augmented reality could create an interconnected sense of AR, one that fuses augmentation, data overlays, and varied media with location/time/place and, crucially, social networking. Such an interface would open up many possibilities for the new transactional realities that could </span>integrate real-time cloud-based data with a human perspective and social networking. I am using the term<span> transactional realities to suggest an extension into social augmented experiences of what Di-Ann Eisnor, </span><a id="s050" title="Platial" href="http://www.platial.com/"><span>Platial</span></a><span>, describes as &#8220;</span><span><span><span>transactional cartography&#8221; &#8211; &#8220;the movement from map providing entertainment/information to map as enabling action&#8221; (see </span><a id="h6.r" title="Human as Sensors" href="http://www.youtube.com/watch?v=Di285pgcZRE&amp;feature=PlayList&amp;p=F664D8C553A57C93&amp;index=3"><span>Human as Sensors</span></a><span>).</span></span></span></p>
<p>We have only just got a glimpse of how real-time technologies and &#8220;communications embedded in context&#8221; will transform social interaction and our cities. This post on <a id="r3ow" title="Writing as Real-Time Performance" href="http://snarkmarket.com/2009/3605">Writing as Real-Time Performance</a>, which looks at the Google Wave playback feature, is a brilliant example of how real-time technology turns familiar practices like writing inside out, and catapults us into new time trajectories. And, if you haven&#8217;t already seen Matt Jones of BERG&#8217;s brilliant look at <a href="http://berglondon.com/blog/2009/10/26/all-the-time-in-the-world-talk-at-design-by-fire-2009-utrecht/" target="_blank">&#8220;All the time in the world&#8221;</a> &#8211; from the &#8220;soft time&#8221; and &#8220;squishy time&#8221; of cell phone culture to their antecedents in real-time computing &#8211; go now! Also see Dan Hill&#8217;s work on <a href="http://cityofsound.com" target="_blank">&#8220;time based notation,&#8221;</a> and Tom Carden&#8217;s work for mysociety.org.</p>
<h3>Transactional Realities Between the &#8220;Asynchronous City&#8221; and the &#8220;Synchronous Internet of Things&#8221;</h3>
<p><span id="nqbb" title="Click to view full content"><span>Out of Toward the Sentient City&#8217;s five commissioned works, only </span></span><a href="http://www.sentientcity.net/exhibit/?p=31" target="_blank"><span>Trash Track</span></a><span id="n:_n" title="Click to view full content"><span> focused on the &#8220;synchronized Internet of Things.&#8221; Trash Track asks what we can learn from the aggregated data streams of &#8220;smart&#8221; trash about</span></span><span> the infamous path of trash from cities of privilege to rivers of want, rather than</span><span id="rkuc" title="Click to view full content"><span> exploring the particular transactional realities of a social network that linked people with their trash.</span></span></p>
<p><span id="n.6p" title="Click to view full content"><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/11/TrashTrack2.jpg"><img title="TrashTrack2" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/11/TrashTrack2-300x199.jpg" alt="TrashTrack2" width="300" height="199" /></a></span><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/11/trashtrack4.jpg"><img title="trashtrack4" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/11/trashtrack4-300x199.jpg" alt="trashtrack4" width="300" height="199" /></a></p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/11/trashtrack3.jpg"><img class="alignnone size-medium wp-image-4768" title="trashtrack3" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/11/trashtrack3-300x199.jpg" alt="trashtrack3" width="300" height="199" /></a><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/11/trashtrackpost.jpg"><img class="alignnone size-medium wp-image-4782" title="trashtrackpost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/11/trashtrackpost-300x199.jpg" alt="trashtrackpost" width="300" height="199" /></a></p>
<p><span id="ft58" title="Click to view full content"><span>The goals of Trash Track were, Assaf Biderman explained during his presentation:</span></span></p>
<p><strong><span id="f:mt" title="Click to view full content"><span>&#8220;to learn about the removal chain, to see if knowing more could promote behavioral change, and investigate if smart tagging could one day lead to 100% recycling.&#8221;</span></span></strong></p>
<p><strong><span id="f:mt" title="Click to view full content"> </span></strong><span>The team from SENSEable City Laboratory, MIT included: Carlo Ratti, Director; Assaf Biderman, Associate Director; Rex Britter, Advisor; Stephen Miles, Advisor; Kristian Kloeckl, Project Leader; Musstanser Tinauli, E Roon Kang, Alan Anderson, Avid Boustani, Natalia Duque Ciceri, Lorenzo Davolli, Samantha Earl, Lewis Girod, Sarabjit Kaur, Armin Linke, Eugenio Morello, Sarah Neilson, Giovanni de Niederhausern, Jill Passano, Renato Rinaldi, Francisca Rojas, Louis Sirota, Malima Wolf.</span></p>
<p><span>However, in his presentation Assaf also presented another project from SENSEable City Laboratory, in partnership with the City of Copenhagen, </span><a href="http://senseable.mit.edu/copenhagenwheel/" target="_blank">The Copenhagen Wheel</a>. <span>This project seems to work brilliantly at the intersection of the &#8220;asynchronous city&#8221; (Bleecker and Nova) and the &#8220;synchronized internet of things.&#8221; The &#8220;smart&#8221; wheel &#8211; a low-cost, open-source, human-electric hybrid &#8211; is:</span></p>
<p><strong>&#8220;an electric bicycle wheel that can be easily retrofitted into any regular bicycle and location and environmental sensors which are powered by the bike wheel and in turn provide data for a variety of applications.&#8221;</strong></p>
<p>This project, which aims to promote urban sustainability through smart biking, opens up many possibilities for a bottom-up architecture of participation for the sentient city (<a href="http://senseable.mit.edu/copenhagenwheel/">see video here</a>).</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/11/Screen-shot-2009-11-08-at-7.18.45-PM.png"><img class="alignnone size-medium wp-image-4838" title="Screen shot 2009-11-08 at 7.18.45 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/11/Screen-shot-2009-11-08-at-7.18.45-PM-300x218.png" alt="Screen shot 2009-11-08 at 7.18.45 PM" width="300" height="218" /></a></p>
<p><a href="http://www.andinc.org/v3/" target="_blank">Mark Shepard</a> describes something he calls &#8220;propagative urbanism&#8221;:</p>
<p><strong>&#8220;a way of thinking about shaping the experience of urban space in terms of a bottom-up, participatory approach to the evolution of cities.&#8221; </strong></p>
<p>And, in the most recent pamphlet in the <a href="http://www.situatedtechnologies.net/" target="_blank">Situated Technologies pamphlets </a><span><a href="http://www.situatedtechnologies.net/" target="_blank">series, #5, &#8220;Asynchronicity: Design Fictions for Asynchronous Urban Computing,&#8221; </a>Julian Bleecker and Nicolas Nova invert an emphasis in the so-called &#8220;real-time database enabled city&#8221; with its synchronized Internet of Things&#8230; and speculate on the existence of an &#8220;asynchronous city.&#8221; They &#8220;forecast situated technologies based on weak signals that show the importance of time on human perspectives.&#8221; They ask:</span></p>
<p><span><strong>&#8220;why, besides &#8216;operational efficiency,&#8217; would we want a ubiquitously computed environment?Â  What are the measures of &#8216;better&#8217; that we want to count as meaningful?&#8221;</strong></span></p>
<p><span>They explain:</span></p>
<p><span><strong>&#8220;&#8230;we are trying to think through what urbanwares might be &#8211; urban operating systems &#8211; if they were less about synchronization, top-down construction and connected channels of information and databases and so forth, and more about asynchronized, decentralized things. Software, data, time out of alignment, incongruities, tiles and imbrications of the geographic, spatial parameters into a delicious kind of lively peasant&#8217;s stew.&#8221;</strong></span></p>
<p><span>One takeaway, perhaps, from Toward the Sentient City is that it&#8217;s at the intersection of the &#8220;asynchronous city&#8221; and the &#8220;real-time database enabled city&#8221; where many new transactional realities of the sentient city will arise.</span></p>
]]></content:encoded>
			<wfw:commentRss>http://www.ugotrade.com/2009/11/09/toward-the-sentient-city-the-future-of-the-outernet-and-how-to-imagine-it/feed/</wfw:commentRss>
		<slash:comments>2</slash:comments>
		</item>
		<item>
		<title>ISMAR 2009: An Augmented Reality &#8220;Top Chef&#8221; Coopetition</title>
		<link>http://www.ugotrade.com/2009/10/24/ismar-2009-an-augmented-reality-top-chef-coopetition/</link>
		<comments>http://www.ugotrade.com/2009/10/24/ismar-2009-an-augmented-reality-top-chef-coopetition/#comments</comments>
		<pubDate>Sat, 24 Oct 2009 22:26:42 +0000</pubDate>
		<dc:creator><![CDATA[Tish Shute]]></dc:creator>
				<category><![CDATA[Android]]></category>
		<category><![CDATA[architecture of participation]]></category>
		<category><![CDATA[Augmented Reality]]></category>
		<category><![CDATA[Carbon Footprint Reduction]]></category>
		<category><![CDATA[digital public space]]></category>
		<category><![CDATA[Ecological Intelligence]]></category>
		<category><![CDATA[Energy Awareness]]></category>
		<category><![CDATA[Instrumenting the World]]></category>
		<category><![CDATA[internet of things]]></category>
		<category><![CDATA[iphone]]></category>
		<category><![CDATA[message brokers and sensors]]></category>
		<category><![CDATA[Mixed Reality]]></category>
		<category><![CDATA[mobile augmented reality]]></category>
		<category><![CDATA[mobile meets social]]></category>
		<category><![CDATA[Mobile Reality]]></category>
		<category><![CDATA[new urbanism]]></category>
		<category><![CDATA[Participatory Culture]]></category>
		<category><![CDATA[Smart Devices]]></category>
		<category><![CDATA[Smart Planet]]></category>
		<category><![CDATA[sustainable living]]></category>
		<category><![CDATA[sustainable mobility]]></category>
		<category><![CDATA[ubiquitous computing]]></category>
		<category><![CDATA[Web 2.0]]></category>
		<category><![CDATA[Web Meets World]]></category>
		<category><![CDATA[websquared]]></category>
		<category><![CDATA[World 2.0]]></category>
		<category><![CDATA[Acrossair]]></category>
		<category><![CDATA[AR Sketch]]></category>
		<category><![CDATA[AR Wave]]></category>
		<category><![CDATA[arduino]]></category>
		<category><![CDATA[ARhrrr]]></category>
		<category><![CDATA[augmented reality at VW]]></category>
		<category><![CDATA[avatars and people together in physical spaces]]></category>
		<category><![CDATA[Avilus]]></category>
		<category><![CDATA[Blair Macintyre]]></category>
		<category><![CDATA[Chetan Damani]]></category>
		<category><![CDATA[Christine Perey]]></category>
		<category><![CDATA[cloud computing]]></category>
		<category><![CDATA[Dirk Groten]]></category>
		<category><![CDATA[distributed computing]]></category>
		<category><![CDATA[eyewear for augmented reality]]></category>
		<category><![CDATA[geoAR]]></category>
		<category><![CDATA[Georg Klein]]></category>
		<category><![CDATA[Google Wave]]></category>
		<category><![CDATA[Green Tech AR Competition]]></category>
		<category><![CDATA[HMDs]]></category>
		<category><![CDATA[Humans as Sensors]]></category>
		<category><![CDATA[industrial augmented reality]]></category>
		<category><![CDATA[Institut Graphische Datenverarbeitung]]></category>
		<category><![CDATA[ISMAR 2009]]></category>
		<category><![CDATA[ISMAR 2010]]></category>
		<category><![CDATA[ISMAR09]]></category>
		<category><![CDATA[Jay Wright]]></category>
		<category><![CDATA[Joe Ludwig]]></category>
		<category><![CDATA[Junaio]]></category>
		<category><![CDATA[Layar]]></category>
		<category><![CDATA[Mark Billinghurst]]></category>
		<category><![CDATA[Markus Tripp]]></category>
		<category><![CDATA[Metaio]]></category>
		<category><![CDATA[Michael Goesele]]></category>
		<category><![CDATA[Microsoft and augmented reality]]></category>
		<category><![CDATA[Mobile Monday]]></category>
		<category><![CDATA[Mobilizy]]></category>
		<category><![CDATA[MoMo]]></category>
		<category><![CDATA[Noah Zerkin]]></category>
		<category><![CDATA[Noora Guldemond]]></category>
		<category><![CDATA[Ogmento]]></category>
		<category><![CDATA[open distributed AR]]></category>
		<category><![CDATA[open hardware]]></category>
		<category><![CDATA[Ori Inbar]]></category>
		<category><![CDATA[participatory sensing]]></category>
		<category><![CDATA[Pattie Maes]]></category>
		<category><![CDATA[Peter Meier]]></category>
		<category><![CDATA[Platial]]></category>
		<category><![CDATA[PTAM on an iphone]]></category>
		<category><![CDATA[Put a Spell]]></category>
		<category><![CDATA[Thomas Carpenter]]></category>
		<category><![CDATA[RoomWare]]></category>
		<category><![CDATA[Sean White]]></category>
		<category><![CDATA[sensor networks]]></category>
		<category><![CDATA[smart phones]]></category>
		<category><![CDATA[social augmented experiences]]></category>
		<category><![CDATA[social augmented realities]]></category>
		<category><![CDATA[standards for augmented reality]]></category>
		<category><![CDATA[Steven Feiner]]></category>
		<category><![CDATA[Technische Universitat Munchen]]></category>
		<category><![CDATA[The RoomWare Project]]></category>
		<category><![CDATA[The Zerkin Glove]]></category>
		<category><![CDATA[tracking and mapping in mobile augmented reality]]></category>
		<category><![CDATA[transactional cartography]]></category>
		<category><![CDATA[ubicomp]]></category>
		<category><![CDATA[Vernor Vinge]]></category>
		<category><![CDATA[virtual pets]]></category>
		<category><![CDATA[Volkswagen augmented reality group]]></category>
		<category><![CDATA[Vuzix]]></category>
		<category><![CDATA[Wave]]></category>
		<category><![CDATA[Wave enabled augmented reality]]></category>
		<category><![CDATA[Web 2.0 Summit]]></category>
		<category><![CDATA[Yuri van Geest]]></category>

		<guid isPermaLink="false">http://www.ugotrade.com/?p=4670</guid>
		<description><![CDATA[ISMAR 2009 was an extraordinary mix of high geek, academic eminence, gung-ho Dutch cowboy entrepreneurial spirit, German engineering and industry, brilliant artistry, and invention, all fueled by a sense &#8211; and, in the case of Diamond Sponsor Qualcomm, a very active presence &#8211; that the big technology players are waking up to augmented reality. In [&#8230;]]]></description>
				<content:encoded><![CDATA[<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/MetaioLayarpost.jpg"><img class="alignnone size-medium wp-image-4674" title="Metaio&amp;Layarpost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/MetaioLayarpost-300x199.jpg" alt="Metaio&amp;Layarpost" width="300" height="199" /></a><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/DirkseesDirkonJunaiopost.jpg"><img class="alignnone size-medium wp-image-4676" title="DirkseesDirkonJunaiopost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/DirkseesDirkonJunaiopost-300x199.jpg" alt="DirkseesDirkonJunaiopost" width="300" height="199" /></a></p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/dirkwatchesdirkvcupost.jpg"><img class="alignnone size-medium wp-image-4675" title="dirkwatchesdirkvcupost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/dirkwatchesdirkvcupost-300x199.jpg" alt="dirkwatchesdirkvcupost" width="300" height="199" /></a><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/metaiodinasaurpost.jpg"><img class="alignnone size-medium wp-image-4678" title="metaiodinasaurpost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/metaiodinasaurpost-299x201.jpg" alt="metaiodinasaurpost" width="299" height="201" /></a></p>
<p><a href="http://www.ismar09.org/" target="_blank">ISMAR 2009</a> was an extraordinary mix of high geek, academic eminence, gung-ho Dutch cowboy entrepreneurial spirit, German engineering and industry, brilliant artistry, and invention, all fueled by a sense &#8211; and, in the case of Diamond Sponsor Qualcomm, a very active presence &#8211; that the big technology players are waking up to augmented reality.</p>
<p>In the picture sequence above (click on photos to enlarge), <a href="http://twitter.com/metaioUS" target="_blank">Noora </a><span><span><a href="http://twitter.com/metaioUS" target="_blank">Guldemond</a></span></span><span><span>, <a href="http://www.metaio.com/" target="_blank">Metaio</a>, demonstrates <a href="http://www.junaio.com/" target="_blank">Junaio</a> (coming to an iPhone near you Nov 2nd) to <a href="http://twitter.com/dirkgroten" target="_blank">Dirk Groten</a>, CTO of <a href="http://layar.com/" target="_blank">Layar</a> (top left photo). One of the nice social features of Junaio is that users can share the 3D augmented scenes they have created. Noora is demoing this capability to </span></span><span><span>Dirk, and as you can see he cracks up when he sees the scene Noora has stored on her phone. Dirk and I both recognize that this cute little dinosaur augmentation (close up above on bottom left) must have been created by <a href="http://www.metaio.com/company/" target="_blank">Peter Meier, CTO of Metaio</a>, during the Interoperability and Standards workshop earlier that day. Metaio, it seems, were discussing standards while enjoying some 3D augmented back chat.
</span></span></p>
<p><span><span>Both Dirk and I were active participants in the workshop too. But little did we know that Peter Meier had introduced his little 3D dinosaur into our discussion while we diligently, and sometimes heatedly, debated the merits of XMPP, the Wave Federation Protocol, KML, ARML, VRML, X3D, and more! The photo I took is on the bottom right of the four pics above. It was probably taken very shortly after Peter&#8217;s augmented Junaio scene. Of course there is no little dinosaur in my pic of Dirk Groten with <a href="http://twitter.com/JoeLudwig" target="_blank">Joe Ludwig</a> and <a href="http://twitter.com/markustripp" target="_blank">Markus Tripp of Mobilizy</a>, who were discussing AR standards oblivious to Peter&#8217;s virtual pet in our midst.
</span></span></p>
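<p>For readers wondering what was concretely on the table in that debate: most of the geo-markup candidates boil down to some way of pinning content to coordinates. As a rough, hypothetical illustration (not anything drafted at the workshop &#8211; the name and coordinates below are made up), a KML placemark, the model that ARML extends, looks like this:</p>
<pre><code>&lt;kml xmlns="http://www.opengis.net/kml/2.2"&gt;
  &lt;Placemark&gt;
    &lt;!-- a geo-anchored point of interest --&gt;
    &lt;name&gt;Virtual pet&lt;/name&gt;
    &lt;Point&gt;&lt;coordinates&gt;-81.46,28.42,0&lt;/coordinates&gt;&lt;/Point&gt;
  &lt;/Placemark&gt;
&lt;/kml&gt;
</code></pre>
<p>What none of these formats yet standardizes is the AR-specific part &#8211; how 3D content, tracking, and interactivity attach to that point &#8211; which is exactly where the disagreements lived.</p>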
<p><span><span><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/MarkusTrippPeterMeier.jpg"><img class="alignnone size-medium wp-image-4685" title="MarkusTrippPeterMeier" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/MarkusTrippPeterMeier-300x199.jpg" alt="MarkusTrippPeterMeier" width="300" height="199" /></a><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/Thereisawillingnesstostandardizepost.jpg"><img class="alignnone size-medium wp-image-4686" title="Thereisawillingnesstostandardizepost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/Thereisawillingnesstostandardizepost-300x199.jpg" alt="Thereisawillingnesstostandardizepost" width="300" height="199" /></a><br />
</span></span></p>
<p><span><span>I must say I had noticed an impish look on Peter Meier&#8217;s face (see photo above on the left &#8211; Peter is wearing glasses and holding a phone). And Markus Tripp of Mobilizy revealed a little bit of gaming of his own when he let out that, in part, ARML is a provocation. But Peter was clearly unfazed and enjoying himself. Dirk, tasked to summarize our discussion, stalwartly maintained an optimistic but serious tone fitting for a standards discussion: &#8220;There is a willingness to standardize&#8230;,&#8221; he began (pic above on right &#8211; click to enlarge and read text).</span></span></p>
<p><span><span>But it was a little 3D dinosaur that, perhaps appropriately, had the last laugh. Fitting, as I am not sure whether anything anyone says about AR standards at the moment will hold up. But, as Ori commented in <a href="http://gamesalfresco.com/2009/10/23/ismar-2009-epilogue-a-new-augmented-reality-world-order/" target="_blank">his great post &#8211; an epilogue for ISMAR 2009</a>, the vibe was &#8220;Peace and Love&#8221; in AR Browser land (</span></span>although Chetan Damani of <a href="http://gamesalfresco.com/?s=%22acrossair%22" target="_blank">Across Air</a> was not in the standards discussion because he attended the UX/content? workshop instead)<span><span>. But as they say, &#8220;all&#8217;s fair in love and war.&#8221; And it is my feeling the games have barely begun! There are many players (<a href="http://www.youtube.com/watch?v=KI4lB00Ht9o&amp;feature=player_embedded#" target="_blank">virtual pets</a> included) waiting in the wings. I met some at ISMAR, and they are just itching to join the fray.
</span></span></p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/ARConsortiumpost2.jpg"><img class="alignnone size-medium wp-image-4701" title="ARConsortiumpost2" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/ARConsortiumpost2-300x188.jpg" alt="ARConsortiumpost2" width="300" height="188" /></a><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/coopetitionpost.jpg"><img class="alignnone size-medium wp-image-4690" title="coopetitionpost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/coopetitionpost-300x185.jpg" alt="coopetitionpost" width="300" height="185" /></a></p>
<p><span><span>Ori Inbar, <a href="http://ogmento.com/" target="_blank">Ogmento</a>, and Robert Rice, <a href="http://www.neogence.com/#/home" target="_blank">Neogence Enterprises</a>, both founders of the <a href="http://www.arconsortium.org/" target="_blank">AR Consortium</a>, made great efforts to set our young industry off on the right foot in the spirit of <a href="http://en.wikipedia.org/wiki/Coopetition" target="_blank">coopetition</a> (</span></span>a <a title="Neologism" href="http://en.wikipedia.org/wiki/Neologism">neologism</a> coined to describe <a title="Co-operation" href="http://en.wikipedia.org/wiki/Co-operation">cooperative</a> <a title="Competition" href="http://en.wikipedia.org/wiki/Competition">competition</a>)<span><span>. See </span></span><a href="http://curiousraven.squarespace.com/home/2009/10/23/ismar-09-observations-and-comments.html" target="_blank">Curious Raven for Robert&#8217;s conference observations</a>, and <span><span><a href="http://gamesalfresco.com/2009/10/23/ismar-2009-epilogue-a-new-augmented-reality-world-order/" target="_blank">Ori&#8217;s post on Games Alfresco</a> for more about </span></span>Mobile Augmented Reality at ISMAR 2009. The Mobile Augmented Reality Workshops were driven by an indomitable spokesperson for the new AR industry, <a href="http://www.perey.com/" target="_blank">Christine Perey</a>. Christine not only helped motivate discussion on the oxygen of the system &#8211; i.e., business value &#8211; but was also a very generous connector at the conference.</p>
<h3>What&#8217;s Next From Augmented Reality&#8217;s Top Chefs?</h3>
<p><span><span><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/Screen-shot-2009-10-24-at-7.12.35-PM.png"><img class="alignnone size-medium wp-image-4692" title="Screen shot 2009-10-24 at 7.12.35 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/Screen-shot-2009-10-24-at-7.12.35-PM-300x196.png" alt="Screen shot 2009-10-24 at 7.12.35 PM" width="300" height="196" /></a>
</span></span></p>
<p>As Ori pointed out, <a href="http://www.imdb.com/name/nm0218033/" target="_blank">Kent Demaine</a> of <a href="http://www.ooo-ii.com/" target="_blank">oooii</a> (pic above is from the oooii web site), the Minority Report VFX designer, was hanging out at ISMAR 2009, and he came to the panel I was on: &#8220;Augmented Reality in Sports, Entertainment and Advertising.&#8221; We chatted afterwards about instrumented environments and how they are such a key to developing interesting augmented experiences. I also mentioned how, back in the day, I was involved in some of the early development of motion control software. And it was great to hear Kent say they were still finding motion control cool at <a href="http://www.ooo-ii.com/" target="_blank">oooii</a>. As Ori notes, he is the &#8220;guy with the most enviable AR credentials in the world (the guy who designed VFX for Minority Report),&#8221; and <a href="http://www.ooo-ii.com/" target="_blank">oooii</a> is busy and hiring.</p>
<p>One of the highlights of the Arts, Media and Humanities track for me was meeting <a href="http://jarrellpair.com/" target="_blank">Jarrell Pair</a>. He really brought the best out in panelists with his well-tuned questions. The recording of ISMAR was comprehensive, and videos should be up next week. I will post the slides of my presentation, &#8220;The Next Wave of AR: Shared Augmented Realities and Remix Culture,&#8221; here on UgoTrade.</p>
<h3>&#8220;Mixed and Augmented Reality: &#8216;Scary and Wondrous&#8217;&#8221; &#8211; <a href="http://en.wikipedia.org/wiki/Vernor_Vinge" target="_blank">Vernor Vinge</a></h3>
<p><strong>&#8220;Imagine an environment where most physical objects know where they are, what they are, and can (in principle) network with any other object. With this infrastructure, reality becomes its own database. Multiple consensual virtual environments are possible, each oriented to the needs of its constituency. If we also have open standards, then bottom-up social networks and even bottom-up advertising become possible. Now imagine that in addition to sensors, many of these itsy-bitsy processors are equipped with effectors. Then the physical world becomes much more like a software construct. The possibilities are both scary and wondrous.&#8221;</strong> (<a href="http://en.wikipedia.org/wiki/Vernor_Vinge" target="_blank">Vernor Vinge</a>, intro to ISMAR 2009)</p>
<p>Vernor Vinge&#8217;s short intro to ISMAR 2009 (which can be downloaded with the <a href="http://www.ismar09.org/" target="_blank">ISMAR 2009 schedule here</a>) captures the essence of the &#8220;scary and wondrous&#8221; dawn of the age of ubiquitous computing and mixed and augmented reality. It is definitely worth a moment to download. The future of augmented and mixed realities, as Vernor Vinge points out, is tied up in a &#8220;tension between centralized and distributed computing&#8221; that &#8220;will continue long into the future.&#8221; One of my fascinations with Wave is that it offers a tantalizing opportunity to explore augmented reality in an open distributed architecture.</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/Screen-shot-2009-10-12-at-2.40.39-PM.png"><img class="alignnone size-medium wp-image-4586" title="Screen shot 2009-10-12 at 2.40.39 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/Screen-shot-2009-10-12-at-2.40.39-PM-300x154.png" alt="Screen shot 2009-10-12 at 2.40.39 PM" width="300" height="154" /></a></p>
<p>At ISMAR, I talked with as many people as possible about the AR Wave project &#8211; <a href="../../2009/10/13/ar-wave-layers-and-channels-of-social-augmented-experiences/" target="_blank">see my post here for more about Wave enabled AR</a>.Â  Many people were very enthusiastic to join the AR wave and the only thing I really lacked was about 100 invites to hand out!</p>
<h3>&#8220;Everything, Everywhere &#8211; making visible the invisible&#8221;</h3>
<p>Some of the areas I would have liked to see given more attention at ISMAR were sensor networks, data curation, and user experience. Not that these areas were entirely neglected: Pattie Maes of MIT was a keynote speaker, and Mark Billinghurst presented some fascinating work on social augmented experiences and user experience. I highly recommend catching up on these and other ISMAR presentations when the videos go up.</p>
<p><a href="http://www1.cs.columbia.edu/~swhite/" target="_blank"><img class="alignnone size-medium wp-image-4716" title="Screen shot 2009-10-25 at 12.28.25 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/Screen-shot-2009-10-25-at-12.28.25-PM-300x57.png" alt="Screen shot 2009-10-25 at 12.28.25 PM" width="300" height="57" /></a></p>
<p>And, I was very happy to meet and talk to <a href="http://www1.cs.columbia.edu/~swhite/" target="_blank">Sean White</a> whose work at Columbia University is one of my inspirations (for more <a href="http://www1.cs.columbia.edu/~swhite/" target="_blank">about Sean&#8217;s work see here</a> or click image above):</p>
<p><strong>&#8220;the confluence of powerful connected mobile devices, advances in computer vision and sensing, and techniques such as augmented reality (AR) enables exciting new opportunities for interacting with this hidden network of dynamic information and shifts the locus of interaction from the desktop computer to the world around us&#8221;</strong></p>
<p>And I had several very interesting conversations at ISMAR about developing social augmented experiences that connect us to a physical world that is becoming &#8220;much more like a software construct&#8221; (Vernor Vinge). Dirk Groten, CTO of Layar, mentioned a few interesting projects Layar has up their sleeves, including something Layar may be cooking up with <a href="http://www.roomwareproject.org/" target="_blank">The RoomWare Project</a>.</p>
<p><span><span><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/Screen-shot-2009-10-24-at-10.03.00-PM.png"><img class="alignnone size-medium wp-image-4697" title="Screen shot 2009-10-24 at 10.03.00 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/Screen-shot-2009-10-24-at-10.03.00-PM-300x231.png" alt="Screen shot 2009-10-24 at 10.03.00 PM" width="300" height="231" /></a><br />
</span></span><br />
The picture above is of RoomWare&#8217;s Social RFID Installation for Media Plaza in Utrecht (<a href="http://blog.roomwareproject.org/2008/10/06/social-rfid-installation-for-media-plaza/">read more here</a>).</p>
<h3>Demos Galore!</h3>
<p>In the demo rooms, <a rel="cc:attributionURL" href="http://augmentation.wordpress.com/2009/10/24/ismar-ismar-ismar-where-to-start/">Noah Zerkin</a> (pic below left) pretty much single-handedly carried the AR flag for a growing community of augmented reality Makers and Hackers. His presence was much appreciated, and he tirelessly demoed <a href="http://zerkinglove.com/" target="_blank">The Zerkin Glove</a>. See <a href="http://augmentation.wordpress.com/2009/10/24/ismar-ismar-ismar-where-to-start/" target="_blank">the first of what may be several posts from Noah on ISMAR here</a>.</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/noah2post.jpg"><img class="alignnone size-medium wp-image-4700" title="noah2post" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/noah2post-300x199.jpg" alt="noah2post" width="300" height="199" /></a><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/TishVuzixgogglespost.jpg"><img class="alignnone size-medium wp-image-4704" title="Tish&amp;Vuzixgogglespost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/TishVuzixgogglespost-300x199.jpg" alt="Tish&amp;Vuzixgogglespost" width="300" height="199" /></a></p>
<p>And I got to try out the Vuzix goggles (picture above on right). This was my first experience playing an AR game that was smart about real-world gravity. It&#8217;s &#8220;an <span>augmented reality</span> marble game that uses gravity as a <span>game controller</span>&#8221; &#8211; see <a href="http://gamesalfresco.com/2009/08/09/augmented-reality-has-gained-gravity/" target="_blank">Ori Inbar&#8217;s write-up here</a>. It was a very compelling experience, and I have to say I didn&#8217;t really notice the shortcomings of the Vuzix goggles while I was absorbed in the game. And I turned out to be quite good at the game too. It is intuitive, unlike the kind of rule-based games I never have time to learn properly. But what is so special about this project is that the tools it is built with are open, available to all, and affordable (see this <a href="http://gamesalfresco.com/2009/08/09/augmented-reality-has-gained-gravity/" target="_blank">list on Games Alfresco</a>).</p>
<p>It was a great pleasure to meet <a href="http://www1.cs.columbia.edu/~feiner/" target="_blank">Prof. Steven Feiner</a> (pictured below on the left), who heads Columbia University&#8217;s brilliant AR research team at <a href="http://graphics.cs.columbia.edu/top.html" target="_blank">The Columbia University Graphics and User Interfaces Lab</a>.</p>
<p>Ori Inbar (pic below on right) also spent a lot of time in the demo room showing off Ogmento&#8217;s lovely AR learning game that delighted attendees, <a href="http://ogmento.com/"><strong>&#8220;Put a Spell: Learn to Spell with Augmented Reality.&#8221;</strong></a></p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/TishVuzixpost.jpg"><img class="alignnone size-medium wp-image-4703" title="TishVuzixpost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/TishVuzixpost-199x300.jpg" alt="TishVuzixpost" width="199" height="300" /></a><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/Ogmentopost.jpg"><img class="alignnone size-medium wp-image-4702" title="Ogmentopost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/Ogmentopost-199x300.jpg" alt="Ogmentopost" width="199" height="300" /></a></p>
<p>For a round-up of what&#8217;s next for augmented reality head-mounted displays, check out <a href="http://gamesalfresco.com/2009/10/23/ismar-2009-epilogue-a-new-augmented-reality-world-order/" target="_blank">Games Alfresco here</a>, and Thomas Carpenter&#8217;s excellent review of the <a href="http://thomaskcarpenter.com/2009/10/21/ismar09-hmd-review/">head-mounted displays</a>.</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/GeorgandBlairpost.jpg"><img class="alignnone size-medium wp-image-4712" title="GeorgandBlairpost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/GeorgandBlairpost-300x199.jpg" alt="GeorgandBlairpost" width="300" height="199" /></a><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/cypherpost.jpg"><img class="alignnone size-medium wp-image-4713" title="cypherpost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/cypherpost-300x199.jpg" alt="cypherpost" width="300" height="199" /></a></p>
<p><strong>Ori Inbar on Games Alfresco asks: is Microsoft &#8220;the new big player to watch&#8221;?</strong> &#8220;<a href="http://www.robots.ox.ac.uk/%7Egk/" target="_blank">Georg Klein</a>, inventor of <a href="http://www.youtube.com/watch?v=pBI5HwitBX4" target="_blank">PTAM-on-an-iPhone</a> (and the smartest Computer Vision guy on the block)&#8221; has joined Microsoft to make Mobile AR.</p>
<p>The picture on the left above shows Georg trying out <a href="http://www.youtube.com/watch?v=Cix3Ws2sOsU&amp;feature=player_embedded" target="_blank">ARhrrr</a> with Blair MacIntyre. And on the right, Blair is demoing his marker card pack to Senior Vice President of Cypher Entertainment, David Elmekies. Yes, ISMAR was abuzz with demos. See <a href="http://compscigail.blogspot.com/2009/10/ismar09-few-demos.html" target="_blank">this post</a> from Gail Carmichael for more video demos.</p>
<h3>Next Year ISMAR 2010 in Korea!</h3>
<p><span><span><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/ISMARBanquet.jpg"><img class="alignnone size-medium wp-image-4693" title="ISMARBanquet" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/ISMARBanquet-300x199.jpg" alt="ISMARBanquet" width="300" height="199" /></a></span></span></p>
<p>At the banquet, I managed to find a seat at a table with Sean White (at left in photo above, with Christine Perey to his right) and the Columbia University team. The banquet culminated with the &#8220;Past and Future of ISMAR&#8221; panel, chaired valiantly by Jay Wright of Qualcomm. We were asked to offer our input for ISMAR 2010. I offered up an idea that I have been nurturing for a while now: to stage a &#8220;Green Tech AR Competition.&#8221; Perhaps, I suggested, we could base the competition around a conference (ISMAR 2010 in Korea?) and set up a target-rich, instrumented environment for the occasion. I think the Arduino open hardware community and AR developers have a synergy that is just waiting to be explored! And, if we add the innovators of data curation to the mix, e.g., Pachube, AMEE, and Path Intelligence&#8230; (Markus Tripp left ISMAR to speak on a <a href="http://www.web2summit.com/web2009" target="_blank">Web 2.0 Summit</a> panel, <a href="http://www.readwriteweb.com/archives/humans_as_sensors.php" target="_blank">&#8220;Humans as Sensors,&#8221;</a> which also included Path Intelligence, Deborah Estrin on <a href="http://research.cens.ucla.edu/people/estrin/" target="_blank">&#8220;participatory sensing,&#8221;</a> and the brilliant work of <a href="http://twitter.com/dianneisnor" target="_blank">Di-Ann Eisnor</a>, <a href="http://platial.com/" target="_blank">Platial</a>, on &#8220;Transactional Cartography&#8221;). Anyway, a big Green Tech AR competition could get people working together across the broad spread of AR terrain on some of the sticky problems of user experience. And, with a high level of support from smartphone companies, HMD manufacturers, and the chip makers, we just might come up with some extraordinary magic.</p>
<p>The devil, of course, will be in the details. But a competition like this could not only motivate key players to come together in the spirit of coopetition but also be an opportunity to show the world the power of AR to make visible the invisible ecosystems that are so important to the health of our planet.</p>
<p>One of the notable presences at ISMAR 2009 was the Qualcomm team. Jay Wright&#8217;s presentation (an exclusive for ISMAR) not only outlined AR for 2012, but Jay also talked about some &#8220;close to the metal&#8221; innovation that we will see from Qualcomm very, very soon! I had some time in the press room with Jay and his team, prompted by <a href="http://www.mobilemonday.nl/" target="_blank">MoMo&#8217;s</a> Yuri van Geest. When I twittered about Qualcomm&#8217;s presentation at ISMAR, Yuri replied:</p>
<p><a href="http://twitter.com/vanGeest" target="_blank">vangeest</a>: &#8220;<a href="http://twitter.com/TishShute" target="_blank">@tishshute</a> good stuff, hopefully you will integrate the neat new solutions and ideas in your talk in November ;)&#8221;</p>
<p>I will be presenting at <a href="http://www.mobilemonday.nl/" target="_blank">MoMo #13</a> on AR, open AR, and the future of AR and the GeoWeb, and hopefully I will bring some good news from Qualcomm too. Anyway, Jay seemed to like the idea of a Green Tech AR Competition, even though I did stress that I thought it needed some serious sponsorship and BIG prizes.</p>
<h3>Where&#8217;s the beef? Tracking and Mapping at ISMAR 2009</h3>
<p>On the flight from NYC to Orlando and ISMAR &#8217;09, I dozed (I had been up late preparing my presentation) and watched the Dew Tour pro skateboard competition and Top Chef on the Food Channel. In this particular episode of Top Chef, the aspiring chefs were all given a brown bag of ingredients by an already famous chef, who then judged whether the contenders managed to make a delicious meal with their allotment, which was notably lacking in key ingredients of haute cuisine.</p>
<p>This metaphor of trying to cook up a great meal while perhaps missing the staples is apt for the current early stage of commercial augmented reality. And when I arrived in Orlando, not only were the Dew Tour pro skateboarders staying at the same hotel as ISMAR, but ISMAR itself felt remarkably like an Augmented Reality Top Chef Coopetition.</p>
<p>Much of ISMAR was dedicated to the task of providing the meat and potatoes of Augmented Reality &#8211; solutions to mobile tracking, mapping, and registration &#8211; particularly in the Science and Technology track.</p>
<p>Industrial and military augmented reality solutions, I found out, typically solve the tracking problem by using fixed mounts, which clearly wouldn&#8217;t translate well into the AR-everywhere-with-everything-mobile experience that consumer culture expects.</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/DanielPustkapost.jpg"><img class="alignnone size-medium wp-image-4679" title="DanielPustkapost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/DanielPustkapost-300x199.jpg" alt="DanielPustkapost" width="300" height="199" /></a><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/Screen-shot-2009-10-25-at-2.41.56-PM.png"><img class="alignnone size-medium wp-image-4726" title="Screen shot 2009-10-25 at 2.41.56 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/Screen-shot-2009-10-25-at-2.41.56-PM-300x208.png" alt="Screen shot 2009-10-25 at 2.41.56 PM" width="300" height="208" /></a></p>
<p><em>In the picture on the left, Fabian Doil stands by the VW engine that provided some of the outdoor targets for the ISMAR tracking competition. On the right is a picture from VW&#8217;s presentation on their research and development of AR.</em></p>
<p>I followed the tracking contest, organized by Daniel Pustka and Fabian Doil of Volkswagen, quite closely, and I learned a lot in the process. While it is clear there has been progress in AR mapping and tracking, we still have a ways to go.</p>
<p>But hanging around the Tracking Competition was a good way to find out the state of play of this crucial part of the AR dream. For example, a little tidbit I learned is that <a href="http://www.gris.informatik.tu-darmstadt.de/~mgoesele/" target="_blank">Michael Goesele</a>, who has been reconstructing &#8220;high-quality geometry models from images collected from the internet (so called community photo collections, CPC),&#8221; is soon to be at the <a href="http://www.ini-graphics.net/ini-graphicsnet/members/fraunhofer-institut-fuer-graphische-datenverarbeitung-igd.html" target="_blank">Institut Graphische Datenverarbeitung</a>, where top contenders in the tracking contest &#8211; Harald Wuest and Folker Weintipper (in the foreground of the photo at the left and right respectively) &#8211; are also to be found. [Update: Harald and Folker were the winning team &#8211; <a href="http://docs.google.com/gview?a=v&amp;pid=gmail&amp;attid=0.1&amp;thid=1248dd2927becb21&amp;mt=application%2Fpdf&amp;url=http%3A%2F%2Fmail.google.com%2Fmail%2F%3Fui%3D2%26ik%3De77cfddae9%26view%3Datt%26th%3D1248dd2927becb21%26attid%3D0.1%26disp%3Dattd%26zw&amp;sig=AHBy-hbcqUsaRNjbqpHO8vAF_vJqfDrMig" target="_blank">see here for details of scoring and results</a>!] Otto Korkalo and Tuomas Kantonen of VTT Finland&#8217;s augmented reality team are in the background. They have been working on the joint IBM, Nokia, and VTT project that brings <a href="http://www.marketwatch.com/story/researchers-from-ibm-nokia-and-vtt-bring-avatars-and-people-together-for-virtual-meetings-in-physical-spaces-2009-10-19" target="_blank">Avatars and People Together for Virtual Meetings in Physical Spaces</a>.</p>
<p>The picture on the right is of another team that was doing very well. If my notes serve me well (and please forgive me if they don&#8217;t &#8211; I came back with my card wallet overflowing!), the photo on the right shows Christian Waechter (on the left) and Peter Keitler (on the right) of the <a href="http://portal.mytum.de/welcome" target="_blank">Technische Universitat Munchen</a>.</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/trackingcompetitionpost.jpg"><img class="alignnone size-medium wp-image-4672" title="trackingcompetitionpost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/trackingcompetitionpost-300x199.jpg" alt="trackingcompetitionpost" width="300" height="199" /></a><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/Trackingcompetition2post.jpg"><img class="alignnone size-medium wp-image-4681" title="Trackingcompetition2post" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/Trackingcompetition2post-300x199.jpg" alt="Trackingcompetition2post" width="300" height="199" /></a></p>
<p>Germany is certainly leading the way in industrial AR. And I learned how small businesses like Metaio get to work with top research institutions and big companies like VW, thanks to a very strong German funding program for AR and VR. The current iteration of a series of funding programs is called <a href="http://www.avilus.de/" target="_blank">Avilus</a>. Avilus is putting 42 million euros into AR and VR this year alone (click on the slide below to see more about Avilus).</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/Screen-shot-2009-10-24-at-1.08.48-AM.png"><img title="Screen shot 2009-10-24 at 1.08.48 AM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/Screen-shot-2009-10-24-at-1.08.48-AM-300x212.png" alt="Screen shot 2009-10-24 at 1.08.48 AM" width="300" height="212" /></a><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/Screen-shot-2009-10-24-at-2.04.50-AM.png"><img class="alignnone size-medium wp-image-4673" title="Screen shot 2009-10-24 at 2.04.50 AM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/Screen-shot-2009-10-24-at-2.04.50-AM-300x202.png" alt="Screen shot 2009-10-24 at 2.04.50 AM" width="300" height="202" /></a></p>
<p>I wish we had the equivalent of Avilus here in the US, but there is no such program for AR here, and no AR seems to be under development by the US car industry either. Look at the slide above to get a taste of some of the cool stuff Metaio and other small AR and VR businesses do for VW through the Avilus project.</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/VWtrackinggudrunpost.jpg"><img class="alignnone size-medium wp-image-4682" title="VWtrackinggudrunpost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/VWtrackinggudrunpost-300x199.jpg" alt="VWtrackinggudrunpost" width="300" height="199" /></a></p>
<p>I also got to meet many people from one of the world&#8217;s most important AR hubs &#8211; the Department of Informatics, <a href="http://portal.mytum.de/welcome" target="_blank">Technische Universitat Munchen</a> &#8211; including Prof. Gudrun Klinker, on the far right in the pic above. From left to right: Fabian Doil (VW, co-organizer of the contest), Sebastian Lieberknecht, Selim Ben Himane (Metaio), and Tobias Eble (Metaio). Prof. Klinker is the engine behind much of German innovation in AR.</p>
<p>Metaio was one of the few teams to rely mainly on markerless tracking, which in this contest was very challenging because of the very different light conditions (see pics below) between the windowless interior and the dazzling Florida sunshine outside (the pic on the right shows targets under ideal lighting conditions). Many people in the US may be familiar with Metaio&#8217;s consumer applications, like Junaio, but thanks to Germany&#8217;s efforts to nurture augmented and virtual reality they are also respected software developers in industrial AR. And I suspect that Metaio will spearhead markerless tracking in consumer AR too.</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/Trackingcompetition5post.jpg"><img class="alignnone size-medium wp-image-4740" title="Trackingcompetition5post" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/Trackingcompetition5post-300x199.jpg" alt="Trackingcompetition5post" width="300" height="199" /></a><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/Screen-shot-2009-10-25-at-7.47.44-PM.png"><img class="alignnone size-medium wp-image-4745" title="Screen shot 2009-10-25 at 7.47.44 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/Screen-shot-2009-10-25-at-7.47.44-PM-300x229.png" alt="Screen shot 2009-10-25 at 7.47.44 PM" width="300" height="229" /></a></p>
<p>This post, as usual, has already expanded into something much longer than I originally intended &#8211; pretty typical for me! There is much I have not been able to cover, including some of the interesting contributions by augmented reality artists at ISMAR &#8211; again, I recommend the upcoming videos.</p>
<p>But I cannot end without a hat tip to Oriel, Nate et al., who won the best student paper award for AR Sketch &#8211; again, please <a href="http://gamesalfresco.com/2009/10/23/ismar-2009-epilogue-a-new-augmented-reality-world-order/" target="_blank">see Games Alfresco for more on this</a> (pic below from Games Alfresco). AR Sketch, Ori notes, is featured &#8220;in our <a href="http://gamesalfresco.com/2009/10/16/ismar-2009-sketch-and-shape-recognition-preview-from-ben-gurion-university/" target="_self">top post</a> and popular <a href="http://www.youtube.com/watch?v=M4qZ0GLO5_A" target="_blank">video</a>.&#8221; And:</p>
<p><strong>&#8220;Their work is revolutionizing the AR world by avoiding the need to print markers &#8211; or any images whatsoever.&#8221;</strong></p>
<p><strong><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/Screen-shot-2009-10-25-at-1.58.35-PM1.png"><img class="alignnone size-medium wp-image-4719" title="Screen shot 2009-10-25 at 1.58.35 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/Screen-shot-2009-10-25-at-1.58.35-PM1-300x223.png" alt="Screen shot 2009-10-25 at 1.58.35 PM" width="300" height="223" /></a><br />
</strong></p>
]]></content:encoded>
			<wfw:commentRss>http://www.ugotrade.com/2009/10/24/ismar-2009-an-augmented-reality-top-chef-coopetition/feed/</wfw:commentRss>
		<slash:comments>9</slash:comments>
		</item>
		<item>
		<title>AR Wave: Layers and Channels of Social Augmented Experiences</title>
		<link>http://www.ugotrade.com/2009/10/13/ar-wave-layers-and-channels-of-social-augmented-experiences/</link>
		<comments>http://www.ugotrade.com/2009/10/13/ar-wave-layers-and-channels-of-social-augmented-experiences/#comments</comments>
		<pubDate>Tue, 13 Oct 2009 18:52:42 +0000</pubDate>
		<dc:creator><![CDATA[Tish Shute]]></dc:creator>
				<category><![CDATA[Ambient Devices]]></category>
		<category><![CDATA[Ambient Displays]]></category>
		<category><![CDATA[Android]]></category>
		<category><![CDATA[architecture of participation]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Augmented Reality]]></category>
		<category><![CDATA[Carbon Footprint Reduction]]></category>
		<category><![CDATA[culture of participation]]></category>
		<category><![CDATA[digital public space]]></category>
		<category><![CDATA[Energy Awareness]]></category>
		<category><![CDATA[Instrumenting the World]]></category>
		<category><![CDATA[internet of things]]></category>
		<category><![CDATA[iphone]]></category>
		<category><![CDATA[message brokers and sensors]]></category>
		<category><![CDATA[mirror worlds]]></category>
		<category><![CDATA[Mixed Reality]]></category>
		<category><![CDATA[mobile augmented reality]]></category>
		<category><![CDATA[mobile meets social]]></category>
		<category><![CDATA[Mobile Reality]]></category>
		<category><![CDATA[new urbanism]]></category>
		<category><![CDATA[Paticipatory Culture]]></category>
		<category><![CDATA[privacy and online identity]]></category>
		<category><![CDATA[social gaming]]></category>
		<category><![CDATA[social media]]></category>
		<category><![CDATA[sustainable living]]></category>
		<category><![CDATA[sustainable mobility]]></category>
		<category><![CDATA[ubiquitous computing]]></category>
		<category><![CDATA[virtual communities]]></category>
		<category><![CDATA[Web Meets World]]></category>
		<category><![CDATA[websquared]]></category>
		<category><![CDATA[World 2.0]]></category>
		<category><![CDATA[Amphibious Architecture]]></category>
		<category><![CDATA[AR Blip]]></category>
		<category><![CDATA[AR Browser]]></category>
		<category><![CDATA[AR Wave]]></category>
		<category><![CDATA[augmentaion]]></category>
		<category><![CDATA[augmented reality search]]></category>
		<category><![CDATA[Blair Macintyre]]></category>
		<category><![CDATA[Channels and Social Augmented Realities]]></category>
		<category><![CDATA[citi sensing]]></category>
		<category><![CDATA[citizen sensing]]></category>
		<category><![CDATA[Clayton Lilly]]></category>
		<category><![CDATA[cybernetics vs ecology and human waste]]></category>
		<category><![CDATA[distributed]]></category>
		<category><![CDATA[eco mapping]]></category>
		<category><![CDATA[Gene Becker]]></category>
		<category><![CDATA[geoAR]]></category>
		<category><![CDATA[geospatial web]]></category>
		<category><![CDATA[geospatial web and augmented reality]]></category>
		<category><![CDATA[Goggle Wave Federation Protocol]]></category>
		<category><![CDATA[Google Wave]]></category>
		<category><![CDATA[Google Wave as an AR enabler]]></category>
		<category><![CDATA[Google Wave enable augmented reality]]></category>
		<category><![CDATA[Google Wave Protocols]]></category>
		<category><![CDATA[green tech augmented reality]]></category>
		<category><![CDATA[immersive sight]]></category>
		<category><![CDATA[Jeremy Hight]]></category>
		<category><![CDATA[Joe Lamantia]]></category>
		<category><![CDATA[Layers]]></category>
		<category><![CDATA[layers and channels of augmented reality]]></category>
		<category><![CDATA[Life Clipper]]></category>
		<category><![CDATA[life streaming]]></category>
		<category><![CDATA[location based media]]></category>
		<category><![CDATA[location based services]]></category>
		<category><![CDATA[locative media]]></category>
		<category><![CDATA[locative narratives]]></category>
		<category><![CDATA[Mannahatta]]></category>
		<category><![CDATA[map based augmentation]]></category>
		<category><![CDATA[mapping]]></category>
		<category><![CDATA[modulated mapping]]></category>
		<category><![CDATA[modulated napping]]></category>
		<category><![CDATA[multi-user]]></category>
		<category><![CDATA[narrative archaeology]]></category>
		<category><![CDATA[Natural Fuse]]></category>
		<category><![CDATA[neogeography]]></category>
		<category><![CDATA[networked urbanism]]></category>
		<category><![CDATA[non euclidian geometry]]></category>
		<category><![CDATA[open augmented reality framework]]></category>
		<category><![CDATA[Seanseable Labs]]></category>
		<category><![CDATA[sensor networks]]></category>
		<category><![CDATA[shared augmented realities]]></category>
		<category><![CDATA[social augmented experiences]]></category>
		<category><![CDATA[social augmented reality experiences]]></category>
		<category><![CDATA[sound augmentation]]></category>
		<category><![CDATA[Thomas K. Carpenter]]></category>
		<category><![CDATA[Thomas Wrobel]]></category>
		<category><![CDATA[Trash Track]]></category>
		<category><![CDATA[ubicomp]]></category>
		<category><![CDATA[virtual reality]]></category>
		<category><![CDATA[Wave as a platform for augmented reality]]></category>
		<category><![CDATA[Wave Blip]]></category>
		<category><![CDATA[Wave Bots]]></category>
		<category><![CDATA[Wave playback]]></category>
		<category><![CDATA[Wave playback feature]]></category>
		<category><![CDATA[Wave Robots]]></category>

		<guid isPermaLink="false">http://www.ugotrade.com/?p=4585</guid>
		<description><![CDATA[It is now nearly two weeks since the Google Wave preview launch and I am happy to say we have some AR Wave news. The diagram above shows Thomas Wrobel&#8217;s basic concept for a distributed, multi-user, open augmented reality framework based on the Google Wave Federation Protocol and servers (click on the image to see [&#8230;]]]></description>
				<content:encoded><![CDATA[<p><a href="http://lostagain.nl/tempspace/PrototypeDiagram3_wave.html" target="_blank"><img class="alignnone size-medium wp-image-4586" title="Screen shot 2009-10-12 at 2.40.39 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/Screen-shot-2009-10-12-at-2.40.39-PM-300x154.png" alt="Screen shot 2009-10-12 at 2.40.39 PM" width="300" height="154" /></a></p>
<p>It is now nearly two weeks since the <a href="http://wave.google.com/" target="_blank">Google Wave</a> preview launch and I am happy to say we have some AR Wave news. The diagram above shows Thomas Wrobel&#8217;s basic concept for a distributed, multi-user, open augmented reality framework based on the <a href="http://www.waveprotocol.org/" target="_blank">Google Wave Federation Protocol</a> and servers (click on the image to see the dynamic annotated sketch, <a href="http://lostagain.nl/tempspace/PrototypeDiagram3_wave.html" target="_blank">or here</a>).</p>
<p>Even in the short time we have had to explore Wave, some very exciting possibilities are becoming clear. Thomas puts some of the virtues of Wave as an AR enabler succinctly when he writes:</p>
<p><strong>&#8220;Wave allows the advantages of both real-time communication and persistent hosting of data. It is both like IRC and like a Wiki. It allows anyone to create a Wave and share it with anyone else. It allows Waves to be edited at the same time by many people, or used as a private reference for just one person.</strong></p>
<p><strong>These are all incredibly useful properties for any AR experience; what&#8217;s more, Wave is open. Anyone can make a server or client for Wave. Better yet, these servers will exchange data with each other, providing a seamless world for the user&#8230; a single login will let you browse the whole world of public waves, regardless of who&#8217;s providing or hosting the data. Wave is also quite scalable and secure&#8230; data is only exchanged when necessary, and will stay local if no one else needs to view it.</strong></p>
<p><strong>Wave allows bots to run on it&#8230; allowing blips in a wave to be automatically updated, created or destroyed based on any criteria the coders choose. Wave even allows the playback of all edits since the wave was created.</strong></p>
<p><strong>For all these reasons and more, Wave makes a great platform for AR.&#8221;</strong></p>
<p>There will be much more coming soon on Wave-enabled AR, because the Google Wave invites have now begun to flow out to a wider community. This week, many of our small ad-hoc group looking at the development challenges and implications of Google Wave for AR actually got into Wave for the first time.</p>
<p>Many thanks to all the people who have contributed to this discussion so far including: Thomas Wrobel, Thomas K. Carpenter, Jeremy Hight, Joe Lamantia, Clayton Lilly, Gene Becker and many others.</p>
<p>We will be setting up some public AR Framework Development Waves this week. If you have any trouble finding them, or adding yourself to them, please add Thomas and me to your contact lists. I am tishshute@googlewave.com and Thomas is darkflame@googlewave.com. The first two waves are currently called:</p>
<p><strong><br />
AR Wave: Augmented Reality Wave Framework Development</strong> (developer forum)</p>
<p><strong>AR Wave: Augmented Reality Wave Development</strong> (for general discussion)</p>
<p>The discussion so far has been in two areas. On the one hand, it is gear-heady and focused on the <a href="http://www.waveprotocol.org/" target="_blank">Google Wave Federation Protocol</a>, code, development challenges, and interfacing to mobile, while on the other hand people have been looking at use cases and questions of user experience.</p>
<p>Distributed &#8220;shared augmented realities,&#8221; or &#8220;social augmented experiences&#8221; &#8211; which not only allow mashups &amp; multisource data flows, but also dynamic overlays (not limited to 3D), created by users, linked to location/place/time, and distributed to other users who wish to engage with the experience by viewing and co-creating elements for their own goals and benefit &#8211; are something very new for us to think about.</p>
<p>As Joe Lamantia puts it:</p>
<p><strong>&#8220;there&#8217;s a feedback loop between which interactions are made easy by any given combo of device / hardware / software / connectivity, and the ways that people really work in real life (without any mediation / permeation by tech).&#8221;</strong></p>
<p>Joe Lamantia, whose term <strong>&#8220;social augmented experiences&#8221;</strong> I borrow for this post&#8217;s title, has done some thinking about <strong>&#8220;concepts and models for understanding and contributing to shared augmented experiences, such as the social scales for interaction, and the challenges attendant to designing such interactions.&#8221;</strong> Check out <a href="http://www.joelamantia.com/" target="_blank">Joe Lamantia&#8217;s blog</a> for more on this later this week.</p>
<p>It is very helpful, as Joe points out, to shift the focus back and forth between the experience and the medium.</p>
<p>It is super exciting to have clear evidence that shared augmented realities are no longer merely possible, but highly probable and actually do-able now.</p>
<p>I should be absolutely clear about what Google Wave does to enable AR, because obviously Wave plays no role in solving image recognition and tracking/registration issues. But, for example, Wave protocols and servers do provide a means to exchange, edit, and read data, and that enables distributed, social augmented realities.</p>
<p>Thomas explains how the newly named &#8220;AR Blip&#8221; works:</p>
<p><strong>&#8220;An AR Blip is simply a Blip in a wave containing AR data. Typically this would be the positional and URL data telling an AR browser to position a 3D object at a location in space.</strong></p>
<p><strong>In more generic terms, an AR Blip allows data of various forms (meshes, text, sound) to be given a real-world position.&#8221;</strong></p>
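<p>To make that concrete, here is a minimal sketch (in Python, purely for illustration) of what an AR Blip&#8217;s payload might look like. The field names and the JSON encoding are my own invention &#8211; nothing here comes from the Wave protocol or from the AR Wave discussion itself:</p>

```python
from dataclasses import dataclass, asdict
import json

@dataclass
class ARBlip:
    """One hypothetical AR Blip: a real-world anchor for a piece of media.

    These field names are illustrative only, not part of any published schema.
    """
    lat: float        # WGS84 latitude of the anchor point
    lon: float        # WGS84 longitude
    alt_m: float      # altitude in metres above ground
    media_type: str   # "mesh", "text", "sound", ...
    media_url: str    # where an AR browser would fetch the asset

    def to_blip_payload(self) -> str:
        # The blip's AR data could travel as a small JSON document.
        return json.dumps(asdict(self))

# A 3D mesh anchored at a street corner (made-up coordinates and URL):
statue = ARBlip(lat=40.758, lon=-73.9855, alt_m=2.0,
                media_type="mesh", media_url="http://example.com/statue.obj")
payload = statue.to_blip_payload()
decoded = json.loads(payload)   # what a receiving client would parse
```

<p>The point is only that an AR Blip is ordinary structured data: anything that can round-trip through a wave can carry a position plus a pointer to media.</p>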
<p>I have mentioned in other posts (<a href="http://www.ugotrade.com/2009/08/19/everything-everywhere-thomas-wrobels-proposal-for-an-open-augmented-reality-network/" target="_blank">here</a> and <a href="http://www.ugotrade.com/2009/09/26/total-immersion-and-the-transfigured-city-shared-augmented-realities-the-web-squared-era-and-google-wave/" target="_blank">here</a>) that Wave can be used for AR that is as precise or as loose as the current generation of devices can handle. And as the hardware and software arrive for the kind of AR that can put media out in the world and truly immerse you in a mixed space, the framework should be able to handle that too.</p>
<p>(A note on the Wave playback feature: this opens up a whole new world of possibilities. Check out <a href="http://snarkmarket.com/2009/3605" target="_blank">this post</a> on some of the implications of playback for writing!)</p>
<p>The use cases we have been coming up with are too numerous to go into in detail in this post. The open nature of an AR framework/Wave standard will lead to many new applications we have barely begun to imagine. As Thomas points out, different client software can be made for browsing, potentially allowing for various specialist browsers, as well as more generic ones for typical use. The multitudes of different kinds of data in/output that could be integrated into an open AR framework as it evolves are mind boggling.</p>
<p>But, for now, some obvious use cases do come to mind, e.g.:</p>
<p>- Historical environmental overlays showing how a city used to be/and how this vision may be constructed differently by different communities</p>
<p>- Proposed building work showing future changes to a structure, and the negotiation of that future (both the public and professionals could submit their own comments on the plans in context); seeing pipes, cables and other invisible elements could also help builders and engineers collaborate and do their work</p>
<p>- Skinning the world with interactive fantasies</p>
<p>I asked Thomas to help people understand how Wave enables new interactions with data by explaining how Wave could enable city sensing and citizen sensing projects (e.g. <a href="http://tinyurl.com/y97d5zr" target="_blank">this one being pioneered by Griswold</a>):</p>
<p><strong>&#8220;Sensors, both mobile and static, could contribute environmental data into city overlays:</strong></p>
<div><strong>temperature, windspeed, air quality (amounts of certain particles), water quality, amount of sunlight, and CO2 emissions could all be fed into different waves. The AR Wave Framework makes it easy to see any combination of these at the same time.&#8221;</strong></div>
<div><strong><strong><br />
</strong></strong></div>
<p>Having these invisible aspects of the world made visible would create ways to improve sustainability, social equity, urban management, energy efficiency, and public health, and would allow communities to understand and become active participants in the ecosystems and infrastructure of their neighborhoods.</p>
<p>The key is reflecting this kind of data back to people &#8211; &#8220;making it not back story but fore story&#8221; &#8211; right where we are, right where it happens, as well as having it available for analysis.</p>
<p>As well as creating new opportunities to interact with, respond to, and enhance data, making visible the invisible can also create new connections and understandings between humans and the non-humans that share our world &#8211; e.g. fish, plants, waterways &#8211; as <a href="http://www.environmentalhealthclinic.net/people/natalie-jeremijenko/" target="_blank">Natalie Jeremijenko&#8217;s</a> work on <a href="http://www.amphibiousarchitecture.net/" target="_blank">Amphibious Architecture</a> and <a href="http://www.haque.co.uk/" target="_blank">Usman Haque&#8217;s</a> project <a href="http://www.sentientcity.net/exhibit/?p=43" target="_blank">Natural Fuse</a> show.</p>
<p>At a more prosaic level, potential buyers of property could see more clearly what they are buying, city planners could see better what needs to be worked on, and environmental researchers could see more clearly the impact people are having on an area.</p>
<p>Wave can also provide some of the framework necessary to begin to address the tricky problems of privacy. Sensitive data can be stored on private waves &#8211; e.g. medical data for doctors and researchers &#8211; but analysis of the data could still benefit everyone, e.g. if it tied disease occurrences to locations, and the relationships between environmental data and health were&#8230;quite literally&#8230;made visible.</p>
<p><strong>&#8220;The publication of energy consumption, and making it visible as overlays, could help influence the public into supporting more energy-efficient companies and businesses. It could also help citizens try to keep their own energy usage down, to try to keep their street &#8216;in the green.&#8217;&#8221;</strong></p>
<p>Thomas notes:</p>
<p><strong>&#8220;With all of the above, it becomes fairly trivial to write persistent Wave-bots that automatically send notice when certain criteria are met (pollutants over a certain level, for example). On publicly readable waves, anyone can use the data on their local computers, process it, and contribute results back on a new wave. Alternatively, persistent remote servers could run cron jobs, or other automated processing, using services such as App Engine to run wave robots.</strong></p>
<p><strong>All these possibilities become &#8220;free&#8221; when using Wave as a platform for geographically tied data.&#8221;</strong></p>
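<p>As a toy illustration of the kind of threshold-watching bot Thomas describes &#8211; this sketch is mine, not his, and it does not use the actual Wave Robots API; the readings, their format, and the limit are all invented for illustration:</p>

```python
# A stand-in for a persistent bot that watches geographically tied sensor
# readings and emits a notice whenever a threshold is crossed.

def pollutant_alerts(readings, pollutant_limit):
    """Return one alert message per reading whose level exceeds the limit.

    readings: iterable of (lat, lon, pollutant_level) tuples, as might be
              pulled from a publicly readable sensor wave.
    """
    alerts = []
    for lat, lon, level in readings:
        if level > pollutant_limit:
            alerts.append(
                f"ALERT: pollutant level {level} at ({lat}, {lon}) "
                f"exceeds limit {pollutant_limit}"
            )
    return alerts

# Two made-up readings: one fine, one over the limit.
readings = [
    (52.37, 4.89, 12.0),
    (52.38, 4.90, 87.5),
]
alerts = pollutant_alerts(readings, pollutant_limit=50.0)
```

<p>A real bot would read its input from a sensor wave and post its alerts back as new blips, but the core check would look much the same.</p>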
<p>But of course this is just the beginning!</p>
<p><em>Recently, I talked at length with Jeremy Hight, who has been thinking about, designing, and creating shared augmented realities &#8211; ones that anticipate the kind of dynamic, real time, large scale architecture we now have available through Wave &#8211; for quite some time now. This is exciting stuff.</em></p>
<p><em><br />
</em></p>
<h3><strong>Modulated Mapping:</strong> Talking with Jeremy Hight about Layers, Channels and Social Augmented Experiences</h3>
<p><strong><strong><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/modulatedmapping5.jpg"><img class="alignnone size-medium wp-image-4611" title="modulatedmapping5" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/modulatedmapping5-230x300.jpg" alt="modulatedmapping5" width="230" height="300" /></a><br />
</strong></strong></p>
<p><strong><strong><em><span>image from Volume Magazine (Hight/Wehby)</span></em></strong></strong></p>
<p><strong><strong>Tish Shute:</strong></strong> I know you have been involved in locative media from its early days. Perhaps we can talk about how AR continues the locative media journey?</p>
<p><a href="http://www.cc.gatech.edu/~blair/home.html" target="_blank">Blair MacIntyre</a> gave me this distinction, recently:<em> &#8220;AR is about systems that put media out in the world, and immerse you in a mixed space. Â Even the current &#8220;not really registered&#8221; mobile phone AR systems are still &#8220;sort of&#8221; AR (e.g., Layar, etc).</em></p>
<p><em>Locative media/ubicomp/etc are very different, in that they tend to display media on a device (phone screen) that is relevant to your context, but does not attempt to merge it with the world.<br />
The difference is significant, and making it clear helps people think about what they do and what they want to do with their work. The locative media space, though, points toward future AR systems (when the technology catches up!).&#8221;</em></p>
<p><strong><strong>Jeremy Hight: The need is to finish the arc that locative media and early AR have started and to now truly return to the map itself, but as an internet of data, interactivity, channels of data, end user options like analog machines once were but in high end tools, a smart AI-ish ability for it to cull data for the user, and to allow social networking to be in real world places on the map both in building augmentation and in using and appreciating it..not hacks..which have their place&#8230;but a rhizome, a branched system with shared root, end user adjustable and variable..this is the key.</strong></strong></p>
<p><strong><strong>This takes AR and mapping and makes a possible world of channels in space and this eventually can be a kind of net we see in our field of vision with a selected percentage of visual field and placement so a geo-spatial net, a local to world wide fusion of lm into a tool and educational tool</strong></strong></p>
<p><strong><strong><span>VR [virtual reality] has greatly advanced, but in nodes, as it has limitations&#8230;LM [locative media] is the same&#8230;AR [augmented reality] is the way..</span></strong><strong> it now has locative elements and aspects of VR integrated into its functionality and nodes&#8230;it is the best option with all of these elements, greater hybridity and data level potential as well as end user and community sourcing potential</strong></strong></p>
<p><strong><strong>I wrote an essay for Archis&#8217; Volume, the architecture magazine on a near future sense of some of this&#8230;.a visual net on the lens like ar but with smart objects and social networking and dissent.</strong></strong></p>
<p><strong><strong>I also wrote of these things for immersive graphic design, spatially aware museum augmentation, education through ar and lm, and a nod to the base interface of eye to cerebral cortex in layered and malleable augmentation in my essay <a href="http://www.neme.org/main/645/immersive-sight" target="_blank">&#8220;Immersive Sight&#8221;</a> a few years back.</strong></strong></p>
<div id="gqg9" style="text-align: left;"><strong><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/dgznj3hp_3dj7g8zf7_b.jpg"><img class="alignnone size-medium wp-image-4601" title="dgznj3hp_3dj7g8zf7_b" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/dgznj3hp_3dj7g8zf7_b-300x225.jpg" alt="dgznj3hp_3dj7g8zf7_b" width="300" height="225" /></a></strong></div>
<p><strong><strong>image [above] is simple illustration of a possible example on a screen or in front of eye where in a mondrian show..the graphic design of information actually builds as one moves</strong></strong></p>
<p><strong><strong>(key is calibrated spatial intervals and related layers of further augmentation which is logical due to location and proximity)</strong></strong></p>
<p><strong><strong>from immersive sight on immersive graphic design:</strong> <em>&#8220;The design can work with this in a way that creates an interactive supplemental set of information that is malleable, shifts based on location, builds and peels away as one moves closer to a work and plays with the forms of the works and the elements of the space itself. The sequence can contain many different elements and their interplay (both in the field of vision and in terms of context and layers of information). This is the model of sections of augmentation turning on and off at key points as individual spatial and concepts moments and nodes.</em></strong></p>
<p><strong><em>Another interesting possibility is that individual points of augmentation don&#8217;t turn off, but instead are designed to build as one moves in a direction toward a specific part of the exhibit. The design can work in a sequence both content-wise and visually, in terms of a delay powered compositional development and style in which each discrete layer of text and image does not fade out, but builds on the others into a final composition. This can form paintings similar to Mondrian, perhaps, if it is a show of similar works of that era, or it can form something much more metaphorical and open to interpretation of the space and content, but utilizing a sense of emergence spatially in terms of the composition (pieces laid bare until final approach for effect). </em></strong></p>
<p><strong><em>Each section will be well designed, but they build in layers as one moves until finally forming the final composition both visually and in terms of scope of information or building immediacy. The effect can be akin to taking a painting and slicing it into onion skin layers laid out in the air at intervals, each the same dimensions, but only one section compositionally of the greater whole. This has many semiotic applications beyond its potential aesthetically and as spatialized information possessing a sense of inter-relationship as one moves.</em>&#8220;</strong></p>
<p><strong><strong>Tish Shute:</strong> </strong>One of the things I found very inspiring when I read your papers was that your ideas are not all dependent on a model of AR that would necessarily require goggles, backpacks and lots of CPU/GPU &#8211; not that that wouldn&#8217;t be nice &#8211; but that even using the &#8220;magic lens&#8221; AR of the kind smart phones have enabled, in an open distributed framework, would open up a lot of new possibilities for what you call modulated mapping, wouldn&#8217;t it? What kind of social augmented realities might be enabled by a distributed infrastructure like this [AR Wave]?</p>
<p><strong><strong>Jeremy Hight: right&#8230;.I see that as wayyy down the road&#8230;most important is the one you talk about, as it is more immediate and thus more essential and needed. Eventually the goggles will be like a contact lens and a deep immersive ar version of this will come, that to me is certain, but a ways down the road. An incredible amount is possible now, and this is a more pragmatic move as opposed to the more theoretical of what is a few steps from here. Thus it is more important and essential now. Tools like Google Wave are taking what even 2 years ago were more theoretical discussions of what may be and instead introducing key elements to a more immediate, powerful, flexible level of augmentation. What have been hacks and isolated elements are to be integrated with social networking, task completion, shared tools and graphics building and geo-location.</strong></strong></p>
<p><strong><strong>Tish Shute:</strong> </strong>I think some people question what augmented reality has to bring to the continuum of location based experiences that other forms of interface/mapping do not?</p>
<p><strong><strong><span>Jeremy Hight: right&#8230;.and the schism between its commercial </span></strong><strong>flat self and tests with physics etc and in between&#8230;there are a lot of unfortunate assumptions it seems as to where ar and lm cross and how ar can be many things beyond deep immersion or the opposite pole of a hockey puck having a magic purple line etc&#8230;.like lm is seen as either car directions or situationist experiments with deep data&#8230;..the progression to me is deeply organic&#8230;.and now augmentation can be more malleable, variable and end user controlled.</strong></strong></p>
<p><strong><strong>Tish Shute:</strong> </strong>Yes, it is a really exciting time for AR. Historically AR research has gone after the hard problems of image recognition, tracking and registration because we haven&#8217;t had available to us dynamic, real time, large scale architectures like Wave (until now!), so less work has been done on exploring the possibilities for distributed AR fully integrated with the internet and WWW, hasn&#8217;t it?</p>
<p>A distributed augmented reality framework such as we have envisaged on Wave would allow people to see many layers from many different people at the same time. And this kind of model has been part of your thinking and fundamental to your work for a while, hasn&#8217;t it? But it is a very new idea to most people to think about collaboratively editing layers on the world, and to be able to view augmented space through channels and networked communities. Could you explain some of the ways you have explored these ideas and how they could be explored further now to create meaningful experiences for people?</p>
<p><strong><strong><span>Jeremy Hight: right..exactly&#8230;modulated mapping to me can be an amazing tool for students&#8230;back end searching data visualizations and augmentations based on their needs&#8230;while they do something else on their computer or iphone&#8230;that can be amazing..and not deep </span></strong><strong>immersive. The map can be active, malleable, open source fed, and even, in a sense, intelligent and able to adapt. The possibility also exists for this map to have a function that, based on key words, will search databases on-line to find maps, animations, histories and stories etc. to place within it for your study and engagement. The map is thus a platform and yet is active. Community is possible as people can communicate graphically in works placed on the map and in building mode in the tool. All the tropes of locative media are to be in a mapping system of channels of augmentation and a spatial net. The software by design will allow development on the map and communication like programs such as Second Life, but in mapping itself.</strong></strong></p>
<p><strong><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/modultedmapping1.jpg"><img class="alignnone size-medium wp-image-4607" title="interactive 3d map copy" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/modultedmapping1-246x300.jpg" alt="interactive 3d map copy" width="246" height="300" /></a></strong></p>
<p><strong><strong><em><strong><span>image from Parsons Journal of Information Mapping Volume 2 (Hight/Wehby)</span></strong></em></strong></strong></p>
<p><strong><strong><span>I wrote an essay a few years ago for the Sarai reader questioning the traditional map and its semiotics and the need to reconsider it &#8211; then did work looking into it and what those dynamics were, and they got into 2 group shows in museums in Russia&#8230;so it actually was my arc toward modulated mapping&#8230;an interesting way to it! But yes the map itself..this is a huge area of potential and non screen based alone navigation etc. I see now that my 2 dozen or so essays in lm, ar, interface design and augmentation have all also been leading in this direction for about 10 years now.</span></strong></strong></p>
<p><strong><strong>Tish Shute: </strong>I love immersive visualization but can we &#8220;return to the map &#8211; the internet of data&#8221; as you mentioned earlier and produce interesting augmentation experiences that go beyond locative media&#8217;s device display mode without having the goggles &#8211; for example, through the magic lens of our smart phones?</strong></p>
<p><strong><strong>Jeremy Hight: yes, absolutely. the map in the older paradigm is an artifice born often of war and border dispute and not of the earth itself and its processes&#8230;the new mapping like google maps is malleable, can be open source, can read spaces and can be layers of info in the related space not plucked from it as in the past..this is amazing. the old map also was born of false semiotics/semantics like &#8220;discovery of new lands&#8221; or &#8220;pioneer&#8221; while the places were there already and names often were of empire&#8230;now this is no longer the case</strong></strong></p>
<p><strong><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/modulatedmapping2.jpg"><img class="alignnone size-medium wp-image-4608" title="jeremy map small2 copy" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/modulatedmapping2-300x233.jpg" alt="jeremy map small2 copy" width="300" height="233" /></a></strong></p>
<p><strong><strong>Tish Shute:</strong> </strong>So is geoAR a better way to express a new social relationship to mapping? And how does this fit into the evolving arc of locative media as it evolves into augmented reality?</p>
<p><strong><strong>Jeremy Hight:&#8230;early lm was mostly geocaching and drawing with gps..it took new paradigms to invigorate the field&#8230;a lot of folks focus on tools and what already is, cross pollination can ground ideas that are more radical&#8230;a metaphor in a sense to place what can be in a familiar context.</strong></strong></p>
<p><strong><strong>Tish Shute:</strong> </strong>one of the great disappointments in VR has been its isolation from networked computing and also, up to now, augmented reality &#8211; to achieve an immersive experience with tight registration of media/graphics you have to create a separate system isolated from the internet and the power of the web.</p>
<p><strong><strong>Jeremy Hight: yes&#8230;.this will change. vr is to me an island but ar takes a part of it and shifts the paradigm and new things open this way. Do you know the project <a href="http://www.lifeclipper.net/EN/process.html" target="_blank">&#8220;life clipper&#8221;</a>? friends of mine..doing interesting things..they are a clear bridge between lm and ar&#8230;.and from vr</strong></strong></p>
<p><strong><strong>in ar augmentation and what is being augmented become fused or in collision or in complex interactions as a means to a larger contextualization and exploration of what is being augmented..this is true in immersive or non ar&#8230;.huge potential</strong></strong></p>
<p><strong><strong>vr is a space, now it can be surgery which is amazing, but not layered interaction, thus an island. and graphic iconography on a location can use symbolic icons which opens up even more layers (the graphic designer/information designer in me talking there I suppose..)</strong></strong></p>
<p><strong><strong>Tish Shute:</strong> </strong>Yes! Talk to me more about layers and channels. I think this is one of the most interesting questions for me in augmented reality at the moment &#8211; what can we do with layers and channels, and what new possibilities for connections between people and environments can these create?</p>
<p>The ability for anyone to post something is critical to the distributed idea but one of the reasons I am so excited by Google Wave is I am fascinated by the playback function. How do you think this will enable new forms of collaborative locative narratives (<a href="http://snarkmarket.com/2009/3605" target="_blank">nice post on Wave playback here </a>).</p>
<p><strong><strong>Jeremy Hight: We are in an age of cartographic awareness unseen in hundreds of years. When was the last time that new mapping tools were sold in chain stores and installed in most vehicles? When was the last time that the augmentation of maps was done by millions (Google map hacks, etc)? The ubiquitous gps maps run in automobiles while people post pictures and graphic pins to denote specific places on on-line maps.</strong></strong></p>
<p><strong><strong>The need is for a tool that combines all of these new elements into an open source, intuitive, layered and rhizomatic map that is porous (like pumice, organic in form yet with &#8220;breathing room&#8221;), ventilated (i.e. adjustable, a flow in and out), and open (open source, open access, open spatialized dialog).</strong></strong></p>
<p><strong><strong><span>I wrote of this in my essay &#8220;Revising the Map: Modulated Mapping and the Spatial Interface&#8221; (</span></strong><a id="h0qr" title="http://piim.newschool.edu/journal/issues/2009/02/pdfs/ParsonsJournalForInformationMapping_Hight-Jeremy.pdf" href="http://piim.newschool.edu/journal/issues/2009/02/pdfs/ParsonsJournalForInformationMapping_Hight-Jeremy.pdf"><span>http://piim.newschool.edu/journal/issues/2009/02/pdfs/ParsonsJournalForInformationMapping_Hight-Jeremy.pdf</span></a><span>)</span></strong></p>
<p><strong><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/modulatedmapping3.jpg"><img class="alignnone size-medium wp-image-4609" title="jeremy map small2 copy" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/modulatedmapping3-300x206.jpg" alt="jeremy map small2 copy" width="300" height="206" /></a></strong></p>
<p><strong><em><strong><span>image from Parsons Journal of Information Mapping (Hight/Wehby)</span></strong></em></strong></p>
<p><strong><strong>Tish Shute:</strong></strong> One mapping project I really like is <a href="http://themannahattaproject.org/" target="_blank">Mannahatta</a>. How could distributed AR contribute to a project like <a href="http://themannahattaproject.org/" target="_blank">Mannahatta</a>?</p>
<p><strong><strong>Jeremy Hight: that is a good example..imagine taking manhattan and having channels of options to overlay, that being an excellent option, and imagine being able to even run a few at once with delineating icons..you can augment a space with history, data, erasure, narrative, scientific analysis, time line of architecture, infrastructure, archaeological record etc&#8230;.endless possibilities, and this agitates place and place on a map into an active field of information with end user control&#8230;and open options for new layers</strong></strong></p>
<p><strong><strong>Tish Shute: </strong></strong>and do you think we could do interesting things with AR on a project like Mannahatta even with the current mediating devices we have available &#8211; i.e. our smart phones &#8211; as obviously the rich pc experience Mannahatta has built for its web interface would not be available as AR at this point?</p>
<p><strong><strong>Jeremy Hight: yes&#8230;.k.i.s.s right? these projects do not have to only be immersive and graphic intensive&#8230;&#8230;take how people upload photos onto google maps&#8230;.just make that on a menu of options, there are some pretty cool hacks already..<br />
&#8230;options is key, a space can have a community as well, building on it in software, and others navigating it, i see it near future and down the road..always have with ar really</strong></strong></p>
<p><strong><strong><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/locativenarratives1.jpg"><img class="alignnone size-medium wp-image-4596" title="locativenarratives1" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/locativenarratives1-230x300.jpg" alt="locativenarratives1" width="230" height="300" /></a><br />
</strong></strong></p>
<p><strong><em><strong><span>image from Volume Magazine (Hight/Wehby)</span></strong></em></strong></p>
<p><strong><strong>Jeremy Hight: and yes, a lot of people focus on ar&#8217;s limitations and see its processing power needs as a major road block</strong></strong></p>
<p><strong><strong>Tish Shute:</strong> </strong>so do you see AR on smart phones adding any value to a project like Mannahatta?</p>
<p><strong><strong>Jeremy Hight: yes&#8230;that it can be integrated into other similar works and even disparate but cloud linked ones&#8230;so a place can be &#8220;read&#8221; in diff ways on the iphone&#8230;.beyond its map location, and more can be possible if you are there&#8230;others away, so it becomes channels of augmentation</strong></strong></p>
<p><strong><strong>Tish Shute:</strong> </strong>AR like locative media puts who you are, where you are, what you are doing, what is around you center stage in online experience but it also &#8220;puts media out in the world&#8221; &#8211; people I think understand this well as a single user experience but we are only just beginning to think about how this will manifest as a social experience &#8211; could you explain more about modulated mapping as an experience of social augmentation?</p>
<p><strong><strong>Jeremy Hight: Modulated Mapping is a tool that will allow channels to be run along the map itself. This will allow one to view different icons and augmentations both as systems on the map and in deeper layers of information (photos, videos, animations, visualizations, etc.) that can be turned on and off as desired. The different layers of icons and data may be history, dissent, artworks, spatialized narratives, and annotations developed communally based on shared interests, placed spatially and far beyond. The use of chat functionality in text or audio will be open in building mode and in mapping navigation/usage as desired. This also allows a community to develop or augment in the spaces on the earth. These nodes can be larger and open or small and set by groups in their channel. The end result is an open source sense of mapping that will also have a needed sense of user control, as one can select which layers of augmentation they wish to see and interact with at any time. It also will incorporate all the functionality of locative media in mapping software and mapping. In building mode and in map mode, icons will be coded to represent within channels (remember that the person using it has selected channels of augmentation from many, based on their current interests and needs). Icons will be coded as active to show work in progress in cities and the globe, to both invite participation and to further agitate the map from the sense of the static, as action is visible even with its icons as people are working and community is formed in common interest/need.</strong></strong></p>
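The channel model described here, with layers of augmentation a user can switch on and off over a shared map, can be sketched in a few lines. This is a hypothetical illustration only; all names (`Channel`, `MapView`, `Augmentation`) are invented for the sketch and do not come from any real mapping API.

```python
# Hypothetical sketch of the channel/layer model: channels of augmentation
# attached to map locations, which an end user can toggle on and off.
from dataclasses import dataclass, field

@dataclass
class Augmentation:
    lat: float
    lon: float
    kind: str      # e.g. "history", "artwork", "annotation"
    payload: str   # text, or a URL to a photo/video/animation

@dataclass
class Channel:
    name: str
    augmentations: list = field(default_factory=list)

class MapView:
    """A map whose visible content is the union of the enabled channels."""
    def __init__(self):
        self.channels = {}
        self.enabled = set()

    def add_channel(self, channel):
        self.channels[channel.name] = channel

    def toggle(self, name, on):
        # Switch a channel of augmentation on or off for this user.
        (self.enabled.add if on else self.enabled.discard)(name)

    def visible(self):
        # Only augmentations from channels the user has switched on.
        return [a for n in self.enabled for a in self.channels[n].augmentations]

view = MapView()
view.add_channel(Channel("history", [Augmentation(34.05, -118.24, "history", "1932 rail line")]))
view.add_channel(Channel("artworks", [Augmentation(34.05, -118.25, "artwork", "mural.jpg")]))
view.toggle("history", True)
print(len(view.visible()))  # 1 -- only the history channel is on
```

The point of the sketch is the separation Hight describes: content lives in channels, while visibility is a per-user choice, so many communities can annotate the same space without forcing their layers on everyone.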
<p><strong><strong>locative media got a buzz for &#8220;reading&#8221; places&#8230;when I helped create locative narrative that was what blew me away back in 2001&#8230;that we could give places a voice by placing data from research and icons on a map&#8230;&#8230;this meant lost history or augmentation was possible as kind of voices of a place and its layers&#8230;&#8230;.I called it &#8220;narrative archaeology.&#8221; We now have tools that can push these ideas and concepts farther..much farther&#8230;and with a range beyond what was before, and then the map was just a tool&#8230;.but now we are returning to the map itself&#8230;..and this as place as much as marker..this is where ar takes the ball to use a bad metaphor</strong></strong></p>
<p><strong><strong>also that project could only work if you came to our spot of a 4 block augmentation and with us there to lend you our gear&#8230;we are far beyond that now but it had its place</strong></strong></p>
<p><strong><strong>Tish Shute:</strong> </strong>How do you see &#8220;in context&#8221; AR and something we might call &#8220;context aware&#8221; cloud computing models interacting?</p>
<p><strong><strong>Jeremy Hight: sure&#8230;and I must add that I have issues with cloud computing as much as it is a good idea&#8230;</strong></strong></p>
<p><strong><strong>Tish Shute:</strong> </strong>because of loss of autonomy?</p>
<p><strong><strong>Jeremy Hight: tivo is simply a hard drive&#8230;but it keyword reads and gives suggestions..that is the cro magnon link to what can be</strong></strong></p>
<p><strong><strong>Tish Shute:</strong> </strong>The nice thing about Wave is that because of the Federation model, the cloud model and local &#8220;store your own data&#8221; models should work together.</p>
<p><strong><strong><span>Jeremy Hight: yes..that is better&#8230;..loss of autonomy also opens up the arbitrary which is the flaw of search engines as we know it&#8230;even Bing fails to me in that sense</span></strong></strong></p>
<p><strong><strong>Tish Shute:</strong> </strong>how do you mean, could you explain?</p>
<p><span> </span><strong><strong><span>Jeremy Hight: spiders cull from words but cull like trawlers at sea&#8230;. I tested Bing with very specific requests.. it spat out the same mass of mostly off topic results&#8230;.</span><br />
<span>I wonder if there is a way to cull from key words and topics from a user&#8230;not O</span>rwellian back end of course&#8230;but from their preferences, their searches etc..</strong></strong></p>
<p><strong><strong>Tish Shute:</strong> </strong>did you see the discussion on search in the AR Framework doc? AR search will be a massively important thing that will take a lot of intelligence and all sorts of algorithm development won&#8217;t it?</p>
<p><strong><strong>Jeremy Hight: It also has one area of key functionality that moves into more intuitive software. Upon continued usage, the mapping software will &#8220;learn&#8221; and search based on key words used and spheres of interest the user is mapping or observing as mapped, and will integrate deeper data and types of animations, etc. into the map, or will have them waiting to be integrated upon user approval as desired. Over time the level of sophistication of additions and of search intuition will increase dramatically. The search can also, if the user wishes, run in the back end while working in the mapping program, or in off time as selected while doing other tasks. It also can simply never be used if one is not interested. One of the key elements of this mapping is that it is not composed of a closed set and does not need user hacks to augment, but instead is to evolve and deepen by user controls and desires, as designed. Pre-existing data, visualizations and augmentations can be integrated with relative ease.</strong></strong></p>
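The "learning" behaviour described here, ranking back-end search results by the key words a user keeps working with, could be approximated with simple frequency weighting. A minimal sketch, assuming invented names (`AdaptiveSearch`, `observe`, `rank`) and no real search engine behind it:

```python
# Hedged sketch: the more a user engages with certain key words, the more
# those words weight future search results. Purely illustrative.
from collections import Counter

class AdaptiveSearch:
    def __init__(self):
        self.interest = Counter()  # key word -> times the user engaged with it

    def observe(self, *keywords):
        """Record key words from what the user is mapping or observing."""
        self.interest.update(keywords)

    def rank(self, candidates):
        """Order candidate items (each tagged with key words) by learned interest."""
        def score(item):
            return sum(self.interest[k] for k in item["keywords"])
        return sorted(candidates, key=score, reverse=True)

s = AdaptiveSearch()
s.observe("railroad", "railroad", "mural")  # user keeps returning to rail history
items = [
    {"title": "mural archive", "keywords": ["mural"]},
    {"title": "rail history map", "keywords": ["railroad"]},
]
ranked = s.rank(items)
print(ranked[0]["title"])  # rail history map -- "railroad" was observed twice
```

Because the weights come only from the user's own activity on their own machine, a scheme like this stays on the preference side of the line Hight draws, rather than an "Orwellian back end."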
<p><strong><strong>Tish Shute: </strong></strong>One of the things that Joe Lamantia points out about social augmented experiences is that they will operate across a number of different scales &#8211; conversation &gt; product design &amp; build team &gt; neighborhood / town fixing potholes &gt; global community for causes. How do designs for channels and layers change across these different social scales?</p>
<p><strong><strong><strong>Jeremy Hight:</strong> to quote myself&#8230;&#8220;The &#8220;frontier&#8221; is often defined as the space just ahead of the known edge and limit, and where it may be pushed out deeper into the previously unknown. The frontier in the world of ideas is not the warm comfort of what has been long assimilated; and the frontier in the landscape is not of maps, but of places beyond and before them.</strong></strong></p>
<p><strong><strong>The border along what has been claimed is not only that of maps &#8211; it is of concepts, functions, inventions and related emergent industries. Ideas and innovations are like the cloud shape that briefly forms around a jet breaking the sound barrier, tangible yet not fully mapped into measure. It is when things are nailed down into specific entities, calibrated and assessed, that the dangers may inflict themselves &#8211; greed, competition, imitation, anger, jealousy, a provincial sense of ownership either possessed or demanded&#8221;. (from essay in Sarai reader). Otherwise channels and augmentation do not have to be socio-economically stratifying or defined by them. We built 34n for almost nothing on older tools.</strong></strong></p>
<div id="yqjj" style="text-align: left;"><strong><strong><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/dgznj3hp_1g3svj8fq_b.jpg"><img class="alignnone size-medium wp-image-4599" title="dgznj3hp_1g3svj8fq_b" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/dgznj3hp_1g3svj8fq_b-300x225.jpg" alt="dgznj3hp_1g3svj8fq_b" width="300" height="225" /></a></strong></strong></div>
<p><strong><em><strong><span>image from 34north 118west (Spellman/Hight/Knowlton)</span></strong></em></strong></p>
<p><strong><strong>The ar that is not deep immersion can be more readily available and channels can be what end users need like the diversity of chat rooms or range of Facebook users among us.</strong></strong></p>
<p><strong><strong>I had two moments yesterday that totally fit what we talked about. I went to the west hollywood book fair and the traditional driving directions off of the map were wrong and we got lost&#8230;our friend could only get a wireless signal to map on an itouch and we had to roam neighborhoods; then we called a friend who google mapped it and we found we were a block away&#8230;.so a fast geomapping overlay with an icon for the book fair on some optional grid service or community would have made it immediate. Then at the book fair I talked to a small press publisher who is trying to map works about los angeles by los angeles authors on a map..she was stunned when I told her it could be a kind of google map feature option</strong></strong></p>
<p><strong><strong>it also has great potential to publish and place writing and art in places..both for commentary and access. imagine reading joyce in chapters where it was written about and then another similar experience but with writers who published on a service into their city.</strong></strong></p>
<p><strong><strong><strong>Tish Shute:</strong></strong></strong> The challenge of shared augmented realities is not just a matter of shipping bits around, but also of how we will use channels and layers &#8211; to create and negotiate different, distributed perspectives, and understand a shared common core or expressions of dissent (this came up in an email conversation with <a href="http://www.oreillynet.com/pub/au/166" target="_blank">Simon St Laurent</a>).</p>
<p><strong><strong><strong>Jeremy Hight:</strong> well my example earlier could have been communal in a way too..a tribe sort of augmentation channeling &#8230;.like subscribing to list servs back in the day but of augmentation communities/channels, and for folks to build and use in shared live form, coordinating too</strong></strong></p>
<p><strong><strong><strong>Tish Shute:</strong></strong> </strong>one good thing though about building an open AR Framework is that as bandwidth/CPU/hardware gets better shared high def immersive experiences could be supported by the same framework..</p>
<p><strong><strong>Jeremy Hight: excellent</strong></strong></p>
<p><strong><strong><strong>Tish Shute:</strong> </strong></strong>were you thinking of the image recognition and tracking with this example?</p>
<p><strong><strong><strong>Jeremy Hight:</strong> yeah&#8230;.like scanning across a multi channeled google map augmentation with diff icons and their connected data&#8230;and possibly social networking and file sharing even in that mode&#8230;and rastering etc&#8230;.could be cool with google wave </strong><strong><span>- on the map..then zooming in a la powers of ten..(eames film).</span></strong></strong></p>
<p><strong><strong>-</strong><strong><span>I have pictured variations of this for a few years now in my head, like the example of my friends and I yesterday&#8230;we could have correlated a destination by icons in diff channels..one being lit events within a lit channel in an l.a map&#8230;maybe things streaming on it too&#8230;remote info and video etc&#8230; that would be awesome</span></strong></strong></p>
<p><strong><strong><strong>Tish Shute:</strong></strong></strong> So many of the ideas in your paper on modulated mapping (see <a href="http://piim.newschool.edu/journal/issues/2009/02/pdfs/ParsonsJournalForInformationMapping_Hight-Jeremy.pdf" target="_blank">here</a>) are brilliant use cases for shared augmented realities. Perhaps you could talk more about your ideas on locative narrative, because this is something I think is at the core of the kinds of experiences that a distributed AR Framework would make possible?</p>
<p><strong><strong><strong>Jeremy Hight:</strong> on the project &#8220;34 north 118 west&#8221; we mapped out a 4 block area for augmentation of sound files triggered by latitude and longitude on the gps grid and map and the map on the screen had pink rectangles that were the &#8220;hot spots&#8221; where the augmentation had been placed.</strong></strong></p>
<div id="nwc6" style="text-align: left;"><strong><strong><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/dgznj3hp_0gg994bf9_b.jpg"><img class="alignnone size-medium wp-image-4600" title="dgznj3hp_0gg994bf9_b" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/dgznj3hp_0gg994bf9_b-300x225.jpg" alt="dgznj3hp_0gg994bf9_b" width="300" height="225" /></a></strong></strong></div>
<p><strong><em><strong><span>image of interactive map with map based augmentation connected to audio augmentation on site for 34north 118west (Spellman/Hight/Knowlton)</span></strong></em></strong></p>
<p><strong><strong>We researched the history of the area and placed moments in time of what had been there at specific locations&#8230;.I called this <a href="http://www.xcp.bfn.org/hight.html" target="_blank">&#8220;narrative archaeology&#8221;</a> as it allowed places to be &#8220;read&#8221; by their augmentations&#8230;info that was of the place beyond the immediate experience (diff types of info) that otherwise would be lost or only found in books or web sites elsewhere. there now are locative narratives around the world but they need to be linked. from humble origins &#8220;narrative archaeology&#8221; went on to recently be named one of the 4 primary texts in locative media, which is pretty amazing to me&#8230;but it is growing</strong></strong></p>
<p><strong><strong>- the limitations then were what I called the &#8220;bowling alley conundrum&#8221; &#8211; the specific data had to reset like pins&#8230;..and was isolated&#8230;.this led me to think about ar back then and up to now. How these could lead to much more from that point: data that would be more layered, variable, fluid..yet still augment place and sense of place and social networking within data and software</strong></strong></p>
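The trigger mechanism behind "34 north 118 west", rectangular lat/long hot spots that fire a sound file on entry and "reset like pins" when the walker leaves, can be sketched as below. The class name, method names, and the sample coordinates and filename are all hypothetical, invented for illustration; they are not from the project's actual code.

```python
# Minimal sketch of a geo-triggered audio hot spot: a lat/long rectangle that
# triggers its sound once on entry, then resets when the walker leaves
# (the "bowling alley conundrum": pins reset between frames).
class HotSpot:
    def __init__(self, south, west, north, east, sound_file):
        self.bounds = (south, west, north, east)
        self.sound_file = sound_file
        self.inside = False  # whether the walker is currently in the spot

    def contains(self, lat, lon):
        s, w, n, e = self.bounds
        return s <= lat <= n and w <= lon <= e

    def update(self, lat, lon):
        """Feed in a GPS fix; return the sound to trigger on entry, else None."""
        was_inside, self.inside = self.inside, self.contains(lat, lon)
        if self.inside and not was_inside:
            return self.sound_file  # entering: trigger once
        return None                 # still inside, or outside: nothing to play

# Hypothetical hot spot and walk: enter, linger, leave, re-enter.
spot = HotSpot(34.0510, -118.2400, 34.0520, -118.2390, "freight_yard_1939.wav")
fixes = [(34.0500, -118.2395), (34.0515, -118.2395), (34.0516, -118.2396),
         (34.0530, -118.2395), (34.0515, -118.2395)]
triggers = [spot.update(lat, lon) for lat, lon in fixes]
print(triggers.count("freight_yard_1939.wav"))  # 2 -- once per entry
```

The edge-detection in `update` (trigger only on the outside-to-inside transition) is what makes each "pin" fire once per visit rather than on every GPS fix, and the reset on exit is exactly the isolated, stateless behaviour Hight wanted to move beyond with networked, layered data.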
<p><strong><strong><a href="http://www.lifeclipper.net/EN/process.html" target="_blank">lifeclipper</a> to me is a bridge</strong></strong></p>
<p><strong><strong><strong>Tish Shute:</strong> </strong></strong>But Life Clipper is isolated from the internet currently, isn&#8217;t it?</p>
<p><strong><strong><span>Jeremy Hight: yes&#8230;ours was too.. that is what google wave makes possible.. our project only ran on our gear..in 4 blocks&#8230;with additional auxi</span>liary info online, and not malleable..but hey 2001 and all..</strong></strong></p>
<p><strong><strong><strong>Tish Shute:</strong> </strong></strong>so the sites for 34 north 118 west are still active though?</p>
<p><strong>Jeremy Hight: oh yeah!</strong></p>
<p><strong><strong><strong>Tish Shute: </strong></strong></strong>nice I really like sound augmentation &#8211; have you seen <a href="http://www.soundwalk.com/blog/tag/augmented-reality/" target="_blank">Soundwalk</a>?</p>
<p><strong><strong><span>Jeremy Hight: yes, very cool..</span> </strong><strong>we chose sound only as it fought the power of image..instead caused a person to be in a sense of two places and times at once</strong></strong></p>
<p><strong><strong>Tish Shute:</strong></strong> and in 2001 that was definitely a visionary project!</p>
<p>You must be very excited that finally the pieces are coming together to make this stuff scale!</p>
<p><strong><strong><strong>Jeremy Hight:</strong> I can&#8217;t even tell you!! it is funny..i have known that this would come..just waited and waited&#8230;</strong></strong></p>
<p><strong><strong>..knew it needed the right people and tools..</strong></strong></p>
<p><strong><strong><span>..so the bowling alley conundrum led me to develop my project shortlisted for the iss (international space station), as I thought a lot about how points and works are not to be isolated&#8230;but connected, and should be flowing in diff parts of a map&#8230;.to open up perspective and connected augmentations, but also to think about the map again&#8230;not as a base only. then I moved into my work with new ways to visualize time and it all really began to gel. The ideas first were published as an essay</span></strong><span> </span><a id="qw.2" title="http://www.fylkingen.se/hz/n8/hight.html" href="http://www.fylkingen.se/hz/n8/hight.html"><span>(http://www.fylkingen.se/hz/n8/hight.html)</span></a><span> </span><strong><span>and later my project blog</span></strong><span> (</span><a id="bp.b" title="http://floatingpointsspace.blogspot.com/" href="http://floatingpointsspace.blogspot.com/"><span>http://floatingpointsspace.blogspot.com/</span></a><span>)</span></strong></p>
<p><strong><strong><strong>Tish Shute:</strong> </strong></strong>One thing I noticed when I was reading your paper is how you have been exploring non-euclidian geometries.Â  Could you explain how this is part of your idea of modulated mapping?</p>
<p><strong><strong><span>Jeremy Hight: Yes, this first came to me when my wife was reading to me from a book on the Poincare Conjecture and I was hit with a new way to measure events in time, and after months of sketches, schematics and research I came to see how it could also be connected to a geo-spatial web of projects and augmentations. It was published in the inaugural issue of Parsons School of Design&#8217;s Journal of Information Mapping, which was an exciting fit.</span></strong><span><strong> I call it &#8220;Immersive Event Time&#8221; </strong>(</span><a id="o3rt" title="http://piim.newschool.edu/journal/issues/2009/01/pdfs/ParsonsJournalForInformationMapping_Hight-Jeremy.pdf" href="http://piim.newschool.edu/journal/issues/2009/01/pdfs/ParsonsJournalForInformationMapping_Hight-Jeremy.pdf"><span>http://piim.newschool.edu/journal/issues/2009/01/pdfs/ParsonsJournalForInformationMapping_Hight-Jeremy.pdf</span></a><span>)</span></strong></p>
<p><span><strong><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/dgznj3hp_4cxz57xgv_b.jpg"><img class="alignnone size-medium wp-image-4634" title="dgznj3hp_4cxz57xgv_b" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/dgznj3hp_4cxz57xgv_b-195x300.jpg" alt="dgznj3hp_4cxz57xgv_b" width="195" height="300" /></a></strong></span></p>
<p><span><strong><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/dgznj3hp_5g68k9ggh_b.jpg"><img class="alignnone size-medium wp-image-4635" title="dgznj3hp_5g68k9ggh_b" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/dgznj3hp_5g68k9ggh_b-300x225.jpg" alt="dgznj3hp_5g68k9ggh_b" width="300" height="225" /></a><br />
</strong></span></p>
<p><strong><strong>so for the last 3 years I have been working on how it could all work as channels of augmentation, with building and navigation as open and communal in a sense, as well as AI capability &#8211; that was the time work especially: how time as experienced within an event is not a time &#8220;line&#8221; but points on and within a form&#8230;and how this model is better for visualizing events in time and documenting them. It actually sprang from reading a book on the Poincar&#233; conjecture, which popped a bunch of other stuff together, so one could visualize an event in time as like being in the belly of a whale..with time as the ribs..and our measure of time as the skin&#8230;and moving within it&#8230;hoping this will be used as an educational tool</strong></strong></p>
<p><strong><strong>and this also can be tied to AR and the map again&#8230;how documentation of important events can be kept within icons on a Google map..then download varying visualizations based on bandwidth and desired format</strong></strong></p>
<p><strong><strong><strong>Tish Shute: </strong></strong></strong>What I have been thinking about is the new forms of social interaction/agency that these kinds of augmentations of space/place/time will create. It seems there are two poles &#8211; one is the area Natalie Jeremijenko explores, of shifting social relations from institutions/statistics to real time/location based interactions and new forms of social agency. The other pole is more like cloud based AI and perhaps crowd sourced machine learning.</p>
<p>Your ideas explore the possibilities of both these poles. And certainly one of the big deals of distributed AR would be the possibilities it opened up, both for new forms of networked social relationships and for new ways to draw on network effects.</p>
<p><strong><strong><strong>Jeremy Hight:</strong> and cross-pollinations within&#8230;that is where my mind goes</strong></strong></p>
<p><strong><strong><strong>Tish Shute:</strong> </strong></strong>The other night I met Assaf Biderman, MIT, from the <a href="http://senseable.mit.edu/trashtrack/" target="_blank">Trash Track</a> team. Trash Track doesn&#8217;t utilize AR, but I could see that there are possibilities there.<br />
What do you think?</p>
<p><strong><strong><span>Jeremy Hight: yes, absolutely,</span> </strong><strong>there can be sort of skins on locations that user-end selection can yield&#8230;like channels of place&#8230;and these can range from a pragmatic core to art and play and places in between&#8230;how this recalibrates the semiotics of the map&#8230;more than just augmentation seen as a kind of piggyback on the map..the map becomes interface and defanged platform, if you will. Interestingly, my more poetic/philosophic writing led me here too</strong></strong></p>
<p><strong><strong><strong>Tish Shute:</strong></strong></strong> I know they are at very different poles of the system, but I do wonder how AR can bring some of the level of social agency/interaction that <a href="http://www.environmentalhealthclinic.net/people/natalie-jeremijenko/" target="_blank">Natalie Jeremijenko</a> works on into a productive interaction with the kind of innovations in machine learning that Dolores Lab and others are pioneering?</p>
<p><strong><strong><strong>Jeremy Hight:</strong> Natalie&#8217;s genius to me is in practical functional tech that also opens deeper questions and even new openings of what is needed..amazing layers in her work that way.. succinct yet deep..very deep</strong></strong></p>
<p><strong><strong><strong>Tish Shute: </strong></strong></strong>Yes &#8211; I am just writing a post about her work &#8211; I find it deeply moving the way she has delved into the possibilities of using technology to open us up to our world. One of the reasons I find distributed AR so interesting is because it will make it possible for all kinds of people to create and use augmentation in their lives and communities.</p>
<p>So to return to how a distributed AR framework could contribute to a project like Trash Track?</p>
<p><strong><strong><strong>Jeremy Hight:</strong> what about using it for community, dissent and awareness raising then? Like Natalie&#8217;s work, but building like a communal work of multiple points, like the old adage of the elephant and the blind men&#8230;sorry..metaphor &#8211; like one of my points in immersive sight was how one could take augmentation as multiple works, sort of turning the faces of a thing or place&#8230;and how this would make a larger work even in such a flow, so people moving in a space could also build..</strong></strong></p>
<p><strong><strong>what of AR traces left as people move, calibrated to user traffic and trash as estimated in an urban space&#8230;like it goes back to Chris Burden in the 70&#8242;s, making you know that as you turn the turnstile you are drilling into the foundation and may be the one that collapses the building?</strong></strong></p>
<p><strong><strong>so their movements leave trash. Natalie is all about raising awareness of cause and effect, and data, space and ecology. love that. so maybe&#8230;<br />
a feedback loop, artifact and user-end responsibility can leave traces&#8230;trash&#8230;</strong></strong></p>
<p><strong><strong>.. cybernetics vs ecology and human waste</strong></strong></p>
<p><strong><strong><strong>Tish Shute: </strong></strong></strong>could you elaborate?</p>
<p><strong><strong><strong>Jeremy Hight:</strong> brain fart&#8230;that the mass of trash people leave is a piece at a time&#8230;.and how, like the space shuttle mission when arguably the first true cybernaut occurred&#8230;one cord to air for the astronaut..one for the computer on their back to fix the broken bay arm&#8230;if there is a way to build on that and in relation to the topic&#8230;how this can go further, that machines do not waste as much&#8230;as AR is a means to cybernetically raise awareness..eh..</strong><strong> sensors etc&#8230;wearables too &#8211; could be eco awareness with data and machine and human</strong></strong></p>
<p><strong><strong>what about a cloud computing system with a slight AI, in the sense of intuitive word cloud and interest scans&#8230;so as one moves through, say, New York they can be offered new AI data and services as they move? could also be of eco interest? concerns about urban farming, eco waste, air pollution etc&#8230;perhaps with (Jeremijenko element here) sensors placed in locations, and these also giving data reads in public areas with no input but hard data itself&#8230;hmm..could be interesting</strong></strong></p>
<p><strong><strong>it can also give info on the carbon footprints (estimated, probably, unless the data is public record somehow) of chain businesses, and data on which are more eco friendly, as well as an iconography &#8211; color coded and icon coded &#8211; to the best places to go to support greening and eco friendly business? And the companies could promote themselves on this service to attract eco aware customers who would see them as kindred spirits and be helping the<br />
larger effort?</strong></strong></p>
<p><strong><strong>kind of eco mapping..and AR on a mobile app</strong></strong></p>
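<p>The color-coded iconography idea above can be sketched in a few lines. This is only an illustrative stand-in: the footprint bands, units, and color names below are made up for the example, not taken from any real eco-mapping service.</p>

```python
# Toy sketch of the color-coded eco mapping idea: map an estimated
# annual carbon footprint (kg CO2, say) to a map-icon color.
# The band thresholds here are invented for illustration only.

def eco_icon_color(footprint_kg):
    """Return a map-icon color for an estimated annual carbon footprint."""
    if footprint_kg < 10_000:
        return "green"   # most eco-friendly band
    if footprint_kg < 50_000:
        return "yellow"  # middling
    return "red"         # heaviest footprint

print(eco_icon_color(8_000))   # green
print(eco_icon_color(72_000))  # red
```

<p>A mobile AR layer would then draw each business&#8217;s icon in the returned color, so &#8220;kindred spirit&#8221; customers can spot greener businesses at a glance.</p>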
<p><strong><strong>what about sensors that read air pollution levels, levels of solar radiation (to aid with skin protection in shifting light values in a city space..i.e. put on some skin cream now&#8230;), light sensors that detect density and over-density in public spaces&#8230;to use the old trope in art of reading crowds in a space..but instead these could indicate overcrowding, failing infrastructure in public spaces (a congestion that leads to greater pollution levels as well as flaws in city planning over time..), and perhaps a tie-in to wearables&#8230;worn sensors on smart clothes&#8230;this could form a node network of people in the crowds&#8230;and also send data while moving in a space&#8230;</strong></strong></p>
<p><strong><strong>here is a kooky thought&#8230; what of taking the computing power and data of people moving in a space..and not only getting eco data and making available to them levels of<br />
data..but possibly making a roving super computer&#8230;crunching the deeper data of people open to this&#8230;a hive crunching deeper analysis of the space, scan properties from sensors, and even a game-theory-esque algorithm of meta data &#8211; if say 40 people out of 50 hit on a certain spike or reading&#8230;and even their input&#8230;I worked in game theory for paleontology in this manner for a time as a teen&#8230;a private project&#8230; the reading can lead to a sort of meta read by what hits most consistently..as well as in their input..text of what they experienced, observed, postulated, even analyzed&#8230;. this could be really interesting&#8230;even if just the last part, from collected data and not from any complex branching of servers..</strong></strong></p>
<p><strong><strong>I thought at 19 or so that the flaw in paleontology was in how so many larger theories were shifting exhibitions and larger senses of things &#8211; like, were there pre-historic birds that were mistaken for amphibian and then back again&#8230;so why not make a computer program and feed all the published papers into it and see what hits were counted in terms of an emerging meta theory&#8230;and a landscape of key points being agreed upon&#8230;this data would be in a sense both algorithmic and a sort of unspoken dialogue&#8230;came from a lot of study of game theory one summer&#8230;</strong></strong></p>
<p><strong><strong>hope this makes some sense&#8230;I forgot to mention that I originally planned to be a research meteorologist, and my plan in middle school or so was to get a PhD and develop new software to have a global map and then run models of hypothetical storms across it in real time animations of cloud forms, radar and wind analysis/fields, barometric pressure spaghetti charts etc&#8230;and to also do 3D cut-away models of storm architectures&#8230;so I&#8217;ve been into visualizations of complex data and mapping for a long time!</strong></strong></p>
<p><strong><strong><strong>Tish Shute:</strong> </strong></strong>Wow let me think about this one!</p>
]]></content:encoded>
			<wfw:commentRss>http://www.ugotrade.com/2009/10/13/ar-wave-layers-and-channels-of-social-augmented-experiences/feed/</wfw:commentRss>
		<slash:comments>18</slash:comments>
		</item>
		<item>
		<title>Sensor Networks and Sustainability: &#8220;Connecting Real, Virtual, Mobile and Augmented Spaces&#8221;</title>
		<link>http://www.ugotrade.com/2009/04/19/sensor-networks-and-sustainability-connecting-real-virtual-mobile-and-augmented-reality/</link>
		<comments>http://www.ugotrade.com/2009/04/19/sensor-networks-and-sustainability-connecting-real-virtual-mobile-and-augmented-reality/#comments</comments>
		<pubDate>Sun, 19 Apr 2009 06:32:59 +0000</pubDate>
		<dc:creator><![CDATA[Tish Shute]]></dc:creator>
				<category><![CDATA[3D internet]]></category>
		<category><![CDATA[Ambient Devices]]></category>
		<category><![CDATA[Ambient Displays]]></category>
		<category><![CDATA[architecture of participation]]></category>
		<category><![CDATA[Augmented Reality]]></category>
		<category><![CDATA[Carbon Footprint Reduction]]></category>
		<category><![CDATA[CurrentCost]]></category>
		<category><![CDATA[Ecological Intelligence]]></category>
		<category><![CDATA[Energy Awareness]]></category>
		<category><![CDATA[Energy Saving]]></category>
		<category><![CDATA[home automation]]></category>
		<category><![CDATA[home energy monitoring]]></category>
		<category><![CDATA[home energy monitors]]></category>
		<category><![CDATA[HomeCamp]]></category>
		<category><![CDATA[Instrumenting the World]]></category>
		<category><![CDATA[internet of things]]></category>
		<category><![CDATA[message brokers and sensors]]></category>
		<category><![CDATA[Mixed Reality]]></category>
		<category><![CDATA[mobile meets social]]></category>
		<category><![CDATA[MQTT and RSMB]]></category>
		<category><![CDATA[new urbanism]]></category>
		<category><![CDATA[OpenSim]]></category>
		<category><![CDATA[Paticipatory Culture]]></category>
		<category><![CDATA[realXtend]]></category>
		<category><![CDATA[smart appliances]]></category>
		<category><![CDATA[Smart Devices]]></category>
		<category><![CDATA[Smart Planet]]></category>
		<category><![CDATA[sustainable living]]></category>
		<category><![CDATA[sustainable mobility]]></category>
		<category><![CDATA[ubiquitous computing]]></category>
		<category><![CDATA[Virtual HomeCamp]]></category>
		<category><![CDATA[Virtual Meters]]></category>
		<category><![CDATA[Virtual Realities]]></category>
		<category><![CDATA[Web Meets World]]></category>
		<category><![CDATA[AMEE]]></category>
		<category><![CDATA[arduino]]></category>
		<category><![CDATA[Carbon Goggles]]></category>
		<category><![CDATA[distributed sustainability]]></category>
		<category><![CDATA[home energy management]]></category>
		<category><![CDATA[open data]]></category>
		<category><![CDATA[Pachube]]></category>
		<category><![CDATA[sensor networks]]></category>
		<category><![CDATA[sensor networks and sustainability]]></category>
		<category><![CDATA[SHASPA]]></category>
		<category><![CDATA[the internet of things]]></category>
		<category><![CDATA[TweetaWatt]]></category>
		<category><![CDATA[Virtual Worlds]]></category>

		<guid isPermaLink="false">http://www.ugotrade.com/?p=3381</guid>
		<description><![CDATA[Today, I did a presentation, on connecting real, virtual, mobile, and augmented spaces to support sustainability, for Earth Week SL, with Dave Pentecost and Jim Purbrick, who presented on Carbon Goggles. Dave and I focused on sensor networks, open data, Pachube, OpenSim, and sustainability from the perspective of, &#8220;hack local, think global.&#8221; Dave and I will [&#8230;]]]></description>
				<content:encoded><![CDATA[<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/04/picture-21.png"><img class="alignnone size-medium wp-image-3382" title="picture-21" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/04/picture-21-300x225.png" alt="picture-21" width="300" height="225" /></a></p>
<p>Today, I did a presentation, on <a href="http://docs.google.com/Presentation?id=dhj5mk2g_214g48q37hj" target="_blank">connecting real, virtual, mobile, and augmented spaces to support sustainability,</a> for <a href="http://slearthweek.wordpress.com/2009/04/10/earth-week-press-release-see-schedule-also/" target="_blank">Earth Week SL</a>, with <a href="http://www.gomaya.com/glyph/" target="_blank">Dave Pentecost</a> and <a href="http://jimpurbrick.com/" target="_blank">Jim Purbrick</a>, who presented on <a href="http://carbongoggles.org/" target="_blank">Carbon Goggles</a>.</p>
<p>Dave and I focused on sensor networks, open data, <a href="http://www.pachube.com/" target="_blank">Pachube</a>, <a href="http://opensimulator.org/wiki/Main_Page" target="_blank">OpenSim,</a> and sustainability from the perspective of &#8220;hack local, think global.&#8221; Dave and I will be picking up on some of these themes of sensor networks and sustainability next week in our presentation with <a href="http://www.darleon.com/" target="_blank">Dimitri Darras</a> at ITP, NYU, April 24th, 6.30 pm to 8 pm &#8211; <a href="http://itp.nyu.edu/sigs/news/special-event-open-sim/" target="_blank">details here</a>. If you are in New York City, I hope to see you there.</p>
<p>We got some interesting insights into augmented reality from <a href="http://jimpurbrick.com/" target="_blank">Jim Purbrick</a>, whose <a href="http://carbongoggles.org/" target="_blank">Carbon Goggles</a> project prototypes how we can use augmented reality to read carbon identity and to combine well organized, verified data from <a href="http://www.amee.com/" target="_blank">AMEE</a> &#8211; a neutral aggregation platform for measuring the &#8220;carbon footprint&#8221; of everything on earth &#8211; with crowd-sourced tagging and linking.</p>
<h3>Shaspa &#8211; &#8220;the sensor network system that has it all&#8221;</h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/04/picture-22.png"><img class="alignnone size-medium wp-image-3391" title="picture-22" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/04/picture-22-300x224.png" alt="picture-22" width="300" height="224" /></a></p>
<p>We also discussed the recently launched <a href="http://www.shaspa.com/" target="_blank">Shaspa</a>. Shaspa&#8217;s energy management packages connect spaces &#8211; real, virtual, mobile and augmented. Shaspa has been blogged by <a href="http://www.maxping.org/business/real-life/virtual-management-of-energy-consumption-in-the-home.aspx/" target="_blank">Maxping</a> and <a href="http://www.virtualworldsnews.com/2009/04/shaspa-launches-home-energy-organizer-on-opensim.html" target="_blank">Virtual World News</a>, so you can read all about it, but the Shaspa device kit won&#8217;t be available until next week. Some key features of the Home Energy package are listed on the slide above. However, this evening, Dave Pentecost and I got a sneak preview of both the Shaspa community and enterprise hardware and software packages from Shaspa founder Oliver Goh. We were pretty impressed.</p>
<p><strong>Dave:</strong> &#8220;<strong>It&#8217;s the ultimate hackable device for energy management!&#8221;</strong></p>
<p><strong>Oliver:</strong> <strong>&#8220;Bring us any sensor device &#8211; with documentation, and within three days we will put a driver into Shaspa.&#8221;</strong></p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/04/daveandoliverpost.jpg"><img class="alignnone size-medium wp-image-3392" title="daveandoliverpost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/04/daveandoliverpost-300x178.jpg" alt="daveandoliverpost" width="300" height="178" /></a></p>
<p>Oliver is on the right and Dave on the left in the picture above. The picture below shows Shaspa in OpenSim. Oliver and I will be attending the <a href="http://www.3dtlc.com/"><span style="color: #810081;">3D Training, Learning and Collaboration</span></a> Conference in Washington, DC, next week.</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/04/picture-23.png"><img class="alignnone size-medium wp-image-3412" title="picture-23" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/04/picture-23-300x208.png" alt="picture-23" width="300" height="208" /></a></p>
<h3>Links</h3>
<p>Here are some of the links that came up in the presentation as many people asked for them to be published. Dave also has them on <a href="http://www.gomaya.com/glyph/archives/002520.html#002520" target="_blank">his blog</a>.</p>
<p>SLIDES on GOOGLE DOCS:<br />
<a title="Earth Week SL Presentation, April 18th, 2009 - Google Docs" href="http://docs.google.com/Presentation?id=dhj5mk2g_214g48q37hj">Earth Week SL Presentation, April 18th, 2009 &#8211; Google Docs</a></p>
<p><a href="http://www.ugotrade.com/2009/01/28/pachube-patching-the-planet-interview-with-usman-haque/" target="_blank">Pachube, sensor networks</a></p>
<p><a href="http://www.gomaya.com/glyph" target="_blank">Dave&#8217;s blog covering Maya archaeology, jungle ecology, and technology</a></p>
<p><a href="http://www.gomaya.com/glyph/archives/001914.html" target="_blank">Maya Frontier, Usumacinta River videos</a></p>
<p><a href="http://en.wikipedia.org/wiki/Collapse_(book)" target="_blank">Collapse</a></p>
<p><a href="http://arduino.cc/" target="_blank">Arduino</a> &#8211; microcontrollers</p>
<p><a href="http://community.pachube.com/tutorials" target="_blank">Pachube &#8211; tutorials</a></p>
<p><a href="http://apps.pachube.com/" target="_blank">Pachube Apps</a></p>
<p><a href="http://www.pachube.com/feeds/1284" target="_blank">Arduino-SL-Pachube data site</a></p>
<p><a href="http://www.pachube.com/feeds/1505" target="_blank">SL to Pachube site</a></p>
<p><a href="http://www.zachhoeken.com/connecting-to-the-world" target="_blank">Dave&#8217;s Danger Shield &#8211; Pachube  tutorial</a></p>
<p><a href="http://www.ladyada.net/make/tweetawatt/" target="_blank">TweetaWatt site (LadyAda)</a></p>
<p><a href="http://www.gomaya.com/glyph/archives/002505.html" target="_blank">Dave&#8217;s post on TweetaWatt to Opensim/SL</a></p>
<p><a href="http://peterquirk.wordpress.com/2008/12/22/tutorial-using-the-streamlined-tool-chain-for-importing-sketchup-models-into-realxtend-04/" target="_blank">Peter Quirk&#8217;s post on Importing Sketchup into RealXtend</a></p>
<p><a href="http://opensimulator.org/wiki/Main_Page" target="_blank">Opensim</a></p>
<p><a href="http://www.realxtend.org/" target="_blank">RealXtend</a></p>
<p><a href="http://reactiongrid.com/" target="_blank">ReactionGrid</a></p>
<p><a href="http://homecamp.pbwiki.com/" target="_blank">homecamp</a></p>
<p><a href="http://www.cminion.com/wordpress/" target="_blank">cminion -wind turbines in OpenSim</a></p>
<p><a href="http://mikethebee.mevio.com/" target="_blank">MiketheBee</a></p>
<p><a href="http://www.ugotrade.com/2009/01/17/is-it-%E2%80%9Comg-finally%E2%80%9D-for-augmented-reality-interview-with-robert-rice/" target="_blank">Is it &#8220;OMG finally&#8221; for Augmented Reality?</a></p>
<p><a href="http://www.ugotrade.com/2008/12/15/smart-planetinterview-with-andy-stanford-clark/" target="_blank">Smart Planet: Interview with Andy Stanford-Clark</a></p>
<p><a href="http://www.orangecone.com/" target="_blank">Orange Cone &#8211; Information Shadows and Things as Services</a></p>
]]></content:encoded>
			<wfw:commentRss>http://www.ugotrade.com/2009/04/19/sensor-networks-and-sustainability-connecting-real-virtual-mobile-and-augmented-reality/feed/</wfw:commentRss>
		<slash:comments>2</slash:comments>
		</item>
		<item>
		<title>Making a RFID to Web Interface and LilyPad Electronic Fashion at ETech 2009</title>
		<link>http://www.ugotrade.com/2009/03/10/making-a-rfid-to-web-interface-and-lilypad-electronic-fashion-at-etech-2009/</link>
		<comments>http://www.ugotrade.com/2009/03/10/making-a-rfid-to-web-interface-and-lilypad-electronic-fashion-at-etech-2009/#comments</comments>
		<pubDate>Tue, 10 Mar 2009 17:38:30 +0000</pubDate>
		<dc:creator><![CDATA[Tish Shute]]></dc:creator>
				<category><![CDATA[Ambient Devices]]></category>
		<category><![CDATA[Ambient Displays]]></category>
		<category><![CDATA[Carbon Footprint Reduction]]></category>
		<category><![CDATA[culture of participation]]></category>
		<category><![CDATA[Energy Awareness]]></category>
		<category><![CDATA[Instrumenting the World]]></category>
		<category><![CDATA[message brokers and sensors]]></category>
		<category><![CDATA[new urbanism]]></category>
		<category><![CDATA[Paticipatory Culture]]></category>
		<category><![CDATA[Smart Devices]]></category>
		<category><![CDATA[Smart Planet]]></category>
		<category><![CDATA[sustainable living]]></category>
		<category><![CDATA[ubiquitous computing]]></category>
		<category><![CDATA[Web 2.0]]></category>
		<category><![CDATA[Web Meets World]]></category>
		<category><![CDATA[#etech]]></category>
		<category><![CDATA[Add new tag]]></category>
		<category><![CDATA[arduino]]></category>
		<category><![CDATA[electronic fashion]]></category>
		<category><![CDATA[etech maker shed]]></category>
		<category><![CDATA[Etech2009]]></category>
		<category><![CDATA[interaction design]]></category>
		<category><![CDATA[internetworked worlds]]></category>
		<category><![CDATA[Leah Buechley]]></category>
		<category><![CDATA[LilyPad]]></category>
		<category><![CDATA[Make]]></category>
		<category><![CDATA[Maker workshop]]></category>
		<category><![CDATA[Pachube]]></category>
		<category><![CDATA[physical computing]]></category>
		<category><![CDATA[processing]]></category>
		<category><![CDATA[RFID]]></category>
		<category><![CDATA[RFID to Web Interface]]></category>
		<category><![CDATA[Tom Igoe]]></category>
		<category><![CDATA[user experience design]]></category>
		<category><![CDATA[Wattzon]]></category>

		<guid isPermaLink="false">http://www.ugotrade.com/?p=3115</guid>
		<description><![CDATA[&#8220;Come to ETech; Experiment with Physical Computing and RFIDs&#8221; said Brady Forrest. The ETech RFID tag that I activated at registration is a gateway to several internetworked worlds.Â  It allows you to check into pulse stations to tell you about people with similar interests to you based on your traffic movements around the conference.Â  There [&#8230;]]]></description>
				<content:encoded><![CDATA[<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/03/ahmedriazrfidreaderpost.jpg"><img class="alignnone size-full wp-image-3118" title="ahmedriazrfidreaderpost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/03/ahmedriazrfidreaderpost.jpg" alt="ahmedriazrfidreaderpost" width="500" height="332" /></a></p>
<p><a href="http://radar.oreilly.com/2009/02/etech-rfid-proximity-interaction.html" target="_blank">&#8220;Come to ETech; Experiment with Physical Computing and RFIDs&#8221;</a> said Brady Forrest. The ETech RFID tag that I activated at registration is a gateway to several internetworked worlds. It allows you to check into pulse stations that tell you about people with interests similar to yours, based on your movements around the conference. There is a Lensley photo booth that allows you to upload photos to Flickr. And there is even a Fortune Teller, arriving tomorrow, from Josh and Tarikh of <a href="http://uncommonprojects.com/site/">Uncommon Projects</a> (makers of the awesome <a href="http://uncommonprojects.com/site/work/ybike">Yahoo! geo-bike</a>) &#8211; and more.</p>
<p>But the <a href="http://en.oreilly.com/et2009/public/schedule/grid/2009-03-09" target="_blank">first day of ETech 2009</a> was packed with hands-on workshops. And I actually managed to make, in <a href="http://en.oreilly.com/et2009/public/schedule/detail/5455" target="_blank">Tom Igoe&#8217;s Hands-On RFID for Makers</a> workshop, my first RFID to web interface that could read <a href="http://radar.oreilly.com/200902101213.jpg" target="_blank">Etech&#8217;s elegant RFID tags</a> (also see <a href="http://www.flickr.com/photos/ugotrade/sets/72157614982983865/" target="_blank">my photo set on Flickr</a> to get a glimpse of the action in the workshop). Amazingly it worked perfectly the first time (I did have help from the very patient executive editor of <a href="http://www.oreillynet.com/pub/au/3428" target="_blank">Maker Media</a> Books, Brian Jepson). And Tom Igoe&#8217;s <a href="http://www.tigoe.net/pcomp/code/" target="_blank">step by step instructions on his website</a> are invaluable (picture of Tom Igoe below).</p>
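<p>The rough shape of such an RFID to web interface is: the Arduino writes each tag ID it reads to the serial port, and the host side turns that into a web lookup. The sketch below shows only that host-side logic in Python; the <code>TAG:</code> line format and the lookup URL are hypothetical stand-ins, not the actual ETech ones.</p>

```python
# Host side of a toy RFID-to-web interface: parse tag IDs from serial
# lines and build the profile URL to fetch. The "TAG:<hex>" framing and
# the example.com base URL are invented for this illustration.

def parse_tag(serial_line):
    """Extract a hex tag ID from a line like 'TAG:0A1B2C3D'."""
    line = serial_line.strip()
    if line.startswith("TAG:"):
        return line[4:]
    return None  # noise or a partial read

def profile_url(tag_id, base="http://example.com/profiles"):
    """Build the URL the interface would fetch for a scanned tag."""
    return f"{base}/{tag_id}" if tag_id else None

print(profile_url(parse_tag("TAG:0A1B2C3D\r\n")))
# http://example.com/profiles/0A1B2C3D
```

<p>In the workshop build, the fetch-and-display step was done in Processing; the parsing logic is the same either way.</p>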
<p>It was very exciting to actually get hands-on with the <a href="http://www.arduino.cc/" target="_blank">Arduino</a> open source electronics prototyping platform, and <a href="http://processing.org/" target="_blank">Processing</a> &#8211; <strong style="font-weight: normal;">a very accessible language for doing dynamic and interactive graphics for screen-based media</strong>. You&#8217;ll know how much I love to write about these things if you have checked out some of my previous posts.</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/03/tomigoepost.jpg"><img class="alignnone size-full wp-image-3120" title="tomigoepost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/03/tomigoepost.jpg" alt="tomigoepost" width="476" height="578" /></a></p>
<p>My &#8220;build&#8221; is sitting on the right of my workshop neighbor Ahmed Riaz (ebay) in the photo opening this post. We shared power supplies and a great discussion on interaction and user experience design (see my previous post, <a href="http://www.ugotrade.com/2009/02/27/towards-a-newer-urbanism-talking-cities-networks-and-publics-with-adam-greenfield/" target="_blank">Towards a Newer Urbanism: Talking Cities, Networks, and Publics with Adam Greenfield</a>, for an idea of some of the topics that we touched on). We also discovered a shared interest in User Experience Design sketches &#8211; see <a href="http://ahmedriaz.com/mind/" target="_blank">Ahmed&#8217;s blog here</a> and <a href="http://www.flickr.com/photos/nationless/" target="_blank">his flickr stream</a> for his project on UX sketches. I have <a href="http://tishshute.com/what-you-want-machine">reposted here one of my favorite UX sketches</a> done by an eight year old, especially for Ahmed.</p>
<p>If you look closely at the picture below you will see that Ahmed&#8217;s RFID to web interface has read my Etech RFID tag and pulled up <a href="http://en.oreilly.com/et2009/profile/38011" target="_blank">my Etech Conflink profile and picture</a>.</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/03/rfidprofilepost.jpg"><img class="alignnone size-full wp-image-3122" title="rfidprofilepost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/03/rfidprofilepost.jpg" alt="rfidprofilepost" width="500" height="332" /></a><br />
In the evening, Tom Igoe announced during <a href="http://en.oreilly.com/et2009/public/schedule/detail/6980" target="_blank">his Ignite presentation</a> that an Arduino MEGA will be available next week &#8211; more pins, more ports, more memory. I can&#8217;t wait to see what people come up with for the MEGA, especially as <a href="http://www.ugotrade.com/2009/01/28/pachube-patching-the-planet-interview-with-usman-haque/" target="_blank">Pachube</a> (another favorite project of mine &#8211; <a href="http://www.ugotrade.com/2009/01/28/pachube-patching-the-planet-interview-with-usman-haque/" target="_blank">see my interview with founder Usman Haque here</a>) is designed to work with Arduino and Processing.</p>
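<p>Pachube at the time accepted simple CSV updates to a feed over HTTP, which is part of why it paired so naturally with Arduino. As a sketch of the shape of such an update (no network call is made, and the API key is a placeholder; the feed ID 1284 is the Arduino-SL-Pachube feed linked below), one might build the request pieces like this:</p>

```python
# Build the pieces of a CSV update to a Pachube feed. This only
# constructs the URL, headers, and body -- no request is sent -- and
# the API key below is a placeholder for illustration.

def pachube_request(feed_id, values, api_key="YOUR_API_KEY"):
    """Return (url, headers, body) for a CSV update to a Pachube feed."""
    url = f"http://api.pachube.com/v2/feeds/{feed_id}.csv"
    headers = {"X-PachubeApiKey": api_key}
    body = ",".join(str(v) for v in values)  # one value per datastream
    return url, headers, body

url, headers, body = pachube_request(1284, [23.5, 48])
print(url)   # http://api.pachube.com/v2/feeds/1284.csv
print(body)  # 23.5,48
```

<p>An Arduino sketch does the same thing on-device: assemble the CSV line from its sensor readings and PUT it to the feed URL with the key in a header.</p>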
<p>I think I&#8217;m hooked on Maker culture. I can&#8217;t wait to check out the <a href="http://blog.makezine.com/archive/2009/03/make_stuff_at_the_etech_maker_shed.html" target="_blank">Etech Maker Shed</a> that opens today. I got a feel for the excitement of rapid prototyping in the morning doing the <a href="http://en.oreilly.com/et2009/public/schedule/detail/6663" target="_blank">LilyPad Electronic Fashion workshop with Leah Buechley</a>, a brilliant and patient teacher. Leah is checking out Raffi Krikorian and Tom Igoe&#8217;s progress in the photo below.</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/03/leahbuechleyinspectingtomigoesworkpost.jpg"><img class="alignnone size-full wp-image-3123" title="leahbuechleyinspectingtomigoesworkpost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/03/leahbuechleyinspectingtomigoesworkpost.jpg" alt="leahbuechleyinspectingtomigoesworkpost" width="500" height="332" /></a></p>
<p>There was some big talent in the Lilypad workshop. The <a href="http://www.wattzon.com/" target="_blank">Wattzon</a> team, Raffi Krikorian and Jeremy Cloud, and Wattzon-phile Tom Igoe stitched and ironed (<a href="http://www.flickr.com/photos/ugotrade/sets/72157615047320486/" target="_blank">see my Flickr stream here</a>), and helped out noobs like me. Possibly we will see some programmable T-Shirts displaying carbon footprint data. But certainly you can use <a href="http://www.wattzon.com/" target="_blank">Wattzon</a> to compute the embodied energy data of all the Lilypad components.</p>
<p>I was a little hampered by my appalling needlework skills. But Maker culture came to the rescue when I twittered about needlework phobia and LilyPad love. @dpentecost replied in seconds inviting me to &#8220;sew and tell&#8221; at a NYC Lilypad meetup when I return to NYC. Below is a picture of Jeremy Cloud&#8217;s excellent stitching with the challenging silver plated thread.</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/03/silverplatedthreadpost.jpg"><img class="alignnone size-full wp-image-3124" title="silverplatedthreadpost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/03/silverplatedthreadpost.jpg" alt="silverplatedthreadpost" width="500" height="332" /></a></p>
]]></content:encoded>
			<wfw:commentRss>http://www.ugotrade.com/2009/03/10/making-a-rfid-to-web-interface-and-lilypad-electronic-fashion-at-etech-2009/feed/</wfw:commentRss>
		<slash:comments>7</slash:comments>
		</item>
		<item>
		<title>Pachube, Patching the Planet: Interview with Usman Haque</title>
		<link>http://www.ugotrade.com/2009/01/28/pachube-patching-the-planet-interview-with-usman-haque/</link>
		<comments>http://www.ugotrade.com/2009/01/28/pachube-patching-the-planet-interview-with-usman-haque/#comments</comments>
		<pubDate>Wed, 28 Jan 2009 16:31:41 +0000</pubDate>
		<dc:creator><![CDATA[Tish Shute]]></dc:creator>
				<category><![CDATA[Ambient Devices]]></category>
		<category><![CDATA[Ambient Displays]]></category>
		<category><![CDATA[architecture of participation]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Augmented Reality]]></category>
		<category><![CDATA[Carbon Footprint Reduction]]></category>
		<category><![CDATA[culture of participation]]></category>
		<category><![CDATA[CurrentCost]]></category>
		<category><![CDATA[Ecological Intelligence]]></category>
		<category><![CDATA[Energy Awareness]]></category>
		<category><![CDATA[Energy Saving]]></category>
		<category><![CDATA[home automation]]></category>
		<category><![CDATA[home energy monitoring]]></category>
		<category><![CDATA[home energy monitors]]></category>
		<category><![CDATA[HomeCamp]]></category>
		<category><![CDATA[Instrumenting the World]]></category>
		<category><![CDATA[message brokers and sensors]]></category>
		<category><![CDATA[mirror worlds]]></category>
		<category><![CDATA[Mixed Reality]]></category>
		<category><![CDATA[MQTT and RSMB]]></category>
		<category><![CDATA[open metaverse]]></category>
		<category><![CDATA[OpenSim]]></category>
		<category><![CDATA[Paticipatory Culture]]></category>
		<category><![CDATA[Second Life]]></category>
		<category><![CDATA[smart appliances]]></category>
		<category><![CDATA[Smart Devices]]></category>
		<category><![CDATA[Smart Planet]]></category>
		<category><![CDATA[sustainable living]]></category>
		<category><![CDATA[sustainable mobility]]></category>
		<category><![CDATA[Virtual HomeCamp]]></category>
		<category><![CDATA[Virtual Meters]]></category>
		<category><![CDATA[Virtual Realities]]></category>
		<category><![CDATA[Web 2.0]]></category>
		<category><![CDATA[Web Meets World]]></category>
		<category><![CDATA[arduino]]></category>
		<category><![CDATA[connecting environments]]></category>
		<category><![CDATA[dynamic environments]]></category>
		<category><![CDATA[electronically assisted plants]]></category>
		<category><![CDATA[Extended Environment Markup Language]]></category>
		<category><![CDATA[Pachube]]></category>
		<category><![CDATA[sensor technology]]></category>
		<category><![CDATA[smart buildings]]></category>
		<category><![CDATA[smart spaces]]></category>
		<category><![CDATA[social networking sensor data]]></category>
		<category><![CDATA[software of space]]></category>
		<category><![CDATA[sustainable real estate]]></category>
		<category><![CDATA[the street as a platform]]></category>
		<category><![CDATA[ubicomp]]></category>

		<guid isPermaLink="false">http://www.ugotrade.com/?p=2686</guid>
		<description><![CDATA[Usman Haque (architect and director, Haque Design + Research) and founder of Pachube pointed me to this image from T.R. Oke&#8217;s book, &#8220;Boundary Layer Climates&#8221; (original photo source Prof. L. E. Mount&#8217;s The Climatic Physiology of the Pig) to explain his approach to the &#8220;software&#8221; of space. My focus as an architect has always been [&#8230;]]]></description>
<content:encoded><![CDATA[<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/dcfwgkt_8g2dvxgdg_b2.jpg"><img class="alignnone size-full wp-image-2835" title="piglets" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/dcfwgkt_8g2dvxgdg_b2.jpg" alt="piglets" width="614" height="407" /></a></p>
<p>Usman Haque (architect and director, <a id="o.td" title="Haque Design + Research" href="http://www.haque.co.uk/" target="_blank">Haque Design + Research</a>) and founder of <a id="cpbp" title="Pachube" href="http://www.pachube.com/">Pachube</a> pointed me to this image from <a href="http://www.geog.ubc.ca/~toke/Profile.htm" target="_blank">T.R. Oke&#8217;s</a> book, <a href="http://www.amazon.com/Boundary-Layer-Climates-T-Oke/dp/0415043190" target="_blank">&#8220;Boundary Layer Climates&#8221;</a> (original photo source Prof. L. E. Mount&#8217;s <a href="http://www.alibris.com/booksearch?qwork=1137594&amp;matches=1&amp;author=Mount%2C+Laurence+Edward&amp;browse=1&amp;cm_sp=works*listing*title" target="_blank">The Climatic Physiology of the Pig</a>) to explain his approach to the &#8220;software&#8221; of space.</p>
<p><em>My focus as an architect has always been to consider what I&#8217;ve called the &#8220;software&#8221; of space (sounds, smell, light, temperature, electromagnetic fields, social relationships, etc.) rather than the &#8220;hardware&#8221; (floors, walls, roof, etc.) as it has traditionally been considered. The image (above) really sums up why I think this is important.</em></p>
<p><em>It&#8217;s the same piglets, in the same box, but on the right hand side the temperature has been increased. This small change in how the space is &#8220;programmed&#8221; has dramatically changed the way the &#8216;inhabitants&#8217; relate to each other and how they relate to their space. This approach to architecture became my challenge: how to translate such strategies into the general architectural discourse and how to bring into reality such possibilities for the construction industry.</em></p>
<h3>&#8220;Connecting Environments, Patching the Planet&#8221;<em><br />
</em></h3>
<p>Pachube is the culmination of 12 years of work.</p>
<p><em>&#8220;It is now occupying pretty much all my time and will do for the foreseeable future,&#8221; </em>Usman told me.</p>
<p>Haque Design + Research is not foregrounded on the <a id="q51:" title="Pachube site" href="http://www.pachube.com/" target="_blank">Pachube site</a>, and I did not make the connection at first. But when I followed a small link at the bottom, I was soon delving into the <a id="n4ku" title="work of Usman Haque" href="http://www.haque.co.uk/" target="_blank">work of Usman Haque</a>. Then the penny dropped and I realized that Pachube is not only:</p>
<p><em>A web service that enables people to tag and share real time sensor data from objects, devices and spaces around the world, facilitating interaction between remote environments, both physical and virtual.</em></p>
<p>Pachube is also a really big idea.</p>
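<p>In practice, "tag and share real time sensor data" meant a device could update its feed with a plain HTTP request. As a sketch only: the endpoint shape, the CSV body, and the <code>X-PachubeApiKey</code> header below are my recollection of the early Pachube API rather than confirmed documentation, and the feed ID and key are placeholders; the request is constructed but never sent.</p>

```python
import urllib.request

# Hypothetical values -- the feed ID and API key are placeholders, and
# the endpoint/header shape is an assumption about the early Pachube API.
FEED_ID = 1234
API_KEY = "YOUR_PACHUBE_API_KEY"

def build_update_request(feed_id, api_key, watts):
    """Build (but do not send) the HTTP PUT that would update a feed's
    current value, CSV-style, as early Pachube clients did."""
    url = "http://www.pachube.com/api/feeds/%d.csv" % feed_id
    body = str(watts).encode("ascii")           # CSV body: just the value
    req = urllib.request.Request(url, data=body, method="PUT")
    req.add_header("X-PachubeApiKey", api_key)  # assumed header name
    return req

req = build_update_request(FEED_ID, API_KEY, 412.5)
print(req.get_method(), req.full_url)
```

Sending the request (via <code>urllib.request.urlopen(req)</code>) would then push the reading to the feed, which any other Pachube-connected environment could subscribe to.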
<h3><strong>Ubicomp and the &#8220;Software of Space?&#8221;<br />
</strong></h3>
<p>Usman suggested that, if I really wanted to go back to the beginning of the Pachube vision, I should check out the work of Dutch architect Constant Nieuwenhuys and his 1956 proposal for a visionary society, <a id="y-7j" style="font-weight: normal;" title="New Babylon" href="http://www.artfacts.net/index.php/pageType/exhibitionInfo/exhibition/15904" target="_blank">New Babylon</a>.</p>
<p>Usman explained:</p>
<p><em>Constant Nieuwenhuys is certainly an inspiration for Pachube. He envisages a globally connected architecture, built by its inhabitants &#8211; configured, reconfigured, reappropriated&#8230;</em></p>
<p>For a more contemporary reference, Usman noted there are lots of overlapping concepts with <a id="d21o" title="Adam Greenfield (head of design direction for service and user-interface design at Nokia)" href="http://speedbird.wordpress.com/about/" target="_blank">Adam Greenfield&#8217;s work</a>. Adam is head of design direction for service and user-interface design at Nokia. See <a id="spz5" title="The dawning age of ubiquitous computing" href="http://www.amazon.com/exec/obidos/ASIN/0321384016/v2organisa/" target="_blank">Everyware: The Dawning Age of Ubiquitous Computing</a> and <a href="http://www.lulu.com/content/1554599">Urban Computing and its Discontents</a> to understand more about the vision Adam Greenfield has been developing.</p>
<p>Pachube is right in the zone with the ideas outlined in <a id="pxeu" title="The project description for Adam Greenfield's upcoming book, The City Is Here For You To Use" href="http://speedbird.wordpress.com/2008/01/01/new-day-rising/" target="_blank">The project description </a>for Adam Greenfield&#8217;s upcoming book,<a id="pxeu" title="The project description for Adam Greenfield's upcoming book, The City Is Here For You To Use" href="http://speedbird.wordpress.com/2008/01/01/new-day-rising/" target="_blank"> The City Is Here For You To Use</a>:</p>
<p><em>The City&#8230; takes everything explored in Everyware as a given, and a point of departure. It assumes that emergent technologies like RFID, mesh networking and shape-memory actuators&#8230; will simply be part of how cities will be made from now on&#8230;</em></p>
<h3 style="text-align: left;">The Pachube Team</h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/pachubeteamfull.jpg"><img class="alignnone size-full wp-image-2764" title="pachubeteamfull" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/pachubeteamfull.jpg" alt="pachubeteamfull" width="480" height="344" /></a></p>
<p>The Pachube Team &#8211; Usman Haque (creative director), Chris Leung (EEML developer), on the photoshopped laptop: Chris Burman (&#8220;example-maker&#8221;, e.g. SL code and Google SketchUp plugin), Ai Hasegawa (graphic designer), Sam Mulube (technical producer and website development).</p>
<p>Also, with Bruce Sterling as a &#8220;visionary&#8221; adviser and other luminaries involved, Pachube has some brilliant guiding lights. Usman pointed out that many people <em>&#8220;have helped, prodded, nudged and advised along the way!&#8221;</em></p>
<div><em>Gavin Starks and also Dopplr&#8217;s Matt Biddulph have been sort of &#8220;friendly neighbours&#8221; to Pachube: they&#8217;ve made some great introductions and I turn to them often for advice on being a London start-up. What&#8217;s been really useful for me is that they are active in a related area and have directly useful advice: Gavin, of course, since he&#8217;s involved in metering the world&#8217;s energy; and Matt perhaps less tangibly in his day job as Dopplr&#8217;s CTO but more so in his active Arduino-enabled social life!</em></div>
<div><em><br />
</em></div>
<div><em>One very important Pachube advisor has been Dr. Paul Pangaro, who has previously been CTO at a number of technology startups, and brings vital experience from his time at Sun Microsystems as Senior Director and Distinguished Market Strategist. Oh, and he&#8217;s also a former student and collaborator of Gordon Pask&#8217;s! He has been very helpful in developing a viable business model in conjunction with my brother Yusuf Haque, who, with his experience in raising capital for startups, has led the fundraising process.</em></div>
<div><em><br />
</em></div>
<div><em>Of course, direct daily input from the Pachube team has been vital to the development of the project, and without Chris Leung (EEML development) and Sam Mulube (backend development) it would be a very different thing indeed!</em></div>
<div>
<h3>Pachube is not just a social networking project for sensor data.</h3>
<p>Pachube evolved out of three strands of thought:</p>
<p><em>1) the geographical non-specificity of architecture these days as people live their lives in constant connection with people in remote spaces </em></p>
<p><em>2) a desire to open up the production process of &#8220;smart homes&#8221; in reaction to current trends for placing the design and construction process solely in the hands of knowledgeable others.</em></p>
<p><em>3) an emphasis on contextually specific &#8220;environments&#8221; rather than object-centric &#8220;sensors&#8221;</em></p>
<p>Sensor/actuator integrations are a part of what Pachube is about (also see Peter Quirk&#8217;s in-depth post on <a id="ai70" title="the strong connection between virtual worlds and sensor networks" href="http://peterquirk.wordpress.com/2009/01/21/sensor-networks-and-virtual-worlds/" target="_blank">the strong connection between virtual worlds and sensor networks</a>), and an interest in home automation and energy management is giving a lot of early momentum to Pachube.</p>
<p>But Usman makes clear Pachube is about &#8220;environments&#8221; rather than &#8220;sensors.&#8221; &#8220;An &#8216;environment&#8217; has dynamic frames of reference, all of which are excluded when simply focusing on devices, objects or mere sensors&#8221; (Usman explains this in depth in the interview below). A central part of Pachube is the development of the <a id="f0b2" title="Extended Environments Markup Language." href="http://www.eeml.org/" target="_blank">Extended Environments Markup Language</a>.</p>
<h3>Extended Environment Markup Language</h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/eeml.jpg"><img class="alignnone size-full wp-image-2765" title="eeml diagram" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/eeml.jpg" alt="eeml diagram" width="520" height="159" /></a></p>
<p><em>Pachube came about as a direct attempt to enable the production of dynamic, responsive, conversant &#8216;environments&#8217;. </em></p>
<p><em>The <a id="gv6y" style="color: #551a8b;" title="Extended Environments Markup Language (EEML)" href="http://www.eeml.org/" target="_blank">Extended Environments Markup Language (EEML)</a> (which is the protocol around which much of Pachube is based) is being developed to make the idea of &#8220;dynamic, responsive and conversant environments&#8221; a reality. It works with existing construction standards like <a id="l7sl" style="color: #551a8b;" title="Industry Foundation Classes (IFC)" href="http://en.wikipedia.org/wiki/Industry_Foundation_Classes" target="_blank">Industry Foundation Classes (IFCs)</a>, but exists to extend them to account for dynamic, responsive and, dare I say it, conversant buildings. </em></p>
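<p>EEML is an XML dialect, so an "environment" with its datastreams serializes into a small document. The element names below follow the general EEML shape (environment, title, data, value) but the exact namespace and schema live at eeml.org; treat this as an illustrative sketch built with Python's standard library, not a validated EEML instance.</p>

```python
import xml.etree.ElementTree as ET

# Illustrative only: element names follow the general EEML shape
# (environment -> data -> value); the namespace URI is an assumption.
EEML_NS = "http://www.eeml.org/xsd/005"

def make_environment(title, datastreams):
    """Serialize one environment with its datastreams in EEML style.
    `datastreams` maps a numeric id to the stream's current value."""
    eeml = ET.Element("eeml", xmlns=EEML_NS)
    env = ET.SubElement(eeml, "environment")
    ET.SubElement(env, "title").text = title
    for ds_id, value in sorted(datastreams.items()):
        data = ET.SubElement(env, "data", id=str(ds_id))
        ET.SubElement(data, "value").text = str(value)
    return ET.tostring(eeml, encoding="unicode")

# Two datastreams: indoor temperature and current power draw.
doc = make_environment("My flat", {0: 21.5, 1: 412})
print(doc)
```

The point of the format is that the same document describes a flat, a greenhouse, or a virtual building in OpenSim, so feeds from very different "environments" can be patched together.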
<p>A key member of the Pachube team doing EEML development is <a id="h3n5" title="Chris Leung" href="http://www.chrisleung.org/" target="_blank">Chris Leung</a>. Haque Design + Research is the industry sponsor of Chris&#8217;s doctorate, which:</p>
<p><em>investigates how Architectural and Engineering consultancies can use advanced imaging, sensing and visualisation technology to capture, record and playback the responsive behaviour of built Architecture in response to its environment as a decision-support tool to meet this unique challenge.</em></p>
<p><strong><a href="http://www.chrisleung.org/CaseStudy1.htm">Case-Study I &#8211; Kielder Forest</a></strong></p>
<p><em><strong><img class="alignnone size-medium wp-image-2707" title="kielderforest" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/kielderforest-300x225.jpg" alt="kielderforest" width="300" height="225" /></strong></em></p>
<p>Usman explained to me the full vision for Pachube is not yet fleshed out on the web site (so read the full interview!), and this is in part because the focus has been on building a backend capable of handling millions of users.</p>
<h3>The business model for Pachube</h3>
<p>Usman explained his commitment to an ethically driven business model to allow a diverse group of companies and individuals to transition to the internet of things. Usman emphasizes that one of his chief concerns is to make sure that these technologies of &#8220;extreme connectivity,&#8221; that will soon be part of every aspect of our lives, are in the hands of all who want to use them.<br />
<em><br />
Pachube is here to make it easier to participate in what I expect to be a vast &#8216;eco-system&#8217; of conversant devices, buildings &amp; environments. </em></p>
<p><em>Pachube will facilitate the development of a huge range of new products and services that will arise from extreme connectivity. It&#8217;s relatively easy for large technology companies like Nike and Apple to transition into the Internet of Things, but Pachube will be particularly helpful for that huge portion of smaller scale industry players that *want* to become part of it, but which are only now waking up to the potentials of the internet &#8212; small and medium scale designers, manufacturers and developers who are very good at developing their products but don&#8217;t have the resources to develop in-house a massive infrastructure for their newly web-enabled offerings. </em></p>
<p><em>Basically, having built a generalized data-brokering backend to connect physical (and virtual) entities to the web, others can now start to build the applications that make the connections really useful. </em></p>
<h3>An Inspired Community of Early Adopters and Business Visionaries</h3>
<p><em><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/monkchipsathomecamp1.jpg"><img class="alignnone size-full wp-image-2766" title="monkchipsathomecamp1" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/monkchipsathomecamp1.jpg" alt="monkchipsathomecamp1" width="462" height="308" /></a><br />
</em></p>
<p>James Governor (<a id="qd8i" title="@monkchips" href="http://twitter.com/monkchips" target="_blank">@monkchips</a>) of <a href="http://redmonk.com/">Redmonk</a> has Pachube, <a href="http://currentcost.co.uk/">Current Cost</a>, <a id="g.i:" title="using MQTT" href="http://mqtt.org/" target="_blank">MQTT</a> and RSMB (<a id="h0is" title="IBM AlphaWorks" href="http://alphaworks.ibm.com/tech/rsmb" target="_blank">IBM AlphaWorks</a>), and <a href="http://www.arduino.cc/" target="_blank">Arduino</a> on the board at <a id="h4a0" title="HomeCamp '08" href="http://homecamp.pbwiki.com/homecamp08" target="_blank">HomeCamp &#8217;08</a>. Photo from the <a href="http://www.flickr.com/search/?q=homecamp&amp;w=29034542%40N00" target="_blank">Flickr stream</a> of <a href="http://benjaminellis.co.uk/" target="_blank">Benjamin Ellis</a>.</p>
<p>What attracted my attention to Pachube, at first, was the small but highly energized community of early adopters I noticed experimenting with Pachube. <a id="x2vv" title="Nigel Crawley" href="http://www.nigelcrawley.co.uk/" target="_blank">Nigel Crawley</a> (<a id="nf4y" title="@ni" href="http://twitter.com/ni" target="_blank">@ni</a>) and <a id="zjcv" title="James Taylor" href="http://jtlog.wordpress.com/" target="_blank">James Taylor</a> (<a id="ie4m" title="@jtonline" href="http://twitter.com/jtonline" target="_blank">@jtonline</a>) were some of the first to plunge in. <a id="o0.i" title="Rick Bullotta" href="http://www.automation.com/content/wonderware-appoints-rick-bullotta-vp-and-cto" target="_blank">Rick Bullotta</a>, Usman noted, has been very active in the community forum, bringing much-needed automation expertise to the conversation. <a id="ny-t" title="Pam Broviak" href="http://www.publicworksgroup.com/" target="_blank">Pam Broviak</a> (<a id="xkmo" title="@pbroviak" href="http://twitter.com/pbroviak" target="_blank">@pbroviak</a>) is an early Second Life adopter. And <a id="ugu0" title="Matt Biddulph" href="http://www.hackdiary.com/about/" target="_blank">Matt Biddulph</a> (CTO of <a href="http://www.dopplr.com/">Dopplr</a>) was the first non-Pachube person to get a feed up!</p>
<p>Another very active early adopter, <a id="q54j" title="Carl Johan Rosen" href="http://carljohanrosen.com/" target="_blank">Carl Johan Rosen</a>, wrote an <a href="http://www.openframeworks.cc/" target="_blank">openFrameworks</a> addon (<a id="ljuh" title="for more see here" href="http://carljohanrosen.com/?p=42" target="_blank">see here</a>) for <a href="http://www.pachube.com/" target="_blank">Pachube</a> that he presented at the <a href="http://www.aec.at/en/festival2008/program/project.asp?parent=14439&amp;iProjectID=14447" target="_blank">OFLab at the Ars Electronica Festival</a>.<br />
After the inaugural <a id="h4a0" title="HomeCamp '08" href="http://homecamp.pbwiki.com/homecamp08" target="_blank">HomeCamp</a>, where Usman and Chris Burman from Pachube were presenters (<a id="diae" title="see slides here" href="http://www.slideshare.net/tag/pachube" target="_blank">see slides here</a>), I began to notice that people were sending their Current Cost feeds into Pachube. And recently, it was announced that Pachube has a <a href="http://apps.pachube.com/carbon_footprint.php" target="_new">carbon footprint calculation app</a> which:</p>
<p><em>makes it very easy to take any Pachube feed that measures electricity consumption in watts or kilowatts and convert it into a Pachube feed that shows a realtime estimated carbon footprint for the last 15 minutes, the last hour and the last 24 hours.</em></p>
<p><em>The app makes use of international data provided by <a href="http://www.amee.cc/" target="_new">&#8216;AMEE &#8211; The world&#8217;s energy meter&#8217;</a>. AMEE provides figures that are specific to electricity suppliers in UK &amp; Ireland and specific to country in the rest of the world.</em></p>
<p><em>This app, combined with the <a href="http://community.pachube.com/?q=node/100">Current Cost app</a> makes it simple to monitor your carbon footprint on a day to day basis!</em></p>
<p>I still haven&#8217;t found out what <a id="kmt8" title="@yellowpark" href="http://twitter.com/yellowpark" target="_blank">@yellowpark</a> was doing last Saturday to produce so much CO2&#8230; (the perils of going public with your energy consumption, as <a id="am8t" title="@epachube" href="http://twitter.com/pachube" target="_blank">@epachube</a> pointed out).</p>
<p>But perhaps Chris Dalby <a id="kmt8" title="@yellowpark" href="http://twitter.com/yellowpark" target="_blank">(@yellowpark</a>) can be excused a day of CO2 excess as he has just released <a id="qf:l" title="Pachube Air" href="http://www.yellowpark.net/cdalby/index.php/2009/01/10/pachube-air-the-first-release/" target="_blank">Pachube Air</a>.</p>
<p>While enterprise and government projects are on the near horizon, Pachube is designed to introduce a DIY approach to ubicomp. Usman said he is &#8220;concerned by developments in ubiquitous computing whereby &#8216;making technology invisible&#8217; equates to placing the design and construction process solely in the hands of knowledgeable others.&#8221;</p>
<p>DIY City (see the <a id="zwms" title="Do-It-Yourself-City Project" href="http://diycity.org/diycity-main-group/call-work-first-diycity-project" target="_blank">Do-It-Yourself-City Project</a>) is developing a similar vision here in NYC.</p>
<h3>Natural Fuse: &#8220;A city wide network of electronically-assisted plants.&#8221;</h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/naturalfusenetwork1.jpg"><img class="alignnone size-full wp-image-2779" title="naturalfusenetwork1" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/naturalfusenetwork1.jpg" alt="naturalfusenetwork1" width="405" height="305" /></a></p>
<p><em>&#8220;I think we&#8217;ve really not even begun to imagine the kinds of applications that will be important,&#8221;</em> says Usman Haque.</p>
<p>Haque Design + Research, which continues with a separate team, will be involved mostly in the kinds of things it has done in the past, but <em>&#8220;also in pushing development of things that *use* Pachube.&#8221;</em> One such project is Natural Fuse, by Usman Haque, <a id="y5x7" title="Nitipak Samsen (Designer)" href="http://www.dotmancando.info/" target="_blank">Nitipak Samsen (Designer)</a>, <a id="d.p2" title="Cesar Harada (Designer)" href="http://www.cesarharada.com/" target="_blank">Cesar Harada (Designer)</a> and Barbara Jasinowicz (Producer), which was commissioned by <a href="http://www.archleague.org/index-dynamic.php?show=757" target="_new">the Architecture League</a> &amp; <a href="http://www.situatedtechnologies.net/?q=node/89" target="_new">Situated Technologies: Toward the Sentient City</a> and will open to the public in Autumn 2009.</p>
<p><em>Natural Fuse harnesses the carbon-sinking capabilities of plants to create a city-wide network of electronically-assisted plants that act both as energy providers and as shared &#8220;carbon sink&#8221; circuit breakers. By sharing resources and information between the plants, energy expenditure can be collectively monitored and managed.</em></p>
<p><em>The purpose is to create a collective &#8220;carbon sink&#8221; that offsets the amount of energy consumed by the plant owners &#8211; a natural &#8220;circuit breaker&#8221;. If people cooperate on their energy expenditure then the plants thrive (and they can all use more energy); but if they don&#8217;t then the network starts to kill plants, thus diminishing the network&#8217;s energy capacity</em> (a full description of Natural Fuse appears in the interview below).</p>
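<p>The circuit-breaker logic can be captured in a toy model: each plant contributes some carbon offset per hour, and an appliance is allowed to switch on only while the collective emissions stay inside the collective sink. Every name and figure here is a hypothetical sketch of the concept, not the project's actual firmware.</p>

```python
# Toy model of the Natural Fuse idea: plants form a shared carbon sink,
# and appliances may only run while total emissions fit inside it.
# All names and figures are hypothetical illustrations of the concept.

class NaturalFuseNetwork:
    def __init__(self, kg_co2_per_kwh=0.43):  # assumed grid factor
        self.plants = []       # each entry: kg CO2 offset per hour
        self.loads_kw = []     # appliances currently running
        self.factor = kg_co2_per_kwh

    def add_plant(self, offset_kg_per_hour):
        self.plants.append(offset_kg_per_hour)

    def sink_capacity(self):
        """Collective offset of all plants, kg CO2 per hour."""
        return sum(self.plants)

    def emissions(self):
        """Current emissions of all running loads, kg CO2 per hour."""
        return sum(self.loads_kw) * self.factor

    def request_power(self, kw):
        """Switch a load on only if the shared sink still covers it.
        In the real project, persistent over-consumption would start
        killing plants, shrinking sink_capacity()."""
        if self.emissions() + kw * self.factor <= self.sink_capacity():
            self.loads_kw.append(kw)
            return True
        return False  # the "fuse" trips: cooperation failed

net = NaturalFuseNetwork()
for _ in range(3):
    net.add_plant(0.2)             # three plants, 0.6 kg/h shared sink
print(net.request_power(1.0))      # 0.43 kg/h fits under 0.6: True
print(net.request_power(1.0))      # 0.86 kg/h would exceed 0.6: False
```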
<h3>The Street As Platform</h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/streetasaplatform1.jpg"><img class="alignnone size-full wp-image-2780" title="streetasaplatform1" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/streetasaplatform1.jpg" alt="streetasaplatform1" width="450" height="301" /></a></p>
<p>Image courtesy of <a id="k0g3" title="Timo Arnall" href="http://www.elasticspace.com/" target="_blank">Timo Arnall</a>, an awesome photographer and mover and shaker in ubicomp. <em>&#8220;The way the street feels may soon be defined by what cannot be seen with the naked eye,&#8221;</em> writes Dan Hill in his post <a href="http://www.cityofsound.com/blog/2008/02/the-street-as-p.html" target="_blank">&#8220;The Street as Platform.&#8221;</a> Usman comments on Dan Hill&#8217;s other &#8220;must read&#8221; post:</p>
<p><em><a id="doow" title="&quot;the personal well-tempered environment,&quot;" href="http://www.cityofsound.com/blog/2008/01/the-personal-we.html" target="_blank">&#8220;The Personal Well-Tempered Environment&#8221;</a> is full of &#8220;fascinating propositions&#8230; &#8230;they&#8217;re relevant to things I&#8217;m interested in&#8230;</em></p>
<p>In a summary of his ideas on the personal well-tempered environment, Dan Hill writes:</p>
<p><em>A real-time dashboard for buildings, neighbourhoods, and the city, focused on conveying the energy flow in and out of spaces, centred around the behaviour of individuals and groups within buildings.</em></p>
<p><em>A form of &#8216;BIM 2.0&#8242; that gives users of buildings both the real-time and longitudinal information they need to change their behaviour and thus use buildings, and energy, more effectively. An ongoing post-occupancy evaluation for the building, the neighbourhood and the city.</em></p>
<p><em>A software service layer for connecting things together within and across buildings.</em></p>
<p><em>As information increasingly becomes thought of as a material within buildings, it makes sense to consider it holistically as part of the built fabric, like glass, steel, ETFE etc.</em></p>
<h3>Interview With Usman Haque</h3>
<p><strong>Tish Shute:</strong> You have been involved in many awesome projects but Pachube seems to be quite a new direction. What are the key influences in your career and the development of your thinking? And, could you tell me more about how your previous work brought you to creating Pachube? Is Pachube a central focus for you and Haque Design + Research now?</p>
<p><strong>Usman Haque:</strong><em> To me Pachube is the logical culmination of everything I&#8217;ve worked on for the last 12 years since finishing my post-grad architecture studies.</em></p>
<p><em>A lot of my work until now has centered around large-scale mass-collaboration interactive &#8220;spectacles&#8221; involving many thousands of members of the public at once. I found this a good medium in which (a) to explore strategies for collaboration that take account of the granularity of participation (i.e. the fact that different people have different interests, skills and intentions in any participative act); and (b) to work at an urban scale; i.e. in a way that has an effect at the scale of buildings, parks, and streetscapes etc.</em></p>
<p><em> <a id="kr8h" title="Open Burble" href="http://www.haque.co.uk/openburble.php" target="_blank">Open Burble</a> was a good example of this approach: essentially a framework, composed of 2m carbon-fibre modules, it had electronics embedded in 1000 helium balloons. Members of the public could configure and assemble these, inflate them and then unfurl the complex structure up to the scale of a 15-storey building. Finally, by shaking, rowing, twisting and bending a handlebar embedded with sensors (the same as in the Wii controller, as it happens), dozens of people at once could have an effect on the Burble&#8217;s position and the colours streaming through it.</em></p>
<p><em><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/openburble2.jpg"><img class="alignnone size-full wp-image-2832" title="openburble2" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/openburble2.jpg" alt="openburble2" width="509" height="338" /></a><br />
</em></p>
<p><a href="http://www.haque.co.uk/openburble.php" target="_blank">Open Burble, Singapore Biennale 2006</a></p></div>
<p><em>Along the way I became interested at times in what an &#8220;operating system&#8221; might mean in the context of architecture (paper: <a id="cxpf" title="Hardspace, Softspace and the possibilities of open source architecture, 2002" href="http://www.haque.co.uk/papers/hardsp-softsp-open-so-arch.PDF" target="_blank">Hardspace, Softspace and the possibilities of open source architecture, 2002 (PDF)</a>), particularly an &#8220;open source&#8221; operating system (Urban Versioning System, <a id="yvjc" title="http://uvs.propositions.org.uk/" href="http://uvs.propositions.org.uk/" target="_blank">http://uvs.propositions.org.uk/</a>). I was also interested in developing tools for supposedly &#8220;non-technical&#8221; people to start building their own interactive systems or environments, hence the release of <a id="zv:-" title="The &quot;Low Tech Sensors &amp; Actuators for Artists and Architects&quot;" href="http://lowtech.propositions.org.uk/" target="_blank">The &#8220;Low Tech Sensors &amp; Actuators for Artists and Architects&#8221;</a> pamphlet, co-authored with an old friend, <a id="w-ad" title="Adam Somlai-Fischer" href="http://www.aether.hu/" target="_blank">Adam Somlai-Fischer</a>, back in 2005.</em></p>
<p><em>An off-shoot of this has been an obsession with <a id="ahue" title="trying to rescue the concept of &quot;interaction&quot;" href="http://mags.acm.org/interactions/20090102/?pg=71" target="_blank">trying to rescue the concept of &#8220;interaction&#8221;</a> from oblivion &#8211; I say oblivion because I think the really exciting possibilities of the concept of interaction are being lost because we&#8217;re being sold a billion so-called &#8220;interactive&#8221; devices and gadgets that are, in fact, merely &#8220;reactive&#8221;. In this, <a id="t5h7" title="I turn often to the work of cybernetician Gordon Pask" href="http://www.haque.co.uk/papers/architectural_relevance_of_gordon_pask.pdf" target="_blank">I turn often to the work of cybernetician Gordon Pask</a>, particularly active in the 50s, 60s and 70s in the development of truly interactive systems. (And also a collaborator with <a id="gt4p" title="Cedric Price" href="http://en.wikipedia.org/wiki/Cedric_Price" target="_blank">Cedric Price</a>, one of my favourite architects.)</em></p>
<p><em>Which brings me to Pachube, which is now occupying pretty much all my time and will do for the foreseeable future. (<a id="qdfj" title="Haque Design + Research" href="http://www.haque.co.uk/" target="_blank">Haque Design + Research</a> still continues, and has a separate team &#8212; it will be involved mostly in the kinds of things it has in the past, but also in pushing development of things that *use* Pachube, such as the project <a id="h:9w" title="Natural Fuse" href="http://www.haque.co.uk/naturalfuse.php" target="_blank">Natural Fuse</a>.)</em></p>
<p><em>Pachube came about as a direct attempt to enable the production of dynamic, responsive, conversant &#8216;environments&#8217;. It basically evolved out of three strands of thought.</em></p>
<p><em>The first was the notion of the <strong>geographical non-specificity of architecture</strong> these days. By this I mean that, for many of us now, &#8220;home&#8221; is an idea constructed from several places &#8211; we live and work in environments composited by networked technology from fragments that bridge huge geographical distances. These environments are resolutely &#8220;human&#8221; (in the sense of being inhabited, designed and determined by people) yet context-free (because they do not privilege geographical location). I wanted to find a way to &#8220;connect&#8221; up remote spaces, much like <a id="ubie" title="Remote Home" href="http://www.tobi.net/remotehome/remotehome.htm" target="_blank">Remote Home</a> and a whole range of other projects had done, but in a generalized way so that it would be possible to keep adding to the ecosystem of connected environments on an ad hoc basis; a global architecture if you will.</em></p>
<p><em>The second strand of thought came from the <strong>desire to open up the production process of &#8220;smart homes.&#8221;</strong> I&#8217;m concerned by developments in ubiquitous computing whereby &#8220;making technology invisible&#8221; equates to placing the design and construction process solely in the hands of knowledgeable others. Whereas it&#8217;s still possible more or less to do DIY on your home, if many ubicomp technologists had their way it would become less and less possible, simply because of the complexity of reverse-engineering such closed systems. It&#8217;s already a problem with larger buildings: service companies go out of business, proprietary skills or tools disappear, and complex lighting and sensor systems remain unused. So, with Pachube I wanted to help foster a more open way of developing the discipline: to embrace the concept of the maker, and to help people negotiate their technological future.</em></p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/reconfigurablehouse.jpg"><img class="alignnone size-full wp-image-2781" title="reconfigurablehouse" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/reconfigurablehouse.jpg" alt="reconfigurablehouse" width="419" height="107" /></a></p>
<p><em><a id="ex31" title="Reconfigurable House" href="http://haque.co.uk/reconfigurablehouse.php" target="_blank">Reconfigurable House</a>, an environment constructed from thousands of low tech components that can be &#8220;reconfigured&#8221; by its occupants.</em></p>
<p><em>The final strand of thought relates to Pachube&#8217;s emphasis on <strong>&#8220;environments&#8221; rather than &#8220;sensors.&#8221; </strong>I believe that one of the major failings of the usual ubicomp approach is to consider the connectivity and technology at the object-level, rather than at the environment-level. It&#8217;s built into much of contemporary Western culture to be object-centric, but at the level of &#8220;environment&#8221; we talk more about context, about disposition and subjective experience. An &#8216;environment&#8217; has dynamic frames of reference, all of which are excluded when simply focusing on devices, objects or mere sensors. If one really studies deeply what an &#8216;environment&#8217; is (by this I mean more than simply saying that &#8220;it&#8217;s what things exist in&#8221;), one begins to understand that an environment is a construction process and not a medium; nor is it a state or an entity. In this I would refer to Gordon Pask&#8217;s phenomenally important text &#8220;Aspects of Machine Intelligence&#8221; in Nicholas Negroponte&#8217;s <a id="hlcg" title="Soft Architecture Machine" href="http://www.amazon.com/Soft-Architecture-Machines-Nicholas-Negroponte/dp/0262140187" target="_blank">Soft Architecture Machine</a>, though it makes for extremely tough reading (Negroponte compared it in importance to Alan Turing&#8217;s contributions to the computer science discipline).</em></p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/pachube1.jpg"><img class="alignnone size-full wp-image-2782" title="pachube1" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/pachube1.jpg" alt="pachube1" width="411" height="275" /></a></p>
<p><em>Ultimately, though, Pachube is here to make it easier to participate in what I expect to be a vast &#8216;eco-system&#8217; of conversant devices, buildings &amp; virtual environments. Pachube will facilitate the development of a huge range of new products and services that will arise from extreme connectivity. It&#8217;s relatively easy for large technology companies like <a id="ps11" title="Nike and Apple" href="http://www.apple.com/ipod/nike/" target="_blank">Nike and Apple</a> to transition into the Internet of Things, but Pachube will be particularly helpful for that huge portion of smaller scale industry players that *want* to become part of it, but which are only now waking up to the potentials of the internet &#8212; small and medium scale designers, manufacturers and developers who are very good at developing their products but don&#8217;t have the resources to develop in-house a massive infrastructure for their newly web-enabled offerings. Basically, having built a generalized data-brokering backend to connect physical (and virtual) entities to the web, others can now start to build the applications that make the connections really useful.</em></p>
<p><strong>Tish Shute:</strong> You mentioned that both Bruce Sterling and Gavin Starks (AMEE) have given input on Pachube. Can you describe any specific ways they (and others?) have influenced the evolution of Pachube? You mentioned the concept of &#8220;engaged responsible spime wrangling&#8221; when we talked on Skype?</p>
<p><strong>Usman Haque:</strong> <em>Yes, I am very grateful to a whole bunch of people who have helped, prodded, nudged and advised along the way!</em></p>
<p><em>I asked Bruce to be a &#8220;visionary&#8221; adviser because he was one of the people early on to envisage the concepts and ramifications of <a id="v5w3" title="&quot;spimes&quot; (his neologism for 'space-time objects')" href="http://www.boingboing.net/images/blobjects.htm" target="_blank">&#8220;spimes&#8221; (his neologism for &#8216;space-time objects&#8217;)</a>. While I agree that &#8220;spimes&#8221; are directly relevant, what I found most important from his conception was the concept of &#8220;wrangling&#8221; &#8211; being actively and productively engaged and responsible in the development of spimed environments. I think it was a crucial leap: to talk about &#8220;wranglers&#8221; rather than &#8220;end-users&#8221;. So the kinds of questions I&#8217;ve turned to him for regard how to nudge people away from being &#8220;end users&#8221; and towards being &#8220;wranglers&#8221;; and about how to transition from being a &#8220;hacker toy&#8221; to &#8220;major infrastructure&#8221;. He had some great (and invaluable) responses, of which one of the most important to me was something he said in email: &#8220;&#8230;I think total openness is fatal. It&#8217;s like lying in a blazing sun under a sky full of vultures, naked. It&#8217;s also rather rude, like babbling anything or anything that flies into your head and still expecting people to pay attention.&#8221;</em></p>
<p><em><a id="qrs7" title="Gavin Starks" href="http://www.amee.cc/" target="_blank">Gavin Starks</a> and also <a id="bbd." title="Dopplr's" href="http://www.dopplr.com/" target="_blank">Dopplr&#8217;s</a> <a id="aqy:" title="Matt Biddulph" href="http://www.hackdiary.com/" target="_blank">Matt Biddulph</a> have been sort of &#8220;friendly neighbours&#8221; to Pachube: they&#8217;ve made some great introductions and I turn to them often for advice on being a London start-up. What&#8217;s been really useful for me is that they are active in a related area and have directly useful advice: Gavin, of course, since he&#8217;s involved in <a id="lzoi" title="metering the world's energy" href="http://www.amee.cc/" target="_blank">metering the world&#8217;s energy</a>; and Matt perhaps less tangibly in his day job as Dopplr&#8217;s CTO but more so in his <a id="jav_" title="active Arduino-enabled social life" href="http://tinker.it/now/2009/01/20/toy-hacking-workshop-09/" target="_blank">active Arduino-enabled social life</a>!</em></p>
<p><em>One very important Pachube advisor has been <a id="qjz0" title="Dr. Paul Pangaro" href="http://www.pangaro.com/" target="_blank">Dr. Paul Pangaro</a>, who has previously been CTO at a number of technology startups, and brings vital experience from his time at Sun Microsystems as Senior Director and Distinguished Market Strategist. (Oh, and he&#8217;s also a former student and collaborator of Gordon Pask&#8217;s!) He has been very helpful in developing a viable business model in conjunction with my brother Yusuf Haque, who, with his experience in raising capital for startups, has led the fundraising process.</em></p>
<p><em>Of course, direct daily input from the Pachube team has been vital to the development of the project, and without <a id="nyoj" title="Chris Leung" href="http://www.chrisleung.org/" target="_blank">Chris Leung</a> (EEML development) and <a id="xr8l" title="Sam Mulube" href="http://twitter.com/smazero" target="_blank">Sam Mulube</a> (backend development) it would be a very different thing indeed!</em></p>
<p><strong>Tish Shute:</strong> Now the emerging internet is the world as a networked, enhanced virtual/reality environment &#8211; sorry about the inadequate terminology, but as you said, &#8220;the distinction between real and virtual is becoming as quaint as the distinction between mind and body&#8221;. You are participating in the <a id="k7s8" title="Sentient City" href="http://www.situatedtechnologies.net/?q=node/89" target="_blank"><strong>Sentient City</strong> exhibition</a> organized by the <a href="http://www.archleague.org/" target="_blank">Architectural League of New York</a> for September 2009.</p>
<p>Could you explain more about the Sentient City project and your contribution, Natural Fuse, which uses common house plants, energy-monitoring sensors, and Pachube to create &#8220;a city-wide network of electronically-assisted plants that act as carbon-cycle circuit-breakers in much the same way as conventional electrical circuit-breakers do&#8230;&#8221;?</p>
<p><strong>Usman Haque: </strong><em>Situated Technologies, founded to explore the impact of &#8220;situated&#8221; technologies (i.e. locative media, etc.) in urban spaces, kicked off with a <a id="b77z" title="symposium organised by Mark Shepard, Omar Khan and Trebor Scholz" href="http://www.situatedtechnologies.net/?q=node/1" target="_blank">symposium organised by Mark Shepard, Omar Khan and Trebor Scholz</a> and supported by the <a id="o7a4" title="Architecture League of New York" href="http://www.archleague.org/" target="_blank">Architecture League of New York</a> a couple of years ago, and continued through <a id="o5o6" title="a series of pamphlets" href="http://www.situatedtechnologies.net/?q=node/75" target="_blank">a series of pamphlets</a> (the first by Adam Greenfield &amp; Mark Shepard; the second by me and Matthew Fuller; the third and fourth by Benjamin Bratton &amp; Natalie Jeremijenko and Laura Forlano &amp; Dharma Dailey). This is now culminating in an exhibition, &#8220;Toward the Sentient City&#8221;, opening in September 2009, as a public manifestation of many of the concepts raised over the years.</em></p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/plantcircuit1.jpg"><img class="alignnone size-full wp-image-2783" title="plantcircuit1" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/plantcircuit1.jpg" alt="plantcircuit1" width="400" height="289" /></a></p>
<p><em><a id="k48e" title="Natural Fuse" href="http://www.haque.co.uk/naturalfuse.php" target="_blank">Natural Fuse</a>, a project funded by the Architecture League to be part of that exhibition, is really a Haque Design + Research project rather than a Pachube project alone. It came about for two reasons. The first was because we had been investigating for several months many different ways to use plants and vegetation in interactive architectural design: as living walls, as responsive systems, as visual and olfactory indicators, as passive ventilation &#8212; fantastic research undertaken predominantly by my invaluable production assistant Barbara Jasinowicz. We were particularly interested in energy creation and monitoring and had made a number of (unsuccessful) proposals to develop building systems based on plant interaction. The second was because I wanted to have a good demonstration project for Pachube: a system that was not just end-to-end single-point communication, but one in which the system increased its efficiency over time through more and more geographically-dispersed connections. So Natural Fuse developed through a series of conversations with a very intelligent and witty designer, <a id="ed_l" title="Nitipak (Dot) Samsen" href="http://www.dotmancando.info/" target="_blank">Nitipak (Dot) Samsen</a>, who was then an intern and who will now lead design work along with <a id="w9.y" title="Cesar Harada" href="http://www.cesarharada.com/" target="_blank">Cesar Harada</a> (similarly intelligent and witty!).</em></p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/plantfusecare1.jpg"><img class="alignnone size-full wp-image-2784" title="plantfusecare1" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/plantfusecare1.jpg" alt="plantfusecare1" width="400" height="322" /></a></p>
<p><em>Briefly, the point of Natural Fuse is to use networked plants, based on the Arduino ethernet platform, harnessing their carbon-sinking capabilities to create a city-wide network of electronically-assisted plants that act both as energy providers and as shared &#8220;carbon sink&#8221; circuit breakers. By sharing resources and information between the plants, energy expenditure can be collectively monitored and managed. The purpose is to create a collective &#8220;carbon sink&#8221; that offsets the amount of energy consumed by the plant owners &#8211; a natural &#8220;circuit breaker&#8221;. If people cooperate on their energy expenditure then the plants thrive (and they can all use more energy); but if they don&#8217;t then the network starts to kill plants, thus diminishing the network&#8217;s energy capacity. Of course, the network functionality is enabled by Pachube. The plan is to distribute these to some households in New York and offer plans and downloads for people to build their own as well.</em></p>
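The collective circuit-breaker behaviour Haque describes reduces to a simple budget check: total load versus total offset capacity. A minimal sketch in Python — the data shape, the 0.5 kg/kWh grid factor and the per-plant offset figures are illustrative assumptions of mine, not values from the Natural Fuse project:

```python
def network_can_supply(plants, requested_watts):
    """Decide whether the shared 'carbon sink' can cover a new load.

    Each living plant offsets some carbon per hour; the total offset
    capacity is compared against the CO2 cost of the requested
    electrical load. (All figures are illustrative.)
    """
    CO2_PER_KWH = 0.5  # kg CO2 per kWh -- assumed grid conversion factor
    total_sink = sum(p["offset_kg_per_hour"] for p in plants if p["alive"])
    load_kg_per_hour = (requested_watts / 1000.0) * CO2_PER_KWH
    return load_kg_per_hour <= total_sink

plants = [
    {"alive": True, "offset_kg_per_hour": 0.02},
    {"alive": True, "offset_kg_per_hour": 0.02},
]

# A 60 W lamp emits 0.03 kg CO2/h at 0.5 kg/kWh: within the 0.04 kg sink.
print(network_can_supply(plants, 60))   # True
# A 100 W load would need 0.05 kg/h of sink: the network refuses (or,
# in Natural Fuse terms, cooperation breaks down and a plant suffers).
print(network_can_supply(plants, 100))  # False
```

The interesting part of the real project is what happens on refusal: rather than simply denying power, the network penalises the plants themselves, coupling energy use to a living consequence.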
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/plantfusesystem1.jpg"><img class="alignnone size-full wp-image-2785" title="plantfusesystem1" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/plantfusesystem1.jpg" alt="plantfusesystem1" width="432" height="214" /></a></p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/plantfuseunit1.jpg"><img class="alignnone size-full wp-image-2786" title="plantfuseunit1" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/plantfuseunit1.jpg" alt="plantfuseunit1" width="443" height="197" /></a></p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/naturalfusenetwork2.jpg"><img class="alignnone size-full wp-image-2787" title="naturalfusenetwork2" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/naturalfusenetwork2.jpg" alt="naturalfusenetwork2" width="462" height="348" /></a></p>
<p><strong>Tish Shute:</strong> You describe Pachube as linking environments, not just sensor to sensor (as sensorbase.org does) &#8211; an environment for Pachube could be a web page. An essential concept in Pachube is that anything could be an environment, and such environments are treated equivalently with EEML. You describe EEML as a protocol that sits comfortably with existing building protocols: &#8220;what it brings to the picture is the ability to describe buildings that change.&#8221;</p>
<p>How will EEML change our understanding of architecture and enable the view of architecture that &#8220;includes smells, sounds, light, electromagnetic fields &#8211; buildings as dynamic and changing?&#8221; (Prasad Passive House?)</p>
<p>You describe EEML as designed to straddle and work alongside the IFC construction industry format. Who is involved in the creation of EEML? Could you explain a little bit how it is different from SensorML? You mentioned that little has been done regarding post-construction evaluation of buildings. How will EEML enable buildings to share strategies (for example, on energy consumption), as you put it?</p>
<p><strong>Usman Haque:</strong> <em>The <a id="gv6y" style="color: #551a8b;" title="Extended Environments Markup Language (EEML)" href="http://www.eeml.org/" target="_blank">Extended Environments Markup Language (EEML)</a> (which is the protocol around which much of Pachube is based) is being developed to make the idea of &#8220;dynamic, responsive and conversant environments&#8221; a reality. It works with existing construction standards like <a id="l7sl" style="color: #551a8b;" title="Industry Foundation Classes (IFC)" href="http://en.wikipedia.org/wiki/Industry_Foundation_Classes" target="_blank">Industry Foundation Classes (IFCs)</a>, but exists to extend them to account for dynamic, responsive and, dare I say it, conversant buildings. In the perhaps prosaic world of construction, this helps to facilitate a number of architectural requirements such as <a id="i2_j" style="color: #551a8b;" title="post-occupancy evaluation" href="http://www.google.com/search?hl=en&amp;client=safari&amp;rls=en&amp;defl=en&amp;q=define:post+occupancy+evaluation&amp;sa=X&amp;oi=glossary_definition&amp;ct=title" target="_blank">post-occupancy evaluation</a>, realtime site-based environmental feedback at the design phase and simulations that synchronise with real-world installation. With <a id="hxs4" style="color: #551a8b;" title="EEML" href="http://www.eeml.org/" target="_blank">EEML</a> and Pachube you&#8217;ll be able to start working with, say, an Autocad model at the design phase, and include *real time* environmental data from the site, as well as to model expected sensor and assumed energy consumption data of the design; use the same model during the construction phase (because it will translate fine to standard modelling descriptions), and keep working with the same set of information even after the building is occupied and running &#8212; making it a whole lot easier to learn from the design and maintenance processes than it is currently.</em></p>
<p><em>At the same time this does not exclude the possibility of talking about &#8220;sensors&#8221; (as <a id="swia" title="SensorML" href="http://en.wikipedia.org/wiki/SensorML" target="_blank">SensorML</a> wants to), but we are more easily able to consider, say, the dozens of different ways that different clients will want to address, access or search for those sensors; the changing contextual motivations for actually processing sensor information; and the capacity for flexible sensor ontologies &#8212; where you don&#8217;t need to know from the beginning everything you&#8217;ll be looking for once you&#8217;ve recorded mountains of data.</em></p>
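To make the environment-centric framing concrete, here is a minimal EEML document of the kind Pachube exchanges, sketched from memory of the published schema. The element names, attribute values and namespace version should be checked against eeml.org; the feed content itself is invented for illustration:

```xml
<eeml xmlns="http://www.eeml.org/xsd/005" version="5">
  <environment>
    <title>Studio monitor</title>
    <!-- the environment, not the sensor, is the unit of description -->
    <location domain="physical" exposure="indoor" disposition="fixed">
      <lat>51.52</lat>
      <lon>-0.08</lon>
    </location>
    <!-- each datastream within the environment -->
    <data id="0">
      <tag>temperature</tag>
      <value minValue="0.0" maxValue="50.0">23.5</value>
      <unit symbol="C" type="derivedSI">Celsius</unit>
    </data>
    <data id="1">
      <tag>watts</tag>
      <value>412.0</value>
    </data>
  </environment>
</eeml>
```

Note how the sensors appear only as datastreams inside an `environment` element, which also carries context (title, location, exposure) — the point Haque makes about describing environments rather than bare sensors.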
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/environmentsconnected.jpg"><img class="alignnone size-full wp-image-2792" title="environmentsconnected" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/environmentsconnected.jpg" alt="environmentsconnected" width="454" height="151" /></a></p>
<p><em>As a result of this conception of &#8220;environment&#8221; we remove the need for a distinction between &#8220;real&#8221; and &#8220;virtual&#8221;. We can consider, equally as &#8216;environments&#8217;: a mountainside, the interior of a building, the context of a webpage, the internal status and external context of a mobile device, the interactions within something like Second Life &#8212; all these are environments and can communicate with each other on equivalent terms. More importantly, a single &#8220;environment&#8221; can be expressed as a snapshot in time; or it can be expressed as a sequence of many snapshots over several years.</em></p>
<p><em>One very important thing we&#8217;re looking at now is how to transition the protocol from something that is status-based, to something that can express transactions, goals and processes. We&#8217;ve just started looking at how <a id="e7.0" title="RDF" href="http://en.wikipedia.org/wiki/Resource_Description_Framework" target="_blank">RDF</a> and <a id="khn." title="machine tags" href="http://en.wikipedia.org/wiki/Machine_tag" target="_blank">machine tags</a> might help in this, largely spurred on by perceptive comments from one of my favourite designers, <a id="mit9" title="Toxi, a.k.a. Karsten Schmidt" href="http://postspectacular.com/" target="_blank">Toxi, a.k.a. Karsten Schmidt</a>.</em></p>
<p><strong>Tish Shute:</strong> You mentioned that you see &#8220;smart&#8221; buildings and &#8220;smart&#8221; cities as environments, not just a collection of devices? On the Pachube web page there is a chart describing potential interactions between entities (one to one, one to many, etc.), but you do not give many pointers to how two unrelated objects that are connected would derive any value out of the connection&#8230; could you give me some examples of the kinds of use cases (Natural Fuse is one of course!) and interesting new opportunities to create shared value that Pachube will enable?</p>
<p><strong>Usman Haque:</strong> <em>Yes, I recognize that the Pachube website information leaves a lot to be desired&#8230;! Apart from a whole lot of conceptual information that&#8217;s missing, there are a number of undocumented API features that nobody has yet uncovered!</em></p>
<p><em>Well, in answer to your question: much of it is intuition &#8211; I don&#8217;t know exactly _how_ it will be valuable but I do expect the community to find ways to make such seemingly disparate interoperability valuable.</em></p>
<p><em>To take a prosaic example: say (once privacy options are introduced) that a manufacturer creates a <a id="s53b" title="Pachube input application" href="http://community.pachube.com/?q=node/100" target="_blank">Pachube input application</a>, like an electricity meter that automatically charts on Pachube. There is a certain benefit to its customers in being able to monitor their usage over time and to compare their usage to the aggregation of others in a similar class, but anonymised. Say that someone else has produced a Pachube output application like a <a id="fhjs" title="mobile phone Pachube viewer" href="http://www.rcreations.com/freeandroidgphoneg1applications" target="_blank">mobile phone Pachube viewer</a>. Now the electricity meter users can use this new output application as an extension to be able to monitor their consumption on a mobile phone. Now, imagine if someone else develops a new product, a <a id="j.l-" title="networked lamp" href="http://www.goodnightlamp.com/" target="_blank">networked lamp</a> &#8212; it would now be very easy for that designer to write a little app to make the networked lamp switch on (or change brightness) according to the electricity consumption, even remotely. The point is that the more input and output apps are added, the more valuable they each become.</em></p>
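The lamp scenario takes only a few lines of glue code. The sketch below is a hypothetical client, not a documented Pachube library: the CSV endpoint URL, the `X-PachubeApiKey` header, the feed id and the brightness mapping are all my assumptions for illustration:

```python
import urllib.request

FEED_URL = "http://www.pachube.com/api/feeds/1284.csv"  # example feed id

def fetch_watts(api_key):
    """Fetch the latest value from a single-datastream CSV feed.

    (Endpoint shape and header name are assumptions; check the
    Pachube API docs before relying on them.)
    """
    req = urllib.request.Request(FEED_URL,
                                 headers={"X-PachubeApiKey": api_key})
    with urllib.request.urlopen(req) as resp:
        return float(resp.read().decode().strip().split(",")[-1])

def lamp_brightness(watts, budget_watts=500.0):
    """Map household consumption to a lamp level (0..255):
    the lamp dims as consumption approaches the budget."""
    remaining = max(0.0, 1.0 - watts / budget_watts)
    return int(round(255 * remaining))

print(lamp_brightness(0))    # 255: nothing drawn, full brightness
print(lamp_brightness(250))  # 128: halfway to budget
print(lamp_brightness(600))  # 0: over budget, lamp goes dark
```

The meter manufacturer, the phone-viewer author and the lamp designer never coordinate: each just reads or writes a feed, which is exactly the network effect Haque describes.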
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/scatteredhouse.jpg"><img class="alignnone size-full wp-image-2791" title="scatteredhouse" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/scatteredhouse.jpg" alt="scatteredhouse" width="443" height="109" /></a></p>
<p><em><a id="tzsq" title="Scattered House" href="http://www.haque.co.uk/scatteredhouse.php" target="_blank">Scattered House</a>, like Reconfigurable House, but spread throughout various cities in the world to demonstrate the implications of designing environments and buildings in the context of family diasporas and ubiquitous ad hoc networked connectivity.</em></p>
<p><em>Part of Pachube&#8217;s emphasis, in not making specific connections more important than others, is that the community can develop new types of connection. So, of course it makes it relatively simple to create remote control connections between seemingly unrelated entities (like mobile phones and houses, or web pages and furniture); it makes it relatively simple to connect up environmental conditions from the physical world to seemingly distant Second Life (or, more interestingly to me, <a id="iqkx" title="OpenSim" href="http://opensimulator.org/wiki/Main_Page" target="_blank">OpenSim</a>), which can make it a more viable interactive environment; and it makes data aggregation and comparison possible between wide ranges of energy consumers to facilitate aggregation analysis. But the point, really, is to make it easy for people and companies to build in this kind of connectivity and invent new uses.</em></p>
<p><em>Through my close association with <a id="sin8" title="The Bartlett, University College London's architecture school" href="http://www.bartlett.ucl.ac.uk/" target="_blank">The Bartlett, University College London&#8217;s architecture school</a>, I hope to develop some particularly relevant use-case scenarios for the architectural industry. I think we&#8217;ve really not even begun to imagine the kinds of applications that will be important, though I guess Natural Fuse exemplifies the kind of approach I would like to see in Pachube-enabled applications: one in which the collective/hive experience contributes towards some end goal, to make it possible to create a &#8220;wikipedia of environments&#8221; as opposed to a web-based Wikipedia &#8211; it&#8217;s not that I necessarily want to create these things myself, but rather I want to make it possible to create such things.</em></p>
<p><strong>Tish Shute:</strong> You mentioned that you hope Pachube will be the place to connect smart products &#8211; product to product communication? Also you mentioned that you would like to have a way for smart products to self-register with Pachube. While all feeds are public now, you are going to create groups with different levels of privacy. Both of the aforementioned features would enable more business applications for Pachube. But could you describe the business model for Pachube?</p>
<p><strong>Usman Haque:</strong> Essentially, there are three facets to the business model. The first takes a cue from <a id="irzp" title="Flickr" href="http://www.flickr.com/upgrade/" target="_blank">Flickr</a> in recognising that there are those who would like a more sophisticated set of services as &#8220;professional&#8221; accounts. The second is to be able to provide a set of tools and applications for medium scale manufacturers and developers who want to web-enable their offerings, who will be able to take advantage of the growing repository of Pachube.Apps and add-ons, and who want the convenience, security and economy that Pachube will be able to offer. The third approach is to become more directly involved in large-scale urban infrastructure projects. There is a fourth facet, but we consider it the killer so I&#8217;m keeping quiet for the moment&#8230;.</p>
<p>So yes, in order to make all these things more useful we&#8217;ll soon be introducing a range of privacy options on feeds, the ability to create &#8220;aggregates&#8221; from collections of feeds, and the possibility of groups, organised around feeds. Another thing we&#8217;re hoping to introduce soon is open environment-level tagging, so that anyone will be able to tag environments, though there will be a way of evaluating the importance of any given tag.</p>
<p><strong>Tish Shute: </strong>I know you mentioned that you are trying to find tools that allow people to contribute to their environment. There are a number of projects aimed at providing tools that will help people and businesses to reduce their carbon footprint &#8211; <a id="a2qc" title="The Carbon Account," href="http://www.thecarbonaccount.com/" target="_blank">The Carbon Account</a>, AMEE, Wattzon, <a id="f8y3" title="Onzo" href="http://www.onzo.co.uk/" target="_blank">Onzo</a>. Is Pachube working with any of these projects, and how?</p>
<p>What are the most interesting ideas in this area of changing our relationship to energy consumption emerging from Pachube?</p>
<p><strong>Usman Haque: </strong><em>The carbon footprint calculating industry is getting quite crowded&#8230;! So far I&#8217;ve particularly appreciated AMEE&#8217;s API (which is also used by the Carbon Account, I believe). So we have just released a Pachube.App &#8216;plugout&#8217; which will take a feed from an electricity meter tagged &#8220;watts&#8221; or &#8220;kilowatts&#8221; and convert it into a realtime carbon footprint calculation (driven by AMEE&#8217;s international and region- and supplier-specific carbon conversion factors). So it should be really easy to discover how many kilograms of CO2 you generated in the last 15 minutes&#8230; the last hour&#8230; the last 24 hours. Here&#8217;s a list of some of the feeds that are already making use of this: http://www.pachube.com/tag/co2_last_15_mins</em></p>
<p><strong>Tish Shute:</strong> I know the Arduino community has really taken an interest in Pachube. Who are the early adopters on Pachube? What are the most prevalent use cases you have seen so far?</p>
<p><strong>Usman Haque:</strong> <em>It has actually been more difficult than I thought it would be getting the Arduino community interested. This has partly been due to the difficulty of internet-enabling Arduino (until recently, adding ethernet access has been a bit of a tough chore). Now that it&#8217;s easier to connect up Arduinos, some of the early adopters have been interfacing Arduino to Current Cost meters (alleviating the need for a computer in between); and others have been doing things like tracking temperature, humidity and light level in their homes and offices. <a id="ohbg" title="Pachube user C4C" href="http://www.gomaya.com/glyph/" target="_blank">Pachube user C4C</a> has been pretty active from early on: http://www.pachube.com/feeds/1284</em></p>
<p><strong>Tish Shute:</strong> Pachube is input heavy at the moment &#8211; you mentioned not many actuators are plugged into Pachube yet. You said this is in part because you have focused on making the backend robust and stable before taking a lot of hits. What new directions for Pachube will emerge from enabling the dynamic relationship between sensors and actuators?</p>
<p><strong>Usman Haque:</strong> <em>This will be a crucial evolution in Pachube, when we make actuators more evident. It&#8217;s input heavy at the moment, basically in the sense of being easy to see the inputs &#8212; you add &#8220;inputs&#8221; rather than &#8220;outputs&#8221;, so at the moment we have no idea of what&#8217;s actually plugged into the outputs unless people tell us! However, we know that there are plenty of outputs because they&#8217;re making API requests; we just don&#8217;t know what they&#8217;re being used for! Once the concept of actuators and output environments gets built into the system, I think we&#8217;ll know a lot more about how people are using it.</em></p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/currentcost.jpg"><img class="alignnone size-full wp-image-2794" title="currentcost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/currentcost.jpg" alt="currentcost" width="444" height="150" /></a></p>
<p><em>To make this easier in the meantime, we recently announced the <a id="zp60" title="Pachube.apps" href="http://apps.pachube.com/" target="_blank">Pachube.apps</a> site, where people can start contributing Pachube &#8216;plugins&#8217; and &#8216;plugouts&#8217; &#8212; things that can be used by others, without needing to code or hack, to create, generate or modulate Pachube inputs and outputs. One of these was <a id="htj9" title="Status2Pachube" href="http://apps.pachube.com/online-status.html" target="_blank">Status2Pachube</a>, which turns the online status of AIM, MSN Messenger, Skype or Yahoo! Messenger users into a Pachube input feed (to make it easy to create &#8220;remote presence&#8221; orbs and such); another was the <a id="wjey" title="CurrentCost2Pachube" href="http://community.pachube.com/?q=node/100" target="_blank">CurrentCost2Pachube</a> app, which makes it easy to connect up Current Cost electricity meters as input feeds. All of these can then be used by Pachube output apps, like the <a id="xki1" title="G1 Android phone Pachube viewer" href="http://www.rcreations.com/freeandroidgphoneg1applications" target="_blank">G1 Android phone Pachube viewer</a> by Pachube user N4Spd, or the soon-to-launch <a id="pd2x" title="Pachube2SketchUp" href="http://apps.pachube.com/" target="_blank">Pachube2SketchUp</a> plugout, which will direct Pachube outputs into Google SketchUp (and by extension Google Earth) in order to generate or modulate 3-d models in response to realtime environmental/sensor data. (Pachube2SketchUp is pretty much finished for Mac OS X &#8212; but we&#8217;re having difficulty getting it to work on Windows, because of its sometimes pigheaded security measures&#8230; we&#8217;ll probably release it for Mac OS X alone soon anyway.)</em></p>
<p><strong>Tish Shute:</strong> Do you and Haque Design + Research expect to go beyond just providing a platform? Will you be producing more interesting applications like Natural Fuse on Pachube? If so, can you tell me more about what you have in mind?</p>
<p><strong>Usman Haque:</strong> <em>I keep a clear distinction between my work as creative director of Pachube.com and my work as director of Haque Design + Research. Basically, while Pachube.com continue development of the platform in general, I hope that Haque Design + Research will separately continue creating pioneering interactive experiences, some using Pachube and others not. We have some things in mind, such as the idea of creating an open source building management platform, but that&#8217;s all to come later&#8230;</em></p>
<p><strong>Tish Shute:</strong> One very interesting project you have been involved in is the creation of &#8220;Urban Versioning System 1.0,&#8221; which asks &#8220;What lessons can architecture learn from software development, and more specifically, from the Free, Libre, and Open Source Software (FLOSS) movement?&#8221; Can you tell me more about this project, its goals, and its progress? How does UVS 1.0 relate to Pachube?</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/urbanvs.jpg"><img class="alignnone size-full wp-image-2795" title="urbanvs" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/urbanvs.jpg" alt="urbanvs" width="277" height="386" /></a></p>
<p><strong>Usman Haque: </strong><em>The <a id="xujn" title="Urban Versioning System" href="http://uvs.propositions.org.uk/" target="_blank">Urban Versioning System</a> was essentially an attempt to understand what lessons the &#8220;open source&#8221; approach in software might provide to the collaborative development of environments and cities. It&#8217;s a sort of quasi-license &#8212; not yet quite ready to have the status of something like Creative Commons (which nicely suits media and software based creations, but doesn&#8217;t suit hardware and physical things quite so well, beyond their design files). It&#8217;s more of a challenge, a series of constraints that might be applied. It has a link to Pachube, in the sense of encouraging conception at the environmental and systemic level &#8212; you might call it the manifesto that connects Constant&#8217;s New Babylon hypothesis to the reality of Pachube!</em></p>
<p><strong>Tish Shute:</strong> I know that you imagine Pachube scaling up to millions (billions???) of users. But scaling the real-time web has proved a challenge (e.g. the frequent surfacings of the Twitter failwhale during big events). What are the key points of Pachube&#8217;s architecture and design that will enable successful scaling?</p>
<p>How do you see Pachube itself fitting into the FLOSS movement?</p>
<p><strong>Usman Haque: </strong><em>This is a really important question. There are a couple of things we are doing. The first is constantly to assume that we have 20 to 50 times more connections than we actually have&#8230; I put a lot of pressure on Sam about making sure of this, so he&#8217;s constantly developing, thinking about and testing little things for weeks in advance while at the same time fighting the usual daily little fires that arise <img src="http://www.ugotrade.com/wordpress/wp-includes/images/smilies/icon_smile.gif" alt=":)" class="wp-smiley" />  The second is that we&#8217;re trying to learn from strategies being developed by <a id="fq2y" title="Vlad Trifa" href="http://vladtrifa.com/" target="_blank">Vlad Trifa</a> and his group at the <a id="zjfb" title="Institute for Pervasive Computing at ETH Zurich" href="http://www.pc.inf.ethz.ch/" target="_blank">Institute for Pervasive Computing at ETH Zurich</a> in Switzerland, regarding the development of infrastructures for millions or more entities.</em></p>
<p><em>Regarding the connection to the FLOSS movement, there is no specific technical part of Pachube that is currently open source (apart from all the example apps and tutorials, of course). However, I find the approach taken by OpenSim and Hypergrid really fascinating: I haven&#8217;t given enough thought to how it might be implemented, but I find quite appealing the idea of a multitude of open source and geographically dispersed Pachube-enabled servers, with seamless transfer of data connections between them as necessary&#8230;</em></p>
<p><strong>Tish Shute: </strong>I know you have an <a id="ttbg" title="Android Viewer for Pachube" href="http://en.androidwiki.com/wiki/Pachube_Viewer" target="_blank">Android Viewer for Pachube</a>. Android is a landmark for extended/augmented reality, as <a id="x-.a" title="Wikitude" href="http://www.mobilizy.com/wikitude.php" target="_blank"><span style="color: #0000ff;"><strong>Wikitude</strong></span></a> proved, because with its compass mode Android brings together the essential ingredients for extended/augmented reality &#8211; knowing who YOU are, WHERE you are, WHAT you are doing, and WHAT is around you. It seems Pachube could be a powerful backend to a number of multi-user, mobile augmented/enhanced reality Android applications. Do you have any ideas/thoughts on this?</p>
<p><strong>Usman Haque:</strong> <em>That&#8217;s right &#8212; the Android viewer was created by rcreations.com, a Pachube user &#8212; this new platform brings amazing opportunities to mobile devices. I would be really interested to see what I would consider the obvious next step: an app that becomes both a Pachube input and an output feed, one that overlays existing Pachube data with new context-based, site-specific data.</em></p>
<p><em>If I were to make a parallel to a Japanese anime, I&#8217;m fascinated by <a id="ht3b" title="Dennou Coil" href="http://en.wikipedia.org/wiki/Dennou_Coil" target="_blank">Dennou Coil</a>, an anime set 20 years in the future where children take for granted the overlay of the digital world with the physical world. BUT, I&#8217;d say that Pachube somehow relates more closely to <a id="zg78" title="Furi Kuri" href="http://www.adultswim.com/shows/flcl/index.html" target="_blank">Furi Kuri</a>, in its <a id="gko_" title="pataphysical" href="http://en.wikipedia.org/wiki/%E2%80%99Pataphysics" target="_blank">pataphysical</a> stance and because one of the main characters has a portal to another galaxy in his head&#8230;</em></p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/furikuri.jpg"><img class="alignnone size-full wp-image-2793" title="furikuri" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/furikuri.jpg" alt="furikuri" width="420" height="320" /></a></p>
<p><strong>Tish Shute:</strong> Do you see Haque Design + Research picking up on the challenge of creating some cool next generation interfaces/GUIs for extended/enhanced/augmented (sorry, no perfect term) reality?</p>
<p><strong>Usman Haque:</strong> <em>Actually, no, I don&#8217;t see this as Haque Design + Research&#8217;s core focus going forward. We did some of this early on, getting involved in, for example, the development of a <a id="ty:5" title="3d smell interface" href="http://www.haque.co.uk/scentsofspace.php" target="_blank">3d smell interface</a>; and exploring the <a id="ykap" title="role of electromagnetic fields on perception of haunted spaces" href="http://www.haque.co.uk/haunt.php" target="_blank">role of electromagnetic fields on perception of haunted spaces</a>. But these days, in the context of HDR, I&#8217;m less interested in making seamless interfaces and more interested in exploring what authentic interaction actually is (whether technologically based or not). I think it&#8217;s challenge enough for me to make a light-switch engaging, dynamic and conversant before getting to the perceptual infrastructure that goes on top of it all! HDR will also spend more time exploring <a id="p2v5" title="passive systems, phase-change materials and plants" href="http://www.haque.co.uk/climateclock.php" target="_blank">passive systems, phase-change materials and plants</a> in the context of the built environment.</em></p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/scentsofspace.jpg"><img class="alignnone size-full wp-image-2796" title="scentsofspace" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/scentsofspace.jpg" alt="scentsofspace" width="550" height="197" /></a></p>
<p><strong>Tish Shute: </strong>I know there have been some interesting integrations with Pachube lately &#8211; <a href="http://www.ugotrade.com/2008/12/15/smart-planetinterview-with-andy-stanford-clark/" target="_blank">Andy Stanford-Clark mentioned using MQTT as the feed to get EML data into and out of Pachube</a> rather than HTTP. He said that&#8217;s interesting because MQTT is a much more lightweight protocol, designed for small sensors and low-bandwidth / expensive (e.g. cellular) networks&#8230; and it&#8217;s also true push, i.e. data is pushed to you directly from the broker (the hub in the middle), rather than you having to ask for it constantly (polling).</p>
<p>Have you opted for MQTT over HTTP polling?</p>
<p><strong>Usman Haque:</strong> <em>We haven&#8217;t yet implemented an MQTT bridge, in part because it has proved pretty difficult. HTTP is quite important for us right now because there&#8217;s a whole universe out there using it: from your average web browser, to mobile devices, to ethernet devices, and a whole range of languages and platforms &#8212; they all work pretty much out of the box with HTTP. However, what we are exploring instead is being able to interface with <a id="a4w." title="Oliver Goh" href="http://www.eolusone.com/cms/website.php" target="_blank">Oliver Goh</a>&#8217;s Shaspa project &#8212; they&#8217;re already in the middle of solving the MQTT-Pachube bridge problem, and that should hopefully provide Pachube access to and from MQTT devices.</em></p>
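The push-versus-poll difference discussed above can be sketched without any real MQTT stack. In this toy Python example a hypothetical in-memory Broker stands in for the hub: the HTTP-style client keeps asking even when nothing has changed, while the MQTT-style subscriber is handed each update exactly once.

```python
# Toy comparison of HTTP-style polling vs. MQTT-style push delivery.
# No real network or MQTT library; an in-memory Broker stands in for
# the hub so the difference in traffic is easy to see.

class Broker:
    def __init__(self):
        self.value = None
        self.subscribers = []
        self.requests_served = 0  # polls answered by the broker

    # -- MQTT-style: broker pushes each update to subscribers --
    def subscribe(self, callback):
        self.subscribers.append(callback)

    def publish(self, value):
        self.value = value
        for cb in self.subscribers:
            cb(value)

    # -- HTTP-style: client must ask, even when nothing changed --
    def poll(self):
        self.requests_served += 1
        return self.value

broker = Broker()
pushed = []
broker.subscribe(pushed.append)

# One sensor update amid many polling cycles:
for _ in range(10):          # polling client checks 10 times...
    broker.poll()
broker.publish(21.5)         # ...but only one update actually happened

print(broker.requests_served)  # 10 requests to learn of 1 change
print(pushed)                  # [21.5] -- one push, no wasted requests
```

On a cellular link every one of those wasted polls costs bandwidth and battery, which is exactly the overhead MQTT's broker-push model avoids.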
<p><strong>Tish Shute:</strong> Chris Dalby just released <a id="qcm6" title="Pachube Air" href="http://www.yellowpark.net/cdalby/index.php/2009/01/10/pachube-air-the-first-release/" target="_blank">Pachube Air.</a> Have you had a chance to play with that yet?</p>
<p><strong>Usman Haque:</strong> <em>I have indeed! It&#8217;s still early days yet, and I know he did it partly just to test the AIR development process rather than to solve a desperate Pachube need, but I&#8217;m looking forward to future iterations!</em></p>
<p><strong>Tish Shute:</strong> Peter Quirk felt that the Pachube web page positions Pachube as a social networking site focused on data exchange, inviting anyone with an interest in sharing environmental or other data to publish data or construct interesting uses for it.</p>
<p>What is your response to that?</p>
<p><strong>Usman Haque:</strong> <em>Hmm&#8230; I don&#8217;t really see Pachube as a social networking site. Yes, it perhaps enables the creation of social-networking objects and environments, but in itself, in terms of the networking of people, that has barely begun yet. Certainly Pachube exists quite comfortably in facilitating mashups, visualisations and other web 2.0 based social applications, but I don&#8217;t see that as a driving force. I think it would also be a mistake to conceive of Pachube solely as storage for machine communication that then gets experienced by people; rather, it can transition quite easily to being useful solely for machine-to-machine communication.</em></p>
<p><em>In fact, with recent API releases (which, as it happens, we haven&#8217;t announced as of this writing&#8230; <img src="http://www.ugotrade.com/wordpress/wp-includes/images/smilies/icon_smile.gif" alt=":)" class="wp-smiley" />), it&#8217;s now possible to use most of Pachube&#8217;s features without ever going to the website: i.e. your Arduino can create feeds, search feeds, edit feeds, delete feeds. Over time, as direct machine-to-machine communication becomes more prominent, it&#8217;s quite likely that the website itself will become less and less important, while the backend becomes the focus of everything.</em><br />
<strong><br />
Tish Shute:</strong> I am interested in some of the differences between<a href="http://sensorbase.org/" target="_blank"> SensorBase.org&#8217;s project</a> and Pachube. Is SensorBase more of a data repository (of environmental data in particular)?</p>
<p><strong>Usman Haque</strong>: <em>The difference I see between Pachube and SensorBase is that while (from what I know) SensorBase is mostly about &#8220;write&#8221; operations, with later &#8220;read&#8221; operations (i.e. it&#8217;s about being a data repository), Pachube is really &#8220;read-write&#8221; (i.e. it&#8217;s about being both a data repository _and_ a quasi-realtime proxy). Pachube will be able to handle potentially millions of connections, both incoming and outgoing, and as we&#8217;ll soon start storing every data point ever recorded, the data repository aspect will of course be crucial. However, the fact that it *also* facilitates one-to-many realtime broadcasts of that data (and facilitates conversion to a number of different formats: EEML, CSV and JSON now, more in the future) means that the two-way connectivity aspect of it is just as important.</em></p>
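The &#8220;read-write&#8221; access described above runs over plain HTTP. A minimal Python sketch: the feed path and the X-PachubeApiKey header follow Pachube&#8217;s historical REST API as I understand it and should be treated as assumptions, and the requests here are only constructed, never actually sent.

```python
# Sketch of "read-write" feed access over plain HTTP, as described above.
# Endpoint path and API-key header are assumptions modelled on Pachube's
# historical REST API; requests are built, not sent.

import urllib.request

BASE = "http://www.pachube.com/api/feeds"
API_KEY = "YOUR_API_KEY"  # placeholder, not a real key

def read_feed(feed_id: int, fmt: str = "csv") -> urllib.request.Request:
    """GET the current values of a feed in CSV, JSON or EEML (the 'read' half)."""
    return urllib.request.Request(
        f"{BASE}/{feed_id}.{fmt}",
        headers={"X-PachubeApiKey": API_KEY},
        method="GET",
    )

def update_feed(feed_id: int, csv_values: str) -> urllib.request.Request:
    """PUT new datapoints to a feed (the 'write' half)."""
    return urllib.request.Request(
        f"{BASE}/{feed_id}.csv",
        data=csv_values.encode(),
        headers={"X-PachubeApiKey": API_KEY},
        method="PUT",
    )

req = update_feed(1284, "21.5,340")
print(req.get_method())  # PUT
print(req.full_url)      # http://www.pachube.com/api/feeds/1284.csv
```

The same two verbs against the same URL are what make a feed both a repository (PUT now, GET later) and a quasi-realtime proxy (GET immediately after each PUT).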
<p><strong>Tish Shute</strong>: I know you mentioned something that sounded a lot like Pachube facilitating buildings&#8217; and products&#8217; ability to benchmark and optimize themselves against/with each other?</p>
<p><strong>Usman Haque:</strong> <em>Further down the line, I would like to see Pachube able to help two particular processes:</em></p>
<p><em>1) to make it straightforward for developers and manufacturers to web-enable their products and services; and 2) to help building and environment designers create their buildings (by providing access to realtime site data) and also to help in the post-occupancy evaluation process &#8212; where buildings will be able to talk with each other, share information on energy consumption, resource management or occupancy rates, and even &#8220;learn&#8221; from each others&#8217; strategies. This type of approach has a parallel at the level of individuals (for example, networked electricity meter users who are able to compare and contrast their usage and strategies for conservation). I don&#8217;t want Pachube to become the application; rather, I want to make it easier for other people and companies to create such applications. So in that sense, yes, perhaps Pachube can be considered an enabler of social networking applications&#8230;!</em></p>
]]></content:encoded>
			<wfw:commentRss>http://www.ugotrade.com/2009/01/28/pachube-patching-the-planet-interview-with-usman-haque/feed/</wfw:commentRss>
		<slash:comments>64</slash:comments>
		</item>
		<item>
		<title>Hacking the World in 2009: Google Street View, &#8220;Smart Stuff,&#8221; and Wikiculture.</title>
		<link>http://www.ugotrade.com/2008/12/29/hacking-the-world-in-2009-google-street-view-smart-stuff-and-wikiculture/</link>
		<comments>http://www.ugotrade.com/2008/12/29/hacking-the-world-in-2009-google-street-view-smart-stuff-and-wikiculture/#comments</comments>
		<pubDate>Mon, 29 Dec 2008 19:20:11 +0000</pubDate>
		<dc:creator><![CDATA[Tish Shute]]></dc:creator>
				<category><![CDATA[3D internet]]></category>
		<category><![CDATA[Ambient Devices]]></category>
		<category><![CDATA[Ambient Displays]]></category>
		<category><![CDATA[architecture of participation]]></category>
		<category><![CDATA[Architecture Working Group]]></category>
		<category><![CDATA[Carbon Footprint Reduction]]></category>
		<category><![CDATA[culture of participation]]></category>
		<category><![CDATA[CurrentCost]]></category>
		<category><![CDATA[Ecological Intelligence]]></category>
		<category><![CDATA[Energy Awareness]]></category>
		<category><![CDATA[Energy Saving]]></category>
		<category><![CDATA[home automation]]></category>
		<category><![CDATA[home energy monitoring]]></category>
		<category><![CDATA[home energy monitors]]></category>
		<category><![CDATA[HomeCamp]]></category>
		<category><![CDATA[Instrumenting the World]]></category>
		<category><![CDATA[interoperability of virtual worlds]]></category>
		<category><![CDATA[Linden Lab]]></category>
		<category><![CDATA[message brokers and sensors]]></category>
		<category><![CDATA[Metaverse]]></category>
		<category><![CDATA[mirror worlds]]></category>
		<category><![CDATA[Mixed Reality]]></category>
		<category><![CDATA[MQTT and RSMB]]></category>
		<category><![CDATA[Open Grid]]></category>
		<category><![CDATA[open metaverse]]></category>
		<category><![CDATA[open protocols for virtual worlds]]></category>
		<category><![CDATA[open source]]></category>
		<category><![CDATA[Open Source Virtual Worlds]]></category>
		<category><![CDATA[open standards for virtual worlds]]></category>
		<category><![CDATA[OpenSim]]></category>
		<category><![CDATA[Paticipatory Culture]]></category>
		<category><![CDATA[smart appliances]]></category>
		<category><![CDATA[Smart Devices]]></category>
		<category><![CDATA[Smart Planet]]></category>
		<category><![CDATA[social media]]></category>
		<category><![CDATA[sustainable living]]></category>
		<category><![CDATA[sustainable mobility]]></category>
		<category><![CDATA[virtual communities]]></category>
		<category><![CDATA[Virtual HomeCamp]]></category>
		<category><![CDATA[Virtual Meters]]></category>
		<category><![CDATA[virtual world standards]]></category>
		<category><![CDATA[Virtual Worlds]]></category>
		<category><![CDATA[Web 2.0]]></category>
		<category><![CDATA[Web Meets World]]></category>
		<category><![CDATA[World 2.0]]></category>

		<guid isPermaLink="false">http://www.ugotrade.com/?p=2463</guid>
		<description><![CDATA[Google Street View Hacking This Google Street View Hack (via @timoreilly) will get my nomination for a Hacking the World Award this year, if there is such an award. A parade (the screenshot opening this post), a marathon, a mad-scientist&#8217;s laboratory, a sword fight, and more (see The Infonaut Blog) were staged all along the route [&#8230;]]]></description>
				<content:encoded><![CDATA[<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2008/12/sampsoniawaypost.jpg"><img class="alignnone size-full wp-image-2475" title="sampsoniawaypost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2008/12/sampsoniawaypost.jpg" alt="" width="450" height="274" /></a></p>
<h3>Google Street View Hacking</h3>
<p><a href="http://www.wikio.com/video/576734" target="_blank">This Google Street View Hack</a> (via<a href="http://twitter.com/timoreilly" target="_blank"> @timoreilly</a>) will get my nomination for a Hacking the World Award this year, if there is such an award.</p>
<p><a href="http://maps.google.com/maps?cbp=1,262.96388206761037,,0,16.58444579096093&amp;cbll=40.456878,-80.01196&amp;layer=c&amp;ie=UTF8&amp;ll=40.458499,-80.009319&amp;spn=0.00569,0.012918&amp;z=17&amp;panoid=zHdES6mj-vBrH2nF-K9ROQ" target="_blank">A parade</a> (the screenshot opening this post), <a href="http://maps.google.com/maps?cbp=1,260.87215088682916,,0,8.64102186979147&amp;cbll=40.457046,-80.011085&amp;layer=c&amp;ie=UTF8&amp;ll=40.458671,-80.00845&amp;spn=0.00569,0.012918&amp;z=17&amp;panoid=81ALq0NpV6uyLEF5S5ENhw" target="_blank">a marathon</a>, <a href="http://maps.google.com/maps?cbp=1,160.10914016686365,,0,33.949139944215034&amp;cbll=40.456949,-80.011593&amp;layer=c&amp;ie=UTF8&amp;ll=40.458573,-80.008954&amp;spn=0.00569,0.012918&amp;z=17&amp;panoid=C4I-QLkZJoT1SHXslK5f7Q" target="_blank">a mad-scientist&#8217;s laboratory</a>, <a href="http://maps.google.com/maps?cbp=1,9.995045624107206,,0,10.698194796922357&amp;cbll=40.457636,-80.00767&amp;layer=c&amp;ie=UTF8&amp;ll=40.459103,-80.006486&amp;spn=0.00569,0.012918&amp;z=17&amp;panoid=W_ox0QPcWyPqWGNPiK91Nw" target="_blank">a sword fight</a>, and more (see <a href="http://www.infonaut.ca/blog/?p=290" target="_blank">The Infonaut Blog</a>) were staged all along the route of the Google Street View truck by artists Robin Hewlett and Ben Kinsley, working in conjunction with the local community and Google Street View.</p>
<p>The Google Street View Hack suggests a myriad of possibilities for anyone with their eye on the prize for a great world hack in 2009. In my mind&#8217;s eye, I imagine the Google Street View truck&#8217;s trek across the planet triggering local environmental street action carnivals wherever it goes.</p>
<p>Local energy conservationists, <a href="http://www.nytimes.com/2008/12/27/world/europe/27house.html?_r=1&amp;pagewanted=all" target="_blank">&#8220;passive house&#8221; architects</a>, and retrofitters could turn the arrival of Google Street View into an occasion to create projects for a sustainable future &#8211; a traveling StreetCamp (see <a href="http://www.ugotrade.com/2008/12/15/smart-planetinterview-with-andy-stanford-clark/" target="_blank">my post on HomeCamp &#8217;08 here</a>). As Google Street View intends, surely, to go everywhere, this would be a global hack for sustainable living that crossed the bounds of the physical and the virtual. And the vast public record of Google Street View would become a generative engine and global resource for sustainable living.</p>
<h3>Working together on the noble aim of sustainable living</h3>
<p>&#8211; this is my (and many other people&#8217;s) big theme for 2009.</p>
<p>A Hacking the World award should also go to <a href="http://www.pachube.com/">Pachube</a> &#8211; &#8220;patching the planet&#8221; &#8211; for demonstrating that instrumenting the world is not merely a Sci Fi fantasy anymore. By facilitating &#8220;interaction between remote environments, both physical and virtual,&#8221; Pachube demonstrates (see <a href="http://community.pachube.com/?q=node/1" target="_blank">diagram here</a>) how we have only just begun to dip our toes into the many new opportunities we have to work together to save energy, rethink our culture of consumption, and reboot our failing economy under a new sustainable operating system.</p>
<p>Energy awareness, unlike entertainment and games with their glut of information, suffers from a dearth of it. We really have very little idea about what we are consuming and the waste we are producing. So more Hacking the World Awards should go to projects like <a href="http://www.amee.com/" target="_blank">AMEE</a> &#8211; creating the world&#8217;s energy meter &#8211; and <a href="http://www.wattzon.com/" target="_blank">Wattzon</a> &#8211; your personal energy meter &#8211; for giving us new ways to understand and work with energy data.</p>
<p>Many people and organizations, given the information, will change their behaviours. But the cultural changes necessary for sustainable living are deep and old habits die hard (see <a href="http://www.nytimes.com/2008/12/27/opinion/27sat1.html" target="_blank">this disturbing report</a> on the recent return to SUV buying in November as soon as gas prices fell!).</p>
<h3>A Small Community of Volunteers Can Bring Change on a Global Scale</h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2008/12/homecampthethrongpost.jpg"><img class="alignnone size-full wp-image-2535" title="homecampthethrongpost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2008/12/homecampthethrongpost.jpg" alt="" width="450" height="153" /></a></p>
<p>Picture above by <a href="http://benjaminellis.co.uk/" target="_blank">Benjamin Ellis</a>, &#8220;HomeCamp &#8211; The Throng,&#8221; from his <a href="http://www.flickr.com/photos/tags/homecamp08/" target="_blank">Flickr</a><a href="http://www.flickr.com/search/?q=homecamp&amp;w=29034542%40N00" target="_blank"> stream.</a></p>
<p>One of my favorite &#8220;instrumenting the world&#8221; projects to date, and another top contender for a Hacking the World Award, is <span class="entry-content"><a id="h4a0" title="HomeCamp '08" href="http://homecamp.pbwiki.com/homecamp08" target="_blank">HomeCamp &#8217;08</a></span> (see my <a href="http://www.ugotrade.com/2008/12/15/smart-planetinterview-with-andy-stanford-clark/" target="_blank">previous post</a>). HomeCamp brings together a community of creators and enthusiasts of &#8220;smart stuff,&#8221; creating <a href="http://meta.wikimedia.org/wiki/Wikiculture" target="_blank">a wikiculture</a> for the noble cause of sustainable living.</p>
<p>The key to whether &#8220;instrumenting the world&#8221; empowers people and changes our lives for the better will be the capacity our systems of instrumentation have for what Jonathan Zittrain, in &#8220;<a href="http://futureoftheinternet.org/" target="_blank">The Future of the Internet: And How to Stop It</a>,&#8221; defines as generativity, i.e. &#8220;the system&#8217;s capacity to produce unanticipated change through unfiltered contributions from broad and varied audiences&#8221; (Zittrain, 2008).</p>
<p>Generativity is the &#8220;secret sauce&#8221; that makes the difference between, for example, <a href="http://www.wikipedia.org/" target="_blank">Wikipedia</a> and its all but forgotten predecessor &#8211; the &#8220;written by experts&#8221; <a href="http://en.wikipedia.org/wiki/Nupedia" target="_blank">Nupedia</a>.</p>
<p>Jonathan Zittrain writes:</p>
<p><em><strong>Wikipedia stands for more than the ability of people to craft their own knowledge and culture.Â  It stands for the idea that people of diverse backgrounds can work together on a common project with, whatever its other weaknesses, a noble aim </strong><strong>- bringing such knowledge to the world. (p.147)</strong></em></p>
<p>At <a href="http://en.oreilly.com/web2008/public/content/home" target="_blank">Web 2.0 Summit</a>, Jonathan Hochman (<em><strong><a href="http://en.oreilly.com/web2008/public/schedule/detail/6952" target="_blank">Known as </a><a href="http://en.wikipedia.org/wiki/User:Jehochman">Jehochman</a> on Wikipedia</strong></em>), shared with me his insider perspective as a Wikipedia administrator. The <a href="http://www.ugotrade.com/2008/12/26/wikipedia-houdini-google-street-view-instrumenting-sustainable-living#link_1">full interview</a> with Jonathan is later in this post.</p>
<p>Jonathan comments on the role of wikiculture in sustainable living:</p>
<p><em><strong>&#8220;Sustainable Living requires everything to become more efficient. Incentives need to line up with conservation priorities. This requires a radical change to the way we govern ourselves. Command economies, whether commanded by politicians or capital, lead to huge inefficiencies.&#8221;</strong></em></p>
<p>And surely, if we have learned anything in 2008, we have learned that very bad things happen when the complex systems of modern life are left in the hands of a few people motivated solely by the urge to make profit.</p>
<h3>Hacking Design and Planning Processes for Real Estate and Transportation with Virtual Worlds</h3>
<p><object width="400" height="302" data="http://vimeo.com/moogaloop.swf?clip_id=2326434&amp;server=vimeo.com&amp;show_title=1&amp;show_byline=1&amp;show_portrait=0&amp;color=&amp;fullscreen=1" type="application/x-shockwave-flash"><param name="allowfullscreen" value="true" /><param name="allowscriptaccess" value="always" /><param name="src" value="http://vimeo.com/moogaloop.swf?clip_id=2326434&amp;server=vimeo.com&amp;show_title=1&amp;show_byline=1&amp;show_portrait=0&amp;color=&amp;fullscreen=1" /></object></p>
<p>This great machinima by Azwaldo Vilotta shows the progress so far on the <a href="http://studiowikitecture.wordpress.com/2008/12/12/now-is-an-ideal-time-to-join-wikitecture-40/" target="_blank">Wikitecture 4.0 project</a>, &#8216;Re-Inventing the Virtual Classroom&#8217; for the University of Alabama.</p>
<p>Though still a niche market, Virtual Worlds are growing at a steady pace. As I mentioned in my previous post, energy-hungry avatars themselves will be a target for optimization in 2009. But as my personal power usage breakdown from <a href="http://www.wattzon.com/" target="_blank">Wattzon</a> shows, cutting down the amount of flying I do in 2009 would be far more effective in reducing my carbon footprint than deciding not to log into Virtual Worlds!</p>
<p>Note: Read Write Web&#8217;s recent post, &#8220;<a href="http://www.readwriteweb.com/archives/enterprise_virtual_worlds.php" target="_blank">Report: Enterprise Virtual Worlds More Effective Than Web Conferencing</a>.&#8221; Also check out <a href="http://www.projectchainsaw.com/" target="_blank">Web.Alive</a>, <a href="http://immersivespaces.com/" target="_blank">Immersive WorkSpaces</a>, and Dusan Writer&#8217;s post, &#8220;<a href="http://dusanwriter.com/index.php/2008/12/20/thinkbalm-the-immersive-internet-and-collaborative-culture/" target="_blank">ThinkBalm, The Immersive Internet and Collaborative Culture</a>.&#8221;</p>
<p>My friend Melanie Swan points out in her Top Ten Computing Trends for 2009 that Virtual Worlds not only have the power of the 3 Cs (communication, collaboration and commerce), but they are fast expanding into <a href="http://www.3pointd.com/20070406/rapid-architectural-prototyping-in-second-life/">rapid prototyping</a>, <a href="http://your2ndplace.com/node/926">simulation</a> and <a href="http://sldataviz.pbwiki.com/">data visualization</a>.</p>
<p>My Hacking the World, 2008, Awards for Virtual World innovation would go to three potentially world changing projects for sustainable living:</p>
<p>1) <a href="http://studiowikitecture.wordpress.com/" target="_blank">Studio Wikitecture</a>, (see <a href="http://studiowikitecture.wordpress.com/" target="_blank">&#8220;Reinventing the Virtual Classroom&#8221;</a> for The University of Alabama).</p>
<p>2) Oliver Goh&#8217;s work on &#8220;<a href="http://www.shaspa.com/cms/website.php" target="_blank">The Path to Sustainable Real Estate.&#8221;</a></p>
<p>3) Encitra, a company recently co-founded by <a href="http://www.ics.uci.edu/informatics/research/research_highlight_view.php?id=52" target="_blank">Crista Lopes</a> and <a href="http://www.podcar.org/uppsalaconference/christerlindstrom.htm" target="_blank">Christer Lindstrom</a>, focused on improving urban planning processes &#8211; starting with transportation &#8211; using virtual worlds (<a href="http://www.ugotrade.com/2008/11/25/web-meets-world-participatory-culture-and-sustainable-living/" target="_blank">see my previous post here for more</a>).</p>
<p>The latter two projects are being developed in <a href="http://opensimulator.org/wiki/Main_Page" target="_blank">OpenSim</a> &#8211; the open source project that should also get a Hacking The World Award for creating an open, modular architecture for virtual worlds that is unleashing all these new possibilities for integrating physical and virtual worlds.</p>
<p>The 2008 code contributions to OpenSim of special note for world hacking are Crista Lopes&#8217; <a href="http://opensimulator.org/wiki/Hypergrid">OpenSim Hypergrid</a> &#8211; see Justin CC&#8217;s blog for full details on <a href="http://justincc.wordpress.com/2008/12/19/what-is-the-hypergrid/" target="_blank">&#8220;What is the hypergrid?&#8221;</a> &#8211; and David Levine&#8217;s work (IBM), in collaboration with Linden Lab (see the <a href="http://wiki.secondlife.com/wiki/Architecture_Working_Group" target="_blank">Architecture Working Group</a>), on interoperability (see <a href="http://www.ugotrade.com/2008/07/" target="_blank">my earlier post here</a>).</p>
<p>Both these projects expand the frontiers of interoperability for virtual worlds, although they &#8220;slice the problem from different ends,&#8221; as David Levine put it. The emphasis in the LL/IBM approach is on security, so assets are not moving yet. In Crista&#8217;s solution you can have assets, but the security issues are not addressed yet. Still, this work is vital to expanding the usefulness of virtual worlds, and both projects should get Hacking the World Awards IMHO.</p>
<p>I asked <a href="http://archsl.wordpress.com/" target="_blank">Jon Brouchoud </a>(full interview upcoming) what he thought were Studio Wikitecture&#8217;s most important successes to date:</p>
<p><strong><em>&#8220;I think the greatest success has been in proving, on some level, that everyone has important knowledge that can inform and improve the design of a building, not just architects.Â  If we can continue building on that success, I hope we can eventually start to hack the traditional design process, and find ways to harness the wealth of knowledge held by the general public, instead of ignoring or avoiding it, as is most often the case.&#8221;</em></strong></p>
<h3>Harnessing the &#8220;Smart Stuff&#8221; to the Noble Cause of Sustainable Living</h3>
<p>Robert Scoble&#8217;s <a href="http://scobleizer.com/2008/12/27/the-interview-of-the-year-tim-oreilly/" target="_blank">The Interview of the Year: Tim O&#8217;Reilly</a> is not to be missed. Tim O&#8217;Reilly discusses the key trends for 2009 that are bubbling up at O&#8217;Reilly Media. And, yes, Tim O&#8217;Reilly, as the guru of Hacking the World, gets the &#8220;Distinguished Thinker &#8211; Hacking The World Award of 2008!&#8221;</p>
<p>Tim O&#8217;Reilly&#8217;s trend list includes:</p>
<p>1) Big data &#8211; vast, peer-produced databases in the cloud, accessible by mobile devices</p>
<p>2) &#8220;Smart stuff&#8221; &#8211; sensors and robotics, and hacking on stuff for fun and not for profit</p>
<p>3) Green Tech</p>
<p>4) Advances in Biological/Life Sciences.</p>
<p>And, in Robert Scoble&#8217;s interview, there is a nice titbit of history regarding his attendance of early <a href="http://en.wikipedia.org/wiki/Foo_Camp" target="_blank">Foo Camps</a>. Foo Camp is the wiki of O&#8217;Reilly conferences and a lineage holder to my favorite Hacking the World event of 2008, <span class="entry-content"><a id="h4a0" title="HomeCamp '08" href="http://homecamp.pbwiki.com/homecamp08" target="_blank">HomeCamp &#8217;08</a></span>.</p>
<p>But what will be the &#8220;secret sauce&#8221; for these big ideas &#8211; the generative engines that harness these vast peer-produced databases, and all the creative &#8220;smart stuff&#8221; hackers across the globe are creating, to the noble cause of sustainable living? What will motivate the mass adoption of Green Tech and sustainable living?</p>
<p>What can Wikipedia teach us about how generative systems and bottom up approaches can change the world?</p>
<p>Jimmy Wales (interview coming soon!) writes in his recent <a href="http://wikimediafoundation.org/wiki/Donate/Letter/en?utm_source=2008_jimmy_letter_r&amp;utm_medium=sitenotice&amp;utm_campaign=fundraiser2008#appeal" target="_blank">personal appeal</a> for support for Wikipedia:</p>
<p><em><strong>At its core, Wikipedia is driven by a global community of more than 150,000 volunteers &#8211; all dedicated to sharing knowledge freely. Over almost eight years, these volunteers have contributed more than 11 million articles in 265 languages. More than 275 million people come to our website every month to access information, free of charge and free of advertising.</strong></em></p>
<p>To answer questions on how to create a successful wikiculture for sustainable living, an insider&#8217;s view of Wikipedia may be a good place to start.</p>
<h3>Interview With Jonathan Hochman on Wikipedia</h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2008/12/gammapostjon.jpg"><img class="alignnone size-full wp-image-2477" title="gammapostjon" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2008/12/gammapostjon.jpg" alt="" width="223" height="158" /></a><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2008/12/jonathanwikikpost.jpg"><img class="alignnone size-full wp-image-2473" title="jonathanwikikpost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2008/12/jonathanwikikpost.jpg" alt="" width="224" height="158" /></a></p>
<p>The picture on the left is from the Wikipedia article, <a href="http://en.wikipedia.org/wiki/Gamma-ray_burst" target="_blank">Gamma-ray Burst</a>, that Jonathan Hochman is currently working on. It is a drawing of a massive <a title="Star" href="http://en.wikipedia.org/wiki/Star">star</a> collapsing to form a <a title="Black hole" href="http://en.wikipedia.org/wiki/Black_hole">black hole</a>. Energy released as jets along the axis of rotation forms a gamma-ray burst. <em>Credit: Nicolle Rager Fuller/NSF</em></p>
<p>The picture on the right, of Jonathan at Web 2.0 Summit, was taken by me. Jonathan was part of the <em><a href="http://en.oreilly.com/web2008/public/schedule/detail/6952" target="_blank">Defending Web 2.0 from Virtual Blight</a></em> panel.</p>
<p><em><strong>Known as <a href="http://en.wikipedia.org/wiki/User:Jehochman">Jehochman</a> on Wikipedia, he serves as an administrator and as a leader in addressing online harassment, disruption and sock puppetry. He is also the founder of <a href="http://www.hochmanconsultants.com/">Hochman Consultants</a>, an Internet marketing consultancy, and the director of <a href="http://www.semne.org/">Search Engine Marketing New England</a>, a regional conference series.</strong></em></p>
<p><strong>Tish:</strong> Second Life and Wikipedia are the two great experiments in collaborative co-creation &#8211; what do they have to teach us about the future of the internet?</p>
<p><strong>Jonathan:</strong> Yes, Wikipedia and Second Life are key social spaces. Some people have been seeing Second Life as the beginning of Web 3.0 &#8211; a wrap-around environment where you can almost experience another life. Wikipedia is, in a sense, another example of this.</p>
<p>All the problems that exist in the real world are mirrored right into that little universe. For example, the Armenians and the Turks are at each other&#8217;s throats, the Japanese and the Koreans are going at it, the Palestinians and the Israelis, and the &#8220;Troubles&#8221; &#8230; all the conflicts are imported into Wikipedia. People are fighting over the content of these articles. They want to have it their way, because these articles are ranked first in Google and have a big impact on public opinion.</p>
<p>There was a huge fight on the waterboarding article a while back. Some guys from Little Green Footballs &#8211; a very conservative, reactionary type of media &#8211; were trying to change the article to say that waterboarding might not be torture &#8211; to say it is probably not so bad. Crazy stuff. They were trying to water it down. And it is very clear, from every source out there, that waterboarding is torture. We did a study and there are 115 sources that say waterboarding is torture. You simulate drowning &#8211; you simulate killing someone &#8211; that is a violation of the Geneva Convention and everything else. People were fighting, fighting, fighting!</p>
<p>One of the things I did was to try and clear out people who were being disruptive. We actually had to go to arbitration over that article. It is like the supreme court of Wikipedia. There is a panel of 15 arbitrators. They hear the case. There is evidence, arguments and decisions. It is really like a simulated lawsuit &#8211; you get all the experience of a simulated lawsuit, with the real threat that you could be banned. If they don&#8217;t like what you are doing they can actually ban you or restrict you from topics.</p>
<p>So it is really fascinating how this social space, Wikipedia, becomes a very real platform &#8211; though it is in a virtual world &#8211; for real-world disputes. Most disputes are over the definition of things. If you have a lawsuit, most disputes are about how things are defined. And Wikipedia has become the de facto definition of things in the real world. People want to know what &#8220;The Troubles&#8221; are. If you go to Wikipedia you find out The Troubles are a dispute over Northern Ireland. What the article says has a profound impact on public opinion.</p>
<p><strong>Tish:</strong> So who is on the court of Wikipedia?</p>
<p><strong>Jonathan:</strong> They are volunteers. These people work two or three hours a day to run this court. There are all kinds of projects. There is WikiProject Spam, which has people who can write computer programs to statistically analyze Wikipedia &#8211; and not only Wikipedia. All of them are looking at the links, reporting them, and banning the people who are abusing or gaming the system.</p>
<p><strong>Tish:</strong> You were on the Stopping Virtual Blight Panel at Web 2.0 Summit &#8211; what are the most important things to think about on this topic?</p>
<p><strong>Jonathan:</strong> Yes, we were talking about how to defend the web against virtual blight. The thing I find interesting about Wikipedia is that it is the eighth largest web site, and possibly the second largest web site comprised of user-generated content, after YouTube. The problems that exist in Wikipedia are larger and more detailed than on any other site. Whatever problem someone has on their social media site or their Web 2.0 site, these problems already exist in Wikipedia, and the solutions are there and they are transparent. You can actually see the history of what&#8217;s been done.</p>
<p>If there is, for example, a problem on Digg &#8211; some problem with sock puppetry or vote stacking &#8211; it happens, it goes away. You don&#8217;t get full disclosure. With Wikipedia you can actually go in and look at a dispute and watch it unfold. You can watch the arbitration cases that are filed, the arguments, the decisions, the logic, the rationale. You can see the successes and the failures, and the different things people have tried to control blight. For example: we tried to resolve this dispute one way but it was a disaster, so we tried something else and that worked.</p>
<p>Wikipedia is a large laboratory for social media &#8211; Wikipedia and the large universe of Wiki and Wikimedia projects around it that individuals and enterprises put together, like Commons. Wikimedia Commons is a repository of publicly licensed images that anyone can take and reuse. They have sound and they have video, and all of this stuff is being stitched together now.</p>
<p>So if you go to the article on Obama you can probably now hear his acceptance speech, because that is public domain &#8211; it&#8217;s been stitched into the article. If you go to the article on Richard Nixon &#8211; his resignation speech &#8211; you may even hear his conversation with the astronauts when they landed on the moon. So this becomes a giant repository of all our culture and knowledge. When I design a website, a lot of times I go to Commons to find images I can use for free. I don&#8217;t want to pay for an image I can get for free.</p>
<p><strong>Tish: </strong>And the Commons images get contextualized in Wikipedia too.</p>
<p><strong>Jonathan:</strong> Some of these articles are fascinatingly detailed. If you want a quick summary of Dr. Strangelove, the article is fantastic. It is enjoyable, a pleasure to read. I was reading about S.A. Andree&#8217;s North Pole balloon expedition of 1897. Some guys from Sweden decided to fly a balloon over the North Pole. They managed to get aloft, then they flew over the icepack for 24 hours, then they crashed.</p>
<p>They unloaded their stuff and hiked back across the ice toward the island they had launched from. They ended up being on the ice pack for three months before they finally holed up in an ice cave and starved to death. They weren&#8217;t found until thirty years later! There was a camera with these guys, with the frozen pictures taken thirty years earlier. They developed the film, and those pictures are now on Wikipedia. It is just a fascinating thing!</p>
<p><strong>Tish:</strong> Do you see real-time collaboration beginning to play more of a role in Wikipedia &#8211; whether virtual worlds or just voice/IM &#8211; and how could real-time collaboration change the Wikipedia editing process?</p>
<p><strong>Jonathan:</strong> The Presidential candidate articles were being edited very rapidly yesterday. There are certain real-time problems. Some of the more interesting problems are when you get two administrators who &#8220;get into it.&#8221; One administrator says, I am blocking this user, and the other one says, I am unblocking him, and the other one, &#8220;NO, I am blocking him!&#8221; And so on&#8230; And everyone says, &#8220;Stop fighting. You are not allowed to do that!&#8221; And they both get their powers stripped. People do get very heated over the silliest things. Wikipedia does have some mailing lists attached, and there are some IRC channels. So there are some real-time elements.</p>
<p><strong>Tish:</strong> What is the role of avatars in Wikipedia?</p>
<p><strong>Jonathan:</strong> In Wikipedia you have a user page, and many users are anonymous. They create an avatar, personalize it, and show themselves in the ways they want through that avatar. In many ways it is a lot like Second Life.</p>
<p>Some users have created second accounts &#8211; or a humorous second account. Bishzilla, for example &#8211; a Swedish lady who is in tremendous command of the English language and has a razor-sharp wit. She has created this secondary account that writes almost in a baby language. Her avatar is a dinosaur, not very bright, that goes around frying people. Bizarre what people do! People may be editing a topic out of an interest they have &#8211; e.g. Pokemon &#8211; that they don&#8217;t want associated with their professional avatar. Or people may be editing hot political issues. There have actually been death threats issued to people over stuff they have been putting into the encyclopedia.</p>
<p><strong>Tish:</strong> So avatars are important in Wikipedia.</p>
<p><strong>Jonathan:</strong> Absolutely, because people may be going in and editing articles that they may not want their friends and family to know they are editing. One editor may say to another, &#8220;Stop putting stuff in or I will come and kill you!&#8221; Well, then we have to ban them. We have to call the police.</p>
<p><strong>Tish:</strong> Can you build reputations on multiple avatars?</p>
<p><strong>Jonathan:</strong> You are allowed to use multiple avatars as long as they don&#8217;t cross paths. You can&#8217;t have two avatars editing in the same area, because you would be giving yourself double weight commenting on a discussion.</p>
<p><strong>Tish:</strong> How do you know when this is happening?</p>
<p><strong>Jonathan:</strong> You can watch the style of a user&#8217;s editing. You have to watch behavior. And if you have enough evidence through behavior that suggests accounts are controlled by one person, you can go and request a technical check.</p>
<p>There are some users called Checkusers who are able to access information from the server logs and check the technical characteristics of these accounts to see if they are using the same IP address.</p>
<p><strong>Tish:</strong> So if you want to understand avatar interaction on the web it helps to understand Wikipedia.</p>
<p><strong>Jonathan:</strong> Yes, it is a fantastic way to understand how avatars work in some respects, and also how to deal with community dynamics. We have some very strong-willed people &#8211; people in their 40s, 50s, and 60s &#8211; who are very successful in business. They have plenty of money and spare time, and they are doing this as a hobby. And some of these people can really butt heads. You can have a problem when you have an editor who has been writing fantastic articles but also happens to be rude, chewing other people out and telling them to f**k off if they are not behaving. What do you do?</p>
<p><strong>Tish:</strong> Sounds a bit like Second Life!</p>
<p><strong>Jonathan:</strong> The person is a great contributor to the community, but they are telling noobies to f**k off, so you can&#8217;t allow that.</p>
<p>What do you do? Vested contributors are a major problem for some of these sites. They are vested in the community, but they start misbehaving. You can&#8217;t block them, because if you do there is a huge uproar from all their friends and it causes a cataclysm. It requires very careful diplomacy to deal with some of these situations.</p>
<p><strong>Tish:</strong> How many Wikipedia volunteers are there now?</p>
<p><strong>Jonathan:</strong> Think of a Venn diagram &#8211; a big circle. In total, about one million different people contribute. But there are probably about 5,000 active editors who consistently and regularly contribute. And within that kernel there are fifteen hundred people with administrator access, of whom probably only eight hundred are active. People have a natural life span with the community. People come and typically stay for six months to three years. Usually after that they become bored, disillusioned, or get into a conflict with someone. There is a natural tendency for people to stay for a while and move on. Some people stay longer &#8211; a few &#8211; but the majority will move on at some point. So there are a lot of fresh faces moving in.</p>
<p><strong>Tish:</strong> What lessons of trust does Wikipedia have to teach us about new projects like AMEE that aim to aggregate the world&#8217;s energy data?</p>
<p><strong>Jonathan:</strong> Well, you have to know who is releasing the data. Who is creating the data? The beauty of Wikipedia is that you have an edit history, so you can see exactly who has done what. So you can judge whether a person is trustworthy or not. That&#8217;s a huge problem on the web today. We don&#8217;t have enough identification information. When you see a web page you don&#8217;t necessarily know when that page was created and by whom, or how many revisions it has had. Sometimes you can glean information by checking it. If you see typos and errors you may decide that the page probably didn&#8217;t receive as much attention as it should have, and probably it is not that good.</p>
<p>Typos are an interesting thing. People always try to figure out how Google ranks web pages. <a id="uy3s" title="Matt Cutts" href="http://www.mattcutts.com/">Matt Cutts</a> was here from Google today, and he was talking about spam. But Matt also did a <a id="e4lo" title="blog post" href="http://www.mattcutts.com/blog/2006-pubcon-in-vegas-getting-there-and-back/">blog post</a> about how he was in an airport once, and about his policy: when you are reading a document, as soon as you come to the first error just stop, because if the author hasn&#8217;t taken the care to make everything correct, you don&#8217;t need to read it. So he was in the airport, there was a sign, he came to a typo and stopped reading it. Somehow he got in trouble for not reading the sign and not having the information. But it is interesting to wonder whether Google is looking for typos, misspellings, and broken links, and using that as a signal of quality to rank pages.</p>
<p><strong>Tish:</strong> Aaaagh, typos might bring down your page rank!!! That certainly is a scary thought for a blogger like me who likes to write impossibly long posts that are hard to check&#8230;</p>
]]></content:encoded>
			<wfw:commentRss>http://www.ugotrade.com/2008/12/29/hacking-the-world-in-2009-google-street-view-smart-stuff-and-wikiculture/feed/</wfw:commentRss>
		<slash:comments>7</slash:comments>
		</item>
		<item>
		<title>Smart Planet:Interview with Andy Stanford-Clark</title>
		<link>http://www.ugotrade.com/2008/12/15/smart-planetinterview-with-andy-stanford-clark/</link>
		<comments>http://www.ugotrade.com/2008/12/15/smart-planetinterview-with-andy-stanford-clark/#comments</comments>
		<pubDate>Mon, 15 Dec 2008 18:13:59 +0000</pubDate>
		<dc:creator><![CDATA[Tish Shute]]></dc:creator>
				<category><![CDATA[3D internet]]></category>
		<category><![CDATA[architecture of participation]]></category>
		<category><![CDATA[Augmented Reality]]></category>
		<category><![CDATA[Carbon Footprint Reduction]]></category>
		<category><![CDATA[culture of participation]]></category>
		<category><![CDATA[CurrentCost]]></category>
		<category><![CDATA[digital public space]]></category>
		<category><![CDATA[Ecological Intelligence]]></category>
		<category><![CDATA[Energy Awareness]]></category>
		<category><![CDATA[Energy Saving]]></category>
		<category><![CDATA[home automation]]></category>
		<category><![CDATA[home energy monitoring]]></category>
		<category><![CDATA[HomeCamp]]></category>
		<category><![CDATA[Instrumenting the World]]></category>
		<category><![CDATA[message brokers and sensors]]></category>
		<category><![CDATA[mirror worlds]]></category>
		<category><![CDATA[Mixed Reality]]></category>
		<category><![CDATA[MQTT and RSMB]]></category>
		<category><![CDATA[Open Grid]]></category>
		<category><![CDATA[open metaverse]]></category>
		<category><![CDATA[open source]]></category>
		<category><![CDATA[Open Source Virtual Worlds]]></category>
		<category><![CDATA[Paticipatory Culture]]></category>
		<category><![CDATA[smart appliances]]></category>
		<category><![CDATA[Smart Devices]]></category>
		<category><![CDATA[Smart Planet]]></category>
		<category><![CDATA[social media]]></category>
		<category><![CDATA[sustainable living]]></category>
		<category><![CDATA[sustainable mobility]]></category>
		<category><![CDATA[Virtual HomeCamp]]></category>
		<category><![CDATA[Web 2.0]]></category>
		<category><![CDATA[Web Meets World]]></category>
		<category><![CDATA[Web3.D]]></category>
		<category><![CDATA[World 2.0]]></category>

		<guid isPermaLink="false">http://www.ugotrade.com/?p=2404</guid>
		<description><![CDATA[&#8220;Smart Planet: Andy Stanford-Clark&#8217;s time has really come. His career of work in lightweight brokers and sensors is now going to pay off,&#8221; twittered James Governor (@monkchips), Redmonk, recently. The picture opening this post (from Andy Piper&#8217;s Flickr stream) was taken during Andy Stanford-Clark&#8217;s talk at The Inaugural HomeCamp (for more photos see Flickr &#8220;homecamp08&#8221;). [&#8230;]]]></description>
				<content:encoded><![CDATA[<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2008/12/andystanfordclark.jpg"><img class="alignnone size-full wp-image-2405" title="andystanfordclark" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2008/12/andystanfordclark.jpg" alt="" width="450" height="300" /></a></p>
<p><span class="entry-content"><em><strong>&#8220;Smart Planet: Andy Stanford-Clark&#8217;s time has really come. His career of work in lightweight brokers and sensors is now going to pay off,&#8221;</strong></em> <a href="http://twitter.com/monkchips/status/1029249885" target="_blank">twittered</a> </span><span class="entry-content">James Governor</span><span class="entry-content"> </span><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2008/12/andystanfordclark.jpg"><span class="entry-content">(</span></a><a id="qd8i" title="@monkchips" href="http://twitter.com/monkchips" target="_blank">@monkchips</a>), <a href="http://redmonk.com/">Redmonk,</a> recently<span class="entry-content">. </span></p>
<p><span class="entry-content">The picture opening this post (from <a id="wfe3" title="Andy Piper's Flickr stream" href="http://www.flickr.com/photos/andypiper/" target="_blank">Andy Piper&#8217;s Flickr stream</a>) was taken during Andy Stanford-Clark&#8217;s talk at <a id="exzg" title="The Inaugural HomeCamp" href="http://andypiper.wordpress.com/2008/12/01/the-inaugural-homecamp/">The Inaugural HomeCamp</a> (for more photos see <a id="hi96" title="Flickr &quot;homecamp08&quot;" href="http://www.flickr.com/photos/tags/homecamp08/" target="_blank">Flickr &#8220;homecamp08&#8221;</a>).</span></p>
<p><span class="entry-content"><a id="h4a0" title="HomeCamp '08" href="http://homecamp.pbwiki.com/homecamp08" target="_blank">HomeCamp &#8217;08</a> was organized by </span><a id="pnnc" title="Chris Dalby" href="http://www.yellowpark.net/cdalby/" target="_blank">Chris Dalby</a> and <a id="vqd3" title="Dale Lane" href="http://dalelane.co.uk/blog/" target="_blank">Dale Lane</a>, and sponsored by <a href="http://currentcost.co.uk/">Current Cost</a> and <a href="http://redmonk.com/">Redmonk</a>. A <span class="entry-content">video of Andy Stanford-Clark&#8217;s talk, by <a id="hwom" title="Andy Piper" href="http://andypiper.wordpress.com/" target="_blank">Andy Piper</a>, is <a id="k4xo" title="see the video taken" href="http://www.viddler.com/explore/andypiper/videos/21/" target="_blank">up on Viddler</a>. Also see </span>Andy Piper&#8217;s <a href="http://andypiper.wordpress.com/2008/04/27/current-cost/" target="_blank">post about CurrentCost meters</a> and, most recently, about <a href="http://andypiper.wordpress.com/2008/12/11/current-cost-monitoring-from-an-iphone/" target="_blank">running his CurrentCost meter&#8217;s graphs on his iPhone</a>.</p>
<p>Ambient displays were a hot topic at HomeCamp &#8211; see <a id="q39t" title="here" href="http://ambientdevices.com/cat/orb/orborder.html" target="_blank">here</a> and <a id="ss3w" title="here" href="http://ambientdevices.com/cat/index.html" target="_blank">here</a> for some good examples.</p>
<p><span class="entry-content">I </span><a id="pyxa" title="first wrote about IBM Master Inventor Andy Stanford-Clark's Home Automation project June of 2007" href="../../2007/06/05/extreme-life-logging-3d-experience-architects-digging-it-with-destroy-tv/" target="_blank">first wrote about IBM Master Inventor Andy Stanford-Clark&#8217;s Home Automation project in June of 2007</a><span class="entry-content">. At that time relatively few people were playing with home monitoring. But now the linchpin of Andy&#8217;s work</span> &#8211; MQTT and RSMB, the Really Small Message Broker &#8211; is available free on <a id="h0is" title="IBM AlphaWorks" href="http://alphaworks.ibm.com/tech/rsmb" target="_blank">IBM AlphaWorks</a> for anyone to download and play with.</p>
<p>This puts a key tool into the hands of developers and mashup artists ready to explore the possibilities of home automation as a generative technology that can bring the power of participatory culture to the urgent task of creating sustainable living. Andy points out:</p>
<p><em><strong>&#8220;Lots of people can start playing with home energy monitoring, social aspects of the data sharing, home automation, ambient displays, etc. The powerful thing about messaging middleware like MQTT, is that you don&#8217;t have to worry about how to get the messages from A to B: you can focus on how to capture the data, and what to do with it when it gets to the other end.&#8221;</strong></em></p>
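<p>What Andy describes is the publish/subscribe pattern: producers and consumers of data only share a topic name, never a direct connection. The sketch below is a toy in-process Python illustration of that idea only, not MQTT itself (a real setup would run a broker such as RSMB and connect clients over the network), and the <code>house/power</code> topic name is made up:</p>

```python
# Toy in-process publish/subscribe hub, illustrating the decoupling Andy
# describes: publishers and subscribers only share a topic string, never a
# direct connection. A real MQTT broker (e.g. RSMB) does this over a network.
from collections import defaultdict

class TinyBroker:
    def __init__(self):
        self._subs = defaultdict(list)   # topic -> list of subscriber callbacks

    def subscribe(self, topic, callback):
        self._subs[topic].append(callback)

    def publish(self, topic, payload):
        for cb in self._subs[topic]:     # fan-out: one message in, many out
            cb(topic, payload)

broker = TinyBroker()
readings = []
# Two independent subscribers to the same (hypothetical) topic:
broker.subscribe("house/power", lambda t, p: readings.append(p))
broker.subscribe("house/power", lambda t, p: print(f"{t}: {p} W"))
broker.publish("house/power", 480)       # both subscribers receive this
```

<p>The point of the pattern is the one Andy makes: the sensor code that publishes and the display code that subscribes never need to know about each other.</p>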
<p>The full interview I did with Andy last week appears later in this post.</p>
<p><span class="entry-content">Also recently, I did an <a id="gp5_" title="interview with Gavin Starks, founder of AMEE" href="../../2008/11/02/tim-oreilly-instrumenting-the-world/">interview with Gavin Starks, founder of AMEE</a>. </span>As a neutral data aggregation platform, &#8220;AMEE&#8217;s vision is to enable the measurement of the &#8216;Carbon Footprint&#8217; of everything on Earth.&#8221; A<span class="entry-content"><a id="cde2" title="A press release out yesterday" href="http://www.amee.com/?p=556"> press release last week</a> announced that a &#8220;co</span>llaboration between O&#8217;Reilly Alphatech Ventures (OATV), Union Square Ventures (USV) and The Accelerator Group (TAG) will enable AMEE to expand its reach by enhancing its data, and extend globally.<span class="entry-content">&#8221;</span></p>
<p>The combination of a neutral aggregation platform and MQTT and RSMB can enable new forms of data sharing to meet broader sustainability goals (see <a id="ol7c" title="my interview with Gavin for AMEE's direction re privacy and data sharing" href="../../2008/11/02/tim-oreilly-instrumenting-the-world/">my interview with Gavin for AMEE&#8217;s direction re privacy and data sharing</a>), and the kind of ecological intelligence that Larry Brilliant of Google.org talked about at <a href="http://en.oreilly.com/web2008/public/content/home" target="_blank">Web 2.0 Summit</a>. Dan Goleman&#8217;s new book, <a title="&quot;Ecological Intelligence: How Knowing the Hidden Impacts of What We Buy Can Change Everything&quot;" href="http://www.randomhouse.ca/catalog/display.pperl?isbn=9780385527828" target="_blank">&#8220;Ecological Intelligence: How Knowing the Hidden Impacts of What We Buy Can Change Everything,&#8221;</a> will come out in April 2009 (<a id="fkkt" title="see my previous post" href="../../2008/11/25/web-meets-world-participatory-culture-and-sustainable-living/">see my previous post</a>).</p>
<p>There is already a <a id="c-ox" title="virtual worlds integration to AMEE" href="http://carbongoggles.org/">virtual worlds integration to AMEE</a> by <a id="qg5." title="Jim Purbrick" href="http://jimpurbrick.com/">Jim Purbrick</a> of Linden Lab!<br />
<span class="entry-content"><br />
</span></p>
<h3>Links For HomeCamp &#8217;08</h3>
<p>Chris Dalby has a list of blog posts about homecamp in his <a id="vx_v" title="HomeCamp Review" href="http://www.yellowpark.net/cdalby/index.php/2008/12/10/home-camp-review/" target="_blank">HomeCamp Review</a>.</p>
<p><a href="http://dalelane.co.uk/blog/?p=318">Homecamp by Dale Lane</a><br />
<a href="http://nicktaylor.co.uk/2008/11/10/home-camp/">Home Camp Unconference &#8211; inspired me by the thoughts</a><br />
<a href="http://andypiper.wordpress.com/2008/12/01/the-inaugural-homecamp/">The Inaugural Homecamp<br />
</a><a href="http://www.tomtaylor.co.uk/blog/2008/11/30/homecamp-demand-shifting/">Home Camp Demand Shifting</a><br />
<a href="http://pbjots.blogspot.com/2008/11/homecamp-november-2008.html">Homecamp</a> from Phoebe Bright<br />
<a id="tti9" title="Homecamp '08" href="http://jamie.op-i.net/blog/" target="_blank">Homecamp &#8217;08</a><br />
<a id="lnis" title="HomeCamp Event: Andy Stanford-Clark's View" href="http://digital-lifestyles.info/2008/12/08/homecamp-event-andy-stanford-clarks-view/" target="_blank">HomeCamp Event: Andy Stanford-Clark&#8217;s View</a></p>
<h3>Virtual HomeCamp</h3>
<p><span class="entry-content">In 2007, I published the picture below (thanks to <a href="http://annieok.com" target="_blank">Annie Ok</a>, as Destroy Television, for the SL pics), which shows:</span></p>
<p>On the right, the virtualization of Andy&#8217;s RL house, which is part of a Second Life Real Life Home Automation project. The pictures in the bottom row show Mrs Stanford-Clark&#8217;s Real Life llamas on the left and their virtual counterparts in Second Life on the right. Real and virtual llamas are linked through GPS and MQTT so people can &#8220;track the trek&#8221; when the llamas are out on a walk (see <a href="http://www-03.ibm.com/innovation/us/podcasts/blog_videocast.shtml">this IBM podcast</a>).</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2008/12/andysautomatedhouse.jpg"><img class="alignnone size-full wp-image-2409" title="andysautomatedhouse" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2008/12/andysautomatedhouse.jpg" alt="" width="448" height="338" /></a></p>
<p>I am currently working on a Virtual HomeCamp which will probably be nomadic from meetup to meetup but will kick off in Andy&#8217;s virtual house in Second Life. Andy Stanford-Clark, <a id="awwk" title="Adam Frisby" href="http://www.adamfrisby.com/blog/" target="_blank">Adam Frisby</a> (one of the founders of <a id="bc79" title="OpenSim" href="http://opensimulator.org/wiki/Main_Page" target="_blank">OpenSim</a> ), and Chris Dalby have all agreed to talk (more presenters to come!) at the first Virtual HomeCamp.</p>
<p>Charles Crinke of OpenSim has offered Virtual HomeCamp a patch of land on <a id="s58j" title="OSGrid" href="http://osgrid.org/" target="_blank">OSGrid</a>, and has offered to give a talk on interesting home automation projects to get started in OpenSim. Charles has a cornucopia of great ideas!</p>
<p>And Kyle Gomboy (avatar G2 Proto) of the Microsoft Developer Community has set up an OpenSim on <a id="z:jr" title="ReactionGrid" href="http://reactiongrid.com/" target="_blank">ReactionGrid</a> that virtual HomeCampers can use to develop projects related to participatory culture and sustainable living.</p>
<p>The interview with Andy Stanford-Clark in this post gives Virtual HomeCampers some great ideas for good projects &#8220;that matter&#8221; to work on.</p>
<p>If you have a Second Life or OpenSim venue and you would like to offer your sim for a meetup &#8211; please let me know! Meetups will need to be streamed to the web as there is already a dynamic and rapidly growing HomeCamp community. See:</p>
<p><a id="mg60" title="HomeCamp Wiki" href="http://homecamp.pbwiki.com/" target="_blank">HomeCamp Wiki</a></p>
<h4 style="font-weight: normal;"><a href="http://homecamp.org.uk/">HomeCamp Blog</a></h4>
<h4 style="font-weight: normal;"><a href="http://upcoming.yahoo.com/event/1304370">HomeCamp on Upcoming</a></h4>
<h4 style="font-weight: normal;"><a href="http://www.facebook.com/events.php?ref=sb#/event.php?eid=43794919520">HomeCamp on Facebook</a></h4>
<h4 style="font-weight: normal;"><a href="http://groups.google.co.uk/group/homecamp?hl=en">Google Group Discussion</a></h4>
<p><a href="http://friendfeed.com/rooms/homecamp">FriendFeed Room</a></p>
<h3>Reducing the Carbon Footprint of Avatars and Getting Energy Awareness to the Masses</h3>
<p>As Andy notes:</p>
<p><em><strong>&#8220;We need to get energy awareness and energy saving to the masses; and by saying &#8220;you can reduce energy by interacting in a virtual 3D world&#8221;, just isn&#8217;t going to cut it for all but a very small fraction of the people we need to get to.&#8221;</strong></em></p>
<p>But, perhaps, some of our phenomenal OpenSim developers will push the envelope and produce the code that will make open source virtual worlds one of the most important future contributors to sustainable living. And, hopefully, Virtual HomeCamp will leverage both the collective intelligence of the web and the real time presence plus rapid prototyping capabilities unique to immersive 3D virtual worlds, to explore new ways to get energy awareness and energy saving to the masses in the short term as well as the long term.</p>
<p>And yes we will have to address the topic of those energy-hogging avatars!!!</p>
<p>Adam Frisby has been doing some interesting work with OpenSim that has the potential to reduce the energy consumption of VWs. And Michael Osias, IBM, told me:</p>
<p><em><strong>&#8220;We operate the IBM grid [100 OpenSims] on almost all virtual machines with Xen. Recently, we migrated the opensim appliance into the IBM Research cloud appliance catalog.&#8221;</strong></em></p>
<p>So I will definitely be calling on Michael and Adam to present on how server virtualization and cloud computing can reduce the carbon footprint of avatars.</p>
<h3>Setting Up Your Own Home Automation Hub</h3>
<p>There is an amazing choice of home automation technology becoming available now. Recently <a id="i0w2" title="Nokia announced their home automation ecosystem" href="http://www.engadget.com/2008/11/27/nokia-launching-z-wave-home-control-center-next-year/" target="_blank">Nokia announced their home automation ecosystem</a> &#8211; available in late 2009. And I recently saw <a id="sph0" title="The Apple Macintosh Z-Wave Home Automation System" href="http://www.automatedhome.co.uk/New-Products/Apple-Macintosh-Z-Wave-Home-Automation-System.html" target="_blank">The Apple Macintosh Z-Wave Home Automation System</a>. If you don&#8217;t already, start checking out <a href="http://www.automatedhome.co.uk/">Automated Home</a> for lots of good ideas and smart devices.</p>
<p>In the interview below, Andy describes how he achieves some impressive energy consumption reductions with very affordable, readily available hardware, a little detective work, and a tip from his son to examine the energy consumption of the home automation setup itself. And with the Really Small Message Broker now available as a free download from <a id="h0is" title="IBM AlphaWorks" href="http://alphaworks.ibm.com/tech/rsmb" target="_blank">IBM AlphaWorks</a>, IBM has given creative home automators a free way to broker and share their data and to integrate home automation in all the exciting ways we can come up with.</p>
<p>The pictures below (<a href="http://podcast.ubuntu-uk.org/2008/12/03/s01e19-love-letters/" target="_blank">see here for enlargements</a>) are the before and after shots of a streamlining effort Andy made on his own home automation setup.</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2008/12/img_9072-small.jpg"><img class="alignnone size-full wp-image-2416" title="img_9072-small" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2008/12/img_9072-small.jpg" alt="" width="200" height="150" /></a><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2008/12/img_9074-small.jpg"><img class="alignnone size-full wp-image-2417" title="img_9074-small" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2008/12/img_9074-small.jpg" alt="" width="200" height="150" /></a></p>
<p>Andy said:</p>
<p><em><strong>&#8220;I&#8217;ve moved my entire home automation system from the pile of equipment shown in the first photo, to a single Viglen MPC-L with a load of USB serial connections (second photo).</strong></em></p>
<p><em><strong> The pile of equipment I replaced is: a Cisco wireless access point, an IBM ThinkPad, a Linksys NSLU2 SLUG, an Arcom Viper, and an Arcom Field Sentry I/O box.<br />
</strong></em></p>
<p><em><strong>Moving to the Viglen and turning off all that lot, has replaced 50W of always-on standby power with 10W, i.e. 40W less, or about &#163;40 a year!&#8221;</strong></em></p>
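<p>Andy&#8217;s &#163;40-a-year figure is easy to sanity-check. The arithmetic below assumes a hypothetical flat tariff of about 11.4p per kWh; the tariff is my assumption, not a figure from the interview:</p>

```python
# Sanity-check of the quoted saving: a constant 40 W cut in standby load,
# over a full year, at an assumed tariff of ~11.4p/kWh (assumption, not
# a figure stated in the article).
standby_cut_w = 40
hours_per_year = 24 * 365                              # 8760 hours
annual_kwh = standby_cut_w * hours_per_year / 1000     # 350.4 kWh saved
tariff_gbp_per_kwh = 0.114                             # hypothetical tariff
annual_saving_gbp = annual_kwh * tariff_gbp_per_kwh
print(round(annual_saving_gbp))                        # prints 40
```

<p>So at late-2008 UK electricity prices the &#8220;40W less is about &#163;40 a year&#8221; rule of thumb holds up.</p>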
<p>See Chris Dalby&#8217;s post, <a href="http://www.yellowpark.net/cdalby/index.php/2008/12/15/viglen-mpc-l-useful-commands-and-tips/" target="_blank">Viglen MPC-L Useful Commands and Tips</a>.</p>
<h2>Interview With Andy Stanford-Clark</h2>
<p><strong>Tish Shute:</strong> I just got a good question for you from Gavin Starks AMEE, &#8220;if the Stern report is going to be out by 100% by 2020, and we have to start seeing an actual reduction of 10% per annum starting next year: What would you do, personally?&#8221; (See <a href="http://www.climatesafety.org" target="_blank">The Climate Safety</a> report, backed by IPCC).<br />
<strong><br />
Andy Stanford-Clark:</strong> Oh, man! Now you&#8217;re asking the tough questions!</p>
<p>We have to change attitudes, otherwise just a few people making a noise about this stuff isn&#8217;t going to make any significant difference &#8211; and the way to change attitudes is by starting to make people aware of just how much energy various things we have, and things we do, take. But it needs to be something in each person&#8217;s home, that&#8217;s not &#8220;in your face&#8221;&#8230; something more subtle &#8211; &#8220;ambient&#8221;&#8230; otherwise people reject it out of hand.</p>
<p>Also, people are suspicious of the power companies asking us to use less power: &#8220;what, give you less money?? Surely there&#8217;s a catch?&#8221; This is a real problem. Someone phoned one of the power companies here and accused them of sending her an energy monitor that would suck electricity out of the wall socket at night, to INCREASE her bill! If that&#8217;s the kind of thing we&#8217;re up against, it&#8217;s going to be a long journey!</p>
<p><strong>Tish:</strong> So what is the best way to change attitudes &#8211; have you seen projects like <a id="c.tc" title="Wattzon" href="http://www.wattzon.com/" target="_blank">Wattzon</a>?</p>
<p><strong>Andy SC:</strong> Yes, projects like Wattzon are exactly the kind of thing that start to make people realise the true cost of wasting energy.</p>
<p>Personally, my family has reduced our home electricity bill by 30%, which is great! But my neighbours didn&#8217;t, nor the other 4 billion or so people who have electricity.<br />
<strong><br />
Tish:</strong> How did you reduce your consumption so much?<br />
<strong><br />
Andy SC:</strong> We reduced our home electricity bill when we got a <a id="w57x" title="currentcost meter" href="http://www.automatedhome.co.uk/Announcements/Reduce-Your-Bills-with-Smart-Home-Power-Monitoring.html" target="_blank">currentcost meter</a> &#8211; a plug-in energy monitor which gives a total for the whole house. When we got it, it showed up a couple of things really quickly&#8230; that our &#8220;standby power&#8221; was really high (i.e. in the middle of the night, when everyone&#8217;s asleep, you creep up to the meter with a torch (flashlight &lt;grin&gt;) and see what it&#8217;s showing).</p>
<p>That was about 500 Watts before we started paying attention to it. The other thing was the lights.. I had no idea the lights in the kitchen used 480 Watts.. we just used to leave them on all the time when we were in the house. A simple change, once I realised: turn them off when you leave the room!</p>
<p>Our standby power was really high because I had a load of geeky home automation stuff running, and my first-generation, homebrew, energy monitoring solution (how ironic!)&#8230; which included 3 laptops doing various things (monitoring data and displaying information round the house). I just didn&#8217;t think about the cost.</p>
<p>So one weekend we went round the house making an inventory in each room of things that were on (the children were keen to help!). That enabled me to pretty much track down the whole 500 W&#8230; there were a few things that took some sleuthing, like the alarm system and the central heating controller. We used <a id="asuo" title="a plug-in meter" href="http://www.amazon.co.uk/gp/product/B000Q7PJGW?ie=UTF8&amp;tag=markmccall&amp;linkCode=as2&amp;camp=1634&amp;creative=19450&amp;creativeASIN=B000Q7PJGW" target="_blank">a plug-in meter</a> to see what individual appliances were using.. a really useful diagnostic aid.</p>
<p>It&#8217;s worth having a look at <a id="wjzg" title="AutomatedHome's review" href="http://www.automatedhome.co.uk/Announcements/Reduce-Your-Bills-with-Smart-Home-Power-Monitoring.html" target="_blank">AutomatedHome&#8217;s review</a> of these energy monitoring products, by the way.</p>
<p>So I turned off a load of things that were sitting there on standby.. things like stereo, microwave, scanner, Wii, power bricks&#8230; each taking 4-6 Watts just doing nothing &#8211; each one small, but it all adds up. The big hitters were the PCs&#8230; turned off 3 of those, and consolidated onto a low power (10W) <a id="ym7y" title="linux server (Viglen MPC)" href="http://www.viglen.co.uk/viglen/Products_Services/Product_Range/Product_file.aspx?eCode=XUBUMPCL&amp;Type_Info=Description&amp;Type=Desktops&amp;GUID=" target="_blank">Linux server (Viglen MPC-L)</a>&#8230; so that got our standby power down to 180 watts. And that, combined with being proactive about turning off lights, reduced our power usage from 900 kWh a month to 600&#8230; i.e. 30%, and it has been at that for 4 months now.</p>
<p><strong>Tish:</strong> Interesting that your home automation was one of the power issues as I am an aspiring home automator myself!</p>
<p><strong>Andy SC:</strong> Yes, you have to strike a balance of using energy to save energy, and make sure you know what your standby power is. There are a number of home energy monitors available &#8211; there&#8217;s a <a id="qy1h" title="review" href="http://www.automatedhome.co.uk/Announcements/Reduce-Your-Bills-with-Smart-Home-Power-Monitoring.html" target="_blank">review</a> on the AutomatedHome blog. The CurrentCost meter has a handy serial port so you can plug it into a computer to download history data or make it live on the internet.</p>
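<p>To give a feel for what using that serial port involves: CurrentCost meters emit periodic XML readings that you can parse on the computer they are plugged into. The exact schema varies by meter firmware, so the element names below (<code>msg</code>, <code>tmpr</code>, <code>ch1/watts</code>) are an assumed, simplified shape rather than the meter&#8217;s documented format:</p>

```python
# Parsing a CurrentCost-style XML reading, as it might arrive over the
# meter's serial port. The element names here are a simplified assumption;
# real firmware revisions use varying schemas.
import xml.etree.ElementTree as ET

sample = "<msg><tmpr>18.7</tmpr><ch1><watts>00480</watts></ch1></msg>"

def parse_reading(xml_line):
    """Extract whole-house watts and room temperature from one XML line."""
    root = ET.fromstring(xml_line)
    return {
        "watts": int(root.findtext("ch1/watts")),      # int() handles the zero-padding
        "temperature_c": float(root.findtext("tmpr")),
    }

print(parse_reading(sample))   # {'watts': 480, 'temperature_c': 18.7}
```

<p>A reading like this, published to a broker every few seconds, is exactly the kind of live data feed Andy goes on to describe.</p>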
<p><strong>Tish:</strong> That is interesting because it opens the door to having a social energy network, doesn&#8217;t it?</p>
<p><strong>Andy SC:</strong> Yes.. absolutely&#8230; you should watch <a id="i28f" title="my intro talk at homecamp" href="http://www.viddler.com/explore/andypiper/videos/21/" target="_blank">my intro talk at HomeCamp</a>! About 50 of us at IBM in the UK (and one in Australia!) have put our home energy graphs online using a currentcost meter plus a cheap low power Linux server like the Viglen MPC-L or Linksys NSLU2 (SLUG) type devices.</p>
<p>And a community has formed around the graphs (I described this in my HomeCamp talk at some point).. so people ask what&#8217;s that spike, or why&#8217;s yours so high in the morning, or how do you get your standby power so low.. and people talk about it and exchange ideas. There&#8217;s a facebook group (currentcost) too, with people talking about this.</p>
<p>And there&#8217;s some peer pressure too.. if my power is really high compared with everyone else, I feel bad about it and see what I can do to reduce it.. or if not reduce it, at least know why it&#8217;s high, and have been through a process to justify that to myself.</p>
<p><strong>Tish:</strong> You mentioned earlier that it was important to have ambient solutions, not &#8220;in your face&#8221; messages from Big Brother like &#8220;turn your lights off now!&#8221; What kind of &#8220;ambient&#8221; solutions have you been working on?</p>
<p><strong>Andy SC: </strong>Ok &#8211; <a id="ewgg" title="ambient" href="http://ambientdevices.com/cat/index.htm" target="_blank">ambient devices</a> &#8230; so an <a id="stq:" title="&quot;orb&quot; is a good example" href="http://ambientdevices.com/cat/orb/orborder.html" target="_blank">&#8220;orb&#8221; is a good example</a>.. wired up to the home automation system, or the energy monitor.. or maybe even controlled by the power company&#8230;</p>
<p>It glows different colours (e.g. blue through red, or red/amber/green) to tell me how &#8220;healthy&#8221; the house is from an energy point of view. So I don&#8217;t have to open a browser and pull up a geeky graph and analyse it.. it just lets me know subconsciously how we&#8217;re doing.</p>
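<p>The orb&#8217;s behaviour boils down to a threshold mapping from whole-house watts to a colour. A minimal sketch of that traffic-light logic, with made-up thresholds (the 300 W and 600 W cut-offs are illustrative, not Andy&#8217;s actual settings):</p>

```python
# Minimal sketch of an orb's red/amber/green logic: map whole-house power
# to a colour. Thresholds are invented for illustration; a real setup
# would tune them to the household's typical load.
def orb_colour(watts, green_below=300, amber_below=600):
    if watts < green_below:
        return "green"   # standby-only load, all is well
    if watts < amber_below:
        return "amber"   # something substantial is still on
    return "red"         # heavy load: go and find out what

print(orb_colour(120))   # green
print(orb_colour(480))   # amber (e.g. the 480 W kitchen lights are on)
print(orb_colour(900))   # red
```
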
<p><strong>Tish:</strong> But it doesn&#8217;t necessarily help you find out what your problem is, right?</p>
<p><strong>Andy SC:</strong> In our house, it&#8217;s in my study, so when I go to bed, for example, I glance in to see it, and if it&#8217;s green, all is good&#8230; but if it&#8217;s still amber or red(!), then I think.. hmm &#8211; what&#8217;s still on.. oh, the dishwasher.. ok &#8211; that will finish soon&#8230; or.. oh, I left the heater on .. I&#8217;ll go and turn it off.</p>
<p><strong>Tish:</strong> What do you have to help you troubleshoot the problem?</p>
<p><strong>Andy SC:</strong> If the orb doesn&#8217;t jog your memory, then you can pull up the graph to give more information, or a dashboard which shows various things that are turned on, both of which help with knowing what&#8217;s going on.</p>
<p><strong>Tish: </strong>And how to fix it?<strong></strong></p>
<p><strong>Andy SC:</strong> Yes, so if things are on X10 or other appliance control systems like <a id="rv3d" title="Bye Bye Standby" href="http://www.byebyestandby.co.uk/" target="_blank">Bye Bye Standby</a>, for example, and under computer control, then you can have a dashboard of what&#8217;s on so you can see.</p>
<p><strong>Tish:</strong> Good interfaces to home automation seem to be a problem yet to be solved?</p>
<p><strong>Andy SC:</strong> There&#8217;s at least one company which has technology to analyze your power usage (voltage and current together) to &#8220;learn&#8221; which appliance has which profile on the graph, so you can see what&#8217;s on that&#8217;s using lots of power, and also get a pie chart view of the whole house with slices showing different appliances &#8211; so many % for the TV, so many for the freezer, etc. That&#8217;s <a id="k2ca" title="Onzo.com" href="http://www.onzo.co.uk/" target="_blank">Onzo.com</a>. Their product isn&#8217;t out yet, but will give a much finer-grained understanding of what&#8217;s using the power in your home.</p>
<p>There are also some &#8220;IAM&#8217;s&#8221;.. Individual Appliance Monitors, which are like the plug-in meter I showed you, but with a (usually wireless) link back to a base station to tell you how much power is flowing through each of them. So by knowing what appliances you plugged into your IAMs round the house, you can break out the usage by appliance. And if they&#8217;re 2-way, which some of them will be, you can have the computer turn them off if you tell it, say from the web, or your mobile phone, etc. Or maybe the home automation system will make an autonomous decision to turn it off for you!</p>
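<p>The per-appliance breakdown that IAMs make possible is just a share calculation over the watts each monitor reports. A sketch, with invented appliance names and readings rather than measured data:</p>

```python
# Sketch of the pie-chart breakdown IAMs enable: given the watts reported
# by each Individual Appliance Monitor, compute each appliance's share of
# the total. The readings below are illustrative, not measured data.
def shares(readings_w):
    total = sum(readings_w.values())
    return {name: round(100 * w / total, 1) for name, w in readings_w.items()}

iam_readings = {"fridge": 100, "tv": 60, "router": 20, "heater": 820}
print(shares(iam_readings))
# {'fridge': 10.0, 'tv': 6.0, 'router': 2.0, 'heater': 82.0}
```

<p>With a 2-way IAM, the same reading that feeds this breakdown can also drive the &#8220;turn it off remotely&#8221; decision Andy mentions.</p>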
<p>Back to interfaces to home automation: there are two typical approaches &#8211; PLC (power line carrier) like X10, and wireless (like Bye Bye Standby, etc)&#8230; there are computer interfaces to both, but it&#8217;s all still quite expensive (in the UK at least &#8211; cheaper in the US because X10 is more ubiquitous)&#8230; but the cheaper ones don&#8217;t tell you that they definitely turned the device on or off &#8211; all you know is that the command was sent out. It might not have got there, so you don&#8217;t <em>really</em> know if the heater got turned off.. unless you monitor it by some secondary means, like seeing if the temperature goes down, or if the power usage goes down, or (for a light) if the room goes dark, or whatever.</p>
<p>BTW, my standby is now down to 120 watts.</p>
<p><strong>Tish:</strong> Yes!</p>
<p><strong>Andy SC:</strong> I consolidated some more home automation stuff into one device.. there are two photos on <a id="i-2g" title="this page" href="http://podcast.ubuntu-uk.org/2008/12/03/s01e19-love-letters/" target="_blank">this page</a> &#8211; my &#8220;before&#8221; and &#8220;after&#8221; shots. It gets a mention in the podcast. They did a promotion on the low power Viglen servers.. &#163;80 instead of &#163;150&#8230; bargain! Loads of people have bought them for home automation.. you can&#8217;t have failed to see the #viglen references on twitter over the past few months!</p>
<p><strong>Tish:</strong> I think there is a lot of enthusiasm for virtual worlds as a good interface for home automation. But we need to come up with something simple enough for everyone?</p>
<p><strong>Andy SC:</strong> Yes, virtual worlds are very interesting.. though let&#8217;s not mention the carbon cost of running a VW!</p>
<p>So you know already, I think, that I can control my home automation stuff from SL&#8230; if I turn on my lights in SL, my FL (first life, i.e. here!) lights turn on, and also meter reading.. my live electricity and water meter readings are displayed on virtual meters in my virtual house so the meter reader doesn&#8217;t even need to drive to my house &lt;grin&gt; and the orb is there too, so I can see how healthy the house is, energy-wise, in-world.</p>
<p>Imagine a row of houses each glowing blue through red according to its power use &#8211; peer pressure again. If you have local generation.. the power hogs could be made to feel guilty for using all the town&#8217;s energy from the wind farm or gas turbine generator.</p>
<p><strong>Tish:</strong> So everyone would see if you have a Bad House eeek!</p>
<p><strong>Andy SC: </strong>right!</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2008/12/slmeterpost.jpg"><img class="alignnone size-full wp-image-2410" title="slmeterpost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2008/12/slmeterpost.jpg" alt="" width="450" height="332" /></a></p>
<p><span class="entry-content"> <em>The picture above shows Andy Stanford-Clark&#8217;s electricity meter in Second Life. </em></span></p>
<p><strong>Tish:</strong> Yes and the great thing about a VW is you get a sense of confidence your controls are working and how to adjust them. But yes the carbon cost is one of the obstacles.</p>
<p>Do you think the power hogging sims of Virtual Worlds could be improved by server virtualization techniques and clouds &#8211; I know there is an IBMer here in US who is working on server virtualization integrated into OpenSim?</p>
<p><strong><br />
Andy SC:</strong> Yes, cloud technologies have a lot to offer in terms of making best use of a set of machines to run a set of applications, rather than one machine per application as often tends to be the case.</p>
<p>And with dynamic load balancing, like we do for our sporting event on-demand server farms for things like Wimbledon, as the load ramps up, we squeeze out the other apps that are using the farm to give extra capacity (as Wimbledon takes priority in that instance!).</p>
<p>But there was a popular statistic when SL became really popular &#8211; over a year ago now, that was something like to have an avatar in SL for a year was the same carbon footprint as driving an SUV from NY to SF or something &#8211; don&#8217;t quote me on that till we check it &#8230; <a id="ymnc" title="here it is" href="http://www.roughtype.com/archives/2006/12/avatars_consume.php" target="_blank">here it is</a> &#8211; 2000 miles</p>
<p><strong>Tish:</strong> Yes I remember <a href="http://www.ugotrade.com/2008/06/27/ibms-virtual-wimbledon-web-rendering-in-second-life/" target="_blank">Judge telling me about some of the interesting load balancing you do at Wimbledon</a>.</p>
<p>Many of my friends are thinking ahead to AR solutions now too.<br />
<strong></strong></p>
<p><strong>Andy SC:</strong> Yeah &#8211; AR very interesting too.. you have to read Halting State by Charles Stross</p>
<p><strong>Tish:</strong> Yes loved it!</p>
<p><strong>Andy SC:</strong> So &#8220;Halting State is to 15 years&#8217; time as SnowCrash was to NOW, 15 years ago&#8221;</p>
<p>SnowCrash is effectively a history book now.</p>
<p>Yeah, I think AR with glasses and overlays is going to be really cool! In cars too.. heads up satnav..</p>
<p><strong>Tish:</strong> Also could you tell me the role of the messaging technology you developed in all this?<br />
<strong><br />
Andy SC: </strong><a id="g.i:" title="using MQTT" href="http://mqtt.org/" target="_blank">using MQTT</a> of course.. which is the area I have been working on with my team for the past 10 years: the IBM messaging technology which underpins all this cool stuff we&#8217;re doing for home automation, energy monitoring, inter-world messaging.. all that stuff.. all using MQTT and WebSphere messaging technology.</p>
<p><strong>Tish:</strong> I would be interested to know more about how you see VR and AR with what we have available today producing a cool interface for home automation that could get some mass traction.</p>
<p><strong>Andy SC:</strong> So I think the AR/VR thing.. at the moment, far too few people are using these technologies.. we need to get energy awareness and energy saving to the masses (looping back round to the original Gavin Starks question!)&#8230; and saying &#8220;you can reduce energy by interacting in a virtual 3D world&#8221; just isn&#8217;t going to cut it for all but a very small fraction of the people we need to get to.</p>
<p><strong>Tish: </strong>Yes in basic ambient ways first. How does the messaging technology you have developed open up possibilities for leveraging network effects and creating new forms of participatory culture around reducing consumption?</p>
<p><strong>Andy SC:</strong> It is important because the messaging allows the real-time interaction that can be used to give dynamic feedback, and it&#8217;s that immediacy which makes people react to changes.</p>
<p>And, with MQTT and RSMB &#8211; Really Small Message Broker, which is now available free on <a id="h0is" title="IBM AlphaWorks" href="http://alphaworks.ibm.com/tech/rsmb" target="_blank">IBM AlphaWorks</a> for anyone to download and play with, lots of people can start playing with home energy monitoring, social aspects of the data sharing, home automation, ambient displays, etc. without having to worry about how to get the messages from A to B.. that bit&#8217;s done for you.. you can just focus on the interesting stuff. Folks at HomeCamp got quite excited about it! And for those who care (e.g. if you want to link your home in to infrastructure like the power company or distributed building management, or whatever) then the MQTT and RSMB technology is compatible with IBM&#8217;s WebSphere enterprise messaging products, and so can link right in.</p>
<p><strong>Tish:</strong> So people could use this to build some interfaces with projects like AMEE, say? For example, letting you know when your light bulb went out and which was the most energy-efficient one to replace it with?</p>
<p><strong>Andy SC:</strong> Yes, indeed.. was talking to <a href="http://www.pachube.com/" target="_blank">Pachube</a> this morning, as another example.</p>
<p><strong>Tish:</strong> What did you discuss with Pachube?</p>
<p><strong>Andy SC:</strong> using MQTT as the feed to get EML data into and out of Pachube rather than over HTTP. That&#8217;s interesting because MQTT is a much more lightweight protocol, designed for small sensors and low bandwidth / expensive (e.g. cellular) networks&#8230; and it&#8217;s also true push.. i.e. data is pushed to you directly from the broker (the hub in the middle), rather than you having to ask for it constantly (polling). It is an easy way to interface existing MQTT/RSMB home automation or energy monitoring systems into Pachube, and it&#8217;s scalable publish/subscribe.. so one data feed in, many data feeds out. This opens up lots of new possibilities for Pachube feeds. <a id="knkj" title="Pachube" href="http://www.pachube.com/feeds/1214" target="_blank">Here is one Pachube feed coming from MQTT.</a></p>
<p><strong>Tish:</strong> Ah yes, no polling! That is a killer in HTTP</p>
<p><strong>Andy SC:</strong> Absolutely!!!</p>
<p><strong>Tish:</strong> And other examples of interfaces using MQTT with potential applications in the sustainability area are &#8230;<br />
<strong><br />
Andy SC:</strong> The power graphs (as described in my talk) are a good example. Also when people start generating their own power with PV or wind, they&#8217;ll want to monitor the contribution their power plant is making to their power usage, and compare it with spot prices on the grid, weather data, etc, etc. These are exactly the kinds of data feeds that MQTT is great for.</p>
<p><strong>Tish: </strong>As you said, the most important aspect of MQTT is that it frees people up from having to worry about getting messages from A to B so they can &#8220;start playing with home energy monitoring, social aspects of the data sharing, home automation, ambient displays, etc.&#8230; how to capture the data.. and what to do with it when it gets to the other end of the comms link.&#8221;</p>
<p><strong>Andy SC: </strong>Yes, exactly &#8211; the incremental cost of adding new devices and applications is very low, once you&#8217;ve got the messaging infrastructure in place. So once you&#8217;ve got your home RSMB hub set up, it becomes easy to integrate new data sources and play with new applications which use that data in interesting ways!</p>
<p>I&#8217;m fascinated by the social aspects of energy saving &#8211; the way communities have formed around the graphs we&#8217;re generating from the currentcost data. I&#8217;m sure that&#8217;s only the tip of an iceberg &#8211; it&#8217;s still quite geeky, but if you start to bring in some kind of gaming or competitive element, then I think harnessing the peer pressure and competitive spirit in people will be a powerful way to encourage change in people&#8217;s energy-using habits.</p>
<p>Ambient displays are another area of interest &#8211; the orb is just one way of doing it. Using twitter to keep you ambiently aware of what&#8217;s going on is another, and there are other media like sound and images, which can tell you things in a subtle way. Lots of scope for more experiments <img src="http://www.ugotrade.com/wordpress/wp-includes/images/smilies/icon_smile.gif" alt=":)" class="wp-smiley" /> </p>
]]></content:encoded>
			<wfw:commentRss>http://www.ugotrade.com/2008/12/15/smart-planetinterview-with-andy-stanford-clark/feed/</wfw:commentRss>
		<slash:comments>15</slash:comments>
		</item>
	</channel>
</rss>
