<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>UgoTrade &#187; Open Source Virtual Worlds</title>
	<atom:link href="http://www.ugotrade.com/category/virtual-realities/open-source-virtual-worlds/feed/" rel="self" type="application/rss+xml" />
	<link>http://www.ugotrade.com</link>
	<description>Augmented Realities at the Edge of the Network</description>
	<lastBuildDate>Wed, 25 May 2016 15:59:56 +0000</lastBuildDate>
	<language>en-US</language>
		<sy:updatePeriod>hourly</sy:updatePeriod>
		<sy:updateFrequency>1</sy:updateFrequency>
	<generator>https://wordpress.org/?v=3.9.40</generator>
	<item>
		<title>Augmented World Expo 2013:  It&#8217;s a wrap!</title>
		<link>http://www.ugotrade.com/2013/07/09/augmented-world-expo-2013-its-a-wrap/</link>
		<comments>http://www.ugotrade.com/2013/07/09/augmented-world-expo-2013-its-a-wrap/#comments</comments>
		<pubDate>Tue, 09 Jul 2013 03:38:56 +0000</pubDate>
		<dc:creator><![CDATA[Tish Shute]]></dc:creator>
				<category><![CDATA[Ambient Devices]]></category>
		<category><![CDATA[Ambient Displays]]></category>
		<category><![CDATA[Ambient Findability]]></category>
		<category><![CDATA[Android]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Augmented Data]]></category>
		<category><![CDATA[Augmented Reality]]></category>
		<category><![CDATA[Big Data]]></category>
		<category><![CDATA[data science]]></category>
		<category><![CDATA[digital public space]]></category>
		<category><![CDATA[GeoFencing]]></category>
		<category><![CDATA[gestural interface]]></category>
		<category><![CDATA[home energy monitors]]></category>
		<category><![CDATA[Instrumenting the World]]></category>
		<category><![CDATA[internet of things]]></category>
		<category><![CDATA[Linden Lab]]></category>
		<category><![CDATA[Linked Data]]></category>
		<category><![CDATA[mirror worlds]]></category>
		<category><![CDATA[Mixed Reality]]></category>
		<category><![CDATA[mobile augmented reality]]></category>
		<category><![CDATA[Mobile Reality]]></category>
		<category><![CDATA[Mobile Technology]]></category>
		<category><![CDATA[nanotechnology]]></category>
		<category><![CDATA[new urbanism]]></category>
		<category><![CDATA[Participatory Culture]]></category>
		<category><![CDATA[Philip Rosedale]]></category>
		<category><![CDATA[social gaming]]></category>
		<category><![CDATA[social media]]></category>
		<category><![CDATA[Amber Case]]></category>
		<category><![CDATA[augmented reality eyewear]]></category>
		<category><![CDATA[Augmented World Expo]]></category>
		<category><![CDATA[AWE2013]]></category>
		<category><![CDATA[Ben Cerveny]]></category>
		<category><![CDATA[connected hardware]]></category>
		<category><![CDATA[gesture interaction]]></category>
		<category><![CDATA[Google Glass]]></category>
		<category><![CDATA[hardware startups]]></category>
		<category><![CDATA[Mike Kuniavsky]]></category>
		<category><![CDATA[Ori Inbar]]></category>
		<category><![CDATA[Steve Mann]]></category>
		<category><![CDATA[Tish Shute]]></category>
		<category><![CDATA[wearables]]></category>
		<category><![CDATA[Will Wright]]></category>

		<guid isPermaLink="false">http://www.ugotrade.com/?p=6600</guid>
		<description><![CDATA[Augmented World Expo 2013 was really an amazing experience. I&#8217;m co-founder and co-organizer of the conference, along with Ori Inbar, so it has meant a lot to me to see our event grow over the last four years, and thrilling to make such a big splash this year. There were 1,163 attendees, and the expo [&#8230;]]]></description>
				<content:encoded><![CDATA[<p><iframe width="560" height="315" src="//www.youtube.com/embed/4d0k_7pdPGg" frameborder="0" allowfullscreen></iframe></p>
<p><iframe width="560" height="315" src="//www.youtube.com/embed/NQ-g0Jimg7I" frameborder="0" allowfullscreen></iframe></p>
<p><iframe width="560" height="315" src="//www.youtube.com/embed/9GxVQREssdY" frameborder="0" allowfullscreen></iframe></p>
<p><a href="http://augmentedworldexpo.com/" target="_blank">Augmented World Expo 2013</a> was really an amazing experience.  I&#8217;m co-founder and co-organizer of the conference, along with Ori Inbar, so it has meant a lot to me to see our event grow over the last four years, and it was thrilling to make such a big splash this year. There were 1,163 attendees, and the expo showcased an ecosystem of emerging technologies &#8211; augmented reality, gesture interaction, eyewear, wearables, and connected hardware of many stripes &#8211; that mark the beginning of natural computing entering the mainstream.  It was a unique opportunity to get up close and personal with what it feels like to be an augmented human in an augmented world! </p>
<p>Videos of AWE 2013&#8217;s 35 hours of educational sessions and inspirational keynotes are now available on <strong><a href="http://www.youtube.com/user/AugmentedRealityOrg/videos?view=0&amp;shelf_index=0&amp;sort=dd&amp;tag_id=" target="_self">our YouTube channel</a></strong>.  I am sharing <a href="http://www.youtube.com/watch?v=9GxVQREssdY">my own talk</a> (my slides are also up <a href="http://www.slideshare.net/TishShute/augmented-humansaugmentedworld">on slideshare here</a>), and a few of my favorites in this post, but there are far too many to post here, so please browse further on the Augmented World Expo YouTube channel.</p>
<p>One notable high point of AWE2013, for me, was the showcase sponsored by <a href="http://www.meta-view.com/about">Meta</a>, a startup developing the first device allowing visualization and interaction with 3D virtual objects in the real world using your hands.  It was made possible by generous contributions from the private collections of Paul Travers, Dan Cui, Steven Feiner, Steve Mann, and Chris Grayson, and by passionate volunteers who are helping advance the industry.  Sean Hollister of The Verge did an excellent report on the eyewear showcase: <a href="http://www.theverge.com/2013/6/9/4409940/35-years-of-wearable-computing-history-at-augmented-world-expo-2013">35 years of wearable computing history at Augmented World Expo 2013</a>.  For more on Meta, see <a href="http://news.cnet.com/8301-11386_3-57584739-76/meta-glasses-bring-3d-and-your-hands-into-the-picture/">this article by Dan Farber</a>.</p>
<p>My colleagues at <a href="http://www.syntertainment.com/">Syntertainment</a> &#8211; Will Wright, Avi Bar-Zeev, Jason Shankel, and Lauren Elliott &#8211; all gave great talks.  Ironically, we&#8217;re not building augmented reality apps or hardware; we all just happen to remain very interested in the field.</p>
<p>Thank you to everyone for supporting the event! </p>
<p>The press coverage was truly extensive:</p>
<p style="text-align: left;"><a href="http://www.theverge.com/2013/6/9/4410406/in-the-shadow-of-google-glass-at-augmented-world-expo-2013">In the shadow of Google Glass, an augmented reality industry revs its engines<br />
</a>The Verge, Sean Hollister, June 9, 2013, <a href="http://topsy.com/www.theverge.com/2013/6/9/4410406/in-the-shadow-of-google-glass-at-augmented-world-expo-2013">271 Tweets</a></p>
<p><a href="http://news.cnet.com/8301-11386_3-57588128-76/the-next-big-thing-in-tech-augmented-reality/">The next big thing in tech: Augmented reality<br />
</a>CNET, Dan Farber, June 7, 2013<br />
Picked up on <a href="http://currentnewsdaily.com/the-next-big-thing-in-tech-augmented-reality/">Current News Daily<br />
</a><a href="http://topsy.com/news.cnet.com/8301-11386_3-57588128-76/the-next-big-thing-in-tech-augmented-reality/">350 Tweets</a></p>
<p><a href="http://thepersuaders.libsyn.com/awe-2013-conference-report-augmented-reality-and-marketing">AWE 2013 Conference Report: Augmented Reality and Marketing<br />
</a>The Persuaders Marketing Podcast on Dublin City FM, June 23, 2013</p>
<p><a title="AR Dirt Podcast &#8211; Episode 26 &#8211; Ori Inbar AWE2013 Extravaganza Recap" rel="bookmark" href="http://www.ardirt.com/general-news/ar-dirt-podcast-episode-26-ori-inbar-awe2013-extravaganza-recap.html">AR Dirt Podcast &#8211; Ori Inbar AWE2013 Extravaganza Recap<br />
</a>AR Dirt by Joseph Rampolla, June 18, 2013</p>
<p><a href="http://www.theverge.com/2013/6/9/4409940/35-years-of-wearable-computing-history-at-augmented-world-expo-2013">35 years of wearable computing history at Augmented World Expo 2013<br />
</a>The Verge, Sean Hollister, June 9, 2013<br />
<a href="http://topsy.com/www.theverge.com/2013/6/9/4409940/35-years-of-wearable-computing-history-at-augmented-world-expo-2013">7 Tweets</a></p>
<p><a href="http://www.wired.com/beyond_the_beyond/2013/06/augmented-reality-bruce-sterling-keynote-at-augmented-world-expo-2013/">Augmented Reality: Bruce Sterling, keynote at Augmented World Expo 2013<br />
</a>Wired, Bruce Sterling, June 9, 2013<br />
<a href="http://topsy.com/www.wired.com/beyond_the_beyond/2013/06/augmented-reality-bruce-sterling-keynote-at-augmented-world-expo-2013/">9 Tweets</a></p>
<p><a href="http://doc-ok.org/?p=598">On the road for VR: Augmented World Expo 2013<br />
</a>Doc-Ok, Staff, June 7, 2013<br />
<a href="http://topsy.com/trackback?url=http%3A%2F%2Fdoc-ok.org%2F%3Fp%3D598">3 Tweets</a></p>
<p><a href="http://www.wassom.com/my-interview-from-augmented-world-expo-2013-video.html">My Interview from Augmented World Expo 2013 [VIDEO] </a><a href="http://wassom.com/">Wassom.com</a>, Brian Wassom, June 7, 2013</p>
<p><a href="http://zenfri.com/2013/06/augmented-world-expo/">Augmented World Expo</a><br />
ZenFri, Staff, June 7, 2013</p>
<p><a href="http://www.fbnsantos.com/?p=9634">AWE2013: Hardware for an augmented world</a><br />
FBNSantos.com, Felipe Neves Dos Santos, June 6, 2013</p>
<p><a href="http://investorplace.com/2013/06/augmented-reality-will-be-the-new-reality/">Augmented Reality Will Be the New Reality</a><br />
InvestorPlace, Brad Moon, June 6, 2013</p>
<p><a href="http://www.techhive.com/article/2040837/wearable-computing-pioneer-steve-mann-who-watches-the-watchmen-.html">Wearable computing pioneer Steve Mann: Who watches the watchmen?</a><br />
TechHive, Armando Rodriguez, June 6, 2013</p>
<p><a href="http://abclocal.go.com/kgo/video?id=9127769">Expo puts augmented reality in the limelight</a><br />
ABC 7 News, Jonathan Bloom, June 5, 2013</p>
<p><a href="http://www.dvice.com/2013-6-5/these-oled-microdisplays-are-future-augmented-reality">These OLED microdisplays are the future of augmented reality</a><br />
DVICE, Evan Ackerman, June 5, 2013</p>
<p><a href="http://www.engadget.com/2013/06/05/visualized-history-of-augmented-and-virtual-reality-eyewear/?utm_medium=feed&amp;utm_source=Feed_Classic&amp;utm_campaign=Engadget">Visualized: a history of augmented and virtual reality eyewear</a><br />
Engadget, Michael Gorman, June 5, 2013</p>
<p><a href="http://www.papitv.com/wikitude-announces-wikitude-studio-and-in-house-developed-ir-tracking-engine">Wikitude announces Wikitude Studio and in-house developed IR &amp; Tracking engine</a><br />
PapiTV, KC Leung, June 5, 2013</p>
<p><a href="http://www.usatoday.com/story/tech/personal/2013/06/05/augmented-reality-expo-google-glass/2392769/">Augmented reality expo aims for sci-fi future today</a><br />
USA Today, Marco della Cava, June 5, 2013</p>
<p><a href="http://www.wired.com/beyond_the_beyond/2013/06/augmented-reality-high-dynamic-range-hdr-video-image-processing-for-digital-glass/">Augmented Reality: High Dynamic Range (HDR) Video Image Processing For Digital Glass</a><br />
Wired, Bruce Sterling, June 5, 2013</p>
<p><a href="http://allthingsd.com/20130604/will-wright-at-augmented-reality-conference-dont-augment-reality-decimate-it/">Will Wright at Augmented Reality Conference: Don&#8217;t Augment Reality, Decimate It</a><br />
AllThingsD, Eric Johnson, June 4, 2013</p>
<p><a href="http://news.cnet.com/8301-11386_3-57587672-76/philip-rosedales-second-life-with-high-fidelity/">Philip Rosedale&#8217;s Second Life with High Fidelity</a><br />
CNET, Dan Farber, June 4, 2013</p>
<p><a href="http://www.pcworld.com/article/2040801/google-glass-competitors-vie-for-attention-as-industry-grows.html">Google Glass competitors vie for attention as industry grows</a><br />
PC World, Zack Miners for IDG News Service, June 4, 2013</p>
<p><a href="http://daqri.com/press_posts/press-release-4d-augmented-reality-leader-daqri-announces-15-million-financing-2/#.Ua-RjNhNuSo">4D Augmented Reality Leader Daqri Announces $15 Million Financing</a><br />
Press Release, June 4, 2013</p>
<p><a href="http://www.techzone360.com/topics/techzone/articles/2013/06/03/340432-crowdoptic-powers-lancome-virtual-gallery-app-crowd-powered.htm">CrowdOptic Powers Lancome Virtual Gallery App, Crowd-powered Heat Map</a><br />
TechZone 360, Peter Bernstein, June 3, 2013</p>
<p><a href="http://www.craveculture.net/2013/06/augmented-humans-now/">Augmented humans, enhanced happiness?</a><br />
Crave Culture, Angelica Weihs, June 2, 2013</p>
<p><a href="http://www.metaio.com/press/press-release/2013/metaio-vuzix-to-showcase-ar-ready-smart-glasses-at-the-2013-augmented-world-expo/">Metaio &amp; Vuzix to Showcase AR-Ready Smart Glasses at the 2013 Augmented World Expo</a><br />
Press Release, May 30, 2013</p>
<p><a href="http://qz.com/89467/four-ways-augmented-reality-will-invade-your-life-in-2013/">Four ways augmented reality will invade your life in 2013</a><br />
Quartz, Rachel Feltman, May 30, 2013</p>
<p><a href="http://www.wired.com/beyond_the_beyond/2013/05/augmented-reality-augmented-world-expo-is-next-week/">Augmented Reality: Augmented World Expo&#8482; is next week</a><br />
Wired, Bruce Sterling, May 28, 2013</p>
<p><a href="http://www.prweb.com/releases/candy-lab/augmented-reality/prweb10763283.htm">Strike it Rich with Cachetown and AWE 2013 Playing the Gold Rush 49&#8217;er Challenge In Augmented Reality</a><br />
Press Release, May 24, 2013</p>
<p><a href="http://interact.stltoday.com/pr/lifestyle/PR052413071613074">Local Community College Student Headed to Silicon Valley to Learn More about Augmented Reality</a><br />
St. Louis Post-Dispatch, Staff, May 24, 2013</p>
<p><a href="http://www.cnet.com.au/explore-an-intricate-labyrinth-with-smartphone-ar-339344350.htm">Explore an intricate labyrinth with smartphone AR</a><br />
CNET Australia, Michelle Starr, May 21, 2013</p>
<p><a href="http://thechronicleherald.ca/business/1130672-dartmouth-firm-lands-super-app">Dartmouth firm lands super app</a><br />
Herald Business, Remo Zaccagna, May 21, 2013</p>
<p><a href="http://siliconangle.com/blog/2013/05/17/augmented-world-expo-2013-the-future-of-augmented-reality/">Augmented World Expo 2013 &#8211; The Future of Augmented Reality</a><br />
Silicon Angle, Saroj Kar, May 17, 2013</p>
<p><iframe width="560" height="315" src="//www.youtube.com/embed/o6L3dcsLEto" frameborder="0" allowfullscreen></iframe></p>
<p><iframe width="560" height="315" src="//www.youtube.com/embed/FhLx7k07Pa4" frameborder="0" allowfullscreen></iframe></p>
<p><iframe width="560" height="315" src="//www.youtube.com/embed/ON7VUzsNcYI" frameborder="0" allowfullscreen></iframe></p>
<p><iframe width="560" height="315" src="//www.youtube.com/embed/qhVdTFcR6TA" frameborder="0" allowfullscreen></iframe></p>
<p><iframe width="560" height="315" src="//www.youtube.com/embed/REoEj-JkDww" frameborder="0" allowfullscreen></iframe></p>
<p><iframe width="560" height="315" src="//www.youtube.com/embed/ohatuq8tekk" frameborder="0" allowfullscreen></iframe></p>
]]></content:encoded>
			<wfw:commentRss>http://www.ugotrade.com/2013/07/09/augmented-world-expo-2013-its-a-wrap/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>Mobile Augmented Reality and Mirror Worlds: Talking with Blair MacIntyre</title>
		<link>http://www.ugotrade.com/2009/06/12/mobile-augmented-reality-and-mirror-worlds-talking-with-blair-macintyre/</link>
		<comments>http://www.ugotrade.com/2009/06/12/mobile-augmented-reality-and-mirror-worlds-talking-with-blair-macintyre/#comments</comments>
		<pubDate>Fri, 12 Jun 2009 05:07:01 +0000</pubDate>
		<dc:creator><![CDATA[Tish Shute]]></dc:creator>
				<category><![CDATA[Ambient Devices]]></category>
		<category><![CDATA[Ambient Displays]]></category>
		<category><![CDATA[Augmented Reality]]></category>
		<category><![CDATA[Instrumenting the World]]></category>
		<category><![CDATA[internet of things]]></category>
		<category><![CDATA[mirror worlds]]></category>
		<category><![CDATA[Mixed Reality]]></category>
		<category><![CDATA[MMOGs]]></category>
		<category><![CDATA[mobile meets social]]></category>
		<category><![CDATA[new urbanism]]></category>
		<category><![CDATA[online privacy]]></category>
		<category><![CDATA[Smart Planet]]></category>
		<category><![CDATA[social gaming]]></category>
		<category><![CDATA[social media]]></category>
		<category><![CDATA[ubiquitous computing]]></category>
		<category><![CDATA[Virtual Realities]]></category>
		<category><![CDATA[Virtual Worlds]]></category>
		<category><![CDATA[Web 2.0]]></category>
		<category><![CDATA[Web Meets World]]></category>
		<category><![CDATA[3D mirror world]]></category>
		<category><![CDATA[Android]]></category>
		<category><![CDATA[Android and augmented reality]]></category>
		<category><![CDATA[ARhrrrr]]></category>
		<category><![CDATA[Art of Defense]]></category>
		<category><![CDATA[augmented reality on the gphone]]></category>
		<category><![CDATA[augmented reality on the iphone]]></category>
		<category><![CDATA[augmented reality shooter games]]></category>
		<category><![CDATA[Aware Home Research]]></category>
		<category><![CDATA[Blair Macintyre]]></category>
		<category><![CDATA[Bragfish]]></category>
		<category><![CDATA[Dark Star]]></category>
		<category><![CDATA[geolocation]]></category>
		<category><![CDATA[geotagging]]></category>
		<category><![CDATA[google earth]]></category>
		<category><![CDATA[handheld AR games]]></category>
		<category><![CDATA[handheld augmented reality]]></category>
		<category><![CDATA[Immersive augmented reality]]></category>
		<category><![CDATA[Information Landscapes]]></category>
		<category><![CDATA[instrumented homes]]></category>
		<category><![CDATA[instrumented world]]></category>
		<category><![CDATA[iphone 3Gs]]></category>
		<category><![CDATA[iphone games]]></category>
		<category><![CDATA[ISMAR]]></category>
		<category><![CDATA[ISMAR 2009]]></category>
		<category><![CDATA[location aware applications]]></category>
		<category><![CDATA[minimally immersive augmented reality]]></category>
		<category><![CDATA[MMO of the real world]]></category>
		<category><![CDATA[mobile augmented reality]]></category>
		<category><![CDATA[MS Virtual Earth]]></category>
		<category><![CDATA[NVidia Tegra devkits]]></category>
		<category><![CDATA[Open Sim]]></category>
		<category><![CDATA[OpenSim and Augmented Reality]]></category>
		<category><![CDATA[Ori Inbar]]></category>
		<category><![CDATA[outdoor tracking and markerless AR]]></category>
		<category><![CDATA[parallel mirror worlds]]></category>
		<category><![CDATA[persistent immersive mirror worlds]]></category>
		<category><![CDATA[photosynth]]></category>
		<category><![CDATA[Sun's Wonderland]]></category>
		<category><![CDATA[Texas Instrument's OMAP3 devkits]]></category>
		<category><![CDATA[the shape of alpha]]></category>
		<category><![CDATA[ubicomp]]></category>
		<category><![CDATA[Unity3D]]></category>
		<category><![CDATA[Unity3D and Augmented Reality]]></category>
		<category><![CDATA[virtual pets]]></category>
		<category><![CDATA[Wikitude]]></category>

		<guid isPermaLink="false">http://www.ugotrade.com/?p=3691</guid>
		<description><![CDATA[Blair MacIntyre is one of the original pioneers of augmented reality and an extraordinary amount of creative work is coming out of his Augmented Environments Laboratory at Georgia Tech &#8211; see YouTube videos here. The screenshot below is from ARhrrrr, a very impressive augmented reality shooter game created at Georgia Tech Augmented Environments Lab and [&#8230;]]]></description>
				<content:encoded><![CDATA[<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/06/arf.jpg"></a></p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/06/arf2.jpg"><img class="alignnone size-full wp-image-3732" title="arf2" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/06/arf2.jpg" alt="arf2" width="259" height="239" /></a><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/06/droppedimage1.jpg"><img class="alignnone size-full wp-image-3725" title="droppedimage1" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/06/droppedimage1.jpg" alt="droppedimage1" width="271" height="240" /></a></p>
<p><a href="http://www.cc.gatech.edu/~blair/home.html" target="_blank">Blair MacIntyre</a> is one of the original pioneers of augmented reality, and an extraordinary amount of creative work is coming out of his <a href="http://www.cc.gatech.edu/ael/" target="_blank">Augmented Environments Laboratory</a> at Georgia Tech &#8211; see <a href="http://www.youtube.com/user/AELatGT" target="_blank">YouTube videos here</a>. The screenshot below is from <strong>ARhrrrr</strong>, a very impressive augmented reality shooter game created at the Georgia Tech <span class="description">Augmented Environments Lab</span> and the <span class="description">Savannah College of Art and Design</span> (SCAD-Atlanta), and produced on the <strong>NVidia Tegra devkits</strong> &#8211; <a href="http://www.youtube.com/watch?v=cNu4CluFOcw" target="_blank">watch the demo here</a>.</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/06/picture-63.png"><img class="alignnone size-medium wp-image-3799" title="picture-63" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/06/picture-63-300x169.png" alt="picture-63" width="300" height="169" /></a></p>
<p>Blair has spent much of his career working on immersive augmented reality and more recently the integration of augmented reality with mirror worlds. Blair explains:</p>
<p><strong>&#8220;I am interested in the intersection of mobile devices &#8211; whether they are head mounts or handhelds &#8211; and parallel mirror worlds&#8230; I think that parallel mirror worlds are a direct manifestation of the intersection of the virtual world we now live in (the web) and geotagging. As more and more information is tied to place, and as more of our searching becomes place-based, we will want to do those searches about places we are not at. A 3D mirror world may provide one interface to that data. Want to plan your trip to London? Go there virtually and look around, see what is there (both physically and virtually), teleport between areas you want to learn about, and so on. More interestingly, talk to people who are there now, and retrieve your location-based notes when you are on your trip.&#8221;</strong></p>
<p>But, at a time when many augmented reality developers, including Blair, are focusing on AR apps for smartphones (the picture on the left opening this post is Blair&#8217;s augmented reality <a href="http://www.youtube.com/watch?v=_0bitKDKdg0&amp;feature=channel_page" target="_blank">iPhone app ARf</a>), I was interested in finding out from Blair what the state of play was for real-deal, Rainbow&#8217;s End-style AR, as well as the potential he sees in smartphones to mediate meaningful AR experiences.</p>
<p>There is an enormous amount of innovation in mapping our world &#8211; see my post, <a href="http://www.ugotrade.com/2009/06/02/location-becomes-oxygen-at-where-20-wherecamp/" target="_blank">&#8220;Location Becomes Oxygen at Where 2.0 and WhereCamp,</a>&#8221; and <a href="http://gamesalfresco.com/2009/05/26/where-2-0-the-world-is-mapped-now-use-it-to-augmented-our-reality/" target="_blank">Ori Inbar&#8217;s Where 2.0 conference roundup</a>. But as Ori notes, to move augmented reality forward:</p>
<p><strong>My point is not a shocker: all we need is to tap into this information and bring it, in context, into peopleâ€™s field of view.</strong></p>
<p>And this is what Blair MacIntyre&#8217;s work is all about.</p>
<h3>Talking With Blair MacIntyre</h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/06/picture-62.png"><img class="alignnone size-medium wp-image-3728" title="picture-62" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/06/picture-62-300x257.png" alt="picture-62" width="300" height="257" /></a></p>
<p><strong>Tish Shute:</strong> There do seem to be broader implications to augmented reality today than when the term was first coined. I am interested in your perspective on how augmented reality may go beyond some of our early definitions.</p>
<p><strong>Blair MacIntyre: I still think the original definition of the term is useful: media (typically graphics) tightly registered (aligned) with the physical world, in real time. Many people talk about many things that relate virtual worlds to places, spaces, objects and people. There is room for many of them, and they don&#8217;t all have to &#8220;be&#8221; augmented reality. I like using Milgram&#8217;s definition of Mixed Reality as everything from the physical world (at one end) to the virtual world at the other; it&#8217;s a spectrum, and augmented reality just sits at one point.</strong></p>
<p><strong>The reason I like the old definition is I believe there is something special about graphics that are tightly, rigidly aligned with the physical world. When things appear to stick to the world, at an obviously identifiable location, people can start leveraging their natural perceptual, physical and social abilities and interact with the mixed world as they do the physical world. We&#8217;ve found this with the two studies we&#8217;ve done of tabletop AR games (<a href="http://www.augmentedenvironments.org/lab/research/handheld-ar/artofdefense/" target="_blank">Art of Defense</a> and <a href="http://www.youtube.com/watch?v=w3iBrj_zfTM&amp;feature=channel_page" target="_blank">Bragfish</a>); one key to those games is that the graphics were tightly aligned with identifiable landmarks in the physical world (the game board).</strong></p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/06/aod-sandbox-video-15.png"><img class="alignnone size-medium wp-image-3729" title="aod-sandbox-video-15" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/06/aod-sandbox-video-15-300x225.png" alt="aod-sandbox-video-15" width="300" height="225" /></a><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/06/imgp0782-2.jpg"><img class="alignnone size-medium wp-image-3782" title="imgp0782-2" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/06/imgp0782-2-300x225.jpg" alt="imgp0782-2" width="300" height="225" /></a></p>
<p><em><a href="http://www.augmentedenvironments.org/lab/research/handheld-ar/artofdefense/" target="_blank">Art of Defense</a> (pic on left) <a href="http://www.youtube.com/watch?v=w3iBrj_zfTM&amp;feature=channel_page" target="_blank">Bragfish</a> (pic on right)<br />
</em></p>
<p><strong>Tish:</strong> I know that you are involved with <a id="b-c6" title="ISMAR 2009" href="http://www.ismar09.org/" target="_blank">ISMAR 2009</a>, which is the key US augmented reality conference. What do you think will be the hot themes, applications, and innovations at this year&#8217;s conference? Do you think this will be the year that AR really breaks out of eye candy into truly useful and sustained experiences?</p>
<p><strong>Blair: Unfortunately, I won&#8217;t be involved this year. I was supposed to be helping run the technical program, as well as the art/media program, but sickness in my family prevented me from having the time, so I am not helping this year.</strong></p>
<p><strong>First, I would not agree with the implication of the last question &#8212; I don&#8217;t think AR has just been eye candy up to now. I do agree that the &#8220;high profile&#8221; uses of it have largely been that, which is mostly because of the limits of the technology. I don&#8217;t think we&#8217;ll see huge changes in that regard by ISMAR this year. However, we will hopefully see a mixing of communities that hasn&#8217;t happened at ISMAR before, and I do believe that this year (independent of ISMAR) we will see more and more AR apps. Whether they go beyond eye candy is still a question. I&#8217;m hoping that some folks (including myself and other ISMAR folks!) will help push AR in new directions. But I also expect many folks new to ISMAR and AR to play a big role, because it is this new blood, especially those folks with real problems to solve, new art and game ideas, and a fresh perspective, that will open new doors.</strong></p>
<p><strong>Tish:</strong> You have been working on integrating augmented reality with virtual worlds. You mentioned that the way you use <a href="https://lg3d-wonderland.dev.java.net/" target="_blank">Sun&#8217;s Wonderland</a> is really about pulling the virtual world into the real world, i.e., Wonderland &#8220;is just a place to put data.&#8221; How is your use of the persistent virtual space different from what we have become accustomed to call virtual worlds?</p>
<p><strong>Blair: The approach we are taking in our project at Georgia Tech is to use the virtual world as the central hub of the information space, and allow the virtual world to be the element that enables distributed workers to collaborate more smoothly. This is work we are doing with Sun and Steelcase (and the NSF), and is an outgrowth of a project (the InSpace project) that&#8217;s been going on for a few years.</strong></p>
<p><strong>What we are trying to do is use mixed reality and ubicomp techniques to pull as much of the physical activity as possible into the virtual world, and then reflect that activity back out to the different participants as best suits their situation. So, folks in highly instrumented team rooms will collaborate in one way, and their activity will be reflected in the virtual world; remote participants (e.g., those at home, or in a cafe or hotel) may control their virtual presence in different ways, but the presence of all participants will be reflected back out to the other sides in analogous ways. We may see ghosts of participants at the interactive displays, or hear their voices in 3D space around us; everyone will hopefully be able to manipulate content on all displays and tell who is making those changes.</strong></p>
<p><strong>A secondary benefit, I hope, is that by putting the data in the virtual world and making that the place that gives you more powerful and flexible access to the data (e.g., by leveraging space and giving access to history), distributed teams will begin to have the virtual space become a place they go to work, bump into each other and have those casual contacts that co-located workers take for granted.</strong></p>
<p><strong><br />
</strong></p>
<h3><strong>Creating the Information Landscape of the Future</strong></h3>
<p><strong>Tish: </strong>At the end of <a href="http://www.ugotrade.com/2009/05/06/composing-reality-and-bringing-games-into-life-talking-with-ori-inbar-about-mobile-augmented-reality/" target="_blank">my interview with Ori Inbar</a>, he said that in order to have a ubiquitous experience <em>&#8220;you&#8217;ll need to 3D map the world. Google Earth-like apps are going to help but it is not going to be sufficient. So let&#8217;s leverage people. Google became successful in part by making people work with them. Each time you create a link from your blog to my blog their search engines learn from it. So let&#8217;s find ways to make people create information that can be used for AR.&#8221;</em> What ways do you think people can create information that can be used for AR?</p>
<p><strong>Blair: I think the big part of that is the creation of models and environments, the necessary &#8220;baseline&#8221; for specifying experiences. Google and Microsoft are clearly working toward this; recent videos from Microsoft show them starting to move the photosynth work toward Virtual Earth. Similarly, I came across a page where people are finally starting to mine geotagged Flickr [see my post, <a href="http://www.ugotrade.com/2009/06/02/location-becomes-oxygen-at-where-20-wherecamp/" target="_blank">&#8220;Location Becomes Oxygen,&#8221;</a> and <a href="http://www.ugotrade.com/2009/05/17/creating-the-information-landscapes-of-the-future-locative-media-and-the-shape-of-alpha/" target="_blank">here</a> for more on the <a href="http://code.flickr.com/blog/2008/10/30/the-shape-of-alpha/" target="_blank">&#8220;The Shape of Alpha&#8221;</a></strong><strong> project from Flickr] images to create models. It&#8217;s that kind of thing that will be useful first; using the data we all create to enable modeling and (eventually) vision-based tracking in the real world.</strong></p>
<p><strong>After that, it&#8217;s a matter of time till more of what we &#8220;create&#8221; (e.g., Tweets and blog posts and so on) are all geo-referenced; these will become the information landscape of the future, the kinds of things people think about when they read &#8220;Rainbow&#8217;s End&#8221;. The big problem will be filtering, searching and sorting. And, of course, safety and security.</strong></p>
<p><strong>Tish: </strong>You are working with <a href="http://unity3d.com/" target="_blank">Unity3D</a> to research the integration of mobile location based AR with persistent mirror world like spaces. What has attracted you to Unity? What is the difference between this and your Wonderland project? I know you mentioned you will be using head-mounted displays as part of this Unity project. What are your goals for this project?</p>
<p><strong>Blair:</strong> <strong>We started to use <a href="http://unity3d.com/" target="_blank">Unity3D</a> because it gave us what we wanted in a game engine. Most importantly, it&#8217;s very open and let us trivially expose AR technologies into the editor. Similarly, it can target the iPhone, so we can begin to work with it on that platform, too. The biggest problem with creating compelling experiences is content; and a show stopper for creating content is not being able to get it into your engine. Unity has a nice content workflow.</strong></p>
<p><strong>Unity3D is a front end engine, for creating the game; Wonderland is both a front end and a backend. We are actually looking into using the Wonderland backend with Unity as well. Wonderland also has growing support for doing &#8220;real work&#8221; in a virtual world, which is key to our other projects.</strong></p>
<p><strong>Eventually, we&#8217;ll be using HMD&#8217;s. The goal for the Unity3D project, initially, was to explore what you can do with an AR/VR mirror-world; this is a project we are working on with Alcatel-Lucent, and demo&#8217;d at CTIA this year. It&#8217;s continuing to grow, though, and now includes a number of our projects, including some work on mobile social AR and soon, some performance and experience design projects in the area of AR ARG&#8217;s. It&#8217;s really quite interesting to imagine what you can do when you have an &#8220;MMO of the real world&#8221; (which we now have for part of campus) that supports both VR-style desktop access simultaneously with mobile AR access.</strong></p>
<p><strong>Tish: </strong>Have you taken another look at <a href="http://opensimulator.org/wiki/Main_Page" target="_blank">OpenSim</a> as a possible backend for augmented reality? Recently I talked to David Levine, IBM, and he is thinking about some possibilities to optimize OpenSim to dynamically load a large number of objects at once (i.e., how fast OpenSim can bulk load into an existing sim) and make it better suited to augmented reality/mirror world type projects.</p>
<p><strong>Blair: I haven&#8217;t looked at OpenSim recently. We will probably look at it this summer.</strong></p>
<p><strong>Tish:</strong> Why did you select Unity as a good client for augmented reality?</p>
<p><strong>Blair: Unity is a 3D game authoring environment so at some level it is no different from using Ogre, if all the associated stuff was just as well done. It has integrated physics, scripting, debugging, etc. &#8211; you can write code in javascript or C# or whatever. It has a good content pipeline, as well, and supports a range of platforms.</strong></p>
<p><strong>It has simple networking built in, so multiple unity engines can talk to each other but it is not a virtual world platform out of the box &#8211; there is no back end &#8230;</strong></p>
<p><strong>Tish: </strong>Someone described Unity to me as a great client waiting for a great backend? So what are you going to use as a back end?</p>
<p><strong>Blair: There is no real processing except in the client right now. We will eventually have to create a back end. We are thinking of using Darkstar because someone on the Sun Wonderland community forums has already built a set of scripts connecting Unity to Darkstar.</strong></p>
<p><strong>But for us, we are not proposing right now to build a real product. This is research to demonstrate what you could do if you actually had the back end.</strong></p>
<p><strong>Tish:</strong> What are the most important aspects of the backend from your POV?</p>
<p><strong>Blair: We want to simulate a variety of the interesting aspects of the back end. So I very much care about notions of privacy and security and how these sorts of AR/VR Mirror Worlds would work in practice. But I care about those things as they impact user experience, not really about how we would actually implement them.</strong></p>
<p><strong>Tish:</strong> So looking at some of the big problems from the perspective of user experience? Are we going to go through the same growing pains that the web and VWs have seen &#8211; for example, will we have to type in passwords to get into everyone&#8217;s little worlds&#8230;.</p>
<p><strong>Blair: Well you know the SciFi background to this, you&#8217;ve mentioned it in other posts on your blog. Because when you look at the Rainbow&#8217;s End model where you have security certificates flying around, that is in effect what cookies and so on are now. You can authenticate yourself once and then have those certificates hang around. So you can easily imagine how it could be done. But the big question is how does that change user experience. There are all kinds of things that start coming into play &#8211; like what happens if nearby people see different things &#8211; it goes on and on!</strong></p>
<p><strong>Tish:</strong> Sounds like this is very valuable research. It seems to me that there will be a lot of investment soon in putting the pieces together to do location based markerless AR, and it would be nice if we knew more about it from the user experience POV.</p>
<p>Isn&#8217;t it vital for a productive intersection between mobile AR and persistent mirror world spaces for us to have markerless AR? Aren&#8217;t we right at the beginning of people really saying yeah, markerless AR is doable now? But it seems to me not many people are researching or working on fully immersive AR and its integration with mirror worlds?</p>
<p><strong>Blair: I think some of the AR community is thinking about this. There are probably people doing stuff in some other non-technical communities. It wouldn&#8217;t surprise me to find out that people in the digital performance or Ars Electronica world are thinking a little bit about these sorts of things, although not necessarily at the level of actually trying to build it, because they probably can&#8217;t right now. But experimenting with the precursors. My colleagues in digital media like to point out that this is often the purpose of digital art, to point out new directions and push the boundaries.</strong></p>
<p><strong>Obviously Science Fiction has explored the possibilities because that is what Rainbow&#8217;s End and the Matrix were all about.</strong></p>
<p><strong>Tish:</strong> and <a href="http://en.wikipedia.org/wiki/Denn%C5%8D_Coil" target="_blank">Denno Coil</a>&#8230;</p>
<p><strong>Blair: There has been some research &#8211; people like my adviser Steve Feiner up at Columbia, Mark Billinghurst in New Zealand, myself, and people at Graz University in Austria. But partly it has been so hard to do mobile AR up to now &#8211; so many people mock head worn displays and can&#8217;t get past current technology &#8211; you have had to be willing to ignore the bulky back packs and cables and batteries and so on. That is changing, which is good.</strong></p>
<p><strong>My current response to the anti-head-mounted display people is: if 5 years ago you told me that fabulously dressed people who care about their looks and wear stylish clothes would have big things hanging from their ears that blink bright blue light, so they could talk on the phone, many of us would have said you were crazy, because it would be ugly and so on. But because there is an intersection of demonstrable need and benefit&#8230; Bluetooth headsets are really useful, and the sort of early gestalt feeling that grew up around them &#8211; that people who use them are so important that they always have to be in touch, so they wear these things &#8211; means people accept them.</strong></p>
<p><strong>It will likely be a similar thing with head mounted displays. And I don&#8217;t know if it will be that people wear them so that they can read their mail while driving, god forbid. But it will be something. And when we get the 2nd generation of the wrap glasses that look more like sun glasses and are not bulky and so on, we will have the potential for them catching on, because you will look at them and you will think that the person is wearing them because they are doing x&#8230;</strong></p>
<p><strong>X might be surfing a virtual world or reading their email or keeping in touch, or being aware. It will happen. But they have to get unbulky enough and there has to be more than one important application, not just watching TV.</strong></p>
<p><strong><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/06/karmablair-fix.jpg"><img class="alignnone size-medium wp-image-3787" title="karmablair-fix" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/06/karmablair-fix-300x227.jpg" alt="karmablair-fix" width="300" height="227" /></a><br />
</strong></p>
<p><em>Picture above shows an outside view of the KARMA AR system; the knowledge based maintenance system Blair built in his first year of grad school (<strong>&#8220;first AR system Steve Feiner, Doree Seligmann, and I worked on&#8221;</strong>). Blair noted, &#8220;<strong>The Communications of the ACM paper on it (from 1993) is a pretty widely cited AR paper.&#8221;</strong></em></p>
<p><strong>Tish:</strong> I think the need for full-on transparent, immersive, wraparound, Gucci stylish eyewear with a decent field of view is the elephant in the room in terms of realizing the full potential of augmented reality. There are a few new players in the field &#8211; <a href="http://www.sbglabs.com/" target="_blank">Digilens</a>, <a href="http://www.vuzix.com/home/index.html" target="_blank">Vuzix</a>, others? What is the progress in this area and what do you hope for in terms of near term solutions?</p>
<p><strong>Blair: I agree with that sentiment. I think that, in the near term, there is a lot we can do with handhelds, as we&#8217;ve been doing in the lab. However, because it&#8217;s awkward and tiring to hold up a device, even a small one, for any length of time, handhelds will only be good for &#8220;focused&#8221; uses of AR, such as the table-top games we&#8217;ve been doing, or the constellation viewing app that I heard came out recently for the Android G1. I don&#8217;t even see something like Wikitude as that compelling (beyond the &#8220;gee whiz&#8221; factor) for a handheld form factor. Many proposed AR apps only really become compelling when users have constant awareness of them, and that requires a see-through head-worn display.</strong></p>
<p><strong>I&#8217;ve seen the mockups of the Vuzix ones; they seem pretty interesting, and are getting to where early adopters could use them (they will be cheap enough, and will hopefully be good enough). Microvision&#8217;s virtual retinal display is also promising; the contact lens displays will be the most interesting, if anyone can ever make them work. I don&#8217;t know of anything else out there.</strong></p>
<p><strong><br />
</strong></p>
<h3><strong>&#8220;it&#8217;s not really a killer app you care about, it is the killer existence that all of the technology and small applications taken together facilitate&#8221;</strong></h3>
<p><strong></strong></p>
<p><strong>Tish:</strong> While location based services are accepted now and people are understanding that it is something that opens up a new relationship to everything, we still haven&#8217;t found the experience that will get everyone holding up their mobile devices?</p>
<p><strong>Blair: Well that is actually the killer problem. Gregory Abowd is one of my colleagues who does ubiquitous computing research here at Tech. Way back when we started the Aware Home project (<a href="http://www.awarehome.gatech.edu/">Aware Home Research Institute at Georgia Tech</a>) when I first got here about ten years ago, there was always this question of what is the killer app. So Gregory commented in a meeting once that it&#8217;s not really a killer app you care about, it is the killer existence that all of the technology and small applications taken together facilitate. It is not that any one of these AR demos we see right now, whether it is seeing your photos in the world or whatever, is important. It&#8217;s that, taken together, there is enough of a benefit that you would use the whole environment.</strong></p>
<p><strong>In the original context we were talking about an instrumented home, but it is the same thing here with AR.</strong></p>
<p><strong>The problem with the mobile phone as an AR device is that problem of awareness. If I have a head mount on and I walk down the street and there is a bunch of probably-not-useful-but-potentially-useful information floating by me, that&#8217;s a good thing, because I may see something that is useful or makes me think of something else. But if I have to hold up my phone to see if something might be interesting nearby, I will never hold up my phone, because at any given time there is a high probability that there won&#8217;t be anything particularly important there. You might imagine you can get around this by using alerts or something like that, but then you overload whatever alert channel you use. For example, I forward maybe 5 or 6 people&#8217;s updates from Facebook to my phone &#8211; it started with my wife, a few friends, my brother &#8211; and the net result is I never get SMSs anymore, because when my phone buzzes I usually ignore it; it is probably just somebody&#8217;s random Facebook update. So if we start overloading channels like that with &#8220;oh, there might be something useful here in the real world; if you pick up the phone and look through it you will see it &#8230; and I will buzz you,&#8221; people just start ignoring the buzzes.</strong></p>
<p><strong>So it is a very hard problem if you think about the kinds of applications that people always imagine with global AR &#8212; names over people&#8217;s heads and other random information floating in the world &#8212; until you have a head mount and all that information is around you all the time. That is when those sorts of applications will actually happen.</strong></p>
<p><strong>Tish:</strong> <a href="http://curiousraven.squarespace.com/" target="_blank">Robert Rice</a> notes: <strong>&#8220;AR is inherently about who YOU are, WHERE you are, WHAT you are doing, WHAT is around you, et</strong><em><strong>c.&#8221; </strong></em>(see my interview with Robert,<em> </em><a href="http://www.ugotrade.com/2009/01/17/is-it-%E2%80%9Comg-finally%E2%80%9D-for-augmented-reality-interview-with-robert-rice/" target="_blank">&#8220;Is it &#8216;OMG Finally&#8217; for Augmented Reality?</a>)<em>. </em>And I think the iphone experience has laid the foundation for the increasing desire to experience the network wherever we are &#8211; and not be stuck behind a PC. We cannot perhaps do all we want to do yet. But even in the range of things we can do now, we are not even sure exactly what it is we want to do where yet, is it?</p>
<p><strong><br />
</strong></p>
<h3><strong>&#8220;imagine your iphone Facebook client supports AR and that all data on Facebook might be georeferenced &#8211; pictures, status updates etc&#8230;&#8230;.&#8221;</strong></h3>
<p><strong></strong></p>
<p><strong>Blair: Yes that is a huge problem. I have been lucky to be able to teach two fun classes this year that let the students and I start to explore some of the potential that handheld AR might bring. Last fall I taught a handheld AR game design class &#8212; coordinated with a class at the Savannah College of Art and Design&#8217;s Atlanta campus &#8212; and we had the students build a sequence of prototype handheld AR games, which was a lot of fun. This spring I taught a mixed reality/augmented reality design class with Jay Bolter (a professor in the School of Literature, Communication, and Culture here at GT). Jay and I have been teaching this class off and on for about 9 years; this semester we decided to say to the students &#8220;imagine your iphone Facebook client supports AR and that all data on Facebook might be georeferenced &#8211; pictures, status updates etc&#8230;&#8221; and have them do projects aimed at such an environment.</strong></p>
<p><strong>Tish: </strong>Not many of our favorite social media today have much sense of location, do they? But Flickr is utilizing the geo-referenced pictures to create vernacular maps&#8230; The Shape of Alpha</p>
<p><strong>Blair: Yes, that is because lots of cameras put geolocation data into the EXIF data, so they can extract it&#8230;</strong></p>
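The EXIF extraction Blair mentions boils down to simple arithmetic: cameras store GPS coordinates as degree/minute/second rationals plus an N/S or E/W reference, and a service mining geotagged photos converts them to signed decimal degrees. A minimal sketch of that conversion step (reading the raw tags would be done with an image library; the sample coordinates below are invented for illustration):

```python
def dms_to_decimal(degrees, minutes, seconds, ref):
    """Convert an EXIF GPS degree/minute/second triple to signed decimal degrees."""
    value = degrees + minutes / 60.0 + seconds / 3600.0
    # South latitudes and West longitudes are negative by convention.
    return -value if ref in ("S", "W") else value

# A hypothetical photo geotagged in Midtown Manhattan:
lat = dms_to_decimal(40, 44, 54.36, "N")   # roughly 40.7484
lon = dms_to_decimal(73, 59, 8.36, "W")    # roughly -73.9857
```

Once every photo carries a pair like this, clustering the points into shapes (as in Flickr&#8217;s &#8220;The Shape of Alpha&#8221; work) is a geometry problem over the decimal coordinates.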
<p><strong>Some mobile Twitter clients like the one I use on my iphone will let you add your location. But in general Facebook and other sites don&#8217;t have any notion of location. But if you look at all the things people do in Facebook, such as sending gifts and other games, it&#8217;s easy to imagine what these might look like with geo-reference data. So, the high level project for the class is the groups have to design experiences people might have using mobile AR Facebook. We told them to assume Facebook as it stands now, but add geolocation and AR to the client. The class boiled down to &#8220;What would you imagine people doing?&#8221; So it has been kind of fun.</strong></p>
<p><strong>And we are using Unity for the class too &#8211; the same infrastructure I am working on in my research linking mobile AR to persistent immersive mirror world type spaces &#8211; and we are having the students mock up what a mobile AR Facebook experience would be like.</strong></p>
<p><strong>Tish: </strong>Can you describe some of the ideas your class came up with that you think have potential? I know Ori mentioned that from the games class he liked <a href="http://www.youtube.com/watch?v=Rqcp8hngdBw&amp;feature=channel_page" target="_blank">Candy Wars.</a></p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/06/candywars-6.png"><img class="alignnone size-medium wp-image-3693" title="candywars-6" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/06/candywars-6-300x225.png" alt="candywars-6" width="300" height="225" /></a></p>
<p><em>Candy Wars</em></p>
<p><strong>Blair: In the end, they had a nice range of projects in the Spring class. One created tag clouds out of status messages over spaces, others looked at analogies to virtual pets and gift giving out in the world, one looked at leveraging geolocation to help with crowd-sourced cultural translation, and three groups did straight-up social games.</strong></p>
<p><strong>[See <a href="http://www.youtube.com/user/AELatGT" target="_blank">all of the projects from the handheld AR games class on YouTube here</a>]</strong></p>
<p><strong><br />
</strong></p>
<h3><strong>iphone, Android, or </strong><strong>NVidia Tegra devkits or the Texas Instruments OMAP3 devkits?</strong></h3>
<p><strong>Tish:</strong> Is anyone in the class working on Android?</p>
<p><strong>Blair: Nobody is using Android because no-one in the class has the phones. We have ATT microcell infrastructure on campus. Some ATT people joke that we are better off than them, because we have a head office on campus, so we can build in-network applications which people even at ATT research can&#8217;t do. But because we have this infrastructure on campus, and a great relationship with ATT and the other sponsors, we have the ability to provision our own phones without having to pay for long-term contracts, which is vital for research and teaching.</strong></p>
<p><strong>Tish:</strong> So does this lock you into the iphone?</p>
<p><strong>Blair: Well the G1 is of course not AT&amp;T but it is GSM so we could probably buy them unlocked and put them on our AT&amp;T network. But the students I work with are much more interested in the iphone right now.</strong></p>
<p><strong>Tish:</strong> Is that because the iphone has the market?</p>
<p><strong>Blair: For me the reason I am not interested in the G1 is because you can&#8217;t do AR on it &#8211; there is <a href="http://www.mobilizy.com/wikitude.php" target="_blank">Wikitude</a> and a few other apps, but it is all hideously slow. Worse, because the Java code isn&#8217;t compiled like it would be on the desktop, you can&#8217;t do computer vision with it, so you can&#8217;t do anything particularly interesting on the current commercial G1s. We could probably take the NVidia Tegra devkits or the Texas Instruments OMAP3 devkits (both are chipsets for next gen phones &#8212; high end graphics, fast processing), and install Android on those, and we may actually do that yet. But, it seems like a lot of work right now, for not much benefit.</strong></p>
<p><strong><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/06/pastedgraphic.jpg"><img class="alignnone size-medium wp-image-3730" title="pastedgraphic" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/06/pastedgraphic-300x166.jpg" alt="pastedgraphic" width="300" height="166" /></a><br />
</strong></p>
<p><em>Augmented Reality shooter game <strong>ARrrrr</strong> from<strong> </strong></em><em>Georgia Tech and SCAD Atlanta on the <strong>NVidia Tegra devkits</strong></em><em> &#8211; <a href="http://www.youtube.com/watch?v=cNu4CluFOcw" target="_blank">watch the demo on YouTube here</a></em><em>. </em><strong> </strong></p>
<p><strong>Tish: </strong>Everyone seems very excited about the iphone OS 3.0 and the addition of compass. Compass is pretty essential for AR, right?</p>
<p><strong>Blair: It is necessary if you can&#8217;t do other forms of outdoor tracking, but the problem is that the compass on the G1 isn&#8217;t very good, relatively speaking, and the iPhone one probably won&#8217;t be much better. It does not have very high accuracy, nor is it very fast (compared to, say, the high end 3D orientation sensors we use, from Intersense and MotionNode). As far as I can tell, it doesn&#8217;t even give full 3D orientation. I don&#8217;t have a G1 (although I have pre-ordered an iPhone 3Gs), but people have told me it only has absolute 2D orientation, so you can only line things up if you are careful. You can&#8217;t look around arbitrarily&#8230;</strong></p>
<p><strong>Tish: </strong>You can&#8217;t sweep your phone?</p>
<p><strong>Blair: You can look left and right, but if it doesn&#8217;t have full 3D orientation, you can&#8217;t go up and down. You can&#8217;t tilt it in weird directions. It is not fast enough for you to look around quickly. So it is a nice demo. And it is good for what the Android people use it for, which is to let you do your Google Street View by looking around, which is actually really useful.</strong></p>
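The distinction Blair is drawing between 2D and full 3D orientation can be sketched numerically: a compass heading constrains only the horizontal component of the view direction, while pointing the camera above or below the horizon also requires pitch from a separate sensor. A rough illustration (East-North-Up convention; the function name is mine, not from any phone SDK):

```python
import math

def look_vector(heading_deg, pitch_deg=0.0):
    """View direction as an (east, north, up) unit vector.

    heading_deg: compass heading, 0 = North, 90 = East.
    pitch_deg:   tilt above the horizon; a 2D compass cannot supply this.
    """
    h = math.radians(heading_deg)
    p = math.radians(pitch_deg)
    return (math.cos(p) * math.sin(h),   # east component
            math.cos(p) * math.cos(h),   # north component
            math.sin(p))                 # up component

facing_east = look_vector(90)      # all a 2D compass gives: a level view
tilted_up = look_vector(90, 45)    # needs accelerometer/gyro data for the tilt
```

With heading alone the up component is always zero, which is why labels only line up when the phone is held level; registering content above or below the horizon needs the full 3D orientation Blair describes.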
<p><strong>I think there are lots of really useful things you can do with such a compass.</strong></p>
<p><strong>And, it is clear that compass is a necessary feature if we want to do AR. It&#8217;s just not sufficient.</strong></p>
<p><strong><br />
</strong></p>
<h3><strong>Outdoor Tracking and Markerless AR<br />
</strong></h3>
<p><strong></strong></p>
<p><strong>Tish:</strong> Isn&#8217;t it essential for markerless AR? I guess not &#8211; I just saw this post about <a href="http://artimes.rouli.net/2009/04/srengine-in-english.html" target="_blank">SREngine on Augmented Times</a>!</p>
<p>This wasn&#8217;t up when we spoke so perhaps you have some comments about what it brings to the table?</p>
<p><strong>Blair: Maybe. The folks at Nokia are working on outdoor tracking; they demoed some stuff at ISMAR last year on the N95 handsets that is all image based. We are trying to do some work with them; one of my students is working on it. And probably Microsoft is going to do more on this as well &#8211; they had a video up showing that they are also working on vision based techniques. If you give the phone the equivalent of those panoramic Google Street View images (assuming they are up-to-date) and you are standing at the right place, you don&#8217;t really need a compass; you can figure out which way you are looking by looking at the camera video. Ulrich Neumann (USC) did some work on tracking from panoramas years ago; I don&#8217;t know whatever became of it.</strong></p>
<p><strong>Regarding SREngine, that project appears to be a pretty simple first step, but is probably just a demo at this point, and limitations like &#8220;only works on static scenes&#8221; and &#8220;doesn&#8217;t work for simple scenes&#8221; mean it&#8217;s probably extracting some simple features out of the image and then matching those to some database. The trick would be getting this to work on a large scale, where the world changes a lot. It&#8217;s not obvious how to get there.</strong></p>
<p><strong>Tish:</strong> So forget RFID for AR&#8230;</p>
<p><strong>Blair: RFID is not really useful.</strong></p>
<p><strong>Tish:</strong> not at all?</p>
<p><strong>Blair: RFID is useful for telling you what things are near you. The problem is it doesn&#8217;t give you any directional information &#8211; it just tells you you&#8217;re in range of the tag. So you can use it to tell you when you are near a certain product, for example. So it is useful in terms of telling you what thing you are near, and then you can load up a vision system or something else that will recognize that thing.</strong></p>
<p><strong>In that way, it could be useful as a good starting point.</strong></p>
<p><strong>Similarly for computer vision, the compass and the GPS are very useful for giving you an initial guess at what you may be looking at, which can then speed up the rest of the process. But computer vision by itself will not be a complete solution, because if I have my panoramic Google Street View (or whatever image database I use for tracking) and you are standing between me and the building &#8211; I am not going to see what I expect to see, I am going to see you.</strong></p>
<p><strong>So I think it is all going to be part of one big package &#8211; you are going to see accelerometers, digital compasses, and GPS, and then combine that with computer vision and other sensors, and then maybe we are going to start getting the things that we have always dreamed about. I like to show <a href="http://mi.eng.cam.ac.uk/~gr281/outdoortracking.html" target="_blank">this video </a>from the U. of Cambridge (work done by Gerhard Reitmayr and Tom Drummond) of an outdoor tracking demo because it gives a sense of what will be possible. Techniques like this will be an ingredient in the future of things. It becomes especially interesting when you have these highly detailed mirror worlds. It is sort of one of those chicken and egg problems: if I have a highly detailed model of the world, then techniques like theirs can be used to track. But that mirror world needs to be accurate or you can&#8217;t use it for tracking, and why would you create the mirror world if you couldn&#8217;t track?</strong></p>
<p><strong>Tish:</strong> I noticed in your comment to <a href="http://www.ugotrade.com/2009/01/17/is-it-%E2%80%9Comg-finally%E2%80%9D-for-augmented-reality-interview-with-robert-rice/" target="_blank">&#8220;my interview with Robert Rice&#8221;</a> that you said you thought it was important not to collapse AR into ubicomp &#8211; &#8220;forgetting what originally inspired us about AR&#8221; is, I think, if I remember correctly, the suggestion you made. But aren&#8217;t ubiquitous computing and AR basically coextensive?</p>
<p>The <a href="http://www.ugotrade.com/2009/03/18/dematerializing-the-world-shadows-subscriptions-and-things-as-services-talking-with-mike-kuniavsky-at-etech-2009/" target="_blank">vision of ubicomp Mike Kuniavsky describes</a> &#8211; &#8220;sharing data through open APIs and the promise of embedded information processing and networking distributed through the environment&#8221; &#8211; demonstrates how much can be done with very little processing power. In its most immersive form augmented reality requires a lot of processing power. I think we have all become very conscious about trying to minimize levels of consumption. Can you explain why you think people shouldn&#8217;t see AR as the Hummer (energy squandering indulgence) of Ubiquitous Computing?</p>
<p><strong>Blair: I think there will be a hierarchy of interfaces. You are going to have the rich Rainbow&#8217;s End like experience &#8211; you are totally submerged in a mixed environment &#8211; if you have a head mount on (it&#8217;s not going to be Rainbow&#8217;s End for a while), but if you don&#8217;t have the headmount on, that information might be available to you other ways, whether it is a 3D overlay using your handheld or just a 2D mashup with Google Maps. But there will be some circumstances and people who will want to get the compelling experience you can only get with the headmount.</strong></p>
<p><strong>Tish:</strong> Are you doing any research on how all these hierarchies of experiences will fit together &#8211; what aspects of this are you looking at?</p>
<p><strong>Blair: The thing that really needs to happen is you need to have this backend architecture that allows you to collect your data from different sources and aggregate it, much like the web. Right now Google Earth and Microsoft&#8217;s Virtual Earth are much like the old pre-web hyper-text systems that were all centralized. And what we really need is to have the web equivalent, where Georgia Tech can publish their building models and I.B.M. can publish their building models and their campus models, and your client can aggregate them, as opposed to Microsoft or I.B.M. putting their building models into Google Earth and then somehow you get them out with Google&#8217;s Google Earth browser. That&#8217;s just not going to fly.</strong></p>
<p><strong>Tish:</strong> So what does it take then to get us to this backend architecture, because I&#8217;m in total agreement?</p>
<p><strong>Blair: The nice thing about augmented reality versus virtual reality is that you don&#8217;t need everything modeled. You can do interesting AR apps like <a href="http://www.mobilizy.com/wikitude.php" target="_blank">Wikitude</a> with absolutely no world model.</strong></p>
<p><strong>Tish:</strong> So that means we can start with what we have &#8211; utilize cloud services without a full blown backend architecture?</p>
<p><strong>Blair: It may very well be that Google Earth and MS Virtual Earth act as a portal, because people go and build models and link them with KML, and they can see them in Google Earth but they can also download the KML files through some other channel. So it may be that those things end up being something that feeds some of this along. Then people start seeing a benefit to having these highly accurate models, so then you start integrating the Microsoft Photosynth stuff and leveraging photographs to generate models.</strong></p>
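<p>Blair&#8217;s web-style aggregation is easy to sketch: a client pulls KML from independent publishers and merges their Placemarks, with no central model store in the middle. Here is a minimal sketch in Python; the two inline KML snippets (and their coordinates) are invented stand-ins for what Georgia Tech and I.B.M. might publish:</p>

```python
import xml.etree.ElementTree as ET

# KML 2.2 default namespace (per the OGC KML specification).
KML_NS = {"kml": "http://www.opengis.net/kml/2.2"}

def placemarks(kml_text):
    """Yield (name, coordinates) for each Placemark in a KML document."""
    root = ET.fromstring(kml_text)
    for pm in root.iter("{%s}Placemark" % KML_NS["kml"]):
        name = pm.findtext("kml:name", default="(unnamed)", namespaces=KML_NS)
        coords = pm.findtext(".//kml:coordinates", default="", namespaces=KML_NS)
        yield name, coords.strip()

# Two publishers, each exposing their own models (contents invented):
gatech_kml = """<kml xmlns="http://www.opengis.net/kml/2.2"><Document>
  <Placemark><name>GT Library</name>
    <Point><coordinates>-84.396,33.774,0</coordinates></Point>
  </Placemark></Document></kml>"""

ibm_kml = """<kml xmlns="http://www.opengis.net/kml/2.2"><Document>
  <Placemark><name>IBM Campus</name>
    <Point><coordinates>-73.803,41.108,0</coordinates></Point>
  </Placemark></Document></kml>"""

# The client aggregates both feeds, web-style: no central database.
world = list(placemarks(gatech_kml)) + list(placemarks(ibm_kml))
for name, coords in world:
    print(name, coords)
```

<p>In a real client the two strings would be fetched over HTTP from each publisher&#8217;s own server, which is exactly the decentralization being argued for here.</p>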
<p><strong>The challenge is just keeping up with it and building it in real time. A lot of folks think it will be tourist applications where there are models of Times Square and models of Central Park and models of Notre Dame and the big square around that area in Paris and along the river and so on, or the models of Italian and Greek historic sites &#8211; the virtual Rome. As those things start happening and people start building onto the edges, and when Microsoft Photosynth and similar technologies become more pervasive, you can start building the models of the world in a semi-automated way from photographs and more structured, intentional drive-bys and so on. So I think it&#8217;ll just sort of happen, as long as there&#8217;s a way to have the equivalent of Mosaic for AR &#8211; the original open source web browser &#8211; something that allows you to aggregate all these things. It&#8217;s not going to be a Wikitude. It&#8217;s not going to be this thing that lets you get a certain kind of data from a specific source; rather, it&#8217;s the browser that allows you to link through into these data sources.</strong></p>
<p><strong>So it&#8217;s that end that interests me. It&#8217;s questions like &#8220;what is the user experience?&#8221; How do we create an interface that allows us to layer all these different kinds of information together such that I can use it for all my things? I imagine that I open up my future iPhone and I look through it. The background of the iPhone, my screen, is just the camera and it&#8217;s always AR.</strong></p>
<p><strong>I want the camera on my phone to always be on, so it&#8217;s not just that when I hold it a certain way it switches to camera mode, but literally it&#8217;s always in video mode so whenever there&#8217;s an AR thing it&#8217;s just there in the background.</strong></p>
<p><strong>When we can do that I can have little alerts, so when I have my phone open I can look around and see it independent of the buttons and things that I&#8217;m tapping and pushing to use the phone. That&#8217;ll be really a different kind of experience.</strong></p>
<p><strong>Of course it is not known yet if the next gen iPhone will have an open video API. And of course, the current camera is pretty low quality, so why would they give it an open API until they put in a better camera? I am not expecting anything one way or the other until the 3GS comes out and people start using it.</strong></p>
<p><strong>But there are many things about the iPhone 3.0 OS that are hugely important, like the discovery API that allows people to play games with other people nearby, that don&#8217;t have much to do with AR.</strong></p>
<p><strong>Tish:</strong> You have an iPhone AR virtual pet application, ARf.</p>
<p><a href="http://www.macrumors.com/2009/04/08/video-in-and-magnetometers-could-introduce-interesting-iphone-app-possibilites/" target="_blank">Macrumors wrote it up</a> and suggested that the next gen iPhone will have a compass and an open video API. What are your plans for ARf?</p>
<p><strong>Blair: ARf is just a demo right now. I know what we&#8217;d like to do with it, but it would require tons of work; imagine what it would take to do a multiplayer, social version of Nintendogs? It&#8217;s not clear what we&#8217;d really learn by doing that, but there are lots of other game ideas we have that we want to explore.</strong></p>
<p><strong>Tish:</strong> I think it was on Twitter where Tim O&#8217;Reilly said, &#8220;saying everything must have an RFID tag is like saying we can&#8217;t recognize each other unless we wear name tags. Look at what&#8217;s happening with speech recognition, image recognition et al. and tell me you really think we need embedded metadata.&#8221; What would you say to that?</p>
<p><strong>Blair: I think that whatever extra data is there will be used. So if we put machine readable labels on some objects then they&#8217;ll be used if they make the identification and tracking problem easier. But it&#8217;s pretty clear that people are already working on tracking and so on.</strong></p>
<p><strong>A lot of these mobile AR apps are clearly putting ideas in people&#8217;s minds about things that won&#8217;t really be doable in the near future &#8211; like being able to look down the aisle of the store and have it recognize all of the products. Given the distances and complexity of the scene, the number of pixels devoted to each of those objects, and so on, you just can&#8217;t recognize things in that context. But if I&#8217;m standing in front of a small set of objects, or looking at one thing, or I&#8217;m standing in front of a building &#8211; or if I&#8217;m in the store: imagine an enhanced location API that can tell me within a few feet where I am, combined with some use of the discovery API that allows the store to tell your device you&#8217;re in the toothpaste section. Now you only have to look for different brands of toothpaste. So now you can recognize the big letters &#8220;Crest&#8221; or whatever. It&#8217;s all about constraining the problem.</strong></p>
<p><strong>That&#8217;s why I like that particular piece of Drummond&#8217;s work, the tracking web site I mentioned above. The general tracking problem of looking around and recognizing objects and tracking is still impossible. But if I know roughly what direction I&#8217;m looking in and I have a good estimate of my position, and I have models of what I should be seeing when I look in that direction, then it becomes a tractable problem. And so it&#8217;s not that a compass and a GPS are 100% necessary. But if you have them it certainly makes things possible that you wouldn&#8217;t otherwise be able to do.</strong></p>
<p><strong>Imagine, for example, if there&#8217;s a new version of GPS. I just noticed that some of the new satellites going up have this new L5 channel. There are the L1 &amp; L2 signals that the military and civilian receivers use, and they added this civilian L5 signal, which should make GPS more accurate. I haven&#8217;t found anything online that says how much more accurate.</strong></p>
<p><strong>But someday, hopefully, all GPS will get to be the quality of survey-grade GPS. Right now, if you get an RTK GPS from one of these companies that make the survey-grade GPS systems, they give you position estimates in the range of two centimeters, and update 10 to 20 times a second. When you have that kind of positional accuracy combined with the kind of orientational accuracy you get from the orientation sensors we use in the lab from Intersense and MotionNode, everything is easier because you&#8217;ve pretty much got absolute position. You put that into a phone and now when I look up, it&#8217;s still not perfectly aligned because there will still be errors (especially in orientation, since the compasses are affected by metal and other magnetic noise). But it does mean that if you and I are standing 5 feet apart from each other and look at each other, I can pretty much put a little smiley face above your head. Whereas now, with GPS, if I look at you and we&#8217;re 5 feet apart, our GPSes might think we&#8217;re on opposite sides of each other, because they&#8217;re only accurate to two to five meters.</strong></p>
<p><strong>And that depends on the time of day and the weather!</strong></p>
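<p>A back-of-the-envelope sketch of why that accuracy gap matters for label placement. The 2 centimeter and 2&#8211;5 meter figures are the ones Blair quotes; treating the two devices&#8217; position errors as simply adding in the worst case is my own simplifying assumption:</p>

```python
import math

def label_angle_error_deg(position_error_m, distance_m):
    """Worst-case angular offset, in degrees, of an AR label placed on a
    target at distance_m, when each of the two position fixes can be off
    by position_error_m in the worst direction."""
    # Assumption: both devices can be wrong, so errors add in the worst case.
    combined_error = 2 * position_error_m
    return math.degrees(math.atan2(combined_error, distance_m))

# Two people standing about 5 feet (1.5 m) apart, as in the interview.
distance = 1.5
consumer = label_angle_error_deg(3.5, distance)  # midpoint of 2-5 m accuracy
rtk = label_angle_error_deg(0.02, distance)      # ~2 cm survey-grade RTK

print("consumer GPS: label can be off by ~%.0f degrees" % consumer)
print("RTK GPS:      label can be off by ~%.1f degrees" % rtk)
```

<p>With meter-level error the label can land tens of degrees away from the person it belongs to; with centimeter-level RTK it stays within a degree or two of the right spot, which is why the smiley-face-over-your-head scenario becomes feasible.</p>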
<p><strong>Putting RFID tags everywhere is easy; the problem is the readers &#8211; they currently require lots of power and they have a limited range. Sprinkling RFID tags everywhere is fine, but you have to be able to activate those tags and read back the signal. In certain contexts it works.</strong></p>
<p><strong>Tish:</strong> And one final question! What do you think can be done about beginning to think about standards for AR? Is there a meaningful discussion going on yet? Thomas Wrobel left this comment on my blog recently and I was wondering what your position was on some of the ideas he raises.</p>
<p>Wrobel wrote, <em>&#8220;The AR has to come to the users, they can&#8217;t keep needing to download unique bits of software for every bit of content! We need an AR Browsing standard that lets users log into and out of channels (like IRC) and toggle them as layers on their visual view (like Photoshop). Channels need to be public or private, hosted online (making them shared spaces) or offline (private spaces). They need to be able to be both open (chat channel) or closed (city map channel) as needed. Created by anyone anywhere. Really IRC itself provides a great starting point. Most data doesn&#8217;t need to be persistent, after all. I look forward to seeing the world through new eyes. I only hope I will be toggling layers rather than alt-tabbing and only seeing one &#8220;reality addition&#8221; at a time.&#8221;<br />
</em></p>
<p><strong>Blair: I agree with him, in principle. But I&#8217;m not sure there&#8217;s a point yet. It can&#8217;t hurt to try, of course, from a research perspective, and I&#8217;m interested in the experience such an infrastructure would enable (as we&#8217;ve talked about already).</strong></p>
]]></content:encoded>
			<wfw:commentRss>http://www.ugotrade.com/2009/06/12/mobile-augmented-reality-and-mirror-worlds-talking-with-blair-macintyre/feed/</wfw:commentRss>
		<slash:comments>7</slash:comments>
		</item>
		<item>
		<title>HomeCamp 2: Home Energy Management and Distributed Sustainability</title>
		<link>http://www.ugotrade.com/2009/04/24/homecamp-2-home-energy-management-and-distributed-sustainability/</link>
		<comments>http://www.ugotrade.com/2009/04/24/homecamp-2-home-energy-management-and-distributed-sustainability/#comments</comments>
		<pubDate>Fri, 24 Apr 2009 19:14:16 +0000</pubDate>
		<dc:creator><![CDATA[Tish Shute]]></dc:creator>
				<category><![CDATA[Ambient Devices]]></category>
		<category><![CDATA[Ambient Displays]]></category>
		<category><![CDATA[Bar Camp]]></category>
		<category><![CDATA[Carbon Footprint Reduction]]></category>
		<category><![CDATA[culture of participation]]></category>
		<category><![CDATA[CurrentCost]]></category>
		<category><![CDATA[Ecological Intelligence]]></category>
		<category><![CDATA[Energy Awareness]]></category>
		<category><![CDATA[Energy Saving]]></category>
		<category><![CDATA[home automation]]></category>
		<category><![CDATA[home energy monitoring]]></category>
		<category><![CDATA[home energy monitors]]></category>
		<category><![CDATA[HomeCamp]]></category>
		<category><![CDATA[Instrumenting the World]]></category>
		<category><![CDATA[internet of things]]></category>
		<category><![CDATA[mirror worlds]]></category>
		<category><![CDATA[OpenSim]]></category>
		<category><![CDATA[Paticipatory Culture]]></category>
		<category><![CDATA[smart appliances]]></category>
		<category><![CDATA[Smart Devices]]></category>
		<category><![CDATA[Smart Planet]]></category>
		<category><![CDATA[sustainable living]]></category>
		<category><![CDATA[sustainable mobility]]></category>
		<category><![CDATA[Virtual HomeCamp]]></category>
		<category><![CDATA[Virtual Meters]]></category>
		<category><![CDATA[Virtual Worlds]]></category>
		<category><![CDATA[Web Meets World]]></category>
		<category><![CDATA[Add new tag]]></category>
		<category><![CDATA[distributed sustainability]]></category>
		<category><![CDATA[electricity 2.0.]]></category>
		<category><![CDATA[green technology]]></category>
		<category><![CDATA[home energy management]]></category>
		<category><![CDATA[intelligent energy management]]></category>
		<category><![CDATA[living greener]]></category>
		<category><![CDATA[Pachube]]></category>
		<category><![CDATA[sustainable interaction design]]></category>
		<category><![CDATA[TweetaWatt]]></category>

		<guid isPermaLink="false">http://www.ugotrade.com/?p=3423</guid>
		<description><![CDATA[HomeCamp is a home hacking, automation and green technology community that will be gathering in London tomorrow, Saturday 25th April 2009, 10am until 6pm BST (GMT + 1), and in an OpenSim event running alongside for virtual participation, to brainstorm new possibilities for distributed sustainability, creative smart meters, monitoring, graphing and visualizing energy usage. More [&#8230;]]]></description>
				<content:encoded><![CDATA[<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/04/picture-31.png"><img class="alignnone size-medium wp-image-3424" title="picture-31" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/04/picture-31-299x300.png" alt="picture-31" width="299" height="300" /></a></p>
<p><a rel="nofollow" href="http://homecamp.org.uk/">HomeCamp</a> is a home hacking, automation and green technology community that will be <a href="http://maps.google.co.uk/maps?f=q&amp;source=s_q&amp;hl=en&amp;geocode=&amp;q=65+-+71+Scrutton+Street,+London,+EC2A+4PJ&amp;sll=51.509912,-0.129361&amp;sspn=0.100214,0.30899&amp;ie=UTF8&amp;ll=51.524379,-0.080895&amp;spn=0.006582,0.019312&amp;z=16&amp;iwloc=addr" target="_blank">gathering in London</a> tomorrow, Saturday 25th April 2009, 10am until 6pm BST (GMT + 1), and in an <a href="http://homecamp.pbwiki.com/Virtual-Home-Camp">OpenSim event running alongside for virtual participation</a>, to brainstorm new possibilities for distributed sustainability, creative smart meters, monitoring, graphing and visualizing energy usage.</p>
<p class="MsoNormal">More details and videos on the <a href="http://homecamp.org.uk" target="_blank">blog</a>. <a href="http://homecamp.pbwiki.com/" target="_blank">The wiki, which includes signup</a>, is the main portal to all the online activity.</p>
<p>As James Governor notes <a href="http://www.redmonk.com/jgovernor/2009/04/24/homecamp-returns/" target="_blank">here</a>:</p>
<blockquote><p><span lang="EN-GB">there has been a huge amount of code and applications released focused purely on using technology for home energy monitoring and automation. We have an active Google group and quite a few videos and content showcasing the various applications and hardware currently being used by geeks to save money and live greener.</span></p></blockquote>
<p><span lang="EN-GB">Now the challenge is to see how this seedling home energy management movement can really grow into widely adopted distributed sustainability solutions that everyone can use, and participate in.</span></p>
<p>Both <a href="http://www.yellowpark.net/cdalby/index.php/about/" target="_blank">Chris Dalby</a> (<a href="http://www.yellowpark.net/cdalby/index.php/2009/04/23/homecamp-2-is-this-saturday/" target="_blank">see here)</a>, <a href="http://andypiper.wordpress.com/2009/04/24/home-camp-mark-2/" target="_blank">Andy Piper</a>, James Governor of <a href="http://www.redmonk.com/jgovernor/" target="_blank">Monkchips</a> (<a href="http://www.redmonk.com/jgovernor/2009/04/24/homecamp-returns/" target="_blank">see here</a>), and Tom Raftery of <a href="http://greenmonk.net/" target="_blank">GreenMonk</a> (<a href="http://greenmonk.net/homecamp-ii/" target="_blank">see here</a>), have posted on tomorrow&#8217;s <a href="http://homecamp.pbwiki.com/" target="_blank">HomeCamp</a> event. So I am just going to add some quick notes, especially to highlight some of what will be going on virtually for those of you, like me, who can&#8217;t make it to London.</p>
<p>You can tune in either on the live video Ustream, or sign up on <a href="http://reactiongrid.com/">ReactionGrid </a>and join the <a href="http://homecamp.pbwiki.com/Virtual-Home-Camp">OpenSim event</a>. Also, you can keep up with what is happening on Twitter via #homecamp. I highly recommend that you catch Tom Raftery&#8217;s talk, which will be streamed live from Spain into the London meeting, the OpenSim event on ReactionGrid, and Ustream. Tom Raftery, a leading green technology analyst at <a href="http://redmonk.com/" target="_blank">RedMonk</a> <a href="http://greenmonk.net/" target="_blank">(see also GreenMonk</a>), will be picking up, in depth, on some themes raised in his brilliant ETech 2009 presentation, <a href="http://en.oreilly.com/et2009/public/schedule/detail/5655" target="_blank">&#8220;Electricity 2.0: Applying the Lessons of the Web to Our Energy Networks.&#8221;</a></p>
<p class="MsoNormal"><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/04/tweetawatt.jpg"><img class="alignnone size-medium wp-image-3425" title="tweetawatt" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/04/tweetawatt-300x162.jpg" alt="tweetawatt" width="300" height="162" /></a></p>
<p class="MsoNormal">There will be homecampers dropping in to virtual homecamp in ReactionGrid throughout the day, including <a href="http://blogs.ipona.com/chris/" target="_blank">Chris Hart (the awesome &#8220;girl-geek&#8221; @dstrawberrygirl)</a>, <a href="http://mikethebee.mevio.com/" target="_blank">MiketheBee</a>, and <a href="http://www.cminion.com/wordpress/" target="_blank">Cminion</a>, who has a number of cool projects to demo, including <a href="http://www.cminion.com/wordpress/?p=43" target="_blank">his energy turbines</a>. <a href="http://www.gomaya.com/glyph/" target="_blank">Dave Pentecost</a> (pictured above with his <a href="http://twitter.com/tweetawatt" target="_blank">Tweetawatt</a> and <a href="http://www.pachube.com/" target="_blank">Pachube</a> Orb) and I (<a href="http://docs.google.com/Presentation?id=dhj5mk2g_214g48q37hj" target="_blank">see our presentation for EarthWeek SL here</a>) plan to be at Virtual Homecamp on ReactionGrid between 9am and 10.30am EST. Dave has done a number of cool energy monitoring hacks including a <a href="http://www.pachube.com/" target="_blank">Pachube</a> link to and from <a href="http://opensimulator.org/wiki/Main_Page" target="_blank">OpenSim</a>.</p>
<p><span class="title">Also keep your eye on Dave&#8217;s blog, <a href="http://www.gomaya.com/glyph/" target="_blank">The Daily Glyph</a>, for what&#8217;s new in distributed sustainability. Dave just posted some great links on Sustainable Interaction design</span> and work by ITP researchers and others in sustainable use of technology.</p>
<p><a title="Sustainable Interaction | Main / Papers" href="http://itp.nyu.edu/sustainability/interaction/Main/Papers">Sustainable Interaction | Main / Papers</a></p>
<p><a title="Sustainable interaction design | Sustainable Minds" href="http://www.sustainableminds.com/category/categories/sustainable-interaction-design">Sustainable interaction design | Sustainable Minds</a></p>
<p><a title="Design For the Other 90% | Cooper-Hewitt, National Design Museum" href="http://other90.cooperhewitt.org/">Design For the Other 90% | Cooper-Hewitt, National Design Museum</a></p>
<p class="MsoNormal">If you are in London, look out for Oliver Goh of <a href="http://www.shaspa.com/" target="_blank">Shaspa</a> as Oliver will be at Homecamp in London. As I mentioned in <a href="http://www.ugotrade.com/2009/04/19/sensor-networks-and-sustainability-connecting-real-virtual-mobile-and-augmented-reality/" target="_blank">my previous post</a>, Oliver will soon be launching both Shaspa community and enterprise hardware and software packages for &#8220;Intelligent Energy Management.&#8221;</p>
<p class="MsoNormal"><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/04/picture-35.png"><img class="alignnone size-medium wp-image-3428" title="picture-35" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/04/picture-35-300x229.png" alt="picture-35" width="300" height="229" /></a></p>
<p>For a bit of homecamp history, James Governor (picture below from <a href="http://chinposin.com/home/monkchips" target="_blank">Chinposin)</a>, recaps some of the successes of the first HomeCamp <a href="http://www.redmonk.com/jgovernor/2009/04/24/homecamp-returns/" target="_blank">here</a>.</p>
<p>And last but not least, a big thanks to sponsors <a href="http://currentcost.co.uk/">CurrentCost</a>, <a href="http://greenmonk.net/">Greenmonk</a>, <a href="http://www.pachube.com/">Pachube</a>, <a href="http://www.onzo.co.uk/" target="_blank">Onzo</a>, and <a href="http://reactiongrid.com/">ReactionGrid</a>, and media partner <a href="http://theattick.tv/" target="_blank">theattick.tv</a>, who are making the London and virtual homecamp events possible.</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/04/picture-33.png"><img class="alignnone size-medium wp-image-3426" title="picture-33" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/04/picture-33-294x300.png" alt="picture-33" width="294" height="300" /></a></p>
]]></content:encoded>
			<wfw:commentRss>http://www.ugotrade.com/2009/04/24/homecamp-2-home-energy-management-and-distributed-sustainability/feed/</wfw:commentRss>
		<slash:comments>2</slash:comments>
		</item>
		<item>
		<title>Sensor Networks and Sustainability: &#8220;Connecting Real, Virtual, Mobile and Augmented Spaces&#8221;</title>
		<link>http://www.ugotrade.com/2009/04/19/sensor-networks-and-sustainability-connecting-real-virtual-mobile-and-augmented-reality/</link>
		<comments>http://www.ugotrade.com/2009/04/19/sensor-networks-and-sustainability-connecting-real-virtual-mobile-and-augmented-reality/#comments</comments>
		<pubDate>Sun, 19 Apr 2009 06:32:59 +0000</pubDate>
		<dc:creator><![CDATA[Tish Shute]]></dc:creator>
				<category><![CDATA[3D internet]]></category>
		<category><![CDATA[Ambient Devices]]></category>
		<category><![CDATA[Ambient Displays]]></category>
		<category><![CDATA[architecture of participation]]></category>
		<category><![CDATA[Augmented Reality]]></category>
		<category><![CDATA[Carbon Footprint Reduction]]></category>
		<category><![CDATA[CurrentCost]]></category>
		<category><![CDATA[Ecological Intelligence]]></category>
		<category><![CDATA[Energy Awareness]]></category>
		<category><![CDATA[Energy Saving]]></category>
		<category><![CDATA[home automation]]></category>
		<category><![CDATA[home energy monitoring]]></category>
		<category><![CDATA[home energy monitors]]></category>
		<category><![CDATA[HomeCamp]]></category>
		<category><![CDATA[Instrumenting the World]]></category>
		<category><![CDATA[internet of things]]></category>
		<category><![CDATA[message brokers and sensors]]></category>
		<category><![CDATA[Mixed Reality]]></category>
		<category><![CDATA[mobile meets social]]></category>
		<category><![CDATA[MQTT and RSMB]]></category>
		<category><![CDATA[new urbanism]]></category>
		<category><![CDATA[OpenSim]]></category>
		<category><![CDATA[Paticipatory Culture]]></category>
		<category><![CDATA[realXtend]]></category>
		<category><![CDATA[smart appliances]]></category>
		<category><![CDATA[Smart Devices]]></category>
		<category><![CDATA[Smart Planet]]></category>
		<category><![CDATA[sustainable living]]></category>
		<category><![CDATA[sustainable mobility]]></category>
		<category><![CDATA[ubiquitous computing]]></category>
		<category><![CDATA[Virtual HomeCamp]]></category>
		<category><![CDATA[Virtual Meters]]></category>
		<category><![CDATA[Virtual Realities]]></category>
		<category><![CDATA[Web Meets World]]></category>
		<category><![CDATA[AMEE]]></category>
		<category><![CDATA[arduino]]></category>
		<category><![CDATA[Carbon Goggles]]></category>
		<category><![CDATA[distributed sustainability]]></category>
		<category><![CDATA[home energy management]]></category>
		<category><![CDATA[open data]]></category>
		<category><![CDATA[Pachube]]></category>
		<category><![CDATA[sensor networks]]></category>
		<category><![CDATA[sensor networks and sustainability]]></category>
		<category><![CDATA[SHASPA]]></category>
		<category><![CDATA[the internet of things]]></category>
		<category><![CDATA[TweetaWatt]]></category>
		<category><![CDATA[Virtual Worlds]]></category>

		<guid isPermaLink="false">http://www.ugotrade.com/?p=3381</guid>
		<description><![CDATA[Today, I did a presentation, on connecting real, virtual, mobile, and augmented spaces to support sustainability, for Earth Week SL, with Dave Pentecost and Jim Purbrick, who presented on Carbon Goggles. Dave and I focused on sensor networks, open data, Pachube, OpenSim, and sustainability from the perspective of &#8220;hack local, think global.&#8221; Dave and I will [&#8230;]]]></description>
				<content:encoded><![CDATA[<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/04/picture-21.png"><img class="alignnone size-medium wp-image-3382" title="picture-21" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/04/picture-21-300x225.png" alt="picture-21" width="300" height="225" /></a></p>
<p>Today, I did a presentation, on <a href="http://docs.google.com/Presentation?id=dhj5mk2g_214g48q37hj" target="_blank">connecting real, virtual, mobile, and augmented spaces to support sustainability,</a> for <a href="http://slearthweek.wordpress.com/2009/04/10/earth-week-press-release-see-schedule-also/" target="_blank">Earth Week SL</a>, with <a href="http://www.gomaya.com/glyph/" target="_blank">Dave Pentecost</a> and <a href="http://jimpurbrick.com/" target="_blank">Jim Purbrick</a>, who presented on <a href="http://carbongoggles.org/" target="_blank">Carbon Goggles</a>.</p>
<p>Dave and I focused on sensor networks, open data, <a href="http://www.pachube.com/" target="_blank">Pachube</a>, <a href="http://opensimulator.org/wiki/Main_Page" target="_blank">OpenSim,</a> and sustainability from the perspective of &#8220;hack local, think global.&#8221; Dave and I will be picking up on some of these themes of sensor networks and sustainability next week in our presentation with <a href="http://www.darleon.com/" target="_blank">Dimitri Darras</a> at ITP, NYU, April 24th, 6.30 pm to 8 pm &#8211; <a href="http://itp.nyu.edu/sigs/news/special-event-open-sim/" target="_blank">details here</a>. If you are in New York City, I hope to see you there.</p>
<p>We got some interesting insights into augmented reality from <a href="http://jimpurbrick.com/" target="_blank">Jim Purbrick</a>, whose <a href="http://carbongoggles.org/" target="_blank">Carbon Goggles</a> project prototypes how we can use augmented reality to read carbon identity and to combine well-organized, verified data from <a href="http://www.amee.com/" target="_blank">AMEE</a> &#8211; a neutral aggregation platform to measure the &#8220;carbon footprint&#8221; of everything on earth &#8211; with crowd-sourced tagging and linking.</p>
<h3>Shaspa &#8211; &#8220;the sensor network system that has it all&#8221;</h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/04/picture-22.png"><img class="alignnone size-medium wp-image-3391" title="picture-22" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/04/picture-22-300x224.png" alt="picture-22" width="300" height="224" /></a></p>
<p>We also discussed the recently launched <a href="http://www.shaspa.com/" target="_blank">Shaspa</a>. Shaspa&#8217;s energy management packages connect spaces &#8211; real, virtual, mobile and augmented. Shaspa has been blogged by <a href="http://www.maxping.org/business/real-life/virtual-management-of-energy-consumption-in-the-home.aspx/" target="_blank">Maxping</a> and <a href="http://www.virtualworldsnews.com/2009/04/shaspa-launches-home-energy-organizer-on-opensim.html" target="_blank">Virtual World News</a>, so you can read all about it, but the Shaspa device kit won&#8217;t be available until next week. Some key features of the Home Energy package are listed on the slide above. However, this evening, Dave Pentecost and I got a sneak preview of both the Shaspa community and enterprise hardware and software packages from Shaspa founder Oliver Goh. We were pretty impressed.</p>
<p><strong>Dave:</strong> &#8220;<strong>It&#8217;s the ultimate hackable device for energy management!&#8221;</strong></p>
<p><strong>Oliver:</strong> <strong>&#8220;Bring us any sensor device &#8211; with documentation, and within three days we will put a driver into Shaspa.&#8221;</strong></p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/04/daveandoliverpost.jpg"><img class="alignnone size-medium wp-image-3392" title="daveandoliverpost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/04/daveandoliverpost-300x178.jpg" alt="daveandoliverpost" width="300" height="178" /></a></p>
<p>Oliver is on the right and Dave on the left in the picture above. The picture below shows Shaspa in OpenSim. Oliver and I will be attending the <a href="http://www.3dtlc.com/"><span style="color: #810081;">3D Training, Learning and Collaboration</span></a> Conference in Washington, DC, next week.</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/04/picture-23.png"><img class="alignnone size-medium wp-image-3412" title="picture-23" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/04/picture-23-300x208.png" alt="picture-23" width="300" height="208" /></a></p>
<h3>Links</h3>
<p>Here are some of the links that came up in the presentation as many people asked for them to be published. Dave also has them on <a href="http://www.gomaya.com/glyph/archives/002520.html#002520" target="_blank">his blog</a>.</p>
<p>SLIDES on GOOGLE DOCS:<br />
<a title="Earth Week SL Presentation, April 18th, 2009 - Google Docs" href="http://docs.google.com/Presentation?id=dhj5mk2g_214g48q37hj">Earth Week SL Presentation, April 18th, 2009 &#8211; Google Docs</a></p>
<p><a href="http://www.ugotrade.com/2009/01/28/pachube-patching-the-planet-interview-with-usman-haque/" target="_blank">Pachube, sensor networks</a></p>
<p><a href="http://www.gomaya.com/glyph" target="_blank">Dave&#8217;s blog covering Maya archaeology, jungle ecology, and technology</a></p>
<p><a href="http://www.gomaya.com/glyph/archives/001914.html" target="_blank">Maya Frontier, Usumacinta River videos</a></p>
<p><a href="http://en.wikipedia.org/wiki/Collapse_(book)" target="_blank">Collapse</a></p>
<p><a href="http://arduino.cc/" target="_blank">Arduino microcontrollers</a></p>
<p><a href="http://community.pachube.com/tutorials" target="_blank">Pachube &#8211; tutorials</a></p>
<p><a href="http://apps.pachube.com/" target="_blank">Pachube Apps</a></p>
<p><a href="http://www.pachube.com/feeds/1284" target="_blank">Arduino-SL-Pachube data site</a></p>
<p><a href="http://www.pachube.com/feeds/1505" target="_blank">SL to Pachube site</a></p>
<p><a href="http://www.zachhoeken.com/connecting-to-the-world" target="_blank">Dave&#8217;s Danger Shield &#8211; Pachube  tutorial</a></p>
<p><a href="http://www.ladyada.net/make/tweetawatt/" target="_blank">TweetaWatt site (LadyAda)</a></p>
<p><a href="http://www.gomaya.com/glyph/archives/002505.html" target="_blank">Dave&#8217;s post on TweetaWatt to Opensim/SL</a></p>
<p><a href="http://peterquirk.wordpress.com/2008/12/22/tutorial-using-the-streamlined-tool-chain-for-importing-sketchup-models-into-realxtend-04/" target="_blank">Peter Quirk&#8217;s post on Importing Sketchup into RealXtend</a></p>
<p><a href="http://opensimulator.org/wiki/Main_Page" target="_blank">Opensim</a></p>
<p><a href="http://www.realxtend.org/" target="_blank">RealXtend</a></p>
<p><a href="http://reactiongrid.com/" target="_blank">ReactionGrid</a></p>
<p><a href="http://homecamp.pbwiki.com/" target="_blank">homecamp</a></p>
<p><a href="http://www.cminion.com/wordpress/" target="_blank">cminion &#8211; wind turbines in OpenSim</a></p>
<p><a href="http://mikethebee.mevio.com/" target="_blank">MiketheBee</a></p>
<p><a href="http://www.ugotrade.com/2009/01/17/is-it-%E2%80%9Comg-finally%E2%80%9D-for-augmented-reality-interview-with-robert-rice/" target="_blank">Is it &#8220;OMG finally&#8221; for Augmented Reality?</a></p>
<p><a href="http://www.ugotrade.com/2008/12/15/smart-planetinterview-with-andy-stanford-clark/" target="_blank">Smart Planet: Interview with Andy Stanford-Clark</a></p>
<p><a href="http://www.orangecone.com/" target="_blank">Orange Cone &#8211; Information Shadows and Things as Services</a></p>
]]></content:encoded>
			<wfw:commentRss>http://www.ugotrade.com/2009/04/19/sensor-networks-and-sustainability-connecting-real-virtual-mobile-and-augmented-reality/feed/</wfw:commentRss>
		<slash:comments>2</slash:comments>
		</item>
		<item>
		<title>&#8220;Do Well By Doing Good:&#8221; Talking Experience and Design in a Mobile World with Nathan Freitas and David Oliver</title>
		<link>http://www.ugotrade.com/2009/04/04/do-well-by-doing-good-talking-experience-and-design-in-a-mobile-world-with-nathan-freitas-and-david-oliver/</link>
		<comments>http://www.ugotrade.com/2009/04/04/do-well-by-doing-good-talking-experience-and-design-in-a-mobile-world-with-nathan-freitas-and-david-oliver/#comments</comments>
		<pubDate>Sat, 04 Apr 2009 06:05:18 +0000</pubDate>
		<dc:creator><![CDATA[Tish Shute]]></dc:creator>
				<category><![CDATA[Augmented Reality]]></category>
		<category><![CDATA[culture of participation]]></category>
		<category><![CDATA[digital public space]]></category>
		<category><![CDATA[home energy monitors]]></category>
		<category><![CDATA[Instrumenting the World]]></category>
		<category><![CDATA[internet of things]]></category>
		<category><![CDATA[Metarati]]></category>
		<category><![CDATA[mobile meets social]]></category>
		<category><![CDATA[Mobile Phones in Africa]]></category>
		<category><![CDATA[Mobile Technology]]></category>
		<category><![CDATA[new urbanism]]></category>
		<category><![CDATA[Paticipatory Culture]]></category>
		<category><![CDATA[Smart Devices]]></category>
		<category><![CDATA[Smart Planet]]></category>
		<category><![CDATA[ubiquitous computing]]></category>
		<category><![CDATA[Web 2.0]]></category>
		<category><![CDATA[Web Meets World]]></category>
		<category><![CDATA[albany's king geek]]></category>
		<category><![CDATA[andrew hoppin]]></category>
		<category><![CDATA[Android]]></category>
		<category><![CDATA[android APIs]]></category>
		<category><![CDATA[android market place]]></category>
		<category><![CDATA[android on HTC]]></category>
		<category><![CDATA[Bre Pettis]]></category>
		<category><![CDATA[Coovents]]></category>
		<category><![CDATA[crowd sourced]]></category>
		<category><![CDATA[david oliver]]></category>
		<category><![CDATA[geo report android]]></category>
		<category><![CDATA[geotagging]]></category>
		<category><![CDATA[government 2.0]]></category>
		<category><![CDATA[greporter]]></category>
		<category><![CDATA[information age volunteerism]]></category>
		<category><![CDATA[inkscape]]></category>
		<category><![CDATA[julian Bleeker]]></category>
		<category><![CDATA[location based services]]></category>
		<category><![CDATA[MeetMoi]]></category>
		<category><![CDATA[Mobile design]]></category>
		<category><![CDATA[mobile user experience design]]></category>
		<category><![CDATA[mobile voter]]></category>
		<category><![CDATA[nathan freitas]]></category>
		<category><![CDATA[NYC Resistor]]></category>
		<category><![CDATA[oliver coady]]></category>
		<category><![CDATA[Oliver+Coady]]></category>
		<category><![CDATA[open intents]]></category>
		<category><![CDATA[open source]]></category>
		<category><![CDATA[Peek]]></category>
		<category><![CDATA[tech president]]></category>
		<category><![CDATA[the extraordinaries]]></category>
		<category><![CDATA[Thingiverse]]></category>
		<category><![CDATA[viaplace]]></category>
		<category><![CDATA[Volunteerism in the information age]]></category>
		<category><![CDATA[widget based commerce]]></category>
		<category><![CDATA[xtify]]></category>

		<guid isPermaLink="false">http://www.ugotrade.com/?p=3356</guid>
		<description><![CDATA[Nathan Freitas holding a Peek with Oliver+Coady partner David Oliver talking to fans at New York Tech Meetup &#8211; Mobile Meets Social Volunteerism and participation in public life seem to come naturally to Nathan Freitas. Nathan is one of the leading innovators/developers in NYC in mobile strategy/design (for more on his Android development read on). [&#8230;]]]></description>
				<content:encoded><![CDATA[<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/04/nathafreitaswithpeek.jpg"><img class="alignnone size-medium wp-image-3357" title="nathafreitaswithpeek" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/04/nathafreitaswithpeek-300x199.jpg" alt="nathafreitaswithpeek" width="300" height="199" /></a></p>
<p><em>Nathan Freitas holding a <a href="http://www.getpeek.com/indexb.html" target="_blank">Peek</a> with <a href="http://olivercoady.com/" target="_blank">Oliver+Coady</a> partner David Oliver talking to fans at <a href="http://www.meetup.com/ny-tech/calendar/9466657/" target="_blank">New York Tech Meetup &#8211; Mobile Meets Social</a><br />
</em><br />
Volunteerism and participation in public life seem to come naturally to <a id="chzc" title="Nathan Freitas" href="http://openideals.com/" target="_blank">Nathan Freitas</a>. Nathan is one of NYC&#8217;s leading innovators and developers in mobile strategy and design (for more on his Android development, read on), and he is much in demand as a speaker who shows others how to realize their mobile experience and design dreams (for upcoming speaking engagements, see Nathan&#8217;s blog). Nathan has also spent much of the last ten years working on new ways for causes and non-profits to benefit from technology.</p>
<p>Most recently <a id="plcq" title="Nathan has started working part time for the NY Senate under, &quot;Albany's King Geek,&quot;" href="http://www.observer.com/2009/media/albany%E2%80%99s-king-geek" target="_blank">Nathan has started working part time for the NY Senate under, &#8220;Albany&#8217;s King Geek,&#8221;</a> the new CIO Andrew Hoppin:</p>
<p><strong>&#8220;The CIO team is organizing training sessions for senators and their staff on social networking platforms and how to pay attention to online feedback. Last week, they hired mobile specialist <span class="il">Nathan</span> <span class="il">Freitas</span> to create new phone applications that will allow citizens to get government news on the go.&#8221; </strong></p>
<p>Also, Nathan is currently a supporting engineer on <a href="http://www.theextraordinaries.org/" target="_blank">The Extraordinaries</a>, a smart phone application that explores territory &#8220;beyond the flattening tendency of online relationships&#8221; (see <a id="i6qw" title="this list from Andy Oram" href="http://www.praxagora.com/andyo/professional/government_participation_question.html" target="_blank">this list from Andy Oram</a> of questions on government participation). <a href="http://www.theextraordinaries.org/" target="_blank">The Extraordinaries</a> is Ben Rigby and Jacob Colker&#8217;s prize-winning project &#8211; &#8220;a smartphone application that delivers volunteer opportunities on-demand.&#8221;</p>
<p>Ben&#8217;s post, <a title="Information Age Volunteerism - Open Sourced! Crowdsourced!" href="http://techpresident.com/blog-entry/information-age-volunteerism-open-sourced-crowdsourced" target="_blank">Information Age Volunteerism &#8211; Open Sourced! Crowdsourced!</a> and its extensive comments give a detailed analysis and critique of this brilliant and creative new approach to volunteerism in the information age.</p>
<p>Nathan, in my view, is a great example of how to &#8220;do well by doing good.&#8221; And, I am particularly excited by the work Nathan and his partner in <a id="nwp6" title="Oliver+Coady" href="http://olivercoady.com/">Oliver+Coady,</a> David Oliver, are doing on Android, e.g., Nathan&#8217;s new <a id="jjed" title="gReporter - opensource, geotagging, media capture report client" href="http://openideals.com/greporter/" target="_blank">gReporter &#8211; opensource, geotagging, media capture report client</a> (you can <a id="ycbi" title="download the source here" href="http://github.com/natdefreitas/georeport-android/tree/master">download the source here</a>).</p>
<p>I first met Nathan when I interviewed him about <a id="kx4_" title="Cruxy" href="http://openideals.com/2009/03/11/cruxy/">Cruxy</a> in 2007 (see my post, <a href="http://www.ugotrade.com/2007/05/24/the-mixed-reality-metarati-at-destroy-tv-merging-art-commerce-politics-and-play/" target="_blank">The Mixed Reality Metarati and &#8220;Destroy TV:&#8221; Merging Art, Technology, Politics and Play</a>). Nathan recently announced that <a id="v9nm" title="&quot;the fat lady has just uploaded her last song,&quot;" href="http://openideals.com/2009/03/11/cruxy/">&#8220;the fat lady has just uploaded her last song.&#8221;</a> Cruxy was an innovative distributed music venture Nathan started with Jon Oakes. Although, as Nathan explains, Cruxy &#8220;never really broke through in the way we hoped,&#8221; it seems to have been a fertile garden for ideas that are coming of age in Oliver+Coady&#8217;s current mobile experience endeavors. As Nathan explains, &#8220;the world, including Apple and iTunes, has shifted to embrace some of the ideals we have always had &#8211; open formats, more ways to distribute and promote online, more avenues for niche content to be discovered and heard.&#8221; Cruxy&#8217;s technology platform, built by the incomparable Will Meyer:<br />
<strong><br />
&#8220;was a great success in my mind, being one of the first to fully embrace Amazon&#8217;s cloud and provide a widget-based commerce system that actually worked!&#8221;</strong></p>
<p>Nathan has a new company, Oliver+Coady. But Nathan told me that he feels he is over his &#8220;start up phase.&#8221;</p>
<p><strong>Nathan Freitas:</strong> I am just tired of the term &#8220;startup.&#8221; I&#8217;m more interested in being defined as a person than as a member of a corporation. I am also more interested in the idea of cooperatives, and have been working on this (<a id="un1g" title="see here for more on the New York Creative Cooperative" href="http://scratch.openideals.com/index.php/New_York_Creative_Cooperative" target="_blank">see here for more on the New York Creative Cooperative</a>).</p>
<p><strong>Tish Shute:</strong> You do a high percentage of non profit work. Are you still managing to keep the home fires burning in the economic downturn?</p>
<p><strong>Nathan Freitas:</strong> There is definitely profit to be made in non-profits, because even if you only get paid half of what you get for corporate work, it is worth it in terms of fulfillment, ego, respect, and general contribution back to the planet. However, I&#8217;ve also been investing time &amp; energy without pay into thinking about how causes can benefit from technology for over ten years. So it&#8217;s not something you decide to do one day and suddenly are successful at.</p>
<p><strong>Tish Shute:</strong> What are some of the highlights of your non-profit work recently?<br />
<strong><br />
Nathan</strong>: Well, <a id="nywz" title="The Extraordinaries" href="http://www.theextraordinaries.org/about.html" target="_blank">The Extraordinaries</a> project is definitely a highlight. It is focused on a whole new approach to volunteering, and winning first prize in the non-profit tech category at the <a href="http://wemedia.com/miami09/" target="_blank">WeMedia Conference</a> was great validation of the work. I am just a supporting engineer on the effort, which was founded by my good friend Ben Rigby (a longtime non-profit tech guy as well) and Jacob Colker.</p>
<p>Ben wrote this excellent book on mobile tech and organizing, <a id="lrfb" title="Mobilizing Generation 2.0" href="http://www.amazon.com/Mobilizing-Generation-2-0-Practical-Technologies/dp/0470227443" target="_blank">Mobilizing Generation 2.0</a>. He&#8217;s done a ton of mobile work with youth voters via his non-profit, <a id="u5yr" title="Mobile Voter" href="http://mobilevoter.org/about.html" target="_blank">Mobile Voter</a>.</p>
<p>The Extraordinaries is really taking all of our joint experience and putting it into a whole new system that is meant to go beyond generic email blasts that just ask you to &#8220;send a fax&#8221; or &#8220;send a link.&#8221; It gives people specific tasks they can accomplish on their phone, or in their local area using their phone.</p>
<p><strong>Tish: </strong>Did you do Twitter Vote Report with Ben too?</p>
<p><strong>Nathan:</strong> Oh, no, <a id="rkbs" title="Twitter Vote Report" href="http://twittervotereport.com/" target="_blank">Twitter Vote Report</a> was with a different group of folks&#8230;mostly east coast-based, organized by the <a id="z91u" title="TechPresident.com blog" href="http://techpresident.com/" target="_blank">TechPresident.com blog</a>. But Ben and I worked on SMS efforts for the 2004 election. We sent 40,000 messages out to SEIU labor members and MoveOn members&#8230; really the first time SMS was used in a wide-scale manner to help get out the vote on election day.</p>
<p><strong>Tish:</strong> Do you have a new mobilization project planned?</p>
<p><strong>Nathan:</strong> It&#8217;s all about The Extraordinaries right now. We&#8217;ve got a big launch coming in June, and are working actively to add more causes that can benefit from volunteers, and organizations that have volunteers but don&#8217;t know what to do with them.</p>
<p><strong>Tish:</strong> I was just looking at <a id="mg55" title="your post on Peek" href="http://openideals.com/?s=peek&amp;x=0&amp;y=0" target="_blank">your post on Peek</a> too.</p>
<p><strong>Nathan:</strong> Yeah&#8230; fortunately that is a completely &#8220;for profit&#8221; gig. But I like the company a lot, and think their spirit of providing access to email at a very low cost plays well with the non-profit world.</p>
<p><strong>Tish:</strong> So it isn&#8217;t just iPhone apps that are paying the bills?</p>
<p><strong>Nathan:</strong> Nope. iPhone is just an aspect. Everyone is so obsessed with it and how to strike it rich quick, but in the greater scheme of things, there is a huge ecosystem of mobility out there for you to find a niche in, if you are looking.</p>
<p><strong>Tish:</strong> Are you able to monetize your work on Android yet?</p>
<p><strong>Nathan:</strong> Here and there&#8230; releasing some for-pay apps soon, also including &#8220;free&#8221; Android ports in some high-profile iPhone apps we hope to have out soon. Some successful iPhone app developers are looking for people to port their apps to Android, as well.</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/04/georeporter.jpg"><img class="alignnone size-medium wp-image-3358" title="georeporter" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/04/georeporter-145x300.jpg" alt="georeporter" width="145" height="300" /></a></p>
<p><a id="jjed" title="gReporter - opensource, geotagging, media capture report client" href="http://openideals.com/greporter/" target="_blank">gReporter &#8211; opensource, geotagging, media capture report client</a></p>
<p><strong>Tish: </strong>So what are your hopes for Android development in general and your gReporter app in particular?</p>
<p><strong>Nathan:</strong> I think Android represents right now what Linux on desktops did in &#8217;99 or &#8217;00. Though as we all know, cycles of technology seem to speed up. There is huge interest in it at the academic level, and there is also a genuine interest in its use by non-profit/development agencies working around the globe.</p>
<p>You have to jump through hoops to get an unlocked, open iPhone w/o contract. Android provides an alternative solution to this, that acts more like a true platform, and not just a consumer product.</p>
<p><strong>Tish:</strong> At the moment the Android market place is only for free apps right?</p>
<p><strong>Nathan:</strong> No, it now supports paid apps. I just bought one today for $2.99.</p>
<p><strong>Tish:</strong> What did you buy?</p>
<p><strong>Nathan:</strong> An app that allows me to turn my G1 phone into a WiFi hotspot sharing my 3G connection to anyone who connects.</p>
<p><strong>Tish:</strong> So what are the most important aspects of Android in your view?</p>
<p><strong>Nathan:</strong> There are two sites that help demonstrate what is really going on with Android that makes it significant:</p>
<p>1) <a id="jr_o" title="Open Intents" href="http://www.openintents.org/en/intentstable" target="_blank">Open Intents</a> &#8211; this is the ecosystem of developers, all creating services and apps that interoperate, share data, and generally build a very rich Microsoft-style platform &#8211; except all of these are open source and built by lots of small developers, not one big corporation.</p>
<p>2) <a id="zdqw" title="Android on HTC" href="http://www.androidonhtc.com/" target="_blank">Android on HTC</a> &#8211; this is the home for all the efforts to port Android to pre-existing HTC/XDA mobile phone hardware. You can see the status of the ports at <a href="http://wiki.xda-developers.com/index.php?pagename=Android_devices" target="_blank">http://wiki.xda-developers.com/index.php?pagename=Android_devices</a>. Imagine&#8230; taking an old Windows Mobile HTC phone and popping in an SD card that reformats it into a brand new Android phone! For much of Asia, India and Africa, there is huge interest in this.</p>
<p><strong>Tish:</strong> Nice! You mentioned earlier that you are thinking of doing an SDK for the Android sensor APIs?</p>
<p><strong>Nathan: </strong>That would be part of the geo report app&#8230; expanding it to capture all sensing data and report that when you submit your text, photo or audio report. Right now it just detects your lat and lon, but there&#8217;s no reason it couldn&#8217;t also check your compass, altitude and whatever other data the device might offer.</p>
<p><strong>Tish</strong>: So what will your geo report do now?</p>
<p><strong>Nathan:</strong> It allows you to submit a text, photo or audio report, tagged with geo coordinates, timestamp, and basic user info (name, email, home location, etc.) to whatever server it is configured to use. It is the latest release of the code used for the TwitterVoteReport and InaugurationReport efforts.</p>
<p>There is also just a lot to learn or use from the code itself, which is available at <a href="http://github.com/natdefreitas/georeport-android" target="_blank">http://github.com/natdefreitas/georeport-android</a>.</p>
<p>Lots of little lessons learned, packaged up into a functioning application.</p>
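<p>To make that report format concrete, here is a minimal Python sketch of assembling such a geo-tagged submission. The field names and structure are illustrative assumptions for this post, not gReporter&#8217;s actual wire format (the real client is Android/Java; see the GitHub repository for the genuine code):</p>

```python
import json
import time

def build_report(text, lat, lon, name, email, home_location):
    # Bundle the fields Nathan describes: report content, geo
    # coordinates, a timestamp, and basic user info. Field names
    # here are hypothetical, for illustration only.
    return {
        "report": {"type": "text", "body": text},
        "geo": {"lat": lat, "lon": lon},
        "timestamp": int(time.time()),
        "user": {"name": name, "email": email, "home": home_location},
    }

report = build_report("Long line at the polling place", 40.7128, -74.0060,
                      "Tish", "tish@example.com", "New York, NY")
payload = json.dumps(report)
# A real client would then POST `payload` to whatever server it is
# configured to use (e.g. via urllib.request.urlopen).
```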
<p><strong>Tish:</strong> How many sensor APIs does android have?</p>
<p><strong>Nathan</strong>: <a href="http://developer.android.com/reference/android/hardware/SensorManager.html" target="_blank">http://developer.android.com/reference/android/hardware/SensorManager.html</a></p>
<p>int SENSOR_ACCELEROMETER A constant describing an accelerometer.<br />
int SENSOR_ALL A constant that includes all sensors<br />
int SENSOR_DELAY_FASTEST get sensor data as fast as possible<br />
int SENSOR_DELAY_GAME rate suitable for games<br />
int SENSOR_DELAY_NORMAL rate (default) suitable for screen orientation changes<br />
int SENSOR_DELAY_UI rate suitable for the user interface<br />
int SENSOR_LIGHT A constant describing an ambient light sensor Only the first value is defined for this sensor and it contains the ambient light measure in lux.<br />
int SENSOR_MAGNETIC_FIELD A constant describing a magnetic sensor See SensorListener for more details.<br />
int SENSOR_MAX Largest sensor ID<br />
int SENSOR_MIN Smallest sensor ID<br />
int SENSOR_ORIENTATION A constant describing an orientation sensor.<br />
int SENSOR_ORIENTATION_RAW A constant describing an orientation sensor.<br />
int SENSOR_PROXIMITY A constant describing a proximity sensor Only the first value is defined for this sensor and it contains the distance between the sensor and the object in meters (m)<br />
int SENSOR_STATUS_ACCURACY_HIGH This sensor is reporting data with maximum accuracy<br />
int SENSOR_STATUS_ACCURACY_LOW This sensor is reporting data with low accuracy, calibration with the environment is needed<br />
int SENSOR_STATUS_ACCURACY_MEDIUM This sensor is reporting data with an average level of accuracy, calibration with the environment may improve the readings<br />
int SENSOR_STATUS_UNRELIABLE The values returned by this sensor cannot be trusted, calibration is needed or the environment doesn&#8217;t allow readings<br />
int SENSOR_TEMPERATURE A constant describing a temperature sensor Only the first value is defined for this sensor and it contains the ambient temperature in degree centigrade.<br />
int SENSOR_TRICORDER A constant describing a Tricorder When this sensor is available and enabled, the device can be used as a fully functional Tricorder.<br />
float STANDARD_GRAVITY<br />
with a few easter eggs as well<br />
GRAVITY_DEATH_STAR_I<br />
SENSOR_TRICORDER<br />
 <img src="http://www.ugotrade.com/wordpress/wp-includes/images/smilies/icon_wink.gif" alt=";)" class="wp-smiley" /> </p>
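<p>For readers new to Android: those constants are consumed through SensorManager&#8217;s register-a-listener pattern &#8211; you say which sensors you care about and at what delay, and the framework calls you back with readings. Here is a rough Python model of that flow (the real API is Java on-device; the constant values and class names below are simplified for illustration, not the actual SDK values):</p>

```python
# Toy model of Android's SensorManager registration/callback pattern.
# Bitmask constant values are illustrative, not the real SDK values.
SENSOR_ACCELEROMETER = 1 << 1
SENSOR_ORIENTATION = 1 << 7
SENSOR_DELAY_NORMAL = 3  # "rate (default) suitable for screen orientation changes"

class ToySensorManager:
    def __init__(self):
        self.listeners = []

    def register_listener(self, callback, sensors, delay):
        # `sensors` is a bitmask built from the SENSOR_* constants above.
        self.listeners.append((callback, sensors, delay))

    def push_reading(self, sensor, values):
        # The framework invokes every listener registered for `sensor`.
        for callback, sensors, _delay in self.listeners:
            if sensors & sensor:
                callback(sensor, values)

readings = []
mgr = ToySensorManager()
mgr.register_listener(lambda s, v: readings.append((s, v)),
                      SENSOR_ACCELEROMETER, SENSOR_DELAY_NORMAL)
mgr.push_reading(SENSOR_ACCELEROMETER, (0.0, 0.0, 9.81))  # device at rest
mgr.push_reading(SENSOR_ORIENTATION, (90.0, 0.0, 0.0))    # no listener; dropped
```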
<p><strong>Nathan</strong>: They are all in the API; however, there isn&#8217;t hardware to support all of them yet&#8230; for instance, TEMPERATURE is not yet supported, nor is LIGHT.<br />
<strong><br />
Tish:</strong> And errr&#8230; what is GRAVITY_DEATH_STAR_I?</p>
<p><strong>Nathan: </strong>It is a value representing the fictional gravity on the Death Star from Star Wars &#8211; geek humour.<br />
<strong><br />
Tish: </strong>That makes me think of <a id="t8:v" title="this great essay by Julian Bleeker, Design Fiction: A Short Essay on Design Science, Fact and Fiction" href="http://www.nearfuturelaboratory.com/2009/03/17/design-fiction-a-short-essay-on-design-science-fact-and-fiction/" target="_blank">this great essay by Julian Bleeker, Design Fiction: A Short Essay on Design Science, Fact and Fiction</a>:</p>
<p><strong>&#8220;When you trace the knots that link science, fact and fiction you see the fascinating crosstalk between and amongst ideas and their materialization. In the tracing you see the simultaneous knowledge-making activities, speculating and pondering and realizing that things are made only by force of the imagination. In the midst of the tangle, one begins to see that fact and fiction are productively indistinguishable.&#8221;</strong><em><br />
</em><br />
The picture below shows Nathan playing his dream ukulele &#8211; designed using the free, open-source <a href="http://www.inkscape.org/">Inkscape</a> vector drawing tool (see his <a href="http://www.thingiverse.com/thing:299">open-source ukulele plans here</a>).<br />
See <a id="dqj2" title="Nathan's blog for the whole story" href="http://openideals.com/2009/03/27/open-source-ukulele-proto-uno-lazzzzored-ftw/" target="_blank">Nathan&#8217;s blog for the whole story</a> of how the Flying V Rockin&#8217; Ukulele design he posted to <a href="http://thingiverse.com/">Thingiverse</a> a few weeks ago, after being inspired by <a href="http://twitter.com/bre">Bre Pettis&#8217;</a> talk at ROFLThang, materialized at <a href="http://nycresistor.com/">NYC Resistor</a>, the &#8220;amazing workshop laboratory in Brooklyn where they let anyone come over and hang out at, to learn how to make, build and fabricate pretty much anything. They also have a <a href="http://www.nycresistor.com/laser/">laser</a> (aka &#8220;LAAAZZZOOOR&#8221;) which you can think of as an automagic thing cutter-outer!&#8221;</p>
<p>so this&#8230;.</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/04/lazoorukele.jpg"><img class="alignnone size-medium wp-image-3359" title="lazoorukele" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/04/lazoorukele-300x164.jpg" alt="lazoorukele" width="300" height="164" /></a></p>
<p>became this &#8230;</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/04/nathanfreitasplayingukele.jpg"><img class="alignnone size-full wp-image-3360" title="nathanfreitasplayingukele" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/04/nathanfreitasplayingukele.jpg" alt="nathanfreitasplayingukele" width="240" height="180" /></a></p>
<p>Nathan and David presented <a id="oofs" title="Coovents" href="http://www.coovents.com/" target="_blank">Coovents</a> at NYTM &#8211; Mobile Meets Social. They had a large group of questioners surrounding them (see picture below). I talked to David after the presentation.</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/04/new-yorktechmeetup.jpg"><img class="alignnone size-medium wp-image-3361" title="new-yorktechmeetup" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/04/new-yorktechmeetup-300x199.jpg" alt="new-yorktechmeetup" width="300" height="199" /></a></p>
<p>David Oliver was a software architect, user experience designer and product manager in the areas of mobile/wireless and electronic payment at IBM for over a decade. Most recently, he led the effort to productize a mobile client for IBM&#8217;s Lotus Connections enterprise social networking suite. As a software architect, David was often technical lead for IBM&#8217;s business partner relationships with mobile device manufacturers. Prior to IBM, David was co-founder of the Internet&#8217;s first &#8220;micropayments&#8221; company, Clickshare.</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/04/david-oliver.jpg"><img class="alignnone size-medium wp-image-3362" title="david-oliver" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/04/david-oliver-227x300.jpg" alt="david-oliver" width="227" height="300" /></a></p>
<h3>Talking with David Oliver</h3>
<p><strong>Tish Shute: </strong>How are smart phones causing us to rethink what networked online relationships are all about?</p>
<p><strong>David Oliver: </strong>You know these [mobile] devices are&#8230; for a long time we tried to pitch that we&#8217;re going to treat them like they&#8217;re PCs, or they&#8217;re just like anything else. But they&#8217;re really not. It may be the same coding style, but the way you think about using them is entirely different. And the way you think about your program. So if you use HTML, Java and that kind of stuff, yes, it&#8217;s the same code type, but the way you think about it is entirely different. And to me these little devices make what you said [<em><strong>relationships</strong></em> <em><strong>inherently about who YOU are, WHERE you are, WHAT you are doing, WHAT is around you, etc.</strong></em>] a lot more possible than a PC. Because with a PC you almost have to sit in front of it, and it controls you. But the device is so little, and there&#8217;s almost no user interface by comparison. You have to be very smart about how you build something so that it&#8217;s almost invisible. And of course that&#8217;s the beauty of the iPhone, Apple will tell you. The idea of ubiquitous computing. Ubiquitous what? Am I really computing? I don&#8217;t feel like I&#8217;m computing. I feel like I&#8217;m interacting or something.</p>
<p>I think Twitter is very cool. The real way it&#8217;s cool is that there&#8217;s no required client. You can access Twitter any way you want. You can imagine other ways to use it. TweetDeck happens to be a nice one for now. What I like about Twitter is, if you give it a tiny bit of thought, the Twitter network&#8217;s complete white noise, just like the internet itself. If you put a probe on the internet it&#8217;s all white noise, it&#8217;s all unordered packets. It makes no sense. So it&#8217;s cool that Twitter is at the level of little bitty conversations, but collectively all white noise. Totally meaningless white noise. There are some neat things going on, but I think we&#8217;ve barely seen the first of what you can do with Twitter.</p>
<p>The way I see it, it&#8217;s like instant messaging where you don&#8217;t instant message someone, you instant message the network, and there are listeners. So normally in the old world of IM, like AOL IM, I would say, &#8220;Tish, let&#8217;s talk,&#8221; and I kind of grab you. Then it&#8217;s a narrow pipe, you to me. You can add a few people in and make a little group, and that makes a bit of a closed network. But with Twitter you just talk into the air, as if I were standing over there and you had a Twitter client here &#8211; we could have the same interview. Because I would be watching you: oh, I see Tish&#8217;s question. I&#8217;d be over there talking and you&#8217;d be picking me up over here. It&#8217;s like you&#8217;re talking into white noise, like at this bar. You choose to hear me; this guy is not choosing to hear me right now.</p>
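<p>David&#8217;s &#8220;talk into the air, listeners choose to hear you&#8221; picture is, in programming terms, publish/subscribe rather than point-to-point messaging. A minimal sketch of the idea (the names here are illustrative, not any real Twitter API):</p>

```python
from collections import defaultdict

class OpenChannel:
    """Tiny publish/subscribe hub: you broadcast to the network,
    not to a person, and only those who chose to listen hear it."""
    def __init__(self):
        self.listeners = defaultdict(list)

    def follow(self, who, callback):
        self.listeners[who].append(callback)

    def broadcast(self, who, message):
        # Everyone following `who` picks the message up; to the
        # rest of the network it is just white noise.
        for callback in self.listeners[who]:
            callback(message)

heard = []
channel = OpenChannel()
channel.follow("tish", heard.append)              # choosing to hear Tish
channel.broadcast("tish", "let's do the interview here")
channel.broadcast("guy_at_bar", "not followed")   # white noise to us
```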
<p><strong>Tish Shute:</strong> So what does Android bring to the party?</p>
<p><strong>David Oliver:</strong> They have the notion that you have a telephone platform that&#8217;s open, and that everybody can use. And it&#8217;s got a variety of sensor data &#8211; not just location but also accelerometer and compass and more. So in theory you can almost broadcast that data. It&#8217;s connected to a network. There are easy, open APIs to get at that data. But the question is who are you going to broadcast it to, or who are you sending it to? What are they going to do with it? How are you going to control it, and make sure people don&#8217;t misuse it? As you heard with the services tonight, there&#8217;s a central kind of service necessary to filter and rebroadcast that stuff back out to places that need it, or can use it, or you want to have use it. I think the mobile device is only one piece of this. Nat and I always talk about how we do mobile applications, but a portion of it is on the server, and coordinating with the people or the group or the central resource that brings all this data together.</p>
<p><strong>Tish Shute: </strong>There seem to be a lot of new location-based services &#8211; platforms to aggregate location-based data &#8211; being developed (e.g. <a id="lm5o" title="xtify" href="http://www.xtify.com/" target="_blank">xtify</a> and <a id="algg" title="viaplace" href="http://www.viaplace.com/" target="_blank">viaplace</a>). What do you think about the direction this development is going in?</p>
<p><strong>David Oliver:</strong> It&#8217;s not conventional wisdom, but it&#8217;s one of these things where when a crowd of people does something &#8211; and that means people themselves are the service providers &#8211; when they all get together, the net effect is greater than the individual effect would be. Pooling together makes more sense than doing it individually. It&#8217;s a little bit like an advanced version of having to have a password for every single site and managing your passwords. Location is the same way. If you had to give every single website that you enjoyed your location data, or tell them how to get it, what a huge pain. So they&#8217;re offering a way to do that in a more general sense. There are humongous privacy issues, though. Just like passwords: would you really trust a place that held all your passwords centrally?</p>
<p>Even with the most basic level of calling &#8211; now that you can call from anywhere &#8211; largely people are getting into a mode where their mobile phone is them. It&#8217;s always with them. That&#8217;s how you reach me. Forget the home phone, the work phone; it&#8217;s just a mobile phone. You have an address attached to you, an address I can reach you at that&#8217;s location independent. So there&#8217;s some beauty in that, and it&#8217;s very freeing. It makes your location unimportant: you can call me anywhere. You can text me anywhere, message me anywhere. You can be anonymous. My son told me something recently. &#8220;I love going to New York City because I can just walk around and nobody knows me. I&#8217;m completely anonymous. That&#8217;s the coolest thing,&#8221; he says. At one level that is a good thing, and a lot of good things can happen that way. But this new thing is sort of the flip side, where everybody knows your location. And we haven&#8217;t figured out if that&#8217;s a good thing yet. But we&#8217;re in the throes of that whole changeover happening. And we&#8217;ll see. There&#8217;ll be some misuse. I&#8217;m not an advertising guy, so the fact that everything&#8217;s got to be ad supported makes it potentially very creepy and very dangerous. So we&#8217;ll see how that evolves.</p>
<p>Is there any model where you can go, &#8220;Oh, this is just like &#8216;S&#8217;&#8221;? I don&#8217;t see where that&#8217;s possible. It&#8217;s a new world, where you&#8217;re exposed all the time, potentially. And how do you figure out, either as an individual or a larger group, society or whatever, when that works and when that doesn&#8217;t? And you know there are going to be some missteps, probably. But the tangibility creates some of these interesting opportunities; there are just some amazing things that could happen, really, really good things. But we&#8217;re not going to get there in one step.</p>
<p>One of the things that was really a killer for privacy, and in some ways a killer for the internet, happened during the dot com bust. Prior to the bust, there were web sites where you&#8217;d given your name and email, and they said, &#8220;we promise to preserve this privacy.&#8221; But as soon as those companies went bankrupt, their email list was gold. It was value. And a bankruptcy judge, in a court in Delaware, created a legal basis to sell that data. Those things that were formerly private were no longer private &#8211; &#8220;no no no, that&#8217;s got value. I&#8217;m going to sell it so the shareholders get their money.&#8221; So for all these web sites that had lists of user names they promised were private, that information became public. That was one of the biggest blows to privacy in the history of the internet. That&#8217;s going to happen again and again. Like if <a href="http://www.meetmoi.com/welcome" target="_blank">MeetMoi</a> goes out of business, the likelihood is all your shit&#8217;s going to get sold. I&#8217;m sorry, it&#8217;s all going to be sold. It&#8217;s all a big joke. And that&#8217;s why central services are horrid, and I don&#8217;t like anything about a central service.</p>
<p>There are some pragmatic things about the way routing on networks actually works, and the fact that the internet has gotten very centralized itself. The core idea of the early internet was essentially a survivable telecommunications network &#8211; remember, it was the Defense Department that did the original internet? So the original idea was survivability. The Russians could bomb the daylights out of the United States, the territorial U.S., and we would still have a survivable network. That was the idea. And therefore all the nodes were dispersed, did not count on each other, and could reroute. Well, now one company, UUNET or whatever they are, owns the whole thing. And you can look up all their locations in some internet database. Eighteen well-placed bombs and the whole internet goes down. That&#8217;s what happens over time.</p>
<p>Well, the whole cloud thing is also kind of a myth. It&#8217;s a very neat sounding term, and some aspects of it are different and new. Nate and I do a lot of cloud computing; it&#8217;s all on Amazon.</p>
<p>But we&#8217;ve always had that. That&#8217;s called time sharing. Strictly speaking, cloud computing is a thin contract accompanied by a much, much easier application programming interface. It&#8217;s a very skinny contract. Timesharing was a huge contract. Literally, it&#8217;s legalese and a little bit of API ease. It&#8217;s just timesharing. But at Amazon, and the other ones too, you&#8217;re not responsible for your node going down. If it goes down, they push it somewhere else automatically. Your disk goes down? You&#8217;re not responsible for backing up your disk; it&#8217;s already on 14 copies on 8 continents. They do that. So it&#8217;s a higher level of service. Nate and I have this thing called Slicehost. And we&#8217;ll probably build some services on it, and if they get popular, it&#8217;s like a vending machine. You just drop in a dime, they give you another slice. No contract at all. It is growth and learning about old ideas. Like this whole idea of software as a service: the company called ADP, Automatic Data Processing, basically, in short, does payroll for everybody. It&#8217;s software as a service. It&#8217;s been going on since 1952 or something. It&#8217;s more like a reconception using modern tools. Virtual worlds, though, are a different thing. That&#8217;s a whole different beast.</p>
]]></content:encoded>
			<wfw:commentRss>http://www.ugotrade.com/2009/04/04/do-well-by-doing-good-talking-experience-and-design-in-a-mobile-world-with-nathan-freitas-and-david-oliver/feed/</wfw:commentRss>
		<slash:comments>4</slash:comments>
		</item>
		<item>
		<title>People Meet People Meet Big Data: ScienceSim Explores Collaborative High Performance Computing</title>
		<link>http://www.ugotrade.com/2009/02/11/people-meet-people-meet-big-data-sciencesim-explores-collaborative-high-performance-computing/</link>
		<comments>http://www.ugotrade.com/2009/02/11/people-meet-people-meet-big-data-sciencesim-explores-collaborative-high-performance-computing/#comments</comments>
		<pubDate>Wed, 11 Feb 2009 22:40:02 +0000</pubDate>
		<dc:creator><![CDATA[Tish Shute]]></dc:creator>
				<category><![CDATA[architecture of participation]]></category>
		<category><![CDATA[Augmented Reality]]></category>
		<category><![CDATA[Carbon Footprint Reduction]]></category>
		<category><![CDATA[culture of participation]]></category>
		<category><![CDATA[digital public space]]></category>
		<category><![CDATA[Ecological Intelligence]]></category>
		<category><![CDATA[Energy Awareness]]></category>
		<category><![CDATA[Instrumenting the World]]></category>
		<category><![CDATA[Intel in Virtual Worlds]]></category>
		<category><![CDATA[interoperability of virtual worlds]]></category>
		<category><![CDATA[Metaverse]]></category>
		<category><![CDATA[mirror worlds]]></category>
		<category><![CDATA[Mixed Reality]]></category>
		<category><![CDATA[nanotechnology]]></category>
		<category><![CDATA[Open Grid]]></category>
		<category><![CDATA[open metaverse]]></category>
		<category><![CDATA[open protocols for virtual worlds]]></category>
		<category><![CDATA[open source]]></category>
		<category><![CDATA[Open Source Virtual Worlds]]></category>
		<category><![CDATA[open standards for virtual worlds]]></category>
		<category><![CDATA[OpenSim]]></category>
		<category><![CDATA[Paticipatory Culture]]></category>
		<category><![CDATA[science outreach in virtual worlds]]></category>
		<category><![CDATA[scientific simulation in virtual worlds]]></category>
		<category><![CDATA[Smart Planet]]></category>
		<category><![CDATA[sustainable living]]></category>
		<category><![CDATA[Virtual Realities]]></category>
		<category><![CDATA[Virtual Worlds]]></category>
		<category><![CDATA[virtual worlds in Japan]]></category>
		<category><![CDATA[World 2.0]]></category>
		<category><![CDATA[big data]]></category>
		<category><![CDATA[collaboration and big data]]></category>
		<category><![CDATA[collaborative visualization]]></category>
		<category><![CDATA[haptic interfaces for virtual worlds]]></category>
		<category><![CDATA[Hypergrid]]></category>
		<category><![CDATA[linked data]]></category>
		<category><![CDATA[modelling complex systems]]></category>
		<category><![CDATA[n-body simulation]]></category>
		<category><![CDATA[Piet Hut]]></category>
		<category><![CDATA[rapid data movement in virtual worlds]]></category>
		<category><![CDATA[ScienceSim]]></category>
		<category><![CDATA[scientific simulation]]></category>
		<category><![CDATA[steering big data simulations from virtual worlds]]></category>
		<category><![CDATA[steering virtual worlds with brain waves]]></category>
		<category><![CDATA[super computing conference]]></category>
		<category><![CDATA[supercomputing]]></category>
		<category><![CDATA[Wilf Pinfold]]></category>

		<guid isPermaLink="false">http://www.ugotrade.com/?p=2855</guid>
		<description><![CDATA[Wilfred Pinfold, Director, Extreme Scale Programs for Intel, and the Supercomputing Conference general chair, is working with some Intel colleagues to make a project called ScienceSim the centerpiece of a special workshop event at the SC09 conference (see Supercomputing Conference, an ACM and IEEE Computer society sponsored event). Recently, I interviewed Wilf Pinfold (see interview [&#8230;]]]></description>
				<content:encoded><![CDATA[<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/02/gwave_lg.jpg"><img class="alignnone size-full wp-image-2861" title="gwave_lg" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/02/gwave_lg.jpg" alt="gwave_lg" width="540" height="540" /></a></p>
<p>Wilfred Pinfold, Director, Extreme Scale Programs for Intel, and the <em><a href="http://sc08.supercomputing.org/">Supercomputing Conference</a></em> general chair, is working with some Intel colleagues to make a project called <a href="http://www.sciencesim.com/">ScienceSim</a> the centerpiece of a special workshop event at the SC09 conference (<em>see the </em><em><a href="http://sc08.supercomputing.org/">Supercomputing Conference</a>, an ACM and IEEE Computer Society sponsored event)</em>.</p>
<p>Recently, I interviewed Wilf Pinfold (see interview below), Mic Bowman (also <a href="../../2008/09/15/interview-with-mic-bowman-intel-the-future-of-virtual-worlds/">see my previous interview here</a>), and John A. Hengeveld (see interview below). I wanted to find out: what are the underlying goals of this SC conference program? Why are members of the SC community being encouraged to participate with the ScienceSim environment? What projects are beginning to emerge? And what are Intel&#8217;s goals in giving infrastructure support to further the conversation between high performance computing and collaborative virtual worlds?</p>
<p>The vision of creating new ways to collaborate and interact with big data does seem to be one of the more significant steps we can take at a time when we find many of our most complex systems roiling and threatening total collapse. As Tim O&#8217;Reilly has pointed out &#8211; from financial markets to the climate, the complex systems we depend on for our survival seem to be reaching their limits.</p>
<p>But how can we get from the place we are now &#8211; <a href="http://www.youtube.com/watch?gl=GB&amp;hl=en-GB&amp;v=gM4fmL6dLdY" target="_blank">see this example of an n-body simulation in OpenSim</a> &#8211; to the point where we can collaboratively steer, from our visualizations, big data simulations of climate change, financial markets, or the depths of the universe? The picture opening this post is a:</p>
<blockquote><p><em>Frame from a 3D simulation of gravitational waves produced by merging black holes, representing the largest astrophysical calculation ever performed on a NASA supercomputer. The honeycomb structures are the contours of the strong gravitational field near the black holes. Credit: C. Henze, NASA</em></p></blockquote>
<p>Wilf Pinfold explained to me that part of the reason to begin a dialogue on collaborative visualization at SC &#8217;09 is that supercomputing communities (which tend to be highly skilled and visionary) have played key roles in internet development in the past. Wilf pointed out that key browser technology developed out of these communities in the early days of the internet &#8211; see <a href="http://en.wikipedia.org/wiki/Mosaic_(web_browser)" target="_blank">this wikipedia entry</a> for background on the role of NCSA (the National Center for Supercomputing Applications).</p>
<p>The hope is that, while there are many obstacles to overcome, the supercomputing community has both the skills and the motivation to create collaborative environments capable of the kind of rapid data movement that scientific/big data visualization needs. Solving the problems of realtime collaborative interaction with big data will have many ramifications for the way we understand virtual reality, the metaverse, and virtual worlds (all these terms are becoming increasingly inadequate for cyberspace in the age of ubiquitous computing, an argument I will make in another post!).</p>
<p>There have already been a number of blogs on ScienceSim (see <a href="http://www.virtualworldsnews.com/2008/11/intel-creating-sciencesim-on-opensim.html" target="_blank">Virtual World News</a>, <a href="http://nwn.blogs.com/nwn/2009/02/intel-outside-.html" target="_blank">New World Notes</a>, <a href="http://www.vintfalken.com/intel-using-opensim-for-immersive-science-project/" target="_blank">Vint Falken</a>, and <a href="http://daneel-ariantho.blogspot.com/2009/02/sciencesim.html" target="_blank">Daneel Ariantho</a>). There have also been Intel blogs &#8211; <a href="http://blogs.intel.com/research/2009/01/sciencesim.php" target="_blank">see this post</a> by John A. Hengeveld (a senior business strategist working with Intel planners and researchers to accelerate the adoption of Immersive Connected Experiences). And Intel CTO <a href="http://blogs.intel.com/research/2008/11/immersive_science.php" target="_blank">Justin Rattner&#8217;s post</a> announcing the project last November.</p>
<p>But to blow my own horn a little, I think I was the first to blog the encounter between <a href="http://opensimulator.org/">OpenSim</a> and supercomputing (an encounter I to some degree provoked by making the introductions) &#8211; <a href="http://www.ugotrade.com/2008/07/19/astrophysics-in-virtual-worlds-implementing-n-body-simulations-in-opensim/" target="_blank">see this post</a>. So I have been following the ScienceSim initiative with great interest.</p>
<p>Very shortly after the N-body astrophysicists Piet Hut and Jun Makino &#8211; creators of GRAPE (an acronym for &#8220;gravity pipeline&#8221; and an intended pun on the Apple line of computers), a supercomputer that will <a href="http://grape.mtk.nao.ac.jp/grape/news/ABC/ABC-cuttingedge000602.html" target="_blank">become one of the fastest supercomputers in the world (again)</a> &#8211; met <a href="http://www.genkii.com/" target="_blank">Genkii</a>, a Tokyo-based strategic company working with OpenSim, the first N-body simulation appeared in OpenSim. And in a matter of weeks <a href="http://www.youtube.com/watch?v=gM4fmL6dLdY" target="_blank">this video went up on YouTube</a> &#8211; the result of a collaboration between MICA and Genkii. But the nirvana of being able to create visualizations using realtime data from supercomputers, steered from a collaborative environment, is still a ways off.</p>
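<p>For readers wondering what these N-body codes actually compute: at their core they are direct summation of pairwise gravity, the O(n^2) force loop that GRAPE-class hardware accelerates. Here is a minimal, purely illustrative sketch in Python (this is not the MICA/Genkii code; the softening and integrator choices are my own made-up example values):</p>

```python
# Minimal direct-summation N-body integrator (kick-drift-kick leapfrog).
# Illustrative sketch only -- not the MICA/Genkii OpenSim code.
# GRAPE-class hardware accelerates exactly this O(n^2) force loop.

def accelerations(masses, positions, G=1.0, eps=1e-3):
    """Pairwise gravitational acceleration on each body (softened)."""
    n = len(masses)
    acc = [[0.0, 0.0, 0.0] for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            dx = [positions[j][k] - positions[i][k] for k in range(3)]
            r2 = sum(d * d for d in dx) + eps * eps  # softening avoids singularities
            inv_r3 = r2 ** -1.5
            for k in range(3):
                acc[i][k] += G * masses[j] * dx[k] * inv_r3
    return acc

def leapfrog_step(masses, pos, vel, dt):
    """One kick-drift-kick leapfrog step; returns new positions and velocities."""
    acc = accelerations(masses, pos)
    vel = [[v[k] + 0.5 * dt * a[k] for k in range(3)] for v, a in zip(vel, acc)]
    pos = [[p[k] + dt * v[k] for k in range(3)] for p, v in zip(pos, vel)]
    acc = accelerations(masses, pos)
    vel = [[v[k] + 0.5 * dt * a[k] for k in range(3)] for v, a in zip(vel, acc)]
    return pos, vel
```

<p>An in-world visualization like the video above would simply map each body&#8217;s position onto an object every few steps; the hard part is doing that with live supercomputer data at interactive rates.</p>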
<p>Supercomputing communities tend to be geographically very dispersed, and researchers often find themselves far from simulation facilities, so there are both the motivation and the skills to pioneer new tools for collaborative visualization. I know that astrophysicists certainly see their value (Piet Hut has some profound ideas on this). Astrophysicist Piet Hut and others (<a href="http://www.ugotrade.com/2008/07/19/astrophysics-in-virtual-worlds-implementing-n-body-simulations-in-opensim/" target="_blank">see here for more</a>) have been pioneering the use of VWs for collaboration. There are two virtual world organizations, both founded by <span class="nfakPe">Piet</span> Hut and collaborators, that are currently exploring the use of OpenSim for scientific visualizations. One is specifically aimed at astrophysics: MICA, the <a href="http://www.mica-vw.org/" target="_blank">Meta Institute for Computational Astrophysics</a>. The other is aimed broadly at interdisciplinary collaborations in and beyond science: <a href="http://www.kira.org/" target="_blank">Kira</a>, a 12-year-old organization focused on &#8216;science in context&#8217;. As of last week, there are two weekly workshops sponsored jointly by Kira and MICA that explore the use of OpenSim, ScienceSim, and other virtual worlds. One of them is <a href="http://www.kira.org/index.php?option=com_content&amp;task=view&amp;id=124&amp;Itemid=154" target="_blank">&#8220;Stellar Dynamics in a Virtual Universe Workshop&#8221;</a> and the other is <a href="http://www.kira.org/index.php?option=com_content&amp;task=view&amp;id=119&amp;Itemid=149" target="_blank">&#8220;ReLaM: Relocatable Laboratories in the Metaverse.&#8221;</a></p>
<p>MICA was founded two years ago by <span class="nfakPe">Piet</span> Hut within the virtual world of <a href="http://qwaq.com" target="_blank">Qwaq Forums</a> (see the paper <a href="http://arxiv.org/abs/0712.1655" target="_blank">&#8220;Virtual Laboratories and Virtual Worlds&#8221;</a>). The Kira Institute is much older: it was founded in 1997. Later this month, on February 24, Kira will celebrate its 12th anniversary with a presentation of talks, a panel discussion, and a series of workshops. See the <a href="http://www.kira.org/index.php?option=com_content&amp;task=view&amp;id=83&amp;Itemid=113" target="_blank">Kira Calendar</a> for the main event, and the Kira Japan branch for a <a href="http://www.kirajapan.org/event/" target="_blank">special mixed RL/SL</a> event in Tokyo. During both events, Junichi Ushiba will give a talk about his research in which <a href="http://nwn.blogs.com/nwn/2007/10/the-second-life.html" target="_blank">he let paralyzed patients steer avatars using only brain waves</a>.</p>
<p>Other early adopters of ScienceSim include Tom Murphy, who teaches computer science at Contra Costa College. Prior to teaching, Tom spent 35+ years working for supercomputer manufacturers. Tom said:</p>
<blockquote><p>It is very natural for me to find significantly new ways to visualize and interact with scientific mathematical models via ScienceSim and the OpenSim software behind it. ScienceSim also allows us to interact with each other and teach students in new ways.</p></blockquote>
<p>Also, Charlie Peck, chair of the SC09 Education Program (his day job is teaching computer science at Earlham College in Richmond, IN), is working with Wilf Pinfold, Tom Murphy, and others &#8220;to explore how 3D Internet/metaverse technology can be used to support science education and outreach.&#8221;</p>
<p><a href="http://www.ics.uci.edu/~lopes/" target="_blank">Cristina Videira Lopes</a>, of the University of California, Irvine, is doing very interesting work on road and pedestrian traffic simulations. Crista is also the creator of the <a href="http://opensimulator.org/wiki/Hypergrid" target="_blank">hypergrid in OpenSim</a>.</p>
<h3>People Meet People Meet Data: A Conversation With Mic Bowman</h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/02/sciencesim_002_thumb1.png"><img class="alignnone size-full wp-image-2908" title="sciencesim_002_thumb1" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/02/sciencesim_002_thumb1.png" alt="sciencesim_002_thumb1" width="404" height="239" /></a><em></em><br />
<em>Screenshot of ScienceSim from <a href="http://daneel-ariantho.blogspot.com/2009/02/sciencesim.html" target="_blank">Daneel Ariantho</a></em></p>
<p><strong>Tish:</strong> How does this work on ScienceSim fit into a wider dialogue on linked data? Where people meet people meet data, and where data meets data?</p>
<p><em><strong>Mic:</strong> Yeah&#8230; that&#8217;s hard, by the way. Open integration of data (and more interestingly the functions on data) is very hard if it comes from multiple, independent sources.</em></p>
<p><em>That&#8217;s the people part. For example, if Crista can build a model of the UCI campus, somebody else builds an accurate model of several cars, and another expert provides the simulation that computes the pollution generated by those cars in that environment&#8230; it&#8217;s bringing people together to solve real problems, no matter how far apart physically.</em></p>
<p><strong>Tish:</strong> You mention three different simulations here. Could you explain why it is difficult to integrate data from multiple sources?</p>
<p><em><strong>Mic:</strong> Integrating data from multiple sources has always been a matter of understanding &amp; interpreting both the syntax &amp; semantics of the data. Even relatively simple things like multiple date formats require explicit translation. More complex formats, like the many formats in which data is represented for urban planning, are barely computable independently, let alone in conjunction with data from other sources (each with its own representation for the data). It&#8217;s often the expertise &amp; the collaboration of bringing people (and their bag of tools) together that solves these problems.</em></p>
<p><strong>Tish:</strong> And in this case the bag of tools is high performance modeling?</p>
<p><em><strong>Mic:</strong> High performance modeling, rich visualizations, and data. It&#8217;s the three that matter&#8230; data, function, and interface.</em></p>
<p><strong>Tish:</strong> Some people have a very hard time wrapping their head around the fact that anything that seems related to Second Life can do this. Can you explain more about the difference between SL and OpenSim?</p>
<p><em><strong>Mic:</strong> OpenSim potentially improves data &amp; function because it can be extended through region modules. Region modules hook directly into the simulator to provide additional functionality. For example, a region module could be implemented to drive the behavior of objects in a virtual world based on a protein folding model.</em></p>
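<p>OpenSim&#8217;s real region modules are C# classes loaded into the simulator; purely as an illustrative sketch of the hook pattern Mic describes (every name here is hypothetical, not the actual OpenSim API), the idea looks roughly like this in Python:</p>

```python
# Hypothetical sketch of the region-module idea: external code registers
# with a simulator's update loop and drives in-world objects from a model.
# All names are illustrative; OpenSim's real modules are C# classes.

class Simulator:
    """Toy stand-in for a region: holds objects, ticks registered modules."""
    def __init__(self):
        self.objects = {}      # object id -> (x, y, z) position
        self.modules = []

    def register_module(self, module):
        module.attach(self)    # module gets direct access to the region
        self.modules.append(module)

    def tick(self, dt):
        for m in self.modules:
            m.update(dt)

class FoldingVizModule:
    """Drives object positions from an external model (here: a fake fold)."""
    def __init__(self, chain_length):
        self.chain_length = chain_length

    def attach(self, sim):
        self.sim = sim
        # Rez one object per residue, laid out along the x axis.
        for i in range(self.chain_length):
            sim.objects["residue-%d" % i] = (float(i), 0.0, 0.0)

    def update(self, dt):
        # Stand-in for querying a real protein-folding simulation each tick.
        for i in range(self.chain_length):
            x, y, z = self.sim.objects["residue-%d" % i]
            self.sim.objects["residue-%d" % i] = (x, y + 0.1 * dt * i, z)
```

<p>The point of the pattern is that the module runs inside the simulator&#8217;s own update loop with direct access to the scene, so an external model (protein folding, traffic, N-body) can drive in-world objects without touching the core server code.</p>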
<p><em>We need to work on additional viewer capabilities to address the user interface limitations.</em></p>
<p><strong>Tish:</strong> Yes, Rob Smart&#8217;s (IBM) recent data integrations with OpenSim (<a href="http://robsmart.co.uk/2009/01/22/visualizing-live-shipping-data-in-opensim-isle-of-wight-ferries/" target="_blank">see here</a>) are impressive. Re viewers, one of the biggest objections to virtual worlds is the mouse-pushing and PC-tied interface.</p>
<p><em><strong>Mic:</strong> There are great opportunities for improving the interface.</em></p>
<p><strong>Tish:</strong> Yes, I really like where Andy Piper&#8217;s experiments with haptic interfaces for OpenSim lead &#8211; <a href="http://andypiper.wordpress.com/2009/02/06/haptic-user-interfaces/" target="_blank">see Haptic Fantastic</a>! And I think that we will have cyberspace ubiquitous in our environment, not just stuck on a PC screen, sooner than we think.</p>
<p><em><strong>Mic:</strong> Mic&#8217;s opinion (not Intel&#8217;s): until we get souped-up sunglasses with HD screens embedded (or writing directly into the eye) there will always be a role for the PC/console/TV. But it isn&#8217;t about the device&#8230; it&#8217;s about the services projected through the device&#8230; sometimes you&#8217;ll want a very rich experience&#8230; sometimes you&#8217;ll want an experience NOW, wherever you are.</em></p>
<p><strong>Tish:</strong> I think people are only just realizing that VWs will be a now and wherever you are experience very soon.</p>
<p><em><strong>Mic:</strong> That&#8217;s the critical observation: the virtual world is not an application you run&#8230; it&#8217;s a &#8220;place&#8221;&#8230; and you interact with it where you are, or maybe interact through it. Speaking for Intel&#8230; it is the spectrum of experiences that is critical to support.</em></p>
<h3>Interview with Wilfred Pinfold</h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/02/gustav_h.jpg"><img class="alignnone size-full wp-image-2860" title="gustav_h" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/02/gustav_h.jpg" alt="gustav_h" width="416" height="200" /></a></p>
<p><em>Picture from National Science Foundation &#8211; <a href="http://www.nsf.gov/news/news_summ.jsp?cntn_id=112166" target="_blank">&#8220;Climate Computer Modeling Heats Up.&#8221;</a></em></p>
<p><strong>Tish Shute:</strong> I know your day job at Intel is in high performance computing. Could you explain a little bit more about what you are working on in this regard &#8211; a mini state of play for high performance computing from your perspective?</p>
<p><em><strong>Wilfred Pinfold:</strong> My title is Director, Extreme Scale Programs. This program drives a research agenda that will put in place the technologies required to make Exa (10^18) scale systems by 2015. The current generation of high performance computers is Peta (10^15) scale, so this is a 1000x increase in performance, and it will require significant improvements in power efficiency, reliability, and scalability, plus new techniques for dealing with locality and parallelism.</em></p>
<p><strong>Tish:</strong> The nirvana in terms of linking supercomputers to the collaborative spaces of immersive virtual worlds is to be able to create visualizations using realtime data from supercomputers in collaborative VW environments, and ultimately for researchers to be able to collaborate and steer their simulations from their visualizations. Where are we now in terms of scientific data visualization in VWs? And what are the current obstacles to using realtime data from supercomputers?</p>
<p><em><strong>Wilf: </strong>Being able to steer a simulation from a visualization requires both a visualization interface that allows interaction and a simulation that operates at a speed that is responsive in interactive timeframes. For example, a weather model that predicts the path of a hurricane would need to operate at something close to 1000x real time. This would run through a day in ~1.5 minutes, allowing an operator to run the simulation over several days multiple times, with different parameters, in a single sitting, to understand the likelihood of certain outcomes.</em></p>
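<p>Wilf&#8217;s arithmetic checks out: at a 1000x speedup, one simulated day takes about a minute and a half of wall-clock time. A trivial check in Python (the function name is mine, for illustration):</p>

```python
# Quick check of the 1000x-real-time claim: how long does one simulated
# day take at a given speedup factor?

def wall_clock_minutes(simulated_seconds, speedup):
    """Wall-clock minutes needed to simulate the given span at speedup x realtime."""
    return simulated_seconds / speedup / 60.0

day = 24 * 3600  # 86,400 simulated seconds
print(wall_clock_minutes(day, 1000))  # about 1.44 minutes, i.e. roughly 1.5
```

<p>At only 100x, the same simulated day would take nearly a quarter of an hour per run, which is why the speedup target matters so much for interactive steering.</p>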
<p><strong>Tish:</strong> Do you see a networked online collaborative virtual world being capable of being a visualization interface that allows meaningful interaction with the hurricane scenario you describe in the near future (next 6 to 18 months)?</p>
<p><em><strong>Wilf: </strong>I was using the hurricane example to explain the usage model not an imminent capability. Hurricane Simulation: Accurate hurricane simulations require multiscale models able to resolve the global forces working on the storm as well as the microforces that define precipitation. We can build useful weather models today that run faster than real time (anything slower is not useful for prediction) but we are a long way from the ideal.<br />
Visualization: There are excellent visualizations of weather systems but I have not yet seen a virtual world that can track a simulation and allow the scientist or team of scientists to see what is going on at both the macro scale and zoom in to see precipitation conditions. Today&#8217;s supercomputers are much better at this than they were a few years ago but they are a long way from ideal.</em></p>
<p><strong>Tish:</strong> Open source virtual world technologies are pretty diverse in their approaches; Croquet, Sun&#8217;s Wonderland, and OpenSim are quite different and have different strengths and weaknesses. As you have become more familiar with OpenSim, what have you found about the technology that particularly lends itself to this project, ScienceSim? (Mic mentioned Crista&#8217;s hypergrid code, for example; modularity is another feature often cited.)</p>
<p><em><strong>Wilf: </strong>We have found OpenSim&#8217;s client server model is well suited to the visualization model, and the ability to put the server next to the supercomputer producing the visualization data is critical. We are, however, very interested in other environments and encourage papers, demonstrations, and research on any of these platforms at the conference.</em></p>
<h3>Interview with John A. Hengeveld</h3>
<p><strong>Tish Shute:</strong> OpenSim&#8217;s dependence on Second Life based viewers is sometimes cited as a limitation, and sometimes as a strength. What are your views on this? What would a strong open viewer project directed at science applications bring to the picture?</p>
<p><em><strong>John Hengeveld:</strong> There may be more than one strong open viewer project required for OpenSim-compatible experiences. The strength of the Hippo viewer, for example, is availability, and its weakness is the size of the client. We would love a ubiquitous client that runs on all platforms, but each hardware platform brings tradeoffs and restrictions of its own. Today, probably all of the folks innovating in the space can deal with the size of a very fat rich client app &#8211; they have big computers anyway. But as we get into more 3D entertainment and augmented reality applications &#8211; virtual malls, collaboration apps, etc. &#8211; there is a great deal of room to optimize for the specific experience: balancing visual experience with the bandwidth and compute performance available, tying into standard browsers, and so on. People have done some of this work, and I think all of it adds to the usefulness of these worlds.</em></p>
<p><strong>Tish:</strong> Integrating high-end game engines and OpenSim opens up new possibilities, but licensing issues have been an obstacle. Could a project like ScienceSim get a non-commercial license on a high-end game engine? What would that bring to the picture?</p>
<p><em><strong>John: </strong>Anything is possible. Game engines can give a great deal of design power for high value experiences, but the programming of these experiences must be simplified. Mainstream adoption in the enterprise can&#8217;t be premised on the programming model of studio games&#8230; that&#8217;s a big step to get over, I think. There are very interesting possibilities when we take that step, though. Simulation, training, agents of various types (I just finished watching &#8220;The Matrix&#8221; for like the billionth time&#8230; I think agents are cool&#8230;).</em></p>
<p><strong>Tish:</strong> Where does Larrabee fit into the picture of ScienceSim and next generation virtual worlds?</p>
<p><em><strong>John:</strong> We are all very excited about the Larrabee architecture and its application to workloads like next generation virtual worlds, both in the client &#8211; delivering immersive reality &#8211; and someday, potentially, in a distributed architecture simulating and producing these worlds. For Intel, CVC is an all play. Atom will be used in strong mobile clients. Core will be used in enterprise PCs, laptops, and desktops. Xeon will be simulating these environments and handling the data communication, and whatever we brand Larrabee&#8230; will be enabling compelling visual experiences. Oh, and our software products (Havok, tools, and others) will be building blocks in knitting all this together. Larrabee is a part, but there are a lot of other pieces in our vision&#8230;</em></p>
<p><strong>Tish:</strong> If the kind of rapid data movement that scientific visualization needs is achieved in virtual worlds, this will be quite a game changer for business applications of VWs too. It will also blur the boundaries between what we call virtual worlds and mirror worlds. It seems to me this kind of rapid data movement is a vital step towards what Mic described to me as Intel&#8217;s vision of CVC: &#8220;Connected Visual Computing is the union of three application domains: MMOG, metaverse, and paraverse (or augmented reality).&#8221; It almost seems to me that if you achieve your goals for ScienceSim you will change how we think about virtual worlds in general. What do you think?</p>
<p><em><strong>John:</strong> I certainly hope so&#8230; Part of our goal is to stimulate innovation in the technology and usage models that will enable broad mainstream adoption of CVC based applications (what we categorize as immersive connected experiences). By tackling the scientific visualization problem, we hope to find the key technology barriers and encourage the ecosystem to solve them.</em></p>
<p><strong>Tish: </strong>To me, virtual worlds and augmented reality should be complementary and connected experiences. How do you see this connection evolving?</p>
<p><em><strong>John:</strong> We certainly see them as related. In the long term, there are many common building blocks, but they aren&#8217;t united per se. It&#8217;s about the user experience, and in some usages these two are almost identical&#8230; in some, they don&#8217;t look or feel at all alike&#8230; the viewer differs by a lot. Our approach is to enable building blocks from which people can quickly build out robust usages.</em></p>
<p><strong>Tish: </strong>What is Intel&#8217;s vision for ubiquitous mobile computing and an internet of objects? How can high performance computing be an enabler for this vision?</p>
<p><em><strong>John: </strong>Mobile computing is a central part of our life, culture and community in economically enabled economies. It feeds the data of our decisions, it connects us to entertainment, it is the access point to our soapboxes, pulpits, economy and families. This creates a massive increase in data, a massive increase in interactions, transactions and visualizations. While many HPC applications will be behind the scenes (finance, health, energy, visual analytics and others), HPC will emerge as part of a scale solution to serving some of this increase&#8230; particularly that part where interactions and visualizations are complex or compelling, or where scale enables the usage per se. I talked about my love of agents earlier, and some of that comes in here: compute working behind the scenes to help manage the data complexity and manage some of the base interactions between ourselves and technology. The other thing we talk about internally is the &#8220;Hannah Montana usage,&#8221; where millions of people use their mobile devices to access and participate (using the sensors in the device) with an interactive live concert. When Miley hears the applause of a virtual interactive audience&#8230; and can scream back at them, we&#8217;re there. Access to ubiquitous compute will be mobile, and interactive experiences will be complex, and HPC can help make that real. Watch out for the mental trap that HPC is always high end supercompute clusters though&#8230; the &#8220;mainstream HPC&#8221; of smaller clusters, high threads, etc. will play a key part in all of this as well.</em></p>
<p>It is interesting that John ended on this point, as this just came in from <a href="http://blog.wired.com/gadgets/2009/02/intel-fights-re.html" target="_blank">Wired</a>.</p>
]]></content:encoded>
			<wfw:commentRss>http://www.ugotrade.com/2009/02/11/people-meet-people-meet-big-data-sciencesim-explores-collaborative-high-performance-computing/feed/</wfw:commentRss>
		<slash:comments>4</slash:comments>
		</item>
		<item>
		<title>Pachube, Patching the Planet: Interview with Usman Haque</title>
		<link>http://www.ugotrade.com/2009/01/28/pachube-patching-the-planet-interview-with-usman-haque/</link>
		<comments>http://www.ugotrade.com/2009/01/28/pachube-patching-the-planet-interview-with-usman-haque/#comments</comments>
		<pubDate>Wed, 28 Jan 2009 16:31:41 +0000</pubDate>
		<dc:creator><![CDATA[Tish Shute]]></dc:creator>
				<category><![CDATA[Ambient Devices]]></category>
		<category><![CDATA[Ambient Displays]]></category>
		<category><![CDATA[architecture of participation]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Augmented Reality]]></category>
		<category><![CDATA[Carbon Footprint Reduction]]></category>
		<category><![CDATA[culture of participation]]></category>
		<category><![CDATA[CurrentCost]]></category>
		<category><![CDATA[Ecological Intelligence]]></category>
		<category><![CDATA[Energy Awareness]]></category>
		<category><![CDATA[Energy Saving]]></category>
		<category><![CDATA[home automation]]></category>
		<category><![CDATA[home energy monitoring]]></category>
		<category><![CDATA[home energy monitors]]></category>
		<category><![CDATA[HomeCamp]]></category>
		<category><![CDATA[Instrumenting the World]]></category>
		<category><![CDATA[message brokers and sensors]]></category>
		<category><![CDATA[mirror worlds]]></category>
		<category><![CDATA[Mixed Reality]]></category>
		<category><![CDATA[MQTT and RSMB]]></category>
		<category><![CDATA[open metaverse]]></category>
		<category><![CDATA[OpenSim]]></category>
		<category><![CDATA[Participatory Culture]]></category>
		<category><![CDATA[Second Life]]></category>
		<category><![CDATA[smart appliances]]></category>
		<category><![CDATA[Smart Devices]]></category>
		<category><![CDATA[Smart Planet]]></category>
		<category><![CDATA[sustainable living]]></category>
		<category><![CDATA[sustainable mobility]]></category>
		<category><![CDATA[Virtual HomeCamp]]></category>
		<category><![CDATA[Virtual Meters]]></category>
		<category><![CDATA[Virtual Realities]]></category>
		<category><![CDATA[Web 2.0]]></category>
		<category><![CDATA[Web Meets World]]></category>
		<category><![CDATA[arduino]]></category>
		<category><![CDATA[connecting environments]]></category>
		<category><![CDATA[dynamic environments]]></category>
		<category><![CDATA[electronically assisted plants]]></category>
		<category><![CDATA[Extended Environment Markup Language]]></category>
		<category><![CDATA[Pachube]]></category>
		<category><![CDATA[sensor technology]]></category>
		<category><![CDATA[smart buildings]]></category>
		<category><![CDATA[smart spaces]]></category>
		<category><![CDATA[social networking sensor data]]></category>
		<category><![CDATA[software of space]]></category>
		<category><![CDATA[sustainable real estate]]></category>
		<category><![CDATA[the street as a platform]]></category>
		<category><![CDATA[ubicomp]]></category>

		<guid isPermaLink="false">http://www.ugotrade.com/?p=2686</guid>
		<description><![CDATA[Usman Haque (architect and director, Haque Design + Research) and founder of Pachube pointed me to this image from T.R. Oke&#8217;s book, &#8220;Boundary Layer Climates&#8221; (original photo source Prof. L. E. Mount&#8217;s The Climatic Physiology of the Pig) to explain his approach to the &#8220;software&#8221; of space. My focus as an architect has always been [&#8230;]]]></description>
				<content:encoded><![CDATA[<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/pigletspachubepost.jpg"></a><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/dcfwgkt_8g2dvxgdg_b2.jpg"><img class="alignnone size-full wp-image-2835" title="piglets" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/dcfwgkt_8g2dvxgdg_b2.jpg" alt="piglets" width="614" height="407" /></a></p>
<p>Usman Haque (architect and director, <a id="o.td" title="Haque Design + Research" href="http://www.haque.co.uk/" target="_blank">Haque Design + Research</a>) and founder of <a id="cpbp" title="Pachube" href="http://www.pachube.com/">Pachube</a> pointed me to this image from <a href="http://www.geog.ubc.ca/~toke/Profile.htm" target="_blank">T.R. Oke&#8217;s</a> book, <a href="http://www.amazon.com/Boundary-Layer-Climates-T-Oke/dp/0415043190" target="_blank">&#8220;Boundary Layer Climates&#8221;</a> (original photo source Prof. L. E. Mount&#8217;s <a href="http://www.alibris.com/booksearch?qwork=1137594&amp;matches=1&amp;author=Mount%2C+Laurence+Edward&amp;browse=1&amp;cm_sp=works*listing*title" target="_blank">The Climatic Physiology of the Pig</a>) to explain his approach to the &#8220;software&#8221; of space.</p>
<p><em>My focus as an architect has always been to consider what I&#8217;ve called the &#8220;software&#8221; of space (sounds, smell, light, temperature, electromagnetic fields, social relationships, etc.) rather than the &#8220;hardware&#8221; (floors, walls, roof, etc.) as it has traditionally been considered. The image (above) really sums up why I think this is important.</em></p>
<p><em>It&#8217;s the same piglets, in the same box, but on the right hand side the temperature has been increased. This small change in how the space is &#8220;programmed&#8221; has dramatically changed the way the &#8216;inhabitants&#8217; relate to each other and how they relate to their space. This approach to architecture became my challenge: how to translate such strategies into the general architectural discourse and how to bring into reality such possibilities for the construction industry.</em></p>
<h3>&#8220;Connecting Environments, Patching the Planet&#8221;<em><br />
</em></h3>
<p>Pachube is the culmination of 12 years of work.</p>
<p><em>&#8220;It is now occupying pretty much all my time and will do for the foreseeable future,&#8221; </em>Usman told me.</p>
<p>Haque Design + Research is not foregrounded on the <a id="q51:" title="Pachube site" href="http://www.pachube.com/" target="_blank">Pachube site</a>, and I did not make the connection at first. But when I followed a small link at the bottom, I was soon delving into the <a id="n4ku" title="work of Usman Haque" href="http://www.haque.co.uk/" target="_blank">work of Usman Haque</a>. Then the penny dropped and I realized that Pachube is not only:</p>
<p><em>A web service that enables people to tag and share real time sensor data from objects, devices and spaces around the world, facilitating interaction between remote environments, both physical and virtual.</em></p>
<p>Pachube is also a really big idea.</p>
<h3><strong>Ubicomp and the &#8220;Software of Space?&#8221;<br />
</strong></h3>
<p>Usman suggested that, if I really wanted to go back to the beginning of the Pachube vision, I should check out the work of Dutch architect Constant Nieuwenhuys and his 1956 proposal for a visionary society, <a id="y-7j" style="font-weight: normal;" title="New Babylon" href="http://www.artfacts.net/index.php/pageType/exhibitionInfo/exhibition/15904" target="_blank">New Babylon</a>.</p>
<p>Usman explained:</p>
<p><em>Constant Nieuwenhuys is certainly an inspiration for Pachube. He envisages a globally connected architecture, built by its inhabitants &#8211; configured, reconfigured, reappropriated&#8230;</em></p>
<p>For a more contemporary reference, Usman noted there are lots of overlapping concepts with <a id="d21o" title="Adam Greenfield (head of design direction for service and user-interface design at Nokia)" href="http://speedbird.wordpress.com/about/" target="_blank">Adam Greenfield&#8217;s work</a>. Adam is head of design direction for service and user-interface design at Nokia; see <a id="spz5" title="The dawning age of ubiquitous computing" href="http://www.amazon.com/exec/obidos/ASIN/0321384016/v2organisa/" target="_blank">Everyware: The Dawning Age of Ubiquitous Computing</a> and <a href="http://www.lulu.com/content/1554599">Urban Computing and its Discontents</a> to understand more about the vision he has been developing.</p>
<p>Pachube is right in the zone with the ideas outlined in <a id="pxeu" title="The project description for Adam Greenfield's upcoming book, The City Is Here For You To Use" href="http://speedbird.wordpress.com/2008/01/01/new-day-rising/" target="_blank">The project description </a>for Adam Greenfield&#8217;s upcoming book,<a id="pxeu" title="The project description for Adam Greenfield's upcoming book, The City Is Here For You To Use" href="http://speedbird.wordpress.com/2008/01/01/new-day-rising/" target="_blank"> The City Is Here For You To Use</a>:</p>
<p><em>The City&#8230; takes everything explored in Everyware as a given, and a point of departure.</em></p>
<p><em>It assumes that emergent technologies like RFID, mesh networking and shape-memory actuators&#8230; will simply be part of how cities will be made from now on&#8230;</em></p>
<h3 style="text-align: left;">The Pachube Team</h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/pachubeteamfull.jpg"><img class="alignnone size-full wp-image-2764" title="pachubeteamfull" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/pachubeteamfull.jpg" alt="pachubeteamfull" width="480" height="344" /></a></p>
<p>The Pachube Team &#8211; Usman Haque (creative director), Chris Leung (EEML developer), photoshopped laptop: Chris Burman (&#8220;example-maker&#8221;, e.g. SL code and Google SketchUp plugin), Ai Hasegawa (graphic designer), Sam Mulube (technical producer and website development).</p>
<p>Also, with Bruce Sterling as a &#8220;visionary&#8221; adviser and other luminaries involved, Pachube has some brilliant guiding lights. Usman pointed out that many people <em>&#8220;have helped, prodded, nudged and advised along the way!&#8221;</em></p>
<div><em>Gavin Starks and also Dopplr&#8217;s Matt Biddulph have been sort of &#8220;friendly neighbours&#8221; to Pachube: they&#8217;ve made some great introductions and I turn to them often for advice on being a London start-up. What&#8217;s been really useful for me is that they are active in a related area and have directly useful advice: Gavin, of course, since he&#8217;s involved in metering the world&#8217;s energy; and Matt perhaps less tangibly in his day job as Dopplr&#8217;s CTO but more so in his active Arduino-enabled social life!</em></div>
<div><em><br />
</em></div>
<div><em>One very important Pachube advisor has been Dr. Paul Pangaro, who has previously been CTO at a number of technology startups, and brings vital experience from his time at Sun Microsystems as Senior Director and Distinguished Market Strategist. Oh, and he&#8217;s also a former student and collaborator of Gordon Pask&#8217;s! He has been very helpful in developing a viable business model in conjunction with my brother Yusuf Haque, who, with his experience in raising capital for startups, has led the fundraising process.</em></div>
<div><em><br />
</em></div>
<div><em>Of course, direct daily input from the Pachube team has been vital to the development of the project, and without Chris Leung (EEML development) and Sam Mulube (backend development) it would be a very different thing indeed!</em></div>
<div>
<h3>Pachube is not just a social networking project for sensor data.</h3>
<p>Pachube evolved out of three strands of thought:</p>
<p><em>1) the geographical non-specificity of architecture these days as people live their lives in constant connection with people in remote spaces </em></p>
<p><em>2) a desire to open up the production process of &#8220;smart homes&#8221; in reaction to current trends for placing the design and construction process solely in the hands of knowledgeable others.</em></p>
<p><em>3) an emphasis on contextually specific &#8220;environments&#8221; rather than object-centric &#8220;sensors&#8221;</em></p>
<p>Sensor/actuator integrations are a part of what Pachube is about (also see Peter Quirk&#8217;s in-depth post on <a id="ai70" title="the strong connection between virtual worlds and sensor networks" href="http://peterquirk.wordpress.com/2009/01/21/sensor-networks-and-virtual-worlds/" target="_blank">the strong connection between virtual worlds and sensor networks</a>), and an interest in home automation and energy management is giving Pachube a lot of early momentum.</p>
<p>But Usman makes clear Pachube is about &#8220;environments&#8221; rather than &#8220;sensors.&#8221; &#8220;An &#8216;environment&#8217; has dynamic frames of reference, all of which are excluded when simply focusing on devices, objects or mere sensors&#8221; (Usman explains this in depth in the interview below). A central part of Pachube is the development of the <a id="f0b2" title="Extended Environments Markup Language." href="http://www.eeml.org/" target="_blank">Extended Environments Markup Language.</a></p>
<h3>Extended Environment Markup Language</h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/eeml.jpg"><img class="alignnone size-full wp-image-2765" title="eeml diagram" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/eeml.jpg" alt="eeml diagram" width="520" height="159" /></a></p>
<p><em>Pachube came about as a direct attempt to enable the production of dynamic, responsive, conversant &#8216;environments&#8217;. </em></p>
<p><em>The <a id="gv6y" style="color: #551a8b;" title="Extended Environments Markup Language (EEML)" href="http://www.eeml.org/" target="_blank">Extended Environments Markup Language (EEML)</a> (which is the protocol around which much of Pachube is based) is being developed to make the idea of &#8220;dynamic, responsive and conversant environments&#8221; a reality. It works with existing construction standards like <a id="l7sl" style="color: #551a8b;" title="Industry Foundation Classes (IFC)" href="http://en.wikipedia.org/wiki/Industry_Foundation_Classes" target="_blank">Industry Foundation Classes (IFCs)</a>, but exists to extend them to account for dynamic, responsive and, dare I say it, conversant buildings. </em></p>
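<p>Since EEML documents are plain XML, generating and consuming a feed is straightforward from any language. The sketch below is illustrative only: the element names (environment, data, tag, value, unit) follow EEML&#8217;s published vocabulary, but the namespace version, attributes and function names here are assumptions, not a definitive rendering of the schema.</p>

```python
import xml.etree.ElementTree as ET

EEML_NS = "http://www.eeml.org/xsd/005"  # namespace version is an assumption for illustration

def build_environment_feed(title, readings):
    """Build a minimal EEML-style document for one "environment"
    holding several tagged datastreams: [(tag, value, unit), ...]."""
    eeml = ET.Element(f"{{{EEML_NS}}}eeml")
    env = ET.SubElement(eeml, f"{{{EEML_NS}}}environment")
    ET.SubElement(env, f"{{{EEML_NS}}}title").text = title
    for i, (tag, value, unit) in enumerate(readings):
        data = ET.SubElement(env, f"{{{EEML_NS}}}data", id=str(i))
        ET.SubElement(data, f"{{{EEML_NS}}}tag").text = tag
        ET.SubElement(data, f"{{{EEML_NS}}}value").text = str(value)
        ET.SubElement(data, f"{{{EEML_NS}}}unit").text = unit
    return ET.tostring(eeml, encoding="unicode")

def read_values(eeml_xml):
    """Parse an EEML-style document back into {tag: numeric value}."""
    root = ET.fromstring(eeml_xml)
    out = {}
    for data in root.iter(f"{{{EEML_NS}}}data"):
        tag = data.find(f"{{{EEML_NS}}}tag").text
        out[tag] = float(data.find(f"{{{EEML_NS}}}value").text)
    return out

feed = build_environment_feed("studio", [("temperature", 23.5, "Celsius"),
                                         ("light", 640, "lux")])
print(read_values(feed))  # {'temperature': 23.5, 'light': 640.0}
```

<p>A client could push such a document to a Pachube feed, and a consumer &#8211; a visualization, or a mirror-world object in OpenSim or Second Life &#8211; could poll and parse it the same way.</p>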
<p>A key member of the Pachube team doing EEML development is <a id="h3n5" title="Chris Leung" href="http://www.chrisleung.org/" target="_blank">Chris Leung</a>. Haque Design + Research is the industry sponsor of Chris&#8217;s doctorate, which:</p>
<p><em>investigates how Architectural and Engineering consultancies can use advanced imaging, sensing and visualisation technology to capture, record and playback the responsive behaviour of built Architecture in response to its environment as a decision-support tool to meet this unique challenge.</em></p>
<p><strong><a href="http://www.chrisleung.org/CaseStudy1.htm">Case-Study I &#8211; Kielder Forest</a></strong></p>
<p><em><strong><img class="alignnone size-medium wp-image-2707" title="kielderforest" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/kielderforest-300x225.jpg" alt="kielderforest" width="300" height="225" /></strong></em></p>
<p>Usman explained to me the full vision for Pachube is not yet fleshed out on the web site (so read the full interview!), and this is in part because the focus has been on building a backend capable of handling millions of users.</p>
<h3>The business model for Pachube</h3>
<p>Usman explained his commitment to an ethically driven business model that allows a diverse group of companies and individuals to transition to the internet of things. Usman emphasizes that one of his chief concerns is to make sure that these technologies of &#8220;extreme connectivity,&#8221; which will soon be part of every aspect of our lives, are in the hands of all who want to use them.</p>
<p><em>Pachube is here to make it easier to participate in what I expect to be a vast &#8216;eco-system&#8217; of conversant devices, buildings &amp; environments.</em></p>
<p><em>Pachube will facilitate the development of a huge range of new products and services that will arise from extreme connectivity. It&#8217;s relatively easy for large technology companies like Nike and Apple to transition into the Internet of Things, but Pachube will be particularly helpful for that huge portion of smaller scale industry players that *want* to become part of it, but which are only now waking up to the potentials of the internet &#8212; small and medium scale designers, manufacturers and developers who are very good at developing their products but don&#8217;t have the resources to develop in-house a massive infrastructure for their newly web-enabled offerings. </em></p>
<p><em>Basically, having built a generalized data-brokering backend to connect physical (and virtual) entities to the web, others can now start to build the applications that make the connections really useful. </em></p>
<h3>An Inspired Community of Early Adopters and Business Visionaries</h3>
<p><em><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/monkchipsathomecamp1.jpg"><img class="alignnone size-full wp-image-2766" title="monkchipsathomecamp1" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/monkchipsathomecamp1.jpg" alt="monkchipsathomecamp1" width="462" height="308" /></a><br />
</em></p>
<p>James Governor (<a id="qd8i" title="@monkchips" href="http://twitter.com/monkchips" target="_blank">@monkchips</a>) of <a href="http://redmonk.com/">Redmonk</a> has Pachube, <a href="http://currentcost.co.uk/">Current Cost</a>, <a id="g.i:" title="using MQTT" href="http://mqtt.org/" target="_blank">MQTT</a> and RSMB (<a id="h0is" title="IBM AlphaWorks" href="http://alphaworks.ibm.com/tech/rsmb" target="_blank">IBM AlphaWorks</a>), and <a href="http://www.arduino.cc/" target="_blank">Arduino</a> on the board at <a id="h4a0" title="HomeCamp '08" href="http://homecamp.pbwiki.com/homecamp08" target="_blank">HomeCamp &#8217;08</a>. Photo from the <a href="http://www.flickr.com/photos/tags/homecamp08/" target="_blank">Flickr</a><a href="http://www.flickr.com/search/?q=homecamp&amp;w=29034542%40N00" target="_blank"> stream</a> of <a href="http://benjaminellis.co.uk/" target="_blank">Benjamin Ellis</a>.</p>
<p>What attracted my attention to Pachube, at first, was the small but highly energized community of early adopters I noticed experimenting with Pachube. <a id="x2vv" title="Nigel Crawley" href="http://www.nigelcrawley.co.uk/" target="_blank">Nigel Crawley</a> (<a id="nf4y" title="@ni" href="http://twitter.com/ni" target="_blank">@ni</a>) and <a id="zjcv" title="James Taylor" href="http://jtlog.wordpress.com/" target="_blank">James Taylor</a> (<a id="ie4m" title="@jtonline" href="http://twitter.com/jtonline" target="_blank">@jtonline</a>) were some of the first to plunge in. <a id="o0.i" title="Rick Bullotta" href="http://www.automation.com/content/wonderware-appoints-rick-bullotta-vp-and-cto" target="_blank">Rick Bullotta</a>, Usman noted, has been very active in the community forum, bringing much-needed automation expertise to the conversation. <a id="ny-t" title="Pam Broviak" href="http://www.publicworksgroup.com/" target="_blank">Pam Broviak</a> (<a id="xkmo" title="@pbroviak" href="http://twitter.com/pbroviak" target="_blank">@pbroviak</a>) is an early Second Life adopter. And <a id="ugu0" title="Matt Biddulph" href="http://www.hackdiary.com/about/" target="_blank">Matt Biddulph</a> (CTO of <a href="http://www.dopplr.com/">Dopplr</a>) was the first non-Pachube person to get a feed up!</p>
<p>A very active early adopter is <a id="q54j" title="Carl Johan Rosen" href="http://carljohanrosen.com/" target="_blank">Carl Johan Rosen</a>, who wrote an <a href="http://www.openframeworks.cc/" target="_blank">openFrameworks</a> addon (<a id="ljuh" title="for more see here" href="http://carljohanrosen.com/?p=42" target="_blank">see here</a>) for <a href="http://www.pachube.com/" target="_blank">Pachube</a> that he presented at the <a href="http://www.aec.at/en/festival2008/program/project.asp?parent=14439&amp;iProjectID=14447" target="_blank">OFLab at the Ars Electronica Festival</a>.<br />
After the inaugural <a id="h4a0" title="HomeCamp '08" href="http://homecamp.pbwiki.com/homecamp08" target="_blank">HomeCamp</a>, where Usman and Chris Burman from Pachube were presenters (<a id="diae" title="see slides here" href="http://www.slideshare.net/tag/pachube" target="_blank">see slides here</a>), I began to notice that people were sending their Current Cost feeds into Pachube. And recently it was announced that Pachube has a <a href="http://apps.pachube.com/carbon_footprint.php" target="_new">carbon footprint calculation app</a> which:</p>
<p><em>makes it very easy to take any Pachube feed that measures electricity consumption in watts or kilowatts and convert it into a Pachube feed that shows a realtime estimated carbon footprint for the last 15 minutes, the last hour and the last 24 hours.</em></p>
<p><em>The app makes use of international data provided by <a href="http://www.amee.cc/" target="_new">&#8216;AMEE &#8211; The world&#8217;s energy meter&#8217;</a>. AMEE provides figures that are specific to electricity suppliers in UK &amp; Ireland and specific to country in the rest of the world.</em></p>
<p><em>This app, combined with the <a href="http://community.pachube.com/?q=node/100">Current Cost app</a> makes it simple to monitor your carbon footprint on a day to day basis!</em></p>
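<p>The underlying conversion is simple arithmetic once an emission factor for the supply is known. A minimal sketch, with an illustrative grid-average factor (not an AMEE figure) and a hypothetical function name:</p>

```python
# Illustrative grid-average emission factor; the real app pulls
# supplier- and country-specific figures from AMEE.
KG_CO2_PER_KWH = 0.43

def carbon_footprint_kg(watt_samples, interval_minutes):
    """Estimate kg of CO2 for a series of power readings (watts),
    one reading taken every `interval_minutes`."""
    hours_per_sample = interval_minutes / 60.0
    kwh = sum(w / 1000.0 for w in watt_samples) * hours_per_sample
    return kwh * KG_CO2_PER_KWH

# Four 15-minute readings averaging 1 kW, i.e. 1 kWh over the hour:
print(round(carbon_footprint_kg([1000, 1000, 1000, 1000], 15), 2))  # 0.43
```

<p>Summing the same estimate over the last 15 minutes, hour and 24 hours gives exactly the three rolling figures the app displays.</p>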
<p>I still haven&#8217;t found out what <a id="kmt8" title="@yellowpark" href="http://twitter.com/yellowpark" target="_blank">@yellowpark</a> was doing last Saturday to produce so much CO2&#8230;? (The perils of going public with your energy consumption, as <a id="am8t" title="@epachube" href="http://twitter.com/pachube" target="_blank">@pachube</a> pointed out.)</p>
<p>But perhaps Chris Dalby <a id="kmt8" title="@yellowpark" href="http://twitter.com/yellowpark" target="_blank">(@yellowpark</a>) can be excused a day of CO2 excess as he has just released <a id="qf:l" title="Pachube Air" href="http://www.yellowpark.net/cdalby/index.php/2009/01/10/pachube-air-the-first-release/" target="_blank">Pachube Air</a>.</p>
<p>While enterprise and government projects are on the near horizon, Pachube is designed to introduce a DIY approach to ubicomp. Usman said he is &#8220;concerned by developments in ubiquitous computing whereby &#8216;making technology invisible&#8217; equates to placing the design and construction process solely in the hands of knowledgeable others.&#8221;</p>
<p>DIY City (see the <a id="zwms" title="Do-It-Yourself-City Project" href="http://diycity.org/diycity-main-group/call-work-first-diycity-project" target="_blank">Do-It-Yourself-City Project</a>) is developing a similar vision here in NYC.</p>
<h3>Natural Fuse: &#8220;A city wide network of electronically-assisted plants.&#8221;</h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/naturalfusenetwork1.jpg"><img class="alignnone size-full wp-image-2779" title="naturalfusenetwork1" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/naturalfusenetwork1.jpg" alt="naturalfusenetwork1" width="405" height="305" /></a></p>
<p><em>&#8220;I think we&#8217;ve really not even begun to imagine the kinds of applications that will be important,&#8221;</em> said Usman Haque.</p>
<p>Haque Design + Research continues with a separate team, involved mostly in the kinds of projects it has done in the past, but it is <em>&#8220;also pushing development of things that *use* Pachube,&#8221;</em> such as the project Natural Fuse, by Usman Haque, <a id="y5x7" title="Nitipak Samsen (Designer)" href="http://www.dotmancando.info/" target="_blank">Nitipak Samsen (Designer)</a>, <a id="d.p2" title="Cesar Harada (Designer)" href="http://www.cesarharada.com/" target="_blank">Cesar Harada (Designer)</a>, and Barbara Jasinowicz (Producer), which was commissioned by <a href="http://www.archleague.org/index-dynamic.php?show=757" target="_new">the Architecture League</a> &amp; <a href="http://www.situatedtechnologies.net/?q=node/89" target="_new">Situated Technologies: Toward the Sentient City</a> and will open to the public in Autumn 2009.</p>
<p><em>Natural Fuse harnesses the carbon-sinking capabilities of plants to create a city-wide network of electronically-assisted plants that act both as energy providers and as shared &#8220;carbon sink&#8221; circuit breakers. By sharing resources and information between the plants, energy expenditure can be collectively monitored and managed.</em></p>
<p><em>The purpose is to create a collective &#8220;carbon sink&#8221; that offsets the amount of energy consumed by the plant owners &#8211; a natural &#8220;circuit breaker&#8221;. If people cooperate on their energy expenditure then the plants thrive (and they can all use more energy); but if they don&#8217;t then the network starts to kill plants, thus diminishing the network&#8217;s energy capacity</em> (a full description of Natural Fuse is in the interview below).</p>
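<p>The collective &#8220;circuit breaker&#8221; behaviour described above reads like a simple rule, which can be sketched as a toy round-based simulation. All numbers and names below are illustrative assumptions, not the actual Natural Fuse implementation:</p>

```python
# Toy sketch of the collective "carbon sink" circuit breaker: each plant
# offsets a fixed energy budget per round; if the group's total consumption
# exceeds the group's total offset capacity, a plant is "killed", shrinking
# future capacity. OFFSET_PER_PLANT is an illustrative number.
OFFSET_PER_PLANT = 10.0  # units of energy one plant can offset per round

def run_round(plants_alive, consumptions):
    """Return the number of plants alive after one round, given each
    member's energy consumption for that round."""
    capacity = plants_alive * OFFSET_PER_PLANT
    if sum(consumptions) > capacity and plants_alive > 0:
        plants_alive -= 1  # overconsumption costs the network a plant
    return plants_alive

plants = 5
for demand in ([8] * 5, [8] * 5, [15] * 5):  # two cooperative rounds, one greedy
    plants = run_round(plants, demand)
print(plants)  # 4: the greedy round killed one plant
```

<p>The feedback loop is the point: cooperative rounds leave capacity intact, while one round of collective overconsumption permanently lowers what everyone can draw afterwards.</p>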
<h3>The Street As Platform</h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/streetasaplatform1.jpg"><img class="alignnone size-full wp-image-2780" title="streetasaplatform1" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/streetasaplatform1.jpg" alt="streetasaplatform1" width="450" height="301" /></a></p>
<p>Image courtesy of <a id="k0g3" title="Timo Arnall" href="http://www.elasticspace.com/" target="_blank">Timo Arnall</a> &#8211; who is an awesome photographer and mover and shaker in ubicomp. <em>&#8220;The way the street feels may soon be defined by what cannot be seen with the naked eye,&#8221;</em> writes Dan Hill in his post <a href="http://www.cityofsound.com/blog/2008/02/the-street-as-p.html" target="_blank">&#8220;The Street as Platform.&#8221;</a> Usman comments on Dan Hill&#8217;s other &#8220;must read&#8221; post:</p>
<p><em><a id="doow" title="&quot;the personal well-tempered environment,&quot;" href="http://www.cityofsound.com/blog/2008/01/the-personal-we.html" target="_blank">&#8220;The Personal Well-Tempered Environment&#8221;</a> is full of &#8220;fascinating propositions&#8230; they&#8217;re relevant to things I&#8217;m interested in&#8230;&#8221;</em></p>
<p>In a summary of his ideas on the personal well-tempered environment, Dan Hill writes:</p>
<p><em>A real-time dashboard for buildings, neighbourhoods, and the city, focused on conveying the energy flow in and out of spaces, centred around the behaviour of individuals and groups within buildings.</em></p>
<p><em>A form of &#8216;BIM 2.0&#8242; that gives users of buildings both the real-time and longitudinal information they need to change their behaviour and thus use buildings, and energy, more effectively. An ongoing post-occupancy evaluation for the building, the neighbourhood and the city.</em></p>
<p><em>A software service layer for connecting things together within and across buildings.</em></p>
<p><em>As information increasingly becomes thought of as a material within buildings, it makes sense to consider it holistically as part of the built fabric, as glass, steel, ETFE etc.</em></p>
<h3>Interview With Usman Haque</h3>
<p><strong>Tish Shute:</strong> You have been involved in many awesome projects, but Pachube seems to be quite a new direction. What are the key influences in your career and the development of your thinking? And could you tell me more about how your previous work brought you to creating Pachube? Is Pachube a central focus for you and Haque Design + Research now?</p>
<p><strong>Usman Haque:</strong><em> To me Pachube is the logical culmination of everything I&#8217;ve worked on for the last 12 years since finishing my post-grad architecture studies.</em></p>
<p><em>A lot of my work until now has centered around large-scale mass-collaboration interactive &#8220;spectacles&#8221; involving many thousands of members of the public at once. I found this a good medium in which (a) to explore strategies for collaboration that take account of the granularity of participation (i.e. the fact that different people have different interests, skills and intentions in any participative act); and (b) to work at an urban scale; i.e. in a way that has an effect at the scale of buildings, parks, and streetscapes etc.</em></p>
<p><em> <a id="kr8h" title="Open Burble" href="http://www.haque.co.uk/openburble.php" target="_blank">Open Burble</a> was a good example of this approach: essentially a framework composed of 2m carbon-fibre modules, it had electronics embedded in 1000 helium balloons. Members of the public could configure and assemble these, inflate them and then unfurl the complex structure up to the scale of a 15-storey building. Finally, by shaking, rowing, twisting and bending a handlebar embedded with sensors (the same as in the Wii controller, as it happens), dozens of people at once could have an effect on the Burble&#8217;s position and the colours streaming through it.</em></p>
<p><em><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/openburble2.jpg"><img class="alignnone size-full wp-image-2832" title="openburble2" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/openburble2.jpg" alt="openburble2" width="509" height="338" /></a><br />
</em></p>
<p><a href="http://www.haque.co.uk/openburble.php" target="_blank">Open Burble, Singapore Biennale 2006</a></p></div>
<p><em>Along the way I became interested at times in what an &#8220;operating system&#8221; might mean in the context of architecture (see the paper <a id="cxpf" title="Hardspace, Softspace and the possibilities of open source architecture, 2002" href="http://www.haque.co.uk/papers/hardsp-softsp-open-so-arch.PDF" target="_blank">Hardspace, Softspace and the possibilities of open source architecture, 2002 (PDF)</a>), particularly an &#8220;open source&#8221; operating system (Urban Versioning System, <a id="yvjc" title="http://uvs.propositions.org.uk/" href="http://uvs.propositions.org.uk/" target="_blank">http://uvs.propositions.org.uk/</a>). I was also interested in developing tools for supposedly &#8220;non-technical&#8221; people to start building their own interactive systems or environments, hence the release of <a id="zv:-" title="The &quot;Low Tech Sensors &amp; Actuators for Artists and Architects&quot;" href="http://lowtech.propositions.org.uk/" target="_blank">The &#8220;Low Tech Sensors &amp; Actuators for Artists and Architects&#8221;</a> pamphlet, co-authored with an old friend, <a id="w-ad" title="Adam Somlai-Fischer" href="http://www.aether.hu/" target="_blank">Adam Somlai-Fischer</a>, back in 2005.</em></p>
<p><em>An off-shoot of this has been an obsession with <a id="ahue" title="trying to rescue the concept of &quot;interaction&quot;" href="http://mags.acm.org/interactions/20090102/?pg=71" target="_blank">trying to rescue the concept of &#8220;interaction&#8221;</a> from oblivion &#8211; I say oblivion because I think the really exciting possibilities of the concept of interaction are being lost because we&#8217;re being sold a billion so-called &#8220;interactive&#8221; devices and gadgets that are, in fact, merely &#8220;reactive&#8221;. In this, <a id="t5h7" title="I turn often to the work of cybernetician Gordon Pask" href="http://www.haque.co.uk/papers/architectural_relevance_of_gordon_pask.pdf" target="_blank">I turn often to the work of cybernetician Gordon Pask</a>, particularly active in the 50s, 60s and 70s in the development of truly interactive systems. (And also a collaborator with <a id="gt4p" title="Cedric Price" href="http://en.wikipedia.org/wiki/Cedric_Price" target="_blank">Cedric Price</a>, one of my favourite architects).</em></p>
<p><em>Which brings me to Pachube, which is now occupying pretty much all my time and will do for the foreseeable future. (<a id="qdfj" title="Haque Design + Research" href="http://www.haque.co.uk/" target="_blank">Haque Design + Research</a> still continues, and has a separate team &#8212; it will be involved mostly in the kinds of things it has done in the past, but also in pushing development of things that *use* Pachube, such as the project <a id="h:9w" title="Natural Fuse" href="http://www.haque.co.uk/naturalfuse.php" target="_blank">Natural Fuse</a>).</em></p>
<p><em>Pachube came about as a direct attempt to enable the production of dynamic, responsive, conversant &#8216;environments&#8217;. It basically evolved out of three strands of thought.</em></p>
<p><em>The first was the notion of the <strong>geographical non-specificity of architecture</strong> these days. By this I mean that, for many of us now, &#8220;home&#8221; is an idea constructed from several places &#8211; we live and work in environments composited by networked technology from fragments that bridge huge geographical distances. These environments are resolutely &#8220;human&#8221; (in the sense of being inhabited, designed and determined by people) yet context-free (because they do not privilege geographical location). I wanted to find a way to &#8220;connect&#8221; up remote spaces, much like <a id="ubie" title="Remote Home" href="http://www.tobi.net/remotehome/remotehome.htm" target="_blank">Remote Home</a> and a whole range of other projects had done, but in a generalized way so that it would be possible to keep adding to the ecosystem of connected environments on an ad hoc basis; a global architecture if you will.</em></p>
<p><em>The second strand of thought came from the <strong>desire to open up the production process of &#8220;smart homes.&#8221;</strong> I&#8217;m concerned by developments in ubiquitous computing whereby &#8220;making technology invisible&#8221; equates to placing the design and construction process solely in the hands of knowledgeable others. Whereas it&#8217;s still possible more or less to do DIY on your home, if many ubicomp technologists had their way it would become less and less possible simply because of the complexity of reverse-engineering such closed systems. It&#8217;s already a problem with larger buildings: service companies go out of business, proprietary skills or tools disappear and complex lighting and sensor systems remain unused. So, with Pachube I wanted to help foster a more open way of developing the discipline: to embrace the concept of the maker, and to help people negotiate their technological future.</em></p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/reconfigurablehouse.jpg"><img class="alignnone size-full wp-image-2781" title="reconfigurablehouse" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/reconfigurablehouse.jpg" alt="reconfigurablehouse" width="419" height="107" /></a></p>
<p><em><a id="ex31" title="Reconfigurable House" href="http://haque.co.uk/reconfigurablehouse.php" target="_blank">Reconfigurable House</a>, an environment constructed from thousands of low tech components that can be &#8220;reconfigured&#8221; by its occupants.</em></p>
<p><em>The final strand of thought relates to Pachube&#8217;s emphasis on <strong>&#8220;environments&#8221; rather than &#8220;sensors.&#8221; </strong>I believe that one of the major failings of the usual ubicomp approach is to consider the connectivity and technology at the object-level, rather than at the environment-level. It&#8217;s built into much of contemporary Western culture to be object-centric, but at the level of &#8220;environment&#8221; we talk more about context, about disposition and subjective experience. An &#8216;environment&#8217; has dynamic frames of reference, all of which are excluded when simply focusing on devices, objects or mere sensors. If one really studies deeply what an &#8216;environment&#8217; is (by this I mean more than simply saying that &#8220;it&#8217;s what things exist in&#8221;), one begins to understand that an environment is a construction process and not a medium; nor is it a state or an entity. In this I would refer to Gordon Pask&#8217;s phenomenally important text &#8220;Aspects of Machine Intelligence&#8221; in Nicholas Negroponte&#8217;s <a id="hlcg" title="Soft Architecture Machine" href="http://www.amazon.com/Soft-Architecture-Machines-Nicholas-Negroponte/dp/0262140187" target="_blank">Soft Architecture Machine</a>, though it makes for extremely tough reading (Negroponte compared it in importance to Alan Turing&#8217;s contributions to the computer science discipline).</em></p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/pachube1.jpg"><img class="alignnone size-full wp-image-2782" title="pachube1" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/pachube1.jpg" alt="pachube1" width="411" height="275" /></a></p>
<p><em>Ultimately, though, Pachube is here to make it easier to participate in what I expect to be a vast &#8216;eco-system&#8217; of conversant devices, buildings &amp; virtual environments. Pachube will facilitate the development of a huge range of new products and services that will arise from extreme connectivity. It&#8217;s relatively easy for large technology companies like <a id="ps11" title="Nike and Apple" href="http://www.apple.com/ipod/nike/" target="_blank">Nike and Apple</a> to transition into the Internet of Things, but Pachube will be particularly helpful for that huge portion of smaller scale industry players that *want* to become part of it, but which are only now waking up to the potentials of the internet &#8212; small and medium scale designers, manufacturers and developers who are very good at developing their products but don&#8217;t have the resources to develop in-house a massive infrastructure for their newly web-enabled offerings. Basically, having built a generalized data-brokering backend to connect physical (and virtual) entities to the web, others can now start to build the applications that make the connections really useful.</em></p>
<p><strong>Tish Shute:</strong> You mentioned that both Bruce Sterling and Gavin Starks (AMEE) have given input on Pachube. Can you describe any specific ways they (and others) have influenced the evolution of Pachube? You also mentioned the concept of &#8220;engaged responsible spime wrangling&#8221; when we talked on Skype.</p>
<p><strong>Usman Haque:</strong> <em>Yes, I am very grateful to a whole bunch of people who have helped, prodded, nudged and advised along the way!</em></p>
<p><em>I asked Bruce to be a &#8220;visionary&#8221; adviser because he was one of the people early on to envisage the concepts and ramifications of <a id="v5w3" title="&quot;spimes&quot; (his neologism for 'space-time objects')" href="http://www.boingboing.net/images/blobjects.htm" target="_blank">&#8220;spimes&#8221; (his neologism for &#8216;space-time objects&#8217;)</a>. While I agree that &#8220;spimes&#8221; are directly relevant, what I found most important from his conception was the concept of &#8220;wrangling&#8221; &#8211; being actively and productively engaged and responsible in the development of spimed environments. I think it was a crucial leap: to talk about &#8220;wranglers&#8221; rather than &#8220;end-users&#8221;. So the kinds of questions I&#8217;ve turned to him for regard how to nudge people away from being &#8220;end users&#8221; and towards being &#8220;wranglers&#8221;; and about how to transition from being a &#8220;hacker toy&#8221; to &#8220;major infrastructure&#8221;. He had some great (and invaluable) responses, of which one of the most important to me was something he said in email: &#8220;&#8230;I think total openness is fatal. It&#8217;s like lying in a blazing sun under a sky full of vultures, naked. It&#8217;s also rather rude, like babbling anything or anything that flies into your head and still expecting people to pay attention.&#8221;</em></p>
<p><em><a id="qrs7" title="Gavin Starks" href="http://www.amee.cc/" target="_blank">Gavin Starks</a> and also <a id="bbd." title="Dopplr's" href="http://www.dopplr.com/" target="_blank">Dopplr&#8217;s</a> <a id="aqy:" title="Matt Biddulph" href="http://www.hackdiary.com/" target="_blank">Matt Biddulph</a> have been sort of &#8220;friendly neighbours&#8221; to Pachube: they&#8217;ve made some great introductions and I turn to them often for advice on being a London start-up. What&#8217;s been really useful for me is that they are active in a related area and have directly useful advice: Gavin, of course, since he&#8217;s involved in <a id="lzoi" title="metering the world's energy" href="http://www.amee.cc/" target="_blank">metering the world&#8217;s energy</a>; and Matt perhaps less tangibly in his day job as Dopplr&#8217;s CTO but more so in his <a id="jav_" title="active Arduino-enabled social life" href="http://tinker.it/now/2009/01/20/toy-hacking-workshop-09/" target="_blank">active Arduino-enabled social life</a>!</em></p>
<p><em>One very important Pachube advisor has been <a id="qjz0" title="Dr. Paul Pangaro" href="http://www.pangaro.com/" target="_blank">Dr. Paul Pangaro</a>, who has previously been CTO at a number of technology startups, and brings vital experience from his time at Sun Microsystems as Senior Director and Distinguished Market Strategist. (Oh, and he&#8217;s also a former student and collaborator of Gordon Pask&#8217;s!) He has been very helpful in developing a viable business model in conjunction with my brother Yusuf Haque, who, with his experience in raising capital for startups, has led the fundraising process.</em></p>
<p><em>Of course, direct daily input from the Pachube team has been vital to the development of the project, and without <a id="nyoj" title="Chris Leung" href="http://www.chrisleung.org/" target="_blank">Chris Leung</a> (EEML development) and <a id="xr8l" title="Sam Mulube" href="http://twitter.com/smazero" target="_blank">Sam Mulube</a> (backend development) it would be a very different thing indeed!</em></p>
<p><strong>Tish Shute:</strong> Now the emerging internet is the world as a networked, enhanced virtual/reality environment &#8211; sorry about the inadequate terminology, but as you said &#8220;the distinction between real and virtual is becoming as quaint as the distinction between mind and body&#8221;. You are participating in the <a id="k7s8" title="Sentient City" href="http://www.situatedtechnologies.net/?q=node/89" target="_blank"><strong>Sentient City</strong> exhibition organized by the </a><a href="http://www.archleague.org/" target="_blank">Architectural League of New York for September 2009.</a></p>
<p>Could you explain more about the Sentient City project, and about your contribution, Natural Fuse, which uses common house plants, energy-monitoring sensors, and Pachube to create &#8220;a city-wide network of electronically-assisted plants that act as carbon-cycle circuit-breakers in much the same way as conventional electrical circuit-breakers do&#8221;?</p>
<p><strong>Usman Haque: </strong><em>Situated Technologies, founded to explore the impact of &#8220;situated&#8221; technologies (i.e. locative media, etc.) in urban spaces, kicked off with a <a id="b77z" title="symposium organised by Mark Shepard, Omar Khan and Trebor Scholz" href="http://www.situatedtechnologies.net/?q=node/1" target="_blank">symposium organised by Mark Shepard, Omar Khan and Trebor Scholz</a> and supported by the <a id="o7a4" title="Architecture League of New York" href="http://www.archleague.org/" target="_blank">Architecture League of New York</a> a couple of years ago, and continued through <a id="o5o6" title="a series of pamphlets" href="http://www.situatedtechnologies.net/?q=node/75" target="_blank">a series of pamphlets</a> (the first by Adam Greenfield &amp; Mark Shepard; the second by me and Matthew Fuller; the third and fourth by Benjamin Bratton &amp; Natalie Jeremijenko and Laura Forlano &amp; Dharma Dailey). This is now culminating in an exhibition, &#8220;Toward the Sentient City&#8221;, opening in September 2009, as a public manifestation of many of the concepts raised over the years.</em></p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/plantcircuit1.jpg"><img class="alignnone size-full wp-image-2783" title="plantcircuit1" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/plantcircuit1.jpg" alt="plantcircuit1" width="400" height="289" /></a></p>
<p><em><a id="k48e" title="Natural Fuse" href="http://www.haque.co.uk/naturalfuse.php" target="_blank">Natural Fuse</a>, a project funded by the Architecture League to be part of that exhibition, is really a Haque Design + Research project rather than a Pachube project alone. It came about for two reasons. The first was because we had been investigating for several months many different ways to use plants and vegetation in interactive architectural design: as living walls, as responsive systems, as visual and olfactory indicators, as passive ventilation &#8212; fantastic research undertaken predominantly by my invaluable production assistant Barbara Jasinowicz. We were particularly interested in energy creation and monitoring and had made a number of (unsuccessful) proposals to develop building systems based on plant interaction. The second was because I wanted to have a good demonstration project for Pachube: a system that was not just end-to-end single-point communication, but one in which the system increased its efficiency over time through more and more geographically-dispersed connections. So Natural Fuse developed through a series of conversations with a very intelligent and witty designer <a id="ed_l" title="Nitipak (Dot) Samsen" href="http://www.dotmancando.info/" target="_blank">Nitipak (Dot) Samsen</a> who was then an intern and who will now lead design work along with <a id="w9.y" title="Cesar Harada" href="http://www.cesarharada.com/" target="_blank">Cesar Harada</a> (similarly intelligent and witty!).</em></p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/plantfusecare1.jpg"><img class="alignnone size-full wp-image-2784" title="plantfusecare1" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/plantfusecare1.jpg" alt="plantfusecare1" width="400" height="322" /></a></p>
<p><em>Briefly, the point of Natural Fuse is to use networked plants, based on the Arduino ethernet platform, harnessing their carbon-sinking capability to create a city-wide network of electronically-assisted plants that act both as energy providers and as shared &#8220;carbon sink&#8221; circuit breakers. By sharing resources and information between the plants, energy expenditure can be collectively monitored and managed. The purpose is to create a collective &#8220;carbon sink&#8221; that offsets the amount of energy consumed by the plant owners &#8211; a natural &#8220;circuit breaker&#8221;. If people cooperate on their energy expenditure then the plants thrive (and they can all use more energy); but if they don&#8217;t then the network starts to kill plants, thus diminishing the network&#8217;s energy capacity. Of course, the network functionality is enabled by Pachube. The plan is to distribute these to some households in New York and offer plans and downloads for people to build their own as well.</em></p>
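<p>The cooperative circuit-breaker logic described here boils down to a simple rule: grant energy use only while collective demand stays within the living plants&#8217; collective carbon sink. A minimal sketch, in which the class names, the per-plant sink figures and the &#8220;one plant dies per overshoot&#8221; policy are all illustrative assumptions rather than the project&#8217;s actual implementation:</p>

```python
# Toy model of Natural Fuse's shared carbon-sink circuit breaker:
# energy use is granted only while the living plants' collective
# carbon-sinking capacity covers the collective demand; overshoot
# "blows" a fuse by killing one plant, shrinking everyone's budget.
from dataclasses import dataclass

@dataclass
class NaturalFuse:
    owner: str
    sink_kg_per_day: float  # assumed daily CO2 offset of one plant
    alive: bool = True

class FuseNetwork:
    def __init__(self, fuses):
        self.fuses = list(fuses)

    def capacity(self):
        # Total carbon the living plants can offset per day.
        return sum(f.sink_kg_per_day for f in self.fuses if f.alive)

    def request(self, demand_kg_per_day):
        """Return True if the network can cover the demand; otherwise
        sacrifice one living plant and refuse the request."""
        if demand_kg_per_day <= self.capacity():
            return True
        for f in self.fuses:
            if f.alive:
                f.alive = False  # the "fuse" blows
                break
        return False
```

<p>With three plants sinking 0.5 kg/day each, a 1.0 kg/day demand is granted; a 2.0 kg/day demand kills one plant and permanently drops the network&#8217;s capacity to 1.0 kg/day, which is the collective-punishment dynamic Haque describes.</p>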
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/plantfusesystem1.jpg"><img class="alignnone size-full wp-image-2785" title="plantfusesystem1" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/plantfusesystem1.jpg" alt="plantfusesystem1" width="432" height="214" /></a></p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/plantfuseunit.jpg"></a><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/plantfuseunit1.jpg"><img class="alignnone size-full wp-image-2786" title="plantfuseunit1" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/plantfuseunit1.jpg" alt="plantfuseunit1" width="443" height="197" /></a></p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/naturalfusenetwork2.jpg"><img class="alignnone size-full wp-image-2787" title="naturalfusenetwork2" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/naturalfusenetwork2.jpg" alt="naturalfusenetwork2" width="462" height="348" /></a><br />
<strong><br />
Tish Shute:</strong> You describe Pachube as linking environments not just sensor to sensor (as sensorbase.org does) &#8211; an environment for Pachube could be a web page. An essential concept in Pachube is the concept that anything could be an environment and such environments are treated equivalently with EEML. You describe EEML as a protocol that sits comfortably with existing building protocols &#8220;what it brings to the picture is the ability to describe buildings that change.&#8221;</p>
<p>How will EEML change our understanding of architecture and enable the view of architecture that &#8220;includes smells, sounds, light, electromagnetic fields &#8211; buildings as dynamic and changing?&#8221; (Prasad Passive House?)</p>
<p>You describe EEML as straddling, and designed to work alongside, the IFC construction industry format. Who is involved in the creation of EEML? Could you explain a little bit how it is different from SensorML? You mentioned little has been done regarding post-construction evaluation of buildings. How will EEML enable buildings to share strategies (for example on energy consumption), as you put it?</p>
<p><strong>Usman Haque:</strong> <em>The <a id="gv6y" style="color: #551a8b;" title="Extended Environments Markup Language (EEML)" href="http://www.eeml.org/" target="_blank">Extended Environments Markup Language (EEML)</a> (which is the protocol around which much of Pachube is based) is being developed to make the idea of &#8220;dynamic, responsive and conversant environments&#8221; a reality. It works with existing construction standards like <a id="l7sl" style="color: #551a8b;" title="Industry Foundation Classes (IFC)" href="http://en.wikipedia.org/wiki/Industry_Foundation_Classes" target="_blank">Industry Foundation Classes (IFCs)</a>, but exists to extend them to account for dynamic, responsive and, dare I say it, conversant buildings. In the perhaps prosaic world of construction, this helps to facilitate a number of architectural requirements such as <a id="i2_j" style="color: #551a8b;" title="post-occupancy evaluation" href="http://www.google.com/search?hl=en&amp;client=safari&amp;rls=en&amp;defl=en&amp;q=define:post+occupancy+evaluation&amp;sa=X&amp;oi=glossary_definition&amp;ct=title" target="_blank">post-occupancy evaluation</a>, realtime site-based environmental feedback at the design phase and simulations that synchronise with real-world installation. With <a id="hxs4" style="color: #551a8b;" title="EEML" href="http://www.eeml.org/" target="_blank">EEML</a> and Pachube you&#8217;ll be able to start working with, say, an AutoCAD model at the design phase, and include *real time* environmental data from the site, as well as to model expected sensor and assumed energy consumption data of the design; use the same model during the construction phase (because it will translate fine to standard modelling descriptions), and keep working with the same set of information even after the building is occupied and running &#8212; making it a whole lot easier to learn from the design and maintenance processes than it is currently.</em></p>
<p><em>At the same time this does not exclude the possibility of talking about &#8220;sensors&#8221; (as <a id="swia" title="SensorML" href="http://en.wikipedia.org/wiki/SensorML" target="_blank">SensorML</a> wants to), but we are more easily able to consider, say, the dozens of different ways that different clients will want to address, access or search for those sensors; the changing contextual motivations for actually processing sensor information; and the capacity for flexible sensor ontologies &#8212; where you don&#8217;t need to know from the beginning everything you&#8217;ll be looking for once you&#8217;ve recorded mountains of data.</em></p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/environmentsconnected.jpg"><img class="alignnone size-full wp-image-2792" title="environmentsconnected" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/environmentsconnected.jpg" alt="environmentsconnected" width="454" height="151" /></a></p>
<p><em>As a result of this conception of &#8220;environment&#8221; we remove the need for a distinction between &#8220;real&#8221; and &#8220;virtual&#8221;. We can consider, equally as &#8216;environments&#8217;, a mountainside, the interior of a building, the context of a webpage, the internal status and external context of a mobile device, the interactions within something like Second Life &#8212; all these are environments and can communicate with each other on equivalent terms. More importantly, a single &#8220;environment&#8221; can be expressed as a snapshot in time; or it can be expressed as a sequence of many snapshots over several years.</em></p>
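<p>One way to picture environments being &#8220;treated equivalently&#8221;, and the snapshot-versus-history duality, is a minimal data model in which any environment (a mountainside, a building interior, a webpage, a virtual world) is just a titled, located collection of timestamped datastreams. The field names below are illustrative assumptions, not EEML&#8217;s actual schema:</p>

```python
# Minimal sketch of an "environment, not sensor" data model: the same
# structure serves as a single snapshot (latest values) or, accumulated
# over time, a multi-year history. Names are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class Datastream:
    tag: str                                     # e.g. "temperature"
    values: list = field(default_factory=list)   # (timestamp, reading)

    def record(self, ts, reading):
        self.values.append((ts, reading))

@dataclass
class Environment:
    title: str
    location: tuple                              # (lat, lon)
    streams: dict = field(default_factory=dict)

    def update(self, tag, ts, reading):
        self.streams.setdefault(tag, Datastream(tag)).record(ts, reading)

    def snapshot(self):
        # The environment "at an instant": latest reading per stream.
        return {t: s.values[-1][1] for t, s in self.streams.items() if s.values}
```

<p>Nothing here distinguishes a physical room from a Second Life parcel: both are just environments with datastreams, which is the equivalence Haque is after.</p>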
<p><em>One very important thing we&#8217;re looking at now is how to transition the protocol from something that is status-based, to something that can express transactions, goals and processes. We&#8217;ve just started looking at how <a id="e7.0" title="RDF" href="http://en.wikipedia.org/wiki/Resource_Description_Framework" target="_blank">RDF</a> and <a id="khn." title="machine tags" href="http://en.wikipedia.org/wiki/Machine_tag" target="_blank">machine tags</a> might help in this, largely spurred on by perceptive comments from one of my favourite designers, <a id="mit9" title="Toxi, a.k.a. Karsten Schmidt" href="http://postspectacular.com/" target="_blank">Toxi, a.k.a. Karsten Schmidt</a>.</em></p>
<p><strong>Tish Shute:</strong> You mentioned that you see &#8220;smart&#8221; buildings and &#8220;smart&#8221; cities as environments, not just collections of devices? On the Pachube web page there is a chart describing potential interactions between entities (one to one, one to many, etc.) but you do not give many pointers to how two unrelated objects that are connected would derive any value out of the connection&#8230; could you give me some examples of the kinds of use cases (Natural Fuse is one of course!) and interesting new opportunities to create shared value that Pachube will enable?</p>
<p><strong>Usman Haque:</strong> <em>Yes, I recognize that the Pachube website information leaves a lot to be desired&#8230;! Apart from a whole lot of conceptual information that&#8217;s missing, there are a number of undocumented API features that nobody has yet uncovered!</em></p>
<p><em>Well, in answer to your question: much of it is intuition &#8211; I don&#8217;t know exactly _how_ it will be valuable but I do expect the community to find ways to make such seemingly disparate interoperability valuable.</em></p>
<p><em>To take a prosaic example: say (once privacy options are introduced) that a manufacturer creates a <a id="s53b" title="Pachube input application" href="http://community.pachube.com/?q=node/100" target="_blank">Pachube input application</a>, like an electricity meter that automatically charts on Pachube. There is a certain benefit to its customers in being able to monitor their usage over time and to compare their usage to the aggregation of others in a similar class, but anonymised. Say that someone else has produced a Pachube output application like a <a id="fhjs" title="mobile phone Pachube viewer" href="http://www.rcreations.com/freeandroidgphoneg1applications" target="_blank">mobile phone Pachube viewer</a>. Now the electricity meter users can use this new output application as an extension to be able to monitor their consumption on a mobile phone. Now, imagine if someone else develops a new product, a <a id="j.l-" title="networked lamp" href="http://www.goodnightlamp.com/" target="_blank">networked lamp</a> &#8212; it would now be very easy for that designer to write a little app to make the networked lamp switch on (or change brightness) according to the electricity consumption, even remotely. The point is that the more input and output apps are added the more valuable they each become.</em></p>
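<p>A rough sketch of what such input-to-output chaining might look like for a developer. The feed URL pattern and the X-PachubeApiKey header follow Pachube&#8217;s CSV API as documented at the time, but treat them here as assumptions; the lamp and the brightness mapping are entirely hypothetical:</p>

```python
# Hypothetical glue app: read a household electricity feed from
# Pachube and map the reading onto a networked lamp's brightness.
import urllib.request

FEED_CSV = "http://www.pachube.com/api/feeds/{feed_id}.csv"  # assumed endpoint

def read_watts(feed_id, api_key):
    """Fetch the feed's current value (first CSV field) as watts."""
    req = urllib.request.Request(
        FEED_CSV.format(feed_id=feed_id),
        headers={"X-PachubeApiKey": api_key},  # assumed auth header
    )
    with urllib.request.urlopen(req) as resp:
        return float(resp.read().decode().split(",")[0])

def lamp_brightness(watts, max_watts=3000.0):
    """Map 0..max_watts onto a 0-255 brightness value, so the lamp
    glows brighter the more electricity the remote household draws."""
    return min(255, int(255 * watts / max_watts))
```

<p>The designer of the lamp never needs to know anything about the meter manufacturer: both sides only speak to Pachube, which is what makes each added input or output app more valuable to all the others.</p>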
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/scatteredhouse.jpg"><img class="alignnone size-full wp-image-2791" title="scatteredhouse" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/scatteredhouse.jpg" alt="scatteredhouse" width="443" height="109" /></a><br />
</p>
<p><em><a id="tzsq" title="Scattered House" href="http://www.haque.co.uk/scatteredhouse.php" target="_blank">Scattered House</a>, like Reconfigurable House, but spread throughout various cities in the world to demonstrate the implications of designing environments and buildings in the context of family diasporas and ubiquitous ad hoc networked connectivity.</em></p>
<p><em>Part of Pachube&#8217;s emphasis, in not making specific connections more important than others, is that the community can develop new types of connection. So, of course it makes it relatively simple to create remote-control connections between seemingly unrelated entities (like mobile phones and houses, or web pages and furniture); it makes it relatively simple to connect up environmental conditions from the physical world to the seemingly distant Second Life (or, more interestingly to me, <a id="iqkx" title="OpenSim" href="http://opensimulator.org/wiki/Main_Page" target="_blank">OpenSim</a>), which can make it a more viable interactive environment; and it makes data aggregation and comparison possible between wide ranges of energy consumers to facilitate aggregation analysis. But the point really is to make it easy for people and companies to build in this kind of connectivity and invent new uses.</em></p>
<p><em>Through my close association with <a id="sin8" title="The Bartlett, University College London's architecture school" href="http://www.bartlett.ucl.ac.uk/" target="_blank">The Bartlett, University College London&#8217;s architecture school</a>, I hope to develop some particularly relevant use-case scenarios for the architectural industry. I think we&#8217;ve really not even begun to imagine the kinds of applications that will be important, though I guess Natural Fuse exemplifies the kind of approach I would like to see in Pachube-enabled applications: one in which the collective/hive experience contributes towards some end goal, to make it possible to create a &#8220;wikipedia of environments&#8221; as opposed to a web-based Wikipedia &#8211; it&#8217;s not that I necessarily want to create these things myself, but rather I want to make it </em><em>possible to create such things.</em></p>
<p><strong>Tish Shute:</strong> You mentioned that you hope Pachube will be the place to connect smart products &#8211; product to product communication? Also you mentioned that you would like to have a way for smart products to self-register with Pachube. While all feeds are public now, you are going to create groups with different levels of privacy. Both of the aforementioned features would enable more business applications for Pachube. But could you describe the business model for Pachube?</p>
<p><strong>Usman Haque:</strong> Essentially, there are three facets to the business model. The first takes a cue from <a id="irzp" title="Flickr" href="http://www.flickr.com/upgrade/" target="_blank">Flickr</a> in recognising that there are those who would like a more sophisticated set of services as &#8220;professional&#8221; accounts. The second is to be able to provide a set of tools and applications for medium scale manufacturers and developers who want to web-enable their offerings, who will be able to take advantage of the growing repository of Pachube.Apps and add-ons, and who want the convenience, security and economy that Pachube will be able to offer. The third approach is to become more directly involved in large-scale urban infrastructure projects. There is a fourth facet, but we consider it the killer, so I&#8217;m keeping quiet for the moment&#8230;.</p>
<p>So yes, in order to make all these things more useful we&#8217;ll soon be introducing a range of privacy options on feeds, the ability to create &#8220;aggregates&#8221; from collections of feeds, and the possibility of groups, organised around feeds. Another thing we&#8217;re hoping to introduce soon is open environment-level tagging, so that anyone will be able to tag environments, though there will be a way of evaluating the importance of any given tag.</p>
<p><strong>Tish Shute: </strong>I know you mentioned that you are trying to find tools that allow people to contribute to their environment. There are a number of projects aimed at providing tools that will help people and businesses reduce their carbon footprint &#8211; <a id="a2qc" title="The Carbon Account," href="http://www.thecarbonaccount.com/" target="_blank">The Carbon Account</a>, AMEE, Wattzon, <a id="f8y3" title="Onzo" href="http://www.onzo.co.uk/" target="_blank">Onzo</a>. Is Pachube working with any of these projects, and how?</p>
<p>What are the most interesting ideas in this area of changing our relationship to energy consumption emerging from Pachube?</p>
<p><strong>Usman Haque: </strong><em>The carbon footprint calculating industry is getting quite crowded&#8230;! So far I&#8217;ve particularly appreciated AMEE&#8217;s API (which is also used by the Carbon Account, I believe). So we have just released a Pachube.App &#8216;plugout&#8217; which will take a feed from an electricity meter tagged &#8220;watts&#8221; or &#8220;kilowatts&#8221; and convert it into a realtime carbon footprint calculation (driven by AMEE&#8217;s international and region- and supplier-specific carbon conversion factors). So it should be really easy to discover how many kilograms of CO2 you generated in the last 15 minutes&#8230; the last hour&#8230; the last 24 hours. Here&#8217;s a list of some of the feeds that are already making use of this: http://www.pachube.com/tag/co2_last_15_mins</em><br />
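<p>The conversion such a plugout performs is simple arithmetic once a carbon factor is known. A minimal sketch &#8211; the 0.43 kg CO2/kWh factor below is invented for illustration; the real AMEE service supplies region- and supplier-specific factors:</p>

```python
def co2_kg(watts, hours, factor_kg_per_kwh=0.43):
    """Convert a constant power draw over a period into kg of CO2.

    factor_kg_per_kwh is a made-up illustrative grid factor; AMEE
    supplies real region- and supplier-specific conversion factors.
    """
    kwh = watts * hours / 1000.0   # energy used, in kilowatt-hours
    return kwh * factor_kg_per_kwh

# A 2 kW load running for the last 15 minutes:
print(round(co2_kg(2000, 0.25), 3))  # 0.215 kg CO2
```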
<strong><br />
Tish Shute:</strong> I know the Arduino community has really taken an interest in Pachube. Who are the early adopters on Pachube? What are the most prevalent use cases you have seen so far?</p>
<p><strong>Usman Haque:<em> </em></strong><em>It has actually been more difficult than I thought it would be to get the Arduino community interested. This has partly been due to the difficulty of internet-enabling Arduino (until recently, adding ethernet access was a bit of a tough chore). Now that it&#8217;s easier to connect up Arduinos, some of the early adopters have been interfacing Arduino to Current Cost meters (alleviating the need for a computer in between); and others have been doing things like tracking temperature, humidity and light level in their homes and offices. <a id="ohbg" title="Pachube user C4C" href="http://www.gomaya.com/glyph/" target="_blank">Pachube user C4C</a> has been pretty active from early on: http://www.pachube.com/feeds/1284</em><br />
<strong><br />
Tish Shute:</strong> Pachube is input-heavy at the moment &#8211; you mentioned not many actuators are plugged into Pachube yet. You said this is in part because you have focused on making the backend robust and stable before taking a lot of hits. What new directions for Pachube will emerge from enabling the dynamic relationship between sensors and actuators?</p>
<p><strong>Usman Haque:</strong> <em>This will be a crucial evolution in Pachube, when we make actuators more evident. It&#8217;s input-heavy at the moment basically in the sense of being easy to see the inputs &#8212; you add &#8220;inputs&#8221; rather than &#8220;outputs&#8221;, so at the moment we have no idea of what&#8217;s actually plugged into the outputs unless people tell us! However, we know that there are plenty of outputs because they&#8217;re making API requests; we just don&#8217;t know what they&#8217;re being used for! Once the concepts of actuators and output environments get built into the system, I think we&#8217;ll know a lot more about how people are using the system.</em></p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/currentcost.jpg"><img class="alignnone size-full wp-image-2794" title="currentcost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/currentcost.jpg" alt="currentcost" width="444" height="150" /></a></p>
<p><em>To make this easier in the meantime, we recently announced the <a id="zp60" title="Pachube.apps" href="http://apps.pachube.com/" target="_blank">Pachube.apps</a> site, where people can start contributing Pachube &#8216;plugins&#8217; and &#8216;plugouts&#8217; &#8212; things that can be used by others, without needing to code or hack, to create, generate or modulate Pachube inputs and outputs. One of these was <a id="htj9" title="Status2Pachube" href="http://apps.pachube.com/online-status.html" target="_blank">Status2Pachube</a>, which turns the online status of AIM, MSN Messenger, Skype or Yahoo! Messenger users into a Pachube input feed (to make it easy to create &#8220;remote presence&#8221; orbs and such); another was the <a id="wjey" title="CurrentCost2Pachube" href="http://community.pachube.com/?q=node/100" target="_blank">CurrentCost2Pachube</a> app to make it easy to connect up Current Cost electricity meters as input feeds; all of which can then be used by Pachube output apps, like the <a id="xki1" title="G1 Android phone Pachube viewer" href="http://www.rcreations.com/freeandroidgphoneg1applications" target="_blank">G1 Android phone Pachube viewer</a> by Pachube user N4Spd, or in the soon-to-launch <a id="pd2x" title="Pachube2SketchUp" href="http://apps.pachube.com/" target="_blank">Pachube2SketchUp</a> plugout, which will direct Pachube outputs into Google SketchUp (and by extension Google Earth) in order to generate or modulate 3-d models in response to realtime environmental/sensor data. (Pachube2SketchUp is pretty much finished for Mac OS X &#8212; but we&#8217;re having difficulty getting it to work on Windows, because of its sometimes pigheaded security measures&#8230; we&#8217;ll probably release it for Mac OS X alone soon anyway.)</em></p>
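<p>On the wire, a &#8220;plugin&#8221; or &#8220;plugout&#8221; boils down to HTTP requests against a feed URL. The sketch below builds (but does not send) such requests; the URL layout and the <code>X-PachubeApiKey</code> header name are my assumptions modelled on the v1-era API, so check them against the current documentation:</p>

```python
from urllib.request import Request

API_BASE = "http://www.pachube.com/api"  # assumed v1-era base URL

def feed_request(feed_id, fmt="csv", api_key="YOUR_KEY", value=None):
    """Build a GET (read) or PUT (update) request for a Pachube feed.

    The URL layout and 'X-PachubeApiKey' header are assumptions from
    the v1-era API, not verified against current documentation.
    """
    url = "%s/feeds/%s.%s" % (API_BASE, feed_id, fmt)
    headers = {"X-PachubeApiKey": api_key}
    if value is None:
        return Request(url, headers=headers)  # read the feed
    # update the feed, e.g. a new wattage reading from a sensor
    return Request(url, data=str(value).encode(), headers=headers, method="PUT")

read = feed_request(1284)
update = feed_request(1284, value=312)
print(read.get_method(), read.full_url)
print(update.get_method(), update.data)
```

A real plugout would pass the built request to <code>urllib.request.urlopen</code> and parse the response.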
<p><strong>Tish Shute:</strong> Do you and Haque Design + Research expect to go beyond just providing a platform? Will you be producing more interesting applications like Natural Fuse on Pachube? If so, can you tell me more about what you have in mind?</p>
<p><strong>Usman Haque:</strong> <em>I keep a clear distinction between my work as creative director of Pachube.com and my work as director of Haque Design + Research. Basically, while Pachube.com continues development of the platform in general, I hope that Haque Design + Research will separately continue creating pioneering interactive experiences, some using Pachube and others not. We have some things in mind, such as the idea of creating an open source building management platform, but that&#8217;s all to come later&#8230;</em></p>
<p><strong>Tish Shute:</strong> One very interesting project you have been involved in is the creation of &#8220;Urban Versioning System 1.0&#8221;, which asks &#8220;What lessons can architecture learn from software development, and more specifically, from the Free, Libre, and Open Source Software (FLOSS) movement?&#8221; Can you tell me more about this project, its goals, and its progress? How does UVS 1.0 relate to Pachube?</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/urbanvs.jpg"><img class="alignnone size-full wp-image-2795" title="urbanvs" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/urbanvs.jpg" alt="urbanvs" width="277" height="386" /></a></p>
<p><strong>Usman Haque: </strong><em>The <a id="xujn" title="Urban Versioning System" href="http://uvs.propositions.org.uk/" target="_blank">Urban Versioning System</a> was essentially an attempt to understand what lessons the &#8220;open source&#8221; approach in software might provide to the collaborative development of environments and cities. It&#8217;s a sort of quasi-license &#8212; not yet quite ready to have the status of something like Creative Commons (which nicely suits media- and software-based creations, but doesn&#8217;t suit hardware and physical things quite so well beyond their design files). It&#8217;s more of a challenge, a series of constraints that might be applied. It has a link to Pachube, in the sense of encouraging conception at the environment and systemic level &#8212; you might call it the manifesto that connects Constant&#8217;s New Babylon hypothesis to the reality of Pachube!</em></p>
<p><strong>Tish Shute:</strong> I know that you imagine Pachube scaling up to millions (billions???) of users. But scaling the realtime web has proved a challenge (e.g. the frequent surfacing of the Twitter failwhale during big events). What are the key points of Pachube&#8217;s architecture and design that will enable successful scaling?</p>
<p>How do you see Pachube itself fitting into the FLOSS movement?</p>
<p><strong>Usman Haque: </strong><em>This is a really important question. There are a couple of things we are doing. The first is constantly to assume that we have 20 to 50 times more connections than we actually have&#8230; I put a lot of pressure on Sam about this, so he&#8217;s constantly developing, thinking about and testing little things for weeks in advance, while at the same time fighting the usual daily little fires that arise <img src="http://www.ugotrade.com/wordpress/wp-includes/images/smilies/icon_smile.gif" alt=":)" class="wp-smiley" /> The second is that we&#8217;re trying to learn from strategies being developed by <a id="fq2y" title="Vlad Trifa" href="http://vladtrifa.com/" target="_blank">Vlad Trifa</a> and his group at the <a id="zjfb" title="Institute for Pervasive Computing at ETH Zurich" href="http://www.pc.inf.ethz.ch/" target="_blank">Institute for Pervasive Computing at ETH Zurich</a> in Switzerland regarding the development of infrastructures for millions or more entities.</em></p>
<p><em>Regarding the connection to the FLOSS movement, there is no specific technical part of Pachube that is currently open source (apart from all the example apps and tutorials, of course). However, I find the approach taken by OpenSim and Hypergrid really fascinating: I haven&#8217;t given enough thought to how it might be implemented, but I find quite appealing the idea of a multitude of open source and geographically dispersed Pachube-enabled servers with seamless transfer of data connections between them as necessary&#8230;</em></p>
<p><strong>Tish Shute: </strong>I know you have an <a id="ttbg" title="Android Viewer for Pachube" href="http://en.androidwiki.com/wiki/Pachube_Viewer" target="_blank">Android Viewer for Pachube</a>. Android is a landmark for extended/augmented reality, as <a id="x-.a" title="Wikitude" href="http://www.mobilizy.com/wikitude.php" target="_blank"><span style="color: #0000ff;"><strong>Wikitude</strong></span></a> proved, because with its compass mode Android brings together the essential ingredients for extended/augmented reality &#8211; knowing who YOU are, WHERE you are, WHAT you are doing, WHAT is around you. It seems Pachube could be a powerful backend to a number of multi-user, mobile augmented/enhanced reality Android applications? Do you have any ideas/thoughts on this?</p>
<p><strong>Usman Haque:</strong> <em>That&#8217;s right &#8212; the Android viewer was created by rcreations.com, a Pachube user &#8212; this new platform brings amazing opportunities to mobile devices. I would be really interested to see what I would consider the obvious next step: an app that becomes both a Pachube input and an output feed, one that overlays existing Pachube data with new context-based, site-specific data.</em></p>
<p><em>If I were to make a parallel to a Japanese anime, I&#8217;m fascinated by <a id="ht3b" title="Dennou Coil" href="http://en.wikipedia.org/wiki/Dennou_Coil" target="_blank">Dennou Coil</a>, a Japanese anime set 20 years in the future where children take for granted the overlay of the digital world with the physical world. BUT, I&#8217;d say that Pachube somehow relates more closely to <a id="zg78" title="Furi Kuri" href="http://www.adultswim.com/shows/flcl/index.html" target="_blank">Furi Kuri</a> in its <a id="gko_" title="pataphysical" href="http://en.wikipedia.org/wiki/%E2%80%99Pataphysics" target="_blank">pataphysical</a> stance and because one of the main characters has a portal to another galaxy in his head&#8230;</em></p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/furikuri.jpg"><img class="alignnone size-full wp-image-2793" title="furikuri" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/furikuri.jpg" alt="furikuri" width="420" height="320" /></a></p>
<p><strong>Tish Shute:</strong> Do you see Haque Design + Research picking up on the challenge of creating some cool next-generation interfaces/GUIs for extended/enhanced/augmented (sorry, no perfect term) reality?</p>
<p><strong>Usman Haque:</strong> <em>Actually, no, I don&#8217;t see this as Haque Design + Research&#8217;s core focus going forward. We did some of this early on, getting involved in, for example, the development of a <a id="ty:5" title="3d smell interface" href="http://www.haque.co.uk/scentsofspace.php" target="_blank">3d smell interface</a>; and exploring the <a id="ykap" title="role of electromagnetic fields on perception of haunted spaces" href="http://www.haque.co.uk/haunt.php" target="_blank">role of electromagnetic fields on perception of haunted spaces</a>. But these days, in the context of HDR, I&#8217;m less interested in making seamless interfaces and more interested in exploring what authentic interaction actually is (whether technologically based or not). I think it&#8217;s challenge enough for me to make a light-switch engaging, dynamic and conversant before getting to the perceptual infrastructure that goes on top of it all! HDR will also spend more time exploring <a id="p2v5" title="passive systems, phase-change materials and plants" href="http://www.haque.co.uk/climateclock.php" target="_blank">passive systems, phase-change materials and plants</a> in the context of the built environment.</em></p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/scentsofspace.jpg"><img class="alignnone size-full wp-image-2796" title="scentsofspace" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/scentsofspace.jpg" alt="scentsofspace" width="550" height="197" /></a></p>
<p><strong>Tish Shute: </strong>I know there have been some interesting integrations with Pachube lately &#8211; <a href="http://www.ugotrade.com/2008/12/15/smart-planetinterview-with-andy-stanford-clark/" target="_blank">Andy Stanford-Clark mentioned using MQTT as the feed to get EML data into and out of Pachube</a> rather than over HTTP. He said that&#8217;s interesting because MQTT is a much more lightweight protocol, designed for small sensors and low-bandwidth / expensive (e.g. cellular) networks&#8230; and it&#8217;s also true push, i.e. data is pushed to you directly from the broker (the hub in the middle), rather than you having to ask for it constantly (polling).</p>
<p>Have you opted for MQTT over HTTP polling?</p>
<p><strong>Usman Haque:</strong> <em>We haven&#8217;t yet implemented an MQTT bridge, in part because it has proved pretty difficult. HTTP is quite important for us right now because there&#8217;s a whole universe out there using it: from your average web browser, to mobile devices, to ethernet devices, and a whole range of languages and platforms &#8212; they all work pretty much out of the box with HTTP. However, what we are exploring instead is being able to interface with <a id="a4w." title="Oliver Goh" href="http://www.eolusone.com/cms/website.php" target="_blank">Oliver Goh</a>&#8217;s Shaspa project &#8212; they&#8217;re already in the middle of solving the MQTT-Pachube bridge problem, and that should hopefully provide Pachube access to and from MQTT devices.</em></p>
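<p>The poll-versus-push distinction Andy Stanford-Clark describes can be shown with a toy model (no real networking here; an actual MQTT client would use a library such as paho-mqtt against a real broker):</p>

```python
import itertools

def poll(fetch, times):
    """HTTP-style polling: one request per check, whether or not
    there is new data."""
    return [fetch() for _ in range(times)]

class Broker:
    """MQTT-style push: subscribers hear only when data arrives."""
    def __init__(self):
        self.subscribers = []
    def subscribe(self, callback):
        self.subscribers.append(callback)
    def publish(self, message):
        for callback in self.subscribers:
            callback(message)

# A sensor that only has a fresh reading every third check:
readings = itertools.cycle([None, None, 312])
polled = poll(lambda: next(readings), 6)   # 6 requests, only 2 useful

received = []
broker = Broker()
broker.subscribe(received.append)
broker.publish(312)                        # 1 message, 1 useful
print(polled, received)
```

The asymmetry is the point: the poller pays for every empty check, which is what makes true push attractive on low-bandwidth or metered networks.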
<p><strong>Tish Shute:</strong> Chris Dalby just released <a id="qcm6" title="Pachube Air" href="http://www.yellowpark.net/cdalby/index.php/2009/01/10/pachube-air-the-first-release/" target="_blank">Pachube Air.</a> Have you had a chance to play with that yet?</p>
<p><strong>Usman Haque:</strong> <em>I have indeed! It&#8217;s still early days yet, and I know he did it partly just to test the AIR development process rather than solely to solve a desperate Pachube need, but I&#8217;m looking forward to future iterations!</em></p>
<p><strong>Tish Shute:</strong> Peter Quirk felt that the Pachube web page positions Pachube as a social networking site focused on data exchange, inviting anyone with an interest in sharing environmental or other data to publish data or construct interesting uses for the data.</p>
<p>What is your response to that?</p>
<p><strong>Usman Haque:</strong> <em>Hmm&#8230; I don&#8217;t really see Pachube as a social networking site. Yes, it perhaps enables the creation of social-networking objects and environments, but in itself, in terms of the networking of people, that has barely begun yet. Certainly Pachube exists quite comfortably in facilitating mashups and visualisations and other web 2.0 based social applications, but I don&#8217;t see that as a driving force. I think it would also be a mistake to conceive of Pachube solely as storage of machine communication that then gets experienced by people; rather, it can transition quite easily to being useful solely for machine-to-machine communication.</em></p>
<p><em>In fact, with recent API releases (which, as it happens, we haven&#8217;t announced as of this writing&#8230; <img src="http://www.ugotrade.com/wordpress/wp-includes/images/smilies/icon_smile.gif" alt=":)" class="wp-smiley" />), it&#8217;s now possible to use most of Pachube&#8217;s features without ever going to the website: i.e. your Arduino can create feeds, search feeds, edit feeds, delete feeds. Over time, as direct machine-to-machine communication becomes more prominent, it&#8217;s quite likely that the website itself will become less and less important, while the backend becomes the focus of everything.</em><br />
<strong><br />
Tish Shute:</strong> I am interested in some of the differences between <a href="http://sensorbase.org/" target="_blank">SensorBase.org&#8217;s project</a> and Pachube. Is SensorBase more of a data repository (environmental data in particular)?</p>
<p><strong>Usman Haque</strong>: <em>The difference I see between Pachube and SensorBase is that while (from what I know) SensorBase is mostly about &#8220;write&#8221; operations, with later &#8220;read&#8221; operations (i.e. it&#8217;s about being a data repository), Pachube is really &#8220;read-write&#8221; (i.e. it&#8217;s about being both a data repository _and_ a quasi-realtime proxy). Pachube will be able to handle potentially millions of connections, both incoming and outgoing, and we&#8217;ll soon start storing every data point ever recorded, so of course the data repository aspect will be crucial. However, the fact that it *also* facilitates one-to-many realtime broadcasts of that data (and facilitates conversion to a number of different formats: EEML, CSV and JSON now, more in the future) means that the two-way connectivity aspect of it is just as important.</em></p>
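<p>To make the &#8220;conversion to a number of different formats&#8221; point concrete, here is one datastream rendered as CSV and JSON. The shapes below are simplified stand-ins for illustration, not Pachube&#8217;s documented formats:</p>

```python
import json

def to_csv(datastreams):
    """Bare comma-separated values -- roughly the spirit of the CSV form."""
    return ",".join(str(d["value"]) for d in datastreams)

def to_json(feed_id, datastreams):
    """A hypothetical JSON shape loosely mirroring the EEML structure."""
    return json.dumps({"id": feed_id, "datastreams": datastreams})

streams = [{"id": 0, "tags": ["watts"], "value": 312}]
print(to_csv(streams))          # 312
print(to_json(1284, streams))
```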
<p><strong>Tish Shute</strong>: I know you mentioned something that sounded a lot like Pachube facilitating buildings&#8217; and products&#8217; ability to benchmark and optimize themselves against/with each other?</p>
<p><strong>Usman Haque:</strong> <em>Further down the line, I would like to see Pachube able to help two particular processes:</em></p>
<p><em>1) to make it straightforward for developers and manufacturers to web-enable their products and services; and 2) to help building and environment designers create their buildings (by providing access to realtime site data) and also to help in the post-occupancy evaluation process &#8212; where buildings will be able to talk with each other, share information on energy consumption, resource management or occupancy rates and even &#8220;learn&#8221; from each others&#8217; strategies. This type of approach has a parallel at the level of individuals (for example, networked electricity meter users who are able to compare and contrast their usage and strategies for conservation). I don&#8217;t want Pachube to become the application; rather I want to make it easier for other people and companies to create such applications. So in that sense, yes, perhaps Pachube can be considered an enabler of social networking applications&#8230;!</em></p>
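<p>The peer-benchmarking idea reduces to comparing each participant&#8217;s consumption against the group. A minimal sketch &#8211; building names and figures are invented:</p>

```python
def benchmark(usage_kwh):
    """Rank buildings by weekly energy use and report each one's
    deviation from the group mean (illustrative data only)."""
    mean = sum(usage_kwh.values()) / len(usage_kwh)
    return sorted(
        ((name, kwh, round(kwh - mean, 1)) for name, kwh in usage_kwh.items()),
        key=lambda row: row[1],
    )

report = benchmark({"Building A": 420.0, "Building B": 505.0, "Building C": 380.0})
for name, kwh, delta in report:
    print("%s: %.0f kWh (%+.1f vs mean)" % (name, kwh, delta))
```

A building (or meter user) well above the group mean is the one with something to &#8220;learn&#8221; from its peers&#8217; strategies.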
]]></content:encoded>
			<wfw:commentRss>http://www.ugotrade.com/2009/01/28/pachube-patching-the-planet-interview-with-usman-haque/feed/</wfw:commentRss>
		<slash:comments>64</slash:comments>
		</item>
		<item>
		<title>Is it &#8220;OMG Finally&#8221; for Augmented Reality?: Interview with Robert Rice</title>
		<link>http://www.ugotrade.com/2009/01/17/is-it-%e2%80%9comg-finally%e2%80%9d-for-augmented-reality-interview-with-robert-rice/</link>
		<comments>http://www.ugotrade.com/2009/01/17/is-it-%e2%80%9comg-finally%e2%80%9d-for-augmented-reality-interview-with-robert-rice/#comments</comments>
		<pubDate>Sun, 18 Jan 2009 01:03:32 +0000</pubDate>
		<dc:creator><![CDATA[Tish Shute]]></dc:creator>
				<category><![CDATA[3D internet]]></category>
		<category><![CDATA[Ambient Devices]]></category>
		<category><![CDATA[Ambient Displays]]></category>
		<category><![CDATA[architecture of participation]]></category>
		<category><![CDATA[Augmented Reality]]></category>
		<category><![CDATA[digital public space]]></category>
		<category><![CDATA[Ecological Intelligence]]></category>
		<category><![CDATA[Energy Saving]]></category>
		<category><![CDATA[home automation]]></category>
		<category><![CDATA[home energy monitoring]]></category>
		<category><![CDATA[home energy monitors]]></category>
		<category><![CDATA[Instrumenting the World]]></category>
		<category><![CDATA[Metaverse]]></category>
		<category><![CDATA[mirror worlds]]></category>
		<category><![CDATA[Mixed Reality]]></category>
		<category><![CDATA[Mobile Technology]]></category>
		<category><![CDATA[nanotechnology]]></category>
		<category><![CDATA[open metaverse]]></category>
		<category><![CDATA[OpenSim]]></category>
		<category><![CDATA[Participatory Culture]]></category>
		<category><![CDATA[Second Life]]></category>
		<category><![CDATA[smart appliances]]></category>
		<category><![CDATA[Smart Devices]]></category>
		<category><![CDATA[Smart Planet]]></category>
		<category><![CDATA[social gaming]]></category>
		<category><![CDATA[social media]]></category>
		<category><![CDATA[sustainable living]]></category>
		<category><![CDATA[sustainable mobility]]></category>
		<category><![CDATA[virtual communities]]></category>
		<category><![CDATA[virtual economy]]></category>
		<category><![CDATA[virtual goods]]></category>
		<category><![CDATA[Virtual Meters]]></category>
		<category><![CDATA[virtual world standards]]></category>
		<category><![CDATA[Web 2.0]]></category>
		<category><![CDATA[Web 3D]]></category>
		<category><![CDATA[Web Meets World]]></category>
		<category><![CDATA[Web3.D]]></category>
		<category><![CDATA[World 2.0]]></category>
		<category><![CDATA[Android]]></category>
		<category><![CDATA[AR Geisha Doll]]></category>
		<category><![CDATA[compass in the android]]></category>
		<category><![CDATA[Denno Coil]]></category>
		<category><![CDATA[EEML]]></category>
		<category><![CDATA[hybrid augmented/virtual reality]]></category>
		<category><![CDATA[immersive mobile augmented reality]]></category>
		<category><![CDATA[markerless augmented reality]]></category>
		<category><![CDATA[massively multiuser augmented reality]]></category>
		<category><![CDATA[minimally immersive augmented reality]]></category>
		<category><![CDATA[mobile augmented reality]]></category>
		<category><![CDATA[Neogence]]></category>
		<category><![CDATA[next generation transparent wearable displays]]></category>
		<category><![CDATA[NYC Tech Meetup]]></category>
		<category><![CDATA[Pachube]]></category>
		<category><![CDATA[Robert Rice]]></category>
		<category><![CDATA[socializing sensor data]]></category>
		<category><![CDATA[Unreal 3]]></category>
		<category><![CDATA[Web Alive]]></category>
		<category><![CDATA[Wikitude]]></category>

		<guid isPermaLink="false">http://www.ugotrade.com/?p=2620</guid>
		<description><![CDATA[Neogence is in stealth mode with an immersive mobile augmented reality platform &#8211; &#8220;tools, sdk, and infrastructure plus some applications.&#8221; They are probably six months away from YouTubing anything according to CEO, Robert Rice. But Robert rustled up this pic for me &#8211; a Google street view of Neogence R&#38;D labs: &#8220;the patio on the [&#8230;]]]></description>
				<content:encoded><![CDATA[<p><img class="alignnone size-full wp-image-2557" title="neogencesekrithqpost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/neogencesekrithqpost.jpg" alt="neogencesekrithqpost" width="450" height="412" /></p>
<p><a id="zd89" title="Neogence" href="http://www.neogence.com/sekrets.html" target="_blank">Neogence</a> is in stealth mode with an immersive mobile augmented reality platform &#8211; &#8220;tools, sdk, and infrastructure plus some applications.&#8221; They are probably six months away from YouTubing anything, according to CEO <a id="rzgp" title="Robert Rice" href="http://curiousraven.squarespace.com/about-me/" target="_blank">Robert Rice</a>. But Robert rustled up this pic for me &#8211; a Google street view of Neogence R&amp;D labs: &#8220;the patio on the lower left is where I do a lot of pacing and smoking my pipe, and the porch and office upstairs is where a lot of meetings have been held.&#8221;</p>
<p><a id="rzgp" title="Robert Rice" href="http://curiousraven.squarespace.com/about-me/" target="_blank">Robert Rice</a> (<a id="x_:i" title="@RobertRice" href="http://twitter.com/RobertRice" target="_blank">@RobertRice</a> ), CEO of <a id="zd89" title="Neogence" href="http://www.neogence.com/sekrets.html" target="_blank">Neogence</a>, recently tweeted:</p>
<p><em><strong>I&#8217;m changing my name to Robert Mobile Ubiquitous Geospatial Augmented Rice. I&#8217;m betting on radical changes in next 18 months.</strong></em></p>
<p>Although Robert&#8217;s new AR platform is still under wraps, I think you will get a good idea of what direction he is going in from this interview (full text at the end of this post). Robert is the author of &#8220;<a id="c:rr" title="MMO Evolution" href="http://books.google.com/books?id=dkZ-6C5utz8C&amp;dq=MMO+Evolution&amp;printsec=frontcover&amp;source=bn&amp;hl=en&amp;sa=X&amp;oi=book_result&amp;resnum=4&amp;ct=result" target="_blank">MMO Evolution</a>&#8221; and is a key developer and thought leader in persistent immersive environments, simulations, virtual worlds and massively multiplayer games, as well as large-scale communities and social networking.</p>
<h3>It is OMG finally, at least, for minimally immersive but truly useful AR.</h3>
<p>Since the launch of Android a new generation of useful augmented reality applications like <strong><a href="http://www.mobilizy.com/wikitude.php" target="_blank">Wikitude</a></strong> are emerging.</p>
<p>After the last <a href="http://www.meetup.com/ny-tech/calendar/9466657/" target="_blank">NYC Tech Meetup</a>, my friend <a title="Nat Mobile Meets Social DeFreitas" href="http://openideals.com/" target="_blank">Nathan Freitas</a> (<a title="@NatDefreitas" href="http://twitter.com/natdefreitas" target="_blank">@NatDefreitas</a>), or rather Nathan Mobile Meets Social Freitas, demoed for me a cool graffiti app he has developed on Android. You leave a marker for your graffiti so other people can find it and view/add their own &#8211; a nice primal experience, like pissing on the lamp post to let your pack know where you&#8217;ve been. The graffiti app also taps into a long history of NYC street culture around tagging and graffiti art. For more cool mobile projects Nathan is working on &#8211; <a href="http://blog.twittervotereport.com/" target="_blank">Vote Report</a> and data collection for mass events, a guide to pubs and nightlife in New York City, and more &#8211; see his blog, &#8220;<a href="http://openideals.com/" target="_blank">Nathan&#8217;s OpenIdeals</a>.&#8221; With camera, GPS, compass, and accelerometer, and APIs on Android for temperature and light meters (no hardware yet), Nathan says Android:</p>
<p><em><strong>&#8220;seems to be the platform most likely to socialize the idea that sensor data could be a piece of every application.&#8221;</strong></em></p>
<p>As Nathan is fond of saying:</p>
<p><strong><em>The compass is a killer app enabler!</em></strong></p>
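<p>The compass supplies the last ingredient of the overlay computation: given your position, a point of interest&#8217;s position, and the direction the phone faces, you can decide whether the POI is on screen. A sketch of that geometry (coordinates and field of view are arbitrary; this is the general technique, not any particular app&#8217;s code):</p>

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2,
    in degrees clockwise from true north."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(y, x)) % 360

def in_view(azimuth_deg, poi_bearing_deg, fov_deg=50):
    """True if the POI falls within the camera's horizontal field of view."""
    diff = (poi_bearing_deg - azimuth_deg + 180) % 360 - 180
    return abs(diff) <= fov_deg / 2

# Hypothetical: a phone near Shibuya facing north, POI to the north-east.
b = bearing_deg(35.6595, 139.7005, 35.6700, 139.7100)
print(round(b, 1), in_view(0, b, fov_deg=80))
```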
<p>Also see <a id="ixwx" title="OpenIntents" href="http://code.google.com/hosting/search?q=label:sensors" target="_blank">OpenIntents</a> for some interesting Android sensor projects.</p>
<p><img class="alignnone size-full wp-image-2558" title="wikitudepost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/wikitudepost.jpg" alt="wikitudepost" width="450" height="356" /></p>
<p><strong><a href="http://www.mobilizy.com/wikitude.php" target="_blank">Wikitude</a></strong> was one of <em><strong><a href="http://www.mobilizy.com/wikitude.php" target="_blank">Thomas Wrobel</a>&#8217;s</strong></em> two top AR milestones for 2008 (see <a id="vwuu" title="Gamesalfreso" href="http://gamesalfresco.com/" target="_blank">Gamesalfresco</a>):</p>
<p><em><strong><a href="http://www.mobilizy.com/wikitude.php" target="_blank">Wikitude</a> I think. Seems the first released, useful, AR software.</strong></em></p>
<p><a href="http://gamesalfresco.com/2008/07/20/want-your-own-augmented-reality-geisha/" target="_self">AR Geisha doll</a> is also a remarkable breakout for AR &#8211; but useful, nah.</p>
<p>I asked Robert if he also thought <a href="http://www.mobilizy.com/wikitude.php" target="_blank">Wikitude</a> and the <a href="http://gamesalfresco.com/2008/07/20/want-your-own-augmented-reality-geisha/" target="_self">AR Geisha doll</a> were significant breakthroughs:</p>
<p><em><strong>Yes, these are among the first attempts to get away from the novelty of simply rendering a 3D object based on a marker and making it interesting.</strong></em></p>
<p><em><strong>Remember, one of the biggest risks that AR has is being branded as &#8220;novelty&#8221;, which means &#8220;cool for five minutes but ultimately a waste of time.&#8221; I think we have a ways to go before something is truly useful, but as 2009 progresses we should start seeing some effort here. I&#8217;d guess 2010 before something really useful comes out&#8230; at least something practical.</strong></em></p>
<p><em><strong>Now, having said that, I should say that I expect entertainment and games to take the lead (as usual), although there are a few companies really trying to leverage AR and video/graphics compositing for marketing (brochures) and location based methods (kiosks, large screen projections, etc.)</strong></em></p>
<h3>So when is it &#8220;OMG finally!&#8221; for massively multiuser augmented reality?</h3>
<p><img class="alignnone size-full wp-image-2559" title="ar-guipost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/ar-guipost.jpg" alt="ar-guipost" width="450" height="360" /></p>
<p>The picture above is from <a id="kzm2" title="benjapo's portfolio" href="http://www.istockphoto.com/file_closeup/technology/computers/3919295-futuristic-computer-panel.php?id=3919295" target="_blank">benjapo&#8217;s portfolio</a> on istockphoto &#8211; also see the <a id="cqhi" title="istock video here" href="http://www.istockphoto.com/file_closeup/technology/computers/3919295-futuristic-computer-panel.php?id=3919295" target="_blank">istock video here</a>.</p>
<p><a id="ylpn" title="Alex Soojung-Kim Pang considers" href="http://www.endofcyberspace.com/2006/11/royal_college_o.html" target="_blank">Alex Soojung-Kim Pang</a> (who weighed in recently on the <a id="vr8o" title="twitter-baby" href="http://www.endofcyberspace.com/2008/12/twitter-baby.html" target="_blank">twitter-baby</a> debates &#8211; see my <a href="http://tishshute.com/twitter-baby-debates" target="_blank">KickBee Posterous</a> blog) challenges design assumptions for augmented reality that take as a given the userâ€™s desire for numerous private enhancements to their reality.</p>
<p>Alex points out that less will probably be more, so that enhancements do not impinge on shared experience. See his write-up of a talk he gave at the Royal College of Art, <a id="bxx1" title="&quot;and the end of my own private Shibuya.&quot;" href="http://www.endofcyberspace.com/2006/11/royal_college_o.html" target="_blank">&#8220;and the end of my own private Shibuya.&#8221;</a> Photo below by <em>St&#233;fan, &#8220;<a href="http://www.flickr.com/photos/st3f4n/130889444/in/pool-84787688@N00">Karaoke in Shibuya</a>&#8221;</em></p>
<p><em></em><em><strong>Part of the pleasure of these streetscapes is precisely that they're collectively experienced, rather than individual visions: for even a brief period, we share with other postmodern, globe-hopping flaneurs and expatriates and temporary natives the light of the ABC-Mart sign and storefront.</strong></em></p>
<p><em><strong><img class="alignnone size-full wp-image-2560" title="karaokepost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/karaokepost.jpg" alt="karaokepost" width="450" height="338" /><br />
</strong></em></p>
<p>It is collective experience of enhanced, augmented, virtual or real experiences that interests me too. This is one of the reasons I find <strong><em><a href="http://www.pachube.com/" target="_new">Pachube</a></em></strong> and the <a href="http://www.eeml.org/" target="_blank">EEML project </a>of Haque Design and Research so interesting.</p>
<p><strong><em>Extended Environments Markup Language (EEML), a protocol for sharing sensor data between remote responsive environments, both physical and virtual. It can be used to facilitate </em><em>direct connections between any two environments; it can also be used to facilitate many-to-many connections as implemented by the web service <a href="http://www.pachube.com/" target="_new">Pachube</a>, which enables people to tag and share real time sensor data from objects, devices and spaces around the world.</em></strong></p>
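<p>As a concrete illustration of the data-sharing model EEML enables, here is a minimal sketch (in Python, standard library only) of building an EEML-style document for a couple of sensor readings. The element layout follows my reading of the EEML spec, and the <code>build_feed</code> helper is hypothetical &#8211; treat this as a sketch, not a spec-conformant feed.</p>

```python
# Sketch: build a minimal EEML-style XML document for sensor readings.
# Element names (eeml/environment/data/tag/value) approximate the EEML
# spec; the build_feed helper is an illustrative assumption.
import xml.etree.ElementTree as ET

def build_feed(title, readings):
    """readings: list of (tag, value) pairs from one environment."""
    eeml = ET.Element("eeml")
    env = ET.SubElement(eeml, "environment")
    ET.SubElement(env, "title").text = title
    for i, (tag, value) in enumerate(readings):
        data = ET.SubElement(env, "data", id=str(i))
        ET.SubElement(data, "tag").text = tag       # what the sensor measures
        ET.SubElement(data, "value").text = str(value)
    return ET.tostring(eeml, encoding="unicode")

xml_doc = build_feed("studio", [("temperature", 23.5), ("light", 407)])
```

<p>A Pachube-style web service would then accept such a document over HTTP and republish it to any other environment &#8211; physical or virtual &#8211; that subscribes to the feed.</p>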
<h3>&#8220;Distinctions between virtual and real are as quaint and outmoded as distinctions between mind and body&#8221; (Usman Haque)</h3>
<p><img class="alignnone size-full wp-image-2603" title="chair1post1" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/chair1post1.jpg" alt="chair1post1" width="150" height="150" /><img class="alignnone size-full wp-image-2602" title="remotechair-slpost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/remotechair-slpost.jpg" alt="remotechair-slpost" width="150" height="150" /><img class="alignnone size-full wp-image-2604" title="chair2post" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/chair2post.jpg" alt="chair2post" width="150" height="150" /></p>
<p>Usman Haque (founder of <a href="http://www.haque.co.uk/pachube.php" target="_blank">Pachube</a> and <a href="http://www.haque.co.uk/" target="_blank">Haque Design and Research</a>) points out that this is an underlying premise of his work &#8211; and of augmented reality (full interview coming up soon!).</p>
<p>The pictures above show the Haque Design project, <a href="http://www.haque.co.uk/remote.php" target="_blank">Remote</a>:</p>
<p>&#8216;<em><strong>Remote&#8217; connects together two spaces, one in Boston, the other in Second Life, and treats them as a single contiguous environment, bound together by the internet so that things that occur in one space affect things that happen in the other and vice versa &#8211; remotely controlling each other.</strong></em></p>
<p>There was a discussion on Twitter recently about how terms like Second Life, Exit Reality, and Virtual Worlds are misleading and outmoded. As Robert pointed out, we need:</p>
<p><em><strong>one word please&#8230; that sums up virtual and/or augmented reality, interactive, immersive, virtual worlds, mmorpgs, simulations, etc&#8230; also, I really don't like the term &#8220;augmented reality&#8221; or &#8220;mixed reality&#8221;. Neither is all that great. And NO &#8220;matrix&#8221; or &#8220;metaverse&#8221; please</strong></em></p>
<p>Robert argues strongly that there is a stultification in virtual world technology &#8211; much of what we call virtual world technology was already, basically, where it is now in the mid-90s. And MMOGs have devolved into gameplay design &#8220;that emphasizes the single player experience and does nothing to take advantage of the potential of the massively connected internet.&#8221;</p>
<p>Robert suggested I take a cruise through a new virtual space &#8211; <a href="http://www.cooliris.com/">CoolIris</a> &#8211; to find some good pictures for this post (note the partnership between <a href="http://blog.cooliris.com/2009/01/14/cooliris-and-seesmic-streamline-video-blogging/" target="_blank">CoolIris and Seesmic to streamline video blogging</a>). I added the Cooliris plugin to Firefox, typed Augmented Reality into search, and soon I was cruising a highway of images and links. The Road Map image grabbed my attention (see below). It shows the continua that <a href="http://www.metaverseroadmap.org/" target="_blank">the Metaverse RoadMap</a> authors thought are likely to influence the ways in which the Metaverse unfolds. It is &#8220;a map of the spectrum of technologies and applications ranging from augmentation to simulation; and the spectrum ranging from intimate (identity-focused) to external (world-focused)&#8221;</p>
<p><img class="alignnone size-full wp-image-2561" title="metaverseroadmap" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/metaverseroadmap.jpg" alt="metaverseroadmap" width="452" height="427" /></p>
<p>Quite to my surprise, when I clicked out of <a href="http://www.cooliris.com/">CoolIris</a> to the source for the image, I found it had been drawn from a post I wrote in May 2007, <em><strong><a id="jv.r" title="Hybridized Digital/Physical Worlds: Where Pop and Corporate Cultures Mingle." href="../../2007/05/22/hybridized-digitalphysical-worlds-where-pop-and-corporate-cultures-mingle/" target="_blank">Hybridized Digital/Physical Worlds: Where Pop and Corporate Cultures Mingle.</a> </strong></em>My post talks about a number of hybridization experiments that were bringing together lifelogging, sensors everywhere, simulation, virtual worlds, and augmentation.</p>
<p>The striking difference from 2007 to now is that we have definitely moved on from mere experimentation. And the poles of the continua<em><strong> intimate/extimate, augmentation/simulation </strong></em>as expressed in the Metaverse Roadmap are now becoming entwined (note the picture above seems to be slightly different from the one used in the road map as <a id="vdcf" title="posted here" href="http://www.metaverseroadmap.org/overview/" target="_blank">published here</a> &#8211; perhaps I had an early version?).</p>
<h3>&#8220;Augmented Reality is not just about overlaying data&#8230;&#8221; (Robert Rice)</h3>
<p><img class="alignnone size-full wp-image-2562" title="totalimmersion" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/totalimmersion.jpg" alt="totalimmersion" width="450" height="332" /></p>
<p>The screenshot above is from <a id="c7vm" title="TotalImmersions video" href="http://www.t-immersion.com/en,video-gallery,36.html#">Total Immersion's video</a> demoing Augmented Reality with 3D cell phones.<em> Also see the <a id="tvca" title="video of their immersive games" href="http://www.t-immersion.com/en,video-gallery,36.html#" target="_blank">video of their immersive games</a>, and FutureScope kiosks <a id="eje0" title="here" href="http://www.t-immersion.com/en,video-gallery,36.html#" target="_blank">here</a> and <a id="h-:s" title="here" href="http://www.t-immersion.com/en,video-gallery,36.html#" target="_blank">here</a>.<br />
</em><br />
<a id="vwuu" title="Gamesalfreso" href="http://gamesalfresco.com/">Games Alfresco</a> noted that Will Wright delivered the best <a href="http://www.pocketgamer.co.uk/r/Various/Spore+Origins/news.asp?c=8725" target="_blank">augmented reality quote</a> of the year. When describing AR as the way of the future for games, Will Wright said:</p>
<p><em><strong>&#8220;Games could increase our awareness of our immediate environment, rather than distract us from it&#8221;.</strong></em></p>
<p>Robert points out in this interview that the term Augmented Reality itself has become associated with a very limited understanding of what &#8220;enhancing your specific reality&#8221; is really about. Robert notes:</p>
<p><em><strong>it is inherently about who YOU are, WHERE you are, WHAT you are doing, WHAT is around you, etc.</strong></em></p>
<p><em><strong>When I talk about AR, I try to expand the definition a little bit. Usually, when you talk to someone about augmented reality, the first thing that comes to mind is overlaying 3D graphics on a video stream. I think though, that it should more properly be any media that is specific to your location and the context of what you are doing (or want to do)&#8230; augmenting or enhancing your specific reality.</strong></em></p>
<p><strong><em>In this sense, anything that at least knows who you are (your ID, mobile phone #, etc.), where you are (GPS coord or a specific place like a cafe), and gives you relevant data, information, or media = augmented reality. Sure, you can make things more interactive or immersive, but that is the minimum.</em></strong></p>
<p><strong><em>So, in this case, yes, I think there will be networked applications in the next 18 months&#8230; mostly things that are enhanced by friends lists (you are here, your friend is over there). These will be *application specific*. My team at Neogence is already going beyond this, building a platform and infrastructure for other applications to be developed on&#8230; all networked through the same backbone. Now, in this context (the science fiction AR that we all dream about), no I do not see anyone else trying to leap a generation or two ahead of the industry to build a massively multiuser shared AR space. Expect to see things like multi-user AR games, virtual pets, kiosk marketing, magic book, &#8220;gee whiz&#8221; presentations (tradeshow booths, entertainment parks, etc.), and so forth.</em></strong></p>
<h3>Goggles Are Not The Secret Sauce&#8230;</h3>
<p><strong><em><img class="alignnone size-full wp-image-2563" title="ar-catpost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/ar-catpost.jpg" alt="ar-catpost" width="137" height="150" /><img class="alignnone size-full wp-image-2564" title="goggles-avatarpost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/goggles-avatarpost.jpg" alt="goggles-avatarpost" width="150" height="150" /><br />
</em></strong></p>
<p>AR Cat left and Robert Rice right</p>
<p>In the popular imagination, Augmented Reality has come to mean 3D graphics projected over markers &#8211; an idea forever waiting for the advent of &#8220;wicked next generation transparent wearable displays&#8221;, nirvana for augmented reality. While such displays may be nirvana for AR (and they could be with us in less than twenty-four months), goggles are not the &#8220;secret sauce&#8221; of AR, as Robert points out.<strong><em><br />
</em></strong></p>
<p><em><strong>All the glasses are, is another display device. At the end of the day, it doesn't matter if you are looking at an LCD monitor, an iPhone, a head mounted display, or a pair of wicked next generation transparent wearable displays that magically draw directly on your retinas.</strong></em><br />
<em><strong><br />
The real tricky stuff is what happens on the backend&#8230; making it all persistent, massively multiuser, intelligent, interoperable, realistic, etc. etc.</strong></em></p>
<p><em><strong><img class="alignnone size-full wp-image-2585" title="vuzix" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/vuzix.jpg" alt="vuzix" width="450" height="318" /><br />
</strong></em></p>
<p>There has been quite <a href="http://www.realwire.com/release_detail.asp?ReleaseID=10934" target="_blank">a buzz going around</a> about the new <a href="http://www.vuzix.com/iwear/products_wrap920av.html" target="_blank">Vuzix Eyewear</a>, and recently Robert talked with Vuzix and checked out the Wrap 920AV eyewear:</p>
<p><em><strong>Vuzix is not alone in pursuing the ultimate in hardware, at least as far as wearable displays. However, I think they are much farther than the rest of the pack in vision, roadmap, and execution. They have put together a team that has a sense of urgency and ambition that will blow the industry away. After talking to them, I got the feeling that they really know what they are doing and there is a lot of mind blowing stuff in their pipeline. I'm sure they are one of the few companies that really gets it and has a clear vision of the future. Definitely my first choice to work with.</strong></em></p>
<h3>Hybrid Augmented/Virtual Reality</h3>
<p><img class="alignnone size-full wp-image-2566" title="qa_2post" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/qa_2post.jpg" alt="qa_2post" width="450" height="347" /></p>
<p><a id="va0_" title="Cory Ondrejka posted" href="http://ondrejka.blogspot.com/2009/01/anybots-telepresence-robot.html" target="_blank">Cory Ondrejka posted</a> this picture of the Anybots telepresence robot and &#8220;congrats to <a href="http://www.tlb.org/">Trevor Blackwell</a> and the rest of the <a href="http://anybots.com/">Anybots</a> team on the launch of <a href="http://anybots.com/abouttherobots.html">QA at CES</a>.&#8221; Cory (one of the founders and former CTO of Second Life) also made some predictions for Virtual Worlds, some optimistic and some less so, including &#8220;the increasing need to be able to diversify the Second Life product offering to begin truly rebuilding the code base.&#8221;</p>
<p>Robert is unabashedly irritated with the state of play in Virtual Worlds and MMOGs:<br />
<em><strong><br />
</strong><strong>Unless both industries (Virtual Worlds and MMOGs) have some serious upheaval or radical new approaches, they will quickly be eclipsed by AR, which will eventually evolve into something hybrid&#8230; AR/VR depending on your level of access and hardware.</strong></em></p>
<p><em><strong>I'd like to see someone grab an engine like Offset, Crytek, HERO, or Unreal 3, and smack on a fat MMO server infrastructure (Eve or Bigworld)&#8230; toss in the right tools, and you would see a revolution and renaissance occur at the same time in the virtual world space. All the puzzle pieces are there, just no one is putting them together the right way.</strong></em></p>
<p>I did just find out that Nortel's <a id="qkxv" title="WebAlive is powered by the Unreal 3 engine" href="http://www2.nortel.com/go/news_detail.jsp?cat_id=-8055&amp;oid=100251105&amp;locale=en-US" target="_blank">WebAlive is powered by the Unreal 3 engine</a>. You <a id="xqbw" title="can try WebAlive" href="http://www.lenovo.com/elounge" target="_blank">can try WebAlive</a> out here.</p>
<p>Robert<strong><em> </em></strong>points out how rare it has become to see people really push virtual worlds technology and MMOGs in entirely new directions. Although, of course, there are exceptions. I managed to engage some interest from Robert in the possibilities the <a href="http://opensimulator.org/wiki/Main_Page" target="_blank">open source modular architecture of OpenSim</a> opens up, and <a id="vx_i" title="the augmented reality experiments from Georgia Tech with Second Life" href="http://arsecondlife.gvu.gatech.edu/" target="_blank">the augmented reality experiments from Georgia Tech with Second Life</a> (screenshot below) got praise from Robert for trying to do something new. (Georgia Tech have also put out a <a id="kfzj" title="virtual pet app for the iphone" href="http://uk.youtube.com/watch?v=_0bitKDKdg0" target="_blank">virtual pet app for the iPhone</a>).</p>
<p><img class="alignnone size-full wp-image-2567" title="picture-4" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/picture-4.png" alt="picture-4" width="321" height="245" /></p>
<p>But while Robert clearly has zero patience for virtual world technology, which he sees as stuck in the mid nineties, he notes:</p>
<p><em><strong>the innovative and wonderful stuff about SL isn't SL, it is what people are doing and creating on their own with terrible tools *IN* SL</strong></em> [Second Life].</p>
<p>The immersive mobile augmented reality platform Robert is building, he hopes, will generate this kind of user creativity but with 21st century tools.</p>
<h3>So is it &#8220;OMG&#8221; finally for the Augmented Reality we have dreamed about?</h3>
<p>According to Robert:</p>
<p><em><strong>It really boils down to a markerless solution and a good application.</strong></em></p>
<p>In the interview below we cover a number of topics including business models for Augmented Reality, e.g., how business models based on micro-transactions and virtual goods will translate to Augmented Reality.</p>
<p>Many of the challenges virtual worlds face in becoming mainstream are similar to the challenges AR must overcome. Robert discusses these, including the interface/GUI that is a critical element for AR, solving the riddle of one world or many, patent wars in Virtual Worlds and Augmented Reality, the role of Augmented Reality in the future of sustainable computing, and what interoperability is about.</p>
<h3>The Back Story for AR/VRâ€¦</h3>
<p>In case you want to get up to speed on the background reading for Augmented Reality, this is Robert's required reading list, and Denno Coil is an absolute <strong>must</strong> see (feel free to add to this list in the comments, please).</p>
<p>&#8220;If you want to see the things that have inspired our vision of what we want to build, check out:</p>
<p>* Dream Park by Larry Niven and Steven Barnes<br />
* Rainbows End by Vernor Vinge<br />
* Spook Country by William Gibson<br />
* Halting State by Charles Stross<br />
* The Diamond Age by Neal Stephenson<br />
* Donnerjack by Roger Zelazny and Jane Lindskold<br />
* Otherland by Tad Williams<br />
* Neuromancer by William Gibson<br />
* Idoru by William Gibson<br />
* Cryptonomicon by Neal Stephenson</p>
<p>and watch the whole anime of Denno Coil (subbed NOT dubbed!)&#8221;</p>
<p><img class="alignnone size-full wp-image-2568" title="dennoucoil" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/01/dennoucoil.jpg" alt="dennoucoil" width="450" height="256" /></p>
<p>Screenshot from Denno Coil from<a id="yic5" title="Concrete Badger" href="http://www.concretebadger.net/blog/2007/12/17/dennou-coil-full-series-2007-in-12-day-4/" target="_blank"> Concrete Badger</a>.</p>
<h3>Interview With Robert Rice</h3>
<p><strong>Tish Shute:</strong> I am glad to hear that you are working on this [an immersive mobile augmented reality platform]!</p>
<p><strong>Robert Rice:</strong> We switched gears from MMO stuff about a year ago and we are finally getting some traction. It is very hard doing anything in this economy right now, but we found an opportunity to take AR to a new level beyond what you see on youtube. AR is still too &#8220;cute&#8221; and novelty. We don't want to play around.</p>
<p><strong>Tish Shute:</strong> I like Wikitude 'cos it even manages to do something useful!</p>
<p><strong>Robert Rice:</strong> Yeah, useful = traction. Now that we are getting near a prototype we are starting to get a lot of interest even though we are still technically way under the radar.</p>
<p><strong>Tish Shute:</strong> r u funded?</p>
<p><strong>Robert Rice:</strong> Privately funded, some revenues from an early license, and ongoing discussions with several institutional investors. So, we have some funding, but nothing spectacular just yet.</p>
<p><strong>Tish Shute:</strong> are you just developing an AR platform?</p>
<p><strong>Robert Rice:</strong> hrm, sort of, but not just that. By platform I mean tools, SDK, and infrastructure plus some applications. The idea is to build something that facilitates everyone else making cool things and useful applications for different industries/sectors.</p>
<p><strong>Tish Shute:</strong> Yes that is the cool thing to do but isn't that hard to fund!</p>
<p>(Robert grins) Well, that depends on the business model. We've got that figured out. I'd be absolutely happy if everyone and their brother were making applications on our stuff; that gives us an edge on market penetration/saturation. There are plenty of examples that prove the model. If you give people free and easy-to-use tools, they will run with it. ARToolKit, for example, has tons of people making nifty things and posting videos on YouTube, which has pushed it to the forefront as THE AR middleware to use right now. Or heck, look at YouTube's free service, and they dominate video sharing. Sure there will be a lot of &#8220;noise&#8221;, but there will also be a lot of &#8220;signal&#8221; that will rise to the top; facilitating and enabling is creating value in its own right.</p>
<p><strong>Tish Shute:</strong> But how do you expect to monetize?</p>
<p><strong>Robert Rice:</strong> There are a good half a dozen ways to monetize AR or an AR platform.</p>
<p><strong>Tish Shute:</strong> What are your top 3?</p>
<p><strong>Robert Rice:</strong> hrm, microtransactions, localized mobile advertising, and enterprise solutions (visualization).</p>
<p><strong>Tish Shute:</strong> Do you think the consumer market will give the lead?</p>
<p><strong>Robert Rice:</strong> I'm not sure. We are getting people from academia, intelligence, defense, border security, and some corporate types knocking on our door already, and pretty aggressively. It may be that those sectors push AR before consumer entertainment really kicks off.</p>
<p>But going back to a discussion we had earlier &#8211; yes, working with &#8220;no markers&#8221; is a big deal.</p>
<p><strong>Tish Shute:</strong> Can you talk about what you are doing there or is it still under wraps?</p>
<p><strong>Robert Rice:</strong> I can say that between some university tech transfer and some of our own proprietary stuff, we are using some fairly common visual tracking technology. If you are really plugged into the AR scene, you will know there are probably half a dozen visual tracking methods out there. We just looked for the best one, licensed it for commercial use, and then started working our magic. This is a very small piece of the overall effort, but worth noting.</p>
<p>The downside of working with university tech is that it is usually based on research, incomplete, and not wrapped up in a nice commercial package; on the upside, it can be a good start to build on.</p>
<p><strong>Tish Shute:</strong> As you know I am very interested in â€œtechnology that mattersâ€ in particular tech that can help us accomplish the urgent goal of sustainable living.</p>
<p><strong>Robert Rice:</strong> Oh, I'm pretty keen on sustainable living as well&#8230; after I sell off a few companies and have money of my own, I'm going to get into arcologies.<br />
&#8230;<br />
Robert grins</p>
<p>The interesting thing with the visual stuff combined with our other tech, is that we can make things multiuser, persistent, dynamic, and mobile.<br />
The markers (fiducials) are really really limiting outside of basic applications. You can't really plaster everyone and everything with a marker. And they are, by nature, static (even if they are animated or whatever).</p>
<p>Also&#8230; our stuff works indoors and outdoors even without a GPS connection.<br />
&#8230;<br />
Robert grins</p>
<p><strong>Tish Shute:</strong> Now that does sound interesting!</p>
<p><strong>Robert Rice:</strong> Yeah, with visual, you don't need a compass or accelerometers either. Less hardware : )</p>
<p>You start with wifi triangulation or a GPS coord to get a &#8220;brute&#8221; location, and then you use the visual stuff for down-to-the-meter accuracy, and that, by nature, gives you your orientation and positioning.</p>
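<p>The coarse-to-fine positioning Robert describes can be sketched in a few lines. Everything here is hypothetical &#8211; the function names, the coordinates, and the idea of expressing the visual correction as a small lat/lon offset are illustrative assumptions, not Neogence's implementation:</p>

```python
# Sketch of coarse-to-fine localization: a rough fix from Wi-Fi or GPS,
# then a meter-scale correction estimated by visual tracking.
# All names and numbers are illustrative.

def coarse_fix(gps=None, wifi=None):
    """Pick whichever rough position source is available (accurate to tens of meters)."""
    return gps if gps is not None else wifi

def refine(coarse, visual_offset):
    """Apply a small (lat, lon) correction estimated by visual tracking."""
    lat, lon = coarse
    dlat, dlon = visual_offset
    return (lat + dlat, lon + dlon)

# Indoors with no GPS: fall back to the Wi-Fi estimate, then refine visually.
pose = refine(coarse_fix(wifi=(35.7796, -78.6382)), (0.00001, -0.00002))
```

<p>The point of the split is that the expensive visual step only has to resolve ambiguity inside the small area the coarse fix already narrowed down, and the same visual match that pins the position also yields orientation.</p>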
<p><strong>Tish Shute: </strong>Wow this is beginning to sound very interesting!</p>
<p><strong>Robert Rice:</strong> Once you have that, it doesn't matter where you go; it continues to track and continually refines areas you have been before. We've spent the last year figuring all this out. There are so many problems and obstacles ahead for anyone trying to do what we are, but we have already discovered solutions.</p>
<p>Oh, and visual tracking = gesture-based interfaces too. That's going to take some work, but it's doable. The real pain in the ass there isn't the actual tracking, it is in the interface design.</p>
<p>That's something that almost every AR company, venture, and research program is missing out on entirely. They are so focused on making cute things with markers. They are missing the larger problems of AR spam, interface, iconography, GUI, metaphor, interoperability, privacy, identity.</p>
<p><strong>Tish Shute:</strong> So how are you dealing with all that!!</p>
<p><strong>Robert Rice:</strong> We took the backwards approach of trying to think where we want things to be in ten years (and we read all the cool books&#8230; Vinge, Stephenson, Gibson, etc.) and then we spent time trying to think of what the potential problems are&#8230; like AR spam. It's bad enough when a giant penis flies by in Second Life; we don't want that to happen in a global wireless AR platform.</p>
<p><strong>Tish Shute: </strong>Do you have a prototype yet?</p>
<p><strong>Robert Rice:</strong> hrm, 6 months away from youtubing something. Problem has been slow funding, which equals slow development. We also don't want to show our cards too soon&#8230; too many potential competitors out there.</p>
<p>â¦<br />
Robert grins</p>
<p><strong>Tish Shute:</strong> when you say microtransactions what is the business potential there?</p>
<p><strong>Robert Rice: </strong>hrm, last year I think $1.5B was spent on virtual items. That's games and virtual worlds. That should hit $5B in a couple of years. That's basically people buying and selling things like WoW gold or items in SL or whatever. Microtransactions is basically the same thing, but in AR space.</p>
<p>Why couldn't a 3D artist make a wicked animated 3D dragon, and then sell it to someone else? With AR, you could sit it on your shoulder. With a good scripting engine, you could train it to do stuff. That's what I want to enable.</p>
<p>tools + sdk + platform = enabling people to make and create. Add in a commerce level (microtransactions) and voilà.</p>
<p><strong>Tish Shute:</strong> At the moment all of these virtual goods are very platform specific, is that a problem for you?</p>
<p><strong>Robert Rice:</strong> Not at all. This is at a higher level. You have to switch mental models when you talk about what AR could or should be. For example, let's contrast the web and virtual worlds. For every virtual world you go to, you have to download a whole new client. Imagine if that model was applied to the web&#8230; you would need a brand new browser for every website you went to. That is just so&#8230; wrong.</p>
<p>It's the same thing for AR&#8230; people are thinking about it with the same mental and business models and development philosophies as virtual worlds or the web. There are some things and aspects that work fine, but not everything.</p>
<p>Virtual worlds are, by nature, necessarily different and walled gardens. The idea of 100% open and interoperable virtual worlds is a red herring&#8230; it sounds good, but in practice it is a really dumb idea.</p>
<p><strong>Tish Shute: </strong>I was wondering if you had a way to leverage all the 3D content already created 'cos that would jump start things in AR wouldn't it?</p>
<p><strong>Robert Rice:</strong> Oh yeah, that's easy. They all use the same polygons. Any virtual item in any game or virtual world was likely created with 3D Studio or Maya or something similar, and would be easy to convert and use.</p>
<p><strong>Tish Shute:</strong> So people could bring their WoW weapons into your system?</p>
<p><strong>Robert Rice: </strong>Not legally, but sure. It's just a 3D model with a texture. It doesn't matter if you use Corel Draw or Photoshop or Paint Shop Pro&#8230; or one screwdriver or another. Part of my team's advantage is that we are all experienced in MMORPG and virtual world design and development. We know the tools, the tech, and what works and what doesn't.</p>
<p><strong>Tish Shute:</strong> But some of the 3D content created in the social worlds is what has most value to people.</p>
<p><strong>Robert Rice: </strong>Right, and that can be exported out easily.</p>
<p><strong>Tish Shute: </strong>But back to &#8220;real&#8221; life applications. Is your platform really markerless?</p>
<p><strong>Robert Rice:</strong> Yes. Marker = printed icon or glyph, also known as a fiducial.</p>
<p><strong>Tish Shute:</strong> But u must have some marker?</p>
<p><strong>Robert Rice:</strong> hrm, more accurately, you need a point of reference.</p>
<p>Visual tracking has been around for more than a decade. Lots of work for robots and other sectors.</p>
<p><strong>Tish Shute:</strong> But isn't the specificity of reference in terms of RL applications a vital key, for example, for a database of things?</p>
<p>(Robert grins) That is a different problem&#8230; tracking, registration, mapping, positioning, etc. That question has to do with mapping, which is related to visual tracking, but not the same thing. We have a rather unique approach to some of this that I can't discuss (patent pending).</p>
<p><strong>Tish Shute: </strong>But for example, to create an augmented natural history of food &#8211; say I want to point at the slab of meat on my plate and know where that cow came from, what feedlot, how it was treated, etc.</p>
<p><strong>Robert Rice: </strong>That is not possible without ubiquitous nanotechnology. Shall I explain?</p>
<p><strong>Tish Shute:</strong> Yes please!</p>
<p><strong>Robert Rice:</strong> Ok, let's step back a minute and turn that burger back into a cow&#8230; The first problem (of this particular situation) is differentiating one cow from another. Since most cows look alike, you can either attempt to discriminate visually (cow patterns) or use a much simpler option, like giving each cow an RFID chip in its bell or hoof.</p>
<p>Now, most people would try to figure out how to jam all sorts of info into the RFID chip, which sounds like a good idea, but isn't. The trick would be to simply use the RFID to store a unique identifier which is then linked to a database elsewhere.</p>
<p>That database should continually be updated with whatever relevant information you need, so as you get close with your AR laptop, wearable displays, or embedded brain chip, you get the identifier broadcast, then you get the info downloaded to you, and it &#8220;sticks&#8221; to the cow with the generic visual tracking (object following; even a simple bounding box is sufficient for a slow moving cow).</p>
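<p>The identifier-plus-database pattern Robert describes is easy to sketch: the tag carries only a key, while the record lives server-side and can be updated continuously. All names and data below are made up for illustration:</p>

```python
# Sketch of the RFID scheme: the chip stores only a unique identifier,
# and the AR client resolves it against a continually updated database.
# HERD_DB, the tag id, and the record fields are all illustrative.

HERD_DB = {
    "cow-0042": {"breed": "Holstein", "farm": "Utah", "feed": "grass"},
}

def on_tag_read(tag_id, db=HERD_DB):
    """Resolve an RFID identifier to the latest record for that animal."""
    record = db.get(tag_id)
    if record is None:
        return {"error": "unknown tag " + tag_id}
    return record

info = on_tag_read("cow-0042")
```

<p>The design choice is that the tag never needs rewriting: updating the cow's record in the database is enough for every AR client that reads the tag to see fresh data.</p>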
<p>So, up to that point, you can get tons of information about that specific cow, or that cow population (remember, AR is not just about overlaying data&#8230; it is inherently about who YOU are, WHERE you are, WHAT you are doing, WHAT is around you, etc.). Tie in data visualisation and some farmer tools, and all sorts of other things happen. Now, let's move the timeline ahead a bit.</p>
<p>The butcher gets the cow and does his handiwork&#8230; because we know all the info about the cow, all of the meat can be properly labeled and marked, ideally with a UPC code or a unique glyph (somewhat problematic depending on how many unique glyphs you can create). So, while you are in the grocery store, you can access the relevant shopping data&#8230; age of cow, state of origin, type of feed, how many spots, how much body fat, which butcher, whatever &#8211; not because of what is inside the package, but because of the package itself.</p>
<p>Getting back to your hamburger, the problem is that it is a burgerâ€¦there is nothing to distinguish that burger from another one at the tableâ€¦unless you stuck a rfid chip in it or splattered it with ink and a unique glyph, or maybe a special one of a kind plate.</p>
<p>However, a properly designed AR system could say â€œhey! that/s a hamburger! and I know I am at Fat Daddyâ€™s Burger Joint in Raleigh North Carolina on Glenwood Avenue, and I know that they cook their burgers this particular way, and their meat supplier is those guys over there, and they usually get their cow meat from a farm out in Utahâ€</p>
<p>With ubiquitous nanomites or whatever, then its not that far out to consider edible nanos that are in the meat and that broad cast info so a slab of meat can tell you about itself and broadcast that to the general public.</p>
<p><strong>Tish Shute:</strong> What useful scenarios can we create without the nanomites?</p>
<p><strong>Robert Rice:</strong> If it wasn't a burger or a consumable organic, the scenario changes.</p>
<p><strong>Tish Shute: </strong>What is the time scale on nanomites?</p>
<p><strong>Robert Rice:</strong> Ehhhhhhh, 20 years minimum if we are lucky. They sound good on paper, but there is a whole book's worth of problems and reasons why they are so far off… as consumer grade, all-over-the-place type of stuff.</p>
<p><strong>Tish Shute:</strong> Did you see the Nokia Home Control center?</p>
<p><strong>Robert Rice:</strong> Yes, I saw the Nokia stuff.</p>
<p>AR for sensors, like security systems, temperature control, etc.: they all become "sources of data" that an AR system can visualize. So yes, that's easily doable. You could do that in a short period of time with some half-decent engineers.</p>
<p>The trick of what Nokia is doing is aggregating sensor data from a building/home/facility, mashing it together, and sending the mobile device alerts and data visualization. Conceptually rather simple, but no one has done it right or well yet.</p>
<p>It wouldnâ€™t surprise me if Nokia pulled it off.</p>
<p><strong>Tish Shute:</strong> Yes, and if they do, and someone does an AR interface to it, that would be an inflection point for AR?</p>
<p><strong>Robert</strong><strong> Rice</strong><strong>:</strong> In a roundabout way, yes. You could get data directly from your house, or get it through your mobile device and in either case, use the AR for visualization and control.</p>
<p>The interface/GUI is a critical element for AR. That is one of the areas where it, as an industry, risks doing a bad job and turning into just a fad or another novelty like VR. Virtual worlds have been struggling with that for a while, but MMORPGs have had the effect of extending their life cycle.</p>
<p><strong>Tish Shute: </strong>Yes, VWs have not solved the interface problem.</p>
<p><strong>Robert Rice: </strong>The interface is one of their problems, yes. Most virtual worlds are stuck in 1996/98.</p>
<p><strong>Tish Shute:</strong> If AR is inherently about who YOU are, WHERE you are, WHAT you are doing, WHAT is around you, etc., it seems that it is the ideal interface for home control?</p>
<p><strong>Robert</strong><strong> Rice</strong><strong>:</strong> Well for home control, you must know:</p>
<p>1) Who am I? Am I authorized to know this information? Am I a guest?</p>
<p>2) Where am I? Is this my house? Or someone else's?</p>
<p>3) What am I doing? Do I want to make all the doors lock? Turn on or off lights? Open the garage door? Trigger the security alarm?</p>
<p>So the same questions apply.</p>
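<p>The three checks above (who / where / what) can be sketched as a simple permission gate on a home-control command; the policy table and all names here are hypothetical:</p>

```python
# (user, location) -> set of actions that identity may perform there.
AUTHORIZED = {
    ("owner", "my_house"): {"lock_doors", "lights", "garage_door", "alarm"},
    ("guest", "my_house"): {"lights"},  # a guest gets limited control
}

def may_perform(user: str, location: str, action: str) -> bool:
    """1) Who am I?  2) Where am I?  3) What am I trying to do?"""
    return action in AUTHORIZED.get((user, location), set())

may_perform("owner", "my_house", "garage_door")  # owner at home: allowed
may_perform("guest", "my_house", "garage_door")  # guest: denied
```

<p>The point is that an AR home-control interface has to answer all three questions before acting, not just render the buttons.</p>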
<p>I'd say that all virtual worlds are stuck in the mid 90s. They are at least a decade behind the game worlds… in technology, design, implementation, architecture, etc. In my opinion, things like Second Life are shameful in how they are presented as state of the art, innovative, ground-breaking, new, wonderful, and world changing.</p>
<p>But that's another topic of conversation : )</p>
<p><strong>Tish Shute: </strong>Well, for me the contribution of VWs is the presence-enabled real-time interaction with an application (as a 3D info machine) and context with other people.</p>
<p><strong>Robert Rice: </strong>Oh, there is no doubt that they are greatly useful and have a phenomenal amount of potential.</p>
<p>They *could* be all those things I just said that SL isn't… the problem is that they are either just existing, or they are meandering around without any real focus or direction. They aren't evolving.</p>
<p>Even MMORPGs are losing their way and beginning to stagnate terribly.</p>
<p><strong>Tish Shute:</strong> Yes, I agree.</p>
<p><strong>Robert</strong><strong> Rice</strong><strong>: </strong>But, AR has the potential to change a lot of things.</p>
<p>I'm sure you have seen <a id="n_22" title="the yellowbook commercials" href="http://www.youtube.com/watch?v=zdPFBTQpk-U" target="_blank">the yellowbook commercials</a>? The technologies you are seeing here are doable in, hrm, a year or less maybe. The tricky part is the interactivity and AI… that is, the content. Everything else isn't a problem. The avatar there could be photorealistic or stylized like a WoW character.</p>
<p>You could do that to some degree with markers for registration, but dynamically changing the content linked to those markers is a little weird.</p>
<p>(By the way, for the record, I like markers just fine, I just don't think they are useful for real-world mobile applications.)</p>
<p>I also think that the guys that want to dust the planet with miniature RFID chips are on crack and are going about it the wrong way.</p>
<p><strong>Tish Shute: </strong>A high level of interactivity is hard though, isn't it? Even in VWs it is very limited.</p>
<p><strong>Robert Rice:</strong> It depends if you can track what the user is doing, and interpret that properly. "Interactive" is also a very loose term.</p>
<p>Clicking a button and making a light blink could be considered interactive.</p>
<p><strong>Tish Shute: </strong>In VWs a high level of interactivity would be to wield a virtual hammer and have a real nail go in! Is physics part of the problem?</p>
<p><strong>Robert Rice:</strong> Physics isn't difficult; there is plenty of middleware out there for it. The problem with that isn't so much the physics as much as it is the scale and purpose.</p>
<p><strong>Tish Shute:</strong> Well, for robotics?</p>
<p><strong>Robert Rice:</strong> That gets into a conversation about meshes, textures, volumetric collision detection, and stuff.</p>
<p><strong>Tish Shute:</strong> virtual robotics?</p>
<p><strong>Robert</strong><strong> Rice</strong><strong>:</strong> You mean teleremote/telepresence of real robots?</p>
<p><strong>Tish Shute: </strong>yes!</p>
<p><strong>Robert Rice:</strong> Ah, for that, you need some tactile feedback and some other stuff &#8211; doable, but insanely difficult. That's why you don't see a whole lot of remote-controlled surgery robots all over the place.</p>
<p>They do exist…</p>
<p><strong>Tish Shute:</strong> Will AR contribute to sustainable living by freeing us from some of our energy-hogging devices?</p>
<p><strong>Robert Rice:</strong> AR will ultimately encourage energy saving and recycling. Where did I leave a light on? Where is the nearest trash can? What is the UV index outside today?</p>
<p>Yes, computers are energy hogs, but as we start seeing larger SSD drives, more efficient CPUs (even if the number of cores increases in multiples), and so on, the power will go down.</p>
<p>Also, think about this… wearable displays potentially use less energy than LCD monitors on your desk.</p>
<p><strong>Tish Shute: </strong>Yes, I should pick the brains of my Intel chums on energy saving!</p>
<p><strong>Robert Rice: </strong>Getting rid of the monitor and switching to solid-state drives will save an assload of power. Yes, I said assload.</p>
<p>Tell your Intel chums to quit screwing around with single-core mobile CPUs. We need multiple cores that are smaller, faster, and use less power.</p>
<p><strong>Tish Shute: </strong>Is AR the sustainable future of VWs and MMOGs?</p>
<p><strong>Robert</strong><strong> Rice</strong><strong>: </strong>The fun stuff will happen when they are both integrated in some fashion.</p>
<p><strong>Tish Shute:</strong> So perhaps this is why the Georgia guys are thinking in trying to combine AR and SL (<a id="boum" title="see video here" href="http://uk.youtube.com/watch?v=O2i-W9ncV_0&amp;feature=related" target="_blank">see video here</a>).</p>
<p><strong>Robert</strong><strong> Rice</strong><strong>:</strong> That first video was pretty damn cool. It just pains me that they are using SL for it. And omg, all those markers on the table.</p>
<p>Although, I couldn't care less about seeing my SL avatar on my coffee table. I would rather see an avatar representing ME in the real world, moving around in a virtual world that is a "to scale" replica of the real world. That is MUCH more interesting and innovative.</p>
<p>But even if I donâ€™t like where they are going, or that they are using SL, the important thing is that they are doing something and forging ahead. I have a massive amount of respect for anyone, private, government, or academic, that is doing that.</p>
<p>And yes, the door (or window, or looking glass) has to work both ways for maximum potential; at least, that's what I'd like to see. They don't *have* to, but it would be rather cool.</p>
<p>And going back to sustainability, AR has the potential to make monitors generally obsolete, laptops too. That's a lot of power-hungry devices with all sorts of metals and batteries inside.</p>
<p>But, even if the tech was absolutely crazy awesome right this minute, it would take a little while for consumer adoption.</p>
<p><strong>Tish Shute:</strong> But AR unleashes the mobile device?</p>
<p><strong>Robert Rice: </strong>Yes, AR is going to be built on powerful mobile devices for the near future, eventually embedded computers in clothing and whatnot. But that is a ways off.</p>
<p>Entertainment is going to be the first huge driver.</p>
<p><strong>Tish Shute:</strong> So people will get used to having a pet virtual dragon on their shoulder first?</p>
<p><strong>Robert Rice: </strong>Yes, a virtual dragon is way cool, easy tech for games, and can eventually be leveraged into a smart agent which becomes a practical application… agent-based contextual search, etc. Yes, entertainment will also drive people to get used to the tech.</p>
<p><strong>Tish Shute: </strong>Oh, thanks for turning me on to <a id="kzbv" title="gamesalfresco" href="http://gamesalfresco.com/" target="_blank">gamesalfresco</a>!</p>
<p><strong>Robert Rice: </strong>I've noticed that the good stuff usually gets linked to there. They don't list my blog, but that's what I get for staying under the radar and not posting often. But anyway, gamesalfresco is the first place I send people that need a crash course in AR. Great site, great owner.</p>
<p><strong>Tish Shute:</strong> So are you in agreement with Thomas Wrobel's positioning of <em><strong><a href="http://www.mobilizy.com/wikitude.php" target="_blank">Wikitude</a></strong></em> and <em><strong><a href="http://gamesalfresco.com/2008/07/20/want-your-own-augmented-reality-geisha/" target="_self">AR Geisha doll</a></strong></em> as being significant milestones for AR?</p>
<p><strong>Robert Rice: </strong>Yes, these are among the first attempts to get away from the novelty of simply rendering a 3D object based on a marker, and to make it interesting.</p>
<p>Remember, one of the biggest risks that AR has is being branded as "novelty," which means "cool for five minutes but ultimately a waste of time." I think we have a ways to go before something is truly useful, but as 2009 progresses we should start seeing some effort here. I'd guess 2010 before something really useful comes out… at least something practical.</p>
<p>Now, having said that, I should say that I expect entertainment and games to take the lead (as usual), although there are a few companies really trying to leverage AR and video/graphics compositing for marketing (brochures) and location-based methods (kiosks, large screen projections, etc.)</p>
<p><strong>Tish Shute:</strong> Many people would say Snow Crash (metaverse) is now and Halting State (AR) is ten years from now. But you are seeing a development timeline for some popular AR apps in the next 18 months?</p>
<p><strong>Robert Rice: </strong>Anyone that says Snow Crash is -now- is living in a box. Virtual worlds, virtual reality, and immersive tech in general stopped innovating in the mid 90s. I'm continually flabbergasted at the number of people that think that things like Second Life are state-of-the-art or innovative. You might as well try to market a Walkman as cutting edge, even though we have iPods out there.</p>
<p>I'd like to see someone grab an engine like Offset, Crytek, Hero, or Unreal 3, and smack on a fat MMO server infrastructure (EVE or BigWorld)… toss in the right tools, and you would see a revolution and a renaissance occur at the same time in the virtual world space. All the puzzle pieces are there; just no one is putting them together the right way.</p>
<p><strong>Tish Shute:</strong> Why doesn't anyone do that?</p>
<p><strong>Robert Rice: </strong>It's not cheap, people will only fund a copy of something that exists already, people fear change and innovation, etc. The list goes on. The right money goes to the wrong people all the time.</p>
<p>Alternatively stated, there is a lot of "right idea, wrong implementation."</p>
<p>MMORPGs carried the torch and have made huge strides on the technology front, but have devolved in design. More often than not the gameplay emphasizes the single-player experience and does nothing to take advantage of the potential of the massively connected internet.</p>
<p>Unless both industries have some serious upheaval or radical new approaches, they will quickly be eclipsed by AR, which will eventually evolve into something hybrid… AR/VR depending on your level of access and hardware.</p>
<p>But yes, I'd say that the next 18 months are going to be very interesting, with a lot of money being thrown around, new ventures, and plenty of content/applications. I expect most of this will be centered on single-user AR experienced through a mobile device with a screen (iPhone, Android, etc.). I expect that there will be a significant boost after Vuzix releases some of their wearable *transparent* displays, putting Microvision back into the "has potential but is too quiet" position.</p>
<p><strong>Tish Shute:</strong> AR conjures an image in many people's minds of dreadful head gear!</p>
<p><strong>Robert Rice: </strong>Yes, it is either transparent wearable displays (in an eyeglass form factor) or nothing. HMDs with miniature LCD or OLED displays are good for streaming video, but for the mobile ubiquitous AR we all dream about, it has to be something that looks and feels like a pair of Oakleys.</p>
<p>I should also mention that several different types and modes of AR are going to find themselves being defined and refined over the next two years as we continue to blaze new trails, establish a lexicon (we keep borrowing terms from games, VR, virtual worlds, MMORPGs), and really work out the how as well as the why.</p>
<p>Even though the idea of AR has been around for a long time, the technology is just beginning to emerge, and very few people are even looking far enough ahead to figure out the problems and solutions that the tech creates. Really, who is thinking about how to deal with AR spam right now?</p>
<p><strong>Tish Shute: </strong>Do you see any successful networked AR applications emerging in the next 18 months?</p>
<p><strong>Robert</strong><strong> Rice</strong><strong>:</strong> Yes and no.</p>
<p>When I talk about AR, I try to expand the definition a little bit. Usually, when you talk to someone about augmented reality, the first thing that comes to mind is overlaying 3D graphics on a video stream. I think, though, that it should more properly be any media that is specific to your location and the context of what you are doing (or want to do)… augmenting or enhancing your specific reality.</p>
<p>In this sense, anything that at least knows who you are (your ID, mobile phone #, etc.) and where you are (GPS coordinates or a specific place like a cafe), and gives you relevant data, information, or media = augmented reality. Sure, you can make things more interactive or immersive, but that is the minimum.</p>
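<p>That minimum bar (identity plus location in, context-relevant media out) can be sketched in a few lines; the place table, coordinates, and media items below are hypothetical:</p>

```python
# Minimal "augmented reality" by Rice's expanded definition:
# a service keyed on where you are that returns relevant media.
PLACES = {
    ("35.8436", "-78.6445"): {"name": "Glenwood Avenue cafe",
                              "media": ["menu", "friend check-ins"]},
}

def augment(user_id: str, lat: str, lon: str) -> dict:
    """Identity + location in, context-relevant media out."""
    place = PLACES.get((lat, lon), {"name": "unknown", "media": []})
    return {"user": user_id, "place": place["name"], "media": place["media"]}

augment("tish", "35.8436", "-78.6445")  # media relevant to that cafe
```

<p>Everything beyond this, 3D overlays, interactivity, immersion, is layered on top of the same who/where/what core.</p>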
<p>So, in this case, yes, I think there will be networked applications in the next 18 months… mostly things that are enhanced by friends lists (you are here, your friend is over there). These will be *application specific*. My team at Neogence is already going beyond this, building a platform and infrastructure for other applications to be developed on… all networked through the same backbone. Now, in this context (the science-fiction AR that we all dream about), no, I do not see anyone else trying to leap a generation or two ahead of the industry to build a massively multiuser shared AR space. Expect to see things like multi-user AR games, virtual pets, kiosk marketing, magic books, "gee whiz" presentations (tradeshow booths, entertainment parks, etc.), and so forth.</p>
<p>The big thing I'm worried about is AR becoming the next Silicon Valley trend… once they realize the potential, an enormous amount of capital will flow to a bunch of startups with half-baked ideas, weak business models, ten-year-old tech, and a lot of overhyped marketing. That is the very thing that will kill this technology as something that has true power and potential to literally change the way we interact with each other, our surroundings, information, and media.</p>
<p><strong>Tish Shute: </strong>Do you think AR has value for a project like Pachube that helps us connect data from lots of different environments and sensor/actuator data?</p>
<p><strong>Robert Rice:</strong> I think that AR has value as an interface to this data (essentially data visualization based on information streaming from a sensor or source that is interpreted in some dynamic graphical manner that has meaning). This is one of the "big areas" where ubiquitous augmented reality and wearable computing will really shine. I'll definitely be keeping an eye on Pachube.</p>
<p><strong>Tish Shute:</strong> I can't help it! I am really interested to hear more about the Vuzix glasses.</p>
<p><strong>Robert Rice:</strong> Yeah, everyone is getting hung up on the glasses as the end-all be-all, and on having markers everywhere too.</p>
<p>All the glasses are is another display device. At the end of the day, it doesn't matter if you are looking at an LCD monitor, an iPhone, a head-mounted display, or a pair of wicked next-generation transparent wearable displays that magically draw directly on your retinas.</p>
<p>The real tricky stuff is what happens on the backend… making it all persistent, massively multiuser, intelligent, interoperable, realistic, etc. etc.</p>
<p>I think that we are within 24 months of the magic wearables (these new ones by Vuzix are probably the real first-generation attempt at doing it right). They won't be perfect, but I expect they will be functional… and once we have functional, we can start doing the good stuff.</p>
<p><strong>Tish Shute:</strong> You mentioned your disappointment with VWs and MMORPGs earlier. Could you tell me more about that?</p>
<p><strong>Robert Rice: </strong>Yeah, there was an evolutionary divergence between virtual worlds and MMORPGs a while back. One stagnated almost completely, and the other leapt ahead in one sense and devolved horribly in the other sense. Neither is where the state of the art should be. That is a whole other conversation, and probably a second book.</p>
<p><strong>Tish Shute:</strong> So making AR persistent, massively multiuser, intelligent, interoperable, realistic, etc. etc.: that is where your efforts are going?</p>
<p><strong>Robert Rice: </strong>Yes. I fully expect that the hardware is almost ready for it. You can cobble together some amazing things in the lab right now, and I think commercial viability is imminent. The real value (as far as I'm concerned) is in making it mobile, wireless, persistent, and massively multiuser. You could argue that augmented reality will take over where virtual reality failed and become Internet 3, Internet 1 being the internet, Internet 2 being the web…</p>
<p>MMORPGs are nothing more than single-player games in a multiuser environment these days. I'm more than a bit bitter about it. All the right money went to the wrong people, and the best games we have are barely shadows of what we could have had by now.</p>
<p><strong>Tish Shute:</strong> Are there any open source AR platform dev projects?</p>
<p><strong>Robert Rice: </strong>Open source? Hrm, I'm sure there are multiple ones out there.</p>
<p>If not entirely open source, there are plenty of things to experiment with that are generally free if you aren't trying to sell something; DART and ARToolKit come to mind as very accessible applications.</p>
<p>Marker-based AR is very important right now… it is easy, low tech, understandable, highly customizable, and most importantly, accessible to the average joe. Ultimately though, we need a method of pure tracking… no markers glued to everything on the planet, no "billions of RFIDs" embedded in every square inch of every object on the planet, etc.</p>
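<p>Stripped to its essence, marker-based AR is a lookup from a detected glyph to content. Here is a toy sketch that assumes a 3x3 black/white marker has already been detected and rectified in the video frame (real toolkits like ARToolKit do far more work for that step); the marker patterns and content names are invented for illustration:</p>

```python
# Map of marker IDs to the content they should summon.
MARKER_CONTENT = {
    0b101010101: "3D teapot",
    0b111000111: "virtual pet",
}

def marker_id(cells):
    """Pack a row-major list of 9 black(1)/white(0) cells into an integer ID."""
    mid = 0
    for cell in cells:
        mid = (mid << 1) | cell
    return mid

def lookup(cells):
    """Resolve a sampled marker grid to its registered content."""
    return MARKER_CONTENT.get(marker_id(cells), "unknown marker")

lookup([1, 0, 1, 0, 1, 0, 1, 0, 1])  # -> "3D teapot"
```

<p>This also shows why the glyph supply is finite, as mentioned earlier with the butcher's labels: a 3x3 grid yields at most 512 distinct IDs, so richer markers or pure tracking are needed at scale.</p>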
<p><strong>Tish Shute:</strong> What do you mean by interoperability in AR? And what do you think about the development of standards?</p>
<p><strong>Robert</strong><strong> Rice</strong><strong>: </strong> Ooh, good question.</p>
<p>Ok, so the internet is basically computers communicating with computers, and the web is mostly pages linking to other pages (I'm greatly oversimplifying here). Hold that thought for a minute.</p>
<p>Switch over to MMORPGs. If you want to play in one (or a virtual world), you need to download a client that is specific to that world. One client does not work with another world. There are plenty of efforts to change this, but they are all barking up the wrong tree. The specific uniqueness of each world defeats the need and purpose of true interoperability, unless you completely reinvent the whole thing with a common backbone, features, functionality, etc. The very nature of virtual worlds and MMORPGs rebels against this. You absolutely do not want an avatar from Second Life running around in World of Warcraft (for reasons that should be obvious).</p>
<p>On the other hand, with the web, you can use just about any client (browser) to access nearly any website (some requiring plugins or whatever).</p>
<p>The thing with augmented reality is, how do we go about making this? I've seen a few people thinking about this from the wrong perspective. There was a question at the last TechCrunch to the Sekai Camera guys (a conceptual AR application for the iPhone) where someone on the panel wanted to know how website owners would convert their content for augmented reality. BZZZZZT! That is a fundamental misunderstanding of what AR is, or could be, and it falls into the same trap I see a lot of people falling into… and that is looking at AR through the web 2.0 lens or the virtual world lens. It is absolutely fundamentally different at the core… sure, there are similarities: it has social networking/media applications and properties, and it has 3D graphics, but it stops there.</p>
<p>Ubiquitous augmented reality will be dramatically different depending on which standards, approaches, and philosophies get the most traction first. Will you walk down the street with your AR glasses and have a pop up every 30 feet asking you if you want to access the AR content on another server? Will you then have to register, subscribe, or whatever?</p>
<p>Or will all AR content be mediated by one sole master control server deep in the bowels of google? What about some other option? Will you need different sets of glasses to access different features and content from multiple sources?</p>
<p>At the end of the day, it should not matter what brand of glasses you are wearing; you should never have to deal with AR server popups to join/subscribe, and so forth.</p>
<p>Interoperability, in the context of what I was saying earlier, is the sense of how to build the infrastructure so all of this is seamless to the end user, while still maintaining the features/functionality necessary for all of what augmented reality promises us… I don't want to see everything in AR space, I want to be able to tune in or filter out some things, and I want to customize the snot out of what I see (perhaps changing metaphors or "holoscapes"), and so on. It all has to work together and simplify the end-user experience or it won't get anywhere.</p>
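<p>The "tune in or filter out" idea can be sketched as client-side filtering over content aggregated from many AR servers, with no per-server popups; the server names, channels, and items below are hypothetical:</p>

```python
# Content already aggregated from multiple AR servers into one feed.
FEED = [
    {"server": "ads.example",     "channel": "advertising", "text": "50% off!"},
    {"server": "city.example",    "channel": "transit",     "text": "Bus in 3 min"},
    {"server": "friends.example", "channel": "social",      "text": "Bob is nearby"},
]

def visible(feed, subscribed):
    """The user subscribes to channels of content, not to individual servers."""
    return [item for item in feed if item["channel"] in subscribed]

view = visible(FEED, {"transit", "social"})  # advertising is tuned out
```

<p>The design point is that filtering happens on the user's side of a common backbone, so the choice of glasses or server never enters into it.</p>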
<p><strong>Tish Shute: </strong>So what caused the stagnation of new development and the devolution of MMOGs, in your opinion?</p>
<p><strong>Robert Rice: </strong>Yes, look at all the hope and hype for the MMORPGs released in the last 12 months. Really, what is different or better? Now, what is worse?</p>
<p>I bet any decent MMORPG gamer could give you a list of 2 or 3 things for the first question and 20-30 things for the second.</p>
<p>And VWs seem to be stuck in a feedback loop.</p>
<p><strong>Tish Shute: </strong>feedback loop?</p>
<p><strong>Robert Rice:</strong> Imagine nailing one of your feet to the ground and then trying to run 'round and 'round and 'round.</p>
<p><strong>Tish Shute:</strong> Why do you think this happened to VWs?</p>
<p><strong>Robert</strong><strong> Rice</strong><strong>: </strong>Men in suits and flashy watches.</p>
<p>Actually, hang on…</p>
<p>I saw a video clip the other day from a conference about using various virtual and game technologies for simulations and other real-world applications. Several people were talking about "avatar technology" and how theirs was better than their competition's and whatnot.</p>
<p>Now, can you tell me what "avatar technology" is? Avatar technology is a red herring. Avatar technology is the same thing as calling a toaster a new "fire technology."</p>
<p><strong>Robert Rice: </strong>The problem is that a lot of people that don't have a clue about what they are doing are selling the tech to other people that have no clue what they are buying, but they feel like they should for some unknown reason.</p>
<p>That is happening all over the government, academic, and industrial sectors now, with a few companies selling virtual worlds (again, mid-90s tech) as the ultimate solution to all problems.</p>
<p>Anyway, getting back to your question…</p>
<p>Once virtual reality started getting some buzz, some people got greedy and jumped into the avatar/virtual world thing and tried making it commercial too soon. Half of the 3D chat worlds were being jammed into platforms for virtual shopping malls.</p>
<p>Most of the money funding tech R&amp;D started funneling towards VRML, and doing 3D in web pages, etc.</p>
<p><strong>Tish Shute: </strong>Yes, horrible idea trying to make web pages 3D, IMHO.</p>
<p><strong>Robert Rice: </strong>The money people got involved too soon, and then the greedy people jumped in and tried patenting everything possible. Take a look at the worlds.com patent for 3D worlds.</p>
<p>They filed it back in 2000 or so and it was awarded in '07 (it shouldn't have been, in my opinion), and now they are suing everyone they can.</p>
<p><strong>Tish Shute: </strong>Will there be patent wars in AR?</p>
<p><strong>Robert</strong><strong> Rice</strong><strong>:</strong> Yes, the AR patent wars will be legendary once people start waking up to the real potential here.</p>
<p>The only solution is for everyone to band together and pre-emptively patent or make public domain every possible patentable concept, technology, or implementation for AR. Otherwise, you haven't seen anything yet.</p>
<p><strong>Tish Shute:</strong> Is the AR community organized enough to do that yet?</p>
<p><strong>Robert</strong><strong> Rice</strong><strong>:</strong> That depends on how my company fares in the next six months.</p>
<p><strong>Tish Shute:</strong> Will you patent or make your tech public domain?</p>
<p><strong>Robert</strong><strong> Rice</strong><strong>:</strong> I plan on patenting the snot out of everything we can possibly think of, and then giving away our content creation tools and SDK stuff for free. The whole goal of what we are trying to build is to empower the end user and facilitate the creation of a wonderful world of augmented reality.</p>
<p>There are some things we will make public domain for sure, on top of that.</p>
<p><strong>Tish Shute:</strong> So back to my question on networked real-time experience. Will we have networked real-time AR experiences in the next 18 months?</p>
<p><strong>Robert Rice:</strong> It is possible, yes. Other than what we are doing, I am not aware of anyone else taking the same approach we are, but the potential for an "under the radar venture" (much like my own company) is definitely there.</p>
<p><strong>Tish Shute: </strong>Will you use cloud computing?</p>
<p><strong>Robert Rice: </strong>I think that's overrated and probably another attempt at the whole "thin client" model that some companies have been pushing for the last 20 years.</p>
<p>It sounds good on paper, but ultimately takes power and control away from the end user.</p>
<p><strong>Tish Shute:</strong> cloud computing?</p>
<p><strong>Robert Rice: </strong>Yes. You know, we aren't playing around. We are totally building "THE AR" that everyone keeps dreaming about. None of this cute stuff you see on YouTube. Actually, if you want to see the things that have inspired our vision of what we want to build, check out:</p>
<p>* Dream Park by Larry Niven and Steven Barnes<br />
* Rainbows End by Vernor Vinge<br />
* Spook Country by William Gibson<br />
* Halting State by Charles Stross<br />
* The Diamond Age by Neal Stephenson<br />
* Donnerjack by Roger Zelazny and Jane Lindskold<br />
* Otherland by Tad Williams<br />
* Neuromancer by William Gibson<br />
* Idoru by William Gibson<br />
* Cryptonomicon by Neal Stephenson</p>
<p>and watch the whole anime of Denno Coil (subbed NOT dubbed!).</p>
<p><strong>Tish Shute:</strong> So scaling the real-time experience won't be a problem in your project, hehe.</p>
<p>'Cos no sharding is allowed in AR, right?</p>
<p>And if you have lots of API calls?</p>
<p><strong>Robert Rice:</strong> Haha, sharding is one of the dumbest things to happen to the VW/MMO industry.</p>
<p>It is a solution to a technical problem that was relevant 15 years ago.</p>
<p><strong>Tish Shute:</strong> so why did it stick (i know men in suits)</p>
<p><strong>Robert Rice:</strong> it stuck because &#8220;that&#8217;s what the other guys did&#8221; and the MMO designers are too lazy to reconcile gameplay for PvP and RP gamers</p>
<p>However, there is a curious problem between dealing with &#8220;one world&#8221; and &#8220;anyone can start their own custom AR server&#8221;</p>
<p><strong>Tish Shute: </strong>Now that is a very interesting problem the one world and own AR server</p>
<p><strong>Robert Rice:</strong> It took me a few weeks of not sleeping to figure that one out. It gets back to the interoperability issue</p>
<p><strong>Tish Shute:</strong> What did you come up with?</p>
<p><strong>Robert Rice:</strong> a solution. That&#8217;s all I can say for now on that.</p>
<p><strong>Tish Shute</strong>: eeextra seeekrit!</p>
<p>Well I will definitely have to bug you on that.</p>
<p>The problem has produced some creativity in OpenSim with people coming up with hybrids of p2p and oneworld</p>
<p><strong>Robert Rice:</strong> As far as virtual worlds are concerned, they need to look at the problem from a different perspective. They are trying to make all virtual worlds interoperable instead of creating a new model for interoperable worlds that new ones will be created to adhere to.</p>
<p><strong>Tish Shute: </strong>well some people are. I would say most OpenSim developers see their modular approach doing this. And you choose to interoperate based on what modules you have activated and then social agreements&#8230;</p>
<p><strong>Robert Rice:</strong> hrm, that&#8217;s a start, but that only works on a functional and social level &#8211; doesn&#8217;t account for content (story, mythos, game rules), unique data (my +3 sword), or the concepts of commerce, inherent value, and intellectual property</p>
<p>Enabling my WoW avatar to run around in SL and vice versa creates more problems than it solves.</p>
<p>It&#8217;s like two alien races working hard to make sure that their two spaceships can dock, but no one is paying any attention to the fact that race A breathes nitrogen and race B breathes sulfur.</p>
<p>It&#8217;s technically possible, but they are missing the boat on the content side of the problem.</p>
<p><strong>Tish Shute:</strong> Yes but don&#8217;t you think when a modular open source tech for virtual worlds becomes pervasive, what will happen is that those interested in a similar genre will increasingly use the module in ways that allow their content to interoperate if they want it to</p>
<p><strong>Robert Rice:</strong> everyone has to use the same backend tech, and the front end clients need to adhere to the same standards. But I have to admit, I haven&#8217;t been paying much attention to the vw space in the last 9 months or so.</p>
<p>Oh I have to run now. But download and install <a id="vsnt" title="cooliris" href="http://www.cooliris.com/" target="_blank">cooliris</a>. I promise you will be blown away and will start using it to search for images and videos</p>
<p>It&#8217;s frigging awesome.</p>
<p><strong>Tish Shute:</strong> Will do! Thanks so much, great talking to you. I can&#8217;t wait for your launch.</p>
]]></content:encoded>
			<wfw:commentRss>http://www.ugotrade.com/2009/01/17/is-it-%e2%80%9comg-finally%e2%80%9d-for-augmented-reality-interview-with-robert-rice/feed/</wfw:commentRss>
		<slash:comments>27</slash:comments>
		</item>
		<item>
		<title>Hacking the World in 2009: Google Street View, &#8220;Smart Stuff,&#8221; and Wikiculture.</title>
		<link>http://www.ugotrade.com/2008/12/29/hacking-the-world-in-2009-google-street-view-smart-stuff-and-wikiculture/</link>
		<comments>http://www.ugotrade.com/2008/12/29/hacking-the-world-in-2009-google-street-view-smart-stuff-and-wikiculture/#comments</comments>
		<pubDate>Mon, 29 Dec 2008 19:20:11 +0000</pubDate>
		<dc:creator><![CDATA[Tish Shute]]></dc:creator>
				<category><![CDATA[3D internet]]></category>
		<category><![CDATA[Ambient Devices]]></category>
		<category><![CDATA[Ambient Displays]]></category>
		<category><![CDATA[architecture of participation]]></category>
		<category><![CDATA[Architecture Working Group]]></category>
		<category><![CDATA[Carbon Footprint Reduction]]></category>
		<category><![CDATA[culture of participation]]></category>
		<category><![CDATA[CurrentCost]]></category>
		<category><![CDATA[Ecological Intelligence]]></category>
		<category><![CDATA[Energy Awareness]]></category>
		<category><![CDATA[Energy Saving]]></category>
		<category><![CDATA[home automation]]></category>
		<category><![CDATA[home energy monitoring]]></category>
		<category><![CDATA[home energy monitors]]></category>
		<category><![CDATA[HomeCamp]]></category>
		<category><![CDATA[Instrumenting the World]]></category>
		<category><![CDATA[interoperability of virtual worlds]]></category>
		<category><![CDATA[Linden Lab]]></category>
		<category><![CDATA[message brokers and sensors]]></category>
		<category><![CDATA[Metaverse]]></category>
		<category><![CDATA[mirror worlds]]></category>
		<category><![CDATA[Mixed Reality]]></category>
		<category><![CDATA[MQTT and RSMB]]></category>
		<category><![CDATA[Open Grid]]></category>
		<category><![CDATA[open metaverse]]></category>
		<category><![CDATA[open protocols for virtual worlds]]></category>
		<category><![CDATA[open source]]></category>
		<category><![CDATA[Open Source Virtual Worlds]]></category>
		<category><![CDATA[open standards for virtual worlds]]></category>
		<category><![CDATA[OpenSim]]></category>
		<category><![CDATA[Participatory Culture]]></category>
		<category><![CDATA[smart appliances]]></category>
		<category><![CDATA[Smart Devices]]></category>
		<category><![CDATA[Smart Planet]]></category>
		<category><![CDATA[social media]]></category>
		<category><![CDATA[sustainable living]]></category>
		<category><![CDATA[sustainable mobility]]></category>
		<category><![CDATA[virtual communities]]></category>
		<category><![CDATA[Virtual HomeCamp]]></category>
		<category><![CDATA[Virtual Meters]]></category>
		<category><![CDATA[virtual world standards]]></category>
		<category><![CDATA[Virtual Worlds]]></category>
		<category><![CDATA[Web 2.0]]></category>
		<category><![CDATA[Web Meets World]]></category>
		<category><![CDATA[World 2.0]]></category>

		<guid isPermaLink="false">http://www.ugotrade.com/?p=2463</guid>
		<description><![CDATA[Google Street View Hacking This Google Street View Hack (via @timoreilly) will get my nomination for a Hacking the World Award this year, if there is such an award. A parade (the screenshot opening this post), a marathon, a mad-scientist&#8217;s laboratory, a sword fight, and more (see The Infonaut Blog) were staged all along the route [&#8230;]]]></description>
				<content:encoded><![CDATA[<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2008/12/sampsoniawaypost.jpg"><img class="alignnone size-full wp-image-2475" title="sampsoniawaypost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2008/12/sampsoniawaypost.jpg" alt="" width="450" height="274" /></a></p>
<h3>Google Street View Hacking</h3>
<p><a href="http://www.wikio.com/video/576734" target="_blank">This Google Street View Hack</a> (via<a href="http://twitter.com/timoreilly" target="_blank"> @timoreilly</a>) will get my nomination for a Hacking the World Award this year, if there is such an award.</p>
<p><a href="http://maps.google.com/maps?cbp=1,262.96388206761037,,0,16.58444579096093&amp;cbll=40.456878,-80.01196&amp;layer=c&amp;ie=UTF8&amp;ll=40.458499,-80.009319&amp;spn=0.00569,0.012918&amp;z=17&amp;panoid=zHdES6mj-vBrH2nF-K9ROQ" target="_blank">A parade</a> (the screenshot opening this post), <a href="http://maps.google.com/maps?cbp=1,260.87215088682916,,0,8.64102186979147&amp;cbll=40.457046,-80.011085&amp;layer=c&amp;ie=UTF8&amp;ll=40.458671,-80.00845&amp;spn=0.00569,0.012918&amp;z=17&amp;panoid=81ALq0NpV6uyLEF5S5ENhw" target="_blank">a marathon</a>, <a href="http://maps.google.com/maps?cbp=1,160.10914016686365,,0,33.949139944215034&amp;cbll=40.456949,-80.011593&amp;layer=c&amp;ie=UTF8&amp;ll=40.458573,-80.008954&amp;spn=0.00569,0.012918&amp;z=17&amp;panoid=C4I-QLkZJoT1SHXslK5f7Q" target="_blank">a mad-scientist&#8217;s laboratory</a>, <a href="http://maps.google.com/maps?cbp=1,9.995045624107206,,0,10.698194796922357&amp;cbll=40.457636,-80.00767&amp;layer=c&amp;ie=UTF8&amp;ll=40.459103,-80.006486&amp;spn=0.00569,0.012918&amp;z=17&amp;panoid=W_ox0QPcWyPqWGNPiK91Nw" target="_blank">a sword fight</a>, and more (see <a href="http://www.infonaut.ca/blog/?p=290" target="_blank">The Infonaut Blog</a>) were staged all along the route of the Google Street View truck by artists Robin Hewlett and Ben Kinsley, working in conjunction with the local community and Google Street View.</p>
<p>The Google Street View Hack suggests a myriad of possibilities for anyone with their eye on the prize for a great world hack for 2009. In my mind&#8217;s eye, I imagine the Google Street View truck&#8217;s trek across the planet triggering local environmental street action carnivals wherever it goes.</p>
<p>Local energy conservationists,<a href="http://www.nytimes.com/2008/12/27/world/europe/27house.html?_r=1&amp;pagewanted=all" target="_blank"> &#8220;passive house&#8221; architects</a>, and retrofitters could turn the arrival of Google Street View into an occasion to create projects for a sustainable future &#8211; a traveling StreetCamp (see <a href="http://www.ugotrade.com/2008/12/15/smart-planetinterview-with-andy-stanford-clark/" target="_blank">my post on HomeCamp &#8217;08 here</a>). As Google Street View intends, surely, to go everywhere, this would be a global hack for sustainable living that crossed the bounds of the physical and the virtual. And the vast public record of Google Street View would become a generative engine and global resource for sustainable living.</p>
<h3>Working together on the noble aim of sustainable living</h3>
<p>- this is my (and many other people&#8217;s) big theme for 2009.</p>
<p>A Hacking the World award should also go to <a href="http://www.pachube.com/">Pachube</a> &#8211; &#8220;patching the planet&#8221; &#8211; for demonstrating that instrumenting the world is not merely a Sci Fi fantasy anymore. By facilitating &#8220;interaction between remote environments, both physical and virtual,&#8221; Pachube demonstrates (see <a href="http://community.pachube.com/?q=node/1" target="_blank">diagram here</a>) how we have only just begun to dip our toes into the many new opportunities we have to work together to save energy, rethink our culture of consumption, and to reboot our failing economy under a new sustainable operating system.</p>
<p>Energy awareness, unlike the glut of information we have in entertainment and games, suffers from a dearth of information. We really have very little idea about what we are consuming and the waste we are producing. So more Hacking the World Awards should go to projects like <a href="http://www.amee.com/" target="_blank">AMEE</a> &#8211; creating the world&#8217;s energy meter &#8211; and <a href="http://www.wattzon.com/" target="_blank">Wattzon</a> &#8211; your personal energy meter &#8211; for giving us new ways to understand and work with energy data.</p>
<p>Many people and organizations, given the information, will change their behaviours. But the cultural changes necessary for sustainable living are deep and old habits die hard (see <a href="http://www.nytimes.com/2008/12/27/opinion/27sat1.html" target="_blank">this disturbing report</a> on the recent return to SUV buying in November as soon as gas prices fell!).</p>
<h3>A Small Community of Volunteers Can Bring Change on a Global Scale</h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2008/12/homecampthethrongpost.jpg"><img class="alignnone size-full wp-image-2535" title="homecampthethrongpost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2008/12/homecampthethrongpost.jpg" alt="" width="450" height="153" /></a></p>
<p>Picture above by <a href="http://benjaminellis.co.uk/" target="_blank">Benjamin Ellis</a>, &#8220;HomeCamp &#8211; The Throng,&#8221; from his <a href="http://www.flickr.com/photos/tags/homecamp08/" target="_blank">Flickr</a><a href="http://www.flickr.com/search/?q=homecamp&amp;w=29034542%40N00" target="_blank"> stream.</a></p>
<p>One of my favorite &#8220;instrumenting the world&#8221; projects to date and another top contender for a Hacking the World Award is <span class="entry-content"><a id="h4a0" title="HomeCamp '08" href="http://homecamp.pbwiki.com/homecamp08" target="_blank">HomeCamp &#8217;08</a></span> (see my <a href="http://www.ugotrade.com/2008/12/15/smart-planetinterview-with-andy-stanford-clark/" target="_blank">previous post</a>). HomeCamp brings together a community of creators and enthusiasts of &#8220;smart stuff,&#8221; creating <a href="http://meta.wikimedia.org/wiki/Wikiculture" target="_blank">a wikiculture</a> for the noble cause of sustainable living.</p>
<p>The key to whether &#8220;instrumenting the world&#8221; empowers people and changes our lives for the better will be the capacity our systems of instrumentation have for what Jonathan Zittrain, in &#8220;<a href="http://futureoftheinternet.org/" target="_blank">The Future of the Internet: And How To Stop It</a>,&#8221; defines as generativity, i.e.: &#8220;the system&#8217;s capacity to produce unanticipated change through unfiltered contributions from broad and varied audiences&#8221; (Zittrain, 2008).</p>
<p>Generativity is the &#8220;secret sauce&#8221; that makes the difference between, for example, <a href="http://www.wikipedia.org/" target="_blank">Wikipedia</a> and its all but forgotten predecessor &#8211; the &#8220;written by experts&#8221; <a href="http://en.wikipedia.org/wiki/Nupedia" target="_blank">Nupedia</a>.</p>
<p>Jonathan Zittrain writes:</p>
<p><em><strong>Wikipedia stands for more than the ability of people to craft their own knowledge and culture.Â  It stands for the idea that people of diverse backgrounds can work together on a common project with, whatever its other weaknesses, a noble aim </strong><strong>- bringing such knowledge to the world. (p.147)</strong></em></p>
<p>At <a href="http://en.oreilly.com/web2008/public/content/home" target="_blank">Web 2.0 Summit</a>, Jonathan Hochman (<em><strong><a href="http://en.oreilly.com/web2008/public/schedule/detail/6952" target="_blank">Known as </a><a href="http://en.wikipedia.org/wiki/User:Jehochman">Jehochman</a> on Wikipedia</strong></em>), shared with me his insider perspective as a Wikipedia administrator. The <a href="http://www.ugotrade.com/2008/12/26/wikipedia-houdini-google-street-view-instrumenting-sustainable-living#link_1">full interview</a> with Jonathan is later in this post.</p>
<p>Jonathan comments on the role of wikiculture in sustainable living:</p>
<p><em><strong>&#8220;Sustainable Living requires everything to become more efficient. Incentives need to line up with conservation priorities. This requires a radical change to the way we govern ourselves. Command economies, whether commanded by politicians or capital, lead to huge inefficiencies.&#8221;</strong></em></p>
<p>And surely, if we have learned anything in 2008, we have learned that very bad things happen when the complex systems of modern life are left in the hands of a few people motivated solely by the urge to make profit.</p>
<h3>Hacking Design and Planning Processes for Real Estate and Transportation with Virtual Worlds</h3>
<p><object width="400" height="302" data="http://vimeo.com/moogaloop.swf?clip_id=2326434&amp;server=vimeo.com&amp;show_title=1&amp;show_byline=1&amp;show_portrait=0&amp;color=&amp;fullscreen=1" type="application/x-shockwave-flash"><param name="allowfullscreen" value="true" /><param name="allowscriptaccess" value="always" /><param name="src" value="http://vimeo.com/moogaloop.swf?clip_id=2326434&amp;server=vimeo.com&amp;show_title=1&amp;show_byline=1&amp;show_portrait=0&amp;color=&amp;fullscreen=1" /></object></p>
<p>This great machinima by Azwaldo Vilotta shows the progress so far on the <a href="http://studiowikitecture.wordpress.com/2008/12/12/now-is-an-ideal-time-to-join-wikitecture-40/" target="_blank">Wikitecture 4.0 project</a>, &#8216;Re-Inventing the Virtual Classroom&#8217; for the University of Alabama.</p>
<p>Though still a niche market, Virtual Worlds are growing at a steady pace. As I mentioned in my previous post, energy hungry avatars themselves will be a target for optimization in 2009. But as my personal power usage breakdown from <a href="http://www.wattzon.com/" target="_blank">Wattzon</a> shows, cutting down the amount of flying I do in 2009 would be far more effective in reducing my carbon footprint than deciding not to log into Virtual Worlds!</p>
<p>Note: Read Write Web&#8217;s recent post, &#8220;<a href="http://www.readwriteweb.com/archives/enterprise_virtual_worlds.php" target="_blank">Report Enterprise Virtual Worlds More Effective Than Web Conferencing</a>.&#8221; Also check out <a href="http://www.projectchainsaw.com/" target="_blank">Web.Alive</a>, <a href="http://immersivespaces.com/" target="_blank">Immersive WorkSpaces</a>, and Dusan Writer&#8217;s post, &#8220;<a href="http://dusanwriter.com/index.php/2008/12/20/thinkbalm-the-immersive-internet-and-collaborative-culture/" target="_blank">ThinkBalm, The Immersive Internet and Collaborative Culture</a>.&#8221;</p>
<p>My friend Melanie Swan points out in her Top Ten Computing Trends for 2009 that Virtual Worlds not only have the power of the 3 Cs (communication, collaboration and commerce) but they are fast expanding into <a href="http://www.3pointd.com/20070406/rapid-architectural-prototyping-in-second-life/">rapid prototyping</a>, <a href="http://your2ndplace.com/node/926">simulation</a> and <a href="http://sldataviz.pbwiki.com/">data visualization</a>.</p>
<p>My Hacking the World 2008 Awards for Virtual World innovation would go to three potentially world changing projects for sustainable living:</p>
<p>1) <a href="http://studiowikitecture.wordpress.com/" target="_blank">Studio Wikitecture</a>, (see <a href="http://studiowikitecture.wordpress.com/" target="_blank">&#8220;Reinventing the Virtual Classroom&#8221;</a> for The University of Alabama).</p>
<p>2) Oliver Goh&#8217;s work on &#8220;<a href="http://www.shaspa.com/cms/website.php" target="_blank">The Path to Sustainable Real Estate.&#8221;</a></p>
<p>3) Encitra, a company recently co-founded by <a href="http://www.ics.uci.edu/informatics/research/research_highlight_view.php?id=52" target="_blank">Crista Lopes</a> and <a href="http://www.podcar.org/uppsalaconference/christerlindstrom.htm" target="_blank">Christer Lindstrom</a> focused on improving urban planning processes, starting with transportation, using virtual worlds (<a href="http://www.ugotrade.com/2008/11/25/web-meets-world-participatory-culture-and-sustainable-living/" target="_blank">see my previous post here for more</a>).</p>
<p>The latter two projects are being developed in <a href="http://opensimulator.org/wiki/Main_Page" target="_blank">OpenSim</a> &#8211; the open source project that should also get a Hacking The World Award for creating an open modular architecture for virtual worlds that is unleashing all these new possibilities for integrating physical and virtual worlds.</p>
<p>The 2008 code contributions to OpenSim of special note re world hacking are Crista Lopes&#8217;<a href="http://opensimulator.org/wiki/Hypergrid"> OpenSim Hypergrid</a> &#8211; see Justin CC&#8217;s blog for full details on <a href="http://justincc.wordpress.com/2008/12/19/what-is-the-hypergrid/" target="_blank">&#8220;What is the hypergrid?,&#8221;</a> and David Levine&#8217;s work (IBM),  in collaboration with Linden Lab (see<a href="http://wiki.secondlife.com/wiki/Architecture_Working_Group" target="_blank"> Architecture Working Group</a>), on interoperability (see <a href="http://www.ugotrade.com/2008/07/" target="_blank">my earlier post here</a>).</p>
<p>Both these projects expand the frontiers of interoperability for virtual worlds, although they &#8220;slice the problem from different ends,&#8221; as David Levine put it. The emphasis in the LL/IBM approach is on security, so assets are not moving yet. In Crista&#8217;s solution you can have assets, but the security issues are not addressed yet. But this work is vital to expanding the usefulness of virtual worlds and both projects should get Hacking the World Awards IMHO.</p>
<p>I asked <a href="http://archsl.wordpress.com/" target="_blank">Jon Brouchoud </a>(full interview upcoming) what he thought were Studio Wikitecture&#8217;s most important successes to date:</p>
<p><strong><em>&#8220;I think the greatest success has been in proving, on some level, that everyone has important knowledge that can inform and improve the design of a building, not just architects.Â  If we can continue building on that success, I hope we can eventually start to hack the traditional design process, and find ways to harness the wealth of knowledge held by the general public, instead of ignoring or avoiding it, as is most often the case.&#8221;</em></strong></p>
<h3>Harnessing the &#8220;Smart Stuff&#8221; to the Noble Cause of Sustainable Living</h3>
<p>Robert Scoble&#8217;s <a href="http://scobleizer.com/2008/12/27/the-interview-of-the-year-tim-oreilly/" target="_blank">The Interview of the Year: Tim O&#8217;Reilly</a> is not to be missed. Tim O&#8217;Reilly discusses the key trends for 2009 that are bubbling up at O&#8217;Reilly Media. And, yes, Tim O&#8217;Reilly, as the guru of Hacking the World, gets the &#8220;Distinguished Thinker &#8211; Hacking The World Award of 2008!&#8221;</p>
<p>Tim O&#8217;Reilly&#8217;s trend list includes:</p>
<p>1) Big data &#8211; vast peer-produced databases in the cloud, accessible by mobile devices</p>
<p>2) &#8220;smart stuff&#8221; &#8211; sensors and robotics and hacking on stuff for fun and not for profit</p>
<p>3) Green Tech</p>
<p>4) Advances in Biological/Life Sciences.</p>
<p>And, in Robert Scoble&#8217;s interview, there is a nice titbit of history re his attendance of early <a href="http://en.wikipedia.org/wiki/Foo_Camp" target="_blank">Foo Camps</a>. Foo Camp is the wiki of O&#8217;Reilly conferences and a lineage holder to my favorite Hacking the World event of 2008, <span class="entry-content"><a id="h4a0" title="HomeCamp '08" href="http://homecamp.pbwiki.com/homecamp08" target="_blank">HomeCamp &#8217;08</a></span>.</p>
<p>But what will be the &#8220;secret sauce&#8221; for these big ideas &#8211; the generative engines that harness these vast peer-produced databases, and all the creative &#8220;smart stuff&#8221; hackers across the globe are creating, to the noble cause of sustainable living? What will motivate the mass adoption of Green Tech and sustainable living?</p>
<p>What can Wikipedia teach us about how generative systems and bottom up approaches can change the world?</p>
<p>Jimmy Wales (interview coming soon!) writes in his recent <a href="http://wikimediafoundation.org/wiki/Donate/Letter/en?utm_source=2008_jimmy_letter_r&amp;utm_medium=sitenotice&amp;utm_campaign=fundraiser2008#appeal" target="_blank">personal appeal</a> for support for Wikipedia:</p>
<p><em><strong>At its core, Wikipedia is driven by a global community of more than 150,000 volunteers &#8211; all dedicated to sharing knowledge freely. Over almost eight years, these volunteers have contributed more than 11 million articles in 265 languages. More than 275 million people come to our website every month to access information, free of charge and free of advertising.</strong></em></p>
<p>To answer questions on how to create a successful wikiculture for sustainable living, an insider&#8217;s view of Wikipedia may be a good place to start.</p>
<h3>Interview With Jonathan Hochman on Wikipedia.</h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2008/12/gammapostjon.jpg"><img class="alignnone size-full wp-image-2477" title="gammapostjon" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2008/12/gammapostjon.jpg" alt="" width="223" height="158" /></a><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2008/12/jonathanwikikpost.jpg"><img class="alignnone size-full wp-image-2473" title="jonathanwikikpost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2008/12/jonathanwikikpost.jpg" alt="" width="224" height="158" /></a></p>
<p>The picture on the left is from the Wikipedia article, <a href="http://en.wikipedia.org/wiki/Gamma-ray_burst" target="_blank">Gamma-ray Burst</a>, that Jonathan Hochman is currently working on. It is a drawing of a massive <a title="Star" href="http://en.wikipedia.org/wiki/Star">star</a> collapsing to form a <a title="Black hole" href="http://en.wikipedia.org/wiki/Black_hole">black hole</a>. Energy released as jets along the axis of rotation forms a gamma-ray burst. <em>Credit: Nicolle Rager Fuller/NSF</em></p>
<p>The picture on the right, Jonathan at Web 2.0 Summit, is taken by me. Jonathan was part of the <em><a href="http://en.oreilly.com/web2008/public/schedule/detail/6952" target="_blank">Defending Web 2.0 from Virtual Blight</a></em> panel.</p>
<p><em><strong><a href="http://en.oreilly.com/web2008/public/schedule/detail/6952" target="_blank">Known as </a><a href="http://en.wikipedia.org/wiki/User:Jehochman">Jehochman</a> on Wikipedia, he serves as an administrator and as a leader in addressing online harassment, disruption and sock puppetry. He is also the founder of <a href="http://www.hochmanconsultants.com/">Hochman Consultants</a>, an Internet marketing consultancy, and the director of <a href="http://www.semne.org/">Search Engine Marketing New England</a>, a regional conference series.</strong></em></p>
<p><strong>Tish:</strong> Second Life and Wikipedia are the two great experiments in collaborative co-creation &#8211; what do they have to teach us about the future of the internet?</p>
<p><strong>Jonathan:</strong> Yes, Wikipedia and Second Life are key social spaces. Some people have been seeing Second Life as the beginning of Web 3.0 &#8211; a wrap around environment where you can almost experience another life. Wikipedia is sort of another example of this.</p>
<p>All the problems that exist in the real world are mirrored right into that little universe. For example, the Armenians and the Turks are at each other&#8217;s throats and the Japanese and the Koreans are going at it, the Palestinians and the Israelis, and the &#8220;Troubles&#8221; &#8230; all the conflicts are imported into Wikipedia. People are fighting over the content of these articles. They want to have it their way because these articles are ranked first in Google and they have a big impact on public opinion.</p>
<p>There was a huge fight on the waterboarding article a while back. Some guys from Little Green Footballs &#8211; a very conservative, reactionary type of media &#8211; were trying to change the article to say that waterboarding might not be torture &#8211; change it to say it is probably not so bad. Crazy stuff. They were trying to water it down. And it is very clear, from every source out there, that waterboarding is torture. We did a study and there are 115 sources that say waterboarding is torture. You simulate drowning &#8211; you simulate killing someone &#8211; that is a violation of the Geneva Convention and everything else. People were fighting, fighting, fighting!</p>
<p>One of the things I did was to try and clear people out who were being disruptive. We actually had to go to arbitration over that article. It is like the supreme court of Wikipedia. There is a panel of 15 arbitrators. They hear the case. There is evidence, arguments and decisions. It is really like a simulated law suit. You get all the experience of a simulated law suit with the real threat that you could be banned. If they don&#8217;t like what you are doing they can actually ban you or restrict you from topics.</p>
<p>So it is really fascinating how this social space Wikipedia becomes a very real platform, though it is in a virtual world, for real world disputes. Most disputes are over the definition of things. If you have a law suit, most disputes are about how things are defined. And Wikipedia has become the de facto definition of things in the real world. People want to know what are &#8220;The Troubles.&#8221; If you go to Wikipedia you find out The Troubles are a dispute over Northern Ireland. What the article says has a profound impact on public opinion.</p>
<p><strong>Tish:</strong> So who is on the court of Wikipedia?</p>
<p><strong>Jonathan:</strong> They are volunteers. These people work two or three hours a day to run this court. There are all kinds of projects. There is a WikiProject Spam, which has people who can write computer programs to statistically analyze Wikipedia projects &#8211; not only Wikipedia. But all of them are looking at the links and reporting them and banning those people who are abusing or gaming the system.</p>
<p><strong>Tish:</strong> You were on the Stopping Virtual Blight Panel at Web 2.0 Summit &#8211; what are the most important things to think about on this topic?</p>
<p><strong>Jonathan:</strong> Yes, we were talking about how to defend the web against virtual blight. The thing I find interesting about Wikipedia is that it is the eighth largest web site, and possibly the second largest web site comprised of user generated content after YouTube. The problems that exist in Wikipedia are larger and more detailed than on any other site. Whatever problem someone has with their social media site or their Web 2.0 site, these problems already exist in Wikipedia, and the solutions are there and they are transparent. You can actually see the history of what&#8217;s been done.</p>
<p>If there is, for example, a problem on Digg &#8211; some problem with sock puppetry or vote stacking &#8211; it happens, it goes away. You don&#8217;t get full disclosure. With Wikipedia you can actually go in and look at a dispute and watch it unfold. You can watch the arbitration cases that are filed, the arguments, the decisions, the logic, the rationale. You can see the successes and the failures and the different things people have tried to control blight. For example, we tried to resolve this dispute one way but it was a disaster, so we have tried something else and that worked.</p>
<p>Wikipedia is a large laboratory for social media &#8211; Wikipedia and the large universe of Wiki and WikiMedia projects around it that individuals and enterprises put together, like Commons. Wikimedia Commons is a repository of publicly licensed images that anyone can take and reuse. They have sound and they have video, and all of this stuff is being stitched together now.</p>
<p>So if you go to the article on Obama you can probably now hear his acceptance speech because that is public domain &#8211; it&#8217;s been stitched into the article. If you go to the article on Richard Nixon &#8211; his resignation speech &#8211; you may even hear his conversation with the astronauts when they landed on the moon. So this becomes a giant repository of all our culture and knowledge. When I design a website, a lot of times I go to Commons to find images I use for free. I don&#8217;t want to pay for an image I can get for free.</p>
<p><strong>Tish: </strong>And the Commons images get contextualized in Wikipedia too.</p>
<p><strong>Jonathan:</strong> Some of these articles are fascinatingly detailed. If you want a quick summary of Dr. Strangelove, the article is fantastic. It is enjoyable, a pleasure to read. I was reading about S.A. Andree&#8217;s North Pole balloon expedition of 1897. Some guys from Sweden decided to fly a balloon over the North Pole. They managed to get aloft, then they flew over the icepack for 24 hrs, then they crashed.</p>
<p>They unloaded their stuff and hiked back across the ice toward the island they had launched from. They ended up being on the ice pack for three months before they finally holed up in an ice cave and starved to death. They weren&#8217;t found until thirty years later! There was a camera with these guys, with the frozen pictures taken thirty years earlier. They developed the film, and those pictures are now on Wikipedia. It is just a fascinating thing!</p>
<p><strong>Tish: </strong>Do you see real time collaboration beginning to play more of a role in Wikipedia &#8211; whether virtual worlds or just voice/IM? How could real time collaboration change the Wikipedia editing process?</p>
<p><strong>Jonathan:</strong> The Presidential candidate articles were being edited very rapidly yesterday. There are certain real time problems. Some of the more interesting problems are when you get two administrators who &#8220;get into it.&#8221; One administrator says, I am blocking this user, and the other one says, I am unblocking him, and the first one says, &#8220;NO, I am blocking him!&#8221; And so on&#8230; And everyone says, &#8220;Stop fighting. You are not allowed to do that!&#8221; And they both get their powers stripped. People do get very heated over the silliest things. Wikipedia does have some mailing lists attached and there are some IRC channels. So there are some real time elements.</p>
<p><strong>Tish: </strong>What is the role of avatars in Wikipedia?<br />
<br style="background-color: #ffffff;" /><span style="background-color: #ffffff;"><strong>Jonathan:</strong> In Wikipedia you have a user page, and many users are anonymous. They create an avatar, personalize it, and present themselves the way they want to through that avatar. In many ways it is a lot like Second Life.</span><br style="background-color: #ffffff;" /><br style="background-color: #ffffff;" /><span style="background-color: #ffffff;">Some users have created second accounts &#8211; sometimes a humorous second account. Bishzilla, for example, belongs to a Swedish lady who is in tremendous command of the English language and has a razor-sharp wit. She has created this secondary account that writes almost in a baby language. Her avatar is a dinosaur that is not very bright and goes around frying people. Bizarre what people do! People may be editing a topic like an interest they have &#8211; e.g. Pokemon &#8211; that they don&#8217;t want associated with their professional avatar. Or people may be editing a topic about hot political issues. There have actually been some death threats issued to people over stuff they have been putting into the encyclopedia.
</span><strong><br style="background-color: #ffffff;" /><br style="background-color: #ffffff;" /></strong><span style="background-color: #ffffff;"><strong>Tish: </strong>So avatars are important in Wikipedia.</span><br style="background-color: #ffffff;" /><br style="background-color: #ffffff;" /><span style="background-color: #ffffff;"><strong>Jonathan:</strong> Absolutely, because people may be going in and editing articles that they may not want their friends and family to know they are editing. One editor may say to another, &#8220;Stop putting stuff in or I will come and kill you!&#8221; Well, then we have to ban them. We have to call the police.</span><br style="background-color: #ffffff;" /><br style="background-color: #ffffff;" /><span style="background-color: #ffffff;"><strong>Tish:</strong> Can you build reputations on multiple avatars?</span><br style="background-color: #ffffff;" /><br style="background-color: #ffffff;" /><span style="background-color: #ffffff;"><strong>Jonathan: </strong>You are allowed to use multiple avatars as long as they don&#8217;t cross paths. You can&#8217;t have two avatars editing in the same area because you are going to be giving yourself double weight commenting on a discussion.
</span><br style="background-color: #ffffff;" /><br style="background-color: #ffffff;" /><br style="background-color: #ffffff;" /><span style="background-color: #ffffff;"><strong>Tish:</strong> How do you know when this is happening?</span><br style="background-color: #ffffff;" /><br style="background-color: #ffffff;" /><span style="background-color: #ffffff;"><strong>Jonathan:</strong> You can watch the style of a user&#8217;s editing. You have to watch behavior. And if you have enough evidence through behavior that suggests accounts are controlled by one person, you can go and request a technical check.</span><br style="background-color: #ffffff;" /><br style="background-color: #ffffff;" /><span style="background-color: #ffffff;">There are some users, called Checkusers, who are able to access information from the server logs and check the technical characteristics of these accounts to see if they are using the same IP address.</span><br style="background-color: #ffffff;" /><br style="background-color: #ffffff;" /><span style="background-color: #ffffff;"><strong>Tish:</strong> So if you want to understand avatar interaction on the web it helps to understand Wikipedia. </span><br style="background-color: #ffffff;" /><br style="background-color: #ffffff;" /><span style="background-color: #ffffff;"><strong>Jonathan:</strong> Yes, it is a fantastic way to understand how avatars work in some aspects, and also how to deal with community dynamics. We have some very strong willed people &#8211; people in their 40s, 50s, and 60s &#8211; who are very successful in business. They have plenty of money and spare time and they are doing this as a hobby. And some of these people can really butt heads. You can have a problem when you have an editor who has been writing fantastic articles but also happens to be rude and chew other people out and tell them to f**k off if they are not behaving.
What do you do?</span><br style="background-color: #ffffff;" /><br style="background-color: #ffffff;" /><span style="background-color: #ffffff;"><strong>Tish:</strong> Sounds a bit like Second Life!</span><br style="background-color: #ffffff;" /><strong><br style="background-color: #ffffff;" /></strong><span style="background-color: #ffffff;"><strong>Jonathan:</strong> The person is a great contributor to the community but they are telling noobies to f**k off, so you can&#8217;t allow that.</span><br style="background-color: #ffffff;" /><br style="background-color: #ffffff;" /><span style="background-color: #ffffff;">What do you do? Vested contributors are a major problem on some of these sites. They are vested in the community but they start misbehaving. You can&#8217;t block them, because if you block them there is a huge uproar from all their friends and it causes a cataclysm. It requires very careful diplomacy to deal with some of these situations. </span><br style="background-color: #ffffff;" /><strong><br style="background-color: #ffffff;" /></strong><span style="background-color: #ffffff;"><strong>Tish:</strong> How many Wikipedia volunteers are there now?</span><br style="background-color: #ffffff;" /><br style="background-color: #ffffff;" /><span style="background-color: #ffffff;"><strong>Jonathan:</strong> Think of a Venn Diagram &#8211; a big circle. The total number of contributors is about one million different people. But there are probably about 5,000 active editors that are consistently and regularly contributing. And within that kernel there are fifteen hundred people that have administrator access, and probably only eight hundred of them are active. People have a natural life span with the community. People come and typically stay for 6 months to 3 years. Usually after that they become bored, disillusioned or get into a conflict with someone. There is a natural tendency for people to stay for a while and move on.
Some people stay longer, a few, but the majority will move on at some point. So it is a lot of fresh faces moving in.</span><br style="background-color: #ffffff;" /><strong><br style="background-color: #ffffff;" /></strong><span style="background-color: #ffffff;"><strong>Tish:</strong> What lessons of trust does Wikipedia have to teach us about new projects like AMEE that aims to aggregate the world&#8217;s energy data?</span><br style="background-color: #ffffff;" /><br style="background-color: #ffffff;" /><span style="background-color: #ffffff;"><strong>Jonathan:</strong> Well, you have to know who is releasing the data. Who is creating the data? The beauty of Wikipedia is that you have an edit history, so you can see exactly who has done what. So you can judge whether this person is trustworthy or not. That&#8217;s a huge problem on the web today. We don&#8217;t have enough identification information. When you see a web page you don&#8217;t necessarily know when that page was created and by whom, or how many revisions it has had. Sometimes you can glean information by checking it. If you see typos and errors you may decide that that page probably didn&#8217;t receive as much attention as it should have, and probably it is not that good.</span> <br style="background-color: #ffffff;" /><br style="background-color: #ffffff;" /><span style="background-color: #ffffff;">Typos are an interesting thing. People always try to figure out how Google ranks web pages.
</span><a id="uy3s" style="background-color: #ffffff;" title="Matt Cutts" href="http://www.mattcutts.com/">Matt Cutts</a><span style="background-color: #ffffff;"> was here from Google today. And he was talking about spam. But Matt also did a <a id="e4lo" title="blog post" href="http://www.mattcutts.com/blog/2006-pubcon-in-vegas-getting-there-and-back/">blog post</a> about how he was in an airport once, and how he has a policy: when you are reading a document, as soon as you come to the first error, just stop, because if the author hasn&#8217;t taken the care to make everything correct, you don&#8217;t need to read it. So he was in the airport, there was a sign, he came to a typo and stopped reading it. Somehow he got in trouble for not reading the sign and not having the information. But it is interesting to think whether Google is looking for typos, misspellings, broken links and using that as a signal of quality to rank pages.</span><br style="background-color: #ffffff;" /></p>
<p><strong>Tish:</strong> Aaaagh, typos might bring down your page rank!!! That certainly is a scary thought for a blogger like me who likes to write impossibly long posts that are hard to check&#8230;</p>
]]></content:encoded>
			<wfw:commentRss>http://www.ugotrade.com/2008/12/29/hacking-the-world-in-2009-google-street-view-smart-stuff-and-wikiculture/feed/</wfw:commentRss>
		<slash:comments>7</slash:comments>
		</item>
		<item>
		<title>Smart Planet:Interview with Andy Stanford-Clark</title>
		<link>http://www.ugotrade.com/2008/12/15/smart-planetinterview-with-andy-stanford-clark/</link>
		<comments>http://www.ugotrade.com/2008/12/15/smart-planetinterview-with-andy-stanford-clark/#comments</comments>
		<pubDate>Mon, 15 Dec 2008 18:13:59 +0000</pubDate>
		<dc:creator><![CDATA[Tish Shute]]></dc:creator>
				<category><![CDATA[3D internet]]></category>
		<category><![CDATA[architecture of participation]]></category>
		<category><![CDATA[Augmented Reality]]></category>
		<category><![CDATA[Carbon Footprint Reduction]]></category>
		<category><![CDATA[culture of participation]]></category>
		<category><![CDATA[CurrentCost]]></category>
		<category><![CDATA[digital public space]]></category>
		<category><![CDATA[Ecological Intelligence]]></category>
		<category><![CDATA[Energy Awareness]]></category>
		<category><![CDATA[Energy Saving]]></category>
		<category><![CDATA[home automation]]></category>
		<category><![CDATA[home energy monitoring]]></category>
		<category><![CDATA[HomeCamp]]></category>
		<category><![CDATA[Instrumenting the World]]></category>
		<category><![CDATA[message brokers and sensors]]></category>
		<category><![CDATA[mirror worlds]]></category>
		<category><![CDATA[Mixed Reality]]></category>
		<category><![CDATA[MQTT and RSMB]]></category>
		<category><![CDATA[Open Grid]]></category>
		<category><![CDATA[open metaverse]]></category>
		<category><![CDATA[open source]]></category>
		<category><![CDATA[Open Source Virtual Worlds]]></category>
		<category><![CDATA[Paticipatory Culture]]></category>
		<category><![CDATA[smart appliances]]></category>
		<category><![CDATA[Smart Devices]]></category>
		<category><![CDATA[Smart Planet]]></category>
		<category><![CDATA[social media]]></category>
		<category><![CDATA[sustainable living]]></category>
		<category><![CDATA[sustainable mobility]]></category>
		<category><![CDATA[Virtual HomeCamp]]></category>
		<category><![CDATA[Web 2.0]]></category>
		<category><![CDATA[Web Meets World]]></category>
		<category><![CDATA[Web3.D]]></category>
		<category><![CDATA[World 2.0]]></category>

		<guid isPermaLink="false">http://www.ugotrade.com/?p=2404</guid>
		<description><![CDATA[&#8220;Smart Planet: Andy Stanford-Clark&#8217;s time has really come. His career of work in lightweight brokers and sensors is now going to pay off,&#8221; twittered James Governor (@monkchips), Redmonk, recently. The picture opening this post (from Andy Piper&#8217;s Flickr stream) was taken during Andy Stanford-Clark&#8217;s talk at The Inaugural HomeCamp (for more photos see Flickr &#8220;homecamp08&#8221;). [&#8230;]]]></description>
				<content:encoded><![CDATA[<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2008/12/andystanfordclark.jpg"><img class="alignnone size-full wp-image-2405" title="andystanfordclark" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2008/12/andystanfordclark.jpg" alt="" width="450" height="300" /></a></p>
<p><span class="entry-content"><em><strong>&#8220;Smart Planet: Andy Stanford-Clark&#8217;s time has really come. His career of work in lightweight brokers and sensors is now going to pay off,&#8221;</strong></em> <a href="http://twitter.com/monkchips/status/1029249885" target="_blank">twittered</a> </span><span class="entry-content">James Governor</span><span class="entry-content"> </span><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2008/12/andystanfordclark.jpg"><span class="entry-content">(</span></a><a id="qd8i" title="@monkchips" href="http://twitter.com/monkchips" target="_blank">@monkchips</a>), <a href="http://redmonk.com/">Redmonk,</a> recently<span class="entry-content">. </span></p>
<p><span class="entry-content">The picture opening this post (from <a id="wfe3" title="Andy Piper's Flickr stream" href="http://www.flickr.com/photos/andypiper/" target="_blank">Andy Piper&#8217;s Flickr stream</a>) was taken during Andy Stanford-Clark&#8217;s talk at <a id="exzg" title="The Inaugural HomeCamp" href="http://andypiper.wordpress.com/2008/12/01/the-inaugural-homecamp/">The Inaugural HomeCamp</a> (for more photos see <a id="hi96" title="Flickr &quot;homecamp08&quot;" href="http://www.flickr.com/photos/tags/homecamp08/" target="_blank">Flickr &#8220;homecamp08&#8221;</a>). </span></p>
<p><span class="entry-content"><a id="h4a0" title="HomeCamp '08" href="http://homecamp.pbwiki.com/homecamp08" target="_blank">HomeCamp &#8217;08</a> was organized by </span><a id="pnnc" title="Chris Dalby" href="http://www.yellowpark.net/cdalby/" target="_blank">Chris Dalby</a> and <a id="vqd3" title="Dale Lane" href="http://dalelane.co.uk/blog/" target="_blank">Dale Lane</a> and sponsored by <a href="http://currentcost.co.uk/">Current Cost</a> and <a href="http://redmonk.com/">Redmonk</a>. A<span class="entry-content"> video </span><span class="entry-content">of Andy Stanford-Clark&#8217;s talk</span><span class="entry-content">, by <a id="hwom" title="Andy Piper" href="http://andypiper.wordpress.com/" target="_blank">Andy Piper,</a></span><span class="entry-content"> </span><span class="entry-content"> is <a href="http://www.viddler.com/explore/andypiper/videos/21/" target="_blank">up </a></span><a id="k4xo" title="see the video taken" href="http://www.viddler.com/explore/andypiper/videos/21/" target="_blank"><span class="entry-content">on Viddler</span></a><span class="entry-content">. Also see </span>Andy Piper&#8217;s <a href="http://andypiper.wordpress.com/2008/04/27/current-cost/" target="_blank">post about CurrentCost meters</a> and most recently about <a href="http://andypiper.wordpress.com/2008/12/11/current-cost-monitoring-from-an-iphone/" target="_blank">running his CurrentCost meter&#8217;s graphs on his iPhone</a>.</p>
<p>Ambient displays were a hot topic at HomeCamp; see <a id="q39t" title="here" href="http://ambientdevices.com/cat/orb/orborder.html" target="_blank">here</a> and <a id="ss3w" title="here" href="http://ambientdevices.com/cat/index.html" target="_blank">here</a> for some good examples.</p>
<p><span class="entry-content">I </span><a id="pyxa" title="first wrote about IBM Master Inventor Andy Stanford-Clark's Home Automation project June of 2007" href="../../2007/06/05/extreme-life-logging-3d-experience-architects-digging-it-with-destroy-tv/" target="_blank">first wrote about IBM Master Inventor Andy Stanford-Clark&#8217;s Home Automation project in June of 2007</a><span class="entry-content">. At that time relatively few people were playing with home monitoring. But now the linchpin of Andy&#8217;s work &#8211;</span> MQTT and RSMB (Really Small Message Broker) &#8211; is available free on <a id="h0is" title="IBM AlphaWorks" href="http://alphaworks.ibm.com/tech/rsmb" target="_blank">IBM AlphaWorks</a> for anyone to download and play with.</p>
<p>This puts a key tool into the hands of developers and mashup artists ready to explore the possibilities of home automation as a generative technology that can bring the power of participatory culture to the urgent task of creating sustainable living. Andy points out:</p>
<p><em><strong>&#8220;Lots of people can start playing with home energy monitoring, social aspects of the data sharing, home automation, ambient displays, etc. The powerful thing about messaging middleware like MQTT is that you don&#8217;t have to worry about how to get the messages from A to B: you can focus on how to capture the data, and what to do with it when it gets to the other end.&#8221;</strong></em></p>
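<p>The decoupling Andy describes can be sketched in a few lines. This is a minimal in-process illustration of the publish/subscribe pattern that MQTT provides over the network &#8211; not the MQTT protocol itself &#8211; and the topic name and payloads are made up for the example:</p>

```python
# Minimal in-process sketch of the publish/subscribe pattern that MQTT
# implements over the network: publishers and subscribers only share a
# topic name, never a direct connection to each other.

class TinyBroker:
    def __init__(self):
        self.subscriptions = {}  # topic -> list of callbacks

    def subscribe(self, topic, callback):
        self.subscriptions.setdefault(topic, []).append(callback)

    def publish(self, topic, payload):
        # The publisher doesn't know (or care) who receives the message.
        for callback in self.subscriptions.get(topic, []):
            callback(topic, payload)

broker = TinyBroker()
readings = []

# A display subscribes to the house power topic...
broker.subscribe("house/power", lambda t, p: readings.append(p))

# ...and a meter publishes without knowing about the display.
broker.publish("house/power", 480)
broker.publish("house/power", 500)

print(readings)  # [480, 500]
```

<p>With a real broker like RSMB, the publish and subscribe calls travel over TCP, so the meter and the display can live on different machines and never need to know about each other.</p>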
<p>The full interview, which I did with Andy last week, appears later in this post.</p>
<p><span class="entry-content">Also recently, I did an <a id="gp5_" title="interview with Gavin Starks, founder of AMEE" href="../../2008/11/02/tim-oreilly-instrumenting-the-world/">interview with Gavin Starks, founder of AMEE</a>. </span>As a neutral data aggregation platform, &#8220;AMEE&#8217;s vision is to enable the measurement of the &#8216;Carbon Footprint&#8217; of everything on Earth.&#8221; A<span class="entry-content"><a id="cde2" title="A press release out yesterday" href="http://www.amee.com/?p=556"> press release last week</a> announced that a &#8220;co</span>llaboration between O&#8217;Reilly Alphatech Ventures (OATV), Union Square Ventures (USV) and The Accelerator Group (TAG) will enable AMEE to expand its reach by enhancing its data, and extend globally.<span class="entry-content">&#8221; </span><span class="entry-content"> </span></p>
<p>The combination of a neutral aggregation platform and MQTT and RSMB can enable new forms of data sharing to meet broader sustainability goals (see <a id="ol7c" title="my interview with Gavin for AMEE's direction re privacy and data sharing" href="../../2008/11/02/tim-oreilly-instrumenting-the-world/">my interview with Gavin for AMEE&#8217;s direction re privacy and data sharing</a>), and the kind of ecological intelligence that Larry Brilliant, Google.org, talked about at <a href="http://en.oreilly.com/web2008/public/content/home" target="_blank">Web 2.0 Summit</a>. Dan Goleman&#8217;s new book, <a title="&quot;Ecological Intelligence: How Knowing the Hidden Impacts of What We Buy Can Change Everything&quot;" href="http://www.randomhouse.ca/catalog/display.pperl?isbn=9780385527828" target="_blank">&#8220;Ecological Intelligence: How Knowing the Hidden Impacts of What We Buy Can Change Everything,&#8221;</a> will come out in April, 2009 (<a id="fkkt" title="see my previous post" href="../../2008/11/25/web-meets-world-participatory-culture-and-sustainable-living/">see my previous post</a>).</p>
<p>There is already a <a id="c-ox" title="virtual worlds integration to AMEE" href="http://carbongoggles.org/">virtual worlds integration with AMEE</a> by <a id="qg5." title="Jim Purbrick" href="http://jimpurbrick.com/">Jim Purbrick</a> of Linden Lab!<br />
<span class="entry-content"><br />
</span></p>
<h3>Links For HomeCamp &#8217;08</h3>
<p>Chris Dalby has a list of blog posts about homecamp in his <a id="vx_v" title="HomeCamp Review" href="http://www.yellowpark.net/cdalby/index.php/2008/12/10/home-camp-review/" target="_blank">HomeCamp Review</a>.</p>
<p><a href="http://dalelane.co.uk/blog/?p=318">Homecamp by Dale Lane</a><br />
<a href="http://nicktaylor.co.uk/2008/11/10/home-camp/">Home Camp Unconference &#8211; inspired me by the thoughts</a><br />
<a href="http://andypiper.wordpress.com/2008/12/01/the-inaugural-homecamp/">The Inaugural Homecamp<br />
</a><a href="http://www.tomtaylor.co.uk/blog/2008/11/30/homecamp-demand-shifting/">Home Camp Demand Shifting</a><a href="http://andypiper.wordpress.com/2008/12/01/the-inaugural-homecamp/"><br />
</a><a href="http://pbjots.blogspot.com/2008/11/homecamp-november-2008.html">Homecamp</a> from Phoebe Bright<br />
<a id="tti9" title="Homecamp '08" href="http://jamie.op-i.net/blog/" target="_blank">Homecamp &#8217;08</a><br />
<a id="lnis" title="HomeCamp Event: Andy Stanford-Clark's View" href="http://digital-lifestyles.info/2008/12/08/homecamp-event-andy-stanford-clarks-view/" target="_blank">HomeCamp Event: Andy Stanford-Clark&#8217;s View</a></p>
<h3>Virtual HomeCamp</h3>
<p><span class="entry-content">In 2007, I published the picture below (thanks to <a href="http://annieok.com" target="_blank">Annie Ok</a>, as Destroy Television, for the SL pics), which shows:</span></p>
<p>On the right is the virtualization of Andy&#8217;s RL house, which is part of a Second Life Real Life Home Automation project. The pictures in the bottom row show Mrs Stanford-Clark&#8217;s Real Life Llamas on the left and their virtual counterparts in Second Life on the right. Real and Virtual Llamas are linked through GPS and MQTT so people can &#8220;track the trek&#8221; when the llamas are out on a walk (see <a href="http://www-03.ibm.com/innovation/us/podcasts/blog_videocast.shtml">this IBM podcast</a>).</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2008/12/andysautomatedhouse.jpg"><img class="alignnone size-full wp-image-2409" title="andysautomatedhouse" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2008/12/andysautomatedhouse.jpg" alt="" width="448" height="338" /></a></p>
<p>I am currently working on a Virtual HomeCamp which will probably be nomadic from meetup to meetup but will kick off in Andy&#8217;s virtual house in Second Life. Andy Stanford-Clark, <a id="awwk" title="Adam Frisby" href="http://www.adamfrisby.com/blog/" target="_blank">Adam Frisby</a> (one of the founders of <a id="bc79" title="OpenSim" href="http://opensimulator.org/wiki/Main_Page" target="_blank">OpenSim</a> ), and Chris Dalby have all agreed to talk (more presenters to come!) at the first Virtual HomeCamp.</p>
<p>Charles Crinke of OpenSim has offered Virtual HomeCamp a patch of land on <a id="s58j" title="OSGrid" href="http://osgrid.org/" target="_blank">OSGrid,</a> and has offered to give a talk on interesting home automation projects to get started with in OpenSim. Charles has a cornucopia of great ideas!</p>
<p>And Kyle Gomboy (avatar G2 Proto) of the Microsoft Developer Community has set up an OpenSim on <a id="z:jr" title="ReactionGrid" href="http://reactiongrid.com/" target="_blank">ReactionGrid</a> that virtual HomeCampers can use to develop projects related to participatory culture and sustainable living.</p>
<p>The interview with Andy Stanford-Clark in this post gives Virtual HomeCampers some great ideas for good projects &#8220;that matter&#8221; to work on.</p>
<p>If you have a Second Life or OpenSim venue and you would like to offer your sim for a meetup &#8211; please let me know! Meetups will need to be streamed to the web as there is already a dynamic and rapidly growing HomeCamp community. See:</p>
<p><a id="mg60" title="HomeCamp Wiki" href="http://homecamp.pbwiki.com/" target="_blank">HomeCamp Wiki</a></p>
<h4 style="font-weight: normal;"><a href="http://homecamp.org.uk/">HomeCamp Blog</a></h4>
<h4 style="font-weight: normal;"><a href="http://upcoming.yahoo.com/event/1304370">HomeCamp on Upcoming</a></h4>
<h4 style="font-weight: normal;"><a href="http://www.facebook.com/events.php?ref=sb#/event.php?eid=43794919520">HomeCamp on Facebook</a></h4>
<h4 style="font-weight: normal;"><a href="http://groups.google.co.uk/group/homecamp?hl=en">Google Group Discussion</a></h4>
<p><a href="http://friendfeed.com/rooms/homecamp">FriendFeed Room</a></p>
<h3>Reducing the Carbon Footprint of Avatars and Getting Energy Awareness to the Masses</h3>
<p>As Andy notes:</p>
<p><em><strong>&#8220;We need to get energy awareness and energy saving to the masses; and by saying &#8220;you can reduce energy by interacting in a virtual 3D world&#8221;, just isn&#8217;t going to cut it for all but a very small fraction of the people we need to get to.&#8221;</strong></em></p>
<p>But, perhaps, some of our phenomenal OpenSim developers will push the envelope and produce the code that will make open source virtual worlds one of the most important future contributors to sustainable living. And, hopefully, Virtual HomeCamp will leverage both the collective intelligence of the web and the real time presence plus rapid prototyping capabilities unique to immersive 3D virtual worlds, to explore new ways to get energy awareness and energy saving to the masses in the short term as well as the long term.</p>
<p>And yes we will have to address the topic of those energy-hogging avatars!!!</p>
<p>Adam Frisby has been doing some interesting work with OpenSim that has the potential to reduce the energy consumption of VWs. And Michael Osias, IBM, told me:</p>
<p><em><strong>&#8220;We operate the IBM grid [100 OpenSims] on almost all virtual machines with Xen. Recently, we migrated the opensim appliance into the IBM Research cloud appliance catalog.&#8221;</strong></em></p>
<p>So I will definitely be calling on Michael and Adam to present on how server virtualization and cloud computing can reduce the carbon footprint of avatars.</p>
<h3>Setting Up Your Own Home Automation Hub</h3>
<p>There is an amazing choice of home automation technology becoming available now. Recently <a id="i0w2" title="Nokia announced their home automation ecosystem" href="http://www.engadget.com/2008/11/27/nokia-launching-z-wave-home-control-center-next-year/" target="_blank">Nokia announced their home automation ecosystem</a> &#8211; available in late 2009. And I recently saw <a id="sph0" title="The Apple Macintosh Z - Wave Home Automation System" href="http://www.automatedhome.co.uk/New-Products/Apple-Macintosh-Z-Wave-Home-Automation-System.html" target="_blank">The Apple Macintosh Z-Wave Home Automation System</a>. If you don&#8217;t already, start checking out <a href="http://www.automatedhome.co.uk/">Automated Home</a> for lots of good ideas and smart devices.</p>
<p>In the interview below, Andy describes how he achieves some impressive energy consumption reduction with some very affordable and readily available hardware, a little detective work, and a tip from his son to examine the energy consumption of the home automation set-up itself. And with the newly &#8220;available for free download&#8221; Really Small Message Broker on <a id="h0is" title="IBM AlphaWorks" href="http://alphaworks.ibm.com/tech/rsmb" target="_blank">IBM AlphaWorks</a>, IBM has given creative home automators a free vehicle to broker and share their data and integrate home automation in all the exciting ways we can come up with.</p>
<p>The pictures below (<a href="http://podcast.ubuntu-uk.org/2008/12/03/s01e19-love-letters/" target="_blank">see here for enlargements</a>) are the before and after shots of a streamlining effort Andy made on his own home automation setup.</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2008/12/img_9072-small.jpg"><img class="alignnone size-full wp-image-2416" title="img_9072-small" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2008/12/img_9072-small.jpg" alt="" width="200" height="150" /></a><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2008/12/img_9074-small.jpg"><img class="alignnone size-full wp-image-2417" title="img_9074-small" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2008/12/img_9074-small.jpg" alt="" width="200" height="150" /></a></p>
<p>Andy said:</p>
<p><em><strong>&#8220;I&#8217;ve moved my entire home automation system from the pile of equipment shown in the first photo, to a single Viglen MPC-L with a load of USB serial connections (second photo).</strong></em></p>
<p><em><strong> The pile of equipment I replaced is: A Cisco wireless access point, an IBM ThinkPad, a Linksys NSLU2 SLUG, an Arcom Viper, and an Arcom Field Sentry I/O box.<br />
</strong></em></p>
<p><em><strong>Moving to the Viglen and turning off all that lot has replaced 50W of always-on standby power with 10W, i.e. 40W less, or about £40 a year!&#8221;</strong></em></p>
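<p>Andy&#8217;s figure checks out on the back of an envelope, assuming a 2008-era UK tariff of roughly 11.5p per kWh (the tariff is my assumption, not Andy&#8217;s):</p>

```python
# Checking Andy's arithmetic: a constant 40 W saving, priced at an
# assumed UK tariff of roughly 11.5p per kWh (2008-era figure).

watts_saved = 50 - 10          # always-on standby: before minus after
hours_per_year = 24 * 365
kwh_per_year = watts_saved * hours_per_year / 1000
pounds_per_year = kwh_per_year * 0.115  # assumed tariff in pounds per kWh

print(round(kwh_per_year))     # 350 kWh/year
print(round(pounds_per_year))  # about 40 pounds
```

<p>The general rule of thumb: every watt of always-on load is about 8.8 kWh a year, so a continuous 40 W load costs roughly a pound per watt per year at that tariff.</p>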
<p>See Chris Dalby&#8217;s post, <a href="http://www.yellowpark.net/cdalby/index.php/2008/12/15/viglen-mpc-l-useful-commands-and-tips/" target="_blank">Viglen MPC-L Useful Commands and Tips</a>.</p>
<h2>Interview With Andy Stanford-Clark</h2>
<p><strong>Tish Shute:</strong> I just got a good question for you from Gavin Starks of AMEE: &#8220;if the Stern report is going to be out by 100% by 2020, and we have to start seeing an actual reduction of 10% per annum starting next year: What would you do, personally?&#8221; (See <a href="http://www.climatesafety.org" target="_blank">The Climate Safety</a> report, backed by IPCC).<br />
<strong><br />
Andy Stanford-Clark:</strong> Oh, man! Now you&#8217;re asking the tough questions!</p>
<p>We have to change attitudes, otherwise just a few people making a noise about this stuff isn&#8217;t going to make any significant difference &#8211; and the way to change attitudes is by starting to make people aware of just how much energy various things we have, and things we do, take. But it needs to be something in each person&#8217;s home, that&#8217;s not &#8220;in your face&#8221;&#8230; something more subtle &#8211; &#8220;ambient&#8221;&#8230; otherwise people reject it out of hand.</p>
<p>Also, people are suspicious of the power companies asking us to use less power: &#8220;what, give you less money?? Surely there&#8217;s a catch?&#8221; This is a real problem. Someone phoned one of the power companies here and accused them of sending her an energy monitor that would suck electricity out of the wall socket at night, to INCREASE her bill! If that&#8217;s the kind of thing we&#8217;re up against, it&#8217;s going to be a long journey!</p>
<p><strong>Tish:</strong> So what is the best way to change attitudes &#8211; have you seen projects like <a id="c.tc" title="Wattzon" href="http://www.wattzon.com/" target="_blank">Wattzon</a>?</p>
<p><strong>Andy SC:</strong> Yes, projects like Wattzon are exactly the kind of thing that start to make people realise the true cost of wasting energy.</p>
<p>Personally, my family has reduced our home electricity bill by 30%, which is great! But my neighbours didn&#8217;t, nor the other 4 billion or so people who have electricity.<br />
<strong><br />
Tish:</strong> How did you reduce your consumption so much?<br />
<strong><br />
Andy SC:</strong> We reduced our home electricity bill when we got a <a id="w57x" title="currentcost meter" href="http://www.automatedhome.co.uk/Announcements/Reduce-Your-Bills-with-Smart-Home-Power-Monitoring.html" target="_blank">currentcost meter</a> &#8211; a plug-in energy monitor which gives a total for the whole house.Â  When we got it, it showed up really quickly a couple of things&#8230;. that our &#8220;standby power&#8221; was really high (i.e. in the middle of the night, when everyone&#8217;s asleep, you creep up to the meter with a torch (flashlight &lt;grin&gt;) and see what it&#8217;s showing).</p>
<p>That was about 500 Watts before we started paying attention to it. The other thing was the lights.. I had no idea the lights in the kitchen used 480 Watts.. we just used to leave them on all the time when we were in the house. A simple change, once I realised: turn them off when you leave the room!</p>
<p>Our standby power was really high because I had a load of geeky home automation stuff running, and my first-generation, homebrew, energy monitoring solution (how ironic!)&#8230; which included 3 laptops doing various things (monitoring data and displaying information round the house). I just didn&#8217;t think about the cost.</p>
<p>So one weekend we went round the house making an inventory in each room of things that were on (the children were keen to help!). That enabled me to pretty much track down the whole 500 W&#8230; there were a few things that took some sleuthing, like the alarm system and the central heating controller. We used <a id="asuo" title="a plug-in meter" href="http://www.amazon.co.uk/gp/product/B000Q7PJGW?ie=UTF8&amp;tag=markmccall&amp;linkCode=as2&amp;camp=1634&amp;creative=19450&amp;creativeASIN=B000Q7PJGW" target="_blank">a plug-in meter</a> to see what individual appliances were using.. a really useful diagnostic aid.</p>
<p>It&#8217;s worth having a look at <a id="wjzg" title="AutomatedHome's review" href="http://www.automatedhome.co.uk/Announcements/Reduce-Your-Bills-with-Smart-Home-Power-Monitoring.html" target="_blank">AutomatedHome&#8217;s review</a> of these energy monitoring products, by the way.</p>
<p>So I turned off a load of things that were sitting there on standby.. things like stereo, microwave, scanner, Wii, power bricks&#8230; each taking 4-6 Watts just doing nothing &#8211; each one small, but it all adds up. The big hitters were the PCs&#8230; turned off 3 of those, and consolidated onto a low power (10W) <a id="ym7y" title="linux server (Viglen MPC)" href="http://www.viglen.co.uk/viglen/Products_Services/Product_Range/Product_file.aspx?eCode=XUBUMPCL&amp;Type_Info=Description&amp;Type=Desktops&amp;GUID=" target="_blank">Linux server (Viglen MPC-L)</a>&#8230; so that got our standby power down to 180 watts. And that, combined with being proactive about turning off lights, reduced our power usage from 900 kWh a month to 600&#8230; i.e. 30% and it has been at that for 4 months now.</p>
<p><strong>Tish:</strong> Interesting that your home automation was one of the power issues as I am an aspiring home automator myself!</p>
<p><strong>Andy SC:</strong> Yes, you have to strike a balance of using energy to save energy, and make sure you know what your standby power is. There are a number of home energy monitors available &#8211; there&#8217;s a <a id="qy1h" title="review" href="http://www.automatedhome.co.uk/Announcements/Reduce-Your-Bills-with-Smart-Home-Power-Monitoring.html" target="_blank">review</a> on the AutomatedHome blog. The CurrentCost meter has a handy serial port so you can plug it into a computer to download history data or make it live on the internet.</p>
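<p>[Editor&#8217;s note: a minimal sketch of reading that serial feed. CurrentCost meters are commonly documented as emitting one line of XML per reading; the element names and the sample string below are assumptions &#8211; check your own meter&#8217;s manual. In real use you would read lines from the serial port (e.g. with pyserial, <code>serial.Serial("/dev/ttyUSB0", 57600)</code>) and call the parser on each one.]</p>

```python
import xml.etree.ElementTree as ET

def parse_reading(xml_line):
    """Extract whole-house watts and temperature from one line of
    CurrentCost-style XML. Element names (<tmpr>, <ch1><watts>) follow
    the commonly documented format - an assumption, not gospel."""
    msg = ET.fromstring(xml_line)
    watts = int(msg.findtext("ch1/watts"))   # e.g. "00480" -> 480
    temp = float(msg.findtext("tmpr"))       # meter's ambient temperature
    return watts, temp

# Illustrative sample line, as a meter might emit it:
sample = ("<msg><src>CC128</src><time>23:11:05</time>"
          "<tmpr>18.7</tmpr><ch1><watts>00480</watts></ch1></msg>")
print(parse_reading(sample))  # (480, 18.7)
```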
<p><strong>Tish:</strong> That is interesting because it opens the door to having a social energy network, doesn&#8217;t it?</p>
<p><strong>Andy SC:</strong> Yes.. absolutely&#8230; you should watch <a id="i28f" title="my intro talk at homecamp" href="http://www.viddler.com/explore/andypiper/videos/21/" target="_blank">my intro talk at HomeCamp</a>! About 50 of us at IBM in the UK (and one in Australia!) have put our home energy graphs online using a currentcost meter plus a cheap low power Linux server like the Viglen MPC-L or Linksys NSLU2 (SLUG) type devices.</p>
<p>And a community has formed around the graphs (I described this in my HomeCamp talk at some point).. so people ask what&#8217;s that spike, or why&#8217;s yours so high in the morning, or how do you get your standby power so low.. and people talk about it and exchange ideas. There&#8217;s a facebook group (currentcost) too, with people talking about this.</p>
<p>And there&#8217;s some peer pressure too.. if my power is really high compared with everyone else, I feel bad about it and see what I can do to reduce it.. or if not reduce it, at least know why it&#8217;s high, and have been through a process to justify that to myself.</p>
<p><strong>Tish:</strong> You mentioned earlier that it was important to have ambient solutions, not &#8220;in your face&#8221; messages from Big Brother like &#8220;turn your lights off now!&#8221; What kind of &#8220;ambient&#8221; solutions have you been working on?</p>
<p><strong>Andy SC: </strong>Ok &#8211; <a id="ewgg" title="ambient" href="http://ambientdevices.com/cat/index.htm" target="_blank">ambient devices</a> &#8230; so an <a id="stq:" title="&quot;orb&quot; is a good example" href="http://ambientdevices.com/cat/orb/orborder.html" target="_blank">&#8220;orb&#8221; is a good example</a>.. wired up to the home automation system, or the energy monitor.. or maybe even controlled by the power company&#8230;</p>
<p>It glows different colours (e.g. blue through red, or red/amber/green) to tell me how &#8220;healthy&#8221; the house is from an energy point of view. So I don&#8217;t have to open a browser and pull up a geeky graph and analyse it.. it just lets me know subconsciously how we&#8217;re doing.</p>
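<p>[Editor&#8217;s note: the orb&#8217;s behaviour boils down to mapping a watts reading onto a colour band. A minimal sketch &#8211; the threshold values here are made up and would be tuned to a house&#8217;s own standby baseline.]</p>

```python
def orb_colour(watts, green_below=250, amber_below=600):
    """Map whole-house power draw to a traffic-light colour.
    Thresholds are illustrative, not from the original system."""
    if watts < green_below:
        return "green"   # all quiet - just standby load
    if watts < amber_below:
        return "amber"   # something's running - dishwasher, maybe
    return "red"         # heavy draw - worth investigating

print([orb_colour(w) for w in (120, 480, 900)])  # ['green', 'amber', 'red']
```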
<p><strong>Tish:</strong> But it doesn&#8217;t necessarily help you find out what your problem is, right?</p>
<p><strong>Andy SC:</strong> In our house, it&#8217;s in my study, so when I go to bed, for example, I glance in to see it, and if it&#8217;s green, all is good&#8230; but if it&#8217;s still amber or red(!), then I think.. hmm &#8211; what&#8217;s still on.. oh, the dishwasher.. ok &#8211; that will finish soon&#8230; or.. oh, I left the heater on .. I&#8217;ll go and turn it off.</p>
<p><strong>Tish:</strong> What do you have to help you troubleshoot the problem?</p>
<p><strong>Andy SC:</strong> If the orb doesn&#8217;t jog your memory, then you can pull up the graph to give more information, or a dashboard which shows various things that are turned on, both of which help with knowing what&#8217;s going on.</p>
<p><strong>Tish: </strong>And how to fix it?</p>
<p><strong>Andy SC:</strong> Yes, so if things are on X10 or other appliance control systems like <a id="rv3d" title="Bye Bye Standby" href="http://www.byebyestandby.co.uk/" target="_blank">Bye Bye Standby</a>, for example, and under computer control, then you can have a dashboard of what&#8217;s on so you can see.</p>
<p><strong>Tish:</strong> Good interfaces to home automation seem to be a problem yet to be solved?</p>
<p><strong>Andy SC:</strong> There&#8217;s at least one company which has technology to analyze your power usage (voltage and current together) to &#8220;learn&#8221; which appliance has which profile on the graph, so you can see what&#8217;s on that&#8217;s using lots of power and also get a pie chart view of the whole house with slices showing different appliances &#8211; so many % for the TV, so many for freezer, etc. that&#8217;s <a id="k2ca" title="Onzo.com" href="http://www.onzo.co.uk/" target="_blank">Onzo.com</a> . Their product isn&#8217;t out yet, but will give a much finer grain understanding of what&#8217;s using the power in your home.</p>
<p>There are also some &#8220;IAM&#8217;s&#8221;.. Individual Appliance Monitors, which are like the plug-in meter I showed you, but with a (usually wireless) link back to a base station to tell you how much power is flowing through each of them. So by knowing what appliances you plugged into your IAMs round the house, you can break out the usage by appliance. And if they&#8217;re 2-way, which some of them will be, you can have the computer turn them off if you tell it, say from the web, or your mobile phone, etc. Or maybe the home automation system will make an autonomous decision to turn it off for you!</p>
<p>Back to interfaces to home automation: there are two typical approaches &#8211; PLC (power line carrier) like X10, and wireless (like Bye Bye Standby, etc)&#8230; there are computer interfaces to both, but it&#8217;s all still quite expensive (in UK at least &#8211; cheaper in the US because X10 is more ubiquitous)&#8230;Â  but the cheaper ones don&#8217;t tell you that they definitely turned the device on or off &#8211; all you know is that the command was sent out. It might not have got there, so you don&#8217;t <em>really</em> know if the heater got turned off.. unless you monitor it by some secondary means, like seeing if the temperature goes down, or if the power usage goes down, or (for a light) if the room goes dark, or whatever.</p>
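<p>[Editor&#8217;s note: that &#8220;command sent, but did it arrive?&#8221; problem can be sketched as a send-then-verify loop. Both callbacks below are placeholders for whatever controller and meter you actually have &#8211; the names are illustrative, not a real X10 or Bye Bye Standby API.]</p>

```python
import time

def switch_off_and_verify(send_command, read_power_watts,
                          threshold=5, timeout=10):
    """Send an 'off' command, then confirm it via a secondary reading.

    X10-style links give no acknowledgement, so the only way to know
    the command arrived is to watch a side effect - here, whether the
    measured power draw drops below `threshold` watts."""
    send_command("off")
    deadline = time.time() + timeout
    while time.time() < deadline:
        if read_power_watts() < threshold:
            return True          # the device really went off
        time.sleep(0.5)          # poll the meter again
    return False                 # command may have been lost - retry?

# Quick demo with fakes standing in for real hardware:
state = {"watts": 600}
ok = switch_off_and_verify(
    lambda cmd: state.update(watts=0),   # pretend the heater obeyed
    lambda: state["watts"],
    timeout=2)
print(ok)  # True
```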
<p>BTW, my standby is now down to 120 watts.</p>
<p><strong>Tish:</strong> Yes!</p>
<p><strong>Andy SC:</strong> I consolidated some more home automation stuff into one device.. there are two photos on <a id="i-2g" title="this page" href="http://podcast.ubuntu-uk.org/2008/12/03/s01e19-love-letters/" target="_blank">this page</a> &#8211; my &#8220;before&#8221; and &#8220;after&#8221; shots. It gets a mention in the podcast. They did a promotion on the low power Viglen servers.. &#163;80 instead of &#163;150&#8230; bargain! Loads of people have bought them for home automation.. you can&#8217;t have failed to see the #viglen references on twitter over the past few months!</p>
<p><strong>Tish:</strong> I think there is a lot of enthusiasm for virtual worlds as a good interface for home automation. But we need to come up with something simple enough for everyone?</p>
<p><strong>Andy SC:</strong> Yes, virtual worlds are very interesting.. though let&#8217;s not mention the carbon cost of running a VW!</p>
<p>So you know already, I think, that I can control my home automation stuff from SL&#8230; if I turn on my lights in SL, my FL (first life, i.e. here!) lights turn on, and also meter reading.. my live electricity and water meter readings are displayed on virtual meters in my virtual house so the meter reader doesn&#8217;t even need to drive to my house &lt;grin&gt; and the orb is there too, so I can see how healthy the house is, energy-wise, in-world.</p>
<p>Imagine a row of houses each glowing blue through red according to its power use &#8211; peer pressure again. If you have local generation.. the power hogs could be made to feel guilty for using all the town&#8217;s energy from the wind farm or gas turbine generator.</p>
<p><strong>Tish:</strong> So everyone would see if you have a Bad House &#8211; eeek!</p>
<p><strong>Andy SC: </strong>right!</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2008/12/slmeterpost.jpg"><img class="alignnone size-full wp-image-2410" title="slmeterpost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2008/12/slmeterpost.jpg" alt="" width="450" height="332" /></a></p>
<p><span class="entry-content"> <em>The picture above shows Andy Stanford-Clark&#8217;s electricity meter in Second Life. </em></span></p>
<p><strong>Tish:</strong> Yes and the great thing about a VW is you get a sense of confidence your controls are working and how to adjust them. But yes the carbon cost is one of the obstacles.</p>
<p>Do you think the power hogging sims of Virtual Worlds could be improved by server virtualization techniques and clouds &#8211; I know there is an IBMer here in US who is working on server virtualization integrated into OpenSim?</p>
<p><strong><br />
Andy SC:</strong> Yes, cloud technologies have a lot to offer in terms of making best use of a set of machines to run a set of applications, rather than one machine per application as often tends to be the case.</p>
<p>And with dynamic load balancing, like we do for our sporting event on-demand server farms for things like Wimbledon, as the load ramps up, we squeeze out the other apps that are using the farm to give extra capacity (as Wimbledon takes priority in that instance!)</p>
<p>But there was a popular statistic when SL became really popular &#8211; over a year ago now, that was something like to have an avatar in SL for a year was the same carbon footprint as driving an SUV from NY to SF or something &#8211; don&#8217;t quote me on that till we check it &#8230; <a id="ymnc" title="here it is" href="http://www.roughtype.com/archives/2006/12/avatars_consume.php" target="_blank">here it is</a> &#8211; 2000 miles</p>
<p><strong>Tish:</strong> Yes I remember <a href="http://www.ugotrade.com/2008/06/27/ibms-virtual-wimbledon-web-rendering-in-second-life/" target="_blank">Judge telling me about some of the interesting load balancing you do at Wimbledon</a>.</p>
<p>Many of my friends are thinking ahead to AR solutions now too.</p>
<p><strong>Andy SC:</strong> Yeah &#8211; AR very interesting too.. you have to read Halting State by Charles Stross</p>
<p><strong>Tish:</strong> Yes loved it!</p>
<p><strong>Andy SC:</strong> So &#8220;Halting State is to 15 years&#8217; time as SnowCrash was to NOW, 15 years ago&#8221;</p>
<p>SnowCrash is effectively a history book now.</p>
<p>Yeah, I think AR with glasses and overlays is going to be really cool! In cars too.. heads up satnav..</p>
<p><strong>Tish:</strong> Also could you tell me the role of the messaging technology you developed in all this?<br />
<strong><br />
Andy SC: </strong><a id="g.i:" title="using MQTT" href="http://mqtt.org/" target="_blank">using MQTT</a> of course.. which is the area I have been working on with my team for the past 10 years: the IBM messaging technology which underpins all this cool stuff we&#8217;re doing for home automation, energy monitoring, inter-world messaging.. all that stuff.. all using MQTT and WebSphere messaging technology.</p>
<p><strong>Tish:</strong> I would be interested to know more about how you see VR and AR with what we have available today producing a cool interface for home automation that could get some mass traction.</p>
<p><strong>Andy SC:</strong> So I think the AR/VR thing.. at the moment, far too few people are using these technologies.. we need to get energy awareness and energy saving to the masses (looping back round to the original Gavin Starks question!)&#8230; and saying &#8220;you can reduce energy by interacting in a virtual 3D world&#8221; just isn&#8217;t going to cut it for all but a very small fraction of the people we need to get to.</p>
<p><strong>Tish: </strong>Yes, in basic ambient ways first. How does the messaging technology you have developed open up possibilities for leveraging network effects and creating new forms of participatory culture around reducing consumption?</p>
<p><strong>Andy SC:</strong> It is important because the messaging allows the real-time interaction that can be used to give dynamic feedback, and it&#8217;s that immediacy which makes people react to changes.</p>
<p>And, with MQTT and RSMB &#8211; Really Small Message Broker, which is now available free on <a id="h0is" title="IBM AlphaWorks" href="http://alphaworks.ibm.com/tech/rsmb" target="_blank">IBM AlphaWorks</a> for anyone to download and play with, lots of people can start playing with home energy monitoring, social aspects of the data sharing, home automation, ambient displays, etc. without having to worry about how to get the messages from A to B.. that bit&#8217;s done for you.. you can just focus on the interesting stuff. Folks at HomeCamp got quite excited about it! And for those who care (e.g. if you want to link your home in to infrastructure like the power company or distributed building management, or whatever) then the MQTT and RSMB technology is compatible with IBM&#8217;s WebSphere enterprise messaging products, and so can link right in.</p>
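<p>[Editor&#8217;s note: the decoupling Andy describes &#8211; &#8220;that bit&#8217;s done for you&#8221; &#8211; is the publish/subscribe pattern. A toy in-memory sketch of the idea follows; this is <em>not</em> RSMB or MQTT, just an illustration of how publishers and subscribers share only a topic name, never a direct connection to each other.]</p>

```python
from collections import defaultdict

class TinyBroker:
    """A toy in-memory publish/subscribe hub, illustrating the pattern
    RSMB implements for real (over the network, with MQTT framing)."""
    def __init__(self):
        self.subs = defaultdict(list)

    def subscribe(self, topic, callback):
        self.subs[topic].append(callback)

    def publish(self, topic, payload):
        for cb in self.subs[topic]:   # one feed in, many feeds out
            cb(topic, payload)

broker = TinyBroker()
log = []
# Two independent consumers of the same energy feed - a grapher and an
# ambient orb - neither knows the meter exists, only the topic name.
broker.subscribe("house/power", lambda t, p: log.append(("graph", p)))
broker.subscribe("house/power", lambda t, p: log.append(("orb", p)))
broker.publish("house/power", 480)   # e.g. watts from the meter
print(log)  # [('graph', 480), ('orb', 480)]
```

<p>[The point of the sketch: adding a new display or data source is one <code>subscribe</code> or <code>publish</code> call, which is why the incremental cost of new devices stays low once the hub exists.]</p>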
<p><strong>Tish:</strong> So people could use this to build some interfaces with projects like AMEE say? For example letting you know when your light bulb went out which was the most energy efficient one to replace it with?</p>
<p><strong>Andy SC:</strong> Yes, indeed.. was talking to <a href="http://www.pachube.com/" target="_blank">Pachube</a> this morning, as another example.</p>
<p><strong>Tish:</strong> What did you discuss with Pachube?</p>
<p><strong>Andy SC:</strong> Using MQTT as the feed to get EML data into and out of Pachube rather than over HTTP. That&#8217;s interesting because MQTT is a much more lightweight protocol, designed for small sensors and low bandwidth / expensive (e.g. cellular) networks&#8230; and it&#8217;s also true push.. i.e. data is pushed to you directly from the broker (the hub in the middle), rather than you having to ask for it constantly (polling). It is an easy way to interface existing MQTT/RSMB home automation or energy monitoring systems into Pachube and it&#8217;s scalable publish/subscribe.. so one data feed in, many data feeds out. This opens up lots of new possibilities for Pachube feeds. <a id="knkj" title="Pachube" href="http://www.pachube.com/feeds/1214" target="_blank">Here is one Pachube feed coming from MQTT.</a></p>
<p><strong>Tish:</strong> Ah yes, no polling! That is a killer in HTTP</p>
<p><strong>Andy SC:</strong> Absolutely!!!</p>
<p><strong>Tish:</strong> And other examples of interfaces using MQTT with potential applications in the sustainability area are &#8230;<br />
<strong><br />
Andy SC:</strong> The power graphs (as described in my talk) are a good example. Also when people start generating their own power with PV or wind, they&#8217;ll want to monitor the contribution their power plant is making to their power usage, and compare it with spot prices on the grid, weather data, etc, etc. These are exactly the kinds of data feeds that MQTT is great for.</p>
<p><strong>Tish: </strong>As you said, the most important aspect of MQTT is that it frees people up from having to worry about getting messages from A to B so they can &#8220;start playing with home energy monitoring, social aspects of the data sharing, home automation, ambient displays, etc. &#8230; How to capture the data.. and what to do with it when it gets to the other end of the comms link.&#8221;</p>
<p><strong>Andy SC: </strong>Yes, exactly &#8211; the incremental cost of adding new devices and applications is very low, once you&#8217;ve got the messaging infrastructure in place. So once you&#8217;ve got your home RSMB hub set up, it becomes easy to integrate new data sources and play with new applications which use that data in interesting ways!</p>
<p>I&#8217;m fascinated by the social aspects of energy saving &#8211; the way communities have formed around the graphs we&#8217;re generating from the currentcost data. I&#8217;m sure that&#8217;s only the tip of an iceberg &#8211; it&#8217;s still quite geeky, but if you start to bring in some kind of gaming or competitive element, then I think harnessing the peer pressure and competitive spirit in people will be a powerful way to encourage change in people&#8217;s energy-using habits.</p>
<p>Ambient displays are another area of interest &#8211; the orb is just one way of doing it. Using twitter to keep you ambiently aware of what&#8217;s going on is another, and there are other media like sound and images, which can tell you things in a subtle way. Lots of scope for more experiments <img src="http://www.ugotrade.com/wordpress/wp-includes/images/smilies/icon_smile.gif" alt=":)" class="wp-smiley" /> </p>
]]></content:encoded>
			<wfw:commentRss>http://www.ugotrade.com/2008/12/15/smart-planetinterview-with-andy-stanford-clark/feed/</wfw:commentRss>
		<slash:comments>15</slash:comments>
		</item>
	</channel>
</rss>
