<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>UgoTrade &#187; Google Wave</title>
	<atom:link href="http://www.ugotrade.com/tag/google-wave/feed/" rel="self" type="application/rss+xml" />
	<link>http://www.ugotrade.com</link>
	<description>Augmented Realities at the Edge of the Network</description>
	<lastBuildDate>Wed, 25 May 2016 15:59:56 +0000</lastBuildDate>
	<language>en-US</language>
		<sy:updatePeriod>hourly</sy:updatePeriod>
		<sy:updateFrequency>1</sy:updateFrequency>
	<generator>https://wordpress.org/?v=3.9.40</generator>
	<item>
		<title>The Missing Manual for the Future: Tim O&#8217;Reilly&#8217;s Four Cylinder Innovation Engine</title>
		<link>http://www.ugotrade.com/2010/10/31/tim-o%e2%80%99reilly%e2%80%99s-four-cylinder-innovation-engine-the-missing-manual-for-the-future/</link>
		<comments>http://www.ugotrade.com/2010/10/31/tim-o%e2%80%99reilly%e2%80%99s-four-cylinder-innovation-engine-the-missing-manual-for-the-future/#comments</comments>
		<pubDate>Sun, 31 Oct 2010 21:25:02 +0000</pubDate>
		<dc:creator><![CDATA[Tish Shute]]></dc:creator>
				<category><![CDATA[Ambient Devices]]></category>
		<category><![CDATA[Ambient Displays]]></category>
		<category><![CDATA[Android]]></category>
		<category><![CDATA[architecture of participation]]></category>
		<category><![CDATA[Augmented Reality]]></category>
		<category><![CDATA[Bar Camp]]></category>
		<category><![CDATA[culture of participation]]></category>
		<category><![CDATA[digital public space]]></category>
		<category><![CDATA[Energy Awareness]]></category>
		<category><![CDATA[Instrumenting the World]]></category>
		<category><![CDATA[internet of things]]></category>
		<category><![CDATA[iphone]]></category>
		<category><![CDATA[mirror worlds]]></category>
		<category><![CDATA[Mixed Reality]]></category>
		<category><![CDATA[mobile augmented reality]]></category>
		<category><![CDATA[mobile meets social]]></category>
		<category><![CDATA[Mobile Reality]]></category>
		<category><![CDATA[new urbanism]]></category>
		<category><![CDATA[online privacy]]></category>
		<category><![CDATA[Participatory Culture]]></category>
		<category><![CDATA[privacy and online identity]]></category>
		<category><![CDATA[social gaming]]></category>
		<category><![CDATA[social media]]></category>
		<category><![CDATA[sustainable living]]></category>
		<category><![CDATA[ubiquitous computing]]></category>
		<category><![CDATA[Web 2.0]]></category>
		<category><![CDATA[Web Meets World]]></category>
		<category><![CDATA[#w2e]]></category>
		<category><![CDATA[algorithmic economies]]></category>
		<category><![CDATA[arduino]]></category>
		<category><![CDATA[Area/Code]]></category>
		<category><![CDATA[ARWave]]></category>
		<category><![CDATA[Battlestorm]]></category>
		<category><![CDATA[Chris Arkenberg]]></category>
		<category><![CDATA[Cloudera]]></category>
		<category><![CDATA[counter surveillance]]></category>
		<category><![CDATA[Credit Suisse trading bots]]></category>
		<category><![CDATA[CrowdFlower]]></category>
		<category><![CDATA[data is gasoline]]></category>
		<category><![CDATA[Defeating Big Brother]]></category>
		<category><![CDATA[Dennis Crowley]]></category>
		<category><![CDATA[Dr Alex Kilpatrick]]></category>
		<category><![CDATA[ecologies of human and machine intelligence]]></category>
		<category><![CDATA[Esther Dyson]]></category>
		<category><![CDATA[Facebook for Data]]></category>
		<category><![CDATA[food52]]></category>
		<category><![CDATA[Four Square]]></category>
		<category><![CDATA[Four Square and Dodge Ball]]></category>
		<category><![CDATA[Four Square API]]></category>
		<category><![CDATA[Fred Wilson]]></category>
		<category><![CDATA[Games That Know Where You Live]]></category>
		<category><![CDATA[geopollster]]></category>
		<category><![CDATA[Glympse]]></category>
		<category><![CDATA[Google Wave]]></category>
		<category><![CDATA[Gov 2.0]]></category>
		<category><![CDATA[Hackett]]></category>
		<category><![CDATA[Hadoop World]]></category>
		<category><![CDATA[high frequency trading]]></category>
		<category><![CDATA[hour.ly]]></category>
		<category><![CDATA[iPad]]></category>
		<category><![CDATA[iphone apps]]></category>
		<category><![CDATA[Jet Packs]]></category>
		<category><![CDATA[jetpack]]></category>
		<category><![CDATA[John Battelle's Points of Control Map]]></category>
		<category><![CDATA[Kevin Slavin]]></category>
		<category><![CDATA[Knight Foundation]]></category>
		<category><![CDATA[Lars Rasmussen]]></category>
		<category><![CDATA[Layar]]></category>
		<category><![CDATA[Loitering on the Motherboard]]></category>
		<category><![CDATA[machine to machine data]]></category>
		<category><![CDATA[machine to machine intelligence]]></category>
		<category><![CDATA[Macon Money]]></category>
		<category><![CDATA[Madagascar Institute]]></category>
		<category><![CDATA[Maker Faire]]></category>
		<category><![CDATA[Mary Haskett]]></category>
		<category><![CDATA[Mike Olsen]]></category>
		<category><![CDATA[mobile social augmented reality]]></category>
		<category><![CDATA[Nanex]]></category>
		<category><![CDATA[Nanex API]]></category>
		<category><![CDATA[Next Jump]]></category>
		<category><![CDATA[Pachube]]></category>
		<category><![CDATA[Pachube API]]></category>
		<category><![CDATA[pathfinder]]></category>
		<category><![CDATA[people are the platform]]></category>
		<category><![CDATA[Platforms for Growth]]></category>
		<category><![CDATA[Points of Control Map]]></category>
		<category><![CDATA[Qualcomm vision based augmented reality SDK]]></category>
		<category><![CDATA[quant trading]]></category>
		<category><![CDATA[quantitative analysis]]></category>
		<category><![CDATA[real time data analytics]]></category>
		<category><![CDATA[real time internet]]></category>
		<category><![CDATA[Samasource]]></category>
		<category><![CDATA[sensor platforms]]></category>
		<category><![CDATA[Shazam]]></category>
		<category><![CDATA[Shazam for faces]]></category>
		<category><![CDATA[sousveillance]]></category>
		<category><![CDATA[stock market flash crash]]></category>
		<category><![CDATA[Strata]]></category>
		<category><![CDATA[surveillance bots]]></category>
		<category><![CDATA[The Battle for the Internet Economy]]></category>
		<category><![CDATA[The Battle of the Networks]]></category>
		<category><![CDATA[The Business of Data]]></category>
		<category><![CDATA[The Consequences of Living in a World of Data]]></category>
		<category><![CDATA[The Future: The Missing Manual]]></category>
		<category><![CDATA[The Gartner Hype Cycle]]></category>
		<category><![CDATA[the internet is a data operating system]]></category>
		<category><![CDATA[The Internet Operating System]]></category>
		<category><![CDATA[The Jet Ponies]]></category>
		<category><![CDATA[The Missing Manual For The Future]]></category>
		<category><![CDATA[Tim O'Reilly]]></category>
		<category><![CDATA[Tim O'Reilly's Four Cylinder Engine for Innovation]]></category>
		<category><![CDATA[Tim O'Reilly's Four Cylinder Innovation Engine]]></category>
		<category><![CDATA[trading bots]]></category>
		<category><![CDATA[Twitter for Sensors]]></category>
		<category><![CDATA[Union Square Ventures]]></category>
		<category><![CDATA[Ushahidi]]></category>
		<category><![CDATA[Usman Haque]]></category>
		<category><![CDATA[Valveless Pulse Jets]]></category>
		<category><![CDATA[WanderID]]></category>
		<category><![CDATA[Wave Federation Protocol]]></category>
		<category><![CDATA[Wave in a Box]]></category>
		<category><![CDATA[Web 2.0 Expo]]></category>
		<category><![CDATA[Web 2.0 Expo start ups]]></category>
		<category><![CDATA[Web 2.0 Summit]]></category>
		<category><![CDATA[Web Squared]]></category>
		<category><![CDATA[websquared]]></category>
		<category><![CDATA[Where 2.0]]></category>
		<category><![CDATA[William Gibson]]></category>

		<guid isPermaLink="false">http://www.ugotrade.com/?p=5985</guid>
		<description><![CDATA[The Missing Manual for The Future (or The Future: The Missing Manual) O&#8217;Reilly Media is famous for producing &#8220;missing manuals&#8221; for new technologies, but thinking of O&#8217;Reilly as just a publisher of books would be like saying Facebook is just a website (this came up in the discussion at the Media Round Table at Web [&#8230;]]]></description>
				<content:encoded><![CDATA[<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/Screen-shot-2010-10-11-at-11.40.56-AM.png"><img class="alignnone size-medium wp-image-5786" title="Screen shot 2010-10-11 at 11.40.56 AM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/Screen-shot-2010-10-11-at-11.40.56-AM-300x198.png" alt="Screen shot 2010-10-11 at 11.40.56 AM" height="198" width="300"></a></p>
<h3>The Missing Manual for The Future (or The Future: The Missing Manual)</h3>
<p>O’Reilly Media is famous for producing <a href="http://missingmanuals.com/" target="_blank">“missing manuals”</a> for new technologies, but thinking of O’Reilly as just a publisher of books would be like saying Facebook is just a website (this came up in the discussion at the Media Round Table at <a href="http://www.web2expo.com/">Web 2.0 Expo, NY, 2010</a>). In recent weeks, I managed to catch Tim O’Reilly at several events: <a href="http://makerfaire.com/newyork/2010/" target="_blank">Maker Faire</a>, <a href="http://www.web2expo.com/">Web 2.0 Expo</a>, <a href="http://www.cloudera.com/company/press-center/hadoop-world-nyc/" target="_blank">Hadoop World</a>, and the free webcast Tim did with John Battelle on <a href="http://radar.oreilly.com/2010/10/the-battle-for-the-internet-ec.html" target="_blank">The Battle for the Internet Economy</a> (although Tim spoke several other times during this period!).</p>
<p>It occurred to me, as I immersed myself in the depth and breadth of innovation showcased and discussed at these events, that Tim O’Reilly and the O’Reilly team are creating <b>The Missing Manual for the Future.</b></p>
<p>As Tim puts it, we are <b>“changing the world by spreading the knowledge of innovators.”</b> Tim uses a quote from William Gibson to illuminate what is at the heart of the O’Reilly project:</p>
<p><b>“The future is here, it is just not evenly distributed yet.” (William Gibson)</b></p>
<p>But Tim O’Reilly makes another point about the future when he speaks. The future unfolds unexpectedly, so we must invent for an unknown future, not a known one. Or, as Alex Steffen put it so well in his post, <a href="http://www.worldchanging.com/archives/010959.html" target="_blank"><span>Why Our Bright Green Futures Will Be Weirder Than We Think</span></a>: <b>“The world we need is one we’ve never yet seen.”</b> The magic of attending an O’Reilly event is that it gives you a chance to work on this koan in interesting ways, and to take more responsibility for how things turn out.</p>
<p>Tim O’Reilly also urges that we think more deeply about what we are doing. His keynote for <a href="http://www.cloudera.com/company/press-center/hadoop-world-nyc/" target="_blank">Hadoop World</a>, NYC, billed as <b>“The Business of Data,”</b> turned towards <b>“The Consequences of Living in a World of Data.”</b> The 900-strong crowd at Hadoop World was probably one of the savviest in the world about the business of data, so this was a nice turn.</p>
<p><a href="http://www.web2expo.com/">Web 2.0 Expo</a>, with the theme <b>Platforms for Growth</b>, was a deep dive into the business of innovation. Tim O’Reilly’s keynote at <a href="http://www.web2expo.com/">Web 2.0 Expo</a>, “Thinking Hard About The Future” (or rather, “thinking a little bit creatively or differently about the future”) – see <a href="http://www.youtube.com/watch?v=3637xFBvkYg&amp;p=6F97A6F4BA797FB3" target="_blank">video here</a> – developed the call he made at Web 2.0 Expo 2008 to <b>“work on stuff that matters”</b> into a Four Cylinder Engine for Innovation. The first of the four cylinders in the firing order is <b>“Having Fun!”</b> But at Maker Faire, Web 2.0 Expo, and Hadoop World I got an inside look at the workings of all four cylinders, and there is more to come, I am sure, as the other O’Reilly events unfold over the coming months, including <a href="http://www.web2summit.com/web2010" target="_blank">Web 2.0 Summit</a>, <a href="http://strataconf.com/strata2011" target="_blank">Strata</a> (a new O’Reilly conference on The Business of Data), and <a href="http://radar.oreilly.com/2010/10/where-20-2011-cfp-is-open.html" target="_blank">Where 2.0, 2011</a>.</p>
<p>In a free webcast last week (<a href="http://www.youtube.com/oreillymedia#p/c/7/8CEyHSoWJcs" target="_blank">recording here</a>) previewing <a href="http://www.web2summit.com/web2010" target="_blank">Web 2.0 Summit</a>, John Battelle and Tim O’Reilly discussed the <a href="http://map.web2summit.com/" target="_blank">Points of Control Map</a>, which is developing into a fun and useful tool for examining a very serious topic, “The Battle for the Internet Economy,” and how the “increasingly direct conflicts between its major players” could affect “people, government and the future of technology innovation.” In my previous post, <a title="Permanent Link to Platforms for Growth and Points of Control for Augmented Reality: Talking with Chris Arkenberg" rel="bookmark">Platforms for Growth and Points of Control for Augmented Reality</a>, I had a great conversation with <a href="http://www.urbeingrecorded.com/" target="_blank">Chris Arkenberg</a> using this map as a springboard. More on Points of Control later in this post.</p>
<h3>The Four Cylinders of Innovation</h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/Screen-shot-2010-10-23-at-7.45.36-PM.png"><img class="alignnone size-medium wp-image-5814" title="Screen shot 2010-10-23 at 7.45.36 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/Screen-shot-2010-10-23-at-7.45.36-PM-300x193.png" alt="Screen shot 2010-10-23 at 7.45.36 PM" height="193" width="300"></a></p>
<p><i>click to enlarge</i></p>
<h3>From Jet Ponies to Jet Packs: The First Cylinder of Innovation – “Have Fun”</h3>
<p>The â€œmakerâ€ energy and its spirit of play, and the courage to create,  hack, reinvent and re-purpose everything and anything, is a  quintessential example of the first cylinder of innovation firing big.&nbsp;  Many â€œmakerâ€ projects also go on to fire on all four cylinders. &nbsp; But  the Maker forte definitely is in the first cylinder zone (and safety  third as some of the rides, including Jet Ponies, warned).&nbsp; The photo  opening this post by Marc  de Vinck â€“ for more pics <a href="http://www.flickr.com/photos/wurx/sets/72157624914508135/with/5027190140/" mce_href="http://www.flickr.com/photos/wurx/sets/72157624914508135/with/5027190140/">see here</a>, is of <a href="http://blog.makezine.com/archive/2010/09/tim_oreilly_rides_the_jet_ponies.html" mce_href="http://blog.makezine.com/archive/2010/09/tim_oreilly_rides_the_jet_ponies.html" target="_blank">Tim riding The Jet  Ponies</a> at <a href="http://makerfaire.com/newyork/2010/" mce_href="http://makerfaire.com/newyork/2010/" target="_blank">Maker Faire </a>which took&nbsp; the New York Hall of Science by storm in late September â€“ see<a href="http://makerfaire.com/newyork/2010/" mce_href="http://makerfaire.com/newyork/2010/" target="_blank"> </a><a href="http://cityroom.blogs.nytimes.com/2010/09/24/where-engineering-prowess-meets-burning-man/" mce_href="http://cityroom.blogs.nytimes.com/2010/09/24/where-engineering-prowess-meets-burning-man/" target="_blank">The New York Times coverage here</a>.&nbsp; The ride was <b>â€œbuilt by the  dastardly  danger-hackers at  the <a href="http://madagascarinstitute.com/" mce_href="http://madagascarinstitute.com/" target="_blank">Madagascar  Institute.</a>â€œ</b> See this <a href="http://thefastertimes.com/jetpacks/2009/10/09/this-guy-might-build-a-jetpack-or-at-least-a-hovercraft/" mce_href="http://thefastertimes.com/jetpacks/2009/10/09/this-guy-might-build-a-jetpack-or-at-least-a-hovercraft/" target="_blank">wonderful interview </a>with    Hackett on his 
work to design <b>â€œour specific jets from a patent that   was  filed in 1960s by a Mr. Lockwood, for Valveless Pulse Jets.â€ </b> Hackett points out:<b> </b></p>
<p><b>“Louder than god, glowing white-hot and looking like the trombone of the Apocalypse, pulse jets are also really shitty, inefficient engines,”</b></p>
<p>But, he adds:</p>
<p><b>“I have always wanted a jetpack, and one of the reasons I learned to build these things was to further that goal.”</b></p>
<p>This grand vision behind the Jet Ponies is a key to firing <b>the Second Cylinder of Innovation: “Hey, we can change the world!”</b></p>
<p>But the Jet Ponies, as a stepping stone to jet packs, also really struck a chord for me, as I have been devoting a lot of time lately to the emerging augmented reality industry – a technology which was lumped in the same category of sci-fi chimera as jet packs until very recently.</p>
<h3><b>Data is the Gasoline</b></h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/data.jpg"><img class="alignnone size-full wp-image-5862" title="data" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/data.jpg" alt="data" height="212" width="300"></a></p>
<p><b>“The faces are coming from the sky. The locations are coming from the sky. All these apps depend on something, somewhere up. And that, to me, was always the heart of Web 2.0. And I am so delighted that people are finally getting it. Because for a long time, people thought, ‘Oh, Web 2.0, it’s about lightweight advertising supported in a web start up.’ So I went, ‘No, no, no. It’s about the fact that we’re building these giant database subsystems in the sky that are going to drive applications.’ And now, of course, the same application is on your PC, it’s on your phone, it’s on your iPad. And clearly, the applications are just sort of an interface to something that is being driven from the cloud, and that is fabulous. That’s the difference. People get it now.”</b> (Tim O’Reilly, in response to the first questioner at the Media Round Table, Web 2.0 Expo)</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/5036745797_cf544d22cd_z.jpg"><img class="alignnone size-medium wp-image-5802" title="Media Roundtable" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/5036745797_cf544d22cd_z-300x199.jpg" alt="Media Roundtable" height="199" width="300"></a></p>
<p><i>Answering questions about the importance of “Having Fun” to innovation doesn’t look quite as fun as riding the Jet Ponies!</i> <i>Photo above from <a href="http://www.flickr.com/photos/lucasartoni/5036745797/in/photostream/" target="_blank">luca.sartoni’s Flickr stream</a></i></p>
<p><b>“The data that is generated by the sensors and the applications that use that data is going to be where people are going to be innovative.” (Tim O&#8217;Reilly)</b></p>
<p>During the Media Round Table, I had a chance to ask Tim more about the role of bottom-up innovation in a world where big data is the gasoline for increasingly sophisticated engines – platforms integrating machine-to-machine intelligence and real-time analytics.</p>
<p><b>Tish Shute:</b> You brought up Maker Faire in your keynote, and again now. I was there, which not many people in the audience were [not too many hands went up when Tim asked during his keynote]. But I think one of the things that struck me was the Jet Ponies – they were just earthshaking to stand near. They made the ground tremble; they made the world shake. Yet most of your keynote, and most of what’s on our minds here at Web 2.0 Expo, is extracting intelligence from the big data [in the sky], and algorithmic intelligences are the jet engines of the internet. And of course, not to be forgotten, as we are here in New York City, where the trading markets are creating the air we breathe [although we probably don't realize it until we lose our mortgage or something], these algorithmic economies or “robot casinos,” as Kevin Slavin put it, are all about speed – it’s not just real-time; issues of latency are so critical that co-location is key to winning the game of the markets. [Kevin Slavin brilliantly unpacks this in his talk, "Loitering on the Motherboard." For more on this see my conversation with Kevin Slavin below.]</p>
<p>So my question is: who’s making the jet ponies for the algorithmic economies in the sky that you just described? How can we make a play from the bottom up? I always feel <a href="http://www.ushahidi.com/">Ushahidi</a> is one of the jet ponies of the data algorithmic space [because of their great work to bring human and machine intelligence together to solve problems in crisis situations]. But who do you think is doing exciting work, and how can we ensure that this powerful world of data and algorithmic intelligences does not become hidden in a closed black box [only really accessible to elite players like the NYC trading markets]?</p>
<p><b>Tim O’Reilly: “Well, I think there’s certainly a lot of interesting things happening in, say, the financial services that a lot of, kind of, the Internet folks are kind of blind to. I think that there are companies like <a href="http://www.nextjump.com/" target="_blank">Next Jump</a> which are really good with data and good with algorithms. But kind of speaking specifically to the maker side of this, that whole sensor-enabled world which is going to produce data is in its infancy. What we have that I think is so powerful right now is we have the first portable sensor platform. I said in my talk the other day, you know, your phone has ears, it has eyes, it has a sense of where it is. And these are all available to application developers. You know, you can compare, say, Dodgeball to Foursquare, you can see how different… Dodgeball is Foursquare in the tele-type era. Foursquare is now possible because there are so many more capabilities on the phone.</b></p>
<p><b>And I think that we are going to see a lot of other areas that are revolutionized by the sensors in the device. It could well be that some of them will come explicitly out of the maker kind of projects, or it could just be that make is sort of a proxy for them. So yeah, <a href="http://www.arduino.cc/" target="_blank">Arduino</a> is this great maker sensor platform, but hey, here’s a consumer sensor platform [holding up phone]. Maybe we vaulted past the maker stage already and we just didn’t know it.</b></p>
<p><b>And that’s not entirely true, because Arduino is building a whole economy of special-purpose devices. But it feels a little bit like the days when people rolling their own PCs coexisted with the rise of Dell, who as a kid in his college dorm room made his own PCs and sold them on the net, but figured out how to scale it pretty quickly and get good at it. But there were still a lot of garage shops, you know, ‘I’ll make a PC and sell it to you’ people for probably a decade before there was really a clue that that was a commodity industry. In fact, I do think the sensor platforms are going to become a commodity industry. And the data that is generated by the sensors and the applications that use that data is going to be where people are going to be innovative.”</b></p>
<h3><b>The internet operating system is a data operating system and it is happening in real time (Tim O’Reilly)</b></h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/Hadooppost.jpg"><img class="alignnone size-medium wp-image-5839" title="Hadooppost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/Hadooppost-300x202.jpg" alt="Hadooppost" height="202" width="300"></a></p>
<p><i>click to enlarge the image above – a slide from the keynote by Mike Olsen (CEO of Cloudera) at <a href="http://www.cloudera.com/company/press-center/hadoop-world-nyc/" target="_blank">Hadoop World</a></i></p>
<p>Not only do we have a portable sensor platform in our pockets, but developers also have powerful platforms and tools to make sense of the data that fuels our apps. Open source <a href="http://hadoop.apache.org/" target="_blank">Hadoop</a> makes available, to anyone with some data-munching chops, the power to work with giant unstructured databases and do <a target="_blank" href="http://gigaom.com/2009/09/20/getting-closer-to-real-time-with-hadoop/">the kind of real-time analytics</a> previously only available to giants like Google. Big players like Yahoo, Facebook, and Twitter use Hadoop (Jonathon Gray from Facebook noted they add 10TB <i>a day</i>). But, as <a href="http://www.cscyphers.com/blog/2010/10/12/hadoop-world-2010/" target="_blank">this great roundup of Hadoop World</a> points out, while Hadoop gets the press for handling petabytes of data, Mike Olsen (CEO of Cloudera) noted that the fastest growing group of users are working with clusters smaller than 10TB, and over half of all Hadoop clusters were under 10TB in size.</p>
<h3>Four Square: A Platform for Growth with an ecosystem built on top of data that exists in the real world</h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/Screen-shot-2010-10-26-at-2.27.19-AM.png"><img class="alignnone size-medium wp-image-5888" title="Screen shot 2010-10-26 at 2.27.19 AM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/Screen-shot-2010-10-26-at-2.27.19-AM-300x256.png" alt="Screen shot 2010-10-26 at 2.27.19 AM" height="256" width="300"></a></p>
<p>As an augmented reality enthusiast, it is not hard to guess that one of my favorite platforms for growth is <a href="http://foursquare.com/apps/" target="_blank">Four Square</a>. See <a href="http://www.web2expo.com/webexny2010/public/schedule/detail/15652" target="_blank">Dennis Crowley’s keynote at Web 2.0 Expo</a> here. The Four Square API has been available to developers since November 2009, and there are already a number of interesting applications, with many more to come. The screen shot above is of <a href="http://geopollster.com/" target="_blank">geopollster</a> – <a href="http://foursquare.com/apps/" target="_blank">see the gallery of Four Square apps here</a>.</p>
<p><i><b>@dens tweeted recently: “Politics + @Foursquare = @GeoPollster” http://geopollster.com &lt;- I love love love that people are using 4SQ to think about election tools</b></i></p>
<p>As Kati London pointed out in her keynote, Four Square is the <b>“kind of augmented reality that is aimed at shifting or changing a person’s social reality, e.g. the mayor badges in Four Square that change my relationship to the people and the place I am in, and augment engagement and reputation through socially driven consumer tie-ins.”</b> We are already seeing augmented reality developers beginning to work with the Four Square API – see here: <a href="http://recombu.com/apps/iphone/arstreets-app-review_M12590.html" target="_blank">Foursquare + Augmented Reality + Virtual Graffiti = ARstreets</a>.</p>
<p>As augmented reality development tools mature, Four Square will increasingly become an important platform for creative AR developers interested in integrating the power of this platform for augmented engagement and reputation with <b>“device-aided augmented reality that can shift visual experiences of situated geolocal experiences.”</b> With the <a href="http://developer.qualcomm.com/dev/augmented-reality" target="_blank">Qualcomm vision based augmented reality SDK</a> now available for download, <a href="http://googlewavedev.blogspot.com/2010/09/wave-open-source-next-steps-wave-in-box.html" target="_blank">Wave in a Box</a> soon (?) to be released, and an <a href="http://arwave.org/" target="_blank">ARWave</a> client working on Android (almost!), I have been exploring the Four Square API in my nonexistent spare time!!</p>
<p>The Foursquare API also offers some interesting possibilities for exploring games that take the complex economy of Foursquare – not personal data but aggregates of behavior – as their subject matter (for more on this, see my conversation with Kevin Slavin later in this post and in an upcoming post).</p>
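<p>To give a flavor of the kind of tinkering I mean, here is a minimal sketch of querying nearby venues. The v2 <code>venues/search</code> endpoint and its <code>ll</code>/<code>oauth_token</code> parameters are my assumptions from Foursquare’s public developer docs, and the token is a placeholder, not a working credential.</p>

```python
# Sketch: looking up nearby venues via a Foursquare-style REST API.
# Endpoint and parameter names are assumptions from the public v2 docs;
# the OAuth token below is a placeholder.
import json
from urllib.parse import urlencode
from urllib.request import urlopen

API_ROOT = "https://api.foursquare.com/v2"

def venues_search_url(lat, lng, query=None, limit=10, token="YOUR_OAUTH_TOKEN"):
    """Build the venues/search request URL for a given location."""
    params = {"ll": f"{lat},{lng}", "limit": limit, "oauth_token": token}
    if query:
        params["query"] = query
    return f"{API_ROOT}/venues/search?{urlencode(params)}"

def nearby_venue_names(lat, lng, **kw):
    """Fetch venues near (lat, lng) and return their names (network call)."""
    with urlopen(venues_search_url(lat, lng, **kw)) as resp:
        data = json.load(resp)
    return [v["name"] for v in data["response"].get("venues", [])]
```

<p>From aggregates of check-ins like these – never any one person’s trail – a GeoPollster-style app can map behavior onto places.</p>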
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/DennisatWhere2009post.jpg"><img class="alignnone size-medium wp-image-5886" title="DennisatWhere2009post" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/DennisatWhere2009post-199x300.jpg" alt="DennisatWhere2009post" height="300" width="199"></a></p>
<p><i>I took this picture of Dennis at <a href="http://where2conf.com/where2009/" target="_blank">Where 2.0, 2009</a>, at the beginning of Foursquare’s phenomenal growth (they are at 4 million plus users now).</i></p>
<h3><b>Pachube (Patch-Bay):</b> a web service for storing and sharing sensor, energy and environmental data</h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/Screen-shot-2010-10-24-at-7.58.17-PM1.png"><img class="alignnone size-medium wp-image-5838" title="Screen shot 2010-10-24 at 7.58.17 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/Screen-shot-2010-10-24-at-7.58.17-PM1-300x198.png" alt="Screen shot 2010-10-24 at 7.58.17 PM" height="198" width="300"></a></p>
<p>Eighteen months ago, I interviewed Usman Haque, architect, director of <a id="o.td" title="Haque Design + Research" href="http://www.haque.co.uk/" target="_blank">Haque Design + Research</a>, and founder of <a id="cpbp" title="Pachube" href="http://www.pachube.com/">Pachube</a> – see <a target="_blank">Pachube, Patching the Planet</a>. Usman pointed me to this wonderfully evocative image from <a href="http://www.geog.ubc.ca/%7Etoke/Profile.htm" target="_blank">T.R. Oke’s</a> book, <a href="http://www.amazon.com/Boundary-Layer-Climates-T-Oke/dp/0415043190" target="_blank">“Boundary Layer Climates”</a> (original photo source: Prof. L. E. Mount’s <a href="http://www.alibris.com/booksearch?qwork=1137594&amp;matches=1&amp;author=Mount%2C+Laurence+Edward&amp;browse=1&amp;cm_sp=works*listing*title" target="_blank">The Climatic Physiology of the Pig</a>). “<i>It’s the same piglets, in the same box, but on the right-hand side the temperature has been increased. This small change in how the space is ‘programmed’ has dramatically changed the way the ‘inhabitants’ relate to each other and how they relate to their space.</i>”</p>
<h3><b>The Challenge of Connecting People and Environments</b></h3>
<p>At Web 2.0 Expo, I got the opportunity to talk with Usman Haque again. <a href="http://www.pachube.com/" target="_blank">Pachube</a> is becoming an established platform now, Usman explained. They have a development team of eleven and a robust back end, and they will now be spending more time on the front end, including a redesign of the website, making <b>“it a lot easier to widgetize the entire website so that you will be able to take almost any element and embed that into your own website.”</b> And, as <a href="http://www.web2expo.com/webexny2010/public/schedule/speaker/43845" target="_blank">Usman mentioned in his presentation</a>, they are working on an augmented reality interface, Porthole, for facilities management and, “as a consumer-oriented application that extends the universe of Pachube data into the context of AR – a ‘porthole’ into Pachube’s data environments.” Usman is also contributing to the AR standards discussion and is now on the program committee <a href="http://www.w3.org/2010/06/16-w3car-minutes.html#item02" target="_blank">for the W3C group on augmented reality</a>.</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/Screen-shot-2010-10-26-at-10.22.24-PM.png"><img class="alignnone size-medium wp-image-5912" title="Screen shot 2010-10-26 at 10.22.24 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/Screen-shot-2010-10-26-at-10.22.24-PM-300x134.png" alt="Screen shot 2010-10-26 at 10.22.24 PM" height="134" width="300"></a></p>
<p>Click to enlarge the image above, from Chris Burman’s paper for the W3C, <a href="http://www.w3.org/2010/06/w3car/portholes_and_plumbing.pdf" target="_blank">Portholes and Plumbing: how AR erases boundaries between “physical” and “virtual”</a></p>
<p>Pachube is sometimes described as the Facebook for Data or, an analogy Usman prefers, a Twitter for Sensors. At Web 2.0 Expo, I had an amazing opportunity to hear from Twitter and Facebook about their strategies as platforms for growth, which gave me lots of fuel for questions about Pachube’s approach to developing their platform. Simplicity was a theme that both Facebook and Twitter affirmed as key. One of Pachube’s challenges will be to deliver ease of use, and the equivalent of Facebook’s “like” and Twitter’s “follow,” to gain mass appeal.</p>
<p>Here is a brief excerpt from my upcoming conversation with Usman:</p>
<p><b>Tish Shute</b>: So as a platform you see Pachube as having more in common with Twitter – a Twitter for Sensors. In what ways is Pachube similar to Twitter?</p>
<p><b>Usman Haque: Well, we are the Twitter of sensors, devices &amp; machines in the sense that, really, the API that enables all this communication is important, much more so than the website itself. It is where, basically, most of our millions of hits actually go – to the backend. And we’ve now got dozens of applications built on top of the system, a little bit like Twitter’s applications; you know, all the apps are the important part.</b></p>
<p><b>But we are actually going to be doing some quite exciting things with API keys that we haven’t really spoken that much about in public. We have come up with a pretty innovative solution to put granular privacy options on almost every resource, <a href="http://community.pachube.com/node/526">now discussed here</a>.</b></p>
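<p>To make the “Twitter for sensors” idea concrete: every Pachube feed is just a web resource you read with a key. The sketch below is mine, not Usman’s code; the v2 URL scheme and the <code>X-PachubeApiKey</code> header are assumptions from Pachube’s developer documentation of the time, and the feed id and key are placeholders.</p>

```python
# Sketch: reading a Pachube feed over its REST API. URL scheme and
# auth header are assumptions from the docs of the time; the feed id
# and API key are placeholders.
import json
from urllib.request import Request, urlopen

API_ROOT = "http://api.pachube.com/v2"

def feed_request(feed_id, api_key):
    """Build an authenticated GET request for one feed's JSON document."""
    return Request(f"{API_ROOT}/feeds/{feed_id}.json",
                   headers={"X-PachubeApiKey": api_key})

def latest_values(feed_id, api_key):
    """Return {datastream id: current value} for a feed (network call)."""
    with urlopen(feed_request(feed_id, api_key)) as resp:
        feed = json.load(resp)
    return {ds["id"]: ds.get("current_value")
            for ds in feed.get("datastreams", [])}
```

<p>The point of the analogy is that, as with Twitter, the interesting surface area is this API rather than the website: apps subscribe to environments the way Twitter clients subscribe to people.</p>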
<p>At Hadoop World, Tim O’Reilly also raised some interesting broader questions that are very relevant to Pachube’s vision to “patch the planet,” e.g., the problem of digital identity in the age of sensors. (Smartphones already know their users by the way they walk!) And, <b>“How should we think about privacy in a world where data can be triangulated?”</b></p>
<p>Usman talked about Pachube’s approach to both the technical aspects of how to build a massively scalable system, and the conceptual aspects of how people connect to each other and what they might do with these new opportunities to connect environments and sensor data (see my earlier talk with Usman, <a target="_blank">Pachube, Patching the Planet</a>, for a detailed explanation of some of the concepts behind Pachube).</p>
<p>I look forward to posting this conversation.  Pachube is growing, and  Usman always goes beyond the familiar tropes of connecting human and  machine intelligence.</p>
<h3><b>2nd Cylinder of Innovation: “Hey, Can We Change the World!”</b></h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/Screen-shot-2010-10-24-at-5.26.55-PM.png"><img class="alignnone size-medium wp-image-5826" title="Screen shot 2010-10-24 at 5.26.55 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/Screen-shot-2010-10-24-at-5.26.55-PM-300x217.png" alt="Screen shot 2010-10-24 at 5.26.55 PM" height="217" width="300"></a></p>
<p>The possibilities for reimagining the role of data in healthcare produced some of the most powerful “Hey, Can We Change the World” moments for me at both Web 2.0 Expo and Hadoop World. The slide above is from Esther Dyson’s brilliant Ignite presentation, <a href="http://www.slideshare.net/ignitenyc/esther-dyson-what-you-can-and-cant-learn-from-your-genes" target="_blank">“What you can and can’t learn from your genes”</a>. Tim O’Reilly also brought up the powerful role real-time data analytics can play in improving healthcare in his Hadoop World keynote. Also see Alex Howard’s post, <a href="http://radar.oreilly.com/2010/10/top-10-lessons-for-gov-20-from.html" target="_self">10 Lessons for Gov 2.0 from Web 2.0</a>, for some more great “hey, we can change the world” moments at Web 2.0 Expo. The keynote from <a href="http://www.web2expo.com/webexny2010/public/schedule/detail/15726" target="_blank">Lukas Biewald of CrowdFlower and Leila Chirayath Janah of Samasource</a> (screen shot below), in particular, is a provocative exploration of the future of work in the new ecologies of human and machine intelligence.</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/Screen-shot-2010-10-25-at-8.21.43-PM.png"><img class="alignnone size-medium wp-image-5870" title="Screen shot 2010-10-25 at 8.21.43 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/Screen-shot-2010-10-25-at-8.21.43-PM-300x184.png" alt="Screen shot 2010-10-25 at 8.21.43 PM" height="184" width="300"></a></p>
<h3><b>Changing the World When Our Lives Are Increasingly Shaped by Forces Invisible To Us?</b></h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/Screen-shot-2010-10-24-at-11.49.32-PM.png"><img class="alignnone size-medium wp-image-5840" title="Screen shot 2010-10-24 at 11.49.32 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/Screen-shot-2010-10-24-at-11.49.32-PM-300x152.png" alt="Screen shot 2010-10-24 at 11.49.32 PM" height="152" width="300"></a></p>
<p><i>Click to enlarge</i></p>
<p>Mike Olson, CEO of Cloudera, noted that <b>“the largest area of data growth does not come from humans interacting with machines; rather, it’s from machines interacting with each other”</b> (see <a href="http://www.cscyphers.com/blog/2010/10/12/hadoop-world-2010/" target="_blank">Minor Technical Difficulties</a>). One of the most interesting presentations at Web 2.0 Expo was <a href="http://www.web2expo.com/webexny2010/public/schedule/speaker/86516" target="_blank">Kevin Slavin’s “Loitering on the Motherboard,”</a> which, as Tim O’Reilly pointed out in his keynote at Hadoop World, is a talk that raises all kinds of questions about a system where big players are gaming the data for their own ends.</p>
<p>Kevin Slavin, a founder of <a href="http://areacodeinc.com/">Area/Code</a>, notes that the operating system of our mortgages and life insurance, the operating system of currencies and gold, is now governed by machine-to-machine intelligence and algorithmic economies outside of human cognitive processes. The markets are now legible only to bots in an algorithmic arms race, with bots surveilling bots and throwing off false information in a bid for counter-surveillance. He showed some slides of the eerie but beautiful visualizations of traces of the trading bots created from the Nanex API.</p>
<p>The screenshot above is from <a href="http://www.nanex.net/FlashCrash/CCircleDay.html" target="_blank">Nanex: Crop Circle of the Day – Quote Stuffing and Strange Sequences</a>. <b>“The common theme with the charts shown on this page is they are all generated in code and are algorithmic. Some demonstrate bizarre price or size cycling, some demonstrate large burst of quotes in extremely short time frames and some will demonstrate both…”</b> This one is a zoom of the NSDQ “Wild Thing”: a wild price/size repeater from NSDQ running at 1,000 quotes per second, affecting the BBO along the way (I love the great names Nanex gives the different patterns and traces produced by the trading bots).</p>
<p>Nanex supplies a <a href="http://www.nanex.net/">real-time data feed</a> comprising trade and quote data for all US equity, option, and futures exchanges. They have <a href="http://www.nanex.net/historical.html">archived this data</a> since 2004 and have created and used numerous tools to “sift through the enormous dataset: approximately 2.5 trillion quotes and trades as of June 2010.” May 6th, 2010 (the day of the flash crash) had approximately 7.6 billion trade, quote, level 2, and depth records.</p>
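<p>The analysis behind charts like the “Wild Thing” is conceptually simple: bucket quote timestamps by second and flag seconds whose rate spikes. Here is a toy sketch of that idea, mine rather than Nanex’s tooling; only the 1,000-quotes-per-second figure comes from the post, and the function names are illustrative.</p>

```python
# Sketch: flagging quote-stuffing-like bursts in a stream of quote
# timestamps, in the spirit of the Nanex charts described above.
from collections import Counter

def quotes_per_second(timestamps_ms):
    """Bucket quote timestamps (in milliseconds) into per-second counts."""
    return Counter(ts // 1000 for ts in timestamps_ms)

def stuffing_seconds(timestamps_ms, threshold=1000):
    """Return the seconds whose quote rate meets or exceeds the threshold."""
    return sorted(sec for sec, n in quotes_per_second(timestamps_ms).items()
                  if n >= threshold)
```

<p>On a quiet stream this returns nothing; on a “Wild Thing”-style repeater firing 1,000 quotes in one second, that second is flagged. The hard part, of course, is doing this over trillions of records, which is exactly where the Hadoop World themes come back in.</p>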
<p>Kevin points out that our lives are being shaped by criteria invisible to us, and that the old hackneyed tropes of machine-to-machine intelligence, such as robots reading HUDs in English, are long worn out. The latter point is, perhaps, something for us augmented reality geeks absorbed in ideas of “making the invisible visible” to chew on.</p>
<p>Changing a world shaped by forces that are, increasingly, invisible to us presents a huge challenge.</p>
<p>But I had the glimmer of a “Hey, Can We Change the World” moment when I attended Kevin Slavin’s presentation and had a conversation with him after his talk. Could games take these complex economies as their subject matter? The economies of Farmville and games like WoW are not opaque at all, and these are environments with complex economic behavior, <b>“where you can actually have enough data to understand what it is”</b> – <b>“it’s not so much about personal data. It’s more about, like, aggregate behaviors.”</b> <b>“Games that can really model those, and play with those, and take those as the subject the way that Monopoly takes Monopoly as a subject could be really interesting.”</b> Kevin made many fascinating points – more to come on this topic.</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/KevinSlavin.jpg"><img class="alignnone size-medium wp-image-5980" title="Kevin Slavin" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/KevinSlavin-300x199.jpg" alt="Kevin Slavin" height="199" width="300"></a></p>
<p>Photo by <a rel="nofollow" href="http://duncandavidson.com/">James Duncan Davidson</a> of Kevin Slavin speaking at Web 2.0 Expo NY 2010, from the <a href="http://www.flickr.com/photos/oreillyconf/5035426532/" target="_blank">O’Reilly Conferences Flickr stream</a></p>
<p>Here is the beginning of our conversation:</p>
<h3>Talking With Kevin Slavin</h3>
<p><b>Tish Shute: </b>You began your talk today about visibility and where some of the algorithmic masters of disguise went to work after they had solved the math behind stealth bombers. I thought perhaps you were leading into ideas about a reverse surveillance society.</p>
<p>But you surprised me, as I felt you made visibility itself kind of a non-issue by the end of your presentation, and that counter-surveillance became basically a time and speed issue. Now I am not sure quite how to imagine a counter-surveillance society, something I try to think about…</p>
<p><b>Kevin Slavin: Well, let’s see. There’s a couple of ways to think about it. I think one point is just that when we talk about counter-surveillance, we usually locate that as something that comes from the bottom up, something that comes from the population. Think about the way the plane spotters discovered the CIA black rendition flights.</b></p>
<p><b>I think in general, when people talk about counter-surveillance, or sousveillance, they imagine it as an inversion of the traditional relationship between the people and the state.</b></p>
<p><b>But that’s what’s interesting. What’s happening now is that there are forms of surveillance and counter-surveillance in play beyond any human perceptual horizons. These forms are at their most sophisticated in financial services, in the markets.</b></p>
<p><b>If you were a bot, and could read the market legibly (which humans cannot), what you would see, effectively, are bots that are surveilling bots. Then you have bots that are throwing off false information in a bid for counter-surveillance. Many of the bots are, themselves, surveilling other bots; each one of them is trying to figure out what all the other ones are going to do. In essence, it’s an algorithmic arms race, and game theory has become concrete, since the theories are code, the code is action, and the action affects, let’s say: your mortgage.</b></p>
<p><b>And so, basically, what you have is this series of algorithms that are all looking to discern each other, while also trying to prevent themselves from being discerned. I think of the tunnels under the trenches in WWI: tunnels to surveil the trenches, and then, later, tunnels to surveil the tunnels. There’s a few examples of this kind of thing. But it’s especially strange when it’s computer code, and at the magnitude we’re seeing today.</b></p>
<p><b>All of it, as noted in the talk, accounting for 70% of all the trades in the market. 70% of market trades are never touched by human hands or even seen by human eyes; they don’t move through a conventional cognitive process. And that’s why you get things like the Credit Suisse algorithm that was buying and selling 200,000 shares of stock to itself over and over and over again. It was a bug, and it slowed the market to a crawl.</b></p>
<p><b>Credit Suisse was fined, in essence, for failing to control an algorithm. Maybe that’s the first time an algorithm was treated like a human, in a way. As if the algorithm broke the law, and Credit Suisse was responsible for letting it do so. For me, that feels like a threshold event.</b></p>
<p><b>It’s not that humans never made mistakes when trading on the market. But when algorithms err, they err with magnitude.</b></p>
<p><b>The idea that we now have bugs in the United States market economy is really worth looking at. If Apple can’t keep code bugs out of the most simple iPhone apps in a closed and regulated ecosystem, I’m pretty certain we’ll have a lot more Credit Suisse-type bugs in the future.</b></p>
<p><b>And that will be pretty interesting. There will be viruses, and the operating system they will operate on will be the operating system of the United States. The operating system of your pension, your house, your life insurance. The operating system of currencies and gold.</b></p>
<p><b>Tish Shute:</b> I was hard-pressed by the end of your talk to think, “Well, what would be the equivalent of a people’s uprising to create a better, fairer society in this kind of world where, really, the things that affect the key aspects of our lives most are going on beyond human perception, at an algorithmic level?” But you made a pretty radical suggestion at the end…</p>
<p><b>Kevin Slavin: Well, I think increasingly the markets have become delaminated from anything meaningful. First from goods, then from fundamentals, and now finally from homo sapiens. So that’s hard to fight.</b></p>
<p><b>It’s the race towards abstraction that makes it impossible to simply “resist.” The latest version in the long series of fiscal catastrophes was based on Wall Street finding goods that could be rolled up and sold with false valuations, but goods that would take a long time to fail. Mortgages are handy like that. It’s the tradition of extending the abstraction as long as possible, until finally the bill arrives and the banks fail. I don’t know if that’s something to rise up against or not. It’s like a rally against evil.</b></p>
<p><b>But really, I think the point is that it won’t be the people that rise up. It will be the financial services themselves that rise up. They’ll just detach completely.</b></p>
<p><b>That was harder to do with cotton or with wheat, with simple futures; they keep financial services tied to the ground. So what we’re doing is creating increasingly complex financial instruments that are further and further removed from anything you can touch. Like the way a mortgage is abstract. But, of course, the bottom line is that at the end of that mortgage lies someone’s home.</b></p>
<p><b>It’s said that Wall Street is now moving onto life insurance, because that’s going to take even longer to fail. They’re doing the exact same thing. The word is that they are rolling up CDOs made out of crap life insurance policies, the same way they rolled them up with crap mortgages a few years ago.</b></p>
<p><b>And those will probably take, I don’t know, 15 or 20 years to unwrap and unravel. But what you see in the meantime is that they are looking for things that are increasingly abstract, intangible, removed as far as possible from the experience of everyday life.</b></p>
<p><b>So maybe this is good. Maybe that’s financial services rising up. Lifting off. I think the best-case scenario now is that they actually leave humans alone altogether. That, someday, they are just trading, effectively, completely arbitrary goods – the stocks could be anything at all, maybe for crops that no longer exist – and I’m just saying that then these bots would no longer affect what we do and what we are; it would just be a robot casino, an invisible paradise in the air.</b></p>
<h3><b>People are the platform: How Games Can Be Engines of Innovation in Our Lives</b></h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/Screen-shot-2010-10-25-at-11.34.58-PM.png"><img class="alignnone size-medium wp-image-5872" title="Screen shot 2010-10-25 at 11.34.58 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/Screen-shot-2010-10-25-at-11.34.58-PM-300x204.png" alt="Screen shot 2010-10-25 at 11.34.58 PM" height="204" width="300"></a></p>
<p><i><b>See the video of <a href="http://www.web2expo.com/webexny2010/public/schedule/detail/15446" target="_blank">Games that Know Where We Live</a> here (screen shot above)</b></i></p>
<p>Kati London, Senior Producer at <a href="http://areacodeinc.com/">Area/Code</a>, showed in her keynote how <a href="http://www.web2expo.com/webexny2010/public/schedule/detail/15446" target="_blank">games that know where we live</a> can shift players’ perspectives – from device-aided augmented reality that can shift visual experiences of situated geolocal experiences to a kind of augmented reality that is aimed at shifting or changing a person’s social reality, e.g. the mayor badges in Foursquare that change my relationship to the people and the place I am in, and augment engagement and reputation through socially driven consumer tie-ins.</p>
<p>Area/Code has recently developed <a id="internal-source-marker_0.7281649763651145" href="http://www.knightfoundation.org/news/press_room/knight_press_releases/detail.dot?id=370129">two games for the Knight Foundation</a> that take people as the platform. Macon Money uses very simple game dynamics (for more, <a href="http://www.web2expo.com/webexny2010/public/schedule/detail/15446" target="_blank">see the video</a> of Kati’s keynote) in a game designed to support “Knight’s continuing efforts to support revitalizing Macon and creating a vibrant college town.”</p>
<p>The other game that Area/Code has designed with the support of the Knight Foundation is for the Biloxi and Gulf Coast community, a game called Battlestorm. <a href="http://www.knightfoundation.org/news/press_room/knight_press_releases/detail.dot?id=370129">“The game’s purpose is to increase awareness about natural disasters and change the way people prepare for them.”</a></p>
<h3><b>3rd Cylinder of Innovation: Build Products, Business Models and Entire Industries</b></h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/Screen-shot-2010-10-23-at-11.06.57-PM.png"><img class="alignnone size-medium wp-image-5822" title="Screen shot 2010-10-23 at 11.06.57 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/Screen-shot-2010-10-23-at-11.06.57-PM-300x151.png" alt="Screen shot 2010-10-23 at 11.06.57 PM" height="151" width="300"></a></p>
<p><a href="http://www.glympse.com/" target="_blank">Glympse</a> – real-time, private location tracking</p>
<p>Julianne Pepitone, Yahoo! Finance, nailed the essence of Web 2.0 Expo NYC this year in her post, <a href="http://finance.yahoo.com/news/Web-20-Expo-startups-are-big-cnnm-2700333063.html?x=0&amp;.v=2" target="_blank">Web 2.0 Expo startups are big on neighborhoods, storytelling</a>. She writes:</p>
<p><b>“At the Web 2.0 Expo in New York City this week, executives from big sites like Facebook, Twitter and Pandora all spoke about industry trends. But the showcase of 27 startup tech companies stole the show.”</b></p>
<p>Listen carefully to Tim O’Reilly and Fred Wilson of Union Square Ventures question their picks from the <a href="http://www.web2expo.com/webexny2010/public/schedule/detail/15525" target="_blank">startup showcase</a> at Web 2.0 Expo. Also see <a href="http://www.youtube.com/watch?v=Xbui5_5_NCA&amp;p=6F97A6F4BA797FB3" target="_blank">this video of Fred and Tim discussing their conversations with all the startups</a>. This is one of the clearest public windows onto both how to present your company to a VC and how to figure out the most important questions for you as an entrepreneur building a business in a world of data.</p>
<p><a href="http://www.glympse.com/">Glympse</a> <a href="http://www.youtube.com/watch?v=EuKScQbPvVc&amp;feature=channel" target="_blank">successfully pitches</a> their “jet pony” strategy for a location-based business, and is Fred’s pick. They hold up well under pressure and answer Tim and Fred’s hard questions about how their startup will not get overtaken by an incumbent player with resources and market share before they can gain traction. <a href="http://www.food52.com/">food52</a> <a href="http://www.youtube.com/watch?v=NZZ0apJTUQA&amp;feature=channel" target="_blank">responds to Tim’s probing about their strategy</a> for business data analytics, which he points out are vital if they want to survive on the small margins of ecommerce. There is a list of all the participants in the startup showcase in Brady’s <a href="http://radar.oreilly.com/2010/09/the-startups-at-the-expo-showc.html" target="_blank">post here</a>. <a href="http://hour.ly/" target="_blank">hour.ly</a> was the audience pick.</p>
<h3><a href="http://www.shazam.com/" target="_blank">Shazam</a> for Faces!</h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/Screen-shot-2010-10-26-at-4.14.52-AM.png"><img class="alignnone size-medium wp-image-5897" title="Screen shot 2010-10-26 at 4.14.52 AM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/Screen-shot-2010-10-26-at-4.14.52-AM-300x134.png" alt="Screen shot 2010-10-26 at 4.14.52 AM" height="134" width="300"></a></p>
<p>My favorite startup was a biometric service doing face, iris, and fingerprint matching, <a href="http://www.tacticalinfosys.com/" target="_blank">Tactical Information Systems</a>.</p>
<p>Tim and Fred also liked them, and they have an interesting discussion about the merits of approaching your platform through a narrow first application, as Tactical Information Systems are doing with <a href="http://www.wanderid.org/" target="_blank">WanderID</a>, an application to help identify lost Alzheimer’s patients. As Fred pointed out, they are potentially the <a href="http://www.shazam.com/" target="_blank">Shazam</a> for faces, so why start so small?</p>
<p>I had asked TIS the same question when I met them in the “speed dating” session. This is just their first toe in the water, as they are a two-person company at the moment. Their vision for their platform is big. Mary Haskett and Dr. Alex Kilpatrick, the founders of this quintessential jet pony for the algorithmic economies in the sky, are not only a partnership with the credentials to do a <a href="http://www.shazam.com/" target="_blank">Shazam</a> for faces (<a href="http://www.tacticalinfosys.com/about.html" target="_blank">see their bios here</a>); they are the people I would want to be running a <a href="http://www.shazam.com/" target="_blank">Shazam</a> for faces! They really get the consequences of living in a world of data. Check out Dr. Kilpatrick’s absolute killer Ignite talk, <a href="http://ignite.oreilly.com/2010/10/defeating-big-brother-by-dr-alex-kilpatrick-ep-75.html" target="_blank">“Defeating Big Brother”</a> (screenshot below).</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/Screen-shot-2010-10-23-at-11.03.11-PM.png"><img class="alignnone size-medium wp-image-5819" title="Screen shot 2010-10-23 at 11.03.11 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/Screen-shot-2010-10-23-at-11.03.11-PM-300x229.png" alt="Screen shot 2010-10-23 at 11.03.11 PM" height="229" width="300"></a></p>
<h3>How Can Augmented Reality Add Value to the Real Time Internet/Data Operating System?</h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/Screen-shot-2010-10-26-at-4.12.57-AM.png"><img class="alignnone size-medium wp-image-5896" title="Screen shot 2010-10-26 at 4.12.57 AM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/Screen-shot-2010-10-26-at-4.12.57-AM-300x199.png" alt="Screen shot 2010-10-26 at 4.12.57 AM" height="199" width="300"></a></p>
<p><i><a href="http://www.planefinder.net/" target="_blank">planefinder.net</a>, an augmented reality app that lets you find information about planes by pointing your phone at the sky, “including flight number, aircraft registration, speed, altitude and how far away it is” (via <a href="http://www.maclife.com/article/news/do_some_plane_scouting_augmented_reality_plane_finder_app">MacLife</a>).</i></p>
<p>The new opportunities in the algorithmic economies in the sky were center stage at Web 2.0 Expo, and some interesting AR apps for the real time internet/data operating system are emerging, like <a href="http://www.planefinder.net/" target="_blank">planefinder.net</a>. But Augmented Reality was still pretty low profile at Web 2.0 Expo (though an NVIDIA augmented reality demo attracted a lot of attention at the sponsors expo). However, everyone working in the emerging AR industry should recognize that apps big on “neighborhoods and storytelling” are heading right up the AR street, and that platforms like Foursquare and Pachube present an enormous opportunity to explore the possibilities of AR. And if augmented reality enthusiasts are not already paying attention to real time data analytics and <a href="http://hadoop.apache.org/" target="_blank">Hadoop</a>, they should be (see <a href="http://www.cscyphers.com/blog/2010/10/12/hadoop-world-2010/" target="_blank">this post for an excellent round up</a> on Hadoop World).</p>
<p>At Hadoop World, Tim O’Reilly referenced the great tagline from the <a href="http://vimeo.com/11742135">IBM commercial</a>:</p>
<p><b>“Would you be willing to cross the street — blindfolded — on data that was five minutes old? Five hours? Five days?”</b></p>
<p>As I have noted in several earlier posts (<a href="../../2010/09/27/urban-games-storytelling-with-augmented-reality-the-big-arny-and-inside-ar-talking-with-thomas-alt-metaio/" target="_blank">see here</a> and <a href="../../2010/08/05/vision-based-augmented-reality-ar-in-smart-phones-qualcomms-ar-sdk-interview-with-jay-wright/" target="_blank">here</a> for starters), we are just seeing the tools for developing near field, vision based, mobile, social AR become widely available to developers, so there should be a new level of AR apps emerging through 2011. There is a wonderful discussion in the comments of this post by Mac Slocum, <a href="http://radar.oreilly.com/2010/10/two-ways-augmented-reality-app.html" target="_blank">“How Augmented Reality Apps Can Catch On,”</a> between Mac, Raimo, one of the founders of <a href="http://www.layar.com/" target="_blank">Layar</a>, and <a href="http://www.urbeingrecorded.com/" target="_blank">Chris Arkenberg</a> on what constitutes a platform for growth for augmented reality.</p>
<p>Mac’s post, the comments, and <a href="http://www.urbeingrecorded.com/news/2010/10/13/is-ar-ready-for-the-trough-of-disillusionment/" target="_blank">Chris Arkenberg’s post</a> on the <a href="http://www.gartner.com/it/page.jsp?id=1447613" target="_blank">latest edition of the Gartner Hype Cycle</a>, which rather curiously placed Augmented Reality almost at the peak of inflated expectations, really got me excited about exploring an idea I have been thinking about for a while: getting the AR community to discuss the <a href="http://map.web2summit.com/">Points of Control map</a>. See my discussion with Chris Arkenberg here: <a rel="bookmark" href="http://www.ugotrade.com/2010/10/27/platforms-for-growth-and-points-of-control-for-augmented-reality-talking-with-chris-arkenberg/" target="_blank">Platforms for Growth and Points of Control for Augmented Reality</a>. The recording of John Battelle&#8217;s and Tim O&#8217;Reilly&#8217;s webcast on Points of Control <a href="http://www.youtube.com/oreillymedia#p/c/7/8CEyHSoWJcs" target="_blank">is posted here</a>.</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/Screen-shot-2010-10-27-at-2.01.38-AM.png"><img class="alignnone size-medium wp-image-5932" title="Screen shot 2010-10-27 at 2.01.38 AM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/Screen-shot-2010-10-27-at-2.01.38-AM-300x124.png" alt="Screen shot 2010-10-27 at 2.01.38 AM" height="124" width="300"></a></p>
<p><a href="http://map.web2summit.com/" target="_blank">The interactive Points of Control map</a> is an amazing tool to think with! Check it out in territory, movements, and acquisition modes. There is a competition for the most interesting comment and most interesting acquisition suggestion. The prize is a ticket to Web 2.0 Summit!</p>
<h3>What is the Future of Social?</h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/ARwave_logo_small.png"><img class="alignnone size-full wp-image-5987" title="ARwave_logo_small" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/ARwave_logo_small.png" alt="ARwave_logo_small" height="146" width="208"></a></p>
<p>The recent “defection” from Google to Facebook (see <a title="Lars Rasmussen, Father Of Google Maps And Google Wave, Heads To Facebook" rel="bookmark" href="http://techcrunch.com/2010/10/29/rasmussen-facebook-google/">Lars Rasmussen, Father Of Google Maps And Google Wave, Heads To Facebook</a>) is, as MG Siegler of TechCrunch points out, “the biggest one since Chrome OS lead <a href="http://www.crunchbase.com/person/matthew-papakipos">Matthew Papakipos</a> made <a href="http://techcrunch.com/2010/06/28/closing-in-on-chrome-os-launch-key-architect-matthew-papakipos-jumps-to-facebook/">the same jump in June</a>” (TechCrunch also notes “current Facebook CTO <a href="http://www.crunchbase.com/person/bret-taylor">Bret Taylor</a> was heavily involved in the launch of Google Maps”).</p>
<p>These moves have drawn my particular attention, as did <a href="http://www.youtube.com/watch?v=ZqDYjA5RGCU&amp;p=6F97A6F4BA797FB3" target="_blank">Bret Taylor’s response in his conversation with Brady Forrest at Web 2.0 Expo</a> to Brady’s question, <b>“How soon until we get the Facebook firehose?”</b></p>
<p>If you have been reading UgoTrade you already know how important I think an open, distributed standard for real-time communications, such as the very innovative Wave Federation Protocol, could be for AR development. See <a href="http://www.arwave.org/" target="_blank">ARWave</a> and <a href="http://www.mobilemonday.nl/talks/tish-shute-the-next-wave-of-ar/" target="_blank">my presentation at MoMo13, Amsterdam</a> last year, <a rel="bookmark" href="http://www.ugotrade.com/2009/11/19/the-next-wave-of-ar-mobile-social-interaction-right-here-right-now/" target="_blank">The Next Wave of AR: Mobile Social Interaction Right Here, Right Now!</a></p>
<p>The anticipated release of <a href="http://googlewavedev.blogspot.com/2010/09/wave-open-source-next-steps-wave-in-box.html" target="_blank">Wave in a Box</a> has raised hopes in the developer community that the Wave Federation Protocol will soon become easier to work with, and hopefully more widely adopted. Like many others, I wonder what will happen to <a href="http://googlewavedev.blogspot.com/2010/09/wave-open-source-next-steps-wave-in-box.html" target="_blank">Wave in a Box</a> now.</p>
<p>But the innovation of Wave is deep and broad (and, as many have pointed out, hugely ambitious). It is perhaps the boldest attempt yet to innovate both at the low level of architecture (where Google is so powerful) and at the high level of <b>the Mark Zuckerberg “big idea,” which, as Tim O’Reilly notes, is, “What is the future of social?”</b> MG Siegler noted that <a title="Facebook Groups Is Sort Of Like Google Wave For Human Beings" rel="bookmark" href="http://techcrunch.com/2010/10/07/facebook-groups-google-wave/">Facebook Groups Is Sort Of Like Google Wave For Human Beings</a>.</p>
<p>But I deeply hope that the open, distributed standard part of the Wave big idea is not lost in the mix here.</p>
<h3>Fourth Cylinder of Innovation: Keep the Ecosystem Going, Create More Value than You Capture</h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/Screen-shot-2010-10-27-at-1.56.15-AM.png"><img class="alignnone size-medium wp-image-5931" title="Screen shot 2010-10-27 at 1.56.15 AM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/Screen-shot-2010-10-27-at-1.56.15-AM-300x181.png" alt="Screen shot 2010-10-27 at 1.56.15 AM" height="181" width="300"></a></p>
<p><i>The Points of Control map is interactive, so please <a href="http://map.web2summit.com/" target="_blank">click here</a> or on the image above for the full experience.</i></p>
<p>Tim O’Reilly points out that there is a worrisome dark side to the Points of Control map; see <a href="http://www.youtube.com/watch?v=3637xFBvkYg&amp;p=6F97A6F4BA797FB3" target="_blank">Tim’s keynote here</a>. To paraphrase some of his points:</p>
<p>There are companies on the map that are forgetting to think about creating a sustainable ecosystem. Rather than growing the pie, they are trying to divide it up, and that threatens to cause the fourth cylinder of innovation to misfire. This fourth cylinder is essential to the ecosystem.</p>
<p>Tim O’Reilly looks back to the lessons of the personal computing industry, which was incredibly vital and creative, and lots of people made money until a couple of big players <b>“sucked all the air out of the ecosystem”</b> and innovation had to go elsewhere.</p>
<p>The power of platforms is to create value not just for your company but for other people. Create value for yourself by creating value for other people. Tim O’Reilly used the wonderful example of Henry Ford inventing the weekend so that there would be enough people with the time and money to buy his mass-produced cars. Think about building the ecosystem that will support the future you are going to build. Grow the pie rather than cut up the pie. This will be the vital fourth cylinder of innovation in a <a href="http://www.cloudera.com/company/press-center/hadoop-world-nyc/" target="_blank">Web Squared</a> world.</p>
<p>Tim O’Reilly has long proposed that <a href="http://www.oreillynet.com/go/web2">Web 2.0 is all about harnessing collective intelligence</a>. But Gartner predicts, “By year end 2012, physical sensors will create 20 percent of non-video internet traffic.” Yet another previously unevenly distributed future is going mainstream, and if you haven’t read it already, now is the time to read this paper by Tim O’Reilly and John Battelle, <a href="http://www.web2summit.com/web2009/public/schedule/detail/10194" target="_blank">Web Squared: Web 2.0 Five Years On</a>.</p>
<h3>The Consequences of Living in a World of Data</h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/Dataarmsrace.jpg"><img class="alignnone size-medium wp-image-5817" title="Dataarmsrace" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/Dataarmsrace-300x199.jpg" alt="Dataarmsrace" height="199" width="300"></a></p>
<p>To bring this very long post to a close, here are just a few of the key questions regarding the consequences of living in a world of data that Tim O’Reilly raised during his keynote at Hadoop World:</p>
<p><b>“How would we solve the problem of digital identity in the age of sensors?” (Our smartphones are able to know their users by the way they walk — their gait!)</b></p>
<p><b>“How will we input data when our devices are smart enough to listen on their own?”</b></p>
<p><b>“How should we think about privacy in a world where data can be triangulated?”</b></p>
<p><b>“We are moving to a world in which every device generates useful data, in which every action creates information shadows on the net.”</b></p>
<p><b>“Shouldn’t we regulate the misuse of data rather than the possession of it?”</b></p>
<p><b>“How do we avoid a data arms race?”</b></p>
<p><b>“Create more value than you capture.”</b></p>
]]></content:encoded>
			<wfw:commentRss>http://www.ugotrade.com/2010/10/31/tim-o%e2%80%99reilly%e2%80%99s-four-cylinder-innovation-engine-the-missing-manual-for-the-future/feed/</wfw:commentRss>
		<slash:comments>8</slash:comments>
		</item>
		<item>
		<title>Urban Augmented Realities and Social Augmentations that Matter: Talking with Bruce Sterling, Part 2</title>
		<link>http://www.ugotrade.com/2010/09/17/urban-augmented-realities-and-social-augmentations-that-matter-interview-with-bruce-sterling-part-2/</link>
		<comments>http://www.ugotrade.com/2010/09/17/urban-augmented-realities-and-social-augmentations-that-matter-interview-with-bruce-sterling-part-2/#comments</comments>
		<pubDate>Fri, 17 Sep 2010 21:43:35 +0000</pubDate>
		<dc:creator><![CDATA[Tish Shute]]></dc:creator>
				<category><![CDATA[Ambient Devices]]></category>
		<category><![CDATA[Ambient Displays]]></category>
		<category><![CDATA[Android]]></category>
		<category><![CDATA[architecture of participation]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Augmented Reality]]></category>
		<category><![CDATA[digital public space]]></category>
		<category><![CDATA[Instrumenting the World]]></category>
		<category><![CDATA[internet of things]]></category>
		<category><![CDATA[Mixed Reality]]></category>
		<category><![CDATA[mobile augmented reality]]></category>
		<category><![CDATA[mobile meets social]]></category>
		<category><![CDATA[Mobile Reality]]></category>
		<category><![CDATA[Mobile Technology]]></category>
		<category><![CDATA[new urbanism]]></category>
		<category><![CDATA[social gaming]]></category>
		<category><![CDATA[social media]]></category>
		<category><![CDATA[sustainable mobility]]></category>
		<category><![CDATA[ubiquitous computing]]></category>
		<category><![CDATA[Web 2.0]]></category>
		<category><![CDATA[websquared]]></category>
		<category><![CDATA[World 2.0]]></category>
		<category><![CDATA[3D point clouds]]></category>
		<category><![CDATA[an ARG for World Peace]]></category>
		<category><![CDATA[Anselm Hook]]></category>
		<category><![CDATA[AR Wave]]></category>
		<category><![CDATA[are2010]]></category>
		<category><![CDATA[ARWave Android client]]></category>
		<category><![CDATA[ARWave at Software Freedom Day]]></category>
		<category><![CDATA[augmented foraging]]></category>
		<category><![CDATA[augmented reality checkins]]></category>
		<category><![CDATA[augmented reality event]]></category>
		<category><![CDATA[Bertine van Hovell]]></category>
		<category><![CDATA[Biological Globalisation]]></category>
		<category><![CDATA[Boskoi]]></category>
		<category><![CDATA[Bruce Sterling]]></category>
		<category><![CDATA[Crisis Filter]]></category>
		<category><![CDATA[cryptoforests]]></category>
		<category><![CDATA[Davide Carnovale]]></category>
		<category><![CDATA[deterritorialization]]></category>
		<category><![CDATA[difference between augmented reality and ubiquitous computing]]></category>
		<category><![CDATA[emergency response]]></category>
		<category><![CDATA[Favela Chic]]></category>
		<category><![CDATA[fightthegooglejugend]]></category>
		<category><![CDATA[Four Square]]></category>
		<category><![CDATA[Gamepocalypse]]></category>
		<category><![CDATA[gardens gone wild]]></category>
		<category><![CDATA[Gene Becker]]></category>
		<category><![CDATA[google goggles]]></category>
		<category><![CDATA[Google Wave]]></category>
		<category><![CDATA[gowalla]]></category>
		<category><![CDATA[homophilies]]></category>
		<category><![CDATA[hyperlocal experiences]]></category>
		<category><![CDATA[interview with Bruce Sterling]]></category>
		<category><![CDATA[JCPT the open Android 3D engine]]></category>
		<category><![CDATA[Jesse James Garrett]]></category>
		<category><![CDATA[Jesse Schell]]></category>
		<category><![CDATA[Joshua Kauffman]]></category>
		<category><![CDATA[Ken Eklund]]></category>
		<category><![CDATA[Kooaba]]></category>
		<category><![CDATA[Layar]]></category>
		<category><![CDATA[Lightning Laboratories]]></category>
		<category><![CDATA[location based social networking]]></category>
		<category><![CDATA[Maarten Lens-FitzGerald]]></category>
		<category><![CDATA[machine intelligence]]></category>
		<category><![CDATA[machine learning]]></category>
		<category><![CDATA[Mark Evin]]></category>
		<category><![CDATA[Markus Strickler]]></category>
		<category><![CDATA[NextHope]]></category>
		<category><![CDATA[NextHope AMD]]></category>
		<category><![CDATA[Occipital]]></category>
		<category><![CDATA[open distributed augmented reality]]></category>
		<category><![CDATA[open distributed platform for AR]]></category>
		<category><![CDATA[physical world platform]]></category>
		<category><![CDATA[proximity-based social networking]]></category>
		<category><![CDATA[psychogeography]]></category>
		<category><![CDATA[real-time information brokerages]]></category>
		<category><![CDATA[realtime information brokerages]]></category>
		<category><![CDATA[Shaping Things]]></category>
		<category><![CDATA[ShapingThings]]></category>
		<category><![CDATA[Sixth Sense for Autism]]></category>
		<category><![CDATA[SMSSlingshot]]></category>
		<category><![CDATA[social augmented experiences]]></category>
		<category><![CDATA[Social Augmented Experiences that Matter]]></category>
		<category><![CDATA[social mapping]]></category>
		<category><![CDATA[Software Freedom Day]]></category>
		<category><![CDATA[Swift]]></category>
		<category><![CDATA[territorialization]]></category>
		<category><![CDATA[The Cryptoforests of Utrecht]]></category>
		<category><![CDATA[Thomas Wrobel]]></category>
		<category><![CDATA[Tonchidot]]></category>
		<category><![CDATA[Ubistudio]]></category>
		<category><![CDATA[urban augmented realities]]></category>
		<category><![CDATA[Urban Edibles Amsterdam]]></category>
		<category><![CDATA[urban fallows]]></category>
		<category><![CDATA[urban forests]]></category>
		<category><![CDATA[urban informatic mapping]]></category>
		<category><![CDATA[urban informatics]]></category>
		<category><![CDATA[Ushahidi]]></category>
		<category><![CDATA[vision assisted augmented reality]]></category>
		<category><![CDATA[vision based augmented reality]]></category>
		<category><![CDATA[visual search]]></category>
		<category><![CDATA[Wave in a Box]]></category>
		<category><![CDATA[WaveinaBox]]></category>
		<category><![CDATA[Westraven Psychogeography]]></category>
		<category><![CDATA[Will Wright at Augmented Reality Event]]></category>
		<category><![CDATA[YDreams]]></category>
		<category><![CDATA[Zorop]]></category>

		<guid isPermaLink="false">http://www.ugotrade.com/?p=5627</guid>
		<description><![CDATA[Social Augmented Experiences leveraging geoawareness and human and machine intelligence to create real time information brokerages, combined with an augmented reality view, can create a new opportunities to reimagine our relationships with each other and our environment. This Summer, I have been on a blogging hiatus, which has meant I haven&#8217;t been sharing as frequently [&#8230;]]]></description>
				<content:encoded><![CDATA[<p><strong><strong><span> </span></strong></strong><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/09/augmentedforaging1.jpg"><img class="alignnone size-medium wp-image-5651" title="augmentedforaging" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/09/augmentedforaging1-200x300.jpg" alt="augmentedforaging" width="200" height="300" /></a><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/09/westraven81.JPG"><img class="alignnone size-medium wp-image-5652" title="westraven8" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/09/westraven81-225x300.jpg" alt="westraven8" width="225" height="300" /></a></p>
<p>Social Augmented Experiences leveraging geoawareness and human and machine intelligence to create real time information brokerages, combined with an augmented reality view, can create new opportunities to reimagine our relationships with each other and our environment.</p>
<p>This summer, I have been on a blogging hiatus, which has meant I haven&#8217;t been sharing as frequently, and, unfortunately, the second halves of two conversations I had earlier this year, both of which have much influenced my thinking on social augmented reality, have languished in private mode: Part 2 of my talk with Bruce Sterling (see <a title="Permanent Link to Interview with Bruce Sterling, Part I: At the 9am of the Augmented Reality Industry, are2010" rel="bookmark" href="../../2010/06/16/interview-with-bruce-sterling-part-i-at-the-9am-of-the-augmented-reality-industry-are2010/">Interview with Bruce Sterling, Part I: At the 9am of the Augmented Reality Industry, are2010</a>), and Part 2 of my conversation with Anselm Hook (see <a title="Permanent Link to Visual Search, Augmented Reality and a Social Commons for the Physical World Platform: Interview with Anselm Hook" rel="bookmark" href="../../2010/01/17/visual-search-augmented-reality-and-a-social-commons-for-the-physical-world-platform-interview-with-anselm-hook/">Visual Search, Augmented Reality and a Social Commons for the Physical World Platform: Interview with Anselm Hook, Part 1</a>). Time to get caught up on some blogging! The lightly edited transcript of Part 2 of <a href="#tag1">my conversation with Bruce Sterling is posted in full below</a>.</p>
<p>Bruce Sterling has been blogging all the key developments in augmented reality (amongst other topics of interest!) on <a href="http://www.wired.com/beyond_the_beyond/" target="_blank">his Wired blog</a>, and <a href="http://www.wired.com/beyond_the_beyond/2010/08/augmented-reality-augmented-foraging/" target="_blank">he brought my attention</a> to <a href="http://libarynth.org/augmented_foraging">Boskoi</a>, the <a title="http://www.ushahidi.com/" rel="nofollow" href="http://www.ushahidi.com/">Ushahidi</a>-based <a href="http://lib.fo.am/augmented_foraging" target="_blank">augmented foraging</a> app for Android phones, pictured in use above &#8211; for more pics see <a href="http://fightthegooglejugend.com/index.html" target="_blank">fightthegooglejugend</a>.</p>
<h3>Augmented Reality and Real Time Information Brokerages</h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/08/Screen-shot-2010-08-28-at-12.53.54-AM.png"><img class="alignnone size-medium wp-image-5630" title="Screen shot 2010-08-28 at 12.53.54 AM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/08/Screen-shot-2010-08-28-at-12.53.54-AM-300x176.png" alt="Screen shot 2010-08-28 at 12.53.54 AM" width="300" height="176" /></a></p>
<p><em>Pictured above is the path the &#8220;nomads&#8221; took through the Westraven cryptoforest with Pieter Bol, co-author of the book <a href="http://www.biologicalglobalisation.com/">Biological Globalisation</a>, and Theun Karelse of <a href="http://urbanedibles.blogspot.com/">Urban Edibles Amsterdam</a>, &#8220;who presented his &#8216;augmented foraging&#8217; app <a href="http://libarynth.org/augmented_foraging">Boskoi</a>.&#8221; For more see <a href="http://fightthegooglejugend.com/cryptoforests.html" target="_blank">The Cryptoforests of Utrecht</a> and <a href="http://fightthegooglejugend.com/westraven.html" target="_blank">Westraven Psychogeography, 6 June 2010</a>. Note &#8211; cryptoforests: 1) urban forests hidden from view; 2) urban fallows that might or might not be considered as forests; 3) gardens gone wild.</em></p>
<p><strong> </strong></p>
<p>My interest in the Ushahidi family of ideas was already fired up by a conversation with <a href="http://www.hook.org/" target="_blank">Anselm Hook</a> early this year. We discussed a number of <a href="http://vimeo.com/ushahidi">Ushahidi</a>-related projects &#8211; <a href="http://swift.ushahidi.com/" target="_blank">Swift</a>, Crisis Filter, and Anselm&#8217;s project <a href="http://hook.org/" target="_blank">Angel</a> &#8211; augmented reality, and my own keen interest in an open, real time, distributed platform for augmented reality: <a href="http://www.arwave.org/" target="_blank">ARWave</a>.</p>
<p>The Ushahidi platform and the related project Swift have pioneered the real time brokerage of information, with people acting in curatorial or matchmaking roles coevolving with machine-assisted matching to connect wants to haves. Ushahidi uses multiple gateways, including SMS and Twitter. But the Ushahidi family of ideas is especially interesting when combined with augmented reality, and suggests many new possibilities for social augmented experiences &#8211; as Anselm pointed out, for human-to-human communication, human-to-civilization communication, and human-to-environment communication (e.g., perhaps, how machine intelligence can help bridge the difference in time scale that Kate Hartman explores in her <a href="http://vimeo.com/10352604">Research for Glacier-Human Communication Techniques</a>).</p>
<p>Ushahidi, which means &#8220;testimony&#8221; in Swahili, is a website that was initially developed to map reports of violence in Kenya after the post-election fallout at the beginning of 2008. It is now an open platform with a wide range of applications and a growing developer community. See <a href="http://vimeo.com/7838030">What is the Ushahidi Platform?</a> from <a href="http://vimeo.com/ushahidi">Ushahidi</a> on <a href="http://vimeo.com/">Vimeo</a>.</p>
<p><a href="http://swift.ushahidi.com/" target="_blank">Swift</a> &#8211; a project that emerged from the Ushahidi dev community &#8211; is a human-sensor/real-time brokerage for dealing with emergencies, enabling the filtering and verification of real-time data from channels such as Twitter, SMS, email, and RSS feeds.</p>
<p><a href="http://libarynth.org/augmented_foraging">Boskoi</a> &#8211; <a href="http://lib.fo.am/augmented_foraging" target="_blank">augmented foraging</a> &#8211; <span>is the first app I have seen that begins linking Ushahidi with augmented reality, although I don&#8217;t think a full augmented view for Boskoi has been developed yet.</span></p>
<h3><strong>&#8220;The whole point of AR is to see things from a different point of view&#8230;&#8221;</strong></h3>
<p><strong><br />
</strong></p>
<p><strong><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/09/ARWaveCurrentStatus3post.png"><img class="alignnone size-medium wp-image-5705" title="ARWaveCurrentStatus3post" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/09/ARWaveCurrentStatus3post-300x212.png" alt="ARWaveCurrentStatus3post" width="300" height="212" /></a><br />
</strong></p>
<p><strong> </strong></p>
<p><em>Click to enlarge poster from upcoming ARWave demo at Software Freedom Day &#8211; for more see below</em></p>
<p>I am often asked what augmented reality brings to the table with respect to location-based social networking, which is on the verge of going mainstream in smartphone apps like <a href="http://foursquare.com/">Foursquare</a>. The first part of my answer is usually to explain what is unique to augmented reality.</p>
<p>As Bo Begole notes, the full vision of AR requires machine perception technologies to detect the identity and physical configuration of objects relative to each other, to accurately project information alongside or overlaid on a physical object (see this post on the PARC Blog by Bo Begole on the <a href="http://bit.ly/9Rsh79">difference between AR and ubiquitous computing</a> &#8211; thank you <a href="http://gamesalfresco.com/2010/09/12/weekly-linkfest-62/" target="_blank">Rouli for bringing my attention to this</a>).</p>
<p>But it is only in recent months that we have begun to see the kinds of tools that make this possible become freely available to developers &#8211; see <a href="http://www.ugotrade.com/2010/08/05/vision-based-augmented-reality-ar-in-smart-phones-qualcomms-ar-sdk-interview-with-jay-wright/" target="_blank">my interview with Jay Wright of Qualcomm here</a>. Also see this post on <a href="http://phototour.cs.washington.edu/bundler/" target="_blank">Bundler: Structure from Motion for Unordered Image Collections</a>, an open source system that creates 3D point clouds from unordered image collections, e.g. internet photo collections. We now have many tools available to move mobile augmented reality beyond the recent crop of apps relying on GPS and compass alone for positioning, into a new era of vision-assisted AR apps that will increasingly bring the full vision of AR into our daily lives.</p>
<p>Further, the integration of visual search applications like <a href="http://www.google.com/mobile/goggles/#text">Google Goggles</a> and <a href="http://www.kooaba.com/">Kooaba</a>, which can detect the identity of particular objects, will add another vital tool to machine perception technologies &#8211; enabling AR &#8220;checkins&#8221; on potentially anything in the physical world around us, and adding more fuel to the <a href="http://gamepocalypsenow.blogspot.com/">Gamepocalypse</a> (e.g. it would be easy to turn every trash can in the city into a basketball hoop, as we discussed at the <a href="http://www.meetup.com/ARNY-Augmented-Reality-New-York/" target="_blank">ARNY</a> meetup last month). And soon, the Pandora&#8217;s Box of facial recognition (Google Goggles has the capability, though it is not yet released to the public) will open up.</p>
<p>Jesse Schell described the importance of AR in a nutshell <a href="http://augmentedrealityevent.com/2010/08/25/are2010-keynote-by-jesse-schell-augmented-reality-will-define-the-21st-century/" target="_blank">in his keynote for are2010</a>:</p>
<p><strong>&#8220;The whole point of AR is to see things from a different point of view&#8230; How can there be a more powerful art form than one that actually changes what you see?&#8221;</strong></p>
<p>But how AR matures as a social experience will be the key to Jesse&#8217;s suggestion that:</p>
<p><strong>&#8220;Augmented Reality will be one of the things that fundamentally define the 21st century.&#8221;</strong></p>
<p>There are many interesting forms of AR that are not reliant on a tight registration between media and physical objects &#8211; several are put forward by Bruce in the conversation below. And it is likely we will see AR eyewear as an occasional, useful accessory to a smartphone long before we have sexy, affordable augmented reality eyewear that we wear throughout the day. <a href="http://www.yankodesign.com/2010/08/31/speech-to-text-glasses/" target="_blank">These speech-to-text glasses</a> would be a very useful and viable smartphone accessory right now for the hearing impaired.</p>
<p>For the moment, as Bruce notes, some of the most interesting and useful augmented experiences to date have not been in the cell phone space:</p>
<p><strong>&#8220;There are other aspects of AR besides the cell phone space. There&#8217;s Total Immersion&#8217;s big display screens. There&#8217;s the web-based fiduciary stuff. And there&#8217;s projection mapping. And then there&#8217;s experience design just for people who need their reality augmented for whatever personal or social reason.&#8221;</strong></p>
<p>One of my favorite social AR experiences is this <a href="http://www.youtube.com/watch?v=oLnKSKaY1Yw&amp;feature=player_embedded" target="_blank">SMS Slingshot</a>.</p>
<p>But I have been excited for a long while about the intersection of mobile social augmented reality, real time communications, and ubiquitous computing &#8211; see <a title="Permanent Link to Total Immersion and the &#8220;Transfigured City:&#8221; Shared Augmented Realities, the &#8220;Web Squared Era,&#8221; and Google Wave" rel="bookmark" href="../../2009/09/26/total-immersion-and-the-transfigured-city-shared-augmented-realities-the-web-squared-era-and-google-wave/">Total Immersion and the &#8220;Transfigured City:&#8221; Shared Augmented Realities, the &#8220;Web Squared Era,&#8221; and Google Wave</a>. And I have described in many places why I think real time, open, distributed communications for AR are so important to developing social augmented experiences &#8211; see <a href="http://www.slideshare.net/TishShute/ar-wave-a-proof-of-concept-federation-game-dynamics-semantic-search-mobile-social-communications" target="_blank">the slides for my talk at Augmented Reality Event here</a>, <a href="../../2010/04/02/ar-wave-at-where-2-0-exploring-social-augmented-experiences/" target="_blank">here</a>, and <a href="http://www.mobilemonday.nl/talks/tish-shute-the-next-wave-of-ar/" target="_blank">here</a> for starters.</p>
<p><strong><br />
</strong></p>
<h3><strong> ARWave at Software Freedom Day 2010, September 18th 2010<br />
</strong></h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/09/Screen-shot-2010-09-17-at-12.12.02-PM.png"><img class="alignnone size-medium wp-image-5683" title="Screen shot 2010-09-17 at 12.12.02 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/09/Screen-shot-2010-09-17-at-12.12.02-PM-300x38.png" alt="Screen shot 2010-09-17 at 12.12.02 PM" width="300" height="38" /></a></p>
<p>Thomas Wrobel and Bertine van Hovell will demo the first ARWave Android client <a href="http://www.sfd2010.nl/" target="_blank">at Software Freedom Day this weekend</a>!</p>
<p>A number of people (including Bruce) have asked me: what will be the future of ARWave now that Google Wave is no longer a stand-alone application? The recently announced release of <a href="http://googlewavedev.blogspot.com/2010/09/wave-open-source-next-steps-wave-in-box.html" target="_blank">Wave in a Box</a> (see <a href="http://arstechnica.com/web/news/2010/09/google-sticks-wave-source-in-a-box-sticks-a-bow-on-top.ars" target="_blank">here</a> and <a href="http://www.readwriteweb.com/archives/google_announces_wave_in_a_box.php" target="_blank">here</a>) is very exciting for the ARWave team.</p>
<p>The ARWave Android client is the first open AR client built on an open, real time, distributed platform &#8211; based on a server that anyone can download and set up, currently the &#8220;FedOne&#8221; server, though Wave in a Box will hopefully be even easier to deploy. Wave in a Box seems perfect for ARWave&#8217;s needs &#8211; for more, <a href="https://groups.google.com/group/wave-protocol/browse_thread/thread/70067fc740b4c8d3" target="_blank">see the WiaB Google Group here</a>. For more information on the ARWave client, click to enlarge the poster below, see the <a href="http://arwave.org/pages/Videos.php" target="_blank">ARWave concept video here</a>, and to learn how to get involved see <a href="http://arwave.org/new_index.php" target="_blank">arwave.org</a>. Props to <a href="http://www.lostagain.nl/#" target="_blank">Thomas Wrobel and Bertine van Hovell</a> (posters below from the demo for Software Freedom Day), Mark Evin, <a href="http://twitter.com/need2revolt" target="_blank">Davide Carnovale</a>, and <a href="http://twitter.com/kusako" target="_blank">Markus Strickler</a> for all their hard and brilliant work on ARWave. Thanks also to <a href="http://www.jpct.net/" target="_blank">jPCT, the open Android 3D engine</a>, which has saved a lot of work!</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/09/ARWaveCurrentStatus1post.png"><img class="alignnone size-medium wp-image-5687" title="ARWaveCurrentStatus1post" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/09/ARWaveCurrentStatus1post-212x300.png" alt="ARWaveCurrentStatus1post" width="212" height="300" /></a></p>
<p><em>click to enlarge slide</em></p>
<h3><strong>Social Augmented Experiences that Matter</strong></h3>
<p>My ideas on the future of social augmented experience have been deeply informed by the conversations I had with Bruce Sterling and Anselm Hook this year.</p>
<p>Bruce Sterling notes in the conversation below that location-based social apps like Foursquare are interesting because they are not <strong>&#8220;urban geography like Google&#8217;s satellite stare from above,&#8221;</strong> but rather <strong>&#8220;groups of citizens are doing portraits of their own region.&#8221;</strong> Augmented reality, with its oft-lauded power to make the invisible visible, is of course the ideal tool for taking &#8220;citizen portraits&#8221; to the next level. Cory Doctorow described to me three years ago (<a href="http://www.ugotrade.com/2007/10/31/cory-doctorow-a-reverse-surveillance-society/" target="_blank">see here</a>) an &#8220;inverse surveillance society&#8221; enabled by an augmented view &#8211; &#8220;<strong>where all the data from the positional and temporal characteristics of all the objects that we own were in aggregate visible and available so that we can mix and match them, remix them, understand them, and have more agency in the world.&#8221;</strong></p>
<p>It is very cool to go back and reread <a href="http://www.ugotrade.com/2007/10/31/cory-doctorow-a-reverse-surveillance-society/" target="_blank">this conversation</a> now that it is becoming possible to build the kinds of apps Cory described, and Bruce Sterling envisioned in <strong><a href="http://mitpress.mit.edu/catalog/item/default.asp?tid=10603&amp;ttype=2" target="_blank">Shaping Things</a></strong> (see page 111).</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/09/shapingthings.jpg"><img class="alignnone size-thumbnail wp-image-5689" title="shapingthings" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/09/shapingthings-150x150.jpg" alt="shapingthings" width="150" height="150" /></a></p>
<p><em>click to enlarge</em></p>
<p>My conversation with Bruce earlier this summer (see below) took place on the heels of <a href="http://augmentedrealityevent.com/">are2010 &#8211; Augmented Reality Event</a>. <a href="http://augmentedrealityevent.com/2010/06/06/are-2010-keynote-by-bruce-sterling-build-a-big-pie/" target="_blank">See the video of Bruce&#8217;s keynote, &#8220;Build a Big Pie,&#8221; here</a>, and the <a href="http://augmentedrealityevent.com/2010/08/25/are2010-keynote-by-jesse-schell-augmented-reality-will-define-the-21st-century/" target="_blank">final keynote, &#8220;Seeing,&#8221; by Jesse Schell (see video here)</a>, in which Jesse riffed on AR and the man with the X-ray eyes. Both these awesome talks are still fresh in my mind. Bruce noted how we should pay attention to augmentations for people and situations that could really use some augmentation&#8230; and not get too fixated on the coming of AR goggles. He elaborated on this in our conversation (again, full transcript below):</p>
<p><strong>&#8220;Well, it&#8217;s a matter of deciding whose reality it is that you&#8217;re trying to augment. I&#8217;m not trying to be a bleeding heart about it, but obviously there are people in our society right now with reality that could really use some augmentation. They are mostly disadvantaged people. They are vision impaired, or maybe they have autism. They might be senile and just can&#8217;t remember where they put their shoes. These are people who could really use some help, right?&#8221;</strong></p>
<p><strong>&#8220;So, start with people who really need sensory or cognitive help. Before you turn our geeks into Superman, why don&#8217;t you try turning some people who are harmed into more functional individuals? Then you&#8217;ll be able to learn how to do that. Then maybe you can ramp it up to these Nietzschean heights of the superb Man With the X-ray Eyes. Whatever.&#8221;</strong></p>
<p>What will make AR interesting and useful long before, and long after, we see the full vision of AR eyewear manifest is its social aspects. Bruce points out:</p>
<p><strong>&#8220;My argument would be that if you want people to be more sensitive toward certain, say, issues and problems, it&#8217;s better to find the people who are already sensitive to those issues and problems, and give them a bigger stake in your augmentation system.&#8221;</strong></p>
<p><strong>&#8220;Say that I am really worried about public health. Well, if I have a lot of nurses that are using my system, people who are aware of my issues, then I could be walking around and I&#8217;ll see a lot more tags saying, &#8216;This is where he got food poisoning!&#8217; &#8216;In this shooting gallery, many people have caught AIDS!&#8217; Or, you know, &#8216;Tuberculosis has been spotted over here in this building.&#8217;</strong></p>
<p><strong>&#8220;At that point, I could simply share their knowledge and get some social intelligence. As opposed to trying to amp the basements of my little hacker-mind and drag stuff up that&#8217;s escaped my conscious attention.&#8221;</strong></p>
<p>Finding new ways to broker information &#8211; bringing together needs with haves, and different participants, empowered and disempowered &#8211; is, as Anselm discussed with me, one way to change our view of human-to-human, human-to-environment, and human-to-civilization communication (particularly in light of the &#8220;sobering account of how open data is used against the poor in Bangalore&#8221; that <a href="http://twitter.com/timoreilly/status/23179898934" target="_blank">@timoreilly noted</a> recently: <a href="http://gurstein.wordpress.com/2010/09/02/open-data-empowering-the-empowered-or-effective-data-use-for-everyone/" target="_blank">Open Data: Empowering the Empowered or Effective Data Use for Everyone?</a>).</p>
<p>The key idea in a crisis filter, Anselm noted, was to break up the participants into different kinds, to connect wants with haves:</p>
<p><strong>&#8220;There are people who are in a situation. We call them citizens. And then there are reporters, people who report situations back to Twitter. And then there are curators, people that canvass Twitter looking for important tweets. And then there are first responders, people who take the curated collection of responses and then act on them.&#8221;</strong></p>
<p>This kind of brokerage between people acting in curatorial or matchmaking roles can be extended into, and coevolve with, machine-assisted matching, as Anselm explains.</p>
<p>It is also a vital part of creating social augmented experiences that matter.</p>
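<p>The role-based brokerage Anselm describes &#8211; citizens report wants, curators verify, responders with matching haves act &#8211; can be sketched in a few lines of code. This is purely a hypothetical illustration of the workflow; the names (<code>Report</code>, <code>Responder</code>, <code>Brokerage</code>) are mine and do not come from any Ushahidi or Swift API:</p>

```python
from dataclasses import dataclass

# Hypothetical sketch of a wants/haves brokerage, loosely modeled on the
# roles described above: citizens report wants, a curator verifies them,
# and responders whose haves cover a verified want are matched to it.

@dataclass
class Report:
    author: str        # the citizen in the situation
    want: str          # what they need, as a simple tag ("gas", "water")
    location: str
    verified: bool = False

@dataclass
class Responder:
    name: str
    haves: set         # tags of resources this responder can offer

class Brokerage:
    def __init__(self):
        self.reports = []
        self.responders = []

    def report(self, r):
        self.reports.append(r)

    def register(self, resp):
        self.responders.append(resp)

    def curate(self, index):
        # a human curator marks a report as verified
        self.reports[index].verified = True

    def match(self):
        # machine-assisted matching: pair each verified want with
        # every responder whose haves cover it
        return [(r, resp)
                for r in self.reports if r.verified
                for resp in self.responders if r.want in resp.haves]

broker = Brokerage()
broker.report(Report("citizen1", "gas", "I-80 mile 12"))
broker.register(Responder("responder1", {"gas", "tow"}))
broker.curate(0)                      # curator verifies the report
matches = broker.match()
print([(r.author, resp.name) for r, resp in matches])
```

Note that unverified reports never reach responders &#8211; the curator sits between the raw stream and the match step, which is the point of the human-in-the-loop design.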
<p>One of Anselm Hook&#8217;s projects, <a href="http://hook.org/" target="_blank">Angel</a>, is the most radical expression of connecting wants with haves, in that the idea is that &#8220;you have a situation, you broadcast that situation, and help magically appears. You don&#8217;t even sign up for a service. You just get help&#8230;&#8221;</p>
<p>This is the same idea of a brokerage for dealing with emergencies, but applied to the long tail of crisis response. As Anselm describes it:</p>
<p><strong>&#8220;I am interested in personal crisis. &#8216;I lost my cat. Help. I can&#8217;t find where my kid is. I am out of gas. I have a flat tire. My house is on fire. My aunt is trapped in the bedroom.&#8217; The kind of personal crisis that is just as important, but is not enough to get a national movement to help you&#8230;&#8221;</strong></p>
<p>I will publish this conversation with Anselm in full in an upcoming post.</p>
<h3>Zorop &#8211; an ARG for World Peace</h3>
<p>If you want to be part of a really exciting experiment to reimagine our relationships with each other and can be in San Jose this weekend, I highly recommend exploring <a href="http://zorop.org" target="_blank">this &#8220;rabbit hole&#8221;</a>.</p>
<p><object classid="clsid:d27cdb6e-ae6d-11cf-96b8-444553540000" width="640" height="385" codebase="http://download.macromedia.com/pub/shockwave/cabs/flash/swflash.cab#version=6,0,40,0"><param name="allowFullScreen" value="true" /><param name="allowscriptaccess" value="always" /><param name="src" value="http://www.youtube.com/v/czUpYfme0kg?fs=1&amp;hl=en_US" /><param name="allowfullscreen" value="true" /><embed type="application/x-shockwave-flash" width="640" height="385" src="http://www.youtube.com/v/czUpYfme0kg?fs=1&amp;hl=en_US" allowscriptaccess="always" allowfullscreen="true"></embed></object></p>
<p>Thank you <a href="http://www.lightninglaboratories.com/tcw/about-2/" target="_blank">Gene Becker</a>, <a href="http://www.lightninglaboratories.com/" target="_blank">Lightning Laboratories</a> and <a href="http://ubistudio.org/" target="_blank">Ubistudio</a> for sending me this invite:</p>
<p><strong>&#8220;Ken  Eklund (<a href="http://twitter.com/writerguygames" target="_blank">@writerguygames</a>) is developing a wonderful game for the 01SJ  Biennial called ZOROP, aimed at creating World Peace(!). Some of you  might know Ken from his work on the amazing ARGs EVOKE and World Without  Oil. Anyway Ken, along with his collaborator Annette Mees, are  furiously working to get ZOROP ready to go for the Sept 17th premiere at  01SJ.</strong></p>
<p><strong>Are you intrigued? I thought so, and here are your next steps down the rabbit hole:</strong> <strong> </strong></p>
<p><strong>&gt; Check out </strong> <strong><a href="http://zorop.org/" target="_blank">http://zorop.org</a> to learn about the game</strong></p>
<p><strong>&gt; Follow @ZoropPrime to watch it unfold: </strong> <strong><a href="http://twitter.com/zoropprime" target="_blank">http://twitter.com/zoropprime</a></strong></p>
<p><strong>&gt; &#8216;Like&#8217; ZOROP on FB for a different view: </strong> <strong><a href="http://www.facebook.com/pages/Zorop/141140772593618" target="_blank">http://www.facebook.com/pages/Zorop/141140772593618</a></strong></p>
<p><strong>&gt; Become one with the game; consider volunteering as a Zoropathian: </strong> <strong><a href="mailto:curious@zorop.org">curious@zorop.org</a></strong></p>
<p><strong>&gt; Head down to San Jose on the 17th, play the game, and ride the ZOROP Mexican Party Bus. Seriously.&#8221;</strong></p>
<p><strong><br />
</strong></p>
<h3><strong>Interview with Bruce Sterling</strong><a name="tag1"></a></h3>
<p><a href="http://www.flickr.com/photos/brucesterling/4671866157/in/photostream/" target="_blank"><img class="alignnone size-medium wp-image-5676" title="Screen shot 2010-09-16 at 7.59.56 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/09/Screen-shot-2010-09-16-at-7.59.56-PM-300x180.png" alt="Screen shot 2010-09-16 at 7.59.56 PM" width="300" height="180" /></a></p>
<p><em>Click on image above to see video clip from <a href="http://www.flickr.com/photos/brucesterling/4673885122/" target="_blank">brucesflickr</a></em></p>
<p>[Note: the <a href="http://www.ugotrade.com/2010/06/16/interview-with-bruce-sterling-part-i-at-the-9am-of-the-augmented-reality-industry-are2010/" target="_blank">first part of this interview is here</a>. I broke it in anticipation of Part 2 just as I started experimenting with an interview technique suggested by <a href="http://www.linkedin.com/in/joshuakauffman" target="_blank">Joshua Kauffman</a> &#8211; an advisor and entrepreneur working on design in the public sphere &#8211; the All Souls College one-word question interview. Although apparently <a href="http://www.nytimes.com/2010/05/28/world/europe/28oxford.html" target="_blank">they recently scrapped it</a>, and I am not very good at sticking to a single word!]</p>
<p><strong>Tish Shute:</strong> We were talking about these proximity-based social networks like Foursquare and Gowalla and how they may influence the emergence of social augmented experiences.</p>
<p>So Joshua&#8217;s suggestion for the first word was &#8220;territorialization&#8221; &#8211; the observation that new mobile social experiences like Foursquare, rather than breaking down territorialization (which would be a good thing), actually tend to support it&#8230;</p>
<p><strong>Bruce Sterling: Yeah, they&#8217;re re-intensifying it in a very odd, electronic fashion.</strong></p>
<p><strong>Tish Shute:</strong> Yes.</p>
<p><strong>Bruce Sterling: It&#8217;s not true of projection mapping or the webcam fiduciary display stuff. But with the handheld stuff, and especially the urban informatic stuff, it really can&#8217;t help but take on a local flavor. <a href="http://www.layar.com/" target="_blank">Layar</a> is like &#8220;Augmented Dutch Reality.&#8221;</strong></p>
<p><strong>And <a href="http://www.tonchidot.com/" target="_blank">TonchiDot</a> is &#8220;Augmented Japanese Reality.&#8221; It&#8217;s hard to imagine a Layar interface going gangbusters in Tokyo. Whereas the TonchiDot interface, which is so clearly influenced by anime and cartoon graphics&#8230; Maybe it could find some niche of hipsters in Amsterdam hash bars&#8230;</strong></p>
<p><strong>Stuff that&#8217;s socially generated by people on the ground, as with Foursquare and Gowalla, is bound to take on a regional influence. Right? It&#8217;s like the New York hipsters who were early adopters of Foursquare. They&#8217;re not mapping New York! They&#8217;re mapping Hipster New York.</strong></p>
<p><strong>It&#8217;s all about Williamsburg and places where 24-year-olds go to drink&#8230; They found a demographic niche there. These guys are building the service for them. They&#8217;re people who are willing to work for Foursquare for free, because they want to wear the little king hat.</strong></p>
<p><strong>Tish Shute:</strong> I got the far far away badge &#8216;cos I live on the Upper West Side!</p>
<p><strong>Bruce Sterling: But that&#8217;s not urban geography, right? I mean, that&#8217;s not like Google&#8217;s satellite stare from above.  That&#8217;s a group of citizens doing a portrait of their own region.  You&#8217;re going to see interesting things happen because, of course, people who use Foursquare elsewhere are going to check into New York, and they&#8217;re going to look at the &#8220;New York Foursquare.&#8221;   They&#8217;re going to be aliens who interact with Foursquare people in New York and annotate what they&#8217;re seeing.</strong></p>
<p><strong>Tish Shute:</strong> Oh! Yes. Good point.</p>
<p><strong>Bruce Sterling: That Foursquare community has a certain &#233;migr&#233; soul. It&#8217;s different from the normal &#233;migr&#233; soul of simple tourists in New York. So your friend there is right about the territorialization.</strong></p>
<p><strong>Tish Shute:</strong> Yes, Joshua Kauffman is a smart guy! I am interested to see what interesting kinds of deterritorializations proximity-based social networks and the hyperlocal view of augmented reality might bring, not just the new territorializations.</p>
<p><strong>Bruce Sterling: It&#8217;s not the intense kind of territorialization, like gangs putting down graffiti markers and beating people up.  It&#8217;s an inherent regional character that comes with using peer production to build your database.</strong></p>
<p><strong>Tish Shute:</strong> We were discussing whether AR could break down the walls between people &#8211; people who share the same physical space but actually inhabit different territories, even if they are sitting at the table next to you.</p>
<p><strong>Bruce Sterling: You know, I just wrote an article for my Italian magazine column. I think I mentioned this to you &#8211; a report about ARE 2010.   I titled it, &#8220;Chicks Dig Augmented Reality.&#8221;</strong></p>
<p><strong>Tish Shute:</strong> [laughs]</p>
<p><strong>Bruce Sterling:   There is a very heavy social element to AR, and a phone based element. So the question is: Why would a woman wear a fiducial marker? Like our <a href="http://www.metaio.com/" target="_blank">Metaio</a> speaker at ARE2010 who had a fiducial marker on her lapel pin.</strong></p>
<p><strong>Tish Shute:</strong> Right. Lisa!</p>
<p><strong>Bruce Sterling: Why would a woman go out in public with her Facebook profile on her body?</strong></p>
<p><strong>Tish Shute: </strong>Well I can think of some reasons&#8230;</p>
<p><strong>Bruce Sterling: So that men will approach her, of course.</strong></p>
<p><strong>Tish Shute:</strong> Yes the core of all successful social networks is always a form of dating app.</p>
<p><strong>Bruce Sterling: You do it as a social icebreaker. It&#8217;s like: I&#8217;m a woman, I&#8217;m sitting here alone, and you can sort of glide by and, you know, take a snap of me. Then you retreat and have a beer with your friends and you work up the courage, and then you come and say, &#8220;So! Susan! I understand you like bicycling! And, boy, me too!&#8221; Right?</strong></p>
<p><strong>Tish Shute:</strong> There are all kinds of social barriers between people in cities that AR might be helpful in breaking down.  An extreme example is the dilemma you actually quite often face as a New Yorker as you walk around a city.  There are people asleep on the pavement and you don&#8217;t know if they&#8217;re dead or alive.</p>
<p><strong>Bruce Sterling: Right.</strong></p>
<p><strong>Tish Shute:</strong> And you sort of like have this awful ethical dilemma of like, &#8220;Am I walking by someone I should be shaking by the shoulder, right, to wake them up so they don&#8217;t die, right?&#8221;</p>
<p><strong>Bruce Sterling: Yes.</strong></p>
<p><strong>Tish Shute: </strong>You said in your keynote that we should pay attention to augmentations for people and situations that could really use some augmentation&#8230;</p>
<p><strong>Bruce Sterling: Right. There actually is such an app in Britain right now.  I posted about it:  two Augmented Reality schemes for rubbish and hobos.</strong></p>
<p><strong>Tish Shute:</strong> Right. Yes I saw that!</p>
<p><strong>Bruce Sterling:  &#8220;Any sufficiently advanced technology is indistinguishable from garbage and hobos.&#8221;  You don&#8217;t need to personally find out whether this hobo is worth your help.  What you need is a good way to report the hobo to a hobo check-up service.   They come in, and they look on their own database or supply a database to you, or a facial recognition unit, whatever.  The service says: &#8220;Oh, well.  That&#8217;s Fred. He&#8217;s a paranoid schizophrenic. He always sleeps in that alley. Let him be.&#8221;</strong></p>
<p><strong>The same goes for the rubbish &#8212; although I don&#8217;t want to compare rubbish to hobos.   In fact, people do go out with their AR kits and take pictures of abandoned garbage bags and broken glass.  They upload them with geolocated tags for the local garbage guys.  Guys who are sitting around doing pretty much nothing because they don&#8217;t know where the rubbish is.</strong></p>
<p><strong>And they will come out and get the rubbish! I mean, they just deputize guys to go out and follow these alerts. Right?</strong></p>
<p><strong>But nobody predicted &#8212; least of all me &#8212; that you were going to have a high-tech Augmented Reality system that consisted of removing rubbish and derelicts. Right?   But rubbish and derelicts  always go profoundly under-reported. It&#8217;s just hard to get people&#8217;s attention.</strong></p>
<p><strong>But it&#8217;s very easy to set up a system so that, if you get  ten reports on the same piece of rubbish, that&#8217;s going to work its way to the top of the stack.   That&#8217;s why I was trying to get AR people away from the romance of  the hottest app for the shiniest machine.  More toward a design stance that&#8217;s more user-centric.</strong></p>
<p><strong>Where are the actual problems about stuff that we perceive?  Stuff we can&#8217;t do anything about?   Or people whose mechanisms of perception are impaired. They could be doing good work, being more participative, if they didn&#8217;t, basically, walk around without their glasses on.</strong></p>
<p><strong>Tish Shute:</strong> Well this leads well into the second word Joshua suggested as an interesting springboard &#8211; sensitivity.</p>
<p>On the one hand we can do these things for people who maybe need the augmentation because they have difficulty with one or another sense, e.g., their eyes are not functioning, or their ears are not functioning. But on the other hand, we can&#8217;t cross the social bridge to communicate with people who are temporarily disempowered in relation to the rest of society, e.g. hobos and people who sleep on the streets of New York City.  And even though Augmented Reality could potentially be helpful, it can be even more disempowering to the already disempowered.</p>
<p><strong>Bruce Sterling: Right.</strong></p>
<p><strong>Tish Shute:</strong> But re &#8220;sensitivity&#8221; &#8211; does augmentation increase or decrease our sensitivity?  This is a problem that Will Wright brought up [<a href="http://augmentedrealityevent.com/2010/06/14/are-2010-keynote-by-will-wright-brilliant-inspiration-for-the-augmented-reality-community/" target="_blank">see video of Will Wright&#8217;s keynote at are2010</a>], e.g., the problem of parking HUDs getting in the way of your intuitive parallel parking skills.  The Lexus that takes driving control from you when you look back, &#8216;cos it knows that you&#8217;re looking at the road, and it starts to brake. Right?</p>
<p><strong>Bruce Sterling: Right.</strong></p>
<p><strong>Tish Shute:</strong> The fact is that the problem with technology is that it makes us less sensitive, right &#8211; augmentations sometimes get in our way?</p>
<p><strong>Bruce Sterling:  I suppose that&#8217;s true. But I&#8217;ve heard that said about practically every medium.  Especially television.</strong></p>
<p><strong>Everybody wants to blame machinery for their lack of morality.   It&#8217;s hard to top something like the Kitty Genovese killing in New York. This sort of legendary New York horror story from the 1960s. A woman is stabbed to death in public, no one does anything.</strong></p>
<p><strong>Tish Shute:</strong> Right.</p>
<p><strong>Bruce Sterling: I don&#8217;t think that our media is making us any less humane or more callous.</strong></p>
<p><strong>Tish Shute: </strong>All right. Oh no! I see what you&#8217;re saying. Perhaps I misrepresented what Will was suggesting by putting it that way.  The question is perhaps more how do we get the sensitivity into the technology.  Human bodies are fantastically sensitive and sensory.</p>
<p><strong>Bruce Sterling: Right.</strong></p>
<p><strong>Tish Shute: </strong>And we have these, like, sensitivities.  For instance, how could augmentations of reality be like a blush? You definitely want an interaction that&#8217;s not just this data being pushed at you. But what is the data that counts, right?  Will often shows a slide of an iceberg where the tip of the iceberg is the conscious mind.</p>
<p><strong>Bruce Sterling: Oh, I see.  Yeah.</strong></p>
<p><strong>Tish Shute: </strong> And underneath it is all the preconscious stuff that really counts, right?  Any thoughts on that?</p>
<p><strong>Bruce Sterling:  I did take interest in that.  Will has obviously been spending a lot of time studying cognition.</strong></p>
<p><strong>Tish Shute:</strong> Yes.</p>
<p><strong>Bruce Sterling:  I&#8217;m not convinced that AR has got a lot to do with that.  There is certainly a trend there.  There are a lot of people who want to do body hacks and brain hacks.  I can imagine AR being used for that purpose, but it seems like a niche application.   What is the point of our accessing even more stuff that&#8217;s outside of our consciousness?</strong></p>
<p><strong>Tish Shute:</strong> One of the things he is talking about is game dynamics, isn&#8217;t it?  The role of the imagination in play.  For example, he shows the high dynamic range photos that make the world magical.  Something you want to engage with playfully.  This, he points out, increases a sense of agency because you are encouraged to engage and to play with the world.</p>
<p><strong>Bruce Sterling:  Well, I&#8217;m a literary guy.  Italo Calvino did a lot of writing about this.  He talked about the classics of literature.  Why do we read the classics?  Calvino said we do not read, but reread the classics.  And the reason we do that is that, at first, we read a classic book and we think, &#8220;Boy, this book is really good.&#8221;   Then, five years later, we read it again and we think, &#8220;Boy, this is a really good book, and it&#8217;s got so much more in it than I thought it had when I was 18.&#8221;  Then we read it again at 28, and it&#8217;s like, &#8220;OK, now I really seem to understand this book, and it means something to me now that I didn&#8217;t know when I was 18 and 25.&#8221;</strong></p>
<p><strong>What you are doing through that access is learning something about yourself.  So Will is arguing that what I really need is, like, a better augmentation.  So that I can go in there and sop up the book all at once.  I can grab every cultural nuance in it, instead of the stuff that&#8217;s sliding past me because I&#8217;m 18 and kind of young and hasty.  Maybe I could have certain words and phrases helpfully underlined, that are like, &#8220;OK, well, this part is problematic for you.&#8221;  In some sense, that&#8217;s not allowing me to be 18.</strong></p>
<p><strong>I&#8217;m never going to have the experience of my own maturation against this text, because I&#8217;ve devoured it all in one gulp.</strong></p>
<p><strong>My argument would be that if you want people to be more sensitive toward certain, say, issues and problems, it&#8217;s better to find the people who are already sensitive to those issues and problems, and give them a bigger stake in your augmentation system.</strong></p>
<p><strong>Tish Shute:</strong> Yes the social augmented experiences are going to be the most valuable.</p>
<p><strong>Bruce Sterling:  Say that I am really worried about public health.  Well, if I have a lot of nurses that are using my system, people who are aware of my issues, then I could be walking around and I&#8217;ll see a lot more tags saying, &#8220;This is where he got food poisoning!&#8221;  &#8220;In this shooting gallery, many people have caught AIDS!&#8221;  Or, you know, &#8220;Tuberculosis has been spotted over here in this building.&#8221;</strong></p>
<p><strong>At that point, I could simply share their knowledge and get some social intelligence.  As opposed to trying to amp the basements of my little hacker-mind and drag stuff up that&#8217;s escaped my conscious attention.</strong></p>
<p><strong>Tish Shute:</strong> Interesting &#8211; that seems to bring us to another recurring theme in AR: people tend to pigeonhole it as &#8220;merely&#8221; a visual interface.  But actually, it&#8217;s the intersection, isn&#8217;t it, of social intelligence and augmentation.</p>
<p><strong>Bruce Sterling:  Well, it depends entirely on how you design the system.  If I&#8217;ve got a military augmented reality, I would expect that to be mostly about urban fighting.  It&#8217;s going to be about kicking in a door and shooting terrorists.   If I pull that helmet off my head and put that on the head of an emergency worker or a cop, I&#8217;m going to get a militarized cop or a militarized emergency worker.</strong></p>
<p><strong>Tish Shute:</strong> Well the histories of the two great mass media of the twentieth century &#8211; TV and the atomic bomb &#8211; were intertwined, and I suppose the evolution of ubiquitous media, augmented reality and urban warfare is already intertwined too.  So how can we encourage augmented realities to move beyond the military roots common to so much technology and into more peaceful urban realities?</p>
<p><strong>Bruce Sterling:  Well, it&#8217;s a matter of deciding whose reality it is that you&#8217;re trying to augment.  I&#8217;m not trying to be a bleeding heart about it, but obviously there are people in our society right now with reality that could really use some augmentation.  They are mostly disadvantaged people.  They are vision impaired, or maybe they have autism.  They might be senile and just can&#8217;t remember where they put their shoes.  These are people who could really use some help, right?</strong></p>
<p><strong>So, start with people who really need sensory or cognitive help. Before you turn our geeks into Superman, why don&#8217;t you try turning some people who are harmed into more functional individuals?  Then you&#8217;ll be able to learn how to do that. Then maybe you can ramp it up to these Nietzschean heights of the superb Man With the X-ray Eyes.  Whatever.</strong></p>
<p><strong>Tish Shute:</strong> Did you notice that a couple of apps, like <a href="http://www.tagwhat.com/" target="_blank">TagWhat</a>, are geared towards people with disabilities? I haven&#8217;t had a chance to check it out.</p>
<p><strong>Bruce Sterling: I&#8217;m sorry, I wasn&#8217;t looking at their tags.</strong></p>
<p><strong>Tish Shute:</strong> I was discussing this with Joshua who mentioned <a href="http://www.eyewriter.org/" target="_blank">Zachary Lieberman&#8217;s Eye Writer</a>, which is for people with locked-in syndrome. Do you know that?</p>
<p><strong>Bruce Sterling: Sure. And people appreciate that because the poor guy, he&#8217;s laid up with Lou Gehrig&#8217;s Disease. Now they&#8217;ve given him a way out.  AR is like a spark of new hope that gives his life meaning. What&#8217;s wrong with that?</strong></p>
<p><strong>Tish Shute:</strong> Yeah. And <a href="http://www.youtube.com/watch?v=IJ8VMLECToQ" target="_blank">Tim Byrne using Sixth Sense</a> for Autism is interesting.</p>
<p><strong>Bruce Sterling: Let&#8217;s consider it the other way. Let&#8217;s say this graffiti writer there, instead of him being sick and weak, let&#8217;s say he&#8217;s an athlete.  So I want to make him into a super-human graffiti writer. I want him to run around graffiti-tagging the entire town before dawn. Is that a good idea? Do we need that? Super-human, super taggers? What if he&#8217;s going to spray up stencils of Nietzsche?  I kinda wonder whether the game is worth the candle.</strong></p>
<p><strong>Tish Shute: </strong>Yes I suppose it is not a great social scenario to be always augmenting the lives of the elites!  Hmm, the third single-word interview question is &#8220;homophily,&#8221; and earlier you were saying that we&#8217;ve kinda got to accept this is very much part of AR &#8211; as how it works, because hyperlocal experiences get created by local communities &#8211; that up to now have tended to be homophilies.</p>
<p><strong>Bruce Sterling: Well, I think that&#8217;s easily handled with some design thinking. You&#8217;ve got to do some user observation and show some sympathy with the user, and to be aware that you&#8217;re designing for the user and you&#8217;re not designing for yourself.</strong></p>
<p><strong>In a field as young as this, it&#8217;s mostly geeks building cool stuff for geeks. In a lot of ways, it&#8217;s a &#8220;can you top this&#8221; contest. That&#8217;s OK, but it&#8217;s not good design to be your own client all the time. It&#8217;s like writing novels to amuse yourself, or sitting on the porch singing the blues on your own guitar with only yourself to hear.</strong></p>
<p><strong>Tish Shute:</strong> What will it take for AR to mature out of this &#8220;geeks building cool stuff for geeks&#8221; phase, do you think?</p>
<p><strong>Bruce Sterling:  It&#8217;s necessary to master some of the tools first.  I think of the way the web has developed over the years. When the World Wide Web first appeared, it was just for physicists, and was all line commands and quite unstable and difficult. Then you got usability studies, and things like Ajax and so forth. It&#8217;s a very painstaking thing.</strong></p>
<p><strong>We&#8217;re not best at building interfaces for the best computer scientists.  Web 2.0 was built from things like watching people cry while they were trying to fill out insurance forms. &#8220;Well, why are you so upset?&#8221;</strong></p>
<p><strong>&#8220;Well, I got to the end of the webpage, and then it said I took too long, and it cut me off and now I have to start all over!&#8221; <a href="http://blog.jjg.net/" target="_blank">Jesse James Garrett</a>, right? Benefactor of mankind.</strong></p>
<p><strong>If you&#8217;re experienced, you think:  &#8220;Why don&#8217;t I build a little module here, and kind of move the form over here, then I&#8217;ll periodically update it with some asynchronous JavaScript and XML.&#8221; And people are like, &#8220;Gee, how odd.&#8221; But that really works for real people. It comes from studying what people want to do.  Whereas, the current AR approach to a problem like the insurance form would be like, &#8220;I will give you the ability to record the entire insurance form, and it will flash before your eyes!&#8221;    OK great, that&#8217;s a cool hack, but I don&#8217;t really need X-Ray Eyes to fill out my insurance form. What I need is a more user-friendly interface.</strong></p>
<p><strong>Tish Shute:</strong> Well it seems like we are moving into the terrain of Joshua&#8217;s fifth word, &#8220;ventilation&#8221; &#8211; if I understand it rightly, it is at least partially the antidote to territorialization. It&#8217;s this idea that a place needs air, so we come out of our hermetically sealed boxes in the way we relate to a place &#8211; what kind of augmentation would bring more oxygen to that space?</p>
<p>There was an interesting moment in the Auggies when <a href="http://twitter.com/dutchcowboy" target="_blank">Maarten Lens-FitzGerald</a> presented the guerrilla shopping Layar, and basically Mark Billinghurst and Jesse Schell, who spoke first, didn&#8217;t seem too impressed. They didn&#8217;t want to walk to shop &#8211; that was what web shopping did, it saved us from walking to shop&#8230; But I felt you picked up on something which might have some bearing on &#8220;ventilation,&#8221; in that this AR shopping Layar was kind of squatting Prada &#8211; a favela chic AR shopping thing?</p>
<p><strong>Bruce Sterling: I wasn&#8217;t sure if I was interpreting what Maarten had in mind by that.  But I think Maarten sees his structure accurately as an experience thing rather than a mapping thing. I think he&#8217;s proudest of things like the Berlin Wall app on Layar, as opposed to Layars that help you go get a hamburger. It&#8217;s like&#8230;so when Layar inserts parasitic augmented shopping over other people&#8217;s real shopping? That was rather a subversive thing.</strong></p>
<p><strong>I think the key there is that his client is called &#8220;Hostage T-shirts,&#8221; right? I mean it&#8217;s actually kind of a transgressive little hippy T-shirt store that Layar can dump anywhere in the world. Layered right over, say, Versace and Prada.  I don&#8217;t know what becomes of that effort. And I&#8217;m not sure about the term &#8220;ventilation,&#8221; because that&#8217;s a term of art I haven&#8217;t heard much.</strong></p>
<p><strong>Tish Shute:</strong> Maybe it&#8217;s like in a cafe.  Ventilation would mean we were able to communicate with all these different categories of people that we normally would be unable to connect to, even though we might be sitting only a few feet apart.</p>
<p><strong>Bruce Sterling:   So it means ventilation in the bottles of our homophilies.</strong></p>
<p><strong>That&#8217;s not a personal problem for me.  I commonly live in foreign cities and, you know, spend a helluva lot of time talking to strangers at conferences. So I don&#8217;t think I&#8217;d have that particular tight little social island problem.</strong></p>
<p><strong>Tish Shute:</strong> Of the three judges at the Auggies, you seemed most enthusiastic about the Layar entry.</p>
<p><strong>Bruce Sterling: It may be they&#8217;re not as familiar with the business models of locative AR as I am, and as Maarten is. It was kind of a subtle in-joke he was making about Layar&#8217;s own business model there.</strong></p>
<p><strong>Tish Shute: </strong>How do you explain that?</p>
<p><strong>Bruce Sterling: Well, you know, Layar&#8217;s in the business of  selling software to make mapping and urban structures into ecommerce.</strong></p>
<p><strong>The ideal way to do that obviously would be to move the richest customers into the most expensive shops in the most rapid way possible. Or at least distribute them in the directions they want to go, a la Google. Whereas this app that Maarten was talking about puts big barnacles in the way that are selling punk t-shirts.</strong></p>
<p><strong>Tish Shute:</strong> Right! Right!</p>
<p><strong>Bruce Sterling:   The Dutch are a bit subtle in their humor.  I rather imagine there&#8217;s a lot of discussion in Layar&#8217;s inner circle about exactly what they want developers to do with their platform. They&#8217;re going to have considerable political difficulty deciding who can have a Layar key and how you discipline people when they start doing weird stuff. &#8220;The Oakland Medical Marijuana layar.&#8221;</strong></p>
<p><strong>Tish Shute:</strong> Well, finding nudists is one of the top layars at the moment.</p>
<p><strong>Bruce Sterling: You know, obviously so. And finding narcotics in Amsterdam, or a prostitution layer.  I warned them nine months ago this was bound to happen. I&#8217;m sure they&#8217;re aware of it.  I don&#8217;t think Layar wants Google&#8217;s style of cool, technocratic detachment.</strong></p>
<p><strong>Tish Shute:</strong> But that&#8217;s pretty difficult to do in current augmented reality because we don&#8217;t have all the mathematical voodoo for full-on AR search yet, do we?</p>
<p><strong>Bruce Sterling: Well, you can hire it out. Somebody&#8217;s going to do it, if they get interested enough.  There&#8217;s Nokia-Yahoo. Nokia-Yahoo! just did a big corporate deal&#8230;involving Nokia&#8217;s mapping system and Yahoo&#8217;s localization. So the Nokia-Yahoo! mash-up is called Nooo!   Or could be called Yahno. Yakia!  Unfortunately ridiculous names.</strong></p>
<p><strong>Tish Shute:</strong> It&#8217;s interesting because you mentioned the spiders&#8217; mating problem at Google. They&#8217;ve got all the pieces to make this kind of level of AR obviously right now. But they actually haven&#8217;t done it yet.</p>
<p><strong>Bruce Sterling: There must be at least some discussion in Google, but the same goes for Microsoft. I&#8217;m frankly baffled by Microsoft, because it&#8217;s just full of insanely brilliant people. What the hell are they doing in there? Name one serious innovation that&#8217;s come out of their labs in five years. They make Interval Research look dynamic. It&#8217;s really kind of sad.</strong></p>
<p><strong>Tish Shute:</strong> It&#8217;s a very curious situation with AR though, because AR more than any new technology relies on these big hoards of data, particularly for the mapping, right? And only the big four have the data &#8211; although we are beginning to see upstarts: Earthmine, SimpleGeo&#8230; Did you get a chance to meet Di-Ann Eisnor from <a href="http://www.waze.com/homepage/" target="_blank">Waze &#8211; real-time maps and traffic information based on the wisdom of the crowd</a>?  Waze is a very interesting project that is a potential giant killer.</p>
<p><strong>Bruce Sterling: No, I didn&#8217;t talk to them.  I&#8217;ve seen people speculate that Earthmine and Apple are going to make an alliance. I guess if you&#8217;re thinking that urban informatic mapping is a super big thing for AR, that must be true.   But I&#8217;m not convinced that&#8217;s necessarily the case. People have pointed out that you can just use Google Maps, and you don&#8217;t have to walk around with a little visor.  There are other aspects of AR besides the cell phone space. There&#8217;s Total Immersion&#8217;s big display screens. There&#8217;s the web-based fiducial-marker stuff. And there&#8217;s projection mapping. And then there&#8217;s experience design just for people who need their reality augmented for whatever personal or social reason. [dog barking]</strong></p>
<p><strong>Tish Shute:</strong> Right. Oh, I&#8217;m in the middle&#8230; My son&#8217;s come. What a good hair cut!</p>
<p><strong>Bruce Sterling: Hi, there.</strong></p>
<p><strong>Tish&#8217;s Son:</strong> Hi.</p>
<p><strong>Bruce Sterling: How&#8217;s it going, sir? Good to see you&#8230;</strong></p>
<p><strong>Tish&#8217;s Son:</strong> Good.</p>
<p><strong>Tish:</strong> [laughs]</p>
<p><strong>Bruce Sterling: Yeah. Nice looking shirt. I like the back of it.</strong></p>
<p><strong>Tish Shute:</strong> That&#8217;s from the American Shaolin Temple. [laughs]</p>
<p><strong>Bruce Sterling: All right. Awesome. Kung Fu geek shirt.</strong></p>
<p><strong>Tish Shute:</strong> Yup, he is a bit of a Kung Fu geek. He and his dad did an iPhone app for Yu-Gi-Oh scoring.</p>
<p><strong>Bruce Sterling: Awesome. Plenty of Pokémon-style combat in Yu-Gi-Oh.</strong></p>
<p><strong>Tish Shute:</strong> Yeah. Well, it&#8217;s interesting because you&#8217;ve talked about this aspect. That all of this, the Pokémon aspect of AR hasn&#8217;t kicked in yet. But it&#8217;s obviously a match made in heaven to some degree, isn&#8217;t it?</p>
<p><strong>Bruce Sterling: One would think so, yeah.  The whole little kid gaming thing. What does that have to do with Google or Bing? You don&#8217;t need a massive database for stuff like that.</strong></p>
<p><strong>Tish Shute: </strong>Yeah, you&#8217;re right. But good tracking, mapping and registration requires a lot of mapping&#8230;</p>
<p><strong>Bruce Sterling: Well, our current tracking, mapping and registration requires that. Maybe there&#8217;s some other way to hack it that we don&#8217;t know about yet.</strong></p>
<p><strong>Tish Shute: </strong>That&#8217;s a very interesting point. We always have to stretch the way we think about mapping&#8230; perhaps it&#8217;s a real-time understanding of the location you&#8217;re in&#8230; perhaps the map is being negotiated through several social processes?</p>
<p><strong>Bruce Sterling: There are maps, and then there are maps. There&#8217;s a kind of artillery map where you need to know the precise location of target spaces. And then there&#8217;s the kind of social map where I&#8217;m really looking for the IN-N-OUT Burger where my sister went last Tuesday. That&#8217;s a different system.</strong></p>
<p><strong>Tish Shute:</strong> And I think with AR, at the moment, we&#8217;re getting the most out of the social maps certainly. And the other [machine perception technologies to detect the identity and physical configuration of objects relative to each other, to accurately project information alongside/overlaid with a physical object] is still kind of the big dream, isn&#8217;t it?</p>
<p><strong>Bruce Sterling: They say that men never ask for directions and women never read maps. Clearly, the genders have different ways of navigating the world. Who&#8217;s to say what manner of augmenting our experiences is hottest?  I&#8217;m not convinced that today&#8217;s rather rigid geolocativity is really what our society wants from that particular service. Maybe what we want is something more folksy.   Some useful nudge in the right direction as opposed to grids with 200 meters here and instructions to turn such-and-such.</strong></p>
<p><strong>Besides, there&#8217;s other hacks we haven&#8217;t considered.  We&#8217;re very dependent on GPS, but just suppose all those satellites are blown out of the sky in a solar storm. Would we really want to give up mapping? Wouldn&#8217;t we just come up with some other nifty hack?  Radio beacons, let&#8217;s just say. Atomic clock timers in towns. Or maybe just little QR codes on lampposts that give you the exact location of that lamppost, and just click the thing and have it calculate where you are.</strong></p>
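<p><em>The lamppost idea needs little more than a convention for what the code carries. A minimal Python sketch, assuming an invented payload format &#8211; the POST/LAT/LON field names here are hypothetical, not from any real deployment:</em></p>

```python
def parse_lamppost_code(payload):
    """Decode a hypothetical lamppost QR payload of the form
    'POST:<id>;LAT:<deg>;LON:<deg>' into (id, lat, lon).

    Scanning the sticker pins you to the lamppost's surveyed
    position, with no GPS involved."""
    fields = dict(part.split(":", 1) for part in payload.split(";"))
    return fields["POST"], float(fields["LAT"]), float(fields["LON"])

# A phone that scans the sticker instantly knows roughly where it is:
post_id, lat, lon = parse_lamppost_code("POST:nyc-0042;LAT:40.7484;LON:-73.9857")
# -> ('nyc-0042', 40.7484, -73.9857)
```

<p><em>The design point is Sterling&#8217;s: the intelligence lives in the surveyed infrastructure, not the satellite constellation.</em></p>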
<p><strong>Tish Shute:</strong> Yes the <a href="http://thenexthope.org/" target="_blank">NextHope</a> <a href="http://thenexthope.org/2010/07/hackable-badge-accessory-kits-available/" target="_blank">OpenAMD project</a> had a clever way of triangulating location indoors.</p>
<p><strong>Bruce Sterling: Well, GPS is there and people all want to use it. It&#8217;s got good API, so of course you want to. And the guys who are good at doing it are real geolocative freaks. But the mere fact that we can do it this way, and that you can make it pay, doesn&#8217;t mean that it&#8217;s the ultimate way to provide that service to people.  It&#8217;s like saying that Egyptian hieroglyphics must be the greatest way to write, because we&#8217;ve got a lot of them and they&#8217;re hard to learn. What if somebody comes along with an alphabet? It&#8217;s going to be a little embarrassing.</strong></p>
<p><strong>Tish Shute:</strong> Yeah, that&#8217;s a very good point. Now, this is a more simple, ordinary question about the event. <a href="http://www.ydreams.com/#/en/homepage/" target="_blank">YDreams</a> went off the map in the Auggie voting, and walked away with the Auggies. No one doubted that that was the most&#8230;</p>
<p><strong>Bruce Sterling: I don&#8217;t know. I thought those <a href="http://occipital.com/blog/" target="_blank">Occipital</a> guys with the panoramic painting&#8230;. That was hairy. I would have been tempted to give them the prize myself, actually.</strong></p>
<p><strong>Tish Shute:</strong> And what did you like best about that? Because I agree. I love <strong><a href="http://occipital.com/blog/" target="_blank">Occipital</a></strong>.</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/09/Screen-shot-2010-09-17-at-6.20.58-PM.png"><img class="alignnone size-medium wp-image-5704" title="Screen shot 2010-09-17 at 6.20.58 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/09/Screen-shot-2010-09-17-at-6.20.58-PM-300x41.png" alt="Screen shot 2010-09-17 at 6.20.58 PM" width="300" height="41" /></a></p>
<p><em>click to enlarge</em></p>
<p><strong>Bruce Sterling: I thought it was a more technically difficult stunt than the hand registration thing.  Using a hand as a 3-D cursor is hot, but  not like painting a panorama in 3-D in real time.  That was an impressive technical feat.</strong></p>
<p><strong>Tish Shute: </strong>And they hinted at the 2.1.1 AR, more AR version of that. What do you see coming out of that as possibilities?</p>
<p><strong>Bruce Sterling: Well, I&#8217;d heard of <a href="http://www.ydreams.com/#/en/homepage/" target="_blank">YDreams</a>, so I wasn&#8217;t stunned. But I&#8217;d never heard of those guys. I wonder what else the heck they&#8217;ve got in the attic.</strong></p>
<p><strong>Tish Shute:</strong> Very cool stuff&#8230;</p>
<p><strong>Bruce Sterling: Well, more power to them. But clearly YDreams was the popular favorite. And who couldn&#8217;t like it? It was just so AR.</strong></p>
<p><strong>Tish Shute</strong>: It was so AR and so gorgeous.</p>
<p><strong>Bruce Sterling: It was pretty, actually.  Except for their ugly menu button and poor font choice.</strong></p>
<p><strong>Tish Shute:</strong> Oh, yes. You didn&#8217;t like that, did you? [laughs] But with the Occipital panorama, what do you see as the next stage of that?</p>
<p><strong>Bruce Sterling: Well, obviously quicker and faster. Quicker and faster and more accurate in a network. Let&#8217;s just say I&#8217;m in New York and you&#8217;re in New York and I&#8217;m calling you for help. And you say, where are you?  I just whirl around like this and I mail it to you on a Google Wave. And you whirl around like that, and then we compare the two panoramas and do an instant triangulation. And you say: I&#8217;m over here on this red dot of your screen.</strong></p>
<p><strong>Tish Shute: </strong>Yeah, exactly.</p>
<p><strong>Bruce Sterling:  We&#8217;re navigating with panoramas by having two connected panoramas and considering the difference.</strong></p>
<p><strong>Tish Shute: </strong> Yeah, very interesting&#8230;</p>
<p><strong>Bruce Sterling: Not shabby, right?</strong></p>
<p><strong>Tish Shute:</strong> Not shabby at all.</p>
<p><strong>Bruce Sterling: If you could do it in real time.</strong></p>
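<p><em>In its simplest form, the two-panorama scheme Sterling describes reduces to classic triangulation: once feature matching gives each person a bearing to the same landmark, the two rays intersect at a single point. A minimal Python sketch under that assumption &#8211; the positions and angles are invented example values, and real panorama matching is of course far harder:</em></p>

```python
import math

def triangulate(a, bearing_a, b, bearing_b):
    """Intersect two bearing rays cast from known positions a and b.

    Bearings are math-convention angles in radians from the +x axis.
    Returns the point where the rays cross -- the shared landmark
    both panoramas can see."""
    da = (math.cos(bearing_a), math.sin(bearing_a))
    db = (math.cos(bearing_b), math.sin(bearing_b))
    denom = da[0] * db[1] - da[1] * db[0]          # 2-D cross product
    if abs(denom) < 1e-12:
        raise ValueError("parallel bearings: no unique intersection")
    dx, dy = b[0] - a[0], b[1] - a[1]
    t = (dx * db[1] - dy * db[0]) / denom          # distance along ray from a
    return (a[0] + t * da[0], a[1] + t * da[1])

# Two people four blocks apart who both sight the same landmark:
p = triangulate((0.0, 0.0), math.radians(45), (4.0, 0.0), math.radians(135))
# -> approximately (2.0, 2.0)
```

<p><em>Run the same intersection the other way &#8211; from two matched features back toward the cameras &#8211; and you get the relative position of the two callers, Sterling&#8217;s red dot.</em></p>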
<p><strong>Tish Shute:</strong> Then the other thing I missed because I was going to meet Will was I missed the Launch Pad competition. Did you catch that?</p>
<p><strong>Bruce Sterling: I didn&#8217;t see it either. I thought of another app though.</strong></p>
<p><strong>Tish Shute:</strong> Oh!</p>
<p><strong>Bruce Sterling: You&#8217;ve got a panorama maker in your home office, and it just scans the office 24 hours, 365, and tags anything that moves, right? OK, where&#8217;s the clipboard?  At 8:15 it was over here.  Now it&#8217;s vanished. Now another object is viewed over here. So, logically, ping, you hit it with a sticky light and there it is, right?</strong></p>
<p><strong>Tish Shute:</strong> Oh, that&#8217;s cool &#8211; also, knowing what has changed in any environment would be a big enabler for a lot of AR visions.</p>
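<p><em>At bottom, the office-scanner idea is snapshot differencing. A minimal Python sketch, assuming the panorama rig has already reduced each scan to object names and positions &#8211; the objects and coordinates here are invented:</em></p>

```python
def diff_snapshots(before, after):
    """Compare two {object: (x, y)} snapshots from a room scanner and
    report what moved, vanished, or appeared between them."""
    events = []
    for name, pos in before.items():
        if name not in after:
            events.append(("vanished", name, pos))
        elif after[name] != pos:
            events.append(("moved", name, after[name]))
    for name, pos in after.items():
        if name not in before:
            events.append(("appeared", name, pos))
    return events

# At 8:15 the clipboard was on the desk; by the next scan it is gone:
events = diff_snapshots(
    {"clipboard": (1.0, 2.0), "mug": (0.5, 0.5)},
    {"mug": (0.5, 0.5), "keys": (2.0, 1.0)},
)
# -> [('vanished', 'clipboard', (1.0, 2.0)), ('appeared', 'keys', (2.0, 1.0))]
```

<p><em>The &#8220;sticky light&#8221; step would then just be pointing a projector or overlay at the last known position of whatever vanished.</em></p>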
<p><strong>Bruce Sterling:  I&#8217;m sure there are many other things you could do with panoramas.</strong></p>
<p><strong>Tish Shute:</strong> My jet lag is beginning to kick in big time &#8211; so many ideas to pursue from are2010 &#8211; those panoramas are very exciting though.</p>
<p><strong>Bruce Sterling: Oh, well, it&#8217;s all right.  We can augment reality!   I&#8217;ve got three heads and six hands!</strong></p>
]]></content:encoded>
			<wfw:commentRss>http://www.ugotrade.com/2010/09/17/urban-augmented-realities-and-social-augmentations-that-matter-interview-with-bruce-sterling-part-2/feed/</wfw:commentRss>
		<slash:comments>8</slash:comments>
		</item>
		<item>
		<title>The Next Wave of AR: Exploring Social Augmented Experiences at Where 2.0</title>
		<link>http://www.ugotrade.com/2010/03/29/the-next-wave-of-ar-exploring-social-augmented-experiences-at-where-2-0/</link>
		<comments>http://www.ugotrade.com/2010/03/29/the-next-wave-of-ar-exploring-social-augmented-experiences-at-where-2-0/#comments</comments>
		<pubDate>Mon, 29 Mar 2010 05:25:03 +0000</pubDate>
		<dc:creator><![CDATA[Tish Shute]]></dc:creator>
				<category><![CDATA[Ambient Devices]]></category>
		<category><![CDATA[Ambient Displays]]></category>
		<category><![CDATA[Android]]></category>
		<category><![CDATA[architecture of participation]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Augmented Reality]]></category>
		<category><![CDATA[culture of participation]]></category>
		<category><![CDATA[digital public space]]></category>
		<category><![CDATA[Instrumenting the World]]></category>
		<category><![CDATA[internet of things]]></category>
		<category><![CDATA[mirror worlds]]></category>
		<category><![CDATA[Mixed Reality]]></category>
		<category><![CDATA[mobile augmented reality]]></category>
		<category><![CDATA[mobile meets social]]></category>
		<category><![CDATA[Mobile Reality]]></category>
		<category><![CDATA[new urbanism]]></category>
		<category><![CDATA[Paticipatory Culture]]></category>
		<category><![CDATA[social gaming]]></category>
		<category><![CDATA[social media]]></category>
		<category><![CDATA[Web Meets World]]></category>
		<category><![CDATA[websquared]]></category>
		<category><![CDATA[World 2.0]]></category>
		<category><![CDATA[Anselm Hook]]></category>
		<category><![CDATA[AR Blip]]></category>
		<category><![CDATA[AR browsers]]></category>
		<category><![CDATA[ARWave]]></category>
		<category><![CDATA[ARWave demo]]></category>
		<category><![CDATA[atemorality]]></category>
		<category><![CDATA[atemporal network culture]]></category>
		<category><![CDATA[augmented reality and federation]]></category>
		<category><![CDATA[augmented reality event]]></category>
		<category><![CDATA[augmented reality search]]></category>
		<category><![CDATA[augmenting the map as interface]]></category>
		<category><![CDATA[Brady Forrest]]></category>
		<category><![CDATA[Bruce Sterling]]></category>
		<category><![CDATA[collaborative augmented reality]]></category>
		<category><![CDATA[Davide Carnovale]]></category>
		<category><![CDATA[Dennou Coil]]></category>
		<category><![CDATA[design principles for social augmented experiences]]></category>
		<category><![CDATA[FourSquare]]></category>
		<category><![CDATA[Google Wave]]></category>
		<category><![CDATA[gowalla]]></category>
		<category><![CDATA[Jeremy Hight]]></category>
		<category><![CDATA[Jesse Schell]]></category>
		<category><![CDATA[Joe Lamantia]]></category>
		<category><![CDATA[layers and channels of augmentation]]></category>
		<category><![CDATA[location technologies]]></category>
		<category><![CDATA[locative media]]></category>
		<category><![CDATA[locative narratives]]></category>
		<category><![CDATA[Markus Strickler]]></category>
		<category><![CDATA[narrative archaeology]]></category>
		<category><![CDATA[open augmented reality]]></category>
		<category><![CDATA[open distributed augmented reality]]></category>
		<category><![CDATA[pygowave]]></category>
		<category><![CDATA[real time social augmented experiences]]></category>
		<category><![CDATA[Ruby On Sails]]></category>
		<category><![CDATA[social AR]]></category>
		<category><![CDATA[social AR and crisis response]]></category>
		<category><![CDATA[social augmented experiences]]></category>
		<category><![CDATA[Sophia Parafina]]></category>
		<category><![CDATA[Thomas Wrobel]]></category>
		<category><![CDATA[Wave]]></category>
		<category><![CDATA[Wave Federation Protocol]]></category>
		<category><![CDATA[Where2.0]]></category>
		<category><![CDATA[WhereCamp]]></category>
		<category><![CDATA[Will Wright]]></category>
		<category><![CDATA[writing within the map]]></category>

		<guid isPermaLink="false">http://www.ugotrade.com/?p=5332</guid>
		<description><![CDATA[Where 2.0 is going to be epic this year (see my interview with Brady Forrest here), and it is so exciting to be part of it. Location technologies and augmented reality are anointed rulers now. Time Magazine recognized augmented reality as one of its 10 Tech Trends for 2010 (for more see ReadWriteWeb). The photo [&#8230;]]]></description>
				<content:encoded><![CDATA[<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/03/jeremyandlisahight.jpg"><img class="alignnone size-medium wp-image-5336" title="jeremyandlisahight" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/03/jeremyandlisahight-300x160.jpg" alt="jeremyandlisahight" width="300" height="160" /></a></p>
<p><a id="jqit" title="Where 2.0" href="http://en.oreilly.com/where2010">Where  2.0</a> is going to be epic this year (see <a id="ysmn" title="my interview with Brady Forrest here" href="../../2010/02/10/the-physical-world-becomes-a-software-construct-talking-with-brady-forrest-about-where-2-0-2010/">my interview  with Brady Forrest here</a>), and it is so exciting to be part of it.Â   Location technologies and augmented reality are annointed rulers now.Â  <a href="http://www.time.com/time/specials/packages/article/0,28804,1973759_1973760_1973797,00.html">Time  Magazine recognized</a> augmented reality as one of its 10 Tech Trends  for 2010 (for more <a href="http://www.readwriteweb.com/archives/augmented_reality_among_times_10_tech_trends_2010.php" target="_blank">see ReadWriteWeb</a>).</p>
<p>The photo above is by Jeremy and Lisa Hight. <a id="ohzg" title="Jeremy Hight" href="http://34n118w.net/">Jeremy Hight</a> is an information designer, theorist and artist working in Augmented Reality and Locative Media. His essay “Narrative Archaeology” was named one of the four primary texts in Locative Media.</p>
<p><a id="xel:" title="Jeremy Hight" href="http://en.oreilly.com/where2010/public/schedule/speaker/69399">Jeremy Hight</a> will be part of our  panel: <a title="The Next Wave of AR: Exploring Social Augmented Experiences" href="http://en.oreilly.com/where2010/public/schedule/detail/11046">The  Next Wave of AR: Exploring Social Augmented Experiences</a>, with <a id="b49q" title="Anselm Hook" href="http://en.oreilly.com/where2010/public/schedule/speaker/6545">Anselm Hook</a>, <a id="h3j-" title="Joe Lamantia" href="http://en.oreilly.com/where2010/public/schedule/speaker/26367">Joe Lamantia</a>, <a id="xtfk" title="Sophia Parafina" href="http://en.oreilly.com/where2010/public/schedule/speaker/59688">Sophia Parafina</a> and <a id="uw9f" title="myself." href="http://en.oreilly.com/where2010/public/schedule/speaker/38011">myself.</a> We will <a href="http://www.youtube.com/watch?v=ZjXCTCSKtRQ" target="_blank">debut the video of the  ARWave project demo </a>that brings together augmented reality,  geolocation, and wave federation (more details later in this post).Â  And, Jeremy will bring to our  presentation some augmentations on his recent brilliant work and paper, <a href="http://www.neme.org/main/1111/writing-within-the-map" target="_blank">â€œWriting Within the Map.â€</a></p>
<p>Greg J. Smith points out in <a href="http://serialconsign.com/2010/03/thoughts-writing-within-map#comments" target="_blank">his in-depth look at Jeremy’s work</a> that it <strong>“dovetails with some of the main points in Bruce Sterling’s recent <a href="http://www.wired.com/beyond_the_beyond/2010/02/atemporality-for-the-creative-artist/">atemporality keynote</a> at Transmediale” – </strong>fortunately there is a <a href="http://www.wired.com/beyond_the_beyond/2010/02/atemporality-for-the-creative-artist/" target="_blank">transcription of Bruce’s keynote here</a>. What is so awesome about this dovetailing is that you can get a feel for the fun part of living in an “atemporal network culture.” And, if you want to really understand just how much locative media and augmented reality have changed us, you might want to dig into these texts.</p>
<p>Bruce  Sterling and Jeremy Hight, and members of the ARWave team, and a  superb cast of augmented reality movers and shakers &#8211; including Will  Wright and Jesse Schell, will be <a id="ncnl" title="speaking at Augmented Reality Event in Santa Clara, June 2nd and  3rd." href="http://augmentedrealityevent.com/speakers/">speaking at Augmented Reality Event in Santa Clara, June 2nd and  3rd.</a></p>
<p>But, this week, the AR community&#8217;s attention will be on the events at Where 2.0. The keynote speakers will be streamed live, so if you are not fortunate enough to be there, tune in!</p>
<h3>The Next Wave of AR: Exploring Social Augmented Experiences</h3>
<p>On our panel, Jeremy Hight, Anselm Hook, Sophia Parafina, Joe Lamantia and I will cover some of the key social, cultural, technical and interactional questions for exploring social augmented experiences. There will be five lightning presentations, an opportunity for questions from the audience, and a world premiere of the ARWave demo!</p>
<p><strong>1) “Augmenting the map as interface: AR and Locative Narratives” –</strong> Jeremy Hight<strong><br />
</strong></p>
<p><strong>*Map augmentation of the historic route 66  can house an essay contest and publication globally but as embedded  within that map augmentation instead of books or even web sites.</strong></p>
<p><strong>*  A place on a map can be a graphic index and database to save and  collect<br />
the writing of that place with a graphic or textual search  index.</strong></p>
<p><strong>*One can pop immersive visualizations of abandoned or lost buildings from map location in shared software and collectively augment (imagine channels within the lost core of Detroit where one is memories and accounts tagged within parts in the immersive visualization while another is of poems and stories written by people moved by the place and its semiotics and story).</strong></p>
<p><strong>*The news stand is to be the map.</strong></p>
<p><strong>*New forms of literature will be born of mapping, spaces, augmentation and<br />
new tools</strong></p>
<p>The concept drawings below (click to enlarge) are a collaboration between Jeremy Hight and Paul Wehby, Senior Designer at <a href="http://www.lacma.org/" target="_blank">LA County Museum of Art.</a></p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/03/wehby1post.jpg"><img class="alignnone size-thumbnail wp-image-5342" title="wehby1post" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/03/wehby1post-150x150.jpg" alt="wehby1post" width="150" height="150" /></a><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/03/wehby2post.jpg"><img class="alignnone size-thumbnail wp-image-5343" title="wehby2post" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/03/wehby2post-150x150.jpg" alt="wehby2post" width="150" height="150" /></a><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/03/wehby3post.jpg"><img class="alignnone size-thumbnail wp-image-5352" title="wehby3post" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/03/wehby3post-150x150.jpg" alt="wehby3post" width="150" height="150" /></a><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/03/wehby4post.jpg"><img class="alignnone size-thumbnail  wp-image-5353" title="wehby4post" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/03/wehby4post-150x150.jpg" alt="wehby4post" width="150" height="150" /></a></p>
<p><strong>2) </strong>Anselm Hook will look at <strong>&#8220;10 reasons why AR isn&#8217;t a flash in the pan,&#8221;</strong> and how <strong>“AR can help us see the world we would like to have exist.”</strong></p>
<p>Anselm notes, <strong>“So much of what we do is so fickle and I’m looking for ways to connect digital media work to deep values.”</strong></p>
<p><strong>3)</strong> Sophia Parafina will present on <strong>“Social AR and Crisis Response”</strong>:</p>
<p><strong>“Augmented reality as a multi-party conversation. Rather than being passive viewers of AR with a limited ability to check in to places and make annotations, current devices can broadcast sensor information that can be fused into an interactive stream. AR users can send and receive information, location, and sensor data from their mobile device. The streams can be federated into a unique AR view composed by the user.</strong></p>
<p><strong>Entertainment and gaming are obvious applications, but it can also be applied to crisis situations such as the search and rescue operations in Haiti. Efforts such as Mission 4636, the SMS translation service, could benefit from AR views. The collaboration among the Mission 4636 volunteers was the key element in their success for providing location and rapid translation to responders on the ground.</strong></p>
<p><strong>With an AR view, responders can send back their sensor information from their mobiles to provide contextual information to remote volunteers. This extends the conversation between remote volunteers and on-the-ground responders and fosters collaboration, which was a key element for the success of Mission 4636.”</strong></p>
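<p>As a rough illustration of the stream-fusion idea above, here is a minimal sketch (the field names and structures are my own invention, not code from any of the projects mentioned):</p>

```python
from dataclasses import dataclass

@dataclass
class Reading:
    """One geolocated sensor reading broadcast by a responder's device."""
    sender: str
    lat: float
    lon: float
    kind: str    # e.g. "photo", "air-quality", "note"
    value: str
    ts: float    # unix-style timestamp

def fuse(streams, bbox):
    """Merge several per-device streams into one time-ordered view,
    keeping only readings inside the (south, west, north, east) box."""
    south, west, north, east = bbox
    merged = [r for stream in streams for r in stream
              if south <= r.lat <= north and west <= r.lon <= east]
    return sorted(merged, key=lambda r: r.ts)

# Two responders in Port-au-Prince; one reading falls outside the box.
a = [Reading("resp1", 18.54, -72.34, "note", "bridge out", 2.0)]
b = [Reading("resp2", 18.55, -72.33, "photo", "clinic.jpg", 1.0),
     Reading("resp2", 40.70, -74.00, "note", "off-site", 3.0)]
view = fuse([a, b], bbox=(18.0, -73.0, 19.0, -72.0))
print([r.value for r in view])  # time-ordered, box-filtered
```

<p>Remote volunteers would then see one interactive, time-ordered stream per area of interest rather than one feed per device.</p>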
<p><strong>4)</strong> Joe Lamantia, an experience design and strategy consultant helping to define the interaction framework and scenarios behind ARWave, will discuss <strong>“Design Principles For Social Augmented Experiences”:</strong></p>
<p><strong>“With the exotic mixed realities envisioned by futurists and science fiction writers seemingly around the corner, it is time to move beyond questions of technical feasibility to consider the value and impact of turning reality inside out for everyday social settings and experiences. Thanks to the inherently social nature of augmented reality, we can be sure the value and impact of many augmented experiences depends in large part on how effectively they integrate with the social dimensions of real-world settings, in real time.&#8221;</strong></p>
<p>Joe will share, <strong>&#8220;eight guiding  principles for designing experiences that engage naturally with the  social dimension, and increase the value of augmented experiences.&#8221; </strong></p>
<p><strong>5) <a id="y08e" title="AR Wave" href="http://groups.google.com/group/arwave">&#8220;ARWave</a> &#8211; A demo and state of play,&#8221; </strong>from Tish Shute</p>
<p>I will have the awesome privilege, on our Where 2.0 panel, of showcasing <a id="y08e" title="AR Wave" href="http://groups.google.com/group/arwave">ARWave</a>. We will premiere the ARWave demo, which shows how ARWave has accomplished the basics of geolocating data on the Wave Federation Protocol (and real-time collaboration on this geolocated data). <span id="ejpu" dir="ltr">If you&#8217;re interested in the ARWave project, join the <a id="n4k6" title="Mailing  list" href="http://groups.google.com/group/arwave">mailing list</a>, read the FAQ <a id="medt" title="here" href="http://lostagain.nl/websiteIndex/projects/Arn/information.html">here</a>, and have a peek at the current state of development at <a id="ius-" title="Google Code" href="http://code.google.com/p/arwave/">Google Code</a> and the <a id="dj:p" title="specification for an AR Blip" href="http://arwave.wiki.zoho.com/ARBlip-Specification.html">specification for an AR Blip</a>. We also have Waves for the project hosted on Google Wave. You can join the general discussion <a id="xiwt" title="here" href="https://wave.google.com/wave/#restored:wave:googlewave.com%21w%252BJAcNzz16A">here</a>, and the technical side <a id="s393" title="here" href="https://wave.google.com/wave/#restored:wave:googlewave.com%21w%252Bhvk2Fj3wB">here</a>.</span></p>
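<p>To make the idea of a geolocated blip concrete, here is a toy sketch of building one as XML. The element and attribute names are my own illustration only; for the real format, see the ARBlip specification linked above:</p>

```python
import xml.etree.ElementTree as ET

def make_blip(author, lat, lon, alt, text):
    """Build a minimal geolocated blip as XML (illustrative schema only,
    not the actual ARBlip specification)."""
    blip = ET.Element("blip", author=author)
    ET.SubElement(blip, "location",
                  lat=str(lat), lon=str(lon), alt=str(alt))
    content = ET.SubElement(blip, "content")
    content.text = text
    return blip

blip = make_blip("thomas@wave.example", 52.37, 4.89, 2.0,
                 "Virtual signpost at Amsterdam Centraal")
xml = ET.tostring(blip, encoding="unicode")
print(xml)
```

<p>The key point is only that a blip carries both an author (a federated wave address) and a position, so any client on any server can decide where to render it.</p>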
<p>The picture below is a  screen shot from the demo video produced by core AR Wave developer and  concept designer, Thomas Wrobel.</p>
<p>Click on the image to enlarge, and note: <strong>“The pink thing is from Dennou Coil. It’s an anti-virus program (that literally chases down bugs and glitches and removes them).”</strong></p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/03/Screen-shot-2010-03-27-at-6.58.55-PM.png"><img class="alignnone size-medium wp-image-5344" title="Screen shot 2010-03-27 at 6.58.55 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/03/Screen-shot-2010-03-27-at-6.58.55-PM-281x300.png" alt="Screen shot 2010-03-27 at 6.58.55 PM" width="281" height="300" /></a></p>
<h3>ARWave</h3>
<p>In ARWave, stories or art are tied to place. And as Jeremy Hight  writes:</p>
<p><strong>“The possibility exists to take a part of an area and overlay a dystopia, a utopia, multiples of each of these, or even recreations of previous incarnations in the past. Writing and publication thus cannot only be of place, and form(s), but of selected augmentations of icons, streets, buildings and related texts on top of the map. These spaces can be built in real time and can be turned on and off as channels of augmentation that over time illustrate many faces of place in its present, past, possible futures, etc., with texts within these alternate spaces as commentary, as fused aesthetic analysis, or simply creative writing relevant to these charged and hybrid spaces.”</strong></p>
<p>As Thomas notes, Jeremy Hight’s <strong>“idea of channels ties into the concept of waves = a layer, and people can have many layers on at once.”</strong></p>
<p>This is different from the <a href="http://layar.com/" target="_blank">Layar</a> concept of a layer, or rather “layar.”</p>
<p><strong>&#8220;We  are not talking about layers in the classical map layer way of  thinking, where you have a layer of all restaurants or a layer of all  mountain peaks, etc.,&#8221; </strong>notes ARWave developer Markus Strickler.</p>
<p>Currently all geolocation apps like Layar have to use their own servers, so users have to use different clients with different logins to see data from different sources. But because ARWave uses federation, we don&#8217;t depend on centralized infrastructure where the client of one company can only connect to the server of that company. This opens up many exciting new possibilities for how people can decide to view and publish geolocated data.</p>
<p>With ARWave, via one login, people can access the whole distributed network of servers (see diagrams below), and any content will be accessible to them. ARWave will make it easy for individuals, not just developers, to layer their environment – allowing the creation of augmented reality content to be as simple as contributing to a Wave.</p>
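<p>The “one login, whole network” point can be sketched in a few lines of toy code. The server names are hypothetical, and the real Wave Federation Protocol (which runs over XMPP) is far richer; this only shows the routing idea:</p>

```python
class WaveServer:
    """Toy model of a federated wave server: each server hosts its own
    users, but a wave's updates are delivered to every participant's
    home server, wherever that is."""
    def __init__(self, domain, network):
        self.domain = domain
        self.inboxes = {}          # user address -> list of updates
        self.network = network     # shared domain -> server directory
        network[domain] = self

    def register(self, name):
        addr = f"{name}@{self.domain}"
        self.inboxes[addr] = []
        return addr

    def publish(self, participants, update):
        # Federation: route the update to each participant's home server.
        for addr in participants:
            home = self.network[addr.split("@")[1]]
            home.inboxes[addr].append(update)

network = {}
a = WaveServer("servera.example", network)
b = WaveServer("serverb.example", network)
alice = a.register("alice")
bob = b.register("bob")

# Alice posts once from her own server; Bob sees it with his one login.
a.publish([alice, bob], "AR blip: mural at 52.37N 4.89E")
print(b.inboxes[bob])
```

<p>Neither user ever logs in to the other’s server, yet both see the same geolocated content – which is exactly why federation removes the per-provider login barrier described above.</p>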
<p><strong>“ARWave will enable individuals to publish easily to everyone… or just a few people,”</strong> Thomas notes:</p>
<p><strong>“To ‘publish’ is also self-publication and distribution in communities or like-minded groups without the hard road of publication or rejection” = publishing on a Wave. No one approves it; anyone can publish to communities, or their friends and family, or even just publish it personally for their own reference.”</strong></p>
<p>But ARWave does not compete with existing AR browsers. On the contrary, AR browsers like Layar, Wikitude and others could implement ARWave and use it to enhance their applications.</p>
<p><strong>“<a href="http://layar.com/" target="_blank">Layar</a></strong><strong> has a killer browser already; ARWave would add social features. They can keep their ‘walled garden’ of data and still join the federation of open data too <img src="../wp-includes/images/smilies/icon_smile.gif" alt=":)" />” (Thomas Wrobel)</strong></p>
<p>Yup, that is the cool part of federation – you can have your cake and eat it too!</p>
<p>Sophia Parafina and I will be organizing a discussion session on ARWave and Federation at <a href="http://upcoming.yahoo.com/event/4909659/CA/Mountain-View/WhereCamp-SF/Google-Maxwell-Tech-Talk/CA/Mountain-View/WhereCamp-SF-2010/Google-Maxwell-Tech-Talk/" target="_blank">WhereCamp</a>, right after Where 2.0, April 3rd and 4th, and <a href="http://twitter.com/dlpeters" target="_blank">Dan Peterson</a>, who is leading the federation effort for Google Wave, will join us.</p>
<p>The  diagrams below illustrate how ARWave and federation can revolutionize  the way we share our augmented realities.</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/03/Screen-shot-2010-03-27-at-6.06.33-PM.png"><img class="alignnone size-medium wp-image-5347" title="Screen shot 2010-03-27 at 6.06.33 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/03/Screen-shot-2010-03-27-at-6.06.33-PM-300x218.png" alt="Screen shot 2010-03-27 at 6.06.33 PM" width="300" height="218" /></a><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/03/Screen-shot-2010-03-27-at-6.06.00-PM.png"><img class="alignnone size-medium wp-image-5345" title="Screen shot 2010-03-27 at 6.06.00 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/03/Screen-shot-2010-03-27-at-6.06.00-PM-300x214.png" alt="Screen shot 2010-03-27 at 6.06.00 PM" width="300" height="214" /></a></p>
<h3><strong>Real Time Social Augmented Experiences</strong></h3>
<p>Another key aspect of ARWave is its near-real-time update capabilities. As Jeff Pulver pointed out in <a href="http://pulverblog.pulver.com/archives/009156.html" target="_blank"><strong>“SXSW 2010: The days twitter became less relevant”:</strong></a></p>
<p><a href="http://pulverblog.pulver.com/archives/009156.html" target="_blank"><strong> </strong></a><strong>â€œAt  <a href="http://click.bsftransmit1.com/ClickThru.aspx?pubids=6954%7c149%7c09546&amp;digest=j9iIm6%2b67%2fKjaKaD%2bG459g" target="_blank">South By Southwest</a> 2010 (SXSW), a strange thing  happened on the way to Austin. A community of twitter faithful shifted  from sharing everything about everything on only twitter (and maybe  Facebook) and changed their habits to rely on learning about what was  happening and where things were happening by using <a href="http://click.bsftransmit1.com/ClickThru.aspx?pubids=6954%7c140%7c09546&amp;digest=vh5VR%2fg1W2H2FHKwRIGl8g" target="_blank">foursquare</a> and <a href="http://click.bsftransmit1.com/ClickThru.aspx?pubids=6954%7c141%7c09546&amp;digest=SyK27R5EP7LzBWYvodNDpQ" target="_blank">Gowalla</a> instead. Iâ€™m sure there were other products  and platforms being used including <a href="http://click.bsftransmit1.com/ClickThru.aspx?pubids=6954%7c142%7c09546&amp;digest=Nd55%2flEGjFr3lopcn8%2fqiA" target="_blank">Loopt</a> and <a href="http://click.bsftransmit1.com/ClickThru.aspx?pubids=6954%7c143%7c09546&amp;digest=rJYwQX8VJw9Bww36xQ1Lbg" target="_blank">GySPii</a> but foursquare and Gowalla were the dominant  platforms.â€<br />
</strong></p>
<p>Later Jeff wrote:</p>
<p><strong>“There were times where I could feel the ebbs and the flows of the people move as different people checked into various locations. While most of this was felt locally in the place I was in, it also became apparent on the platforms when hundreds of people would rush to check in to a location. There were also times when it felt like I was chasing ghosts; these were the times I would go to a spot because a friend had checked into that spot, only to discover they were no longer there.”</strong></p>
<p>ARWave’s real-time collaborative capabilities are going to introduce some fascinating dynamics to “chasing ghosts,” as the ARWave framework gets integrated into services like foursquare – a project we have already begun to look at.</p>
<h3><strong>Augmented Reality  Search</strong></h3>
<p>As I mention <a href="../../2010/03/18/visual-search-augmented-reality-and-physical-hyperlinks-for-playfulness-not-just-purchases-talking-with-paige-saez-about-imagewiki/" target="_blank">in my previous post</a>, ARWave presents some fascinating possibilities for AR search. For example, one might do advanced searching within waves using SPARQL, which could then display in the form of a personal blip in your viewpoint (which in turn could be shared with others). Linked data will be massively important in filtering and delivering useful info for augmented views (<a href="../../2010/03/03/the-game-is-about-the-world-not-dragons-talking-with-will-wright/" target="_blank">see my conversation with Will Wright</a> about the problem of augmented reality overriding our very smart instincts and ending up useless or worse as a result).</p>
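<p>A geo-filtered search of this kind might look something like the sketch below – plain Python standing in for a real SPARQL query over linked data, with an invented blip structure, so treat it only as an illustration of the “search result as personal blip” idea:</p>

```python
from math import radians, sin, cos, asin, sqrt

def km_between(lat1, lon1, lat2, lon2):
    """Great-circle distance (haversine formula) in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    h = sin((lat2 - lat1) / 2) ** 2 + \
        cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(h))

def search_blips(blips, here, radius_km, keyword):
    """Return a 'personal result blip' listing nearby matching blips."""
    lat, lon = here
    hits = [b for b in blips
            if keyword in b["text"].lower()
            and km_between(lat, lon, b["lat"], b["lon"]) <= radius_km]
    return {"kind": "search-result", "query": keyword, "hits": hits}

blips = [
    {"lat": 37.39, "lon": -122.08, "text": "Taqueria review: great al pastor"},
    {"lat": 37.40, "lon": -122.08, "text": "Lost cat, reward"},
    {"lat": 48.85, "lon": 2.35, "text": "Taqueria pop-up this weekend"},
]
result = search_blips(blips, here=(37.39, -122.08),
                      radius_km=5, keyword="taqueria")
print(len(result["hits"]))  # only the nearby taqueria blip matches
```

<p>The result blip could then be rendered in your viewpoint like any other blip, and shared with other wave participants.</p>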
<p>Anselm Hook, who I interviewed in depth recently about <a title="Permanent Link to Visual Search, Augmented Reality and a Social Commons for the Physical World Platform: Interview with Anselm Hook" rel="bookmark" href="http://docs.google.com/2010/01/17/visual-search-augmented-reality-and-a-social-commons-for-the-physical-world-platform-interview-with-anselm-hook/">Visual Search, Augmented Reality and a Social Commons for the Physical World Platform: Interview with Anselm Hook</a>, has some very interesting thoughts on real-time stuff, trading brokerages, and the view within a single city block, which he elaborated on in the second half of this interview, coming soon on UgoTrade!</p>
<h3><strong>The  ARWave Developers</strong></h3>
<p>There are three people who unfortunately can’t join us at Where 2.0 – the costs of travelling from Europe being an obstacle. But as they have been developing the code for ARWave that will rock our augmented world, I asked them, in a Wave conversation, to give me a few comments about their interest in working on ARWave, and a pic and a short bio. I should also mention the work of the PyGoWave team, whose incredibly fast work creating <a id="stt3" title="PyGoWave" href="http://pygowave.net/">PyGoWave</a> has given ARWave a rocket launch pad. Many thanks also to the Wave community; see the <a id="vma_" title="Wave Federation Protocol documentation" href="http://www.waveprotocol.org/">Wave Federation Protocol documentation</a>, <a id="exsg" title="Google's Wave Server" href="https://wave.google.com/wave">Google&#8217;s Wave Server</a>, and <a id="b:s7" title="RubyOnSails" href="http://wiki.github.com/danopia/ruby-on-sails/">RubyOnSails</a> (a Ruby on Rails-based Wave server).</p>
<p><a href="http://need2revolt.wordpress.com/" target="_blank"><strong>Davide   Carnovale</strong></a> @need2revolt</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/03/davide.jpg"><img class="alignnone size-thumbnail wp-image-5349" title="davide" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/03/davide-150x150.jpg" alt="davide" width="150" height="150" /></a></p>
<p><strong>“Imho, the coolest geolocation-related thing is that we’re making a world where the info does not necessarily come from an explicit search by the user, but comes also from the actual location you’re in. For instance, you can have special offers in stores like foursquare does, or your friends can leave geolocated notes for you that are triggered when you walk by. We can have games based on the treasure-hunt schema requiring you to actually go to specific locations.</strong></p>
<p><strong>Other than this I can think of self-guided tours of the city, maybe user-generated too, or for museums.<br />
</strong></p>
<p><strong>Naturally these are long-term goals with some real-life use cases.</strong></p>
<p><strong>As for my bio, there isn’t much to say… I got a first-level degree in computer science and I’m taking the second (and last) level. I’ve developed with mobile agents, osgart/artoolkit, brain-computer interfaces, linux kernel and that’s pretty much all…”</strong></p>
<p><strong><br />
</strong></p>
<p><strong><a href="http://www.lostagain.nl/" target="_blank">Thomas Wrobel</a></strong></p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/03/Screen-shot-2010-03-28-at-4.35.59-AM.png"><img class="alignnone size-thumbnail wp-image-5354" title="Screen shot 2010-03-28 at 4.35.59 AM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/03/Screen-shot-2010-03-28-at-4.35.59-AM-150x150.png" alt="Screen shot 2010-03-28 at 4.35.59 AM" width="150" height="150" /></a></p>
<p><strong>&#8220;If you are looking for specific advantages of using Wave I&#8217;d say:<br />
</strong><strong> </strong></p>
<p><strong>* Federated – letting creators tap into a bigger userbase. Each new app or data layer will add to the “incentive” for users to join in. Google had some good stats a few months back as to how much a simple login screen can put people off using stuff. By breaking that barrier it should make AR userbases grow.</strong></p>
<p><strong>* It deals with user accounts,  permissions, and real-time updating without creators needing to make a  new server standard themselves. It lowers barriers to development.</strong></p>
<p><strong>* As the clients, servers, and data can be made separately by different parties, it’s easier for developers to concentrate on just providing what they want. You want to just make content? No problem! You don’t need to worry about doing anything else but that. It would become as easy as making a webpage (or easier!).</strong></p>
<p><strong>* Bots will allow the development of interactive AR games very easily. Just like a modern version of IRC bots: the infrastructure does the heavy lifting, and interesting things can be done with just simple scripting.</strong></p>
<p><strong>* The idea is anyone will be able to make a layer onto the world, and people can mix, match and share their layers as they wish. It’s not just the data that becomes interesting to see augmenting our world, but the combinations of data! For example, perhaps you could see the profits generated by different companies above their buildings, but also see how environmentally friendly they are at the same time. Or maybe see pollution levels against health statistics. Seeing combinations of geolocated data from different sources at the same time has many interesting possibilities for scientific as well as casual (game/map/chat) use.</strong></p>
<p><strong>hmz… I could go on forever listing stuff here really…</strong></p>
<p><strong>I guess if we are supposed  to be forming a roadmap of significant/interesting things for ARWave?</strong></p>
<p><strong>*  Example clients letting people make their own layers (waves) and add  points to them.</strong></p>
<p><strong>* Letting people log in to different  servers</strong></p>
<p><strong>* Servers federated together (not our responsibility, but an essential part of the roadmap).</strong></p>
<p><strong>* Anyone logged into any server can see data from anyone else that&#8217;s shared with them, regardless of which server they are logged into.</strong></p>
<p><strong> * 3D support, demonstrating various sorts of geolocated data?</strong></p>
<p><strong>* Use of bots for example games?<br />
----<br />
My bio’s quite simple.<br />
Studied 3D Animation in Portsmouth, UK.<br />
Moved to the Netherlands, have since been working in creating ARG games, and in the last year founded Lostagain (Lostagain.nl).”</strong></p>
<p><strong><br />
</strong></p>
<p><strong><a id="ikdu" title="Markus Strickler" href="http://twitter.com/kusako">Markus  Strickler @kusako</a></strong></p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/03/markus.jpg"><img class="alignnone size-thumbnail wp-image-5350" title="markus" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/03/markus-150x150.jpg" alt="markus" width="150" height="150" /></a></p>
<p><strong>“I think the main point behind ARWave is to go beyond simply displaying existing placemarks on top of a live camera view, towards a highly personalized, augmented world where everybody can edit and share localized information collaboratively and in real time. Wave provides the means to do this through its model of persistent real-time conversations and adds even more by providing a way for personal agents (robots) to participate in these conversations.</strong></p>
<p><strong>As for my bio: I’ve been developing Web applications for the last 15 years, hold a degree in Image Sciences and am currently working as a Java developer in Cologne, Germany.”</strong></p>
]]></content:encoded>
			<wfw:commentRss>http://www.ugotrade.com/2010/03/29/the-next-wave-of-ar-exploring-social-augmented-experiences-at-where-2-0/feed/</wfw:commentRss>
		<slash:comments>3</slash:comments>
		</item>
		<item>
		<title>The Physical World Becomes a Software Construct: Talking with Brady Forrest about Where 2.0, 2010</title>
		<link>http://www.ugotrade.com/2010/02/10/the-physical-world-becomes-a-software-construct-talking-with-brady-forrest-about-where-2-0-2010/</link>
		<comments>http://www.ugotrade.com/2010/02/10/the-physical-world-becomes-a-software-construct-talking-with-brady-forrest-about-where-2-0-2010/#comments</comments>
		<pubDate>Wed, 10 Feb 2010 05:37:24 +0000</pubDate>
		<dc:creator><![CDATA[Tish Shute]]></dc:creator>
				<category><![CDATA[Ambient Devices]]></category>
		<category><![CDATA[Android]]></category>
		<category><![CDATA[Artificial general Intelligence]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Artificial Life]]></category>
		<category><![CDATA[Augmented Reality]]></category>
		<category><![CDATA[culture of participation]]></category>
		<category><![CDATA[digital public space]]></category>
		<category><![CDATA[Instrumenting the World]]></category>
		<category><![CDATA[internet of things]]></category>
		<category><![CDATA[iphone]]></category>
		<category><![CDATA[Mixed Reality]]></category>
		<category><![CDATA[mobile augmented reality]]></category>
		<category><![CDATA[mobile meets social]]></category>
		<category><![CDATA[Mobile Phones in Africa]]></category>
		<category><![CDATA[Mobile Reality]]></category>
		<category><![CDATA[Mobile Technology]]></category>
		<category><![CDATA[online privacy]]></category>
		<category><![CDATA[Participatory Culture]]></category>
		<category><![CDATA[sustainable mobility]]></category>
		<category><![CDATA[ubiquitous computing]]></category>
		<category><![CDATA[ARCommons]]></category>
		<category><![CDATA[are2010]]></category>
		<category><![CDATA[ARWave]]></category>
		<category><![CDATA[augmented reality event]]></category>
		<category><![CDATA[Brady Forrest]]></category>
		<category><![CDATA[crisis management]]></category>
		<category><![CDATA[Crisis Mappers]]></category>
		<category><![CDATA[CrisisCamp]]></category>
		<category><![CDATA[CrisisMapping]]></category>
		<category><![CDATA[Food Genome]]></category>
		<category><![CDATA[google goggles]]></category>
		<category><![CDATA[Google Wave]]></category>
		<category><![CDATA[Google Wave Federation Protocol]]></category>
		<category><![CDATA[H.E.AI.D]]></category>
		<category><![CDATA[human energized artificial intelligence]]></category>
		<category><![CDATA[hyperlocal search]]></category>
		<category><![CDATA[hyperlocal view]]></category>
		<category><![CDATA[image links]]></category>
		<category><![CDATA[iPad]]></category>
		<category><![CDATA[ISMAR 2009]]></category>
		<category><![CDATA[local search]]></category>
		<category><![CDATA[location based analysis]]></category>
		<category><![CDATA[location based technologies]]></category>
		<category><![CDATA[Mike Liebhold]]></category>
		<category><![CDATA[Mixer Labs]]></category>
		<category><![CDATA[mobile social]]></category>
		<category><![CDATA[mobile social interaction]]></category>
		<category><![CDATA[Nathan Torkington]]></category>
		<category><![CDATA[O'Reilly Media]]></category>
		<category><![CDATA[Open CV]]></category>
		<category><![CDATA[Open Street Map]]></category>
		<category><![CDATA[OpenAR]]></category>
		<category><![CDATA[Ovi]]></category>
		<category><![CDATA[People Finder]]></category>
		<category><![CDATA[physical hyperlinks]]></category>
		<category><![CDATA[proximity-based social networking]]></category>
		<category><![CDATA[real-time social location aware applications]]></category>
		<category><![CDATA[social augmented experiences]]></category>
		<category><![CDATA[Steve the Robot H.E.AI.D]]></category>
		<category><![CDATA[Twitter and geolocation]]></category>
		<category><![CDATA[Uber Geek]]></category>
		<category><![CDATA[Ushahidi]]></category>
		<category><![CDATA[Vernor Vinge]]></category>
		<category><![CDATA[visual search]]></category>
		<category><![CDATA[Where2.0]]></category>
		<category><![CDATA[WhereCamp]]></category>
		<category><![CDATA[Yelp Monocle]]></category>

		<guid isPermaLink="false">http://www.ugotrade.com/?p=5137</guid>
		<description><![CDATA[&#8220;The internet eats everything it touches,&#8221; write Brady Forrest and Nathan Torkington, O&#8217;Reilly Media, Inc., in their must-read 2006 companion essay The State of Where 2.0 (PDF). Now in 2010 that statement is more true than ever. Last week, I talked to Brady about what we can look forward to at Where 2.0, 2010, [&#8230;]]]></description>
				<content:encoded><![CDATA[<p><a href="http://heaid.com/" target="_blank"><img class="alignnone size-medium wp-image-5138" title="Screen shot 2010-02-08 at 11.05.18 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/02/Screen-shot-2010-02-08-at-11.05.18-PM-300x202.png" alt="Screen shot 2010-02-08 at 11.05.18 PM" width="300" height="202" /></a></p>
<p>&#8220;The internet eats everything it touches,&#8221; write <a href="http://radar.oreilly.com/brady/" target="_blank">Brady Forrest</a> and <a href="http://nathan.torkington.com/" target="_blank">Nathan Torkington</a>, O&#8217;Reilly Media, Inc., in their must-read 2006 companion essay <a style="border-width: 0px; margin: 0px; padding: 0px; color: #a43000; text-decoration: none;" title="Opens link in a new browser window." href="http://assets.en.oreilly.com/1/event/4/state_of_where_20.pdf" target="_blank">The State of Where 2.0</a> (PDF). Now in 2010 that statement is more true than ever.</p>
<p>Last week, I talked to Brady about what we can look forward to at <a href="http://en.oreilly.com/where2010" target="_blank">Where 2.0, 2010</a>, and what he thinks will be the &#8220;internet eating&#8221; trends emerging this year. Brady is uniquely positioned to get a glimpse of things to come. His job for O&#8217;Reilly Media is tracking changes in technology and organizing large scale events, including Where 2.0, which he chairs, and Web 2.0 Expo in San Francisco and NYC, which he co-chairs. Brady also runs <a href="http://ignite.oreilly.com/" target="_blank">Ignite</a>, and previously worked at Microsoft on Live Search. And, when not doing his day job, he participates in such Uber Geek activities as <a id="swtp" title="Steve the Robot H.E.AI.D - A Human Energized Artificial Intelligence Device...with lasers and generative sound." href="http://heaid.com/?page_id=5">Steve the Robot H.E.AI.D &#8211; A Human Energized Artificial Intelligence Device&#8230;with lasers and generative sound</a> (click on pic above or see <a id="qvff" title="video here" href="http://vimeo.com/7153320">video here</a>). Look out for <a id="swtp" title="Steve the Robot H.E.AI.D - A Human Energized Artificial Intelligence Device...with lasers and generative sound." href="http://heaid.com/?page_id=5">Steve the Robot H.E.AI.D</a> at <a id="sfnk" title="Augmented Reality Event, June 2nd and 3rd, Santa Clara, CA" href="http://augmentedrealityevent.com/">Augmented Reality Event, June 2nd and 3rd, Santa Clara, CA</a>, and a presentation from Brady.</p>
<p>As <a href="http://en.wikipedia.org/wiki/Vernor_Vinge" target="_blank">Vernor Vinge</a> pointed out in his intro to <a href="http://www.ismar09.org/" target="_blank">ISMAR 2009</a>, the &#8220;possibilities are both scary and wondrous&#8221; as &#8220;the physical world becomes much more like a software construct.&#8221; Brady Forrest has taken a lead role in shaping this transformation since 2004, when &#8220;&#8216;local search&#8217; was interesting but not yet real.&#8221;</p>
<p><a id="j70w" title="Where 2.0" href="http://en.oreilly.com/where2010">Where 2.0</a>, together with <a id="y46x" title="WhereCamp" href="https://wherecamp.pbworks.com/session/login?return_to_page=FrontPage">WhereCamp</a> (this year at Google), constitutes WhereWeek &#8211; a crucible for emerging trends in web mapping platforms and location based technologies. This year augmented reality, proximity-based social networking, local search, and the rapidly maturing field of Crisis Management are in the mix, along with the huge and long established GIS industry, which has moved rapidly into the Where 2.0 space.</p>
<p>But what business models will oxygenate the system is still a key question &#8211; one Brady discusses in the interview below. Certainly, the usefulness of location based analysis, mapping, new interfaces, and bringing this data to every application is clear.</p>
<p>Crisis management is center stage this year: <a href="http://en.oreilly.com/where2010/public/schedule/speaker/2345">Jeffrey Johnson</a> (Open Solutions Group), <a href="http://en.oreilly.com/where2010/public/schedule/speaker/67704">John Crowley</a> (Star-Tides), and <a href="http://en.oreilly.com/where2010/public/schedule/speaker/2118">Schuyler Erle</a> (Entropy Free LLC) will present on <a id="d4lf" title="Haiti: CrisisMapping the Earthquake" href="http://en.oreilly.com/where2010/public/schedule/detail/13201">Haiti: CrisisMapping the Earthquake</a>. And Chris Vein &amp; Tim O&#8217;Reilly will &#8220;discuss how cities and application developers will benefit from open data and what these programs will look like in the future&#8221; in the plenary <a id="pv3i" title="City Data" href="http://en.oreilly.com/where2010/public/schedule/detail/14124">City Data</a>.</p>
<p>Mobile social and proximity-based social networking, which may soon emerge as a challenger to web based social networks, and augmented reality are the sexy rockstars of Where 2.0&#8217;s 2010 showcase of potentially disruptive technologies. Augmented Reality has had a breakthrough year, and this is reflected in its strong showing on the Where 2.0 schedule. But, as Brady notes, AR awaits the killer app that will drive it to the next level. Of course, we hope to unveil that at <a href="http://augmentedrealityevent.com/" target="_blank">are2010</a>!</p>
<p>At Where 2.0, I am presenting on the <a id="mknx" title="The Next Wave of AR: Exploring Social Augmented Experiences" href="http://en.oreilly.com/where2010/public/schedule/detail/11046">The Next Wave of AR: Exploring Social Augmented Experiences</a> panel. We will look at how social augmented experiences will be key to the next wave of mobile augmented reality. <a href="http://en.oreilly.com/where2010/public/schedule/speaker/2119" target="_blank">Mike Liebhold</a>, in a complementary presentation, looks at <a id="e0_a" title="Truly Open AR." href="http://en.oreilly.com/where2010/public/schedule/detail/11096">Truly Open AR.</a> If you have been reading Ugotrade, you already know I am an advocate for an open, distributed, real time communications framework for AR &#8211; see <a href="http://arwave.wiki.zoho.com/HomePage.html" target="_blank">ARWave</a>. Wave Federation Protocol is an open, fast, compact, federated communications protocol that is a dream come true for AR. And, I would hazard a guess that in 2010, real time communications plus location will become oxygen.</p>
<p>But also key to the next wave of AR, as I discussed with <a href="http://www.hook.org/" target="_blank">Anselm Hook</a> in this post on <a id="it3q" title="Visual Search, Augmented Reality and a Social Commons for the Physical World Platform" href="../../2010/01/17/visual-search-augmented-reality-and-a-social-commons-for-the-physical-world-platform-interview-with-anselm-hook/">Visual Search, Augmented Reality and a Social Commons for the Physical World Platform</a>, will be a view constructed through complex &#8220;hybrid tracking and sensor fusion techniques&#8221; (Jarell Pair), cooperating cloud data services, powerful search and computer vision algorithms, and apps that learn by context accumulation.</p>
<p>And as Brady notes in the interview below, a key step forward would be <strong>&#8220;to take advantage of your location, but it doesn&#8217;t need to have been mapped before.&#8221;</strong></p>
<p>For some interesting news on the mapping front (<em>and a discount code for Where 2.0 for Radar readers</em>) see Brady&#8217;s post, <a href="http://radar.oreilly.com/brady/" target="_blank">Flickr Photos in Google Street View</a>. These kinds of human built maps have the potential to develop into &#8220;photo-based positioning systems&#8221; that could create new opportunities for augmented reality. Brady asks:</p>
<p><strong>&#8220;how often the Flickr photos get updated, where else these Flickr photos are going to show up in Google&#8217;s services (Google Goggles perhaps?) and will they show up in new search partner <a href="http://www.bing.com/maps/">Bing</a>? I am doubly curious if Facebook will ever let its photos be used in a similar way.&#8221;</strong></p>
<p><a id="ooyl" title="Lior Ron speaking" href="http://en.oreilly.com/where2010/public/schedule/speaker/4743">Lior Ron</a> of Google Goggles will be at Where 2.0 to tell us all about <a id="oy8v" title="Looking into Google Goggles" href="http://en.oreilly.com/where2010/public/schedule/detail/14123">Looking into Google Goggles</a>. And if you want to learn more about how our view of the physical world will be &#8220;rooted in powerful computing, pervasive connectivity, and the cloud,&#8221; don&#8217;t miss this one. I will be there. And I very much hope there is a Q and A with this session.</p>
<p>During our conversation (the full transcript is below), Brady gave me his short list of the breakthroughs that he sees as having big significance in 2010:</p>
<p><strong>&#8220;Well, I think Google Goggles is one of the most exciting things to me. Having access to a visual search&#8230;having someone actually release a visual search engine in that way, to consumers, I think is huge. You know, you see stuff like that in the labs. But I don&#8217;t see it&#8230; it&#8217;s rare to see it out. </strong></p>
<p><strong>I think Android is huge. And the way Google is pushing hardware to show off the platform; so the Nexus One being another example and the fact that it&#8217;s breaking free from the carriers. Because I think when we get away from the carriers we are able to see more innovation; it&#8217;s what&#8217;s going to allow people or developers and companies to really innovate.</strong></p>
<p><strong>And I think Twitter adding geo-location to their APIs and buying <a href="http://www.crunchbase.com/company/mixer-labs" target="_blank">MixerLabs</a> is a huge move. I think Twitter may end up becoming the end-all be-all of location services. They are going to be updated constantly by people; they are going to have a really good grasp, real-time, of what is happening in any one place, at least based on the people. </strong></p>
<p><strong>And then with the addition of the MixerLabs data, they&#8217;re going to have more datasets at their ready, as well as any data that they start to collect from the clients themselves, like from TweetDeck.</strong></p>
<p><strong>So there are global clients that are updating Twitter. I think those are some of the most exciting things. And again, just to come back to Yelp, I think Yelp&#8217;s Monocle is also pretty significant, just because it&#8217;s an AR [augmented reality] app that&#8217;s being pushed into consumers&#8217; hands. </strong></p>
<p><strong>And we&#8217;ll see how useful they find it.&#8221;</strong></p>
<h3><strong><strong>Talking With Brady Forrest</strong></strong></h3>
<p><strong><strong><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/02/bradyandgenomepost.jpg"><img class="alignnone size-medium wp-image-5141" title="bradyandgenomepost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/02/bradyandgenomepost-300x199.jpg" alt="bradyandgenomepost" width="300" height="199" /></a></strong></strong></p>
<p><em>Pic above from WhereCamp 2009, Brady Forrest, facing camera, checks out Mark Powell&#8217;s <a id="a-:n" title="Food Genome Project. Check it out here" href="http://www.foodgenome.com/home">Food Genome Project</a>. <a id="a-:n" title="Food Genome Project. Check it out here" href="http://www.foodgenome.com/home">Check it out here</a> &#8211; it just woke up!</em></p>
<p><strong>Tish Shute:</strong> So last year, when you were <a id="q5wp" title="interviewed for WebMonkey" href="http://www.webmonkey.com/blog/New_Wave_of_Apps_Build__Where__Into_the_Web">interviewed by Michael Calore for WebMonkey</a> before Where 2.0, you said, &#8220;Location is no longer a differentiator; it&#8217;s going to become oxygen.&#8221; And after attending Where Week 2009, I agreed with you and <a id="k.gp" title="wrote about it here" href="../../2009/06/02/location-becomes-oxygen-at-where-20-wherecamp/">wrote about it here</a>. But in what ways did this prediction exceed expectations, and in what ways were you disappointed, now that we are getting close to Where 2.0, 2010?</p>
<p><strong><strong>Brady Forrest:</strong> Well, it exceeded expectations in that there are now five different mobile OS&#8217;s where you can load on third party applications that access users&#8217; locations, which can then be shared out.</strong></p>
<p><strong>And so, what it is making is the possibility of real-time social location aware applications. And this is something that hasn&#8217;t truly been possible in years past. Looking back three years ago when the iPhone launched, it was the first major phone, especially in the US, to be location aware. And a year later, the App Store launched, giving developers full access to location, which previously had been held onto very, very, incredibly tightly by the carriers.</strong></p>
<p><strong>And now, a year and a half later, you have Android, you have Palm Pre, you have Blackberry working on their SDK to make it better, but it still is there. You have Windows Mobile working on their SDK. And, you know, who knows? Maybe even BREW will get into the mix. </strong></p>
<p><strong>And AT&amp;T is opening up their own interactive store.Â  And so, AT&amp;T and Verizon and all their smart phones may now be looking at BREW. </strong></p>
<p><strong><strong>Tish Shute:</strong> </strong>Right. It was very exciting <a href="http://www.ugotrade.com/2009/06/02/location-becomes-oxygen-at-where-20-wherecamp/" target="_blank">last year at Where 2.0</a>, where we had all these new toolsets announced and the iPhone becoming location aware. What were the best implementations of these new capabilities that became available in 2009, do you think? What, in your view, was the most creative, surprising and disruptive?</p>
<p><strong><strong>Brady Forrest:</strong> Well, I am a huge fan of <a href="http://www.youtube.com/watch?v=jHEcg6FyYUo" target="_blank">Yelp Monocle.</a> I think, you know, that is just a real life example of using Augmented Reality. You are on a street. You have got a bunch of restaurants. You have got a bunch of businesses. And just to be able to swing through and look for people&#8230;I mean and look for ratings and reviews. </strong></p>
<p><strong>They have just started to institute check in, so you will be able to know where your friends are and where your friends have gone. And that type of real-time, incredibly useful data is what will make augmented reality a standard part of the landscape. </strong></p>
<p><strong>I think it is that type of data, more so than, say, reference data, that will make people want to have all the possible sensors. So, what do you need for that? You need a camera. You need a compass for orientation. You need a GPS or, at least, a decent location service. And then you need a screen where you can actually see the data, and then you need an Internet connection. </strong></p>
<p><strong>So it is not like any phone can handle this. And so, you are going to need those killer apps to actually drive people to the type of phones that can support this. I don&#8217;t think AR is quite there yet. </strong></p>
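<p>Brady&#8217;s checklist can be sketched as a simple capability test &#8211; a hypothetical illustration in a few lines of Python (the capability names and function are my own assumptions, not a real device API):</p>

```python
# Hypothetical sketch: the sensors Brady says an AR app needs before it
# can render useful overlays.
AR_REQUIREMENTS = {"camera", "compass", "location", "screen", "internet"}

def missing_for_ar(capabilities):
    """Return, sorted, whatever the device still lacks for basic AR."""
    return sorted(AR_REQUIREMENTS - set(capabilities))

# A 2010-era feature phone versus an AR-capable smartphone:
feature_phone = {"screen", "internet"}
smartphone = {"camera", "compass", "location", "screen", "internet"}
```

<p>On the feature phone, <code>missing_for_ar</code> reports the camera, compass and location service as gaps; on the smartphone it returns an empty list &#8211; which is Brady&#8217;s point that only a narrow class of phones can support AR at all.</p>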
<p><strong><strong>Tish Shute:</strong></strong> I agree, for true AR you need more than compass, camera, and GPS. There are some missing pieces for the real deal experience &#8211; and not just a pair of sexy AR specs. As you mention, hybrid tracking and sensor fusion techniques that can combine computer vision technology with compass and GPS are vital. We need the compass. We need the GPS. We definitely need the camera! But we need this combined with computer vision techniques to get the tracking, mapping and registration for true AR, or even to deliver a stable experience with the post-it/geonote AR that we see emerging with Layar, Wikitude, and others. At the moment we need to put together the tools for a true AR hyper-local experience.</p>
<p>And, of course, another aspect of this is the kind of physical hyper-links that applications like Google Goggles are building.</p>
<p>Do you have a speaker from Google Goggles at Where 2.0? I would be absolutely fascinated to hear more about their road map.</p>
<p><strong>Brady Forrest: I was loading Google Goggles onto the program yesterday.</strong></p>
<p><strong><strong>Tish Shute:</strong> </strong>Oh, you did? Oh, fantastic. And you have <a id="namh" title="Lior Ron speaking" href="http://en.oreilly.com/where2010/public/schedule/speaker/4743">Lior Ron speaking</a>!</p>
<p><strong><strong>Brady Forrest:</strong> It is actually possible it is not up on the website, but I talked to them and got them to agree to do a talk on it.</strong></p>
<p><strong><strong>Tish Shute: </strong></strong>I very much want to hear more about their road map. Google Goggles is a very, very significant step towards the physical internet and this integration of computer vision with sensor fusion techniques necessary for true AR.</p>
<p><strong><strong>Brady Forrest:</strong> I mean that combination with Computer Vision is going to be incredibly valuable. And then the other issue you have there is: is it on the client, or is it on the server? And right now, Google Goggles is definitely on the server, and that is not fast enough for real-time AR. So that is more of a 10 blue links IO interface. </strong></p>
<p><strong><strong>Tish Shute:</strong></strong> And also, they haven&#8217;t got an open API, have they?</p>
<p><strong><strong>Brady Forrest:</strong> No, not yet.<br />
<strong><br />
Tish Shute:</strong> </strong>Maybe they will announce that. Can you nudge them? For true AR, we need to move forward in several areas &#8211; of course, there are the mediating device issues, like access to the video buffers in the iPhone, and the development of cool AR eye wear would be nirvana!</p>
<p>But my recent obsession has been working on a real-time communications infrastructure for AR, because that is quite doable now, yet we don&#8217;t really have that real-time infrastructure, i.e. a real-time mobile social utility that is really up to the real time requirements of AR [see more about this <a href="http://www.ugotrade.com/2009/11/19/the-next-wave-of-ar-mobile-social-interaction-right-here-right-now/" target="_blank">here</a> and on the <a href="http://arwave.wiki.zoho.com/HomePage.html" target="_blank">ARWave</a> wiki].</p>
<p>But we certainly don&#8217;t have the integration of computer vision and sensor techniques, and the access to the big image databases we need, let alone the clients we need to put it all together!</p>
<p><strong><strong>Brady Forrest:</strong> Google has done work to help out the community with their support of <a href="http://opencv.willowgarage.com/wiki/" target="_blank">Open CV</a>. </strong></p>
<p><strong>It is based out of <a href="http://www.willowgarage.com/" target="_blank">Willow Garage</a>, but I believe that Google has done quite a bit of work on it.</strong></p>
<p><strong><strong>Tish Shute:</strong> </strong>Could you talk a bit more about Open CV?</p>
<p><strong><strong>Brady Forrest: </strong><a href="http://oreilly.com/catalog/9780596516130" target="_blank">O&#8217;Reilly has a 500-page book</a> on it. It came out of the Darpa Project, or the Darpa Contest, where unmanned vehicles are raced. And that has since become, at least in my mind, the primary computer vision library that people work with. </strong></p>
<p><strong>I actually used it&#8230;or, one of the teammates did, on the project we did this summer. We implemented an OpenCV app pretty quickly that detected where people were, and then we would play music based on that. </strong></p>
<p><strong><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/02/3185351345_67e3514d36_o.jpg"><img class="alignnone size-medium wp-image-5144" title="3185351345_67e3514d36_o" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/02/3185351345_67e3514d36_o-300x225.jpg" alt="3185351345_67e3514d36_o" width="300" height="225" /></a></strong></p>
<p><a href="http://www.flickr.com/photos/55361487@N00/3185351345/" target="_blank"><em>Uber Geek Meeting from ShellyShelly&#8217;s photostream</em></a><br />
<strong>Tish Shute:</strong> Is that your Burning Man project? Do you have a link for that, and some pictures, video?</p>
<p><strong>Brady Forrest:</strong> <strong>Yeah.Â  <a id="riim" title="Heaid.com" href="http://heaid.com/">Heaid.com</a>.Â  Human Enhanced Artificial Intelligence Dancing.<br />
</strong></p>
<p><strong>Tish Shute:</strong> Thank you! This year the augmented reality story has been fairly basic &#8211; relying on basic sensors: compass, GPS, accelerometers. But it has also been an exciting year because we hadn&#8217;t even had smart phones with the camera, and GPS, and compass before this.</p>
<p>But now, the big adventure is to hook all these sensor fusion techniques up with computer vision so that we can actually do reverse positioning, for example from photos of what we are looking at, right?</p>
<p><strong>Brady Forrest:</strong> <strong>Yeah, and start to use it in a more ad-hoc manner so that as you are traveling around, yes, it will take advantage of your location, but it doesnâ€™t need to have been mapped before.</strong></p>
<p><strong>Tish Shute:</strong> Right &#8211; moving from mapping to context awareness. Could you give a quick explanation of what you did in your Burning Man project and how that relates to this kind of ad-hoc, on-the-fly beginning to know what you are looking at without it having been mapped before? That is fascinating.</p>
<p><strong>Brady Forrest:</strong> <strong>Sure. So we mounted a camera about 30 feet off the ground. And as people would move underneath or dance, they would move from block to block. And we had created a kind of bitmap of the area underneath and set up different sound zones. So as people moved from zone to zone, it would play different music.</strong></p>
<p><strong>And we used Max/MSP, which has an Open CV library, to handle the computer vision part and to handle determining which of the audio to fire off. And then, also, we had a laser that would play at the same time.</strong></p>
<p><strong>And then we used Ableton Live, which is a very popular DJ software, to actually handle the music. So as someone moved from, say, square A to square B, it would fire off various MIDI signals and Ableton would interpret that. And each person who went in, up to&#8230;well, theoretically, up to 4-8 people. But because of how small the stage was and how the sounds are played, realistically, more like 4-6 people.</strong></p>
<p><strong>Each person had their own set of sounds.</strong></p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/02/3921063406_db4fbee6af_b.jpg"><img class="alignnone size-medium wp-image-5145" title="3921063406_db4fbee6af_b" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/02/3921063406_db4fbee6af_b-300x168.jpg" alt="3921063406_db4fbee6af_b" width="300" height="168" /></a></p>
<p><em>Pic from <a href="http://www.flickr.com/photos/extramatic/"><strong>extramatic</strong></a>&#8216;s Flickr </em><a id="sgdt" title="stream here" href="http://www.flickr.com/photos/extramatic/3921063406/sizes/l/"><em>stream here</em></a></p>
<p><strong>Tish Shute: </strong> Wow! Awesome.</p>
<p><strong>Brady Forrest:</strong> <strong>We would be able to detect different people, assign them a sound, or a set of sounds &#8211; so, like bass, drums, vocals. And then we would have clips that played well together that were 3-5 seconds in length.</strong></p>
<p><strong>Tish Shute:</strong> At what distance could you detect people?</p>
<p><strong>Brady Forrest: </strong> <strong>We had a 22 foot area underneath the camera. That was mostly based on what the lens could capture.</strong></p>
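<p>The zone-triggering logic Brady describes can be sketched like this &#8211; a hypothetical reconstruction, not the actual H.E.AI.D code (the 4x4 grid, the clip names, and the function names are all my own assumptions):</p>

```python
# Hypothetical sketch of the zone logic: the 22-foot area under the camera
# is divided into a grid of sound zones; a dancer's detected floor position
# picks a zone, and the zone picks one of that dancer's own clips.
GRID_COLS, GRID_ROWS = 4, 4   # assumed zone layout
AREA_FEET = 22.0              # the area the lens could capture

def position_to_zone(x_ft, y_ft):
    """Map a dancer's floor position (in feet) to a zone index."""
    col = min(int(x_ft / AREA_FEET * GRID_COLS), GRID_COLS - 1)
    row = min(int(y_ft / AREA_FEET * GRID_ROWS), GRID_ROWS - 1)
    return row * GRID_COLS + col

def clip_for(person_id, zone, clips_per_person=4):
    """Each dancer gets their own set of clips; the zone selects one."""
    return f"person{person_id}-clip{zone % clips_per_person}"
```

<p>In the real installation the positions came from computer vision person detection, and the selected clip was fired off as a MIDI signal for Ableton Live to interpret.</p>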
<p><strong>Tish Shute:</strong> OMG I love this! This is really the next step for augmented realities &#8211; not just attaching reference data to the world but exploring new shared &#8220;cosensual realities&#8221; (see Anselm Hook&#8217;s interview part 2 upcoming).</p>
<p>I am very interested in something you talk about a lot in your &#8220;State of Where 2.0&#8221; essay: lifestyle coming first for a potentially disruptive technology, not commercial considerations. I still have to post the second half of my interview with Anselm Hook, but Anselm has some brilliant ideas in this area. He is working on a project called <a href="http://makerlab.org/news/21" target="_blank">Angel</a>, where part of the vision is for people to actually find what they need without explicitly having to ask for it.</p>
<p>And this brings me to something that is very noticeable about Where 2.0 this year, and very exciting. This is that location aware technology and crisis management have basically matured, haven&#8217;t they? We are beginning to see really useful stuff in this area now.</p>
<p>What is different this year that has brought crisis management and location aware technology together, a world in crisis?</p>
<p><strong>Brady Forrest: </strong> <strong>Well, I think the primary thing that has brought all these technologies together is Haiti. Without Haiti&#8230;a lot of times, future crises benefit from the current one, because people put in a lot of work. And so, there is new infrastructure being laid with things such as <a href="http://www.ushahidi.com/" target="_blank">Ushahidi</a>, which is an open source platform originally built for tracking election violence, but now being used to track people and their locations and food requests in Haiti.</strong></p>
<p><strong>Also, Haiti did not have solid, accessible, good maps at the time of the earthquake. And there have been two volunteer projects that have sprung up to help with that. One being headed by the <a href="http://www.harrywood.co.uk/blog/2010/01/21/haiti-earthquake-on-openstreetmap/" target="_blank">OpenStreetMap Foundation</a> and many volunteers. And then the other, Google Map Maker. And in both cases the activity around Haiti on these programs went up exponentially&#8230;or, I don&#8217;t know about exponentially, but a lot. In the case of Map Maker, it was up 100 times and Haiti was the most worked on country for that week. And one of the most downloaded for that week.</strong></p>
<p><strong>Tish Shute:</strong> Yes the work being done in <a href="http://crisiscommons.org/" target="_blank">CrisisCamps</a> around the country is very encouraging.</p>
<p><strong>Brady Forrest: And then also, you know, not just Ushahidi or Open Street Map, but also the <a href="http://haiticrisis.appspot.com/" target="_blank">People Finder</a>, which had an open API so that different organizations could share their data, thus learning from Katrina. All these different pieces of technology will be used in the future and hopefully be able to save more lives. I didn&#8217;t see&#8230;there are iPhone apps that were released. But I&#8217;m not aware of any Android apps. I&#8217;m not aware of any AR apps.</strong></p>
<p><strong>Tish Shute:</strong> We don&#8217;t have smart phone devices distributed widely enough for them to be appropriate, do we, in a lot of areas where crisis strikes.</p>
<p><strong>Brady Forrest:</strong> <strong>Yeah, and there was criticism that they shouldn&#8217;t have been on iPhone. You know, that iPhones were a waste of time. Because they aren&#8217;t&#8230;a lot of on the ground agencies aren&#8217;t going to have iPhones. However, a lot of people who are going from the States will, and if the apps are there, then people will start to have them.</strong></p>
<p><strong>But relatively speaking, an iPhone is not that expensive.</strong></p>
<p><strong>Tish Shute:</strong> One thing I noticed &#8211; and actually I discussed this in the second half of the interview I did with Anselm, which I am getting ready to post &#8211; is that one aspect of the crisis filter was having people working as curators looking at messages coming out of Haiti. And while integrating the streams that would be useful is still probably a challenge, many curators will be on iPhones because they are based in the US.</p>
<p>We need to work across all platforms probably.<br />
<strong><br />
Brady Forrest:</strong> <strong>Yes. Patrick Meier of Ushahidi, who runs <a href="http://www.crisismappers.net/forum/topics/task-force-haiti-earthquake" target="_blank">Crisis Mappers</a>, he ran a 24/7 emergency room. It was out of the Fletcher School in Boston.</strong></p>
<p><strong>They had volunteers all over the States and Canada.Â  They had volunteers in Vancouver that were translating Creole messages in under ten minutes.</strong></p>
<p><strong>Tish Shute:</strong> Yes, and another point that is interesting in terms of the reconstruction and rebuilding of Haiti is the whole idea of leapfrogging &#8211; the idea that, as we&#8217;ve seen in other parts of the world, when you miss pieces of basic infrastructure there is an opportunity to skip a whole stage and go on to the next one, like how virtual banking took off in Africa because of the absence of brick-and-mortar infrastructure.</p>
<p><strong>Brady Forrest:</strong> <strong>To skip to a topic that&#8217;s been in my head, I&#8217;m just so bummed that the iPad does not have a camera.</strong></p>
<p><strong>Tish Shute:</strong> Bummed is barely the word I would use. Particularly as we had just been planning our groundbreaking AR/next-generation ebook in the days leading up to the announcement!</p>
<p>I suppose there is the hope they&#8217;re going to put it in the next one. But I suppose the play for conventional content delivery is so big that everything else is trivial in comparison &#8211; especially, it seems, jump-starting the emerging augmented reality industry!</p>
<p>So we might get thrown a camera and compass in the next round, but will we get access to the video buffers? AR enthusiasts may have to live on table scraps from Apple a bit longer, it seems.</p>
<p>But what blows my mind is why the iTouch hasn&#8217;t got a camera and been AR enabled. AR gaming would get an enormous boost from that alone. My son loves even the simple-minded AR games available now on the iPhone, and he loves iPhone games &#8211; he has 110 games downloaded!</p>
<p><strong>Brady Forrest:</strong> <strong>Ridiculous. Yeah. I don&#8217;t know what they don&#8217;t like about cameras. And I plan on getting an iPad, but because of the limitations I plan on using it for base content and will probably get the bottom-line model. I can&#8217;t imagine&#8230;I don&#8217;t know.</strong></p>
<p><strong><strong>Tish Shute:</strong> </strong>It is very interesting; who actually puts together the big enabling mediating device for AR is still an open question, isn&#8217;t it? I mean, that&#8217;s the truth; we have sort-of mediating devices, but we don&#8217;t have the magic brew yet, do we?</p>
<p><strong><strong>Brady Forrest:</strong> No. Not yet.</strong></p>
<p><strong><strong>Tish Shute:</strong></strong> Good enough in some ways, and certainly a start, but not quite the real deal. For me, Where 2.0 this year covers the groundwork for true AR: mobile proximity-based social networking, visual search, computer vision and sensor fusion techniques&#8230;. And because all these things have a chicken-and-egg relationship, laying the groundwork is basically as important as having the mediating device; otherwise you can&#8217;t do interesting things when we get the mediating device, right?</p>
<p>Is this the year we get the magic brew for AR, i.e., the business model, the killer app, and the mediating device?</p>
<p><strong><strong>Brady Forrest:</strong> This is not the year.</strong></p>
<p><strong><strong>Tish Shute:</strong> </strong>Then I should ask you: are you in the Goggles camp? That is, do you think AR needs eyewear to go mainstream?</p>
<p><strong><strong>Brady Forrest:</strong> I think this may be where we get&#8230;we start to see what is going to be the killer app that gets people to buy the hardware that will support AR. You see what I mean? And then from there the apps will come out and the hardware will advance in that direction.</strong></p>
<p><strong>I don&#8217;t think AR has made that leap yet. It hasn&#8217;t, to use almost a cliché, crossed the chasm yet, and it hasn&#8217;t proven that it will. Because I don&#8217;t know if&#8230;I think it&#8217;s difficult to tell right now. Is it going to be games? Is it going to be data layers? What is going to drive people to an AR device, especially one fully dedicated to it?</strong></p>
<p><strong><strong>Tish Shute:</strong> </strong>In terms of AR games taking off, I think a bit of help from the mediating device &#8211; e.g. access to the iPhone video buffers &#8211; would probably be enough to stoke AR games into being a hot commodity. But in terms of AR data layers going mainstream, we need some of the other players in the location space to put together the magic brew on the business model, don&#8217;t we?</p>
<p><strong><strong>Brady Forrest:</strong> That&#8217;s why I&#8217;m so curious though&#8230;that&#8217;s why I gave Yelp their own talk. Those guys are gangbusters; they&#8217;re a consumer company, a very consumer-facing website. They&#8217;ve got amazing data stores. They do a lot of interesting stuff with their data. And I don&#8217;t think people always give them the geek credit they deserve.</strong></p>
<p><strong><strong>Tish Shute: </strong></strong>You began Where 2.0 back in 2004, when, as you point out, &#8220;&#8216;local search&#8217; was interesting but not yet real,&#8221; and you have always stressed something that&#8217;s proven to be absolutely true, which is lifestyle before commerce, right? And that if location-based services were going to be big, it was because they meant something in terms of our lifestyle, not just because they told us where to get another good burger. Right?</p>
<p>I think there&#8217;s been a lot of breakthrough in that area this year, in terms of what location-based services and proximity-based social networks are to us now and how they&#8217;re changing our lifestyle. What do you see as the breakthroughs of 2009, and what are you hoping for in 2010?</p>
<p><strong><strong>Brady Forrest:</strong> Well, I think Google Goggles is one of the most exciting things to me. Having access to a visual search&#8230;having someone actually release a visual search engine in that way, to consumers, I think is huge. You know, you see stuff like that in the labs, but I don&#8217;t see it&#8230;it&#8217;s rare to see it out.</strong></p>
<p><strong>I think Android is huge. And the way Google is pushing hardware to show off the platform &#8211; the Nexus One being another example &#8211; and the fact that it&#8217;s breaking free from the carriers. Because I think when we get away from the carriers we are able to see more innovation; it&#8217;s what&#8217;s going to allow people or developers and companies to really innovate.</strong></p>
<p><strong>And I think Twitter adding geo-location to their APIs and buying MixerLabs is a huge move. I think Twitter may end up becoming the end-all be-all of location services. They are going to be updated constantly by people; they are going to have a really good grasp, real-time, of what is happening in any one place, at least based on the people.</strong></p>
<p><strong>And then with the addition of the MixerLabs data, they&#8217;re going to have more datasets at their ready. As well as any data that they start to collect from the clients themselves, like from TweetDeck.</strong></p>
<p><strong>So there are global clients that are updating Twitter. I think those are some of the most exciting things. And again, just to come back to Yelp, I think Yelp&#8217;s Monocle is also pretty significant, just because it&#8217;s an AR app that&#8217;s being pushed into consumers&#8217; hands.</strong></p>
<p><strong>And we&#8217;ll see how useful they find it.</strong></p>
<p><strong><strong>Tish Shute:</strong> </strong><a href="http://en.oreilly.com/where2010/public/schedule/speaker/24907" target="_blank">Gary Gale, Yahoo! Inc.,</a> is going to talk on overcoming the business, social, and technological hurdles so we can reach the long-promised [laughs] Hyperlocal Nirvana. I think you&#8217;ve outlined some of these obstacles in relation to AR &#8211; where the obstacles are in terms of the mediating device, and in bringing all the pieces together, including computer vision techniques, in order to have an AR view. That&#8217;s the AR side of it. But the layer below that, which is the layer where actual location-based apps are beginning to go mainstream now &#8211; are these presenting successful business models for location-based services?</p>
<p>So in short, in your view, what are the big hurdles to Hyperlocal Nirvana before we get to AR, even just for these location-based services?</p>
<p><strong><strong>Brady Forrest:</strong> Well, how do you make money?</strong></p>
<p><strong><strong>Tish Shute:</strong> </strong>Yeah, to put it bluntly. I like <a href="http://battellemedia.com/" target="_blank">John Battelle&#8217;s</a> way of putting it [laughs]: how do we oxygenate the system!</p>
<p><strong><strong>Brady Forrest:</strong> So are location-based services something that you can make money in the long-term? Nokia bought NavTec for $8 billion. And then two years later, they&#8217;re giving it away free as part of Ovi Maps.</strong></p>
<p><strong><strong>Tish Shute: </strong></strong>Right.</p>
<p><strong><strong>Brady Forrest: </strong>I&#8217;m assuming that that&#8217;s actually part of the plan, and that their hand may have been forced by Google with their release of Turn-By-Turn. But it&#8217;s still got to be a hard pill to swallow that this huge investment in location ends up becoming a loss leader to sell more phones.</strong></p>
<p><strong>So, can you make money through subscriptions, through selling apps? I think that is still being proven. The other question is, can you use advertising? And it&#8217;s kind of scary to see that Apple is restricting advertisers&#8217; use of location.</strong></p>
<p><strong>It came out yesterday or two days ago that advertisers cannot use location, or app developers cannot use location for ads. They can only use location to show something interesting or useful to their customers.</strong></p>
<p><strong>And there&#8217;s a lot of speculation that it&#8217;s because Apple wants to control the location-based ads that go on the iPhone.</strong></p>
<p><strong>Tish Shute</strong>: Yes. I heard a strange rumor. Actually it&#8217;s an un-strange rumor &#8211; a likely rumor, in fact &#8211; that Apple and MS are getting together to replace some of the Google aspects of the iPhone, like search and maps?</p>
<p><strong><strong>Brady Forrest:</strong> Yes, &#8230;. Microsoft employees get 10% off at the Apple store. There&#8217;s a longstanding relationship between those two companies.</strong></p>
<p><strong>And Android is definitely more of a competitive threat than Windows Mobile is. And it&#8217;s well known what the relationship between PCs and Macs is. So I don&#8217;t think&#8230;I don&#8217;t find that to be that surprising of a rumor. I do wonder if it would hurt the iPhone, but it doesn&#8217;t surprise me that they would consider it.</strong></p>
<p><strong><strong>Tish Shute:</strong></strong> I do know, certainly from the AR point of view, that Microsoft has recently hired some of the key researchers, including Georg Klein. And they are looking for more people in the image recognition area, so it seems MS is going to be making a bigger push, not just with Photosynth but with image ID.</p>
<p>So it could be a pretty powerful combo between the iPhone and Microsoft &#8211; they have some of the key computer vision research that would be needed for full AR.</p>
<p><strong><strong>Brady Forrest</strong>: Oh, yeah. Microsoft has amazing research depth. They&#8217;ve got an amazing team.</strong></p>
<p><strong><strong>Tish Shute: </strong></strong>But it is a bit of a mystery to me why Microsoft hasn&#8217;t done more with Photosynth. As I noted in my <a id="jyr:" title="previous post" href="../../2010/01/17/visual-search-augmented-reality-and-a-social-commons-for-the-physical-world-platform-interview-with-anselm-hook/">previous post</a>, <a href="http://www.slashgear.com/nokia-image-space-adds-augmented-reality-for-s60-3067185/" target="_blank">Nokia&#8217;s ImageSpace</a> is beginning to do what many thought Microsoft would do with Photosynth two years ago. And &#8220;photo-based positioning systems&#8221; &#8211; 3D models of the environment covering every possible angle, plus software that can work backwards from a picture to exactly where you are and where you&#8217;re facing &#8211; could be hugely important to AR. But that brings me to another mystery: why haven&#8217;t we seen more from Nokia in this space yet &#8211; the N900 doesn&#8217;t even have a compass?</p>
<p><strong><strong>Brady Forrest:</strong> Yeah, I don&#8217;t know why Nokia hasn&#8217;t made more of a space for themselves in these things. They did a lot of early work in these areas. I think they are trying to&#8230;my guess is that they&#8217;re trying to restructure themselves. They made some pretty big changes on the web &#8211; Ovi made its own division. And they&#8217;ve been doing a lot of location-based acquisitions: Places, Gate 5 several years ago, Gossler just in the past six months. And so I think that&#8217;s really been their focus&#8230;and the research team.</strong></p>
<p><strong>And as a large company, since they haven&#8217;t found a business model &#8211; which is what we&#8217;ve been discussing here &#8211; they are hesitant to launch it, or to&#8230;they don&#8217;t really know if this is a business that they need to launch, or if this is an app that they should have out there for fun.</strong></p>
<p><strong><strong>Tish Shute:</strong> </strong>Yeah. And that&#8217;s back to the oxygenation of the system and location. We really still have some work to do on the business models.</p>
<p>Final question! At the core of many of today&#8217;s business models is the idea of hoarding data &#8211; that&#8217;s an underpinning.</p>
<p>But ultimately, for open AR, we want a situation where we can really share data, so that we don&#8217;t have the data all locked inside one particular browser or app. The current crop of AR browsers aren&#8217;t really browsers in the sense that we understand a browser on the web today, because the data&#8217;s locked inside each service &#8211; Wikitude, Layar, Acrossair, etc.</p>
<p>I have become very interested in federation as a model for solving this, so that we can begin to have an opportunity to build consensual relations around data &#8211; sometimes sharing, sometimes not. Federation is my big dream at the moment. And now we even have something to work with in the Wave Federation Protocol. But how do we get from here to there, where we really have a federated world of data for AR and location-based services? Or do you think people need to solve the question of business models first?<strong><br />
<strong><br />
Brady Forrest:</strong> I think people need&#8230;I think one potential is ads &#8211; so, serving up content. And by ads, I also mean coupons, meals, the Foursquare&#8230;what it looks like Foursquare&#8217;s going to do, featured content, which is Layar&#8217;s.</strong></p>
<p><strong>So we need to see, is that the way we&#8217;re going to sell these? The other is to have the best viewer, which in some ways is a race in selling that, but that&#8217;s potentially a race to the bottom, price-wise.</strong></p>
<p><strong><strong>Tish Shute:</strong> </strong>Right. Do you think the Google Wave Federation Protocol has a chance of taking off and changing the game for real-time communications, federation, real-time&#8230;<strong><br />
<strong><br />
Brady Forrest:</strong> Quite possibly with the real-time. I think they need to work on the UI.</strong></p>
<p><strong><strong>Tish Shute:</strong> </strong>Oh dear, we can&#8217;t discuss the Wave UI right at the end of the interview &#8211; of course I believe it would do better in an AR view! I know you have to go now, but I have to say that Google Wave not standardizing the client/server interface &#8211; so we could see some new UIs for Wave [we are working with PygoWave for ARWave because of this] &#8211; and the iPad&#8217;s lack of a camera were two huge disappointments in recent months.</p>
<p><strong><strong>Brady Forrest: </strong>Yeah. It [the Wave client] is very difficult to use.</strong></p>
<p><strong>Tish Shute: </strong>But the Wave Federation Protocol is an open, fast, compact protocol that is a dream come true for AR. Open, distributed, real-time communications is a very big enabler for AR. I would hazard a guess that in 2010 real-time communications plus location becomes oxygen.</p>
]]></content:encoded>
			<wfw:commentRss>http://www.ugotrade.com/2010/02/10/the-physical-world-becomes-a-software-construct-talking-with-brady-forrest-about-where-2-0-2010/feed/</wfw:commentRss>
		<slash:comments>7</slash:comments>
		</item>
		<item>
		<title>Visual Search, Augmented Reality and a Social Commons for the Physical World Platform: Interview with Anselm Hook</title>
		<link>http://www.ugotrade.com/2010/01/17/visual-search-augmented-reality-and-a-social-commons-for-the-physical-world-platform-interview-with-anselm-hook/</link>
		<comments>http://www.ugotrade.com/2010/01/17/visual-search-augmented-reality-and-a-social-commons-for-the-physical-world-platform-interview-with-anselm-hook/#comments</comments>
		<pubDate>Sun, 17 Jan 2010 17:05:01 +0000</pubDate>
		<dc:creator><![CDATA[Tish Shute]]></dc:creator>
				<category><![CDATA[Ambient Devices]]></category>
		<category><![CDATA[Ambient Displays]]></category>
		<category><![CDATA[architecture of participation]]></category>
		<category><![CDATA[Artificial general Intelligence]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Augmented Reality]]></category>
		<category><![CDATA[culture of participation]]></category>
		<category><![CDATA[digital public space]]></category>
		<category><![CDATA[Instrumenting the World]]></category>
		<category><![CDATA[internet of things]]></category>
		<category><![CDATA[mirror worlds]]></category>
		<category><![CDATA[Mixed Reality]]></category>
		<category><![CDATA[mobile augmented reality]]></category>
		<category><![CDATA[mobile meets social]]></category>
		<category><![CDATA[Mobile Reality]]></category>
		<category><![CDATA[new urbanism]]></category>
		<category><![CDATA[online privacy]]></category>
		<category><![CDATA[Paticipatory Culture]]></category>
		<category><![CDATA[privacy and online identity]]></category>
		<category><![CDATA[social gaming]]></category>
		<category><![CDATA[social media]]></category>
		<category><![CDATA[sustainable living]]></category>
		<category><![CDATA[sustainable mobility]]></category>
		<category><![CDATA[ubiquitous computing]]></category>
		<category><![CDATA[virtual communities]]></category>
		<category><![CDATA[Web Meets World]]></category>
		<category><![CDATA[websquared]]></category>
		<category><![CDATA[World 2.0]]></category>
		<category><![CDATA[Anselm Hook]]></category>
		<category><![CDATA[AR Commons]]></category>
		<category><![CDATA[AR Consortium]]></category>
		<category><![CDATA[AR Wave]]></category>
		<category><![CDATA[ardevcamp]]></category>
		<category><![CDATA[are2010]]></category>
		<category><![CDATA[ARNY Meetup]]></category>
		<category><![CDATA[ARWave]]></category>
		<category><![CDATA[ARWave Wiki]]></category>
		<category><![CDATA[augmented reality conference]]></category>
		<category><![CDATA[augmented reality event]]></category>
		<category><![CDATA[augmented reality goggles]]></category>
		<category><![CDATA[augmented reality social commons]]></category>
		<category><![CDATA[brightkite]]></category>
		<category><![CDATA[Bruce Sterling]]></category>
		<category><![CDATA[Davide Carnivale]]></category>
		<category><![CDATA[distributed AR]]></category>
		<category><![CDATA[distributed augmented reality]]></category>
		<category><![CDATA[federated search]]></category>
		<category><![CDATA[FourSquare]]></category>
		<category><![CDATA[Games Alfresco]]></category>
		<category><![CDATA[google goggles]]></category>
		<category><![CDATA[Google Wave]]></category>
		<category><![CDATA[gowalla]]></category>
		<category><![CDATA[graffitigeo]]></category>
		<category><![CDATA[hacking maps]]></category>
		<category><![CDATA[Head Map manifesto]]></category>
		<category><![CDATA[imageDNS]]></category>
		<category><![CDATA[imagemarks]]></category>
		<category><![CDATA[imagewiki]]></category>
		<category><![CDATA[location based services]]></category>
		<category><![CDATA[Map Kiberia]]></category>
		<category><![CDATA[Mikel Maron]]></category>
		<category><![CDATA[mobile internet]]></category>
		<category><![CDATA[mobile social]]></category>
		<category><![CDATA[mobile social interaction utility]]></category>
		<category><![CDATA[Muku]]></category>
		<category><![CDATA[neo-viridian]]></category>
		<category><![CDATA[Nokia's ImageSpace]]></category>
		<category><![CDATA[Ogmento]]></category>
		<category><![CDATA[open distributed AR]]></category>
		<category><![CDATA[OpenGeo]]></category>
		<category><![CDATA[paige saez]]></category>
		<category><![CDATA[photo-based positioning systems]]></category>
		<category><![CDATA[physical world platform]]></category>
		<category><![CDATA[placemarks]]></category>
		<category><![CDATA[Planetwork]]></category>
		<category><![CDATA[Platial]]></category>
		<category><![CDATA[point and find]]></category>
		<category><![CDATA[proximity based social networks]]></category>
		<category><![CDATA[snaptell]]></category>
		<category><![CDATA[social cartography]]></category>
		<category><![CDATA[social commons]]></category>
		<category><![CDATA[social search]]></category>
		<category><![CDATA[SpinnyGlobe]]></category>
		<category><![CDATA[Thomas Wrobel]]></category>
		<category><![CDATA[Tonchidot]]></category>
		<category><![CDATA[trust filters]]></category>
		<category><![CDATA[Viridian]]></category>
		<category><![CDATA[viridiandesign]]></category>
		<category><![CDATA[visual search]]></category>
		<category><![CDATA[Wave]]></category>
		<category><![CDATA[Wave Federation Protocol]]></category>
		<category><![CDATA[WhereCamp]]></category>
		<category><![CDATA[whurley]]></category>
		<category><![CDATA[yelp]]></category>

		<guid isPermaLink="false">http://www.ugotrade.com/?p=5050</guid>
		<description><![CDATA[Visual search is heating up, and with it a key stage of turning the physical world into a platform is underway as images become hyperlinks to the world in applications like Google Goggles, Point and Find, and SnapTell &#8211; see this post by Katie Boehret. And while there may be no truly game changing augmented [&#8230;]]]></description>
				<content:encoded><![CDATA[<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/01/anselmhook.jpg"><img class="alignnone size-medium wp-image-5051" title="anselmhook" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/01/anselmhook-300x225.jpg" alt="anselmhook" width="300" height="225" /></a></p>
<p>Visual search is heating up, and with it a key stage of turning the physical world into a platform is underway, as images become hyperlinks to the world in applications like <a href="http://www.google.com/mobile/goggles/#dc=gh0gg" target="_blank">Google Goggles</a>, <a href="http://pointandfind.nokia.com/" target="_blank">Point and Find</a>, and <a href="http://www.snaptell.com/" target="_blank">SnapTell</a> &#8211; <a href="http://solution.allthingsd.com/20100112/in-search-of-images-worth-1000-results/" target="_blank">see this post by Katie Boehret</a>. And while there may be no truly game-changing augmented reality goggles for a while, make no mistake: key aspects of our augmented view &#8211; factors that will have a lot to do with what we will actually see when an augmented vision of the world is commonplace &#8211; are already in the works. And, as Anselm Hook (pic above <a href="http://www.flickr.com/photos/caseorganic/2994952828/" target="_blank">from @caseorganic&#8217;s flickr</a>) notes:</p>
<p><strong>&#8220;There is a real risk of our augmented reality world being owned by interests which are not our own. There is a real question of when you hold up that AR goggle, what are you going to see?&#8221;</strong></p>
<p>Cooperating services &#8211; e.g., Google Earth, Maps, Street View, Google Goggles, and a leader in local search like Yelp (<a href="http://www.huffingtonpost.com/ramon-nuez/google-is-getting-ready-f_b_426493.html" target="_blank">see here</a>) &#8211; would have an enormous ability to filter and control a mobile, social, context-aware view of the physical world, and Google themselves see an ethical quandary.</p>
<p><strong> &#8220;A Google spokesperson says this app has the ability to use facial recognition with Goggles, but hasn&#8217;t launched this feature because it hasn&#8217;t been built into an app that would provide real value for users. The spokesperson also cites &#8216;some important transparency and consumer-choice issues we need to think through&#8217;&#8221; </strong><strong>(quote from Wall Street Journal column</strong><a href="http://solution.allthingsd.com/20100112/in-search-of-images-worth-1000-results/" target="_blank"> by Katie Boehret)</a>.</p>
<p><a href="http://www.hook.org/" target="_blank">Anselm Hook</a> and <a href="http://paigesaez.org/" target="_blank">Paige Saez</a>, with great prescience, have been advocating a social commons for the placemarks and imagemarks of our physical world platform through a number of pioneering projects, including <a href="http://imagewiki.org/" target="_blank">imagewiki</a>. I have recently interviewed both Anselm and Paige (upcoming) in depth. My talk with Anselm was nearly three hours long! So I am publishing the transcript in two parts.</p>
<p>Understanding what it means to have a social commons for our physical world platform, and for augmented reality, are key questions for all of us to think about, but especially important for those of us involved in the emerging augmented reality industry.</p>
<p>Anselm <a href="http://blog.makerlab.org/2009/11/augmentia-redux/">notes</a> :</p>
<p><strong>&#8220;The placemarks and imagemarks in our reality are about to undergo that same politicization and ownership that already affects DNS and content. Creative Commons, Electronic Frontier Foundation and other organizations try to protect our social commons. When an image becomes a kind of hyperlink &#8211; there&#8217;s really a question of what it will resolve to. Will your heads up display of McDonalds show tasty treats at low prices or will it show alternative nearby places where you can get a local, organic, healthy meal quickly? Clearly there&#8217;s about to be a huge ownership battle for the emerging imageDNS&#8221;</strong></p>
<p>The mobile internet is moving beyond the internet-in-your-pocket phase of mobility, with mobile, social, proximity-based, context-aware networks like <a href="http://www.foursquare.com/">FourSquare</a>, <a href="http://gowalla.com/" target="_blank">Gowalla</a>, <a href="http://brightkite.com/" target="_blank">Brightkite</a> and <a href="http://www.geograffiti.com/">GraffitiGeo</a> (see <a href="http://smartdatacollective.com/Home/23811">Smart Data Collective</a>) likely soon to start to take precedence over other forms of social network.</p>
<p>Regardless of the timeline for true augmented reality &#8211; 3D images &amp; graphics tightly registered to the physical world &#8211; proximity-based social networking and real-time search are already taking us into a hyper-local mode and the realm of augmented reality, which is <strong><strong>&#8220;inherently about who you are, where you are, what you are doing, and what is around you&#8221; </strong></strong>(<a href="http://curiousraven.squarespace.com/" target="_blank">Robert Rice</a> &#8211; see <a href="http://www.ugotrade.com/2009/01/17/is-it-%E2%80%9Comg-finally%E2%80%9D-for-augmented-reality-interview-with-robert-rice/" target="_blank">here</a>).<strong><strong> </strong></strong>The ground is being prepared for augmented reality now.<strong><strong><br />
</strong></strong></p>
<p>If you have been reading Ugotrade, you will know I have been actively involved in developing an open, distributed AR platform/mobile social interaction utility for geolocated data based on the Wave Federation Protocol &#8211; AR Wave, a.k.a. Muku, &#8220;crest of a wave&#8221; (see my posts <a href="http://www.ugotrade.com/2009/11/19/the-next-wave-of-ar-mobile-social-interaction-right-here-right-now/" target="_blank">here</a>, <a href="http://www.ugotrade.com/2009/12/04/ar-wave-project-an-introduction-and-faq-by-thomas-wrobel/" target="_blank">here</a> and <a href="http://www.ugotrade.com/2009/10/13/ar-wave-layers-and-channels-of-social-augmented-experiences/" target="_blank">here</a> for more on this project, and the <a href="http://arwave.wiki.zoho.com/HomePage.html" target="_blank">AR Wave Wiki</a> here). Federation is, I believe, one vital aspect of developing a social commons for augmented reality and the physical world platform.</p>
<p>Also, a bit of news: I am co-chairing the upcoming <a title="Augmented Reality Event (are2010) Opens Call For Speakers" href="http://augmentedrealityevent.com/2010/01/17/augmented-reality-event-2010-opens-call-for-speakers/">Augmented Reality Event (are2010)</a> with <a href="http://gamesalfresco.com/about/" target="_blank">Ori Inbar</a> of <a href="http://gamesalfresco.com/" target="_blank">Games Alfresco</a> and <a href="http://ogmento.com/" target="_blank">Ogmento</a>, and <a href="http://whurley.com/" target="_blank">whurley</a>. Sean Lowery of <a href="http://www.innotechconference.com/pdx/Details/other.php" target="_blank">Prospera</a> is the event organizer, and <a title="Augmented Reality Event (are2010) Opens Call For Speakers" href="http://augmentedrealityevent.com/2010/01/17/augmented-reality-event-2010-opens-call-for-speakers/">are2010</a> has the support of the <a href="http://www.arconsortium.org/" target="_blank">AR Consortium</a>. The <a title="Augmented Reality Event (are2010) Opens Call For Speakers" href="http://augmentedrealityevent.com/2010/01/17/augmented-reality-event-2010-opens-call-for-speakers/">are2010</a> web site is live and there is an <a title="Augmented Reality Event (are2010) Opens Call For Speakers" href="http://augmentedrealityevent.com/2010/01/17/augmented-reality-event-2010-opens-call-for-speakers/">Open Call For Speakers</a>. You can submit your proposals and demos for one of the three tracks &#8211; business, technology, or production &#8211; <a href="http://augmentedrealityevent.com/speakers/call-for-proposals/" target="_blank">on the web site here</a>.</p>
<p><a href="http://augmentedrealityevent.com/" target="_blank"><img class="alignnone size-medium wp-image-5101" title="are2010" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/01/are20101-300x60.png" alt="are2010" width="300" height="60" /></a></p>
<p><a href="http://www.wired.com/beyond_the_beyond/" target="_blank">Bruce Sterling</a>, &#8220;prophet&#8221; of augmented reality and more, &#8220;will deliver the most anticipated <a href="http://augmentedrealityevent.com/speakers/" target="_blank">Augmented Reality keynote</a> of the year.&#8221;</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/01/bruces-brasspost.jpg"><img class="alignnone size-medium wp-image-5105" title="bruces-brasspost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/01/bruces-brasspost-300x225.jpg" alt="bruces-brasspost" width="300" height="225" /></a></p>
<p>It didn&#8217;t surprise me when Anselm mentioned that Bruce Sterling was a key influence on his work on the geospatial web and augmented reality. Anselm explained:</p>
<p><strong>&#8220;I&#8217;d seen <a href="http://www.viridiandesign.org/notes/151-175/00155_planetwork_speech.html" target="_blank">a talk by Bruce Sterling</a> at an event called Planetwork [May, 2000]. And that event was, for me, a turning point where I decided to focus full time on exactly what I cared about instead of doing things that were kind of similar to what I cared about.</strong> <strong>So his influence is a pretty significant one to me at that exact moment.&#8221;</strong></p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/01/dhj5mk2g_490gcp7q6fn_b.png"><img title="dhj5mk2g_490gcp7q6fn_b" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/01/dhj5mk2g_490gcp7q6fn_b-300x80.png" alt="dhj5mk2g_490gcp7q6fn_b" width="300" height="80" /></a></p>
<p>For more see <a id="q2or" title="viridiandesign.org" href="http://www.viridiandesign.org/About.htm">viridiandesign.org</a> &#8211; it seems it is time for a &#8220;Neo-Viridian&#8221; revival!</p>
<p>This <a href="http://www.wired.com/beyond_the_beyond/2009/05/spime-watch-pachube-feeds/" target="_blank">post by Bruce Sterling on Pachube Feeds</a>, and Thomas Wrobel&#8217;s <a href="http://www.ugotrade.com/2009/08/19/everything-everywhere-thomas-wrobels-proposal-for-an-open-augmented-reality-network/" target="_blank">prototype design for open distributed augmented reality on IRC</a>, were key inspirations for me when I began thinking about the potential of the Google Wave Federation protocol for augmented reality. I had been exploring <a href="http://www.pachube.com/" target="_blank">Pachube</a> and was deeply interested in <a href="http://www.ugotrade.com/2009/01/28/pachube-patching-the-planet-interview-with-usman-haque/" target="_blank">the vision of Usman Haque</a>, but I had a real <a href="http://www.ugotrade.com/2009/06/02/location-becomes-oxygen-at-where-20-wherecamp/" target="_blank">aha moment</a> when I read this:</p>
<p><strong>&#8220;(((Extra credit for eager ubicomp hackers: combine this [pachube feeds] with Googlewave, then describe it in microsyntax. Hello, 2015!)))&#8221;</strong></p>
<p>I think the AR Wave group will earn the extra credit and more very soon! <a href="http://need2revolt.wordpress.com/about/" target="_blank">Davide Carnovale, need2revolt</a>, and <a href="http://www.lostagain.nl/" target="_blank">Thomas Wrobel</a> have been leading the coding charge, and there will be a very early AR Wave demo soon, perhaps as soon as the <a href="http://www.meetup.com/arny-Augmented-Reality-New-York/" target="_blank">Feb 16th ARNY Meetup</a>.</p>
<p>Open access to the view that will eventually find its way into AR goggles will depend on more than the power of an open distributed platform for collaboration like the AR Wave project. Our augmented reality view will be constructed through complex &#8220;hybrid tracking and sensor fusion techniques&#8221; (Jarrell Pair), cooperating cloud data services, powerful search and computer vision algorithms, and apps that learn by context accumulation. At the moment, these kinds of resources, at least at scale, are for the most part in private hands.</p>
<p>In the interview below, Anselm discusses how trust filters, <span id="zuat" title="Click to view full content">being able to publicly permission your searches so that other people can respond and reach out to you, and the democratization of data in general, are even more of a concern </span>with augmented reality and hyper-local search<span id="zuat" title="Click to view full content">.</span> The task of understanding what it means to have a social commons for the outernet remains an open, and pressing, question.</p>
<p>Anselm explains (see full interview below):</p>
<p><strong><span id="e18n" title="Click to view full content">&#8220;as we move towards a physical internet where there&#8217;s no clicking and there&#8217;s no interface and the computer&#8217;s just telling you what it thinks you&#8217;re looking at &#8211; translating, you know, an image of a billboard to the name of the rock star who&#8217;s on that billboard, or translating the list of ingredients on a can of soup to the source outlets where it thinks those ingredients came from &#8211; when you have that kind of automated mediation, the question of trust definitely arises.</span></strong></p>
<p><strong><span id="e18n" title="Click to view full content"> And we haven&#8217;t seen the Clay Shirkys or the Larry Lessigs of the world start to talk about this yet. Although I suspect that in the next four or five years the zero click interface will become the primary interface, that we&#8217;ll have&#8230;we&#8217;ll come to assume that what we see with the extra enhanced data we get projected onto our view is the truth. Yet, at the same time, there is just no structure or mechanism even being considered for a democratic ownership of it.&#8221;</span></strong></p>
<h3>Augmented Reality will emerge through sensor fusion techniques &amp; cooperating cloud services</h3>
<p>In 2010, sensor fusion techniques and computer vision technology, in conjunction with GPS and compass data, will create data linking that can enable the kind of augmented reality that has been the stuff of imagination for nearly four decades (see <a href="http://laboratory4.com/2010/01/the-reality-of-augmented-reality/" target="_blank">Jarrell Pair&#8217;s post</a>).</p>
<p>Putting stuff in the world in 3D is of course key to the original vision of augmented reality, and one of its biggest challenges. Augmented reality is going to be implicated in a real time mapping of the world at an unprecedented scale and granularity. We have barely an inkling of the implications of this now.</p>
<p>Anselm and Paige have been working in the heart of the social cartography movement for nearly a decade. The vision and experience of this community is vital to understanding how augmented reality and the world as a physical platform can evolve into something that benefits people and allows them &#8220;to have a better understanding of the opportunities around them.&#8221;</p>
<p>We have been hacking maps for millennia &#8211; &#8220;from conceptual story mapping, to colloquial mapping in European development and the cartographic renaissance created by the global voyages and rediscovery of Ptolemy&#8217;s maps&#8221; (<a href="http://highearthorbit.com/" target="_blank">Andrew Turner</a>). And, recently, initiatives on a public-provided GIS, like <a href="http://opengeo.org/" target="_blank">OpenGeo</a>, have led the way toward more open, interoperable, geospatial data.</p>
<p>Mapping takes on a new and crucial role in augmented reality. <a href="http://www.slashgear.com/nokia-image-space-adds-augmented-reality-for-s60-3067185/" target="_blank">Nokia&#8217;s ImageSpace</a> is beginning to do what many thought Microsoft would do with Photosynth two years ago.</p>
<p>And if these kinds of projects develop into &#8220;photo-based positioning systems&#8221; &#8211; &#8220;3d models of the environment to cover every possible angle, and then software that can work out in reverse, based on a picture, precisely where you are and where you&#8217;re facing&#8221; (Thomas Wrobel) &#8211; we would see augmented reality leap forward overnight.</p>
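<p>At its core, the idea Wrobel describes is a lookup from the features of a query photo to the camera pose of the best-matching reference photo. The sketch below is purely illustrative, not any real system&#8217;s implementation: the toy descriptors, the <code>locate</code> function, and the example coordinates are all hypothetical stand-ins, where a real system would match SIFT/ORB-style features against millions of geotagged images.</p>

```python
import math

# Hypothetical sketch of a "photo-based positioning system":
# a database maps image feature descriptors to the camera pose
# (lat, lon, heading in degrees) from which each reference photo
# was taken. Real descriptors would come from computer vision
# feature extraction; here they are tiny toy vectors.

def distance(a, b):
    """Euclidean distance between two descriptor vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def locate(query_descriptor, database):
    """Return the pose of the reference image closest to the query.

    database: list of (descriptor, pose) pairs, where pose is
    a (lat, lon, heading_degrees) tuple.
    """
    best_pose, best_dist = None, float("inf")
    for descriptor, pose in database:
        d = distance(query_descriptor, descriptor)
        if d < best_dist:
            best_pose, best_dist = pose, d
    return best_pose

# Toy database: three reference photos of the same street corner.
db = [
    ([0.9, 0.1, 0.3], (45.52, -122.68, 90.0)),   # facing east
    ([0.2, 0.8, 0.5], (45.52, -122.68, 180.0)),  # facing south
    ([0.4, 0.4, 0.9], (45.53, -122.67, 0.0)),    # next block, north
]

# A query photo whose descriptor resembles the first reference
# image recovers that image's position and heading.
pose = locate([0.85, 0.15, 0.25], db)
```

<p>The &#8220;work out in reverse&#8221; step is the nearest-neighbor search: instead of rendering a view from a known pose, the system finds which stored pose best explains the picture it was given.</p>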
<p>It is time to take very seriously the vast opportunities and potential pitfalls of an augmented world.</p>
<p><strong><span id="vix9" title="Click to view full content">&#8220;when you are mediating the translation layer between the image and the data, then there is an opportunity for you to control it, and that opportunity is hard to resist. It is hard to choose not to own that opportunity. It is an advertising opportunity. It is a revenue opportunity. It is a chance to send a message and a tone. </span></strong></p>
<p><strong><span id="vix9" title="Click to view full content">I know that Google and companies like that are keenly aware of the kinds of roles they donâ€™t want to hold, but it is sometimes seductive to think about them. And I am afraid that we, as a community, need to assert an ownership, kind of a commons, over how computers will translate what they see to information that we perceive.&#8221;</span></strong></p>
<p>There are some initiatives emerging. <a href="http://www.tonchidot.com/" target="_blank">Tonchidot</a> (who <a href="http://www.techcrunch.com/2009/12/08/tonchidot-sekai-camera-funding/" target="_blank">closed on $4 million of VC for augmented reality</a> last December) has helped create the <a href="http://translate.google.com/translate?client=tmpg&amp;hl=en&amp;u=http%3A%2F%2Fwww.arcommons.org%2F&amp;langpair=ja%7Cen" target="_blank">AR Commons</a> in Japan. <a href="http://www.tonchidot.com/corporate-profile.html" target="_blank">CFO of Tonchidot</a> <a href="http://www.linkedin.com/ppl/webprofile?action=vmi&amp;id=499984&amp;pvs=pp&amp;authToken=r8TF&amp;authType=name&amp;trk=ppro_viewmore&amp;lnk=vw_pprofile" target="_blank">Ken Inoue</a> explained in <a href="http://www.ugotrade.com/2009/09/17/tonchidot-taking-augmented-reality-beyond-lab-science-with-fearless-creativity-and-business-savvy/" target="_blank">an interview with me in September 2009</a>:</p>
<p>&#8220;<strong>We feel that public data, such as landmarks, government facilities, and public transport should be shared. We see an AR world where people can readily and easily access information by just seeing &#8211; quick, easy, and efficient. And because of this ease and intuitiveness, children, the elderly and handicapped will surely benefit. AR could help create a safer society. Warnings, alerts, and safety information could save lives and avoid disasters. These are what we, and <a href="http://translate.google.com/translate?client=tmpg&amp;hl=en&amp;u=http%3A%2F%2Fwww.arcommons.org%2F&amp;langpair=ja%7Cen" target="_blank">AR Commons</a>, would like to tackle in the not so distant future.&#8221;</strong></p>
<p>But the task of building a social commons for the physical world platform has only just begun.</p>
<h3>Interview with Anselm Hook</h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/01/anselm31.jpg"><img class="alignnone size-medium wp-image-5085" title="anselm3" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/01/anselm31-300x225.jpg" alt="anselm3" width="300" height="225" /></a></p>
<p><em>photo from <a href="http://www.flickr.com/photos/anselmhook/3832691280/in/set-72157621946362509/" target="_blank">Anselm&#8217;s Flickr stream here</a></em></p>
<p><span id="u2mq" title="Click to view full content"><strong>Tish Shute:</strong> We <a href="http://www.ugotrade.com/2009/06/02/location-becomes-oxygen-at-where-20-wherecamp/" target="_blank">first met last year </a></span><span id="zjlm" title="Click to view full content"><a href="http://www.ugotrade.com/2009/06/02/location-becomes-oxygen-at-where-20-wherecamp/" target="_blank">at Wherecamp</a>. </span><span id="suh4" title="Click to view full content">The start of 2009 was, I think,</span><span id="e_r5" title="Click to view full content"> the &#8220;OMG finally&#8221; moment for augmented reality, and</span><span id="wo16" title="Click to view full content"> in less than a year AR, at least in proto forms, is breaking into the mainstream! You are one of the founding visionaries/philosophers/hackers of the geo web and you have been thinking about the geo web and AR for a long time &#8211; <a href="http://hook.org/headmap" target="_blank">all the way back to the legendary Head Map Manifesto</a>, and before. Most recently you led the way in the very successful <a href="http://www.ardevcamp.org/wiki/index.php?title=Main_Page" target="_blank">ARDevCamp</a> in Mountain View. </span><span id="kn-y" title="Click to view full content"> Could you start by telling me a little bit about the history of your pioneering work with geolocated data?</span></p>
<p><strong>Anselm Hook: </strong>I am a long time Geo fanatic. I&#8217;m really interested in social cartography and what some people call public-provided GIS, that&#8217;s some language that people use. Anyway, my personal interest, when I talk to people who are non-technical (and it&#8217;s been a long term interest in the way I phrase it) is that I want to help people see through walls. So, the goal is very simple. I want people to have a better understanding of opportunities around them, the landscape around them. I always get frustrated when people make bad decisions because of a lack of information, especially when it&#8217;s related to their community and related to their environment. But, plainly put, I really just want &#8220;to help people see through walls&#8221;. It&#8217;s a very simple goal.</p>
<p><strong>Tish Shute:</strong> I know you worked on <a href="http://platial.com/" target="_blank">Platial</a>, which is really one of my favorite social mapping applications. It really broke new ground. What was the history of that? How did you get involved with Platial?</p>
<p><strong>Anselm Hook:</strong> That&#8217;s an interesting question. It actually started around 2000 when I saw Bruce Sterling talk. I had been writing video games for many years, and I was quite good at it, and I enjoyed it. But, the reasons I was doing it diverged from why the industry was doing it. I was making video games because I like to make shared spaces for my friends to play in and to share experience. I really enjoyed making shared environments. I worked on <a id="jrn-" title="BBS's" href="http://en.wikipedia.org/wiki/Bulletin_board_system">BBS&#8217;s</a> and my friends and I were always making these collaborative shared environments.</p>
<p>Once the video game industry started to take off, I started to do high performance, 3D interactive video games and make compelling shared spaces, and it was a lot of fun. But, the frustration for me was that a huge industry grew around it and it became very commercial. Although it paid well, it started to diverge from my values, which were more centered around community environments and shared understanding.</p>
<p><strong>Tish Shute:</strong> Yes, very rapidly the big games kind of devolved from the social aspects and became more and more single player really, didn&#8217;t they?</p>
<p><strong>Anselm Hook:</strong> It was that way, actually, because even though you were often in a many-player world, you weren&#8217;t collaborating; everything else became just a target. I liked the idea of deep collaboration that recalls the kind of playful space you see in IRC, or in the real world, where people are solving real world problems.</p>
<p>And I grew up in the Rockies, and I always had a lot of access to the outside. So, I saw shared spaces and collaboration as a way to protect our environment. [ To step back ] I think people use different metrics <span id="gozb" title="Click to view full content">for measuring their choices in the world, and many people have a value system centered around minimization of harm: making sure that people are not hurt. But, my value system is different. I personally believe that protecting the planet is more important: to maximize biodiversity. I feel like protecting people around me comes from protecting the ecosystems they live in.</span></p>
<p><strong>Tish Shute:</strong> That&#8217;s interesting, isn&#8217;t it, because the history of Keyhole was really that, wasn&#8217;t it? Keyhole later became Google Earth, but it began as a project to look at what was going on in the ecosystem over Africa at that time, didn&#8217;t it?<br />
<strong><br />
Anselm Hook:</strong> Yes, in fact many people&#8217;s projects stem from an environmental concern. <a id="zxy9" title="Mikel Miron&#8217;s" href="http://brainoff.com/weblog/">Mikel Maron&#8217;s</a> work, for example &#8211; he&#8217;s doing <a id="euvm" title="Map Kiberia" href="http://mapkibera.org/">Map Kibera</a>, and he also worked on OpenStreetMap.</p>
<p><strong>Tish Shute:</strong> Map Kibera &#8211; that is the new project?</p>
<p><strong>Anselm Hook:</strong> Oh, yes, his project is called <a id="r7ie" title="Map Kiberia" href="http://mapkibera.org/">Map Kibera</a>. He&#8217;s mapping a city in Africa.<br />
[For more see <a id="ngn." title="Map Kiberia's YouTube Channel" href="http://www.youtube.com/user/mapkibera">Map Kibera&#8217;s YouTube Channel</a> &#8211; <a id="amqx" title="photo below" href="http://www.flickr.com/photos/junipermarie/4098163856/" target="_blank">photo below</a> from <a href="http://www.flickr.com/photos/junipermarie/">ricajimarie</a> ]</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/01/dhj5mk2g_487qfcv76ft_b.jpg"><img class="alignnone size-medium wp-image-5052" title="dhj5mk2g_487qfcv76ft_b" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/01/dhj5mk2g_487qfcv76ft_b-300x199.jpg" alt="dhj5mk2g_487qfcv76ft_b" width="300" height="199" /></a></p>
<p><strong>Tish Shute:</strong> Right, great!</p>
<p><strong>Anselm Hook:</strong> When I started to look at GIS and mapping I started to meet people who had a very similar background. What happened to me is I kind of stepped away from games around the year 2000. I&#8217;d seen a talk by Bruce Sterling at an event called <a id="e8dn" title="PlaNetwork" href="http://www.conferencerecording.com/newevents/pla20.htm">PlaNetwork</a>. And that event was, for me, a turning point where I decided to focus full time on exactly what I cared about instead of doing things that were kind of similar to what I cared about. So, his influence is a pretty significant one to me at that exact moment.</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/01/dhj5mk2g_490gcp7q6fn_b.png"><img class="alignnone size-medium wp-image-5053" title="dhj5mk2g_490gcp7q6fn_b" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/01/dhj5mk2g_490gcp7q6fn_b-300x80.png" alt="dhj5mk2g_490gcp7q6fn_b" width="300" height="80" /></a></p>
<p>[For more see <a id="q2or" title="viridiandesign.org" href="http://www.viridiandesign.org/About.htm">viridiandesign.org</a> &#8211; seems that it is time for a &#8220;Neo-Viridian&#8221; revival.]</p>
<p><strong>Tish Shute:</strong> It&#8217;s interesting because now your paths are crossing again with augmented reality. You are on the same wavelength again.</p>
<p><strong>Anselm Hook:</strong> It&#8217;s funny, actually, I&#8217;ve had a couple of brief overlaps in that way. Well, so in 2000 I<span id="mdsf" title="Click to view full content"> went to see this talk and I did a small project called &#8212; well, I called it <a id="bx3u" title="SpinnyGlobe" href="http://github.com/anselm/SpinnyGlobe">SpinnyGlobe</a>. What I did is I mapped protests from a number of websites onto a globe to show the level of community opposition to the pending war in Iraq. It was the first time there had been a protest before a war. So, it was very interesting to me. [ See <a href="http://hook.org/headmap" target="_blank">http://hook.org/headmap</a> ]<br />
<strong><br />
Tish Shute:</strong> That&#8217;s really fascinating. Do you have any pictures of that you could send me? </span></p>
<p><span id="r0h_" title="Click to view full content"><a href="http://www.flickr.com/photos/anselmhook/1747152617/sizes/m/in/set-72157602696188420/" target="_blank"><img class="alignnone size-medium wp-image-5054" title="dhj5mk2g_492ffct2df4_b" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/01/dhj5mk2g_492ffct2df4_b-300x225.jpg" alt="dhj5mk2g_492ffct2df4_b" width="300" height="225" /></a></span></p>
<p><span id="mdsf" title="Click to view full content">photo from <a id="j05v" title="anselm's flickrstream" href="http://www.flickr.com/photos/anselmhook/1747152617/sizes/m/in/set-72157602696188420/">anselm&#8217;s flickrstream</a></span></p>
<p><strong>Tish Shute:</strong> Yes, I&#8217;ll definitely look <a id="ua2l" title="SpinnyGlobe" href="http://github.com/anselm/SpinnyGlobe">SpinnyGlobe</a><span id="m0:j" title="Click to view full content"> up. It sounds very interesting. One of the aspects of your work on geo-located data projects like this and <a id="h.gx" title="Platial" href="http://platial.com/">Platial</a> is that you really started to develop this idea of a culture of place, about how people make place. This was the wake-up call to me regarding the power of networks combined with geo-data. </span></p>
<p><span id="m0:j" title="Click to view full content">We are hoping to extend this idea into augmented reality with an open distributed platform for AR, so that we can collaboratively map our worlds from the perspective of who we are, where we are, and what we are doing. I know you&#8217;ve just done some work recently in augmented reality. I know you put the code up already. </span></p>
<p><span id="m0:j" title="Click to view full content">By the way, I love the way you take your philosophy into the way you make code &#8211; the practice of making some code, trying some things out, making it all public and publishing your findings, you know, your comments on that experience. Perhaps you could recap how you picked up recently on the state of play with augmented reality, what aspects you looked at, and what came out of that experience?</span></p>
<p><strong>Anselm Hook:</strong> So, it&#8217;s a very simple trajectory. Coming out of the work I had done &#8211; <a id="cs18" title="Platial" href="http://platial.com/">Platial</a>, among other projects &#8211; I started to just look at the hyper-local, and I suddenly realized that even those services weren&#8217;t really speaking to living, and how to really see and solve local problems. What was missing was a sense of context.</p>
<p>The map doesn&#8217;t know how you&#8217;re feeling, it doesn&#8217;t know if you&#8217;re in a hurry, it doesn&#8217;t know what you want; it&#8217;s very static. Even the web maps are very static. And augmented reality, I started to recognize, is a combination of &#8212; well &#8212; it&#8217;s probably a collision of many forces, many forces that we&#8217;re all a part of. We&#8217;ve also started to realize that the real-time web is really important; it&#8217;s part of<span id="bja1" title="Click to view full content"> what AR is about.</span></p>
<p>We have all started to realize that the context is important. You know, your personal disposition, your needs, if you want to be interrupted or not. That is the kind of thing that the ubiquitous computing crowd has talked about. We started to recognize that there are sensors everywhere, and the ambient sensing communities talked about that. So what is funny for me about augmented reality is I started realizing it is just a collision of many other trends into something bigger.</p>
<p>Everything else we thought was a separate thing is actually just part of this thing. Even things like Google Maps or mapping systems we think are so great are really just kind of almost an aspect of a hyper-local view. You actually don&#8217;t really care what is happening 10 blocks away or 100 blocks away. If you could satisfy those same interests and needs within a single block, one block away, you would probably be really happy. You really just want to satisfy needs and interests, find ways to contribute, or get yourself fed, or whatever it is you want. And AR seemed to be the playground to really explore the human condition.</p>
<p><strong>Tish Shute:</strong> Anyway, I think one of the things that has been very amazing this year is that we now have good mediating devices that, for the first time, give us compasses, GPS, and accelerometers. But one of the missing pieces with AR at the moment is [tracking, mapping, and registration] &#8211; the kind of thing colloquial mappings of the world could be of great help with.</p>
<p>We have seen mapping coming out of the Flickr data; e.g., the University of Washington put together maps from geo-tagged Flickr photos. Now if we could have that linked up with AR, then we would have the kind of mapping we need to really hook the geo-data onto the world in a way that goes beyond&#8230;you know, what compass and GPS can really deliver is pretty minimal at the moment.</p>
<p><strong>Anselm Hook</strong>: There is a real risk of our augmented reality world being owned by interests which are not our own. There is a real question of when you hold up that AR goggle, what are you going to see? Are you going to see corporate advertising? Are you going to see your friends&#8217; comments or criticisms? Is it going to be an Iran or a democracy, right? It is unclear.</p>
<p><span id="vix9" title="Click to view full content">Right now there are some disturbing trends I have noticed. I am a big fan of Google Goggles. I think it is a great project. But when you are mediating the translation layer between the image and the data, then there is an opportunity for you to control it, and that opportunity is hard to resist. It is hard to choose not to own that opportunity. It is an advertising opportunity. It is a revenue opportunity. It is a chance to send a message and a tone. </span></p>
<p><span id="vix9" title="Click to view full content">I know that Google and companies like that are keenly aware of the kinds of roles they don&#8217;t want to hold, but it is sometimes seductive to think about them. And I am afraid that we, as a community, need to assert an ownership, kind of a commons, over how computers will translate what they see to information that we perceive.</span></p>
<p><strong>Tish Shute:</strong> Yes. And this is how we met, again, recently [over the project to create an open, distributed platform for AR using the Wave Federation Protocol]&#8230;</p>
<p><span id="e18n" title="Click to view full content">This is something I feel really deeply: basically, we need the physical internet to be as open as the end-to-end internet has been. Or more so, actually, because on the end-to-end internet the trend has been toward walled gardens. Facebook became an enormous walled garden which, I think, despite our predictions about them, is really where the social experience on the web lives. It&#8217;s very much in walled gardens still, and I really feel that with the physical internet we need to make great efforts for it not to be just a series of small pockets of privately funded walled gardens.</span></p>
<p>There needs to be some kind of communications infrastructure that keeps it open. That was when I got interested in looking at the Wave Federation Protocol, because it was an open, real-time protocol that could possibly be a basis for that. But the point you&#8217;ve just talked about &#8211; the mapping of the world and who has the &#8220;goggles,&#8221; i.e., the image databases that make the world meaningful &#8211; that&#8217;s still a BIG question [i.e., who controls the view?].</p>
<p>When I saw <a id="ewxn" title="ImageWiki" href="http://imagewiki.org/">ImageWiki</a>, [I realized] that it is a piece that is vital for augmented reality. We need a huge social effort to be involved in this: linking in and creating the physical internet, creating the image hyperlinks that will make it meaningful.</p>
<p><span title="Click to view full content"><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/01/dhj5mk2g_493fv23rg33_b.png"><img class="alignnone size-medium wp-image-5055" title="dhj5mk2g_493fv23rg33_b" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/01/dhj5mk2g_493fv23rg33_b-300x219.png" alt="dhj5mk2g_493fv23rg33_b" width="300" height="219" /></a></span></p>
<p><span id="e18n" title="Click to view full content"><strong>Anselm Hook:</strong> I think that&#8217;s a great point. The search interface, the kind of Internet that we&#8217;re used to, the way we talk to the network now, is fundamentally open end to end. Yes, you can have your oligarchies inside of it, as we see with Facebook, but you can always start your own venture up, and you can do a search on something and find that website and join it, or you can put up your own webpage and people can find it. </span></p>
<p><span id="e18n" title="Click to view full content">The translation layer &#8211; the idea of text search, the discovery power, and the serendipity and openness of that discovery &#8211; is pretty open right now. We do have some serious boundaries of language, which is one of the reasons I was working at <a id="xg:8" title="Meadan.org" href="http://www.imug.org/events/past2007.htm#meadan">Meedan.org</a> [hybrid distributed, natural language translation] for a couple of years, trying to bridge that issue.</span></p>
<p>But here, as we move towards a physical internet where there&#8217;s no clicking and there&#8217;s no interface and the computer&#8217;s just telling you what it thinks you&#8217;re looking at &#8211; translating, you know, an image of a billboard to the name of the rock star who&#8217;s on that billboard, or translating the list of ingredients on a can of soup to the source outlets where it thinks those ingredients came from &#8211; when you have that kind of automated mediation, the question of trust definitely arises.</p>
<p>And we haven&#8217;t seen the Clay Shirkys or the Larry Lessigs of the world start to talk about this yet. Although I suspect that in the next four or five years the zero click interface will become the primary interface, that we&#8217;ll have&#8230;we&#8217;ll come to assume that what we see with the extra enhanced data we get projected onto our view is the truth. Yet, at the same time, there is just no structure or mechanism even being considered for a democratic ownership of it.</p>
<p><span id="fv3x" title="Click to view full content">We have with DNS, for example, the idea that you can register a domain name and people can search for it, and find it, and go to it. There&#8217;s no such thing as an Image DNS, or an image translation to DNS, right now. What does it mean when everything is just &#8220;magic&#8221;, when there&#8217;s no way for you to be a part of the conversation, where you&#8217;re just a consumer of what people tell you &#8211; or of what one company, right now, tells you &#8211; is reality? That&#8217;s a real concern.<br />
<strong><br />
Tish Shute: </strong>This, to me, is the most important question at the moment. I mean, it&#8217;s the big one, and it&#8217;s the place to put energy if you love the Internet [and what it can now become], right? You&#8217;ve got to put a lot of energy into this, because this [a democratized view of the physical world as a platform] won&#8217;t just happen; there&#8217;s a lot of momentum already for it to be heavily privatized, partly because some of the computer vision algorithms that, say, make sense of things like the geotagged photographs are not open. I mean, for example, the beautiful maps that have been made from the University of Washington [from Flickr geotagged photo sets] &#8211; that isn&#8217;t in the public domain.</span></p>
<p><strong>Anselm Hook:</strong> Right, Tish, and in fact you&#8217;re referring [with the maps from the Flickr photos] to ordinary maps, and the fact that we&#8217;ve already seen that maps lie; we&#8217;ve already seen how much maps reflect a certain truth that becomes the normative truth. Google Maps reflects roads, because this is roads and cars, right? Only recently have they thought about buses and walking. So the normative view that people assume is the reality &#8211; showing off, you know, Starbucks, and roads, and cars &#8211; becomes the default; those prejudices are just assumed to be, you know, the truth. But they&#8217;re not the truth at all.</p>
<p>I was talking to a friend of mine in Montreal, [Renee Sieber], and she said that their Indian portage routes are a bridge across land and water; they don&#8217;t think of a piece of land and a piece of water as being different things, they think of them as one thing: a route. It&#8217;s already a different kind of language; we can&#8217;t even reflect it.</p>
<p>So not only is there this kind of formal, anthropological lie, in a sense, but there&#8217;s this way that we deceive ourselves because of our own prejudices.</p>
<p><strong>Tish Shute:</strong> Yes, I agree, and that&#8217;s why I think some of the things you had written on ImageWiki point clearly to the need to create a social commons. We need a social commons for the real-time physical internet, and we need it for the image hyperlinks that make sense of it.</p>
<p>And it&#8217;s a complicated thing in a sense, though, because we don&#8217;t actually have a good distributed infrastructure for AR yet, and I found, exploring AR Wave, that at last we have the suggestion of an open, federated protocol for real-time communication &#8211; the wave federation protocol. [Real-time communication is a very important part of AR.] It isn&#8217;t an actuality yet where lots of people are able to use it and set up their own servers, and there&#8217;s not a standard all the way through [there is not a standard for how data is sent between the client and the server].</p>
<p>But the Wave Federation Protocol does make truly distributed social AR possible. When I saw ImageWiki, I started thinking about bringing ImageWiki together with the social collaborative power of distributed AR. This really would be the basis of creating a social commons for augmented reality and the physical world as a platform &#8211; the start of a bottom-up approach, with deep social collaboration, to creating colloquial augmented reality maps that can inform a hyper-local view of the world.</p>
<p><strong>Anselm Hook:</strong> Yes. When Paige Saez, John Wiseman, and myself, and a few other folks&#8230; you know, Benjamin Foote, Marlin Pohlmann, and a couple of other people started to play with this, we quickly realized, &#8220;Oh, this kind of thing will be at least as popular as IRC. There will be at least as many people doing this as chatting in little virtual spaces. There&#8217;ll be at least as many people decorating the world with augmented reality markup, and maybe using the real world as a kind of barcode for translating what you&#8217;re looking at into an artifact, a digital artifact.&#8221;</p>
<p>And the size of that space was going to be huge, basically. Maybe not quite as commodifiable as Twitter, but certainly very energetic.</p>
<p>Many of the projects we did were just looking at these kinds of issues from an artistic, technical, and political point of view. We weren&#8217;t so much posing complete solutions, but simply using a praxis to explore the idea with an implementation, as a foundation for this discussion. So I think we opened that can of worms for sure.</p>
<p><strong>Tish Shute:</strong> Did you actually set ImageWiki up to work as a location-based app yet?</p>
<p><strong>Anselm Hook:</strong> It is a location-based app. It collects your longitude, latitude, and the image, and stores them. And then it uses that as a way to translate that image into anything else. It could be a piece of text or a URL.<br />
<strong><br />
Tish Shute:</strong> So there is a smartphone app, but you didn&#8217;t take it as far as an AR app yet?</p>
<p><strong>Anselm Hook:</strong> No. We didn&#8217;t do a heads-up view. There are apps on the iPhone store that do that, but they don&#8217;t do the brute force image recognition that we were using. We used a third party off-the-shelf algorithm that we found on Wikipedia, downloaded the source code, and threw it on the server. And John Wiseman in LA wrote the scalable database backend so that we could scale the actual&#8230;<br />
<strong><br />
Tish Shute:</strong> So how did you set the iPhone app up to work?</p>
<p><strong>Anselm Hook</strong>: The iPhone side was very simple. You take a picture of something and it tells you what it is. That is all it did. We would take the location, but the client side, the iPhone side, just rendered what was returned to you. It said, &#8220;Someone said that this picture of a barking dog is an advertisement for a local band.&#8221;</p>
<p><strong>Tish Shute:</strong> Right. So basically it was geo-tagged?</p>
<p><strong>Anselm Hook:</strong> Yes. We are just collecting the geo information. Actually, there were a whole lot of technical challenges. The whole idea of ImageWiki is actually kind of beyond the technical ability of a small team like us. It really does take a group like Google to do this kind of thing in a scalable way.<br />
<strong><br />
Tish Shute:</strong> Why is that?</p>
<p><strong>Anselm Hook:</strong> There are two sides. There is curating the images. I think that is the job of groups like us &#8211; open source groups who can curate images that are owned by the community. And then there is the searching side, the algorithm side, where you are actually matching the fingerprint of one image to images in your database. That is much more industrial. We did both sides, but ours is not a scalable solution. Mostly, proving that it could be done was what was important.<br />
<br />
<span id="a3ou" title="Click to view full content"><strong>Tish Shute: </strong>In terms of hooking Imagewiki up to the collaborative possibilities of AR Wave wouldn&#8217;t federation pose some interesting possibilities for scaling search algorithms and all that?</span></p>
<p><strong>Anselm Hook:</strong> Yes. And what is funny, incidentally, is that we did look for some financial support for it, but we just didn&#8217;t find the investors to scale it. Now, other companies like SnapTell took a shot at it. And they have an app in the iPhone store where you can point at a beer bottle and get back the name of the beer.</p>
<p>The classic example everyone uses is a book. Amazon has the image jackets of all their books. You can point SnapTell at almost any book and get back links to buy it at Amazon, the price of the book, and user comments on the book. So they are treating Amazon as the canonical voice of the book, for better or worse. That was the state of the art until Google Goggles came out a little while ago, which actually blows it out of the water. But that is where we are now.</p>
<p><strong>Tish Shute: </strong>Right. But the point you raise about how something like Amazon becomes the canonical voice of what a book is, right, this is the whole point, isn&#8217;t it?</p>
<p><strong>Anselm Hook:</strong> Is Amazon truth? It&#8217;s not bad. Jeff Bezos seems like a nice guy, but, you know.</p>
<p><strong>Tish Shute:</strong> And this is the point of having these open infrastructures for this. And this should be obvious in a way, but it comes back to the thing about what made the Internet great: the fact that even though, as you note, you get an oligarchy like Facebook, people always could just go off and do something else, right? Because the fundamental infrastructure was basically open and designed to be available for everyone. And many people have championed that and fought hard [to maintain this openness], haven&#8217;t they? They have devoted their lives to keeping it that way, even if the oligarchies have done their thing.<br />
<strong><br />
Anselm Hook:</strong> Yes. There are really some things underneath all of this that haven&#8217;t been solved yet.</p>
<p>One is that the trust in social networks has not been built yet, so we can&#8217;t do peer based recommendations very well. We can&#8217;t filter noise by peers. Twitter is kind of moving there, but I don&#8217;t just want to listen to my Twitter friends. I want to listen to my friends of friends. If I am getting truth from somebody, I want to get that truth from people my friends say that they trust.</p>
<p>Then the second problem is that there is a search business. My friend Ed Bice, who owns <a id="lir5" title="Meedan" href="http://beta.meedan.net/">Meedan</a>, always says that a search itself, a search request, is a publishing moment. It is an opportunity to say what you think. In the real world, if you are just hanging out with humans and you look somewhere, other people might follow your gaze and look at what you are looking at. Your gaze itself is a public act.</p>
<p>Gaze is a soft act, but it is one that is visible. With Google, the gaze of four billion people is invisible. We don&#8217;t know what people are looking at; there is no opportunity to participate. Let me give you a real example. Say I have taken an image of something, the bust of a figure or a statue. Why can&#8217;t the museum in Cairo look at my request and tell me, oh yeah, that is Tutankhamen, or that is Nefertiti, right? Why can&#8217;t they have a chance to participate in the search and respond to me?</p>
<p>Right now the only one that responds when I do a search is Google. We need to invert the search pyramid and open up search, so that search is a democratic act, so that you can publicly permission your searches, so that other people can respond and people can reach out to you, not just you having to drive the dialogue.</p>
<p>The common example of this &#8211; and we see this everywhere: I am looking for a slice of pizza, right, now I am hungry and I want some pizza. I have to ask Google, find twelve websites, call twelve phone numbers, talk to each of the twelve stores, and ask them: are they open late, is the food organic, is the food any good, do my friends like it?</p>
<p>Whereas what I should be able to do is just say it&#8217;s a search moment and I am interested in pizza. If those pizza places meet my criteria &#8211; like, you know, my friends like them, and they are organic, and they are open &#8211; then that pizza place can call me. I have the money; why should I do the search? So the whole business of search, the whole structure of search, is predicated around a revenue model, but it&#8217;s a really short-sighted revenue model; it&#8217;s not a brokerage.</p>
<p>Search isn&#8217;t search; search is hand waving. These should be moments for us to have a discourse. So the problem we are seeing in AR with communication of the right information is actually underneath AR, at the level of the whole infrastructure.</p>
<p>Search needs to be inverted, trust filters need to be built. We need to democratically own our data institutions. We don&#8217;t right now. That will be more of a concern, especially with AR.</p>
<p><strong>Tish Shute: </strong>Yes, especially with AR, which is why I got all excited about federation. Do you think federation has the potential, an opportunity, to create [the new infrastructure you describe]?</p>
<p><strong>Anselm Hook:</strong> Absolutely, it&#8217;s absolutely what we must do. It is much harder to do. It is absolutely critical.</p>
<p><strong>Tish Shute:</strong> And why is it much harder to do? Could you explain that?</p>
<p><strong>Anselm Hook:</strong> Well, it&#8217;s very easy for a bunch of hackers to build a service that you log into and fetch some data; it&#8217;s a single thing. They don&#8217;t have to talk to anybody, they can use their own protocols, they can hack it; it&#8217;s a big black box behind the scenes. There&#8217;s someone running back and forth in a giant Chinese room delivering manuscripts and scrolls to you. Whatever is behind the black box, you don&#8217;t care; it just works. But when you federate, you need to actually publish and have standards, and then you&#8217;re talking about semantics, and everyone starts getting really excited and waving their hands. It becomes a disaster. It&#8217;s at least an order of magnitude more difficult than DIY, build it yourself.</p>
<p><strong>Tish Shute:</strong> So, in terms of what Google has done with Wave&#8217;s approach to federation, what do you think have been their achievements, and what do you think are their obstacles? What do you think are the failings of Wave? Because it&#8217;s the first big, major-player-backed public approach to something federated, isn&#8217;t it? In real time.</p>
<p><strong>Anselm Hook:</strong> Yes. I think the most important non-federated service on the planet today is Twitter. <a id="uhg3" title="Ident.ic.a" href="http://identi.ca/group/identica">Identi.ca</a> isn&#8217;t getting any traction with respect to Twitter, [even though] Identi.ca is a federated version of Twitter and is very good. [Identi.ca is now <a id="w05j" title="Status.net" href="http://status.net/">Status.net</a>.] So we see already there that small players aren&#8217;t being competitive. Then look at other services like IRC. IRC is the secret backbone of the Net. All the open source projects, all the teams, all the people that work on open source projects are on IRC. It&#8217;s the only way they get anything done.</p>
<p>With Google Wave, and the protocols underneath Google Wave, we see an attempt to build a similar kind of real-time but distributed protocol. I think it&#8217;s the right direction. I think people should pick up the offering and make their own servers. I think that protocol is really great: the fact that it is compressed, high performance, and small, with real-time blobs of data flying around, is all exactly the way it should be done. It is getting close to this kind of rewrite of the Internet that people keep talking about, because, you know, the net protocols are so bad; it is starting to treat intermittent exchanges as more transitory, volatile, and not heavy.</p>
<p><strong>&#8230;to be continued. Part 2 coming soon!<br />
</strong></p>
]]></content:encoded>
			<wfw:commentRss>http://www.ugotrade.com/2010/01/17/visual-search-augmented-reality-and-a-social-commons-for-the-physical-world-platform-interview-with-anselm-hook/feed/</wfw:commentRss>
		<slash:comments>17</slash:comments>
		</item>
		<item>
		<title>The AR Wave Project: An Introduction and FAQ by Thomas Wrobel</title>
		<link>http://www.ugotrade.com/2009/12/04/ar-wave-project-an-introduction-and-faq-by-thomas-wrobel/</link>
		<comments>http://www.ugotrade.com/2009/12/04/ar-wave-project-an-introduction-and-faq-by-thomas-wrobel/#comments</comments>
		<pubDate>Sat, 05 Dec 2009 02:50:18 +0000</pubDate>
		<dc:creator><![CDATA[Tish Shute]]></dc:creator>
				<category><![CDATA[architecture of participation]]></category>
		<category><![CDATA[Augmented Reality]]></category>
		<category><![CDATA[digital public space]]></category>
		<category><![CDATA[Instrumenting the World]]></category>
		<category><![CDATA[internet of things]]></category>
		<category><![CDATA[Mixed Reality]]></category>
		<category><![CDATA[mobile augmented reality]]></category>
		<category><![CDATA[mobile meets social]]></category>
		<category><![CDATA[Mobile Reality]]></category>
		<category><![CDATA[new urbanism]]></category>
		<category><![CDATA[open source]]></category>
		<category><![CDATA[Paticipatory Culture]]></category>
		<category><![CDATA[social gaming]]></category>
		<category><![CDATA[social media]]></category>
		<category><![CDATA[ubiquitous computing]]></category>
		<category><![CDATA[Web 2.0]]></category>
		<category><![CDATA[Web Meets World]]></category>
		<category><![CDATA[websquared]]></category>
		<category><![CDATA[AR]]></category>
		<category><![CDATA[AR Blps]]></category>
		<category><![CDATA[AR DevCamp]]></category>
		<category><![CDATA[AR Network]]></category>
		<category><![CDATA[AR Wave]]></category>
		<category><![CDATA[AR Wave project]]></category>
		<category><![CDATA[AR Wave Wiki]]></category>
		<category><![CDATA[ARBlip]]></category>
		<category><![CDATA[ARDevCampNYC]]></category>
		<category><![CDATA[ARN]]></category>
		<category><![CDATA[Augmented Realit]]></category>
		<category><![CDATA[augmented reality network]]></category>
		<category><![CDATA[distributed augmented reality]]></category>
		<category><![CDATA[Goggle Wave Federation Protocol]]></category>
		<category><![CDATA[Google Wave]]></category>
		<category><![CDATA[Joe Lamantia]]></category>
		<category><![CDATA[layers and channels of augmented reality]]></category>
		<category><![CDATA[markerless augmented reality]]></category>
		<category><![CDATA[multiuser multisource augmented reality]]></category>
		<category><![CDATA[open augmented reality network]]></category>
		<category><![CDATA[open distributed augmented reality]]></category>
		<category><![CDATA[pygowave]]></category>
		<category><![CDATA[PyGoWave Qt-Based Desktop Client]]></category>
		<category><![CDATA[shared augmented realities]]></category>
		<category><![CDATA[social augmented experiences]]></category>
		<category><![CDATA[Sophia Parafina]]></category>
		<category><![CDATA[storing geolocated data on Wave Servers]]></category>
		<category><![CDATA[Thomas Wrobel]]></category>
		<category><![CDATA[Wave enabled augmented reality]]></category>

		<guid isPermaLink="false">http://www.ugotrade.com/?p=4960</guid>
		<description><![CDATA[Images from Mitsuo Iso&#8217;s Denno Coil (Click to enlarge), the game &#8220;Metroid Prime,&#8221; and Terminator. Thomas Wrobel, Sophia Parafina, Joe Lamantia, Matthieu Pierce, and I will lead a session tomorrow for AR DevCampNYC introducing the AR Wave Project. Thomas, Joe and Matthieu will participate via skype (10am to 11.30am EST), and Sophia Parafina and [&#8230;]]]></description>
				<content:encoded><![CDATA[<p><strong><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/12/Screen-shot-2009-12-04-at-7.56.58-PM.png"></a><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/12/Screen-shot-2009-12-04-at-6.43.24-PM.png"><img class="alignnone size-medium wp-image-4961" title="Screen shot 2009-12-04 at 6.43.24 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/12/Screen-shot-2009-12-04-at-6.43.24-PM-300x181.png" alt="Screen shot 2009-12-04 at 6.43.24 PM" width="300" height="181" /></a><br />
</strong></p>
<p><em>Images from Mitsuo Iso&#8217;s <a href="http://en.wikipedia.org/wiki/Denn%C5%8D_Coil" target="_blank">Denno Coil</a> (Click to enlarge), the game &#8220;Metroid Prime,&#8221; and Terminator.</em></p>
<p><a href="http://www.lostagain.nl/" target="_blank">Thomas Wrobel</a>, <a href="http://opengeo.org/about/team/sophia.parafina/" target="_blank">Sophia Parafina</a>, <a href="http://www.joelamantia.com/" target="_blank">Joe Lamantia, </a><a href="http://matthieupierce.com/" target="_blank">Matthieu Pierce</a>, and I will lead a session tomorrow for <a href="http://www.ardevcamp.org/wiki/index.php?title=NYC_ardevcamp" target="_blank">AR DevCampNYC</a> introducing the AR Wave Project. Thomas, Joe and Matthieu will participate via Skype (10am to 11.30am EST), and Sophia Parafina and I will both be at <a href="http://www.ardevcamp.org/wiki/index.php?title=NYC_ardevcamp" target="_blank">AR DevCampNYC</a> at the <a title="http://openplans.org/contact/" rel="nofollow" href="http://openplans.org/contact/">Open Planning Project (TOPP) office</a>. The <a href="http://pygowave.net/" target="_blank">PyGoWave</a> crew will be introducing <a href="http://livestream.com/pygowave" target="_blank">PyGoWave via LiveStream</a>.</p>
<p>From 1.30pm to 2.30pm EST there will be a shared <a href="http://pygowave.net/" target="_blank">PyGoWave</a>/AR Wave session <a href="http://www.ardevcamp.org/wiki/index.php?title=Main_Page" target="_blank">with Mountain View</a> (if bandwidth permits).</p>
<p>The Skype conference will be at ardevcampnyc. To participate in Wave, please join the public Wave, <a href="https://wave.google.com/wave/#restored:wave:googlewave.com!w%252BH83lcj6RA" target="_blank">AR Wave: AR DevCamp Session</a>. There is also an <a href="http://arwave.wiki.zoho.com/HomePage.html" target="_blank">AR Wave Wiki up now &#8211; see here</a>.</p>
<p><a href="tridarras.com/#http://www.dimitridarras.com/images/dd_work.jpg" target="_blank">Dimitri Darras </a>(avatar Dimitri Illios) is working on streaming the AR DevCampNYC sessions into Second Life, <a href="http://slurl.com/secondlife/Ambleside/228/247/25" target="_blank">SLURL here</a>.</p>
<p>Thomas has done a very nice introduction and FAQ below. This should help people new to this project get up to speed quickly.</p>
<p>There are already several Waves that show the history of this project, including: <a href="https://wave.google.com/wave/#restored:wave:googlewave.com%21w%252Bhvk2Fj3wB" target="_blank">AR Wave: Augmented Reality Framework Development</a>, <a href="https://wave.google.com/wave/#restored:wave:googlewave.com!w%252BeyLQLb4ED" target="_blank">AR Wave Use Cases</a>, <a href="https://wave.google.com/wave/#restored:wave:googlewave.com!w%252Bok4URyFyR" target="_blank">PyGoWave AR Tech Discussion</a>, <a href="https://wave.google.com/wave/#restored:wave:googlewave.com!w%252BJAcNzz16A" target="_blank">AR Wave Augmented Reality Wave Development</a>, <a href="https://wave.google.com/wave/#restored:wave:googlewave.com!w%252B0VnNxxoOB.1" target="_blank">AR Wave / Muku Organization and Admin</a>.</p>
<p>Also I have several posts for people interested in more of the background, including: <a title="Permanent Link to The Next Wave of AR: Mobile Social Interaction Right Here, Right Now!" rel="bookmark" href="../../2009/11/19/the-next-wave-of-ar-mobile-social-interaction-right-here-right-now/">The Next Wave of AR: Mobile Social Interaction Right Here, Right Now!</a>, <a href="http://www.ugotrade.com/2009/08/19/everything-everywhere-thomas-wrobels-proposal-for-an-open-augmented-reality-network/" target="_blank">AR Wave: Layers and Channels of Social Augmented Experiences</a>, <a title="Permanent Link to Total Immersion and the &#8220;Transfigured City:&#8221; Shared Augmented Realities, the &#8220;Web Squared Era,&#8221; and Google Wave" rel="bookmark" href="../../2009/09/26/total-immersion-and-the-transfigured-city-shared-augmented-realities-the-web-squared-era-and-google-wave/">Total Immersion and the &#8220;Transfigured City:&#8221; Shared Augmented Realities, the &#8220;Web Squared Era,&#8221; and Google Wave.</a></p>
<p>Thomas uses the term Arn (augmented reality network), which is one of the candidate names for the project; Muku (crest of a Wave) is another suggestion. Thomas&#8217; intro and FAQ below can also be found <a href="http://lostagain.nl/testSite/projects/Arn/information.html" target="_blank">here</a>.</p>
<p><strong><br />
</strong></p>
<h3><strong>What is the AR Wave Project?</strong></h3>
<p><strong> </strong></p>
<p>In simple terms, it&#8217;s a protocol, currently being developed, for storing <a id="zblc" title="geolocated" href="http://en.wikipedia.org/wiki/Geolocation">geolocated</a> data on Wave servers.</p>
<p>We believe this will help lay the foundations for an open, universally accessible, and decentralised system for shared augmented reality overlays which various clients can connect to and use.</p>
<p>This AR Network should spark a lot more rapid adoption of AR technologies, give existing browsers more functionality, and provide the network infrastructure, allowing many of the fictional depictions of AR to become a reality one day.</p>
<p><strong>The AR Network.</strong></p>
<p>When we speak of a future AR Network, we mean one as universal and as standard as the internet. One where people can connect from any number of devices, and without additional downloads, experience the majority of the content.</p>
<p>Where people can just point their phone, webcam, or pair of AR glasses anywhere where a virtual object should be, and they will see it. The user experience is seamless; AR comes to them without them needing to &#8220;prepare&#8221; their device for it.</p>
<p>The Arn should be an inclusive and open platform to which any number of devices can connect, and where anyone can make and host their own location-specific models or data.</p>
<p>It should allow people to communicate both publicly and privately, and not have their vision constantly cluttered with things they donâ€™t want to see.</p>
<p>This is our vision, and we think a Wave protocol will help it become a reality.</p>
<p><strong>Why Wave?</strong></p>
<p>Wave allows the advantages of both real-time communication, as well as the advantages of persistent hosting of data. It is both like IRC, and like a Wiki. It allows anyone to create a Wave, and share it with anyone else. It allows Waves to be edited at the same time by many people, or used as a private reference for just one person.</p>
<p>These are all incredibly useful properties for any AR experience, more so since Wave is open. Anyone can make a server or client for Wave. Better yet, these servers will exchange data with each other, providing a seamless world for the user: a single login will let you browse the whole world of public waves, regardless of who&#8217;s providing or hosting the data. Wave is also quite scalable and secure: data is only exchanged when necessary, and will stay local to just one server if no one else needs to view it.</p>
<p>Wave allows bots to run on it, thus allowing blips in a wave to be automatically updated, created, or destroyed based on any criteria the coders choose. Wave even allows the playback of all edits since the wave was created.</p>
<p>For all these reasons and a few more, Wave makes a great platform for AR.</p>
<p><strong>How?</strong></p>
<p>In basic terms, we will devise a standard way to geolocate a bit of data and store it as a <a id="u0cd" title="Blip" href="http://google.about.com/od/b/g/google_wave_blip.htm">Blip</a> within a wave.</p>
<p>This data could be a 3d mesh, a bit of text, or even a piece of audio.</p>
<p>Then various clients on various devices could log on, locate, interpret, and display this data as they see fit.</p>
<p><a href="http://lostagain.nl/tempspace/PrototypeDiagram3_wave.html" target="_blank"><img class="alignnone size-medium wp-image-4962" title="Screen shot 2009-12-04 at 7.56.58 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/12/Screen-shot-2009-12-04-at-7.56.58-PM-300x168.png" alt="Screen shot 2009-12-04 at 7.56.58 PM" width="300" height="168" /></a></p>
<p><em>Click on image above to enlarge.</em></p>
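<p>As a thought experiment, here is a minimal sketch of what such a geolocated Blip payload might look like. The field names and layout are purely illustrative assumptions on my part; the actual ARBlip spec is still being decided (see below).</p>

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class ARBlip:
    """Hypothetical geolocated payload stored as a Blip in a wave.

    Field names are illustrative only; the real ARBlip spec was
    still being drafted at the time of writing.
    """
    lat: float          # WGS84 latitude, in degrees
    lon: float          # WGS84 longitude, in degrees
    alt: float          # altitude above ground, in metres
    media_type: str     # e.g. "text/plain", "model/mesh", "audio/ogg"
    payload: str        # the data itself, or a URL pointing to it

    def to_json(self) -> str:
        """Serialise for storage inside a blip."""
        return json.dumps(asdict(self))

# A friend leaves a note hovering over a cafe:
note = ARBlip(lat=52.3731, lon=4.8922, alt=2.0,
              media_type="text/plain", payload="Meet you inside at 3!")
print(note.to_json())
```

<p>A client that understands the format would read the blip back, check the media type, and render the payload at the given coordinates.</p>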
<p>A typical example of this might be holding up your phone and seeing messages written by your friends and family in the locations which they are relevant.</p>
<p>You could see an arrow hovering over the café you&#8217;re meeting a friend at, notes above their flat saying if they are in or out, or messages by shops telling you to pick up the particular brand of cereal they like.</p>
<p>This data would be personal to just yourself and whoever you invite to share that wave with.</p>
<p>Other forms of data could be public, like city-maps, online games, or historical landmarks being recreated. Custom views of the world with data for entertainment, commercial, environmental or informative purposes.</p>
<p>The possibilities with geolocated data are endless, as are the various ways to display and make use of them.</p>
<p>One of the things I&#8217;m most passionate about is people being able to see many different types of data, both public and private, at the same time and from many different sources at once.</p>
<p>For instance, if you&#8217;re playing an AR game, why shouldn&#8217;t your chat window be viewable at the same time?</p>
<p>If you have skinned your environment with a custom view of the world, why shouldn&#8217;t you also see mapping or restaurant recommendations?</p>
<p>The ways to present these layers of data and toggle them on/off in the most intuitive and flexible ways would be a task for the client makers, and I&#8217;m sure we will see many innovations in those areas.</p>
<p>But using Wave at least provides the framework for having multiple information sources, controlled by many different people, yet accessible, and user-submittable, via the same protocol.</p>
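<p>One way a client might realise this multi-source layering: treat each wave the user has joined as one toggleable layer of blips. This is a hedged sketch of an assumed client-side scheme, not part of any spec; the names are made up for illustration.</p>

```python
# Hypothetical client-side layer filter: each wave a user has joined is
# treated as one toggleable layer of geolocated blips, merged into a
# single display list. Real clients would invent their own scheme.

def visible_blips(waves, enabled_layers):
    """Merge the blips of every enabled wave into one display list."""
    shown = []
    for wave_id, blips in waves.items():
        if wave_id in enabled_layers:
            shown.extend(blips)
    return shown

waves = {
    "chat": [{"text": "hi!", "lat": 52.37, "lon": 4.89}],
    "game": [{"text": "quest marker", "lat": 52.38, "lon": 4.90}],
    "maps": [{"text": "restaurant, 4 stars", "lat": 52.37, "lon": 4.91}],
}

# The user plays the AR game but keeps chat visible and maps hidden:
print(visible_blips(waves, enabled_layers={"chat", "game"}))
```

<p>Because every layer arrives over the same protocol, toggling is purely a display decision; no source needs to know about any other.</p>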
<p><strong>Who?</strong></p>
<p>This idea first sprouted from a paper I wrote focusing on the potential for IRC to be used for AR;</p>
<p><a id="ig44" title="http://www.lostagain.nl/testSite/projects/Arn/AR_paper.pdf" href="http://www.lostagain.nl/testSite/projects/Arn/AR_paper.pdf">http://www.lostagain.nl/testSite/projects/Arn/AR_paper.pdf</a></p>
<p>I suggested near the end Wave might be a better alternative (using Google Wave was an idea Tish Shute, Ugotrade, brought up in response to the Arn prototype design on IRC), and it quickly became apparent that Wave was a very suitable medium.</p>
<p>Since then, there was a lot of interest, and numerous people have offered to help.</p>
<p>In particular, recently, the <a id="vms1" title="PygoWave" href="http://pygowave.net/blog/">PygoWave</a> team is helping us out, as they have an existing server supporting a c/s protocol, which is being actively developed.</p>
<p><strong>Where?</strong></p>
<p>You can join the general discussion here;<br />
<a id="wvja" title="Augmented Reality Wave Development" href="https://wave.google.com/wave/#restored:wave:googlewave.com%21w%252BJAcNzz16A">Augmented Reality Wave Development</a></p>
<p>Technical side here;<br />
<a id="qw95" title="Augmented Reality Wave Framework Development" href="https://wave.google.com/wave/#restored:wave:googlewave.com%21w%252Bhvk2Fj3wB">Augmented Reality Wave Framework Development</a></p>
<p><strong>When?</strong></p>
<p>There&#8217;s lots still to do, and we are at an early stage.</p>
<p>Our current targets: (last updated 11/12/2009)</p>
<ul>
<li>Getting reading/writing of prototype ARBlips to the PygoWave server working. (The PygoWave team have already made a standalone client and have the protocol for this sorted!)</li>
<li>Establishing a minimal spec for ARBlips to be later expanded.</li>
<li>Writing a very simple prototype online client showing how to store/retrieve the data.</li>
<li>Expanding client to work for some use-cases.</li>
<li>Establish a logo/branding for the project.</li>
</ul>
<p><strong>Other FAQs.</strong></p>
<p><strong>Where&#8217;s the catch?</strong></p>
<p>While we believe Wave is highly suitable for development, it has the drawback of being a new system with just a few servers worldwide, which (at the time of writing) have not yet been federated together.</p>
<p>Naturally, as a new technology, it&#8217;s likely to have some growing pains. And building a new technology on other new technology will multiply that somewhat. The first pain is the lack of a standard client/server protocol. PygoWave have stepped in to the rescue a bit here, by being not just one of the most developed Wave servers other than Google&#8217;s, but also leaping ahead with support for JSON-based c/s interaction. Google has stated they want the community to take the lead on a c/s protocol, so we are hoping they will adopt a JSON variant, or an XMPP one, and add it to the spec. We hope that, much as POP3/IMAP have been standards for email server interaction, a similar standard will develop for Wave.</p>
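<p>To make the JSON c/s idea concrete, here is a hypothetical exchange. This is not PyGoWave&#8217;s actual wire format, just a hedged sketch of what a JSON-based submit/acknowledge round trip could look like; all message and field names are assumptions.</p>

```python
import json

# Hypothetical JSON client/server messages in the spirit of a
# JSON-based c/s protocol; the real wire format may differ entirely.

def make_submit(wave_id, blip_id, payload):
    """Client -> server: submit a new or edited blip."""
    return json.dumps({
        "type": "BLIP_SUBMIT",
        "wave_id": wave_id,
        "blip_id": blip_id,
        "payload": payload,
    })

def handle(message):
    """Server side: dispatch on the message type."""
    msg = json.loads(message)
    if msg["type"] == "BLIP_SUBMIT":
        # A real server would persist the blip and notify federated peers.
        return json.dumps({"type": "ACK", "blip_id": msg["blip_id"]})
    return json.dumps({"type": "ERROR", "reason": "unknown type"})

reply = handle(make_submit("wave-123", "blip-7", {"text": "hello AR"}))
print(reply)  # an ACK naming "blip-7"
```

<p>Keeping the envelope this simple is what makes it cheap to swap transports later, which is why we plan to keep the ARBlip-writing code abstracted.</p>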
<p>In the meantime we plan to keep the code for writing ARBlips somewhat abstracted so as to make it easy to adapt in future.</p>
<p>As for the newness of Wave and other potential problems it will bring, we aren&#8217;t that worried as its built on <a id="jnw1" title="XMPP" href="http://en.wikipedia.org/wiki/XMPP">XMPP</a>, which has proved reliable already.</p>
<p>The other catch is that we are unfunded, which slows development down considerably, as we have to fit it around our other jobs.</p>
<p><strong>I&#8217;m making my own AR Browser, and am slightly interested in maybe supporting you.</strong></p>
<p>We are naturally very keen for support, and particularly for those with the skills and vision to give feedback on the proposed protocol. Specifically: what do you want stored in a blip?</p>
<p>That&#8217;s what&#8217;s important at this stage.</p>
<p>We don&#8217;t see the Arn as a replacement for existing browser systems at the moment. We don&#8217;t want to restrict innovation or development in this fast developing market, as we are very impressed at what&#8217;s been achieved so far. In many ways our task is small in comparison to what&#8217;s already been accomplished.</p>
<p>However, we do believe the Arn will make a good addition to existing browser systems. It will allow users to contribute data and have social features without having to worry about accounts or hosting.</p>
<p>It will still be quite some work to support; new GUIs will need to be developed to make it easy to submit data from the devices, as well as to log in to waves.</p>
<p>However, we hope over time to build a set of example libs to make the reading/writing of ARBlips as easy as possible to implement in your software.</p>
<p>Perhaps a good way to think about it is existing AR Browsers are like word-processors, supporting the Arn will be like adding support for *.txt, but doesn&#8217;t limit what you can do with your own format.</p>
<p><em>Eventually</em> we do hope ARBlips hosted on Wave will become the majority of AR data, and that the system will function analogously to how the internet does today. We truly believe that in the long run a standard is essential.</p>
<p>But for now we think merely getting a baseline format established for how AR data can be communicated will increase usability and usefulness, and help the market grow.</p>
<p><strong>Can I help?</strong></p>
<p>Sure.</p>
<p>We particularly need people with technical skills in relevant fields (both GWT/JavaScript web programming and C++/Qt standalone programming help is very welcome!).</p>
<p>But we also welcome people just with vision to help focus use-cases and to conceptualise what we want to be able to do with the system.</p>
<p>Please join the relevant AR Waves or the <a href="http://arwave.wiki.zoho.com/HomePage.html">Wiki</a>.</p>
<p>We are especially interested in those with JSON and Comet experience &#8211; specifically, those with the abilities to make standalone applications to read/write to a server using these methods.</p>
<p><strong>What type of data will an AR Blip store?</strong></p>
<p>This is still actively being decided, but essentially it&#8217;s a physical hyperlink.</p>
<p>A connection between a physical location (or object, see below) and a piece of data.</p>
<p>Specifically, we are thinking about the following fields:</p>
<p>Location in X,Y,Z,<br />
Coordinate System used for the above,<br />
Orientation,<br />
MIMEType <span style="color: #666666;">[the type of data stored]</span><br />
DataItself <span style="color: #666666;">[either an http link for 3d meshes and other larger data, or an inline text string if it&#8217;s just a comment]</span><br />
DataUpdateTimestamp <span style="color: #666666;">[so clients know if it&#8217;s necessary to redownload]</span><br />
Editors <span style="color: #666666;">[the user/s that edited/created this blip]</span><br />
ReferanceLink <span style="color: #666666;">[data needed to tie the object to a non-fixed location, such as an image to align it to an object in realtime]</span><br />
Metatags <span style="color: #666666;">[to describe the data]</span></p>
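<p>The field list above can be sketched as a plain data structure. This is only an illustration of the proposed fields, not a finalised schema; the function names, defaults, and the idea of a required-field check are our own assumptions:</p>

```python
# Sketch of an ARBlip as a plain dictionary using the proposed fields.
# Field names follow the list above; the defaults and validation rule
# are our own assumptions, not part of any agreed spec.
REQUIRED_FIELDS = {"Location", "CoordinateSystem", "MIMEType", "Data"}

def make_arblip(x, y, z, coord_system, mime_type, data,
                orientation=None, editors=None, metatags=None):
    return {
        "Location": (x, y, z),
        "CoordinateSystem": coord_system,
        "Orientation": orientation,       # None -> treated as a facing sprite
        "MIMEType": mime_type,
        "Data": data,                     # inline text, or an http link for larger data
        "DataUpdateTimestamp": 0,         # lets clients skip unnecessary redownloads
        "Editors": editors or [],         # user/s that edited/created this blip
        "Metatags": metatags or [],       # descriptive tags
    }

def is_valid(blip):
    return REQUIRED_FIELDS.issubset(blip) and all(
        blip[f] is not None for f in REQUIRED_FIELDS
    )

note = make_arblip(52.37, 4.89, 0.0, "EPSG:4326", "text/plain",
                   "Hello from Amsterdam")
```

<p>A client receiving such a blip would check the required fields before trying to render it.</p>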
<p><strong>Are you purely tying stuff to fixed geolocations?</strong></p>
<p>Certainly not <img src="http://www.ugotrade.com/wordpress/wp-includes/images/smilies/icon_smile.gif" alt=":)" class="wp-smiley" /><br />
As part of the spec we want people to be able to link data to dynamically moving objects, trackable by image or other methods.</p>
<p>The idea being that one day someone could link a piece of text or 3d mesh to an image on a t-shirt they are wearing, or perhaps link a dynamically updating twitter feed, or perhaps provide information on a product (based on its logo).</p>
<p>There&#8217;s a large number of possibilities for image-based linking alone, and that&#8217;s not even considering possibilities like linking RFIDs, or other forms of less precise but invisible binding data.</p>
<p>We need a lot of feedback from those companies already doing markerless tracking. What types of images do you ideally need to link a mesh to an object? Is one enough?</p>
<h3><strong>Summary of AR Wave Work to Date</strong></h3>
<p><strong>Purpose:</strong> To provide an open, distributed, and universally accessible platform for augmented reality. To allow the creation of augmented reality content to be as simple as making an html page, or contributing to a wiki.</p>
<p><strong>Specific Goal:</strong> To establish a method for geolocating digital data in physical space (or linking it to physical objects) using wave as a platform.</p>
<p>(For justification as to why we are using Wave see: <a href="http://lostagain.nl/testSite/projects/Arn/information.html" target="_blank">our faq</a> )</p>
<p><strong>Wave as a platform</strong></p>
<p>We are developing on the <a title="PyGoWave" href="http://code.google.com/p/pygowave-server/" target="_blank">PyGoWave</a> server at the moment, but the goal is to be compatible with all Wave servers.</p>
<p>PyGoWave has already achieved an important step in enabling the project by being a wave server with a working and well documented server protocol. This allows both standalone and web-based clients to interface with it already. See: <a href="http://github.com/p2k/pygowave-qt">The PyGoWave Qt-Based Desktop Client</a></p>
<p>This is one of the reasons why we have chosen to develop for the Pygo server at this stage.</p>
<p>However, the overall goal of AR Wave is to have a framework compatible with all servers using the Wave Federation Protocol. As more wave servers gain c/s protocols, ARblips (the data needed to geolocate objects) could be posted to and retrieved from various servers using the same client software. For this a standard should emerge. Just as websites don&#8217;t have to be hosted on specific servers, neither should AR data need to be hosted on specific wave servers.</p>
<p>In order to reach our goal, there are a few very achievable steps involved &#8211; see below.</p>
<p><strong>Feedback</strong></p>
<p>We are still actively seeking feedback, so feel free to join the <a href="https://wave.google.com/wave/#restored:wave:googlewave.com%21w%252Bhvk2Fj3wB">Wave discussions</a> and see the history of how the specifications of the protocol evolved. You can also read the justification for some of the choices already made. Note a new discussion for AR DevCamp will begin at <a href="https://wave.google.com/wave/#restored:wave:googlewave.com%21w%252BH83lcj6RA">AR Wave: AR DevCamp Session</a></p>
<p>This will, of course, only be the first draft of the specification, and it is sure to develop much in the future.<br />
The important thing now is to make working prototypes while maintaining flexibility.</p>
<p>So what do we need to do?</p>
<p><strong>Steps :</strong></p>
<p><strong>* Establish the overall method &#8211; Done.</strong></p>
<p>Each Wave will be a layer on reality which an individual or a group can create. Each Blip in this Wave refers to either a small piece of inline data (like text) or a remote piece of larger data (like a 3D mesh), as well as the data needed to pin-point it in either relative or absolute real space.<br />
We call these blips ARblips. They are simply blips that store the data necessary to augment a single object onto a specific bit of reality.</p>
<p>It is up to the clients how they interpret and display the data. They could interpret it as a simple 2d list of nearby objects, or as an advanced 3D overlay, whereby multiple waves from different sources could be viewed at once. What&#8217;s important is that there is a standard way to link the digital data to the real world space.</p>
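<p>One client-side decision the paragraph above implies is whether a blip&#8217;s data travels inline or must be fetched from a link. A minimal sketch of that logic, assuming the DataItself field behaves as described in the field list (an http link for large data, inline text otherwise):</p>

```python
# Illustrative client logic, not part of the spec: decide whether a blip's
# Data field is inline content or a link that must be fetched.
def resolve_data(blip):
    data = blip["Data"]
    if data.startswith("http://") or data.startswith("https://"):
        # Larger payloads (e.g. 3d meshes) live at a URL.
        return ("fetch", data)
    # Small payloads such as comments travel inside the blip itself.
    return ("inline", data)
```

<p>A 2d-list client and a 3D-overlay client could share this step and diverge only in how they render the resolved data.</p>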
<p>* Establishing the specification for the ARblip &#8211; In progress<br />
We have a good idea of what&#8217;s needed to be stored in an ARblip, and we have hammered out a rough format.<br />
The data might be stored as blip-annotations, but this has yet to be finalised.<br />
A rough outline of the type of data stored can be seen in the c++/qt header for ARblip data at the end of this document.</p>
<p>* Storing and retrieving these pieces of ARblip data on the PyGo server &#8211; In progress.<br />
The Pygowave team has made some excellent libraries that should make reading and writing data on the PyGoWave server trivial for those with c++ skills.<br />
This, however, is a really critical step, so more developers with C++ skills are very welcome!</p>
<p>* Making the above client mobile, and using a device&#8217;s GPS to place the data. &#8211; Not started.<br />
The next step would be to port the code to a mobile phone and use its GPS input to post geolocated data and view what others have posted. This would be a fairly simple and not too useful app in itself. However, it would mark the first time anyone could post AR data and anyone could view it, all using open-source infrastructure.<br />
As a bonus, because we are using wave infrastructure, the updates to any ARblip should appear in near realtime.</p>
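<p>The &#8220;view what others have posted&#8221; half of this step is essentially a proximity query. A sketch of how a mobile client might filter blips to those near the device&#8217;s GPS fix, using the standard great-circle (haversine) distance; the flat lat/lon keys are our simplification of the Location/coordinate-system fields:</p>

```python
import math

# Hypothetical helper for the mobile client: keep only the ARblips within
# radius_m metres of the device's GPS fix.
def haversine_m(lat1, lon1, lat2, lon2):
    r = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def nearby(blips, lat, lon, radius_m=100.0):
    # Each blip is assumed to carry plain "lat"/"lon" keys for this sketch.
    return [b for b in blips if haversine_m(lat, lon, b["lat"], b["lon"]) <= radius_m]
```

<p>With consumer GPS accuracy of a few metres to tens of metres, a generous radius like this is about as precise as the first prototype can usefully be.</p>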
<p>* To continue with the proof of concept, we would like to have simultaneous wave input from a PC and a mobile phone at the same time. &#8211; Not started.<br />
For example, someone could post a pin via the Google Maps API and have that data posted to an ARBlip in a wave. Someone logged into that wave on their mobile device would then see the posted data appear.<br />
Moreover, we hope that when the Google map pin is dragged about, the mobile phone viewer will see its location updated in near real time, with just a few seconds&#8217; lag.</p>
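<p>The pin-drag scenario is, at heart, an observer pattern: one writer updates a blip, every subscribed viewer is notified. A minimal illustrative sketch (the class and method names are invented; in real Wave the notification would be a server-pushed operation, not an in-process callback):</p>

```python
# Minimal observer sketch of the map-pin scenario: dragging a pin updates a
# blip's location, and every subscribed viewer callback is notified.
class ArBlipChannel:
    def __init__(self):
        self._viewers = []
        self.location = None

    def subscribe(self, callback):
        self._viewers.append(callback)

    def move_pin(self, lat, lon):
        self.location = (lat, lon)
        # In real Wave this fan-out happens server-side, in near realtime.
        for cb in self._viewers:
            cb(self.location)

seen = []
channel = ArBlipChannel()
channel.subscribe(seen.append)   # the "mobile phone viewer"
channel.move_pin(40.7128, -74.0060)  # the "PC dragging the pin"
```

<p>The few seconds of lag mentioned above would come from the server round-trip and federation hops, not from the pattern itself.</p>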
<p>We hope to make a modest yet practical app at this stage.</p>
<p>* After all this, we can go onto the interesting things:<br />
3D data, camera-overlays, data fixed to objects and many more. There&#8217;s plenty of existing software using these features (such as Wikitude and Layar) and some that is even open source (like Gamaray and Flashkit). The open source code can give us a leg-up. However, we prefer to establish the protocol first, so naturally these fancy features aren&#8217;t a priority for us. Rather, we think our energy is better spent establishing the protocols and infrastructure so that other people can build more advanced bits of software more easily.</p>
<p>However, once our primary goals are met, we will look to make an open source augmented reality browser ourselves, which will surely include many of these features.</p>
<p>Overall, we hope once we have a simple proof of concept, there will be many groups, both existing and new, wanting to use this Wave system for their own apps, games and data.</p>
<p><strong>Conclusion</strong>:<br />
Really it&#8217;s now all about growing the community. We hope that as soon as we show how great Wave can be for augmented reality, lots of individuals and teams will start making their own clients to read/write geolocated data.<br />
Overall we don&#8217;t think anything we make will be that impressive in itself. That&#8217;s not our goal.<br />
We instead hope that our project will enable AR-content to be made as easily as web content. That games, information and apps will be able to be created without the creators having to worry<br />
about the infrastructure behind it.</p>
<p><strong>Technical information</strong></p>
<p><strong>Current ARBlip header file</strong></p>
<p>(below is a c++/qt header file for an ARBlip object that should illustrate the data being stored)</p>
<hr />
<pre><code>class arblip
{
public:
    arblip();
    ~arblip();
    arblip(QString,QString,double,double,double,int,int,int,QString);
    QString getDataAsString();
    QString getEditors();
    QString getRefID();
    QString getXAsString();
    QString getYAsString();
    QString getZAsString();
    bool isFaceingSprite();

private:
    // ID reference. This would be a unique identifier for the blip.
    // Presumably the same as Wave uses itself.
    QString ReferanceID;

    // Last editor(s)
    QString Editors;

    int PermissionFlags = 0664; // default octal 664 = rw-rw-r--

    // Location
    double Xpos;  // left/right
    double Ypos;  // up/down
    double Zpos;  // front/back

    // Orientation
    // Names, ranges and directions are taken from aeronautics.
    // If no orientation is specified, it's assumed to be a facing sprite.

    // Roll: rotation around the front to back (z) axis. (Lean left or right.)
    // Range +/- 180 degrees, with + values moving the object's right side down.
    int Roll;

    // Pitch: rotation around the left to right (x) axis. (Tilt up or down.)
    // Range +/- 90 degrees, with + values moving the object's front up. (Looking up.)
    int Pitch;

    // Yaw: rotation around the vertical (y) axis. (Turn left or right.)
    // Range +/- 180 degrees, with + values moving the object's face to its right.
    int Yaw;

    bool FacingSprite; // If no rotation is specified, this should default to true.
    // If set to true when a rotation is set, the rotation is kept relative
    // to the viewer, not relative to the earth.

    // Data format
    QString DataMIME;

    // The co-ordinate system used. This should be a string representing an
    // Open Geospatial Consortium standard. This could be earth-relative for
    // gps co-ordinates, or in some cases relative to the viewer, for data to
    // be displayed in a HUD-like style.
    QString CordinateSystemUsed;

    // Data itself
    QString Data;

    QString DataUpdatedTimestamp; // Time the Data was last changed.
    // Note: a separate timestamp should be used for updates that don't affect
    // the data itself (such as if a 3d object moves, but its mesh isn't changed).

    // Data metadata
    QMap&lt;QString, QString&gt; Metadata;
};</code></pre>
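<p>The header&#8217;s orientation conventions (aeronautic names, stated degree ranges, facing-sprite default when no rotation is given) can be mirrored in a few lines. A sketch in Python; the function name and returned dictionary shape are our own invention:</p>

```python
# Mirrors the header's orientation rules: no angles -> facing sprite;
# otherwise clamp to the ranges given in the header's comments.
# (Function name and return shape are illustrative only.)
def normalise_orientation(roll=None, pitch=None, yaw=None):
    if roll is None and pitch is None and yaw is None:
        return {"facing_sprite": True}
    clamp = lambda v, lo, hi: max(lo, min(hi, v or 0))
    return {
        "facing_sprite": False,
        "roll": clamp(roll, -180, 180),   # + rolls the object's right side down
        "pitch": clamp(pitch, -90, 90),   # + tilts the object's front up
        "yaw": clamp(yaw, -180, 180),     # + turns the object's face to its right
    }
```

<p>Encoding the defaulting rule once, server-side or in a shared lib, would keep every client&#8217;s rendering of partially specified blips consistent.</p>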
]]></content:encoded>
			<wfw:commentRss>http://www.ugotrade.com/2009/12/04/ar-wave-project-an-introduction-and-faq-by-thomas-wrobel/feed/</wfw:commentRss>
		<slash:comments>3</slash:comments>
		</item>
		<item>
		<title>Web 2.0 Meets Gov 2.0: Hacking Human Behavior within a City, FourSquare, MoMo #13, and AR DevCamp</title>
		<link>http://www.ugotrade.com/2009/12/02/web-2-0-meets-gov-2-0-hacking-human-behavior-within-a-city-four-square-momo-13-and-ar-devcamp/</link>
		<comments>http://www.ugotrade.com/2009/12/02/web-2-0-meets-gov-2-0-hacking-human-behavior-within-a-city-four-square-momo-13-and-ar-devcamp/#comments</comments>
		<pubDate>Thu, 03 Dec 2009 04:05:52 +0000</pubDate>
		<dc:creator><![CDATA[Tish Shute]]></dc:creator>
				<category><![CDATA[Android]]></category>
		<category><![CDATA[architecture of participation]]></category>
		<category><![CDATA[Augmented Reality]]></category>
		<category><![CDATA[Carbon Footprint Reduction]]></category>
		<category><![CDATA[culture of participation]]></category>
		<category><![CDATA[digital public space]]></category>
		<category><![CDATA[Energy Awareness]]></category>
		<category><![CDATA[Instrumenting the World]]></category>
		<category><![CDATA[internet of things]]></category>
		<category><![CDATA[iphone]]></category>
		<category><![CDATA[Mixed Reality]]></category>
		<category><![CDATA[mobile augmented reality]]></category>
		<category><![CDATA[mobile meets social]]></category>
		<category><![CDATA[Mobile Reality]]></category>
		<category><![CDATA[new urbanism]]></category>
		<category><![CDATA[Paticipatory Culture]]></category>
		<category><![CDATA[social gaming]]></category>
		<category><![CDATA[social media]]></category>
		<category><![CDATA[sustainable living]]></category>
		<category><![CDATA[sustainable mobility]]></category>
		<category><![CDATA[ubiquitous computing]]></category>
		<category><![CDATA[Web 2.0]]></category>
		<category><![CDATA[Web Meets World]]></category>
		<category><![CDATA[websquared]]></category>
		<category><![CDATA[Anil dash]]></category>
		<category><![CDATA[AR DevCamp]]></category>
		<category><![CDATA[AR DevCamp NYC]]></category>
		<category><![CDATA[AR Wave]]></category>
		<category><![CDATA[architectures of participation]]></category>
		<category><![CDATA[Big AR NY Game]]></category>
		<category><![CDATA[Carl Malamud]]></category>
		<category><![CDATA[Code for America]]></category>
		<category><![CDATA[Dot Gov]]></category>
		<category><![CDATA[Expert Labs]]></category>
		<category><![CDATA[Four Square]]></category>
		<category><![CDATA[FourSquare]]></category>
		<category><![CDATA[Google Wave]]></category>
		<category><![CDATA[Google Wave Federation Protocol]]></category>
		<category><![CDATA[Gov 2.0]]></category>
		<category><![CDATA[Gov 2.0 Expo]]></category>
		<category><![CDATA[Gov 2.0 Summit]]></category>
		<category><![CDATA[government as a platform]]></category>
		<category><![CDATA[Hacking Human Behavior Within A City]]></category>
		<category><![CDATA[Jennifer Pahlka]]></category>
		<category><![CDATA[Mark Drapeau]]></category>
		<category><![CDATA[mobile aug]]></category>
		<category><![CDATA[Mobile Monday]]></category>
		<category><![CDATA[mobile social communication]]></category>
		<category><![CDATA[mobile social connectedness]]></category>
		<category><![CDATA[MoMo 13]]></category>
		<category><![CDATA[Ohan Oda]]></category>
		<category><![CDATA[open augmented reality]]></category>
		<category><![CDATA[open data]]></category>
		<category><![CDATA[open distribute]]></category>
		<category><![CDATA[open distributed AR]]></category>
		<category><![CDATA[open Goblin XNA platform]]></category>
		<category><![CDATA[Open Gov]]></category>
		<category><![CDATA[pygowave]]></category>
		<category><![CDATA[real time communications]]></category>
		<category><![CDATA[Real Time Crunchup]]></category>
		<category><![CDATA[real time internet]]></category>
		<category><![CDATA[real time web]]></category>
		<category><![CDATA[Scott Yates]]></category>
		<category><![CDATA[Sean White]]></category>
		<category><![CDATA[social augmented experiences]]></category>
		<category><![CDATA[social interaction design]]></category>
		<category><![CDATA[Steven Feiner]]></category>
		<category><![CDATA[The Next Wave of AR]]></category>
		<category><![CDATA[The Open Planning Project]]></category>
		<category><![CDATA[the outernet]]></category>
		<category><![CDATA[War for the Web]]></category>
		<category><![CDATA[Wave]]></category>
		<category><![CDATA[Wave enabled AR]]></category>
		<category><![CDATA[Wave Federation Protocol]]></category>

		<guid isPermaLink="false">http://www.ugotrade.com/?p=4880</guid>
		<description><![CDATA[Mobile social communication is beginning to take center stage as the internet moves to real time communications. The recent explosion of interest in augmented reality is part of a wider concern to orchestrate a new landscape of contextually relevant information linked to location/place/time and mobile social connectedness. The picture above, &#8220;Having an iphone has completely [&#8230;]]]></description>
				<content:encoded><![CDATA[<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/12/GentryUnderwood2.jpg"><img class="alignnone size-medium wp-image-4917" title="GentryUnderwood2" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/12/GentryUnderwood2-300x199.jpg" alt="GentryUnderwood2" width="300" height="199" /></a></p>
<p>Mobile social communication is beginning to take center stage as the internet moves to real time communications. The recent explosion of interest in augmented reality is part of a wider concern to orchestrate a new landscape of contextually relevant information linked to location/place/time and mobile social connectedness.</p>
<p>The picture above, &#8220;Having an iphone has completely changed the way I poop,&#8221; is a slide from <a href="http://www.ideo.com/thinking/voice/gentry-underwood" target="_blank">Gentry Underwood&#8217;s</a> workshop at <a href="http://www.web2expo.com/webexny2009/" target="_blank">Web 2.0 Expo, NYC</a>, <a href="http://www.web2expo.com/webexny2009/public/schedule/detail/10638" target="_blank">&#8220;Social Interaction Design a Primer.&#8221;</a></p>
<p>Last month, I attended three events, starting with <a href="http://www.mobilemonday.nl/category/events/13/" target="_blank">MoMo #13</a>, Amsterdam, where I presented <a href="http://www.mobilemonday.nl/talks/tish-shute-the-next-wave-of-ar/" target="_blank">&#8220;The Next Wave of AR: Mobile Social Interaction, Right Here, Right Now!&#8221;</a> Then I caught the last two days of the <a href="http://www.web2expo.com/webexny2009/" target="_blank">Web 2.0 Expo, NYC</a>, and finally, <a href="http://www.techcrunch.com/real-time-crunchup-sf/" target="_blank">Real Time Crunchup SF</a> (which I watched online).</p>
<p>New forms of real time, mobile social connectedness were central themes on all three occasions.</p>
<p>But, in terms of doing stuff that matters with mobile real time technologies, at the moment we are still at the &#8220;hello world&#8221; demonstration stage (see my conversation with <a href="http://dashes.com/anil/" target="_blank">Anil Dash</a> and <a href="http://www.markdrapeau.com/" target="_blank">Mark Drapeau</a> at Web 2.0 Expo below).</p>
<p>As Anil Dash noted, <strong>&#8220;I think everybody starts with a train schedule&#8230;&#8221;</strong></p>
<p><strong>&#8220;I remember five years ago when Adrian did Chicagocrime.org. It was a revelation but I mean, that was five years ago. And people still keep making that app over and over.&#8221;</strong></p>
<p>Anil Dash announced at the Web 2.0 Expo that he will be the director of <a href="http://www.expertlabs.org/">Expert Labs</a>, a new nonprofit that will take the dot-com incubator model and apply it to new digital tools for the federal government:</p>
<p><strong>&#8220;For me, in starting Expert Labs it&#8217;s been great just to tap into the desire people have to help and serve and to take the idea that you can work for your country without having to work for your government. What can you do to participate?&#8221;</strong></p>
<p>The Gov 2.0 movement is attracting the best and the brightest; if you need inspiration, check out <a href="http://public.resource.org/" target="_blank">Carl Malamud&#8217;</a>s <a href="http://www.gov2summit.com/" target="_blank">Gov 2.0 Summit</a> presentation, <a href="http://gov2summit.blip.tv/file/2605719/" target="_blank">By the People&#8230;.</a> <a href="http://radar.oreilly.com/jenpahlka/" target="_blank">Jennifer Pahlka</a> is leaving her long time post as co-chair of Web 2.0 events for TechWeb to concentrate on <a href="http://codeforamerica.org/" target="_blank">Code for America</a>. And <a href="http://www.markdrapeau.com/about/" target="_blank">Mark Drapeau</a> is co-chair of the <a href="http://www.gov2expo.com/gov2expo2010" target="_blank">Government 2.0 Expo</a> next May, which O&#8217;Reilly and TechWeb are also producing. You can submit ideas about Gov 2.0, ICT, and cities (or other topics) to the upcoming <a href="http://gov2expo.com" target="_blank">Gov 2.0 Expo</a>. Mark says he will welcome them! Note there is a <a href="http://en.oreilly.com/gov2fall09" target="_blank">free Gov 2.0 online conference</a> Thursday, Dec. 10th.</p>
<p><span id="sp:r" title="Click to view full content"> Tim O&#8217;Reilly has committed to Gov 2.0 work and &#8220;doing stuff that matters&#8221; with missionary zeal (see his </span><span id="sp:r" title="Click to view full content">keynote Web 2.o Expo, <a href="http://www.youtube.com/watch?v=EYRC8nfZ67M&amp;feature=PlayList&amp;p=A0D433518BDA7856&amp;index=2" target="_blank">War for the Web)</a></span><span id="sp:r" title="Click to view full content">.Â Â  Tim O&#8217;Reilly&#8217;s talk, also the article,Â  <a href="http://radar.oreilly.com/2009/11/the-war-for-the-web.html" target="_blank">War for the Web</a>, are a stark reminder of how the centralization and privatization of large parts of </span>our communications infrastructure<span id="sp:r" title="Click to view full content"> threatens the open web.Â  But &#8220;doing stuff that matters,&#8221; as it turns out,Â  is one of the best ways to win the war for the open web. </span></p>
<h3>&#8220;Level playing Fields, Open access, Open APIs, Controlling our data, being able to move with it&#8221; (Anil Dash)</h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/12/Screen-shot-2009-12-02-at-9.08.04-PM.png"><img class="alignnone size-medium wp-image-4934" title="Screen shot 2009-12-02 at 9.08.04 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/12/Screen-shot-2009-12-02-at-9.08.04-PM-300x184.png" alt="Screen shot 2009-12-02 at 9.08.04 PM" width="300" height="184" /></a></p>
<p><em>Slide above from Anil Dash&#8217;s presentation at the Web 2.0 Expo, NYC, <a href="http://www.youtube.com/watch?v=aOlKfbE97ok&amp;feature=PlayList&amp;p=A0D433518BDA7856&amp;index=9" target="_blank">&#8220;Listening to the Experts&#8221;</a></em></p>
<p>The Gov 2.0 movement is still in the idea and initiative phase, but the ideals and scope of the movement are a natural antidote to the fox-in-the-social-network-chicken-coop business model du jour (see the <a href="http://www.techcrunch.com/2009/11/02/zynga-takes-steps-to-remove-scams-from-games/" target="_blank">latest antics of Zynga</a>).</p>
<p>Anil Dash notes the intrinsic bond between Gov 2.0 work and the open web:</p>
<p><strong>&#8220;Because government has an inclination to creating openness by its nature. Right? We don&#8217;t have an entirely toll system of federal highways in the states. We understand that the broadcast airwaves are a public good. And so government is inclined to think about creating public goods. It would be ridiculous to spend tax payer dollars on funding proprietary platforms.&#8221;</strong></p>
<p><a href="http://www.businessweek.com/globalbiz/blog/globespotting/archives/2009/12/the_power_of_go.html" target="_blank">The Power of Government as a Platform</a> for citizen involvement is just beginning to emerge from initiatives like Data.gov, &#8220;a collection of federal data housed on the www.data.gov <a href="http://www.data.gov/">Web site</a> that&#8217;s open to public access.&#8221;</p>
<p>One of the most challenging aspects of creating in-context mobile applications that do stuff that matters is the data curation.</p>
<p><a href="http://www1.cs.columbia.edu/~swhite/" target="_blank">Sean White</a> explained to me <a href="http://www.ugotrade.com/2009/10/24/ismar-2009-an-augmented-reality-top-chef-coopetition/" target="_blank">at ISMAR 2009</a> the challenges of data curation behind this beautiful example of augmented reality doing something that matters (pic below): a pollution meter that &#8220;shows carbon monoxide levels projected over New York City. The height of each ball reflects concentrations of the pollutant&#8221; (developed at the Columbia University Graphics and User Interface Lab, where <a href="http://www1.cs.columbia.edu/%7Efeiner/" target="_blank">Steven Feiner</a> is Director). Note Sean White and Steven Feiner will be at <a href="http://www.ardevcamp.org/wiki/index.php?title=AR_DevCamp_interest_list" target="_blank">AR DevCamp NYC</a> this weekend at <a title="http://openplans.org/contact/" rel="nofollow" href="http://openplans.org/contact/">The Open Planning Project office (TOPP)</a> &#8211; see below for more information.</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/12/Screen-shot-2009-12-02-at-2.32.05-PM1.png"><img class="alignnone size-medium wp-image-4925" title="Screen shot 2009-12-02 at 2.32.05 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/12/Screen-shot-2009-12-02-at-2.32.05-PM1-300x214.png" alt="Screen shot 2009-12-02 at 2.32.05 PM" width="300" height="214" /></a></p>
<h3>Open Data combined with Open Architectures of Participation are a Powerful Combination.</h3>
<p>Scott Yates commented in his <a href="http://www.examiner.com/x-25758-Google-Wave-Examiner~y2009m11d20-Google-Wave-may-be-the-future-but-the-future-is-not-Real-Time" target="_blank">very insightful post</a> on <a href="http://realtimecrunchupsf241.eventbrite.com/" target="_blank">RT Crunchup SF</a> that a &#8220;litany of fixes&#8221; for a broken web was &#8220;presented as the state of the art&#8221; in a <strong>&#8220;series of presentations from companies that have solutions that fix some subset of all the long list of annoyances&#8221;</strong> (annoyances arising from finding data and friends locked into a variety of different walled gardens).</p>
<p>And, Scott Yates writes:</p>
<p><strong>&#8220;There have been presentations from companies who hope to be the future of socially connected communications, but not one of them has the economic or intellectual heft to be considered a true vision for the future.&#8221;</strong></p>
<p>If you have been following my recent posts, you will already know that I agree with Scott Yates when he concludes:<strong> &#8220;Wave really has an opportunity to fix so much of what is broken in communications.&#8221; </strong></p>
<p>I have been working on <a href="http://www.slideshare.net/TishShute/the-next-wave-of-ar-mobile-social-interaction-right-here-right-now-2542526" target="_blank">a project to create an open distributed augmented reality/mobile social communications framework based on the Wave Federation Protocol.</a></p>
<p>This <a href="http://www.ardevcamp.org/wiki/index.php?title=Main_Page" target="_blank">Saturday Dec 5th there will be AR DevCamps held in Mountain View, New York City, Wave and Skype. </a> There will be sessions on many aspects of open augmented reality, including Wave enabled AR.</p>
<h3>AR DevCamp</h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/12/Screen-shot-2009-12-02-at-2.13.59-AM.png"><img class="alignnone size-full wp-image-4908" title="Screen shot 2009-12-02 at 2.13.59 AM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/12/Screen-shot-2009-12-02-at-2.13.59-AM.png" alt="Screen shot 2009-12-02 at 2.13.59 AM" width="135" height="139" /></a></p>
<p>I will attend <a href="http://www.ardevcamp.org/wiki/index.php?title=NYC_ardevcamp" target="_blank">AR DevCamp NYC</a> at the NYC location, <a title="http://openplans.org/contact/" rel="nofollow" href="http://openplans.org/contact/">The Open Planning Project office (TOPP)</a> penthouse in Manhattan. This will be an awesome opportunity to meet some of the key augmented reality thought leaders and innovators, including <a href="http://www.cs.columbia.edu/~swhite/" target="_blank">Sean White</a>, <a href="http://graphics.cs.columbia.edu/top.html" target="_blank">Steven Feiner</a>, <a href="http://www.cs.columbia.edu/~henderso/" target="_blank">Steve Henderson</a>, and many others (see the sign up <a href="http://www.ardevcamp.org/wiki/index.php?title=AR_DevCamp_interest_list" target="_blank">list here</a>). <a href="http://www.cs.columbia.edu/~ohan/" target="_blank">Ohan Oda</a> will demo the <a href="http://graphics.cs.columbia.edu/projects/goblin/" target="_blank">open Goblin XNA platform</a>. Thomas Wrobel will answer questions on writing AR Blips to PygoWave servers, and Sophia Parafina (<a href="http://twitter.com/spara" target="_blank">@spara</a>), Joe Lamantia (<a title="http://joelamantia.com" rel="nofollow" href="http://joelamantia.com/">@mojoe</a>) and I will be on hand to discuss the open distributed framework for AR project &#8211; <a href="http://www.ugotrade.com/2009/10/13/ar-wave-layers-and-channels-of-social-augmented-experiences/" target="_blank">Wave enabled AR</a>. The <a href="http://pygowave.net/blog/" target="_blank">PyGoWave crew</a> will participate via skype (they will be introducing some of their latest work). Ori Inbar of <a href="http://ogmento.com/" target="_blank">Ogmento</a> and <a href="http://gamesalfresco.com/" target="_blank">Games Alfresco</a> will lead a brainstorming session on the &#8220;Big AR NY Game&#8221;: the first location-based, social, augmented reality game designed for New York by New Yorkers.</p>
<p>We will continue the interesting discussion led by Marco Neumann (<a href="http://twitter.com/Neumarcx" target="_blank">@neumarcx</a>) on the Semantic Web and Augmented Reality at the <a href="http://semweb.meetup.com/25/calendar/11819773/" target="_blank">Semantic Web Meetup</a>. <a href="http://www.tacticaltransparency.com/my_weblog/author-bios.html" target="_blank">John C. Havens</a> will introduce the <a href="http://outernetguidelinesinitiative.pbworks.com/" target="_blank">Outernet Guidelines Initiative</a>. And <a href="http://www.mattsnod.com/" target="_blank">Matthew Snodgrass</a> (<a title="http://www.twitter.com/mattsnod" rel="nofollow" href="http://www.twitter.com/mattsnod">@mattsnod</a>, <a title="http://www.lippetaylor.com" rel="nofollow" href="http://www.lippetaylor.com/">Lippe Taylor</a>) will lead a session on the future implications of AR. <a title="Noah Zerkin (page does not exist)" href="http://www.ardevcamp.org/wiki/index.php?title=Noah_Zerkin&amp;action=edit&amp;redlink=1">Noah Zerkin</a> will share his brilliant work on AR software and hardware interfaces and explore the idea of an AROS. And <a href="http://www.maploser.com/?page_id=6" target="_blank">Kate Chapman</a> (<a href="http://twitter.com/wonderchook" target="_blank">@wonderchook</a>), from Washington, DC, and a bevy of local NYC geo geniuses, including organizer Sophia Parafina (<a href="http://twitter.com/spara" target="_blank">@spara</a>), will explore ways to visualize government data through AR. I am hoping we will have some projects for the upcoming Gov 2.0 Expo at <a href="http://gov2expo.com/">http://gov2expo.com</a>.</p>
<p>And there will be much, much more &#8211; <a href="http://www.ardevcamp.org/wiki/index.php?title=NYC_ardevcamp" target="_blank">keep checking and adding to the wiki.</a> See you there!</p>
<h3>&#8220;Hacking Human Behavior Within a City&#8221;</h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/12/Screen-shot-2009-12-02-at-7.47.46-PM.png"><img class="alignnone size-medium wp-image-4931" title="Screen shot 2009-12-02 at 7.47.46 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/12/Screen-shot-2009-12-02-at-7.47.46-PM-300x227.png" alt="Screen shot 2009-12-02 at 7.47.46 PM" width="300" height="227" /></a></p>
<p><em>Picture from inspiringcities.org &#8211; shows some <a href="http://www.inspiringcities.org/index.php?id=395&amp;page_type=Article&amp;id_article=18826" target="_blank">Amsterdam bicycle trends</a></em></p>
<p>At Web 2.0 Expo, <a href="http://www.ideo.com/thinking/voice/gentry-underwood" target="_blank">Gentry Underwood</a>, <a href="http://www.ideo.com/thinking/voice/gentry-underwood" target="_blank">IDEO</a>, gave a great presentation, from an ethnographic perspective, on how software changes community and communities change software &#8211; <a href="http://www.youtube.com/watch?v=bPbzdcZBl6M&amp;feature=PlayList&amp;p=A0D433518BDA7856&amp;index=19" target="_blank">&#8220;Designing Web 2.0: Here Come the Anthropologists.&#8221;</a></p>
<p>And Baratunde Thurston&#8217;s <a href="http://www.youtube.com/watch?v=xkyqKPcfx64&amp;feature=PlayList&amp;p=A0D433518BDA7856&amp;index=0" target="_blank">&#8220;There&#8217;s a #hashtag for That&#8221;</a> was an inspired, brilliant romp through the &#8220;mini-grass-roots movements&#8221; of hashtags &#8211; which are &#8220;quickly assembled/demolished malleable fun!&#8221; or &#8220;great ways to mess with people&#8221; &#8211; that reminded us of the power of grass-roots movements when it comes to &#8220;hacking human behavior.&#8221;</p>
<p>But my visit to <a href="http://www.mobilemonday.nl/category/talks/" target="_blank">MoMo #13</a> preceding the Web 2.0 Expo showed me clearly that &#8220;hacking human behavior within a city&#8221; is on home turf in Amsterdam, where smart phones and bicycles are the vehicles for the MoMoesque lifestyle.</p>
<p>Thanks to the foresight and generosity of the MoMo organizers, who make sure that the speakers&#8217; shared experience goes beyond the few hours of the event, I had a three-day, three-night intensive on the future of mobile social interaction &#8211; living, thinking, and breathing mobile social connectedness, often into the wee hours, with Dennis Crowley, CEO of <a href="http://www.foursquare.com/" target="_blank">Foursquare</a> (see <a href="http://www.mobilemonday.nl/talks/dennis-crowley-foursquare/" target="_blank">his great MoMo 13 presentation here</a>), Ted Morgan of SkyHook (<a href="http://www.mobilemonday.nl/talks/ted-morgan-location-makes-mobile-mobile/" target="_blank">a must-see presentation on what SkyHook is doing with data</a>), the MoMo crew, and many of Amsterdam&#8217;s enthusiastic Foursquare community.</p>
<p>And yes, Foursquare really is an awesome way to enjoy a city and meet new people. MIA in this particular pic are key MoMo organizers &#8211; <a href="http://twitter.com/samWarnaars" target="_blank">@samwarnaars</a>, <a href="http://twitter.com/MdBraber" target="_blank">@mdbraber</a>, and <a href="http://twitter.com/vanGeest" target="_blank">@vangeest</a>.</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/11/foursquare-polaroid.jpg"><img class="alignnone size-medium wp-image-4885" title="foursquare-polaroid" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/11/foursquare-polaroid-300x224.jpg" alt="foursquare-polaroid" width="300" height="224" /></a></p>
<p>But what do fun times in Amsterdam and Foursquare have to do with doing stuff that matters?</p>
<p><a href="http://twitter.com/marcfonteijn" target="_blank">Marc Fonteijn</a>, MoMo chair and co-founder of <a href="http://www.31v.nl/" target="_blank">31Volts</a>, points out: <strong>&#8220;Foursquare looks and feels like a game, but what it&#8217;s actually doing is changing behavior in a playful way.&#8221;</strong></p>
<p>And <a href="http://twitter.com/vanGeest" target="_blank">Yuri van Geest</a>, who co-founded not only <a href="http://www.mobilemonday.nl/" target="_blank">Mobile Monday Amsterdam</a> but also <a href="http://www.tedxamsterdam.nl/" target="_blank">TEDx Amsterdam,</a> added:</p>
<p><strong>&#8220;In Holland we are working on using the Foursquare API for mHealth purposes. Also, we see that smart venue owners reward all mayors/lead users/visitors with free meals/drinks/privileges/perks etc., and smart advertisers boost their co-marketing deals based on Foursquare&#8217;s targeting capabilities for key influencers.&#8221;</strong></p>
<p>Dennis Crowley, seemingly immune to lack of sleep and jet lag, followed up his MoMo #13 talk with <a href="http://www.web2expo.com/webexny2009/public/schedule/detail/11589" target="_blank">a presentation at Web 2.0 Expo</a>. I was sitting just behind Mark Drapeau, and I managed to catch up with Mark after Dennis&#8217;s talk.</p>
<p>Mark listed Foursquare in his big takeaways from the Web 2.0 Expo, pointing out the potential new forms of mobile social interaction have for &#8220;hacking human behavior within a city.&#8221;</p>
<p><strong>&#8220;I had always been a little leery of trying Foursquare because I have a certain level of privacy I try to keep up. But listening to the CEO of Foursquare talk about it&#8230; I knew what it was. I have friends that use it&#8230; but thinking about it as hacking human behavior within a city and social engineering of people&#8217;s behavior and what they can do, and really understanding what citizens are doing within cities, or other areas, and how they interact with each other &#8211; I think that could be incredibly valuable for government 2.0 and government understanding people better.&#8221;</strong></p>
<p>And Anil Dash concurred:</p>
<p><strong>&#8220;I think Foursquare is a good model in terms of having a game dynamic, being mobile from its default, having a great social experience, leveraging existing networks like Twitter and Facebook instead of trying to compete with them by building their own. I think those are all really, really smart leanings.</strong></p>
<p><strong>I think about, if I were a government agency trying to meet those same goals, could I earn badges in Foursquare by doing things that help my community? Right? So when I volunteer at a soup kitchen, is that one way to earn an exclusive badge? Is that going to earn me a discount at the bar? Those are all dynamics that we can set up very, very easily and I think that model&#8230; maybe it is a public-private partnership. That&#8217;d be great.&#8221;</strong></p>
<h3>Talking with Mark Drapeau and Anil Dash</h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/11/MarkAnilpost1.jpg"><img class="alignnone size-medium wp-image-4884" title="Mark&amp;Anilpost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/11/MarkAnilpost1-300x199.jpg" alt="Mark&amp;Anilpost" width="300" height="199" /></a></p>
<p><strong>Tish Shute:</strong> I was in Amsterdam speaking at MoMo #13 and I had a lot of fun hanging out with the MoMo crew and Dennis, CEO of Foursquare. I got to meet people and hang out with Amsterdam&#8217;s new Foursquare community. But unfortunately I missed the first two days of Web 2.0 Expo.</p>
<p><strong>Mark Drapeau:</strong> I got here yesterday too. Yeah. So some big takeaways.</p>
<p>I had always been a little leery of trying Foursquare because I have a certain level of privacy I try to keep up. But listening to the CEO of Foursquare talk about it&#8230; I knew what it was. I have friends that use it, but thinking about it as hacking human behavior within a city and social engineering of people&#8217;s behavior and what they can do, and really understanding what citizens are doing within cities, or other areas, and how they interact with each other, I think that could be incredibly valuable for government 2.0 and government understanding people better.</p>
<p>Also I really wanted to hear Tim O&#8217;Reilly interview Beth Noveck. I thought the most interesting thing about the interview was the questions and not the answers (also see <a href="http://markdrapeau.posterous.com/white-house-deputy-cto-beth-noveck-wants-more" target="_blank">Mark&#8217;s Posterous</a>). I thought a lot of the answers were disappointing and political and vague.</p>
<p>But I thought Tim really got at some important issues about how people in the web 2.0 community, the audience of Web 2.0 Expo, interact in reality when you have a system that we nicknamed Gucci Gulch, where you have lobbyists and lawyers and special interests and councils and all these things that&#8230; developers and app builders are not really a part of. So how do you break in? I didn&#8217;t really hear good answers for that.</p>
<p>I really liked the presentation by the IBM researcher &#8211; if I can get his name, forgive me &#8211; Ching-Yung Lin. Talking about putting a value on how many friends you have, how many connections you have, and the fact that IBM can actually put a monetary value on the number of connections you have to managers. The number of email accounts you have in your inbox, or your address book.</p>
<p>I thought that was just fascinating, and that&#8217;s something I&#8217;m very passionate about: social networking for the sake of social networking, and not merely for collaboration but for making connections among diverse communities and using that to help your business or help your government agency. Those are my big takeaways this morning.</p>
<p><strong>Tish Shute:</strong> I was lucky enough to attend Gov 2.0 Summit. I think a lot of very important areas for Gov 2.0 were defined there: transparency, open data, getting developers into the public-sector loop, and citizen-government interaction. In what areas are we seeing progress, and where are we stymied, and why? How do you see this Web 2.0 community connecting to the ideals and plans for action of Gov 2.0?</p>
<p><strong>Mark Drapeau:</strong> I think there&#8217;s a lot of unanswered questions about Government 2.0, because there&#8217;s a lot of good talk and a lot of good ideas and initiatives, but there&#8217;s still a long way to go before people in this audience, in this community, who want to help the government or be a part of policy making or technology in the government, can really, in a meaningful way, interact with the government processes that, for the most part, are not going away.</p>
<p>And the people that are also part of the system, like giant contractors, they&#8217;re not going away. There&#8217;s a place for everyone. The question is how do the smaller people break in, and I don&#8217;t think there are really great answers for that.</p>
<p><strong>Tish Shute:</strong> What is the plan for the next Government 2.0 event?</p>
<p><strong>Mark Drapeau:</strong> So part of the reason I&#8217;m here is to learn and be inspired as the co-chair of the Government 2.0 Expo next May, which O&#8217;Reilly and TechWeb are also producing. And so, increasingly, at the Web 2.0 events that they host there are technologies and people relevant to government missions, or public-sector missions.</p>
<p>And so I think some of the speakers here will carry over in different ways to the Gov 2 Expo in May.</p>
<p><strong>Tish Shute:</strong> Is the call for proposals for the Gov 2.0 Expo still open?</p>
<p><strong>Mark Drapeau:</strong> There&#8217;s still an open call for proposals. Or, if people know me, they can talk to me directly.</p>
<p>(<a id="dtf7" title="Anil Dash" href="http://dashes.com/anil/" target="_blank">Anil Dash</a> arrives and Mark introduces me; Anil and I met briefly at Gov 2.0 Summit. See Anil&#8217;s post, <a id="btc0" title="New York City is the Future of the Web" href="http://dashes.com/anil/2009/11/new-york-city-is-the-future-of-the-web.html" target="_blank">New York City is the Future of the Web</a>. I do agree &#8211; but, of course, NYC is my hometown!)</p>
<p><strong>Tish Shute:</strong> Anil, you are moving into Gov 2.0 work full time now after being a key thought leader in Web 2.0.</p>
<p><strong>Anil Dash:</strong> My perspective is probably unique in that I am very strongly from the Web 2.0 world, and new to the Gov 2.0 world and I think it&#8217;s telling that you can make the leap. I think that the profound thing is that these worlds are converging and it&#8217;s not where it was.</p>
<p>Five years ago, government technology was a bike with the training wheels on it. It was very much somebody&#8217;s old hacked-up version of Drupal and crossed fingers. And it looked a little homely and you thought, &#8220;well, this looks like a run-down office kind of thing.&#8221;</p>
<p>Now we have institutions that have wonderful physical presences. You can&#8217;t stand in front of the Capitol Building or the White House or the Supreme Court and not say &#8220;that&#8217;s a majestic building.&#8221; We should have online institutions that reflect the scope and the scale of what they do.</p>
<p>For me, in starting Expert Labs it&#8217;s been great just to tap into the desire people have to help and serve and to take the idea that you can work for your country without having to work for your government. What can you do to participate?</p>
<p><strong>Tish Shute:</strong> You and Mark had very interesting journeys into Gov 2.0, didn&#8217;t you?</p>
<p><strong>Mark Drapeau:</strong> Like the Hunter S. Thompson of Government 2.0.</p>
<p><strong>Tish Shute:</strong> I like that!</p>
<p><strong>Anil Dash:</strong> Can you say that about yourself?</p>
<p><strong>Mark Drapeau:</strong> I did the other night and people seemed to buy it, so&#8230;</p>
<p><strong>Anil Dash:</strong> People were feeling it&#8230;</p>
<p><strong>Mark Drapeau:</strong> That&#8217;s right!</p>
<p><strong>Tish Shute:</strong> And even if it&#8217;s controversial, it&#8217;s good too! I see Mark (and perhaps I am wrong with these characterizations) as coming to this via an interest in the social narratives of government and, Anil, you have come to Gov 2.0 work, as you point out, from a deep immersion in the cultures of technology and Web 2.0&#8230;</p>
<p><strong>Anil Dash:</strong> And it&#8217;s also a little bit, I&#8217;ve had the privilege of seeing blogs and social media develop from the start and what I learned from it is cultural change and [not just] technology change. This is the same thing happening in government.</p>
<p>We&#8217;re calling it Government 2.0 and it makes it seem like it&#8217;s a version upgrade and it&#8217;s a software thing but it&#8217;s cultural change. And the interesting thing is many of the key players have a willingness to go through that cultural change, which means that the technology, therefore, has the opportunity to succeed.</p>
<p><strong>Tish Shute:</strong> What did you think about Tim O&#8217;Reilly&#8217;s keynote and the warning he gave re the open web?</p>
<p><strong>Anil Dash:</strong> The war for the web! He&#8217;s absolutely right. Honestly, before Expert Labs had started and I&#8217;d come on board, my initial plan for a talk at this event was exactly the topics Tim covered in the War for the Web: that the centralization of vast parts of our communications infrastructure around privately owned, venture-funded companies is a risk to innovation in some ways.</p>
<p>We have to make sure to set up our incentives for those companies, the Facebooks and Googles and Twitters of the world, to align with what our goals are as a society, as a culture, as entrepreneurs, and all those other goals.</p>
<p>So I think it&#8217;s good to have a voice like Tim&#8217;s articulating that threat and that danger so that we can respond to it. I agree completely that we are in the next phase of the battle between open and closed platforms that we went through ten years ago with AOL.</p>
<p>There was a time when AOL dominated the dial-up internet &#8211; one-third of all dial-up users in the US were coming through AOL. People now say, &#8220;oh, the iPhone is dominant.&#8221; The iPhone has 2 percent market share of all phones, or something like that, and yet people are doing all their innovation on its platform.</p>
<p>Well, people used to do all their innovation on AOL&#8217;s platform and then they ended up having to rewrite it all for the open web.</p>
<p>This pattern is going to repeat. The choice is whether people want to encourage it or fight it or hope it goes away and ultimately there&#8217;s no great business that was built entirely within the walls of AOL&#8217;s garden. I doubt there will be a great business built entirely within the walls of Apple&#8217;s or Facebook&#8217;s or anyone else&#8217;s garden.</p>
<p>That&#8217;s not to say those companies couldn&#8217;t evolve to be open, I hope they do, but as it stands right now you would be foolish to bet your business either from a promotional standpoint, from a start-up standpoint, from a new technology standpoint, on any closed platform that you don&#8217;t control.</p>
<p><strong>Tish Shute:</strong> I was mentioning to Mark that I thought it&#8217;s sort of ironic that we now understand how important the architecture of participation of the internet can be to government just as we are on the verge of another big battle to keep the web open &#8211; a moment when walled gardens seem to dominate. Will this be an obstacle for Gov 2.0?</p>
<p><strong>Anil Dash:</strong> No, I think actually they&#8217;ll get to skip the closed era.</p>
<p>You know, I look at the rather famous example in India of never having landlines. They went directly to satellite phones, skipped directly to the wireless generation, so they never had an old infrastructure to rip out.</p>
<p>I think you are going to see the same thing with government tech adoption: they are going to start in the era of recognizing the threat of closed platforms and move directly to open platforms.</p>
<p>Because government has an inclination toward creating openness by its nature. Right? We don&#8217;t have an entirely toll-based system of federal highways in the States. We understand that the broadcast airwaves are a public good. And so government is inclined to think about creating public goods. It would be ridiculous to spend taxpayer dollars on funding proprietary platforms.</p>
<p><strong>Mark Drapeau:</strong> Universal accessibility for citizens.</p>
<p><strong>Anil Dash:</strong> Right. It&#8217;s a fundamental tenet of government, and we have an incredible history, including the Internet itself, of embracing open standards to solve government problems in a way that helps all of society.</p>
<p><strong>Tish Shute:</strong> So people who have championed the open, participatory architecture of the internet and open source approaches now have even more incentive to team up with government 2.0!</p>
<p><strong>Anil Dash:</strong> Yeah, it is an advantage. But also, I mean, candidly, open source is almost incidental to it. I mean, I think we have come to the point where open source is assumed as some element of any new tech venture. It is much more about level playing fields, open access, open APIs, controlling our data, being able to move with it &#8211; that, I think, is key.</p>
<p>And I go back to that AOL example: there was a moment where they opened their email gateways to standard Internet email. And so instead of the AOL users only being able to email each other, they could email anybody on the web, and this is the moment in which all the value was created. You start to have email marketing companies, and open exchanges, and open mailing lists happen when anybody could email anybody else &#8211; that is the sort of thing that government catalyzed just by being the example.</p>
<p><strong>Tish Shute:</strong> Mark talked about Foursquare and how that could be really interesting as part of a Gov 2.0 project. But mobile has followed a course with many complications re an architecture of participation &#8211; I am thinking about the control exerted by the carriers, and now Apple, for example?</p>
<p><strong>Anil Dash:</strong> No, I think it has revealed complications that have always been there. Right? There have always been multiple platforms. There have always been user agents and web browsers that have different capabilities. There has always been a digital divide. Mobile is making clear that those realities existed.</p>
<p>But I keep saying this: I think if I am designing an application today, you design for mobile first, for a number of reasons. One, the digital divide is much less pronounced on mobile devices. Two, you are much more likely to have an experience that scales well from a small device to a larger one than vice versa. Three, you are able to target international markets or other developing markets where mobile is the default computer platform. And you become aware of constraints in bandwidth, in accessibility, in user experience, in general experience with computers, that a lot of people in the technology industry just completely ignore.</p>
<p>And you know, you go to Silicon Valley and people think it&#8217;s normal to have a six-hundred-dollar phone that has a thousand-dollar-a-year data plan. And without blinking they design for devices like that. It&#8217;s myopic and ridiculous to think that people can live with that level of privilege all the time, on all the devices that they have, and that they have a brand new computer. And so that will be its own undoing.</p>
<p>Right? It&#8217;s the people that are thinking about open platforms and working with any device. And I think Foursquare, candidly, is doing a good job of this: they did start with the assumption of iPhones and this thing, but their initial target audience of hipsters in the East Village probably did have those. But now they have an open API, anybody can access it &#8211; that&#8217;s the right evolution. And I think they&#8217;re smart enough, that was always in their plan.</p>
<p><strong>Tish Shute:</strong> But in terms of mobile social interaction, we basically have a structure of lots of different walled gardens?</p>
<p><strong>Anil Dash:</strong> For now&#8230;</p>
<p><strong>Tish Shute: </strong>How do you see mobile developing more interoperability and social interaction capabilities?</p>
<p><strong>Anil Dash:</strong> By using the web. I think it doesn&#8217;t have to be full-fledged Ajax-y HTML applications on the phone. But if we simply rely on the capabilities of the web as it stands today instead of developing for proprietary mobile platforms, we can make a lot of amazing things happen. It&#8217;s a good constraint. We should embrace our constraints.</p>
<p>It&#8217;s not conventional wisdom yet that mobile applications should be developed for the web. But that&#8217;ll change in the next year.</p>
<p><strong>Tish Shute:</strong> There are a lot of exciting new real-time technologies coming to the Web &#8211; PubSubHubbub, the Google Wave Federation Protocol. How will these change mobile development?</p>
<p><strong>Anil Dash:</strong> RSS Cloud. I mean, there&#8217;s a ton of real-time technologies that are coming out together.</p>
<p><strong>Tish Shute:</strong> What are your favorites in the real-time area?</p>
<p><strong>Anil Dash:</strong> I wrote a post about this called <a href="http://dashes.com/anil/2009/07/the-pushbutton-web-realtime-becomes-real.html" target="_blank">The Push-Button Web</a>, where I actually go into this&#8230;</p>
<p><strong>Tish Shute:</strong> Oh yes, great post!</p>
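<p>For readers curious about the mechanics behind the push technologies Anil mentions: in PubSubHubbub, a subscriber POSTs a form-encoded request to a hub, naming the feed (the &#8220;topic&#8221;) and a callback URL where the hub should push new entries. A minimal sketch in Python &#8211; the hub, feed, and callback URLs here are hypothetical placeholders, not real endpoints:</p>

```python
# Sketch of a PubSubHubbub-style subscription request.
# All URLs below are hypothetical placeholders, not real endpoints.
from urllib.parse import urlencode, parse_qs
from urllib.request import Request

def build_subscribe_request(hub_url, topic_url, callback_url):
    """Return the form-encoded POST a subscriber would send to a hub."""
    body = urlencode({
        "hub.mode": "subscribe",       # "unsubscribe" to cancel
        "hub.topic": topic_url,        # the feed being followed
        "hub.callback": callback_url,  # where the hub pushes new entries
        "hub.verify": "async",         # hub confirms the callback out of band
    })
    return Request(hub_url, data=body.encode("utf-8"),
                   headers={"Content-Type": "application/x-www-form-urlencoded"})

req = build_subscribe_request(
    "https://hub.example.com/",
    "https://blog.example.com/feed.xml",
    "https://subscriber.example.com/push-callback",
)
```

<p>Once the hub verifies the callback, new feed entries arrive as HTTP POSTs to that callback URL &#8211; push instead of polling.</p>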
<p><strong>Anil Dash:</strong> I don&#8217;t pick a favorite. I think all of them together will work. I think it&#8217;s similar to how the web itself evolved.</p>
<p>We have a tangle of different related technologies that get abstracted away when you use a browser. You don&#8217;t know if it&#8217;s a GIF image or a JPEG image when you browse a page. You just know it&#8217;s showing an image inline.</p>
<p>I think we&#8217;re going to see the same thing happen to the real-time web. We&#8217;re going to very, very quickly settle into a stack of technologies that let us do real time. As a developer you might have to be aware of the subtle differences. As a user, your experience is going to be, &#8220;I have real time and it works on whatever device I&#8217;m on.&#8221;</p>
<p><strong>Tish Shute:</strong> Mobile seems like a vital part of government 2.0 because it can connect people and their government to their context/public infrastructure/environment that is a shared concern. The open data movement has shown that being able to mash up data and get it delivered in context is a very powerful kind of technology for government 2.0. Right?</p>
<p><strong>Anil Dash:</strong> I don&#8217;t know. I think there&#8217;s just been the &#8220;hello world&#8221; demonstration. I think everybody starts with a train schedule&#8230;</p>
<p><strong>Mark Drapeau:</strong> I was just going to say that everyone is starting with the very low-hanging fruit. The transportation, the crime. It&#8217;s not exactly clear where it&#8217;s going to go, but I think it&#8217;ll go &#8211;</p>
<p><strong>Anil Dash:</strong> I remember five years ago when Adrian did Chicagocrime.org. It was a revelation, but I mean, that was five years ago. And people still keep making that app over and over.</p>
<p><span id="ljc1" title="Click to view full content">I remember at the time I had just become friends with Craig Newmark and I said, â€œCraigâ€™s List should show the crime around the neighborhoods where you have an apartment listing.â€ And he said, â€œWell, if I do that then neighbourhoods that are getting better, that h</span><span id="suj7" title="Click to view full content">istorically had more crime, will never improve because people wonâ€™t rent apartments there.â€ And he came back with that answer immediately as soon as I suggested the idea and revealed one, why Craigâ€™s List is the success that it is. But two, what the implications are of releasing data and having to think about the social implications of that.<br />
</span><br />
<span id="jhqo" title="Click to view full content"><strong>Mark Drapeau</strong></span><span id="qmcp" title="Click to view full content">:</span><span id="m9uw" title="Click to view full content"> Well, it’s like Gentry from IDEO said: “Social software changes the community, which changes the software.”</span></p>
<p><strong>Anil Dash</strong><span id="fvyu" title="Click to view full content">:</span><span id="kcdl" title="Click to view full content"> Right. Exactly. We have to think about the social implications of the tools and technology we create. </span></p>
<p><span id="kcdl" title="Click to view full content">That means the reason&#8230;one of the reasons we have only had these frankly unambitious, obvious applications of open data is that the people who have had access thus far are not socially oriented. Geeks are very inwardly focused.</span></p>
<p><strong>Tish Shute: </strong>Oh. Okay. Well, Mark’s changing this.</p>
<p><span id="uef0" title="Click to view full content"><strong>Anil Dash</strong></span><span id="jq7v" title="Click to view full content">:</span><span id="njzb" title="Click to view full content"> They’re in a very insular community.</span></p>
<p><span id="wbkn" title="Click to view full content"><strong>Mark Drapeau</strong></span><span id="lv6c" title="Click to view full content">:</span><span id="lgf8" title="Click to view full content"> I think there’s a number of people that are trying to change that.</span></p>
<p><span id="l4kk" title="Click to view full content"><strong>Anil Dash</strong></span><span id="ods7" title="Click to view full content">:</span><span id="f69t" title="Click to view full content"> Yeah. Itâ€™s starting to change but Iâ€™m saying thatâ€™s why weâ€™ve seen that symptom in the past.<br />
</span><br />
<span id="sen2" title="Click to view full content"><strong>Mark Drapeau</strong></span><span id="r9db" title="Click to view full content">:</span><span id="ib01" title="Click to view full content"> I get a lot of mileage out of the fact that I’m neither a career govie type who’s getting into the 2.0 stuff, nor a lifelong techie who’s getting into the government side. I’m sort of&#8230;I’m interested in these anthropological, psychological, animal-behavioral, ecological questions about human behavior and networking. And that’s where I come into this.<br />
</span><br />
<span id="bjr:" title="Click to view full content"><strong>Anil Dash</strong></span><span id="m5em" title="Click to view full content">:</span><span id="xkxg" title="Click to view full content"> And I think we’re going to need an ethnographic approach to looking at how people work with this data in their real lives. People are using this data already and don’t realize it. You know, when you grab a map in an unfamiliar city you’re using government data. We just don’t think of those behaviors as doing so, and we need to understand that to build applications that really solve people’s problems.</span></p>
<p><strong>Tish Shute</strong><span id="w5nn" title="Click to view full content">:</span><span id="x9qz" title="Click to view full content"> So can you speculate on the next generation you’d like to see?</span></p>
<p><span id="eh9w" title="Click to view full content"><strong>Anil Dash</strong></span><span id="k1av" title="Click to view full content">:</span><span id="ivk8" title="Click to view full content"> I think Foursquare is a good model in terms of having a game dynamic, being mobile by default, having a great social experience, leveraging existing networks like Twitter and Facebook instead of trying to compete with them by building its own.</span></p>
<p><span id="ivk8" title="Click to view full content"> I think those are all really, really smart leanings. I think about, if I were a government agency trying to meet those same goals, could I earn badges in Foursquare by doing things that help my community? Right? So when I volunteer at a soup kitchen, is that one way to earn an exclusive badge? Is that going to earn me a discount at the bar? Those are all dynamics that we can set up very, very easily, and I think that model&#8230;maybe it is a public-private partnership. That’d be great.<br />
<strong><br />
Mark Drapeau:</strong> Or even doing things to help your internal community. Key people at work or within your agency or things like that. From my vantage point it does seem like local Government 2.0 types are thinking much more about mobile than the Federal government types. The reality is government employees all have BlackBerries and they’re running around all the time. But in terms of Government 2.0 type stuff they’re thinking about the Dell desktop they have and the Microsoft Windows system, and whenever I mention something like mobile or pervasive video&#8230;people aren’t really there. They’re worried about cyber security on the traditional systems. They’re worried about desktop applications on a Dell.</span></p>
<p><span id="vtzh" title="Click to view full content"><strong>Anil Dash</strong></span><span id="r1u4" title="Click to view full content">:</span><span id="zpwe" title="Click to view full content"> They’re still five years ago.</span></p>
<p><span id="xl-x" title="Click to view full content"><strong>Mark Drapeau:</strong></span><span id="z-b4" title="Click to view full content"> Yeah. They’re still five years ago, and so I think these kinds of O’Reilly-TechWeb events – Gov 2.0 Expo, Web 2.0 Expo, etc. – are really starting to get at these questions that are now and not five years ago.</span></p>
]]></content:encoded>
			<wfw:commentRss>http://www.ugotrade.com/2009/12/02/web-2-0-meets-gov-2-0-hacking-human-behavior-within-a-city-four-square-momo-13-and-ar-devcamp/feed/</wfw:commentRss>
		<slash:comments>2</slash:comments>
		</item>
		<item>
		<title>Toward the Sentient City: The Future of the Outernet and How to Imagine it?</title>
		<link>http://www.ugotrade.com/2009/11/09/toward-the-sentient-city-the-future-of-the-outernet-and-how-to-imagine-it/</link>
		<comments>http://www.ugotrade.com/2009/11/09/toward-the-sentient-city-the-future-of-the-outernet-and-how-to-imagine-it/#comments</comments>
		<pubDate>Mon, 09 Nov 2009 21:09:00 +0000</pubDate>
		<dc:creator><![CDATA[Tish Shute]]></dc:creator>
				<category><![CDATA[Ambient Devices]]></category>
		<category><![CDATA[Ambient Displays]]></category>
		<category><![CDATA[architecture of participation]]></category>
		<category><![CDATA[Augmented Reality]]></category>
		<category><![CDATA[Carbon Footprint Reduction]]></category>
		<category><![CDATA[digital public space]]></category>
		<category><![CDATA[Ecological Intelligence]]></category>
		<category><![CDATA[Energy Awareness]]></category>
		<category><![CDATA[Instrumenting the World]]></category>
		<category><![CDATA[internet of things]]></category>
		<category><![CDATA[message brokers and sensors]]></category>
		<category><![CDATA[Mixed Reality]]></category>
		<category><![CDATA[mobile augmented reality]]></category>
		<category><![CDATA[mobile meets social]]></category>
		<category><![CDATA[Mobile Reality]]></category>
		<category><![CDATA[Mobile Technology]]></category>
		<category><![CDATA[new urbanism]]></category>
		<category><![CDATA[smart appliances]]></category>
		<category><![CDATA[Smart Devices]]></category>
		<category><![CDATA[Smart Planet]]></category>
		<category><![CDATA[social gaming]]></category>
		<category><![CDATA[social media]]></category>
		<category><![CDATA[sustainable living]]></category>
		<category><![CDATA[sustainable mobility]]></category>
		<category><![CDATA[ubiquitous computing]]></category>
		<category><![CDATA[Web Meets World]]></category>
		<category><![CDATA[websquared]]></category>
		<category><![CDATA[3rd cloud]]></category>
		<category><![CDATA[Adam Greenfield]]></category>
		<category><![CDATA[aesthetics of distributed participation]]></category>
		<category><![CDATA[Amphibious Architecture]]></category>
		<category><![CDATA[architectures of participation]]></category>
		<category><![CDATA[asynchronous city]]></category>
		<category><![CDATA[Benjamin H. Bratton]]></category>
		<category><![CDATA[Breakout!]]></category>
		<category><![CDATA[Conflux 2009]]></category>
		<category><![CDATA[Dan Hill]]></category>
		<category><![CDATA[Dharma Dailey]]></category>
		<category><![CDATA[distributed open AR]]></category>
		<category><![CDATA[Enrique Ramirez]]></category>
		<category><![CDATA[everyware]]></category>
		<category><![CDATA[Google Wave]]></category>
		<category><![CDATA[human electric hybrid]]></category>
		<category><![CDATA[hybrid social networks]]></category>
		<category><![CDATA[Julian Bleeker]]></category>
		<category><![CDATA[Laura Forlano]]></category>
		<category><![CDATA[location aware applications]]></category>
		<category><![CDATA[Mark Shepard]]></category>
		<category><![CDATA[Martijn de Waal]]></category>
		<category><![CDATA[Matthew Fuller]]></category>
		<category><![CDATA[Mimi Zeiger]]></category>
		<category><![CDATA[Natalie Jeremijenko]]></category>
		<category><![CDATA[Natural Fuse]]></category>
		<category><![CDATA[new architectures of participation]]></category>
		<category><![CDATA[Nicolas Nova]]></category>
		<category><![CDATA[Omar Khan]]></category>
		<category><![CDATA[Open AR]]></category>
		<category><![CDATA[outernet]]></category>
		<category><![CDATA[Philip Beesley]]></category>
		<category><![CDATA[real time communication]]></category>
		<category><![CDATA[real time web]]></category>
		<category><![CDATA[real-time database enable city]]></category>
		<category><![CDATA[sensor networks]]></category>
		<category><![CDATA[Sentient City Survival Kit]]></category>
		<category><![CDATA[Situated Technologies]]></category>
		<category><![CDATA[smart things]]></category>
		<category><![CDATA[social mobility]]></category>
		<category><![CDATA[social mobility and the 3rd cloud]]></category>
		<category><![CDATA[synchronous internet of things]]></category>
		<category><![CDATA[The Copenhagen Wheel]]></category>
		<category><![CDATA[The Living Architecture Lab]]></category>
		<category><![CDATA[the social negotiation of Technology]]></category>
		<category><![CDATA[Too Smart City]]></category>
		<category><![CDATA[Toward the Sentient City]]></category>
		<category><![CDATA[Trash Track]]></category>
		<category><![CDATA[urban sustainability]]></category>
		<category><![CDATA[urbanware]]></category>
		<category><![CDATA[Usman Haque]]></category>
		<category><![CDATA[Web Squared]]></category>

		<guid isPermaLink="false">http://www.ugotrade.com/?p=4758</guid>
		<description><![CDATA[Amphibious Architecture &#8211; &#8220;submerges ubiquitous computing into the water—that 90% of the Earth’s inhabitable volume that envelops New York City but remains under-explored and under-engaged.&#8221; Toward the Sentient City brought &#8220;architects and urban designers into a conversation that until now has been limited largely to technologists,&#8221; and created an extraordinary opportunity to investigate distributed architectures [&#8230;]]]></description>
				<content:encoded><![CDATA[<p><a href="http://www.sentientcity.net/exhibit/?p=603" target="_blank"><span id="n.6p" title="Click to view full content"> </span></a><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/11/Screen-shot-2009-11-06-at-12.03.40-AM.png"><img class="alignnone size-medium wp-image-4783" title="Screen shot 2009-11-06 at 12.03.40 AM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/11/Screen-shot-2009-11-06-at-12.03.40-AM-300x200.png" alt="Screen shot 2009-11-06 at 12.03.40 AM" width="300" height="200" /></a><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/11/dhj5mk2g_404g3prc6dc_b.jpg"><img class="alignnone size-medium wp-image-4759" title="dhj5mk2g_404g3prc6dc_b" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/11/dhj5mk2g_404g3prc6dc_b-300x199.jpg" alt="dhj5mk2g_404g3prc6dc_b" width="300" height="199" /></a><br />
<span id="ot:x" title="Click to view full content"> </span></p>
<p><em><a href="http://www.sentientcity.net/exhibit/?p=5" target="_blank"><span id="it_d" title="Click to view full content">Amphibious </span>Architecture</a> &#8211; &#8220;submerges ubiquitous computing into the water—that 90% of the Earth’s inhabitable volume that envelops New York City but remains under-explored and under-engaged.&#8221;</em></p>
<p><a href="http://www.sentientcity.net/exhibit/">Toward the Sentient City</a><span id="ju31" title="Click to view full content"> brought </span>&#8220;architects and urban designers into a conversation that until now has been limited largely to technologists,&#8221; and <span id="hb:z" title="Click to view full content">created an extraordinary opportunity to investigate distributed architectures of participation of what we might call the &#8220;outernet.&#8221; This is a</span><span id="hb:z" title="Click to view full content"> timely conversation, as &#8220;web squared,&#8221; &#8220;smart things,&#8221; the &#8220;internet of things,&#8221; or the &#8220;outernet,&#8221;</span><span id="g6ad" title="Click to view full content"> and their popular &#8220;ambassador,&#8221; augmented reality, are rapidly becoming everyone&#8217;s &#8220;business.&#8221;</span><span id="eb9y" title="Click to view full content"> From </span><span id="b265" title="Click to view full content">&#8220;evil&#8221; marketers to global corporations, </span><span id="sq48" title="Click to view full content">environmentalists, artists, and community activists &#8211; everyone, it seems, is</span><span id="mqn_" title="Click to view full content"> interested in the possibilities of this new frontier.</span></p>
<p><span id="ot:x" title="Click to view full content">It is a challenging task to respond to </span><a href="http://www.sentientcity.net/exhibit/">Toward the Sentient City</a><span id="ot:x" title="Click to view full content">, an exhibition whose backdrop includes a series of conversations on Situated Technologies &#8211; published by the Architectural League &#8211; from a circle of people who have been thinking, writing, and speaking on networked urbanism for many years now, including: Adam Greenfield, </span><span id="vjks" title="Click to view full content">Mark Shepard, Matthew Fuller, Usman Haque, Benjamin H. Bratton, Natalie Jeremijenko, Laura Forlano, Dharma Dailey, Philip Beesley, Omar Khan, Julian Bleeker, Nicolas Nova</span><span id="o7yp" title="Click to view full content">. And the exhibition itself has a very thoughtful group of respondents; see posts from: <a href="http://www.sentientcity.net/exhibit/?p=595" target="_blank">Dan Hill</a>, <a href="http://www.sentientcity.net/exhibit/?p=659" target="_blank">Martijn de Waal,</a> <a href="http://www.sentientcity.net/exhibit/?p=622" target="_blank">Enrique Ramirez</a>, and <a href="http://www.sentientcity.net/exhibit/?p=603" target="_blank">Mimi Zeiger.</a></span><a href="http://www.sentientcity.net/exhibit/?p=603" target="_blank"><span id="n.6p" title="Click to view full content"> </span></a></p>
<p>But one of Toward the Sentient City&#8217;s key accomplishments was to go beyond the rhetorical and to put practical examples out into the world to<span id="ijgh" title="Click to view full content"> organize a discussion on some of the ideas and possibilities of ubiquitous computing that have barely begun to emerge from academic research and entrepreneurial blue-skying. As curator </span><a href="http://www.andinc.org/v3/" target="_blank">Mark Shepard</a><span id="ijgh" title="Click to view full content"> explained:<br />
</span></p>
<p><strong><span id="fqkh" title="Click to view full content">&#8220;The </span></strong><strong><span id="tq6_" title="Click to view full content"><span>aim is to provide concrete examples in the present around which to organize a discussion about just what kind of future we might want. Whether they’re prototypes or not, these commissions are concrete examples. They’re not abstract ideas. And we can go stand next to each other and look at and interact with something which is out there in the world behaving in the way it behaves, performing as it does, and we can then begin to have a discussion about it that is less dependent upon powers of rhetoric.</span> So it’s not about me persuading you about an idea but it’s about us evaluating something that’s living and existing in this world. And that was really the intention of the show.&#8221;</span></strong></p>
<p><span id="ijgh" title="Click to view full content">The commissioned works </span><span id="d4-:" title="Click to view full content">&#8211; <a href="http://www.sentientcity.net/exhibit/?p=5" target="_blank">Amphibious Arc</a></span><span id="d4-:" title="Click to view full content"><a href="http://www.sentientcity.net/exhibit/?p=5" target="_blank">hitecture</a>, <a href="http://www.sentientcity.net/exhibit/?p=53" target="_blank">Breakout!</a>, <a href="http://www.sentientcity.net/exhibit/?p=43" target="_blank">Natural Fuse</a>, <a href="http://www.sentientcity.net/exhibit/?p=59" target="_blank">Too Smart City</a>, and <a href="http://www.sentientcity.net/exhibit/?p=31" target="_blank">TrashTrack</a> &#8211; </span><span id="xnxp" title="Click to view full content">that were the hub of Toward the Sentient City&#8217;s </span><span id="g.08" title="Click to view full content">events, themes, and texts provided a unique glimpse</span><span id="j-jh" title="Click to view full content"> at </span><span id="pa9i" title="Click to view full content">some of the possible dystopian and utopian futures of a &#8220;smart&#8221; city. But, most importantly, all the works questioned what might be new </span><span id="ijgh" title="Click to view full content">architectures of participation for a sentient city. </span></p>
<h3>New Architectures of Participation: Hybrid Social Networks with Human and Non-human Participants</h3>
<p>Of the five works, Amphibious Architecture and Natural Fuse were particularly fascinating to me because they explored the possibilities of sensor networks to create new forms of distributed participation in networked ecosystems that connected the experiences and trajectories of human and non-human actors &#8211; fish, plants, and people.</p>
<p>Both Amphibious Architecture and &#8220;Natural Fuse&#8221; &#8211; from Usman Haque and <a href="http://www.haque.co.uk/" target="_blank">Haque Design + Research</a> &#8211; gave exhibition attendees the chance to experience at a personal level our relationships with our non-human neighbors.</p>
<p><a href="http://www.sentientcity.net/exhibit/?p=5" target="_blank"><span id="it_d" title="Click to view full content">Amphibious </span>Architecture</a>, from the Living Architecture Lab at Columbia University Graduate School of Architecture, Planning and Preservation (directors David Benjamin and Soo-in Yang) and Natalie Jeremijenko of the Environmental Health Clinic at New York University, <span id="w.m9" title="Click to view full content">used a sensor array to &#8220;pierce the reflective </span><span id="ud4u" title="Click to view full content">surface of the water&#8221; that</span> separates us from the underwater ecosystem below. <span id="kfwr" title="Click to view full content">The sensor arrays just below the surface of the East River and a floating light array</span> (see picture on left opening this post) create a new interface between people and fish, whose movements and water quality are transmitted in light.</p>
<p>One could also SMS the fish, and the single beaver that lives in the rivers surrounding NYC, to find out the conditions they were experiencing.<span id="cehj" title="Click to view full content"> But t</span><span id="y9m6" title="Click to view full content">urning the city’s &#8220;back stories,&#8221; like the movements of &#8220;Yo beaver&#8221; and the oxygen levels and water quality of the rivers, into &#8220;fore stories&#8221; is only one of the many ways Natalie Jeremijenko explores how we can engender the empathy necessary for humans and non-humans to live in harmony and mutual benefit.</span></p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/11/nataliefishandmicrochips.jpg"><img class="alignnone size-medium wp-image-4802" title="nataliefishandmicrochips" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/11/nataliefishandmicrochips-300x199.jpg" alt="nataliefishandmicrochips" width="300" height="199" /></a><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/11/fishfoodpost.jpg"><img class="alignnone size-medium wp-image-4803" title="fishfoodpost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/11/fishfoodpost-300x199.jpg" alt="fishfoodpost" width="300" height="199" /></a></p>
<p><span id="y9m6" title="Click to view full content"> </span>Toward the Sentient City also held workshops/presentations in conjunction with <a href="http://confluxfestival.org/2009/" target="_blank">Conflux 2009</a>. After her Conflux presentation, Natalie Jeremijenko of Amphibious Architecture (which is also a collaborative project between <a href="http://www.environmentalhealthclinic.net/">xClinic</a>, <a href="http://www.thelivingnewyork.com/">The Living</a><span id="wz9v" title="Click to view full content">, </span>&#8220;and other intelligent creatures on the East River&#8221;) invited participants to enjoy a lunch of cross-species foods at the East River site.<span id="k2u." title="Click to view full content"> </span></p>
<p><span id="k2u." title="Click to view full content">The cross-species lunch takes</span><span id="x0h." title="Click to view full content"> an existing interaction pattern through which people and fish are already communicating &#8211; </span><span id="tkk5" title="Click to view full content">people going to the river, the waterfront, and feeding the fish</span><span id="vct4" title="Click to view full content"> Wonder Bread (which is bad for humans and fish) &#8211; and transforms this desire to feed the fish into something which can actually remove mercury from the fish and our bodies by removing it from the food chain. A previously inharmonious connection between people and fish is redirected into a productive interaction benefiting both species. As it turns out, food that is good for fish (see pictures above), and removes mercury from their bodies, can also be nutritious and tasty for humans. </span></p>
<p><a href="http://www.sentientcity.net/exhibit/?p=43" target="_blank">Natural Fuse</a>, from team members Usman Haque (creative director), Nitipak &#8216;Dot&#8217; Samsen (designer), Ai Hasegawa (designer), Cesar Harada (designer), and Barbara Jasinowicz (producer), used sensors to<span id="oenx" title="Click to view full content"> link humans and plants in a network where we are accountable for how our behavior affects others in our ecosystem. </span></p>
<p><span id="oenx" title="Click to view full content">If you brought an ordinary plant to the exhibition, you could take home an electronically assisted plant and become part of a social network of humans and plants. This network of humans and electronically assisted plants is also a carbon sink, and if more energy is consumed than the total number of plants in the social network can offset, plants begin to die, giving immediate feedback and consequences for being greedy about energy consumption. </span><span id="ijgh" title="Click to view full content">For more about joining the Natural Fuse network, see<a href="http://www.naturalfuse.org" target="_blank"> here.</a><br />
</span></p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/11/naturalfusepres.jpg"><img title="naturalfusepres" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/11/naturalfusepres-300x199.jpg" alt="naturalfusepres" width="300" height="199" /></a><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/11/naturalfusetakehome.jpg"><img title="naturalfusetakehome" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/11/naturalfusetakehome-300x199.jpg" alt="naturalfusetakehome" width="300" height="199" /></a></p>
<p><span id="pa9i" title="Click to view full content"> </span><span id="w-ed" title="Click to view full content">We are in the pre-dawn of sensor networks like those Natural Fuse and Amphibious Architecture created &#8211; social</span><span id="n.6p" title="Click to view full content"> networks that link human and non-human participants in entirely new ways are still largely uncharted territory. </span><span id="o7yp" title="Click to view full content">(Note: t</span><span id="zr9t" title="Click to view full content">he upcoming <a href="http://www.situatedtechnologies.net/" target="_blank">Situated Technologies</a> Pamphlet 6</span><span id="ijgh" title="Click to view full content"> &#8211; <strong>&#8220;Micro Public Places,&#8221; </strong>by Marc Bohlen and Hans Frei &#8211; indicates it will continue the journey with an investigation of &#8220;transparent and distributed participation.&#8221;)</span></p>
<h3>Where Does the Social Negotiation of Technology Happen?</h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/11/markshepardpost.jpg"><img class="alignnone size-medium wp-image-4825" title="markshepardpost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/11/markshepardpost-199x300.jpg" alt="markshepardpost" width="199" height="300" /></a></p>
<p>Frequent questions that came up at the presentations given by the teams that produced the commissioned works were: Does this idea scale? Does it close the loop, in that you <span>get answers to the questions asked? How does the conversation gain agency? And where does the social negotiation of technology happen? (These last two questions were asked by <a href="http://www.orangecone.com/" target="_blank">Mike Kuniavsky</a> at Mark Shepard&#8217;s presentation at Conflux: &#8220;</span><a id="ktb-" title="Sentient City Survival Kit" href="http://survival.sentientcity.net/" target="_blank"><span>Sentient City Survival Kit</span></a><span>&#8221; &#8211; see picture above.) I think it is fair to say that these questions for the most part remain unanswered. But Toward the Sentient City was alive with ideas and practical examples about ways we can explore these questions more deeply.</span></p>
<p><span id="oenx" title="Click to view full content">Usman Haque, in response to the question &#8220;Does this experiment scale?&#8221;, replied:</span></p>
<p><strong>&#8220;It would, but at an individual level, because it has to remain at the individual level. It is about the individual in relationship to the wider social context. As opposed to building a forest to offset a city, it is about each individual making choices of their own about what they do, and having some kind of knowledge about the effect they are having on other people. Because most of the time we are quite complacent &#8211; we are able to do whatever we want because we are not necessarily aware how our intrusions affect both human and non-human neighbors&#8230;&#8221;</strong></p>
<p>So how does this close the loop? Usman explains that one of the key aspects for him is that if you do take home a plant, you become part of a system in which you are no longer anonymous, and if a plant is threatened (plants get three lives) you have the opportunity to email the person in the system who has threatened your plant. Usman noted that one of the interesting things that happened in the context of the exhibition, where there was a single unit, was that 90% of the time people switched it on to selfish mode &#8211; presumably because they were anonymous. Another aspect of Natural Fuse that raises interesting questions is that as more people decide to join the network, the risk of a plant being harmed by any particular individual&#8217;s selfishness lessens. As <a href="http://www.sentientcity.net/exhibit/?p=659" target="_blank">Martijn de Waal</a>,<span id="gi2_" title="Click to view full content"> in his response that unpacks some of the deeper philosophical, epistemological, and ethical questions that Natural Fuse addresses, observes:</span></p>
<p><strong>&#8220;The concept of a commons thus assumes cooperation and mutual accommodation. Could Sentient Technology play a role in the allocation of limited resources between citizens? Could it lead to the emergence of some sort of peer-to-peer governance model, that could prevent overusage of scarce resources?&#8221;</strong></p>
<h3><strong>New Aesthetics of Distributed Participation</strong></h3>
<p><span id="nqx:" title="Click to view full content">The works of </span><span id="nqx:" title="Click to view full content"><span>&#8220;Toward the Sentient City&#8221; point to possibilities for a new aesthetics of distributed participation in which users and system are no longer separated but instead &#8220;develop joint forms of observing and knowing that neither [...] is capable on its own&#8221; (quote from the upcoming <a href="http://www.situatedtechnologies.net/" target="_blank">Situated Technologies Pamphlets</a></span> 6: Micro Public Places, by Marc Bohlen and Hans Frei). Natural Fuse and Amphibious Architecture examine the new transactional realities of the Sentient City.</span></p>
<p><span id="po-s" title="Click to view full content"> But there are many questions left unanswered. We know a lot about the power of generativity from the </span>internet (see Zittrain) &#8211; the ur-<strong>&#8220;architecture of participation.&#8221;</strong> <span id="hri-" title="Click to view full content">As Zittrain points out, the &#8220;generativity&#8221; of the internet is &#8220;the engine that has catapulted the internet from backwater to ubiquity.&#8221; </span> Tim O&#8217;Reilly coined the phrase &#8220;architecture of participation&#8221; to &#8220;describe the nature of systems that are designed for user contribution,&#8221;<span id="o7et" title="Click to view full content"> such that &#8220;participants extend the reach/increase the value of the system.&#8221; But as Tim O&#8217;Reilly put it in his recent talk, &#8220;<a href="http://www.slideshare.net/timoreilly/state-of-the-internet-operating-system" target="_blank">State of the Internet Operating System</a>&#8221;:</span></p>
<p><span title="Click to view full content"><strong>&#8220;Web 2.0 is about finding meaning in user-generated data, and turning that meaning into real-time user-facing services. &#8220;Web Squared&#8221; takes that same concept to real-time sensor data.&#8221;</strong><br />
</span></p>
<p><span id="o7et" title="Click to view full content">We know little yet about what constitutes generativity for the &#8220;outernet,&#8221; particularly for the kind of hybrid social networks that Natural Fuse and Amphibious Architecture present. Social networks that connect people and place, humans and non-humans, challenge dichotomies of man and nature, and of machine and user, in new and unexpected ways.</span></p>
<p>At the moment, the internet is going through a metamorphosis with the emergence of real-time technologies like XMPP, PubSubHubbub, and Google Wave, and the coming of age of mobile computing. While these shifts were not investigated specifically in any of the commissioned works, I think all the works raised the question: what is a common platform for social interaction in the &#8220;outernet,&#8221; or sentient city? I was not entirely satisfied, from this point of view, with a web interface for Natural Fuse or SMS as a mobile interface for Amphibious Architecture.</p>
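<p>Since PubSubHubbub comes up here as one of those real-time building blocks, a minimal sketch may help make it concrete. The protocol is push rather than poll: a subscriber POSTs a form-encoded subscription request to a hub, and the hub verifies intent by calling the subscriber back with a challenge to echo. The Python below is purely illustrative (the hub, feed, and callback URLs are invented, and a real subscriber would serve <code>verify_intent</code> behind an HTTP endpoint), not the code of any project mentioned in this post:</p>

```python
from urllib.parse import urlencode, parse_qs

def build_subscribe_request(hub_url, topic_url, callback_url):
    """Form-encoded body a subscriber POSTs to a PubSubHubbub hub."""
    body = urlencode({
        "hub.mode": "subscribe",       # or "unsubscribe"
        "hub.topic": topic_url,        # the feed we want pushed to us
        "hub.callback": callback_url,  # where the hub will deliver updates
    })
    return hub_url, body

def verify_intent(query_string, expected_topic):
    """Handle the hub's verification GET: echo hub.challenge only if the
    request matches a subscription we actually asked for."""
    params = parse_qs(query_string)
    if (params.get("hub.mode") == ["subscribe"]
            and params.get("hub.topic") == [expected_topic]):
        return params["hub.challenge"][0]  # return as the 200 OK body
    return None                            # refuse: respond with 404
```

<p>Once verified, the hub POSTs new entries of the topic feed to the callback as they are published, which is what makes the real-time web feel instantaneous compared with polling.</p>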
<p><a href="http://www.media.mit.edu/people/dpreed" target="_blank">David P. Reed</a> points to the relationship between social mobility, what he describes as the 3rd cloud, and the need for a common platform (see <a href="http://www.slideshare.net/venicesessions/david-reed-social-mobility-and-the-3rd-cloud" target="_blank">David Reed &#8211; Social Mobility and the 3rd Cloud</a>; hat tip to <a href="http://twitter.com/srenan" target="_blank">@srenan</a> for pointing me to David&#8217;s presentation).</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/11/Screen-shot-2009-11-06-at-11.11.25-PM.png"><img class="alignnone size-medium wp-image-4826" title="Screen shot 2009-11-06 at 11.11.25 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/11/Screen-shot-2009-11-06-at-11.11.25-PM-300x222.png" alt="Screen shot 2009-11-06 at 11.11.25 PM" width="300" height="222" /></a><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/11/Screen-shot-2009-11-06-at-11.16.59-PM1.png"><img class="alignnone size-medium wp-image-4828" title="Screen shot 2009-11-06 at 11.16.59 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/11/Screen-shot-2009-11-06-at-11.16.59-PM1-300x222.png" alt="Screen shot 2009-11-06 at 11.16.59 PM" width="300" height="222" /></a></p>
<p><em>Slides above are from David P. Reed&#8217;s presentation, <a href="http://www.slideshare.net/venicesessions/david-reed-social-mobility-and-the-3rd-cloud" target="_blank">Social Mobility and the 3rd Cloud</a></em></p>
<p>What is an architecture of participation for mobile, social interaction? This is something I am very interested in.</p>
<p>Recently I began a project with a small group of augmented reality developers and enthusiasts to use the Google Wave Federation Protocol as a transport system for open, distributed social augmented experiences (lots more to come on this soon &#8211; you can see the back story in my posts <a href="http://www.ugotrade.com/2009/10/13/ar-wave-layers-and-channels-of-social-augmented-experiences/" target="_blank">here</a> and <a href="http://www.ugotrade.com/2009/09/26/total-immersion-and-the-transfigured-city-shared-augmented-realities-the-web-squared-era-and-google-wave/" target="_blank">here</a>). Wave has introduced an open, federated architecture of participation that combines asynchronous &amp; synchronous data, bringing together the advantages of real-time communication with the persistent hosting of collaborative data (like wikis).</p>
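<p>To make that combination of real-time and persistent data concrete, here is a minimal, purely illustrative sketch (in Python, and emphatically <em>not</em> the actual Wave Federation Protocol): a &#8220;wavelet&#8221; modeled as an append-only operation log. Clients append operations as they happen, and late joiners replay the persisted log to rebuild state &#8211; the same property that makes Wave&#8217;s playback feature possible. All names here (<code>Op</code>, <code>Wavelet</code>, the example participants) are hypothetical.</p>

```python
from dataclasses import dataclass, field

@dataclass
class Op:
    """One operation in the log, e.g. an AR annotation placed at a location."""
    version: int   # server-assigned sequence number
    author: str    # participant who submitted the op
    payload: dict  # e.g. {"lat": ..., "lon": ..., "text": ...}

@dataclass
class Wavelet:
    """A hypothetical shared document: an append-only, replayable op log."""
    ops: list = field(default_factory=list)

    def append(self, author, payload):
        # Synchronous side: accept an op in real time and persist it in order.
        op = Op(version=len(self.ops) + 1, author=author, payload=payload)
        self.ops.append(op)
        return op

    def playback(self, up_to=None):
        # Asynchronous side: replay the persisted log so a late joiner
        # (or a "playback" viewer) can rebuild any past state.
        return [op for op in self.ops if up_to is None or op.version <= up_to]

w = Wavelet()
w.append("tish@example.org", {"lat": 40.74, "lon": -73.99, "text": "AR note"})
w.append("dirk@example.org", {"lat": 52.37, "lon": 4.90, "text": "reply"})
assert [op.author for op in w.playback()] == ["tish@example.org", "dirk@example.org"]
```

<p>The point of the sketch is the design choice, not the code: because every contribution is a versioned operation rather than an overwrite, the same log serves live collaboration and after-the-fact replay.</p>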
<p>Augmented Reality puts who you are, where you are, and what you are doing center stage, and is an interface for &#8220;communications embedded in context&#8221; and &#8220;enabled by identity&#8221; &#8211; two key qualities of what David P. Reed calls the 3rd cloud. An open, distributed framework for augmented reality could create an interconnected sense of AR, one that fuses augmentation, data overlays, and varied media with location/time/place and, crucially, social networking. Such an interface would open up many possibilities for new transactional realities that could integrate real-time, cloud-based data with a human perspective and social networking. I am using the term transactional realities to suggest an extension into social augmented experiences of what Di-Ann Eisnor, <a id="s050" title="Platial" href="http://www.platial.com/">Platial</a>, describes as &#8220;transactional cartography&#8221; &#8211; &#8220;the movement from map providing entertainment/information to map as enabling action&#8221; (see <a id="h6.r" title="Human as Sensors" href="http://www.youtube.com/watch?v=Di285pgcZRE&amp;feature=PlayList&amp;p=F664D8C553A57C93&amp;index=3">Human as Sensors</a>).</p>
<p>We have only just gotten a glimpse of how real-time technologies and &#8220;communications embedded in context&#8221; will transform social interaction and our cities. This post on <a id="r3ow" title="Writing as Real-Time Performance" href="http://snarkmarket.com/2009/3605">Writing as Real-Time Performance</a>, which looks at the Google Wave playback feature, is a brilliant example of how real-time technology turns familiar practices like writing inside out, and catapults us into new time trajectories. And, if you haven&#8217;t already seen Matt Jones of BERG&#8217;s brilliant look at <a href="http://berglondon.com/blog/2009/10/26/all-the-time-in-the-world-talk-at-design-by-fire-2009-utrecht/" target="_blank">&#8220;All the time in the world&#8221;</a> &#8211; from the &#8220;soft time&#8221; and &#8220;squishy time&#8221; of cell phone culture to their antecedents in real-time computing &#8211; go now! Also see Dan Hill&#8217;s work on <a href="http://cityofsound.com" target="_blank">&#8220;time based notation,&#8221;</a> and Tom Carden&#8217;s work for mysociety.org.</p>
<h3>Transactional Realities Between the &#8220;Asynchronous City&#8221; and the &#8220;Synchronous Internet of Things&#8221;</h3>
<p>Out of Toward the Sentient City&#8217;s five commissioned works, only <a href="http://www.sentientcity.net/exhibit/?p=31" target="_blank">Trash Track</a> focused on the &#8220;synchronized Internet of Things.&#8221; Trash Track asks what we can learn from the aggregated data streams of &#8220;smart&#8221; trash about the infamous path of trash from cities of privilege to rivers of want, rather than exploring the particular transactional realities of a social network that links people with their trash.</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/11/TrashTrack2.jpg"><img title="TrashTrack2" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/11/TrashTrack2-300x199.jpg" alt="TrashTrack2" width="300" height="199" /></a><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/11/trashtrack4.jpg"><img title="trashtrack4" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/11/trashtrack4-300x199.jpg" alt="trashtrack4" width="300" height="199" /></a></p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/11/trashtrack3.jpg"><img class="alignnone size-medium wp-image-4768" title="trashtrack3" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/11/trashtrack3-300x199.jpg" alt="trashtrack3" width="300" height="199" /></a><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/11/trashtrackpost.jpg"><img class="alignnone size-medium wp-image-4782" title="trashtrackpost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/11/trashtrackpost-300x199.jpg" alt="trashtrackpost" width="300" height="199" /></a></p>
<p>The goals of Trash Track were, as Assaf Biderman explained during his presentation:</p>
<p><strong>&#8220;to learn about the removal chain, to see if knowing more could promote behavioral change, and investigate if smart tagging could one day lead to 100% recycling.&#8221;</strong></p>
<p>The team from SENSEable City Laboratory, MIT included: Carlo Ratti, Director; Assaf Biderman, Associate Director; Rex Britter, Advisor; Stephen Miles, Advisor; Kristian Kloeckl, Project Leader; Musstanser Tinauli, E Roon Kang, Alan Anderson, Avid Boustani, Natalia Duque Ciceri, Lorenzo Davolli, Samantha Earl, Lewis Girod, Sarabjit Kaur, Armin Linke, Eugenio Morello, Sarah Neilson, Giovanni de Niederhausern, Jill Passano, Renato Rinaldi, Francisca Rojas, Louis Sirota, Malima Wolf.</p>
<p>However, in his presentation Assaf also presented another project from SENSEable City Laboratory, in partnership with the City of Copenhagen: <a href="http://senseable.mit.edu/copenhagenwheel/" target="_blank">The Copenhagen Wheel</a>. This project seems to work brilliantly at the intersection of the &#8220;asynchronous city&#8221; (Bleecker and Nova) and the &#8220;synchronized internet of things.&#8221; The &#8220;smart&#8221; wheel &#8211; a low-cost, open source, human-electric hybrid &#8211; is:</p>
<p><strong>&#8220;an electric bicycle wheel that can be easily retrofitted into any regular bicycle and location and environmental sensors which are powered by the bike wheel and in turn provide data for a variety of applications.&#8221;</strong></p>
<p>This project, which aims to promote urban sustainability through smart biking, opens up many possibilities for a bottom-up architecture of participation for the sentient city (<a href="http://senseable.mit.edu/copenhagenwheel/">see video here</a>).</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/11/Screen-shot-2009-11-08-at-7.18.45-PM.png"><img class="alignnone size-medium wp-image-4838" title="Screen shot 2009-11-08 at 7.18.45 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/11/Screen-shot-2009-11-08-at-7.18.45-PM-300x218.png" alt="Screen shot 2009-11-08 at 7.18.45 PM" width="300" height="218" /></a></p>
<p><a href="http://www.andinc.org/v3/" target="_blank">Mark Shepard</a> describes something he calls &#8220;propagative urbanism&#8221;:</p>
<p><strong>&#8220;a way of thinking about shaping the experience of urban space in terms of a bottom-up, participatory approach to the evolution of cities.&#8221; </strong></p>
<p>And, in the most recent pamphlet in the <a href="http://www.situatedtechnologies.net/" target="_blank">Situated Technologies pamphlets series, #5, &#8220;Asynchronicity: Design Fictions for Asynchronous Urban Computing,&#8221;</a> Julian Bleecker and Nicolas Nova invert the emphasis in the so-called &#8220;real-time, database-enabled city&#8221; with its synchronized Internet of Things, and speculate on the existence of an &#8220;asynchronous city.&#8221; They &#8220;forecast situated technologies based on weak signals that show the importance of time on human perspectives.&#8221; They ask:</p>
<p><strong>&#8220;why, besides &#8216;operational efficiency,&#8217; would we want a ubiquitously computed environment? What are the measures of &#8216;better&#8217; that we want to count as meaningful?&#8221;</strong></p>
<p><span>They explain:</span></p>
<p><strong>&#8220;&#8230;we are trying to think through what urbanwares might be &#8211; urban operating systems &#8211; if they were less about synchronization, top-down construction and connected channels of information and databases and so forth, and more about asynchronized, decentralized things. Software, data, time out of alignment, incongruities, tiles and imbrications of the geographic, spatial parameters into a delicious kind of lively peasant&#8217;s stew.&#8221;</strong></p>
<p>One takeaway, perhaps, from Toward the Sentient City is that it&#8217;s at the intersection of the &#8220;asynchronous city&#8221; and the &#8220;real-time, database-enabled city&#8221; where many new transactional realities of the sentient city will arise.</p>
]]></content:encoded>
			<wfw:commentRss>http://www.ugotrade.com/2009/11/09/toward-the-sentient-city-the-future-of-the-outernet-and-how-to-imagine-it/feed/</wfw:commentRss>
		<slash:comments>2</slash:comments>
		</item>
		<item>
		<title>ISMAR 2009: An Augmented Reality &#8220;Top Chef&#8221; Coopetition</title>
		<link>http://www.ugotrade.com/2009/10/24/ismar-2009-an-augmented-reality-top-chef-coopetition/</link>
		<comments>http://www.ugotrade.com/2009/10/24/ismar-2009-an-augmented-reality-top-chef-coopetition/#comments</comments>
		<pubDate>Sat, 24 Oct 2009 22:26:42 +0000</pubDate>
		<dc:creator><![CDATA[Tish Shute]]></dc:creator>
				<category><![CDATA[Android]]></category>
		<category><![CDATA[architecture of participation]]></category>
		<category><![CDATA[Augmented Reality]]></category>
		<category><![CDATA[Carbon Footprint Reduction]]></category>
		<category><![CDATA[digital public space]]></category>
		<category><![CDATA[Ecological Intelligence]]></category>
		<category><![CDATA[Energy Awareness]]></category>
		<category><![CDATA[Instrumenting the World]]></category>
		<category><![CDATA[internet of things]]></category>
		<category><![CDATA[iphone]]></category>
		<category><![CDATA[message brokers and sensors]]></category>
		<category><![CDATA[Mixed Reality]]></category>
		<category><![CDATA[mobile augmented reality]]></category>
		<category><![CDATA[mobile meets social]]></category>
		<category><![CDATA[Mobile Reality]]></category>
		<category><![CDATA[new urbanism]]></category>
		<category><![CDATA[Paticipatory Culture]]></category>
		<category><![CDATA[Smart Devices]]></category>
		<category><![CDATA[Smart Planet]]></category>
		<category><![CDATA[sustainable living]]></category>
		<category><![CDATA[sustainable mobility]]></category>
		<category><![CDATA[ubiquitous computing]]></category>
		<category><![CDATA[Web 2.0]]></category>
		<category><![CDATA[Web Meets World]]></category>
		<category><![CDATA[websquared]]></category>
		<category><![CDATA[World 2.0]]></category>
		<category><![CDATA[Acrossair]]></category>
		<category><![CDATA[AR Sketch]]></category>
		<category><![CDATA[AR Wave]]></category>
		<category><![CDATA[arduino]]></category>
		<category><![CDATA[ARhrrr]]></category>
		<category><![CDATA[augmented reality at VW]]></category>
		<category><![CDATA[avatars and people together in physical spaces]]></category>
		<category><![CDATA[Avilus]]></category>
		<category><![CDATA[Blair Macintyre]]></category>
		<category><![CDATA[Chetan Damani]]></category>
		<category><![CDATA[Christine Perey]]></category>
		<category><![CDATA[cloud computing]]></category>
		<category><![CDATA[Dirk Groten]]></category>
		<category><![CDATA[distributed computing]]></category>
		<category><![CDATA[eyewear for augmented reality]]></category>
		<category><![CDATA[geoAR]]></category>
		<category><![CDATA[Georg Klein]]></category>
		<category><![CDATA[Google Wave]]></category>
		<category><![CDATA[Green Tech AR Competition]]></category>
		<category><![CDATA[HMDs]]></category>
		<category><![CDATA[Humans as Sensors]]></category>
		<category><![CDATA[industrial augmented reality]]></category>
		<category><![CDATA[Institut Graphische Datenverarbeitung]]></category>
		<category><![CDATA[ISMAR 2009]]></category>
		<category><![CDATA[ISMAR 2010]]></category>
		<category><![CDATA[ISMAR09]]></category>
		<category><![CDATA[Jay Wright]]></category>
		<category><![CDATA[Joe Ludwig]]></category>
		<category><![CDATA[Junaio]]></category>
		<category><![CDATA[Layar]]></category>
		<category><![CDATA[Mark Billinghurst]]></category>
		<category><![CDATA[Markus Tripp]]></category>
		<category><![CDATA[Metaio]]></category>
		<category><![CDATA[Michael Goesele]]></category>
		<category><![CDATA[Microsoft and augmented reality]]></category>
		<category><![CDATA[Mobile Monday]]></category>
		<category><![CDATA[Mobilizy]]></category>
		<category><![CDATA[MoMo]]></category>
		<category><![CDATA[Noah Zerking]]></category>
		<category><![CDATA[Noora Guldemond]]></category>
		<category><![CDATA[Ogmento]]></category>
		<category><![CDATA[open distributed AR]]></category>
		<category><![CDATA[open hardware]]></category>
		<category><![CDATA[Ori Inbar]]></category>
		<category><![CDATA[participatory sensing]]></category>
		<category><![CDATA[Pattie Maes]]></category>
		<category><![CDATA[Peter Meier]]></category>
		<category><![CDATA[Platial]]></category>
		<category><![CDATA[PTAM on an iphone]]></category>
		<category><![CDATA[Put a Spell. Thomas Carpenter]]></category>
		<category><![CDATA[RoomWare]]></category>
		<category><![CDATA[Sean White]]></category>
		<category><![CDATA[sensor networks]]></category>
		<category><![CDATA[smart phones]]></category>
		<category><![CDATA[social augmented experiences]]></category>
		<category><![CDATA[social augmented realities]]></category>
		<category><![CDATA[standards for augmented reality]]></category>
		<category><![CDATA[Steven Feiner]]></category>
		<category><![CDATA[Technische Universitat Munchen]]></category>
		<category><![CDATA[The RoomWare Project]]></category>
		<category><![CDATA[The Zerkin Glove]]></category>
		<category><![CDATA[tracking and mapping in mobile augmented reality]]></category>
		<category><![CDATA[transactional cartography]]></category>
		<category><![CDATA[ubicomp]]></category>
		<category><![CDATA[Vernor Vinge]]></category>
		<category><![CDATA[virtual pets]]></category>
		<category><![CDATA[Volkswagen augmented reality group]]></category>
		<category><![CDATA[Vuzix]]></category>
		<category><![CDATA[Wave]]></category>
		<category><![CDATA[Wave enabled augmented reality]]></category>
		<category><![CDATA[Web 2.0 Summit]]></category>
		<category><![CDATA[Yuri van Geest]]></category>

		<guid isPermaLink="false">http://www.ugotrade.com/?p=4670</guid>
		<description><![CDATA[ISMAR 2009 was an extraordinary mix of high geek, academic eminence, gung-ho Dutch Cowboy entrepreneurial spirit, German engineering and industry, brilliant artistry, and invention, all fueled by a sense, and a very active presence in the case of Diamond Sponsor &#8211; Qualcomm, that the big technology players are waking up to augmented reality. In [&#8230;]]]></description>
				<content:encoded><![CDATA[<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/MetaioLayarpost.jpg"><img class="alignnone size-medium wp-image-4674" title="Metaio&amp;Layarpost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/MetaioLayarpost-300x199.jpg" alt="Metaio&amp;Layarpost" width="300" height="199" /></a><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/DirkseesDirkonJunaiopost.jpg"><img class="alignnone size-medium wp-image-4676" title="DirkseesDirkonJunaiopost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/DirkseesDirkonJunaiopost-300x199.jpg" alt="DirkseesDirkonJunaiopost" width="300" height="199" /></a></p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/dirkwatchesdirkvcupost.jpg"><img class="alignnone size-medium wp-image-4675" title="dirkwatchesdirkvcupost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/dirkwatchesdirkvcupost-300x199.jpg" alt="dirkwatchesdirkvcupost" width="300" height="199" /></a><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/metaiodinasaurpost.jpg"><img class="alignnone size-medium wp-image-4678" title="metaiodinasaurpost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/metaiodinasaurpost-299x201.jpg" alt="metaiodinasaurpost" width="299" height="201" /></a></p>
<p><a href="http://www.ismar09.org/" target="_blank">ISMAR 2009</a> was an extraordinary mix of high geek, academic eminence, gung-ho Dutch Cowboy entrepreneurial spirit, German engineering and industry, brilliant artistry, and invention, all fueled by a sense, and a very active presence in the case of Diamond Sponsor &#8211; Qualcomm, that the big technology players are waking up to augmented reality.</p>
<p>In the picture sequence above (click on photos to enlarge), <a href="http://twitter.com/metaioUS" target="_blank">Noora Guldemond</a>, <a href="http://www.metaio.com/" target="_blank">Metaio</a>, demonstrates <a href="http://www.junaio.com/" target="_blank">Junaio</a> (coming to an iPhone near you Nov 2nd) to <a href="http://twitter.com/dirkgroten" target="_blank">Dirk Groten</a>, CTO of <a href="http://layar.com/" target="_blank">Layar</a> (top left photo). One of the nice social features of Junaio is that users can share the 3D augmented scenes they have created. Noora is demoing this capability to Dirk, and as you can see he cracks up when he sees the scene Noora has stored on her phone. Dirk and I both recognize that this cute little dinosaur augmentation (close-up above on bottom left) must have been created by <a href="http://www.metaio.com/company/" target="_blank">Peter Meier, CTO of Metaio</a>, during the Interoperability and Standards workshop earlier that day. Metaio, it seems, were discussing standards while enjoying some 3D augmented back chat.</p>
<p>Both Dirk and I were active participants in the workshop too. But little did we know that Peter Meier had introduced his little 3D dinosaur into our discussion while we diligently, and sometimes heatedly, debated the merits of XMPP, Wave Federation Protocol, KML, ARML, VRML, X3D, and more! The photo I took is on the bottom right of the four pics above. It was probably taken very shortly after Peter&#8217;s augmented Junaio scene. Of course there is no little dinosaur in my pic of Dirk Groten with <a href="http://twitter.com/JoeLudwig" target="_blank">Joe Ludwig</a> and <a href="http://twitter.com/markustripp" target="_blank">Markus Tripp of Mobilizy</a>, who were discussing AR standards oblivious to Peter&#8217;s virtual pet in our midst.</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/MarkusTrippPeterMeier.jpg"><img class="alignnone size-medium wp-image-4685" title="MarkusTrippPeterMeier" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/MarkusTrippPeterMeier-300x199.jpg" alt="MarkusTrippPeterMeier" width="300" height="199" /></a><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/Thereisawillingnesstostandardizepost.jpg"><img class="alignnone size-medium wp-image-4686" title="Thereisawillingnesstostandardizepost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/Thereisawillingnesstostandardizepost-300x199.jpg" alt="Thereisawillingnesstostandardizepost" width="300" height="199" /></a></p>
<p>I must say I had noticed an impish look on Peter Meier&#8217;s face (see photo above on the left &#8211; Peter is wearing glasses and holding a phone). And Markus Tripp of Mobilizy revealed a little bit of gaming of his own, when he let out that, in part, ARML is a provocation. But Peter was clearly unfazed and enjoying himself. Dirk, tasked to summarize our discussion, stalwartly maintained an optimistic but serious tone fitting for a standards discussion: &#8220;There is a willingness to standardize&#8230;,&#8221; he began (pic above on the right &#8211; click to enlarge and read text).</p>
<p>But it was a little 3D dinosaur that, perhaps appropriately, had the last laugh. Fitting, as I am not sure whether anything anyone says about AR standards at the moment will hold up. But, as Ori commented in <a href="http://gamesalfresco.com/2009/10/23/ismar-2009-epilogue-a-new-augmented-reality-world-order/" target="_blank">his great post &#8211; an epilogue for ISMAR 2009</a>, the vibe was &#8220;Peace and Love&#8221; in AR Browser land (although Chetan Damani of <a href="http://gamesalfresco.com/?s=%22acrossair%22" target="_blank">Acrossair</a> was not in the standards discussion because he attended the UX/content? workshop instead). But as they say, &#8220;all&#8217;s fair in love and war.&#8221; And it is my feeling the games have barely begun! There are many players (<a href="http://www.youtube.com/watch?v=KI4lB00Ht9o&amp;feature=player_embedded#" target="_blank">virtual pets</a> included) waiting in the wings. I met some at ISMAR, and they are just itching to join the fray.</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/ARConsortiumpost2.jpg"><img class="alignnone size-medium wp-image-4701" title="ARConsortiumpost2" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/ARConsortiumpost2-300x188.jpg" alt="ARConsortiumpost2" width="300" height="188" /></a><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/coopetitionpost.jpg"><img class="alignnone size-medium wp-image-4690" title="coopetitionpost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/coopetitionpost-300x185.jpg" alt="coopetitionpost" width="300" height="185" /></a></p>
<p>Ori Inbar, <a href="http://ogmento.com/" target="_blank">Ogmento</a>, and Robert Rice, <a href="http://www.neogence.com/#/home" target="_blank">Neogence Enterprises</a>, both founders of the <a href="http://www.arconsortium.org/" target="_blank">AR Consortium</a>, made great efforts to set our young industry off on the right foot &#8211; in the spirit of <a href="http://en.wikipedia.org/wiki/Coopetition" target="_blank">coopetition</a> (a <a title="Neologism" href="http://en.wikipedia.org/wiki/Neologism">neologism</a> coined to describe <a title="Co-operation" href="http://en.wikipedia.org/wiki/Co-operation">cooperative</a> <a title="Competition" href="http://en.wikipedia.org/wiki/Competition">competition</a>). See <a href="http://curiousraven.squarespace.com/home/2009/10/23/ismar-09-observations-and-comments.html" target="_blank">Curious Raven for Robert&#8217;s conference observations</a>, and <a href="http://gamesalfresco.com/2009/10/23/ismar-2009-epilogue-a-new-augmented-reality-world-order/" target="_blank">Ori&#8217;s post on Games Alfresco</a> for more about Mobile Augmented Reality at ISMAR 2009. The Mobile Augmented Reality Workshops were driven by an indomitable spokesperson for the new AR industry, <a href="http://www.perey.com/" target="_blank">Christine Perey</a>. Christine not only helped motivate discussion on the issue of oxygen to the system, i.e. business value, but was also a very generous connector at the conference.</p>
<h3>What&#8217;s Next From Augmented Reality&#8217;s Top Chefs?</h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/Screen-shot-2009-10-24-at-7.12.35-PM.png"><img class="alignnone size-medium wp-image-4692" title="Screen shot 2009-10-24 at 7.12.35 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/Screen-shot-2009-10-24-at-7.12.35-PM-300x196.png" alt="Screen shot 2009-10-24 at 7.12.35 PM" width="300" height="196" /></a></p>
<p>As Ori pointed out, <a href="http://www.imdb.com/name/nm0218033/" target="_blank">Kent Demaine</a>, <a href="http://www.ooo-ii.com/" target="_blank">oooii</a> (pic above is from the oooii web site), the Minority Report VFX designer, was hanging out at ISMAR 2009, and he came to the panel I was on: &#8220;Augmented Reality in Sports, Entertainment and Advertising.&#8221; We chatted afterwards about instrumented environments and how they are such a key to developing interesting augmented experiences. Also, I mentioned how back in the day I was involved in some of the early development of motion control software. And it was great to hear Kent say they still find motion control cool at <a href="http://www.ooo-ii.com/" target="_blank">oooii</a>. As Ori notes, he is the &#8220;guy with the most enviable AR credentials in the world (the guy who designed VFX for Minority Report),&#8221; and <a href="http://www.ooo-ii.com/" target="_blank">oooii</a> is busy and hiring.</p>
<p>One of the highlights of the Arts, Media and Humanities track for me was meeting <a href="http://jarrellpair.com/" target="_blank">Jarrell Pair</a>. He really brought the best out in panelists with his well-tuned questions. The recording of ISMAR was comprehensive and videos should be up next week. I will post the slides of my presentation, &#8220;The Next Wave of AR: Shared Augmented Realities and Remix Culture,&#8221; on Ugotrade.</p>
<h3>&#8220;Mixed and Augmented Reality: &#8216;Scary and Wondrous&#8217;&#8221; &#8211; <a href="http://en.wikipedia.org/wiki/Vernor_Vinge" target="_blank">Vernor Vinge</a></h3>
<p><strong>&#8220;Imagine an environment where most physical objects know where they are, what they are, and can (in principle) network with any other object. With this infrastructure, reality becomes its own database. Multiple consensual virtual environments are possible, each oriented to the needs of its constituency. If we also have open standards, then bottom-up social networks and even bottom-up advertising become possible. Now imagine that in addition to sensors, many of these itsy-bitsy processors are equipped with effectors. Then the physical world becomes much more like a software construct. The possibilities are both scary and wondrous.&#8221;</strong> (<a href="http://en.wikipedia.org/wiki/Vernor_Vinge" target="_blank">Vernor Vinge</a> &#8211; intro to ISMAR 2009)</p>
<p>Vernor Vinge&#8217;s short intro to ISMAR 2009 (which can be downloaded with the <a href="http://www.ismar09.org/" target="_blank">ISMAR 2009 schedule here</a>) captures the essence of the &#8220;Scary and Wondrous&#8221; dawn of the age of ubiquitous computing and mixed and augmented reality. It is definitely worth a moment to download. The future of augmented and mixed realities, as Vernor Vinge points out, is tied up in a &#8220;tension between centralized and distributed computing&#8221; that &#8220;will continue long into the future.&#8221; One of my fascinations with Wave is that it offers a tantalizing opportunity to explore augmented reality in an open, distributed architecture.</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/Screen-shot-2009-10-12-at-2.40.39-PM.png"><img class="alignnone size-medium wp-image-4586" title="Screen shot 2009-10-12 at 2.40.39 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/Screen-shot-2009-10-12-at-2.40.39-PM-300x154.png" alt="Screen shot 2009-10-12 at 2.40.39 PM" width="300" height="154" /></a></p>
<p>At ISMAR, I talked with as many people as possible about the AR Wave project &#8211; <a href="../../2009/10/13/ar-wave-layers-and-channels-of-social-augmented-experiences/" target="_blank">see my post here for more about Wave-enabled AR</a>. Many people were very enthusiastic to join the AR wave, and the only thing I really lacked was about 100 invites to hand out!</p>
<h3>&#8220;Everything, Everywhere &#8211; making visible the invisible&#8221;</h3>
<p>Some of the areas that I would have liked to see given more attention at ISMAR were sensor networks, data curation, and user experience. Not that these areas were entirely neglected, with Pattie Maes, MIT, as a keynote speaker, and Mark Billinghurst presenting some fascinating work on social augmented experiences and user experience. I highly recommend catching up on these and other ISMAR presentations when the videos go up.</p>
<p><a href="http://www1.cs.columbia.edu/~swhite/" target="_blank"><img class="alignnone size-medium wp-image-4716" title="Screen shot 2009-10-25 at 12.28.25 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/Screen-shot-2009-10-25-at-12.28.25-PM-300x57.png" alt="Screen shot 2009-10-25 at 12.28.25 PM" width="300" height="57" /></a></p>
<p>And, I was very happy to meet and talk to <a href="http://www1.cs.columbia.edu/~swhite/" target="_blank">Sean White</a> whose work at Columbia University is one of my inspirations (for more <a href="http://www1.cs.columbia.edu/~swhite/" target="_blank">about Sean&#8217;s work see here</a> or click image above):</p>
<p><strong>&#8220;the confluence of powerful connected mobile devices, advances in computer vision and sensing, and techniques such as augmented reality (AR) enables exciting new opportunities for interacting with this hidden network of dynamic information and shifts the locus of interaction from the desktop computer to the world around us&#8221;</strong></p>
<p>And I had several very interesting conversations at ISMAR about developing social augmented experiences that connect us to a physical world that is becoming &#8220;much more like a software construct&#8221; (Vernor Vinge). Dirk Groten, CTO of Layar, mentioned a few interesting projects Layar has up their sleeves, including something Layar may be cooking up with <a href="http://www.roomwareproject.org/" target="_blank">The RoomWare Project</a>.</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/Screen-shot-2009-10-24-at-10.03.00-PM.png"><img class="alignnone size-medium wp-image-4697" title="Screen shot 2009-10-24 at 10.03.00 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/Screen-shot-2009-10-24-at-10.03.00-PM-300x231.png" alt="Screen shot 2009-10-24 at 10.03.00 PM" width="300" height="231" /></a><br />
The picture above is of RoomWare&#8217;s Social RFID Installation for Media Plaza in Utrecht (<a href="http://blog.roomwareproject.org/2008/10/06/social-rfid-installation-for-media-plaza/">read more here</a>).</p>
<h3>Demos Galore!</h3>
<p>In the demo rooms, <a rel="cc:attributionURL" href="http://augmentation.wordpress.com/2009/10/24/ismar-ismar-ismar-where-to-start/">Noah Zerkin</a> (pic below left) pretty much single-handedly carried the AR flag for a growing community of augmented reality Makers and Hackers. His presence was much appreciated, and he tirelessly demoed <a href="http://zerkinglove.com/" target="_blank">The Zerkin Glove</a>. See <a href="http://augmentation.wordpress.com/2009/10/24/ismar-ismar-ismar-where-to-start/" target="_blank">the first of what may be several posts from Noah on ISMAR here</a>.</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/noah2post.jpg"><img class="alignnone size-medium wp-image-4700" title="noah2post" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/noah2post-300x199.jpg" alt="noah2post" width="300" height="199" /></a><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/TishVuzixgogglespost.jpg"><img class="alignnone size-medium wp-image-4704" title="Tish&amp;Vuzixgogglespost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/TishVuzixgogglespost-300x199.jpg" alt="Tish&amp;Vuzixgogglespost" width="300" height="199" /></a></p>
<p>And I got to try out the Vuzix goggles (picture above on right). This was my first experience playing an AR game that was smart about real-world gravity: &#8220;an augmented reality marble game that uses gravity as a game controller&#8221; &#8211; see <a href="http://gamesalfresco.com/2009/08/09/augmented-reality-has-gained-gravity/" target="_blank">Ori Inbar&#8217;s write up here</a>. It was a very compelling experience, and I have to say I didn&#8217;t really notice the shortcomings of the Vuzix goggles while I was absorbed in the game. I turned out to be quite good at the game too. It is intuitive, unlike the kind of rule-based games I never have time to learn properly. But what is so special about this project is that the tools it is built with are open, available to all, and affordable (see this <a href="http://gamesalfresco.com/2009/08/09/augmented-reality-has-gained-gravity/" target="_blank">list on Games Alfresco</a>).</p>
<p>It was a great pleasure to meet <a href="http://www1.cs.columbia.edu/~feiner/" target="_blank">Prof. Steven Feiner</a> (pictured below left), who heads Columbia University&#8217;s brilliant AR research team at <a href="http://graphics.cs.columbia.edu/top.html" target="_blank">The Columbia University Graphics and User Interfaces Lab</a>.</p>
<p>Ori Inbar (pic below on right) also spent a lot of time in the demo room showing off Ogmento&#8217;s lovely AR learning game that delighted attendees, <a href="http://ogmento.com/"><strong>&#8220;Put a Spell: Learn to Spell with Augmented Reality.&#8221;</strong></a></p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/TishVuzixpost.jpg"><img class="alignnone size-medium wp-image-4703" title="TishVuzixpost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/TishVuzixpost-199x300.jpg" alt="TishVuzixpost" width="199" height="300" /></a><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/Ogmentopost.jpg"><img class="alignnone size-medium wp-image-4702" title="Ogmentopost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/Ogmentopost-199x300.jpg" alt="Ogmentopost" width="199" height="300" /></a></p>
<p>For a round-up of what&#8217;s next for augmented reality head-mounted displays, check out <a href="http://gamesalfresco.com/2009/10/23/ismar-2009-epilogue-a-new-augmented-reality-world-order/" target="_blank">Games Alfresco here</a>, and Thomas Carpenter&#8217;s excellent review of the <a href="http://thomaskcarpenter.com/2009/10/21/ismar09-hmd-review/">head mounted displays</a>.</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/GeorgandBlairpost.jpg"><img class="alignnone size-medium wp-image-4712" title="GeorgandBlairpost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/GeorgandBlairpost-300x199.jpg" alt="GeorgandBlairpost" width="300" height="199" /></a><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/cypherpost.jpg"><img class="alignnone size-medium wp-image-4713" title="cypherpost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/cypherpost-300x199.jpg" alt="cypherpost" width="300" height="199" /></a></p>
<p><strong>Ori Inbar on Games Alfresco asks: is &#8220;Microsoft &#8211; the new big player to watch</strong>?&#8221; &#8220;<a href="http://www.robots.ox.ac.uk/%7Egk/" target="_blank">Georg Klein</a>, inventor of <a href="http://www.youtube.com/watch?v=pBI5HwitBX4" target="_blank">PTAM-on-an-iPhone</a> (and the smartest Computer Vision guy on the block),&#8221; has joined Microsoft to make Mobile AR.</p>
<p>The picture on the left above shows Georg trying out <a href="http://www.youtube.com/watch?v=Cix3Ws2sOsU&amp;feature=player_embedded" target="_blank">ARhrrr</a> with Blair MacIntyre. And on the right, Blair is demoing his marker card pack to Senior Vice President of Cypher Entertainment, David Elmekies. Yes, ISMAR was abuzz with demos. See <a href="http://compscigail.blogspot.com/2009/10/ismar09-few-demos.html" target="_blank">this post</a> from Gail Carmichael for more video demos.</p>
<h3>Next Year ISMAR 2010 in Korea!</h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/ISMARBanquet.jpg"><img class="alignnone size-medium wp-image-4693" title="ISMARBanquet" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/ISMARBanquet-300x199.jpg" alt="ISMARBanquet" width="300" height="199" /></a></p>
<p>At the banquet, I managed to find a seat at a table with Sean White (at left in photo above, with Christine Perey to his right) and the Columbia University team. The banquet culminated with the &#8220;Past and Future of ISMAR&#8221; panel, chaired valiantly by Jay Wright of Qualcomm. We were asked to offer our input for ISMAR 2010. I offered up an idea that I have been nurturing for a while now &#8211; to stage a &#8220;Green Tech AR Competition.&#8221; Perhaps, I suggested, we could base the competition around a conference (ISMAR 2010 in Korea?) and set up a target-rich, instrumented environment for the occasion. I think the Arduino open hardware community and AR developers have a synergy that is just waiting to be explored! And, if we add the innovators of data curation to the mix, e.g., Pachube, AMEE, and Path Intelligence&#8230; (Markus Tripp left ISMAR to speak on a <a href="http://www.web2summit.com/web2009" target="_blank">Web 2.0 Summit</a> panel, <a href="http://www.readwriteweb.com/archives/humans_as_sensors.php" target="_blank">&#8220;Humans as Sensors,&#8221;</a> which also included Path Intelligence, Deborah Estrin on <a href="http://research.cens.ucla.edu/people/estrin/" target="_blank">&#8220;participatory sensing,&#8221;</a> and the brilliant work of <a href="http://twitter.com/dianneisnor" target="_blank">Di-Ann Eisnor</a>, <a href="http://platial.com/" target="_blank">Platial</a>, on &#8220;Transactional Cartography&#8221;). Anyway, a big Green Tech AR competition could get people working together across the broad spread of AR terrain on some of the sticky problems of user experience. And, with a high level of support from smartphone companies, HMD manufacturers, and the chip makers, we just might come up with some extraordinary magic.</p>
<p>The devil of course will be in the details. But a competition like this could not only motivate key players to come together in the spirit of coopetition but also be an opportunity to show the world the power of AR to make visible the invisible ecosystems that are so important to the health of our planet.</p>
<p>One of the notable presences at ISMAR 2009 was the Qualcomm team. Jay Wright&#8217;s presentation (an exclusive for ISMAR) not only outlined AR for 2012 but also touched on some &#8220;close to the metal&#8221; innovation that we will see from Qualcomm very, very soon! I had some time in the press room with Jay and his team, prompted by <a href="http://www.mobilemonday.nl/" target="_blank">MoMo&#8217;s</a> Yuri van Geest. When I twittered about Qualcomm&#8217;s presentation at ISMAR, Yuri replied:</p>
<p><a href="http://twitter.com/vanGeest" target="_blank">vangeest</a> <a href="http://twitter.com/TishShute" target="_blank">&#8220;@tishshute</a>: good stuff, hopefully you will integrate the neat new solutions and ideas in your talk in November ;)&#8221;</p>
<p>I will be presenting at <a href="http://www.mobilemonday.nl/" target="_blank">MoMo #13</a> on AR, open AR, and the future of AR and the GeoWeb, and hopefully will bring some good news from Qualcomm too. Anyway, Jay seemed to like the idea of a Green Tech AR Competition, even though I did stress that I thought it needed some serious sponsorship and BIG prizes.</p>
<h3>Where&#8217;s the beef? Tracking and Mapping at ISMAR 2009</h3>
<p>On the flight from NYC to Orlando and ISMAR &#8217;09, I dozed (I had been up late preparing my presentation) and watched the Dew Tour pro skateboard competition and Top Chef on the Food Channel. In this particular episode of Top Chef, the aspiring chefs were each given a brown bag of ingredients by an already famous chef, who then judged whether the contenders managed to make a delicious meal with their allotment, which was notably lacking in key ingredients of haute cuisine.</p>
<p>This metaphor of trying to cook up a great meal while perhaps missing the staples is apt for the current early stage of commercial augmented reality. And when I arrived in Orlando, not only were the Dew Tour pro skateboarders staying at the same hotel as ISMAR, but ISMAR itself felt remarkably like an Augmented Reality Top Chef Coopetition.</p>
<p>Much of ISMAR was dedicated to the task of providing the meat and potatoes of Augmented Reality &#8211; solutions to mobile tracking, mapping, and registration &#8211; particularly in the Science and Technology track.</p>
<p>Industrial and military augmented reality solutions, I found out, typically solve the tracking problem by using fixed mounts, which clearly wouldn&#8217;t translate well to the AR-everywhere-with-everything mobile world consumer culture expects.</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/DanielPustkapost.jpg"><img class="alignnone size-medium wp-image-4679" title="DanielPustkapost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/DanielPustkapost-300x199.jpg" alt="DanielPustkapost" width="300" height="199" /></a><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/Screen-shot-2009-10-25-at-2.41.56-PM.png"><img class="alignnone size-medium wp-image-4726" title="Screen shot 2009-10-25 at 2.41.56 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/Screen-shot-2009-10-25-at-2.41.56-PM-300x208.png" alt="Screen shot 2009-10-25 at 2.41.56 PM" width="300" height="208" /></a></p>
<p><em>In the picture on the left, Fabian Doil stands by the VW engine that provided some of the outdoor targets for the ISMAR tracking competition. On the right is a picture from VW&#8217;s presentation on their research and development of AR.</em></p>
<p>I followed the tracking contest, organized by Daniel Pustka and Fabian Doil of Volkswagen, quite closely. And I learned a lot in the process. While it is clear there has been progress in AR mapping and tracking, we still have a ways to go.</p>
<p>But hanging around the Tracking Competition was a good way to find out the state of play of this crucial part of the AR dream. For example, a little tidbit I learned is that <a href="http://www.gris.informatik.tu-darmstadt.de/~mgoesele/" target="_blank">Michael Goesele</a>, who has been reconstructing &#8220;high-quality geometry models from images collected from the internet (so called community photo collections, CPC),&#8221; is soon to be at the <a href="http://www.ini-graphics.net/ini-graphicsnet/members/fraunhofer-institut-fuer-graphische-datenverarbeitung-igd.html" target="_blank">Institut Graphische Datenverarbeitung</a>, where top contenders in the tracking contest &#8211; Harald Wuest and Folker Weintipper (in the foreground of the photo at the left and right respectively) &#8211; are also to be found. [Update: Harald and Folker were the winning team &#8211; <a href="http://docs.google.com/gview?a=v&amp;pid=gmail&amp;attid=0.1&amp;thid=1248dd2927becb21&amp;mt=application%2Fpdf&amp;url=http%3A%2F%2Fmail.google.com%2Fmail%2F%3Fui%3D2%26ik%3De77cfddae9%26view%3Datt%26th%3D1248dd2927becb21%26attid%3D0.1%26disp%3Dattd%26zw&amp;sig=AHBy-hbcqUsaRNjbqpHO8vAF_vJqfDrMig" target="_blank">see here for details of scoring and results</a>!] Otto Korkalo and Tuomas Kantonen of VTT Finland&#8217;s augmented reality team are in the background. They have been working on the joint IBM, Nokia, and VTT project that brings <a href="http://www.marketwatch.com/story/researchers-from-ibm-nokia-and-vtt-bring-avatars-and-people-together-for-virtual-meetings-in-physical-spaces-2009-10-19" target="_blank">Avatars and People Together for Virtual Meetings in Physical Spaces</a>.</p>
<p>The picture on the right shows another team that was doing very well. If my notes serve me well (and please forgive me if they don&#8217;t &#8211; I came back with my card wallet overflowing!), it shows Christian Waechter (on the left) and Peter Keitler (on the right) of the <a href="http://portal.mytum.de/welcome" target="_blank">Technische Universitat Munchen</a>.</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/trackingcompetitionpost.jpg"><img class="alignnone size-medium wp-image-4672" title="trackingcompetitionpost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/trackingcompetitionpost-300x199.jpg" alt="trackingcompetitionpost" width="300" height="199" /></a><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/Trackingcompetition2post.jpg"><img class="alignnone size-medium wp-image-4681" title="Trackingcompetition2post" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/Trackingcompetition2post-300x199.jpg" alt="Trackingcompetition2post" width="300" height="199" /></a></p>
<p>Germany is certainly leading the way in industrial AR. And I learned how small businesses like Metaio get to work with top research institutions and big companies like VW, thanks to a very strong German funding program for AR and VR. The current iteration of a series of funding programs is called <a href="http://www.avilus.de/" target="_blank">Avilus</a>. Avilus is putting 42 million Euros into AR and VR this year alone (click on the slide below to see more about Avilus).</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/Screen-shot-2009-10-24-at-1.08.48-AM.png"><img title="Screen shot 2009-10-24 at 1.08.48 AM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/Screen-shot-2009-10-24-at-1.08.48-AM-300x212.png" alt="Screen shot 2009-10-24 at 1.08.48 AM" width="300" height="212" /></a><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/Screen-shot-2009-10-24-at-2.04.50-AM.png"><img class="alignnone size-medium wp-image-4673" title="Screen shot 2009-10-24 at 2.04.50 AM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/Screen-shot-2009-10-24-at-2.04.50-AM-300x202.png" alt="Screen shot 2009-10-24 at 2.04.50 AM" width="300" height="202" /></a></p>
<p>I wish we had the equivalent of Avilus here in the US. But there is no equivalent for AR here, and no AR seems to be in development by the US car industry either. Still, look at the slide above to get a taste of some of the cool stuff Metaio and other small AR and VR businesses do for VW through the Avilus project.</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/VWtrackinggudrunpost.jpg"><img class="alignnone size-medium wp-image-4682" title="VWtrackinggudrunpost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/VWtrackinggudrunpost-300x199.jpg" alt="VWtrackinggudrunpost" width="300" height="199" /></a></p>
<p>I also got to meet many people from one of the world&#8217;s most important AR hubs &#8211; the Department of Informatics, <a href="http://portal.mytum.de/welcome" target="_blank">Technische Universitat Munchen</a> &#8211; including Prof. Gudrun Klinker, on the far right in the pic above, and, from left to right, Fabian Doil (VW, co-organizer of the contest), Sebastian Lieberknecht, Selim Ben Himane (Metaio), and Tobias Eble (Metaio). Prof. Klinker is the engine behind much of German innovation in AR.</p>
<p>Metaio was one of the few teams to rely mainly on markerless tracking, which in this contest was very challenging because of the very different light conditions (see pics below) between the windowless interior and the dazzling Florida sunshine outside (the pic on the right shows targets under ideal lighting conditions). Many people in the US may be familiar with Metaio&#8217;s consumer applications, like Junaio, but thanks to Germany&#8217;s efforts to nurture augmented and virtual reality they are also respected software developers in industrial AR. And I suspect that Metaio will spearhead markerless tracking in consumer AR too.</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/Trackingcompetition5post.jpg"><img class="alignnone size-medium wp-image-4740" title="Trackingcompetition5post" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/Trackingcompetition5post-300x199.jpg" alt="Trackingcompetition5post" width="300" height="199" /></a><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/Screen-shot-2009-10-25-at-7.47.44-PM.png"><img class="alignnone size-medium wp-image-4745" title="Screen shot 2009-10-25 at 7.47.44 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/Screen-shot-2009-10-25-at-7.47.44-PM-300x229.png" alt="Screen shot 2009-10-25 at 7.47.44 PM" width="300" height="229" /></a></p>
<p>This post, as usual, has already expanded into something much longer than I originally intended &#8211; pretty typical for me! There is much I have not been able to cover, including some of the interesting contributions by augmented reality artists at ISMAR &#8211; again, I recommend the upcoming videos.</p>
<p>But I cannot end without a hat tip to Oriel, Nate, et al., who won the best student paper award for AR Sketch &#8211; again, please <a href="http://gamesalfresco.com/2009/10/23/ismar-2009-epilogue-a-new-augmented-reality-world-order/" target="_blank">see Games Alfresco for more on this</a> (pic below from Games Alfresco). AR Sketch, Ori notes, is featured &#8220;in our <a href="http://gamesalfresco.com/2009/10/16/ismar-2009-sketch-and-shape-recognition-preview-from-ben-gurion-university/" target="_self">top post</a> and popular <a href="http://www.youtube.com/watch?v=M4qZ0GLO5_A" target="_blank">video</a>.&#8221; And:</p>
<p><strong>&#8220;Their work is revolutionizing the AR world by avoiding the need to print markers &#8211; or any images whatsoever.&#8221;</strong></p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/Screen-shot-2009-10-25-at-1.58.35-PM1.png"><img class="alignnone size-medium wp-image-4719" title="Screen shot 2009-10-25 at 1.58.35 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/Screen-shot-2009-10-25-at-1.58.35-PM1-300x223.png" alt="Screen shot 2009-10-25 at 1.58.35 PM" width="300" height="223" /></a></p>
]]></content:encoded>
			<wfw:commentRss>http://www.ugotrade.com/2009/10/24/ismar-2009-an-augmented-reality-top-chef-coopetition/feed/</wfw:commentRss>
		<slash:comments>9</slash:comments>
		</item>
		<item>
		<title>AR Wave: Layers and Channels of Social Augmented Experiences</title>
		<link>http://www.ugotrade.com/2009/10/13/ar-wave-layers-and-channels-of-social-augmented-experiences/</link>
		<comments>http://www.ugotrade.com/2009/10/13/ar-wave-layers-and-channels-of-social-augmented-experiences/#comments</comments>
		<pubDate>Tue, 13 Oct 2009 18:52:42 +0000</pubDate>
		<dc:creator><![CDATA[Tish Shute]]></dc:creator>
				<category><![CDATA[Ambient Devices]]></category>
		<category><![CDATA[Ambient Displays]]></category>
		<category><![CDATA[Android]]></category>
		<category><![CDATA[architecture of participation]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Augmented Reality]]></category>
		<category><![CDATA[Carbon Footprint Reduction]]></category>
		<category><![CDATA[culture of participation]]></category>
		<category><![CDATA[digital public space]]></category>
		<category><![CDATA[Energy Awareness]]></category>
		<category><![CDATA[Instrumenting the World]]></category>
		<category><![CDATA[internet of things]]></category>
		<category><![CDATA[iphone]]></category>
		<category><![CDATA[message brokers and sensors]]></category>
		<category><![CDATA[mirror worlds]]></category>
		<category><![CDATA[Mixed Reality]]></category>
		<category><![CDATA[mobile augmented reality]]></category>
		<category><![CDATA[mobile meets social]]></category>
		<category><![CDATA[Mobile Reality]]></category>
		<category><![CDATA[new urbanism]]></category>
		<category><![CDATA[Participatory Culture]]></category>
		<category><![CDATA[privacy and online identity]]></category>
		<category><![CDATA[social gaming]]></category>
		<category><![CDATA[social media]]></category>
		<category><![CDATA[sustainable living]]></category>
		<category><![CDATA[sustainable mobility]]></category>
		<category><![CDATA[ubiquitous computing]]></category>
		<category><![CDATA[virtual communities]]></category>
		<category><![CDATA[Web Meets World]]></category>
		<category><![CDATA[websquared]]></category>
		<category><![CDATA[World 2.0]]></category>
		<category><![CDATA[Amphibious Architecture]]></category>
		<category><![CDATA[AR Blip]]></category>
		<category><![CDATA[AR Browser]]></category>
		<category><![CDATA[AR Wave]]></category>
		<category><![CDATA[augmentation]]></category>
		<category><![CDATA[augmented reality search]]></category>
		<category><![CDATA[Blair Macintyre]]></category>
		<category><![CDATA[Channels and Social Augmented Realities]]></category>
		<category><![CDATA[citi sensing]]></category>
		<category><![CDATA[citizen sensing]]></category>
		<category><![CDATA[Clayton Lilly]]></category>
		<category><![CDATA[cybernetics vs ecology and human waste]]></category>
		<category><![CDATA[distributed]]></category>
		<category><![CDATA[eco mapping]]></category>
		<category><![CDATA[Gene Becker]]></category>
		<category><![CDATA[geoAR]]></category>
		<category><![CDATA[geospatial web]]></category>
		<category><![CDATA[geospatial web and augmented reality]]></category>
		<category><![CDATA[Google Wave Federation Protocol]]></category>
		<category><![CDATA[Google Wave]]></category>
		<category><![CDATA[Google Wave as an AR enabler]]></category>
		<category><![CDATA[Google Wave enable augmented reality]]></category>
		<category><![CDATA[Google Wave Protocols]]></category>
		<category><![CDATA[green tech augmented reality]]></category>
		<category><![CDATA[immersive sight]]></category>
		<category><![CDATA[Jeremy Hight]]></category>
		<category><![CDATA[Joe Lamantia]]></category>
		<category><![CDATA[Layers]]></category>
		<category><![CDATA[layers and channels of augmented reality]]></category>
		<category><![CDATA[Life Clipper]]></category>
		<category><![CDATA[life streaming]]></category>
		<category><![CDATA[location based media]]></category>
		<category><![CDATA[location based services]]></category>
		<category><![CDATA[locative media]]></category>
		<category><![CDATA[locative narratives]]></category>
		<category><![CDATA[Mannahatta]]></category>
		<category><![CDATA[map based augmentation]]></category>
		<category><![CDATA[mapping]]></category>
		<category><![CDATA[modulated mapping]]></category>
		<category><![CDATA[modulated napping]]></category>
		<category><![CDATA[multi-user]]></category>
		<category><![CDATA[narrative archaeology]]></category>
		<category><![CDATA[Natural Fuse]]></category>
		<category><![CDATA[neogeography]]></category>
		<category><![CDATA[networked urbanism]]></category>
		<category><![CDATA[non euclidian geometry]]></category>
		<category><![CDATA[open augmented reality framework]]></category>
		<category><![CDATA[Senseable Labs]]></category>
		<category><![CDATA[sensor networks]]></category>
		<category><![CDATA[shared augmented realities]]></category>
		<category><![CDATA[social augmented experiences]]></category>
		<category><![CDATA[social augmented reality experiences]]></category>
		<category><![CDATA[sound augmentation]]></category>
		<category><![CDATA[Thomas K. Carpenter]]></category>
		<category><![CDATA[Thomas Wrobel]]></category>
		<category><![CDATA[Trash Track]]></category>
		<category><![CDATA[ubicomp]]></category>
		<category><![CDATA[virtual reality]]></category>
		<category><![CDATA[Wave as a platform for augmented reality]]></category>
		<category><![CDATA[Wave Blip]]></category>
		<category><![CDATA[Wave Bots]]></category>
		<category><![CDATA[Wave playback]]></category>
		<category><![CDATA[Wave playback feature]]></category>
		<category><![CDATA[Wave Robots]]></category>

		<guid isPermaLink="false">http://www.ugotrade.com/?p=4585</guid>
		<description><![CDATA[It is now nearly two weeks since the Google Wave preview launch and I am happy to say we have some AR Wave news. The diagram above shows Thomas Wrobel’s basic concept for a distributed, multi-user, open augmented reality framework based on the Google Wave Federation Protocol and servers (click on the image to see [&#8230;]]]></description>
				<content:encoded><![CDATA[<p><a href="http://lostagain.nl/tempspace/PrototypeDiagram3_wave.html" target="_blank"><img class="alignnone size-medium wp-image-4586" title="Screen shot 2009-10-12 at 2.40.39 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/Screen-shot-2009-10-12-at-2.40.39-PM-300x154.png" alt="Screen shot 2009-10-12 at 2.40.39 PM" width="300" height="154" /></a></p>
<p>It is now nearly two weeks since the <a href="http://wave.google.com/" target="_blank">Google Wave</a> preview launch and I am happy to say we have some AR Wave news. The diagram above shows Thomas Wrobel&#8217;s basic concept for a distributed, multi-user, open augmented reality framework based on the <a href="http://www.waveprotocol.org/" target="_blank">Google Wave Federation Protocol</a> and servers (click on the image to see the dynamic annotated sketch <a href="http://lostagain.nl/tempspace/PrototypeDiagram3_wave.html" target="_blank">or here</a>).</p>
<p>Even in the short time we have had to explore Wave, some very exciting possibilities are becoming clear. Thomas puts some of the virtues of Wave as an AR enabler succinctly when he writes:</p>
<p><strong>&#8220;Wave allows the advantages of both real-time communication, as well as the advantages of persistent hosting of data. It is both like IRC, and like a Wiki. It allows anyone to create a Wave, and share it with anyone else. It allows Waves to be edited at the same time by many people, or used as a private reference for just one person.</strong></p>
<p><strong>These are all incredibly useful properties for any AR experience; more so, Wave is open. Anyone can make a server or client for Wave. Better yet, these servers will exchange data with each other, providing a seamless world for the user&#8230; a single login will let you browse the whole world of public waves, regardless of who&#8217;s providing or hosting the data. Wave is also quite scalable and secure&#8230; data is only exchanged when necessary, and will stay local if no one else needs to view it.</strong></p>
<p><strong>Wave allows bots to run on it&#8230; allowing blips in a wave to be automatically updated, created or destroyed based on any criteria the coders choose. Wave even allows the playback of all edits since the wave was created.</strong></p>
<p><strong>For all these reasons and more, Wave makes a great platform for AR.&#8221;</strong></p>
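<p><em>A minimal sketch of the properties Thomas describes &#8211; real-time editing like IRC, wiki-style persistence, and playback of all edits. This is plain illustrative Python, not the actual Google Wave API; the class and method names are my own.</em></p>

```python
# Illustrative sketch only -- NOT the real Google Wave API.
# Models a blip whose full edit history is retained, so the latest
# state is always available (wiki-like persistence) and every edit
# since creation can be replayed (Wave's "playback" feature).

class Blip:
    """A unit of wave content that records every edit."""

    def __init__(self, author, text):
        self.history = [(author, text)]  # every revision is kept

    def edit(self, author, text):
        # Concurrent editors would each append revisions here.
        self.history.append((author, text))

    def current(self):
        # The latest revision is what viewers see "live".
        return self.history[-1][1]

    def playback(self):
        # Replay all edits since the blip was created, oldest first.
        return list(self.history)


blip = Blip("thomas", "AR Wave kickoff")
blip.edit("tish", "AR Wave kickoff -- meeting notes")
print(blip.current())        # latest state
print(len(blip.playback()))  # number of revisions available for playback
```

<p><em>The real Wave protocol works at a much finer grain (operational transforms on document deltas), but the retained-history model above is the property that makes playback and shared editing possible.</em></p>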
<p>There will be much more coming soon on Wave enabled AR because the Google Wave invites have begun to flow out to a wider community now. This week, many of our small ad-hoc group looking at the development challenges and implications of Google Wave for AR actually got into Wave for the first time.</p>
<p>Many thanks to all the people who have contributed to this discussion so far including: Thomas Wrobel, Thomas K. Carpenter, Jeremy Hight, Joe Lamantia, Clayton Lilly, Gene Becker and many others.</p>
<p>We will be setting up some public AR Framework Development Waves this week. If you have any trouble finding them, or adding yourself to them, please add Thomas and me to your contact list. I am tishshute@googlewave.com; Thomas is darkflame@googlewave.com. The first two are currently called:</p>
<p><strong>AR Wave: Augmented Reality Wave Framework Development</strong> (developer forum)</p>
<p><strong>AR Wave: Augmented Reality Wave Development</strong> (for general discussion)</p>
<p>The discussion so far has been in two areas. On the one hand, it is gear-heady and focused on the <a href="http://www.waveprotocol.org/" target="_blank">Google Wave Federation Protocol</a>, code, development challenges, and interfacing to mobile, while on the other hand people have been looking at use cases and questions of user experience.</p>
<p>Distributed &#8220;shared augmented realities,&#8221; or &#8220;social augmented experiences&#8221; &#8211; that not only allow mashups &amp; multisource data flows, but dynamic overlays (not limited to 3d), created by users, linked to location/place/time, and distributed to other users who wish to engage with the experience by viewing and co-creating elements for their own goals and benefit &#8211; are something very new for us to think about.</p>
<p>As Joe Lamantia puts it:</p>
<p><strong>&#8220;there&#8217;s a feedback loop between which interactions are made easy by any given combo of device / hardware / software / connectivity, and the ways that people really work in real life (without any mediation / permeation by tech).&#8221;</strong></p>
<p>Joe Lamantia, whose term <strong>&#8220;social augmented experiences&#8221;</strong> I borrow for this post title, has done some thinking about <strong>&#8220;concepts and models for understanding and contributing to shared augmented experiences, such as the social scales for interaction, and the challenges attendant to designing such interactions.&#8221;</strong> Check out <a href="http://www.joelamantia.com/" target="_blank">Joe Lamantia&#8217;s blog</a> for more on this later this week.</p>
<p>It is very helpful, as Joe points out, to shift the focus back and forth between the experience and the medium.</p>
<p>It is super exciting to have clear evidence that shared augmented realities are no longer merely possible, but highly probable and actually doable now.</p>
<p>I should be absolutely clear about what Google Wave does to enable AR, because obviously Wave plays no role in solving image recognition and tracking/registration issues. But Wave protocols and servers do provide a means to exchange, edit, and read data, and that enables distributed, social augmented realities.</p>
<p>Thomas explains how the newly named &#8220;AR Blip&#8221; works:</p>
<p><strong>&#8220;An AR Blip is simply a blip in a wave containing AR data. Typically this would be the positional and URL data telling an AR browser to position a 3d object at a location in space.</strong></p>
<p><strong>In more generic terms, an AR Blip allows data of various forms (meshes, text, sound) to be given a real-world position.&#8221;</strong></p>
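<p>As a rough illustration, the kind of payload Thomas describes can be sketched as a simple data structure. This is only a thought experiment &#8211; the field names (position, content_url, and so on) are my own placeholders, not part of any agreed AR Wave schema:</p>

```python
import json

# A minimal sketch of what an "AR Blip" payload might contain.
# All field names here are illustrative assumptions -- no schema
# had been finalized at the time of this discussion.
def make_ar_blip(lat, lon, alt, content_url, content_type="mesh"):
    """Bundle a real-world position with a pointer to AR content."""
    return {
        "type": "ar-blip",
        "content_type": content_type,  # e.g. "mesh", "text", "sound"
        "position": {"lat": lat, "lon": lon, "alt": alt},
        "content_url": content_url,    # where the client fetches the media
    }

blip = make_ar_blip(40.7580, -73.9855, 10.0,
                    "http://example.com/models/sign.obj")
print(json.dumps(blip, indent=2))
```

<p>An AR browser reading such a blip would fetch the media at content_url and render it at the given coordinates; other browsers subscribed to the same wave would see the same object in the same place.</p>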
<p>I have mentioned in other posts (<a href="http://www.ugotrade.com/2009/08/19/everything-everywhere-thomas-wrobels-proposal-for-an-open-augmented-reality-network/" target="_blank">here</a> and <a href="http://www.ugotrade.com/2009/09/26/total-immersion-and-the-transfigured-city-shared-augmented-realities-the-web-squared-era-and-google-wave/" target="_blank">here</a>) that Wave can be used for AR that is as precise or as loose as the current generation of devices can handle. And as the hardware and software arrive for the kind of AR that can put media out in the world and truly immerse you in a mixed space, the framework should be able to handle this too.</p>
<p>(a note on the Wave playback feature &#8211; this opens up a whole new world of possibilities. Check out <a href="http://snarkmarket.com/2009/3605" target="_blank">this post</a> on some of the implications of playback for writing!)</p>
<p>The use cases we have been coming up with are too numerous to go into in detail in this post<span>. The open nature of an AR framework/Wave standard will lead to many new applications we have barely begun to imagine. As Thomas points out, different client software can be made for browsing, potentially allowing for various specialist browsers, as well as more generic ones for typical use. T</span>he multitudes of different kinds of data in/output that could be integrated in an open AR framework as it evolves are mind-boggling.</p>
<p>But, for now, some obvious use cases do come to mind, e.g.:</p>
<p>- Historical environmental overlays showing how a city used to be/and how this vision may be constructed differently by different communities</p>
<p>- Proposed building work showing future changes to a structure, and the negotiation of this future (both the public and professionals could submit their own comments on the plans in context); seeing pipes, cables and other invisible elements could help builders and engineers collaborate and do their work.</p>
<p>- Skinning the world with interactive fantasies</p>
<p>I asked Thomas to help people understand how Wave enables new interactions with data by explaining how Wave could enable city sensing and citizen sensing projects (e.g. <a href="http://tinyurl.com/y97d5zr" target="_blank">this one being pioneered by Griswold</a>):</p>
<p><strong><strong>&#8220;Sensors, both mobile and static, could contribute environmental data into city overlays;</strong></strong></p>
<div><strong><strong>&#8211; temperature, windspeed, air quality (amounts of certain particles), water quality, amount of sunlight, CO2 emissions could all be fed into different waves. The AR Wave Framework makes it easy to see any combination of these at the same time.&#8221;</strong></strong></div>
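<p>To make this concrete, here is a minimal sketch of how a client might merge readings from several such sensor waves into one overlay, showing only the layers a user has switched on. The layer names and reading format are hypothetical:</p>

```python
# Illustrative sketch: merging readings from several hypothetical
# sensor waves into a single overlay view. Only layers the user has
# enabled are shown; names and structures are assumptions.
def build_overlay(sensor_waves, enabled_layers):
    """Return readings from every enabled layer, tagged by layer name."""
    overlay = []
    for layer, readings in sensor_waves.items():
        if layer in enabled_layers:
            for reading in readings:
                overlay.append({"layer": layer, **reading})
    return overlay

waves = {
    "temperature": [{"lat": 51.5, "lon": -0.1, "value": 14.2}],
    "air_quality": [{"lat": 51.5, "lon": -0.1, "value": 42}],
    "windspeed":   [{"lat": 51.5, "lon": -0.1, "value": 7.5}],
}
view = build_overlay(waves, enabled_layers={"temperature", "air_quality"})
print(len(view))  # two layers enabled -> two readings shown
```

<p>Because each kind of data lives in its own wave, the client can turn layers on and off independently, which is exactly what makes seeing &#8220;any combination of these at the same time&#8221; cheap.</p>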
<p>Having these invisible aspects of the world made visible would create ways to improve sustainability, social equity, urban management, energy efficiency, and public health, and allow communities to understand and become active participants in the ecosystems and infrastructure of their neighborhoods.</p>
<p>The key is reflecting this kind of data back to people, &#8220;making it not back story but fore story,&#8221; right where we are, right where it happens, as well as having it available for analysis.</p>
<p>As well as creating new opportunities to interact with, respond to, and enhance data, making visible the invisible, as <a href="http://www.environmentalhealthclinic.net/people/natalie-jeremijenko/" target="_blank">Natalie Jeremijenko&#8217;s</a> work on <a href="http://www.amphibiousarchitecture.net/" target="_blank">Amphibious Architecture</a> and <a href="http://www.haque.co.uk/" target="_blank">Usman Haque&#8217;s</a> project <a href="http://www.sentientcity.net/exhibit/?p=43" target="_blank">Natural Fuse</a> show, can also create new connections and understandings between humans and the non-humans that share our world, e.g. fish, plants, waterways.</p>
<p>At a more prosaic level, potential buyers of property could see more clearly what they are buying, city planners could see better what needs to be worked on, and environmental researchers could see more clearly the impact people are having on an area.</p>
<p>Wave can also provide some of the framework necessary to begin to address tricky problems of privacy. Sensitive data can be stored on private waves, e.g. medical data for doctors and researchers, but the analysis of the data could still be of benefit to all, e.g. if disease occurrences were tied to locations and the relationships between environmental data and health were&#8230; quite literally&#8230; made visible.</p>
<p><strong>&#8220;The publication of energy consumption, and making it visible as overlays, could help influence the public into supporting more energy-efficient companies and businesses. It could also help citizens to try to keep their own energy usage down, to try to keep their street in &#8216;the green.&#8217;&#8221;</strong></p>
<p>Thomas notes:</p>
<p><strong>&#8220;With all of the above, it becomes fairly trivial to write persistent Wave bots that automatically send notice when certain criteria are met (pollutants over a certain level, for example). On publicly readable waves, anyone can use the data on their local computers, process it, and contribute results back on a new wave. Alternatively, persistent remote servers could run cron jobs, or other automated processing, using services such as App Engine to run wave robots.</strong></p>
<p><strong>All these possibilities become &#8220;free&#8221; when using Wave as a platform for geographically tied data.&#8221;</strong></p>
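<p>The alert logic Thomas describes could look something like the sketch below. Note this is plain Python illustrating the idea, not the actual Google Wave robots API; the threshold value and reading format are invented for the example:</p>

```python
# A plain-Python sketch of the threshold-alert logic a Wave robot
# might run over a sensor wave. The limit and record fields are
# illustrative assumptions, not a real API.
POLLUTANT_LIMIT = 50.0

def check_readings(readings, limit=POLLUTANT_LIMIT):
    """Return alert messages for any readings over the limit."""
    alerts = []
    for r in readings:
        if r["value"] > limit:
            alerts.append(
                "ALERT: %s at (%s, %s) reads %.1f (limit %.1f)"
                % (r["pollutant"], r["lat"], r["lon"], r["value"], limit)
            )
    return alerts

readings = [
    {"pollutant": "PM10", "lat": 51.5, "lon": -0.1, "value": 63.0},
    {"pollutant": "NO2",  "lat": 51.5, "lon": -0.1, "value": 21.0},
]
for alert in check_readings(readings):
    print(alert)  # only the PM10 reading exceeds the limit
```

<p>In a real deployment the alert would be appended as a new blip on the wave, so anyone following it sees the notice in place and in playback.</p>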
<p>But of course this is just the beginning!</p>
<p><em>Recently, I talked at length with Jeremy Hight, who has been thinking about, designing, and creating shared augmented realities that anticipate the kind of dynamic, real-time, large-scale architecture we now have available through Wave, for quite some time. This is exciting stuff.</em></p>
<h3><strong>Modulated Mapping:</strong> Talking with Jeremy Hight about Layers, Channels and Social Augmented Experiences</h3>
<p><strong><strong><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/modulatedmapping5.jpg"><img class="alignnone size-medium wp-image-4611" title="modulatedmapping5" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/modulatedmapping5-230x300.jpg" alt="modulatedmapping5" width="230" height="300" /></a><br />
</strong></strong></p>
<p><strong><strong><em><span>image from Volume Magazine (Hight/Wehby)</span></em></strong></strong></p>
<p><strong><strong>Tish Shute:</strong></strong> I know you have been involved in locative media from its early days. Perhaps we can talk about how AR continues the locative media journey?</p>
<p><a href="http://www.cc.gatech.edu/~blair/home.html" target="_blank">Blair MacIntyre</a> gave me this distinction recently: <em>&#8220;AR is about systems that put media out in the world, and immerse you in a mixed space. Even the current &#8220;not really registered&#8221; mobile phone AR systems are still &#8220;sort of&#8221; AR (e.g., Layar, etc).</em></p>
<p><em>Locative media/ubicomp/etc are very different, in that they tend to display media on a device (phone screen) that is relevant to your context, but does not attempt to merge it with the world.<br />
The difference is significant, and making it clear helps people think about what they do and what they want to do, with their work. The locative media space though points toward future AR systems (when the technology catches up!).&#8221;</em></p>
<p><strong><strong>Jeremy Hight: The need is to finish the arc that locative media and early AR have started and to now truly return to the map itself, but as an internet of data, interactivity, channels of data, end-user options like analog machines once had but in high-end tools, a smart AI-ish ability for it to cull data for the user, and to allow social networking to be in real-world places on the map, both in building augmentation and in using and appreciating it..not hacks..which have their place&#8230;but a rhizome, a branched system with a shared root, end-user adjustable and variable..this is the key.</strong></strong></p>
<p><strong><strong>This takes AR and mapping and makes a possible world of channels in space, and this eventually can be a kind of net we see in our field of vision, with a selected percentage of visual field and placement&#8230; so a geo-spatial net, a local-to-worldwide fusion of lm into a tool and educational tool</strong></strong></p>
<p><strong><strong><span>VR [virtual reality] has greatly advanced, but in nodes, as it has limitations&#8230; LM [locative media] is the same&#8230; AR [augmented reality] is the way..</span></strong><strong> it now has locative elements and aspects of VR integrated into its functionality and nodes&#8230;it is the best option with all of these elements, greater hybridity and data-level potential as well as end-user and community sourcing potential</strong></strong></p>
<p><strong><strong>I wrote an essay for Archis&#8217; Volume, the architecture magazine, on a near-future sense of some of this&#8230; a visual net on the lens like AR, but with smart objects and social networking and dissent.</strong></strong></p>
<p><strong><strong>I also wrote of these things for immersive graphic design, spatially aware museum augmentation, and education through AR and LM, with a nod to the base interface of eye to cerebral cortex in layered and malleable augmentation, in my essay <a href="http://www.neme.org/main/645/immersive-sight" target="_blank">&#8220;Immersive Sight&#8221;</a> a few years back</strong></strong></p>
<div id="gqg9" style="text-align: left;"><strong><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/dgznj3hp_3dj7g8zf7_b.jpg"><img class="alignnone size-medium wp-image-4601" title="dgznj3hp_3dj7g8zf7_b" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/dgznj3hp_3dj7g8zf7_b-300x225.jpg" alt="dgznj3hp_3dj7g8zf7_b" width="300" height="225" /></a></strong></div>
<p><strong><strong>the image [above] is a simple illustration of a possible example, on a screen or in front of the eye, where in a Mondrian show the graphic design of information actually builds as one moves</strong></strong></p>
<p><strong><strong>(key is calibrated spatial intervals and related layers of further augmentation which is logical due to location and proximity)</strong></strong></p>
<p><strong><strong>from immersive sight on immersive graphic design:</strong> <em>&#8220;The design can work with this in a way that creates an interactive supplemental set of information that is malleable, shifts based on location, builds and peels away as one moves closer to a work and plays with the forms of the works and the elements of the space itself. The sequence can contain many different elements and their interplay (both in the field of vision and in terms of context and layers of information). This is the model of sections of augmentation turning on and off at key points as individual spatial and concepts moments and nodes.</em></strong></p>
<p><strong><em>Another interesting possibility is that individual points of augmentation don&#8217;t turn off, but instead are designed to build as one moves in a direction toward a specific part of the exhibit. The design can work in a sequence both content-wise and visually, in terms of a delay-powered compositional development and style in which each discrete layer of text and image does not fade out, but builds on the others into a final composition. This can form paintings similar to Mondrian, perhaps, if it is a show of similar works of that era, or it can form something much more metaphorical and open in its interpretation of the space and content, utilizing a sense of emergence spatially in terms of the composition (pieces laid bare until final approach for effect). </em></strong></p>
<p><strong><em>Each section will be well designed, but they build in layers as one moves until finally forming the final composition both visually and in terms of scope of information or building immediacy. The effect can be akin to taking a painting and slicing it into onion skin layers laid out in the air at intervals, each the same dimensions, but only one section compositionally of the greater whole. This has many semiotic applications beyond its potential aesthetically and as spatialized information possessing a sense of inter-relationship as one moves.</em>&#8220;</strong></p>
<p><strong><strong>Tish Shute:</strong> </strong>One of the things I found very inspiring when I read your papers was that your ideas are not all dependent on a model of AR that would necessarily require goggles, back packs and lots of CPU/GPU &#8211; not that that wouldn&#8217;t be nice, but that even using &#8220;magic lens&#8221; AR of the kind smart phones has enabled in an open distributed framework would open up a lot of new possibilities for what you call modulated mapping wouldn&#8217;t it?Â  What kind of social augmented realities might be enabled by a distributed infrastructure like this [AR Wave]?</p>
<p><strong><strong>Jeremy Hight: right&#8230;.I see that as wayyy down the road&#8230;most important is the one you talk about, as it is more immediate and thus more essential and needed. Eventually the goggles will be like a contact lens and a deep immersive AR version of this will come; that to me is certain, but a ways down the road. An incredible amount is possible now, and this is a more pragmatic move as opposed to the more theoretical of what is a few steps from here. Thus it is more important and essential now. Tools like Google Wave are taking what even 2 years ago were more theoretical discussions of what may be, and instead introducing key elements of a more immediate, powerful, flexible level of augmentation. What have been hacks and isolated elements are to be integrated: social networking, task completion, shared tools, graphics building and geo-location.</strong></strong></p>
<p><strong><strong>Tish Shute:</strong> </strong>I think some people question what augmented reality has to bring to the continuum of location based experiences that other forms of interface/mapping do not?</p>
<p><strong><strong><span>Jeremy Hight: right&#8230;.and the schism between its commercial </span></strong><strong>flat self and tests with physics etc and in between&#8230;there are a lot of unfortunate assumptions, it seems, as to where AR and LM cross, and how AR can be many things beyond deep immersion or the opposite pole of a hockey puck having a magic purple line etc&#8230;.like LM is seen as either car directions or situationist experiments with deep data&#8230;..the progression to me is deeply organic&#8230;.and now augmentation can be more malleable, variable and end-user controlled.</strong></strong></p>
<p><strong><strong>Tish Shute:</strong> </strong>Yes, it is a really exciting time for AR. Historically, AR research has gone after the hard problems of image recognition, tracking and registration because we have not had dynamic, real-time, large-scale architectures like Wave available (until now!), so less work has been done on exploring the possibilities for distributed AR fully integrated with the internet and WWW, hasn&#8217;t it?</p>
<p>A distributed augmented reality framework such as we have envisaged on Wave would allow people to see many layers from many different people at the same time. And this kind of model has been part of your thinking and fundamental to your work for a while, hasn&#8217;t it? But it is a very new idea to most people to think about collaboratively editing layers on the world, and to be able to view augmented space through channels and networked communities. Could you explain some of the ways you have explored these ideas and how they could be explored further now to create meaningful experiences for people?</p>
<p><strong><strong><span>Jeremy Hight: right..exactly&#8230;modulated mapping to me can be an amazing tool for students&#8230;back-end searching, data visualizations and augmentations based on their needs&#8230;while they do something else on their computer or iphone&#8230;that can be amazing..and not deep </span></strong><strong>immersive.. The map can be active, malleable, open-source fed, and even, in a sense, intelligent and able to adapt. The possibility also exists for this map to have a function that, based on key words, will search databases on-line to find maps, animations, histories and stories etc to place within it for your study and engagement. The map is thus a platform and yet is active. Community is possible as people can communicate graphically in works placed on the map and in building mode in the tool. All the tropes of locative media are to be in a mapping system of channels of augmentation and a spatial net. The software by design will allow development on the map and communication, like programs such as Second Life, but in mapping itself.</strong></strong></p>
<p><strong><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/modultedmapping1.jpg"><img class="alignnone size-medium wp-image-4607" title="interactive 3d map copy" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/modultedmapping1-246x300.jpg" alt="interactive 3d map copy" width="246" height="300" /></a></strong></p>
<p><strong><strong><em><strong><span>image from Parsons Journal of Information Mapping Volume 2 (Hight/Wehby)</span></strong></em></strong></strong></p>
<p><strong><strong><span>I wrote an essay a few years ago for the Sarai Reader questioning the traditional map and its semiotics, and the need to reconsider them &#8211; then did work looking into it and what those dynamics were, and it got into 2 group shows in museums in Russia&#8230;so it actually was my arc toward modulated mapping&#8230;an interesting way to it! But yes, the map itself..this is a huge area of potential, and not screen-based navigation alone etc. I see now that my 2 dozen or so essays in LM, AR, interface design and augmentation have all been leading in this direction for about 10 years now</span></strong></strong></p>
<p><strong><strong>Tish Shute: </strong>I love immersive visualization, but can we &#8220;return to the map &#8211; the internet of data,&#8221; as you mentioned earlier, and produce interesting augmentation experiences that go beyond locative media&#8217;s device-display mode without the goggles &#8211; for example, through the magic lens of our smartphones?</strong></p>
<p><strong><strong>Jeremy Hight: yes, absolutely. the map in the older paradigm is an artifice born often of war and border dispute, and not of the earth itself and its processes&#8230;the new mapping like google maps is malleable, can be open source, can read spaces and can be layers of info in the related space, not plucked from it as in the past..this is amazing. the old map also was born of false semiotics/semantics like &#8220;discovery of new lands&#8221; or &#8220;pioneer&#8221; while the places were there already and names often were of empire&#8230;now this is no longer the case</strong></strong></p>
<p><strong><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/modulatedmapping2.jpg"><img class="alignnone size-medium wp-image-4608" title="jeremy map small2 copy" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/modulatedmapping2-300x233.jpg" alt="jeremy map small2 copy" width="300" height="233" /></a></strong></p>
<p><strong><strong>Tish Shute:</strong> </strong>So geoAR is a better way to express a new social relationship to mapping? And how does this fit into the evolving arc of locative media into augmented reality?</p>
<p><strong><strong>Jeremy Hight:&#8230;early lm was mostly geocaching and drawing with gps..it took new paradigms to invigorate the field. a lot of folks focus on tools and what already is; cross-pollination can ground ideas that are more radical&#8230;a metaphor in a sense to place what can be in a familiar context.</strong></strong></p>
<p><strong><strong>Tish Shute:</strong> </strong>One of the great disappointments in VR has been its isolation from networked computing, and also, up to now, augmented reality &#8211; to achieve an immersive experience with tight registration of media/graphics you have to create a separate system isolated from the internet and the power of the web.</p>
<p><strong><strong>Jeremy Hight: yes&#8230;.this will change. vr is to me an island, but ar takes a part of it and shifts the paradigm, and new things open this way. Do you know the project <a href="http://www.lifeclipper.net/EN/process.html" target="_blank">&#8220;life clipper&#8221;</a>? friends of mine..doing interesting things..they are a clear bridge between lm and ar&#8230;.and from vr</strong></strong></p>
<p><strong><strong>in ar, augmentation and what is being augmented become fused, or in collision, or in complex interactions, as a means to a larger contextualization and exploration of what is being augmented..this is true in immersive or non-immersive ar&#8230;.huge potential</strong></strong></p>
<p><strong><strong>vr is a space, and now can be surgery, which is amazing, but not layered interaction, thus an island; and graphic iconography on a location can use symbolic icons, which opens up even more layers (the graphic designer/information designer in me talking there, I suppose..)</strong></strong></p>
<p><strong><strong>Tish Shute:</strong> </strong>Yes! Talk to me more about layers and channels. I think this is one of the most interesting questions for me in augmented reality at the moment &#8211; what can we do with layers and channels, and what new possibilities for connections between people and environments can these create?</p>
<p>The ability for anyone to post something is critical to the distributed idea, but one of the reasons I am so excited by Google Wave is that I am fascinated by the playback function. How do you think this will enable new forms of collaborative locative narratives? (<a href="http://snarkmarket.com/2009/3605" target="_blank">nice post on Wave playback here</a>)</p>
<p><strong><strong>Jeremy Hight: We are in an age of cartographic awareness unseen in hundreds of years. When was the last time that new mapping tools were sold in chain stores and installed in most vehicles? When was the last time that the augmentation of maps was done by millions (Google map hacks, etc)? Ubiquitous gps maps run in automobiles while people post pictures and graphic pins to denote specific places on online maps.</strong></strong></p>
<p><strong><strong>The need is for a tool that combines all of these new elements into an open-source, intuitive, layered and rhizomatic map that is porous (like pumice, organic in form yet with &#8220;breathing room&#8221;), ventilated (i.e. adjustable, a flow in and out), and open (open source, open access, open spatialized dialog).</strong></strong></p>
<p><strong><strong><span>I wrote of this in my essay &#8220;Revising the Map: Modulated Mapping and the Spatial Interface&#8221; (</span></strong><a id="h0qr" title="http://piim.newschool.edu/journal/issues/2009/02/pdfs/ParsonsJournalForInformationMapping_Hight-Jeremy.pdf" href="http://piim.newschool.edu/journal/issues/2009/02/pdfs/ParsonsJournalForInformationMapping_Hight-Jeremy.pdf"><span>http://piim.newschool.edu/journal/issues/2009/02/pdfs/ParsonsJournalForInformationMapping_Hight-Jeremy.pdf</span></a>)</strong></p>
<p><strong><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/modulatedmapping3.jpg"><img class="alignnone size-medium wp-image-4609" title="jeremy map small2 copy" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/modulatedmapping3-300x206.jpg" alt="jeremy map small2 copy" width="300" height="206" /></a></strong></p>
<p><strong><em><strong><span>image from Parsons Journal of Information Mapping (Hight/Wehby)</span></strong></em></strong></p>
<p><strong><strong>Tish Shute:</strong></strong> One mapping project I really like is <a href="http://themannahattaproject.org/" target="_blank">Mannahatta</a>.Â  How could distributed AR contribute to a project like <a href="http://themannahattaproject.org/" target="_blank">Mannahatta</a>?</p>
<p><strong><strong>Jeremy Hight: that is a good example..imagine taking Manhattan and having channels of options to overlay, that being an excellent option, and imagine being able to even run a few at once with delineating icons..you can augment a space with history, data, erasure, narrative, scientific analysis, a timeline of architecture, infrastructure, the archaeological record etc&#8230;.endless possibilities, and this agitates place, and place on a map, into an active field of information with end-user control&#8230;and open options for new layers</strong></strong></p>
<p><strong><strong>Tish Shute: </strong></strong>and do you think we could do interesting things with AR on a project like Mannahatta even with the current mediating devices we have available &#8211; i.e. our smartphones &#8211; as obviously the rich PC experience Mannahatta has built for its web interface would not be available as AR at this point?</p>
<p><strong><strong>Jeremy Hight: yes&#8230;.k.i.s.s, right? these projects do not have to be only immersive and graphics-intensive&#8230;&#8230;take how people upload photos onto google maps&#8230;.just make that one of a menu of options; there are some pretty cool hacks already..<br />
&#8230;options is key, and a space can have a community as well, building on it in software, and others navigating it. i see it near future and down the road..always have with ar really</strong></strong></p>
<p><strong><strong><a href="../wp-content/uploads/2009/10/locativenarratives1.jpg"></a><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/locativenarratives1.jpg"><img class="alignnone size-medium wp-image-4596" title="locativenarratives1" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/locativenarratives1-230x300.jpg" alt="locativenarratives1" width="230" height="300" /></a><br />
</strong></strong></p>
<p><strong><em><strong><span>image from Volume Magazine (Hight/Wehby)</span></strong></em></strong></p>
<p><strong><strong>Jeremy Hight: and yes, a lot of people focus on AR&#8217;s limitations and processing-power needs as a major road block</strong></strong></p>
<p><strong><strong>Tish Shute:</strong> </strong>so do you see AR on smart phones adding any value to a project like Mannahatta?</p>
<p><strong><strong>Jeremy Hight: yes&#8230;that it can be integrated into other similar works and even disparate but cloud linked ones&#8230;so a place can be &#8220;read&#8221; in diff ways on the iphone&#8230;.beyond its map location, and more can be possible if you are there&#8230;others away, so it becomes channels of augmentation</strong></strong></p>
<p><strong><strong>Tish Shute:</strong> </strong>AR, like locative media, puts who you are, where you are, what you are doing, and what is around you center stage in the online experience, but it also &#8220;puts media out in the world.&#8221; People, I think, understand this well as a single-user experience, but we are only just beginning to think about how this will manifest as a social experience &#8211; could you explain more about modulated mapping as an experience of social augmentation?</p>
<p><strong><strong>Jeremy Hight: Modulated Mapping is a tool that will allow channels to be run along the map itself. This will allow one to view different icons and augmentations both as systems on the map and in deeper layers of information (photos, videos, animations, visualizations, etc) that can be turned on and off as desired. The different layers of icons and data may be history, dissent, artworks, spatialized narratives, and annotations developed communally based on shared interests, placed spatially and far beyond. The use of chat functionality in text or audio will be open in building mode and in mapping navigation/usage as desired. This also allows a community to develop or augment in the spaces on the earth. These nodes can be larger and open, or small and set by groups in their channel. The end result is an open-source sense of mapping that will also have a needed sense of user control, as one can select which layers of augmentation they wish to see and interact with at any time. It also will incorporate all the functionality of locative media in mapping software and mapping. In building mode and in map mode, icons will be coded to represent within channels (remember that the person using it has selected channels of augmentation from many, based on their current interests and needs). Icons will be coded as active to show work in progress in cities and across the globe, both to invite participation and to further agitate the map from the sense of the static, as action is visible even in its icons while people are working and community is formed in common interest/need.</strong></strong></p>
<p><strong><strong>locative media got a buzz for &#8220;reading&#8221; places&#8230;when I helped create locative narrative that was what blew me away back in 2001&#8230;that we could give places a voice by placing data from research and icons on a map&#8230;&#8230;this meant lost history or augmentation was possible as kind of voices of a place and its layers&#8230;&#8230;.I called it &#8220;narrative archaeology.&#8221; We now have tools that can push these ideas and concepts farther..much farther&#8230;and with a range beyond what was before, and then the map was just a tool&#8230;.but now we are returning to the map itself&#8230;..and this as place as much as marker..this is where ar takes the ball to use a bad metaphor</strong></strong></p>
<p><strong><strong>also that project could only work if you came to our spot of a 4 block augmentation and with us there to lend you our gear&#8230;we are far beyond that now but it had its place</strong></strong></p>
<p><strong><strong>Tish Shute:</strong> </strong>How do you see &#8220;in context&#8221; AR and something we might call &#8220;context aware&#8221; cloud computing models interacting?</p>
<p><strong><strong>Jeremy Hight: sure&#8230;and I must add that I have issues with cloud computing as much as it is a good idea..</strong>.</strong></p>
<p><strong><strong>Tish Shute:</strong> </strong>because of loss of autonomy?</p>
<p><strong><strong>Jeremy Hight: tivo is simply a hard drive&#8230;but it keyword reads and gives suggestions..that is the cro magnon link to what can be</strong></strong></p>
<p><strong><strong>Tish Shute:</strong> </strong>The nice thing about Wave is that, because of the Federation model, the cloud model and local store-your-own-data models should work together.<strong><strong><span> </span></strong></strong></p>
<p><strong><strong><span>Jeremy Hight: yes..that is better&#8230;..loss of autonomy also opens up the arbitrary which is the flaw of search engines as we know it&#8230;even Bing fails to me in that sense</span></strong></strong></p>
<p><strong><strong>Tish Shute:</strong> </strong>how do you mean, could you explain?</p>
<p><span> </span><strong><strong><span>Jeremy Hight: spiders cull from words but cull like trawlers at sea&#8230;. I tested Bing with very specific requests.. it spat out the same mass of mostly off topic results&#8230;.</span><br />
<span> I wonder if there is a way to cull from key words and topics from a user&#8230;not O</span>rwellian back end of course&#8230;but from their preferences, their searches etc..</strong></strong></p>
<p><strong><strong>Tish Shute:</strong> </strong>did you see the discussion on search in the AR Framework doc? AR search will be a massively important thing that will take a lot of intelligence and all sorts of algorithm development won&#8217;t it?</p>
<p><strong><strong>Jeremy Hight: It also has one area of key functionality that moves into more intuitive software. Upon continued usage, the </strong><strong style="color: black; background-color: #ff9999;">mapping</strong><strong> software will &#8220;learn&#8221; and search based on key words used and spheres of interest the user is </strong><strong style="color: black; background-color: #ff9999;">mapping</strong><strong> or observing as mapped and will integrate deeper data and types of animations, etc. into the map or will have them waiting to be integrated upon user approval as desired. Over time the level of sophistication of additions and of search intuition will increase dramatically. The search can also, if the user wishes, run in the back end while working in the </strong><strong style="color: black; background-color: #ff9999;">mapping</strong><strong> program, or in off time as selected while doing other tasks. It also can never be used if one is not interested. One of the key elements of this </strong><strong style="color: black; background-color: #ff9999;">mapping</strong><strong> is that it is not composed of a closed set or needs user hacks to augment, but instead is to evolve and deepen by user controls and desires as designed. Pre-existing data, visualizations and augmentations can be integrated with relative ease.</strong></strong></p>
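<p>The &#8220;learning&#8221; search described here can be approximated very simply: keep a running count of the key words a user searches, then boost results that overlap those counts. A hypothetical sketch (the class name and sample queries are invented, and a real system would be far more sophisticated):</p>

```python
# Minimal sketch of search that "learns" from a user's own key words.
from collections import Counter

class InterestProfile:
    def __init__(self):
        self.weights = Counter()   # word -> how often the user has used it

    def observe(self, query: str) -> None:
        # Accumulate key words from each search the user runs.
        for word in query.lower().split():
            self.weights[word] += 1

    def score(self, result_keywords) -> int:
        # A result scores higher the more it overlaps the user's history.
        return sum(self.weights[w.lower()] for w in result_keywords)

profile = InterestProfile()
for q in ["locative narrative maps", "narrative archaeology", "gps narrative"]:
    profile.observe(q)

results = {
    "a": ["narrative", "gps"],     # on-topic for this user
    "b": ["celebrity", "gossip"],  # the off-topic mass a generic engine returns
}
ranked = sorted(results, key=lambda r: profile.score(results[r]), reverse=True)
# ranked[0] == "a": the on-topic result rises as the profile deepens
```

<p>The same counts could be built from what the user is mapping rather than searching, and the scan could run in the back end exactly as described above.</p>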
<p><strong><strong>Tish Shute: </strong></strong>One of the things that Joe Lamantia points out about social augmented experiences is that they will operate across a number of different scales &#8211; conversation &gt; product design &amp; build team &gt; neighborhood / town fixing potholes &gt; global community for causes. How do designs for channels and layers change across these different social scales?</p>
<p><strong><strong><strong>Jeremy Hight:</strong> quote myself &#8230;&#8220;The &#8220;frontier&#8221; is often defined as the space just ahead of the known edge and limit, and where it may be pushed out deeper into the previously unknown. The frontier in the world of ideas is not the warm comfort of what has been long assimilated; and the frontier in the landscape is not of maps, but of places beyond and before them.</strong></strong></p>
<p><strong><strong>The border along what has been claimed is not only that of maps &#8211; it is of concepts, functions, inventions and related emergent industries. Ideas and innovations are like the cloud shape that briefly forms around a jet breaking the sound barrier, tangible yet not fully mapped into measure. It is when things are nailed down into specific entities, calibrated and assessed, that the dangers may inflict themselves &#8211; greed, competition, imitation, anger, jealousy, a provincial sense of ownership either possessed or demanded&#8221;. (from essay in Sarai reader). Otherwise channels and augmentation do not have to be socio-economically stratifying or defined by them. We built 34n for almost nothing on older tools.</strong></strong></p>
<div id="yqjj" style="text-align: left;"><strong><strong><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/dgznj3hp_1g3svj8fq_b.jpg"><img class="alignnone size-medium wp-image-4599" title="dgznj3hp_1g3svj8fq_b" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/dgznj3hp_1g3svj8fq_b-300x225.jpg" alt="dgznj3hp_1g3svj8fq_b" width="300" height="225" /></a></strong><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/dgznj3hp_1g3svj8fq_b.jpg"><span> </span></a></strong></div>
<p><strong><em><strong><span>image from 34north 118west (Spellman/Hight/Knowlton)</span></strong></em></strong></p>
<p><strong><strong>The ar that is not deep immersion can be more readily available and channels can be what end users need like the diversity of chat rooms or range of Facebook users among us.</strong></strong></p>
<p><strong><strong>I had two moments yesterday that totally fit what we talked about. I went to west hollywood book fair and traditional directions off of mapping for driving directions were wrong and we got lost&#8230;our friend could only get a wireless signal to map on itouch and we had to roam neighborhoods then we called a friend who google mapped it and we found we were a block away&#8230;.so a fast geomapping overlay with an icon for the book fair on some optional grid service or community would have made it immediate. Then at the book fair talked to a small press publisher who is trying to map works about los angeles by los angeles authors on a map..she was stunned when I told her it could be a kind of google map feature option</strong></strong></p>
<p><strong><strong>it also has great potential to publish and place writing and art in places..both for commentary and access. imagine reading joyce in chapters where it was written about and then another similar experience but with writers who published on a service into their city.</strong></strong></p>
<p><strong><strong><strong>Tish Shute:</strong></strong></strong> The challenge of shared augmented realities is not just a matter of shipping bits around, but also of how we will use channels and layers &#8211; to create and negotiate different, distributed perspectives, and understand a shared common core or expressions of dissent (this came up in an email conversation with <a href="http://www.oreillynet.com/pub/au/166" target="_blank">Simon St Laurent</a>).</p>
<p><strong><strong><strong>Jeremy Hight:</strong> well my example earlier could have been communal in a way too..a tribe sort of augmentation channeling &#8230;.like subscribing to list servs back in the day but of augmentation communities/channels, and for folks to build and use in shared live form, coordinating too</strong></strong></p>
<p><strong><strong><strong>Tish Shute:</strong></strong> </strong>one good thing though about building an open AR Framework is that as bandwidth/CPU/hardware gets better shared high def immersive experiences could be supported by the same framework..</p>
<p><strong><strong>Jeremy Hight: excellent</strong></strong></p>
<p><strong><strong><strong>Tish Shute:</strong> </strong></strong>were you thinking of the image recognition and tracking with this example?</p>
<p><strong><strong><strong>Jeremy Hight:</strong> yeah&#8230;.like scanning across a multi channeled google map augmentation with diff icons and their connected data&#8230;and poss social networking and file sharing even in that mode&#8230;and rastering etc&#8230;.could be cool with google wave </strong><strong><span>- on the map..then zooming in a la powers of ten..(eames film).</span></strong></strong></p>
<p><strong><strong>-</strong><strong><span>I have pictured variations of this for a few years now in my head like the example of my friends and I yesterday&#8230;we could have correlated a destination by icons in diff channels..one being lit events within lit channel in l.a map&#8230;maybe things streaming on it too&#8230;remote info and video etc&#8230; that would be awesome</span></strong></strong></p>
<p><strong><strong><strong>Tish Shute:</strong></strong></strong> So many of the ideas in your paper on modulated mapping (see <a href="http://piim.newschool.edu/journal/issues/2009/02/pdfs/ParsonsJournalForInformationMapping_Hight-Jeremy.pdf" target="_blank">here</a>) are brilliant use cases for shared augmented realities. Perhaps you could talk more about your ideas about locative narrative because this is something I think is at the core of the kinds of experiences that a distributed AR Framework would make possible?</p>
<p><strong><strong><strong>Jeremy Hight:</strong> on the project &#8220;34 north 118 west&#8221; we mapped out a 4 block area for augmentation of sound files triggered by latitude and longitude on the gps grid and map and the map on the screen had pink rectangles that were the &#8220;hot spots&#8221; where the augmentation had been placed.</strong></strong></p>
<div id="nwc6" style="text-align: left;"><strong><strong><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/dgznj3hp_0gg994bf9_b.jpg"><img class="alignnone size-medium wp-image-4600" title="dgznj3hp_0gg994bf9_b" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/dgznj3hp_0gg994bf9_b-300x225.jpg" alt="dgznj3hp_0gg994bf9_b" width="300" height="225" /></a></strong></strong></div>
<p><strong><em><strong><span>image of interactive map with map based augmentation connected to audio augmentation on site for 34north 118west (Spellman/Hight/Knowlton)</span></strong></em></strong></p>
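<p>The trigger mechanism behind those pink rectangles is easy to sketch: each hot spot is a latitude/longitude bounding box tied to a sound file, and a file plays when the walker&#8217;s GPS fix falls inside the box. The coordinates and file names below are invented for illustration, not the project&#8217;s actual data:</p>

```python
# Sketch of lat/long "hot spot" triggering for audio augmentation.
# Each entry: (min_lat, min_lon, max_lat, max_lon, sound_file) - invented values.
HOTSPOTS = [
    (34.045, -118.240, 34.047, -118.238, "railyard_1920s.mp3"),
    (34.047, -118.238, 34.049, -118.236, "depot_voices.mp3"),
]

def triggered(lat: float, lon: float) -> list:
    """Return the sound files whose bounding box contains the current GPS fix."""
    return [snd for (lo_lat, lo_lon, hi_lat, hi_lon, snd) in HOTSPOTS
            if lo_lat <= lat <= hi_lat and lo_lon <= lon <= hi_lon]
```

<p>A fix inside the first rectangle returns its sound file; outside every rectangle, nothing plays. Each box is an isolated trigger that resets on exit, which is the limitation Jeremy goes on to call the &#8220;bowling alley conundrum.&#8221;</p>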
<p><strong><strong>We researched the history of the area and placed moments in time of what had been there at specific locations &#8230;.I called this <a href="http://www.xcp.bfn.org/hight.html" target="_blank">&#8220;narrative archaeology&#8221;</a> as it allowed places to be &#8220;read&#8221; by their augmentations&#8230;info that was of the place beyond the immediate experience (diff types of info) that otherwise would be lost or only found in books or web sites elsewhere. there now are locative narratives around the world but they need to be linked. from humble origins &#8220;narrative archaeology&#8221; went on to be recently named one of the 4 primary texts in locative media which is pretty amazing to me&#8230;but it is growing</strong></strong></p>
<p><strong><strong>- the limitations then were what I called the &#8220;bowling alley conundrum&#8221; &#8211; the specific data had to reset like pins&#8230;..and was isolated&#8230;.this led me to think about ar back then and up to now. How these could lead to much more from that point, data that would be more layered, variable, fluid..yet still augmented place and sense of place and social networking within data and software</strong></strong></p>
<p><strong><strong><a href="http://34n118w.net/34N/" target="_blank">lifeclipper</a> to me is a bridge</strong></strong></p>
<p><strong><strong><strong>Tish Shute:</strong> </strong></strong>But Life Clipper is isolated from the internet currently, isn&#8217;t it?</p>
<p><strong><strong><span>Jeremy Hight: yes&#8230;ours was too.. that is what google wave makes possible.. our project only ran on our gear..in 4 blocks&#8230;with additional auxi</span>liary info online, and not malleable..but hey 2001 and all..</strong></strong></p>
<p><strong><strong><strong>Tish Shute:</strong> </strong></strong>so the sites for 34 north 118 west are still active though?</p>
<p><strong>Jeremy Hight: oh yeah!</strong></p>
<p><strong><strong><strong>Tish Shute: </strong></strong></strong>nice I really like sound augmentation &#8211; have you seen <a href="http://www.soundwalk.com/blog/tag/augmented-reality/" target="_blank">Soundwalk</a>?</p>
<p><strong><strong><span>Jeremy Hight: yes, very cool..</span> </strong><strong>we chose sound only as it fought the power of image..instead caused a person to be in a sense of two places and times at once</strong></strong></p>
<p><strong><strong>Tish Shute:</strong></strong> and in 2001 that was definitely a visionary project!</p>
<p>You must be very excited that finally the pieces are coming together to make this stuff scale!</p>
<p><strong><strong><strong>Jeremy Hight:</strong> I can&#8217;t even tell you!! it is funny..i have known that this would come..just waited and waited&#8230;</strong></strong></p>
<p><strong><strong>..knew it needed the right people and tools..</strong></strong></p>
<p><strong><strong><span>..so the bowling alley conundrum led me to develop my project shortlisted for the iss (international space station) as I thought a lot about how points and works are not to be isolated&#8230;but connected and should be flowing in diff parts of a map&#8230;.to open up perspective and connected augmentations, but also to think about the map again&#8230;not as a base only. then moved into my work with new ways to visualize time and it all really began to gel. The ideas first were published as an essay</span></strong><span> </span><a id="qw.2" title="(http://www.fylkingen.se/hz/n8/hight.html)" href="http://www.fylkingen.se/hz/n8/hight.html"><span>(http://www.fylkingen.se/hz/n8/hight.html)</span></a><span> </span><strong><span>and later my project blog</span></strong><span> (</span><a id="bp.b" title="http://floatingpointsspace.blogspot.com/)" href="http://floatingpointsspace.blogspot.com/%29"><span>http://floatingpointsspace.blogspot.com/)</span></a></strong></p>
<p><strong><strong><strong>Tish Shute:</strong> </strong></strong>One thing I noticed when I was reading your paper is how you have been exploring non-euclidean geometries. Could you explain how this is part of your idea of modulated mapping?</p>
<p><strong><strong><span>Jeremy Hight: Yes, this first came to me when my wife was reading to me from a book on the Poincare Conjecture and I was hit with a new way to measure events in time and after months of sketches, schematics and research came to see how it could also be connected to a geo-spatial web of projects and augmentations. It was published in the inaugural issue of Parsons School of Design&#8217;s Journal of Information Mapping which was an exciting fit.</span></strong><span><strong> I call it &#8220;Immersive Event Time&#8221;</strong>(</span><a id="o3rt" title="http://piim.newschool.edu/journal/issues/2009/01/pdfs/ParsonsJournalForInformationMapping_Hight-Jeremy.pdf)" href="http://piim.newschool.edu/journal/issues/2009/01/pdfs/ParsonsJournalForInformationMapping_Hight-Jeremy.pdf%29"><span>http://piim.newschool.edu/journal/issues/2009/01/pdfs/ParsonsJournalForInformationMapping_Hight-Jeremy.pdf)</span></a></strong></p>
<p><span><strong><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/dgznj3hp_4cxz57xgv_b.jpg"><img class="alignnone size-medium wp-image-4634" title="dgznj3hp_4cxz57xgv_b" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/dgznj3hp_4cxz57xgv_b-195x300.jpg" alt="dgznj3hp_4cxz57xgv_b" width="195" height="300" /></a></strong></span></p>
<p><span><strong><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/dgznj3hp_5g68k9ggh_b.jpg"><img class="alignnone size-medium wp-image-4635" title="dgznj3hp_5g68k9ggh_b" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/dgznj3hp_5g68k9ggh_b-300x225.jpg" alt="dgznj3hp_5g68k9ggh_b" width="300" height="225" /></a><br />
</strong></span></p>
<p><strong><strong>so the last 3 years I have been working on how it could all work as channels of augmentation, and building and navigation as open and community in a sense as well as ai capability that was the time work especially. how time as experienced within an event is not a time &#8220;line&#8221; but points on and within a form&#8230;.and how this model is better for visualizing events in time and documenting them. it actually sprang from reading a book on the poincare conjecture, popped a bunch of other stuff together so one could visualize an event in time as like being in the belly of a whale..with time as the ribs..and our measure of time as the skin&#8230;and moving within it&#8230;.hoping this will be used as educational tool</strong></strong></p>
<p><strong><strong>and this also can be tied to ar and map again&#8230;how documentation of important events can be kept within icons on a google map..then download varying visualizations based on bandwidth and desired format</strong></strong></p>
<p><strong><strong><strong>Tish Shute: </strong></strong></strong>I have been thinking about the new forms of social interaction/agency that these kinds of augmentations of space/place/time will create. It seems there are two poles &#8211; one is the area Natalie Jeremijenko explores of shifting social relations from institutions/statistics to real time/location based/interactions and new forms of social agency. The other pole is more like the cloud based AI and perhaps crowd sourced machine learning.</p>
<p>Your ideas explore the possibilities of both these poles. And certainly one of the big deals of distributed AR integrated with Wave would be the possibilities it opened up both for new forms of networked social relationships and for new ways to draw on network effects.</p>
<p><strong><strong><strong>Jeremy Hight:</strong> and cross pollinations within &#8230;that is what my mind goes to</strong></strong></p>
<p><strong><strong><strong>Tish Shute:</strong> </strong></strong>The other night I met Assaf Biderman, MIT, from the <a href="http://senseable.mit.edu/trashtrack/" target="_blank">Trash Track</a> team. Trash Track doesn&#8217;t utilize AR but I could see that there are possibilities there.<br />
What do you think?</p>
<p><strong><strong><span>Jeremy Hight: yes, absolutely,</span> </strong><strong>there can be sort of skins on locations that user end selection can yield &#8230;like channels of place&#8230;.and can range from pragmatic core to art and play and places between&#8230;.how this recalibrates the semiotics of map&#8230;more than just augmentation as seen as a kind of piggy back on map..map becomes interface and defanged platform if you will, interestingly my more poetic/philosophic writing led me here too</strong></strong></p>
<p><strong><strong><strong>Tish Shute:</strong></strong></strong> I know they are at very different poles of the system but I do wonder how AR can bring some of the level of social agency/interaction that <a href="http://www.environmentalhealthclinic.net/people/natalie-jeremijenko/" target="_blank">Natalie Jeremijenko</a> works on into a productive interaction with the kind of innovations in machine learning that Dolores Labs and others are pioneering?</p>
<p><strong><strong><strong>Jeremy Hight:</strong> Natalie&#8217;s genius to me is in practical functional tech that also opens deeper questions and even new openings of what is needed..amazing layers in her work that way.. succinct yet deep..very deep</strong></strong></p>
<p><strong><strong><strong>Tish Shute: </strong></strong></strong>Yes &#8211; I am just writing a post about her work &#8211; I find it deeply moving the way she has delved into the possibilities of using technology to open us up to our world. One of the reasons I find distributed AR so interesting is because it will make it possible for all kinds of people to create and use augmentation in their lives and communities.</p>
<p>So to return to how a distributed AR framework could contribute to a project like Trash Track?</p>
<p><strong><strong><strong>Jeremy Hight:</strong> what about using it for community, dissent and awareness raising then? like Natalie&#8217;s work but building like a communal work of multiple points, like the old adage of the elephant and the blind men sorry..metaphor &#8211; like one of my points in immersive sight was how one could take augmentation as multiple works sort of turning the faces of a thing or place&#8230;and how this would make a larger work even in such a flow so people moving in a space could also build..</strong></strong></p>
<p><strong><strong>what of ar traces left as people move calibrated to user traffic and trash as estimated in an urban space&#8230;like it goes back to chris burden in the 70&#8217;s making you know that as you turn the turnstile you are drilling into the foundation and may be the one that collapses the building?</strong></strong></p>
<p><strong><strong>so their movements leave trash. Natalie is all about raising awareness to cause and effect and data, space and ecology. love that. so maybe&#8230;<br />
a feedback loop, artifact and user end responsibility can leave traces&#8230;trash&#8230;</strong></strong></p>
<p><strong><strong>.. cybernetics vs ecology and human waste</strong></strong></p>
<p><strong><strong><strong>Tish Shute: </strong></strong></strong>could you elaborate?</p>
<p><strong><strong><strong>Jeremy Hight:</strong> brain fart&#8230;that the mass of trash people leave is a piece at a time&#8230;.and how like the space shuttle mission when it was argued the first true cybernaut occurred&#8230;.one cord to air for astronaut..one for computer on their back to fix broken bay arm&#8230;if there is a way to build on that and in relation to the topic&#8230;..how this can go further, that machines do not waste as much&#8230;as ar is a means to cybernetic raise awareness..eh..</strong><strong> sensors etc&#8230;wearables too &#8211; could be eco awareness with data and machine and human</strong></strong></p>
<p><strong><strong>what about a cloud computing system with a slight ai in the sense of intuitive word cloud and interest scans&#8230;..so as one moves through say new york they can be offered new ai data and services as they move? could also be of eco interests? concerns about urban farming, eco waste, air pollution etc&#8230;.perhaps with (jeremijenko element here) sensors placed in locations and these also giving data reads in public areas with no input but hard data itself&#8230;&#8230;hmm..could be interesting</strong></strong></p>
<p><strong><strong>it can also give info of the carbon footprints (estimated prob unless data is public record somehow) of chain businesses and data on which are more eco friendly as well as an iconography color coded and icon coded to the best places to go to support greening and eco friendly business? and the companies could promote themselves on this service to attract eco aware customers who would be seeing them as kindred spirits and helping the<br />
larger effort?</strong></strong></p>
<p><strong><strong>kind of eco mapping..and ar on mobile app</strong></strong></p>
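<p>The color coded eco iconography could be as simple as bucketing an estimated footprint into an icon color. A minimal sketch &#8211; the thresholds, business names, and figures below are all made up, and real footprint data would have to come from public records or estimates as noted above:</p>

```python
# Sketch of mapping estimated carbon footprints to color-coded map icons.
def eco_icon(footprint_tonnes: float) -> str:
    """Bucket an estimated annual footprint (invented thresholds) into a color."""
    if footprint_tonnes < 50:
        return "green"    # eco friendly: promoted to kindred-spirit customers
    elif footprint_tonnes < 200:
        return "yellow"
    return "red"

# Invented example businesses with estimated footprints in tonnes CO2/year.
businesses = {"corner cafe": 30.0, "chain store": 450.0}
icons = {name: eco_icon(t) for name, t in businesses.items()}
```

<p>A mobile AR app would then draw each business&#8217;s icon in its bucketed color, so greener places stand out on the map at a glance.</p>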
<p><strong><strong>what about sensors that read air pollution levels, levels of solar radiation (to aid with skin protection in shifting light values in a city space..ie put on some skin cream now&#8230;), light sensors that detect density and over density in public spaces&#8230;to use the old trope in art of reading crowds in a space..but instead could indicate overcrowding, failing infrastructure in public spaces (which is a congestion that leads to greater pollution levels as well as flaws in city planning over time..), and perhaps a tie in to wearables&#8230;&#8230;worn sensors on smart clothes&#8230;.this could form a node network of people in the crowds&#8230;.and also send data within moving in a space&#8230;</strong></strong></p>
<p><strong><strong>here is a kooky thought&#8230; what of taking the computing power and data of people moving in a space..and not only get eco data and make available to them levels of<br />
data..but make possibly a roving super computer&#8230;crunching the deeper data of people open to this&#8230;&#8230;a hive crunching deeper analysis of the space, scan properties from sensors, and even a game theory esque algorithm of meta data if say 40 people out of 50 hit on a certain spike or reading&#8230;and even their input&#8230;..I worked in game theory for paleontology in this manner for a time as a teen&#8230;.a private project&#8230;&#8230; the reading can lead to a sort of meta read by what hits most consistently..as well as in their input..text of what they experienced, observed, postulated, analyzed even&#8230;. this could be really interesting&#8230;even if just the last part from collected data and not from any complex branching of servers..</strong></strong></p>
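<p>The &#8220;meta read&#8221; &#8211; 40 of 50 people hitting the same spike &#8211; amounts to a consensus filter over the crowd&#8217;s readings. A minimal sketch of that counting, with invented data:</p>

```python
# Sketch of a consensus "meta read" over crowd-sourced sensor readings.
from collections import Counter

def meta_read(readings, threshold=0.8):
    """Return readings reported by at least `threshold` of all participants."""
    counts = Counter(readings)
    total = len(readings)
    return {r for r, c in counts.items() if c / total >= threshold}

# 40 of 50 worn sensors flag the same pollution spike: that clears the bar.
crowd = ["spike"] * 40 + ["normal"] * 10
consensus = meta_read(crowd)
# consensus == {"spike"}
```

<p>The same tallying applies to the paleontology idea in the next paragraph: count which points recur across many papers and an emerging meta theory shows up as the readings that clear the threshold.</p>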
<p><strong><strong>I thought at 19 or so that the flaw in paleontology was in how so many larger theories were shifting exhibitions and larger senses of things like were there pre-historic birds that were mistaken for amphibian and then back again&#8230;.so why not make a computer program and feed all the papers published into it and see what hits were counted in terms of an emerging meta theory&#8230;and landscape of key points being agreed upon&#8230;this data would be in a sense both algorithmic and a sort of unspoken dialogue &#8230;came from a lot of study of game theory one summer&#8230;</strong></strong></p>
<p><strong><strong>hope this makes some sense&#8230;I forgot to mention that I originally planned to be a research meteorologist and my plan in middle school or so was to get a phd and develop new software to have a global map and then run models of hypothetical storms across it in real time animations of cloud forms, radar and wind analysis/fields, barometric pressure spaghetti charts etc&#8230;.and to also do 3d cut away models of storm architectures&#8230;so been into visualizations of complex data and mapping for a long time!</strong></strong></p>
<p><strong><strong><strong>Tish Shute:</strong> </strong></strong>Wow let me think about this one!</p>
]]></content:encoded>
			<wfw:commentRss>http://www.ugotrade.com/2009/10/13/ar-wave-layers-and-channels-of-social-augmented-experiences/feed/</wfw:commentRss>
		<slash:comments>18</slash:comments>
		</item>
	</channel>
</rss>
