<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>UgoTrade &#187; privacy and online identity</title>
	<atom:link href="http://www.ugotrade.com/category/participatory-culture/privacy-and-online-identity/feed/" rel="self" type="application/rss+xml" />
	<link>http://www.ugotrade.com</link>
	<description>Augmented Realities at the Edge of the Network</description>
	<lastBuildDate>Wed, 25 May 2016 15:59:56 +0000</lastBuildDate>
	<language>en-US</language>
		<sy:updatePeriod>hourly</sy:updatePeriod>
		<sy:updateFrequency>1</sy:updateFrequency>
	<generator>https://wordpress.org/?v=3.9.40</generator>
	<item>
		<title>Story Telling &#8211; the Art, Science, and Business of Data: Talking with Edd Dumbill about Strata, NYC, 2011</title>
		<link>http://www.ugotrade.com/2011/08/31/story-telling-the-art-science-and-business-of-data-talking-with-edd-dumbill-about-strata-nyc-2011/</link>
		<comments>http://www.ugotrade.com/2011/08/31/story-telling-the-art-science-and-business-of-data-talking-with-edd-dumbill-about-strata-nyc-2011/#comments</comments>
		<pubDate>Wed, 31 Aug 2011 18:51:52 +0000</pubDate>
		<dc:creator><![CDATA[Tish Shute]]></dc:creator>
				<category><![CDATA[Ambient Findability]]></category>
		<category><![CDATA[Android]]></category>
		<category><![CDATA[Augmented Reality]]></category>
		<category><![CDATA[Big Data]]></category>
		<category><![CDATA[data science]]></category>
		<category><![CDATA[Hadoop]]></category>
		<category><![CDATA[Instrumenting the World]]></category>
		<category><![CDATA[Linked Data]]></category>
		<category><![CDATA[mobile augmented reality]]></category>
		<category><![CDATA[mobile meets social]]></category>
		<category><![CDATA[Mobile Reality]]></category>
		<category><![CDATA[Mobile Technology]]></category>
		<category><![CDATA[New Interfaces]]></category>
		<category><![CDATA[online privacy]]></category>
		<category><![CDATA[Open Data]]></category>
		<category><![CDATA[privacy and online identity]]></category>
		<category><![CDATA[Real Time Big data]]></category>
		<category><![CDATA[social gaming]]></category>
		<category><![CDATA[ubiquitous computing]]></category>
		<category><![CDATA[websquared]]></category>
		<category><![CDATA[big data]]></category>
		<category><![CDATA[Bloom]]></category>
		<category><![CDATA[Business in the Age of Big Data]]></category>
		<category><![CDATA[Color]]></category>
		<category><![CDATA[data analytics]]></category>
		<category><![CDATA[data design]]></category>
		<category><![CDATA[data expression]]></category>
		<category><![CDATA[data Science]]></category>
		<category><![CDATA[Data Sift]]></category>
		<category><![CDATA[data story telling]]></category>
		<category><![CDATA[data visualization]]></category>
		<category><![CDATA[Edd Dumbill]]></category>
		<category><![CDATA[Google +]]></category>
		<category><![CDATA[Google Maps]]></category>
		<category><![CDATA[Google Plus]]></category>
		<category><![CDATA[GreenPlum]]></category>
		<category><![CDATA[infographics]]></category>
		<category><![CDATA[Kinect]]></category>
		<category><![CDATA[Media Sift]]></category>
		<category><![CDATA[Mike Kuniavsky]]></category>
		<category><![CDATA[Narrative Science]]></category>
		<category><![CDATA[Natural Language Generation]]></category>
		<category><![CDATA[OKCupid]]></category>
		<category><![CDATA[Quid]]></category>
		<category><![CDATA[Singly]]></category>
		<category><![CDATA[Somatic Data Perception]]></category>
		<category><![CDATA[Strata Conference]]></category>
		<category><![CDATA[Strata Summit]]></category>
		<category><![CDATA[The Locker project]]></category>
		<category><![CDATA[Visual.ly]]></category>
		<category><![CDATA[Visualizing Data]]></category>

		<guid isPermaLink="false">http://www.ugotrade.com/?p=6338</guid>
		<description><![CDATA[I&#8217;m really looking forward to the O&#8217;Reilly Strata events that are coming to NYC in a couple of weeks. I&#8217;m fascinated to see where the art, science, and business of data has gone since February, when I attended the first Strata Conference in Santa Clara &#8211; a sold out event imbued with an awareness that this [&#8230;]]]></description>
				<content:encoded><![CDATA[<p><iframe width="500" height="281" src="http://www.youtube.com/embed/sCmO8YKzv9U?feature=oembed" frameborder="0" allowfullscreen></iframe></p>
<p>I&#8217;m really looking forward to the<a href="http://strataconf.com/stratany2011"> O&#8217;Reilly Strata </a>events that are coming to NYC in a couple of weeks. I&#8217;m fascinated to see where the art, science, and business of data has gone since February, when I <a href="../../2011/01/20/real-time-big-data-at-strata-2011-ambient-findability-geomessaging-augmented-data-and-new-interfaces/">attended the first Strata Conference in Santa Clara</a> &#8211; a sold out event imbued with an awareness that this was an important gathering of cognoscenti working on the next big thing.</p>
<p>Strata in New York City is a sequence of events: <a href="http://strataconf.com/jumpstart2011/">Strata JumpStart</a>, Sept. 19th; then<a href="http://strataconf.com/summit2011/"> The Strata Summit</a>, &#8220;The Business of Data,&#8221; Sept. 20th &amp; 21st; followed by the <a href="http://strataconf.com/stratany2011/">Strata Conference</a>, &#8220;Making Data Work,&#8221; Sept. 22nd &amp; 23rd.</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2011/08/Screen-shot-2011-08-28-at-7.15.41-PM.png" target="_blank"><img class="alignnone size-medium wp-image-6376" title="Screen shot 2011-08-28 at 7.15.41 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2011/08/Screen-shot-2011-08-28-at-7.15.41-PM-300x101.png" alt="" width="300" height="101" /></a></p>
<p><em><a href="http://strataconf.com/public/content/landing?_discount=adw&amp;cmp=kn-conf-st11-starta-terms" target="_blank">&#8220;The future belongs to those who understand how to collect and use their data successfully.&#8221;</a></em></p>
<p>Below is a transcript of a conversation I had last Friday with <a href="http://strataconf.com/stratany2011/public/content/about" target="_blank">Strata Program Chair, Edd Dumbill</a> about some of the highlights of the schedule from my perspective. However, I highly recommend taking a good look at <a href="http://strataconf.com/public/content/landing?_discount=adw&amp;cmp=kn-conf-st11-starta-terms" target="_blank">all that is planned through the three events</a> because there is a depth and breadth that could not be covered in one conversation.</p>
<p>The video opening this post is from <a href="http://visual.ly/about" target="_blank">Visual.ly</a> &#8211; a start-up making it easier for people to create, explore, share, and promote data visualizations and infographics.</p>
<h3>Talking with Edd Dumbill</h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2011/08/edddumbillheadshot.png"><img class="alignnone size-full wp-image-6391" title="edddumbillheadshot" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2011/08/edddumbillheadshot.png" alt="" width="150" height="150" /></a></p>
<p><strong>Tish Shute:</strong> It seems a dialogue between the art of data and the science of data is going to be center stage at Strata NYC, and there will be much discussion about story telling with data.</p>
<p>Is that observation correct or is there something else going on there?</p>
<p><strong>Edd Dumbill:</strong> No, I think that&#8217;s a great characterization.  For the <a href="http://strataconf.com/summit2011/" target="_blank">Summit</a>, the core realization for me has been that when you have these tools for getting value from data and when you can drive what you&#8217;re doing by data, then actually, the biggest consequences are human ones, and they are organizational ones, and they are strategic ones once you have the technology in place.</p>
<p>So what the summit is doing is really looking at how, in a variety of industries and governments, and within disciplines within those, the amount of data, and the ease with which it can be communicated and mined, is changing the way industry is shaped.</p>
<p><strong>Tish Shute: </strong> Also, I noticed that the <a href="http://strataconf.com/summit2011/public/schedule/full" target="_blank">Strata Summit Schedule</a> (Sept 20th &amp; 21st), and even through to the <a href="http://strataconf.com/stratany2011/" target="_blank">Strata Conference</a> (Sept 22nd &amp; 23rd), has more of an emphasis on pop culture; sports &#8211; baseball, dating &#8211; OKCupid, and Narrative Science, all have a place on the schedule, for example?</p>
<p>Is this the culture of New York City being reflected &#8211; interests in media and marketing, or is there something else going on?  Has the data tool stack matured since the Strata Conference in Silicon Valley at the beginning of the year?</p>
<p><strong><br />
Edd Dumbill</strong>:  Yes, there&#8217;s certainly a different flavor to the event because we&#8217;re in New York.  And, yes, the tool stack has matured, but it is by no means mature, and the maturity&#8217;s only coming at the lowest level.</p>
<p>I think there&#8217;s many years left in maturing the tool stack.  But one of the beauties of big data is that once you have the data together, the algorithms to get value from it initially are pretty simple.</p>
<p>So, focusing on the stories of success of being data driven, particularly in the Summit, is important to us because the two questions people are asking are, &#8220;One, I&#8217;ve got data.  Two, what do I do with it?&#8221;  We don&#8217;t need to make the argument that data is important anymore.  But we do need to demonstrate what you can do with it.</p>
<p>The data isn&#8217;t necessarily big; it&#8217;s just there.  It&#8217;s about having an analytical approach to your business that complements your intuition, and complements your vision.</p>
<h3>&#8220;One of the most powerful ways of presenting data to people is in a story,&#8221; Edd Dumbill</h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2011/08/NarrativeScience.png"><img class="alignnone size-full wp-image-6351" title="NarrativeScience" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2011/08/NarrativeScience.png" alt="" width="260" height="218" /></a></p>
<p><strong>Tish Shute:</strong> Yes I can see the emphasis in the schedule on how to tell meaningful stories with data. <a href="http://www.narrativescience.com/" target="_blank">Narrative Science</a> seem to be doing something very interesting re turning data into stories?</p>
<p><strong>Edd Dumbill: </strong>Yes. They absolutely fascinate me with what they do.  There&#8217;s this kind of hierarchy and sort of chain of needs right now where business is going, &#8220;We need data scientists.  Find me data scientists.  Train me data scientists.  Hire me data scientists.&#8221;  And the data scientists are all going, &#8220;I need visualization.  I&#8217;ve got this data, I now need to turn it back into a story that&#8217;s going to be useful to people or provide interfaces that are going to help people understand and explore this,&#8221; because it doesn&#8217;t scale to have to have an interpreter all the time between the data and the results.</p>
<p>You need to be able to present it in a way that means something to people.</p>
<p>People can look at a graph and get many things out of it, maybe not even get anything at all out of it if they are not used to it.  But particularly for digesting certain kinds of high-level summaries and results, if you can put the data back into prose, it makes it very accessible to people.<br />
<strong><br />
Tish Shute:</strong> Natural Language Generation from data really opens up so many possibilities..</p>
<p><strong>Edd Dumbill:</strong> Yes, it&#8217;s interesting. I think it&#8217;s a very novel use.  A lot of people would consider that the end result of their data was a spreadsheet or a graph that they are processing.</p>
<p>But if you turn that back into a story, I think there&#8217;s a lot of potential of helping executives understand what&#8217;s going on. It makes it possible to use language to understand the results.<br />
<strong><br />
Tish Shute:</strong> I am really excited to see the emphasis on stories, data design and visualization, and the way we experience data is as much part of The Strata Summit and The Strata Conference as some of the more hardcore big data challenges and analytics stuff.<br />
<strong><br />
Edd Dumbill: </strong> Yes.  We are definitely ramping up on visualization.  And I think that&#8217;s going to become more important. Having a fundamental grasp of how to use graphics and charts is still incredibly core to what we&#8217;re saying.  But I&#8217;m also interested in ways that go beyond, because at least 50% of the point of visualization is to help people understand the dynamics of the data, to really augment their senses with the results of the computation.</p>
<p>You know, the people who are some of our best leaders, the ones who know how to ask the right questions of the data, have a sort of indefinable fingertip feel that you get for numbers when you live around them for a while.  And anything we can do with interfaces to accelerate this is going to be very beneficial, whether it comes to being visual and flying through the data or hearing it in natural language.</p>
<p><strong>Tish Shute:</strong> Have I missed anything in that in terms of what you&#8217;ve got on the schedule re visualization?  VisualizingData.com published <a href="http://www.visualisingdata.com/index.php/2011/08/data-viz-schedule-for-oreilly-strata-conference/">an ideal schedule from the visualizing data perspective</a>.  But have you added anything recently?</p>
<p><strong>Edd Dumbill: </strong> Well, there&#8217;s one event which isn&#8217;t actually listed on the schedule yet, which is on Tuesday night.  There&#8217;s a venue called <a href="http://www.eyebeam.org/">EyeBeam in New York</a>; we&#8217;re having a visualization showcase that evening.  So there will be stuff to walk around and then a few talks, really from some of the most interesting companies doing viz and viz approaches.  So that&#8217;s not up on the schedule yet, but that will be in addition.  It gives a nice focus on Tuesday night.</p>
<p><strong>Tish Shute:</strong> Oh, that&#8217;s super awesome.  I&#8217;ll definitely go to that.<br />
<strong><br />
Tish Shute:</strong> I am very interested in mobile social communications and augmented reality &#8211; especially augmented reality that feels different, not just looks different, as Kevin Slavin puts it.</p>
<p>I am excited to see people thinking about data not just in terms of visualization, but in other ways too, so that we can feel it through our secondary senses (see <a href="http://orangecone.com/archives/2011/05/somatic_data_pe.html">Mike Kuniavsky&#8217;s talk at ARE2011, &#8220;Somatic Data Perception&#8221;</a>).</p>
<p><strong>Edd Dumbill: </strong> Yes, absolutely.  That is where we view this as going.  I will be incredibly depressed if I&#8217;m still looking at the world through a glowing rectangle in 10 years&#8217; time.</p>
<p><strong>Tish Shute: </strong> Yes, it would be!  I am looking forward to see the new data start ups too.</p>
<p><strong>Edd Dumbill:</strong> Yes, there are a variety of interesting startups that I feel are particularly important in the data space.  <a href="http://mediasift.com/">Media Sift</a> and <a href="http://datasift.com/">Data Sift</a>, for example; Data Sift is doing a lot of real time processing on the Twitter fire hose.  They provide real time analytics on Twitter, which I think is very important.</p>
<p><strong>Tish Shute:</strong> In terms of using data to provision mobile experiences, real time is massively important, isn&#8217;t it?<br />
<strong><br />
Edd Dumbill:</strong> Absolutely.  Yes.</p>
<p><strong>Tish Shute: </strong> But real time data is still a big challenge, isn&#8217;t it?<br />
<strong><br />
Edd Dumbill: </strong> Yes.  I mean right now, our focus on real time is probably at the technology level.  Looking at real time, people are kind of building out the frameworks, companies like Media Sift and Data Sift creating parts of the experience.</p>
<p>And yes, our <a href="http://conferences.oreillynet.com/">Where 2.0</a> conference will be focused more on the mobile experience.</p>
<p><strong>Tish Shute: </strong>Re mobile experiences,<strong> </strong> I am very excited about <a href="http://www.infochimps.com/" target="_blank">Infochimps</a> and <a href="http://semanticweb.com/infochimps-adds-geo-apis-and-takes-a-shine-to-schema-org-too_b22613" target="_blank">their new geo APIs</a>, and sensor data is becoming such a big part of the picture now too. But the Kinect has also opened up a whole set of possibilities for the future of sensor data!</p>
<p><strong>Edd Dumbill:</strong> Yeah.  I still think Kinect is probably one of the most exciting things going down because of the democratization of that kind of capability.  Interesting things happen when the sensors become cheap, right?</p>
<p>When, alongside the little camera in your iPad, you have a Kinect sensor equivalent, that becomes extremely interesting, because everybody has it with them and can do things based off it.</p>
<p>So the things that always fascinate me are when it becomes cheap and hackable.<br />
<strong><br />
Tish Shute:</strong> And if Kinect went mobile, that would be exciting?</p>
<p><strong>Edd Dumbill:</strong> I think itâ€™s entirely likely in the next couple years, yes.</p>
<p>The more sensors we can start instrumenting our mobile and personal devices with, the more I think it&#8217;s going to result in novel uses far beyond anything we ever dreamed of.</p>
<p><strong>Tish Shute:</strong> There was a lot of hoo-ha about <a href="http://blogs.wsj.com/venturecapital/2011/06/15/after-seeing-green-color-is-black-and-blue/">Color</a> when they launched this year. They were unable to capture a user base, but if they had, issues of privacy might have come to the fore because they were really collecting more sensor data than any other app, right?</p>
<p>We are still waiting to see a breakthrough app in that area in terms of using all the phone sensors in ways that will really enhance a user experience rather than just the aims of data mining, aren&#8217;t we?</p>
<p><strong>Edd Dumbill:</strong> Yes.  I think this is one of the things where, in parallel, we&#8217;re really learning about the social and privacy implications of this kind of technology.  It seems to me the focus has shifted from the tech in the second half of the year too.  Frankly, everybody is getting kind of freaked out about the amount of data that&#8217;s being mined and, you know, what&#8217;s acceptable use for that.</p>
<p>But on a slightly more prosaic level, there are some rather fabulous things being done.  If you look at the Google Maps navigation experience on an Android phone, for instance, there are some very practical applications of sensors collecting data, with traffic and a variety of other augmentations going into that to actually do something useful.</p>
<p>So maybe we&#8217;d like to think we carry our sixth sense around with us in our pocket, and maybe we will.  But we certainly can in our car right now with all the automatic rerouting and so on.  That&#8217;s slightly more prosaic, but I think a lot more significant in terms of a pattern of how that can be applied.<br />
<a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2011/08/Singly.png"><img class="alignnone size-thumbnail wp-image-6367" title="Singly" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2011/08/Singly-150x150.png" alt="" width="150" height="150" /></a><br />
<strong><br />
Tish Shute:</strong> One of the startups that really excited me in February at Strata Santa Clara was <a href="http://singly.com/">Singly</a> and <a href="http://lockerproject.org/">The Locker project.</a> They are really thinking innovatively in the area of putting people at the center of their data.</p>
<p>I am looking forward to seeing the fruition of that work.  And, while I&#8217;m enjoying Google +, it seems, we are just sort of holding up our hands and saying, &#8220;Well, there&#8217;s only one business model for data, and that is a centralized Fort Knox,&#8221; isn&#8217;t it?  Or is there something that I&#8217;m missing?</p>
<p><strong>Edd Dumbill:</strong> You&#8217;re right.  I mean I think Google +, for instance, rather than a walled garden, is a hedged garden.  You know, there is a certain barrier there that I think is more about the fact that you need to put certain barriers up to actually create a decent user experience in the first place.  I think user experience is one of the BIG problems with open data, and private data, to be honest.</p>
<p>There&#8217;s a reason we are not all writing PGP encrypted emails to each other, right?  Because it&#8217;s so hard to make a UI for encryption that&#8217;s safe.  Most people don&#8217;t use passwords properly.  And I think a lot of the same user experience considerations come into this whole data thing.</p>
<p>Facebook can get away with anything they want to because have you ever tried using their privacy settings?  Google, I think, more than anybody has tried to address this issue using sensible defaults, making the explanations clear.  And they probably succeeded for a geek tech audience.</p>
<p>So I honestly think, probably, Locker&#8217;s biggest challenge, in that kind of approach, is definitely UI and giving the concept to the users so they can understand it.</p>
<p>But there&#8217;s certainly a very useful contribution to this conversation.</p>
<p>I think there are parallels in blogging, actually.  There is a case where people have information they want to disseminate.  And do you choose to do it on your own website, set everything up, publish for yourself, host for yourself, so you have complete control, or do you cede control, for convenience, to Blogger or Tumblr, knowing that you are being monetized somehow and that you&#8217;re playing in somebody else&#8217;s walled garden and don&#8217;t have that control?</p>
<p>So I haven&#8217;t really expanded that thought too much, but I think there&#8217;s something there in following that along and seeing where that actually leads.</p>
<p>But, you know, there is a whole technical challenge as well.</p>
<p>I really like the idea of being able to give permission to people. Being able to say, well, &#8220;I&#8217;m engaging you to do X, Y, Z in return for such and such. That seems like a good bargain to me. Giving up my data is a decent bargain for the services I&#8217;m getting back.&#8221; I mean that&#8217;s generally the contract we make in real life with people anyway.</p>
<p>That&#8217;s another thing re Google+ &#8211; why it&#8217;s a promising approach. At least in their rhetoric, they&#8217;re trying to say, well, &#8220;We&#8217;re trying to model this on the real life economy, the economy of real life interactions.&#8221;</p>
<p><strong>Tish Shute:</strong> Yes. Any movement towards saying, well, &#8220;I&#8217;m not just collecting your data randomly, I&#8217;m collecting this data because I want to give something back to you that will enhance your interactions,&#8221; definitely feels like an improvement, doesn&#8217;t it?<br />
<strong><br />
Edd Dumbill:</strong> Yes. I think that bargain is clear. I&#8217;m just fascinated by who could be trusted and&#8230; I do actually wonder if, rather than necessarily everything being decentralized like Locker suggests, there might be an idea of a variety of inter-operating, trusted identity brokers. People who we would actually trust. Banks, right? We do that right now. Banks are pretty much our identity brokers. Who knows?</p>
<p><strong>Tish Shute:</strong> I think that is where the Locker project&#8217;s going with Singly, isn&#8217;t it? Isn&#8217;t Singly the trusted broker for the Lockers?</p>
<p><strong>Edd Dumbill: </strong>Yes. Now the question is whether you trust a startup with that or whether you&#8217;re going to trust&#8230; I mean, who knows? Trust levels are at such all-time lows with everybody right now. People in America won&#8217;t trust the government. I think Google are probably one of the most trusted brokers out there online.<br />
<strong><br />
Tish Shute:</strong> Perhaps, that&#8217;s interesting, isn&#8217;t it?<br />
<strong><br />
Edd Dumbill:</strong> I did write a piece which kind of speculated that Google may become some sort of central broker of social information and kind of a platform.</p>
<p><strong>Tish Shute:</strong> Oh, yes, <a href="http://radar.oreilly.com/2011/07/google-plus-social-backbone.html"><strong>&#8220;Google+ is the social backbone&#8221;</strong></a> &#8211; a very thought-provoking piece! It deserves an interview on its own!</p>
<p>But back to the Strata schedule!  I notice you have DePodesta doing the Moneyball talk, right? What&#8217;s the 2011 twist on Moneyball?</p>
<p><strong>Edd Dumbill: </strong>I think the twist on that is that there are a lot more people who can play now, really, which is why we&#8217;re having Strata in the first place. Ten years ago, the people doing this kind of stuff were McDonald&#8217;s and Walmart and sports teams. Wherever there was large money, they could afford to gather the data. Maybe they could try this service out in making decisions based on it.</p>
<p>Well, we&#8217;re now in a very instrumented society where every business, every person has instrumented data about their interactions. I think the kind of resistance and dynamics and opinions that Moneyball brought up are the ones that people are going to be facing again right now as they seek to be more data-driven in what they&#8217;re doing.</p>
<p>It&#8217;s also very interesting to know, 10 years on, what do you think? You&#8217;ve had 10 years of this, of sort of sabermetrics and so on. Have you matured in your view, have you softened?</p>
<p>What I&#8217;m endlessly and ultimately fascinated by is, where does this fit in the decision process and in the organization tree? Where does it mesh with vision?</p>
<p>Steve Jobs achieved it perfectly. He had vision and all kinds of things for his products. But Apple succeeded through a relentless operational efficiency. Absolutely relentless in their suppliers, their supply chain, their manufacturing lines, down to the detail. They are an utterly data-driven, process-driven organization at the same time as melding that with vision, design values and good quality. That&#8217;s a case where it worked together.</p>
<p>I&#8217;m eager to try and tease it out, figure out how that really works and how those things come together.<br />
<strong><br />
Tish Shute: </strong> And that&#8217;s another thread I see being explored at Strata, NYC.  It&#8217;s not human versus machine or machine trumps human, but it&#8217;s human with machine.  This is another theme, isn&#8217;t it?<br />
<strong><br />
Edd Dumbill: </strong>Exactly. We all operate by feedback loops. Really, what machines are doing enables us to get better quality data and in a tighter feedback loop.</p>
<p><strong>Tish Shute: </strong> One feedback loop that we&#8217;re finding machines very useful for is understanding how we feel. I think that&#8217;s really interesting.</p>
<p><strong>Edd Dumbill: </strong>Yes. I&#8217;m very fascinated by all the quantified-self stuff and where that can take us. At the end of the day, we have a very personal little organization to deal with, which is ourselves.<br />
<strong><br />
<a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2011/08/Quid.png"><img class="alignnone size-medium wp-image-6369" title="Quid" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2011/08/Quid-300x182.png" alt="" width="300" height="182" /></a><br />
<a href="http://quid.com/" target="_blank"><em>Quid: Building Software and Mathematical Solutions to Simplify Complex Decisions</em></a></strong></p>
<p><strong>Tish Shute:</strong> Yes! But the thing is we don&#8217;t understand ourselves in isolation, do we?   I am definitely going to attend the session by Sean Gourley, CTO of <a href="http://quid.com/" target="_blank">Quid</a>, on semantic clustering analysis.  It seems like sentiment analysis is going big-time now, isn&#8217;t it?<br />
<strong><br />
Edd Dumbill: </strong>Yes. I mean, sentiment analysis is actually becoming a checkbox feature in databases now. The latest release of <a href="http://www.greenplum.com/">Greenplum</a> has it built in. It&#8217;s that kind of level of feature that people want, as social data is so important. Of course a lot of this is being driven by marketing and advertising.</p>
<p><strong>Tish Shute: </strong> Yes, but even re marketing, data story telling has been taking some interesting and quirky turns, hasn&#8217;t it?<br />
<strong><br />
Edd Dumbill: </strong>Yes, absolutely. I think there&#8217;s a lot of interesting research ahead of us there as well.</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2011/08/OKCupid.png"><img class="alignnone size-medium wp-image-6370" title="OKCupid" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2011/08/OKCupid-273x300.png" alt="" width="273" height="300" /></a><br />
<em><a href="http://blog.okcupid.com/">OKCupid Trends</a></em></p>
<p><strong>Tish Shute:</strong> <a href="http://www.okcupid.com/">OkCupid</a> is a very interesting example of data story telling that leverages our desire to know ourselves, and ourselves in relation to others.<br />
<strong><br />
Edd Dumbill:</strong> Yes. I mean they&#8217;re an example of a shift that&#8217;s happening in the PR industry, actually, which is companies understanding that telling marketing stories with data is very, very compelling. OkCupid really used that to hit well above their weight. Of course they got acquired as a direct result of that and their profile.</p>
<p><strong>Tish Shute:</strong> I know OKCupid got acquired by Match.com, but you were saying they hit above their weight by using this analysis? How did that work?<br />
<strong><br />
Edd Dumbill:</strong> I think a lot of it&#8217;s down to their blog. That they analyze these things, publish them on their blog. It got a lot of attention, generated a lot of media stories, which brought them to Match.com&#8217;s attention. There&#8217;re millions of &#8211; well, a large number of dating sites. But they differentiated themselves through the smart use of their data.</p>
<p><strong>Tish Shute:</strong> Data and Games is an area I am very interested in.  Zynga changed the game with game analytics and social games. And now we are seeing Rovio partner with <a href="http://medio.com/">Medio</a> for analytics (see <a href="http://radar.oreilly.com/2011/08/angry-birds-data-hp-daily-dot.html">Green pigs and data</a>).  But I noticed that you don&#8217;t have games as a strong theme on the schedule?</p>
<p><strong> Edd Dumbill: </strong>I think you&#8217;ll see more of that on the West Coast, to be honest. It&#8217;s not that we&#8217;re not interested. I just feel that the center of gravity to that topic is probably back on the West at the moment.</p>
<p><strong>Tish Shute:</strong> So what&#8217;s after Zynga in terms of game analytics? A nice easy question!<br />
<strong><br />
Edd Dumbill:</strong> Sure. Let me predict the future for you.</p>
<p><strong>Tish Shute:</strong> Yes please do!</p>
<p><strong>Edd Dumbill:</strong> I don&#8217;t know, to be honest. One of the very interesting things about games is that they help us understand the real world by modeling and playing around.  I&#8217;m highly fascinated to see some more of those things played out through real life actors.   There&#8217;s been some examples right out of <a href="http://www.scvngr.com/" target="_blank">SCVNGR</a> and whatnot. But if any of those techniques can really start to make a way into mobile technology, that&#8217;s one interesting thing.</p>
<p>What lessons can we take from what weâ€™ve actually learned in game analytics that are reproducible and useful elsewhere?</p>
<p>Gamification is a bit of a trend right now. I am slightly skeptical&#8230; But I am fascinated by a lot of systems that are having these game elements added to them.   And so the second question is, if you&#8217;re having game elements added to things &#8211; like losing weight or saving money or writing a book, I&#8217;ve seen that too &#8211; what can you apply from the analytics world on top of that, to learn about these systems and tweak them?</p>
<p>I don&#8217;t have that good of an answer for you. My own game is not steeped in that. But I am aware that there&#8217;s probably a lot of progress in games that has yet to be applied anywhere else.</p>
<p>Zynga and whatnot is kind of a space race, isn&#8217;t it, to monetize that.   Space races generate technologies that can be applied in a variety of places.</p>
<p>What are the spinouts of game analytics that we can actually use elsewhere?</p>
<h3>&#8220;These Bloom Instruments arenâ€™t merely games or graphics. They&#8217;re new ways of seeing what&#8217;s important.&#8221;</h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2011/08/cartagram.png"><img class="alignnone size-medium wp-image-6373" title="cartagram" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2011/08/cartagram-300x129.png" alt="" width="300" height="129" /></a><br />
<em><a href="http://cartagr.am/#10.00/40.8526/-74.6277">Cartagr.am</a> by Bloom</em></p>
<p><strong>Tish Shute:</strong> Last February, at Strata, I was very struck by the new work by Ben Cerveny and <a href="http://bloom.io/">Bloom</a> on &#8220;pop cultural instruments for data expression&#8221; (also see <a href="http://www.youtube.com/watch?v=HWDcc5gNVrE">Ben Cerveny&#8217;s talk at ARE2011</a>).<br />
<strong><br />
Edd Dumbill:</strong> Yeah. I love it every time a visualization comes onto a tablet&#8230; there&#8217;s an interesting back channel there.</p>
<p>And Google has done this in the extreme, to their great advantage. There&#8217;s a potential, when you read an e-book or interact with a visualization on a tablet, that it can learn from your interactions.</p>
<p>If you read an e-book, and the book is instrumented and sends stuff back, then the book can read you at the same time that you&#8217;re reading it. That kind of collective intelligence can then be harnessed.</p>
<p>So what if Bloom&#8217;s pop culture visualizations are instrumented so that they know how people are using them?   What can they learn from that, either about the quality of the visualization, or about what&#8217;s interesting in the data itself?</p>
<p>This is one of the fundamental principles, I think, of Web 2.0, and definitely of this era of big data that we&#8217;re in: the secondary signals, the exhaust from any electronic product, can be incredibly valuable.</p>
<p>We know that every time you use Google you are probably part of at least one experiment that they are running to find an optimum and improve their product. And how can you generalize that out?</p>
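The always-on experimentation Edd describes rests on a simple mechanic: each user is deterministically assigned to an experiment variant, so the behavioral "exhaust" can be compared across buckets. A minimal sketch in Python; the function name and variant labels here are invented for illustration, not anything from Google's actual infrastructure:

```python
import hashlib

def assign_bucket(user_id: str, experiment: str,
                  variants=("control", "treatment")) -> str:
    """Deterministically assign a user to an experiment variant.

    Hashing (experiment, user_id) gives a stable, roughly uniform
    assignment without storing any per-user state.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]
```

Because the assignment is a pure function of the user and experiment ids, the same visitor lands in the same bucket on every request, which is what makes the secondary signals comparable between variants.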
<p><strong>Tish Shute:</strong> I agree.  This is at the core of the art, science and business of data.  I hear your phone ringing, but do I have time for one more quick question?</p>
<p><strong>Edd Dumbill:</strong> Oh yes.<strong> </strong></p>
<p><strong>Tish Shute:</strong> So it sort of follows on from my previous question.  The relationship between crowd-sourced intelligence and machine intelligence has played a huge role in making data work and solving real-world problems &#8211; <a href="http://crowdflower.com/" target="_blank">CrowdFlower</a>, for example.</p>
<p>Where are we now with the relationship between the crowdsourcing power of, for example, CrowdFlower and Mechanical Turk and machine intelligence? Is there anything new going on here?<br />
<strong><br />
Edd Dumbill:</strong> What we&#8217;re actually starting to do is learn where to apply these tools. We&#8217;re reaching a point of understanding what crowdsourcing is for, how to better design crowdsourcing tasks, and so on, in innovative uses.</p>
<p>One of the things I am particularly excited about is Natala Menezes, who was at Amazon working on Mechanical Turk; she&#8217;s now moved to a company called <a href="http://gigwalk.com/" target="_blank">GigWalk</a>, which is a Turk platform that&#8217;s mobile.</p>
<p>So if you want to assign tasks that depend on people being in particular places and being able to do particular things, this is a platform for turking in that way, which I think is fascinating. That&#8217;s definitely a new approach.<br />
<strong><br />
Tish Shute:</strong> Yes, <a href="http://gigwalk.com/">GigWalk</a> is awesome &#8211; I saw that <a href="http://blogs.msdn.com/b/photosynth/archive/2011/07/19/get-paid-to-shoot-mobile-photosynths.aspx">Photosynth is partnering with GigWalk.</a> That is interesting &#8211; perhaps a step towards strong AR! (see <a href="http://www.wired.com/beyond_the_beyond/2011/05/augmented-reality-readwrite-world-at-are2011/" target="_blank">Read Write World and Blaise Aguera Y Arcas&#8217;s work on Photosynth was big news at ARE2011</a>).</p>
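The place-dependent task assignment behind this kind of mobile turking can be sketched in a few lines: filter open tasks by great-circle distance from the worker and hand back the nearest ones first. This is a hypothetical illustration; the task fields, radius, and function names are invented, not GigWalk's actual API:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometers."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371 * asin(sqrt(a))

def nearby_tasks(tasks, worker_lat, worker_lon, radius_km=5.0):
    """Return tasks within radius_km of the worker, nearest first."""
    ranked = sorted(
        (haversine_km(worker_lat, worker_lon, t["lat"], t["lon"]), t)
        for t in tasks
    )
    return [t for dist, t in ranked if dist <= radius_km]
```

The interesting part, relative to desktop Mechanical Turk, is exactly this filter: a task only exists for workers whose physical location makes it doable.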
<p><strong> Edd Dumbill:</strong> Natala will be talking about GigWalk.  I think the session is called quirky crowdsourcing. I want to call it Quirky Turks.</p>
<p><strong>Tish Shute:</strong> [laughs] I like that.</p>
]]></content:encoded>
			<wfw:commentRss>http://www.ugotrade.com/2011/08/31/story-telling-the-art-science-and-business-of-data-talking-with-edd-dumbill-about-strata-nyc-2011/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>Augmented Reality &#8211; Transitioning out of the old-fashioned &#8220;Legacy Internet&#8221;: Interview with Bruce Sterling</title>
		<link>http://www.ugotrade.com/2011/05/06/augmented-reality-transitioning-out-of-the-old-fashioned-legacy-internet-interview-with-bruce-sterling/</link>
		<comments>http://www.ugotrade.com/2011/05/06/augmented-reality-transitioning-out-of-the-old-fashioned-legacy-internet-interview-with-bruce-sterling/#comments</comments>
		<pubDate>Fri, 06 May 2011 22:23:38 +0000</pubDate>
		<dc:creator><![CDATA[Tish Shute]]></dc:creator>
				<category><![CDATA[Ambient Devices]]></category>
		<category><![CDATA[Ambient Displays]]></category>
		<category><![CDATA[Ambient Findability]]></category>
		<category><![CDATA[architecture of participation]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Augmented Data]]></category>
		<category><![CDATA[Augmented Reality]]></category>
		<category><![CDATA[Big Data]]></category>
		<category><![CDATA[culture of participation]]></category>
		<category><![CDATA[data science]]></category>
		<category><![CDATA[digital public space]]></category>
		<category><![CDATA[gestrural interface]]></category>
		<category><![CDATA[Instrumenting the World]]></category>
		<category><![CDATA[internet of things]]></category>
		<category><![CDATA[ipad]]></category>
		<category><![CDATA[iphone]]></category>
		<category><![CDATA[Mixed Reality]]></category>
		<category><![CDATA[mobile augmented reality]]></category>
		<category><![CDATA[mobile meets social]]></category>
		<category><![CDATA[Mobile Reality]]></category>
		<category><![CDATA[New Interfaces]]></category>
		<category><![CDATA[new urbanism]]></category>
		<category><![CDATA[privacy and online identity]]></category>
		<category><![CDATA[Real Time Big data]]></category>
		<category><![CDATA[Semantic Web]]></category>
		<category><![CDATA[Smart Planet]]></category>
		<category><![CDATA[social gaming]]></category>
		<category><![CDATA[social media]]></category>
		<category><![CDATA[Web 2.0]]></category>
		<category><![CDATA[Web Meets World]]></category>
		<category><![CDATA[websquared]]></category>
		<category><![CDATA[World 2.0]]></category>
		<category><![CDATA[Adam Greenfield]]></category>
		<category><![CDATA[AR and Experience Design]]></category>
		<category><![CDATA[AR hacks]]></category>
		<category><![CDATA[AR Magic]]></category>
		<category><![CDATA[are2010]]></category>
		<category><![CDATA[ARE2011]]></category>
		<category><![CDATA[Augmented Bollywood Reality]]></category>
		<category><![CDATA[augmented reality event]]></category>
		<category><![CDATA[Ben Cerveny]]></category>
		<category><![CDATA[Blaise Aguera y Arcas]]></category>
		<category><![CDATA[Bloom Studio]]></category>
		<category><![CDATA[Bruce Sterling]]></category>
		<category><![CDATA[Design Fiction]]></category>
		<category><![CDATA[Frank Cooper]]></category>
		<category><![CDATA[gestural interfaces]]></category>
		<category><![CDATA[gestural interfaces for augmented reality]]></category>
		<category><![CDATA[Jaron Lanier]]></category>
		<category><![CDATA[Jesper Sparre Andersen]]></category>
		<category><![CDATA[Jesse Schell]]></category>
		<category><![CDATA[Kinect]]></category>
		<category><![CDATA[Layar]]></category>
		<category><![CDATA[Marco Tempest]]></category>
		<category><![CDATA[Metaio]]></category>
		<category><![CDATA[Planetary]]></category>
		<category><![CDATA[TeleHash]]></category>
		<category><![CDATA[The Legacy Internet]]></category>
		<category><![CDATA[The Locker project]]></category>
		<category><![CDATA[Tish Shute]]></category>
		<category><![CDATA[Tom Carden]]></category>
		<category><![CDATA[Tomi Ahonen]]></category>
		<category><![CDATA[Total Immersion]]></category>
		<category><![CDATA[Vernor Vinge]]></category>
		<category><![CDATA[Will Wright]]></category>
		<category><![CDATA[William Gibson]]></category>

		<guid isPermaLink="false">http://www.ugotrade.com/?p=6252</guid>
		<description><![CDATA[Planetary from Bloom Studio, Inc. on Vimeo. It is just over a week until Augmented Reality Event, and I know there are a lot of people, including me (full disclosure I am co-chair and co-founder) who are totally psyched to see what unfolds there this year.  Bruce Sterling, Vernor Vinge, Blaise Aguera Y Arcas, Jaron [&#8230;]]]></description>
				<content:encoded><![CDATA[<p><iframe src="http://player.vimeo.com/video/23158141?title=0&amp;byline=0&amp;portrait=0" width="400" height="300" frameborder="0"></iframe></p>
<p><a href="http://vimeo.com/23158141">Planetary</a> from <a href="http://vimeo.com/bloomstudioinc">Bloom Studio, Inc.</a> on <a href="http://vimeo.com">Vimeo</a>.</p>
<p>It is just over a week until <a href="http://augmentedrealityevent.com/" target="_blank">Augmented Reality Event</a>, and I know there are a lot of people, including me (full disclosure: I am co-chair and co-founder), who are totally psyched to see what unfolds there this year.  Bruce Sterling, Vernor Vinge, Blaise Aguera Y Arcas, Jaron Lanier, Will Wright, Marco Tempest and Frank Cooper will join <a title="107 speakers from 76 augmented reality companies on a single stage" href="http://augmentedrealityevent.com/2011/04/24/107-speakers-from-76-augmented-reality-companies-on-a-single-stage/">107 speakers from 76 augmented reality companies on a single stage</a> (<a href="http://www.ugotrade.com/2011/04/13/augmented-reality-event-2011-bruce-sterling-vernor-vinge-will-wright-and-jaron-lanier-to-judge-the-auggies/" target="_blank">see my previous post</a>) to tell a momentous story of a technology of our time.</p>
<p>As Bruce Sterling points out, Augmented Reality is &#8220;<strong>truly a child of the twenty-teens, a genuine digital native,&#8221; </strong> and one visible indication that:</p>
<p><strong>&#8230;the Internet really could look like a &#8220;legacy.&#8221;  The Legacy Internet as an old-fashioned, dusty, desk-based place best left to archivists and librarians, while the action is out on the streets</strong> (see the full interview below)<strong>.<br />
</strong><br />
<a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2011/05/bruce-industrialdecline.jpg"><img src="http://www.ugotrade.com/wordpress/wp-content/uploads/2011/05/bruce-industrialdecline-300x225.jpg" alt="" title="bruce-industrialdecline" width="300" height="225" class="alignnone size-medium wp-image-6299" /></a><br />
(<em>photo by Jasmina Tesanovic</em>)</p>
<p>Opening this post is a video of Ben Cerveny&#8217;s <a href="http://planetary.bloom.io/">Planetary</a> app, which <a href="http://www.wired.com/underwire/2011/05/planetary-ipad-app/" target="_blank">&#8220;turns your music into a universe&#8221;</a> and enchants all who try it.  Planetary shot to #3 on the Top Ten Free iPad app list soon after its release.</p>
<p>Ben Cerveny&#8217;s talk at Augmented Reality Event will be one of the must-attend talks (<a href="http://augmentedrealityevent.com/schedule/" target="_blank">see the full schedule for Augmented Reality Event here</a>; note that my discount code for Augmented Reality Event, TISH295, is still good if you want to register).</p>
<p>Planetary, while it is not an AR experience, points the way for AR to take us out of the old-fashioned &#8220;Legacy Internet.&#8221;</p>
<p>&#8220;<a href="http://planetary.bloom.io/">Planetary</a> is just the sort of science fiction experience you expect when using an object from the future like <a href="http://www.wired.co.uk/topics/ipad">iPad</a>,&#8221; developer Bloom Studio writes on the app&#8217;s <a href="http://itunes.apple.com/us/app/planetary/id432462305?mt=8">iTunes page</a> (from <a href="http://www.wired.com/underwire/2011/05/planetary-ipad-app/" target="_blank">Mark Brown&#8217;s Wired post</a>).</p>
<p>In <a href="http://news.cnet.com/8301-13772_3-20058911-52.html" target="_blank">his interview with CNET&#8217;s Daniel Terdiman</a>, Ben describes how popular computing will evolve beyond those &#8220;<strong>dusty, desk-based places best left to archivists and librarians&#8221;</strong> (Bruce Sterling).</p>
<p>Ben points out:</p>
<p><strong>&#8220;The tablet is a total disruption of how we understand popular  computing. The next era of experiences will be driven by visceral  gesture-based input, and rich fluid responsiveness in native graphics  contexts. I see the potential for Bloom to help define a &#8220;killer  pattern&#8221; for application design. Because apps have been deconstructed  into discrete tasks that flow across devices&#8230;.&#8221;</strong></p>
<p>Bruce Sterling had some interesting comments on the Bloom app:</p>
<p><strong>I&#8217;m a big fan of Ben and his good works in infoviz &#8212; and urban informatics, too.  I admit I&#8217;m not sure that I entirely need the metaphor of a solar system in order to play a few Texas blues tracks.  But I could be persuaded.  Ben Cerveny is a significant thinker and a very well-spoken guy.</p>
<p>The thing I consider significant about that remarkable piece of Bloom software is that it uses information visualization as a new breed of control interface.  That&#8217;s not just fancy re-skinning of the same old music-machine pushbuttons. That whole graphic shebang is generated in real-time on the fly.  And you can run code with that, play music, do media with it!  An advance like that is important.</p>
<p>I said at Layar, two years ago, that Augmented Reality would become a real industry when you could design an Augmented Reality system with an Augmented Reality system.  Some people in the audience had startled, &#8220;what the hell? Why would we bother?&#8221; reactions to that notion.  This Bloom piece makes that concept more plausible.</p>
<p>Think of it this way:  if AR is &#8220;real-time interaction that combines virtual data with three-dimensional real spaces,&#8221; then why would you leave that environment, and go to some dusty flat Internet screen to get real work done?  Isn&#8217;t that rather like designing a website on graph paper?  Bloom &#8220;Planetary&#8221; is definitely not Augmented Reality, but it suggests an approach that AR would follow if AR was seizing its own means of production.  It means AR, through AR, by AR, for AR.</p>
<p>I&#8217;m not saying that happens tomorrow; I&#8217;m just saying, why not?  Why not aspire to that?<br />
</strong><br />
I too am a huge fan of the Bloom team &#8211; Ben Cerveny, Tom Carden, and Jesper Sparre Andersen (<a href="http://www.ugotrade.com/2011/02/10/jeremie-miller-the-locker-project-give-a-data-platform-to-the-people-in-the-era-of-data-everywhere-and-bloom-presents-fizz/" target="_blank">also see my post here about Fizz, the Bloom team&#8217;s app used by The Locker Project for their Strata demo</a>).  And, if you haven&#8217;t already heard about <a href="http://blog.lockerproject.org/welcome-to-the-locker-project-tlp" target="_blank">The Locker Project</a> and <a href="http://www.telehash.org/about.html" target="_blank">TeleHash</a> &#8211; get on it!  This is one of the most important projects of our time &#8211; an infrastructure for a better future!</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2011/05/bruce-pulpit.jpg"><img src="http://www.ugotrade.com/wordpress/wp-content/uploads/2011/05/bruce-pulpit-186x300.jpg" alt="" title="bruce-pulpit" width="186" height="300" class="alignnone size-medium wp-image-6296" /></a></p>
<h3><strong><strong>Interview with Bruce Sterling by Tish Shute and Ori Inbar</strong></strong></h3>
<p><strong>Tish Shute:</strong> As you so memorably put it, &#8220;AR is a technovisionary dream come true &#8211; something really rare, and you have to be really patient for those&#8230;&#8221;</p>
<p>What is best and worst, in your view, about the way the Augmented Reality technovisionary dream is coming true and emerging to flourish in the wild?</p>
<p><strong>Bruce Sterling: The best part is that AR is truly happening and is a  lot of fun, and the worst part is that it&#8217;s happening in a Depression.  If AR had broken loose in the dotcom days when cash flew around like soap bubbles, man, that would have been psychedelic.</strong></p>
<p><strong>AR is even more of-our-time than &#8220;social media.&#8221; AR has arisen directly from modern technical factors that just didn&#8217;t use to exist.  It&#8217;s made from shiny new parts, and is truly a child of the twenty-teens, a genuine digital native.   It&#8217;s a little kid and it has to walk before it can run, but it&#8217;s great to see it walking.</strong></p>
<p><strong>Tish Shute:</strong> As Jesse Schell pointed out last year at ARE2010, &#8220;The whole point of AR is to see things from a different point of view&#8230; How can there be a more powerful art form than one that actually changes what you see?&#8221;  What do you feel will be the most impactful application of AR in people&#8217;s everyday lives?</p>
<p><strong>Bruce Sterling:</strong><strong> I&#8217;m all for impact, but it&#8217;s pretty clear that the people who would weep for joy to have Augmented Reality are people whose reality is already damaged.  People who need reality augmented as a prosthetic, in other words, so that they can achieve an &#8220;everyday life.&#8221;  This is like the impactful but underappreciated role of the Internet in the lives of people who&#8217;ve been shut-in.  If you&#8217;re laid-up in a hospital bed, a laptop is a revolution in convalescence.</strong></p>
<p><strong>But that kind of &#8220;impact&#8221; doesn&#8217;t sound too exciting or too profitable.  My guess would be that the biggest arena for &#8220;impactful AR&#8221; would be augmenting cityscapes for foreign people who can&#8217;t speak the local language, can&#8217;t read the signs, and lack time to learn the local reality.  Imagine, say, the Brazilian overlay for Moscow.  You could show up, read your native Brazilian overlay of that city, do your business, eat, sleep, buy, leave, and scarcely &#8220;be in Moscow&#8221; at all.  Constructed right, the AR Brazilian Moscow might even be a better Moscow &#8212; a Moscow that Russians themselves would pay to visit.</strong></p>
<p><strong>Tish Shute: </strong>You pointed out last year, in your opening keynote for ARE2010, that less immersive forms of AR have their own merits.  We are still not seeing much &#8220;head-mounted display weirdness&#8221; yet, but many other forms of AR are emerging &#8211; mobile, webcam, projected video, sonic augmented reality, even sticky light.  You noted that practically everything AR is involved in is a transitional technology.  But since you spoke last year at ARE2010, which of these transitional technologies have shown the most promise for AR?</p>
<p><strong>Bruce Sterling: It&#8217;s got to be handsets.  Smartphones.  The stats there are just amazing.  The smartphone biz makes the personal computer business look like a Victorian railroad.  When I read a guy like Tomi Ahonen, who talks about transitioning out of the old-fashioned &#8220;Legacy Internet,&#8221; that idea is startling.  But AR is one visible indication that the Internet really could look like a &#8220;legacy.&#8221;  The Legacy Internet as an old-fashioned, dusty, desk-based place best left to archivists and librarians, while the action is out on the streets.</strong></p>
<p><strong>Tish Shute:</strong> This year we have seen gestural interfaces go mainstream.  What are the most interesting directions for gestural interfaces that you have seen emerge in recent months?</p>
<p><strong>Bruce Sterling:</strong> <strong>To me, the most &#8220;interesting&#8221; part is seeing people do gestural stuff in public.  William Gibson, my fellow author, observes that cellphones have stolen the gestural language of cigarettes.  There&#8217;s lots of fidgeting, box tapping, ash-swiping, slipping boxes in and out of pockets&#8230; People quickly learn to do that without thinking twice, and they forget how weird it looks. It&#8217;s &#8220;design dissolving in behavior,&#8221; as Adam Greenfield puts it.</strong></p>
<p><strong>The gestural hack scene for the Kinect has been amazing.  It&#8217;s like watching 1950s Beatnik dancing go mainstream.</strong></p>
<p><strong>Tish Shute: </strong>You have observed that Augmented Reality is glocal, which not only gives us different flavors of augmented experience but is &#8220;a departure from earlier models of tech startups, where you usually have like three hippies in a local garage.  Now you&#8217;ve got German-American-Korean outfits like Metaio, and Total Immersion has a Russian affiliate.  They&#8217;re inherently multinational, both inside the company and out.&#8221;  What flavors of glocalness do you hope/expect to see at Augmented Reality Event this year?</p>
<p><strong>Bruce Sterling: I&#8217;d be pretty happy to see some AR input from Brazil, India, and South Africa.  I seem to be picking up a lot of followers in my Twitter stream from those locales.  If I saw some Augmented Bollywood Reality, that would pretty much make my day.</strong></p>
<p><strong>Ori Inbar:</strong> What sessions will you go to at ARE this year? Who do you want to meet at ARE 2011?</p>
<p><strong>Bruce Sterling: I make it my business to hang out with artists, but I&#8217;m hoping to drill down more on the technical aspects.  For instance, where exactly are the bottlenecks in building animated augments?  It looks like we&#8217;re about a sneeze away from jamming some crude Hanna-Barbera cartoons into real spaces. But the devil is in the details there.</strong></p>
<p><strong>Ori Inbar:</strong> Your commentary about the evolution of the AR industry over the years had significant focus on style. Is the AR industry dressed to kill yet? Any glimpses of promise in that direction?</p>
<p><strong>Bruce Sterling: I&#8217;m not &#8220;pro-style&#8221; in every possible aspect of life, but as an Augmented Reality critic, it&#8217;s clear to me that if you claim to &#8220;augment&#8221; reality, then you should work hard to augment it &#8212; struggle to make it better.  Otherwise you might as well call yourself &#8220;Defaced Reality,&#8221; or even &#8220;3D Spam.&#8221;  When I see that kind of crudity and carelessness in AR, I&#8217;m gonna call people out on it.  I know there will be the AR equivalent of cheesy billboards and gang graffiti, but I never much cared for those, either.</strong></p>
<p><strong>The industry&#8217;s videos have improved radically in the past year and a half.  It used to be all about &#8220;look at my grainy, shaky handheld video of my cool new AR hack,&#8221;  but nowadays the biz has really pulled its socks up.</strong></p>
<p><strong>If AR is about &#8220;experience design,&#8221; as I think it basically is, then eventually, as a matter of intellectual consistency and professional pride, everything you create will be considered  part of &#8220;the experience.&#8221;  That&#8217;s the industry&#8217;s way forward &#8212; that&#8217;s what it would do if it was grown-up.</strong></p>
<p><strong>AR people already look better than most similar geeks in the gaming business, and some day, I really do believe that augmentation people will become glamorous.  They won&#8217;t be supermodels, but they&#8217;ll be about as chic as, say, professional set designers.  Because AR is set design, in a way; it&#8217;s real-time interactive set-design for three-D spaces.</strong></p>
<p><strong>Ori Inbar: </strong>At the Layar launch in 2009 you said &#8220;it&#8217;s the dawn of AR&#8230;&#8221;; at ARE 2010, you followed up on the theme, saying &#8220;it&#8217;s 9am in the AR industry.&#8221; What time is it now?</p>
<p><strong>Bruce Sterling: I&#8217;d be guessing it&#8217;s around 9:30 AM, but come on, that&#8217;s just a metaphor! ARE we all gonna blow off at 4:30 PM and have a beer, or is AR one of those cruel tech startups where nobody ever gets a personal life?</strong></p>
<p><strong>Ori Inbar:</strong> Are you reading any new fictional literature about AR that inspires you?  And/or What interesting design fictions for AR have you come across recently?</p>
<p><strong>Bruce Sterling: Well, I&#8217;m always interested in creative people who just plain make stuff up.  Because that&#8217;s what I commonly do myself.  The stuff that &#8220;inspires&#8221; me is usually stuff that I just didn&#8217;t expect to see.  But when I don&#8217;t expect it, that usually means I wasn&#8217;t paying enough attention.  I plan to pay a lot of attention to AR this year.</strong></p>
<p><strong>I&#8217;m not sure it makes a lot of sense to write fiction nowadays &#8220;about AR,&#8221; because it&#8217;s no longer a fictional topic.  It&#8217;s become like writing fiction &#8220;about cinema.&#8221;  You can write good fiction about someone who works in cinema, but not fiction about cinema itself.  AR is not sci-fi &#8220;Augmented Reality&#8221; any more, it&#8217;s become a real-world phenomenon, a new industry of real augmentation.</strong></p>
<p><strong>With that said, I must remark that I sit up straight whenever I see Marco Tempest do stuff.  Magicians are all about mystery and wonder.  You wouldn&#8217;t see a magician, say, using AR to work an assembly line, or re-order library books, or find a pizza joint in Barcelona.  And that&#8217;s great.   Marco is always gonna do something freaky and out-there, and even though he&#8217;s a tech magician, it&#8217;s never about the tech first.  It&#8217;s always about his ingenuity in finding new ways to employ new tools in creating a magical experience for his audience.</strong></p>
<p><strong>Marco&#8217;s not an entrepreneur, he&#8217;s  not gonna revolutionize people&#8217;s daily lives or invent Web 4.0, but even if AR becomes &#8220;old hat&#8221; some day, it&#8217;s never going to be old hat when he&#8217;s doing it.  The guy is a pro, and I&#8217;m quite the fan.</strong></p>
<p><iframe src="http://player.vimeo.com/video/11801074?portrait=0" width="400" height="225" frameborder="0"></iframe></p>
<p><a href="http://vimeo.com/11801074">Magic Projection Live @ TEDxTokyo 2010</a> from <a href="http://vimeo.com/magician">Marco Tempest</a> on <a href="http://vimeo.com">Vimeo</a>.</p>
]]></content:encoded>
			<wfw:commentRss>http://www.ugotrade.com/2011/05/06/augmented-reality-transitioning-out-of-the-old-fashioned-legacy-internet-interview-with-bruce-sterling/feed/</wfw:commentRss>
		<slash:comments>14</slash:comments>
		</item>
		<item>
		<title>Jeremie Miller &amp; The Locker Project Give a Data Platform to the People in the Era of Data Everywhere. And Bloom presents Fizz!</title>
		<link>http://www.ugotrade.com/2011/02/10/jeremie-miller-the-locker-project-give-a-data-platform-to-the-people-in-the-era-of-data-everywhere-and-bloom-presents-fizz/</link>
		<comments>http://www.ugotrade.com/2011/02/10/jeremie-miller-the-locker-project-give-a-data-platform-to-the-people-in-the-era-of-data-everywhere-and-bloom-presents-fizz/#comments</comments>
		<pubDate>Thu, 10 Feb 2011 17:10:30 +0000</pubDate>
		<dc:creator><![CDATA[Tish Shute]]></dc:creator>
				<category><![CDATA[Ambient Devices]]></category>
		<category><![CDATA[Ambient Displays]]></category>
		<category><![CDATA[Ambient Findability]]></category>
		<category><![CDATA[Android]]></category>
		<category><![CDATA[architecture of participation]]></category>
		<category><![CDATA[Augmented Data]]></category>
		<category><![CDATA[Big Data]]></category>
		<category><![CDATA[culture of participation]]></category>
		<category><![CDATA[data science]]></category>
		<category><![CDATA[GeoFencing]]></category>
		<category><![CDATA[GeoMessaging]]></category>
		<category><![CDATA[gestrural interface]]></category>
		<category><![CDATA[Instrumenting the World]]></category>
		<category><![CDATA[internet of things]]></category>
		<category><![CDATA[Linked Data]]></category>
		<category><![CDATA[Mixed Reality]]></category>
		<category><![CDATA[mobile augmented reality]]></category>
		<category><![CDATA[mobile meets social]]></category>
		<category><![CDATA[Mobile Reality]]></category>
		<category><![CDATA[New Interfaces]]></category>
		<category><![CDATA[new urbanism]]></category>
		<category><![CDATA[online privacy]]></category>
		<category><![CDATA[Open Data]]></category>
		<category><![CDATA[privacy and online identity]]></category>
		<category><![CDATA[Real Time Big data]]></category>
		<category><![CDATA[social gaming]]></category>
		<category><![CDATA[social media]]></category>
		<category><![CDATA[ubiquitous computing]]></category>
		<category><![CDATA[Web Meets World]]></category>
		<category><![CDATA[websquared]]></category>
		<category><![CDATA[World 2.0]]></category>
		<category><![CDATA[Alistair Croll]]></category>
		<category><![CDATA[Ben Cerveny]]></category>
		<category><![CDATA[Bloom]]></category>
		<category><![CDATA[data visualization]]></category>
		<category><![CDATA[federation]]></category>
		<category><![CDATA[Fizz]]></category>
		<category><![CDATA[Instant Messaging]]></category>
		<category><![CDATA[Introspectr]]></category>
		<category><![CDATA[Jabber]]></category>
		<category><![CDATA[Jason Cavnar]]></category>
		<category><![CDATA[Jeremie Miller]]></category>
		<category><![CDATA[Jesper Sparre Anderson]]></category>
		<category><![CDATA[lifestreaming]]></category>
		<category><![CDATA[Locker Project]]></category>
		<category><![CDATA[Marshall Kirkpatrick]]></category>
		<category><![CDATA[open federated protocol]]></category>
		<category><![CDATA[P2P]]></category>
		<category><![CDATA[peer to peer protocols]]></category>
		<category><![CDATA[real time data]]></category>
		<category><![CDATA[real time data visualization]]></category>
		<category><![CDATA[Roger Magoulas]]></category>
		<category><![CDATA[Simon Murtha-Smith]]></category>
		<category><![CDATA[Singly]]></category>
		<category><![CDATA[social data aggregation]]></category>
		<category><![CDATA[social graph]]></category>
		<category><![CDATA[Sophia Parafina]]></category>
		<category><![CDATA[Strata]]></category>
		<category><![CDATA[Strata 2011]]></category>
		<category><![CDATA[TeleHash]]></category>
		<category><![CDATA[Tim O'Reilly]]></category>
		<category><![CDATA[Tom Carden]]></category>
		<category><![CDATA[XMPP]]></category>
		<category><![CDATA[Zynga]]></category>
		<category><![CDATA[Zyngification]]></category>

		<guid isPermaLink="false">http://www.ugotrade.com/?p=6102</guid>
		<description><![CDATA[Singly's appearance at the startup showcase at Strata 2011 this week has excited thought leaders across the web since the story got out. Singly is a new startup that exists to provide oxygen and commercial support to the open source Locker Project, and new protocol TeleHash. With some wonderful serendipity I met Singly on my [&#8230;]]]></description>
				<content:encoded><![CDATA[<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2011/02/Jeremiemiller.jpg"><img class="alignnone size-medium wp-image-6105" title="Jeremiemiller" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2011/02/Jeremiemiller-300x223.jpg" alt="" width="300" height="223" /></a></p>
<p><a href="http://sing.ly/" target="_blank">Singly's</a> appearance at the <a href="http://strataconf.com/strata2011/public/cfp/148" target="_blank">startup showcase at Strata 2011</a> this week has excited thought leaders across the web since the story got out. Singly is a new startup that exists to provide oxygen and commercial support to the open source <a href="https://github.com/quartzjer/Locker" target="_blank">Locker Project</a>, and new protocol <a href="http://www.telehash.org/about.html" target="_blank">TeleHash</a>.</p>
<p>With some wonderful serendipity I met Singly on my first night at <a href="http://strataconf.com/strata2011" target="_blank">Strata</a>.  The next day, I talked in depth to <a href="http://en.wikipedia.org/wiki/Jeremie_Miller" target="_blank">Jeremie Miller</a> and <a href="http://twitter.com/#!/smurthasmith" target="_blank">Simon Murtha-Smith</a>, two of the three Singly co-founders (see later in this post).  I also had the opportunity to ask <a href="http://radar.oreilly.com/tim/" target="_blank">Tim O'Reilly</a>, <a href="http://strataconf.com/strata2011/profile/17816" target="_blank">Alistair Croll</a> and <a href="http://www.oreillynet.com/pub/au/2717" target="_blank">Roger Magoulas</a> for some of their thoughts on the significance of this project (see below for their comments).</p>
<p>It was a real &#8211; pinch myself in case I need to wake up from a dream &#8211; experience for me to stumble across Jeremie Miller, with Simon Murtha-Smith, sitting behind a hand-written sign demoing Singly at Strata (see my pic opening this post).  As <a href="http://www.readwriteweb.com/archives/creator_of_instant_messaging_protocol_to_launch_ap.php" target="_blank">Marshall Kirkpatrick notes</a>:</p>
<p><strong>"Jeremie Miller is a revered figure among developers, best known for building XMPP, the open source protocol that powers most of the Instant Messaging apps in the world. Now Miller has raised funds and is building a team that will develop software aimed directly at the future of the web."</strong></p>
<p>Singly's appearance at Strata began auspiciously when they won the judges' choice award in the startup showcase.  And following Marshall Kirkpatrick's posts, <a href="http://www.readwriteweb.com/archives/creator_of_instant_messaging_protocol_to_launch_ap.php" target="_blank">Creator of Instant Messaging Protocol to Launch App Platform for Your Life</a> and <a href="http://gigaom.com/2011/02/04/the-locker-project-why-leave-data-tracking-to-others-do-it-yourself/" target="_blank">The Locker Project: Why Leave Data Tracking to Others? Do It Yourself,</a> Singly have been burning up Twitter.</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2011/02/tweetssingly3.jpg"><img class="alignnone size-medium wp-image-6110" title="tweetssingly3" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2011/02/tweetssingly3-300x236.jpg" alt="" width="300" height="236" /></a></p>
<p>Singly, by giving people the ability to do things with their own data, has the potential to change our world.  And, as <a href="http://www.readwriteweb.com/archives/creator_of_instant_messaging_protocol_to_launch_ap.php#disqus_thread" target="_blank">Marshall Kirkpatrick notes,</a> this won't be the first time Jeremie has done that.</p>
<h3><strong>"Pop-cultural instruments for data expression and exploration," by Bloom</strong></h3>
<p>I was drawn over to the Singly table when an awesome app they were demonstrating caught my eye.  <a href="http://bloom.io/fizz/index.html" target="_blank">Fizz</a>, which is running on a locker with data aggregated from three different places, is a first glimpse of one of <a href="http://bloom.io/" target="_blank">Bloom's</a> "pop-cultural instruments for data expression and exploration."</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2011/02/SimonMurthaSmith.png"><img class="alignnone size-medium wp-image-6116" title="SimonMurthaSmith" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2011/02/SimonMurthaSmith-300x224.png" alt="" width="300" height="224" /></a></p>
<p>Fizz is an intriguing early manifestation of capabilities never seen before on the web &#8211; the ability for us to control, aggregate, share and play with our own data streams, and bring together the bits and pieces of our digital selves scattered about the web (for more about Bloom and Singly, see Tim O'Reilly's comments below).  The picture below is my Fizz.  In <a href="http://bloom.io/fizz/index.html" target="_blank">Fizz</a>, large circles represent people and small circles represent their status updates. Bloom says:</p>
<p><strong>"Clicking a circle will reveal its contents. Typing in the search box will highlight matching statuses.<br />
This is an early preview of our work and we&#8217;ll be adding more features in the next few weeks. <a href="https://spreadsheets.google.com/viewform?hl=en&amp;formkey=dGZINGpDQ3NubVNiMlY3eFZ6MUNGdFE6MQ#gid=0" target="_blank">We&#8217;d love to hear your feedback and suggestions</a>."</strong></p>
<p><strong><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2011/02/FizzbyBloom.png"><img class="alignnone size-medium wp-image-6117" title="FizzbyBloom" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2011/02/FizzbyBloom-300x179.png" alt="" width="300" height="179" /></a><br />
</strong></p>
<p>If you are not already familiar with the Bloom team, Ben Cerveny, Tom Carden, and Jesper Sparre Andersen &#8211; go directly to <a href="http://bloom.io/about" target="_blank">their about page</a> and you will understand why the match of Bloom and The Locker Project is a cause for great delight.</p>
<h3>The Locker Project &#8211; a whole new way to connect from the protocol up</h3>
<p>As Jeremie began explaining the depth and breadth of what The Locker Project is facilitating, I was utterly gobsmacked. And when the penny dropped and I realized this is the whole nine yards &#8211; bringing awesomeness to people with a whole new way to connect, from the protocol up &#8211; all I could think was, OMG, finally!</p>
<p>Luckily I have had time to catch up with the whole team since then, and recovered my composure enough to ask some coherent questions. But I can still barely contain my enthusiasm for this project.</p>
<p>Singly, The Locker Project and TeleHash take on, and deliver a simple, elegant, and open solution to, some of the holy grails of the next generation of networked communications.  I have written on, and been nibbling at the edges of, some of these grails in various projects myself for quite a while now.  Even if you haven't been reading Ugotrade, just a glance at <a href="http://www.ugotrade.com/2011/01/20/real-time-big-data-at-strata-2011-ambient-findability-geomessaging-augmented-data-and-new-interfaces/" target="_blank">the monster mash of my pre-Strata post</a> will give you an idea of how important I think Singly is.</p>
<p>My previous post raised the question of how to invert the search pyramid and to transform search into a social, democratic act.  But if you are really interested in social search, I suggest staying keyed into what Singly is doing with The Locker Project!</p>
<p>One of Singly's three founders, Simon Murtha-Smith, was building a company called <a href="https://www.introspectr.com/" target="_blank">Introspectr</a>, a social aggregator and search product. Singly's other founder <a href="http://www.linkedin.com/in/jasoncavnar" target="_blank">Jason Cavnar</a> was working on another similar project.  And they came together as Singly because social aggregation and search is a very hard problem for one company to solve, and they realized that the basic infrastructure needs to be open source and built on an open protocol.</p>
<p>As Jeremie puts it, <strong>"We shouldn't... (every startup that wants to do something interesting) have to spend this much time aggregating the data, building robust aggregators."</strong></p>
<p>To me what is so important about the Locker Project is that it is built on a new open protocol, TeleHash.  And having the Singly team focused on supplying tools and the trust/security layer for the Locker Project will mean that developers have the whole stack they need to do some interesting stuff very soon.</p>
<p>I asked Jeremie to explain the relationship between TeleHash, The Locker Project and Singly.</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2011/02/TeleHash.png"><img class="alignnone size-medium wp-image-6118" title="TeleHash" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2011/02/TeleHash-300x172.png" alt="" width="300" height="172" /></a></p>
<p><strong>Tish Shute:</strong> So <a href="http://www.telehash.org/about.html" target="_blank">TeleHash</a>...</p>
<p><strong>Jeremie Miller:  It's a peer-to-peer protocol to move bits of data for applications around.  Not file sharing, but it's for actual applications to find each other and connect.  So if you had an app and I had an app, whenever we're running that app on our devices, we can actually find those other devices from each other and then connect.  Our applications can connect and do something.</strong></p>
<p><strong>It's for the entire edge of the network, basically, out there in the wild, letting those things mesh together.</strong></p>
<p><strong>And TeleHash is actually what led to the Locker Project itself.</strong></p>
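<p>Jeremie's description &#8211; applications finding each other and exchanging data directly, with no server in between &#8211; can be sketched in miniature. The toy below is purely illustrative: two UDP endpoints on one machine swapping a JSON datagram. It is not TeleHash's actual wire protocol (which adds distributed peer lookup so apps can find each other across the open internet), and all names and port numbers are invented for the example.</p>

```python
# Toy illustration of the idea behind TeleHash: two app instances exchange
# JSON datagrams directly, with no server in the middle.  Real TeleHash adds
# a distributed hash table so peers can *find* each other across the wider
# network; here both peers simply live on localhost.
import json
import socket

def make_peer(port):
    """Bind a UDP endpoint that can both send and receive app datagrams."""
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    s.bind(("127.0.0.1", port))
    s.settimeout(2.0)
    return s

def send(sock, peer_port, payload):
    """Ship a small JSON payload straight to another peer."""
    sock.sendto(json.dumps(payload).encode(), ("127.0.0.1", peer_port))

def recv(sock):
    """Receive and decode one JSON datagram, returning (payload, sender)."""
    data, addr = sock.recvfrom(4096)
    return json.loads(data.decode()), addr

app_a = make_peer(40101)
app_b = make_peer(40102)
send(app_a, 40102, {"app": "demo", "msg": "hello, peer"})
message, sender = recv(app_b)
print(message["msg"])  # hello, peer
```

<p>The point of the sketch is only the shape of the exchange: once two endpoints know each other's address, the data never touches a data center.</p>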
<p><strong>Tish Shute:</strong> So TeleHash led to The Locker Project, and the Locker Project led to Singly?</p>
<p><strong>Jeremie Miller: Singly is a company that is sponsoring the open source Locker Project... the three of us as founders [left to right in pic below &#8211; Jeremie Miller, Jason Cavnar, Simon Murtha-Smith]</strong></p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2011/02/RRWSingly.png"><img class="alignnone size-medium wp-image-6119" title="RRWSingly" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2011/02/RRWSingly-300x220.png" alt="" width="300" height="220" /></a></p>
<p><em>I took the pic above of all three founders being interviewed by Marshall Kirkpatrick of Read Write Web for his post, <a href="http://www.readwriteweb.com/archives/creator_of_instant_messaging_protocol_to_launch_ap.php#disqus_thread" target="_blank">"Creator of Instant Messaging Protocol to Launch App Platform for Your Life."</a>  I think we will look back on this moment and say it was <a href="http://twitter.com/#!/TishShute/status/33403971649544192" target="_blank">an inflection point for the web.</a> At least I tweeted that!</em></p>
<p><strong>Jeremie Miller: TeleHash is a protocol that lets the lockers connect with each other and share things.  The locker is like all of your data.  So it's sort of like a digital person...</strong></p>
<p><strong>Tish Shute:</strong> A locker for the bits and pieces of your digital self?</p>
<p><strong>Jeremie Miller:</strong> <strong>Yes. So TeleHash lets the lockers connect directly, peer-to-peer, with each other and share things.  Singly, as a company, is going to be hosting lockers first and foremost.  But the Locker Project is an open source project.  You can have a locker in your machine or you can install it wherever you want...</strong></p>
<p><strong>Tish Shute:</strong> Yes, it's often too difficult for a lot of people to set up something locally&#8230;so Singly makes it easy to have a locker, right?</p>
<p><strong>Jeremie Miller:  A lot of people will see this cool app or this cool thing they want to do &#8211; something that runs in your locker &#8211; so they need to be able to turn on a locker somewhere very easily.</strong></p>
<p><strong>Tish Shute:</strong> So Singly will provide the trust layer and hosting?</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2011/02/Singly.jpg"><img class="alignnone size-full wp-image-6130" title="Singly" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2011/02/Singly.jpg" alt="" width="159" height="80" /></a></p>
<p><strong>Jeremie Miller:  Yeah, Singly is a company that will host lockers, as well as... when people build applications that run inside your lockers or use your data, you need to be able to trust them.  Maybe it's like social data and you don't care that much.  But especially once you start to get any of your transactions in there, your browsing history, your health data, like your running logs or sleeping... Fitbit stuff... then it's much more important to be careful about what you're running inside your locker and sharing.</strong></p>
<p><strong>So Singly will also look at the applications that are available that you can install and actually run them and look at what data they access, and look at who created them, and be able to come back and either certify or vouch for them.</strong></p>
<p><strong>And I hope in the long run, as this grows and builds, that power users may actually be able to buy a little device that they can plug into their home network that is their locker.  Wouldn't that be cool?  This little hard drive or whatever that you plug in.</strong></p>
<p><strong>Tish Shute: </strong>Wow &#8211; that would be very cool!  Architecturally, are TeleHash and the Locker Project related to your work on XMPP?</p>
<p><strong>Jeremie Miller:  Architecturally, it draws on some of the stuff I've learned.  XMPP, in Jabber, was designed for the specific purpose of instant messaging, but it was still a federated model, in that you still had to go through sort of a central point &#8211; a server that lived somewhere.  So it was really optimized for businesses and small groups, teams, as well as big companies out there; ISPs can use it.</strong></p>
<p><strong>So it was designed with that in mind &#8211; for the communication path to be routed through somewhere.  And where I've sort of evolved over the years since then is a real fascination with truly distributed protocols that are completely decentralized, so that things go peer-to-peer instead of actually through any server.</strong></p>
<p><strong>The last 10 years, peer-to-peer has gotten a pretty bad rap with file sharing.</strong></p>
<p><strong>Tish Shute: </strong> A really bad rap, yes.</p>
<p><strong>Jeremie Miller:  Yeah.  And almost because of that, and because it's really hard to do, it hasn't gotten... the potential for it is awesome.  There are so many really good things that can be done peer-to-peer.  And it hasn't gotten used very much.</strong></p>
<p><strong>But the other side of the peer-to-peer thing that I think is critically important: look at the explosion of the computing devices around a person anymore, both in the home and on our person.  We have one, two, three, four even.  And the number of devices that are online for you, that are yours... I look at my home network router and I've got 30 devices in my house on Wi-Fi.  What the heck?  That's a lot of devices.</strong></p>
<p><strong>But right now, all of those devices, for me to work with them, I'm almost always going through a server somewhere, through a data center somewhere, which is ridiculous at face value.  You go five, 10 years out from now, there's probably going to be 300 devices on me in some form.</strong></p>
<p><strong>Tish Shute:</strong> So we need a peer-to-peer network just to manage our own devices?</p>
<p><strong>Jeremie Miller:  A peer-to-peer, yes.  You know, my phone should be talking straight to my computer, or to the iPad, or to the washing machine, or refrigerator.  The applications in my TV, or whatever, they should all be talking peer-to-peer.  And it should be easy to do that.  It shouldn't be that the only way you can do that is to go through a data center somewhere.</strong></p>
<p>[Our conversation continued, but to sum things up, for now, here is the final question I asked Jeremie which pretty much packs in everything I would like to do with TeleHash, the Locker project, and Singly tools/trust layer all in one!]</p>
<p><strong>Tish Shute:</strong> How can TeleHash, the Locker, and Singly help people combine personal data from different sources &#8211; web and mobile for example, so the data locked up in our social graph on the web can be integrated with, for example, the location data and "the data wake" from our cell phone sensors, to know not only where we have been but to give us more ways to know where we are going?</p>
<p><strong>Jeremie Miller: That&#8217;s a pretty packed question, but here&#8217;s my simple answer; hopefully it just seeds the right discussion <img src="http://www.ugotrade.com/wordpress/wp-includes/images/smilies/icon_smile.gif" alt=":)" class="wp-smiley" /> </strong></p>
<p><strong>Telehash is the protocol that lets apps (mobile, sensor, or anywhere) talk to a locker, as well as lockers talk to each other; it&#8217;s the chatter, moving the bits around the network.  The locker is the storage for a person&#8217;s data, and the crunching ability to analyze it or trigger actions from it. Singly is the company sponsoring the project(s) and helping anyone dev apps atop it.  We&#8217;re going to build the platform, and we&#8217;re looking to the world to create some amazing things on top of it (we have lots of our own personal ideas we already want to create, hah!).</strong></p>
<h3>The Locker Project is not just "one more rebel army trying to undo these big data aggregations," Tim O&#8217;Reilly</h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2011/02/Screen-shot-2011-02-10-at-12.01.29-AM.png"><img class="alignnone size-full wp-image-6120" title="Screen shot 2011-02-10 at 12.01.29 AM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2011/02/Screen-shot-2011-02-10-at-12.01.29-AM.png" alt="" width="240" height="238" /></a></p>
<p><strong><a href="http://twitter.com/#!/lockerproject" target="_blank">@lockerproject</a>: &#8220;We&#8217;ll be posting updates on the Locker Project (<a rel="nofollow" href="https://github.com/quartzjer/Locker" target="_blank">https://github.com/quartzjer/Locker</a>) here as we make progress, very awesome stuff&#8221;</strong></p>
<p>During the Strata Media Conversation I asked Roger Magoulas about Singly and The Locker Project, because Roger played matchmaker and brought Singly and Bloom together!  Although there was not much time to discuss it, the relationship of TeleHash, The Locker Project, and Singly to the social network incumbents came up, and Roger Magoulas and Tim O'Reilly gave some very insightful comments on this when I talked to them afterwards (see below).</p>
<p>Roger Magoulas pointed out:</p>
<p><strong>"I think Singly has Facebook-like aspects, but I think a better description is an app platform that integrates your personal and social network data &#8211; including data from Facebook. Sing.ly is likely to have challenges with some of their data sources, particularly if Sing.ly gains traction with users.</strong></p>
<p><strong>I like the app platform business model, although they face risks getting critical mass and app developer attention, and I like how they plan on using open source connectors to keep up with changing social network platforms. Jeremie has credibility with the open source community and is likely to find cooperating developers. The team seems to bring complementary strengths to the project and you can tell they all work well together."<br />
</strong><br />
And Tim O&#8217;Reilly went on to elaborate on the awesome potential of this platform to bring something new to the ecosystem, and to comment on just how interesting Bloom&#8217;s insight into &#8220;data visualization as a means of input and control&#8221; is.</p>
<h3>Talking with Tim O&#8217;Reilly</h3>
<p><strong>Tish Shute:</strong> So will the Locker Project be able to break the lock of Facebook&#8217;s and other big sites&#8217; control of everyone's data?  Sometimes I feel we are stuck in the era of Zyngification, where you have to do what Zynga did and leverage the system in order to gain traction or do anything with social data.</p>
<p><strong>Tim O'Reilly:  I don't think that is the objective of the Locker Project &#8211; to break the Facebook lock &#8211; because I tend to agree, the value of Facebook is having your data there with other people's data.  What Singly may be able to accomplish is to give people better tools for managing their data.  Because if you can actually start to abstract the data from various sites and you can set it and manage it yourself, then you can potentially make better decisions about what you're going to allow and not allow.  Because right now, the interfaces on a lot of these sites make it very, very difficult to understand exactly what the implications are.</strong></p>
<p><strong>And I think all this, done right, will create a marketplace where people will build better interfaces that will give people more control over their data.  They'll still want to put it on those sites, because why do you put your money in the bank?  You know, because it's more valuable being with other people's money.</strong></p>
<p><strong>And I think that to conceive of it as one more rebel army trying to undo these big data aggregations is just the wrong way to frame it.</strong></p>
<p><strong>Tish Shute: </strong>Yes, and framing the question the way you just did &#8211; that this is not just one more rebel army &#8211; might mean that the stage at Strata will be filled with new startups next year!  That's what I thought when I found out what The Locker Project and Singly were about &#8211; that we are about to see an explosion of creativity with personal and social data.</p>
<p><strong>Tim O'Reilly:  Yeah, sure.  I mean, because at the end of the day, if you can start to extract your personal data in ways that make it more useful, you can potentially create the ability for people to build better interfaces.  It's not just Facebook.</strong></p>
<p><strong>You know, you think, "Oh, wow, I'd really like to have a management console for all my contacts."  And you go, "Well, I'm stuck with, I can use Facebook, I can use LinkedIn, I can use my address book in Outlook or Gmail or whatever, or on my local machine."  The tools are pretty primitive.  And if we get a better set of tools, I think we'll see a lot of innovation.</strong></p>
<p><strong>Now, some of those startups might well be acquired by a Facebook or a Google.  But if it moves the ball forward in giving people better visibility and control over their data, that's a good thing.</strong></p>
<h3>Bloom&#8217;s insight,  &#8220;data visualization will become a means of input and control.&#8221;</h3>
<p><strong>Tish Shute:</strong> I loved the marriage with Bloom, which is interesting, because Ben and the Bloom team haven't really talked a lot about Bloom yet, but I gather Bloom is moving to consumer-facing work with data?</p>
<p><strong>Tim O'Reilly:  What's really interesting about Bloom is the notion... You know, people think of data visualization as output.  And the insight that I think Ben has had with Bloom is that data visualization will become a means of input and control.</strong></p>
<p><strong>Tish Shute: </strong> Right, very cool.</p>
<p><strong>Tim O'Reilly: So I&#8217;ve started to feel like data visualization as a way of making sense of complex data is kind of a dead end.  Because what you really want to do is to build these feedback loops where you actually figure something out &#8211; some particular atomic action &#8211; well enough that you can create an application that lets somebody actually do something with it. But the idea of visualization as a way of manipulating the data in real time, data visualization as interface rather than as a report &#8211; it's a small but subtle shift that I think becomes kinda cool.</strong></p>
<h3>Talking with Alistair Croll</h3>
<p><object classid="clsid:d27cdb6e-ae6d-11cf-96b8-444553540000" width="400" height="225" codebase="http://download.macromedia.com/pub/shockwave/cabs/flash/swflash.cab#version=6,0,40,0"><param name="allowfullscreen" value="true" /><param name="allowscriptaccess" value="always" /><param name="src" value="http://vimeo.com/moogaloop.swf?clip_id=19738228&amp;server=vimeo.com&amp;show_title=1&amp;show_byline=1&amp;show_portrait=1&amp;color=00ADEF&amp;fullscreen=1&amp;autoplay=0&amp;loop=0" /><embed type="application/x-shockwave-flash" width="400" height="225" src="http://vimeo.com/moogaloop.swf?clip_id=19738228&amp;server=vimeo.com&amp;show_title=1&amp;show_byline=1&amp;show_portrait=1&amp;color=00ADEF&amp;fullscreen=1&amp;autoplay=0&amp;loop=0" allowscriptaccess="always" allowfullscreen="true"></embed></object></p>
<p><a href="http://vimeo.com/19738228">Sing.ly &#8211; Join or Die</a> from <a href="http://vimeo.com/user5977233">Singly Inc</a> on <a href="http://vimeo.com">Vimeo</a>.</p>
<p><strong>Alistair Croll:</strong> <strong>So I'm a big fan of Singly.  They were my choice for the Startup Showcase.  I think it's certainly the right time &#8211; the team can execute on it.  But the thing I like the most is I thought back to the early days of Photoshop.</strong></p>
<p><strong>So, Photoshop was a neat application that could take data in the form of an image and manipulate it.  But the real value from Photoshop came from these amazing plugins.  Like, there's a company called Kai's Power Tools that made these things that would allow you to do manipulations.  Today, commonplace things that are built in.  But at the time, they were things like building bubbles, and spheres, and drop shadows and stuff like that, cutouts, in amazing ways.</strong></p>
<p><strong>Another company, I think called Alien Skin, made these things.  There are whole ecosystems of plugins.  So you could go and get a plugin and transform that original data in ways you hadn't thought of.  And eventually, there was a macro language for scripting how you could do those things, and that found its way into the Photoshop environment.</strong></p>
<p><strong>But you think about the transformation of digital design from Photoshop &#8211; I think if you can take that same pattern, where you create the basic ecosystem of a few tools and then you allow people an open system on top of that, that's unprecedented.  I think it really does allow you to take ownership of that.</strong></p>
<p><strong>And then you allow people the proper tools to federate that information.  I was actually thinking of starting a company a couple of years ago based on data federation like that.  But what you really want to say is, I've got a pattern... It's almost like a multi-channel mixer.  You've got a band that is your health, your weight, your blood pressure, family photos, words you've used.  You know, the more data I record when I carry my phone around with a headset or whatever, all of that stuff goes in &#8211; all my searches, everything.</strong></p>
<p><strong>And then I say, "Ah, I want to federate height, weight, and blood pressure with my doctor. I want to federate sleep cycles and nutrition with my child's teacher," and so on and so on.  And you start to create these federated sources of data where now you have a teacher data mining, in a safe manner, the sleep and health habits of all the students along with report card information.  And you suddenly realize that Johnny is borderline diabetic and falls asleep at recess.</strong></p>
<p><strong>That's something that never would have happened.  And that happens when you have tools to federate data and then compute on top of them.  So this idea of, like, lifestreaming or life logging, this is a logical consequence of the whole lifestreaming movement; that whole recorded future stuff.</strong></p>
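<p>Alistair's multi-channel mixer can be sketched as a simple per-recipient channel whitelist. The toy below is my own illustration, not Singly's or the Locker Project's actual tools; every recipient, channel, and record in it is invented for the example.</p>

```python
# Toy sketch of "federating" channels of a personal data stream: each
# recipient gets a whitelist of channels, and only matching records are
# shared with them.  All data here is an invented example.
federations = {
    "doctor":  {"height", "weight", "blood_pressure"},
    "teacher": {"sleep", "nutrition"},
}

lifestream = [
    {"channel": "weight",   "value": 82},
    {"channel": "sleep",    "value": "6h10m"},
    {"channel": "searches", "value": "bike routes"},
]

def share_with(recipient, stream):
    """Return only the records whose channel is federated to this recipient."""
    allowed = federations.get(recipient, set())
    return [rec for rec in stream if rec["channel"] in allowed]

print(share_with("doctor", lifestream))   # only the weight record
print(share_with("teacher", lifestream))  # only the sleep record
```

<p>The design choice is that sharing is opt-in per channel per recipient, so the search history never leaves the locker unless a federation explicitly names it.</p>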
<p><strong>Tish Shute:</strong> Yes, it really is a wonderful fruition of the visions of the lifestreaming movement [<a href="http://lifestreamblog.com/interview-with-david-galernter-on-the-future-of-lifestreaming-and-my-thoughts/" target="_blank">see this interview with David Gelernter</a>].  And best of all it sits on a new open protocol, TeleHash, and the open source Locker Project that will give tools to everyone to work with these data streams.</p>
<p><strong>Alistair Croll:  Exactly.  This is the toolset that sits on top of that stuff.  Because once I've life-streamed everything, great, I have this bucket of stuff that I did that I never look at again. But if I can suddenly unlock that with data mining tools and analyze patterns, all of a sudden that life logging has a reason to have existed.</strong></p>
<p><strong>The biggest problem we have with data right now is we don't have a priori knowledge of what will be useful.  We could have been recording crime reports in the city of Chicago, and a year later it turns out that data is really useful for predicting diabetes in the city, but we didn't know it was related.</strong></p>
<p><strong>So the problem &#8211; and one of the things I think distinguishes big data from traditional data &#8211; is that traditional data is collected with some a priori knowledge of how it will be used.  Big data tends to be collected for the sake of... it's almost collected on faith that later on it will be useful for something.</strong></p>
<p><strong>Tish Shute:</strong> I am very interested in this idea of federation, I actually went as far as to deep dive into Wave servers because of this...</p>
<p><strong>Alistair Croll:  Yeah, Wave was a great example of federation, just too complicated.  When it was canceled, both users [and developers] were furious.</strong></p>
<p><strong>Tish Shute:</strong> Yeah, I suppose you could see Google Wave as a bit of an Icarus project, right?  I am so excited by Singly because it is coming sort of bottom-up &#8211; a very different approach...</p>
<p><strong>Alistair Croll:  And remember, Facebook didn't work before Friendster.  The only difference between being wrong and being too early is that too early costs a lot of money.  So it may be that this is an idea that works now, but a couple years ago didn't work.</strong></p>
<p><a href="http://twitter.com/#%21/acroll" target="_blank">Alistair Croll</a>, co-chair of <a href="http://strataconf.com/strata2011" target="_blank">Strata 2011</a>, in his post <a href="http://mashable.com/2011/01/12/data-ownership/" target="_blank">"Who Owns Your Data?"</a>, reframed the question as, "It's not who owns the data, it's about who can put the data to work."</p>
<p>And I am sure there will be many more people able to put data to work, and into play, in a multitude of interesting ways, now that we have TeleHash, the Locker Project, and Singly.</p>
<p><strong><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2011/02/TishStrata.png"><img class="alignnone size-medium wp-image-6127" title="TishStrata" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2011/02/TishStrata-300x197.png" alt="" width="300" height="197" /></a><br />
</strong></p>
<p><em>Photo by <a href="http://duncandavidson.com/" target="_blank">Duncan Davidson</a>.<br />
<a href="http://strataconf.com/strata2011" target="_blank">Strata 2011</a> is presented by O&#8217;Reilly Media. Produced by<a href="http://2goodcompany.com/" target="_blank"> Good Company Communications.</a></em></p>
<p>I think the photo above gives a good idea of how I felt on the last day  at the Strata conference.  Yup &#8211; like the cat who got the cream!</p>
<p>And in case you are wondering where AR is in this story &#8211; it is everywhere!  Below is a pic of the AR concept designs that were omnipresent in the media communications at Strata.  The one below I snapped off the job board.  But as <a href="http://sproke.blogspot.com/" target="_blank">Sophia Parafina</a> noted, <strong>&#8220;AR is maturing from displaying last year&#8217;s text bubbles and dinosaurs to big data overlaid on the world.&#8221;</strong></p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2011/02/Screen-shot-2011-02-10-at-1.39.01-AM.png"><img class="alignnone size-medium wp-image-6137" title="Screen shot 2011-02-10 at 1.39.01 AM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2011/02/Screen-shot-2011-02-10-at-1.39.01-AM-300x222.png" alt="" width="300" height="222" /></a></p>
]]></content:encoded>
			<wfw:commentRss>http://www.ugotrade.com/2011/02/10/jeremie-miller-the-locker-project-give-a-data-platform-to-the-people-in-the-era-of-data-everywhere-and-bloom-presents-fizz/feed/</wfw:commentRss>
		<slash:comments>5</slash:comments>
		</item>
		<item>
		<title>The Missing Manual for the Future: Tim O&#8217;Reilly&#8217;s Four Cylinder Innovation Engine</title>
		<link>http://www.ugotrade.com/2010/10/31/tim-o%e2%80%99reilly%e2%80%99s-four-cylinder-innovation-engine-the-missing-manual-for-the-future/</link>
		<comments>http://www.ugotrade.com/2010/10/31/tim-o%e2%80%99reilly%e2%80%99s-four-cylinder-innovation-engine-the-missing-manual-for-the-future/#comments</comments>
		<pubDate>Sun, 31 Oct 2010 21:25:02 +0000</pubDate>
		<dc:creator><![CDATA[Tish Shute]]></dc:creator>
				<category><![CDATA[Ambient Devices]]></category>
		<category><![CDATA[Ambient Displays]]></category>
		<category><![CDATA[Android]]></category>
		<category><![CDATA[architecture of participation]]></category>
		<category><![CDATA[Augmented Reality]]></category>
		<category><![CDATA[Bar Camp]]></category>
		<category><![CDATA[culture of participation]]></category>
		<category><![CDATA[digital public space]]></category>
		<category><![CDATA[Energy Awareness]]></category>
		<category><![CDATA[Instrumenting the World]]></category>
		<category><![CDATA[internet of things]]></category>
		<category><![CDATA[iphone]]></category>
		<category><![CDATA[mirror worlds]]></category>
		<category><![CDATA[Mixed Reality]]></category>
		<category><![CDATA[mobile augmented reality]]></category>
		<category><![CDATA[mobile meets social]]></category>
		<category><![CDATA[Mobile Reality]]></category>
		<category><![CDATA[new urbanism]]></category>
		<category><![CDATA[online privacy]]></category>
		<category><![CDATA[Paticipatory Culture]]></category>
		<category><![CDATA[privacy and online identity]]></category>
		<category><![CDATA[social gaming]]></category>
		<category><![CDATA[social media]]></category>
		<category><![CDATA[sustainable living]]></category>
		<category><![CDATA[ubiquitous computing]]></category>
		<category><![CDATA[Web 2.0]]></category>
		<category><![CDATA[Web Meets World]]></category>
		<category><![CDATA[#w2e]]></category>
		<category><![CDATA[algorithmic economies]]></category>
		<category><![CDATA[arduino]]></category>
		<category><![CDATA[Area/Code]]></category>
		<category><![CDATA[ARWave]]></category>
		<category><![CDATA[Battlestorm]]></category>
		<category><![CDATA[Chris Arkenberg]]></category>
		<category><![CDATA[Cloudera]]></category>
		<category><![CDATA[counter surveillance]]></category>
		<category><![CDATA[Credit Suisse trading bots]]></category>
		<category><![CDATA[CrowdFlower]]></category>
		<category><![CDATA[data is gasoline]]></category>
		<category><![CDATA[Defeating Big Brother]]></category>
		<category><![CDATA[Dennis Crowley]]></category>
		<category><![CDATA[Dr Alex Kilpatrick]]></category>
		<category><![CDATA[ecologies of human and machine inteligence]]></category>
		<category><![CDATA[Esther Dyson]]></category>
		<category><![CDATA[Facebook for Data]]></category>
		<category><![CDATA[food52]]></category>
		<category><![CDATA[Four Square]]></category>
		<category><![CDATA[Four Square and Dodge Ball]]></category>
		<category><![CDATA[Four Square API]]></category>
		<category><![CDATA[Fred Wilson]]></category>
		<category><![CDATA[Games That Know Where You Live]]></category>
		<category><![CDATA[geopollster]]></category>
		<category><![CDATA[Glympse]]></category>
		<category><![CDATA[Google Wave]]></category>
		<category><![CDATA[Gov 2.0]]></category>
		<category><![CDATA[Hackett]]></category>
		<category><![CDATA[Hadoop World]]></category>
		<category><![CDATA[high frequency trading]]></category>
		<category><![CDATA[hour.ly]]></category>
		<category><![CDATA[iPad]]></category>
		<category><![CDATA[iphone apps]]></category>
		<category><![CDATA[Jet Packs]]></category>
		<category><![CDATA[jetpack]]></category>
		<category><![CDATA[John Battele's Points of Control Map]]></category>
		<category><![CDATA[Kevin Slavin]]></category>
		<category><![CDATA[Knight Foundation]]></category>
		<category><![CDATA[Lars Rasmussen]]></category>
		<category><![CDATA[Layar]]></category>
		<category><![CDATA[Loitering on the Motherboard]]></category>
		<category><![CDATA[machine to machine data]]></category>
		<category><![CDATA[machine to machine intelligence]]></category>
		<category><![CDATA[Macon Money]]></category>
		<category><![CDATA[Madagascar Institute]]></category>
		<category><![CDATA[Maker Faire]]></category>
		<category><![CDATA[Mary Haskett]]></category>
		<category><![CDATA[Mike Olsen]]></category>
		<category><![CDATA[mobile social augmented reality]]></category>
		<category><![CDATA[Nanex]]></category>
		<category><![CDATA[Nanex API]]></category>
		<category><![CDATA[Next Jump]]></category>
		<category><![CDATA[Pachube]]></category>
		<category><![CDATA[Pachube API]]></category>
		<category><![CDATA[pathfinder]]></category>
		<category><![CDATA[people are the platform]]></category>
		<category><![CDATA[Platforms for Growth]]></category>
		<category><![CDATA[Points of Control Map]]></category>
		<category><![CDATA[Qualcomm vision based augmented reality SDK]]></category>
		<category><![CDATA[quant trading]]></category>
		<category><![CDATA[quantative analysis]]></category>
		<category><![CDATA[real time data analytics]]></category>
		<category><![CDATA[real time internet]]></category>
		<category><![CDATA[Samasource]]></category>
		<category><![CDATA[sensor platforms]]></category>
		<category><![CDATA[Shazam]]></category>
		<category><![CDATA[Shazam for faces]]></category>
		<category><![CDATA[sousveillance]]></category>
		<category><![CDATA[stock market flash crash]]></category>
		<category><![CDATA[Strata]]></category>
		<category><![CDATA[surveillance bots]]></category>
		<category><![CDATA[The Battle for the Internet Economy]]></category>
		<category><![CDATA[The Battle of the Networks]]></category>
		<category><![CDATA[The Business of Data]]></category>
		<category><![CDATA[The Consequences of Living in a World of Data]]></category>
		<category><![CDATA[The Future: The Missing Manual]]></category>
		<category><![CDATA[The Gartner Hype Cycle]]></category>
		<category><![CDATA[the internet is a data operating system]]></category>
		<category><![CDATA[The Internet Operating System]]></category>
		<category><![CDATA[The Jet Ponies]]></category>
		<category><![CDATA[The Missing Manual For The Future]]></category>
		<category><![CDATA[Tim O'Reilly]]></category>
		<category><![CDATA[Tim O'Reilly's Four Cylinder Engine for Innovation]]></category>
		<category><![CDATA[Tim O'Reilly's Four Cylinder Innovation Engine]]></category>
		<category><![CDATA[trading bots]]></category>
		<category><![CDATA[Twitter for Sensors]]></category>
		<category><![CDATA[Union Square Ventures]]></category>
		<category><![CDATA[Ushahidi]]></category>
		<category><![CDATA[Usman Haque]]></category>
		<category><![CDATA[Valveless Pulse Jets]]></category>
		<category><![CDATA[WanderID]]></category>
		<category><![CDATA[Wave Federation Protocol]]></category>
		<category><![CDATA[Wave in a Box]]></category>
		<category><![CDATA[Web 2.0 Expo]]></category>
		<category><![CDATA[Web 2.0 Expo start ups]]></category>
		<category><![CDATA[Web 2.0 Summit]]></category>
		<category><![CDATA[Web Squared]]></category>
		<category><![CDATA[websquared]]></category>
		<category><![CDATA[Where 2.0]]></category>
		<category><![CDATA[William Gibson]]></category>

		<guid isPermaLink="false">http://www.ugotrade.com/?p=5985</guid>
		<description><![CDATA[The Missing Manual for The Future (or The Future: The Missing Manual) Oâ€™Reilly Media, is famous for is producing&#160; â€œmissing manualsâ€ for new technologies, but thinking of Oâ€™Reilly as just a publisher of books would be like saying Facebook is just a website (this came up in the discussion at Media Round Table at Web [&#8230;]]]></description>
				<content:encoded><![CDATA[<p><a href="../wp-content/uploads/2010/10/Screen-shot-2010-10-11-at-11.40.56-AM.png" mce_href="../wp-content/uploads/2010/10/Screen-shot-2010-10-11-at-11.40.56-AM.png"></a><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/Screen-shot-2010-10-11-at-11.40.56-AM.png" mce_href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/Screen-shot-2010-10-11-at-11.40.56-AM.png"><img class="alignnone size-medium wp-image-5786" title="Screen shot 2010-10-11 at 11.40.56 AM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/Screen-shot-2010-10-11-at-11.40.56-AM-300x198.png" mce_src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/Screen-shot-2010-10-11-at-11.40.56-AM-300x198.png" alt="Screen shot 2010-10-11 at 11.40.56 AM" height="198" width="300"></a><br mce_bogus="1"></p>
<h3>The Missing Manual for The Future (or The Future: The Missing Manual)</h3>
<p>O&#8217;Reilly Media is famous for producing <a href="http://missingmanuals.com/" mce_href="http://missingmanuals.com/" target="_blank">&#8220;missing manuals&#8221;</a> for new technologies, but thinking of O&#8217;Reilly as just a publisher of books would be like saying Facebook is just a website (this came up in the discussion at the Media Round Table at <a href="http://www.web2expo.com/" mce_href="http://www.web2expo.com/">Web 2.0 Expo, NY, 2010)</a>.  In recent weeks, I managed to catch Tim O&#8217;Reilly at several events &#8211; <a href="http://makerfaire.com/newyork/2010/" mce_href="http://makerfaire.com/newyork/2010/" target="_blank">Maker Faire</a>, <a href="http://www.web2expo.com/" mce_href="http://www.web2expo.com/">Web 2.0 Expo</a>, <a href="http://www.cloudera.com/company/press-center/hadoop-world-nyc/" mce_href="http://www.cloudera.com/company/press-center/hadoop-world-nyc/" target="_blank">Hadoop World</a>, and the free webcast Tim did with John Battelle on <a href="http://radar.oreilly.com/2010/10/the-battle-for-the-internet-ec.html" mce_href="http://radar.oreilly.com/2010/10/the-battle-for-the-internet-ec.html" target="_blank">The Battle for the Internet Economy</a> (although Tim spoke several other times during this period!).</p>
<p>It occurred to me, as I immersed myself in the depth and breadth of innovation showcased and discussed at these events, that Tim O&#8217;Reilly and the O&#8217;Reilly team are creating <b>The Missing Manual for the Future.<br />
</b></p>
<p>As Tim puts it, we are <b>&#8220;changing the world by spreading the knowledge of innovators.&#8221;</b> Tim uses a quote from William Gibson to illuminate what is at the heart of the O&#8217;Reilly project<b>:</b></p>
<p><b>&#8220;The Future is here, it is just not evenly distributed yet.&#8221; (William Gibson)</b></p>
<p>But Tim O&#8217;Reilly makes another point about the future when he speaks.&nbsp; The future unfolds unexpectedly &#8211; so we must invent for an unknown future, not a known future. Or, as Alex Steffen put it so well in his post, <a href="http://www.worldchanging.com/archives/010959.html" mce_href="http://www.worldchanging.com/archives/010959.html" target="_blank"><span>Why Our Bright Green Futures Will Be Weirder Than We Think</span></a>, <b>&#8220;The world we need is one we&#8217;ve never yet seen.&#8221;</b> The magic of attending an O&#8217;Reilly event is that it gives you a chance to work on this koan in interesting ways, and to take more responsibility for how things turn out.<br />
</p>
<p>Tim O&#8217;Reilly also urges that we think more deeply about what we are doing.&nbsp; His keynote for <a href="http://www.cloudera.com/company/press-center/hadoop-world-nyc/" mce_href="http://www.cloudera.com/company/press-center/hadoop-world-nyc/" target="_blank">Hadoop World</a>, NYC, billed as <b>&#8220;The Business of Data,&#8221;</b> turned towards <b>&#8220;The Consequences of Living in a World of Data.&#8221;</b> The 900-strong crowd at Hadoop World was probably one of the most savvy crowds in the world about the business of data, so this was a nice turn.</p>
<p><a href="http://www.web2expo.com/" mce_href="http://www.web2expo.com/">Web 2.0 Expo</a> with the theme, <b>Platforms for Growth,</b> was a deep dive into the business of innovation.&nbsp; Tim Oâ€™Reillyâ€™s keynote at <a href="http://www.web2expo.com/" mce_href="http://www.web2expo.com/">Web 2.0 Expo</a>,&nbsp; â€œThinking Hard About The Futureâ€ (or rather â€œthinking a little bit creatively or differently about the future)&nbsp; â€“ see<a href="http://www.youtube.com/watch?v=3637xFBvkYg&amp;p=6F97A6F4BA797FB3" mce_href="http://www.youtube.com/watch?v=3637xFBvkYg&amp;p=6F97A6F4BA797FB3" target="_blank"> video here,</a> developed the call he made at Web 2.0 Expo 2008, to <b>â€œwork on stuff that matters,â€</b> into a Four  Cylinder Engine for Innovation. &nbsp; The first of the four  cylinders in the firing order is, <b>â€œHaving Fun!â€</b> But,&nbsp; at Maker Faire, Web 2.0 Expo, and Hadoop World I  got an inside  look at the workings of all four cylinders, and there is more to come, I  am sure, as the other Oâ€™Reilly events unfold over the coming months  including,&nbsp; <a href="http://www.web2summit.com/web2010" mce_href="http://www.web2summit.com/web2010" target="_blank">Web 2.0 Summit</a>, <a href="http://strataconf.com/strata2011" mce_href="http://strataconf.com/strata2011" target="_blank">Strata </a>(a new Oâ€™Reilly conference on The Business of Data), and <a href="http://radar.oreilly.com/2010/10/where-20-2011-cfp-is-open.html" mce_href="http://radar.oreilly.com/2010/10/where-20-2011-cfp-is-open.html" target="_blank">Where 2.0,  2011</a>.</p>
<p>In a free webcast last week (<a href="http://www.youtube.com/oreillymedia#p/c/7/8CEyHSoWJcs" mce_href="http://www.youtube.com/oreillymedia#p/c/7/8CEyHSoWJcs" target="_blank">recording here</a>) previewing <a href="http://www.web2summit.com/web2010" mce_href="http://www.web2summit.com/web2010" target="_blank">Web 2.0 Summit</a>, John Battelle and Tim O&#8217;Reilly discussed the <a href="http://map.web2summit.com/" mce_href="http://map.web2summit.com/" target="_blank">Points of Control Map</a>, which is developing into a fun and useful tool to examine a very serious topic, &#8220;The Battle for the Internet Economy,&#8221; and how the &#8220;increasingly direct conflicts between its major players&#8221; could affect &#8220;people, government and the future of technology innovation.&#8221;  In my previous post, <a title="Permanent Link to Platforms for Growth and Points of Control for Augmented Reality: Talking with Chris Arkenberg" rel="bookmark">Platforms for Growth and Points of Control for Augmented Reality</a>, I had a great conversation with <a href="http://www.urbeingrecorded.com/" mce_href="http://www.urbeingrecorded.com/" target="_blank">Chris Arkenberg</a> using this map as a springboard.&nbsp; More on Points of Control later in this post.</p>
<h3>The Four Cylinders of Innovation</h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/Screen-shot-2010-10-23-at-7.45.36-PM.png" mce_href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/Screen-shot-2010-10-23-at-7.45.36-PM.png"><img class="alignnone size-medium wp-image-5814" title="Screen shot 2010-10-23 at 7.45.36 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/Screen-shot-2010-10-23-at-7.45.36-PM-300x193.png" mce_src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/Screen-shot-2010-10-23-at-7.45.36-PM-300x193.png" alt="Screen shot 2010-10-23 at 7.45.36 PM" height="193" width="300"></a><br mce_bogus="1"></p>
<p><i>click to enlarge</i></p>
<h3>From Jet Ponies to Jet Packs: The First Cylinder of Innovation &#8211; &#8220;Have Fun&#8221;</h3>
<p>The &#8220;maker&#8221; energy and its spirit of play, and the courage to create, hack, reinvent and re-purpose everything and anything, is a quintessential example of the first cylinder of innovation firing big.&nbsp; Many &#8220;maker&#8221; projects also go on to fire on all four cylinders.  But the Maker forte definitely is in the first-cylinder zone (and safety third, as some of the rides, including Jet Ponies, warned).&nbsp; The photo opening this post, by Marc de Vinck (for more pics <a href="http://www.flickr.com/photos/wurx/sets/72157624914508135/with/5027190140/" mce_href="http://www.flickr.com/photos/wurx/sets/72157624914508135/with/5027190140/">see here</a>), is of <a href="http://blog.makezine.com/archive/2010/09/tim_oreilly_rides_the_jet_ponies.html" mce_href="http://blog.makezine.com/archive/2010/09/tim_oreilly_rides_the_jet_ponies.html" target="_blank">Tim riding The Jet Ponies</a> at <a href="http://makerfaire.com/newyork/2010/" mce_href="http://makerfaire.com/newyork/2010/" target="_blank">Maker Faire</a>, which took the New York Hall of Science by storm in late September &#8211; see <a href="http://cityroom.blogs.nytimes.com/2010/09/24/where-engineering-prowess-meets-burning-man/" mce_href="http://cityroom.blogs.nytimes.com/2010/09/24/where-engineering-prowess-meets-burning-man/" target="_blank">The New York Times coverage here</a>.&nbsp; The ride was <b>&#8220;built by the dastardly danger-hackers at the <a href="http://madagascarinstitute.com/" mce_href="http://madagascarinstitute.com/" target="_blank">Madagascar Institute</a>.&#8221;</b> See this <a href="http://thefastertimes.com/jetpacks/2009/10/09/this-guy-might-build-a-jetpack-or-at-least-a-hovercraft/" mce_href="http://thefastertimes.com/jetpacks/2009/10/09/this-guy-might-build-a-jetpack-or-at-least-a-hovercraft/" target="_blank">wonderful interview</a> with Hackett on his work to design <b>&#8220;our specific jets from a patent that was filed in the 1960s by a Mr. Lockwood, for Valveless Pulse Jets.&#8221;</b> Hackett points out:</p>
<p><b>&#8220;Louder than god, glowing white-hot and looking like the trombone of the Apocalypse, pulse jets are also really shitty, inefficient engines,&#8221;</b></p>
<p>But, he adds:</p>
<p><b>&#8220;I have always wanted a jetpack, and one of the reasons I learned to build these things was to further that goal.&#8221;</b></p>
<p>This grand vision behind the Jet Ponies is a key to firing <b>The Second Cylinder of Innovation: &#8220;Hey, we can change the world!&#8221;</b></p>
<p>But Jet Ponies, as a stepping stone to jet packs, also really struck a chord for me, as I have been devoting a lot of time lately to the emerging Augmented Reality industry &#8211; a technology which was lumped in the same category of sci-fi chimera as jet packs until very recently.</p>
<h3><b>Data is the Gasoline</b></h3>
<p><b><a href="../wp-content/uploads/2010/10/data.jpg" mce_href="../wp-content/uploads/2010/10/data.jpg"></a><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/data.jpg" mce_href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/data.jpg"><img class="alignnone size-full wp-image-5862" title="data" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/data.jpg" mce_src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/data.jpg" alt="data" height="212" width="300"></a><br />
</b></p>
<p><b>&#8220;The faces are coming from the sky.  The locations are coming from the sky.  All these apps depend on something, somewhere up.  And that, to me, was always the heart of Web 2.0.  And I am so delighted that people are finally getting it.  Because for a long time, people thought, &#8216;Oh, Web 2.0, it&#8217;s about lightweight advertising supported in a web start up.&#8217;  So I went, &#8216;No, no, no.  It&#8217;s about the fact that we&#8217;re building these giant database subsystems in the sky that are going to drive applications.&#8217;  And now, of course, the same application is on your PC, it&#8217;s on your phone, it&#8217;s on your iPad.  And clearly, the applications are just sort of an interface to something that is being driven from the cloud, and that is fabulous.  That&#8217;s the difference.  People get it now.&#8221; </b>(Tim O&#8217;Reilly said this as part of a response to the first questioner at the Media Round Table at Web 2.0 Expo)</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/5036745797_cf544d22cd_z.jpg" mce_href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/5036745797_cf544d22cd_z.jpg"><img class="alignnone size-medium wp-image-5802" title="Media Roundtable" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/5036745797_cf544d22cd_z-300x199.jpg" mce_src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/5036745797_cf544d22cd_z-300x199.jpg" alt="Media Roundtable" height="199" width="300"></a><br mce_bogus="1"></p>
<p><i>Answering questions about the importance of &#8220;Having Fun&#8221; to innovation doesn&#8217;t look quite as fun as riding Jet Ponies!</i> <i>Photo above from <a href="http://www.flickr.com/photos/lucasartoni/5036745797/in/photostream/" mce_href="http://www.flickr.com/photos/lucasartoni/5036745797/in/photostream/" target="_blank">luca.sartoni&#8217;s Flickr stream</a></i></p>
<p><b>&#8220;The data that is generated by the sensors, and the applications that use that data, is going to be where people are going to be innovative.&#8221; (Tim O&#8217;Reilly)<br />
</b></p>
<p>During the Media Round Table, I had a chance to ask Tim more about the role of bottom-up innovation in a world where big data is the gasoline for increasingly sophisticated engines &#8211; platforms integrating machine-to-machine intelligence and real-time analytics.</p>
<p><b>Tish Shute:</b> You brought up Maker Faire in your keynote, and again now.  I was there, which not many people in the audience were [not too many hands went up when Tim asked during his keynote].  But I think one of the things that struck me was the jet ponies &#8211; they were just earthshaking to stand near.  They made the ground tremble; they made the world shake.  Yet most of your keynote, and most of what&#8217;s on our minds here at Web 2.0 Expo, is extracting intelligence from the big data [in the sky], and algorithmic intelligences are the jet engines of the internet.  And of course, not to be forgotten, we are here in New York City, where the trading markets are creating the air we breathe [although we probably don't realize it until we lose our mortgage or something], and these algorithmic economies or &#8220;robot casinos,&#8221; as Kevin Slavin put it, are all about speed &#8211; it&#8217;s not just real-time; issues of latency are so critical that co-location is key to winning the game of the markets.  [Kevin Slavin brilliantly unpacks this in his talk, "Loitering on the Motherboard."  For more on this see my conversation with Kevin Slavin below.]</p>
<p>So my question is, who&#8217;s making the jet ponies for the algorithmic economies in the sky that you just described?&nbsp; How can we make a play from the bottom up?&nbsp; I always feel <a href="http://www.ushahidi.com/" mce_href="http://www.ushahidi.com/">Ushahidi</a> is one of the jet ponies of the data algorithmic space [because of their great work to bring human and machine intelligence together to solve problems in crisis situations].  But who do you think is doing exciting work, and how can we ensure that this powerful world of data and algorithmic intelligences does not become hidden in a closed black box [only really accessible to elite players like the NYC trading markets]?</p>
<p><b>Tim O&#8217;Reilly: &#8220;Well, I think there&#8217;s certainly a lot of interesting things happening in, say, the financial services that a lot of, kind of, the Internet folks are kind of blind to.  I think that there are companies like <a href="http://www.nextjump.com/" mce_href="http://www.nextjump.com/" target="_blank">Next Jump</a> which are really good with data and good with algorithms.  But speaking specifically to the maker side of this, that whole sensor-enabled world which is going to produce data is in its infancy.  What we have that I think is so powerful right now is we have the first portable sensor platform.  I said in my talk the other day, you know, your phone has ears, it has eyes, it has a sense of where it is.  And these are all available to application developers.  You know, you can compare, say, Dodgeball to Foursquare, and you can see how different&#8230;  Dodgeball is Foursquare in the tele-type era.  Foursquare is now possible because there are so many more capabilities on the phone.</b></p>
<p><b>And I think that we are going to see a lot of other areas that are revolutionized by the sensors in the device.  It could well be that some of them will come explicitly out of the maker kind of projects, or it could just be that maker is sort of a proxy for them.  So yeah, <a href="http://www.arduino.cc/" mce_href="http://www.arduino.cc/" target="_blank">Arduino</a> is this great maker sensor platform, but hey, here&#8217;s a consumer sensor platform [holding up phone].  Maybe we vaulted past the maker stage already and we just didn&#8217;t know it.</b></p>
<p><b>And that&#8217;s not entirely true, because Arduino is building a whole economy of special purpose devices.  But it feels a little bit like the days when people rolling their own PCs coexisted with the rise of Dell, who was a kid in his college dorm room who made his own PCs and sold them on the net, but figured out how to scale it pretty quickly and get good at it.  But there were still a lot of garage shops, you know, &#8216;I&#8217;ll make a PC and sell it to you&#8217; people, for probably a decade before there was really a clue that that was a commodity industry.  In fact, I do think the sensor platforms are going to become a commodity industry.  And the data that is generated by the sensors, and the applications that use that data, is going to be where people are going to be innovative.&#8221;</b></p>
<h3><b>The internet operating system is a data operating system and it is happening in real time (Tim O&#8217;Reilly)</b></h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/Hadooppost.jpg" mce_href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/Hadooppost.jpg"><img class="alignnone size-medium wp-image-5839" title="Hadooppost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/Hadooppost-300x202.jpg" mce_src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/Hadooppost-300x202.jpg" alt="Hadooppost" height="202" width="300"></a><br mce_bogus="1"></p>
<p><i>Click to enlarge the image above &#8211; a slide from Mike Olsen&#8217;s (CEO of Cloudera) keynote at <a href="http://www.cloudera.com/company/press-center/hadoop-world-nyc/" mce_href="http://www.cloudera.com/company/press-center/hadoop-world-nyc/" target="_blank">Hadoop World</a></i></p>
<p>Not only do we have a portable sensor platform in our pockets, but developers also have powerful platforms and tools to make sense of the data that fuels our apps.  Open-source <a href="http://hadoop.apache.org/" mce_href="http://hadoop.apache.org/" target="_blank">Hadoop</a> makes available, to anyone with some data-munching chops, the power to work with giant unstructured databases and do <a target="_blank" mce_href="http://gigaom.com/2009/09/20/getting-closer-to-real-time-with-hadoop/" href="http://gigaom.com/2009/09/20/getting-closer-to-real-time-with-hadoop/">the kind of real-time analytics</a> previously only available to giants like Google.  Big players like Yahoo, Facebook, and Twitter use Hadoop (Jonathon Gray from Facebook noted they add 10TB <i>a day</i>).  But, as <a href="http://www.cscyphers.com/blog/2010/10/12/hadoop-world-2010/" mce_href="http://www.cscyphers.com/blog/2010/10/12/hadoop-world-2010/" target="_blank">this great roundup of Hadoop World</a> points out, while Hadoop gets the press for handling petabytes of data, Mike Olsen (CEO of Cloudera) noted that the fastest growing group of users are working with clusters smaller than 10TB, and over half of the Hadoop clusters were under 10TB in size.</p>
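<p>The programming model behind those clusters is simpler than the cluster sizes suggest. Below is a minimal, single-machine sketch of the map/shuffle/reduce pattern that Hadoop popularized, using the canonical word-count example; the function names and the in-process &#8220;shuffle&#8221; are my own simplification for illustration, not Hadoop&#8217;s actual API, which distributes the same two roles across many machines.</p>

```python
# Minimal in-process sketch of the map/shuffle/reduce pattern that
# Hadoop popularized. Names here are illustrative, not Hadoop API.
from collections import defaultdict

def mapper(line):
    """Emit (word, 1) pairs for each word in a line of text."""
    for word in line.lower().split():
        yield word, 1

def reducer(word, counts):
    """Sum all counts collected for a single word."""
    return word, sum(counts)

def run_job(lines):
    # "Shuffle" phase: group mapper output by key, as Hadoop does
    # between the map and reduce stages (here, all on one machine).
    groups = defaultdict(list)
    for line in lines:
        for word, n in mapper(line):
            groups[word].append(n)
    return dict(reducer(w, c) for w, c in groups.items())

if __name__ == "__main__":
    data = ["big data tends to be collected on faith",
            "big data is the gasoline"]
    print(run_job(data))
```

<p>On a real cluster, Hadoop Streaming runs the equivalent of these two functions as separate stdin/stdout scripts spread across the nodes; the sketch above only shows the data flow.</p>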
<h3>Four Square: A Platform for Growth with an ecosystem built on top of data that exists in the real world</h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/Screen-shot-2010-10-26-at-2.27.19-AM.png" mce_href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/Screen-shot-2010-10-26-at-2.27.19-AM.png"><img class="alignnone size-medium wp-image-5888" title="Screen shot 2010-10-26 at 2.27.19 AM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/Screen-shot-2010-10-26-at-2.27.19-AM-300x256.png" mce_src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/Screen-shot-2010-10-26-at-2.27.19-AM-300x256.png" alt="Screen shot 2010-10-26 at 2.27.19 AM" height="256" width="300"></a><br mce_bogus="1"></p>
<p>As an augmented reality enthusiast, it is not hard to guess that one of my favorite platforms for growth is <a href="http://foursquare.com/apps/" mce_href="http://foursquare.com/apps/" target="_blank">Four Square</a>.&nbsp; See <a href="http://www.web2expo.com/webexny2010/public/schedule/detail/15652" mce_href="http://www.web2expo.com/webexny2010/public/schedule/detail/15652" target="_blank">Dennis Crowley&#8217;s keynote at Web 2.0 Expo</a> here.&nbsp; The Four Square API has been available to developers since November 2009, and there are already a number of interesting applications, with many more to come.&nbsp; The screen shot above is of <a href="http://geopollster.com/" mce_href="http://geopollster.com/" target="_blank">geopollster</a> &#8211; <a href="http://foursquare.com/apps/" mce_href="http://foursquare.com/apps/" target="_blank">see the gallery of Four Square apps here</a>.</p>
<p><i><b>@dens tweeted recently: &#8220;Politics + @Foursquare = @GeoPollster&#8221; http://geopollster.com &lt;- I love love love that people are using 4SQ to think about election tools</b></i></p>
<p>As Kati London pointed out in her keynote, Four Square is the <b>"kind of augmented reality that is aimed at shifting or changing a person's social reality, e.g. the mayor badges in Four Square that change my relationship to the people and the place I am in, and augment engagement and reputation through socially driven consumer tie-ins."</b> We are already seeing augmented reality developers beginning to work with the Four Square API – see here: <a href="http://recombu.com/apps/iphone/arstreets-app-review_M12590.html" target="_blank">Foursquare + Augmented Reality + Virtual Graffiti = ARstreets</a>.</p>
<p>As augmented reality development tools mature, Four Square will increasingly become an important platform for creative AR developers interested in integrating its power for augmented engagement and reputation with <b>"device aided augmented reality that can shift visual experiences of situated geolocal experiences."</b> With the <a href="http://developer.qualcomm.com/dev/augmented-reality" target="_blank">Qualcomm vision based augmented reality SDK</a> now available for download, <a href="http://googlewavedev.blogspot.com/2010/09/wave-open-source-next-steps-wave-in-box.html" target="_blank">Wave in a Box</a> soon (?) to be released, and an <a href="http://arwave.org/" target="_blank">ARWave</a> client working on Android (almost!), I have been exploring the Four Square API in my non-existent spare time!!</p>
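<p>For anyone curious what poking at the API looks like, here is a hedged sketch in Python. The v2 <code>venues/search</code> endpoint and the <code>client_id</code>/<code>client_secret</code> parameters reflect Foursquare's public documentation of the time, but the credentials below are placeholders you would only get by registering an app:</p>

```python
# Sketch of a Foursquare v2 venues search; credentials are placeholders.
import json
import urllib.parse
import urllib.request

API_ROOT = "https://api.foursquare.com/v2"

def venues_search_url(lat, lng, client_id, client_secret, limit=10):
    """Build a venues/search request URL for the given coordinates."""
    params = urllib.parse.urlencode({
        "ll": f"{lat},{lng}",          # latitude,longitude pair
        "limit": limit,
        "client_id": client_id,
        "client_secret": client_secret,
    })
    return f"{API_ROOT}/venues/search?{params}"

def venues_near(lat, lng, client_id, client_secret):
    """Fetch nearby venues and return the decoded JSON response."""
    url = venues_search_url(lat, lng, client_id, client_secret)
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)
```

<p>The response JSON carries the venue list (names, check-in counts, and so on) that an AR client could anchor to the camera view.</p>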
<p>The Four Square API also offers some interesting possibilities for exploring games that take the complex economy of Four Square – not personal data but aggregates of behavior – as their subject matter (for more on this, see my conversation with Kevin Slavin later in this post and in an upcoming post).</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/DennisatWhere2009post.jpg"><img class="alignnone size-medium wp-image-5886" title="DennisatWhere2009post" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/DennisatWhere2009post-199x300.jpg" alt="DennisatWhere2009post" height="300" width="199"></a></p>
<p><i>I took this picture of Dennis at <a href="http://where2conf.com/where2009/" target="_blank">Where 2.0, 2009</a>, at the beginning of Four Square's phenomenal growth (they are at 4 million plus users now).</i></p>
<h3><b>Pachube (Patch-Bay):</b> a web service for storing and sharing sensor, energy, and environmental data</h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/Screen-shot-2010-10-24-at-7.58.17-PM1.png"><img class="alignnone size-medium wp-image-5838" title="Screen shot 2010-10-24 at 7.58.17 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/Screen-shot-2010-10-24-at-7.58.17-PM1-300x198.png" alt="Screen shot 2010-10-24 at 7.58.17 PM" height="198" width="300"></a></p>
<p>Eighteen months ago, I interviewed Usman Haque (architect and director, <a id="o.td" title="Haque Design + Research" href="http://www.haque.co.uk/" target="_blank">Haque Design + Research</a>) and founder of <a id="cpbp" title="Pachube" href="http://www.pachube.com/">Pachube</a> – see <a target="_blank">Pachube, Patching the Planet</a>.&nbsp; Usman pointed me to this wonderfully evocative image from <a href="http://www.geog.ubc.ca/%7Etoke/Profile.htm" target="_blank">T.R. Oke's</a> book, <a href="http://www.amazon.com/Boundary-Layer-Climates-T-Oke/dp/0415043190" target="_blank">"Boundary Layer Climates"</a> (original photo source: Prof. L. E. Mount's <a href="http://www.alibris.com/booksearch?qwork=1137594&amp;matches=1&amp;author=Mount%2C+Laurence+Edward&amp;browse=1&amp;cm_sp=works*listing*title" target="_blank">The Climatic Physiology of the Pig</a>).&nbsp; "<i>It's the same piglets, in the same box, but on the right-hand side the temperature has been increased. This small change in how the space is 'programmed' has dramatically changed the way the 'inhabitants' relate to each other and how they relate to their space.</i>"</p>
<h3><b>The Challenge of Connecting People and Environments</b></h3>
<p>At Web 2.0 Expo, I got the opportunity to talk with Usman Haque again.&nbsp; <a href="http://www.pachube.com/" target="_blank">Pachube</a> is becoming an established platform now, Usman explained.&nbsp; They have a development team of eleven and a robust back end.&nbsp; And they will now be spending more time on the front end, including a redesign of the website, making <b>"it a lot easier to widgetize the entire website so that you will be able to take almost any element and embed that into your own website."</b> And, as <a href="http://www.web2expo.com/webexny2010/public/schedule/speaker/43845" target="_blank">Usman mentioned in his presentation</a>, they are working on an augmented reality interface, Porthole, for facilities management and "as a consumer-oriented application that extends the universe of Pachube data into the context of AR – a 'porthole' into Pachube's data environments."&nbsp; Usman is also contributing to the AR standards discussion and is now on the program committee <a href="http://www.w3.org/2010/06/16-w3car-minutes.html#item02" target="_blank">for the W3C group on augmented reality</a>.</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/Screen-shot-2010-10-26-at-10.22.24-PM.png"><img class="alignnone size-medium wp-image-5912" title="Screen shot 2010-10-26 at 10.22.24 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/Screen-shot-2010-10-26-at-10.22.24-PM-300x134.png" alt="Screen shot 2010-10-26 at 10.22.24 PM" height="134" width="300"></a></p>
<p>Click to enlarge the image above from Chris Burman's paper for the W3C, <a href="http://www.w3.org/2010/06/w3car/portholes_and_plumbing.pdf" target="_blank">Portholes and Plumbing: how AR erases boundaries between "physical" and "virtual"</a></p>
<p>Pachube is sometimes described as the Facebook for Data or, an analogy Usman prefers, a Twitter for Sensors.&nbsp; At Web 2.0 Expo, I had an amazing opportunity to hear from Twitter and Facebook about their strategies as platforms for growth.&nbsp; This gave me lots of fuel for questions about Pachube's approach to developing its platform.&nbsp; Simplicity was a theme that Facebook and Twitter both affirmed as key.&nbsp; One of Pachube's challenges will be to deliver ease of use, and the equivalent of Facebook's "like" and Twitter's "follow," to gain mass appeal.</p>
<p>Here is a brief excerpt from my upcoming conversation with Usman:</p>
<p><b>Tish Shute</b>: So as a platform you see Pachube as having more in common with Twitter – a Twitter for Sensors. In what ways is Pachube similar to Twitter?</p>
<p><b>Usman Haque: Well, we are the Twitter of sensors, devices &amp; machines in the sense that, really, the API that enables all this communication is important, much more so than the website itself. It is where, basically, most of the millions of our hits actually go: to the backend. And we've now got dozens of applications built on top of the system, a little bit like Twitter's applications; you know, all the apps are the important part.</b></p>
<p><b>But we are actually going to be doing some quite exciting things with API keys that we haven't really spoken that much about in public. We have come up with a pretty innovative solution to make almost every resource have granular privacy options on it, <a href="http://community.pachube.com/node/526">now discussed here</a>.</b></p>
<p>At Hadoop World, Tim O'Reilly also raised some interesting broader questions that are very relevant to Pachube's vision to "patch the planet," e.g., the problem of digital identity in the age of sensors. (Smart phones already know their users by the way they walk!) And, <b>"How should we think about privacy in a world where data can be triangulated?"</b></p>
<p>Usman talked about Pachube's approach to both the technical aspects of how to build a massively scalable system, and the conceptual aspects of how people connect to each other and what they might do with these new opportunities to connect environments and sensor data (see my earlier talk with Usman, <a target="_blank">Pachube, Patching the Planet</a>, for a detailed explanation of some of the concepts behind Pachube).</p>
<p>I look forward to posting this conversation.  Pachube is growing, and  Usman always goes beyond the familiar tropes of connecting human and  machine intelligence.</p>
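<p>To make the "Twitter for sensors" idea concrete, here is a minimal sketch of reading a feed over Pachube's REST API. The v2 feed URL and the <code>X-PachubeApiKey</code> header match the public docs of the era, but the feed id and key below are placeholders, and the JSON field names (<code>datastreams</code>, <code>current_value</code>) are assumptions worth checking against the live documentation:</p>

```python
# Sketch of reading a Pachube feed; feed id and API key are placeholders.
import json
import urllib.request

API_ROOT = "http://api.pachube.com/v2"

def feed_request(feed_id, api_key):
    """Build an authenticated request for one feed's current values as JSON."""
    url = f"{API_ROOT}/feeds/{feed_id}.json"
    return urllib.request.Request(url, headers={"X-PachubeApiKey": api_key})

def current_values(feed_id, api_key):
    """Return {datastream id: current value} for a feed (assumed JSON shape)."""
    with urllib.request.urlopen(feed_request(feed_id, api_key)) as resp:
        feed = json.load(resp)
    return {ds["id"]: ds["current_value"] for ds in feed.get("datastreams", [])}
```

<p>The same key-scoped pattern is what makes the granular per-resource privacy options Usman describes possible: every request carries a key, so every resource can carry its own access rules.</p>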
<h3><b>2nd Cylinder of Innovation: "Hey Can We Change the World!"</b></h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/Screen-shot-2010-10-24-at-5.26.55-PM.png"><img class="alignnone size-medium wp-image-5826" title="Screen shot 2010-10-24 at 5.26.55 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/Screen-shot-2010-10-24-at-5.26.55-PM-300x217.png" alt="Screen shot 2010-10-24 at 5.26.55 PM" height="217" width="300"></a></p>
<p>The possibilities for reimagining the role of data in healthcare produced some of the most powerful "Hey Can We Change the World" moments for me at both Web 2.0 Expo and Hadoop World.&nbsp; The slide above is from Esther Dyson's brilliant Ignite presentation, <a href="http://www.slideshare.net/ignitenyc/esther-dyson-what-you-can-and-cant-learn-from-your-genes" target="_blank">"What you can and can't learn from your genes"</a>.&nbsp; Tim O'Reilly also brought up the powerful role real-time data analytics can play in improving healthcare in his Hadoop World keynote.&nbsp; Also see Alex Howard's post, <a href="http://radar.oreilly.com/2010/10/top-10-lessons-for-gov-20-from.html" target="_self">10 Lessons for Gov 2.0 from Web 2.0</a>, for some more great "hey we can change the world" moments at Web 2.0 Expo.&nbsp; The keynote from <a href="http://www.web2expo.com/webexny2010/public/schedule/detail/15726" target="_blank">Lukas Biewald of CrowdFlower and Leila Chirayath Janah of Samasource</a> (screen shot below), in particular, is a provocative exploration of the future of work in the new ecologies of human and machine intelligence.</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/Screen-shot-2010-10-25-at-8.21.43-PM.png"><img class="alignnone size-medium wp-image-5870" title="Screen shot 2010-10-25 at 8.21.43 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/Screen-shot-2010-10-25-at-8.21.43-PM-300x184.png" alt="Screen shot 2010-10-25 at 8.21.43 PM" height="184" width="300"></a></p>
<h3><b>Changing the World When Our Lives Are Increasingly Shaped by Forces Invisible To Us?</b></h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/Screen-shot-2010-10-24-at-11.49.32-PM.png"><img class="alignnone size-medium wp-image-5840" title="Screen shot 2010-10-24 at 11.49.32 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/Screen-shot-2010-10-24-at-11.49.32-PM-300x152.png" alt="Screen shot 2010-10-24 at 11.49.32 PM" height="152" width="300"></a></p>
<p><i>Click to enlarge</i></p>
<p>Mike Olson, CEO of Cloudera, noted that <b>"the largest area of data growth does not come from humans interacting with machines; rather, it's from machines interacting with each other"</b> (see here in <a href="http://www.cscyphers.com/blog/2010/10/12/hadoop-world-2010/" target="_blank">Minor Technical Difficulties</a>).&nbsp; One of the most interesting presentations at Web 2.0 Expo was <a href="http://www.web2expo.com/webexny2010/public/schedule/speaker/86516" target="_blank">Kevin Slavin's "Loitering on the Motherboard,"</a> which, as Tim O'Reilly pointed out in his keynote at Hadoop World, raises all kinds of questions about a system where big players are gaming the data for their own ends.</p>
<p>Kevin Slavin, a founder of <a href="http://areacodeinc.com/">Area/Code</a>, notes that the operating system of our mortgages and life insurance, the operating system of currencies and gold, is now governed by machine-to-machine intelligence and algorithmic economies outside of human cognitive processes.&nbsp; The markets are now legible only to bots in an algorithmic arms race, with bots surveilling bots and throwing off false information in a bid for counter-surveillance.&nbsp; He showed some slides of the eerie but beautiful visualizations of the traces of the trading bots created from the Nanex API.</p>
<p>The screenshot above is from <a href="http://www.nanex.net/FlashCrash/CCircleDay.html" target="_blank">Nanex: Crop Circle of the Day – Quote Stuffing and Strange Sequences</a>.&nbsp; <b>"The common theme with the charts shown on this page is they are all generated in code and are algorithmic. Some demonstrate bizarre price or size cycling, some demonstrate large bursts of quotes in extremely short time frames and some will demonstrate both…"</b> This one is a zoom of the NSDQ "Wild Thing," a wild price/size repeater from NSDQ running at 1,000 quotes per second, affecting the BBO along the way (I love the great names Nanex gives the different patterns and traces produced by the trading bots).</p>
<p>Nanex supplies a <a href="http://www.nanex.net/">real-time data feed</a> comprising trade and quote data for all US equity, option, and futures exchanges. They have <a href="http://www.nanex.net/historical.html">archived this data</a> since 2004 and have created and used numerous tools to "sift through the enormous dataset: approximately 2.5 trillion quotes and trades as of June 2010." May 6, 2010 (the day of the flash crash) had approximately 7.6 billion trade, quote, level 2, and depth records.</p>
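<p>As a toy illustration of the kind of screen such an archive invites, the sketch below flags one-second windows whose quote count crosses a "stuffing" threshold. The 1,000-quotes-per-second figure echoes the "Wild Thing" pattern above; the input format (a list of millisecond timestamps) is invented for illustration and is not Nanex's actual feed format:</p>

```python
# Toy quote-stuffing screen: flag seconds with an abnormal quote count.
from collections import Counter

def burst_seconds(timestamps_ms, threshold=1000):
    """Return the whole seconds in which the quote count exceeds threshold."""
    per_second = Counter(ts // 1000 for ts in timestamps_ms)  # bucket by second
    return sorted(sec for sec, n in per_second.items() if n > threshold)

if __name__ == "__main__":
    # 1,500 quotes crammed into second 0, then two quotes in second 1
    quotes = [i % 1000 for i in range(1500)] + [1000, 1500]
    print(burst_seconds(quotes))  # second 0 is flagged
```

<p>Real screens would, of course, work per symbol and per exchange and look at price/size cycling as well as raw rate, but the shape of the computation is the same.</p>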
<p>Kevin points out that our lives are being shaped by criteria invisible to us, and that the old hackneyed tropes of machine-to-machine intelligence, such as robots reading HUDs in English, are long worn out.&nbsp; The latter point is, perhaps, something for us augmented reality geeks absorbed in ideas of "making the invisible visible" to chew on.</p>
<p>Changing a world shaped by forces that are, increasingly, invisible to us presents a huge challenge.</p>
<p>But I had the glimmer of a "Hey Can We Change the World" moment when I attended Kevin Slavin's presentation and had a conversation with him after his talk.&nbsp; Could games take these complex economies as their subject matter?&nbsp; The economies of Farmville and games like WoW are not opaque at all, and these are environments with complex economic behavior, <b>"where you can actually have enough data to understand what it is"</b> – <b>"it's not so much about personal data. It's more about, like, aggregate behaviors."</b> <b>"Games that can really model those, and play with those, and take those as the subject the way that Monopoly takes Monopoly as a subject could be really interesting."</b> Kevin made many fascinating points – more to come on this topic.</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/KevinSlavin.jpg"><img class="alignnone size-medium wp-image-5980" title="Kevin Slavin" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/KevinSlavin-300x199.jpg" alt="Kevin Slavin" height="199" width="300"></a></p>
<p>Photo by <a rel="nofollow" href="http://duncandavidson.com/">James Duncan Davidson</a> of Kevin Slavin speaking at Web 2.0 Expo NY, 2010, from the <a href="http://www.flickr.com/photos/oreillyconf/5035426532/" target="_blank">O'Reilly Conferences Flickr stream</a></p>
<p>Here is the beginning of our conversation:</p>
<h3>Talking With Kevin Slavin</h3>
<p><b>Tish Shute: </b>You began your talk today about visibility and where some of the algorithmic masters of disguise went to work after they had solved the math behind stealth bombers. I thought perhaps you were leading into ideas about a reverse surveillance society.</p>
<p>But you surprised me, as I felt you made visibility itself kind of a non-issue by the end of your presentation, and that counter-surveillance became basically a time and speed issue. Now I am not sure quite how to imagine a counter-surveillance society, something I try to think about…</p>
<p><b>Kevin Slavin: Well, let's see. There's a couple of ways to think about it. I think one point is just that when we talk about counter-surveillance, we usually locate that as something that comes from the bottom up, something that comes from the population. Think about the way the plane spotters discovered the CIA black rendition flights.</b></p>
<p><b><b>I  think in general, when people talk about counter  surveillance, or  sousveillance, they imagine it as an inversion of the  traditional  relationship between the people and the state.</b></b></p>
<p><b>But that's what's interesting. What's happening now is that there are forms of surveillance and counter-surveillance that are in play beyond any human perceptual horizons. These forms are at their most sophisticated in financial services, in the markets.</b></p>
<p><b>If you were a bot, and could read the market legibly (which humans cannot), what you would see, effectively, are bots that are surveilling bots. Then you have bots that are throwing off false information in a bid for counter-surveillance. Many of the bots are, themselves, surveilling other bots; each one of them is trying to figure out what all the other ones are going to do. In essence, it's an algorithmic arms race, and game theory has become concrete, since the theories are code, the code is action, and the action affects, let's say: your mortgage.</b></p>
<p><b>And so, basically, what you have is this series of algorithms that are all looking to discern each other, while also trying to prevent themselves from being discerned. I think of the tunnels under the trenches in WWI, tunnels to surveil the trenches, and then, later, tunnels to surveil the tunnels. There's a few examples of this kind of thing. But it's especially strange when it's computer code, and at the magnitude we're seeing today.</b></p>
<p><b>All of it, as noted in the talk, accounting for 70% of all the trades in the market. 70% of market trades are never touched by human hands or even seen by human eyes; they don't move through a conventional cognitive process. And that's why you get things like the Credit Suisse algorithm: it was buying and selling 200,000 shares of stock to itself over and over and over again. It was a bug, and it slowed the market to a crawl.</b></p>
<p><b>Credit Suisse was fined, in essence, for failing to control an algorithm. Maybe that's the first time an algorithm was treated like a human, in a way. As if the algorithm broke the law, and Credit Suisse was responsible for letting it do so. For me, that feels like a threshold event.</b></p>
<p><b>It's not that humans never made mistakes when trading on the market. But when algorithms err, they err with magnitude.</b></p>
<p><b>The idea that we now have bugs in the United States market economy is really worth looking at. If Apple can't keep code bugs out of the most simple iPhone apps in a closed and regulated ecosystem, I'm pretty certain we'll have a lot more Credit Suisse type bugs in the future.</b></p>
<p><b><b>And  that will be pretty interesting. There will be  viruses, and the  operating system they will operate on will be the  operating system of  the United States. The operating system of your  pension, your house,  your life insurance. The operating system of  currencies and gold.</b></b></p>
<p><b>Tish Shute:</b> I was hard-pressed by the end of your talk to think, "Well, what would be the equivalent of, sort of, a people's uprising to create a better, fairer society in this kind of world where, really, the things that affect the key aspects of our lives most are going on beyond human perception at an algorithmic level?"&nbsp; But you made a pretty radical suggestion at the end…</p>
<p><b>Kevin Slavin: Well, I think increasingly the markets have become delaminated from anything meaningful. First from goods, then from fundamentals, and now finally from homo sapiens. So that's hard to fight.</b></p>
<p><b>It's the race towards abstraction that makes it impossible to simply "resist." The latest version in the long series of fiscal catastrophes was based on Wall Street finding goods that could be rolled up and sold with false valuations, but goods that would take a long time to fail. Mortgages are handy like that. It's the tradition of extending the abstraction as long as possible, until finally the bill arrives and the banks fail. I don't know if that's something to rise up against or not. It's like a rally against evil.</b></p>
<p><b>But really, I think the point is that it won't be the people that rise up. It will be the financial services themselves that rise up. They'll just detach completely.</b></p>
<p><b>That was harder to do with cotton or with wheat, with simple futures; they keep financial services tied to the ground. So what we're doing is creating increasingly complex financial instruments that are further and further removed from anything you can touch. Like the way a mortgage is abstract. But, of course, the bottom line is that at the end of that mortgage lies someone's home.</b></p>
<p><b>It's said that Wall Street is now moving on to life insurance, because that's going to take even longer to fail. They're doing the exact same thing. The word is that they are rolling up CDOs made out of crap life insurance policies, the same way they rolled them up with crap mortgages a few years ago.</b></p>
<p><b>And those will probably take, I don't know, 15 or 20 years to unwrap and unravel. But what you see in the meantime is that they are looking for things that are increasingly abstract, intangible, removed as far as possible from the experience of everyday life.</b></p>
<p><b>So maybe this is good. Maybe that's financial services rising up. Lifting off. I think the best case scenario now is that they actually leave humans alone altogether. That, someday, they are just trading, effectively, completely arbitrary goods, the stocks could be anything at all, maybe for crops that no longer exist, and I'm just saying that then these bots would no longer affect what we do and what we are, it would just be a robot casino, an invisible paradise in the air.</b></p>
<h3><b><b>People are the platform: How Games Can Be Engines of Innovation in Our Lives</b></b></h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/Screen-shot-2010-10-25-at-11.34.58-PM.png"><img class="alignnone size-medium wp-image-5872" title="Screen shot 2010-10-25 at 11.34.58 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/Screen-shot-2010-10-25-at-11.34.58-PM-300x204.png" alt="Screen shot 2010-10-25 at 11.34.58 PM" height="204" width="300"></a></p>
<p><i><b>See the video of <a href="http://www.web2expo.com/webexny2010/public/schedule/detail/15446" target="_blank">Games that Know Where We Live</a> here (screen shot above)</b></i></p>
<p>Kati London, Senior Producer, <a href="http://areacodeinc.com/">Area/Code</a>, in her keynote showed how <a href="http://www.web2expo.com/webexny2010/public/schedule/detail/15446" target="_blank">games that know where we live</a> can shift players' perspectives – from device aided augmented reality that can shift visual experiences of situated geolocal experiences, to a kind of augmented reality that is aimed at shifting or changing a person's social reality, e.g. the mayor badges in Four Square that change my relationship to the people and the place I am in, and augment engagement and reputation through socially driven consumer tie-ins.</p>
<p>Area/Code has recently developed <a id="internal-source-marker_0.7281649763651145" href="http://www.knightfoundation.org/news/press_room/knight_press_releases/detail.dot?id=370129">two games for the Knight Foundation</a> that take people as the platform.&nbsp; Macon Money uses very simple game dynamics (for more, <a href="http://www.web2expo.com/webexny2010/public/schedule/detail/15446" target="_blank">see the video</a> of Kati's keynote) in a game designed to help "Knight's continuing efforts to support revitalizing Macon and creating a vibrant college town."</p>
<p>The other game that Area/Code has designed with the support of the Knight Foundation is for the Biloxi and Gulf Coast community, a game called Battlestorm.&nbsp; <a href="http://www.knightfoundation.org/news/press_room/knight_press_releases/detail.dot?id=370129">"The game's purpose is to increase awareness about natural disasters and change the way people prepare for them."</a></p>
<h3><b>3rd Cylinder of Innovation: Build products, business models and entire industries.</b></h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/Screen-shot-2010-10-23-at-11.06.57-PM.png"><img class="alignnone size-medium wp-image-5822" title="Screen shot 2010-10-23 at 11.06.57 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/Screen-shot-2010-10-23-at-11.06.57-PM-300x151.png" alt="Screen shot 2010-10-23 at 11.06.57 PM" height="151" width="300"></a></p>
<p><a href="http://www.glympse.com/" target="_blank">Glympse</a> – real-time, private location tracking</p>
<p>Julianne Pepitone, Yahoo! Finance, nailed the essence of Web 2.0 Expo, NYC, this year in her post, <a href="http://finance.yahoo.com/news/Web-20-Expo-startups-are-big-cnnm-2700333063.html?x=0&amp;.v=2" mce_href="http://finance.yahoo.com/news/Web-20-Expo-startups-are-big-cnnm-2700333063.html?x=0&amp;.v=2" target="_blank">Web 2.0 Expo startups are big on neighborhoods, storytelling</a>.&nbsp; She writes:</p>
<p><b>"At the Web 2.0 Expo in New York City this week, executives from big sites like Facebook, Twitter and Pandora all spoke about industry trends. But the showcase of 27 startup tech companies stole the show."</b></p>
<p>Listen carefully to Tim O’Reilly and Fred Wilson, Union Square Ventures, question their picks from the <a href="http://www.web2expo.com/webexny2010/public/schedule/detail/15525" target="_blank">startup showcase</a> at Web 2.0 Expo.&nbsp; Also see <a href="http://www.youtube.com/watch?v=Xbui5_5_NCA&amp;p=6F97A6F4BA797FB3" target="_blank">this video of Fred and Tim discussing their conversations with all the startups</a>.&nbsp; This is one of the clearest public windows onto both how to present your company to VCs and how to figure out the most important questions for you as an entrepreneur building a business in a world of data.</p>
<p><a href="http://www.glympse.com/">Glympse</a> <a href="http://www.youtube.com/watch?v=EuKScQbPvVc&amp;feature=channel" target="_blank">successfully pitches</a> their “jet pony” strategy for a location based business, and is Fred’s pick.&nbsp; They hold up well under pressure and answer Tim and Fred’s hard questions about how their startup will not get overtaken by an incumbent player with resources and market share before they can gain traction.&nbsp; <a href="http://www.food52.com/">food52</a> <a href="http://www.youtube.com/watch?v=NZZ0apJTUQA&amp;feature=channel" target="_blank">responds to Tim’s probing about their strategy</a> for the business data analytics that, he points out, are vital if they want to survive on the small margins of ecommerce.&nbsp; There is a list of all the participants in the startup showcase in Brady’s <a href="http://radar.oreilly.com/2010/09/the-startups-at-the-expo-showc.html" target="_blank">post here</a>. <a href="http://hour.ly/" target="_blank">hour.ly</a> was the audience pick.</p>
<h3><a href="http://www.shazam.com/" target="_blank">Shazam</a> for Faces!</h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/Screen-shot-2010-10-26-at-4.14.52-AM.png"><img class="alignnone size-medium wp-image-5897" title="Screen shot 2010-10-26 at 4.14.52 AM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/Screen-shot-2010-10-26-at-4.14.52-AM-300x134.png" alt="Screen shot 2010-10-26 at 4.14.52 AM" height="134" width="300"></a></p>
<p>My favorite startup was a biometric service doing face, iris, and fingerprint matching, <a href="http://www.tacticalinfosys.com/" target="_blank">Tactical Information Systems</a>.</p>
<p>Tim and Fred also liked them, and they have an interesting discussion about the pros and cons of approaching your platform through a narrow first application, as Tactical Information Systems is doing with <a href="http://www.wanderid.org/" target="_blank">WanderID</a> – an application to help identify lost Alzheimer’s patients.&nbsp; As Fred pointed out, they are potentially the <a href="http://www.shazam.com/" target="_blank">Shazam</a> for faces, so why start so small?</p>
<p>I had asked TIS the same question when I met them in the “speed dating” session.&nbsp; This is just their first toe in the water, as they are a two-person company at the moment. Their vision for their platform is big.&nbsp; Mary Haskett and Dr. Alex Kilpatrick, the founders of this quintessential jet pony for the algorithmic economies in the sky, are not only a partnership with the credentials to do a <a href="http://www.shazam.com/" target="_blank">Shazam</a> for faces – <a href="http://www.tacticalinfosys.com/about.html" target="_blank">see their bios here</a> – they are the people I would want to be running a <a href="http://www.shazam.com/" target="_blank">Shazam</a> for faces!&nbsp; They really get the consequences of living in a world of data – check out Dr. Kilpatrick’s absolute killer Ignite talk, <a href="http://ignite.oreilly.com/2010/10/defeating-big-brother-by-dr-alex-kilpatrick-ep-75.html" target="_blank">“Defeating Big Brother”</a> (screenshot below).</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/Screen-shot-2010-10-23-at-11.03.11-PM.png"><img class="alignnone size-medium wp-image-5819" title="Screen shot 2010-10-23 at 11.03.11 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/Screen-shot-2010-10-23-at-11.03.11-PM-300x229.png" alt="Screen shot 2010-10-23 at 11.03.11 PM" height="229" width="300"></a></p>
<h3>How Can Augmented Reality Add Value to the Real Time Internet/Data Operating System?</h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/Screen-shot-2010-10-26-at-4.12.57-AM.png"><img class="alignnone size-medium wp-image-5896" title="Screen shot 2010-10-26 at 4.12.57 AM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/Screen-shot-2010-10-26-at-4.12.57-AM-300x199.png" alt="Screen shot 2010-10-26 at 4.12.57 AM" height="199" width="300"></a></p>
<p><i><a href="http://www.planefinder.net/" target="_blank">planefinder.net</a> – an augmented reality app that lets you find information about planes by pointing your phone at the sky, “including flight number, aircraft registration, speed, altitude and how far away it is” (via <a href="http://www.maclife.com/article/news/do_some_plane_scouting_augmented_reality_plane_finder_app">MacLife</a>).</i></p>
<p>The new opportunities in the algorithmic economies in the sky were center stage at Web 2.0 Expo, and there are some interesting AR apps for the real time internet/data operating system emerging, like <a href="http://www.planefinder.net/" target="_blank">planefinder.net</a>.&nbsp; But Augmented Reality was still pretty low profile at Web 2.0 Expo (except that the NVidia augmented reality demo attracted a lot of attention at the sponsors expo).&nbsp; However, everyone working in the emerging industry of AR should recognize that apps big on “neighborhoods and storytelling” are heading right up the AR street, and that platforms like Foursquare and Pachube present enormous opportunity to explore the possibilities of AR.&nbsp; And if augmented reality enthusiasts are not already paying attention to real time data analytics and <a href="http://hadoop.apache.org/" target="_blank">Hadoop</a>, they should be (see <a href="http://www.cscyphers.com/blog/2010/10/12/hadoop-world-2010/" target="_blank">this post for an excellent round up</a> on Hadoop World).</p>
<p>At Hadoop World, Tim O’Reilly referenced the great tagline from the <a href="http://vimeo.com/11742135">IBM commercial</a>:</p>
<p><b>“Would you be willing to cross the street — blindfolded — on data that was five minutes old? Five hours? Five days?”</b></p>
<p>As I have noted in several earlier posts – <a href="../../2010/09/27/urban-games-storytelling-with-augmented-reality-the-big-arny-and-inside-ar-talking-with-thomas-alt-metaio/" target="_blank">see here</a> and <a href="../../2010/08/05/vision-based-augmented-reality-ar-in-smart-phones-qualcomms-ar-sdk-interview-with-jay-wright/" target="_blank">here</a> for starters – we are just seeing the tools for developing near field, vision based, mobile, social AR become widely available to developers, so there should be a new level of AR apps emerging through 2011.&nbsp; There is a wonderful discussion in the comments of this post by Mac Slocum, <a href="http://radar.oreilly.com/2010/10/two-ways-augmented-reality-app.html" target="_blank">“How Augmented Reality Apps Can Catch On,”</a> between Mac, Raimo (one of the founders of <a href="http://www.layar.com/" target="_blank">Layar</a>), and <a href="http://www.urbeingrecorded.com/" target="_blank">Chris Arkenberg</a> on what constitutes a platform for growth for augmented reality.</p>
<p>Mac’s post, the comments, and <a href="http://www.urbeingrecorded.com/news/2010/10/13/is-ar-ready-for-the-trough-of-disillusionment/" target="_blank">Chris Arkenberg’s post</a> on the <a href="http://www.gartner.com/it/page.jsp?id=1447613" target="_blank">latest edition of the Gartner Hype Cycle</a>, which rather curiously placed Augmented Reality almost at the peak of inflated expectations, really got me excited about exploring an idea I have been thinking about for a while, which is to get the AR community to discuss the <a href="http://map.web2summit.com/">Points of Control map</a>.&nbsp; See my discussion with Chris Arkenberg here: <a rel="bookmark" href="http://www.ugotrade.com/2010/10/27/platforms-for-growth-and-points-of-control-for-augmented-reality-talking-with-chris-arkenberg/" target="_blank">Platforms for Growth and Points of Control for Augmented Reality</a>. The recording of John Battelle&#8217;s and Tim O&#8217;Reilly&#8217;s webcast on Points of Control <a href="http://www.youtube.com/oreillymedia#p/c/7/8CEyHSoWJcs" target="_blank">is posted here</a>.</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/Screen-shot-2010-10-27-at-2.01.38-AM.png"><img class="alignnone size-medium wp-image-5932" title="Screen shot 2010-10-27 at 2.01.38 AM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/Screen-shot-2010-10-27-at-2.01.38-AM-300x124.png" alt="Screen shot 2010-10-27 at 2.01.38 AM" height="124" width="300"></a></p>
<p><a href="http://map.web2summit.com/" target="_blank">The interactive Points of Control map</a> is an amazing tool to think with! Check it out in territory, movements, and acquisition mode.&nbsp; There is a competition for the most interesting comment and most interesting acquisition suggestion.&nbsp; The prize is a ticket to Web 2.0 Summit!</p>
<h3>What is the Future of Social?</h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/ARwave_logo_small.png"><img class="alignnone size-full wp-image-5987" title="ARwave_logo_small" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/ARwave_logo_small.png" alt="ARwave_logo_small" height="146" width="208"></a></p>
<p>The recent “defection” from Google to Facebook – see <a title="Lars Rasmussen, Father Of Google Maps And Google Wave, Heads To&nbsp;Facebook" rel="bookmark" href="http://techcrunch.com/2010/10/29/rasmussen-facebook-google/">Lars Rasmussen, Father Of Google Maps And Google Wave, Heads To&nbsp;Facebook</a> – is, as MG Siegler of TechCrunch points out, “the biggest one since Chrome OS lead <a href="http://www.crunchbase.com/person/matthew-papakipos">Matthew Papakipos</a> made <a href="http://techcrunch.com/2010/06/28/closing-in-on-chrome-os-launch-key-architect-matthew-papakipos-jumps-to-facebook/">the same jump in June</a>” (TechCrunch also notes “current Facebook CTO <a href="http://www.crunchbase.com/person/bret-taylor">Bret Taylor</a> was heavily involved in the launch of Google Maps”).</p>
<p>These moves have drawn my particular attention, as did <a href="http://www.youtube.com/watch?v=ZqDYjA5RGCU&amp;p=6F97A6F4BA797FB3" target="_blank">Bret Taylor’s response in his conversation with Brady Forrest at Web 2.0 Expo</a> to Brady’s question, <b>“How soon until we get the Facebook firehose?”</b></p>
<p>If you have been reading UgoTrade you already know how important I think an open, distributed standard for real-time communications, such as the very innovative Wave Federation Protocol, could be for AR development – see <a href="http://www.arwave.org/" target="_blank">ARWave</a> and <a href="http://www.mobilemonday.nl/talks/tish-shute-the-next-wave-of-ar/" target="_blank">my presentation at MoMo13, Amsterdam</a> last year, <a rel="bookmark" href="http://www.ugotrade.com/2009/11/19/the-next-wave-of-ar-mobile-social-interaction-right-here-right-now/" target="_blank">The Next Wave of AR: Mobile Social Interaction Right Here, Right Now!</a></p>
<p>The anticipated release of <a href="http://googlewavedev.blogspot.com/2010/09/wave-open-source-next-steps-wave-in-box.html" target="_blank">Wave in a Box</a> has raised hopes in the developer community that WFP will soon become easier to work with, and hopefully more widely adopted.&nbsp; Like many others, I wonder what will happen to <a href="http://googlewavedev.blogspot.com/2010/09/wave-open-source-next-steps-wave-in-box.html" target="_blank">Wave in a Box</a> now.</p>
<p>But the innovation of Wave is deep and broad (and, as many have pointed out, hugely ambitious) – perhaps the boldest attempt yet to innovate both at the low level of architecture (where Google is so powerful) and at the high level of <b>the Mark Zuckerberg “big idea,” which, as Tim O’Reilly notes, is “What is the future of social?”</b> MG Siegler noted that <a title="Facebook Groups Is Sort Of Like Google Wave For Human&nbsp;Beings" rel="bookmark" href="http://techcrunch.com/2010/10/07/facebook-groups-google-wave/">Facebook Groups Is Sort Of Like Google Wave For Human&nbsp;Beings</a>.</p>
<p>But I deeply hope that the open, distributed standard part of the Wave big idea is not lost in the mix here.</p>
<h3><b>Fourth Cylinder of Innovation: Keep the Ecosystem Going, Create More Value than You Capture</b></h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/Screen-shot-2010-10-27-at-1.56.15-AM.png"><img class="alignnone size-medium wp-image-5931" title="Screen shot 2010-10-27 at 1.56.15 AM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/Screen-shot-2010-10-27-at-1.56.15-AM-300x181.png" alt="Screen shot 2010-10-27 at 1.56.15 AM" height="181" width="300"></a></p>
<p><i>The Points of Control map is interactive, so please <a href="http://map.web2summit.com/" target="_blank">click here</a> or on the image above for the full experience.</i></p>
<p>Tim O’Reilly points out that there is a worrisome dark side to the Points of Control map – see <a href="http://www.youtube.com/watch?v=3637xFBvkYg&amp;p=6F97A6F4BA797FB3" target="_blank">Tim’s keynote here</a>.&nbsp; To paraphrase some of his points:</p>
<p>There are companies on the map that are forgetting to think about creating a sustainable ecosystem.&nbsp; Rather than growing the pie, they are trying to divide it up, and that threatens to cause the fourth cylinder of innovation to misfire.&nbsp; This fourth cylinder is essential to the ecosystem.</p>
<p>Tim O’Reilly looks back to the lessons of the personal computing industry, which was incredibly vital and creative, and in which lots of people made money until a couple of big players <b>“sucked all the air out of the ecosystem”</b> and innovation had to go elsewhere.</p>
<p>The power of platforms is to create value not just for your company but for other people.&nbsp; Create value for yourself by creating value for other people.&nbsp; Tim O’Reilly used the wonderful example of Henry Ford inventing the weekend so that there would be enough people with time and money to buy his mass produced cars.&nbsp; Think about building the ecosystem that will support the future you are going to build.&nbsp; Grow the pie rather than cut up the pie.&nbsp; This will be the vital fourth cylinder of innovation in a <a href="http://www.cloudera.com/company/press-center/hadoop-world-nyc/" target="_blank">Web Squared</a> world.</p>
<p>Tim O’Reilly has long proposed that <a href="http://www.oreillynet.com/go/web2">Web 2.0 is all about harnessing collective intelligence</a>. But as Gartner predicts, “By year end 2012, physical sensors will create 20 percent of non-video internet traffic.” Yet another previously unevenly distributed future is going mainstream, and if you haven’t read it already, now is the time to read this paper by Tim O’Reilly and John Battelle, <a href="http://www.web2summit.com/web2009/public/schedule/detail/10194" target="_blank">Web Squared: Web 2.0 Five Years On</a>.</p>
<h3><b>The Consequences of Living in a World of Data</b></h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/Dataarmsrace.jpg"><img class="alignnone size-medium wp-image-5817" title="Dataarmsrace" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/Dataarmsrace-300x199.jpg" alt="Dataarmsrace" height="199" width="300"></a></p>
<p>To bring this very long post to a close!&nbsp; Here are just a few of the key questions on the consequences of living in a world of data that Tim O’Reilly raised during his keynote at Hadoop World:</p>
<p><b>“How would we solve the problem of digital identity in the age of sensors? (Our smart phones are able to know their users by the way they walk – their gait!)”</b></p>
<p><b>“How will we input data when our devices are smart enough to listen on their own?”</b></p>
<p><b>“How should we think about privacy in a world where data can be triangulated?”</b></p>
<p><b>“We are moving to a world in which every device generates useful data, in which every action creates information shadows on the net.”</b></p>
<p><b>“Shouldn’t we regulate the misuse of data rather than the possession of it?”</b></p>
<p><b>“How do we avoid a data arms race?”</b></p>
<p><b>“Create more value than you capture.”</b></p>
]]></content:encoded>
			<wfw:commentRss>http://www.ugotrade.com/2010/10/31/tim-o%e2%80%99reilly%e2%80%99s-four-cylinder-innovation-engine-the-missing-manual-for-the-future/feed/</wfw:commentRss>
		<slash:comments>8</slash:comments>
		</item>
		<item>
		<title>Platforms for Growth and Points of Control for Augmented Reality: Talking with Chris Arkenberg</title>
		<link>http://www.ugotrade.com/2010/10/27/platforms-for-growth-and-points-of-control-for-augmented-reality-talking-with-chris-arkenberg/</link>
		<comments>http://www.ugotrade.com/2010/10/27/platforms-for-growth-and-points-of-control-for-augmented-reality-talking-with-chris-arkenberg/#comments</comments>
		<pubDate>Wed, 27 Oct 2010 09:14:49 +0000</pubDate>
		<dc:creator><![CDATA[Tish Shute]]></dc:creator>
				<category><![CDATA[Ambient Devices]]></category>
		<category><![CDATA[Ambient Displays]]></category>
		<category><![CDATA[Android]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Augmented Reality]]></category>
		<category><![CDATA[culture of participation]]></category>
		<category><![CDATA[digital public space]]></category>
		<category><![CDATA[Ecological Intelligence]]></category>
		<category><![CDATA[Instrumenting the World]]></category>
		<category><![CDATA[internet of things]]></category>
		<category><![CDATA[iphone]]></category>
		<category><![CDATA[mirror worlds]]></category>
		<category><![CDATA[Mixed Reality]]></category>
		<category><![CDATA[mobile augmented reality]]></category>
		<category><![CDATA[mobile meets social]]></category>
		<category><![CDATA[Mobile Reality]]></category>
		<category><![CDATA[Mobile Technology]]></category>
		<category><![CDATA[privacy and online identity]]></category>
		<category><![CDATA[Smart Devices]]></category>
		<category><![CDATA[Smart Planet]]></category>
		<category><![CDATA[social gaming]]></category>
		<category><![CDATA[social media]]></category>
		<category><![CDATA[ubiquitous computing]]></category>
		<category><![CDATA[Web Meets World]]></category>
		<category><![CDATA[websquared]]></category>
		<category><![CDATA[AR and html 5]]></category>
		<category><![CDATA[AR eyewear]]></category>
		<category><![CDATA[AR eyewear for smart phones]]></category>
		<category><![CDATA[ardevcamp]]></category>
		<category><![CDATA[arduino]]></category>
		<category><![CDATA[ARWave]]></category>
		<category><![CDATA[augmented foraging]]></category>
		<category><![CDATA[augmented reality event]]></category>
		<category><![CDATA[augmented reality eyewear]]></category>
		<category><![CDATA[augmented reality on tablets]]></category>
		<category><![CDATA[augmented reality search]]></category>
		<category><![CDATA[cloud computing and AR]]></category>
		<category><![CDATA[EarthMine]]></category>
		<category><![CDATA[gartner hype cycle]]></category>
		<category><![CDATA[Gary Hayes]]></category>
		<category><![CDATA[John Battelle]]></category>
		<category><![CDATA[Kevin Slavin]]></category>
		<category><![CDATA[Layar]]></category>
		<category><![CDATA[location based services]]></category>
		<category><![CDATA[Metaio]]></category>
		<category><![CDATA[Mobile AR]]></category>
		<category><![CDATA[mobile social augmented reality]]></category>
		<category><![CDATA[MUVEdesign]]></category>
		<category><![CDATA[NVidia augmented reality demo]]></category>
		<category><![CDATA[Ogmento]]></category>
		<category><![CDATA[Pachube]]></category>
		<category><![CDATA[Platforms for Growth]]></category>
		<category><![CDATA[Points of Control Map]]></category>
		<category><![CDATA[Porthole]]></category>
		<category><![CDATA[QR codes]]></category>
		<category><![CDATA[Qualcomm SDK for AR]]></category>
		<category><![CDATA[real time analytics and AR]]></category>
		<category><![CDATA[RFID]]></category>
		<category><![CDATA[Simple Geo]]></category>
		<category><![CDATA[The Battle for the Internet Economy]]></category>
		<category><![CDATA[Tim O'Reilly]]></category>
		<category><![CDATA[Total Immersion]]></category>
		<category><![CDATA[transmedia story telling]]></category>
		<category><![CDATA[trasmedia]]></category>
		<category><![CDATA[ubicomp]]></category>
		<category><![CDATA[Ushahidi]]></category>
		<category><![CDATA[Usman Haque]]></category>
		<category><![CDATA[vision based AR]]></category>
		<category><![CDATA[W3C group on augmented reality]]></category>
		<category><![CDATA[Wave in a Box]]></category>
		<category><![CDATA[Web 2.0 Expo]]></category>
		<category><![CDATA[web standards based browser for AR]]></category>

		<guid isPermaLink="false">http://www.ugotrade.com/?p=5924</guid>
		<description><![CDATA[The Points of Control map is interactive, so please click here or on the image above for the full experience. Today at 4pm EST, 1pm PDT John Battelle and Tim O&#8217;Reilly will discuss the Points of Control map and The Battle for the Internet Economy in a Free Webcast: &#8220;More than any time in the [&#8230;]]]></description>
				<content:encoded><![CDATA[<p><a href="http://map.web2summit.com/"><img class="alignnone size-medium wp-image-5931" title="Screen shot 2010-10-27 at 1.56.15 AM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/Screen-shot-2010-10-27-at-1.56.15-AM-300x181.png" alt="Screen shot 2010-10-27 at 1.56.15 AM" width="300" height="181" /></a></p>
<p><em>The Points of Control map is interactive, so please <a href="http://map.web2summit.com/" target="_blank">click here </a>or on the image above for the full experience.</em></p>
<p>Today at 4pm EDT, 1pm PDT, John Battelle and Tim O&#8217;Reilly will discuss the <a href="http://map.web2summit.com/" target="_blank">Points of Control</a> map and The Battle for the Internet Economy <a href="http://oreilly.com/emails/poc_web2summit-webcast-prg.html" target="_blank">in a Free Webcast</a>:</p>
<p><strong>&#8220;More than any time in the history of the Web, incumbents in the network  economy are consolidating their power and staking new claims to key  points of control. It&#8217;s clear that the internet industry has moved into a  battle to dominate the Internet Economy.</strong></p>
<p><strong>John Battelle and Tim O&#8217;Reilly will debate and discuss these shifting  points of control as the board becomes increasingly crowded. They&#8217;ll map  critical inflection points and identify key players who are clashing to  control services and infrastructure as they attempt to expand their  territories. They&#8217;ll also explore the effect these chokepoints could  have on people, government, and the future of technology innovation.&#8221;</strong></p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/Screen-shot-2010-10-27-at-2.01.38-AM.png"><img class="alignnone size-medium wp-image-5932" title="Screen shot 2010-10-27 at 2.01.38 AM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/Screen-shot-2010-10-27-at-2.01.38-AM-300x124.png" alt="Screen shot 2010-10-27 at 2.01.38 AM" width="300" height="124" /></a></p>
<p>I&#8217;ve been wanting to start a discussion on the <a href="http://map.web2summit.com/">Points of Control map</a> in the Augmented Reality community for a while now, and Chris&#8217; recent post on <a href="http://www.gartner.com/it/page.jsp?id=1447613" target="_blank">the latest edition of the Gartner Hype Cycle</a>, <a href="http://www.urbeingrecorded.com/news/2010/10/13/is-ar-ready-for-the-trough-of-disillusionment/" target="_blank">&#8220;Is AR Ready for the Trough of Disillusionment?&#8221;</a> and this post by Mac Slocum, <a href="http://radar.oreilly.com/2010/10/two-ways-augmented-reality-app.html" target="_blank">&#8220;How Augmented Reality Apps Can Catch On,&#8221;</a> and the conversation in the comments between Mac, Raimo (one of the founders of <a href="http://www.layar.com/" target="_blank">Layar</a>), and Chris, all prompted me to get a conversation started&#8230; (see below for all that followed!).&nbsp; Chris put me on the hot seat back in June when he did <a href="http://www.boingboing.net/2010/06/17/tish-shute---augment.html" target="_blank">this very generous interview with me on Boing Boing</a>, so it was time to turn the tables.</p>
<p>Tim O&#8217;Reilly, in his <a href="http://www.youtube.com/watch?v=3637xFBvkYg&amp;p=6F97A6F4BA797FB3" target="_blank">keynote for Web 2.0 Expo</a>, pointed out there is both a fun and a dark side to the Points of Control map.&nbsp; There are companies on this map, he noted, that rather than &#8220;growing the pie,&#8221; are trying to divide up the pie, and they are forgetting to think about creating a sustainable ecosystem. I expect the conversation between Tim O&#8217;Reilly and John Battelle to dig deep into this Battle for the Internet Economy.&nbsp; If, like me, you have another engagement at the time of the webcast, you can register on the site to receive the recording.</p>
<p>AR is still too young to figure in the battles of the giants, but there will be a lot to be learned from this conversation. And the Points of Control map is a good tool to think with, from the point of view of AR, in many ways. As Chris Arkenberg observed:</p>
<p><strong>&#8220;When I look at this map, the points of control map, it&#8217;s really interesting to me, because what it says to me with respect to AR is each of these little regions that they have drawn out would be a great research project. So every single one of these should be instructive to AR.</strong></p>
<p><strong>In other words, we should be able to look at social networks, the land of search, or kingdom of ecommerce, and apply some very rigorous critical thinking to say, &#8220;How would AR add to this engagement, this experience of gaming, or ecommerce, or content?&#8221;</strong></p>
<p><strong>Looking at each of these individually and really meticulously saying, &#8220;OK, well yes, it can do this, but how is that different from the current screen media experience, the current web experience that we have of all these types of things?&#8221;  You know, how can augmented reality really add a new layer of value and experience to these? And I think that process would really trim a lot of the fat from the hopes and dreams of AR and anchor it down into some very pragmatic avenues for development.  And then you could start looking at, &#8220;Well, OK, what happens when we start combining these?&#8221; When we take gaming levels and plug that into the location basin, as you suggested.&#8221;</strong></p>
<p>Chris Arkenberg is a technology professional with a focus on product strategy &amp; development, specializing in 3D, augmented reality, ubicomp and the social web. He uses research, scenario planning, and foresight methodologies to help organizations anticipate change and adopt a resilient and forward-looking posture in the face of unprecedented uncertainty. His personal work is collected at <a href="http://urbeingrecorded.com" target="_blank">urbeingrecorded</a>, and his <a href="http://www.linkedin.com/in/chrisarkenberg" target="_blank">professional profile is here.</a></p>
<p>He is also one of the founder/organizers of <a href="http://ardevcamp.org" target="_blank">AR DevCamp</a>, which is currently scheduled for Dec. 4th (somewhere in SF or The Valley!). Chris said, &#8220;No further details atm (still trying to find a venue and get sponsors) but please direct people to http://ardevcamp.org for upcoming information.&#8221;</p>
<h3>Talking with Chris Arkenberg</h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/ChrisArkenberg.jpg"><img class="alignnone size-medium wp-image-5929" title="ChrisArkenberg" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/ChrisArkenberg-300x199.jpg" alt="ChrisArkenberg" width="300" height="199" /></a></p>
<p><strong>Tish Shute:</strong> I know some people thought <a href="http://www.gartner.com/it/page.jsp?id=1447613" target="_blank">the positioning of AR by Gartner near the peak of the hype cycle </a>was misguided, and based on a very narrow understanding of AR as used in marketing apps. But reading your post I thought you made a lot of good points.</p>
<p><strong>Chris Arkenberg:  It&#8217;s tracking hype, right?  It&#8217;s not necessarily tracking the growth of the technologies or their maturation so much as it&#8217;s tracking the general attention level.  And what&#8217;s interesting to me is that tends to affect the amount of money that goes into those technologies.</strong></p>
<p><strong>Tish Shute:</strong> I was particularly interested in your post because I have been writing a post about two recent O&#8217;Reilly events in NYC, <a href="http://makerfaire.com/newyork/2010/" target="_blank">Maker Faire</a> and <a href="http://www.web2expo.com/">Web 2.0 Expo</a>, and then <a href="http://www.cloudera.com/company/press-center/hadoop-world-nyc/" target="_blank">Hadoop World</a>, where Tim gave a very interesting 45-minute keynote. AR was pretty low profile at all three events. <a href="http://www.flickr.com/photos/bdave2007/5036397168/in/photostream/" target="_blank">But the NVidia augmented reality demo attracted a lot of attention at the sponsors expo,</a> and Usman Haque, founder of <a href="http://www.pachube.com/" target="_blank">Pachube</a>, announced in <a href="http://www.web2expo.com/webexny2010/public/schedule/speaker/43845" target="_blank">his presentation</a> that they are working on an augmented reality interface for Pachube called Porthole; it&#8217;s designed for facilities management and &#8220;as a consumer-oriented application that extends the universe of Pachube data into the context of AR &#8211; a &#8216;porthole&#8217; into Pachube&#8217;s data environments.&#8221; Usman also mentioned, when I talked to him, that he is contributing to the AR standards discussion and is now on the program committee <a href="http://www.w3.org/2010/06/16-w3car-minutes.html#item02" target="_blank">for the W3C group on augmented reality</a>. For more on this standards discussion and the Pachube AR interface, see Chris Burman&#8217;s paper for the W3C, <a href="http://www.w3.org/2010/06/w3car/portholes_and_plumbing.pdf" target="_blank">Portholes and Plumbing: how AR erases boundaries between &#8220;physical&#8221; and &#8220;virtual.&#8221;</a></p>
<p>I think pioneers in the augmented reality community should pay attention to these wider conversations about the Battle for the Internet Economy, and the exploration of the &#8220;Platforms for Growth&#8221; theme at <a href="http://www.web2expo.com/">Web 2.0 Expo</a> is very important &#8211; this is of course also a nudge to read my upcoming post on these O&#8217;Reilly events!</p>
<p>Also, I have another project I have been chewing on that I would like to talk to you about. I want to start an AR conversation about the wonderful <a href="http://map.web2summit.com/">Points of Control map</a> produced for Web 2.0 Summit by <a href="http://battellemedia.com/" target="_blank">John Battelle</a>. [Note: there will be a "Battle for the Internet Economy" free Web2Summit webcast w/ @johnbattelle &amp; @timoreilly Wed 10/27 at 1pm PT http://bit.ly/b46cmb #w2s]</p>
<p>Up to this point, understandably given the immaturity of the technology, AR has little role in the &#8220;Battle for the Internet Economy.&#8221; But this doesn&#8217;t mean that the map isn&#8217;t good for AR visionaries, enthusiasts, entrepreneurs, and developers to think with. And both you and Tim have pointed out the potential for AR to leverage the giant data subsystems in the sky. I have to say the positioning of Cloud Computing on the brink of heading down into the trough of disillusionment in this recent rendition of the Gartner Hype Cycle seems ridiculous!</p>
<p>Cloud Computing is already ubiquitous &#8211; it hardly seems credible that it is headed for a trough of disillusionment!</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/Screen-shot-2010-10-27-at-2.48.30-AM.png"><img class="alignnone size-medium wp-image-5940" title="Screen shot 2010-10-27 at 2.48.30 AM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/Screen-shot-2010-10-27-at-2.48.30-AM-300x199.png" alt="Screen shot 2010-10-27 at 2.48.30 AM" width="300" height="199" /></a></p>
<p><strong>Chris Arkenberg:  Yeah, it&#8217;s ubiquitous, so why even talk about it when it&#8217;s your fundamental infrastructure?</strong></p>
<p><strong>Tish Shute:</strong> Yeah, and I seriously doubt it is imminently headed for a trough of disillusionment&#8230; and this brings me back to the Points of Control map which, as John Battelle points out, &#8220;aims to identify key players who are battling to control the services and infrastructure of a websquared world&#8221; in which the &#8220;Web and the world intertwine through mobile and sensor platforms.&#8221; This instrumented world, of course, creates a great deal of opportunity for augmented reality.  Have you seen that, that points of control map?</p>
<p><strong>Chris Arkenberg:  I think I have, actually.</strong></p>
<p><strong>Tish Shute: </strong> There has been much debate about how this intertwining of the web and the world will play out in augmented reality. Chris Burman points out in his position paper for the W3C, <a href="http://www.w3.org/2010/06/w3car/portholes_and_plumbing.pdf" target="_blank">Portholes and Plumbing: how AR erases boundaries between &#8220;physical&#8221; and &#8220;virtual&#8221;</a>, that &#8220;trying to draw parallels between a browser based web and the possibilities of AR may solve issues of information distribution in the short-term,&#8221; but it must not have a limiting effect in the long-term. But now we at least have one <a href="https://research.cc.gatech.edu/polaris/" target="_blank">web standards-based browser for AR</a>, thanks to the work of Blair MacIntyre and the Georgia Tech team. And I think the discussion in the comments of Mac Slocum&#8217;s recent post, <a href="http://radar.oreilly.com/2010/10/two-ways-augmented-reality-app.html" target="_blank">&#8220;How Augmented Reality Apps Can Catch On,&#8221;</a> is an interesting starting point from which to think about platforms of growth for AR.  I am not sure if I am stretching his meaning, but I think Raimo, of <a href="http://www.layar.com/" target="_blank">Layar</a>, is suggesting that what the Points of Control map calls the Plains of Media Content is very important to the growth of the fledgling AR industry right now.  And I would agree with this, and add that the neighboring terrain of gaming levels will be pretty key, as one of my other favorite AR startups, <a href="http://ogmento.com/" target="_blank">Ogmento</a>, hopes to reveal in the near future!  But what do you think was most important in this brief but pithy dialogue between you, Raimo, and Mac?</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/Screen-shot-2010-10-27-at-2.56.02-AM.png"><img class="alignnone size-medium wp-image-5941" title="Screen shot 2010-10-27 at 2.56.02 AM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/10/Screen-shot-2010-10-27-at-2.56.02-AM-300x179.png" alt="Screen shot 2010-10-27 at 2.56.02 AM" width="300" height="179" /></a></p>
<p>[The screenshot above is from a teaser video by <a title="Gary Hayes" href="http://www.personalizemedia.com/future-of-location-based-augmented-reality-story-games/?utm_source=feedburner&amp;utm_medium=twitter&amp;utm_campaign=Feed:+PersonalizeMedia+%28PERSONALIZE+MEDIA%29" target="_blank">Gary Hayes</a> of <a title="MuveDesign" href="http://www.muvedesign.com/">MUVEdesign</a> for his upcoming game Time Treasure (2011 release date). See Gary&#8217;s <a title="Gary Hayes" href="http://www.personalizemedia.com/future-of-location-based-augmented-reality-story-games/?utm_source=feedburner&amp;utm_medium=twitter&amp;utm_campaign=Feed:+PersonalizeMedia+%28PERSONALIZE+MEDIA%29" target="_blank">blog</a> for more, and Gary&#8217;s <a href="http://www.personalizemedia.com/16-top-augmented-reality-business-models/" target="_blank">post from over a year ago</a> on AR business models. Thomas K. Carpenter, <a href="http://gamesalfresco.com/2010/10/25/time-treasure-future-tablet-game/" target="_blank">on Games Alfresco</a>, notes, &#8220;I think this is a terrific idea and I find it interesting he&#8217;s planning this on a tablet rather than a smartphone.&#8221;]</p>
<p><strong>Chris Arkenberg:  The way I took it&#8230; and to give a little bit of context, I came from sort of this apprehension of augmented reality as an expression of the existing Internet.  So as sort of a visualization layer that allows you to kind of draw out data, and then, with all the affordances of being able to anchor it to real world things.</strong></p>
<p><strong>And my own sort of path has led me to want to really try to understand that and refine it, particularly with respect to the sort of Internet of things and the smarter planet idea of just having embedded systems everywhere.  And specifically, what is the value-add  for augmented reality as a visualization layer of an instrumented world?</strong></p>
<p><strong>And so that&#8217;s caused me to be a bit biased towards that side of AR.  And the way I took Raimo&#8217;s comment was that he was saying that, &#8220;You know, really what we&#8217;re interested in is media.&#8221;  That he was effectively saying that AR for them is really just about that space between the screen and the world, or between your eyes and the world, and what you can do there.</strong></p>
<p><strong>Certainly I had considered it in the past, but I hadn&#8217;t really focused on it or assumed that it was a priority as a business model.  And so he kind of reminded me that, actually, there&#8217;s a lot of entertainment applications.  There&#8217;s a lot of, obviously, advertising and marketing applications.<br />
And so I felt that I was being a little narrow in my focus&#8230;</strong></p>
<p><strong>Tish Shute: </strong> Yes, this comes to the heart of what I am interested in about the role AR can play in opening up new relationships to the world of data that we live in &#8211; not just making it more accessible and useful to us when and where we need it, but AR as a road to reimagining it&#8230;</p>
<p>Have you seen any interesting work yet that explores these great data economies in the cloud through AR?  I mean, can you think of any others &#8211; there is <em><a href="http://www.planefinder.net/" target="_blank">planefinder.net</a></em>, but others?</p>
<p><strong>Chris Arkenberg:  I&#8217;ve seen a few just sort of skunk works type applications that people have been playing around with, again, to try and reveal things.  One of them was similar to the aircraft one, but it was more for military use, being able to identify things of interest in the sky.  I&#8217;ve seen a couple of others for navigation, being able to identify mountain peaks on a visual plane, for example, but this isn&#8217;t so much about revealing an instrumented world.</strong></p>
<p><strong>Tish Shute:</strong> Yeah, I think that was from the Imagination, right?  I know that&#8217;s an interesting one. Usman at Web 2.0 Expo, <a href="http://www.web2expo.com/webexny2010/public/schedule/speaker/43845" target="_blank">in his presentation</a>, mentioned the work Pachube is doing on an augmented reality interface.  I interviewed Usman again, as my last long interview with him was nearly 18 months ago, and Pachube is well on the way to becoming the Facebook of data &#8211; or the analogy that Usman prefers, the Twitter of sensors!</p>
<p><strong>Chris Arkenberg:  Hmm, interesting.</strong></p>
<p><strong>Tish Shute:</strong> And to go back to your comments on augmented reality not getting caught in some of the traps that made virtual worlds lose relevancy: I think it is vital that AR developers understand the strategic possibilities of key points of control in the internet economy, because the isolation and Balkanization of virtual worlds were certainly a factor in their rapid slide into the trough of disillusionment &#8211; although many would argue that a fundamental flaw in the kind of virtual experience that Second Life and other virtual worlds constructed was really the fatal flaw (see James Turner&#8217;s interview with Kevin Slavin, <a href="http://radar.oreilly.com/2010/09/drawing-the-line-between-games.html" target="_self">Reality has a gaming layer</a>).</p>
<p>But Second Life&#8217;s isolation from the other great network economies of the internet was certainly a limiting factor.</p>
<p><strong>Chris Arkenberg:  And that&#8217;s been exactly my sense, and I&#8217;ve, over the years, tried to encourage development in that direction for virtual worlds.  I did work, through Adobe, to help develop Atmosphere 3D back in the early 2000s.  And we did a lot of work to try and understand the marketplace and the specific value-add of doing things in 3D over 2D.</strong></p>
<p><strong>And this is kind of why I keep referring back to VR and VWs with respect to augmented reality: with immersive worlds, there was this idea&#8230; there was this big rush.  Everybody was so excited about it.  It was obviously the next cool thing.  And everybody wanted to try to do everything in it.  You could do your shopping in virtual worlds. You could have meetings in virtual worlds.</strong></p>
<p><strong>Tish Shute:</strong> and shopping, yes&#8230; that didn&#8217;t work out so well!</p>
<p><strong>Chris Arkenberg:  And everybody was very excited about developing these things.  And what it really came down to is, &#8220;Yeah, you can, but it&#8217;s actually a lot better to do those things on a flat plane or in person.&#8221;  Meeting Place, WebEx, TelePresence &#8211; those tools generally do a much better job at facilitating telepresence meetings than a virtual world does. The same with telepresent education. There are only very specific things that both VR and AR are really good at.</strong></p>
<p><strong>And that&#8217;s where I find myself with augmented reality right now, trying to really pick through that and critically look at which uses are really appropriate for an AR overlay. And again, I think that&#8217;s why the hype cycle is important, because it reflects back this desire that AR is going to be the next big thing &#8211; the be-all, end-all of interacting with data in the cloud &#8211; and forces us all to take a critical look at why we should do things in AR instead of on a screen.</strong></p>
<p><strong>AR is not going to work well for most things, but it&#8217;s going to be very good for certain uses.  Right now I&#8217;m very keen to understand what those things might be.</strong></p>
<p><strong>Tish Shute:</strong> I had this wonderful conversation (more in an upcoming post) with Kevin Slavin, one of the founders of <a href="http://areacodeinc.com/" target="_blank">Area/Code</a>, at Web 2.0 Expo, and I think some of what he describes about the data brokerages of high-frequency trading has some interesting implications for AR&#8217;s role, say, in ubiquitous computing.  The trading markets are now pretty much dominated by machine-to-machine intelligence; machine-to-machine brokerages.  They are basically game economies on a scale that we can barely wrap our heads around, where the speed at which bots and algo traders can access the network is the key.  We really have no clue what is going on until we lose our house&#8230;</p>
<p>Kevin was also <a href="http://radar.oreilly.com/2010/09/drawing-the-line-between-games.html" target="_blank">interviewed by James Turner on O&#8217;Reilly Radar.</a> He talked about how much of the interesting work in location-based mobile social apps is defined in opposition to the model of Second Life.  He also talked to me about how we are seeing &#8220;first life&#8221; take on the qualities of &#8220;second life.&#8221;  What goes on on the trading floor is largely a performance, secondary to a more important world of machine intelligence with giant co-located servers and bots fighting for trading advantages measured in fractions of seconds.</p>
<p>He pointed out how we draw on all these tropes from sci-fi movies, these HUDs based on ideas of machine intelligence where the robot talks to the other robot in English through an English HUD! Many of our current visual tropes for AR are perhaps just as inadequate for the kind of data driven world we live in.</p>
<p>Of course, when you are thinking of having fun with dinosaurs, or illustrated books, or whatever, this is not, perhaps, an issue. But if you are thinking of augmented reality interfaces as being important in the battle for the network economy, and in platforms for growth, then how this new interface helps us live better in a world of data is an important issue.</p>
<p><strong>Chris Arkenberg:  Now, does that indicate that the UI just needs more overhaul and innovation, or more that the visual interface for those experiences shouldn&#8217;t really leave the screen?  It shouldn&#8217;t move on to the view plane?</strong></p>
<p><strong>Tish Shute: </strong> Yes, we have a few concept videos that try to explore this&#8230;</p>
<p><strong>Chris Arkenberg:  Well, and I think this will happen at the level of human-computer interface.  I mean, that&#8217;s always been its role, in making coherent the sort of machine mind, for lack of a better term, making it coherent to the human mind. So I mean there is a lot of this sort of machine intelligence, the semantic Web 3.0 revolution, where it really is about enabling machines, and agents, and bots to understand the content that we&#8217;re feeding them.</strong></p>
<p><strong>But at the end of the day, they, for now, need to be providing value to us human operators. So there&#8217;s always going to be a role for human-computer interface and user experience design to make this stuff meaningful.</strong></p>
<p><strong>I mean, if you look at the revolution in visualization &amp; data viz, this is of incredible value because it takes a tremendous amount of data and collates it into a glanceable graphic that you can look at and immediately comprehend, because it&#8217;s delivered in a handy, visual way.</strong></p>
<p><strong>So I see that as a fascinating design challenge, how the user experience of the data world can be translated into meaningful human interaction.</strong></p>
<p><strong>Tish Shute:</strong> Yeah.  And when we see <a href="http://stamen.com/" target="_blank">Stamen Design</a> pursuing a big idea in AR, that&#8217;s when we might start to rock and roll, right?</p>
<p><strong>Chris Arkenberg:  Yeah. In my article, I sort of jokingly suggested that Apple will create the iShades.  But they&#8217;ve got the track record of being way ahead of the curve and delivering the future in very bold forms.</strong></p>
<p><strong>Tish Shute:</strong> A key part of the battle for the network economy is to bring the complexity of data into the human realm in a way that increases human agency.  Kevin suggests that the giant robot casinos of the markets should actually lift off into total abstraction, since these machine-driven trades come back into the human realm in ways that are so damaging to our lives &#8211; a lost house or job!  The notion of a counterveillance society, where people have more agency over the important aspects of their lives &#8211; health, housing, job (which I discussed with Kevin &#8211; interview upcoming) &#8211; has gotten pretty tricky!</p>
<p>But I think we will begin to see AR eyewear for specific applications (gaming and industrial) get more common fairly soon &#8211; possibly as smart phone accessories.</p>
<p>And it is clear that AR is going to be, increasingly, a part of our entertainment smorgasbord in coming months. The iPod Touch has a camera (although lower resolution), Nintendo&#8217;s devices are AR-ready, and many aspects of the AR vision of hands-free spatial interfaces will go mainstream through Natal.</p>
<p>But we are yet to see an app/platform emerge for mobile social AR games that turn every bar and cafe and ultimately the whole city into a gaming venue &#8211; although I think Ogmento and MUVE aim to lead the way here!  Will an AR company achieve Zynga-level success by using Foursquare, for example?</p>
<p>My feeling is that the lesson of Zynga is pretty important for mobile social AR games.  Could Flash social gaming have taken off without Facebook?</p>
<p><strong>Chris Arkenberg:  And that&#8217;s the real driver.  And again, as you mentioned with Second Life &#8211; and this was exactly my own sense &#8211; they stuck to the closed garden model and didn&#8217;t get the power of social and collaboration.  They attempted to add some of those affordances within the world, but, you know, ultimately most people aren&#8217;t in virtual worlds, and most people aren&#8217;t using augmented reality.  So leveraging the really predominant platforms like Twitter and Facebook and Foursquare, being able to leverage those affordances, that connectivity, into a platform like augmented reality, I think, is really critical. Because again, you get nothing unless you have the masses, unless you have people present.</strong></p>
<p><strong>Tish Shute:</strong> In AR research there is a long history of the notion of powerful AR-dedicated devices, but smart phones and tablets are good enough, and can launch augmented reality into the heart of the internet economy.  I think the elusive AR eyewear will come to us initially as a smart phone accessory for specific apps.  But, for the moment, most AR apps make little attempt to play in the wider internet economy.</p>
<p><strong>Chris Arkenberg:  And I think it&#8217;s actually much lower hanging fruit, really, to do gaming, marketing, transmedia.  Because then you don&#8217;t really care about the cloud, or maybe you only really care about a little part of it that your gaming property is addressing. Then it becomes much more about entertainment, and much more about persuasion, and sensationalism.  And if you&#8217;ve got dancing dinosaurs on your street, great!  It&#8217;s entertaining, it&#8217;s cool, it&#8217;s new. That stuff is fairly straightforward.</strong></p>
<p><strong>I keep coming back to this idea of, you know, the instrumented city.  What sort of data trails do you get out of a fully instrumented city?  So maybe you get traffic patterns, maybe you get geo-local movements of masses, maybe you get energy usage, that sort of thing &#8211; all the sorts of heat maps you can generate from a city. But then what good does it do to be able to have that on an augmented reality layer versus just looking at it on a mobile device or looking at it on your laptop?</strong></p>
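<p>[A quick illustration of what sits underneath the heat maps Chris describes: geo-tagged readings binned into grid cells and counted. This is a minimal sketch, not any particular platform&#8217;s pipeline, and the coordinates and bin size below are hypothetical:]</p>

```python
from collections import Counter

def heatmap_bins(points, cell_deg=0.01):
    """Bin (lat, lon) readings into roughly 1 km grid cells.
    The per-cell counts are the raw material of a city heat-map layer."""
    counts = Counter()
    for lat, lon in points:
        # Floor-divide each coordinate to get an integer cell index.
        counts[(int(lat // cell_deg), int(lon // cell_deg))] += 1
    return counts

# Hypothetical GPS pings: two near one street corner, one ~1 km north.
pings = [(51.5074, -0.1278), (51.5075, -0.1279), (51.5174, -0.1278)]
cells = heatmap_bins(pings)  # two occupied cells; the busier one has count 2
```

<p>[Whether those counts are then drawn on a laptop map or registered onto the street itself is exactly the question Chris is raising.]</p>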
<p><strong>Tish Shute:</strong> Of course, the use cases for &#8220;magic lens&#8221; AR are different from the kind of hands-free, 360 view with tightly registered media that a full vision of AR has always promised.  The 360 view is quite a different metaphor from the web and mobile rectangular screens.</p>
<p><strong>Chris Arkenberg:  Yes, yes.</strong></p>
<p><strong>Tish Shute:</strong> Did you see that <a href="http://laughingsquid.com/tweet-it-ipads-vs-iphones-a-parody-of-michael-jacksons-beat-it/" target="_blank">great parody of Michael Jackson&#8217;s</a> &#8220;Beat It&#8221; with the iPads versus the iPhones, right?</p>
<p><strong>Chris Arkenberg:  Oh, really?</strong></p>
<p><strong>Tish Shute:</strong> I tweeted it because I thought it was quite funny and a little close to the bone!<br />
[laughter]</p>
<p>&#8220;ur wanna an ipatch 2 b the new fad?&#8221; #AR gets cameo in Twitter, iPads &amp; iPhone&#8217;s Michael Jackson-Inspired Parody via @mashable</p>
<p>It is hard to get away from the importance of eyewear when discussing AR!</p>
<p><strong>Chris Arkenberg: Yes, so the hardware, to me, is a big stumbling point right now, or it&#8217;s a large gating factor, I think, for realizing what an augmented reality vision could really be like.  That it really does need to be heads up.  This holding the phone up in front of you is fun to demonstrate that it&#8217;s possible, and it&#8217;s valuable in some ways&#8230;</strong></p>
<p><strong>Tish Shute:</strong> And it&#8217;s particularly nice in some applications, like the planes app, or the Acrossair subway app where you hold the phone down and get the arrow, right?</p>
<p><strong>Chris Arkenberg:  Yeah, the way-finding stuff I think is really valuable&#8230;</strong></p>
<p><strong>Tish Shute:</strong> Sixth Sense really caught people&#8217;s imagination because it managed to deliver the gesture interface with cheap hardware, even if projection has limited uses (no brightly lit spaces or privacy, for example!).</p>
<p>The other important and as yet unrealized part of the AR dream is real-time communications.  Many interesting use cases would require this. As you know, that is my chief excitement, along with federation, in the Google Wave servers (which should soon be released as <a href="http://googlewavedev.blogspot.com/2010/09/wave-open-source-next-steps-wave-in-box.html" target="_blank">Wave in a Box</a>) for <a href="http://www.arwave.org/" target="_blank">ARWave</a>.</p>
<p><strong>Chris Arkenberg:  Well, my sense of Wave is that it was a ChromeOS protocol that they instantiated, or that they exhibited in the public deployment of Google Wave.  That that was a proof of their sort of low level architectural solution.  Because, you know, they&#8217;ve been rumored to be working on this cloud OS for some time. And so my sense is that Wave is actually one of their core components of that cloud OS, and that it just happened to incarnate for the public in a test run as Google Wave.</strong></p>
<p><strong>Tish Shute:</strong> I do hope that Wave in a Box will lower the barriers to entry for people experimenting with this technology.  The FedOne server was just way too hard for most people to take the time to set up.  Of course, it is the brilliance of the Wave Operational Transform work that also poses problems in terms of ease of use. But the Wave Federation Protocol is pretty innovative, and could even play an important role in real-time communications for AR eyewear connected to smartphones. The challenges that Wave takes on &#8211; real-time communications, federation, permissions, and filters &#8211; are pretty important ones for AR&#8230;</p>
<p><strong>Chris Arkenberg:  Especially when you&#8217;re trying to federate a lot of permissions and filter a lot of data, all of which gets even more important when you have a visual layer between you and the real world.</strong></p>
<p><strong>Tish Shute:</strong> You got it.  Yeah!</p>
<p><strong>Chris Arkenberg:  I think that&#8217;s really valuable real estate, both for third parties that want to get access to your eyes, as well as for you, as the user, who still needs to navigate through the phenomenal world and not be occluded by massive amounts of overhead data.</strong></p>
<p><strong>Tish Shute:</strong> Yes, I am sure Google has big plans for the next level of cloud computing and Wave looks at some key challenges.  I suppose federation poses some key business problems.  I think it was Michael Jones who said to me that it was a bit like socialism in that you have to be willing to give something up for the greater good.</p>
<p>Perhaps federation does not present enough appeal because of its challenges re business models?</p>
<p><strong>Chris Arkenberg:  Well, I wonder.  I mean, there&#8217;s got to be some value for their ad platform as ads are moving more towards this personalized experience.  Advertising is becoming less of a shotgun blast and more of a very precise, surgical strike. So being able to track user data to such a fine degree to mobilize the appropriate ads around them wherever they are, on any platform, is certainly very valuable to Google and their ad ecology.</strong></p>
<p><strong>Tish Shute:</strong> Many people have high hopes that HTML5, by lowering the barrier of entry for browser-style AR, could also pave the way for some interesting AR work&#8230;</p>
<p><strong>Chris Arkenberg:  Well, as much as I would hope that all the different players are going to come together and establish some shared set of standards, really, what&#8217;s happening is it&#8217;s a rush to the finish line to be the first&#8230; to get the most penetration in the marketplace so that Layar, for example, can say, &#8220;It&#8217;s official.  We&#8217;re the platform.&#8221;  And then the consolidation that will follow, where the Googles and the other big players like Qualcomm say, &#8220;OK, it&#8217;s mature enough.  We&#8217;ll start buying up all the smaller companies.&#8221;</strong></p>
<p><strong>And that&#8217;s where the real challenge is right now: there are no standards.  It&#8217;s such an immature technology that you have a lot of different players trying to establish the ground rules.  And again, this is one of the challenges that faced public virtual worlds &#8211; you had a lot of different virtual worlds that weren&#8217;t talking to each other in any particular way, and each had its own development platform. And so you end up with a very fractured ecosystem, or set of competing ecosystems, which is kind of what&#8217;s happening with AR right now, where a developer has to choose between a number of different new platforms or hedge by deploying across multiple platforms. Basically, the web browser wars are set to be recapitulated by the AR browsers.</strong></p>
<p><strong>Among them, Layar and Metaio seem to be getting the most traction.  But thereâ€™s still not a really strong case for a unified development ecosystem to emerge.</strong></p>
<p><strong>Tish Shute:</strong> So a discussion of ecosystem development brings us back to the Points of Control Map, I think. So what do you see as key points of interest for AR developers to watch in the Points of Control Map? And where do you want to sort of put your bets, right?  We are still really waiting for mobile social AR to emerge into the mainstream.</p>
<p><strong>Chris Arkenberg:  Yes.  And that’s primarily the shortcoming of the hardware itself, but also of the accuracy of current GPS technology.  That’s another kind of gating factor, because again, AR wants to be able to express the data within a distinct place or object.</strong></p>
<p><strong>So in a lot of ways, other than kind of what we’ve allowed for the broader entertainment purposes, for AR to really work, there needs to be more resolution in GPS location.  So for it to be truly locative… because it’s OK to tell Foursquare that you’re in Bar X.  But if you want to be able to draw data directly on a wall within that bar, or do advertising over the marquee on the front, you need more factors to accurately register those images on a discrete location. So that’s another sort of aspect of the immaturity of AR: it’s still very hard to register things on discrete locations without employing a number of diverse triangulation methods.</strong></p>
<p><strong>Tish Shute:</strong> Right.  The mobile AR games we see at the moment are really just faking a relationship to the physical world unless they rely on markers or some limited form of natural feature recognition, which is really just a more sophisticated form of markers.  But the Qualcomm SDK does offer some opportunities to tie AR media to the world more tightly, as does the Metaio SDK. But in terms of a mobile social AR game that could be like the Cape of Zynga to Foursquare in Location Basin [see the <a href="http://map.web2summit.com/">Points of Control map</a>]&#8230; We haven’t seen anything close yet.</p>
<p>AR should be able to bring the check-in mode to any object in our environment.</p>
<p><strong>Chris Arkenberg:  Yes, yes.  And that’s actually one of the early interests I had in the notion of social augmented reality. I wanted a way to tag my community with invisible annotations that only certain people could read, and found pretty quickly that that’s very difficult to do.  I mean you can kind of do some regional tagging, like on a beach, for example, but if you wanted to tag the bench that was on the cliff above the beach, it’s very difficult to do that using strictly locative reckoning.</strong></p>
<p><strong>There’s all sorts of really cool social engagement that can be revealed when people are allowed to attach things to the world around them, to the streets they normally pass through, or the points of interest that they normally engage in. To be able to author on the fly on the streets and attach it effectively to a discrete object.</strong></p>
<p><strong>Tish Shute:</strong> And yes, we do have all kinds of markers and QR codes.  But Erick Schonfeld of TechCrunch<a href="http://techcrunch.com/2010/10/18/likify-qr-code/" target="_blank"> made a good point about QR codes</a>: &#8220;Until QR code scanners become a default feature of most smartphones and they start to become actually useful enough for people to go through the trouble to scan them, they will remain a gee-whiz feature nobody uses.&#8221;</p>
<p><strong>Chris Arkenberg:  So again, this gets back to competing standards and who gets access to the phone stack, the bundle. Who gets the OEM deal…?</strong></p>
<p><strong>Tish Shute:</strong> Yes, the battles for the networks on the Handset Plains are pretty important for AR!<br />
[laughter] I think Layar have made some smart moves on The Handset Plains.</p>
<p>And there are a lot of acquisitions of near-field technology to look at.  If I remember rightly, eBay bought the RedLaser tech from Occipital &#8211; now there’s an interesting company. Their panorama stuff rocks!</p>
<p><strong>Chris Arkenberg:  Right. There’s a lot of near-field stuff that’s supposed to hit all of the major mobile platforms in the next year or so.</strong></p>
<p><strong>I mean I think where this is heading, in my mind, is basically smart motes.  You know, little near-field, wide-range RFIDs the size of a tiny square that you could attach to just about anything and then program to be a representative of your establishment or of an object, so that you can start to tag just about anything. I mean you can’t rely on geo to do it, but if you have a near-field chip there that costs maybe two cents to buy in bulk, and you can flash program it, then you can start to attach data to just about anything.</strong></p>
<p><strong>Tish Shute:</strong> Yes &#8216;cos some things still remain very difficult for image recognition technologies like Google Goggles.</p>
<p><strong>Chris Arkenberg:  Well, if your phone can interrogate for near-field devices, and it detects a chip in its near field, it can then interrogate that chip.  The chip may contain flash data on itself, or it may point to a local server in the establishment, or it may go to the cloud and get that data back.</strong></p>
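The resolution order described here &#8211; payload flashed on the chip, then a venue's local server, then a cloud lookup &#8211; can be sketched as a tiny dispatch function. This is a hypothetical illustration, not a real NFC API; the tag fields, the `fetch` helper, and the endpoint are all invented for the example.

```python
# Hedged sketch of the tag-resolution order described above: payload on the
# chip itself, then a local server, then the cloud. All names are illustrative.

CLOUD_LOOKUP = "https://example.com/tags/"  # invented cloud endpoint

def fetch(url: str) -> str:
    """Stand-in for an HTTP GET; a real client would use urllib or similar."""
    return f"<data from {url}>"

def resolve_tag(tag: dict) -> str:
    """Return the data attached to a detected near-field tag."""
    if tag.get("inline_payload"):            # data flashed onto the chip itself
        return tag["inline_payload"]
    if tag.get("local_server_url"):          # the establishment hosts the content
        return fetch(tag["local_server_url"])
    return fetch(CLOUD_LOOKUP + tag["id"])   # fall back to a cloud lookup

print(resolve_tag({"id": "bench-42", "inline_payload": "Meet here at noon"}))
```

The point of the ordering is latency: data on the chip is instant, the venue's server is one local hop, and the cloud is the slowest fallback.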
<p><strong>Tish Shute:</strong> Yes, there is movement from the top, and open-source hardware like Arduino has created an opportunity for all sorts of creativity with instrumented environments.  And the handheld sensors in our pockets &#8211; our smartphones &#8211; create a lot of opportunity for bottom-up innovation too.</p>
<p><strong>Chris Arkenberg:  I mean that’s my guess.  If you look at what IBM is doing with their Smarter Planet initiative, they’re partnering with a lot of municipalities, and obviously with a lot of businesses and their global supply chains.</strong></p>
<p><strong>But they’re basically working with municipalities and all these stakeholders to instrument their territory, their business, or their city, as it were. So they’re working to provide embedded sensors and the software necessary to read them out and run reports &amp; viz.  And presumably that software can extend to include some sort of mobile device to interrogate the sensors and read the data.</strong></p>
<p><strong>That’s kind of a top-down approach of a very large global company working with top-down governance bodies to do this. Simultaneously, you have the maker crowd experimenting with Arduino and such to build from the grassroots, the bottom-up approach.</strong></p>
<p><strong>And that’s primarily gated by the amount of learning it takes to be able to program these devices, to be able to hack them.  Typically, the grassroots creators who make these devices don’t have the luxury of very large budgets to make things highly usable and WYSIWYG.</strong></p>
<p><strong>So the bottom-up community is a sandbox to create tremendous amounts of innovation, because they are unconstrained by the very real financial needs of the top-down innovators.  And so you get a lot of fascinating innovation, a very rich ecology from the bottom-up approach, but you don’t get a lot of wide distribution.  But that does filter up to and inform the top-down approach that has a lot more money to put into this stuff.  And it ultimately has to respond to the needs of the marketplace.</strong></p>
<p><strong>I mean if there’s an answer to the question of whether something like AR will succeed through the bottom-up grassroots approach or the top-down industry approach, I would say it would be both.  Handsets will be hacked to read the bottom-up innovations of the maker community, and handsets will be preprogrammed to read the top-down efforts of the IBMs of the world.</strong></p>
<p><strong>Tish Shute:</strong> Yes, but I have to say it is very time-consuming hacking phones (I have just seen a few days sucked up in this myself so that I could upgrade my G1 to try out the new ARWave client!).  I mean Android has obviously been the platform of choice because of openness, but the business model of the iPhone and its market share in the US sure make it important for developers.  It’s like you don’t exist if you don’t have an iPhone app for what you are doing.</p>
<p><strong>Chris Arkenberg:  Yeah, and that’s the challenge, because at the end of the day developers prefer not to work for free, and a solid, reliable mechanism to monetize their efforts becomes very appealing.</strong></p>
<p><strong>When I look at this map, the points of control map, it’s really interesting to me, because what it says to me with respect to AR is that each of these little regions that they have drawn out would be a great research project. So every single one of these should be instructive to AR.</strong></p>
<p><strong>In other words, we should be able to look at social networks, the land of search, or the kingdom of ecommerce, and apply some very rigorous critical thinking to say, “How would AR add to this engagement, this experience of gaming, or ecommerce, or content?”</strong></p>
<p><strong>Looking at each of these individually and really meticulously saying, “OK, well yes, it can do this, but how is that different from the current screen media experience, the current web experience that we have of all these types of things?”  You know, how can augmented reality really add a new layer of value and experience to these? And I think that process would really trim a lot of the fat from the hopes and dreams of AR and anchor it down into some very pragmatic avenues for development.  And then you could start looking at, “Well, OK, what happens when we start combining these?” When we take gaming levels and plug that into the location basin, as you suggested.</strong></p>
<p><strong>Tish Shute: </strong> Some of the important platforms for AR don’t appear to have spots on the map, like Google Street View and other mapping technologies that hold out so much hope for AR, or am I missing something?</p>
<p><strong>Chris Arkenberg:  You mean on the map?</strong></p>
<p><strong>Tish Shute:</strong> Yes for the full vision of AR we need sensor integration, computer vision and cool mapping technologies to come together. Do you see where Google Maps and Google Street View&#8230; Where would they be?</p>
<p><strong>Chris Arkenberg:  Yeah, I mean it’s certainly content, it’s location…</strong></p>
<p><strong>Are you familiar with Earthmine?</strong></p>
<p><strong>Tish Shute:</strong> Yes, yes I am, definitely.<a href="http://www.earthmine.com/index" target="_blank"> Earth Mine</a>, <a href="http://simplegeo.com/" target="_blank">Simple Geo</a>, Google Street View, user generated internet photo sets like  Flickr all of these could be very important to AR, potentially.</p>
<p><strong>Chris Arkenberg:  Well, and the interesting thing about Earthmine is that they’re effectively trying to do an extremely precise pixel-by-pixel location mapping.  So they’re taking pictures of cities just like Street View, except they’re using the Z axis to interrogate depth and then using very precise geolocation to attach a GPS signature to each pixel that they’re registering in their images. Effectively, you get a one-to-one data set between pixels and locations.  And so you can look at something like Google Street View, and if you point to the side of a building, in theory, it should know exactly where that is.</strong></p>
<p><strong>They’re rolling this out with the idea of being able to tag augmented reality objects in layers directly to surfaces in the real world.  So that’s another approach to trying to get accurate registration and to try and create what are essentially mirror worlds. Then your Google Street View becomes a canvas for authoring the blended world, because if you plop a 3D object into Street View on your desktop, and then you go out to that location with your AR headset, you’ll see that 3D object on the actual street.</strong></p>
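The per-pixel registration described above &#8211; combining a camera's GPS fix, its heading, and a measured depth to give each pixel its own geographic coordinate &#8211; can be illustrated with a back-of-the-envelope back-projection. This is not Earthmine's actual method, just a flat-earth sketch with invented function names:

```python
import math

EARTH_RADIUS_M = 6_371_000.0  # mean Earth radius in metres

def pixel_to_latlon(cam_lat: float, cam_lon: float, heading_deg: float,
                    bearing_offset_deg: float, depth_m: float) -> tuple:
    """Project a pixel with a known depth onto a geographic coordinate.

    Uses a flat-earth approximation, which is adequate at street scale;
    bearing_offset_deg is the pixel's horizontal angle off the camera axis.
    """
    bearing = math.radians(heading_deg + bearing_offset_deg)
    d_north = depth_m * math.cos(bearing)   # metres north of the camera
    d_east = depth_m * math.sin(bearing)    # metres east of the camera
    lat = cam_lat + math.degrees(d_north / EARTH_RADIUS_M)
    lon = cam_lon + math.degrees(
        d_east / (EARTH_RADIUS_M * math.cos(math.radians(cam_lat))))
    return lat, lon

# A wall point 25 m dead ahead of a camera facing east (heading 90 degrees):
print(pixel_to_latlon(37.7749, -122.4194, 90.0, 0.0, 25.0))
```

Doing this for every pixel in an image, using a per-pixel depth map, is what yields the one-to-one pixel-to-location data set described above.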
<p><strong>Tish Shute:</strong> There was some experimental work with Google Earth as a platform for a kind of simulated AR, but I suppose Google Earth doesn’t figure in the battle for the network economy as it never got developed as a platform.</p>
<p><strong>Chris Arkenberg:  It hasn’t tried to become a platform, to my knowledge.  I mean I know some people are doing stuff with it, but as far as I know, Google owns it, they did it the best because they have the best maps, and there’s not a huge ecosystem of development that’s based around it other than content layers.</strong></p>
<p><strong>And my sense of everything else on the Points of Control map is they’re looking more at these sort of platform technologies that…</strong></p>
<p><strong>Tish Shute:</strong> Yes, re platforms for growth for AR. Gaming consoles will probably emerge as a significant platform for AR this year.</p>
<p><strong>Chris Arkenberg:  There will be much more of a blended reality experience in the living room for sure, and with interactive billboards. Digital mirrors are another area.  So I mean if we kind of extend AR to include just blended reality in general, you know, this is moving into our culture through a number of different points. As you mentioned, it will be in the living room, it will be in our department stores where you can preview different outfits in their mirror. We’re already seeing these giant interactive digital billboards in Times Square and other areas.</strong></p>
<p><strong>It’s funny.  I mean for me, the sort of blended reality aside, augmented reality, to me, is actually a very simple proposition in some respects.  When I look at this map, augmented reality is just an interface layer to this map in my mind, just as it’s an interface layer to the cloud and it’s an interface layer to the instrumented world. It’s a way to get information out of our devices and onto the world.</strong></p>
<p><strong>Tish Shute:</strong> The importance of leveraging existing platforms has become pretty clear, but it is interesting: Facebook definitely gave Zynga the opportunity, but would Facebook be so big without Zynga’s social gaming boost?</p>
<p><strong>Chris Arkenberg:  I feel that Zynga has definitely helped its growth… But I think Zynga has benefited a lot more from Facebook than Facebook has from Zynga.</strong></p>
<p><strong>Tish Shute:</strong> Zynga certainly proved you could build a profitable business on Facebook’s API!</p>
<p><strong>Chris Arkenberg:  They did.  And they also really validated the Facebook ecosystem and the platform.  They really extended it… Zynga benefited from the massive social affordances that Facebook had already architected and developed. They brought gaming directly into Facebook, and particularly, this emerging brand of lightweight social gaming that, when you sit it on top of a massive global social network like Facebook, suddenly lights up.</strong></p>
<p><strong>Tish Shute: </strong>AR pioneers should quite carefully go through this map. There is so much to think about here. I’m kind of a fanatic about Streams of Activity in AR.  Real-time brokerages and their potential for AR is something I am fascinated by.  That is one reason I love the ARWave project.</p>
<p>Anselm Hook, to me, is one of the great thinkers in this area of real time brokerages &#8211; with his project Angel, and the work of <a href="http://www.ushahidi.com/" target="_blank">Ushahidi,</a> which is now the platform <a href="http://www.ugotrade.com/2010/09/17/urban-augmented-realities-and-social-augmentations-that-matter-interview-with-bruce-sterling-part-2/" target="_blank">for augmented foraging (see here)</a>.  Anselm is now working on AR at PARC which is exciting.</p>
<p><strong>Chris Arkenberg:  Well, there are some challenges working with data streams. Presentation and filtering, I think, is a big challenge with any sort of stream.  Because obviously, you have a lot of potential data to manage, to parse, and to make valuable and comprehensible. So I think this is bound very closely to being able to personalize experiences, or having very discrete, valuable experiences.  Disaster relief, for example, I think is an interesting idea that ties into the Pachube type of work. Where, if you had the headset and you were a relief worker, and you had an immediate, lightweight, non-intrusive, heads-up alpha channel overlay &#8211; waypoint markers showing you all of the disaster locations or points of need &#8211; AR becomes extremely valuable, because it’s a primarily hands-free environment.  This is why the military stuff is so interesting.</strong></p>
<p><strong>Tish Shute:</strong> Ha!  We are running into the eye patch/shades/goggles/sexy specs thing again.  But filtering and making streams of activity relevant will be very interesting for AR.  Again, that is why I love the Wave Federation Protocol work, because of what they have built into their XMPP extensions.  You can have your real-time personal data streams, or community streams, or broadcast publicly &#8211; the permissions are built in.</p>
<p>And Thomas Wrobel’s original vision of these layers and channels is only fully expressed if you have the eyewear.</p>
<p><strong>Chris Arkenberg:  Well, and it becomes redundant if it’s on a mobile. To use a very basic example, Twitter: obviously there’s an app where you can view those streams of activity over the camera stream. But you can view that real-time data on the screen.  Why do you need to see it heads up?</strong></p>
<p><strong>The reason I really pay attention to what the military is investing in is, one, because they have a ton of money, but also because they tend to represent the core bio-survival needs of the species… So, when I look at computing, I see this very obvious trend of computers getting smaller and smaller and closer and closer to us because they’re so valuable to our success.  They give us so much valuable information for engaging our world on a moment by moment basis.  So, of course now we have these tiny little handheld devices that give us access to the global knowledge repositories of human history, because it’s so useful to have that stuff right at hand.</strong></p>
<p><strong>The only impediment now is that it takes one of our hands, if not both of them, to access it.  So if you are in the natural world, which we all always are, ultimately you want your hands free in order to engage with the world on a physical level.</strong></p>
<p><strong>I see computation, or rather our access to computation, just getting thinner and thinner, and we’ll very soon move into eyewear, and inevitably, we’ll move into brain-computer interfaces in some capacity.</strong></p>
<p><strong>So when you’re the disaster worker, or a deployed soldier, or the extreme mountain biker, or the heli-skier, or just an adventurer, there are a lot of very practical reasons to have access to information on a heads-up plane. I see AR as being so profound and so valuable, but we’re getting a glimpse of it in its infancy, and it’s got a ways to go to be able to really contain what it is we’re reaching for.</strong></p>
<p><strong>Tish Shute:</strong> I agree.</p>
<p><strong>Chris Arkenberg:  And that’s been a big criticism I’ve had with all the existing AR implementations that I’ve seen: the UI really needs a revolution.  It’s very heavy-handed.  It is not dynamic, even though it’s supposed to be.  It does not take advantage of transparencies.  It treats the screen like a screen.  It doesn’t treat the screen like a window onto the real world. When you’re looking at the real world, you don’t want a lot of occlusion.  You want very soft-touch indicators of a data shadow behind something that you can then address and then have it call out the information that’s important to you.</strong></p>
<p><strong>Tish Shute:</strong>  Now, that’s a very nice kind of image you’ve conjured for me there.  Do you see that more could be done on the smartphone than is being done within that?  Or are we like waiting for the old iShades?</p>
<p><strong>Chris Arkenberg:  I think there’s definitely a lot of room for improvement on the smartphone UI.  Nobody’s really played around with it much. And again, I think that’s in part because there hasn’t been a really established platform with enough money to fund interesting UI work. We see it in some of the concept demos that float around every now and then.</strong></p>
<p><strong>I guess it’s both a blessing and a curse that I’m always five steps ahead of where I’m trying to get to.</strong></p>
<p><strong>Tish Shute:</strong> Yeah, I am familiar with that feeling!</p>
<p><strong>Chris Arkenberg:  So I’m always trying to reach for the vision even though it’s a bit distant. I think there’s going to be a lot of development on the handsets.  But again, I think we need a lot of refinement.  We need a lot of real critical analysis of why this is a good thing.</strong></p>
<p><strong>To get back to the original point of Raimo’s comment, it struck me.  And I knew it, but I had just set it aside as gimmickry. But he’s right.  Content is a huge driver for this.  Just stuff that’s engaging, and fun, and cool, and shows off the technology so they can get enough money to make it through whatever Trough of Disappointment may be waiting.</strong></p>
<p><strong>Tish Shute:</strong> Yeah, don’t underestimate the Planes of Content!  They are a great place to get interest and money to keep AR technology moving on, right?</p>
<p><strong>Chris Arkenberg:  Yeah, yeah.  Because, you know, there’s a lot of freedom there.  And you can piggyback on all the rest of the content that’s out there and jump on memes and marketing objectives, etc&#8230;</strong></p>
<p><strong>And there’s a lot of stuff… I’m blanking on some of the names, but some of these historical recreations of city streets. There’s a street in London where they overlaid historical photos in a really compelling experience. [Museum of London - http://www.museumoflondon.org.uk/] Again, I’m completely forgetting the attributions, but those are the type of things that can really be pursued on the existing platforms.  There is stuff that’s really compelling and really cool.</strong></p>
<p><strong>I heard of another interesting use case &#8211; and I should say that I can’t find attributions to this anywhere on the web and I may be paraphrasing or misrepresenting the actual work, but I think the concept is worth exploring anyway. The idea was that you could take the locations of border checkpoints and conflict sites in Palestine and Israel and visually overlay them on an AR layer in San Francisco.  And it would do some sort of transposition where you could virtually view these things in San Francisco with the same locational mapping superimposed. So you could see where the checkpoints were.  You could see where the wall was.  You could see where suicide bombings were and where there had been conflicts.</strong> <strong>[I cannot find any citations for this!]</strong></p>
<p><strong>Tish Shute: </strong> But with an AR view?  But why would you use an AR view if you  are in San Francisco, then?</p>
<p><strong>Chris Arkenberg:  Because it superimposes two realities, translating the Gaza conflict into San Francisco as you are walking around. You can interrogate the world. There’s a discoverability aspect where you’re using the headset &#8211; or the handset, rather &#8211; to reveal things that you could not see otherwise in your city. It was done as an art piece, but a provocative, obviously political art piece.</strong></p>
<p><strong>Tish Shute: </strong>Very interesting.  I’d love to see that. Because getting away from this idea that you actually have to have this one-to-one relationship between the data and the world is kinda nice, isn’t it?  Well, not one-to-one, but a very literal… getting away from that literalness is kind of good.</p>
<p><strong>Chris Arkenberg:  And that’s a possibility of virtual reality and augmented reality merging, that maybe virtual reality is actually going to do best by coming out of the box and writing itself over our reality, so that as you are walking around, you are no longer seeing San Francisco, but you are seeing part of Everquest or World of Warcraft.</strong></p>
<p><strong>Tish Shute: </strong> Well, this is where Bruce Sterling gets to that point he made in <a href="http://augmentedrealityevent.com/2010/06/06/are-2010-keynote-by-bruce-sterling-build-a-big-pie/" target="_blank">his keynote for ARE2010</a>, that if we actually have viable AR eyewear, then you get the gothic stepsister of AR, VR, rising from the grave!  He asks whether the very charm of augmented reality is, in fact, that it adds rather than subtracts from your engagement with the world, and that getting sucked back into the black hole of VR might not be so great.</p>
<p><strong>Chris Arkenberg:  And then you get all sorts of interesting challenges to social cohesion if you have a lot of different people experiencing very different worlds, effectively.  If there is no real consensual reality and a majority of your local populace is, in fact, experiencing very different and unique versions of the world, what does that do to social cohesion?  How does that reinforce tribalism, for example, when only you and certain others get to opt in to a particular layer view of the world?</strong></p>
<p><strong>Tish Shute:</strong> Yes Jamais Cascio wrote an interesting piece on that issue on AR and social cohesion a while back.</p>
<p>An eye patch is a more logical vision than the goggles in many ways but I suppose the loss is stereo vision?</p>
<p><strong>Chris Arkenberg:  And actually, there were developments in military helicopter technology many years ago that used a single square pane of glass over the eye, mounted to the helmets of pilots.  And then they drew various bits of heads-up information on it. So that ensures that you’re having a real strong engagement with the real world, which, obviously, when you’re a helicopter pilot is quite important.  But you still have access to the data layer of the invisible world.</strong></p>
<p><strong>Tish Shute:</strong> I just went to <a href="http://www.cloudera.com/company/press-center/hadoop-world-nyc/" target="_blank">Hadoop World</a> and I have to say, I was awestruck by how big that’s got.  I mean <a href="http://hadoop.apache.org/" target="_blank">Hadoop</a> has gone from like zero to huge in just a few years.  It’s just like now everyone has the power of Google’s BigTable at their fingertips.</p>
<p>What’s the play for AR in the land of search?</p>
<p>I could imagine Hadoop being a very powerful tool for AR analytics?</p>
<p>Have you got any thoughts on the land of search and AR? Of course visual search is proceeding at a fast pace and there is a lot of promise for integrations with AR in the future, but the latency for visual search is still pretty high?</p>
<p><strong>Chris Arkenberg:  In the near term, not a lot.  In the medium term, there’s a larger trend towards virtual agents that you can program or teach to keep watch over things for you as an effort to scale down the data overload.  So search is something that’s going to become more personalized and more active.  There’s a movement to make it so people can essentially deputize these agents to be always searching for them; to be out there looking for the things that they have told these agents are important to them.</strong></p>
<p><strong>So active search for AR I think presents some challenges, obviously, because you need to do text input, typically, or voice input.  Voice input, I think, is much more achievable than text input for AR.  But I can certainly imagine an AR layer that is being serviced by these agents that we have roaming around the web for us, reconciling their visual view of the world with our personalizations. AR apps are contextually aware, so they know that if you’re downtown, they’re not going to be giving you a ton of information about Software as a Service infrastructure, or what have you.  Instead, they’re going to be handing you little tidbits about a particular clothing brand you’ve opted in to follow and information about music venues &amp; schedules, for example.  Or perhaps you’ll be on the lookout for other users that have opted in to publicly tag themselves as a member of this or that affinity.</strong></p>
<p><strong>I keep coming back to this idea of AR as really just a simple visualization layer that all of these other technologies can potentially feed into.  So in that sense, search becomes a passive thing that AR is just simply presenting to you in a heads-up, hands-free, or potentially hands-free environment.</strong></p>
<p><strong>Tish Shute:</strong> Yes, the big challenge is the stepping stones to that point! Small steps that keep interest going into developing the underlying technology (and not just in research labs!) that will bring us that interface.  We have seen some movement already with Qualcomm.</p>
<p><strong>Chris Arkenberg:</strong> And there are bandwidth issues as well, as we can see with Google Goggles, which is a great example of visual search.  But you have to take a picture and send it to the cloud and wait for your results.  It’s not a real-time, dynamic interrogation of the world.</p>
<p><strong>Tish Shute:</strong> Yes, we are really only at the very beginning of AR being ready for prime time… it would be interesting to ask AR developers how many of them use AR on a daily basis.</p>
<p><strong>Chris Arkenberg:  I think a lot of us are just informed by the sci-fi myths and fascinated with the potential now that it’s starting to become real. But I think we all kinda get that it’s still extraordinarily young.  I mean the web is extraordinarily young. And AR is itself far younger in a lot of ways in its implementations.</strong></p>
<p><strong>Everybody has a lot of excitement about all of the great potentials that are being unleashed by this great wave of the Internet and the web and ubiquitous mobile computing.  So that’s why, you know, you look at that map and we talk about AR, and you can’t talk about any of the stuff without talking about all of it, in a lot of ways, particularly with something like AR, where it’s so ultimately agnostic and could be completely pervasive across all of these layers.</strong></p>
<p><strong>So my fascination is with the future, and I measure our progress towards it by the young, nascent offerings from the platform players and the developers. And yeah, a lot of it is… it’s akin to getting that first triangle on the screen in 3D.  You know, when the renderer finally works and you get a triangle on the screen, and you go, “Oh my God, it renders.”  And then you can start to really build polygons and build objects, and start doing Boolean operations, and get light and rendering in there, and textures, and on, and on, and on.<br />
So I’m fascinated by the Layars and the Metaios…<br />
[laughter]</strong></p>
<p><strong>Tish Shute:</strong> Yes, and hats off to all the players in the emerging industry &#8211; Layar, Metaio, Ogmento, Total Immersion, and all the others who are finding clever ways to bring fun aspects of AR into the mainstream, and fuel interest to take the technology to the next level.</p>
<p><strong>Chris Arkenberg:  Absolutely.  And the hype cycle is very valuable.  It has really helped launch the AR industry.  It’s brought a lot of eyes, and it’s brought a lot of money into the industry.  And it’s forcing people like us to have these conversations to understand how to refine its growth and really focus on the potential in all these different venues, whether it’s trying to save lives, or better understand your city, or have really compelling entertainment experiences.</strong></p>
<p><strong>Everybody’s excited, and everybody’s sharing, and everybody’s trying to move it forward in a way that’s the most productive.</strong></p>
]]></content:encoded>
			<wfw:commentRss>http://www.ugotrade.com/2010/10/27/platforms-for-growth-and-points-of-control-for-augmented-reality-talking-with-chris-arkenberg/feed/</wfw:commentRss>
		<slash:comments>3</slash:comments>
		</item>
		<item>
		<title>Augmented Twitter at Jeff Pulver&#8217;s #140conf</title>
		<link>http://www.ugotrade.com/2010/04/23/augmented-twitter-at-jeff-pulvers-140conf/</link>
		<comments>http://www.ugotrade.com/2010/04/23/augmented-twitter-at-jeff-pulvers-140conf/#comments</comments>
		<pubDate>Fri, 23 Apr 2010 14:25:03 +0000</pubDate>
		<dc:creator><![CDATA[Tish Shute]]></dc:creator>
				<category><![CDATA[Ambient Devices]]></category>
		<category><![CDATA[Ambient Displays]]></category>
		<category><![CDATA[Android]]></category>
		<category><![CDATA[architecture of participation]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Augmented Reality]]></category>
		<category><![CDATA[digital public space]]></category>
		<category><![CDATA[Instrumenting the World]]></category>
		<category><![CDATA[internet of things]]></category>
		<category><![CDATA[iphone]]></category>
		<category><![CDATA[message brokers and sensors]]></category>
		<category><![CDATA[mirror worlds]]></category>
		<category><![CDATA[Mixed Reality]]></category>
		<category><![CDATA[mobile augmented reality]]></category>
		<category><![CDATA[mobile meets social]]></category>
		<category><![CDATA[Mobile Reality]]></category>
		<category><![CDATA[Mobile Technology]]></category>
		<category><![CDATA[nanotechnology]]></category>
		<category><![CDATA[new urbanism]]></category>
		<category><![CDATA[online privacy]]></category>
		<category><![CDATA[Participatory Culture]]></category>
		<category><![CDATA[privacy and online identity]]></category>
		<category><![CDATA[social gaming]]></category>
		<category><![CDATA[social media]]></category>
		<category><![CDATA[ubiquitous computing]]></category>
		<category><![CDATA[Web 2.0]]></category>
		<category><![CDATA[Web Meets World]]></category>
		<category><![CDATA[websquared]]></category>
		<category><![CDATA[World 2.0]]></category>
		<category><![CDATA[#140conf]]></category>
		<category><![CDATA[#ashtag. TEDxVolcano]]></category>
		<category><![CDATA[3D mailbox]]></category>
		<category><![CDATA[Alon Nir]]></category>
		<category><![CDATA[Anselm Hook]]></category>
		<category><![CDATA[AR Wave]]></category>
		<category><![CDATA[are2010]]></category>
		<category><![CDATA[ARWave]]></category>
		<category><![CDATA[augmented reality event]]></category>
		<category><![CDATA[augmented reality games]]></category>
		<category><![CDATA[augmented twitter]]></category>
		<category><![CDATA[Dancing Ink Productions]]></category>
		<category><![CDATA[EComm]]></category>
		<category><![CDATA[Evolutionary Reality]]></category>
		<category><![CDATA[Farmville]]></category>
		<category><![CDATA[federation protocol]]></category>
		<category><![CDATA[Four Square]]></category>
		<category><![CDATA[Gamepocalypse]]></category>
		<category><![CDATA[gowalla]]></category>
		<category><![CDATA[Jeff Pulver]]></category>
		<category><![CDATA[Jerry Paffendorf]]></category>
		<category><![CDATA[Joshua Fouts]]></category>
		<category><![CDATA[Latitude]]></category>
		<category><![CDATA[Loveland]]></category>
		<category><![CDATA[micro-real estate]]></category>
		<category><![CDATA[mobile social]]></category>
		<category><![CDATA[mobile social augmented reality]]></category>
		<category><![CDATA[mobile social games]]></category>
		<category><![CDATA[Open AR]]></category>
		<category><![CDATA[Open AR Web]]></category>
		<category><![CDATA[open standard federated protocol]]></category>
		<category><![CDATA[Rita J. King]]></category>
		<category><![CDATA[Rouli Nir]]></category>
		<category><![CDATA[social games]]></category>
		<category><![CDATA[The Kotel]]></category>
		<category><![CDATA[Tish Shute]]></category>
		<category><![CDATA[tishshute]]></category>
		<category><![CDATA[wave federation protocol]]></category>
		<category><![CDATA[WhereCamp]]></category>

		<guid isPermaLink="false">http://www.ugotrade.com/?p=5406</guid>
		<description><![CDATA[Augmented Twitter &#8211; open, mobile, social augmented reality via ARwave. View more presentations from Tish Shute. Augmented Twitter Presenting Augmented Twitter (see video and slides above) at Jeff Pulver&#8217;s 140 Characters Conference (#140conf) was super fun, and great video makes this a conference that you can enjoy catching up on after the fact. Jeff Pulver [&#8230;]]]></description>
				<content:encoded><![CDATA[<p><a href="http://www.ippio.com/view_video.php?viewkey=da6ab5c15dd856998e4b" target="_blank"><img class="alignnone size-full wp-image-5407" title="Screen shot 2010-04-22 at 9.52.22 AM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/04/Screen-shot-2010-04-22-at-9.52.22-AM.png" alt="Screen shot 2010-04-22 at 9.52.22 AM" width="458" height="368" /></a></p>
<div id="__ss_3817428" style="width: 425px;"><strong style="display:block;margin:12px 0 4px"><a title="Augmented twitter - open, mobile social augmented reality via ARwave" href="http://www.slideshare.net/TishShute/augmented-twitter">Augmented Twitter &#8211; open, mobile, social augmented reality via ARwave</a></strong><object classid="clsid:d27cdb6e-ae6d-11cf-96b8-444553540000" width="425" height="355" codebase="http://download.macromedia.com/pub/shockwave/cabs/flash/swflash.cab#version=6,0,40,0"><param name="allowFullScreen" value="true" /><param name="allowScriptAccess" value="always" /><param name="src" value="http://static.slidesharecdn.com/swf/ssplayer2.swf?doc=augmentedtwitter-100422085925-phpapp01&amp;stripped_title=augmented-twitter" /><param name="allowfullscreen" value="true" /><embed type="application/x-shockwave-flash" width="425" height="355" src="http://static.slidesharecdn.com/swf/ssplayer2.swf?doc=augmentedtwitter-100422085925-phpapp01&amp;stripped_title=augmented-twitter" allowscriptaccess="always" allowfullscreen="true"></embed></object>View more <a href="http://www.slideshare.net/">presentations</a> from <a href="http://www.slideshare.net/TishShute">Tish Shute</a>.</div>
<p><br /></p>
<h3>Augmented Twitter</h3>
<p>Presenting <a href="http://www.ippio.com/view_video.php?viewkey=da6ab5c15dd856998e4b" target="_blank">Augmented Twitter</a> (see video and slides above) at <a href="http://140conf.com/" target="_blank">Jeff Pulver&#8217;s 140 Characters Conference</a> (#140conf) was super fun, and <a href="http://www.ippio.com/140conf" target="_blank">great video</a> makes this a conference that you can enjoy catching up on after the fact. Jeff Pulver does an excellent job of keeping people to a challengingly short format. Even I managed to bring my talk in under 5 mins!</p>
<p>#140conf is a real time mobile social crowd, and pretty attuned to Augmented Reality. Everyone in the audience had heard of Augmented Reality, and while most had never tried an AR app, nearly everyone used a mobile social app like <a href="http://foursquare.com/" target="_blank">Four Square</a>, <a href="http://gowalla.com/" target="_blank">Gowalla</a>, or <a href="http://www.google.com/latitude/intro.html" target="_blank">Latitude</a>. As Dan Harple (@dharple) &#8211; Executive Chairman, <a href="http://www.gypsii.com/" target="_blank">GyPSii</a> &#8211; said in his interesting presentation, <a href="http://www.ippio.com/view_video.php?viewkey=44143e1f2f13b2b729ab"><strong>Evolution of Location and Places</strong></a>, &#8220;everyone gets connection, and that connection in real time is the thing if we can get it, and that real time connection is innately mobile.&#8221;</p>
<p><a href="http://www.arwave.org/" target="_blank">ARwave</a> aims to push mobile, social, real time connection even further with augmented reality. As Anselm Hook puts it so brilliantly in his <a href="http://www.slideshare.net/anselm/20100421-ecomm-pressy" target="_blank">presentation at EComm</a>, &#8220;AR is about publishing &#8216;verbs&#8217; &#8211; interactive, actionable, digital agents &#8211; not publishing 3D models.&#8221; I have some mega posts brewing on this topic. Augmented Reality will need to support publishing game-like behavior, and digital agents that can embody a set of actions and reactions.</p>
<p>This need for augmented reality to publish behavior, and to share and integrate multiple real time data streams in one view, is just one of the reasons <a href="http://www.arwave.org/" target="_blank">AR Wave</a> uses <a href="http://www.waveprotocol.org/" target="_blank">an open federated protocol</a>. Federation is also particularly important for augmented reality because, as Anselm pointed out at <a href="http://wherecamp.org/" target="_blank">WhereCamp</a>, AR will certainly demand very efficient distribution of state change at the systems level &#8211; to move the computation to its lowest latency.</p>
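To make the idea of publishing a shareable state change concrete, here is a minimal sketch in Python. The message shape and all names (`ARVerb`, the `wave.example.org` identity) are hypothetical illustrations, not ARWave's actual wire format, which is defined by the Wave Federation Protocol:

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class ARVerb:
    """A geolocated 'verb': an actionable digital agent, not a static 3D model."""
    actor: str    # federated identity, e.g. user@wave.example.org (hypothetical)
    action: str   # the behavior to trigger on subscribers' views
    lat: float
    lon: float
    payload: dict # whatever state the behavior carries

def serialize(verb: ARVerb) -> str:
    # A federation server would relay this delta to every subscribed node,
    # so each client applies the state change locally, close to the user.
    return json.dumps(asdict(verb))

msg = ARVerb("tish@wave.example.org", "annotate", 40.7484, -73.9857,
             {"text": "augmented tweet"})
wire = serialize(msg)
```

The point of federating such deltas, rather than serving one central 3D scene, is that each node only relays small state changes and every client resolves them locally.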
<p>The only other cloud over our Augmented Reality party at #140conf was that #ashtag kept our co-panelist and panel chair from joining us. Rita J. King, @ritajking &#8211; Innovator-in-Residence at IBM&#8217;s Analytics Virtual Center, the &#8220;General of the Imagination Age,&#8221; and <a href="http://dancinginkproductions.com/" target="_blank">Dancing Ink Productions</a> &#8211; and Joshua Fouts, @josholalia, &#8220;Cultural Attach&#233;&#8221; and Chief Global Strategist of Dancing Ink, were on a 5 day trek out of #ashcloud, and, sadly, not there for our panel.</p>
<p>But Twitter, once again, was a lifeline in a time of crisis, connecting them to <a href="TEDxVolcano">TEDxVolcano</a>, an impromptu unconference with must-see presentations from Rita and others &#8211; see <a href="http://www.theimaginationage.net/" target="_blank">Rita&#8217;s blog for more</a>.</p>
<p>So the two of us carried the flag for Augmented Twitter: myself and Jerry Paffendorf, futurist, artist, entrepreneur and swell guy &#8211; the co-inventor of the most famous real time social web system you have never heard of (actually I tried and loved it in alpha testing, before it was, quote, &#8220;shut down by blood thirsty investors&#8221;).</p>
<p>Now Jerry lives in Detroit, Michigan, where he works on the <a href="http://makeloveland.com/" target="_blank">Loveland micro-real estate project</a>, the simplest, cheapest, funnest way to become a land owner. At a dollar a square inch it mixes video games and real estate &#8211; like Farmville for urban development.</p>
<p>Joshua and Rita, our very virtual panel mates, are the first and largest inchvestors, and are creating their own micro city within the project. Jerry is one of the most creative and original thinkers on the planet, so treat yourself to a glimpse of what is on his mind in the video above &#8211; <a href="http://makeloveland.com/" target="_blank">Loveland</a>, <a href="http://www.3dmailbox.com/" target="_blank">3D mailbox</a>, canned augmented reality, and the relationship of virtual worlds to the real time social web.</p>
<p>Jerry also hat tipped one of the most captivating projects and presentations of the conference, Alon Nir&#8217;s <a href="http://www.ippio.com/view_video.php?viewkey=510442f2fd40f2100b05"><strong>The Story Behind @TheKotel</strong></a>, &#8220;Tweet Yr Prayers!&#8221; What a great story about the power of Twitter to reach out into the world, and beyond! I got a chance to chat with Alon at #140conf, and I found out he is the brother of augmented reality guru Rouli Nir, @augmented. Rouli is known for his sharp and comprehensive AR commentary on <a href="http://artimes.rouli.net/" target="_blank">Augmented Times</a> and <a href="http://gamesalfresco.com/2010/04/22/the-future-of-ar-browser/" target="_blank">Games Alfresco</a>. Cool family!</p>
<p>Before I close this post, I want to mention @AndyDixn&#8217;s talk on the prison system, <a href="http://www.ippio.com/view_video.php?viewkey=7bc562a711ef96884a38"><strong>A conversation with Andy Dixon: What the prison yard &amp; twitter have in common</strong></a>. This conversation, I think, is a great example of what makes #140conf special. As @nwjerseyliz pointed out, we &#8220;hear few voices from those who&#8217;ve experienced that side of the issue.&#8221;</p>
<p>Thank you @jeffpulver for creating such a cool staging for so many diverse voices.</p>
<p>And before I close, here is what the only slide I didn&#8217;t have time to show said!</p>
<h3><strong>If you liked &#8220;Augmented Twitter&#8221;<br />
Don&#8217;t miss Augmented Reality Event! </strong></h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/04/are234x60augmented_w.png"><img class="alignnone size-full wp-image-5424" title="are234x60augmented_w" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/04/are234x60augmented_w.png" alt="are234x60augmented_w" width="234" height="60" /></a></p>
<p><strong>2 days, 3 tracks, 40 AR companies, 76 Speakers. Art! Magic! Competitions! Awards! Bruce (the Prophet) Sterling, Will (The Sims) Wright, Jesse (Gamepocalypse) Schell, Blaise Aguera y Arcas (Microsoft Bing) and You! </strong><strong>The <a href="http://augmentedrealityevent.com/2010/04/10/sneak-preview-of-are-2010-schedule-packed-with-augmented-reality-goodness/">sneak preview of the schedule is here</a>.</strong><br />
<strong>Register today at <a href="http://augmentedrealityevent.com/" target="_blank">Augmented Reality Event.com</a></strong></p>
<p><strong>Discount code for @140 attendees (and readers of this post!): <a href="https://register03.exgenex.com/GcmRegister/Index.Aspx?C=70000088&amp;M=50000500" target="_blank">TISH245</a> activates the $245 price for the full conference.</strong></p>
<p><strong>See you there!</strong></p>
]]></content:encoded>
			<wfw:commentRss>http://www.ugotrade.com/2010/04/23/augmented-twitter-at-jeff-pulvers-140conf/feed/</wfw:commentRss>
		<slash:comments>2</slash:comments>
		</item>
		<item>
		<title>Visual Search, Augmented Reality and a Social Commons for the Physical World Platform: Interview with Anselm Hook</title>
		<link>http://www.ugotrade.com/2010/01/17/visual-search-augmented-reality-and-a-social-commons-for-the-physical-world-platform-interview-with-anselm-hook/</link>
		<comments>http://www.ugotrade.com/2010/01/17/visual-search-augmented-reality-and-a-social-commons-for-the-physical-world-platform-interview-with-anselm-hook/#comments</comments>
		<pubDate>Sun, 17 Jan 2010 17:05:01 +0000</pubDate>
		<dc:creator><![CDATA[Tish Shute]]></dc:creator>
				<category><![CDATA[Ambient Devices]]></category>
		<category><![CDATA[Ambient Displays]]></category>
		<category><![CDATA[architecture of participation]]></category>
		<category><![CDATA[Artificial general Intelligence]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Augmented Reality]]></category>
		<category><![CDATA[culture of participation]]></category>
		<category><![CDATA[digital public space]]></category>
		<category><![CDATA[Instrumenting the World]]></category>
		<category><![CDATA[internet of things]]></category>
		<category><![CDATA[mirror worlds]]></category>
		<category><![CDATA[Mixed Reality]]></category>
		<category><![CDATA[mobile augmented reality]]></category>
		<category><![CDATA[mobile meets social]]></category>
		<category><![CDATA[Mobile Reality]]></category>
		<category><![CDATA[new urbanism]]></category>
		<category><![CDATA[online privacy]]></category>
		<category><![CDATA[Participatory Culture]]></category>
		<category><![CDATA[privacy and online identity]]></category>
		<category><![CDATA[social gaming]]></category>
		<category><![CDATA[social media]]></category>
		<category><![CDATA[sustainable living]]></category>
		<category><![CDATA[sustainable mobility]]></category>
		<category><![CDATA[ubiquitous computing]]></category>
		<category><![CDATA[virtual communities]]></category>
		<category><![CDATA[Web Meets World]]></category>
		<category><![CDATA[websquared]]></category>
		<category><![CDATA[World 2.0]]></category>
		<category><![CDATA[Anselm Hook]]></category>
		<category><![CDATA[AR Commons]]></category>
		<category><![CDATA[AR Consortium]]></category>
		<category><![CDATA[AR Wave]]></category>
		<category><![CDATA[ardevcamp]]></category>
		<category><![CDATA[are2010]]></category>
		<category><![CDATA[ARNY Meetup]]></category>
		<category><![CDATA[ARWave]]></category>
		<category><![CDATA[ARWave Wiki]]></category>
		<category><![CDATA[augmented reality conference]]></category>
		<category><![CDATA[augmented reality event]]></category>
		<category><![CDATA[augmented reality goggles]]></category>
		<category><![CDATA[augmented reality social commons]]></category>
		<category><![CDATA[brightkite]]></category>
		<category><![CDATA[Bruce Sterling]]></category>
		<category><![CDATA[Davide Carnivale]]></category>
		<category><![CDATA[distributed AR]]></category>
		<category><![CDATA[distributed augmented reality]]></category>
		<category><![CDATA[federated search]]></category>
		<category><![CDATA[FourSquare]]></category>
		<category><![CDATA[Games Alfresco]]></category>
		<category><![CDATA[google goggles]]></category>
		<category><![CDATA[Google Wave]]></category>
		<category><![CDATA[gowalla]]></category>
		<category><![CDATA[graffitigeo]]></category>
		<category><![CDATA[hacking maps]]></category>
		<category><![CDATA[Head Map manifesto]]></category>
		<category><![CDATA[imageDNS]]></category>
		<category><![CDATA[imagemarks]]></category>
		<category><![CDATA[imagewiki]]></category>
		<category><![CDATA[location based services]]></category>
		<category><![CDATA[Map Kibera]]></category>
		<category><![CDATA[Mikel Maron]]></category>
		<category><![CDATA[mobile internet]]></category>
		<category><![CDATA[mobile social]]></category>
		<category><![CDATA[mobile social interaction utility]]></category>
		<category><![CDATA[Muku]]></category>
		<category><![CDATA[neo-viridian]]></category>
		<category><![CDATA[Nokia's ImageSpace]]></category>
		<category><![CDATA[Ogmento]]></category>
		<category><![CDATA[open distributed AR]]></category>
		<category><![CDATA[OpenGeo]]></category>
		<category><![CDATA[paige saez]]></category>
		<category><![CDATA[photo-based positioning systems]]></category>
		<category><![CDATA[physical world platform]]></category>
		<category><![CDATA[placemarks]]></category>
		<category><![CDATA[Planetwork]]></category>
		<category><![CDATA[Platial]]></category>
		<category><![CDATA[point and find]]></category>
		<category><![CDATA[proximity based social networks]]></category>
		<category><![CDATA[snaptell]]></category>
		<category><![CDATA[social cartography]]></category>
		<category><![CDATA[social commons]]></category>
		<category><![CDATA[social search]]></category>
		<category><![CDATA[SpinnyGlobe]]></category>
		<category><![CDATA[Thomas Wrobel]]></category>
		<category><![CDATA[Tonchidot]]></category>
		<category><![CDATA[trust filters]]></category>
		<category><![CDATA[Viridian]]></category>
		<category><![CDATA[viridiandesign]]></category>
		<category><![CDATA[visual search]]></category>
		<category><![CDATA[Wave]]></category>
		<category><![CDATA[Wave Federation Protocol]]></category>
		<category><![CDATA[WhereCamp]]></category>
		<category><![CDATA[whurley]]></category>
		<category><![CDATA[yelp]]></category>

		<guid isPermaLink="false">http://www.ugotrade.com/?p=5050</guid>
		<description><![CDATA[Visual search is heating up, and with it a key stage of turning the physical world into a platform is underway as images become hyperlinks to the world in applications like Google Goggles, Point and Find, and SnapTell &#8211; see this post by Katie Boehret. And while there may be no truly game changing augmented [&#8230;]]]></description>
				<content:encoded><![CDATA[<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/01/anselmhook.jpg"><img class="alignnone size-medium wp-image-5051" title="anselmhook" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/01/anselmhook-300x225.jpg" alt="anselmhook" width="300" height="225" /></a></p>
<p>Visual search is heating up, and with it a key stage of turning the physical world into a platform is underway as images become hyperlinks to the world in applications like <a href="http://www.google.com/mobile/goggles/#dc=gh0gg" target="_blank">Google Goggles</a>, <a href="http://pointandfind.nokia.com/" target="_blank">Point and Find</a>, and <a href="http://www.snaptell.com/" target="_blank">SnapTell</a> &#8211; <a href="http://solution.allthingsd.com/20100112/in-search-of-images-worth-1000-results/" target="_blank">see this post by Katie Boehret</a>. And while there may be no truly game changing augmented reality goggles for a while, make no mistake: key aspects of our augmented view &#8211; factors that will have a lot to do with what we will actually see when an augmented vision of the world is commonplace &#8211; are already in the works. And, as Anselm Hook (pic above <a href="http://www.flickr.com/photos/caseorganic/2994952828/" target="_blank">from @caseorganic&#8217;s flickr</a>) notes:</p>
<p><strong>&#8220;There is a real risk of our augmented reality world being owned by interests which are not our own. There is a real question of when you hold up that AR goggle, what are you going to see?&#8221;</strong></p>
<p>Cooperating services &#8211; e.g., Google Earth, Maps, Streetview, Google Goggles, and leaders in local search like Yelp (<a href="http://www.huffingtonpost.com/ramon-nuez/google-is-getting-ready-f_b_426493.html" target="_blank">see here</a>) &#8211; would have an enormous ability to filter and control a mobile, social, context aware view of the physical world, and Google themselves see an ethical quandary.</p>
<p><strong>&#8220;A Google spokesperson says this app has the ability to use facial recognition with Goggles, but hasn&#8217;t launched this feature because it hasn&#8217;t been built into an app that would provide real value for users. The spokesperson also cites &#8216;some important transparency and consumer-choice issues we need to think through&#8217;&#8221; (quote from Wall Street Journal column </strong><a href="http://solution.allthingsd.com/20100112/in-search-of-images-worth-1000-results/" target="_blank">by Katie Boehret)</a>.</p>
<p><a href="http://www.hook.org/" target="_blank">Anselm Hook</a> and <a href="http://paigesaez.org/" target="_blank">Paige Saez</a> have, with great prescience, been advocating a social commons for the placemarks and imagemarks of our physical world platform through a number of pioneering projects, including <a href="http://imagewiki.org/" target="_blank">imagewiki</a>. I have recently interviewed both Anselm and Paige (upcoming) in depth. My talk with Anselm was nearly three hours long! So I am publishing the transcript in two parts.</p>
<p>Understanding what it means to have a social commons for our physical world platform, and for augmented reality, is a key question for all of us to think about, but especially important for those of us involved in the emerging augmented reality industry.</p>
<p>Anselm <a href="http://blog.makerlab.org/2009/11/augmentia-redux/">notes</a>:</p>
<p><strong>&#8220;The placemarks and imagemarks in our reality are about to undergo that same politicization and ownership that already affects DNS and content. Creative Commons, Electronic Frontier Foundation and other organizations try to protect our social commons. When an image becomes a kind of hyperlink &#8211; there&#8217;s really a question of what it will resolve to. Will your heads up display of McDonalds show tasty treats at low prices or will it show alternative nearby places where you can get a local, organic, healthy meal quickly? Clearly there&#8217;s about to be a huge ownership battle for the emerging imageDNS&#8221;</strong></p>
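To see why ownership of the &#8220;imageDNS&#8221; matters, here is a toy sketch of the resolution step Anselm describes. All names are hypothetical, and an exact cryptographic hash stands in for the perceptual fingerprint a real system would need; the point is simply that whoever controls the registry table controls what an image resolves to:

```python
import hashlib
from typing import Optional

# Toy "imageDNS": a registry mapping an image fingerprint to the URL it
# resolves to. A real system would use a perceptual hash robust to
# viewpoint and lighting; SHA-256 of the raw bytes stands in here.
registry: dict = {}

def fingerprint(image_bytes: bytes) -> str:
    return hashlib.sha256(image_bytes).hexdigest()

def register(image_bytes: bytes, url: str) -> None:
    registry[fingerprint(image_bytes)] = url

def resolve(image_bytes: bytes) -> Optional[str]:
    # Whoever controls this lookup decides what the billboard "links" to.
    return registry.get(fingerprint(image_bytes))

billboard = b"...billboard pixels..."
register(billboard, "http://example.org/local-organic-eats")
hit = resolve(billboard)
```

Swap the registry for one owned by a single commercial interest and the heads-up display shows whatever that interest registered &#8211; which is exactly the commons question being raised.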
<p>The mobile internet is moving beyond the internet-in-your-pocket phase of mobility, with mobile, social, proximity-based, context aware networks like <a href="http://www.foursquare.com/">FourSquare</a>, <a href="http://gowalla.com/" target="_blank">Gowalla</a>, <a href="http://brightkite.com/" target="_blank">Brightkite</a> and <a href="http://www.geograffiti.com/">GraffitiGeo</a> (see <a href="http://smartdatacollective.com/Home/23811">Smart Data Collective</a>) likely, soon, to start to take precedence over other forms of social network.</p>
<p>Regardless of the timeline for true augmented reality &#8211; 3D images &amp; graphics tightly registered to the physical world &#8211; proximity-based social networking and real time search are already taking us into a hyper-local mode and the realm of augmented reality, which is <strong>&#8220;inherently about who you are, where you are, what you are doing, and what is around you&#8221;</strong> (<a href="http://curiousraven.squarespace.com/" target="_blank">Robert Rice</a> &#8211; see <a href="http://www.ugotrade.com/2009/01/17/is-it-%E2%80%9Comg-finally%E2%80%9D-for-augmented-reality-interview-with-robert-rice/" target="_blank">here</a>). The ground is being prepared for augmented reality now.</p>
<p>If you have been reading Ugotrade, you will know I have been actively involved in developing an open, distributed AR platform/mobile social interaction utility for geolocated data based on the Wave Federation Protocol &#8211; AR Wave, a.k.a. Muku, &#8220;crest of a wave&#8221; (see my posts <a href="http://www.ugotrade.com/2009/11/19/the-next-wave-of-ar-mobile-social-interaction-right-here-right-now/" target="_blank">here</a>, <a href="http://www.ugotrade.com/2009/12/04/ar-wave-project-an-introduction-and-faq-by-thomas-wrobel/" target="_blank">here</a> and <a href="http://www.ugotrade.com/2009/10/13/ar-wave-layers-and-channels-of-social-augmented-experiences/" target="_blank">here</a> for more on this project, and the <a href="http://arwave.wiki.zoho.com/HomePage.html" target="_blank">AR Wave Wiki</a> here). Federation is, I believe, one vital aspect of developing a social commons for augmented reality and the physical world platform.</p>
<p>Also, a bit of news: I am co-chairing the upcoming <a title="Augmented Reality Event (are2010) Opens Call For Speakers" href="http://augmentedrealityevent.com/2010/01/17/augmented-reality-event-2010-opens-call-for-speakers/">Augmented Reality Event (are2010)</a> with <a href="http://gamesalfresco.com/about/" target="_blank">Ori Inbar</a> of <a href="http://gamesalfresco.com/" target="_blank">Games Alfresco</a> and <a href="http://ogmento.com/" target="_blank">Ogmento</a>, and <a href="http://whurley.com/" target="_blank">whurley</a>. Sean Lowery, <a href="http://www.innotechconference.com/pdx/Details/other.php" target="_blank">Prospera</a>, is the event organizer, and <a title="Augmented Reality Event (are2010) Opens Call For Speakers" href="http://augmentedrealityevent.com/2010/01/17/augmented-reality-event-2010-opens-call-for-speakers/">are2010</a> has the support of the <a href="http://www.arconsortium.org/" target="_blank">AR Consortium</a>. The <a title="Augmented Reality Event (are2010) Opens Call For Speakers" href="http://augmentedrealityevent.com/2010/01/17/augmented-reality-event-2010-opens-call-for-speakers/">are2010</a> web site is live and there is an <a title="Augmented Reality Event (are2010) Opens Call For Speakers" href="http://augmentedrealityevent.com/2010/01/17/augmented-reality-event-2010-opens-call-for-speakers/">Open Call For Speakers</a>. You can submit your proposals and demos for one of the three tracks &#8211; business, technology, or production &#8211; <a href="http://augmentedrealityevent.com/speakers/call-for-proposals/" target="_blank">on the web site here</a>.</p>
<p><a href="http://augmentedrealityevent.com/" target="_blank"><img class="alignnone size-medium wp-image-5101" title="are2010" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/01/are20101-300x60.png" alt="are2010" width="300" height="60" /></a></p>
<p><a href="http://www.wired.com/beyond_the_beyond/" target="_blank">Bruce Sterling</a>, &#8220;prophet&#8221; of augmented reality and more, &#8220;will deliver the most anticipated <a href="http://augmentedrealityevent.com/speakers/" target="_blank">Augmented Reality keynote</a> of the year.&#8221;</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/01/bruces-brasspost.jpg"><img class="alignnone size-medium wp-image-5105" title="bruces-brasspost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/01/bruces-brasspost-300x225.jpg" alt="bruces-brasspost" width="300" height="225" /></a></p>
<p>It didn&#8217;t surprise me when Anselm mentioned that Bruce Sterling was a key influence for his work on the geospatial web and augmented reality. Anselm explained:</p>
<p><strong>&#8220;I&#8217;d seen <a href="http://www.viridiandesign.org/notes/151-175/00155_planetwork_speech.html" target="_blank">a talk by Bruce Sterling</a> at an event called Planetwork [May, 2000]. And that event was, for me, a turning point where I decided to focus full time on exactly what I cared about instead of doing things that were kind of similar to what I cared about.</strong> <strong>So, his influence is a pretty significant one to me at that exact moment.&#8221;</strong></p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/01/dhj5mk2g_490gcp7q6fn_b.png"><img title="dhj5mk2g_490gcp7q6fn_b" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/01/dhj5mk2g_490gcp7q6fn_b-300x80.png" alt="dhj5mk2g_490gcp7q6fn_b" width="300" height="80" /></a></p>
<p>For more see <a id="q2or" title="viridiandesign.org" href="http://www.viridiandesign.org/About.htm">viridiandesign.org</a> &#8211; it seems it is time for a &#8220;Neo-Viridian&#8221; revival!</p>
<p>This <a href="http://www.wired.com/beyond_the_beyond/2009/05/spime-watch-pachube-feeds/" target="_blank">post by Bruce Sterling on Pachube Feeds</a>, and Thomas Wrobel&#8217;s <a href="http://www.ugotrade.com/2009/08/19/everything-everywhere-thomas-wrobels-proposal-for-an-open-augmented-reality-network/" target="_blank">prototype design for open distributed augmented reality on IRC</a>, were key inspirations for me when I began thinking about the potential of the Google Wave Federation protocol for augmented reality. I had been exploring <a href="http://www.pachube.com/" target="_blank">Pachube</a> and was deeply interested in <a href="http://www.ugotrade.com/2009/01/28/pachube-patching-the-planet-interview-with-usman-haque/" target="_blank">the vision of Usman Haque</a>, but I had a real <a href="http://www.ugotrade.com/2009/06/02/location-becomes-oxygen-at-where-20-wherecamp/" target="_blank">aha moment</a> when I read this:</p>
<p><strong>&#8220;(((Extra credit for eager ubicomp hackers: combine this [pachube feeds] with Googlewave, then describe it in microsyntax. Hello, 2015!)))&#8221;</strong></p>
<p>I think the AR Wave group will earn the extra credit and more very soon! <a href="http://need2revolt.wordpress.com/about/" target="_blank">Davide Carnovale, need2revolt</a>, and <a href="http://www.lostagain.nl/" target="_blank">Thomas Wrobel</a> have been leading the coding charge, and there will be a very early AR Wave demo soon, perhaps as soon as the <a href="http://www.meetup.com/arny-Augmented-Reality-New-York/" target="_blank">Feb 16th ARNY Meetup</a>.</p>
<p>Open access to the creation of the view that will eventually find its way into AR goggles will depend on more than the power of an open distributed platform for collaboration like the AR Wave project. Our augmented reality view will be constructed through complex &#8220;hybrid tracking and sensor fusion techniques&#8221; (Jarrell Pair), cooperating cloud data services, powerful search and computer vision algorithms, and apps that learn by context accumulation; at the moment, these kinds of resources, at least at scale, are for the most part in private hands.</p>
<p>In the interview below, Anselm discusses how trust filters, <span id="zuat" title="Click to view full content">being able to publicly permission your searches so that other people can respond and reach out to you, and the democratization of data in general, are even more of a concern </span>with augmented reality and hyper-local search<span id="zuat" title="Click to view full content">.</span> The task of understanding what it means to have a social commons for the outernet remains an open, and pressing, question.</p>
<p>Anselm explains (see full interview below):</p>
<p><strong><span id="e18n" title="Click to view full content">&#8220;as we move towards a physical internet where there&#8217;s no clicking and there&#8217;s no interface and the computer&#8217;s just telling you what it thinks you&#8217;re looking at &#8211; translating, you know, an image of a billboard to the name of the rock star who&#8217;s on that billboard, or translating the list of ingredients on a can of soup to the source outlets where it thinks those ingredients came from. When you have that kind of automated mediation, the question of trust definitely arises.</span></strong></p>
<p><strong><span id="e18n" title="Click to view full content">And we haven&#8217;t seen the Clay Shirkys or the Larry Lessigs of the world start to talk about this yet. Although I suspect that in the next four or five years the zero-click interface will become the primary interface &#8211; that we&#8217;ll come to assume that what we see, with the extra enhanced data projected onto our view, is the truth. Yet, at the same time, there is just no structure or mechanism even being considered for a democratic ownership of it.&#8221;</span></strong></p>
<h3>Augmented Reality will emerge through sensor fusion techniques &amp; cooperating cloud services</h3>
<p>In 2010, sensor fusion techniques and computer vision technology, in conjunction with GPS and compass data, will create the data linking that can enable the kind of augmented reality that has been the stuff of imagination for nearly four decades (see <a href="http://laboratory4.com/2010/01/the-reality-of-augmented-reality/" target="_blank">Jarrell Pair&#8217;s post</a>).</p>
<p>Putting stuff in the world in 3D is of course key to the original vision of augmented reality, and one of its biggest challenges. Augmented reality is going to be implicated in a real-time mapping of the world at an unprecedented scale and granularity. We have barely an inkling of the implications of this now.</p>
<p>Anselm and Paige have been working in the heart of the social cartography movement for nearly a decade. The vision and experience of this community is vital to understanding how augmented reality and the world as a physical platform can evolve into something that benefits people and allows them &#8220;to have a better understanding of the opportunities around them.&#8221;</p>
<p>We have been hacking maps for millennia &#8211; &#8220;from conceptual story mapping, to colloquial mapping in European development and the cartographic renaissance created by the global voyages and rediscovery of Ptolemy&#8217;s maps&#8221; (<a href="http://highearthorbit.com/" target="_blank">Andrew Turner</a>). And, recently, initiatives on a public-provided GIS, like <a href="http://opengeo.org/" target="_blank">OpenGeo</a>, have led the way toward more open, interoperable geospatial data.</p>
<p>Mapping takes on a new and crucial role in augmented reality. <a href="http://www.slashgear.com/nokia-image-space-adds-augmented-reality-for-s60-3067185/" target="_blank">Nokia&#8217;s ImageSpace</a> is beginning to do what many thought Microsoft would do with Photosynth two years ago.</p>
<p>And, if we see these kinds of projects developed into &#8220;photo-based positioning systems&#8221; &#8211; &#8220;3d models of the environment to cover every possible angle, and then software that can work out in reverse, based on a picture, precisely where you are and where you&#8217;re facing&#8221; (Thomas Wrobel) &#8211; we would see augmented reality leap forward overnight.</p>
<p>It is time to take very seriously the vast opportunities and potential pitfalls of an augmented world.</p>
<p><strong><span id="vix9" title="Click to view full content">&#8220;when you are mediating the translation layer between the image and the data, then there is an opportunity for you to control it, and that opportunity is hard to resist. It is hard to choose not to own that opportunity. It is an advertising opportunity. It is a revenue opportunity. It is a chance to send a message and a tone.</span></strong></p>
<p><strong><span id="vix9" title="Click to view full content">I know that Google and companies like that are keenly aware of the kinds of roles they don&#8217;t want to hold, but it is sometimes seductive to think about them. And I am afraid that we, as a community, need to assert an ownership, kind of a commons, over how computers will translate what they see to information that we perceive.&#8221;</span></strong></p>
<p>There are some initiatives emerging. <a href="http://www.tonchidot.com/" target="_blank">Tonchidot</a> (who <a href="http://www.techcrunch.com/2009/12/08/tonchidot-sekai-camera-funding/" target="_blank">closed on $4 million of VC for augmented reality </a>last December) has helped create the <a href="http://translate.google.com/translate?client=tmpg&amp;hl=en&amp;u=http%3A%2F%2Fwww.arcommons.org%2F&amp;langpair=ja%7Cen" target="_blank">AR Commons</a> in Japan. <a href="http://www.tonchidot.com/corporate-profile.html" target="_blank">CFO of Tonchidot</a> <a href="http://www.linkedin.com/ppl/webprofile?action=vmi&amp;id=499984&amp;pvs=pp&amp;authToken=r8TF&amp;authType=name&amp;trk=ppro_viewmore&amp;lnk=vw_pprofile" target="_blank">Ken Inoue</a> explained in <a href="http://www.ugotrade.com/2009/09/17/tonchidot-taking-augmented-reality-beyond-lab-science-with-fearless-creativity-and-business-savvy/" target="_blank">an interview with me in September 2009</a>:</p>
<p>&#8220;<strong>We feel that public data, such as landmarks, government facilities, and public transport should be shared. We see an AR world where people can readily and easily access information by just seeing &#8211; quick, easy, and efficient. And because of this ease and intuitiveness, children, the elderly, and the handicapped will surely benefit. AR could help create a safer society. Warnings, alerts, and safety information could save lives and avoid disasters. These are what we, and <a href="http://translate.google.com/translate?client=tmpg&amp;hl=en&amp;u=http%3A%2F%2Fwww.arcommons.org%2F&amp;langpair=ja%7Cen" target="_blank">AR Commons</a>, would like to tackle in the not so distant future.&#8221;</strong></p>
<p>But the task of building a social commons for the physical world platform has only just begun.</p>
<h3>Interview with Anselm Hook</h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/01/anselm31.jpg"><img class="alignnone size-medium wp-image-5085" title="anselm3" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/01/anselm31-300x225.jpg" alt="anselm3" width="300" height="225" /></a></p>
<p><em>photo from <a href="http://www.flickr.com/photos/anselmhook/3832691280/in/set-72157621946362509/" target="_blank">Anselm&#8217;s Flickr stream here</a></em></p>
<p><span id="u2mq" title="Click to view full content"><strong>Tish Shute:</strong> We <a href="http://www.ugotrade.com/2009/06/02/location-becomes-oxygen-at-where-20-wherecamp/" target="_blank">first met last year </a></span><span id="zjlm" title="Click to view full content"><a href="http://www.ugotrade.com/2009/06/02/location-becomes-oxygen-at-where-20-wherecamp/" target="_blank">at Wherecamp</a>. </span><span id="suh4" title="Click to view full content">The start of 2009 was, I think,</span><span id="e_r5" title="Click to view full content"> the &#8220;OMG finally&#8221; moment for augmented reality, and</span><span id="wo16" title="Click to view full content"> in less than a year AR, at least in proto forms, is breaking into the mainstream! You are one of the founding visionaries/philosophers/hackers of the geo web, and you have been thinking about the geo web and AR for a long time &#8211; <a href="http://hook.org/headmap" target="_blank">all the way back to the legendary Headmap Manifesto</a>, and before. Most recently you led the way in the very successful <a href="http://www.ardevcamp.org/wiki/index.php?title=Main_Page" target="_blank">ARDevCamp</a> in Mountain View. </span><span id="kn-y" title="Click to view full content">Could you start by telling me a little bit about the history of your pioneering work with geolocated data?</span></p>
<p><strong>Anselm Hook: </strong>I am a long-time geo fanatic. I&#8217;m really interested in social cartography and what some people call public-provided GIS &#8211; that&#8217;s some language that people use. Anyway, my personal interest, when I talk to people who are non-technical (and it&#8217;s been a long-term interest, the way I phrase it), is that I want to help people see through walls. So, the goal is very simple. I want people to have a better understanding of the opportunities around them, the landscape around them. I always get frustrated when people make bad decisions because of a lack of information, especially when it&#8217;s related to their community and their environment. But, plainly put, I really just want &#8220;to help people see through walls&#8221;. It&#8217;s a very simple goal.</p>
<p><strong>Tish Shute:</strong> I know you worked on <a href="http://platial.com/" target="_blank">Platial</a>, which is really one of my favorite social mapping applications. It really broke new ground. What was the history of that? How did you get involved with Platial?</p>
<p><strong>Anselm Hook:</strong> That&#8217;s an interesting question. It actually started around 2000 when I saw Bruce Sterling talk. I had been writing video games for many years, and I was quite good at it, and I enjoyed it. But the reasons I was doing it diverged from why the industry was doing it. I was making video games because I like to make shared spaces for my friends to play in and to share experience. I really enjoyed making shared environments. I worked on <a id="jrn-" title="BBS's" href="http://en.wikipedia.org/wiki/Bulletin_board_system">BBS&#8217;s</a>, and my friends and I were always making these collaborative shared environments.</p>
<p>Once the video game industry started to take off, I started to do high-performance, 3D interactive video games and to make compelling shared spaces, and it was a lot of fun. But the frustration for me was that a huge industry grew around it and it became very commercial. Although it paid well, it started to diverge from my values, which were centered more around community environments and shared understanding.</p>
<p><strong>Tish Shute:</strong> Yes, very rapidly the big games kind of devolved away from the social aspects and became more and more single-player really, didn&#8217;t they?</p>
<p><strong>Anselm Hook:</strong> It was that way, actually, because even though you were often in a many-player world, you weren&#8217;t collaborating; everyone else became just a target. I liked the idea of deep collaboration that recalls the kind of playful space you see in IRC, or in the real world, where people are solving real-world problems.</p>
<p>And I grew up in the Rockies, and I always had a lot of access to the outdoors. So, I saw shared spaces and collaboration as a way to protect our environment. [To step back] I think people use different metrics <span id="gozb" title="Click to view full content">for measuring their choices in the world, and many people have a value system centered around minimization of harm: making sure that people are not hurt. But my value system is different. I personally believe that protecting the planet is more important: to maximize biodiversity. I feel like protecting the people around me comes from protecting the ecosystems they live in.</span></p>
<p><strong>Tish Shute:</strong> That&#8217;s interesting, isn&#8217;t it, because the history of Keyhole was really that, wasn&#8217;t it? Keyhole later became Google Earth, but it began out of a project to look at what was going on in the ecosystem over Africa at that time, didn&#8217;t it?<br />
<strong><br />
Anselm Hook:</strong> Yes, in fact many people&#8217;s projects stem from an environmental concern. <a id="zxy9" title="Mikel Maron&#8217;s" href="http://brainoff.com/weblog/">Mikel Maron&#8217;s</a> work, for example &#8211; he&#8217;s doing <a id="euvm" title="Map Kibera" href="http://mapkibera.org/">Map Kibera</a>, and he also worked on OpenStreetMap.</p>
<p><strong>Tish Shute:</strong> Map Kibera &#8211; that is the new project?</p>
<p><strong>Anselm Hook:</strong> Oh, yes, his project is called <a id="r7ie" title="Map Kibera" href="http://mapkibera.org/">Map Kibera</a>. He&#8217;s mapping Kibera, a community in Nairobi.<br />
[For more see <a id="ngn." title="Map Kibera's YouTube Channel" href="http://www.youtube.com/user/mapkibera">Map Kibera&#8217;s YouTube Channel</a> &#8211; <a id="amqx" title="photo below" href="http://www.flickr.com/photos/junipermarie/4098163856/" target="_blank">photo below</a> from <a href="http://www.flickr.com/photos/junipermarie/">ricajimarie</a>]</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/01/dhj5mk2g_487qfcv76ft_b.jpg"><img class="alignnone size-medium wp-image-5052" title="dhj5mk2g_487qfcv76ft_b" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/01/dhj5mk2g_487qfcv76ft_b-300x199.jpg" alt="dhj5mk2g_487qfcv76ft_b" width="300" height="199" /></a></p>
<p><strong>Tish Shute:</strong> Right, great!</p>
<p><strong>Anselm Hook:</strong> When I started to look at GIS and mapping I started to meet people who had a very similar background. What happened to me is I kind of stepped away from games around the year 2000. I&#8217;d seen a talk by Bruce Sterling at an event called <a id="e8dn" title="PlaNetwork" href="http://www.conferencerecording.com/newevents/pla20.htm">PlaNetwork</a>. And that event was, for me, a turning point where I decided to focus full time on exactly what I cared about instead of doing things that were kind of similar to what I cared about. So, his influence was a pretty significant one for me at that exact moment.</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/01/dhj5mk2g_490gcp7q6fn_b.png"><img class="alignnone size-medium wp-image-5053" title="dhj5mk2g_490gcp7q6fn_b" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/01/dhj5mk2g_490gcp7q6fn_b-300x80.png" alt="dhj5mk2g_490gcp7q6fn_b" width="300" height="80" /></a></p>
<p>[For more see <a id="q2or" title="viridiandesign.org" href="http://www.viridiandesign.org/About.htm">viridiandesign.org</a> &#8211; it seems that it is time for a &#8220;Neo-Viridian&#8221; revival.]</p>
<p><strong>Tish Shute:</strong> It&#8217;s interesting because now your paths are crossing again with augmented reality. You are on the same wavelength again.</p>
<p><strong>Anselm Hook:</strong> It&#8217;s funny, actually; I&#8217;ve had a couple of brief overlaps in that way. Well, so in 2000 I<span id="mdsf" title="Click to view full content"> went to see this talk and I did a small project called &#8212; well, I called it <a id="bx3u" title="SpinnyGlobe" href="http://github.com/anselm/SpinnyGlobe">SpinnyGlobe</a>. What I did is I mapped protests from a number of websites onto a globe to show the level of community opposition to the pending war in Iraq. It was the first time there had been a protest before a war. So, it was very interesting to me. [See <a href="http://hook.org/headmap" target="_blank">http://hook.org/headmap</a>]<br />
<strong><br />
Tish Shute:</strong> That&#8217;s really fascinating. Do you have any pictures of that you could send me? </span></p>
<p><span id="r0h_" title="Click to view full content"><a href="http://www.flickr.com/photos/anselmhook/1747152617/sizes/m/in/set-72157602696188420/" target="_blank"><img class="alignnone size-medium wp-image-5054" title="dhj5mk2g_492ffct2df4_b" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/01/dhj5mk2g_492ffct2df4_b-300x225.jpg" alt="dhj5mk2g_492ffct2df4_b" width="300" height="225" /></a></span></p>
<p><span id="mdsf" title="Click to view full content">photo from <a id="j05v" title="anselm's flickrstream" href="http://www.flickr.com/photos/anselmhook/1747152617/sizes/m/in/set-72157602696188420/">anselm&#8217;s flickrstream</a></span></p>
<p><strong>Tish Shute:</strong> Yes, I&#8217;ll definitely look <a id="ua2l" title="SpinnyGlobe" href="http://github.com/anselm/SpinnyGlobe">SpinnyGlobe</a><span id="m0:j" title="Click to view full content"> up. It sounds very interesting. One of the aspects of your work on geo-located data projects like this and <a id="h.gx" title="Platial" href="http://platial.com/">Platial</a> is that you really started to develop this idea of a culture of place, about how people make place. This was the wake-up call to me regarding the power of networks combined with geo-data. </span></p>
<p><span id="m0:j" title="Click to view full content">We are hoping to extend this idea into augmented reality with an open distributed platform for AR so that we can collaboratively map our worlds from the perspective of who we are, where we are, and what we are doing. I know you&#8217;ve just done some work recently in augmented reality. I know you put the code up already. </span></p>
<p><span id="m0:j" title="Click to view full content">By the way, I love the way you take your philosophy into the way you make code &#8211; the practice of making some code, trying some things out, making it all public, and publishing your findings, you know, your comments on that experience. Perhaps you could recap how you picked up recently on the state of play with augmented reality, what aspects you looked at, and what came out of that experience?</span></p>
<p><strong>Anselm Hook:</strong> So, it&#8217;s a very simple trajectory. Coming out of the work I had done on <a id="cs18" title="Platial" href="http://platial.com/">Platial</a>, among other projects, I started to look at the hyper-local, and I suddenly realized that even those services weren&#8217;t really speaking to living, and to how to really see and solve local problems. What was missing was a sense of context.</p>
<p>The map doesn&#8217;t know how you&#8217;re feeling, it doesn&#8217;t know if you&#8217;re in a hurry, it doesn&#8217;t know what you want; it&#8217;s very static. Even the web maps are very static. And I started to recognize augmented reality as a combination of &#8212; well &#8212; it&#8217;s probably a collision of many forces, many forces that we&#8217;re all a part of. We&#8217;ve also all started to realize that the real-time web is really important; it&#8217;s part of<span id="bja1" title="Click to view full content"> what AR is about.</span></p>
<p>We have all started to realize that the context is important. You know, your personal disposition, your needs, if you want to be interrupted or not. That is the kind of thing that the ubiquitous computing crowd has talked about. We started to recognize that there are sensors everywhere, and the ambient sensing communities talked about that. So what is funny for me about augmented reality is I started realizing it is just a collision of many other trends into something bigger.</p>
<p>Everything else we thought was a separate thing is actually just part of this thing. Even things like Google Maps or the mapping systems we think are so great are really just kind of an aspect of a hyper-local view. You actually don&#8217;t really care what is happening 10 blocks away or 100 blocks away. If you could satisfy those same interests and needs within a single block, one block away, you would probably be really happy. You really just want to satisfy needs and interests, find ways to contribute, or get yourself fed, or whatever it is you want. And AR seemed to be the playground to really explore the human condition.</p>
<p><strong>Tish Shute:</strong> Anyway, I think one of the amazing things this year is that we now have good mediating devices that, for the first time, give us compasses, GPS, and accelerometers. But one of the missing pieces with AR at the moment is [tracking, mapping, and registration] &#8211; the kind of thing colloquial mappings of the world could be of great help with.</p>
<p>We have seen mapping coming out of the Flickr data &#8211; e.g., the University of Washington put maps together from geo-tagged Flickr photos. Now if we could have that linked up with AR, then we would have the kind of mapping we need to really hook the geo-data onto the world in a way that goes beyond&#8230;you know, what compass and GPS can really deliver is pretty minimal at the moment.</p>
<p><strong>Anselm Hook</strong>: There is a real risk of our augmented reality world being owned by interests which are not our own. There is a real question of when you hold up that AR goggle, what are you going to see? Are you going to see corporate advertising? Are you going to see your friends&#8217; comments or criticisms? Is it going to be an Iran or a democracy, right? It is unclear.</p>
<p><span id="vix9" title="Click to view full content">Right now there are some disturbing trends I have noticed. I am a big fan of Google Goggles. I think it is a great project. But when you are mediating the translation layer between the image and the data, then there is an opportunity for you to control it, and that opportunity is hard to resist. It is hard to choose not to own that opportunity. It is an advertising opportunity. It is a revenue opportunity. It is a chance to send a message and a tone.</span></p>
<p><span id="vix9" title="Click to view full content">I know that Google and companies like that are keenly aware of the kinds of roles they don&#8217;t want to hold, but it is sometimes seductive to think about them. And I am afraid that we, as a community, need to assert an ownership, kind of a commons, over how computers will translate what they see to information that we perceive.</span></p>
<p><strong>Tish Shute:</strong> Yes. And this is how we met, again, recently [over the project to create an open, distributed platform for AR using the Wave Federation Protocol]&#8230;</p>
<p><span id="e18n" title="Click to view full content">Something I feel really deeply is that, basically, we need the physical internet to be as open as the end-to-end internet has been. Or more so, actually, because the trend on the end-to-end internet has been toward walled gardens. Facebook became an enormous walled garden which, despite our predictions about them, is where the social experience on the web really lives. It&#8217;s very much in walled gardens still, and I really feel that with the physical internet we need to make great efforts for it not to be just a series of small pockets of privately funded walled gardens.</span></p>
<p>There needs to be some kind of communications infrastructure that keeps it open. That was why I got interested in looking at the Wave Federation Protocol: it is an open, real-time protocol that could possibly be a basis for that. But the point you&#8217;ve just talked to &#8211; the mapping of the world and who has the &#8220;goggles&#8221;, i.e., the image databases that make the world meaningful &#8211; that&#8217;s still a BIG question [i.e., who controls the view?].</p>
<p>When I saw <a id="ewxn" title="ImageWiki" href="http://imagewiki.org/">ImageWiki</a>, [I realized] that is a piece that is vital for augmented reality. We need a huge social effort to be involved in this &#8211; linking in and creating the physical internet, creating the image hyperlinks that will make it meaningful.</p>
<p><span title="Click to view full content"><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/01/dhj5mk2g_493fv23rg33_b.png"><img class="alignnone size-medium wp-image-5055" title="dhj5mk2g_493fv23rg33_b" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/01/dhj5mk2g_493fv23rg33_b-300x219.png" alt="dhj5mk2g_493fv23rg33_b" width="300" height="219" /></a></span></p>
<p><span id="e18n" title="Click to view full content"><strong>Anselm Hook:</strong> I think that&#8217;s a great point. The search interface, the kind of Internet that we&#8217;re used to, the way we talk to the network now, is fundamentally open end to end. Yes, you can have your oligarchies inside of it, as we see with Facebook, but you can always start your own venture up, and you can do a search on something and find that website and join it, or you can put up your own webpage and people can find it. </span></p>
<p><span id="e18n" title="Click to view full content">The translation layer, the idea of text search and its discovery power, and the serendipity and openness of that discovery &#8211; it&#8217;s pretty open right now. We do have some serious boundaries of language, which is one of the reasons I was working at <a id="xg:8" title="Meadan.org" href="http://www.imug.org/events/past2007.htm#meadan">Meedan.org</a> [hybrid distributed, natural-language translation] for a couple of years, trying to bridge that issue.</span></p>
<p>But here, as we move towards a physical internet where there&#8217;s no clicking and there&#8217;s no interface and the computer&#8217;s just telling you what it thinks you&#8217;re looking at &#8211; translating, you know, an image of a billboard to the name of the rock star who&#8217;s on that billboard, or translating the list of ingredients on a can of soup to the source outlets where it thinks those ingredients came from. When you have that kind of automated mediation, the question of trust definitely arises.</p>
<p>And we haven&#8217;t seen the Clay Shirkys or the Larry Lessigs of the world start to talk about this yet. Although I suspect that in the next four or five years the zero-click interface will become the primary interface &#8211; that we&#8217;ll come to assume that what we see, with the extra enhanced data projected onto our view, is the truth. Yet, at the same time, there is just no structure or mechanism even being considered for a democratic ownership of it.</p>
<p><span id="fv3x" title="Click to view full content">We have with DNS, for example, the idea that you can register a domain name and people can search for it, find it, and go to it. There&#8217;s no such thing as an Image DNS, or an image translation to DNS, right now. What does it mean when everything is just &#8220;magic&#8221;, when there&#8217;s no way for you to be a part of the conversation, where you&#8217;re just a consumer of what people tell you &#8211; or of what one company, right now, tells you &#8211; is reality? That&#8217;s a real concern.<br />
<strong><br />
Tish Shute: </strong>This, to me, is the most important question at the moment. I mean, it&#8217;s the big one, and it&#8217;s the place to put energy if you love the Internet [and what it can now become], right? You&#8217;ve got to put a lot of energy into this because this [a democratized view of the physical world as a platform] won&#8217;t just happen &#8211; there&#8217;s a lot of momentum already for it to be heavily privatized, partly because some of the computer vision algorithms that make sense of things like geotagged photographs are not open. I mean, for example, the beautiful maps that have been made at the University of Washington [from Flickr geotagged photo sets] &#8211; that isn&#8217;t in the public domain.</span></p>
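<p>The &#8220;Image DNS&#8221; Anselm imagines can be sketched as a public registry that resolves a perceptual fingerprint of an image to a registered resource, much as DNS resolves a name to an address. The sketch below is purely illustrative &#8211; the class, the integer-fingerprint representation, and the bit-distance tolerance are my assumptions, not a description of any existing system:</p>

```python
# Hypothetical sketch of an "Image DNS": a public registry resolving an
# image fingerprint (any perceptual hash, represented here as an int) to
# a resource, the way DNS resolves a name to an address.

class ImageDNS:
    def __init__(self, max_distance=10):
        self.records = {}            # fingerprint -> resource (URL or text)
        self.max_distance = max_distance

    def register(self, fingerprint, resource):
        """Anyone may claim a fingerprint, as anyone may register a domain."""
        self.records[fingerprint] = resource

    def resolve(self, fingerprint):
        """Return the nearest registered resource within tolerance, else None."""
        def dist(f):
            return bin(f ^ fingerprint).count("1")   # Hamming distance in bits
        best = min(self.records, key=dist, default=None)
        if best is None or dist(best) > self.max_distance:
            return None
        return self.records[best]
```

<p>As with DNS itself, the hard question is not the lookup but the governance: who gets to register a fingerprint, and who adjudicates near-duplicate claims &#8211; which is exactly the commons question raised above.</p>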
<p><strong>Anselm Hook:</strong> Right. Tish, in fact you&#8217;re referring [with the maps from the Flickr photos] to ordinary maps, and we&#8217;ve already seen that maps lie; we&#8217;ve already seen how much maps reflect a certain truth that becomes the normative truth. Google Maps reflects roads, because it is about roads and cars, right? Only recently have they thought about buses and walking. So the normative view that people assume is the reality &#8211; showing off, you know, Starbucks, and roads, and cars &#8211; becomes the default; those prejudices are just assumed to be the truth. But they&#8217;re not the truth at all.</p>
<p>I was talking to a friend of mine in Montreal, [Renee Sieber], and she said that the Indian portage routes there are a bridge across land and water &#8211; they don&#8217;t think of a piece of land and a piece of water as being different things, they think of them as one thing: a route. It&#8217;s already a different kind of language; we can&#8217;t even reflect it.</p>
<p>So not only is there this kind of formal, anthropological lie, in a sense, but there&#8217;s this way that we deceive ourselves because of our own prejudices.</p>
<p><strong>Tish Shute:</strong> Yes, I agree, and that&#8217;s why I think some of the things you had written on ImageWiki point clearly to the need to create a social commons. We need a social commons for the real-time physical internet, and we need it for the image hyperlinks that make sense of it.</p>
<p>And it&#8217;s a complicated thing in a sense, though, because we don&#8217;t actually have a good distributed infrastructure for AR yet, and I found, exploring AR Wave, that at last we have the suggestion of an open, federated protocol for real-time communication &#8211; the Wave Federation Protocol. [Real-time communication is a very important part of AR.] It isn&#8217;t an actuality yet where lots of people are able to use it and set up their own servers, and there&#8217;s not a standard all the way through [there is not a standard for how data is sent between the client and the server].</p>
<p>But the Wave Federation Protocol does make possible truly distributed social AR. When I saw ImageWiki, I started thinking about bringing it together with the social collaborative power of distributed AR. This really would be the basis of creating a social commons for augmented reality and the physical world as a platform &#8211; the <span id="np6x" title="Click to view full content">start of a bottom-up, deeply social collaboration on how we create the augmented reality colloquial maps that can inform a hyper-local view of the world.</span></p>
<p><strong>Anselm Hook:</strong> Yes. When Paige Saez, John Wiseman, and myself, and a few other folks&#8230; You know, Benjamin Foote, Marlin Pohlmann, and a couple of other people started to play with this, we quickly found that&#8230; We started to realize, &#8220;Oh, this kind of thing will be at least as popular as IRC. There will be at least as many people doing this as chatting in little virtual spaces. There&#8217;ll be at least as many people decorating the world with augmented reality markup, and maybe using the real world as a kind of barcode for translating what you&#8217;re looking at into an artifact, a digital artifact.&#8221;</p>
<p>And<span id="csy2" title="Click to view full content"> the size of that space was going to be huge, basically. Maybe not quite as commodifiable as Twitter, but certainly very energetic.</span></p>
<p>Many of the projects we did were just looking at these kinds of issues from an artistic, technical, and political point of view. We weren&#8217;t so much proposing complete solutions as simply using praxis to explore the idea with an implementation, as a foundation for this discussion. So I think we sort of opened that can of worms for sure.</p>
<p><strong>Tish Shute:</strong> Did you actually set up ImageWiki to work as a location-based app yet?</p>
<p><strong>Anselm Hook:</strong> It is a location-based app. It collects your longitude, latitude, and the image, and stores them. And then it uses that as a way to translate that image to anything else. It could be a piece of text or a URL.<br />
<strong><br />
Tish Shute:</strong> So there is a smartphone app, but you didn&#8217;t take it as far as an AR app yet?</p>
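The lookup Anselm describes can be sketched in a few lines. This is a hedged illustration only: it uses a simple average-hash fingerprint in place of the third-party recognition algorithm the team actually used, and all names (`ImageWiki`, `tag`, `lookup`) are invented for the example.

```python
# Sketch of an ImageWiki-style lookup: fingerprint an image, store
# (fingerprint, lat, lon) -> annotation, and match new photos by Hamming
# distance. The average-hash fingerprint is an assumption for illustration;
# the real service used a different, off-the-shelf algorithm.

def average_hash(pixels):
    """Fingerprint an 8x8 grayscale image as a 64-bit integer:
    each bit records whether that pixel is above the mean brightness."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")

class ImageWiki:
    def __init__(self):
        self.entries = []  # (fingerprint, lat, lon, annotation)

    def tag(self, pixels, lat, lon, annotation):
        """Community side: attach an annotation to a geo-located image."""
        self.entries.append((average_hash(pixels), lat, lon, annotation))

    def lookup(self, pixels, max_distance=10):
        """Search side: return the closest stored annotation, or None."""
        h = average_hash(pixels)
        best = min(self.entries, key=lambda e: hamming(h, e[0]), default=None)
        if best is None or hamming(h, best[0]) > max_distance:
            return None
        return best[3]
```

A linear scan like this is exactly the part that does not scale, which is the point Anselm makes below: matching one fingerprint against millions of images is the "industrial" side of the problem.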
<p><strong>Anselm Hook:</strong> No. We didn&#8217;t do a heads-up view. There are apps on the iPhone store that do that, but they don&#8217;t do the brute force image recognition that we were using. We used a third-party off-the-shelf algorithm that we found on Wikipedia, downloaded the source code, and threw it on the server. And John Wiseman in LA wrote the scalable database backend so that we could scale the actual&#8230;<br />
<strong><br />
Tish Shute:</strong> So how did you set the iPhone app up to work?</p>
<p><strong>Anselm Hook</strong>: The iPhone side was very simple. You take a picture of something and it tells you what it is. That is all it did. We would take the location, but the client side, the iPhone side, just rendered what was returned to you&#8230; It said, &#8220;Someone said that this picture of a barking dog is an advertisement for a local band.&#8221;</p>
<p><strong>Tish Shute:</strong> Right. So basically it was geo-tagged?</p>
<p><strong>Anselm Hook:</strong> Yes. We are just collecting the geo information. Actually, there were a whole lot of technical challenges. The whole idea of ImageWiki is actually kind of beyond the technical ability of a small team like us. It really does take a group like Google to do this kind of thing in a scalable way.<br />
<strong><br />
Tish Shute:</strong> Why is that?</p>
<p><strong>Anselm Hook:</strong> There are two sides. There is curating the images. I think that is the job of groups like us &#8211; open source groups who can curate images that are owned by the community. And then there is the searching side, the algorithm side, where you are actually matching the fingerprint of one image to images in your database; that is much more industrial. We did both sides, but ours is not a scalable solution. Mostly, proving that it could be done was what was important.<br />
<br />
<strong>Tish Shute: </strong>In terms of hooking ImageWiki up to the collaborative possibilities of AR Wave, wouldn&#8217;t federation pose some interesting possibilities for scaling search algorithms and all that?</p>
<p><strong>Anselm Hook:</strong> Yes. And what is funny also, incidentally, is that we did look for some financial support for it, but we just didn&#8217;t find the investors to scale it. Now, other companies like SnapTell took a shot at it. And they have an app in the iPhone store where you can point at a beer bottle and get back the name of the beer.</p>
<p>The classic example everyone uses is a book. Amazon has all the image jackets of all their books. You can point SnapTell at almost any book and get back links to buy it at Amazon, the price of the book, and user comments on it. So they are treating Amazon as the canonical voice of the book, for better or worse. That was the state of the art until Google Goggles came out a little while ago, which actually blows it out of the water. But that is where we are now.</p>
<p><strong>Tish Shute: </strong>Right. But the point you raise about how something like Amazon becomes the canonical voice of what a book is, right, this is the whole point, isn&#8217;t it?</p>
<p><strong>Anselm Hook:</strong> Is Amazon truth? It&#8217;s not bad. Jeff Bezos seems like a nice guy, but, you know.</p>
<p><strong>Tish Shute:</strong> And this is the point of having these open infrastructures for this. And this should be obvious in a way, but it comes back to what made the Internet great: even though, as you note, you get an oligarchy like Facebook, people could always just go off and do something else, right? Because the fundamental infrastructure was basically open and designed to be available for everyone. And many people have championed that and fought hard [to maintain this openness], haven&#8217;t they? They have devoted their lives to keeping it that way, even if the oligarchies have done their thing.<br />
<strong><br />
Anselm Hook:</strong> Yes. There are really some things underneath all of this that haven&#8217;t been solved yet.</p>
<p>One is that trust in social networks has not been built yet, so we can&#8217;t do peer-based recommendations very well. We can&#8217;t filter noise by peers. Twitter is kind of moving there, but I don&#8217;t just want to listen to my Twitter friends. I want to listen to my friends of friends. If I am getting truth from somebody, I want to get that truth from people my friends say that they trust.</p>
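The friend-of-friend filter described here is, at bottom, a short graph traversal. A minimal sketch, assuming the social graph is just a dict of adjacency lists (an illustration, not any real network's API):

```python
# Hedged sketch of a friend-of-friend trust filter: accept information
# only if its source is within max_hops of me in my social graph.
# The dict-of-lists graph representation is an assumption for illustration.

def trusted(graph, me, source, max_hops=2):
    """Breadth-first search: is `source` reachable within `max_hops`?"""
    frontier, seen = {me}, {me}
    for _ in range(max_hops):
        # Expand one hop outward, skipping nodes already visited.
        frontier = {f for node in frontier for f in graph.get(node, ())} - seen
        if source in frontier:
            return True
        seen |= frontier
    return False
```

With `max_hops=2` this implements exactly "people my friends say that they trust"; raising the hop count trades trust for reach.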
<p>Then the second problem is that there is a search business. My friend Ed Bice, who owns <a id="lir5" title="Meedan" href="http://beta.meedan.net/">Meedan</a>, always says that a search request is a publishing moment. It is an opportunity to say what you think. In the real world, if you are just hanging out with humans and you look somewhere, other people might follow your gaze and look at what you are looking at. Your gaze itself is a public act.</p>
<p>Gaze is a soft act, but it is one that is visible. With Google, the gaze of four billion people is invisible. We don&#8217;t know what people are looking at; there is no opportunity to participate. Let me give you a real example. Say I have taken an image of a bust or a statue. Why can&#8217;t the museum in Cairo look at my request and tell me, oh yeah, that is Tutankhamen, or that is Nefertiti, right? Why can&#8217;t they have a chance to participate in the search and respond to me?</p>
<p>Right now the only one that responds when I do a search is Google. We need to invert the search pyramid and open up search, so that search is a democratic act, so that you can publicly permission your searches so that other people can respond and reach out to you, not just you having to do a dialogue.</p>
<p>The common example of this &#8211; and we see this everywhere: I am looking for a slice of pizza; I am hungry and I want some pizza. I have to ask Google, find twelve websites, call twelve phone numbers, talk to each of the twelve stores, and ask them: are they open late, is the food organic, is the food any good, do my friends like it?</p>
<p>Whereas what I should be able to do is just say it&#8217;s a search moment and I am interested in pizza. If a pizza place meets my criteria &#8211; my friends like it, it is organic, it is open &#8211; then that pizza place can call me. I have the money; why should I do the search? So the whole business of search, the whole structure of search, is predicated on a revenue model, but it&#8217;s a really short-sighted revenue model; it&#8217;s not a brokerage.</p>
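The "inverted search" Anselm describes is essentially publish/subscribe: the searcher publishes an intent once, and providers whose advertised attributes satisfy it respond. A hedged sketch under that assumption; `SearchIntent` and `Broker` are invented names, not any real system:

```python
# Sketch of inverted search: publish a search intent, let matching
# providers call you back. All names here are illustrative assumptions.

from dataclasses import dataclass, field

@dataclass
class SearchIntent:
    topic: str
    criteria: dict                       # e.g. {"organic": True, "open_late": True}
    responses: list = field(default_factory=list)

class Broker:
    def __init__(self):
        self.providers = []              # list of (attributes, respond_callback)

    def register(self, attributes, respond):
        """A provider advertises its attributes and how to reach it."""
        self.providers.append((attributes, respond))

    def publish(self, intent):
        # Only providers matching the topic and every criterion respond;
        # the searcher never has to call twelve pizza places herself.
        for attrs, respond in self.providers:
            if attrs.get("topic") == intent.topic and all(
                attrs.get(key) == value for key, value in intent.criteria.items()
            ):
                intent.responses.append(respond(intent))
        return intent.responses
```

The design choice is the inversion itself: the criteria travel with the intent, so filtering happens on the provider side of the brokerage rather than in the searcher's head.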
<p>Search isn&#8217;t search; search is hand waving. These should be moments for us to have a discourse. So the problem we are seeing in AR with communicating the right information is actually underneath AR, at the level of the whole infrastructure.</p>
<p>Search needs to be inverted, trust filters need to be built. We need to democratically own our data institutions. We don&#8217;t right now. That will be more of a concern, especially with AR.</p>
<p><strong>Tish Shute: </strong>Yes, especially with AR, which is why I got all excited about federation. Do you think federation has the potential, an opportunity, to create [the new infrastructure you describe]?</p>
<p><strong>Anselm Hook:</strong> Absolutely, it&#8217;s absolutely what we must do. It is much harder to do, but it is absolutely critical.</p>
<p><strong>Tish Shute:</strong> And why is it much harder to do? Could you explain that?</p>
<p><strong>Anselm Hook:</strong> Well, it&#8217;s very easy for a bunch of hackers to build a service that you log into and fetch some data from; it&#8217;s a single thing. They don&#8217;t have to talk to anybody, they can use their own protocols, they can hack it; it&#8217;s a big black box behind the scenes. There&#8217;s someone running back and forth in a giant Chinese room delivering manuscripts and scrolls to you. Whatever is behind the black box, you don&#8217;t care; it just works. But when you federate, you need to actually publish and have standards, and then you&#8217;re talking about semantics, and everyone starts getting really excited and waving their hands. It becomes a disaster. It&#8217;s at least another order of magnitude more difficult than DIY, build it yourself.</p>
<p><strong>Tish Shute:</strong> So, in terms of what Google Wave has done with its approach to federation, what do you think its achievements have been, and what are the obstacles? What do you think are the failings of Wave? Because it&#8217;s the first big, major-player-backed public approach to something federated in real time, isn&#8217;t it?</p>
<p><strong>Anselm Hook:</strong> Yes. I think the most important non-federated service on the planet today is Twitter. <a id="uhg3" title="Ident.ic.a" href="http://identi.ca/group/identica">Identi.ca</a> isn&#8217;t getting any traction with respect to Twitter, [even though] Identi.ca is a federated version of Twitter and is very good. [Identi.ca is now <a id="w05j" title="Status.net" href="http://status.net/">Status.net</a>.] So we see already that small players aren&#8217;t being competitive. Then look at other services like IRC. IRC is the secret backbone of the Net. All the open source projects, all the teams, all the people that work on open source projects are on IRC. It&#8217;s the only way they get anything done.</p>
<p>With Google Wave, and the protocols underneath Google Wave, we see an attempt to build a similar kind of real-time but distributed protocol. I think it&#8217;s the right direction. I think people should pick up the offering and make their own servers. I think that protocol is really great: the fact that it is compressed, high performance, small, with real-time blobs of data flying around &#8211; all exactly the way it should be done. It is getting close to this kind of rewrite of the Internet that people keep talking about, because, you know, the net protocols are so bad; it is starting to treat intermittent exchanges as more transitory, volatile, and not heavy.</p>
<p><strong>&#8230;to be continued. Part 2 coming soon!<br />
</strong></p>
]]></content:encoded>
			<wfw:commentRss>http://www.ugotrade.com/2010/01/17/visual-search-augmented-reality-and-a-social-commons-for-the-physical-world-platform-interview-with-anselm-hook/feed/</wfw:commentRss>
		<slash:comments>17</slash:comments>
		</item>
		<item>
		<title>The Next Wave of AR: Mobile Social Interaction Right Here, Right Now!</title>
		<link>http://www.ugotrade.com/2009/11/19/the-next-wave-of-ar-mobile-social-interaction-right-here-right-now/</link>
		<comments>http://www.ugotrade.com/2009/11/19/the-next-wave-of-ar-mobile-social-interaction-right-here-right-now/#comments</comments>
		<pubDate>Fri, 20 Nov 2009 04:53:07 +0000</pubDate>
		<dc:creator><![CDATA[Tish Shute]]></dc:creator>
				<category><![CDATA[Ambient Devices]]></category>
		<category><![CDATA[Ambient Displays]]></category>
		<category><![CDATA[architecture of participation]]></category>
		<category><![CDATA[Artificial general Intelligence]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Augmented Reality]]></category>
		<category><![CDATA[digital public space]]></category>
		<category><![CDATA[Ecological Intelligence]]></category>
		<category><![CDATA[Instrumenting the World]]></category>
		<category><![CDATA[internet of things]]></category>
		<category><![CDATA[message brokers and sensors]]></category>
		<category><![CDATA[Mixed Reality]]></category>
		<category><![CDATA[mobile augmented reality]]></category>
		<category><![CDATA[mobile meets social]]></category>
		<category><![CDATA[Mobile Reality]]></category>
		<category><![CDATA[Mobile Technology]]></category>
		<category><![CDATA[new urbanism]]></category>
		<category><![CDATA[online privacy]]></category>
		<category><![CDATA[open source]]></category>
		<category><![CDATA[Paticipatory Culture]]></category>
		<category><![CDATA[privacy and online identity]]></category>
		<category><![CDATA[smart appliances]]></category>
		<category><![CDATA[Smart Devices]]></category>
		<category><![CDATA[Smart Planet]]></category>
		<category><![CDATA[sustainable living]]></category>
		<category><![CDATA[sustainable mobility]]></category>
		<category><![CDATA[ubiquitous computing]]></category>
		<category><![CDATA[Web Meets World]]></category>
		<category><![CDATA[websquared]]></category>
		<category><![CDATA[World 2.0]]></category>
		<category><![CDATA[AR browsers]]></category>
		<category><![CDATA[AR Dev camp]]></category>
		<category><![CDATA[AR Wave]]></category>
		<category><![CDATA[calo]]></category>
		<category><![CDATA[mobile social]]></category>
		<category><![CDATA[mobile social interaction utility]]></category>
		<category><![CDATA[open distributed augmented reality]]></category>
		<category><![CDATA[pygowave]]></category>
		<category><![CDATA[real time internet]]></category>
		<category><![CDATA[siri]]></category>
		<category><![CDATA[smart things]]></category>
		<category><![CDATA[social augmented experiences]]></category>
		<category><![CDATA[social augmented reality]]></category>
		<category><![CDATA[The Copenhagen Wheel]]></category>
		<category><![CDATA[the internet of things]]></category>
		<category><![CDATA[the outernet]]></category>
		<category><![CDATA[the sentient city]]></category>
		<category><![CDATA[Wave Federation Protocol]]></category>
		<category><![CDATA[Web Squared]]></category>

		<guid isPermaLink="false">http://www.ugotrade.com/?p=4869</guid>
		<description><![CDATA[The Next Wave of AR: Mobile Social Interaction, Right Here, Right Now! View more presentations from Tish Shute. Click on the image below or here to watch this presentation and others from Momo13]]></description>
				<content:encoded><![CDATA[<div id="__ss_2542526" style="width: 425px; text-align: left;"><a style="font:14px Helvetica,Arial,Sans-serif;display:block;margin:12px 0 3px 0;text-decoration:underline;" title="The Next Wave of AR: Mobile Social Interaction, Right Here, Right Now!" href="http://www.slideshare.net/TishShute/the-next-wave-of-ar-mobile-social-interaction-right-here-right-now-2542526">The Next Wave of AR: Mobile Social Interaction, Right Here, Right Now!</a><object style="margin:0px" classid="clsid:d27cdb6e-ae6d-11cf-96b8-444553540000" width="425" height="355" codebase="http://download.macromedia.com/pub/shockwave/cabs/flash/swflash.cab#version=6,0,40,0"><param name="allowFullScreen" value="true" /><param name="allowScriptAccess" value="always" /><param name="src" value="http://static.slidesharecdn.com/swf/ssplayer2.swf?doc=thenextwaveofar2-091120000046-phpapp01&amp;stripped_title=the-next-wave-of-ar-mobile-social-interaction-right-here-right-now-2542526" /><param name="allowfullscreen" value="true" /><embed style="margin:0px" type="application/x-shockwave-flash" width="425" height="355" src="http://static.slidesharecdn.com/swf/ssplayer2.swf?doc=thenextwaveofar2-091120000046-phpapp01&amp;stripped_title=the-next-wave-of-ar-mobile-social-interaction-right-here-right-now-2542526" allowscriptaccess="always" allowfullscreen="true"></embed></object></p>
<div style="font-size: 11px; font-family: tahoma,arial; height: 26px; padding-top: 2px;">View more <a style="text-decoration:underline;" href="http://www.slideshare.net/">presentations</a> from <a style="text-decoration:underline;" href="http://www.slideshare.net/TishShute">Tish Shute</a>.</div>
<p>Click on the image below or <a href="http://www.mobilemonday.nl/talks/tish-shute-the-next-wave-of-ar/" target="_blank">here to watch</a> this presentation and others from <a href="http://www.mobilemonday.nl/">Momo13</a></div>
<p><a href="http://www.mobilemonday.nl/talks/tish-shute-the-next-wave-of-ar/" target="_blank"><img class="alignnone size-medium wp-image-4876" title="Screen shot 2009-11-20 at 1.32.24 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/11/Screen-shot-2009-11-20-at-1.32.24-PM-300x167.png" alt="Screen shot 2009-11-20 at 1.32.24 PM" width="300" height="167" /></a></p>
]]></content:encoded>
			<wfw:commentRss>http://www.ugotrade.com/2009/11/19/the-next-wave-of-ar-mobile-social-interaction-right-here-right-now/feed/</wfw:commentRss>
		<slash:comments>4</slash:comments>
		</item>
		<item>
		<title>AR Wave: Layers and Channels of Social Augmented Experiences</title>
		<link>http://www.ugotrade.com/2009/10/13/ar-wave-layers-and-channels-of-social-augmented-experiences/</link>
		<comments>http://www.ugotrade.com/2009/10/13/ar-wave-layers-and-channels-of-social-augmented-experiences/#comments</comments>
		<pubDate>Tue, 13 Oct 2009 18:52:42 +0000</pubDate>
		<dc:creator><![CDATA[Tish Shute]]></dc:creator>
				<category><![CDATA[Ambient Devices]]></category>
		<category><![CDATA[Ambient Displays]]></category>
		<category><![CDATA[Android]]></category>
		<category><![CDATA[architecture of participation]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Augmented Reality]]></category>
		<category><![CDATA[Carbon Footprint Reduction]]></category>
		<category><![CDATA[culture of participation]]></category>
		<category><![CDATA[digital public space]]></category>
		<category><![CDATA[Energy Awareness]]></category>
		<category><![CDATA[Instrumenting the World]]></category>
		<category><![CDATA[internet of things]]></category>
		<category><![CDATA[iphone]]></category>
		<category><![CDATA[message brokers and sensors]]></category>
		<category><![CDATA[mirror worlds]]></category>
		<category><![CDATA[Mixed Reality]]></category>
		<category><![CDATA[mobile augmented reality]]></category>
		<category><![CDATA[mobile meets social]]></category>
		<category><![CDATA[Mobile Reality]]></category>
		<category><![CDATA[new urbanism]]></category>
		<category><![CDATA[Paticipatory Culture]]></category>
		<category><![CDATA[privacy and online identity]]></category>
		<category><![CDATA[social gaming]]></category>
		<category><![CDATA[social media]]></category>
		<category><![CDATA[sustainable living]]></category>
		<category><![CDATA[sustainable mobility]]></category>
		<category><![CDATA[ubiquitous computing]]></category>
		<category><![CDATA[virtual communities]]></category>
		<category><![CDATA[Web Meets World]]></category>
		<category><![CDATA[websquared]]></category>
		<category><![CDATA[World 2.0]]></category>
		<category><![CDATA[Amphibious Architecture]]></category>
		<category><![CDATA[AR Blip]]></category>
		<category><![CDATA[AR Browser]]></category>
		<category><![CDATA[AR Wave]]></category>
		<category><![CDATA[augmentaion]]></category>
		<category><![CDATA[augmented reality search]]></category>
		<category><![CDATA[Blair Macintyre]]></category>
		<category><![CDATA[Channels and Social Augmented Realities]]></category>
		<category><![CDATA[citi sensing]]></category>
		<category><![CDATA[citizen sensing]]></category>
		<category><![CDATA[Clayton Lilly]]></category>
		<category><![CDATA[cybernetics vs ecology and human waste]]></category>
		<category><![CDATA[distributed]]></category>
		<category><![CDATA[eco mapping]]></category>
		<category><![CDATA[Gene Becker]]></category>
		<category><![CDATA[geoAR]]></category>
		<category><![CDATA[geospatial web]]></category>
		<category><![CDATA[geospatial web and augmented reality]]></category>
		<category><![CDATA[Goggle Wave Federation Protocol]]></category>
		<category><![CDATA[Google Wave]]></category>
		<category><![CDATA[Google Wave as an AR enabler]]></category>
		<category><![CDATA[Google Wave enable augmented reality]]></category>
		<category><![CDATA[Google Wave Protocols]]></category>
		<category><![CDATA[green tech augmented reality]]></category>
		<category><![CDATA[immersive sight]]></category>
		<category><![CDATA[Jeremy Hight]]></category>
		<category><![CDATA[Joe Lamantia]]></category>
		<category><![CDATA[Layers]]></category>
		<category><![CDATA[layers and channels of augmented reality]]></category>
		<category><![CDATA[Life Clipper]]></category>
		<category><![CDATA[life streaming]]></category>
		<category><![CDATA[location based media]]></category>
		<category><![CDATA[location based services]]></category>
		<category><![CDATA[locative media]]></category>
		<category><![CDATA[locative narratives]]></category>
		<category><![CDATA[Mannahatta]]></category>
		<category><![CDATA[map based augmentation]]></category>
		<category><![CDATA[mapping]]></category>
		<category><![CDATA[modulated mapping]]></category>
		<category><![CDATA[modulated napping]]></category>
		<category><![CDATA[multi-user]]></category>
		<category><![CDATA[narrative archaeology]]></category>
		<category><![CDATA[Natural Fuse]]></category>
		<category><![CDATA[neogeography]]></category>
		<category><![CDATA[networked urbanism]]></category>
		<category><![CDATA[non euclidian geometry]]></category>
		<category><![CDATA[open augmented reality framework]]></category>
		<category><![CDATA[Seanseable Labs]]></category>
		<category><![CDATA[sensor networks]]></category>
		<category><![CDATA[shared augmented realities]]></category>
		<category><![CDATA[social augmented experiences]]></category>
		<category><![CDATA[social augmented reality experiences]]></category>
		<category><![CDATA[sound augmentation]]></category>
		<category><![CDATA[Thomas K. Carpenter]]></category>
		<category><![CDATA[Thomas Wrobel]]></category>
		<category><![CDATA[Trash Track]]></category>
		<category><![CDATA[ubicomp]]></category>
		<category><![CDATA[virtual reality]]></category>
		<category><![CDATA[Wave as a platform for augmented reality]]></category>
		<category><![CDATA[Wave Blip]]></category>
		<category><![CDATA[Wave Bots]]></category>
		<category><![CDATA[Wave playback]]></category>
		<category><![CDATA[Wave playback feature]]></category>
		<category><![CDATA[Wave Robots]]></category>

		<guid isPermaLink="false">http://www.ugotrade.com/?p=4585</guid>
		<description><![CDATA[It is now nearly two weeks since the Google Wave preview launch and I am happy to say we have some AR Wave news. The diagram above shows Thomas Wrobelâ€™s basic concept for a distributed, multi-user, open augmented reality framework based on the Google Wave Federation Protocol and servers (click on the image to see [&#8230;]]]></description>
				<content:encoded><![CDATA[<p><a href="http://lostagain.nl/tempspace/PrototypeDiagram3_wave.html" target="_blank"><img class="alignnone size-medium wp-image-4586" title="Screen shot 2009-10-12 at 2.40.39 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/Screen-shot-2009-10-12-at-2.40.39-PM-300x154.png" alt="Screen shot 2009-10-12 at 2.40.39 PM" width="300" height="154" /></a></p>
<p>It is now nearly two weeks since the <a href="http://wave.google.com/" target="_blank">Google Wave </a>preview launch and I am happy to say we have some AR Wave news. The diagram above shows Thomas Wrobelâ€™s basic concept for a distributed, multi-user, open augmented reality framework based on the <a href="http://www.waveprotocol.org/" target="_blank">Google Wave Federation Protocol</a> and servers (click on the image to see the dynamic annotated sketch <a href="http://lostagain.nl/tempspace/PrototypeDiagram3_wave.html" target="_blank">or here</a>).</p>
<p>Even in the short time we have had to explore Wave, some very exciting possibilities are becoming clear. Thomas puts some of the virtues of Wave as an AR enabler succinctly when he writes:</p>
<p><strong>&#8220;Wave allows the advantages of both real-time communication, as well as the advantages of persistent hosting of data. It is both like IRC, and like a Wiki. It allows anyone to create a Wave, and share it with anyone else. It allows Waves to be edited at the same time by many people, or used as a private reference for just one person.</strong></p>
<p><strong>These are all incredibly useful properties for any AR experience. What is more, Wave is open. Anyone can make a server or client for Wave. Better yet, these servers will exchange data with each other, providing a seamless world for the user&#8230; a single login will let you browse the whole world of public waves, regardless of who&#8217;s providing or hosting the data. Wave is also quite scalable and secure&#8230; data is only exchanged when necessary, and will stay local if no one else needs to view it.</strong></p>
<p><strong>Wave allows bots to run on it&#8230; allowing blips in a wave to be automatically updated, created or destroyed based on any criteria the coders choose. Wave even allows the playback of all edits since the wave was created.</strong></p>
<p><strong>For all these reasons and more, Wave makes a great platform for AR.&#8221;</strong></p>
<p>There will be much more coming soon on Wave enabled AR because the Google Wave invites have begun to flow out to a wider community now. This week, many of our small ad-hoc group looking at the development challenges and implications of Google Wave for AR actually got into Wave for the first time.</p>
<p>Many thanks to all the people who have contributed to this discussion so far including: Thomas Wrobel, Thomas K. Carpenter, Jeremy Hight, Joe Lamantia, Clayton Lilly, Gene Becker and many others.</p>
<p>We will be setting up some public AR Framework Development Waves this week. If you have any trouble finding them, or adding yourself to them, please add Thomas and me to your contact list. I am tishshute@googlewave.com and Thomas is darkflame@googlewave.com. The first two are currently called:<strong> </strong></p>
<p><strong><br />
AR Wave: Augmented Reality Wave Framework Development</strong> (developer forum)</p>
<p><strong>AR Wave: Augmented Reality Wave Development</strong> (for general discussion)</p>
<p>The discussion so far has been in two areas. On the one hand, it is gear-heady and focused on the <a href="http://www.waveprotocol.org/" target="_blank">Google Wave Federation Protocol</a>, code, development challenges, and interfacing to mobile, while on the other hand people have been looking at use cases and questions of user experience.</p>
<p>Distributed &#8220;shared augmented realities,&#8221; or &#8220;social augmented experiences&#8221; &#8211; that not only allow mashups &amp; multisource data flows, but dynamic overlays (not limited to 3d), created by users, linked to location/place/time, and distributed to other users who wish to engage with the experience by viewing and co-creating elements for their own goals and benefit &#8211; are something very new for us to think about.</p>
<p>As Joe Lamantia puts it:</p>
<p><strong>&#8220;there&#8217;s a feedback loop between which interactions are made easy by any given combo of device / hardware / software / connectivity, and the ways that people really work in real life (without any mediation / permeation by tech).&#8221;</strong></p>
<p>Joe Lamantia, whose term <strong>&#8220;social augmented experiences&#8221;</strong> I borrow for this post title, has done some thinking about <strong>&#8220;concepts and models for understanding and contributing to shared augmented experiences, such as the social scales for interaction, and the challenges attendant to designing such interactions.&#8221; </strong>Check out <a href="http://www.joelamantia.com/" target="_blank">Joe Lamantia&#8217;s blog </a>for more on this later this week.</p>
<p>It is very helpful, as Joe points out, to shift the focus back and forth between the experience and the medium.</p>
<p>It is super exciting to have clear evidence that shared augmented realities are no longer merely possible, but highly probable and actually do-able now.</p>
<p>I should be absolutely clear about what Google Wave does to enable AR, because obviously Wave plays no role in solving image recognition and tracking/registration issues. But, for example, Wave protocols and servers do provide a means to exchange, edit, and read data, and that enables distributed, social augmented realities.</p>
<p>Thomas explains how the newly named &#8220;AR Blip&#8221; works:</p>
<p><strong>&#8220;An AR Blip is simply a Blip in a wave containing AR data. Typically this would be the positional and url data telling an AR browser to position a 3d object at a location in space.</strong></p>
<p><strong>In more generic terms, an AR Blip allows data of various forms (meshes, text, sound) to be given a real-world position.&#8221;</strong></p>
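Thomas's description boils down to a small, serialisable record: a real-world position plus a pointer to the content. A minimal sketch of what such a payload might look like; the field names here are assumptions for illustration, not the actual AR Wave schema:

```python
# Hedged sketch of an "AR Blip" payload: content given a real-world
# position, serialised for embedding in a wave blip. Field names are
# illustrative assumptions, not a published AR Wave format.

import json
from dataclasses import dataclass, asdict

@dataclass
class ARBlip:
    lat: float          # latitude of the anchor point (WGS84)
    lon: float          # longitude of the anchor point
    altitude: float     # metres above sea level
    content_url: str    # where the mesh, text, or sound lives
    content_type: str   # "mesh" | "text" | "sound"

    def to_wave_payload(self):
        """Serialise the blip as JSON for transport inside a wave."""
        return json.dumps(asdict(self))
```

Because the payload is just data in a blip, everything Wave already does (federation, concurrent editing, playback) applies to the AR content for free.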
<p>I have mentioned in other posts (<a href="http://www.ugotrade.com/2009/08/19/everything-everywhere-thomas-wrobels-proposal-for-an-open-augmented-reality-network/" target="_blank">here</a> and <a href="http://www.ugotrade.com/2009/09/26/total-immersion-and-the-transfigured-city-shared-augmented-realities-the-web-squared-era-and-google-wave/" target="_blank">here</a>) that Wave can be used for AR as precise or as loose as the current generation of devices can handle. And as the hardware and software mature for the kind of AR that can put media out in the world and truly immerse you in a mixed space, the framework should be able to handle this too.</p>
<p>(a note on the Wave playback feature &#8211; this opens up a whole new world of possibilities. Check out <a href="http://snarkmarket.com/2009/3605" target="_blank">this post</a> on some of the implications of playback for writing!)</p>
<p>The use cases we have been coming up with are too numerous to go into in detail in this post. The open nature of an AR framework/Wave standard will lead to many new applications we have barely begun to imagine. As Thomas points out, different client software can be made for browsing, potentially allowing for various specialist browsers, as well as more generic ones for typical use. The multitudes of different kinds of data in/output that could be integrated in an open AR framework as it evolves are mind boggling.</p>
<p>But, for now, some obvious use cases do come to mind, e.g.:</p>
<p>- Historical environmental overlays showing how a city used to be/and how this vision may be constructed differently by different communities</p>
<p>- Proposed building work showing future changes to a structure/and the negotiations of this future (both the public and professionals could submit their own comments to the plans in context), seeing pipes, cables and other invisible elements that can help builders and engineers collaborate and do their work.</p>
<p>- Skinning the world with interactive fantasies</p>
<p>I asked Thomas to help people understand how Wave enables new interactions with data by explaining how Wave could enable city sensing and citizen sensing projects (e.g. <a href="http://tinyurl.com/y97d5zr" target="_blank">this one being pioneered by Griswold</a>):</p>
<p><strong><strong>&#8220;Sensors, both mobile and static could contribute environmental data into city overlays;</strong></strong></p>
<div><strong><strong>&#8212;temperature, windspeed, air quality (amounts of certain particles), water quality, amount of sunlight, CO2 emissions could all be fed into different waves. The AR Wave Framework makes it easy to see any combination of these at the same time.&#8221;</strong></strong></div>
<p>Having these invisible aspects of the world made visible would create ways to improve sustainability, social equity, urban management, energy efficiency, and public health, and allow communities to understand and become active participants in the ecosystems and infrastructure of their neighborhoods.</p>
<p>The key is reflecting this kind of data back to people &#8220;making it not back story but fore story,&#8221; right where we are, right where it happens, as well as having it available for analysis.</p>
<p>As well as creating new opportunities to interact with, respond to, and enhance data, making visible the invisible, as <a href="http://www.environmentalhealthclinic.net/people/natalie-jeremijenko/" target="_blank">Natalie Jeremijenko&#8217;s</a> work on <a href="http://www.amphibiousarchitecture.net/" target="_blank">Amphibious Architecture</a> and <a href="http://www.haque.co.uk/" target="_blank">Usman Haque&#8217;s</a> project <a href="http://www.sentientcity.net/exhibit/?p=43" target="_blank">Natural Fuse</a> show, can also create new connections and understandings between humans and the non-humans that share our world, e.g. fish, plants, waterways.</p>
<p>At a more prosaic level, potential buyers of property could see more clearly what they are buying, city planners could see better what needs to be worked on, and environmental researchers could see more clearly the impact people are having on an area.</p>
<p>Also, Wave can provide some of the framework necessary to begin to address tricky problems of privacy. Sensitive data can be stored on private waves, e.g. medical data for doctors and researchers, but the analysis of the data could still be of benefit to all, e.g. if it tied disease occurrences to locations and the relationships between environmental data and health were&#8230;quite literally&#8230;made visible.</p>
<p><strong>&#8220;The publication of energy consumption, and making it visible as overlays, could help influence the public into supporting more energy-efficient companies and businesses. It could also help citizens to try to keep their own energy usage down, to try to keep their street in &#8216;the green.&#8217;&#8221;</strong></p>
<p>Thomas notes:</p>
<p><strong>&#8220;With all of the above, it becomes fairly trivial to write persistent Wave-bots that automatically send notice when certain criteria are met (pollutants over a certain level, for example). On publicly readable waves, anyone can use the data on their local computers, process it, and contribute results back on a new wave. Alternatively, persistent remote servers could run cron jobs, or other automated processing, using services such as App Engine to run wave robots.</strong></p>
<p><strong>All these possibilities become &#8220;free&#8221; when using Wave as a platform for geographically tied data.&#8221;</strong></p>
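To make the bot idea above concrete, here is a minimal sketch in Python of the threshold-watching pattern Thomas describes. It does not use the real Google Wave Robots API; the in-memory wave, the reading fields, and the PM2.5 threshold are all stand-ins for illustration:

```python
# Hypothetical sketch of a threshold-watching "wave bot".
# The readings feed and post_notice() are stand-ins, not the real
# Google Wave Robots API.

PM25_LIMIT = 35.0  # example threshold, micrograms per cubic metre

def check_readings(readings, limit=PM25_LIMIT):
    """Return the readings that exceed the limit."""
    return [r for r in readings if r["pm25"] > limit]

def post_notice(wave, reading):
    # In a real robot this would append a blip to a wave;
    # here the "wave" is just a list of notice strings.
    wave.append(
        "ALERT: PM2.5 at %.1f near (%.4f, %.4f)"
        % (reading["pm25"], reading["lat"], reading["lon"])
    )

def run_bot(wave, readings):
    # Persistent bots would do this on a schedule (e.g. a cron job).
    for r in check_readings(readings):
        post_notice(wave, r)

# Example run with an in-memory "wave":
wave = []
readings = [
    {"pm25": 12.0, "lat": 40.7128, "lon": -74.0060},
    {"pm25": 48.5, "lat": 40.7306, "lon": -73.9352},
]
run_bot(wave, readings)
# wave now holds one alert blip, for the second reading
```

The same check could run inside a robot hosted on App Engine, with the notice going onto a public wave that any client renders as an overlay.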
<p>But of course this is just the beginning!</p>
<p><em>Recently, I talked at length with Jeremy Hight, who has been thinking about, designing, and creating shared augmented realities that anticipate the kind of dynamic, real time, large scale architecture we now have available through Wave, for quite some time. This is exciting stuff.</em></p>
<h3><strong>Modulated Mapping:</strong> Talking with Jeremy Hight about Layers, Channels and Social Augmented Experiences</h3>
<p><strong><strong><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/modulatedmapping5.jpg"><img class="alignnone size-medium wp-image-4611" title="modulatedmapping5" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/modulatedmapping5-230x300.jpg" alt="modulatedmapping5" width="230" height="300" /></a><br />
</strong></strong></p>
<p><strong><strong><em><span>image from Volume Magazine (Hight/Wehby)</span></em></strong></strong></p>
<p><strong><strong>Tish Shute:</strong></strong> I know you have been involved in locative media from its early days. Perhaps we can talk about how AR continues the locative media journey?</p>
<p><a href="http://www.cc.gatech.edu/~blair/home.html" target="_blank">Blair MacIntyre</a> gave me this distinction, recently:<em> &#8220;AR is about systems that put media out in the world, and immerse you in a mixed space. Even the current &#8220;not really registered&#8221; mobile phone AR systems are still &#8220;sort of&#8221; AR (e.g., Layar, etc).</em></p>
<p><em>Locative media/ubicomp/etc are very different, in that they tend to display media on a device (phone screen) that is relevant to your context, but does not attempt to merge it with the world.<br />
The difference is significant, and making it clear helps people think about what they do and what they want to do, with their work. The locative media space though points toward future AR systems (when the technology catches up!).&#8221;</em></p>
<p><strong><strong>Jeremy Hight: The need is to finish the arc that locative media and early AR have started and to now truly return to the map itself, but as an internet of data, interactivity, channels of data, end user options like analog machines once were but in high end tools, a smart AI-ish ability for it to cull data for the user, and to allow social networking to be in real world places on the map both in building augmentation and in using and appreciating it..not hacks..which have their place&#8230;but a rhizome, a branched system with shared root, end user adjustable and variable..this is the key.</strong></strong></p>
<p><strong><strong>This takes AR and mapping and makes a possible world of channels in space and this eventually can be a kind of net we see in our field of vision with a selected percentage of visual field and placement so a geo-spatial net, a local to world wide fusion of lm into a tool and educational tool</strong></strong></p>
<p><strong><strong><span>VR [virtual reality] has greatly advanced, but in nodes as it has limitations&#8230;LM [locative media] is the same&#8230;AR [augmented reality] is the way..</span></strong><strong> it now has locative elements and aspects of VR integrated into its functionality and nodes&#8230;it is the best option with all of these elements, greater hybridity and data level potential as well as end user and community sourcing potential</strong></strong></p>
<p><strong><strong>I wrote an essay for Archis&#8217; Volume, the architecture magazine on a near future sense of some of this&#8230;.a visual net on the lens like ar but with smart objects and social networking and dissent.</strong></strong></p>
<p><strong><strong>I also wrote of these things for immersive graphic design, spatially aware museum augmentation, education through ar and lm, and nod to the base interface of eye to cerebral cortex in layered and malleable augmentation in my essay <a href="http://www.neme.org/main/645/immersive-sight" target="_blank">&#8220;Immersive Sight&#8221;</a> a few years back</strong></strong></p>
<div id="gqg9" style="text-align: left;"><strong><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/dgznj3hp_3dj7g8zf7_b.jpg"><img class="alignnone size-medium wp-image-4601" title="dgznj3hp_3dj7g8zf7_b" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/dgznj3hp_3dj7g8zf7_b-300x225.jpg" alt="dgznj3hp_3dj7g8zf7_b" width="300" height="225" /></a></strong></div>
<p><strong><strong>image [above] is a simple illustration of a possible example on a screen or in front of the eye where, in a Mondrian show, the graphic design of information actually builds as one moves</strong></strong></p>
<p><strong><strong>(key is calibrated spatial intervals and related layers of further augmentation which is logical due to location and proximity)</strong></strong></p>
<p><strong><strong>from immersive sight on immersive graphic design:</strong> <em>&#8220;The design can work with this in a way that creates an interactive supplemental set of information that is malleable, shifts based on location, builds and peels away as one moves closer to a work and plays with the forms of the works and the elements of the space itself. The sequence can contain many different elements and their interplay (both in the field of vision and in terms of context and layers of information). This is the model of sections of augmentation turning on and off at key points as individual spatial and concepts moments and nodes.</em></strong></p>
<p><strong><em>Another interesting possibility is that individual points of augmentation don&#8217;t turn off, but instead are designed to build as one moves in a direction toward a specific part of the exhibit. The design can work in a sequence both content wise and visually in terms of a delay powered compositional development and style in which each discrete layer of text and image does not fade out, but builds on each other into a final composition. This can form paintings similar to Mondrian perhaps if it is a show of similar works of that era or it can form something much more metaphorical and open interpretation of the space and content but utilizing a sense of emergence spatially in terms of the composition (pieces laid bare until final approach for effect). </em></strong></p>
<p><strong><em>Each section will be well designed, but they build in layers as one moves until finally forming the final composition both visually and in terms of scope of information or building immediacy. The effect can be akin to taking a painting and slicing it into onion skin layers laid out in the air at intervals, each the same dimensions, but only one section compositionally of the greater whole. This has many semiotic applications beyond its potential aesthetically and as spatialized information possessing a sense of inter-relationship as one moves.</em>&#8220;</strong></p>
<p><strong><strong>Tish Shute:</strong> </strong>One of the things I found very inspiring when I read your papers was that your ideas are not all dependent on a model of AR that would necessarily require goggles, back packs and lots of CPU/GPU &#8211; not that that wouldn&#8217;t be nice, but that even using &#8220;magic lens&#8221; AR of the kind smart phones has enabled in an open distributed framework would open up a lot of new possibilities for what you call modulated mapping wouldn&#8217;t it?Â  What kind of social augmented realities might be enabled by a distributed infrastructure like this [AR Wave]?</p>
<p><strong><strong>Jeremy Hight: right&#8230;.I see that as wayyy down the road&#8230;most important is the one you talk about as it is more immediate and thus more essential and needed. Eventually the goggles will be like a contact lens and a deep immersive ar version of this will come, that to me is certain, but a ways down the road. An incredible amount is possible now, and this is a more pragmatic move as opposed to the more theoretical of what is a few steps from here. Thus it is more important and essential now. Tools like Google Wave are taking what even 2 years ago were more theoretical discussions of what may be and instead introducing key elements to a more immediate, powerful, flexible level of augmentation. What have been hacks and isolated elements are to be integrated: social networking, task completion, shared tools, graphics building and geo-location.</strong></strong></p>
<p><strong><strong>Tish Shute:</strong> </strong>I think some people question what augmented reality has to bring to the continuum of location based experiences that other forms of interface/mapping do not?</p>
<p><strong><strong><span>Jeremy Hight: right&#8230;.and the schism between its commercial </span></strong><strong>flat self and tests with physics etc and in between&#8230;there are a lot of unfortunate assumptions it seems as to where ar and lm cross and how ar can be many things beyond deep immersion or the opposite pole of a hockey puck having a magic purple line etc&#8230;.like lm is seen as either car directions or situationist experiments with deep data&#8230;..the progression to me is deeply organic&#8230;.and now augmentation can be more malleable, variable and end user controlled.</strong></strong></p>
<p><strong><strong>Tish Shute:</strong> </strong>Yes, it is a really exciting time for AR. Historically AR research has gone after the hard problems of image recognition, tracking and registration because we have not had dynamic, real time, large scale architectures like Wave available (until now!), so less work has been done on exploring the possibilities for distributed AR fully integrated with the internet and WWW, hasn&#8217;t it?</p>
<p>A distributed augmented reality framework such as we have envisaged on Wave would allow people to see many layers from many different people at the same time. And this kind of model has been part of your thinking and fundamental to your work for a while, hasn&#8217;t it? But it is a very new idea to most people to think about collaboratively editing layers on the world, and to be able to view augmented space through channels and networked communities. Could you explain some of the ways you have explored these ideas and how they could be explored further now to create meaningful experiences for people?</p>
<p><strong><strong><span>Jeremy Hight: right..exactly&#8230;modulated mapping to me can be an amazing tool for students&#8230;back end searching data visualizations and augmentations based on their needs&#8230;while they do something else on their computer or iphone&#8230;that can be amazing..and not deep </span></strong><strong>immersive..The map can be active, malleable, open source fed, and even, in a sense, intelligent and able to adapt. The possibility also exists for this map to have a function that based on key words will search databases on-line to find maps, animations, histories and stories etc to place within it for your study and engagement. The map is thus a platform and yet is active. Community is possible as people can communicate graphically in works placed on the map and in building mode in the tool. All the tropes of locative media are to be in a mapping system of channels of augmentation and a spatial net. The software by design will allow development on the map and communication like programs such as second life but in mapping itself.</strong></strong></p>
<p><strong><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/modultedmapping1.jpg"><img class="alignnone size-medium wp-image-4607" title="interactive 3d map copy" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/modultedmapping1-246x300.jpg" alt="interactive 3d map copy" width="246" height="300" /></a></strong></p>
<p><strong><strong><em><strong><span>image from Parsons Journal of Information Mapping Volume 2 (Hight/Wehby)</span></strong></em></strong></strong></p>
<p><strong><strong><span>I wrote an essay a few years ago for the Sarai reader questioning the traditional map and its semiotics and the need to reconsider it &#8211; then did work looking into it and what those dynamics were, and they got into 2 group shows in museums in Russia&#8230;so it actually was my arc toward modulated mapping&#8230;an interesting way to it! But yes the map itself..this is a huge area of potential and non screen based alone navigation etc. I see now that my 2 dozen or so essays in lm, ar, interface design and augmentation have all also been leading in this direction for about 10 years now</span></strong></strong></p>
<p><strong><strong>Tish Shute: </strong>I love immersive visualization, but can we &#8220;return to the map &#8211; the internet of data&#8221; as you mentioned earlier and produce interesting augmentation experiences that go beyond locative media&#8217;s device display mode without having the goggles, for example, through the magic lens of our smart phones?</strong></p>
<p><strong><strong>Jeremy Hight: yes, absolutely. the map in the older paradigm is an artifice born often of war and border dispute and not of the earth itself and its processes&#8230;the new mapping like google maps is malleable, can be open source, can read spaces and can be layers of info in the related space not plucked from it as in the past..this is amazing. the old map also was born of false semiotics/semantics like &#8220;discovery of new lands&#8221; or &#8220;pioneer&#8221; while the places were there already and names often were of empire&#8230;now this is no longer the case</strong></strong></p>
<p><strong><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/modulatedmapping2.jpg"><img class="alignnone size-medium wp-image-4608" title="jeremy map small2 copy" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/modulatedmapping2-300x233.jpg" alt="jeremy map small2 copy" width="300" height="233" /></a></strong></p>
<p><strong><strong>Tish Shute:</strong> </strong>So geoAR is a better way to express a new social relationship to mapping? And how does this fit into the evolving arc of locative media as it evolves into augmented reality?</p>
<p><strong><strong>Jeremy Hight:&#8230;early lm was mostly geocaching and drawing with gps..it took new paradigms to invigorate the field. A lot of folks focus on tools and what already is; cross pollination can ground ideas that are more radical&#8230;a metaphor in a sense to place what can be in a familiar context.</strong></strong></p>
<p><strong><strong>Tish Shute:</strong> </strong>One of the great disappointments in VR has been its isolation from networked computing and also, up to now, augmented reality &#8211; to achieve an immersive experience with tight registration of media/graphics one has had to create a separate system isolated from the internet and the power of the web.</p>
<p><strong><strong>Jeremy Hight: yes&#8230;.this will change. vr is to me an island but ar takes a part of it and shifts the paradigm and new things open this way. Do you know the project <a href="http://www.lifeclipper.net/EN/process.html" target="_blank">&#8220;life clipper&#8221;</a>? friends of mine..doing interesting things..they are a clear bridge between lm and ar&#8230;.and from vr</strong></strong></p>
<p><strong><strong>in ar augmentation and what is being augmented become fused or in collision or in complex interactions as a means to a larger contextualization and exploration of what is being augmented..this is true in immersive or non ar&#8230;.huge potential</strong></strong></p>
<p><strong><strong>vr is a space, now can be surgery which is amazing. but not layered interaction, thus an island and graphic iconography on a location can use symbolic icons which opens up even more layers (graphic designer/information designer in me talking there I suppose..)</strong></strong></p>
<p><strong><strong>Tish Shute:</strong> </strong>Yes! Talk to me more about layers and channels. I think this is one of the most interesting questions for me in augmented reality at the moment &#8211; what can we do with layers and channels, and what new possibilities for connections between people and environments can these create?</p>
<p>The ability for anyone to post something is critical to the distributed idea but one of the reasons I am so excited by Google Wave is I am fascinated by the playback function. How do you think this will enable new forms of collaborative locative narratives (<a href="http://snarkmarket.com/2009/3605" target="_blank">nice post on Wave playback here </a>).</p>
<p><strong><strong>Jeremy Hight: We are in an age of cartographic awareness unseen in hundreds of years. When was the last time that new mapping tools were sold in chain stores and installed in most vehicles? When was the last time that the augmentation of maps was done by millions (Google map hacks, etc)? The ubiquitous gps maps run in automobiles while people post pictures and graphic pins to denote specific places on on-line maps.</strong></strong></p>
<p><strong><strong>The need is for a tool that combines all of these new elements into an open source, intuitive, layered and rhizomatic map that is porous (like pumice, organic in form yet with &#8220;breathing room&#8221;), ventilated (i.e. adjustable, a flow in and out), and open (open source, open access, open spatialized dialog).</strong></strong></p>
<p><strong><strong><span> I wrote of this in my essay &#8220;Revising the Map: Modulated Mapping and the Spatial Interface&#8221; (</span></strong><span> </span><a id="h0qr" title="Revising the Map: Modulated Mapping and the Spatial Interface" href="http://piim.newschool.edu/journal/issues/2009/02/pdfs/ParsonsJournalForInformationMapping_Hight-Jeremy.pdf"><span>http://piim.newschool.edu/journal/issues/2009/02/pdfs/ParsonsJournalForInformationMapping_Hight-Jeremy.pdf</span></a>)</strong></p>
<p><strong><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/modulatedmapping3.jpg"><img class="alignnone size-medium wp-image-4609" title="jeremy map small2 copy" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/modulatedmapping3-300x206.jpg" alt="jeremy map small2 copy" width="300" height="206" /></a></strong></p>
<p><strong><em><strong><span>image from Parsons Journal of Information Mapping (Hight/Wehby)</span></strong></em></strong></p>
<p><strong><strong>Tish Shute:</strong></strong> One mapping project I really like is <a href="http://themannahattaproject.org/" target="_blank">Mannahatta</a>.Â  How could distributed AR contribute to a project like <a href="http://themannahattaproject.org/" target="_blank">Mannahatta</a>?</p>
<p><strong><strong>Jeremy Hight: that is a good example..imagine taking manhattan and having channels of options to overlay, that being an excellent option, and imagine being able to even run a few at once with delineating icons..you can augment a space with history, data, erasure, narrative, scientific analysis, time line of architecture, infrastructure, archaeological record etc&#8230;.endless possibilities, and this agitates place and place on a map into an active field of information with end user control&#8230;and open options for new layers</strong></strong></p>
<p><strong><strong>Tish Shute: </strong></strong>And do you think we could do interesting things with AR on a project like Mannahatta even with the current mediating devices we have available &#8211; i.e. our smart phones &#8211; as obviously the rich PC experience Mannahatta has built for its web interface would not be available as AR at this point?</p>
<p><strong><strong>Jeremy Hight: yes&#8230;.k.i.s.s right? these projects do not have to only be immersive and graphic intensive&#8230;&#8230;take how people upload photos onto google maps&#8230;.just make that an option on a menu, there are some pretty cool hacks already..<br />
&#8230;options is key, a space can have a community as well, building on it in software, and others navigating it, i see it near future and down the road..always have with ar really</strong></strong></p>
<p><strong><strong><a href="../wp-content/uploads/2009/10/locativenarratives1.jpg"></a><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/locativenarratives1.jpg"><img class="alignnone size-medium wp-image-4596" title="locativenarratives1" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/locativenarratives1-230x300.jpg" alt="locativenarratives1" width="230" height="300" /></a><br />
</strong></strong></p>
<p><strong><em><strong><span>image from Volume Magazine (Hight/Wehby)</span></strong></em></strong></p>
<p><strong><strong>Jeremy Hight: and yes, a lot of people focus on ar&#8217;s limitations and processing power needs as a major road block</strong></strong></p>
<p><strong><strong>Tish Shute:</strong> </strong>so do you see AR on smart phones adding any value to a project like Mannahatta?</p>
<p><strong><strong>Jeremy Hight: yes&#8230;that it can be integrated into other similar works and even disparate but cloud linked ones&#8230;so a place can be &#8220;read&#8221; in diff ways on the iphone&#8230;.beyond its map location, and more can be possible if you are there&#8230;others away, so it becomes channels of augmentation</strong></strong></p>
<p><strong><strong>Tish Shute:</strong> </strong>AR, like locative media, puts who you are, where you are, what you are doing, and what is around you center stage in online experience, but it also &#8220;puts media out in the world&#8221; &#8211; people I think understand this well as a single user experience, but we are only just beginning to think about how this will manifest as a social experience &#8211; could you explain more about modulated mapping as an experience of social augmentation?</p>
<p><strong><strong>Jeremy Hight: Modulated Mapping is a tool that will allow channels to be run along the map itself. This will allow one to view different icons and augmentations both as systems on the map and in deeper layers of information (photos, videos, animations, visualizations, etc) that can be turned on and off as desired. The different layers of icons and data may be history, dissent, artworks, spatialized narratives, and annotations developed that are communally based on shared interests, placed spatially and far beyond. The use of chat functionality in text or audio will be open in building mode and in mapping navigation/usage as desired. This also allows a community to develop or augment in the spaces on the earth. These nodes can be larger and open or small and set by groups in their channel. The end result is an open source sense of mapping that will also have a needed sense of user control, as one can select which layers of augmentation they wish to see and interact with at any time. It also will incorporate all the functionality of locative media in mapping software and mapping. In building mode and in map mode, icons will be coded to represent within channels (remember that the person using it has selected channels of augmentation from many based on their current interests and needs). Icons will be coded as active to show work in progress in cities and the globe to both invite participation and to further agitate the map from the sense of the static, as action is visible even in its icons as people are working and community is formed in common interest/need.</strong></strong></p>
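The channel model Hight describes can be sketched as a small data structure: named layers of geo-anchored items that a viewer switches on and off independently. This is purely illustrative; the class and field names are invented, not from any real mapping client:

```python
# Minimal sketch of channels of augmentation over a shared map.
# Channel names and item fields are illustrative only.

class ModulatedMap:
    def __init__(self):
        self.channels = {}   # channel name -> list of geo-anchored items
        self.active = set()  # channels currently switched on

    def add(self, channel, lat, lon, content):
        """Anyone can contribute an item to a named channel."""
        self.channels.setdefault(channel, []).append(
            {"lat": lat, "lon": lon, "content": content}
        )

    def toggle(self, channel, on=True):
        """Switch a layer of augmentation on or off for this viewer."""
        (self.active.add if on else self.active.discard)(channel)

    def visible(self):
        """Items from all switched-on channels, merged for display."""
        return [item for name in sorted(self.active)
                for item in self.channels.get(name, [])]

m = ModulatedMap()
m.add("history", 40.7829, -73.9654, "1850s reservoir stood here")
m.add("art", 40.7829, -73.9654, "sound walk, episode 3")
m.toggle("history")
print(len(m.visible()))  # only the history layer is on
```

The point of the design is that the channels are open-ended: a community can add a new layer without touching the viewer, and each viewer composes their own combination.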
<p><strong><strong>locative media got a buzz for &#8220;reading&#8221; places&#8230;when I helped create locative narrative that was what blew me away back in 2001&#8230;that we could give places a voice by placing data from research and icons on a map&#8230;&#8230;this meant lost history or augmentation was possible as kind of voices of a place and its layers&#8230;&#8230;.I called it &#8220;narrative archaeology.&#8221; We now have tools that can push these ideas and concepts farther..much farther&#8230;and with a range beyond what was before, and then the map was just a tool&#8230;.but now we are returning to the map itself&#8230;..and this as place as much as marker..this is where ar takes the ball to use a bad metaphor</strong></strong></p>
<p><strong><strong>also that project could only work if you came to our spot of a 4 block augmentation and with us there to lend you our gear&#8230;we are far beyond that now but it had its place</strong></strong></p>
<p><strong><strong>Tish Shute:</strong> </strong>How do you see &#8220;in context&#8221; AR and something we might call &#8220;context aware&#8221; cloud computing models interacting?</p>
<p><strong><strong>Jeremy Hight: sure&#8230;and I must add that I have issues with cloud computing as much as it is a good idea..</strong>.</strong></p>
<p><strong><strong>Tish Shute:</strong> </strong>because of loss of autonomy?</p>
<p><strong><strong>Jeremy Hight: tivo is simply a hard drive&#8230;but it keyword reads and gives suggestions..that is the cro-magnon link to what can be</strong></strong></p>
<p><strong><strong>Tish Shute:</strong> </strong>The nice thing about Wave is that, because of the federation model, the cloud model and local store-your-own-data models should work together.<strong><strong><span> </span></strong></strong></p>
<p><strong><strong><span>Jeremy Hight: yes..that is better&#8230;..loss of autonomy also opens up the arbitrary, which is the flaw of search engines as we know them&#8230;even Bing fails to me in that sense</span></strong></strong></p>
<p><strong><strong>Tish Shute:</strong> </strong>how do you mean, could you explain?</p>
<p><span> </span><strong><strong><span>Jeremy Hight: spidersÂ  cull from wordsÂ  but cull like trawlers at sea â€¦. tested Bing with very specific requests.. it spat out the same mass of mostly off topic resultsâ€¦.</span><br />
<span> I wonder if there is a way to cull from key words and topics from a userâ€¦not O</span>rwellian back end of courseâ€¦but from their preferences, their searches etc..</strong></strong></p>
<p><strong><strong>Tish Shute:</strong> </strong>Did you see the discussion on search in the AR Framework doc? AR search will be a massively important thing that will take a lot of intelligence and all sorts of algorithm development, won&#8217;t it?</p>
<p><strong><strong>Jeremy Hight: It also has one area of key functionality that moves into more intuitive software. Upon continued usage, the mapping software will &#8220;learn&#8221; and search based on key words used and spheres of interest the user is mapping or observing as mapped, and will integrate deeper data and types of animations, etc. into the map, or will have them waiting to be integrated upon user approval as desired. Over time the level of sophistication of additions and of search intuition will increase dramatically. The search can also, if the user wishes, run in the back end while working in the mapping program, or in off time as selected while doing other tasks. It also can never be used if one is not interested. One of the key elements of this mapping is that it is not composed of a closed set or needs user hacks to augment, but instead is to evolve and deepen by user controls and desires, as designed. Pre-existing data, visualizations and augmentations can be integrated with relative ease.</strong></strong></p>
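As a toy version of that &#8220;learning&#8221; loop (an illustration only, not a claim about how such software would actually rank), one could count the key words a user maps over time and bias candidate resources toward the heaviest overlaps:

```python
from collections import Counter

# Toy sketch of search that "learns" from usage: count the key words a
# user maps, then rank candidate resources by overlap with those counts.
# All names here are illustrative.

class InterestModel:
    def __init__(self):
        self.weights = Counter()

    def observe(self, keywords):
        """Record the key words a user just mapped or searched."""
        self.weights.update(k.lower() for k in keywords)

    def rank(self, resources):
        """resources: list of (title, keyword_set); highest overlap first."""
        def score(res):
            _, kws = res
            return sum(self.weights[k.lower()] for k in kws)
        return sorted(resources, key=score, reverse=True)

model = InterestModel()
model.observe(["watershed", "history", "watershed"])
ranked = model.rank([
    ("Subway expansion maps", {"transit"}),
    ("Old watershed surveys", {"watershed", "history"}),
])
print(ranked[0][0])  # the watershed item ranks first
```

Run in the back end, a loop like this could queue suggested layers for user approval rather than inserting them automatically, matching the opt-in behavior described above.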
<p><strong><strong>Tish Shute: </strong></strong>One of the things that Joe Lamantia points out about social augmented experiences is that they will operate across a number of different scales &#8211; conversation &gt; product design &amp; build team &gt; neighborhood / town fixing potholes &gt; global community for causes. How do designs for channels and layers change across these different social scales?</p>
<p><strong><strong><strong>Jeremy Hight:</strong> to quote myself&#8230; &#8220;The &#8220;frontier&#8221; is often defined as the space just ahead of the known edge and limit, and where it may be pushed out deeper into the previously unknown. The frontier in the world of ideas is not the warm comfort of what has been long assimilated; and the frontier in the landscape is not of maps, but of places beyond and before them.</strong></strong></p>
<p><strong><strong>&#8220;The border along what has been claimed is not only that of maps &#8211; it is of concepts, functions, inventions and related emergent industries. Ideas and innovations are like the cloud shape that briefly forms around a jet breaking the sound barrier, tangible yet not fully mapped into measure. It is when things are nailed down into specific entities, calibrated and assessed, that the dangers may inflict themselves &#8211; greed, competition, imitation, anger, jealousy, a provincial sense of ownership either possessed or demanded.&#8221; (From an essay in the Sarai Reader.) Otherwise, channels and augmentation do not have to be socio-economically stratifying or defined by those strata. We built 34n for almost nothing on older tools.</strong></strong></p>
<div id="yqjj" style="text-align: left;"><strong><strong><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/dgznj3hp_1g3svj8fq_b.jpg"><img class="alignnone size-medium wp-image-4599" title="dgznj3hp_1g3svj8fq_b" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/dgznj3hp_1g3svj8fq_b-300x225.jpg" alt="dgznj3hp_1g3svj8fq_b" width="300" height="225" /></a></strong><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/dgznj3hp_1g3svj8fq_b.jpg"><span> </span></a></strong></div>
<p><strong><em><strong><span>image from 34north 118west (Spellman/Hight/Knowlton)</span></strong></em></strong></p>
<p><strong><strong>The AR that is not deep immersion can be more readily available, and channels can be what end users need &#8211; like the diversity of chat rooms, or the range of Facebook users among us.</strong></strong></p>
<p><strong><strong>I had two moments yesterday that totally fit what we talked about. I went to the West Hollywood Book Fair, and the traditional driving directions off of mapping were wrong and we got lost&#8230; our friend could only get a wireless signal to map on an iPod touch, and we had to roam neighborhoods; then we called a friend who Google-mapped it, and we found we were a block away&#8230; so a fast geomapping overlay with an icon for the book fair on some optional grid service or community would have made it immediate. Then at the book fair I talked to a small press publisher who is trying to map works about Los Angeles by Los Angeles authors&#8230; she was stunned when I told her it could be a kind of Google Maps feature option.</strong></strong></p>
<p><strong><strong>it also has great potential to publish and place writing and art in places, both for commentary and access. imagine reading Joyce chapter by chapter in the places each chapter was written about, and then another similar experience but with writers who published into their city on such a service.</strong></strong></p>
<p><strong><strong><strong>Tish Shute:</strong></strong></strong> The challenge of shared augmented realities is not just a matter of shipping bits around, but also of how we will use channels and layers &#8211; to create and negotiate different, distributed perspectives, and to understand a shared common core and/or expressions of dissent (this came up in an email conversation with <a href="http://www.oreillynet.com/pub/au/166" target="_blank">Simon St Laurent</a>).</p>
<p><strong><strong><strong>Jeremy Hight:</strong> well my example earlier could have been communal in a way too&#8230; a tribe sort of augmentation channeling&#8230; like subscribing to listservs back in the day, but of augmentation communities/channels, and for folks to build and use in shared live form, coordinating too</strong></strong></p>
<p><strong><strong><strong>Tish Shute:</strong></strong> </strong>One good thing, though, about building an open AR Framework is that as bandwidth/CPU/hardware gets better, shared high-def immersive experiences could be supported by the same framework.</p>
<p><strong><strong>Jeremy Hight: excellent</strong></strong></p>
<p><strong><strong><strong>Tish Shute:</strong> </strong></strong>were you thinking of the image recognition and tracking with this example?</p>
<p><strong><strong><strong>Jeremy Hight:</strong> yeah&#8230; like scanning across a multi-channeled Google Map augmentation with different icons and their connected data&#8230; and possibly social networking and file sharing even in that mode&#8230; and rastering etc&#8230; could be cool with Google Wave </strong><strong><span>&#8211; on the map&#8230; then zooming in a la Powers of Ten (the Eames film).</span></strong></strong></p>
<p><strong><strong><span>I have pictured variations of this for a few years now in my head, like the example of my friends and I yesterday&#8230; we could have correlated a destination by icons in different channels, one being lit events within a lit channel in an L.A. map&#8230; maybe things streaming on it too&#8230; remote info and video etc&#8230; that would be awesome</span></strong></strong></p>
<p><strong><strong><strong>Tish Shute:</strong></strong></strong> So many of the ideas in your paper on modulated mapping (see <a href="http://piim.newschool.edu/journal/issues/2009/02/pdfs/ParsonsJournalForInformationMapping_Hight-Jeremy.pdf" target="_blank">here</a>) are brilliant use cases for shared augmented realities. Perhaps you could talk more about your ideas on locative narrative, because I think this is at the core of the kinds of experiences that a distributed AR Framework would make possible?</p>
<p><strong><strong><strong>Jeremy Hight:</strong> on the project &#8220;34 north 118 west&#8221; we mapped out a 4-block area for augmentation with sound files triggered by latitude and longitude on the GPS grid, and the map on the screen had pink rectangles that were the &#8220;hot spots&#8221; where the augmentation had been placed.</strong></strong></p>
<div id="nwc6" style="text-align: left;"><strong><strong><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/dgznj3hp_0gg994bf9_b.jpg"><img class="alignnone size-medium wp-image-4600" title="dgznj3hp_0gg994bf9_b" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/dgznj3hp_0gg994bf9_b-300x225.jpg" alt="dgznj3hp_0gg994bf9_b" width="300" height="225" /></a></strong></strong></div>
<p><strong><em><strong><span>image of interactive map with map based augmentation connected to audio augmentation on site for 34north 118west (Spellman/Hight/Knowlton)</span></strong></em></strong></p>
<p><strong><strong>We researched the history of the area and placed moments in time of what had been there at specific locations&#8230; I called this <a href="http://www.xcp.bfn.org/hight.html" target="_blank">&#8220;narrative archaeology&#8221;</a> as it allowed places to be &#8220;read&#8221; by their augmentations&#8230; info that was of the place beyond the immediate experience (different types of info) that otherwise would be lost or only found in books or web sites elsewhere. there now are locative narratives around the world, but they need to be linked. from humble origins, &#8220;narrative archaeology&#8221; went on to recently be named one of the 4 primary texts in locative media, which is pretty amazing to me&#8230; but it is growing</strong></strong></p>
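<p>A minimal sketch of the hot-spot mechanism described above &#8211; rectangular latitude/longitude zones that trigger a sound file when a GPS fix falls inside them. The zone names, file names, and coordinates are hypothetical, not the actual 34 north 118 west data:</p>

```python
# Hypothetical sketch: rectangular lat/long "hot spots" that trigger
# audio when a GPS fix falls inside them.
from dataclasses import dataclass

@dataclass
class HotSpot:
    name: str        # label for the pink rectangle on the map
    sound: str       # audio file played on entry
    lat_min: float   # southern edge
    lat_max: float   # northern edge
    lon_min: float   # western edge
    lon_max: float   # eastern edge

    def contains(self, lat, lon):
        return (self.lat_min <= lat <= self.lat_max and
                self.lon_min <= lon <= self.lon_max)

def triggered(spots, lat, lon):
    """Return the sound files of every hot spot containing the GPS fix."""
    return [s.sound for s in spots if s.contains(lat, lon)]

spots = [
    HotSpot("freight depot", "depot.mp3", 34.033, 34.035, -118.236, -118.234),
    HotSpot("rail siding", "siding.mp3", 34.035, 34.037, -118.238, -118.236),
]

print(triggered(spots, 34.034, -118.235))  # fix inside the first zone
```

The &#8220;bowling alley conundrum&#8221; Hight mentions next corresponds to these zones being a fixed, isolated set: a networked version would fetch and update the spot list from a shared service instead of hard-coding it.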
<p><strong><strong>- the limitations then were what I called the &#8220;bowling alley conundrum&#8221; &#8211; the specific data had to reset like pins&#8230; and was isolated&#8230; this led me to think about AR back then and up to now. How these could lead to much more from that point: data that would be more layered, variable, fluid&#8230; yet still augmented place and sense of place, and social networking within data and software</strong></strong></p>
<p><strong><strong><a href="http://34n118w.net/34N/" target="_blank">lifeclipper</a> to me is a bridge</strong></strong></p>
<p><strong><strong><strong>Tish Shute:</strong> </strong></strong>But Lifeclipper is isolated from the internet currently, isn&#8217;t it?</p>
<p><strong><strong><span>Jeremy Hight: yes&#8230; ours was too.. that is what Google Wave makes possible.. our project only ran on our gear, in 4 blocks&#8230; with additional auxiliary info online, and not malleable.. but hey, 2001 and all..</span></strong></strong></p>
<p><strong><strong><strong>Tish Shute:</strong> </strong></strong>So the sites for 34 north 118 west are still active though?</p>
<p><strong>Jeremy Hight: oh yeah!</strong></p>
<p><strong><strong><strong>Tish Shute: </strong></strong></strong>nice I really like sound augmentation &#8211; have you seen <a href="http://www.soundwalk.com/blog/tag/augmented-reality/" target="_blank">Soundwalk</a>?</p>
<p><strong><strong><span>Jeremy Hight: yes, very cool..</span> </strong><strong>we chose sound only as it fought the power of image&#8230; instead it caused a person to be, in a sense, in two places and times at once</strong></strong></p>
<p><strong><strong>Tish Shute:</strong></strong> and in 2001 that was definitely a visionary project!</p>
<p>You must be very excited that finally the pieces are coming together to make this stuff scale!</p>
<p><strong><strong><strong>Jeremy Hight:</strong> I can&#8217;t even tell you!! it is funny.. I have known that this would come.. just waited and waited&#8230;</strong></strong></p>
<p><strong><strong>..knew it needed the right people and tools..</strong></strong></p>
<p><strong><strong><span>.. so the bowling alley conundrum led me to develop my project shortlisted for the ISS (International Space Station), as I thought a lot about how points and works are not to be isolated but connected, and should be flowing in different parts of a map&#8230; to open up perspective and connected augmentations, but also to think about the map again&#8230; not as a base only. Then I moved into my work with new ways to visualize time, and it all really began to gel. The ideas were first published as an essay</span><span> </span><a id="qw.2" title="(http://www.fylkingen.se/hz/n8/hight.html)" href="http://www.fylkingen.se/hz/n8/hight.html"><span>(http://www.fylkingen.se/hz/n8/hight.html)</span></a><span> </span><strong><span>and later on my project blog</span></strong><span> (</span><a id="bp.b" title="http://floatingpointsspace.blogspot.com/" href="http://floatingpointsspace.blogspot.com/"><span>http://floatingpointsspace.blogspot.com/)</span></a></strong></p>
<p><strong><strong><strong>Tish Shute:</strong> </strong></strong>One thing I noticed when I was reading your paper is how you have been exploring non-Euclidean geometries. Could you explain how this is part of your idea of modulated mapping?</p>
<p><strong><strong><span>Jeremy Hight: Yes, this first came to me when my wife was reading to me from a book on the Poincar&#233; Conjecture and I was hit with a new way to measure events in time; after months of sketches, schematics and research, I came to see how it could also be connected to a geo-spatial web of projects and augmentations. It was published in the inaugural issue of Parsons School of Design&#8217;s Journal of Information Mapping, which was an exciting fit.</span></strong><span><strong> I call it &#8220;Immersive Event Time&#8221; </strong>(</span><a id="o3rt" title="http://piim.newschool.edu/journal/issues/2009/01/pdfs/ParsonsJournalForInformationMapping_Hight-Jeremy.pdf" href="http://piim.newschool.edu/journal/issues/2009/01/pdfs/ParsonsJournalForInformationMapping_Hight-Jeremy.pdf"><span>http://piim.newschool.edu/journal/issues/2009/01/pdfs/ParsonsJournalForInformationMapping_Hight-Jeremy.pdf)</span></a></strong></p>
<p><span><strong><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/dgznj3hp_4cxz57xgv_b.jpg"><img class="alignnone size-medium wp-image-4634" title="dgznj3hp_4cxz57xgv_b" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/dgznj3hp_4cxz57xgv_b-195x300.jpg" alt="dgznj3hp_4cxz57xgv_b" width="195" height="300" /></a></strong></span></p>
<p><span><strong><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/dgznj3hp_5g68k9ggh_b.jpg"><img class="alignnone size-medium wp-image-4635" title="dgznj3hp_5g68k9ggh_b" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/10/dgznj3hp_5g68k9ggh_b-300x225.jpg" alt="dgznj3hp_5g68k9ggh_b" width="300" height="225" /></a><br />
</strong></span></p>
<p><strong><strong>so the last 3 years I have been working on how it could all work as channels of augmentation, with building and navigation as open and communal in a sense, as well as AI capability &#8211; that was the time work especially. how time as experienced within an event is not a time &#8220;line&#8221; but points on and within a form&#8230; and how this model is better for visualizing events in time and documenting them. it actually sprang from reading a book on the Poincar&#233; Conjecture, which popped a bunch of other stuff together, so one could visualize an event in time as like being in the belly of a whale, with time as the ribs, and our measure of time as the skin, and moving within it&#8230; hoping this will be used as an educational tool</strong></strong></p>
<p><strong><strong>and this also can be tied to AR and the map again&#8230; how documentation of important events can be kept within icons on a Google Map, then downloaded as varying visualizations based on bandwidth and desired format</strong></strong></p>
<p><strong><strong><strong>Tish Shute: </strong></strong></strong>I have been thinking about the new forms of social interaction/agency that these kinds of augmentations of space/place/time will create. It seems there are two poles &#8211; one is the area Natalie Jeremijenko explores, of shifting social relations from institutions/statistics to real-time, location-based interactions and new forms of social agency. The other pole is more like cloud-based AI and perhaps crowd-sourced machine learning.</p>
<p>Your ideas explore the possibilities of both these poles. And certainly one of the big deals of distributed AR would be the possibilities it opened up, both for new forms of networked social relationships and for new ways to draw on network effects.</p>
<p><strong><strong><strong>Jeremy Hight:</strong> and cross pollinations within &#8230;that is what my mind goes to</strong></strong></p>
<p><strong><strong><strong>Tish Shute:</strong> </strong></strong>The other night I met Assaf Biderman, MIT, from the <a href="http://senseable.mit.edu/trashtrack/" target="_blank">Trash Track</a> team. Trash Track doesn&#8217;t utilize AR but I could see that there are possibilities there.<br />
What do you think?</p>
<p><strong><strong><span>Jeremy Hight: yes, absolutely,</span> </strong><strong>there can be sort of skins on locations that user-end selection can yield&#8230; like channels of place&#8230; and they can range from pragmatic core to art and play and places between&#8230; how this recalibrates the semiotics of the map&#8230; more than just augmentation seen as a kind of piggyback on the map&#8230; the map becomes interface and defanged platform, if you will. interestingly, my more poetic/philosophic writing led me here too</strong></strong></p>
<p><strong><strong><strong>Tish Shute:</strong></strong></strong> I know they are at very different poles of the system, but I do wonder how AR can bring some of the level of social agency/interaction that <a href="http://www.environmentalhealthclinic.net/people/natalie-jeremijenko/" target="_blank">Natalie Jeremijenko</a> works on into a productive interaction with the kind of innovations in machine learning that Dolores Labs and others are pioneering?</p>
<p><strong><strong><strong>Jeremy Hight:</strong> Natalie&#8217;s genius to me is in practical functional tech that also opens deeper questions and even new openings of what is needed.. amazing layers in her work that way.. succinct yet deep.. very deep</strong></strong></p>
<p><strong><strong><strong>Tish Shute: </strong></strong></strong>Yes &#8211; I am just writing a post about her work &#8211; I find it deeply moving the way she has delved into the possibilities of using technology to open us up to our world. One of the reasons I find distributed AR so interesting is that it will make it possible for all kinds of people to create and use augmentation in their lives and communities.</p>
<p>So to return to how a distributed AR framework could contribute to a project like Trash Track?</p>
<p><strong><strong><strong>Jeremy Hight:</strong> what about using it for community, dissent and awareness raising then? like Natalie&#8217;s work, but building like a communal work of multiple points, like the old adage of the elephant and the blind men&#8230; sorry, metaphor &#8211; like one of my points in immersive sight was how one could take augmentation as multiple works, sort of turning the faces of a thing or place&#8230; and how this would make a larger work even in such a flow, so people moving in a space could also build..</strong></strong></p>
<p><strong><strong>what of AR traces left as people move, calibrated to user traffic and trash as estimated in an urban space&#8230; like it goes back to Chris Burden in the &#8217;70s making you know that as you turn the turnstile you are drilling into the foundation and may be the one that collapses the building?</strong></strong></p>
<p><strong><strong>so their movements leave trash. Natalie is all about raising awareness of cause and effect and data, space and ecology. love that. so maybe&#8230;<br />
a feedback loop, artifact and user-end responsibility can leave traces&#8230; trash&#8230;</strong></strong></p>
<p><strong><strong>.. cybernetics vs ecology and human waste</strong></strong></p>
<p><strong><strong><strong>Tish Shute: </strong></strong></strong>could you elaborate?</p>
<p><strong><strong><strong>Jeremy Hight:</strong> brain fart&#8230; that the mass of trash people leave is a piece at a time&#8230; In a sense it is like the space shuttle mission when arguably the first true cybernaut occurred&#8230; one cord to air for the astronaut, one for the computer on their back to fix the broken bay arm&#8230; if there is a way to build on that in relation to the topic&#8230; how this can go further, that machines do not waste as much&#8230; as AR is a means to cybernetically raise awareness.. eh.. hmmm&#8230;</strong><strong> sensors etc&#8230; wearables too &#8211; could be eco awareness with data and machine and human</strong></strong></p>
<p><strong><strong>what about a cloud computing system with a slight AI in the sense of intuitive word-cloud and interest scans&#8230; so as one moves through, say, New York they can be offered new AI data and services as they move? could also be of eco interest? concerns about urban farming, eco waste, air pollution etc&#8230; perhaps with (Jeremijenko element here) sensors placed in locations, and these also giving data reads in public areas with no input but hard data itself&#8230; hmm.. could be interesting</strong></strong></p>
<p><strong><strong>it can also give info on the carbon footprints (estimated, probably, unless the data is public record somehow) of chain businesses, and data on which are more eco-friendly, as well as an iconography color-coded and icon-coded to the best places to go to support greening and eco-friendly business? and the companies could promote themselves on this service to attract eco-aware customers who would see them as kindred spirits helping the larger effort?</strong></strong></p>
<p><strong><strong>kind of eco mapping&#8230; and AR in a mobile app</strong></strong></p>
<p><strong><strong>what about sensors that read air pollution levels, levels of solar radiation (to aid with skin protection in shifting light values in a city space&#8230; i.e. put on some skin cream now&#8230;), light sensors that detect density and over-density in public spaces&#8230; to use the old trope in art of reading crowds in a space, but instead indicating overcrowding and failing infrastructure in public spaces (congestion that leads to greater pollution levels, as well as flaws in city planning over time), and perhaps a tie-in to wearables&#8230; worn sensors on smart clothes&#8230; this could form a node network of people in the crowds&#8230; and also send data while moving in a space&#8230;</strong></strong></p>
<p><strong><strong>here is a kooky thought&#8230; what of taking the computing power and data of people moving in a space, and not only getting eco data and making levels of data available to them, but possibly making a roving supercomputer&#8230; crunching the deeper data of people open to this&#8230; a hive crunching deeper analysis of the space, scan properties from sensors, and even a game-theory-esque algorithm of metadata if, say, 40 people out of 50 hit on a certain spike or reading&#8230; and even their input&#8230; I worked in game theory for paleontology in this manner for a time as a teen&#8230; a private project&#8230; the reading can lead to a sort of meta read by what hits most consistently, as well as their input&#8230; text of what they experienced, observed, postulated, even analyzed&#8230; this could be really interesting, even if just the last part from collected data and not from any complex branching of servers..</strong></strong></p>
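<p>The &#8220;meta read&#8221; idea above &#8211; say, 40 people out of 50 hitting on the same spike &#8211; can be sketched as a simple threshold over participants&#8217; reports. The 0.8 threshold and the reading label are hypothetical illustrations:</p>

```python
# Hypothetical sketch: flag a reading as a consensus event when at least
# `threshold` of participants in a space report it.
from collections import Counter

def consensus_events(reports, threshold=0.8):
    """reports: one set of event labels per participant.
    Returns the labels reported by at least `threshold` of participants."""
    n = len(reports)
    counts = Counter(label for r in reports for label in r)
    return {label for label, c in counts.items() if c / n >= threshold}

# 40 of 50 participants register the same (hypothetical) pollution spike
reports = [{"pm25_spike"}] * 40 + [set()] * 10
print(consensus_events(reports))  # the spike clears the 0.8 threshold
```

The same counting-over-reports shape fits Hight&#8217;s paleontology thought experiment: feed in published papers instead of sensor reads and look for the points most consistently agreed upon.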
<p><strong><strong>I thought at 19 or so that the flaw in paleontology was in how so many larger theories kept shifting exhibitions and larger senses of things &#8211; like, were there pre-historic birds that were mistaken for amphibian and then back again&#8230; so why not make a computer program, feed all the published papers into it, and see what hits were counted in terms of an emerging meta theory and a landscape of key points being agreed upon&#8230; this data would be in a sense both algorithmic and a sort of unspoken dialogue&#8230; came from a lot of study of game theory one summer&#8230;</strong></strong></p>
<p><strong><strong>hope this makes some sense&#8230; I forgot to mention that I originally planned to be a research meteorologist, and my plan in middle school or so was to get a PhD and develop new software to have a global map and then run models of hypothetical storms across it in real-time animations of cloud forms, radar and wind analysis/fields, barometric pressure spaghetti charts etc&#8230; and to also do 3D cut-away models of storm architectures&#8230; so I have been into visualizations of complex data and mapping for a long time!</strong></strong></p>
<p><strong><strong><strong>Tish Shute:</strong> </strong></strong>Wow let me think about this one!</p>
]]></content:encoded>
			<wfw:commentRss>http://www.ugotrade.com/2009/10/13/ar-wave-layers-and-channels-of-social-augmented-experiences/feed/</wfw:commentRss>
		<slash:comments>18</slash:comments>
		</item>
		<item>
		<title>Everything Everywhere: Thomas Wrobel&#8217;s Proposal for an Open Augmented Reality Network</title>
		<link>http://www.ugotrade.com/2009/08/19/everything-everywhere-thomas-wrobels-proposal-for-an-open-augmented-reality-network/</link>
		<comments>http://www.ugotrade.com/2009/08/19/everything-everywhere-thomas-wrobels-proposal-for-an-open-augmented-reality-network/#comments</comments>
		<pubDate>Thu, 20 Aug 2009 03:58:57 +0000</pubDate>
		<dc:creator><![CDATA[Tish Shute]]></dc:creator>
				<category><![CDATA[Ambient Devices]]></category>
		<category><![CDATA[Ambient Displays]]></category>
		<category><![CDATA[Android]]></category>
		<category><![CDATA[architecture of participation]]></category>
		<category><![CDATA[Augmented Reality]]></category>
		<category><![CDATA[digital public space]]></category>
		<category><![CDATA[Instrumenting the World]]></category>
		<category><![CDATA[internet of things]]></category>
		<category><![CDATA[iphone]]></category>
		<category><![CDATA[mobile augmented reality]]></category>
		<category><![CDATA[mobile meets social]]></category>
		<category><![CDATA[Mobile Reality]]></category>
		<category><![CDATA[privacy and online identity]]></category>
		<category><![CDATA[Smart Planet]]></category>
		<category><![CDATA[ubiquitous computing]]></category>
		<category><![CDATA[Web Meets World]]></category>
		<category><![CDATA[alternate reality games]]></category>
		<category><![CDATA[alternate reality games and augmented reality]]></category>
		<category><![CDATA[AR Consortium]]></category>
		<category><![CDATA[AR Network]]></category>
		<category><![CDATA[ARG games]]></category>
		<category><![CDATA[ARN]]></category>
		<category><![CDATA[augmented reality and privacy]]></category>
		<category><![CDATA[augmented reality browser wars]]></category>
		<category><![CDATA[Augmented Reality Browsers]]></category>
		<category><![CDATA[augmented reality concepts]]></category>
		<category><![CDATA[augmented reality filters]]></category>
		<category><![CDATA[augmented reality games]]></category>
		<category><![CDATA[augmented reality permissions]]></category>
		<category><![CDATA[Bertine van Hovell]]></category>
		<category><![CDATA[Bruce Sterling]]></category>
		<category><![CDATA[Dark Flame]]></category>
		<category><![CDATA[Denno Coil]]></category>
		<category><![CDATA[distributed augmented reality]]></category>
		<category><![CDATA[Elan Lee]]></category>
		<category><![CDATA[Fourth Wall Studios]]></category>
		<category><![CDATA[future of augmented reality]]></category>
		<category><![CDATA[Google Wave]]></category>
		<category><![CDATA[Google Wave Protocols]]></category>
		<category><![CDATA[Google Wave Web of Protocols]]></category>
		<category><![CDATA[Internet Relay Photoshop]]></category>
		<category><![CDATA[IRC paradigm]]></category>
		<category><![CDATA[IRC protocols and augmented reality]]></category>
		<category><![CDATA[J Aaron Farr]]></category>
		<category><![CDATA[Lost Again]]></category>
		<category><![CDATA[Mez Breeze]]></category>
		<category><![CDATA[Mitsuo Iso]]></category>
		<category><![CDATA[Open Augmented Reality Network System]]></category>
		<category><![CDATA[open standards for augmented reality]]></category>
		<category><![CDATA[protocols for augmented reality]]></category>
		<category><![CDATA[real time communications protocols]]></category>
		<category><![CDATA[real time web]]></category>
		<category><![CDATA[res-nova]]></category>
		<category><![CDATA[Robert Rice]]></category>
		<category><![CDATA[social tesseracting]]></category>
		<category><![CDATA[Thomas Wrobel]]></category>
		<category><![CDATA[XMPP]]></category>
		<category><![CDATA[XMPP and presence]]></category>

		<guid isPermaLink="false">http://www.ugotrade.com/?p=4228</guid>
		<description><![CDATA[Today, I was very excited when Thomas Wrobel sent me a draft of, &#8220;Everything Everywhere: A proposal for an Augmented Reality Network system based on existing protocols and infrastructure.&#8221; Thomas has kindly agreed to let me publish his draft, to open a discussion on this topic. The diagram opening this post (click image to enlarge) [&#8230;]]]></description>
				<content:encoded><![CDATA[<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/08/Image1.jpg"><img class="alignnone size-medium wp-image-4277" title="Image1" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/08/Image1-300x162.jpg" alt="Image1" width="300" height="162" /></a></p>
<p>Today, I was very excited when <a href="http://www.darkflame.co.uk/">Thomas Wrobel</a> sent me a draft of, <strong>&#8220;Everything Everywhere: A proposal for an Augmented Reality Network system based on existing protocols and infrastructure.&#8221;</strong></p>
<p>Thomas has kindly agreed to let me publish his draft, to open a discussion on this topic. The diagram opening this post (click image to enlarge) shows, <strong>&#8220;An example of how collaborative 3D-spaces could be shared over existing IRC networks.&#8221;</strong> It is from Thomas&#8217; proposal.<strong> </strong>The full text of his paper is included later in this post.</p>
<h3>&#8220;Can we try to avoid a browser war this time?&#8221;</h3>
<p>Thomas notes in the closing remark to his paper:</p>
<p><strong>&#8220;I am absolutely confident in my belief AR will become at least as important as the web has, and probably a lot more so. It will also face much the same hurdles and challenges getting established as that medium did. But, speaking as a web-developer, can we try to avoid a browser war this time?&#8221;</strong></p>
<p><a href="http://www.darkflame.co.uk/">Thomas Wrobel</a> has consistently posted insightful comments on how existing standards could be used for creating open augmented reality networks. But he expressed concern to me that his work and this paper not be overplayed:</p>
<p><strong>&#8220;I&#8217;m hardly a leader, I&#8217;m just an amateur with a load of ideas on AR-related topics, some which might be useful, others might become unworkable. I don&#8217;t want anyone to get the impression this is how I think it has to, or should be done.&#8221;</strong></p>
<p>I have brought/am bringing up this topic of using existing standards and infrastructure where possible for open augmented reality networks in all my interviews with members of the <a href="http://www.arconsortium.org/" target="_blank">AR Consortium</a>.</p>
<p>And I am finding agreement on a point that <a href="http://curiousraven.squarespace.com/" target="_blank">Robert Rice</a> makes, <strong>&#8220;there is no perfect, ultimate solution *now*, but we have to do *something* to work from and refine/evolve.&#8221;<br />
</strong></p>
<p>Thomas Wrobel makes what I consider some crucial opening suggestions. I take my hat off to him for thinking about this early, coming up with some clear, elegant, and practical ideas, and doing the work to articulate these ideas so others can participate in evolving them. Massive props for that, many times over.</p>
<p>Good ideas on standards at an early stage of a developing industry like augmented reality are like spring sunshine and April showers for new crops. No one knows what storms and pests the growing season will bring &#8211; but water and sunshine (open standards) are always a good start. And, personally, I can&#8217;t wait to see how this new industry unfolds (see Bruce Sterling&#8217;s awesome Layar Conference keynote: <a href="http://layar.com/video-bruce-sterlings-keynote-at-the-dawn-of-the-augmented-reality-industry/" target="_blank">&#8220;At the Dawn of the Augmented Reality Industry.&#8221;</a>)</p>
<p>Thomas Wrobel is:</p>
<p><strong> &#8220;a web developer working for a small, brand-new company called <a href="http://www.lostagain.nl/" target="_blank">Lost Again</a>, which mostly works on ARGs (That is, the alternate reality games, not the augmented reality games, although there&#8217;s probably going to be big overlap there in the future). We developed two educational ARG games for the Netherlands with <a href="http://www.res-nova.nl/">a company called res-nova</a>.&#8221;</strong></p>
<p>I have been following Alternate Reality Games through the amazing work of Elan Lee and <a href="http://www.fourthwallstudios.com/">Fourth Wall Studios</a>. Like Thomas, I think the intersection of ARGs and augmented realities is going to be very interesting.  Thomas wanted me to point out that the website for his company with Bertine van H&#246;vell, http://www.lostagain.nl/, is just a placeholder for now.<br />
<strong><br />
&#8220;It will probably be up fully within a week or two.&#8221; And, &#8220;despite the logo, we aren&#8217;t an AR company [yet], or a travel firm. The logo&#8217;s supposed to represent being lost in our minds.&#8221;</strong></p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/08/logolostagainsmall.png"><img class="alignnone size-full wp-image-4250" title="logolostagainsmall" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/08/logolostagainsmall.png" alt="logolostagainsmall" width="162" height="56" /></a></p>
<p>Thomas has been thinking about the topic of an open augmented reality network for a while now. He is an artist also known as <a href="http://www.renderosity.com/mod/gallery/index.php?image_id=1221354&amp;member">DarkFlame</a>, and his ARN network is included in this augmented reality concept for 2086 that he created in 2006 (click on image below to enlarge).</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/08/Picture-78.png"><img class="alignnone size-medium wp-image-4254" title="Picture 78" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/08/Picture-78-300x218.png" alt="Picture 78" width="300" height="218" /></a></p>
<h3>Beyond IRC</h3>
<p>Both Thomas and <a href="http://arsvirtuafoundation.org/research/">Mez Breeze</a> made extensive and insightful comments on my last post, <a href="http://www.ugotrade.com/2009/08/03/augmented-reality-bigger-than-the-web-second-interview-with-robert-rice-from-neogence-enterprises/">&#8220;Augmented Reality &#8211; Bigger Than the Web: Second Interview with Robert Rice.&#8221; </a>And in particular they both picked up on something I am very interested in &#8211; the potential use of the Google Wave Web of protocols in creating open augmented reality networks.</p>
<p>Mez, in her brilliant brainsplosion on social tesseracting, takes on the very definition of information:</p>
<p><strong>&#8220;Tish, when you ask Robert &#8216;&#8230;what is your approach to delivering a massively shared real time [augmented reality] experience that is like Wave not confined to a walled garden?&#8217; that&#8217;s an extremely relevant question + one that needs to be addressed while considering the entirety of the Reality-Virtual Continuum. I&#8217;ve recently finished a series of articles addressing this: the framework I&#8217;ve developed is termed<a href="http://arsvirtuafoundation.org/research/2009/03/01/_social-tesseracting_-part-1/" target="_blank"> &#8220;Social Tesseracting.&#8221;</a></strong></p>
<p>I have recently begun exploring the Google Wave Web of Protocols, which are nicely outlined in <a href="http://cubiclemuses.com/cm/articles/2009/08/09/waves-web-of-protocols/">this post</a> by J. Aaron Farr, which includes the very interesting diagram below (more on Google Wave in another post).</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/08/wave_protocols.png"><img class="alignnone size-medium wp-image-4255" title="wave_protocols" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/08/wave_protocols-300x293.png" alt="wave_protocols" width="300" height="293" /></a></p>
<p>But, as Thomas notes, while he demonstrates his ideas using IRC (Internet Relay Chat) they reach<strong> Beyond IRC</strong>:</p>
<p><strong>&#8220;As mentioned before, IRC has some drawbacks, which are due to its age or method of working. As such, future systems might yet prove better alternatives for an open AR network. One example of such a system is Google Wave. It shares many of the advantages of IRC (open, anyone can create a channel of data, different permission levels can be set, and it&#8217;s free), while avoiding some critical restrictions (the data can be persistent). I believe some of the ideas I&#8217;ve mentioned, and possibly even the proposed protocol string, could be adapted for Google Wave or other future systems. I believe overall the principles are more important than any specific implementation to get to them.&#8221;</strong></p>
<p>Thomas also pointed out that while he uses markers to illustrate some of his examples, they are just one method of tracking. What he is presenting is meant to be transparent to the methodology of registration/tracking.</p>
<p><strong>Tish Shute: You mostly use marker-based examples, but there is no reason why the principles you are suggesting will not be just as relevant as we move into using more sophisticated image recognition tools, is there?</strong></p>
<p><strong>Thomas Wrobel: No reason whatsoever. I mostly chose familiar markers as something that could be used now, with a lot of coding libraries already established for them. I think for most future AR use, markers will go completely&#8230;especially outside. Either things will be done purely by GPS or object recognition, or (in the case of advertising) the markers will look like normal posters.</strong></p>
<p><strong>However, I do think traditional markers might &#8220;cling on&#8221; for non-geographically-specific stuff at home. After all, if you need some reference points for moving meshes about in real time&#8230;(say, when playing a board game with a friend on the other side of the world)&#8230;then there&#8217;s probably nothing that&#8217;s going to be more practical than some simple bits of paper or card.</strong></p>
<h3>Everything Everywhere</h3>
<h4>&#8211; A proposal for an Augmented Reality Network system based on existing protocols and infrastructure.</h4>
<h3>by Thomas Wrobel</h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/08/darkflame2.jpg"><img class="alignnone size-medium wp-image-4260" title="darkflame" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/08/darkflame2-199x300.jpg" alt="darkflame" width="199" height="300" /></a></p>
<p>The following paper is my vision of an open AR Network and potential methods to implement it with existing technologies. Specifically, I&#8217;ll be focusing on the potential for a global outdoor AR network, although the ideas aren&#8217;t limited to that.</p>
<p>Of course I call it &#8220;my&#8221; vision, but I&#8217;m obviously not the first to have many of these ideas. I have been influenced and inspired by many things&#8230;</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/08/AR_paper_img_0new1.jpg"><img class="alignnone size-medium wp-image-4232" title="AR_paper_img_0new" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/08/AR_paper_img_0new1-140x300.jpg" alt="AR_paper_img_0new" width="140" height="300" /></a></p>
<p><em>[Some of Thomas Wrobel&#8217;s influences &#8211; watched and played. Images from Mitsuo Iso&#8217;s<a href="http://en.wikipedia.org/wiki/Denn%C5%8D_Coil" target="_blank"> Denno Coil</a> (click to enlarge) top; below, from the game &#8220;Metroid Prime,&#8221; Terminator, and the last from Buffy the Vampire Slayer!]</em></p>
<p><strong>The AR Network.</strong></p>
<p>When I speak of a future AR Network, I mean one as universal and as standard as the internet. One where people can connect from any number of devices, <em>and without additional downloads</em>, experience the majority of the content.<br />
Where people can just point their phone, webcam, or pair of AR glasses anywhere where a virtual object should be, and they will see it. The user experience is seamless; AR comes to them without their needing to &#8220;prepare&#8221; their device for it.</p>
<p>From this point forward, I will refer to this future AR Network simply as the <strong>&#8220;Arn.&#8221;</strong></p>
<p>The Arn should be an inclusive and open platform that any number of devices can connect to, and where anyone can make and host their own location-specific models or data.<br />
It should allow people to communicate both publicly and privately, and not have their vision constantly cluttered with things they don&#8217;t want to see.</p>
<p>There are two old, existing paradigms that I think, when combined, can help reach this goal.</p>
<p><strong>The Internet Relay Photoshop.</strong></p>
<p>IRC, or Internet Relay Chat, was a chat system designed by Jarkko Oikarinen in the late 80s.</p>
<p>It&#8217;s a system where people meet on &#8220;channels&#8221;; they can talk in groups, or privately. Channels can be read-only, or open for all to contribute to. There is no restriction on the number of people that can participate in a given discussion, or the number of channels that can be formed. All servers are interconnected and pass messages from user to user over the network.</p>
<p>To me, this relatively old internet technology is a great template, or even foundation, for how the Arn could operate. Rather than text being exchanged, it would be mesh data (or links to mesh data), but other than that, many of the same principles could apply.</p>
<p>People could join channels of information to view or contribute. Families could leave messages to each other scribbled in mid-air on private channels. Strangers could watch AR games being played between people in parks. People going into a restaurant could see the comments from recent guests hovering by the menu items.<br />
None of this would have to be called up specially; if they are on the right channel when it is broadcast, they will see it.</p>
<p>The IRC paradigm becomes particularly powerful when combined with another one common to many computer users: that of a &#8220;layer&#8221; in an art program, such as Photoshop or Paint Shop Pro.<br />
As most of us know, layers allow us to separate out different components of a piece of art while editing, either to focus our attention on one piece, or to make future editing easier.</p>
<p>Now what if we simply have each &#8220;channel&#8221; of information represented as a layer?</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/08/AR_paper_img_1.jpg"><img class="alignnone size-medium wp-image-4265" title="AR_paper_img_1" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/08/AR_paper_img_1-300x206.jpg" alt="AR_paper_img_1" width="300" height="206" /></a></p>
<p><em>Click to enlarge image above.</em></p>
<p>Having channels corresponding to layers is an easy and intuitive way for the Arn to operate. The user can log in and contribute data to any channel, like IRC, as well as adjust the desired opacity and visual range of each layer, like they would a layer in Photoshop.</p>
<p>In this way they can get a custom view of the world, both with shared and personal AR elements visible at the same time.<br />
They would not have to switch between various overlays to their world view, as they could see many at the same time.</p>
<p><strong>Persistence of Data</strong></p>
<p>With an IRC or IRC-like system to communicate, the data sent is mostly temporary&#8230;broadcast on the fly from user to user and device to device. It is retained in the users&#8217; local logs, but not &#8220;hosted&#8221; anywhere.</p>
<p>I think for the majority of day-to-day purposes this is not so much a drawback as actually desirable for AR. Most casual communication doesn&#8217;t need to be recorded permanently in 3D space and, indeed, if it were, the cost of running such a service would increase exponentially with users and with time. Not to mention, our visual view of the world would get very cluttered very quickly. Imagine what your monitor would be like if it kept a history of every window you have ever opened and their positions!</p>
<p>So for most cases, AR space should be treated like a 3D monitor: letting us display many pieces of data from remote and local sources, and even share them with others, but not being, by default, a permanent record of it all.</p>
<p>Most data will be analogous to pixels on a display, and if kept in records, it&#8217;s only on the clients&#8217; devices, not on the network itself.</p>
<p>However, occasionally we do want 3D data analogous to a web page, such as (in the example above) the map layer. Data here should be persistent and visible to all that have that layer turned on. I see no reason why hosting this data needs to use anything other than standard web-hosting, with the (read-only) #channel on the Arn merely providing a route to the data.</p>
<p>As the user logs onto the channel, the server, using a chat-bot, can send them a list of meshes with location data attached, and the Arn browser can simply pick the data to display that&#8217;s local to them. (Note 1: By doing it this way around, some degree of anonymity is possible, rather than the server knowing exactly where you are and feeding the specific correct string to you.)</p>
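<p>As a rough sketch of that client-side filtering: the Arn browser receives the bot&#8217;s whole list and discards everything that isn&#8217;t nearby, so the server never learns the user&#8217;s exact position. The function names, tuple layout, and 500&#160;m radius below are my own illustrative assumptions, not part of the proposal:</p>

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points."""
    r = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def local_meshes(mesh_list, user_lat, user_lon, radius_m=500.0):
    """Keep only the meshes from the channel bot's list whose location falls
    within radius_m of the user; everything else is never downloaded."""
    return [m for m in mesh_list
            if haversine_m(user_lat, user_lon, m[1], m[2]) <= radius_m]

# A toy list as the bot might send it: (mesh URL, latitude, longitude).
nearby = local_meshes(
    [("http://www.darkflame.co.uk/mesh/church/chuch.kml", 49.5000123, -123.5000123),
     ("http://example.org/mesh/tower.kml", 51.5, -0.1)],
    49.5001, -123.5001)
# nearby now holds only the church mesh
```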
<p>We simply need to establish standards so this data can be pulled up and interpreted.</p>
<p>For instance, this standard could be as simple as an XML string pointing to a KML file on a server. This could then be displayed in the user&#8217;s field of view at the co-ordinates specified.</p>
<p>In this way, permanent data tied to locations, such as historical overlays or maps, could co-exist on the same protocol as temporary data such as mid-air chats or gaming-related meshes.</p>
<p>There is also no reason why these shared and personal spaces based on channels of data have to be restricted to things given absolute co-ordinates.</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/08/AR_paper_img_2.jpg"><img class="alignnone size-medium wp-image-4266" title="AR_paper_img_2" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/08/AR_paper_img_2-226x300.jpg" alt="AR_paper_img_2" width="226" height="300" /></a></p>
<p>(Different ways to access the same mesh)</p>
<p>It could work just as well with markers, and thus relative co-ordinates.</p>
<p>This would be mostly useful for indoor use, letting people logged onto a channel see the same meshes on the markers as everyone else, thus allowing multi-player AR games, or AR games with observers, very easily.</p>
<p>For example, games like Chess could be played between people with no additional code needed: you simply have a set of markers for your own pieces, and as you move them, the channel updates with the new positions, which are displayed in place in your opponent&#8217;s field of view.</p>
<p>This sort of game comes &#8220;free&#8221; with just having a generic system of shared space supporting markers.</p>
<p>It would also allow AR adverts down the street or in magazines to be viewed by simply logging onto the right AR channel.</p>
<p>If markers are designed with URL data in them, this could even be a prompted or automatic process:<br />
&#8220;There is visual data in this area on the following channel: #ABCD. Would you like to view this channel?&#8221;</p>
<p><strong>Pros and Cons of using IRC or IRC-like systems</strong></p>
<p><strong>Pros;</strong></p>
<p><strong>&#8226; Anyone can write IRC interface software.<br />
&#8226; Anyone can create new IRC channels without cost.<br />
&#8226; Channels can have read and write permissions set.<br />
&#8226; Users can easily have multiple channels open at once.<br />
&#8226; Already established, with thousands of servers worldwide.</strong></p>
<p><strong>Cons;</strong></p>
<p><strong>&#8226; 500-or-so character limit. 3D data must be linked to, not sent.<br />
&#8226; Slow update rate. Lines of data can take a whole second or more to send.<br />
&#8226; Non-persistent. Good for a 3D view, not good for storage.</strong></p>
<p><strong>An example of how collaborative 3D-spaces could be shared over existing IRC networks;</strong></p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/08/Image1.jpg"><img class="alignnone size-medium wp-image-4277" title="Image1" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/08/Image1-300x162.jpg" alt="Image1" width="300" height="162" /></a></p>
<p><strong><em>Click on the image to enlarge.</em></strong><br />
While in the long run I would hope for a dedicated AR network to be developed, with greater flexibility and persistence of data, there is a lot that can be done with the existing IRC system to implement the ideas mentioned above.</p>
<p>Below I will show an example of a simple, crude pseudo-protocol that could be fairly easily implemented to create shared AR spaces broadcast across IRC channels.</p>
<p>It&#8217;s important to note that the goal here isn&#8217;t to exchange the mesh data itself on IRC; it&#8217;s to exchange links to the data.</p>
<p>Exchanging the mesh data directly within the 500-character IRC limit would be very hard, and liable to errors.</p>
<p>It&#8217;s also a waste of network bandwidth, as many people logged onto the channel might not have that object in their field of view, so their clients should not bother downloading it. (It should be up to the client browsers when to anticipate and cache mesh data.)</p>
<p><strong>Proposed Basic XML link exchange for AR;</strong></p>
<p>Principle:<br />
As a user creates or changes an object, the client&#8217;s software posts a simple XML-formatted string to<br />
the IRC channel.<br />
Anyone logged into that channel then sees that mesh displayed in the specified location.</p>
<p>This string could be formatted as follows:</p>
<p>&lt;Mesh<br />
ID="DARKFLAME:1"<br />
Obj="http://www.darkflame.co.uk/mesh/church/chuch.kml"<br />
Loc="(49.5000123,-123.5000123)"<br />
Permissions="None"<br />
LastUpdate="12/12/0000,2012:12"<br />
/&gt;</p>
<p>This string allows other users&#8217; clients logged into the channel to automatically load the object from the URL and display it at the correct position in their field of view.<br />
If the permissions are set to allow it, they could then move the object themselves, with the update being fed back seamlessly to other users on the channel.</p>
<p>The objects posted are given an ID, which can be just the poster&#8217;s name, followed by a unique object number for that name. These unique IDs would allow clients to track different instances of the same mesh, as well as making it easy to implement permissions. (If only the poster should be allowed to move this object, then the clients simply check whether the ID matches the user name posting the update. If it doesn&#8217;t, they can ignore it.)</p>
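<p>Since the proposed string is well-formed XML, a client could parse it with any stock XML parser. A minimal sketch, using Python&#8217;s standard library (the variable names are illustrative, not part of the draft):</p>

```python
import xml.etree.ElementTree as ET

# The protocol string exactly as it might arrive on the channel.
post = ('<Mesh ID="DARKFLAME:1" '
        'Obj="http://www.darkflame.co.uk/mesh/church/chuch.kml" '
        'Loc="(49.5000123,-123.5000123)" '
        'Permissions="None" '
        'LastUpdate="12/12/0000,2012:12" />')

# All five fields arrive as XML attributes.
mesh = dict(ET.fromstring(post).attrib)

# The owner is the nick before the colon; the rest is that owner's object number.
owner, obj_num = mesh["ID"].split(":", 1)
```

<p>With the owner extracted this way, the permission check described above is just a comparison between <code>owner</code> and the nick that posted the update.</p>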
<p>Next the objects need to be linked to a mesh.</p>
<p>The location of the object&#8217;s mesh doesn&#8217;t have to be a fixed remotely-hosted URL; it could be an IP address and port number of the user posting the mesh, hosted by the application posting the link to the channel.</p>
<p>Obj=â€www.darkflame.co.uk/mesh/church/chuch.kmlâ€<br />
Obj=â€123,223,14,23::3030â€</p>
<p>The object&#8217;s co-ordinates, likewise, need not be specified as absolute GPS co-ordinates, but instead could refer to a generic marker.</p>
<p>Loc=â€(49.5000123,-123.5000123)â€<br />
Loc=â€Marker1â€<br />
Or relative to a marker;<br />
Loc=â€Marker4 (+0.0023,-0.0023)â€<br />
Or relative to a default plane;<br />
Loc=â€Default(+0.213,-0.123)â€</p>
<p>The AR browsers could then handle the association between the marker&#8217;s pattern and its name.<br />
This way the markers are reusable; users do not need unique markers to be printed for every new bit of AR they want to look at.<br />
Users could just keep a set of generic markers handy, which they could simply assign to be Marker1, Marker2, etc. for any AR use. (Note 2: As mentioned above, specific markers could also contain a default ID name and channel built into their data, letting the Arn browser simply prompt the user if they want to see the model even if they aren&#8217;t in the right channel. This set-up would be most useful for paper and even billboard advertising.)</p>
<p>The Default location could be a settable region, or marker, on the client&#8217;s browser that defines a playable/usable area in the field of view. Mostly useful for home use, this could typically be a square region on a user&#8217;s desk.</p>
<p>So, in the chess-game example, the client of the person making the moves simply updates the position relative to the Default every time they move their marker (which is tied to a chess-piece mesh).<br />
Then the non-owners&#8217; client software could automatically display it relative to their Default plane. This would make games like Chess, Checkers, Go, or any other game involving merely moving objects about very intuitive and easy to set up.</p>
<p>So having meshes settable to absolute GPS, marker-relative, or default-relative locations reduces the bother necessary to experience AR content quite considerably, and makes &#8220;non-geo-specific&#8221; AR applications and games trivial to implement.</p>
<p>Next is permissions.</p>
<p>Mesh-permissions would be a simple string saying who else can update the data, if anyone.</p>
<p>e.g.:<br />
Permissions="None"<br />
Permissions="RandomPerson1, RandomPerson2"<br />
Permissions="All"</p>
<p>By default, you could only update or move your own meshes (identified by the ID of first posting). If you attempt to update anyone else&#8217;s, their clients would just ignore it.</p>
<p>Thus in a game of chess, you can only move your own pieces. If you attempted to move your opponent&#8217;s (by reassigning your own marker to their pieces&#8217; IDs), the clients would just ignore that assignment. You&#8217;d only be fooling your own system.<br />
Likewise, when pinning a message in mid-air for your friends to read, no one else can change that message without your permission, although copying it would be easy. (Note 3: It&#8217;s important to note this sort of object-specific permission system is in addition to the global permissions, or &#8220;user modes,&#8221; it&#8217;s possible to set for the IRC channels and users as a whole.)</p>
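<p>The object-specific rule described here is simple enough to sketch directly. Assuming the three Permissions values shown earlier, plus comma-separated name lists, a client-side check might look like this (the function and argument names are illustrative):</p>

```python
def may_update(mesh_owner, permissions, sender):
    """Should an update posted by `sender` be applied to a mesh owned by
    `mesh_owner`, given its Permissions= value? Clients that answer False
    simply ignore the post."""
    if sender == mesh_owner:   # owners can always move their own objects
        return True
    if permissions == "All":   # anyone on the channel may update
        return True
    if permissions == "None":  # nobody but the owner
        return False
    # Otherwise treat the value as a comma-separated list of allowed nicks.
    return sender in [name.strip() for name in permissions.split(",")]
```

<p>Because every client runs the same check, a cheating client that posts forbidden moves only changes its own local view, exactly as in the chess example.</p>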
<p>Finally, as object data could change within all sorts of time-scales, the easiest way to keep everyone logged in up to date is to just have a time-stamp of when each model was last updated.</p>
<p>LastUpdate=â€12/12/0000,2012:12â€</p>
<p>This would not necessarily be the same as the XML string&#8217;s post date, because the model&#8217;s mesh might not have been updated, but merely moved, in which case the Arn browser shouldn&#8217;t redownload the mesh.</p>
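<p>The cache decision is then a plain timestamp comparison. The draft doesn&#8217;t pin down a date format (its example stamp reads as a placeholder), so the &#8220;DD/MM/YYYY,HH:MM&#8221; format assumed below is my own:</p>

```python
from datetime import datetime

# Assumed timestamp format; the draft's example stamp is only a placeholder.
FMT = "%d/%m/%Y,%H:%M"

def needs_redownload(last_update, cached_stamp):
    """True only if the channel's LastUpdate is newer than the stamp cached
    with our local copy, i.e. the mesh itself changed rather than merely moved."""
    return datetime.strptime(last_update, FMT) > datetime.strptime(cached_stamp, FMT)
```

<p>A move-only post keeps the old LastUpdate, so the comparison fails and the browser just repositions its cached mesh.</p>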
<p>This sort of arrangement could be used as a standard today, and users wouldn&#8217;t have to constantly download special AR programs to view a single AR mesh.</p>
<p>In the long term I would hope for more advanced methods to manipulate Arn content online, analogous to DOM manipulation in web pages. But for now, we should at least establish standard methods for devices to pull up meshes and overlay them in the correct position.</p>
<p>So, having a layered system could give the user a seamless blend of dynamic and static data with which to paint their world.<br />
I believe this is all relatively easy to achieve using modifications of existing web technology, combined with some basic graphics systems.<br />
<strong><br />
Local Data:</strong></p>
<p>However, so far I have only talked about remote data.<br />
What of programs originating on the device itself? This is, after all, how most AR software works at the moment.</p>
<p>I think that, just like the remote channels, local software should also be blended into the same list of layers. People shouldn&#8217;t have to &#8220;Alt+Tab&#8221; out of one view of the world to see another.<br />
They should be able to see both at once, if they wish.</p>
<p>For instance, if you&#8217;re playing an AR game, why shouldn&#8217;t your chat window be viewable at the same time?</p>
<p>If you have skinned your environment with a custom view of the world, why shouldn&#8217;t you also see mapping or restaurant recommendations?</p>
<p>So local data and remote data should be blended in the same view.<br />
How can AR software &#8211; of which, I hope, there will be thousands &#8211; be expected to seamlessly layer its graphics, not only with the real world, but with each other, and with online data too? Will games and software makers need to co-operate to allow their graphics to be integrated together with correct occlusion taken into account? A tall order, no?</p>
<p>I must confess though, my technology knowledge fails me here.</p>
<p>I can only guess that special graphics drivers, or 3D APIs, will have to be developed to let programs share their 3D world with that of an Arn browser.<br />
Maybe programs should simply treat themselves as a local server which the browser can connect to, and let the Arn handle all the rendering itself (although I imagine many games designers would find this quite limiting).<br />
So I leave it as an exercise to the readers to discuss and propose the best methods by which this vision of a layered world could be realised.</p>
<p><strong>Beyond IRC:</strong></p>
<p>As mentioned before, IRC has some drawbacks, which are due to its age or method of working.<br />
As such, future systems might yet prove better alternatives for an open AR network.<br />
One example of such a system is Google Wave.<br />
It shares many of the advantages of IRC (open, anyone can create a channel of data, different permission levels can be set, and it&#8217;s free), while avoiding some critical restrictions. (The data can be persistent.)<br />
I believe some of the ideas I&#8217;ve mentioned, and possibly even the proposed protocol string, could be adapted for Google Wave or other future systems.<br />
I believe overall the principles are more important than any specific implementation to get to them.<br />
<strong><br />
Summary;</strong></p>
<p>âƒÂ Â  Â In order for AR to flourish the user shouldn&#8217;t need to download a separate application for each mesh they want to see.<br />
âƒÂ Â  Â  Having url&#8217;s embedded into QRCoded markers which point to standard mesh files like dxf or kml would be a way to do this right now.Â  The QR code would only have to be seen preciselyÂ  in shot once, then its borders could be used like a standard marker.</p>
<p>âƒÂ Â  Â An augmented view of the world needs to support visual multitasking, and havingÂ  layers of information is the best way to do that.<br />
âƒ<br />
âƒÂ Â  Â Methods need to be devised to allow drastically different software to contribute to these layers, without restricting either the software&#8217;s rendering ability&#8217;s, or the users ability to pick and choose what layers of information he wants to see.<br />
<strong><br />
Last point;</strong></p>
<p>I am absolutely confident in my belief that AR will become at least as important as the web has been, and probably a lot more so. It will also face much the same hurdles and challenges getting established as that medium did.<br />
But, speaking as a web developer, can we try to avoid a browser war this time?</p>
<p>Everything Everywhere, draft.<br />
by Thomas Wrobel<br />
Darkflame a t gmail</p>
]]></content:encoded>
			<wfw:commentRss>http://www.ugotrade.com/2009/08/19/everything-everywhere-thomas-wrobels-proposal-for-an-open-augmented-reality-network/feed/</wfw:commentRss>
		<slash:comments>34</slash:comments>
		</item>
	</channel>
</rss>
