<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>UgoTrade &#187; Tish Shute</title>
	<atom:link href="http://www.ugotrade.com/tag/tish-shute/feed/" rel="self" type="application/rss+xml" />
	<link>http://www.ugotrade.com</link>
	<description>Augmented Realities at the Edge of the Network</description>
	<lastBuildDate>Wed, 25 May 2016 15:59:56 +0000</lastBuildDate>
	<language>en-US</language>
		<sy:updatePeriod>hourly</sy:updatePeriod>
		<sy:updateFrequency>1</sy:updateFrequency>
	<generator>https://wordpress.org/?v=3.9.40</generator>
	<item>
		<title>Augmented World Expo 2013: It&#8217;s a wrap!</title>
		<link>http://www.ugotrade.com/2013/07/09/augmented-world-expo-2013-its-a-wrap/</link>
		<comments>http://www.ugotrade.com/2013/07/09/augmented-world-expo-2013-its-a-wrap/#comments</comments>
		<pubDate>Tue, 09 Jul 2013 03:38:56 +0000</pubDate>
		<dc:creator><![CDATA[Tish Shute]]></dc:creator>
				<category><![CDATA[Ambient Devices]]></category>
		<category><![CDATA[Ambient Displays]]></category>
		<category><![CDATA[Ambient Findability]]></category>
		<category><![CDATA[Android]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Augmented Data]]></category>
		<category><![CDATA[Augmented Reality]]></category>
		<category><![CDATA[Big Data]]></category>
		<category><![CDATA[data science]]></category>
		<category><![CDATA[digital public space]]></category>
		<category><![CDATA[GeoFencing]]></category>
		<category><![CDATA[gestural interface]]></category>
		<category><![CDATA[home energy monitors]]></category>
		<category><![CDATA[Instrumenting the World]]></category>
		<category><![CDATA[internet of things]]></category>
		<category><![CDATA[Linden Lab]]></category>
		<category><![CDATA[Linked Data]]></category>
		<category><![CDATA[mirror worlds]]></category>
		<category><![CDATA[Mixed Reality]]></category>
		<category><![CDATA[mobile augmented reality]]></category>
		<category><![CDATA[Mobile Reality]]></category>
		<category><![CDATA[Mobile Technology]]></category>
		<category><![CDATA[nanotechnology]]></category>
		<category><![CDATA[new urbanism]]></category>
		<category><![CDATA[Participatory Culture]]></category>
		<category><![CDATA[Philip Rosedale]]></category>
		<category><![CDATA[social gaming]]></category>
		<category><![CDATA[social media]]></category>
		<category><![CDATA[Amber Case]]></category>
		<category><![CDATA[augmented reality eyewear]]></category>
		<category><![CDATA[Augmented World Expo]]></category>
		<category><![CDATA[AWE2013]]></category>
		<category><![CDATA[Ben Cerveny]]></category>
		<category><![CDATA[connected hardware]]></category>
		<category><![CDATA[gesture interaction]]></category>
		<category><![CDATA[Google Glass]]></category>
		<category><![CDATA[hardware startups]]></category>
		<category><![CDATA[Mike Kuniavsky]]></category>
		<category><![CDATA[Ori Inbar]]></category>
		<category><![CDATA[Steve Mann]]></category>
		<category><![CDATA[Tish Shute]]></category>
		<category><![CDATA[wearables]]></category>
		<category><![CDATA[Will Wright]]></category>

		<guid isPermaLink="false">http://www.ugotrade.com/?p=6600</guid>
		<description><![CDATA[Augmented World Expo 2013 was really an amazing experience. I&#8217;m co-founder and co-organizer of the conference, along with Ori Inbar, so it has meant a lot to me to see our event grow over the last four years, and thrilling to make such a big splash this year. There were 1,163 attendees, and the expo [&#8230;]]]></description>
				<content:encoded><![CDATA[<p><iframe width="560" height="315" src="//www.youtube.com/embed/4d0k_7pdPGg" frameborder="0" allowfullscreen></iframe></p>
<p><iframe width="560" height="315" src="//www.youtube.com/embed/NQ-g0Jimg7I" frameborder="0" allowfullscreen></iframe></p>
<p><iframe width="560" height="315" src="//www.youtube.com/embed/9GxVQREssdY" frameborder="0" allowfullscreen></iframe></p>
<p><a href="http://augmentedworldexpo.com/" target="_blank">Augmented World Expo 2013</a> was really an amazing experience. I&#8217;m co-founder and co-organizer of the conference, along with Ori Inbar, so it has meant a lot to me to see our event grow over the last four years, and it was thrilling to make such a big splash this year. There were 1,163 attendees, and the expo showcased an ecosystem of emerging technologies &#8211; augmented reality, gesture interaction, eyewear, wearables, and connected hardware of many stripes &#8211; that marks the beginning of natural computing entering the mainstream. It was a unique opportunity to get up close and personal with what it feels like to be an augmented human in an augmented world!</p>
<p>Videos of AWE 2013&#8217;s 35 hours of educational sessions and inspirational keynotes are now available on <strong><a href="http://www.youtube.com/user/AugmentedRealityOrg/videos?view=0&amp;shelf_index=0&amp;sort=dd&amp;tag_id=" target="_self">our YouTube channel</a></strong>. I am sharing <a href="http://www.youtube.com/watch?v=9GxVQREssdY">my own talk</a> (my slides are also up <a href="http://www.slideshare.net/TishShute/augmented-humansaugmentedworld">on SlideShare here</a>) and a few of my favorites in this post, but there are far too many to post here, so please browse further on the Augmented World Expo YouTube channel.</p>
<p>One notable high point of AWE2013, for me, was the showcase sponsored by <a href="http://www.meta-view.com/about">Meta</a>, a startup developing the first device allowing visualization of, and interaction with, 3D virtual objects in the real world using your hands. It was made possible by generous contributions from the private collections of Paul Travers, Dan Cui, Steven Feiner, Steve Mann, and Chris Grayson, and by passionate volunteers who are helping advance the industry. Sean Hollister of The Verge did an excellent report on the eyewear showcase, <a href="http://www.theverge.com/2013/6/9/4409940/35-years-of-wearable-computing-history-at-augmented-world-expo-2013">35 years of wearable computing history at Augmented World Expo 2013</a>. For more on Meta, see <a href="http://news.cnet.com/8301-11386_3-57584739-76/meta-glasses-bring-3d-and-your-hands-into-the-picture/">this article by Dan Farber</a>.</p>
<p>My colleagues at <a href="http://www.syntertainment.com/">Syntertainment</a>, Will Wright, Avi Bar-Zeev, Jason Shankel, and Lauren Elliott, all gave great talks. Ironically, we&#8217;re not building augmented reality apps or H/W; we all just happen to continue to be very interested in the field.</p>
<p>Thank you to everyone for supporting the event! </p>
<p>The press coverage was truly extensive:</p>
<p style="text-align: left;"><a href="http://www.theverge.com/2013/6/9/4410406/in-the-shadow-of-google-glass-at-augmented-world-expo-2013">In the shadow of Google Glass, an augmented reality industry revs its engines<br />
</a>The Verge, Sean Hollister, June 9, 2013, <a href="http://topsy.com/www.theverge.com/2013/6/9/4410406/in-the-shadow-of-google-glass-at-augmented-world-expo-2013">271 Tweets</a></p>
<p><a href="http://news.cnet.com/8301-11386_3-57588128-76/the-next-big-thing-in-tech-augmented-reality/">The next big thing in tech: Augmented reality<br />
</a>CNET, Dan Farber, June 7, 2013<br />
Picked up on <a href="http://currentnewsdaily.com/the-next-big-thing-in-tech-augmented-reality/">Current News Daily<br />
</a><a href="http://topsy.com/news.cnet.com/8301-11386_3-57588128-76/the-next-big-thing-in-tech-augmented-reality/">350 Tweets</a></p>
<p><a href="http://thepersuaders.libsyn.com/awe-2013-conference-report-augmented-reality-and-marketing">AWE 2013 Conference Report: Augmented Reality and Marketing<br />
</a>The Persuaders Marketing Podcast on Dublin City FM, June 23, 2013</p>
<p><a title="AR Dirt Podcast &#8211; Episode 26 &#8211; Ori Inbar AWE2013 Extravaganza Recap" rel="bookmark" href="http://www.ardirt.com/general-news/ar-dirt-podcast-episode-26-ori-inbar-awe2013-extravaganza-recap.html">AR Dirt Podcast &#8211; Ori Inbar AWE2013 Extravaganza Recap<br />
</a>AR Dirt by Joseph Rampolla, June 18, 2013</p>
<p><a href="http://www.theverge.com/2013/6/9/4409940/35-years-of-wearable-computing-history-at-augmented-world-expo-2013">35 years of wearable computing history at Augmented World Expo 2013<br />
</a>The Verge, Sean Hollister, June 9, 2013<br />
<a href="http://topsy.com/www.theverge.com/2013/6/9/4409940/35-years-of-wearable-computing-history-at-augmented-world-expo-2013">7 Tweets</a></p>
<p><a href="http://www.wired.com/beyond_the_beyond/2013/06/augmented-reality-bruce-sterling-keynote-at-augmented-world-expo-2013/">Augmented Reality: Bruce Sterling, keynote at Augmented World Expo 2013<br />
</a>Wired, Bruce Sterling, June 9, 2013<br />
<a href="http://topsy.com/www.wired.com/beyond_the_beyond/2013/06/augmented-reality-bruce-sterling-keynote-at-augmented-world-expo-2013/">9 Tweets</a></p>
<p><a href="http://doc-ok.org/?p=598">On the road for VR: Augmented World Expo 2013<br />
</a>Doc-Ok, Staff, June 7, 2013<br />
<a href="http://topsy.com/trackback?url=http%3A%2F%2Fdoc-ok.org%2F%3Fp%3D598">3 Tweets</a></p>
<p><a href="http://www.wassom.com/my-interview-from-augmented-world-expo-2013-video.html">My Interview from Augmented World Expo 2013 [VIDEO] </a><a href="http://wassom.com/">Wassom.com</a>, Brian Wassom, June 7, 2013</p>
<p><a href="http://zenfri.com/2013/06/augmented-world-expo/">Augmented World Expo</a><br />
ZenFri, Staff, June 7, 2013</p>
<p><a href="http://www.fbnsantos.com/?p=9634">AWE2013: Hardware for an augmented world</a><br />
FBNSantos.com, Felipe Neves Dos Santos, June 6, 2013</p>
<p><a href="http://investorplace.com/2013/06/augmented-reality-will-be-the-new-reality/">Augmented Reality Will Be the New Reality</a><br />
InvestorPlace, Brad Moon, June 6, 2013</p>
<p><a href="http://www.techhive.com/article/2040837/wearable-computing-pioneer-steve-mann-who-watches-the-watchmen-.html">Wearable computing pioneer Steve Mann: Who watches the watchmen?</a><br />
TechHive, Armando Rodriguez, June 6, 2013</p>
<p><a href="http://abclocal.go.com/kgo/video?id=9127769">Expo puts augmented reality in the limelight</a><br />
ABC 7 News, Jonathan Bloom, June 5, 2013</p>
<p><a href="http://www.dvice.com/2013-6-5/these-oled-microdisplays-are-future-augmented-reality">These OLED microdisplays are the future of augmented reality</a><br />
DVICE, Evan Ackerman, June 5, 2013</p>
<p><a href="http://www.engadget.com/2013/06/05/visualized-history-of-augmented-and-virtual-reality-eyewear/?utm_medium=feed&amp;utm_source=Feed_Classic&amp;utm_campaign=Engadget">Visualized: a history of augmented and virtual reality eyewear</a><br />
Engadget, Michael Gorman, June 5, 2013</p>
<p><a href="http://www.papitv.com/wikitude-announces-wikitude-studio-and-in-house-developed-ir-tracking-engine">Wikitude announces Wikitude Studio and in-house developed IR &amp; Tracking engine</a><br />
PapiTV, KC Leung, June 5, 2013</p>
<p><a href="http://www.usatoday.com/story/tech/personal/2013/06/05/augmented-reality-expo-google-glass/2392769/">Augmented reality expo aims for sci-fi future today</a><br />
USA Today, Marco della Cava, June 5, 2013</p>
<p><a href="http://www.wired.com/beyond_the_beyond/2013/06/augmented-reality-high-dynamic-range-hdr-video-image-processing-for-digital-glass/">Augmented Reality: High Dynamic Range (HDR) Video Image Processing For Digital Glass</a><br />
Wired, Bruce Sterling, June 5, 2013</p>
<p><a href="http://allthingsd.com/20130604/will-wright-at-augmented-reality-conference-dont-augment-reality-decimate-it/">Will Wright at Augmented Reality Conference: Don&#8217;t Augment Reality, Decimate It</a><br />
AllThingsD, Eric Johnson, June 4, 2013</p>
<p><a href="http://news.cnet.com/8301-11386_3-57587672-76/philip-rosedales-second-life-with-high-fidelity/">Philip Rosedale&#8217;s Second Life with High Fidelity</a><br />
CNET, Dan Farber, June 4, 2013</p>
<p><a href="http://www.pcworld.com/article/2040801/google-glass-competitors-vie-for-attention-as-industry-grows.html">Google Glass competitors vie for attention as industry grows</a><br />
PC World, Zack Miners for IDG News Service, June 4, 2013</p>
<p><a href="http://daqri.com/press_posts/press-release-4d-augmented-reality-leader-daqri-announces-15-million-financing-2/#.Ua-RjNhNuSo">4D Augmented Reality Leader Daqri Announces $15 Million Financing</a><br />
Press Release, June 4, 2013</p>
<p><a href="http://www.techzone360.com/topics/techzone/articles/2013/06/03/340432-crowdoptic-powers-lancome-virtual-gallery-app-crowd-powered.htm">CrowdOptic Powers Lancome Virtual Gallery App, Crowd-powered Heat Map</a><br />
TechZone 360, Peter Bernstein, June 3, 2013</p>
<p><a href="http://www.craveculture.net/2013/06/augmented-humans-now/">Augmented humans, enhanced happiness?</a><br />
Crave Culture, Angelica Weihs, June 2, 2013</p>
<p><a href="http://www.metaio.com/press/press-release/2013/metaio-vuzix-to-showcase-ar-ready-smart-glasses-at-the-2013-augmented-world-expo/">Metaio &amp; Vuzix to Showcase AR-Ready Smart Glasses at the 2013 Augmented World Expo</a><br />
Press Release, May 30, 2013</p>
<p><a href="http://qz.com/89467/four-ways-augmented-reality-will-invade-your-life-in-2013/">Four ways augmented reality will invade your life in 2013</a><br />
Quartz, Rachel Feltman, May 30, 2013</p>
<p><a href="http://www.wired.com/beyond_the_beyond/2013/05/augmented-reality-augmented-world-expo-is-next-week/">Augmented Reality: Augmented World Expo&#8482; is next week</a><br />
Wired, Bruce Sterling, May 28, 2013</p>
<p><a href="http://www.prweb.com/releases/candy-lab/augmented-reality/prweb10763283.htm">Strike it Rich with Cachetown and AWE 2013 Playing the Gold Rush 49&#8217;er Challenge In Augmented Reality</a><br />
Press Release, May 24, 2013</p>
<p><a href="http://interact.stltoday.com/pr/lifestyle/PR052413071613074">Local Community College Student Headed to Silicon Valley to Learn More about Augmented Reality</a><br />
St. Louis Post-Dispatch, Staff, May 24, 2013</p>
<p><a href="http://www.cnet.com.au/explore-an-intricate-labyrinth-with-smartphone-ar-339344350.htm">Explore an intricate labyrinth with smartphone AR</a><br />
CNET Australia, Michelle Starr, May 21, 2013</p>
<p><a href="http://thechronicleherald.ca/business/1130672-dartmouth-firm-lands-super-app">Dartmouth firm lands super app</a><br />
Herald Business, Remo Zaccagna, May 21, 2013</p>
<p><a href="http://siliconangle.com/blog/2013/05/17/augmented-world-expo-2013-the-future-of-augmented-reality/">Augmented World Expo 2013 &#8211; The Future of Augmented Reality</a><br />
Silicon Angle, Saroj Kar, May 17, 2013</p>
<p><iframe width="560" height="315" src="//www.youtube.com/embed/o6L3dcsLEto" frameborder="0" allowfullscreen></iframe></p>
<p><iframe width="560" height="315" src="//www.youtube.com/embed/FhLx7k07Pa4" frameborder="0" allowfullscreen></iframe></p>
<p><iframe width="560" height="315" src="//www.youtube.com/embed/ON7VUzsNcYI" frameborder="0" allowfullscreen></iframe></p>
<p><iframe width="560" height="315" src="//www.youtube.com/embed/qhVdTFcR6TA" frameborder="0" allowfullscreen></iframe></p>
<p><iframe width="560" height="315" src="//www.youtube.com/embed/REoEj-JkDww" frameborder="0" allowfullscreen></iframe></p>
<p><iframe width="560" height="315" src="//www.youtube.com/embed/ohatuq8tekk" frameborder="0" allowfullscreen></iframe></p>
]]></content:encoded>
			<wfw:commentRss>http://www.ugotrade.com/2013/07/09/augmented-world-expo-2013-its-a-wrap/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>Shaping Play with Connected Stuff: IoToaster a prize winner in the YCombinator Upverter Hackathon!</title>
		<link>http://www.ugotrade.com/2013/03/10/shaping-play-with-connected-stuff-iotoaster-a-prize-winner-in-the-ycombinator-upverter-hackathon/</link>
		<comments>http://www.ugotrade.com/2013/03/10/shaping-play-with-connected-stuff-iotoaster-a-prize-winner-in-the-ycombinator-upverter-hackathon/#comments</comments>
		<pubDate>Sun, 10 Mar 2013 01:00:29 +0000</pubDate>
		<dc:creator><![CDATA[Tish Shute]]></dc:creator>
				<category><![CDATA[Ambient Devices]]></category>
		<category><![CDATA[Ambient Displays]]></category>
		<category><![CDATA[Ambient Findability]]></category>
		<category><![CDATA[Android]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Augmented Data]]></category>
		<category><![CDATA[Augmented Reality]]></category>
		<category><![CDATA[Big Data]]></category>
		<category><![CDATA[data science]]></category>
		<category><![CDATA[GeoFencing]]></category>
		<category><![CDATA[GeoMessaging]]></category>
		<category><![CDATA[Hadoop]]></category>
		<category><![CDATA[Instrumenting the World]]></category>
		<category><![CDATA[internet of things]]></category>
		<category><![CDATA[Mixed Reality]]></category>
		<category><![CDATA[mobile augmented reality]]></category>
		<category><![CDATA[Mobile Reality]]></category>
		<category><![CDATA[New Interfaces]]></category>
		<category><![CDATA[smart appliances]]></category>
		<category><![CDATA[Smart Devices]]></category>
		<category><![CDATA[Smart Planet]]></category>
		<category><![CDATA[social gaming]]></category>
		<category><![CDATA[social media]]></category>
		<category><![CDATA[ubiquitous computing]]></category>
		<category><![CDATA[Adam Wilson]]></category>
		<category><![CDATA[AR eyewear]]></category>
		<category><![CDATA[augmented reality eyewear]]></category>
		<category><![CDATA[Connected Stuff]]></category>
		<category><![CDATA[Dave Bisceglia]]></category>
		<category><![CDATA[hardware]]></category>
		<category><![CDATA[hardware startups]]></category>
		<category><![CDATA[Parsing Reality]]></category>
		<category><![CDATA[Parsing Reality Shaping Play with Connected Stuff]]></category>
		<category><![CDATA[Phu Nguyen]]></category>
		<category><![CDATA[robotic gaming systems]]></category>
		<category><![CDATA[robots]]></category>
		<category><![CDATA[robots and play]]></category>
		<category><![CDATA[romotive, orbotix]]></category>
		<category><![CDATA[smart phones and robots]]></category>
		<category><![CDATA[social augmented experiences]]></category>
		<category><![CDATA[social games]]></category>
		<category><![CDATA[SXSW]]></category>
		<category><![CDATA[SXSW interactive]]></category>
		<category><![CDATA[the future of play]]></category>
		<category><![CDATA[The Tap Lab]]></category>
		<category><![CDATA[Tish Shute]]></category>
		<category><![CDATA[ubicomp]]></category>
		<category><![CDATA[YCombinator]]></category>
		<category><![CDATA[YCombinator Upverter Hackathon]]></category>

		<guid isPermaLink="false">http://www.ugotrade.com/?p=6580</guid>
		<description><![CDATA[We had so much fun at the YCombinator Upverter Hackathon. I was honored to be part of &#8220;the beatles&#8221; team (Sam Cuttriss, Josh Cardenas, Jason Appelbaum, Lauren Elliott, Tish Shute, Otto Leichliter III &#38; IV) that produced the prize-winning IoToaster. Rick Merritt did an awesome write-up in EE Times, Slideshow: Y Combinator hackathon&#8217;s [&#8230;]]]></description>
				<content:encoded><![CDATA[<p>We had so much fun at the <a href="http://upverter.com/hackathons/yc-hackathon-2013/" target="_blank">YCombinator Upverter Hackathon</a>. I was honored to be part of &#8220;the beatles&#8221; team (Sam Cuttriss, Josh Cardenas, Jason Appelbaum, Lauren Elliott, Tish Shute, Otto Leichliter III &amp; IV) that produced the prize-winning IoToaster. Rick Merritt did an awesome write-up in EE Times, <a href="http://www.eetimes.com/electronics-news/4408238/Slideshow--Toaster-burns-in-Instagrams-at-hackathon?pageNumber=0" target="_blank">Slideshow: Y Combinator hackathon&#8217;s prize-winning designs</a>. If you want to hear more about hardware startups shaping play with connected stuff, I hope you will stop by <a href="http://schedule.sxsw.com/2013/events/event_IAP5412" target="_blank">Parsing Reality: Shaping Play with Connected Stuff</a>, Tuesday, March 12th, 12:30pm&#8211;1:30pm, Radisson Town Lake Ballroom, Austin, SXSW 2013. I&#8217;m delighted to join Adam Wilson, Founder and Chief Software Architect of <a href="https://www.gosphero.com/company/" target="_blank">Orbotix</a>; Dave Bisceglia, Co-Founder &amp; CEO of <a href="http://thetaplab.com/" target="_blank">The Tap Lab</a>; and Phu Nguyen, Founder of <a href="http://romotive.com/" target="_blank">Romotive Inc</a>, to talk about shaping play with connected stuff &#8211; <a href="http://schedule.sxsw.com/2013/events/event_IAP5412" target="_blank">more details here</a>.</p>
<p>Meanwhile, enjoy Rick Merritt&#8217;s great write-up of IoToaster (<a href="http://www.eetimes.com/electronics-news/4408238/Slideshow--Toaster-burns-in-Instagrams-at-hackathon?pageNumber=0" target="_blank">reprinted from EE Times</a>).</p>
<blockquote>
<h2><span style="font-weight: normal;">&#8220;Y Combinator hackathon&#8217;s prize-winning designs&#8221;</span></h2>
</blockquote>
<p>&#8220;An Internet toaster, two pairs of faux Google glasses, and two novel electronic gloves emerged from a hackathon organized by Upverter and hosted by Y Combinator. <span style="font-family: Arial;">SAN JOSE, Calif. &#8211; Imagine sending an Instagram to your Internet toaster and printing it&#8212;on whole wheat or white bread. Imagine creating your own vision for a variant of Google&#8217;s Project Glass.</span></p>
<p>Those were among the 32 projects from more than 130 designers at a recent all-day event organized by Upverter.com and hosted by Y Combinator, a startup incubator in Mountain View, Calif.</p>
<p>Winners took home iPads, Pebble watches, Arduino kits, and Raspberry Pi boards after dedicating about 10 hours of their Saturday to hacking on their best ideas. Some took with them hopes of products that could make it to market, or newly formed teams that could be the heart of a new startup. Others just had a good time.</p>
<p>Hereâ€™s a look at some of the winners.</p>
<div><img src="http://m.eet.com/media/1179469/1%20glasses%20with%20woman.jpg" alt="" /></div>
<div>
<p><strong><span style="font-family: Arial;">Two teams worked on variants of Google&#8217;s $1,500 glasses-mounted computer. One team (above) used laser-cut medium-density fibreboard and embedded LEDs that could indicate when the wearer faced north. Another team (below) created Prism, a more thorough knock-off of Google&#8217;s concept, complete with an embedded display and gesture recognition.</span></strong></p>
<p><strong> </strong><strong><img src="http://m.eet.com/media/1179470/1%20thanh%20with%20glasses%20x%20420.jpg" alt="" /><br />
</strong></p>
</div>
<div><span style="font-family: Arial;"><strong>Photos courtesy of Kuy Mainwaring and Sam Wurzel of Octopart.</strong></span></div>
<p><strong>Printing on whole wheat or white</strong></p>
<div><img src="http://m.eet.com/media/1179471/1%20toast.jpg" alt="" /></div>
<p>The IO Toaster (above) is sort of the Reese&#8217;s Peanut Butter Cup of social electronics. It&#8217;s an Internet-connected combo toaster/printer that its creators say can &#8220;bring the cloud to your breakfast.&#8221;</p>
<p>The team adapted code from an LED matrix to control heat transmission down to the pixel level. They hope to present the device at the Augmented World Expo at SXSW as well as at other hackathons and hardware meetups.</p>
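<p>A hypothetical sketch of that LED-matrix-style control (the function names, grid values, and 0&#8211;255 grayscale threshold are illustrative assumptions, not the team&#8217;s actual code): a grayscale image becomes a binary &#8220;burn mask,&#8221; and each row is then scanned the way a multiplexed LED matrix is driven.</p>

```python
# Hypothetical sketch, not the IoToaster team's code: map an image to a
# binary "burn mask", then scan it row by row like a multiplexed LED
# matrix, heating only the columns marked 1 in each row.

def image_to_burn_mask(pixels, threshold=128):
    """Map rows of 0-255 grayscale values to a burn mask:
    1 = heat this spot (dark pixel), 0 = leave the bread alone."""
    return [[1 if v < threshold else 0 for v in row] for row in pixels]

def scan_rows(mask):
    """Yield (row_index, columns_to_heat), one row per scan step,
    mimicking the row-at-a-time refresh used to drive an LED matrix."""
    for y, row in enumerate(mask):
        yield y, [x for x, on in enumerate(row) if on]

# A 2x2 checkerboard: dark pixels (0) become 1 in the mask.
mask = image_to_burn_mask([[0, 255], [255, 0]])
# mask == [[1, 0], [0, 1]]
```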
<p>The team included Sam Cuttriss, Josh Cardenas, Tish Shute, Lauren Elliott, Jason Appelbaum and both Otto Leichliter III and IV.</p>
<div><img src="http://m.eet.com/media/1179472/1%20toaster%20engineer.jpg" alt="" /></div>
<p><strong>Peripherals and apps for the IO Toaster</strong></p>
<div><img src="http://m.eet.com/media/1179473/1%20toast%20face%20x%20420.jpg" alt="" /></div>
<p>The potential for the IO Toaster is great, said team members, who brainstormed spin-off products including:</p>
<ul>
<li>FaceToast: Your friends&#8217; Facebook status messages pop up automatically at breakfast.</li>
<li>Instagram Toast: Patented sepia tone filters add artistic textures to photos (above). Too grainy?</li>
<li>Toasted, Augmented Reality: Toast revitalizes boring QR codes (below).</li>
<li>Pop Tweets: Twitter toaster pastries. Follow your favorite fruit flavor.</li>
<li>FlipToast: Create an edible FlipBook with a carb-hinge technology in development.</li>
<li>Angry Toast: A hyper sling-and-gimbal add-on hurls slices at kids trying to leave for school without breakfast.</li>
</ul>
<p><img src="http://m.eet.com/media/1179474/1%20toast%20q%20code%20x%20420.jpg" alt="" /><br />
<strong>Touch screen toaster displays</strong><br />
<iframe width="640" height="360" src="http://www.youtube.com/embed/OOSM8y7vuvA?feature=player_embedded" frameborder="0" allowfullscreen></iframe><br />
Designers of the IO Toaster created this animation to show the romantic possibilities of their product.</p>
<p><strong>Grand prize was a real grabber</strong></p>
<div><img src="http://m.eet.com/media/1179475/1%20hand%20thing.jpg" alt="" /></div>
<div><strong>The Tactilus is a haptic feedback glove for interacting with 3-D environments. A series of cables applies pressure to the wearer&#8217;s fingers to resist their motion in response to pushing against a virtual object.</strong></div>
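<p>The cable resistance described above can be modeled as a one-sided spring: no tension while the finger is outside the virtual surface, and tension proportional to penetration depth once it pushes in. A minimal hypothetical sketch (the stiffness value and function name are assumptions, not details of the Tactilus design):</p>

```python
def cable_tension(finger_pos, surface_pos, stiffness=5.0):
    """One-sided spring model for a haptic finger cable (hypothetical,
    not the Tactilus team's code): tension is zero until the finger
    position passes the virtual surface, then grows linearly with
    penetration depth to resist further motion."""
    penetration = finger_pos - surface_pos
    return stiffness * penetration if penetration > 0 else 0.0
```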
<div><img src="http://m.eet.com/media/1179476/1%20hand%20thing%202.jpg" alt="" /></div>
<p><strong>Meet the Tactilus team</strong></p>
<div><img src="http://m.eet.com/media/1179477/1%20tactilous%20team.jpg" alt="" /></div>
<div><strong>Jack Minardy had the idea to create a haptic glove. Five strangers who stopped by his table and liked the idea became a virtual team for the day, bringing Tactilus to life. They are (from left) Matt Bigarani, Nick Bergseng, Jack Minardy, Neal Mueller and Tom Sherlock. Not pictured: Oren Bennett.</strong></div>
<p><strong>Fitness glove has something up its sleeve</strong></p>
<div><img src="http://m.eet.com/media/1179478/1%20glove.jpg" alt="" /></div>
<div><strong>The Body API is a comprehensive metric-gathering device that gives the sports enthusiast a big data boost.</strong></div>
<p><strong>Baby gets a robo rocker</strong></p>
<div><img src="http://m.eet.com/media/1179479/1%20rocker.jpg" alt="" /></div>
<div><strong>One team prototyped its invention for an automatic baby rocker using an electric can opener. Parents can control it via a mobile app.<br />
</strong></div>
<p><strong>And other winners were&#8230; </strong><br />
At the end of the day, 30 groups took two minutes each to pitch their hack (below), some of which the judges tossed into the circular file. A handful of others got various levels of recognition.</p>
<p>The winner in the most marketable category was the DIYNot, a plug that fits between your recharging device and the socket to turn off the two-amp energy flow anytime you want. The Window Blind Controller, a clip-on device that keeps streetlight out at night and lets sunlight in during the day, got a nod from judges.</p>
<p>Judges also liked the Walkmen, an ultrasound virtual walking stick with haptic feedback for guiding disabled people. A team from Electric Imp got the Corporate Shill Award for a networked dispenser that spits out M&amp;Ms in response to tweets. Another group added Wi-Fi links to home switches, opening a circuit for new kinds of remote controls&#8212;and pranks.</p>
<div><img src="http://m.eet.com/media/1179480/1%20presentations.jpg" alt="" /></div>
<p><strong>From here to China and back</strong></p>
<div><img src="http://m.eet.com/media/1179481/1%20zak%20and%20matt.jpg" alt="" /></div>
<div><strong>Zack Hormuth of Upverter.com (left), organizer for the event, helps hacker Matt Sarnoff. Upverter <a href="http://www.eetimes.com/electronics-news/4405202/Slideshow--Hangin--at-a-hardware-hackathon">led a hackathon</a> at Facebook&#8217;s Open Compute Summit. It also has hackathons in the works for New York City and Shenzhen.&#8221;</strong></div>
]]></content:encoded>
			<wfw:commentRss>http://www.ugotrade.com/2013/03/10/shaping-play-with-connected-stuff-iotoaster-a-prize-winner-in-the-ycombinator-upverter-hackathon/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>Augmented Awareness &amp; Reality Games, ARE2012</title>
		<link>http://www.ugotrade.com/2012/05/09/augmented-awareness-reality-games-are2012/</link>
		<comments>http://www.ugotrade.com/2012/05/09/augmented-awareness-reality-games-are2012/#comments</comments>
		<pubDate>Wed, 09 May 2012 18:12:41 +0000</pubDate>
		<dc:creator><![CDATA[Tish Shute]]></dc:creator>
				<category><![CDATA[Ambient Devices]]></category>
		<category><![CDATA[Ambient Displays]]></category>
		<category><![CDATA[Ambient Findability]]></category>
		<category><![CDATA[Android]]></category>
		<category><![CDATA[Artificial general Intelligence]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Augmented Data]]></category>
		<category><![CDATA[Augmented Reality]]></category>
		<category><![CDATA[Big Data]]></category>
		<category><![CDATA[data science]]></category>
		<category><![CDATA[digital public space]]></category>
		<category><![CDATA[GeoFencing]]></category>
		<category><![CDATA[GeoMessaging]]></category>
		<category><![CDATA[gestural interface]]></category>
		<category><![CDATA[Hadoop]]></category>
		<category><![CDATA[Instrumenting the World]]></category>
		<category><![CDATA[internet of things]]></category>
		<category><![CDATA[ipad]]></category>
		<category><![CDATA[iphone]]></category>
		<category><![CDATA[Linked Data]]></category>
		<category><![CDATA[mirror worlds]]></category>
		<category><![CDATA[Mixed Reality]]></category>
		<category><![CDATA[mobile augmented reality]]></category>
		<category><![CDATA[mobile meets social]]></category>
		<category><![CDATA[Mobile Reality]]></category>
		<category><![CDATA[New Interfaces]]></category>
		<category><![CDATA[new urbanism]]></category>
		<category><![CDATA[Web Meets World]]></category>
		<category><![CDATA[AR]]></category>
		<category><![CDATA[ARE2012]]></category>
		<category><![CDATA[Augmented Awareness]]></category>
		<category><![CDATA[augmented experiences]]></category>
		<category><![CDATA[Cold Reading]]></category>
		<category><![CDATA[CosPlay]]></category>
		<category><![CDATA[Dimensions App]]></category>
		<category><![CDATA[facial recognition]]></category>
		<category><![CDATA[Game Design]]></category>
		<category><![CDATA[Gaming Reality]]></category>
		<category><![CDATA[global possibility space]]></category>
		<category><![CDATA[Google Project Glass]]></category>
		<category><![CDATA[Improv and Game Design]]></category>
		<category><![CDATA[Integrated Games]]></category>
		<category><![CDATA[Life Based Games]]></category>
		<category><![CDATA[Life Games]]></category>
		<category><![CDATA[location based games]]></category>
		<category><![CDATA[New Aesthetic]]></category>
		<category><![CDATA[new aesthetic of artificial intelligence]]></category>
		<category><![CDATA[Qualified Self]]></category>
		<category><![CDATA[quantified self]]></category>
		<category><![CDATA[reality games]]></category>
		<category><![CDATA[social shopping]]></category>
		<category><![CDATA[The Future of AR eyewear]]></category>
		<category><![CDATA[time-based games]]></category>
		<category><![CDATA[TimeHop]]></category>
		<category><![CDATA[Tish Shute]]></category>
		<category><![CDATA[Weavrs]]></category>
		<category><![CDATA[Where 2012]]></category>

		<guid isPermaLink="false">http://www.ugotrade.com/?p=6527</guid>
		<description><![CDATA[Augmented Awareness &#38; Reality Games, ARE2012 View more PowerPoint from Tish Shute ARE2012 is being live streamed this year, and the wrap-up fireside chat between Bruce Sterling and Daniel Suarez and a surprise stupid fun grand finale is still to come. We have a live stream this year so you can see for [&#8230;]]]></description>
				<content:encoded><![CDATA[<div style="width:425px" id="__ss_12853433"> <strong style="display:block;margin:12px 0 4px"><a href="http://www.slideshare.net/TishShute/augmented-awareness-reality-games" title="Augmented Awareness &amp; Reality Games, ARE2012" target="_blank">Augmented Awareness &amp; Reality Games, ARE2012</a></strong> <iframe src="http://www.slideshare.net/slideshow/embed_code/12853433" width="425" height="355" frameborder="0" marginwidth="0" marginheight="0" scrolling="no"></iframe>
<div style="padding:5px 0 12px"> View more <a href="http://www.slideshare.net/thecroaker/death-by-powerpoint" target="_blank">PowerPoint</a> from <a href="http://www.slideshare.net/TishShute" target="_blank">Tish Shute</a> </div>
</div>
<p><a href="http://augmentedrealityevent.com/">ARE2012</a> is being live streamed this year, and the wrap-up fireside chat between Bruce Sterling and Daniel Suarez and a surprise stupid fun grand finale is still to come.  We have <a href="http://augmentedrealityevent.com/stream/index.2.php">a live stream this year</a> so you can see for yourself!  You can also catch up on any sessions you have missed, including the video of my talk, Augmented Awareness and Reality Games.  My slides are here and my speaker notes are below, enjoy!</p>
<p>1. Hi, my name is Tish Shute. Currently I am working with Will Wright and Stupid Fun Club on a new genre of personally aware mobile games that move away from the idea that games are a way to escape reality. If you want to know more about what I mean by Reality Architect, please feel free to look up my TEDx Silicon Alley talk, <a href="http://www.youtube.com/watch?v=pBRa4gJPLHo">&#8220;On Becoming a Reality Architect&#8230;&#8221;</a>.</p>
<p>2. As Will puts it, &#8220;games are getting more and more personal to the point that our actual lives are becoming the most interesting gaming platform.&#8221;  Personally Aware Games, Life Based Gaming, or Integrated Games are expressions just beginning to emerge to describe this idea that our lives are the most interesting gaming platform.</p>
<p>3. <a href="http://www.ugotrade.com/2012/04/25/where-2012-will-wright-gaming-reality/">Will Wright&#8217;s talk</a> at Where 2012 is a must see.  He pointed to a turning point for mobile gaming &#8211; a shift for games from being about simulating reality to being about parsing reality.</p>
<p>4. The ghosts of AR past. Bruce Sterling at ARE2010 mentioned that AR eyewear was haunted by the spectre of AR&#8217;s Gothic stepsister &#8211; virtual reality &#8211; while Jesse Schell, on the other hand, probed AR&#8217;s aspirations as the ubiquitous all-seeing data eye &#8211; the man with the x-ray eyes. As Jesse put it, &#8220;You guys are going to put it together&#8230;and then everybody is going to be like, oh my god we are freaking naked, all this information about me is out there&#8230;I had security through obscurity, but not anymore&#8230;&#8221;</p>
<p>5. Yes, it seems we have put it all together. The ubiquitous all-seeing data eye &#8211; our x-ray eyes &#8211; has turned out to be carried around in our pockets or integrated into our clothes, and eyewear is not yet ubiquitous, at least not yet. But, for the moment, we are looking at the most intimate aspects of our lives only as an opportunity for optimization and efficiency (though there are some interesting apps and products emerging &#8211; try out the Heart Rate app: hold your finger up against the camera and you will get a pretty accurate reading). As the explorations of makers, hackers, and self trackers move out into consumer culture, the quantified self is ripe for new forms of expression: http://www.electricfoxy.com/projects/modwells/ The term &#8220;gamification&#8221; has been worn out already. We sense its shallow inadequacy. So what&#8217;s next? </p>
<p>6. There is barely a trace of AR&#8217;s Gothic stepsister VR in the Google glasses pitch, which is super simple and seems to be aimed at optimizing Pinterest-like social shopping experiences by taking photos and videos from your direct eye-line and disseminating them through Google+.  No mention of mapping, tracking, and registration, or of how they are working the hands-free part yet &#8211; all I&#8217;ve seen for input is nods so far. Is eye movement tracking up next &#8211; or what? Thrun was pretty down on the AR ghosts &#8211; the man with the x-ray eyes stuff (I&#8217;m already feeling nostalgic for classic AR!).  But seeing with shared eyes is what makes AR technology super interesting, as Jesse Schell pointed out at ARE2010: &#8220;The internet allowed us to think with shared memory&#8230;Augmented Reality will allow us to see with shared eyes.&#8221;  Applying our design chops to this possibility space seems like a pretty good project to me. Bruce has always said that AR should be more about creating experiences than the technology.</p>
<p>7.  And we do need new forms of expression in our digital culture, where technologies of seeing are primarily technologies of watching used for power and control.</p>
<p>8.  If you haven&#8217;t already drunk at the New Aesthetic fountain you have some googling to do after this session &#8211; start with James Bridle&#8217;s Tumblr and Bruce Sterling&#8217;s essay, http://www.wired.com/beyond_the_beyond/2012/04/ perhaps. James Bridle might have already closed the New Aesthetic tumblr, but this collection of images is a provocation to explore the possibilities of feedback loops between people and machines &#8211; a reflexive augmented awareness where we play with modes of digital seeing. I think AR and digital seeing need a New Aesthetic more than most technologies, because augmentation implies that we have an idea of what is aesthetically valid at a given time and place, that we have a position on the difference between augmented and degraded reality, and between machinomorphic and anthropomorphic modes of perception. Howie Woo&#8217;s &#8220;in <a href="http://woowork.blogspot.ca/2012/03/in-yo-face-facial-recognition.html">yo face facial recognition</a>&#8221; project (pic in my opening slide too) uses crochet + cunning to transform facial recognition into a reality game.</p>
<p>9.  Reality Games can give us new opportunities to explore the free play in the systems of our lives. Ayse Birsel, a friend and brilliant designer from New York City, has been showing people, in a series of innovative workshops, how to bring powerful design tools to their lives &#8211; to design not necessarily a better life but at least an original life, beginning with a method of deconstruction, reconstruction, and visualization. The goal of an original life, rather than an optimized, more efficient life, challenges AR and reality game designers to explore the possibility space of our lives.</p>
<p>10. We are already parsing our lives through powerful digital filters. Foursquare has shown us the power of the fundamental change to maps that has at its center the notion that &#8220;you are here&#8221;. See <a href="http://www.youtube.com/watch?v=Tzlv69lGrtQ">Adam Greenfield&#8217;s Where 2012</a> talk for a deeper understanding of the significance of this change to mapping. While location is a powerful filter to parse what Will calls the GPS &#8211; &#8220;global possibility space&#8221; &#8211; of our lives, it is not the only one: http://dornob.com/you-are-here-3-real-life-works-of-digital-map-inspired-art/</p>
<p>11. Time is another powerful filter for our lives and games. Jonathan Blow&#8217;s Braid explores how time can be manipulated in different game worlds. </p>
<p>12. Cosplay (or costume role playing) is different from earlier incarnations such as renaissance fairs or civil war reenactments in its integration into the present. In Tokyo a commuting hub turns into a cosplay mecca every Sunday and, as AT Wilson puts it, &#8220;turns a non-place to a place.&#8221;</p>
<p>13. &#8220;[TimeHop] sends users a daily e-mail reminder of what they did a year ago, and it does so by retracing the subscriber&#8217;s digital footsteps on Facebook, Twitter, Instagram and Foursquare.&#8221; http://www.nytimes.com/2012/01/08/fashion/timehop-a-new-online-service-tells-you-what-you-were-doing-a-year-ago.html</p>
<p>14. Reality Games have, of course, predated a machine readable world. This book on Cold Reading by Ian Rowland parses the rules of the game that enable &#8220;psychics&#8221; and &#8220;fortune tellers&#8221; to deploy techniques that border on actual mind reading: http://www.thecoldreadingbook.com/ Life&#8217;s players &#8211; &#8220;pick up artists&#8221; &#038; &#8220;psychics&#8221; and &#8220;con-artists&#8221; &#8211; are master gamers of the intimate social dynamics of life, but NLP and semantic tech are bringing digital seeing to the kind of intimate social dynamics that are the domain of cold reading.</p>
<p>15. Status games are a core dynamic of life. The great sociologist Erving Goffman devoted his career to analyzing the face-to-face relations of everyday life. Goffman described everyday social life as a strategic game that could be understood through the metaphors of the stage &#8211; front stage and back stage. But, as we parse reality, digital hierarchies and the abstractions of data viz begin to control the information flow and create a new stage for status games that demands a different kind of awareness of what is back stage and what is front stage in our social lives.</p>
<p>16. We are entering a new era of social intelligence where people and algorithms are interacting in interesting new ways. OKCupid has been getting a lot of attention for offering social intelligence that can help us play better in our dating lives. Did you know your profile narratives can reveal whether you like rough or gentle sex?</p>
<p>17. We are also beginning to see an interesting New Aesthetic for Artificial Intelligence &#8211; the expressive interaction between algorithms and people. SIRI, for example, is no cold reader, but she does have a more developed character than Google voice.<br />
 Jeff Kramer has <a href="http://www.realityaugmentedblog.com/">an excellent post on Weavrs</a> &#8211; personality-based social web robots.  I like weavrs a lot because they are out there at the edge with their exploration of the expressive power of bots. Bots shape our algorithmic world from call centers to Wall Street, but we have barely begun to explore their expressive potential.<br />
Weavrs exist on their own. You can ask them questions, but you can&#8217;t tell them, for example, &#8216;I like this, post more like this.&#8217; Weavrs are social web bots that evolve and grow without your direct hand guiding them. But as <a href="http://www.realityaugmentedblog.com/2012/05/life-in-the-weavrs-web/">Jeff Kramer notes in his interesting post</a> on Reality Augmented,  </p>
<p>&#8220;it&#8217;s also obvious that having more full featured persona creation/control options is going to be a big part of the future of social bots too.&#8221;</p>
<p>18. The eruption of the digital into the physical is a catch phrase for The New Aesthetic. And <a href="http://dimensions.rjdj.me/">RjDj&#8217;s Dimensions app</a> and awesome Inception app are, I think, exemplary explorations of new aesthetic dimensions for Sonic AR. The Dimensions app pulls data from your surroundings &#8211; including movement, time of day and microphone input &#8211; to give you a very personal experience that adjusts to and transforms your environment and actions.</p>
<p>19. Improv practitioners are early explorers of Reality Games. The Life Game is one of Keith Johnstone&#8217;s projects, and his books on Improv have been a great source of inspiration for RPG players and game designers. A CMU student visiting Stupid Fun Club once asked Will what he should do to be a better game designer, and Will said: study Improv!</p>
]]></content:encoded>
			<wfw:commentRss>http://www.ugotrade.com/2012/05/09/augmented-awareness-reality-games-are2012/feed/</wfw:commentRss>
		<slash:comments>3</slash:comments>
		</item>
		<item>
		<title>ScreenBurn Presents Will Wright&#8217;s Stupid Fun Club: SXSW Interactive 2012</title>
		<link>http://www.ugotrade.com/2012/01/23/sxsw-interactive-2012-screenburn-presents-will-wrights-stupid-fun-club/</link>
		<comments>http://www.ugotrade.com/2012/01/23/sxsw-interactive-2012-screenburn-presents-will-wrights-stupid-fun-club/#comments</comments>
		<pubDate>Mon, 23 Jan 2012 01:30:11 +0000</pubDate>
		<dc:creator><![CDATA[Tish Shute]]></dc:creator>
				<category><![CDATA[Ambient Devices]]></category>
		<category><![CDATA[Ambient Displays]]></category>
		<category><![CDATA[Ambient Findability]]></category>
		<category><![CDATA[Android]]></category>
		<category><![CDATA[architecture of participation]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Augmented Data]]></category>
		<category><![CDATA[Augmented Reality]]></category>
		<category><![CDATA[Big Data]]></category>
		<category><![CDATA[data science]]></category>
		<category><![CDATA[digital public space]]></category>
		<category><![CDATA[GeoFencing]]></category>
		<category><![CDATA[GeoMessaging]]></category>
		<category><![CDATA[Instrumenting the World]]></category>
		<category><![CDATA[internet of things]]></category>
		<category><![CDATA[ipad]]></category>
		<category><![CDATA[iphone]]></category>
		<category><![CDATA[Linked Data]]></category>
		<category><![CDATA[mirror worlds]]></category>
		<category><![CDATA[Mixed Reality]]></category>
		<category><![CDATA[mobile augmented reality]]></category>
		<category><![CDATA[mobile meets social]]></category>
		<category><![CDATA[Mobile Reality]]></category>
		<category><![CDATA[New Interfaces]]></category>
		<category><![CDATA[Participatory Culture]]></category>
		<category><![CDATA[Rainbows End]]></category>
		<category><![CDATA[Real Time Big data]]></category>
		<category><![CDATA[Semantic Web]]></category>
		<category><![CDATA[social gaming]]></category>
		<category><![CDATA[social media]]></category>
		<category><![CDATA[ubiquitous computing]]></category>
		<category><![CDATA[Web Meets World]]></category>
		<category><![CDATA[websquared]]></category>
		<category><![CDATA[Gamepocalypse]]></category>
		<category><![CDATA[gamification]]></category>
		<category><![CDATA[geosocial games]]></category>
		<category><![CDATA[location based games]]></category>
		<category><![CDATA[location based services]]></category>
		<category><![CDATA[machine learning]]></category>
		<category><![CDATA[mobile local social]]></category>
		<category><![CDATA[mobile social]]></category>
		<category><![CDATA[personal gaming]]></category>
		<category><![CDATA[Reality Architect]]></category>
		<category><![CDATA[ScreenBurn]]></category>
		<category><![CDATA[social games]]></category>
		<category><![CDATA[social utilities]]></category>
		<category><![CDATA[Stupid Fun Club]]></category>
		<category><![CDATA[SXSW 2012]]></category>
		<category><![CDATA[SXSW interactive]]></category>
		<category><![CDATA[Tish Shute]]></category>
		<category><![CDATA[Will Wright]]></category>

		<guid isPermaLink="false">http://www.ugotrade.com/?p=6464</guid>
		<description><![CDATA[I am super excited to be speaking at SXSW Interactive 2012, as part of Will Wright&#8217;s Stupid Fun Club, on &#8220;A Lifestyle with a Gaming Sense.&#8221; Michael Trice just did a post on our session for SXSW.com, Screen Burn Panels at the Palmer Presents Will Wright&#8217;s Stupid Fun Club. The photos of Will Wright, Tish [&#8230;]]]></description>
				<content:encoded><![CDATA[<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2012/01/SXSW-WillWright-TishShute-PeterSwearengen.png"><img src="http://www.ugotrade.com/wordpress/wp-content/uploads/2012/01/SXSW-WillWright-TishShute-PeterSwearengen-300x148.png" alt="" title="SXSW-WillWright-TishShute-PeterSwearengen" width="300" height="148" class="alignnone size-medium wp-image-6465" /></a></p>
<p>I am super excited to be speaking at SXSW Interactive 2012, as part of Will Wright&#8217;s <a href="http://www.stupidfunclub.com/">Stupid Fun Club</a>, on <a href="http://schedule.sxsw.com/2012/events/event_IAP12616">&#8220;A Lifestyle with a Gaming Sense.&#8221;</a> Michael Trice just did a post on our session for SXSW.com, <a href="http://sxsw.com/node/9969">Screen Burn Panels at the Palmer Presents Will Wright&#8217;s Stupid Fun Club</a>.  The photos of Will Wright, Tish Shute (me!) and Peter Swearengen are by Anya Zavarzina.  Thank you, Anya, for such great photos!</p>
<p>I have been too busy to blog much lately, but there is a lot to unpack in future posts from my quote in Michael&#8217;s SXSW post!</p>
<blockquote><p>&#8220;Really we&#8217;ve entered a new era where the world has become a platform for storytelling and the goal is to turn everyday life into an opportunity for play, relatedness, and new forms of autonomy and fun. We&#8217;ve now come to a point where software has moved out of the computer and into the world. Rather than viewing this process in terms we&#8217;ve already grown out of, like gamification, we view this as an opportunity to explore everyday activities as possibility spaces.&#8221;</p></blockquote>
<p>For the complete post, including Peter Swearengen of Stupid Fun Club on StoryMaker, see here <a href="http://sxsw.com/node/9969">http://sxsw.com/node/9969 </a> </p>
]]></content:encoded>
			<wfw:commentRss>http://www.ugotrade.com/2012/01/23/sxsw-interactive-2012-screenburn-presents-will-wrights-stupid-fun-club/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>Interview with Vernor Vinge: Smart phones and Empowering Aspects of Social Networks &amp; Augmented Reality Still Massively Underhyped</title>
		<link>http://www.ugotrade.com/2011/05/10/interview-with-vernor-vinge-smart-phones-and-the-empowering-aspects-of-social-networks-augmented-reality-are-still-massively-underhyped/</link>
		<comments>http://www.ugotrade.com/2011/05/10/interview-with-vernor-vinge-smart-phones-and-the-empowering-aspects-of-social-networks-augmented-reality-are-still-massively-underhyped/#comments</comments>
		<pubDate>Tue, 10 May 2011 18:21:06 +0000</pubDate>
		<dc:creator><![CDATA[Tish Shute]]></dc:creator>
				<category><![CDATA[Ambient Devices]]></category>
		<category><![CDATA[Ambient Displays]]></category>
		<category><![CDATA[Ambient Findability]]></category>
		<category><![CDATA[architecture of participation]]></category>
		<category><![CDATA[Artificial general Intelligence]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Artificial Life]]></category>
		<category><![CDATA[Augmented Data]]></category>
		<category><![CDATA[Augmented Reality]]></category>
		<category><![CDATA[Big Data]]></category>
		<category><![CDATA[data science]]></category>
		<category><![CDATA[digital public space]]></category>
		<category><![CDATA[evolutionary technologies]]></category>
		<category><![CDATA[GeoFencing]]></category>
		<category><![CDATA[GeoMessaging]]></category>
		<category><![CDATA[gestural interface]]></category>
		<category><![CDATA[Instrumenting the World]]></category>
		<category><![CDATA[internet of things]]></category>
		<category><![CDATA[ipad]]></category>
		<category><![CDATA[iphone]]></category>
		<category><![CDATA[mirror worlds]]></category>
		<category><![CDATA[Mixed Reality]]></category>
		<category><![CDATA[mobile augmented reality]]></category>
		<category><![CDATA[mobile meets social]]></category>
		<category><![CDATA[Mobile Reality]]></category>
		<category><![CDATA[New Interfaces]]></category>
		<category><![CDATA[new urbanism]]></category>
		<category><![CDATA[Open Data]]></category>
		<category><![CDATA[social gaming]]></category>
		<category><![CDATA[social media]]></category>
		<category><![CDATA[technological singularity]]></category>
		<category><![CDATA[Web Meets World]]></category>
		<category><![CDATA[websquared]]></category>
		<category><![CDATA[World 2.0]]></category>
		<category><![CDATA[A Fire Upon the Deep]]></category>
		<category><![CDATA[AR Vision]]></category>
		<category><![CDATA[augmented cognition]]></category>
		<category><![CDATA[Augmented Reality Contact Lenses]]></category>
		<category><![CDATA[augmented reality event]]></category>
		<category><![CDATA[augmented reality eyewear]]></category>
		<category><![CDATA[augmented reality social networks]]></category>
		<category><![CDATA[augmented social experiences]]></category>
		<category><![CDATA[augmented vision]]></category>
		<category><![CDATA[bottom up social networking]]></category>
		<category><![CDATA[Bruce Sterling]]></category>
		<category><![CDATA[Daemon]]></category>
		<category><![CDATA[Daniel Suarez]]></category>
		<category><![CDATA[digital gaia]]></category>
		<category><![CDATA[Fast Times at Fairmont High]]></category>
		<category><![CDATA[Freedom (TM)]]></category>
		<category><![CDATA[HUDs]]></category>
		<category><![CDATA[intelligence amplification]]></category>
		<category><![CDATA[Maneki Neko]]></category>
		<category><![CDATA[Rainbows End]]></category>
		<category><![CDATA[smart phones]]></category>
		<category><![CDATA[The Singularity]]></category>
		<category><![CDATA[Tish Shute]]></category>
		<category><![CDATA[ubicomp]]></category>
		<category><![CDATA[ubiquitous computing]]></category>
		<category><![CDATA[Vernor Vinge]]></category>
		<category><![CDATA[visual search]]></category>
		<category><![CDATA[wearable computing]]></category>

		<guid isPermaLink="false">http://www.ugotrade.com/?p=6277</guid>
		<description><![CDATA[Interview with Vernor Vinge Tish Shute: Many of the pioneers of the emerging AR industry who will be speaking at, and attending Augmented Reality Event, consider &#8220;Rainbows End&#8221; one of their key inspirations. [Note: If you want to attend ARE2011 readers of this post can use my discount code TISH295 ($295 for two days, or [&#8230;]]]></description>
				<content:encoded><![CDATA[<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2011/04/Screen-shot-2011-04-13-at-12.51.38-PM.png"><img class="alignnone size-medium wp-image-6200" title="Screen shot 2011-04-13 at 12.51.38 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2011/04/Screen-shot-2011-04-13-at-12.51.38-PM-200x300.png" alt="" width="200" height="300" /></a><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2011/05/VernorVinge_RainbowsEnd.jpg"><img class="alignnone size-medium wp-image-6314" title="VernorVinge_RainbowsEnd" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2011/05/VernorVinge_RainbowsEnd-196x300.jpg" alt="" width="196" height="300" /></a></p>
<h3>Interview with Vernor Vinge</h3>
<p><strong>Tish Shute: </strong> Many of the pioneers of the emerging AR industry who will be speaking at, and attending <a href="http://augmentedrealityevent.com/" target="_blank">Augmented Reality Event,</a> consider <a href="http://www.amazon.com/Rainbows-End-Novel-Foot-Future/dp/0312856849" target="_blank">&#8220;Rainbows End&#8221;</a> one of their key inspirations. [Note: If you want to attend ARE2011 readers of this post can use my discount code <a href="http://augmentedrealityevent.com/register/" target="_blank">TISH295</a> ($295 for two days, or for one day only <a href="http://augmentedrealityevent.com/register/" target="_blank">TISH1DAY11</a> for $149]</p>
<p>What is the best and worst, in your view, about the way Augmented Reality is emerging from science fiction into science fact?</p>
<p><strong>Vernor Vinge:</strong> <strong>Progress that sets the stage:<br />
The worldwide market penetration of cellphones in the era 2000-2010 was of a size and speed that would have counted as foolish implausibility even in science-fiction of earlier times. More than half the human race suddenly had access to knowledge and comms. Being in the middle of this firestorm of progress, we can&#8217;t really judge ultimate effects, but I expect that smart phones and the empowering aspects of social networks and AR are still massively underhyped. (This is not to say that individual innovation enterprises can&#8217;t fail; the treasure is there for those who dare, and ultimately the whole human race can benefit.)</strong></p>
<p><strong>But I can still whine:<br />
Some &#8212; mostly political/legal &#8212; issues are disappointing. These affect AR but also the broad range of our progress with technology:<br />
o Software patents and some styles of cloud computing are blunting the ability of average people to innovate. In the 2010-2020 era, average people should have the building blocks to empower them to create (and throw away at the end of the workday) tools that in olden times would have been the whole purpose of a business startup.<br />
Unfortunately, some companies restrict and compartmentalize their releases like we&#8217;re still living in the twentieth century.<br />
There are also some mostly tech issues that I&#8217;m impatient with (speaking as a never-satisfied consumer and fan:)<br />
o The low pixel counts in contemporary head up displays.<br />
o The poor position coordination in current HUDs.<br />
o The lack of mass market acceptance of HUDs.<br />
o The lack of progress in distributed store-and-forward between<br />
mobile devices (sub-femtocell, ad hoc and transitory forwarding).<br />
o The lack of progress in uniform solutions to centimeter-scale<br />
localization.</strong></p>
<p><strong>Tish Shute:</strong> What do you feel will be the most impactful application of AR in people&#8217;s everyday lives?</p>
<p><strong>Vernor Vinge: There are nebulous and fairly high likelihood answers: AR apps that let each person/team see those aspects of physical reality that are important for their current activity. Pointing technologies that coordinate with that AR vision. The combination is a revolution of interfaces, and the probable physical disappearance of more and more of the gadgets that twentieth century people associated with high tech.</strong></p>
<p><strong> </strong></p>
<p><strong>There are also more specific, spectacular, and necessarily uncertain impacts (that depend on social acceptance and the development of network infrastructure for consensual sharing of local imagery).<br />
o Economic disruption of the trend toward huge, expensive display devices.<br />
o Bottom up social networking, arising from GPL&#8217;d tools. I see this as very disruptive, in good, bad and arguable ways, as illustrated by descriptive terms such as &#8220;consumer protection clubs&#8221;, &#8220;belief circles&#8221; and &#8220;lifestyle cults&#8221;. Some of these could be as public as our topdown social networks. Some might be quiet and widespread, perhaps growing out of pre-existing groups that already have a lot of intermember trust. (See:<a href="http://www-rohan.sdsu.edu/faculty/vinge/C5/index.htm" target="_blank">http://www-rohan.sdsu.edu/faculty/vinge/C5/index.htm</a>)<br />
o More farfetched, but in the tradition of the last 50 years: the digitization of external visual design: building architecture could give less priority to physical appearance and more to cheap physical strength, network access support, and physical modifiability.</strong></p>
<p><strong>Tish Shute: </strong>I interviewed Bruce Sterling earlier this week &#8211; <a href="http://www.ugotrade.com/2011/05/06/augmented-reality-transitioning-out-of-the-old-fashioned-legacy-internet-interview-with-bruce-sterling/" target="_blank">http://www.ugotrade.com/2011/05/06/augmented-reality-transitioning-out-of-the-old-fashioned-legacy-internet-interview-with-bruce-sterling/</a>.  And I&#8217;m really looking forward to your &#8220;fireside chat&#8221; with Bruce at the end of Augmented Reality Event to sum up the event [<a href="http://augmentedrealityevent.com/schedule/" target="_blank">see the full schedule for ARE2011 here</a>].  But was there anything that particularly rang a bell for you in my conversation with Bruce?</p>
<p><strong>Vernor Vinge:</strong> <strong>Bruce says: <em>&#8220;&#8230; it&#8217;s pretty clear that the people who would weep for joy to have Augmented Reality are people whose reality is already damaged. People who need reality augmented as a prosthetic &#8230;&#8221;</em> This really rings a bell with me. And social networks with AR may have a special impact at small sizes, even just _two_ players. At such a scale, they might be better called &#8220;joint entities&#8221; than &#8220;social networks&#8221;. For example, two differently disabled persons, where one is mobile. There&#8217;s a lot more that could be said about this, including applications that could be done (maybe are being done) already.</strong></p>
<p><strong><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2011/05/ar-contact1.jpg"><img class="alignnone size-medium wp-image-6319" title="ar-contact1" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2011/05/ar-contact1-300x279.jpg" alt="" width="300" height="279" /></a><br />
</strong></p>
<p><em><a href="http://spectrum.ieee.org/biomedical/bionics/augmented-reality-in-a-contact-lens/0">Picture via IEEE Spectrum: Augmented Reality in a Contact Lens</a></em></p>
<p><strong>Tish Shute: </strong>As <a href="http://augmentedrealityevent.com/2010/08/25/are2010-keynote-by-jesse-schell-augmented-reality-will-define-the-21st-century/" target="_blank">Jesse Schell pointed out last year at ARE2010</a>, &#8220;The whole point of AR is to see things from a different point of view &#8230; How can there be a more powerful art form than one that actually changes what you see?&#8221;</p>
<p>The magic lens of the smart phone, screens &#8211; large and small, projection, audio and sensory devices are mediating our AR experiences today.  Bruce pointed out last year in his opening keynote, that these less immersive forms of AR have their own merits.</p>
<p>But eyewear has always been integral to the big vision of AR.  Do you see some interesting futures for AR without eyewear?  And, How long before AR eyewear is part of our everyday lives?<br />
<strong>Vernor Vinge: This importance of vision is a visionist claim :-), but for the majority of us who have sight, binocular vision is by far the highest bitrate input we have, and we have enormously sophisticated wetware for analyzing what we see. Current display tech is far short of fully exploiting this input channel.</strong></p>
<p><strong> </strong></p>
<p><strong>Along the way to this goal, I expect we&#8217;ll pass through mini-eras of exploiting the best-available tech. Right now, that is the tablet and the smartphone. Sometimes I almost wish for slower progress: in the nineteenth century, you could profitably spend your tech lifetime mastering one mechanism (for instance, black-and-white silver halide photography). The whole world would benefit from your career. Now, we rattle through the mini-eras so fast that we never fully exploit what&#8217;s zooming past before we&#8217;re on to the next stage.</strong></p>
<p><strong>How fast (or if) HUDs like in Rainbows End show up will probably depend on network and localizer tech as much as the HUDs themselves, with clear generational differences within such eyewear. In fact, it&#8217;s fun to imagine the mini-eras you could get with different combinations of HUD tech, localization, and networking.</strong></p>
<p><strong>(Aside, a quibble: I think AR should not be restricted to visual only. There are tactile and kinesthetic possibilities, at least.)</strong></p>
<p><strong>(Aside, a whine: If only we had an output channel with the bitrate and flexibility of vision! Wearables plus voice and gesture could do some of that. Going further might involve scary human re-engineering. In <a href="http://www.fictionwise.com/ebooks/eBook4380.htm" target="_blank">Fast Times at Fairmont High</a>, I speculated that a small re-engineering (eidetic memory) could give a form of high-rate output, simply by allowing selection from very large menus.)</strong></p>
<p><strong>Tish Shute:</strong> Augmented Reality and Ubiquitous Computing are intimately connected. Is a distinction between AR and Ubicomp still useful? (This recent PARC blog post: <a href="http://blogs.parc.com/blog/2010/03/defining-ubiquitous-computing-vs-augmented-reality/" target="_blank">http://blogs.parc.com/blog/2010/03/defining-ubiquitous-computing-vs-augmented-reality/</a> takes a look at the definitions.)</p>
<p><strong>Vernor Vinge: In a literal sense there is a distinction, and there is enough technical challenge in AR to justify specialists spending all their time with AR. But Augmented Reality&#8217;s importance to humanity is in its role as a portal to the power of ubicomp and human cooperation.</strong></p>
<p><strong><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2011/05/TechnologicalSingularity.jpg"><img class="alignnone size-medium wp-image-6317" title="TechnologicalSingularity" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2011/05/TechnologicalSingularity-200x300.jpg" alt="" width="200" height="300" /></a><br />
</strong></p>
<p><strong>Tish Shute:</strong> Augmented Reality, as we understand it now, is a human-centered experience.  But even now some of the most important aspects of our lives are governed by machine-to-machine intelligences that operate for the most part beyond the reach of human perception, e.g., the trading bots of Wall Street.  What role can augmented reality play in better mediating between human intelligence and machine-to-machine intelligence?  Does AR hasten the arrival of the technological singularity?</p>
<p><strong>Vernor Vinge: I see four or five concurrently active paths to the Singularity:<br />
a) Artificial Intelligence: We create superhuman artificial intelligence in computers.<br />
b) Digital Gaia: The worldwide network of embedded microprocessors, sensors, effectors, and localizers becomes a superhumanly intelligent entity.<br />
c) Internet Scenario: Humanity with its networks, computers, and databases becomes a superhuman being. (Bruce&#8217;s story <a href="http://www.amazon.com/Good-Old-Fashioned-Future-Bruce-Sterling/dp/0553576429" target="_blank">&#8220;Maneki Neko&#8221;</a> is a beautiful and subtle illustration of this possibility.)<br />
d) Intelligence Amplification: We enhance individual human intelligence through human-to-computer interfaces.<br />
e) Biomedical: We directly increase our intelligence by improving the neurological function of our brains. (I regard this last item to be the weakest of the possibilities.)</strong></p>
<p><strong>AR is central to progress with possibilities (c) and (d).<br />
If we humans want to keep our hand in the game, AR is an important thing to pursue.</strong></p>
<p><strong>Tish Shute: </strong>Powerful computer vision apps are emerging for smart phones and face recognition technologies are beginning to appear in consumer apps.  Do you think we need a major shift in the way we handle data ownership?  And, is there &#8220;a real risk of our augmented reality world being owned by interests which are not our own&#8221;? (See my conversation with Anselm Hook last year: <a href="http://www.ugotrade.com/2010/01/17/visual-search-augmented-reality-and-a-social-commons-for-the-physical-world-platform-interview-with-anselm-hook" target="_blank">http://www.ugotrade.com/2010/01/17/visual-search-augmented-reality-and-a-social-commons-for-the-physical-world-platform-interview-with-anselm-hook</a>.)</p>
<p><strong>Vernor Vinge: Yes, there is such a risk. (See also my political/legal comments in response to your question (1).)<br />
More broadly, I see DRM and the Law being used to reify our intellectual heritage as permanent private property. If this could work, it would be the biggest grab in history &#8212; and a major roadblock on human progress.</strong></p>
<p><strong>But even setting aside all the open/closed/free ideological questions, there is another important issue here: anytime laws are passed making popular and easily accomplished behavior illegal, things get very ugly. It may seem frivolous to compare this to the first stages of the War on Drugs, but that&#8217;s where serious enforcement would lead.</strong></p>
<p><strong>Tish Shute:</strong> We have seen gestural interfaces go mainstream in the last year.  What are the most interesting innovations with gestural interfaces that you have seen in recent months? What sessions will you go to at ARE this year?</p>
<p><strong>Vernor Vinge: I&#8217;m way behind the curve as to what is happening right now. Collecting data points on real hardware and applications is a high priority for me in attending ARE 2011.</strong></p>
<p><strong><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2011/05/the-children-of-the-sky.jpg"><img class="alignnone size-medium wp-image-6322" title="the-children-of-the-sky" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2011/05/the-children-of-the-sky-196x300.jpg" alt="" width="196" height="300" /></a><br />
</strong></p>
<p><strong>Tish Shute:</strong> Are you reading/writing any new fictional literature about AR?  And/or, What design fictions for AR are most interesting to you in the moment?</p>
<p><strong>Vernor Vinge: As to writing: My novel The Children of the Sky should come out this October from Tor Books. It&#8217;s set in the far future and is the sequel to <a href="http://www.amazon.com/Fire-Upon-Deep-Vernor-Vinge/dp/0812515285" target="_blank">A Fire Upon the Deep</a>. Alas, the story has only indirect connections to our present technological interests.</strong></p>
<p><strong>As to reading: I got a big kick out of Daniel Suarez&#8217;s duology <a href="http://www.goodreads.com/book/show/4699575-daemon" target="_blank">Daemon</a> and <a href="http://search.barnesandnoble.com/Freedom/Daniel-Suarez/e/9780525951575" target="_blank">Freedom(TM)</a>.</strong></p>
]]></content:encoded>
			<wfw:commentRss>http://www.ugotrade.com/2011/05/10/interview-with-vernor-vinge-smart-phones-and-the-empowering-aspects-of-social-networks-augmented-reality-are-still-massively-underhyped/feed/</wfw:commentRss>
		<slash:comments>9</slash:comments>
		</item>
		<item>
		<title>Augmented Reality &#8211; Transitioning out of the old-fashioned &#8220;Legacy Internet&#8221;: Interview with Bruce Sterling</title>
		<link>http://www.ugotrade.com/2011/05/06/augmented-reality-transitioning-out-of-the-old-fashioned-legacy-internet-interview-with-bruce-sterling/</link>
		<comments>http://www.ugotrade.com/2011/05/06/augmented-reality-transitioning-out-of-the-old-fashioned-legacy-internet-interview-with-bruce-sterling/#comments</comments>
		<pubDate>Fri, 06 May 2011 22:23:38 +0000</pubDate>
		<dc:creator><![CDATA[Tish Shute]]></dc:creator>
				<category><![CDATA[Ambient Devices]]></category>
		<category><![CDATA[Ambient Displays]]></category>
		<category><![CDATA[Ambient Findability]]></category>
		<category><![CDATA[architecture of participation]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Augmented Data]]></category>
		<category><![CDATA[Augmented Reality]]></category>
		<category><![CDATA[Big Data]]></category>
		<category><![CDATA[culture of participation]]></category>
		<category><![CDATA[data science]]></category>
		<category><![CDATA[digital public space]]></category>
		<category><![CDATA[gestural interface]]></category>
		<category><![CDATA[Instrumenting the World]]></category>
		<category><![CDATA[internet of things]]></category>
		<category><![CDATA[ipad]]></category>
		<category><![CDATA[iphone]]></category>
		<category><![CDATA[Mixed Reality]]></category>
		<category><![CDATA[mobile augmented reality]]></category>
		<category><![CDATA[mobile meets social]]></category>
		<category><![CDATA[Mobile Reality]]></category>
		<category><![CDATA[New Interfaces]]></category>
		<category><![CDATA[new urbanism]]></category>
		<category><![CDATA[privacy and online identity]]></category>
		<category><![CDATA[Real Time Big data]]></category>
		<category><![CDATA[Semantic Web]]></category>
		<category><![CDATA[Smart Planet]]></category>
		<category><![CDATA[social gaming]]></category>
		<category><![CDATA[social media]]></category>
		<category><![CDATA[Web 2.0]]></category>
		<category><![CDATA[Web Meets World]]></category>
		<category><![CDATA[websquared]]></category>
		<category><![CDATA[World 2.0]]></category>
		<category><![CDATA[Adam Greenfield]]></category>
		<category><![CDATA[AR and Experience Design]]></category>
		<category><![CDATA[AR hacks]]></category>
		<category><![CDATA[AR Magic]]></category>
		<category><![CDATA[are2010]]></category>
		<category><![CDATA[ARE2011]]></category>
		<category><![CDATA[Augmented Bollywood Reality]]></category>
		<category><![CDATA[augmented reality event]]></category>
		<category><![CDATA[Ben Cerveny]]></category>
		<category><![CDATA[Blaise Aguera y Arcas]]></category>
		<category><![CDATA[Bloom Studio]]></category>
		<category><![CDATA[Bruce Sterling]]></category>
		<category><![CDATA[Design Fiction]]></category>
		<category><![CDATA[Frank Cooper]]></category>
		<category><![CDATA[gestural interfaces]]></category>
		<category><![CDATA[gestural interfaces for augmented reality]]></category>
		<category><![CDATA[Jaron Lanier]]></category>
		<category><![CDATA[Jesper Sparre Andersen]]></category>
		<category><![CDATA[Jesse Schell]]></category>
		<category><![CDATA[Kinect]]></category>
		<category><![CDATA[Layar]]></category>
		<category><![CDATA[Marco Tempest]]></category>
		<category><![CDATA[Metaio]]></category>
		<category><![CDATA[Planetary]]></category>
		<category><![CDATA[TeleHash]]></category>
		<category><![CDATA[The Legacy Internet]]></category>
		<category><![CDATA[The Locker project]]></category>
		<category><![CDATA[Tish Shute]]></category>
		<category><![CDATA[Tom Carden]]></category>
		<category><![CDATA[Tomi Ahonen]]></category>
		<category><![CDATA[Total Immersion]]></category>
		<category><![CDATA[Vernor Vinge]]></category>
		<category><![CDATA[Will Wright]]></category>
		<category><![CDATA[William Gibson]]></category>

		<guid isPermaLink="false">http://www.ugotrade.com/?p=6252</guid>
		<description><![CDATA[Planetary from Bloom Studio, Inc. on Vimeo. It is just over a week until Augmented Reality Event, and I know there are a lot of people, including me (full disclosure: I am co-chair and co-founder) who are totally psyched to see what unfolds there this year.  Bruce Sterling, Vernor Vinge, Blaise Aguera Y Arcas, Jaron [&#8230;]]]></description>
				<content:encoded><![CDATA[<p><iframe src="http://player.vimeo.com/video/23158141?title=0&amp;byline=0&amp;portrait=0" width="400" height="300" frameborder="0"></iframe>
<p><a href="http://vimeo.com/23158141">Planetary</a> from <a href="http://vimeo.com/bloomstudioinc">Bloom Studio, Inc.</a> on <a href="http://vimeo.com">Vimeo</a>.</p>
<p>It is just over a week until <a href="http://augmentedrealityevent.com/" target="_blank">Augmented Reality Event</a>, and I know there are a lot of people, including me (full disclosure: I am co-chair and co-founder), who are totally psyched to see what unfolds there this year.  Bruce Sterling, Vernor Vinge, Blaise Aguera Y Arcas, Jaron Lanier, Will Wright, Marco Tempest and Frank Cooper will join <a title="107 speakers from 76 augmented reality companies on a single stage" href="http://augmentedrealityevent.com/2011/04/24/107-speakers-from-76-augmented-reality-companies-on-a-single-stage/">107 speakers from 76 augmented reality companies on a single stage</a> (<a href="http://www.ugotrade.com/2011/04/13/augmented-reality-event-2011-bruce-sterling-vernor-vinge-will-wright-and-jaron-lanier-to-judge-the-auggies/" target="_blank">see my previous post</a>) to tell a momentous story of a technology of our time.</p>
<p>As Bruce Sterling points out, Augmented Reality is &#8220;<strong>truly a child of the twenty-teens, a genuine digital native,&#8221; </strong> and one visible indication that:</p>
<p><strong>&#8230;the Internet really could look like a &#8220;legacy.&#8221;  The Legacy Internet as an old-fashioned, dusty, desk-based place best left to archivists and librarians, while the action is out on the streets</strong> (see the full interview below).<br />
<a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2011/05/bruce-industrialdecline.jpg"><img src="http://www.ugotrade.com/wordpress/wp-content/uploads/2011/05/bruce-industrialdecline-300x225.jpg" alt="" title="bruce-industrialdecline" width="300" height="225" class="alignnone size-medium wp-image-6299" /></a><br />
(<em>photo by Jasmina Tesanovic</em>)</p>
<p>Opening this post is a video of Ben Cerveny&#8217;s <a href="http://planetary.bloom.io/">Planetary</a> app, which <a href="http://www.wired.com/underwire/2011/05/planetary-ipad-app/" target="_blank">&#8220;turns your music into a universe,&#8221;</a> and enchants all who try it.  Planetary shot to #3 on the Top Ten Free iPad app list soon after its release.</p>
<p>Ben Cerveny&#8217;s talk at Augmented Reality Event will be one of the must-attend talks (<a href="http://augmentedrealityevent.com/schedule/" target="_blank">see the full schedule for Augmented Reality Event here</a>; note that my discount code for Augmented Reality Event, TISH295, is still good if you want to register).</p>
<p>Planetary, while not an AR experience, points the way for AR to take us out of the old-fashioned &#8220;Legacy Internet.&#8221;</p>
<p>&#8220;<a href="http://planetary.bloom.io/">Planetary</a> is just the sort of science fiction experience you expect when using an object from the future like <a href="http://www.wired.co.uk/topics/ipad">iPad</a>,&#8221; developer Bloom Studio writes on the app&#8217;s <a href="http://itunes.apple.com/us/app/planetary/id432462305?mt=8">iTunes page</a> (from Mark Brown&#8217;s <a href="http://www.wired.com/underwire/2011/05/planetary-ipad-app/" target="_blank">Wired post</a>).</p>
<p>In <a href="http://news.cnet.com/8301-13772_3-20058911-52.html" target="_blank">his CNET interview with Daniel Terdiman</a>, Ben describes how popular computing will evolve beyond that <strong>&#8220;dusty, desk-based place best left to archivists and librarians&#8221;</strong> (Bruce Sterling).</p>
<p>Ben points out:</p>
<p><strong>&#8220;The tablet is a total disruption of how we understand popular computing. The next era of experiences will be driven by visceral gesture-based input, and rich fluid responsiveness in native graphics contexts. I see the potential for Bloom to help define a &#8216;killer pattern&#8217; for application design. Because apps have been deconstructed into discrete tasks that flow across devices&#8230;.&#8221;</strong></p>
<p>Bruce Sterling had some interesting comments on the Bloom app:</p>
<p><strong>I&#8217;m a big fan of Ben and his good works in infoviz &#8212; and urban informatics, too.  I admit I&#8217;m not sure I entirely need the metaphor of a solar system in order to play a few Texas blues tracks.  But I could be persuaded.  Ben Cerveny is a significant thinker and a very well-spoken guy.</p>
<p>The thing I consider significant about that remarkable piece of Bloom software is that it uses information visualization as a new breed of control interface.  That&#8217;s not just fancy re-skinning of the same old music-machine pushbuttons. That whole graphic shebang is generated in real-time on the fly.  And you can run code with that, play music, do media with it!  An advance like that is important.</p>
<p>I said at Layar, two years ago, that Augmented Reality would become a real industry when you could design an Augmented Reality system with an Augmented Reality system.  Some people in the audience had startled, &#8220;what the hell? Why would we bother?&#8221; reactions to that notion.  This Bloom piece makes that concept more plausible.</p>
<p>Think of it this way:  if AR is &#8220;real-time interaction that combines virtual data with three-dimensional real spaces,&#8221; then why would you leave that environment, and go to some dusty flat Internet screen to get real work done?  Isn&#8217;t that rather like designing a website on graph paper?  Bloom &#8220;Planetary&#8221; is definitely not Augmented Reality, but it suggests an approach that AR would follow if AR was seizing its own means of production.  It means AR, through AR, by AR, for AR.</p>
<p>I&#8217;m not saying that happens tomorrow; I&#8217;m just saying, why not?  Why not aspire to that?<br />
</strong><br />
I too am a huge fan of the Bloom team: Ben Cerveny, Tom Carden, and Jesper Sparre Andersen (<a href="http://www.ugotrade.com/2011/02/10/jeremie-miller-the-locker-project-give-a-data-platform-to-the-people-in-the-era-of-data-everywhere-and-bloom-presents-fizz/" target="_blank">also see my post here about Fizz, the Bloom team&#8217;s app used by The Locker Project for their Strata demo</a>).  And, if you haven&#8217;t already heard about <a href="http://blog.lockerproject.org/welcome-to-the-locker-project-tlp" target="_blank">The Locker Project</a> and <a href="http://www.telehash.org/about.html" target="_blank">Telehash</a> &#8211; get on it!  This is one of the most important projects of our time &#8211; an infrastructure for a better future!</p>
<p><br /></p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2011/05/bruce-pulpit.jpg"><img src="http://www.ugotrade.com/wordpress/wp-content/uploads/2011/05/bruce-pulpit-186x300.jpg" alt="" title="bruce-pulpit" width="186" height="300" class="alignnone size-medium wp-image-6296" /></a></p>
<h3><strong><strong>Interview with Bruce Sterling by Tish Shute and Ori Inbar</strong></strong></h3>
<p><strong>Tish Shute:</strong> As you so memorably put it, &#8220;AR is a technovisionary dream come true &#8211; something really rare, and you have to be really patient for those&#8230;.&#8221;</p>
<p>What is best and worst, in your view, about the way the Augmented Reality technovisionary dream is coming true and emerging to flourish in the wild?</p>
<p><strong>Bruce Sterling: The best part is that AR is truly happening and is a  lot of fun, and the worst part is that it&#8217;s happening in a Depression.  If AR had broken loose in the dotcom days when cash flew around like soap bubbles, man, that would have been psychedelic.</strong></p>
<p><strong>AR is even more of-our-time than &#8220;social media.&#8221; AR has arisen directly from modern technical factors that just didn&#8217;t use to exist.  It&#8217;s made from shiny new parts, and is truly a child of the twenty-teens, a genuine digital native.  It&#8217;s a little kid and it has to walk before it can run, but it&#8217;s great to see it walking.</strong></p>
<p><strong>Tish Shute:</strong> As Jesse Schell pointed out last year at ARE2010, &#8220;The whole point of AR is to see things from a different point of view&#8230; How can there be a more powerful art form than one that actually changes what you see?&#8221;  What do you feel will be the most impactful application of AR in people&#8217;s everyday lives?</p>
<p><strong>Bruce Sterling:</strong><strong> I&#8217;m all for impact, but it&#8217;s pretty clear that the people who would weep for joy to have Augmented Reality are people whose reality is already damaged.  People who need reality augmented as a prosthetic, in other words, so that they can achieve an &#8220;everyday life.&#8221;  This is like the impactful but underappreciated role of the Internet in the lives of people who&#8217;ve been shut-in.  If you&#8217;re laid-up in a hospital bed, a laptop is a revolution in convalescence.</strong></p>
<p><strong>But that kind of &#8220;impact&#8221; doesn&#8217;t sound too exciting or too profitable.  My guess would be that the biggest arena for &#8220;impactful AR&#8221; would be augmenting cityscapes for foreign people who can&#8217;t speak the local language, can&#8217;t read the signs, and lack time to learn the local reality.  Imagine, say, the Brazilian overlay for Moscow.  You could show up, read your native Brazilian overlay of that city, do your business, eat, sleep, buy, leave, and scarcely &#8220;be in Moscow&#8221; at all.  Constructed right, the AR Brazilian Moscow might even be a better Moscow &#8212; a Moscow that Russians themselves would pay to visit.</strong></p>
<p><strong>Tish Shute: </strong>You pointed out last year, in your opening keynote for ARE2010, that less immersive forms of AR have their own merits.  We are still not seeing much &#8220;head mounted display weirdness&#8221; yet, but many other forms of AR are emerging &#8211; mobile, webcam, projected video, sonic augmented reality, even sticky light.  You noted that practically everything AR is involved in is a transitional technology.  But since you spoke last year at ARE2010, which of these transitional technologies have shown the most promise for AR?</p>
<p><strong>Bruce Sterling: It&#8217;s got to be handsets.  Smartphones.  The stats there are just amazing.  The smartphone biz makes the personal computer business look like a Victorian railroad.  When I read a guy like Tomi Ahonen, who talks about transitioning out of the old-fashioned &#8220;Legacy Internet,&#8221; that idea is startling.  But AR is one visible indication that the Internet really could look like a &#8220;legacy.&#8221;  The Legacy Internet as an old-fashioned, dusty, desk-based place best left to archivists and librarians, while the action is out on the streets.</strong></p>
<p><strong>Tish Shute:</strong> This year we have seen gestural interfaces go mainstream.  What are the most interesting directions for gestural interfaces that you have seen emerge in recent months?</p>
<p><strong>Bruce Sterling:</strong> <strong>To me, the most &#8220;interesting&#8221; part is seeing people do gestural stuff in public.  William Gibson, my fellow author, observes that cellphones have stolen the gestural language of cigarettes.  There&#8217;s lots of fidgeting, box tapping, ash-swiping, slipping boxes in and out of pockets&#8230; People quickly learn to do that without thinking twice, and they forget how weird it looks. It&#8217;s &#8220;design dissolving in behavior,&#8221; as Adam Greenfield puts it.</strong></p>
<p><strong>The gestural hack scene for the Kinect has been amazing.  It&#8217;s like watching 1950s Beatnik dancing go mainstream.</strong></p>
<p><strong>Tish Shute: </strong>You have observed that Augmented Reality is glocal, which not only gives us different flavors of augmented experience but is &#8220;a departure from earlier models of tech startups, where you usually have like three hippies in a local garage.  Now you&#8217;ve got German-American-Korean outfits like Metaio, and Total Immersion has a Russian affiliate.  They&#8217;re inherently multinational, both inside the company and out.&#8221;  What flavors of glocalness do you hope/expect to see at Augmented Reality Event this year?</p>
<p><strong>Bruce Sterling: I&#8217;d be pretty happy to see some AR input from Brazil, India, and South Africa.  I seem to be picking up a lot of followers in my Twitter stream from those locales.  If I saw some Augmented Bollywood Reality, that would pretty much make my day.</strong></p>
<p><strong>Ori Inbar:</strong> What sessions will you go to at ARE this year? Who do you want to meet at ARE 2011?</p>
<p><strong>Bruce Sterling: I make it my business to hang out with artists, but I&#8217;m hoping to drill down more on the technical aspects.  For instance, where exactly are the bottlenecks in building animated augments?  It looks like we&#8217;re about a sneeze away from jamming some crude Hanna-Barbera cartoons into real spaces. But the devil is in the details there.</strong></p>
<p><strong>Ori Inbar:</strong> Your commentary on the evolution of the AR industry over the years has had a significant focus on style. Is the AR industry dressed to kill yet? Any glimpses of promise in that direction?</p>
<p><strong>Bruce Sterling: I&#8217;m not &#8220;pro-style&#8221; in every possible aspect of life, but as an Augmented Reality critic, it&#8217;s clear to me that if you claim to &#8220;augment&#8221; reality, then you should work hard to augment it &#8212; struggle to make it better.  Otherwise you might as well call yourself &#8220;Defaced Reality,&#8221; or even &#8220;3D Spam.&#8221;  When I see that kind of crudity and carelessness in AR, I&#8217;m gonna call people out on it.  I know there will be the AR equivalent of cheesy billboards and gang graffiti, but I never much cared for those, either.</strong></p>
<p><strong>The industry&#8217;s videos have improved radically in the past year and a half.  It used to be all about &#8220;look at my grainy, shaky handheld video of my cool new AR hack,&#8221;  but nowadays the biz has really pulled its socks up.</strong></p>
<p><strong>If AR is about &#8220;experience design,&#8221; as I think it basically is, then eventually, as a matter of intellectual consistency and professional pride, everything you create will be considered  part of &#8220;the experience.&#8221;  That&#8217;s the industry&#8217;s way forward &#8212; that&#8217;s what it would do if it was grown-up.</strong></p>
<p><strong>AR people already look better than most similar geeks in the gaming business, and some day, I really do believe that augmentation people will become glamorous.  They won&#8217;t be supermodels, but they&#8217;ll be about as chic as, say, professional set designers.  Because AR is set design, in a way; it&#8217;s real-time interactive set-design for three-D spaces.</strong></p>
<p><strong>Ori Inbar: </strong>At the Layar launch in 2009 you said &#8220;it&#8217;s the dawn of AR&#8230;&#8221;; at ARE 2010, you followed up on the theme, saying &#8220;it&#8217;s 9am in the AR industry.&#8221; What time is it now?</p>
<p><strong>Bruce Sterling: I&#8217;d be guessing it&#8217;s around 9:30 AM, but come on, that&#8217;s just a metaphor! ARE we all gonna blow off at 4:30 PM and have a beer, or is AR one of those cruel tech startups where nobody ever gets a personal life?</strong></p>
<p><strong>Ori Inbar:</strong> Are you reading any new fictional literature about AR that inspires you?  And/or What interesting design fictions for AR have you come across recently?</p>
<p><strong>Bruce Sterling: Well, I&#8217;m always interested in creative people who just plain make stuff up.  Because that&#8217;s what I commonly do myself.  The stuff that &#8220;inspires&#8221; me is usually stuff that I just didn&#8217;t expect to see.  But when I don&#8217;t expect it, that usually means I wasn&#8217;t paying enough attention.  I plan to pay a lot of attention to AR this year.</strong></p>
<p><strong>I&#8217;m not sure it makes a lot of sense to write fiction nowadays &#8220;about AR,&#8221; because it&#8217;s no longer a fictional topic.  It&#8217;s become like writing fiction &#8220;about cinema.&#8221;  You can write good fiction about someone who works in cinema, but not fiction about cinema itself.  AR is not sci-fi &#8220;Augmented Reality&#8221; any more, it&#8217;s become a real-world phenomenon, a new industry of real augmentation.</strong></p>
<p><strong>With that said, I must remark that I sit up straight whenever I see Marco Tempest do stuff.  Magicians are all about mystery and wonder.  You wouldn&#8217;t see a magician, say, using AR to work an assembly line, or re-order library books, or find a pizza joint in Barcelona.  And that&#8217;s great.   Marco is always gonna do something freaky and out-there, and even though he&#8217;s a tech magician, it&#8217;s never about the tech first.  It&#8217;s always about his ingenuity in finding new ways to employ new tools in creating a magical experience for his audience.</strong></p>
<p><strong>Marco&#8217;s not an entrepreneur, he&#8217;s  not gonna revolutionize people&#8217;s daily lives or invent Web 4.0, but even if AR becomes &#8220;old hat&#8221; some day, it&#8217;s never going to be old hat when he&#8217;s doing it.  The guy is a pro, and I&#8217;m quite the fan.</strong></p>
<p><iframe src="http://player.vimeo.com/video/11801074?portrait=0" width="400" height="225" frameborder="0"></iframe>
<p><a href="http://vimeo.com/11801074">Magic Projection Live @ TEDxTokyo 2010</a> from <a href="http://vimeo.com/magician">Marco Tempest</a> on <a href="http://vimeo.com">Vimeo</a>.</p>
]]></content:encoded>
			<wfw:commentRss>http://www.ugotrade.com/2011/05/06/augmented-reality-transitioning-out-of-the-old-fashioned-legacy-internet-interview-with-bruce-sterling/feed/</wfw:commentRss>
		<slash:comments>14</slash:comments>
		</item>
		<item>
		<title>Augmented Reality Event, 2011: Bruce Sterling, Vernor Vinge, Will Wright, and Jaron Lanier to Judge the &#8220;Auggies&#8221;</title>
		<link>http://www.ugotrade.com/2011/04/13/augmented-reality-event-2011-bruce-sterling-vernor-vinge-will-wright-and-jaron-lanier-to-judge-the-auggies/</link>
		<comments>http://www.ugotrade.com/2011/04/13/augmented-reality-event-2011-bruce-sterling-vernor-vinge-will-wright-and-jaron-lanier-to-judge-the-auggies/#comments</comments>
		<pubDate>Wed, 13 Apr 2011 22:38:32 +0000</pubDate>
		<dc:creator><![CDATA[Tish Shute]]></dc:creator>
				<category><![CDATA[Ambient Devices]]></category>
		<category><![CDATA[Ambient Displays]]></category>
		<category><![CDATA[Ambient Findability]]></category>
		<category><![CDATA[Augmented Reality]]></category>
		<category><![CDATA[Big Data]]></category>
		<category><![CDATA[digital public space]]></category>
		<category><![CDATA[gestural interface]]></category>
		<category><![CDATA[Instrumenting the World]]></category>
		<category><![CDATA[internet of things]]></category>
		<category><![CDATA[Mixed Reality]]></category>
		<category><![CDATA[mobile augmented reality]]></category>
		<category><![CDATA[New Interfaces]]></category>
		<category><![CDATA[Real Time Big data]]></category>
		<category><![CDATA[social gaming]]></category>
		<category><![CDATA[social media]]></category>
		<category><![CDATA[2011]]></category>
		<category><![CDATA[Amir Baradaran]]></category>
		<category><![CDATA[AR Magic]]></category>
		<category><![CDATA[ARE2011]]></category>
		<category><![CDATA[augmented reality and ubiquitous computing]]></category>
		<category><![CDATA[augmented reality event]]></category>
		<category><![CDATA[augmented reality in education]]></category>
		<category><![CDATA[augmented reality magic]]></category>
		<category><![CDATA[augmented reality startup launchpad]]></category>
		<category><![CDATA[Aurasma]]></category>
		<category><![CDATA[Ben Cerveny]]></category>
		<category><![CDATA[Berg]]></category>
		<category><![CDATA[big brands and augmented reality]]></category>
		<category><![CDATA[Blaise Aguera y Arcas]]></category>
		<category><![CDATA[Bloom]]></category>
		<category><![CDATA[Bruce Sterling]]></category>
		<category><![CDATA[Chris Arkenberg]]></category>
		<category><![CDATA[Chris Grayson]]></category>
		<category><![CDATA[collaborative consumption]]></category>
		<category><![CDATA[computer vision search database]]></category>
		<category><![CDATA[data driven augmented reality]]></category>
		<category><![CDATA[Dentsu London]]></category>
		<category><![CDATA[Design Fiction]]></category>
		<category><![CDATA[Frank Cooper III.]]></category>
		<category><![CDATA[Helen Papagiannis]]></category>
		<category><![CDATA[Ina Centaur]]></category>
		<category><![CDATA[Jaron Lanier]]></category>
		<category><![CDATA[Layar]]></category>
		<category><![CDATA[magic as a metaphor for ubiquitous computing]]></category>
		<category><![CDATA[Marco Tempest]]></category>
		<category><![CDATA[Mark Billinghurst]]></category>
		<category><![CDATA[Metaio]]></category>
		<category><![CDATA[Microsoft's Mobile Strategy]]></category>
		<category><![CDATA[Mike Kuniavsky]]></category>
		<category><![CDATA[Occipital]]></category>
		<category><![CDATA[OCR and augmented reality]]></category>
		<category><![CDATA[Ogmento]]></category>
		<category><![CDATA[Ori Inbar]]></category>
		<category><![CDATA[Qualcomm SDK for vision based augmented reality]]></category>
		<category><![CDATA[Sander Veenhof]]></category>
		<category><![CDATA[Seth Priebatsch]]></category>
		<category><![CDATA[Steven Feiner]]></category>
		<category><![CDATA[Suwappu]]></category>
		<category><![CDATA[Tactical Information Systems]]></category>
		<category><![CDATA[TeleHash]]></category>
		<category><![CDATA[the Auggie Award]]></category>
		<category><![CDATA[The Auggies]]></category>
		<category><![CDATA[the future of augmented reality]]></category>
		<category><![CDATA[the game layer on top of the world]]></category>
		<category><![CDATA[The Locker project]]></category>
		<category><![CDATA[Tish Shute]]></category>
		<category><![CDATA[Total Immersion]]></category>
		<category><![CDATA[Vernor Vinge]]></category>
		<category><![CDATA[Viewdle]]></category>
		<category><![CDATA[visual search]]></category>
		<category><![CDATA[Will Wright]]></category>
		<category><![CDATA[Wordlens]]></category>
		<category><![CDATA[YDreams]]></category>

		<guid isPermaLink="false">http://www.ugotrade.com/?p=6176</guid>
		<description><![CDATA[The prophets of Augmented Reality, Bruce Sterling (best known for his science fiction; his non-fiction includes &#8220;The Hacker Crackdown,&#8221; &#8220;Tomorrow Now,&#8221; and &#8220;Shaping Things&#8221;) and Vernor Vinge (author of &#8220;Rainbows End&#8221; and &#8220;The Coming Technological Singularity&#8221;) are joining Will Wright (legendary game designer of SimCity, The Sims, and Spore), and Jaron Lanier (a computer [&#8230;]]]></description>
				<content:encoded><![CDATA[<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2011/04/Screen-shot-2011-04-13-at-12.51.05-PM1.png"><img class="alignnone size-thumbnail wp-image-6203" title="Screen shot 2011-04-13 at 12.51.05 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2011/04/Screen-shot-2011-04-13-at-12.51.05-PM1-150x150.png" alt="" width="150" height="150" /></a><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2011/04/Screen-shot-2011-04-13-at-12.51.38-PM.png"><img class="alignnone size-thumbnail wp-image-6200" title="Screen shot 2011-04-13 at 12.51.38 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2011/04/Screen-shot-2011-04-13-at-12.51.38-PM-150x150.png" alt="" width="150" height="150" /></a><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2011/04/Screen-shot-2011-04-13-at-12.51.52-PM.png"><img class="alignnone size-thumbnail wp-image-6199" title="Screen shot 2011-04-13 at 12.51.52 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2011/04/Screen-shot-2011-04-13-at-12.51.52-PM-150x150.png" alt="" width="150" height="150" /></a></p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2011/04/Screen-shot-2011-04-13-at-12.55.41-PM.png"><img class="alignnone size-thumbnail wp-image-6205" title="Screen shot 2011-04-13 at 12.55.41 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2011/04/Screen-shot-2011-04-13-at-12.55.41-PM-150x150.png" alt="" width="150" height="150" /></a><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2011/04/Screen-shot-2011-04-13-at-12.51.21-PM.png"><img class="alignnone size-thumbnail wp-image-6201" title="Screen shot 2011-04-13 at 12.51.21 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2011/04/Screen-shot-2011-04-13-at-12.51.21-PM-150x150.png" alt="" width="150" height="150" /></a><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2011/04/Screen-shot-2011-04-13-at-1.43.28-PM.png"><img class="alignnone size-thumbnail wp-image-6211" title="Screen shot 2011-04-13 at 1.43.28 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2011/04/Screen-shot-2011-04-13-at-1.43.28-PM-150x150.png" alt="" width="150" height="150" /></a></p>
<p>The prophets of Augmented Reality, <a href="http://en.wikipedia.org/wiki/Bruce_Sterling" target="_blank">Bruce Sterling</a> (best known for his science fiction; his non-fiction includes &#8220;The Hacker Crackdown,&#8221; &#8220;Tomorrow Now,&#8221; and &#8220;Shaping Things&#8221;) and <a href="http://en.wikipedia.org/wiki/Vernor_Vinge" target="_blank">Vernor Vinge</a> (author of &#8220;Rainbows End&#8221; and &#8220;The Coming Technological Singularity&#8221;) are joining <a href="http://www.stupidfunclub.com/WWBio.html" target="_blank">Will Wright</a> (legendary game designer of SimCity, The Sims, and Spore) and <a href="http://www.jaronlanier.com/" target="_blank">Jaron Lanier</a> (computer scientist, composer, visual artist, legend of virtual reality research, and author of <em>You Are Not A Gadget: A Manifesto</em>) to judge the Auggies at the Second Annual <a href="http://augmentedrealityevent.com/" target="_blank">Augmented Reality Event, 2011, Santa Clara, CA, May 17th&#8211;18th</a>! Pictured, top row: Bruce Sterling, Vernor Vinge, Will Wright; second row: Jaron Lanier, Blaise Agüera y Arcas, and Frank Cooper III.</p>
<p>Augmented Reality has been prototyped in science fiction writing and films for many years now. But <a href="http://augmentedrealityevent.com/" target="_blank">Augmented Reality Event</a>, the first global event dedicated to the emerging industry of AR, is your chance to be part of a momentous transformation of science fiction into science fact. Bruce gave a seminal keynote last year at ARE2010, <a href="http://www.ugotrade.com/2010/06/16/interview-with-bruce-sterling-part-i-at-the-9am-of-the-augmented-reality-industry-are2010/" target="_blank">&#8220;At the 9am of the Augmented Reality Industry&#8230;&#8221;</a> But we are edging closer to high noon this year &#8211; see <a href="http://bits.blogs.nytimes.com/2011/04/07/augmented-reality-comes-closer-to-reality/" target="_blank">John Markoff&#8217;s recent New York Times post, &#8220;Augmented Reality Comes Closer to Reality,&#8221;</a> and Bruce Sterling&#8217;s opening keynote at <a href="http://augmentedrealityevent.com/" target="_blank">ARE2011</a> will raise the bar for us all again.</p>
<p>Bruce Sterling and Vernor Vinge will wrap up the show with a fireside chat. <a href="http://www.jaronlanier.com/" target="_blank">Jaron Lanier</a> will be keynoting at ARE2011 for the first time, and <a href="http://www.ted.com/speakers/blaise_aguera_y_arcas.html" target="_blank">Blaise Agüera y Arcas</a> (a leader in Microsoft&#8217;s mobile strategy) will return to top his TED talk demonstrating innovations in Bing Maps and Augmented Reality. Frank Cooper III, Senior Vice President and Chief Consumer Engagement Officer of PepsiCo, <a href="http://augmentedrealityevent.com/" target="_blank">will give an up-to-now missing perspective: the big brands&#8217; view of Augmented Reality (AR)</a>.</p>
<p>&#8220;The main stage at <a href="http://augmentedrealityevent.com/">ARE 2011</a> will be blessed with science fiction visionaries, groundbreaking scientists, mind-blowing technologists, legendary game designers, and more!&#8221;</p>
<p>Come prepared to Augmented Reality Event. Read as much Bruce Sterling and Vernor Vinge as you can, and follow @bruces&#8217; blog on <a href="http://www.wired.com/beyond_the_beyond/" target="_blank">Wired, Beyond the Beyond</a>. With only a few weeks to go until <a href="http://augmentedrealityevent.com/" target="_blank">Augmented Reality Event</a>, my co-chairs Ori Inbar and Chris Grayson and I have been working hard <a href="http://augmentedrealityevent.com/schedule/" target="_blank">on a preliminary schedule</a> (by the time you click on this link it should be updated again). You can follow us on Twitter for updates as they come in &#8211; <a href="http://twitter.com/#%21/tishshute" target="_blank">@tishshute</a>, <a href="http://twitter.com/#%21/comogard">@comogard</a>, <a href="http://twitter.com/#%21/chrisgrayson" target="_blank">@chrisgrayson</a> &#8211; and please follow <a href="http://twitter.com/#%21/arealityevent" target="_blank">@arealityevent</a> and join <a href="http://www.facebook.com/augmentedrealityevent" target="_blank">our ARE Facebook posse</a>. And, of course, don&#8217;t forget to register soon while we are still offering discount codes.</p>
<p><strong>Readers of this post can use the code TISH295 for $100 off the already sweet price of $395.</strong></p>
<p>If you joined us for Augmented Reality Event last year, you will know that the &#8220;Auggies&#8221; &#8211; an award for the best AR demo, presented live with commentary American Idol style &#8211; is a chance to join the best and brightest in AR as they write the next chapter for AR in the sharp and often amusing repartee between judges and contestants. Submit your demo proposal <a href="http://augmentedrealityevent.com/speakers/call-for-proposals/">here</a> under the &#8220;Auggies&#8221; track.</p>
<p>The picture below is the <a href="http://gallery.me.com/pookatak#100153" target="_blank">Auggie Award</a> for the best AR demo, designed by <a href="http://www.pookatak.com/" target="_blank">Sigal Arad Inbar</a>. (See <a title="Permanent Link to Ivan Franco recounts the team&#8217;s ARE 2010 experience, and winning the event&#8217;s first-ever Auggie Award" rel="bookmark" href="http://www.ydreams.com/blog/2010/06/05/ivan-franco-recounts-the-team%e2%80%99s-are-2010-experience-and-winning-the-event%e2%80%99s-first-ever-auggies-award/">Ivan Franco recounts the team&#8217;s ARE 2010 experience, and winning the event&#8217;s first-ever Auggie Award</a>, and the video shot at the <a href="http://www.ydreams.com/" target="_blank">YDreams</a> booth by Bruce Sterling: <em>&#8220;The Hotness&#8221; &#8211; <a href="http://www.flickr.com/photos/brucesterling/4671874785/in/photostream/" target="_blank">YDreams rocking it at ARE2010, from Bruce&#8217;s Flickr</a></em>.)</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2011/04/auggie.jpg"><img class="alignnone size-medium wp-image-6213" title="auggie" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2011/04/auggie-300x217.jpg" alt="" width="300" height="217" /></a></p>
<p><a href="http://augmentedrealityevent.com/" target="_blank">ARE2011</a> will include more than 90(!) speakers in 30 sessions  organized into 3 tracks: business, technology &amp; programming, and  production &amp; design.</p>
<p>It will feature special activities such as a Startup Launch Pad (submit your proposal <a href="http://augmentedrealityevent.com/speakers/call-for-proposals/">here</a> under the track &#8220;Startup Launch Pad&#8221;) and an ARt Gala with live performances. There will be displays of AR art projects and live performance art by the world&#8217;s best AR artists, featuring <a href="http://augmentedstories.wordpress.com/" target="_blank">Helen Papagiannis</a>, <a href="http://amirbaradaran.com/" target="_blank">Amir Baradaran</a>, <a href="http://sndrv.nl/" target="_blank">Sander Veenhof</a>, <a href="http://augmentedrealityevent.com/speakers/">Ina Centaur</a>, and more to be announced.</p>
<p>The exhibition area will be open throughout the 2 days of the event and will include the latest product demos by the leading AR companies, as well as a career fair to help grow the fledgling industry. A small number of booths is still available in the exhibition hall ($995 for 10&#8242;x10&#8242;). <a href="http://augmentedrealityevent.com/sponsors/">Grab them</a> while they last!</p>
<p>A press conference will kick-start the event, as a vehicle for AR companies to launch new products and services. Submit your proposal <a href="http://augmentedrealityevent.com/speakers/call-for-proposals/">here</a> and add &#8220;Press Conference&#8221; in the title. The event wrap-up will feature <a href="http://www.youtube.com/watch?v=cvTJzbhX98s&amp;feature=player_embedded" target="_blank">Marco Tempest with a live AR Magic show</a>.</p>
<p>I have been so busy working on ARE2011 (and another project yet to be named &#8211; hint: I am focused on social augmentation and the world as a platform for a geo-situated social interest graph) that I have had no time to blog since I was on <a href="http://schedule.sxsw.com/events/event_IAP7238" target="_blank">The Potential for Augmented Reality in Education panel at SXSW 2011</a>. But I have posted the slide deck for my talk here: <a href="http://www.slideshare.net/TishShute/sxsw-augmented-realityineducationslides" target="_blank">&#8220;Enchanted Objects and People: Data Driven AR.&#8221;</a></p>
<p>We are entering a new era of collective engagement possibilities for augmented reality &#8211; consumer-to-consumer brokerages that could unleash the visions of the <a href="http://www.collaborativeconsumption.com/the-movement/" target="_blank">collaborative consumption movement</a> and tap into our collective energies in totally new ways. As Seth Priebatsch of SCVNGR suggested in his SXSW keynote, <a href="http://www.slideshare.net/chiefninja1/sxsw-keynote-the-game-layer-on-top-of-the-world" target="_blank">The Game Layer on Top of the World</a>, the power of communal game play may help us address even the most intractable problems, like failure in education and global warming!</p>
<p>One of my favorite new AR ventures exploring the potential of new forms of social engagement through AR is <em><a href="http://www.dentsulondon.com/blog/2011/04/05/introducing-suwappu/">Suwappu</a></em>. We are working on how to get them across the pond and a few time zones to ARE2011.</p>
<p><em>&#8220;<a href="http://www.dentsulondon.com/blog/2011/04/05/introducing-suwappu/" target="_blank">Dentsu London</a> are developing an original product called </em><em><a href="http://www.dentsulondon.com/blog/2011/04/05/introducing-suwappu/">Suwappu</a></em><em>. Suwappu are woodland creatures that swap pants, toys that come to life in augmented reality. <a href="http://berglondon.com/blog/2011/04/05/suwappu-toys-in-media/" target="_blank">BERG</a> have been brought in as consultant inventors, and we&#8217;ve made this film. <a href="http://www.youtube.com/watch?v=sBmLWdjtzPw">Have a look</a>!&#8221;</em></p>
<p><em><a href="http://www.youtube.com/watch?v=sBmLWdjtzPw&amp;feature=player_embedded" target="_blank"><img class="alignnone size-medium wp-image-6229" title="Screen shot 2011-04-13 at 3.52.57 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2011/04/Screen-shot-2011-04-13-at-3.52.57-PM-300x181.png" alt="" width="300" height="181" /></a><br />
</em></p>
<p>Click<a href="http://www.youtube.com/watch?v=sBmLWdjtzPw&amp;feature=player_embedded" target="_blank"> here</a> or on the image above to watch the video.</p>
<p><em>&#8220;Suwappu is a range of toys, animal characters that live in little  digital worlds. The physical toys are canvasses upon which we can paint  worlds, through a phone (or tablet) lens we can see into the narratives,  games and media in which they live.&#8221;</em></p>
<p>Many new players in Augmented Reality will join the pioneering AR companies &#8211; <a href="http://www.layar.com/" target="_blank">Layar</a>, <a href="http://www.metaio.com/" target="_blank">Metaio</a>, <a href="http://occipital.com/blog/" target="_blank">Occipital</a>, <a href="http://ogmento.com/" target="_blank">Ogmento</a>, <a href="http://www.t-immersion.com/" target="_blank">Total Immersion</a>, <a href="http://www.ydreams.com/#/en/homepage/" target="_blank">YDreams</a> &#8211; and renowned innovators <a href="http://augmentedrealityevent.com/speakers/" target="_blank">Mark Billinghurst, Steven Feiner, Blair MacIntyre and the Georgia Tech team, and many others</a> will return to surprise and amaze us this year &#8211; see the <a href="http://augmentedrealityevent.com/speakers/" target="_blank">still-growing list of speakers here</a>. <a href="http://www.qualcomm.com/" target="_blank">Qualcomm</a>, who announced <a href="http://qdevnet.com/ar" target="_blank">an SDK for vision-based augmented reality</a> and a <a href="http://qdevnet.com/dev/augmented-reality/developer-challenge" target="_blank">$200,000 developer challenge</a> last year, will also be back with several of their team at ARE2011!</p>
<p>Visual search and OCR augmented reality apps have flourished in the last year, and they will have a strong presence at ARE2011. Long-anticipated and oft-fretted-over face recognition apps are coming to a phone near you soon. <a href="http://www.viewdle.com/" target="_blank">Viewdle</a>, working locally on the phone, is taking a new approach to face recognition. Google has so far said that they will not use or release the face recognition technology that they apparently already have. But Shailesh Nalawadi of Google Goggles will return to ARE2011 to discuss <a href="http://www.eweek.com/c/a/Search-Engines/Meet-Google-Goggles-Augmented-Reality-Vector-239952/1/" target="_blank">building out a computer vision search database and 3D classifiers to solve AR challenges</a> and show us what <a href="http://www.google.com/mobile/goggles/#text" target="_blank">Google Goggles</a> has been working on recently. <a href="http://www.youtube.com/watch?v=GBKy-hSedg8" target="_blank">Aurasma</a>, out next month, is perhaps the next big thing in this space, so we are all eager to try it out and see what they bring to ARE2011.</p>
<p>My interview with Anselm Hook, <a title="Permanent Link to Visual Search, Augmented Reality and a Social Commons for the Physical World Platform: Interview with Anselm Hook" rel="bookmark" href="../../2010/01/17/visual-search-augmented-reality-and-a-social-commons-for-the-physical-world-platform-interview-with-anselm-hook/">Visual Search, Augmented Reality and a Social Commons for the Physical World Platform: Interview with Anselm Hook</a>, is still a great reference point for important questions in this space.</p>
<p><a href="http://questvisual.com/" target="_blank">WordLens</a> wowed us all with OCR done locally on the phone, and they will be showing off new features at ARE2011. Pulling out WordLens to translate a subway poster for the first time is a magic moment, and guaranteed to impress the person sitting next to you too! <a href="http://www.tacticalinfosys.com/" target="_blank">Tactical Information Systems</a>, who Fred Wilson said had <a href="http://www.ugotrade.com/2010/10/31/tim-o%E2%80%99reilly%E2%80%99s-four-cylinder-innovation-engine-the-missing-manual-for-the-future/" target="_blank">the potential to be a Shazam for faces, at Web 2.0 Expo</a>, NYC, will show us what they have been up to since then.</p>
<p>Data-driven AR is still in the early stages, but it is data that will unlock augmented experiences of the world around us in many powerful ways. Also, we are seeing exciting new efforts to put people at the center of their data. Jeremie Miller, who ushered in the Twitter era with his invention of XMPP, has brought us a new real-time protocol, <a href="http://www.telehash.org/about.html" target="_blank">TeleHash</a>, and the <a href="https://github.com/quartzjer/Locker" target="_blank">Locker Project</a>, which will <a href="http://www.ugotrade.com/2011/02/10/jeremie-miller-the-locker-project-give-a-data-platform-to-the-people-in-the-era-of-data-everywhere-and-bloom-presents-fizz/" target="_blank">unleash the full potential of communication at the edge of the network</a>. And don&#8217;t miss <a href="http://bloom.io/#about" target="_blank">Ben Cerveny</a> and <a href="http://bloom.io/" target="_blank">Bloom</a> at ARE2011, with &#8220;pop-cultural instruments for data expression and exploration&#8221; and &#8220;where data visualization meets game design.&#8221; I highly recommend Ben&#8217;s talk!</p>
<p>Also, I am very excited that <a href="http://www.orangecone.com/about.html" target="_blank">Mike Kuniavsky</a> will be speaking at ARE2011 for the first time. He pointed out in 2007 that magic is a powerful core interaction metaphor for ubiquitous computing [and AR]. Offices, libraries, and the desktop provided us with common metaphors that unleashed the power of the PC through files, desktops and trash cans, browsing, etc., but metaphors of magic &#8211; &#8220;enchanted objects and people&#8221; &#8211; are keys to augmented reality experiences. And, if magic is a core interaction metaphor for AR, Kinect hacks have shown us that gesture will be the soul of the AR experience. Jaron Lanier&#8217;s keynote will be a must-see, and make sure you catch <a href="http://www.urbeingrecorded.com/news/" target="_blank">Chris Arkenberg&#8217;s</a> presentation on the Future of Hands-Free AR.</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2011/04/Screen-shot-2011-04-02-at-7.57.58-PM.png"><img class="alignnone size-medium wp-image-6178" title="Screen shot 2011-04-02 at 7.57.58 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2011/04/Screen-shot-2011-04-02-at-7.57.58-PM-300x220.png" alt="" width="300" height="220" /></a></p>
<p>This post is just a glimpse at some of what will be going on at Augmented Reality Event. I cannot do justice to everything in a single post, so I hope to see you there! And, if by any chance you have a few moments to daydream between now and then, feel free to ponder your own script for a Green Lantern movie: &#8220;In brightest day and darkest night, no evil will escape my sight.&#8221; I will too. As <a href="http://twitter.com/#!/bscully" target="_blank">@brendanscully</a> noted, &#8220;Green Lantern&#8217;s power is Augmented Reality!&#8221;</p>
]]></content:encoded>
			<wfw:commentRss>http://www.ugotrade.com/2011/04/13/augmented-reality-event-2011-bruce-sterling-vernor-vinge-will-wright-and-jaron-lanier-to-judge-the-auggies/feed/</wfw:commentRss>
		<slash:comments>4</slash:comments>
		</item>
		<item>
		<title>Real Time Big Data at Strata 2011: Ambient Findability, Social Search, GeoMessaging, Augmented Data, and New Interfaces</title>
		<link>http://www.ugotrade.com/2011/01/20/real-time-big-data-at-strata-2011-ambient-findability-geomessaging-augmented-data-and-new-interfaces/</link>
		<comments>http://www.ugotrade.com/2011/01/20/real-time-big-data-at-strata-2011-ambient-findability-geomessaging-augmented-data-and-new-interfaces/#comments</comments>
		<pubDate>Thu, 20 Jan 2011 22:48:12 +0000</pubDate>
		<dc:creator><![CDATA[Tish Shute]]></dc:creator>
				<category><![CDATA[Ambient Devices]]></category>
		<category><![CDATA[Ambient Displays]]></category>
		<category><![CDATA[Android]]></category>
		<category><![CDATA[architecture of participation]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Augmented Reality]]></category>
		<category><![CDATA[data science]]></category>
		<category><![CDATA[digital public space]]></category>
		<category><![CDATA[Instrumenting the World]]></category>
		<category><![CDATA[internet of things]]></category>
		<category><![CDATA[Mixed Reality]]></category>
		<category><![CDATA[mobile augmented reality]]></category>
		<category><![CDATA[mobile meets social]]></category>
		<category><![CDATA[Mobile Reality]]></category>
		<category><![CDATA[Mobile Technology]]></category>
		<category><![CDATA[New Interfaces]]></category>
		<category><![CDATA[ubiquitous computing]]></category>
		<category><![CDATA[Web Meets World]]></category>
		<category><![CDATA[Alistair Croll]]></category>
		<category><![CDATA[Ambient Findability]]></category>
		<category><![CDATA[Android Tasker]]></category>
		<category><![CDATA[Anselm Hook]]></category>
		<category><![CDATA[AR]]></category>
		<category><![CDATA[attention data]]></category>
		<category><![CDATA[augmented data]]></category>
		<category><![CDATA[augmented reality ecosystem]]></category>
		<category><![CDATA[augmented reality search]]></category>
		<category><![CDATA[BackType]]></category>
		<category><![CDATA[big data]]></category>
		<category><![CDATA[Big data and new interfaces]]></category>
		<category><![CDATA[Bruce Sterling]]></category>
		<category><![CDATA[Cassandra]]></category>
		<category><![CDATA[Collecta]]></category>
		<category><![CDATA[content-shifting]]></category>
		<category><![CDATA[curating big data]]></category>
		<category><![CDATA[Data Engineering]]></category>
		<category><![CDATA[data privacy]]></category>
		<category><![CDATA[digital divide]]></category>
		<category><![CDATA[distributed computing]]></category>
		<category><![CDATA[Edd Dumbill]]></category>
		<category><![CDATA[Factual]]></category>
		<category><![CDATA[future of work]]></category>
		<category><![CDATA[geo]]></category>
		<category><![CDATA[geo social aware discovery]]></category>
		<category><![CDATA[geo-search]]></category>
		<category><![CDATA[geodata]]></category>
		<category><![CDATA[geolocation]]></category>
		<category><![CDATA[Geoloqi]]></category>
		<category><![CDATA[GeoMessaging]]></category>
		<category><![CDATA[geosearch]]></category>
		<category><![CDATA[gestural interfaces]]></category>
		<category><![CDATA[Gov2.0.]]></category>
		<category><![CDATA[HBase]]></category>
		<category><![CDATA[Hive]]></category>
		<category><![CDATA[key data trends]]></category>
		<category><![CDATA[linked data]]></category>
		<category><![CDATA[location data]]></category>
		<category><![CDATA[Maneki Neko]]></category>
		<category><![CDATA[MapReduce]]></category>
		<category><![CDATA[mapufacture]]></category>
		<category><![CDATA[Mesos]]></category>
		<category><![CDATA[Michal Avny]]></category>
		<category><![CDATA[mobile local interactions]]></category>
		<category><![CDATA[MongoDB]]></category>
		<category><![CDATA[My6sense]]></category>
		<category><![CDATA[neogeography]]></category>
		<category><![CDATA[NoSQL]]></category>
		<category><![CDATA[OpenGeo]]></category>
		<category><![CDATA[OpenGov]]></category>
		<category><![CDATA[P2P cloud computing]]></category>
		<category><![CDATA[pervasive computing]]></category>
		<category><![CDATA[Q&A]]></category>
		<category><![CDATA[Q&A ecosystems]]></category>
		<category><![CDATA[Q&A platforms]]></category>
		<category><![CDATA[Q&A The New Search Insurgents]]></category>
		<category><![CDATA[Quora]]></category>
		<category><![CDATA[RabbitMQ]]></category>
		<category><![CDATA[real time data analytics]]></category>
		<category><![CDATA[real time data in mobile development]]></category>
		<category><![CDATA[real time search]]></category>
		<category><![CDATA[real time search engines]]></category>
		<category><![CDATA[real time social discovery]]></category>
		<category><![CDATA[semantic web]]></category>
		<category><![CDATA[Simple Geo]]></category>
		<category><![CDATA[social graph]]></category>
		<category><![CDATA[social search]]></category>
		<category><![CDATA[social web]]></category>
		<category><![CDATA[Sophia Parafina]]></category>
		<category><![CDATA[Strata 2011]]></category>
		<category><![CDATA[Swift River]]></category>
		<category><![CDATA[Tish Shute]]></category>
		<category><![CDATA[Topsy]]></category>
		<category><![CDATA[Web 2.0 Summit]]></category>
		<category><![CDATA[Who owns your data?]]></category>
		<category><![CDATA[XMPP]]></category>

		<guid isPermaLink="false">http://www.ugotrade.com/?p=6025</guid>
		<description><![CDATA[We are in the age of unearthing and uncovering data, and only just at the beginning of the age of processing data and dealing with it (see my interview with Anselm Hook, Part 2 upcoming). O&#8217;Reilly&#8217;s Strata Conference 2011 will explore &#8220;the change brought to technology and business by data science, pervasive computing, and new [&#8230;]]]></description>
				<content:encoded><![CDATA[<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2011/01/noisedderived31.jpg"><img class="alignnone size-medium wp-image-6034" title="noisedderived3" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2011/01/noisedderived31-300x163.jpg" alt="" width="300" height="163" /></a></p>
<p>We are in the age of unearthing and uncovering data, and only just at the beginning of the age of processing data and dealing with it (see my interview with <a href="http://www.hook.org/" target="_blank">Anselm Hook</a>, Part 2 upcoming). <a href="http://strataconf.com/strata2011" target="_blank">O&#8217;Reilly&#8217;s Strata Conference 2011</a> will explore &#8220;the change brought to technology and business by data science, pervasive computing, and new interfaces.&#8221; It is, perhaps, one of the most important events of 2011.</p>
<p>Data is driving a revolution much as coal, oil, and steel powered the industrial revolution. And the world-changing insight from Karl Marx that &#8220;the industrial revolution polarized the world into two groups: those who own the means of production and those who work on them&#8221; is taking on new life, as <a href="http://twitter.com/#!/acroll" target="_blank">Alistair Croll</a>, co-chair of <a href="http://strataconf.com/strata2011" target="_blank">Strata 2011</a>, points out in his post, <a href="http://mashable.com/2011/01/12/data-ownership/" target="_blank">&#8220;Who Owns Your Data?&#8221;</a></p>
<p><strong>&#8220;The important question isn&#8217;t who owns the data. Ultimately, we all do. A better question is, who owns the means of analysis? Because that&#8217;s how, as Brand suggests, you get the right information in the right place. The digital divide isn&#8217;t about who owns data &#8211; it&#8217;s about who can put that data to work.&#8221;</strong></p>
<p>Strata is where a vanguard will meet, not only to discuss this revolution&#8217;s futures, but to define how to create, handle, and build the platforms and experiences that will harness the data. My flight is booked! (Also check out <a href="http://www.bigdatacamp.org/">BigDataCamp</a>, which takes place the night before <a title="Strata Conference" href="https://en.oreilly.com/strata2011/public/regwith/str11dnaff" target="_blank">Strata</a>.)</p>
<p>The picture opening this post is from Michael EdgeCumbe&#8217;s <a href="http://garden.neocyde.net/thoughts/2010/12/fall-2010-itp-winter-show-project/">Fall 2010: ITP Winter Show Project</a>, a project exploring ways to intuitively get a feel for what is going on with big data sets using &#8220;the gestural manipulation and stereoscopic visualization of complex data to create a meditative state for data analysis.&#8221; Michael&#8217;s project will be part of the <a href="http://strataconf.com/strata2011/public/schedule/detail/17840" target="_blank">Science Fair at Strata</a>. For more on Michael&#8217;s work, see <a href="http://www.neocyde.net/derive/2010/12" target="_blank">Noise Derived</a>. I also have a number of the <a href="http://strataconf.com/strata2011/public/schedule/topic/595 " target="_blank">interesting new interface sessions</a> at Strata in my schedule.</p>
<p>The daily <a href="http://radar.oreilly.com/2010/12/write-your-own-visualizations.html" target="_blank">Strata Gems</a> on O&#8217;Reilly Radar are a great place to get a gestalt of some of the Strata themes, and <a href="http://radar.oreilly.com/2010/12/strata-gems-three-key-data-trends-for-2011.html" target="_blank">this post</a> by <a href="http://strataconf.com/strata2011/profile/1" target="_blank">Edd Dumbill</a>, program chair for Strata, <a href="http://radar.oreilly.com/m/2010/12/strata-gems-three-key-data-trends-for-2011.html" target="_blank">Three key data trends for 2011</a>, looks at the year ahead. This week, I got the chance to ask Edd a few of the questions that I will have in mind at Strata &#8211; see his responses below.</p>
<p>If you have been reading Ugotrade, you will know I am interested in our mobile social augmented futures and there is no question in my mind that these will be unleashed by our new capacities to work with data (see <a href="http://www.ugotrade.com/2010/10/31/tim-o%E2%80%99reilly%E2%80%99s-four-cylinder-innovation-engine-the-missing-manual-for-the-future/" target="_blank">my post here</a>).</p>
<h3>Data is the how.</h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2011/01/backtypediagram.png"><img class="alignnone size-medium wp-image-6045" title="backtypediagram" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2011/01/backtypediagram-210x300.png" alt="" width="210" height="300" /></a></p>
<p><em>The pic above is from <a href="http://www.readwriteweb.com/hack/2011/01/secrets-of-backtypes-data-engineers.php" target="_blank">&#8220;Secrets of BackType&#8217;s Data Engineers.&#8221;</a> This post on ReadWriteHack by <a href="http://twitter.com/petewarden">Pete Warden</a>, an ex-Apple engineer and founder of <a href="http://www.openheatmap.com/">OpenHeatMap</a>, really lives up to its title.  Check it out if you want to know how <strong>&#8220;three guys (the <a title="opens in new window" href="http://backtype.com/" target="_blank">BackType</a> team) with only seed funding process a hundred million messages a day?&#8221;</strong></em></p>
<p>I asked on Quora, &#8220;<a href="http://www.quora.com/What-will-be-the-most-important-developments-in-augmented-reality-in-2011" target="_blank">What will be the most important developments in augmented reality in 2011?</a>&#8221; <a href="https://sites.google.com/site/michalavny/" target="_blank">Michal Avny</a>, Strategist &amp; Real Time search expert, wrote:</p>
<p><strong>&#8220;AR strongly relies on localized personalized real time information.</strong></p>
<p><strong>&#8220;Having a stream of tweets based on keyword search, location or circle of friends doesn&#8217;t really make the AR experience; it is the processed real time relevant information that will make AR useful and intensify the experience.</strong></p>
<p><strong>&#8220;In 2011 Real Time search and Social Search will drastically change to provide the infrastructure required.&#8221;</strong></p>
<p>I followed up on Michal&#8217;s Quora answer with some more questions &#8211; see below in this post.</p>
<p>Also note <a href="http://www.quora.com/What-will-be-the-most-important-developments-in-augmented-reality-in-2011" target="_blank">the response</a> from <a href="http://research.microsoft.com/en-us/people/dmolnar/" target="_blank">David Molnar</a>; here is an excerpt:</p>
<p><strong>&#8220;2. A wave of actionable, important data APIs opened up, enabling useful non-gimmicky AR apps for the first time. Think geoloqi.com , or the work Max Ogden has done with Portland civic data. Plus of course <a href="http://face.com/" target="_blank">face.com</a> , email providers and calendar providers, etc.&#8221;</strong></p>
<p><a href="http://strataconf.com/strata2011/public/schedule/speaker/100889" target="_blank">Amber Case</a>, one of the founders of <a href="http://geoloqi.com/" target="_blank">Geoloqi</a>, is on the programming committee of Strata and will be speaking.  Be sure to catch her session, <a href="http://strataconf.com/strata2011/public/schedule/detail/17748" target="_blank">Posthumans, Big Data and New Interfaces</a>, and if you haven&#8217;t already seen it, <a href="http://www.ted.com/talks/amber_case_we_are_all_cyborgs_now.html" target="_blank">Amber&#8217;s TED talk</a> is a must see.</p>
<p>Geographic proximity is a powerful filter, as are route and time. But clearly social proximity, social relevance, and shared tastes are also key dimensions for location based experiences (see my convo with Schuyler of <a href="http://simplegeo.com/" target="_blank">Simple Geo</a>, upcoming).</p>
<p>While the whole business of location based search and curation of augmented mobile social experiences is still, for the most part, uncharted terrain, the danger of key points of control being accessible only to elite players looms large.   I asked <a href="http://www.youtube.com/watch?v=C2HcWlu1BS4" target="_blank">Sophia Parafina</a>, a pioneer in the open geo space, for some thoughts on real-time local/geosearch and geomessaging, and the future of openness &amp; big data (see Sophia&#8217;s response below).</p>
<h3><a href="http://www.quora.com/Is-the-market-ready-yet-for-P2P-cloud-computing" target="_blank">Is the market ready yet for P2P cloud computing?</a></h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2011/01/8a174_invisibles_bigbrother_1210.jpg"><img class="alignnone size-full wp-image-6048" title="8a174_invisibles_bigbrother_1210" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2011/01/8a174_invisibles_bigbrother_1210.jpg" alt="" width="150" height="150" /></a></p>
<p>This is another question I&#8217;m following: <a href="http://www.quora.com/Is-the-market-ready-yet-for-P2P-cloud-computing" target="_blank">Is the market ready yet for P2P cloud computing?</a> It is one of those questions that we seem to have been asking in various forms for a very long while now, but without a major shift in sight.  The pic above is from <a title="Permanent link to The Cloud Made Open Source " href="http://www.readwriteweb.com/cloud/2010/12/open-source-invisible.php">The Cloud Made Open Source &#8220;Invisible&#8221; This Year</a>.  But, perhaps, we are at the point when open p2p clouds will find a place in the market because of their potential importance in real time social search and discovery. <a href="http://distributedsearch.blogspot.com/" target="_blank">Borislav Agapiev</a>, Search Entrepreneur and founder of <a href="http://www.vast.com/" target="_blank">Vast.com</a>, writes on <a href="http://www.quora.com/Is-the-market-ready-yet-for-P2P-cloud-computing?q=p2p+for+a+non+centralized+infrastructure" target="_blank">Quora</a>:</p>
<p><strong>&#8220;I believe a P2P cloud is ideally suited for social &amp; real-time search and discovery.</strong></p>
<p><strong>Consider MapReduce, a very interesting and popular paradigm for distributed computing. MapReduce is very much about bringing computation to data i.e. doing computation at nodes (map) and then aggregating results through network (reduce).</strong></p>
<p><strong>It is very clear now that user attention data (what they click on) is very valuable for search and discovery, yet a centralized model relies upon uploading all that to a single location and then doing a supposed local MapReduce. Clearly, MapReduce could be done  across the network, without any centralized uploads.</strong></p>
<p><strong>In addition to the efficiency argument raised here, it is even more important to consider privacy issues. Uploading massive amounts of user attention data to a centralized location is not something that is going to make users warm and fuzzy <img src="http://www.ugotrade.com/wordpress/wp-includes/images/smilies/icon_smile.gif" alt=":)" class="wp-smiley" />   as we are increasingly seeing.</strong></p>
<p><strong>In a P2P cloud, there is no big brother watching over anyone, all computation and data storage is done in the cloud, fragmented in many, many small  encrypted pieces ala BitTorrent.&#8221;</strong></p>
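<p><em>Borislav&#8217;s map-at-the-nodes, reduce-across-the-network idea can be sketched in a few lines of Python. This is an illustrative toy, not any real P2P stack: each hypothetical peer counts its own attention data locally (the map step) and ships only small partial counts, which are then merged (the reduce step), so raw click logs never have to leave the node.</em></p>

```python
from collections import Counter

def map_local(click_log):
    """Map step, run at each peer: aggregate attention data locally."""
    return Counter(click_log)

def reduce_across_network(partial_counts):
    """Reduce step: merge the small partial counts shipped by peers."""
    total = Counter()
    for partial in partial_counts:
        total.update(partial)
    return total

# Three hypothetical peers; only the partials cross the network.
peer_logs = [
    ["a.com", "b.com", "a.com"],
    ["b.com", "c.com"],
    ["a.com"],
]
partials = [map_local(log) for log in peer_logs]
totals = reduce_across_network(partials)
```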
<p><strong><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2011/01/Screen-shot-2011-01-16-at-2.13.43-PM1.png"><img class="alignnone size-medium wp-image-6066" title="Screen shot 2011-01-16 at 2.13.43 PM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2011/01/Screen-shot-2011-01-16-at-2.13.43-PM1-300x223.png" alt="" width="300" height="223" /></a><br />
</strong></p>
<p><em>Picture above from Brynn Marie Evans, <a href="http://brynnevans.com/blog/2010/03/17/it-takes-two-to-tango/">&#8220;It takes two to tango: review of my social search panel</a>&#8220;</em></p>
<h3>The Delta of Now &#8211; Transforming Search into a Social Democratic Act</h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2011/01/2538108030_d37d124e44.jpg"><img class="alignnone size-medium wp-image-6049" title="2538108030_d37d124e44" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2011/01/2538108030_d37d124e44-300x225.jpg" alt="" width="300" height="225" /></a></p>
<p><em>Picture of Maneki Neko &#8220;beckoning&#8221; cats from <a href="http://www.journeyetc.com/travel-ideas/famous-landmarks-of-cats-and-dogs-around-the-globe/">Journeyetc</a></em></p>
<p>New ecologies of human and machine intelligence are beginning to change basic social structures &#8211; see the <a href="http://www.youtube.com/watch?v=t1J2RXrvPek" target="_blank">Future of Work (Biewald and Chirayath Janah 2010)</a>. Projects like <a href="http://swift.ushahidi.com/" target="_blank">Swift River</a> use search and machine mining to filter streams on topics of interest, which can then be curated by human beings. This approach may be extended to the curation of real-time data streams, employing machine learning algorithms based upon these explicit relationships.</p>
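<p><em>The Swift River pattern &#8211; machine filtering first, human curation second &#8211; can be sketched as a toy two-stage pipeline. This is not Swift River&#8217;s actual code; the stream items, topic list, and curator rule below are all invented for illustration.</em></p>

```python
def machine_filter(stream, topics):
    """Stage one: keep only items mentioning a topic of interest."""
    wanted = {t.lower() for t in topics}
    return [item for item in stream
            if any(t in item.lower() for t in wanted)]

def curation_queue(candidates, approve):
    """Stage two: a human curator approves or rejects each candidate."""
    return [item for item in candidates if approve(item)]

stream = [
    "Flooding reported near the river crossing",
    "Great pizza downtown",
    "Road closed by flooding on Route 9",
]
candidates = machine_filter(stream, ["flooding"])
# Stand-in for a human decision: here the curator only keeps Route 9 news.
curated = curation_queue(candidates, approve=lambda item: "Route 9" in item)
```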
<p>Augmented mobile social experiences are a new frontier in which ideas and practices from a number of fields collide, including: ambient findability (Morville 2005), urban psychogeography, narrative structures, ambient games and devices, 4d (time-space), explorations of place and memory, enchanted objects and people (Kuniavsky 2010), and designed animism (Laurel 2010), to mention just a few.</p>
<p>Mobile local interaction presents an opportunity to invert the search pyramid and to transform search into a social, democratic act (see my interview with Anselm Hook upcoming).  Up until now search has been predicated around a very narrow revenue model.  Google has an implicit model of a B2C &#8211; business to consumer &#8211; brokerage. We are only just beginning to get a glimpse of the disruptive potential of C2C &#8211; consumer to consumer &#8211; brokerages.  Mobile local C2C brokerages that allow us to transact in a trustworthy way over our local geography in close to real time (Hook 2010) have the potential to enable new forms of social organization.  Bruce Sterling&#8217;s short story about a networked gift economy, <a href="http://tqft.net/wiki/Maneki_Neko" target="_blank">Maneki Neko</a>, is a brilliant glimpse at the disruptive potential of such re-imaginings.</p>
<p>Augmented experiences that shift a person&#8217;s situated geolocal experience of social reality, and change our relationship to people and place by augmenting engagement and reputation through socially driven consumer tie-ins and game dynamics, like <a href="http://foursquare.com/" target="_blank">Foursquare</a> &amp; <a href="http://gowalla.com/" target="_blank">Gowalla</a>, are beginning to emerge, as <a href="http://www.web2expo.com/webexny2010/public/schedule/detail/15446" target="_blank">Kati London pointed out in her excellent keynote at Web 2.0 Expo</a>.  And, while the integration of mobile local interaction and an augmented view that shifts our geolocal experience visually will involve creative solutions to some well churned mobile tracking, mapping and registration challenges, the exploration and development of new dimensions through which we can filter and create trusted and meaningful augmented mobile social experiences is vital, whether you are considering a mobile screen, map, camera view, or futuristic HUDs and gestural interfaces.</p>
<h3>Talking with Edd Dumbill</h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2011/01/edddumbillheadshot.png"><img class="alignnone size-full wp-image-6077" title="edddumbillheadshot" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2011/01/edddumbillheadshot.png" alt="" width="150" height="150" /></a><br />
Picture from <a href="http://people.oreilly.com/edd" target="_blank">O&#8217;Reilly Community.</a></p>
<p><strong>Tish Shute: </strong>First congratulations on Strata!   On the Strata homepage there is a quote from Jason Hoffman:</p>
<p><strong>&#8220;My gut feeling is that we&#8217;re going to look back at the upcoming Strata Conference like we do at the Web 2.0 Conference in 2004/2005.&#8221;<br />
&#8212;Jason Hoffman, CTO/Founder, Joyent, Inc.</strong></p>
<p>Why do you think Jason&#8217;s comparison might be prescient?</p>
<p><strong>Edd Dumbill: Web 2.0 is a development that ran through every brand that has a web presence and radically changed the way business is done for many companies and brands.</strong></p>
<p><strong>Strata will have a similar impact: every business has data, every business collects an increasing amount of data. This data is the new oil &#8211; a valuable raw material that when refined or combined creates value and opportunity.</strong></p>
<p><strong>Tish Shute:</strong> The rise of real time was one of your three key data trends for 2011.  Hadoop is bringing the capacity to work with big data to more than just a few elite players.  But the challenge is still real time.  You mention we will be seeing a hybrid approach to real time and batch MapReduce processing.  Will we hear more about these approaches to real time at Strata?  And, what do you see as the most important conversations on real time data analytics emerging at Strata?</p>
<p>You point out &#8220;open source projects and cloud infrastructure means developers can evaluate and learn to love technologies without requiring support or approval from above.&#8221;  What are the most exciting developments on the horizon for open source tools?</p>
<p><strong>Edd Dumbill: </strong><strong>Here are some projects worth watching, in the key areas of real time, cluster management and Hadoop.</strong></p>
<p><strong>* Cassandra and MongoDB &#8212; NoSQL databases that will prove vital for anybody with real time big data needs</strong></p>
<p><strong>* Mesos &#8212; a compute cluster management tool, modeled after that which powers Google</strong></p>
<p><strong>* Hadoop ecosystem&#8217;s continuing maturation, especially HBase and Hive.</strong></p>
<p><strong>Tish Shute: </strong> Do you think the market is ready for p2p cloud computing?</p>
<p><strong>Edd Dumbill: The market is emerging for decentralized and distributed cloud computing, and P2P technologies are one way of achieving that. The key trends will be moving computation nearer the data sets or nearer the point of user consumption of the result.</strong></p>
<p><strong>P2P is a difficult model for anybody wanting to commercialize a service, so I think it will tend to form part of a hybrid solution.</strong></p>
<p><strong>Tish Shute:</strong> We have seen enormous strides in our ability to work with giant unstructured databases recently.  Do you think, perhaps, that the dream of a web of linked data &#8211; &#8220;a web of data that can be processed directly and indirectly by machines&#8221; &#8211; will be attained through brute force, i.e. through our ability to harness the power of massively parallel processing, as much as by Semantic Web approaches focused on machine readable metadata? [Also see <a href="http://www.quora.com/Is-this-a-good-approach-www-dist-systems-bbn-com-people-krohloff-shard_overview-shtml-to-use-Hadoop-to-build-a-scalable-distributed-triple-store" target="_blank">my question on Quora</a>, &#8220;Is this a good approach (<a rel="nofollow" href="http://www.dist-systems.bbn.com/people/krohloff/shard_overview.shtml" target="_blank">www.dist-systems.bbn.com/people/&#8230;</a>) to use Hadoop to build a scalable, distributed triple store?&#8221;]</p>
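<p><em>For intuition on why a triple store maps naturally onto brute-force parallel processing, here is a toy in-memory sketch (not the SHARD code from the linked overview; the triples and query are invented): a triple-pattern scan is embarrassingly parallel, since each map task could run the same filter over its own shard of triples and the matches would be merged in a reduce step.</em></p>

```python
# Toy in-memory triple store: (subject, predicate, object) tuples.
triples = [
    ("ugotrade", "type", "blog"),
    ("ugotrade", "topic", "augmented_reality"),
    ("strata", "type", "conference"),
    ("strata", "topic", "big_data"),
]

def match(pattern, store):
    """Return every triple matching a pattern; None is a wildcard.
    In a Hadoop-backed store this scan is what each map task would
    run over its own shard of the triple set."""
    s, p, o = pattern
    return [t for t in store
            if (s is None or t[0] == s)
            and (p is None or t[1] == p)
            and (o is None or t[2] == o)]

# "What topic statements exist?" -- a single-pattern query.
topics = match((None, "topic", None), triples)
```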
<p><strong>Edd Dumbill:  I&#8217;ve been an observer of the Semantic Web for over a decade and I tend to believe that on the web, data means to you whatever meaning you give it as the consumer. With that model, the links are made by the consumer rather than sitting out there explicitly. Some links become de facto standards, and some very few become web standards.</strong></p>
<p><strong>I think the actuality will be a mix of both explicitly stated metadata and that which is inferred. The Semantic Web is a great framework for certain operations, especially interoperable exchange of metadata. A great many more private meanings, never intended to be shared, will be created by consuming software.</strong></p>
<p><strong>There&#8217;s no question that machines will learn how to process most of the Web. Furthermore, machines will learn how to process most of the physical world we&#8217;re in. And that by the end of this decade</strong>.</p>
<h3>Talking with Sophia Parafina</h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2011/01/sophiawhere.jpg"><img class="alignnone size-medium wp-image-6062" title="sophiawhere" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2011/01/sophiawhere-300x250.jpg" alt="" width="300" height="250" /></a></p>
<p><em>Picture of Sophia at <a href="http://where2conf.com/where2011" target="_blank">Where 2.0</a></em></p>
<p><strong>Tish Shute:</strong> Sophia you have worked in the trenches for a long time now  to support the growth of open geo data.  What do you hope to see emerge in 2011 in the field of geo-data?</p>
<p><strong>Sophia Parafina: Better support for displaying and handling location data across multiple apps. Fred Wilson <a href="http://www.avc.com/a_vc/2011/01/content-shifting.html?utm_source=feedburner&amp;utm_medium=feed&amp;utm_campaign=Feed%3A+AVc+%28A+VC%29" target="_blank">recently blogged about content-shifting</a>, in which he talks about overcoming content silos across devices. We&#8217;ve worked very hard to reduce data silos via formats, but devices are creating their own silos. I would like to see a standard method for sending geo data and geo information to mobile devices.</strong></p>
<p><strong>Producing content for mobile is different from producing content for a computer browser. Web 2.0 produced a lot of infrastructure for browser based interfaces, but on mobile devices that gap has been filled with apps, which fragments how data is handled by various devices. What is even more interesting in the mobile space is that devices can push data back that contains location, user updates, photos and even sensor data.  If mobile data standardizes, it could lead to browser based applications and stem the continued fragmentation of the mobile application market.</strong></p>
<p><strong>Tish Shute:</strong> <a href="http://simplegeo.com/" target="_blank">Simple Geo</a> and<a href="http://www.factual.com/" target="_blank"> Factual</a> are startups emerging in the geodata space. What do you see on the horizon in terms of both the growth of business opportunities and an open geo data community?</p>
<p><strong>Sophia Parafina: In the near future I think we&#8217;ll see startups providing curated data + API, and in response we will also see companies that provide a single interface across multiple data providers. We saw this when everyone released a mapping API and companies such as <a href="http://mapufacture.com/">Mapufacture</a> provided a single interface across multiple APIs.</strong></p>
<p><strong>We will see a resurgence in data providers repackaging the 2010 US Census data in different ways to respond to market segments; some of this will be open data, but all of it will be provided through an API instead of a file. Additionally, we&#8217;ll see more data from outside the US.</strong></p>
<p><strong>Tish Shute:</strong> What are the biggest obstacles to having the open geodata sets available that we need to enable mobile local interactions and social augmented experiences?</p>
<p><strong>Sophia Parafina: Licensing for both crowd sourced data and private curated open data will become an issue. We recently saw VLC, the open source video player, pulled from the Apple app store because of licensing issues. Also, licensing of content by geography will be problematic, limiting searches by geographical location. In addition, how will licensing of data that is updated by crowd sourcing work?</strong></p>
<p><strong>Multiple APIs for accessing data sources. The current trend for each provider to create an API for their data sets will result in data silos &#8211; there needs to be a single sign-on equivalent for requesting data.</strong></p>
<p><strong>Size of data on the wire: the current models for delivering data are based on broadband connections. However, as mobiles increasingly become the way people use the web, the data needs to be sized accordingly. This also goes for mobile interfaces. Have you tried to shop on a mobile device, or buy a train or plane ticket? It&#8217;s frustrating and error prone. There is a large untapped market of people who only use the Internet on mobile devices.</strong></p>
<p><strong>Tish Shute</strong>: You pointed me to <a href="http://radar.oreilly.com/2010/12/strata-gems-diy-personal-sensi.html" target="_blank">this link in Strata Gems</a> re &#8220;an interesting and pertinent&#8221; app (also a competitor to Geoloqi) &#8211; <a href="http://tasker.dinglisch.net/" target="_blank">the Android Tasker app</a>. What do these emerging services bring to the table in terms of the next generation of location based services?</p>
<p><strong>Sophia Parafina: This app lets your device interact with the environment. I think that this is a great way of using the sensors on existing platforms to increase interaction and to implement ambient findability. The basic premise of Tasker is that some action happens in response to an event in an application, time, date, location, event, or gesture. Tasker has defined 180 actions that can occur based on a number or combination of events. This can provide a basic vocabulary for interaction between the user and the device and, more importantly, between users. Tasker can also use Android script plugins, which lowers the bar to creating your own ambient application.</strong></p>
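<p><em>The event-to-action premise Sophia describes can be sketched as a tiny rule engine. This is not Tasker&#8217;s implementation; the conditions and actions below are hypothetical examples of the kind of rules such an app lets users define.</em></p>

```python
# A rule fires its action whenever its condition matches the current context.
rules = []

def on(condition, action):
    """Register an event -> action rule."""
    rules.append((condition, action))

def handle(context):
    """Run every action whose condition matches the context; return what fired."""
    fired = []
    for condition, action in rules:
        if condition(context):
            fired.append(action(context))
    return fired

# Hypothetical rules in the spirit of Tasker's event -> action model:
on(lambda ctx: ctx.get("location") == "home" and ctx.get("hour", 0) >= 22,
   lambda ctx: "silence ringer")
on(lambda ctx: ctx.get("event") == "headphones_plugged",
   lambda ctx: "launch music player")

actions = handle({"location": "home", "hour": 23})
```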
<p><strong>Programs such as Tasker can provide a way for people to interact with social networks beyond sending messages. People can use their mobile devices to interact with their surroundings without having to interact with the device.</strong></p>
<p><strong>Tish Shute:</strong> We have had many conversations about emerging ideas of geo-search, geo-messaging and geo-fencing. What are the most interesting developments in these areas and what do you see on the horizon for 2011?</p>
<p><strong>Sophia Parafina: The map will fade into the background and become less important. Display of information will be context aware, and that includes location. For example, let&#8217;s say I make a grocery list; when I&#8217;m at the grocery store, the list will just pop up without the need for me to find the app that has the list. Or reminders or offers pop up when you are near a place at a certain time. Let&#8217;s say you need to buy a present for a birthday party for a child: you could send out a request that you are looking for an item, and retailers could offer &#8220;on the spot&#8221; discounts if you are in the area.</strong></p>
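<p><em>Under the hood, the grocery-list pop-up reduces to a geofence check: is the user within some radius of a stored point? A minimal sketch in Python, using the haversine great-circle distance; the store coordinates and 100&#160;m radius below are hypothetical.</em></p>

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in meters."""
    r = 6371000.0  # mean Earth radius
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def inside_geofence(user, fence_center, radius_m):
    """True when the user is within radius_m of the fence center."""
    return haversine_m(*user, *fence_center) <= radius_m

# Hypothetical grocery-store fence, ~100 m radius.
store = (40.7128, -74.0060)
nearby = inside_geofence((40.7129, -74.0061), store, 100)  # a few meters away
far = inside_geofence((40.7300, -74.0060), store, 100)     # ~2 km away
```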
<p><strong>Geo-search, geo-messaging, and geo-fencing are geared towards mobile devices, so I expect to see them soon as part of apps. Building generic applications that implement geo* will fail because that sort of information is useful only within a context. Geo* apps are solutions looking for a problem. The killer mobile app will use these functions transparently to reduce the cognitive load of the user who is busy moving around in the world.</strong></p>
<p><strong>User data gathered from multiple web applications will become consolidated profiles that will be used for context aware applications. For example, there could be a service which matches prices of items that you have shopped for on the web: the service would have access to your cookies, and know your favorite retailers, things you have shopped for, and your location and activity patterns (when you are at home, work, or a restaurant). When you are in the vicinity of a brick and mortar retailer with the same or similar items, the service can send you an alert to match the price of the item you found online. So your digital life will become more closely linked with your day to day activities.</strong></p>
<h3>Talking with Michal Avny</h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2011/01/Michal_Pic.jpg"><img class="alignnone size-medium wp-image-6059" title="Michal_Pic" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2011/01/Michal_Pic-300x275.jpg" alt="" width="300" height="275" /></a></p>
<p><strong>Tish Shute: </strong>At <a href="http://www.web2summit.com/web2010" target="_blank"> Web 2.0 Summit</a>, one of the highlights for me was the, <a href="http://www.web2summit.com/web2010/public/schedule/detail/17101" target="_blank">Q&amp;A:The New Search Insurgents</a> lunch where Charlie Cheever of <a href="http://www.quora.com/" target="_blank">Quora</a>, IMO, stole the show. I tweeted:</p>
<p><em>&#8220;One of my takeaways from #w2s is that #quora points to future of augmented mobile social experiences &#8211; a search filter for experience! #AR&#8221;</em></p>
<p>In your view what are the biggest challenges for location Q&amp;A to emerge as a search filter for location based experiences?</p>
<p><strong>Michal Avny: The biggest location Q&amp;A challenges yet to be conquered are immediacy (real time dynamic data), relevancy (strong personalized filters) and user experience (simplified interface).</strong></p>
<p><strong>Location Q&amp;A enables different use cases.  The most prominent are Follow (follow places, topics and friends to learn about a location), Interact (meet new people based on common interests), Plan ahead (plan a trip, night out or a shopping day by asking and searching for local information) and On-site (check for recommendations, friends, deals, events and traffic nearby).</strong></p>
<p><strong>Unlike Follow, Interact and Plan ahead, which can be added to existing Q&amp;A platforms (such as Quora) by attending to location specifics as they share similar characteristics, the on-site mode introduces a completely different experience; first and foremost it requires immediate attention.  It is real time based and the nature of the data is dynamic.  Traffic updates, current events, nearby friends, all that changes constantly.  Posting a location question on-site implies the response should be in real time (e.g. best kid friendly restaurant); the normal Q&amp;A response latency wouldn&#8217;t work.</strong></p>
<p><strong>Strong relevancy filters are required to accommodate the overwhelming flood of information.  Moreover, some of the data should be filtered by user behavior and preferences: check-in notifications (type of relation), restaurant recommendations (type of food, price level, etc), shopping deals (commercial categories) and more.</strong></p>
<p><strong>Mobile experience requires ease of use and simplicity.  A new Q&amp;A interface and query language that allows for posting questions should be defined, as well as a coherent, summarized response interface.  A user on the go should not have to post lengthy questions, browse through tens of results or search for the right service, but instead use a simple intuitive tool.</strong></p>
<p><strong>Tish Shute: </strong>Real-time location based search is in its infancy.  Real time questions can be answered using different services such as Yelp, TripAdvisor, <a href="http://www.waze.com/homepage/" target="_blank">Waze</a>, <a href="http://foursquare.com/" target="_blank">Foursquare</a>, IMDb and more.  But what are the challenges to moving forward with aggregating these sources, and then into &#8220;locals&#8221; that are able to process and deal with vast amounts of information?</p>
<p><strong>Michal Avny: Using some of the leading location services to answer questions is sufficient to start with.</strong></p>
<p><strong>In order to provide broad coverage (worldwide) and reliable information, aggregation of the different services is required, for instance to normalize product and service rank, aggregate classifieds, and more. This is quite challenging as there is no one standard available.</strong></p>
<p><strong>When location Q&amp;A user base is big enough, I foresee a tendency to rely more on â€˜localsâ€™ input as the base of information.   As the platform grows, communities will be formed with different cultures, relationships and trust levels, making the information more valuable and customizable.  Some of the challenges I already mentioned are implementing filters, query language and interfaces to enable using the vast amounts of real time data in a mobile environment.  More of the challenges lying ahead are integrating the â€˜localsâ€™ data with location based services as they are integral components of the Q&amp;A ecosystem.   Merging trust levels and relationships while adhering to different privacy guidelines is a challenge yet to be explored. (This should be discussed in more detail under the protocols topic).</strong></p>
<p><strong>It is quite evident that Quora is now facing growing pains and is struggling to maintain its character.  Same as with Quora, it will also be a challenge to support and maintain the ecosystem while allowing for massive scale-up.</strong></p>
<p><strong>Tish Shute:</strong> I have been very interested in exploring protocols that will be enablers to micro local interaction and mobile social interaction for AR &#8211; particularly the XMPP extensions and operational transform work of Google Wave (now <a href="http://incubator.apache.org/projects/wave.html" target="_blank">Apache Wave</a>), and PubSub protocols like <a href="http://code.google.com/p/pubsubhubbub/" target="_blank">PubSubHubbub</a> and the Erlang based <a href="http://www.rabbitmq.com/" target="_blank">RabbitMQ</a>.  We are beginning to see protocols emerging that could enable new real time local services.  What do you think are some of the most valuable use cases for &#8220;locals&#8221; that this new generation of real time protocols can enable?</p>
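<p><em>The publish/subscribe idea behind PubSubHubbub, XMPP pubsub and RabbitMQ can be shown with an in-memory toy hub &#8211; this is not any of those protocols, just the fan-out pattern they implement over the network. The geo-topic names are invented for illustration.</em></p>

```python
from collections import defaultdict

class PubSub:
    """In-memory topic hub: subscribers register a callback per topic,
    and the hub fans each published message out to that topic's callbacks."""

    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, message):
        for callback in self.subscribers[topic]:
            callback(message)

hub = PubSub()
received = []
# A hypothetical geo-topic: real time deal updates for one neighborhood.
hub.subscribe("geo/brooklyn/deals", received.append)
hub.publish("geo/brooklyn/deals", "2-for-1 coffee until noon")
hub.publish("geo/queens/deals", "not delivered to our subscriber")
```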
<p><strong>Michal Avny: AR is about interacting with digital information; the AR ecosystem is composed of layers and components such as devices, platforms, browsers, applications and content.  For the different components to interact, new protocols, security guidelines, and privacy policies must be in place.  A standard will enable local vendors and service providers to publish specials, deals, updates and events for any application to broadcast; identify people and places by proximity (without having to use the same application or device); local recommendations will be shared by services; devices will be able to interact; location based platforms, such as Q&amp;A, will have access to a vast breadth of information; geo aware devices will provide a consistent experience globally; and much more.</strong></p>
<p><strong>Tish Shute:</strong> What do you think are the biggest challenges to going mainstream for this emerging field of real time social discovery?</p>
<p><strong>Michal Avny: The biggest challenge is building towards real time, geo-aware, localized, personalized ambient data.   Discovery is in its infancy; social, location-based Best, Top, and Trending lists with some basic filtering options are available, and this is great as people are getting accustomed to information surrounding them.  To some degree it can intensify the AR experience, for instance suggest the most popular dish in a restaurant, or map the best coffee shops nearby, but it is customized at best by friend recommendations and depends on the coverage and broadness of the specific discovery service.</strong></p>
<p><strong>There is a need for the next generation of discovery: customized, geo-social-aware discovery that filters the vast amount of real time data by learning user preferences and behavior (built on top of the much needed local social real time open protocol).</strong></p>
<p><strong>Tish Shute:</strong> Who are your favorite startups/upstarts in the field of real time search and why?</p>
<p><strong>Michal Avny: <a href="http://www.my6sense.com/" target="_blank">My6Sense </a>- My6sense provides a sharper and better way to experience your information from feeds you subscribe to (Social Networks, News, RSS feeds, etc.).  It&#8217;s personal &#8211; content is ranked based on what&#8217;s relevant to you. It learns what&#8217;s valuable to you by translating your consumption behavior into a personalized ranking function.<br />
My6Sense &#8211; because it is a personalized prediction filter, a critical foundation for AR</strong></p>
<p><strong><a href="http://topsy.com/" target="_blank">Topsy</a> &#8211; Topsy is realtime search powered by the social web that finds the most relevant conversations happening online. The site&#8217;s underlying technology examines popular links as well as the influence of each person citing a link. Topsy augments traditional search engines by finding information that people are talking about.<br />
Topsy &#8211; because its ranking is based on retweets and influencers, a great social experience</strong></p>
<p><strong><a href="http://collecta.com/" target="_blank">Collecta</a> &#8211; Collecta is a real-time search engine for the social web. It monitors the update streams of popular realtime blogs and sites like Twitter, WordPress, and Flickr, and shows results as they happen. Results can be filtered by status updates, comments, stories, or photos. The entire engine is built around the XMPP standard, which pushes out data on a continual basis, so that for every search you end up watching a stream that keeps updating itself.<br />
Collecta &#8211; because it is built around XMPP, a real time experience</strong></p>
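<p>The XMPP push model inverts polling: every open search becomes a standing query, and each incoming update is matched against it and pushed into that search&#8217;s result stream. A minimal sketch of that standing-query pattern (the class names are mine, not Collecta&#8217;s):</p>

```python
# Sketch of the "standing query" model that XMPP-style push implies:
# instead of clients re-polling, every incoming update is matched against
# open searches and appended to their live result streams.
# Class and method names are illustrative assumptions.

class StandingQuery:
    def __init__(self, term):
        self.term = term.lower()
        self.results = []

    def offer(self, update):
        """Push the update into this query's stream if it matches."""
        if self.term in update.lower():
            self.results.append(update)
            return True
        return False

class RealTimeSearch:
    def __init__(self):
        self.queries = []

    def search(self, term):
        """Open a search: returns a stream that keeps updating itself."""
        q = StandingQuery(term)
        self.queries.append(q)
        return q

    def publish(self, update):
        """Fan each new update out to every open search."""
        for q in self.queries:
            q.offer(update)

engine = RealTimeSearch()
q = engine.search("volcano")
engine.publish("Ash from the volcano grounds flights")
engine.publish("New espresso bar opens downtown")
```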
]]></content:encoded>
			<wfw:commentRss>http://www.ugotrade.com/2011/01/20/real-time-big-data-at-strata-2011-ambient-findability-geomessaging-augmented-data-and-new-interfaces/feed/</wfw:commentRss>
		<slash:comments>1</slash:comments>
		</item>
		<item>
		<title>Augmented Twitter at Jeff Pulver&#8217;s #140conf</title>
		<link>http://www.ugotrade.com/2010/04/23/augmented-twitter-at-jeff-pulvers-140conf/</link>
		<comments>http://www.ugotrade.com/2010/04/23/augmented-twitter-at-jeff-pulvers-140conf/#comments</comments>
		<pubDate>Fri, 23 Apr 2010 14:25:03 +0000</pubDate>
		<dc:creator><![CDATA[Tish Shute]]></dc:creator>
				<category><![CDATA[Ambient Devices]]></category>
		<category><![CDATA[Ambient Displays]]></category>
		<category><![CDATA[Android]]></category>
		<category><![CDATA[architecture of participation]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Augmented Reality]]></category>
		<category><![CDATA[digital public space]]></category>
		<category><![CDATA[Instrumenting the World]]></category>
		<category><![CDATA[internet of things]]></category>
		<category><![CDATA[iphone]]></category>
		<category><![CDATA[message brokers and sensors]]></category>
		<category><![CDATA[mirror worlds]]></category>
		<category><![CDATA[Mixed Reality]]></category>
		<category><![CDATA[mobile augmented reality]]></category>
		<category><![CDATA[mobile meets social]]></category>
		<category><![CDATA[Mobile Reality]]></category>
		<category><![CDATA[Mobile Technology]]></category>
		<category><![CDATA[nanotechnology]]></category>
		<category><![CDATA[new urbanism]]></category>
		<category><![CDATA[online privacy]]></category>
		<category><![CDATA[Participatory Culture]]></category>
		<category><![CDATA[privacy and online identity]]></category>
		<category><![CDATA[social gaming]]></category>
		<category><![CDATA[social media]]></category>
		<category><![CDATA[ubiquitous computing]]></category>
		<category><![CDATA[Web 2.0]]></category>
		<category><![CDATA[Web Meets World]]></category>
		<category><![CDATA[websquared]]></category>
		<category><![CDATA[World 2.0]]></category>
		<category><![CDATA[#140conf]]></category>
		<category><![CDATA[#ashtag. TEDxVolcano]]></category>
		<category><![CDATA[3D mailbox]]></category>
		<category><![CDATA[Alon Nir]]></category>
		<category><![CDATA[Anselm Hook]]></category>
		<category><![CDATA[AR Wave]]></category>
		<category><![CDATA[are2010]]></category>
		<category><![CDATA[ARWave]]></category>
		<category><![CDATA[augmented reality event]]></category>
		<category><![CDATA[augmented reality games]]></category>
		<category><![CDATA[augmented twitter]]></category>
		<category><![CDATA[Dancing Ink Productions]]></category>
		<category><![CDATA[EComm]]></category>
		<category><![CDATA[Evolutionary Reality]]></category>
		<category><![CDATA[Farmville]]></category>
		<category><![CDATA[federation protocol]]></category>
		<category><![CDATA[Four Square]]></category>
		<category><![CDATA[Gamepocalypse]]></category>
		<category><![CDATA[gowalla]]></category>
		<category><![CDATA[Jeff Pulver]]></category>
		<category><![CDATA[Jerry Paffendorf]]></category>
		<category><![CDATA[Joshua Fouts]]></category>
		<category><![CDATA[Latitude]]></category>
		<category><![CDATA[Loveland]]></category>
		<category><![CDATA[micro-real estate]]></category>
		<category><![CDATA[mobile social]]></category>
		<category><![CDATA[mobile social augmented reality]]></category>
		<category><![CDATA[mobile social games]]></category>
		<category><![CDATA[Open AR]]></category>
		<category><![CDATA[Open AR Web]]></category>
		<category><![CDATA[open standard federated protocol]]></category>
		<category><![CDATA[Rita J. King]]></category>
		<category><![CDATA[Rouli Nir]]></category>
		<category><![CDATA[social games]]></category>
		<category><![CDATA[The Kotel]]></category>
		<category><![CDATA[Tish Shute]]></category>
		<category><![CDATA[tishshute]]></category>
		<category><![CDATA[wave federation protocol]]></category>
		<category><![CDATA[WhereCamp]]></category>

		<guid isPermaLink="false">http://www.ugotrade.com/?p=5406</guid>
		<description><![CDATA[Augmented Twitter &#8211; open, mobile, social augmented reality via ARwave. View more presentations from Tish Shute. Augmented Twitter Presenting Augmented Twitter (see video and slides above) at Jeff Pulver&#8217;s 140 Characters Conference (#140conf) was super fun, and great video makes this a conference that you can enjoy catching up on after the fact.  Jeff Pulver [&#8230;]]]></description>
				<content:encoded><![CDATA[<p><a href="http://www.ippio.com/view_video.php?viewkey=da6ab5c15dd856998e4b" target="_blank"><img class="alignnone size-full wp-image-5407" title="Screen shot 2010-04-22 at 9.52.22 AM" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/04/Screen-shot-2010-04-22-at-9.52.22-AM.png" alt="Screen shot 2010-04-22 at 9.52.22 AM" width="458" height="368" /></a></p>
<div id="__ss_3817428" style="width: 425px;"><strong style="display:block;margin:12px 0 4px"><a title="Augmented twitter - open, mobile social augmented reality via ARwave" href="http://www.slideshare.net/TishShute/augmented-twitter">Augmented Twitter &#8211; open, mobile, social augmented reality via ARwave</a></strong><object classid="clsid:d27cdb6e-ae6d-11cf-96b8-444553540000" width="425" height="355" codebase="http://download.macromedia.com/pub/shockwave/cabs/flash/swflash.cab#version=6,0,40,0"><param name="allowFullScreen" value="true" /><param name="allowScriptAccess" value="always" /><param name="src" value="http://static.slidesharecdn.com/swf/ssplayer2.swf?doc=augmentedtwitter-100422085925-phpapp01&amp;stripped_title=augmented-twitter" /><param name="allowfullscreen" value="true" /><embed type="application/x-shockwave-flash" width="425" height="355" src="http://static.slidesharecdn.com/swf/ssplayer2.swf?doc=augmentedtwitter-100422085925-phpapp01&amp;stripped_title=augmented-twitter" allowscriptaccess="always" allowfullscreen="true"></embed></object>View more <a href="http://www.slideshare.net/">presentations</a> from <a href="http://www.slideshare.net/TishShute">Tish Shute</a>.</div>
<p><br /></p>
<h3>Augmented Twitter</h3>
<p>Presenting <a href="http://www.ippio.com/view_video.php?viewkey=da6ab5c15dd856998e4b" target="_blank">Augmented Twitter</a> (see video and slides above) at <a href="http://140conf.com/" target="_blank">Jeff Pulver&#8217;s 140 Characters Conference</a> (#140conf) was super fun, and <a href="http://www.ippio.com/140conf" target="_blank">great video</a> makes this a conference that you can enjoy catching up on after the fact.  Jeff Pulver does an excellent job of keeping people to a challengingly short format.  Even I managed to bring my talk in under 5 mins!</p>
<p>#140conf is a real time mobile social crowd, and pretty attuned to Augmented Reality.  Everyone in the audience had heard of Augmented Reality, and while most had never tried an AR app, nearly everyone used a mobile social app like <a href="http://foursquare.com/" target="_blank">Four Square</a>, <a href="http://gowalla.com/" target="_blank">Gowalla</a>, or <a href="http://www.google.com/latitude/intro.html" target="_blank">Latitude</a>.  As Dan Harple (@dharple), Executive Chairman, <a href="http://www.gypsii.com/" target="_blank">GyPSii</a>, said in his interesting presentation, <a href="http://www.ippio.com/view_video.php?viewkey=44143e1f2f13b2b729ab"><strong>Evolution of Location and Places</strong></a>: &#8220;everyone gets connection, and that connection in real time is the thing if we can get it, and that real time connection is innately mobile.&#8221;</p>
<p><a href="http://www.arwave.org/" target="_blank">ARwave</a> aims to push mobile, social, real time connection even further with augmented reality.  As Anselm Hook puts it so brilliantly in his <a href="http://www.slideshare.net/anselm/20100421-ecomm-pressy" target="_blank">presentation at EComm</a>, &#8220;AR is about publishing &#8216;verbs&#8217; &#8211; interactive, actionable, digital agents &#8211; not publishing 3D models.&#8221;  I have some mega posts brewing on this topic.  Augmented Reality will need to support publishing game-like behavior, and digital agents that can embody a set of actions and reactions.</p>
<p>This need for augmented reality to publish behavior, and to share and integrate multiple real time data streams in one view, is just one of the reasons <a href="http://www.arwave.org/" target="_blank">AR Wave</a> uses <a href="http://www.waveprotocol.org/" target="_blank">an open federated protocol</a>.  Federation is also particularly important for augmented reality because, as Anselm pointed out at <a href="http://wherecamp.org/" target="_blank">WhereCamp</a>, AR will certainly demand very efficient distribution of state change at the systems level &#8211; to move the computation to its lowest latency.</p>
<p>The only other cloud over our Augmented Reality party at #140conf was that #ashtag kept our co-panelist and panel chair from joining us.  Rita J. King, @ritajking &#8211; Innovator-in-Residence at IBM&#8217;s Analytics Virtual Center, the &#8220;General of the Imagination Age,&#8221; and head of <a href="http://dancinginkproductions.com/" target="_blank">Dancing Ink Productions</a> &#8211; and Joshua Fouts, @josholalia, &#8220;Cultural Attach&#233;&#8221; and Chief Global Strategist of Dancing Ink, were on a 5 day trek out of the #ashcloud, and, sadly, not there for our panel.</p>
<p>But Twitter, once again, was a lifeline in a time of crisis, connecting them to <a href="TEDxVolcano">TEDxVolcano</a>, an impromptu unconference with must-see presentations from Rita and others &#8211; see <a href="http://www.theimaginationage.net/" target="_blank">Rita&#8217;s blog for more</a>.</p>
<p>So the two of us carried the flag for Augmented Twitter: myself and Jerry Paffendorf &#8211; futurist, artist, entrepreneur, and swell guy &#8211; the co-inventor of the most famous real time social web system you have never heard of (actually I tried and loved it in alpha testing, before it was, quote, &#8220;shut down by blood thirsty investors&#8221;).</p>
<p>Now Jerry lives in Detroit, Michigan, where he works on the <a href="http://makeloveland.com/" target="_blank">Loveland Micro-real estate project</a>, which is the simplest, cheapest, funnest way to become a land owner.  At a dollar a square inch it mixes video games and real estate, like Farmville for urban development.</p>
<p>Joshua and Rita, our very virtual panel mates, are the first and largest inchvestors, and are creating their own micro city within the project.  Jerry is one of the most creative and original thinkers on the planet, so treat yourself to a glimpse of what is on his mind in the video above &#8211; <a href="http://makeloveland.com/" target="_blank">Loveland</a>, <a href="http://www.3dmailbox.com/" target="_blank">3D mailbox</a>, canned augmented reality, and the relationship of virtual worlds to the real time social web.</p>
<p>Jerry also hat-tipped one of the most captivating projects and presentations of the conference, Alon Nir&#8217;s <a href="http://www.ippio.com/view_video.php?viewkey=510442f2fd40f2100b05"><strong>The Story Behind @TheKotel</strong></a>, &#8220;Tweet Yr Prayers!&#8221;  What a great story about the power of Twitter to reach out into the world, and beyond!  I got a chance to chat with Alon at #140conf, and found out he is the brother of augmented reality guru Rouli Nir, @augmented.  Rouli is known for his sharp and comprehensive AR commentary on <a href="http://artimes.rouli.net/" target="_blank">Augmented Times</a> and <a href="http://gamesalfresco.com/2010/04/22/the-future-of-ar-browser/" target="_blank">Games Alfresco</a>.  Cool family!</p>
<p>Before I close this post, I want to mention @AndyDixn&#8217;s talk on the prison system, <a href="http://www.ippio.com/view_video.php?viewkey=7bc562a711ef96884a38"><strong>A conversation with Andy Dixon: What the prison yard &amp; twitter have in common</strong></a>.  This conversation, I think, is a great example of what makes #140conf special.  As @nwjerseyliz pointed out, we &#8220;hear few voices from those who&#8217;ve experienced that side of the issue.&#8221;</p>
<p>Thank you @jeffpulver for creating such a cool staging for so many diverse voices.</p>
<p>And before I close here is what the only slide I didn&#8217;t have time to show said!</p>
<h3><strong>If you liked &#8220;Augmented Twitter&#8221;<br />
Don&#8217;t miss Augmented Reality Event! </strong></h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/04/are234x60augmented_w.png"><img class="alignnone size-full wp-image-5424" title="are234x60augmented_w" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/04/are234x60augmented_w.png" alt="are234x60augmented_w" width="234" height="60" /></a></p>
<p><strong>2 days, 3 tracks, 40 AR companies, 76 Speakers. Art! Magic! Competitions! Awards! Bruce (the Prophet) Sterling, Will (The Sims) Wright, Jesse (Gamepocalypse) Schell, Blaise Aguera y Arcas (Microsoft Bing) and You!</strong> <strong>The <a href="http://augmentedrealityevent.com/2010/04/10/sneak-preview-of-are-2010-schedule-packed-with-augmented-reality-goodness/">sneak preview of the schedule is here</a>.</strong><br />
<strong>Register today at <a href="http://augmentedrealityevent.com/" target="_blank">AugmentedRealityEvent.com</a></strong></p>
<p><strong>Discount code for @140 attendees (and readers of this post!): <a href="https://register03.exgenex.com/GcmRegister/Index.Aspx?C=70000088&amp;M=50000500" target="_blank">TISH245</a> activates the $245 price for the full conference.</strong></p>
<p><strong>See you there!</strong></p>
]]></content:encoded>
			<wfw:commentRss>http://www.ugotrade.com/2010/04/23/augmented-twitter-at-jeff-pulvers-140conf/feed/</wfw:commentRss>
		<slash:comments>2</slash:comments>
		</item>
		<item>
		<title>Visual Search, Augmented Reality, and Physical Hyperlinks for Playfulness, Not just Purchases: Talking with Paige Saez about ImageWiki</title>
		<link>http://www.ugotrade.com/2010/03/18/visual-search-augmented-reality-and-physical-hyperlinks-for-playfulness-not-just-purchases-talking-with-paige-saez-about-imagewiki/</link>
		<comments>http://www.ugotrade.com/2010/03/18/visual-search-augmented-reality-and-physical-hyperlinks-for-playfulness-not-just-purchases-talking-with-paige-saez-about-imagewiki/#comments</comments>
		<pubDate>Fri, 19 Mar 2010 03:25:17 +0000</pubDate>
		<dc:creator><![CDATA[Tish Shute]]></dc:creator>
				<category><![CDATA[Ambient Devices]]></category>
		<category><![CDATA[Ambient Displays]]></category>
		<category><![CDATA[architecture of participation]]></category>
		<category><![CDATA[Artificial general Intelligence]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Augmented Reality]]></category>
		<category><![CDATA[culture of participation]]></category>
		<category><![CDATA[digital public space]]></category>
		<category><![CDATA[Instrumenting the World]]></category>
		<category><![CDATA[internet of things]]></category>
		<category><![CDATA[Mixed Reality]]></category>
		<category><![CDATA[mobile augmented reality]]></category>
		<category><![CDATA[mobile meets social]]></category>
		<category><![CDATA[Mobile Reality]]></category>
		<category><![CDATA[Participatory Culture]]></category>
		<category><![CDATA[social gaming]]></category>
		<category><![CDATA[social media]]></category>
		<category><![CDATA[ubiquitous computing]]></category>
		<category><![CDATA[Web Meets World]]></category>
		<category><![CDATA[websquared]]></category>
		<category><![CDATA[World 2.0]]></category>
		<category><![CDATA[Anselm Hook]]></category>
		<category><![CDATA[AR Wave]]></category>
		<category><![CDATA[are2010]]></category>
		<category><![CDATA[ARNY]]></category>
		<category><![CDATA[ARWave]]></category>
		<category><![CDATA[Augmented reality Magician]]></category>
		<category><![CDATA[Augmented Reality Meetup]]></category>
		<category><![CDATA[augmented reality search]]></category>
		<category><![CDATA[Bruce Sterling]]></category>
		<category><![CDATA[Chris Grayson]]></category>
		<category><![CDATA[distributed augmented reality]]></category>
		<category><![CDATA[Gamepocalypse]]></category>
		<category><![CDATA[google goggles]]></category>
		<category><![CDATA[imagewiki]]></category>
		<category><![CDATA[Imagwik]]></category>
		<category><![CDATA[interaction design]]></category>
		<category><![CDATA[Jason Kolb]]></category>
		<category><![CDATA[Jesse Schell]]></category>
		<category><![CDATA[linked data]]></category>
		<category><![CDATA[linked data and augmented reality]]></category>
		<category><![CDATA[Makerlab]]></category>
		<category><![CDATA[Marco Tempest]]></category>
		<category><![CDATA[open augmented reality]]></category>
		<category><![CDATA[open Frameworks]]></category>
		<category><![CDATA[open Frameworks and augmented reality]]></category>
		<category><![CDATA[OpenCV]]></category>
		<category><![CDATA[OpenCV and augmented reality]]></category>
		<category><![CDATA[optical character recognition]]></category>
		<category><![CDATA[Ori Inbar]]></category>
		<category><![CDATA[paige saez]]></category>
		<category><![CDATA[physical hyperlinking]]></category>
		<category><![CDATA[physical world platform]]></category>
		<category><![CDATA[point and find]]></category>
		<category><![CDATA[RDF and Augmented Reality Search]]></category>
		<category><![CDATA[semantic web and augmented reality]]></category>
		<category><![CDATA[snaptell]]></category>
		<category><![CDATA[social augmented experiences]]></category>
		<category><![CDATA[social augmented reality]]></category>
		<category><![CDATA[social commons]]></category>
		<category><![CDATA[Social Commons for Augmented Reality]]></category>
		<category><![CDATA[SPARQL]]></category>
		<category><![CDATA[SPARQL and ARWAVE]]></category>
		<category><![CDATA[SPARQL and Wave]]></category>
		<category><![CDATA[SPARQL and XMPP]]></category>
		<category><![CDATA[Steven Feiner]]></category>
		<category><![CDATA[Tish Shute]]></category>
		<category><![CDATA[ubiquity]]></category>
		<category><![CDATA[visual search]]></category>
		<category><![CDATA[Wave Federation Protocol]]></category>
		<category><![CDATA[Where2.0]]></category>
		<category><![CDATA[Will Wright]]></category>

		<guid isPermaLink="false">http://www.ugotrade.com/?p=5262</guid>
		<description><![CDATA[The video above, The Imawik commercial, is a collaboration between In The Can Productions and Paige Saez for Makerlab &#8220;The Imawik (ImageWiki) is a visual search tool for mobile devices. It allows for the ability to turn images into physical hyperlinks, conflating visual culture with a community-editable universal namespace for images.&#8221; Paige Saez is an [&#8230;]]]></description>
				<content:encoded><![CDATA[<p><object classid="clsid:d27cdb6e-ae6d-11cf-96b8-444553540000" width="400" height="225" codebase="http://download.macromedia.com/pub/shockwave/cabs/flash/swflash.cab#version=6,0,40,0"><param name="allowfullscreen" value="true" /><param name="allowscriptaccess" value="always" /><param name="src" value="http://vimeo.com/moogaloop.swf?clip_id=2818525&amp;server=vimeo.com&amp;show_title=1&amp;show_byline=1&amp;show_portrait=0&amp;color=&amp;fullscreen=1" /><embed type="application/x-shockwave-flash" width="400" height="225" src="http://vimeo.com/moogaloop.swf?clip_id=2818525&amp;server=vimeo.com&amp;show_title=1&amp;show_byline=1&amp;show_portrait=0&amp;color=&amp;fullscreen=1" allowscriptaccess="always" allowfullscreen="true"></embed></object></p>
<p><em>The video above, <a href="http://www.vimeo.com/2818525" target="_blank">The Imawik commercial</a>, is a collaboration between <a href="http://www.inthecanllc.com/" target="_blank">In The Can Productions</a> and <a href="http://makerlab.com/who.html" target="_blank">Paige Saez</a> for <a href="http://makerlab.com/projects_show_imagewiki.html" target="_blank">Makerlab</a>.</em></p>
<p>&#8220;The Imawik (<a href="http://imagewiki.org/" target="_blank">ImageWiki</a>) is a visual search tool for mobile devices. It allows for the  ability to turn images into physical hyperlinks, conflating visual  culture with a community-editable universal namespace for images.&#8221;</p>
<p>Paige Saez is an artist, designer and researcher.  In 2007 she founded <a href="http://makerlab.com/projects_show_imagewiki.html" target="_blank">Makerlab</a> &#8211; an arts and technology incubator focused on civic and environmental projects &#8211; with <a href="http://www.hook.org/" target="_blank">Anselm Hook</a>.</p>
<p>Paige and Anselm (see my interview with Anselm Hook here, <a title="Permanent Link to Visual Search, Augmented Reality and a Social Commons for the Physical World Platform: Interview with Anselm Hook" rel="bookmark" href="../../2010/01/17/visual-search-augmented-reality-and-a-social-commons-for-the-physical-world-platform-interview-with-anselm-hook/">Visual Search, Augmented Reality and a Social Commons for the Physical World Platform: Interview with Anselm Hook</a>) have been asking a very important question:</p>
<p><strong>&#8220;Who Will Own Our Augmented Future?&#8221;</strong></p>
<p>But most importantly, they have been actually developing applications (again, <a href="http://www.ugotrade.com/2010/01/17/visual-search-augmented-reality-and-a-social-commons-for-the-physical-world-platform-interview-with-anselm-hook/" target="_blank">see my interview with Anselm</a> for more background on this) to allow people to play with, hack, explore, and create with the physical world platform, and to imagine new possibilities for physical hyperlinking and augmented realities.  This is pretty important stuff, and kudos to Paige and Anselm for beginning this work before the big players &#8211; <a href="http://www.google.com/mobile/goggles/#dc=gh0gg" target="_blank">Google Goggles</a>, <a href="http://pointandfind.nokia.com/" target="_blank">Point and Find</a>, and <a href="http://www.snaptell.com/" target="_blank">SnapTell</a> &#8211; came hurtling into the field of visual search and physical hyperlinking.  <a href="http://techblips.dailyradar.com/video/translation-in-google-goggles-prototype/" target="_blank">See this demonstration of translation and optical character recognition</a> in Google Goggles.  Also check out Jamey Graham&#8217;s (Ricoh Research) Ignite presentation at Tools of Change 2010, <a href="http://www.toccon.com/toc2010/public/schedule/detail/13370" target="_blank">Visual Search: Connecting Newspapers, Magazines and Books to Digital Information without Barcodes</a>; for more see <a href="http://ricohinnovations.com/betalabs/visualsearch">ricohinnovations.com/betalabs/visualsearch</a>.</p>
<p>We are only just beginning to get a glimpse of how contested the social commons of the physical world platform is going to be &#8211; see the Yelp <a href="http://blogs.wsj.com/digits/2010/03/17/small-businesses-join-lawsuit-against-yelp/" target="_blank">controversy</a>.</p>
<p>As Paige points out:</p>
<p><strong>&#8220;The lens that you are actually looking through was as important as what you were looking at. And democratizing that lens became the most important thing that we could possibly do.&#8221;</strong></p>
<p>I am in total agreement.  One reason I have so much enthusiasm for <a href="http://arwave.wiki.zoho.com/HomePage.html" target="_blank">ARWave</a> (note: if you are interested in following the developer conversations, there are several public Waves) is that I see this open framework playing an important role in the democratization of our augmented views.  By creating an open, distributed, and universally accessible platform for augmented reality, it could make the creation of augmented reality content and games as simple as making an html page, or contributing to a wiki.</p>
<p>Federation; real time collaboration; <a href="http://linkeddata.org/" target="_blank">linked data</a> &#8211; ARBlips that contain metadata usable for semantic searches; and modified wave servers that can listen and respond properly to <a href="http://www.w3.org/TR/rdf-sparql-query/" target="_blank">SPARQL</a> HTTP requests (see Jason Kolb&#8217;s <a href="http://jasonkolb.com/" target="_blank">many interesting posts</a> on XMPP and Wave).  These are just some of the reasons why ARWave could revolutionize augmented reality searches and more! (See <a href="http://www.mobilemonday.nl/talks/tish-shute-the-next-wave-of-ar/" target="_blank">my presentation at MoMo13</a> &#8211; video <a href="http://www.youtube.com/watch?v=Y7iqg8X24mU" target="_blank">here</a>.)</p>
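<p>For a flavor of the geo-semantic queries that SPARQL over ARBlip metadata would allow, here is a toy in-memory triple matcher. The vocabulary (<code>arblip:near</code>, the blip IDs) is invented for illustration; a real deployment would of course speak actual SPARQL against RDF-encoded blips.</p>

```python
# Toy triple store showing the kind of query SPARQL over ARBlip metadata
# would enable. The vocabulary is made up for illustration only.

def match(triples, pattern):
    """Yield bindings for ?variables in a single (s, p, o) pattern."""
    for s, p, o in triples:
        binding = {}
        ok = True
        for want, got in zip(pattern, (s, p, o)):
            if want.startswith("?"):
                binding[want] = got
            elif want != got:
                ok = False
                break
        if ok:
            yield binding

triples = [
    ("blip:1", "arblip:near", "geo:union-square"),
    ("blip:1", "dc:creator", "tish"),
    ("blip:2", "arblip:near", "geo:santa-clara"),
]

# "Which blips are anchored near Union Square?"
hits = [b["?s"] for b in match(triples, ("?s", "arblip:near", "geo:union-square"))]
```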
<p>For more on real time social augmented experiences see our panel, <a href="http://en.oreilly.com/where2010/public/schedule/detail/11046" target="_blank">The Next Wave of AR: Exploring Social Augmented Experiences</a> at <a href="http://en.oreilly.com/where2010" target="_blank">Where2.0 2010</a>, and don&#8217;t miss the <a href="http://en.oreilly.com/where2010" target="_blank">Where2.0</a> conference which has been the crucible for the emergence of location technologies.</p>
<p>Augmented realities, proximity-based social networks, mapping &amp; location aware technologies, sensors everywhere, <a href="http://linkeddata.org/" target="_blank">linked data</a>, and human psychology are on a collision course in what <a href="http://www.schellgames.com/" target="_blank">Jesse Schell</a> calls the &#8220;Gamepocalypse.&#8221;  See <a href="http://g4tv.com/videos/44277/dice-2010-design-outside-the-box-presentation/" target="_blank">Jesse Schell&#8217;s Dice 2010 talk here</a>, and check out his <a href="http://www.gamepocalypsenow.blogspot.com/" target="_blank">Gamepocalypse Now</a> blog.  As Bruce Sterling notes in <a href="http://www.wired.com/beyond_the_beyond/2010/02/jesse-schell-future-of-games-from-dice-2010/" target="_blank">his post here</a>:</p>
<p><strong>*Another precious half hour out of your life.  However: if you&#8217;re into interaction design, ubiquity, social networking, and trendspotting, in the gaming biz or out of it, you&#8217;re gonna wanna do yourself a favor and listen to this.</strong></p>
<p>And don&#8217;t forget to <a href="http://augmentedrealityevent.com/register/" target="_blank">register now</a> for <a href="http://augmentedrealityevent.com/" target="_blank">Augmented Reality Event (ARE2010), 2-3 June, 2010, Santa Clara, CA</a><strong>.</strong></p>
<p><a href="http://www.wired.com/beyond_the_beyond/" target="_blank">Bruce Sterling</a>, <a href="http://www.stupidfunclub.com/" target="_blank">Will Wright</a>, and Jesse Schell <a href="http://augmentedrealityevent.com/speakers/" target="_blank">will be keynoting, and there is a totally awesome line up of AR innovators and industry leaders</a>, including Paige and Anselm!</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/03/bruce_sterling.jpg"><img class="alignnone size-thumbnail wp-image-5289" title="bruce_sterling" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/03/bruce_sterling-150x150.jpg" alt="bruce_sterling" width="150" height="150" /></a><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/03/will_wright.jpg"><img class="alignnone size-thumbnail wp-image-5290" title="will_wright" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/03/will_wright-150x150.jpg" alt="will_wright" width="150" height="150" /></a><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/03/Jesseschellpost.jpg"><img class="alignnone size-thumbnail wp-image-5291" title="Jesseschellpost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/03/Jesseschellpost-150x150.jpg" alt="Jesseschellpost" width="150" height="150" /></a></p>
<h3>And:</h3>
<p>You are in luck!</p>
<p>Here is a discount code for the first 100 folks to register for the event (before the end of March). Go to the <a href="https://register03.exgenex.com/GcmRegister/Index.Aspx?C=70000088&amp;M=50000500" target="_blank">registration page</a>, type in code AR245, and &#8220;you&#8217;ll be asked to pay only $245 for 2 full days of AR goodness.&#8221;</p>
<p>&#8220;Watching AR prophet Bruce Sterling, gaming legend Will Wright, and visionary game designer Jesse Schell deliver keynotes for this price is a magnificent steal.  And on top, participating in more than 30 talks by AR industry leaders will turn these $245 into your best investment of the year,&#8221; as Ori put it so well on Games Alfresco!</p>
<p>If you want a preview of just how exciting it is to be involved in augmented reality right now, check out <a href="http://gamesalfresco.com/2010/03/17/magic-games-education-and-live-coding-at-the-augmented-reality-meetup-in-nyc/" target="_blank">Ori Inbar&#8217;s great round up</a> of our latest monthly <a href="http://www.meetup.com/ARNY-Augmented-Reality-New-York/" target="_blank">Augmented Reality Meetup NY</a> (or, as Ori notes, as we fondly like to call it, <a href="http://www.meetup.com/ARNY-Augmented-Reality-New-York/" target="_blank">ARNY</a>).  There is lots of video up now (much thanks to <a href="http://www.chrisgrayson.com/" target="_blank">Chris Grayson</a>, who <a href="http://armeetup.org/001_arny/video/index.html" target="_blank">live streamed it</a>).  <a href="http://www.marcotempest.com/" target="_blank">Augmented Reality Magician Marco Tempest</a> is an absolute <strong>must</strong> see.  (Developers note: this is an awesome use of <a href="http://www.openframeworks.cc/" target="_blank">open Frameworks</a> and <a href="http://opencv.willowgarage.com/wiki/">OpenCV</a>.)  The video of the show includes a rare explanation of how it all works &#8211; see <a href="http://www.youtube.com/watch?v=6TluCaxz7KM&amp;feature=player_embedded" target="_blank">here</a>.</p>
<h3>Talking with Paige Saez &#8211; &#8220;Software is candy now!&#8221;</h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/03/paige_headshot_sq135.jpg"><img class="alignnone size-full wp-image-5266" title="paige_headshot_sq135" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/03/paige_headshot_sq135.jpg" alt="paige_headshot_sq135" width="135" height="135" /></a></p>
<p><strong>Tish  Shute:</strong> What interests me about ImageWiki is that you have thought  about physical hyperlinking beyond the obvious of where to get your  next good hamburger and beer, right?</p>
<p><strong>Paige Saez:</strong> Right. It was interesting for me in just thinking about the two things. How do you design a tool to work in a way that people are getting value from it? And also, how do you make it work in a way where people can explore and hack it? I think the most interesting technologies, and this is probably something somebody else said sometime, are the ones that disappear, that we don&#8217;t see, instead we see <em>through</em>. They become just the intermediaries. They don&#8217;t interfere with what we are trying to do.</p>
<p>It&#8217;s a struggle whenever you are developing a new way for people to get information or make something happen, because you are playing with magic a little bit. And you have to make it vanish the way a good magic trick makes an experience a magical one. But at the same time you also need to reveal just enough that you let people in and they can see how to change it and make it their own. That is the interesting tension for this space right now: the idea of augmented reality begins to lead to the idea of a social commons for physical things. The ImageWiki project was a locus of just this tension. Tish, you and I have previously discussed how difficult it was to even get people to understand the two concepts independently.</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/03/dhj5mk2g_515dwxtjnds_b.png"><img class="alignnone size-full wp-image-5269" title="dhj5mk2g_515dwxtjnds_b" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/03/dhj5mk2g_515dwxtjnds_b.png" alt="dhj5mk2g_515dwxtjnds_b" width="642" height="163" /></a></p>
<p><strong>Tish Shute:</strong> Right, until  recently most people hadn&#8217;t even heard the term augmented reality and I  am not sure that a particularly high percentage of people would  recognize it now despite the recent interest in smart phone apps.</p>
<p><strong>Paige Saez:</strong> It&#8217;s very  difficult to get people to understand the two concepts, and now you are  adding in the third level of participation as well. So I don&#8217;t think it  is impossible, but I do think it requires narrative. It is interesting  that you were talking about the stories you heard this morning from the  creatives at the event [Tish mentioned David Curcurito, Creative  Director, Esquire gave an excellent presentation at Sobel Media event  NYC] because it&#8217;s narrative and the attention to telling a story that  help you walk through all of the ways you can understand how completely  expansive this area is right now.</p>
<p>So I think we have to play with it, play with the space and the  tools. I think we need to have an idea of what we want people to use  the tool for, and we need to not only introduce them to the tool and the  technology, but also introduce them to the concepts as well. So I see  it as a three part process.</p>
<p>I&#8217;m really excited to be there with people,  helping them do that. I think we need to do this face to face. I don&#8217;t  think this can be only through a social network. The ImageWiki website  is like one quarter of the entire picture, you know? The website is the  resource center and the place where you can see people adding images,  but what value is it to you to see an added image? It is more valuable  for you to be interacting with the image or interacting with the object  in the real world.</p>
<p>Designing for the experience of using the ImageWiki got very complicated very fast. I was trying to figure out the main thrust of the design for the UI for the ImageWiki, and at a certain point I had to take a step back and say, &#8220;Okay, this has to be good enough for now, because we can lay it out and prototype as long as we want on the Web or mobile UI. What we need to be doing is going outside and actually aggregating and putting images into the database in order to see what exactly happens when we are adding.&#8221; It&#8217;s not just like you are taking a picture of something and adding it to Flickr. Using the tool is very context specific and the information is context specific, and you can&#8217;t necessarily make that all happen at the exact same time. I think these are really fascinating spaces to be struggling in and I&#8217;m so glad to be working in this space.</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/03/imagewiki_2.jpg"><img class="alignnone size-medium wp-image-5300" title="imagewiki_2" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/03/imagewiki_2-300x225.jpg" alt="imagewiki_2" width="300" height="225" /></a><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/03/imagewiki1.jpg"><img class="alignnone size-medium  wp-image-5299" title="imagewiki" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/03/imagewiki1-300x225.jpg" alt="imagewiki" width="300" height="225" /></a></p>
<p><em>Images by Chris Blow of <a href="http://unthinkingly.com/" target="_blank">unthinkingly.com</a></em></p>
<p><strong>Tish Shute:</strong> Could you explain why we need ImageWiki? I mean, I think I have ideas on this, but perhaps you can explain to me from your point of view why we need an ImageWiki, as opposed to, say, extending the image space of Wikimedia or something added on to Flickr. I mean, maybe something leveraging the geotagged photo sets and APIs we already have?</p>
<p><strong>Paige Saez:</strong> Yes, definitely. It&#8217;s a really good question, I mean it really is. Like, do you need an entirely new place to be holding images outside of the places that we are already holding images? That&#8217;s a huge question; enormous. Especially when you take a look at the problems around that. It&#8217;s exhausting for an end user. Who the heck wants to go and reload everything into <em>yet another place</em>, right?</p>
<p><strong>Tish Shute:</strong> Right.</p>
<p><strong>Paige Saez:</strong> Moreover, who is going to really bother? Another problem would be what happens to the existing datasets that people have already committed to? And then of course there is the problem of authority and explaining why &#8211; gaining interest and authority in a space when nobody even understands why that space should exist in the first place. And those are just three off-the-top-of-my-head problems with that idea.</p>
<p>And yet at the same time, I don&#8217;t actually know how else to go about thinking about the ImageWiki unless I think about it as its own thing. Then you start thinking about models of large independent image databases that exist already, examples of this from a product standpoint &#8211; references to consider. The Getty Foundation comes to mind. There are many other historical centers that have huge resources and images that are licensed out and used. So here we have a working example of people already doing this. But successfully? I don&#8217;t know. We do have a ton of intellectual property rights and copyright issues and ownership and use issues with images currently. As a working artist, these issues for me were a major red flag to consider. Working on the social commons for augmented reality starts paralleling issues found in digital rights management and intellectual property.</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/03/dhj5mk2g_518gpgpr7gd_b.png"><img class="alignnone size-full wp-image-5274" title="dhj5mk2g_518gpgpr7gd_b" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2010/03/dhj5mk2g_518gpgpr7gd_b.png" alt="dhj5mk2g_518gpgpr7gd_b" width="441" height="606" /></a></p>
<p><strong>Tish Shute:</strong> But one good thing about Wikimedia, why I focused on Wikimedia, is Flickr and Wikimedia already use a creative commons licensing, right?</p>
<p><strong>Paige Saez:</strong> Creative commons, you know they have their own resource center, too. But you know they haven&#8217;t been successful as great databases for images so far.</p>
<p><strong>Tish Shute:</strong> What would you like to see that they don&#8217;t have? Like say maybe start with Wikimedia, right?</p>
<p><strong>Paige Saez:</strong> There&#8217;s just still a lot of issues with how to encourage people to want to contribute. It&#8217;s hard to show the value to someone who doesn&#8217;t already understand the value for some reason. At least for me personally this is something I have run into frequently. I don&#8217;t know if it is necessarily what Wikimedia doesn&#8217;t have, I think it is a lack of understanding of what creative commons really means. And there is still a very strong sense of ownership and concern about creative property rights. Being paid to be creative is a tremendously difficult thing to do. People fear losing their livelihoods. They think this is possible. Is it? I dunno.</p>
<p>For example: look at me, I take a photograph of something, I can sell that. And there&#8217;s a question about whether or not, as an artist, I want to have my photographs in a pool of images that is open and accessible when I could be making money on it instead. Now that is just an example. Me personally, I can see the value. But that is a common concern. The gist of the question being, &#8216;what value does it bring to give something away versus holding on to it?&#8217; A hugely popular discussion right now.</p>
<p>This is the same crux of the problem we are dealing with when we talk about thinking about images in the social commons for the real world. It&#8217;s a conversation about ownership. It&#8217;s about, who does this belong to really? If I take a photograph of a Levi&#8217;s billboard, does that photograph belong to me or does it belong to Levi&#8217;s? We know the boundaries of that. But when the image becomes a living image, an image capable of transmutation; an image that provokes an action or hyperlinks to a product, experience, information&#8230;.where are the boundaries in that?</p>
<p><strong>Tish Shute: </strong>But how is ImageWiki handling that differently from Wikimedia, I suppose is my question.</p>
<p><strong>Paige Saez:</strong> We haven&#8217;t solved the problem.</p>
<p><strong>Tish Shute:</strong> Yes, I suppose it is not like we have fully solved the problem of a creative commons for images on the internet, let alone the issues of a social commons for the real world! So neither one has solved the problem, right?</p>
<p><strong>Paige Saez:</strong> Exactly. To be honest, it made my head spin. I realized we were building a web application and a mobile tool doing augmented reality, real-time feedback on the world, and suddenly we weren&#8217;t. Suddenly we were dealing with DNS and talking about physical hyperlinks and ownership and property. And basically at that point you just have to sit and really start catching up on IP issues and figuring out how to deal with that space in a much more holistic way. It became so important that we had to take a step back and go,</p>
<p>&#8220;Oh my god, I think we have really uncovered a real problem here.&#8221;</p>
<p>At the point when we were building out the tools, we realized something was really going on with our project. Here we were thinking that this was just a beautiful experience of learning about the world around us. We really&#8230; Anselm and I both just really wanted this tool to exist. It was something that we both just really wanted to happen in the world, something that we felt really just thrilled to make. And we looked at it and used it and realized that instead of it just being a beautiful experience, it was a fundamental shift in how we understood everything. That it impacted our world in the same way the Internet impacted our world. It was a fundamental shift in understanding. A sea change.</p>
<p>So I put down the prototype and went back to researching, read a ton of books on IP and went and presented to friends, family, schoolmates and co-workers trying to explain the project and then the larger conceptual framework that had emerged from the project. I began using the metaphor of thinking about Magritte&#8217;s &#8220;Ceci n&#8217;est pas une pipe.&#8221; Thinking about a pipe that isn&#8217;t actually a pipe.</p>
<p><strong>Tish Shute:</strong> Oh, yes!</p>
<p><strong>Paige Saez: </strong>..to try to help explain to people that the image that you see is actually not, you know, it&#8217;s not an image of a thing. It&#8217;s an image. And that image has a tone and that image has a voice, and that image was chosen. And there were decisions that were made through the interface of the camera, specific decisions that defined the view of what you were looking at. And that that wasn&#8217;t being acknowledged and that that was a fundamental part of what the ImageWiki was aiming to do. The lens that you are actually looking through was as important as what you were looking at. And democratizing that lens became the most important thing that we could possibly do.</p>
<p><strong>Tish Shute:</strong> So the emphasis for you on ImageWiki was in fact the lens, even though you found obstacles to creating the interface, right?</p>
<p><strong>Paige Saez:</strong> Yes. Definitely. That&#8217;s what I fell in love with first. I really wanted to be able to use my phone to learn about what kind of tree this was or to buy tickets for the band on the poster I just saw, or see a hidden secret. For me it was very much a story, a narrative experience that I just thought was magical. And that is how I fell in love with it, which is not where I ended up.  Where I ended up was realizing it was a fundamental shift in not only my own understanding of how to use the world around me, but in our understanding of looking at the world.</p>
<p><strong>Tish Shute: </strong>It would be pretty scary if an image DNS was basically in the hands of either one or very few people, right? I mean even ImageWiki would be stuck with this problem: if you set up a bunch of servers, you are going to be holding a very, very large image database, whatever your motivation, right? I think at the minute that is why I am very into seeing everything through the lens of federation. I see that unless we have federation, these giant central databases are inevitable, aren&#8217;t they?</p>
<p><strong>Paige Saez: </strong>Essentially, yes. I mean I wasn&#8217;t able to walk through it as quickly as that. It kind of just overwhelmed me. Looking back on it, it seems perfectly obvious. I was just like, &#8220;Oh my god, what have we done? Like what is going on?&#8221; Particularly for me, because so much of my life has been spent in art, it was really easy to immediately understand the connection between the view, the viewer, and what&#8217;s being viewed as all just different layers of ownership and understanding that it is a gaze. Right? We know that we are never able to look at something without passing judgment on it, but to see that become a part of the interface in a real-time fashion just blew my mind.</p>
<p><strong>Tish Shute: </strong>Yes.</p>
<p><strong>Paige Saez:</strong> I think you are right. Getty Images, Flickr images, no matter what you are always holding on to something and you have to be responsible for it. Right? So how do you deal with the responsibility but don&#8217;t take on too much ownership? Where is the boundary with that?</p>
<p><strong>Tish Shute: </strong>And for me, the simple answer to that is loosely connected small parts, distributed systems, and federation. Because the only way to be able to utilize these things is to have them distributed so that no one holds all the cards. Right?</p>
<p><strong>Paige Saez: </strong>Definitely and I personally agree with you wholeheartedly. However, the idea of distributed power is a concept that most people just don&#8217;t know how to deal with.</p>
<p><strong>Tish Shute:</strong> And it&#8217;s easier said than done, because the root problems that you are talking about aren&#8217;t gotten rid of through federation: if someone holds all the good image databases, the fact that they have the potential to be federated doesn&#8217;t mean they will choose to open them up, on many levels.</p>
<p><strong>Paige Saez:</strong> And even then you have to think about, sort of, like the next level of it, which is we want it to be all open and accessible, but everything is owned by somebody. Like, what really is public anymore, in general?</p>
<p><strong>Tish Shute:</strong> And what is interesting, though, regardless of what we speculate conceptually on this, is that we have already set off down the road. I mean we already have several large&#8230; they are all in beta, I suppose: Google Goggles, Point and Find, right? But we have applications that are beginning to implement this. They are beginning to implement search on it, and it is geo-located even if it&#8217;s not in an augmented view, right? So it is proximity based.</p>
<p><strong>Paige Saez: </strong>Right, right. I mean maybe the solution is that if we follow that line of thinking, then Flickr will be partnering with Google Goggles. And then my images would stay under my ownership through the authority of Flickr. And I would use Flickr as my place to add images, and they would just be responsive on my devices via AR.</p>
<p><strong>Tish Shute:</strong> That&#8217;s very interesting.</p>
<p><strong>Paige Saez:</strong> Definitely I think so. It is also the shortest distance between things.</p>
<p><strong>Tish Shute:</strong> Yes, and as Anselm kept pointing out, basically it is going to happen in the simplest way possible, really, regardless of the implications of that. But OK, getting back to ImageWiki. As you say neither Wikimedia nor Flickr were really designed to take this role, right?</p>
<p><strong>Paige Saez:</strong> Right.</p>
<p><strong>Tish Shute:</strong> With ImageWiki, you&#8217;ve had these ideas and a concern with the social implications of physical hyperlinking in your mind since its inception. Are there any design ideas you&#8217;ve come up with, as opposed to, as you say, connecting Flickr to Point and Find, or who knows, Google Goggles? How is ImageWiki going to be different, do you think? Is that a hard question at this point?</p>
<p><strong>Paige Saez:</strong> It is, and it&#8217;s a great question, and it&#8217;s a question I really love to think about. I think we have to introduce the politics with the tools. It has to be acknowledged that it&#8217;s not just a place to hold information, that&#8217;s what I feel in my heart.</p>
<p>At the same time, is that too much for people to really grasp at one time? In my experience it really has been, so the design of the experience needs to allow for an understanding of the power of the tool and the level of authority that the tool offers, while not getting in the way of it; just using it.  Because ultimately, at the end of the day, nobody will use anything if it isn&#8217;t valuable to them. And so I could talk for miles and miles and miles about how important it is that corporations don&#8217;t own all of the rights to all of the visual things in my life, right? For the rest of my life I could talk about that. The idea that advertising is dominating all of our views of anything in the world around us is horrifying. It doesn&#8217;t matter unless I can show somebody why it matters to them or how it affects them. It&#8217;s just that that is a tremendously difficult thing to explain through a user interface.</p>
<p>And I actually think that it&#8217;s great that tools like Google Goggles and Nokia Point and Find are here to do a lot of the hard work of showing people how it works. Recently somebody explained to me their experience of using Google Goggles. They went through this process of saying how the Google Goggles took a picture and then did this really complicated visual scanning thing over the image and it took a full minute.</p>
<p>And I said, &#8220;Well, of course they did it that way.&#8221; And they said, &#8220;Well, what do you mean?&#8221; I said, &#8220;Well, what they are really doing there when they are doing all these fancy graphics is showing you how it works.&#8221; And even if it isn&#8217;t actually related at all to how it functionally works, algorithmically, that&#8217;s not the point. The point is that this gesture of the time taken to make it look like it&#8217;s scanning an image and going back and forth with pretty colors is giving people the time to process that as an experience. That&#8217;s a metaphor for what&#8217;s really happening. And these kinds of metaphors are crucial in user experience design. We have lots and lots of examples of them and how they work, and many of them aren&#8217;t necessary. Like, you know, for example, the bar that shows you the time it&#8217;s taking for something to process. There is no relationship between that and reality. But it is really important.</p>
<p><strong>Tish Shute:</strong> Yes, those bars often have no relationship to the actual time&#8230;</p>
<p><strong>Paige Saez:</strong> And that&#8217;s the thing. Like the idea of time versus our perceived understanding of time. Right? The length of time it takes for your Firefox browser to open and load your last 30 tabs, versus the reality of what&#8217;s actually happening. When you are doing that sort of research you are actually accessing millions and millions of places and points of interest all over the world, so we need more of that. We need more of the process shown. Anselm and I worked with a filmmaker named Karl Lind from In the Can Productions here in Portland to try and make a video about the ImageWiki. We made this little video and I can try to show it to you or send it to you if you want.</p>
<p><strong>Tish Shute:</strong> One of the issues with this kind of visual search is that it is inherently dependent on databases that, regardless of how they are federated, are going to be very large. Right? I mean, someone is going to have something big and aggregated there. I suppose someone will figure out the challenges of federated search eventually, but that is quite a big challenge!</p>
<p>So I suppose I am still trying to understand what ImageWiki can offer that we can&#8217;t get with any other existing service? How will there be a social commons, and even a social contract, for the world as a platform for computing and physical hyperlinks?</p>
<p>Eben Moglen brought up something when I talked to him about virtual worlds. He said we need code angels to let us know what was going on in the virtual space &#8211; who was gathering data and how, for example.</p>
<p><strong>Paige Saez:</strong> Tell me more about that, I want to hear more about that.</p>
<p><strong>Tish Shute: </strong>Eben suggested this metaphor when I was asking him about privacy in virtual worlds &#8211; the fact that people just didn&#8217;t know, when they were pushing avatars around virtual worlds, what metrics were being gathered on their behavior. And he basically said that what we need is code angels when we enter these spaces, because having the rules of the game buried in a TOS was ridiculous.</p>
<p><strong>Paige Saez:</strong> That is a really interesting idea.</p>
<p><strong>Tish Shute: </strong> Maybe ImageWiki needs to be our code angel to navigate the augmented world. I mean that&#8217;s what I want to see it as. And when I hear you talk, what I hear is you talking in broad categories about what a code angel might be in the space of images and image links to the physical world. I mean that is what I hear from you.</p>
<p><strong>Paige Saez:</strong> Yeah. No, I definitely agree with that. It is interesting. In that sense, it is kind of a protection layer. Is that what you are thinking?</p>
<p><strong>Tish Shute: </strong>Yes, I suppose because we can&#8217;t be navigating a lot of complicated opt-ins and opt-outs just to get around our neighborhood safely, in terms of privacy (also see Eben Moglen&#8217;s definition of privacy here&#8230;). We will need a code angel that is sort of keeping up with you in real time!</p>
<p><strong>Paige Saez:</strong> Right, right. I wonder how that would work in regards to images, though. That is a really interesting thing to try and put on an image. I guess why I am having such a hard time being specific about it is I am just trying to work it out in my head, thinking of a specific use case, like what would be an example of that?</p>
<p><strong>Tish Shute: </strong>Well, I suppose the example, and this is a crude one, is when you point your Google Goggles at the book jacket, the code angel, this is very crude, would say, &#8220;You are right now drawing images from the Amazon database &#8211; they are collecting such and such data from your search.&#8221;</p>
<p>And then of course the ability to have crowd-sourced tagging and corrections&#8230;</p>
<p>There was a wonderful book that came out last year on how we can have commercial intelligence &#8211; Dan Goleman&#8217;s new book, &#8220;Ecological Intelligence: How Knowing the Hidden Impacts of What We Buy Can Change Everything&#8221;&#8230;</p>
<p>&#8230;how corporations&#8217; various stakeholders, including their customers, will drive corporations to do the morally right thing, because they will lose the commercial support of customers who won&#8217;t support them unless they are greener, fairer, doing the things we would like them to do, whatever those happen to be &#8211; physical hyperlinking and tagging, I guess, would be a big part of this.</p>
<p><strong>Paige Saez:</strong> Sort of a transparency issue. And that almost becomes a page rank algorithm in and of itself. I mean, now we are really talking about search more than anything, and what tool becomes the dominant search tool. Anselm and I talked a lot about one platform&#8230; I mean, eventually we will have a unified platform, no matter what, for the Internet and for physical objects and visual objects in the real world. It will just be a matter of, literally, who can find the best and most valuable, most relevant information on a thing. Currently we just have it very proprietary.</p>
<p><strong>Tish Shute:</strong> Yes.</p>
<p><strong>Paige Saez: </strong>That definitely won&#8217;t last. It just can&#8217;t, because of the exact problem that you are raising. And we already know too much about resources and information as they pertain to products for us to ever go back to a time where we are not considering other ways of getting information about it anyway. Right?</p>
<p>Like, I have the same concerns nowadays when I look at fruit. I look at a piece of fruit in the store. I would never just assume anymore that the person who put the sticker on that fruit is necessarily the ultimate authority. I would always assume at this point that I could go online and find out more information about a company. Issues like eco-footprint, or how much toxicity, or pesticides, or whatnot are now totally accessible already.</p>
<p>So I am thinking, when you look at that piece of fruit and that sticker with Google, say, what you are describing &#8211; do we just go immediately to the company&#8217;s website, or is it even more specific? Do we know that the sticker on that piece of fruit is going to tell us specific information about that? Or are we just getting back the nutritional resources, or are we getting a listing of all of the different options out of a page rank algorithm that shows us, &#8220;Well, this is the website for the fruit. Here is the nutritional information. Here are the last 15 comments on it.&#8221; It&#8217;s basically just a basic search.</p>
<p>Have you heard of Good Search?</p>
<p><strong>Tish Shute:</strong> You mean http://en.wikipedia.org/wiki/GoodSearch?</p>
<p><strong>Paige Saez:</strong> Right.</p>
<p><strong>Tish Shute: </strong>A code angel interface would have to give you options on the possible views available, wouldn&#8217;t it?</p>
<p><strong>Paige Saez:</strong> Yes. You are then talking about filtering your view. Then it gets really interesting, of course. I don&#8217;t even know if we have a choice in that. I think we are really kind of hitting a wall with who owns the space and the platform. Is it just a basic search because we are already familiar with search? If you had an option to choose, say, &#8220;I want to look at this apple sticker and I only want to get&#8230; programmatically only looking at my friends&#8217; opinions of this company.&#8221;</p>
<p>Or I have a safety valve on it that only shows me certain information based on what the code angel knows about me, my preferences, my age, things like that. Then that gets really, really interesting, because we are trying to do all that work right now just with social media and the Internet. We are already overwhelmed with too much information. It is already past the point of comprehension. So to think that we would actually drill down even more specifics is very interesting.</p>
<p><strong>Tish Shute:</strong> That was a point Anselm made about the fact that once you are into this mobile, just in time, one view kind of situation, it is quite different than the Internet where you can bring up all these different screens and go to another website.</p>
<p><strong>Paige Saez: </strong>Well, yes, mobile is a different level of engagement. Very contextual. Much less information. Much more about timeliness. I don&#8217;t want to look at an apple and get back a Google search. Oh my God, no. That&#8217;s the last thing I want. I would love to be able to look at an apple and have my phone already know exactly what I want, information-wise, to get back from that apple. But I don&#8217;t know. It&#8217;s all contextual and personal. So I think the code angel concept you are talking about is really interesting, because you still need to think about who is the person that is adding or creating those level filters &#8211; is it you, a filtered friend network, an algorithm? How much work is too much work? Where do we draw the line? How much of this are we willing to let the machine do for us?</p>
<p><strong>Tish Shute: </strong>Right.</p>
<p><strong>Paige Saez: </strong>And then of course once you have those filters in place, you need control over them. You will need to dial them up and dial them down, be able to choose and add new ones, so on and so forth. It becomes very modal at that point. For example, I want to change my view: to walk into a grocery store and, instead of finding out information, I&#8217;d want to see where the hidden Easter egg puzzles were that my friends left last week, because we&#8217;re playing a game.</p>
<p>I&#8217;m still really attracted to the creative opportunities with the ImageWiki. I&#8217;m really attracted to changing this experience from being a one-to-one relationship (from Corporation to Consumer) to an open-ended relationship (from Person to Person). If I look at a book jacket, sure, I can find out where to buy the book, but that&#8217;s boring. Who cares? I&#8217;d like to find out a link to a story or an adventure or a movie or something unthought-of before.</p>
<p>How do we build that in? How do we encourage serendipity? Mystery? I think the ImageWiki is the space for building that in, actually. Now, that would be the one place, right? That&#8217;s my really big fear: that this relationship just stays one-to-one. Click an image of a consumable object, get back the object&#8217;s retail value. How completely dull. We have to do better than this.</p>
<p>Additionally, what if I want to take a photograph of a book, an apple, or something, and I don&#8217;t want to pull back data? Instead, I want to pull back music, or I want to pull back a video, or a song, or lyrics, or a story, or another image. It&#8217;s just a hyperlink at the end of the day, you know? That&#8217;s all we&#8217;re really doing. Hyperlinks can pull back so many different things.</p>
<p><strong>Tish Shute:</strong> And that&#8217;s one of the reasons I&#8217;m into building mobile social interaction utilities, because without that way of doing things in mobile technology&#8230; that kind of interaction is very available on the Internet, as we&#8217;ve seen with Twitter. These applications are very easy to do on the Internet. They&#8217;re not easy to do natively in a mobile application.</p>
<p>Hey, I&#8217;m just promoting AR Wave again. I should shut up.</p>
<p><strong>Paige Saez:</strong> Oh, no. I think it&#8217;s a fascinating concept, I really do. I totally agree. As we&#8217;ve talked about before, it&#8217;s amazing that marketing and advertising are helping push AR forward, and it&#8217;s great. It&#8217;s fantastic.</p>
<p>But it&#8217;s also the worst possible thing that could ever happen, because it is such a singular way of looking at an overall ubiquitous computing experience. There are other ways.</p>
<p>The best experience I ever had was trying to explain physical hyperlinks to people. I had to walk them through it. Good interactive isn&#8217;t something you present or show, it&#8217;s something you do. Nothing beats just walking around and showing people with a device or a tool or something else.</p>
<p>I mean, God forbid it always stays in our computers and our phones. I really hope we don&#8217;t have to be stuck living our entire lives with these horrible interfaces. But for the time being, we will. Having an AR app show you a puzzle, or a mystery, or a game, or an adventure is a magnificent experience, totally overwhelming, and people get it right away. There&#8217;s no question; they totally understand.</p>
<p><strong>Tish Shute:</strong> Yes, I agree.</p>
<p><strong>Paige Saez:</strong> You walk them through the experience with a physical hyperlink and then you say, &#8220;Here, I could use this device and I could show you where to buy this thing, or I could use this device and we could start playing a game.&#8221; Then everybody gets it.</p>
<p><strong>Tish Shute:</strong> So then I have a question, because one of the things Anselm said when he wanted me to refer back to you is that he feels the direction for ImageWiki should perhaps be to focus less on the technology and more on the actual gathering of the images: how they&#8217;re going to be annotated, the metadata, right? But my question to him was: if you do that without the platform, there&#8217;s no experience or motivation for people to do that. Right? Is there?</p>
<p><strong>Paige Saez: </strong>Yeah, I agree with you on that one. I&#8217;m curious what his&#8230; I think the reason he wants to do that is that he wants to be able to show people examples via the resources, to be able to show someone a library, essentially, which I think makes sense with some people. I definitely think that some audiences would really relate to that. For me, it doesn&#8217;t make sense because I&#8217;m just very experiential. I need to do it and I need to show other people how to do it and I need to grow that way. I think that at the end of the day, those are great ways to go about doing it. It&#8217;s just that it&#8217;s a huge thing to do in either direction.</p>
<p>What Anselm&#8217;s really thinking about, I believe, is more about exemplifying how we read and understand images culturally. Then you&#8217;re really getting into Visual Studies and Critical Theory, which is what I did for my Masters at PNCA. I worked on the ImageWiki while I was in grad school; it was something I was doing for fun. Independently of my studies, the project led to issues of democracy and objects and property, and I ended up right smack in the middle of what I was studying: the nature and cultural analysis of images. Questions like, &#8216;what exactly do we get out of images?&#8217;, how all these different things are happening in an image, and how people get tons of totally different things out of an image depending on many factors.</p>
<p>The questions I began to ask myself got very philosophical. Questions like, &#8220;Is this apple red? Is this apple red-orange? Is this a small apple? What&#8217;s my understanding of small versus your understanding of small?&#8221;</p>
<p>Suppose you needed a text backup to the search: how would I be able to search for an apple? What if my understanding of apple is red and your understanding of apple is green? If I&#8217;m looking for a green apple, am I looking for the same green apple as you? It&#8217;s all semantics, sure. But at the same time, it gets bigger and bigger, and it&#8217;s fascinating.</p>
<p><strong>Tish Shute: </strong>Google Goggles seems to work best on book jackets, basically.</p>
<p><strong>Paige Saez: </strong>But book jackets are actually perfect for this problem, because book jackets are specifically designed art. So at the end of the day, we are still talking about creative works, artistic works, that have been designed as a communication tool. But that is not something that people can own. Creative works designed as a communication tool, with varying levels of skill to be sure, are still something anybody can do. What we need to do is use that language. We don&#8217;t need to be trying to reach as far as facial recognition. We need to develop our own logos, our own brand, our own&#8230; I mean, not brand. Brand is a bad way of saying it. Another way of saying it: develop a visual language that we can use that is as effective and as well utilized as book jackets or movie posters.</p>
<p><strong>Tish Shute:</strong> What are some of the use cases for ImageWiki you would like to develop first?</p>
<p><strong>Paige Saez:</strong> My dream&#8230; I have like four or five use cases that I want to see happen. One of them is I walk down the street and there is a new poster for my favorite band. And I can just go up to the poster and use my device, whatever it looks like, and I download the latest album. It&#8217;s transactional. I am able to just plug in my headset and walk down the street and the transaction is done. I saw something I wanted. It was beautiful. I was able to get it and I was able to move on in my life. And that is totally possible.</p>
<p>Another one would be I walk down the street and there is a piece of graffiti. And I am able to use my device to find out who the artist was that made it, and to give them props, and to point my other friends to the fact that the piece is there and will most likely be there only for a short period of time: information retrieval and socialization.</p>
<p>Or, use my device to find an Easter egg, to find a narrative puzzle that ends up going on for weeks, and everybody is involved, and we are all playing this game together. Adventure-based, non-linear experiences. I want playfulness, not just purchases.</p>
<p><strong>Tish Shute: </strong> Did you think of piggybacking on the Flickr API for geo-tagged photos as a way to work with those databases or not?</p>
<p><strong>Paige Saez:</strong> Yeah, we definitely thought about that.</p>
<p><strong>Tish Shute: </strong>And why did you decide not to? For any reason, or&#8230;?</p>
<p><strong>Paige Saez:</strong> Ultimately, we just&#8230; we were such a small group, we just had to tackle certain things at a certain time.</p>
<p><strong>Tish Shute:</strong> Right. And you were so prescient; you were working slightly before we had the mediating devices, weren&#8217;t you? You were just before the mobile devices really got adequate for this.</p>
<p><strong>Paige Saez:</strong> Yeah. We started on it&#8230; I believe it was January&#8230; No, December 2007. Basically, the iPhone had just launched maybe six months prior or something like that.</p>
<p><strong>Tish Shute:</strong> But not 3G and not 3GS, right?</p>
<p><strong>Paige Saez: </strong>Not 3GS. It was the first generation iPhone. We built the ImageWiki before the App Store existed.</p>
<p>We knew that the App Store was coming out.  And we knew that the App Store was going to be the biggest thing in the whole world. I remember getting into multiple fights with friends about how revolutionary the iPhone and the App Store were going to be and people thinking I was totally crazy; people just thinking I was absolutely nuts for being so excited about it.</p>
<p>It sucks that it is a closed proprietary system, but the App Store has done something for software that nothing has ever done in the whole world. Software is candy now. It&#8217;s candy. It is like when you are waiting at the grocery store in the checkout line, stuck behind somebody, and you have got all these little tchotchkes, candy bars, magazines, nail clippers and things. That is the equivalent of software now. It&#8217;s become an impulse buy, which is amazing. Nobody would ever have thought&#8230; that is actually revolutionary. That&#8217;s huge.</p>
<p><strong>Tish Shute:</strong> <a href="http://www.cs.columbia.edu/~feiner/" target="_blank">Steven Feiner</a>, who is one of the founding fathers of augmented reality, said to me during a conversation at the ARNY meetup that one reason augmented reality, despite the hype, is manifesting very differently from how virtual reality burst onto the tech scene is that it is about affordable apps on affordable, readily available hardware.</p>
]]></content:encoded>
			<wfw:commentRss>http://www.ugotrade.com/2010/03/18/visual-search-augmented-reality-and-physical-hyperlinks-for-playfulness-not-just-purchases-talking-with-paige-saez-about-imagewiki/feed/</wfw:commentRss>
		<slash:comments>5</slash:comments>
		</item>
	</channel>
</rss>
