
The Physical World Becomes a Software Construct: Talking with Brady Forrest about Where 2.0, 2010

Wed, Feb 10, 2010


“The internet eats everything it touches,” write Brady Forrest and Nathan Torkington of O’Reilly Media in their must-read 2006 companion essay The State of Where 2.0 (PDF).  Now, in 2010, that statement is truer than ever.

Last week, I talked to Brady about what we can look forward to at Where 2.0, 2010, and what he thinks will be the “internet eating” trends emerging this year.  Brady is uniquely positioned to get a glimpse of things to come.  His job for O’Reilly Media is tracking changes in technology and organizing large-scale events, including Where 2.0, which he chairs, and Web 2.0 Expo in San Francisco and NYC, which he co-chairs.  Brady also runs Ignite, and previously worked at Microsoft on Live Search.  And, when not doing his day job, he participates in such Uber Geek activities as Steve the Robot H.E.AI.D – a Human Energized Artificial Intelligence Device…with lasers and generative sound (click on pic above or see video here).  Look out for Steve the Robot H.E.AI.D, and a presentation from Brady, at Augmented Reality Event, June 2nd and 3rd, Santa Clara, CA.

As Vernor Vinge pointed out in his intro to ISMAR 2009, the “possibilities are both scary and wondrous” as “the physical world becomes much more like a software construct.”  Brady Forrest has taken a lead role in shaping this transformation since 2004, when “‘local search’ was interesting but not yet real.”

Where 2.0, together with WhereCamp (this year at Google), constitutes WhereWeek – a crucible for emerging trends in web mapping platforms and location-based technologies.  This year augmented reality, proximity-based social networking, local search, and the rapidly maturing field of crisis management are in the mix, along with the huge and long-established GIS industry, which has moved rapidly into the Where 2.0 space.

But what business models will oxygenate the system is still a key question – one Brady discusses in the interview below.  Certainly, the usefulness of location-based analysis, mapping, new interfaces, and bringing this data to every application is clear.

Crisis management is center stage this year, with Jeffrey Johnson (Open Solutions Group), John Crowley (STAR-TIDES), and Schuyler Erle (Entropy Free LLC) presenting on Haiti: CrisisMapping the Earthquake.  And Chris Vein & Tim O’Reilly will “discuss how cities and application developers will benefit from open data and what these programs will look like in the future” in the plenary City Data.

Mobile social – proximity-based social networking, which may soon emerge as a challenger to web-based social networks – and augmented reality are the sexy rockstars of Where 2.0’s 2010 showcase of potentially disruptive technologies.  Augmented reality has had a breakthrough year, and this is reflected in its strong showing on the Where 2.0 schedule.  But, as Brady notes, AR awaits the killer app that will drive it to the next level.  Of course, we hope to unveil that at ARE2010!

At Where 2.0, I am presenting on the panel The Next Wave of AR: Exploring Social Augmented Experiences.  We will look at how social augmented experiences will be key to the next wave of mobile augmented reality.  Mike Liebhold, in a complementary presentation, looks at Truly Open AR. If you have been reading Ugotrade, you already know I am an advocate for an open, distributed, real-time communications framework for AR – see ARWave.  Wave Federation Protocol is an open, fast, compact, federated communications protocol that is a dream come true for AR.  And I would hazard a guess that in 2010, real-time communications plus location will become oxygen.

But also key to the next wave of AR, as I discussed with Anselm Hook in this post on Visual Search, Augmented Reality and a Social Commons for the Physical World Platform, will be a view constructed through complex “hybrid tracking and sensor fusion techniques” (Jarell Pair), cooperating cloud data services, powerful search and computer vision algorithms, and “apps that learn by context accumulation.”

And as Brady notes in the interview below,  a key step forward would be “to take advantage of your location, but it doesn’t need to have been mapped before.”

For some interesting news on the mapping front (and a discount code for Where 2.0 for Radar readers) see Brady’s post, Flickr Photos in Google Street View. These kinds of human-built maps have the potential to develop into “photo-based positioning systems” that could create new opportunities for augmented reality.  Brady asks:

“how often the Flickr photos get updated, where else these Flickr photos are going to show up in Google’s services (Google Goggles perhaps?) and will they show up in new search partner Bing? I am doubly curious if Facebook will ever let its photos be used in a similar way.”

Lior Ron of Google Goggles will be at Where 2.0 to tell us all about Looking into Google Goggles.  And if you want to learn more about how our view of the physical world will be “rooted in powerful computing, pervasive connectivity, and the cloud,” don’t miss this one.  I will be there.  And I very much hope there is a Q&A with this session.

During our conversation (see the full conversation below) Brady gave me his short list for breakthroughs that he sees as having big significance in 2010:

“Well, I think Google Goggles is one of the most exciting things to me.  Having access to a visual search…having someone actually release a visual search engine in that way, to consumers, I think is huge.  You know, you see stuff like that in the labs. But I don’t see it… it’s rare to see it out.

I think Android is huge.  And the way Google is pushing hardware to show off the platform; so the Nexus One being another example and the fact that it’s breaking free from the carriers.  Because I think when we get away from the carriers we are able to see more innovation, it’s what’s going to allow people or developers and companies to really innovate.

And I think Twitter adding geo-location to their APIs and buying MixerLabs is a huge move. I think Twitter may end up becoming the end-all be-all of location services. They are going to be updated constantly by people; they are going to have a really good grasp, real-time, of what is happening in any one place, at least based on the people.

And then with the addition of the MixerLabs data, they’re going to have more datasets at the ready, as well as any data that they start to collect from the clients themselves, like from TweetDeck.

So there are global clients that are updating Twitter.  I think those are some of the most exciting things.  And again, just to come back to Yelp, I think Yelp’s Monocle is also pretty significant, just because it’s an AR [augmented reality] app that’s being pushed into consumers’ hands.

And we’ll see how useful they find it.”


Talking With Brady Forrest


Pic above from WhereCamp 2009: Brady Forrest, facing camera, checks out Mark Powell’s Food Genome Project.  Check it out here – it just woke up!

Tish Shute: So last year, when you were interviewed by Michael Calore for WebMonkey before Where 2.0, you said, “Location is no longer a differentiator; it’s going to become oxygen.” And after attending Where Week 2009, I agreed with you and wrote about it here.  But in what ways did this prediction exceed expectations, and in what ways were you disappointed, now as we get close to Where 2.0, 2010?

Brady Forrest: Well, it exceeded expectations in that there are now five different mobile OS’s where you can load third-party applications that access users’ locations, which can then be shared out.

And so, what that makes possible is real-time, social, location-aware applications.  And this is something that hasn’t truly been possible in years past. Looking back three years ago, when the iPhone launched, it was the first major phone, especially in the US, to be location aware.  And a year later, the App Store launched, giving developers full access to location, which previously had been held onto very, very tightly by the carriers.

And now, a year and a half later, you have Android, you have Palm Pre, you have Blackberry working on their SDK to make it better – but it is still there.  You have Windows Mobile working on their SDK.  And, you know, who knows?  Maybe even BREW will get into the mix.

And AT&T is opening up their own interactive store.  And so, AT&T and Verizon and all their smart phones may now be looking at BREW.

Tish Shute: Right. It was very exciting last year at Where 2.0, when we had all these new toolsets announced and the iPhone being location aware. What were the best implementations of these new capabilities that became available in 2009, do you think?  What, in your view, was the most creative, surprising, and disruptive?

Brady Forrest: Well, I am a huge fan of Yelp Monocle. I think, you know, that is just a real-life example of using augmented reality.  You are on a street.  You have got a bunch of restaurants.  You have got a bunch of businesses.  And just to be able to swing through and look for people…I mean, and look for ratings and reviews.

They have just started to institute check in, so you will be able to know where your friends are and where your friends have gone.  And that type of real-time, incredibly useful data is what will make augmented reality a standard part of the landscape.

I think it is that type of data, more so than, say, reference data, that will make people want to have all the possible sensors.  So, what do you need for that?  You need a camera.  You need a compass for orientation.  You need a GPS or, at least, a decent location service.  And then you need a screen where you can actually see the data, and then you need an Internet connection.

So it is not like any phone can handle this.  And so, you are going to need those killer apps to actually drive people to the type of phones that can support this.  I don’t think AR is quite there yet.

Tish Shute: I agree, for true AR you need more than compass, camera, and GPS.  There are some missing pieces for the real-deal experience – and not just a pair of sexy AR specs.  As you mention, hybrid tracking and sensor fusion techniques that can combine computer vision technology with compass and GPS are vital.  We need the compass.  We need the GPS.  We definitely need the camera!  But we need this combined with computer vision techniques to get the tracking, mapping, and registration for true AR, or even to deliver a stable experience with the post-it/geonote AR that we see emerging with Layar, Wikitude, and others. At the moment we need to put together the tools for a true AR hyper-local experience.

And, of course, another aspect of this is the kind of physical hyper-links that applications like Google Goggles are building.

Do you have a speaker from Google Goggles at Where 2.0?  I would be absolutely fascinated to hear more about their road map.

Brady Forrest: I was loading Google Goggles onto the program yesterday.

Tish Shute: Oh, you did?  Oh, fantastic. And you have Lior Ron speaking!

Brady Forrest: It is actually possible it is not up on the website, but I talked to them and got them to agree to do a talk on it.

Tish Shute: I very much want to hear more about their road map.  Google Goggles is a very, very significant step towards the physical internet and the integration of computer vision with the sensor fusion techniques necessary for true AR.

Brady Forrest: I mean, that combination with computer vision is going to be incredibly valuable. And then the other issue you have there is: is it on the client, or is it on the server?  And right now, Google Goggles is definitely on the server, and that is not fast enough for real-time AR.  So that is more of a ten-blue-links I/O interface.

Tish Shute: And also, they haven’t got an open API, have they?

Brady Forrest: No, not yet.

Tish Shute: Maybe they will announce that.  Can you nudge them?  For true AR, we need to move forward in several areas – of course, there are the mediating device issues, like access to the video buffers in the iPhone, and the development of cool AR eyewear would be nirvana!

But my recent obsession has been working on a real-time communications infrastructure for AR, because that is quite doable now, yet we don’t really have that real-time infrastructure – i.e., a real-time mobile social utility that is really up to the real-time requirements of AR [see more about this here and on the ARWave wiki].

But we certainly don’t have the integration of computer vision and sensor techniques, or the access to the big image databases we need – let alone the clients to put it all together!

Brady Forrest: Google has done work to help out the community with their support of OpenCV.

It is based out of Willow Garage, but I believe that Google has done quite a bit of work on it.

Tish Shute: Could you talk a bit more about OpenCV?

Brady Forrest: O’Reilly has a 500-page book on it.  It came out of the DARPA contest where unmanned vehicles are raced.  And it has since become, at least in my mind, the primary computer vision library that people work with.

I actually used it…or, one of my teammates did, on a project we did this summer.  We implemented OpenCV pretty quickly to detect where people were, and then we would play music based on that.


Uber Geek Meeting, from ShellyShelly’s photostream

Tish Shute: Is that your Burning Man project? Do you have a link for that, and some pictures or video?

Brady Forrest: Yeah.  Heaid.com.  Human Enhanced Artificial Intelligence Dancing.

Tish Shute: Thank you! This year the augmented reality story has been fairly basic – relying on basic sensors: compass, GPS, accelerometers.  But it has also been an exciting year, because we hadn’t even had smartphones with the camera, GPS, and compass before this.

But now the big adventure is to hook all these sensor fusion techniques up with computer vision so that we can actually do reverse positioning – for example, from photos of what we are looking at, right?

Brady Forrest: Yeah, and start to use it in a more ad-hoc manner so that as you are traveling around, yes, it will take advantage of your location, but it doesn’t need to have been mapped before.
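To caricature that “reverse positioning” idea: match a query photo’s feature vector against a database of geotagged reference photos, and borrow the location of the closest match.  This is only a toy nearest-neighbor sketch of mine – the descriptors and coordinates are invented, and real systems use robust local features and 3D models rather than raw vectors:

```python
# Toy illustration of "photo-based positioning": find the geotagged
# reference photo whose feature vector is closest to the query photo's
# vector, and return that reference photo's location.

def nearest_location(query_vec, database):
    """database: list of (feature_vector, (lat, lon)) pairs."""
    def dist(a, b):
        return sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5
    best_loc, best_d = None, float("inf")
    for vec, loc in database:
        d = dist(query_vec, vec)
        if d < best_d:
            best_d, best_loc = d, loc
    return best_loc

db = [([0.1, 0.9, 0.3], (37.77, -122.41)),   # invented "San Francisco" photo
      ([0.8, 0.2, 0.5], (40.71, -74.00))]    # invented "New York" photo
print(nearest_location([0.75, 0.25, 0.5], db))   # -> (40.71, -74.0)
```

The hard part, of course, is getting feature vectors that survive changes in lighting, angle, and crowds – which is exactly where the computer vision techniques discussed below come in.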

Tish Shute: Right – moving from mapping to context awareness.  Could you give a quick explanation of what you did in your Burning Man project, and how that relates to this kind of ad-hoc, on-the-fly sensing – beginning to know what you are looking at without it having been mapped before?  That is fascinating.

Brady Forrest: Sure.  So we mounted a camera about 30 feet off the ground.  And as people would move underneath or dance, they would move from block to block.  We had created kind of a bitmap of the area underneath and set up different sound zones.  So as people moved from zone to zone, it would play different music.

And we used Max/MSP, with its OpenCV library, to handle the computer vision part and to determine which of the audio to fire off.  And then, also, we had a laser that would play at the same time.

And then we used Ableton Live, which is a very popular DJ software, to actually handle the music.  So as someone moved from, say, square A to square B, it would fire off various MIDI signals and Ableton would interpret that.  And each person who went in – well, theoretically up to 4–8 people, but because of how small the stage was and how the sounds are played, realistically more like 4–6 people.

Each person had their own set of sounds.
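To make that concrete, here is a rough sketch of the zone-to-MIDI mapping Brady describes.  This is purely my own illustrative reconstruction, not the actual H.E.AI.D code – the grid size, frame size, and note numbering are all invented:

```python
# Illustrative sketch: divide the overhead camera frame into a grid of
# sound zones ("bitmap of the area underneath"), and turn a dancer's
# position into a MIDI note that Ableton can map to a short clip.

GRID_COLS, GRID_ROWS = 4, 4      # invented zone grid
FRAME_W, FRAME_H = 640, 480      # invented camera frame size in pixels

def zone_for(x, y):
    """Map a dancer's centroid (in pixels) to a zone index 0..15."""
    col = min(int(x * GRID_COLS / FRAME_W), GRID_COLS - 1)
    row = min(int(y * GRID_ROWS / FRAME_H), GRID_ROWS - 1)
    return row * GRID_COLS + col

def midi_note_for(zone, dancer_bank=0):
    """Each zone fires a MIDI note; each dancer gets their own bank of
    sounds (bass, drums, vocals...), offset 16 notes apart."""
    return 36 + dancer_bank * 16 + zone   # start at C1, one bank per dancer

# A dancer near the top-left corner is in zone 0; bottom-right is zone 15.
print(zone_for(10, 10), zone_for(630, 470))   # -> 0 15
```

In the real installation the centroids would come from computer vision on the camera feed, and the notes would be sent to Ableton Live as MIDI rather than printed.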


Pic from extramatic‘s Flickr stream here

Tish Shute: Wow! Awesome.

Brady Forrest: We would be able to detect different people, assign them a sound, or a set of sounds, so, like bass, drums, vocals.  And then we would have clips that played well together that were 3-5 seconds in length.

Tish Shute: At what distance could you detect people?

Brady Forrest: We had a 22 foot area underneath the camera.  That was mostly based on what the lens could capture.
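And here is a toy sketch of the person-tracking side – matching each frame’s detected centroids to a persistent dancer ID, so the same person keeps the same set of sounds.  Again this is my own illustrative guess, not the actual code; the bank names and the distance threshold are invented:

```python
# Illustrative sketch: nearest-centroid tracking that keeps each detected
# dancer matched to a persistent ID across frames, so the same person
# keeps the same sound bank (bass, drums, vocals...).

BANKS = ["bass", "drums", "vocals", "synth"]   # invented bank names

class DancerTracker:
    def __init__(self, max_dist=80.0):
        self.max_dist = max_dist   # pixels; invented matching threshold
        self.dancers = {}          # dancer id -> last known (x, y)
        self.next_id = 0

    def update(self, centroids):
        """Match this frame's centroids to known dancers; unmatched
        centroids become new dancers. Returns {dancer_id: bank_name}."""
        assigned = {}
        for (x, y) in centroids:
            best_id, best_d = None, self.max_dist
            for did, (px, py) in self.dancers.items():
                d = ((x - px) ** 2 + (y - py) ** 2) ** 0.5
                if d < best_d:
                    best_id, best_d = did, d
            if best_id is None:               # nobody close: new dancer
                best_id = self.next_id
                self.next_id += 1
            self.dancers[best_id] = (x, y)
            assigned[best_id] = BANKS[best_id % len(BANKS)]
        return assigned
```

So `tracker.update([(100, 100), (300, 300)])` would give the first two dancers the “bass” and “drums” banks, and keep those assignments as the centroids drift from frame to frame.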

Tish Shute: OMG I love this!  This is really the next step for augmented realities – not just attaching reference data to the world, but exploring new shared “consensual realities” (see Anselm Hook’s interview, part 2, upcoming).

I am very interested in something you talk about a lot in your “State of Where 2.0″ essay: lifestyle coming first for a potentially disruptive technology, not commercial considerations.  I still have to post the second half of my interview with Anselm Hook, but Anselm has some brilliant ideas in this area.  He is working on a project called Angel, where part of the vision is for people to actually find what they need without explicitly having to ask for it.

And this brings me to something that is, to me, very noticeable about Where 2.0 this year, and very exciting: location-aware technology and crisis management have basically matured, haven’t they?  We are beginning to see really useful stuff in this area now.

What is different this year that has brought crisis management and location-aware technology together – a world in crisis?

Brady Forrest: Well, I think the primary thing that has brought all these technologies together is Haiti.  Without Haiti…A lot of times, future crises benefit from the current one, because people put in a lot of work.  And so, there is new infrastructure being laid with things such as Ushahidi, which is an open source platform originally built for tracking election violence in Kenya, but now being used to track people and their locations and food requests in Haiti.

Also, Haiti did not have solid, accessible, good maps at the time of the earthquake.  And there have been two volunteer projects that have sprung up to help with that: one headed by the OpenStreetMap Foundation and many volunteers, and the other, Google Map Maker. And in both cases the activity around Haiti on these programs went up exponentially…or, I don’t know about exponentially, but a lot.  In the case of Map Maker, it was up 100 times, and Haiti was the most worked-on country for that week, and one of the most downloaded.

Tish Shute: Yes the work being done in CrisisCamps around the country is very encouraging.

Brady Forrest: And then also, you know, not just Ushahidi or OpenStreetMap, but also the People Finder, which had an open API so that different organizations could share their data – a lesson learned from Katrina.  All these different pieces of technology will be used in the future and hopefully be able to save more lives.  I didn’t see…there are iPhone apps that were released.  But I’m not aware of any Android apps.  I’m not aware of any AR apps.

Tish Shute: We don’t have smartphones distributed widely enough for them to be appropriate, do we, in a lot of areas where crisis strikes.

Brady Forrest: Yeah, and there was criticism that they shouldn’t have been on iPhone – you know, that iPhones were a waste of time, because a lot of on-the-ground agencies aren’t going to have iPhones.  However, a lot of people who are going from the States will, and if the apps are there, then people will start to have them.

But relatively speaking, an iPhone is not that expensive.

Tish Shute: One thing I noticed – and I actually discussed this in the second half of the interview I did with Anselm, which I am getting ready to post – is that one aspect of the crisis filter was having people working as curators looking at messages coming out of Haiti.  And while integrating the streams that would be useful is still probably a challenge, many curators will be on iPhones because they are based in the US.

We need to work across all platforms probably.

Brady Forrest: Yes.  Patrick Meier of Ushahidi, who runs Crisis Mappers, ran a 24/7 emergency room out of the Fletcher School in Boston.

They had volunteers all over the States and Canada.  They had volunteers in Vancouver that were translating Creole messages in under ten minutes.

Tish Shute: Yes, and another point that is interesting in terms of the reconstruction and rebuilding of Haiti is the whole idea of leapfrogging.  There’s always, as we’ve seen in other parts of the world, an opportunity, when you miss pieces of basic infrastructure, to skip a whole stage and go on to the next one – like how virtual banking took off in Africa because of the absence of brick-and-mortar infrastructure.

Brady Forrest: To skip to a topic that has been in my head: I’m just so bummed that the iPad does not have a camera.

Tish Shute: “Bummed” is barely the word I would use.  Particularly as we had just been planning our ground-breaking AR/next-generation ebook in the days leading up to the announcement!

I suppose there is the hope they’re going to put it in the next one.  But I suppose the play for conventional content delivery is so big that everything else is trivial in comparison – especially, it seems, jump-starting the emerging augmented reality industry!

So we might get thrown a camera and compass in the next round, but will we get access to the video buffers?  AR enthusiasts may have to live on table scraps from Apple a bit longer, it seems.

But what blows my mind is: why hasn’t the iPod Touch got a camera and been AR-enabled?  AR gaming would get an enormous boost from that alone. My son loves even the simple-minded AR games available now on the iPhone, and he loves iPhone games – he has 110 games downloaded!

Brady Forrest: Ridiculous.  Yeah.  I don’t know what they don’t like about cameras.  And I plan on getting an iPad, but because of the limitations I plan on using it for basic content and will probably get the bottom-line model. I can’t imagine…I don’t know.

Tish Shute: It is very interesting, who actually puts together the big enabling mediating device for AR is still an open question, isn’t it?  I mean, that’s the truth; we have sort of mediating devices but we don’t have the magic brew yet do we?

Brady Forrest: No. Not yet.

Tish Shute: Good enough in some ways, and certainly a start, but not quite the real deal.  For me, Where 2.0 this year covers the groundwork for true AR: mobile social, proximity-based social networking, visual search, computer vision, and sensor fusion techniques….  And because all these things have a chicken-and-egg relationship, laying the groundwork is basically as important as having the mediating device – otherwise you can’t do interesting things when we get the mediating device, right?

Is this the year we get the magic brew for AR, i.e., the business model, the killer app, and the mediating device?

Brady Forrest: This is not the year.

Tish Shute: Then I should ask you: are you in the Goggles camp? That is, do you think AR needs eyewear to go mainstream?

Brady Forrest: I think this may be where we get…we start to see what is going to be the killer app that gets people to buy the hardware that will support AR.  You see what I mean?  And then from there the apps will come out and the hardware will advance in that direction.

I don’t think AR has made that leap yet.  It hasn’t, to use almost a cliché, crossed the chasm yet, and it hasn’t proven that it will.  Because I don’t know if…I think it’s difficult to tell right now.  Is it going to be games?  Is it going to be data layers? What is going to drive people to an AR device, especially one fully dedicated to it?

Tish Shute: I think in terms of AR games taking off, a bit of help from the mediating device – e.g., access to the iPhone video buffers – would probably be enough to stoke up AR games into being a hot commodity.  But in terms of AR data layers going mainstream, we need some of the other players in the location space to put together the magic brew on the business model, don’t we?

Brady Forrest: That’s why I’m so curious, though…that’s why I gave Yelp their own talk.  Those guys are gangbusters – they’re a consumer company, a very consumer-facing website.  They’ve got amazing data stores.  They do a lot of interesting stuff with their data.  And I don’t think people always give them the geek credit they deserve.

Tish Shute: You began Where 2.0 back in 2004, when, as you point out, “‘local search’ was interesting but not yet real,” and you have always stressed something that’s proven to be absolutely true, which is lifestyle before commerce, right?  And that if location-based services were going to be big, it was because they meant something in terms of our lifestyle, not just because they told us where to get another good burger.  Right?

I think there’s been a lot of breakthrough in that area this year in terms of what location-based services and proximity-based social networks are to us now, and how they’re changing our lifestyle.  What do you see as the breakthroughs of 2009, and what are you hoping for in 2010?

Brady Forrest: Well, I think Google Goggles is one of the most exciting things to me.  Having access to a visual search…having someone actually release a visual search engine in that way, to consumers, I think is huge.  You know, you see stuff like that in the labs. But I don’t see it… it’s rare to see it out.

I think Android is huge.  And the way Google is pushing hardware to show off the platform; so the Nexus One being another example and the fact that it’s breaking free from the carriers. Because I think when we get away from the carriers we are able to see more innovation, it’s what’s going to allow people or developers and companies to really innovate.

And I think Twitter adding geo-location to their APIs and buying MixerLabs is a huge move. I think Twitter may end up becoming the end-all be-all of location services. They are going to be updated constantly by people; they are going to have a really good grasp, real-time, of what is happening in any one place, at least based on the people.

And then with the addition of the MixerLabs data, they’re going to have more datasets at the ready, as well as any data that they start to collect from the clients themselves, like from TweetDeck.

So there are global clients that are updating Twitter. I think those are some of the most exciting things. And again, just to come back to Yelp, I think Yelp’s Monocle is also pretty significant, just because it’s an AR app that’s being pushed into consumers’ hands.

And we’ll see how useful they find it.

Tish Shute: Gary Gale, Yahoo! Inc., is going to talk on overcoming the business, social, and technological hurdles so we can reach the long-promised [laughs] Hyperlocal Nirvana. I think you’ve outlined some of these obstacles in relation to AR – where the obstacles are in terms of the mediating device, and bringing all the pieces together, including computer vision techniques, in order to have an AR view. That’s the AR side of it. But the layer below that – the layer where actual location-based apps are beginning to go mainstream now – is the question of whether these present successful business models for location-based services.

So in short, in your view, what are the big hurdles to Hyperlocal Nirvana before we get to AR, even just for these location-based services?

Brady Forrest: Well, how do you make money?

Tish Shute: Yeah, to put it bluntly. I like John Battelle’s way of putting it [laughs] how do we oxygenate the system!

Brady Forrest: So are location-based services something that you can make money in long-term? Nokia bought NavTeq for $8 billion. And then two years later, they’re giving it away free as part of Ovi Maps.

Tish Shute: Right.

Brady Forrest: I’m assuming that that’s actually part of the plan.  And although their hand may have been forced by Google with their release of Turn-By-Turn, it’s still got to be a hard pill to swallow that this huge investment in location ends up becoming a loss leader to sell more phones.

So, can you make money through subscriptions, through selling apps? And I think that is still being proven. The other one is: can you use advertising? And it’s kind of scary to see that Apple is restricting advertisers’ use of location.

It came out yesterday or two days ago that app developers cannot use location for ads. They can only use location to show something interesting or useful to their customers.

And there’s a lot of speculation that it’s because Apple wants to control the location-based ads that go on the iPhone.

Tish Shute: Yes. I heard a strange rumor.  Actually it’s an un-strange rumor – a likely rumor, in fact – that Apple and MS are getting together to replace some of the Google aspects of the iPhone, like search and maps?

Brady Forrest: Yes, …. Microsoft employees get 10% off at the Apple store. There’s a longstanding relationship between those two companies.

And Android is definitely more of a competitive threat than Windows Mobile is.  And it’s well known what the relationship between PCs and Macs is. So I don’t think…I don’t find that to be that surprising of a rumor.  I do wonder if it would hurt the iPhone, but it doesn’t surprise me that they would consider it.

Tish Shute: I do know, certainly from the AR point of view, that Microsoft has recently hired some of the key researchers, including Georg Klein. And they are looking for more people in the image recognition area, so it seems MS is currently going to be making a bigger push, not just with Photosynth but with image ID.

So it could be a pretty powerful combo between the iPhone, and Microsoft – they have some of the key computer vision research that would be needed for full AR.

Brady Forrest: Oh, yeah. Microsoft has amazing research depth. They’ve got an amazing team.

Tish Shute: But it is a bit of a mystery to me why Microsoft hasn’t done more with Photosynth.  As I noted in my previous post, Nokia’s ImageSpace is beginning to do what many thought Microsoft would do with Photosynth two years ago.  And “photo-based positioning systems” – 3D models of the environment covering every possible angle, plus software that can work out in reverse, based on a picture, precisely where you are and where you’re facing – could be hugely important to AR.  But that brings me to another mystery: why haven’t we seen more from Nokia in this space yet?  The N900 doesn’t even have a compass.

Brady Forrest: Yeah, I don’t know why Nokia hasn’t made more of a space for themselves in these things. They did a lot of early work in these areas. My guess is that they’re trying to restructure themselves. They made some pretty big changes on the web side – Ovi became its own division. And they’ve been doing a lot of location-based acquisitions: Places and Gate 5 several years ago, Gossler just in the past six months. And so I think that’s really been their focus…and the research team.

And as a large company, since they haven’t found a business model – which is what we’ve been discussing here – they are hesitant to launch it. They don’t really know if this is a business that they need to launch, or if this is an app that they should have out there for fun.

Tish Shute: Yeah. And that’s back to the oxygenation of the system and location.  We really still have some work to do with the business models.

Final question!  At the core of many of today’s business models is the idea of hoarding data – that’s an underpinning.

But ultimately, for open AR, we want a situation where we can really share data, so that we don’t have the data all locked inside one particular browser or app. The current crop of AR browsers aren’t really browsers in the sense that we understand a browser on the web today, because the data’s locked inside each service – Wikitude, Layar, Acrossair, etc.

I have become very interested in federation as a model for solving this, so that we can begin to build consensual relationships around data - sometimes sharing, sometimes not. Federation is my big dream at the moment.  And now we even have something to work with in the Wave Federation Protocol. But how do we get from here to there, to a truly federated world of data for AR and location-based services? Or do you think people need to solve the question of business models first?
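(To make the federation idea concrete: each AR service exposes points of interest in its own schema, and a federation layer normalizes and deduplicates them so no single browser "owns" the data. The service schemas, field names, and POIs below are purely illustrative, not the real Wikitude or Layar APIs.)

```python
# Hypothetical sketch of federating POI data across AR services.
# Schemas and data are invented for illustration.

def normalize_wikitude(p):
    """Map a (hypothetical) Wikitude-style record to a common schema."""
    return {"name": p["title"], "lat": p["latitude"], "lon": p["longitude"]}

def normalize_layar(p):
    """Map a (hypothetical) Layar-style record to the same schema."""
    return {"name": p["label"], "lat": p["geo"][0], "lon": p["geo"][1]}

def federate(sources):
    """Merge POIs from all sources, deduplicating by rounded coordinates."""
    seen, merged = set(), []
    for normalize, pois in sources:
        for raw in pois:
            poi = normalize(raw)
            key = (round(poi["lat"], 4), round(poi["lon"], 4))
            if key not in seen:          # same place listed twice -> keep one
                seen.add(key)
                merged.append(poi)
    return merged

wikitude = [{"title": "Ferry Building", "latitude": 37.7955, "longitude": -122.3937}]
layar = [{"label": "Ferry Bldg", "geo": (37.7955, -122.3937)},
         {"label": "Coit Tower", "geo": (37.8024, -122.4058)}]

pois = federate([(normalize_wikitude, wikitude), (normalize_layar, layar)])
print([p["name"] for p in pois])  # ['Ferry Building', 'Coit Tower']
```

The point is that the merge logic lives outside any one service, which is what a federation protocol would standardize: each provider keeps its data, but any client can assemble the combined view.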

Brady Forrest: I think one potential is ads - serving up content. And by ads I also mean coupons, meals, what it looks like Foursquare is going to do, and featured content, which is Layar's approach.

So we need to see: is that the way we're going to sell these? The other is to have the best viewer and sell that, but that's potentially a race to the bottom, price-wise.

Tish Shute: Right. Do you think the Google Wave Federation Protocol has a chance of taking off and changing the game for real-time communications and federation?

Brady Forrest: Quite possibly with the real-time. I think they need to work on the UI.

Tish Shute: Oh dear, we can't discuss the Wave UI right at the end of the interview - of course I believe it would do better in an AR view!  I know you have to go now, but I have to say that Google Wave not standardizing the client/server interface - meaning we could see some new UIs for Wave [we are working with PygoWave for ARWave because of this] - and the iPad's lack of a camera were two huge disappointments in recent months.

Brady Forrest: Yeah. It [the Wave client] is very difficult to use.

Tish Shute: But the Wave Federation Protocol is an open, fast, compact protocol that is a dream come true for AR.  Open, distributed, real-time communication is a very big enabler for AR.  I would hazard a guess that in 2010, real-time communications plus location becomes oxygen.

categories: Ambient Devices, Android, Artificial General Intelligence, Artificial Intelligence, Artificial Life, Augmented Reality, culture of participation, digital public space, Instrumenting the World, internet of things, iPhone, Mixed Reality, mobile augmented reality, mobile meets social, Mobile Phones in Africa, Mobile Reality, Mobile Technology, online privacy, Participatory Culture, sustainable mobility, ubiquitous computing
