The image above is from Total Immersion’s augmented reality experience developed for the “Networked City” exhibition in South Korea – “a fun scenario created for a u-City’s infrastructure and city management service”
“To the naked eye, the exhibit looks like a bare bones model of a city. But when visitors put on the special AR goggles a whole new world unfolds – as graphics overlaid on the city model.” (Games Alfresco)
“The Networked City” is a large scale augmented virtuality of a scenario for a networked city. But my guess, reading the Korea IT Times, is that the plan is to move from an augmented virtuality to an augmented reality as the Incheon Free Economic Zone (IFEZ) realizes its vision to become a leading u-City – where reality is turned “inside out” (see Inside Out: Interaction Design for Augmented Reality). If you are not familiar with South Korea’s u-Cities, check out this post for a short primer (and note that a Google Trends search on Augmented Reality shows South Korea leaving everyone else in the dust).
Ubiquitous computing and augmented reality are like adenine and thymine – a DNA base pair.
Korea IT Times writes about the u-city concept:
“Korea began using the term u-City after accepting the concept of ubiquitous computing, a post-desktop model of human-computer interaction created by Mark Weiser, the chief technologist of the Xerox Palo Alto Research Center in California, in 1998. There have been a lot of research in this field since 2002. As a result, many local governments in Korea have applied this concept to various development projects since 2005 based on a practical approach to it.”
The back story to many of my recent posts, including this one, is an understanding of a relationship between ubiquitous computing and augmented reality that emerged, for me, in a February conversation with Adam Greenfield, Towards a Newer Urbanism: Talking Cities, Networks, and Publics with Adam Greenfield. In case you missed it, here is the link again, because I think it holds up very well considering the rapid developments of recent months. Also, importantly for this post, it includes a discussion of moving on from Weiserian visions.
Adam Greenfield’s Speedbird is one of my key sources for understanding “networked urbanism,” and the list he makes of the elements of networked urbanism here (also see the comments) – is my mantra for thinking about the DNA base pair relationship of augmented reality and ubiquitous computing.
Adam Greenfield’s, “summary of what those of us who are thinking, writing and speaking about networked urbanism seem to be seeing” is:
1. From latent to explicit; 2. From browse to search; 3. From held to shared; 4. From expiring to persistent; 5. From deferred to real-time; 6. From passive to interactive; 7. From component to resource; 8. From constant to variable; 9. From wayfinding to wayshowing; 10. From object to service; 11. From vehicle to mobility; 12. From community to social network; 13. From ownership to use; 14. From consumer to constituent.
Augmented Reality – Making Visible the Invisible
The screenshot above is one of the coolest “making visible the invisible” AR applications. It was developed at Columbia University Graphics and User Interface Lab where Steven Feiner is Director (see the deep list of projects from the lab here). This app “shows carbon monoxide levels projected over New York City. The height of each ball reflects concentrations of the pollutant.” Credit: Sean White and Steven Feiner (via Technology Review).
The recent emergence of “magic lens” augmented reality apps for our smart phones – Wikitude, Layar, Acrossair, Sekai Camera, and many others now – has given us a new window into our cities. But we have yet to realize the full potential of the AR/ubicomp base pair that can “make visible the invisible” and give us new opportunities to relate to the invisible data ecosystems of our cities – not merely as a spectator experience, but as an interactive, in-context, real-time opportunity to reimagine social relations.
“In place of natural weather systems, however, today we find the dataclouds of 21st century urban space increasingly shaping our experience of this city and the choices we make there.”
Augmented Reality, as Joe Lamantia points out, is becoming the great “ambassador of ubiquitous computing.” AR is “…mak[ing] it possible to experience the new world of ubiquitous computing by reifying the digital layer that permeates our inside-out world,” and we are only just glimpsing the razor thin end of the wedge in this regard.
I am still working on my Gov 2.0 Summit write up and, amongst other things, I will talk about how an emerging new social contract around open data, here in the US, will put augmented reality apps center stage – “doing stuff that matters.” At Gov 2.0 Expo Showcase Tim O’Reilly tweeted:
Also see Tim O’Reilly and Jennifer Pahlka on Forbes.com discussing the “Web Squared” Era – “the Web Squared era is an era of augmented reality arriving (like the sensor revolution) stealthily, in more pedestrian clothes than we expected… our world will have ‘information shadows.’ Augmented reality amounts to information shadows made visible.”
Again there is back story to how I came to think about Information Shadows in relation to augmented reality. So in case you missed it the first time, here is the link to a conversation that began in a hallway meeting between Tim O’Reilly, Mike Kuniavsky, ThingM, Usman Haque, Pachube, and Gavin Starks, AMEE, at ETech earlier this year, “Dematerializing the World, Shadows, Subscriptions and Things as Services: Talking With Mike Kuniavsky at ETech 2009.”
So What’s Next for Mobile Augmented Reality?
These videos from Daniel Wagner’s team from Graz University of Technology showing Realtime Panorama Mapping and Tracking on Mobile Phones and Creating an Indoor Panorama in Realtime, as Rouli from Games Alfresco points out, indicate that there is a lot in store for us at ISMAR09.
We may not be so impressed by directory style/”post it” AR anymore, as these applications have become commonplace so quickly! But while these early mobile AR apps may be disappointing in relation to some futurist visions of AR – merely AR/ubicomp appetizers – there are still good implementations of this model coming out (see newcomers to the app store Bionic Eye and RobotVision). And Layar, always on the ball, has upped the ante for the new cohort of AR Browsers with Layar 3D.
But as Bruce Sterling notes here:
*In AR, everybody wants to be the platform and the browser, and nobody wants to be the boring old geolocative database. Look how Tim [creator of RobotVision] here, who is like one guy working on his weekends, can boldly fold-in the multi-billion dollar, multi-million user empires of Apple iPhone, Microsoft Bing, Flickr, and Twitter, all under his right thumb
(watch video here)
But if you are looking for something more from AR, you probably won’t have to wait too long. The two pioneering companies in AR – Total Immersion, founded in 1999, and Metaio, founded in 2003 – are both coming out with “mobile augmented reality platforms” in a matter of weeks (see press releases here and here). And both companies, it seems, will deploy much more sophisticated AR rendering and tracking than we have seen to date.
I approached Bruno Uzzan, founder and CEO of Total Immersion, for an interview as part of my look at the new industry of augmented reality through the eyes of the founding members of the AR Consortium. These consortium members are some of the first commercial augmented reality companies.
The significance of this announcement is that Total Immersion is now positioned to take the augmented reality experiences they have developed for a number of top brands onto multiple mobile platforms with, “Int13’s very clever embedded solution that allows our [Total Immersion’s] solutions to work across many [mobile] platforms,” while Int13 gets to extend their reach.
Total Immersion has a 50 person R&D team, and their two main focuses have been, firstly, getting:
“Augmented Reality to work with as many platforms as possible – PC, Mac, Mobile, Game Consoles, all those are the platforms that we are targeting. We are currently doing a lot of work in the R&D team on cross platform compatibility….”
And secondly:
“Our R&D guys are working on the real world interacting more with the virtual world. And I have started seeing some results which are pretty much crazy and this will be ready for next year.”
Pandora’s Box – Shared Augmented Realities
There are many weaknesses to the mobile smart phone AR experiences we have now, and the lack of near field object recognition (to date) and difficulties with accurate positioning aren’t the only ones. On solving positioning problems in mobile AR: we have yet to see AR leverage public libraries for analyzing scenes, like Flickr’s geotagged photos – see Aaron Straup Cope’s work on “The Shape of Alpha,” and for more on this, my post here.
But, as Joe Lamantia points out:
“One of the weakest aspects of the existing interaction patterns for augmented reality is their reliance on single-person, socially disconnected user experiences.”
In my view, The Pandora’s Box of Augmented Realities is an open, distributed, multiuser augmented reality framework, fully integrated with the internet and world wide web.
As Yochai Benkler has pointed out many times, and argues again in, Capital, Power, and the Next Step in Decentralization, it is “open, collaborative, distributed practices that have been at the core of what made the Internet.” We have to try to make sure that open, collaborative, distributed practices are at the core of mobile augmented reality.
Can Google Wave be the basis for an Open, Distributed, Multiuser Augmented Reality Framework?
I have been exploring the idea of using the Google Wave protocol as the basis for a distributed, multiuser, open augmented reality framework with a small group of AR enthusiasts and developers. And I am happy to say the proposal is beginning to get fleshed out a little. New collaborators are welcome, both for “gear heady” input and use case suggestions (but re the latter, you can’t just say everything you see in Dennou Coil..!).
This effort started with Thomas Wrobel’s proposal for an Open AR Framework prototyped on IRC – see here, and click to enlarge the image above of, “Sky Writer: Basic Concept for an Open Multi-source AR Framework.”
But recently we began looking at the Wave Federation Protocol. And, if you check out this post, and this post, you may get a glimpse of why Google Wave protocol might be a good basis for an open, distributed, AR Framework. You will notice, if you study what Google Wave has done with the XMPP protocol, that many of the elements of networked urbanism that Adam Greenfield describes resonate strongly with what is being attempted in Wave.
But enough said for now! Regardless of the details of implementation – Google Wave or an AR protocol built from scratch (phew! the latter does seem like a lot of work) – an open, distributed, multiuser AR framework integrated with the internet and web would explode the potential of AR, creating new possibilities for data flows, mashups, and shared augmented realities.
And we are excited by Google Wave because, as Thomas puts it:
“The really great thing wave does ….(aside from being an open standard backed by a major player…hopefully leading to thousands of worldwide servers )….is that it allows anyone to create any number of waves, set precisely who can view or edit them, and for them to be able to be updated quickly and continuously (and even simultaneously!) Better yet, changes will (if necessary) propagate to all the other servers sharing that wave. It does all this right now. From my eyes this does a lot of the work of an AR infrastructure already.
I cant see any other protocol actually doing anything like this at the moment, although correct me if I’m wrong, as alternatives are always welcome ”
Also, Thomas notes, “even the playback system (that is, the ability to playback the changes made to a wave since its creation) …this could give us automatically some of the ideas Jeremy Hight has mentioned in his visionary work here, and here on “the geo spatial web, interlinked locations and data, immersive augmentation and open source geo augmentation.”
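To make Thomas’s points concrete, here is a minimal sketch of an AR object modeled as a wave-style, append-only operation log with a participant list and playback. The class and field names here are my own invention for illustration – nothing below is taken from the actual Wave protocol:

```python
import time
from dataclasses import dataclass, field

@dataclass
class Op:
    """One change to a shared AR object (analogous to a wavelet operation)."""
    t: float       # timestamp of the change
    key: str       # which property changed, e.g. "position"
    value: object  # the new value

@dataclass
class ARWave:
    """A shared, editable AR object modeled as an append-only operation log."""
    wave_id: str
    participants: set = field(default_factory=set)
    ops: list = field(default_factory=list)

    def edit(self, user, key, value, t=None):
        """Only listed participants may edit -- Wave's per-wave access control."""
        if user not in self.participants:
            raise PermissionError(f"{user} may not edit {self.wave_id}")
        self.ops.append(Op(time.time() if t is None else t, key, value))

    def state_at(self, t):
        """Replay the log up to time t -- the 'playback' feature Thomas mentions."""
        state = {}
        for op in sorted(self.ops, key=lambda o: o.t):
            if op.t <= t:
                state[op.key] = op.value
        return state

    def current_state(self):
        return self.state_at(float("inf"))
```

A real federated implementation would also ship these operations between servers over XMPP and reconcile simultaneous edits with operational transformation – both of which Wave already provides, which is exactly Thomas’s argument.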
One of the many reasons why an open, distributed AR Framework would be so cool is that it would open up all kinds of possibilities for GeoAR, by providing the over-arching standard protocol for communicating the updates needed by the substandards that will facilitate GeoAR.
Also important to note is the Wave Federation Protocol allows anyone:
“to run wave servers and become wave providers, for themselves, or as services for their users, and to ‘federate’ waves, that is, to share waves with each other and with Google Wave.” The protocol specifies “the federation gateway and a federation proxy and is based on open extension to XMPP core [RFC3920] protocol to allow near real-time communication between two wave servers.” See Reuven Cohen’s blog for more here and here, “HTTP is Dead, Long Live the Real Time Cloud.”
Still some people have expressed concern that an AR Framework using Google Wave protocol would give Google disproportionate influence. Will Google-specific functionality be an issue? How much stuff is Google specific just because no one else is using it (yet)? And how much is Google specific because it holds no value to anyone else but Google? These are some of the questions that have come up.
You are going to see a variety of suggestions for standards and specs for open AR coming out in the next few months, which, as Robert Rice of the AR Consortium points out, is “a good thing, we need that competition early on to settle down on best case.” Recently, Mobilizy have offered up ARML (“an augmented reality mark-up language specification based on the OpenGIS® KML Encoding Standard (OGC KML) with extensions”) for consideration – see here.
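For a flavor of the KML layer such a spec builds on, here is a sketch that serializes a point of interest as a plain OGC KML Placemark. The element names below are standard KML; I have not reproduced Mobilizy’s actual ARML extensions, so treat this as illustrative of the base encoding only:

```python
import xml.etree.ElementTree as ET

KML_NS = "http://www.opengis.net/kml/2.2"

def poi_to_kml(name, description, lon, lat, alt=0.0):
    """Serialize a point of interest as a standard OGC KML Placemark.
    An ARML-style spec would layer AR-specific extensions (provider info,
    icons, tracking hints) on top of this in a separate XML namespace."""
    ET.register_namespace("", KML_NS)
    kml = ET.Element(f"{{{KML_NS}}}kml")
    doc = ET.SubElement(kml, f"{{{KML_NS}}}Document")
    pm = ET.SubElement(doc, f"{{{KML_NS}}}Placemark")
    ET.SubElement(pm, f"{{{KML_NS}}}name").text = name
    ET.SubElement(pm, f"{{{KML_NS}}}description").text = description
    point = ET.SubElement(pm, f"{{{KML_NS}}}Point")
    # KML coordinate order is lon,lat,alt
    ET.SubElement(point, f"{{{KML_NS}}}coordinates").text = f"{lon},{lat},{alt}"
    return ET.tostring(kml, encoding="unicode")
```

The appeal of grounding AR content in KML is that every existing geo toolchain (Google Earth, GIS software) can already read the base document, and AR browsers only need to understand the extensions.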
So it is, perhaps, also important to note that an Open AR Framework should be neutral/transparent to techniques of “reality recognition” and methodologies of registration/tracking, allowing various ones to work on the system as new techniques evolve, and supporting as many evolving standards as possible.
Augmented Reality developers, like Total Immersion and others with powerful rendering/tracking AR software, should be able to use an Open AR Framework to exchange the data their tracking will use. And the tracking/rendering problems they and other researchers have solved are much harder than figuring out data exchange on a standard infrastructure or protocol!
So I pricked up my ears when I heard Bruno Uzzan, CEO of Total Immersion - the first and currently the largest augmented reality company, with a 50 person R&D team in France and offices in LA, where Bruno himself is now based, say: “Total Immersion is only months away from launching shared mobile augmented reality experiences using near field object recognition/tracking across multiple platforms” (for more details read my conversation with Bruno Uzzan below).
I was happy when I asked Bruno about the possibilities for developing an open, distributed, multiuser augmented reality framework fully integrated with the internet and world wide web (possibly using Google Wave protocols), and he replied:
“I think this is feasible. I think that’s doable, that’s just in my opinion. I mean some people might have another kind of opinion but I think that that’s definitely doable.”
Total Immersion – working with the “symbiosis between augmented reality and brands”
Total Immersion has created many of the best known and most ambitious augmented reality experiences for major brands to date, including Mattel’s new AR toys to be released in conjunction with the James Cameron film Avatar, and AR baseball cards for Topps, video here (or click screenshot above), and the UK’s first augmented reality books.
Bruno founded Total Immersion 10 years ago when he was just 27. And the kind of conviction it took to survive as an augmented reality business in the decade before augmented reality captured the world’s attention is remarkable.
AR’s first steps out into the world, after 17 years as predominantly a lab science, may be “wobbly” (what new technology isn’t?), and sometimes gloriously kitsch – check out this riotous video of the 3D Interactive Live Show Total Immersion produced in Korea (also see the Total Immersion Augmented Reality Blog for more on TI’s turn key Interactive 3D Live Show Solution).
As Lamantia points out here, ” projecting mixed realities into public, common, or social spaces makes them social by default.”
However, the potential for shared location based augmented reality experiences is as yet untapped. So I see the entry of the most experienced commercial augmented reality company into mobile as pretty interesting. While smart phone AR still has significant limitations, and it certainly does differ from some of the futurist dreams of AR (see Mok Oh’s post here on his disappointment in this regard), it is significant that Total Immersion is committing to becoming a leader in mobile AR.
Our smart phones – the powerful networked sensor devices that so many people carry in their pockets – have proved themselves a “good enough for now” mediating device for early manifestations of the ubiquitous computing and augmented reality base pair. And now that AR and ubicomp are mixed into the rich, messy soup of everyday life, commerce, business, marketing, art, entertainment, and government, we should get ready to see these technologies grow up fast, and unfold in some surprising ways that lab science didn’t necessarily predict.
And, perhaps, the new dialogue between scientists and entrepreneurs may spur both communities to outdo themselves.
Particularly, as Joe Ludwig notes: “It seems to me that the biggest disconnect between the academics and the entrepreneurs is that they disagree on how far we are from the finish line.”
See the comments on Ori Inbar’s post, Augmented Reality Entrepreneurship: Natural Evolution or Intelligent Design?, for a courteous but spirited discussion on the potential benefits and frictions of the newly expanded AR community of researchers and entrepreneurs.
“not all academics and researchers are only interested in the traditional models of impact. Case in point: I wouldn’t be building unpublishable games, nor investing so much time talking to the press, entrepreneurs and VCs if I did not believe strongly in the value of the impact I am having by doing that — and I know others with the same attitude.”
In this vein, check out the Marble Game (video here) developed by Steve Feiner and his team at Columbia U. It’s enabled by Goblin XNA, an open source AR framework built on top of Microsoft’s XNA, which powers XBox live games, Zune games, and some Windows games. For more about Goblin XNA and AR from Columbia U see here. (Hat tip to Brian Jepson for this link)
We are still waiting for the kind of sexy AR specs that might get wide adoption – nothing totally game changing in Gigantico’s AR eyewear roundup (though maybe note this Apple patent). But at least researchers are not afraid to explore the possibilities of AR Goggles.
But how far are we now, with or without sexy goggles, from a fuller expression of the base pair DNA of ubiquitous computing and augmented reality?
We may have a LAN of things before we have an Internet of Things
The picture above is from a workshop I attended at Conflux last weekend – Fish ‘n microChips, with Natalie Jeremijenko. We are at the site of the Amphibious Architecture project (a commissioned work for Toward the Sentient City) and “a collaborative project with xClinic, The Living and other intelligent creatures.”
We are probably as far off from some grand futurist visions of ubiquitous computing as we are from some of the futurist visions of augmented reality. But as it turns out, that may not be a bad thing! Recently, @mikekuniavsky noted in a tweet:
“Another argument for the LAN of Things before the Internet of Things: http://tinyurl.com/lgp9uq”
Bert Moore, in the article Mike linked to, points out that the grand vision of an “internet of things,” with everything connected to everything, can “distract people from thinking about the benefits of RFID in smaller, more easily implemented and cost-justified applications.” The same argument, I think, applies to sensor networks and augmented reality.
In New York City, a series of commissioned works for the Architectural League of New York’s exhibit, “Toward the Sentient City,” is giving us the opportunity to dip our toes into the ocean of a “networked urbanism.” On only a small budget, two of the five commissioned works, Amphibious Architecture and Natural Fuse, demonstrate how sensor networks can allow us to explore new kinds of communities – connecting people to environments in interesting ways to create new forms of social agency.
“Amphibious Architecture” – from The Living Architecture Lab at Columbia University Graduate School of Architecture, Planning and Preservation (Directors David Benjamin and Soo-in Yang) and Natalie Jeremijenko, Environmental Health Clinic at New York University – uses a skillfully built (electronics and water are notoriously hard to mix) array of partially submerged sensors to pierce the blinding, reflective surfaces of the rivers surrounding Manhattan and to create a new two-way relationship with the ecosystem below: the water, our neighbors the fish, and even a beaver that lives in the water surrounding Manhattan.
Image from Toward the Sentient City
In a similar spirit, “Natural Fuse” – Usman Haque, creative director, Nitipak ‘Dot’ Samsen, designer, Ai Hasegawa, designer, Cesar Harada, designer, Barbara Jasinowicz, producer, creates a network of people and electronically assisted plants to explore what it takes to work together on energy consumption and to experience the consequences of “selfish” and “unselfish” behavior interactively before it is too late to modify our actions.
The “Greedy Switch” from Natural Fuse on the left. On the right, “The System” – click to enlarge.
Much more to come in another post on these works, and “Toward the Sentient City.” Also an update on how Pachube – an important part of both these projects and a very important contribution to ubiquitous computing because it creates the opportunity to connect environments and create mashups from diverse sensor data feeds – has matured since my interview with Pachube founder, Usman Haque, “Pachube, Patching the Planet,” in January this year.
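The mashup idea Pachube enables – combining datapoints from independent sensor feeds – can be sketched without the network layer. The feed structure below is a simplification of the JSON Pachube returns for a feed, and the datastream IDs and thresholds are invented for illustration:

```python
def latest_value(feed, stream_id):
    """Pull the most recent datapoint from one datastream of a feed.
    `feed` mimics, in simplified form, a Pachube feed: a list of
    datastreams, each holding timestamped datapoints."""
    stream = next(s for s in feed["datastreams"] if s["id"] == stream_id)
    # ISO 8601 timestamps sort correctly as strings
    return max(stream["datapoints"], key=lambda p: p["at"])["value"]

def mashup(water_feed, energy_feed):
    """Combine two independent environment feeds into one reading --
    the kind of cross-feed mashup a data brokerage like Pachube makes
    easy. The thresholds here are invented, not real project values."""
    oxygen = latest_value(water_feed, "dissolved_oxygen")  # mg/L
    watts = latest_value(energy_feed, "power_draw")        # W
    return {
        "river_ok": oxygen >= 5.0,
        "energy_frugal": watts <= 60.0,
    }
```

The point is that neither project needs to know about the other: each just publishes its datastreams, and anyone can subscribe to both and derive something new.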
In the picture above, Natalie Jeremijenko and Jonathan Laventhol give the Amphibious Architecture sensor array a last look over before it is lowered into the East River. Jonathan is on a busman’s holiday to help out at the pre-launch of Amphibious Architecture, near Manhattan Bridge, NYC.
I was very happy to get a chance to talk to Jonathan Laventhol – more on our conversation in another post. Jonathan Laventhol is CTO of Imagination – one of the world’s leading design, events, and branding agencies. We talked about the importance of Pachube, which Jonathan called “The Facebook of Data,” and how the symbiosis between brands and augmented reality, and healthcare applications, would be key to augmented reality emerging into the mainstream.
Natalie Jeremijenko’s workshop at Conflux – on the social negotiation of technology and how “everyware” can give us the chance to experience new forms of agency and connection – was totally inspiring. I will cover this too in another post. I have so much awesome stuff to write about at the moment!
None of the projects in, “Toward the Sentient City,” included a mobile augmented reality, or “magic lens” component, but they all pointed to why “enchanted windows into our newly inside-out reality” are going to be so important. And why the DNA base pair of ubicomp and augmented reality can really do stuff that matters.
Shangri-La – “Transfigured City”
In my AR Consortium founder member interview series, I have found that, understandably, the visionary founders of these first augmented reality companies are a little reticent about sharing their full vision. They are basically in stealth mode in this regard. So, since you will not get a fully drawn scenario of a next generation of shared augmented reality experiences from my interview with Total Immersion founder and CEO Bruno Uzzan, here’s a really interesting episode from the anime Shangri-La, called “Transfigured City,” to mull over instead.
As you can tell from this rather long and circuitous intro to my conversation with Bruno Uzzan, I have been investigating shared augmented realities pretty intensively recently. Mike Kuniavsky pointed me to Shangri-La, and “Transfigured City,” in a conversation with Mark Shepard after Mark’s presentation at Conflux, Sentient City Survival Kit.
Mike Kuniavsky, with Tod E. Kurt, is a founder of ThingM, a ubiquitous computing device studio. Mike also researches, designs, and writes about people’s experiences at the intersection of technology and everyday life – see Mike’s blog, Orange Cone. And I interviewed Mike at ETech – see here.
In Transfigured City, the “Metal Age” group has to figure out how to share and communicate in a city transfigured by augmented realities/virtualities, where no-one sees the same place in the same way. Only one character can figure out from her previous experience of the city the relationship between the transfigured city and how it used to be.
The conversation I had with Mike Kuniavsky on The Transfigured City continued at a picnic in Washington Square Park the next day with Elizabeth Goodman, who I met at Etech when she gave a brilliant presentation, Designing for Urban Green Space. We covered so many areas at the picnic related to ubiquitous computing and augmented realities that this conversation probably deserves a post of its own (my writing to do list is growing longer!).
“In the mid-21st century, the international committee decided to forcefully reduce CO2 emission levels to mitigate the global warming crisis. As a result, the economic market was transferred mainly into the trade of carbon. A great earthquake destroys much of Japan, yet the carbon tax placed on the country is not lifted, so Tokyo is turned into the world’s largest “jungle-polis” that absorbs carbon dioxide. Project Atlas is commenced to plan the rebuilding of Tokyo and oversee the government organization, which the Metal Age group opposes due to its oppressive nature. However, Atlas is only built with enough room for 3,500,000 people and most people are not allowed to migrate into the city. The disparity between the elite within Atlas and the refugees living in the jungles outside of its walls set up the background of the story.”
Tish Shute: We won’t have fully opened the Pandora’s Box of Augmented Realities until we have ubiquitous, shared augmented realities, will we?
Bruno Uzzan: Yes. The most important for augmented reality is the experience we want to share. Now we are working on the cell phone, we can potentially do some marketing components that we already have developed now on cell phone. Done. It’s working.
But the most interesting part of it is how these new components [cell phone AR] will be used for marketing campaigns by brands. And we are also pretty much well positioned to transform some of the AR that we currently have working on Mac and PC and to transform these to applications working on mobile devices.
Tish Shute: We haven’t really experienced yet what it means to actually share mobile AR experiences?
Bruno Uzzan: It’s hard — we did a Facebook app. It’s a first try, it has a way to go. But to go more and more into social is the way forward for us – to share and expand AR experiences. But yes, I mean, what you’re seeing is how two people on two different applications can share that same experience. For sure we are going in that direction. We are currently working on those kinds of solutions – how people can share and experience together at the same time. That’s how we start creating excitement in augmented reality, and it’s coming up.
It’s a new market and there’s so much more in store for augmented reality. You know, some people are telling me, don’t you believe that augmented reality is a gimmick? It will be a trend for a few weeks or a few months and then gone? I say, you’re kidding me. This is only the beginning. I mean I can assure you that the applications that are on the market today are one percent of what we will have five years from now.
Tish Shute: I agree.
Bruno Uzzan: And I’m sure that augmented reality will be a part of a lot of components that we are currently using today – GPS, web browser, glasses, I mean there are so many applications that will come up shortly. This is only the beginning. I’m completely convinced that augmented reality will be in three years from now what virtual reality is today, which is a billion dollar market. I know that it’s not just a gimmick of a few weeks or a few months, because so many brands are jumping into it, spending money, exploring solutions. I know that it’s not just short term – what they are willing to do and we are willing to do, but also middle and long term. And that’s what makes this adventure pretty much unique and what makes creating a cutting edge technology, very, very much exciting for us.
Tish Shute: First, could you explain more to me about your partnership with Int13? I am not sure I understand what is in the arrangement from Total Immersion’s POV. I mean, what happens re your own mobile software development? Haven’t you only been licensed the Int13 SDK for a limited period of time, with limited access to all its power? Stephane from Int13 said to Ori on Games Alfresco, here, “we have licensed the SDK4 for two years,” and then Ori asks, “but you have basically kept the power to yourselves, right?” So if they are the only ones who can enhance it and develop the software, where will TI be in two years in mobile if you haven’t really had the chance to develop your own software?
Bruno Uzzan: Actually it’s a real win-win situation. Int13 is a very small company and they have so many requests they can’t possibly fulfill them all. So this is a way for both of us to be, as quickly as possible, the first mobile provider for all the requests we have. Also they give us exclusivity so nobody else can use the Int13 SDK for such applications. I think that it is a good partnership.
And concerning our own mobile application… First of all we have currently some mobile applications working. But with Int13 we have a mobile solution that can work on many different devices. That’s a fact and that’s working. And, believe me you will hear from us a lot more about this soon. We are fully independent on our mobile development. The reason we closed the partnership with Int 13 is to be able to deploy mobile in a broad way.
I mean you know that the difficulty with AR mobile is that each separate device needs some customization. Working on the iPhone is different from working on the Nokia, different from working on the Palm; it’s different from working on the Samsung. Each of them has its own operating system inside, and so we were interested in Int13’s very clever embedded solution that allows our solutions to work across many platforms.
The reason we are working with Int13 is that we are able to work on so many mobile devices, thanks to Int13. And in the mobile AR race that we are currently in, the next two years will be extremely important to us…
Tish Shute: OK, that definitely clarifies it a lot. So Int13 has done an embedded solution to allow TI developed AR solutions to work easily across many devices?
Bruno Uzzan: Yes they have kind of an embedded solution, a way to address extremely quickly new cell phone… But, currently on our side, we are in discussions with a mobile company… and that only refers to some very specific mobile devices. And what they have is also a way to embed deeper our technology into mobile, so that we can have quicker… applications that work on a large number of cell phones.
Tish Shute: So, basically it means you don't have to go through complicated negotiations with each of the cell phone companies, is that what you are saying?
Bruno Uzzan: Not only negotiations, but also hard development. You know? Working on Windows Mobile is completely different from working on Palm OS. You know, that's different! It's a lot of work to get a mobile application working on many other devices. So Int13 provides a way for us to save some time and some development cost too.
Tish Shute: And Int13 doesn’t have powerful AR development tools like D’fusion right?
Bruno Uzzan: Right! That's right. That's why we say it's a true win-win solution. They can benefit from our work too, and we can benefit from their work, in order to deploy mobile solutions quicker and faster.
Tish Shute: Now, the second thing is… there is a lot of debate and disagreement about how far mobile augmented reality is from delivering something more than the "post-it" approach that has been much publicized in recent months, via all the AR browser apps.
But from my understanding from the conversation we had earlier this summer (see below), Total Immersion is targeting a much higher level of mobile augmented reality than we’ve seen to date?
Bruno Uzzan: Yes, the browser apps we have seen are a kind of augmented reality, but not exactly the way we see it. Let me explain why. With this kind of application it's true that you can overlay 3D information and video. That's a fact. So, in a sense, that's augmented reality. But the way they position the 3D on that video is by using compass and GPS information… so it means that this kind of AR solution will work only on buildings and physical objects that are FIXED, in a fixed and known position.
So you want to go to a theater?
The theater is here; for sure it will not move, so you know the position of the theater, and so it's a given that you can superimpose an object on the theater. That's what can be done currently. What we are achieving and what we are doing on mobile is more than that. We want to be able to port our solutions with trading cards, with brands, onto a smart phone.
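To make Bruno's point concrete, here is a minimal sketch (my own, not Total Immersion's code) of how a compass-and-GPS AR browser can place a label over a fixed landmark like that theater. The function names, the default field of view, and the simple linear projection are all illustrative assumptions.

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from the viewer to the POI, in degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(y, x)) % 360

def screen_x(poi_bearing, compass_heading, fov_deg=60, screen_w=320):
    """Horizontal pixel position for the POI label, or None if off-screen."""
    # Signed angle between where the camera points and where the POI lies.
    offset = (poi_bearing - compass_heading + 180) % 360 - 180
    if abs(offset) > fov_deg / 2:
        return None  # outside the camera's field of view
    return screen_w / 2 + (offset / (fov_deg / 2)) * (screen_w / 2)

# A theater roughly due east of the viewer lands near screen center
# when the viewer faces east (heading 90 degrees).
theater_bearing = bearing_deg(40.7580, -73.9855, 40.7580, -73.9800)
label_x = screen_x(theater_bearing, 90.0)
```

Note what this approach cannot do, which is exactly Bruno's objection: everything depends on the POI's coordinates being fixed and known in advance, so a movable object like a drink can has no bearing to compute.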
I’m assuming that you want a can, a drink can, to be able to trigger an experience. The only way you can do it is to be able to understand what the can, it is. And the current solutions that are out there can’t do that, it’s impossible.
Tish Shute: Right, yes. There's no near-field object recognition at all in these early browser apps.
Bruno Uzzan: And the solution we have is that we can recognize a can in a very, very precise way, and that activates geo-location, so we can superimpose 3D. I mean, in that case, it opens up all the applications that we currently have, so they could work on mobile.
Tish Shute: So for example, if you’re working with a soft drink company, people can trigger that experience wherever they see that can?
Bruno Uzzan: Correct.
Tish Shute: Yes. Yes, I assumed that was what you're doing.
Bruno Uzzan: We believe — and maybe that's not the case, but we believe that our marker-less tracking technology is pretty much unique on mobile devices.
I haven’t seen yet, from anyone, a full augmented reality mobile solution working.
I really see AR being part of the next generation, Web 3.0. I mean, the vision I have is that, you know — today, when you want information, you go on a website and then you find your information. AR, and the future, I think, will be part of the opposite. You want information about a product, you just show it to your computer and the information will automatically pop up. I see here a new way to market some key messages; a new way to get information, where physical products by themselves become a way to get information, and you don't have to search for it anymore; it comes out to you.
AR is definitely, for me, one of these components. On one hand AR is a solution; on the other, AR itself will create these kinds of results in how information is displayed. But I'm seeing here something that could be part of a new way to have access to information. And that's part of the vision I have. Whether it is through mobile phone or web, or PC, Mac, whatever, I really believe that this kind of new generation of receiving information will come shortly and could be part of the new 3.0 generation of the web.
Tish Shute: My friend Gene Becker did an interesting post recently on some of the current limitations of mobile AR where he pointed out the problem of:
“Simplistic, non-standard data formats – POIs, the geo-annotated data that many of these apps display, are mostly very simple one-dimensional points of lat/long coordinates, plus a few bytes of metadata. Despite their simplicity there has been no real standardization of POI formats; so far, data providers and AR app developers are only giving lip service to open interoperability. Furthermore, they are not looking ahead to future capabilities that will require more sophisticated data representations. At the same time, there is a large community of GIS, mapping and Geoweb experts who have defined open formats such as GeoRSS, GeoJSON and KML that may be suitable for mobile AR use and standardization.”
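For reference, here is roughly what one of those simple POIs might look like expressed in GeoJSON, one of the open formats Becker mentions. This is only a sketch: the `properties` keys are illustrative, since, as the quote says, no AR-specific standard existed for them.

```python
import json

# One POI as a GeoJSON Feature: a lat/long point plus a few bytes
# of metadata. The "properties" keys below are illustrative; GeoJSON
# itself does not standardize them, which is part of the
# interoperability gap Becker describes.
poi = {
    "type": "Feature",
    "geometry": {
        # GeoJSON coordinate order is [longitude, latitude].
        "type": "Point",
        "coordinates": [-73.9857, 40.7484],
    },
    "properties": {
        "name": "Empire State Building",
        "category": "landmark",
    },
}

# Round-trip through JSON, as an AR browser fetching a POI feed would.
decoded = json.loads(json.dumps(poi))
```

Even this tiny example shows the one-dimensionality Becker objects to: a point and a name, with no geometry, orientation, or richer model that tighter registration would need.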
Bruno Uzzan: That's interesting. I mean — I know exactly what he is referring to. He is mainly referring to localization and how you can have a quick, accurate localization. If you look at current solutions, and you look at the 3D superimposed on the video, the 3D is shaking a lot. I don't know if you have seen that in some of these early efforts.
It's hard to use, because the 3D, you know, is part of the magic of augmented reality; that is, when the 3D is inserted in a very easy and smooth way in your solution. Here, when you see this 2D or 3D overlay on the video, it's shaking a lot. One reason for this is that the GPS and compass are not accurate enough to determine the precise location of the user. And here, what Gene says is interesting. I think we are addressing this localization issue in a pretty smart way.
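The shaking Bruno describes comes from applying raw, noisy compass and GPS readings to the overlay every frame. One common mitigation, sketched below under my own assumptions (this is not Total Immersion's method), is a simple low-pass filter on the heading; the wrap-around at 360 degrees has to be handled, or the overlay whips the long way round the screen.

```python
def smooth_heading(prev, raw, alpha=0.15):
    """Exponential low-pass filter for a compass heading in degrees.

    alpha near 0: heavy smoothing (stable but laggy overlay);
    alpha near 1: raw readings (responsive but shaky).
    """
    # Shortest signed angular difference, so 359 -> 1 is a +2 degree
    # step rather than a -358 degree swing across the screen.
    diff = (raw - prev + 180) % 360 - 180
    return (prev + alpha * diff) % 360

# Feed jittery readings through the filter frame by frame.
heading = 90.0
for reading in [92.0, 87.0, 91.0, 89.0]:
    heading = smooth_heading(heading, reading)
```

The trade-off is the classic one: more smoothing makes the overlay steadier but makes it lag behind as the user turns, which is why better localization (Bruno's "pretty smart way") beats filtering alone.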
But to be frank with you, I don't believe mobile augmented reality in the extremely short term — I'm talking about three weeks, one or two months — is mature enough for good AR applications. It will be shortly. But for now it is more proof of concept than a true, easy-to-use application.
We are starting to see a lot of new applications coming out, but I really believe that marketing and entertainment are the two key markets for AR right now.
I’ve been working ten years in augmented reality. And, eight years ago, when I was talking about augmented reality, I was E.T., you know? Nobody understood what I said, and I thought it was crazy. And now, today, yes it’s completely different.
Tish Shute: The Pandora's Box of Augmented Realities, in my view, is an open, universal, standard, distributed, multiuser augmented reality framework fully integrated with the internet and world wide web. I have been looking into Google Wave protocols as a basis for this. Would you be interested in this? Do you think it is feasible?
Bruno Uzzan: I think this is feasible. I think that's doable; that's just my opinion. I mean, some people might have another kind of opinion, but I think that that's definitely doable.
Tish Shute: Yes, I suppose an open AR framework involves cooperation and collaboration; it is more about business and politics than technological problems.
Bruno Uzzan: Yes! Actually the Web is politics. Business is politics.
Tish Shute: I would be interested if anyone in your R&D team would be interested in looking at some of the ideas that are emerging in our little discussion of Google Wave and an Open AR Framework, to offer feedback. It is an interesting time to give input on the Wave Federation Protocol docs because nothing is set in stone right now.
Bruno Uzzan: Just shoot me an email, and I'll try to put you in touch with the right person, a team member who can give input on this.
Tish Shute: For mobile augmented reality the best thing we’ve got now is the phone, right?
Bruno Uzzan: Right.
Tish Shute: And the only way we can use the phone is by holding it up, right? Isn't this a bit of an obstacle as you introduce better object recognition and tracking? People are going to have to stop moving to use their phone. What do you feel about that experience? Isn't AR eyewear an essential part of a tightly registered AR experience?
Bruno Uzzan: We don't do hardware, and we don't currently have an eyewear solution that would do all we need for a good mobile AR experience, so I guess we don't have the answer for that yet. But we are beginning to see the next generation of these glasses.
Tish Shute: But you're happy enough with the mobile experience of augmented reality on smart phones that you're investing in the next generation of software for it?
Bruno Uzzan: Yes, I know. We know that some applications will not work on the iPhone. And yes, whatever you do, you still need to hold the iPhone, so it means that you can't play with your hands anymore. So we know that, partially, some AR solutions we have on other platforms will lose their magical effect on the iPhone.
But I'm starting to see on the market some glasses that could perhaps be not too expensive — that's a challenge! And easy to use — that's another big challenge. And that could fit on anybody's face and head — there's another big challenge. So yes, I'm starting to see that, but so far AR glasses are only applicable for some very, very specific applications, like design or theme parks or, you know, specific locations where it makes sense to move forward with glasses.
I don't believe that kids will use glasses with our toys and for games in the next months, or maybe the next one or two years. But maybe something will come out shortly that could be a big breakthrough and enable us to think another way. But from what we have seen so far, and from what we know of this hardware market, I don't believe that currently there is a workable solution.
Note: The following section of the interview took place earlier in the Summer.
Tish Shute: You are the first commercial AR company – you started in 1999, right?
Bruno Uzzan: Yes, you are right. We started extremely early in this augmented reality market. We were the first company worldwide to start doing augmented reality and to start promoting augmented reality. So it's true, we are pretty old players, although the market has only been getting bigger and bigger for the last year and a half. So for a long time we were alone in the market, and the market was not really there.
But for the past 8 months, the company has been growing really fast.
Tish Shute: Yes, I'm sure. Congratulations for hanging in there long enough to get the payoff!
Bruno Uzzan: You know, my background is in finance. So I have been driving the company for many years in a very cash-efficient way. We have been waiting for the market to reach maturity before starting to make investments. That's the reason we are still here, and that's the reason I think we managed pretty smartly the cash that we raised for the company.
Tish Shute: Yes, there is a saying that when a market takes off you can tell the pioneers because they are the ones with the arrows in their backs. But I am glad you are dodging the arrows!
Bruno Uzzan: You know, I’ve always driven the company with revenue. And because revenue was not there at the beginning I was extremely cautious about the cash. So now that the company is getting some revenue, for sure we are making more and more investments, and taking advantage of our situation as a worldwide leader of augmented reality.
The situation is not as easy as it appears today, but it's now getting better. As you can see, AR, augmented reality, has very good momentum, and we are benefiting a lot from all this momentum right now.
Tish Shute: You've been very involved in researching and developing augmented reality tools. Are you still as active in the research area, or are you too busy keeping up with work-for-hire now to be working on research and building new technology for augmented reality?
Bruno Uzzan: Both. First of all, we are part of a lot of projects, either directly with clients like Mattel or with partners that are using our technology to promote and develop other AR projects. From what we have seen, many, many augmented reality projects are currently being done with our solutions.
To continue with your previous question: so we are perceived as the leader in this space, and we have some pretty heavy demand for our services. But we are coming up with new technology, of course, still connected to augmented reality. Our R&D is working in two different directions, which of course are also bound together.
The first one is platform development. We want augmented reality to work on as many platforms as possible – PC, Mac, mobile, game consoles; all those are the platforms that we are targeting. The R&D team is currently doing a lot of work on cross-platform compatibility.
Tish Shute: Robert Rice said recently, “markers and webcams equal Photoshop page curls…”
Bruno Uzzan: Yes. There are so many concerns with markers. The quality is extremely bad. As soon as you hide a part of the marker, a slight part of the marker, you're dead. You can't track the object any more. Compared to our solution, where, let's say, you play with cards or with a Mattel toy, even if you hide a part of the toy, it's still working.
Tish Shute: But you haven't offered the public an SDK for your engine, right? Basically, the way people get access to your tools is by working in a partnership with Total Immersion, right?
Bruno Uzzan: Correct.
Tish Shute: Do you think in the future you might open your SDK? Are you considering that?
Bruno Uzzan: Yes, it would be interesting.
Tish Shute: So that is something we can see coming soon?
Bruno Uzzan: Maybe, because it's true that Total Immersion is starting to be mature enough for these kinds of tools. The only thing is that we have to respect good timing for that. It's a big decision. You know what I mean? It is a big, big decision. We would then be competing with others using our own technology.
Tish Shute: Oh I know, it is a big decision when you have so much skin in the game! But it would be nice to have your SDK being THE platform for AR, wouldn’t it?
Bruno Uzzan: It is a really big decision that we can't just take like that, you know. A lot of friends have told me you have to be extremely careful about timing. This timing is pretty much connected to the maturity of the market. For sure, we see the market becoming more and more mature. But there is a lot of low-hanging fruit we still want to address, to get the best value possible from all the publicity we have and all the clients we have now.
Tish Shute: Yes, I know. You've been in this game so long. Now, there is an interesting question here about tools and platforms, because, you know, augmented reality has already expanded beyond its original, purist definition. And when I talk to people about augmented reality, there are actually a lot of different ideas and priorities for where the tools should go right now. Obviously we have these browser-like applications, but they are not dealing with recognizing near-field objects yet. What are your priorities for tool development, and what are your priorities for AR development in the future? What areas are you going to focus on? Oh dear, that is a rambling question!
Bruno Uzzan: [laughter] So, one of our first priorities is that we need to create our software with one development, one installer, one piece of software that can be spread across different platforms. The same application, the same software, can be used on a PC, Mac, phone or console. That's a lot of work, because it means that our platform has to address many, many different devices, and that's a big priority for us because we received this request from our clients. They want to be able to use one application on many different platforms and devices. So, that's the first one.
And the second one is to add more and more interactivity between the real and the virtual world. So we are working on some improvements to add real components that will interact with the virtual, and that is also part of our big strategy and direction: these two worlds can more and more be bridged, linked together, so they can interact with each other.
Our R&D guys are working on the real world interacting more with the virtual world, and I have started seeing some results which are pretty much crazy. This will be ready next year.
There are so many different directions for interaction between the real world and virtual world to develop. I'm sure ten years from now you're going to have AR applications everywhere. It's not just a temporary fashion or a gimmick for a few months. I mean, we are getting there; it's getting stronger and stronger, and we are getting a good adoption rate from consumers. They like it, they test it, they play with it, and brands want more, people want more, and it's getting bigger and bigger.
Tish Shute: Yeah, and I totally agree, it's not a gimmick, because the interaction between "virtual" and "real" enhances the magic of both. Another question, about your R&D operation: is your R&D still in France, or have you moved totally out to LA?
Bruno Uzzan: We are 50 people in France, and I started this LA office two years ago and moved permanently to LA at that time. So I'm now permanently located in the US to take care of the US office, knowing that revenues are really getting bigger and bigger in the US. We are getting a lot of traction, working with large companies.
Tish Shute: My sister lives in Paris. Could I visit your R&D lab at some point? I’d love to visit!
Bruno Uzzan: Yeah, sure, sure, sure. I mean, if you want to come, you won't have access to all the research, but if you want to come and meet the team, please do.
Tish Shute: I’d love to.
Bruno Uzzan: No problem. Shoot me an email and I will introduce you to Eric Gehl; he is the COO of the French team. And he can definitely take care of that.
Tish Shute: That would be fun. Thank you!
Tish Shute: Recently, AR browser applications have really caught the imagination of the web community, e.g., Layar and Wikitude. Where do you think the most important market for AR is at the moment: entertainment, green tech, business, education?
Bruno Uzzan: I think that all the ones you mention will be important. The first one that grabbed my attention is entertainment, particularly digital marketing, because they are always searching for new ways to interact with players or consumers. But it's just the tip of the iceberg, you know. I mean, medical applications of augmented reality could be huge. Education and edutainment are definitely using more and more augmented reality components, and we are in discussions with big companies that are considering using augmentation for education. Museums are very important too. Also, augmentation as a kind of free sales tool – you know, there are so many applications, design, architecture, so many directions that it's hard to say today which one will take the lead.
But I do believe that in the short term the ones that are really moving fast are the entertainment business and the digital marketing business.
Tish Shute: What do you think are the biggest shortcomings with current augmented reality and what are the obstacles that no one has solved yet?
Bruno Uzzan: I think the cell phone is not fully ready for augmented reality. A lot of people are working on that, but there are still a lot of constraints to getting augmented reality working on a cell phone. From what I hear, a lot of manufacturers and a lot of companies are working in directions that are going to help us a lot to develop some great cell phone applications.
And I think that's one of the biggest parts of the game. All the applications that you see on cell phones so far are just gimmicks. The next big key is how to transform a gimmick cell phone application into a real, industrial, robust application that's going to work on a cell phone. So I think that's a big challenge for this year.
Most of what we see now is just matching and overlaying some 2D components on a video. This is not what I call AR. With this kind of application, you are far away from doing the registration that we need to do; you can't do it. So here's the challenge: how can you get something like a Topps application working on a cell phone? That's the big challenge, how we can make that work! You can't today get a real AR Topps application working on a cell phone because there's no cell phone that's actually ready. But we are working on it, and for the first one that can make that work, it's going to be huge.
When you are working with good AR components you need a lot of CPU and GPU power. Today, new cell phones have started to be more and more ready for augmented reality, but you need a really good cell phone to make it work. You can't choose an old cell phone, because you have some recognition, some tracking, some rendering, so you can't choose a two-year-old Nokia cell phone to make that work. For sure, the newest iPhone is the one that can make it work, but that's it for now. There is a lot of research, from large cell phone companies, to get more CPU and GPU into their cell phones. But so far we are also waiting for these devices to be released to consumers.
Tish Shute: And the current economic climate has put a damper on MIDs, hasn't it? But who can tell? It depends what price point a new MID comes out at, right?
Bruno Uzzan: Correct.
Tish Shute: Yes, I agree. But basically the interesting thing is, the iPhone can deliver so much of what is necessary, and even if Apple hasn't given AR developers access to the full power of the iPhone yet, there is really no going back now – the mobile augmented reality cat is out of the bag!
Bruno Uzzan: You’re right, you’re fully right.