<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>UgoTrade &#187; Intel in Virtual Worlds</title>
	<atom:link href="http://www.ugotrade.com/category/virtual-realities/open-source-virtual-worlds/intel-in-virtual-worlds/feed/" rel="self" type="application/rss+xml" />
	<link>http://www.ugotrade.com</link>
	<description>Augmented Realities at the Edge of the Network</description>
	<lastBuildDate>Wed, 25 May 2016 15:59:56 +0000</lastBuildDate>
	<language>en-US</language>
		<sy:updatePeriod>hourly</sy:updatePeriod>
		<sy:updateFrequency>1</sy:updateFrequency>
	<generator>https://wordpress.org/?v=3.9.40</generator>
	<item>
		<title>People Meet People Meet Big Data: ScienceSim Explores Collaborative High Performance Computing</title>
		<link>http://www.ugotrade.com/2009/02/11/people-meet-people-meet-big-data-sciencesim-explores-collaborative-high-performance-computing/</link>
		<comments>http://www.ugotrade.com/2009/02/11/people-meet-people-meet-big-data-sciencesim-explores-collaborative-high-performance-computing/#comments</comments>
		<pubDate>Wed, 11 Feb 2009 22:40:02 +0000</pubDate>
		<dc:creator><![CDATA[Tish Shute]]></dc:creator>
				<category><![CDATA[architecture of participation]]></category>
		<category><![CDATA[Augmented Reality]]></category>
		<category><![CDATA[Carbon Footprint Reduction]]></category>
		<category><![CDATA[culture of participation]]></category>
		<category><![CDATA[digital public space]]></category>
		<category><![CDATA[Ecological Intelligence]]></category>
		<category><![CDATA[Energy Awareness]]></category>
		<category><![CDATA[Instrumenting the World]]></category>
		<category><![CDATA[Intel in Virtual Worlds]]></category>
		<category><![CDATA[interoperability of virtual worlds]]></category>
		<category><![CDATA[Metaverse]]></category>
		<category><![CDATA[mirror worlds]]></category>
		<category><![CDATA[Mixed Reality]]></category>
		<category><![CDATA[nanotechnology]]></category>
		<category><![CDATA[Open Grid]]></category>
		<category><![CDATA[open metaverse]]></category>
		<category><![CDATA[open protocols for virtual worlds]]></category>
		<category><![CDATA[open source]]></category>
		<category><![CDATA[Open Source Virtual Worlds]]></category>
		<category><![CDATA[open standards for virtual worlds]]></category>
		<category><![CDATA[OpenSim]]></category>
		<category><![CDATA[Participatory Culture]]></category>
		<category><![CDATA[science outreach in virtual worlds]]></category>
		<category><![CDATA[scientific simulation in virtual worlds]]></category>
		<category><![CDATA[Smart Planet]]></category>
		<category><![CDATA[sustainable living]]></category>
		<category><![CDATA[Virtual Realities]]></category>
		<category><![CDATA[Virtual Worlds]]></category>
		<category><![CDATA[virtual worlds in Japan]]></category>
		<category><![CDATA[World 2.0]]></category>
		<category><![CDATA[big data]]></category>
		<category><![CDATA[collaboration and big data]]></category>
		<category><![CDATA[collaborative visualization]]></category>
		<category><![CDATA[haptic interfaces for virtual worlds]]></category>
		<category><![CDATA[Hypergrid]]></category>
		<category><![CDATA[linked data]]></category>
		<category><![CDATA[modelling complex systems]]></category>
		<category><![CDATA[n-body simulation]]></category>
		<category><![CDATA[Piet Hut]]></category>
		<category><![CDATA[rapid data movement in virtual worlds]]></category>
		<category><![CDATA[ScienceSim]]></category>
		<category><![CDATA[scientific simulation]]></category>
		<category><![CDATA[steering big data simulations from virtual worlds]]></category>
		<category><![CDATA[steering virtual worlds with brain waves]]></category>
		<category><![CDATA[super computing conference]]></category>
		<category><![CDATA[supercomputing]]></category>
		<category><![CDATA[Wilf Pinfold]]></category>

		<guid isPermaLink="false">http://www.ugotrade.com/?p=2855</guid>
		<description><![CDATA[Wilfred Pinfold, Director, Extreme Scale Programs for Intel, and the Supercomputing Conference general chair, is working with some Intel colleagues to make a project called ScienceSim the centerpiece of a special workshop event at the SC09 conference (see Supercomputing Conference, an ACM and IEEE Computer society sponsored event). Recently, I interviewed Wilf Pinfold (see interview [&#8230;]]]></description>
				<content:encoded><![CDATA[<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/02/gwave_lg.jpg"><img class="alignnone size-full wp-image-2861" title="gwave_lg" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/02/gwave_lg.jpg" alt="gwave_lg" width="540" height="540" /></a></p>
<p>Wilfred Pinfold, Director, Extreme Scale Programs for Intel, and the <em><a href="http://sc08.supercomputing.org/">Supercomputing Conference</a></em> general chair, is working with some Intel colleagues to make a project called <a href="http://www.sciencesim.com/">ScienceSim</a> the centerpiece of a special workshop event at the SC09 conference (<em>see the <a href="http://sc08.supercomputing.org/">Supercomputing Conference</a>, an ACM and IEEE Computer Society sponsored event</em>).</p>
<p>Recently, I interviewed Wilf Pinfold (see interview below), Mic Bowman (also <a href="../../2008/09/15/interview-with-mic-bowman-intel-the-future-of-virtual-worlds/">see my previous interview here</a>), and John A. Hengeveld (see interview below). I wanted to find out: what are the underlying goals of this SC conference program? Why are members of the SC community being encouraged to participate in the ScienceSim environment? What projects are beginning to emerge? And what are Intel&#8217;s goals in giving infrastructure support to further the conversation between high performance computing and collaborative virtual worlds?</p>
<p>The vision of creating new ways to collaborate and interact with big data does seem to be one of the more significant steps we can take at a time when we find many of our most complex systems roiling and threatening total collapse. As Tim O&#8217;Reilly has pointed out &#8211; from financial markets to the climate, the complex systems we depend on for our survival seem to be reaching their limits.</p>
<p>But how can we get from the place we are now &#8211; <a href="http://www.youtube.com/watch?gl=GB&amp;hl=en-GB&amp;v=gM4fmL6dLdY" target="_blank">see this example of an n-body simulation in OpenSim</a> &#8211; to the point where we can collaboratively steer, from our visualizations, big data simulations of climate change, financial markets, or the depths of the universe? The picture opening this post is a:</p>
<blockquote><p><em>Frame from a 3D simulation of gravitational waves produced by merging black holes, representing the largest astrophysical calculation ever performed on a NASA supercomputer. The honeycomb structures are the contours of the strong gravitational field near the black holes. Credit: C. Henze, NASA</em></p></blockquote>
<p>Wilf Pinfold explained to me that part of the reason to begin a dialogue on collaborative visualization at SC &#8217;09 is that supercomputing communities (which tend to be highly skilled and visionary) have played key roles in internet development in the past. Wilf pointed out that key browser technology developed out of these communities in the early days of the internet &#8211; see <a href="http://en.wikipedia.org/wiki/Mosaic_(web_browser)" target="_blank">this Wikipedia entry</a> on the role of NCSA (the National Center for Supercomputing Applications).</p>
<p>The hope is that, while there are many obstacles to overcome, the supercomputing community has both the skills and the motivation to create collaborative environments capable of the kind of rapid data movement that scientific/big data visualization needs. Solving the problems of realtime collaborative interaction with big data will have many ramifications for the way we understand virtual reality, the metaverse, and virtual worlds (all these terms are becoming increasingly inadequate for cyberspace in the age of ubiquitous computing, an argument I will make in another post!).</p>
<p>There have already been a number of blog posts on ScienceSim (see <a href="http://www.virtualworldsnews.com/2008/11/intel-creating-sciencesim-on-opensim.html" target="_blank">Virtual World News</a>, <a href="http://nwn.blogs.com/nwn/2009/02/intel-outside-.html" target="_blank">New World Notes</a>, <a href="http://www.vintfalken.com/intel-using-opensim-for-immersive-science-project/" target="_blank">Vint Falken</a>, and <a href="http://daneel-ariantho.blogspot.com/2009/02/sciencesim.html" target="_blank">Daneel Ariantho</a>). There have also been Intel blogs &#8211; <a href="http://blogs.intel.com/research/2009/01/sciencesim.php" target="_blank">see this post</a> by John A. Hengeveld (a senior business strategist working with Intel planners and researchers to accelerate the adoption of Immersive Connected Experiences), and Intel CTO <a href="http://blogs.intel.com/research/2008/11/immersive_science.php" target="_blank">Justin Rattner&#8217;s post</a> announcing the project last November.</p>
<p>But to blow my own horn a little, I think I was the first to blog the encounter between <a href="http://opensimulator.org/">OpenSim</a> and supercomputing (an encounter I to some degree provoked by making the introductions) &#8211; <a href="http://www.ugotrade.com/2008/07/19/astrophysics-in-virtual-worlds-implementing-n-body-simulations-in-opensim/" target="_blank">see this post</a>. So I have been following the ScienceSim initiative with great interest.</p>
<p>Very shortly after the N-body astrophysicists Piet Hut and Jun Makino &#8211; creators of GRAPE (an acronym for &#8220;gravity pipeline&#8221; and an intended pun on the Apple line of computers), a supercomputer that will <a href="http://grape.mtk.nao.ac.jp/grape/news/ABC/ABC-cuttingedge000602.html" target="_blank">become one of the fastest supercomputers in the world (again)</a> &#8211; met <a href="http://www.genkii.com/" target="_blank">Genkii</a>, a Tokyo-based strategic company working with OpenSim, the first N-body simulation appeared in OpenSim. And in a matter of weeks <a href="http://www.youtube.com/watch?v=gM4fmL6dLdY" target="_blank">this video went up on YouTube</a> &#8211; the result of a collaboration between MICA and Genkii. But the nirvana of being able to create visualizations using real time data from supercomputers, steered from a collaborative environment, is still a ways off.</p>
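<p>For readers curious what such an N-body simulation actually computes: each body feels the summed Newtonian gravity of all the others, and the system is stepped forward in small time increments. The sketch below is purely illustrative Python (not the MICA/Genkii or ScienceSim code; the masses, positions, and softening constant are made up for the example):</p>

```python
# Minimal N-body gravity sketch (illustrative only; not the MICA/Genkii code).
# Units are arbitrary and the gravitational constant G is set to 1.

def accelerations(masses, positions):
    """Pairwise Newtonian gravity: a_i = sum over j of m_j * (r_j - r_i) / |r_j - r_i|^3."""
    n = len(masses)
    acc = [[0.0, 0.0, 0.0] for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            dx = [positions[j][k] - positions[i][k] for k in range(3)]
            r2 = sum(d * d for d in dx) + 1e-9   # softening avoids divide-by-zero
            inv_r3 = r2 ** -1.5
            for k in range(3):
                acc[i][k] += masses[j] * dx[k] * inv_r3
    return acc

def step(masses, positions, velocities, dt):
    """One leapfrog (kick-drift-kick) step; symplectic, so energy drifts only slowly."""
    acc = accelerations(masses, positions)
    for i in range(len(masses)):
        for k in range(3):
            velocities[i][k] += 0.5 * dt * acc[i][k]
            positions[i][k] += dt * velocities[i][k]
    acc = accelerations(masses, positions)
    for i in range(len(masses)):
        for k in range(3):
            velocities[i][k] += 0.5 * dt * acc[i][k]
```

<p>Real codes like GRAPE replace the O(n&#178;) double loop above with special-purpose hardware or tree algorithms; the physics, however, is the same.</p>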
<p>Supercomputing communities tend to be geographically very dispersed, and researchers often find themselves far from simulation facilities, so there are both the motivation and the skills to pioneer new tools for collaborative visualization. I know that astrophysicists certainly see their value (Piet Hut has some profound ideas on this). Astrophysicist Piet Hut and others (<a href="http://www.ugotrade.com/2008/07/19/astrophysics-in-virtual-worlds-implementing-n-body-simulations-in-opensim/" target="_blank">see here for more</a>) have been pioneering the use of VWs for collaboration. There are two virtual world organizations, both founded by <span class="nfakPe">Piet</span> Hut and collaborators, that are currently exploring the use of OpenSim for scientific visualizations. One is specifically aimed at astrophysics: MICA, the <a href="http://www.mica-vw.org/" target="_blank">Meta Institute for Computational Astrophysics</a>. The other is aimed broadly at interdisciplinary collaborations in and beyond science: <a href="http://www.kira.org/" target="_blank">Kira</a>, a 12-year-old organization focused on &#8216;science in context&#8217;. As of last week, there are two weekly workshops sponsored jointly by Kira and MICA that explore the use of OpenSim, ScienceSim, and other virtual worlds. One of them is <a href="http://www.kira.org/index.php?option=com_content&amp;task=view&amp;id=124&amp;Itemid=154" target="_blank">&#8220;Stellar Dynamics in a Virtual Universe Workshop&#8221;</a> and the other is <a href="http://www.kira.org/index.php?option=com_content&amp;task=view&amp;id=119&amp;Itemid=149" target="_blank">&#8220;ReLaM: Relocatable Laboratories in the Metaverse.&#8221;</a></p>
<p>MICA was founded two years ago by <span class="nfakPe">Piet</span> Hut within the virtual world of <a href="http://qwaq.com" target="_blank">Qwaq Forums</a> (see the paper <a href="http://arxiv.org/abs/0712.1655" target="_blank">&#8220;Virtual Laboratories and Virtual Worlds&#8221;</a>). The Kira Institute is much older: it was founded in 1997. Later this month, on February 24, Kira will celebrate its 12th anniversary with a presentation of talks, a panel discussion, and a series of workshops. See the <a href="http://www.kira.org/index.php?option=com_content&amp;task=view&amp;id=83&amp;Itemid=113" target="_blank">Kira Calendar</a> for the main event, and the Kira Japan branch for a <a href="http://www.kirajapan.org/event/" target="_blank">special mixed RL/SL</a> event in Tokyo. During both events, Junichi Ushiba will give a talk about his research in which <a href="http://nwn.blogs.com/nwn/2007/10/the-second-life.html" target="_blank">he let paralyzed patients steer avatars using only brain waves</a>.</p>
<p>Other early adopters of ScienceSim include Tom Murphy, who teaches computer science at Contra Costa College. Prior to teaching, Tom spent 35+ years working for supercomputer manufacturers. Tom said:</p>
<blockquote><p>It is very natural for me to find significantly new ways to visualize and interact with scientific mathematical models via ScienceSim and the OpenSim software behind it. ScienceSim also allows us to interact with each other and teach students in new ways.</p></blockquote>
<p>Also, Charlie Peck, chair of the SC09 Education Program (his day job is teaching computer science at Earlham College in Richmond, IN), is working with Wilf Pinfold, Tom Murphy, and others &#8220;to explore how 3D Internet/metaverse technology can be used to support science education and outreach.&#8221;</p>
<p><a href="http://www.ics.uci.edu/~lopes/" target="_blank">Cristina Videira Lopes</a>, of the University of California, Irvine, is doing very interesting work on road and pedestrian traffic simulations. Crista is also the creator of <a href="http://opensimulator.org/wiki/Hypergrid" target="_blank">hypergrid in OpenSim</a>.</p>
<h3>People Meet People Meet Data: A Conversation With Mic Bowman</h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/02/sciencesim_002_thumb1.png"><img class="alignnone size-full wp-image-2908" title="sciencesim_002_thumb1" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/02/sciencesim_002_thumb1.png" alt="sciencesim_002_thumb1" width="404" height="239" /></a><br />
<em>Screenshot of ScienceSim from <a href="http://daneel-ariantho.blogspot.com/2009/02/sciencesim.html" target="_blank">Daneel Ariantho</a></em></p>
<p><strong>Tish:</strong> How does this work on ScienceSim fit into a wider dialogue on linked data? Where people meet people meet data, and where data meets data?</p>
<p><em><strong>Mic:</strong> Yeah&#8230; that&#8217;s hard, by the way. Open integration of data (and, more interestingly, of the functions on data) is very hard if it comes from multiple, independent sources.</em></p>
<p><em>That&#8217;s the people part. For example, if Crista can build a model of the UCI campus, somebody else builds an accurate model of several cars, and another expert provides the simulation that computes the pollution generated by those cars in that environment&#8230; it&#8217;s bringing people together to solve real problems, no matter how far apart physically.</em></p>
<p><strong>Tish:</strong> You mention three different simulations here. Could you explain why it is difficult to integrate data from multiple sources?</p>
<p><em><strong>Mic:</strong> Integrating data from multiple sources has always been a matter of understanding &amp; interpreting both the syntax &amp; semantics of the data. Even relatively simple things like multiple date formats require explicit translation. More complex formats, like the many formats in which data is represented for urban planning, are barely computable independently, let alone in conjunction with data from other sources (each with its own representation for data). It&#8217;s often the expertise &amp; the collaboration of bringing people (and their bags of tools) together that solves these problems.</em></p>
<p><strong>Tish:</strong> And in this case the bag of tools is high performance modeling&#8230;?</p>
<p><em><strong>Mic:</strong> High performance modeling, rich visualizations, and data. It&#8217;s the three that matter&#8230; data, function, and interface.</em></p>
<p><strong>Tish:</strong> Some people have a very hard time wrapping their head around the fact that anything that seems related to Second Life can do this. Can you explain more about the difference between SL and OpenSim?</p>
<p><em><strong>Mic:</strong> OpenSim potentially improves data &amp; function because it can be extended through region modules. Region modules hook directly into the simulator to provide additional functionality. For example, a region module could be implemented to drive the behavior of objects in a virtual world based on a protein-folding model.</em></p>
<p><em>We need to work on additional viewer capabilities to address the user interface limitations.</em></p>
<p><strong>Tish:</strong> Yes, Rob Smart&#8217;s (IBM) recent data integrations with OpenSim (<a href="http://robsmart.co.uk/2009/01/22/visualizing-live-shipping-data-in-opensim-isle-of-wight-ferries/" target="_blank">see here</a>) are impressive. Re viewers, one of the biggest objections to virtual worlds is the mouse-pushing and PC-tied interface.</p>
<p><em><strong>Mic:</strong> There are great opportunities for improving the interface.</em></p>
<p><strong>Tish:</strong> Yes, I really like where Andy Piper&#8217;s experiments with haptic interfaces for OpenSim lead &#8211; <a href="http://andypiper.wordpress.com/2009/02/06/haptic-user-interfaces/" target="_blank">see Haptic Fantastic</a>! And I think that we will have cyberspace ubiquitous in our environment, not just stuck on a PC screen, sooner than we think.</p>
<p><em><strong>Mic:</strong> Mic&#8217;s opinion (not Intel&#8217;s): until we get souped-up sunglasses with HD screens embedded (or writing directly into the eye), there will always be a role for the PC/console/TV. But it isn&#8217;t about the device&#8230; it&#8217;s about the services projected through the device&#8230; sometimes you&#8217;ll want a very rich experience&#8230; sometimes you&#8217;ll want an experience NOW, wherever you are.</em></p>
<p><strong>Tish:</strong> I think people are only just realizing that VWs will be a now-and-wherever-you-are experience very soon.</p>
<p><em><strong>Mic:</strong> That&#8217;s the critical observation: the virtual world is not an application you run&#8230; it&#8217;s a &#8220;place&#8221;&#8230; and you interact with it where you are, or maybe interact through it. Speaking for Intel&#8230; it is the spectrum of experiences that is critical to support.</em></p>
<h3>Interview with Wilfred Pinfold</h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/02/gustav_h.jpg"><img class="alignnone size-full wp-image-2860" title="gustav_h" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2009/02/gustav_h.jpg" alt="gustav_h" width="416" height="200" /></a></p>
<p><em>Picture from National Science Foundation &#8211; <a href="http://www.nsf.gov/news/news_summ.jsp?cntn_id=112166" target="_blank">&#8220;Climate Computer Modeling Heats Up.&#8221;</a></em></p>
<p><strong>Tish Shute:</strong> I know your day job for Intel is in high performance computing. Could you explain to me a little bit more about what you are working on in this regard &#8211; a mini state of play for high performance computing from your perspective?</p>
<p><em><strong>Wilfred Pinfold:</strong> My title is Director, Extreme Scale Programs. This program drives a research agenda that will put in place the technologies required to make Exa (10^18) scale systems by 2015. The current generation of high performance computers is Peta (10^15) scale, so this is a 1000x increase in performance, and this increase will require significant improvements in power efficiency, reliability, and scalability, and new techniques for dealing with locality and parallelism.</em></p>
<p><strong>Tish:</strong> The nirvana in terms of linking supercomputers to the collaborative spaces of immersive virtual worlds is to be able to create visualizations using real time data from supercomputers in collaborative VW environments, and ultimately for researchers to be able to collaborate and steer their simulations from their visualizations. Where are we now in terms of scientific data visualization in VWs? And what are the current obstacles to using realtime data from supercomputers?</p>
<p><em><strong>Wilf: </strong>Being able to steer a simulation from a visualization requires both a visualization interface that allows interaction and a simulation that operates at a speed that is responsive in interactive timeframes. For example, a weather model that predicts the path of a hurricane would need to operate at something close to 1000x real time. This would run through a day in ~1.5 minutes, allowing an operator to run the simulation over several days, multiple times, with different parameters, in a single sitting, to understand the likelihood of certain outcomes.</em></p>
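<p>Wilf&#8217;s 1000x figure checks out: a simulated day is 86,400 seconds, so at 1000x real time it takes 86.4 seconds of wall-clock time &#8211; roughly 1.5 minutes. A few lines of Python make the arithmetic explicit:</p>

```python
# Quick check of the speed-up arithmetic in Wilf's hurricane example.
seconds_per_day = 24 * 60 * 60          # 86,400 s of simulated time
speedup = 1000                          # simulation runs at 1000x real time
wall_clock = seconds_per_day / speedup  # operator time per simulated day, in seconds
print(wall_clock / 60)                  # roughly 1.44 minutes, i.e. Wilf's ~1.5
```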
<p><strong>Tish:</strong> Do you see a networked online collaborative virtual world capable of serving as a visualization interface that allows meaningful interaction with the hurricane scenario you describe in the near future (next 6 to 18 months)?</p>
<p><em><strong>Wilf: </strong>I was using the hurricane example to explain the usage model, not an imminent capability. Hurricane simulation: accurate hurricane simulations require multiscale models able to resolve the global forces working on the storm as well as the microforces that define precipitation. We can build useful weather models today that run faster than real time (anything slower is not useful for prediction), but we are a long way from the ideal.<br />
Visualization: there are excellent visualizations of weather systems, but I have not yet seen a virtual world that can track a simulation and allow the scientist or team of scientists to see what is going on at the macro scale and also zoom in to see precipitation conditions. Today&#8217;s supercomputers are much better at this than they were a few years ago, but they are a long way from ideal.</em></p>
<p><strong>Tish:</strong> Open source virtual world technologies are pretty diverse in their approaches; Croquet, Sun&#8217;s Wonderland, and OpenSim are quite different and have different strengths and weaknesses. As you have become more familiar with OpenSim, what have you found about the technology that particularly lends itself to this project &#8211; ScienceSim? (Mic mentioned Crista&#8217;s hypergrid code, for example; modularity is another feature often cited.)</p>
<p><em><strong>Wilf: </strong>We have found OpenSim&#8217;s client-server model is well suited to the visualization model, and the ability to put the server next to the supercomputer producing the visualization data is critical. We are, however, very interested in other environments and encourage papers, demonstrations, and research on any of these platforms at the conference.</em></p>
<h3>Interview with John A. Hengeveld</h3>
<p><strong>Tish Shute:</strong> OpenSim&#8217;s dependence on Second Life based viewers is sometimes cited as a limitation, and sometimes as a strength. What are your views on this? What would a strong open viewer project directed at science applications bring to the picture?</p>
<p><em><strong>John Hengeveld:</strong> There may be more than one strong open viewer project required for OpenSim-compatible experiences. The strength of the Hippo viewer, for example, is availability, and its weakness is the size of the client. We would love a ubiquitous client that runs on all platforms, but each hardware platform brings tradeoffs and restrictions of its own. Today, probably all of the folks innovating in the space can deal with the size of a very fat rich client app&#8230; they have big computers anyway. But as we get into more 3D entertainment and augmented reality applications &#8211; virtual malls, collaboration apps, etc. &#8211; there is a great deal of room to optimize for the specific experience. Balancing visual experience with the bandwidth and compute performance available, tying into standard browsers, etc.&#8230; people have done some of this work, and I think all of it adds to the usefulness of these worlds.</em></p>
<p><strong>Tish:</strong> Integrating high-end game engines and OpenSim opens up new possibilities. But licensing issues have been an obstacle. Could a project like ScienceSim get a non-commercial license on a high-end game engine? What would that bring to the picture?</p>
<p><em><strong>John: </strong>Anything is possible. Game engines can give a great deal of design power for high value experiences, but the programming of these experiences must be simplified. Mainstream adoption in the enterprise can&#8217;t be premised on the programming model of studio games&#8230; that&#8217;s a big step to get over, I think. There are very interesting possibilities when we take that step, though: simulation, training, agents of various types (I just finished watching &#8220;The Matrix&#8221; for like the billionth time&#8230; I think agents are cool&#8230;).</em></p>
<p><strong>Tish:</strong> Where does Larrabee fit into the picture of ScienceSim and next generation virtual worlds?</p>
<p><em><strong>John:</strong> We are all very excited about the Larrabee architecture and its application to workloads like next generation virtual worlds, both in the client &#8211; delivering immersive reality &#8211; and someday, potentially, in a distributed architecture simulating and producing these worlds. For Intel, CVC is an all play. Atom will be used in strong mobile clients. Core will be used in enterprise PCs, laptops, and desktops. Xeon will be simulating these environments and handling the data communication. And whatever we brand Larrabee&#8230; will be enabling compelling visual experiences. Oh, and our software products (Havok, tools, and others) will be building blocks in knitting all this together. Larrabee is a part, but there are a lot of other pieces in our vision&#8230;</em></p>
<p><strong>Tish:</strong> If the kind of rapid data movement that scientific visualization needs is achieved in virtual worlds, this will be quite a game changer for business applications of VWs too. Also, it will blur the boundaries between what we call virtual worlds and mirror worlds. It seems to me this kind of rapid data movement is a vital step towards what Mic described to me as Intel&#8217;s vision of CVC: &#8220;Connected Visual Computing is the union of three application domains: MMOG, metaverse, and paraverse (or augmented reality).&#8221; It almost seems to me that if you achieve your goals for ScienceSim you will change how we think about virtual worlds in general? What do you think?</p>
<p><em><strong>John:</strong> I certainly hope so. Part of our goal is to stimulate innovation in the technology and usage models that will enable broad mainstream adoption of CVC based applications (what we categorize as immersive connected experiences). By tackling the scientific visualization problem, we hope to find the key technology barriers and encourage the ecosystem to solve them.</em></p>
<p><strong>Tish: </strong>To me, virtual worlds and augmented reality should be complementary and connected experiences. How do you see this connection evolving?</p>
<p><em><strong>John:</strong> We certainly see them as related. In the long term, there are many common building blocks&#8230; but they aren&#8217;t united per se. It&#8217;s about the user experience, and in some usages these two are almost identical&#8230; in some, they don&#8217;t look or feel at all alike&#8230; the viewer is distinct by a lot. Our approach is to enable building blocks from which people can quickly build out usages that are robust.</em></p>
<p><strong>Tish: </strong>What is Intel&#8217;s vision for ubiquitous mobile computing and an internet of objects? How can high performance computing be an enabler for this vision?</p>
<p><em><strong>John: </strong>Mobile computing is a central part of our life, culture, and community in economically enabled economies. It feeds the data of our decisions, it connects us to entertainment, it is the access point to our soapboxes, pulpits, economy, and families. This creates a massive increase in data, and a massive increase in interactions, transactions, and visualizations. While many HPC applications will be behind the scenes (finance, health, energy, visual analytics, and others), HPC will emerge as part of a scale solution to serving some of this increase&#8230; particularly that part where interactions and visualizations are complex or compelling, or where scale enables the usage per se. I talked about my love of agents earlier, and some of that comes in here: compute working behind the scenes to help manage the data complexity, and manage some of the base interactions between ourselves and technology. The other thing we talk about internally is the &#8220;Hannah Montana usage,&#8221; where millions of people use their mobile devices to access and participate (using the sensors in the device) in an interactive live concert. When Miley hears the applause of a virtual interactive audience&#8230; and can scream back at them&#8230; we&#8217;re there. Access to ubiquitous compute will be mobile, and interactive experiences will be complex&#8230; and HPC can help make that real. Watch out for the mental trap that HPC is always high end supercompute clusters, though&#8230; the &#8220;mainstream HPC&#8221; &#8211; smaller clusters, high thread counts, etc. &#8211; will play a key part in all of this as well.</em></p>
<p>Interesting that John ended on this point, as this just came in from <a href="http://blog.wired.com/gadgets/2009/02/intel-fights-re.html" target="_blank">Wired</a>.</p>
]]></content:encoded>
			<wfw:commentRss>http://www.ugotrade.com/2009/02/11/people-meet-people-meet-big-data-sciencesim-explores-collaborative-high-performance-computing/feed/</wfw:commentRss>
		<slash:comments>4</slash:comments>
		</item>
		<item>
		<title>Doing Something Useful With Virtual Worlds</title>
		<link>http://www.ugotrade.com/2008/10/28/doing-something-useful-with-virtual-worlds/</link>
		<comments>http://www.ugotrade.com/2008/10/28/doing-something-useful-with-virtual-worlds/#comments</comments>
		<pubDate>Tue, 28 Oct 2008 08:52:37 +0000</pubDate>
		<dc:creator><![CDATA[Tish Shute]]></dc:creator>
				<category><![CDATA[3D internet]]></category>
		<category><![CDATA[Augmented Reality]]></category>
		<category><![CDATA[avatar 2.0]]></category>
		<category><![CDATA[free software]]></category>
		<category><![CDATA[Intel in Virtual Worlds]]></category>
		<category><![CDATA[interoperability of virtual worlds]]></category>
		<category><![CDATA[Linden Lab]]></category>
		<category><![CDATA[Metarati]]></category>
		<category><![CDATA[Metaverse]]></category>
		<category><![CDATA[Mixed Reality]]></category>
		<category><![CDATA[Open Grid]]></category>
		<category><![CDATA[open metaverse]]></category>
		<category><![CDATA[open protocols for virtual worlds]]></category>
		<category><![CDATA[open source]]></category>
		<category><![CDATA[open standards for virtual worlds]]></category>
		<category><![CDATA[OpenSim]]></category>
		<category><![CDATA[Second Life]]></category>
		<category><![CDATA[social gaming]]></category>
		<category><![CDATA[social media]]></category>
		<category><![CDATA[vapor standards]]></category>
		<category><![CDATA[virtual world standards]]></category>
		<category><![CDATA[Virtual Worlds]]></category>
		<category><![CDATA[Web 2.0]]></category>
		<category><![CDATA[Web 3D]]></category>
		<category><![CDATA[Web3.D]]></category>
		<category><![CDATA[World 2.0]]></category>
		<category><![CDATA[collaboration in virtual worlds]]></category>
		<category><![CDATA[connecting-the-physical-world-to-the-digital-world]]></category>
		<category><![CDATA[doing-something-useful-with-the-internet]]></category>
		<category><![CDATA[enterprise virtual worlds]]></category>
		<category><![CDATA[enterprise-applications-for-virtual-worlds]]></category>
		<category><![CDATA[extended-internet]]></category>
		<category><![CDATA[green-it]]></category>
		<category><![CDATA[integrating-virtual-worlds-into-web-2.0]]></category>
		<category><![CDATA[lternative-reality-games]]></category>
		<category><![CDATA[soa]]></category>
		<category><![CDATA[social-computing]]></category>
		<category><![CDATA[Tim O'Reilly]]></category>
		<category><![CDATA[virtual-conferences]]></category>
		<category><![CDATA[virtual-worlds-for-green-conferencing]]></category>

		<guid isPermaLink="false">http://www.ugotrade.com/?p=1962</guid>
		<description><![CDATA[I have just got back from attending two conferences in the UK, the Head Conference, and Virtual Worlds London. I was on a mission at both the events to ask questions about how Virtual World technology will answer the call Tim O&#8217;Reilly made at the Web 2.0 Expo in New York City to &#8220;create more [&#8230;]]]></description>
				<content:encoded><![CDATA[<p><a href="http://www.ugotrade.com/images/VirtualWorldRoadMapupload.jpg" target="_blank"><img class="alignnone size-full wp-image-1964" title="virtualworldroadmapuploadpost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2008/10/virtualworldroadmapuploadpost.jpg" alt="" width="311" height="207" /></a><a href="http://www.ugotrade.com/images/BruceDamerupload.jpg" target="_blank"><img class="alignnone size-full wp-image-1963" title="brucedameruploadpost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2008/10/brucedameruploadpost.jpg" alt="" width="138" height="207" /></a></p>
<p>I have just got back from attending two conferences in the UK, the <a href="http://www.headconference.com/" target="_blank">Head Conference</a> and <a href="http://www.virtualworldslondon.com/" target="_blank">Virtual Worlds London</a>. I was on a mission at both events to ask questions about how Virtual World technology will answer the call Tim O&#8217;Reilly made at the Web 2.0 Expo in New York City to &#8220;create more value than you extract&#8221; and do something worthy and useful with the internet.</p>
<p>The <a href="http://www.headconference.com/">Head Conference</a> was an ambitious, timely, and much needed creative exploration of the potential for &#8220;green&#8221; conferencing using Adobe Connect Pro, Second Life, and <a href="http://www.headconference.com/hubs/">local conference hubs</a> in various cities. For more on the conference organization see <a href="http://www.digital-web.com/articles/head_conference_aral_balkan/" target="_blank">this pre-conference interview</a> with Aral Balkan.</p>
<p>Head will be the focus of my next post, so more on Head soon! One of my main goals in attending the <a href="http://www.headconference.com/hubs/london-uk/" target="_blank">London Hub</a> of Head was to interview the CEO and founder of <a href="http://www.amee.cc/" target="_blank">AMEE</a>, the &#8220;Avoiding Mass Extinctions Engine,&#8221; <a href="http://www.headconference.com/speakers/gavin-starks/" target="_blank">Gavin Starks</a>. AMEE aims to be &#8220;the energy meter of the world.&#8221;</p>
<blockquote><p>AMEE is a neutral aggregation platform designed to measure and track all the energy data on Earth.</p></blockquote>
<p>AMEE is a project with the kind of big goals that O&#8217;Reilly talked about in his keynote at Web 2.0 Expo, NYC. Tim O&#8217;Reilly is an investor in AMEE, and he announced at Head that the O&#8217;Reilly VC company has just closed a deal with AMEE.</p>
<p>I had an extraordinary opportunity to spend some time talking with Tim O&#8217;Reilly while looking for a sandwich in Euston Square. More on this sandwich adventure, my interview with Tim O&#8217;Reilly, and my long talk with Gavin Starks about AMEE, in my next post!</p>
<p>Tim kept saying in London that he doesn&#8217;t like predicting the future. But the future comes to Tim O&#8217;Reilly!</p>
<p>And, after talking with Tim and Gavin, I felt I had a very exciting glimpse of what is emerging from the tech industry&#8217;s burning issues. George F. Colony of Forrester summarized these issues nicely in his post, <a href="http://blogs.forrester.com/colony/2008/10/my-take-on-the.html" target="_blank">&#8220;Why This Tech Recession Will Be Different.&#8221;</a> Colony noted, &#8220;Virtualization, social computing, mobile computing, Green IT, SOA, extended Internet (connecting the physical world to the digital world) are front and center on the agendas of large companies.&#8221;</p>
<p>And, yes, this is supposed to be a little bit of a teaser for my next post on AMEE!</p>
<h3>Virtual Worlds Road Map</h3>
<p>The final keynote at Virtual Worlds London was what Ian Hughes, in <a href="http://eightbar.co.uk/2008/10/23/virtual-worlds-london-metarati-and-moving-coffee-day-1-part-1/" target="_blank">his post on the conference for Eightbar</a>, aptly described as a call to arms for the <a href="http://www.virtualworldsroadmap.org/" target="_blank">Virtual Worlds Roadmap</a>. As Ian pointed out: &#8220;This needs a post in its own right as we all need to get on board with this across the industry and help.&#8221; Ian Hughes&#8217; (IBM) own presentation on &#8220;Business Process Management&#8221; was one of the best I attended at the conference. Yes, amazingly, he made this topic very interesting and fresh!</p>
<p>The pictures opening this post show the Virtual Worlds Road Map presenters: Victoria Coleman (Samsung), seated at center; Sibley Verbeck (<a href="http://www.electricsheepcompany.com/">Electric Sheep Company</a>), in trademark hat; <a href="http://www.virtualworldslondon.com/speakers/jeffreypope.html">Jeffrey Pope</a> (3Di), far left; and <a href="http://www.damer.com/">Bruce Damer</a>, close up in the picture on the right.</p>
<p>I am delighted to join Bruce Damer, later today, for a <a href="http://www.fastcompany.com/node/1052129" target="_blank">FastCompany.com Technology Group Call-in</a>: <strong>&#8220;Next Generation Interaction: Are Virtual Worlds Waiting in the Wings?&#8221; </strong>with <a title="Donald Schwartz" href="http://www.fastcompany.com/user/donald-schwartz" target="_blank">Donald Schwartz</a> (October 28th at 4:00 PM EST).</p>
<p>I will also be in Second Life <a href="http://slurl.com/secondlife/Wolpertinger/173/87/51" target="_blank">at Train 4 Success (SLURL)</a> on Thursday, October 30 (starting at 9AM PST) with <a href="http://peterquirk.wordpress.com/" target="_blank">Peter Quirk, EMC</a>, and Jani Pirkola, <a href="http://www.realxtend.org/" target="_blank">realXtend</a> talking about <a href="http://www.opensimulator.org" target="_blank">OpenSim</a> and <a href="http://www.realxtend.org/" target="_blank">realXtend</a> for an event organized by Eilif Trondsen of the <a href="http://www.sri.com/" target="_blank">Stanford Research Institute</a> and the Gronstedt Group.</p>
<p>John Hengeveld (Intel) was off screen for this group picture (above). But Intel is doing some very interesting work in Virtual Worlds &#8211; <a href="http://www.ugotrade.com/2008/09/15/interview-with-mic-bowman-intel-the-future-of-virtual-worlds/" target="_blank">see my earlier post here</a>. And John is &#8220;helping <a href="http://www.digitalspace.com/projects/b612movies.html">NASA work out how to deflect extinction level event asteroids from Earth!</a>&#8221;</p>
<p>As Ian noted, the main aim of Virtual Worlds Road Map, &#8220;is to gather together and cut through use cases to understand and help people come to terms with which applications need to be built for which case.&#8221;</p>
<p>For more great coverage of Virtual Worlds London check out <a href="http://eightbar.co.uk/2008/10/23/virtual-worlds-london-metarati-and-moving-coffee-day-1-part-1/" target="_blank">Ian&#8217;s post</a> on Eightbar. And check out Roo Reynolds&#8217; live blogging <a href="http://rooreynolds.com/2008/10/21/virtual-worlds-london-liveblogging-day-2/" target="_blank">here</a> and <a href="http://rooreynolds.com/2008/10/20/virtual-worlds-london-liveblogging/" target="_blank">here</a>. Also see Roo&#8217;s post on his panel on <a href="http://rooreynolds.com/2008/10/24/arg-panel-at-virtual-worlds-london-2/" target="_blank">&#8220;ARGs [Alternative Reality Games] and Virtual Worlds,&#8221;</a> which includes slides and audio. The picture below is Roo in action live blogging. Roo is Portfolio Executive for Social Media at BBC Vision.</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2008/10/rooreynoldslivebloggin.jpg"><img class="alignnone size-full wp-image-1987" title="rooreynoldslivebloggin" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2008/10/rooreynoldslivebloggin.jpg" alt="" width="450" height="299" /></a></p>
<h3>Tribal Media: A Teacher Training Intranet For The Swedish Government on OpenSim</h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2008/10/darrenpost.jpg"><img class="alignnone size-full wp-image-1980" title="darrenpost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2008/10/darrenpost.jpg" alt="" width="350" height="368" /></a></p>
<p>One of the more interesting developments I saw at Virtual Worlds London was a highly customized training intranet for 50,000 teachers being developed for the Swedish Government by <a href="http://tribalnet.se/About/TribalMedia/tabid/78/Default.aspx" target="_blank">Tribal Media</a>. The flexibility of <a href="http://opensimulator.org/wiki/Main_Page" target="_blank">OpenSim</a> to provide cost effective custom intranet solutions was nicely demoed by Darren Guard, Tribal Media R&amp;D (pictured above). Darren is one of the more reclusive founders and phenom developers of OpenSim.</p>
<h3>Virtual Worlds and Web 2.0</h3>
<p>In my earlier interviews with Rob Smart <a href="http://www.ugotrade.com/2008/09/29/rob-smart-ibm-web-20-to-opensim-made-easy/" target="_blank">here</a>, and Teravus Ousley <a href="http://www.ugotrade.com/2008/10/06/putting-opensim-into-the-heart-of-web-20/" target="_blank">here</a>, we discussed the work to integrate OpenSim with Web 2.0.</p>
<p>To meet the O&#8217;Reilly challenge &#8211; to do something useful with the internet and help solve some of the world&#8217;s big problems, in my view, Virtual World technologies must engage more fully with the power of the internet-as-a-platform &#8211; <span id="intelliTxt">&#8220;a system without an owner, tied together by a set of protocols, open standards and agreements for cooperation.&#8221; (see O&#8217;Reilly, </span> <a href="http://www.oreillynet.com/pub/a/oreilly/tim/news/2005/09/30/what-is-web-20.html" target="_blank">&#8220;What Is Web 2.0?&#8221;</a> ).</p>
<p>Unfortunately, the worst presentation at Virtual Worlds London was purportedly on standards for virtual worlds. I do not want to waste energy rehashing the misinformed and misguided presentation on MPEG-V&#8217;s archaic blunderbuss approach to standards in this post. I completely concur with Jim Purbrick of Linden Lab&#8217;s characterization of this talk as <a href="http://jimpurbrick.com/2008/10/23/second-life/" target="_blank">&#8220;the worst talk I&#8217;ve heard in a long time</a>.&#8221; (Also, see Jim&#8217;s post for an <a href="http://jimpurbrick.com/2008/10/23/second-life/" target="_blank">astute commentary</a> on other aspects of Virtual Worlds London.) Luckily, there is much productive work under way in other quarters aimed at standards for Virtual Worlds, and <span id="intelliTxt">some of these efforts I have blogged here on UgoTrade.</span></p>
<p><span id="intelliTxt">Because there is sometimes confusion in Virtual World discussions about how business models work on a &#8220;system without an owner,&#8221; here is the concluding quote from &#8220;What Is Web 2.0?&#8221;</span></p>
<blockquote><p><span id="intelliTxt">This is not to say that there are not opportunities for lock-in and competitive advantage, but we believe they are not to be found via control over software APIs and protocols. There is a new game afoot. The companies that succeed in the Web 2.0 era will be those that understand the rules of that game, rather than trying to go back to the rules of the PC software era.</span></p></blockquote>
<h3><strong>What is the Killer App for Virtual Worlds?</strong></h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2008/10/robsmartpost.jpg"><img class="alignnone size-full wp-image-1971" title="robsmartpost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2008/10/robsmartpost.jpg" alt="" width="450" height="299" /></a></p>
<p><strong>&#8220;The killer is that any app you do create is automatically presence enabled.<br />
The people with you can view the changing states of that application or context as and when you do.&#8221; &#8211; Rob Smart, IBM</strong></p>
<p>The picture above shows the presenters for the <strong>&#8220;Platform Integration Considerations for Enterprise Virtual Worlds&#8221;</strong> panel. From left to right: <a href="http://www.virtualworldslondon.com/speakers/jeanmiller.html">Jean Miller, German Market Development Manager, Linden Lab</a>; <a href="http://www.virtualworldslondon.com/speakers/mattfurman.html">Matt Furman, Software Engineer, Northrop Grumman</a>; and <a href="http://www.virtualworldslondon.com/speakers/robsmart.html">Rob Smart, Emerging Technology Specialist, IBM Hursley</a>.</p>
<h3>Interview with Rob Smart, IBM: Part 2.</h3>
<p><span style="font-size: small;"><strong>Tish Shute:</strong> Up to now, Virtual Worlds have been relatively isolated from Web 2.0, living somewhere between the gaming world and the Web 2.0 world. How are the curtains lifting, and how are virtual worlds becoming the link between social media and online gaming?</span></p>
<p><span style="font-size: small;"><strong>Rob Smart:</strong> Virtual Worlds that allow user-created content, and the association of behaviour with that content via scripting, put themselves forward as the ideal platform for combining realtime social interaction with existing Web 2.0 tools. The data and function currently out there on web sites can serve to augment the realtime social interactions &#8211; for example, enhancing/enabling cross-cultural communication with chat translation (e.g. my translation HUD from way back in 2006).</span></p>
<p><span style="font-size: small;">Another example is augmenting personal spaces with Flickr images, video, etc. In many Flash room-based Virtual Worlds this level of integration exists. However, without the ability of users to create their own gadgets and gizmos, the pressure is on the development team to innovate and give users what they want &#8211; tough to do in the long term. A blended approach is to open APIs and content creation to registered developers.</span></p>
<div class="Ih2E3d">
<p><strong>Tish Shute:</strong> Many developers have not been interested in taking part in virtual world development yet, as they haven&#8217;t seen a killer app. How are open source, open protocols, and the use of web standards where possible enabling an environment of innovation from which killer apps may emerge?</p></div>
<p><span style="font-size: small;"><strong>Rob Smart:</strong> When you&#8217;re integrating any system with another, it becomes so much simpler if the creators have provided services and APIs for external systems to interact with. It becomes even easier if those system entry points accept and return inputs and outputs in a common way, e.g. XML/JSON. The same goes for both data and media.</span></p>
<p><span style="font-size: small;">By using common existing standards we shorten development time, because if a standard is widely adopted there will be a multitude of programming language libraries for it. That means the developer can get straight onto the important task of creating the logic for their application/gadget, rather than messing around trying to understand some weird data-encoding method someone has invented.</span></p>
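Rob's point is easy to see in miniature: with a widely adopted encoding like JSON, consuming another system's output reduces to one standard-library call. The payload shape below (the "region", "avatars" and "pos" fields) is purely illustrative and not from any real OpenSim or Second Life API.

```python
import json

# Hypothetical JSON payload a virtual-world presence service might return.
# The field names are invented for illustration only.
payload = """
{
  "region": "Wright Plaza",
  "avatars": [
    {"name": "Teravus", "pos": [128, 128, 25]},
    {"name": "Adam",    "pos": [64, 200, 22]}
  ]
}
"""

# Because the encoding is a common standard, decoding is one library call --
# no custom parser for a "weird data-encoding method" is needed.
data = json.loads(payload)

for avatar in data["avatars"]:
    x, y, z = avatar["pos"]
    print(f'{avatar["name"]} is in {data["region"]} at ({x}, {y}, {z})')
```

The same gadget logic would work unchanged against any service that emits the same standard encoding, which is exactly the development-time saving Rob describes.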
<p><span style="font-size: small;">Having an Open Source platform spreads the workload around. As long as the licence under which the open source software is released isn&#8217;t too prohibitive, developers from all walks of life will contribute. Spreading that workload also leads to an increase in innovative features, as people always bring their experience and interests to bear; the features they create can be shared back, and others build on top of them.</span></p>
<p><span style="font-size: small;">If a company chooses to implement a feature they specialize in, or integrate with their existing products, they can sell this as an add-on. This creates a market where the base product can improve through contributions from companies making a living off the open source product; it also introduces some competition and a financial incentive tied to the platform&#8217;s well-being.</span></p>
<p><span style="font-size: small;">People keep talking about killer apps within Virtual Worlds. The killer is that any app you do create is automatically presence enabled: the people with you can view the changing states of that application or context as and when you do.</span></p>
<div class="Ih2E3d"><span style="font-size: small;"><strong>Tish Shute:</strong> How Virtual Worlds have outgrown this name! The term Virtual Worlds has connotations of separateness from the &#8220;real&#8221; world. What might be a better term? (I have seen a number of other terms cropping up: Virtual Universe is what IBMers wore on their t-shirts here in London, Immersive Work Spaces has been trademarked by RRR, and many people prefer the terms virtual environments or virtual spaces.)</span></div>
<p><span style="font-size: small;"><strong>Rob Smart:</strong> I still think Virtual Worlds is a good term, though it is very fuzzy. If we&#8217;re talking about VWs that can be extended and integrated with Web 2.0, then maybe we need to talk about Immersive Application Platforms. Not very catchy, but probably something more people in the enterprise world would say out loud in front of their boss <img src="http://www.ugotrade.com/wordpress/wp-includes/images/smilies/icon_wink.gif" alt=";)" class="wp-smiley" /> Another term that could be used is 3D Internet; it conjures more of a picture of integration between the different parts of what is a vast networked system.</span></p>
<p><span style="font-size: small;"><strong>Tish Shute:</strong> The original metaverse roadmap had four distinct segments: Augmented Reality and Life Logging at the pole of augmentation, and Mirror Worlds and Virtual Worlds at opposite corners of the pole of simulation. How are these areas coming together?</span></p>
<p><span style="font-size: small;"><strong>Rob Smart:</strong> There&#8217;s no reason these need to be separated; it&#8217;s all down to the use of the VW platform &#8211; these four segments are just applications of a virtual world platform. A platform like OpenSim can merge several of them together if necessary. For example, the publish/subscribe messaging module I created, written about on Eightbar, lets me do things like bring in realtime flight data and show planes&#8217; positions across a region. I could at the same time call an API that gives me more details on that flight. I could even search for blogs that mention that flight number and bring them into the same space. I could add additional script functions to the plane objects so that when a visitor clicks on a plane it thereafter sends them messages about its position.</span></p>
<p><span style="font-size: small;"><strong>Tish Shute:</strong> Virtual worlds are being broken down into open source building blocks and modules that can be mixed, matched, and mashed up with Web 2.0 to create a new ecosystem that enriches both what has been known as virtual worlds and traditional web environments. What kind of innovation do you see coming out of these new opportunities to mash up virtual worlds with Web 2.0?</span></p>
<p><span style="font-size: small;"><strong><strong>Rob Smart: </strong></strong>I&#8217;m hoping to see as a number one priority an increase of accessibility, despite a number of people saying that browser based virtual worlds aren&#8217;t worth the effort they certainly are. The ability to just send a friend a URL or Instant Message etc.. and pull them in with you is an important step to adoption. As are simplified interfaces that don&#8217;t scare off those unfamiliar with gaming. An example of this is the Lotus Sametime 3D work with OpenSim that lets you invite a friend or colleague in via an instant message.</span></p>
<h3>Virtual Worlds For Enterprise: A Coming of Age Party?</h3>
<p>As Ian mentioned, I did think that the London conference was a coming of age party for enterprise virtual worlds. The picture below shows just some of the Lindens who were there, many to promote the Linden Lab collaboration with Rivers Run Red on <a href="http://immersivespaces.com/" target="_blank">&#8220;Immersive Work Spaces,&#8221;</a> which was <a href="http://blogs.wsj.com/biztech/2008/10/20/linking-the-real-web-with-virtual-worlds/" target="_blank">written up in the Wall Street Journal</a>. Also see this post yesterday on Silicon.com, <a href="http://www.silicon.com/silicon/networks/webwatch/0,39024667,39285821,00.htm" target="_blank">&#8220;Virtual Worlds Set For Second Coming.&#8221;</a></p>
<p>Someone please help me with all the names of the Lindens in the picture below! <a href="http://www.virtualworldslondon.com/speakers/mattfurman.html">Matt Furman</a> from Northrop Grumman is center and Joey Seiler from <a href="http://www.virtualworldsnews.com/" target="_blank">Virtual Worlds News</a> is on the right.</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2008/10/lindens.jpg"><img class="alignnone size-full wp-image-1988" title="lindens" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2008/10/lindens.jpg" alt="" width="450" height="299" /></a></p>
<p>Justin Bovington said to me that this conference was, in his view, &#8220;the enterprise virtual worlds coming out party &#8211; an acceptance that this is a tangible solution &#8211; about selling relevant tools and relevant ROI &#8211; rather than talk about virtual worlds it is about relevant tool sets.&#8221;</p>
<p>And, while the conference was small, I think the engagement level of the enterprise attendees did back up this assertion of Justin&#8217;s. <a href="http://www.virtualworldslondon.com/speakers/mattfurman.html">Matt Furman, Software Engineer, Northrop Grumman</a> was asked by more than one attendee how he was dealing with scaling up the behind-the-firewall virtual world he is developing for Northrop Grumman with Linden Lab, to meet a big internal demand to start using virtual worlds for collaboration. Apparently some attendees were seeing so much interest in virtual world solutions for internal collaboration in their own companies that they were concerned about meeting the needs of thousands of employees in short order.</p>
<h3>Immersive Work Spaces</h3>
<p>I asked Justin a few questions about Immersive Work Spaces while waiting for an elevator!</p>
<p><strong>Tish Shute:</strong> And what are the relevant tool sets from your point of view?</p>
<p><strong>Justin Bovington:</strong> Collaboration, sharing, integration of existing backend systems and applications. For example, we have developed seamless ways to share PowerPoint or share screens. And also, going back to the ROI models: tangible ROI based on a subscription-based system where basically in four or five usages it has paid for itself. We have never had that with Virtual Worlds. It has always been in the bounds of experimentation, or the bounds of &#8220;isn&#8217;t it cool technology.&#8221; Now we are seeing this become a serious collaboration tool.</p>
<p>And as I have said before, arguably the twentieth century ended two weeks ago and the twenty-first century is now with us. And that is about companies re-engineering their thinking &#8211; particularly in the financial sector, they have to restart again. And that is going to be about using additional tools and additional guidelines to do that. This is the changeover, and I have said this in the panel as well: this show in particular is the enterprise virtual worlds coming out party.</p>
<p>And again we see a massive change between the last three shows &#8211; there is a level of interest we have never seen before and also an acceptance that this is a tangible solution not just something that is cool&#8230;</p>
<p>We have hundreds of users in our product, and it will go to thousands and tens of thousands in the next year.</p>
<p>And we know where it is going &#8211; data visualization is going to be the next big thing, getting this 10,000 ft view of your company. We are using this term called &#8220;snow globing,&#8221; which lets you pick up a snow globe, shake it, and see exactly what a company is about &#8211; and this is exactly what virtual worlds are about.</p>
<p>It&#8217;s about having a ten thousand foot view of your company, because that&#8217;s when it becomes powerful: it becomes a broadcast medium. And I think it will change people&#8217;s perception of data. It is also moving beyond just having the avatar as the main presence. The environment itself becomes an essence, a kind of dynamic level inside there. We are working on stuff at the moment that allows you to have direct influence on the data or the environment you are in, which on a massive collaboration scale could give you a huge amount of input and ideas around the company. And there is a genuine need for this kind of collective intelligence.</p>
<h3>Sine Wave Dinner!</h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2008/10/sinewavedinnerpost.jpg"><img class="alignnone size-full wp-image-1990" title="sinewavedinnerpost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2008/10/sinewavedinnerpost.jpg" alt="" width="450" height="299" /></a></p>
<p>The grand finale for me was the excellent Indian meal very generously hosted by Rohan Freeman of <a href="http://www.sinewavecompany.com/" target="_blank">Sine Wave Company</a>. Standing on the left is Chris Collins, Linden Lab; seated left front is Steve Spangaro, bigpipemedia; and on the right is Ren Reynolds of the Virtual Policy Network. Many other metarati were there, including Bruce Joy, Vast Park; Corey Bridges, Multiverse; Dave Taylor, Imperial College; Gia Rossini, Sloodle; Peter Haik, Metaversality; Adam Frisby, OpenSim; Mal Burns; and many more &#8211; please help me out with the name tagging!</p>
]]></content:encoded>
			<wfw:commentRss>http://www.ugotrade.com/2008/10/28/doing-something-useful-with-virtual-worlds/feed/</wfw:commentRss>
		<slash:comments>4</slash:comments>
		</item>
		<item>
		<title>Putting OpenSim Into The Heart of Web 2.0</title>
		<link>http://www.ugotrade.com/2008/10/06/putting-opensim-into-the-heart-of-web-20/</link>
		<comments>http://www.ugotrade.com/2008/10/06/putting-opensim-into-the-heart-of-web-20/#comments</comments>
		<pubDate>Mon, 06 Oct 2008 18:36:56 +0000</pubDate>
		<dc:creator><![CDATA[Tish Shute]]></dc:creator>
				<category><![CDATA[3D internet]]></category>
		<category><![CDATA[Architecture Working Group]]></category>
		<category><![CDATA[BSD versus GPL]]></category>
		<category><![CDATA[free software]]></category>
		<category><![CDATA[GPL]]></category>
		<category><![CDATA[Intel in Virtual Worlds]]></category>
		<category><![CDATA[interoperability of virtual worlds]]></category>
		<category><![CDATA[Open Grid]]></category>
		<category><![CDATA[open metaverse]]></category>
		<category><![CDATA[open protocols for virtual worlds]]></category>
		<category><![CDATA[open source]]></category>
		<category><![CDATA[open standards for virtual worlds]]></category>
		<category><![CDATA[OpenSim]]></category>
		<category><![CDATA[realXtend]]></category>
		<category><![CDATA[Second Life]]></category>
		<category><![CDATA[virtual world standards]]></category>
		<category><![CDATA[Virtual Worlds]]></category>
		<category><![CDATA[virtual worlds in china]]></category>
		<category><![CDATA[virtual worlds in Japan]]></category>
		<category><![CDATA[Web 2.0]]></category>
		<category><![CDATA[Web 3D]]></category>
		<category><![CDATA[Web3.D]]></category>
		<category><![CDATA[World 2.0]]></category>
		<category><![CDATA[3Di OpenSim Standards]]></category>
		<category><![CDATA[Asian virtual Worlds]]></category>
		<category><![CDATA[ChinaQ]]></category>
		<category><![CDATA[communication protocols for virtual worlds]]></category>
		<category><![CDATA[immersive virtual worlds and Web 2.0]]></category>
		<category><![CDATA[Immersive Worlds and Web 2.0]]></category>
		<category><![CDATA[Integration of OpenSim into Web 2.0]]></category>
		<category><![CDATA[Integration of Virtual Worlds in Web 2.0]]></category>
		<category><![CDATA[licensing of open virual worlds]]></category>
		<category><![CDATA[MPEG-V]]></category>
		<category><![CDATA[Open Grid Protocol]]></category>
		<category><![CDATA[OpenSim in the Architecture of Web 2.0]]></category>
		<category><![CDATA[OpenSim Standards]]></category>
		<category><![CDATA[small architecture versus big architecture virtual worlds]]></category>
		<category><![CDATA[standardization of virtual worlds]]></category>
		<category><![CDATA[virtual world protocols]]></category>
		<category><![CDATA[virtual worlds and consumer adoption]]></category>
		<category><![CDATA[Web 2.0 Architecture]]></category>

		<guid isPermaLink="false">http://www.ugotrade.com/?p=1853</guid>
		<description><![CDATA[This post, and my previous post about integration of OpenSim into Web 2.0, explore how immersive virtual worlds, through a full architectural integration into Web 2.0, will become part of the fabric of everyday computing. The diagram above shows where OpenSim sits in Web 2.0 (click on the diagram to see a readable enlarged version!). [&#8230;]]]></description>
				<content:encoded><![CDATA[<p><a href="http://www.ugotrade.com/images/Teravus2copy.jpg" target="_blank"><img class="alignnone size-full wp-image-1857" title="teravus2copypostnew1" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2008/10/teravus2copypostnew1.jpg" alt="" width="450" height="255" /></a></p>
<p>This post, and <a href="http://www.ugotrade.com/2008/09/29/rob-smart-ibm-web-20-to-opensim-made-easy/">my previous post </a>about integration of <a href="http://opensimulator.org/wiki/Main_Page" target="_blank">OpenSim</a> into Web 2.0, explore how immersive virtual worlds, through a full architectural integration into Web 2.0, will become part of the fabric of everyday computing.</p>
<p>The diagram above shows where OpenSim sits in Web 2.0 (click on the diagram to see a readable enlarged version!). The following interview with OpenSim developer, Teravus Ousley, describes some of the work being done to create documented protocols that will make OpenSim fit seamlessly into Web 2.0 architecture.</p>
<p>OpenSim is in the news a lot these days &#8211; explicitly, as in the case of last week&#8217;s announcement by <a href="http://3di.jp/" target="_blank">3Di</a> of their <a href="http://3di-opensim.com/">&#8220;3Di OpenSim&#8221; Standard</a> (for more see <a href="http://www.virtualworldsnews.com/2008/10/3di-begins-sell.html" target="_blank">here</a> and <a href="http://blog.mindblizzard.com/2008/10/3di-moves-opensim-into-enterprise-mode.html#links" target="_blank">here</a>), and <a href="http://www.chinaq.com/web/" target="_blank">implicitly with the launch of ChinaQ</a>. <a href="http://www.adamfrisby.com/blog/" target="_blank">Adam Frisby</a>, OpenSim, pointed out to me that if you download the ChinaQ client you can see it is based on OpenSim; it connects nicely to <a href="http://osgrid.org/" target="_blank">OSGrid</a> too. There is speculation that the client is a rebranded version of the <a href="http://www.realxtend.org/" target="_blank">realXtend</a> viewer (which is based on the open source <a href="http://www.secondlife.com" target="_blank">Second Life</a> viewer), as all the version numbers are the same.</p>
<p>So OpenSim is not only attracting the interest of business giants like IBM, Microsoft and Intel, it is becoming the architecture of choice for virtual world initiatives from Chinese and Japanese telecoms (see <a href="http://parksassociates.blogspot.com/2008/09/chinaq-based-on-opensim.html" target="_blank">here</a> and <a href="http://www.virtualworldsnews.com/2008/06/ntt-investing-1.html" target="_blank">here</a> for more). Also, <a href="http://www.realxtend.org/page.php?pg=news&amp;s=20080929" target="_blank">see the press release</a> about Nokia and the <a href="http://www.businessoulu.com/">City of Oulu</a>, Finland, joining as supporters of  <a href="http://www.realxtend.org/">realXtend</a>.</p>
<p>But, as Raph Koster notes in <a href="http://www.raphkoster.com/2008/10/03/enterprise-vws-do-they-suck/" target="_blank">his post commenting on 3Di&#8217;s OpenSim announcement</a>, the question of how immersive virtual worlds can go from strong niche or enterprise markets to mass adoption in consumer markets must be answered. As Raph points out, <em>Lively</em>, <em>Whirled, SmallWorlds, Vivaty</em>, and yes, <a href="http://www.metaplace.com/"><em>Metaplace</em></a> have a very different architecture that they hope will attract broad consumer markets. (I did a long interview with Raph on this at <a href="http://www.virtualworldsexpo.com/" target="_blank">The Virtual Worlds Conference and Expo in LA</a>, which I will post as soon as it is transcribed, so more on this soon!)</p>
<p>Architectural integration into the heart of Web 2.0, I would argue, is the key to mass adoption for immersive virtual worlds. While architecture alone will not guarantee the necessary breakthroughs in usability for widespread consumer adoption, it will create the ideal conditions for the innovation through which usability obstacles will be overcome, and the enormous potential for immersive, real time interaction over the internet will be realized.</p>
<h3><strong>Interview with Teravus Ousley</strong></h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2008/10/teravus_ousley_pic.jpg"><img class="alignnone size-full wp-image-1869" title="teravus_ousley_pic" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2008/10/teravus_ousley_pic.jpg" alt="" width="314" height="271" /></a></p>
<p><strong>Tish: </strong>What has been the most fundamental problem with virtual world architecture that has kept immersive virtual worlds isolated from Web 2.0 to date?</p>
<p><strong>Teravus</strong>: A lack of standardization, licensing issues, and the difficulty of entry into the industry.</p>
<h3>1) Standardization</h3>
<p><strong>Tish: </strong>In order of importance what in your view are the priorities for standardization?</p>
<p><strong>Teravus:</strong> Probably the same order that OpenSimulator was tackled in: basic connect (the current state of OGP &#8211; <a href="http://wiki.secondlife.com/wiki/SLGOGP_Draft_1" target="_blank">Open Grid Protocol</a>), basic service (interaction standards), then advanced connect/mashup/aggregate extensions. Preferably people will have working code in each of those spaces to use freely under various licenses.</p>
<p><strong>Tish:</strong> Can you show me where OpenSim will fit in this drawing of Web 2.0 architecture? [Teravus makes some modifications on the drawing I send him from <a href="http://hinchcliffeandcompany.com/" target="_blank">Dion Hinchcliffe&#8217;s</a> presentation from his Web 2.0 Expo workshop; <a href="http://www.ugotrade.com/images/Hinchcliffe.jpg" target="_blank">see the original here</a>.]</p>
<p><strong>Teravus:</strong> The modified diagram [now opening this post] is a great view of how it will look.</p>
<p><strong>Tish</strong>: Why is the TCP stream left out of the original drawing? [<strong>Transmission Control Protocol (TCP)</strong> is one of the core protocols of the <a title="Internet Protocol Suite" href="http://en.wikipedia.org/wiki/Internet_Protocol_Suite">Internet Protocol Suite</a>; for more see <a href="http://en.wikipedia.org/wiki/Transmission_Control_Protocol" target="_blank">here</a>.]</p>
<p><strong>Teravus:</strong> It is left out because the person who made this diagram had web pages in mind: static large files, or small changing files. In the drawing, the fact that TCP streams are smaller than HTTP is on purpose.</p>
<p><strong>Tish:</strong> I have heard different opinions on the percentage of virtual world communications that can be done over HTTP.</p>
<p><strong>Teravus:</strong> The fact is that the biggest usage of communications in virtual worlds is transmitting images; that&#8217;s the number one bandwidth usage. So, if we&#8217;re counting by &#8216;usage&#8217; I say 91%. If we&#8217;re counting by services that use HTTP, I say probably 75%. I definitely think that HTTP should be evaluated for use on new things &#8216;first&#8217;. But there are a few places where HTTP doesn&#8217;t shine.</p>
<p>I am skeptical about replacing things in the UDP stack with HTTP thinking that they&#8217;ll &#8216;perform better.&#8217; [<strong>User Datagram Protocol (UDP)</strong> is another of the core protocols of the <a title="Internet Protocol Suite" href="http://en.wikipedia.org/wiki/Internet_Protocol_Suite">Internet Protocol Suite</a>; for more see <a href="http://en.wikipedia.org/wiki/User_Datagram_Protocol" target="_blank">here</a>.]</p>
<p>I think there&#8217;s been a huge test going on now, and for the last five or six years, with regard to the UDP protocol, and it really has performed admirably. In the last year and a half, I&#8217;ve seen attempts to convert several things to HTTP that have failed, and failed somewhat spectacularly sometimes. In the end the items get reverted back to the UDP protocol. One such item that sticks out in my mind is CAPS (HTTP) based inventory retrieval. The capability to do that in the client has been available since before February, and it&#8217;s been turned on and off on &#8216;Agni&#8217; at least once in the process. Additionally, we (OpenSimulator) enabled HTTP inventory, and the inventory failures rose pretty steeply.</p>
<p>I think some services are really just not &#8216;right&#8217; for HTTP&#8230; particularly where a &#8216;poll&#8217; methodology is used, or the data is dynamic enough that it makes caching useless.</p>
<p>Anyway, as far as the future is concerned, I do want to see some services over HTTP. For other services, it would be more appropriate to have a TCP stream. Stock market data, for example, uses a TCP stream. The scalability of the stock market is just one example of a scalable TCP stream.</p>
<p><strong>Tish:</strong> So you see TCP as the communications protocol that would do the work for the parts of virtual worlds not suitable for HTTP. At least that is how you have shown it in our Web 2.0 architecture drawing. But should there also be a UDP stream?</p>
<p><strong>Teravus</strong>: For the virtual world of tomorrow? Probably not.</p>
<p><strong>Tish:</strong> Why not?</p>
<p><strong>Teravus:</strong> You have less control over the quality of service when it&#8217;s delivered over UDP than TCP.</p>
<p><strong>Tish</strong>: What is the exact relation between TCP and UDP? My understanding is that UDP is a lower-level protocol.</p>
<p><strong>Teravus:</strong> TCP offers guaranteed delivery through flow control, while UDP does not. One of the failures of UDP is the &#8216;resend&#8217; technology we&#8217;ve put on top of it to try and make it reliable. TCP does this automatically, and better than we could, at a lower level &#8211; but it can also cost up to twice the bandwidth depending on what is being sent. HTTP is a layer on top of TCP.</p>
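<p>To make Teravus&#8217;s distinction concrete, here is a minimal sketch using Python&#8217;s standard socket module (the echoed &#8220;prim update&#8221; message and the loopback echo server are illustrative only, not part of any virtual world protocol): TCP hands the application a connected, ordered, reliable byte stream, while UDP is fire-and-forget, so any acknowledgement and resend logic has to be layered on top by hand.</p>

```python
import socket
import threading

def run_demo():
    # --- TCP side: acknowledgement, ordering, and resends come from the kernel ---
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind(("127.0.0.1", 0))        # ephemeral port so the sketch never collides
    srv.listen(1)
    port = srv.getsockname()[1]

    def echo_once():
        conn, _ = srv.accept()
        conn.sendall(conn.recv(1024))  # echo back: guaranteed, in-order delivery
        conn.close()

    t = threading.Thread(target=echo_once)
    t.start()

    cli = socket.create_connection(("127.0.0.1", port))
    cli.sendall(b"prim update")
    reply = cli.recv(1024)
    cli.close()
    t.join()
    srv.close()

    # --- UDP side: no handshake, no delivery guarantee ---
    # The datagram below may silently vanish; any ack/resend scheme has to be
    # bolted on top by the application, which is the hand-rolled reliability
    # layer Teravus is criticizing.
    udp = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    udp.sendto(b"prim update", ("127.0.0.1", port))
    udp.close()
    return reply

if __name__ == "__main__":
    run_demo()
```

<p>Calling <code>run_demo()</code> returns the bytes echoed over TCP; note that the UDP <code>sendto</code> succeeds locally even though nothing is listening on that port, which is exactly the &#8220;no guarantee&#8221; property under discussion.</p>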
<p><strong>Tish:</strong> So just like the HTTP/TCP discussion there has to be a TCP/UDP boundary discussion&#8230; so it is HTTP, then TCP, then UDP, and the boundaries have to be worked on.</p>
<p><strong>Teravus: </strong>Those are the orderings in my mind&#8230; probably, if UDP gets any use at all, it should be less than 0.5%.</p>
<p><strong>Tish:</strong> And the current Second Life architecture &#8211; what does it use if it isn&#8217;t using HTTP? [See the work of the <a href="http://wiki.secondlife.com/wiki/Architecture_Working_Group" target="_blank">Architecture Working Group</a> on the future <a href="http://www.secondlife.com" target="_blank">Second Life</a> architecture.]</p>
<p><strong>Teravus:</strong> UDP or HTTP</p>
<p><strong>Tish:</strong> and TCP?</p>
<p><strong>Teravus:</strong> Well, TCP is a layer under HTTP. As far as I know, SL doesn&#8217;t use TCP streams directly.</p>
<p><strong>Teravus: </strong>Instead, it uses HTTP polling. This is one of the places I&#8217;ve highlighted where it doesn&#8217;t shine.</p>
<p><strong>Tish: </strong>Polling does sound slow?</p>
<p><strong>Teravus:</strong> Polling is essentially&#8230; (connect) Got any data for me? No? (disconnect). (connect) Got any data for me? No? (disconnect).</p>
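<p>The poll loop Teravus describes can be sketched as follows. The <code>FakeEventServer</code> class and the &#8220;chat message&#8221; event are invented stand-ins, not part of the Second Life CAPS API; the point is that each poll pays a full connect/ask/disconnect round trip whether or not the server has anything to deliver.</p>

```python
class FakeEventServer:
    """Stands in for a CAPS-style HTTP event queue (purely illustrative)."""

    def __init__(self, pending):
        # Each entry is what the server would answer on one poll cycle;
        # None models a "no data for you" response.
        self.pending = list(pending)
        self.requests_served = 0

    def poll(self):
        # One call = one full (connect) -> "Got any data?" -> (disconnect) trip.
        self.requests_served += 1
        return self.pending.pop(0) if self.pending else None

def poll_until_event(server, max_polls=10):
    # The client keeps reconnecting and asking until it finally gets data,
    # paying the round-trip cost for every "No?" answer along the way.
    for _ in range(max_polls):
        event = server.poll()
        if event is not None:
            return event
    return None
```

<p>With two empty responses queued ahead of one real event, the client burns three round trips to receive a single message &#8211; the overhead that makes polling a poor fit for highly dynamic virtual-world data.</p>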
<p><strong>Tish:</strong> So what is the path to standards for this then?</p>
<p><strong>Teravus:</strong> Distilling what we know works and what we actually intend on supporting as far as adoption under these standards.</p>
<p><strong>Tish:</strong> Where does <a href="http://www.metaverse1.org/" target="_blank">MPEG-V</a> fit in? Have you read their document yet?</p>
<p><strong>Teravus:</strong> MPEG-V is interesting reading&#8230; but is there any working example? I have just the overview, but I&#8217;ll read it over to have a better determination of how to &#8216;keep it in mind&#8217; for the future. It looks like they&#8217;ve only really defined the requirements of the MPEG-V spec. The MPEG-V spec looks quite far reaching, but the documents so far are requirements and marketing talk aimed toward business people &#8211; obviously intended to get more people interested in working on them.</p>
<p>But I have a feeling that any format with MPEG before it will be onerous to support&#8230; for me it&#8217;s too early to tell. It&#8217;s quite far reaching&#8230; it isn&#8217;t anything like &#8216;signal processing,&#8217; which the MPEG group is most famous for.</p>
<p><strong>Tish:</strong> The whole top-down approach of the MPEG-V initiative seems counter to Web 2.0 principles to me.</p>
<p><strong>Teravus:</strong> Well, remember that even if there&#8217;s a virtual world format war (a reference to HD DVD vs. Blu-ray), we still need to win over the rest of the web.</p>
<p><strong>Tish:</strong> Yes, and don&#8217;t you think the way to win over the web is to use as many existing standards as possible?</p>
<p><strong>Teravus:</strong> Well, it&#8217;s to use as many existing standards as &#8216;fit,&#8217; though. KISS, as always: (K)eep (I)t (S)imple (S)tupid. If we have 30 different internet standards, people looking at it will @.@</p>
<p><strong>Tish:</strong> But is it just the lack of documented protocols that has created isolation from Web 2.0? And really, doesn&#8217;t it boil down to standardizing that small percentage that is outside HTTP &#8211; the TCP and UDP streams we talked about earlier, where the real-time stuff that virtual worlds bring to the web happens?</p>
<p><strong>Teravus:</strong> No, actually the HTTP standardization is just as important.</p>
<p><strong>Tish:</strong> You mean even though SL used HTTP, it isn&#8217;t standardized?</p>
<p><strong>Teravus:</strong> Not documented specifically.</p>
<p><strong>Tish:</strong> And OpenSim &#8211; is that documented?</p>
<p><strong>Teravus:</strong> Probably not well enough to define a standard.</p>
<p><strong>Tish:</strong> Is AWG (<a href="http://wiki.secondlife.com/wiki/Architecture_Working_Group" target="_blank">Architecture Working Group</a>) doing the documentation?</p>
<p><strong>Teravus:</strong> Working on it.</p>
<h3>2) <strong>Licensing Issues</strong></h3>
<p><strong>Tish:</strong> It sounds like some of this work has to go on across client and server. Are we running into the issue of <a href="http://en.wikipedia.org/wiki/Berkeley_Software_Distribution" target="_blank">BSD</a> for OpenSim and <a href="http://en.wikipedia.org/wiki/GNU_General_Public_License" target="_blank">GPL</a> for the Second Life viewer?</p>
<p><strong>Teravus:</strong> Well, some of the issue here is license choice. One of the reasons that libOMV was able to achieve what they did was that they did it <em>before</em> the client was open sourced.</p>
<p><strong>Tish:</strong> So open sourcing the client actually became an obstacle!!???</p>
<p><strong>Teravus</strong>: I don&#8217;t think so on the whole. I think it was great for the community. I do, however, think that C++ UDP stacks will be scrutinized more for GPL license violations because, of course, the client is GPL and C++.</p>
<p><strong>Tish:</strong> It is my understanding that Linden Lab is open to discussions on making the licensing more efficient for the open source community?</p>
<p><strong>Teravus</strong>: Well, the client as a whole should not be changed as far as the license goes. JUST the things that they expect people to adopt should be made more open. If they expect people to adopt prims, then there should be an efficient implementation available for anyone to use, at the very least under the <a href="http://en.wikipedia.org/wiki/GNU_Lesser_General_Public_License" target="_blank">LGPL</a>. Otherwise, the die-hards are forced to re-implement them from scratch, and most people will just choose something more open.</p>
<p><strong>Tish: </strong>Has anyone ever put together a list of the parts that need to be <a href="http://en.wikipedia.org/wiki/GNU_Lesser_General_Public_License" target="_blank">LGPL</a>ed?</p>
<p><strong>Teravus</strong>: Well, I think it&#8217;s there in a few places. There is at least one JIRA issue open on it.</p>
<p><strong>Teravus:</strong> A few that come to mind for me are the UDP stack and the prim-to-mesh/UV code. I think there are some things that can definitely be improved about the UDP stack. There are some things (images come to mind) that would be better over HTTP.</p>
<p><strong>Tish: </strong>Do you think if the UDP stack were LGPLed that would be a significant help to integrating OpenSim better with the web?</p>
<p><strong>Teravus:</strong> Well, it would certainly be adopted by more clients. GPL + (your own code) = GPL-licensed client. LGPL linked library + (your own code) = your own license.<br />
You still need to mention that you used LL&#8217;s UDP stack, and provide the source code for it on request.</p>
<p>The general client itself should remain GPL; it&#8217;s better for LL that way. Just the items that they want people to &#8216;standardize&#8217; on. It would help if those were at least LGPL.</p>
<p><strong>Tish:</strong> And the value to LL of LGPLing these parts is that it would help spread their basic technology while protecting the rest of their viewer?</p>
<p><strong>Teravus:</strong> It furthers their goal of standardization on their systems because it allows more people to adopt it for their own uses without worrying about GPL-ing their own client.</p>
<p><strong>Tish:</strong> It is hard to standardize without access to the low level parts of the client right?</p>
<p><strong>Teravus:</strong> The general population of developers will want a libX that they can plug into their application for communicating, and a libY to deal with object data.</p>
<p><strong>Tish:</strong> Hence your requests for LGPL were the UDP stack and the prim-&gt;mesh/UV code.</p>
<p><strong>Teravus nods</strong></p>
<p><strong>Tish: </strong>And at the moment they only have libOMV?</p>
<p><strong>Teravus</strong>: That&#8217;s the only &#8216;truly&#8217; open standard right now as far as the LL technology is concerned. OpenSimulator&#8217;s use of that data could also be seen as a standard.</p>
<p><strong>Teravus:</strong> But we have not published anything beyond code. Neither have they, really, technically. But their organization of the way things work is very, very clear.</p>
<p><strong>Tish:</strong> What are the most significant limitations of libOMV?</p>
<p><strong>Teravus:</strong> Probably just it not being in C++. C++ has its benefits and its pitfalls. Changes in C++ usually take longer than ones in C#. But, of course, C++ is always faster. With libOMV it isn&#8217;t always clear what packet is used when. However, with some experimentation, you can figure it out in 30 minutes or less.</p>
<h3><strong>Usability</strong></h3>
<p>We didn&#8217;t spend much time discussing some of the innovation in usability that this architectural integration into Web 2.0 will enable (more to come on that!). But Teravus mentioned one interesting use case he is working on.</p>
<p><strong>Teravus:</strong> You might also stick a &#8216;cloud renderer&#8217; into the graphic. [Teravus was looking at the diagram (from <a href="http://hinchcliffeandcompany.com/" target="_blank">Dion Hinchcliffe</a>) that opened my previous post, &#8220;Web 2.0 to OpenSim Made Easy&#8221;; click on the thumbnail below.]</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2008/10/teravus1the-moving-pieces-modified-twice.jpg"><img class="alignnone size-medium wp-image-1865" title="teravus1the-moving-pieces-modified-twice" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2008/10/teravus1the-moving-pieces-modified-twice-300x186.jpg" alt="" width="300" height="186" /></a></p>
<p>Some people have discussed having a &#8216;video stream&#8217; that&#8217;s rendered in the cloud; providing that to Flash clients would be the best solution for them.</p>
<p>The cloud renderer is for organizations that have large pools of servers with GPUs, so it would allow for very powerful rendering. The servers can render the scenes and stream them to low-end browsers. It would allow extremely high quality rendering for really low-end browsers, such as cell phones.</p>
<p><strong>Tish:</strong> Is that possible now on OpenSim?</p>
<p><strong>Teravus</strong>: Nope. But it&#8217;s something that I intend to work on in the future. It would essentially be a video [streamed to low-end browsers].</p>
<p><strong>Tish:</strong> Is that different from what <a href="http://blog.newsweek.com/blogs/levelup/archive/2008/04/21/second-life-on-your-mobile-phone-yes-says-vollee.aspx" target="_blank">Vollee</a> is doing? The mobile client for SL?</p>
<p><strong>Teravus</strong>: It appears that they are, indeed, pre-rendering the client&#8217;s view and streaming it to the mobile device.</p>
]]></content:encoded>
			<wfw:commentRss>http://www.ugotrade.com/2008/10/06/putting-opensim-into-the-heart-of-web-20/feed/</wfw:commentRss>
		<slash:comments>9</slash:comments>
		</item>
		<item>
		<title>Philip Rosedale: Open Source, Interoperable Virtual Worlds</title>
		<link>http://www.ugotrade.com/2008/09/26/philip-rosedale-open-source-virtual-worlds/</link>
		<comments>http://www.ugotrade.com/2008/09/26/philip-rosedale-open-source-virtual-worlds/#comments</comments>
		<pubDate>Fri, 26 Sep 2008 06:08:46 +0000</pubDate>
		<dc:creator><![CDATA[Tish Shute]]></dc:creator>
				<category><![CDATA[3D internet]]></category>
		<category><![CDATA[Architectural Working Group]]></category>
		<category><![CDATA[free software]]></category>
		<category><![CDATA[Intel in Virtual Worlds]]></category>
		<category><![CDATA[interoperability of virtual worlds]]></category>
		<category><![CDATA[Linden Lab]]></category>
		<category><![CDATA[Metaverse]]></category>
		<category><![CDATA[Open Grid]]></category>
		<category><![CDATA[open metaverse]]></category>
		<category><![CDATA[open protocols for virtual worlds]]></category>
		<category><![CDATA[open source]]></category>
		<category><![CDATA[open standards for virtual worlds]]></category>
		<category><![CDATA[OpenSim]]></category>
		<category><![CDATA[Philip Rosedale]]></category>
		<category><![CDATA[realXtend]]></category>
		<category><![CDATA[Second Life]]></category>
		<category><![CDATA[vapor standards]]></category>
		<category><![CDATA[virtual world standards]]></category>
		<category><![CDATA[Virtual Worlds]]></category>
		<category><![CDATA[Web 2.0]]></category>
		<category><![CDATA[Web 3D]]></category>
		<category><![CDATA[Web3.D]]></category>
		<category><![CDATA[World 2.0]]></category>

		<guid isPermaLink="false">http://www.ugotrade.com/?p=1750</guid>
		<description><![CDATA[Metanomics host Robert Bloomfield interviewed Second Life founder and Chairman of the Board, Philip Rosedale, at the Second Life Community Convention in Tampa, Florida.Â  The Rosedale interview is available here (pictures above are Philip Rosedale and his avatar). Rosedale talked about Linden Lab&#8217;s long standing commitment to open source and open protocols in one segment [&#8230;]]]></description>
				<content:encoded><![CDATA[<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2008/09/philip_linden_2.jpg"><img class="alignnone size-full wp-image-1751" title="philip_linden_2" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2008/09/philip_linden_2.jpg" alt="" width="156" height="176" /></a><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2008/09/philippost.jpg"><img class="alignnone size-full wp-image-1752" title="philippost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2008/09/philippost.jpg" alt="" width="156" height="176" /></a></p>
<p><a href="http://metanomics.net/19-sep-2008/philip-rosedale-interview-and-expert-reactions">Metanomics</a> host Robert Bloomfield interviewed Second Life founder and Chairman of the Board, Philip Rosedale, at the Second Life Community Convention in Tampa, Florida. <a onclick="javascript:urchinTracker ('/outbound/article/www.metanomics.net');" href="http://www.metanomics.net/19-sep-2008/philip-rosedale-interview-and-expert-reactions">The Rosedale interview is available here</a> (pictures above are Philip Rosedale and his avatar).</p>
<p>Rosedale talked about Linden Lab&#8217;s long-standing commitment to open source and open protocols in one segment of this interview, and Robert asked me to post a brief reaction. The full interview covers a wide range of topics, and Robert has gotten responses on different parts of the interview from <a href="http://nwn.blogs.com/nwn/2008/09/philip-linden-o.html#more" target="_blank">Wagner James Au</a>, <a href="http://www.christianrenaud.com/weblog/2008/09/metanomics-and-rosedales-future-vision.html#more" target="_blank">Christian Renaud</a>, <a href="http://npirl.blogspot.com/2008/09/reacting-to-rosedale-on-ll-press.html" target="_blank">&#8216;Bettina Tizzy,&#8217;</a> <a href="http://www.kzero.co.uk/blog/?p=2501" target="_blank">Nic Mitham</a> and <a href="http://dusanwriter.com/?p=941" target="_blank">&#8216;Dusan Writer,&#8217;</a> and <a href="http://virtuallyblind.com/2008/09/22/rosedale-interview-reaction/" target="_blank">Benjamin Duranske</a> as well.</p>
<h3>A System Without an Owner Is a Beautiful Thing</h3>
<p>While Philip Rosedale&#8217;s comments may not, at first glance, appear to be saying anything new, they are in fact a very cogent summary of the important and crucial role Linden Lab has played, and continues to play, in moving virtual worlds out of their walled gardens and bringing them closer to that beautiful thing &#8211; a system without an owner.</p>
<p>Only a system without an owner can unleash, for virtual world technology, the kind of creative, world-changing power that we have seen on the 2D web from HTTP and HTML. Anyone with even a vague idea of the history of the internet understands that it is only through openness &#8211; open source, open protocols, open standards, and open APIs &#8211; that we will get from here, the alpha days of virtual world technology, to their coming of age as a mainstream phenomenon.</p>
<p>It is very much to the credit of Linden Lab that, as Rosedale says, they have never been afraid of openness: &#8220;I don&#8217;t think that the open grid will impact our revenues any more than open sourcing the client,&#8221; he says. While there have been criticisms of licensing choices and of the ways Linden Lab handles contributions back to their viewer from the community, I think that overall Linden Lab has made very important and visionary moves, first to open source, and now to open protocols.</p>
<p>Open sourcing the viewer at a relatively early point in Second Life&#8217;s development created an enormous opportunity for the rapid development of an open source re-engineering of the server side, <a href="http://opensimulator.org/wiki/Main_Page" target="_blank">OpenSim</a>.Â  OpenSim with the Second Life viewer is the most complete, open implementation of a persistent virtual world.Â  Without the head start from the open source Second Life viewer, and the connection to the thriving developer community of Second Life, the light speed progress of OpenSim would have been considerably more difficult.</p>
<p>Now OpenSim is getting closer to breaking free from the Second Life viewer. And, standard messaging protocols between client and server are, perhaps, the next step. Rob Smart, IBM, discussed this with me recently (see my upcoming interview with Rob Smart, &#8220;Web 2.0 Made Easy in OpenSim,&#8221; and see <a href="http://tinyurl.com/3ekl2d" target="_blank">his post by this title</a> for more).</p>
<p>As Rob Smart, IBM, notes, &#8220;If, for example, the messages that went between your SecondLife client and the OpenSim/SecondLife servers was a standard protocol which had a bunch of libraries for a variety of languages, then you could start logging into VW servers from all kinds of clients.&#8221; (For more see my upcoming post, &#8220;Interview with Rob Smart, IBM: Web 2.0 Made Easy for OpenSim.&#8221;)</p>
<h3>Open Standards Will Emerge From Rough Consensus and Working Code</h3>
<p>There are some that subscribe to the view that standards will arise in a virgin birth from an ivory tower, i.e., professors and captains of industry, removed from open source developer communities, will produce long documents that describe all of the fields, and every one of the messages, and all the APIs in detail prior to implementation.</p>
<p>But as David Levine (IBM), Mike Mazur (3Di), Mic Bowman (Intel), <a href="http://justincc.wordpress.com/">Justin Clark-Casey</a>, and <a href="http://www.adamfrisby.com/blog/">Adam Frisby</a> (Deep Think/<a href="http://www.sinewavecompany.com/" target="_blank">Sine Wave</a>) cogently argued on the &#8220;Open Source and Interoperable Virtual Worlds&#8221; panel at the Virtual Worlds Conference and Expo in LA, this top-down approach to standards (or &#8220;vapor standards&#8221;) does not typically produce good results. For more on the virtues of creating standards from &#8220;rough consensus and working code&#8221; as opposed to top down, there is a full recording of the LA panel <a href="http://www.ugotrade.com/2008/09/09/open-source-and-interoperability-will-take-virtual-worlds-mainstream/" target="_blank">here</a>.</p>
<p>Thus, in my view, Linden Lab&#8217;s current focus on open protocols, <a href="http://www.ugotrade.com/2008/07/31/the-open-grid-beta-the-first-step-to-interoperable-virtual-worlds/" target="_blank">OpenGrid</a> (for more see <a href="http://www.ugotrade.com/2008/07/31/the-open-grid-beta-the-first-step-to-interoperable-virtual-worlds/" target="_blank">here</a>), and interoperability is another key step towards the creation of open standards for virtual worlds. And Linden Lab are again leading the way in creating an environment that fosters innovation.</p>
<p>OpenGrid creates a testing ground where protocols can be worked out, and it enables the kind of heterogeneous ecosystem to develop that can nurture the creation of standards. I agree with Rosedale when he says content makers will have an important role in driving interoperability and standards. The creation of standards is certainly a social as well as technical process. And as Rosedale notes, content creators will have compelling reasons to move their content around in an open metaverse.</p>
<p>David Levine (IBM) described in detail in LA (again, see the <a href="http://www.ugotrade.com/audio/OSInteroppanel.mp3" target="_blank">recording here</a>) the importance of interoperability and parallel innovation for the creation of standards. OpenSim has already produced an extraordinary amount of innovation: <a href="http://www.realxtend.org/" target="_blank">realXtend</a>, <a href="http://tribalnet.se/" target="_blank">Tribal Media</a> and more. Also see my interview with <a href="http://www.ugotrade.com/2008/09/15/interview-with-mic-bowman-intel-the-future-of-virtual-worlds/" target="_blank">Mic Bowman, Intel</a>, for more on the role of open source/open standards in fostering innovation and in moving virtual worlds into &#8220;the fabric of everyday computing.&#8221;</p>
<p>While Linden Lab only have a small team working on OpenGrid, it is a vital one. And, with Mark Lentczner (<a href="http://wiki.secondlife.com/wiki/User:Zero_Linden" target="_blank">Zero Linden</a> in Second Life) leading the <a href="http://wiki.secondlife.com/wiki/Architecture_Working_Group" target="_blank">Architecture Working Group</a> for Linden Lab, and a collaboration with IBM led by David Levine (<a href="http://zhaewry.wordpress.com/" target="_blank">Zha Ewry</a> in Second Life) driving the interoperability effort, plus the OpenGrid project, Linden Lab has a high-powered, agile, lean machine working for an open future.</p>
<p>So with no more ado, here it is: Robert Bloomfield&#8217;s interview with Philip Rosedale!</p>
<h3>Rosedale on OpenSim: Pandora&#8217;s Box Was Already Open</h3>
<p><strong>Introduction from Robert Bloomfield</strong></p>
<p>Naturally, a major topic of my interview with Philip Rosedale was the implications of OpenSim and the Open Grid project, which both involve creating open source server-side implementations of virtual worlds that can replicate Second Life&#8217;s functionality. As a relative newcomer to this corner of the tech industry, I still find myself asking why a company would essentially create its own competitor. Here is what Philip had to say; I have asked Tish Shute of UgoTrade to comment, as one of the people who has covered the OpenSim/OpenGrid movement with more detail and passion than just about anyone.</p>
<p>PHILIP ROSEDALE: I just really hold true to the strategic belief that there&#8217;s going to be a tremendous amount of consolidation and interconnection between these worlds because the content development process is so challenging that the content developers are going to push us all together. They&#8217;re going to say, &#8220;Give me a file format. Give me an interchange format. And let me move that chair from this grid to that grid. I&#8217;ve got to be able to do that because I&#8217;ve got a customer here who wants to buy it.&#8221; And so I think that that consolidation is going to happen, and it&#8217;s going to happen earlier than people would have thought.</p>
<p>ROBERT BLOOMFIELD: And this is looking at the success, the energy around OpenSim, Open Grid.</p>
<p>PHILIP ROSEDALE: The energy, yeah. I think, at this point, we&#8217;ve got an appropriate level of energy &#8211; I think that&#8217;s exactly the right word &#8211; around exploring how quickly we can generalize all this stuff and open and interconnect everything together. I really think that&#8217;s going to continue.</p>
<p>ROBERT BLOOMFIELD: [D]o you feel like you might have opened Pandora&#8217;s box and that it&#8217;s not really under your control now?</p>
<p>PHILIP ROSEDALE: I think that Second Life has, in many ways, not been under our control from the beginning and that it&#8217;s been a basic operating assumption that creating the kind of incredible place and business opportunity, and social opportunity more broadly, that Second Life is would require a certain lack of control. And that was true with the content from day one.</p>
<p>So for us, oh, we open-sourced the client a while ago, and now we&#8217;re trying to do the same thing with respect to operating standards to interconnect grids. This is a pretty logical progression, using worlds that we&#8217;re pretty familiar with. I mean we&#8217;ve always felt that, if you have a compelling use proposition, which certainly Second Life does, in other words, if there&#8217;s real utility, real fun or real business or real whatever in what people are doing, then there should be a way, as a company, to be open, global and still make money on an hour-to-hour or a user-to-user basis or whatever on what we&#8217;re doing. And the economic aspects of the business have been fantastic from the very early days, and we don&#8217;t really even worry about them.</p>
<p>Our ability as a company to find a way to make a reasonable amount of money per hour that people spend in Second Life, it&#8217;s really never been that much of a problem. It&#8217;s actually been fascinating as we&#8217;ve changed pricing and as we&#8217;ve changed the ways that we make money. Introducing new ways of making money &#8211; like selling currency on the LindeX &#8211; it&#8217;s been amazing how stable our revenues have been as a function of usage hours. It&#8217;s one of the things that we sometimes marvel at. It&#8217;s almost an emergent effect, if you will, that the company&#8217;s business, its operating revenues, are really very stable.</p>
<p>ROBERT BLOOMFIELD: Even though they&#8217;re coming from different streams.</p>
<p>PHILIP ROSEDALE: Even though they&#8217;re coming from different streams. And sometimes the requirements of the platform and decisions that we make will really substantially change the nature of those streams, but when you put them all together and you divide them by the number of usage hours, it&#8217;s like a constant. It&#8217;s almost a magic number. And it&#8217;s a magic number that allows us to be profitable, and therefore, is certainly adequate to make a business in the future. I don&#8217;t think that continuing to open Second Life up as we have been is going to impact that. Again, I just think there are so many opportunities to make money that we shouldn&#8217;t have to worry about that too much in the company. And, again, I think that&#8217;s a lot like the early internet. I mean if you step back and look holistically at the internet &#8211; you look at PayPal, the payment systems, auction systems, transaction systems, posting, naming &#8211; you look at all the businesses that comprise the internet, well, those are all the kinds of businesses that we as a company can be in, in this emerging market. There&#8217;s no business that&#8217;s denied us. We are in the hosting business. We can continue to be in the hosting business long term, putting servers up and providing access to them.</p>
<p>We can certainly be in the naming business. We&#8217;re in the currency and transaction support business. It&#8217;s funny, it&#8217;s something that&#8217;s often discussed. We worry much more about improving the scalability, stability and the usability of the system: reducing that initial user experience, reducing the time associated with it, making it easier. That&#8217;s got to be the lever that drives more growth in the overall industry, more revenues for us. So it&#8217;s really all we worry about. But I don&#8217;t think that the open grid will impact our revenues any more than open sourcing the client did.</p>
]]></content:encoded>
			<wfw:commentRss>http://www.ugotrade.com/2008/09/26/philip-rosedale-open-source-virtual-worlds/feed/</wfw:commentRss>
		<slash:comments>5</slash:comments>
<enclosure url="http://www.ugotrade.com/audio/OSInteroppanel.mp3" length="40308529" type="audio/x-mpeg" />
		</item>
		<item>
		<title>Interview with Mic Bowman, Intel: The Future of Virtual Worlds</title>
		<link>http://www.ugotrade.com/2008/09/15/interview-with-mic-bowman-intel-the-future-of-virtual-worlds/</link>
		<comments>http://www.ugotrade.com/2008/09/15/interview-with-mic-bowman-intel-the-future-of-virtual-worlds/#comments</comments>
		<pubDate>Mon, 15 Sep 2008 13:42:04 +0000</pubDate>
		<dc:creator><![CDATA[Tish Shute]]></dc:creator>
				<category><![CDATA[3D internet]]></category>
		<category><![CDATA[Augmented Reality]]></category>
		<category><![CDATA[avatar 2.0]]></category>
		<category><![CDATA[free software]]></category>
		<category><![CDATA[Intel in Virtual Worlds]]></category>
		<category><![CDATA[interoperability of virtual worlds]]></category>
		<category><![CDATA[Linden Lab]]></category>
		<category><![CDATA[Metaverse]]></category>
		<category><![CDATA[MMOGs]]></category>
		<category><![CDATA[open metaverse]]></category>
		<category><![CDATA[open source]]></category>
		<category><![CDATA[OpenSim]]></category>
		<category><![CDATA[Second Life]]></category>

		<guid isPermaLink="false">http://www.ugotrade.com/?p=1653</guid>
		<description><![CDATA[Intel obviously benefits from broad adoption of applications that drive significant compute so it is hardly surprising that they had been paying attention to the early adopters of the Gaming &#38; Visual Computing market. But, in a recent post the Intel blog states, &#8220;going forward the bigger growth will be coming from the other two [&#8230;]]]></description>
				<content:encoded><![CDATA[<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2008/09/secondlifevw2008post.jpg"><img class="alignnone size-full wp-image-1687" title="secondlifevw2008post" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2008/09/secondlifevw2008post.jpg" alt="" width="453" height="361" /></a></p>
<p>Intel obviously benefits from broad adoption of applications that drive significant compute, so it is hardly surprising that they had been paying attention to the early adopters of the Gaming &amp; Visual Computing market. But <a href="http://softwareblogs.intel.com/2008/09/09/introducing-connected-visual-computing-cvc-2/" target="_blank">a recent post on the Intel blog states</a>, &#8220;going forward the bigger growth will be coming from the other two segments Metaverse and Paraverse&#8221; (for more on the future of the paraverse, see the recording of the augmented reality panel in LA in <a href="http://www.ugotrade.com/2008/09/09/open-source-and-interoperability-will-take-virtual-worlds-mainstream/" target="_blank">my previous post</a>).</p>
<p>(Thanks to Joshua Meadows (Joshua Nightshade in SL), <a href="http://abstractavatars.com" target="_blank">Abstract Avatars</a>, for the picture of the Linden Lab booth at the <a href="http://www.virtualworldsexpo.com/" target="_blank">Virtual Worlds Conference and Expo, LA 2008</a>. Those giant avatars from <a href="http://www.secondlife.com">Second Life</a> (TM) are very cool. That is John Lester (Pathfinder Linden) in the striped shirt, helping give us an idea of their scale.)</p>
<p>Intel is also in a powerful position to facilitate mass adoption of rich, immersive virtual worlds, where there is a direct connection between more compute and better user experience. As Christian Renaud pointed out in <a href="http://blog.techintelgroup.com/2008/08/announcing-the-tig-virtual-worlds-industry-outlook-2008-2009.html" target="_blank">The Technology Intelligence Group&#8217;s Virtual Worlds Industry Outlook, 2008-2009</a> (written with Sean F. Kane, Esq.), the &#8220;ability for the computer&#8217;s graphics subsystems to render the data as quickly as required&#8221; has been an obstacle to mainstream adoption of virtual worlds. But, Renaud goes on to note, Intel&#8217;s new Larrabee architecture may be a game changer for virtual worlds.</p>
<p><strong><em>Recent announcements may change the landscape. At the SIGGRAPH trade show in August 2008, Intel announced their Larrabee architecture, slated for product release in the late 2009-2010 timeframe. This would take what has typically been a separate Graphical Processing Unit (GPU) function and relocate it into the processor architecture on the motherboard of a computer.<br />
Although the early stages of this technology will undoubtedly be prone to compatibility issues with legacy graphics drivers, the assimilation of this function onto the main motherboard should streamline the graphics performance and compatibility issues that virtual worlds have been susceptible to.</em></strong></p>
<p><a href="http://softwareblogs.intel.com/author/george-jobi/">Jobi George</a>, on the <a href="http://softwareblogs.intel.com/2008/09/09/introducing-connected-visual-computing-cvc-2/" target="_blank">Intel blog</a>, explains how Intel sees three segments, gaming, metaverse, and paraverse, <span>as driving the next logical evolution of the web, where &#8220;connectedness&#8221; and &#8220;immersion&#8221; (not just richness) come together to bring us to an era of <strong>&#8220;Connected Visual Computing&#8221; (see the press coverage of CVC <a href="http://www.theinquirer.net/gb/inquirer/news/2008/08/19/intel-reveals-plans-connected" target="_blank">here</a>, <a href="http://www.bit-tech.net/news/2008/08/19/intel-intros-connected-visual-computing-initiative/1" target="_blank">here,</a> and <a href="http://www.hexus.net/content/item.php?item=15047" target="_blank">here</a>).</strong></span></p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2008/09/cvc1.jpg"><img class="alignnone size-full wp-image-1689" title="cvc1" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2008/09/cvc1.jpg" alt="" width="450" height="293" /></a></p>
<h3>Getting from here (gaming, metaverse, paraverse) to there (connected visual computing)</h3>
<p>Mic Bowman, Intel, was on two panels at the Virtual Worlds Conference and Expo in LA last week. I wrote up and posted the recording of the panel I facilitated, <a href="http://www.ugotrade.com/2008/09/09/open-source-and-interoperability-will-take-virtual-worlds-mainstream/" target="_blank">&#8220;Open Source, Interoperable Virtual Worlds&#8221; </a>in my <a href="http://www.ugotrade.com/2008/09/09/open-source-and-interoperability-will-take-virtual-worlds-mainstream/" target="_blank">previous post</a>. On our panel, Mic explained in detail some of the work Intel is doing to help us get from here (gaming, metaverse, paraverse) to there (connected visual computing). Mic also spoke on the <a href="http://www.virtualworldsroadmap.org/" target="_blank">Virtual World Road Map</a> session with keynote speaker, Sibley Verbeck, Electric Sheep Company, (see <a href="http://blogs.electricsheepcompany.com/sheep/" target="_blank">Sibley&#8217;s blog</a>). This panel focused more on cross industry cooperation.</p>
<p>Mic&#8217;s message for our panel on &#8220;Open Source, Interoperable Virtual Worlds,&#8221; in a nutshell, was:</p>
<p><strong><em>To achieve a thriving, growing, broadly adopted CVC ecosystem, we believe the industry must come to some agreement on common building block technologies. Open source technologies represent a critical element in the discovery and development of these technologies, and foster innovative usages that drive adoption.</em></strong></p>
<p>To give you a taste of how deeply we discussed the work being done to research and create these common building blocks (err, yes, we were a panel of unbridled geekiness to some), here is a short transcription of a portion of Mic&#8217;s contribution to our panel, lightly edited.</p>
<p>The creation of common building blocks for virtual worlds, similar to what HTML and HTTP did for the internet, is in Mic&#8217;s view a vital step in the transition to connected visual computing, one that will let the experience of virtual worlds become ubiquitous and transparent in the way the web has: when we say &#8220;browse the web&#8221; we take the &#8220;web&#8221; for granted, and it is the applications, YouTube, Flickr, etc., that get our attention.</p>
<h3>The Evolution of the Web Into Connected Visual Computing</h3>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2008/09/cvcpost.jpg"><img class="alignnone size-full wp-image-1688" title="cvcpost" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2008/09/cvcpost.jpg" alt="" width="450" height="333" /></a></p>
<p><strong><em>In 1995 we talked about surfing the web; nobody uses that phrase any more. Today we talk about updating our blogs or adding something to Twitter, or I want to go off and buy something from eBay or Amazon. The web has become essentially a fundamental part of the fabric. It&#8217;s the applications that it enables that are important. Right now we think about virtual world technologies generally as an application. Ultimately we would like to figure out how to get that kind of technology into the basic fabric. So that we think about collaboration as an application, we think about a conference, and attending the conference, as the thing we do, not as a platform on which we do that. And to accomplish that, what we envision at Intel is a set of building blocks that are created or emerge out of the various platforms, as being consistent technologies.</em></strong></p>
<p><strong><em>And so we looked at a variety of different approaches to understanding what those technologies could be, what those common technologies were, and how they are created and adopted. What we saw in OpenSim&#8217;s modular architecture was an opportunity to start articulating boundaries between the various pieces of technology in a way that allowed us to disaggregate the architecture, so that we could start thinking about how to pull the pieces apart and how the interfaces could be made consistent across those pieces. For example, there&#8217;s a set of types for the basic building blocks that exist across the Second Life and OpenSim protocols. </em></strong></p>
<p><strong><em>One of the people we just hired, John Hurliman, has been working on libopenmv for a while, and one of the things we were discussing is how to capture that consistency of types. And so John&#8217;s going off and pulling the set of modules out of the openmv project, in order to give us a basic set of types that can be applied across multiple applications, that can be re-used in many different ways. And so it&#8217;s useful to the OpenSim community, and it&#8217;s useful for building out some new test servers and clients that can allow us to actually try out different types of load, and potentially allows us a way of extracting out the set of protocols that implement those types so that we can start looking at new ways of building more efficient protocols. </em></strong></p>
<p><strong><em>Another example of that would be the meshing code, the code that actually takes the basic conceptual level of object that is being represented in the world and turns it into something that can actually be sent to a GPU in order to be put on a screen. And so that basic meshing component seems to be a consistent piece of technology that occurs in several places: it&#8217;s useful both in mapping the representation into the physics engine and, on the client, mapping it into the graphics engine. And so that&#8217;s another example of a basic technology that seems to be appearing consistently in many locations.<br />
</em></strong></p>
<p><strong><em>And so, what we like about OpenSim in particular, and again this is just a tool and framework for us for understanding what these basic building blocks are, but what we like about it is we can experiment with these new boundaries in the framework of a complete and functioning system. And so it gives us a framework for testing out what these interfaces should be and what the basic building blocks are.</em></strong></p>
<p>Mic highlighted some key points of the OpenSim architecture and ecosystem at the <a href="http://www.intel.com/idf/?cid=cim:ggl|idf_home|k4EF5|s" target="_blank">Intel Developer Forum</a>. The slide below is from his presentation there.</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2008/08/opensim.jpg"><img class="alignnone size-full wp-image-1620" title="opensim" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2008/08/opensim.jpg" alt="" width="450" height="291" /></a></p>
<p>(The Genkii team created the OpenSim N-Body demonstration with astrophysicists Piet Hut and Junichiro Makino, <a href="http://www.ugotrade.com/2008/07/19/astrophysics-in-virtual-worlds-implementing-n-body-simulations-in-opensim/" target="_blank">see here for more</a>).</p>
<h3>Interview with Mic Bowman: &#8220;The Future of Connected Visual Computing.&#8221;</h3>
<p>1) First could you define what you mean by, &#8220;Connected visual computing?&#8221;</p>
<p><strong>Connected Visual Computing is the union of three application domains: MMOG, metaverse, and paraverse (or augmented reality). These application domains are united through common technologies, especially 3D content creation, and common properties such as persistence, social interaction, rich presentation, and user-generated content with potentially complex behaviors.</strong><br id="jp8219" /><br id="jp8220" />2) One of the key aspects of fostering innovation in a new technology is recognizing the important paradigm shifts that it fosters. New forms of collaboration are one of the potentially most disruptive contributions of virtual worlds. However, I know you have gone a little further than most in thinking about how virtual worlds create new opportunities for non-linear, asynchronous collaboration. Could you explain some of your thinking on this? And why is developing thinking about the applications of virtual worlds something you, and thus Intel, have gotten involved with?</p>
<p><a href="http://www.ugotrade.com/wordpress/wp-content/uploads/2008/09/nonlinearpresentation.jpg"><img class="alignnone size-full wp-image-1692" title="nonlinearpresentation" src="http://www.ugotrade.com/wordpress/wp-content/uploads/2008/09/nonlinearpresentation.jpg" alt="" width="443" height="321" /></a></p>
<p><em>This slide is from Mic Bowman&#8217;s presentation &#8220;Non-Linear Presentation: or how to use virtual worlds for asynchronous collaboration.&#8221;</em></p>
<p><strong id="jp8223">Although Intel&#8217;s research agenda focuses on the hardware platform impact of CVC applications, it is necessary to understand the different usages that CVC enables. To that end, we built an experimental tool in OpenSim where we could explore new modes of collaboration designed exclusively for virtual worlds. That is, we didn&#8217;t want to look for ways to just translate our real world collaborative culture into the virtual world; we wanted to find out what unique forms of collaboration are enabled by virtual worlds. The first result is a tool we call non-linear presentations. </strong></p>
<p id="jp8226" class="western"><strong id="jp8227">In addition, Intel actively collaborates with Qwaq/Croquet to integrate information space visualization into their enterprise collaboration tool &#8220;Qwaq Forums&#8221;.</strong></p>
<p id="jp8228" class="western"><a id="jp8229" name="f4bn0"></a><br id="jp8230" />3) Why did Intel choose to engage with OpenSim?</p>
<p id="jp8233" class="western"><strong id="jp8234">We like OpenSim because it has the best logo. Go Hippos!</strong></p>
<p id="jp8237" class="western"><strong id="jp8238">Seriously&#8230; a year ago we started to look at open source platforms for virtual worlds. Open source platforms provide a completely functional framework that enables researchers to focus on specific innovations. My group wanted to look at scalability limitations in the distributed systems software architecture of CVC applications. We considered four candidate platforms (OpenSim, Croquet, Ogoglio, and Wonderland). We chose OpenSim because it was the most complete implementation of a persistent world. In addition, its development community was the most active. Further, its modular architecture makes it easier to experiment with new functionality. </strong></p>
<p id="jp8241" class="western">4) I know you have contributed code to OpenSim. Will Intel be putting more developers into OpenSim in the future?</p>
<p id="jp8245" class="western"><strong id="jp8247">Our focus is on investigating general technologies to support broad adoption of scalable CVC applications. That is, we want to understand the general problems that limit scalability across multiple CVC applications. However, it is important to validate general principles through specific implementations (even better, implementations with real end users). As a result, we expect to continue our collaboration with the OpenSim development community and with the emerging end-user community. </strong></p>
<p id="jp8248" class="western"><a id="jp8249" name="oage9"></a><br id="jp8250" />5) You mentioned you were doing some testing on OpenSim. Have you found specific areas in Intel&#8217;s domain that could significantly improve OpenSim performance?</p>
<p id="jp8253" class="western"><strong id="jp8255">Our research is still very early stage. In one area, however, we have some very promising early results. Script execution in CVC applications creates unique stress on the platform with potentially thousands of concurrently executing scripts. One method we are investigating appears to improve performance and scales to the number of hardware threads on the CPU.</strong></p>
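<p>Mic does not spell out the method Intel is investigating, but the general pattern he describes, thousands of scripts multiplexed onto a worker pool sized to the CPU's hardware threads, can be sketched as follows. This is my illustration only; the function names and workload are hypothetical, not Intel's or OpenSim's code.</p>

```python
# Hypothetical sketch: many concurrently executing simulator scripts
# dispatched onto a fixed pool of workers, one per hardware thread,
# instead of one OS thread per script.
import os
from concurrent.futures import ThreadPoolExecutor

def run_script_tick(script_id: int) -> int:
    # Stand-in for one script's work per simulator frame
    # (real OpenSim scripts are LSL/C#; this is illustrative only).
    return sum(i * i for i in range(1000)) + script_id

def tick_all(num_scripts: int) -> list:
    # Size the pool to the hardware threads the CPU exposes, so
    # throughput can scale with cores rather than with script count.
    workers = os.cpu_count() or 4
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(run_script_tick, range(num_scripts)))

results = tick_all(5000)
```

<p>The design point is the decoupling: the number of scripts (thousands) is independent of the number of workers (the hardware thread count), which is what lets performance scale with the CPU rather than collapse under per-script threads.</p>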
<p id="jp8256" class="western"><a id="jp8257" name="ls.2"></a><br id="jp8258" />6) Everyone, I think, agrees that OpenSim plus a next generation browser/viewer would be killer. And when we talked last you mentioned interest in the OpenViewer project. What do you see as being the best way forward on this very big task?</p>
<p id="jp8261" class="western"><strong id="jp8262">Clearly, experimentation with new communication protocols requires that we modify both the client and server. Licensing issues with existing viewers certainly complicate any effort to modify the viewer. </strong></p>
<p id="jp8263" class="western"><a id="jp8264" name="oage12"></a><a id="jp8265" name="oage11"></a><a id="jp8266" name="le0n"></a><a id="jp8267" name="xyru"></a><a id="jp8268" name="xyru0"></a> <br id="jp8269" />7) And what about the user experience in virtual worlds? What might be the contribution of browser-based viewers? What are your thoughts on this?<br id="jp8273" /></p>
<p id="jp8275" class="western"><strong id="jp8277">Browser-based viewers are a reflection of deployment challenges. Broad adoption of CVC applications requires that the industry address the problem of simplified deployment, whether through stand-alone viewer (or viewer platform) consolidation or through browser-based viewers.</strong><br id="jp8278" /></p>
<p id="jp8279" class="western"><strong id="jp8280">Software as a service is one approach that could address the deployment problem. Limitations in browser-based sandboxes must be addressed to deliver appropriate client performance and experience.</strong></p>
<p id="jp8281" class="western"><a id="jp8282" name="i2mc"></a><a id="jp8283" name="v_yj"></a><a id="jp8284" name="v_yj0"></a> <br id="jp8285" />9) Intel has Havok and a software ray tracing engine that scales to cores. The latter would really make for a completely new generation of virtual world viewers. Can you explain some of the innovations you see coming from this ray tracing engine? And will there be a special license offered to bring Havok into reach of the open source community? What role/impact will <span style="color: #000080;"><span style="text-decoration: underline;"><a id="jp8288" href="http://en.wikipedia.org/wiki/Larrabee_%28GPU%29">Larrabee</a></span></span> have?<br id="jp8289" /><br id="jp8290" /></p>
<p id="jp8291" class="western"><strong id="jp8292">Ray tracing is particularly helpful in making user-created content look good. Let me give you a concrete example&#8230; In a professionally authored 3D environment, objects can be placed with complete understanding of the lighting requirements. In any virtual world where users can create or customize content (including simple customizations like changing the placement of objects), lighting cannot be predicted (and as a result it is very difficult to create the appropriate shading for objects). Ray tracing (both as a runtime component and as an offline tool) can dynamically compute appropriate lighting, shadows and reflections.<br id="nlrg" /></strong></p>
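<p>Mic's point about dynamically computed lighting can be made concrete with a tiny shadow-ray sketch. This is my illustration, not Intel's engine: to decide whether a surface point is lit, cast a ray toward the light and test whether any object occludes it. Because the test runs against the scene as it currently is, it stays correct when users move objects, with no precomputed lightmap. The scene below is hypothetical.</p>

```python
import math

def ray_hits_sphere(origin, direction, center, radius):
    # Solve |origin + t*direction - center|^2 = radius^2 for t > 0,
    # assuming direction is unit length (so a = 1 in the quadratic).
    oc = [o - c for o, c in zip(origin, center)]
    b = 2 * sum(d * x for d, x in zip(direction, oc))
    c = sum(x * x for x in oc) - radius * radius
    disc = b * b - 4 * c
    if disc < 0:
        return False
    t = (-b - math.sqrt(disc)) / 2
    return t > 1e-6  # hit in front of the origin

def in_shadow(point, light_pos, spheres):
    # Cast a shadow ray from the surface point toward the light.
    # (Simplification: hits beyond the light are not filtered out.)
    d = [l - p for l, p in zip(light_pos, point)]
    length = math.sqrt(sum(x * x for x in d))
    d = [x / length for x in d]
    return any(ray_hits_sphere(point, d, c, r) for c, r in spheres)

light = (0.0, 10.0, 0.0)
blocker = ((0.0, 5.0, 0.0), 1.0)     # sphere between origin and light
shadowed = in_shadow((0.0, 0.0, 0.0), light, [blocker])  # True
lit = in_shadow((5.0, 0.0, 0.0), light, [blocker])       # False
```

<p>A rasterizer would need the shadow baked or approximated in advance; the per-point ray test is why ray tracing handles user-placed content where, as Mic says, lighting cannot be predicted.</p>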
<p id="nlrg2" class="western"><strong id="nlrg3">Havok is a fully owned subsidiary of Intel with an independent business model. Questions of Havok&#8217;s license should be directed to Havok. (see the link to the Havok evaluation and developers licenses)</strong></p>
<p id="nr3y" class="western"><strong id="nr3y0">As a compute engine, Larrabee is designed for compute loads that frequently occur in CVC applications, including physics (collision detection), spatialization of audio, and ray tracing. In usages where rich immersion, i.e. accurate physical simulation and photorealistic content, determines the quality of user experience, Larrabee can certainly improve the user&#8217;s experience.<br id="og40" /></strong></p>
<p><br id="im5x" />10) How do you see the landscape for virtual worlds five years out?<br id="im5x0" /><br id="im5x1" /><strong id="im5x2">Obviously any predictions on the future of an industry as immature as virtual worlds must be considered highly speculative. That being said, Intel&#8217;s vision is that the industry, as it matures, forms around a relatively small set of basic common building block technologies that are sufficiently general to enable many different usages. Examples we see emerging include identity, presence, text and voice communication, and asset/object management/storage.</strong><strong id="g6:1"> These basic building blocks can be put together with physics, game engines, and </strong><strong id="lpp5">other tools to address the needs of a particular usage.</strong><br id="jbcz" /><br id="jbcz0" /><br id="im5x3" /></p>
]]></content:encoded>
			<wfw:commentRss>http://www.ugotrade.com/2008/09/15/interview-with-mic-bowman-intel-the-future-of-virtual-worlds/feed/</wfw:commentRss>
		<slash:comments>7</slash:comments>
		</item>
	</channel>
</rss>
