

You are reading a page from a free video eBook called Heaven or Hell It's Your Choice; for more information, see the book's website.

Section 3 / Page 75

Avatars or digital people could have used a relatively new approach to 3D modelling to produce lifelike images; it is the most impressive rendering technique I have come across to date. Developed at Stanford University by Henrik Wann Jensen, it goes by the name of subsurface scattering and appears to be an extension of photon mapping techniques. For all things 3D, check out SIGGRAPH.
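For the technically curious, here is a rough Python sketch of the dipole diffusion approximation usually associated with Jensen's subsurface scattering work. It only computes the diffuse reflectance profile R_d(r), not a full renderer, and the material coefficients in the example are made-up illustrative numbers rather than measured skin data, so treat it purely as an illustration of the idea that light entering a translucent surface bleeds back out some distance away.

    import math

    def diffuse_reflectance(r, sigma_a, sigma_s_prime, eta):
        """Sketch of the dipole diffusion profile R_d(r) used in subsurface
        scattering: how much light diffuses back out of the surface at a
        distance r from where it entered (r in the same units as the
        scattering/absorption coefficients)."""
        sigma_t_prime = sigma_a + sigma_s_prime              # reduced extinction
        alpha_prime = sigma_s_prime / sigma_t_prime          # reduced albedo
        sigma_tr = math.sqrt(3.0 * sigma_a * sigma_t_prime)  # effective transport coefficient

        # Internal reflection term from the refractive index mismatch (eta).
        f_dr = -1.440 / (eta * eta) + 0.710 / eta + 0.668 + 0.0636 * eta
        a_const = (1.0 + f_dr) / (1.0 - f_dr)

        z_r = 1.0 / sigma_t_prime                    # depth of the real point source
        z_v = z_r * (1.0 + 4.0 * a_const / 3.0)      # height of the mirrored virtual source

        d_r = math.sqrt(r * r + z_r * z_r)           # distance to the real source
        d_v = math.sqrt(r * r + z_v * z_v)           # distance to the virtual source

        # Both sources contribute to the light diffusing back out at distance r.
        real = z_r * (1.0 + sigma_tr * d_r) * math.exp(-sigma_tr * d_r) / d_r ** 3
        virt = z_v * (1.0 + sigma_tr * d_v) * math.exp(-sigma_tr * d_v) / d_v ** 3
        return alpha_prime / (4.0 * math.pi) * (real + virt)

    # Example: reflectance falls off with distance for a made-up translucent material.
    for r in (0.1, 0.5, 1.0, 2.0):
        print(r, diffuse_reflectance(r, sigma_a=0.03, sigma_s_prime=1.0, eta=1.3))

It is this soft, distance-dependent bleed of light through the material that gives rendered skin its lifelike quality, rather than the hard plastic look of purely surface-based shading.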

The management of complex 3D environments is already being done; developers working in the field of massively multiplayer online role-playing games (MMORPGs) are showing the way (e.g. EverQuest II). The virtual real-estate arena already has a lot of big players, such as DreamWorks and ILM, all of whom are designing virtual real-estate. This also means a lot of new software tools are being created; check out Maya and Mental Ray, which were used to create some of the special effects in the Transformers movie. Sony and NVIDIA have also paired up to develop virtual real-estate, and James Cameron's Avatar project marks the start of the true VR revolution.


 

The combination of VR environments and speech recognition would have required a combination of neural net algorithms, but the emergent behaviour within the MNN was to have been studied by inbuilt mechanisms, so helping to train and optimise the network and grid system. The databases were to have used some simple transitivity mechanisms or algorithms coded into their structure to help them self-organise their own data, such as adding one to a weight (in simple programming terms, a = a + 1) every time a user clicked on a hyperlink. In other words, the more a link was clicked on, the higher that number would become, denoting its popularity. The employment of even simplistic transitivity mechanisms, such as the one just given, would have allowed the system to automatically optimise the links and, to a large degree, self-organise the data contained. This approach can already be seen in some web-based search engines, such as Google's PageRank system (micro-democracy at work). The push and pull mechanisms employed were seen as being steadily increased in sophistication, allowing both the user and the system to benefit.
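As a minimal sketch of the kind of mechanism described above (the class name, URLs and structure here are invented for the example, not taken from any real system), a link store could simply add one to a weight on every click and then rank its links by that weight:

    from collections import defaultdict

    class LinkStore:
        """Toy sketch of a self-organising link table: every click adds one
        to a link's weight (a = a + 1), and the most-clicked links rise to
        the top when the store is asked to rank them."""

        def __init__(self):
            self.weights = defaultdict(int)

        def record_click(self, url):
            # The simple mechanism from the text: a = a + 1.
            self.weights[url] += 1

        def ranked_links(self):
            # Popularity ordering emerges purely from user behaviour.
            return sorted(self.weights, key=self.weights.get, reverse=True)

    # Example: a few users click around, and the data organises itself.
    store = LinkStore()
    for url in ["/physics-sim", "/avatar-demo", "/physics-sim"]:
        store.record_click(url)
    print(store.ranked_links())   # ['/physics-sim', '/avatar-demo']

Nothing in the store is hand-ordered; the ranking is entirely a by-product of user behaviour, which is the same micro-democratic principle behind systems like PageRank, only applied to clicks rather than to links between pages.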

More complex algorithms would have been needed to study all of the interactions taking place between the end users and any of the objects held within the databases. The data flow was seen as being highly dynamic in nature, with automatic mechanisms used to control most aspects of the system's development. Once a simulation had been tested and shown to be stable enough for public use, it could have been added to the database network. The sharing and processing of relevant information in such a large and powerful network, in real time, was seen as a key ingredient in making the macroscopic neural network concept work. This type of collective intelligence is obviously being worked on in many quarters, but you should check out the work carried out by Bollen & Heylighen (1996; 1999).

Web 2.0 / 3.0 concepts in the making; if you have the time and the inclination, press play.


Please report any problems you see on this page - such as broken links, non-playing video clips, spelling or grammatical errors etc. - to:

problems_or_suggestions@heaven-or-hell-its-your-choice.com

I don't have the time to respond individually to every email, so I thank you in advance for your help.



Author Alan Keeling