You are reading a page from a free video eBook called Heaven or Hell It's Your Choice, for more information click on the website button above.
Section 3 / Page 76
Skip this page if you're not into tech.
Multifunction platforms, and the convergence of online systems capable of passing data seamlessly between such devices, should have allowed the network to interface with many different devices capable of streaming information direct from source. Platforms such as the next generation of mobile phones, enabled with 3G and others and all moving towards UMTS and beyond, should have allowed the connected thin client devices to utilise a lot of the facilities outlined.
Servers capable of handling information from more than one platform are already here, and middleware code has already been written to allow multiple platforms to interact together online; this code will allow for a totally generic TCP/IP stack (also see I/O Acceleration Technology). The uptake of IPv6 was also seen as being needed for the full expansion of the database's capabilities, especially when you consider the number of connected devices the web will have. At present the IP protocol (IPv4) uses 32-bit addresses; IPv6 uses 128-bit addresses, so allowing many more addresses to be handled. (For the layperson, each device connected to the net is assigned an IP address, just like a phone number, so every person or device connected can talk to every other device connected.) 128-bit addressing simply means that the net will be able to handle vastly more connected devices than 32-bit can.
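To put some numbers on the 32-bit versus 128-bit point above, here is a small sketch using Python's standard ipaddress module (just one convenient tool for this, not anything the network described here would have used); the example addresses are invented for illustration:

```python
import ipaddress

# IPv4 addresses are 32 bits, IPv6 addresses are 128 bits.
ipv4_space = 2 ** 32
ipv6_space = 2 ** 128
print(f"IPv4 address space: {ipv4_space:,}")   # about 4.3 billion
print(f"IPv6 address space: {ipv6_space:,}")   # about 3.4 x 10^38

# Both are just numbers behind the familiar notation, like phone numbers:
addr_v4 = ipaddress.IPv4Address("192.168.0.1")
addr_v6 = ipaddress.IPv6Address("2001:db8::1")
print(int(addr_v4))   # the 32-bit number behind the dotted form
print(int(addr_v6))   # the 128-bit number behind the colon form
```

Every extra bit doubles the address space, which is why the jump from 32 to 128 bits is so enormous.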
Current software trends show that developers are trying to achieve a software architecture which is totally independent of hardware; Java and the plans for Dot.Net can be seen as examples of this principle. This was seen as being one of the major goals in the evolution of the database's development, i.e. the network's databases were to have been capable of being installed on any capable hardware, whilst also being able to interface to any device, utilising a universal code engine or a virtual processor type design.
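To give a flavour of the "virtual processor" idea: a program compiled to a neutral bytecode can run on any machine that hosts a small interpreter, which is roughly how the JVM and the .NET runtime achieve hardware independence. A toy sketch (the opcodes and layout here are invented for illustration, not any real VM):

```python
# A toy stack-based virtual machine: the same bytecode runs anywhere
# this interpreter runs, independent of the underlying hardware.
PUSH, ADD, MUL, PRINT = range(4)

def run(bytecode):
    stack = []
    i = 0
    while i < len(bytecode):
        op = bytecode[i]
        if op == PUSH:          # push the next value onto the stack
            i += 1
            stack.append(bytecode[i])
        elif op == ADD:         # pop two values, push their sum
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == MUL:         # pop two values, push their product
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
        elif op == PRINT:       # show the value on top of the stack
            print(stack[-1])
        i += 1
    return stack

# (2 + 3) * 4 expressed as hardware-neutral bytecode:
program = [PUSH, 2, PUSH, 3, ADD, PUSH, 4, MUL, PRINT]
run(program)   # prints 20
```

Real VMs add types, garbage collection and just-in-time compilation, but the portability trick is the same: ship the bytecode, host the interpreter everywhere.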
XML (eXtensible Markup Language) is crucial to Microsoft's Dot.Net plans. So this was seen as being good news for the developers, as the hopeful uptake of these formats by most net appliances should have allowed a lot of the scalability problems to be overcome. All the file storage formats and metaformats now being introduced should allow systems developers a much easier time of it. Also check out the Java Virtual Machine (JVM), C#, ASP.NET, N1, Visual Studio.Net etc, if you're into this type of thing. If you wish to understand this stuff a little better, then check out this >link<, which explains a bit of the history of XML, SGML etc.
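XML's appeal for passing data between unlike devices is that any platform can parse the same self-describing text. A minimal sketch using Python's standard xml.etree library (the document, element names and attributes shown are invented for illustration):

```python
import xml.etree.ElementTree as ET

# A self-describing message that any XML-capable device could parse.
message = """
<device id="handset-01">
    <capability name="streaming" enabled="true"/>
    <capability name="haptics" enabled="false"/>
</device>
"""

root = ET.fromstring(message)
print(root.get("id"))                        # handset-01
for cap in root.findall("capability"):       # walk the child elements
    print(cap.get("name"), cap.get("enabled"))
```

Because the tags name their own contents, a server and a thin client written on completely different platforms can exchange this text without sharing any binary format.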
Eventually haptic interfaces (force feedback devices) will allow end users to be given the sensation of touch for whichever object or environment they are interacting with. This type of interface will come into its own when VR systems become mainstream and big business latches on to them as yet another way to make money. The processing of tactile information was seen as a very important step in the MNN's full development. How users interacted with tactile objects was seen as a way of teaching the MNN how we perceive and use such objects. This should eventually have allowed the MNN to control robotic devices.
Sony:- is now calling for partners to help it leverage its Playstation 2 and 3 consoles as home gateways to the Internet. Sony is actively looking to work with anybody who wishes to commercialise the Playstation, and is already offering online games etc directly to its PS 2 and 3 customers. Sony's vice president of Computer Entertainment said at the CEATEC electronics show in Tokyo that they were interested in talking to anybody who wished to license the Playstation's core technologies. Unlike the X-Box Live system, which uses the big M's own central gateway for online gaming etc, Sony has let 3rd party developers design their own solutions to the online gaming problem, whilst supplying developers with a basic 'open' infrastructure. The Playstation 3 is building on this success and shows the future for gaming consoles: in other words, many microworlds running online. The MNN was seen as a massively multiplayer / multi-user environment, containing many conjoined microworlds.
Most ISP / ASP providers have installed LAN-quality game servers capable of handling many simultaneous connections, and they are using these systems to offer a wide range of online services. The industry's move toward blade servers shows how it is tackling the hosting needs within these emerging ASP markets, with every hardware manufacturer already jumping on this bandwagon. The LAN-quality game servers now being employed to host the type of content described are becoming the industry norm; this will allow an increasing number of online users and 3D environments to be hosted and interacted with in many different ways. Cell blade servers are already being used.
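For a rough sense of how one server process juggles many simultaneous connections, here is a minimal sketch using Python's standard asyncio library (the line-based echo protocol and the port number are invented for illustration; real game servers layer state, matchmaking and game logic on top of this kind of loop):

```python
import asyncio

# Each connected client gets its own lightweight task; one process
# can therefore service many simultaneous connections concurrently.
async def handle_client(reader, writer):
    while data := await reader.readline():    # one message per line
        writer.write(b"echo: " + data)        # reply to this client only
        await writer.drain()
    writer.close()

async def main():
    server = await asyncio.start_server(handle_client, "127.0.0.1", 8888)
    async with server:
        await server.serve_forever()

# asyncio.run(main())   # uncomment to run the server
```

The same pattern, scaled across racks of blade servers, is what lets a hosting provider keep thousands of players in one shared environment.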
BT Openworld, Yahoo and Games Domain (one of the largest online games sites), along with its user base, have all in effect joined up; this is an alliance that is mutually beneficial to all parties. AOL and Time Warner have also partnered, showing that the larger ISPs are very aware that content will become the single most important factor in ISP or ASP growth. As I said, the future is all about content over the net - charge for access; this is a situation playing itself out all over the industry (see Microsoft acquires Rare). Big players in the I.T. marketplace are all looking to snap up any developers that can deliver good content, so as to expand their own interests in this coming battle for the virtual real estate of the future. (See Lucas and HP.)
The last bit was written in 2005ish but it is still relevant in so many ways.
Please report any problems you see on this page - such as broken links, non-playing video clips, spelling or grammatical errors etc - to:-
I don't have the time to individually respond to every email, so I thank you in advance for your help.