You are reading a page from the free video eBook Heaven or Hell It's Your Choice.
Section 4 / Page 95
Future Software And Hardware
As the web develops into Web 2.0 / 3.0 and beyond, we should begin to see software used by the masses as an invisible workspace: in other words, the TV and the net will merge into one seamless service. This merging will bring about a new kind of world, in which people interact with both local and global communities. This is a revolution, and its impact on society should not be underestimated. As software and hardware develop to join up the world in this new way, we will increasingly see the emergence of computer systems able to correlate data from many different sources and tie that data together into a coherent, almost thought-like process.
What you have to understand is that programs are usually structured in a certain way: here is the data, followed by a set of procedures or programming instructions to process that data and derive a set value. Computer systems are now capable of studying data at a much higher level; the average I.T. illiterate, on the other hand, seems to think that machines can only perceive data in a yes-or-no fashion, which is not true. The rigid programming of old (procedural languages) is now being replaced by intuitive and adaptive systems, which can study some extremely complex data sets while interpreting that data in increasingly complex ways. This ability can be seen as an evolutionary process, with each new step (higher-level languages) taking us ever closer to the holy grail of computer science: self-aware A.I.
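The classic data-then-procedures pattern described above can be sketched in a few lines. This is only an illustration of the style, with invented function names, not code from any real system:

```python
# The procedural pattern: fixed data in, a fixed sequence of
# steps, a single derived value out. All names are illustrative.

def clean(readings):
    # step 1: discard obviously bad samples
    return [r for r in readings if r >= 0]

def average(readings):
    # step 2: reduce the data to one value
    return sum(readings) / len(readings)

data = [3.0, -1.0, 5.0, 4.0]   # "here is the data"
value = average(clean(data))   # procedures applied, deriving a set value
print(value)                   # 4.0
```

The program can only ever do what its steps spell out; the adaptive systems discussed next are different precisely because their behaviour is not fixed in advance like this.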
A computer system, through the use of VR, should be able to see and therefore come to watch, learn and then copy what it is seeing. In computer terms, this evolutionary step would mean that instead of having to write reams of programming code to describe a set of steps between actions, the software itself could copy those steps without any human intervention. This is the evolutionary process that programmers are now striving for: the ability to have a computer system evolve its own programming procedures somewhat autonomously. Neural networks show how computer systems can be taught to learn by associating meanings with patterns found in data sets, but this process is still limited at present by the sophistication of the technologies currently available.
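The idea that a network can be taught an association rather than being explicitly programmed with it can be shown with the smallest possible example: a single perceptron that learns the logical AND pattern purely from labelled examples, adjusting its weights whenever it makes a mistake. This is a toy sketch of the principle, not a model of any system mentioned in the text:

```python
# A single perceptron learns the AND pattern from examples
# instead of being given an explicit rule.

examples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

w = [0.0, 0.0]   # weights, adjusted by learning
b = 0.0          # bias
lr = 0.1         # learning rate

def predict(x):
    return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0

for _ in range(20):                   # repeated exposure to the data
    for x, target in examples:
        error = target - predict(x)   # learn only from mistakes
        w[0] += lr * error * x[0]
        w[1] += lr * error * x[1]
        b += lr * error

print([predict(x) for x, _ in examples])   # [0, 0, 0, 1]
```

No line of this code states the AND rule; the behaviour emerges from the data, which is the essential difference from the procedural style described earlier.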
The MNN concept was envisioned as a way of overcoming these technological hurdles by sharing out the necessary processing between all of the grid's components. This should have allowed enough data sets to be processed to give the operators of the network access to a whole new level of interactive real-time A.I., a level not even touched upon by current-day systems. Programming such a system would have become far too much for most programmers, so a modular design was to be employed, along with semi-autonomous, self-organising systems. The programming procedures and algorithms normally associated with such a system should ultimately have become meaningless, as the system steadily grew via its perceived ability to see and then copy what it was seeing.
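The sharing-out of processing that the MNN idea relies on can be sketched as a job cut into chunks, each handed to a separate component, with the partial results tied back together at the end. Here the "grid components" are simply local worker threads, and all names are invented for illustration; a real grid would distribute chunks to separate machines:

```python
# A rough sketch of fan-out/fan-in processing across grid components,
# using local worker threads as stand-ins for networked nodes.
from concurrent.futures import ThreadPoolExecutor

def analyse_chunk(chunk):
    # stand-in for whatever heavy analysis one node would perform
    return sum(x * x for x in chunk)

def grid_process(data, nodes=4):
    size = max(1, len(data) // nodes)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with ThreadPoolExecutor(max_workers=nodes) as pool:
        partials = pool.map(analyse_chunk, chunks)   # fan out to the "grid"
    return sum(partials)                             # tie the results together

print(grid_process(list(range(1000))))   # same answer as a single machine
```

The point of the design is that each component only needs to understand its own chunk; the coherence of the whole comes from the way the partial results are recombined.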
This can be seen as an input/output system with limited intelligence, but it is a reasonable example of how a system can respond to input and translate that input into action.
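Such an input/output system with limited intelligence can be sketched as nothing more than a mapping from perceived input to a chosen action. Everything here is an invented illustration, not part of any real system:

```python
# Translate perceived input into action, with a safe default
# for anything the system has never seen. Names are illustrative.

actions = {
    "obstacle ahead": "turn left",
    "path clear": "move forward",
    "low battery": "return to base",
}

def respond(perception):
    return actions.get(perception, "stop and wait")

print(respond("obstacle ahead"))  # turn left
print(respond("loud noise"))      # stop and wait
```

The intelligence is "limited" in exactly the sense the text means: the system responds correctly to inputs it knows, but has no way of inventing a new response on its own.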
How could it do this?
Ever since the first primitive human picked up a stone, used it as a tool, and then showed another primitive human how to do the same, a new form of visual communication and learning has existed. The building up of a visual communication language with any advanced software system was foreseen, at least by me, as an evolutionary process that could eventually be achieved. Once the system began to evolve, to comprehend data, or should I say to understand all forms of language in a similar way and format to ourselves, the speed of development should have increased dramatically. The rate at which biological systems learn and evolve would, I believe, pale in comparison. As long as the system had access to enough data, memory and processing power, there is no telling how smart such a system could become.
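The stone-tool story above is, at heart, learning by demonstration: one party shows, the other watches and then copies. A toy version of that watch-learn-copy loop can be sketched as recording which action a demonstrator takes in each situation, then reproducing the behaviour independently. All of the situations and actions here are invented for illustration:

```python
# A toy "watch, learn, copy" loop: observe demonstrated
# situation/action pairs, then imitate them later.

demonstration = [
    ("see stone", "pick up stone"),
    ("see nut", "strike nut with stone"),
    ("nut open", "eat nut"),
]

learned = {}
for situation, action in demonstration:   # the watching phase
    learned[situation] = action

def imitate(situation):                    # the copying phase
    return learned.get(situation, "observe more")

print(imitate("see nut"))   # strike nut with stone
```

This toy only replays what it saw; the leap the text is pointing at is a system that generalises from such demonstrations to situations it has never observed.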
I also believe that once a certain level of intelligence or awareness occurs within any sufficiently advanced A.I. system, then a kind of macroevolution may occur. The speed of this evolution is hard to predict, but theoretically it may be a spontaneous leap to a whole new level of awareness, outside the control of any intended design. I.e. the network suddenly becomes self-aware and then begins to learn at an unprecedented rate, beyond anybody's control, especially if the network/grid is global and the walled-garden principle is not strictly adhered to. If the system did macroevolve, then debugging it may become impossible, leading to a chain-reaction scenario. Just like in Terminator, SkyNet gets built and nobody is able to stop it until it's too late; fiction so often becomes reality, and the Dick Tracy watch is already old hat.