
PREVIOUS PAGE |_Page index_| Website | Review Page | Journey | Donate | Links | NEXT PAGE

If you jumped to this page from another page within the eBook, then click your browser's back button to return to it, rather than the previous or next buttons above.

You are reading a page from a free video eBook called Heaven or Hell It's Your Choice; for more information, click on the website button above.

Section 4 / Page 105


The human race has proven many times in its past that it has a propensity for resolving its differences through war and conflict, but the weapons at the disposal of future armies, terrorists and governments could be more lethal than anything that has yet been developed. Nanotechnology and its proliferation could make nuclear war look like a nice option. Nanotechnology and genetic manipulation could be used to make weapons that target only certain ethnic groups, or that are even programmed to seek out and kill individuals, whilst leaving absolutely no trace of how that person died and no trace of who committed the crime. (Research Today - Technology Tomorrow?)

The proliferation of nano and A.I. weapons could become the most devastating disaster that could ever happen; unless the powers that be wish to control and watch every single person on the planet, there may be no defence against a single lone nut using a nano weapon. Nano weapons would be very easy to move about, if you know what I mean. Take into consideration that the smallest publicly known nuclear devices are the Russian-made briefcase bombs, then consider the fact that they have managed to lose a few; this does not bode well for the future (also see mini nukes, Davy Crockett & Gamma-ray weapons). Nano weapons will, by their very nature, be extremely small. This will make control of these types of weapons almost impossible, especially if nanotechnology is introduced by competing market forces, into a world run the same way it is today. Imagine the future... welcome to the Cuban nano crisis, or the Al-Qaeda super nano virus scare, or even more likely, North Korea manages to acquire a commercial nano assembler and then manages to weaponise it. Nano weapons could be airborne or carried by insects; the possibilities are endless.

Also, the possibility of one lone nut causing the complete demise of the whole human race is not beyond the realms of probability. The following situation could develop: a single lone nut develops a nano weapon that travels using the internet as a carrier. At this stage it could travel purely as computer code, just like a normal computer virus. Then, whenever that virus came across a nano-assembling robot, it could infect the robot's controlling software, instructing it to churn out some type of deadly agent. This means that the rather clever lone nut may be nothing more than a good software designer; without ever having access to his or her own nano-assembling equipment, they could end up turning the whole world into grey goo. The other possibility is that the lone nut could turn out to be a slightly frustrated or annoyed A.I. program with a grudge. The response time between a problem like this being detected and it being stopped could be far too slow, and then that would be the end of that, as they say. Modern-day computer viruses can be seen as a forerunner to this type of problem, and once such a weapon is developed, its spread and customisation, even by your average script kiddie, is a frightening prospect.
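To see why the response time matters, here is a toy calculation of exponential self-replication. The one-hour doubling time is purely an illustrative assumption (no real figure is implied); the point is only that any doubling process quickly outruns a human-speed reaction:

```python
# Toy model: exponential self-replication with a fixed doubling time.
# The one-hour doubling time is an illustrative assumption, not a real figure.

def replicator_count(hours: float, doubling_time_hours: float = 1.0) -> float:
    """Number of replicators after `hours`, starting from a single unit."""
    return 2 ** (hours / doubling_time_hours)

# One day in, a single replicator has already become millions;
# two days in, trillions.
print(f"{replicator_count(24):.0f}")  # 16777216
print(f"{replicator_count(48):.0f}")  # 281474976710656
```

Under this assumption, detecting the outbreak even a day late means facing millions of copies rather than one, which is the core of the "far too slow" worry above.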

Hackers show that no security system is totally safe and secure, and if it were an inside job, then there would be no real way of stopping the problem. This happens all the time in big business, but businesses don't like talking about it, because it upsets their customers. If this is a big problem now, then imagine all of the people currently being trained to become I.T. literate; this could be a big mistake, as it may just open up more possibilities for techno-terrorism. The average person sees technology as a way of making their lives easier, but there are a lot of different types of people in society, and as far as I'm concerned, making everybody I.T. literate is kind of like handing everybody in society a loaded gun.

America has a strong gun culture, and it doesn't take a genius to see the problems they're having, so in a similar way, if you make everybody I.T. literate, then you could have just as many problems, but with a nation of bad hackers. The problem with bad hackers is that you need good hackers to stop them, so if we ban hacking completely, then most of the bad hackers will end up winning. (Too much Die Hard 4?)

There has always got to be a balance struck between good and evil. If the black hats win, then we could all end up losing, because unlike a bunch of people armed with guns, a bunch of bad hackers armed with a bit of black code, especially in a nanotechnology future, could end up doing considerably more damage to a society and its economic structures than any of their gun-toting equals. Also check out some of the extreme measures now being planned by the authorities to prevent terrorism / hacking etc. - EFF: Chilling Effects of Anti-Terrorism - plus check this: hacker to be jailed for life under a new Homeland Security act. I say lock up anybody who has a C compiler before information warfare takes off big time; now where did I put my scanners and sniffers... buffer overflow?

As the title of this page is war, I'll quote Einstein: "I know not with what weapons World War III will be fought, but World War IV will be fought with sticks and stones."

Just remember, it is your world too, and if we allow the powers that be to build such things, then we only have ourselves to blame.

Call me an idealist, but I've always thought wars would be much harder to fight if arms manufacturers didn't produce weapons.
Got a problem? Then throw a rock; it beats blowing up the whole planet.

The people at the top have got to be made to understand that it is they who could end up being the biggest threat to us all.

Their philosophy of always having to develop better weapons cannot be sustained in an A.I. / nanotech future; it is far too dangerous.

By the way, "better weapons" is an oxymoron, if you know what I mean.

Also, imagine if Amnesty International or the International Red Cross etc. had had as much money spent on them as weapons have.

I just wonder what the world would be like. Sorry, I forgot - there isn't any profit in it?




Please report any problems you see on this page - such as broken links, non-playing video clips, spelling or grammatical errors etc. - to:-

I don't have the time to individually respond to every email, so I thank you in advance for your help.



Author Alan Keeling ©