Next in line from Horn’s fantasy factory: The Web Bot project. The sum of this one is that a program developed in the 1990s to track the stock market allegedly “took on a mind of its own” in 2001 and started making other predictions, such as that a “life-altering event” would take place within 60-90 days of September 11. It also supposedly predicted the anthrax attacks, the tsunami, and Hurricane Katrina, and is now predicting “global devastation” for December 2012.
Well, this one’s not hard, and it’s the same problem as with Horn’s use of the “Petrus Romanus” “prophecy.” “Life-altering event”? And a 90-day window? If you can’t find a “life-altering event” with that much leeway, you’re not trying very hard. So the vagueness of the predictions is strike one.
Strike two is incoherence. We wouldn’t mind applying the Deuteronomic prophet test to the Web Bot, but for the most part we can’t, because it’s almost impossible to know what it’s predicting. Maybe we should stone it anyway, because chances are a stoning would only improve its coherence. Some have said the Web Bot has its hits and misses, but who can tell? Check out this sample from the project’s own blog, with “predictions” for July 31-August 9 of this past year (one bit of profanity modified):
the [bottom] becomes [visible] off in the [near distance] to [the officialdom minions]. The data suggests that [offcialdom] here in the USA starts sh***ing the small sharp ones as the [new dawn] reveals the [chaos] and [smoking rubble] of their [hopes] of [sustaining] the [economic illusion] until the time of the [political illusion (election)]. The data sets are pointing toward an [understanding] from [offcialdom] that will come [in the early morning light] during a [horrific night] here in USA that is further described as [changing the game] as well as [affecting the (olympic) Games]. As an aside, it warms the cockles to know that what 'we' do here in the usa causes such [vexation] to [queen/royal lines] in [celtic lands] that [images of spittle] and [yellow eyes] will be [visible].
When I worked for the Florida prison system, I knew inmates on heavy psychiatric meds who were more coherent than that. You have to wonder why Horn didn’t reprint some samples of the Web Bot’s predictions. Well, actually, no, you don’t have to wonder at all.
Strike three has to do specifically with 2012, though it also relates to how the Web Bot actually works. As usual, debunkers have been at work on this, and there’s a good one here with a worthy quote. The author is a computer engineer, and between his expertise and mine as an information specialist, you have two sources that can speak to this with some authority.
So does it work? Are Web Bots reliable? Well, it’s kind of hit and miss. Sometimes you’ll see things predicted correctly and sometimes not. I think the project is a really good initiative and can lead to great things, but at the moment I’m more under the impression that they interpret the results the same way we interpret Nostradamus’ quatrains. What I mean is that what the Web Bots are getting out of the internet while crawling is not that clear, and when an event happens a couple of days later, we then find a way to relate the data to it.
Being in the computer engineering domain, I think I can see where the Web Bots will succeed and where they will fail. There are fields where I believe the Web Bots can predict things, and there are fields where they can’t. What are these fields? Well, essentially, anything “man-made” could be predicted in some way, and anything man has no control over can’t be predicted. This is for the plain and simple reason that the Web Bots crawl the Internet for data, and the Internet is man-made. So, the only data that can be collected is data written by people/governments/companies, etc. I don’t see how you can predict a natural disaster or anything like that by simply crawling the Web. The only thing you can get by crawling is facts or opinions, nothing else. The only way I can see them predicting natural disasters or anything not man-related is if the Web Bots crawled 3,000 blogs/websites written by specialists in a certain domain, and those all pointed toward a similar conclusion.
What about 2012 (read my article on 2012) and the Web Bots? As I said, I don’t see how a computer can figure out what’s going to happen in 2012 simply by visiting websites published by real people. More Web Bot data pointing toward 2012 just means more and more people are publishing stuff about 2012 and the end of the world. Remember, the only thing they can crawl is the internet, and what you find on the internet was created by real persons, not God. They will surely get a strong correlation between 2012 and the end of the world; there’s a ton of websites talking about it.
So, in the case of 2012, what we have with the Web Bot amounts to an attempt at self-fulfilling prophecy. Of COURSE the Web Bot sees devastation in 2012 – it’s finding a lot of people saying that is what is going to happen. This is like Johnny Carson’s “toilet paper shortage” routine (link below), and it is easy to see why the Web Bot might be able to “predict” certain things – especially human activities – once you figure out (rather, interpret!) what it is saying. And Horn is too insensate to figure out this very simple point.
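To see how trivial this kind of “prediction” is, here is a toy sketch in Python. It is emphatically not the actual Web Bot code (whose internals have never been published); the page texts and the function name are invented for illustration. It simply counts keyword mentions across a batch of crawled pages and declares the most-mentioned topic the “prediction” — which is exactly why a crawler swimming in 2012 doomsday blogs will “foresee” doom in 2012.

```python
# Toy illustration only -- NOT the real Web Bot, whose code is not public.
# Shows that a frequency-based "oracle" just echoes whatever people
# are already writing about.
from collections import Counter

# Hypothetical stand-ins for crawled web pages.
pages = [
    "the world will end in 2012, the mayan calendar says so",
    "2012 doomsday predictions are everywhere online",
    "stock market update: earnings season begins next week",
    "2012 end of the world theories debunked, again",
]

def naive_forecast(pages, keywords):
    """Count keyword mentions across pages; the most-mentioned
    keyword becomes the 'prediction'."""
    counts = Counter()
    for page in pages:
        for word in keywords:
            counts[word] += page.lower().count(word)
    return counts.most_common(1)[0]

topic, hits = naive_forecast(pages, ["2012", "earnings"])
print(topic, hits)  # "2012" wins only because people keep writing about it
```

The output reflects input popularity, not the future: feed it pages about a toilet paper shortage and it will “predict” a toilet paper shortage, just as in the Carson routine.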
In the words of the great Homer Simpson – “D’oh!”