Friday, May 11, 2007

Grid Computing Slowly Falls From Grace

I spoke at The 451 Group's ECS Summit, London Regional Roundtable, on Thursday, May 10, 2007 at the Brewery at Chiswell Street in London. The focus was supposed to be on Grid computing, which has been the subject of much research by the 451 Group over the last year.

I noticed the trend in early 2006 that the term Grid was losing its market appeal. It started disappearing from job titles, to be replaced by High Performance Computing (HPC) - this has now become mainstream, with at least two investment banks (IBs) having the post. Two other trends were noted: data architecture is now high on the agenda, and low-latency messaging is seen as crucial to meeting the challenges of high transaction volumes.

So where's Grid computing heading? The general consensus was that the technology had not lived up to its promises in terms of performance; in particular, manageability, security and the lack of a data architecture seemed to be the prime areas of concern. Grid in IB needs to grow up - in essence, despite the marketing hype, current grid offerings are little more than cluster computing applied to a few niche areas in IB such as CDO/CDO-squared (btw check out www.cdo2.com - run by an acquaintance of mine - a real web service with real customers running on BLAST).

The reality is that most jobs can now be done on multi-core, large-memory machines. An 8-core/4-CPU machine with 64GB of memory is a lot cheaper than an investment in grid - and you get a lot of processing power for not a lot of cash. Nodes can be chained together with 1GbE NICs and a switch to produce an effective HPC cluster, and you can either use your own threaded app with memcached or have a play with Jini.
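
To give a flavour of the threaded-app-plus-memcached option, here's a minimal sketch in Python. It assumes the python-memcached client and a memcached daemon running on each node; the hostnames and the pricing routine are purely illustrative, not a description of any particular setup.

    import threading
    import memcache  # python-memcached client (an assumption; any memcached client would do)

    # Pool the memory of several commodity nodes behind one cache client.
    # Hostnames are illustrative.
    mc = memcache.Client(['node1:11211', 'node2:11211', 'node3:11211'])

    def expensive_pricing_model(trade_id):
        # Stand-in for whatever CPU-bound work the real app does.
        return sum(i * i for i in range(100000)) % 997 + trade_id

    def price_trade(trade_id):
        # Check the shared cache before recomputing; results are visible to
        # every thread on every node that points at the same memcached pool.
        key = 'px:%d' % trade_id
        result = mc.get(key)
        if result is None:
            result = expensive_pricing_model(trade_id)
            mc.set(key, result, time=300)  # cache for five minutes
        return result

    # A plain threaded app: one worker per core; the cache absorbs shared state.
    threads = [threading.Thread(target=price_trade, args=(i,)) for i in range(8)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()

Nothing exotic there - which is rather the point: a few fat boxes, a switch and a distributed cache cover a surprising amount of what the grid vendors are selling.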



Thursday, May 10, 2007

BP's Black Swan

I was happy to discover Nassim Nicholas Taleb's new book waiting for me when I arrived home yesterday evening. Although I've recently been experimenting with vignettes like this mini-blog entry (I'm standing on the train, scribing with my thumbs, so please forgive any errors or incoherence), I think I'd rather have something to read.

I heard an interview with the author on EconTalk this weekend and ordered the book on Sunday. Very nice service from Amazon, if a bit pricey.

Recently there has been some indignation, mostly from the yellow press, alleging that Lord Browne had callously decided £10m to be the value of a human life. I suppose some would say that human life is priceless, but this conveniently ignores the fact that industrial accidents do occur, and they have a financial impact that must be considered.

More to the point is that BP's analysis of the likely consequences of an industrial accident was an utter failure. The reason for this grossly uninformed statement takes us back to the subject of Taleb's book: the vast majority of industrial accidents result in little or no loss of life. A large firm collecting data on the impact of these events would reasonably expect a large explosion at a major refinery in a wealthy industrial centre to be quite unlikely - so unlikely, in fact, that we can be fairly sure it was unanticipated, in spite of widely available reports of dangers at the affected plant site.

I am sure this sounds improbable to most people. Why would a large company look purely at a statistical model to account for risk, and ignore the widely-reported concerns of their employees? From a purely technical perspective, it's all just data.

The crux of the matter is that most risk models based on statistical assumptions are (for lack of a better word) wrong. They assume that the probability of an event occurring in the future can easily be derived as a function of past occurrences. Even if the concerns of employees could be quantified in a meaningful way, the risk calculations would become intractable.
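
To make that concrete, here's a toy sketch (the figures are invented, purely for illustration) of a frequency-based risk estimate: any loss larger than anything in the historical record comes out with probability exactly zero.

    # Invented numbers: fatalities recorded per past incident.
    past_losses = [0, 0, 1, 0, 2, 0, 0, 3, 0, 1]

    def empirical_probability(threshold, history):
        # P(loss >= threshold), estimated only from past occurrences.
        return sum(1 for loss in history if loss >= threshold) / float(len(history))

    print(empirical_probability(2, past_losses))    # 0.2 - has happened before
    print(empirical_probability(15, past_losses))   # 0.0 - the model's black swan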

So maybe the indignant voices have a point after all. Putting too much faith in mathematical models is foolhardy: many improbable events are more likely than our models can forecast, if only because they've never happened in the past. The black swan may be a tired example, but it's easy to understand.

More on Taleb's book once I've actually read it...

Wednesday, May 09, 2007

Cognitive Dissonance

My current definition of cognitive dissonance comes from reading the Economist's Style Guide on the way in to the office, and Communications of the ACM on the way home.

With some effort, there is some meaning to be extracted from the latter journal, but the violence inflicted upon the English language by the computing profession is severe.

I know I am a victim of this social phenomenon. The Style Guide informs us that "online community" means "geeks and nerds". I take exception to this, but given my colonial education in science and engineering, I am poorly equipped to explain why I am neither a circus performer who bites the heads off chickens nor a character in a derivative American sitcom.

More serious is the droning language describing initiatives in information technology to obtain an understanding of the role of new technology and its systemic impacts of automation, notwithstanding the achievements we have heretofore achieved regarding the human impacts (CACM March 2007, p. 37).

With all due respect, I say WTF?

The state of information technology is confusion and crisis, as it has been for the past 30 years.

How do we fix it? How about if IT practitioners (we aren't professionals, thank goodness) start saying what they mean and meaning what they say?

The software crisis is a crisis in human communication. Let's move out of this state of perpetual war and see if we can make some progress in this industry.