Sunday, 18 May 2014

Back to the future: processing speed

I have now come down from the heady heights of the Troublesome Inheritance, the dragnet of the seed, the dance of history, the interplay of culture and breeding, the rise and fall of degenerate civilisations, and other matters of Great Import, and can finally get back to the nitty-gritty: processing speed.

This may seem like something of a come-down, but after Arthur Jensen published his 1969 paper there was a pretty fearsome attack on the concept of IQ. Critics who were offended by the suggestion that racial differences in intelligence had partly genetic causes laid into every step in the chain of argument. (I joined them, on the typically narrow front of arguing that African Americans' poor results on non-verbal tests were due to lack of access to constructional toys, but that is another story.) One line of attack was to say that IQ results were “meaningless” because they related to an arbitrary collection of tasks which did not depend on any real underlying biological differences between people. This hit a sore nerve, because psychologists suffer from physics envy, and would like to find some fundamental units of behaviour. So there was renewed interest in finding something basic which, elaborated upwards, might contribute to the final result we call intelligence. Processing speed in its various forms was a strong candidate. Naturally, people had been working independently on reaction times since antiquity, which in the case of psychology means before 1889. Franciscus Donders was probably the first to use differences in human reaction time to infer differences in cognitive processing, finding that simple reaction time was faster than choice reaction time.
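Donders' subtractive logic can be sketched in a few lines. The reaction times below are invented purely for illustration, not data from any study; the point is only the arithmetic of the inference.

```python
# A minimal sketch of Donders' subtractive method, using made-up numbers (ms).
simple_rt = [212, 198, 225, 205, 190]   # respond the same way to any stimulus
choice_rt = [305, 288, 310, 295, 280]   # respond differently depending on the stimulus

mean_simple = sum(simple_rt) / len(simple_rt)
mean_choice = sum(choice_rt) / len(choice_rt)

# Donders reasoned that the choice task contains everything the simple task
# does, plus a discrimination-and-choice stage, so subtracting the means
# estimates the duration of that extra stage.
decision_time = mean_choice - mean_simple

print(f"Mean simple RT:  {mean_simple:.0f} ms")
print(f"Mean choice RT:  {mean_choice:.0f} ms")
print(f"Inferred decision stage: {decision_time:.0f} ms")
```

The subtraction assumes the added stage is purely insertable, which later researchers questioned; but as a first pass at decomposing a mental process into timed components, this is the whole trick.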

Therefore, I want to regenerate in you a sense of hope that the close study of some rather simple tasks can lead us to the promised land of understanding intelligence. To motivate you I briefly considered following the school of journalism which believes that the public can be led to science by showing that scientists are human. You know the sort of thing: Professor So and So, who rides a motorbike and plays in a rock group…. I eschew such vulgarity.

However, since my hosts took the precaution of inviting photographer Douglas Robertson to the meeting, for once I can try to interest you in determining the mind’s construction from the face. You may understand why Ian Deary, in his introductory remarks, referred to us as Twelve Angry Men. I will set out who they are, and then post the presentations which I have received so far. You can then pester the rest of them to send you their work, or at least their references.



Seated: Pat Rabbitt (Oxford), Mark Bastin (Edinburgh), Nick Mackintosh (Cambridge)

Standing left to right: Geoff Der (Glasgow), Thomas Espeseth (Oslo), Tim Croudace (York), James Thompson (UCL), James Goodwin (Age Concern), Stuart Ritchie (Edinburgh), Paul Verhaegen (Georgia), Rogier Kievit (Cambridge), Elliot Tucker-Drob (Austin), Ian Deary (Edinburgh).

Seated in an oil painting: Peter Higgs (Edinburgh)

At this stage you may wish to turn to other matters, and who could blame you?


  1. Andrew Sabisky, 18 May 2014 at 14:38

    As some wise sage, whose name I temporarily forget, said in the early decades of the 20th century;

    "One group seem to be using the Binet tests as a check on the school, while another seem to be using the school as a check on the Binet tests. Don't we go round in a circle?" - or words to that effect. The implication, of course, is that tests are preferable where there is a definite yardstick of measurement, an absolute zero. So here we are today. For decades research has been hampered by small samples of highly restricted range, and it is hard to know to what extent the true relationship between processing speed and psychometric intelligence has been significantly clarified over the years.

    Future updates eagerly awaited!


  2. Very interesting.


  3. Aw, nobody looks angry (a few impatient, perhaps) :)

    Test publisher/standardization manual data are great resources, BUT the test industry has "tainted" processing speed, because it often refers to silly low-to-moderate g-loaded paper-and-pencil tasks: coding, symbol search, crossing out the biggest number in a row as fast as one can, quickly circling the two numbers that are the same, and so on.

    The psych-public associates "processing speed" with Wechsler's (sad little) version, so "processing speed" has been "branded", and the better, higher-g reaction time tests have been pushed to the background, until publishers can figure out how to create and control income from them.

    In the meantime "processing speed" doesn't get the respect it's due, because it gets equated with coding-type tasks... but those men look impatient enough to change all this :) Will the real "processing speed" please stand up!