Early life intelligence and adult health
BMJ 2004;329:585 doi: https://doi.org/10.1136/bmj.329.7466.585 (Published 09 September 2004)
- G David Batty, Wellcome advanced training fellow
- Ian J Deary, professor of differential psychology
- MRC Social and Public Health Sciences Unit, University of Glasgow, Glasgow G12 8RZ
- Department of Psychology, University of Edinburgh, Edinburgh EH8 9JZ
Tests of psychometric intelligence (IQ-type tests), introduced about a century ago, have traditionally been used in educational and workplace settings. Only in the past decade has their predictive capacity for non-psychiatric health outcomes been examined, in the rapidly evolving field of cognitive epidemiology. The health outcome most commonly examined in relation to differences in measured intelligence is all cause mortality. Cohort studies of older people indicate that those with higher intelligence scores experience a lower risk of mortality.1 A problem with examining the association between IQ and mortality in these age groups is the cognition lowering effect of comorbidities, which might itself explain the apparent protective effect of high intelligence scores against premature mortality.
To address the comorbidity issue, investigators have recently started to link results from intelligence tests taken in early life with adult mortality. Findings are highly consistent—people with a higher IQ in childhood live longer.2–8 In studies that report comparable statistics,4 6–8 the hazard ratios for total mortality, comparing the groups with the lowest and highest intelligence scores, range from 1.4 (95% confidence interval 0.9 to 2.0)7 to 1.9 (1.4 to 2.6).4 Moreover, the association between IQ and mortality seems to be incremental, not solely generated by the typically poor later health outcomes of people with learning disabilities, who represent a sizeable proportion of the groups scoring lower for intelligence.4 6 8
These results have recently been advanced in important ways by an examination of the association between pre-adult intelligence test scores and disease specific mortality and morbidity. Extended follow ups of the 1932 and 1947 Scottish mental surveys find that individuals scoring higher on a well validated mental test at age 11 had fewer total hospital admissions and were less likely to develop coronary heart disease, some cancers, and high blood pressure in adult life.3 8 9
Several non-exclusive explanations for the association between IQ and mortality have been advanced.2 They include measured intelligence representing: an archaeological type record of psychological and physiological insults (for example, birth complications, suboptimal postnatal care, illness); a predictor of advantageous social circumstances in later life (high educational attainment, high job status); an intrinsic indicator of general bodily integrity (as measured via the brain's capacity to process information rapidly, correctly, and reliably); and a proxy for stress management skills—people with higher intelligence scores may be less likely to place themselves in stressful environments, or may cope better if they do. Another hypothesis concerns the acquisition of behaviours conducive to health (not smoking, more physical activity, a prudent diet, avoidance of accidents). Supporting this, recent findings suggest that children who scored highly on intelligence tests were more likely to give up smoking in adult life.10
Some of these explanations for the association between IQ and mortality have started to be tested empirically. In analyses that have taken into account the potential confounding effect of birth complications (as indexed by low birth weight),6 illness in childhood, and cigarette use at age 26,7 the risk of total mortality remains raised in the lower scoring intelligence groups. Studies vary in the degree to which controlling for socioeconomic position alters the magnitude of the apparent health preserving effects of a high mental test score. Although this issue has yet to be resolved, early indications are that adjustment for social conditions in childhood (typically indexed by paternal occupational social class) has only a minimal effect on the magnitude of the relation between IQ and mortality.4 6 7 Some attenuation is found, however, when social circumstances in adulthood are taken into account.7 8 That the influence of pre-adult intelligence on later mortality might be mediated partly through social factors in adulthood is perhaps unsurprising, implying a protective chain of events: high intelligence in childhood is likely to lead to educational success, placement into well paid employment, enhanced social status, and the accompanying benefits to health that the latter has repeatedly been shown to confer. Other investigators have argued that this causal pathway is only one possibility, no more plausible than the converse—measures of social position might be indicators of cognitive differences, which themselves affect health outcomes.11
Another mechanism that may underpin the apparent benefit conferred by high intelligence scores in childhood in relation to mortality in later life concerns the self management of treatment—clearly a cognitive task—in people experiencing illness. Thus people of low literacy or educational attainment (both strongly correlated with psychometric intelligence scores) are less likely to understand instructions for the use of medicines, recognise when their condition requires corrective intervention (for example, unfavourable blood sugar concentrations in people with diabetes), know the appropriate actions to take,11 seek rapid medical advice at the onset of common chronic illnesses (for example, a myocardial infarction), and receive treatment at a medical facility most appropriate for their clinical needs.12 The latter observations have been made in Scandinavian countries, which offer equity of healthcare delivery.
Given their inherently complex and sometimes conflicting nature, healthcare messages, treatment regimens, and preventative strategies perhaps surpass the cognitive abilities of some people.11 If this is the case—and bearing in mind that oversimplification of advice might reduce effectiveness substantially—proactive involvement of healthcare providers is warranted to reduce health inequalities attributable to differences in cognitive ability.
DB is supported by a University of Copenhagen senior research fellowship. ID is the recipient of a Royal Society-Wolfson research merit award. We thank Catherine Gale for her constructive comments.
Competing interests: None declared.