Not (just) a Parlour Game: in Defence of Counterfactual History
The classic examples roll off the tongue easily, crystallized into cliché by decades of pop culture exposure: What if the Third Reich had won the Second World War? What if the Spanish Armada had not failed in 1588? What if the Confederacy had triumphed in the US Civil War in the 1860s? What if President Kennedy had avoided those fateful shots in November 1963? What sort of worlds would have resulted from such dramatic changes to established history—and, more intriguingly, would they have been better or worse than our own? This form of speculation is widespread and virtually irresistible. Almost everyone indulges in imagining particular turning points in history or in their own lives, ‘one moment when all was held in a balance ready to fall to one side or the other’, and in wondering what might have been if events had played out differently.
This mode of thought goes by a variety of names, some of which depend on the forum in which it is practised. In fiction, it is frequently referred to as ‘alternate history’, ‘allohistory’ or, occasionally, ‘uchronia’ (a closely related genre, from the French uchronie, meaning ‘no-time’). In Alan Bennett’s acclaimed play (and later film) The History Boys, it is described as ‘subjunctive history’. In its academic form, however, it has come to be known as counterfactual history, and it is under this name that the concept is being developed into an increasingly refined and useful tool for historical analysis.
Counterfactual speculation has a long pedigree, with examples traceable back at least to Ancient Greece, while one of the first authors in the modern field was no less illustrious a figure than Winston Churchill. Despite this pedigree, however, counterfactual history has remained confined to the margins of scholarship, with most historians, including famously E. H. Carr, dismissing it as a ‘parlour game’ and unhistorical ‘red herring’. The two most significant texts in the discipline until recently—Alexander Demandt’s Ungeschehene Geschichte (literally, ‘Un-happened History’, first published in 1984) and Niall Ferguson’s edited volume Virtual History (first published in 1997)—were both insightful scholarly works, yet neither made any real impact in the academic world at the time of its original publication.
It is only over the last decade or so that the field has begun to attract greater interest and respect. This is reflected in the growing number of publications, conferences and workshops devoted to it, one of the most notable being the establishment of an ongoing counterfactual history research project at the University of Konstanz in 2012. It is unclear whether this increased attention is primarily due to Westerners’ mounting anxiety in an uncertain post-9/11 world, as Roland Wenzlhuemer suggests. Regardless, it is certainly a positive development, as counterfactual reasoning has several distinctive qualities that make it deserving of wider historical application.
A delicate balance
Most fundamentally, the use of counterfactuals allows historians to acquire a more sophisticated appreciation for the contingency of most historical developments. Contingency can be defined as the inherent fragility and unpredictability of events: a single event is the product of a host of different factors and influences, any one of which could, if changed or prevented, alter the event in unforeseeable ways. As Richard Ned Lebow makes clear, this is an understanding that all historians need to gain one way or another, as ‘the contingency of our world should be self-evident to any serious reader of history.’
The fact that this concept is so commonplace has led some researchers, such as Martin Bunzl, to argue that counterfactual claims are ‘implicit in every causal assertion’, and therefore ‘not as easy to avoid in the practice of history as one might think’. This is echoed by Naomi R. Lamoreaux, who summarizes the point rather neatly: ‘All historical arguments by their very nature imply that history would have turned out differently if the events or factors singled out for emphasis did not occur.’ Crucially, however, she goes on to suggest that such arguments would be strengthened by a more explicit and supported exploration of these implied alternatives. It is this more overt articulation of alternative possibilities that counterfactuals offer, and which in turn forces historians to confront the notion of contingency more directly.
Coming of age
Second, the field is becoming steadily more rigorous as more researchers are drawn to it. Many of the more frequent criticisms levelled at counterfactual history have in fact already been addressed by its practitioners, who have fashioned it into a far more versatile discipline than the wider academic community supposes. Scholars such as Bunzl have developed clear and practical criteria for distinguishing between ‘good’ and ‘bad’ counterfactual reasoning, with the latter encompassing much of the rampant, unfounded imagining and ‘analytical naiveté’ so damaging to the field’s reputation.
‘Good’ counterfactual arguments, on the other hand, are those whose turning points ‘can be grounded’ and supported with evidence, as in any other form of history. Naturally, given the nature of counterfactuals, this is usually ‘indirect’ evidence: the turning point might be tested against physical laws, knowledge of the time and place in question, or more abstract ‘considerations of rationality’. This process involves some measure of imagination, but an imagination anchored by a rational, evidence-based evaluation of how historical actors at the time perceived their options and priorities. The value of the counterfactual therefore hinges on the quality of this evaluation. Bunzl points out that various other fields of historical practice (such as oral or biographical history) rely on equally ‘informal methods’, and sees no reason why counterfactual history should be held to account any more strictly than these.
Along similar lines, Bunzl presents a means of judging the plausibility of a counterfactual, stressing that the likelihood of both the turning point (the ‘antecedent’) and its consequences should be examined. As a caveat to this argument, however, he also points out that ‘not all historically interesting counterfactuals need to be plausible’—that is, certain counterfactual scenarios have an appeal and a value that extend beyond the need for their turning points to be at all probable. If a historian’s main interest is in assessing the impact of a particular event, for instance, exploring the consequences of its alternatives is more important than whether those alternatives were likely to happen. What matters, as in any form of historical inquiry, is the relevance of the method to the research questions, and the quality of the analysis that follows.
An alternative perspective
Even on this latter point, counterfactuals have a lot to offer more mainstream historical study. Scholars such as Lamoreaux have advanced the idea that counterfactual reasoning helps to make historical judgements in general more robust, by encouraging ‘a mode of thinking that is constantly seeking out alternative ways of making sense of the evidence, as well as methods for deciding which alternative is most likely to be correct.’ Lebow makes a related point when arguing in favour of counterfactuals as a way of ‘providing distance from our world’, thereby allowing us to analyse it more scrupulously. All historians, whether or not they engage with counterfactuals, rely on such analytical skills; counterfactuals are, however, an especially efficient way of honing them.
This point is illustrated even by those more critical of counterfactual arguments. Randall Collins (rightly) objects to the bulk of counterfactuals as crude simplifications of causality, arguing that an appreciation of more complex and diffuse processes of historical change is vital to any accurate understanding of why certain events took place. He demonstrates this, however, by dissecting several counterfactual assertions (the idea that a Nazi invasion of Britain in 1940 would have led to a thousand-year Reich, the idea that a Confederate victory in the US Civil War would have preserved slavery in America to this day, and so on), and it is through this dissection that he arrives at a more complete picture of the network of causes and influences that led things to develop as they did. He determines that counterfactuals can indeed serve as a useful tool for ‘sharpen[ing] our understanding of the processes of historical change’—provided they are not simply taken at face value.
By incorporating counterfactuals into a broader range of research projects, therefore, and by considering a wider variety of alternatives against the evidence as a result, we may come to a more considered understanding of why events unfolded as they did, or even discover that some of our assumptions about the past are wrong. In either case, the practice of history is much enriched by the additional effort.
In short, as Charlotte Nettleship concludes in her overview of the discipline: ‘counterfactual history is more than just a fun exercise—it can reveal how important certain people and events were in history, and, taken seriously, you can really begin to appreciate how history happens. Although it is kind of fun too!’ Such a gratifying blend of enjoyment and intellectual depth brings out the best in historical study, and historians of all hues—whether SF fans or no—are encouraged to give it a try.
Interested readers could also do much worse than visit The Counterfactual History Review, Gavriel D. Rosenfeld’s blog discussing instances of counterfactual reasoning in the academic, cultural and political spheres. Rosenfeld’s examples, particularly the political ones, show how surprisingly prevalent this sort of reasoning is, as well as how casually or superficially it is often employed.