On Counterfactual History

I’ll let you into a secret: Last week’s post, the one in which I tried to explain what might have happened if the 1973 Israeli crossing of the Suez Canal had failed, was inspired by a French magazine, Guerre et Histoire, which asked me to write it for them. For that I am grateful, for it forced me to think about the nature of counterfactual history: what it is good for (assuming, that is, it is good for anything) and what its problems are. Today I’d like to put some of my thoughts on paper.

As a rule, historians dislike counterfactual history. E. H. Carr (1892-1982), a Cambridge historian perhaps best remembered for his little book, What Is History? (1961), went so far as to call it a mere “parlor game.” Not, mind you, that there is anything wrong with parlor games. I find them very useful in keeping my grandchildren amused. And some of them, notably chess and go, are excellent intellectual exercises indeed—at least as good as writing history.

That aside, though, Carr was wrong. Counterfactual history has its uses: it can counteract determinism and remind us that what happened was not necessarily what had to happen. It is, in other words, a method for keeping historians, and indeed anyone else interested in the way human affairs work, away from the ever-present danger of hubris.

But that is not the end of the matter. History, certainly history as practiced by modern academics over the last two centuries or so, is to a large extent an attempt to answer the question: why did X, or Y, or Z, happen? Rerum cognoscere causas, “to know the causes of things,” is the motto of the London School of Economics where I myself did my PhD almost half a century ago. This is all well and good. However, without counterfactual history the search for causes, showing that everything that happened did so because it had to and could not have happened otherwise, will end up degenerating into sheer idiocy. If, as Hegel (“the real is the rational and the rational is the real”) claimed, everything that happened was bound to happen, then what is the point of looking for what caused it?

That is all the more the case because the “laws” on which historians rely when they speak of causation are not nearly as strict as those we know from the natural sciences. There is no equivalent in social science (if it is a science) to Galileo’s laws of mechanics, Newton’s law of gravitation, Bernoulli’s law of pressure, and countless others. With very few exceptions, indeed, they are not laws at all; just generalizations that seem to make intuitive sense to those of us who have been educated within a given civilization, at a given place, at a given time.

In one sense all of us are constantly engaging in counterfactual history even if we do not mean to. When I say that A caused B, the implication is that, but for A, B would never have happened. When I say that World War I was caused by the assassination of Archduke Franz Ferdinand, the implication is that, but for the assassination, the war would not have broken out. When I say that President Reagan, by increasing America’s defense budget to an extent that the Soviet Union could not match, caused the latter to break up, or at any rate helped it break up, the implication is that, without him and his arms buildup, it would not have happened. And so on. Paradoxically, then, counterfactual history is built even into the work of the very historians who claim to despise it so much.

All this means that counterfactual history is both useful and inevitable. However, that is not to say that all counterfactual narratives are born equal any more than all historical narratives are. Some are clearly much better than others. This leads me to the question: what is good counterfactual history? Here follow some preliminary thoughts:

  1. Counterfactual history must be plausible, i.e., it must not introduce things that are a priori impossible. For example, the question what would have happened if Hitler, and not the US, had built the first nuclear weapons is a plausible one, given that, as late as the summer of 1939, German nuclear research led the world. An attempt to answer it can result in some interesting answers that will shed light both on the Fuehrer and on the role the weapons in question have played and are playing in international relations. However, asking what would have happened if Napoleon, or Genghis Khan, had had them does not make sense and should be discouraged.
  2. Counterfactual history should only go so far and no further. That is because, in human affairs, few if any events have one cause only. Trying to trace the immediate chain of events that might have resulted from one counterfactual event is hard enough. Pushing this more than a very few steps forward will, in the words of Winston Churchill (at a time when, as First Lord of the Admiralty, he was responsible for guessing what future naval warfare would be like), cause thought “to diverge too fast.” The outcome is likely to be pure fiction with no link to reality at all. Let me provide another example. Many years ago I had a student, an American, who wanted to do a paper on the consequences of the invention of print. This being Israel, he said that, without print, there would never have been a kibbutz. He was right, of course; yet writing a paper on the topic did not make sense, because, between Gutenberg and the kibbutzim, there were too many intermediate steps far more relevant to the topic than print was. I told him to limit his inquiry to the years before 1550. What came of it, if anything, I cannot recall.
  3. This warning also has an obverse side. The more plausible a counterfactual narrative, the less it will deviate from what actually happened; and the less it deviates, the more it risks becoming an exercise in futility. What is the point of writing counterfactual history that is only marginally different from that which actually took place? On second thought, perhaps this is what I did in the piece I posted last week, perhaps not. Let the reader be the judge of that.

Thus writing good counterfactual history is a question of navigating between the Scylla of unforeseeability and the Charybdis of banality. In other words, it requires judgement. But isn’t that also true of most other things in life?

Just Published! A Biography of Conscience

M. van Creveld, A Biography of Conscience, London, Reaktion, 2015.

Many would consider conscience one of the most important, if not the most important, qualities that distinguish humans from animals on one hand and machines on the other. However, what is conscience? Is it a product of our biological roots, as Darwin thought, or is it a purely human invention? If so, how did it come into the world? Who invented it, where, when, and against what social background? What did the ancient philosophers have to say about it? How does it relate to religion, Judaism and Christianity in particular? How did conscience survive the secularization of the Western world after 1600 or so, and where is it today? Are there any societies that, not recognizing the idea of conscience, have developed and used other methods for internalizing social control? If so, what are those mechanisms like?

The present volume, the only one of its kind, attempts to provide answers to these and other questions. Well-documented but written in simple, jargon-free language, it starts in ancient Egypt. From there it leads all the way to present-day campaigns aimed at hammering issues such as human rights, health, and environmental protection into our consciences. Readers will learn about the Old Testament which, erroneously as it turns out, is normally seen as the fountainhead from which the Western idea of conscience has sprung. They will also meet Antigone, the first person on record ever to explicitly speak of conscience, syneidēsis in Greek, as a basis for action.

Next they will encounter the philosophers Zeno, Cicero, Lucretius, and Seneca; outstanding Christian thinkers such as Saint Paul, Saint Augustine, Saint Thomas Aquinas, and, above all, Luther with his famous saying, “here I stand, I can do no other;” as well as modern intellectual giants. The list opens with Machiavelli, the man who, drawing a sharp line between private and public behavior, admitted conscience into the former but not into the latter. Next come Rousseau, Kant, Hegel, Nietzsche, Freud, and B. F. Skinner.

Separate chapters are devoted to Japan and China. Both are societies that, rather than relying on conscience as a method of social control, put their trust in shame and reverence instead. There are chapters dealing with the Nazis—starting with Hitler and proceeding downward, did the Nazis have any kind of conscience at all?—as well as the most recent discoveries in robotics and brain science. On the way readers will follow the evolution of conscience in many of its numerous, occasionally strange and even surprising, permutations.

The book concludes by arguing that, the claims of workers in artificial intelligence and brain science notwithstanding, we today are no closer to understanding the nature of conscience than we have ever been. In the words of one contemporary computer expert cum psychotherapist, probably we shall be able to build machines able to mimic conscience before we ever know what conscience really is.

The Fall and Rise of History

I well remember the time when I fell in love with history. This was 1956 and I was ten years old, living with my parents in Ramat Gan near Tel Aviv. While rummaging in a storage room, I came across a book with the title (in Dutch), World-History in a Nutshell. Greatly impressed by the story of the small, but brave, ancient Greek people fighting and defeating the far more numerous Persian army, I quickly read it from cover to cover. Much later I learnt that the volume was part of a series issued by the Dutch ministry of education and updated every few years. To the best of my memory the one in my hands did mention World War I but not Hitler; hence it must have dated to the 1920s when my parents went to school.

It was World-History in a Nutshell and the wonderful tales it contained that made me decide I wanted to study history. In 1964 this wish took me to the Hebrew University, where I started thinking seriously about what I was trying to do. From beginning to end, my aim was always to understand what happened and why it happened. Though it took me a long time to realize the fact, in doing so I, like countless other modern historians, was following in the footsteps of the German philosopher Georg Wilhelm Friedrich Hegel (1770-1831).

Hegel’s most important propositions, as I came to understand them, could be summed up as follows. First, the past had a real, objective existence. It was, so to speak, solidified present, more or less covered by the sands of time; which meant that, given sufficient effort was devoted to removing the sand, “the truth” about it could be discovered. Second, in the main it consisted not of the more or less accidental, more or less cranky deeds of individuals but was pushed ever-onward by vast, mostly anonymous, spiritual, economic—this was Marx’s particular contribution—social and technological forces none could control. Men and women were carried along by it like corks floating on a stream; now using it to swim in the right direction, now vainly trying to resist it and being overwhelmed by it. Third, the past mattered. It was only by studying the past that both individuals and groups of every kind could gain an understanding as to who they were, where they had come from, and where they wanted to go and might be going.

Starting around the time of Hegel’s death, these assumptions were widely shared. All three of the most important ideologies of the period 1830-1945, i.e. liberalism, socialism/communism and fascism, subscribed to them. None more so than Winston Churchill, Vladimir Ilyich Lenin, and Adolf Hitler. The last-named once said that a person who did not know history was like a person without a face. As religion gave way to secularism, history, with Hegel as its high priest, became the source of truth, no less.

To be sure, there were always those who cast doubt on the enterprise. Some did so seriously, if out of ignorance, as when Henry Ford famously said that history was bunk; others only half-so, as in Walter Sellar and Robert Yeatman’s hilariously funny 1930 best-seller, 1066 and All That. None of this stopped a vast outpouring of written works—later, movies as well—and an ever greater increase in the number of students both in- and outside academia.

At the time I took up my studies in the 1960s, few people doubted that finding out the historical truth was an important objective in itself. Then, around 1970, things started changing. This time the herald of change was a Frenchman, Michel Foucault (1926-84). The way Foucault saw it, post-Hegelian historians—and, looming behind them, his own countryman René Descartes—were wrong. Contrary to their delusions, no such thing as an objective fact, event, process or text existed. Rather, each person interpreted—“read” was the term Foucault’s followers invented for this—each text, process, event and fact in his or her own way. Assuming, that is, that these things had any kind of objective existence at all and were not imposed on history ex post facto. The choice of interpretation was determined by each person’s experience and personality; in reality, therefore, the number of possible interpretations was infinite. If, as sometimes happened, this interpretation or that was widely accepted, then this fact only showed that it suited the psychological needs of many people, not that it was more “correct” than any others.

Since then this view has been eating up the study of history like a worm eating up an apple from within. Previously people had written learned tomes about, say, Greek antiquity: how it came into being, what its main characteristics were, how it unfolded, expanded, passed away, and so on. Now they did the same about the way historians had “discovered” or “invented” that antiquity. The same applies to “the middle ages,” “the renaissance,” “the enlightenment,” “the industrial revolution,” and so on. This came dangerously close to saying that history was but a fairy tale and any attempt to write about it was not “science” but fiction—good or bad.

The implications of this view were tremendous. If all the study of history was capable of yielding was some kind of subjective tale, then of what use could it be in establishing “the truth”? And if it could not help in establishing “the truth,” then what could be the purpose of engaging in it? And how about the remaining social sciences such as political science, international relations, sociology, and so on? Weren’t they, too, based on the assumption that an “objective” past did exist and could be used to understand the present?

For a century and a half it had been assumed that a firm grasp of these subjects would qualify those who had it for many kinds of work, not only in academia but in both the public and the private sphere. Now, increasingly, degrees in these fields were seen as useless. The more useless they appeared to be, the less capable they were of providing their owners with a reasonable income as well as an acceptable position in society. And the less capable they were of doing so, the greater their perceived uselessness.

And so began the decline of the humanities and many of the social sciences that we see all around us. The lives of an entire generation of young academics have been blighted, given that nobody is interested any longer in whatever they may have to say. Finding work outside the universities is even harder; instead of degrees, prospective employers demand “experience” above everything else.

Does that mean that books and movies that deal with the past will soon disappear? Of course not. Rather, it means that the purpose of reading those works has shifted. Instead of analyzing underlying factors and trying to extract “lessons,” people started looking for stories with heroes and villains in them. Instead of looking for the general picture they took an interest in the details; often, needless to say, the juicier the better. Instead of asking how we got to where we are now, they wanted to know what life in the past had felt like. Nowhere was this more true than in my own field, military history. The reason, presumably, is that the vast majority of people in advanced countries no longer has any personal experience of warfare.

Where the demand exists, supply will follow. Contrary to the situation as it existed a few decades ago, the most important historians writing today are not academics. They are popular writers, with this difference: the adjective “popular” is now as likely to be used in a complimentary way as in a derogatory one. By and large they do not reflect on underlying theoretical principles, create frameworks, or provide deep analysis. Yet from Antony Beevor in Stalingrad through Max Hastings in Catastrophe to Keith Lowe in Savage Continent, they have a vivid sense of detail and know how to spin a tale. Those tales may be useless in the classroom—having tried to use them there, I know. Yet judging by sales they seem to be filling the psychological needs of many people.

The king is dead; long live the king.