If You Want to Know the Future…

“If you want to know the future, study the past,” is one of the clichés of our age. Among those who are said to have said so are the Spanish philosopher George Santayana and President Theodore Roosevelt. Rarely, though, have people gone very far in explaining just how it should be done. So here are a few thoughts about the question.

Until 1750. The idea that history is an arrow-like, ever-changing, non-repeating process that leads in a straight line from far in the past to far into the future is a surprisingly recent one. In this form it only made its appearance around the middle of the eighteenth century. Before that date history was considered to be the province of again and again. Either the most important things did not change at all but always followed the same patterns, as Thucydides and Machiavelli thought; this, too, was what Sun Tzu was referring to, albeit in a negative way, when he said that historical analogies were no way of finding out what the enemy would do. Or else history moved in cycles, as many philosophers and historians from Plato to Arnold Toynbee believed. Either way it was possible to use the past for looking into the future, at any rate in principle.

From 1750 on. Starting with the late Enlightenment, patterns and cycles have been joined, and to some extent replaced, by the view of history as a linear process. A process, in other words, that was moving in a certain direction from the Creation (later replaced by the Big Bang) towards an objective or goal. This in turn gave birth to two other ideas, both of which are often used for predicting the future. The first, which has since become one of the most common of all, was “trends.” The term is derived from the Middle English trenden, meaning to roll about, turn, revolve. In other words, the very opposite of what it means today. During the sixteenth century it began to stand for a move in a specific direction; but it was only around 1880 that its use became at all common.

Trends gave rise to extrapolation, another modern term. Having begun its rise around 1920, extrapolation is today everywhere. The number of fields that have been analyzed with its aid, sometimes with success and sometimes without, is vast. Among them are births, deaths, populations (both human and non-human), migration, incomes, demand, sales, traffic (including accidents), energy consumption, greenhouse gases in the atmosphere, the number of working scientists, technological development, the speed at which we move from one point to another, and so many other things as to boggle the mind.
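Stripped to its essentials, extrapolation means fitting a trend to past observations and projecting it beyond the last of them. A minimal sketch of the idea, using invented world-population figures purely for illustration:

```python
# Invented data for illustration only: rough world population, in billions
years = [1960, 1980, 2000, 2020]
values = [3.0, 4.4, 6.1, 7.8]

n = len(years)
mean_x = sum(years) / n
mean_y = sum(values) / n

# Ordinary least-squares slope and intercept of the straight-line trend
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, values)) / \
        sum((x - mean_x) ** 2 for x in years)
intercept = mean_y - slope * mean_x

def extrapolate(year):
    """Extend the fitted trend beyond the observed range."""
    return slope * year + intercept

print(extrapolate(2040))  # projects the 1960-2020 trend twenty years forward
```

The weakness the method shares with all trend-following, of course, is the assumption that the underlying process will keep behaving as it has so far.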

Following hard on the discovery of trends and extrapolation came the other post-1750 historical method, i.e., dialectics. The first to point to dialectics as the key to historical change, and therefore to any attempt to look into the future, was the early nineteenth-century German philosopher Georg Hegel. Hegel’s starting point was that it was the spirit that moved the world. Any idea (thesis) would quickly give rise to its opposite (antithesis). As the two met, the outcome would be a synthesis made up of elements taken from both the thesis and the antithesis—for nothing is ever completely lost—and forming a new thesis. And so on in a process that could be observed at work in all human affairs, from the highest to the lowest.

Where Hegel really left his predecessors behind was by insisting that the process was not stationary, like scales moving now one way and now the other in search of equilibrium, but dynamic. Unfolding in time, never repeating itself but always taking on new forms, it led history away from the past through the present and from there into the future. History, in other words, was a process of becoming.

It was in this form that dialectics were taken over by Karl Marx. Marx’s starting point was that, while Hegel had been right in pointing to dialectics as the moving principle of history, he should have applied it to economic life first of all. Here the various systems of production were forever jostling each other, pushing development along. Thus emergent slavery replaced “primitive communism.” Feudalism took the place of slavery; capitalism drove out feudalism; and communism, returning in a much more highly-developed form with every kind of modern technology at its disposal, would end up by doing away with capitalism. Each of these four systems contained traces of the previous one. And each also contained the germ of its own opposite within itself. When the time was ripe it would be negated by that opposite. As the old passed away, the new would emerge out of it like a butterfly out of its chrysalis. To this process Hegel had given the name Aufhebung. Inadequately translated as sublation, it can mean both “abolition” and “taking to a new, and higher, level.”

Hegel and Marx are long dead. However, arguably dialectics, applied to both spiritual and material factors and recognizing the interaction between them, still remains the best way to describe the way history unfolds over time. If so, then seen as a method for understanding the present and forecasting the future it is by no means passé. Modern examples of the way dialectics work are all around us. One such is the shift from craftsmanship to conveyor belts producing endless numbers of identical items and from there to computerized factories which manufacture an almost equally endless variety of them. Another is the growth in motor traffic which has now reached the point where, instead of increasing mobility, it threatens to choke it and bring it to a halt.

Still others are the rise of globalization which, having emerged after the end of the Cold War with its sharp division between West and East, is now being confronted by its opposite, decentralization, regionalization, and social fragmentation; and the rise of political correctness (itself, in many ways, a reaction to the “sexual revolution” of the 1960s and 1970s), the reaction to which became manifest when Donald Trump was elected president of the United States. Thanks to dialectics all these, and many others, were predictable. And some far-sighted people actually did predict every one of them.

To retrace our steps, history before 1750 and history after it together provide us with four different ways of looking into the future. Two of those, the one based on the idea that there is no change and the one based on the idea that change is cyclical, go back at least as far as the fifth century BCE, when the very idea of history, meaning a record of things past, was conceived of for the first time. Between them they dominated the field until the effects of the industrial revolution started making themselves felt during the second half of the eighteenth century. Both remain in use even today. The other two, which assume that history does not repeat itself and that change is the very stuff of which it is made, are of more recent vintage.

What all four methods have in common is that they are based, or are supposed to be based, on the sober and systematic study of recorded facts and processes, such as anyone who applies him- or herself can access and interpret. The difficulty, of course, is to decide which method should be applied to what development at what time; also, which one to use in dealing with each problem and how to combine all four.

To this question, no answer has yet been found.

With Just One Important Exception

Quite by accident, I finished reading this book on 7 March, the eve of International Women’s Day. The author, Prof. Steven Pinker, is nothing if not an optimist. Perhaps one reason for this is that, as a 63-year-old psychologist who teaches at Harvard and has several best-sellers to his name, he has good reason to be satisfied with life so far. Parts I and II of his latest book, Enlightenment Now, are basically a list of all the ways in which the world has been improving over the last two centuries or so. By contrast, some of part III looks—to me, at any rate—like a “philosophical” tract so confused as to be hardly worth commenting on.

Even skipping that part, though, no short review can hope to do justice to the tons of evidence Pinker produces to support his claim. Here are some highlights:

  • Starting at the end of the eighteenth century, and taking the human race as a whole, real per capita product has gone up thirtyfold. During the same period the global population has increased tenfold, from 800 million to almost 8 billion; meaning that, in little more than two centuries, total production has increased three hundred times, no less. Contrary to the fears of Malthus and others, humanity has not run out of food and other resources. To the point where many formerly hungry countries have turned to exporting food and where in quite a few developed ones obesity is a greater menace than malnutrition.
  • Taking into account qualitative advances—vastly improved nutrition and living conditions, faster and more comfortable travel, more efficient communications, incomparably cheaper data-processing, to mention but a few—the improvement in our material situation has been much greater still. Just consider that King Louis XIV at Versailles had neither electricity, nor running water, nor flush toilets. Not for nothing did visitors keep complaining about the awful way everything smelled—and this in the palace for whose owner nothing could be too good.
  • The increase in the size of the human population could never have taken place without radical advances in the related fields of medicine and health. Including a vast decline in maternal mortality (the percentage of women who die in childbirth or shortly thereafter); a vast increase in the number of children who live to adulthood; the introduction, during the second half of the nineteenth century, of sterilization and anesthetics; the complete or near-complete eradication of some of the deadliest diseases, such as smallpox and polio; and the mitigation of crises such as AIDS, SARS and the rest which, had they broken out more than a few decades ago, could well have decimated the human race in the same way as the Spanish Influenza did in 1918-20. Taken together, these and other advances explain why, world-wide, life expectancy is now around seventy years. That is twice as much as at the beginning of the nineteenth century.
  • Hand in hand with the general “betterment”—a term much beloved by Pinker’s heroes, i.e. the scientists, technicians and philosophers who made the Enlightenment—went improvements in education. To cut a long story short, in all countries with hardly any exception the percentage of illiterates has gone down, whereas that of those who enjoyed a secondary or tertiary education went up. To extrapolate—and extrapolation is the method Pinker himself uses whenever he wants to look into the future—the day may indeed come when illiteracy, like the abovementioned diseases, is all but eliminated. And when, as part of the fight against discrimination, everyone over the age of twenty will be awarded the title of professor free of charge.
  • While wealth and health and education have improved, war has shrunk. Much the most radical changes took place during the decades since 1945. World War II, which was the deadliest in history, probably killed between two and three percent of humanity as it then was (consisting of somewhat more than two billion people). Since then the figures, calculated on an annual basis, have gone down to the point where they can hardly even be expressed in terms of percentages. To put it in a different way, world-wide the average person’s chances of being killed in war are lower now than they have ever been. Which is not, of course, to say that life in some countries is not much more dangerous than in others.
  • Not just war, but other forms of legal violence have greatly diminished. In many countries torture, which used to be a regular and indeed almost ubiquitous part of the justice system, has been outlawed. The same applies to the death sentence as well as other forms of what the U.S. Constitution calls “cruel and unusual” punishment. Especially in the US, cases are on record in which the authorities wanted to carry out death sentences but could not—because the companies that made the necessary deadly poisons were no longer prepared to supply them.
  • Another sign of the growing concern with human life is the improvement in safety. In many developed countries workplace accidents are way down from what they used to be only a few decades ago. Calculated on the basis of person-miles travelled per year, the same applies to traffic accidents. I myself am old enough to remember the fight over safety belts—and how, overcoming all obstacles, those who advocated them won.

Enough, and more than enough, to make many of us happy? Pinker thinks so. True, the evidence, depending as it does on the recent invention known as polls, is not as plentiful as in other fields where progress has been made. But what little of it is available suggests that more people today enjoy more happiness than was the case a few decades ago. With just one important exception: several studies, some listed by Pinker and others not, have suggested that women, at any rate women in developed countries, are less satisfied with their lot than they used to be.

Time to reconsider whether feminism is such a marvelous thing after all?

Fast Forward to the Past

What does a Superpower that has been defeated in war do? Proclaim that it had been fighting the wrong enemy, that’s what. An example par excellence comes from the late 1970s and early 1980s. In 1973, after a decade of war, the last US troops fled from Vietnam without having accomplished their mission. Two years later the same scenario repeated itself in Cambodia. In both cases the victors were little brown men (“Coons,” as President Johnson once called them) fighting in what President Trump has so delicately called s——e countries. Men who, by right, should never have been able to challenge, let alone vanquish, the mightiest and most beneficent power on earth. However, that power refused to confront the problem head on. Instead, having made up its mind that over a decade of continuous warfare had been of no importance, it was happy to go back to “real soldiering” on what was then known as the Central Front.

In the event, there was no war on the Central Front. Forty or so years later, events seem to be repeating themselves. First, in September 2001, came the Islamic terrorists who attacked the Twin Towers in New York, bringing them down and killing about 3,000 people on US soil. This marked the beginning of a decade and a half during which the US was busy waging counterinsurgency: first in Afghanistan, then in Iraq, and finally in Syria. True, none of these wars ended as disastrously as Vietnam and Cambodia did. Looking back, though, neither did the US forces involved have much to show for their efforts and the losses they suffered.

Next, President Trump and his national security team decided that enough is enough. Having spent perhaps a trillion dollars fighting terrorists in various countries, it turned out that America’s main enemies are not terrorists at all. They are Russia and China, acting either together or, which is perhaps more likely, separately. And let’s not forget North Korea and the Little Rocket Man, of course. The former two have long had nuclear weapons capable of reaching the US. The third will have them soon enough. All three also have formidable conventional armed forces that are improving (“modernizing,” is what this is called) all the time. They are preparing for hybrid war, space war, robot war, cyber war, war without limits, and God knows what other kinds of war their nefarious leaders can dream up. And they must be outgunned, or else.

Already long-forgotten ideas are beginning to make a comeback. The Cold War, this time waged not on one front but on two. Brinkmanship, the only way to describe the games played by Washington DC and Pyongyang. Arms races, expensive but necessary and very good in providing employment. The strategic balance, which may be stable (or not). Deterrence, which may work (or not). Escalation (which, if nuclear weapons are used, will almost certainly follow). High-speed “precision strikes” against the other side’s missiles, launched in the hope of destroying them before they can be used.

Coming along with the old phrases are old/new weapons. A new generation of low-yield, “usable,” tactical nuclear weapons supposedly small enough to masquerade as conventional ones. A new bomber, the B-21, which is going to be assembled in the same factory hall where the (largely useless) B-2 was built. A new fighter, the PCA (Penetrating Counter Air), supposed to help the B-21 reach its target. Anti-missile defenses (remember Ronald Reagan’s Star Wars?). A new class of aircraft carriers, as useful or, given the submarine menace, as useless as their predecessors. And so on and so on.

Without exception, all these developments are déjà vu. All rest on the (correct or not) assumption that future wars will be fought primarily by states and armies, not guerrillas or insurgents or terrorists. Also that America’s opponents are going to be without a credible second-strike capability; or else it is hard to see how nuclear escalation can be ruled out and how the wars in question can be fought. Also that they are going to be relatively small and weak; or else it is hard to see why they should not build an offensive nuclear capability and become untouchable, as all previous nuclear countries did.

Suppose they are small and weak, however, why fight them in the first place? Unless we are back to salami tactics, of course.

Age of the Muzzle

Welcome to the age of the muzzle.

In Russia you cannot say that Putin is a dangerous scoundrel. The same, of course, applies to the rulers of many other non-countries.

In Canada, I am told, you cannot say that homosexuality is unnatural.

In Austria you cannot say that there was no Holocaust. Ditto in Germany.

In America, you cannot say that certain countries are s——-s.

In many American schools and universities, you cannot wear a cross pendant for fear someone will be offended.

In the Netherlands any reference to Zwarte Piet (Black Peter, a legendary comic character who has accompanied Santa Claus for ages) is bound to get you in trouble.

In almost all Western countries, you cannot say that many refugees and migrants are uncouth louts.

Ditto, that Islam is a religion that puts great emphasis on violence and the sword (which, incidentally, is its symbol).

Ditto, that trans-gender people are poor confused creatures who do not know what sex they belong, or want to belong, to.

Ditto, that there are some things men can do and women cannot. Or that people of different races have different qualities.

So why get excited when, in Poland, you are no longer allowed to say that quite a few Polish people cooperated with the Germans in hunting and killing Jews?

And here is what Supreme Court Justice Louis Brandeis had to say about the matter back in 1927, in Whitney v. California, a case concerning a woman convicted of helping to set up a communist organization:

“Those who won our independence believed that the final end of the State was to make men free to develop their faculties, and that, in its government, the deliberative forces should prevail over the arbitrary. They valued liberty both as an end, and as a means. They believed liberty to be the secret of happiness, and courage to be the secret of liberty. They believed that freedom to think as you will and to speak as you think are means indispensable to the discovery and spread of political truth; that, without free speech and assembly, discussion would be futile; that, with them, discussion affords ordinarily adequate protection against the dissemination of noxious doctrine; that the greatest menace to freedom is an inert people; that public discussion is a political duty, and that this should be a fundamental principle of the American government. They recognized the risks to which all human institutions are subject. But they knew that order cannot be secured merely through fear of punishment for its infraction; that it is hazardous to discourage thought, hope and imagination; that fear breeds repression; that repression breeds hate; that hate menaces stable government; that the path of safety lies in the opportunity to discuss freely supposed grievances and proposed remedies, and that the fitting remedy for evil counsels is good ones. Believing in the power of reason as applied through public discussion, they eschewed silence coerced by law — the argument of force in its worst form. Recognizing the occasional tyrannies of governing majorities, they amended the Constitution so that free speech and assembly should be guaranteed.”

Did he make himself clear enough?

O Captain! My Captain!

Eleven years have passed since the earthly wanderings of Ariel Sharon were terminated by the January 2006 stroke that put him hors de combat. For eight long years after that he lingered, tied to life support apparatus, occasionally moving an eyelid, but never once regaining consciousness. As time goes on, fewer and fewer people even remember his name. Where did he come from, what role did he play in Israeli history, and how is he likely to be remembered?


Ariel Sharon was born in 1928, the son of a farmer who worked the land to the northeast of Tel Aviv. During the first weeks of Israel’s 1948 War of Independence the young Sharon found himself defending his very home against Iraqi troops who had come all the way from Baghdad. So well did he do that he was given a platoon to command even though he had never attended officer school.

In May 1948, during an attack on a fortified police station near Jerusalem, Sharon commanded the lead platoon. Wounded in the groin and unable to walk, he was carried back to friendly lines on the shoulders of a comrade who had gone blind. Many years later, visiting the battlefield to explain the episode to me and about a hundred of my students, he added, with a wink, that he had not always been as big as he later became.

Soon after the war he left the army to study law. However, in 1953 he was brought back by the then deputy chief of staff, General Moshe Dayan, who charged him with organizing and commanding a newly-established commando unit. The task of Unit 101, as it was known, was to strike into the neighboring countries, principally Jordan and Egypt but occasionally Syria as well, from which terrorists crossed into Israel, robbing and murdering civilians living close to the borders. Later it was merged with a paratrooper battalion that carried on in a similar way. Sharon quickly proved an effective, if headstrong and brutal, commander. He repeatedly exceeded his orders and killed far more Arabs than his superiors had expected (or so they claimed); his raids caused an international furor that reached all the way to the United Nations.

In the 1956 Israeli-Egyptian War he commanded an elite paratroop brigade. First he drove into the Sinai Peninsula to link up with one of his battalions that had been dropped near the strategic Mitlah Pass. Next, violating explicit orders, he sent another battalion to enter the Pass itself. Later, to justify himself, he argued that the move had been necessitated by reports about an armored Egyptian brigade which was coming at his paratroopers from the north. Perhaps so; the ensuing battle led to his brigade suffering one quarter of all Israeli casualties in that campaign.

Following this episode Sharon’s progress up the military hierarchy was brought to a halt. Only in 1963 did he return to favor; the man who promoted him was the then chief of staff, Yitzhak Rabin. In the June 1967 Arab-Israeli War Sharon commanded a division. Leading it in a model operation, he captured Abu Agheila, the most important Egyptian fortified perimeter in the Sinai. Later, while serving as Commander, Southern Command, from 1969 to the summer of 1973, he waged the so-called War of Attrition against the Egyptians on the Suez. He also brutally put down a Palestinian uprising in Gaza, killing hundreds and tearing down thousands of homes in the process.

By the time the October 1973 War broke out, Sharon was no longer in uniform. However, he was called back to command a reserve division against the Egyptians. With it he crossed the Suez Canal, all but encircling the Egyptian Third Army and making a decisive contribution to the outcome of the war. The men who fought with him gratefully remember the steadying effect of his voice as it came through on the radio amidst the chaos of burning tanks, exploding shells, and the screams of the wounded. Perhaps it was to reassure them that, during the war, he always had a vase of flowers standing on his desk.

By 1974 Sharon was out of the army for good. When Likud came to power in 1977 he became minister of agriculture under Menachem Begin. With Begin’s backing, he used his position to increase the number of Jewish settlers in the West Bank from 15,000 to 100,000 within just four years.

In June 1981 he became minister of defense. In June 1982 he launched the enormous war machine now under his command into Lebanon, Israel’s weak neighbor to the north. The declared objective was to end the terrorism that had been coming from that country for over a decade. The undeclared and much larger one was to help the Lebanese Christians set up a government that would turn Lebanon into an Israeli protectorate. But victory proved elusive; the outcome was a terrorist campaign fought first by members of the Palestine Liberation Organization (PLO) in Lebanon, then by a militia known as Amal, and finally by Hezbollah.

In February 1983, held responsible for failing to prevent his Christian Lebanese allies from massacring as many as 3,000 men, women and children in the refugee camps of Sabra and Shatila, he lost his post. By that time so unpopular had he and the war become that the troops, adapting a well-known children’s ditty, were chanting the following rhymes:

Aircraft come down from the clouds

Take us far to Lebanon 

We shall fight for Mr. Sharon

And come back, wrapped in shrouds.

He did, however, remain in parliament. As Likud’s political fortunes rose, fell, and rose again, now he carried a ministerial portfolio, now he was left out in the cold. As before, he strongly opposed all concessions to the Arabs, including the 1993 Oslo Agreements with the Palestinians, which were signed by his former commander and then prime minister, Yitzhak Rabin. In September 2000, following the failure of Prime Minister Ehud Barak and PLO chief Yasser Arafat to reach agreement at Camp David, Sharon, by demonstratively visiting the Temple Mount, helped trigger the Second Palestinian Uprising. Early in 2001 he took over as prime minister. In 2003 he consolidated his power by winning the elections. Meanwhile his efforts to suppress the uprising involved quite a bit of brutality, culminating in the attack on the West Bank city of Jenin in April-May 2002.

Whether Sharon was already thinking of giving up at least some of the occupied territories will never be known. At the time, he repeatedly said he was no de Gaulle. However this may have been, his hand was forced. To put an end to terrorism, the Israeli public demanded that a fence be built between themselves and the Palestinians. A fence did in fact go up around the Gaza Strip, and over the years has proved very effective in stopping the suicide-bombers who, at the time, formed the most serious threat of all.

From that point on there was no turning back. Israel evacuated the Strip, and Sharon made no secret of his intention to evacuate parts of the West Bank as well. When this led to a revolt among the members of his own Likud Party he left it, founded a new one of his own, and prepared for new elections. The rest, as they say, is history.  


Looking back on Sharon eleven years after his political demise, what can one say? Like most Israelis, he spent his entire life in a country that seldom knew anything like peace. Between the ages of twenty and forty-five he was almost always in uniform. Rising from the ranks, he was a highly aggressive and original commander who was constantly in the thick of battle. At least one of his operations, the attack on Abu Agheila, is widely regarded as a classic. None of this could prevent him from being disliked by his superiors, colleagues, and immediate subordinates, some of whom accused him of dishonesty and undependability. He was, however, liked by his men and well known for the way he took care of them.

Sharon’s role in the 1973 War and the 1982 invasion of Lebanon, including the Sabra and Shatila massacre, will forever remain the subject of debate in Israel. It is, however, overshadowed by his even more controversial record as prime minister. When he first proposed, then carried out, the withdrawal from the Gaza Strip, Israel’s hawkish right, including many of his fellow Likud members, launched vicious attacks on him. So vicious that they may well have helped bring about the stroke that finally killed him. Later the wind shifted. By now, even some of his greatest opponents see the withdrawal for what it was. To wit, a smashing success—even though the occasional rocket is still coming in.

No other man could have done it. Had he lived, almost certainly he would have withdrawn from parts of the West Bank as well, or at least tried to do so. Not because he liked Palestinians, but because he believed, quite rightly in this author’s view, that stationing Israeli troops and civilians amidst a hostile population could only lead to an endless waste of lives and treasure. He would also have completed the security fence around the West Bank—something his successors Olmert and Netanyahu, for various reasons, never did.

To Sharon, the following lines apply:

O Captain! My Captain! Our fearful trip is done;

The ship has weather’d every rack, the prize we sought is won;

The port is near, the bells I hear, the people all exulting,

While follow eyes the steady keel, the vessel grim and daring;

                   But O heart! heart! heart!

                   O the bleeding drops of red

                   Where on the deck my Captain lies,

                   Fallen cold and dead.

You Could Be Next

The man in the photograph, Boaz Arad, used to be an Israeli artist. A good one, as I think you can see for yourself. He was also a charismatic teacher in his field. The fact that he was single did nothing to diminish his popularity. But last week, following an article in which a nameless female student was quoted as saying that he had harassed her, he killed himself.

He left behind a letter (in Hebrew) I want you to read:

“This female journalist calls me and says she has heard complaints about my romantic involvement with students at Telma Yallin [an Israeli art school, MvC]. She does not provide names. She does not provide facts I can respond to. She does not explicitly mention sex, just drops hints about it. The complaints mention romance, not sex. But the journalist interprets this as sex between a man and a woman.

Under any legal system in the world, there is such a thing as a statute of limitations [the alleged sexual encounter took place two decades ago]. Under any legal system in the world, a man is presumed innocent until proven guilty. But there are cases in which the law must be circumvented. Suddenly [the man] is weak. I have to stand up against unspecific accusations and defend myself. But given how powerful the media are, who will believe me? How can I look anyone in the eyes? How can I fight back?

At Telma Yallin I met wonderful young people. With some of them I am still in touch. In some cases the ties became stronger [but only, as Arad made clear in an interview, after the girls were over sixteen, which is the legal age of consent in Israel; and only after they were no longer his students]. Who can stop a liaison that is growing stronger? There was nothing there that had to be concealed.

For years on end there was gossip about me. And I, instead of denying it, became paralyzed.

And then there is xxxxx, who has never been known for truthfulness. She accused the school of allowing me to participate in a show even though some female students had complained that I had harassed them. I never had an affair with a student. Investigations both at Telma Yallin and Bezalel [another art school, MvC] showed that there never has been a complaint. But xxxxx is convinced I am guilty. She will get her pound of flesh. And to hell with the truth. For years she has been active behind my back, trying to shame me. The great warrior for justice. Goodbye, Ms. xxxxx. I have no doubt that you are behind all this. You have left plenty of evidence in your wake.

I’ve had a wonderful life filled with teaching and art. Now it has all been turned into muck.

How can I look anyone in the eye? Who will allow me to teach? Who will put my work on show?

All I ever was is gone.

Goodbye to my wonderful family. Goodbye to my wonderful students.

My apologies to anyone I may have hurt in this letter.

I love you.


The Good Life

Tomb of Qabus in Gonbad-e Qabus

Half a century has passed since I studied Plato under the guidance of my revered teacher, Prof. Alexander Fuks. I’ll never forget how, early in the course, he told the class—just five or six of us—that all philosophy is an attempt to answer just two questions. First, what the nature of things is; and second, what the good life is and how to lead it. I won’t go so far as to say that the first without the second is worthless. Study is, and for me has always been, its own reward. But there is no doubt that one of its main purposes is to serve the second and more important one.

At that time I was twenty-one years old and a graduate student in Jerusalem. I lived in a rented room on less than $100 a month, walked to the university each day, and had a girlfriend. For recreation I played tennis and went long-distance running. Once a fortnight I would take the bus to visit my parents in Ramat Gan, near Tel Aviv. Life was as good as it has ever been before or since.

Prof. Fuks is long dead. Now that my seventy-second birthday is only a few weeks away, though, I thought I would write down a few of the things I think I have learnt about what the good life means. Trying to do so, I quickly realized that the task is beyond my powers: partly because there are so many of them, partly because many of them contradict each other, and partly because it seemed impossible to put them into any kind of logical order. So I decided to submit, by way of a somewhat belated New Year greeting to my readers, the thoughts of another man. I came across him by accident while reading, of all things, H. J. R. Murray’s A History of Chess (1913).

Qabus bin Washmgir (976-1012) was ruler (Emir) of Gurgan and Tabaristan, southeast of the Caspian in what is now Iran. He was not exactly a nice guy—very few rulers are. He spent most of his life fighting for the throne, gaining a reputation for cruelty on the way. Not that his contemporaries were less cruel, as is shown by the fact that his men, after having deposed him, ended up freezing him to death. Still, his poem struck a chord with me. I hope it will do the same with you.

Here goes.

The things of this world from end to end
are the goal of desire and greed.

And I set before this heart of mine the
Things which I most do need.

But a score of things I have chosen out of
the world’s unnumbered throng.

That in quest of them I my soul may
Please and speed my life along.

Verse and song, and minstrelsy, and
Wine full flavored and sweet,

Backgammon, and chess, and the hunting-
Ground, and the falcon and cheetah fleet;

Field, and ball, and audience hall, and
battle, and banquet rare.

Horse, and arms, and a generous hand,
And praise of my Lord and prayer.

A History of the Future

  1. Peter J. Bowler, A History of the Future: Prophets of Progress from H. G. Wells to Isaac Asimov, Cambridge, Cambridge University Press, 2017.

As Yuval Harari’s Homo Deus: A Brief History of Tomorrow shows, “histories” of the future are all the rage at the moment. Why that is, and what it means for conventional histories of the past, I shall not try to discuss. Where Prof. Bowler’s volume differs from the rest is that it is real history. Instead of trying to guess what the future may be like, he has produced a history of what people thought it might be like. The outcome is fascinating.

To the reviewer, the book provides so many possible starting points that it is hard to know where to begin. True, there had always been people who envisioned a better society. Most of the time, though, that society was located in the past—as with Plato and Confucius—in the afterworld—as with St. Augustine—or on some remote island (from Thomas More to about 1770). “Prophets of Progress” started making their appearance towards the end of the eighteenth century, when the industrial revolution was making itself felt and the idea of progress itself took hold. As technical advances became more frequent and more important during the nineteenth century, their number increased. Starting at least as early as 1880, for any half-literate person not to encounter their visions was practically impossible. Even if he (or, for god’s sake, she) only got his impressions from pulp magazines, themselves an invention of the late 1920s. And even if he was a boy (rarely, a girl) who got his information from the long-defunct Meccano Magazine, as I myself did.

Bowler himself proceeds not author by author, nor chronologically, but thematically. First he discusses the background of some of the authors in question. Quite a few turn out to have been scientists, engineers or technicians, a fact which in Bowler’s view gave them an advantage. Many were moved by personal interests, particularly the need to promote their own inventions. Next he takes us over one field after another; from “How We’ll Live,” through “Where We’ll Live,” “Communicating and Computing,” “Getting Around,” “Taking to the Air,” “Space,” “War,” “Energy and Environment,” all the way to “Human Nature.” Some predictions, such as the discovery of a method to counter gravity, travel at speeds greater than that of light, and tele-transportation, have still not come about, if they ever will. Others, such as air travel, TV, helicopters, and megacities—though without the moving people conveyors many visionaries thought they could see coming—were realized rather quickly. Often it was not the technical characteristics of a new invention but its commercial possibilities, or lack of them, which determined the outcome.

Interestingly enough, two major inventions whose role very few people saw coming were radar and computers. The inability to envisage radar helps explain why, between about 1920 and 1939, fictive descriptions of future war almost always included apocalyptic visions of cities totally destroyed and even entire countries annihilated. The initial lack of attention paid to computers was probably linked to the fact that, until 1980 or so, they were used only by governments and large corporations, as well as to the somewhat esoteric nature of the machines themselves. As a result, it was only after the invention of the microchip around 1980 that their full impact on daily life, both that of individuals and that of society as a whole, began to be understood.

William Blake (“dark satanic mills”) and the Luddites having gone, until 1914 the reception new inventions got was normally positive. After all, who could argue with cheaper goods, faster travel, electric trams (taking the place of the clumsy, dirty horses of old), better control of many infectious diseases, and a zillion other things that made the lives of those who could afford them better and more comfortable? Next, the wind shifted. World War I with its millions of victims having come and gone, it was hard to remain optimistic. The change is well illustrated by the difference between H. G. Wells’s A Modern Utopia (1905) and Yevgeny Zamyatin’s We (1921). The former is lightly written and not without an occasional bit of humor. It describes a world which, though it may not be to everyone’s taste, is meant to be an improvement on the existing one. The latter is a grim tale of totalitarian government and control that extends right down to the most intimate aspects of life.

From this point on most new inventions have usually met with mixed reactions. Some people looked forward to controls that would reduce the proportion of the “unfit” in society, others feared them. While the former approach seemed to have been finally buried by the Nazi atrocities, later fear of global overpopulation caused it to return. The advent of television for entertainment and education was countered by the fear lest it turn all of us into what, much later, came to be called couch potatoes. Many welcomed continuing industrialization and growing productivity, but others worried about eventual shortages of resources as well as the things pollution might be doing both to the environment and, through it, to our own health. As Bowler points out, most prophecies were based on the relentless advance of technology. However, the arguments pro and contra changed much more slowly. Indeed they displayed a clear tendency to move in cycles, repeating themselves every generation or so.

One particularly fascinating story Bowler does not follow as carefully as he might have concerns nuclear weapons. As he notes, following the discovery of radium and radiation in the 1890s more than one author started speculating about the possibility of one day “liberating” the enormous energy within the atom and using it for military purposes. So much so, indeed, that one World War II American science fiction writer had to put up with a visit from the FBI because of his stories’ uncanny resemblance to what, without his knowledge, was going on at Los Alamos. Coming on top of steadily improving “conventional” (the term, of course, is of much later vintage) weapons, this new form of energy threatened to literally destroy the world. Yet after the first “atomic” weapons were used against Hiroshima and Nagasaki in 1945 there was a tendency to belittle the danger they posed, especially by way of radiation, which some politicians and officers declared to be a “bugaboo” hardly worth taking seriously. More twists and turns followed, culminating in Jonathan Schell’s dark 1982 volume, The Fate of the Earth. What practically all authors Bowler discusses missed was the ability of nuclear weapons to impose what is now known as “the long peace.” An ability due, not to the efforts of well-meaning protesters, but precisely to proliferation.

But I am beginning to quibble. Based on a vast array of sources—mostly, it must be admitted, British and American ones—clear and very well written, Bowler’s book is a real eye opener. For anyone interested in the way society and technology have interacted and, presumably will continue to interact, it is a must.

On Balance

At the beginning of 2018 the alarm bells are ringing. Doomsayers are crawling out of their holes, terrifying the rest of us with their predictions. Including pollution, global warming, antibiotic-resistant germs, nuclear war (especially in northeast Asia), computers that are more intelligent than we are, and what not. Accordingly, this is as good a time as any to draw up a balance of what we humans have achieved and not achieved on our particular cosmic speck of dust over the last few millennia or so.

Without any further preliminaries, here goes.


What we have achieved


According to some studies, 70,000 years ago humanity numbered just a few thousand individuals. Today the figure stands at about 7.6 billion. Had something similar applied to any other species, e.g. chimpanzees, surely we would have called it an unparalleled triumph. Some ninety percent of the increase, incidentally, took place during the last two centuries or so.
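For readers who like numbers, the figures above imply a remarkably slow average growth rate. A quick back-of-the-envelope check (the starting population of 5,000 is an assumed, illustrative stand-in for “a few thousand”; the 70,000-year span and 7.6 billion total are the essay’s own round figures):

```python
# Illustrative only: start = 5,000 is an assumed stand-in for
# "a few thousand individuals"; the other figures are from the text.
start = 5_000
now = 7.6e9
years = 70_000

# Average annual growth rate r satisfying start * (1 + r)**years == now
r = (now / start) ** (1 / years) - 1
print(f"implied average annual growth: {r * 100:.3f}% per year")
```

Even granting the essay’s point about ninety percent of the increase coming in the last two centuries, the long-run average works out to only about 0.02 percent a year.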

We have extended our life expectancy from less than thirty years during the Neolithic to a little over seventy years today. Most of the increase also took place during the last two centuries or so.

We have reduced maternal mortality by approximately 95 percent. Ditto.

We have more or less done away with a number of important killer diseases. The last global outbreak of a pandemic that killed tens of millions was the so-called Spanish flu of 1918-20. Since then, all we’ve had is flashes in the pan. Hats off to the medical establishment.

The self-inflicted kind apart, we have more or less overcome famine. In many parts of the world today we are more likely to die of overeating than of not eating enough.

We have made life more comfortable. In fact, such are the amenities most of us in the West in particular enjoy as to exceed anything available even to royalty until the middle of the nineteenth century.

We have vastly increased our understanding of the universe, the things it contains, and the laws according to which it works.

Our technological genius has enabled us to set foot on the bottom of the sea as well as the surface of the moon. Also, to explore the planets. It has even enabled us to build machines that think, after a fashion.

We have built weapons capable of more or less putting an end to us. Though whether or not that should be counted as an achievement is hard to say.

What we have not achieved

We have not succeeded in uniting humanity under a single government (not that such a government would necessarily be a blessing).

We have not put an end to war.

We have not surpassed the achievements of, say, the ancient Greeks in such fields as sculpture, architecture, literature, drama, rhetoric, philosophy, and historiography.

We have not put an end to misery or to madness.

We have not made life less stressful. Some claim the contrary.

We have become no happier.

My grandfather used to say that, while growing old was good, being old was bad. That was almost certainly true during the Paleolithic. It remains true today.

We have not put an end to death (though that may very well be a good thing).

Pace Freud and the entire psychological community, we do not understand ourselves any better than we did millennia ago. Did anyone ever understand human nature better than Shakespeare did?

Feminist claptrap to the contrary, the gap that separates men from women has not closed or even narrowed. Men are still from Mars and women, from Venus. Make up your own mind as to whether that is good or bad.

We still have no direct knowledge of the way animals think. Nor improved methods of communicating with them.

We have not become wiser.

Everyone thinks he or she knows about education. So how come we have not yet found a way to make our children better than ourselves?

We have not built a kinder, gentler, more just society. Nor, though everything is relative, is there any question of eliminating poverty.

We have not improved our methods of dealing with evil, when and where it raises its ugly face.

We have not closed the gap between free will and determinism even by one iota.

We have not discovered the secret of life. As a result, we are unable to create it either.

We are unable to control the weather or even forecast it much more than a week in advance.

We are unable to predict earthquakes.

We do not know what the future will bring. That means we are not in charge of our destiny.

We still do not know whether God exists.


Make up your own mind which of those two lists predominates.

On Technology and War (3)

Two weeks ago I tried to answer the question, how to use military-technological superiority when one has attained it. A week ago, to point out the things that technology does not change and will not change and cannot change. Today’s post is the last in the mini-series. I want to use it in order to ask: How is a new military technology received, and what happens to it once it is received?

Many of you will be familiar with the name of Giulio Douhet (1869-1930), the Italian general who, in 1921, published Il dominio dell’aria, probably the most famous volume on the topic ever written. His portrait graced this column last week. But it is not this book I want to discuss here. In 1913 Douhet was a major on the general staff. In that capacity he produced an article on the above question, which I have used as my guide.

Stage A. A new technology is introduced. Normally this is done by the inventors and manufacturers who hope to make a profit and turn to the military as a potentially very large client. The idea meets with skepticism on the part of the officers who are sent to examine it. Though ingenious, it is a mere toy, or so they declare. Good examples of this argument can be found in the Zeppelin, heavier-than-air aircraft, the submarine, and the tank, all of which were invented before 1914, and all of which initially met this fate. There is even a story about a British regimental commander who, receiving a couple of machine guns, told his men to take the “bloody things” to the wing and hide them.

Stage B. The manufacturers do not give up. They continue to push, sometimes by offering their invention to an enemy of the country they first approached. Sir Basil Zaharoff, though not an inventor but a merchant, was the undisputed master of this game, selling warships to both Turkey and Greece. Slowly and gradually, the military undergo a limited shift. They are now ready to see whether there is any way in which they can incorporate the new weapon or weapon system into the existing organizations without, however, acknowledging the need to change those organizations in any fundamental way. At times, indeed, they start adopting a new invention in order to prevent change; as the German Luftwaffe did when it developed the V-1 as a counter to the early ballistic missiles favored by the land army. Other good examples of the attempt to pour new weapons into old organizations are, once again, the heavier-than-air aircraft, and the submarine. And the aircraft carrier, of course.

Stage C. Quite suddenly, the wind changes. As older officers die or retire, younger ones—those in charge of the new technologies and in favor of them—start shouting their virtues from the rooftops. Military history is making a fresh start! They (the new technologies) are about to take over! Everything else is ripe for the dustbin! And so on and so on. Douhet himself set the example. By the time he wrote his book he had convinced himself that armies and navies were about to disappear and that aviation, like the Jewish God in one of the prayers addressed to him, “all alone would rule in awe.” Similar claims on behalf of aircraft were made in the US by General Billy Mitchell; whereas in Britain another officer, Colonel John Fuller, was doing the same on behalf of tanks. Nowadays they are being made on behalf of artificial intelligence and autonomous killing machines, among other things.

Stage D. It becomes evident that, useful as the new technologies are, they do not provide answers to all problems. As the defense becomes stronger, pilots find that their aircraft cannot simply bomb the hell out of whomever they want at any time they want. Submariners discover that, without support from the air (later, satellites), their ability to find their targets is very limited. Tanks are threatened by anti-tank guns and are, moreover, only useful in certain, well-defined, kinds of terrain. Carriers have to be escorted by entire fleets of anti-missile destroyers, anti-submarine destroyers, and supply ships. And autonomous killing machines kill indiscriminately. Briefly, the new technologies must be integrated with everything else: strategy, tactics, command and control, logistics, intelligence, doctrine, training and what not.

Stage E. Following the usual logistic curve, shown above, the process of reorganization has been driven as far as it will ever be and is now flattening out. Advanced, even revolutionary, weapons and weapon systems have become an integral part of the forces. Perhaps, as in the case of carriers from 1941 on, their lynchpin. By this time most of those who initially opposed the changes are gone. A new generation of officers has risen and takes things as they now are for granted. And they start asking themselves: What has really changed?
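The “logistic curve” referred to above is the standard S-shaped function familiar from studies of technology adoption: slow at first, steep in the middle, flattening out near saturation. A minimal sketch of the generic textbook form (the steepness and midpoint parameters below are illustrative, not drawn from Douhet or from Bowler):

```python
import math

def logistic(t, k=1.0, t0=0.0):
    """Generic logistic (S-shaped) curve.

    k sets the steepness, t0 the midpoint; both are illustrative
    defaults rather than values from any historical case.
    """
    return 1.0 / (1.0 + math.exp(-k * (t - t0)))

# Adoption is near zero early on, 50 percent at the midpoint,
# and flattens toward full adoption as reorganization runs its course.
for t in (-6, -2, 0, 2, 6):
    print(f"t={t:+d}  adoption={logistic(t):.3f}")
```

The flattening tail of this curve is exactly the phase Stage E describes: the steep part of the change is over, and what remains looks, to the new generation, like the normal state of affairs.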

Which, of course, is itself both cause and consequence of the fact that, as we have seen, so much does not change.