What We Cannot Know

du Sautoy, What We Cannot Know, Kindle ed., 2016

This is perhaps the best book about the history and philosophy of science I have ever read—and, having taught the subject at the university level for a number of years, I have read quite a few. The author is a professor of mathematics at Oxford. In 2008 the University appointed him to the Simonyi Professorship for the Public Understanding of Science, an appointment all but tantamount to launching him on a new career. Judging by this book, he has pursued it with great success indeed.

*

To start at the beginning, the great strength of science has always been its ability to predict what, given certain conditions, will happen in the future. On such and such a day, at such and such an hour, in such and such a place, there will be an eclipse lasting so and so many minutes or hours. Mix stuff X with stuff Y, and the outcome will be an explosion. So far, so good. In the hands of the philosopher Karl Popper (1902-1994) this ability to predict was turned into the test of whether or not a proposition or theory is scientific, an idea to which any modern scientist who hopes to publish his work must subscribe.

Enter chaos theory. Developed from the 1960s on, it centers on the question of why many major physical events—tornadoes, for example, or earthquakes—are so devilishly hard to predict. The answer? Because, in many systems, very small initial changes can sometimes lead to enormously different outcomes. As, for example, when a butterfly flapping its wings in Beijing combines with any number of other factors, some great, some small, to cause a tornado in Florida. Another example, central to du Sautoy’s book, is provided by the throwing of a dice (or multiple dice, but there is no need to go into that here). No knowledge we can obtain, however accurate and however detailed, is ever likely to tell us which face a dice is going to land on the next time we throw it. Instead, all we can hope for is a statistic—namely that, assuming the dice is perfectly balanced and as we keep throwing it again and again, on average one out of six throws will result in a six.
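That statistic is easy to check by experiment. The sketch below (my own illustration, not du Sautoy's; the function name is made up) simulates a fair dice and shows the frequency of sixes settling toward 1/6 as the number of throws grows, even though no single throw can be predicted:

```python
import random

def roll_frequencies(n_rolls, seed=42):
    """Roll a fair six-sided dice n_rolls times and return the
    observed fraction of throws landing on each face."""
    rng = random.Random(seed)
    counts = [0] * 6
    for _ in range(n_rolls):
        counts[rng.randint(1, 6) - 1] += 1
    return [c / n_rolls for c in counts]

# No individual throw is knowable in advance, but the long-run
# frequency of a six drifts toward 1/6 (about 0.1667) as the
# sample grows.
for n in (60, 6_000, 600_000):
    print(n, round(roll_frequencies(n)[5], 4))
```

This is the whole content of the "statistic": it constrains the aggregate while saying nothing about the next throw.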

Next, what is the universe made of? The Greek philosopher Democritus believed it was material (that is why we call him a materialist). Using a knife to cut it up, the outcome would be smaller and smaller pieces of matter until, finally, we would find ourselves dealing with indivisible (a-tomos, in Greek) particles forming the building blocks of the universe. Two and a half millennia have passed, and we still do not know whether he was right. On the one hand, particles much smaller than the atom—protons, neutrons, electrons, neutrinos, positrons, muons, bosons, quarks, and many others—have been discovered and keep being discovered, leading to the question of whether the quest will ever end. On the other, it is not at all clear whether many, if not all, of these particles really are particles. They are perhaps best described as flashes of light (electromagnetic waves) appearing now here, now there; in other words, as waves.

Much worse still, there is the Uncertainty Principle. First pronounced by Werner Heisenberg back in 1927, it tells us that we cannot precisely know both the location of a particle and its momentum, the reason being that any attempt to measure one of these quantities will cause the other to change. So much for determinism at the smallest level of them all.

Still staying with the universe, we want to know more about its genesis, its qualities, its size (if it has a size), and its ultimate fate. It is not that we have not been making progress; even as I write, man-made machines are exploring the surface of Mars. Whereas the philosopher Auguste Comte (1798-1857) once declared that we would never know what the oh-so-remote stars are made of, spectroscopy now enables us to do exactly that, even for those that are billions of light-years away. But other questions remain. Assuming that the Big Bang really did take place and is not a convenient fiction, as the aether used to be, what exactly was it that “exploded”? What, if anything, did it explode into? Will the expansion of the universe that the Big Bang initiated go on forever, or will it one day come to an end and reverse itself, leading to a Big Crunch? Is our universe the only one that exists, or are there others? How about the possibility that other universes exist, not simultaneously but sequentially, one after another, each preceded by its own Big Bang and each ending in its own Big Crunch? Either way, are these other universes subject to the same physical and mathematical laws as ours is? Or are they entirely different? Will we ever be able to observe them and communicate with them? And if so, will we benefit from doing so, or will the outcome be our annihilation?

Starting at least as far back as Parmenides in the sixth century BCE, it has been widely believed that only God can create something out of nothing. Does that mean that the Big Bang, assuming it ever took place, provides proof of His existence? And what is this God? Is He eternal? If not, when and how did He come into being? Is He separate from the universe, or are the two one and the same? Was His creation of the Big Bang a one-time act, or did He go on interfering with the universe ever after? Can we communicate with Him?

The Big Bang is supposed to have taken place, and the universe to have come into being, approximately 13.7 billion years ago. But what does that mean with respect to time? Did time exist before the Big Bang? Or didn’t it? What is time, anyhow? Does it have an objective existence the way space and mass do (at least du Sautoy does not seem to question their existence, though others have done so)? Or is it, as Stephen Hawking suggested, simply that which certain of our instruments measure?

And how about life? Many researchers have worked on this question, trying to imagine, and to a very limited extent model, the conditions that might have led to its rise. For du Sautoy, however, it seems to be of secondary importance, given that he devotes remarkably little space to it. With him it is as if the mapping of DNA, and our ability—as exemplified, some say, by China’s modifying some genes so as to create the corona virus—to manipulate it to some extent, had solved the most important mysteries of all. Never mind that, so far, no one has been able to create even the simplest forms of life in a test tube; nor to explain, for example, how a fetus develops from a blastula into a fully formed baby.
As if to compensate for this, du Sautoy delves quite deeply into another aspect of life: namely, the fact that we are conscious, aware of our own existence, and capable of experiencing things. Precisely what is this consciousness? Are we the only animals who possess it? If not, how far “down the ladder of life” do we have to go before we hit on creatures that do not have it? Do primates have it? Do snails? Can there be such a thing as life that does not have consciousness, as manifested, if not in the form of launching into a dialogue with itself, then at any rate in the ability to feel some kind of pain? Looking at the problem from its other side, will we ever be able to build a conscious computer? Or must we forever put up with one that merely behaves as if it were conscious?

Closely tied to the question of consciousness is that of free will. On its supposed existence rests our entire society: our education, our religion, our law (or, before we had law, the taboos which individuals did or did not violate), our system of justice. Some would say that this applies to all societies; a society that does not assume that we are autonomous beings, at least to some extent, is inconceivable. But does free will really exist? Or is it, as not just some ancient authors but some modern brain scientists claim, just an illusion? And if illusion it is, what are the implications for the abovementioned social phenomena? Will future criminals, tried for theft, for example, be able to save their hides by claiming that it was not they but the neurons in their brains that committed the crime? And suppose we succeed in building a computer possessing, as part of its consciousness, a free will; when it comes to law and justice, will we treat it as we do humans?

Finally, math. In any inquiry into natural science, math makes a good starting point. That is because, starting at least as far back as Galileo, and in some ways going all the way back to Pythagoras two millennia earlier, mathematics is the one great pillar on which all the natural sciences rest. So astronomy, so cosmology, so physics, and so chemistry; and so, increasingly, biology too. Wherever math reigns, we feel that we have reached some kind of unique insight or understanding. Wherever it does not, the mysterious quality known as “scientific” is either present only to a limited extent or altogether absent.

The difficulty is that, contrary to the usual view of math as the one science that can yield certainty, math itself is not without its problems. First, as du Sautoy himself is at pains to emphasize, much of it deals with things that do not really exist. Not just irrational numbers and imaginary numbers but perfect circles, straight lines, points that have no dimensions, movements proceeding in straight lines and at constant speeds, and much more. Second, there is the much-discussed, but so far unanswered, question why such artificial creations and abstractions should not only fit the physical world with which we are familiar but provide the best tools for analyzing it; in other words, why nature should allow herself to be governed by mathematics in these and other things.

Third, it has been shown—by Kurt Gödel, Einstein’s constant companion at Princeton during Einstein’s last years—that any consistent mathematical system rich enough to contain arithmetic will necessarily contain propositions that can be neither proved nor disproved within it. Such propositions can only be resolved by drawing on others taken from outside that system, where the game starts afresh. No Baron von Muenchhausen pulling himself up by his bootstraps, in other words. To use another metaphor, it is as if we were living inside a Russian doll. No sooner do we succeed in gaining what we think is a complete understanding of the innermost one than we discover that surrounding it on all sides is another doll; and so on and so on in a succession of dolls whose end, even assuming there is one, we cannot perceive.

*

Gödel did not develop his theories in a vacuum. Just one day after he first announced them, the famous German mathematician David Hilbert, in an address to the Society of German Scientists and Physicians, declared that “we must know/we shall know.” In other words, that everything is in principle knowable; and that, given sufficient genius and hard work, everything will end up by becoming known.

Certainly Hilbert was not the first scientist to fall victim to that delusion, at least in certain fields and for a certain time. Prominent predecessors were Albert Michelson, who measured the speed of light with unprecedented accuracy while simultaneously helping to show that there was no such thing as an ether in which it moved; and William Thomson, aka Lord Kelvin, who calculated the position of absolute zero. Nor was he the last. Both Stephen Hawking and Albert Einstein made similar claims: Hawking, in A Brief History of Time, when he wrote that the then-current physicists’ model of the universe was getting so close to the ultimate truth as to leave fewer and fewer loose ends; Einstein, by famously claiming that “God does not play dice,” thus sticking to determinism and knowability as opposed to indeterminism and unknowability.

*

My mother used to say that a single fool can ask more questions than ten wise people can answer. That is true; nor are the abovementioned questions by any means the only ones du Sautoy discusses. As a layman reading the book, all I can say is that I emphatically did not feel that any of them—at any rate, those I was able to understand—were of the kind fools might ask. To the contrary: many go down to the core of our existence, and many have important practical implications. Even those that do not—e.g., what time is and whether, before the Big Bang, there was such a thing—sounded interesting to my ears. One reason for this is that the author, an expert on mathematics as the science that seems to underlie all the rest, is uniquely qualified to look not just at one of them but at them all. Another is that he writes in a fluent, fairly light-hearted way, complete with just enough stories, anecdotes, and jokes to keep the reader asking for more.

As du Sautoy keeps telling us in his last chapter: it is the deficiencies of our knowledge, and our attempts to remedy them, which make up the essence of life by endowing it with a sense of purpose.

Highly recommended.

The Reign of Uncertainty

One of the principal clichés of our age, endlessly repeated, is that our ability to look into the future and control our fate has been growing. So much so that, in the words of Yuval Harari, we are about to transform ourselves from Homo Sapiens, originally a small, weak and vulnerable creature constantly buffeted by his surroundings, into a quasi-omnipotent Homo Deus. The main engine behind this process, we are told, is represented by fast-accumulating developments in science and technology. Those developments, in turn, are both cause and consequence of the kind of education that helped us cast off superstitions of every kind and, in the words of Immanuel Kant (1724-1804), “dare to know.” Some would go further still and argue that, if such were not the case, there might be little point in pursuing any kind of learning in the first place.

For a long time, this line of thought was closely related to belief in progress. Today it is shared both by those who are optimistic in regard to the future and by those who, like Harari, keep warning against the disastrous consequences that our very successes may bring down upon our heads: by changing the climate, destroying the environment, running out of drinking water, covering the planet with plastic, breeding antibiotic-resistant superbugs, suffering pandemics—vide the corona virus outbreak—and being enslaved, perhaps even exterminated, by some self-seeking supercomputer out on a roll. But is it really true that we are better at looking into the future, and consequently more able to control it, than our ancestors were? And that, as a result, the human condition has fundamentally changed? For some kind of answer, consider the following.

  1. The Demise of Determinacy

In Virgil’s words, “Felix, qui potuit rerum cognoscere causas” (happy is he who can discern the causes of things). For millennia on end, though, so deficient was our understanding of the future that almost the only way to get a handle on it was by enlisting some kind of supernatural aid. As by invoking the spirits, consulting the gods (or God), tracing the movements of the stars, watching omens and portents of every kind, and, in quite a few places, visiting or raising the dead and talking to them.

Come the seventeenth century, many of these methods were finally discarded—if not completely, then at any rate to some extent among the West’s intellectual elite. Their place was taken by the kind of mechanistic science advocated by Galileo Galilei, Isaac Newton, and others. Nor was this the end of the matter. Many nineteenth-century scientists in particular believed not just that the world is deterministic but that, such being the case, they would one day be able to predict whatever was about to take place in it. One of the best-known statements to that effect came from the polymath Pierre-Simon Laplace (1749-1827). It went as follows:

An intellect [not a demon, which was substituted later for effect] which at a certain moment would know all forces that set nature in motion, and all positions of all items of which nature is composed, if this intellect were also vast enough to submit these data to analysis, it would embrace in a single formula the movements of the greatest bodies of the universe and those of the tiniest atom; for such an intellect nothing would be uncertain and the future just like the past would be present before its eyes.

In such a world not only God but chance, randomness, probability and the unexpected would be eliminated, leaving sheer causality to rule supreme. Other scientists, such as William Thomson, Lord Kelvin, took matters further still, claiming that science had advanced to the point where only a few minor gaps remained to be closed. No less a figure than Stephen Hawking, in his last work, Brief Answers to the Big Questions, admitted to having once made a similar claim. However, the very scientific progress that gave rise to this kind of optimism also ensured that it would not last for long. Just as, regardless of what number you multiply zero by, in the end zero is still what you get.

Starting with the discovery of radioactivity in 1896, it has become increasingly evident that some of nature’s most basic processes, specifically the decay of atoms and the emission of particles, are not deterministic but random. For each radioactive material, we know what percentage of atoms will decay within a given amount of time—but not whether atom A is going to break up before (or after) atom B, nor why. Subsequent discoveries such as quantum mechanics (Max Planck), relativity (Albert Einstein), the uncertainty principle (Werner Heisenberg), the incompleteness theorem (Kurt Gödel), and chaos theory (Edward Lorenz) all helped extend the idea of incalculability into additional fields.
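The contrast between a predictable aggregate and unpredictable individual atoms can be made concrete with a toy model (my own sketch, assuming each atom decays independently with a fixed probability per time step; not a model from the book):

```python
import random

def decay_step(n_atoms, p_decay, rng):
    """One time step: each atom independently decays with probability
    p_decay. Returns the number of surviving atoms."""
    return sum(1 for _ in range(n_atoms) if rng.random() >= p_decay)

rng = random.Random(0)
n = 100_000
# Choose the step length to equal one half-life, so p_decay = 0.5.
survivors = decay_step(n, 0.5, rng)

# The aggregate is predictable: very close to half the atoms survive.
# But the model says nothing about *which* atoms decayed; that part
# is irreducibly random.
print(survivors / n)
```

The surviving fraction clusters tightly around one half, exactly as the half-life statistic promises, while the fate of any particular atom remains a coin toss.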

To specify, quantum mechanics started life as a theoretical construct that could only be applied to the world of subatomic particles, hence could be more or less ignored by everyone but a very small number of nuclear scientists. However, since then it has been climbing out of the basement, so to speak. As it did so it acquired a growing practical significance in the form of such devices as ultra-accurate clocks, superfast computers, quantum radio (a device that enables scientists to listen to the weakest signal allowed by quantum mechanics), lasers, unbreakable codes, and tremendously improved microscopes.

At the heart of relativity lies the belief that, in the entire physical universe, the only absolute is the speed of light. Taken separately, both quantum mechanics and relativity are marvels of human wisdom and ingenuity. The problem is that, since they directly contradict one another, in some ways they leave us less certain of the way the world works than we were before they were first put on paper. The uncertainty principle means that, even as we do our best to observe nature as closely as we can, we inevitably cause some of the observed things to change. Some would add that time and space are themselves illusions: mental constructs we have created in an effort to impose order on our surroundings, but having no reality outside our own minds. The incompleteness theorem put an end to the age-old dream—it goes back at least as far as Pythagoras in the sixth century BCE—of one day building an unassailable mathematical foundation on which to base our understanding of reality. Finally, chaos theory explains why, even if we assume the universe to be deterministic, predicting its future development may not be possible in a great many cases—including, to cite but one well-known example, whether a butterfly flapping its wings in Beijing will or will not cause a hurricane in Texas.
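That last point, determinism without predictability, can be demonstrated with the logistic map, a standard textbook example of a chaotic system (my illustration, not one of du Sautoy's): two starting points differing by one part in a billion soon produce completely different trajectories, even though every step is strictly deterministic.

```python
def logistic_orbit(x0, r=4.0, steps=60):
    """Iterate the logistic map x -> r * x * (1 - x), a classic
    deterministic system that nevertheless behaves chaotically."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

# Two starting points that differ by one part in a billion...
a = logistic_orbit(0.2)
b = logistic_orbit(0.2 + 1e-9)

# ...track each other at first, then diverge completely. The rule is
# exact; only our knowledge of the initial condition is imperfect.
print(abs(a[5] - b[5]))                                 # still tiny
print(max(abs(x - y) for x, y in zip(a[40:], b[40:])))  # order one
```

Since no measurement of the initial condition is infinitely precise, the error always wins eventually; this is the butterfly effect in miniature.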

  2. Tripping Over One’s Own Robe

So far, then, the tendency of post-1900 science has been to become not more deterministic but less so. As a result, no longer do we ask the responsible person(s) to tell us what the future will bring and whether to go ahead and follow this or that course. Instead, all they can do is calculate the probability of X taking place and, by turning the equation around, the risk we take in doing (or not doing) so. However, knowledge also presents additional problems of its own. Like a robe that is too long for us, the more of it we have, the greater the likelihood that it will trip us up.

First, no knowledge can be better than the instruments used to measure the parameters of which it consists—be they size, mass, temperature, rigidity, speed, duration, or whatever. And no instrument that physicists use is, or can be, perfectly precise and perfectly accurate. Even the most recent, strontium-based clocks are expected to be off by one second every 138 million years—a fact which, chaos theory says, can make a critical difference to our calculations. The more accurate our instruments, moreover, the more likely they are to interfere with each other. The situation in the social sciences is much worse still, given that both the numbers on which most researchers base their conclusions and the methods they use to select and manipulate those numbers are often extremely inaccurate and extremely slanted—so much so as to render any meeting between them and “the truth” more or less accidental in many cases.

Second, there is far too much knowledge for any individual to master. Modern authors, seeking to impress their readers with the speed at which knowledge expands, often leave the impression that this problem is new. In fact, however, it is as old as history. In China, the Sui-era imperial library was supposed to contain 300,000 volumes. That of the Ptolemies in Alexandria held as many as half a million. And this is to assume that knowledge was concentrated inside libraries—whereas in fact the vast majority of it was diffused in the heads of countless people, most of them illiterate, who left no record of any kind. Since then the problem has only been getting worse. Today, anyone seriously claiming to have written a book containing “all that is most wonderful in history and philosophy and the marvels of science, the wonders of animal life revealed by the glass of the optician, or the labors of the chemist” (The World of Wonders, London, 1869) would be quickly dismissed as either a featherweight or a charlatan.
Third, not only is there too much knowledge for anyone to master, but in many cases it keeps developing so fast as to suggest that much of it is mere froth. Whether this development is linear and cumulative, as most people believe, or proceeds in cycles, as was suggested by Thomas Kuhn, is, in this context, immaterial. One of the latest examples I have seen is the possibility, raised by some Hungarian scientists just a few days before these words were written in November 2019, that the world is governed not by the long-established four forces—gravity, the electromagnetic, the strong and the weak—but by five (and perhaps more). Should the existence of the so-called protophobic, or proton-fearing, force be confirmed, it has the potential to blow all existing theories of the world’s behavior at the sub-atomic level—hence probably not only at the sub-atomic level—to smithereens.

Fourth, we may often have a reasonably accurate idea of what the consequences of event A, or B, or C may be. Working out all of those consequences, however, is much more difficult—the more so because they may (and are likely to) have consequences of their own; and so on in an expanding cascade that, in theory and sometimes in practice as well, has no clear end. Some of the consequences may be intended (in which case, if everything goes right, they are foreseeable), others not. Some may be beneficial, others harmful. Some may bend backwards, so to speak, turning around and impacting C, or B, or A, which in turn have consequences of their own, and so on until the cascade turns into an entire series of interrelated cascades. That is particularly true in the social sciences, where the very concepts of cause and consequence may be out of place and reality may be reciprocal or circular.

Some consequences may even be perverse, meaning that they lead to the opposite of what was intended. For example, when the scientists employed on the Manhattan Project worked on a weapon to be used in war—there hardly ever was any doubt that it would be—they could not know that, to the contrary, it would render the kind of war in which their country was then engaged impossible. Both the Chernobyl and the Fukushima reactors were provided with elaborate, highly redundant safety systems; but when the time came those systems, rather than preventing the accidents, only made them worse.

In brief, a simple, elegant “theory of everything” of the kind that, starting with Laplace, scientists have been chasing for two centuries remains out of sight. What we got instead is what we have always had: namely, a seething cauldron of hypotheses, many of them conflicting. Even when we limit ourselves to the natural sciences, where some kind of progress is undeniable, and ignore the social ones, where it is anything but, each question answered and problem resolved only seems to lead to ten additional ones. Having discovered the existence of X, inevitably we want to know where it comes from, what it is made of, how it behaves in respect to A and B and C. Not to mention what, if any, uses it can be put to.

The philosopher Karl Raimund Popper went further still. Scientific knowledge, he argued, is absolutely dependent on observations and experiments. However, since one can always add 1 to n, no number of observations and experiments can definitively confirm that a scientific theory is correct. Conversely, a single contradictory observation or experiment can provide sufficient proof that it is wrong. Science proceeds, not by adding knowledge but by first doubting that which already exists (or is thought to exist) and then falsifying it. Knowledge that cannot, at any rate in principle, be shown to be false is not scientific. From this it is a small step towards arguing that the true objective of science, indeed all it can really do, is not so much to provide definite answers to old questions as to raise new ones. It is as if we are chasing a mirage; considering our experience so far, probably we are.

  3. The Drunk at the Party

If all this were not enough, the problem of free will persists. In the words of the French anthropologist Claude Levi-Strauss, it is the drunken guest who, uninvited, breaks up the party, upsetting tables and spreading confusion. Much as scientists may claim that it is simply a delusion—even to the point of showing that our brains prepare to raise our hands as much as ten seconds before we make a conscious decision to do so—our entire social life, specifically including such domains as education and justice, continues to rest on the assumption that we do in fact have a choice: between action and inaction; the serious and the playful; the good and the evil; the permissible and the prohibited; that for which a person deserves to be praised, and that for which he deserves to be punished. Long before King Hammurabi had one of the earliest known codes of law carved in stone almost four millennia ago, a society that did not draw such distinctions could not even be conceived of.

So far, neither physicists nor computer experts nor brain scientists, working from the bottom up, have been able to close the gap between matter and spirit in such a way as to endow the former with a consciousness and a will. Economists, sociologists and psychologists, working their way from the top down, have not been able to anchor the emotions and ideas they observe (or assume) people to have in underlying physical reality. Whichever route we take, the complete understanding of everything that would be necessary for prediction to be possible is as remote as it has always been. In no field is the crisis worse than in psychology; precisely the science (if one it is) that, one day, will hopefully explain the behavior of each and every one of us at all times and under all circumstances. Its claim to scientific validity notwithstanding, only 25-50 percent of its experimental results can be replicated.

Given the inability of science to provide us with objective and reliable visions of the future, those we have, as well as the courses of action we derive from them, depend as much on us—our ever-fluid, often capricious mindset, our ira and studium (anger and partiality)—as they have ever done. Elation, depression, love, euphoria, envy, rage, fear, optimism, pessimism, wishful thinking, disappointment, and a host of other mental states form a true witches’ brew. Not only does that brew differ from one person to another, but its various ingredients keep interacting with each other, leading to a different mixture each time. Each and every one of them helps shape our vision today as much as they did, say, in the Rome of the Emperor Caligula; the more so because many of them are not even conscious, at any rate not continuously so. In the process they keep driving us in directions that may or may not have anything to do with whatever reality the physicists’ instruments are designed to discover and measure.

  4. The Persistence of Ignorance

To conclude, in proposing that knowledge is power Francis Bacon was undoubtedly right. It is, however, equally true that, our scientific and technological prowess notwithstanding, we today, in our tiny but incredibly complex corner of the universe, are as far from gaining complete knowledge of everything, hence from being able to look into the future and control it, as we have ever been.

Furthermore, surely no one in his right mind, looking around, would suggest that the number of glitches we all experience in everyday life has been declining. Nor is this simply a minor matter, e.g. a punctured tire that causes us to arrive late at a meeting. Some glitches, known as black swans, are so huge that they can have a catastrophic effect not just on individuals but on entire societies: as happened, for example, in 2008, when the world was struck by the worst economic crisis in eighty years, and as the corona virus is demonstrating right now. All this reminds me of the time when, as a university professor, my young students repeatedly asked me how they could ever hope to match my knowledge of the fields we were studying. In response, I used to point to the blackboard, quite a large one, and say: “Imagine this is the sum of all available knowledge. In that case, your knowledge could be represented by this tiny little square I’ve drawn here in the corner. And mine, by this slightly—but only slightly—larger one right next to it.” “My job,” I would add, “is to help you first to assimilate my square and then to transcend it.” They got the message.

There thus is every reason to believe that the role which ignorance concerning the future, both individual and collective, plays in shaping human life is as great today as it has ever been. That is probably a major reason why, even in a country such as France, where logic and lucidity are considered national virtues and three out of four people claim they are not superstitious, almost half touch wood and about one third say they believe in astrology. Nor are the believers necessarily illiterate old peasants. Most young people (55 percent) say they believe in the paranormal, as do many graduates in the liberal arts and 69 percent of ecologists. As if to add insult to injury, France now has twice as many professional astrologers and fortune tellers as it does priests. Both black masses and Satan-worship have been on the rise. The situation in the U.S. is hardly any different.

How did old Mark Twain (supposedly) put the matter? Prediction is difficult, especially about the future.