The questions we do not yet have the wit to ask will be a growing preoccupation of science in the next 50 years. That is what the record shows. Consider the state of science a century ago, in 1899. Then, as now, people were reflecting on the achievements of the previous 100 years. One solid success was the proof by John Dalton in 1808 that matter consists of atoms. Another was the demonstration (by James Prescott Joule in the 1840s) that energy is indeed conserved and the earlier surmise (by the French physicist Sadi Carnot) that the efficiency with which one form of energy can be converted into another is inherently limited: jointly, those developments gave us what is called thermodynamics and the idea that the most fundamental laws of nature incorporate an "arrow of time."
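
Carnot's limit is nowadays written as a one-line inequality (the notation is modern, not Carnot's own): an engine drawing heat $Q_h$ from a reservoir at absolute temperature $T_h$ and rejecting waste heat to one at $T_c$ can convert at most the fraction

$$\eta = \frac{W}{Q_h} \le 1 - \frac{T_c}{T_h}$$

of that heat into work, however ingenious its design.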

There was also Charles Darwin, whose On the Origin of Species by Means of Natural Selection (published in 1859) purported to account for the diversity of life on Earth but said nothing about the mechanism of inheritance or even about the reasons why different but related species are usually mutually infertile. Finally, in the 19th century's catalogue of self-contentment, there was James Clerk Maxwell's demonstration of how electricity and magnetism can be unified by a set of mathematical equations on strictly Newtonian lines. More generally, Newton's laws had been so well honed by practice that they offered a solution for any problem in the real world that could be accurately defined. What a marvelous century the 1800s must have been!

Image: NASA Johnson Space Center
EARTH viewed from the moon portends a new way of seeing our world and its inhabitants but gives few hints of the paths future discoveries will take.

Only the most perceptive people appreciated, in 1899, that there were flaws in that position. One of those was Hendrik Antoon Lorentz of Leiden University in the Netherlands, who saw that Maxwell's theory implicitly embodied a contradiction: the theory supposed that there must be an all-pervading ether through which electromagnetic disturbances are propagated, but it is far simpler to suppose that time passes more slowly on an object moving relative to an observer. It was a small step from there (via Henri Poincaré of the University of Paris) to Albert Einstein's special theory of relativity, published in 1905. The special theory, which implies that relative velocities cannot exceed the speed of light, falsifies Newton only philosophically: neither space nor time can provide a kind of invisible grid against which the position of an object, or the time at which it attains that position, can be measured. A century ago few people seem to have appreciated that A. A. Michelson and E. W. Morley, in the 1880s, had conducted an experiment whose simplest interpretation is that Maxwell's ether does not exist.
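
Lorentz's supposition has an equally compact modern statement (the notation, again, is today's rather than his): a clock moving at speed $v$ relative to an observer advances by only

$$\Delta t' = \Delta t \sqrt{1 - \frac{v^2}{c^2}}$$

during an interval the observer measures as $\Delta t$, where $c$ is the speed of light; the same factor is why relative velocities can never exceed $c$.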

For those disaffected from, or even offended by, the prevailing complacency of 1899, ample other evidence should have hinted that accepted fundamental science was heading into trouble. Atoms were supposed to be indivisible, so how could one explain what seemed to be fragments of atoms, the electrons (discovered in 1897) and the "rays" given off by radioactive atoms (detected the year before)? Similarly, although Darwin had supposed that the inheritable (we would now say "genetic") changes in the constitution of individuals are invariably small ones, the rediscovery in 1900 of Gregor Mendel's work of the 1850s and 1860s (chiefly by Hugo de Vries in the Netherlands) suggested that spontaneous genetic changes are, rather, discrete and substantial. That development led, under the leadership of Thomas Hunt Morgan, to the emergence of Columbia University in New York City as the citadel of what is now called classical genetics (the word "genetics" itself was coined only in 1906) and to the recognition in the 1930s that the contradiction between Darwinism and "Mendel-Morganism" (as the Soviets came to call Columbia's work) is not as sharp as it first seemed.

Now we marvel at how these contradictions have been resolved and at much else. Our contentment with our own century surpasses that of 1899. Not least important is the sense of personal liberation we enjoy that stems from applications of science in the earliest years of the 20th century--Marconi's bridging of the Atlantic with radio waves and the Wright brothers' measured mile of flight in a heavier-than-air machine. (Wilbur and Orville had built a primitive wind tunnel at their base in Ohio before risking themselves aloft.) The communications and aviation industries have grown from those beginnings. Our desks are cluttered with powerful computing machines that nobody foresaw in 1900. And we are also much healthier: think of penicillin!


A Catalogue of Contentment

In fundamental science, we have as much to boast about as the 19th century had, or more. Special relativity is no longer merely Newton made philosophically respectable. Through its implication that space and time must be dealt with on an equal footing, it has become a crucial touchstone of the validity of theories in fundamental physics.

The other three landmarks in fundamental science this century were hardly foreseen. Einstein's general theory of relativity in 1915, which would have been better called his "relativistic theory of gravitation," would have been a surprise to all but close readers of Ernst Mach, the Viennese physicist and positivist philosopher. By positing that gravitational forces everywhere are a consequence of a gravitational field that reaches into the farthest corners of the cosmos, Einstein launched the notion that the structure and evolution of the universe are ineluctably linked. But even Einstein was surprised when Edwin Hubble discovered in 1929 that the universe is expanding.

Image: NASA Johnson Space Center
A LANDMARK OF 20th-CENTURY SCIENCE, Einstein's general theory of relativity reformulated gravity as a warping of space and time, predicting effects such as the bending of light by large masses. A graphic example is provided by this image of a so-called Einstein Cross, obtained with the Hubble Space Telescope. Four images of a quasar surround the central image of the galaxy, which acts as a gravitational lens.

Quantum mechanics was another bolt from the blue, even though people had been worrying about the properties of the radiation from hot objects for almost half a century. The problem was to explain why the radiation from an object depends crucially on its temperature, such that the most prominent frequency in the emission is directly proportional to the temperature, at least when the temperature is measured from absolute zero (which is 273 degrees Celsius below the freezing point of water, or -459 degrees Fahrenheit, and which had itself been defined by 19th-century thermodynamics). The solution offered by Max Planck in 1900 was that energy is transferred between a hot object and its surroundings only in finite (but very small) amounts, called quanta. The actual amount of energy in a quantum depends on the frequency of the radiation and, indeed, is proportional to it. Planck confessed at the time that he did not know what this result meant and guessed that his contemporaries would also be perplexed.
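
Both proportionalities in that account can be written in a line apiece (in notation that became standard only later): the energy of a single quantum is

$$E = h\nu,$$

where $h$ is Planck's constant and $\nu$ the frequency of the radiation, while the most prominent frequency emitted by a hot object obeys Wien's displacement law, $\nu_{\max} \propto T$ (numerically, $\nu_{\max} \approx 2.82\,kT/h$, with $k$ Boltzmann's constant).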

As we know, it took a quarter of a century for Planck's difficulty to be resolved, thanks to the efforts of Niels Bohr, Werner Heisenberg, Erwin Schrödinger and Paul Dirac, together with a small army of this century's brightest and best. Who would have guessed, in 1900, that the outcome of the enterprise Planck began would be a new system of mechanics, as comprehensive as Newton's in the sense that it is applicable to all well-posed problems but applies only to atoms, molecules and the parts thereof--electrons and so on?

Even now there are people who claim that quantum mechanics is full of paradoxes, but that is a willful (and often mischievous) misreading of what happened in the first quarter of this century. Our intuitive understanding of how objects in the macroscopic world behave (embodied in Newton's laws) is based on the perceptions of our senses, which are themselves the evolutionary products of natural selection in a world in which the avoidance of macroscopic objects (predators) or their capture (food) would have favored survival. It is difficult to imagine what selective advantage our ancestors would have gained from a capacity to sense the behavior of subatomic particles. Quantum mechanics is therefore not a paradox but rather a discovery about the nature of reality on scales (of time and distance) that are very small. From that revelation has flowed our present understanding of how particles of nuclear matter may be held to consist of quarks and the like--an outstanding intellectual achievement, however provisional it may be.

The third surprise this century has followed from the discovery of the structure of DNA by James D. Watson and Francis Crick in 1953. That is not to suggest that Watson and Crick were unaware of the importance of their discovery. By the early 1950s it had become an embarrassment that the genes, which the Columbia school of genetics had shown are arranged in a linear fashion along the chromosomes, had not been assigned a chemical structure of some kind. The surprise was that the structure of DNA accounted not just for how offspring inherit their physical characteristics from their parents but also for how individual cells in all organisms survive from millisecond to millisecond in the manner in which natural selection has shaped them. The secret of life is no longer hidden.


A Catalogue of Ignorance

Both quantum mechanics and the structure of DNA have enlarged our understanding of the world to a degree that their originators did not and could not have foretold. There is no way of telling which small stone overturned in the next 50 years will lead to a whole new world of science. The best that one can do is make a catalogue of our present ignorance--of which there is a great deal--and then extrapolate current trends in research into the future. Yet even that procedure suggests an agenda for science in the next half-century that matches in interest and excitement all that has happened in the century now at an end. Our children and grandchildren will be spellbound.

One prize now almost ready for the taking is the reconstruction of the genetic history of the human race, Homo sapiens. A triumph of the past decade has been the unraveling of the genetics of ontogeny, the transformation of a fertilized egg into an adult in the course of gestation and infancy. The body plans of animals and plants appear initially to be shaped by homeotic genes of common families (in animals, the family called the Hox genes) and then by species-specific developmental genes. Although molecular biologists are still struggling to understand how the hierarchical sequence of developmental genes is regulated and how genes that have done their work are then made inactive, it is only a matter of time before the genes involved in the successive stages of human development are listed in the order in which they come into play.

Then it will be possible to tell from a comparison between human and, say, chimpanzee genes when and in what manner the crucial differences between humans and the great apes came into being. The essence of the tale is known from the fossil record: the hominid cerebral cortex has steadily increased in size over the past 4.5 million years; hominids walked fully erect by the time of Homo erectus, roughly two million years ago; and the faculty of speech probably appeared with mitochondrial Eve, perhaps as recently as 125,000 years ago. Knowing the genetic basis of these changes will give us a more authentic history of our species and a deeper understanding of our place in nature.

That understanding will bring momentous by-products. It may be possible to infer why some species of hominids, of which the Neanderthals are only one, failed to survive to modern times. More important is that the genetic history of H. sapiens is likely to be a test case for the mechanism of speciation. Despite the phrase "Origin of Species" in the title of Darwin's great book, the author had nothing to say about why members of different species are usually mutually infertile. Yet the most striking genetic difference between humans and the great apes is that humans have 46 chromosomes (23 pairs), whereas our nearest relatives have 48. (Much of the missing ape chromosome seems to be at the long end of human chromosome 2, but other fragments appear elsewhere in the human genome, notably on the X chromosome.) It will be important for biology generally to know whether this rearrangement of the chromosomes was the prime cause of human evolution or whether it is merely a secondary consequence of genetic mutation.

The 50 years ahead will also see an intensification of current efforts to identify the genetic correlates of evolution more generally. Comparison of the amino acid sequences of similar proteins from related species or of the sequences of nucleotides in related nucleic acids--the RNA molecules in ribosomes are a favorite--is in principle a way of telling the age of the common ancestor of the two species. It is simply necessary to know the rate at which mutations naturally occur in the molecules concerned.
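
As a sketch of the arithmetic (illustrative Python only: the sequences are invented and the mutation rate is an assumed figure, not a measurement), the calculation amounts to counting the sites at which two aligned sequences differ and dividing by twice the mutation rate--twice, because mutations have been accumulating independently along both lineages since they split.

```python
# Minimal molecular-clock sketch. The sequences and the mutation rate
# below are illustrative inventions, not data from any real species.

def divergence_time(seq_a: str, seq_b: str, rate_per_site_per_myr: float) -> float:
    """Estimate the age (in millions of years) of the common ancestor,
    assuming aligned sequences of equal length and a constant, known
    substitution rate."""
    if len(seq_a) != len(seq_b):
        raise ValueError("sequences must be aligned to the same length")
    # Fraction of sites at which the two sequences differ.
    fraction = sum(a != b for a, b in zip(seq_a, seq_b)) / len(seq_a)
    # Mutations accumulate along both lineages, hence the factor of 2.
    return fraction / (2 * rate_per_site_per_myr)

# Toy example: two 20-residue "proteins" differing at 2 sites, with an
# assumed rate of 0.001 substitutions per site per million years.
human_like = "MKTAYIAKQRQISFVKSHFS"
ape_like   = "MKTAYIAKQRQISFVRSHFA"
print(f"{divergence_time(human_like, ape_like, 0.001):.0f} million years")
# Prints "50 million years"--an artifact of the toy numbers, nothing more.
```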

But that is not a simple issue. Mutation rates differ from one protein or nucleic acid molecule to another and vary from place to place along their length. Constructing a more reliable "molecular clock" must be a goal for the near future. (The task is similar to, but if anything more daunting than, cosmologists' effort to build a reliable distance scale for the universe.) Then we shall be able to guess at the causes of the great turning points in the evolution of life on Earth--the evolution of the Krebs cycle, by which all but bacterial cells turn food into energy; the origin of photosynthesis; and the appearance of the first multicellular organisms (now firmly placed at more than 2,500 million years ago).

With luck, the same effort will also tell us something about the role of viruslike agents in the early evolution of life. The human genome is crammed with DNA sequences that appear to be nucleic acid fossils of a time when genetic information was readily transferred between different species, much as bacteria in the modern world acquire certain traits (such as resistance to antibiotics) by exchanging DNA structures called plasmids. We shall not know our true place in nature until we understand how the apparently useless DNA in the human genome (which Susumu Ohno memorably dubbed "junk") contributed to our evolution.

Understanding all the genomes whose complete structure is known will not, in itself, point back to the origin of life as such. It should, however, throw more light on the nature of living things in the so-called RNA world that is supposed to have preceded the predominantly DNA-based life that surrounds us. It is striking, and surely significant, that modern cells still use RNA molecules for certain basic functions--as editors of the DNA transcripts in the nucleus, for example, and as the templates for making the structures called telomeres that stabilize the ends of chromosomes.

At some stage, but probably more than half a century from now, someone will make a serious attempt to build an organism based on RNA in the laboratory. But the problem of the origin of life from inorganic chemicals requires understanding that is now lacking--not least an understanding of how a flux of radiation such as that from the sun can, over time, force the formation of complex chemicals from simpler ones. Something of the kind is known to occur in the giant molecular clouds of our galaxy, where radioastronomers have been finding increasingly complex chemicals, among them long carbon-chain molecules. The need is for an understanding of the relation between complexity and the flux of radiation. This is a problem in irreversible thermodynamics to which too little attention has been paid.

Indeed, biologists in general have paid too little attention to the quantitative aspects of their work in the past few hectic decades. That is understandable when there are so many interesting (and important) data to be gathered. But we are already at the point where deeper understanding of how, say, cells function is impeded by the simplification of reality now commonplace in cell biology and genetics--and by the torrent of data accumulating everywhere. Simplification? In genetics, it is customary to look for (and to speak of) the "function" of a newly discovered gene. But what if most of the genes in the human genome, or at least their protein products, have more than one function, perhaps even mutually antagonistic ones? Plain-language accounts of cellular events are then likely to be misleading or meaningless unless backed up by quantitative models of some kind.

A horrendous example is the cell-division cycle, in which the number of enzymes known to be involved seems to have been growing for the past few years at the rate of one enzyme a week. It is a considerable success that a complex of proteins that functions as a trigger for cell division (at least in yeast) has been identified, but why this complex functions as a trigger and how the trigger itself is triggered by influences inside and outside a cell are questions still unanswered. They will remain so until researchers have built numerical models of cells in their entirety. That statement is not so much a forecast as a wish.
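
To make the wish concrete, a toy shows what "numerical model" means in practice. The sketch below is illustrative Python only: it borrows the FitzHugh-Nagumo equations, a textbook two-variable relaxation oscillator from nerve-cell modeling, purely to exhibit the form such a model takes--a fast, self-amplifying activator coupled to a slow antagonist that resets it, producing repeated trigger-like spikes. It describes no real cell.

```python
# A toy "trigger" model: the FitzHugh-Nagumo relaxation oscillator,
# integrated with the simple Euler method. Parameters are the usual
# textbook values; nothing here models an actual cell-division cycle.

def simulate(steps: int = 40_000, dt: float = 0.01) -> list[float]:
    v, w = -1.0, 1.0                    # activator and slow antagonist
    I, a, b, eps = 0.5, 0.7, 0.8, 0.08  # standard parameters
    trace = []
    for _ in range(steps):
        dv = v - v**3 / 3 - w + I       # fast: self-amplifying activator
        dw = eps * (v + a - b * w)      # slow: antagonist that resets it
        v += dv * dt
        w += dw * dt
        trace.append(v)
    return trace

trace = simulate()
# Count trigger-like events: upward crossings of an arbitrary threshold.
spikes = sum(1 for p, q in zip(trace, trace[1:]) if p < 1.0 <= q)
print(f"{spikes} spikes in {len(trace) * 0.01:.0f} time units")
```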

The catalogue of our ignorance must also include the understanding of the human brain, which is incomplete in one conspicuous way: nobody understands how decisions are made or how imagination is set free. What consciousness consists of (or how it should be defined) is equally a puzzle. Despite the marvelous successes of neuroscience in the past century (not to mention the disputed relevance of artificial intelligence), we seem as far from understanding cognitive processes as we were a century ago. The essence of the difficulty is to identify which patterns of neuronal activity in the head signal the making of a decision or some other cognitive act. Perhaps decision making has several alternative neural correlates, which will complicate the search. Yet there is no reason to believe the problem is intractable. Even nonhuman animals (such as rats in a maze) make decisions, although they may not be conscious that they do so, which means that observation and experiment are possible. But it will be no shame on neuroscience if these questions are still unanswered 50 years from now.

That is also the case for the central problem in fundamental physics, which stems from the fact that quantum mechanics and Einstein's theory of gravitation are incompatible with each other. That much has emerged from the failed attempts to "quantize" the gravitational field over the past two decades. Yet without a bridge of some kind between these two theories, two of the triumphs of our century, it will not be possible to describe the big bang with which the universe is supposed to have begun with anything like the customary rigor. Doubt has also infected particle physics, where for many years researchers have shared the goal that all four forces of nature should eventually be unified. Those laboring in the field of string theory believe their work provides an acceptable bridge, but others point to the waxing and waning of enthusiasm in the past 20 years and are less sanguine. At least the next 50 years should show which camp is correct.

Is that not a long time to wait for the resolution of what often seems to be a mere problem in mathematics? My forecast may be overlong, but we should not be surprised if a few more decades pass before it is clear whether string theory is a true description of the particles of matter or merely a blind alley. We should not forget that, in the 19th century, three decades passed between Faraday's experimental proof that electricity and magnetism are aspects of the same phenomenon and Maxwell's eventually successful theory of electromagnetism. Then, the mathematics Maxwell needed was amply described in textbooks; now, in string theory, it must be invented as people inch their way forward. Moreover, if string theory is successful in bridging gravitation and quantum mechanics, it will also provide a new picture of the pointlike elementary particles of matter, one that endows both space and time with a kind of microscopic structure on a scale so small that it cannot be probed by existing accelerator machines or any now in prospect. Yet as things are, there are no uniquely relevant experimental data. We must be patient.

Despite the illusion we enjoy that the pace of discovery is accelerating, the truth is that, in some fields of science, many goals appear to be attainable only slowly and by huge collective effort. To be sure, the spacecraft now exploring the solar system are usually designed a decade or so before their launch. After a century of seismology, only now are measurement and analytical techniques sensitive enough to promise that we shall soon have a picture of the interior of the planet on which we live, one that shows the rising convection plumes of mantle rock that drive the tectonic plates across the surface of Earth. Since the 1960s, molecular biologists have had the goal of understanding the way in which the genes of living organisms are regulated, but not even the simplest bacterium has yet been comprehensively accounted for. And we shall be lucky if the neural correlates of thinking are identified in the half-century ahead. The application of what we know already will enliven the decades immediately ahead, but there are many important questions that will be answered only with great difficulty.

And we shall be surprised. The discovery of living things of some kind elsewhere in the galaxy would radically change the general opinion of our place in nature, but there will be more subtle surprises, which, of necessity, cannot be anticipated. They are the means by which the record of the past 500 years of science has been repeatedly enlivened. They are also the means by which the half-century ahead will enthrall the practitioners and change the lives of the rest of us.


The Author

SIR JOHN MADDOX was lecturer in theoretical physics at the University of Manchester from 1949 to 1956 and editor-in-chief of Nature from 1966 to 1973 and from 1980 to 1995. He was knighted for "services to science" in 1995.