Introduction to the Professions
Biology, Chemistry, and Physics 100

lecture notes for Tuesday-Thursday, 28-30 November 2006

History and Philosophy of Science

The History of Science

Many of you may have read biographies of famous scientists and gotten therefrom an idea of how scientific progress was made in the past. This is a valid approach to understanding the history of science, but it is incomplete. It is incomplete first in that it focuses on a few huge success stories, whereas enduring scientific progress is often made through the accumulation of insights contributed by hundreds of comparatively obscure researchers. It is also incomplete in that it usually excludes from consideration the cultural background before which the scientific effort is set. We'll try to remedy a bit of that incompleteness here.

Science, as defined in our first lecture, becomes a recognizable human activity in Europe in the early Renaissance, wherein the notion of the experimental method arose. Clearly technology goes much further back into European history than that; agriculture depends on technology, as do urban life and mining. Thus any society that farmed, lived in cities, or mined minerals had to have technology. But the notion of experiment and systematic observation in the sense that I described in September is a relatively new notion in western society.

The classical Greek thinkers did engage in systematic observation of nature. Aristotle made careful observations (sometimes inaccurate ones) of biological phenomena, and thinkers following him referenced his observations for many centuries. Much of the scientific writing of the Middle Ages in western Europe took the form of commentary on Aristotle's writings. Aristotle wasn't always right, but he did observe nature and write down his observations. But the very air of authority that his writings produced is additional evidence that the modern concept of science had not altogether emerged. Taking previous writers' observations as definitive, rather than independently corroborating them, is unscientific.

Did science thrive outside of western (European and Mediterranean) culture? Possibly. Certainly there was more systematic observation of nature in Arabic and Indian cultures between 300 and 1100 AD than there was in Europe. The fact that most of our current names for stars are Arabic names, and the fact that the zero and the current numerals we use are derived from Arabic practice, illustrate this. I know very little about science in what westerners call the Far East (China, Japan, Korea, and southeast Asia), so I would not presume to include them in a statement that "science was not invented until the twelfth century." But within European culture, I believe that statement is supportable.

The first western experimental scientists, then, were for the most part chemists, and their primary interest was in studying and manipulating what we now know are elements. They were trying in many cases to change base metals like lead and tin into gold. We now know how difficult it is to do that, and Albertus Magnus and his contemporaries did not have sources of energy or neutrons capable of effecting elemental transmutation. But along the way they did make some important observations and learned how to do experiments. For that we owe them a substantial debt of gratitude.

Progress in experimental science in the late Middle Ages was accompanied and helped by significant advances in observational science and mathematics. Scientists studied light, and produced an explanation of rainbows that looks nearly identical to our present understanding of them. The introduction of Indo-Arabic numerals and algebra (an Arabic word) into the west in the fourteenth century enabled rapid progress in basic mathematics and its application to science.

Recognizable scientific endeavor was reasonably commonplace by the Renaissance. Galileo was an archetypal Renaissance thinker: restless, interested in many things, eager to try new approaches. He was aware of the discoveries of his predecessors, and used them where it was appropriate to do so, but he devised and performed experiments when he needed to strike out on his own. His approach to studying natural phenomena established a pattern that has carried through to the present.

Until the late nineteenth century few researchers actually earned their living doing research. Of course there were scientists professionally employed at universities, teaching and doing research, but the notion that a government or an external sponsor would pay for scientific research had not become prevalent. Some governments paid for research into arms and armaments, but few paid for research into health, chemistry, or basic physics. So those kinds of science tended to be the province of wealthy individuals whose livelihood was derived from inheritance or another job, and whose equipment was inexpensive enough to enable the work to proceed in the experimenter's basement or back room. This is a partial explanation for the preponderance of titled aristocrats in the list of prominent scientists of the eighteenth and nineteenth centuries.

Even in the early twentieth century, science tended to be carried out on a shoestring. Government support for applied science in Europe and North America was growing, and German, English, and American university researchers were beginning to be able to get governmental support for research even into areas of basic science. Industrial laboratories like Bell Telephone Laboratories and Westinghouse's research operations began to support applied and even basic research. But outside of defense expenditures, governmental support for basic research was piecemeal and paltry.

The end of World War II brought about a change in attitude. Public figures recognized that the nuclear weapons that (for good or ill) brought about an end to the Japanese portion of the war and the V-2 rockets fired by the Germans onto British soil were products of a concerted technological and scientific effort. This recognition led to a call for postwar governments to invest in large-scale research programs, in both basic and applied science. Vannevar Bush was the principal advocate of including basic research in the mix in the United States, and Bush's proposals led to the establishment of respectable budgets for the National Science Foundation, the National Institutes of Health, and other science-centered governmental bodies. The paradigm in which individual researchers, most of them in universities or government laboratories, apply competitively for Federal support of their research programs became the dominant model for research in the United States, western Europe, and the more technologically advanced nations of the Communist bloc. Scientific research received an additional boost in 1957 when the Soviet Union successfully launched a satellite into orbit. The fear that the communists would overwhelm the west with superior technology and science led to increases in awareness of science in American schools and modest increases in support for research in the physical sciences. Much of that research was applied research, and indeed most of it was militarily motivated, but it did advance basic science as well.

Clearly the progress of science involves more elements than financial support. It involves the specific contributions of great scientists; an environment in which scientists can work productively, discuss their work with colleagues, and publish their works for others to read and digest; and a cultural milieu that values and makes use of scientific effort. Do these conditions hold in modern America and Europe? The answer is mostly yes, in spite of the deep distrust that Americans display toward intellectuals of any kind and the nihilistic cynicism that many Europeans display toward technology. Many former communist nations and third-world countries are developing the cultural and economic bases for the encouragement of research, and I believe that scientists will benefit from the improved attitude toward their work worldwide. The budget of the National Institutes of Health doubled between 1996 and 2002, enabling an across-the-board improvement in the funding of biological research. There have been slippages: it was harder in 1995 to get good biological research funded than it was in 1970 unless the research involved either AIDS or cancer, and the reduction in military expenditures in the 1990's made it difficult to fund some kinds of physics experiments that would have been easy to fund in the 1960's. Science in eastern Europe suffered greatly with the fall of Communism in the 1980's, and it is only beginning to recover.

In this country a huge percentage--probably a majority--of graduate students in the sciences are foreign-born. This speaks well for the attractiveness of the training that US universities offer to the rest of the world, but it speaks ill of the status of scientific research in US culture. Nonetheless, science is alive and well worldwide, and I see no prospect of a disappearance of research.

The challenges facing science in the twenty-first century are substantial. Society looks to science to solve serious social and environmental problems, some of which science and technology contributed to in the first place--like global warming and air pollution. At the same time many leaders are expressing doubts about the ability of scientists to contribute to solving those problems. Scientists will be working in a culture that expects a lot out of them, and has limited confidence in science's fitness for the tasks set before it. So, as Dickens described the turbulence of the French Revolution, "it [is] the best of times, it [is] the worst of times."