Skepticism about science and medicine

In search of disinterested science


How science has changed: Who are the scientists?

Posted by Henry Bauer on 2018/04/07

Scientists are people who do science. Nowadays, scientists are people who work at science as a full-time occupation and who earn their living at it.
Science means studying and learning about the natural world, and human beings have been doing that since time immemorial; indeed, in a sense all animals do that, but humans have developed efficient means to transmit gained knowledge to later generations.
At any rate, there was science long before [1] there were scientists, full-time professional students of Nature. Our present-day store of scientific knowledge includes things that have been known for thousands of years. For example, from more than 6,000 years ago in Mesopotamia (Babylon, Sumer) we still use base-60 mathematics for the number of degrees in the arcs of a circle (360), the number of seconds in a minute, and the number of minutes in an hour. We still cry “Eureka” (“I have found it!”) for a new discovery, as Archimedes supposedly did more than 2,000 years ago when he recognized that submerging an object in water is an easy way to measure its volume (by the rise in the water level) and that a floating object’s weight equals the weight of the water it displaces. The Islamic science of the Middle Ages has left its mark in language with, for instance, “algebra” and “alchemy”.
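(For readers who want the physics of that anecdote made explicit, the two ideas correspond to the standard textbook relations below, written in modern notation that is mine rather than Archimedes’: ρ for density, V for volume, W for weight, g for gravitational acceleration.)

\[
V_{\text{object, fully submerged}} = V_{\text{water displaced}}
\qquad\text{and}\qquad
W_{\text{object, floating}} = \rho_{\text{water}} \, g \, V_{\text{water displaced}}
\]

The first relation is what makes water displacement a gauge of volume; the second is Archimedes’ principle applied to a floating body.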
Despite those early pieces of science that are still with us today, most of what the conventional wisdom thinks it knows about science is based on what historians call “modern” science, which is generally agreed to have emerged around the 17th century in what is usually called The Scientific Revolution.
The most widely known bits of science are surely the most significant advances. Those are typically associated with the names of people who either originated them or made them popular [2]; so many school-children hear about Archimedes and perhaps Euclid and Ptolemy; and for modern science, even non-science college students are likely to hear of Galileo and Newton and Darwin and Einstein. Chemistry students will certainly hear about Lavoisier and Priestley and Wöhler and Haber; and so on, just as most of us have learned about general history in terms of the names of important individuals. So far as science is concerned, most people are likely to gain the general impression that it has been done and is being done by a relatively small number of outstanding individuals, geniuses in fact. That impression could only be entrenched by the common thought-bite that “science” overthrew “religion” sometime in the 19th century, leading to the contemporary role of science as society’s ultimate arbiter of true knowledge.
The way in which scientists in modern times have been featured in books and in films also gives the impression that scientists are somehow special, that they are by no means ordinary people. Roslynn Haynes [3] identified several stereotypes of scientists, for example “adventurer” or “the noble scientist as hero or savior of society”, with most stereotypes however being less than favorable — “mad, bad, dangerous scientist, unscrupulous in the exercise of power”. But no matter whether good or bad in terms of morals or ethics, society’s stereotype of “scientist” is “far from an ordinary person”.
That is accurate enough for the founders of modern science, but it became progressively less true as more and more people came to take part in some sort of scientific activity. Real change began in the early decades of the 19th century, when the term “scientist” seems to have been used for the first time [4].
By the end of the 19th century it had become possible to earn a living through being a scientist, through teaching or through doing research that led to commercially useful results (as in the dye-stuff industry) or through doing both in what nowadays are called research universities. By the early 20th century, scientists no longer deserved to be seen as outstanding individual geniuses, but they were still a comparatively elite group of people with quite special talents and interests. Nowadays, however, there is nothing distinctly elite about being a scientist. In terms of numbers (in the USA, around 2001), scientists at roughly 2.7 million are comparable to engineers at about 2.1 million, less elite than lawyers (~1 million) or doctors (~800,000); and teachers, at ~3.5 million, are almost as elite as scientists.
Nevertheless, so far as the general public and the conventional wisdom are concerned, there is still an aura of being special and distinctly elite associated with science and being a scientist, no doubt because science is so widely acknowledged as the ultimate authority on what is true about the workings of the natural world; and because “scientist” brings to most minds someone like Darwin or Einstein or Galileo or Newton.
So the popular image of scientists is wildly wrong about today’s world. Scientists today are unexceptional white-collar workers. Certainly a few of them could still be properly described as geniuses, just as a few engineers or doctors could be — or those at the high tail-end of any distribution of human talent; but by and large, there is nothing exceptional about scientists nowadays. That is an enormous change from times past, and the conventional wisdom has not begun to be aware of that change.
One aspect of that change is that the first scientists were amateurs seeking to satisfy their curiosity about how the world works, whereas nowadays scientists are technicians or technical experts who do what they are told to do by employers or enabled to do by patrons. A very consequential corollary is that the early scientists had nothing to gain by being untruthful, whereas nowadays the rewards potentially available to prominent scientists have tempted a significant number to practice varying degrees of dishonesty.
Another way of viewing the change that science and scientists have undergone is that science used to be a cottage industry largely self-supported by independent entrepreneurial workers, whereas nowadays science is a corporate behemoth whose workers are apparatchiks, cogs in bureaucratic machinery; and in that environment, individual scientists are subject to conflicts of interest and a variety of pressures owing to their membership in a variety of groups.

Science today is not a straightforward seeking of truth about how the world works; and claims emerging from the scientific community are not necessarily made honestly; and even when made honestly, they are not necessarily true. More about those things in future posts.

=======================================

[1]    For intriguing tidbits about pre-scientific developments, see “Timeline Outline View”
[2]    In reality, most discoveries hinge on quite a lot of work and learning that prefigured them and made them possible, as discussed for instance by Tony Rothman in Everything’s Relative: And Other Fables from Science and Technology (Wiley, 2003). The insight that what matters most is not the act of discovery but making it widely known is embodied in Stigler’s Law: discoveries are typically named after the last person who discovered them, not the first (S. M. Stigler, “Stigler’s Law of Eponymy”, Transactions of the N.Y. Academy of Sciences, II: 39 [1980] 147–58).
[3]    Roslynn D. Haynes, From Faust to Strangelove: Representations of the Scientist in Western Literature, Johns Hopkins University Press, 1994; also “Literature has shaped the public perception of science”, The Scientist, 12 June 1989, pp. 9, 11
[4]    William Whewell is usually credited with coining the term “scientist” in the early 1830s


From Dawn to Decadence: The Three Ages of Modern Science

Posted by Henry Bauer on 2012/12/03

[I’ve snitched my title from the book, From Dawn to Decadence: 500 Years of Western Cultural Life — 1500 to the Present, Jacques Barzun’s cultural tour de force published in 2000. It happens to fit for what’s happened to science in virtually the same period. Hardly surprising, since science has played such a prominent role in Western society during these centuries.]

The popular view of science isn’t historically informed, but it is based on the past. It doesn’t recognize that the activity we call “science” has changed in important ways over the centuries, that it continues to change, and that today’s “science” is not at all like that inherited image.
Much of the conventional wisdom about science reflects notions discussed a century or so ago and long abandoned by scholars of science, like “the scientific method”, thought up by philosophers trying to understand why science had been so successful. Popular icons of science also date to a century or so ago or even further back: Darwin, Einstein, Galileo, Newton. In reality, of course, most scientists are not at all like the famous few, but public discourse doesn’t have exemplary figures of what most scientists are like nowadays — technological analogues of Babbitt or men in grey flannel suits, performing banal routines more than producing inspired creativity. Just about everything associated with science in the 21st century is significantly different from what it was a century ago, even half a century ago.

The First Age of Modern Science:
Curious Amateurs Seeking Authentic Knowledge

Historians are in reasonable agreement that modern science had its beginnings in about the 17th century, marked by such figures as Galileo and Newton, and such events as the founding of the Royal Society of London. Some discrete, isolated bits of science and even more bits of technological skill from earlier times were incorporated, but what historians call “The” Scientific Revolution of about the 17th century was the beginning of an integrated venture using both theorizing and experimenting, and sharing the results in a somewhat organized way so that something like a coherent community of knowledge seekers formed. The people involved were said to be doing “natural philosophy” — seeking to understand Nature. Some of them were clergy who wanted to do it in service to God, as a way of understanding his ways better, while others were doing it just because they wanted to, whether out of sheer curiosity or in the hope of finding materially useful things. The essential point is that they were amateurs, doing what they loved. Their direct aim, unsullied by external conflicts of interest, was just to understand how the world works.
In this first age of modern science, flaws stemmed purely from human characteristics. People naturally took pride in their discoveries and wanted to be recognized for making them, and to be acknowledged as having made them first, and they could be heavily invested in their own theories and believing themselves to be right and others wrong. So there were arguments, sometimes quite bitter, typically over who had priority for a discovery. But those arguments were not exacerbated by interests external to science and knowledge-seeking.
That first age of modern science has left its mark on the contemporary view. Many people imagine that scientists nowadays are just self-driven by curiosity, that discovering the truth is their only interest. That can be accurate for some scientists, but it isn’t overall: most scientists nowadays are employees doing what they’re paid to do, no doubt wanting to do honest work but influenced by a variety of conflicts of interest, whose consequences I’ll discuss below and in later posts.

The Second Age of Modern Science:
Science as a Career

By the early 19th century, natural philosophy had accumulated a respectable amount of trustworthy knowledge about and understanding of Nature, enough to inspire confidence that even more could be learned in the future.
The term “science” was coming into use in something like its modern sense; William Whewell is generally credited with first use of the term “scientist” in the 1830s. So the professional identity of scientist came into being, and with it the possibility of making science a career, a way to earn a living: at first primarily through teaching, with research done as a sideline, but soon also through carrying out applied research, beginning with the dye-stuff industry and its synthesis of new and better dyes to replace dyes derived, expensively, from plants. In the later 19th century, Germany pioneered what have become “research universities”, where the teaching of undergraduates tends to play a subsidiary role.
Now it became not just a matter of personal satisfaction to get there first, to be acknowledged for it, and to be right while others were wrong; it was henceforth also a way to succeed in practical terms, rising to better positions. Making great discoveries could even lead to high social status, for example induction into the British peerage like William Thomson, who became the first Baron Kelvin, or Ernest Rutherford, who became the first Baron Rutherford of Nelson (New Zealand).
During the First World War, Germany lost access to the previously imported nitrates needed for explosives as well as fertilizers, and Fritz Haber found out how to synthesize the needed compounds from atmospheric nitrogen. Many other fundamental discoveries turned out to have practical applications. Industrial scientists could sometimes benefit from making patentable discoveries. But, by and large, the rewards from being a scientist came from the satisfaction of doing the work and being able to earn a decent living from doing something interesting.
In this second age of modern science, from about mid-19th century to about mid-20th century, science was in many ways an attractive career, but it was not a path one would choose if seeking wealth or an entrée into the halls of power.

The New Age of Modern Science:
Money and Politics

The Second World War introduced the present age of science, in which research can lead to great wealth and to considerable influence on those who construct national and international policies. Science is thereby subjected to strong external conflicts of interest. The funding and control of research are enmeshed in bureaucracy and competing interests. The aims of research may be purely profit-seeking rather than truth-seeking. Applications of research may be determined by personal or private or corporate interests even to the exclusion of the public good. The distinction between “pure” science seeking basic understanding and “applied” science based on trustworthy fundamental knowledge has become largely meaningless as more research is funded by patrons interested only in profitable outcomes rather than new understanding gained.
Something like a perfect storm ensued as these changes coincided with an inevitable change from seemingly endless expansion of scientific activity to an essentially zero-sum game where the total resources available for research can no longer grow appreciably.
From growth to steady state:
Derek Price, ground-breaking historian of science, had recognized that every available quantitative measure of science had increased exponentially, doubling about every 15 years since the 17th century: numbers of articles published, numbers of scientific journals, numbers of people who could be called “scientists”. The ethos of scientific activity was consonant with that, an expectation that every promising avenue could be explored, every graduating potential researcher would find employment doing science, every new result could find publication. Increasingly insiders as well as outsiders would look to numbers as gauges of success: numbers of articles published, numbers of students mentored, and especially in the New Age of modern science, numbers of grants collected and total amount of money raised.
The reality Price also saw was that by about mid-20th century, developed societies were devoting something like 2-3% of Gross Domestic Product to science, broadly defined as “Research & Development” and funded by private, public, and corporate patrons. That proportion could not continue to grow exponentially, to ~5% in 15 years, ~10% in 30 years, and so on. Science had reached its limit of growth relative to the rest of society, and would have to adjust to a steady state: doing one thing would mean not doing another; the numbers of prospective researchers graduated should be the numbers needed to replace retiring researchers; no new journals would need to be established. Measures of success would need to be more qualitative than quantitative. The traditional ethos of scientific activity would need to be replaced by different criteria or characteristics.
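To see the arithmetic concretely, here is a minimal sketch in Python (my illustration, not Price’s own calculation; the 1950 starting year and the 2.5% share are assumed purely for the example) of what doubling every 15 years would imply for the share of GDP devoted to research:

    # Illustrative only: assume R&D took ~2.5% of GDP in 1950 and kept
    # doubling every 15 years, as Price's exponential trend would imply.
    share = 2.5   # percent of GDP (assumed starting point)
    year = 1950
    while share <= 100:
        print(f"{year}: ~{share:.1f}% of GDP")
        share *= 2
        year += 15

The printed shares reach about 80% of GDP within 75 years, and one more doubling would take the figure past 100%, which is the sense in which science had reached its limit of growth relative to the rest of society.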
Those changes are needed, have been needed for decades, but they have not yet occurred.
John Ziman, a distinguished physicist turned STS scholar, detailed the necessary changes in ethos in Prometheus Bound (Cambridge University Press, 1994). The classic norms, whose definition is generally credited to Robert Merton, were that science was a universal public good characterized by disinterestedness and organized skepticism, to which Ziman added “originality”. These norms apply to something like the first age of science: curious people seeking understanding for its own sake, skeptical of new claims since experience had shown them to be fallible; Ziman’s addition of originality recognizes the value of creativity and progress.
In the second age, personal careerism and institutional interests sometimes interfered with disinterestedness or with organized skepticism; but in the third age, the new age, the norms of scientists’ behavior are entirely different. Ziman pointed out that research is now largely a matter of authoritative professional experts hired to produce wanted results, and the traditional universality of science is often subordinate to local demands.
What Ziman did not emphasize is that, under the new regime, the media and the public may be fed “scientific results” that are nowhere near as trustworthy as they used to be since they may be promulgated for institutional, bureaucratic or profit-making purposes, not because of a wish to disseminate genuine knowledge.
The enormous expansion in numbers of researchers has inevitably diluted their average quality, and the possibility of wealth and political influence has also brought a difference in the personalities of those who self-recruit into research. Increasingly, science is being done not out of the inherent curiosity of disinterested knowledge-seekers; rather, as Gordon Tullock put it (The Organization of Inquiry, Duke University Press, 1966; reprinted, Liberty Fund, 2004), researchers’ curiosity is induced by offers of rewards.
The new zero-sum steady-state funding of research together with more potential researchers than the resources can support has had seriously deleterious consequences: cutthroat competition, dishonesty, and consequent unreliability of public pronouncements by researchers and their patrons or employers.
What the media and the public and the policy makers hear about matters of science has become untrustworthy to a dangerous degree, on such important matters as HIV/AIDS and global warming — see my Dogmatism in Science and Medicine: How Dominant Theories Monopolize Research and Stifle the Search for Truth, McFarland, 2012.


 