Skepticism about science and medicine

In search of disinterested science

Archive for December, 2012

Dysfunctional research funding

Posted by Henry Bauer on 2012/12/26

Contemporary scientific activity is marked by cutthroat competition and unconscionable conflicts of interest stemming from commercial and political sources (From Dawn to Decadence: The Three Ages of Modern Science, 2012/12/03).

One consequence is that researchers waste inordinate time and effort crafting proposals for funding, because only a small proportion of those proposals succeed (80% unemployment?! The research system is broken, 2012/12/11).

The incentive is therefore very great to find some startling tidbit that will make a proposal stand out from the crowd. Since genuinely startling discoveries are quite rare, an increasing number of would-be researchers have been making deliberately fraudulent claims, not only in their grant proposals but in their research publications as well (Dishonesty and dysfunction in science, 2012/12/16). That adds significantly and tragically to the dysfunctionality of the research system, misleading some unknown number of other researchers and wasting the time and effort of reviewers and editors and science writers.

In this hothouse environment of cutthroat competition and external commercial and political influence, what’s funded is often not the best science: for example, the majority of the most highly cited authors publishing biomedical research did not have funding from the National Institutes of Health, which is by far the greatest source of funding for such research (1). Evidently the manner in which research is funded does not conduce to supporting the best people or the best projects; perhaps because “Collegiality and careerism trump critical questions and bold new ideas” (2), not unexpected since “a large majority of the current members of NIH study sections — the people who recommend which grants to fund — do have NIH funding for their work irrespective of their citation impact, which is typically modest” (1; emphasis added).
It’s a universal experience that bureaucracies tend to favor banal mediocrities over creative people who tend to be prickly, unreasonable prima donnas. But as George Bernard Shaw noted long ago, progress depends on the unreasonable ones.

One possible amelioration of the system would be to fund people rather than projects, to direct funds toward the most already-proven creative, competent people rather than attempting to discern from promises in grant proposals what is most likely to bear valuable fruit (3). After all, the greatest advances have come by serendipity and because unusual individuals took note of details that most others had overlooked.

But before solutions can be contemplated, the underlying problems need to be fully understood. One is that the demand has far outstripped the available funds [this sentence is corrected from an earlier version, credit to Richard Karpinski for catching this]. Another is that this has exacerbated the natural tendency of the mainstream to be dogmatically contemptuous of non-mainstream approaches. Those who are prominent in the mainstream naturally wish to keep as much as possible of available resources for work that conforms to their own views. But the history of science is quite clear that the greatest advances have been those that overturned the mainstream consensus — eventually, after strong resistance (4-6).

Therefore and unfortunately, choosing the best people — those most likely to spur the greatest progress — is nowadays as problematic as choosing the best projects. A significant aspect of the present dysfunctionality is the hegemony exercised by the mainstream consensus in more and more fields (7). That hegemony determines the judgments made as to who the best people are. So a necessary part of ameliorating the unsatisfactory current situation is to find procedures that will bring funding to ventures currently dismissed or ignored by the mainstream hegemony. At present, creatively unorthodox individuals have few places to which they can turn: the National Center for Complementary and Alternative Medicine in NIH and the Defense Advanced Research Projects Agency (DARPA) in the Department of Defense are the significant ones, and each has severely limited funds. Private foundations for the most part seek to protect their reputations by using well-established advisers, i.e., people steeped in mainstream beliefs. By contrast, in the 1950s and 1960s researchers in many sciences had more than a few possible sources of research funding from a variety of federal agencies, very much including separate Army, Navy, and Air Force Research Offices, a productive situation that was effectively closed by the Mansfield Amendments (8), which limited federal agencies (other than the National Science Foundation and NIH) to supporting applied and not basic research.

One part of a solution might be to require that all advisory panels, review panels, and the like include competent, accomplished experts who are also known dissenters from the mainstream view. A similar approach would be to require that a certain percentage (say 5-10%) of research funds on any given topic be allotted to non-mainstream approaches. Without something like that, the mainstream hegemony will continue to ensure that banal mediocrity is funded to the exclusion of brilliant ideas and truly ground-breaking work.

————————————————
(1) Joshua M. Nicholson and John P. A. Ioannidis, “Conform and be funded”, Nature 492 (2012) 34-6
(2) Joshua M. Nicholson, “Collegiality and careerism trump critical questions and bold new ideas: A student’s perspective and solution”, BioEssays 34 (#6, 2012) 448-50
(3) John P. A. Ioannidis, “Fund people not projects”, Nature 477 (2011) 529-31
(4) Bernard Barber, “Resistance by scientists to scientific discovery”, Science 134 (1961) 596-602
(5) Gunther Stent, “Prematurity and uniqueness in scientific discovery”, Scientific American, December 1972, 84-93
(6) Ernest B. Hook (ed.), Prematurity in Scientific Discovery: On Resistance and Neglect, University of California Press, 2002
(7) Henry H. Bauer, Dogmatism in Science and Medicine: How Dominant Theories Monopolize Research and Stifle the Search for Truth, McFarland, 2012
(8) http://en.wikipedia.org/wiki/Mike_Mansfield:
The Mansfield Amendment of 1969, “passed as part of the fiscal year 1970 Military Authorization Act (Public Law 91-121) prohibited military funding of research that lacked a direct or apparent relationship to specific military function. Through subsequent modification the Mansfield amendment moved the Department of Defense toward the support of more short-term applied research in universities.” … This amendment affected the Military Services, for example research funding by the Office of Naval Research (ONR)….
The Mansfield Amendment of 1973 expressly limited appropriations for defense research through ARPA, which is largely independent of the Military Services, to projects with direct military application.…


Posted in funding research, resistance to discovery, science policy, scientists are human | 3 Comments »

Dishonesty and dysfunction in science

Posted by Henry Bauer on 2012/12/16

The traditional (Mertonian) norms intended to describe the behavior of scientists during the first and second ages of modern science included not only disinterestedness and organized skepticism but also “universalism” and “communalism”: scientific understanding as a freely shared public good, universal rather than local.

No more free sharing
The intrusion of politics and big money in the present-day 3rd age of modern science has effectively neutered that ideal of free sharing.

Secrecy for commercial purposes, including patenting, used to be restricted to industry and to “applied” science in general. But the distinction between pure and applied has eroded, and moreover universities — the traditional home of “pure” or “basic” research — have themselves become profit-seeking and patent-greedy. One consequence is that the sharing of information between researchers at universities has become subject to bureaucratic restrictions expressed in “Material Transfer Agreements” (Philip Mirowski, Science-Mart: Privatizing American Science, Harvard University Press, 2011).

Individuals as well as institutions have become secretive and wary of being scooped. In a notorious instance during the race to create high-temperature superconductors, the author of a manuscript inserted wrong information so that the reviewers would not be able to benefit from early knowledge of crucial details of the work; the information was corrected only when the article had reached the proof stage of publication (Robert M. Hazen, The Breakthrough: The Race for the Superconductor, Summit Books / Simon & Schuster, 1988).

Outright fraud
Deliberate dishonesty was rare during the first and second ages of modern science. By 1980, however, instances had become sufficiently common that two science journalists could suggest that it is endemic within science: William Broad & Nicholas Wade, Betrayers of the Truth: Fraud and Deceit in the Halls of Science (Simon & Schuster, 1982). Their claim to trace instances back for many centuries indicated, however, that fraud had actually been quite rare in times past, becoming disturbingly frequent only in modern times, in biomedical matters in particular (book review, 4S Review, 1 [#3, Fall 1983] 17-23).

That dishonesty has become much more common in science during the last three decades can be amply demonstrated. For instance, in 1989 the National Academy of Sciences (NAS) felt it necessary to publish a booklet entitled On Being a Scientist. By 1995, the 2nd edition had added a subtitle to emphasize ethical behavior — On Being a Scientist: A Guide to Responsible Conduct in Research — and this was downloaded 850 times from the NAS Press website. Since the 3rd edition of 2009 there have been 40,000 downloads.

Also in the 1980s, the National Institutes of Health found it necessary to establish an Office of Research Integrity (ORI; its name has changed several times over the years). ORI newsletters all too often have to report penalties imposed on individuals who have been found dishonest in grant applications or in other ways. Nowadays it is also required that universities receiving NIH grants provide courses in research ethics for their faculty and students; and many universities have set up their own offices of research integrity to ensure that their faculty and students are taught how to be honest in doing research. Such honesty is difficult to ensure, apparently, since there is a mushrooming industry carrying on research into research integrity: Centers for Research Ethics have sprung up at a number of universities, and there are opportunities for grant-getting for such scholarship — “Funding Opportunity Title: Research on Research Integrity (R21)”. Journals dedicated to the problem have of course been founded: Accountability in Research (volume 1 in 1989), Ethics in Science and Environmental Politics (volume 1 in 2001), Journal of Academic Ethics (since 2003), Research Ethics (since 2005), Journal of Empirical Research on Human Research Ethics (since 2006), and of course International Journal of Internet Research Ethics (since 2008). Dishonesty among PhDs and MDs has evidently become rampant.

Just how prevalent fraud has become in science is also illustrated by a proliferation not only of scholarly journals but also news items, blogs, and websites concerned with the problem. Much of the media still find this astonishing: “A surprising upsurge in the number of scientific papers that have had to be retracted because they were wrong or even fraudulent has journal editors and ethicists wringing their hands” (emphasis added; New York Times, Editorial — Fraud in the scientific literature, 5 October 2012). Individual scientists come to recognize the problem not because it has become fully recognized within the scientific community but from unhappy personal experience (see e.g. the website Science Fraud: Highlighting Misconduct in Life Sciences Research). A few people, however, are recognizing that this points to systemic dysfunction; see e.g. Horace Freeland Judson, The Great Betrayal: Fraud in Science (2004), or Pete Etchells and Suzi Gage, “Scientific fraud is rife: it’s time to stand up for good science. The way we fund and publish science encourages fraud” (emphasis added; Guardian blog, 2 November 2012).

The crux of the matter is that too many would-be researchers are competing for inadequate available resources, under burdensome demands by universities as well as commercial institutions that researchers get grants and make patentable discoveries. No amount of regulation, or education in ethics, can bring disinterested ethical behavior when all the incentives point the opposite way, urging speedy production of profitable outcomes which in the normal course of scientific work can never be guaranteed, let alone quickly.

Dogmatism and barriers to progress
Outright fraud is only the most obviously damaging feature of this 3rd age of modern science. The absolute necessity for researchers to obtain uninterrupted flows of grant money brings enormous pressure to be working along productive lines, not to be wrong. But the essence of research is to enlarge understanding, which means venturing into the unknown. By definition, the unknown is a mystery, and by easy extension the outcome of genuine research is not predictable. Surely every serious scientist has sometimes hit a dead end and made mistakes along the way; the very history of science is a story of trials and errors. Therefore seeking to avoid making any mistakes, or taking on only projects that are guaranteed to succeed, means restricting research to banalities.

Furthermore, if one nevertheless goes wrong, for instance by clinging too long to a superseded theory, the incentives are strong to resist acknowledging the mistake for as long as possible. Established leaders, who as a group control available resources — grants, hiring, publishing — are in a good position to stave off threats to the established mainstream consensus. So contemporary science has also seen a marked increase in dogmatic adherence to outmoded approaches and interpretations; see Dogmatism in Science and Medicine: How Dominant Theories Monopolize Research and Stifle the Search for Truth.

The problem is clear, the solution is not
I wish I could suggest remedies whose early introduction might be feasible. But a necessary first step is to understand what the problem is. No amount of research into research integrity is needed to recognize that the hothouse environment of cutthroat competition brings to would-be researchers temptations that a significant proportion of scientists are unable to resist.

The system of scientific and medical research has become seriously dysfunctional. Perhaps my analogy of success rates in grant-getting with actual unemployment was somewhat forced (80% unemployment?! The research system is broken), but it is surely no exaggeration to describe it as absurdly dysfunctional when senior researchers as well as would-be scientists have to construct 5 or 6 grant proposals for every one that succeeds. Instead of doing research, scientists spend huge amounts of time and effort on grant-writing (John P. A. Ioannidis, “Fund people not projects”, Nature 477 [2011] 529-31); and universities and other research institutions even have grant-writing specialists to assist their scientists by providing marketing and public-relations skills to make the grants appear more impressive.

The very system of project grants has become dysfunctional; for a cogently argued and documented discussion, see Donald W. Miller, Jr., “The government grant system: Inhibitor of truth and innovation?”, Journal of Information Ethics, 16 (2007) 59-69.
Half a century ago, it could seem appropriate to fund research in response to requests generated by scientists themselves. But as competition increased, attempts to judge competing requests led to increasingly inappropriate criteria; for instance, grant proposals are commonly expected to forecast the value of what the research will generate, when everyone knows that the most valuable results come serendipitously and not necessarily in line with researchers’ aims or expectations.

The whole research enterprise has become too large, too bureaucratic, too thoroughly dysfunctional for its own good and for the public good.

Posted in fraud in science, funding research, science policy | 7 Comments »

80% unemployment?! The research system is broken

Posted by Henry Bauer on 2012/12/11

Try to imagine what it would be like to attempt to make a career in an occupation that has an 80% rate of unemployment and where on average you get to be 40 years old before landing your first full-time job.

Impossible to imagine, isn’t it?

That thought-experiment serves to describe present-day cutthroat competition in many research fields.

People who want to do scientific research on topics chosen by themselves rather than by an industrial employer generally seek careers in academe. For several decades now, academic researchers have needed to obtain grants from outside their university: otherwise they don’t get tenure in the first place or promotion later. At my own university, for example, already 30 years ago grant-getting of $100,000 per year was a requirement for tenure in the College of Engineering, and 3 times that for promotion to full professor. Nowadays at the University of California, Berkeley, molecular-biology faculty need to command about ¼ million dollars in external funds per year if they are to be able to mentor a graduate student in research (see p. 72 in Dogmatism in Science and Medicine: How Dominant Theories Monopolize Research and Stifle the Search for Truth) — and mentoring graduate students is virtually synonymous with getting research done, having a research career.

The major source of grants for research in biology is the NIH (National Institutes of Health). By 2011, only 18% of grant applications to NIH were successful, and the average age at which an individual first obtained a grant as Principal Investigator (PI) was 42.
Before they get their own grants, researchers have to work for other PIs, so they are not independent researchers and have not begun an independent research career.

In 1980, the first year in which NIH gathered age data for PIs, that average age had been about 36 or 37. That may still sound rather old to begin a career, but it is not many years older than the average age — rarely less than 30 — at which one can have obtained a Ph.D. and have experienced the essentially mandatory few years as a postdoctoral fellow. The increase since 1980 by 5 or 6 years represents about a doubling of the time between being qualified for a job and actually getting one. That’s a significant increase, especially since many graduates have large student-loan debt waiting to be paid off and gathering interest in the meantime.
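The back-of-the-envelope arithmetic behind "about a doubling" can be sketched as follows; the figures are simply the approximate ones quoted above, not exact NIH averages:

```python
# Rough check of the "about a doubling" claim, using the approximate
# figures cited in the text (not exact NIH statistics).
age_qualified = 30        # rarely less than 30: Ph.D. plus mandatory postdoc years
first_grant_1980 = 36.5   # NIH average age of first-time PIs, ~36-37
first_grant_2011 = 42     # NIH average age of first-time PIs by 2011

wait_1980 = first_grant_1980 - age_qualified   # ~6.5 years between qualifying and first grant
wait_2011 = first_grant_2011 - age_qualified   # ~12 years

print(wait_2011 / wait_1980)   # roughly 1.85, i.e. about a doubling of the wait
```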

Another take on these data looks at how many PIs are at the lower and upper ends of the age range:

[Figure: age distribution of NIH Principal Investigators, early 1980s vs. 2010]
In the early 1980s, 1 in 5 or 6 PIs had been 36 or younger, while a negligible percentage were 66 or older (65 had then been the mandatory retirement age almost everywhere). By 2010, only 3% of PIs were 36 or younger and more than twice as many were 66 or older.

These numbers indicate what it’s like to pursue a research career in today’s hothouse environment of cutthroat competition; and not only in biology or medicine. In many other areas of science, researchers look to the National Science Foundation (NSF) for grants. When I first applied for an NSF grant as a chemist at the University of Kentucky in 1967, my colleagues and I had a success rate of about 50%; by the time I left the department a decade later, the success rate had declined to about 10%.

As Derek Price had predicted (From Dawn to Decadence: The Three Ages of Modern Science), science experienced crises during the second half of the 20th century as the proportion of GDP available for research no longer increased. But the fact of pervasive crisis has hardly been recognized outside the small specialty of STS (science and technology studies). NSF continued to press for more support to train more graduate students for scientific work, and official policies pressed for more doctors and more medical research. So more and more people have been competing more and more desperately for bits of a pie that has not been growing commensurately with the number of would-be researchers.
Few in or outside academe saw this coming. I recall being surprised, in 1965 at the University of Michigan, when a fresh Ph.D. told me that he intended to look for a job in industry in order to stay out of the academic rat-race — I was still imbued then with the traditional view of academe as an ivory tower. I did soon notice how competitive grant-seeking was becoming, but I had no sense of the wide-ranging future consequences.

An overt consequence of the crisis has been a marked increase in dishonesty, including keeping important information secret as well as outright deliberate cheating within science. More about that soon.

Posted in funding research, science policy | 4 Comments »

The culture and the cult of science

Posted by Henry Bauer on 2012/12/07

About contemporary science I wrote in my last post:
What the media and the public and the policy makers hear about matters of science has become untrustworthy to a dangerous degree, on such important matters as HIV/AIDS and global warming — see my Dogmatism in Science and Medicine: How Dominant Theories Monopolize Research and Stifle the Search for Truth, McFarland, 2012.

The lack of trustworthiness results from the manner in which science has become entangled with politics and big money in this 3rd age of modern science. Personal and institutional conflicts of interest are everywhere, largely as consequences of external funding by patrons whose prime aim is profit-seeking and not truth-seeking. But even in the absence of such distortions or corruptions, outsiders should beware of trusting what scientists say because their statements reflect their own peculiar culture and can easily be misinterpreted if one does not understand the culture of science, in particular the unspoken presumptions that underlie and guide scientific work.

How different the mindsets of scientists and of non-scientists are can be illustrated by scientists’ remarks that outsiders might well judge to be so peculiar as to border on insanity. Take cosmology as an example. New Scientist just published an article, “Before the big bang: something or nothing” (by Marcus Chown, magazine issue 2893, 3 December 2012, pp. 32-5). The sub-title is “Has the cosmos existed forever, or did something bring it into existence? Time to grapple with the universe’s greatest mystery”. Chown reports that there have long been these opposing views — that the cosmos has always existed, or that something brought it into being. And now “cosmologists Alex Vilenkin and Audrey Mithani claimed to have settled the debate. They have uncovered reasons why the universe cannot have existed forever.” Then follow details of the history of the debate and the evidence and arguments that Vilenkin and Mithani now offer.

What’s entirely missing is any consideration of the question, How are human beings supposed to understand the question of whether the universe had a beginning or not?

Our understanding of what it means for something to have a beginning, to have come into being “from nothing”, can only be based on human experience of Earthly objects. We have no touchstone for what “nothing” could mean in the context of the universe or the cosmos. That scientists are content to talk about such things doesn’t make their discourse meaningful in any human sense.

Cosmologists also distinguish between the cosmos, presumably “everything”, and our universe, which may not be everything, because cosmologists are quite happy to discuss the “many worlds interpretation” of quantum mechanics. That solves issues of quantum uncertainty by postulating that every event spawns not only what we know happens but also what else might have happened (by quantum-mechanical logic). Every event creates a new universe, in other words, so whole worlds or universes are spawned at every Earthly event.
Shouldn’t I rather have put scare quotes around “solves”? How does such a fantasy really help to understand quantum uncertainty — or anything else, for that matter?

Scientists qua scientists are not bothered by questions of that sort. They’re happy manipulating equations, seeking models that mimic observations, and speculating about the meanings underlying those equations and models. If an equation seems to contain the possibility of a universe arising “from nothing”, then they are content to believe that might be what actually happens. The fact that their equation’s “nothing” has nothing to do with any human meaning of “nothing” doesn’t bother the experts.

On this issue, cosmologists are exemplary of scientists in general. Physicists — high-energy or particle physicists — can be equally happy pretending or believing that equations can reveal actual reality, that the equations are somehow synonymous with reality. So when long effort appeared to have found signs that the Higgs boson actually “exists”, one overjoyed physicist declared that now we understood where mass comes from. Another physicist who recently published a book about the Higgs explained in an interview that now we know everything about the matter we are familiar with, and it remains only to understand the dark energy and the dark matter that make up the rest of the universe — or so they now believe, even though these “dark” things are fudge factors about which we know absolutely nothing except that they’re needed to make equations seem to fit reality.
When geneticists had decoded the DNA sequences of a number of organisms, humans included, one enthusiastic expert declared that we now understood that yeast is just like us. Outsiders might respond that yeast is quite unlike us in almost every way that matters to the everyday life of human beings.

Such differences in mindset illustrate that science is not just a particular area of knowledge, it’s a culture: scientists behave and believe and use language differently than do non-scientists. “Reality” does not mean the same thing in the scientific culture as it does in everyday life. Many other terms mean different things inside and outside the scientific culture. Moreover there are sub-cultures: physicists and chemists are culturally different — for example, that something is “stable” means different things in chemistry and in physics. And experimental chemists and theoretical chemists differ in a number of ways. Every so-called intellectual discipline, in fact, is in a meaningful sense a culture; see Disciplines as cultures and Barriers against interdisciplinarity.

A good way to think about science is as a glorious entertainment (Jacques Barzun, Science: The Glorious Entertainment, Harper & Row, 1964) that can be entirely addictive and all-consuming for its practitioners, who may even become dysfunctional in everyday practical terms. The stereotypes of the mad scientist and the absent-minded professor are not without grains of truth. It seems to me that believing that human beings are capable of understanding how the cosmos began, or how it has always existed, is not a mark of sanity. I suggest that human brains or minds or spirits are simply incapable of grasping either of those possibilities in any meaningful way. I think Fred Hoyle took the same view in his superb The Black Cloud, a science-fiction novel populated by entirely authentic scientific characters and informed by a fine understanding of the scientific culture.

Unfortunately it isn’t widely understood that science is an alien culture. What’s worse, science has become a cult of supposedly authoritative knowledge: its pronouncements are taken as unquestionable truth, and what is merely a mainstream consensus becomes dogma (Dogmatism in Science and Medicine: How Dominant Theories Monopolize Research and Stifle the Search for Truth). That makes it vital to recognize that pronouncements by the scientific community are functionally in a language that differs from the common tongue. Journalists, policy makers, people in general need to learn that scientists have a peculiar mind-set and that what they say can easily be misinterpreted. Scientists see things from the viewpoint of their own work, not what it would mean to translate their hypotheses into social action.

Doing research, one inevitably proceeds as though one’s hypothesis were true. One has to use some basis for designing research projects, and in practice the basis that one chooses is thereby treated as though it were true; it isn’t itself questioned. So, for instance, the people who are working on computer models that attempt to describe climate must, in the course of their work, take a number of assumptions for granted, as if they were known to be true. When speaking to reporters or policy makers, the modelers of course don’t point that out — they themselves are hardly ever conscious of it; it isn’t on the tip of their tongues. So what the experts take as highly probable, as a useful basis for further work, may be misinterpreted by policy makers as having been established beyond doubt.
When further work shows that a research project was misguided in some way, that one or other presumption was not valid, it matters only to the researchers. It might make it harder to get another grant, it might hinder rather than help a career, but it doesn’t matter outside the research group and perhaps its patrons or employers. But when national or international policy goes awry, the consequences could be very damaging to large numbers of people and institutions and even nations.

If the global-warming modelers are wrong and emissions of carbon dioxide are not appreciably adding to climate change, the modelers may have to look harder for research grants and they may lose prestige in the halls of power. But if carbon dioxide is not appreciably adding to climate change and nations have based thoroughgoing changes in energy production and manufacturing on that mistaken premise, standards of living will have been greatly damaged for huge numbers of people.

Policy makers and scientists base their judgments on quite different criteria. For a given set of actual scientific data, scientists and policy makers might well reach opposite judgments as to what should be done in the actual practice of legislation and regulations.

So science has become unreliable not just because of money and politics, it can also be an unreliable guide to action because scientists live in a world that is almost free of consequences for actions taken, a world in which what they do has no consequences outside their experiments and calculations.
When policy makers imagine misguidedly that the advice they receive from scientists is attuned to the real world, the consequences may be little short of disastrous. That scientists rather than STS scholars are typically used as advisers to policy makers about matters of science and technology carries considerable risks. The case of supposedly human-caused global warming serves as a scary example.

Posted in global warming, politics and science, science is not truth, science policy, scientific culture, scientism, scientists are human | 5 Comments »

From Dawn to Decadence: The Three Ages of Modern Science

Posted by Henry Bauer on 2012/12/03

[I’ve snitched my title from the book, From Dawn to Decadence: 500 Years of Western Cultural Life — 1500 to the Present, Jacques Barzun’s cultural tour de force published in 2000. It happens to fit for what’s happened to science in virtually the same period. Hardly surprising, since science has played such a prominent role in Western society during these centuries.]

The popular view of science isn’t historically informed, but it is based on the past. It doesn’t recognize that the activity we call “science” has changed in important ways over the centuries, that it continues to change, and that today’s “science” is not at all like the popular view.
Much of the conventional wisdom about science reflects notions discussed a century or so ago and long abandoned by scholars of science, like “the scientific method”, thought up by philosophers trying to understand why science had been so successful. Popular icons of science also date to a century or so ago or even further back — Darwin, Einstein, Galileo, Newton. In reality, of course, most scientists are not at all like the famous few, but public discourse doesn’t have exemplary figures of what most scientists are like nowadays — technological analogues of Babbitt or men in grey flannel suits, performing banal routines more than producing inspired creativity. Just about everything associated with science in the 21st century is significantly different from what it was a century ago, even half a century ago.

The First Age of Modern Science:
Curious Amateurs Seeking Authentic Knowledge

Historians are in reasonable agreement that modern science had its beginnings in about the 17th century, marked by such figures as Galileo and Newton, and such events as the founding of the Royal Society of London. Some discrete, isolated bits of science and even more bits of technological skill from earlier times were incorporated, but what historians call “The” Scientific Revolution of about the 17th century was the beginning of an integrated venture using both theorizing and experimenting, and sharing the results in a somewhat organized way so that something like a coherent community of knowledge seekers formed. The people involved were said to be doing “natural philosophy” — seeking to understand Nature. Some of them were clergy who wanted to do it in service to God, as a way of understanding his ways better, while others were doing it just because they wanted to, whether out of sheer curiosity or in the hope of finding materially useful things. The essential point is that they were amateurs, doing what they loved. Their direct aim, unsullied by external conflicts of interest, was just to understand how the world works.
In this first age of modern science, flaws stemmed purely from human characteristics. People naturally took pride in their discoveries and wanted to be recognized for making them, and to be acknowledged as having made them first; and they could be heavily invested in their own theories, believing themselves to be right and others wrong. So there were arguments, sometimes quite bitter, typically over who had priority for a discovery. But those arguments were not exacerbated by interests external to science and knowledge-seeking.
That first age of modern science has left its mark on the contemporary view. Many people imagine that scientists nowadays are self-driven purely by curiosity, that discovering the truth is their only interest. That can be accurate for some scientists, but not overall: most scientists nowadays are employees doing what they’re paid to do, no doubt wanting to do honest work but influenced by a variety of conflicts of interest, whose consequences I’ll discuss below.

The Second Age of Modern Science:
Science as a Career

By the early 19th century, natural philosophy had accumulated a respectable amount of trustworthy knowledge about and understanding of Nature, enough to inspire confidence that even more could be learned in the future.
The term “science” was coming into use in something like its modern sense; William Whewell is generally credited with coining the term “scientist” in the 1830s. So the professional identity of scientist came into being, and with it the possibility of making science a career, a way to earn a living: at first primarily through teaching, with research as a sideline, but soon also through applied research, beginning with the dye-stuff industry based on the synthesis of new and better dyes to replace those earlier derived, expensively, from plants. In the later 19th century, Germany pioneered what have become “research universities”, where the teaching of undergraduates tends to play a subsidiary role.
Getting there first, being acknowledged for it, and being right while others were wrong was now not just a matter of personal satisfaction; it was henceforth a way to succeed in practical terms, by rising to better positions. Making great discoveries could even lead to high social status, for example induction into the British peerage, like William Thomson who became the first Baron Kelvin, or Ernest Rutherford who became the first Baron Rutherford of Nelson (New Zealand).
During the First World War, Germany lost access to the previously imported nitrates needed for explosives as well as fertilizers, and Fritz Haber found out how to synthesize ammonia from the atmosphere’s nitrogen. Many other fundamental discoveries turned out to have practical applications. Industrial scientists could sometimes benefit from making patentable discoveries. But, by and large, the rewards of being a scientist came from the satisfaction of doing the work and being able to earn a decent living doing something interesting.
In this second age of modern science, from about mid-19th century to about mid-20th century, science was in many ways an attractive career, but it was not a path one would choose if seeking wealth or an entrée into the halls of power.

The New Age of Modern Science:
Money and Politics

The Second World War introduced the present age of science, in which research can lead to great wealth and to considerable influence on those who construct national and international policies. Science is thereby subjected to strong external conflicts of interest. The funding and control of research are enmeshed in bureaucracy and competing interests. The aims of research may be purely profit-seeking rather than truth-seeking. Applications of research may be determined by personal or private or corporate interests even to the exclusion of the public good. The distinction between “pure” science seeking basic understanding and “applied” science based on trustworthy fundamental knowledge has become largely meaningless as more research is funded by patrons interested only in profitable outcomes rather than new understanding gained.
Something like a perfect storm ensued as these changes coincided with an inevitable transition from seemingly endless expansion of scientific activity to an essentially zero-sum game in which the total resources available for research can no longer grow appreciably.
From growth to steady state:
Derek Price, ground-breaking historian of science, had recognized that every available quantitative measure of science had increased exponentially since the 17th century, doubling about every 15 years: numbers of articles published, numbers of scientific journals, numbers of people who could be called “scientists”. The ethos of scientific activity was consonant with that: an expectation that every promising avenue could be explored, that every graduating potential researcher would find employment doing science, that every new result could find publication. Increasingly, insiders as well as outsiders looked to numbers as gauges of success: numbers of articles published, numbers of students mentored, and, especially in the New Age of modern science, numbers of grants collected and total amount of money raised.
The reality Price also saw was that by about mid-20th century, developed societies were devoting something like 2-3% of Gross Domestic Product to science, broadly defined as “Research & Development” and funded by private, public, and corporate patrons. That proportion could not continue to grow exponentially, to ~5% in 15 years, ~10% in 30 years, and so on. Science had reached its limit of growth relative to the rest of society, and would have to adjust to a steady state: doing one thing would mean not doing another; the numbers of prospective researchers graduated should be the numbers needed to replace retiring researchers; no new journals would need to be established. Measures of success would need to be more qualitative than quantitative. The traditional ethos of scientific activity would need to be replaced by different criteria or characteristics.
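The impossibility Price identified can be made concrete with a few lines of arithmetic. The following sketch (my hypothetical illustration of the doubling figure, not anything from Price's own calculations) projects what a ~2.5% share of GDP doubling every 15 years would become:

```python
# A quantity doubling every 15 years grows as 2^(t/15); applied to a GDP
# share, the projection quickly becomes absurd, which is Price's point.

def projected_share(initial_share: float, years: float,
                    doubling_time: float = 15.0) -> float:
    """Project a GDP share (in percent) that doubles every `doubling_time` years."""
    return initial_share * 2 ** (years / doubling_time)

if __name__ == "__main__":
    # Starting from ~2.5% of GDP, as Price observed for mid-20th-century R&D:
    for years in (0, 15, 30, 45, 60, 75, 90):
        print(f"after {years:2d} years: {projected_share(2.5, years):6.1f}% of GDP")
```

At that rate the share would exceed 100% of GDP in roughly 80 years, so the growth had to stop well before then.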
Those changes are needed, have been needed for decades, but they have not yet occurred.
John Ziman, a distinguished physicist turned STS scholar, detailed the necessary changes in ethos in Prometheus Bound (Cambridge University Press, 1994). The classic norms, whose definition is generally credited to Robert Merton, held that science is a universal public good characterized by disinterestedness and organized skepticism; to these Ziman added “originality”. These norms apply to something like the first age of science: curious people seeking understanding for its own sake, skeptical of new claims since experience had shown them to be fallible; Ziman’s addition of originality recognizes the value of creativity and progress.
In the second age, personal careerism and institutional interests sometimes interfered with disinterestedness or with organized skepticism; but in the third age, the new age, the norms of scientists’ behavior are entirely different. Ziman pointed out that research is now largely a matter of authoritative professional experts hired to produce wanted results, and the traditional universality of science is often subordinate to local demands.
What Ziman did not emphasize is that, under the new regime, the media and the public may be fed “scientific results” that are nowhere near as trustworthy as they used to be since they may be promulgated for institutional, bureaucratic or profit-making purposes, not because of a wish to disseminate genuine knowledge.
The enormous expansion in the number of researchers has inevitably diluted their average quality, and the possibility of wealth and political influence has also changed the personalities of those who self-recruit into research. Increasingly, science is done not out of the inherent curiosity of disinterested knowledge-seekers; rather, as Gordon Tullock put it (The Organization of Inquiry, Duke University Press, 1966; reprinted, Liberty Fund, 2004), researchers’ curiosity is induced by offers of rewards.
The new zero-sum, steady-state funding of research, together with more potential researchers than the resources can support, has had seriously deleterious consequences: cutthroat competition, dishonesty, and consequent unreliability of public pronouncements by researchers and their patrons or employers.
What the media and the public and the policy makers hear about matters of science has become untrustworthy to a dangerous degree, on such important matters as HIV/AIDS and global warming — see my Dogmatism in Science and Medicine: How Dominant Theories Monopolize Research and Stifle the Search for Truth, McFarland, 2012.

Posted in politics and science, science is not truth, science policy, scientists are human, the scientific method | Tagged: , | 7 Comments »