Skepticism about science and medicine

In search of disinterested science


How science changed — IV. Cutthroat competition and outright fraud

Posted by Henry Bauer on 2018/04/15

The discovery of the structure of DNA was a metaphorical “canary in the coal mine”, warning of the intensely competitive environment that was coming to scientific activity. The episode illustrates in microcosm the seismic shift in the circumstances of scientific activity that started around the middle of the 20th century [1], the replacement of one set of unwritten rules by another set [2].
The structure itself was discovered by Watson and Crick in 1953, but it was only in 1968, with the publication of Watson’s personal recollections, that attention became focused on how Watson’s approach and behavior marked a break from the traditional unwritten rules of scientific activity.
It took even longer for science writers and journalists to realize just how cutthroat the competition had become in scientific and medical research. Starting around 1980 there appeared a spate of books describing fierce fights for priority on a variety of specific topics:
•    The role of the brain in the release of hormones; Guillemin vs. Schally — Nicholas Wade, The Nobel Duel: Two Scientists’ 21-year Race to Win the World’s Most Coveted Research Prize, Anchor Press/Doubleday, 1981.
•    The nature and significance of a peculiar star-like object — David H. Clark, The Quest for SS433, Viking, 1985.
•    “‘Mentor chains’, characterized by camaraderie and envy, for example in neuroscience and neuropharmacology” — Robert Kanigel, Apprentice to Genius: The Making of a Scientific Dynasty, Macmillan, 1986.
•    High-energy particle physics, atom-smashers — Gary Taubes, Nobel Dreams: Power, Deceit, and the Ultimate Experiment, Random House, 1986.
•    “Soul-searching, petty rivalries, ridiculous mistakes, false results as rivals compete to understand oncogenes” — Natalie Angier, Natural Obsessions: The Search for the Oncogene, Houghton Mifflin, 1987.
•    “The brutal intellectual darwinism that dominates the high-stakes world of molecular genetics research” — Stephen S. Hall, Invisible Frontiers: The Race to Synthesize a Human Gene, Atlantic Monthly Press, 1987.
•    “How the biases and preconceptions of paleoanthropologists shaped their work” — Roger Lewin, Bones of Contention: Controversies in the Search for Human Origins, Simon & Schuster, 1987.
•    “The quirks of . . . brilliant . . . geniuses working at the extremes of thought” — Ed Regis, Who Got Einstein’s Office?: Eccentricity and Genius at the Institute for Advanced Study, Addison-Wesley, 1987.
•    High-energy particle physics — Sheldon Glashow with Ben Bova, Interactions: A Journey Through the Mind of a Particle Physicist and the Matter of the World, Warner, 1988.
•    Discovery of endorphins — Jeff Goldberg, Anatomy of a Scientific Discovery, Bantam, 1988.
•    “Intense competition . . . to discover superconductors that work at practical temperatures” — Robert M. Hazen, The Breakthrough: The Race for the Superconductor, Summit, 1988.
•    Science is done by human beings — David L. Hull, Science as a Process, University of Chicago Press, 1988.
•    Competition to get there first — Charles E. Levinthal, Messengers of Paradise: Opiates and the Brain, Anchor/Doubleday, 1988.
•    “Political machinations, grantsmanship, competitiveness” — Solomon H. Snyder, Brainstorming: The Science and Politics of Opiate Research, Harvard University Press, 1989.
•    Commercial ambitions in biotechnology — Robert Teitelman, Gene Dreams: Wall Street, Academia, and the Rise of Biotechnology, Basic Books, 1989.
•    Superconductivity, intense competition — Bruce Schechter, The Path of No Resistance: The Story of the Revolution in Superconductivity, Touchstone (Simon & Schuster), 1990.
•    Sociological drivers behind scientific progress, and a failed hypothesis — David M. Raup, The Nemesis Affair: A Story of the Death of Dinosaurs and the Ways of Science, Norton, 1999.

These titles illustrate that observers found intense competitiveness wherever they looked in science, though mostly in medical or biological science, with physics (including astronomy) the next most frequently mentioned field of research.
Watson’s memoir not only featured competition most prominently; it also revealed that older notions of ethical behavior no longer applied: Watson was determined to get access to competitors’ results even when those competitors were not yet ready to reveal everything to him [3]. Nor was it only competitiveness that increased steadily over the years; so too did the willingness to engage in behavior that not so long before had been regarded as improper.
Amid the spate of books about how competitive research had become, there also appeared Betrayers of the Truth: Fraud and Deceit in the Halls of Science by science journalists William Broad and Nicholas Wade (Simon & Schuster, 1982). The book argued that dishonesty has always been present in science, citing in an appendix 33 “known or suspected” cases of scientific fraud from 1981 back to the 2nd century BC. Those data could not support the book’s sweeping generalizations [4], but Broad and Wade were very early in drawing attention to the fact that dishonesty in science was a significant problem. What they failed to appreciate was why: not that there had always been a notable frequency of fraud in science, but that scientific activity was changing in ways that were making it a different kind of thing from what it had been during the halcyon centuries of modern science, from the 17th century to the middle of the 20th.
Research misconduct had featured in Congressional hearings as early as 1981. The Department of Health and Human Services subsequently established an Office of Scientific Integrity, now the Office of Research Integrity, whose mission is to instruct research institutions about preventing fraud and about dealing with allegations of it. Scientific periodicals began to ask authors to disclose conflicts of interest, and co-authors to state specifically which portions of the work were their individual responsibility.
Centers for research ethics and for medical ethics have proliferated in academe [5], and there are now periodicals devoted entirely to such matters [6]. Courses in research ethics have become increasingly common; institutions that receive research funds from federal agencies are even required to make such courses available.
In 1989, the Committee on the Conduct of Science of the National Academy of Sciences issued the booklet On Being a Scientist, which describes proper behavior; that booklet’s third edition, subtitled A Guide to Responsible Conduct in Research, makes even clearer that the problem of scientific misconduct is now widely seen as serious.
Another indication that dishonesty has increased is the now quite frequent retraction of published research reports: Retraction Watch estimates that 500–600 published articles are retracted annually. John Ioannidis has made a specialty of reviewing the literature for consistency, reporting for instance “Why most published research findings are false” [7]. Nature maintains an archive devoted to this phenomenon [8].

Researchers half a century ago would have been aghast at all this, disbelieving that science could become so untrustworthy. It has happened because science changed from an amateur avocation to a career that can bring fame and wealth [9]; because scientific activity changed from a cottage industry to a highly bureaucratic corporate industry, with pervasive institutional as well as individual conflicts of interest; and because researchers’ demands for support have come to exceed by far the available supply.

And as science changed, it drew academe along with it. More about that later.

===============================================

[1]    How science changed — III. DNA: disinterest loses, competition wins
[2]    How science has changed — II. Standards of Truth and of Behavior
[3]    The individuals Watson mentioned as getting him access corrected his recollections: they shared with him nothing that was confidential. The significant point remains that Watson had no such scruples.
[4]    See my review, “Betrayers of the truth: a fraudulent and deceitful title from the journalists of science”, 4S Review, 1 (#3, Fall) 17–23.
[5]   There is an Online Ethics Center for Engineering and Science. Physical Centers have been established at: University of California, San Diego (Center for Ethics in Science and Technology); University of Delaware (Center for Science, Ethics and Public Policy); Michigan State University (Center for Ethics and Humanities in the Life Sciences); University of Notre Dame (John J. Reilly Center for Science, Technology, and Values).
[6]    Accountability in Research (founded 1989); Science and Engineering Ethics (1997); Ethics and Information Technology (1999); BMC Medical Ethics (2000); Ethics in Science and Environmental Politics (2001).
[7]    John P. A. Ioannidis, “Why Most Published Research Findings Are False”, PLoS Medicine, 2 (2005): e124. 
[8]    “Challenges in irreproducible research”
[9]    How science has changed: Who are the scientists?



How science changed — III. DNA: disinterest loses, competition wins

Posted by Henry Bauer on 2018/04/10

The Second World War marked a shift of economic and political power from Europe to the United States, with associated changes in the manner and style with which that power is deployed. Science began to change at about the same time, in somewhat analogous and perhaps associated ways.

The change in the norms of science, from CUDOS to PLACE, that Ziman described (How science has changed — II. Standards of Truth and of Behavior) began in the middle of the 20th century. The first of the Mertonian norms to fade away was disinterestedness: science came to be like other spheres of human activity in that some people chose to pursue it as an avenue for satisfying personal ambition rather than as an opportunity to serve the public good.

My cohort of science students in Australia in the early 1950s was notably idealistic about science. We could imagine no finer future than the opportunity to earn a living doing science. The relative absence of excessive personal ambition may have stemmed in large part from the fact that Australia was at that time a profoundly egalitarian society; no one was supposed to imagine himself “better” than anyone else [1].

Our ideals about science included taking honesty for granted, as Merton had.

Our ranking of desirable occupations had doing research in a university setting at the top. Those who were not good enough to do innovative self-directed research would still be able to have a place in science by working in industry. If one were not talented enough even for that, one would have to make do with teaching science. And if one could not even do that, then it would have to be some sort of administrative job. I still recall the minor functionary at the University of Sydney who represented a living lesson for us in the wages of sin: As a graduate student in chemistry, he had faked some of his results, and so he had been condemned to lifelong labor as a paper pusher.

The sea change in science around the middle of the 20th century is illustrated in microcosm by the circumstances of the discovery of the structure of DNA by James Watson and Francis Crick. Watson’s description of that discovery in his memoir, The Double Helix (Atheneum, 1968), and the reactions to that book in the scientific community, illustrate the profound changes in scientific activity beginning to take place around that time. Gunther Stent’s annotated edition of The Double Helix [2] provides a ready source for appreciating how the DNA discovery touches on many of those changes; the edition includes the original text of the book, commentaries, many of the original book reviews, and pertinent articles.

Watson himself, as portrayed in his own memoir, exemplifies the brash, personally ambitious American ignorant of or simply ignoring the traditional ways of doing things, in personal behavior as well as in doing science [3].

In Watson’s memoir, traditional ways including disinterestedness are exemplified by the Europeans Max Perutz and Erwin Chargaff. Perutz had been working diligently for a decade or so, gradually refining what could be learned about the structure of proteins through the technique of X-ray crystallography. With similar diligence Erwin Chargaff had been analyzing the chemical constitutions of DNA from a variety of different sources. Both those research approaches comported with traditional experience that carefully accumulating sufficient pertinent information would eventually be rewarded by important new understanding. In Britain, since Maurice Wilkins and Rosalind Franklin were working on DNA structure via X-ray crystallography, no other British lab would trespass onto that research project.

Watson of course had no such scruples, nor was he prepared to wait for the traditional ways to pay off; Watson’s own words make it appear that his prime motivation was to make a name for himself — any advance in human understanding, for the public good, would be a byproduct.

To short-circuit the old-fashioned laborious approaches, he and his co-worker Francis Crick looked to the methods pioneered by another American, Linus Pauling, who is still often regarded as the outstanding chemist of the 20th century. Pauling also used X-ray crystallography, but only as a secondary adjunct. He had laid the foundations for an understanding of chemical bonding and had been interested from the beginning in the three-dimensional structures of molecules; applying his insights to macromolecules, he succeeded in elucidating the configuration of protein molecules in part by constructing feasible molecular models.

Traditional cosmopolitan European culture could be disdainful and snobbish toward the parvenu, nouveau-riche American ways that were taking over the world, including the world of science. Erwin Chargaff provides an apposite, rather sad illustration. Not only did he dislike Watson’s personality and actions; he also led himself to believe that his own diligent, traditional work on the chemical composition of DNA should have been rewarded with a share of the Nobel Prize. Chargaff’s review [4] of The Double Helix flaunts his cultured erudition and also reveals his personal disappointment; later he refused Gunther Stent permission to reprint the review, in company with all the others, in Stent’s annotated edition.

The technical point at issue is that Chargaff had been content to allow results to accumulate until insight revealed itself rather than to take a gamble on some premature interpretation: he had merely remarked on an apparently consistent ratio of purines to pyrimidines in the DNA from a variety of sources [5]: “It is . . . noteworthy — whether this is more than accidental cannot yet be said — that in all deoxypentose nucleic acids examined thus far the molar ratios of total purines to total pyrimidines, and also of adenine to thymine and of guanine to cytosine, were not far from 1”.

The important insight, however, is that the numbers are exactly equal: adenine faces thymine, and guanine faces cytosine, in the molecular structure of DNA, and that is the central and crucial feature of the double helix. In hindsight, Chargaff wanted his tentative statement of approximate equality to be construed as “the discovery of the base-pairing regularities” [4].
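The contrast can be put in symbols (a schematic gloss, not Chargaff’s or Watson’s own notation):

    Chargaff (1950), empirical:        A/T ≈ 1  and  G/C ≈ 1      (“not far from 1”)
    Watson–Crick (1953), structural:   A = T  and  G = C  exactly, in any double-stranded DNA

The equalities are exact because in the double helix every adenine is hydrogen-bonded to a thymine and every guanine to a cytosine across the two strands. Approximate ratios invited Chargaff’s cautious reading; exact equality demands, and received, a structural explanation.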

Erwin Chargaff may have been acerbic and ungenerous in his book review, but he will also have spoken for generations of scientists in his regret for the passing of the more idealistic, disinterested, traditional order and distaste for what was replacing it: “in our time a successful cancer researcher is not one who ‘solves the riddle,’ but rather one who gets a lot of money to do so” [6]; “Watson’s book may contribute to the much-needed demythologization of modern science”; “with very few exceptions, it is not the men that make science; it is science that makes the men” [4].

That disappearing idealistic traditional order might be exemplified in Sinclair Lewis’s Arrowsmith. First published in 1925 by Harcourt, Brace, it has appeared (according to amazon.com) in more than 80 later editions, including a 2008 paperback. Evidently the yearning for disinterested science for the public good remains strong. The book’s protagonist, after some early mis-steps and yieldings to commercial temptation, opts for pure research for the good of humankind. Even a couple of decades ago, an academic of my generation (a biochemist) told me that he still gave his graduate students Arrowsmith to read as a guide to the proper ethos of science.

That occasion for being reminded of Arrowsmith was a series of seminars I was then holding on our campus about ethics in research [7], a topic that was just becoming prominent as instances of dishonesty in scientific work were beginning to be noted with increasing frequency.

More about that in a future blog post.

========================================

[1]    A widely shared view was that “tall poppies” should be decapitated. A highly educated Labor Party leader was careful to adopt a working-class accent in public to hide his normal “educated”, British-BBC-type dialect. I personally saw fisticuffs occasioned by one party’s feeling that the other had thought themselves better in some way.
[2]    Gunther S. Stent (ed.), The Double Helix — Text, Commentary, Reviews, Original Papers, W. W. Norton, 1980
[3]    I had begun to sense the new self-serving ethos in science in the late 1960s, after a career move from Australia to the USA. I encountered ambitious young go-getters who luxuriated in the [then!] largesse of research support, inserting personal pleasures into publicly funded research travel, for example studying aspects of marine environments in ways that made possible scuba-diving and general cavorting in the Caribbean. I participated in the WETS, one of the informal associations of young up-and-comers who used to sample fleshly diversions as part of research-grant-paid trips to professional conferences.
[4]    Erwin Chargaff, “A quick climb up Mount Olympus”, Science, 159 (1968) 1448–9
[5]    Erwin Chargaff, “Chemical specificity of nucleic acids and mechanism of their enzymatic degradation”, Experientia, 6 (1950) 201–40
[6]    Erwin Chargaff, Voices in the Labyrinth, Seabury, 1977, p. 89
[7]    For instance, “Ethics in Science” under “Current topics in analytical chemistry: critical analysis of the literature”, 15 & 17 March 1994; reprinted at pp. 169–182 in Against the Tide, ed. Martín López Corredoira & Carlos Castro Perelman, Universal Publishers, 2008.

 


How science has changed — II. Standards of Truth and of Behavior

Posted by Henry Bauer on 2018/04/08

The scientific knowledge inherited from ancient Babylon and Greece and from medieval Islam was gained by individuals or by groups isolated from one another in time as well as geography. Perhaps the most consequential feature of the “modern” science that we date from the 17th-century Scientific Revolution is the global interaction of the people who are doing science, and especially the continuity over time of their collective endeavors.
These interactions among scientists began in quite informal and individual ways. An important step was the formation of academies and societies, among which the Royal Society of London is usually acknowledged to be the earliest (founded 1660) that has remained active up to the present time — though it was not the earliest such institution and even the claim of “longest continually active” has been challenged [1].
Even nowadays, the global community of scientists remains in many ways informal despite the host of scientific organizations and institutions, national and international: the global scientific community is not governed by any formal structure that lays down how science should be done and how scientists should behave.
However, observation of the actualities of scientific activity shows that some agreed-on standards had evolved, generally seen within the community of scientists as defining proper behavior. Around the time of the Second World War, sociologist Robert Merton described those informal standards, and they came to be known as the “Mertonian norms” of science [2]. They comprise:

•    Communality or communalism (Merton had said “communism”): Science is an activity of the whole scientific community, and it is a public good — findings are shared freely and openly.
•    Universalism: Knowledge about the natural world is universally valid and applicable. There are no separations or distinctions by nationality, religion, race, or anything of that sort.
•    Disinterestedness: Science is done for the public good and not for personal benefit; scientists seek to be impartial, objective, unbiased, and not self-serving.
•    Skepticism: Claims and reported findings are subject to critical appraisal and testing throughout the scientific community before they can be accepted as proper scientific knowledge.

Note that honesty is not mentioned; it was simply taken for granted.
These norms clearly make sense for a cottage industry, as ideals that individuals should aim for; but they are not appropriate for a corporate environment: they cannot guide the behavior of individuals who are part of some hierarchical enterprise.
In the mid-1990s, John Ziman [3] discussed the change in scientific activity as it morphed from the activities of an informal, voluntary collection of individuals seeking to understand how the world works into a highly organized activity with assigned levels of responsibility and authority, one in which the sources of research funding have a say in what gets done and often expect something useful, something profitable, in return for their investments.
The early cottage industry of science had been essentially self-supporting. Much could be done without expensive equipment. People studied what was conveniently at hand, so there was little need for funds to support travel. Interested patrons and local benefactors could provide the small resources needed for occasional meetings and the publication of findings.
Up to about the middle of the 20th century, universities were able to provide the funds needed for basic research in chemistry and biology and physics. The first sign that exceptional resources could be needed had come around 1930, when Lawrence constructed the first large “atom-smashing machines” (cyclotrons); but those, and the need for expensive astronomical telescopes, remained outliers in the requirements for the support of scientific research overall.
From about the time of the Second World War, however, research going beyond what had already been accomplished began to require ever more expensive and specialized equipment as well as considerable infrastructure: technicians to support the equipment, glass-blowers and secretaries and book-keepers and librarians, and managers of such ancillary staff; so researchers increasingly came to need support beyond that available from individual patrons or universities. Academic research came to rely increasingly on getting grants for specific research projects from public agencies or from wealthy private foundations.
Although those sources of research funds typically claim that they want to support simply “the best science”, their view of what the best science is does not necessarily jibe with the judgments of the individual researchers [4].
At the same time as research in universities was calling on outside sources of funding, an increasing number of industries were setting up their own laboratories for research specifically toward creating and improving their products and services. Such product-specific “R&D” (research and development) sometimes turned up novel basic knowledge, or revealed the need for such fundamentally new understanding. One consequence has been that some really striking scientific advances have come from such famous industrial laboratories as Bell Telephone Laboratories or the Research Laboratory of General Electric. Researchers employed in industry have received a considerable number of Nobel Prizes, often jointly with academics [5].
Under these new circumstances, as Ziman [3] pointed out, the traditional distinction between “applied” research and “pure” or “basic” research lost its meaning.
Ziman rephrased the Mertonian norms as the nice acronym CUDOS, adding the “O” for originality, quite appropriately, since within the scientific community credit was and is given for the most innovative, original contributions; CUDOS, or preferably “kudos”, being the Greek term for acclaim of exceptional accomplishment. By contrast, for the norms that obtain in a corporate scientific enterprise, be it governmental or private, Ziman proposed the acronym PLACE: researchers nowadays get their rewards not by adhering to the Mertonian norms but by producing Proprietary findings whose significance may be purely Local rather than universal, the subject of research having been chosen under the Authority of an employer or patron rather than by the individual researcher, who is Commissioned to do the work as an Expert employee.

Ziman too did not mention honesty; like Merton he simply took it for granted.
Ziman had made an outstanding career in solid-state physics before, in his middle years, he began to publish, starting in 1968 [6], highly insightful works about how science functions, in particular about what makes it reliable. In the late 1960s it had still been reasonable to take honesty in science for granted; but by the time Ziman published Prometheus Bound it no longer could be, and Ziman had failed to notice some of what was happening in scientific activity. Competition for resources and for career advancement had increased to a quite disturbing extent, presumably the impetus for the increasing frequency with which scientists were found to have cheated in some way. Even published, supposedly peer-reviewed research often failed later attempts at confirmation, and all too often it was revealed as simply false, faked [7].
More about that in a following blog post.

==========================================

[1]    “The Royal Societies [sic] claim to be the oldest is based on the fact that they developed out of a group that started meeting in Gresham College in 1645 but unlike the Leopoldina this group was informal and even ceased to meet for two years between 1658 and 1660” — according to The Renaissance Mathematicus, “It wasn’t the first but…”
[2]    Robert K. Merton, “The normative structure of science” (1942); most readily accessible as pp. 267–78 in The Sociology of Science (ed. N. Storer, University of Chicago Press, 1973), a collection of Merton’s work
[3]    John Ziman, Prometheus Bound: Science in a Dynamic Steady State, Cambridge University Press, 1994
[4]    Richard Muller, awarded a prize by the National Science Foundation, pointed out that truly innovative studies are unlikely to be funded and need to be carried out more or less surreptitiously; and Charles Townes, who developed masers and lasers, testified to his difficulty in getting research support for that ground-breaking work, or even encouragement from some of his distinguished older colleagues —
Richard A. Muller, “Innovation and scientific funding”, Science, 209 (1980) 880–3
Charles Townes, How the Laser Happened: Adventures of a Scientist, Oxford University Press, 1999
[5]    Karina Cummings, “Nobel Science Prizes in industry”;
Nobel Laureates and Research Affiliations
[6]    John Ziman, Public Knowledge (1968); followed by The Force of Knowledge (1976); Reliable Knowledge (1978); An Introduction to Science Studies (1984); Prometheus Bound (1994); Real Science (2000); all published by Cambridge University Press
[7]    John P. A. Ioannidis, “Why most published research findings are false”, PLoS Medicine, 2 (2005): e124;
Daniele Fanelli, “How many scientists fabricate and falsify research? A systematic review and meta-analysis of survey data”, PLoS ONE, 4 (#5, 2009): e5738


How science has changed: Who are the scientists?

Posted by Henry Bauer on 2018/04/07

Scientists are people who do science. Nowadays scientists are people who work at science as a full-time occupation and who earn their living at it.
Science means studying and learning about the natural world, and human beings have been doing that since time immemorial; indeed, in a sense all animals do that, but humans have developed efficient means to transmit gained knowledge to later generations.
At any rate, there was science long before [1] there were scientists, full-time professional students of Nature. Our present-day store of scientific knowledge includes things that have been known for at least thousands of years. For example, from more than 6,000 years ago in Mesopotamia (Babylon, Sumer) we still use base-60 mathematics for the number of degrees in the arcs of a circle (360) and the number of seconds in a minute and the number of minutes in an hour. We still cry “Eureka” (found!!) for a new discovery, as supposedly Archimedes did more than 2000 years ago when he recognized that immersing an object in water is an easy way to measure its volume (by the increase in height of the water) and that a floating object displaces water whose weight equals the object’s own weight. The Islamic science of the Middle Ages has left its mark in language with, for instance, algebra or alchemy.
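In modern notation (a schematic gloss, of course; Archimedes had no such algebra), those two recognitions are:

    V_object = A_vessel × Δh                      (immersion: volume measured by the rise Δh of the water level, in a vessel of uniform cross-section A_vessel)
    W_object = ρ_water × g × V_displaced          (flotation: a floating body sinks until it has displaced its own weight of water)

The second line is Archimedes’ principle in its flotation form; the first is the displacement trick of the “Eureka” legend.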
Despite those early pieces of science that are still with us today, most of what the conventional wisdom thinks it knows about science is based on what historians call “modern” science, which is generally agreed to have emerged around the 17th century in what is usually called The Scientific Revolution.
The most widely known bits of science are surely the most significant advances. Those are typically associated with the names of people who either originated them or made them popular [2]; so many school-children hear about Archimedes and perhaps Euclid and Ptolemy; and for modern science, even non-science college students are likely to hear of Galileo and Newton and Darwin and Einstein. Chemistry students will certainly hear about Lavoisier and Priestley and Wöhler and Haber; and so on, just as most of us have learned about general history in terms of the names of important individuals. So far as science is concerned, most people are likely to gain the general impression that it has been done and is being done by a relatively small number of outstanding individuals, geniuses in fact. That impression could only be entrenched by the common thought-bite that “science” overthrew “religion” sometime in the 19th century, leading to the contemporary role of science as society’s ultimate arbiter of true knowledge.
The way in which scientists in modern times have been featured in books and in films also gives the impression that scientists are somehow special, that they are by no means ordinary people. Roslynn Haynes [3] identified several stereotypes of scientists, for example “adventurer” or “the noble scientist as hero or savior of society”, with most stereotypes however being less than favorable — “mad, bad, dangerous scientist, unscrupulous in the exercise of power”. But no matter whether good or bad in terms of morals or ethics, society’s stereotype of “scientist” is “far from an ordinary person”.
That is accurate enough for the founders of modern science, but it became progressively less true as more and more people came to take part in some sort of scientific activity. Real change began in the early decades of the 19th century, when the term “scientist” seems to have been used for the first time [4].
By the end of the 19th century it had become possible to earn a living through being a scientist, through teaching or through doing research that led to commercially useful results (as in the dye-stuff industry) or through doing both in what nowadays are called research universities. By the early 20th century, scientists no longer deserved to be seen as outstanding individual geniuses, but they were still a comparatively elite group of people with quite special talents and interests. Nowadays, however, there is nothing distinctly elite about being a scientist. In terms of numbers (in the USA), scientists at roughly 2.7 million are comparable to engineers at 2.1 million (in ~2001), less elite than lawyers (~1 million) or doctors (~800,000); and teachers, at ~3.5 million, are almost as elite as scientists.
Nevertheless, so far as the general public and the conventional wisdom are concerned, there is still an aura of being special and distinctly elite associated with science and being a scientist, no doubt because science is so widely acknowledged as the ultimate authority on what is true about the workings of the natural world; and because “scientist” brings to most minds someone like Darwin or Einstein or Galileo or Newton.
So the popular image of scientists is wildly wrong about today’s world. Scientists today are unexceptional white-collar workers. Certainly a few of them could still be properly described as geniuses, just as a few engineers or doctors could be — or those at the high tail-end of any distribution of human talent; but by and large, there is nothing exceptional about scientists nowadays. That is an enormous change from times past, and the conventional wisdom has not begun to be aware of that change.
One aspect of that change is that the first scientists were amateurs seeking to satisfy their curiosity about how the world works, whereas nowadays scientists are technicians or technical experts who do what they are told to do by employers or enabled to do by patrons. A very consequential corollary is that the early scientists had nothing to gain by being untruthful, whereas nowadays the rewards potentially available to prominent scientists have tempted a significant number to practice varying degrees of dishonesty.
Another way of viewing the change that science and scientists have undergone is that science used to be a cottage industry largely self-supported by independent entrepreneurial workers, whereas nowadays science is a corporate behemoth whose workers are apparatchiks, cogs in bureaucratic machinery; and in that environment, individual scientists are subject to conflicts of interest and a variety of pressures owing to their membership in a variety of groups.

Science today is not a straightforward seeking of truth about how the world works; and claims emerging from the scientific community are not necessarily made honestly; and even when made honestly, they are not necessarily true. More about those things in future posts.

=======================================

[1]    For intriguing tidbits about pre-scientific developments, see “Timeline Outline View”
[2]    In reality, most discoveries hinge on quite a lot of work and learning that prefigured them and made them possible, as discussed for instance by Tony Rothman in Everything’s Relative: And Other Fables from Science and Technology (Wiley, 2003). That what matters most is not the act of discovery but the making widely known is the insight embodied in Stigler’s Law, that discoveries are typically named after the last person who discovered them, not the first (S. M. Stigler, “Stigler’s Law of Eponymy”, Transactions of the N.Y. Academy of Science, II: 39 [1980] 147–58)
[3]    Roslynn D. Haynes, From Faust to Strangelove: Representations of the Scientist in Western Literature, Johns Hopkins University Press, 1994; also “Literature Has shaped the public perception of science”, The Scientist, 12 June 1989, pp. 9, 11
[4]    William Whewell is usually credited with coining the term “scientist” in the early 1830s


Dangerous knowledge IV: The vicious cycle of wrong knowledge

Posted by Henry Bauer on 2018/02/03

Peter Duesberg, universally admired scientist, cancer researcher, and leading virologist, member of the National Academy of Sciences, recipient of a seven-year Outstanding Investigator Grant from the National Institutes of Health, was astounded when the world turned against him because he pointed to the clear fact that HIV had never been proven to cause AIDS and to the strong evidence that, indeed, no retrovirus could behave in the postulated manner.

Frederick Seitz, at one time President of the National Academy of Sciences and for some time President of Rockefeller University, became similarly non grata for pointing out that parts of an official report contradicted one another about whether human activities had been proven to be the prime cause of global warming (“A major deception on global warming”, Wall Street Journal, 12 June 1996).

A group of eminent astronomers and astrophysicists (among them Halton Arp, Hermann Bondi, Amitabha Ghosh, Thomas Gold, Jayant Narlikar) had their letter pointing to flaws in Big-Bang theory rejected by Nature.

These distinguished scientists illustrate (among many other instances involving less prominent scientists) that the scientific establishment routinely refuses to acknowledge evidence that contradicts contemporary theory, even evidence proffered by previously lauded fellow members of the elite establishment.

Society’s dangerous wrong knowledge about science includes the mistaken belief that science hews earnestly to evidence and that peer review — the behavior of scientists — includes considering new evidence as it comes in.

Not so. Refusal to consider disconfirming facts has been documented on a host of topics less prominent than AIDS or global warming: prescription drugs, Alzheimer’s disease, extinction of the dinosaurs, mechanism of smell, human settlement of the Americas, the provenance of Earth’s oil deposits, the nature of ball lightning, the evidence for cold nuclear fusion, the dangers from second-hand tobacco smoke, continental-drift theory, risks from adjuvants and preservatives in vaccines, and many more topics; see for instance Dogmatism in Science and Medicine: How Dominant Theories Monopolize Research and Stifle the Search for Truth (Jefferson, NC: McFarland, 2012). And of course society’s officialdom, the conventional wisdom, and the mass media all take their cue from the scientific establishment.

The virtually universal dismissal of contradictory evidence stems from the nature of contemporary science and its role in society as the supreme arbiter of knowledge, and from the fact of widespread ignorance about the history of science, as discussed in earlier posts in this series (Dangerous knowledge; Dangerous knowledge II: Wrong knowledge about the history of science; Dangerous knowledge III: Wrong knowledge about science).

The upshot is a vicious cycle. Ignorance of history makes it seem incredible that “science” would ignore evidence, so claims to that effect on any given topic are brushed aside, because it is not known that science has ignored contrary evidence routinely. But that routine can be recognized only by noting the accumulation of individual topics on which evidence has been ignored; and each of those individual claims is in its turn brushed aside. That is the vicious cycle.

Wrong knowledge about science and the history of science impedes recognizing that evidence is being ignored in any given actual case. Thereby radical progress is nowadays being greatly hindered, and public policies are being misled by flawed interpretations enshrined by the scientific consensus. Society has succumbed to what President Eisenhower warned against (farewell speech, 17 January 1961):

in holding scientific research and discovery in respect, as we should,
we must also be alert to the equal and opposite danger
that public policy could itself become the captive
of a scientific-technological elite.

The vigorous defense of established theories and the refusal to consider contradictory evidence mean that once theories have been widely enough accepted, they soon become knowledge monopolies, and the machinery of research support turns the contemporary theory into a research cartel (“Science in the 21st Century: Knowledge Monopolies and Research Cartels”).

The presently dysfunctional circumstances have been recognized only by two quite small groups of people:

  1. Observers and critics (historians, philosophers, sociologists of science, scholars of Science & Technology Studies)
  2. Researchers whose own experiences and interests happened to cause them to come across facts that disprove generally accepted ideas — for example Duesberg, Seitz, the astronomers cited above, etc. But these researchers only recognize the unwarranted dismissal of evidence in their own specialty, not that it is a general phenomenon (see my talk, “HIV/AIDS blunder is far from unique in the annals of science and medicine” at the 2009 Oakland Conference of Rethinking AIDS; mov file can be downloaded at http://ra2009.org/program.html, but streaming from there does not work).

Such dissenting researchers find themselves progressively excluded from mainstream discourse, and that exclusion makes it increasingly unlikely that their arguments and documentation will gain attention. Moreover, frustrated by a lack of attention from mainstream entities, dissenters from a scientific consensus find themselves listened to and appreciated increasingly only by people outside the mainstream scientific community to whom the conventional wisdom also pays no attention, for instance parapsychologists, ufologists, cryptozoologists. Such associations, and the conventional wisdom’s consequent assigning of guilt by association, then further entrench the vicious cycle of dangerous knowledge that rests on the acceptance of contemporary scientific consensuses as not to be questioned — see chapter 2 in Dogmatism in Science and Medicine: How Dominant Theories Monopolize Research and Stifle the Search for Truth and “Good Company and Bad Company”, pp. 118–9 in Science Is Not What You Think: How It Has Changed, Why We Can’t Trust It, How It Can Be Fixed (McFarland, 2017).


Dangerous knowledge III: Wrong knowledge about science

Posted by Henry Bauer on 2018/01/29

In the first post of this series (Dangerous knowledge) I pointed to a number of specific topics on which the contemporary scientific consensus is doubtfully in tune with the actual evidence. That disjunction is ignored or judged unimportant by most researchers and most observers alike; and that, I believe, is because the fallibility of science is not common knowledge, which in turn stems from ignorance and wrong knowledge about the history of science and, more or less as a consequence, about science itself.

The conventional wisdom regards science as a thing characterized by the scientific method. An earlier post (Dangerous knowledge II: Wrong knowledge about the history of science) noted that the scientific method is not a description of how science is done; it was thought up in philosophical speculation about how science could have been so successful, most notably in the couple of centuries following the Scientific Revolution of the 17th century.

Just as damaging as misconceptions about how science is done is the wrong knowledge that science is even a thing that can be described without explicit attention to how scientific activity, and the character of the people doing science, have changed over time, most drastically since the middle of the 20th century. What has happened since then, since World War II, affords the clearest, most direct understanding of why contemporary official pronouncements about matters of science and medicine need to be treated with the same skepticism as official pronouncements about matters of economics, say, or politics. As I wrote earlier (Politics, science, and medicine),

In a seriously oversimplified nutshell:

The circumstances of scientific activity have changed, from about pre-WWII to nowadays, from a cottage industry of voluntarily cooperating, independent, largely disinterested ivory-tower intellectual entrepreneurs in which science was free to do its own thing, namely the unfettered seeking of truth about the natural world, to a bureaucratic corporate-industry-government behemoth in which science has been pervasively co-opted by outside interests and is not free to do its own thing because of the pervasive conflicts of interest. Influences and interests outside science now control the choices of research projects and the decisions of what to publish and what not to make public.

 

For a detailed discussion of these changes in scientific activity, see Chapter 1 of Science Is Not What You Think: How It Has Changed, Why We Can’t Trust It, How It Can Be Fixed (McFarland 2017); less comprehensive descriptions are in Three Stages of Modern Science and The Science Bubble.

Official pronouncements are not made primarily to tell the truth for the public good. Statements from politicians are often motivated by the desire to gain favorable attention, as is widely understood. But less widely understood is that official statements from government agencies are also often motivated by the desire to gain favorable attention, to make the case for the importance of the agency (and its Director and other personnel) and the need for its budget to be considered favorably. Press releases from universities and other research institutions have the same ambition. And anything from commercial enterprises is purely self-interested, of course.

The stark corollary is that no commercial or governmental entity, nor any sizable not-for-profit entity, is devoted primarily to the public good and the objective truth. Organizations with the most laudable aims, Public Citizen, say, or the American Heart Association, etc. etc. etc., are admittedly devoted to doing good things, to serving the public good, but it is according to their own particular definition of the public good, which may not be at all the same as others’ beliefs about what is best for the public, for society as a whole.

Altogether, a useful generalization is that all corporate entities, private or governmental, commercial or non-profit, have a vested self-interest in the status quo, since that represents the circumstances of their raison d’être, their prestige, their support from particular groups in society or from society as a whole.

The hidden rub is that a vested interest in the status quo means defending things as they are, even when objective observers might note that those things need to be modified, superseded, abandoned. Examples from the past are legion and well known: in politics, say, the American involvement in Vietnam and innumerable analogous matters. But not so well known is that unwarranted defense of the status quo is also quite common on medical and scientific issues. The resistance to progress, the failure to correct mis-steps in science and medicine in any timely way, has been the subject of many books and innumerable articles; for selected bibliographies, see Critiques of Contemporary Science and Academe and What’s Wrong with Present-Day Medicine. Note that all these critiques have been effectively ignored to the present day; the flaws and dysfunctions remain as described.

Researchers who find evidence that contradicts the status quo, the established theories, learn the hard way that such facts don’t count. As noted in my above-mentioned book, science has a love-hate relationship with the facts: they are welcomed before a theory has been established, but after that only if they corroborate the theory; contradictory facts are anathema. Yet researchers never learn that unless they themselves uncover such unwanted evidence; scientists and engineers and doctors are trained to believe that their ventures are essentially evidence-based.

Contributing to the resistance against rethinking established theory is today’s hothouse, overly competitive, rat-race research climate. It is no great exaggeration to say that researchers are so busy applying for grants and contracts and publishing that they have no time to think new thoughts.


Dangerous knowledge II: Wrong knowledge about the history of science

Posted by Henry Bauer on 2018/01/27

Knowledge of history among most people is rarely more than superficial; the history of science is much less known even than is general (political, social) history. Consequently, what many people believe they know about science is typically wrong and dangerously misleading.

General knowledge about history, the conventional wisdom about historical matters, depends on what society as a whole has gleaned from historians, the people who have devoted enormous time and effort to assemble and assess the available evidence about what happened in the past.

Society on the whole does not learn about history from the specialists, the primary research historians. Rather, teachers of general national and world histories in schools and colleges have assembled some sort of whole story from all the specialist bits, perforce taking on trust what the specialist cadres have concluded. The interpretations and conclusions of the primary specialists are filtered and modified by second-level scholars and teachers. So what society as a whole learns about history as a whole is a sort of third-hand impression of what the specialists have concluded.

History is a hugely demanding pursuit. Its mission is so vast that historians have increasingly had to specialize. There are specialist historians of economics, of mathematics, and of other aspects of human cultures; and there are historians who specialize in particular eras in particular places, say Victorian Britain. Written material still extant is an important resource, of course, but it cannot be taken literally; it has to be evaluated for the author’s identity and for clues as to bias and ignorance. Artefacts provide clues, and various techniques from chemistry and physics help to discover dates or to test putative dates. What further makes doing history so demanding is the need to capture the spirit of a different time and place, an holistic sense of it; on top of which the historian needs a deep, authentic understanding of the particular aspect of society under scrutiny. So doing economic history, for example, calls not only for a good sense of general political history; it requires also a good understanding of the whole subject of economics itself in its various stages of development.

The history of science is a sorely neglected specialty within history. There are History Departments in colleges and universities without a specialist in the history of science — which entails also that many of the people who — at both school and college levels — teach general history or political or social or economic history, or the history of particular eras or places, have never themselves learned much about the history of science, not even as to how it impinges on their own specialty. One reason for the incongruous place — or lack of a place — for the history of science with respect to the discipline of history as a whole is the need for historians to command an authentic understanding of the particular aspect of history that is their special concern. Few if any people whose career ambition was to become historians have the needed familiarity with any science; so a considerable proportion of historians of science are people whose careers began in a science and who later turned to history.

Most of the academic research in the history of science has been carried on in separate Departments of History of Science, or Departments of History and Philosophy of Science, or Departments of History and Sociology of Science, or in the relatively new (founded within the last half a century) Departments of Science & Technology Studies (STS).

Before there were historian specialists in the history of science, some historical aspects were typically mentioned within courses in the sciences. Physicists might hear bits about Galileo, Newton, Einstein. Chemists would be introduced to thought-bites about alchemy, Priestley and oxygen, Haber and nitrogen fixation, atomic theory and the Greeks. Such anecdotes were what filtered into general knowledge about the history of science; and the resulting impressions are grossly misleading. Within science courses, the chief interest is in the contemporary state of known facts and established theories, and historical aspects are mentioned only in so far as they illustrate progress toward ever better understanding, yielding an overall sense that science has been unswervingly progressive and increasingly trustworthy. In other words, science courses judge the past in terms of what the present knows, an approach that the discipline of history recognizes as unwarranted, since the purpose of history is to understand earlier periods fully, to know about the people and events in their own terms, under their own values.

*                   *                   *                  *                    *                   *

How to explain that science, unlike other human ventures, has managed to get better all the time? It must be that there is some “scientific method” that ensures faithful adherence to the realities of Nature. Hence the formulaic “scientific method” taught in schools, and in college courses in the behavioral and social sciences (though not in the natural sciences).

Specialist historians of science, and philosophers and sociologists of science, and scholars of Science & Technology Studies all know that science is not done by any such formulaic scientific method, and that the development of modern science owes as much to the precursors and ground-preparers as to such individual geniuses as Newton, Galileo, etc. — Newton, by the way, being so fully aware of that as to have offered the modest acknowledgment that “If I have seen further it is by standing on the shoulders of giants”, mentioned in my previous post (Dangerous knowledge).

*                     *                   *                   *                   *                   *

Modern science cannot be understood, cannot be appreciated, without an authentic sense of the actual history of science. Unfortunately, for the reasons outlined above, contemporary culture is pervaded partly by ignorance and partly by wrong knowledge of the history of science. In elementary schools and in high schools, and in college textbooks in the social sciences, students are mis-taught that science is characterized, defined, by use of “the scientific method”. That is simply not so: see Chapter 2 in Science Is Not What You Think: How It Has Changed, Why We Can’t Trust It, How It Can Be Fixed (McFarland 2017) and sources cited there. The so-called scientific method is an invention of philosophical speculation by would-be interpreters of the successes of science; working scientists never subscribed to this fallacy, see for instance Reflections of a Physicist (P. W. Bridgman, Philosophical Library, 1955), or the physicist David Goodstein in 1992: “I would strongly recommend this book to anyone who hasn’t yet heard that the scientific method is a myth. Apparently there are still lots of those folks around” (“this book” being my Scientific Literacy and the Myth of the Scientific Method).

The widespread misconception about the scientific method is compounded by the misconception that the progress of science has been owing to individual acts of genius by the people whose names are common currency — Galileo, Newton, Darwin, Einstein, etc. — whereas in reality those unquestionably outstanding individuals were not creating out of the blue but rather placing keystones, putting final touches, synthesizing; see for instance Tony Rothman’s Everything’s Relative: And Other Fables from Science and Technology (Wiley, 2003). The same insight is expressed in Stigler’s Law, that discoveries are typically named after the last person who discovered them, not the first (S. M. Stigler, “Stigler’s Law of Eponymy”, Transactions of the N.Y. Academy of Science, II, 39 [1980] 147–58).

That misconception about science progressing by lauded leaps of applauded geniuses is highly damaging, since it hides the crucially important lesson that the acts of genius we praise in hindsight were vigorously, often even viciously, resisted by their contemporaries, by the contemporary scientific establishment and scientific consensus; see “Resistance by scientists to scientific discovery” (Bernard Barber, Science, 134 [1961] 596–602); “Prematurity and uniqueness in scientific discovery” (Gunther Stent, Scientific American, December 1972, 84–93); and Prematurity in Scientific Discovery: On Resistance and Neglect (Ernest B. Hook (ed.), University of California Press, 2002).

What is perhaps most needed nowadays, as the authority of science is invoked in so many aspects of everyday affairs and official policies, is clarity that any contemporary scientific consensus is inherently and inevitably fallible; and that the scientific establishment will nevertheless defend it zealously, often unscrupulously, even when it is demonstrably wrong.


Recommended reading: The historiography of the history of science, its relation to general history, and related issues, as well as synopses of such special topics as evolution or relativity, are treated authoritatively in Companion to the History of Modern Science (eds.: Cantor, Christie, Hodge, Olby; Routledge, 1996) [not to be confused with the encyclopedia titled The Oxford Companion to the History of Modern Science, ed. Heilbron, Oxford University Press, 2003].


Dangerous knowledge

Posted by Henry Bauer on 2018/01/24

It ain’t what you don’t know that gets you into trouble.
It’s what you know for sure that just ain’t so.

That’s very true.

In a mild way, the quote also illustrates itself since it is so often attributed wrongly; perhaps most often to Mark Twain but also to other humorists — Will Rogers, Artemus Ward, Kin Hubbard — as well as to inventor Charles Kettering, pianist Eubie Blake, baseball player Yogi Berra, and more (“Bloopers: Quote didn’t really originate with Will Rogers”).

Such mis-attributions of insightful sayings are perhaps the rule rather than the exception; sociologist Robert Merton even wrote a whole book (On the Shoulders of Giants, Free Press, 1965, and several later editions) about mis-attributions, over many centuries, of the modest acknowledgment that “If I have seen further it is by standing on the shoulders of giants”.

No great harm comes from mis-attributing words of wisdom. Great harm is being done nowadays, however, by accepting as sound much widely believed, supposedly scientific medical knowledge; for example about hypertension, cholesterol, prescription drugs, and more (see the works listed in What’s Wrong with Present-Day Medicine).

The trouble is that “science” was so spectacularly successful in elucidating so much about the natural world and contributing to so many useful technologies that it has come to be regarded as virtually infallible.

Historians and other specialist observers of scientific activity — philosophers, sociologists, political scientists, various others — of course know that science, no less than all other human activities, is inherently and unavoidably fallible.

Until the middle of the 20th century, science was pretty much an academic vocation not venturing very much outside the ivory towers. Consequently and fortunately, the innumerable things on which science went wrong in past decades and centuries did no significant damage to society as a whole; the errors mattered only within science and were corrected as time went by. Nowadays, however, science has come to pervade much of everyday life through its influences on industry, medicine, and official policies on much of what governments are concerned with: agriculture, public health, environmental matters, technologies of transport and of warfare, and so on. Official regulations deal with what is permitted to be in water and in the air and in innumerable man-made products; propellants in spray cans and refrigerants in cooling machinery have been banned, globally, because science (primarily chemists) persuaded the world that those substances were reaching the upper atmosphere and destroying the natural “layer” of ozone that absorbs some of the ultraviolet radiation from the sun, thereby protecting us from damage to eyes and skin. For the last three decades, science (primarily physicists) has convinced the world that human generation of carbon dioxide is warming the planet and causing irreversible climate change.

So when science goes wrong nowadays, that can do untold harm to national economies, and to whole populations of people if the matter has to do with health.

Yet science remains as fallible as it ever was, because it continues to be done by human beings. The popular illusion that science is objective and safeguarded from error by the scientific method is simply that, an illusion: the scientific method describes how science perhaps ought to be done, but how it is actually done depends on the human beings doing it, none of whom is immune from mistakes.

When I wrote that “science persuaded the world” or “convinced the world”, of course it was not science that did that, because science cannot speak for itself. Rather, the apparent “scientific consensus” at any given time is generally taken a priori as “what science says”. But it is rare that any scientific consensus represents what all pertinent experts think; and consensus is appealed to only when there is controversy, as Michael Crichton pointed out so cogently: “the claim of consensus has been the first refuge of scoundrels[,] … invoked only in situations where the science is not solid enough. Nobody says the consensus of scientists agrees that E=mc2. Nobody says the consensus is that the sun is 93 million miles away. It would never occur to anyone to speak that way”.

Yet the scientific consensus represents contemporary views incorporated in textbooks and disseminated by science writers and the mass media. Attempting to argue publicly against it on any particular topic encounters the pervasive acceptance of the scientific consensus as reliably trustworthy. What reason could there be to question “what science says”? There seems no incentive for anyone to undertake the formidable task of seeking out and evaluating the actual evidence for oneself.

Here is where real damage follows from what everyone knows that just happens not to be so. It is not so that a scientific consensus is the same as “what science says”, in other words what the available evidence is, let alone what it implies. On any number of issues, there are scientific experts who recognize flaws in the consensus and dissent from it. That dissent is not usually mentioned by the popular media, however; and if it should be mentioned then it is typically described as misguided, mistaken, “denialism”.

Examples are legion. Strong evidence and expert voices dissent from the scientific consensus on many matters that the popular media regard as settled: that the universe began with a Big Bang about 13 billion years ago; that anti-depressant drugs work specifically and selectively against depression; that human beings (the “Clovis” people) first settled the Americas about 13,000 years ago by crossing the Bering Strait; that the dinosaurs were brought to an end by the impact of a giant asteroid; that claims of nuclear fusion at ordinary temperatures (“cold fusion”) have been decisively disproved; that Alzheimer’s disease is caused by the build-up of plaques of amyloid protein; and more. Details are offered in my book, Dogmatism in Science and Medicine: How Dominant Theories Monopolize Research and Stifle the Search for Truth (McFarland, 2012). That book also documents the widespread informed dissent from the views that human-generated carbon dioxide is the prime cause of global warming and climate change, and that HIV is the cause of AIDS (for the latter, see the compendium of evidence and sources at The Case against HIV).

The popular knowledge that just isn’t so is, directly, the belief that it is safe to accept as true, for all practical purposes, whatever the scientific consensus happens to be. That mistaken knowledge can be traced, in turn, to knowledge that isn’t so about the history of science, for that history is a very long story of the scientific consensus being wrong and later modified or replaced, quite often more than once.

Further posts will talk about why the real history of science is so little known.



Politics, science, and medicine

Posted by Henry Bauer on 2017/12/31

I recently posted a blog about President Trump firing members of the Presidential Advisory Council on HIV/AIDS, in which I concluded: “Above all, the sad and bitter fact is that truth-seeking does not have a political constituency, be it about HIV, AIDS, or anything else”.

That sad state of affairs, the fragile foothold that demonstrable truth has in contemporary society, is owing to a number of factors, including that “Science is broken” and the effective hegemony of political correctness (Can truth prevail?).

A consequence is that public policies are misguided about at least two issues of significant social impact: HIV/AIDS (The Case against HIV), and human-caused global warming (A politically liberal global-warming skeptic?).

Science and medicine are characterized nowadays, on quite a number of matters, by dogmatic adherence to views that run counter to the undisputed evidence (Dogmatism in Science and Medicine: How Dominant Theories Monopolize Research and Stifle the Search for Truth, McFarland, 2012). To cite just one absurdity (on a matter that has no significant public impact): in cosmology, the prevailing Big-Bang theory of the universe requires that “dark matter” and “dark energy” make up most of the universe, the “dark” signifying that they have never been directly observed; there are no credible suggestions for how they might be observed directly, and nothing is known about them except that their postulated influences are needed to make Big-Bang theory comport with the facts of the real world. Moreover, a less obviously flawed theory has been available for decades, the “steady-state” theory that envisages continual creation of new matter, observational evidence for which was collected and published by Halton Arp (Quasars, Redshifts and Controversies, Interstellar Media, 1987; Seeing Red: Redshifts, Cosmology and Academic Science, Apeiron, 1998).

Dozens of books have documented what is wrong with contemporary medicine, science, and academe:
Critiques of contemporary science and academe;
What’s wrong with present-day medicine.

The common feature of all these flaws is the failure to respect the purported protocols of “the scientific method”: to test hypotheses against reality, and to keep testing theories against reality as new evidence comes in.

Some political commentators have described our world as “post-truth”, and a variety of social commentators have held forth for decades about a “post-modern” world. But the circumstances are not so much “post-truth” or “post-modern” as pre-Enlightenment.

So far as we know and guess, humans accepted as truth the dogmatic pronouncements of elders, shamans, priests, kings, emperors and the like until, perhaps half a millennium ago, the recourse to observable evidence began to supersede acceptance of top-down dogmatic authority. Luther set in motion the process of taking seriously what the Scriptures actually say instead of accepting interpretations from on high. The religious (Christian only) Reformation was followed by the European Enlightenment; the whittling away of political power from traditional rulers; the French Revolution; the Scientific Revolution. By and large, it became accepted, gradually, that truth is to be found by empirical means, that explanations should deal with the observed natural world, that beliefs should be tested against tangible reality.

Science, in its post-17th-century manifestation as “modern science”, came to be equated with tested truth. Stunning advances in understanding confirmed science’s ability to learn accurately about the workings of nature. Phenomena of physics and of astronomy came to be understood; then chemistry; then sub-atomic structure, relativity, quantum mechanics, biochemistry … how could the power of science be disputed?

So it has been shocking, and is by no means fully digested, that “science” has become untrustworthy, as shown over the last few decades by, for instance, increasing episodes of dishonesty, fraud, and unreproducible claims.

Not yet widely realized is the sea change that has overtaken science since about the middle of the 20th century, the time of World War II. It is not the scientific method that determines science; it is the people who are doing the research, interpreting it, and using it; and the human activity of doing science has changed beyond recognition since the early days of modern science. In a seriously oversimplified nutshell:

The circumstances of scientific activity have changed, from roughly pre-WWII to nowadays, from a cottage industry of voluntarily cooperating, independent, largely disinterested ivory-tower intellectual entrepreneurs, in which science was free to do its own thing, namely the unfettered seeking of truth about the natural world, to a bureaucratic corporate-industry-government behemoth in which science has been pervasively co-opted by outside interests and is no longer free to do its own thing because of pervasive conflicts of interest. Influences and interests outside science now control the choices of research projects and the decisions of what to publish and what not to make public.

What science is purported to say is determined by people; actions based on what science supposedly says are chosen by people; so nowadays it is political and social forces that determine beliefs about what science says. Thus politically left-leaning people and groups admit no doubt that HIV causes AIDS and that human generation of carbon dioxide is the prime forcer of climate change, whereas politically right-leaning people and groups express doubts or refuse flatly to believe those things.

For more detailed discussion of how the circumstances of science have changed, see “Three stages of modern science”; “The science bubble”; and chapter 1 in Science Is Not What You Think: How It Has Changed, Why We Can’t Trust It, How It Can Be Fixed (McFarland 2017).

For how to make science a public good again, to make science truly reflect evidence rather than being determined by political or religious ideology, see chapter 12 in Science Is Not What You Think: How It Has Changed, Why We Can’t Trust It, How It Can Be Fixed (McFarland 2017).


Science is broken: Illustrations from Retraction Watch

Posted by Henry Bauer on 2017/12/21

I commented before about Science is broken: Perverse incentives and the misuse of quantitative metrics have undermined the integrity of scientific research. The magazine The Scientist published on 18 December “Top 10 Retractions of 2017 — Making the list: a journal breaks a retraction record, Nobel laureates Do the Right Thing, and Seinfeld characters write a paper”, compiled by Retraction Watch. It should be widely read and digested for an understanding of the jungle of unreliable stuff nowadays put out under the rubric of “science”.

See also “Has all academic publishing become predatory? Or just useless? Or just vanity publishing?”


