Skepticism about science and medicine

In search of disinterested science


From uncritical about science to skeptical about science: 5

Posted by Henry Bauer on 2021/01/09

Learning from what science ignores — within science as well as outside

The Society for Scientific Exploration (SSE) had been founded at the start of the 1980s by scientists, engineers, and other scholars who believed that there was sufficient substantive evidence, enough sheer facts, to warrant proper scientific investigation of topics ignored by science or dismissed as fictive, existing not in Nature’s reality but only in human imaginations: psychic phenomena; flying saucers or UFOs (Unidentified Flying Objects); cryptozoology — animals unknown to biology, or extinct animals said to be still extant; as well as such heretical views as that the theory of relativity is unsound [1].

Being ignored in the face of apparently good evidence was the shared bond within SSE. Few if any of us shared belief in the reality of all the topics that one or more members favored. I certainly didn’t. In fact, I soon began wondering how it was that so many competent, accomplished, intelligent, highly educated, cosmopolitan people could believe firmly in things that seemed to me highly implausible, at best doubtful.

The next insight followed naturally: My new colleagues surely wondered how I, a successful chemist and cosmopolitan Dean of Arts and Sciences, could firmly believe in the reality of the Loch Ness Monster.

My fascination over that had begun through random chance, a book picked up and riffled through. No doubt something analogous, some unplanned experience, had set my new colleagues off on their particular interests.

There is an important general point to be made here. Scientists characteristically have an intellectual blind spot — certainly I do: imagining that beliefs are created by factual knowledge, remain held because of factual evidence, and can be changed by new facts. That is simply not the case.
Interest or some other stimulus is crucial. Why does one ever seek facts in some specific direction?
Everyone would likely look for reliable knowledge about something pertinent to health, family matters, earning a living; but it can also happen by chance, by happening upon a book picked up at random. So there is no reason why others should find interest where I happen to.

And it is not sufficient that good and respected friends and colleagues urge one to look at the facts. I have maintained only an observer’s interest in most of the matters that absorb others in the Society. Even though I’d quite like to know enough to warrant having an informed opinion, the problem is the sheer amount of time and effort needed to wade through all the claims and counterclaims before reaching a reasonably firm belief or disbelief. Outside chemistry, I’ve looked in enough detail at only three major controversial topics: Loch Ness Monsters, HIV/AIDS, and global warming (or climate-change).

The great variety of specialized interests within the Society for Scientific Exploration was not a disturbing factor. We talked (and wrote and published [2]) about our interests and claimed facts and speculations, and benefited from constructive mutual criticism, sometimes quite incisive. Frustration at the lack of interest from mainstream science was and remains an overwhelmingly strong bond. A corollary is something like shared disdain for the individuals and groups who wage public campaigns about the purported dangers to society of believing in the reality of UFOs, Bigfoot, psychic phenomena and the like [3]. Those activists, who purport to be supporters and defenders of science, typically describe themselves as Skeptics [4], a grossly misleading misnomer since they are dogmatists of the highest order, unwilling to contemplate that official or mainstream science might be wrong in any particular — a stance that ignores the whole history of science.
To my mind, the real danger to society stems from such arrogantly dogmatic groups which insist that everyone share their particular beliefs, as is all too commonly the case with specific religions or, in this case, scientism, the religious faith that science be acknowledged as the sole authoritative source of knowledge and understanding.
These “Skeptics” (Truzzi famously and aptly called them “pseudo-skeptics”) criticize the topics of interest within SSE as pseudo-science, but SSE advocates scientific exploration, seeking the best available facts about Nature and trying to explain and understand them. SSE has quarrels not with “science” but with the too-many career scientists who behave unscientifically in forming opinions without looking at the facts, and then defend those opinions dogmatically.

When I analyzed the Velikovsky Affair [5], what had then most struck me was how incompetently the scientific community had criticized Velikovsky’s pseudo-science, and how little so many scientists seemed to understand what science is really about. Several decades later, having written articles and books about the prevalence of dogmatism in science [6], I can see in retrospect that I had missed the significance of how insufferably dogmatic the criticisms of Velikovsky had been. Yet that dogmatism was far from a minor part of the Affair; it surely played some part in bringing some social scientists and humanists to rally to Velikovsky’s defense.

The Society for Scientific Exploration also led to my learning about the extent of dogmatism within mainstream science. The Society offered a forum not only for topics dismissed as pseudo-science; we also heard at times about the suppression of unorthodox views within mainstream science itself. For example, Thomas Gold was widely acknowledged and applauded for his original insights in astrophysics, but mainstream science wanted nothing to do with his ideas about the origin of what are said to be fossil fuels in the Earth, or about life having originated deep in the Earth rather than in warm ponds on its surface [7]. Gold also favored the steady-state theory of the cosmos rather than the accepted paradigm of the Big Bang. Halton Arp, an observational astronomer, published data that support the steady-state theory, whereupon mainstream science refused to allow him further access to the telescopes he needed [8]. A variety of observations indicate that earthquakes may be predictable by electromagnetic or other signals, but mainstream geology will have none of it [9]. “Cold fusion” remains beyond the pale despite intriguing evidence from competent mainstream researchers [10].

I learned that even distinguished mainstream researchers who take a distinctly different view from the prevailing majority consensus are treated no better than are those of us accused of espousing pseudo-science; in fact, they often have it worse: their unorthodoxies can damage their careers, whereas most members of SSE earn their living by something quite separate from their oddball interests, which are more hobbies, pursued in amateur fashion out of sheer fascination and not as a way to earn a living.

So Loch Ness Monsters led me to SSE and SSE led me to recognize how widespread throughout mainstream science is the passionately dogmatic, even vindictive suppression of minority opinion [6] — quite contrary to the popular view of science, the idealistic view that remains my own vision of how science should be carried on.

It seemed natural, then, in my new academic career in STS, to make my special interest the study of scientific controversies and of what exactly distinguishes genuine, proper science from what is widely denigrated as fringe, alternative, or pseudo-science [1].
My research focus required looking for examples of scientific controversies to study. I don’t recall what first alerted me to dissent from the belief that HIV causes AIDS, or even that there had ever been any controversy about it; but I did come across that dissent in the early 1990s.
That is what eventually taught me that what taken-as-authoritative institutions nowadays proclaim in the name of science should never be automatically trusted; it should be fact-checked. The dogmatism, careerism, and institutional as well as personal conflicts of interest that are now rampant in contemporary science have brought official public policies and actions that are contrary to the facts of reality, have harmed massive numbers of people, and threaten to cause yet further damage.

—————————————————————————————————–

[1]    Science or Pseudoscience: Magnetic Healing, Psychic Phenomena, and Other Heterodoxies, University of Illinois Press 2001
[2]    The Journal of Scientific Exploration began publication in 1987. It is now freely available on-line
[3]    Examples are discussed and critiqued at p. 200 ff. in [1]
[4]    The iconic organization was CSICOP (Committee for Scientific Investigation of Claims of the Paranormal), founded in 1976 by predominantly non-scientists (philosophers, psychologists, writers, amateur investigators) but including a few prominent scientists, for example Carl Sagan; it publishes Skeptical Inquirer and includes, under matters criticized as “paranormal”, claims of the existence of what would be perfectly natural creatures.
[5]    Beyond Velikovsky: The History of a Public Controversy, University of Illinois Press, 1984
[6]    Dogmatism in Science and Medicine: How Dominant Theories Monopolize Research and Stifle the Search for Truth, McFarland, 2012
[7]    Fuel’s Paradise
[8]    Halton Arp, Quasars, Redshifts and Controversies, Interstellar Media, 1987; Seeing Red: Redshifts, Cosmology and Academic Science, Apeiron, 1998
[9]    On earthquake prediction, but more generally about matters that global tectonics (“continental drift”) does not adequately explain, see the NCGT Journal
[10]  The topic is nowadays thought to be not the fusion originally inferred but the general phenomenon of Low Energy Nuclear Reactions (LENR), nuclear transformations at ordinary temperatures

Posted in conflicts of interest, denialism, global warming, medical practices, science is not truth, science policy, scientific culture, scientism, unwarranted dogmatism in science

From uncritical about science to skeptical about science: 3

Posted by Henry Bauer on 2021/01/03

From more than ample funding to stifling competition

In the middle 1960s in the United States, I observed more of the consequences of the enormous infusion of federal resources into scientific activity that I had glimpsed as a postdoctoral researcher in the late 1950s.

Moving from Australia, I was appointed Associate Professor at the University of Kentucky in 1966, just when the university was spreading its wings towards gaining recognition for research excellence. I was expected to help that along, given that I already had several dozen publications to my name.

Kentucky was far from alone in its ambition. The flood of federal money designed to stimulate scientific research and the training of future scientists had brought a major transformation in American academe. Four-year Liberal-Arts Colleges made themselves over into Research Universities; Teachers Colleges morphed into universities. In 1944, there had been 107 doctorate-granting institutions in the U.S.; then 142 by 1950-54, 208 by 1960-64, 307 by 1970-74 [1]. In chemistry, there had been 98 doctoral programs in 1955; by 1967 there were 165, and 192 by 1979 [2].

A presumably unforeseen consequence of pushing production of would-be researchers and wannabe research universities was that by the 1970s, demand for grant funds was exceeding the supply. At Kentucky, about half of the Chemistry Department’s proposals to the National Science Foundation (NSF) had been funded in the mid-to-late 1960s; but by 1978, our success rate had fallen to only 1 grant for every 10 applications. That sort of decline has continued into the 21st century: at the National Institutes of Health (NIH), the main source of funds for biological and medical research, the national success rate for grant applicants fell from 31% in 1997 to 20% by 2014 [3]; and in 2011, the average age at which an individual first obtained a grant from NIH as Principal Investigator was 42 [4].

By the 1970s, there were more PhD mathematicians and physicists graduating than there were academic research jobs available. Some pundits speculated that the 2008 economic crash owed quite a lot to ingenious stock-trading software programs and bad-mortgage-bundling-and-valuing “securities” designed by PhD mathematicians and physicists who were working on Wall Street because they could not find positions in academe.

When I prepared my first research grant application to the National Science Foundation in 1966, the newly appointed Director of the University’s Research Division rewrote my budget without consulting me, making it twice as long in duration and four times as costly in total. When the grant was refused and I asked the NSF manager why, he pointed out that it requested about twice as much as their usual grants. Faced with this news, our Research Director expressed surprise, claiming that one of the purposes of these federal funds was to support universities in general.

Federal grants for scientific research brought with them many perks.

My particular specialty enabled me to observe how grants for analytical chemistry made it possible to enjoy summer-time fishing and scuba diving in the Caribbean, as a necessary part of research that involved analyzing sea-water.

Some groups of grant-getters would meet before or after professional meetings at desirable locations for fun and games. In those socially boisterous 1960s-70s, traveling at will on funds from research grants made it easy, for example, to sample the topless bars in San Francisco, and perhaps a performance of the norm-breaking, counter-cultural musical Hair, on the way to the highly regarded Gordon Research Conferences in Santa Barbara. And why not? What could be wrong with using small amounts of our grant funds for personal recreation, just as people in business or industry might use their travel expenses?

Such a point of view was certainly not hindered by the fact that grants for scientific research routinely brought, for academics, an additional 25-33% of personal salary. Almost all academics are routinely paid on a so-called “9-month basis”, with no teaching or other responsibilities during the three months of summer. Since scientific research is carried on year-round, including during the summer, it seemed quite appropriate that researchers receive a salary for that time as part of their research grants.

That practice no doubt had an undesirable side-effect, arousing or enhancing jealousy among non-science academics, and perhaps increasing the determination, among social scientists in particular, to be treated like the physicists and chemists and biologists: after all, psychologists and sociologists are scientists too, are they not? Lobbying eventually — in 1957, seven years after NSF had been established — led to the Social Science Research Program at NSF, for support of anthropology, economics, sociology, and history and philosophy of science.

Federal grants for scientific research brought with them many benefits for institutions as well as for the researchers: universities siphoned off from grants the so-called “indirect costs”, the self-justifying, much-preferred term for “overhead”. Increased scientific research placed greater obligations on the university’s libraries and physical facilities and administrative tasks, so it seemed quite proper to add to the costs of actual research, and the researcher’s summer salary, and the wages and tuition fees of graduate students and postdoctoral researchers, a certain percentage that the University Administration could use to defray those added burdens. That certain percentage can be as high as 50%, or even more in the case of private, non-state-funded, universities [5].

The more the money flowed, the more necessary it became for researchers to obtain grant funds. Costs increased all the time. Scientific journals had traditionally been published by scientific societies, underwritten by membership fees and edited by society members, often without remuneration. As printing and postage costs increased, journals began to levy so-called “page charges” that soon increased to many tens of dollars per published page, particularly as an increasing number of scientific periodicals were taken over or newly founded by commercial publishers, who naturally paid professional staff including editors. Page charges were of course legitimate charges on grant funds. Academics without access to grant funds could still be published in society journals, but their second-class status was displayed for all to see as their publications carried the header or footer, “Publication costs borne by the Society”.

Increasing competition, with the stakes continually rising, would naturally encourage corner-cutting, unscrupulous behavior, even outright cheating and faking. At least by hindsight it is clear enough that scientists and universities had been corrupted by money — willingly, greedily; but Science itself seemed not visibly affected and could still be trusted. Dishonest behavior began to be noticeably troubling only by the 1980s.

The 1960s were still pleasantly high-flying years for scientific researchers. Things went well for me personally, and at the tail end of those great years I even collared my best grant yet, a five-year (1969-74) million-dollar project for fundamental work relevant to fuel cells, whose promise was something of a fad at the time.

But in the early 1970s, the American economy turned down. The job market for PhD scientists collapsed. Our graduate program in chemistry could not attract enough students, and, as already mentioned, we were not doing well with grant funds from NSF.

That is when my recreational interest in the Loch Ness Monster began to pay off, in entirely unforeseeable ways: leading to new insights into science and how it was changing; as well as bringing a career change.

———————————————————————————————

[1]    A Century of Doctorates: Data Analyses of Growth and Change, National Academy of Sciences, 1978
[2]    Henry H. Bauer, Fatal Attractions: The Troubles with Science,
Paraview Press, 2001, p. 166
[3]    NIH Data Book: Research Grants, 15 June 2015
[4]    W. A. Shaffer “Age Distribution – AAMC Medical School Faculty and NIH R01 Principal Investigators” (2012), cited in Michael Levitt & Jonathan M. Levitt, “Future of fundamental discovery in US biomedical research”,
Proceedings of the National Academy of Science, USA, 114 (#25, 2017): 6498-6503
[5]    Jocelyn Kaiser, “NIH plan to reduce overhead payments draws fire” (noting that the base rate for NIH grants averages about 52%)

Posted in conflicts of interest, fraud in science, funding research, scientific culture, scientists are human

From uncritical about science to skeptical about science

Posted by Henry Bauer on 2020/12/31

Science has been so successful at unlocking Nature’s secrets, especially since about the 16th century, that by the early decades of the 20th century, science had become almost universally accepted as the trustworthy touchstone of knowledge about and insight into the material world. In many ways and in many places, science has superseded religion as the ultimate source of truth.
Yet in the 21st century, an increasing number and variety of voices are proclaiming that science is not — or no longer — to be trusted.
Such disillusion is far from unanimous, but I certainly share it [1], as do many others [2, 3], including such well-placed insiders as editors of scientific periodicals.
How drastically different 21st-century science is from the earlier modern science that won such status and prestige seems to me quite obvious; yet the popular view seems oblivious to this difference. Official statements from scientific authorities and institutions are still largely accepted automatically, unquestioningly, by the mass media and, crucially, by policy-makers and governments, including international collaborations.
Could my opinion be erroneous about a decline in the trustworthiness of science?
If not, why has what seems so obvious to me gone unnoticed, overlooked by the overwhelming majority of practicing researchers, by pundits, by scholars of scientific activity, and by science writers and journalists?

That conundrum had me retracing the evolution of my views about science, from my early infatuation with it to my current disillusionment.
Almost immediately I realized that I had happened to be in some of the right places at some of the right times [4] with some of the right curiosity to be forced to notice the changes taking place; changes that came piecemeal over the course of decades.
That slow progression will also have helped me to modify my belief, bit by bit, quite slowly. After all, beliefs are not easily changed. From trusting science to doubting science is quite a jump; for that to occur quickly would be like suddenly acquiring a religious belief, Saul struck on the road to Damascus, or perhaps the opposite, losing a faith like the individuals who escape from cults, say Scientology — it happens quite rarely.
So it is natural but worth noting that my views changed slowly just as the circumstances of research were also changing, not all at once but gradually.
Of course I didn’t recognize at the time the cumulating significance of what I was noticing. That comes more easily in hindsight. Certainly I could not have begun to suspect that a book borrowed for light recreational reading would lead a couple of decades later to major changes of professional career.

Beginnings: Science, chemistry, unquestioning trust in science

I had become enraptured by science, and more specifically by chemistry, through an enthusiastic teacher at my high school in Sydney, Australia, in the late 1940s. My ambition was to become a chemist, researching and teaching, and I could imagine nothing more interesting or socially useful.
Being uncritically admiring of science came naturally to my cohort of would-be or potential scientists. It was soon after the end of the second World War; and that science really understands the inner workings of Nature had been put beyond any reasonable doubt by the awesome manner in which the war ended, with the revelation of atomic bombs. I had seen the newspaper headlines, “Atom bomb used over Japan”, as I was on a street-car going home from high-school, and I remember thinking, arrogantly, “Gullible journalism, swallowing propaganda; there’s no such thing as an atomic bomb”.

Learning that it was indeed a real thing made science seem yet more wonderful.

The successful ending of that war was also of considerable and quite personal significance for me. By ending it, “science” had brought a feeling of security and relief after years of high personal anxiety, even fear. When I was a 7-year-old school-boy, my family had escaped from Austria, in the nick of time, just before the war started; and then in Australia we had experienced the considerable fear of a pending Japanese invasion, a fear made very real by periodic news of Japanese atrocities in China, for instance civilians being buried alive, as illustrated in photographs.
Trusting science was not only the Zeitgeist of that time and place, it was personally welcome, emotionally appealing.

The way sciences were taught only confirmed that science could be safely equated with truth. For that matter, all subjects were taught quite dogmatically. We just did not question what our teachers said; time and place, again. In elementary school we had sat with arms folded behind our backs until the teacher entered, when we stood up in silent respect. Transgressions of any sort were rewarded by a stroke of a cane on an outstretched hand.
(Fifty years later, in another country if not another world, a university student in one of my classes complained about getting a “B” and not an “A”.)

I think chemistry also conduces to trusting that science gets it right. Many experiments are easy to do, making it seem obvious that what we’ve learned is absolutely true.
After much rote learning of properties of elements and compounds, the Periodic Table came as a wonderful revelation: never would I have to do all that memorizing again, everything can be predicted just from that Table.
Laboratory exercises, in high school and later at university, worked just as expected; failures came only from not being adept or careful enough. The textbooks were right.

Almost nothing at school or university, in graduate as well as undergraduate years, aroused any concerns that science might not get things right. A year of undergraduate research and half-a-dozen years in graduate study brought no reason to doubt that science could learn Nature’s truths. Individuals could make mistakes, of course; I was taken aback when a standard reference resource, Chemical Abstracts, sent me erroneously to an article about NaI instead of NOI — human error, obviously, in transcribing spoken words.

Of course there was still much to learn, but no reason to question that science could eventually come to really understand all the workings of the material world.

Honesty in doing science was taken for granted. We heard the horror story of someone who had cheated in some way; his scientific studies were immediately terminated and he had to take a job somewhere as a junior administrator. Something I had written was plagiarized — the historical introduction in my PhD thesis — and the miscreant was roundly condemned, even as he claimed a misunderstanding. Individuals could of course go wrong, but that threw no doubt on the trustworthiness of Science itself.

In many ways, scientific research in Australia in the 1940s and 1950s enjoyed conditions not so different from those of the founding centuries of modern science, when the sole driving aim was to learn how the world works. In the universities, scientific research was very much part of the training of graduate students for properly doing good science. The modest resources needed were provided by the University. No time and effort had to be spent seeking support from outside sources, no need to locate and kowtow to potential patrons, whether individuals or managers at foundations or government agencies.
Research of a more applied sort was carried out by the government-funded Council for Scientific and Industrial Research, CSIR (which later became a standard government agency, the Commonwealth Scientific and Industrial Research Organization, CSIRO). There the atmosphere was quite like that in academe: people more or less happily working at a self-chosen vocation. The aims of research were sometimes quite practical, typically how better to exploit Australia’s natural resources: plentiful coal, soft brown as well as hard black; or the wool being produced in abundance by herds of sheep. CSIR also made some significant “pure science” discoveries, for example the importance of nutritional trace elements in agricultural soils [5] and in the development of radio astronomy [6].

In retrospect the lack of money-grubbing is quite striking. At least as remarkable, and not unrelated, is that judgments were made qualitatively, not quantitatively. People were judged by the quality, the significance, the importance of what they accomplished, rather than by how much of something they did. We judged our university teachers by their mastery of the subjects they taught and on how they treated us. Faculty appointments and promotions relied on personal recommendations. Successful researchers might often — and naturally — publish more than others, but not necessarily. Numbers of publications were not the most important thing, nor how often one’s publications were cited by others: the Science Citation Index was founded only in 1963, followed by the Social Sciences Citation Index in 1973 and the Arts and Humanities Citation Index a few years later. “Impact factors” of scientific journals began to be calculated only in the early 1970s.

So in my years of learning chemistry and beginning research, nothing interfered with having an idealistic view of science, implicitly “pure” science, sheer knowledge-seeking. For my cohort of students, it was an attractive, worthy vocation. The most desired prospect was to be able to work at a university or a research institute. If one was less fortunate, it might be necessary to take a job in industry, which in those years was little developed in Australia, involving the manufacture of such uncomplicated or unsophisticated products as paint, the processing of sugar cane, or the technicalities of brewing beer, making wine, and distilling spirits.

The normal path to an academic career in Australia began with post-doctoral experience in either Britain or the United States. My opportunity came in the USA; there, in the late 1950s, I caught my first glimpses of what science would become, with an influx of funds from government and industry and the associated consequences, then unforeseen if not unforeseeable but at any rate not of any apparent concern.

——————————————-

[1]    Henry H. Bauer, Science Is Not What You Think: How It Has Changed, Why We Can’t Trust It, How It Can Be Fixed, McFarland, 2017
[2]    Critiques of Contemporary Science and Academe
https://mega.nz/file/NfwkSR7S#K7llqDfA9JX_mVEWjPe4W-uMM53aMr2XMhDP6j0B208
[3]    What’s Wrong With Medicine; https://mega.nz/file/gWoCWTgK#1gwxo995AyYAcMTuwpvP40aaB3DuA5cvYjK11k3KKSU
[4]    Insight borrowed from Paula E. Stephen & Sharon G. Levin, Striking the Mother Lode in Science: The Importance of Age, Place, and Time, Oxford University Press, 1992
[5]    Best known is the discovery that cobalt supplements avoided “coast disease”, a wasting condition of sheep; see Gerhard N. Schrauzer, “The discovery of the essential trace elements: An outline of the history of biological trace element research”, chapter 2, pp. 17-31, in Earl Frieden, Biochemistry of the Essential Ultratrace Elements, Plenum Press, 1984; and the obituary, “Hedley Ralph Marston 1900-1965”; https://www.science.org.au/fellowship/fellows/biographical-memoirs/hedley-ralph-marston-1900-1965
[6] Stories of Australian Astronomy: Radio Astronomy; https://stories.scienceinpublic.com.au/astronomy/radio-astronomy/

Posted in conflicts of interest, fraud in science, funding research, scientific culture, scientism

The misleading popular myth of science exceptionalism

Posted by Henry Bauer on 2020/12/28

Human beings are fallible; but we suppose the Pope to be infallible on spiritual matters and science to be exceptional among human endeavors as correctly, authoritatively knowledgeable about the workings of the material world. Other sources purporting to offer veritable knowledge may be fallible — folklore, history, legend, philosophy — but science can be trusted to speak the truth.

Scholars have ascribed the infallibility of science to its methodology and to the way scientists behave. Science is thought to employ the scientific method, and behavior among scientists is supposedly described by the Mertonian Norms. Those suppositions have somehow seeped into the conventional wisdom. Actually, however, contemporary scientific activity does not proceed by the scientific method, nor do scientists behave in accordance with the Mertonian Norms. Because the conventional wisdom is so wrong about how science and scientists work, public expectations about science are misplaced, and public policies and actions thought to be based on science may be misguided.

Contemporary science is unrecognizably different from the earlier centuries of modern science (commonly dated as beginning around the 16th century). The popular view was formed by those earlier times, and it has not yet absorbed how radically different the circumstances of scientific activities have become, increasingly since the middle of the 20th century.

Remarkable individuals were responsible for the striking achievements of modern science that brought science its current prestige and status; and there are still some remarkably talented people among today’s scientists. But on the whole, scientists or researchers today are much like other white-collar professionals [1: p. 79], subject to conflicts of interest and myriad annoyances and pressures from patrons and outside interests; 21st century “science” is just as interfered with and corrupted by commercial, ideological, and political forces as are other sectors of society, say education, or justice, or trade.

Modern science developed through the voluntary activities of individuals sharing the aim of understanding how Nature works. The criterion of success was that claimed knowledge be true to reality. Contemporary science by contrast is not a vocation carried on by self-supporting independent individuals; it is done by white-collar workers employed by a variety of for-profit businesses and industries and not-for-profit colleges, universities, and government agencies. Even as some number of researchers still genuinely aim to learn truths about Nature, their prime responsibility is to do what their employers demand, and that can conflict with being wholeheartedly truthful.

The scientific method and the Mertonian Norms do not encompass the realities of contemporary science

The myth of the scientific method has been debunked at book length [2]. It should suffice, though, just to point out that the education and training of scientists may not even include mention of the so-called scientific method.

I had experienced a bachelor’s-degree education in chemistry, a year of undergraduate research, and half-a-dozen years of graduate research leading to both a master’s degree and a doctorate before I ever heard of “the scientific method”. When I eventually did, I was doing postdoctoral research in chemistry (at the University of Michigan); and I heard of “the scientific method” not from my sponsor and mentor in the Chemistry Department but from a graduate student in political science. (Appropriately enough, because it is practitioners of the social and behavioral sciences, as well as some medical doctors, who make a fetish of claiming to follow the scientific method, in an attempt to be granted as much prestige and trustworthiness as physics and chemistry enjoy.)

The scientific method would require individuals to change their beliefs readily whenever the facts seem to call for it. But if there is one thing that psychology and sociology agree on, it is that individuals and groups find it very difficult, and only rarely manage, to modify a belief once it has become accepted. The history of science is consonant with that understanding: new and better understanding is persistently resisted by the majority consensus of the scientific community for as long as possible [3, 4]; pessimistically, in the words of Max Planck, until the proponents of the earlier belief have passed away [5]; as one might put it, science progresses one funeral at a time.

The Mertonian norms [6], too, are more myth than actuality. They are, in paraphrase:

•      Communality or communalism (Merton had said “communism”): Science is an activity of the whole scientific community and it is a public good — findings are shared freely and openly.
•      Universalism: Knowledge about the natural world is universally valid and applicable. There are no separations or distinctions by nationality, religion, race, sex, etc.
•      Disinterestedness: Science is done for the public good, not for personal benefit; scientists seek to be impartial, objective, unbiased, not self-serving.
•      Skepticism: Claims and reported findings are subject to critical appraisal and testing throughout the scientific community before they can be accepted as proper scientific knowledge.

As with the scientific method, these norms suggest that scientists behave in ways that do not come naturally to human beings. Free communal sharing of everything might perhaps have characterized human society in the days of hunting and foraging [7], but it was certainly not the norm in Western society at the time of the Scientific Revolution and the beginnings of modern science. Disinterestedness is a very strange trait to attribute to a human being, voluntarily doing something without having any personal interest in the outcome; at the very least, there is surely a strong desire that what one does should be recognized as the good and right way to do things, as laudable in some way. Skepticism is no more natural than is the ready willingness to change beliefs demanded by the scientific method.

As to universalism: if claimed knowledge is actually true, universal validity goes without saying; it has nothing to do with behavior. If some authority attempts to establish something that is not true, it just becomes a self-defeating, short-lived dead end like the Stalinist “biology” of Lysenko or the Nazi non-Jewish “Deutsche Physik” [8].

Merton wrote that the norms, the ethos of science, “can be inferred from the moral consensus of scientists as expressed in use and wont, in countless writings on the scientific spirit and in moral indignation directed toward contraventions of the ethos” [6]. That falls short of claiming to have found empirically that scientists actually behave like that for the inferred reasons.

Merton’s norms are a sociologist’s speculation that the successes of science could only have come if scientists behaved like that; just as “the scientific method” is a philosophers’ guess that true knowledge could only be arrived at if knowledge seekers proceeded like that.

More compatible with typical human behavior would be the following:

Early modern science became successful after the number of people trying to understand the workings of the natural world reached some “critical mass”, under circumstances in which they could be in fairly constant communication with one another. Those circumstances came about in the centuries following the Dark Ages in Europe. Eventually various informal groups began to meet, then more formal “academies” were established (of which the Royal Society of London is iconic as well as still in existence). Exchanges of observations and detailed information were significantly aided by the invention of inexpensive printing. Relatively informal exchanges became more formal, as Reports and Proceedings of Meetings, leading to what are now scientific journals and periodicals (some of which still bear the time-honored title of “Proceedings of . . .”).

Once voluntary associations had been established among individuals whose prime motive was to understand Nature, some competition, some rivalry, and also some cooperation will have followed automatically. Everyone wanted to get it right, and to be among the first to get it right, so the criterion for success was the concurrence and approval of the others who were attempting the same thing. Open sharing was then a matter of self-interest and therefore came naturally, because one could obtain approval and credit only if one’s achievements were known to others. Skepticism was provided by those others: one had to get it right in order to be convincing. There was no need at all for anyone to be unnaturally disinterested. (This scenario is essentially the one Michael Polanyi  described by the analogy of communally putting together a jigsaw puzzle [2: pp. 42-44, passim; 9].)

Such conditions of free, voluntary interaction among individuals sharing the sole aim of understanding Nature, something like intellectual free-market conditions, simply do not exist nowadays. Few if any researchers can be self-supporting, independent, intellectual entrepreneurs; most are employees, and thereby beholden to and restricted by the aims and purposes of those who hold the purse-strings.

Almost universally nowadays, the gold standard of reliability is thought to be “the peer-reviewed mainstream literature”. But it would be quite misleading to interpret peer review as the application of organized skepticism, “critical appraisal and testing throughout the scientific community”. As most productive researchers well know, peer review does not guarantee the accuracy, objectivity, or honesty of what has passed it. In earlier times, genuine and effective peer review took place by the whole scientific community after full details of claimed results and discoveries had been published. Nowadays, in sharp contrast, so-called peer review is carried out by a small number of individuals chosen by journal editors to advise on whether reported claims should even be published. Practicing and publishing researchers know that contemporary so-called peer review is riddled with bias, prejudice, ignorance, and general incompetence. But even worse than the failings of peer review in decisions concerning publication is the fact that the same mechanism is used to decide what research should be carried out, and even how it should be carried out [1: pp. 106-9, passim].

Contemporary views of science, and associated expectations about science, are dangerously misplaced because of the pervasive mistaken belief that today’s scientific researchers are highly talented, exceptional individuals in the mold of Galileo, Newton, Einstein, etc.,  and that they are unlike normal human beings in being disinterested, seeking only to serve the public good, disseminating their findings freely, self-correcting by changing their theories whenever the facts call for it, and perpetually skeptical about their own beliefs.

Rather, a majority consensus nowadays exercises dogmatic hegemony, insisting on theories contrary to fact on a number of  topics, including such publicly important ones as climate-change and HIV/AIDS [10].

————————————————-

[1]    Henry H. Bauer, Science Is Not What You Think: How It Has Changed, Why We Can’t Trust It, How It Can Be Fixed, McFarland, 2017
[2]    Henry H. Bauer, Scientific Literacy and the Myth of the Scientific Method, University of Illinois Press, 1992; “I would strongly recommend this book to anyone who hasn’t yet heard that the scientific method is a myth. Apparently there are still lots of those folks around” (David L. Goodstein, Science, 256 [1992] 1034-36)
[3]    Bernard Barber, “Resistance by scientists to scientific discovery”, Science, 134 (1961) 596-602
[4]    Thomas S. Kuhn, The Structure of Scientific Revolutions, University of Chicago Press, 1970 (2nd ed., enlarged; 1st ed. 1962)
[5]    Max Planck, Scientific Autobiography and Other Papers, 1949; translated from German by Frank Gaynor, Greenwood Press, 1968
[6]    Robert K. Merton, “The normative structure of science” (1942); pp. 267–78 in The Sociology of Science (ed. N. Storer, University of Chicago Press, 1973)
[7]    Christopher Ryan & Cacilda Jethá, Sex at Dawn: The Prehistoric Origins of Modern Sexuality, HarperCollins, 2010
[8]    Philipp Lenard, Deutsche Physik, J. F. Lehmann (Munich), 1936
[9]    Michael Polanyi, “The Republic of Science: Its political and economic theory”, Minerva, I (1962) 54-73
[10]  Henry H. Bauer, Dogmatism in Science and Medicine: How Dominant Theories Monopolize Research and Stifle the Search for Truth, McFarland, 2012

Posted in conflicts of interest, consensus, funding research, media flaws, peer review, politics and science, resistance to discovery, science is not truth, scientific culture, scientism, scientists are human, the scientific method, unwarranted dogmatism in science

Science Court: Why and What

Posted by Henry Bauer on 2020/12/16

The idea for what has come to be called a Science Court was proposed half a century ago by Arthur Kantrowitz [1].

The development of nuclear reactors as part of the atom-bomb project made it natural to contemplate the possibility of generating power for civil purposes by means of nuclear reactors (the reactor at Hanford that made plutonium for the Nagasaki bomb was also the first full-scale nuclear reactor ever built [2]).

The crucial question was whether power-generating nuclear reactors could be operated safely. The technical experts were divided over that, and Kantrowitz proposed that an “Institution for Scientific Judgment” was needed to adjudicate the opposing opinions.

In those years, scientific activity was still rather like in pre-WWII times: a sort of ivory-tower cottage industry of largely independent intellectual entrepreneurs who shared the aim of learning how the material world works. Mediating opposing opinions could then seem like a relatively straightforward matter of comparing data and arguments. Half a century later, however, scientific activity has pervaded business, commerce, and medical practices, and research has become intensely competitive, with cutthroat competition for resources and opportunities for profit-making and achieving personal wealth and influence. Conflicts of interest are ubiquitous and inescapable [3]. Mediating opposing technical opinions is now complicated because public acceptance of a particular view has consequences for personal and institutional power and wealth; deciding what “science” truly says is hindered by personal conflicts of interest, Groupthink, and institutional conflicts of interest.

Moreover, technical disagreements nowadays are not between more or less equally placed technical experts; they are between a hegemonic mainstream consensus and individual dissenters. The consensus elite controls what the media and the public learn about “science”, as the “consensus” dominates “peer review”, which in practice determines all aspects of scientific activity, for instance the allocation of positions and research resources and the publication (or suppression) of observations or results.

It has become quite common for the mainstream consensus to effectively suppress minority views and anomalous research results, often dismissing them out of hand, not infrequently labeling them pejoratively as denialist or flat-earther crackpot [4]. Thereby the media, the public, and policymakers may not even become aware of the existence of competent, plausible dissent from a governing consensus.

The history of science is, however, quite unequivocal: Over the course of time, a mainstream scientific consensus may turn out to be inadequate and to be replaced by previously denigrated and dismissed minority views.

Public actions and policies might bring about considerable damage if based on a possibly mistaken contemporary scientific consensus. Since nowadays a mainstream consensus so commonly renders minority opinions invisible to society at large, some mechanism is needed to enable policymakers to obtain impartial, unbiased advice as to whether minority views on matters of public importance should be taken into consideration.

That would be the prime purpose of a Science Court. The Court would not be charged with deciding or declaring what “science” truly says. It would serve just to force openly observed substantive engagement among the disagreeing technical experts — “force” because the majority consensus typically refuses voluntarily to engage substantively with dissident contrarians, even in private.

In a Court, as the elite consensus and the dissenters present their arguments and their evidence, points of disagreement would be made publicly visible and also clarified under mutual cross-examination. That would enable lay observers — the general public, the media, policymakers — to arrive at reasonably informed views about the relative credibility of the proponents of the majority and minority opinions, through noting how evasive or responsive or generally confidence-inspiring they are. Even if no immediate resolution of the differences of opinion could be reached, at least policymakers would be sufficiently well-informed to judge which public actions and policies might plausibly be warranted and which might be too risky for immediate implementation.

A whole host of  practical details can be specified only tentatively at the outset since they will likely need to be modified over time as the Court gains experience. Certain at the beginning is that public funding is needed as well as absolute independence, as with the Supreme Court of the United States. Indeed, a Science Court might well be placed under the general supervision of the Supreme Court. While the latter might not at first welcome accepting such additional responsibilities, that might change since the legal system is currently not well equipped to deal with cases where technical issues are salient [5]. For example, the issue of who should be acceptable as an expert technical witness encounters the same problem of adjudicating between a hegemonic majority consensus and a number of entirely competent expert dissenters as the problem of adjudicating opposing expert opinions.

Many other details need to be worked out: permanent staffing of the Court as well as temporary staffing for particular cases; appointment or selection of advocates for opposing views; how to choose issues for consideration; the degree and type of authority the Court could exercise, given that a majority consensus would usually be unwilling to engage voluntarily with dissidents. These questions, and more, have been discussed elsewhere [6]. As already noted, however, if a Science Court is actually established, its unprecedented nature would inevitably make progressive modification of its practices desirable in the light of accumulating experience.

————————————————-

[1]    Arthur Kantrowitz, “Proposal for an Institution for Scientific Judgment”, Science, 156 (1967) 763-64

[2]    Steve Olson, The Apocalypse Factory, W. W. Norton, 2020

[3]    Especially chapter 1 in Henry H. Bauer, Science Is Not What You Think: How It Has Changed, Why We Can’t Trust It, How It Can Be Fixed, McFarland, 2017

[4]    Henry H. Bauer, Dogmatism in Science and Medicine: How Dominant Theories Monopolize Research and Stifle the Search for Truth, McFarland, 2012

[5]    Andrew W. Jurs, “Science Court: Past proposals, current considerations, and a suggested structure”, Drake University Legal Studies Research Paper Series, Research Paper 11–06 (2010); Virginia Journal of Law and Technology, 15 #1

[6]    Chapter 12 in Henry H. Bauer, Science Is Not What You Think: How It Has Changed, Why We Can’t Trust It, How It Can Be Fixed, McFarland, 2017

Posted in conflicts of interest, consensus, denialism, funding research, peer review, politics and science, resistance to discovery, science is not truth, science policy, scientific culture, scientism, unwarranted dogmatism in science

Can science regain credibility?

Posted by Henry Bauer on 2020/12/09

Some of the many critiques of contemporary science and medicine [1] have suggested improvements or reforms: among them, ensuring that empiricism and fact determine theory rather than the other way around [2]; more competent application of statistics; awareness of biases as a way of decreasing their influence [1, 2, 3].

Those suggestions call for individuals in certain groups, as well as those groups and institutions as a whole, to behave differently than they have been behaving: researchers, editors, administrators, patrons; universities, foundations, government agencies, and commercial sponsors of research.

Such calls for change are, however, empty whistling in the wind if not based on an understanding of why those individuals and those groups have been behaving in ways that have caused science as a whole to lose credibility — in the eyes of much of the general public, but not only the general public: a significant minority of accomplished researchers and other informed insiders have concluded that on any number of topics the mainstream “consensus” is flawed or downright wrong, not properly based on the available evidence [4].

It is a commonplace to remark that science displaced religion as the authoritative source of knowledge and understanding, at least in Western civilization during the last few centuries. One might then recall the history of religion in the West, and that corruption of its governing institutions eventually brought rebellion: the Protestant Reformation, the Enlightenment, and the enshrining of science and reason as society’s hegemonic authority; so it might seem natural now to call for a Scientific Reformation to repair the institutions of science that seem to have become corrupted.

The various suggestions for reform have indeed called for change in a number of ways: in how academic institutions evaluate the worth of their researchers; in how journals decide what to publish and what not to publish; in how the provision of research resources is decided; and so forth and so on. But such suggestions fail to get to the heart of the matter. The Protestant Reformation was seeking the repair of a single, centrally governed, institution. Contemporary science, however, comprises a whole collection of institutions and groups that interact with one another in ways that are not governed by any central authority.

The way “science” is talked and written about is highly misleading, since no single word can properly encompass all its facets or aspects. The greatest source of misunderstanding comes about because scientific knowledge and understanding do not generate themselves or speak for themselves; so in common discourse, “science” refers to what is said or written about scientific knowledge and theories by people — who are, like all human beings, unavoidably fallible, subject to a variety of innate ambitions and biases as well as external influences; and hindered and restricted by psychological and social factors — psychological factors like confirmation bias, which gets in the way of recognizing errors and gaps, social factors like Groupthink, which pressures individuals not to deviate from the beliefs and actions of any group to which they belong.

So whenever a claim about scientific knowledge or understanding is made, the first reaction should be: “Who says so?”

It seems natural to presume that the researchers most closely related to a given topic would be the most qualified to explain and interpret it to others. But scientists are just as human and fallible as others, so researchers on any given subject are biased towards thinking they understand it properly even though they may be quite wrong about it.

A better reflection of what the facts actually are would be the view that has become more or less generally accepted within the community of specialist researchers, and thereby in the scientific community as a whole; in other words, what research monographs, review articles, and textbooks say — the “consensus”. Crucially, however, as already noted, any contemporary consensus may be wrong, in small ways or large or even entirely.

Almost invariably there are differences of opinion within the specialist and general scientific communities, particularly but not only about relatively new or recent studies. Unanimity is likely only over quite simple matters where the facts are entirely straightforward and readily confirmed; but such simple and obvious cases are rare indeed. Instead of unanimity, the history of science is a narrative of perpetual disagreements as well as (mostly but not always) their eventual resolution.

On any given issue, the consensus is not usually unanimous as to “what science says”. There are usually some contrarians, some mavericks among the experts and specialist researchers, some unorthodox views. Quite often, it turns out eventually that the consensus was flawed or even entirely wrong, and what earlier were minority views then become the majority consensus [5, 6].

That perfectly normal lack of unanimity, the common presence of dissenters from a “consensus” view, is very rarely noted in the popular media and remains hidden from the conventional wisdom of society as a whole — most unfortunately and dangerously, because it is hidden also from the general run of politicians and policymakers. As a result, laws on all sorts of issues, and many officially approved practices in medicine, may come to be based on a mistaken scientific consensus; or, as President Eisenhower put it [7], public policies might become captive to a scientific-technological elite, those who constitute and uphold the majority consensus.

The unequivocal lesson that modern societies have yet to learn is that any contemporary majority scientific consensus may be misleading. Only once that lesson has been learned will it then be noted that there exists no established safeguard to prevent public policies and actions being based on erroneous opinions. There exists no overarching Science Authority to whom dissenting experts could appeal in order to have the majority consensus subjected to reconsideration in light of evidence offered by the contrarian experts; no overarching Science Authority, and no independent, impartial, unbiased, adjudicators or mediators or interpreters to guide policymakers in what the actual science might indicate as the best direction.

That’s why the time is ripe to consider establishing a Science Court [8].

——————————————–

[1]    Critiques of Contemporary Science and Academe; What’s Wrong with Present-Day Medicine

[2]    See especially, about theoretical physics, Sabine Hossenfelder, Lost in Math: How Beauty Leads Physics Astray, Basic Books, 2018

[3]    Stuart Ritchie, Science Fictions: How FRAUD, BIAS, NEGLIGENCE, and HYPE Undermine the Search for Truth, Metropolitan Books (Henry Holt & Company), 2020

[4]    A number of examples are discussed in Henry H. Bauer, Dogmatism in Science and Medicine: How Dominant Theories Monopolize Research and Stifle the Search for Truth, McFarland, 2012

[5]    Bernard Barber, “Resistance by scientists to scientific discovery”, Science, 134 (1961) 596-602

[6]    Thomas S. Kuhn, The Structure of Scientific Revolutions, University of Chicago Press, 1970 (2nd ed., enlarged; 1st ed. 1962)

[7]    Dwight D. Eisenhower, Farewell speech, 17 January 1961; transcript at http://avalon.law.yale.edu/20th_century/eisenhower001.asp

[8]    Chapter 12 in Henry H. Bauer, Science Is Not What You Think: How It Has Changed, Why We Can’t Trust It, How It Can Be Fixed, McFarland, 2017

Posted in conflicts of interest, consensus, fraud in science, media flaws, medical practices, peer review, politics and science, resistance to discovery, science is not truth, science policy, scientific culture, scientists are human, unwarranted dogmatism in science

Why skepticism about science and medicine?

Posted by Henry Bauer on 2020/09/06

My skepticism is not about science and medicine as sources or repositories of objective knowledge and understanding. Skepticism is demanded by the fact that what society learns about science and medicine is mediated by human beings. That brings in a host of reasons for skepticism: human fallibility, individual and institutional self-interest, conflicts of interest, sources of bias and prejudice.

I have never come across a better discussion of the realities about science and its role in society than Richard Lewontin’s words in his book, Biology as Ideology (Anansi Press 1991, HarperPerennial 1992; based on 1990 Massey Lectures, Canadian Broadcasting Corporation):

“Science is a social institution about which there is a great deal of misunderstanding, even among those who are part of it. . . [It is] completely integrated into and influenced by the structure of all our other social institutions. The problems that science deals with, the ideas that it uses in investigating those problems, even the so-called scientific results that come out of scientific investigation, are all deeply influenced by predispositions that derive from the society in which we live. Scientists do not begin life as scientists, after all, but as social beings immersed in a family, a state, a productive structure, and they view nature through a lens that has been molded by their social experience.
. . . science is molded by society because it is a human productive activity that takes time and money, and so is guided by and directed by those forces in the world that have control over money and time. Science uses commodities and is part of the process of commodity production. Science uses money. People earn their living by science, and as a consequence the dominant social and economic forces in society determine to a large extent what science does and how it does it. More than that, those forces have the power to appropriate from science ideas that are particularly suited to the maintenance and continued prosperity of the social structures of which they are a part. So other social institutions have an input into science both in what is done and how it is thought about, and they take from science concepts and ideas that then support their institutions and make them seem legitimate and natural. . . .
Science serves two functions. First, it provides us with new ways of manipulating the material world . . . . [Second] is the function of explanation” (pp. 3-4). And (p. 5) explaining how the world works also serves as legitimation.

Needed skepticism takes into account that every statement disseminated about science or medicine serves in some way the purpose(s), the agenda(s), of the source or sources of that statement.

So the first thing to ask about any assertion about science or medicine is, why is this statement being made by this particular source?

Statements by pharmaceutical companies, most particularly their advertisements, should never be believed: as innumerable observers and investigators have documented, the profit motive has outweighed any concern for the harm that unsafe medications cause, even when there is no evidence of definite potential benefit. The best way to decide whether to prescribe or use a drug is to compare NNT and NNH — the number needed to treat for one patient to benefit versus the number needed to treat for one patient to be harmed; but NNT and NNH are never reported by drug companies. For example, there is no evidence whatsoever that HPV vaccination decreases the risk of any cancer; all that has been observed is that the vaccines may decrease genital warts. On the other hand, many individuals have suffered grievous harm from “side” effects of these vaccines (see Holland 2018 in the bibliography cited just below, and the documentary Sacrificial Virgins). TV ads by Merck, for example in August 2020 on MSNBC, cite the Centers for Disease Control & Prevention as recommending the vaccine not only for girls but also for boys.
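For readers unfamiliar with these measures: NNT and NNH are conventionally computed as the reciprocal of the absolute difference in event rates between treated and untreated groups. Here is a minimal sketch of that arithmetic; the function name and the event rates are invented for illustration only and do not describe any real drug.

```python
def number_needed(rate_without_drug: float, rate_with_drug: float) -> float:
    """Reciprocal of the absolute difference in event rates.

    Applied to a beneficial outcome this yields NNT (number needed to treat);
    applied to an adverse outcome it yields NNH (number needed to harm).
    """
    difference = abs(rate_without_drug - rate_with_drug)
    if difference == 0:
        return float("inf")  # no measurable effect either way
    return 1.0 / difference

# Hypothetical rates for illustration only (not real drug data):
# a bad outcome occurs in 4% of untreated vs 2% of treated patients,
# and an adverse effect occurs in 1% of untreated vs 5% of treated patients.
nnt = number_needed(0.04, 0.02)  # roughly 50: treat ~50 patients for one to benefit
nnh = number_needed(0.01, 0.05)  # roughly 25: treat ~25 patients for one to be harmed
```

On these made-up numbers the drug harms two patients for every one it helps (NNH about 25 against NNT about 50) — exactly the kind of comparison that raw event rates in an advertisement do not make obvious.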

For fully documented discussions of the pervasive misdeeds of drug companies, consult the books listed in my periodically updated bibliography, What’s Wrong with Present-Day Medicine.
I recommend particularly Angell 2004, Goldacre 2013, Gøtzsche 2013, Healy 2012, and Moynihan & Cassels 2005. Greene 2007 is a very important but little-cited book describing how numbers and surrogate markers have come to dominate medical practice, to the great harm of patients.

Official reports may be less obviously deceitful than drug company advertisements, but they are no more trustworthy, as argued in detail and with examples in “Official reports are not scientific publications”, chapter 3 in my Dogmatism in Science and Medicine: How Dominant Theories Monopolize Research and Stifle the Search for Truth (McFarland 2012):
“reports from official institutions and organizations . . . are productions by bureaucracies . . . . The actual authors of these reports are technical writers whose duties are just like those of press secretaries, advertising writers, and other public-relations personnel: to put on the actual evidence and conclusions the best possible spin to reinforce the bureaucracy’s viewpoint and emphasize the importance of the bureaucracy’s activities.
Most important: The Executive Summaries, Forewords, Prefaces, and the like may tell a very different story than does the actual evidence in the bulk of the reports. It seems that few if any pundits actually read the whole of such documents. The long public record offers sad evidence that most journalists certainly do not look beyond these summaries into the meat of the reports, given that the media disseminate uncritically so many of the self-serving alarums in those Executive Summaries” (p. 213).

So too with press releases from academic institutions.

As for statements direct from academic and professional experts, recall that, as Lewontin pointed out, “people earn their living by science”. Whenever someone regarded as an expert or authority makes public statements, an important purpose is to enhance the status, prestige, career, or profitability of whoever is making the statement. This is not to suggest that such statements are made with deliberate dishonesty; but the need to preserve status, as well as the usual illusion that what one believes is actually true, ensures that such statements will be dogmatically one-sided assertions, not judicious assessments of the objective state of knowledge.

Retired academic experts like myself no longer suffer conflicts of interest at a personal or institutional-loyalty level. When we venture critiques of drug companies, official institutions, colleges and universities, and even individual “experts” or former colleagues, we will usually be saying what we genuinely believe to be unvarnished truth. Nevertheless, despite the lack of major obvious conflicts of interest, one should have more grounds than that for believing what we have to say. We may still have an unacknowledged agenda, for instance a desire still to do something useful even though our careers are formally over. Beyond that, of course, like any other human beings, we may simply be wrong, no matter how sure we ourselves are that we are right. Freedom from frank, obvious conflicts of interest does not bring with it some superhuman capacity for objectivity, let alone omniscience.

In short:
Believe any assertion about science or medicine, from any source, at your peril.
If the matter is of any importance to you, you had best do some investigating of evidence and facts, and comparison of diverse interpretations.

Posted in conflicts of interest, consensus, fraud in medicine, fraud in science, medical practices, peer review, politics and science, science is not truth, scientific literacy, scientism, scientists are human, unwarranted dogmatism in science

Percentages absolute or relative? Politicizing science

Posted by Henry Bauer on 2020/08/24

Convalescent plasma reduces the mortality of CoVID-19 by 35%, citizens of the United States were assured in a press conference on 23 August 2020, and the approval of this treatment for emergency use by the Food and Drug Administration (FDA) underscored that this constituted a breakthrough in treating the pandemic disease.

As usual, critical voices ventured to disagree. One physician reported that he had been using this treatment for a considerable length of time and had noted a perhaps marginal, certainly not great, benefit from this intervention. Others pointed out that the use of convalescent plasma in general was nothing new.

That “35%” mortality reduction was emphasized a number of times in the televised official announcement. Only a few days later did we learn that the original data suggested a reduction of mortality to about 8% from 11-12% among presumably comparable patients not so treated. In other words, 3 to 4 of every 100 patients may have derived a benefit in terms of decreased mortality.

Indeed, 8 is about 35% less than 11-12. However, an absolute reduction of 3.5 percentage points is nothing like a 35% reduction.
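The two figures can be checked directly from the reported rates; a short sketch, using the approximate mortality numbers given above:

```python
# Approximate mortality rates reported for convalescent plasma.
treated = 0.08     # ~8% mortality among treated patients
untreated = 0.115  # ~11-12% among comparable untreated patients

# Absolute reduction: percentage points of mortality avoided.
absolute_reduction = untreated - treated             # 0.035 -> 3.5 points

# Relative reduction: the same difference expressed as a
# fraction of the untreated rate -- the "35%"-style figure.
relative_reduction = absolute_reduction / untreated  # ~0.30

print(f"{absolute_reduction:.1%} absolute")
print(f"{relative_reduction:.0%} relative")
```

The same difference of 3.5 percentage points sounds ten times more impressive when quoted as a relative reduction, which is precisely why press releases and advertisements prefer the relative number.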

This episode illustrates what is quite commonplace as drug companies seek to impress doctors and patients with the wonderful benefits to be derived from their medications: relative effects rather than absolute ones are reported.

This is just one of the many things wrong with present-day practices in medicine, of course; dozens of works describing the dysfunctions are listed in my periodically updated bibliography.

Investigative reporters also revealed that the FDA’s emergency-use approval had come at the behest of the White House. Historians will recall that the whole science of genetics was derailed in the Soviet Union for a generation as Stalin’s administration enshrined as science the pseudoscience invented by Lysenko.

Posted in conflicts of interest, fraud in medicine, media flaws, medical practices, politics and science, prescription drugs, scientific literacy

Vaccines are not all equally safe and effective

Posted by Henry Bauer on 2019/07/13

The article below is copied from the website of the Roanoke Times:

https://www.roanoke.com/opinion/commentary/bauer-all-vaccines-are-not-equally-safe-and-effective/article_ef1bf6b6-4e8f-5dcd-b071-91736b99c68a.html

The article also appeared on the Opinion page of the Times on 11 July 2019.

The Roanoke Times is a local/regional newspaper in South-West Virginia. I had tried for a wider audience, but essentially the same piece had been rejected by the New York Times, Washington Post, Wall St Journal, and Financial Times.

Several people have been unable to access the Internet link given above, being either asked to subscribe to the newspaper or told that it is not available outside the USA; but a number of people accessed it without difficulty.

Recent outbreaks of measles have brought widespread unrestrained criticism of parents who have avoided vaccinating their children under the presumed influence of misguided ideological “anti-vaxxers.” But at least some of the anger and blame should be directed at official sources for refusing to admit that some vaccines occasionally do bring sometimes very serious harm to some individuals. By not admitting that, officialdom provides unwarranted credibility to allegations of official cover-ups, allegations then expanded to blanket warnings against vaccinating in general.

There are three main ways in which vaccines can sometimes cause harm to some individuals.

One is the presence in some vaccines of preservatives to protect against contamination by bacteria. Being toxic to bacteria, they can also be toxic to higher forms of life. A commonly used preservative, thimerosal, is a mercury-containing organic substance, and organic-mercury compounds are indeed often toxic to human beings.

A second possible source of harm in some vaccines is the use of so-called adjuvants. These cause a non-specific stimulation of the immune system, in the belief that when the immune system is already aroused it will respond better to the specific components in the vaccine. Adjuvants work through being recognized by the immune system as foreign and undesirable, in other words as being potentially harmful to the person receiving the vaccine. Commonly used adjuvants include organic aluminum compounds, which are known to be harmful if they accumulate in the nervous system, particularly the brain; some people of my age may recall the long-ago warnings against aluminum cookware because of that possible harm.

A third possible danger lies in the inherent specific action of the particular vaccine. Some vaccines sometimes, though quite rarely, actually bring about the very disease against which they are intended to act. More generally, since vaccines are intended to cause the immune system to do certain things, it is far from implausible that the immune system may sometimes react in a different fashion than desired, for example by setting in process an autoimmune reaction. Our present understanding of immune-system functioning does not warrant dogmatic, supposedly authoritative pronouncements alleging that all vaccines are safe for everyone.

The known sources of possible harm from vaccination make it not unreasonable, for instance, to recommend that babies be vaccinated against mumps, measles, and rubella separately, at intervals, rather than with a single dose of a multiple (MMR) vaccine. The known nervous-system toxicity of organic aluminum and mercury compounds makes it unreasonable to dismiss out-of-hand the possibility that these additives in some vaccines may produce such neural damage as symptoms of autism; reports and claims need to be investigated, not ignored or pooh-poohed. Moreover, wherever possible we should be offered the option of vaccines free of adjuvants and preservatives.

The public would be better served than we are now if official proclamations were to distinguish among different vaccines. The benefit-to-risk ratio of measles vaccine, for instance, or of polio vaccine, seems well established through long experience of efficacy and relative safety (“relative” because there is never 100.000…% certainty). By contrast, vaccines against HPV (human papillomavirus) have accumulated quite a substantial record of serious adverse events: the National Vaccine Injury Compensation Program of the Department of Health and Human Services had by 2013 awarded about $6 million to 49 victims in claims against HPV vaccines, with barely half of 200 claims adjudicated at that time; by May 2019, 130 of 480 claims against HPV vaccines had been compensated. Here the benefit-to-risk ratio is not known to be favorable, because it cannot yet be known whether the vaccines actually prevent cervical or other cancers; it is known only that they act against viruses sometimes associated with cancer but never yet proven actually to cause it.

It is dangerous and without reasonable basis for ideological anti-vaxxers to raise alarm over all vaccinations because of instances like the HPV vaccines. But the conspiratorial and ideological anti-vaxxers are lent unwarranted public credibility and plausibility because officialdom refuses to admit the harm done by, for example, the HPV vaccines, while emphasizing the desirability of maintaining herd immunity against, say, measles, as though the same logic and practical experience applied to all vaccines including new, recently-devised ones. “Since they are lying to us about HPV vaccines, why should we trust them about measles vaccine?”
=============================================================

Dr. Christian Fiala, MD, adds:
You may add the experience that vaccines have been withdrawn because it became obvious that they were mainly dangerous and had little if any benefit, like the swine-flu vaccine. Furthermore, it became known in this case that most of the recommendations came from people paid by the industry, including WHO ‘experts’. This example is proof of the fact that pharmaceutical companies do in some cases exert a strong influence on bodies which are supposed to be neutral. Just like the Cochrane scandal.
The fact that these negative examples are totally left out by the vaccine lobby seriously harms their credibility.

Posted in conflicts of interest, consensus, media flaws, medical practices, peer review, prescription drugs, unwarranted dogmatism in science

Modern Psychiatric Diagnosis is Bullshit

Posted by Henry Bauer on 2019/07/09

I use the term “bullshit”, of course, as the appropriate description of “assertions made without regard to whether or not they have any truth value”, following the analysis of professor of philosophy Harry Frankfurt in his book On Bullshit (Princeton University Press, 2005).

Those who commit bullshit orally or in writing do, of course, often imagine that they are asserting something that is true, but they are merely parroting popular shibboleths, “what everyone knows”, without having taken any time to examine the evidence for themselves (see Climate change is responsible for everything, as everyone knows (but what everyone knows is usually wrong)).

Extraordinary as it may seem, the professional reference work on psychiatric diagnosis, the Diagnostic and Statistical Manual of Mental Disorders, published by the American Psychiatric Association and (since 2013) in its 5th edition (DSM-5), gives every appearance of having been put together without any careful attention to evidence, or for that matter to whether it makes any sense.

A couple of years ago, I pointed to the nonsense incorporated in DSM-5 about ADHD — Attention-Deficit/Hyperactivity Disorder (The banality of evil — Psychiatry and ADHD).

Now, the peer-reviewed professional journal Psychiatry Research has published a detailed analysis revealing that the diagnostic categories in DSM-5 make no sense in theory or in practice: (Allsopp et al., Heterogeneity in psychiatric diagnostic classification, Psychiatry Research 279 (2019) 15–22; https://doi.org/10.1016/j.psychres.2019.07.005).

It should suffice to offer two quotes:

“[I]n the majority of diagnoses in both DSM-IV-TR and DSM-5 (64% and 58.3% respectively), two people could receive the same diagnosis without sharing any common symptoms.”

“[T]here are 270 million combinations of symptoms that would meet the criteria for both PTSD and major depressive disorder, and when five other commonly made diagnoses are seen alongside these two, the figure rises to one quintillion symptom combinations — more than the number of stars in the Milky Way.”

QED
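Such astronomical counts arise because DSM criteria typically take the form “at least k of n listed symptoms”, and the counts for separate criteria multiply. A hedged illustration follows; the numbers are made up for the example, not taken from the actual DSM symptom lists:

```python
from math import comb

def ways_to_meet(n_symptoms: int, k_required: int) -> int:
    """Number of distinct symptom subsets that satisfy an
    'at least k of n listed symptoms' diagnostic criterion."""
    return sum(comb(n_symptoms, k)
               for k in range(k_required, n_symptoms + 1))

# Purely illustrative: a single criterion of 'at least 5 of 9
# symptoms' already allows 256 distinct clinical presentations.
print(ways_to_meet(9, 5))  # 256

# Two patients can both satisfy 'at least 5 of 9' while sharing
# no symptoms at all whenever 2 * k_required <= n_symptoms fails
# to hold -- here 5 + 5 > 9, so overlap is forced; with larger
# symptom lists even that guarantee disappears.
```

Multiplying the counts for several such criteria, as when PTSD and major depressive disorder are diagnosed together, is what produces figures in the millions and beyond.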

Of course, the professional literature refrains from exposing its guild’s follies, the nakedness of the unclothed Emperor, to the general public. Hence the article’s title is “Heterogeneity in psychiatric diagnostic classification”, unlikely to catch the eye of the uninitiated, rather than the plain “Modern psychiatric diagnosis is bullshit”; but both say the same thing. As George Bernard Shaw noted a century or so ago, “All professions are conspiracies against the laity”.

Posted in conflicts of interest, consensus, fraud in medicine, medical practices, peer review, science is not truth

 