Skepticism about science and medicine

In search of disinterested science

Archive for the ‘fraud in science’ Category

The Loch Ness “Monster”: Its real and important significance

Posted by Henry Bauer on 2021/01/29

Because of my writings about Nessie, the Loch Ness Monster [1], I am periodically approached by various media. Last year I published [2] the suggestion that the Loch Ness creatures are more plausibly related to sea turtles than to the popular notion of plesiosaurs.

A Scottish journalist came across that article, and for one day something about it and me was featured in every yellow-press newspaper in Britain, and several broadcast media asked for interviews.

The episode reminded me of some of the things that are so wrong with modern mass media.

Their overriding concern is simply to attract an audience. There is no intention of offering that audience any genuinely insightful analysis or context or background information. Media attention span approximates that of Twittering. One television network asked for an instant interview, wanted the best phone-contact number, even offered me compensation — and then never followed up.

I did talk to one Russian and one Spanish station or network, and I tried to point out the real significance of the Loch Ness animals: their existence has been denied by official scientific sources for not much less than a century, demonstrating that official science can be wrong, quite wrong. While that matters little if at all in the case of Loch Ness, I said, it matters greatly when official science is wrong about such matters of public importance as HIV/AIDS or climate change, about which official science does in fact happen to be wrong [3].

So far, however, my bait about those important matters has not been snapped up.

Misunderstandings about science are globally pervasive, above all the failure to realize that it is fallible. The consequent unwarranted acceptance of wrong beliefs about HIV and about carbon dioxide demonstrates the need for some institution independent of official science, independent of existing scientific organizations and institutions, to provide fact-checking of contemporary scientific consensuses: impartial, unbiased, strictly evidence-based assessments of official science. In other words, society sorely needs a Science Court [4].

Misconceptions about science can already be seen as a significant reason for flaws in the announced policies of the new Biden administration, which places high priority on “combating climate change” and on a “moon shot” to cure cancer. No lessons have been learned from the failure of the war on cancer, or from the fact, obvious in great swaths of the geological literature, that carbon dioxide is demonstrably not the prime cause of global warming: there is no correlation between global temperatures and carbon-dioxide levels in the atmosphere [5], neither over the whole life of the Earth nor over the last couple of centuries.

——————————————————

[1]    The Enigma of Loch Ness: Making Sense of a Mystery, University of Illinois Press, 1986/88; Wipf & Stock reprint, 2012
GENUINE FACTS about “NESSIE”, THE LOCH NESS “MONSTER”
[2]    “Loch Ness Monsters as Cryptid (Presently Unknown) Sea Turtles”, Journal of Scientific Exploration, 34 (2020) 93-104
[3]    Dogmatism in Science and Medicine: How Dominant Theories Monopolize Research and Stifle the Search for Truth, McFarland, 2012
The Origin, Persistence and Failings of HIV/AIDS Theory, McFarland, 2007
[4]    Science Is Not What You Think: How It Has Changed, Why We Can’t Trust It, How It Can Be Fixed (McFarland 2017), chapter 12
“The Case for a Science Court”
Science Court: Why and What
[5]    “A politically liberal global-warming skeptic?”
“Climate-change facts: Temperature is not determined by carbon dioxide”


Posted in consensus, fraud in medicine, fraud in science, global warming, media flaws, politics and science, resistance to discovery, science is not truth, science policy, scientific culture, unwarranted dogmatism in science | 17 Comments »

From uncritical about science to skeptical about science: 3

Posted by Henry Bauer on 2021/01/03

From more than ample funding to stifling competition

In the middle 1960s in the United States, I observed more of the consequences of the enormous infusion of federal resources into scientific activity that I had glimpsed as a postdoctoral researcher in the late 1950s.

Moving from Australia, I was appointed Associate Professor at the University of Kentucky in 1966, just when the university was spreading its wings towards gaining recognition for research excellence. I was expected to help that along, given that I already had several dozen publications to my name.

Kentucky was far from alone in its ambition. The flood of federal money designed to stimulate scientific research and the training of future scientists had brought a major transformation in American academe. Four-year Liberal-Arts Colleges made themselves over into Research Universities; Teachers Colleges morphed into universities. In 1944, there had been 107 doctorate-granting institutions in the U.S.; then 142 by 1950-54, 208 by 1960-64, 307 by 1970-74 [1]. In chemistry, there had been 98 doctoral programs in 1955; by 1967 there were 165, and 192 by 1979 [2].

A presumably unforeseen consequence of pushing production of would-be researchers and wannabe research universities was that by the 1970s, demand for grant funds was exceeding the supply. At Kentucky, about half of the Chemistry Department’s proposals to the National Science Foundation (NSF) had been funded in the mid-to-late 1960s; but by 1978, our success rate had fallen to only 1 grant for every 10 applications. That sort of decline has continued into the 21st century: at the National Institutes of Health (NIH), the main source of funds for biological and medical research, the national success rate for grant applicants fell from 31% in 1997 to 20% by 2014 [3], and in 2011 the average age at which an individual first obtained an NIH grant as Principal Investigator was 42 [4].

By the 1970s, there were more PhD mathematicians and physicists graduating than there were academic research jobs available. Some pundits speculated that the 2008 economic crash owed quite a lot to ingenious stock-trading software programs and bad-mortgage-bundling-and-valuing “securities” designed by PhD mathematicians and physicists who were working on Wall Street because they could not find positions in academe.

When I prepared my first research grant application to the National Science Foundation in 1966, the newly appointed Director of the University’s Research Division rewrote my budget without consulting me, making it twice as long in duration and four times as costly in total. When the grant was refused and I asked the NSF manager why, he pointed out that it had requested about twice as much as their usual grants. Faced with this news, our Research Director expressed surprise, claiming that one of the purposes of these federal funds was to support universities in general.

Federal grants for scientific research brought with them many perks.

My particular specialty enabled me to observe how grants for analytical chemistry made it possible to enjoy summer-time fishing and scuba diving in the Caribbean, as a necessary part of research that involved analyzing sea-water.

Some groups of grant-getters would meet before or after professional meetings at desirable locations for fun and games. In those socially boisterous 1960s-70s, traveling at will on funds from research grants made it easy, for example, to sample the topless bars in San Francisco and perhaps a performance of the norm-breaking, counter-cultural musical Hair on the way to the highly regarded Gordon Research Conferences in Santa Barbara. And why not? What could be wrong with using small amounts of our grant funds for personal recreation, just as people in business or industry might use their travel expenses?

Such a point of view was certainly not hindered by the fact that grants for scientific research routinely brought, for academics, an additional 25-33% of personal salary. Almost all academics are routinely paid on a so-called “9-month basis”, with no teaching or other responsibilities during the three months of summer. Since scientific research would be carried on year-round, including during the summer, it seemed quite appropriate that researchers would receive a salary during that time as part of their research grants.

That practice no doubt had an undesirable side-effect, arousing or enhancing jealousy among non-science academics, and perhaps increasing the determination, among social scientists in particular, to be treated like the physicists and chemists and biologists: after all, psychologists and sociologists are scientists too, are they not? Lobbying eventually — in 1957, seven years after NSF had been established — led to the Social Science Research Program at NSF, for support of anthropology, economics, sociology, and history and philosophy of science.

Federal grants for scientific research brought with them many benefits for institutions as well as for the researchers: universities siphoned off from grants the so-called “indirect costs”, the self-justifying, much-preferred term for “overhead”. Increased scientific research placed greater obligations on the university’s libraries and physical facilities and administrative tasks, so it seemed quite proper to add to the costs of actual research, and the researcher’s summer salary, and the wages and tuition fees of graduate students and postdoctoral researchers, a certain percentage that the University Administration could use to defray those added burdens. That certain percentage can be as high as 50%, or even more in the case of private, non-state-funded, universities [5].
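
To make that arithmetic concrete, here is a minimal sketch; the dollar figures and the 50% rate are hypothetical, chosen only to illustrate how an institutional “indirect cost” rate inflates a grant budget:

direct_costs = 300_000     # research expenses, summer salary, student support (hypothetical)
overhead_rate = 0.50       # institutional "indirect cost" rate; can exceed 50% at private universities

overhead = direct_costs * overhead_rate    # 150,000 siphoned off by the university
total_award = direct_costs + overhead      # 450,000 requested from the funding agency
print(overhead, total_award)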

The more the money flowed, the more necessary it became for researchers to obtain grant funds. Costs increased all the time. Scientific journals had traditionally been published by scientific societies, underwritten by membership fees and edited by society members, often without remuneration. As printing and postage costs increased, journals began to levy so-called “page charges” that soon increased to many tens of dollars per published page, particularly as an increasing number of scientific periodicals were taken over or newly founded by commercial publishers, who naturally paid professional staff including editors. Page charges were of course legitimate charges on grant funds. Academics without access to grant funds could still be published in society journals, but their second-class status was displayed for all to see as their publications carried the header or footer, “Publication costs borne by the Society”.

Increasing competition, with ever higher stakes, naturally encouraged corner-cutting, unscrupulous behavior, even outright cheating and faking. At least in hindsight it is clear enough that scientists and universities had been corrupted by money — willingly, greedily; but Science itself seemed not visibly affected, could still be trusted. Dishonest behavior began to be noticeably troubling only by the 1980s.

The 1960s were still pleasantly high-flying years for scientific researchers. Things went well for me personally, and at the tail end of those great years I even collared my best grant yet, a five-year (1969-74) million-dollar project for fundamental work relevant to fuel cells, whose promise was something of a fad at the time.

But in the early 1970s, the American economy turned down. The job market for PhD scientists collapsed. Our graduate program in chemistry could not attract enough students, and, as already mentioned, we were not doing well with grant funds from NSF.

That is when my recreational interest in the Loch Ness Monster began to pay off, in entirely unforeseeable ways: leading to new insights into science and how it was changing; as well as bringing a career change.

———————————————————————————————

[1]    A Century of Doctorates: Data Analyses of Growth and Change, National Academy of Sciences, 1978
[2]    Henry H. Bauer, Fatal Attractions: The Troubles with Science,
Paraview Press, 2001, p. 166
[3]    NIH Data Book: Research Grants, 15 June 2015
[4]    W. A. Shaffer “Age Distribution – AAMC Medical School Faculty and NIH R01 Principal Investigators” (2012), cited in Michael Levitt & Jonathan M. Levitt, “Future of fundamental discovery in US biomedical research”,
Proceedings of the National Academy of Sciences, USA, 114 (#25, 2017): 6498-6503
[5]    Jocelyn Kaiser, “NIH plan to reduce overhead payments draws fire” (the base rate for NIH grants averages about 52%)

Posted in conflicts of interest, fraud in science, funding research, scientific culture, scientists are human | Leave a Comment »

From uncritical about science to skeptical about science

Posted by Henry Bauer on 2020/12/31

Science has been so successful at unlocking Nature’s secrets, especially since about the 16th century, that by the early decades of the 20th century, science had become almost universally accepted as the trustworthy touchstone of knowledge about and insight into the material world. In many ways and in many places, science has superseded religion as the ultimate source of truth.
Yet in the 21st century, an increasing number and variety of voices are proclaiming that science is not — or no longer — to be trusted.
Such disillusion is far from unanimous, but I certainly share it [1], as do many others [2, 3], including such well-placed insiders as editors of scientific periodicals.
How drastically different 21st-century science is from the earlier modern science that won such status and prestige seems to me quite obvious; yet the popular view seems oblivious to this difference. Official statements from scientific authorities and institutions are still largely accepted automatically, unquestioningly, by the mass media and, crucially, by policy-makers and governments, including international collaborations.
Could my opinion be erroneous about a decline in the trustworthiness of science?
If not, why is it that what seems so obvious to me has not been noticed, has been overlooked by the overwhelming majority of practicing researchers, by pundits and by scholars of scientific activity and by science writers and journalists?

That conundrum had me retracing the evolution of my views about science, from my early infatuation with it to my current disillusionment.
Almost immediately I realized that I had happened to be in some of the right places at some of the right times [4] with some of the right curiosity to be forced to notice the changes taking place; changes that came piecemeal over the course of decades.
That slow progression also helped me to modify my beliefs bit by bit. After all, beliefs are not easily changed. From trusting science to doubting science is quite a jump; for that to occur quickly would be like suddenly acquiring a religious belief, Saul struck on the road to Damascus, or perhaps the opposite, losing a faith like the individuals who escape from cults, say Scientology — it happens quite rarely.
So it is natural but worth noting that my views changed slowly just as the circumstances of research were also changing, not all at once but gradually.
Of course I didn’t recognize at the time the cumulating significance of what I was noticing. That comes more easily in hindsight. Certainly I could not have begun to suspect that a book borrowed for light recreational reading would lead a couple of decades later to major changes of professional career.

Beginnings: Science, chemistry, unquestioning trust in science

I had become enraptured by science, and more specifically by chemistry, through an enthusiastic teacher at my high school in Sydney, Australia, in the late 1940s. My ambition was to become a chemist, researching and teaching, and I could imagine nothing more interesting or socially useful.
Being uncritically admiring of science came naturally to my cohort of would-be or potential scientists. It was soon after the end of the Second World War, and the awesome manner in which the war ended, with the revelation of atomic bombs, had put it beyond any reasonable doubt that science really understands the inner workings of Nature. I had seen the newspaper headlines, “Atom bomb used over Japan”, as I was on a street-car going home from high school, and I remember thinking, arrogantly, “Gullible journalism, swallowing propaganda; there’s no such thing as an atomic bomb”.

Learning that there was indeed such a thing, and how it had been made possible, made science seem yet more wonderful.

The successful ending of that war was also of considerable and quite personal significance for me. By ending it, “science” had brought a feeling of security and relief after years of high personal anxiety, even fear. When I was a 7-year-old school-boy, my family had escaped from Austria, in the nick of time, just before the war started; and then in Australia we had experienced the considerable fear of a pending Japanese invasion, a fear made very real by periodic news of Japanese atrocities in China, for instance civilians being buried alive, as illustrated in photographs.
Trusting science was not only the Zeitgeist of that time and place, it was personally welcome, emotionally appealing.

The way sciences were taught only confirmed that science could be safely equated with truth. For that matter, all subjects were taught quite dogmatically. We just did not question what our teachers said; time and place, again. In elementary school we had sat with arms folded behind our backs until the teacher entered, when we stood up in silent respect. Transgressions of any sort were rewarded by a stroke of a cane on an outstretched hand.
(Fifty years later, in another country if not another world, a university student in one of my classes complained about getting a “B” and not an “A”.)

I think chemistry also conduces to trusting that science gets it right. Many experiments are easy to do, making it seem obvious that what we’ve learned is absolutely true.
After much rote learning of properties of elements and compounds, the Periodic Table came as a wonderful revelation: never would I have to do all that memorizing again, since everything could be predicted just from that Table.
Laboratory exercises, in high school and later at university, worked just as expected; failures came only from not being adept or careful enough. The textbooks were right.

Almost nothing at school or university, in graduate as well as undergraduate years, aroused any concerns that science might not get things right. A year of undergraduate research and half-a-dozen years in graduate study brought no reason to doubt that science could learn Nature’s truths. Individuals could make mistakes, of course; I was taken aback when a standard reference resource, Chemical Abstracts, sent me erroneously to an article about NaI instead of NOI — human error, obviously, in transcribing spoken words.

Of course there was still much to learn, but no reason to question that science could eventually come to really understand all the workings of the material world.

Honesty in doing science was taken for granted. We heard the horror story of someone who had cheated in some way; his science studies were immediately terminated and he had to take a job somewhere as a junior administrator. Something I had written was plagiarized — the historical introduction in my PhD thesis — and the miscreant was roundly condemned, even as he claimed a misunderstanding. Individuals could of course go wrong, but that threw no doubt on the trustworthiness of Science itself.

In many ways, scientific research in Australia in the 1940s and 1950s enjoyed conditions not so different from those of the founding centuries of modern science, when the sole driving aim was to learn how the world works. In the universities, scientific research was very much part of the training of graduate students for properly doing good science. The modest resources needed were provided by the University. No time and effort had to be spent seeking support from outside sources, no need to locate and kowtow to potential patrons, individuals or managers at foundations or government agencies.
Research of a more applied sort was carried out by the government-funded Council for Scientific and Industrial Research, CSIR (which later became a standard government agency, the Commonwealth Scientific and Industrial Research Organization, CSIRO). There the atmosphere was quite like that in academe: people more or less happily working at a self-chosen vocation. The aims of research were sometimes quite practical, typically how better to exploit Australia’s natural resources: plentiful coal, soft brown as well as hard black; or the wool being produced in abundance by herds of sheep. CSIR also made some significant “pure science” discoveries, for example the importance of nutritional trace elements in agricultural soils [5], and it played an important role in the development of radio astronomy [6].

In retrospect the lack of money-grubbing is quite striking. At least as remarkable, and not unrelated, is that judgments were made qualitatively, not quantitatively. People were judged by the quality, the significance, the importance of what they accomplished, rather than by how much of something they did. We judged our university teachers by their mastery of the subjects they taught and on how they treated us. Faculty appointments and promotions relied on personal recommendations. Successful researchers might often — and naturally — publish more than others, but not necessarily. Numbers of publications were not the most important thing, nor how often one’s publications were cited by others: The Science Citation Index was founded only in 1963, followed by the Social Sciences Citation Index in 1973 and the Arts and Humanities Citation Index a few years later. “Impact factors” of scientific journals had begun to be calculated in the early 1970s.

So in my years of learning chemistry and beginning research, nothing interfered with having an idealistic view of science, implicitly “pure” science, sheer knowledge-seeking. For my cohort of students, it was an attractive, worthy vocation. The most desired prospect was to be able to work at a university or a research institute. If one was less fortunate, it might be necessary to take a job in industry, which in those years was little developed in Australia, involving the manufacture of such uncomplicated or unsophisticated products as paint, the processing of sugar cane, or the technicalities associated with brewing beer, making wine, or distilling spirits.

The normal path to an academic career in Australia began with post-doctoral experience in either Britain or the United States. My opportunity came in the USA; there, in the late 1950s, I caught my first glimpses of what science would become, with an influx of funds from government and industry and the associated consequences, then unforeseen if not unforeseeable but at any rate not of any apparent concern.

——————————————-

[1]    Henry H. Bauer, Science Is Not What You Think: How It Has Changed, Why We Can’t Trust It, How It Can Be Fixed, McFarland, 2017
[2]    Critiques of Contemporary Science and Academe
https://mega.nz/file/NfwkSR7S#K7llqDfA9JX_mVEWjPe4W-uMM53aMr2XMhDP6j0B208
[3]    What’s Wrong With Medicine; https://mega.nz/file/gWoCWTgK#1gwxo995AyYAcMTuwpvP40aaB3DuA5cvYjK11k3KKSU
[4]    Insight borrowed from Paula E. Stephan & Sharon G. Levin, Striking the Mother Lode in Science: The Importance of Age, Place, and Time, Oxford University Press, 1992
[5]    Best known is the discovery that cobalt supplements avoided “coast disease”, a wasting condition of sheep; see Gerhard N. Schrauzer, “The discovery of the essential trace elements: An outline of the history of biological trace element research”, chapter 2, pp. 17-31, in Earl Frieden, Biochemistry of the Essential Ultratrace Elements, Plenum Press, 1984; and the obituary, “Hedley Ralph Marston 1900-1965”; https://www.science.org.au/fellowship/fellows/biographical-memoirs/hedley-ralph-marston-1900-1965
[6] Stories of Australian Astronomy: Radio Astronomy; https://stories.scienceinpublic.com.au/astronomy/radio-astronomy/

Posted in conflicts of interest, fraud in science, funding research, scientific culture, scientism | Leave a Comment »

Can science regain credibility?

Posted by Henry Bauer on 2020/12/09

Some of the many critiques of contemporary science and medicine [1] have suggested improvements or reforms: among them, ensuring that empiricism and fact determine theory rather than the other way around [2]; more competent application of statistics; awareness of biases as a way of decreasing their influence [1, 2, 3].

Those suggestions call for individuals in certain groups, as well as those groups and institutions as a whole, to behave differently than they have been behaving: researchers, editors, administrators, patrons; universities, foundations, government agencies, and commercial sponsors of research.

Such calls for change are, however, empty whistling in the wind if not based on an understanding of why those individuals and those groups have been behaving in ways that have caused science as a whole to lose credibility — in the eyes of much of the general public, but not only the general public: a significant minority of accomplished researchers and other informed insiders have concluded that on any number of topics the mainstream “consensus” is flawed or downright wrong, not properly based on the available evidence [4].

It is a commonplace to remark that science displaced religion as the authoritative source of knowledge and understanding, at least in Western civilization during the last few centuries. One might then recall the history of religion in the West, and that corruption of its governing institutions eventually brought rebellion: the Protestant Reformation, the Enlightenment, and the enshrining of science and reason as society’s hegemonic authority; so it might seem natural now to call for a Scientific Reformation to repair the institutions of science that seem to have become corrupted.

The various suggestions for reform have indeed called for change in a number of ways: in how academic institutions evaluate the worth of their researchers; in how journals decide what to publish and what not to publish; in how the provision of research resources is decided; and so forth and so on. But such suggestions fail to get to the heart of the matter. The Protestant Reformation was seeking the repair of a single, centrally governed, institution. Contemporary science, however, comprises a whole collection of institutions and groups that interact with one another in ways that are not governed by any central authority.

The way “science” is talked and written about is highly misleading, since no single word can properly encompass all its facets or aspects. The greatest source of misunderstanding comes about because scientific knowledge and understanding do not generate themselves or speak for themselves; so in common discourse, “science” refers to what is said or written about scientific knowledge and theories by people — who are, like all human beings, unavoidably fallible, subject to a variety of innate ambitions and biases as well as external influences; and hindered and restricted by psychological and social factors — psychological factors like confirmation bias, which gets in the way of recognizing errors and gaps, social factors like Groupthink, which pressures individuals not to deviate from the beliefs and actions of any group to which they belong.

So whenever a claim about scientific knowledge or understanding is made, the first reaction should be, “Who says so?”

It seems natural to presume that the researchers most closely related to a given topic would be the most qualified to explain and interpret it to others. But scientists are just as human and fallible as others, so researchers on any given subject are biased towards thinking they understand it properly even though they may be quite wrong about it.

A better reflection of what the facts actually are would be the view that has become more or less generally accepted within the community of specialist researchers, and thereby in the scientific community as a whole; in other words, what research monographs, review articles, and textbooks say — the “consensus”. Crucially, however, as already noted, any contemporary consensus may be wrong, in small ways or large or even entirely.

Almost invariably there are differences of opinion within the specialist and general scientific communities, particularly but not only about relatively new or recent studies. Unanimity is likely only over quite simple matters where the facts are entirely straightforward and readily confirmed; but such simple and obvious cases are rare indeed. Instead of unanimity, the history of science is a narrative of perpetual disagreements as well as (mostly but not always) their eventual resolution.

On any given issue, the consensus is not usually unanimous as to “what science says”. There are usually some contrarians, some mavericks among the experts and specialist researchers, some unorthodox views. Quite often, it turns out eventually that the consensus was flawed or even entirely wrong, and what earlier were minority views then become the majority consensus [5, 6].

That perfectly normal lack of unanimity, the common presence of dissenters from a “consensus” view, is very rarely noted in the popular media and remains hidden from the conventional wisdom of society as a whole — most unfortunately and dangerously, because it is hidden also from the general run of politicians and policymakers. As a result, laws on all sorts of issues, and many officially approved practices in medicine, may come to be based on a mistaken scientific consensus; or, as President Eisenhower put it [7], public policies might become captive to a scientific-technological elite, those who constitute and uphold the majority consensus.

The unequivocal lesson that modern societies have yet to learn is that any contemporary majority scientific consensus may be misleading. Only once that lesson has been learned will it then be noted that there exists no established safeguard to prevent public policies and actions being based on erroneous opinions. There exists no overarching Science Authority to whom dissenting experts could appeal in order to have the majority consensus subjected to reconsideration in light of evidence offered by the contrarian experts; no overarching Science Authority, and no independent, impartial, unbiased, adjudicators or mediators or interpreters to guide policymakers in what the actual science might indicate as the best direction.

That’s why the time is ripe to consider establishing a Science Court [8].

——————————————–

[1]    Critiques of Contemporary Science and Academe
What’s Wrong with Present-Day Medicine

[2]    See especially, about theoretical physics, Sabine Hossenfelder, Lost in Math: How Beauty Leads Physics Astray, Basic Books, 2018

[3]    Stuart Ritchie, Science Fictions: How Fraud, Bias, Negligence, and Hype Undermine the Search for Truth, Metropolitan Books (Henry Holt & Company), 2020

[4]    A number of examples are discussed in Henry H. Bauer, Dogmatism in Science and Medicine: How Dominant Theories Monopolize Research and Stifle the Search for Truth, McFarland, 2012

[5]    Bernard Barber, “Resistance by scientists to scientific discovery”, Science, 134 (1961) 596-602

[6]    Thomas S. Kuhn, The Structure of Scientific Revolutions, University of Chicago Press, 1970, 2nd (enlarged) ed. [1st ed. was 1962]

[7]    Dwight D. Eisenhower, Farewell speech, 17 January 1961; transcript at http://avalon.law.yale.edu/20th_century/eisenhower001.asp

[8]    Chapter 12 in Henry H. Bauer, Science Is Not What You Think: How It Has Changed, Why We Can’t Trust It, How It Can Be Fixed, McFarland, 2017

Posted in conflicts of interest, consensus, fraud in science, media flaws, medical practices, peer review, politics and science, resistance to discovery, science is not truth, science policy, scientific culture, scientists are human, unwarranted dogmatism in science | 3 Comments »

Why skepticism about science and medicine?

Posted by Henry Bauer on 2020/09/06

My skepticism is not about science and medicine as sources or repositories of objective knowledge and understanding. Skepticism is demanded by the fact that what society learns about science and medicine is mediated by human beings. That brings in a host of reasons for skepticism: human fallibility, individual and institutional self-interest, conflicts of interest, sources of bias and prejudice.

I have never come across a better discussion of the realities about science and its role in society than Richard Lewontin’s words in his book, Biology as Ideology (Anansi Press 1991, HarperPerennial 1992; based on 1990 Massey Lectures, Canadian Broadcasting Corporation):

“Science is a social institution about which there is a great deal of misunderstanding, even among those who are part of it. . . [It is] completely integrated into and influenced by the structure of all our other social institutions. The problems that science deals with, the ideas that it uses in investigating those problems, even the so-called scientific results that come out of scientific investigation, are all deeply influenced by predispositions that derive from the society in which we live. Scientists do not begin life as scientists, after all, but as social beings immersed in a family, a state, a productive structure, and they view nature through a lens that has been molded by their social experience.
. . . science is molded by society because it is a human productive activity that takes time and money, and so is guided by and directed by those forces in the world that have control over money and time. Science uses commodities and is part of the process of commodity production. Science uses money. People earn their living by science, and as a consequence the dominant social and economic forces in society determine to a large extent what science does and how it does it. More than that, those forces have the power to appropriate from science ideas that are particularly suited to the maintenance and continued prosperity of the social structures of which they are a part. So other social institutions have an input into science both in what is done and how it is thought about, and they take from science concepts and ideas that then support their institutions and make them seem legitimate and natural. . . .
Science serves two functions. First, it provides us with new ways of manipulating the material world . . . . [Second] is the function of explanation” (pp. 3-4). And (p. 5) explaining how the world works also serves as legitimation.

Needed skepticism takes into account that every statement disseminated about science or medicine serves in some way the purpose(s), the agenda(s), of the source or sources of that statement.

So the first thing to ask about any assertion about science or medicine is, why is this statement being made by this particular source?

Statements by pharmaceutical companies, most particularly their advertisements, should never be believed because, as innumerable observers and investigators have documented, the profit motive has outweighed any concern for the harm that unsafe medications cause, even when there is no evidence of definite potential benefit. The best way to decide whether to prescribe or use a drug is by comparing NNT and NNH, the number of patients who must be treated for one to benefit against the number treated for one to be harmed; but NNT and NNH are never reported by drug companies. For example, there is no evidence whatsoever that HPV vaccination decreases the risk of any cancer; all that has been observed is that the vaccines may decrease genital warts. On the other hand, many individuals have suffered grievous harm from “side” effects of these vaccines (see Holland 2018 in the bibliography cited just below, and the documentary Sacrificial Virgins). TV ads by Merck, for example in August 2020 on MSNBC, cite the Centers for Disease Control & Prevention as recommending the vaccine not only for girls but also for boys.
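
As a minimal sketch of that NNT/NNH comparison: NNT is the reciprocal of the absolute risk reduction, NNH the reciprocal of the absolute risk increase, and when NNH is smaller than NNT, harm is more likely than benefit. The event rates below are purely hypothetical, chosen only to illustrate the arithmetic, not taken from any actual trial.

def nnt(control_event_rate, treated_event_rate):
    # Number needed to treat = 1 / absolute risk reduction
    return 1.0 / (control_event_rate - treated_event_rate)

def nnh(treated_harm_rate, control_harm_rate):
    # Number needed to harm = 1 / absolute risk increase
    return 1.0 / (treated_harm_rate - control_harm_rate)

# Hypothetical figures: a drug cuts the bad-outcome rate from 4% to 3%
# but raises the serious-adverse-event rate from 1% to 3%.
print(round(nnt(0.04, 0.03)))   # 100 people treated for one to benefit
print(round(nnh(0.03, 0.01)))   # 50 people treated for one to be harmed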

For fully documented discussions of the pervasive misdeeds of drug companies, consult the books listed in my periodically updated bibliography, What’s Wrong with Present-Day Medicine.
I recommend particularly Angell 2004, Goldacre 2013, Gøtzsche 2013, Healy 2012, and Moynihan & Cassels 2005. Greene 2007 is a very important but little-cited book describing how numbers and surrogate markers have come to dominate medical practice, to the great harm of patients.

Official reports may be less obviously deceitful than drug company advertisements, but they are no more trustworthy, as argued in detail and with examples in “Official reports are not scientific publications”, chapter 3 in my Dogmatism in Science and Medicine: How Dominant Theories Monopolize Research and Stifle the Search for Truth (McFarland 2012):
“reports from official institutions and organizations . . . are productions by bureaucracies . . . . The actual authors of these reports are technical writers whose duties are just like those of press secretaries, advertising writers, and other public-relations personnel: to put on the actual evidence and conclusions the best possible spin to reinforce the bureaucracy’s viewpoint and emphasize the importance of the bureaucracy’s activities.
Most important: The Executive Summaries, Forewords, Prefaces, and the like may tell a very different story than does the actual evidence in the bulk of the reports. It seems that few if any pundits actually read the whole of such documents. The long public record offers sad evidence that most journalists certainly do not look beyond these summaries into the meat of the reports, given that the media disseminate uncritically so many of the self-serving alarums in those Executive Summaries” (p. 213).

So too with press releases from academic institutions.

As for statements direct from academic and professional experts, recall that, as Lewontin pointed out, “people earn their living by science”. Whenever someone regarded as an expert or authority makes public statements, an important purpose is to enhance the status, prestige, career, and profitability of whoever is making the statement. This is not to suggest that such statements are made with deliberate dishonesty; but the need to preserve status, as well as the usual illusion that what one believes is actually true, ensures that such statements will be dogmatically one-sided assertions, not judicious assessments of the objective state of knowledge.

Retired academic experts like myself no longer suffer conflicts of interest at a personal or institutional-loyalty level. When we venture critiques of drug companies, official institutions, colleges and universities, and even individual “experts” or former colleagues, we will usually be saying what we genuinely believe to be unvarnished truth. Nevertheless, despite the lack of major obvious conflicts of interest, one should have more grounds than that for believing what we have to say. We may still have an unacknowledged agenda, for instance a desire still to do something useful even as our careers are formally over. Beyond that, of course, like any other human beings, we may simply be wrong, no matter that we ourselves are quite sure that we are right. Freedom from frank, obvious conflicts of interest does not bring with it some superhuman capacity for objectivity, let alone omniscience.

In short:
Believe any assertion about science or medicine, from any source, at your peril.
If the matter is of any importance to you, you had best do some investigating of evidence and facts, and comparison of diverse interpretations.

Posted in conflicts of interest, consensus, fraud in medicine, fraud in science, medical practices, peer review, politics and science, science is not truth, scientific literacy, scientism, scientists are human, unwarranted dogmatism in science | Leave a Comment »

Aluminum adjuvants, autoimmune diseases, and attempted suppression of the truth

Posted by Henry Bauer on 2019/03/24

An earlier post (Adjuvants — the poisons hidden in some vaccines) described the danger that aluminum adjuvants in vaccines pose, including that they may indeed be associated with a risk of inducing autism. A recent book, How to End the Autism Epidemic, underscores that risk and exposes what should be the crippling, disqualifying conflicts of interest of one of the most prominent accepted experts on vaccinations. I had learned about this from a splendidly informative article by Celeste McGovern at Ghost Ship Media (Prescription to end the autism epidemic, 17 September 2018).

It turns out that animals as well as human beings have experienced tangible harm from vaccines containing aluminum adjuvants: in particular, sheep. Celeste McGovern has reported about that in other recent posts:
Spanish sheep study finds vaccine aluminum in lymph nodes more than a year after injection, behavioural changes, 3 November 2018; Vaccines induce bizarre anti-social behaviour in sheep, 6 November 2018; Anatomy of a science study censorship, 20 March 2019.

This last piece describes the attempt to prevent the truth about aluminum adjuvants from becoming public knowledge, by pressuring the publisher, Elsevier, to withdraw an already accepted, peer-reviewed article in one of its journals: “Cognition and behavior in sheep repetitively inoculated with aluminum adjuvant-containing vaccines or aluminum adjuvant only”, by Javier Asína et al., published online in Pharmacological Research before being withdrawn. Fortunately there are nowadays resources on the Internet that make it more difficult for the censors to do their dirty work. One invaluable resource is the Wayback Machine, which too few people seem to know about. In the present case, a PDF of the Asína et al. article, as accepted and published online as “In Press” in Pharmacological Research, is available at ResearchGate.

Elsevier publishes thousands of scientific and medical journals, including in the past some that were actually advertisements written by and paid for by pharmaceutical companies, presented dishonestly and misleadingly as genuine scientific periodicals (Elsevier published 6 fake journals; Elsevier had a whole division publishing fake medical journals).

Elsevier had also engaged in censorship on earlier occasions, in one case to the extent of emasculating a well respected, independent publication, Medical Hypotheses (see Chapter 3, “A Public Act of Censorship: Elsevier and Medical Hypotheses”, in Dogmatism in Science and Medicine: How Dominant Theories Monopolize Research and Stifle the Search for Truth).

If the shenanigans and cover-ups about aluminum adjuvants do not make a sufficiently alarming horror story, please look at yet another article by Celeste McGovern: Poisoned in Slow Motion, 1 October 2018:

“Immune-system disease is sweeping the globe. . . . Autoimmune/inflammatory syndrome induced by adjuvants, or ASIA — a wildly unpredictable inflammatory response to foreign substances injected or inserted into the human body . . . . The medical literature contains hundreds of such cases. . . . [with] vague and sundry symptoms — chronic fatigue, muscle and joint pain, sleep disturbances, cognitive impairment, skin rashes and more . . . that . . . share the common underlying trigger of certain immune signaling pathways. Sometimes this low-grade inflammation can smolder for years only to suddenly incite an overt autoimmune disease. . . . Chronic fatigue syndrome (also known as myalgic encephalitis), once a rare “hypochondriac” disorder, now affects millions of people globally and has been strongly associated with markers of immune system dysfunction. . . . One in thirteen American children has a hyperactive immune system resulting in food allergy,4 and asthma, another chronic inflammatory disease of the immune system, affects 300 million people across the globe.5 Severe neurological disorders like autism (which now affects one in 22 boys in some US states) have soared from virtual nonexistence and are also linked to a damaged immune system.”

[4. Pediatrics, 2011; 128: e9-17
5. Global Initiative for Asthma. Global Strategy for Asthma Management and Prevention. 2008.
6. Eur J Pediatr, 2014; 173: 33-43]

******************************************************************

These particulars offer further illustrations of the general points that I have been making for some time:

Ø     Science and medicine have become dogmatic wielders of authority through being co-opted and in effect bought out by commercial interests. Pharmaceutical companies are perhaps in the forefront of this takeover, but the influence of other industries should not be forgotten, for instance that of Monsanto with its interest in Genetically Modified products; see Dogmatism in Science and Medicine: How Dominant Theories Monopolize Research and Stifle the Search for Truth, Jefferson (NC): McFarland 2012

Ø     Science, research, and medicine are very different things nowadays than they were up to about the middle of the 20th century, and very different from the conventional wisdom about them. Media, policy makers, and the public need an independent, impartial assessment of what science and medicine are said to have established; needed is a Science Court; see Science Is Not What You Think: How It Has Changed, Why We Can’t Trust It, How It Can Be Fixed, McFarland, 2017

Posted in conflicts of interest, fraud in medicine, fraud in science, legal considerations, media flaws, medical practices, peer review, prescription drugs, science is not truth, scientific culture, scientific literacy, scientism, scientists are human, unwarranted dogmatism in science | Leave a Comment »

HPV does not cause cervical cancer; HPV vaccination can be deadly

Posted by Henry Bauer on 2018/09/16

Evidence continues to mount that the presumed connection between HPV and cervical cancer is no more than a statistical association, not a causative relationship:

The Gardasil controversy: as reports of adverse effects increase, cervical cancer rates rise in HPV-vaccinated age groups 

Annette Gartland

“The Gardasil vaccines continue to be vaunted as life-saving, but there is no evidence that HPV vaccination is reducing the incidence of cervical cancer, and reports of adverse effects now total more than 85,000 worldwide. Nearly 500 deaths are suspected of being linked to quadrivalent Gardasil or Gardasil 9.
As Merck’s latest human papillomavirus (HPV) vaccine, Gardasil 9, continues to be fast tracked around the world, the incidence of invasive cervical cancer is increasing in many of the countries in which HPV vaccination is being carried out.”

Once again, independent scientists without conflicts of interest are maltreated by bureaucratic organizations with conflicts of interest tying them to commercial interests, drug companies in particular:

“This article was updated with information from the AHVID on 14/09/2018.
Update 15/9/2018:
Peter Gøtzsche has been expelled from the Cochrane Collaboration. Six of the 13 members of the collaboration’s governing board voted for his expulsion.
. . . . .
‘This is the first time in 25 years that a member has been excluded from membership of Cochrane. This unprecedented action taken by a minority of the governing board . . . . ‘
In just 24 hours, Gøtzsche said, the Cochrane governing board had lost five of its members, four of whom were centre directors and key members of the organisation in different countries.
Gøtzsche says that, in recent years, Cochrane has significantly shifted more to a profit-driven approach.
‘Even though it is a not-for-profit charity, our ‘brand’ and ‘product’ strategies are taking priority over getting out independent, ethical and socially responsible scientific results,’ he said.”

Posted in conflicts of interest, fraud in medicine, fraud in science, medical practices, prescription drugs, unwarranted dogmatism in science | Leave a Comment »

21st century science: Group-Thinking Elites and Fanatical Groupies

Posted by Henry Bauer on 2018/08/11

Science has been a reliable resource for official policies and actions for much of the era of modern science, which is usually regarded as having begun around the 17th century.

It is almost without precedent that a mistaken scientific consensus should lead to undesirable and damaging public actions, yet that is now the case in two instances: the belief that carbon dioxide generated by the burning of fossil fuels is primarily responsible for global warming and climate change; and the belief that HIV is the cause of AIDS.

Both those beliefs gained hegemony during the last two or three decades. That these beliefs are mistaken seems incredible to most people, in part because of the lack of any well known precedent and in part because the nature of science is widely misunderstood; in particular it is not yet widely recognized how much science has changed since the middle of the 20th century.

The circumstances of modern science that conspire to make it possible for mistaken theories to bring misguided public policies have been described in my recent book, Science Is Not What You Think [1]. The salient points are these:

Ø     Science has become dysfunctionally large

Ø     It is hyper-competitive

Ø     It is not effectively self-correcting

Ø     It is at the mercy of multiple external interests and influences.

A similar analysis was offered by Judson [2]. His title, The Great Betrayal, reflects the book’s opening theme of the prevalence of fraud in modern science (as well as in contemporary culture). It assigns blame to the huge expansion in the number of scientists and the crisis that the world of science faces as it finds itself in something of a steady state so far as resources are concerned, after a period of some three centuries of largely unfettered expansion: about 80% of all the scientists who have ever lived are extant today; US federal expenditure on R&D increased 4-fold (inflation-adjusted!) from 1953 to 2002, and US industry increased its R&D spending by a factor of 26 over that period! Judson also notes the quintessential work of John Ziman explicating the significance of the change from continual expansion to what Ziman called a dynamic steady-state [3].

Remarkably enough, President Eisenhower had foreseen this possibility and warned against it in his farewell address to the nation: “in holding scientific research and discovery in respect, as we should, we must also be alert to the equal and opposite danger that public policy could itself become the captive of a scientific-technological elite”. The proponents of human-caused-climate-change theory and of HIV/AIDS theory are examples of such elites.

A crucial factor is that elites, like all other groups, may be dysfunctionally affected by the phenomenon of Groupthink.

Janis [4] showed in detail several decades ago how that phenomenon of Groupthink had produced disastrously bad policy actions by the United States. The same phenomenon of Groupthink can cause bad things to happen in other social sectors than the government. Recently, Booker [5] has shown how Groupthink has been responsible for creating the worldwide belief, shibboleth, cliché, that humankind’s use of fossil fuels is causing global warming and climate change through the release of carbon dioxide.

Commonly held ideas about science do not envisage the possibility that a scientific consensus could bring misguided policies and actions on a global scale. What most people know — think they know — about science is that its conclusions are based on solid evidence, that the scientific method safeguards against getting things wrong, and that science has been primarily responsible for civilization’s advances over the last few centuries.

Those things that most people know are also largely mistaken [1, 6]. Science is a human activity and is subject to all the frailties and fallibilities of any human activity. The scientific method as popularly described does not accurately portray how science is actually done.

While much of the intellectual progress in understanding how the world works does indeed stand to the credit of science, what remains to be commonly realized is that since about the middle of the 20th century, science has become too big for its own good. The huge expansion of scientific activity since the Second World War has changed science in crucial ways. The number of people engaged in scientific activity has far outstripped the available resources, leading to hyper-competition and associated sloppiness and outright dishonesty. Scientists nowadays are in no way exceptional individuals; people doing scientific work are as common as teachers, doctors, or engineers. It is in this environment that Groupthink has become significantly and damagingly important.

Booker [5] described this in relation to the hysteria over the use of fossil fuels. A comparable situation concerns the belief that HIV is the cause of AIDS [7]. The overall similarities in these two cases are that a quite small number of researchers arrived initially at more or less tentative conclusions; but those conclusions seemed of such great import to society at large that they were immediately seized upon and broadcast by the media as breaking news. Political actors became involved, accepting those conclusions quickly became politically correct, and those who then questioned and now question the conclusions are vigorously opposed, often maligned as unscientific and motivated by non-scientific agendas.

At any rate, contemporary science has become a group activity rather than an activity of independent intellectual entrepreneurs, and it is in this environment that Groupthink affects the elites in any given field — the acknowledged leading researchers whose influence is entrenched by editors and administrators and other bureaucrats inside and outside the scientific community.

A concomitant phenomenon is that of fanatical groupies. Concerning both human-caused climate change and the theory that HIV causes AIDS, there are quite large social groups that have taken up the cause with fanatical vigor and that attack quite unscrupulously anyone who differs from the conventional wisdom. These groupies are chiefly people with little or no scientific background, or whose scientific ambitions are unrequited (which includes students). As with activist groups in general, groupie organizations are often supported by (and indeed often founded by) commercial or political interests. Non-profit organizations which purportedly represent patients and other concerned citizens and which campaign for funds to fight against cancer, multiple sclerosis, etc., are usually funded by Big Pharma, as are HIV/AIDS activist groups.

__________________________________

[1]  Henry H. Bauer, Science Is Not What You Think: How It Has Changed, Why We Can’t Trust It, How It Can Be Fixed, McFarland 2017

[2] Horace Freeland Judson, The Great Betrayal, Harcourt 2004

[3]  John Ziman, Prometheus Bound, Cambridge University Press 1994

[4]  I. L. Janis, Victims of Groupthink, 1972; Groupthink, 1982, Houghton Mifflin.

[5]  Christopher Booker, Global Warming: A Case Study in Groupthink, Global Warming Policy Foundation, Report 28; Human-caused global warming as Groupthink

[6]  Henry H. Bauer, Scientific Literacy and the Myth of the Scientific Method, University of Illinois Press 1992

[7]  Henry H. Bauer, The Origin, Persistence and Failings of HIV/AIDS Theory, McFarland 2007

Posted in conflicts of interest, consensus, fraud in science, funding research, global warming, media flaws, science is not truth, science policy, scientific culture, scientific literacy, scientism, scientists are human, the scientific method, unwarranted dogmatism in science | Leave a Comment »

How science changed — IV. Cutthroat competition and outright fraud

Posted by Henry Bauer on 2018/04/15

The discovery of the structure of DNA was a metaphorical “canary in the coal mine”, warning of the intensely competitive environment that was coming to scientific activity. The episode illustrates in microcosm the seismic shift in the circumstances of scientific activity that started around the middle of the 20th century [1], the replacement of one set of unwritten rules by another set [2].
The structure itself was discovered by Watson and Crick in 1953, but it was only in 1968, with the publication of Watson’s personal recollections in The Double Helix, that attention was focused on how Watson’s approach and behavior marked a break from the traditional unwritten rules of scientific activity.
It took even longer for science writers and journalists to realize just how cutthroat the competition had become in scientific and medical research. Starting around 1980 there appeared a spate of books describing fierce fights for priority on a variety of specific topics:
•    The role of the brain in the release of hormones; Guillemin vs. Schally — Nicholas Wade, The Nobel Duel: Two Scientists’ 21-year Race to Win the World’s Most Coveted Research Prize, Anchor Press/Doubleday, 1981.
•    The nature and significance of a peculiar star-like object — David H. Clark, The Quest for SS433, Viking, 1985.
•    “‘Mentor chains’, characterized by camaraderie and envy, for example in neuroscience and neuropharmacology” — Robert Kanigel, Apprentice to Genius: The Making of a Scientific Dynasty, Macmillan, 1986.
•    High-energy particle physics, atom-smashers — Gary Taubes, Nobel Dreams: Power, Deceit, and the Ultimate Experiment, Random House, 1986.
•    “Soul-searching, petty rivalries, ridiculous mistakes, false results as rivals compete to understand oncogenes” — Natalie Angier, Natural Obsessions: The Search for the Oncogene, Houghton Mifflin, 1987.
•    “The brutal intellectual darwinism that dominates the high-stakes world of molecular genetics research” — Stephen S. Hall, Invisible Frontiers: The Race to Synthesize a Human Gene, Atlantic Monthly Press, 1987.
•    “How the biases and preconceptions of paleoanthropologists shaped their work” — Roger Lewin, Bones of Contention: Controversies in the Search for Human Origins, Simon & Schuster, 1987.
•    “The quirks of . . . brilliant . . . geniuses working at the extremes of thought” — Ed Regis, Who Got Einstein’s Office? Eccentricity and Genius at the Institute for Advanced Study, Addison-Wesley, 1987.
•    High-energy particle physics — Sheldon Glashow with Ben Bova, Interactions: A Journey Through the Mind of a Particle Physicist and the Matter of the World, Warner, 1988.
•    Discovery of endorphins — Jeff Goldberg, Anatomy of a Scientific Discovery, Bantam, 1988.
•    “Intense competition . . . to discover superconductors that work at practical temperatures” — Robert M. Hazen, The Breakthrough: The Race for the Superconductor, Summit, 1988.
•    Science is done by human beings — David L. Hull, Science as a Process, University of Chicago Press, 1988.
•    Competition to get there first — Charles E. Levinthal, Messengers of Paradise: Opiates and the Brain, Anchor/Doubleday, 1988.
•    “Political machinations, grantsmanship, competitiveness” — Solomon H. Snyder, Brainstorming: The Science and Politics of Opiate Research, Harvard University Press, 1989.
•    Commercial ambitions in biotechnology — Robert Teitelman, Gene Dreams: Wall Street, Academia, and the Rise of Biotechnology, Basic Books, 1989.
•    Superconductivity, intense competition — Bruce Schechter, The Path of No Resistance: The Story of the Revolution in Superconductivity, Touchstone (Simon & Schuster), 1990.
•    Sociological drivers behind scientific progress, and a failed hypothesis — David M. Raup, The Nemesis Affair: A Story of the Death of Dinosaurs and the Ways of Science, Norton, 1999.

These titles illustrate that observers were able to find intense competitiveness wherever they looked in science, though mostly in medical or biological science, with physics (including astronomy) the next most frequently mentioned field of research.
Watson’s memoir had not only featured competition most prominently, it had also revealed that older notions of ethical behavior no longer applied: Watson was determined to get access to competitors’ results even if those competitors were not yet anxious to reveal all to him [3]. It was not only competitiveness that increased steadily over the years; so too did the willingness to engage in behavior that not so long before had been regarded as improper.
Amid the spate of books about how competitive research had become, there also appeared Betrayers of the Truth: Fraud and Deceit in the Halls of Science by science journalists William Broad and Nicholas Wade (Simon & Schuster, 1982). This book argued that dishonesty has always been present in science, citing in an appendix 33 “known or suspected” cases of scientific fraud from 1981 back to the 2nd century BC. Those few actual data could not support the book’s sweeping generalizations [4], but Broad and Wade had been very early to draw attention to the fact that dishonesty in science was a significant problem. What they failed to appreciate was the reason: not that there had always been a notable frequency of fraud in science, but that scientific activity was changing in ways that were making it a different kind of thing than it had been during the halcyon few centuries of modern science, from the 17th century to the middle of the 20th.
Research misconduct had featured in Congressional Hearings as early as 1981. Soon the Department of Health and Human Services established an Office of Scientific Integrity, now the Office of Research Integrity. Its mission is to instruct research institutions about preventing fraud and dealing with allegations of it. Scientific periodicals began to ask authors to disclose conflicts of interest, and co-authors to state specifically what portions of the work were their individual responsibility.
Academe has seen a proliferation of Centers for Research and Medical Ethics [5], and there are now periodicals entirely devoted to such matters [6]. Courses in research ethics have become increasingly common; indeed, institutions that receive research funds from federal agencies are required to make such courses available.
In 1989, the Committee on the Conduct of Science of the National Academy of Sciences issued the booklet On Being a Scientist, which describes proper behavior; that booklet’s 3rd edition, subtitled A Guide to Responsible Conduct in Research, makes even clearer that the problem of scientific misconduct is now widely seen as serious.
Another indication that dishonesty has increased is the quite frequent retraction of published research reports: Retraction Watch estimates that 500-600 published articles are retracted annually. John Ioannidis has made a specialty of scrutinizing the published literature for consistency and reliability, and summed up his conclusion in the title of a much-discussed article: “Why most published research findings are false” [7]. Nature has an archive devoted to this phenomenon [8].
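Ioannidis’s argument rests on a simple piece of Bayesian arithmetic: when only a small fraction of the hypotheses a field tests are actually true, and studies are underpowered, most results that clear the usual significance threshold are false positives. The following is a minimal illustrative sketch of that kind of calculation, not taken from the post or from Ioannidis’s paper; the function name and the example numbers are my own assumptions, chosen only to show the shape of the argument.

```python
# Sketch of the positive-predictive-value arithmetic behind the claim that
# most published "positive" findings can be false. Illustrative assumptions:
# the function name and the example numbers below are not from the source.

def positive_predictive_value(prior_odds: float, alpha: float = 0.05,
                              power: float = 0.8) -> float:
    """Probability that a statistically significant finding is actually true.

    prior_odds -- pre-study odds that a tested relationship is real
    alpha      -- type I error rate (significance threshold)
    power      -- probability of detecting a real effect (1 - type II error)
    """
    # Normalize per false hypothesis tested: prior_odds true hypotheses
    # accompany each false one, so the expected counts are:
    true_positives = power * prior_odds   # real effects correctly detected
    false_positives = alpha               # null effects wrongly flagged as significant
    return true_positives / (true_positives + false_positives)


if __name__ == "__main__":
    # Hypothetical exploratory field: 1 true hypothesis for every 19 false ones,
    # studied with chronically underpowered experiments (power = 0.2).
    ppv = positive_predictive_value(prior_odds=1 / 19, alpha=0.05, power=0.2)
    print(f"Chance a 'significant' result is true: {ppv:.0%}")  # roughly 17%
```

On assumptions like these, fewer than one in five “positive” findings is real, which is the gist of Ioannidis’s title; allowing for bias and for multiple teams chasing the same question, as his paper does, makes the picture still worse.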

Researchers half a century ago would have been aghast at all this, disbelieving that science could become so untrustworthy. It has happened because science changed from an amateur avocation to a career that can bring fame and wealth [9]; and scientific activity changed from a cottage industry to a highly bureaucratic corporate industry, with pervasive institutional as well as individual conflicts of interest; and researchers’ demands for support have far exceeded the available supply.

And as science changed, it drew academe along with it. More about that later.

===============================================

[1]    How science changed — III. DNA: disinterest loses, competition wins
[2]    How science has changed — II. Standards of Truth and of Behavior
[3]    The individuals Watson mentioned as getting him access corrected his recollections: they shared with him nothing that was confidential. The significant point remains that Watson had no such scruples.
[4]    See my review, “Betrayers of the truth: a fraudulent and deceitful title from the journalists of science”, 4S Review, 1 (#3, Fall) 17–23.
[5]   There is an Online Ethics Center for Engineering and Science. Physical Centers have been established at: University of California, San Diego (Center for Ethics in Science and Technology); University of Delaware (Center for Science, Ethics and Public Policy); Michigan State University (Center for Ethics and Humanities in the Life Sciences); University of Notre Dame (John J. Reilly Center for Science, Technology, and Values).
[6]    Accountability in Research (founded 1989); Science and Engineering Ethics (1997); Ethics and Information Technology (1999); BMC Medical Ethics (2000); Ethics in Science and Environmental Politics (2001).
[7]    John P. A. Ioannidis, “Why Most Published Research Findings Are False”, PLoS Medicine, 2 (2005): e124. 
[8]    “Challenges in irreproducible research”
[9]    How science has changed: Who are the scientists?

Posted in conflicts of interest, fraud in medicine, fraud in science, funding research, media flaws, science is not truth, scientific culture, scientists are human

How science has changed — II. Standards of Truth and of Behavior

Posted by Henry Bauer on 2018/04/08

The scientific knowledge inherited from ancient Babylon and Greece and from medieval Islam was gained by individuals or by groups isolated from one another in time as well as geography. Perhaps the most consequential feature of the “modern” science that we date from the 17th-century Scientific Revolution is the global interaction of the people who are doing science, and especially the continuity over time of their collective endeavors.
These interactions among scientists began in quite informal and individual ways. An important step was the formation of academies and societies, among which the Royal Society of London is usually acknowledged to be the earliest (founded 1660) that has remained active up to the present time — though it was not the earliest such institution and even the claim of “longest continually active” has been challenged [1].
Even nowadays, the global community of scientists remains in many ways informal despite the host of scientific organizations and institutions, national and international: no formal structure governs how science should be done or how scientists should behave.
However, observation of the actualities of scientific activity indicates that some agreed-on standards had evolved, generally seen within the community of scientists as proper behavior. Around the time of the Second World War, sociologist Robert Merton described those informal standards, and they came to be known as the “Mertonian Norms” of science [2]. They comprise:

•    Communality or communalism (Merton had said “communism”): Science is an activity of the whole scientific community and it is a public good — findings are shared freely and openly.
•    Universalism: Knowledge about the natural world is universally valid and applicable. There are no separations or distinctions by nationality, religion, race, or anything of that sort.
•    Disinterestedness: Science is done for the public good and not for personal benefit; scientists seek to be impartial, objective, unbiased, and not self-serving.
•    Skepticism: Claims and reported findings are subject to critical appraisal and testing throughout the scientific community before they can be accepted as proper scientific knowledge.

Note that honesty is not mentioned; it was simply taken for granted.
These norms clearly make sense for a cottage industry, as ideal behavior for individuals to aim at; but they are not appropriate for a corporate environment, since they cannot guide the behavior of individuals who are part of some hierarchical enterprise.
In the 1990s, John Ziman [3] discussed the change in scientific activity as it morphed from an informal, voluntary collection of individuals seeking to understand how the world works into a highly organized activity with assigned levels of responsibility and authority, in which the sources of research funding have a say in what gets done and often expect something useful, even profitable, in return for their investments.
The early cottage industry of science had been essentially self-supporting. Much could be done without expensive equipment. People studied what was conveniently at hand, so there was little need for funds to support travel. Interested patrons and local benefactors could provide the small resources needed for occasional meetings and the publication of findings.
Up to about the middle of the 20th century, universities were able to provide the funds needed for basic research in chemistry and biology and physics. The first sign that exceptional resources could be needed had come in the early 1930s, when Lawrence constructed the first of his large “atom-smashing machines”; but that and the need for expensive astronomical telescopes remained outliers in the requirements for the support of scientific research overall.
From about the time of the Second World War, however, research going beyond what had already been accomplished began to require ever more expensive and specialized equipment as well as considerable infrastructure: technicians to support the equipment, glass-blowers and secretaries and book-keepers and librarians, and managers of such ancillary staff; so researchers increasingly came to need support beyond that available from individual patrons or universities. Academic research came to rely increasingly on getting grants for specific research projects from public agencies or from wealthy private foundations.
Although those sources of research funds typically claim that they want to support simply “the best science”, their view of what the best science is does not necessarily jibe with the judgments of the individual researchers [4].
At the same time as research in universities was calling on outside sources of funding, an increasing number of industries were setting up their own laboratories for research specifically toward creating and improving their products and services. Such product-specific “R&D” (research and development) sometimes turned up novel basic knowledge, or revealed the need for such fundamentally new understanding. One consequence has been that some really striking scientific advances have come from such famous industrial laboratories as Bell Telephone Laboratories or the Research Laboratory of General Electric. Researchers employed in industry have received a considerable number of Nobel Prizes, often jointly with academics [5].
Under these new circumstances, as Ziman [3] pointed out, the traditional distinction between “applied” research and “pure” or “basic” research lost its meaning.
Ziman rephrased the Mertonian norms as the nice acronym CUDOS, adding the “O” for originality, quite appropriately since within the scientific community credit was and is given for the most innovative, original contributions; CUDOS, or preferably “kudos”, being the Greek term for acclaim of exceptional accomplishment. By contrast, Ziman proposed for the norms that obtain in a corporate scientific enterprise, be it government or private, the acronym PLACE: Researchers nowadays get their rewards not by adhering to the Mertonian norms but by producing Proprietary findings whose significance may be purely Local rather than universal, the subject of research having been chosen under the Authority of an employer or patron and not by the individual researcher, who is Commissioned to do the work as an Expert employee.

Ziman too did not mention honesty; like Merton he simply took it for granted.
Ziman had made an outstanding career in solid-state physics before, in his middle years, he began to publish, starting in 1968 [6], highly insightful works about how science functions, in particular what makes it reliable. In the late 1960s, it had still been reasonable to take honesty in science for granted; but by the time Ziman published Prometheus Bound, honesty in science could no longer be taken for granted, and Ziman had failed to notice some of what was happening in scientific activity. Competition for resources and for career advancement had increased to a quite disturbing extent, presumably the impetus for the increasing frequency with which scientists were found to have cheated in some way. Even published, supposedly peer-reviewed research often failed later attempts at confirmation, and all too often it was revealed as simply false, faked [7].
More about that in a following blog post.

==========================================

[1]    “The Royal Societies [sic] claim to be the oldest is based on the fact that they developed out of a group that started meeting in Gresham College in 1645 but unlike the Leopoldina this group was informal and even ceased to meet for two years between 1658 and 1660” — according to The Renaissance Mathematicus, “It wasn’t the first but…”
[2]    Robert K. Merton, “The normative structure of science” (1942); most readily accessible as pp. 267–78 in The Sociology of Science (ed. N. Storer, University of Chicago Press, 1973), a collection of Merton’s work
[3]    John Ziman, Prometheus Bound: Science in a Dynamic Steady State, Cambridge University Press, 1994
[4]    Richard Muller, awarded a prize by the National Science Foundation, pointed out that truly innovative studies are unlikely to be funded and need to be carried out more or less surreptitiously; and Charles Townes, who developed masers and lasers, testified to his difficulty in getting research support for that ground-breaking work, or even encouragement from some of his distinguished older colleagues —
Richard A. Muller, “Innovation and scientific funding”, Science, 209 (1980) 880–3
Charles Townes, How the Laser Happened: Adventures of a Scientist, Oxford University Press, 1999
[5]    Karina Cummings, “Nobel Science Prizes in industry”;
Nobel Laureates and Research Affiliations
[6]    John Ziman, Public Knowledge (1968); followed by The Force of Knowledge (1976); Reliable Knowledge (1978); An Introduction to Science Studies (1984); Prometheus Bound (1994); Real Science (2000); all published by Cambridge University Press
[7]    John P. A. Ioannidis, “Why most published research findings are false”, PLoS Medicine, 2 (2005): e124;
Daniele Fanelli, “How many scientists fabricate and falsify research? A systematic review and meta-analysis of survey data”, PLoS ONE, 4 (#5, 2009): e5738

Posted in conflicts of interest, fraud in medicine, fraud in science, funding research, peer review, resistance to discovery, science is not truth, scientific culture, scientists are human

 