The cult of science - Politics | PoFo


By Sivad
the norms of science, like those of morality or politics, are ideals rather than realities and there are large discrepancies between official images of science and the realities of life in a community of professional researchers. Scientists are not guileless observers, patiently recording the facts that nature places before them, but crafty cultural operators, manipulating vast technical resources to precipitate artificial new phenomena, and then networking like mad, through the production, distribution and exchange of masses of words, diagrams and statistics. They negotiate, in short, not with the objective world but with each other.

The strong programme or strong sociology is a variety of the sociology of scientific knowledge (SSK) particularly associated with David Bloor,[1] Barry Barnes, Harry Collins, Donald A. MacKenzie,[2] and John Henry. The strong programme's influence on Science and Technology Studies is credited as being unparalleled (Latour 1999). The largely Edinburgh-based school of thought has illustrated how the existence of a scientific community, bound together by allegiance to a shared paradigm, is a prerequisite for normal scientific activity.

The strong programme is a reaction against "weak" sociologies of science, which restricted the application of sociology to "failed" or "false" theories, such as phrenology. Failed theories would be explained by citing the researchers' biases, such as covert political or economic interests. Sociology would be only marginally relevant to successful theories, which succeeded because they had revealed a true fact of nature. The strong programme proposed that both "true" and "false" scientific theories should be treated the same way. Both are caused by social factors or conditions, such as cultural context and self-interest. All human knowledge, as something that exists in human cognition, must contain some social components in its formation process.
Last edited by Sivad on 09 Nov 2018 11:13, edited 1 time in total.
By Sivad
The Problem With the March for Science
Our culture’s understanding of science is very, very broken, and on Saturday, it was impossible to ignore.

even among the sanctimonious elite who want to own science (and pwn anyone who questions it), most people have no idea how science actually works. The scientific method is already under constant attack from within the scientific community itself and is ceaselessly undermined by its so-called supporters, including during marches like those on Saturday. In the long run, such demonstrations will do little to resolve the myriad problems science faces and instead could continue to undermine our efforts to use science accurately and productively.

Let’s start with my contention that most “pro-science” demonstrators have no idea what they were demonstrating about. Being “pro-science” has become a bizarre cultural phenomenon in which liberals (and other members of the cultural elite) engage in public displays of self-reckoned intelligence as a kind of performance art, while demonstrating zero evidence to justify it.

accepting a cringe-worthy hive-mind mentality that celebrates Science as a vague but wonderful entity, what Richard Feynman called “cargo cult science.” There was an uncomfortable dronelike fealty to the concept—an oxymoronic faith that information presented and packaged to us as Science need not be further scrutinized before being smugly celebrated en masse. That is not intellectually rigorous thought—instead, it’s another kind of religion, and it is perhaps as terrifying as the thing it is trying to fight.

I am glad that people believe science is a concept worth marching for. But the reality is that the state of affairs within the scientific community and literacy among its “fans” lies largely in shambles. The March for Science, and the somewhat mindless glee that was on display, is entirely antithetical to the idea of science as a whole. ... gious.html

David Bloor The Strong Programme in the Sociology of Knowledge
By Sivad
The problem with science is that so much of it simply isn’t.

Many defenders of the scientific establishment will admit to this problem, then offer hymns to the self-correcting nature of the scientific method. Yes, the path is rocky, they say, but peer review, competition between researchers, and the comforting fact that there is an objective reality out there whose test every theory must withstand or fail, all conspire to mean that sloppiness, bad luck, and even fraud are exposed and swept away by the advances of the field.

So the dogma goes. But these claims are rarely treated like hypotheses to be tested. Partisans of the new scientism are fond of recounting the “Sokal hoax”—physicist Alan Sokal submitted a paper heavy on jargon but full of false and meaningless statements to the postmodern cultural studies journal Social Text, which accepted and published it without quibble—but are unlikely to mention a similar experiment conducted on reviewers of the prestigious British Medical Journal. The experimenters deliberately modified a paper to include eight different major errors in study design, methodology, data analysis, and interpretation of results, and not a single one of the 221 reviewers who participated caught all of the errors. On average, they caught fewer than two—and, unbelievably, these results held up even in the subset of reviewers who had been specifically warned that they were participating in a study and that there might be something a little odd in the paper that they were reviewing. In all, only 30 percent of reviewers recommended that the intentionally flawed paper be rejected.
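A back-of-envelope illustration (my own, not an analysis from the BMJ study): if we assume each reviewer spotted each of the eight planted errors independently with the same fixed probability, the reported average of fewer than two catches implies a per-error detection rate of roughly 25%, and the chance of any one reviewer catching all eight would be vanishingly small:

```python
# Illustrative model only: assume each reviewer detects each of the 8 planted
# errors independently with the same probability p. The study reported an
# average of fewer than two errors caught per reviewer, so take 2 as an
# upper bound on the average.

avg_caught = 2          # upper bound on the reported average catches
n_errors = 8            # major errors deliberately introduced into the paper

p = avg_caught / n_errors   # implied per-error detection rate (at most 0.25)
p_all = p ** n_errors       # chance a single reviewer catches all 8 errors

print(f"implied per-error detection rate: at most {p:.2f}")
print(f"chance one reviewer catches all eight: about {p_all:.6f}")
```

On these assumptions the odds of a single reviewer finding every planted error are on the order of one in sixty-five thousand, which makes the study's headline result (no reviewer caught them all) unsurprising in hindsight.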

If peer review is good at anything, it appears to be keeping unpopular ideas from being published. Consider the finding of another (yes, another) of these replicability studies, this time from a group of cancer researchers. In addition to reaching the now unsurprising conclusion that only a dismal 11 percent of the preclinical cancer research they examined could be validated after the fact, the authors identified another horrifying pattern: The “bad” papers that failed to replicate were, on average, cited far more often than the papers that did! As the authors put it, “some non-reproducible preclinical papers had spawned an entire field, with hundreds of secondary publications that expanded on elements of the original observation, but did not actually seek to confirm or falsify its fundamental basis.”

What they do not mention is that once an entire field has been created—with careers, funding, appointments, and prestige all premised upon an experimental result which was utterly false due either to fraud or to plain bad luck—pointing this fact out is not likely to be very popular. Peer review switches from merely useless to actively harmful. It may be ineffective at keeping papers with analytic or methodological flaws from being published, but it can be deadly effective at suppressing criticism of a dominant research paradigm. Even if a critic is able to get his work published, pointing out that the house you’ve built together is situated over a chasm will not endear him to his colleagues or, more importantly, to his mentors and patrons.

Older scientists contribute to the propagation of scientific fields in ways that go beyond educating and mentoring a new generation. In many fields, it’s common for an established and respected researcher to serve as “senior author” on a bright young star’s first few publications, lending his prestige and credibility to the result, and signaling to reviewers that he stands behind it. In the natural sciences and medicine, senior scientists are frequently the controllers of laboratory resources—which these days include not just scientific instruments, but dedicated staffs of grant proposal writers and regulatory compliance experts—without which a young scientist has no hope of accomplishing significant research. Older scientists control access to scientific prestige by serving on the editorial boards of major journals and on university tenure-review committees. Finally, the government bodies that award the vast majority of scientific funding are either staffed or advised by distinguished practitioners in the field.

Max Planck famously quipped: “A new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it.” Planck may have been too optimistic. A recent paper from the National Bureau of Economic Research studied what happens to scientific subfields when star researchers die suddenly and at the peak of their abilities, and finds that while there is considerable evidence that young researchers are reluctant to challenge scientific superstars, a sudden and unexpected death does not significantly improve the situation, particularly when “key collaborators of the star are in a position to channel resources (such as editorial goodwill or funding) to insiders.”

Which brings us to the odd moment in which we live. At the same time as an ever more bloated scientific bureaucracy churns out masses of research results, the majority of which are likely outright false, scientists themselves are lauded as heroes and science is upheld as the only legitimate basis for policy-making. There’s reason to believe that these phenomena are linked. When a formerly ascetic discipline suddenly attains a measure of influence, it is bound to be flooded by opportunists and charlatans, whether it’s the National Academy of Science or the monastery of Cluny.

This comparison is not as outrageous as it seems: Like monasticism, science is an enterprise with a superhuman aim whose achievement is forever beyond the capacities of the flawed humans who aspire toward it. The best scientists know that they must practice a sort of mortification of the ego and cultivate a dispassion that allows them to report their findings, even when those findings might mean the dashing of hopes, the drying up of financial resources, and the loss of professional prestige. It should be no surprise that even after outgrowing the monasteries, the practice of science has attracted souls driven to seek the truth regardless of personal cost and despite, for most of its history, a distinct lack of financial or status reward. Now, however, science and especially science bureaucracy is a career, and one amenable to social climbing. Careers attract careerists, in Feyerabend’s words: “devoid of ideas, full of fear, intent on producing some paltry result so that they can add to the flood of inane papers that now constitutes ‘scientific progress’ in many areas.”

If science was unprepared for the influx of careerists, it was even less prepared for the blossoming of the Cult of Science. The Cult is related to the phenomenon described as “scientism”; both have a tendency to treat the body of scientific knowledge as a holy book or an a-religious revelation that offers simple and decisive resolutions to deep questions. But it adds to this a pinch of glib frivolity and a dash of unembarrassed ignorance. Its rhetorical tics include a forced enthusiasm (a search on Twitter for the hashtag “#sciencedancing” speaks volumes) and a penchant for profanity. Here in Silicon Valley, one can scarcely go a day without seeing a t-shirt reading “Science: It works, b—es!” The hero of the recent popular movie The Martian boasts that he will “science the sh— out of” a situation. One of the largest groups on Facebook is titled “I f—ing love Science!” (a name which, combined with the group’s penchant for posting scarcely any actual scientific material but a lot of pictures of natural phenomena, has prompted more than one actual scientist of my acquaintance to mutter under her breath, “What you truly love is pictures”). Some of the Cult’s leaders like to play dress-up as scientists—Bill Nye and Neil deGrasse Tyson are two particularly prominent examples— but hardly any of them have contributed any research results of note. Rather, Cult leadership trends heavily in the direction of educators, popularizers, and journalists.

At its best, science is a human enterprise with a superhuman aim: the discovery of regularities in the order of nature, and the discerning of the consequences of those regularities. We’ve seen example after example of how the human element of this enterprise harms and damages its progress, through incompetence, fraud, selfishness, prejudice, or the simple combination of an honest oversight or slip with plain bad luck. These failings need not hobble the scientific enterprise broadly conceived, but only if scientists are hyper-aware of and endlessly vigilant about the errors of their colleagues . . . and of themselves. When cultural trends attempt to render science a sort of religion-less clericalism, scientists are apt to forget that they are made of the same crooked timber as the rest of humanity and will necessarily imperil the work that they do. The greatest friends of the Cult of Science are the worst enemies of science’s actual practice. ... ic-regress
Society needs this pointed out to it. Science, like individual rights, should be a consideration in our decision making, not an absolute determinant. Science should be suspect if it follows a cultural trend instead of preceding it.
By Sivad
One Degree wrote: Science should be suspect if it follows a cultural trend instead of preceding it.

Ideology can be just as much of a conflict of interest as money or status, and I have no doubt that many fields of science, especially those that bear more substantially on public policy, are chock-full of ideologues with an axe to grind. We should always follow the money, but we should follow the ideology as well.
By Sivad
Ideological Conflicts of Interest Worry Me More Than Financial Conflicts

while I do not disregard or trivialize the potential conflict of interest that may come from financial interests (e.g. such as holding a patent or industry funding), we need to also be aware of other powerful conflicts that range from a simple desire to advance one’s personal career (e.g. get tenure, publish in a high-impact journal) to ideological conflicts (e.g. as in spinning research findings to support dearly held world-views or hypotheses).

Whereas disclosing financial conflicts is relatively straightforward (and now pretty much the norm), disclosing other conflicts is more challenging.

Just how devastating ideological conflicts can be to the scientific discourse is perhaps best illustrated by the recent publication by Christopher Ramsden and colleagues in the British Medical Journal on their analysis of recovered data from the Minnesota Coronary Experiment (MCE).

Conducted back in 1968-73, the MCE was not only the largest (n=9570) but also the most rigorously executed randomized controlled dietary trial of cholesterol lowering by replacement of saturated fat with vegetable oil rich in linoleic acid.

The trial was initiated by Ancel Keys, a fervent supporter of the idea that atherosclerosis was directly related to dietary saturated fat intake and a champion of replacing dietary fats with vegetable oils rich in linoleic acid.

Importantly, this line of thinking was the driver behind the low-fat recommendations that found their way into dietary guidelines and ultimately the low-fat craze that characterized much of the second half of the last century.

Although completed in 1973, the findings from this study were never published – until now, when Ramsden and colleagues not only managed to recover the original data but also to conduct the analyses according to hypotheses prespecified by original investigators.

As has been suspected by some for a long time, the results turn out to be devastating for the idea that reducing saturated fat intake or switching to vegetable oils can help prevent heart attacks.

As to why these results (that could well have changed decades of dietary recommendations) were never made public, the authors have this to offer,

“In the case of the MCE, the crude study results were clearly at odds with prevailing beliefs….There would have been little or no scientific or clinical trial literature at the time to support findings that were so contrary to prevailing beliefs and public policy.”


“It is interesting to speculate whether complete publication of randomized controlled trial results might have altered key policy decisions promoting replacement of saturated fat with linoleic acid rich oils (such as the 1977 McGovern report and National Cholesterol Education Program (1984-85)) or contributed to a shift in research priorities.”

How much was the fact that the findings were never published influenced by the investigators’ strong “beliefs” in the benefits of reducing saturated fat intake and their “ideological” interest in promoting linoleic-acid rich vegetable oils?

We may never know.

No doubt, Ancel Keys and colleagues would have realised that making these findings public would have done severe damage to their “pet hypothesis”.

When “ideological conflicts” creep into science, they can be far more damaging to science in the long run than any financial conflicts, simply because the former are far less evident than the latter.

When someone has “no financial conflicts to declare” I often ask myself, “what are the authors really hiding?” ... -conflicts
By Sivad
Paul Feyerabend How to Defend Society Against Science
By Sivad
President Dwight D. Eisenhower's famous 1960 farewell address contained more than an admonition about the danger of an expanding "military-industrial complex."

Little attention has been given to an equally important warning that Eisenhower issued in the same farewell address: the danger that public policy might become the captive of a scientific technological elite.

... [In] the technological revolution during recent decades ... research has become central ... complex, and costly. A steadily increasing share is conducted for, by, or at the direction of, the Federal government ... the solitary inventor ... has been overshadowed by task forces of scientists in laboratories and testing fields ...

... the free university, historically the fountainhead of free ideas and scientific discovery, has experienced a revolution in the conduct of research. Partly because of the huge costs involved, a government contract becomes virtually a substitute for intellectual curiosity.

The prospect of domination of the nation's scholars by Federal employment, project allocations, and the power of money is ever present and is gravely to be regarded. ... we must ... be alert to the ... danger that public policy could itself become the captive of a scientific technological elite [ii].
By Sivad
Academic careerism is the tendency of academics (professors specifically and intellectuals generally) to pursue their own enrichment and self-advancement at the expense of honest inquiry, unbiased research and dissemination of truth to their students and society. Such careerism has been criticized by thinkers from Socrates in ancient Athens to Russell Jacoby in the present.

Publish-or-perish: Peer review and the corruption of science

There is just one problem with self-publication and post-publication review. In 2006 Nature magazine tried it and it wasn't popular. Most people who were asked didn't want to take part, and, more important, most people who were invited to comment declined to do so. The probable reason is the exceedingly competitive nature of research in many fields. A junior person might be terrified to criticise a senior person, and senior researchers might similarly be terrified of criticising each other, in case the person criticised was reviewing their next grant (or sitting on their promotion or tenure committee). ... ew-science

Has the Scientific Method Been Compromised By Careerism?

science today seems increasingly interested in “other things,” from academic advancement to financial rewards. And the scientific publishing process seems more and more geared to abetting these practices as the number and capacity of outlets has exploded over the past decade.

despite dozens of fraudulent papers being in the literature for years, the perpetrator was finally caught; therefore, science can police itself.

There are a few problems with this cheerful line of thought. First is the obvious “we don’t know what we don’t know.” How many papers in the literature are currently packed with explosives, waiting to detonate under our noses? It’s hard to tell. And aside from the scandalous, there are the more pernicious but individually less harmful uninteresting, unread, and uncited papers, turning the stepping stones of science into something more akin to an intellectual swampland. It’s hard for the police to succeed when they’re up to their waists in sludge.

It seems malfeasance is uncovered by a tip from someone close to a perpetrator. It’s not as if science policed itself — rather, someone fed up with a cheater’s charade, and success perpetrating it, finally blows the whistle, a major journal investigates, and months later, there is a retraction. Humans police humans, the same as if a drug kingpin had been narced out by a crony. There’s usually nothing noble about how “science” polices itself. Baser human motivations feed the fraud, and ultimately they tip off authorities.

Exaggeration is another form of careerism — hardening hints and shadows into declarative certainties, all for the sake of a higher impact publication event. This plays on the desire to believe that scientific publishing yields “truth,” which seems fairly prevalent.

The lesson once again is that science is done by humans, and prone to the failings of its practitioners and their institutionalized practices. ... careerism/

The true clerc is never diverted from single-hearted adoration of the beautiful and the divine by the necessity of earning their daily bread. But such clercs are inevitably rare. ... The rule is that the living creature condemned to struggle for life turns to practical passions, and thence to the sanctifying of those passions.
By Sivad
Misplaced faith
The public trusts scientists much more than scientists think. But should it?

although high-profile fraud makes headlines, a broader and more common set of unappealing behaviours — from corner-cutting to data-juggling — lie under the surface. Convention says that a tiny minority of scientists cheats, yet academics and researchers frequently make the case that irregularities are widespread. A 2014 survey of hundreds of economists, for example, found that 94% admitted to having engaged in at least one “unaccepted” research practice (S. Necker Res. Policy 43, 1747–1759; 2014).

Just like with British chemistry, it seems that the wider public’s view of science and research is rosier than that of many people who are directly involved. For how long can this continue?
By Sivad
Science and objectivity

the autonomy of the scientific field cannot be taken for granted. An important part of Bourdieu's theory is that the historical development of a scientific field, sufficiently autonomous to be described as such and to produce objective work, is an achievement that requires continual reproduction. Having been achieved, it cannot be assumed to be secure. Bourdieu does not discount the possibility that the scientific field may lose its autonomy and therefore deteriorate, losing its defining characteristic as a producer of objective work. In this way, the conditions of possibility for the production of transcendental objectivity could arise and then disappear.

By Sivad

"Science must shape policy. Science is universal. Science brings out the best in us.

With an informed, optimistic view of the future, together, we can — dare, I say it — SAVE THE WORLD! Thank you! … Science!"

:knife: Nye's pollyannish idiocy is incredibly naive and dangerous. Science is a double-edged sword, and ignoring the dark side of science and all the horrors it has unleashed upon the world (weapons of mass destruction, dystopian technocracy, mass surveillance) and all the future existential threats it has in store is typical of the deranged optimism of these cultists. These people are irresponsible, reckless idiots and their scientistic fundamentalism needs to be exposed and debunked.
By Sivad
The sugar conspiracy
In 1972, a British scientist sounded the alarm that sugar – and not fat – was the greatest danger to our health. But his findings were ridiculed and his reputation ruined. How did the world’s top nutrition scientists get it so wrong for so long?


Nutrition scientists are angry with the press for distorting their findings, politicians for failing to heed them, and the rest of us for overeating and under-exercising. In short, everyone – business, media, politicians, consumers – is to blame. Everyone, that is, except scientists.

But it was not impossible to foresee that the vilification of fat might be an error. Energy from food comes to us in three forms: fat, carbohydrate, and protein. Since the proportion of energy we get from protein tends to stay stable, whatever our diet, a low-fat diet effectively means a high-carbohydrate diet. The most versatile and palatable carbohydrate is sugar, which John Yudkin had already circled in red. In 1974, the UK medical journal, the Lancet, sounded a warning about the possible consequences of recommending reductions in dietary fat: “The cure should not be worse than the disease.”

Still, it would be reasonable to assume that Yudkin lost this argument simply because, by 1980, more evidence had accumulated against fat than against sugar.

After all, that’s how science works, isn’t it?

If, as seems increasingly likely, the nutritional advice on which we have relied for 40 years was profoundly flawed, this is not a mistake that can be laid at the door of corporate ogres. Nor can it be passed off as innocuous scientific error. What happened to John Yudkin belies that interpretation. It suggests instead that this is something the scientists did to themselves – and, consequently, to us.


These sharp fluctuations in Yudkin’s stock have had little to do with the scientific method, and a lot to do with the unscientific way in which the field of nutrition has conducted itself over the years. This story, which has begun to emerge in the past decade, has been brought to public attention largely by sceptical outsiders rather than eminent nutritionists. In her painstakingly researched book, The Big Fat Surprise, the journalist Nina Teicholz traces the history of the proposition that saturated fats cause heart disease, and reveals the remarkable extent to which its progress from controversial theory to accepted truth was driven, not by new evidence, but by the influence of a few powerful personalities, one in particular.

Teicholz’s book also describes how an establishment of senior nutrition scientists, at once insecure about its medical authority and vigilant for threats to it, consistently exaggerated the case for low-fat diets, while turning its guns on those who offered evidence or argument to the contrary. John Yudkin was only its first and most eminent victim.

Today, as nutritionists struggle to comprehend a health disaster they did not predict and may have precipitated, the field is undergoing a painful period of re-evaluation. It is edging away from prohibitions on cholesterol and fat, and hardening its warnings on sugar, without going so far as to perform a reverse turn. But its senior members still retain a collective instinct to malign those who challenge its tattered conventional wisdom too loudly, as Teicholz is now discovering.


Ancel Keys, a prominent nutritionist at the University of Minnesota, was intensely aware that Yudkin’s sugar hypothesis posed an alternative to his own. If Yudkin published a paper, Keys would excoriate it, and him. He called Yudkin’s theory “a mountain of nonsense”, and accused him of issuing “propaganda” for the meat and dairy industries. “Yudkin and his commercial backers are not deterred by the facts,” he said. “They continue to sing the same discredited tune.”


Throughout the 1960s, Keys accumulated institutional power. He secured places for himself and his allies on the boards of the most influential bodies in American healthcare, including the American Heart Association and the National Institutes of Health. From these strongholds, they directed funds to like-minded researchers, and issued authoritative advice to the nation. “People should know the facts,” Keys told Time magazine.

This apparent certainty was unwarranted.


In a 2015 paper titled Does Science Advance One Funeral at a Time?, a team of scholars at the National Bureau of Economic Research sought an empirical basis for a remark made by the physicist Max Planck: “A new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it.”

The researchers identified more than 12,000 “elite” scientists from different fields. The criteria for elite status included funding, number of publications, and whether they were members of the National Academies of Science or the Institute of Medicine. Searching obituaries, the team found 452 who had died before retirement. They then looked to see what happened to the fields from which these celebrated scientists had unexpectedly departed, by analysing publishing patterns.

What they found confirmed the truth of Planck’s maxim. Junior researchers who had worked closely with the elite scientists, authoring papers with them, published less. At the same time, there was a marked increase in papers by newcomers to the field, who were less likely to cite the work of the deceased eminence. The articles by these newcomers were substantive and influential, attracting a high number of citations. They moved the whole field along.

A scientist is part of what the Polish philosopher of science Ludwik Fleck called a “thought collective”: a group of people exchanging ideas in a mutually comprehensible idiom. The group, suggested Fleck, inevitably develops a mind of its own, as the individuals in it converge on a way of communicating, thinking and feeling.

This makes scientific inquiry prone to the eternal rules of human social life: deference to the charismatic, herding towards majority opinion, punishment for deviance, and intense discomfort with admitting to error. ... ohn-yudkin
By Sivad
Groupthink in Science

Sabine Hossenfelder (born 18 September 1976) is an author and theoretical physicist who researches quantum gravity. She is a Research Fellow at the Frankfurt Institute for Advanced Studies, where she leads the Analog Systems for Gravity Duals group. She is the author of "Lost in Math: How Beauty Leads Physics Astray".
By Sivad
Judith A. Curry is an American climatologist and former chair of the School of Earth and Atmospheric Sciences at the Georgia Institute of Technology. Her research interests include hurricanes, remote sensing, atmospheric modeling, polar climates, air-sea interactions, and the use of unmanned aerial vehicles for atmospheric research. She is a member of the National Research Council's Climate Research Committee.[1] As of 2017, she has retired from academia.[2][3]

Curry is the co-author of Thermodynamics of Atmospheres and Oceans (1999), and co-editor of Encyclopedia of Atmospheric Sciences (2002), as well as over 140 scientific papers. Among her awards is the Henry G. Houghton Research Award from the American Meteorological Society in 1992.

Polanyi’s essay provides some interesting insights, as well as some striking contrasts with the Republic of Science in the early 21st century.

Polanyi’s analogy of the scientific process with markets captures the pure incentives that drive scientists – the search for truth, intellectual satisfaction, and individual ego. What happens when the externalities of the Republic of Science produce perverse incentives, and careerism becomes a dominant incentive that requires publishing a lot of papers rapidly and producing headline-worthy results (who even cares if these papers don’t survive scrutiny beyond their press release)? (see What is the measure of scientific success?) What happens is that you get an increasing incidence of scientific fraud (see Science: in the doghouse?), cherry picking, and meaningless papers on headline-grabbing topics that don’t stand the test of time (see Trust and don’t bother to verify).

And what happens when the ‘hand’ guiding science isn’t ‘invisible’, i.e. science is driven by politics, such as a political imperative to move away from fossil fuels and towards renewable energy? Federal funding can bias science, particularly in terms of selecting which scientific problems receive attention (link).

And what of Polanyi’s statement: “Such self-coordination of independent initiatives leads to a joint result which is unpremeditated by any of those who bring it about.” The ‘result’ of dangerous anthropogenic climate change and the harms of dietary fat were hardly unpremeditated.

When science is politically relevant and has been politicized, how objective are the authorities that are keepers of the orthodoxy — journal editors, officers of professional societies, university administrators — and how open are they to dissenting perspectives? The experiences of Lennart Bengtsson (link), my being called a ‘climate heretic’ (see my essay Heresy and the creation of monsters), Christopher Essex’s essay (link), Roger Pielke Jr’s experiences, and MANY more examples among climate scientists speak to the fact that the keepers of the climate science orthodoxy are failing in this regard [link to Are climate scientists being forced to toe the line?]. Without the internet and the blogosphere, these dissenting voices would be rendered silent by the keepers of the orthodoxy.

Climate and environmental sciences are far from the only scientific fields suffering in this way – the problem is also rampant in medicine, nutrition, and psychology [link to Partisanship and silencing science.]

Where lies the solution to this? Well, one possibility is reflected in Polanyi’s statement: “[L]ittle more can, or need, be done towards the advancement of science, than to assist spontaneous movements towards new fields of distinguished discovery, at the expense of fields that have become exhausted.” Now that climate science is ‘settled’, i.e. at least it is perceived to be sufficiently settled to provide the basis for a very expensive international climate ‘agreement’ (not treaty), perhaps future investments should be directed towards other fields that are deemed important or where greater progress can be made. This is exactly what has been happening in Australia, as the Turnbull administration has been axing climate jobs at CSIRO (link).

Is climate science ‘exhausted’ in terms of diminishing returns on future research? I would argue that climate science is an immature field with many unknowns; however, the current paradigm of using inadequate climate models to focus on human-caused climate change has reached the point of diminishing returns. Further, the intense politicization of the subject has adversely influenced the community of scientists — in terms of biasing the scientists and also in discouraging young scientists from entering and staying in the field. So in a sense, climate science has become ‘exhausted’ by the politicization.

Governments who fund science and universities who hire scientists need to make the hard decisions regarding which fields and subfields are most worthy of investment, in terms of new breakthrough science. While I was Chair of the School of Earth and Atmospheric Sciences, it was my privilege and opportunity to hire 27 faculty members (24 as primary appointments, 3 as joint hires) over the course of 13 years. This is a rare opportunity for a department in the geosciences. When I became Chair in 2002, the School had 4 divisions – geochemistry, geophysics, atmospheric chemistry, and atmospheric dynamics. I made it a priority to bring ‘water’ into the School, and to hire faculty members that could interact with other scientists and engineers, beyond the geosciences, to stimulate new research areas. Apart from these broad objectives, I hired the best people that we could attract, with little preference for specific research areas. This approach resulted in a reconfiguration of the school to include oceanography, planetary and space sciences, biogeochemistry, and new subfields of geophysics.

I did not hire much in the areas of atmospheric dynamics or climate science (outside of oceanography and biogeochemistry), simply because the quality of the applicants was not as strong as in the other fields. While I have inferred that my provost was not pleased that I did not hire more in ‘climate science’, the outstanding young scientists that I did hire are garnering substantial external recognition and are being heavily recruited by other universities (good luck to the new Chair in retaining these outstanding faculty members). Why didn’t I hire more in atmospheric dynamics and climate science? The atmospheric dynamics faculty candidates generally were in the areas of data assimilation and mesoscale modeling — areas that are important, but arguably engineering rather than science that is going to lead to a breakthrough in understanding. In climate science, most of the applicants were using climate models, running scenarios and inferring dire consequences — not the climate dynamics theorists I was hoping for, who could help understand and untangle the complex physical, chemical and even biological processes influencing the climate system.

In a broader sense, which scientific subfields and topics are deemed to be important, and why? There is no easy answer to this, but it is the job of university Deans and federal funding agencies to prioritize. There is an interesting example currently in the news that comes from Georgia Tech’s David Hu, Associate Professor in Mechanical Engineering. He has written an essay Confessions of a Wasteful Scientist. Subtitle: Three of my projects appeared last week on a senator’s list of questionable research. Allow me to explain…

I would also like to respond to Polanyi’s statement: “universities provide an intimate communion for the formation of scientific opinion, free from corrupting intrusions and distractions.” I am very sad to report that this simply isn’t true of universities in the early 21st century, which increasingly suffer from a lack of intellectual diversity. Universities are becoming very uncomfortable places for faculty members with minority perspectives on controversial topics.

As a result, many scientists with minority perspectives are leaving universities. Further, the internet has enabled many individuals outside of academia to make important contributions to climate science (published in refereed journals, in books, and in other reports). Polanyi wrote: “[T]he general public cannot participate in the intellectual milieu in which discoveries are made. For such work the scientist needs a secluded place among like-minded colleagues who keenly share his aims and sharply control his performances.” This is a perspective on scientists that is peculiar to the 20th century [see Scientist: the evolving story of a word]. Particularly in climate science, we are seeing the emergence of a substantial and influential cohort of non-academic scientists, contributing both to the published literature and the public scientific debate. This broadening of the notions of expertise away from university elites is leading some to question whether our traditional notions of expertise are dead [link].

So, what should the Republic of Science look like in the 21st century? The overwhelming issue for the health of science is to reassert the importance of intellectual and political diversity in science, and to respect and even nurture scientific mavericks. The tension between pure (curiosity-driven) science, use-inspired science, and applied science [see Pasteur’s quadrant] needs to be resolved in a way that supports all three, with appropriate roles for universities, government and the private sector. And finally, the reward structure for university scientists needs to change to reward meaningful science that stands the test of time, rather than counting papers and press releases that may not survive even superficial scrutiny after being published in prestigious journals more interested in impact than in rigorous methods and appropriate conclusions.

Failure to give serious thought to these issues risks losing public trust and support for elite university science (at least in certain fields). Scientists become their own worst enemies when they play into the hands of politicians and others seeking to politicize their science. ... f-science/
By One Degree
Get the federal government out of universities. Don’t make scientists compete for special interest money to fund them.
By Sivad
One Degree wrote:Get the federal government out of universities. Don’t make scientists compete for special interest money to fund them.

That's not the way to go, we need publicly funded science. The solution is democratizing faculty appointments and review, grants and funding, promotions, peer review, and publishing. Right now it's a top-down hierarchy controlled by an elite establishment; it needs to be opened up to at least grad students and above. Science is always gonna have politics, but it doesn't have to be hierarchical elitist politics.
By Sivad
We could also issue citizens science vouchers ($200 a year or so), and science could be crowdfunded with public money.
By One Degree
Sivad wrote:That's not the way to go, we need publicly funded science. The solution is democratizing faculty appointments and review, grants and funding, promotions, peer review, and publishing. Right now it's a top-down hierarchy controlled by an elite establishment; it needs to be opened up to at least grad students and above. Science is always gonna have politics, but it doesn't have to be hierarchical elitist politics.

The government doesn’t fund anything without exerting control. They are the biggest creator of bias and have the greatest motive for directing it where they want it to go. Nothing changes as long as they are paying the bill.
By Sivad
One Degree wrote:The government doesn’t fund anything without exerting control.

With the system I'm proposing the government would supply the funds but it wouldn't direct them. Funding would be controlled democratically, primarily by the scientific community but the public could also be included in the process.

They are the biggest creator of bias and have the greatest motive for directing it where they want it to go. Nothing changes as long as they are paying the bill.

That's not true, private money is just as corrosive to the integrity of science as public money.
