Monday, September 01, 2014

Why do normal (real) science when you can do post-normal science?

Recent disclosures regarding the conduct of vaccine-safety studies at the CDC should lead everyone to ask some pointed questions about vaccine safety and efficacy. A whistleblower has come forward alleging that studies of the MMR vaccine were manipulated in order to obtain particular results.

There are links to five articles below, none of which are directly about vaccines, but all of which inform the current controversy.

The concept of "post-normal science" has been developed and discussed since the early 1990s, and this writer wonders whether it has been applied (even unintentionally) to studies of vaccines.

This quote from Steven Mosher (#1) describes a "post-normal" situation:
Science has changed. More precisely, in post normal conditions the behavior of people doing science has changed.
Ravetz describes a post normal situation by the following criteria:
  1. Facts are uncertain
  2. Values are in conflict
  3. Stakes are high
  4. Immediate action is required
The first three criteria would certainly apply to many aspects of medicine, including vaccination; the fourth, not so much. Part of what is so appalling about the concept of "post-normal science" is the assumption that immediate action is required. Given the first three criteria, the appropriate response is to slow down until more is known, since taking immediate action can result in catastrophe if the intervention is ultimately found to be detrimental.

Here is a diagram (from #2) explaining the difference between traditional science ("applied science" - the green area) and "post-normal science" (the red area):

It could well be argued that quality has always been the effective principle in practical research science, but it was largely ignored by the dominant philosophy and ideology of science. For post-normal science, quality becomes crucial, and quality refers to process at least as much as to product. It is increasingly realised in policy circles that in complex environment issues, lacking neat solutions and requiring support from all stakeholders, the quality of the decision-making process is absolutely critical for the achievement of an effective product in the decision. This new understanding applies to the scientific aspect of decision-making as much as to any other.
Figure 1
Post-Normal Science can be located in relation to the more traditional complementary strategies, by means of a diagram (see Figure 1). On it, we see two axes, "systems uncertainties" and "decision stakes". When both are small, we are in the realm of "normal", safe science, where expertise is fully effective. When either is medium, then the application of routine techniques is not enough; skill, judgement, sometimes even courage are required. We call this "professional consultancy", with the examples of the surgeon or the senior engineer in mind. Our modern society has depended on armies of "applied scientists" pushing forward the frontiers of knowledge and technique, with the professionals performing an aristocratic role, either as innovators or as guardians.
Of course there have always been problems that science could not solve; indeed, the great achievement of our civilisation has been to tame nature in so many ways, so that for unprecedented numbers of people, life is more safe, convenient and comfortable than could ever have been imagined in earlier times. But now we are finding that the conquest of nature is not complete. As we confront nature in its reactive state, we find extreme uncertainties in our understanding of its complex systems, uncertainties that will not be resolved by mere growth in our data-bases or computing power. And since we are all involved with managing the natural world to our personal and sectional advantage, any policy for change is bound to affect our interests. Hence in any problem-solving strategy, the decision-stakes of the various stakeholders must also be reckoned with.

This is why the diagram has two dimensions; this is an innovation for descriptions of "science", which had traditionally been assumed to be "value-free". But in any real problem of environmental management, the two dimensions are inseparable. When conclusions are not completely determined by the scientific facts, inferences will (naturally and legitimately) be conditioned by the values held by the agent. This is a necessary part of ordinary research practice; all statistical tests have values built in through the choice of numerical "confidence limits", and the management of "outlier" data calls for judgements that can sometimes approach the post-normal in their complexity. If the stakes are very high (as when an institution is seriously threatened by a policy) then a defensive policy will involve challenging every step of a scientific argument, even if the systems uncertainties are actually small. Such tactics become wrong only when they are conducted covertly, as by scientists who present themselves as impartial judges when they are actually committed advocates. There are now many initiatives, increasing in number and significance all the time, for involving wider circles of people in decision-making and implementation on environmental issues.
The financial stakes in vaccination are high - for the companies that make these products and for those recommending them. The personal stakes are high for all who are required to be vaccinated in order to keep their jobs or participate in public school or daycare - and who must accept the personal and financial consequences of any adverse effects; in other words, they accept nearly all the risk. Increasingly, there is reason to believe the uncertainty surrounding vaccination is also high - in particular, that the risk/benefit ratio may not be as favorable as it has been portrayed.

If one study was thrown, might others have been thrown as well?

What other "irregularities" might have occurred? Poul Thorsen has been implicated (indicted, actually) for wire fraud and improper use of funds (#3), and has yet to be extradited from Denmark to answer the charges against him (at least regarding how he used the money entrusted to him for researching vaccine safety). But if the funds allocated for research cannot be accounted for, can the findings of the research be trusted? Both issues involve numbers - if you'll fudge one set, might you also alter another?

In 2005, Ioannidis (#4) published an essay titled "Why Most Published Research Findings Are False," reviewing factors that lead to the publication of findings later shown to be false. In this paper he develops several corollaries regarding the likelihood that a research paper's findings are true - two in particular are applicable to vaccine science:

Corollary 5: The greater the financial and other interests and prejudices in a scientific field, the less likely the research findings are to be true.
Conflicts of interest and prejudice may increase bias, u. Conflicts of interest are very common in biomedical research [26], and typically they are inadequately and sparsely reported [26,27]. Prejudice may not necessarily have financial roots. Scientists in a given field may be prejudiced purely because of their belief in a scientific theory or commitment to their own findings. Many otherwise seemingly independent, university-based studies may be conducted for no other reason than to give physicians and researchers qualifications for promotion or tenure. Such nonfinancial conflicts may also lead to distorted reported results and interpretations. Prestigious investigators may suppress via the peer review process the appearance and dissemination of findings that refute their findings, thus condemning their field to perpetuate false dogma. Empirical evidence on expert opinion shows that it is extremely unreliable [28]. 
Corollary 6: The hotter a scientific field (with more scientific teams involved), the less likely the research findings are to be true.
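Ioannidis does not leave these corollaries as rhetoric: his paper models the probability that a claimed finding is true (the positive predictive value, PPV) in terms of the pre-study odds R, the Type I and Type II error rates, and a bias term u - the proportion of analyses that would not otherwise have yielded a "positive" result but are reported as positive anyway. A minimal sketch of that calculation (the function and variable names here are mine, not Ioannidis's):

```python
def ppv(R, alpha, beta, u):
    """Probability a claimed research finding is true, per the
    bias-adjusted model in Ioannidis (2005).

    R     -- pre-study odds that the probed relationship is true
    alpha -- Type I error rate (false-positive rate)
    beta  -- Type II error rate (1 - statistical power)
    u     -- bias: fraction of would-be non-findings reported as findings
    """
    # Positive reports among true relationships: real hits plus
    # biased rescues of what should have been misses.
    true_pos = (1 - beta) * R + u * beta * R
    # Positive reports among null relationships: false positives plus
    # biased promotion of correctly negative results.
    false_pos = alpha + u * (1 - alpha)
    return true_pos / (true_pos + false_pos)

# With 1:4 pre-study odds, conventional alpha = 0.05 and 80% power:
print(round(ppv(0.25, 0.05, 0.20, 0.0), 2))   # no bias   -> 0.8
print(round(ppv(0.25, 0.05, 0.20, 0.30), 2))  # 30% bias  -> 0.39
```

Even with everything else held fixed, raising the bias term drags the chance that a published "positive" finding is true from 80% down toward a coin flip - which is exactly the mechanism Corollary 5 warns about when financial interests are large.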
There is tremendous financial conflict of interest (COI) among those involved in vaccine science and in making "recommendations" for vaccination. The vaccine market is worth billions of dollars annually, with little risk of accountability for any adverse effects. Well-paying and prestigious careers are built by many who are beholden to the doctrine of vaccination - in academia, in governmental bureaucracies (the CDC), and in private multinational pharmaceutical companies.

Lastly, Patrick Michaels, in a piece (#5) titled "The Threat to the Scientific Method," reviews how and why the process of science has been undermined - essentially, how the "publish or perish" phenomenon has changed how science is done.

It is arrogance masked by pretense to strut around trumpeting one's presumed objectivity as a scientist when further, more objective analysis reveals that objectivity to be an abject lie.

The only aspect of "post-normal science" this writer agrees with is that it (more) openly acknowledges that everyone has an agenda. The problem is that agendas, rather than facts and truth, are influencing results and driving policy - and few are willing to wait patiently for the facts and truth to become known and understood. There is an abundance of evidence that many are harmed when this is allowed to prevail; history has already shown that, and will no doubt show it again. Because of this we should err on the side of both freedom and caution, allowing each individual to determine for themselves whether or not to receive a vaccination.






1 comment:

  1. In terms of Ravetz's description, I agree with the first two criteria: facts are uncertain and values are in conflict.


Comments are moderated - expect your post to be approved within 24 hours.
Polite, respectful discussion welcomed.