Challenging the precautionary principle

How has society come to be governed by the maxim 'better safe than sorry'?

Helene Guldberg

Not so long ago, the precautionary principle was a new concept gaining marginal support within environmental policy. Now, it has become a value to which we are all supposed to subscribe.

Even when UK prime minister Tony Blair delivered a major speech in 2002, self-consciously designed to celebrate scientific achievements and to set out a ‘clear challenge for Britain’ to improve science and technology, he was compelled to defend the maxim ‘better safe than sorry’. ‘None of this…should diminish the precautionary principle’, he told the Royal Society. ‘Responsible science and responsible policymaking operate on the precautionary principle.’ (1)

But what is the precautionary principle, and why has it gained such an influence in society today?

A central component of the precautionary principle is the shifting of the ‘duty of care’ or ‘onus of proof’ from those who oppose change, to those who propose change (2). We are encouraged to err on the side of caution, even when there is no evidence of harm.

One of the most quoted definitions of the precautionary principle is the Wingspread Statement, produced by a gathering of scientists, philosophers, lawyers and environmental activists in the USA in 1998, which pronounces that: ‘When an activity raises threats of harm to the environment or human health, precautionary measures should be taken even if some cause and effect relationships are not fully established scientifically.’ (3)

The fact is, however, that most scientific and technological developments do raise possible ‘threats of harm’. Despite often minimising, or even eradicating, old risks, they expose us to new and often unpredictable risks. It seems clear that an excessive preoccupation with hypothetical novel risks will be detrimental to scientific and technological progress, and to society as a whole. Yet these tensions have not prevented the rapid development of the precautionary principle in recent decades, and its incorporation into various spheres of life.

The precautionary principle has its beginnings in the German ‘Vorsorgeprinzip’, or foresight principle, which emerged in the early 1970s and developed into a principle of German environmental law. It has since flourished in international policy statements and agreements: it was first recognised in the World Charter for Nature, adopted by the UN General Assembly in 1982, and subsequently invoked at the First International Conference on the Protection of the North Sea in 1984.

To begin with, the precautionary principle was adopted fairly narrowly, in relation to aspects of the natural environment such as marine life. But the UN Conference on Environment and Development in Rio de Janeiro in 1992 marked a turning point. World leaders adopted Principle 15, advocating the widespread international application of the precautionary principle.

Principle 15 states that: ‘In order to protect the environment, the precautionary approach shall be widely applied by states according to their capabilities. Where there are threats of serious or irreversible damage, lack of full scientific certainty shall not be used as a reason for postponing cost-effective measures to prevent environmental degradation.’ (4)

The precautionary principle was also formally adopted by countries of the European Union in the Treaty of Maastricht in 1992. It has since been extended from environmental issues to developments related to human health.

In the UK, the outcomes of the official, high-profile inquiries into BSE and mobile phones at the end of the 1990s illustrate how the precautionary principle had become a central tenet of New Labour environmental and health policy.

The Independent Expert Group on Mobile Phones (IEGMP), under the chairmanship of Sir William Stewart, was set up in 1999 and reported in May 2000. Its conclusions, published in what is widely known as the Stewart Report, accepted that the balance of evidence showed no adverse health effects from exposure to radiofrequency (RF) radiation from mobile phone technologies. However, the committee still recommended that ‘a precautionary approach to the use of mobile phone technologies be adopted until much more detailed and scientifically robust information on any health effects becomes available’ (5).

Similarly, the BSE Inquiry, which was set up under the chairmanship of Lord Phillips in 1998 and reported in October 2000, drew the conclusion that: ‘Precautionary measures should be strictly enforced even if the risk that they address appears to be remote.’ (6)

But the precautionary principle is not merely confined to the spheres of health and science. In today’s risk-averse world, just about every sphere of life, from business and politics to parenting and health, is increasingly organised around the notion that it is better to be safe than sorry. So why has the precautionary principle become so influential?

It is often assumed that an increasingly technological society will bring with it new worries about human health and environmental disruption. According to the Stewart Report, ‘We live in an era where science and technology are advancing at an ever-increasing rate… many people have anxieties about the pace of change and the potential for major adverse consequences if new developments are not appropriately controlled – if science has greater power to do good, it also has greater power to do harm’ (7).

But while this might seem like common sense, it is no explanation at all. It is questionable whether the rate of scientific and technological change is qualitatively different from that of the past. The rate of innovation during the industrial revolution transformed people’s lives over a very short space of time: not only in terms of working practices, but in terms of how they lived – with the growth of cities and the rapid expansion of transport.

So far as scientific and technological advance goes, we are creating a safer world, rather than a more risky one. Just look at how much safer our children – the most vulnerable section of society – are today. At the turn of the twentieth century, 150 in every 1000 babies born in England died before they reached their first birthday. Nutrition was poor, and infectious diseases such as smallpox, diphtheria, measles, typhoid and cholera claimed many young lives. Most of these diseases are now virtually non-existent, thanks largely to immunisation and improved sanitation.

Furthermore, significant medical advances over the past decades have led to improved rates of survival for children diagnosed with cancers such as leukaemia. Today, infant mortality has dropped to fewer than five in every 1000 babies born.

Accidents in the home are also declining: open fires and unreliable gas heaters have been replaced by central heating; and candlelight has been replaced by electric lighting. Of course, we are not living in a risk-free world, but it is clearly not the case that we are facing more risks – whether it be at home, at work or in public.

The current unease about new developments cannot be explained by what goes on in the world of science. Instead, it is the outcome of a broader culture of caution. If risk-aversion really were a result of the rate of scientific and technological advance, how could we explain the fact that the belief in putting safety first shapes our response to everything from mobile phone masts and GM food to SARS and how we raise our children?

As one of the speakers pointed out at spiked’s recent conference, Panic Attack: interrogating our obsession with risk, we live in a state of ‘free-floating anxiety’ that can attach itself to anything from MMR to bio-terrorism (8). We need to explore the broader cultural assumptions about human vulnerability in order to understand the spread of the precautionary principle into ever more spheres of life.

But if we accept that the widespread adoption of the precautionary principle is motivated, not by anything uniquely new or risky at the level of science and technology, but by a broader negative cultural outlook, is there still a case for adhering to the precautionary principle in science? The answer is, bluntly, no.

Of course, society should not be reckless in its approach to innovation. We should consider potential risks before introducing new products or proceeding with new activities. Scientists do – and should – hypothesise about, try to predict, and, as far as is possible, model possible harm.

It is this reality that often leads proponents of the precautionary principle to respond to accusations that they are anti-science with the argument that they are in fact more pro-science than their critics: in the sense that they want more science rather than less. As Poul Harremoes, chair of the editorial team of the 2001 European Environment Agency report ‘Late Lessons From Early Warnings: The Precautionary Principle 1896-2000’, said: ‘The use of the precautionary principle can [stimulate] both more innovation, via technological diversity and flexibility, and better science.’ (9)

But the precautionary principle does not merely ask us to hypothesise about and try to predict outcomes of particular actions, whether these outcomes are positive or negative. Rather, it demands that we take regulatory action on the basis of possible ‘unmanageable’ risks, even after tests have been conducted that find no evidence of harm. We are asked to make decisions to curb actions, not on the basis of what we know, but on the basis of what we do not know.

The role of the unknown – or what is referred to as ‘uncertainty’ – in determining action is spelt out clearly in ‘Late Lessons From Early Warnings’. It states that ‘complex reality demands better science, characterised by more humility and less hubris, with a focus on “what we don’t know” as well as on “what we do know”’ (10).

According to the European Commission Communication on the precautionary principle, which was issued in 2000, if there is no evidence that something is harmful – but there are ‘reasonable grounds for concern’ that it might be – then experimentation should not proceed. The Communication states: ‘Whether or not to invoke the precautionary principle is a decision exercised where scientific information is insufficient, inconclusive, or uncertain and where there are indications that the possible effects on the environment, or human, animal or plant health may be potentially dangerous and inconsistent with the chosen level of protection.’ (11)

Such arguments seem to put a straightforward case for restraint. Yet taken to its logical conclusion, this approach would surely mean that scientific experimentation never took place at all. But proponents of the precautionary principle do not want to be seen to be too trigger-happy when it comes to banning things. In Interpreting the Precautionary Principle, Tim O’Riordan and James Cameron emphasise that the principle involves a ‘proportionality of response’ to ensure that ‘the selected degree of restraint is not unduly costly’ (12).

The science writer Colin Tudge, in a piece for the New Scientist criticising spiked’s Panic Attack conference, acknowledges that the precautionary principle has various forms, but asserts that ‘all of them generally include some notion of cost-effectiveness’. Consequently, he continues, ‘the point is not simply to ban things that are not known to be absolutely safe. Rather [the precautionary principle] says: “Of course you can make no progress without risk. But if there is no obvious gain from taking the risk, then don’t take it”’ (13).

Tudge is right that a central component of the precautionary principle involves weighing up hypothetical risks against hypothetical benefits, before proceeding with new products or activities. The question remains, however: how far can we reliably quantify hypothetical costs and hypothetical benefits? And is it even possible to anticipate – in a quantifiable way – the future benefits of current discoveries and inventions?

As American science writer Ronald Bailey points out, ‘when the optical laser was invented in 1960, it was dismissed as “an invention looking for a job”. No one could imagine of what possible use this interesting phenomenon might be. Of course, now it is integral to the operation of hundreds of everyday products: it runs our printers, runs our optical telephone networks, performs laser surgery to correct myopia, removes tattoos, plays our CDs, opens clogged arteries, helps level our crop fields, etc. It’s ubiquitous’ (14).

Or take aspirin. If we had weighed up the hypothetical risks against the hypothetical benefits, would we ever have allowed the drug to be licensed? According to Peter McNaughton, Sheild Professor of Pharmacology at the University of Cambridge, we would not. In a spiked-survey investigating the effect of the precautionary principle on society, McNaughton argues that ‘this drug has considerable adverse side-effects, and would never be licensed today. The benefits, however, are enormous and growing – apart from the well-known treatment for inflammatory pain, there are uses in cancer, heart disease and prevention of deep vein thrombosis’ (see spiked-survey Science, risk and the price of precaution, by Sandy Starr).

Many of the benefits derived from aspirin could not have been anticipated. But also, as a result of the success of aspirin, many safer alternatives have been developed. In the course of scientific progress, there are endless examples of technologies that have served as bridges to new and better technologies.

We do need to accept that scientific and technological advances will often be accompanied by new risks. We cannot eliminate all risk, and we should not aspire to do so. Sometimes the consequences of innovation can be costly, and sometimes costly consequences cannot be foreseen.

Furthermore, just as there are risks in not restricting certain experiments and developments, there are risks in restricting them. As the Australian academic Ronald Brunton argued in the journal Biotechnology back in 1995, the risks of permitting certain things to go ahead ‘tend to be much more visible and politically threatening’. For example, he says, ‘approving a new medicinal drug which turns out to have harmful side-effects – such as thalidomide – can produce highly visible victims, heart-rending news stories, and very damaging political fallout. But incorrectly delaying a drug produces victims who are essentially invisible’ (15).

Before considering regulation, one does indeed have to balance the cost of being too permissive in relation to innovation against the cost of being too restrictive. It would be reckless to advocate risks where the potential costs are quantifiably high and the benefits quantifiably small. Hoover Institution research fellow Henry I Miller and Gregory Conko, director of food safety policy at the Competitive Enterprise Institute, have argued: ‘History offers compelling reasons to be cautious about societal risks, to be sure. These include the risk of incorrectly assuming the absence of danger (false negatives).’ But as they also point out, ‘there are compelling reasons to be wary of excessive precaution, including the risk of too eagerly detecting a non-existent danger (false positive).’ (16)

Regardless of what many might believe, the precautionary principle is not the ‘safe option’. It incurs the cost of ‘false positives’. That means forgoing many social benefits – most of which tend to make our lives safer rather than less safe.
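Miller and Conko’s trade-off can be restated as a simple expected-cost comparison. The notation below is illustrative shorthand of my own, not anything the authors formalise: let p be the probability that an innovation proves harmful, C the cost if it does, and B the benefit forgone if it is restricted. A symmetric assessment would restrict the innovation only when the expected cost of a false negative outweighs that of a false positive:

\[
\underbrace{p\,C}_{\text{false negative}} \;>\; \underbrace{(1-p)\,B}_{\text{false positive}}
\]

By shifting the onus of proof, the precautionary principle in effect sets B close to zero, so restriction wins by default: Brunton’s invisible victims are the discounted right-hand side of this comparison.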

History has shown us that, while scientific and technological progress may often introduce new risks, its general trajectory has been to reduce many other, more serious, risks. Examples are plentiful: the development of vaccinations, organ transplantation, blood transfusion, the chlorination of drinking water, the use of pesticides, and much more.

The precautionary principle will therefore not make us any safer. But we could pay a very heavy price for taking it on board, by missing out on future social benefits that are unimaginable to us today.

A fear of the new and the unknown may have been understandable in the past, when people turned to religious prejudice to explain phenomena that seemed beyond their control. Today’s conservative precautionary outlook is equally guided by prejudice, but a prejudice that has rejected human knowledge and control in favour of ignorance. Those of us who still aspire to a better society need to confront the restraints imposed on us by this outlook of ignorance, and reassert the attempt to control our destiny in order to improve our world.

Read on:

Who wants to live under a system of Organised Paranoia?, by Mick Hume

Science, risk and the price of precaution, by Sandy Starr

Risky living, by Colin Berry

(1) Science matters, Tony Blair, ePolitix, 23 May 2002

(2) Interpreting the Precautionary Principle, ed Tim O’Riordan and James Cameron, Earthscan Publications, 1994

(3) Wingspread statement on the precautionary principle, January 1998

(4) Rio declaration on environment and development, June 1992

(5) A precautionary approach (.pdf 100 KB), Mobile Phones and Health, Independent Expert Group on Mobile Phones, 28 April 2000

(6) BSE Inquiry Report, October 2000

(7) A precautionary approach (.pdf 100 KB), Mobile Phones and Health, Independent Expert Group on Mobile Phones, 28 April 2000

(8) See Panic Attack: Interrogating our obsession with risk

(9) EEA draws key lessons from history on using precaution in policy-making, European Environment Agency, 10 January 2002

(10) Late Lessons from Early Warnings: The Precautionary Principle 1896-2000 (.pdf 1.73 MB), European Environment Agency, 2001, p193

(11) Communication from the Commission on the precautionary principle (.pdf 72.2 KB), European Commission, 2 February 2000, p8

(12) Interpreting the Precautionary Principle, ed Tim O’Riordan and James Cameron, Earthscan Publications, 1994, p17

(13) New Scientist, 17 May 2003

(14) Precautionary tale, Ronald Bailey, Reason, April 1999

(15) Biotechnology, Vol 5, No 4, August 1995

(16) The perils of precaution, Henry Miller and Gregory Conko, Policy Review, June/July 2001
