The Human Cost of Animal Experimentation


Documented examples of failures, fatalities, misleading results, mistakes, and missed opportunities which have occurred through our acceptance of vivisection.


Dr. Robert Sharpe was a Senior Research Chemist at the prestigious Royal Postgraduate Medical School in London when he found himself at odds with colleagues who tested his chemicals on animals. He resigned his position and began to investigate what, until then, he had taken for granted: that animal experiments were vital for medical progress.


The facts proved disturbing and posed fundamental questions about the use of animals. For how could a method that produced such conflicting results be so vital to our health?


Dr. Robert Sharpe presents a powerful body of evidence documenting the failures, misleading results and missed opportunities of animal research. Is our acceptance of vivisection the biggest medical error in human history?


The suffering is real, and so it seems are the mistakes.


‘SAFE’ ANTIBIOTIC’S FATAL FLAW
BODY CHEMICALS PRODUCE OPPOSITE EFFECTS IN ANIMALS
WOMEN AT RISK FROM "PILL" SAFETY TESTS
DOCTORS WARN AGAINST "SAFE" EYE TREATMENT
ANIMAL TESTS CONFUSE PAINKILLER PROBE
DRUG INDUCED DISASTER LEAVES THOUSANDS DEAD
VIVISECTION UNDERMINES MINERS’ WELFARE
ANIMAL TESTS USED TO PROMOTE ‘SUPERIOR’ ARTHRITIS DRUG
BLOOD CELL DAMAGE MISSED BY ANIMAL TESTING
BLOOD PRESSURE PILL LEADS TO WITHDRAWAL SYNDROME
SLIMMING TREATMENT CAUSED CATARACTS
ANIMALS MIX DEADLY COCKTAIL OF CONFUSING RESULTS
SUPROFEN JOINS BANNED LIST
MIGRAINE PILL’S HORRIFIC SIDE EFFECT
LEUKEMIA & THE NUCLEAR INDUSTRY
TRANSPLANT DRUG ALMOST LOST
THE CHLOROFORM CONTROVERSY
"NON-TOXIC" OINTMENT PROVES DANGEROUS
FATAL DIURETIC SEEMED SAFE
CANCER-PRONE MICE CONTRADICT HUMAN EXPERIENCE
DOCTORS WARNED ABOUT HALOTHANE LIVER TOXICITY
SHOCK TREATMENT
ARSENIC AND THE DECADES OF FAILURE
PESTICIDE POISONING
NATURAL SKIN SUBSTANCE HARMS ANIMALS
ANIMAL-TESTED ARTHRITIS DRUGS KILLED THOUSANDS
MENTHOL & EYE IRRITATION
MORPHINE MANIA
MONKEY EXPERIMENTS PUT MALARIA PATIENTS AT RISK
DIARRHOEA TREATMENT LEAVES 10,000 VICTIMS
RODENT TESTS MISS INDUSTRIAL CANCER RISK
VALUABLE EYE THERAPY WOULD NOT PASS RABBIT TEST
RABBIT TEST MISSES HUMAN EYE IRRITANT
ANIMAL SKIN TESTS NOT UP TO SCRATCH
DOG RESEARCH UNDERMINES HEART VALVE DEVELOPMENT
ANGINA DRUG’S FATAL EFFECTS
SAFE CLEANING AGENTS DAMAGE ANIMAL VICTIMS
HEART TREATMENT WITHDRAWN
COUGH REMEDY LEAVES OVERDOSE PATIENTS IN COMA
HEART DRUG FEARS GROUNDLESS
ANIMAL TESTS MASK NERVE DAMAGE RISK
POISONING TESTS OFFER LITTLE HOPE TO OVERDOSE PATIENTS
‘SAFE’ EYE SOLUTIONS FAIL THE HUMAN TEST
ANTIBIOTIC’S DEADLY SIDE-EFFECT
ANIMALS & AIDS
NINE SPECIES FAIL TO PREDICT LIVER DAMAGE
ANTI-CANCER HOPE ABANDONED
ANAEMIA CURE FAILS IN ANIMALS
DRUG DANGER UNDETECTED
ANIMALS DIVERT ATTENTION FROM CANCER PREVENTION
EPILEPSY ‘MODELS’ GIVE FITFUL RESULTS
ANIMALS STARVE IN BRAIN RESEARCH FIASCO
THE PRACTOLOL SYNDROME
ANIMALS MISS STEROID EYE RISKS
BABIES AT RISK FROM TALC
INHALATION TESTS THROW FALSE DOUBT ON FORMALDEHYDE
USELESS TREATMENT POISONS WORKERS
ANIMAL DIET STUDIES CONTRADICT HUMAN COLON CANCER RISKS
WORKERS AT RISK FROM MISLEADING ANIMAL TESTS
THE OPREN AFFAIR
‘FLEXIBLE’ ANIMAL TESTS SUPPORT RIVAL THEORIES
LABORATORY ANIMALS FAIL STROKE VICTIMS
RIFAMPICIN & THE PILL
RATS CAST DOUBT ON OLIVE OIL!
BLEACH HIGHLIGHTS FAULTY SKIN TESTS
TRANSPLANT RESEARCH MISDIRECTED
TRAGEDY HITS HEPATITIS VICTIMS
LEUKEMIC MICE FAIL CANCER PATIENTS
‘HARMLESS’ ULCER DRUG COULD CAUSE HEART FAILURE
BEAGLE DOGS MISLEAD CANCER RESEARCH
THALIDOMIDE
"HARMLESS" ANTIDEPRESSANT DAMAGED LIVER
TRAGEDY OF THE KILLER DUST
SMOKING DANGERS MASKED BY FALSE ANIMAL DATA
THE DOGMA OF DEATH
DOG DEATHS DENY WOMEN CONTRACEPTIVE OPTION
TRANSPLANT DRUG CAUSES UNEXPECTED KIDNEY DAMAGE
TOXIC TREATMENTS
MINOR TRANQUILLIZERS PRODUCE MAJOR PROBLEMS
CORTICOSTEROIDS & BIRTH DEFECTS
LIVER DAMAGE NOT PREDICTED … AGAIN!
ANIMAL TESTS MINIMISE RIOT GAS HAZARD
HEART DRUGS MAY HAVE KILLED 3,000
ANTIBIOTICS, GUINEA PIGS, AND HAMSTERS
DAUGHTERS OF DES
THE FIRST BETA-BLOCKERS
X-RAYS & CANCER
STEROIDS & THE IMMUNE SYSTEM
UNEXPECTED EYE PROBLEMS LED TO DRUG REJECTION
PETHIDINE ADDICTION
THE METHANOL SCANDAL
ANIMAL VICTIMS ESCALATE AFTER ICI DRUG FAILS
OBESITY DRUG’S HORRIFIC SIDE EFFECTS
RESEARCH ‘PARALYZED’ BY ANIMAL MODELS



‘SAFE’ ANTIBIOTIC’S FATAL FLAW


Animal experiments suggested that chloramphenicol was a very safe drug, but clinical experience soon revealed serious side-effects, making it no longer suitable for internal use except in life-threatening infections such as typhoid fever. In France, chloramphenicol has been completely withdrawn.1


In 1952, physicians in Baltimore drew attention to chloramphenicol’s effects on nerve cells in the body.2 They described a patient who almost became blind and who suffered such severe pain in her feet that she could only walk with the aid of pain-killing narcotic drugs. She had been taking chloramphenicol for 5 months. This was the first of many cases of optic and peripheral neuritis caused by chloramphenicol, yet animal experiments had shown the drug to be practically free of side-effects, even after prolonged administration.2


Even more seriously, the drug caused aplastic anaemia, an often fatal blood disease sometimes terminating in leukemia. Once again, the effect had not been predicted by animal tests, and the British Medical Journal records how chloramphenicol produced nothing worse than transient anaemia in dogs given the drug for long periods by injection, and nothing at all when given orally.3


Today we know that chloramphenicol’s deadly side-effect can be identified by test-tube studies with human bone marrow cells.4


REFERENCES:
1) C. Spriet-Pourra & M. Auriche, Drug Withdrawal from Sale (PJB Publications, 1988)
2) L. Wallenstein & J. Snyder, Annals of Internal Medicine, 1952, vol. 36, 1526-1528.
3) British Medical Journal, 1952, July 19, 136-138.
4) G.M.L. Gyte & J.R.B. Williams, ATLA, 1985, vol. 13, 38-47.



BODY CHEMICALS PRODUCE OPPOSITE EFFECTS IN ANIMALS


An important area of medical research is pharmacology, where scientists study exactly how drugs and natural body substances exert their effects on the tissues. An understanding of the chemical processes involved can be valuable in providing a more rational basis for the design of new treatments. Unfortunately, many pharmacologists rely on animals despite numerous contradictory results. As a result of experiments with dogs, acetylcholine, a chemical produced by nerve endings, was widely believed to dilate the coronary arteries. But in human coronary tissue it causes a narrowing of the vessels, which is thought to lead to heart spasm in a living person.1 Another body chemical, bradykinin, relaxes blood vessels in human brain tissue but contracts them in dogs.2


Further species differences have been found with leukotrienes (LT), natural substances involved in inflammation. Leukotrienes known as LTC4 and LTD4 constrict blood vessels in the guinea pig’s skin but dilate corresponding tissues from people and pigs.3 Yet another case is the prostaglandins (PG), a family of substances discovered over 50 years ago in human seminal fluid: in heart tissue from cats and rabbits, PGE1 has no effect on contractile force or heart rate but increases them in rats, guinea pigs and chickens.4


Some pharmacologists have recognized that “direct extrapolation from animals to humans is frequently invalid," so that “recently much interest has focused on use of human autopsy or biopsy tissue as a means of overcoming these limitations."5


REFERENCES:
1) S. Kalsner, Journal of Physiology, 1985, vol. 358, 509-526.
2) K. Schror & R. Verheggen, Trends in Pharmacological Sciences, 1988, vol. 9, 71-74.
3) P.J. Piper et al, Annals of the New York Academy of Sciences, 1988, vol. 524, 133-141.
4) S. Bergstrom et al, Pharmacological Reviews, 1968, vol. 20, 1-48.
5) Trends in Pharmacological Sciences, 1987, vol. 8, 289-290.



WOMEN AT RISK FROM “PILL" SAFETY TESTS


Careful observation of women taking the pill has shown that the most serious side-effects are on the circulatory system: there is an increased risk of blood clots leading to heart attacks, strokes and lung diseases. By 1980, Britain’s Committee on Safety of Medicines (CSM) had received reports of 404 deaths.1 Further studies found that 1-5% of women taking the pill have raised blood pressure.


None of these problems had been identified by animal experiments.2 Furthermore, in some species oral contraceptives produced the opposite effect, making it more difficult for the blood to clot!3 As Professor Briggs of Deakin University in Australia points out, “Many experimental toxicity studies have been conducted on contraceptive oestrogens, alone or in combination with progesterones. At multiples of the human dose no adverse effect on blood clotting was found in mice, rats, dogs or non-human primates. Indeed, far from accelerating blood coagulation, high doses of oestrogens in rats and dogs prolonged clotting times. There is therefore no appropriate animal model for the coagulation changes occurring in women using oral contraceptives."4


In 1972, the CSM described tests on over 13,000 animals which showed that very high doses of oral contraceptives cause cancer.5 But the rats and mice used in these experiments were so susceptible to cancer that even those not dosed with the pill (the “control” animals) suffered high levels of disease: for instance, lung and liver tumours were found in 25% and 23% of control mice, and adrenal, pituitary and breast tumours were found in 26%, 30% and 99% of control rats. Under these circumstances, the British Medical Journal notes, “It is difficult to see how experiments on strains of animals so exceedingly liable to develop tumours of these various kinds can throw any useful light on the carcinogenicity of any compound for man.” The Journal believed that the tests neither incriminated nor exonerated the pill and concluded that we would have to wait for the results of human studies.


The uncertainty of animal experiments has meant that, effectively, oral contraceptives have been tested by women themselves during long-term use.


REFERENCES:
1) G.R. Venning, British Medical Journal, 1983, January 22, 289-292.
2) R. Heywood in Animal Toxicity Studies: Their Relevance for Man, Eds. C.E. Lumley & S.R. Walker (Quay Publishing, 1990)
3) R. Heywood & P.F. Wadsworth in Pharmacology of Estrogens, Ed. R.R. Chaudhury (Pergamon Press, 1981).
4) M.H. Briggs in Biomedical Research Involving Animals, Eds. Z. Bankowski & N. Howard-Jones (CIOMS, 1984).
5) British Medical Journal, 1972, October 28, 190.



DOCTORS WARN AGAINST “SAFE" EYE TREATMENT


In 1951, physicians at the University of California Medical School in San Francisco warned ophthalmologists against the prolonged use of furmethide in the treatment of glaucoma.1 They noted that permanent obstruction of the tear passages occurred in over 70% of patients in whom the drug was used for more than 3 months.


Eleven years earlier, researchers had reported experiments on animals’ eyes, pronouncing the drug “entirely safe" and worthy of clinical trial.2 The tests were performed on rats, guinea pigs and rabbits and continued for several months.


REFERENCES:
1) R.N. Shaffer & W.L. Ridgway, American Journal of Ophthalmology, 1951, vol. 34, 718-720.
2) A. Myerson & W. Thau, Archives of Ophthalmology, 1940, vol. 24, 758-760.



ANIMAL TESTS CONFUSE PAINKILLER PROBE


Since 1953 when doctors first drew attention to the kidney damage associated with prolonged use of combination painkillers, there have been many animal experiments to try and clarify the effects seen in people. In fact, these have only obscured the issue. For example, interest centered on which ingredient was responsible, and although suspicion naturally fell on phenacetin since this was present in most analgesic mixtures, the characteristic kidney damage seen in patients could not be reproduced in animals.1


The experiments also suggested that aspirin rather than phenacetin was to blame in painkillers containing the two drugs.2 This is because, unlike phenacetin, aspirin readily induces kidney damage in laboratory animals. Eventually, human studies showed that phenacetin was indeed a major culprit.3


So contradictory were the experiments that a major analysis of the subject concluded that if doctors had not first observed the effects in patients, they would never have been suspected, foreseen or predicted by animal tests.1 Phenacetin was finally withdrawn in 1980 when there were also suspicions that it caused cancer.


REFERENCES:
1) I. Rosner, CRC Critical Reviews in Toxicology, 1976, vol. 4, 331-352.
2) British Medical Journal, 1970, October 17, 125-126.
3) K.G. Koutsaimanis & H.E. de Wardener, British Medical Journal, 1970, October 17, 131-134.


DRUG INDUCED DISASTER LEAVES THOUSANDS DEAD


A major disaster occurred in the UK during the 1960’s when at least 3,500 young asthma sufferers died following the use of isoprenaline aerosol inhalers.1 Fatalities were reported in countries using a particularly concentrated form of aerosol that delivered 0.4mg of isoprenaline per spray.2, 3 Fortunately, the death rate declined rapidly when the drug was made “prescription only" and warnings were issued to doctors.


Attempts to replicate the effects in laboratory animals proved difficult. In 1971 researchers at New York’s Food and Drug Research Laboratory reported that “Intensive toxicologic studies with rats, guinea pigs, dogs and monkeys at dosage levels far in excess of current commercial metered dose vials … have not elicited similar adverse effects."4


Experimenters persisted in their attempts however, and eventually found that by artificially reducing the amount of oxygen in the animal’s tissues, they could increase the toxic effects of isoprenaline on the heart.5


REFERENCES:
1) W.H. Inman in Monitoring for Drug Safety, Ed. W.H. Inman (MTP Press, 1980).
2) P.D. Stolley, American Review of Respiratory Diseases, 1972, vol. 105, 883-890.
3) P.D. Stolley & R. Schinnar, Lancet, 1979, October 27, 896.
4) S. Carson et al, Pharmacologist, 1971, vol. 18, 272.
5) British Medical Journal, 1972, November 25, 443-444.



VIVISECTION UNDERMINES MINERS’ WELFARE


During the 20th century, there has been much debate over the actual cause of pneumoconiosis, a lung disease suffered by coal miners because of their occupation. For many years, scientists believed that inhalation of coal dust was “completely innocuous” and that any respiratory disease arose from the silica that sometimes contaminated the coal. In bituminous coal pits, where there is little exposure to silica, mining was not considered dangerous and consequently few observational studies were carried out in the US between 1900 and 1960. As a result, there was almost no information on the extent of coal workers’ pneumoconiosis until the Public Health Service conducted studies in 1962/63.1


The idea that coal dust was harmless originated primarily from the vivisection laboratory. According to an editorial in the British Medical Journal,2 scientists who believed silica to be the responsible contaminant, “take their strongest stand on the fact that animal experiments … have with few exceptions shown that pure coal dust produces no fibrogenic reaction." Fibrosis is the formation of scar tissue and a clear sign of damage to the lung. In fact, the experimental evidence exonerated pure coal dust and pointed to silica as the cause of respiratory disease.3


However, the animal data were contradicted by the discovery that men who worked with pure coal dust or carbon alone, also developed pneumoconiosis.1, 2 Such evidence shows that coal dust can cause lung disease even in the absence of silica. The experimental results were further undermined when coal dust, collected at a coal face where pneumoconiosis among miners was high, proved innocuous to laboratory rats!2


REFERENCES:
1) W.K.C. Morgan in Occupational Lung Diseases, Eds. W.K.C. Morgan & A. Seaton (Saunders, 1982).
2) British Medical Journal, 1953, January 17, 144-146.
3) L.U. Gardner, Journal of the American Medical Association, 1938, November 19, 1925-1936; Chronic Pulmonary Disease in South Wales III Experimental Studies, Medical Research Council Special Report Series No. 250 (HMSO, 1945).



ANIMAL TESTS USED TO PROMOTE ‘SUPERIOR’ ARTHRITIS DRUG


A major hazard of the anti-inflammatory drugs used to treat arthritis is that they damage the stomach.1 So serious is the problem that any drug free of this side effect would have an enormous advantage over its competitors.


The anti-inflammatory drug Surgam appeared to offer this advantage: it was promoted by its manufacturer, Roussel Laboratories, as giving “gastric protection”. However, the claims were made on the basis of animal tests and could not be confirmed in clinical trials. As a result of their promotional claims, Roussel were found guilty of misleading advertising and fined £20,000. A report of the case in the Lancet described how expert witnesses for both sides “… agreed that animal data could not safely be extrapolated to man.”2


REFERENCES:
1) R. Cockel, Gut, 1987, 515-518.
2) J. Collier & A. Herxheimer, Lancet, 1987, January 10, 113-114.



BLOOD CELL DAMAGE MISSED BY ANIMAL TESTING


The antidepressant drug mianserin can cause potentially fatal blood disorders and the British National Formulary recommends that patients should have full blood counts every 4 weeks during the first 3 months of treatment.1 By early 1988 the World Health Organization Collaborative Centre for International Drug Monitoring had collected 321 reports referring to white blood cell disorders. The effects had not been predicted by animal tests,2 but subsequent studies showed that they could be observed in test tube experiments with human tissues.3


REFERENCES:
1) British National Formulary, No. 26 (BMA & the Royal Pharmaceutical Society of GB, 1993)
2) H.M. Clink, British Journal of Clinical Pharmacology, 1983, vol. 15, 291S-293S.
3) P. Roberts et al, Drug Metabolism & Disposition, 1991, vol. 19, 841-843.



BLOOD PRESSURE PILL LEADS TO WITHDRAWAL SYNDROME


During the 1960’s animal experiments suggested that clonidine might be a useful drug for preventing migraine. Using cats, experimenters found that clonidine interfered with physiological processes thought to cause headaches. The drug was introduced in 1969 but clinical experience now suggests that clonidine is largely ineffective and little better than a dummy pill.1


Clonidine proved more successful in the treatment of high blood pressure. Its ability to lower blood pressure was discovered accidentally when it was given to people as a nasal decongestant.2 Although effective, there were serious unexpected side-effects when patients stopped taking the drug: the “clonidine withdrawal syndrome” is characterized in extreme cases by sweating, trembling, rapid heart beat and a dangerous rise in blood pressure. The symptoms may occur after one or two missed doses or even after gradual withdrawal over 3 days.


Attempts to replicate the condition in dogs and cats produced inconsistent results3 whilst in the rat “… attempts to reproduce the clonidine discontinuation syndrome … have met with even more difficulties and controversy than those encountered in dogs and cats." ‘Success’ was only achieved when researchers implanted a special pump into the rat’s body to maintain adequate levels of clonidine in the bloodstream prior to withdrawal.4


In view of its serious side-effects, the Drug and Therapeutics Bulletin considers clonidine obsolete for the treatment of high blood pressure.5


REFERENCES:
1) Drug & Therapeutics Bulletin, 1990, vol. 28, 79-80.
2) A.S. Niles in Clinical Pharmacology: Basic Principles in Therapeutics, 2nd Edition, Eds. K.L. Melmon & H.F. Morrelli (MacMillan, 1978).
3) L. Hansson et al, American Heart Journal, 1973, vol. 85, 605-610.
4) M.J.M.C. Thoolen et al, General Pharmacology, 1981, vol. 12, 303-308.
5) Drug & Therapeutics Bulletin, 1984, vol. 22, 42-43.



SLIMMING TREATMENT CAUSED CATARACTS


In 1933, following “thorough" experiments on animals, researchers described the use of dinitrophenol as a treatment for obesity. However, doctors soon noticed that the drug unexpectedly caused cataracts in some of their patients, and nearly 200 cases were reported before the drug was prohibited for internal use. Attempts were made to replicate the clinical findings in rats, rabbits, guinea pigs and dogs but none of the experiments produced any change in the lens of the eye.1 In 1942, a summary of the tests stated that “All attempts to produce experimental cataracts in laboratory animals by various and repeated doses of dinitrophenol have been unsuccessful."2


Although birds are rarely used for the safety testing of drugs, later experiments accidentally discovered that cataracts could be induced in chicks dosed with dinitrophenol in their food.1


Similar problems were encountered with triparanol (Mer-29), a drug used to lower cholesterol levels. The cataracts observed in human patients could be induced in rats and dogs after very high doses but not in rabbits and monkeys.3 Triparanol was withdrawn in 1962.


REFERENCES:
1) B.H. Robbins, Journal of Pharmacology, 1944, vol. 80, 264-269.
2) Reproduced in ref. 1.
3) W.M. Grant, Toxicity of the Eye, 2nd edition (Charles Thomas, 1974).



ANIMALS MIX DEADLY COCKTAIL OF CONFUSING RESULTS


For centuries, alcohol has been regarded as poisonous for the liver.1 That is, until the first half of the 20th century when it was cleared of liver toxicity following experiments on animals.1, 2 In 1934, a summary of animal tests concluded that “experimental evidence has not substantiated the belief that alcohol is a direct cause of cirrhosis."3 Based largely on experiments with rats, researchers later argued that “there is no more evidence of a specific toxic effect of pure ethyl alcohol upon liver cells than there is for one due to sugar."4 Today, alcohol is once again considered a liver toxin but since it has proved so difficult to induce cirrhosis in laboratory animals, there are still some who doubt the evidence.5


Animal experiments have proved misleading in other areas of alcohol research. Although it has been known for decades that too much alcohol can cause cancer, this well established clinical fact has been questioned because it proved impossible to induce the disease in animals. Indeed, some insist that alcohol should not be classified as a human carcinogen since there is no evidence from animal experiments!6


Alcohol seems more toxic to the circulatory system of humans than animals, and whereas prolonged consumption raises the blood pressure in alcoholics, this is not usually the case in rats.7 And whilst alcohol can damage the human heart, “Studies on a variety of animals being given large amounts of ethanol (alcohol) over long periods of time did not lead to heart failure. Also, until recently when the heart of the Nicholas turkey was shown to be susceptible to alcohol damage, there has been no animal model of alcoholic cardiomyopathy (heart muscle damage) as it is seen in man.”7


During the early 1970’s researchers described how alcohol could induce physical dependence in mice. The experiments showed that the tranquilizing drug Librium could reduce the severity of withdrawal convulsions, but also suggested that the treatment had a lethal side-effect with some of the animals dying.8 Fortunately, clinical studies carried out 6 years earlier had already shown that Librium was effective9 and the drug remains an important treatment for alcohol withdrawal symptoms.


Despite the known effects of alcohol and the availability of human tissues to supplement clinical observations, there seems no shortage of funds for animal experiments. A report by the National Research Information Centre, compiled by Murry Cohen MD and Constance Young, revealed that the US Government funded 284 alcohol research projects involving animals during 1986, costing nearly $24 million.10 “Animal research", they concluded, “has had no significant effect on our knowledge of alcohol-use disorders."


REFERENCES:
1) H.J. Zimmerman, Alcoholism: Clinical & Experimental Research, 1986, vol. 10, 3-15.
2) C.S. Lieber & L.M. DeCarli, Journal of Hepatology 1991, vol. 12, 394-401.
3) V.H. Moon, Archives of Pathology, 1934, vol. 18, 381-424.
4) Reported in ref. 2.
5) R.F. Derr et al, Journal of Hepatology, 1990, vol. 10, 381-386.
6) L. Tomatis et al, Japanese Journal of Cancer Research, 1989, vol. 80, 795-807.
7) J.V. Jones et al, Journal of Hypertension, 1988, vol. 6, 419-422.
8) D.B. Goldstein, Journal of Pharmacology & Experimental Therapeutics, 1972, vol. 183, 14-22.
9) G. Sereny & H. Kalant, British Medical Journal, 1965, January 9, 92-97.
10) M. Cohen & C. Young, Alcoholic Rats, The National Research Information Centre, 1989.



SUPROFEN JOINS BANNED LIST


The arthritis drug suprofen (Suprol) was withdrawn worldwide in May 1987 following reports of kidney problems and pain in the side of the body. Patients experiencing these side-effects had to have their kidney function monitored for 2 years after they stopped taking the drug. The dangers were unexpected because “In animal studies suprofen has been shown to have an excellent safety profile. No significant effects were observed on cardiac, renal [kidney] or central nervous system parameters in several species."


REFERENCES:
1) Drug Withdrawal from Sale, C. Spriet-Pourra & M. Auriche (PJB Publications, 1988).
2) FDA Drug Review: Postapproval Risks, 1976-1985 (US General Accounting Office, April 1990).
3) A. Yeadon et al, Pharmacology, 1983, vol. 27, Suppl. I, 87-94.



MIGRAINE PILL’S HORRIFIC SIDE EFFECT


The British National Formulary (1993) warns that methysergide, a treatment for migraine, should only be administered under hospital supervision because of dangerous side-effects: abnormal formation of fibrous tissue, which can lead to obstruction of abdominal blood vessels and blockage of the tube carrying urine from the kidneys to the bladder. Fibrotic damage to the heart valves has also been reported and can result in heart failure.


Methysergide’s life-threatening side-effects were not predicted by animal tests,1 nor could they be induced during subsequent experimentation, and a report in the British Medical Journal notes that “Attempts to reproduce these fibrotic lesions in animals have been unsuccessful."2


REFERENCES:
1) R. Heywood in Animal Toxicity Studies: Their Relevance for Man, Eds. C.E. Lumley & S.R. Walker (Quay Publications, 1990).
2) K.A. Misch, British Medical Journal, 1974, May 18, 365-366.



LEUKEMIA & THE NUCLEAR INDUSTRY


In 1983 a television documentary programme drew attention to an increased number of childhood leukemia cases in the vicinity of the nuclear reprocessing plant at Sellafield in Britain. Although the incidence of leukemia was 10 times the national average, the official Committee of Inquiry decided the nuclear facility was not the cause. Their conclusions were based on calculations from animal experiments. By preferring animal data to direct human observations, the Committee effectively minimized the risks of radiation.1


Subsequently, a major investigation concluded that radiation was indeed to blame, for those at highest risk of leukemia were born to fathers who worked at the nuclear plant.2 Not all studies supported these findings and clarification must await further epidemiological research. Nevertheless, the observations linking leukemia clusters to nuclear plants did persuade the Ministry of Defence and the government’s Health and Safety Executive to recommend major cuts in the maximum radiation doses to which workers are legally exposed.3


REFERENCES:
1) E. Millstone in Animal Experimentation: The Consensus Changes, Ed. G. Langley (MacMillan, 1989).
2) M.J. Gardner et al, British Medical Journal, 1990, February 17, 423-429.
3) The Guardian, 1991, March 22 and April 30.



TRANSPLANT DRUG ALMOST LOST


The life-saving qualities of a new anti-rejection drug, FK506, could have been missed when animal experiments suggested it was too toxic for human use.1 The tests were carried out at Cambridge University in England and showed that “… animal toxicity was too severe to proceed to clinical trial”.2 US researchers, however, decided it was worthy of further investigation but nevertheless did not feel justified in first giving the drug to healthy volunteers, the usual practice in drug development, since this could be “potentially dangerous.”3 Instead, FK506 was administered as a last chance option to liver transplant patients in “desperate plight”. So far, clinical experience with FK506 has been very promising.4


Animal tests also proved misleading in suggesting that FK506 would give better results if combined with another antirejection drug, cyclosporin. However, clinical trials revealed the opposite, with FK506 actually increasing the kidney damage caused by cyclosporin.3


REFERENCES:
1) R. Allison, Journal of the American Medical Association, 1990, April 4, 1766.
2) R.Y. Calne et al, Lancet, 1989, July 22, 227.
3) T.E. Starzl et al, Lancet, 1989, October 28, 1000-1004.
4) J. Neuberger, Hepatology, 1991, vol. 13, 1259-1260.



THE CHLOROFORM CONTROVERSY


The anaesthetics ether, nitrous oxide and chloroform originated from experiments carried out by physicians and scientists on themselves, and, together with the introduction of hygienic conditions, enabled surgery to emerge from the dark ages.1 Because of their high safety profile, nitrous oxide and ether have stood the test of time. In the case of chloroform, entrenched attitudes and contradictory animal experiments allowed a toxic drug to outlive its value and remain in use for over 100 years.2


Deaths from chloroform were reported almost weekly during the second half of the 19th century and between 1887 and 1896 there were 376 fatalities in England and Wales. Many believed the deaths resulted from respiratory failure but that risks could be minimized by appropriate administration of the drug and by devoting attention to the patient’s breathing in order to detect early warning signs. The alternative (correct) explanation, that chloroform has a direct effect on the heart, was discounted.


Unfortunately, animal experiments carried out by the Hyderabad Commissions of 1888 and 1889 supported the view that chloroform affects the respiration rather than the heart.2 In the famous telegram to the Lancet,3 Lauder Brunton summarized results from the Second Commission: “Four hundred and ninety dogs, horses, goats, cats and rabbits used … Results most instructive. Danger from chloroform is asphyxia or overdose: none whatever heart direct." Anaesthetists must have been reassured to hear Brunton’s conclusion that chloroform “never causes sudden death from stoppage of the heart." In 1893, clinical observations completely contradicted the conclusions from Hyderabad and showed that heart failure is the commonest cause of death from chloroform.2 Nevertheless, use of the drug continued until the 1950’s and the Hyderabad Commissions were later blamed for failing to recognize species differences.


REFERENCES:
1) R. Sharpe, The Cruel Deception: the use of animals in medical research (Thorsons, 1988).
2) K.B. Thomas, Proceedings of the Royal Society of Medicine, 1974, vol. 67, 723-730.
3) Lancet, 1889, December 7, 1183.



“NON-TOXIC" OINTMENT PROVES DANGEROUS


The success of selenium disulphide (Selsun) as an antidandruff shampoo led to the suggestion that it might also be useful for the treatment of blepharitis, a similar but painful condition involving the eyelids. Trials were carried out in which an ointment containing 0.5% selenium disulphide was applied to the lid margins. However, the ointment proved irritating if it accidentally came into contact with the conjunctiva and one patient developed “moderately severe conjunctivitis.”1 In contrast, animal experiments have shown that “Selenium disulphide 0.5% ophthalmic ointment is nontoxic to rabbit corneas or conjunctivas” (emphasis added).2


REFERENCES:
1) G.C. Bahn, Southern Medical Journal, 1954, vol. 47, 749-752.
2) J.W. Rosenthal & H. Adler, Southern Medical Journal, 1962, March, 318.



FATAL DIURETIC SEEMED SAFE


The diuretic drug Selacryn was introduced in 1979 but withdrawn from the US market only a year later after 363 reports of liver damage including 24 fatalities.1 In many other countries, including the UK, development of the drug was cancelled.2 Selacryn’s harmful effects were unexpected since they had not been detected in animal experiments.1


REFERENCES:
1) S. Takagi et al, Toxicology Letters, 1991, vol. 55, 287-293.
2) C. Spriel-Pourra & M. Auriche, Drug Withdrawal from Sale (PJB Publications, 1988).



CANCER-PRONE MICE CONTRADICT HUMAN EXPERIENCE


Butadiene is an important intermediate in the production of synthetic rubber but causes cancer in the B6C3F1 strain of laboratory mouse, an animal widely used to assess the risk of chemicals. Tumours have also been found in rats, although the dose was very high.


Based on the experiments with B6C3F1 mice, America’s National Institute for Occupational Safety and Health (NIOSH) has classified butadiene as a carcinogen, estimating that exposure to 2 parts per million for 45 years would result in 597 cancers per 10,000 people. However, careful observation of butadiene plant workers employed since 1945, and exposed to much higher levels of the chemical, revealed no extra cancers. On the contrary, overall cancer deaths were considerably less than among the ordinary public.1


The NIOSH findings have been criticized since there are many differences between people and the cancer-prone B6C3F1 mouse. According to an editorial in the journal Science,1 “with trillions of dollars, loss of competitiveness, and jobs at stake, a searching review of the risk assessment methodology of the regulatory agencies is overdue.”


REFERENCES:
1) P.H. Abelson, Science, 1992, June 19, 1609.



DOCTORS WARNED ABOUT HALOTHANE LIVER TOXICITY


Halothane was introduced into clinical practice in 1956 and immediately hailed as a great advance in anaesthesia. Unfortunately, the anaesthetic was soon found to harm the liver and within 5 years, at least 350 cases of “halothane hepatitis" had been recorded. The condition sometimes proves fatal and between 1964 and 1985, 180 British deaths were linked to the drug.1


The original animal tests had shown no evidence of liver damage,2 and “early attempts to produce an animal model of halothane hepatitis proved disappointing," according to anaesthetists at Edinburgh’s Royal Infirmary. Nevertheless, there has been no shortage of experiments: since 1976 five “animal models" have been described though “their application to humans is of doubtful significance."3


By 1986, when Britain’s Committee on Safety of Medicines strengthened the warnings of liver toxicity in human patients,4 it was still not clear whether the same injuries could be induced in animals.5


REFERENCES:
1) British Medical Journal, 1986, April 5, 949.
2) Anaesthesiology, 1963, vol. 24, 109-110.
3) D.C. Ray & G.B. Drummond, British Journal of Anaesthesia, 1991, vol. 67, 84-99.
4) Scrip, 1987, October 2, 2.
5) C.E. Blogg, British Medical Journal, 1986, June 28, 1691-1692.



SHOCK TREATMENT


For years, high doses of corticosteroids have been recommended for the treatment of septic shock, a condition which leads to heart, kidney, and respiratory failure in a high proportion of patients. The idea was based on animal experiments where corticosteroids improve survival when given before shock1 or shortly afterwards.2


It has been pointed out, however, that “… extrapolation of data from experimental models of shock to the clinical setting may be dangerous and misleading."3 So it is not surprising that an analysis of clinical trials by the Drug & Therapeutics Bulletin found that “high-dose corticosteroids are ineffective for the prevention or treatment of shock associated with sepsis. They do not improve outcome, and make secondary infection worse. They may harm patients with impaired renal (kidney) function." For instance, one trial found that corticosteroids not only failed to prevent or reverse shock but actually seemed to increase deaths amongst patients, even though treatment was initiated within 2 hours.4


REFERENCES:
1) Drug & Therapeutics Bulletin, 1990, vol. 74-75.
2) S.G. Hershey in Anaesthesiology: Proceedings of the VI World Congress of Anaesthesiology, Mexico City, April 1976, Eds. E. Hulsz et al (Exerpta Medica, 1977).
3) A.S. Niles in Clinical Pharmacology: Basic Principles in Therapeutics, Eds. K.L. Melmon & H.F. Morrelli (MacMillan, 1978).
4) R.C. Bone et al, New England Journal of Medicine, 1987, September 10, 653-658.



ARSENIC AND THE DECADES OF FAILURE


It was fortunate that so much human evidence linked arsenic to cancer because for over 70 years, researchers were unable to “confirm” the dangers in laboratory animals. Suspicions that arsenic might cause cancer date back to 1809 when its harmful effects in drinking water were first noted.1 In 1887/88, Sir Jonathan Hutchinson described the earliest cases of cancer resulting from medicinal use of arsenic and subsequently, others have reported cancers in chemical, agricultural and metallurgical workers exposed to arsenic.2


Animal tests began in 1911, and an historical analysis of the subject published in 1947 described how dozens of experiments had been performed.1 However, these had given “only doubtful results.” The tests continued but still proved negative, and in 1969 researchers at America’s National Cancer Institute stated that “arsenic has been suspected by many investigators as a carcinogen in man, though there is no supporting evidence from animal experiments.”3 And in 1977 a further summary of the data concluded that “there is little evidence that arsenic compounds are carcinogenic in experimental animals.”2


Finally, in the late 1980’s, scientists managed to produce cancer in animals. This was 180 years after arsenic was first suggested as a human carcinogen. Despite decades of failure, animal researchers had at least been correct about one thing: in 1962 Heuper and Payne wrote that “With perseverance and some luck arsenicals one day may be shown to cause cancer in animals."4


REFERENCES:
1) O. Neubauer, British Journal of Cancer, 1947, vol. 1, 192-251.
2) F.W. Sunderman Jr. in Advances in Modern Toxicology, vol. 2, Eds. R.A. Goyer & M.A. Mehlman (Wiley, 1977).
3) A.M. Lee & J.F. Fraumeni Jr., Journal of the National Cancer Institute, 1969, vol. 42, 1045-1052.
4) W.C. Heuper & W.W. Payne, Archives of Environmental Health, 1962, vol. 5, 459.



PESTICIDE POISONING


In February 1986 the British Parliament’s Agriculture Committee began an enquiry into pesticides and human health. The Committee learnt that great reliance is placed on animal experiments but that “… similar tests in different animal species often yield quite different results.”1 An example is the organophosphate pesticide dipterex, which produces nerve damage in people but not in the animal tests specially designed to detect such injuries.2 In fact, Dr. Murray of the National Poisons Unit informed the Committee that one well documented case of human poisoning is equivalent to 20,000 animal experiments!


The Committee concluded that “it cannot be satisfactory to rely on animals so much as a means of testing and, as other forms of testing become available, we recommend that they be adopted … we are satisfied from the evidence that we have received that animal testing can produce misleading results."1


REFERENCES:
1) Special Report of the House of Commons Agriculture Committee, reproduced in FRAME News, 1987, No. 16, p. 2.
2) A.N. Worden in Animals and Alternatives to Toxicity Testing, Eds. M. Balls et al (Academic Press, 1983).



NATURAL SKIN SUBSTANCE HARMS ANIMALS


Squalene is a natural constituent of human sebum, the substance formed by sebaceous glands around the roots of hairs to keep the skin lubricated and supple. Although a natural human product, squalene has still been applied to the skin of rabbits and guinea pigs, where it actually produced hair loss. This is obviously not the case in people1, and it has been extensively and safely employed in cosmetics.2


REFERENCES:
1) B. Boughton et al, Journal of Investigative Dermatology, 1955, vol. 24 179-189.
2) M.M. Rieger & G.W. Battista, Journal of the Society of Cosmetic Chemists, 1964, vol. 15, 161-172.



ANIMAL-TESTED ARTHRITIS DRUGS KILLED THOUSANDS


Phenylbutazone (Butazolidine) was once widely employed for the treatment of arthritis but reports of aplastic anaemia, an often fatal blood disease caused by damage to the bone marrow, led to the drug’s withdrawal in some countries and to its restriction in others, notably America, France and the UK.1


On the basis of animal tests, phenylbutazone had seemed a safe drug, with no toxic effects observed in rats even after administration of 5–10 times the dose used for people.2 In particular, phenylbutazone’s harmful effect on the bone marrow had not been predicted,3 and one year after marketing, researchers noted that “there have been no published reports of serious effects … on the hemopoietic (blood forming) system … in the experimental animal.”4 Later research showed that the dangers could be identified by test-tube experiments with human bone marrow cells.5


It has been estimated that phenylbutazone and oxyphenbutazone, a closely related drug that also causes aplastic anaemia, have been responsible for 10,000 deaths worldwide.6 Oxyphenbutazone (Tanderil) was withdrawn altogether in 1985.


REFERENCES:
1) C. Spriet-Pourra & M. Auriche, Drug Withdrawal from Sale (PJB Publications, 1988).
2) C. Hinz & L.M. Gaines, Journal of the American Medical Association, 1953, vol. 151, 38-39.
3) R. Heywood in Animal Toxicity Studies: Their Relevance for Man, Eds. C.E. Lumley & S.R. Walker (Quay Publishing, 1990).
4) O. Steinbrocker et al, Journal of the American Medical Association, 1952, November 15, 1087-1091.
5) C.S. Smith et al, Biochemical Pharmacology, vol. 26, 847-852.
6) Estimate by Dr. Sidney Wolfe in Lancet, 1984, February 11, 353.



MENTHOL & EYE IRRITATION


Menthol is an ingredient of many cough and cold remedies and is used as an inhalant to relieve symptoms of bronchitis, sinusitis and similar conditions. It can also be used as an ointment for application to the chest or nostrils. If menthol accidentally comes into contact with the eye, it produces a temporary burning sensation lasting 15–30 minutes, but there are no after effects. In contrast, menthol causes “severe damage” to the rabbit’s eye.1


REFERENCE:
1) W.M. Grant, Toxicology of the Eye, 2nd Edition (Charles Thomas, 1974).



MORPHINE MANIA


Morphine remains the most valuable analgesic for severe pain1 yet has such a peculiar effect in some species that, had it been tested on, say, cats prior to human studies, it could have been rejected. In these animals the drug produces a condition known as “morphine mania” which leaves them highly excitable and apprehensive. Their movements are irregular and jerky, and their pupils are abnormally dilated.2 While morphine produces hyperexcitement in cats, it has the opposite, calming effect on people,3 whose pupils may be contracted.1 Fortunately the drug was discovered through human studies and only later tested on animals.4


REFERENCES:
1) British National Formulary, no. 26 (BMA & the Royal Pharmaceutical Society of G.B., 1993)
2) F.M. Sturtevant & V.A. Drill, Nature, 1957, June 15, 1253.
3) B. Brodie, Clinical Pharmacology & Therapeutics, 1962, vol. 3, 374-380.
4) J.T. Litchfield in Drugs in our Society, Ed. P. Talalay (Johns Hopkins, 1964).



MONKEY EXPERIMENTS PUT MALARIA PATIENTS AT RISK


The use of monkeys to investigate malaria led to the suggestion that coma in human patients is due to an increased amount of protein in the cerebro-spinal fluid, and that this leakage could be corrected with steroids.1 But in people, steroids do not help with coma. On the contrary, they actually prove harmful.2 Among survivors, for instance, coma was prolonged by 16 hours, while complications such as pneumonia, urinary tract infections, convulsions and gastrointestinal bleeding developed more frequently in patients receiving steroids. Subsequent clinical observations of malaria victims have shown that “the monkey model may simply not be relevant.”1


REFERENCES:
1) Lancet, 1987, May 2, 1016.
2) D.A. Warrell et al, New England Journal of Medicine, 1982, February 11, 313-319.



DIARRHOEA TREATMENT LEAVES 10,000 VICTIMS


During the 1960’s, Japan suffered a devastating epidemic of drug-induced disease associated with clioquinol, the main ingredient of Ciba-Geigy’s antidiarrhoea medicines Enterovioform and Mexaform. At least 10,000 people, and perhaps as many as 30,000, were victims of SMON (subacute myelo-optic neuropathy), a new disease whose symptoms include numbness, weakness in the legs, paralysis and eye problems, including blindness.1 In 1970 Japan’s Ministry of Health and Welfare banned the drug and 15 years later clioquinol was withdrawn worldwide.


Clioquinol’s harmful effects result from nerve damage yet animal experiments performed by the company revealed “no evidence that clioquinol is neurotoxic", tests being carried out on rats, cats, beagles and rabbits.2


Although some argue that “Animal tests have consistently failed to reproduce the effects seen in humans,”3 researchers at the Okayama University Medical School say they have induced clioquinol toxicity in mongrel dogs.4 Nevertheless, they note that different species respond differently, with monkeys, hens, cocks, and mice only mildly affected even after higher doses. They also found that beagle dogs were 3–4 times less sensitive to clioquinol than mongrels, and concluded that “These facts suggest strongly differences in strains as well as species of animals for the neurotoxicity of clioquinol.”


REFERENCES:
1) Lancet, 1977, March 5, 534.
2) R. Hess et al, Lancet, 1972, August 26, 424-425.
3) W. Sneader, Drug Development: From Laboratory to Clinic (Wiley)
4) J. Tateishi et al, Lancet, 1972, June 10, 1289-1290.



RODENT TESTS MISS INDUSTRIAL CANCER RISK


Benzene is used as a starting point for the production of industrial chemicals and for the manufacture of detergents, explosives and pharmaceuticals. It is also present in gasoline and was once commonly employed as a chemical solvent. Because benzene is so widely used, there has been considerable debate over the safety of exposed workers, especially since experience has shown it to be a cancer hazard.


Tragically, human evidence was once again undermined by the animal laboratory. According to Lester Lave of the Brookings Institution in Washington DC, “although there are reliable human data linking benzene to leukemia, scientists have been reluctant to categorise benzene as a carcinogen because there are no published reports that it induced leukemia in rodents.”1


In fact, 14 separate animal trials, starting in 1932, failed to show that benzene caused cancer.2 Only during the late 1980’s were researchers finally able to induce cancer in laboratory animals by dosing them with benzene.


REFERENCES:
1) L.B. Lave, The American Statistician, 1982, vol. 36, 260-261.
2) D.M. De Marini et al, in Benchmarks: Alternative Methods in Toxicology, Ed. M.A. Mehlman (Princeton Scientific Publishing Co. Inc., 1989).



VALUABLE EYE THERAPY WOULD NOT PASS RABBIT TEST


Chymotrypsin is widely used in ophthalmic surgery for the treatment of cataracts. Although recommended for human use,1 chymotrypsin is harmful to the rabbit eye. In his book Toxicology of the Eye (1974), Morton Grant states that “the rabbit cornea appears to differ significantly from the human cornea in its reaction to α-chymotrypsin. It has been noted repeatedly that introduction of α-chymotrypsin into the (rabbit’s) corneal stroma … leads to severe swelling reaction of cornea, much more than is seen in human beings, and in some instances leading to perforation of the cornea.”


REFERENCE:
1) British National Formulary, No. 26 (BMA and The Royal Pharmaceutical Society of G.B., 1993).



RABBIT TEST MISSES HUMAN EYE IRRITANT


Lindane is probably best known as an agricultural insecticide but very dilute lotions, creams and shampoos are used therapeutically for treating lice and scabies. Nevertheless such preparations can cause “excessive eye irritation" and conjunctivitis, and the British National Formulary (1993) warns users to “avoid contact with eyes". In rabbits, however, application of a far more concentrated solution produced only minimal effects. Furthermore, exposure to lindane in the form of a dust proved non-irritating to the eyes and nasal mucosa of rabbits but caused irritation to the eyes and respiratory passages of sensitive people.1


REFERENCE:
1) W.M. Grant, Toxicology of the Eye, 2nd Edition (Charles Thomas, 1974).



ANIMAL SKIN TESTS NOT UP TO SCRATCH


Many people suffer dermatitis when they come into contact with nickel compounds, which are considered potent skin sensitizers.1 Nickel is recognized as the single most common cause of contact dermatitis in women and many of those who suffer prolonged eczema receive disability pensions.2 In people exposed occupationally, the condition is known as “nickel itch”.


In contrast, nickel is not a potent skin sensitizer in most of the animal tests used to predict allergic responses.3 The Draize guinea pig test, for instance, suggests that nickel does not cause allergic reactions. Even in the two most widely used animal procedures, nickel produces either no response (the Buehler Test) or only a moderate response (the Maximization Test). Both methods also use guinea pigs.


REFERENCES:
1) Medical Toxicology, Ed. M.J. Ellenhorn & D.G. Barceloux (Elsevier, 1988).
2) Textbook of Dermatology, Vol. 1, 5th Edition, Eds. R.H. Champion et al (Blackwell Scientific Publications, 1992).
3) P.A. Botham et al, Food & Chemical Toxicology, 1991, vol. 29, 275-286.



DOG RESEARCH UNDERMINES HEART VALVE DEVELOPMENT


Dogs are favourite animals in cardiac research and many experiments were carried out to develop an artificial mitral valve. However, the artificial valves almost always produced fatal blood clots in these animals,1 with the result that many surgeons were deterred from carrying out human trials.2


Like other experimental surgeons, Starr and Edwards encountered the familiar problem of blood clots but eventually decided on a “caged-ball” device.3 Other designs were uniformly fatal to the animals and, whilst 6 of the 7 dogs receiving the caged-ball valve died, the new valve proved far more successful in clinical trials, where blood clotting was not a problem.4 The surgeons concluded that “the marked propensity of the dog to thrombotic occlusion (blood clotting) or massive embolization from a mitral prosthesis is not shared by the human being.”5


Starr and Edwards wanted to carry out further animal testing of their new caged ball device but could not use the valve that proved so successful in patients because it nearly always killed the dogs. Instead, they designed a different valve specifically for use in these animals! The modified valve did not kill the animals so frequently: even so, 78% still died within 46 days. It was noted that “species differences have therefore led to the use in this clinic of an unshielded ball valve for human mitral replacement and a shielded ball valve as the prosthesis of choice for further testing in the dog."5


The successful clinical application of another early design of mitral valve replacement cast further doubt on the value of animal research, since none of the dogs used in preclinical tests survived beyond 40 hours!6


REFERENCES:
1) A.V. Doumanian & F.H. Ellis, Journal of Thoracic & Cardiovascular Surgery, 1961, vol. 42, 683-695.
2) G.H.A. Clowes Jr., Annals of Surgery, 1961, vol. 154, 740.
3) A. Starr, American College of Surgeons, Surgical Forum, 1960, vol. 11, 258-260.
4) A. Starr & M.L. Edwards, Annals of Surgery, 1961, vol. 154, 726-740.
5) A. Starr & M.L. Edwards, Journal of Thoracic & Cardiovascular Surgery, 1961, vol. 42, 673-682.
6) N.S. Braunwald et al, Journal of Thoracic & Cardiovascular Surgery, 1960, vol. 40, 1-11.



ANGINA DRUG’S FATAL EFFECTS


Perhexiline was first marketed in France during the 1970’s as a treatment for angina. But concern over its side-effects, especially fatal cases of liver damage, led to withdrawal in the UK, while in some countries it was never licensed at all. Indeed, some argue that “its use should be completely avoided."1


The dangers were not predicted by animal tests2 and administration of high doses to several species for up to 2 years produced no effect on the liver.3 According to Richardson Merrell, the company marketing perhexiline, “… there has been an inordinate amount of animal work done. At this point we simply have been unable to induce hepatic (liver) disease in any species."4


Perhexiline’s harmful effects arise in individuals whose body chemistry has been altered by genetic factors, making them more sensitive to the drug. Reliance on animal tests can therefore be seriously misleading since they provide no basis for such subtle predictions.


REFERENCES:
1) D.G. McDevitt & A.M. MacConnachie in Meyler’s Side Effects of Drugs, 11th edition, Ed. M.N.G. Dukes (Elsevier, 1988).
2) C.T. Eason et al, Regulatory Toxicology & Pharmacology, 1990, vol. 11, 288-307.
3) J.W. Newberne, Postgraduate Medical Journal, 1973, vol. 49, April Suppl., 125-129.
4) Ibid., p.130.



SAFE CLEANING AGENTS DAMAGE ANIMAL VICTIMS


Researchers have discovered that coconut soap causes skin irritation in rabbits. During a comparison of human and animal test data for a selection of household and industrial products, Procter and Gamble scientists found that while coconut soap had a “negligible” effect on the skin of volunteers, it produced “moderate” irritation in rabbits. Pine oil cleaner also produced “moderate” irritation in rabbits (and guinea pigs) but only a slight effect on human skin.


Other substances which produced insignificant effects on human skin but irritation in animals included high and low carbonate detergents, phosphate detergents, enzyme detergent, sodium carbonate and even lemon juice! Overall, only 6 of 24 products tested had the same effects in people, rabbits and guinea pigs. The report concluded that “Neither the rabbit nor the guinea pig provides an accurate model for human skin. The skin responses of these animals differ in both degree and in kind from those found in human skin."1


Similar conclusions have been reached for cosmetic ingredients. Scientists at the Warner Lambert Research Institute in New Jersey note that “… animal skin is entirely different from human skin and that there may be no correlation between the mildness of a raw material on a rabbit’s back and its safety during use on a human face." They describe how the cosmetic ingredient isopropyl myristate is considered safe for use on the human body but causes irritation to rabbits.2


REFERENCES:
1) G.A. Nixon et al, Toxicology & Applied Pharmacology, 1975, vol. 31, 481-490.
2) M.M. Rieger & G.W. Battista, Journal of the Society of Cosmetic Chemists, 1964, vol. 15, 161-172.



HEART TREATMENT WITHDRAWN


Prenylamine, a treatment for angina, was withdrawn from the UK market in 1988,1 the main problem being that the drug caused ventricular tachycardia, a condition in which the heart beats abnormally fast. The side-effect caused patients to faint. In contrast, animal experiments carried out at the University of Göteborg in Sweden revealed that in cats, rabbits and guinea pigs, prenylamine reduced the heart rate by up to 25%.2 In cats, for instance, a dose of prenylamine reduced heart rate from 225 beats per minute to 171.


REFERENCES:
1) C. Spriel-Pourra & M. Auriche, Drug Withdrawal from Sale (PJB Publications, 1988).
2) H. Obianwu, Acta Pharmacologica et Toxicologica, 1967, vol. 25, 127-140.



COUGH REMEDY LEAVES OVERDOSE PATIENTS IN COMA


In 1984 a Milan Poison Control Centre reported 32 patients with severe neurological side-effects following an overdose of zipeprol, a cough suppressant.1 Symptoms included seizures and coma, and the Centre stated that “Zipeprol should be much more strictly controlled …” Animal tests had given no warning of severe neurological problems despite the use of higher doses.2


REFERENCES:
1) C. Moroni et al, Lancet, 1984, January 7, 45.
2) D. Cosnier et al, Drug Research, 1976, vol. 26, 848-854; G. Rispat et al, Drug Research, 1976, vol. 26, 523-530.



HEART DRUG FEARS GROUNDLESS


The vital heart drugs digoxin and digitoxin are pure substances extracted from digitalis, whose value in treating heart failure and cardiac arrhythmias was established through studies of human patients.1, 2 However, doctors must be careful not to give too high a dose as the drugs can then be toxic. It is fortunate that their development did not rely on animal experiments, since doses considered safe for rats, guinea pigs, dogs, and cats can actually kill human patients.3 Today we know that digoxin’s lethal dose is more accurately predicted by test-tube studies using human cells.4


Animal tests also suggested that digitalis raised the blood pressure, and as a result, it was once widely taught that the drug would be dangerous for certain patients and should not therefore be given. Thankfully, clinical observations eventually showed this to be incorrect and digitalis can be used with great benefit.2


REFERENCES:
1) W. Sneader, Drug Discovery: The Evolution of Modern Medicines (Wiley, 1985).
2) T. Lewis, Clinical Science (Shaw & Sons Ltd, 1934).
3) G.T. Okita, Federation Proceedings, 1967, vol. 26, 1125-1130.
4) R. Jover et al, Toxicology in Vitro, 1992, vol. 6, 47-52.



ANIMAL TESTS MASK NERVE DAMAGE RISK


In September 1983, the antidepressant zimelidine (Zelmid) was withdrawn worldwide following potentially serious side-effects including nerve damage, leading to loss of sensation or paralysis.1 Some patients also suffered hypersensitivity reactions such as fever, headache, muscle or joint pains, and liver problems. The drug had been introduced only a year earlier but Britain’s Committee on Safety of Medicines had received over 300 reports of adverse reactions, 60 of which were serious: there were 7 deaths.2 Prolonged tests in rats and dogs had shown no evidence of toxicity at 5 times the human dose.3


REFERENCES:
1) B. Blackwell in Side Effects of Drugs Annual, vol. 8, Eds. M.N.G. Dukes & J. Ellis (Elsevier, 1984).
2) R.D. Mann, Modern Drug Use, an Inquiry on Historical Principles (MTP Press, 1984).
3) R.C. Heel et al, Drugs, 1982, vol. 24, 169-206.



POISONING TESTS OFFER LITTLE HOPE TO OVERDOSE PATIENTS


For decades animals have been deliberately poisoned to death in lethal dose (LD50) toxicity tests, yet the results are of little value in preventing or treating human poisoning and can be misleading. According to their lethal doses in rats, aspirin would seem safer than another common painkiller, ibuprofen. In fact, human overdose experience reveals that ibuprofen is the safer drug.1 As physicians at London’s National Poisons Centre point out, “The ‘natural experiment’ of cases of self poisoning has to be taken as the starting point as the results of experiments on animals cannot reliably be extrapolated to man …"2


REFERENCES:
1) G.N. Volans in The Contribution of Acute Toxicity Tests to the Evaluation of Pharmaceuticals, Eds. D. Schuppan et al (Springer-Verlag, Berlin, 1986).
2) S. Cassidy & J. Henry, British Medical Journal, 1987, October 24, 1021-1024.



‘SAFE’ EYE SOLUTIONS FAIL THE HUMAN TEST


Detergents are not only used for domestic and industrial cleaning. In research aimed at increasing penetration of therapeutic drugs across the cornea, a number of dilute detergents were assessed in the eyes of volunteers. Although considered “generally harmless to rabbit eyes," some caused pain and irritation in people. For instance, a detergent called Brij 58 produced “alarming" changes to the surface of the human eye, together with discomfort and blurred vision.1 In rabbits Brij 58 is classified as a “non-irritant."2


A 3% solution of a similar product, Brij 35, caused delayed irritation in volunteers but was also non-irritating to the rabbit eye, even when undiluted.1 And although another detergent, dupanol, caused immediate severe pain in human subjects,1 it was considered to have only moderate effects in the eyes of rabbits.3


REFERENCES:
1) R.J. Marsh & D.M. Maurice, Experimental Eye Research, 1971, vol. 11, 43-48.
2) M. Cornelis et al, ATLA, 1991, vol. 19, 324-336.
3) L.W. Hazleton, Proceedings of the Scientific Section of the Toilet Goods Association, 1952, vol. 17, 5-9.



ANTIBIOTIC’S DEADLY SIDE-EFFECT


Britain’s Committee on Safety of Medicines has alerted doctors to the dangers of clindamycin, an antibiotic whose most serious side-effect is an intestinal disease called pseudomembranous colitis. The condition leads to diarrhoea and sometimes proves fatal. By 1980, 12 years after the drug was marketed in the UK, 36 deaths had been reported.1 Although the problem can occur with other antibiotics, it is most frequently seen with clindamycin, and the British National Formulary warns that patients should stop taking the drug immediately if diarrhoea develops.


In contrast, rats and dogs given clindamycin every day for a year could tolerate 12 times the maximum recommended human dose.2
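

The dose comparison in reference 2 below can be reproduced with simple arithmetic. The following is a minimal sketch in Python, assuming a 70 kg adult on the maximum oral regimen of 450 mg every 6 hours and the roughly 300 mg/kg per day tolerated by rats and dogs:

    # Rough check of the clindamycin dose comparison (figures from reference 2)
    human_single_dose_mg = 450      # maximum oral dose, taken every 6 hours
    doses_per_day = 4
    body_weight_kg = 70             # assumed adult weight

    human_mg_per_kg_day = human_single_dose_mg * doses_per_day / body_weight_kg
    animal_mg_per_kg_day = 300      # daily dose tolerated by rats and dogs

    print(round(human_mg_per_kg_day, 1))                      # about 25.7 mg/kg per day
    print(round(animal_mg_per_kg_day / human_mg_per_kg_day))  # roughly a 12-fold margin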


REFERENCES:
1) G.R. Venning, British Medical Journal, 1983, January 15, 199-202.
2) The British National Formulary (No. 26, 1993) lists the maximum oral dose for severe infections as 450mg every 6 hours i.e. 25mg/kg for a person weighing 70 kg taking 4 doses in 24 hours. Rats and dogs could tolerate more than 300mg/kg (J.E. Gray et al, Toxicology & Applied Pharmacology, 1972, vol. 21, 516-531).



ANIMALS & AIDS


The fact that even chimpanzees do not develop AIDS when infected with HIV casts serious doubt on the validity of animal experiments.1 Some AIDS researchers seem to recognize this, since vaccines which failed to protect chimpanzees from HIV infection were nevertheless tested in human trials!2 Certainly, faith in animal tests could have serious repercussions. For instance, failure to induce AIDS in laboratory animals has been used to support arguments against HIV as the cause.3


Attempts to produce “animal models" of AIDS could be dangerous in other ways. By inserting parts of the human immune system into mice, scientists believed they had developed an animal model of AIDS. But fears have been expressed that interaction of HIV with viruses commonly found in mice may not only make the “model" irrelevant to people but promote hazardous changes in the AIDS virus. The new HIV variants could then spread in different ways, possibly even through the air.4


REFERENCES:
1) P. Newmark, Nature, 1989, October 19, 566-567.
2) A.S. Fauci & P.J. Fischinger, Public Health Reports, 1988, vol. 103, 230-236.
3) New Scientist, 1988, March 3, 34.
4) J. Marx, Science, 1990, February 16, 809; P. Lusso et al, Science, 1990, February 16, 848, 852.



NINE SPECIES FAIL TO PREDICT LIVER DAMAGE


Evicromil (code name FPL 52757) was submitted for clinical trial as an antiasthmatic drug following safety evaluation in mice, rats, hamsters, rabbits, ferrets, squirrel monkeys, cynomolgus monkeys, stump-tail monkeys and baboons. Despite the use of doses many times greater than the amount intended for human use, no harmful effects were seen, in particular none affecting the liver.1 Yet 20% of patients participating in the trial had symptoms of liver damage, precluding any further development of the drug.2 Subsequent tests showed that liver toxicity could only be induced in dogs.1, 2


REFERENCES:
1) D.V. Parke in Animals & Alternatives in Toxicity Testing, Eds. M. Balls et al (Academic Press, 1983).
2) C.T. Eason et al, Regulatory Toxicology & Pharmacology, 1990, vol. 11, 288-307.



ANTI-CANCER HOPE ABANDONED


When animal researchers tested a newly discovered substance, psicofuranine, for anti-cancer activity, they found contradictory evidence in rats and mice. The drug proved active against several tumours in laboratory rats but had no effect on 3 different cancers in mice. Unfortunately doctors could not properly assess the drug against human cancer since psicofuranine produced severe and unexpected side-effects in early human trials, thus terminating any further investigation in people. The drug damaged the heart yet no cardiac toxicity had been found in mice, rats, dogs or monkeys.1


Although clinical study of psicofuranine was abandoned, further animal experiments were carried out in an attempt to reproduce the heart problems seen in people. Once again, no cardiac toxicity could be observed even when dogs and monkeys were given 5 to 10 times the harmful human dose.


REFERENCE:
1) C.G. Smith et al, Journal of International Medical Research, 1973, vol. 1, 489-503.



ANAEMIA CURE FAILS IN ANIMALS


When treating iron-deficiency anaemia, doctors prefer their patients to take iron by mouth, but should oral therapy fail, the iron is administered by injection. Injectable iron remedies were introduced during the 1930’s but could easily have been discarded. At that time, experiments in which anaemias were artificially induced in animals by iron deficiency or by repeated haemorrhage, led to the conclusion that injecting iron had no therapeutic value. Fortunately, clinical studies proved that anaemic patients could be cured this way.


Iron sorbitol is one form of injectable iron that might have been rejected for a different reason. Administration to rats and rabbits caused cancer at the injection site and the implications for human therapeutics appeared serious. However, clinical experience has revealed no real hazard to patients.


REFERENCES:
1) British National Formulary, No. 26 (BMA & Royal Pharmaceutical Society of G.B., 1993).
2) G.N. Burger & L.J. Witts, Proceedings of the Royal Society of Medicine, 1934, vol. 27, 447-455.
3) M. Weatherall, Nature, 1982, April 1, 387-390.



DRUG DANGER UNDETECTED


The anti-inflammatory drug ibufenac was marketed in Britain during 1966 but withdrawn two years later following 12 deaths, mainly through liver damage. Although submitted to “extensive" tests in mice, rats, and dogs, no evidence of liver damage was detected except for a slight effect in rats exposed to lethal doses of the drug.


Dr. Cuthbert of the Medicines Division at Britain’s Department of Health and Social Security explained that “Evidence of liver damage is sometimes detected in animal studies of non-steroidal anti-inflammatory drugs but usually no such evidence is forthcoming even in circumstances where a drug is eventually shown to be hepatotoxic (damaging to the liver) in man."1


REFERENCE:
1) M.F. Cuthbert in Current Approaches to Toxicology, Ed. B. Ballantyne (Wright & Sons, 1977)



ANIMALS DIVERT ATTENTION FROM CANCER PREVENTION


Prevention is always better than cure, particularly for diseases like cancer where treatment can be both difficult and unpleasant. But first, doctors must discover the causes so people know how to avoid ill-health. This is the primary role of epidemiology, the study of disease in human populations. Tragically, a preference for laboratory research and animal experiments diverted attention from epidemiology, and for decades little was known about the main causes of human cancer.


Before World War I, epidemiology had identified several causes of the disease.1 For instance, pipe smokers were more likely to develop cancer of the lip; workers in the aniline dye industry often contracted bladder cancer; and skin cancer was an occupational hazard of radiologists. It was also known that combustion products of coal (soot and tar) could cause the disease, an observation dating back to 1775 when the English surgeon Percivall Pott identified soot as a carcinogen in chimney sweeps.


Attempts to replicate Pott’s findings in laboratory animals repeatedly failed2 but finally, in 1918, Japanese researchers reported that cancer could be produced on a rabbit’s ear by continually painting it with tar, a discovery that changed the course of cancer research. According to the renowned British epidemiologist Sir Richard Doll, human observational data were now commonly dismissed because it was confidently assumed that laboratory experiments held the key to success.1 Crucial epidemiological studies like those of Percy Stocks at London University, who reported in 1933 that people consuming larger amounts of fruit and vegetables were less likely to develop cancer,3 received little attention,1 yet today we know that Stocks was right.4


The absence of human epidemiological data allowed mistaken ideas based on animal research to flourish. Although we now know that only about 5% of Western cancers are linked to viral infection,5 some scientists believed that most, if not all cases were caused by viruses, a view derived from experiments on animals where it is easy to transmit the disease in this way.6 One animal researcher even argued that women should not breast feed their babies: he believed that in humans, as in mice, a virus is the prime cause of breast cancer, and that the virus is acquired in the mother’s milk!7


Following World War II, interest in epidemiology was reawakened with the striking discovery that smoking causes lung cancer. This breakthrough led to further population studies which identified the causes of many other types of cancer. The result is that 80 to 90% of cases are now considered potentially preventable. And it is revealing that the 1980 US Congress Office of Technology Assessment report on the causes of cancer relied far more on epidemiology than on laboratory tests because these “cannot provide reliable risk assessments."5


REFERENCES:
1) R. Doll, Cancer, 1980, vol. 45, 2475-2485.
2) W.H. Woglom, Archives of Pathology, 1926, vol. 2, 533-576.
3) P. Stocks & M.N. Karn, Annals of Eugenics, 1933, vol. 5, 237-280.
4) J. Robbins, Diet for a New America (Stillpoint, 1987).
5) R. Peto & R. Doll, The Causes of Cancer (Oxford University Press, 1981).
6) E. Northrup, Science Looks at Smoking (Conard-McCann, 1957).
7) J. Furth, Bulletin of the New York Academy of Medicine, 1964, vol. 40, 421-431.



EPILEPSY ‘MODELS’ GIVE FITFUL RESULTS


Scientists have devised more than 50 ways of inducing fits in laboratory animals. One reason for the large number is that “none of the models is fully trustworthy as an imitation of clinical epilepsy,"1 and indeed results can vary depending on the “model" chosen.


An example is the artificial sweetener aspartame. In research sponsored by the NutraSweet Company and the Wellcome Trust, researchers at London’s Institute of Psychiatry carried out experiments with photosensitive baboons in which fits are induced by flashing lights. The tests followed suggestions that high doses of aspartame may produce seizures in sensitive people. Aspartame had no effect in the baboons but conflicting data has been found in other animal models: aspartame enhances chemically-induced convulsions in mice, for instance, but has no effect on electric shock-induced or sound-induced seizures in these animals.2


Similar species differences are found in drug development. Although reducing convulsions in mice and baboons, the drug THIP proved ineffective when tried in patients with epilepsy.3


REFERENCES:
1) R.S. Fisher, Brain Research Reviews, 1989, vol. 14, 245-278.
2) B.S. Meldrum et al, Epilepsy Research, 1989, vol. 4, 1-7.
3) Lancet, 1985, January 26, 198-200.



ANIMALS STARVE IN BRAIN RESEARCH FIASCO


Reliance on animal experiments rather than human observations delayed a full realization that lack of food early in life can harm the brain. During the first quarter of the 20th century, there was considerable interest in the possibility that lack of food during childhood might interfere with the proper development of the brain and therefore affect later achievement of the individual. Unfortunately, almost all the research was carried out on animals and showed that starving baby or adult rats had no effect on the brain. Not surprisingly, the topic was abandoned and only resumed in the late 1950’s when children with histories of under-nutrition were persistently found to underachieve, both in school and in formal tests.1


Researchers then realized that the early animal tests had failed since no account had been taken of the “brain growth spurt". This is the period of fastest growth, when the brain is at its most vulnerable. Furthermore, the exact timing varies between species: in human babies the brain growth spurt begins during the final stage of pregnancy and continues through at least the first year of life; in guinea pigs it occurs almost entirely during the foetal period; and in rats it happens during the first 3 weeks after birth.2


Despite millions of underfed and malnourished people, “early life undernutrition" remains a popular subject among animal researchers. Unlike current aid levels to developing nations, there seems no shortage of funds for such research: indeed, one justification is that, someday, it might better enable us to give relief to the starving!3


REFERENCES:
1) J. Dobbing in Early Nutrition & Later Behaviour, Ed. J. Dobbing (Academic Press, 1987).
2) J. Dobbing & J.L. Smart, British Medical Bulletin, 1974, vol. 30, 164-168.
3) J.L. Smart in ref. 1



THE PRACTOLOL SYNDROME


Practolol (Eraldin), marketed by ICI during the early 1970’s for the treatment of heart conditions, was “particularly notable for the thoroughness with which its toxicity was studied in animals, to the satisfaction of the regulatory authorities."1 Nevertheless, unforeseen side-effects began to emerge including serious skin, eye and abdominal problems. Some patients suffered dry eyes, conjunctivitis and corneal damage leading to blindness. There were also cases of stomach damage with obstruction of the intestine, a condition known as sclerosing peritonitis which led to 23 reported deaths.2 Overall, ICI compensated more than 1000 victims.3


The “practolol syndrome" had not been predicted by animal experiments4 and even after the drug was withdrawn in 1976, no one could replicate the harmful effects in laboratory animals.1


REFERENCES:
1) M. Weatherall, Nature, 1982, April 1, 387-390.
2) G.R. Venning, British Medical Journal, 1983, January 15, 199-202; January 22, 289-292.
3) A Questioning of Balance, Office of Health Economics, 1980.
4) F.H. Gross & W.H. Inman (Eds.), Drug Monitoring (Academic Press, 1977).



ANIMALS MISS STEROID EYE RISKS


One of the most serious side-effects of steroid eye therapy is glaucoma. An abnormally high pressure builds up within the eye and can lead to permanent loss of vision if the effects are prolonged. During the early 1950’s, when corticosteroids were first employed in ophthalmology, animal tests suggested that cortisone had no effect on pressure within the eye.1 Subsequent attempts to induce glaucoma in rabbits and monkeys proved difficult or impossible,2 and researchers at Britain’s Porton Down laboratories refer to “the differing response of the eye of man and animals to repeated topical (surface) application of corticosteroids. Such a procedure is without effect on tension of the eye of many experimental mammals, but increases tension in the human eye."3


Another side effect of steroid therapy that is difficult to replicate in laboratory animals is cataract. Although scientists have produced slight changes in the lens of the rabbit’s eye after repeated application of high doses, they did not mimic the more serious condition found in human patients.2


REFERENCES:
1) L.H. Leopold et al, American Journal of Ophthalmology, 1951, vol. 34, 361-371.
2) W.M. Grant, Toxicology of the Eye, 2nd edition, (Charles Thomas, 1974).
3) B. Ballantyne & D.W. Swanston in Current Approaches in Toxicology, Ed. B. Ballantyne (Wright & Sons, 1977).



BABIES AT RISK FROM TALC


In 1991, doctors at Southampton General Hospital warned that inhaling babies’ talcum powder could be fatal,1 representing “an unappreciated hazard." They state that “talcum powder can cause severe respiratory symptoms in infants: its use should be discouraged and containers should carry a warning and have child proof caps." Eight deaths have been attributed to the inhalation of talc.


Concerns over the safety of talc have been raised before and studies of talc miners and millers have shown that it can damage the lungs.2 But experiments in which huge amounts of the commercial product were administered to animals seemed to suggest no hazard to consumers. For instance, in 1977 experimenters exposed hamsters to high grade cosmetic talc at doses nearly 2000 times higher than that experienced by babies during toilet care. There was no effect on survival or damage to the lungs. In the same year, other scientists forced rats to breathe talc at doses approaching 6000 times those used in baby care. Despite the massive amounts, there was only a slight effect on the lungs.3


REFERENCES:
1) P.W. Pairaudeau et al, British Medical Journal, 1991, May 18, 1200-1201.
2) A. Seaton in Occupational Lung Diseases, Eds. W.K. Morgan & A. Seaton (Saunders, 1982).
3) Lancet, 1977, June 25, 1348-1349.



INHALATION TESTS THROW FALSE DOUBT ON FORMALDEHYDE


Fears for the safety of formaldehyde workers followed reports that the chemical causes cancer in rats.1 Formaldehyde is widely used as a laboratory fixative and as an embalming fluid but human epidemiological studies had revealed no evidence of cancer. The animal tests led to further observations of exposed workers but these were also negative.1


The rats had been forced to breathe such high doses (7-15 times that inhaled by workers) that the formaldehyde caused tissue damage which led to the cancers. Nevertheless, “… there are still some who believe that the positive results in the rats are the dominant factor to be taken into account and overrides the epidemiology but there is always some hope that common sense may prevail."1


REFERENCE:
1) P. Grasso, Journal of the Royal Society of Medicine, 1989, vol. 82, 470-473.



USELESS TREATMENT POISONS WORKERS


In 1939, animal researchers devised an astonishing treatment for silicosis, the debilitating lung disease caused by exposure to silica dust. They found that inhalation of metallic aluminum could prevent silicosis in laboratory rabbits,1 and from the early 1940’s to the mid-1950’s, the technique was widely employed by industry in an attempt to treat or prevent the condition amongst workers.2


Before beginning work, men whose occupations exposed them to silica passed through an aluminum dusting chamber where they breathed a daily dose of the powder. But in 1956, studies of pottery workers showed that the metal did not work and the Industrial Pulmonary Disease Committee of Britain’s Medical Research Council recommended that the technique should not be used.3


Today we know that the treatment itself carried risks. Although large doses of aluminum proved harmless to animals,4 cases of lung damage and cancer have been reported amongst aluminum workers.5 Furthermore, studies of Canadian miners who breathed aluminum powder to prevent silicosis, have revealed symptoms consistent with the current theory that aluminum may cause Alzheimer’s Disease!6


REFERENCES:
1) J.J. Denny et al, Canadian Medical Association Journal, 1939, vol. 40, 213: reported in ref. 3.
2) W.R. Parkes, Occupational Lung Disorders (Butterworths, 1982)
3) M.C.S. Kennedy, British Journal of Industrial Medicine, 1956, vol. 13, 85-101.
4) L.U. Gardner et al, Journal of Industrial Hygiene & Toxicology, 1944, vol. 26, 211-223.
5) M.J. Ellenhorn & D.G. Barceloux, Medical Toxicology (Elsevier, 1988).
6) Lord Walton of Detchant, Journal of the Royal Society of Medicine, 1992, vol. 85, 69-70.



ANIMAL DIET STUDIES CONTRADICT HUMAN COLON CANCER RISKS


Comparisons of people living in different countries, together with other human studies, have shown that too much fat in the diet can lead to cancer of the colon, with saturated fat the chief culprit. Animal tests agree that too much fat can be dangerous but suggest it is the polyunsaturated fats that are the most to blame.1


Clinical studies have also suggested that a high fibre diet is beneficial and the idea has been tested by animal researchers. Again the results are conflicting, some experiments showing a reduced risk of cancer and others an increased risk.2 And although population studies have identified diets high in animal protein as most risky,3 much laboratory research suggests that the type of protein is irrelevant.2


Human studies have consistently shown that diets rich in fruit and vegetables can protect against colon cancer. In contrast, many of the natural substances evolved by fruit and vegetables to protect themselves from predators and parasites, actually cause cancer when tested in rats and mice!4


REFERENCES:
1) J.L. Freudenheim & S. Graham, Epidemiologic Reviews, 1989, vol. 11, 229-235.
2) D. Galloway, Cancer Surveys, 1989, vol. 8, 169-188.
3) B. Armstrong & R. Doll, International Journal of Cancer, 1975, vol. 15, 617-631.
4) P.H. Abelson, Science, 1990, September 21, 1357.



WORKERS AT RISK FROM MISLEADING ANIMAL TESTS


In 1991 the US Occupational Safety and Health Administration decided that glass fibre products should be labeled as a potential cancer hazard.1 The decision followed studies of glass fibre workers that showed an increased risk of lung cancer.


Glass wool products have been manufactured for about 60 years during which animal experiments seemed reassuring. In the 1950’s, experiments with rats, guinea pigs, rabbits and monkeys produced no lung damage when the animals were forced to breathe the fibres.2 And an analysis of further tests conducted during the 1980’s noted that “An increase in lung tumours or mesothelioma has not been observed following long-term inhalation studies in several animal species including rats, hamsters, guinea pigs, mice, monkeys, and baboons exposed to glass fibres, glass wool or mineral wool."3


Ironically, experiments in which rats did develop cancer have been dismissed as unlikely to have any relevance to the human condition. This is because the glass fibres were artificially implanted into the tissue membrane lining the animal’s lung, whereas in people the usual means of exposure is through breathing. Furthermore, it is well known that rats are especially prone to cancer when solid substances are surgically implanted into their bodies.2 In his book Occupational Lung Disorders, Raymond Parkes concludes that “the production of malignant tumors in animals by direct implantation experiments is unlikely to have any relevance to human exposure."


REFERENCES:
1) Letter from G.F. Scannell, Assistant Secretary for Occupational Safety and Health, Washington DC, to Richard Munson, Chairman of Victims of Fibreglass (May 6, 1991); The Guardian, July 20, 1991.
2) Reported in R. Parkes, Occupational Lung Disorders (Butterworths, 1982).
3) C.S. Wheeler, Toxicology & Industrial Health, 1990, vol. 6, 293-307.



THE OPREN AFFAIR


The arthritis drug Opren (Oraflex in the US) was withdrawn from the world market in August 1982 following British reports of deaths and serious liver damage in people taking the drug.1 Since 1980, when Opren was first introduced in the UK, there had been 3,500 reports of harmful effects with 61 deaths, mainly in elderly patients.2


Scientists list Opren as a drug whose injuries were not predictable from animal tests,3 and note that “despite searching preclinical animal toxicity studies … administration to rheumatoid patients resulted in adverse reactions including onycholysis (nail damage) and skin phototoxicity (light sensitivity) and finally in fatal hepatotoxicity (liver damage) whereupon the drug was withdrawn." And Dista, the subsidiary of Eli Lilly who marketed the drug in Britain, stated in their literature that “the effects of benoxaprofen (Opren) in the rhesus monkey were studied for one year … There were no apparent adverse effects on survival."


Researchers believed that the fatal cases of liver damage might have been averted by more extensive clinical trials,3 especially in the elderly who take much longer to eliminate Opren from the body than either young people or laboratory animals.


REFERENCES:
1) E.M.B. Sorensen, Toxicology Letters, 1986, vol. 34, 277-286.
2) British Medical Journal, 1982, August 14, 459-460.
3) C.T. Eason et al, Regulatory Toxicology & Pharmacology, 1990, vol. 11, 288-307.



‘FLEXIBLE’ ANIMAL TESTS SUPPORT RIVAL THEORIES


A new animal test raised fears that Astra’s ulcer treatment, omeprazole, may cause stomach cancer. In the test, developed by the pharmaceutical company Glaxo, rats are dosed with the suspect drug or chemical, after which tissue samples are removed from the animals’ stomachs and analysed for effects on DNA, the substance which controls proper development of the cells. Interference with DNA is regarded as a possible first step towards cancer.


The experiments showed that omeprazole damaged the DNA but that ranitidine, Glaxo’s own antiulcer drug, did not.1 On the basis of these results, Glaxo halted comparative clinical trials of ranitidine (Zantac) and omeprazole, an action, according to the Lancet, that seemed certain to influence prescribing habits.2


In response, Astra, the makers of omeprazole, argued that “the method used by Glaxo is scientifically unsound and the results therefore have no clinical consequences."3 They noted that “long term studies in which omeprazole was administered for up to 2 years in rats, 18 months in mice, and 1 year in dogs yielded no evidence for a direct carcinogenic potential, in the stomach or elsewhere."


REFERENCES:
1) B. Burlinson et al, Lancet, 1990, February 17, 419.
2) Lancet, 1990, February 17, 386
3) L. Ekman et al, Lancet, 1990, February 17, 419-420.



LABORATORY ANIMALS FAIL STROKE VICTIMS


Following experiments on rabbits, dogs, gerbils and monkeys, animal researchers suggested that barbiturates could provide protection against the effects of stroke.1 In human stroke victims, however, barbiturates had little or no protective effect.2 This failure of animal tests is not an isolated example: between 1976 and 1988, 25 drugs were found useful in treating animals with artificially-induced stroke yet none has come into general clinical use.2


Stroke researchers are divided over the relevance of animal experiments3 and some argue that “over-reliance upon such (animal) models may impede rather than advance scientific progress in the treatment of this disease … Each time one of these potential treatments is observed to be effective based upon animal research, it propagates numerous further animal and human studies consuming enormous amounts of time and effort to prove that the observation has little or no relevance to human disease or that it may have been an artifact of the animal model itself."2


Although defending the role of animal experiments, researchers at the Mayo Clinic conclude that “Ultimately … the answers to many of our questions regarding the underlying pathophysiology and treatment of stroke do not lie with continued attempts to model the human situation perfectly in animals but rather with the development of techniques to enable the study of … living humans."2


REFERENCES:
1) Stroke, 1975, vol. 6, 28-33; Stroke, 1974, vol. 5, 1-7; Neurology, 1975, vol. 25, 870-874; Stroke, 1972, vol. 3, 726-732; Annals of Neurology, 1979, vol. 5, 59-64.
2) D.O. Wiebers et al, Stroke, 1990, vol. 21, 1-3.
3) C. Millikan, Stroke, 1992, vol. 23, 795-797.



RIFAMPICIN & THE PILL


In 1971 doctors reported unexpected pregnancies among women taking the “pill".1 Of 88 women taking oral contraceptives in addition to the antituberculosis drug rifampicin, 75% suffered disturbances to their menstrual cycle and 5 became pregnant. The rifampicin had stimulated the patients’ livers to metabolise, or break down, the pill more rapidly. Consequently, far less contraceptive remained to protect the women from pregnancy. The British National Formulary (1993) now tells doctors prescribing rifampicin to “advise patients on oral contraceptives to use additional means (of contraception)."


Further reports showed that rifampicin accelerates the breakdown of many other medicines.2 An example is methadone, where rifampicin precipitated withdrawal symptoms by reducing the amount of drug in the body. In another case a patient rejected their kidney graft because rifampicin had lowered the level of the immunosuppressive drug cyclosporin.


Rifampicin’s peculiar effect had not been predicted by animal experiments.3 Following discovery of the effects in people, further animal tests were carried out but these proved contradictory. For instance, the drug’s action could not be reproduced in rats. In mice, however, prolonged treatment with rifampicin did stimulate the liver’s metabolic processes but a single dose had the opposite effect, slowing down metabolism.4 Nevertheless, the problems with rifampicin might have been predicted had scientists used human liver tissue for their tests.5


REFERENCES:
1) Reported in J.P. Mumford, British Medical Journal, 1974, May 11, 333-334.
2) H. Meyer et al in Meyler’s Side Effects of Drugs, 11th edition, Ed. M.N.G. Dukes (Elsevier, 1988).
3) E. Nieschlag, Pharmacology & Therapeutics, 1979, vol. 5, 407-409.
4) D. Pessayre & P. Mazel, Biochemical Pharmacology, 1976, vol. 25, 943-949.
5) A.M. Jezequel et al, Gut, 1971, vol. 12, 984-987.



RATS CAST DOUBT ON OLIVE OIL!


Although olive oil has been used to anoint the human body for thousands of years without any apparent ill effects,1 tests carried out at New York University showed that olive oil actually had a harmful effect when applied to the skin of rats, causing swelling, proliferation of cells and a great shedding of large, loose flakes of skin!2


REFERENCES:
1) M.M. Rieger & G.W. Battista, Journal of the Society of Cosmetic Chemists, 1964, vol. 15, 161-172.
2) E.O. Butcher, Journal of Investigative Dermatology, 1951, vol. 16, 85-90.



BLEACH HIGHLIGHTS FAULTY SKIN TESTS


Rabbits and guinea pigs are commonly used to assess irritancy but neither provides an accurate model for human skin.1 For instance, by the criterion of animal experiments, hypochlorite bleach should be considered comparatively safe for human use, since it only produces “slight visible irritation" in rabbits and guinea pigs.1 However, in human volunteers bleach causes severe skin reactions.


REFERENCE:
1) G.A. Nixon et al, Toxicology & Applied Pharmacology, 1975, vol. 31, 481-490.



TRANSPLANT RESEARCH MISDIRECTED


The key problem for transplant scientists has always been to overcome the body’s natural defence mechanism whereby a transplanted organ is rejected. Most of the animal research directed towards this end has relied on rodents, with rats by far the most commonly used species.1 Yet scientists have discovered important tissue differences which mean the results are of questionable relevance to people and could be misleading.


For instance, if experiments with rats were used as a guide, patients receiving heart or kidney grafts would only need a very brief period of immunosuppression with drugs like cyclosporin, after which they would never reject their new organ.2, 3 In fact, such a course would be disastrous, for unlike rats, human patients need lifelong immunosuppression to prevent organ rejection.


The reason, scientists suspect, is that within a few days of transplantation, the rat’s kidney has no cells to stimulate the immune system, so the animal does not reject a transplant when immunosuppressive drug treatment is stopped. In contrast, the human kidney does have these cells as an integral part of its structure and transplant patients must therefore have lifelong drug treatment to suppress the cells’ immune-stimulating effects.4


During the 1960’s and 1970’s much research focused on rat “models" of kidney and heart transplants, and according to John Fabre of Oxford University’s Nuffield Department of Surgery, “The many encouraging results raised hopes that a major advance in clinical immunosuppression for transplantation was in the offing, but these hopes have now faded and nothing of the great mass of work has been translated into clinical practice." Fabre suggests that the tissue differences between people and rats may be responsible.2


REFERENCES:
1) According to British figures for 1986, 66% of experiments performed in transplant research used rats, 26% used mice, 7% used rabbits, dogs, primates or other species. Source: Statistics of Experiments on Living Animals, Great Britain, 1986 (HMSO, 1987).
2) J.W. Fabre, Transplantation, 1982, vol. 34, 223-224.
3) D.J. Cohen et al, Annals of Internal Medicine, 1984, vol. 101, 667-682.
4) P.J. Morris, (Ed.), Tissue Transplantation (Churchill Livingstone, 1982). See also ref. 2.



TRAGEDY HITS HEPATITIS VICTIMS


In June 1993, researchers at America’s National Institutes of Health abruptly halted trials of a new drug to combat hepatitis B virus, following deaths and serious complications among participants. Although the drug, fialuridine (FIAU), was intended to improve the liver disease, many of the patients undergoing prolonged treatment got worse, several dying from liver failure.1


The liver toxicity surprised researchers, for the drug seemed safe and effective in animal experiments.1 It reduced the amount of hepatitis virus in infected woodchucks, the “preferred" animal model, and was also tested for toxicity in mice, rats and rhesus monkeys. However, one of the trial’s chief investigators later asked “… why didn’t the animal toxicity studies show any abnormality at all due to the drug?"2


The metabolism of anti-viral drugs of this type is said to be very different in animals and people,3 and the tragedy has prompted a closer look at related drugs to see if other patients are experiencing similar harmful effects.


REFERENCES:
1) N. Touchette, The Journal of NIH Research, 1993, 5, 33-35.
2) J. Hoofnagle, reported in ref. 1.
3) C. Macilwain, Nature, 1993, July 22, 275.



LEUKEMIC MICE FAIL CANCER PATIENTS


For decades, America’s National Cancer Institute (NCI) has used animals in the search for new drugs. Tens of thousands of chemicals have been assessed in mice given leukemia but the method has proved highly inefficient. One scientist estimates that for every 30 to 40 drugs effective in treating mice with cancer, only one will work in people,1 which suggests that during clinical trials many cancer patients will be exposed to the severe toxicity of anticancer drugs without any corresponding benefit. During the 1980’s, researchers acknowledged that the NCI’s traditional approach was failing to identify promising new treatments against any of the main cancers.2, 3


In the new strategy, mice have been replaced by test-tube studies with human cancer cells, at least for preliminary experiments. Drugs showing promising activity are then subject to further animal tests so there is still the risk of misleading predictions.4 As an alternative, drugs could be further assessed using fresh human tumour tissue from biopsies or therapeutic operations.5 Results would then be directly relevant to people.4


REFERENCES:
1) D.D. Von Hoff, Journal of the American Medical Association, 1979, August 10, 503.
2) R. Kolberg, Journal of NIH Research, 1990, vol. 2, 82-84.
3) A. Pihl, International Journal of Cancer, 1986, vol. 37, 1-5.
4) S.E. Salmon, Cloning of Human Tumor Stem Cells, (Alan Liss, 1980).
5) C.W. Taylor et al, Journal of the National Cancer Institute, 1992, vol. 84, 489-494.



‘HARMLESS’ ULCER DRUG COULD CAUSE HEART FAILURE


Carbenoxolone was introduced during the 1960’s for the treatment of peptic ulcers but caused salt and water retention in some patients, leading to high blood pressure, swelling, weight gain, muscle weakness, and heart failure. Other drugs are now preferred, says the British National Formulary, but if carbenoxolone is to be used, patients should be monitored carefully during treatment.1


Prior to marketing, animal tests had given the impression that carbenoxolone was safe, having revealed no harmful effects.2 These tests were carried out on rodents but scientists then realized that people metabolized carbenoxolone quite differently from rats, mice and rabbits. Further experiments were therefore undertaken with dogs and monkeys but again, there was no evidence of toxicity.2


REFERENCES:
1) British National Formulary, no. 26 (BMA & the Royal Pharmaceutical Society of G.B., 1993)
2) C.T. Eason et al, Regulatory Toxicology & Pharmacology, 1990, vol. 11, 288-307.



BEAGLE DOGS MISLEAD CANCER RESEARCH


Mitoxantrone was developed in the hope of providing effective cancer treatment without side-effects on the heart. Animal researchers were presumably reassured when tests on beagle dogs “failed to demonstrate cardiac failure."1 But in clinical trials several patients suffered side-effects including heart failure, and more widespread use of the drug confirmed that cardiac toxicity is a major problem. For instance, data from 3,360 patients receiving mitoxantrone included 88 reports of cardiac side effects with 29 cases of heart failure.2 And a recent Chinese study suggested that 20% of patients developed cardiotoxicity following treatment with mitoxantrone.3


REFERENCES:
1) R. Stuart Harris et al, Lancet, 1984, July 28, 219-220.
2) Martindale: The Extra Pharmacopoeia, 29th edition, Ed. J.E.F. Reynolds (Pharmaceutical Press, 1989).
3) A. Stanley & G. Blackledge in Side Effects of Drugs Annual 15, Eds. M.N.G. Dukes & J.K. Aronson (Elsevier, 1991).



THALIDOMIDE


Thalidomide was first introduced as a sedative by the German drug company Chemie Grunenthal in 1957, and by the Distillers company in Britain a year later. Although animals could tolerate massive doses without ill-effect,1 thalidomide was soon found to cause peripheral neuritis in human patients: feelings of numbness were followed by severe muscular cramps, weakness of the limbs and a lack of coordination.


The Australian obstetrician William McBride was first alerted to thalidomide’s most notorious side-effect after seeing 3 babies born with very unusual birth defects. Unfortunately, his warnings to the medical profession were delayed because he tried to “confirm" his observations by testing the drug in mice and guinea pigs, both of which proved resistant to the drug.2 Only after seeing further human cases did McBride publish his findings.


Although not specifically tested for birth defects prior to marketing, subsequent experiments revealed “extreme variability in species susceptibility to thalidomide."3 For instance, mice could safely tolerate 8000 times the dose found harmful to human babies.4 In his book Drugs as Teratogens, Schardein writes, “in approximately 10 strains of rats, 15 strains of mice, 11 breeds of rabbit, two breeds of dogs, three strains of hamsters, eight species of primates and in other such varied species as cats, armadillos, guinea pigs, swine and ferrets in which thalidomide has been tested, teratogenic effects (birth defects) have been induced only occasionally." Scientists eventually discovered that birth defects similar to those found in people could be induced in certain types of rabbit and primate. Nevertheless, New Zealand white rabbits had to be dosed with 300 times the amount dangerous to humans.5
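

The species safety margins quoted above can be expressed as simple ratios. Below is a minimal sketch in Python, using the doses cited in reference 5 (roughly 150 mg/kg needed to harm New Zealand white rabbits against roughly 0.5 mg/kg dangerous to the human foetus):

    # Rough safety-margin arithmetic based on the doses cited in reference 5
    harmful_human_dose_mg_per_kg = 0.5        # dose dangerous to human babies
    rabbit_teratogenic_dose_mg_per_kg = 150.0

    rabbit_margin = rabbit_teratogenic_dose_mg_per_kg / harmful_human_dose_mg_per_kg
    print(round(rabbit_margin))               # about 300, the figure quoted above

    # Mice reportedly tolerated some 8000 times the harmful human dose,
    # i.e. on the order of 4000 mg/kg by the same arithmetic
    print(round(8000 * harmful_human_dose_mg_per_kg))   # about 4000 mg/kg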


The thalidomide disaster prompted additional, extensive testing of drugs and chemicals in pregnant animals, but some scientists believe that “animal malformations seldom correlate with those of humans."6 Furthermore, “… no animal model has been found which responds satisfactorily to all known teratologic agents in humans to permit reliable screening of substances for their teratologic potential. Careful surveillance, reporting and prospective study … remain the mainstays for detection of adverse effects following foetal drug exposure."


REFERENCES:
1) R.D. Mann, Modern Drug Use, an Enquiry on Historical Principles (MTP Press, 1984)
2) The Sunday Times “Insight" Team, Suffer the Children: The Story of Thalidomide (Andre Deutsch, 1979)
3) T.H. Shephard, Catalogue of Teratogenic Agents (Johns Hopkins Press, 1976)
4) S.K. Keller & M.K. Smith, Teratogenesis, Carcinogenesis & Mutagenesis, 1982, vol. 2, 361-374
5) New Zealand White rabbits were sensitive to doses of 150mg/Kg of thalidomide (ref. 6) whilst the dangerous human dose was 0.5mg/Kg (ref. 4)
6) R.M. Ward & T.P. Green, Pharmacology & Therapeutics, 1988, vol. 36, 326.



“HARMLESS" ANTIDEPRESSANT DAMAGED LIVER


Iproniazid was originally developed as a treatment for tuberculosis but found use as an antidepressant. Although considered “harmless" on the basis of animal tests,1 iproniazid produced fatal cases of liver damage in human patients and the drug was eventually abandoned.2


REFERENCES:
1) J. Boyer in Clinical Pharmacology, Basic Principles in Therapeutics, 2nd edition, Eds. K.L. Melmon & H.F. Morrelli (MacMillan, 1978).
2) B. Blackwell & J.S. Simon in Side Effects of Drugs Annual 13, Eds. M.N.G. Dukes & L. Beeley (Elsevier, 1989)



TRAGEDY OF THE KILLER DUST


Asbestosis, the lung disease caused by inhaling asbestos, was first recognized in 1907. The reports were so disturbing that 11 years later, the Prudential Insurance Company in New York refused to issue life policies on asbestos workers. Animal research began in 1925 but much of the early experimentation proved contradictory. For instance, during the 1930’s, one group of scientists wrongly classified the chrysotile, amosite and crocidolite forms of asbestos as harmless on the basis of animal tests.1 Others found that chrysotile caused lung damage in guinea pigs but not rabbits.2


In 1931 and again in 1951, experimenters reported that the injuries caused by asbestos start to heal when the animals are removed from the dusty atmosphere.2 This is contrary to human experience where asbestosis progresses even when workers are no longer exposed. Only later were researchers able to mimic this aspect of the disease in animals.3


The fact that asbestos could harm the lungs was serious enough but doctors soon discovered a more alarming threat: cancer. The first reports of an association between asbestos and lung cancer came from America, England and Germany during the 1930’s following examination of people who had died with asbestosis. But attempts to induce cancer in animals repeatedly failed and despite further evidence from exposed workers, the carcinogenic action of asbestos was doubted until the 1960’s.4, 5 Only then were researchers able to mimic the disease in animals.


Prior to this “… a large literature on experimental studies has failed to furnish any definite evidence for induction of malignant tumours in animals exposed to various varieties and preparations of asbestos by inhalation or intratracheal injection."6


REFERENCES:
1) Reported in L.U. Gardner, Journal of the American Medical Association, 1938, November 19, 1925-1936.
2) J.C. Wagner, British Journal of Industrial Medicine, 1963, vol. 20, 1-12
3) J.C. Wagner et al, British Journal of Cancer, 1974, vol. 29, 252-269.
4) P.E. Enterline in Epidemiology & Health Risk Assessment, Ed. L. Gordis (Oxford University Press, 1988)
5) P.E. Enterline, American Review of Respiratory Diseases, 1978, vol. 118, 975-978.
6) W.E. Smith et al, Annals of the New York Academy of Sciences, 1965, vol. 132, 456-488.



SMOKING DANGERS MASKED BY FALSE ANIMAL DATA


In 1954, Richard Doll and Bradford Hill published their famous investigation into the smoking habits of British doctors which clearly revealed that the chances of developing lung cancer increased with the number of cigarettes smoked.1 More than a dozen similar (human) studies had already been published but some scientists still argued that the link between smoking and lung cancer was unwarranted since no-one had produced the disease in laboratory animals.2


Two years after publication of Doll and Hill’s findings, the British Empire Cancer Campaign (the forerunner of today’s Cancer Research Campaign) reported nearly two years of experiments during which mice, rabbits and other animals were exposed to tobacco derivatives by direct inhalation, feeding, injection into the lungs, and skin painting. None developed cancer.3 And in 1957, American pathologist Eric Northrup concluded in his book Science Looks at Smoking that the “… inability to induce experimental cancers, except in a handful of cases, during 50 years of trying, casts serious doubt on the validity of the cigarette-lung cancer theory."


Health warnings were delayed for years and Northrup describes how “it is reassuring … that public health agencies have rejected the demand for a mass lay educational programme against the alleged dangers of smoking. Not one of the leading insurance companies, who consider health hazards in terms of monetary risk, has raised the life insurance rates for heavy smokers."


Despite years of further experimentation, it has proved “difficult or impossible" to induce lung cancer in animals using the method (inhalation) by which people are exposed to the smoke.4


REFERENCES:
1) R. Doll and A.B. Hill, British Medical Journal, 1954, June 26, 1451-1455.
2) Reported in S. Peller, Quantitative Research in Human Biology (J. Wright & Sons, 1967).
3) Reported in E. Northrup, Science Looks at Smoking (Conard-McCann, 1957).
4) Lancet, 1977, June 25, 1348-1349. See also F.T. Gross et al, Health Physics, 1989, vol. 56, 256.



THE DOGMA OF DEATH


In a meticulous study at the Vienna General Hospital, Ignaz Philipp Semmelweis discovered that expectant mothers were more likely to die of childbed (puerperal) fever if their attendants had previously been working in the dissecting and post-mortem rooms. The disease, he reasoned, must be caused by an infection carried from the dissecting room on the hands of doctors and students. When Semmelweis insisted on strict hygiene, the death rate promptly dropped from 1 in 8 confinements to 1 in 100.1


Tragically, the Hospital professors responded with such hostility that Semmelweis was forced to leave. Only 4 years earlier, in 1843, the American researcher and humanitarian Oliver Wendell Holmes had reached the same conclusion by careful observation, but had been similarly vilified. According to medical statistician Dr. Sigmund Peller, “In a world that had not been stultified by the idea that only animal experimentation and only the laboratory can provide proof in matters of human pathology, the battle against puerperal fever would not have needed to wait for the discovery of cocci (the responsible bacterium, discovered during the 1860’s). The experts who, during the 1840’s, opposed and prevented the initiation of a rational programme for combating the disease should have been charged with a negligence that resulted in mass killings. But they were not."2


Proper recognition of Semmelweis and Holmes, and the central role of cleanliness, would surely have hastened the introduction of lifesaving, hygienic measures in surgery. But these had to wait at least another 20 years until Lister developed his antiseptic techniques.


REFERENCES:
1) R. Sand, The Advance to Social Medicine (Staple Press, 1952).
2) S. Peller, Quantitative Research in Human Biology (J. Wright & Sons, 1967).



DOG DEATHS DENY WOMEN CONTRACEPTIVE OPTION


During the 1960’s, doctors noticed that women receiving the steroid drug Depo-Provera as a treatment for premature labour, experienced a delay in the return of fertility after the birth of their babies. The observation led to clinical trials of the drug as a possible long-acting contraceptive.1 Injectable preparations of Depo-Provera are now known to be as effective as oral contraceptives and are available in Europe, Asia, Africa, and the Far East. In America however, approval was delayed for many years.2


Much of the controversy surrounding Depo-Provera related to experiments with beagle dogs that indicated a host of disturbing side-effects. There were abnormal growth problems, cases of breast cancer, and many animals died of pyometra, a condition in which pus accumulates in the uterus. None of these effects have been observed in women taking Depo-Provera1, 2 and scientists point to physiological differences between human beings and dogs which make beagles especially sensitive to certain kinds of steroids.1


High doses of Depo-Provera can also cause cancer in monkeys, but again the relevance of these findings has been questioned since the tumours arise from a type of cell not found in women. Furthermore, the kind of cancer produced in monkeys is successfully treated by Depo-Provera in women!1


In 1991, an editorial in the Lancet entitled “DMPA (Depo-Provera) and breast cancer: the dog has had its day," argued that “Countries such as the USA, Australia and Japan would do well to reassess their existing policies on injectable preparations, otherwise they may deprive their female citizens of a reliable, effective and safe method of contraception."2 One year later, America’s Food & Drug Administration finally decided to approve Depo-Provera as a long acting contraceptive.


REFERENCES:
1) Bulletin of the World Health Organisation, 1982, vol. 60, 199-210.
2) Lancet, 1991, October 5, 856-857.



TRANSPLANT DRUG CAUSES UNEXPECTED KIDNEY DAMAGE


Cyclosporin is used to prevent rejection of transplanted organs and although hailed as a major advance over existing drugs, it is not a panacea: side-effects are common and sometimes dangerous. The most serious hazard is kidney damage,1 an effect not predicted by the initial animal tests.2 Ironically, kidney toxicity has been reported in almost 80% of kidney transplant patients receiving the drug.2 Some heart transplant patients treated with cyclosporin required dialysis because their kidneys had failed.3


Subsequent animal experiments showed that only extremely high doses of cyclosporin could induce kidney toxicity in rats,1 although dogs and rhesus monkeys were still unaffected.2 Researchers believe that “… failure to produce renal dysfunction (kidney damage) experimentally that is similar to that seen clinically may result from species differences in metabolism."2


Although cyclosporin can prevent rejection of transplanted organs in both animals and people, an early review of the drug found sufficient variation in experimental results to suggest that “The immunosuppressive effects of cyclosporin have … differed considerably between species, limiting any direct inference that may be made regarding use in human organ transplantation …"1


REFERENCES:
1) D.J. Cohen et al, Annals of Internal Medicine, 1984, vol. 101, 667-682.
2) W.M. Bennett & J.P. Pulliam, Annals of Internal Medicine, 1983, vol. 99, 851-854.
3) Lancet, 1986, February 22, 419-420.



TOXIC TREATMENTS


Many cancer patients have suffered unnecessarily because researchers believed large doses of anticancer drugs were necessary for efficient treatment. The widely held view was that to be effective in reducing tumour size, cancer chemotherapy must also be toxic:1 only then did doctors think they had given sufficient drug. The idea was based on animal experiments1, 2 yet there were early warning signs that patients survived longer when given comparatively nontoxic doses, even though the drugs had a smaller effect on tumour size.3


The high dose concept has been challenged by clinical researchers. During the 1960’s, a series of statistical studies from the Roswell Park Memorial Institute for Cancer Research in New York concluded that toxicity is not necessary and can be counterproductive.2 In 1976, London cancer specialists found that the animal data on which the high dose concept is based are not always valid for human patients.1 They argued that “Since patients given large doses of antineoplastic (anticancer) agents are often at greater risk of toxicity, alternative methods of improving the selectivity of cancer chemotherapy must be explored."


REFERENCES:
1) M.H.N. Tattersall & J.S. Tobias, Lancet, 1976, November 13, 1073-1074.
2) I.D. Bross, Perspectives On Animal Research, 1989, vol. 1, 83-108.
3) M.A. Schneiderman & M.J. Krant, Cancer Chemotherapy Reports, 1966, vol. 50, 107-112.



MINOR TRANQUILLIZERS PRODUCE MAJOR PROBLEMS


Librium and Valium were the first of a new type of tranquilizing drug to be introduced during the early 1960’s. They were called “minor tranquillizers" (benzodiazepines) and many similar drugs quickly followed. They soon became the most widely used of all prescribed drugs. Almost immediately after the introduction of Librium and Valium, doctors reported cases of dependence but it was generally assumed that high doses were necessary.1 At the usual therapeutic amounts, dependence was thought to be uncommon and not a serious problem. The idea prevailed for 20 years and received support from laboratory research since “animal experiments … do not indicate the potential for the development in the human dependence at therapeutic dosage levels."2


It is known, however, that “animal studies … do not predict clinical dependence potential reliably,"3 and more careful human observations revealed that tranquillizers could induce dependence at ordinary doses. By the mid-1980’s, an estimated 500,000 people in Britain alone may have been addicted to their treatment.4


REFERENCES:
1) H. Petursson & M. Lader, Dependence on Tranquillizers (Oxford University Press, 1984)
2) J. Marks, The Benzodiazepines (MTP Press, 1978)
3) Drug & Therapeutics Bulletin, 1989, vol. 27, 28.
4) The Benzodiazepines in Current Clinical Practice, Eds. H. Freeman & Y. Rue (Royal Society of Medicine Services, 1987).



CORTICOSTEROIDS & BIRTH DEFECTS


Contrary to human experience, experiments on pregnant mice and rabbits would suggest that corticosteroids are very dangerous to the unborn child. In some strains of mice cortisone produces cleft palate in up to 100% of the offspring.1 With rabbits, corticosteroids mainly affect the heart but can also cause severe growth retardation in the uterus and death of the foetus. However, scientists have found “very wide species variation"2 and cortisone is not considered harmful to human babies.1 Rats and monkeys are also “very tolerant of corticosteroids in pregnancy, abnormalities or growth retardation only occurring uncommonly, with high doses of the most potent compounds."2


REFERENCES:
1) R.M. Ward & T.P. Green, Pharmacology & Therapeutics, 1988, vol. 36, 326.
2) R.K. Sidhu in Drugs & Pregnancy: Human Teratogenesis & Related Problems, Ed. D.F. Hawkins (Churchill Livingstone, 1983)



LIVER DAMAGE NOT PREDICTED … AGAIN!


In 1985 Britain’s Committee on Safety of Medicines issued a special warning of serious liver damage associated with the antifungal drug ketoconazole (Nizoral).1 The Committee cited 82 cases, including 5 deaths. The warnings followed similar action by the US Food and Drug Administration in 1982.2 Doctors are advised to monitor their patients carefully and perform regular liver function tests throughout treatment with ketoconazole. No evidence of liver toxicity had been found in the original animal tests.3


REFERENCES:
1) Lancet, 1985, January 12, 121.
2) C.B.M. Tester-Dalderup in Meyler’s Side-Effects of Drugs, 11th edition, Ed. M.N.G. Dukes (Elsevier, 1988)
3) J.K. Heiberg & E. Svejgaard, British Medical Journal, 1981, September 26, 825.



ANIMAL TESTS MINIMISE RIOT GAS HAZARD


Studies with human volunteers have shown that animal experiments can seriously underestimate the likely effect of riot control gases on the eye. The tests found that people are 18 times more sensitive to CS than rabbits, and 90 times more sensitive to another sensory irritant, CR.1


When applied to the rabbit’s eye, a solution of CR produced only “minor transient changes" in pressure within the eye. But instillation of a smaller amount into the human eye produced a 40% rise in pressure within 5 minutes compared with only a 3% rise after 10 minutes in rabbits.2


Species differences have also been found when CS and CR are applied to the skin. A method known as the human blister-base technique allows volunteers to classify irritants according to the level of discomfort they produce. The procedure showed that CR is a more potent irritant than CS, a ranking confirmed by other human test systems yet the reverse of that found in experiments on rodents.3 The study also found that a further sensory irritant, VAN, is less potent than CR, again the opposite of the animal results. In a masterpiece of understatement, the researchers conclude that “data derived from humans thus appears to be of importance when assessing irritant potency."3


REFERENCES:
1) D.W. Swanston in Animals & Alternatives in Toxicity Testing, Eds. M. Balls et al (Academic Press, 1983)
2) B. Ballantyne et al in Current Approaches in Toxicology, Ed. B. Ballantyne (Wright & Sons, 1977)
3) R.W. Foster et al, Pain, 1986, vol. 25, 269-278.



HEART DRUGS MAY HAVE KILLED 3,000


A US study has found that two drugs designed to prevent irregular heart beats can actually cause heart attacks in certain types of patient. The Cardiac Arrhythmia Suppression Trial (CAST) began in June 1987 but was halted in April 1989 when doctors found more deaths among patients treated with encainide and flecainide than in those receiving a placebo (dummy pill).1 Based on the findings it has been estimated that, nationwide, 3,000 people may have died prematurely after taking the drugs.2 In contrast, the animal research had indicated that encainide and flecainide were both safe and effective.3


REFERENCES:
1) CAST Investigators, New England Journal of Medicine, 1989, August 10, 406-412.
2) Dr. J. Morganroth reported in Washington Times, 1989, July 26.
3) Flecainide: B. Holmes & R.C. Heel, Drugs, 1985, vol. 29, 1-33; encainide: D.C. Harrison et al, American Heart Journal, 1980, vol. 100, 1046-1054, and J.E. Byrne et al, Journal of Pharmacology & Experimental Therapeutics, 1977, vol. 200, 147-154.



ANTIBIOTICS, GUINEA PIGS, AND HAMSTERS


Years of experimentation have taught scientists that guinea pigs and hamsters are especially sensitive to the harmful effects of antibiotics. For instance, widely prescribed human antibiotics such as ampicillin, amoxycillin and oxytetracycline are considered “toxic" and therefore inappropriate for use in these species.1 Another example is erythromycin, where the usually recommended human dose is enough to kill a hamster!2


Today, “it is generally recognized that the guinea pig is peculiarly sensitive to the lethal effects of antibiotics,"3 but this was not always realized. In his book Drug Development: From Laboratory to Clinic, Dr. Walter Sneader describes how “it was fortunate that Florey and Chain did not decide to use guinea pigs when first testing penicillin, for they may then have abandoned the project as these animals are hypersensitive to penicillin." Florey and Chain were the Oxford scientists who carried out animal tests following Fleming’s discovery of penicillin. Florey later commented “… mice were tried in the initial toxicity tests because of their small size, but what a lucky chance it was, for in this respect man is like the mouse and not the guinea pig. If we had used guinea pigs exclusively we should have said that penicillin was toxic, and we probably should not have proceeded to try to overcome the difficulties of producing the substance for trial in man."4


REFERENCES:
1) A.A. Tuffery (ed.), Laboratory Animals: An Introduction for New Experimenters (Wiley, 1987)
2) The minimum recommended single dose of erythromycin is 250-500 mg every 6 hours, i.e. 3.5-7 mg/kg for a 70 kg person. The lethal dose for hamsters is 3.6 mg/kg (ref. 3)
3) S.J. Desalva et al, Toxicology & Applied Pharmacology, 1969, vol. 14, 510-514
4) H. Florey, Conquest, January, 1953.



DAUGHTERS OF DES


On the basis of animal experiments, the synthetic oestrogen diethylstilbestrol (DES) was suggested as a means of preventing miscarriage.1 Although no proper human (clinical) trials were carried out,2 the procedure nevertheless became widely accepted, and between 1948 and 1971, DES was given to some 2-3 million pregnant women in the US alone.


However, DES was ineffective. In 1953, properly controlled clinical trials showed that DES did not work.3 Tragically, the study failed to report that DES increased abortions, neonatal deaths and premature births, a conclusion that could have been made from the data available in the trial.4 DES was not only ineffective, it was also unsafe. Just how unsafe was only revealed in 1971 when researchers traced a link between exposure to DES and a previously rare form of vaginal and cervical cancer in daughters of women who had taken the drug during pregnancy.5 Almost 600 cases have been reported6 but DES has proved a biological timebomb as side-effects continue to surface in sons and daughters of women who took the drug.


It has been suggested that animal tests provided an early warning of the problems. It is true that in 1938 DES was found to cause breast cancer in male mice, but since the cancer-causing potential of other oestrogens varied according to the strain of mouse used,7 the results could hardly be a serious basis for action. Furthermore, the consensus among animal researchers at the time was that oestrogens did not produce cancer, rather they gave male mice mammary glands and this made them susceptible to the same cancer-causing factors that operated within female animals. In fact, a summary of the animal data in 1941 found “only meager evidence" that oestrogens cause cancer of the cervix.7 Not until the 1970’s did it become clear that in contrast to the majority of animal experiments, DES was a potent cause of cervical cancer in women.


REFERENCES:
1) Health Action International, “Problem Drugs" pack, 1986, May 13
2) D. Brahams, Lancet, 1988, October 15, 916.
3) W.J. Dieckmann et al, American Journal of Obstetrics & Gynecology, 1953, vol. 66, 1062-1081.
4) Y. Brackbill & H.W. Berendes, Lancet, 1978, September 2, 520.
5) A.L. Herbst et al, New England Journal of Medicine, 1971, April 22, 878-881.
6) C. Vanchieri, Journal of the National Cancer Institute, 1992, vol. 84, 565-566.
7) S. Peller, Cancer in Man (Macmillan, 1952)



THE FIRST BETA-BLOCKERS


Beta-blockers were developed for the treatment of heart conditions and the first agents to be administered to human patients were pronethalol and propranolol. Ironically, pronethalol proved generally safe and effective in laboratory animals but failed the clinical test, while propranolol appeared toxic in many animal experiments yet is widely used in clinical practice.


Pronethalol was “well tolerated" by rats and dogs in prolonged toxicity tests at high doses, except for occasional effects on the central nervous system.1 However, clinical trials revealed an unacceptable number of side-effects2 including heart failure, a hazard not predicted by animal experiments.1 Shortly after, long term tests in a certain (Alderley Park) strain of laboratory mouse produced cancer of the thymus gland, but no carcinogenic effects were ever found in rats, guinea pigs, dogs, monkeys or other types of mouse.1


Pronethalol was quickly replaced by propranolol but tests in rats, dogs and mice put further development in jeopardy.3 Moderate to high doses caused rats to collapse and dogs to vomit severely.1 Deaths were also seen in mice shortly after dosing. When the amount of drug was reduced to that used clinically, propranolol was said to be “well tolerated". Even so, some of the rats still had heart lesions.1


Later clinical observations showed that propranolol could also lower the blood pressure,4 and today beta-blockers are widely used for the treatment of high blood pressure.


REFERENCES:
1) J.M. Cruickshank et al in Safety Testing of New Drugs, Eds. D.R. Laurence et al (Academic Press, 1984)
2) W. Sneader, Drug Discovery: The Evolution of Modern Medicines (Wiley, 1985)
3) D.R. Laurence et al (Eds.), Safety Testing of New Drugs (Academic Press, 1984)
4) E.S. Snell, Pharmacy International, 1986, February, 33-37.



X-RAYS & CANCER


In 1956 British doctors drew attention to a link between X-rays during pregnancy and subsequent childhood cancers.1 Within a few years similar findings were reported in American children. But for a quarter of a century, scientists questioned whether X-rays were actually the cause and cited animal experiments to show that the foetus is not especially sensitive to radiation.2 However, it seems that compared with other species, the human foetus is more susceptible to the carcinogenic effects of X-rays,2 and during the 1980’s further observational studies confirmed the hazards, particularly in early pregnancy.3


REFERENCES:
1) A.M. Stewart et al, Lancet, 1956, September 1, 447; British Medical Journal, 1958, June 28, 1495-1508.
2) E.B. Harvey et al, New England Journal of Medicine, 1985, February 28, 541-545.
3) E.G. Knox et al, Journal of the Society of Radiological Protection, 1987, vol. 7, 3-15; E.A. Gilman et al, Journal of Radiological Protection, vol. 8, 3-8.



STEROIDS & THE IMMUNE SYSTEM


Because of their potent effects on the immune system, corticosteroid drugs are widely used in medicine. They also have many side-effects which limit their usefulness, and much research has been carried out to discover exactly how the drugs work. However, there are said to be “remarkable differences in susceptibility to glucocorticosteroids between various species," with animals being classified as steroid-resistant or steroid-sensitive.1 In mice, a steroid-sensitive species, a single dose of cortisone produces a 90% decrease in the thymus, an organ that plays a crucial role in immunity. By contrast, the same dose of cortisone given every day for a week produced only a 37% decrease in the steroid-resistant guinea pig’s thymus. And while steroids inhibit the production of circulating antibodies in sensitive animals, the same effect is difficult to achieve in resistant species.1


Most of the research on corticosteroids has been carried out on steroid-sensitive species such as rats, mice, rabbits and hamsters, whereas human beings are steroid-resistant.1 As researchers at the University of Dundee point out, “The mode of action of these drugs is very complicated, so it is regrettable that most of the extensive literature on animal experimental work is irrelevant to human therapeutics since many species respond in a very different manner from man."2 Consequently they concentrated on human clinical studies and test-tube experiments.


REFERENCES:
1) H.N. Claman, New England Journal of Medicine, 1972, August 24, 388-397.
2) J.S. Beck & M.C.K. Browning, Journal of the Royal Society of Medicine, 1983, vol. 76, 473-479.



UNEXPECTED EYE PROBLEMS LED TO DRUG REJECTION


During clinical trials, the anticancer drug sparsomycin produced eye damage, resulting in serious blind spots in 3 of the 5 patients. Although sparsomycin was highly toxic to several animal species, as would be expected for an anticancer drug, no specific effect on the eye had been found.1 After the eye problems had been reported, further attempts were made to induce the condition in rats and monkeys but these also failed even though rats were dosed every day for 2 weeks with 30-300 times the amount found to harm people.1 No retinal toxicity was observed in additional animal tests and further experimentation was abandoned, as was the drug.


REFERENCE:
1) C.G. Smith et al, Journal of International Medical Research, 1973, vol. 1, 489-503



PETHIDINE ADDICTION


On the basis of experiments with dogs, the narcotic analgesic pethidine was once thought to be non-addictive in people.1 The side effect was not anticipated because pethidine is metabolized, or broken down, much more quickly in dogs, resulting in less exposure to the drug. In fact, dogs metabolize pethidine more than 6 times faster than people.2


Such differences in metabolism are the rule rather than the exception2, 3 and according to Miles Weatherall, former Director of the Wellcome Research Laboratories, “every species has its own metabolic pattern, and no two species are likely to metabolize a drug identically."4


REFERENCES:
1) B. Brodie, Pharmacologist, 1964, vol. 6, 12-26.
2) R. Levine, Pharmacology: Drug Actions & Reactions (Little, Brown & Co., 1978)
3) G. Zbinden, Advances in Pharmacology, 1963, vol. 2, 1-112.
4) M. Weatherall, Nature, 1982, April 1, 387-390.



THE METHANOL SCANDAL


Methanol is employed in a wide variety of consumer products including solid fuels, antifreeze, windshield wiper fluid, paint removers and varnishes, and as a solvent in photocopying machines. It is also imbibed as a cheap alternative to alcohol.


Although methanol is a highly poisonous, potentially lethal substance, this was not realized for many years.1 Common laboratory species such as rats and mice are resistant to its effects,2 and experiments during the early years of the 20th century gave the impression that methanol was only slightly toxic, and far less poisonous than alcohol.3 In fact, methanol is ten times more toxic than alcohol, and a single bout of drinking it can lead to temporary or permanent blindness in people.4 This does not happen in rats, mice, dogs, cats, rabbits, or chickens.3 Eventually, in the 1950’s, and again during the 1970’s, scientists found that the horrifying symptoms of methanol poisoning could be induced in monkeys.2


Animal experiments also proved misleading in devising treatment. During the 1920’s, good results were achieved using bicarbonate in cases of human poisoning, but tragically the results were undermined by animal experiments. In 1955 an analysis of the subject stated that “it is indeed deplorable that about 30 years elapsed before the good effects of this treatment became commonly known, and unfortunately some still doubt its value. It seems that the authors of medical textbooks have paid more attention to the results of animal experiments than to clinical observations."3 The treatment not only failed in animals but generally proved fatal, prompting some researchers to advise against it.


Another approach is to administer alcohol in order to reduce the toxicity of methanol. While this is effective in people, animal tests suggested that it would actually increase the danger of methanol. As a result, some discouraged its use in cases of human poisoning.3 However, both bicarbonate and alcohol have withstood the clinical test and are still recommended for the treatment of methanol poisoning.1


REFERENCES:
1) M.J. Ellenhorn & D.G. Barceloux, Medical Toxicology: Diagnosis & Treatment of Human Poisoning (Elsevier, 1988)
2) T.R. Tephly, Life Sciences, 1991, vol. 48, 1031-1041.
3) O. Roe, Pharmacological Reviews, 1955, vol. 7, 399-412.
4) P. Wingate, Medical Encyclopedia (Penguin, 1983).



ANIMAL VICTIMS ESCALATE AFTER ICI DRUG FAILS


During clinical trials, ICI’s arthritis drug fenclozic acid unexpectedly produced jaundice in some of the patients. Researchers were surprised since tests with rats, mice, dogs, and monkeys had given no hint of liver problems.1 Not content with these results, researchers carried out further experiments with rabbits, guinea pigs, ferrets, cats, pigs, horses, neonatal rats and mice, together with a different strain of rat, but still no evidence of liver damage could be found.1 The ICI researcher commented that, “The quite unexpected onset of jaundice in a few patients caused withdrawal of the drug from humans and initiated a vast programme of experimental work. This search for hepatotoxicity (liver damage) in different species or any indication of its likelihood has so far been unrewarding."1


REFERENCE:
1) S.J. Alcock, Proceedings of the European Society for the Study of Drug Toxicity, 1971, vol. 12, 184-190.



OBESITY DRUG’S HORRIFIC SIDE EFFECTS


During the 1960’s Swiss doctors noticed a sudden and unexpected rise in a dangerous lung disease called obstructive pulmonary hypertension. The cause was traced to aminorex which had been used since 1965 for the treatment of obesity.1 The drug produces an increase in lung pressure leading to chest pains, difficulty breathing, fainting spells, heart problems and, in some cases, death.2 Aminorex’s deadly side effect had not been predicted by animal experiments3 and in 1968 the drug was withdrawn from sale.


Animal experiments continued even after withdrawal but long term administration to rats still failed to induce the disease.2 In dogs, aminorex did increase lung pressure,1 but its relevance to the human condition is unclear since a later analysis concluded that “pulmonary hypertension cannot be induced in experimental animals even with aminorex…"4


REFERENCES:
1) F. Follath et al, British Medical Journal, 1971, January 30, 265-266.
2) E.H. Ellinwood & W.J.K. Rockwell in Meyler’s Side Effects of Drugs, 11th edition, Ed. M.N.G. Dukes (Elsevier, 1988)
3) A.D. Dayan in Risk-Benefit Analysis in Drug Research, Ed. J.F. Cavalla (MTP Press, 1981)
4) P.H. Connell in Side Effects of Drugs Annual - 3, Ed. M.N.G. Dukes (Excerpta Medica, 1979)



RESEARCH ‘PARALYZED’ BY ANIMAL MODELS


During the twentieth century, extensive research has been carried out to develop an animal model that mimics spinal cord injuries (SCI) in people.1 A common procedure is to drop weights on to the spinal cord of cats.2 By using animals, researchers hoped to devise promising therapies and discover new insights into the condition. However, virtually no treatments have been developed that work in human patients.1 In 1988, for instance, Dennis Maiman of the Department of Neurosurgery at the Medical College of Wisconsin, Milwaukee, noted that “In the last two decades at least 22 agents have been found to be therapeutic in experimental SCI … Unfortunately, to date none of these has been proven effective in clinical SCI." The failure to accurately predict human responses is attributed to the artificial nature of the animal model.


In 1990, however, clinical trials did show that high doses of steroids could be beneficial. Some have credited animal tests with the discovery but the claim has been challenged. It is argued that the animal experiments not only were unnecessary but also gave inconsistent results, with some tests suggesting the therapy would actually fail!2


REFERENCES:
1) D. Maiman, Journal of the American Paraplegia Society, 1988, vol. 11, 23-25.
2) S.R. Kaufman, Perspectives on Medical Research, 1990, vol. 2, 1-12.