Medical treatment by a doctor is not risk-free. In an ideal world, diagnosis would be accurate, and treatment would be appropriate for the patient’s condition and free of adverse side effects. In the real world, the risks of excess medical treatment arise from the healthcare system’s fallibility both in diagnosing illness correctly and in providing appropriate care.
In a recent article on the ineffectiveness and potential harms of health screening, we saw that imperfect diagnosis can lead to medical treatment that carries health risks. False negatives – failing to find a real disease – and false positives – “detecting” a disease that isn’t really there – make the screening of healthy people much less safe and effective than most people realize. The diagnosis or even suspicion of a non-existent cancer, for example, can lead to further testing and treatment, including biopsies, chemotherapy, and surgery, all of which carry risks ranging from anxiety to death.
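The base-rate arithmetic behind that claim is worth spelling out. Here is a minimal sketch; the prevalence, sensitivity, and specificity below are purely illustrative assumptions, not figures for any particular screening test:

```python
# Illustrative (assumed) numbers: why screening healthy people yields
# mostly false positives, even with an accurate-sounding test.
prevalence = 0.005      # assume 0.5% of screened people actually have the disease
sensitivity = 0.90      # P(test positive | disease)
specificity = 0.95      # P(test negative | no disease)

population = 100_000
sick = population * prevalence          # 500 people
healthy = population - sick             # 99,500 people

true_positives = sick * sensitivity             # 450
false_positives = healthy * (1 - specificity)   # 4,975

# Of everyone who tests positive, what fraction is actually sick?
ppv = true_positives / (true_positives + false_positives)
print(f"Positive predictive value: {ppv:.1%}")  # about 8.3%
```

Under these assumed numbers, more than nine out of ten positive results are false alarms – each one a candidate for the follow-up testing and treatment described above.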
Back in the early 1970s, a Stanford psychologist, David Rosenhan, sent eight volunteers, one of whom was himself, to a number of different mental hospitals around the country. All of the volunteers were completely normal. They made appointments at these hospitals and claimed that they had been hearing voices that said “empty”, “hollow”, and “thud”. All were diagnosed as schizophrenic and admitted. This became well known as the Rosenhan experiment.
After the “patients” were admitted, they stopped pretending to hear voices, acted normally, and told the staff that they were fine. Even so, it took anywhere from days to a couple of months before they were released, all with a diagnosis of schizophrenia in remission. Meanwhile, they were prescribed anti-psychotic drugs (which they didn’t take) and confined to the hospital.
The false positive rate was 100%.
After Rosenhan published a paper on this experiment, psychiatrists were outraged at having been tricked, and the head of one mental hospital challenged Rosenhan to send his volunteers there, claiming that he would readily detect them. Rosenhan agreed. Out of about 200 new patients seen over the following months, the hospital’s doctors judged 41 to be definite fakes and another 42 to be suspected fakes. However, Rosenhan had sent no one.
Assuming those patients really were mentally ill but were judged to be faking, the false negative rate was roughly 20 to 40%, depending on whether you count only the definite fakes (41 of about 200) or include the suspected fakes as well (83 of about 200).
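The arithmetic behind those bounds, using the approximate figures above (about 200 patients, 41 definite and 42 suspected fakes), can be spelled out:

```python
# Rough arithmetic for the second Rosenhan study, assuming all ~200
# genuine patients should have been judged genuine.
patients = 200            # approximate number seen over the period
definite_fakes = 41       # genuine patients labeled definite fakes
suspected_fakes = 42      # genuine patients labeled suspected fakes

low_estimate = definite_fakes / patients                       # 20.5%
high_estimate = (definite_fakes + suspected_fakes) / patients  # 41.5%
print(f"False negative rate: {low_estimate:.1%} to {high_estimate:.1%}")
```

Either way you count, a substantial fraction of presumably genuine patients were written off as fakers.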
Do you suppose psychiatric diagnosis has improved much since the early 1970s? I’ve no idea.
People diagnosed with a mental illness can be given a wide range of mind-altering and toxic drugs, including antipsychotics, antidepressants, and tranquilizers, which can and sometimes do kill people.
It might be objected that a group of people faking psychiatric symptoms tells us little about psychiatric treatment, but normal people often suspect that something is psychologically wrong with them and seek treatment for it. Psychiatrists and psychologists should be able to detect normality and simply say, “there’s nothing wrong with you.” Yet many people are placed on SSRIs or other drugs, and it’s a good bet that some fraction of them have little wrong with them.
In a recent economics paper, researchers sent a test patient with healthy teeth to 180 different dentists in Switzerland.
Fifty of the 180 dentists (about 28%) recommended that the patient have at least one cavity filled. The researchers also found that the patient was considerably more likely to receive a treatment recommendation when the dentist needed the money.
If you can’t trust a Swiss dentist, who can you trust?
Imagine what happens when you go to a doctor. You could end up with toxic medications, surgery, or chemotherapy.
Most physicians, through no fault of their own, don’t know how much of what they practice is not based on high-quality evidence. John Ioannidis et al. write:
Most physicians and other healthcare professionals are unaware of the pervasiveness of poor quality clinical evidence that contributes considerably to overuse, underuse, avoidable adverse events, missed opportunities for right care and wasted healthcare resources. The Medical Misinformation Mess comprises four key problems. First, much published medical research is not reliable or is of uncertain reliability, offers no benefit to patients, or is not useful to decision makers. Second, most healthcare professionals are not aware of this problem. Third, they also lack the skills necessary to evaluate the reliability and usefulness of medical evidence. Finally, patients and families frequently lack relevant, accurate medical evidence and skilled guidance at the time of medical decision-making. Increasing the reliability of available, published evidence may not be an imminently reachable goal. Therefore, efforts should focus on making healthcare professionals more sensitive to the limitations of the evidence, training them to do critical appraisal, and enhancing their communication skills so that they can effectively summarize and discuss medical evidence with patients to improve decision-making. Similar efforts may need to target also patients, journalists, policy makers, the lay public and other healthcare stakeholders.
Medical treatment comes with risks.
In one report, a surgeon reviewing patients who had been scheduled for spinal surgery determined that 17.2% of them did not need the operation.
From the physician’s perspective, overtreatment is common. In one survey, physicians themselves estimated that
20.6% of overall medical care was unnecessary, including 22.0% of prescription medications, 24.9% of tests, and 11.1% of procedures. The most common cited reasons for overtreatment were fear of malpractice (84.7%), patient pressure/request (59.0%), and difficulty accessing medical records (38.2%).
Polypharmacy occurs when people take too many prescription drugs, or drugs that were inappropriately prescribed. It is a significant problem, especially among older people.
In one study at a Veterans Administration hospital, 65% of patients were taking at least one drug that was inappropriate.
Virtually all drugs have adverse side effects, some of them serious.
The practice of medicine is as much art as science.
Physicians are not infallible, and medical treatment, whether surgery or drugs, carries risk.
Solving this problem isn’t easy, but one of the main things patients can do is to educate themselves.
Fear of malpractice may drive many of a physician’s decisions, which means your best interests may not always come first.