Chance News 74

From ChanceWiki

Quotations

"The government are very keen on amassing statistics. They collect them, add them, raise them to the nth power, take the cube root and prepare wonderful diagrams. But you must never forget that every one of these figures comes in the first instance from the village watchman, who just puts down what he damn pleases."

Sir Josiah Charles Stamp [1880-1941]

As quoted by Howard Wainer in Picturing the Uncertain World (Princeton University Press, 2009), p. 23. The above observation is by no means dated, as shown by the following excerpt from p. 24:

"The term curbstoning describes a practice that is widespread enough to warrant inclusion in the glossary of The 2000 Census: Interim Assessment, which defines it as 'The practice by which a census enumerator fabricates a questionnaire for a residence without actually visiting it.' "

Submitted by Paul Alper


"So when I hear scientists say, 'The data speak for themselves,' I cringe. Data never speak. And data generally and most often are politically and socially inflected."

Andrew J. Hoffman, Professor of Sustainable Enterprise, University of Michigan

As quoted in Q. and A.: Taking On Climate Skepticism as a Field of Study, Green Blog, New York Times, 9 April 2011

Submitted by Bill Peterson

Forsooth

"Is there one New Yorker in a hundred who knows what the morning line is? (It’s the track oddsmaker’s estimate of how the public will bet each horse in a race, expressed in probabilities like 3/1 or 7/2.)"

Belmont awaits its annual return to glory, New York Times, 9 June 2011

Submitted by Bill Peterson

Overdiagnosed, overtreated

Obviously, the sun revolves around the earth, the harmonic series converges and the earth is flat. Likewise, the prevailing medical paradigm is: early screening leads to early intervention, resulting in improved medical outcomes. Completely plausible, and believed by most patients and medical practitioners alike. If you are one of the believers, then you ought to read H. Gilbert Welch’s Overdiagnosed: Making People Sick in the Pursuit of Health (Beacon Press, 2011).

It has become increasingly evident with regard to prostate cancer or breast cancer that, “While it may seem that screening can only help you, it can also hurt you: it can lead you to being overdiagnosed and treated needlessly.” Consider a different cancer, one far less frequent: thyroid cancer. According to Welch, the ratio of prostate cancer deaths to diagnoses is 1 to 6 but the ratio for thyroid cancer is 1 to 20.

One possible explanation for this, as you will recall from similar issues with prostate cancer, is that we are really, really good at treating thyroid cancer. The other is less optimistic: that many of the diagnosed cancers didn’t need treatment in the first place.

Further, despite the “dramatic growth in the number of thyroid cancers found,” “The death rate for thyroid cancer is rock-solid stable.” He puts it boldly for thyroid cancer: “Here there is just a downside—a lot of diagnosis and no change in mortality…there is no discernible benefit.” This overtreatment results in the unnecessary removal of the thyroid gland and the need to take medication for the rest of the person’s life.

On page 64 is a time series graph of new thyroid cancer diagnoses and thyroid cancer deaths from 1975 to 2005; new diagnoses rise inexorably over the 30-year period while deaths stay flat. Two pages later there is virtually the same graph, now for melanoma, and again the ratio of diagnoses to deaths is high enough to speculate that “there is less an epidemic of melanoma than an epidemic of diagnoses.”

One reason for the general increase in overdiagnoses is the changing of the rules, the moving of the goalposts. Thresholds for diabetes, hypertension, hyperlipidemia and osteoporosis have been changed such that the disease prevalence has increased by 14%, 35%, 86% and 85%, respectively. “Whether or not” the cutoff changes were “a good thing for the individuals is a tough question. But there is no question about whether or not it was a good thing for business. These changes substantially increased the market for treatments—and the money to be made from them.”
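The mechanism behind those prevalence jumps is simple: when a disease is defined by a threshold on a continuous measurement, lowering the cutoff sweeps a new slice of the population into the "diseased" category. Here is a minimal Python sketch of the effect; the distribution, cutoff values, and sample size are hypothetical, chosen only to illustrate the mechanism, and are not taken from the book.

```python
import random

random.seed(1)

# Hypothetical biomarker values for a population, drawn from a normal
# distribution -- illustrative numbers only, not real clinical data.
population = [random.gauss(100, 15) for _ in range(100_000)]

def prevalence(cutoff):
    """Fraction of the population labeled 'diseased' at a given cutoff."""
    return sum(v >= cutoff for v in population) / len(population)

old_cutoff, new_cutoff = 140, 126   # lowering the diagnostic threshold...
old_p = prevalence(old_cutoff)
new_p = prevalence(new_cutoff)

print(f"prevalence at cutoff {old_cutoff}: {old_p:.1%}")
print(f"prevalence at cutoff {new_cutoff}: {new_p:.1%}")
print(f"relative increase: {new_p / old_p - 1:.0%}")
```

No one's health changed between the two print statements; only the definition did. The size of the jump depends entirely on where the cutoff sits relative to the population distribution.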

As might be expected, the experts who set the cutoffs had financial relationships with the pharmaceutical industry, which stood to gain many millions of new customers. However, “To be fair, many of these experts may be true believers…but the fact that there is so much money on the table may lead them to overestimate the benefits and underestimate the harms of overdiagnosis.”

The publisher's webpage excerpts the Introduction to the book, and provides links to numerous reviews from current newspapers.

The author was interviewed on the radio show People's Pharmacy; the interview can be listened to here.

Discussion
1. Something of a companion book on this topic is Shannon Brownlee’s Overtreated: Why Too Much Medicine Is Making Us Sicker and Poorer. Although her book was published four years before Overdiagnosed, Welch makes no reference to it.

2. HealthNewsReview.org is an excellent website that deals critically with health and medical reporting, especially questions of harms vs. benefits and relative vs. absolute risks.

3. Why would an overdiagnosis have health insurance consequences?

4. Welch often mentions the term, “incidentaloma.” Wikipedia says “In medicine, an incidentaloma is a tumor (-oma) found by coincidence (incidental) without clinical symptoms or suspicion. It is a common problem.” Why might an incidentaloma lead to overdiagnosis?

5. Lawyers and fear of a malpractice suit are discussed in the book. Why might they lead to overdiagnosis and overtreatment?

Submitted by Paul Alper

See also More Harm Than Good: What Your Doctor May Not Tell You about Common Treatments and Procedures, 2008. The authors are strong advocates for evidence-based medical decision-making.

Submitted by Margaret Cibes

A related graphic

The following graphic appeared in a 2009 New York Times article, Cancer Society, in shift, has concerns on screenings.

http://graphics8.nytimes.com/images/2009/10/21/health/1021-biz-cancergraphic/popup.jpg

The article quotes Dr. Otis Brawley, chief medical officer of the American Cancer Society: “We don’t want people to panic. But I’m admitting that American medicine has overpromised when it comes to screening. The advantages to screening have been exaggerated.”

See also Lung cancer screening may increase your risk of dying from Chance News 25.

Submitted by Paul Alper

Kidney cancer

A classic example of data maps is presented by Howard Wainer in Picturing the Uncertain World; it can be seen here. Many rural counties show up in the lowest decile for kidney cancer rates, which might suggest that rural climate or lifestyle confers some benefit. But then it is seen that adjoining, similarly rural counties turn up in the highest decile. Shown below is a map that accompanied Wainer's discussion of the example in an article entitled The Most Dangerous Equation (American Scientist, May-June 2007, Volume 95, Number 3, p. 249).

http://www.americanscientist.org/Libraries/images/2007327141118_846.gif

The highest decile counties are shown in red, the lowest in green. What we are seeing is the effect of variation in small samples: because the rural counties have so few people, having one case vs. no cases makes the difference between being in the top vs. the bottom decile for standardized rates of kidney cancer!
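Wainer's titular "most dangerous equation" is de Moivre's: the standard error of a rate shrinks as 1/√n, so rates computed on small populations are the most variable. The effect can be reproduced with a minimal simulation. In the sketch below (county sizes and the disease rate are hypothetical), every county shares exactly the same true rate, yet both extreme deciles fill up with the smallest counties:

```python
import random

random.seed(0)

TRUE_RATE = 0.0005      # the SAME hypothetical underlying rate everywhere
N_COUNTIES = 300

# County populations ranging from small rural to large urban (hypothetical).
pops = [random.choice([1_000, 5_000, 20_000, 50_000]) for _ in range(N_COUNTIES)]

# Observed cases per county are chance draws at the identical true rate;
# the observed rate is cases / population.
counties = []
for pop in pops:
    cases = sum(random.random() < TRUE_RATE for _ in range(pop))
    counties.append((cases / pop, pop))

counties.sort()                     # sort counties by observed rate
decile = N_COUNTIES // 10
bottom, top = counties[:decile], counties[-decile:]

def mean_pop(group):
    return sum(p for _, p in group) / len(group)

overall = sum(pops) / len(pops)
print(f"mean population, lowest-rate decile:  {mean_pop(bottom):,.0f}")
print(f"mean population, highest-rate decile: {mean_pop(top):,.0f}")
print(f"mean population, all counties:        {overall:,.0f}")
```

Both extreme deciles come out far smaller than the average county, even though no county is genuinely healthier or sicker than any other; that is the whole trap the map illustrates.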

Submitted by Paul Alper

Don't bother me with facts!

“How facts backfire: Researchers discover a surprising threat to democracy: our brains”
by Joe Keohane, boston.com, July 11, 2010

Political scientists have been studying people’s ability to revise uninformed opinions and beliefs when confronted with accurate information, and they have concluded:

Facts don’t necessarily have the power to change our minds. In fact, quite the opposite.

Studies appear to indicate that when people with strong political views are shown corrected facts in news stories, “they rarely changed their minds.” In fact, “our beliefs can dictate the facts we choose to accept.” Add this to the unprecedented amount of information available today and the widespread “political ignorance of the American voter," and political scientists are concerned about the threat to rational voting choices by our citizenry.

A 2000 study, in particular, found that while only 3% of 1,000 Illinois residents answered more than half of a series of questions about welfare correctly, more than half expressed confidence in their answers. Other studies have confirmed this disconnect between knowledge and confidence with respect to partisan issues. Accepting or rejecting information according to whether it fits our existing beliefs is called “motivated reasoning.”

The focus of this article is a more recent study[1] of undergraduates at a Midwestern college in 2005-2006. In one experiment, participants first read news stories containing a planted fact that was provably false but widely believed, and then read a correction. The result: those who had believed the non-fact before reading the correction believed it even more strongly afterward. This effect was stronger in self-identified conservatives than in self-identified liberals.

This article discusses several other studies on the potential effect of education on participants, and posits possible solutions to this problem.

Submitted by Margaret Cibes