Conclusions Based on Estimates Based on Estimates
[Update: After posting this, I noticed that the CRC's web site links to the very persuasive California Catholic Daily article referred to here, with the link caption, "How does Planned Parenthood explain this?"]
Yesterday someone pointed us to an interesting article proving that comprehensive sex-ed doesn't work. California Catholic Daily, which is certainly going to be an authoritative and objective publication with no axe to grind when it comes to sex ed, had this:
A study published last month in the Californian Journal of Health Promotion reports that in 2005 there were 1.1 million new cases of sexually-transmitted infections among young people in California.
The 1.1 million figure is ten times higher than previously believed, and it means that in the 15-24 age group, diseases such as chlamydia, gonorrhea, syphilis, HPV and HIV now infect almost one out of every four young Californians.
Is this because of a lack of sex-ed in the public schools? Apparently not. According to Chris Weinkopf, editorial-page editor of the Los Angeles Daily News, the California Department of Education reports that "96 percent of California school districts provide comprehensive sexual health education" and all California schools have been required to teach HIV/AIDS prevention education since 1992.
Can we blame the abstinence-only programs promoted by the Bush administration? Not in California. Weinkopf notes that state law prohibits 'abstinence-only' education in the public schools. In addition, California may be the only state in the country that has refused to accept millions of federal dollars for abstinence education. How does Planned Parenthood explain this?

Rate of sexually-transmitted diseases soars among young Californians
Okay, you get the idea. They go on to quote some experts from Focus on the Family.
Turns out it's not hard to find the paper they're talking about: HERE. It's called Sexually Transmitted Infections Among California Youth: Estimated Incidence and Direct Medical Cost, 2005, by Jerman, Constantine, and Nevarez.
Now this is interesting. They explain what they did in the abstract:
On the basis of the methods developed at the Centers for Disease Control and Prevention we estimated the statewide number of new cases of eight major STIs among young persons aged 15 to 24 years in California in 2005: chlamydia, gonorrhea, syphilis, genital herpes, human papillomavirus (HPV), hepatitis B, trichomoniasis, and HIV.
Notice, they didn't measure the number of new cases, they "estimated" them.
We'll back up a step, following their reasoning here, because this is fascinating. They say:
Basing our methods on those developed by Weinstock et al. (2004) for their national study, we estimated the incidence of eight major STI among young persons aged 15 to 24 years in California in 2005: chlamydia, gonorrhea, syphilis, genital herpes, HPV, hepatitis B, trichomoniasis, and HIV.
So let's go see where this came from.
We can find Weinstock, et al., HERE. Reading this is unbelievable. They take bad data and adjust it. Like, they say:
In 2000, a total of 358,995 new cases of gonorrhea were reported to the CDC, of which 60% were among persons aged 15–24. Previous estimates of gonorrhea incidence have assumed a 50% underdiagnosis and underreporting rate. Applying this assumption to the available level II national surveillance data, we estimate that 718,000 new cases of gonorrhea occurred in 2000 and that 431,000 cases occurred among persons aged 15–24.
They explain that when they say "Level II," they mean that these are data from "fair" samples, without representative statistical sampling, with incomplete national reporting, and with extrapolations and assumptions based on representative national surveys.
So they take lousy numbers and double them, assuming they're only half of what they should be. This is one shaky piece of research to build on, I'll say.
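Just to make that concrete, here's a minimal sketch in Python of the arithmetic as I read it. The reported count and the 60% share come straight from the quoted passage; dividing by one minus the rate is my reading of how a "50% underdiagnosis and underreporting rate" gets applied.

```python
# A minimal sketch of the adjustment described in the Weinstock et al.
# quote above. The inputs come straight from the quoted passage; the
# formula is my reading of how the assumed rate is applied.

reported_cases = 358_995    # gonorrhea cases reported to the CDC in 2000
share_aged_15_24 = 0.60     # fraction of reported cases among 15-24-year-olds
underreporting_rate = 0.50  # assumed: reported cases are only half the total

# Inflate the reported count to compensate for the assumed underreporting.
estimated_total = reported_cases / (1 - underreporting_rate)  # ~718,000
estimated_15_24 = estimated_total * share_aged_15_24          # ~431,000

print(f"Estimated total new cases:    {estimated_total:,.0f}")
print(f"Estimated cases among 15-24s: {estimated_15_24:,.0f}")
```

That's the whole adjustment: one multiplication by two, justified by an assumption carried over from previous estimates.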
But at least Weinstock, et al. used actual numbers collected in the field, even if they weren't very good. The paper that based its methods on that one -- the Jerman paper, quoted in this Catholic magazine -- didn't even do that. Listen to what they say:
Because county-specific numbers of STIs are not available for most STIs, we extrapolated the California estimates for chlamydia, gonorrhea, syphilis, genital herpes, HPV, hepatitis B, and trichomoniasis to the county level using the number of gonorrhea and chlamydia cases reported in 2005 in California (California Department of Health Services, 2006d), which are available at the county level. For both chlamydia and gonorrhea among 15-24-year-olds, we calculated the proportion of each county’s cases from the statewide total. We then averaged the chlamydia and the gonorrhea proportion to obtain an overall proportion for each county. This was then multiplied by the statewide estimate for each STI to obtain county-level estimates for each STI. Given the limited data available on other proxies for risk at the county level, we assumed that the distributions of other STIs are associated with the distribution of gonorrhea and chlamydia. Because the distribution of chlamydia and gonorrhea differs slightly, and because it is unknown which of the two the other diseases follow most closely, we weighted equally the gonorrhea and chlamydia distributions.
I found this rather dense, so I'm going to try to break it down into steps.
They had reported case counts for gonorrhea and chlamydia at the county level, and statewide estimates for the other diseases.

For each county, they calculated what share of the statewide gonorrhea cases and what share of the statewide chlamydia cases (among 15-24-year-olds) came from that county, and averaged the two shares to get a single number per county.

Then, because they wanted county-level estimates for syphilis, genital herpes, HPV, hepatitis B, and trichomoniasis too, and assuming that all these diseases are distributed across counties the same way gonorrhea and chlamydia are, they multiplied each county's averaged share by the statewide estimate for each disease (there's a worked sketch of this arithmetic below).
County-specific numbers of HIV infection also are not available, and thus, we used a similar method to extrapolate the statewide estimate for HIV to the county level...
OK, I'll stop.
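But to see what that extrapolation amounts to in practice, here's a minimal sketch with two hypothetical counties. Only the method follows the quoted passage; every number below is made up for illustration.

```python
# A sketch of the county-level extrapolation Jerman et al. describe
# above, using made-up counts for two hypothetical counties.

# Hypothetical reported 2005 cases among 15-24-year-olds.
county_cases = {
    "County A": {"chlamydia": 9_000, "gonorrhea": 2_000},
    "County B": {"chlamydia": 1_000, "gonorrhea":   500},
}

# Statewide totals for the two diseases with county-level data.
state_totals = {
    disease: sum(c[disease] for c in county_cases.values())
    for disease in ("chlamydia", "gonorrhea")
}

# Hypothetical statewide incidence estimate for a disease with no
# county-level data at all (say, trichomoniasis).
statewide_estimate = 100_000

for county, cases in county_cases.items():
    # Each county's share of statewide chlamydia and gonorrhea cases...
    shares = [cases[d] / state_totals[d] for d in ("chlamydia", "gonorrhea")]
    # ...weighted equally, as the paper says...
    avg_share = sum(shares) / len(shares)
    # ...then applied to the statewide estimate for the other disease,
    # assuming it is distributed across counties the same way.
    print(f"{county}: share {avg_share:.1%}, "
          f"extrapolated cases {avg_share * statewide_estimate:,.0f}")
```

The county numbers for the other six diseases never come from county data at all; they're the statewide guess, sliced up along the gonorrhea-and-chlamydia pattern.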
I hope you get the point. These are made-up numbers, extremely dubitable, if there is such a word. The California Catholic Daily notes that the derived estimates are "ten times higher than previously believed," and then goes on to assume that there are ten times as many cases out there as anybody thought; oh, and it's because of comprehensive sex-ed. The skeptical reader looks at these weird techniques and the discrepancy from other published results, and concludes that there is a good chance the assumptions and methods used were invalid, and that this attempt at creating good information from bad data might not really work.
My criticism here is not against the research. I take this to be a developing field, and this is a minor article in a parochial journal. My criticism is of manipulative people using a weak study like this as evidence to support conclusions they were already convinced of before they read the study. Trust me, there is not a chance -- not a chance -- that the California Catholic Daily was going to say, "Wow, we just read a study in a journal, and it turns out comprehensive sex education really does work." No, when studies find it works, this magazine just doesn't mention them. The goal is not to increase the readers' knowledge, it is to reinforce their preconceptions. And then people like our friend who showed us this can take that and feed off of it, as if this article really were convincing, with its solid facts and unblinking hard reasoning.
And look, again, I don't want to be too critical of these studies. Health researchers have a tough job. Nobody knows how many people have such-and-such disease, whether it's HIV or influenza or acne, but it would be great to know, so that epidemics can be identified and controlled before they become pandemic. These researchers have lots of bad data -- counts of how many people went to the doctor for something, with no real clue how many people actually have the disease -- and maybe they can use what they do know to make predictions about things they have no data for. Weinstock, et al. admit from the start that everybody figures the reported numbers are about half of the real thing; nobody knows the exact shortfall, only that it's large. I sympathize with the researcher who has to try to get something out of these kinds of numbers, and I expect that as medical data-mining becomes more sophisticated, and as probability distributions and correlations become better understood, the estimates will improve.
This paper did not demonstrate that comprehensive sex education "doesn't work." Using ball-park estimates derived from ball-park estimates by applying unjustified adjustments based on inexplicable assumptions, they came up with some guess at what might really be out there. A guess that we are told is ten times higher than estimates by others in the field.
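To put a number on how fast that compounds, here's a back-of-the-envelope sketch. None of these intervals come from either paper; they're hypothetical, just to show what happens when each step applies an uncertain multiplier to an already-uncertain input.

```python
# Hypothetical illustration of how uncertainty compounds across
# chained estimates. No interval here is from either paper.

def scale_interval(interval, factor_low, factor_high):
    """Multiply a (low, high) interval by an uncertain factor."""
    low, high = interval
    return (low * factor_low, high * factor_high)

# Start from a reported count, assumed accurate to within 10%.
reported = (90_000, 110_000)

# Step 1: correct for underreporting with a multiplier somewhere
# between 1.5x and 2.5x (the "about half get reported" idea).
adjusted = scale_interval(reported, 1.5, 2.5)

# Step 2: extrapolate to a disease with no direct data, using a
# proportion itself only trusted to within a factor of two.
extrapolated = scale_interval(adjusted, 0.5, 2.0)

print(f"After step 1: {adjusted[0]:,.0f} to {adjusted[1]:,.0f}")
print(f"After step 2: {extrapolated[0]:,.0f} to {extrapolated[1]:,.0f}")
# The final range spans more than a factor of eight: ballpark in,
# ballpark squared out.
```

Two modest-looking steps, and the plausible range already runs from under 70,000 to over half a million.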
Even the authors make it clear that these are unreliable estimates. Jerman et al. say (note that they went a step further in this paper and estimated the direct medical cost of all these STIs):
We relied on various assumptions in our calculations of incidence and cost estimates. Therefore, the estimates we have derived should be considered approximations. Our analyses are subject to the same limitations as were the methods and cost-per-case estimates on which we relied in our calculations. For example, the assumptions included rates of underreporting, proportion of STIs among young people, treatment guidelines, previous cost estimates, non exhaustive direct medical costs, and others.
In other words, the sky may be falling, the sky may be falling, we can't tell.
7 Comments:
October is on course to record the second consecutive decline in U.S. military and Iraqi civilian deaths and Americans commanders say they know why: the U.S. troop increase and an Iraqi groundswell against al-Qaida and Shiite militia extremists.
Maj. Gen. Rick Lynch points to what the military calls “Concerned Citizens” — both Shiites and Sunnis who have joined the American fight. He says he’s signed up 20,000 of them in the past four months.
“I’ve never been more optimistic than I am right now with the progress we’ve made in Iraq. The only people who are going to win this counterinsurgency project are the people of Iraq. We’ve said that all along. And now they’re coming forward in masses,” Lynch said.
"This paper did not demonstrate that comprehensive sex education "doesn't work.""
You don't need a new paper to do so. We had a national experiment in the 70s. Despite the revisionism of advocacy groups like the Guttmacher Institute, comprehensive sex ed, unless taught with a moral perspective, increases sexual activity, pregnancy among teens and STDs.
Of course, the ab-only movement pressed the sex ed movement to strategically give mention to abstinence in their programs and this is an improvement, but we still have a ways to go to clean up the amoralism being pushed on kids.
Testimony that the director of the Centers for Disease Control and Prevention planned to give yesterday to a Senate committee about the impact of climate change on health was significantly edited by the White House, according to two sources familiar with the documents.
Specific scientific references to potential health risks were removed after Julie L. Gerberding submitted a draft of her prepared remarks to the White House Office of Management and Budget for review.
Instead, Gerberding's prepared testimony before the Senate Environment and Public Works Committee included few details on what effects climate change could have on the spread of disease. Only during questioning did the director of the government's premier disease-monitoring agency describe any specific diseases likely to be affected, again without elaboration.
A CDC official familiar with both versions said Gerberding's draft "was eviscerated," cut from 14 pages to four. The version presented to the Senate committee consisted of six pages.
The official, who spoke on the condition of anonymity because of the sensitive nature of the review process, said that while it is customary for testimony to be changed in a White House review, these changes were particularly "heavy-handed."
It's interesting to notice the steady progress Anonymous has made in pirating this blog site. It seems it is more his than TTF's. So tiresome! I guess the expression, "Give 'em an inch, and they take a mile" is true after all.
Anon, one good thing about this website is that it is a place where dialog can take place. Over the years we have had numerous "Anons," sometimes CRC people and sometimes apparently not, but there is always somebody to take the position ... complementary ... to ours. Though they are different people, there is certainly a personality type!
I blocked Illiterate Anon's IP number some months ago, when his ravings were just too stupid. But all in all, they serve a purpose here of expressing the other side's views, and showing why our position is necessary.
A lot of people post links in their comments, and sometimes I end up posting them.
JimK
Jim...where are you? Did you miss this?
http://www.washingtonpost.com/wp-dyn/content/article/2007/10/23/AR2007102301806.html
Daughter Knows Best
By Ruth Marcus
Wednesday, October 24, 2007; Page A19
...no doubt smarter than her dad, but wiser?
October is on course to record...
A poll released yesterday by the Associated Press made it official: Americans are more likely to believe in ghosts (34 percent) than to believe that President Bush is doing a good job with the war in Iraq (29 percent).
Dana Milbank, Washington Post
Friday, October 26, 2007; Page A02