Anti-choice news-bender Life News has trumpeted proudly that abstinence education totally works, yo. Using the language of science, and some fancy-looking footnotes (which actually lead to, among other things, a book published by a Mormon abstinence education “research centre”), Life News claims that abstinence education works.
Well, they’re sort of right. It does work, if taught as an intensive programme and compared with reading a few textbooks that are also about abstinence, and when tested in a study as full of holes as a colander [not paywalled, and published in a journal I hadn’t heard of].
The participants in the study were ninth-grade pupils in schools in Georgia, a state where abstinence education is already the norm. I’m sure this is a wholly unrelated point, but Georgia also has one of the highest teenage pregnancy rates in the US. Six schools were selected, and parents were asked for consent for their children to participate. Fewer than 40% of pupils were allowed to take part, and among the sample, girls and African Americans were demographically overrepresented. On top of this minor issue, participants knew that they were taking part in a research study, and knew whether they were in the intervention or control group. When this happens, study results tend to skew, inflating the apparent positive effect of the intervention.
I am going to give some credit to the authors of the study: they actually made a brave attempt at using a theory to evaluate the intervention. You’d be surprised how many behavioural interventions are atheoretical clusterfucks, a mishmash of things the authors like, chucked about willy-nilly. Unfortunately, they picked the Theory of Planned Behaviour, which is rather simplistic. And they didn’t even use it that well: they forgot to measure one of the key theoretical constructs (subjective norms), and threw in a bunch of other measures, of things like “hopefulness”, which have absolutely nothing to do with the theory.
Perhaps most importantly, though, the authors failed to take some very important behavioural measures. Sexual behaviour was measured entirely by asking on the questionnaire whether participants had “gone all the way” (using those exact words). So there is no way of knowing whether they had been enjoying any of the rest of the rainbow of sexual experience, or how participants chose to interpret such a euphemistic question. Secondly, the authors report that they were not able to measure whether the sex participants were having was safe, owing to the politics of recruiting participants for the study.
With the measures this royally cocked up and the study run in some dodgy circumstances, what can be concluded? Firstly, that there’s a short-term effect of the more intensive abstinence programme, but in the longer term the effect diminishes. It should be noted that the “long-term” follow-up happened just after the summer holidays, while the “short-term” follow-up happened just before the holidays. So the effect of a more intensive abstinence programme diminishes in the space of a couple of months. It is worth noting, once again, that this is in comparison to doing nothing different from usual.
With this in mind, it is highly disingenuous, or thoroughly scientifically illiterate, of Life News to dress this study up as evidence that abstinence works. It shows nothing of the kind. It shows that in a study which inherently favours a slightly more intensive approach to teaching abstinence, there’s a slight effect for more intensive teaching of abstinence, but that effect fucks off in the space of a summer holiday. And that’s the best they’ve got.
While the study may be inconclusive, I have to take small issue with your summary dismissing it as “[not paywalled, and published in a journal I hadn’t heard of].”
SAGE Open is an Open Access journal, the Open Access movement being about making peer-reviewed research available without a paywall. Open Access as a ‘movement’ is relatively new, which is why a lot of these journals are not as well established as those you may have heard of. They do state this: http://sgo.sagepub.com/, and as I work with Open Access for a living I think it’s worth mentioning that not paying for research does not automatically indicate that the research is flawed. The research therein may well be flawed for a number of other reasons – such as all those you’ve mentioned!
This really doesn’t surprise me one little bit. I live in Texas and shake my head every day.