We have been closely reading all things related to the Common Core State Standards and blogging every weekday for about six months now. We think that the Common Core has some redeeming qualities and that any effort to change schools is likely to be politicized by those who weren’t involved with the effort. Our blog posts are as likely to criticize the CCSS as they are to find points worth celebrating, and in a given week may oscillate between the two.
But we have some concerns about the way research has been stretched by some involved with the development of the Common Core State Standards. We have been very hesitant to write about this in a public forum because we think it worthwhile to pursue the better parts of the CCSS and build from there, rather than bicker and/or start from scratch with new standards. But our concerns about research references around the CCSS have reached a tipping point, and we are beginning to feel a burden of responsibility.
The top ten list below articulates common misuses of research. These are prevalent in and around the CCSS. Pretty much anyone who cites research is guilty of these to some degree. We (Burkins & Yaris) certainly cannot claim to have spotless records on all these counts. There are extremes, however, which warrant noticing, particularly when they are from people who are influencing how the whole country teaches children to read.
Top 10 Tactics of Those-Who-Cite-Research-for-Purposes-of-Persuading-People-Who-Don’t-Generally-Read-Research
10. Talk about the flaws in the studies that refute their opinions.
Consider this analysis of instructional reading level theory by Timothy Shanahan, a member of the CCSS Feedback Team.
9. Disregard flaws in the studies that support their points.
See our analysis of Shanahan’s references to studies he presents to support the idea that frustration level text is best for students.
8. Say “Research says …” but don’t say what research they are talking about.
Here is an example of a blog post by Shanahan with references to reading and writing research but no citations or links to the references. Most of his posts that we have read refer to research without actually linking or citing it. This practice is also common throughout the Common Core documents and videos.
7. Presume that, because they are considered experts, readers will let them get away with not citing research.
We have begun to believe the opposite, however: with increasing celebrity in the literacy world, experts can get sloppy. Some researchers hold themselves to exacting standards when talking about research, but not all do. You know these thoughtful, cautious researchers by the way they qualify what they say and the ways they encourage readers and listeners not to overgeneralize their results.
6. Try to arouse attention with phrases or article titles that create controversy or panic.
Consider the title of a recent post by Timothy Shanahan: Thank Goodness the Writing Scores are Going to Drop. Now that many teachers have their livelihoods attached to test scores, celebrating dropping scores seems a controversial choice. Considering that he is involved with the development of the test, we are not sure how to react to the idea that he is rooting for kids to do poorly.
5. Decide what they believe or wish to be true and then look for (or even conduct) research that supports it.
Consider the appendage to Appendix A of the Common Core State Standards as an example of this after-the-fact inclusion of research. This tacking on of research support is likely to continue because the CCSS were never field-tested, piloted, or even vetted. So now researchers in competing camps must prove after the fact whether the standards "work" or not.
4. Cite tangential, albeit compelling, research that really doesn’t prove the point but distracts and engages us enough to increase the likelihood that we won’t notice the lack of research support.
One example of this smoke-and-mirrors routine relates to the Common Core. Appendix A, as well as a host of official CCSS videos, cites research showing that many college students have to take remedial classes, and the speakers/writers use this as evidence that elementary students should read books at their frustration level. Meanwhile, we have yet to find a single study that supports teaching with frustration-level texts in elementary schools as suggested in the Common Core recommendations to publishers, nor have the Common Core authors offered any. This, of course, doesn’t mean that such studies don’t exist. It does mean, however, that the likelihood of a substantial body of evidence existing is pretty slim.
3. Talk about the ways the research applies to broad populations when the research to which they are referring looked very specifically at a particular population.
One huge illustration of this is the Appendix A authors’ reference to Moss & Newton (2002) regarding the amount of informational text reading students do in elementary schools. See Appendix A of the CCSS, or one of many videos of David Coleman asserting this specific result as representative of all elementary students in classes today. You can read our analysis of this mis-reference in this blog. Another example is Shanahan’s reference to compelling research in favor of putting students in frustration-level text, when the study he cites looked only at shared reading among second graders partnered for fifteen minutes a day. He overlooks the fact that in every student dyad in that study, one of the partners was reading text at around his or her instructional reading level. So half of the second graders made progress when they read frustration-level text with partners, while the other half engaged in reading that Shanahan would have to argue was a waste of their time.
2. Insert adverbs that make us feel that their points are solidly rooted in research, making it less likely that we will question them.
We say to ourselves, “Surely he/she wouldn’t say that the research overwhelmingly shows something if it wasn’t true.” In this transcript of David Coleman’s presentation on Engage NY, entitled Bringing the Common Core to Life, he says “overwhelmingly” in relation to research five times, without offering any specific references to studies on which we can follow up. In reality, there is little that research “overwhelmingly” shows. Quantitatively speaking, the best bet is a meta-analysis or statistical summary of research, in which case Coleman could offer specific numbers in summary or cite patterns across studies.
1. Assume that you won’t read the research to which they are referring.
The only way this changes is if we all read a little more, and a little more closely. Perhaps, if we all draw on our college-and-career skills and read like detectives and write like investigative reporters, we can hold decision makers and policy writers accountable for their claims.