My husband walked in the door after a full day of technology troubleshooting and making scheduling changes at his school and had to listen to me rant about an article I read today via the ASCD SmartBrief. Mid-rant, he lovingly reminded me that I have access to a blog. Now you, gentle reader, shall get to enjoy the full wrath of my rant. Because if you can't rant on your own blog, where can you?
Background: every day, the Association for Supervision and Curriculum Development (ASCD) sends out an e-mail called SmartBrief. According to ASCD's website:
ASCD SmartBrief brings you the K-12 education news that really matters. Our editors handpick key articles from hundreds of publications, do a brief summary of each and provide links back to the original sources. In other words, we do all the research...and you get the news you need, without the fluff.

This statement implies someone trawled the web looking for articles related to education. They then pick key articles and e-mail a summary and link to a lot of educators (I couldn't find the exact number on the ASCD site, so please forgive my use of vague qualitative data to prove a point). The article that was picked as the lucky above-the-scroll article on September 7, 2007 is called: Daily News exam finds math scores up when difficulty rating went down. In fact, a version of the article title appeared in the subject line of the SmartBrief.
Let's set aside for the moment the difference between causation and correlation. Let's ignore for a moment that the author is discrediting increases in scores that came about because of improvement in instruction and curriculum. I can even forgive Erin Einhorn for misidentifying p-value (it's the percent of students who responded correctly to a question, NOT "The easy score - called a Probability-value") or assuring the reader that her conclusions are valid because . . . well, she says they are.
Three experts said The News' findings were valid.

Thirty-four kids were given the 2002 and 2005 tests. They did better on the 2005 test. Therefore, it's an easier test. I think I'm going to try that approach in my dissertation. The paragraph following the quote above contains a statement from one statistician who talks about the significance of their study. The other two apparently wanted to remain confidential sources or anonymous statisticians. I just got a really big chuckle at the idea of statisticians skulking about in the shadows with copies of SPSS tucked furtively under their coats. Admit it. It's funny.
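To pin down the p-value confusion: in classical test theory, an item's p-value is nothing more than the proportion of examinees who answered that item correctly, so a higher p-value means an easier item, not a significance level. A minimal sketch of the computation (the student responses below are invented for illustration):

```python
# Classical test theory: an item's p-value is the proportion of
# examinees who answered it correctly -- a difficulty index, not
# a statistical significance level. Higher p-value = easier item.

def item_p_values(responses):
    """responses: list of per-student lists of 0/1 item scores."""
    n_students = len(responses)
    n_items = len(responses[0])
    return [
        sum(student[i] for student in responses) / n_students
        for i in range(n_items)
    ]

# Invented scores for 4 students on 3 items.
scores = [
    [1, 0, 1],
    [1, 1, 0],
    [1, 0, 0],
    [1, 1, 1],
]
print(item_p_values(scores))  # [1.0, 0.5, 0.5]
```

Everyone got item 1 right (p = 1.0, very easy); half got items 2 and 3 (p = 0.5, harder). That's all a p-value tells you here.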
What deflects my anger away from the author, besides silently sulking statisticians, is that in a link below the article, the author quotes a researcher who candidly admits that p-value isn't a good measure of the quality of a test. The author appears to have done some research. In the link, but not the main article, she correctly defines p-value and gives a solid example. Though, I'm baffled why she insists on calling the p-value an "easy score," calling state assessments "quizzes," and completely ignoring the concept of standard setting when she writes:
Kids in New York get the same number of points for correct answers regardless of whether a question is rated easy or difficult. One way testmakers equalize exams is by requiring more correct answers on easier tests. If the 2005 test was easier than the 2002 test, that wasn't done. Kids needed 40 points to pass the 2002 test but only 39 points to pass in 2005.

All of the above transgressions can be forgiven. Quantifying learning is a messy business. Even our state education department acknowledges that standardized testing has unintended consequences. Einhorn didn't do a very thorough job exploring the whole picture (i.e., raw-to-scale conversion, standard setting) and made several glaring errors, but Erin doesn't write for a professional journal. She is writing for her fellow New Yorkers, answering questions (albeit incorrectly) for her readers, and raising some powerful questions for us to ponder on the role of evaluation in education. So, Erin can be forgiven. She doesn't speak for, or represent, educators.
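The "equalizing" her quote gestures at is what psychometricians call equating: adjusting for differences in form difficulty so that a score means the same thing from year to year. One textbook approach is equipercentile equating, which I'll sketch very crudely below. The function, the score distributions, and the simplifications are all mine for illustration; the state's actual raw-to-scale procedure is considerably more careful.

```python
# A crude sketch of equipercentile equating: a raw score on the new
# (possibly easier) form is mapped to the old-form raw score sitting
# at roughly the same percentile rank. All names and numbers are
# invented -- real equating designs handle interpolation, sampling,
# and smoothing far more carefully.
from bisect import bisect_left

def equate(score, new_form_scores, old_form_scores):
    """Map a raw score on the new form to the old-form score at
    (approximately) the same percentile rank."""
    new_sorted = sorted(new_form_scores)
    old_sorted = sorted(old_form_scores)
    # Fraction of new-form examinees scoring below `score`.
    rank = bisect_left(new_sorted, score) / len(new_sorted)
    # Old-form score at that same fraction of the distribution.
    idx = min(int(rank * len(old_sorted)), len(old_sorted) - 1)
    return old_sorted[idx]

# Invented distributions: the new form runs about 5 points "easier".
new_form = [30, 35, 40, 45, 50]
old_form = [25, 30, 35, 40, 45]
print(equate(45, new_form, old_form))  # 40
```

The point of the sketch: if the new form really is easier, a given raw score on it maps to a lower old-form score, which is exactly why an easier test would normally demand more correct answers to pass, not fewer.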
That honor, however, does belong to the Association for Supervision and Curriculum Development. My wrath, which has tempered into crankiness, goes out into the blue void at the person who picked this article to be first. I have nothing personal against ASCD. I'm sure the organization is staffed by lovely people. I enjoy their books and journals. I haven't been to a conference yet, but I look forward to attending one soon.
I am angry because a professional organization in my field sent its members to an article that is incorrect and misleading. If ASCD wanted to draw our collective attention to current news on standardized testing, there were better options: on the same day the article was published in the New York Daily News, Gerald Bracey wrote a piece in Education Week reminding educators to look at the bigger picture. (Bracey wrote a book about statistics in education; I'd be curious about his response to Erin's article.)
What would I have wanted instead? I would have been thrilled to pieces if that editor at SmartBrief had tagged the Daily News article and followed up a few days later to see the impact of the article - and trust me, there has been one. The editor would have found a response from an angry parent, a blogger pleased that the conspiracy was finally being uncovered, a radio show dedicated to the topic, even Bloomberg and Spitzer getting in on the discussion, and, on the following day, another article from Erin herself calling for a massive external audit of our testing program. I have posted a link to the article on a data listserv that I belong to and am eager to see how people respond.
Edited to add: My anger completely dissolved into resignation when I re-read Erin's follow-up article. I will sadly point once more toward the previously mentioned issue of correlation and causation (this time with a British accent). I considered sending Erin an article about standard setting and post-equating, but instead I think I'm going to grab my copy of SPSS and go sulk.
Edit #2: Now ELA is under fire. *Sigh*