
Schools of Thought

Translating education research into usable knowledge

What test results really mean

I think it's particularly appropriate to stay on the theme of assessments during the Month of the Test Results. Yesterday, I wrote about the chicanery states can and do use to inflate their test scores; today, I want to talk about the fallacy of how scores are reported in the media.

If you read most any newspaper article following a state's release of its test scores, some combination of the words "more," "less," "better," or "worse" will dominate the headlines. To be fair, making comparisons across time is a useful exercise; it tells people whether schools and districts are heading in the right direction. But when the media -- and certainly the states -- don't dwell on the absolute numbers, the picture becomes distorted. It's nice to say that more people in a town exercise regularly this year compared to last, but if the increase is from 1% to 2%, you might want to reevaluate your methods. Trend data must be melded with absolute numbers.

It's hard to know the authentic absolute numbers for all the reasons I outlined yesterday. But if we look at absolute scores in 2003-2004 or 2004-2005 (depending on whether the state has released '05 yet) on top of comparisons with the baseline 2001-2002 school year, the picture begins to get definition around its edges.

First, consider the states whose test results are most closely aligned with their respective NAEP scores. Of the four states that receive an "A" on their 4th grade tests from Peterson and Hess -- South Carolina, Maine, Wyoming, and Massachusetts -- the highest proficiency rate was 56% for reading and 42% for math, both coming from Massachusetts (Source: State DoEds). All have made mere single-digit gains over the past several years. So, for the most legitimate cross-section we can find, nearly 6 in 10 4th graders aren't proficient in math, and somewhere between 4 and 5 in 10 aren't proficient in reading. Whether scores inch up or down, that's abysmal. That's scary.

Let's dig deeper into Massachusetts, though. In 2000, it was the 13th most populous state according to the census, and it's not exactly anyone's idea of a backwards boondock. In 2003, Massachusetts had about 75,000 kids take the 4th grade English Language Arts test. That means a one-percentage-point increase in the proficiency rate in any given grade represents about 750 kids becoming proficient.

In 2001, 51% of Massachusetts 4th graders were proficient -- or, put another way, 36,750 weren't. Last year, it was 56%. Which means, NCLB and all, that after three full school years an underwhelming 3,750 more kids are learning to read and write. Nearly 33,000 still can't.
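For anyone who wants to check that arithmetic, here's a minimal back-of-envelope sketch in Python, using the round figures above (roughly 75,000 test takers, 51% proficient in 2001, 56% last year); the exact counts depend on actual enrollment, which these numbers only approximate.

```python
# Back-of-envelope: turn proficiency percentages into counts of kids.
# Inputs are the approximate figures cited above, not official enrollment data.
test_takers = 75_000      # ~4th graders taking the MA ELA test
pct_2001 = 0.51           # share proficient in 2001
pct_recent = 0.56         # share proficient last year

not_proficient_2001 = test_takers * (1 - pct_2001)              # ~36,750 kids
not_proficient_recent = test_takers * (1 - pct_recent)          # ~33,000 kids
newly_proficient = not_proficient_2001 - not_proficient_recent  # ~3,750 kids

print(f"Not proficient in 2001:   {not_proficient_2001:,.0f}")
print(f"Not proficient last year: {not_proficient_recent:,.0f}")
print(f"Newly proficient:         {newly_proficient:,.0f}")
```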

These are the stories that are not being told, the stories that must be told. NCLB is undeniably a shot in the arm; 4th/5th grade reading scores are rising across the nation. But those three- or four-year rises are incremental -- 2 points in Colorado, 7 in Delaware, 8 in Georgia, 2 in Illinois, 3 in Indiana, 7 in Louisiana, 1 in Minnesota, 4 in Mississippi, 6 in Pennsylvania, 3 in Wyoming -- and most state NAEP pass rates hover in the 20-50% range. This is progress, but not success. At this rate, every child will actually be proficient sometime around the turn of the next century. If standards and assessment are to do their job, citizens, the media, and policymakers alike must understand what the numbers mean and just how deep a crisis they represent.
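To see where that "turn of the next century" figure comes from, here's a rough, purely illustrative extrapolation; the starting pass rate and annual gain are assumptions drawn from the ranges cited above, not projections from any official data.

```python
# Rough linear extrapolation: if a state's pass rate starts at 40% and
# grows by about 4 points every 3 years (the middle of the single-digit
# gains listed above), how long until every child is proficient?
start_pct = 40            # illustrative starting pass rate
gain_per_year = 4 / 3     # ~1.3 points per year
years_needed = (100 - start_pct) / gain_per_year
print(f"Roughly {years_needed:.0f} years to reach 100%")  # ~45 years
# A state starting at 20% with 2-point gains every 3 years would need
# well over a century -- hence "the turn of the next century."
```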