CommenTerry: Volume Thirty-two | Eastern North Carolina Now

    Publisher's note: The author of this post is Dr. Terry Stoops, who is the Director of Education Studies at the John Locke Foundation.

Test scores and graduation rates for 2013-14: Part 2


    Last week's CommenTerry offered a brief overview of 2013-14 state test scores and graduation rates.

    This week, I take a subterranean look at state testing results. As a result, my CommenTerry may get a little messy.

    One would think that the process of comparing and reporting test scores would be a straightforward matter. In North Carolina, state standardized testing is anything but straightforward.

    As I mentioned last week, the state reported test results using a four-level system last year but added a fifth achievement level this year. According to N.C. Department of Public Instruction officials, this meant that this year's five-level scores would "stand on their own." Initially, they declined to calculate comparisons between the two years, but members of the N.C. State Board of Education insisted that they do so. State testing and accountability analysts have not yet produced those documents.

    Subsequent conversations with folks in the testing and accountability business suggested that comparing the two school years was as easy as comparing levels 3 and 4 of last year's (2012-13) results to levels 4 and 5 of this year's (2013-14) results. Why? The "new" Level 3 was actually carved out of the "old" Level 2. (See Facts and Stats, Figure 1 below.)

    Think about it this way. You're down to your last stick of butter, and your recipe for a Béarnaise sauce requires you to cut the stick into quarters, or four sections. But in the course of cooking a spectacular meal, you decide that you need a sliver of butter for a box of macaroni and cheese. Obviously, your children demand mac and cheese because they have inferior palates. The solution is to remove a piece from one of the existing sections. Because you're odd, you want a sliver that is nearest to the middle of the stick. To create this new section, you carve a small piece out of the end of the second quarter that is closest to the third quarter. Now you have five lovely pieces of butter, although not all of them are the same size. The first, third, and fourth sections remain unchanged; the size and shape of these sections are no different than before. Only the second quarter has changed.

    This is the essence of the changes made by state officials. The first, third, and fourth sections are comparable because they did not change. The pre-cut second section is equivalent to the post-cut second and third sections combined.

    Apparently, some school administrators either did not understand how to create apples-to-apples comparisons or ignored the distinction. Rather than comparing levels 3 and 4 of the 2012-13 tests to levels 4 and 5 of the 2013-14 tests, they compared levels 3 and 4 of the 2012-13 results to levels 3, 4, and 5 of the 2013-14 results. This led to claims of fantastic increases in district proficiency rates. In a September 4 PowerPoint presentation, Charlotte-Mecklenburg Schools (CMS) staff touted double-digit proficiency rate gains in nearly every testing category. The problem is that the two sets of results were not comparable.

    It is easy to understand why some made erroneous comparisons. As a state, we have always calculated grade-level proficiency based on the percentage of students who reached or exceeded Level 3 on state tests. So, it appeared sensible to compare the percentage of students who met that standard over the two school years. Assigning similar labels to the respective level 3s did not help matters.
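    The arithmetic behind the mistake can be sketched with a few lines of code. The percentages below are made up for illustration, not actual state or district figures; the point is that if the same cohort of students performed identically in both years, a fair comparison should show zero change, while the erroneous comparison manufactures a "gain" exactly equal to the slice carved out of the old Level 2:

```python
# Hypothetical shares of students at each achievement level.
# 2012-13 used four levels.
old = {1: 20.0, 2: 25.0, 3: 30.0, 4: 25.0}

# 2013-14: a new Level 3 (here, 7 points) is carved out of the top of
# the old Level 2; the old Levels 3 and 4 simply become Levels 4 and 5.
new = {1: 20.0, 2: 18.0, 3: 7.0, 4: 30.0, 5: 25.0}

# Apples-to-apples: old Levels 3-4 versus new Levels 4-5.
correct = (new[4] + new[5]) - (old[3] + old[4])

# The erroneous comparison: old Levels 3-4 versus new Levels 3-5.
wrong = (new[3] + new[4] + new[5]) - (old[3] + old[4])

print(correct)  # 0.0 -- no real change in proficiency
print(wrong)    # 7.0 -- a phantom "gain" equal to the carved-out slice
```

    Note that the phantom gain is purely an artifact of relabeling: no student scored any differently in this example.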

    Properly compared, the state had higher proficiency rates in some subjects and declines in others. As a state, we should be most concerned about the performance of middle school students. Mathematics gains were minuscule in grades six through eight, and sixth- and seventh-grade reading proficiency dropped. On the other hand, it is encouraging to see that science and high school math scores rose significantly.

    In the end, the confusion over this year's testing results only reinforces the idea that the state needs to rethink its approach to student assessment. There are numerous nationally normed tests that, at long last, would provide dependable and intelligible assessments. Our public school students, teachers, and administrators deserve no less.

    Facts and Stats

Figure 1. Achievement level changes, 2012-13 and 2013-14


Table 1. Comparisons of state proficiency rates, 2012-13 and 2013-14


    Acronym of the Week

    ICBINB -- I Can't Believe It's Not Butter

    Quote of the Week

    "North Carolina's public schools have just released end-of-grade and end-of-course test scores for the 2013-14 school year. You may have heard something about them, particularly from districts eager to claim a large increase in the share of North Carolina students who test at or above grade level. It's a good example of why you shouldn't believe everything you hear."

    - John Hood, "Consensus Requires Solid Data," Carolina Journal Online, September 10, 2014.
