The 2017 National Assessment of Educational Progress (NAEP) results will be released on April 10th, and it seems as though Louisiana State Superintendent of Education John White got an advance look. He is worried.
Matt Barnum with Chalkbeat reported on a letter that White sent to Dr. Peggy Carr, the Acting Commissioner of the National Center for Education Statistics (NCES), which administers NAEP, questioning how the switch to computer-based testing will affect student scores.
The 2017 NAEP administration marked a significant transition from paper-based testing to computer-based testing. NCES found that, consistent with research on the NAEP (Bennett et al., 2008; Horkay, Bennett, Allen, Kaplan, & Yan, 2006), this shift in the mode of testing contributed to lower performance on NAEP forms among the general U.S. sample population. Using the small sample of paper-based testers, NCES calculated a baseline level of performance and adjusted nationwide scores to maintain the longitudinal NAEP trend. Based on this mode effect adjustment, NCES has preserved the integrity of its effort to report trends in nationwide math and reading over time.
I understand that NCES may have found disparities in the mode effect on different subgroups of students. However, any disparate effect found was not significant. Thus NCES did not include any difference from one group of students to the next in its calculation of the mode effect. The adjustment NCES made in order to preserve the national trend is the same for every student.
It is my understanding that, though NCES maintained a consistent longitudinal trend at the national level, there remains the possibility that the mode effect in a given state may have been greater than the nationwide mode effect. This could be attributable to a disproportionately large population of a subgroup that experienced a greater mode effect than the national effect. It also could be attributable to the relative capacity of 4th and 8th grade students in a given state to use computers.
As a potential illustration of this point, no Louisiana student in 4th grade or 8th grade had ever been required to take a state assessment via a computer or tablet as of the 2017 NAEP administration. This fact, coupled with a variety of social indicators that may correspond with low levels of technology access or skill, may mean that computer usage or skill among Louisiana students, or students in any state, is not equivalent to computer skills in the national population.
The first problem, as Richard Innes with the Bluegrass Institute pointed out, is that NAEP may have some validity issues. He wrote, “Certainly, the possibility White raises that NAEP might have performed differently for different states could be a real concern. Could the comparability of NAEP data between states and across years have been compromised?”
To that end, White asked Carr for additional information, which Chalkbeat reports Carr said she would provide. White wrote:
I am therefore writing to request that the following information be made available to state chiefs as soon as possible:
- The mode effect adjustment applied to each grade and subject nationally
- The average mean scores for students taking the paper-based test and for students taking the tablet-based test, at the state level and at the national level, in each grade, subject, and subgroup
- Evidence of the random equivalence of the groups of students taking the paper-based tests and students taking the tablet-based tests, at the state level and at the national level
- National subgroup performance trends, reported by performance quintile, quartile, or decile.
The second problem, which White concedes, is with using computer-based tests to begin with. He wrote, “I would like to be assured, as soon as possible, that when NCES reports math and reading results on a state-by-state basis over a two-year interval, the results and trends reported at the state level reflect an evaluation of reading and math skill rather than an evaluation of technology skill.”
This is why I don’t favor computer-based assessments.
This, of course, could be spin. Barnum writes, “Even though researchers warn that it is inappropriate to judge specific policies by raw NAEP results, if White’s letter is a signal that Louisiana’s scores have fallen, that could deal a blow to his controversial tenure, where he’s pushed for vouchers and charter schools, the Common Core, letter grades for schools, and an overhaul of curriculum.”
White contends that he’s not concerned about Louisiana scores. “I doubt that any mode effect would have radically vaulted Louisiana to the top or dropped Louisiana further below,” White told Chalkbeat. “The issue is from a national perspective.”
Jay P. Greene, Chair of the Department of Education Reform at the University of Arkansas, says that it looks like “pre-spin” to him. He wrote on his blog, “Maybe it’s just a remarkable coincidence that White has suddenly developed these technical concerns about the validity of NAEP at about the same time that he was briefed on his state’s results. How much do you want to bet that there is a decline in LA?”
I’m not a betting man.
Considering the Council of Chief State School Officers also expressed concern, Louisiana is probably not alone.