The Case of Kentucky

Filed in Common Core Assessments on February 1, 2013

Matthew Ladner brings up Kentucky as evidence that the Common Core State Standards are working.  Kentucky was one of the first states to implement them.

He said that Kentucky's definition of "proficient" was far more lax than NAEP's, and that after the Common Core the state's exams were brought closer into line with the NAEP assessments.

He posted these charts below:

[Charts comparing Kentucky proficiency rates with NAEP]

Richard Innes of the Bluegrass Institute for Public Policy Solutions said in an email that Ladner wasn't being accurate: he had used the wrong NAEP scores for Kentucky, making the graphs above misleading. Innes wrote:

As most of you know, Kentucky was the first state to adopt CCSS (even before they were finalized!) and is the first to create CCSS-aligned math and reading assessments.

Anyway, the Greene blog has the wrong NAEP scores for Kentucky in both math and reading, so the comparison graphs are misleading. I am in communications now with the author to get a fix.

In the meantime, I have a great data sourcebook with Kentucky’s new test results compared to our old test, the NAEP (with the right scores) and ACT, Inc.’s EPAS tests, which are definitely aligned with college and career readiness.

Ladner has since updated his post and said:

I made the mistake of looking at the cumulative rather than the discrete achievement levels and then treating the cumulative as discrete, thus double-counting the NAEP advanced. If you have any idea of what I am talking about give yourself a NAEP Nerd Gold Star. Getting instant expert feedback is one of the best things about blogging, and I have updated the charts to correct the error.

In terms of substance, both sets of KY tests were further apart from NAEP proficiency standards, but the new ones are still far closer than the old ones.

So basically they’re still bad, just not as bad?  What a winning argument!
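For anyone unfamiliar with the distinction Ladner is drawing: NAEP publishes achievement levels cumulatively, so "at or above proficient" already includes the advanced students. Treat that cumulative figure as a discrete band and add "advanced" back on top, and you count those students twice, inflating the comparison. A minimal sketch, using made-up percentages rather than actual Kentucky or NAEP data:

```python
# Hypothetical NAEP-style cumulative achievement-level percentages.
# These numbers are made up for illustration; they are not Kentucky's actual results.
cumulative = {
    "at_or_above_basic": 70.0,       # includes proficient and advanced
    "at_or_above_proficient": 35.0,  # includes advanced
    "advanced": 8.0,
}

# Correct reading: the discrete "proficient" band excludes the advanced students.
discrete_proficient = cumulative["at_or_above_proficient"] - cumulative["advanced"]  # 27.0

# The error Ladner describes: treating the cumulative figure as a discrete band and
# then adding "advanced" back on top, which counts advanced students twice.
wrong_proficient_or_above = cumulative["at_or_above_proficient"] + cumulative["advanced"]    # 43.0
correct_proficient_or_above = cumulative["at_or_above_proficient"]                           # 35.0

print(discrete_proficient, wrong_proficient_or_above, correct_proficient_or_above)
```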

Ze’ev Wurman made the following observation in the comment section:

No question it is *harder* but is it more rigorous? Look at this NAEP item (http://tinyurl.com/an2r3wa) — a perfect example of an item that has difficult numbers but a rather trivial concept. Does it make the test more "rigorous" for 8th graders? Clearly not…

…Measuring the quality of the standards by their assessments is like evaluating the quality of a car by how easy it is to pass a driving test in it. There may be some very tenuous connection, but can you argue with a straight face that passing rates are higher in a Mercedes or BMW than in a Hyundai or Kia?

Standards are about what is taught and at which grade. Assessment — every decent assessment!! — is aligned with the standards, whether the standards are excellent, mediocre, or worthless. And, as I just demonstrated above, one can make any test harder or easier, independent of whether the concepts are harder or easier.

Finally, you correctly say that "starting out more rigorous doesn’t guarantee that they will stay that way." In fact, the situation is worse: the cut scores are not set yet!! The Kentucky pilot test set essentially arbitrary cut scores, largely because it wanted to test the waters and see the public reaction. Only *after* the actual testing occurs next year will PARCC set the cut scores. I can promise you they will be set by a *political* process more than by anything else.

Which brings me to Kentucky, which you praised as “the earliest adopter of Common Core in 2012.” What you should have mentioned is that Kentucky adopted the Common Core standards even before they were finished … so much for the educational diligence of the great state of Kentucky.

The only way to ascertain the quality of the standards before full-scale implementation is by expert review. And the experts had their say. Sandra Stotsky and Jim Milgram, the only content experts on the Common Core Validation Committee, refused to sign off on them because they found them insufficiently rigorous and below our international competitors. The Deputy Director of the UK Institute of Education, another validation committee member, refused to sign off on them for the same reasons. Jonathan Goodman, a Courant Institute mathematics prof, found them 1-2 years below international levels. And even Andrew Porter, Dean of Penn GSE, as "establishment" as they come, found them mediocre. Bill Schmidt had to resort to magician's tricks of re-arranging tables to present them as "similar" to the TIMSS A+ countries — hard to believe anyone would ever go lower than he did.

Since the PARCC assessments are not even completed yet, it's hard to say what Kentucky (and other states in that coalition) will be left with.


About the Author

Shane Vander Hart is the Editor-in-Chief of Caffeinated Thoughts, a popular Christian conservative blog in Iowa. He is also the President of 4:15 Communications, a social media and communications consulting/management firm, and serves as the communications director for American Principles Project's Preserve Innocence Initiative. Prior to this, Shane spent 20 years in youth ministry, serving in church, parachurch, and school settings. He has taught junior high history and served as Dean of Students for a Christian school in Indiana. Shane and his wife have homeschooled their three teenage children from the beginning. He was recently recognized by Campaigns & Elections Magazine as one of the top political influencers in Iowa. Shane and his family reside near Des Moines, IA.