Truth in American Education

Fighting to stop the Common Core State Standards, their Assessments and Student Data Mining.


Smarter Balanced Has Lots of Room for Improvement

November 20, 2014 By Shane Vander Hart

Doug McRae is a retired educational measurement specialist who has served as an educational testing company executive in charge of the design and development of K-12 tests widely used across the United States, as well as an adviser on the initial design and development of California’s STAR assessment system. He knows just a little something about assessment development.

He has some concerns with Smarter Balanced, which he writes about at EdSource. An excerpt:

I did the online exercise for grade 3 English Language Arts, and for this grade level and content area traditional multiple-choice questions dominated. In fact, 84 percent of the questions were either multiple-choice or “check-the-box” questions that could be electronically scored, and these questions were very similar or identical to traditional “bubble” tests. Only 16 percent of the questions were open-ended questions, which many observers say are needed to measure Common Core standards.

The online exercise used a set of test items with the questions arranged in sequence by order of difficulty, from easy questions to hard questions. The exercise asked the participant to identify the first item in the sequence that a Category 3 or B-minus student would have less than a 50 percent chance to answer correctly. I identified that item after reviewing about 25 percent of the items to be reviewed. If a Category 3 or proficient cut score is set at only 25 percent of the available items or score points for a test that has primarily multiple-choice questions, clearly that cut score invites a strategy of randomly marking the answer sheet. The odds are that if a student uses a random marking strategy, he or she will get a proficient score quite often. This circumstance would result in many random (or invalid and unreliable) scores from the test, and reduce the overall credibility of the entire testing program.

It troubled me greatly that many of the test questions later in the sequence appeared to be far easier than the item I identified as the item marking a Category 3 or proficient cut score, per the directions for the online exercise. I found at least a quarter of the remaining items to be easier, including a cluster of clearly easier items placed about 2/3 of the way into the entire sequence. This calls into question whether or not the sequence of test questions used by Smarter Balanced was indeed in difficulty order from easy to hard items. If the sequence used was not strictly ordered from easy to hard test questions, then the results of the entire exercise have to be called into serious question.

There were several additional concerns about the Smarter Balanced cut-score-setting exercise this October that are too technical for full discussion in this commentary. Briefly, the exercise appeared not to include any use of “consequence” data that typically is included in a robust cut-score-setting process. Consequence data is estimated information on what percent of students will fall in each performance category, given the cut scores being recommended. I also questioned whether the spring 2014 Smarter Balanced field test data were used to guide the exercise in any significant way. Indeed, since the 2014 Smarter Balanced field test was essentially an item-tryout exercise, an exercise designed to qualify test questions for use in final tests, it did not generate the type of data needed for final cut score determinations in a number of significant ways.
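McRae’s random-guessing concern can be illustrated with simple binomial arithmetic. The sketch below is a hypothetical check, not from his commentary: it assumes 4-option multiple-choice items (so a guesser has a 1-in-4 chance per item) and a hypothetical 40-item test, and asks how often pure guessing reaches a cut score set at 25 percent of the items.

```python
from math import comb

def prob_at_least(n_items: int, p: float, cut: int) -> float:
    """Binomial probability of getting at least `cut` items right
    when each item is answered correctly with probability `p`."""
    return sum(comb(n_items, k) * p**k * (1 - p)**(n_items - k)
               for k in range(cut, n_items + 1))

n = 40                # hypothetical test length
cut = int(0.25 * n)   # cut score at 25% of items -> 10 correct
p_guess = 0.25        # chance per item on 4-option multiple choice

print(f"Chance a pure guesser reaches the cut: {prob_at_least(n, p_guess, cut):.2f}")
```

Because the expected score of a guesser (25 percent of items) sits exactly at the cut, random marking clears the bar better than half the time under these assumptions, which is the point of McRae’s objection: such scores would be noise, not measurement.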

Read more.

Filed Under: Common Core Assessments Tagged With: Common Core State Standards, Doug McRae, Smarter Balanced

Comments

  1. Bob Valiant, Ed.D. says

    November 20, 2014 at 1:52 pm

    McRae only discusses the problems with Smarter Balanced that can be fixed over a period of time to meet the technical requirements. He does NOT discuss the fact that this type of test can never measure the true achievement of a child (let alone the teacher of the child) over the broad range of a full year of school experiences. Even so, he does say it is not ready for prime time and should be delayed. I say Smarter Balanced and PARCC should be scrapped and the money that would be spent on them put to good use in classrooms across the country.




Copyright © 2021 Truth in American Education · Developed & Hosted by 4:15 Communications, LLC.