Truth in American Education

Fighting to stop the Common Core State Standards, their Assessments and Student Data Mining.


U.S. DOE Report Shows Smarter Balanced Lagging Behind

October 21, 2013 By Shane Vander Hart


A two-year report written by the U.S. Department of Education, published in May 2013, reviewing the Smarter Balanced Assessment Consortium shows the consortium lagging behind its stated goals and behind on technology preparedness in local schools.

Looking at pg. 15 of the report it is evident that Smarter Balanced does not have a handle on the technology readiness of the states in the consortium.

[Image: technology-readiness table from pg. 15 of the report]

Seven of the member states don’t have any data listed at all. Only Delaware and Connecticut reported 100% readiness.

On page 7 of the report they say they developed 5,000 items for the Spring 2013 pilot, a reduction from the original goal of 10,000 items.

Recognizing the importance of having strong quality control measures, the consortium revised its processes for monitoring and reviewing alignment to the CCSS and quality throughout future item development cycles. In order to provide a focus on alignment and quality, and to increase the percentage of machine-scored items, the consortium reduced the number of items developed for the pilot test from 10,000 to 5,000 (pg. 9).

That is report-speak for “we had a crappy product, so we needed to tweak it.”  The Department of Education saw this as a challenge:

While Smarter Balanced item development is well underway, the consortium also experienced several challenges. During the item review process in fall 2012, the consortium recognized that the review process for ensuring the quality of the items was not sufficient. As a result, the consortium revised the number of items that were developed for the pilot test (from 10,000 to 5,000) so that an additional review could occur to provide a clearer focus on quality and alignment to the CCSS. Moving forward, the consortium is going to be developing 38,000 items in year three for the field test in spring 2014. It is essential that Smarter Balanced maintain a strong process for determining the quality of the items being developed. This will require that the consortium monitor and evaluate the processes for writing and reviewing items as well as for reviewing the quality of the items themselves, and that the consortium include external content experts in English language arts and mathematics as a component of the item development processes. The RFP for field test item writing, released in December 2012, included several of these components to strengthen the consortium’s quality control measures (pg. 23).

They came up with only 5,000 assessment items in two years.  We are to believe they will develop 38,000 items for the field test by Spring 2014?  Right.

You can read the report for yourself here, here, or below.

Filed Under: Common Core Assessments Tagged With: common core assessments, Race to the Top, Smarter Balanced Assessment Consortium, U.S. Department of Education

Comments

  1. Richard_Innes says

    October 21, 2013 at 5:26 pm

    This raises even more questions about sustainability, a topic I addressed to the Indiana legislative commission examining CCSS on October 1, 2013. Sustainability became a big problem with Kentucky’s KIRIS assessment Performance Events in the early 1990s and those events crashed in just four years (KIRIS was remarkably similar to S-B proposals).

    It’s not enough just to create a first round of questions. Questions have to be changed out over time to avoid teaching to the tests, and generating new questions that cover the same academic material to the same level of difficulty gets really challenging when we are talking about either open-response questions or the even more problematic performance items.

One other question: are the 5,000 questions spread out across both English language arts and math, and across all Elementary and Secondary Education Act testable grades from 3 to 8 plus high school? If so, that works out to about 5,000 ÷ (2 subjects × 7 grade levels), or roughly 357 questions per subject per grade on average, which probably isn’t enough to sustain the assessments for very long.

With these tests slated for use in multiple states, question compromise is pretty much a certainty, so the questions will have to be changed out. Also, some of the questions will probably prove unsuitable during pilot testing and will have to be discarded.

  2. Darren says

    October 22, 2013 at 6:14 am

    Have you done an analysis of the report on PARCC?

    • Shane Vander Hart says

      October 22, 2013 at 11:33 am

      No, haven’t seen it yet.



Copyright © 2021 Truth in American Education · Developed & Hosted by 4:15 Communications, LLC.