A two-year report written by the U.S. Department of Education and published in May 2013 reviewing the Smarter Balanced Assessment Consortium shows the consortium lagging behind its stated goals and behind on technology preparedness in local schools.
Page 15 of the report makes it evident that Smarter Balanced does not have a handle on the technology readiness of the states in the consortium.
Seven of its member states have no data listed at all. Only Delaware and Connecticut reported 100% readiness.
On page 7 of the report, the consortium says it developed 5,000 items for the Spring 2013 pilot, a reduction from its original goal of 10,000:
Recognizing the importance of having strong quality control measures, the consortium revised its processes for monitoring and reviewing alignment to the CCSS and quality throughout future item development cycles. In order to provide a focus on alignment and quality, and to increase the percentage of machine-scored items, the consortium reduced the number of items developed for the pilot test from 10,000 to 5,000 (pg. 9).
That is report-speak for "we had a crappy product, so we needed to tweak it." The Department of Education saw this as a challenge:
While Smarter Balanced item development is well underway, the consortium also experienced several challenges. During the item review process in fall 2012, the consortium recognized that the review process for ensuring the quality of the items was not sufficient. As a result, the consortium revised the number of items that were developed for the pilot test (from 10,000 to 5,000) so that an additional review could occur to provide a clearer focus on quality and alignment to the CCSS. Moving forward, the consortium is going to be developing 38,000 items in year three for the field test in spring 2014. It is essential that Smarter Balanced maintain a strong process for determining the quality of the items being developed. This will require that the consortium monitor and evaluate the processes for writing and reviewing items as well as for reviewing the quality of the items themselves, and that the consortium include external content experts in English language arts and mathematics as a component of the item development processes. The RFP for field test item writing, released in December 2012, included several of these components to strengthen the consortium’s quality control measures (pg. 23).
They came up with only 5,000 assessment items in two years. We are to believe they will develop another 38,000 items by Spring 2014? Right.