Data Mine Students to Measure Grit?

Filed in Privacy Invasion/Data Mining on April 2, 2013

The U.S. Department of Education’s Office of Educational Technology published a report back in February entitled “Promoting Grit, Tenacity and Perseverance: Critical Factors for Success in the 21st Century.”  From the executive summary:

How can we best prepare children and adolescents to thrive in the 21st century—an era of achievement gaps that must be closed for the benefit of everyone in society, rapidly evolving technology, demanding and collaborative STEM knowledge work, changing workforce needs, and economic volatility? The test score accountability movement and conventional educational approaches tend to focus on intellectual aspects of success, such as content knowledge. However, this is not sufficient. If students are to achieve their full potential, they must have opportunities to engage and develop a much richer set of skills. There is a growing movement to explore the potential of the “noncognitive” factors—attributes, dispositions, social skills, attitudes, and intrapersonal resources, independent of intellectual ability—that high-achieving individuals draw upon to accomplish success, (pg. v).

When I think of “grit,” I think of John Wayne in True Grit, and in a way the report means something similar.

They define “grit” as, “Perseverance to accomplish long-term or higher-order goals in the face of challenges and setbacks, engaging the student’s psychological resources, such as their academic mindsets, effortful control, and strategies and tactics,” (pg. vii).

“Grit,” I believe, is a personality trait, and is mostly the result of someone’s upbringing.  To the Feds, however, it’s something that must be measured, data mined, and shared.

There are many different types of measurement methods, each with important tradeoffs.

  • Self-report methods typically ask participants to respond to a set of questions about their perceptions, attitudes, goals, emotions, beliefs, and so on. Advantages are that they are easy to administer and can yield scores that are easy to interpret. Disadvantages are that people are not always valid assessors of their own skills, and self-reports can be intrusive for evaluating participants’ in-the-moment perceptions during tasks.
  • Informant reports are made by teachers, parents, or other observers. Advantages are that they can sidestep inherent biases of self-report and provide valuable data about learning processes. The main disadvantage is that these measures can often be highly resource intensive—especially if they require training observers, time to complete extensive observations, and coding videos or field notes.
  • School records can provide important indicators of perseverance over time (e.g., attendance, grades, test scores, discipline problems) across large and diverse student samples. Advantages are the capacity to identify students who are struggling to persevere and new possibilities for rich longitudinal research. Disadvantages are that these records themselves do not provide rich information about individuals’ experiences and nuances within learning environments that may have contributed to the outcomes reported in records.
  • Behavioral task performance measures within digital learning environments can capture indicators of persistence or giving up. Advantages are that new methods can be seamlessly integrated into the learning environment and provide unprecedented opportunities for adaptivity and personalized learning. Disadvantages are that these methods are still new and require intensive resources to develop, (pg. ix-x).

Pages 35-49 of the report give further detail on the types of measurements that would be deemed helpful.  One self-report suggestion entails students carrying a device with them to report their feelings during certain situations:

For example, researchers can examine consistency in participant’s ratings to determine the strength of the belief or skill. Self-report can also be used to measure process constructs; for example, in the Experience Sampling Method (ESM), participants typically carry around a handheld device that “beeps” them at random intervals, prompting self-report of experiences in that moment (e.g., Hektner, Schmidt, & Csikszentmihalyi, 2007). Such data can be used to make inferences about emotions, thoughts, and behaviors within and across specific situations, (pg. 35).
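The Experience Sampling Method described above boils down to prompting students at random times throughout the day, with some spacing between beeps. A minimal sketch of such a prompt scheduler, assuming illustrative parameters (a 12-hour day, six prompts, a 30-minute minimum gap) that are not taken from the report:

```python
import random

def esm_schedule(day_minutes=720, n_prompts=6, min_gap=30, seed=None):
    """Pick random prompt times (minutes from the start of the day) for
    an ESM-style beeper, rejecting draws where prompts cluster closer
    than min_gap. A simplified sketch; real ESM tools vary."""
    rng = random.Random(seed)
    while True:
        times = sorted(rng.randrange(day_minutes) for _ in range(n_prompts))
        # Accept only schedules where every consecutive gap is wide enough.
        if all(b - a >= min_gap for a, b in zip(times, times[1:])):
            return times

prompts = esm_schedule(seed=42)
```

The rejection loop terminates quickly here because six prompts with 30-minute gaps consume only 150 of the 720 available minutes.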

Below is an example of a self-report measure of “grit.”

[Image: sample self-report grit scale]

Then you have informant reports, and they cite something that KIPP is doing with their students:

For example, KIPP and other character education programs have been developing methods of using explicit teacher feedback to help students gauge their level of grit with respect to specific criteria and to open up conversations among parents, teachers, and students (see Chapter 4 for more details about these models). These schools have been implementing a “Character Report Card” on which students receive ratings pooled from multiple teachers on factors such as grit and self-control. Exhibit 9 illustrates what such a report card might look like. It is important that these ratings come from multiple teachers, as they are then less susceptible to biases of particular relationships. KIPP has been facilitating the use of these Report Cards with a technology called PowerTeacher that allows teachers to input their ratings online. Informant reporting is also a common approach for teachers, parents/guardians, and mental health professionals to assess the social-emotional competencies that serve as protective factors associated with resilience in young children. For example, the Devereux Student Strengths Assessment (DESSA; LeBuffe, Shapiro & Naglieri, 2009) can be used for children in kindergarten through eighth grade (ages 5-14). The DESSA is a 72-item, standardized, norm-referenced behavior rating scale that focuses on student strengths and positive behaviors related to eight dimensions: self-awareness, social awareness, self-management, goal-directed behavior, relationship skills, personal responsibility, decision making, and optimistic thinking. It can be used for screening, profiling for intervention, and monitoring and measuring change (Hall, 2010), (pg. 38).

Below is an example of a character report card from KIPP:

[Image: sample KIPP Character Report Card]

Then you have record keeping.  They tout the example of a Youth Data Archive at the Gardner Center at Stanford University.  This archive, “links data across systems—school, social services, foster care, youth development programming, juvenile justice—to provide actionable integrated student profiles to educators,” (pg. 40).

Having formerly worked in the juvenile justice field, I have a HUGE problem with any data gathered in the juvenile justice system being shared with anyone not directly working in that system.

Why do educators need an “actionable integrated student profile”?  So they can place a label on a student?  Are kids and parents giving consent for any of this information to be shared?

Then there is “behavioral task performance.”

While laboratory experiments have examined behavioral task performance for many years, new technological opportunities offer potential for new methods and approaches. Educational data mining (EDM) and learning analytics within digital learning environments allow for “micro-level” analyses of moment-by-moment learning processes.

Student data collected in online learning systems can be used to develop models about processes associated with grit, which then can be used, for example, to design interventions or adaptations to a learning system to promote desirable behaviors. Dependent behavioral variables associated with a challenge at hand may include responses to failure (e.g., time on task, help-seeking, revisiting a problem, gaming the system, number of attempts to solve a problem, use of hints), robustness of strategy use (e.g., planning, monitoring, tools used, number of solutions tried, use of time), level of challenge of self-selected tasks, or delay of gratification or impulse control in the face of an enticing off-task stimulus. Such data can be examined for discrete tasks or aggregated over many tasks, (pg. 41).
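The “dependent behavioral variables” the report lists (time on task, number of attempts, use of hints) are the kind of thing an online learning system can compute from its event log. A minimal sketch, with a hypothetical event format and field names that are illustrative, not from the report:

```python
from collections import Counter

# Hypothetical event log from a digital learning environment.
# The "action" and "seconds" fields are invented for illustration.
events = [
    {"task": "fractions-1", "action": "attempt", "seconds": 40},
    {"task": "fractions-1", "action": "hint",    "seconds": 5},
    {"task": "fractions-1", "action": "attempt", "seconds": 55},
    {"task": "fractions-1", "action": "solved",  "seconds": 20},
]

def persistence_indicators(events):
    """Aggregate simple behavioral variables of the kind the report
    names: time on task, number of attempts, use of hints."""
    actions = Counter(e["action"] for e in events)
    return {
        "time_on_task": sum(e["seconds"] for e in events),
        "attempts": actions["attempt"],
        "hints_used": actions["hint"],
        "solved": actions["solved"] > 0,
    }

profile = persistence_indicators(events)
```

The point of the sketch is how little instrumentation it takes: once clickstream events are logged, a per-student behavioral profile is a few lines of aggregation away, which is exactly what makes this kind of data mining so easy to do quietly.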

They also discuss affective computing (the study and development of systems and devices that can recognize, interpret, process, and simulate aspects of human affect).

Researchers are exploring how to gather complex affective data and generate meaningful and usable information to feed back to learners, teachers, researchers, and the technology itself. Connections to neuroscience are also beginning to emerge, (pg. 41, emphasis mine).

President Obama just today called for investing $100 million in a brain mapping project.  USA Today reports that its goal is to develop “new technologies that can record the activities of individual cells and neurons within the brain.”

They then devote a short paragraph to the ethics of gathering this new type of personal data.

As new forms of measurement emerge and new types of personal data become available, the field must also deal with critical ethical considerations. Of course, privacy is always a concern, especially when leveraging data available in the “cloud” that users may or may not be aware is being mined. However, another emergent concern is the consequences of using new types of personal data in new ways. Learners and educators have the potential to get forms of feedback about their behaviors, emotions, physiological responses, and cognitive processes that have never been available before. Measurement developers must carefully consider the impacts of releasing such data, sometimes of a sensitive nature, and incorporate feedback mechanisms that are valuable, respectful, and serve to support productive mindsets, (pg. 48).

They recognize that users may not be aware their data is being mined.

Let’s focus on schools being places where students learn instead of laboratories where students’ behavior can be studied, recorded, and shared.  How much of this mentality will be present in the SBAC and PARCC assessments (which will be used to mine data; we are just not sure to what extent) remains to be seen.

“Grit” isn’t something that schools will be able to “can” and give out to students, so measuring it seems rather ludicrous to me, especially at the expense of privacy.


About the Author

Shane Vander Hart is the Editor-in-Chief of Caffeinated Thoughts, a popular Christian conservative blog in Iowa. He is also the President of 4:15 Communications, a social media & communications consulting/management firm, along with serving as the communications director for American Principles Project’s Preserve Innocence Initiative.  Prior to this Shane spent 20 years in youth ministry, serving in church, parachurch, and school settings.  He has taught Jr. High History along with being the Dean of Students for a Christian school in Indiana.  Shane and his wife homeschool their three teenage children and have done so since the beginning.  He has recently been recognized by Campaigns & Elections Magazine as one of the top political influencers in Iowa. Shane and his family reside near Des Moines, IA.  You can connect with Shane on Facebook, follow him on Twitter, or connect with him on Google+.

Comments (1)


  1. Dockaren says:

    Shane, Great article. We noted this same report in our analysis of a Florida data mining bill and in our analysis of what is wrong with Common Core in Florida. See http://www.edlibertywatch.org.

    Given your question about SBAC and PARCC, I thought you would be interested in this quote listed on p. 49 of this report:

    [A]s new assessment systems are developed to reflect the new standards in English language arts, mathematics, and science, significant attention will need to be given to the design of tasks and situations that call on students to apply a range of 21st century competencies that are relevant to each discipline. A sustained program of research and development will be required to create assessments that are capable of measuring cognitive, intrapersonal, and interpersonal skills.
    – National Research Council 2012 Report on 21st Century Knowledge and Skills (NRC, 2012)

    Keep up the great work!!
