
In the age of technology-powered algorithms, many universities are making admissions decisions based on factors that would horrify applicants and parents if they knew about them. These universities are paying technology companies to scan the digital lives of applicants and advise whether they are good candidates for admission.
This goes beyond just accessing applicants’ social-media posts. With this technology, clicking on the “wrong” websites may doom an applicant’s chances of attending his or her dream school. And the applicant will never know why that rejection email appeared in the inbox.
Maryland attorney Brad Shear is sounding the alarm about this troubling trend. Having successfully defended and advised many students accused by universities of inappropriate online behavior, he’s exposing these surreptitious sci-fi practices.
According to Shear, many universities are contracting with companies such as Capture and Technolutions to track applicants’ online activity. Obviously, this would include social-media postings. Is Edward a fan of Donald Trump, or of Bernie Sanders? An admissions official of the opposite political persuasion may conclude that Edward isn’t a good “fit” for this university. Postings about religion or many other contentious topics could have the same effect.
What if these postings came from someone else – perhaps another user of the only computer Edward’s family can afford? As Capture admits, “there’s no way currently for us to distinguish between multiple students on the same device. Both [users’] histories will be merged together and associated with the first . . . who clicks on an email to identify herself.” Don’t worry, though: “Our guess is that it’s not going to happen that often or end up being a significant issue.”
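To make the attribution problem concrete, here is a minimal sketch of how device-based tracking of this sort might merge two users' histories. The data structures and function names are illustrative assumptions, not Capture's actual implementation:

```python
# Hypothetical sketch of the shared-device problem Capture describes:
# all page views from one device are keyed to a single device ID, and the
# whole history is attributed to whichever student identifies herself first.
from collections import defaultdict

# page views recorded per device, before anyone is identified
device_history: dict[str, list[str]] = defaultdict(list)
# device ID -> the first student who clicked a tracked email link
device_owner: dict[str, str] = {}

def record_page_view(device_id: str, url: str) -> None:
    """Log a page view against the device, whoever is actually browsing."""
    device_history[device_id].append(url)

def identify_via_email_click(device_id: str, student: str) -> None:
    """First identification wins; later users on the device inherit it."""
    device_owner.setdefault(device_id, student)

# Edward's sibling browses a contentious site on the family computer...
record_page_view("device-123", "example.com/political-forum")
# ...then Edward clicks a tracked link in an admissions email.
identify_via_email_click("device-123", "Edward")
# The merged history is now attributed to Edward alone.
print(device_owner["device-123"], device_history["device-123"])
```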
But the surveillance runs deeper and is more sophisticated. When universities give such companies access to applicants’ email and possibly other online accounts, the companies install tracking code to monitor activity from those accounts. Or they may contract with data brokers to purchase the mountains of information generated by students’ online classroom activity. This possibility reinforces the dangers of “digital learning” in school – the student-generated data trail can end up in the hands of these surveillance companies.
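As a rough illustration of the email-tracking mechanism, here is a sketch of how a tracking pixel generally works; the domain, endpoint, and function names are hypothetical and not drawn from any vendor's product:

```python
# A minimal sketch of email-open tracking: a unique, invisible image URL
# is embedded in each message, so fetching the image tells the tracking
# server who opened the email, when, and from which device.
import uuid

def tracked_email_html(applicant_id: str, body: str) -> str:
    token = uuid.uuid4().hex  # unique per message, tied to the applicant
    pixel = (f'<img src="https://tracker.example.com/open'
             f'?applicant={applicant_id}&token={token}" '
             'width="1" height="1" alt="">')
    return f"<html><body>{body}{pixel}</body></html>"

# When the mail client loads the pixel URL, the tracker logs the open
# event and can set a cookie that follows the applicant to the
# university's website afterward.
print(tracked_email_html("A-1001", "<p>Thanks for your interest!</p>"))
```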
Armed with this data, the companies then create algorithms to predict whether a student will succeed at a particular university, or in a particular program.
For example, say Ann applies to College X and wants to study engineering. But College X’s surveillance contractor detects that Ann didn’t whiz through online math lessons at school, or that she (or some other family member – the software won’t distinguish on a shared device) accessed a math-tutoring site. Because engineering requires strong math skills, the company’s algorithm calculates that Ann isn’t a good candidate for that program.
Or maybe the surveillance discovers that Tim, who wants to go into medicine, is frequently active online at 3:00 AM. The algorithm may conclude he’s a bad fit for programs that require attending clinical sessions at 6:30 AM.
This software can also report on applicants based on their interaction with the university’s website (Capture calls this “behavioral engagement”). Suppose Kate is a promising candidate academically, but she has accessed the financial-aid page several times. That suggests Kate might be a more expensive, and therefore less desirable, student. Her application ends up in the circular file along with Ann’s and Tim’s.
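Pulling the Tim and Kate examples together, a "behavioral engagement" score could be as crude as the following sketch. The features and weights here are invented for illustration; no vendor's actual model is public:

```python
# A hedged sketch of the kind of scoring the article describes.
# Each event is a page view with the hour of day it occurred.

def engagement_score(events: list[dict]) -> float:
    score = 0.0
    for e in events:
        if e["page"] == "/financial-aid":
            score -= 2.0   # flagged as a likely aid applicant (Kate)
        elif e["page"].startswith("/programs/"):
            score += 1.0   # interest in a specific program
        if 1 <= e["hour"] <= 4:
            score -= 0.5   # late-night activity, per the Tim example
    return score

# Kate: strong applicant, but repeated financial-aid visits sink her score.
kate = [{"page": "/financial-aid", "hour": 14}] * 3 + \
       [{"page": "/programs/engineering", "hour": 15}]
print(engagement_score(kate))  # negative: "less desirable"
```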
And none of these applicants will ever know why they were rejected.
This system represents the triumph of technology over liberty and the human spirit. Like the omnipresent “career tracking” of students beginning in middle school, it assumes students must be categorized by their background or their “aptitudes” and channeled into a place where the algorithm predicts they’ll succeed. If the aspiring engineer has resolved to remedy her deficiencies by sheer hard work and will, she may never have that chance. If the night owl is capable of fundamentally changing his habits to get to the hospital at the crack of dawn, no one may ever know. The algorithm has spoken.
How can students thwart this surveillance? Shear recommends they perform a “digital audit” to delete any potentially damaging content (postings about religion, political beliefs, or controversial activities). Beyond that, he recommends they “limit their searchable digital footprint by creating fake accounts, utilizing robust privacy settings, [and] deploying virtual private networks” to blunt the universities’ surveillance techniques. (Shear offers consulting services to individuals and local PTAs to help families accomplish this.)
The techno-charged admissions process makes sense if we accept that Big Data should dictate all decisions. But in America, individuals should be free to accomplish their own dreams by virtue of their unique characteristics. An impersonal algorithm should never block that chance.
This article brings red-light cameras to mind: the owner of the vehicle would be ticketed even when someone else was driving and committed the infraction. In Missouri, the state supreme court ruled them unconstitutional because the owner of the vehicle was presumed to be the one driving.
This makes me wonder whether a student could be rejected by a university because of something searched in high school to complete a paper. My child was assigned a paper on the use of opioids/heroin. The students were allowed to do research during class time, but that topic was blocked by school computers. With such a search, would an algorithm unfairly label my child as someone wanting to experiment with drugs?
The data mining must stop. Most people I have warned about this over the years are also concerned, but they do nothing to fight it. People need to take action. Speak up and fight this!
I think most reasonable parents talk to their children about the pitfalls of technology and how data/posts/photos will live on forever in the cloud. Children (teens) don’t understand the concept of “forever,” and they don’t yet grasp the big consequences of their behavior. They are also impulsive creatures and clique-oriented. Parents are finding it very difficult to keep up with the fast pace of technology development. It’s not that we don’t want to take action... we don’t know how to take action, and most parents don’t realize how much data is being collected on their children during a normal school day. We can’t live under a rock, and we can’t expect our children to do that either. It’s a Catch-22 situation.