Filtering and Surveillance Should Not Be Considered Protection

Yesterday, two applications that use student and parent data were written up on EdSurge. Both applications put student social media use under surveillance, and both frame that surveillance and data collection as serving students' best interests.

The two apps are Securly and Mevoked. Securly describes itself as "filtering 2.0 for schools and families"; Mevoked describes itself as "bridging the gap between mental health and technology".

Securly

In the EdSurge piece, a director of instructional innovation describes what Securly offers:

“From the Securly dashboard, the administrators can see what students have and haven’t been able to access,” she explains. “If I want to see what kids are posting on Twitter or Facebook, I can--everything on our Chromebooks gets logged by Securly.”

So, let's parse this out. Securly logs and tracks what kids are doing on Facebook and Twitter. Because these activities are on Chromebooks, we can assume some level of email, search, and docs logging, at the very least. But it's necessary to remember that technology is not neutral, and that the direction a technology takes can be shaped by the context within which it's used.

The social media filtering makes an especially significant difference at schools like [name redacted], a Catholic all-girls high school where most students bring their school-supplied Chromebooks home. “Most of our students are economically disadvantaged, and use our device as their only device,” [name redacted] explains. “Students take Chromebooks home, and the Securly filters continue there.”

So, at a Catholic girls' school serving poor kids, the school issues a device to all students that tracks their online behavior, fully aware that for many of those kids this is their main conduit to the internet. I can only imagine what the response would look like if and when a student at the school looks for resources on coming out, or for resources or protection from abuse.

And in case anyone was unclear on the vision and direction of the company, Awais Ahsan, the company founder, lays it out:

Ahsan envisions Securly as eventually connecting educators and parents in monitoring all social use of technology by students. “A tool for parents to log in and see a view for their particular child, and have alerts through SMS for these activities, would complete the picture in our eyes,” he says.

Nothing supports student autonomy and growth like text messages to parents when their kid is searching online. As for any argument that online learning can support open-ended inquiry, or that personal interests can inform and drive a student's academic growth? Nope.

Ahsan reminds that so far, Securly only functions on school-issued devices, which students ostensibly should not be using for personal social media use in the first place.

Those pesky student interests should never get mingled in with the important stuff: teacher-directed activities, which need to be constantly logged to protect kids.

It also bears highlighting that describing an internet filter and activity tracker as a bullying prevention tool is some incredible marketing spin.

Mevoked

Mevoked has some similarities to Securly, but with a different focus. The following quotation from the EdSurge piece features Mevoked founder Arun Ravi:

Like Securly, Mevoked analyzes social mobile and online data, but focuses on mental health and connecting individuals with online and in-person resources. “We want to fill in the gap of identifying negative behavior and be the conduit to managing your condition,” Ravi says.

While the precise nature of the "condition" requiring management remains vague, we clearly need data to do what we need to do. With regards to data collection and analysis, Mevoked falls back on the "Google is already doing more of this" explanation:

Ravi explains that Mevoked accumulates data about how students use technology that is already largely accessible. “There’s no barrier in collecting this data,” he says. “We’re doing exactly what Google does when they advertise to you, using the same algorithms to assess mental health.”

This comparison draws an odd parallel between advertising and mental illness, and the statement doesn't hold up. Granted, Google sucks up data like a caffeinated Dyson salesperson, but Google is not selling a mental health big data app. The use of the data matters. Mevoked aims to use the data it collects to put pressure on teachers:

By offering Mevoked to schools, Ravi hopes “to put the onus on educators to take a more active interest” in student mental health.

To be very clear: increasing emotional support for kids in schools is a very good thing. But contextualizing that support within a mental health framework is problematic, and placing non-mental-health professionals on the front line of mental health issues has the real potential to do more harm than good. It's hard to tell what's worse: equipping teachers who may or may not have any expertise in mental health with the name of a student in need, or outsourcing these judgments to an algorithm outside any form of informed local professional review.

The EdSurge piece also cites an ongoing study of the app with Lewis & Clark College students. The article states that a "senior psychology major at Lewis & Clark ... is conducting the study with Mevoked." It's unclear what the study includes, how it is run, and what level of supervision is in place, but as described, it sounds like an undergrad psych major is running a study on classmates with the support of a tech company. I suspect and hope I am missing some key details here, because this work sounds like it tramples all over the gray area between a tech pilot and a research experiment on human subjects. I hope and trust that any work asking students to share mental health data includes supervision by mental health and/or medical professionals. While that is likely, and hopefully, in place, no such supervision is mentioned in the article.

While I was writing this piece, the people running the Mevoked Twitter account reached out to dispute some of my descriptions of their privacy policy, and to highlight that they are likely pivoting to work more with adults. Moving away from direct outreach to schools would be a good thing, but even in the case of a full pivot to only working with adults, the privacy issues highlighted here still need clarification. I took a screenshot of the conversation, as well as of the Mevoked privacy policy in place when this post was written.

Online Filtering, Mental Health, Surveillance, and Privacy

After reading the EdSurge piece, I took a quick jump over to the privacy policies of Securly and Mevoked. I didn't do a full review of their policies, but a quick read showed some of the common issues where data could potentially leak out.

Securly - the monitoring and logging application used in a school serving "disadvantaged" kids - reserves the right to sell any data collected "in connection with a sale of all or substantially all of the assets of Securly or the merger of Securly into another entity". Additionally, Securly reserves "the right to fully use and disclose any information that is not Personal Information (such as statistics, most frequented domains, etc)." Based on the amount of data collected by a logging service (for example, imagine what you have done on your computer and on the internet in the last 45 minutes), Securly would appear to have a sizeable trove of data on student usage patterns. Securly also reserves the right to change their terms of service and privacy policies at any time, with no notice.
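To make the scale of that trove concrete, here is a minimal, hypothetical sketch in Python. The field names, domains, and student ID are invented for illustration and are not Securly's actual schema or code; the point is only that "statistics" and "most frequented domains" are one trivial aggregation away from per-student raw logs.

```python
from collections import Counter

# Hypothetical log entries of the kind a filtering/logging service could
# accumulate for one student in under an hour of browsing. Every field
# name and value here is invented for illustration.
activity_log = [
    {"student_id": "s-1042", "domain": "facebook.com", "action": "post"},
    {"student_id": "s-1042", "domain": "google.com",   "action": "search"},
    {"student_id": "s-1042", "domain": "twitter.com",  "action": "post"},
    {"student_id": "s-1042", "domain": "twitter.com",  "action": "visit"},
]

# "Non-Personal Information (such as statistics, most frequented domains)"
# is a one-line aggregation over that raw trove, and the raw trove itself
# records exactly the kinds of searches and posts described above.
domain_stats = Counter(entry["domain"] for entry in activity_log)
print(domain_stats.most_common(3))
# [('twitter.com', 2), ('facebook.com', 1), ('google.com', 1)]
```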

A reasonable person might expect that an app like Mevoked, aiming to "bridg[e] the gap between mental health and technology," would have a more solid privacy policy. However, Mevoked quickly dashes that expectation. Mevoked uses information collected from parents and about children to advertise to parents:

We may use information collected from parents,to send such users you (sic) news and newsletters, special offers, and promotions, or to otherwise contact such users about products or information we think may be of interest. We will not send marketing or promotional materials to children.

So, an app created to support mental health will use data collected within the app to market to parents. The potential for opportunistic marketing here is mind-boggling, and one can only imagine what Big Pharma would do with this dataset. Of course, this trove of curated mental health data is also a business asset, and can be transferred with no conditions:

If we are acquired by or merged with another company, if substantially all of our assets are transferred to another company, or as part of a bankruptcy proceeding, we may transfer the information we have collected from you to the other company.

And, of course, data collected within Mevoked can be shared in aggregate or de-identified form:

We may share aggregate or de-identified information about users with third parties for marketing, advertising, research, or similar purposes.

This assumes, of course, that the dataset can be adequately de-identified, and that it won't be combined with any external datasets. And we need to re-emphasize here: the dataset in question contains data points that create a partial picture of the mental health of individuals, tied to their identity, and the privacy policy says nothing prohibiting the recombination of the Mevoked dataset with other datasets.
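To make that risk concrete, here is a minimal, hypothetical sketch of the well-known linkage problem; every field name and record below is invented, and this is not Mevoked's data or code. Records stripped of names but carrying quasi-identifiers can be re-identified by joining them against an outside dataset that shares those fields.

```python
# Hypothetical "de-identified" records: names removed, but quasi-identifiers
# (ZIP code, birth year, gender) kept. All values are invented.
deidentified_release = [
    {"zip": "97219", "birth_year": 1996, "gender": "F", "flagged_mood": "negative"},
    {"zip": "97202", "birth_year": 1997, "gender": "M", "flagged_mood": "neutral"},
]

# Any outside list that pairs the same quasi-identifiers with names will do,
# e.g. a marketing list or a public roster.
outside_dataset = [
    {"name": "Jane Doe", "zip": "97219", "birth_year": 1996, "gender": "F"},
    {"name": "John Roe", "zip": "97202", "birth_year": 1997, "gender": "M"},
]

def link(release, roster):
    """Join the two datasets on the quasi-identifiers they share."""
    key = lambda r: (r["zip"], r["birth_year"], r["gender"])
    names = {key(r): r["name"] for r in roster}
    return [
        {**record, "probable_name": names[key(record)]}
        for record in release
        if key(record) in names
    ]

for row in link(deidentified_release, outside_dataset):
    print(row)  # each "de-identified" record now carries a probable name
```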

Both the Securly and Mevoked terms could be dramatically improved by stating that user data will never be included as part of any sale or transfer. If Securly is serious about being a filtering tool, it should not attempt to make user logs an asset. With regards to Mevoked, given that the company collects and analyzes data around an individual's mental health, treating that information as a financial asset seems poorly thought out, at best. Mevoked should also restrict any advertising or marketing related uses of its data. People are going to Mevoked for mental health reasons, not to help marketers get better at selling, and using mental health data in any form - with PII, in aggregate, or de-identified - to fuel marketing seems uncaring.

Conclusions

If you are a parent, concern for your children is natural. It's something we live with. But addressing that concern doesn't require constant surveillance, or throwing the data trail generated by that surveillance into the hands of a tech company. If you work in a school and you absolutely must filter, don't sell out your kids' online habits under a misplaced sense of "safety." If you are a kid, look for these intrusive devices, and ask pointed questions about them. If you encounter a filter, ask what is logged, and why. This is your education, and you deserve the freedom to pursue it on your terms, without having every keystroke logged. If you are a teacher who really thinks you need to intrude this deeply into your students' lives to make a difference, check yourself.

If you are running a tech company that is selling to schools and will get information on children as a direct result of your app, set up privacy policies that actually protect privacy and respect student autonomy and experimentation. Until you do that, you don't deserve our trust. Funders - the more you fund companies that trample on user privacy, the more you strengthen the impression that you care more about profits than people. And journalists and tech writers - please, push back on the techno-utopian narratives that get pushed your way. Read privacy policies, and ask companies questions about them. Think about the implications of the software you describe. Don't be afraid to call bullshit when it's needed.

And, as always, it circles back to the role students play in the learning environments we create. Securly and Mevoked, like most every EdTech app out there, treat students as observed objects rather than as creative people with agency. The EdTech space is filled with people trumpeting the potential of student-directed learning while building technology that reinforces the traditional paradigm of a teacher leading obedient students. We can't transform the process of learning while celebrating tools that remain rooted in the power structures we seek to change.

Please, Correct Any Misconceptions Here

I'm always open to the possibility/certainty that I have gotten something wrong. Please, if you see something here that is inaccurate, let me know.
