Selling Cheap


December 10th was an interesting day for reports on apps for learning.

The Joan Ganz Cooney Center released a report on the ways in which technology can be used to foster and support improved reading skills among children. The report covers a fair amount of ground, and is worth reading in its entirety. Part of the report included a scan of apps and web sites focused on supporting literacy. From the report:

Digital products aimed at building literacy skills in young children are a significant segment of the market. Yet many of these products may not be providing the educational benefit they claim. Few apps and e-books have information in their descriptions that point to any effectiveness studies to back them up, and most only focus on very basic literacy skills that would not be useful for children who are beginning to learn skills like grammar and storytelling.

Surveillance

Also on December 10th, the FTC released its second report on privacy concerns with children's apps. The report indicates that people selling apps to kids are still collecting data from those kids, and still doing it without informing parents.

Staff examined hundreds of apps for children and looked at disclosures and links on each app’s promotion page in the app store, on the app developer’s website, and within the app. According to the report, “most apps failed to provide any information about the data collected through the app, let alone the type of data collected, the purpose of the collection, and who would obtain access to the data. Even more troubling, the results showed that many of the apps shared certain information with third parties – such as device ID, geolocation, or phone number – without disclosing that fact to parents. Further, a number of apps contained interactive features – such as advertising, the ability to make in-app purchases, and links to social media – without disclosing these features to parents prior to download.”

From the first report, we see that apps designed to support literacy are doing a mediocre job of it. From the second report, we see that the manufacturers of these mediocre learning apps are doing a great job harvesting information without informing their users, or their users' parents. So, even if the kid using the app is having a mediocre learning experience, the manufacturer of the app can still use that demographic data to sell ads, raise additional VC money, and/or sell the user data outright.

If your kid attends a school that is rolling out an iPad program, it's worth asking whether the school has done a privacy audit on the apps it is using. Ask to see the documented process or rubric they use to evaluate the privacy of apps in their programs, and ask for examples of privacy policies they found incompatible with the rights of their learners.

If you are rolling out a 1:1 program, what do your privacy audits look like? What steps do you take to ensure that the privacy of your learners is respected? How do you communicate about this to teachers, students, and families?

As adults, we can make decisions about how we want to protect (or not protect) our privacy. But we shouldn't require kids and their families to expose themselves to marketers as a precondition to learning. Additionally, given that some of the more popular apps don't promote higher-level thinking, if we are going to trade away privacy as the price of learning via apps, we should at least ensure that we get something worthwhile in exchange.

Photo Credit: Lextech, via Nowhere Else
