
No good deed goes unpunished: EFF slams Google for alleged violation of ambiguous privacy pledge


I have small children and, like any reasonably competent parent, I take an interest in monitoring their Internet usage. In particular, I am sensitive to what ad content they are being served and which sites they visit that might try to misuse their information. My son even uses Chromebooks at his elementary school, which underscores this concern for me, as I can’t always be present to watch what he does online. However, also like any other reasonably competent parent, I trust his school and his teacher to make good choices about what he is allowed to do online when I am not there to watch him. And so it is that I am both interested in and rather perplexed by what has EFF so worked up in its FTC complaint alleging privacy “violations” in the “Google for Education” program.

EFF alleges three “unfair or deceptive” acts that would subject Google to remedies under Section 5 of the FTCA: (1) students logged into “Google for Education” accounts have their non-educational behavior individually tracked (e.g., performing general web searches, browsing YouTube, etc.); (2) the Chromebooks distributed as part of the “Google for Education” program have the “Chrome Sync” feature turned on by default (ostensibly in a terribly diabolical effort to give students a seamless experience between using the Chromebooks at home and at school); and (3) the school administrators running particular instances of “Google for Education” have the ability to share student geolocation information with third-party websites. Each of these acts, claims EFF, violates the K-12 School Service Provider Pledge to Safeguard Student Privacy (“Pledge”), which was authored by the Future of Privacy Forum and the Software & Information Industry Association, and to which Google is a signatory. According to EFF, Google referenced its signature in its “Google for Education” marketing materials, thereby creating the expectation in parents that it would adhere to the Pledge’s principles; it then failed to do so and thus should be punished.

The TL;DR version: EFF appears to be making a simple interpretive error — it believes that the scope of the Pledge covers any student activity and data generated while a student is logged into a Google account. As the rest of this post will (hopefully) make clear, however, the Pledge, though ambiguous, is more reasonably read as limiting Google’s obligations to instances where a student is using Google for Education apps; it does not apply where the student is using non-Education apps, whether she is logged in with her Education account or not.

The key problem, as EFF sees it, is that Google “use[d] and share[d] … student personal information beyond what is needed for education.” So nice of them to settle complex business and educational decisions for the world! Who knew it was so easy to determine exactly what is needed for educational purposes!

Case in point: EFF contends that Google’s use of anonymous and aggregated student data in order to improve its education apps is not an educational purpose. Seriously? How can improving the educational apps themselves not serve an educational purpose?

And, according to EFF, the fact that Chrome Sync is ‘on’ by default in the Chromebooks only amplifies the harm caused by the non-Education data tracking because, when students log in outside of school, their behavior can be correlated with their in-school behavior. Of course, this ignores the fact that the same limitation applies to the tracking wherever it occurs — it happens only on non-Education apps. The Chrome Sync objection thus reduces to a vague complaint about geography: the fact that Google can correlate an individual student’s viewing of a Neil deGrasse Tyson video in a computer lab at school with her later finishing that video at home is, or so EFF claims, somehow really bad.

EFF also takes issue with the fact that school administrators are allowed to turn on a setting enabling third parties to access the geolocation data of Google education apps users.

The complaint is fairly sparse on this issue — the claim is essentially limited to the assertion that “[s]haring a student’s physical location with third parties is unquestionably sharing personal information beyond what is needed for educational purposes[.]” While it’s possible that third parties could misuse student data, a presumption that third-party access to geolocation data is per se beyond any educational use strikes me as unreasonable.

Geolocation data, particularly on mobile devices, could allow for any number of positive and negative uses, and without more it’s hard to take EFF’s premature concern all that seriously. Did EFF conduct a study demonstrating that geolocation data can serve no educational purpose, or that the feature is frequently abused? Sadly, it seems doubtful. Instead, it appears to be relying upon the rather loose definition of likely harm that we have seen in FTC actions in other contexts (more on this problem here).

Who decides what ambiguous terms mean?

The bigger issue, however, is the ambiguity latent in the Pledge and how that ambiguity is being exploited to criticize Google. The complaint barely conceals EFF’s eagerness to go after Google, and gives one the distinct feeling that the Pledge and this complaint are part of a long game. Everyone knows that Google’s entire existence revolves around the clever and innovative employment of large data sets. When Google announced that it was interested in working with schools to provide technology to students, I can only imagine how the anti-big-data-for-any-commercial-purpose crowd sat up and took notice, just waiting to pounce as soon as an opportunity, no matter how tenuous, presented itself.

EFF notes that “[u]nlike Microsoft and numerous other developers of digital curriculum and classroom management software, Google did not initially sign onto the Student Privacy Pledge with the first round of signatories when it was announced in the fall of 2014.” Apparently, it is an indictment of Google that it hesitated to adopt an external statement of privacy principles authored by a group with no involvement in Google’s internal operations or business realities. EFF goes on to note that it was only after “sustained criticism” that Google “reluctantly” signed the Pledge. So the company was badgered into signing a pledge it was reluctant to sign in the first place (almost certainly for exactly these sorts of reasons), and it is now being skewered by that pledge’s own proponents. Somehow I can’t help but get the sense that this FTC complaint was drafted even before Google signed the Pledge.

According to the Pledge, Google promised to:

  1. “Not collect, maintain, use or share student personal information beyond that needed for authorized educational/school purposes, or as authorized by the parent/student.”
  2. “Not build a personal profile of a student other than for supporting authorized educational/school purposes or as authorized by the parent/student.”
  3. “Not knowingly retain student personal information beyond the time period required to support the authorized educational/school purposes, or as authorized by the parent/student.”

EFF interprets “educational purpose” as anything a student does while logged into her education account and, by extension, treats even non-educational activity as generating “student personal information.” I think a fair reading of the Pledge undermines this position, however. The correct interpretation is that “educational purpose” and “student personal information” are more tightly coupled, such that Google’s collection of student data is circumscribed only when the student is actually using the Google for Education apps.

So what counts as “student personal information” under the Pledge? “Student personal information” is “personally identifiable information as well as other information when it is both collected and maintained on an individual level and is linked to personally identifiable information.” Although this is fairly broad, it is limited by the definition of “Educational/School purposes,” which are “services or functions that customarily take place at the direction of the educational institution/agency or their teacher/employee, for which the institutions or agency would otherwise use its own employees, and that aid in the administration or improvement of educational and school activities.” (emphasis added).

This limitation in the Pledge essentially sinks EFF’s complaint. A major part of EFF’s gripe is that Google tracks students when they interact with non-Education services. However, the Pledge limits the collection of information only in contexts where “the institutions or agency would otherwise use its own employees” — a definition that clearly does not extend to general Internet usage. It would reasonably cover activities like administering classes, tests, and lessons; it would not cover activity such as general searches, watching videos on YouTube, and the like. Key to EFF’s error is that the Pledge operates not on accounts but on activity — in particular, educational activity “for which the institutions or agency would otherwise use its own employees.”

To interpret Google’s activity in the way that EFF does is to treat the Pledge as a promise never to do anything, ever, with the data of a student logged into an education account, whether generated as part of Education apps or otherwise. That just can’t be right. Think through the implications of EFF’s complaint and the ultimate end has to be that Google must obtain a permission slip from parents before offering access to Google for Education accounts; administrators and Google simply are not allowed to provision the services otherwise.

And here is where the long game comes in. EFF and its peers induced Google to sign the Pledge while fully understanding that their interpretation would necessarily require a rewrite of Google’s business model. Not only is this sneaky, it’s also ridiculous. By way of analogy, it would be similar to giving each parent an individual say over what textbooks or other curricular materials their children are allowed to access: either a single parent would have a total veto, or else certain students would be frozen out of homework and other activities performed with a Google for Education app. That may work for Yale students hiding from microaggressions, but it makes no sense to read such a contentious and questionable educational model into Google’s widely offered apps.

I think a more reasonable interpretation should prevail. The privacy pledge is meant to govern the use of student data while that student is acting as a student — which, in the case of Google for Education apps, means while using those apps. Plenty of other Google apps could be used for educational purposes, but Google is intentionally drawing a sensible dividing line in order to avoid exactly this sort of problem (as well as problems that could arise under other laws directed at student activity, most notably COPPA). It is entirely unreasonable to presume that Google, by virtue of its socially desirable behavior of enabling students to have ready access to technology, is thereby prevented from tracking individuals’ behavior on non-Education apps as it chooses to define them.

What is the Harm?

According to EFF, there are two primary problems with Google’s gathering and use of student data: gathering and using individual data in non-Education apps, and gathering and using anonymized and aggregated data in the Education apps. So to what evil end does Google put the data gathered outside the Education apps?

“Google not only collects and stores the vast array of student data described above, but uses it for its own purposes such as improving Google products and serving targeted advertising (within non-Education Google services)”

The horrors! Google wants to use student behavior to improve its services! And yes, I get it, everyone hates ads — I hate ads too — but at some point you need to learn to accept that the wealth of nominally free apps available to every user is underwritten by the ad-sphere. So if Google is using the non-Education behavior of students to gain valuable insights that it can monetize and thereby subsidize its services, so what? This is life in the twenty-first century, and until everyone collectively decides that we prefer to pay for services up front, we had better get used to being tracked and monetized by advertisers.

But as noted above, whether you think Google should or shouldn’t be gathering this data, it seems clear that the data generated from use of non-Education apps doesn’t fall under the Pledge’s purview. Thus, perhaps sensing the problems in its non-Education use argument, EFF also half-heartedly attempts to demonize certain data practices that Google employs in the Education context. In short, Google aggregates and anonymizes the usage data of the Google for Education apps, and, according to EFF, this is a violation of the Pledge:

“Aggregating and anonymizing students’ browsing history does not change the intensely private nature of the data … such that Google should be free to use it[.]”

Again, the “harm” is that Google actually wants to improve the Education apps: “Google has acknowledged that it collects, maintains, and uses student information via Chrome Sync (in aggregated and anonymized form) for the purpose of improving Google products[.]”

This of course doesn’t violate the Pledge. After all, signatories to the Pledge promise only that they will “[n]ot collect, maintain, use or share student personal information beyond that needed for authorized educational/school purposes.” It’s eminently reasonable to include the improvement of the provisioned services as part of an “authorized educational … purpose[.]” And by ensuring that the data is anonymized and aggregated, Google is clearly acknowledging that some limits are appropriate in the education context — that it doesn’t need to collect individual and identifiable personal information for education purposes — but that improving its education products the same way it improves all its products is an educational purpose.

How are the harms enhanced by Chrome Sync? Honestly, it’s not really clear from EFF’s complaint. I believe that the core of EFF’s gripe (at least here) has to do with how the two data-gathering activities may be correlated. Google has Chrome Sync enabled by default, so when students sign on from different locations, their Education apps usage is recorded and grouped (still anonymously) for service improvement alongside their non-Education use. And the side-by-side generation of these two data sets creates the potential to track students in their educational capacity by correlating that data with information generated in their non-educational capacity.

Maybe there are flaws in the manner in which the data is anonymized; EFF obviously thinks anonymized data won’t stay anonymized. That is a contentious view, to say the least, but, regardless, it is a view in no way compelled by the Pledge. More to the point, merely possessing both data sets does nothing that clearly violates the Pledge.

The End Game

So what do groups like EFF actually want? It’s important to consider the context of this approach to privacy and its effects on social welfare. First, the Pledge was overwhelmingly designed for and signed by pure education companies, not large organizations like Google, Apple, or Microsoft — the Pledge itself is thus ill-suited to a multi-faceted business model. If we follow the complaint to its logical conclusion, a company like Google would face an undesirable choice: on the one hand, it can provide hardware to schools at zero cost or heavily subsidized prices, along with a suite of useful educational applications. However, as part of this socially desirable donation, it must also place a virtual invisibility shield around students once they’ve signed into their accounts. From that point on, regardless of what service they use — even non-educational ones — Google is prevented from using any data the students generate. At this point, one has to question Google’s incentive to remove huge swaths of the population from its ability to gather data. If Google did nothing but provide the hardware, it could simply leave its free services online as-is and let schools adopt them or not as they wish (subject, of course, to extant legislation such as COPPA) — thereby allowing itself to possibly collect even more data on the same students.

On the other hand, even if Google stayed the course, surely many other companies would think twice before wading into this quagmire, or, when they did, would offer severely limited services. For instance, one way of complying with EFF’s view of how the Pledge works would be to shut off access to all non-Education services: students logged into an education account could access word processing and email, but would be barred from YouTube, web search, and other services — and would consequently lose out on potentially novel educational options.

EFF goes on to cite numerous FTC enforcement actions and settlements from recent years. But all of the cited examples have one thing in common that the current complaint does not: each involved a violation of § 5 based on explicit statements or representations made by a company to consumers. EFF’s complaint, on the other hand, is based on a particular interpretation of an ambiguous document that was drafted in general terms, without reference to the complicated business practice at issue. What counts as “student information” when a user employs a general-purpose machine for both educational and non-educational purposes? The Pledge — at least the sections that EFF relies upon in its complaint — is far from clear and doesn’t obviously cover Google’s behavior.

Of course, the whole complaint presumes that the nature of Google’s services was somehow unfair or deceptive to parents — implying that there was at least some material reliance on the Pledge in parental decision-making. This misses a crucial detail, however: it is the school administrators who contract with Google for the Chromebooks and Google for Education services, not the parents or the students. Then again, maybe EFF doesn’t care; maybe it is, as I suggest above, just playing a long game whereby it can shoehorn Google’s services into some new sort of privacy regime. That wouldn’t be all that unusual: in other contexts we have seen even the White House willing to rewrite business practices wholly apart from the realities of privacy “harms.”

But in the end, this approach to privacy is just a very efficient way to discover the lowest common denominator in charity. If they even decide to brave the possible privacy suits, Google and other similarly situated companies will provide the barest access to the most limited services in order to avoid extensive liability under ambiguous pledges. And, perhaps even worse for overall social welfare, using the law to force compliance with voluntarily adopted, ambiguous codes of conduct is a sure-fire way to ensure that there are fewer and more limited codes of conduct in the future.
