Throwing cookies (and people) out of windows
Peter Leonard, Data Synergies
The defenestration of Prague and the deprecation of cookies have two things in common.
Most people have no idea what the words defenestration and deprecation mean. Both involve throwing things out the window.
The defenestration of Prague refers to three incidents of mob rule in the history of Bohemia, in each of which governors and town councillors were defenestrated (thrown out of a window). The most famous defenestration, in 1618, led to the Thirty Years’ War and significantly influenced the history of Europe.
About 400 years later, deprecation of the cookie is inspiring human passion, albeit expressed this time with words, not biceps. As most readers of this column will know, in January 2020 Google announced that it would deprecate (cease to support) third-party cookies over the following two years. Then in June 2020, Apple announced forthcoming limits on the use of its mobile device identifier, the Identifier for Advertisers (IDFA), in iOS apps. Because digital marketers rely on these identifiers for targeting, measuring and personalising digital ad campaigns, many have warned that these changes will create new ‘dataopolies’, in which the large digital platforms’ access to first-party data gives them an enduring advantage unmatchable by other digital advertising intermediaries.
Two debates mashed up: privacy practices and personalisation
Debate over deprecation of cookies has been confused and confusing, because two distinct concerns are entangled. The first concern is the poor data privacy practices of some entities active in digital marketing and advertising. The second, newly emerging concern is expressed by some consumer rights advocates and focusses upon personalisation or individuation. Unless these concerns are considered separately, the debate goes nowhere, because would-be debaters are talking different languages. Let us do some unpacking.
When ad data is personal information about people
Many publishers, marketers and adtech vendors, and some digital platform providers, claim that the data they use for marketing and behavioural advertising purposes is not ‘personal data’ or ‘personal information’. These entities argue that relevant data cannot be associated with an identifiable individual because the data relates to a device or tracking code which cannot be associated with an identifiable individual.
Privacy advocates respond that this is rubbish, at least for some of the individuals using devices or browsers. These advocates argue that with access to multiple data points, a motivated entity could deduce the identity of some individuals using a device or browser, that therefore relevant data could be associated with an identifiable individual, and should therefore be treated as regulated personal information about individuals. They then conclude that disclosures about marketing uses do not comply with the law, are misleading by omission, and that data is being shared between digital advertising intermediaries without each of them observing the constraints required by privacy law.
Some digital marketers respond that they have implemented controls and safeguards that reliably ensure that relevant data tracked by device or online code cannot be associated with identifiable individuals, and therefore never becomes personal information about individuals.
Many privacy advocates say simply that they do not believe such claims.
Other privacy advocates say that even where a particular publisher, marketer, adtech vendor or digital platform has put its own deidentification management house in order, it cannot reliably say whether the other entities with whom it shares relevant data have done likewise. Because deidentification management across the ecosystem is not assured, they argue, any disclosure of relevant data within that ecosystem must be treated as a disclosure of personal information about identifiable individuals.
At least some of the digital platforms agree with the privacy advocates: hence the deprecation of the third-party cookie and the changes with Apple iOS 14.5. However, uses of first-party cookies within the digital platform may nonetheless sail on, because the digital platform says that it can vouch for the reliability of its own (internal) controls and safeguards for deidentification management.
More than consumer notices: data architecture and assurance frameworks
The outcome of this data privacy debate will turn upon the design of ad data architectures, frameworks for assurance of ad data flows in multiparty ad data ecosystems, and which entities are able to demonstrate reliable controls and safeguards around deidentification. The debate could be resolved in compliance with existing data privacy laws if there were an industry-wide commitment to demonstrably reliable and verified practices in data governance and compliance assurance. Some publishers, marketers and adtech vendors still do not seem to think that this is necessary. They seem to think that the relevant concern can be resolved through greater transparency to consumers, such as improved notices to affected individuals about uses of data for digital marketing. Improved notices are comparatively easy. Reliable and verified governance and assurance are hard. Credibility requires more than claims.
A new conflict: personalisation or individuation
Meanwhile, a new battleship of potential woe has snuck into port. The ACCC’s Digital Platforms Inquiry Final Report recommended that the Privacy Act should be amended to provide that personal information includes “data such as IP addresses, device identifiers, location data, and any other online identifiers that may be used to identify an individual”. The justification was not clearly explained by the ACCC. The Commission referenced the European Union’s General Data Protection Regulation (GDPR) and implied that the amendment would reflect GDPR. However, this is not correct: any such amendment would operate much more broadly than relevant GDPR provisions.
In any event, the recommendation has been taken up by some consumer advocates, who argue that cookies and other tracking code can be used to create a detailed picture of ‘the consumer behind the device’ and to individuate the treatment of (non-identified) individuals, thereby (the argument goes) exposing consumers to growing risks of manipulation, exclusion and discrimination.
The argument is that this capacity for differentiated treatment, or individuation, of non-identified individuals, creates sufficient risk of harms to the consumers behind the devices that the devices (and tracking codes) should effectively be treated the same as consumers. Data collected using device identifiers or tracking code would then be treated as personal information about individuals. The suggestion appears to be that the risk of manipulation, exclusion and illegal discrimination is such that device and code tracking should be regulated without any need for a regulator or complainant to demonstrate real or likely manipulation, exclusion or illegal discrimination.
Where conflict over individuation may lead
If this argument is accepted, reliable controls and safeguards around deidentification would cease to be relevant. Most tracking for digital advertising would become impractical or legally impossible, at least outside of digital platforms using first-party data collected by them or media publishers with whom they directly deal. The digital advertising sector would end up looking very different from how it looks today.
In some industry sectors, small tweaks to regulation can fundamentally affect who can operate and their freedom to operate. Proposals for changes to the definition of personal information are often framed as merely about ensuring that consumers get more informative notices as to marketing uses of data. But the tweaks now proposed by some consumer advocates could end many uses of data for personalisation or individuation, regardless of how well entities do deidentification management, and regardless of whether differentiation between consumers amounts to manipulation, exclusion or illegal discrimination. If deprecation of tracking codes leads on to broad prohibitions on uses of data for personalisation or individuation, the deprecation might end up much more like a defenestration, but with a much quicker outcome than the Thirty Years’ War.
6 June 2021