Digital advertising people are much better than data privacy people at making up and using technical terms and abbreviations.
However, there is one abbreviation, known to all data privacy people, that needs to find its way into adtech land: BOPA.
Why should digital advertising people care about BOPA?
Because the ACCC cares about BOPA, and what the ACCC concludes about it will affect who can participate in the digital advertising supply and value chain, and how much each participant may earn. Does this sound overly dramatic? Calibrate this statement against the views of your friends working at Google and Facebook and media publishers – just ask them about shifts in value driven by the ACCC-invented news bargaining code.
But I’m getting ahead of myself: I need to first explain BOPA.
BOPA, or Because Of the Privacy Act, is the excuse used by data holders to close down discussion as to ways in which data might be shared or made available to others. The excuse usually refers to data as a thing, and treats any risk that a recipient might further disclose, accidentally leak or be hacked, overuse, or misuse that thing as an existential threat to the human the data is about. Data holders frequently imply that their trustworthiness and excellent data custodianship stand in stark contrast to the unreliable, untrustworthy or unknown data handling practices of would-be data recipients. Often this argument is sufficient to close down discussions with the would-be data recipient. If not, the data holder usually goes on to say that if the data recipient did something wrong, or there was a data breach, the affected human would blame the data discloser, and the trust of users hard earned by the data holder would be instantly lost.
BOPA is sometimes justified, but frequently does not pass careful scrutiny.
Consumer trust is different to trustworthiness. Many consumers no longer believe the assertion “we take your privacy really seriously”, because that statement is too often made by a business that doesn’t. Trust requires transparency (aka good communication and visibility when things go wrong), commitment and accountability (aka consequences when things go wrong). Just like a family, really.
Trust can and must be earned. For consumer trust to flow through multi-party data ecosystems, a few things usually need to hold. Consumers need to believe that they have a choice, so that they can bestow trust or withdraw it, and they need to believe that any entity that breaches their trust will be detected, called to account, and suffer real pain.
The BOPAers are right about at least one thing: trustworthiness is hard to demonstrate and maintain across multi-party data ecosystems. Multi-party data ecosystems are federations, voted on by citizens who are also consumers. Some citizens don’t want federations. Some prefer dictators: there will always be people who vote for populist demagogues like Donald Trump. Big businesses sometimes should be believed when they say that only they can be trusted with data about us: it is true often enough – but not as often as those businesses tell us. Keeping data about us warehoused makes many of us feel safe – if we trust what may be going on within the four walls.
And this is not just about privacy. Although many digital service data sets are data about us, they should not be seen by default as our data. The data sets may be derived at significant cost by an entity to enable a compelling digital service to be provided to us by that entity. Just because an entity is a big business and the data it collects is about us should not cause that data to be made available as a free gift to us or to that entity’s competitors. That is not good economics. For data about us to be required to be shared through a multi-party data ecosystem, there should be good legal justification.
Let’s first think about that justification.
Data about us (note again: not our data) can give us choice, exercisable as data-driven votes.
We get data-driven votes in three ways.
Our most effective vote is our ability to spend our dollars elsewhere. That ability is lost when we really must, or believe that we must, have a digital product or service. At that point, we need a second vote: an ability to withhold data about us and still get a product or service, even if that alternative is not as good as the one we would have got had we provided our data.
Data privacy law, with all its faults and inadequacies, still makes sense. Notice to us about how data about us is being collected, used and shared, and our legal right to give or withhold consent to more intrusive or unusual data handling, gives us choice.
But we all know that sometimes this choice is just legal fiction: we can’t understand the notice, we don’t have the time to read it, we really, really need that digital product, and so on. At that point we need a third vote to be exercisable by our proxy: a regulator, or a ruler. Competition and consumer protection laws enable regulators to control intrusive, unusual or unreasonable data handling by entities that we ourselves cannot control. Competition regulators can cause data about us to be made available to us or to others, or locked down within any part of an entity. Data privacy laws (if the regulator is given the resources to enforce them) can do a lot less, but can still enable effective control of data uses and flows. These laws also give regulators power to poke around, to see which entity is doing what, behind their closed doors. This investigative power is particularly important in working out what is going on across a multi-party data ecosystem: we consumers only see the front door of the first party.
Let’s go back to basics on how we can make sharing of data about us work, within our existing laws. When is BOPA real?
Data sharing may be trustworthy if data that is shared is demonstrably protected by technical, operational and legal controls and safeguards that limit analysis, use or further disclosure of relevant data by each data recipient. Many users intuitively understand this, contrary to the views of many BOPAers. Often the affected user simply wants to be fully informed – and the BOPAer data holder doesn’t wish to inform the affected human, either because explaining what’s going on requires hard thinking about good communications and user engagement, or because it suits the data holder not to enable any data sharing.
Data is not a thing: it is a bundle of information about another thing, which thing may be living or inert, identifiable or not. That bundle can be sliced and diced. Privacy law provides incentives that reward good data privacy practices. It is now broadly accepted in Australian data privacy regulation that acts or practices of deidentification of personal information, preparatory to use of the deidentified information in ways that could not lead to later reidentification of the relevant humans (users, consumers, etc.) by any entity (the immediate recipient or further downstream), are not acts or practices required to be addressed by an APP entity in its privacy policy published pursuant to APP 1.4 or in any privacy notice given to an affected individual pursuant to APP 5. In other words, demonstrably reliable deidentification of personal information enables (properly deidentified) information to be shared and reused.
At this point the BOPAer data holder expresses concern as to reidentification risk through mosaic effects or through linking with or look-up against other data sets (e.g. device identifiers matched with user identifiers) by a direct or indirect recipient. Yes, those risks must be carefully evaluated. Sometimes a careful evaluation concludes that reidentification risk is quite low, but the risks of privacy harms to the individual are not remote, and accordingly the data should be treated as personally identifying and therefore still regulated as personal information about an individual. Sometimes the evaluation concludes that, after taking into account controls and safeguards that limit analysis, use or further disclosure of relevant data by each possible data recipient and carefully considering their likely effectiveness, risks of reidentification and privacy harm to an individual are low, and therefore the data can be shared because it is no longer regulated personal information about an individual.
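To make the deidentification discussion concrete, here is a minimal, purely illustrative sketch of the kind of transformation a data holder might apply before sharing: direct identifiers are replaced with a keyed hash, and quasi-identifiers are generalised. All field names and values are hypothetical, and this is one possible technique, not a description of any particular entity’s practice; real deidentification also requires the contractual and operational safeguards discussed above.

```python
import hashlib
import secrets

# Hypothetical sketch only: pseudonymise direct identifiers and
# generalise quasi-identifiers before sharing. This reduces, but does
# not eliminate, reidentification risk (mosaic and linkage attacks
# must still be assessed separately).

SALT = secrets.token_bytes(16)  # held by the data holder, never shared

def pseudonymise(record: dict) -> dict:
    """Return a sharable record: keyed hash instead of the raw user ID,
    age coarsened to a decade band, location coarsened to a prefix."""
    token = hashlib.sha256(SALT + record["user_id"].encode()).hexdigest()
    return {
        "token": token,                              # stable within one release only
        "age_band": f"{record['age'] // 10 * 10}s",  # e.g. 34 -> "30s"
        "region": record["postcode"][:2],            # coarsen location
        "interest": record["interest"],
    }

shared = pseudonymise({"user_id": "u-1001", "age": 34,
                       "postcode": "2000", "interest": "sport"})
```

Because the salt stays with the data holder, a recipient cannot recompute tokens from known user IDs; rotating the salt per release prevents linkage across releases.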
The key data privacy point is this: what an entity is permitted by data privacy law to do, or must not do, with a bundle of information about a human, depends upon how the entity slices and dices that bundle of information, who they permit to see the sliced and diced information, and whether the sliced and diced information then might move outside of the controls and safeguards protecting it. This is about good data governance: evaluation, mitigation and management of residual privacy risks and harms. Data privacy law is not stupid – although sometimes the advice you read about it makes you wonder… !
The ACCC’s Digital advertising services inquiry – interim report released in late January 2021 shows awareness of BOPA. The ACCC cites examples of “a recurring theme in this industry: a tension (real or claimed) between consumer privacy on the one hand and transparency and competition on the other. In each example, publishers or advertisers (as applicable) claim that they need greater access to raw data about the operation of the ad tech service to properly evaluate how well their service providers are performing, and therefore to make effective choices on which services to use”. The ACCC refers to Google’s proposed changes to the treatment of third-party cookies by its Chrome browser. The ACCC asserts that “Google often publicly claims that privacy legislation, or consumer expectations of privacy, prevent it from releasing the data sought. But without access to the more detailed information, publishers and advertisers consider that they have to make decisions based on trust that the service is operating as claimed, which is unacceptable in a commercial relationship”.
The ACCC then outlines possible data interoperability measures, including “tools that would increase the data mobility between firms without a request from a consumer. For example, requiring firms with a significant data advantage to offer access to rival firms in adjacent markets to specified types of data in a standardised format, in certain circumstances”. The ACCC moots possible introduction of a secure common transaction ID, and a common user ID, “which would enable ad tech providers to link together disparate datasets for use in” performing ad targeting functions. The Commission notes that “any measures to increase data mobility should be carefully designed to ensure that there are effective mechanisms to manage the risks that deidentified data may become re-identified and to ensure that consumers have effective controls over the sharing of their personal data”.
Many readers will be familiar with the activities of IAB Tech Lab’s Project Rearc working groups, which are working towards a framework for consumers to set persistent privacy preferences that are given effect across devices and through a multiparty ad data ecosystem. Current projections are that Project Rearc will release early designs for public comment soon, in Q2 2021. As we all know, the devil is in the detail: whether the framework is practical enough to be capable of implementation, and how controls, traceability and revocability are built in.
The ACCC has expressed willingness to engage with sensible proposals for a common transaction ID that would allow providers along the ad supply chain, as well as advertisers and publishers, to follow individual ad impressions across the supply chain and better observe the performance of their ad tech services.
The ACCC is also seeking submissions on whether a common user ID could be used to improve the ability of third parties to provide independent attribution services. A common user ID is different to a transaction ID, in that it allows the tracking of a user (subject to privacy protection) rather than the bids for a particular advertising impression. The ACCC states “multi-touch attribution can be difficult if DSPs use different user IDs. If users were assigned common IDs accessible to all third party attribution providers, they would be able to track all ads seen by a user, regardless of the DSP that served each ad. Overall this would improve the ability of attribution providers to provide full and independent attribution of ads served using all DSPs, including Google’s DSPs. This could help to improve transparency over the performance of ad tech services and thereby promote competition in the provision of DSP services more broadly”.
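As a toy illustration of why a common transaction ID matters (this is not an ACCC or IAB Tech Lab specification, and all names and log structures below are invented for the example): if every party in the supply chain records the same transaction ID against an impression, an independent auditor or attribution provider can join those separate logs and follow one impression end to end, without any party disclosing its full dataset.

```python
import uuid

# Invented example: an SSP-side log and a DSP-side log, each keyed by a
# shared transaction ID minted when the impression is created.

def new_transaction_id() -> str:
    """Mint a globally unique ID that every party logs for this impression."""
    return str(uuid.uuid4())

txid = new_transaction_id()
ssp_log = {txid: {"publisher": "news-site", "floor": 2.00}}
dsp_log = {txid: {"advertiser": "brand-x", "bid": 2.50}}

def reconcile(txid: str, ssp_log: dict, dsp_log: dict) -> dict:
    """Join both sides of one impression on the shared transaction ID,
    giving an end-to-end view of that single transaction."""
    return {**ssp_log[txid], **dsp_log[txid]}

trail = reconcile(txid, ssp_log, dsp_log)
```

A common user ID would work the same way but key the join on a (privacy-protected) person rather than a single impression, which is what makes independent multi-touch attribution possible and why its privacy risks are correspondingly higher.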
So the ACCC is now looking at how new ad tech might be used to address BOPA (“subject to privacy protection”), promote competition and improve outcomes for consumers. This is a challenging task. Experience with the news bargaining code demonstrates that the ACCC can be willing to go hard, take positions and push those positions over industry objections. Interesting times indeed in ad tech land.