Repent, o ye ad trackers, for the cookiepocalypse is nigh!
If Google sticks to its roadmap, by this time next year Chrome will no longer allow websites to use third-party cookies, which are cookies that come from outside their own domains. The change theoretically makes it vastly more difficult for advertisers to track your activities on the web and then serve you targeted ads. Safari and Firefox have already blocked those cookies, but when it comes to market share, Chrome is currently the leader and so its switchover is the big one.
Blocking third-party cookies means that only websites you explicitly visit will be able to save those little cookie files on your computer, and they should theoretically only do what cookies were originally intended to do: keep track of smaller things like whether you’re logged in or which shopping cart is yours. Blocking third-party cookies also means ad networks can’t figure out who you are and serve you targeted ads, which is a big problem for the ad industry.
Google, which is the biggest player in online ads, has claimed that it does not intend to replace third-party cookies with “alternative identifiers to track individuals as they browse across the web.” This seems like a win for privacy all around, but if something about the story of Google as the privacy and anti-ad crusader strikes you as a little… off, you are far from alone.
Because of course Google doesn’t want to kneecap the online ad industry — the one it dominates and from which it makes all its money. Instead, Google wants to replace the third-party tracking cookie with a complicated set of (bird-themed) technologies that are meant to let ad companies target specific demographics like age and location, while at the same time allowing the people who are targeted to remain anonymous.
Google is trying to avert the cookiepocalypse for the ad tech industry, no repentance necessary.
And so today, the company is forging ahead with an “origin trial” for one of these new technologies, the Federated Learning of Cohorts (FLoC). In an origin trial, websites are able to begin testing without asking browser users to turn on specific flags. The feature itself will be slowly turned on inside Chrome via the usual process of introducing it into developer builds, then beta, then finally in the shipping version most people use.
But what the hell is FLoC, and does it really protect your privacy?
FLoC: a Federated Learning of Cohorts
FLoC is a proposed browser standard that, in Google’s words, will enable “interest-based advertising on the web” without letting advertisers know your identity. Instead, you’ll be associated with a “cohort,” a group of users large enough to make you at least semi-anonymous to the companies targeting you.
That’s the simple explanation. The technical one gets very complicated very quickly. Here’s a quick version. Chrome browsers will use algorithms (the “Federated Learning” part) to create a very large number of “cohorts,” groups of people that share certain qualities and interests. Each person’s individual browsing history is kept private and never shared with anybody, but the browser itself will look at the history and then assign a user to one of those cohorts.
When you visit a website, Chrome will tell that site that the visitor is part of cohort 198273 (or whatever) and then it’s up to the website to know that cohort 198273 (or whatever) is interested in pickup trucks and shoes with vegan leather. Since Chrome will never assign a user to a small cohort (Google has proposed that it will wait until there are “thousands” in a group), your identity as an animal-loving coal roller is theoretically protected.
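Under the hood, the first version of FLoC computes cohorts with SimHash, a locality-sensitive hash: browsers whose histories contain mostly the same domains end up with mostly the same hash bits, and therefore in the same or nearby cohorts. Here’s a toy Python sketch of the idea — an illustration of SimHash in general, not Chrome’s actual code, and the domain names are made up:

```python
import hashlib

def simhash(domains, bits=16):
    """Toy SimHash: similar sets of domains yield similar bit patterns."""
    counts = [0] * bits
    for domain in domains:
        h = int(hashlib.sha256(domain.encode()).hexdigest(), 16)
        for i in range(bits):
            # Each domain votes +1 or -1 on every bit position.
            counts[i] += 1 if (h >> i) & 1 else -1
    # The final hash keeps the majority vote for each bit.
    return sum((1 << i) for i in range(bits) if counts[i] > 0)

history_a = ["trucks.example", "veganshoes.example", "news.example"]
history_b = ["trucks.example", "veganshoes.example", "weather.example"]

cohort_a = simhash(history_a)
cohort_b = simhash(history_b)
# Mostly-overlapping histories differ in only a few bits of the hash.
print(bin(cohort_a ^ cohort_b).count("1"))
```

The real system then buckets those hashes into cohorts and withholds any cohort with too few members, which is where the “thousands of people per group” guarantee comes in.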
Chrome itself isn’t assigning any content labels to these FLoC cohorts; Google is leaving that to the ad tech industry to figure out. So you won’t be able to open up a privacy page inside Chrome and see what it thinks you’re interested in (though there’s theoretically nothing stopping a third-party website from telling you).
Because FLoC is structured this way, the powerful players in ad tech could become even more entrenched: they have the technology to parse what FLoCs mean and which ads to target against them. Or it could open a door for smaller players. We don’t know all the possible repercussions of FLoC, which is why it has both ad industry executives and privacy advocates so unsettled.
You can read the whole proposal and even check out the code for how it works at the GitHub repository for FLoC inside the Web Incubator Community Group. As with most things on the web, it’s being developed out in the open and is part of a process of proposals, critiques, counter-proposals, attempts to get other browser vendors to join, arguments, harangues, screeds, and good-faith efforts to make the web a better place. It’s a party, y’all.
The new front in the browser wars: privacy
No other browser vendor has signaled its intention to support FLoC. The rest are simply blocking third-party cookies and letting the chips fall where they may. And those chips are messy.
Whatever motivations you want to impute to the Chrome team, it is already apparent that simply blocking third-party cookies will lead to very problematic new workarounds from the ad tech industry. So Google is creating both FLoC and a suite of other technologies to replace the third-party cookie, in the hope of forestalling even worse replacements.
One of the very bad things Google is trying to forestall is fingerprinting. That’s the generalized term for ways that websites can identify you through little data signals that leak out of your browser when you visit a site. Sites can look at your IP address, the OS you’re browsing from, the size of your window, whether your browser supports Bluetooth controllers, and much more.
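None of those signals identifies you on its own, but combined they often do. A toy sketch of the technique, with made-up signal values (real fingerprinting scripts gather dozens more signals than this):

```python
import hashlib

# Hypothetical signal values a site might read from a visiting browser.
signals = {
    "user_agent": "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15) ...",
    "screen": "1440x900",
    "timezone": "America/New_York",
    "languages": "en-US,en",
    "touch_support": "false",
}

# Individually these values are common; concatenated and hashed,
# the combination can be near-unique to one browser.
fingerprint = hashlib.sha256(
    "|".join(f"{k}={v}" for k, v in sorted(signals.items())).encode()
).hexdigest()
print(fingerprint[:16])
```

The hash is stable across visits as long as the underlying signals are, which is exactly what makes fingerprinting so hard to shake: there’s no cookie to delete.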
Battling fingerprinting is a huge arms race for browser engineers and new, nefarious methods pop up seemingly weekly. Here’s a new method of fingerprinting I just came across: playing a very tiny bit of audio and then analyzing how your particular browser and device handle it, and then using that data to individually identify you in milliseconds. (The website that proposed it sells fingerprint services to legitimate companies so they can ostensibly use it to better identify potential fraudsters on their sites.)
Apple has very publicly and vociferously advocated for cutting off all methods of individualized tracking, including fingerprinting, and has committed itself to that arms race indefinitely. The Chrome team’s concern is essentially that such a hard line creates an incentive for legitimate ad tech companies to start engaging in fingerprinting, which will then be all but impossible to stop or regulate.
Here’s how Google puts it in its blog post:
When other browsers started blocking third-party cookies by default, we were excited about the direction, but worried about the immediate impact. Excited because we absolutely need a more private web, and we know third-party cookies aren’t the long-term answer. Worried because today many publishers rely on cookie-based advertising to support their content efforts, and we had seen that cookie blocking was already spawning privacy-invasive workarounds (such as fingerprinting) that were even worse for user privacy. Overall, we felt that blocking third-party cookies outright without viable alternatives for the ecosystem was irresponsible, and even harmful, to the free and open web we all enjoy.
It’s hard to separate each company’s financial incentives from their very real philosophical differences. Google prints money with its de facto monopoly on monetizing the open web through ads and is therefore incentivized to keep it going. At the same time, Chrome’s developers are true believers in the power and importance of the open web. Meanwhile, Apple wouldn’t be sad if Google made less money amid a massive online ad tracking reckoning. At the same time, Apple’s developers are true believers in the importance of personal privacy and the urgent need to go all-out in protecting that privacy against constant online assaults.
In any case, the problem with fingerprinting is that once you’re identified, it’s much harder to anonymize yourself. A cookie can be deleted, but the way your particular computer processes a milliseconds-long snippet of audio is much harder to change (though Brave has an innovative solution called Farbling).
The basic argument from the Chrome team is that erecting a so-called “privacy wall” will tempt legitimate ad tech companies into fingerprinting instead. Google is hoping that those companies will adopt FLoC as an alternative.
If nothing else, there’s one big thing to take away from all this: FLoC is a hell of a lot better than the status quo of third-party cookies that directly identify you anywhere you go on the web. But “better than the worst” is a low bar, and it’s hard to know yet whether FLoC just clears it or vaults way over it.
Is FLoC really private?
Instead of trying to build a metaphorical privacy wall that blocks all forms of ad targeting, Google plans on building a Privacy Sandbox inside Chrome. Within that sandbox, websites can still legitimately request certain details about your browser as needed. A game streaming site could ask whether your browser supports a game controller, for example. But ask too much and you’ll exceed the browser’s “privacy budget” and get cut off. Websites can have just a little identifying information, as a treat.
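The proposal doesn’t fix exact costs yet, but the mechanics can be sketched as a running tally: each identifying signal a site reads spends some of a per-site entropy allowance, and requests past the cap fail. Here’s a toy model — the signal names and bit costs are invented for illustration, not taken from the proposal:

```python
class PrivacyBudget:
    """Toy model of Chrome's proposed 'privacy budget': each identifying
    signal a site reads costs some entropy; past the cap, requests fail."""

    def __init__(self, max_bits=10.0):
        self.max_bits = max_bits
        self.spent = 0.0

    def request(self, signal, cost_bits):
        if self.spent + cost_bits > self.max_bits:
            return None  # budget exceeded: the signal is withheld
        self.spent += cost_bits
        return f"value-of-{signal}"

budget = PrivacyBudget(max_bits=6.0)
print(budget.request("gamepad_support", 1.0))   # allowed
print(budget.request("screen_size", 4.0))       # allowed
print(budget.request("installed_fonts", 3.0))   # None: exceeds budget
```

The open question — how to price each signal, and what “cut off” means in practice (an error, a noised value, a generic default) — is still being debated in the proposal process.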
FLoC will be part of that Privacy Sandbox and should further protect your identity by only associating you with a cohort if that cohort is sufficiently large. Chrome will also change which FLoC cohort your browser is associated with on a regular basis, say once a week or so.
But whether FLoC is actually anonymous is very much up for debate. Bennett Cyphers at the Electronic Frontier Foundation recently put up a handy post detailing some of the biggest concerns with FLoC.
One of the key aspects of FLoC is that Google isn’t making some giant list of interests and demographics and then assigning you to them. Instead, it’s proposing to use Federated Learning to create a ton of these cohorts algorithmically. Chrome won’t really know what any of them are actually about; it’ll be up to ad tech vendors to understand that over time.
But as Cyphers points out, that algorithm will inevitably create cohorts that could be incredibly dangerous — say, a group of people who have visited sites about getting out of domestic abuse situations. The Chrome team says it recognizes this concern and so will be analyzing the algorithmically created cohorts to see if any are related to what it deems to be sensitive topics — and then Chrome won’t serve those cohort IDs. But FLoC isn’t centralized, so it’s important to know that if another browser vendor adopts FLoC, it will be incumbent on that browser to create similar block lists.
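In other words, the filtering would happen before a cohort ID ever reaches a website. A minimal sketch of that blocklist check — the cohort numbers here are hypothetical, and the real list would be computed by Google’s analysis and shipped to the browser:

```python
# Hypothetical cohort IDs flagged as correlating with sensitive topics
# (health, finance, and so on), distributed to the browser.
BLOCKED_COHORTS = {90210, 31337}

def cohort_to_serve(raw_cohort_id):
    """Return the cohort ID a site would see, or None if it's blocked."""
    if raw_cohort_id in BLOCKED_COHORTS:
        return None  # the browser reports no cohort at all
    return raw_cohort_id

print(cohort_to_serve(198273))  # 198273
print(cohort_to_serve(31337))   # None
```

Any other browser that adopted FLoC would need to maintain an equivalent list, since nothing in the protocol itself knows which cohorts are dangerous.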
Websites will be able to opt out of participating in FLoC, meaning that visits to their pages won’t feed into an individual user’s cohort calculation. Similarly, the Chrome team intends to put opt-out toggles somewhere in Chrome’s settings for users who don’t want to provide FLoC IDs to the websites they visit.
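In the origin trial, that site-level opt-out takes the form of an HTTP response header: a site that sends the following Permissions-Policy directive is excluded from its visitors’ cohort calculations.

```
Permissions-Policy: interest-cohort=()
```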
Could FLoC become just another data point for fingerprinters? It seems likely, and defending against that seems to be another job for Chrome’s privacy budget and privacy sandbox algorithms.
One more thing: FLoC is a very convenient way for the websites you visit to know enough about you to target relevant ads, which means that FLoC is a very convenient way for websites to know things about you. It’s certainly no worse than the current cookie situation, but it’s far from the “You Shall Not Pass!” philosophy other browser vendors (like Apple and Brave) apply to allowing access to potentially identifiable information.
This first FLoC “origin trial” is designed to help websites learn how FLoC works; some of the testing for Chrome users will come later. Here is how Google describes the way it’s going to work:
The initial testing of FLoC is taking place with a small percentage of users in Australia, Brazil, Canada, India, Indonesia, Japan, Mexico, New Zealand, Philippines and the U.S. We’ll expand to other regions as the Privacy Sandbox expands globally. In April, we’ll introduce a control in Chrome Settings that you can use to opt out of inclusion in FLoC and other Privacy Sandbox proposals. In the meantime, if you’ve chosen to block third-party cookies, you won’t be included in these origin trials.
If you look at that list of countries, you might notice that something stands out: none of them are in the EU, where GDPR regulations are in effect. Recently, Robin Berjon of The New York Times wondered whether that meant that FLoC would run afoul of those privacy regulations. According to the product manager for the Chrome privacy sandbox, Marshall Vale, it’s more a matter of limiting the size of the early tests and that his team is “100% committed to the Privacy Sandbox in Europe.”
Under normal circumstances, a newly proposed web technology wends its way through mailing lists and W3C conference room debates. It gets supported by the browser vendor that championed it and then, if it’s lucky, by other browsers. Thus, the web manages not to become browser-specific in the ways it was back in the bad old days of Internet Explorer 6.
But when Google originally announced its intention to block third-party cookies last year, I pointed out that the rhetoric between browser vendors was getting sharp. It’s only gotten sharper as Apple, Google, Microsoft, Mozilla, Brave, and others have gone further down their respective paths.
It seems unlikely that FLoC will become a standard because everybody agrees it’s a good way to allow targeted advertising. If it does become a standard, it’ll probably be because Chrome eventually turns it on and it becomes the norm through sheer market share — both Chrome’s within the browser market and Google’s within the ad tech market.
That possible future might avert the cookiepocalypse, but it could also become a different kind of nightmare for the web: one where websites once again try to push you to use the browser they can best monetize via whatever ad tech platform they’re using.