> No, they're asking for consent to track because the EU demanded it.
Personally, I think that despite seemingly good intentions, this common practice is counterintuitively harming user privacy and security, especially on mobile, where the banners take up a large chunk of the screen.
Many normal people develop something like "banner blindness": they are so used to seeing confirmation banners when they visit a website that by default they click whatever button hides the banner fastest, without reading what is actually being requested.
This practice doesn't really help anybody, IMO, and should probably be handled at the browser level if people care about it.
> Personally, I think that despite seemingly good intentions
Why do you think they have good intentions?
Many of these banners employ some pretty slick dark patterns to get you to opt in to their most critical analytics. One of my favorites is when the cookie settings are more than one click away from the banner, or when changing them causes a page reload.
I think you're talking about the businesses' motivations, and I'm talking about the random EU bureaucrats who imposed the regulations. Despite how skeptical I am of most government interventions, I'd tend to assign benign intentions to the bureaucrats here, as I'd guess they genuinely wanted to do something good. But like any bureaucrats sitting in their ivory towers imposing rules on others, the majority of their rules have unintended consequences, can be taken advantage of, and are usually designed by committees that, even when well-intentioned, produce a mish-mash of inconsistent ideas.
The rules are actually fairly sensible: deliberately confusing banners are already illegal under them. The issue is that the national agencies that enforce the rules (because EU rules are implemented via national laws) aren't enforcing them properly.
The "make it part of the browser" argument doesn't work in practice because the GDPR covers the intent and purpose of data collection/processing rather than any specific technical way of collecting or processing said data. Blocking cookies at the browser level doesn't prevent the website from using browser fingerprinting or the information you manually provided (your delivery address to make a purchase for example) in a way you didn't agree with.
I agree there is a greater chance they're more stupid than a brick than that they're malicious, but I wouldn't exclude the idea that internet gatekeepers like Facebook and Google are bribing them to create extra barriers for newcomers who want to run independent websites.
The net result of VATMOSS, GDPR and cookie banners was that a ton of small businesses decided not to bother with a website and moved to being FB only or Amazon only.
I don't understand why they didn't use the Do-Not-Track header. It's perfect: a client sending DNT is explicitly denying consent to any form of tracking before the page is even rendered. The presence of that header should cause web applications to automatically strip any and all tracking JavaScript from their pages, at the very least.
No idea why it turned into this cookie banner nonsense.
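For what it's worth, honoring DNT server-side would have been trivial. A minimal sketch, assuming an Express app (the analytics script path and the `includeTracking` flag are made up for illustration):

```ts
import express from "express";

const app = express();

app.use((req, res, next) => {
  // The browser sends "DNT: 1" when the user has denied consent to tracking.
  const dnt = req.headers["dnt"];
  res.locals.includeTracking = dnt !== "1";
  next();
});

app.get("/", (_req, res) => {
  // Only emit the tracking script when the client has not sent DNT: 1.
  const analytics = res.locals.includeTracking
    ? '<script src="/vendor/analytics.js"></script>'
    : ""; // DNT clients get the page with tracking JavaScript stripped out
  res.send(`<!doctype html><html><head>${analytics}</head><body>Hello</body></html>`);
});

app.listen(3000);
```

The point being: the signal arrives before the page is rendered, so the server can simply not ship the trackers in the first place.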
Microsoft pulled an Apple and turned DNT on by default. Advertisers were very clear that they would only honor DNT if people were tracked by default.
The EU then passed a law that says you can't collect personal data unless it's for one of six reasons, one of which is "user consent". This basically mandated opt-in, so everyone went super-aggressive on consent banners (which, BTW, are probably illegal).
> Advertisers were very clear that they would only honor DNT if people were tracked by default.
It's ridiculous that Microsoft's response wasn't to just nuke trackers from space with some kind of ad-blocker integration in Edge. This is the equivalent of a mugger saying he'll only honour your "do not mug" sign if the sign defaults to "mug me please" and has to be explicitly changed.
Ad blocking is not a "nuke trackers from space" button. It's more like piloting a drone fleet to pick out and kill terrorists or insurgents in a not-so-friendly country. It requires lots of work to identify ads and create comprehensive filter rules to block them, plus periodic re-checking to make sure the rules haven't been broken by the advertising companies.
Note how most ad blocking tech is either community-run FOSS projects or companies with not-so-savory business practices. It's really not the kind of work that browser vendors want to do. In fact, Apple went out of their way to create an extension type purely for delivering ad block lists to Safari all the way back in iOS 9. Ad blocking is enough of a pain that even Apple was willing to farm it out to third parties, years before we got proper mobile extension support.
Occasionally, browser vendors get lucky and there's a tracker type that's "easy enough" to kill. Third-party cookies are one example, but even that required a huge amount of testing to avoid breaking apps that relied on them for authentication.
The only reason ad blocking even works is that ad companies are incredibly paranoid and don't trust each other. The standard way to do display ads is to embed each other's `<iframe>`s or JS, which gives ad blockers a nice easy target to hit. Platforms like Facebook or Twitter that are trusted to do their own ad delivery, and thus don't hotlink subresources, are far harder to block. They could change how ads are styled basically every hour if they wanted to, which would make any kind of rule-based ad blocking ineffective. If every ad platform did this, ad block as we know it would be dead.
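To make the "easy target" concrete, here's a toy rule-based check in TypeScript (the hostnames and blocklist are invented for the example); note how a first-party ad served from the page's own domain sails right past it:

```ts
// Toy illustration of rule-based blocking of hotlinked ad subresources.
// Real filter lists (EasyList etc.) contain tens of thousands of hand-maintained patterns.
const blockedHosts = new Set(["ads.example-adnetwork.com", "tracker.example.net"]);

function shouldBlock(resourceUrl: string, pageUrl: string): boolean {
  const resource = new URL(resourceUrl);
  const page = new URL(pageUrl);
  // A third-party <iframe> or <script> is easy to spot: its hostname differs
  // from the embedding page's and appears on a known ad/tracker list.
  return resource.hostname !== page.hostname && blockedHosts.has(resource.hostname);
}

// Hotlinked ad network script: blocked.
console.log(shouldBlock("https://ads.example-adnetwork.com/banner.js", "https://news.example.com/article")); // true
// First-party ad served from the site's own domain: invisible to this kind of rule,
// which is exactly the Facebook/Twitter case described above.
console.log(shouldBlock("https://news.example.com/promo.js", "https://news.example.com/article")); // false
```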
While you are right that Microsoft loosened their stance with privacy, let's not conflate data collection purposes:
1. telemetry, for diagnostics and health monitoring
2. usage analysis, for program improvement and personalization
3. content analysis, for advertising and marketing purposes
Windows requires kind 1 and encourages kind 2*. Kind 3 does not really apply, though, as I don't see Windows sniffing what I write in my text files so that I'm shown relevant ads later.
Without a law like the GDPR, nothing stops them from using data collected for 1) and 2) for 3). Which they will do once some PM realizes it's worth something.
(There's even a kind 0: data collected for functional purposes, like 2FA. Multiple companies have taken data straight from 0 to 3 once they saw the possible revenue.)
Also, "consent" has a specific meaning in GDPR, see article 4(11) [0]:
> Consent of the data subject means any freely given, specific, informed and unambiguous indication of the data subject’s wishes by which he or she, by a statement or by a clear affirmative action, signifies agreement to the processing of personal data relating to him or her.
Which is why I suspect almost all "cookie banners" are worthless. They don't obtain clear, informed consent, so the site operator is still not allowed to use the data for anything at all.
It's a modern variant of the "this site is unsafe, continue using" click-through that browsers showed for incorrectly configured SSL, before the major browser vendors converged on the conclusion that they should make it actively hard to pass through an insufficiently secured SSL configuration, because users would just click okay on the spooky dialog.