Dark patterns: an example

5 minute read Published: 2021-02-13

There is an ongoing discussion about so-called dark patterns, which are getting more and more attention. I will show a very simple example to frame the issue, then try to lay out some thoughts.

I usually don't see a lot of cookie popup requesters, as my browser configuration shields me from a lot of this crap. The other day, however, I was using a Firefox profile with "only" uBlock Origin enabled and was surprised when I visited a pretty well-known website to check out some CSS stuff, so I decided to make it an example of how misleading things can be.

The website in question is w3schools.com; we're not talking about some random blog with low-quality content trying to monetize on eyeballs.

First, the website as I usually see it:

Then how it appears with JavaScript enabled:

Oh - a nice BIG popup, it's the Nineties all over again! Let's zoom in and see what they're asking.

We have a menu with three items: "Vendor Consent", "Vendors", "Legitimate Interest".

  1. "Vendor Consent" lists the purposes of the cookies. Data stored, if accepted, may include geolocation and device fingerprinting, plus a final section, "Ensure security, prevent fraud, and debug" - which I can't disable - with a note that "data collected (...) may include automatically-sent device characteristics for identification, precise geolocation data, and data obtained by actively scanning device characteristics for identification without separate disclosure and/or opt-in." In other words, everything I have denied my consent to right above.
  2. "Vendors" presents a long list of vendors (a "chain of trust", as defined in their disclaimer) the data will be shared with. By default they're all disabled, so this is an "opt-in" selection (a user action is required to enable them).
  3. "Legitimate Interest", best joke of the year 2020, basically repeats the same purposes, but this time it's an "opt-out", so you have to explicitly deny it - after the previous vendor screen worked the other way around (!).
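The inconsistency between the two screens can be sketched in a few lines of TypeScript. This is just a hypothetical model of the behaviour described above - the names and shapes are made up for illustration, not taken from the actual popup's code:

```typescript
// Hypothetical model of the popup's two toggle screens. The trick is in
// the defaults: what happens when the user touches nothing?
type ConsentScreen = {
  label: string;
  defaultGranted: boolean; // true = pre-granted unless the user objects
};

const screens: ConsentScreen[] = [
  // "Vendors" is opt-in: all toggles start disabled.
  { label: "Vendors", defaultGranted: false },
  // "Legitimate Interest" covers the same purposes, but is opt-out:
  // it is pre-granted, and the user must explicitly deny it.
  { label: "Legitimate Interest", defaultGranted: true },
];

// A user who skims past both screens without toggling anything still
// "agrees" to the legitimate-interest processing:
const grantedByInaction = screens
  .filter((s) => s.defaultGranted)
  .map((s) => s.label);

console.log(grantedByInaction); // [ 'Legitimate Interest' ]
```

Flipping the default between two consecutive screens is exactly what makes the pattern "dark": the cognitive habit built on the first screen works against the user on the second.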

After I choose to object to every "legitimate interest" request (which is just weasel wording to trick users into giving consent), I want to save my preferences.

Are you really sure you don't want to deny your choice?

Which button do you think I will instinctively click to confirm my preferences? Well, the green, friendly button reverts all my choices and accepts everything; the other button is the one that actually saves my preferences.
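The same dialog, reduced to a data model (again hypothetical, mirroring what the screenshot shows): the visual prominence of the buttons is inverted with respect to what the user just asked for.

```typescript
// Hypothetical model of the confirmation dialog's two buttons.
type DialogButton = {
  style: "primary" | "secondary"; // "primary" = big, green, friendly
  action: "accept-all" | "save-my-choices";
};

const dialog: DialogButton[] = [
  // The eye-catching button discards everything the user just configured.
  { style: "primary", action: "accept-all" },
  // The plain button is the one that actually honours the user's choices.
  { style: "secondary", action: "save-my-choices" },
];

// The instinctive click lands on the prominent button...
const instinctive = dialog.find((b) => b.style === "primary")!;
console.log(instinctive.action); // "accept-all"
```

In an honest interface, the primary (most prominent) action would confirm the choices the user just spent effort making; here it undoes them.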

This small overview illustrates the lengths advertisers are willing to go to in order to circumvent laws and exploit any loophole to shove their advertising down users' throats.

I don't even want to mention all that can go wrong from a technical standpoint: there are so many moving parts and actors involved that this whole consent workflow looks like a farce to me - completely untraceable, with no way to track liabilities.

§ B-b-but I am a blogger/streamer, I publish content/recipes/stuff!

Let's restate why this happens: I am well aware that websites offering content also need to sustain themselves; there must be a way to generate revenue.

From my point of view this starts by excluding all those trying to make a quick buck out of low-quality content, including, but not limited to:

If you fall into one of these categories, if your content sucks, you are not entitled to earn revenue just because you publish.

§ Where do we go from here?

Getting money out of original content is hard, and there's no easy way around this fact of life. Otherwise, you're a fraud and part of the problem.

I feel empathy for legitimate content creators asking themselves: how can I generate revenue out of my original content? The answer is certainly no longer the free-but-paid-by-advertisement model: the way advertisers are frantically exploiting loopholes to stay afloat shows how worn out this business model is and how badly it needs to be rethought.

Keeping this mindset will only make things worse: it's not about advertisement anymore, it's not about selling me a dishwasher, it's about companies harvesting and funneling huge amounts of data about people toward unspecified goals.
Companies are exploring unmapped territory; some don't even know why they're doing it. They simply learn, by watching other giants that live on advertisement and deep analysis of this data, that the more they collect, the better - "we will figure out that part later".

The game is now about privacy and digital rights, and we have to stop this while it's in its infancy.