It’s a wearying list of online annoyances: the “OK” button that’s actually a link to a full-screen advert; the mailing list that seems impossible to unsubscribe from; the forms that make it difficult to choose the option you want; the guilt-tripping pop-up messages that do their best to change your mind. They are irritating and inescapable parts of online life, and they don’t just annoy us – they can deprive us of our money and our privacy. Back in 2010, user experience (UX) designer Harry Brignull gave these techniques a name – “dark patterns” – with the aim of raising awareness of them and reducing their potency. But, as he admitted last year, more sites seem to be making use of dark patterns than ever before, and legislators are now taking action.
Last month, a government agency in Norway produced a report called Deceived By Design, outlining how some of the world's biggest companies – Facebook, Google and Microsoft – have used underhand tactics to bend us to their will. It seems that the wider world is finally waking up to the use of dark patterns – but who is at fault here? The UX designers? The chief executives of Silicon Valley? Or us, for being so slow to wise up?
"There are a number of people in this moral drama, and it's pretty hard to point the finger because they all share the blame," says Cennydd Bowles, former design manager at Twitter and author of a forthcoming book, Future Ethics. "As end users we might screw up by being careless, but we can't inoculate ourselves against dark patterns. The manipulations being used against us are becoming more sophisticated."
Deceptive design might lead us to, say, unwittingly pay a subscription fee for a service we thought was free. It might hide important information in light grey type on a white background, or flash up false warnings that an offer is about to expire when it simply isn’t. Companies have long used underhand methods of avoiding giving us refunds or locking us into contracts, but the internet has created a whole new set of ways for firms to deceive us – and the UX designer is one of their most effective weapons.
“I’m keen that designers make a stronger case for the long-term value of treating users with respect, rather than just milking them for every dollar,” says Bowles. “But Wall Street rewards short-term profit, and there’s an expectation that companies grow to enormous scale very quickly.”
But even if suspect corporate values lie at the root of so-called dark patterns, the UX designers aren’t completely absolved from blame, according to Bowles. “One principle of UX design has been to make certain things invisible so that people don’t have to worry about them,” he says. “That makes for a breeding ground of unethical behaviour, because you’re reducing people’s opportunity to understand what’s happening. Saying to people ‘Look, you don’t need to worry about this’ is fine if you’re trustworthy. But tech companies have shown themselves not to be.” The Norwegian Consumer Council’s criticism of the internet giants for engaging in deceptive practices was scathing. Its letter to the European consumer organisation BEUC refers to the “numerous tricks and tactics [used] to nudge or push consumers toward giving consent to sharing as much data for as many purposes as possible.”
Facebook stood accused of using emotional language to persuade people not to turn off a facial recognition feature, rather than give them a free and open choice. Similarly, Google customers who tried disabling the personalisation of adverts were presented with a request that they reconsider. And while Microsoft came in for less criticism, the method it used back in 2016 to force PC users to upgrade to Windows 10 (clicking an “x” to close the notification window failed to halt the upgrade) irritated many people and lives long in their memories.
All three companies issued statements to assert their legal compliance or that they have consumers’ best interests at heart. But should the question of what constitutes our best interests really be made by technology firms? “Technologists are making the decisions because we’re the only ones who speak the language,” says Bowles. “Technology’s mysteries are only revealed to us, so we make the call. But is that right? We should worry if democracy ends up being replaced by technocracy.”
Back in 2016, Harry Brignull stated that companies that use dark patterns “operate in a sort of a safe zone, where they’re not likely to be prosecuted or get into trouble legally.” But while regulations usually move a lot slower than technology does, the EU’s introduction earlier this year of the General Data Protection Regulation has now caused firms across the globe to take the idea of consumer consent far more seriously. Indeed, GDPR now prohibits making a service conditional on users consenting to the processing of their personal data. A year ago, such conditions would be hidden, in typical “dark pattern” fashion, among pages of small print that we’d never read. Now, the prospect of substantial fines should stop this from happening.
Other, more tenacious dark patterns will continue to pay dividends for online firms, not least because of our impatience. We will give up on the idea of deleting our accounts when the link to do so is hidden seven clicks away and requires us to argue with an unhelpful customer service assistant. We will be persuaded by user interfaces to enter our email addresses to connect to a wi-fi hotspot, even though we’re not required to do so. We will become flustered by passive-aggressive messages that question our decision-making.
“Things would change more quickly with consumer rebellion,” says Bowles, “but for consumer rebellion to happen, you need alternatives. And in the world of internet monopolies there are no alternatives.” So, if we wish to take control of our digital lives, it will require us to open our eyes and pay greater heed to the ways our behaviour is guided by UX design. “Our best defence against dark patterns,” says Brignull, “is to be aware of them.”