French lawyer and entrepreneur Marie Potel-Saville, whom I met at Lisbon’s Web Summit, has a mission: to root out dark patterns in the digital world.
Her company, Fair Patterns, has developed a range of solutions to identify these manipulative designs, auditing digital platforms to protect users and guide companies toward more ethical practices. These design tricks, found on countless websites and apps, subtly—or sometimes not so subtly—mislead users into actions they might not choose otherwise.
Remember that time you were tricked into accepting an “unmissable opportunity” and signed up for a subscription you did not really need, just to discover you had to go through a lengthy and messy process to cancel it? Or when you tried to close the pop-up window of an online ad, but the “x” symbol was so small you actually ended up clicking on the ad instead?
These are typical examples of “dark patterns” and there are several more: the OECD has identified six main categories of design practices, from nagging to forced registrations and countdown timers, used to manipulate consumers into making unwanted purchases or compromising their online privacy.
These tactics are not limited to niche websites or sketchy services. A recent European Commission report found that 97% of the most popular websites and apps in the EU employ dark patterns to some degree, and studies from the U.S. Federal Trade Commission (FTC) show similar figures.
And with the mass adoption of artificial intelligence on online platforms, the problem is only going to get worse.
“Generative AI can supercharge dark patterns,” Potel-Saville explains. “You don’t need AI to personalize interactions, but with AI, it’s much easier to do it at a massive, hyper-targeted scale.” She describes a hypothetical example that highlights the implications of such technology: imagine a shopper on a website, browsing sneakers.
They interact with an AI chatbot to inquire about a product, and the bot—armed with a wealth of data from social media, past purchases, and personal preferences—subtly suggests items or upgrades they hadn’t planned to buy. “The bot might say, ‘We have these shoes in your size, and they’d look great with the jeans you bought last week. And don’t forget, you’ve got a party coming up.’”
Potel-Saville notes that while such personalization can appear harmless, it crosses into manipulation when it exploits vulnerabilities or personal data to push products or services that the user didn’t intend to buy. “For instance, the bot could say ‘if you choose this pair, you can have free delivery’ and then the free delivery is not just free delivery, it’s also a recurring subscription.”
The Fair Patterns CEO emphasizes the importance of informed choice in digital spaces, something she sees eroding with the increasing sophistication of AI.
“For some people, it’s annoying. But for others, like young people or the elderly, it can be outright exploitative,” she says.
Even more concerning, generative AI, which learns from huge datasets, can amplify dark patterns to an extent never seen before, unwittingly replicating these manipulative tactics simply because they are embedded in the data it was trained on.
“If you don’t clean the data, the AI will just assume that these tactics are normal,” Potel-Saville explains.
This is where Fair Patterns’ work begins. The company’s approach is not to eliminate influence altogether, but to make it more transparent and align it with ethical guidelines and legal compliance.
Using a multi-model algorithm that leverages Claude, Gemini, Llama, and ChatGPT, the company scans sites and apps to flag these manipulative designs, linking each instance to the legal risks the audited company faces, which vary by region.
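To make the idea concrete, here is a toy sketch of the general approach of flagging dark-pattern signals and mapping each to a legal risk. This is not Fair Patterns' actual system (which, per the article, uses multiple large language models); the regexes, category names, and pattern-to-regulation mapping below are illustrative assumptions only.

```python
import re

# Illustrative signals for three common dark-pattern categories.
# Real detection would use LLMs and page structure, not keyword regexes.
PATTERNS = {
    "countdown timer": re.compile(r"(hurry|only \d+ left|offer ends in)", re.I),
    "forced continuity": re.compile(r"free trial.*auto[- ]?renew", re.I),
    "confirmshaming": re.compile(r"no thanks, i (hate|don't want)", re.I),
}

# Hypothetical mapping from pattern category to the EU rule most likely
# to apply, with the fine ceilings the article mentions.
LEGAL_RISK = {
    "countdown timer": "Unfair Commercial Practices Directive",
    "forced continuity": "Digital Services Act (fines up to 6% of global turnover)",
    "confirmshaming": "GDPR (fines up to 4% of global turnover)",
}

def scan(page_text: str) -> list[dict]:
    """Flag each dark-pattern category found in the text with its legal risk."""
    findings = []
    for name, rx in PATTERNS.items():
        if rx.search(page_text):
            findings.append({"pattern": name, "risk": LEGAL_RISK[name]})
    return findings

print(scan("Hurry! Only 3 left. Start your free trial, it will auto-renew monthly."))
```

Running the sketch on the sample text flags both a countdown timer and a forced-continuity signal, each paired with its (assumed) applicable regulation.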
In Europe, for example, violations related to dark patterns could be punished under the GDPR, the Digital Services Act or the AI Act, with fines up to 4% (for the GDPR) or 6% (for the DSA) of the culprit’s global turnover. In the U.S., the FTC under the leadership of Lina Khan has also started to take a stronger stance against companies that exploit users in this way.
Fortnite maker Epic Games agreed to pay $245 million to consumers to settle FTC charges that the company used dark patterns to trick players into making unwanted purchases. The agency is also taking action against Amazon, accusing the company of a “years-long effort to enroll consumers into its Prime program without their consent while knowingly making it difficult for consumers to cancel their subscriptions.”
The move sparked a separate lawsuit, from one of Amazon’s investors.
While only a handful of cases have led to significant fines, Potel-Saville sees them as a promising start, setting a precedent that might one day make dark patterns less ubiquitous. “These actions are important because they show companies that there are consequences,” she says.
Her aim is to move companies toward an approach where they don’t need to rely on these manipulative designs to boost sales or data collection.
Several companies have already begun to embrace Fair Patterns’ recommendations, including major names like Canva and Bumble, which are taking steps to eliminate dark patterns from their platforms.
The business case for playing clean, she argues, is self-evident.
Although dark patterns might deliver short-term profit boosts, the long-term effects can be disastrous for consumer trust. Studies show that once users realize they have been tricked, they become less loyal to the brand.
“What you gain from these tactics in the short term, you lose double when the customer realizes what’s happened,” she says, explaining that this erodes a company’s customer lifetime value—a crucial metric in today’s competitive digital landscape.
Design manipulation and deception tactics are so ingrained in today’s digital landscape that getting rid of them, or at least limiting them, seems like a Herculean task. Still, with a mix of regulation, consumer awareness, and startups tackling the issue, there’s hope the situation might change.
“Ultimately, we want to create a marketplace where digital fairness is the rule, not the exception,” Potel-Saville says.
Who could argue against that?