Light In Times Of Dark Patterns: #It'sNotAGame, It's Serious Business

Published by Wranga | November 29, 2022

Written by Venkatesh Ramamrat

“It’s easier to fool people than to convince them that they’ve been fooled.” — Unknown.

I grew up in 1980s India and was lucky to have access to computer games from the late ’80s and early ’90s, with black-and-white monitors and MS-DOS games, and for quite a while I have been a fan of SEGA, Nintendo, PlayStation, Xbox, and computer games. But when smartphones and mobile games entered a phenomenal growth stage, I tried a lot of games and found many had a feature that put you in an anxious spot: either you pay to continue playing, or the game becomes difficult or near impossible, taking endless time.

With my involvement in child online safety at Wranga, which literally means "Ray of Light", I feel deeply responsible to share knowledge and shed light on the topic of dark patterns, a neologism introduced by Harry Brignull, who defined it as

"a user interface that has been carefully crafted to trick users into doing things; it does not have the user's interest in mind." Dark patterns are created to trick the user into choosing an option that is not what they would choose on their own.

The techniques and psychological tricks that designers use are very well explained in a blog post by Tristan Harris, Co-Founder of the Center for Humane Technology.


You’ve probably encountered loads of dark patterns during your time on the internet; you just didn’t realise it, most likely because they’re designed to be deceptive and to quietly manipulate you into doing something you don’t want to do. I did some research into several kinds of dark patterns, and though I do not wish to name names, most of the top companies and products we use online are full of dark patterns like these:

Bait and Switch: You set out to do one thing, but a different, undesirable thing happens instead.

Confirm shaming: The act of guilting the user into opting into something. The option to decline is worded in such a way as to shame the user into compliance.

Disguised Ads: Adverts that are disguised as other kinds of content or navigation, in order to get you to click on them.

Forced Continuity: When your free trial with a service comes to an end and your credit card silently starts getting charged without any warning. You are then not given an easy way to cancel the automatic renewal.

Friend Spam: The product asks for your email or social media permissions under the pretence it will be used for a desirable outcome (e.g. finding friends), but then spams all your contacts in a message that claims to be from you.

Hidden Costs: You get to the last step of the checkout process, only to discover some unexpected charges have appeared, e.g. delivery charges, tax, etc.

Misdirection: The design purposefully focuses your attention on one thing in order to distract your attention from another.

Price Comparison Prevention: The retailer makes it hard for you to compare the price of an item with another item, so you cannot make an informed decision (e.g. AliExpress).

Privacy Zuckering: You are tricked into publicly sharing more information about yourself than you really intended to. Named after Facebook CEO Mark Zuckerberg.

Roach Motel: The design makes it very easy for you to get into a certain situation, but then makes it hard for you to get out of it (e.g. a subscription).

Sneak into Basket: You attempt to purchase something, but somewhere in the purchasing journey the site sneaks an additional item into your basket, often through the use of an opt-out radio button or checkbox on a prior page.

Trick Questions: While filling in a form you respond to a question that tricks you into giving an answer you didn't intend. When glanced upon quickly the question appears to ask one thing, but when read carefully it asks another thing entirely.
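Several of the patterns above combine in a typical checkout flow. As a minimal sketch (the product, prices, and the `checkout_total` helper are all hypothetical, not taken from any real site), here is how "Sneak into Basket" and "Hidden Costs" look in code: an opt-out add-on is pre-selected on an earlier page, and a delivery fee appears only at the final step.

```python
def checkout_total(items, insurance_opt_out=False):
    """Return (advertised subtotal, final charged total), both in cents."""
    subtotal = sum(price for _, price in items)
    total = subtotal
    if not insurance_opt_out:   # opt-out box pre-ticked on an earlier page
        total += 499            # sneaked-in "purchase protection" add-on
    total += 350                # delivery fee revealed only at the last step
    return subtotal, total

subtotal, total = checkout_total([("headphones", 2999)])
print(subtotal, total)  # the shopper saw 2999; the card is charged 3848
```

The shopper who never noticed the pre-ticked box pays for the add-on; only the shopper who actively unticks it avoids the charge.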

As per the OECD (Organisation for Economic Co-operation and Development), these are broadly the harms of dark patterns to consumers:

  • Harms to consumer autonomy
  • Personal consumer detriment
    • Financial loss
    • Privacy loss
    • Psychological detriment and time loss
  • Structural consumer detriment
    • Weaker or distorted competition
    • Less consumer trust and engagement

Gaming Industry

There might be another reason why you are so addicted to your game. As someone who plays video games a lot, I realise how easy it is to get sucked into a game and play for hours at a time, but this becomes a problem when a game is intentionally designed to be played for hours at a time. Here are a few deceptive design patterns that are employed in game design, and how they manipulate players:

Temporal dark patterns:

  • Playing by appointment: Games with temporal dark patterns require users to play according to the game’s schedule instead of their own schedule and flexibility.
  • Daily rewards: The daily reward system pushes users to visit daily; otherwise it punishes them for missing a day.

Monetary dark patterns:

  • Scarcity: People value things more when they are scarce and place a low value on things available in abundance; games manufacture scarcity to push spending.
  • Pay to skip: Pay to skip relies on the user spending money to skip parts of the game they don’t have the patience for or don’t want to play.

Social dark patterns:

  • Social pyramid schemes: Typically, pyramid schemes make money by relying on existing customers to bring in more customers; games reward players for recruiting friends in the same way.
  • Social obligation: When you’re pulled into a game by a friend, or are playing a team game, there’s a chance of social obligation kicking in.

Psychological dark patterns:

  • Endowed value: This dark pattern works on the basic human reluctance to abandon anything in which we have invested a lot.
  • Endowed progress: This dark pattern is based on the endowed progress effect and the Zeigarnik effect, whereby people remember incomplete tasks better than completed ones.
  • Task lists: A long in-game task list (think of Skyrim’s quest log) won’t let a gamer sleep; the unticked items keep popping into their head until they are done.
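The "daily rewards" mechanic above can be sketched in a few lines. This is an illustrative toy (the reward values and the `daily_reward` function are assumptions, not taken from any real game): consecutive visits escalate the payout, and a single missed day resets the streak, which players experience as punishment.

```python
REWARDS = [10, 20, 40, 80, 160]  # coins for streak day 1, 2, 3, ...

def daily_reward(streak, days_since_last_visit):
    """Return the new streak length and the coins granted today."""
    if days_since_last_visit > 1:   # missed a day: the streak is wiped out
        streak = 0
    streak += 1
    coins = REWARDS[min(streak, len(REWARDS)) - 1]
    return streak, coins

streak = 0
for day in range(4):                # four consecutive daily visits
    streak, coins = daily_reward(streak, days_since_last_visit=1)
print(streak, coins)                # streak 4: the reward has grown to 80
streak, coins = daily_reward(streak, days_since_last_visit=2)
print(streak, coins)                # one missed day: back to streak 1 and 10 coins
```

The escalating table is the hook; the reset is the punishment that turns a game into an appointment.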

The goal is to make more informed decisions about how you spend your time, not to get sucked into the vortex of games’ dark patterns, and to promote manipulation literacy so you are aware of such deceptive techniques. Dark patterns are used in many facets of life, and video games are no exception. It’s up to you to decide if you’re going to use them for good or for not-so-good intentions; it always depends on the use and the context.

Dark Pattern AI

“The algorithm is optimised to change your behaviour.”

This makes new-age, machine-learning-based dark patterns more effective and far scarier: they are not visible, and they have an agenda that differs from yours. They will optimise over time to do whatever they can to align your agenda with their goal. This happens gradually, without you ever being aware of it.

Most dark patterns are defined by misleading you into taking an action that does not align with your agenda. These new machine-learning algorithms go a step further: they change your behaviour over time so that you take an action that does not align with your agenda. They do not mislead you. They just change you.

Behavioural data is what the algorithm uses to decide whether each technique it employs is successful. When it says it is 95% sure it will be successful, it is really saying it is 95% sure it can change your behaviour. It learns and optimises through behavioural change.
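The optimisation loop described here can be sketched as a simple epsilon-greedy bandit. Everything below is illustrative (the `NudgeOptimiser` class, the variant names, and the numbers are assumptions, not any real platform's code), but it shows the core mechanic: per-nudge success counts feed a "confidence" that is literally the observed rate of behaviour change.

```python
import random

class NudgeOptimiser:
    def __init__(self, variants, epsilon=0.1):
        self.stats = {v: [0, 0] for v in variants}  # variant -> [successes, trials]
        self.epsilon = epsilon

    def choose(self):
        if random.random() < self.epsilon:          # explore occasionally
            return random.choice(list(self.stats))
        # exploit: pick the variant with the best observed success rate
        return max(self.stats, key=lambda v: self.stats[v][0] / (self.stats[v][1] or 1))

    def record(self, variant, clicked):
        s = self.stats[variant]                     # each call is a behavioural data point
        s[0] += int(clicked)
        s[1] += 1

    def confidence(self, variant):
        """The algorithm's estimate that this nudge changes behaviour."""
        successes, trials = self.stats[variant]
        return successes / trials if trials else 0.0

opt = NudgeOptimiser(["streak_reminder", "friend_played"], epsilon=0.0)
for _ in range(20):
    opt.record("streak_reminder", clicked=True)
    opt.record("friend_played", clicked=False)
print(opt.confidence("streak_reminder"))  # 1.0: "sure it can change your behaviour"
print(opt.choose())                       # the manipulative variant wins every time
```

Nothing in this loop asks whether the chosen nudge is in the user's interest; it only asks which nudge works.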

Dark Patterns and Children

In the USA, two leading advocacy groups protecting children from predatory practices online filed comments asking the Federal Trade Commission (FTC) to create strong safeguards to ensure that internet “dark patterns” don’t undermine children’s well-being and privacy. The Campaign for a Commercial-Free Childhood (CCFC) and the Center for Digital Democracy (CDD) say tech companies are preying upon vulnerable kids: capitalising on their fear of missing out, their desire to be popular, and their inability to understand the value of misleading e-currencies, as well as putting them on an endless treadmill on their digital devices. They urged the FTC to take swift and strong action to protect children from the harms of dark patterns.

Comment from Jeff Chester, Executive Director of the Center for Digital Democracy:

“Dark Patterns” are being used in the design of child-directed services to manipulate them to spend more time and money on games and other applications, as well as give up more of their data. It’s time the FTC acted to protect young people from being unfairly treated by online companies. The commission should issue rules that prohibit the use of these stealth tactics that target kids and bring legal action against the companies promoting their use.

Comment from Josh Golin, Executive Director of the Campaign for a Commercial-Free Childhood:

In their rush to monetize children, app and game developers are using dark patterns that take advantage of children’s developmental vulnerabilities. The FTC has all the tools it needs to stop unethical, harmful, and illegal conduct. Doing so would be a huge step forward towards creating a healthy media environment for children.

Here are the developmental reasons why adult design assumptions are not appropriate for children and, in extreme cases, exploit their developmental differences:

  • Autoplay is hard enough for adults to resist. But children are particularly susceptible to autoplay, and research shows they get upset when screen time limits are put in place.
  • Positive reinforcement appears in the form of likes, hearts, stars, applause, extra gameplay items, outfits for game characters, and virtual toys. Play most apps with your child and you will see a disproportionate amount of applause and rewards provided for simple achievements.
  • Badges/rewards based on elevated levels of engagement provide children a sense of artificial achievement or fulfilment. Children lack the critical-thinking skills to realise that the rewards are gimmicks to get them to re-engage with the app day after day.
  • Product placement, branded content, and influencers are some of the most popular children's content on YouTube. However, children cannot discern advertising content as easily as adults can and are more likely to follow their trusted characters' recommendations without realising that their behaviour is being influenced.
  • Frictionless access to hundreds of thousands of apps in the app stores. Without strong parental control settings, research suggests that young children are accessing inappropriate content, such as violent jump-scare apps.
  • Push alerts and nudge techniques to spend more time on digital platforms or apps are common in the apps we play in my lab. We have documented techniques such as rewards for playing daily and for watching ad videos, and enticements such as bouncing presents that, when clicked, show an ad video, share data with Facebook, or show encouragement from trusted characters to make in-app purchases.

It is time to recognise that adult design principles have been loaded into children's digital products without a discussion about the ethics or impact of this practice. Other countries have begun efforts to require child-centred design: the U.K. recently announced its Age Appropriate Design Code to guide companies' product design to prioritise kids' privacy and data protection.

The KIDS Act would help regulate the powerful influence of advertising and the use of inappropriate design features for children, preventing them from appearing in children's digital environments in the first place. It would also provide parents with tools to find healthy content, help their children avoid outrageous content, and push back against manipulative design.

The advent of big tech has redefined the fundamental rules of the market. Technology companies often indulge in the practice of dark patterns, which can broadly be characterised as user interfaces that subtly trick users into taking decisions on the platform that adversely affect their own interests.

Generally, these patterns trick the user into either paying more money or parting with more data than they ordinarily would. A number of jurisdictions have dealt with this through data protection laws like the European GDPR.

According to Article 4(11) of the GDPR:

“‘consent’ of the data subject means any freely given, specific, informed and unambiguous indication of the data subject’s wishes by which he or she, by a statement or by a clear affirmative action, signifies agreement to the processing of personal data relating to him or her”.
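In code terms, Article 4(11) rules out consent harvested through dark patterns. The sketch below is illustrative only (the `is_valid_consent` function and its parameters are assumptions, and this is not legal advice): a pre-ticked box fails because it involves no clear affirmative action by the user.

```python
def is_valid_consent(checkbox_checked, checked_by_default, purpose_specific, informed):
    """Return True only for specific, informed, affirmatively given consent."""
    if checked_by_default:        # pre-ticked box: no affirmative action by the user
        return False
    return checkbox_checked and purpose_specific and informed

# A pre-ticked marketing checkbox the user never touched is not consent:
print(is_valid_consent(True, checked_by_default=True,
                       purpose_specific=True, informed=True))   # False
```

A "Privacy Zuckering" flow fails exactly this test: the sharing option is on before the user ever acts.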

However, India does not have a comprehensive data protection framework; the Data Protection Bill saw some deliberation but was withdrawn from Parliament in August 2022. The hazard of dark patterns can instead be situated within the existing antitrust framework: the Competition Commission of India is well within its power under the Competition Act, 2002 to prohibit or penalise tech enterprises for adopting means to gain an advantage over other competitors and to prevent new entrants into the market.

Alliances of Light

“Wranga AI” - A ray of Light

Wranga is an ecosystem for digital parents raising children in a safe and positive cyber environment. It provides an online platform of content ratings, reviews, and recommendations customised for your child. Wranga strives to guide parents in fostering their children’s digital well-being and in making digital spaces safe for them. It’s your friend, philosopher, and guide, helping you better navigate the digital world for your children.

As our AI evolves, with further research into understanding dark patterns and the inclusion of various parameters into our AI, we will certainly use technology to help serve parents better and to keep them from falling prey to predatory technology practices. By involving public-policy experts, educators, schools, and caregivers, we need to create awareness about the harms of the digital age and enable safe online spaces for us and for our children.

With more regulation and more advancement in technology, we believe we can channel our energy, knowledge, expertise, and experience to create a culture of light, where we do not encourage such practices and instead find more innovative ways of creating value for humans. As we, our technology, and our AI evolve, we need to look deeper into understanding and creating compassionate technology and business practices that aim at long-term sustainability and expand our positive intent rather than the negative: growing organically and helping people rather than harvesting the human spirit for short-term growth and profits.

Further Reading