Written by Venkatesh Ramamrat
“It’s easier to fool people than to convince them that they’ve been fooled.” — Unknown.
I grew up in the 1980s in India and was lucky to have access to computer games from the late '80s and early '90s, with black-and-white monitors and MS-DOS games. For quite a while I was a fan of SEGA, Nintendo, PlayStation, Xbox, and PC games. But when smartphones and mobile games entered their phenomenal growth stage, I tried a lot of games and found that many had a feature designed to put you in an anxious spot: either you pay to continue playing, or the game becomes difficult or near impossible, taking endless time.
Through my involvement in child online safety with Wranga, which literally means "Ray of Light", I feel deeply responsible for sharing knowledge and shedding light on the topic of dark patterns, a neologism introduced by Harry Brignull, who defined it as follows:
a user interface that has been carefully crafted to trick users into doing things, one that does not have the user's interests in mind. Dark patterns are created to trick the user into choosing an option that is not what they would choose on their own.
The techniques and psychological tricks designers use are very well explained in a blog post by Tristan Harris, co-founder of the Center for Humane Technology.
You’ve probably encountered loads of types of dark patterns during your time on the internet; you just didn’t realise it, most likely because they’re designed to be deceptive and to quietly manipulate you into doing something you don’t want to do. I did some research into several kinds of dark patterns, and though I do not wish to name names, most of the top companies and products we use online are full of dark patterns like these:
Bait and Switch: You set out to do one thing, but a different, undesirable thing happens instead.
Confirm shaming: The act of guilting the user into opting into something. The option to decline is worded in such a way as to shame the user into compliance.
Disguised Ads: Adverts that are disguised as other kinds of content or navigation, in order to get you to click on them.
Forced Continuity: When your free trial with a service comes to an end and your credit card silently starts getting charged without any warning. You are then not given an easy way to cancel the automatic renewal.
Friend Spam: The product asks for your email or social media permissions under the pretence it will be used for a desirable outcome (e.g. finding friends), but then spams all your contacts in a message that claims to be from you.
Hidden Costs: You get to the last step of the checkout process, only to discover some unexpected charges have appeared, e.g. delivery charges, tax, etc.
Misdirection: The design purposefully focuses your attention on one thing in order to distract your attention from another.
Price Comparison Prevention: The retailer makes it hard for you to compare the price of an item with another item, so you cannot make an informed decision (e.g. AliExpress).
Privacy Zuckering: You are tricked into publicly sharing more information about yourself than you really intended to. Named after Facebook CEO Mark Zuckerberg.
Roach Motel: The design makes it very easy for you to get into a certain situation, but then makes it hard for you to get out of it (e.g. a subscription).
Sneak into Basket: You attempt to purchase something, but somewhere in the purchasing journey the site sneaks an additional item into your basket, often through the use of an opt-out radio button or checkbox on a prior page.
Trick Questions: While filling in a form you respond to a question that tricks you into giving an answer you didn't intend. When glanced upon quickly the question appears to ask one thing, but when read carefully it asks another thing entirely.
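To make the "Sneak into Basket" pattern above concrete, here is a minimal sketch; the basket model and the "travel insurance" add-on are hypothetical, invented purely for illustration:

```python
from dataclasses import dataclass, field

@dataclass
class Item:
    name: str
    price: float

@dataclass
class Basket:
    items: list = field(default_factory=list)

    def total(self) -> float:
        return sum(item.price for item in self.items)

def checkout(basket: Basket, opted_out_of_extras: bool = False) -> float:
    """The dark pattern: an extra item is added by DEFAULT, and the user
    must find and untick an opt-out control on a prior page to avoid it."""
    if not opted_out_of_extras:
        # Sneaked into the basket without an explicit opt-in.
        basket.items.append(Item("travel insurance", 9.99))
    return basket.total()
```

A user who misses the opt-out pays for the extra item; the honest design would be opt-in, where the default total equals the sum of what the user actually chose.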
The OECD (Organisation for Economic Co-operation and Development) has broadly categorised the harms that dark patterns cause to consumers, including financial loss, privacy harms, and psychological detriment.
Gaming Industry
There might be another reason why you are so addicted to your game. As someone who plays video games a lot, I realise how easy it is to get sucked into a game and play for hours at a time, but this becomes a problem when the game is intentionally designed to be played that way. Here are a few deceptive design patterns that are employed in game design, and how they manipulate players.
One example is the temporal dark pattern, which is designed to make you spend more time in the game than you intended.
The goal is to make more informed decisions about how you spend your time, not to get sucked into the vortex of games' dark patterns, and to promote manipulation literacy so that you are aware of such deceptive techniques. Dark patterns are used in many facets of life, and video games are no exception. It’s up to you to decide whether you’re going to use them for good or for not-so-good intentions; it always depends on the use and the context.
Dark Pattern AI
“The algorithm is optimised to change your behaviour.”
This makes new-age, machine-learning-based dark patterns more effective and very scary: they are not visible, and they have an agenda that differs from yours. They will optimise over time to do whatever they can to align your agenda with their goal. This dark pattern unfolds over time without you ever being aware it is happening.
Most dark patterns are defined by misleading you into taking an action that does not align with your agenda. These new machine learning algorithms go a step further—they change your behaviour over time so that you take actions that do not align with your agenda. They do not mislead you. They just change you.
Behavioural data is what the algorithm uses to decide whether each technique it employs is successful. When it says it is 95% sure a technique will be successful, it is really saying it is 95% sure it can change your behaviour. It learns and optimises through behavioural change.
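The loop described above can be sketched as a simple multi-armed bandit. This is a hedged illustration, not any real product's system: the "techniques" and their success rates are invented, and epsilon-greedy stands in for whatever optimiser a real platform might use. The point is only that the algorithm needs nothing but behavioural feedback (did the user comply?) to converge on the most manipulative option.

```python
import random

def run_bandit(true_rates, rounds=5000, epsilon=0.1, seed=0):
    """Epsilon-greedy bandit over candidate 'nudge techniques'.
    true_rates[i] is the (hidden) probability that technique i
    changes the user's behaviour on a given exposure."""
    rng = random.Random(seed)
    n = len(true_rates)
    counts = [0] * n      # how often each technique was tried
    successes = [0] * n   # how often the user complied

    for _ in range(rounds):
        if rng.random() < epsilon:
            arm = rng.randrange(n)  # explore: try a random technique
        else:
            # exploit: pick the technique with the best observed rate
            arm = max(range(n),
                      key=lambda a: successes[a] / counts[a] if counts[a] else 0.0)
        counts[arm] += 1
        # "Behavioural data": did the user do what the technique pushed for?
        if rng.random() < true_rates[arm]:
            successes[arm] += 1

    estimates = [successes[a] / counts[a] if counts[a] else 0.0 for a in range(n)]
    return estimates, counts
```

Run with invented rates such as `[0.2, 0.5, 0.8]` and the loop quickly concentrates almost all exposures on the technique that changes behaviour most reliably, without ever "knowing" what the user actually wanted.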
Dark Patterns and Children
In the USA, two leading advocacy groups protecting children from predatory practices online filed comments asking the Federal Trade Commission (FTC) to create strong safeguards to ensure that internet “dark patterns” don’t undermine children’s well-being and privacy. The Campaign for a Commercial-Free Childhood (CCFC) and the Center for Digital Democracy (CDD) say tech companies are preying upon vulnerable kids, capitalising on their fear of missing out, their desire to be popular, and their inability to understand the value of misleading e-currencies, as well as putting them on an endless treadmill on their digital devices. They urged the FTC to take swift and strong action to protect children from the harms of dark patterns.
Comment from Jeff Chester, Executive Director of the Center for Digital Democracy:
“Dark Patterns” are being used in the design of child-directed services to manipulate them to spend more time and money on games and other applications, as well as give up more of their data. It’s time the FTC acted to protect young people from being unfairly treated by online companies. The commission should issue rules that prohibit the use of these stealth tactics that target kids and bring legal action against the companies promoting their use.
Comment from Josh Golin, Executive Director of the Campaign for a Commercial-Free Childhood:
In their rush to monetize children, app and game developers are using dark patterns that take advantage of children’s developmental vulnerabilities. The FTC has all the tools it needs to stop unethical, harmful, and illegal conduct. Doing so would be a huge step forward towards creating a healthy media environment for children.
There are developmental reasons why adult design assumptions are not appropriate for children and, in extreme cases, exploit their developmental differences.
It is time to recognise that adult design principles have been loaded into children's digital products without any discussion about the ethics or impact of this practice. Other countries have also begun efforts to require child-centred design. The U.K. recently announced its Age Appropriate Design Code to guide companies' product design to prioritise kids' privacy and data protection.
The KIDS Act would help regulate the powerful influence of advertising and the use of inappropriate design features for children, preventing them from appearing in children's digital environments in the first place. It would also provide parents with tools to find healthy content, help their children avoid outrageous content, and push back against manipulative design.
The advent of big tech has redefined the fundamental rules of the market. Technology companies often indulge in the practice of dark patterns, which can broadly be characterised as a user interface that subtly tricks users to take decisions on the platform that adversely affect their own interests.
Generally, these patterns trick the user into either paying more money or parting with more data than they ordinarily would. A number of jurisdictions have dealt with this through data protection laws like the European GDPR.
According to Article 4(11) of the GDPR:
“‘consent’ of the data subject means any freely given, specific, informed and unambiguous indication of the data subject’s wishes by which he or she, by a statement or by a clear affirmative action, signifies agreement to the processing of personal data relating to him or her”.
However, India does not have a comprehensive data protection framework, though there was some deliberation on the Data Protection Bill, which was withdrawn from Parliament in August 2022. We try to situate the hazard of dark patterns within the existing antitrust framework: the Competition Commission of India is well within its power under the Competition Act, 2002 to prohibit or penalise tech enterprises that adopt means to gain an advantage over other competitors and prevent new entrants into the market.
Alliances of Light
“Wranga AI” - A ray of Light
Wranga is an ecosystem for digital parents raising children in a safe and positive cyber environment. It provides an online platform of content ratings, reviews, and recommendations customised for your child. Wranga strives to guide parents in fostering their children's digital well-being and in making digital spaces safe for them. It's your friend, philosopher, and guide, helping you better navigate the digital world for your children.
As our AI evolves, and with further research into understanding dark patterns and incorporating various parameters into our AI, we will certainly use technology to serve parents better and to keep them from falling prey to predatory technology practices. By involving public-policy exponents, educators, schools, and caregivers, we need to create awareness about the harms of the digital age and enable safe spaces online for us and for our children.
With more regulation and more advancement in technology, we believe we can channel our energy, knowledge, expertise, and experience to create a culture of light in which we do not encourage such practices and instead find more innovative ways of creating value for humans. As we evolve, and as our technology and AI evolve, we need to look deeper into understanding and creating compassionate technology and business practices that aim at long-term sustainability, expanding our positive intent rather than the negative, growing organically and helping people rather than harvesting the human spirit for short-term growth and profits.
Further Reading