Dark patterns are malicious or manipulative interface designs or cyber tools that steer end users into behaviour desired by the platform. They are now widespread in Indian cyberspace: some companies, for instance, bundle the wallet and the postpaid option into a single payment method, often depicted as "wallet + postpaid". Here the user is frequently unable to exercise a choice over whether to avail the postpaid option when the wallet lacks sufficient balance for the payment. Dark patterns can also be something as simple as an extra item slipped into your cart during checkout, a nudge into accepting terms you were initially opposed to, or a deliberately tedious process to unsubscribe or delete an account.
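The combined "wallet + postpaid" flow described above can be sketched as a minimal, hypothetical checkout routine (the function, names and amounts are the author's illustration, not any platform's actual code); the dark pattern lies in the silent fallback to postpaid credit with no consent step:

```python
def checkout(amount, wallet_balance, consent_to_postpaid=False):
    """Hypothetical combined 'wallet + postpaid' payment flow.

    A consent-respecting design would refuse to proceed unless
    consent_to_postpaid is True; the dark-pattern variant below
    draws on postpaid credit silently when the wallet is short.
    Returns how the payment is split between the two sources.
    """
    if wallet_balance >= amount:
        return {"wallet": amount, "postpaid": 0}
    # Dark pattern: the shortfall becomes a postpaid loan without
    # asking the user -- consent_to_postpaid is never checked.
    return {"wallet": wallet_balance, "postpaid": amount - wallet_balance}
```

On this sketch, a user paying ₹100 with only ₹40 in the wallet unknowingly takes on ₹60 of postpaid credit, which is precisely the consent gap the article identifies.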
What are dark patterns?
The term "dark pattern" was first coined by Harry Brignull, who described dark patterns as manipulative user interface tricks "that make you do things that you did not mean to". US Federal Trade Commissioner Rohit Chopra has also recently defined dark patterns as "design features used to deceive, steer, or manipulate users into behaviour that is profitable for an online service, but often harmful to users or contrary to their intent".1 In other words, dark patterns tend to violate users' privacy and vitiate consent through technological design choices deployed in aid of digital commercial platforms.
However, it is important to note that while all dark patterns are deceptive in nature, the majority are not considered fraudulent or deceptive under the current paradigm of data and privacy legislation. Some of the nudging methods employed in dark patterns may be deeply infuriating to users, yet classifying them as illegal under the current legal framework is difficult. Further, dark patterns are numerous and exploit myriad biases in individuals. Purdue researchers in 2018 attempted to classify them into five categories: nagging, obstruction, sneaking, interface interference, and forced action.2 Harry Brignull has also categorised dark patterns into twelve forms.3 On this taxonomy, the abovementioned example of combined wallet and postpaid options would amount to interface interference, whereas instances where users are compelled to sign in or share their location can be considered forced action.
Furthermore, dark patterns rely heavily on "confirmshaming" users into accepting terms and conditions that they would otherwise be opposed to. Confirmshaming is the use of guilt-inducing, deceptive content to persuade users to take a specific action; exit-intent pop-ups and other conversion- or retention-based windows and interactions commonly use this strategy, which is itself a form of dark pattern.4 A common example is YouTube asking users whether they are okay viewing ads without a subscription. LinkedIn was also subject to a class action lawsuit over its use of the "friend spam" dark pattern: its website made it appear as though users were sending emails individually, while it actually sent automated emails to users' contacts. LinkedIn ultimately paid a US $13 million settlement for this dishonest design.5
The proliferation of dark patterns in Indian cyberspace has brought privacy and cyberspace regulation into focus. Dark patterns tend to exploit personal data, and their carte blanche use cannot be curbed under the existing statutes. Further, because dark patterns involve personal data, which could include details about a person's attributes or characteristics and be used to identify them, and in some cases sensitive personal data or critical personal data, regulating them is all the more important. The informational privacy of users is also at stake when dark patterns are used to obtain consent by illicit means. Moreover, the autonomy and dignity of individuals are implicated when informational privacy is violated, since such a violation eventually amounts to a violation of fundamental rights and of the right to exercise free will and consent.
Dark patterns and the façade of decisional privacy
The General Data Protection Regulation (GDPR) defines consent as "any freely given, specific, informed and unambiguous indication of the data subject's wishes by which he or she, by a statement or by a clear affirmative action, signifies agreement to the processing of personal data relating to him or her".6 Dark patterns tend to violate user consent and expectations by making users forfeit or jeopardise resources like time, money or social capital to an extent they were not expecting.7 This form of manipulation threatens the privacy of users not only by increasing the resources spent on the platform, but also by compromising decisional privacy and invading personal autonomy.
In the Indian context, Clause 11 of the Personal Data Protection Bill, 2019,8 recognises user consent as a key component of the data protection framework: personal data cannot be processed without obtaining valid consent from the user. Consent is valid only if it is: (a) freely given; (b) based on an informed decision; (c) specific in nature; (d) clearly expressed; and (e) capable of being withdrawn. The clause also provides that the provision of goods or services cannot be made conditional on consent to the processing of personal data that is not necessary for that purpose, and that users must be notified of any risks associated with giving consent. Privacy discourse has also been greatly advanced by the K.S. Puttaswamy v. Union of India (1) judgment, in which the Court observed that the individual and their privacy are inextricably linked.9
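The five validity conditions under Clause 11 are conjunctive, and a dark pattern typically defeats at least one of them. This can be illustrated with a schematic check (the class and field names are the author's illustration, not statutory language):

```python
from dataclasses import dataclass

@dataclass
class ConsentRecord:
    """Illustrative model of the five validity conditions for consent."""
    freely_given: bool
    informed: bool
    specific: bool
    clearly_expressed: bool  # a clear, affirmative, opt-in act
    withdrawable: bool

def is_valid_consent(c: ConsentRecord) -> bool:
    # All five conditions must hold together; a pre-ticked box or an
    # opt-out default fails 'clearly_expressed' and so invalidates
    # the consent as a whole.
    return all([c.freely_given, c.informed, c.specific,
                c.clearly_expressed, c.withdrawable])
```

On this sketch, a record failing any single condition, for example one obtained through a pre-selected checkbox, is invalid in its entirety, mirroring the conjunctive structure of the clause.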
However, manipulation through dark patterns allows platforms, first, to collect vast amounts of data by exploiting the attention cycle; secondly, to use or disseminate the data collected; thirdly, to capitalise on the data by profiling users and their practices; and fourthly, to profit from the privacy costs associated with decisional privacy.10 Hence, the power of decision-making and privacy is compromised and eroded when dark patterns affect free will,11 or interfere with the self-interest and autonomy of individuals. Additionally, consent must be freely and unambiguously given; implicit or opt-out consent that does not constitute a positive action cannot be considered consent. Consent must therefore be explicit, that is, there must be a clear, positive, affirmative act.
Attempts to regulate dark patterns globally
Internationally, in an attempt to regulate dark patterns, Article 25 of the GDPR requires data controllers to implement privacy by design and by default; while this may help prohibit deceptive default settings, small print and other dark pattern strategies, it cannot prevent their use altogether.12 Section 5 of the Federal Trade Commission (FTC) Act also offers a route to protecting decisional privacy, which the FTC should begin to enforce.13 The European Data Protection Board has adopted "Guidelines on dark patterns in social media platform interfaces" with reference to Article 60 of the GDPR.14 The guidelines provide designers and users of social media platforms with practical tips on how to identify and avoid dark patterns in social media interfaces that violate GDPR rules. In the United States, States have also enacted legislation to protect users from dark patterns. The California Privacy Rights Act, in its new definition of consent, provides that an "agreement obtained through use of dark patterns does not constitute consent", and defines "dark patterns" as "a user interface designed or manipulated with the substantial effect of subverting or impairing user autonomy, decision-making, or choice, as further defined by regulation". The Colorado Privacy Act's definition of consent likewise specifically excludes "agreement obtained through dark patterns".
Remarkably, the abovementioned legislations and guidelines recognise that consent obtained through manipulation does not amount to effective consent. The Report of the Joint Parliamentary Committee on the Data Protection Bill, 2019,15 adopts the concept of data fiduciaries from the GDPR and acknowledges that a data fiduciary must give the data principal notice when collecting the data principal's personal information, even where the information is not taken directly from the data principal. Notably, it also states that the notice must contain the rights of the data principal, the nature and categories of personal data being collected, and the various purposes for which that data is being processed. The Report further notes that to process sensitive personal data likely to cause significant harm to the data principal, consent must be obtained in clear terms. However, this pertains only to sensitive personal data, and excludes personal data and non-personal data from the requirement of obtaining consent, leaving platforms free to exploit users' lack of information. For example, if the postpaid and wallet options are combined and the wallet lacks balance, the user would be taking a loan without their knowledge. This gives rise to the question whether the user provided effective consent to a loan, and may eventually ground a claim for violation of fundamental rights under Articles 1916 and 2117 of the Constitution.
Legal implications of dark patterns in India
For a restriction on Articles 19 and 21 of the Constitution to be legitimate, it must pass the test of proportionality articulated by the Supreme Court in K.S. Puttaswamy v. Union of India (1)18 and Puttaswamy v. Union of India (2),19 which requires that the restriction be: (a) imposed by law; (b) a suitable means of achieving a legitimate aim; (c) necessary, and not disproportionate in its impact on the rights of citizens; and (d) accompanied by sufficient procedural guarantees against abuse by State interference. In the case of dark patterns, however, none of these criteria is met; the patterns serve only to further the interests of commercial platforms. Regrettably, the Personal Data Protection Bill, 2019, and other existing models and discourse on data protection focus on the growth of the digital economy at the cost of individual privacy.20
As individual privacy is eroded through dark patterns, the liberty and dignity of individuals are eventually compromised as well, since their capacity to make decisions is impaired. This raises numerous questions in light of the Report on the Data Protection Bill, 2021,21 which also states that consent managers, data fiduciaries registered with the Data Protection Authority (DPA), shall provide interoperable platforms that aggregate consent from a data principal. At the same time, the DPA's powers have neither been delineated nor made as broad as the FTC's for regulating dark patterns, leaving India with a two-fold conundrum: (i) identifying a dark pattern; and (ii) preventing misuse of the identified dark pattern through an enforcement authority.
The Report's suggestion of including "privacy by design" likewise cannot comprehensively prevent dark patterns.22 Recognising the term "dark patterns" in legislation is therefore crucial. Additionally, drawing from the EU Guidelines, principles of transparency, accountability and data protection, along with international best practices (the GDPR provisions) and domestic regulation (the upcoming Data Protection Bill), should be placed at the forefront of dark pattern assessments. The DPA may also consider formulating a checklist that identifies potential dark patterns, so that UX designers adopt ethical design in the future.
In conclusion, India's lack of data protection legislation may lead to the unabated use of dark patterns, which may affect various socioeconomic classes differently, as only a minority of the population has the digital literacy and awareness to identify and evade them. Given India's socioeconomic situation and class differences, it is necessary that dark patterns be regulated through legislation; not doing so permits unrestricted access to the personal data of individuals, without any means of remedy in instances of misuse, and only serves to exacerbate information asymmetry.
† 3rd year student, BA LLB (Hons.), National Law University, Jodhpur. Author can be reached at <firstname.lastname@example.org>.
1. Statement of Commissioner Rohit Chopra, Regarding Dark Patterns in the Matter of Age of Learning, Inc. Commission File Number 1723186, 2-9-2020, <https://www.ftc.gov/system/files/documents/public_statements/1579927/172_3086_abcmouse_-_rchopra_statement.pdf>.
2. Colin M. Gray, et al., “The Dark (Patterns) Side of UX Design”, CHI 2018 Paper, 21–26 April 2018, <https://dl.acm.org/doi/pdf/10.1145/3173574.3174108>.
3. Harry Brignull, “Types of Dark Pattern”, Dark Patterns (2019), <https://www.darkpatterns.org/types-of-dark-pattern> <https://perma.cc/X9QV-P5J4>.
5. John Brownlee, “After Lawsuit Settlement, LinkedIn’s Dishonest Design is Now a $13 Million Problem”, FastCompany (5-10-2015), <https://www.fastcompany.com/3051906/after-lawsuit-settlement-linkedins-dishonest-design-is-now-a-13-millionproblem> <https://perma.cc/6ZGC-TCXA>.
6. General Data Protection Regulation, Art. 4(11) (hereinafter “GDPR”).
7. Chris Lewis, Irresistible Apps: Motivational Design Patterns for Apps, Games, and Web-Based Communities (2014).
10. Gregory Day and Abbey Stemler, “Are Dark Patterns Anticompetitive?”, 72 Ala L Rev 1 (2020).
11. Ryan Calo, “Digital Market Manipulation”, 82 Geo Wash L Rev 995, (2014).
12. GDPR, Art. 25.
13. Federal Trade Commission Act, S. 5, <https://www.federalreserve.gov/boarddocs/supmanual/cch/200806/ftca.pdf>.
14. Guidelines on Dark Patterns in Social Media Platform Interfaces: How to Recognise and Avoid them, European Data Protection Board, <https://edpb.europa.eu/system/files/2022-03/edpb_03-2022_guidelines_on_dark_patterns_in_social_media_platform_interfaces_en.pdf>.
20. Internet Freedom Foundation, “#StartfromScratch: Constitutional Utopias of Digital Protection”, <https://internetfreedom.in/constitutional-utopias-of-digital-protection/>.