The EDPB has published draft guidance on “dark patterns” in social media (the Guidelines) for consultation. The Guidelines consider in detail common social media interfaces that present the content of privacy policies and collect consent in ways that appear formally compliant but substantively violate GDPR requirements (methods now termed “dark patterns”), and explain how the GDPR applies in this context. The best practice recommendations in the draft Guidelines, although directed at social media platform operators, could be applied to other online service providers and are therefore of general relevance.
Background and scope
The guidance appears to have been inspired by the Irish DPC’s WhatsApp decision last year, as many of the privacy issues raised in that decision are addressed in the Guidelines, in particular (i) the requirements for privacy policies to be easily accessible under Article 12(1) GDPR and transparent and detailed under Article 5(1)(a) GDPR, and (ii) the requirement for privacy policies set out on different pages or provided to different users (and non-users) to be consistent.
Annex 4 of the Guidelines sets out six categories of “dark patterns” and several sub-categories. It refers to the particular GDPR requirements that may be breached by each category, together with use case examples. We recommend reading the Guidelines to understand the EDPB’s approach in practice, as they provide detailed best practice recommendations, with visual examples, for each stage of the user account lifecycle.
The Guidelines review the application of the key GDPR principles: (i) fair processing under Article 5(1)(a), (ii) data minimisation under Article 5(1)(c), (iii) accountability of the data controller under Article 5(2), (iv) transparency under Article 12 and (v) data protection by design under Article 25. The Guidelines also highlight the relevance of the consent requirements under Article 7 and the information obligations under Articles 13 and 14.
Interestingly, the Guidelines also mention potential infringements of the right to data portability under Article 20 GDPR, which aligns with the European Commission’s digital strategy of encouraging users to access their own data, switch effectively between providers and have greater choice under the proposed Data Act and the Digital Markets Act.
Examples of dark patterns
One common issue is repeated requests for users to provide additional data after their initial refusal, for example, requests for phone numbers. Whether these are requested for initial login identification or for further verification on each login, the service provider should consider whether a phone number is in fact necessary and whether alternative methods are available, such as emails, security apps or notifications in the app on a different device. Unlike email addresses and other methods, phone numbers are not easily interchangeable, making this authentication method more intrusive. Such repeated requests that ignore existing alternatives are not limited to what is necessary for the processing and may therefore breach the data minimisation requirement under Article 5(1)(c). The Guidelines also stress that in this example phone numbers are not necessary where users select email addresses as the regular contact method on registration. The unstated reason for wanting the phone number may be for data matching purposes.
If the requests for additional information are also supported by motivational rather than neutral and clear language, they may constitute “emotional steering” by influencing the user’s emotional state. The EDPB considers emotional steering, such as creating a sense of urgency or subtly suggesting that there is a “correct” approach to sharing one’s personal data, to be biased messaging that influences the user’s decision, and therefore contrary to the principle of fair processing under Article 5(1)(a).
Another key issue for users is revisiting the privacy notice. The privacy notice should be easily accessible to comply with the transparency and accessibility requirements under Articles 5(1)(a) and 12 and the Guidelines on Transparency. If the notice appears several times without hierarchy or consistency, users will not be able to identify the correct privacy notice and will be confused by the redundancy and discrepancies, making the notice inaccessible contrary to Article 12.
The notice has to be comprehensive and well-structured and provide links to the privacy policies of any joint controllers. At the same time, it should not overload users with so many layers that they overlook relevant information. The EDPB recognises that the acceptable format may vary depending on the application and that regulators will make decisions on a case-by-case basis. However, it also recommends that social media providers run tests with users to receive and implement feedback on the effectiveness of their privacy notice format. Service providers should test the experience across all platforms to ensure that the format works and that the content itself is consistent across platforms, in line with the accessibility requirement under Article 12.
Practical dos and don’ts
The main points to consider when designing the user experience and privacy interfaces are set out below (clearly, the number of users and the sensitivity of the personal data collected will be relevant to how far you go in terms of testing, but the other points will be helpful in most contexts):
- Make sure the privacy notice is a single document, with all references to it leading to the same link.
- Provide plain definitions for any legal or technical jargon.
- Provide previous versions highlighting changes and dates of release.
- If you provide the service in a particular language, then provide the data protection information in that language too.
- When collecting consent, pay particular attention to the design:
- Make sure the website remains accessible without the user agreeing to provide the data for which consent is being requested.
- Use clear and concise rather than emotional language.
- Do not pre-select the less restrictive data protection option or make it more visually prominent. If a data sharing option that is not the least invasive has been pre-selected, inform the user of this and indicate where the selection can be changed.
- Include visual clues for expanding menus and make sure that any visual signifiers correspond to users’ expectations.
- If data is collected or shared via an unrelated interface (e.g. for geolocation sharing on social media posts), highlight the data protection elements at the point of data collection.
- Shortly after the user creates an account, make it easy to navigate to the prominent page with data protection preferences (with settings related to the same aspect located on the same page) and provide users with multiple opportunities to select and update their preferences, particularly at the time the data collection takes place or a particular type of data is used.
- Make sure that users have an easy way to exercise their rights, such as consent withdrawal, filing of complaints or termination of their account.
- Make sure that the relevant links are intuitively located and put related topics together.
- Make sure that there are no more steps necessary to withdraw consent than to grant it.
- Ensure that the language surrounding consent withdrawal or account termination does not discourage the user or require them to complete unnecessary steps (e.g. providing reasons for deletion before exercising the right).
- Explain what data will be retained after deletion.
- Check consistency of the user experience across websites and apps. In particular, make sure that consistent wording is used and that all shortcuts work as intended.
- Test the processes on users and implement their feedback.
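The pre-selection point above lends itself to an automated design check. The sketch below is a minimal, hypothetical TypeScript example (the `ConsentOption` shape and the option names are illustrative assumptions, not taken from the Guidelines) showing a consent configuration in which no non-essential option defaults to on, together with a helper that can flag pre-selection before a release:

```typescript
// Hypothetical consent configuration for a sign-up flow. The shape and
// option names are illustrative only.
interface ConsentOption {
  id: string;
  label: string;
  essential: boolean;      // true only for strictly necessary processing
  defaultChecked: boolean; // should be false for every non-essential option
}

const consentOptions: ConsentOption[] = [
  { id: "analytics", label: "Share usage analytics",   essential: false, defaultChecked: false },
  { id: "ads",       label: "Personalised advertising", essential: false, defaultChecked: false },
];

// Design check: returns true if any non-essential option is pre-selected,
// which the Guidelines treat as a dark pattern.
function hasPreselectedOptions(options: ConsentOption[]): boolean {
  return options.some(o => !o.essential && o.defaultChecked);
}

console.log(hasPreselectedOptions(consentOptions)); // false: nothing pre-selected
```

A check like this can be run in unit tests so that a pre-selected sharing option added later fails the build rather than reaching users.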