GDPR: Kids and Other Quirks
For the musically inclined, think of the new EU General Data Protection Regulation (GDPR) as “theme and variations.” The principles and terminology are standard across Europe, but there are many instances where the application will vary to some extent from country to country. An example of non-standardization is the varying national age limit, from 13 to 16, beyond which parental consent is required to collect information about children. Another is the entire field of HR privacy (!), as each EU country is largely free to specify how the general principles of the GDPR will be applied in the employment context. Such diversity has historically been a challenge for multinationals and online businesses operating in European markets. The GDPR should indeed produce more “harmonization,” compared to the 1995 EU Data Protection Directive, but there will still be variations after GDPR enforcement begins in May. This article highlights areas where national differences matter, as well as some strategies for dealing with those variations.
Directive to Regulation
For all the quirks, it is important first to emphasize the continuity that the GDPR represents – it truly is evolutionary rather than revolutionary – and the baseline similarity of fundamental principles in data protection across Europe. The GDPR updates and amplifies the principles of the 1995 EU Data Protection Directive, which is the foundation for current data protection laws in the 28 EU countries and the three additional EEA countries (as well as for data protection laws in many non-member countries, such as Switzerland, Russia, and Israel). The GDPR further elaborates some of those principles, especially transparency and consent. It also adds new principles, such as the concepts of privacy “by design and by default,” the right to “data portability,” and the right to deletion (or “the right to be forgotten”). The GDPR definition of personal data is modernized by adding online identifiers and location data. The “special categories” of sensitive data, which get greater protection, now include genetic data and biometric identifiers. GDPR rules on record-keeping and the appointment of data protection officers (DPOs) are new, along with security breach notification obligations and requirements for parental consent to collect information about children. The GDPR also significantly increases the range of sanctions and the weight of potential penalties for noncompliance.
Local Options
The Directive had to be “transposed” into national law through local statutes and regulations that were delayed for a matter of years in some countries and sometimes varied in detail. By contrast, the GDPR is a single EU Regulation that applies with direct effect from May 25, 2018 and will be enforced by courts and data protection supervisory authorities throughout the EU / EEA. According to Recital 10, harmonization is one of the key objectives of the GDPR: “Consistent and homogenous application of the rules for the protection of the fundamental rights and freedoms of natural persons with regard to the processing of personal data should be ensured throughout the Union.” This is especially important to organizations based outside Europe that do business in multiple European countries or attract European consumers to their websites and mobile apps, many of which are now directly subject to jurisdiction under the GDPR because they regularly offer goods or services to European residents or monitor their behavior (see “The Long Reach of the GDPR”).
But Recital 10 also recognizes a “margin of manoeuvre” for member states to detail rules in “specific processing situations.” The text of the GDPR identifies numerous areas where the member states may adopt options or derogations or add more specific requirements. In other areas, new procedures may be necessary, such as protocols for notifying security breaches or requesting “prior consultation” on high-risk processing activities. Altogether, depending on how one counts, there are more than 70 “opening clauses” that a country might enact in the form of enabling legislative measures to localize the GDPR. And so far, only two of the member states, Germany and Austria, have finalized their GDPR implementing measures.
In addition to the options expressly contemplated in the GDPR, some of the member states are considering further modifications (which might or might not withstand judicial scrutiny if challenged at national or EU level), expanding the scope of the GDPR or limiting its application in specific circumstances. Some of these are mentioned below. Added to the inevitable differences in interpretation and enforcement priorities, the result is more diversity than might be comfortable for the global home office or offshore business.
Still, GDPR is a big step toward greater harmonization. In addition, the new European Data Protection Board (EDPB), replacing the Article 29 Data Protection Working Group established under the Directive, is meant to work along with the European Commission and industry associations to develop more uniform rules and guidelines over time.
Meanwhile, cross-border businesses will have to navigate zones of uncertainty and adapt to local law and culture. Here are some areas where local variations will be of particular interest to US companies with European customers or employees:
Scope of “personal data”. The GDPR lays down rules relating to the protection of “natural persons” – human beings – who are “data subjects” “in the Union” (see Art. 1(1), Art. 3, Art. 4(1), Recital 2) (note that the Regulation is meant to protect all people in the European Union, not just citizens). Despite the regulation’s focus on natural persons, the Austrian implementing legislation continues a national tradition of extending data protection rights to legal persons (such as corporations and nonprofit foundations) in addition to natural persons. Other EU / EEA countries are unlikely to take this approach. But outside the EU, Switzerland’s data protection law, which is currently being revised to align with the GDPR, also applies to legal as well as natural persons. Thus, in Austria and Switzerland, data protection obligations can extend to information about companies and other legal entities, not only individuals, and data from those countries should be handled with that broader scope in mind.
Children. In the US, the federal Children’s Online Privacy Protection Act (COPPA) has long protected the privacy of children under age 13 online, but the EU Data Protection Directive did not expressly address children’s privacy. GDPR Article 8 is a new requirement for parental consent to process information about children under age 16 “in relation to the offer of information society services directly to the child,” if consent is the basis for lawful processing. (This provision does not apply where there is another basis for lawful processing, such as a legal obligation or an emergency.) “Information society services” are defined as in the 2015 EU Information Society Services Directive, and the term essentially covers any commercial online or mobile service.
The impact of the children’s provisions will vary by country. While Article 8 by default requires parental consent for children under age 16, member states may provide for a lower age, so long as it is not below age 13. Article 8 also requires data controllers to make reasonable efforts to verify parental consent, and member states will probably vary in what methods they consider satisfactory for verification, at least until EU-level codes or standards are adopted. Moreover, GDPR expressly does not affect member state laws setting the rules for forming a valid contract with a child (Art. 8(3)). This suggests that apart from the context of “information society services” addressed in Article 8(1), there will also be national variations in assessing consent for handling information about children in any other context, simply because there will always be the question of whether the child was legally capable of giving consent. The German statute implementing GDPR expressly requires parental consent to handle information about children under 16 in other contexts, but the draft legislation in most countries does not specify the age of consent for handling children’s information apart from “information society services,” which will no doubt lead to further uncertainty.
In the US, age 13 has proven to be a practical demarcation for parental consent, given that pre-teens’ interests in products and digital content tend to differ noticeably from those of teens and adults. Hence, it is very common for US website and mobile app operators simply to state that their sites are “not intended” for visitors under 13, avoiding any need to include distracting age verification checks or parental consent mechanisms.
Raising the limit to age 16 should give many operators pause, however. Many sites, apps, games, and products of interest to adolescents and adults would also be of interest to at least some mid-teens, and it would become harder to say that a site is strictly “not intended” for persons under 16, as opposed to 13. The operator would have to look carefully at the language, images, and advertising associated with the site, as well as the interactive messages or posts it attracted, to make sure that it was not, in fact, appealing to an under-16 audience.
Here is how the member states are lining up so far on the age issue (this could change as they finalize their implementing legislation). Countries not listed below will default to age 16 unless they produce implementing legislation with a different option.
Age 13: Czech Rep., Denmark, Finland (the Ministry of Justice recommends either 13 or 15), Ireland, Latvia, Poland, Spain, Sweden, UK.
Age 14: Austria (final).
Age 15: Greece.
Age 16: France, Germany (final), Hungary, Lithuania, Luxembourg, Netherlands, Slovakia.
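For operators building compliance logic, the country list above reduces to a simple lookup table plus the Article 8 default. The sketch below is a hypothetical illustration only: the country codes, function name, and data structure are the author's assumptions, and the age values reflect the draft legislation listed above, which may change before finalization.

```python
# Hypothetical sketch: national ages of digital consent under GDPR Art. 8,
# based on the draft/final implementing measures listed above (subject to
# change as member states finalize their legislation).

AGE_OF_DIGITAL_CONSENT = {
    # Age 13
    "CZ": 13, "DK": 13, "FI": 13, "IE": 13, "LV": 13,
    "PL": 13, "ES": 13, "SE": 13, "GB": 13,
    # Age 14
    "AT": 14,
    # Age 15
    "GR": 15,
    # Age 16
    "FR": 16, "DE": 16, "HU": 16, "LT": 16,
    "LU": 16, "NL": 16, "SK": 16,
}

# Article 8(1) default for member states that enact no different option.
GDPR_DEFAULT_AGE = 16


def parental_consent_required(country_code: str, user_age: int) -> bool:
    """Return True if Article 8 parental consent would be needed,
    assuming consent is the lawful basis for the processing."""
    threshold = AGE_OF_DIGITAL_CONSENT.get(country_code, GDPR_DEFAULT_AGE)
    return user_age < threshold
```

Under these assumed values, a 14-year-old user would trigger the parental consent requirement in Germany but not in Austria, and a country with no listed option (for example, Italy in this sketch) would fall back to the age-16 default.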
For the operators of US-hosted websites and apps contemplating European users but offering only English-language content, it will be tempting to stay with the same approach as in the US and argue that the materials are not intended for users under 13, since the English-speaking jurisdictions (UK and Ireland) are likely to adopt that age limit. However, English is the most common second language in the other EU countries, so a popular English-language site or app is also likely to attract users from other countries. A small number of incidental users might not trigger complaints or enforcement, but the supervisory authorities would likely take a sterner view if it appeared that a US site operator knowingly interacted with minors across Europe while disregarding parental consent requirements. To effectively limit compliance to UK and Irish law, the operator might have to decline deliveries or downloads to other countries, for example, or refuse payments for digital content from customers in other countries.
Alternatively, a logical approach for marketers, social media, retailers, networked game sites, and others aiming for a pan-European market that includes mid-teens is simply to include age verification and parental consent for anyone under 16. This would be a compliance passport for all countries in the EU / EEA. But any operators pursuing this strategy will be pioneers, as the mechanisms for age verification and parental consent are not yet approved at national level or agreed at EU level.
It is likely that European authorities will look to the US experience under COPPA as a model, and the operators of US kids’ sites are already familiar with mechanisms for parental consent approved by the US Federal Trade Commission (FTC). Depending on the sensitivity and use of the information, these techniques for obtaining parental consent include emails followed by confirmation via phone or text, use of a payment card, toll-free telephone or video conference calls, scanned or faxed copies of official ID (sometimes compared to a mobile phone photo of the parent or guardian scanned with facial recognition software), and challenge questions. A number of FTC-approved vendors offer verification services. Such techniques could be deployed in Europe as well, but they will tend to be more complicated and expensive on a European scale because of diversity in language, culture, telecommunications networks, and identification credentials.
Employees and job candidates. The GDPR will require greater transparency, record-keeping (except in small businesses – see Art. 30(5)), security policies, and the designation of a DPO in many organizations. Therefore, most companies with employees or independent contractors in the EU / EEA will have to update their privacy notices or employee handbooks, as well as their online recruiting pages or paper application forms, to be GDPR-compliant.
But Article 88 also allows member states “to provide for more specific rules to ensure the protection of the rights and freedoms in respect of the processing of employees’ personal data in the employment context,” including obligations relating to recruitment, employment contracts and collective bargaining agreements, diversity and equal opportunity, health and safety, protection of the employer’s and employee’s property, and termination of employment. Hence, global companies must continue to rely on their local HR managers and advisors to know, for example, what questions they can ask job candidates, what diversity data they can collect, what information should be recorded in a performance evaluation, when they can monitor employee emails or web browsing, who has access to records of an employee’s medical or family leave, at what stage the employer must disclose the results of a disciplinary investigation, what the employer can disclose about the reasons for separation, how long the employer can retain job applications or old personnel files, and many other details of employment-related information handling.
Official ID numbers. Member states can determine “the specific conditions for the processing of a national identification number or any other identifier of general application” (such as a social security, national health insurance, or driver’s license number) (Art. 87). Such identifiers are routinely required for employees and independent contractors and at a certain point in the job application process, so employers must understand the rules for handling them. France, for example, required a long-form declaration to the data protection authority CNIL for authorization to transmit the social security number outside the EU; it is not yet clear whether that requirement continues under the GDPR.
Now that the GDPR includes a security breach notification obligation, it might be expected that a data loss involving unencrypted official ID numbers would entail notification to affected individuals, as in the US, because of the risk of ID theft and fraud. But that is a decision that typically will be made in consultation with the relevant national data protection supervisory authority (see GDPR Arts. 33, 34), and few of them have published comprehensive guidance so far on breach notification.
Biometric, genetic, and health data. Member states may introduce “further conditions” on the processing of biometric, genetic, or health data (Art. 9(4)). Italy’s data protection authority, for example, has in the past issued general authorizations with conditions for processing genetic and health data and has required written security policies specifically safeguarding such data, and the French data protection authority has required prior authorization to deploy biometric ID systems. It is not yet clear whether these particular conditions and procedures will be continued or revised following GDPR implementation. Multinational employers and online businesses should be alert to local restrictions on the handling of these three data categories, watching for guidance on the websites of the relevant national supervisory authorities.
Criminal background checks. Any comprehensive register of criminal convictions is to be kept under the control of “official authority,” and the individual member states are to establish safeguards for processing personal data relating to criminal convictions, offenses, or related security measures (Art. 10).
The Directive included a similar provision, and there has been considerable variation among the member states on the availability of criminal background checks. The national laws typically limit the purposes for requesting background checks, such as vetting applicants for employment in management, financial, or legal roles, or in healthcare positions, as well as requiring confidentiality and data destruction after use, subject to penalties. After GDPR, employers must continue to comply with conditions for using the national criminal offenses databases, such as those in the UK that separately cover England and Wales, Scotland, and Northern Ireland.
Representatives. Article 27 requires a controller or processor without an establishment in the EU to appoint a local “representative” (similar to a corporate agent for service of process, which is different from the obligation to appoint a DPO under Art. 37, who might or might not be located in the EU). The GDPR does not present this as a national option, but the UK Data Protection Bill excludes this requirement. With Brexit looming, this divergence is unlikely to result in a challenge from Brussels, and non-European companies would not be able to rely on a representative in the UK as their EU representative after Brexit. The lack of a pre-designated representative in the UK for non-EU controllers and processors is probably not significant in practice: they must still include valid contact information in their privacy notices, including the contact details for their DPOs, where applicable (see GDPR Arts. 13(1)(a), 14(1)(a)).
Record-keeping and DPIAs. GDPR Art. 30 lists items of information that must be included in a controller’s or processor’s “record” of personal data processing activities (unless the business has fewer than 250 employees and does not handle sensitive or risky data), and Art. 35 requires controllers to conduct a data protection impact assessment (DPIA) for “high-risk” processing. Some of the national data protection supervisory authorities have already published guidelines or templates for the data protection record and the DPIA, and these, of course, vary somewhat.
Where a company operates in multiple EU countries, Article 56 contemplates that the “lead supervisory authority,” in the country where the company has its “main establishment,” will be competent to supervise and handle complaints and cross-border issues. Thus, it would make sense for a foreign company to build its data protection record and DPIAs on the model recommended by its lead supervisory authority.
Data Protection Officer. Article 37 requires a controller or processor to designate a data protection officer (DPO) if its core activities require “regular and systematic monitoring of data subjects on a large scale” or consist of large-scale processing of any of the special categories of data under Article 9 (race, politics, religion, union membership, health and sexual life, genetic data, or biometric identifiers) or data relating to criminal convictions and offenses.
The German GDPR implementation statute is more far-reaching, also requiring a DPO if the company employs at least 10 people who regularly work with the automated processing of personal data, or if the company must prepare a data protection impact assessment (DPIA), or if it commercially processes personal data for third parties or for market research or opinion research.
It remains to be seen whether other national implementing legislation will similarly mandate DPOs in additional circumstances. However, several data protection supervisory authorities (notably in France and the UK) have already published GDPR guidance encouraging controllers and processors to appoint a DPO even where it is not mandatory, to help ensure compliance and provide a point of contact for individuals and the authorities.
Consumer credit. The German GDPR implementation statute includes specific provisions regulating consumer credit reports and scores. This area is not expressly addressed in GDPR and is not covered in Germany by separate legislation similar to the US FCRA. Several other EU countries similarly address credit reporting in regulations or codes promulgated under their current data protection laws, and they are likely to continue these sectoral specifications in implementing the GDPR.
Email marketing and Cookies. The rules governing online marketing are now affected by the stricter transparency and consent requirements of the GDPR, and it remains to be seen whether those will be interpreted differently by the various national data protection supervisory authorities. Those authorities are also still applying national regulations (such as PECR in the UK) based on the EU ePrivacy Directive. It is expected that this directive will be replaced later this year or early next year by an ePrivacy Regulation aligned with the GDPR. The ePrivacy Regulation may result in more uniformity in website cookies rules and email marketing rules, for example, but with a wider jurisdictional reach similar to the GDPR, applying to operators that do not use servers located in the EU but target EU consumers. For now, operators outside the EU with an EU audience or marketing list should be updating their privacy policies and consent mechanisms to comply with GDPR, keeping up with national email marketing rules – and watching the horizon.
Behavioral marketing / profiling / adtech. A recurring theme in the legislative history of the GDPR was concern over the collection and sharing of information about browsing and shopping habits, personal preferences, and social media posts. The European Commission and Parliament were intensely interested in the rise of online ad networks that aggregate publisher ad space on web pages and mobile apps, along with software-based services that promise to refine the matchmaking process of delivering the most relevant advertising to individual consumers. They were well aware of the trend toward behavioral marketing (often termed “online behavioural advertising” or “OBA” in Europe), and they deliberately crafted the jurisdictional provisions of Article 3 so as to make the GDPR apply to companies located outside the EU, not only when they offer goods and services to EU residents but also when they “monitor” their “behavior” in the EU.
Accordingly, Article 14 of the GDPR requires certain disclosures when information is collected indirectly about an individual, including the purposes, the sources of the data, the “recipients or categories of recipients,” whether the data is transferred outside the EU and what safeguards apply (such as EU-approved standard contract clauses or Privacy Shield), data retention policies, rights of access and objection, the right to complain to a supervisory authority, the existence and consequences of automated decision-making including “profiling,” as well as a point of contact with the data controller. And if the individual data subject subsequently exercises certain rights, such as rights of correction or deletion, the controller is obliged to take reasonable steps to so inform other controllers with which the controller shared that data.
In addition to these rules providing for greater transparency, the GDPR lays down stricter requirements for obtaining and documenting consent (Art. 7) and reinforces the data subject’s right to object to any direct marketing uses of his or her data, expressly including marketing based on “profiling” (Art. 21(2)). These GDPR provisions also obviously impact adtech practices.
But despite legislators’ interest in the topic, the GDPR does not define either “profiling” or “monitoring” (are they meant to be synonymous?) and national supervisory authorities may differ in their interpretation of those terms and as to what forms of notice and consent are considered satisfactory for collecting and sharing information about website browsing and app use for marketing purposes. It will be a challenge for all parties involved in adtech to work out how to comply with the stricter transparency and consent requirements, and how to handle access requests, objections, and withdrawal of consent, given the multiple roles and relationships and the ever-changing lists of participating advertisers, publishers, and intermediaries. US operators have some relevant experience because of California’s Online Privacy Protection Act of 2003, which requires disclosure of online tracking practices and allows California residents to request details about information sharing with third parties, but GDPR has a much broader scope.
Meanwhile, some of the national data protection authorities have issued guidance about online “monitoring,” often with reference to the ePrivacy Directive and a June 2010 opinion from the Article 29 Working Party about “Behavioral Advertising.” That opinion recommended that web browsers should be designed to reject third-party cookies by default and convey “clear,” “comprehensive” information about personal data collection practices on websites. As this has not happened and is not within the control of ad networks, publishers, or adtech developers, the Working Party’s other recommendations are more germane. These include encouraging ad networks to create opt-in mechanisms that require users to accept cookies and similar technologies that track browsing across websites. According to the Working Party, the consents must be revocable, and they should expire after a stated period of time.
Since the GDPR was adopted, the concerns of European legislators and regulators have expanded from commercial direct marketing to noncommercial “profiling” issues in political campaigns, targeting victims of cyberbullying, and disseminating “fake news” (some of it possibly disguised libel of competitors or their products, as well as of political adversaries). As a result, there is the possibility that national rules will be adopted with the objective of increasing transparency and accountability in the online collection and use of personal data. This could affect ad networks as well as website, social media, and app operators in unpredictable ways, especially if the rules are not coordinated across Europe.
Industry Codes. The GDPR text and recitals refer in many places to the potential compliance role of approved codes of conduct developed by or with industry associations. See, e.g., Articles 24, 28, 32, 35, 40, 41, 46, 57, 58; Recitals 70, 77, 81, 98, 99, 148, 168. The EU institutions recognized that codes, standards, and certification programs can serve to educate both data users and the public, and they can provide greater clarity and consistency in applying the principles of the GDPR in specific contexts. It would seem that children’s sites and parental controls might be an appropriate field for European codes and standards. Adtech is another natural application for industry codes of conduct: it is technical, involves many companies across Europe and outside Europe, and it evolves more quickly than legislators can be expected to react. Industry codes linked to user-friendly seal programs, information, and mechanisms for exercising choices and resolving disputes could offer clear, efficient, and consistent solutions across Europe.
In the US, the Network Advertising Initiative (NAI) maintains a Code of Conduct for adtech companies, as well as a Mobile Application Code, and it operates a Consumer Opt-Out page for Internet users. The Digital Advertising Alliance (DAA) created the “Ad Choices” icon that members place on their websites, giving users quick and comprehensible access to information about data collection and options. DAAC is the Canadian counterpart. The Interactive Advertising Bureau (IAB) is an industry organization for media and marketing companies, which has furnished research and guidance on regulatory and self-regulatory policies for behavioral marketing.
IAB Europe has already published an Online Behavioral Advertising (OBA) Framework. The European Advertising Standards Alliance has also published OBA Best Practices, and the European Interactive Digital Advertising Alliance (EDAA), an umbrella organization for national digital advertising bodies in Europe, is also working on EU-wide standards for online behavioral advertising. Ideally, the efforts of these bodies should be coordinated and brought before the new European Data Protection Board to produce a practical, EU-wide code governing adtech. This is unlikely to happen, of course, while the debate over the ultimate shape of the ePrivacy Regulation continues, but industry could be working toward preferred and practicable solutions that can be put forward at that time.
Enforcement and liability. GDPR Article 80 says individuals may choose to be represented by nonprofit associations in proceedings before courts or data protection supervisory authorities. Article 80(2) gives the member states the option of allowing such groups independently to pursue complaints for alleged violations of the GDPR. Germany opted to grant this right to nonprofit consumer and privacy advocacy groups in its implementing legislation, while the draft UK bill does not.
Article 82 also provides for both “material and non-material” damages in claims under the GDPR, which raises the stakes for privacy litigation. Thus, in Germany, advocacy groups as well as individuals can bring claims for “moral damage” as well as pecuniary loss. Article 84 expressly permits member states to specify additional penalties that are “effective, proportionate and dissuasive,” in addition to the better known administrative penalties provided in Article 83, which can be as much as EUR 20 million or 4% of a company’s gross annual revenues (whichever is higher) for serious infringements. We will see whether any national legislatures feel a need to do so.
*****
This summary highlights the fact that European data protection rules continue to evolve and points up the value of having a data protection officer fulfilling the DPO’s “inform and alert” roles. The DPO should not simply be an individual who learns the rules set out in the text of the GDPR and then applies them to the company’s data processing operations. The organization doing business in Europe needs a DPO, or even better, a DPO team, that keeps current with all of the evolving national and EU-level standards and interpretations of relevance to the organization. This is important, because the GDPR will continue to be a composition of theme and variations.