What is the GDPR?
The General Data Protection Regulation (GDPR) is perhaps the most sweeping change in the law of data in the past two decades. We asked 129 law professors questions about the GDPR and conducted our own independent research in order to compile this report. The report finds that many technology companies are wildly underestimating their responsibilities under the new law.
Prior regulations in both the EU and the US were enacted well before the Internet became a mainstream part of everyday life. Internet companies and other organizations must now interpret the GDPR and make adjustments to their services and business models to comply. If they fail to do so adequately, large companies could face billions in fines and potential bankruptcy. The GDPR will take effect on May 25, 2018.
Who has GDPR rights?
The question of who can legally exercise GDPR rights is perhaps the most interesting point of contention surrounding the GDPR. First, the GDPR does not just apply to citizens of an EU country. It applies to anyone who at any time set foot in an EU country and transmitted their data to a covered Internet company. So, a US tourist who visits Germany for one day and returns to the US has rights under the law if that person used, for example, Facebook while on the trip.
However, the biggest source of potential controversy is whether it is illegal “national origin discrimination” under US law to give GDPR rights to immigrants from the EU and not to everyone. The Civil Rights Act of 1964 outlawed discrimination on the basis of race, color, religion, sex, or national origin. It applies in several contexts, such as employment and in “any place of public accommodation.” Few, if any, Internet companies have triggered lawsuits under this public accommodation prong. It has, to date, been exceedingly rare for such organizations to discriminate on such bases.
However, the Americans with Disabilities Act (ADA) includes very similar language. It prohibits discrimination against disabled persons in “any place of public accommodation” as well. When one statute takes the exact language from a prior similar statute, courts will often find Congress intended the language to have the same meaning.
There have been several high-profile cases alleging Internet organizations have discriminated against disabled persons by operating inaccessible websites and services. Quite a few have been very successful. An ADA lawsuit against Netflix spurred it to massively increase the amount of closed-captioning available to the deaf. An ADA lawsuit against Target led to almost $10 million in payments by Target plus mandatory changes to the target.com website to make it more accessible to visually impaired persons.
Federal courts of appeal are divided on whether the “place of public accommodation” language applies to purely Internet websites. For example, the Seventh Circuit Court of Appeals said “the core meaning of Title III [of the ADA], plainly enough, is that the owner or operator of a store, hotel, restaurant, dentist’s office, travel agency, theater, Web site, or other facility (whether in physical space or in electronic space) that is open to the public cannot exclude disabled persons from entering the facility and, once in, from using the facility in the same way that the nondisabled do.”
Similarly, the First Circuit Court of Appeals said that “it would be irrational to conclude that persons who enter an office to purchase services are protected by the ADA, but persons who purchase the same services over the telephone or by mail are not.” Other courts of appeal have concluded otherwise. For example, the Ninth Circuit Court of Appeals said, “some connection between the good or service complained of and an actual physical place is required.”
To recap, the Civil Rights Act of 1964 bans discrimination on the basis of national origin in places of public accommodation. These public accommodations are sometimes ruled to include websites. The GDPR provides rights that disproportionately benefit people whose national origin is an EU country and who move to the US.
If you were born in France and study abroad in the US, you are much more likely to have rights directly under the GDPR than someone born in Texas. However, national origin discrimination is itself illegal when done by public accommodations.
The next question is whether a court would view a disproportionate advantage of immigrants to be discrimination. After all, any person can gain rights directly under the GDPR by traveling to the EU. In the employment context, courts have viewed the Civil Rights Act to cover policies that do not explicitly ban people based on a protected class status, so long as the policy disproportionately burdens members of that class. This is called the “disparate impact” doctrine.
The Department of Justice says that when it comes to employment, “The first step in analyzing any disparate impact case is determining whether the [employer’s] criteria or method of administering its programs or activities adversely and disparately affect members of a protected class.”
If a company like Facebook says the only people in the US with new data privacy rights are those who have been in Europe, then that is an “adverse” action against all the people without these data privacy rights. It also disparately (as in disproportionately) affects people who were born in countries other than EU countries. Some courts have said the “disparate impact” doctrine applies to public accommodations and some have ruled otherwise.
Finally, organizations might try to argue it is a “business necessity” to have these policies because the GDPR requires them. Business necessity can be a defense to otherwise discriminatory policies. However, the GDPR does not say you cannot give rights to people who have never been to the EU. It does not affect that question one way or the other. While it is a necessity to provide these rights to those from the EU, it is not a necessity to refrain from providing the same rights to everyone.
To recap, the GDPR requires that Internet organizations give rights to people from EU countries. The Civil Right Act of 1964 prohibits discrimination on the basis of national origin. Discrimination can arise even when a policy is neutral on its face yet disproportionately impacts members of a protected class. Several courts of appeal have concluded websites are public accommodations and have anti-discrimination obligations. Further, while it may be a business necessity to give rights to people from the EU, it is not a necessity to refrain from giving those rights to all people.
Cautious Internet organizations are encouraged to avoid discriminating against their users in the provision of the rights outlined in the GDPR. If that means someone from New York has equal rights to data privacy as someone from Spain, it seems like a reasonable cost compared to a discrimination lawsuit.
Organizations with high revenue and low profit face enormous legal risk for violating the GDPR.
Silicon Valley is known for creating companies that generate billions in revenue and little if any profit. This has worked well for many such organizations. However, these organizations could face massive fines or even bankruptcy if they are not careful with users’ data. Facebook, Alphabet (the owner of Google), Twitter, and other popular services have filed statements with the Securities and Exchange Commission (SEC) noting the significant new risks the GDPR brings.
Certain violations of the GDPR can lead to fines of “up to 4 % of the total worldwide annual turnover of the preceding financial year.” For example, Amazon.com reported $177.866 billion in 2017 revenue yet only $3.033 billion in profit in its latest annual filing with the SEC. A fine of 4% based on Amazon’s worldwide revenue could be $7.11464 billion, or more than two years’ worth of profit.
The GDPR explicitly requires such fines to be “effective, proportionate and dissuasive.” It is hard to overstate how important GDPR compliance is to technology organizations. It could be the difference between solvency and bankruptcy.
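The arithmetic behind this exposure is straightforward. A minimal sketch, using the Amazon figures reported in the Form 10-K cited above:

```python
# Sketch: estimating maximum GDPR fine exposure under Art. 83.
# Figures are Amazon.com's reported 2017 results (Form 10-K).
revenue_billion = 177.866  # worldwide annual turnover, USD billions
profit_billion = 3.033     # net income, USD billions

# Art. 83 caps the relevant fines at "up to 4%" of worldwide turnover.
max_fine_billion = 0.04 * revenue_billion
years_of_profit = max_fine_billion / profit_billion

print(f"Maximum fine: ${max_fine_billion:.5f} billion")   # $7.11464 billion
print(f"Equivalent to {years_of_profit:.1f} years of profit")
```

Because the cap is keyed to revenue rather than profit, a thin-margin company's maximum fine can dwarf years of earnings, which is exactly the dynamic described above.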
When may a company legally process data under the GDPR?
There are two primary bases for processing data under the GDPR. The first is when such processing is necessary for the business. Processing data is necessary in a few main cases. If the business will use the data in furtherance of its “legitimate interests” that do not conflict with the user’s fundamental rights and freedoms, then such processing is generally permitted. Data controllers relying only on the “legitimate interests” prong of the GDPR and not the “consent” prong must still inform users of what will happen to their data.
For example, websites often collect data on users who abuse the service. It may be necessary to monitor such abusive users so they can be permanently banned. Another example of necessity is compliance with a law. If a business is sued or expects to be sued, it is required to preserve evidence relating to the lawsuit. So, it may be legally unable to destroy a data subject’s data. This should not violate the GDPR.
The second basis for data processing is when the user consents. Consent must be freely given, specific to an identifiable list of data uses, and unambiguous. For example, silence, pre-ticked boxes or inactivity are not legally sufficient consent. Consent from minors involves additional rules under the law.
Overbroad consents are also legally invalid. For example, consent to use any data about the subject whatsoever for any purpose is insufficient. The key is to describe with reasonable particularity what you want to do and why. Then, it is up to the user to accept or decline. Consent is not required if the business can prove the collection of the data is for its legitimate interests.
Importantly, consent generally can be revoked at any time under the law. If ten years down the line a user withdraws consent, a company relying on such consent to store that user’s data better be able to remove it.
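The consent requirements above, freely given, specific, unambiguous, and revocable at any time, can be modeled mechanically. A hypothetical sketch (the `ConsentRecord` structure and its fields are illustrative, not drawn from any actual compliance system):

```python
from dataclasses import dataclass

@dataclass
class ConsentRecord:
    """Hypothetical record of a user's consent to specific data uses."""
    purposes: list            # each use described with reasonable particularity
    affirmative_action: bool  # silence, pre-ticked boxes, inactivity don't count
    revoked: bool = False

    def is_valid(self) -> bool:
        # Overbroad consent ("any data, for any purpose") is invalid, as is
        # consent not given by a clear affirmative action.
        return (
            self.affirmative_action
            and len(self.purposes) > 0
            and all(p != "any purpose" for p in self.purposes)
            and not self.revoked
        )

    def withdraw(self) -> None:
        # Consent can be revoked at any time; data stored solely on the
        # basis of this consent must then be removable.
        self.revoked = True

consent = ConsentRecord(purposes=["weekly email newsletter"],
                        affirmative_action=True)
assert consent.is_valid()
consent.withdraw()
assert not consent.is_valid()
```

The point of the sketch is the revocation path: a system that cannot locate and delete data tied to a withdrawn consent cannot honor the rule described above.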
What legal rights do users have?
Users (“data subjects” in GDPR parlance) have various powerful rights against organizations that collect data on them. One crucial right is the right to transparency over how and why their data is being collected. Website owners must provide detailed information in plain English as to the specific uses of this data. No more burying overbroad collection practices in legalese. Website owners must also justify this collection to their users. Likewise, organizations do not have the final say on whether such justifications are truly sufficient. Users can file complaints that could cost organizations billions of dollars if the justifications prove inadequate. Websites must also describe how users can remove their data, revoke their consent to having their data stored, and migrate their data to other platforms.
The law also introduces a powerful legal principle called the right to “data minimization.” So, if a web service can operate with a lesser amount of data, it is in fact required to do so. This may significantly impact Silicon Valley-type startups that plan to collect as much data as users are willing to provide in the hopes that one day it can be monetized. Such distant future plans for data are likely not consistent with the data minimization rights of users.
When a company relies on “legitimate interests” and not consent, users may legally challenge such legitimate interests. Upon such a challenge, the data processing must stop unless the company can articulate compelling grounds that justify continuing with the processing at issue that override the challenger’s rights. The company may also justify continued data processing on the basis that such data is required to pursue or defend against a legal claim.
Users also have special rights when their data is used by artificial intelligence or other automated means to profile them in certain situations. For example, if data is analyzed by an AI to determine whether someone should be hired, get banking services such as mortgages, or to get insurance, users have additional rights under the GDPR.
Users may formally object to decisions based on automated assessments of their risk profiles, productivity, health, behaviors, and the like. This is big news for companies that use artificial intelligence to assess individuals, as almost all such organizations rely on this type of data.
When a user exercises her rights under the law, the company generally has 30 days to comply. This is so even if a company is overwhelmed with requests and needs to hire additional employees rapidly to meet the deadline. In many cases, organizations are not permitted to charge any fees to a user who exercises rights under the law.
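The clock on such a request can be tracked mechanically. A minimal sketch, assuming the 30-day figure used in the text above (the function name is illustrative):

```python
from datetime import date, timedelta

def response_deadline(request_date: date, days: int = 30) -> date:
    # The deadline runs from the date the request is received, regardless
    # of how many other requests the organization is juggling; staffing
    # problems do not extend it.
    return request_date + timedelta(days=days)

# A request received on the GDPR's effective date:
print(response_deadline(date(2018, 5, 25)))  # -> 2018-06-24
```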
The GDPR is sending shockwaves throughout the world of technology businesses and nonprofits. However, organizations may still be wildly underestimating the scope of the GDPR. In particular, it may be illegal “national origin discrimination” under Title II of the Civil Rights Act of 1964 to provide rights to people from EU countries and not others. Companies should take a conservative approach to the GDPR. Users have substantial legal rights, including to transparency of how data will be used and minimization of collected data. Fines of as much as 4% of worldwide revenue can bankrupt companies or other organizations with high revenues that do not also make enormous profits.
Alexander Stern earned his Doctor of Law degree from UC Berkeley School of Law in 2015. He is an attorney and the founder of the Attorney IO family of companies.
 Doe v. Mutual of Omaha Insurance, 179 F.3d 557, 559 (7th Cir. 1999)
 Carparts Distribution Center v. Automobile Wholesaler’s Association of New England, 37 F.3d 12, 19 (1st Cir. 1994)
 Weyer v. Twentieth Century Fox Film, 198 F.3d 1104, 1114, 1115 (9th Cir. 2000)
 Some courts have found that the Civil Rights Act authorizes disparate impact claims against public accommodations. See Olzman v. Lake Hills Swim Club, Inc., 495 F.2d 1333, 1341–42 (2d Cir. 1974); Robinson v. Power Pizza, Inc., 993 F. Supp. 1462, 1464–66 (M.D. Fla. 1998), while others have rejected disparate impact in this context. See Akiyama v. U.S. Judo Inc., 181 F. Supp. 2d 1179, 1187 (W.D. Wash. 2002); LaRoche v. Denny’s, Inc., 62 F. Supp. 2d 1366, 1370 n.2 (S.D. Fla. 1999).
 GDPR Art. 83
 Form 10-K of Amazon.com, Inc. dated February 1, 2018
 GDPR Art. 83
 GDPR Art. 6.
 GDPR Art. 4 point 11 and Art. 7
 GDPR Art. 8.
 GDPR Art. 5
 GDPR Art. 77
 GDPR Art. 5
 GDPR Art. 22
 GDPR Art. 3 point 3.