On September 15, 2017, the Trump White House released a press release regarding the EU-U.S. Privacy Shield, reiterating that the administration “firmly believe[s] that the upcoming review [of the EU-U.S. Privacy Shield] will demonstrate the strength of the American promise to protect the personal data of citizens on both sides of the Atlantic.”

The first alliance of its kind, the EU-U.S. Privacy Shield provides a framework for the exchange of consumer personal data between the United States and countries in the European Union. Established in 2016, the framework was designed in part to enable U.S. companies to receive data from EU countries more efficiently while remaining compliant with the privacy laws that protect EU citizens. The agreement also allows companies to store EU citizens’ personal data on U.S. servers.

The “upcoming review” referenced in the White House press release is the first annual review of the Privacy Shield since its adoption, and both EU and U.S. officials stated their support for the alliance in a joint statement released September 21, 2017. According to this statement, over 2,400 organizations have joined the Privacy Shield since the program’s inception a year ago. The U.S. and EU both declared a “share[d] . . . interest in the Framework’s success and remain committed to continued collaboration to ensure it functions as intended.”

But what good is an agreement without any bite for potential violators? The Federal Trade Commission (FTC) recently signaled that it fully intends to hold companies accountable for violations of the EU-U.S. Privacy Shield.

According to an FTC press release dated September 8, 2017, three U.S. companies agreed to settle FTC charges that they “misled consumers about their participation” in the EU-U.S. Privacy Shield. The FTC alleged that these companies violated the FTC Act by “falsely claiming that they were certified to participate in the EU-U.S. Privacy Shield” when they had all “failed to complete the certification process for the Privacy Shield.” Acting FTC Chairman Maureen K. Ohlhausen warned companies that these “actions highlight the FTC’s commitment to aggressively enforce the Privacy Shield frameworks, which are important tools in enabling transatlantic commerce.” Notably, these enforcement actions are the first cases the FTC has brought to enforce the Privacy Shield.

Moving forward, companies should carefully assess whether they have completed the steps and certification necessary to make certain representations about participation in the EU-U.S. Privacy Shield, as both the FTC and the current White House administration fully intend to continue to “demonstrate the strength of the American promise” by pulling their weight in the alliance.

Back in January 2016, Sarah Thompson reported on a European Court of Human Rights (ECHR) ruling in favour of an employer that had terminated an employee’s employment after investigating his misuse of a company email account.

Earlier this week, the Grand Chamber of the ECHR overturned that ruling, finding that the Romanian employee’s right to privacy had in fact been infringed by his employer when his personal messages were read in the course of an investigation, even though they were sent using company equipment and during working hours. The Grand Chamber is the highest court of appeal, so its judgment is conclusive and represents the final decision of the European courts on this issue. As a result, Mr. Barbulescu is now entitled to compensation, although, as can be seen from the decision, the court set the amounts relatively low.

Employers should already be aware that employees have a certain right to privacy at work and must be properly informed if their communications are to be monitored and in what, if any, limited circumstances such monitoring may be conducted, always bearing in mind the need to balance employee rights and legitimate business interests.

The ECHR Grand Chamber’s decision considers this in detail and, although the judgment is lengthy, the key points benefit from the further clarification given by the court’s Q&A on the judgment. This helpful summary points out that Mr. Barbulescu’s right to private life and correspondence (protected by Article 8 of the European Convention on Human Rights) was violated because his employer failed to strike the necessary fair balance between each party’s rights, and the Romanian courts had failed to determine whether he had been properly informed that his communications could be monitored.

The Q&A also states that this decision “does not mean that employers cannot, under any circumstances, monitor employees’ communications when they suspect them of using the internet at work for private purposes. However, the Court considers that States should ensure that, when an employer takes measures to monitor employee’s communications, these measures are accompanied by adequate and sufficient safeguards against abuse.”

In early 2017, the EU Commission published a communication about Exchanging and Protecting Personal Data in a Globalized World, in which it prioritizes discussions on possible adequacy decisions with key trading partners, starting with Japan and South Korea in 2017. More specifically, on July 3, 2017, the EU Commission and a representative of the Japanese Personal Information Protection Commission met in Brussels to move forward on a possible adequacy decision.

With the recent reform of the Japanese Act on the Protection of Personal Information on May 30, 2017, and with the new EU General Data Protection Regulation (the “GDPR”, which will apply from May 25, 2018), Japan and the EU have strengthened their respective data protection regimes. As a result, the two jurisdictions now have very similar regimes and ensure a very high level of protection for personal data. This convergence offers new opportunities to pursue a dialogue on an adequacy decision.

The EU Commission considers that, in particular, the following criteria should be taken into account to assess with which countries a dialogue on adequacy should be pursued:

  • The extent of the EU’s (actual or potential) commercial relation with a given third country;
  • The extent of personal data flows from the EU, reflecting geographical and/or cultural ties;
  • The pioneering role that the third country plays in the field of privacy and data protection, which could serve as a model for other countries in its region; and
  • The overall political relationship with the third country in question.

An adequacy decision is an implementing decision taken by the EU Commission determining that a third country ensures an adequate level of protection of personal data. Once an adequate level of protection is recognized by the EU Commission, transfers can be made without specific authorizations. To date, the Commission has adopted 12 adequacy decisions, including the EU-U.S. Privacy Shield.

The EU Commission, when determining whether a third country ensures an adequate level of protection, must take into account, among other elements (GDPR, art. 45.2):

  • “the rule of law, respect for human rights and fundamental freedoms, relevant legislation, both general and sectoral, including concerning public security, defence, national security and criminal law and the access of public authorities to personal data, as well as the implementation of such legislation, data protection rules, professional rules and security measures, including rules for the onward transfer of personal data to another third country or international organisation which are complied with in that country or international organisation, case-law, as well as effective and enforceable data subject rights and effective administrative and judicial redress for the data subjects whose personal data are being transferred”;
  • “the existence and effective functioning of one or more independent supervisory authorities in the third country or to which an international organisation is subject, with responsibility for ensuring and enforcing compliance with the data protection rules, including adequate enforcement powers, for assisting and advising the data subjects in exercising their rights and for cooperation with the supervisory authorities of the Member States”; and
  • “the international commitments the third country or international organisation concerned has entered into, or other obligations arising from legally binding conventions or instruments as well as from its participation in multilateral or regional systems, in particular in relation to the protection of personal data.”

The overall evaluation does not require a level of protection identical to that offered within the EU, but requires a level of protection that is “essentially equivalent”.

Under the GDPR, an adequacy decision is not a definitive decision but one that, once adopted, requires close monitoring by the EU Commission and review, at least every four years, to take into account all relevant developments affecting the level of protection ensured by the third country.

This two-way dialogue with Japan will include exploring ways to increase convergence of Japan’s laws and practice with the EU data protection rules. The EU Commission and Japan have reaffirmed their commitment to intensify their efforts and to conclude this dialogue by early 2018.

The UK Government will introduce a new Data Protection Bill (the “Bill”) this year. As highlighted in the Queen’s speech back in June, the Government has committed to introduce the new law and, on Monday, published a Statement of Intent.

The Bill will not change the position that the EU’s new data protection legislation – the General Data Protection Regulation (GDPR) – will bring when it comes into force on 25 May 2018. The UK will still be in the EU at that time, so the GDPR will apply automatically in UK law and replace the UK’s current Data Protection Act. However, when the UK leaves the EU and is no longer subject to the GDPR, the Bill will then implement the GDPR into English law. The importance of this is two-fold: it will support the UK’s position with regard to preserving personal data flows between the UK, the EU and other countries around the world, and it will give UK businesses clarity about their data protection obligations following Brexit.

The Bill will also introduce the national member state derogations that are permitted under the GDPR. The Government asked for feedback (a Call for Views) on how the UK should deal with these exemptions earlier this year. The Statement of Intent provides some detail on the Government’s proposed approach, which includes:

  • Enabling children aged 13 years or older to consent to their personal data being processed (under the GDPR the age for valid consent is 16 unless member states reduce this through national law);
  • Maintaining the UK’s position on processing personal data relating to criminal convictions and other sensitive personal data (enabling employers to carry out criminal background checks in certain circumstances);
  • Enabling organisations to carry out automated decision making for certain legitimate functions (e.g. credit reference checks);
  • Maintaining the UK’s current position with regard to the processing of personal data in relation to freedom of expression in the media, research and archiving.

Two new criminal offences will also be created: intentionally or recklessly re-identifying individuals from anonymised or pseudonymised data, and altering records with intent to prevent disclosure following a subject access request. Both offences will be punishable by an unlimited fine.

The Bill will also implement the EU’s new Data Protection Law Enforcement Directive (DPLED) in English law. The DPLED sits alongside the GDPR and deals with processing of personal data by the police, prosecutors and other agencies involved in law enforcement. However, unlike the GDPR, the DPLED is an EU Directive (not a Regulation) and so must be implemented into member state law through national legislation by 6 May 2018.

The draft text of the Bill is due to be published and put before Parliament in early September. The Bill will be largely identical in effect to the GDPR. In light of the increased fines imposed by the GDPR (up to €20,000,000 (£17,000,000) or 4 per cent of an organisation’s global annual turnover, whichever is higher), companies should continue their GDPR compliance efforts to ensure adherence to the new law by 25 May 2018.
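
For illustration only, the “whichever is higher” cap on top-tier GDPR fines described above can be expressed as a simple calculation. The sketch below is a minimal example using hypothetical turnover figures; it is not a statement of how regulators actually set penalties, which remain discretionary and case-specific.

    # Illustrative only: upper bound of a top-tier GDPR fine, i.e. the higher of
    # EUR 20 million or 4% of global annual turnover. Figures are hypothetical.
    def max_gdpr_fine_eur(global_annual_turnover_eur: float) -> float:
        return max(20_000_000.0, 0.04 * global_annual_turnover_eur)

    print(max_gdpr_fine_eur(300_000_000))    # 20000000.0 (the EUR 20m floor applies)
    print(max_gdpr_fine_eur(2_000_000_000))  # 80000000.0 (4% exceeds the floor)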

The Article 29 Data Protection Working Party (comprising representatives from the data protection regulators in each EU Member State, the European Data Protection Supervisor and the European Commission) has issued an opinion on data processing at work (2/2017) (the Opinion). The Opinion is not legally binding, but it does provide an indication as to how EU data protection regulators will consider and interpret EU data protection law. The new EU data protection law (the General Data Protection Regulation – or the GDPR) comes into force on 25 May 2018 and will impose significant fines on non-compliant organizations (up to 4% of annual worldwide turnover or €20 million, whichever is higher) in addition to giving individuals more rights with regard to their personal data. The GDPR applies not only to EU companies; it can also apply to non-EU organizations that process EU citizens’ personal data.

The Opinion notes that in light of the increasing amount of personal data being processed in the context of an employment relationship, the balance between the legitimate interests of the employer and the privacy rights of the employee becomes ever more important. It provides guidance on a number of specific scenarios, including the use of social media during recruitment. Nowadays, employers may be tempted to view job applicants’ social media profiles as part of the recruitment process. However, according to the Opinion, employers may only use social media to find out information about a job applicant where: (a) they have a “legal ground” for doing so; (b) doing so is necessary and relevant for the performance of the position being applied for; (c) the applicant has been informed that their social media profiles will be reviewed; and (d) the employer complies with all of the data protection principles set out in the law.

What steps should your organization take if it wishes to review social media profiles as part of the recruitment process while also complying with the Opinion and EU data protection law?

In June the ICO updated its Subject Access Code of Practice, which gives guidance to data controllers on how to respond to subject access requests from data subjects. The Code itself is not legally binding, but it provides advice on good practice to promote compliance with the Data Protection Act 1998 (DPA). With less than a year to go before the introduction of the GDPR, it seems a shame that the revised Code does not address the forthcoming amendments to the law, such as the reduced time limit for responding to a subject access request (which will decrease from the current 40 days to a mere 30). The Code does, however, recommend more streamlined and user-friendly options for responding and, in addition to helpful notes on how to handle requests and deal with tricky issues, serves as a reminder of the data subject’s basic entitlements, which are to:

  • Be told whether any personal data is being processed;
  • Receive a description of the personal data, the reasons it is being processed and whether it will be given to any other organizations or people;
  • Receive a copy of the personal data; and
  • Receive details of the source of the data (where available).

For many businesses, subject access requests can be a time-consuming and frustrating aspect of data protection compliance. There is an understandable urge to ignore them, or to provide a minimal response, particularly if the request is made in the context of an existing dispute or in anticipation of litigation and disclosure/discovery of documents. However, the law states that data controllers must be prepared to make extensive efforts to find and retrieve the information requested in a subject access request, unless it would be unreasonable or would involve disproportionate effort to do so; the DPA contains an exemption to this effect. This issue has been tentatively raised in the past, but the recent cases of Dawson-Damer[1] and Ittihadieh/Deer and Oxford University[2] (both decisions of the Court of Appeal) have given the ICO the opportunity to provide more clarification on these points:

  1. Disproportionate effort is not defined in the DPA, but there may be cases where the work/expense involved in complying with a request by providing a copy of the information in permanent form exceeds the individual’s right of access to their personal data;
  2. Data controllers can take into account any difficulties in finding the information and complying with the request. (This approach is consistent with the EU concept of proportionality, but the ICO expects data controllers to balance any difficulties with the benefits the information might bring to the data subject);
  3. Data controllers have the burden of proof to show that they have taken all reasonable steps to comply with a subject access request and it would be disproportionate in all the circumstances to take further steps; and
  4. It is good practice to engage with the person making the request, to help reduce the costs and effort involved in searching for the information requested. (If there is a complaint, the data controller’s willingness to engage with the requestor will be considered).

Overall, the ICO expects data controllers to act positively towards those making a subject access request and to have readily accessible systems in place to respond to requests. Those receiving a request should deal with it promptly and fairly from the start. Subject access is a fundamental right and (as noted in the Code) an opportunity to improve customer service and delivery by increasing levels of trust and confidence, streamlining processes and providing better customer care. These aims are consistent with the GDPR, so even though the Code is not specifically targeted at compliance with the new law, companies should benefit from its up-to-date guidance.


[1] Dawson-Damer & Ors v Taylor Wessing LLP [2017] EWCA Civ 74

[2] Ittihadieh v 5-11 Cheyne Gardens RTM Co Ltd & Ors

The UK government launched its 5-year National Cyber Security Strategy in November 2016, investing a reported £1.9 billion to protect UK businesses from cyber-attacks and make the country the safest place to live and do business online. This strategy has included the opening of the National Cyber Security Centre (part of GCHQ) and the creation of campaigns to support businesses with expert guidance on cyber security, such as Cyber Aware and Cyber Essentials.

More recently, on 19 April, the government published its report into cyber security breaches, based on a survey of over 1,500 UK businesses. According to the report, just under half of all UK businesses suffered at least one cyber security breach or attack in the last 12 months, yet only 1 in 10 businesses have a cyber security incident management plan in place and only a third have a formal policy that covers cyber security risks. The average cost of a breach is said to be around £20,000, but this is a conservative estimate, and for many larger companies the cost is far higher, and not only in monetary terms. The risk of negative publicity and damage to reputation remains high, even when security measures are adopted and insurance cover is in place, so it is no wonder that businesses are confused about what to do to protect themselves and the data they hold. The danger is that companies do not sufficiently address the problems, perhaps because it seems impossible to eliminate the threat completely, or because they are put off by scaremongering tactics from InfoSec consultants or cyber insurance brokers.

Cybersecurity should be a priority for company directors. Under the Companies Act 2006, they have a duty to promote the success of the company and to exercise reasonable care, skill and diligence in the performance of their role. Failing to adopt and maintain appropriate security measures to protect personal data and confidential information against cyber-attacks could be considered a breach of these duties and expose the company and individual directors to legal liabilities, including fines and claims for compensation under data protection legislation, as well as potential action from regulators such as the ICO or, for businesses in the financial sector, the FCA.

On May 18, 2017, the European Commission imposed a “proportionate and deterrent” fine of €110 million on Facebook for providing misleading information during the Commission’s investigation under the EU merger control rules of Facebook’s acquisition of WhatsApp. This decision – which it is understood Facebook will not appeal – is an example of the importance that the Commission puts on complying with all aspects of the EU merger rules.  The information at issue concerned how Facebook would be able to use its and WhatsApp’s data.  Although the case did not directly concern the processing or use of data as such, its factual background raises data protection issues and it is notable that similarly high fines will soon be possible under the EU’s General Data Protection Regulation (GDPR) for data protection infringements.

During the acquisition notification procedure in 2014, the Commission had concerns about Facebook’s ability to establish automated matching between users’ accounts on the two services. Such matching could be a way for Facebook to introduce advertising on WhatsApp and/or to use personal data sourced from WhatsApp to improve its targeting of advertisements. From a competition perspective, this could strengthen Facebook’s position in the online advertising market and hamper competition in that market. From a data protection perspective, data subjects and data protection authorities should be informed of any such data sharing between Facebook and WhatsApp, as well as of any new processing resulting from that matching.

Facebook informed the Commission that it would be technically impossible to achieve reliable automated matching between Facebook users’ accounts and WhatsApp users’ accounts. However, WhatsApp updated its Terms of Service and Privacy Policy in August 2016, and the update included the possibility of linking WhatsApp users’ phone numbers with Facebook users’ identities. The Commission investigated and found that the technical possibility of this automatic matching of identities existed in 2014, that Facebook staff were aware of this, and that Facebook was aware of the relevance of the issue for the Commission’s investigation. Facebook’s answers in 2014 had therefore been incorrect or misleading, and a fine was justified.
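
To make the idea of automated account matching more concrete, the sketch below shows a generic, hypothetical join of two user lists on normalised phone numbers. It is purely illustrative, with made-up identifiers and numbers, and does not describe Facebook’s or WhatsApp’s actual systems or data.

    # Hypothetical sketch: joining two account lists on normalised phone numbers.
    # Generic illustration only; not a description of any company's real systems.
    import re

    def normalise(phone: str) -> str:
        # Keep digits only so "+1 (202) 555-0147" and "1-202-555-0147" compare equal.
        return re.sub(r"\D", "", phone)

    def match_accounts(service_a: dict, service_b: dict) -> list:
        # Return (service_a_id, service_b_id) pairs sharing the same phone number.
        index = {normalise(phone): user_id for user_id, phone in service_a.items()}
        return [(index[normalise(phone)], user_id)
                for user_id, phone in service_b.items()
                if normalise(phone) in index]

    service_a = {"a_user_1": "+44 7700 900123", "a_user_2": "+1 202-555-0147"}
    service_b = {"b_user_9": "44-7700-900-123", "b_user_8": "+1 (202) 555-0147"}
    print(match_accounts(service_a, service_b))
    # [('a_user_1', 'b_user_9'), ('a_user_2', 'b_user_8')]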

Separately, in a letter of October 2016, the Article 29 Working Party (WP29, which gathers all EU data protection authorities) called into question the validity, under data protection rules, of existing WhatsApp users’ consent to this change. This is because, at the time they signed up, users were not informed that their data would be shared among the “Facebook family of companies” for marketing and advertising purposes. The WP29 announced an investigation, urged WhatsApp to communicate all available information on this new data processing and required the company not to proceed with the sharing of users’ data until appropriate legal protections could be assured.

This investigation by the Article 29 Working Party demonstrates once again, against the background of the increased sanctions soon to be introduced under the GDPR, the importance of compliance with data protection law in the EU.  For example, companies engaged in a merger or acquisition should integrate data protection compliance programs (in addition to those covering, at least, general corporate, competition and bribery/corruption matters). Such programs should include at least the following measures:

  • Map and assess the privacy risk involved in new processing to be carried out in the context of the corporate transaction (due diligence audits, international transfers, etc.), as well as the privacy risk involved in new processing that will be carried out after the transaction.
  • To the extent required by law, inform the data subjects (employees, clients, stakeholders, etc.) about such new processing and its purposes, taking into account confidentiality issues.
  • Take all steps necessary to make the new data processing, data transfers and processing purposes compliant with the various applicable data protection rules.

It has been less than three years since the Court of Justice of the European Union (CJEU) decided that individuals have the right to have certain information about them removed from online search engine results. However, this so-called “right to be forgotten” is not absolute, as confirmed by the CJEU’s most recent ruling last week.

This case concerned an Italian director, Mr. Salvatore Manni, who sought to have his personal details removed from company records in an official public register. He believed that his properties had failed to sell because the companies register showed that he had been an administrator of another company that went bankrupt.

The CJEU held that Mr. Manni could not demand the deletion of his personal data from the official register because the public nature of company registers is intended to ensure legal certainty and to protect the interests of third parties. It held that this interference with an individual’s fundamental rights to a private life and to the protection of personal data was not disproportionate in the circumstances, because company registers disclose only a limited amount of personal data and company executives should be required to disclose data relating to their identity and functions within a company. The CJEU concluded that, in specific and exceptional situations, overriding and legitimate reasons may justify limiting third parties’ rights to access such data, and left it to national courts to determine whether “legitimate and overriding reasons” exist on a case-by-case basis.

This decision echoes the ruling in the 2014 Google Spain case: the right to be forgotten must be balanced against other fundamental rights and interests, such as the right to freedom of expression and the public’s right to know information about persons holding key positions within a company. The General Data Protection Regulation (GDPR), which codifies the right to be forgotten, also confirms this position. The right to be forgotten allows individuals to request the deletion of personal data in specific circumstances; however, the GDPR contains certain exemptions under which companies can refuse to deal with a deletion request, such as where the processing is necessary to exercise the right of freedom of expression, or for archiving purposes in the public interest.

Companies that receive requests from individuals asking that their personal data be deleted will need to determine, on a case-by-case basis, whether or not the data should be erased. When considering such erasure requests, organizations will be required to balance the request against any competing rights.

See also:

UK’s First Ever Right To Be Forgotten Enforcement: Google In the Firing Line Again

The French Data Protection Authority Puts Google On Notice To Delist Domain Names Beyond Site’s EU Extensions

The CJEU’s Google Spain Decision: A Right to be Forgotten Within the Limits of the Freedom of Expression

Costeja’s Revenge: Orders to Delete Accurate Data and the Right to be Forgotten in the EU

Between the invalidation of the Safe Harbor by the Court of Justice of the European Union (CJEU) and the adoption of the Privacy Shield, a number of data exporters came to rely on the Standard Contractual Clauses (SCC) as the safest tool for transferring personal data from the EU to the U.S. But as noted in our previous blog posts, the validity of both the SCC and the Privacy Shield must still pass the EU legal test with regard to the fundamental right to data protection.

Indeed, while the Privacy Shield faces an action for annulment brought by Digital Rights Ireland before the CJEU, it is now the turn of the SCC to be examined, in the context of a complaint filed by Maximilian Schrems against Facebook Ireland Limited with the Irish data protection authority (DPA). The DPA has brought that case before the Irish High Court, which is now considering whether to refer the question to the CJEU.

On May 24, 2016, the Irish DPA issued a draft decision summarizing its concerns about the validity of the SCC. It is worth noting that this marks a turning point for the Irish DPA: the former Irish Commissioner, Billy Hawkes, defended the Safe Harbor against Maximilian Schrems and some other DPAs, whereas the new Irish Commissioner, Helen Dixon, now takes essentially the opposite position, despite some improvements in U.S. law and in the SCC since the invalidation of the Safe Harbor. This may be a sign of an evolution driven by the EU General Data Protection Regulation, the strong, unified new piece of data protection legislation that will apply from May 2018.

The Irish DPA’s main concern about the use of the SCC is the absence, under U.S. law, of an effective judicial remedy allowing EU citizens to enforce their right to data protection where there is a risk that their personal data will be processed by U.S. state agencies for national security purposes. Indeed, even where an EU citizen meets the criteria for a remedy against surveillance under the U.S. Foreign Intelligence Surveillance Act, U.S. court decisions suggest that they cannot sue the U.S. government.

Concerning the Privacy Shield, it is too soon to know whether it will survive the new U.S. political era. As was the case with the now-defunct Safe Harbor, strong voices are emerging on both sides, pitting industry and the EU and U.S. Privacy Shield negotiators (in favor) against EU civil society, some members of the European Parliament and some DPAs (against).

The key issue ultimately lies in whether U.S. legislation grants data subjects enforceable data protection rights that EU authorities and courts would consider at least equivalent to those granted in the EU. The two cases mentioned above, as well as the economic stakes of EU-U.S. data flows, should put strong pressure on the U.S. government to provide additional guarantees.

For more information on the future of the Privacy Shield and SCC, please refer to the following prior Password Protected blog posts:

Expected Soon: Modifications of the Standard Contractual Clauses

Is the Privacy Shield Viable? Article 29 Working Party Proposes to Wait for Its Final Verdict

New Threat to Transatlantic Personal Data Transfers: Possible Invalidation of Standard Contractual Clauses

WP 29 Expresses Concerns About EU-U.S. Privacy Shield