In its guidance following the Schrems II judgment, the EDPB has outlined a six-step approach to ensure appropriate safeguards for the transfer of personal information to third countries (i.e. countries outside the EEA and without an adequacy decision). The third step is particularly relevant to the issue of a so-called “risk-based approach” to data transfers.
This approach aims to assess the threat level in order to identify and implement appropriate mitigating technical, organizational and legal safeguards for the transfer, so as to establish a level of protection of the personal information comparable to that of the EEA. The guidance in the third step (paras 41 and 42) states that the assessment of the relevant third country’s legislation should be based on “objective factors”, i.e. that:
- The lack of an essentially equivalent level of protection will be especially evident where the legislation or practice of the third country relevant to your transfer does not meet the requirements of the European Essential Guarantees.
Your assessment must be based first and foremost on legislation publicly available. However, in some situations this will not suffice because the legislation in the third countries may be lacking. In this case, if you still wish to envisage the transfer, you should look into other relevant and objective factors, and not rely on subjective ones such as the likelihood of public authorities’ access to your data in a manner not in line with EU standards. You should conduct this assessment with due diligence and document it thoroughly, as you will be held accountable to the decision you may take on that basis (emphasis added).
It appears that this is not entirely in accordance with the view of the French Conseil d’État.
Summary judgment of the Conseil d’État
On October 13, 2020, the Conseil d’État (the highest administrative court in France) issued a summary judgment rejecting a request for the suspension of France’s centralized health data platform, the Health Data Hub (the “HDH”). The HDH was hosted by Microsoft in the Netherlands. The judge observed that personal data hosted in the Netherlands under a contract with Microsoft cannot legally be transferred outside the European Union. He further stated that, even if the risk that the American intelligence services request access to this data cannot be completely excluded, this does not justify, in the very short term, the suspension of the Platform. He therefore called for additional guarantees to be implemented under the control of the French data protection authority (the “CNIL”).
The Health Data Platform, a public body also called “Health Data Hub”, was created at the end of November 2019, to facilitate the sharing of health data in order to promote research. Some of this data is used in particular for the needs of managing health emergencies and improving knowledge about the covid-19 virus. On April 15, 2020, the Platform signed a contract with an Irish subsidiary of the American company Microsoft for the hosting of data and the use of software necessary for their processing.
Several associations, unions and individual applicants had asked the urgent applications judge of the Council of State to suspend the processing of data related to the covid-19 epidemic on the Health Data Platform, due to the risks that this situation entails with regard to the right to respect for private life, given the potential transfers of data to the United States.
The judge of the Council of State noted that the Health Data Platform and Microsoft had undertaken, by contract, to refuse any transfer of health data outside the European Union. A ministerial decree of October 9, 2020 also prohibits any transfer of personal data under this contract.
The judge noted that it cannot be completely ruled out that the American authorities, in the context of surveillance and intelligence programs, request access to certain data from Microsoft and its Irish subsidiary.
First of all, however, the judgment stated that the CJEU has not, to date (not even in the Schrems II case), ruled that European data protection law prohibits entrusting data processing, on the territory of the European Union, to an American company.
In addition, the judge stated that:
- A violation of the GDPR in such a setup would be hypothetical, as it would assume that Microsoft would not be able to oppose a possible request by the US authorities.
- Health data is also pseudonymized before it is hosted and processed by the Platform.
- Finally, there is an important public interest in allowing the continued use of health data for the needs of the covid-19 epidemic thanks to the technical means available to the Platform.
Consequently, the judge of the Council of State in these interlocutory injunction proceedings did not find any serious and manifest illegality which would justify the immediate suspension of data processing by the Microsoft platform.
On the other hand, faced with the existence of a hypothetical risk, and given that the judge in this case could only determine short-term measures, he ordered the Health Data Hub to continue, under the control of the CNIL, to work with Microsoft to strengthen the protection of data subjects’ rights over their personal data.
These precautions will have to be taken while awaiting a solution that will eliminate any risk of access to personal data by the American authorities, as announced by the Secretary of State for Digital on the very day of the Council hearing (potential choice of a new subcontractor, use of a license agreement suggested by the CNIL, etc.).
It should be noted that even though the EDPB carries significant authority in its interpretation of the GDPR, it is still only an interpretation; it is not “the law”.
It will be interesting to see more case-law interpretation of the GDPR from the European courts, like this one from the French Conseil d’État!
 Based on the press release of the Conseil d’État
 The French Data Protection Authority – Commission nationale de l’informatique et des libertés
 The French Minister for digital affairs