Conference organizers collecting attendees' data in Europe without doing the necessary due diligence over data protection risks, beware: The organizers of the global connectivity industry shindig Mobile World Congress (MWC), which takes place annually in Barcelona, have been fined €200,000 (~$224k) by Spain's data protection watchdog over a breach of privacy rules during the show's 2021 edition.
In an 8-page resolution (PDF in Spanish) dismissing an appeal by MWC's organizer, the GSMA, against the infringement finding, the Agencia Española de Protección de Datos (AEPD) concludes the GSMA infringed Article 35 of the General Data Protection Regulation (GDPR), which sets out the requirements for carrying out a data protection impact assessment (DPIA).
The breach finding relates to biometric data collection by the GSMA on show attendees, including for a facial recognition system it implemented (called BREEZZ), which offered attendees the option of using automated identity verification to enter the venue in person, rather than manually showing their ID documentation to staff.
If you cast your mind back to 2021, you'll recall MWC that year took place at a time when COVID-19 pandemic-related concerns over attending in-person events were still riding high. Not that that stopped MWC's organizer from going ahead with a physical conference in the summer of that year, months later than the show's usual timing and in a considerably slimmed-down form, with far fewer exhibitors and attendees than in years past.
In fact, fewer than 20,000 registered attendees were at MWC 2021 in person (17,462 to be exact), per GSMA disclosures made to the AEPD, and of those just 7,585 actually used the BREEZZ facial recognition system to access the venue. The majority apparently opted for the alternative of manual checks of their ID documents. (Albeit, with MWC 2021 taking place in the midst of the pandemic, the GSMA also offered virtual attendance, with conference sessions streamed to remote viewers; no ID checks were required for that type of attendance.)
Returning to the GDPR, the regulation requires that a DPIA is carried out proactively in situations where processing people's data carries a high risk to individuals' rights and freedoms. Facial recognition technology, meanwhile, entails the processing of biometric data, which, where it is used to identify individuals, is classed as special category data under the GDPR. This means the use of biometrics for identification inevitably falls into the high-risk category requiring proactive assessment.
This assessment must consider the necessity and proportionality of the proposed processing, as well as examining the risks and detailing the measures envisaged to address those identified. The GDPR puts the emphasis on data controllers conducting a robust and rigorous proactive assessment of risky processing, so the fact that the AEPD found the GSMA breached Article 35 indicates it failed to demonstrate it had done the required due diligence in this regard.
In fact the regulator found the GSMA’s DPIA to be “merely nominal”, per the resolution — saying it failed to examine “substantive aspects” of the data processing; nor did it assess risks or the proportionality and necessity of the system it implemented.
“What the resolution concludes is that a [DPIA] that does not contemplate its essential elements is neither effective nor fulfils any objective,” the AEPD adds, confirming its view that the GSMA’s DPIA did not fulfil the GDPR’s requirements [NB: this is a machine translation of the original Spanish text].
More from the AEPD’s resolution:
The [GSMA’s DPIA] document lacks an assessment of the necessity and proportionality of the processing operations with respect to its purpose; the use of facial recognition for access to events, its assessment of the risks to the rights and freedoms of data subjects referred to in Article 35(1) of the GDPR and of the measures envisaged to address the risks, including safeguards, security measures and mechanisms to ensure the protection of personal data, and to demonstrate compliance with the GDPR, taking into account the rights and legitimate interests of data subjects and other affected persons. It also lists the passport and identity card data that it states are required by the Mossos d'Esquadra [local police] which allegedly have a purpose, in order to connect it with the photo taken with the software, which initiates the process of facial recognition, matching your identity to facilitate access.
A description of the GSMA’s DPIA in the AEPD’s resolution suggests that as well as failing to conduct an adequate assessment, the GSMA leaned on a security justification for collecting show attendees’ passports/EU ID documents, saying it had been instructed by Spanish police to put in place “strict processes” for identity-screening attendees.
It also appears to have required attendees to consent to biometric processing of their facial data as part of the ID upload process: The AEPD notes consent wording in BREEZZ that asked the individual for their consent to it using “biometric data obtained from the photographs provided for identification validation purposes in the context of online registration and MWC Barcelona for venue access purposes”.
This is important, since the GDPR sets a clear bar for consent to be a valid legal basis, requiring that it is informed, specific (i.e. not bundled) and freely given. Ergo, you can’t force consent. (And consent for processing sensitive data like facial biometrics has an even higher bar: It must be explicit for the processing to be lawful.)
It was the lack of a free choice for conference attendees around uploading sensitive biometric data that led to a complaint against the GSMA’s data processing being lodged with the AEPD by Dr Anastasia Dedyukhina, a digital wellness speaker who had been invited to speak on a panel at MWC 2021. It is her complaint that has led, a couple of years later, to the GSMA now being sanctioned.
“I could not find a reasonable justification for it,” she explained late last week, when she made her complaint public, discussing what she felt was a disproportionate demand by the GSMA that MWC attendees upload ID documents. “Their website suggested that I could also bring my ID/passport for in-person verification, which I didn’t mind. However, the organizers insisted that unless I upload my passport details, I COULD NOT attend the live event and would need to join virtually, which I ended up doing.”
Technologist Adam Leon Smith, who co-authored her complaint, also wrote about the case, warning: “Facial recognition in public spaces is highly sensitive and if you really need to use it, use an excellent lawyer and tech team.”
“The AEPD was able to request internal privacy assessment documents from MWC, and was able to see that it was outdated and insufficient. The AEPD’s decision mostly focusses on that,” he also said. “There wasn’t any other specific remedies, although I think the MWC will need to conduct that risk and impact assessment very carefully.”
While the Spanish data protection regulator’s resolution does not weigh in on whether the GSMA’s legal basis for the biometric processing was valid or not, Smith suggests that may simply follow from the finding that the DPIA was inadequate, i.e. the regulator might have decided a fuller technical assessment was not worthwhile.
“I would not be surprised if they abandoned the use of facial recognition technology,” he suggested of the GSMA. “This kind of application of the technology would fall within the high-risk category in the latest drafts of the [EU] AI Act, that means they would need some form of conformity assessment by an independent party.”
The GSMA was contacted for comment on the AEPD’s penalty but at the time of writing it had not responded.
It’s worth noting that while the AEPD’s administrative process on this complaint concludes with this resolution, the GSMA could seek to challenge the outcome via a legal appeal to the Audiencia Nacional (Spain’s National High Court).
Zooming out, as Smith points out, the incoming pan-EU AI Act is set to bring additional rules for this kind of biometric technology in the coming years.
The draft version of this legislation proposed by the Commission includes a prohibition on the use of remote biometrics, like facial recognition, in public places, which, if it makes it into the final version, will certainly crank up the regulatory risk around implementing automated verification checks in the future. (Add to that, parliamentarians have been pushing to strengthen the ban.) And that’s on top of existing GDPR risks for any data processors taking a sloppy approach to risk due diligence (or indeed the hard requirement to have a valid legal basis for such sensitive data processing).
For its part, the GSMA has continued to provide a facial biometrics-based automated ID check option for attendees of MWC (both this year and last), as well as continuing to require ID document uploads for registration for in-person attendance. So it will be interesting to see whether it amends its privacy disclosures and/or makes changes to the registration process for MWC 2024 in light of the GDPR sanction. (And, if it does continue offering a biometrics-based automated ID check option at the show in future, it may be well advised to ensure its technology supplier is wholly located inside the EU.)