GDPR: New criteria for permitted video surveillance and biometric data

In brief:
Controllers and processors in Austria should now check their video surveillance systems for GDPR compliance. This is because the Austrian Federal Administrative Court (BVwG) has found some of the provisions of the Austrian Data Protection Act concerning video surveillance to be inapplicable. In addition, in January 2020 the European Data Protection Board (EDPB) published the final version of its guidelines on video surveillance. The new guidelines also contain interesting examples regarding the processing of biometric data, in particular facial recognition.


The particularly high level of attention given to video surveillance systems by the media and data protection authorities, compared to other important privacy topics, might appear greatly exaggerated. The excellent documentary Tokyo 2020 – The Price of Security, broadcast on FS 1 on 6 February 2020 ( ), might change this opinion. Whatever one may think of the data protection rules for video systems, there are new reasons for operators to look into the subject again:

  1. In November 2019, the Austrian Federal Administrative Court (BVwG) ruled in two decisions – neither legally binding yet – that current provisions of Austrian data protection law on image processing (DSG § 13 and § 12 para. 4 no. 1) were not applicable because they do not comply with the European GDPR (BVwG, decision of 20.11.2019, W256 2214855-1, and BVwG, decision of 25.11.2019, W211 2210458-1).
  2. In 2020, the European Data Protection Board (EDPB) published the final 33-page version of its Guidelines 3/2019 on processing of personal data through video devices, version 2.0, adopted on 29 January 2020 ( ).

The EDPB is composed of representatives of all data protection authorities in the EU. It issues guidelines containing legal opinions on which the data protection authorities base their decisions. Although the companies and authorities concerned can have such legal opinions reviewed by challenging those decisions, the guidelines are nevertheless of great importance.

In addition to abstract explanations, the Guidelines 3/2019 also contain numerous practical examples of how video surveillance may – or may not – be operated in conformity with the GDPR.


Lawful processing of video surveillance

In addition to a legal obligation for video surveillance, the processing can be lawful on the grounds of

  • legitimate interests pursued by the controller or by a third party, and
  • performance of a task carried out in the public interest or in the exercise of official authority vested in the controller.

Legitimate interests can be of a legal, economic or non-material nature. The protection of persons or property against robbery, theft or vandalism can constitute a legitimate interest. To prove the necessity of this protection, concrete previous threats or damage (e.g. also in the neighbourhood) should be documented. Banks, jewellers and petrol stations are listed as typical examples of imminent danger.

Data subjects must be informed in good time about the nature of the legitimate interests pursued by the controller or by a third party. If a data subject objects to processing on the basis of legitimate interests, the controller must document its “compelling” legitimate interest in the video surveillance. The Guidelines – regrettably – give no example of such a “compelling” legitimate interest.

When weighing the interests of controller and data subject, the reasonable expectations of the data subject with regard to surveillance in general and its nature must also be taken into account (e.g. no surveillance in toilets).

Consent of data subjects can only be considered as a legal basis for systematic surveillance in exceptional cases. This is because it will be difficult for a controller to prove (principle of accountability!) that the many data subjects gave their (informed) consent in time – and, in the event that consent is withdrawn, that the processing of that data subject's personal data was stopped immediately.

For employees, explicit “works agreements” may constitute valid consent under the Guidelines. Unfortunately, this English term is not explained in detail.

Any transfer of video data to third parties also requires a lawful basis. A legitimate interest may exist in disclosing a video recording to a law enforcement agency, even in the absence of a request from the security authority (e.g. the interest of a shop operator with a well-founded suspicion that a criminal act has been recorded on the surveillance images).

If special categories of personal data (= sensitive data) are processed, the Guidelines require that an exception permitting processing under Art. 9 GDPR be identified – in addition to the legal basis under Art. 6 GDPR.

A high level of security is of course required when processing sensitive data.


Facial recognition and other biometric data

A separate section of the Guidelines is devoted to the processing of biometric data, in particular facial recognition.

Video recordings are not in themselves biometric data under Art 9 GDPR. Biometric data only arise through specific technical processing that contributes to uniquely identifying the individual.

Facial recognition that does not create biometric records in order to identify persons individually, but merely sorts them into categories such as age, gender, etc., does not generate biometric data within the meaning of Art 9 GDPR.

When processing biometric data, particular attention must be paid to lawful processing, necessity, proportionality and data minimisation. Whether less intrusive means could achieve the same objectives must always be examined first.

In most cases, only valid consent will be considered as the lawful basis for the processing of biometric data for their own purposes by private controllers. Thus, data subjects can give their consent to facial recognition during an entry check. To prevent other persons from being automatically biometrically recorded, there must be a separate checkpoint for those who have consented; or those who have consented must press a special button at the general checkpoint each time to activate facial recognition.
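The accountability requirement above – proving who has consented, and stopping processing immediately on withdrawal – can be illustrated in code. A minimal sketch in Python; the class and function names (`ConsentRegistry`, `checkpoint_scan`) are hypothetical and not taken from the Guidelines:

```python
from dataclasses import dataclass, field

@dataclass
class ConsentRegistry:
    """Records which data subjects have given informed consent (accountability)."""
    consented: set = field(default_factory=set)

    def give(self, subject_id: str) -> None:
        self.consented.add(subject_id)

    def withdraw(self, subject_id: str) -> None:
        # On withdrawal, processing of this subject's data must stop immediately.
        self.consented.discard(subject_id)

def checkpoint_scan(registry: ConsentRegistry, subject_id: str, button_pressed: bool) -> bool:
    """Facial recognition runs only if the subject consented AND actively
    pressed the activation button at the checkpoint."""
    return button_pressed and subject_id in registry.consented
```

The double gate (recorded consent plus an explicit button press) mirrors the Guidelines' point that bystanders must not be captured automatically.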

Specific measures to minimise the risks when processing biometric data are mentioned:

  • Compliance with purpose limitation, no further processing! Ensuring that templates do not leave the biometric system.
  • Templates required for identification, authentication or verification purposes must be stored in the most appropriate location: wherever possible and reasonable, with the user (on a smartphone or ID card), or encrypted in a centralised database with a key held only by the data subject. If anyone other than the data subject has access to the templates, encryption of the templates and special access authorisations should be considered.
  • Ensuring availability, integrity and confidentiality of the biometric data.
  • Sealing off the biometric data in “cryptainers” (encrypted vaults) during transmission and storage
  • Storage of biometric templates and raw or identity data in different databases
  • Encryption of biometric data, especially of templates
  • Encryption policy and key management
  • TOMs for fraud detection
  • Assignment of the biometric data to an integrity code (signature or hash function)
  • Preventing external access to biometric data
  • Adaptation to technological progress
  • Deleting raw data (facial images, voice signals, way of walking etc.) and biometric data and ensuring the effectiveness of the deletion process
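One of the measures above is assigning biometric data an integrity code (signature or hash function). A minimal sketch of that single item in Python, using a keyed hash (HMAC-SHA256) from the standard library; the helper names (`protect_template`, `verify_template`) are hypothetical:

```python
import hashlib
import hmac
import secrets

TAG_LEN = 32  # length of an HMAC-SHA256 digest in bytes

def protect_template(template: bytes, key: bytes) -> bytes:
    """Attach an integrity code (HMAC-SHA256) to a biometric template."""
    tag = hmac.new(key, template, hashlib.sha256).digest()
    return tag + template  # tag is stored alongside the template

def verify_template(blob: bytes, key: bytes) -> bytes:
    """Check the integrity code before the stored template is used."""
    tag, template = blob[:TAG_LEN], blob[TAG_LEN:]
    expected = hmac.new(key, template, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("template integrity check failed")
    return template

key = secrets.token_bytes(32)  # per-system secret key, managed per key policy
blob = protect_template(b"\x01\x02\x03", key)
```

An HMAC only demonstrates tamper detection; the encryption, key management and access-control items in the list would be layered on top of it.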


Rights of data subjects

The Guidelines contain general information on requests for access and erasure and on objections to processing on the basis of legitimate interests.


Transparency and information requirements

The template for typical first-layer information (a video information sign) on page 27, para. 116, appears practical.

Easy access to the complete (digital and non-digital) second-layer information (pursuant to Art 13 GDPR) must be guaranteed.


Storage periods and obligation to erasure

In the absence of specific legal provisions in the member state, the storage period depends on the purpose of the video surveillance. In the case of surveillance for the purpose of protecting property or securing evidence, damage can usually be detected within one or two days. Data minimisation and storage limitation therefore usually require deletion after a few days. In a shop, for example, closed weekends or longer holiday closures may justify longer storage.

In any case, storage for more than 72 hours must be clearly defined and justified for each individual purpose according to the principles of necessity and proportionality; and data subjects must be informed accordingly.
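A per-purpose retention rule like the one above lends itself to automation. A minimal sketch in Python, assuming recordings are stored as files in one directory per purpose; the purposes, periods and function name (`purge_recordings`) are illustrative, not prescribed by the Guidelines:

```python
import time
from pathlib import Path

# Hypothetical documented purposes and their retention periods (in seconds).
RETENTION = {
    "property_protection": 72 * 3600,   # default: at most 72 hours
    "holiday_closure": 96 * 3600,       # documented, justified longer period
}

def purge_recordings(root: Path, purpose: str, now: float = None) -> list:
    """Delete recordings for a purpose once their retention period expired."""
    now = time.time() if now is None else now
    limit = RETENTION[purpose]
    deleted = []
    for f in (root / purpose).glob("*.mp4"):
        if now - f.stat().st_mtime > limit:  # older than the retention period
            f.unlink()
            deleted.append(f)
    return deleted
```

Running such a job daily (e.g. from cron) helps demonstrate storage limitation under the accountability principle.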


Technical and organisational measures (TOMs)

The Guidelines refer to the necessity of measures to protect data security in accordance with Art 32 GDPR.

TOMs however must also explicitly guarantee compliance with data protection principles (lawfulness, purpose limitation and storage limitation, data minimisation through privacy by default, integrity and confidentiality as well as accountability, etc.); and compliance with the rights of the data subject (Art 15-22 GDPR). This must also be observed in the case of new acquisitions.

The measures must apply to all components of the system and all data processed therein over their entire life cycle.

The technical solutions should enable masking (covering) and scrambling of areas not required for surveillance as well as cutting out third parties.
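The masking requirement can be sketched in a few lines. A toy example in Python, assuming a frame is a row-major array of greyscale pixel values; the function name (`mask_region`) and the frame layout are assumptions for illustration only:

```python
def mask_region(frame, top, left, height, width, fill=0):
    """Cover (black out) a rectangular area that must not be monitored,
    e.g. a neighbouring property visible at the edge of the frame."""
    for r in range(top, min(top + height, len(frame))):
        for c in range(left, min(left + width, len(frame[r]))):
            frame[r][c] = fill
    return frame

# Toy 4x6 greyscale frame; mask a 2x3 region before storage or display.
frame = [[255] * 6 for _ in range(4)]
mask_region(frame, top=1, left=2, height=2, width=3)
```

In a real system the mask would be applied by the camera or video management software before recording, so that the excluded area never enters storage.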

Functions not required for the purpose of surveillance, such as unrestricted movement of the camera, zoom, radio transmission, analysis and sound recording shall be absent or disabled.

Numerous possible organisational and technical measures are listed in the Guidelines; their use should be considered.


Data protection impact assessment

Due to the often high risks that video surveillance can pose to the rights and freedoms of data subjects, many video surveillance systems require a data protection impact assessment (DPIA, see Art 35 GDPR).

Member state regulations listing processing operations for which a DPIA is required in any case must be carefully examined.

In Austria, this is the DSFA-V, BGBl II No. 278/2018 (the so-called “black list”). Specific image processing operations are listed there, in particular in section 2 para. 2 subparas. 3 to 5.

Explicit exceptions to the obligation to carry out a DPIA for certain image processing operations can be found in the “white list”, the DSFA-AV regulation, BGBl II No. 108/2018 (see DSFA-A08 to A11).


Markus Frank