
Data protection amendment: Police and civil rights activists favour ban on facial recognition

At a hearing in the Bundestag on the federal government’s draft bill for the first amendment to the Federal Data Protection Act (BDSG), much of the discussion centred on a point that is not yet part of the proposal: a ban on biometric surveillance, for example through automatic facial recognition in public spaces. Matthias Marx of the Chaos Computer Club (CCC) and Simone Ruf of the Society for Civil Rights (GFF) explicitly called for such a provision to be included. Surprisingly, constitutional lawyer Eike Richter of the Hamburg Police Academy also made it clear that he has no constitutional concerns about such a ban; the legislature may even be constitutionally obliged to include one.

During the negotiations on the EU’s new AI Act, live facial recognition in particular was a hotly contested issue. The European Parliament initially called for a general ban on biometric mass surveillance, but the member states opposed it. The final version stipulates that real-time identification should be permitted “limited in time and space”, in particular for the targeted search for victims of kidnapping, human trafficking and sexual exploitation, or to address “a concrete and present terrorist threat”.

The traffic-light coalition agrees that it does not want to use the remaining backdoor in the AI Act for biometric surveillance, but rather wants to restrict tools such as automatic facial recognition. In fact, such forms of remote identification “are currently prohibited in this country until they are permitted” for the police, Richter explained. At present, however, this is evidently an “ineffective reservation”. Where this does not work, an explicit ban is preferable, in view of the interference with the right to informational self-determination associated with such tools. The police lawyer drew a parallel to the ban on torture. The AI Act is aimed specifically at regulating technological systems; nevertheless, the legal restriction “can remain broad” and could also cover the private sector.

Cases are increasingly being reported in which police officers use biometric surveillance systems inappropriately and without a legal basis. The CCC’s Marx pointed to the covert observation system Paris, developed on behalf of the Saxon police, which is now also used by prosecutors in other federal states. The police have also evaded democratic scrutiny: Paris only became more widely known in Saxony through a parliamentary question. “The means to put pressure on the authorities are missing,” Marx said, also backing a call for the Conference of Federal and State Data Protection Supervisory Authorities (DSK) to be able to impose fines on public administrations.

The hacker explained that biometric surveillance systems lead to ubiquitous recording in public places: every step is captured and, thanks to unmistakable physical identifying characteristics, can be analysed in detail. Anyone who feels monitored, however, may decide not to take part in a demonstration. This development must be stopped.

Ruf of the GFF said the BDSG is particularly well suited to banning biometric remote identification systems: it already contains rules on the processing of biometric data, and it applies to private companies, federal public bodies and, subsidiarily, to the states. In general, automated facial recognition does not work without discrimination and “often leads to misidentification of non-white people in particular”. This also makes police work more difficult. And if the biometric data collected is accessed without authorisation, the affected characteristics cannot be changed afterwards.

Another point of contention: the government also wants to stipulate that data such as a person’s home address, name or information from social networks may not in future be used in scoring to assess consumers’ creditworthiness. Probability values may only be created or used if the personal data involved is “not processed for any other purpose”. Johannes Müller from the Federal Association of Consumer Organisations (vzbv) called it “important that certain categories are not recycled”. Otherwise, users could deliberately tailor their behaviour on social media, or even their address, in order to improve their score.

Incoming Federal Data Protection Commissioner Louisa Specht-Riemenschneider praised the planned restrictions on automated decisions based on scoring by Schufa & Co. as a “balanced proposal”. However, she said, this will not be enough to bring unbalanced assessment practices under control. The “elephant in the room” is payment service providers such as PayPal or Klarna, to which the requirements do not apply. The Bonn law professor therefore recommended drawing on Article 18 of the EU Consumer Credit Directive to bring these actors within scope as well. Like outgoing Federal Data Protection Commissioner Ulrich Kelber and Hesse’s data protection officer Alexander Roßnagel, Specht-Riemenschneider also spoke in favour of scrapping the planned restriction of the right to information intended to protect trade and business secrets.

Meinhard Schröder, a constitutional lawyer from Passau, however, considered the planned scoring ban too broad. A “complete ban on special categories” of personal data such as ethnic origin, biometric characteristics and health data, he argued, is not covered by the General Data Protection Regulation (GDPR). The exclusion of social networks also goes too far: a Facebook page, for example, is more “open” than a normal website. Munich law professor Boris Paal likewise fears “misunderstandings and legal uncertainties” from the restrictions on automated decisions, which he considers necessary in principle but off target.
