
The Use of Augmented Cameras in Retail – Yes or No?

29 Aug 2025

The French Data Protection Authority (Commission Nationale de l'Informatique et des Libertés – “CNIL”) has recently issued an important decision on the lawfulness of using so-called augmented cameras in tobacco shops. These systems, equipped with artificial intelligence algorithms that estimate the age of customers, were introduced as a means of preventing the sale of tobacco products to minors. However, CNIL concluded that this method of processing personal data is neither necessary nor proportionate to the objective pursued.

The decision raises an important question about the boundary between protecting the public interest and safeguarding the fundamental rights of individuals, particularly the right to privacy and the protection of personal data.

Proportionality Between the Purpose of Processing and the Protection of Individuals’ Rights

Augmented cameras that use artificial intelligence algorithms to estimate age rely on the processing of personal data classified as biometric data, which are subject to strict rules under the General Data Protection Regulation (“GDPR”). Biometric data, such as data obtained through facial scanning, are considered particularly sensitive, and their processing requires the existence of a legal basis and a lawful purpose, in compliance with the principles of proportionality and necessity.

Under Article 5 of the GDPR, the processing of personal data must be limited to what is necessary to achieve the legitimate purpose, while Article 6 requires a valid legal basis for processing and, where processing relies on legitimate interests, a fair balance between the interests of the controller and the fundamental rights and freedoms of the data subjects. In the context of protecting minors from the sale of tobacco products, the purpose can undoubtedly be considered legitimate. However, the means of achieving that goal must be proportionate to the level of interference with the privacy of all customers.

In addition, the GDPR provides that the choice of tools and methods of processing should be based on an assessment of whether a less intrusive measure could achieve the same objective.

CNIL’s Decision

CNIL responded to customer complaints regarding the use of these cameras, which automatically display a red or green light depending on the estimated age of the customer. While the purpose, preventing the sale of tobacco products to minors, is legitimate and legally regulated, CNIL concluded that there is a less intrusive and equally effective alternative: simply asking to see an identity document.

CNIL reached the following conclusions:

  • The processing of data through facial analysis clearly constitutes the processing of personal data within the meaning of the GDPR;
  • Shop owners have not demonstrated that the use of such cameras is necessary, as the purpose can be achieved by simpler, less intrusive measures, such as checking an identity document;
  • Relying solely on the algorithm carries the risk of incorrect assessments, without additional human oversight;
  • This method of processing requires all customers to “prove” their age, thereby processing the data of individuals who are unquestionably entitled to purchase tobacco products;
  • There is a risk of limiting the right to object under the GDPR, as the recording and analysis take place in real time, without the possibility of easily opting out of processing.


In its decision, CNIL did not explicitly state that such processing is “prohibited,” nor did it adopt corrective measures ordering the removal of the cameras. However, the reasoning clearly indicates that CNIL does not consider this practice to be in compliance with personal data protection rules, leaving very little room for a different interpretation.

The Significance of CNIL’s Decision and Its Impact on the Practice of Other EU Member States

This CNIL decision reflects a broader trend in European regulatory practice of insisting on clear evidence of necessity and proportionality for any data processing activity that interferes with privacy. Although Serbia is not an EU member, the Law on Personal Data Protection (“LPDP”) largely follows GDPR standards, including the obligation to assess the proportionality of processing.

For domestic businesses, especially those in retail or in sectors with restricted sales, such as alcohol and tobacco, the French regulator’s decision can serve as a warning: innovative technological measures must comply with fundamental rights, and before implementing AI-based systems, it is essential to consider all less intrusive alternatives.

The issue of balancing the protection of public interest with the privacy of citizens is not new, but with technological advancement, it becomes increasingly complex. Cases like this demonstrate that regulatory bodies are increasingly placing individual privacy on an equal footing with the protection of society as a whole, and that “technically advanced” solutions are not automatically lawful.

Author:

Sonja Stojčić, Senior Associate
Email: sonja.stojcic@prlegal.rs; legal@prlegal.rs