[Follow-up] - Age verification: the economic argument

Written by Martin Biéri


19 July 2023


In June 2022, LINC, Olivier Blazy, a professor at the École Polytechnique, and the Pôle d'expertise de la régulation numérique (PEReN – Digital regulation centre of expertise) proposed a proof of concept for an age verification system that would be both functional and protective of personal privacy. Is this system, which minimises the information shared between the various players, economically viable if no one knows who is doing what?

Is a business model possible?

The idea of our demonstrator was to show that it is possible to access a site whose content is restricted to people over a certain age without revealing any other information. Since the main aim was to prove feasibility, the version proposed by LINC was necessarily somewhat simplified.

In operation, only the requested proof of age and the proof that this information comes from a trusted third party are transmitted. However – and this is to be expected for a proof of concept – a number of points were not addressed, or only partially: for example, the user journey, the security measures that would need to be implemented, and the business model for such an operation. It is this last point that we return to here.
As a reminder, this scheme involved four players:

  • The person wishing to access a service whose use is restricted according to age;
  • The service requiring proof of age (application or website);
  • A certified third party who knows the person and is able to issue proof of age on their behalf;
  • And finally, a certifying authority, whose role is to organise this operation by providing the cryptographic specifications for the service and which also certifies the third parties (it can revoke third parties and their tokens if they do not follow the rules).
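
To make this division of roles more concrete, here is a minimal sketch of the exchange in Python. It is purely illustrative: the class names are ours, and an HMAC stands in for the anonymous signature envisaged in the PoC (which lets the site verify a proof without being able to forge one, unlike an HMAC).

import hmac
import hashlib
import secrets

class CertifyingAuthority:
    # Certifies third parties; in this toy version it simply hands the list of
    # certified keys to the services so that they can check proofs.
    def __init__(self):
        self.certified_keys = []

    def certify(self, provider_name):
        key = secrets.token_bytes(32)
        self.certified_keys.append(key)
        return key

class CertifiedThirdParty:
    # Knows the person and can attest that they are over the required age.
    def __init__(self, key, users):
        self.key = key
        self.users = users  # user_id -> age

    def prove_age(self, user_id, challenge, min_age):
        if self.users.get(user_id, 0) < min_age:
            return None
        return hmac.new(self.key, challenge, hashlib.sha256).digest()

class RestrictedService:
    # Site or application whose access is restricted by age.
    def __init__(self, authority, min_age=18):
        self.authority = authority
        self.min_age = min_age

    def new_challenge(self):
        return secrets.token_bytes(16)

    def grant_access(self, challenge, proof):
        if proof is None:
            return False
        # The site only learns that some certified provider vouched for this
        # challenge, not which one and not who the user is.
        return any(
            hmac.compare_digest(proof, hmac.new(k, challenge, hashlib.sha256).digest())
            for k in self.authority.certified_keys
        )

authority = CertifyingAuthority()
provider = CertifiedThirdParty(authority.certify("provider A"), {"alice": 25})
site = RestrictedService(authority)

challenge = site.new_challenge()
proof = provider.prove_age("alice", challenge, site.min_age)
print(site.grant_access(challenge, proof))  # True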

Let's assume that it is up to the sites offering content or services whose access is restricted (prohibited below a certain age) to pay for the age verification systems they offer as part of their service. In the case of pornographic sites, for example, there is a legal obligation to implement such a system: the onus is therefore on them.

The fact that there are several methods of age verification and several types of certified third parties raises the question of how they are remunerated, as well as of how the verifications carried out are monitored. How can the accounts be kept accurately if we know neither where a signed challenge is going nor where that same challenge is coming from? And how can this "accounting balance" be achieved without undermining the promise of a double-anonymity system (anonymity being understood here in its cryptographic sense, as the signatures are anonymous)?

Counting the tokens

To achieve this, a simple solution would be for a third party to collect the signed challenges obtained by the sites at the end of each month, and to carry out a partial de-anonymisation of the tokens from the age verification providers, which would produce an aggregated list. This offers the possibility of reliable billing. As the token issued by the certified third party does not contain any identity data, this partial de-anonymisation will not enable a link to be made with a particular individual. This solution would be based on a classic cryptographic building block: secret sharing.
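
As a purely illustrative example of this building block, the opening key can be split into as many shares as there are authorities, so that none of them can de-anonymise the tokens on its own (a toy n-of-n XOR scheme, not the scheme a real deployment would choose):

import secrets

def xor_bytes(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

def split_secret(secret, n):
    # All n shares are required to recover the secret; n-1 of them reveal nothing.
    shares = [secrets.token_bytes(len(secret)) for _ in range(n - 1)]
    last = secret
    for share in shares:
        last = xor_bytes(last, share)
    return shares + [last]

def recover_secret(shares):
    out = shares[0]
    for share in shares[1:]:
        out = xor_bytes(out, share)
    return out

opening_key = secrets.token_bytes(32)
# e.g. one share each for the control-token authority, the providers' trusted
# authority and the users' trusted authority (see the box below).
shares = split_secret(opening_key, 3)
assert recover_secret(shares) == opening_key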

However, this approach has two limitations:

  • The first concerns the calculation of this balance and the trust between the players. We could simply compare the number of challenges issued by all the service providers with the number received by the sites requesting age verification. In this way, each challenge could be invoiced and paid for according to the activity of each provider. However, how do we know that sites are not under-reporting the number of challenges received to keep their costs down, and that providers are not inflating the number of challenges issued to increase their revenue?
  • The second is that some age verification providers might wish to vary their rates according to the type of player, the size of the site or its volume of visits, etc.

Pushing the limits

For the first limitation, one possibility would be for the authority to also issue "control" tokens, which it must then find in the lists provided by the sites at the end of each month: if any are missing, the list has clearly been trimmed, and the publisher could be sanctioned on that basis. Who would carry out this audit? One might think that the certifying authority, as organiser of the ecosystem, could take on this role. However, the two roles are not linked and could be siloed: another "authority" could hold the accounting role.
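
To picture this audit, here is a hypothetical sketch (token formats and quantities are invented): the control authority injects tokens that only it can recognise, and then checks at the end of the month that all of them appear in the list reported by each site.

import secrets

def issue_control_tokens(k):
    # Tokens generated by the control authority; to the site they look like any
    # other verification token.
    return {secrets.token_bytes(16) for _ in range(k)}

def audit(reported_tokens, control_tokens):
    # True only if every control token sent to the site appears in its report.
    return control_tokens <= reported_tokens

control = issue_control_tokens(5)
real_traffic = {secrets.token_bytes(16) for _ in range(100)}

honest_report = real_traffic | control
trimmed_report = set(list(honest_report)[:50])  # a site under-reporting its traffic

print(audit(honest_report, control))   # True: nothing is missing
print(audit(trimmed_report, control))  # Almost certainly False: controls dropped along with the rest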

To address the second limitation, the token could also record the verification method used, which would then be revealed during the partial de-anonymisation at the end of each month.

Setting up the PoC, by Olivier Blazy


In the context of our PoC proposal, this can be implemented as follows: a first "authority" is responsible for generating the control tokens as proposed above.

The certifying authority generates an opening key when the system is initialised. In the PoC, this key had not been used until now. Here we propose to use it by splitting it and sharing it between several trusted authorities (for example: the authority responsible for the control tokens, one trusted authority acting for the service providers and one for the users). At the end of each time period (month, quarter, half-year), these authorities can work together on the data reported by the websites.

In practice, by combining their key shares, they will be able to lift the anonymity of the age verification providers and deduce that, over the period, 200 age verifications were provided by service A, 100 by service B and 50 by service C, and will therefore be able to invoice according to the terms agreed with each service.
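
A hypothetical sketch of this month-end settlement, reusing the example figures above (the opening of the anonymous signatures is simulated here by a simple lookup, and the rates are invented):

from collections import Counter

# What the joint opening reveals for each token reported by the sites is the
# provider that issued it, and nothing else; no user identity is involved.
opened_providers = ["service A"] * 200 + ["service B"] * 100 + ["service C"] * 50

# Per-verification rates agreed with each provider (invented figures).
rates = {"service A": 0.05, "service B": 0.04, "service C": 0.06}

for provider, count in Counter(opened_providers).items():
    print(f"{provider}: {count} verifications, invoice {count * rates[provider]:.2f} EUR")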

Once this opening has been carried out, the certifying authority regenerates the keys for the various third parties and distributes a fresh sharing of the opening key to the trusted authorities (this mechanism ensures that an authority cannot replay past exchanges to lift anonymity).

This technique is independent of the choice of solution (interactive or not).

In particular, in the PoC, it does not prevent the user from receiving an unmasked token and masking it themselves, so as to prevent the third party from introducing a bias.

Nor does it weaken the overall anonymity of the system, because several authorities have an interest in ensuring that it is respected. In particular, the authority in charge of the control tokens wants to ensure that they cannot be identified, and the trusted authority acting for the users wants to preserve their anonymity.

Olivier Blazy is Professor of Cybersecurity in the Computer Science Department at École Polytechnique.

Outside the box: other models are possible

While the above variations are intended to show that marketing models tied to the actual use of anonymous tokens are possible, other economic models can avoid the need to count transactions altogether. For example, the service offering the content could pay a subscription fee to the system provider. Its price could be a flat rate (assuming enough services subscribe) or progressive, depending on audience and traffic (measured, for example, by trusted third parties specialising in this area). This would also allow costs to be recovered without calling the anonymity properties of the system into question.

In this respect, it should be noted that the main cost of the double-anonymity system is the fixed cost of setting up the infrastructure, each player, the roles assigned to them and the associated procedures. The variable cost of each verification is residual for the technical part (transmission of tokens) and relatively low for support and management operations (handling revocations, supporting users and sites). A transaction-based pricing model may therefore not be the most appropriate way to recover costs. On the other hand, knowledge of the potential user market (which sectors, which types of players within them, etc.) is essential if a supplier is to secure its investment.
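
As a rough illustration, with entirely invented figures, a tiered subscription based on audience could cover a mostly fixed cost without a single token being counted:

# Toy cost-recovery model: a largely fixed infrastructure cost spread over
# subscribing sites, with a progressive tier based on audience rather than on
# the number of verifications. All figures are invented.
FIXED_ANNUAL_COST = 500_000        # infrastructure, roles, procedures
VARIABLE_COST_PER_SITE = 1_000     # support, revocation handling, etc.

def annual_fee(monthly_visits):
    # Progressive subscription tier: the audience is measured by a trusted
    # third party, not deduced from token counts.
    if monthly_visits < 100_000:
        return 5_000
    if monthly_visits < 1_000_000:
        return 20_000
    return 60_000

subscribers = [50_000, 250_000, 3_000_000] * 20   # monthly audiences of 60 subscribing sites
revenue = sum(annual_fee(v) for v in subscribers)
costs = FIXED_ANNUAL_COST + VARIABLE_COST_PER_SITE * len(subscribers)
print(f"revenue {revenue} EUR, costs {costs} EUR, balance {revenue - costs} EUR")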

Conclusion

The purpose of our demonstrator was to show that it is possible, with the support of a few cryptographic concepts, to create the conditions for privacy-friendly age verification. It is a technical proposal, but one that remains malleable and adaptable for anyone wishing to make it operational. This proposal can therefore serve as a basis for privacy-friendly innovation, as long as it offers the same guarantee of double anonymity.

At first sight, these solutions appear to be monetisable, fundable by solvent entities and compatible with several possible business models.

However, the question of governance remains to be explored: many choices need to be made, particularly concerning the "authorities" responsible for setting the rules of the game (the certifying authority), as well as the authority that could take on the more accounting-oriented role of economically balancing the ecosystem.



Article written by Martin Biéri, foresight officer