On top of addressing numerous security challenges, the TeSLA architecture has been designed with several privacy guarantees, in compliance with the GDPR requirements. As mentioned in a previous post, ensuring privacy means minimizing the personal information retrieved by the TeSLA system during its interactions with the learners. Therefore, beyond securing access to its databases, TeSLA makes sure to anonymize all sensitive data collected from the learner. This applies to the e-assessments themselves, which learners take under an anonymized identifier, but also to the biometric samples required to authenticate them. These biometric samples are anonymized in the same way before reaching the TeSLA e-assessment Portal, where they are dispatched to the various instruments that analyze them and return their results to TeSLA.
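The exact anonymization mechanism is internal to TeSLA, but the general idea can be illustrated with a short, purely hypothetical sketch: the institution derives a stable pseudonym from the learner's identity using a keyed hash, and only that pseudonym accompanies the biometric sample sent for analysis. The key, function names and data layout below are illustrative assumptions, not TeSLA's actual implementation.

```python
import hmac
import hashlib

# Hypothetical secret held on the institution side; real TeSLA key
# management and anonymization details are not shown here.
PSEUDONYMIZATION_KEY = b"institution-secret-key"

def pseudonymous_id(learner_id: str) -> str:
    """Derive a stable, non-reversible identifier for a learner.

    The same learner always maps to the same pseudonym, so the institution
    can link results back to the learner, while the TeSLA side never sees
    the real identity.
    """
    return hmac.new(PSEUDONYMIZATION_KEY, learner_id.encode(), hashlib.sha256).hexdigest()

def anonymize_sample(learner_id: str, biometric_sample: bytes) -> dict:
    """Package a biometric sample with only the pseudonymous identifier."""
    return {
        "learner": pseudonymous_id(learner_id),  # no name, email or student number
        "sample": biometric_sample,              # forwarded to the instruments for analysis
    }
```

Using a keyed hash rather than a plain one keeps pseudonyms stable across sessions while preventing anyone without the key from recomputing them, so the instruments can analyze samples from the "same" learner without ever learning who that learner is.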
In that post, we highlighted a few possible enhancements regarding privacy, such as adding anonymous certification and improving the transparency of the whole system. Pushing the analysis further requires an overall look at the architecture and at the fundamental choices that led to its design. Among these choices, relying on biometrics for learner authentication is one that particularly stands out in terms of privacy. Unlike a password, which authenticates learners by what they know, biometric samples authenticate them by what they are. The data transmitted over the network, from the learner's computer to the TeSLA instruments, are part of the learner's identity and, as such, are far more sensitive than passwords, which can be changed at will. By encrypting data exchanges over the network, TeSLA ensures that these biometric samples cannot be intercepted in transit by an attacker, and the anonymized processing of samples by the TeSLA instruments strongly limits the risks of unwanted access to, and exploitation of, personal data.
The choice of biometric-based authentication for learners taking e-assessments entails other issues. First, the biometric samples are collected on the learner's computer, which by definition offers no security guarantee whatsoever. Even if the samples are not meant to be stored on the learner's computer, the risk of personal data theft at this point is independent of the TeSLA architecture; it is induced by the choice to rely on biometrics, and should therefore be taken into account in further improvements of the TeSLA system. Second, even though the biometric samples are anonymized before they are sent to the TeSLA instruments, it may be preferable not to send such sensitive data to TeSLA at all, and to decentralize them towards Trusted Third Parties (TTPs) as much as possible. The role of TeSLA is to offer a specific service, namely the possibility to take e-assessments; it does not, and cannot, act as a TTP. In the current configuration, what becomes of the biometric samples depends on how TeSLA is managed. A TTP with no specific connection to TeSLA or to the academic institutions would be a dedicated entity whose explicit role is to guarantee the proper handling of these sensitive data, independently of the current TeSLA policy. It is worth noting that anonymous certification would also benefit from such TTP decentralization.
Improving privacy in TeSLA therefore requires further decentralization of its fundamental building blocks, in order to offer the strongest possible guarantees. Even if the use of biometrics is kept as it is, extending the current TTP elements, such as the TeSLA Public Key Infrastructure (PKI) and its underlying Certification Authorities (CAs), would be a significant step in this direction.
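As an illustration of what such PKI-based trust provides, the sketch below checks that a component's certificate was issued by a trusted CA, using the Python cryptography library. The file names are hypothetical and the certificates are assumed to be RSA-signed; this is a minimal sketch of standard certificate verification, not TeSLA's actual certificate-handling code.

```python
from cryptography import x509
from cryptography.hazmat.primitives.asymmetric import padding

# Hypothetical file names; in a PKI each component holds a certificate
# issued by one of the infrastructure's CAs.
with open("ca.pem", "rb") as f:
    ca_cert = x509.load_pem_x509_certificate(f.read())
with open("instrument.pem", "rb") as f:
    instrument_cert = x509.load_pem_x509_certificate(f.read())

# Verify that the instrument's certificate was signed by the CA's key.
# Raises InvalidSignature if the chain does not hold (RSA keys assumed).
ca_cert.public_key().verify(
    instrument_cert.signature,
    instrument_cert.tbs_certificate_bytes,
    padding.PKCS1v15(),
    instrument_cert.signature_hash_algorithm,
)
print("instrument certificate is trusted by the CA")
```

Placing the CAs under an independent TTP, rather than under the entity operating the service, is what would let learners and institutions verify such chains without having to trust TeSLA's own management policy.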
Author: IMT