As previously discussed on this blog, one of the key features of the TeSLA architecture is its ability to ensure learners’ privacy, in accordance with the requirements of the GDPR. Ensuring privacy consists of minimizing the personal information the TeSLA system retrieves during its interactions with learners, and of anonymizing the data exchanged and stored in its databases whenever practicable. The TeSLA architecture provides each learner with anonymized identifiers that hide the learner’s genuine identity when taking e-assessment activities.
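As an illustration of the idea, a pseudonymous identifier can be derived from the real learner ID with a keyed hash, so that only the party holding the key can link the alias back to the learner. This is a minimal sketch, not the actual TeSLA mechanism; the key name and function are hypothetical:

```python
import hmac
import hashlib

# Hypothetical server-side secret, kept by the identity provider only.
SECRET_KEY = b"identity-provider-secret"

def pseudonymize(learner_id: str) -> str:
    """Return a stable alias that reveals nothing about learner_id
    to anyone who does not hold SECRET_KEY."""
    return hmac.new(SECRET_KEY, learner_id.encode("utf-8"),
                    hashlib.sha256).hexdigest()

# The same learner always maps to the same alias, so e-assessment
# activity can be tracked consistently under the pseudonym.
alias = pseudonymize("learner-42")
assert alias == pseudonymize("learner-42")
assert alias != pseudonymize("learner-43")
```

Because HMAC is keyed, an attacker who sees only the aliases cannot enumerate learner IDs and recompute the mapping, unlike with a plain unsalted hash.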
Other solutions that would guarantee a higher level of privacy to learners have also been discussed within the scope of the TeSLA project. Among them, anonymous certification would make it possible to perform anonymous access control to protected resources. Research on this concept by the IMT team has led to successful results, both in terms of a prototype implementation and of articles published in international workshops and conferences. Anonymous certification is now mature enough to be integrated into the TeSLA architecture as an extra feature meant to strengthen learners’ privacy. Other approaches might also be added in the same direction of privacy enhancement. One of them consists of mixing the data stored in a database so that the various attributes of a table entry can no longer be associated with one another, hence offering anonymous data storage. Should such a technique be integrated into TeSLA, it would guarantee that even a leak from a sensitive database would not provide any certain information to anyone, as long as the leaked data do not contain secrets such as private keys or passwords.
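The mixing idea can be sketched in a few lines: each sensitive column is shuffled independently, so the multiset of values per column is preserved (and remains usable for aggregate statistics) while the row-level association between attributes is destroyed. This is a hypothetical toy example, not the technique studied in the project; the table layout and column names are invented:

```python
import random

# Invented sample table: pseudonymous alias plus two sensitive attributes.
rows = [
    {"alias": "a1", "score": 62, "keystroke_profile": "kp-07"},
    {"alias": "a2", "score": 88, "keystroke_profile": "kp-31"},
    {"alias": "a3", "score": 75, "keystroke_profile": "kp-12"},
]

def mix_columns(table, columns, rng=random):
    """Return a copy of the table with each listed column shuffled
    independently, breaking the link between attributes of one row."""
    mixed = [dict(row) for row in table]
    for col in columns:
        values = [row[col] for row in mixed]
        rng.shuffle(values)
        for row, value in zip(mixed, values):
            row[col] = value
    return mixed

stored = mix_columns(rows, ["score", "keystroke_profile"])
# Each column still holds the same set of values, but a leak of `stored`
# no longer tells which score or profile belongs to which alias.
```

In a real deployment the shuffle would have to be cryptographically strong and the permutation (or its inverse) kept secret, otherwise the original associations could be restored.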
In terms of trust, features beyond learners’ privacy can also be added to future releases of the architecture. A system such as TeSLA, where learners take e-assessments under strict anti-cheating countermeasures, requires a high degree of trust from the learners to be widely deployed and accepted as a legitimate assessment tool. TeSLA should provide public guarantees that its claims regarding privacy and security are met, which means being as transparent as possible about its personal data management processes. Though not directly related to security and privacy, TeSLA should also be transparent about its anti-cheating decision processes: it should let students know how these decisions are made and inform them of the recourses at their disposal in case of a false positive detection.
The successful deployment of a complex platform such as TeSLA across various universities depends on numerous factors. The technical completion of the TeSLA platform, as well as its seamless integration into the usual educational activities, are probably the two most obvious. TeSLA must also convince learners that they can trust the system as a legitimate examination module that poses no serious risk to their personal data. Ensuring privacy and transparency will allow TeSLA to meet the requirements of the GDPR, but beyond these legal considerations, it will greatly help TeSLA earn the learners’ trust.
FUNDED BY THE EUROPEAN UNION
TeSLA is coordinated by Universitat Oberta de Catalunya (UOC) and funded by the European Commission’s Horizon 2020 ICT Programme. This website reflects the views only of the authors, and the Commission cannot be held responsible for any use which may be made of the information contained therein.