TeSLA’s metrics indicate, as a percentage, how trustworthy the samples of biometric data or text submitted by learners are. These reliability indexes, measured by the biometric instruments, raise an ethical question because they sort learners into two categories: cheaters and non-cheaters. A major responsibility therefore falls on these metrics, as they help to organise the whole learning process socially and are supposed to bring trust into the learner–teacher relationship. We have distinguished three aspects that call into question the claimed “neutrality” of the metrics.
Using TeSLA’s metrics to judge whether or not a learner is cheating is not “trivial”: it takes over a role previously reserved for teachers. A quantitative, “objective” indicator (the metric produced by TeSLA’s instruments) replaces a qualitative, “subjective” one (the teacher’s judgement). This replacement produces a certain uniformity in the judgements made about learners, and behind this “neutral”, scientific regulation of cheating lies a set of assumptions made by the system’s designers about the cheater/non-cheater classification.
The quantitative approach of TeSLA’s metrics may in itself deeply change the way a situation is judged. The metrics try to capture an objective reality, observable with the naked eye, and to extract its essence. In short, TeSLA’s metrics aim to compress a whole material reality into a single number to ease its interpretation. Metrics are selective by design: they ignore aspects of reality that their designers did not consider relevant to judging the situation. This may lead to neglecting aspects that more conventional ways of judging cheating did take into account.
These ethical issues find an echo in the new General Data Protection Regulation. The GDPR provides that the data subject has the right not to be subject to a decision based solely on automated processing, without any human involvement (Article 22). The Article 29 Working Party defines automated processing as the production of a recommendation concerning the data subject. The principle in the Regulation is a prohibition on fully automated decision-making. In the TeSLA project, however, the collection and use of learners’ personal data are based on their freely given, specific and informed consent, and consent is an exception that permits the use of automated decision-making. As a safeguard, learners have the right to be informed about the logic involved, i.e. an explanation of the mechanism and of how the system works. According to the Article 29 Working Party, it is not mandatory to go into the details of how the algorithms function.
In addition, learners have the right to obtain human intervention, to express their opinion and to contest the decision.
The Article 29 Working Party emphasises that the review must be carried out by someone who has the authority and ability to change the decision. Recital 71 of the GDPR specifies that data subjects should be able to obtain an explanation of the concrete decision reached. This precision is not binding, since it appears in a recital and not in the article itself, but it enables a real and meaningful opportunity to express an opinion and to decide whether or not to contest the decision. The Article 29 Working Party also insists on the controller’s role in ensuring the transparency of the processing.
The combination of legal obligations and ethical concerns raises major questions. On the one hand, the GDPR requires that learners be able to obtain a deeper explanation of the automated decision; on the other hand, TeSLA’s algorithms make the decision more opaque, because they embed undiscussed postulates and summarise a complex reality in a single number.
Nathan De Vos, Manon Knockaert, UNamur.
FUNDED BY THE EUROPEAN UNION
TeSLA is coordinated by Universitat Oberta de Catalunya (UOC) and funded by the European Commission’s Horizon 2020 ICT Programme. This website reflects the views only of the authors, and the Commission cannot be held responsible for any use which may be made of the information contained therein.