It is no secret that many things people used to do in real life can now also be done virtually: shopping and paying bills through Internet banking, making friends and staying in touch via social media, or following courses and taking exams online. When we experience these situations in real life, we behave according to social norms and principles. For example, when people meet for the first time, it is quite normal to keep some distance and to stick to a limited, socially accepted range of topics. It takes time to get to know each other, and every gesture, word and movement provides valuable detail that helps us understand who is in front of us. The virtual world is different from the real world and full of risks. Risks to our privacy are hard to understand and control, especially when we sometimes do not even know how the technologies we use function. Achieving the same level of privacy in the virtual world as we expect in real life is quite a challenge. In this respect Nissenbaum (2004) speaks of “contextual integrity”, which links adequate privacy protection to norms formulated for specific contexts, “demanding that information gathering and dissemination be appropriate to that context and obey the governing norms of distribution within it” (Nissenbaum, 2004, p. 101).
This question also concerns e-assessment technologies like TeSLA, which collect personal data from students for identification and authorship verification. A dominant approach to this question is a combination of transparency and choice (Nissenbaum, 2011). The gist of this approach is to inform users about data collection and to let them choose whether or not to provide their data. Even those who decide to provide their data should be able to access, change or delete that data, or withdraw consent. In practice, however, technology providers tend to offer privacy on a “take it or leave it” basis (Nissenbaum, 2011), turning the choice into a dilemma. The same goes for privacy policies: almost all of them are very long and overloaded with legal terms, making them nearly impossible for ordinary users to read and understand. As a result, 85.5% of Internet users simply agree to privacy policies without reading them (Elsen, Elshout, Kieruj, & Benning, 2014).
What is needed is an understanding of the context in which a particular technology is used and of users' expectations within that context; the next step is to make sure the technology does what is expected. Technology providers should think in advance not only about secure data collection and processing, but also about ethical issues that may arise during use. For e-assessment, this means informing students clearly about the instruments used for identification and authorship verification, and giving them the opportunity to choose instruments according to their preferences. Depending on which data are required, students' preferences may differ. From the TeSLA pilots we know, for example, that students at the Open University of the Netherlands are much less willing to share video recordings of their face than recordings of their voice or their keystroke dynamics. Some are not even willing to share any personal data for identification purposes. Although much work has been done already, many questions remain: how well do students understand these technologies; how much effort do, and must, they make to understand them sufficiently; what is the best way to communicate this information; and how do these technologies affect the relationship between students and teachers? Answering these questions is of great importance for all parties, in order to bring the experience of e-assessment a bit closer to a real-life setting and to create a safe and trusting environment for all students.
References
Elsen, M., Elshout, S., Kieruj, N., & Benning, T. (2014). Onderzoek naar privacyafwegingen [Research into privacy trade-offs]. Retrieved from https://www.centerdata.nl/
Nissenbaum, H. (2004). Privacy as contextual integrity. Washington Law Review, 79, 101–139.
Nissenbaum, H. (2011). A contextual approach to privacy online. Daedalus, the Journal of the American Academy of Arts and Sciences, 140(4), 32–48. https://doi.org/10.1162/DAED_a_00113
The Open University of the Netherlands
Ekaterina Mulder, PhD Candidate
José Janssen, Associate Professor
TeSLA is coordinated by Universitat Oberta de Catalunya (UOC) and funded by the European Commission’s Horizon 2020 ICT Programme. This website reflects the views only of the authors, and the Commission cannot be held responsible for any use which may be made of the information contained therein.