Research summary


Title: Digital privacy: Privacy calculi and GDPR: Lessons from the user perspective

Short summary

A small survey was conducted on digital privacy. Respondents rated selected GDPR provisions and organisational privacy measures in relation to a computing scenario they described. The results show these respondents do value such provisions, but are critical of practice. They are concerned about tracking and ‘linking’ across services, and want real transparency and control over their personal data. The report recommends that the GDPR incorporate these concerns more explicitly.

Summary

This research project considered digital privacy, the protection of personal data in computing. The focus was the user perspective on the recent General Data Protection Regulation (GDPR) (cf. ICO, n. d.). The study asked users how important GDPR provisions are to them in their digital privacy decisions, and sought their views on organisational implementations. The theoretical background critiqued “privacy calculus” models, which characterise users’ privacy decisions in terms of types such as the pragmatist, the fundamentalist, the permissive and the resigned user (Lehtiniemi and Kortesniemi, 2017; Draper, 2017).

A small online survey was conducted with convenience sampling at UK Higher Education institutions. The survey asked respondents to describe a computing use scenario relevant to them. They then rated selected GDPR and organisational provisions in relation to this scenario, such as the GDPR’s strict consent requirements or organisational privacy notices. They also indicated any provisions they felt were missing. The survey design aimed to provide enhanced anonymisation (discussed below; cf. Dench et al., 2004, pp. 71-; Network23.org, n. d.). The results were analysed with a combination of thematic analysis and simple statistics (Ling, 2015; Cohen et al., 2007, pp. 475-491).

This sample could, overall, be characterised as relatively digitally literate and privacy-savvy. Respondents described typical online scenarios, such as social networking, emailing and searching. They indicated that they highly value GDPR provisions, as well as organisational measures for privacy. However, digital privacy in practice drew criticism, with a lack of trust in organisations and regulation. Respondents highlighted issues with data collection, tracking and inappropriate ‘linking’ across services. They wanted better ‘knowing’ and control in relation to the use of their personal data.

The research recommended that the GDPR add an explicit principle of individual control and ownership over personal data, together with stronger transparency provisions. The report concluded that further critical research around digital privacy is called for.

Delivering a privacy-enhanced survey

The survey aimed to provide enhanced privacy. The network23.org website service was selected because it promises not to track IP addresses, which was considered an essential prerequisite for digital anonymity (Network23.org, n. d.). The setup generally worked well. Respondents also provided feedback about any concerns with taking the survey.
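Where a hosting service cannot guarantee that IP addresses are never stored, one common mitigation (not part of this study's setup, offered here as an illustrative sketch) is to truncate each address before anything is logged, so that stored logs cannot identify an individual respondent:

```python
import ipaddress

def anonymise_ip(raw_ip: str) -> str:
    """Truncate an IP address before logging: zero the host part
    beyond /24 (IPv4) or /48 (IPv6), so the stored value maps to a
    network rather than to an individual respondent's machine."""
    ip = ipaddress.ip_address(raw_ip)
    prefix = 24 if ip.version == 4 else 48
    network = ipaddress.ip_network(f"{raw_ip}/{prefix}", strict=False)
    return str(network.network_address)

print(anonymise_ip("203.0.113.57"))  # prints 203.0.113.0
```

This reduces, rather than eliminates, identifiability; it still relies on the anonymisation running before any other component of the hosting stack writes its own logs, which is exactly the difficulty noted in the points below.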

Some points of interest:

  • The choice of setup is a complex decision, involving considerations of trust in providers and likely user perceptions: for example, whether users would be more confident with a well-known mainstream provider than with an independent provider such as network23.
  • There are technical trade-offs. It is essential that the survey works as required. A basic setup such as network23 offers fewer survey features than dedicated services and needs more technical setup, and a self-hosting approach risks technical issues, so thorough testing is required. The network23 solution did, however, work well.
  • It is difficult to find services or setups that do not store IP addresses at some level, independent of whether these are made available, e.g. to the researcher.
  • Data security is a prerequisite for privacy; this research opted for secure email and offline storage. This can limit portability, data-analysis tooling and backup options. Secure data archiving also needs to be addressed when the research concludes.
  • Some respondents debated the setup in survey comments: there were positive assessments, but also concerns over the “activist” hosting used (noting a previous hack of this), and some did not find the specified data storage “reassuring”. Despite the survey assurances, some respondents questioned whether there was a “catch”, with tracking or information use. Comments indicated, however, that the researcher being known and trusted by respondents facilitated the good response rate. These debates reflect the complexity of providing anonymity, but also suggest giving respondents fuller information about the setup and the reasoning behind the choices made.
  • With a small sample, any grouping of invitations should be handled cautiously to avoid identifying respondents by the timing of submissions. Similarly, this survey warned respondents that using contact facilities soon after submitting the survey could identify their submission. This highlights the distinction between anonymity (not being identifiable) and confidentiality (provided by researcher discretion) (cf. Dench et al., 2004, pp. 71-).
  • While the survey did provide the option to be anonymous, a significant number of respondents sacrificed it by telling the researcher ‘I just did the survey’, effectively identifying their submission in this small-scale research. In one case, the researcher could guess the respondent from prior acquaintance. This suggests ensuring sufficient distance between researcher and respondents to facilitate anonymity. It also raises the dilemma that some respondents may want to be “named” rather than valuing anonymity (Dench et al., 2004, pp. 71-).
  • Anonymity also adds survey constraints: preventing multiple submissions is trickier, respondents cannot edit their submissions, and without linked contact details, follow-up communication between researcher and respondent is limited.
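The duplicate-submission constraint above can be eased, under assumptions this study did not adopt, by issuing each invitee a random single-use token and storing only unsalted one-way hashes of spent tokens, kept apart from the response data. The following is a hypothetical sketch; the names and structure are illustrative, not the study's method:

```python
import hashlib
import secrets

def issue_token() -> str:
    """Generate a random single-use invitation token for one invitee."""
    return secrets.token_urlsafe(16)

class AnonymousSubmissionGate:
    """Accept each token at most once. Only one-way hashes of spent
    tokens are stored, held separately from the responses, so
    responses cannot be linked back to invitees while duplicates
    are still rejected."""

    def __init__(self) -> None:
        self._spent: set[str] = set()

    def accept(self, token: str) -> bool:
        digest = hashlib.sha256(token.encode()).hexdigest()
        if digest in self._spent:
            return False  # duplicate submission rejected
        self._spent.add(digest)
        return True

gate = AnonymousSubmissionGate()
token = issue_token()
print(gate.accept(token))  # True: first submission accepted
print(gate.accept(token))  # False: repeat of the same token rejected
```

The trade-off mirrors the anonymity/confidentiality distinction above: whoever distributes the tokens could in principle record which invitee received which token, so the scheme only preserves anonymity if token issuance is itself unlinked from identities.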

Providing anonymity therefore involves complex decisions, social as well as technical, and does complicate all aspects of research. While anonymity can be recommended to improve the credibility of responses, it should be considered whether the trade-offs are justified and appropriate to the research project.

Selected references

Cohen, L., Manion, L. and Morrison, K. (2007) Research Methods in Education (6ed), pp. 475-491 [Online], London, Routledge. Available at https://islmblogblog.files.wordpress.com/2016/05/rme-edu-helpline-blogspot-com.pdf (Accessed 04 May 2019).

Dench, S., Iphofen, R. and Huws, U. (2004) An EU code of ethics for socio-economic research [Online]. Available at http://www.respectproject.org/ethics/412ethics.pdf (Accessed 04 May 2019).

Draper, N. (2017) ‘From Privacy Pragmatist to Privacy Resigned: Challenging Narratives of Rational Choice in Digital Privacy Debates’, Policy & Internet, vol. 9, no. 2, pp. 232-251 [Online]. DOI: 10.1002/poi3.142 (Accessed 11 November 2018).

European Parliament and the Council of the European Union (EU) (2016) Regulation 2016/679 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) [Online]. Available at https://gdpr-info.eu/ (Accessed 24 November 2018).

Information Commissioner’s Office (ICO) (n. d.) Guide to the General Data Protection Regulation (GDPR) [Online]. Available at https://ico.org.uk/for-organisations/guide-to-the-general-data-protection-regulation-gdpr/ (Accessed 12 January 2019).

Lehtiniemi, T. and Kortesniemi, Y. (2017) ‘Can the obstacles to privacy self-management be overcome? Exploring the consent intermediary approach’, Big Data & Society, vol. 4, no. 2 [Online]. DOI: 10.1177/2053951717721935 (Accessed 17 November 2018).

Ling, D. L. (2015) Introduction to Statistics Using LibreOffice.org Calc, Apache OpenOffice.org Calc and Gnumeric. Statistics using open source software [Online]. Available at http://www.comfsm.fm/~dleeling/statistics/text5.html (Accessed 08 February 2019).

Network23.org (n. d.) WordPress.com vs WordPress.org vs Network23.org (Enter Network23….) [Online]. Available at https://network23.org/wordpress-com-vs-wordpress-org-vs-network23-org/ (Accessed 27 December 2018).

Reichel, M. (2017) ‘Race, Class, and Privacy: A Critical Historical Review’, International Journal of Communication, vol. 11, pp. 4757-4768 [Online]. Available at http://ijoc.org/index.php/ijoc/article/view/7018 (Accessed 10 November 2018).