Apps are making personalised health advice possible – but what do people think about their privacy risks?

Image Credit: Unsplash/Luke Chesser

Personalised recommendations for goods and services are increasingly common, from algorithms that tailor social media content to individual users to adverts that adapt to purchase histories. However, this personalisation relies on potentially sensitive personal data. How do people feel about handing over their data to receive personalised advice, especially in sensitive areas such as health, and why do they feel this way?

We looked at these questions in our paper “Behavioural perspectives on personal health data sharing and app design: An international survey study”, published in the journal Data & Policy. We conducted an online survey asking participants about their data privacy behaviours and their willingness to use personalised health apps, and compared over 1,000 responses from Hong Kong with 1,000 responses from London. Here is what we found.

1.    User trust in personalised health advice depends on trust in the provider.

People who had more trust in medical advice, whether from a medical expert or AI, were more willing to use personalised health apps. This highlights the importance of building and retaining patient trust in the healthcare system and the advice that they receive.

Responses also suggest that implementing AI need not mean replacing medical experts. Instead, experts can be closely involved in the design of AI-powered health apps, use analytical AI as a supporting tool for diagnosis, and fact-check the advice generated by large language models (Dzobo et al. 2020). Transparency about how medical experts use AI-powered health apps could then help build trust with patients.
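To make this human-in-the-loop idea concrete, here is a minimal sketch in Python. It is our own illustration, not an interface from the paper or any real system: all names (DraftAdvice, release_to_patient, the example model and clinician) are hypothetical. The point is simply that AI-generated advice stays in a pending state until a named clinician approves it, and the reviewer is surfaced to the patient.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class DraftAdvice:
    """Advice drafted by an AI model, pending expert review (illustrative)."""
    patient_id: str
    text: str
    model_name: str                     # which model produced the draft
    reviewed_by: Optional[str] = None   # clinician who approved it, if any

    @property
    def approved(self) -> bool:
        return self.reviewed_by is not None


def release_to_patient(advice: DraftAdvice) -> str:
    """Only clinician-approved advice reaches the patient, with the reviewer
    named so the human oversight is transparent to the user."""
    if not advice.approved:
        raise PermissionError("AI-generated advice must be reviewed by a clinician first.")
    return f"{advice.text}\n\n(Reviewed by {advice.reviewed_by}; drafted by {advice.model_name}.)"


# Example flow: the draft is blocked until a clinician signs off.
draft = DraftAdvice(patient_id="p-001",
                    text="Consider a follow-up blood test.",
                    model_name="some-llm")
draft.reviewed_by = "Dr. A. Example"
print(release_to_patient(draft))
```

Keeping the review step as an explicit gate, rather than an optional flag, is one way an app could guarantee that unreviewed model output never reaches a patient.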

2.    Users would be more willing to share personal data if they had more control over its use.

Participants were more willing to use personalised health apps when they felt more capable of understanding how the apps use their data. They were also more willing when they could control exactly how an app uses their data, and when they could limit who sees their data to trusted parties only. This suggests health apps can be improved by building in more granular data control options (Lee et al. 2024). Where possible, users should be able to easily choose what data they share and with whom. This could involve allowing users to opt in or out of specific data-sharing features and providing clear explanations of the implications of their choices (e.g., Baker et al. 2016; Kaye et al. 2015; Scoccia et al. 2020). It would also be worth exploring innovative data governance models that balance individual control over data with the value of data sharing for public health purposes.
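As an illustration of what such granular controls might look like, here is a minimal sketch in Python. The names (DataCategory, Recipient, ConsentSettings) are hypothetical and not drawn from the paper or any specific app; the idea, in the spirit of the dynamic consent models cited above, is that every data category and every recipient gets its own opt-in switch, off by default and revocable at any time.

```python
from dataclasses import dataclass, field
from enum import Enum


class DataCategory(Enum):
    """Kinds of personal data a health app might collect (illustrative)."""
    HEART_RATE = "heart_rate"
    LOCATION = "location"
    SLEEP = "sleep"
    MEDICAL_HISTORY = "medical_history"


class Recipient(Enum):
    """Parties who might receive shared data (illustrative)."""
    MY_CLINICIAN = "my_clinician"
    APP_DEVELOPER = "app_developer"
    PUBLIC_HEALTH_RESEARCH = "public_health_research"


@dataclass
class ConsentSettings:
    """Per-category, per-recipient opt-ins; everything is off by default."""
    grants: set[tuple[DataCategory, Recipient]] = field(default_factory=set)

    def opt_in(self, category: DataCategory, recipient: Recipient) -> None:
        self.grants.add((category, recipient))

    def opt_out(self, category: DataCategory, recipient: Recipient) -> None:
        self.grants.discard((category, recipient))

    def may_share(self, category: DataCategory, recipient: Recipient) -> bool:
        return (category, recipient) in self.grants


# Example: share heart-rate data with a clinician only; location stays private.
settings = ConsentSettings()
settings.opt_in(DataCategory.HEART_RATE, Recipient.MY_CLINICIAN)
assert settings.may_share(DataCategory.HEART_RATE, Recipient.MY_CLINICIAN)
assert not settings.may_share(DataCategory.LOCATION, Recipient.APP_DEVELOPER)
```

Checking `may_share` before any data leaves the device, and pairing each opt-in prompt with a plain-language explanation of its consequences, is one way to operationalise the fine-grained control participants said they wanted.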

3.    Hong Kong participants were generally more wary of sharing their personal data than their London peers were.

Respondents from Hong Kong were, on average, more wary of sharing their personal data than those from London. This was especially true in scenarios where a government agency or private company had developed an app or had access to its data. Hong Kong respondents more often expressed scepticism about their government’s ability to respect their data privacy, and perceived the punishments for data leaks, whether by companies or government departments, as overly lax. They were also concerned about the adequacy of Hong Kong’s current data protection and privacy laws with respect to Big Data and AI. Some respondents directly raised concerns about government surveillance, as was also the case during the rollout of COVID-19 contact tracing apps (Li et al. 2022). Given the political unrest and societal concerns in Hong Kong in recent years, work remains to be done to build trust in the public- and private-sector institutions that handle sensitive data (Cole and Tran 2022; Martin et al. 2022; Li and Yarime 2021).

4.    Women were generally more wary of personal data use than men.

Women tended to be more wary of sharing personal data than men. This may reflect higher average levels of anxiety and risk aversion among women, and a greater perceived safety risk both online and offline (Tifferet 2019). More effort should go into understanding how digital health technology regulations can be designed to address the privacy concerns of people of all genders. This is especially important because some health apps marketed towards women may not fully comply with existing data privacy laws, creating a gender gap in data protection (Alfawzan et al. 2022; Hammond and Burdon 2024).

Overall, our findings show that people do want tailored health advice from apps and may be willing to share their personal data to receive it. However, that willingness is shaped by their data literacy and sense of control, their comfort with sharing health and location data, their existing health concerns, their access to personalised health advice from a trusted source, and their willingness to grant data access to specific parties. We hope these insights can help develop better and more acceptable digital technology policies that improve health outcomes and public trust while preserving user data privacy.

About the authors:

Veronica Li is a PhD Candidate at the Department of Science, Technology, Engineering and Public Policy (STEaPP), UCL. She is also a Joseph Needham Merit Scholar.

Masaru Yarime is an Associate Professor at the Division of Public Policy and the Division of Environment and Sustainability at the Hong Kong University of Science and Technology and an Honorary Associate Professor at STEaPP, UCL.

Dr Vivi Antonopoulou is a Senior Research Fellow for the NIHR Policy Research Unit in Behavioural Science based in the Centre for Behaviour Change at UCL, providing research evidence for the Department of Health and Social Care. Vivi is a Chartered Psychologist and Associate Fellow of the British Psychological Society (BPS).

Dr Henry Potts is a Professor at the Institute of Health Informatics, UCL. His research focuses on the evaluation and development of digital health tools. He was the academic lead for the GOV.UK project “Evaluating Digital Health Products”, which is freely available online. He also worked on the COVID-19 response, advising the UK government and health service.

Dr Carla-Leanne Washbourne is a Reader at the Centre for Interdisciplinary Methodologies (CIM), University of Warwick and an Honorary Associate Professor in STEaPP, UCL.

References:
Alfawzan N, Christen M, Spitale G and Biller-Andorno N (2022) Privacy, Data Sharing, and Data Security Policies of Women’s mHealth Apps: Scoping Review and Content Analysis. JMIR mHealth and uHealth 10(5), e33735. doi: 10.2196/33735.
Baker DB, Kaye J and Terry SF (2016) Governance Through Privacy, Fairness, and Respect for Individuals. eGEMs 4(2), 1207. doi: 10.13063/2327-9214.1207.
Cole A and Tran É (2022) Trust and the Smart City: The Hong Kong Paradox. China Perspectives (130), 9–20.
Dzobo K, Adotey S, Thomford NE and Dzobo W (2020) Integrating Artificial and Human Intelligence: A Partnership for Responsible Innovation in Biomedical Engineering and Medicine. OMICS: A Journal of Integrative Biology 24(5). doi: 10.1089/omi.2019.0038.
Hammond E and Burdon M (2024) Intimate harms and menstrual cycle tracking apps. Computer Law & Security Review 55, 106038. doi: 10.1016/j.clsr.2024.106038.
Kaye J, Whitley EA, Lund D, Morrison M, Teare H and Melham K (2015) Dynamic consent: a patient interface for twenty-first century research networks. European Journal of Human Genetics 23, 141–146. doi: 10.1038/ejhg.2014.71.
Lee AR, Koo D, Kim IK, Lee E, Yoo S and Lee HY (2024) Opportunities and challenges of a dynamic consent-based application: personalized options for personal health data sharing and utilization. BMC Medical Ethics 29(92). doi: 10.1186/s12910-024-01091-3.
Li VQT, Ma L and Wu X (2022) COVID-19, policy change, and post-pandemic data governance: a case analysis of contact tracing applications in East Asia. Policy and Society 41(1), 129–142. doi: 10.1093/polsoc/puab019.
Li VQT and Yarime M (2021) Increasing Resilience via the Use of Personal Data: Lessons from COVID-19 Dashboards on Data Governance for the Public Good. Data & Policy 3, e29. doi: 10.1017/dap.2021.27.
Martin A, Mikołajczak G, Baekkeskov E and Hartley K (2022) Political stability, trust and support for public policies: a survey experiment examining source effects for COVID-19 interventions in Australia and Hong Kong. International Journal of Public Opinion Research 34(3), 1–10. doi: 10.1093/ijpor/edac024.
Scoccia GL, Autili M, Pelliccione P, Inverardi P, Fiore MM and Russo A (2020) Hey, my data are mine! Active data to empower the user. In ICSE-NIER ’20: Proceedings of the ACM/IEEE 42nd International Conference on Software Engineering: New Ideas and Emerging Results. Institute of Electrical and Electronics Engineers Inc., 85–88. doi: 10.1145/3377816.3381726.
Tifferet S (2019) Gender differences in privacy tendencies on social network sites: A meta-analysis. Computers in Human Behavior 93, 1–12. doi: 10.1016/j.chb.2018.11.046.
