Placing people at the centre
A just digitalisation of the healthcare system
by Bianca Kastl, Manuel Hofmann, Vanessa Schaffrath, Katharina Klappheck and Elisa Lindinger
We need more data to enable better health research: this claim gained popularity during the pandemic, when clinical research, administration and society clearly had to react rapidly to a whole series of new medical challenges. The claim has since been extended to healthcare as a whole. This is evident in the German Health Data Use Act (GDNG), a law aiming to improve the use of health data, which entered into force in March 2024. It is intended to regulate the processing of all health and care data and represents Germany's first step towards the EU project of the European Health Data Space (EHDS). The legislation has a major stated objective: to “serve the interests of patients and the community and place citizens at the centre of all activities”. Whether this can actually be achieved requires a differentiated evaluation from the perspective of those who will be most affected by the Health Data Use Act: the patients.
Self-determination and the “new” personal health record
One of the first modules of the digital healthcare system is the personal health record (ePA), introduced in 2021. It officially aims to simplify and improve patient care. However, health data are sensitive data, and discrimination in healthcare is real. The healthcare system is in fact the area in which people with HIV experience the most discrimination.
Many other people also experience discrimination in their everyday medical treatment: LGBTQ people, people of colour, drug users and people with certain religious or ascribed religious identities. Diagnoses can also be derived from treatments and medication. This is why EU-wide digitalisation projects can pose real risks for LGBTQ people or for women who have had abortions, because their rights are not protected in some EU countries. EU-wide harmonisation objectives for the health data space can therefore increase the risk of future discrimination.
Although the new ePA will still make it possible to “hide” individual documents, the proposed options are insufficient to safeguard comprehensive, self-determined handling of sensitive information. For example, new documents cannot be set to “only visible to me” by default, nor released only to a self-selected group of doctors (e.g. trusted persons such as a general practitioner). Real self-determination over health data is therefore complicated, and almost impossible to implement when numerous, parallel visits to various physicians are involved. Self-determination over their own data is also harder for patients who struggle with technology. Real data ownership is therefore not a given.
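To make the missing functionality concrete, the following sketch (in Python, and purely hypothetical: the ePA specification defines no such interface, and all names are invented for illustration) shows what per-document defaults such as “only visible to me” and a patient-defined group of trusted practitioners could look like as a simple access model.

```python
from dataclasses import dataclass, field
from enum import Enum, auto


class Visibility(Enum):
    """Hypothetical visibility levels for a record entry."""
    ONLY_ME = auto()        # visible to the patient alone
    TRUSTED_GROUP = auto()  # visible to a patient-defined group of practitioners
    ALL_TREATING = auto()   # visible to every treating practitioner


@dataclass
class DocumentPolicy:
    """Sketch of per-document access settings a patient could control."""
    document_id: str
    visibility: Visibility = Visibility.ONLY_ME  # privacy-preserving default
    trusted_practitioners: set = field(default_factory=set)

    def is_visible_to(self, practitioner_id: str) -> bool:
        if self.visibility is Visibility.ONLY_ME:
            return False
        if self.visibility is Visibility.TRUSTED_GROUP:
            return practitioner_id in self.trusted_practitioners
        return True  # ALL_TREATING


# Example: a new lab result defaults to "only visible to me";
# the patient then releases it to their general practitioner only.
policy = DocumentPolicy(document_id="lab-2024-001")
policy.visibility = Visibility.TRUSTED_GROUP
policy.trusted_practitioners.add("gp-mueller")
print(policy.is_visible_to("gp-mueller"))       # True
print(policy.is_visible_to("occ-med-schmidt"))  # False
```

The point is the default: as long as new documents do not start out in the most restrictive setting, self-determination depends on patients actively reconfiguring every single entry.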
Rights to object
The GDNG focuses on university, clinical and private-sector research rather than on patient interests. The planned regulations will allow the former to access more, and highly comprehensive, data. Patients, in contrast, are denied the principle of informed consent to the disclosure of highly private, sensitive data. Instead, they can only opt out: without their active objection, their data can be transferred to and stored by health research data centres. The German law to accelerate the digitalisation of the healthcare system (DIGI-G), which was pursued in parallel to the GDNG and also entered into force in March 2024, defines only a few specifically protected areas in which patients must be informed about their right to object. These include data on “HIV infections, psychological illnesses and abortions”.
From an intersectional perspective, numerous further protections are necessary, for instance where marginalised identities (such as those of people with disabilities) are explicitly recorded in, or can be implicitly derived from, health data.
It is also unclear how an active objection to data transfer can be implemented correctly in practice. The opt-out approach is also problematic from a power-critical perspective: it can lead to situations in which patients feel obliged to choose between their own interest in privacy and passing on data in the interests of research. As research, including commercial research, is generally portrayed in the law as beneficial, this can make it even harder to stand up for personal needs.
The uneven distribution of power and participation
The planned policies increase the pressure on patients to pass on their data to research, and research companies profit the most. Yet their use of society's health data is not coupled with conditions that would serve the public interest, for example an obligation to provide open access or even patent exemptions. Such an imbalance between taking and giving is unfair to patients. Disabled and chronically ill people are particularly affected: their data could be especially relevant for research projects, e.g. due to its rarity, giving them an involuntary pioneering role in medical innovation. Yet they do not benefit from any profits generated with their data. On the contrary: in certain circumstances they have to pay exorbitant prices for precisely those medicines that they, or other people with similar illnesses or disabilities, made possible in the first place through their consent to data transfer.
A similar imbalance arises where the healthcare system and the world of work intersect. Under the proposed legislation, company physicians can access patient data with the patient's permission. Company physicians also assess applicants' suitability for a job, which puts them in a position of power. They should not have general access to such data, because certain diagnoses can have unpredictable or hard-to-assess consequences for working life.
A further issue is that, while some far-reaching diagnoses are a prerequisite for access to necessary assistance or treatment, they can also lead to disadvantages in the labour market. This forced pathologisation is another structural problem that the digitalisation of the healthcare system will expand and reinforce.
Promises that can’t be kept
“More data equals better data” is the creed by which companies using machine learning techniques swear. It justifies their uninhibited appetite for data collection and undermines the numerous appeals for data minimisation issued by human rights organisations. The assumption that more data will make machine learning, and thus medical development, better and more efficient is not always correct: the almost endless number of individual data points that a lifelong medical history inevitably produces means that machine learning processes can latch onto spurious correlations while overlooking relevant relationships.
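A minimal, self-contained sketch (in Python, using only synthetic random numbers rather than any real health data) illustrates the statistical point: once the number of recorded features is large relative to the number of patients, some features will correlate with an outcome purely by chance.

```python
# With enough unrelated features, some will correlate with the outcome
# purely by chance, which a naive learning pipeline may mistake for signal.
import numpy as np

rng = np.random.default_rng(0)
n_patients, n_features = 200, 5000

# Entirely random "health record" features and a random binary outcome:
# by construction there is no real relationship to find.
X = rng.normal(size=(n_patients, n_features))
y = rng.integers(0, 2, size=n_patients)

# Correlation of each feature with the outcome.
correlations = np.array(
    [np.corrcoef(X[:, j], y)[0, 1] for j in range(n_features)]
)

# Several features nevertheless look "predictive", even though everything is noise.
print("strongest spurious correlation:",
      correlations[np.abs(correlations).argmax()])
print("features with |r| > 0.2:", int((np.abs(correlations) > 0.2).sum()))
```

In this toy setting, dozens of features appear correlated with the outcome even though all of them are pure noise; collecting ever more variables per patient makes this problem worse, not better.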
In the current proposed legislation, the storage period for health data is set at 100 years. This will lead to an almost insurmountable volume of data that needs to be administered. It also creates an extremely lucrative repository of highly confidential, centrally stored and insufficiently anonymised data. How can this data be protected effectively against unauthorised access over the long term?
A right to deletion, or a right to be forgotten, is not proposed for research data. And even though research data records are to be pseudonymised, this will not be enough to protect individual patients: a patient with a rare disease that is of particular interest to research, for instance, is far easier to identify than one with a common illness.
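A deliberately simplified sketch (in Python, with entirely invented records) illustrates why: a pseudonym removes the name, but a rare diagnosis combined with a few quasi-identifiers can still single out one person.

```python
# Why pseudonymisation alone offers little protection for rare conditions:
# a unique combination of quasi-identifiers (year of birth, region, rare
# diagnosis) singles a person out even though the name has been replaced
# by a pseudonym. All records below are invented.
pseudonymised_records = [
    {"pseudonym": "a91f", "birth_year": 1974, "region": "Berlin",
     "diagnosis": "hypertension"},
    {"pseudonym": "c07b", "birth_year": 1974, "region": "Berlin",
     "diagnosis": "rare metabolic disorder"},
    {"pseudonym": "e3d2", "birth_year": 1988, "region": "Bremen",
     "diagnosis": "hypertension"},
]

# Background knowledge an attacker might plausibly have about one person:
known = {"birth_year": 1974, "region": "Berlin",
         "diagnosis": "rare metabolic disorder"}

matches = [r for r in pseudonymised_records
           if all(r[k] == v for k, v in known.items())]

# Exactly one match: the "pseudonymised" record is re-identified.
print(len(matches), matches[0]["pseudonym"])  # 1 c07b
```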
Opportunities are only possible with a completely new set-up
The focus of the GDNG is on the rapid development of a central research data infrastructure. The accompanying reuse of significant parts of the existing telematics infrastructure will result in long-term lock-in to technically obsolete systems. Yet implementing this law actually offers a unique chance to provide more transparency, self-determination and participation for patients, as well as more health data research, through the use of modern privacy-enhancing technologies. Open development processes that take all perspectives into account, like those used for the German corona warning app, could lead to a truly patient-centred record.
However, we need clear, implementable design principles for such development processes, ones that can significantly improve public digitalisation projects well beyond the healthcare system. The focus here should be on critical reflection on dependencies, whether legal, technical or social. Patients need suitable objection and intervention options that make it as easy as possible for them to exercise their rights.
The development of a digital healthcare system will benefit from being open and scientifically supported. Entirely new options for empowering patients emerge if data does not just flow in one direction, but the resulting research knowledge and treatment options flow back to patients. A design that serves both patients and doctors would be accepted more readily and would not require coercive measures such as opt-out regulations or even sanctions.
As with numerous digitalisation projects, the basic question is which infrastructures we need. So far, approaches to implementing and operating this system in the public interest, i.e. securely, data-efficiently and in a privacy-protecting way, are still lacking. We need new forms of accountable public responsibility for such infrastructures.
About the authors
This text was developed with the participation of Bianca Kastl (Innovationsverbund Öffentliche Gesundheit), Elisa Lindinger (SUPERRR Lab), Manuel Hofmann (Deutsche Aidshilfe), Vanessa Schaffrath and Katharina Klappheck, and coordinated by SUPERRR Lab. The authors contributed their different perspectives on the digitalisation of the healthcare system. This article is an initial outline of the issue: we present the fundamental risks inherent in the political project but cannot offer a conclusion, as the social effects of the digital transformation of healthcare are too wide-ranging. This initial, multi-perspective evaluation nevertheless shows how complex and far-reaching the topic is, and that this complexity has not yet been sufficiently reflected upon.