Pandemic-related healthcare advances raise ethical and political hurdles

Never in the field of human health has so much data been provided by so many people with so few safeguards as during the Covid-19 pandemic.

From passive sharing of medical records to active participation in clinical trials, from downloading diagnostic test results to using mobile phone apps to track individuals’ locations, the consumer digital age has been characterized by an increase in tools and information to fight the infection.

The unprecedented power of technology has helped to mitigate the worst effects of the pandemic, including allowing people to work and study remotely in ways that would have been unthinkable just a decade ago.

Likewise, it has offered direct health benefits, with more effective control of transmission, accelerated scientific and medical knowledge, and improved development of vaccines and drugs. Technology platforms have accelerated the shift to online medical consultations, and the data they collect has had much broader benefits.

But much of this has come via mandatory “test, track and trace” systems, isolation requirements and proof-of-vaccination rules imposed by governments, which restrict movement and step up surveillance. Western democracies used powers rarely deployed in peacetime, while authoritarian regimes adopted such measures even more aggressively.

The speed of these advances in digital health has left checks and balances lagging behind, fueling mistrust in governments and corporations. This risks undermining future gains unless health innovations are accompanied by new “data solidarity” approaches to balance public and private benefits, according to the findings of The Lancet & Financial Times Commission on Governing Health Futures 2030.

The speed of progress in digital health has left checks and balances lagging behind © Sarah Hanson

Steve Davis, consultant and author of Undercurrents: Channeling Outrage to Spark Practical Activism, describes the digital revolution as “net-net one of the most powerful things that will ever happen to human health”. But he recently argued that “there is a huge gap in understanding what is available, ecosystems are fragile, there are no clear policies on data governance, on digital privacy, on misinformation management”.

While many people willingly share their personal data through social networks run by corporations, and with governments, the pandemic has crystallized specific concerns around health. Medical data is perceived as particularly sensitive, and its forced extraction can breed resentment and lead to inconvenience or discrimination.

In the UK, the Information Commissioner’s Office (ICO) began investigating claims last year that at least one major Covid-19 testing company had included a notice – buried in detailed terms and conditions – stating that it could store clients’ DNA and other genetic information and share it with external researchers.

This highlighted concerns about the potential commercial exploitation of information derived from government-mandated testing of travelers in the name of public health. Other concerns have emerged over wider sharing of Covid-19-related data with law enforcement, and these have not always been fully assuaged by the regulator. The ICO, for example, says it has “received assurances that there is no automatic bulk sharing of NHS Test and Trace data with police forces”. It adds that “limited data can be shared under strict controls where police suspect self-isolation rules have been broken.”

In Singapore, authorities have won praise for their swift action to control the spread of the coronavirus with the TraceTogether program in 2020. But, last year, new legislation was hastily passed to put in place additional safeguards on surveillance after officials revealed that data collected for coronavirus control had been used in a criminal investigation.

The collection and use of health data will only grow in the future © Sarah Hanson

The collection and use of health data is only growing, offering the potential to significantly aid the prevention and treatment of disease. However, it also risks creating ever more divergent outcomes between rich and poor regions and countries, between older and younger people, and between those whose data is better or worse integrated into health systems.

At the most basic level, much information is still not systematically collected, digitized or shared – from details of patient discussions with doctors in the United States to medical records in the poorest countries. Wilfred Njagi, managing director of Villgro Africa, a Kenya-based healthcare investor, says medical information from clinics in his country remains “a black hole – and a huge opportunity”.

Bridging this “digital divide” will, however, require substantial investment. Hila Azadzoy, Managing Director of the Global Health Initiative at Ada Health, which is experimenting with artificial intelligence to diagnose diseases in Tanzania, Uganda and South Africa, says: “People agree that we need digital solutions. With the pandemic, healthcare systems, governments and the private sector are realizing that it really is a must-have, not a nice-to-have.”

But many argue there should be stronger privacy safeguards, given periodic data leaks and inappropriate sharing of sensitive information. Privacy International, an advocacy group, for example, exposed the sale to advertisers of individuals’ mental health information collected by apps in France, Germany and the UK.

Greater trust requires stronger safeguards and scrutiny of AI built on imperfect information © Sarah Hanson

Greater trust also requires stronger safeguards and closer scrutiny of AI built on imperfect information. In the United States, for example, poorly constructed algorithms used by health insurers to identify at-risk patients and offer them better support have been shown to discriminate against African Americans.

Darlington Akogo, the founder of an AI-based radiology diagnostics company in Ghana, is part of an “international think tank on artificial intelligence for health” that seeks to help regulators analyze and verify machine learning. “My optimism has increased, but so has my skepticism,” he says. “It is clear that we need AI to support healthcare in Africa. These tools have a lot of potential, but they might not be quite ready. We need more evaluation before rolling them out widely.”

More rigorous evidence and scrutiny is also needed to demonstrate the clinical effectiveness and cost-effectiveness of many health technologies. The evidence base in most of these areas, including mental health, remains limited.

Tobias Silberzahn, a partner at consultancy McKinsey, says one of the problems with digital health initiatives during the pandemic has been the failure to provide enough useful information directly relevant to individuals, such as personalized treatment advice on an app tailored to their own risk factors and stage of infection.

He suggests that future health programs need to be “fun, practical and effective”, and that there is substantial potential for integrating medical data with broader “wellness” information, such as sleep, nutrition, stress and movement tracked by wearable devices.

But Pooja Rao, co-founder of Qure.ai, an Indian AI-based health company, suggests that such broader data integration must emphasize the primacy of individuals as owners and controllers of their personal health information, with the right to move it between different health systems. “There is a lack of trust in private actors and the government,” she says.

This highlights the need for new institutions, such as data trusts or cooperatives to oversee any wider sharing of health records, as well as “participatory” digital health tools developed directly with and for users.

As Amandeep Gill, Managing Director of the International Digital Health & AI Research Collaborative, puts it: “We have a data privacy and security paradigm. We need to shift the conversation to a data empowerment paradigm, in which the citizen has more power over choices about their data.”
