Ethical considerations


Introduction

At the core of trade-off analysis lies the observation that choosing among solution alternatives means choosing among sets of trade-offs. A first step is therefore to look for objective requirements that might have been missed, and to make the priorities among the requirements explicit: do we value requirement x over requirement y?
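As a purely illustrative sketch (not part of the source model), such priorities can be made explicit with a simple weighted scoring of alternatives; the requirement names, weights and scores below are hypothetical.

  # Illustrative only: a minimal weighted trade-off scoring of solution
  # alternatives. Requirements, weights and scores are hypothetical.

  # Relative priorities among requirements (higher = valued more).
  weights = {"effectiveness": 0.5, "privacy": 0.5}

  # Assumed degree (0-1) to which each alternative satisfies each requirement.
  alternatives = {
      "alternative A": {"effectiveness": 0.8, "privacy": 0.4},
      "alternative B": {"effectiveness": 0.6, "privacy": 0.8},
  }

  def score(requirement_scores, weights):
      """Weighted sum over the requirements."""
      return sum(weights[r] * s for r, s in requirement_scores.items())

  for name, req_scores in alternatives.items():
      print(f"{name}: {score(req_scores, weights):.2f}")

Changing the weights makes the value judgement visible: valuing privacy over effectiveness can reverse the ranking of the alternatives.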

The delicacy of the debate about solutions in this problem domain stems from the balance to be found between effectiveness and privacy preservation - the inherent trade-off. The problem would be easy to solve if data gathering could be unrestricted because there were no risk of abuse of those data. But abuse obviously exists and restrictions are needed - privacy preservation and data protection are fundamental rights.

Privacy and data protection are, however, not absolute. Proportionality is at the core of any formal or informal regulation - as it is in the GDPR. People can choose to hand over data, and states can require citizens to do so - with the inherent risks - in order to obtain a benefit. The question is whether the data gathering and processing is necessary and proportionate for the purpose. It is therefore not unthinkable that the fight against a lethal and economy-ruining pandemic would warrant a proportionate and time-limited processing of personal data.

The proper trade-off between effectiveness and privacy within the legal boundaries is then an ethical question, hence this section.

Model Source

Throughout this publication we use the model proposed by Gasser et al. (see arXiv:2004.10236 [cs.CY], version arXiv:2004.10236v1) for “framing the ethical challenges and how to address them”.

They distinguish four types of public health technologies:

  • Symptom checkers (mapping on healthcare provisioning outcome)
  • Proximity & contact tracing (mapping on transmission tracing outcome)
  • Quarantine compliance (not included in our scope)
  • Flow modeling (mapping on the policy making outcome)

These technologies raise the following legal and ethical issues:

  • Scientific validity, accuracy, and data necessity (see Effectiveness and Privacy)
  • Privacy (see Privacy)
  • Consent & voluntariness (see Privacy)
  • Discrimination (see Effectiveness)
  • Repurposing (see Privacy)
  • Expiration (see Privacy)
  • Digital inequality (see Effectiveness)
  • Public Benefit (the discussion to have)
  • Systemic accountability

These issues can be mapped onto six ethical principles:

  • Autonomy
  • Solidarity
  • Privacy
  • Non-maleficence
  • Justice
  • Beneficence

Figure: Ethical principles for digital public health technologies (Source: arXiv:2004.10236v1)

Finally, a navigation aid is proposed, which we will use as the assessment model for the ethical aspects:

  • Establish guiding ethical principles
  • Distinguish tools from purpose
  • Avoid lock-in and path dependency
  • Conduct risk assessments
  • Plan preemptively
  • Embrace privacy “by design” and “by default” approaches
  • Assemble the right team
  • Communicate proactively and continuously
  • Create systemic accountability
  • Keep records and capture learnings

Figure: Ethical principles, legal issues and recommendations (Source: arXiv:2004.10236v1)

Assessment Model

Based on the above, and filtering out the aspects already covered by the Effectiveness and Privacy and Security requirements, we suggest the following assessment model (a minimal checklist sketch follows the list):

  • Discussion about non-maleficence and beneficence
  • Commitment to the navigation aid components
    • Establish guiding ethical principles
    • Distinguish tools from purpose
    • Avoid lock-in and path dependency
    • Conduct risk assessments
    • Plan preemptively
    • Embrace privacy “by design” and “by default” approaches
    • Assemble the right team
    • Communicate proactively and continuously
    • Create systemic accountability
    • Keep records and capture learnings
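As a purely illustrative sketch, the assessment model can also be expressed as a simple checklist to be filled in per solution alternative; the item names are taken from the navigation aid, while the checking logic itself is a hypothetical addition.

  # Illustrative only: the suggested assessment model as a checklist that can
  # be filled in per solution alternative. Item names come from the navigation
  # aid; the checking logic is a hypothetical addition.

  NAVIGATION_AID = [
      "Establish guiding ethical principles",
      "Distinguish tools from purpose",
      "Avoid lock-in and path dependency",
      "Conduct risk assessments",
      "Plan preemptively",
      "Embrace privacy 'by design' and 'by default' approaches",
      "Assemble the right team",
      "Communicate proactively and continuously",
      "Create systemic accountability",
      "Keep records and capture learnings",
  ]

  def open_points(commitments, maleficence_beneficence_discussed):
      """Return the assessment items that are still open for an alternative."""
      remaining = [item for item in NAVIGATION_AID if not commitments.get(item, False)]
      if not maleficence_beneficence_discussed:
          remaining.append("Discussion about non-maleficence and beneficence")
      return remaining

  # Example: an alternative that has so far only committed to risk assessments.
  print(open_points({"Conduct risk assessments": True}, maleficence_beneficence_discussed=False))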
