If a computer diagnoses me, who do I sue?
A medical malpractice claim is founded on the relationship between a doctor and a patient: the doctor owes the patient a duty of care and breaches that duty by failing to act as a reasonably prudent physician would. Liability turns on whether the medical care was properly performed. Without a doctor-patient relationship, a consulting doctor owes no duty to the patient.
When a computer assumes the role of the consulting doctor, the question arises whether that computer owes a duty to the patient. Artificial intelligence is fast becoming vital in providing treating doctors with quicker and more reliable information with which to treat patients. However, the shift of the decision-making process to an algorithm complicates the question of liability when a doctor provides improper treatment as a result of an error in the computer's algorithm.
A computer that diagnoses patients using an algorithm can, by its very nature, achieve a higher accuracy rate than the average doctor, and following the computer's advice would statistically be the best medical option.
As computers cannot yet independently treat and interact with patients, a duty of care to the patient cannot be attributed to the computer. Courts have held that where a doctor has merely given advice without examining or consulting the patient, no doctor-patient relationship arises and thus no claim for medical malpractice can follow. Therefore, as a computer can only recommend treatment, no legal duty arises.
Further, are computers even capable of committing negligent acts? A medical malpractice claim requires that the doctor failed to act as a reasonable doctor would. Because computers run on carefully calculated systems of algorithms, constantly weighing the risks and probabilities of various outcomes, a computer's actions and decisions are inherently reasonable.
It is possible that future medical malpractice claims will be based on the vicarious liability of hospitals that use artificial intelligence to diagnose and treat patients, and doctors and hospitals may take out professional indemnity insurance to cover claims arising from their artificial intelligence. The programmer could be held accountable if the algorithms used are sub-standard and fail to take comprehensive and up-to-date account of either the current factual matrix or medical theory.
An incorrect diagnosis caused by a hardware failure in the computer could give rise to a product liability claim against the manufacturer. Claims against the owner of the computer may arise where a failure to maintain the computer correctly results in injury to a patient.
As artificial intelligence improves and is used more widely in medical diagnosis and decision-making, the traditional requirements of medical malpractice will become harder to apply. In the near future, long-established legal principles will need to change and adapt to accommodate technological advancements in the medical field.
Learn more about our digital health legal services
We offer various services relating to e-health and medical law, both in South Africa and globally, to all stakeholders, including patients, healthcare providers, pharmaceutical companies, startups and tech companies. Malcolm Lyons and Brivik specialise in medical law, personal injury law, labour and road accident claims, and have been recognised as leading attorneys in South Africa since 1965.
Contact our offices below for further information
0861 MLB INC
Johannesburg Office
+27(0) 011 268 6697
Cape Town Office
+27(0) 21 425-5570
#hcsmSA #hcsm #eHealth #AI #healthIT #BigData