A glance at your iPhone quickly unlocks your screen, thanks to facial recognition technology (FRT). This technology is rapidly advancing in health care. Facial biometrics are being used to diagnose rare genetic conditions, trace contacts during disease outbreaks, and even identify whether a person is suffering from depression.
The AMA Journal of Ethics notes, “FRT is being developed to predict health characteristics, such as longevity and aging. FRT is also being applied to predict behavior, pain, and emotions by identifying facial expressions associated with depression or pain, for example. Another major area for FRT applications in health care is patient identification and monitoring, such as monitoring elderly patients for safety or attempts to leave a health care facility or monitoring medication adherence through the use of sensors and facial recognition to confirm when patients take their medications.”
Oosto, a security company that specializes in facial recognition technology, says FRT could be used for public health initiatives such as retracing a person’s footsteps to enforce “quarantine efforts remotely,” or programmed to alert authorities “when it sees that someone is not wearing a mask,” as occurred in China during the pandemic.
The breadth of potential uses of FRT raises serious privacy concerns. As the US National Library of Medicine reports, “Scholars consider privacy a central ingredient in positive human functioning.” But is privacy attainable if aspects of your medical information could be known through a facial scan that you may not even realize is happening? Additionally, should consent be required before commencing an FRT scan that potentially stores a patient’s biometric data?
Although most people feel comfortable giving their biometric data to Apple, many hesitate when they realize that a hospital or clinic could potentially use FRT to identify them or their medical conditions. In a PLOS One study, over 70 percent of more than 4,000 surveyed patients indicated that they were “very” or “somewhat” concerned about data privacy if FRT were used to gather health data about them.
Patient concerns regarding FRT should prompt hospital administrators to reconsider the use of facial recognition in their facilities. Lawmakers may want to consider legislation that would protect a person’s right to move about freely without being subjected to biometric analysis at every turn. A person cannot operate freely in society if they must hide their face to achieve privacy.
Americans face growing biometric privacy risks, and laws regulating privacy in health care have not kept pace with advances in biometric technology. State laws vary significantly, with some — such as the Illinois Biometric Information Privacy Act (BIPA) — requiring informed consent before biometric information is collected, and others imposing no restrictions at all. At the federal level, anti-discrimination laws such as the Genetic Information Nondiscrimination Act (GINA) do not prohibit discrimination or mistreatment based on a diagnosis generated through FRT. Nor would the Americans with Disabilities Act of 1990 protect individuals from discrimination in schools or the workplace if conditions diagnosed through FRT “are currently unexpressed.” Without sound guidance and privacy protections, the use of FRT, and the presumption that its findings are fact, could significantly harm patients and their futures.
To protect patient rights, FRT should not be incorporated into health care. As the AMA Journal of Ethics notes, using FRT carries significant risks of harm, including the erosion of patient trust in physicians, which could lead to misdiagnosis, and potential security issues such as identity theft.
Patients visit clinics to see doctors, not to be digitally diagnosed or treated by a robot — the future goal of some FRT proponents. Reducing direct physician involvement in diagnosis and treatment limits patient access to the highest level of medical expertise, potentially leading to improper diagnosis, inappropriate treatment, or delayed care. Some claim that FRT may be able to identify patients quickly or aid in diagnostics, but if a patient feels that the technology has violated their privacy or limited their treatment choices, the health care facility will lose that patient’s trust — something that cannot be regained with the click of a button.
The views and opinions expressed in this commentary are those of the author and do not represent an official position of Alpha News.