How Informed Is Informed Consent? Putting Patients Back In Charge of Their Data - The HSB Blog 8/2/21
Our Take:
Constant advances in technology and mobile health (mHealth) apps help improve care and the patient experience, but they also raise questions around ownership of patient data (both beneficial and nominal), informed consent for data usage, and the security of that data. The questions and ethical dilemmas around data are numerous: Will the data be kept confidential? Will the developer and user of the app properly safeguard and protect personal health information? Who is the rightful owner of healthcare data for clinical/research purposes or for the commercialization of that data? Will the patient have the right to access the data and determine how, and for how long, it will be stored? These are just a few of the questions that surround a patient's right to know and understand what might become of their healthcare data. Ultimately, it is by exercising their right to consent, and by understanding how their data is stored, transmitted, accessed, and owned, that individuals can make informed decisions.
Key Takeaways:
According to Visual Capitalist, the average American would need almost 250 hours to properly read all the digital contracts they accept for online services.
Many mHealth apps on the market lack appropriate privacy and security measures and also fail to inform users how their data will be used.
One study found that of 79 apps certified as clinically safe and trustworthy, 89% transferred information online, 66% of which did so without encryption.
Continuing advances in technology and mHealth apps have made it increasingly difficult to protect user data and have highlighted the issues around patient access to data.
The Problem:
The Health Insurance Portability and Accountability Act (HIPAA) addresses the need for privacy of medical records while acknowledging the need for, and importance of, individuals being able to access their own health information. To address the need for secure and private patient data, many mobile health apps develop lengthy "terms of use" documentation detailing exactly how users' healthcare information will be used. However, the sheer length of these documents and patients' lack of healthcare/legal literacy often result in individuals skipping this step, assuming no harm will be done in the long run. In fact, in 2014 six Britons agreed to give up their firstborn child in return for free WiFi during a very brief experiment run by an internet security company before it was shut down. Because these policies are so commonly ignored, patients often lose control over what and how much of their information will be tracked, used, and shared. Moreover, even in the rare instances when these policies clearly and succinctly disclose what information will be shared with and used by the apps, the apps often ask for much more information than they actually need to perform their tasks.
The Backdrop:
The Health Insurance Portability and Accountability Act (HIPAA), passed in 1996, was among the first regulations to directly address the privacy of medical records. HIPAA acknowledges the need for, and importance of, individuals being able to access their health information and to trust that it will be used and disclosed according to their expectations and with full transparency. Patient data confidentiality, privacy, and data security have an important place within the healthcare industry. Continuing advances in technology, which now include GPS-enabled tracking and social media cookies that track consumption patterns, when combined with mobile health apps, have made it even more difficult to protect (or de-identify) health data and to protect patients. For example, as was pointed out in a Health Affairs article entitled "Why Aren't More Patients Electronically Accessing Their Medical Records (Yet)?," HIPAA, the predominant legal framework for health data, is already wildly insufficient for protecting health data, both because the re-identification of de-identified data becomes increasingly easy as the volume of data about individuals grows and because HIPAA applies only to a set of "covered entities," which do not always include many of the parties developing and using new health apps and services. The latest studies indicate that research participants can be identified by their MRI scans alone, even after the scans have been stripped of all identifying information, including the 18 identifiers that HIPAA's Safe Harbor method requires be removed for data to be considered de-identified. In addition, even when patients want to access and understand what data they may have given to providers and payers, current system controls make it incredibly difficult for them to gain access to their own data. For example, while the recently implemented CMS and ONC data interoperability rules should provide patients with greater access to their own data, the rules are still in the implementation phase, and it remains to be seen whether they will eventually provide patients with more timely, easier access to their data on a device of their choice.
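As a simple illustration of why stripping identifiers offers only limited protection, the hypothetical Kotlin sketch below removes direct identifiers of the kind HIPAA's Safe Harbor method targets while leaving quasi-identifiers (a truncated ZIP code and a birth year) in place; it is these remaining fields that, when linked against other data sets, make re-identification increasingly easy. The record fields and function names are illustrative assumptions, not taken from any cited study.

    // Hypothetical sketch: Safe Harbor-style de-identification of a patient record.
    // Direct identifiers are dropped, but quasi-identifiers (ZIP prefix, birth year)
    // remain, which is why linkage with other data sets can still re-identify people.

    data class PatientRecord(
        val name: String,
        val phone: String,
        val street: String,
        val zipCode: String,
        val birthDate: String,    // ISO date, e.g. "1984-07-02"
        val diagnosisCode: String
    )

    data class DeidentifiedRecord(
        val zip3: String,         // first 3 digits of the ZIP code only
        val birthYear: String,    // year only, no month or day
        val diagnosisCode: String
    )

    fun deidentify(record: PatientRecord): DeidentifiedRecord =
        DeidentifiedRecord(
            zip3 = record.zipCode.take(3),
            birthYear = record.birthDate.take(4),
            diagnosisCode = record.diagnosisCode
        )
        // Name, phone, and street are simply dropped; nothing here prevents
        // re-identification when zip3 + birthYear + diagnosis are rare in combination.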
Similar issues arise with mHealth apps, where both data privacy and data ownership concerns affect patients. For example, in an article entitled "Developments in Privacy and Data Ownership in Mobile Health Technologies, 2016-2019," the authors concluded that many mHealth apps on the market lack appropriate privacy and security measures. They noted that of 79 apps certified as clinically safe and trustworthy, 89% transferred information online, 66% of which did so without encryption. In another study of 137 mHealth apps, more than 60% allowed transmission of health information via insecure methods, and 40% failed to protect the integrity of the data they displayed. In addition, many health applications present pages of legal descriptions and disclosures before a user gains access, yet never explicitly say with whom, why, and when personal data will be shared. Moreover, users are typically confronted with privacy policies that are lengthy and not written in user-friendly language, leaving them unclear on what they are actually consenting to. As noted earlier, out of frustration and a sense of powerlessness ("I have no choice"), users often "agree" and continue without clearly understanding the depth of the privacy policies, assuming there is minimal risk in using the application.
Finally, apps often end up inadvertently or indirectly obtaining data through inappropriate or overly broad permissioning. For example, in an article entitled "A Privacy and Security Analysis of Early-deployed COVID-19 Contact Tracing Android Apps," the authors examined certain COVID contact tracing apps that operated on Bluetooth Low Energy technology and that, on their face, did not require software permissions to be granted for things such as access to location data, the microphone, or a user's contact details. This is particularly important because, as noted in Wikipedia, software permissions are a means of controlling and regulating access to specific system- and device-level functions by software. Generally, permissions cover functions that may have privacy implications, such as the ability to access a device's hardware features (including the camera and microphone) and personal data (such as device storage, the contacts list, and the user's present geographical location). In the COVID contact tracing apps, the authors found that the only requirement was to agree to basic software permissions, and survey results showed the apps did not require access to sensitive information. However, upon examining the apps' run-time permission accesses, the authors found the apps were actively accessing such permissions, effectively gaining access to private data without a user's consent. The authors also noted these apps were not transparent regarding their data collection, processing, sharing, and transfer practices, which led to concerns about whether they comply with existing privacy laws.
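To make the distinction between declared and run-time permissions concrete, the minimal Kotlin/Android sketch below (a hypothetical fragment, not code from any of the apps studied) shows the two steps an app is expected to take before touching a protected resource such as precise location: declaring the permission at install time and explicitly requesting the user's consent at run time. The class and variable names are illustrative only.

    // Hypothetical illustration of Android's permission model.
    // 1) Install-time declaration: the permission must be listed in AndroidManifest.xml, e.g.
    //    <uses-permission android:name="android.permission.ACCESS_FINE_LOCATION" />
    // 2) Run-time request: for "dangerous" permissions the app must also ask the user.

    import android.Manifest
    import android.content.pm.PackageManager
    import androidx.appcompat.app.AppCompatActivity
    import androidx.core.app.ActivityCompat
    import androidx.core.content.ContextCompat

    class TracingActivity : AppCompatActivity() {

        private val locationRequestCode = 42  // arbitrary request identifier

        // Returns true only if the user has explicitly granted fine-location access.
        private fun hasLocationPermission(): Boolean =
            ContextCompat.checkSelfPermission(
                this, Manifest.permission.ACCESS_FINE_LOCATION
            ) == PackageManager.PERMISSION_GRANTED

        // Prompts the user; the system shows the consent dialog and records the decision.
        private fun requestLocationPermission() {
            if (!hasLocationPermission()) {
                ActivityCompat.requestPermissions(
                    this,
                    arrayOf(Manifest.permission.ACCESS_FINE_LOCATION),
                    locationRequestCode
                )
            }
        }
    }

The concern the authors raise is precisely the gap between these visible, user-facing steps and what an app actually accesses once it is running.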
Implications:
Providing patients with the ability to determine what data they are giving up when they access certain systems is challenging and complex, caused in no small part by the intricate set of rules surrounding data privacy and security. The task is made even more difficult when one takes into account the numerous vendors, institutions, and third parties that need or have access to private patient information in order to make the healthcare ecosystem function. In the end, users may find themselves in situations where their information is easily identifiable and traceable. In some cases, parties have attempted to overcome these concerns by selling de-identified data sets to data brokers, who use the patient data for research or commercial purposes, yet such data is often easily re-identified. In addition, given the value of healthcare data for marketing purposes, this data is often resold or shared with third parties. For example, it shouldn't be surprising that Reuters recently reported that a downloaded mental health or smoking cessation app is likely to share marketing, advertising, or usage-tracking data with Facebook or Google. Similarly, a recent article in Forbes highlighted comparable issues, noting that the privacy policies of 25 of the 36 mental health apps Forbes examined did not state how the data collected by the app would be used. Given the growth of the digital healthcare industry, it is imperative that users/patients be informed, in easier-to-understand terms, how their data will be used and potentially monetized, and what specifically they are consenting to. While some will argue that it is not possible to simplify all terms and conditions, at a minimum they could be summarized up front and then explained in more detail. For example, the first five sentences could very clearly and plainly state what data will be used and how, in language that is easy to understand, since most users lack adequate healthcare and legal literacy. Where possible, terms should be targeted at the so-called average reader (generally assumed to read at the 5th grade level) and use short sentences that either avoid or explain legal and contractual terms. Similar to privacy disclosures, data ownership disclosures should follow several rules like those suggested by the Dartmouth Institute, which recommends that, for the sake of privacy and security, mobile health app developers keep the following in mind (a minimal sketch of how an app might enforce these choices appears after the list):
Complying with all relevant federal policies, regulations, and laws.
Providing patients with the choice to opt-out of providing sensitive data.
Providing patients with the choice to block the transfer of data to health care teams.
Providing patients with the ability to block use of data for research purposes.
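The Kotlin sketch below is one minimal way an app could record and enforce these choices before any data leaves the device; the field and function names are hypothetical assumptions for illustration, not part of the Dartmouth Institute's guidance or any particular app.

    // Hypothetical sketch of consent preferences enforced before data is shared.

    data class ConsentPreferences(
        val shareSensitiveData: Boolean,   // false = patient opted out of providing sensitive data
        val shareWithCareTeam: Boolean,    // false = patient blocked transfer to health care teams
        val allowResearchUse: Boolean      // false = patient blocked use of data for research
    )

    enum class Purpose { CARE_TEAM, RESEARCH }

    data class HealthDataPoint(val value: String, val sensitive: Boolean)

    // Returns the data point only if the stated purpose is consistent with the
    // patient's recorded choices; otherwise nothing is released.
    fun releaseIfPermitted(
        data: HealthDataPoint,
        purpose: Purpose,
        prefs: ConsentPreferences
    ): HealthDataPoint? {
        if (data.sensitive && !prefs.shareSensitiveData) return null
        return when (purpose) {
            Purpose.CARE_TEAM -> if (prefs.shareWithCareTeam) data else null
            Purpose.RESEARCH -> if (prefs.allowResearchUse) data else null
        }
    }

Centralizing the check in a single function like this makes it easier to audit that no data is shared for a purpose the patient has blocked.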
In addition to being fully informed about and understanding how their data will be used, patients also have to make sure they have adequate knowledge of, and power over, commercial discoveries that may emanate from their data. For example, in an article entitled "If Your Medical Information Becomes a Moneymaker, Could You Get a Cut?," the authors review the case of a patient being treated for testicular cancer at Memorial Sloan Kettering Cancer Center (MSK) in New York City, whose lymph nodes and other body parts were removed and used for profit-based, private research. He had never paid attention to where those removed parts went, but questioned who they actually belonged to: did they belong to him or to MSK? Furthermore, were his body parts being used to advance medical research without compensating him? Circumstances such as these are addressed under federal HIPAA privacy rules. According to HIPAA, doctors can use patient data for research or to improve health care operations. However, if doctors are using information to develop a product they can sell or make a large monetary profit from, then it might fall outside the definition of "health care operations." Being transparent is essential to meeting the standard of informed consent and fostering an open and honest relationship with patients, as is informing them when their personal health information is used for healthcare research. Patients (and researchers) should also be aware that there is a difference in how patient data is approached in the U.S. and in Europe. For example, in "Developments in Privacy and Data Ownership in Mobile Health Technologies, 2016-2019," the authors note that in the European Union users have the right to be informed about what data will be collected and how it will be used. By contrast, the U.S. does little to require that patients be informed about what data will be collected and how it will be used. Perhaps the U.S. should look to Europe and other countries and consider how they handle patient data privacy as a model for moving forward.