Risks & Rewards of Monetizing Healthcare Data
This week we look at the monetization of healthcare data and, in particular, the risks that arise as more data is generated than ever before and as the uses of that data multiply. With patient data being generated at unprecedented rates and subject to the same consumerization that has transformed many other industries, it is critical to understand the implications of massive growth without proper regulation and what that means for patients and their privacy. More insight is also needed into the AI software that increasingly drives the services of organizations using analytics, and into whether those organizations provide transparency to consumers and perform their analyses with minimal bias and with population demographics in mind. The lack of regulation around data privacy raises serious concerns about the fair and ethical use of the vast datasets being collected, which are then processed by AI software that may not operate impartially. At the same time, strong projected growth leaves stakeholders positioned to reap substantial earnings from this data.
With the growth of new technologies such as big data, broadband, and cloud storage, the collection and monetization of data has become big business. For example, according to New Revenue Streams in Health Data Monetization, published by TATA Consultancy Services, the data monetization market is projected to reach a valuation of over $700 billion by 2025. New innovations are constantly advancing the healthcare industry's collection and monetization of data, opening new revenue streams for a data sector eager to use these insights both for marketing purposes and to gain a foothold in healthcare as the industry moves toward consumerization. By using data insights to improve existing processes, offering data as a product to businesses, and offering data analysis capabilities as a service, a wide variety of organizations hope to tap a vast potential profit pool in the coming decades. For example, a study from International Data Corporation and Seagate found that the healthcare data industry is projected to grow up to 36% by 2025, outpacing many other industries.
While the practice of monetizing data has existed for a number of years, particularly in consumer industries, the maturation of this practice in healthcare has raised concerns about its future regulation. In addition, the potential for misuse and theft of patient data, whether intentional or unintentional, carries the risk that these practices could damage organizations' reputations and operations. As more patient and healthcare data is digitized (e.g., as data from wearables is stored in the cloud and providers and payers encourage the use of digital health services), these risks only continue to multiply. Consequently, both traditional players and new entrants must ensure that they observe both the letter and the spirit of the law, and they must be especially sensitive to public perception of how patients' data is being used and monetized.
The lack of clear-cut regulations and patient ownership rights with respect to personal medical data raises questions about the ethical use of this data. Moreover, even when usage is fair and in line with existing federal policy, security risks arise in the absence of accountability. Previous high-profile collaborations in the U.S. and abroad, such as that between Google's DeepMind and the Royal Free NHS Foundation Trust, highlight those concerns. In addition, the lack of standards and transparency around the use and applications of artificial intelligence (AI), as well as its potential for bias, has significant policy implications. Many questions about the industry remain to be addressed, and if stakeholders can capitalize on this demand, it may open healthcare up to greater monetization and revenue streams than ever before.
What have been some of the “lessons learned” on modern concerns around the monetization of healthcare data from some of our earlier Our Takes?
An increasing number of healthcare organizations are incorporating artificial intelligence and data processing to support decision-making and compute meaning from large, unwieldy data sets.
According to the Association for the Advancement of Medical Instrumentation, 77% of organizations already use AI software to inform clinical decision-making, and 66% use AI to yield meaning from large sources of data.
The average patient generates 80 megabytes of data in electronic medical records and imaging annually, according to the New England Journal of Medicine.
The global healthcare data storage market is expected to grow by more than 16% by 2026 according to a report from research firm Research and Markets.
The technology sector is increasingly directing its skills in search and data analytics toward healthcare data, but it has yet to achieve the success seen in other industries, given the uniquely personal nature of healthcare data and its limited experience working in the healthcare industry.
Despite numerous efforts at the international, national, and state levels (e.g., GDPR, 21st Century Cures, CCPA), U.S. patients generally continue to have very little control over the use of their medical data after it has been collected.
Of 97 mobile health apps reportedly certified as clinically safe and trustworthy, 89% were found to transmit information online, and 66% of that information was unencrypted, according to Developments in Privacy and Data Ownership in Mobile Health Technologies, 2016-2019.
The article noted that foreign regulations are tougher than those in the U.S. For example, in the European Union, consumers have the right to know what data will be collected and how it will be used, in contrast to the U.S., where no federal law requires that consumers be informed about data collection.
Many apps are not transparent about how their data is collected, processed, shared, and transferred, raising questions as to whether they are actually compliant with existing data privacy laws. For example, in an article published in Forbes, one writer found that 25 out of 36 mobile health apps' privacy policies contained no disclosure whatsoever concerning how patient data would be used.
Under HIPAA privacy rules, doctors are allowed to use patient data for research or for improving operations. However, if that data is used to develop a product sold on the private market or to otherwise generate a large profit, the use could fall outside the purview of "healthcare operations" and thus no longer be covered by HIPAA.
U.S. healthcare cyber breaches increased 14% from 2019 to 2020, and security was cited as the most important concern by 85% of businesses and organizations working with cloud technologies, per the Flexera State of the Cloud report.
Even with all the significant advances in AI and machine learning (ML), these tools need routine and consistent oversight, audit, and review to detect and avoid bias, increase reliability, and ensure fairness in their application.
According to a 2020 study entitled "How Should AI Be Developed, Validated, and Implemented in Patient Care?", the data processed by AI tools tends to be largely homogeneous in terms of patient demographics, leaving some populations under-represented. This can lead to vulnerabilities such as erroneous diagnoses or faulty treatment recommendations.
As noted in an article from STAT News, there is no legal requirement for AI developers to thoroughly document how the AI was developed or ensure the accuracy of its performance, particularly around the processing of demographic data sets.
As widely noted during the COVID-19 pandemic, certain pulse oximeters, a tool used to monitor blood oxygen levels, have been found to provide less accurate measurements the darker a patient's skin. The problem was so widespread that it prompted Senators Warren, Booker, and Wyden to send a letter to the FDA requesting a review of the issue.
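The kind of routine audit described above can be as simple as stratifying a model's error rate by demographic group and flagging large gaps. The sketch below illustrates the idea; the group names and records are entirely made up for illustration and do not come from any of the studies cited here.

```python
from collections import defaultdict

# Hypothetical audit data: (demographic_group, predicted_label, actual_label).
# In a real audit these would come from a held-out, demographically labeled test set.
records = [
    ("group_a", 1, 1), ("group_a", 0, 0), ("group_a", 1, 0),
    ("group_b", 0, 1), ("group_b", 0, 1), ("group_b", 1, 1),
]

def error_rate_by_group(records):
    """Return each group's fraction of incorrect predictions."""
    totals, errors = defaultdict(int), defaultdict(int)
    for group, predicted, actual in records:
        totals[group] += 1
        if predicted != actual:
            errors[group] += 1
    return {g: errors[g] / totals[g] for g in totals}

rates = error_rate_by_group(records)
# A large gap between groups (here, group_b errs twice as often as group_a)
# is a signal to investigate training data coverage for the worse-off group.
```

This is only a first-pass check; a production audit would also compare false positive and false negative rates separately and track the metric over time as the model is retrained.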
According to the article Extracting and Utilizing Electronic Health Data from Epic for Research, researchers applying AI to identify prescription patterns for certain antipsychotic medications found that 27% of prescriptions were still missing dosages even after the researchers applied statistical methods to address the missing data.
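Quantifying and handling that kind of missingness is a standard first step in EHR research. The sketch below shows one simple, common approach (median imputation within each drug) on hypothetical records; the field names and values are illustrative and not drawn from the cited study, and the cited researchers' actual methods may well differ.

```python
import statistics

# Hypothetical prescription records; None marks a missing dosage.
prescriptions = [
    {"drug": "drug_x", "dosage_mg": 50},
    {"drug": "drug_x", "dosage_mg": None},
    {"drug": "drug_y", "dosage_mg": 100},
    {"drug": "drug_y", "dosage_mg": None},
]

def missingness_rate(records, field):
    """Fraction of records where `field` is absent or None."""
    missing = sum(1 for r in records if r.get(field) is None)
    return missing / len(records)

def impute_with_drug_median(records):
    """Fill missing dosages with the median observed dosage for the same drug.

    Median imputation is simple but can bias results, so the missingness
    rate should always be reported alongside any imputed analysis.
    """
    observed = {}
    for r in records:
        if r["dosage_mg"] is not None:
            observed.setdefault(r["drug"], []).append(r["dosage_mg"])
    filled = []
    for r in records:
        dose = r["dosage_mg"]
        if dose is None and r["drug"] in observed:
            dose = statistics.median(observed[r["drug"]])
        filled.append({**r, "dosage_mg": dose})
    return filled

rate = missingness_rate(prescriptions, "dosage_mg")
clean = impute_with_drug_median(prescriptions)
```

Even with imputation, dosages that are missing not-at-random (e.g., certain clinics systematically omitting them) can leave residual gaps, which is consistent with the 27% figure persisting after statistical correction.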