The Orwellian Nature of Healthcare Data & Analytics
A few years back, I was a senior designer for GN ReSound, a Danish hearing instrument manufacturer and leader in the industry. They lured me to the position with an upcoming project designed to connect hearing aids to mobile devices. I would be working on the software audiologists use to adjust hearing devices, part of what would eventually become a major platform overhaul integrating mobile technology and telemedicine capabilities into device adjustments.
Hearing devices, as it turned out, are quite interesting. They have a chipset in them and are, in the most basic sense, tiny computers you wear on your ears. They can be programmed (for individual hearing needs), and they can store and transmit information. Within these capabilities is the ability to pull usage data from the devices. In layman's terms, an audiologist can connect to these devices with software and see how often a patient is wearing them, in what sound environments the patient spends most of their time and where they might be having issues (and make adjustments as necessary).
While I was working on this project, it occurred to me how awkward this scenario can become for a patient who is, let's say, not exactly compliant in wearing the devices. In the hearing instrument industry, this scenario is often referred to as "In the Drawer Syndrome": the patient puts the devices in a drawer and rarely wears them. This happens for a few different reasons.
Hearing instruments are not like glasses, where your eyes immediately adjust to actually being able to see again. It takes time for your brain to get used to hearing sounds you have not heard in a long while. The devices can also be uncomfortable, and there is a social stigma to wearing them. Patients are often reluctant to admit they have a hearing problem, or don't realize they have one. There are other reasons a patient might not wear them as often as they should, but these are the most common.
So, I always imagined the scenario with a non-compliant patient looking like this: The audiologist views the device analytics during an appointment and mildly admonishes the patient for not wearing them more often. I never saw this happen and doubt audiologists shame their patients. But I do imagine they question the patient concerning their usage statistics.
That was 2012. Seven years later, I have worked on more and more projects where health analytics are becoming the norm. I continue to encounter situations where the implications of this technology we now have are unclear.
Clinicians are now able to gain powerful insights through healthcare analytics. Patients are also able to, depending on the technology, tap into those insights. As with any augmentation via technology, there comes the issue of unintended consequences and new questions or scenarios for consideration. There are ethical, legal and technological issues directly impacting patients and/or their care as it relates to the analytics we now gather and have access to.
The scenario above concerning the hearing aid data is all-too-familiar to me, and it raises the question of how transparent a patient truly wants to be. This isn't so much about privacy as it is about human behavior. To be perfectly honest, we lie. (I recognize the irony in that last sentence.) When it comes to our health and habits, we often lie blatantly or by omission.
What if your general practice physician had a means of observing your drinking or smoking habits? What if they had the means to monitor your compliance or adherence in taking a medication you dreaded? What if they could see the 24-hour blood glucose patterns of a diabetic who has an affinity for sweets? Or what if they could monitor your compliance with any prescribed treatment?
If this seems like a fantasy or a utopian version of healthcare in the future, it isn’t. The future is here or will soon arrive.
Large corporations like Facebook are beginning to dip their toes into the waters of healthcare data and analytics. Is it hard to imagine posting a video of yourself smoking at a bar with friends, only to have your health insurance company raise your rates and reject your tobacco-use disclaimer? This isn't a stretch. Insurance companies actively use social networking to reject claims and identify fraud. As we increase our ability to make sense of data (and collect more of it), how far can we push this concept, and how granular can we become in viewing the hidden lives of patients?
Companies such as Google, Amazon and Apple are also beginning to explore healthcare. Amazon is rumored to be involved with electronic health records, and its recent purchase of PillPack confirms its steady trek into the three-trillion-dollar healthcare market. It's easy to see just how big the picture can get and how powerful the data can become when companies with billions of data points begin connecting the dots.
There are positive and negative aspects of monitoring patients via technology and the data we now have access to or can link. In a recent article on UX ecosystems in healthcare, I conveyed a common scenario regarding drug therapy.
This is a situation in which analytics could help. But perhaps there are reasons patients are not compliant with their therapy. Maybe they are edgy about taking another drug. Maybe they are suffering adverse consequences of a prescribed therapy. Or just maybe they are tremendously disappointed with the performance of their hearing instruments.
Collecting the data and monitoring patients is not unethical in and of itself. That is, the analytics are not bad. It is how we use them in healthcare that is key. Do we use them to improve patient care, or will healthcare align itself with the insurance industry and large corporations (Amazon, Google, Facebook), pursuing financially lucrative scenarios and becoming Big Brother?
These examples, some real and some predictive, leave us in a very grey area. I honestly have never known where to stand in relation to patient monitoring. On the one hand, it is our mission in healthcare to make people better. We need to know if they are not complying with treatment protocols so we can work with them, not against them. On the other hand, it seems at some level this places a patient under a microscope and they may not always be comfortable with this. There is clearly the potential for the misuse of data.
The power of observation can change our behavior. If you know you are being observed, you’ll act differently. It’s as simple as the concept of security cameras in a store to prevent theft. But this could also bring about unintended consequences.
Have you ever tried to cut your weight before your annual appointment with your doctor? Have you ever dieted in an attempt to influence your blood test for cholesterol levels? Have you ever “prepped” for a medical appointment involving some routine screening? This is the power of observation.
At what level do we decide we have gone too far? At what level do we cease to have a choice over our treatment as patients and instead become automatons mindlessly obeying every medical therapy whether good or bad? We would do this because we are humans and know we are being observed. We would do this to avoid the shame of facing a doctor as non-compliant patients. We would do this at the risk of complying with treatments that may not be aligned with our choices and contradict our desires for a good quality of life.
I quit weighing myself daily years ago. I did this because I am very conscious of my weight and health. A one- or two-pound deviation on the scale can really affect my mood for the rest of the day. But that deviation is actually quite normal. What I choose to focus on instead is simply enjoying a healthy diet. In short, I don't want the metric to become the target. I choose a system over a specific goal. My system is eating well.
Abbott Labs, Medtronic and DexCom all produce devices allowing patients with Type 1 diabetes to track their blood glucose levels continuously. Abbott produces the FreeStyle Libre, a system in which a small sensor disc is attached to the back of the arm and a reader (or smartphone) is used to scan it. The sensor is temporary and stores eight hours of data. This data can be uploaded to Abbott's online software (there is a desktop version), providing a full view of a patient's glucose activity.
I am a Type 1 diabetic and love this system. But I had to learn how to use it correctly. At the first sign my sugars are rising or dropping, I have had to train myself not to panic and overcompensate by eating or by dosing with insulin.
We have all of these devices giving us all of this data. Wearables have become common and easy to spot on most of us today. They'll tell you how many calories you have burned. You can track how many calories you eat simply by taking a picture of your plate. You know your heart rate at any given moment. But is all of this too much? Do we really have the ability to holistically understand all of the numbers we are seeing?
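To make this concrete, consider how a raw stream of sensor glucose readings gets reduced to a single summary number a patient is expected to act on. The sketch below is hypothetical: the readings are invented, and the 70-180 mg/dL band is simply the commonly cited "time in range" target, not any vendor's actual algorithm.

```python
# Hypothetical sketch: reducing raw glucose readings (mg/dL) to one metric.
# The readings and thresholds are illustrative, not from any real device.

def time_in_range(readings, low=70, high=180):
    """Return the fraction of glucose readings within [low, high] mg/dL."""
    if not readings:
        raise ValueError("no readings to summarize")
    in_range = sum(1 for r in readings if low <= r <= high)
    return in_range / len(readings)

readings = [95, 110, 145, 190, 210, 160, 130, 88, 72, 65]
tir = time_in_range(readings)
print(f"Time in range: {tir:.0%}")  # prints "Time in range: 70%"
```

The reduction from dozens of readings to one percentage is exactly where nuance gets lost: "70% in range" says nothing about whether the excursions were brief spikes after meals or a dangerous overnight low.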
There are ethical concerns here as well. We tend to view computers and devices as authoritative. That is, you are more likely to question another human than you are a computer. As a result, we often rely far too heavily on computers and devices. In the case of wearables, some users have claimed to gain weight as a result of the metrics reported to them.
Devices can malfunction. They can report inaccurate numbers. They can fail to account for the holistic nature of healthcare. And, they can lead to an over-reliance on technology affecting our decision-making process.
There is also the question of ownership where health data and analytics are concerned.
I like to think I have the patent rights on me. This would include my DNA. I’m not necessarily a privacy zealot. But something about sharing my DNA is unsettling to me. I feel like I own it — like I hold the copyright on me. I feel like I own my health data and it should not be used to impede my autonomy.
This same issue was broached in a recent Medium article, “Who Owns Your Health Data?” There are pacemakers and implantable cardioverter defibrillators that track your heartbeat and store the data. The issue here becomes less about patient compliance and more about who owns the story the data can tell. In 2017, an Ohio court ruled pacemaker data could be used in a case against a man accused of arson. The data could refute the man’s testimony that he had been sleeping at the time of the fire.
If corporations like Facebook and major healthcare giants begin trading data, where does it go or eventually end up? Much of this data would likely be de-identified (though it could probably be re-identified). Data collected within a healthcare setting (e.g., for electronic health records) is protected under HIPAA, with strict regulations stipulating its use. Health data on social networking platforms, from some medical device manufacturers and on wearables is largely not protected. In most instances, you won't know how this data is being used, nor will you have any control over its use. Is this not "you," your data, your story, your life in a series of mathematical segments? And yet you have no ownership.
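The parenthetical about re-identification deserves unpacking. De-identified data can often be matched back to a person by joining it with a public dataset on a handful of quasi-identifiers (such as ZIP code, birth date and sex), the classic "linkage attack." The sketch below uses invented records and a hypothetical `reidentify` helper purely to show the mechanics.

```python
# Hypothetical linkage attack: joining a "de-identified" health dataset with
# a public record on quasi-identifiers. All records below are invented.

deidentified_health = [
    {"zip": "43201", "dob": "1961-02-14", "sex": "M", "diagnosis": "arrhythmia"},
    {"zip": "43215", "dob": "1984-07-02", "sex": "F", "diagnosis": "asthma"},
]

public_records = [
    {"name": "J. Doe", "zip": "43201", "dob": "1961-02-14", "sex": "M"},
]

def reidentify(health_rows, public_rows):
    """Match rows on the (zip, dob, sex) triple and reattach names."""
    matches = []
    for h in health_rows:
        for p in public_rows:
            if (h["zip"], h["dob"], h["sex"]) == (p["zip"], p["dob"], p["sex"]):
                matches.append({"name": p["name"], "diagnosis": h["diagnosis"]})
    return matches

print(reidentify(deidentified_health, public_records))
# prints [{'name': 'J. Doe', 'diagnosis': 'arrhythmia'}]
```

A quasi-identifier triple like this is frequently unique enough to name a single person, which is why stripping names alone does not make health data anonymous.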
Perhaps it is an exaggeration, but it is almost as if we are losing ourselves as we become more digital than human. Your DNA, your health data, your experiences on this planet and your life belong to you. Or do they? What are we surrendering as technology becomes more ubiquitous and pervasive and our lives become more of a digital reality than a physical one?
There is still the problem of coordination between all of the different systems and platforms in healthcare. My wife came home the other day with a paper from her doctor suggesting she take a couple of over-the-counter supplements. Because systems are not connected and the doctor may not know her history (or may not have the time to sift through all of the data available in her health record), my wife had to spend time digging through the side effects of the supplements to see how they could affect her other health conditions.
Connecting platforms poses two problems. The first is simply getting all of the data connected and in a place where a clinician can access it without logging into an inordinate number of different systems. The second is making sense of all the data once it is in one place without overwhelming the clinician. We are still working diligently on the first problem; the second is steadily becoming a reality as more data comes together.
Every corporation, startup or entity involved in the big healthcare data game has its own agenda. It took the Affordable Care Act to incentivize big electronic health record vendors and hospital systems to put forth an effort in sharing data. Still today, many healthcare entities treat their specific mission as their primary interest and are less concerned with a holistic approach to patient care. That is, there are still mountains to climb in coordinating data across platforms for better patient care and a better patient experience.
The unified ecosystem I discussed in a recent article doesn’t exist in healthcare. We can connect our watch to our phone, our phone to our computers, control the lights in our house from remote locations, connect bank accounts and log in to various services using token authentication. But we still can’t connect our healthcare platforms. Your pharmacy system doesn’t communicate very well with your hospital system. Your doctor’s system may not communicate very well with your other providers. Your health devices largely do not communicate with any of the above and keep the data on their own servers. As a result, we end up with a fragmented system, a fragmented perspective of a patient’s health and, most important, a fragmented user and clinician experience.
The potential exists, through technology, to create better healthcare for patients and holistically manage their health. But what I fear instead is that corporations are chasing the next dollar. Their top goal, I suspect, relates more to the next quarterly earnings statement than to the health of their constituents. That goal also carries an Orwellian potential in the use of healthcare analytics, where corporations use or connect the data to observe only when it is financially lucrative to do so.
To be fair, many device manufacturers have either made it possible or are exploring the means to integrate services into the electronic health record. For example, the FreeStyle Libre I write of above allows you to enter a practice ID into its online platform and transmit your data to other providers. But this is still a far cry from the utopian fantasy I describe above, where your physician would know whether you are compliant with various treatments, medications or therapies through a holistic technological ecosystem.
In short, if we cannot tie it all together and develop analytics in a holistic way, then we run the risk of making important medical decisions on fragmented data. We are doing this today in healthcare and it forces clinicians into a game of runaround as they dive into different systems to understand the bigger picture.
What we have to consider in this brave new world is both the social impact and the ethical implications of technology. As we continue to push the limits of technology, collecting more and more data on patients, we must consider how it impacts the experience. Yes, patients should not smoke, should take their medicines and adhere to their treatments. But patients are not points on a data chart. They have lives, are social animals and have emotions around their care. There are aspects of the user and patient experience that cannot easily be plotted on a chart or graph.
The ethical implications are clearer, primarily as they relate to privacy. Just because we can does not mean we should. At what point do we stop allowing our lives to remain separate?
We have different roles in life — a professional, a parent, a lover or spouse, a friend, a patient — roles we play in different worlds. Is it ethical to set those different worlds on a collision course? What level of privacy and segmentation will we afford as we move forward in connecting data points not only in healthcare but in all areas of our lives?
Humans have an innate ability to innovate. We develop great new technologies and then figure out the rest later. We’ll develop something like an automobile and later consider the safety implications — creating seatbelts, windshield wipers, headlights, antilock brakes, crumple zones and airbags. We’ll develop nuclear technology and later discover the environmental and ethical implications.
I am not sure we can afford to “figure out the rest later” where it concerns healthcare data. We may have no other choice. However, we should begin the discussion now. Patients should have a right to their own data and clearly know how it is used. Clinicians should be educated in analytics, using them to partner with a patient in their healthcare — not to berate patients or hold them accountable. Data must be connected to all the patient’s health platforms so we are telling the whole story — not just part of it. We must begin to educate patients in the power of analytics to avoid misuse. And, we certainly need to regulate large corporations who might otherwise use the power of data to chase their next quarterly earnings statement.
Healthcare data and analytics are powerful. The extrapolation of data leads to information, which leads to knowledge. Knowledge can be immensely useful or extremely dangerous. How we choose to use this technology will define the future of healthcare. More important, it will define who we are as a species.