
Trade-Off or Turn-Off? The Privacy Dilemma

October 6, 2016

I recently had the opportunity to join Boston news media veteran Dan Rea on his AM radio program, Nightside with Dan Rea. It was a one-hour call-in program, and an eye-opening experience for me. Dan and I chatted about connected health and how it can truly disrupt care delivery and put individuals at the center of their own health. Then Dan opened the lines to the fine citizens of New England for questions, and the phones started ringing off the hook.

The overwhelming concern – actual fear – among callers was maintaining their privacy in an increasingly connected world, especially with respect to their personal health data. This is a topic I touched upon in my recent book, The Internet of Healthy Things, and one I will explore further in my upcoming talk at our Connected Health Symposium in a few weeks. But I was so struck by the extent of the concern that I thought I'd present a few theories I've been contemplating on the subject.

When it comes to privacy issues, the cyber world is typically characterized as a sinister place where consumers are duped and exploited, their data leaked or stolen. What we unfortunately don't talk about is what consumers have to gain by sharing their data. For instance, the same information that can be used to create highly personalized programs to help people stay healthier and happier can also be a key factor in improving efficiencies and reducing healthcare costs. Further, it's been shown that sharing data with providers, friends or social media groups can actually help people stay on track with their health and wellness goals.

Yes, there is always some risk in sharing personal data – whether banking online or communicating with your healthcare provider. But there are also rewards. In my view, it's a trade-off, and one that I personally am willing to make with my own health data.

As I see it, there are two main problems when it comes to privacy. First, many companies have not been forthright regarding their privacy policies, leaving consumers unaware of when and how their data is being used, sometimes in ways they may not approve of. Second, we are all too aware of some alarming data breaches that make consumers wary of posting or sharing their personal data.


We can combat much of consumers' fear by making privacy policies transparent, putting a halt to spying on people without their consent, and creating systems to keep data confidential. Bottom line: the rights of individuals must be protected, and organizations – healthcare providers included – need to do a better job explaining privacy issues and safeguards.

As my friend and colleague Rob Havasy pointed out to me, HIPAA doesn't directly apply to most connected health interventions, and certainly not to those that don't directly connect to a hospital. Therefore, a consumer's protections come down to the privacy policy of the company that provides the equipment or service.

In my mind, privacy is not a complicated issue. In fact, it’s pretty straightforward.

So how do we increase consumers’ comfort levels and create more transparency around the red-hot issue of privacy? Here are two simple ideas:

For anyone in the healthcare space – whether you're a payer, provider, business or entrepreneur developing connected health devices or programs for consumers – be very forthcoming about your data collection and privacy policies. And, by all means, provide this information in simple, easy-to-understand language and skip the legal jargon.

And consumers need to understand that there's no such thing as a free app. If a service is free, more than likely the business model relies on selling advertising – or data, including subscriber lists – to marketers. In most cases, without this revenue stream, there would be a fee attached to the service. This is a concept most consumers will understand. Some will opt for the free service with the understanding that they give up some privacy. Others will want a fee-based service that preserves their privacy. Either way, it should be the consumer's choice.

Is the privacy fear such a turn-off that consumers will never agree to share their health data? Or can we help individuals understand the trade-off?

15 Comments
  1. October 9, 2016 5:01 pm

    It’s key to attribute the raw data ownership (and its primary value) to the citizen-patient … and to reward added real value to all stakeholders accordingly.

  2. October 9, 2016 7:55 pm

    I can't argue with that.

  3. October 11, 2016 6:29 pm

    Great post! As a patient advocate and technologist, I would agree that too many consumers and patients misunderstand their PHI. It's important to note that health information, when used properly, becomes far more valuable when aggregated and analyzed collectively. Population health DEPENDS on this data. We are never going to achieve better patient outcomes without it. I would gladly give up some privacy for the sake of meeting healthcare objectives if my patient identity were verified. This is why Universal Patient Identity is key in the shift from volume- to value-based care. https://www.youtube.com/watch?v=5MVZTskUdjw

  4. October 17, 2016 8:51 am

    Most individuals will bristle at hearing about their data being shared in any way they don’t directly approve. Studies have demonstrated that most people will share data when they are asked and find the use of the data to be clinically valuable, but otherwise are adamantly against data re-use. App developers have to find business models based on reimbursement by payers or providers, and relinquish business models based on selling patient data. De-identifying data doesn’t make as big a difference as one would think. I wrote about that for Rock Health here:

    http://bit.ly/13yLQG7

  5. October 18, 2016 10:46 am

    Security and portability in connected health may not have to be a trade-off. The Secure Exchange of Encrypted Data (SEED) Protocol was designed to put patients at the center and in control of their electronic health records. SEED encrypts your health data with your own personal lock, and you decide who can access your files and how. Dr. Kvedar, we'd love to hear your feedback on whether you think the SEED Protocol is a viable solution for HIT cybersecurity. We have a few short videos that give an overview and explain how it works: https://www.seed-protocol.com/kickstarter/

  6. October 19, 2016 9:24 am

    Making legal jargon easy to understand is key. In my experience testing content, most people assume that Terms of Service, Security and Privacy policies are basically the same from app to app, and they (including myself at times) do not want to take the time to read them. People typically accept the policies without reading the content. For those who do read them, when the language leads people to believe there might be something to fear – even if they do not understand what that fear is – they often decide not to risk using the app. I also think there is a strong cultural attitude of "it's nobody's business" to address. Working towards demonstrating the value of sharing data should be a focus. There is an underlying belief that sharing personal information benefits the people asking for data more than it benefits the people providing it. Patients need success stories, case studies, and real people talking about how sharing data made a difference in their health.

    • October 20, 2016 9:04 am

      This is an excellent point and an opportunity to educate that I hadn’t thought of. Do you have any resources — or can you point me in the right direction — for those accessible/tangible success stories to help make the case that sharing medical data and participating in studies can benefit the individual’s health?

  7. October 19, 2016 8:05 pm

    Adding to my first comment: Patients are the first stakeholders and the only ones with real skin in the game … they will be happy to trade their "data value" for real, evident benefit (personal or public), depending on their uncertainty and risk attitude.
    Labels, based on codes of conduct, might help, but only if they differentiate between privacy and ownership.

