Ethical Disruption with Health Data

Silicon Valley startups seek to “disrupt” regulated industries and shift paradigms; however, these “outside-the-box” thinkers must be careful not to leave the regulatory box behind entirely. For a recent example, consider the tale of Zenefits. The firm was a one-time darling of the tech world, recruiting employees with lavish perks and holding itself out as an Uber-like disruption of the world of health insurance. In the span of 12 months, the company has fallen from the limelight, in part for failing to build the compliance program and controls necessary within the existing regulatory framework for insurance brokerage. Since the news broke in February, even more information has come out regarding the culture of Zenefits – a culture of compliance with traditional frat-house values, if that.

It is with some irony that the WSJ broke the news of Zenefits’ compliance issues on February 16, 2016, and then published an article regarding the firm Castlight Health one day later. Castlight Health, and firms like it, provide employers with aggregated health trends mined from employee health data. Employees must opt in to share their data with Castlight, but little information is available on how employers advertise or encourage participation in such services. Fortune Magazine immediately raised, and dismissed, the question of HIPAA protections for such health data use: HIPAA only protects information that patients share with their healthcare providers, and Castlight gathers its information from unprotected consumer sources.

The news outlets and commentators have yet to draw a connection or parallel between these simultaneous news stories. Both Zenefits and Castlight serve employers seeking to manage the cost of employee healthcare. Both seek to disrupt the current industry that supports largely self-insured health plans, governed outside state laws by the requirements of the Employee Retirement Income Security Act (ERISA). And both should raise the compliance question of whether their disruption can be done in an ethical manner.

There are, of course, notable differences between these firms: nothing that Castlight offers is designed to break any existing laws, while Zenefits’ profits appear to have been built on a flagrant disregard of the existing regulatory framework for insurance brokerage. Castlight provides information gathered from those who voluntarily provide it, and it is up to its clients, large employers, to use that information in an ethical manner.

But companies looking to use the Zenefits and Castlights of the world should pause to look beyond the slick marketing and consider the downstream pitfalls market disruption has to offer. Castlight mentions in its terms and conditions that information is shared with the individual’s employer – but how prominently? And while Castlight says that protecting individual privacy is central to its operations, HIPAA compliance sets a floor for data use, not a pinnacle to be reached. Moreover, HIPAA may not even fully apply to Castlight’s operations, making that statement about as relevant as a claim that this blog is fully compliant with the Securities and Exchange Commission. At the end of the day, employees and customers will judge a company by the choices it makes.

Fortune Magazine quotes two health law scholars who raise important legal issues arising from the use of such data.

  • Professor Nicholas Terry of Indiana University Robert H. McKinney School of Law notes the personal privacy issues: the ethics of tracking employee health information are “questionable, at best,” and there is currently no legislation regulating these types of big data companies. “It is incumbent upon the employer to be completely transparent and to demonstrate how this is being done exclusively to the employee’s benefit.”
  • Professor James Hodge of Arizona State University Sandra Day O’Connor College of Law notes the possible employment discrimination issues: “If [an employer] originally thought that 15% of the women in its employee base may become pregnant, but data shows it’s closer to 30%, that could lead an employer to say we cannot hire as many female employees this year because we can’t afford them being out for family leave.”

Additionally, as this blog also seeks to raise security issues, companies should question how employees’ health information will be handled to prevent data breaches and security losses. Castlight, in particular, offers a search portal through which data is collected and aggregated, and security gaps in network traffic or server storage can expose personal information worth much more on the black market than a wallet full of credit cards.

The ultimate takeaways from these examples:

  • Would-be startup founders with stars in their eyes need level-headed legal advice and guidance.
  • Lawyers and compliance professionals in “traditional” industries looking to partner with “disruptive” service providers need to vet their agreements and consider whether the disruption aligns with their corporate values.


2015 Guest Posts on The Compliance & Ethics Blog

2015 was not the blogging year I thought it would be when I started. It felt like a productive start after I got my inspiration at the Michigan State University College of Law LegalLaunchPad’s Social Media Workshop, but something happened in mid-August . . . (actually starting law school). By the time the semester was over, sitting at the computer was the last thing on my mind. Even my planned 2015 recap post was a bit delayed (to today).

Now that my 1L year is winding to a close, I am itching to get back to thinking about developments in the health law world and where this regulated industry should go next (or, why I went to law school in the first place – because it was not to study theories of consideration.)

The Society of Corporate Compliance and Ethics’ Corporate Compliance & Ethics Blog included two posts I authored in 2015.

Two new posts are planned for this blog in the coming week, and hopefully this year will include additional posts for the SCCE.


3 Privacy and Security Considerations for Clinical Research

Technology and the Internet are connecting patients, physicians, advanced practice professionals, and therapeutic treatments like never before. With the increasing availability of internet-connected devices and wearable tech, individuals can monitor their own health status and share it with their health care team. Through social media and social-enabled services, individuals can share their own health updates and experiences with traditional or alternative treatment methods (not a HIPAA violation). A hot topic for Institutional Review Boards (IRBs) these days is simply understanding the role of social media and technology; with that understanding, the IRB should establish guidelines for researchers.

There are three key privacy and security considerations in establishing research technology usage guidelines:

  1. Patient Recruitment – Does the primary investigator (PI) propose recruiting patients through a general social media channel advertisement, such as Facebook or Twitter, or through a patient community such as 23andMe or PatientsLikeMe? Regardless of the manner in which a patient is recruited, informed consent for research must still be obtained. The use of apps for recruitment and data collection is likely to explode in the near future thanks to Apple’s recent release of ResearchKit as part of iOS. Apple’s terms of service for ResearchKit include the requirement that users seek Institutional Review Board approval as appropriate, but Apple’s lawyers appear to be limiting the company’s exposure to liability by not requiring proof of IRB approval as part of ResearchKit use.
  2. Secure Data Collection/Storage – While the walled garden of iOS offers some security for apps released through ResearchKit, IRBs should seek their organization’s Chief Information Security Officer’s approval of any research proposal that involves extensive data collection and storage outside the organization’s internal network. If patients are submitting diary data through a mobile app, that data needs to be encrypted and meet the same safeguards as any other manner in which protected health information (PHI) is handled, even though the diary data would not be part of the patient’s overall medical record. A PI should be required to describe how the collected information will be stored (hint: the only acceptable answer should be on the secured internal network – not a cloud storage service or a personal storage device such as a flash drive). A recent example is the HIPAA violation and $4.8 million fine for New York-Presbyterian and Columbia after a physician attempted to deactivate a computer from the network and accidentally published the PHI of 6,800 individuals.
  3. Publication Implications – If a researcher primarily recruits patients from a small online community, even de-identified information credited to that community ipso facto identifies possible study participants. Subjects recruited online through forums and social media may be presenting a fictional persona, creating invalid data for the researcher. Finally, the IRB should ask questions to assure that the researcher is acting honestly and not simply using social media as a data collection front for fabricating data. IRBs should follow watchdog groups such as RetractionWatch, which can identify questionable research methods and individuals in the research community.
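To make the small-community risk in point 3 concrete, here is a minimal sketch (the field names and data are invented for illustration) of a k-anonymity check: count how many participants share each combination of quasi-identifiers, since any group of size one is trivially re-identifiable even after names are stripped.

```python
from collections import Counter

def k_anonymity(records, quasi_identifiers):
    """Return the size of the smallest group sharing the same quasi-identifier
    values; k = 1 means at least one participant is unique (re-identifiable)."""
    groups = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)
    return min(groups.values())

# Hypothetical "de-identified" study data drawn from a small online community
records = [
    {"age_range": "30-39", "zip3": "490", "condition": "X"},
    {"age_range": "30-39", "zip3": "490", "condition": "X"},
    {"age_range": "60-69", "zip3": "490", "condition": "X"},  # stands alone
]

print(k_anonymity(records, ["age_range", "zip3"]))  # 1: one record is unique
```

An IRB reviewer does not need to run code like this, but asking a researcher "what is the smallest group in your published tables?" captures the same idea.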

HIPAA Privacy and Security Considerations in the Age of Big Data

Thanks to HIPAA, Protected Health Information (PHI) must be de-identified before it can be used for research or data-aggregation purposes. However, de-identification only goes so far in the datasphere. Professor Ian Bogost of Georgia Institute of Technology’s Ivan Allen College, in his recent article in The Atlantic, “The Internet of Things You Don’t Really Need,” and Andy Greenberg, in his Wired article last week, hit on some key points that lawyers and senior leaders in healthcare should be thinking about as providers, staff, and patients become increasingly connected.

HIPAA Privacy Implications:

  • Companies such as Google, Facebook, Microsoft, Amazon, and others build databases based on user activities. Some of these activities, such as on social media, are intentional and visible. Others, such as a Google or Bing search, are made without the assumption that the actions are being recorded. Through ‘cookies’, websites record a user’s activities across multiple sites, connecting into advertising that links back to an Amazon search made a few days before that never led to a purchase. Big Data has already been shown to be able to identify health status changes through aggregation.
  • HIPAA does not apply if others learn about a patient’s health condition or status because the patient posted to Instagram, Twitter, or Facebook about their health care with location services enabled, identifying their location as a health care provider’s office. Doubly so if the patient ‘checks in’ to the health care provider via a Foursquare account set to automatically tweet or post this information to Facebook.
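The aggregation point above can be illustrated with a hypothetical sketch of a linkage attack (all names and records here are invented): joining a “de-identified” health extract to a public roster on shared quasi-identifiers is often enough to put a name back on a diagnosis.

```python
def link(deid, roster, keys=("birth_year", "zip3", "sex")):
    """Re-identify records whose quasi-identifiers match exactly one person."""
    index = {}
    for person in roster:
        index.setdefault(tuple(person[k] for k in keys), []).append(person)
    matches = []
    for rec in deid:
        candidates = index.get(tuple(rec[k] for k in keys), [])
        if len(candidates) == 1:  # a unique match is a re-identification
            matches.append((candidates[0]["name"], rec["diagnosis"]))
    return matches

# A "de-identified" health extract: no names, but quasi-identifiers remain
deidentified = [
    {"birth_year": 1972, "zip3": "490", "sex": "F", "diagnosis": "diabetes"},
    {"birth_year": 1985, "zip3": "482", "sex": "M", "diagnosis": "asthma"},
]
# A public roster (voter file, social profile, check-in history, etc.)
public_roster = [
    {"name": "Pat Doe", "birth_year": 1972, "zip3": "490", "sex": "F"},
]

print(link(deidentified, public_roster))  # [('Pat Doe', 'diabetes')]
```

The join itself is a few lines of code; the privacy exposure comes entirely from which auxiliary datasets an adversary can obtain.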

HIPAA Security Implications:

  • Health care providers’ medical equipment is increasingly wireless and joining the ‘Internet of Things’. A new insulin or medication pump may connect to the electronic medical record system to triage and administer critical medications according to the physician order, without the secondary step of a nurse re-entering the dosage directly into the pump at the patient’s bedside. While this is a Lean and patient-safety improvement, the pump’s interface should be carefully considered by the Information Security experts on staff. Is the wireless signal looking for an internal network connection, or is it going out to the Internet and then re-connecting to the hospital network for its data connection? Whatever the connection, has it been sufficiently encrypted to protect the privacy of the PHI the data contains?
  • Beyond the initial network configuration, the Internet of Things exposes device users to risk from hackers. In the past week, Wired published an article on the hacking of a Jeep, prompting a major vehicle recall by Fiat Chrysler. Is a major exposé of a medical device’s hackability in the near future?

HIPAA and Meaningful Use require a risk assessment of PHI vulnerabilities. The questions and scenarios above are largely theoretical, but they have a basis in reality. Senior leaders and lawyers in healthcare looking to make effective decisions about HIPAA privacy compliance, equipment investments, or Security Services staffing should consider the ramifications of the Internet of Things and Big Data as part of their annual risk analysis.

HIPAA: One P, Two As, and 1,000 Different Interpretations.

The Health Insurance Portability and Accountability Act, or HIPAA, as it’s colloquially known, is thrown about on a daily basis in the health care industry, and many people think they know what it requires. More worrisome, just as many know they don’t know what it requires, but rather than ask questions of their internal experts, they make their own judgments. Paula Span has a great article in the New York Times that speaks to these incorrect applications. And then there’s the black sheep contingent who think “HIPPA” doesn’t apply to them, perhaps as one person recently commented to me: “Well, I don’t live around here – so I don’t worry about patient privacy.”

HIPAA: the law, the regulation, the mystery.

The law was originally passed in 1996, as a precursor to health reform, establishing insurance portability requirements for individuals changing employers. A secondary part of the law required that providers and health plans keep medical records under appropriate privacy standards. Regulatory updates in 2006 and 2011 respectively established the security and breach notification requirements. However, it was only in 2013 that the HIPAA Omnibus Rule was released by the Department of Health and Human Services. The Omnibus Rule pulled all of the previous updates together, added the genetic information privacy requirements established under GINA, and pulled the health care industry forward.

As it stands today, HIPAA creates numerous requirements for health care providers that substantially impact and define the manner in which a provider can operate, often with vague qualifiers such as “in the provider’s professional judgment” to determine who should or should not have access to a patient’s medical record. As Paula Span’s article highlights, that professional judgment most likely tips more conservative than what is actually required.

Here is a summary of these requirements, as they fit into three broad categories:

  • Security: Keep protected health information (PHI) secured within your organization – systems should require passwords for access, and only those whose jobs require system access should be assigned user accounts. Do not use cloud services (for example, Dropbox) to store PHI unless they have been vetted by your Information Security Officer for appropriate encryption standards. Maintain and review access logs for medical record systems. Conduct a security risk assessment of all technology infrastructure annually to identify risks, and develop plans to mitigate those risks.
  • Privacy: Train staff on access standards: only access PHI with a ‘business need to know’ – defined as healthcare treatment, payment, or operations. Complete formal business associate agreements (BAAs) with any entity your organization contracts with to perform healthcare treatment, payment, or operations on its behalf, establishing clear expectations for how the business associate will handle your organization’s PHI. Establish a privacy policy and Notice of Privacy Practices (NPP) for your organization. The NPP must be visibly posted across the organization, and the organization must document that the NPP is offered to patients entering the organization.
  • Communication with Patients: Provide patients access to their medical records in a timely manner and in the format the patient requests, even if they would like their PHI emailed to them. Patients have the right to designate who has access to their PHI beyond their own person. Additionally, any breach of privacy or security standards for a patient’s PHI must be communicated to that patient within 60 days of the date the breach was discovered.
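As one sketch of what the access-log review in the Security bullet might look like in practice (the field names and care-team structure here are invented for illustration), a reviewer could flag chart accesses by users with no documented treatment relationship to the patient:

```python
# Hypothetical care-team assignments and an access log extract
care_team = {"patient-001": {"dr_smith", "nurse_jones"}}
access_log = [
    {"user": "dr_smith", "patient": "patient-001"},
    {"user": "clerk_brown", "patient": "patient-001"},  # no relationship
]

def suspicious_accesses(log, teams):
    """Return log entries where the user is not on the patient's care team."""
    return [e for e in log if e["user"] not in teams.get(e["patient"], set())]

for event in suspicious_accesses(access_log, care_team):
    print(f"review: {event['user']} accessed {event['patient']}")
```

Real medical record systems have their own audit-report tooling; the point is that "maintain and review access logs" means running a check like this on a schedule, not merely retaining the logs.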

The following is a tip sheet I’ve created that summarizes the highlights of these requirements, designed for the physician group practice setting:

HIPAA Best Practices for Medical Practices

What questions do you have about administering and maintaining HIPAA compliance?

Governance Dual-Usability Obligation

Leaders in health care are likely familiar with the concept of a ‘dual-fiduciary role’. This administrative responsibility for senior leaders and organizational governance requires balancing resources so that the organization can provide high-quality care to today’s patients while maintaining reserves for tomorrow’s needs. This blog proposes that, as privacy and security are of equal weight to the organization’s financials, there is a parallel requirement that administrators must balance in terms of IT security: the dual-usability role. This role requires that administrators and governance assure the following:
1. End-user accessibility of health care systems. Interoperability and integration are the latest buzzwords for the information systems that help health care professionals provide high-quality medical care. The Meaningful Use program challenges health care technology platforms to certify that the computer is not just a box in the room; rather, the computer and its systems are an active tool used to provide high-quality and timely patient care. The program requires providers to demonstrate that patients can access their own medical records through a portal, and that information can be shared with others through transfers between inpatient and outpatient care settings.
2. Physical and cyber security of information systems to prevent unauthorized access. The claim that privacy and security are equal to the bank account is a bold one; however, who would honestly trust a provider known to have lax security standards protecting the privacy of the medical record? The privacy of the doctor-patient relationship is essential; patients assume that office staff will not be gossiping to their friends and neighbors about their medical conditions, or allowing their medical information to fall easily into the hands of criminals and identity thieves. Again, under Meaningful Use program requirements, providers must document that they have completed a security risk assessment during their attestation period. HITECH and the HIPAA Omnibus Rule require careful handling of PHI, analysis of breaches, and appropriate notification.

Health care providers, and their leadership, must carefully balance the two aspects of usability. Tipped too far to the side of security, with protocols hindering a user’s ability to access patient information, usability is compromised. If a system is too accessible, with information available without limitations based on patient assignment or job duties, then security is compromised. Health care administrators and governance can measure financial health and responsible fiduciary oversight through concrete metrics, for example days cash on hand and operating margin. Security, conversely, is measured by what we do not have: breaches and angry individuals. Measuring and monitoring the balance of usability requires new metrics. Lawyers working in the healthcare space, with an understanding of technology, may be uniquely qualified to design these metrics, given their training to minimize client risk within the limits of today’s information system capabilities.

“What is Code?” Review Part Two

In my last post I told you why you need to read “What is Code?” by Paul Ford. The following is the portion of Ford’s article I feel is most critical for a compliance or legal leader in health care to read:

5. The Time You Attended the E-mail Address Validation Meeting

In the interest of understanding more about how all this works, and with an open invitation from TMitTB, you attend a meeting of the programmers.

Two of them are late, and bravely you ask the one already in attendance to explain what’s going on. He quickly gathers the limits of your information through a series of questions, beginning with, “Do you know what a Web page is?”

Here’s what he shows you: To gather an e-mail address and a name, you can make a Web page using HTML.

On today’s agenda: How to make sure that registration is a positive experience for users but also a secure experience for the company. The questions to be discussed, the programmer tells you, are along the lines of, “Where will you put this data? Will you put it in a text file? What will you do with it? How will you act upon it?”

What follows is an illustration of a conversation between hypothetical programmers. In the again-hypothetical website upgrade in Ford’s article, these questions can be resolved by choosing to build the site update in a language that includes scripted answers to all of them.

This is critical because most healthcare companies aren’t writing their own code or creating their own software. They’re purchasing a software package or a subscription to software hosted on the Internet by another company. You then become reliant on that company having resolved these questions in advance, and having answered them in a way that meets your risk profile. Your IT people will have to figure out, after implementation, whether there are issues with the software: duplicate accounts, a lack of data validation, passwords being written to an unencrypted text file… And now it’s your data in play. All of these examples are things that really have happened.
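One of the pitfalls above, passwords landing in an unencrypted text file, has a standard remedy worth being able to recognize in a vendor demo: salted, iterated password hashing. A minimal sketch using only the Python standard library (the iteration count is illustrative, not a recommendation):

```python
import hashlib
import hmac
import os

def hash_password(password, salt=None):
    """Derive a salted hash; store (salt, digest), never the password itself."""
    salt = salt or os.urandom(16)  # a fresh random salt per password
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify_password(password, salt, digest):
    """Re-derive with the stored salt and compare in constant time."""
    return hmac.compare_digest(hash_password(password, salt)[1], digest)

salt, digest = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", salt, digest))  # True
print(verify_password("guess", salt, digest))                         # False
```

A compliance reviewer asking “how are user passwords stored?” should hear an answer in this family (salted and iterated, or a dedicated scheme such as bcrypt), never “in a file” or “encrypted with our key.”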

Healthcare compliance professionals and compliance lawyers who understand the limitations of software can ask better questions during the software demonstration. Assure that your information security officer has reviewed the program under consideration.  Asking hard questions up front can prevent being in a hard place later, when it’s real patient, employee, or provider data on the line.

You Need to Read – “What is Code?” By Paul Ford

Two weeks ago, the Wall Street Journal’s recommended weekend reading list included an opus by Paul Ford published on Bloomberg. You can find it here (warning: it’s not a short read).

Anyone working in health care who is not in IT should read this article, and anyone who works in the health care compliance, legal or governance spaces needs to read this article, in my opinion. Here’s why:

– Our technology runs largely on old code. Most information systems that power health care transactions are not running in the new, “sexy” languages, like Ruby. This makes it harder to find new coders proficient in the languages that need to be maintained. Older code languages are also harder to work with to create what users want, leading to systems being used today in ways for which they were not intended or optimized.

– Our industry has strict requirements for the use of encryption and data security, but the tools we have to assure those requirements are imperfect. The code libraries that execute the encryption have been contributed to by many coders over time, and there could be a / or } in the wrong place, buried at the very heart of the code, creating a security flaw that no one has noticed because of the programs layered on top of each other.

– Healthcare has to use technology to collect patient information, maintain that information, and share it with those who need it (including doctors, nurses, quality reviewers, insurance companies, to name a few), while keeping it encrypted and logging user access. A stolen patient record is said to be worth anywhere from $50-70 on the black market (h/t to The Advisory Board Company for including that in a Daily Briefing email recently).
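To illustrate how a single misplaced character can quietly create a security flaw, here is a hypothetical sketch: a verification routine that, through a one-character slicing bug, compares only the first character of a message authentication code and therefore accepts forged messages.

```python
import hashlib
import hmac

key = b"secret-key"
msg = b"patient record payload"
tag = hmac.new(key, msg, hashlib.sha256).hexdigest()

def verify_buggy(message, received_tag):
    expected = hmac.new(key, message, hashlib.sha256).hexdigest()
    return expected[:1] == received_tag[:1]  # stray ":1" checks one character

def verify_correct(message, received_tag):
    expected = hmac.new(key, message, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, received_tag)  # full, constant-time

# A forgery that matches the real tag only in its first character
forged = tag[:1] + ("0" if tag[1] != "0" else "1") + tag[2:]

print(verify_buggy(msg, forged))    # True -- the bug accepts the forgery
print(verify_correct(msg, forged))  # False
```

The buggy and correct versions differ by a handful of characters, which is exactly why such flaws survive code review when programs are layered on top of one another.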

It’s important to remember that while the HIPAA regulations require security audits and the use of encryption, there is no specific language setting a minimum standard for encryption. In part, I suspect, this is because standards evolve so fast that today’s best-practice “brick wall” equivalent is tomorrow’s “decorative white picket fence that won’t actually keep a bunny out of your vegetable garden.” That’s no justification for poor security or encryption efforts.

My key takeaway is to expect technology to continue to change. Leaders in healthcare must balance making technology easier for nurses and doctors to use for patient care against the security of that technology. At the end of the day, your technology is worse than worthless if it is not encrypted.

Hello world!

I am starting this blog as a place to share my perspective as a Michigan State University College of Law student on regulatory updates and other articles of interest in the health care space.  All thoughts are my own, and should not be taken as legal advice.