I came across an article on HIStalk Practice that describes exactly what happens when a laptop containing patient information is stolen from an employee’s car. The stolen laptop cost the company around $300,000. An analysis and breakdown of the costs are provided in the article.
A few things to note about the article:
- The article should scare any organization that owns laptops.
- Every organization should read it to fully understand the risks that mobile devices (laptops, smartphones, USB drives, etc.) present.
- The article is dated; it was published on 12/3/2011.
- The article refers to the HIPAA "harm threshold," which has since been modified by the HIPAA Omnibus Rule.
Again, every organization should read the entire article; it is incredibly detailed and provides excellent insight.
Here is the article conclusion with observations and lessons learned.
Whether you’re a physician practice or a contractor, look in the mirror (or use your phone camera) and ask yourself right now: do you know how the people on the front lines are handling personally identifiable data? Have you put in place the awareness, policies, and technologies to allow them to do their jobs efficiently AND securely? A meaningful self-examination will reveal that you are almost certainly not as good as you think you are (unless you’ve had a recent security incident of your own). That was certainly true for me, and I suspect that I’m in very good company.
Assume that your portable devices contain sensitive information, even if your vendor tells you that they don’t. Most EHR systems are designed so that medical records are not stored locally on a laptop, yet, in our investigation of this matter, we found plenty of instances of the EHR saving temporary files locally that were later not purged, or of clinical users saving documents locally because they weren’t aware of the risks. While there is no doubt that EHR software should be better designed, and EHR users should be better trained, I wouldn’t bet on it. Put in place policies and technologies as a safety net, just in case software and users don’t do what they’re supposed to (because that’s never happened before, has it?) I now have whole-disk encryption on my laptop even though I never work with practice-level data. Sure, it takes about 20 seconds longer to boot up while it’s decrypting. But rather than being an inconvenience, I actually have found that I use this time to take comfort that I’m responsibly protecting my company’s and my customers’ privileged information.
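The temp-file risk described above can be checked for directly. As a rough illustration only, a sketch like the following could sweep a laptop for files whose names suggest leftover EHR exports; the filename patterns here are hypothetical placeholders, since the real ones depend entirely on the specific EHR software in use.

```python
import fnmatch
import os

# Hypothetical filename patterns an EHR client might leave behind locally;
# substitute the patterns your own vendor's software actually uses.
SUSPECT_PATTERNS = ["*.tmp", "patient*.csv", "export*.xml"]

def find_suspect_files(root):
    """Walk a directory tree and collect files matching any suspect pattern."""
    hits = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            if any(fnmatch.fnmatch(name.lower(), p) for p in SUSPECT_PATTERNS):
                hits.append(os.path.join(dirpath, name))
    return hits
```

A sweep like this is a supplement to, not a substitute for, whole-disk encryption: it can only flag files you thought to look for, while encryption protects everything on the drive.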
If you’re in a physician practice, know who’s working in your practice and get a clear statement from the contractor at the outset of their work of what access to patient information they will need, and how they will be handling such information. As an industry, I fear that we are inadvertently letting business associate agreements absolve us from appropriate diligence of what our privacy and security protections are trying to accomplish. Hey, we’ve got a BA, so our contractors can do anything, right? You’re obviously not expected to be a privacy and security expert – that’s what you’re hiring. But understand that it’s your responsibility in the end if that contractor drops the ball on privacy and security, so it’s worth making yourself comfortable that they are reputable, diligent, honest, and competent. We are now providing all of our customers with a statement delineating what we expect to need access to and how we will handle such information. This isn’t a CYA exercise, because it doesn’t absolve us from any responsibility. It’s simply a vehicle to force acknowledgement of the seriousness of privacy and security, and to flag any differences in expectations at the earliest possible opportunity.
Once a security incident has occurred, set everything aside and create process and structure right away to identify what’s happened, prevent any further incidents, notify your stakeholders, and get to work on meeting your legal, business, and ethical responsibilities (hopefully, these are all perfectly aligned.) Create a crisis response team with your attorney, your customer (if you’re a contractor), and your own staff. It’s hard not to panic (believe me), but laying out a plan will help to identify what you know and what you don’t know and will allow you to set your priorities accordingly. In my case, it just helped me breathe. Some guides that I wish were available when we went through this are available here and here.
Don’t underestimate the effort that will be required to disentangle, respond to, and remediate the breach. If you’re a contractor, notify the organization that hired you right away to let them know the facts of the incident, the measures you are taking, and any immediate actions that they need to take (usually none.) There’s a temptation to wait to notify your customer until you know all of the facts and implications, but in our experience that takes too long to disentangle. In our case, we notified the contracting network and its board immediately, who then decided to wait a little longer until we understood more of the implications for each practice before notifying each of the affected physicians directly. Good thing too – as it turns out, only seven of 18 practices had any legal liability for the breach, so it made sense to wait a little bit to sort this out. But realize we’re talking about days and maybe a couple of weeks – not months.
Keep a daily log of your activities from Day One. Hours quickly turn to days which even more quickly turn to weeks, and someone will inevitably ask why you didn’t notify them sooner. Having a log of your activities will allow you to demonstrate that you responded immediately and provided notifications as soon as you figured out what exactly had happened and who needed to be notified about what.
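The daily-log habit is easy to mechanize so entries are timestamped consistently and earlier entries are never overwritten. A minimal sketch, assuming a plain append-only text file (the path name is an illustrative placeholder):

```python
import datetime

LOG_PATH = "incident_log.txt"  # hypothetical location for the response log

def log_activity(entry, path=LOG_PATH):
    """Append one UTC-timestamped line; never rewrite earlier entries."""
    stamp = datetime.datetime.now(datetime.timezone.utc).isoformat(timespec="seconds")
    with open(path, "a", encoding="utf-8") as f:
        f.write(f"{stamp}  {entry}\n")
```

Appending rather than editing matters here: the log's value when someone later asks "why didn't you notify us sooner?" rests on it being a contemporaneous, unaltered record.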
“Man up” and take responsibility for your actions. (Sorry for the sexist reference – I just think it sounds really good.) While we reported the incident to the police right away and immediately set the wheels in motion for compliance with federal and state regulations, I was struck by how easy it would have been to just let it slide, particularly as I contemplated the legal liabilities we might face, the financial penalties that could be imposed, and the loss of business that we might suffer. If we didn’t have a very recent backup of the laptop, we could easily have convinced ourselves that there were only innocuous error logs on the computer and stopped right there and reported nothing except a random theft. And we would have come to that conclusion honestly (for the most part.) I shudder at the thought of how many of these incidents go unreported each year, some perhaps not so honestly.
Take responsibility as an organization. Bill Belichick, the coach of the New England Patriots (Yeah! Wahoo! Go Pats!!) says that an execution error by an individual is really a lapse in education by the coach. The simplest thing for us to have done (aside from not reporting the incident at all) would have been to declare that this was the action of a rogue employee, contain the investigation and remediation to that, and pat ourselves on the back for another job well done. In our case, it became clear as we investigated this incident that there was a certain amount of “there but for the grace of God go I” among our entire staff, at which point we realized that this was an individual failing AND a company failing, which meant that ultimately it was a management and leadership failing. Framing it that way sent a strong message to our team that we’re all in this together and that we need to be honest, transparent, and professional about our flaws. It also led to our building system approaches that will be more long lasting because they were developed organically from the ground up with staff input rather than being imposed from above. Any security professional will tell you that building security considerations into routine workflows, rather than tacking them on as additional workflows, is not just a best practice, it should be the only practice. Or, to paraphrase what I heard from a yoga instructor the other day, we have to move from doing security to being security. Om.
Not many organizations would be as transparent as the Massachusetts eHealth Collaborative. A huge thank you goes to Micky Tripathi, the president and CEO of Massachusetts eHealth Collaborative, for providing incredibly useful insight into a HIPAA security data breach.
The security breach cost Massachusetts eHealth Collaborative $300,000, but for less than $100 they could have installed laptop encryption. Don’t make the same mistake. Take a look at our HIPAA Technology Suite, which provides affordable, secure, HIPAA compliant products including laptop and mobile encryption, email encryption, backup and disaster recovery, HIPAA compliant email, and network security.