Friday, May 25, 2012

Technology Churn as a distraction

Guest blog by Karen Witting, co-chair of the IHE ITI Planning Committee 
A few years ago the focus of much Healthcare Standards and Policy discussion was which technology was the “best” for health information exchange. There is no question that groups have their preferences, be it REST or SMTP or Web Services or some other technology that promises simplicity, or security, or whatever the hot need of the moment is. But I think the most significant problem is finally beginning to be addressed, as I see when I look at the Request for Information (RFI) on Governance of the Nationwide Health Information Network. The fundamental inhibitor to real deployment of Health Information Exchange is governance and trust. Discussion and debate on choosing the best technology is a distraction from the most important problem to be solved.

In addition to some very insightful comments on the RFI on Governance, the HIT Standards Committee NwHIN Power Team is in the process of developing a detailed set of criteria for evaluating the readiness of technical specifications. This assessment puts a high value on things like ease of implementation, deployment, and operation. That focus is supported by the general belief that the best technology for health information exchange is the one that has been, or is being, deployed. But listening to them debate how to judge this points out how incredibly hard it is to assess. Each camp has its “proof” that its technology is the most deployed, or the most “easy” to deploy, based on various beliefs and vague statistics. There is no hard science here. There are no independent surveys. It is largely speculation and opinion.

For me the scariest part is that despite being an expert on several technical specifications, including their deployment, I would not be comfortable assessing most of the “ease” attributes listed in their Appendix A. Assessment is good and healthy, but let’s not get lost in the weeds. For example, under “Ease of Implementation/Deployment – Metrics” (slide 25) they list Deployment Costs. Their metrics relate to how much a deployment exceeded its original cost estimates. How would we get unbiased data to support this metric? Even if a project were completely honest about its estimate and its final cost, how do we tease apart the increase that came from the particular technology under discussion, versus costs that increased because the project underestimated how much money would be needed for lawyers, architects, re-education of staff, etc.? Deployment projects are too complex to provide real metrics on one particular aspect of the deployment, so this becomes a very subjective measure. Most of the “ease” metrics are scary in the level of detail requested. “Ease” will always be subjective and is much more an aspect of the knowledge, skill, and capability of the people involved than of any technology applied. Having worked as a programmer for 25 years I can say that Java was “easier” than C++, and I can also say that many people will disagree with me.

Coming back to my point. Technology churn is a distraction from the real work that needs to be done. Let’s focus on getting some good governance and trust models adopted in this country. Stop fighting over the technology. We have plenty of good technology already identified and in the process of being adopted. The only “ease” assessment needed is whether a deployment project desires it. Certify the ones that people are asking for. Move on to more important matters.

IBM disclaimer:  "This posting is Karen's personal opinion and doesn't necessarily represent IBM's positions, strategies or opinions."

(Comment from John: I have not yet read the RFI; yup, too busy with more important things. But what I hear sounds like a good direction, though too deep for Governance.)

Monday, May 21, 2012

Security - Operational concern

I would like to draw attention to an article by Jeremy Epstein, Going to the doctor and worrying about cybersecurity. The article has some mistakes, but that is not my focus. What I like about the article is that it outlines the gap between standards development (DICOM's definition of how to put images onto a CDROM, and IHE's profiling of it) and the operational environment. Although the article doesn't point out specific failings, it does 'think' about them.

The article describes the writer's experience getting his medical-imaging studies on a CDROM and carrying them to a referral specialist. The trigger for the cybersecurity concern is that the registration clerk easily took the CDROM, imported the imaging studies, and handed the CDROM back. This leads Jeremy to think about how he might have used this vector to infect the system.

I agree that there is concern with the completeness and consistency of the security (and privacy) of the operational environment. HIPAA has tried really hard to provide a framework, and does the correct thing in focusing on a Risk Assessment. There are plenty of ways to mess this up, plenty of ways to be ignorant, and plenty of ways to simply assume that there is no problem. However, this is true across the board, not specific to CDROM-formatted image import.

There are standards for the CDROM format that carries medical imaging studies, including XDM. These standards have tried to include security capabilities and to impart urgency to software designers and operational environments. It should be noted that the standards organizations in healthcare are ahead of most, in that Risk Assessment processes have been in place for years to make sure that each standard developed considers security (and privacy) risks. But these can only provide guidance; there is no way to enforce perfection.

That said, I think that the scenario that is outlined does have some inherent security (and privacy) built into it. I will be an optimist and presume that the software and operational environment have considered these. For example: when you hand over your CDROM, they know who you are. They knew this because you made an appointment, or better yet, were referred. A walk-in will likely cause more investigation into who you are. They have surely done some background work to make sure you have insurance or can pay upfront. Thus, if you were to infect their computer, there would be a history. It is true they may not know it was your CDROM, but they know when their system was good and when it went bad, and can investigate all the patients seen in between.

There is also plenty of off-the-shelf software that can help here. It is common today for any anti-malware (antivirus) product to automatically scan removable media; remember, the first malware was floppy based. Given that the registration clerk processed the CDROM without question, I optimistically assume that the clerk has done this workflow multiple times, and thus they have considered how to get the data off the CDROM. Likely they use the DICOM-formatted CDROM, which is quite common today and will likely continue to be quite common for the next 5-10 years.

Exchanging images is widely available through many means (CDROM, USB-Memory, DICOM-native, DICOM-web-services, DICOM-http, XDS, XDR, XDM, XCA); securing it is still the grunt work of risk assessment and mitigation.

Monday, May 14, 2012

Healthcare Metadata

Metadata often results in meta discussions. Unfortunately these discussions are simply fun, and not productive. Understandings of metadata are far too liberal, especially in the S&I Framework Data Segmentation for Privacy, where metadata is treated as a flat bucket of any describing attribute, without recognition of purpose or of how and where it will be used.

The Purpose of Metadata
Metadata is associated with data to serve specific data-handling purposes. These purposes fall into some general categories. Each metadata element typically serves more than one of these purposes, although some metadata elements cover only one. It is important to understand these domains of metadata purposes. The often-cited PCAST report did identify Patient Identity, Provenance, and Privacy: three good purpose categories, but not sufficient. I have covered this before, but I am revisiting it because of HL7 work on metadata and IHE re-documentation of XD*. For example, here is a view of the metadata purposes in Document Exchange models, such as XDS/XCA/XDR/XDM.
  • Patient Identity – Characteristics that describe the subject of the data. This includes patient ID, patient name, and other patient identity describing elements
  • Provenance – Characteristics that describe where the data comes from. These items are highly influenced by Medical Records regulations. This includes the human author, identification of the authoring system, the authoring organization, predecessor documents, successor documents, and the pathway that the data took.
  • Security & Privacy – Characteristics that are used by Privacy and Security rules to appropriately control the data. These values enable conformance to Privacy and Security regulations. These characteristics would be those referenced in Privacy or Security rules. These characteristics would also be used to protect against security risks to: confidentiality, integrity, and availability.
  • Descriptive – Characteristics that describe the clinical value, so they are expressly healthcare specific. These values are critical for query models and for enabling workflows in all exchange models. This group must be kept to a minimum, both so that it doesn't simply duplicate the data and to keep risk to a minimum; thus the values tend to come from a small set of codes. Because this group is close to the clinical values, it tends to have few mandatory items, allowing policy to choose not to populate them. For healthcare data this is typically very closely associated with the clinical workflows, but it must also recognize other uses of healthcare data.
  • Exchange – Characteristics that enable the transfer of the data for both push-type and pull-type transfers. These characteristics are used for low-level automated processing of the data. These values are not the workflow routing, but rather the administrative overhead necessary to make the transfer. This includes the document unique ID, location, size, mime type, and document format.
  • Object Lifecycle – Characteristics that describe the current lifecycle state of the data including relationships to other data. This would include classic lifecycle states of created, published, replaced, transformed, deprecated. 
All proper metadata elements describe the data and are not a replacement for the data. Care should be taken to limit the metadata to the minimum elements necessary to achieve the goal; therefore each metadata element must be considered relative to the risk of exposing it as metadata. A metadata element is defined to assure that, when the element is needed, it is consistently assigned and processed. Not all metadata elements are required; indeed, some metadata elements would be used only during specific uses. For example, the metadata defined inside a controlled environment such as an EHR will be different from the metadata exposed in a transaction between systems, which in turn differs from the metadata that would describe a static persistent object.
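One way to picture the purpose categories above is as flags attached to each metadata element, since most elements serve more than one purpose. The sketch below is purely illustrative: the element names echo XDS-style metadata, but the mapping is an assumption for demonstration, not a normative list.

```python
from enum import Flag, auto

class Purpose(Flag):
    # The six purpose domains described above (names are illustrative)
    PATIENT_IDENTITY = auto()
    PROVENANCE = auto()
    SECURITY_PRIVACY = auto()
    DESCRIPTIVE = auto()
    EXCHANGE = auto()
    LIFECYCLE = auto()

# A few XDS-style metadata elements mapped to purposes they might serve.
# Note that most elements serve more than one purpose.
ELEMENT_PURPOSES = {
    "patientId":           Purpose.PATIENT_IDENTITY | Purpose.SECURITY_PRIVACY,
    "author":              Purpose.PROVENANCE | Purpose.SECURITY_PRIVACY,
    "confidentialityCode": Purpose.SECURITY_PRIVACY,
    "classCode":           Purpose.DESCRIPTIVE | Purpose.SECURITY_PRIVACY,
    "uniqueId":            Purpose.EXCHANGE,
    "availabilityStatus":  Purpose.LIFECYCLE,
}

def elements_for(purpose):
    """Return the metadata elements that serve a given purpose."""
    return sorted(e for e, p in ELEMENT_PURPOSES.items() if purpose in p)
```

Modeling the purposes as combinable flags makes the overlap explicit: for example, `patientId` shows up both as Patient Identity and as an element referenced by Privacy rules.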

Not MetaData, but Meta something
There are other things that are often considered metadata, and they might be ‘meta’ in some way. For example, when information is being pushed there are attributes on the transaction that are critical to the transaction; but they don’t really describe the data as much as they describe the transaction. For example: The Direct Project uses secure e-mail; in this context there is a sender address and a set of recipient addresses. These are ‘meta’ in the context of the transaction, but are not ‘meta’ about the data.

Another layer that is often confusing is the Privacy and Security layer. As indicated in the metadata model above, there are some metadata elements whose specific purpose is to be used to protect privacy and security. The most referenced here is confidentialityCode; but also dates of service, individual author, author institution, class of document, as well as the patient and document IDs themselves.

However, security and privacy are also specific layers at the transaction level, where other attributes are critical to protecting the transaction: endpoint authentication, encryption keys, endpoint addresses, user identity, user role, user purposeOfUse, policy identifiers, obligation codes, etc. These are critical to transaction success, but are not meta about the data; they are meta about the transaction.

Dublin Core
I looked at Dublin Core, which is often cited as a metadata definition with an abstract model; the classic Dublin Core element set defines 15 elements. It is interesting, and should not be ignored. I think that healthcare has matured beyond Dublin Core; not to say that Dublin Core is immature, but rather that we have identified metadata needs specific to our industry, similar to how video files have metadata defined beyond Dublin Core. Healthcare should show traceability to Dublin Core, but not more than that.
Conclusion
IHE has a good set of metadata, though it is not formally modeled abstractly; I am working with IHE to do this modeling as an effort to better communicate with the IHE reader. HL7 is working on metadata, but this work is far too tied to functionality triggers. We are not done, but we are moving in the right direction.

Update:
May 15th - Added back in "Routing", I had removed this thinking I could pack them into Discoverability. But it just doesn't work out. Later changed "Routing" to "Exchange" as it really is the characteristics needed to successfully exchange. Added a diagram showing how the XDS metadata can be shown in this topology.
May 16th - Updated some text and image for readability.

Sunday, May 13, 2012

Recommendation: ONC's New Guide on Health Information, Privacy and Security and Meaningful Use

I read the ONC's New Guide on Health Information, Privacy and Security and Meaningful Use on the plane to the HL7 meeting in Vancouver. It didn't put me to sleep, but it is very high level. I think it is a fantastic level of detail for anyone who has been thrust into the position of "Privacy Officer", "Security Officer", or just "in charge of HIPAA/HITECH security/privacy compliance". I suspect the last one is the more likely first step.

This document is 47 pages long, but many of the pages are empty cover sheets for chapters, so don't let the size of the document keep you from looking at it. The most useful part is that it sets out an overall compliance landscape and provides pointers to government-provided guidance at every step. The later chapters are very good references.

I really like how they have simplified Privacy (Page 5):
Your patients trust you. Trust is clinically important and a key business asset. How your practice handles patient information is an important aspect of this trust. To help cultivate patients’ trust, you:
Make sure patients can request access to their medical record;
Carefully handle patients’ health information to protect their privacy; and
Keep the information in patients’ individual records as accurate as possible. 
They do explain all of these, so they are not oversimplified.

They also do a good job (43 repetitions) of explaining that the CE is responsible; they can't outsource or transfer responsibility. Yes, you should work with the vendors, all of them:
Your practice, not your EHR vendor, is responsible for...
Risk Assessment needs slightly deeper understanding
I do think that they stayed too high level when discussing Risk Assessment. I have a blog article on the detailed view. Specifically, they didn't really cover well enough that 'risk' is a combination of 'how bad would the impact of the vulnerability/threat be, regardless of how likely it is' and 'how likely is this to happen, regardless of how bad it would be'. This separation allows for a quantitative analysis rather than an emotional one, which is critical to staying away from "Security Theater".  More important is that when a mitigation is chosen, one MUST reassess the risk values based on that mitigation to determine how the risk valuation has changed.
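The impact-times-likelihood separation can be made concrete with a tiny sketch. The 1-to-5 scales, the example scenario, and the specific numbers below are my own illustrative assumptions, not anything from the guide; the point is only that mitigation changes one factor, so the score must be recomputed.

```python
def risk_score(impact, likelihood):
    """Quantitative risk = impact x likelihood, each rated 1 (low) to 5 (high).

    The two factors are assessed independently: impact ignores how likely
    the event is, and likelihood ignores how bad it would be.
    """
    assert 1 <= impact <= 5 and 1 <= likelihood <= 5
    return impact * likelihood

# Hypothetical scenario: a lost, unencrypted laptop holding PHI.
# Before mitigation: severe impact, plausible likelihood.
before = risk_score(impact=5, likelihood=3)   # 15

# After mitigation (full-disk encryption): losing the laptop is just as
# likely, but the impact of a loss drops sharply. Re-scoring is mandatory.
after = risk_score(impact=1, likelihood=3)    # 3
```

Note what re-scoring exposes: encryption did nothing to the likelihood factor, so a mitigation aimed at likelihood (say, laptop tethering) would have produced a different residual-risk picture.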

What is not said at all is that risk will never be brought to zero, so some 'acceptable threshold' must be determined by the Covered Entity. This is the value of 'risk' that they are going to be satisfied with. Typically the only way to get residual risk all the way down to even this 'acceptable threshold' is through 'insurance'.

More important to me is that when a mitigation is chosen, one must assess whether the mitigation has introduced NEW risks. The classic example is where the security office wants a user login on everything that presents PHI, yet putting a login on the Critical-Care bedside monitors would clearly present a patient safety/care risk.

This is a detail, but the concept needs to be seen at the high level. (Note they spelled my name wrong on Page 20)

Encrypted Patient Carried Media
I am disappointed that they recommend that when PHI is put onto media to be given to the patient, that media be encrypted. Yes, encryption is a good tool, but the availability of 'interoperable' encryption on removable media (USB-Memory, CD-ROM, DVD, etc.) is not strong today; and is it really important to encrypt the media that you are giving to the patient? How does the patient or a future provider read the media if it is encrypted? Likely this is handled by putting the password onto the exterior of the device; it thus feels encrypted but clearly is not. Security Theater. Yes, there are standards; IHE has published the DEN profile that shows just how to do this in an interoperable way. I would still encourage the use, but not strongly.

Minor nit: on Page 21 they indicate that it might be harder to assess the security compliance of an Internet-based EHR. I might equally point out that an Internet-based EHR might be more transparent about policies and procedures and have better security, as the vendor can focus attention on privacy/security across multiple CEs, internally sharing best practices that are hard to achieve when a CE hosts the system itself.


Tuesday, May 8, 2012

FW: ONCs New Guide on Health Information, Privacy and Security and Meaningful Use

This just crossed my desk.

HealthIT.gov

ONC's New Guide on Health Information, Privacy and Security and Meaningful Use

ONC's Office of the Chief Privacy Officer (OCPO) recently released a "Guide to Privacy and Security of Health Information,"* an instructional guide designed to help healthcare practitioners, staff, and other professionals better understand the important role privacy and security play in the use of electronic health records (EHRs) and Meaningful Use. The guide is a comprehensive and easy-to-understand tool to help providers and professionals integrate privacy and security into their clinical practice, and includes sections addressing:
·       Privacy & Security and Meaningful Use
·       Security Risk Analysis and Management Tips
·       Working with EHR and Health IT Vendors
·       A Privacy & Security 10-Step Plan
·       Health IT Privacy and Security Resources
Full Guide: Check out the full Guide to Privacy and Security of Health Information: http://www.healthit.gov/sites/default/files/pdf/privacy/privacy-and-security-guide.pdf.
Sections of the Guide: You can also download individual sections of the guide. Please visit the privacy and security section under the Providers & Professionals tab on HealthIT.gov:
http://www.healthit.gov/providers-professionals/ehr-privacy-security  

Together, we can build a culture where privacy and security are respected and valued to inspire confidence and trust in health IT and electronic health information exchange by protecting the confidentiality, integrity, and availability of health information.

*OCPO developed this guide with assistance from an ONC cooperative agreement partner, the American Health Information Management Association (AHIMA) Foundation.

Sunday, May 6, 2012

Testing your XDM implementation

I am seeing more and more interest in testing XDM compliance. XDM is the XDS variant whereby the documents and metadata are placed onto a filesystem, along with some browser-ready files for low-tech viewing. This 'filesystem' could exist on USB-Memory (sticks) or on CD-ROM. Actually it could exist on anything that can carry a filesystem, so it is hardware independent. The FAT filesystem seems likely to live far beyond anyone's wildest dreams.

More important is that this filesystem can be zipped up and the zip file placed into an e-mail, hopefully secured with S/MIME. This is indeed part of the Direct Project, where the security is mandated. The use of an XDM.ZIP file is part of the Direct Project and is mandatory for any system capable of sending the XDM-compliant ZIP. This is mandatory because the XDM format brings along many advantages that allow the receiver to more easily handle the documents. It includes metadata for:

  • Patient Identification -- patient ID, patient name, etc
  • Provenance -- Author person, Facility, publisher, etc
  • Privacy/Security -- Confidentiality Codes, Hash, Size, Author, Patient ID, 
  • Routing -- Intended Recipient, type of document, format of document, mime type, etc
  • Lifecycle -- Previous document that this would replace, transforms, signs, etc
And other...

In the Direct Project the XDM zip file is required to be sent if the sending system can send it, because if you can format in the XDM format then you are enabling better interoperability. 

The receiver doesn't really need anything special to handle this format. The zip format is supported natively on many operating systems. The XDM format requires that there be an INDEX.HTM and a README.TXT, and the documents are simply laid out in a filesystem. The INDEX.HTM is encouraged to simply use FILE URLs to allow simple viewing. Thus the XDM format can actually be MORE friendly than a bare file alone.

But if the receiver does understand the XDM then it can leverage the metadata.
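Before reaching for the formal validator, the basic media layout can be sanity-checked locally. The sketch below is a rough structural check only, under the assumption that the media root carries README.TXT and INDEX.HTM with content under an IHE_XDM/ directory; it does not examine the metadata content itself, which real validation must do.

```python
import io
import zipfile

def check_xdm_zip(zip_bytes):
    """Rough structural check of an XDM media zip.

    Looks for the root-level README.TXT and INDEX.HTM and for an
    IHE_XDM/ content directory. Returns a list of problems (empty
    means structurally plausible, NOT conformant).
    """
    problems = []
    with zipfile.ZipFile(io.BytesIO(zip_bytes)) as zf:
        names = {n.upper() for n in zf.namelist()}
        for required in ("README.TXT", "INDEX.HTM"):
            if required not in names:
                problems.append("missing " + required)
        if not any(n.startswith("IHE_XDM/") for n in names):
            problems.append("no IHE_XDM/ content directory")
    return problems

# Build a minimal, hypothetical example in memory:
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr("README.TXT", "Created by example-system")
    zf.writestr("INDEX.HTM", "<html>...</html>")
    zf.writestr("IHE_XDM/SUBSET01/METADATA.XML", "<SubmitObjectsRequest/>")

print(check_xdm_zip(buf.getvalue()))  # prints []
```

A check like this only catches gross packaging mistakes; the metadata format, and the alignment of metadata with documents, still needs the real validator described below.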

So, how do you test your XDM?
The IHE Connectathon has an online validation tool at 
Use the "Message Validator". There is a selection for XDM, where you can provide an XDM zip file that you have created. It will validate the provided zip file according to the XDM specification: the directory format, the metadata, and the alignment of metadata and documents. It uses the new minimal metadata requirements for XDM that were created for the Direct Project.

If you want to see an XDM, or to test that you can view or import one, there are some samples on the IHE ftp site. In that directory you will also find an XDM_Boone.ZIP that Keith offered.

Update:
Another place with XDM samples