Wednesday, December 21, 2011

Predicting Meaningful Use Stage 2 Security

As a member of the HIT Standards 'Privacy & Security' workgroup I have first-hand experience with the discussions and their potential. The workgroup met earlier this year to discuss the proposed Meaningful Use Stage 2 security and privacy criteria. The workgroup was given a set of criteria to comment on, so it did not have a clean slate to add criteria of its own. Generally the workgroup softened yet focused the criteria (see the detailed table output). There was clear recognition that the Meaningful Use Stage 1 criteria were not well understood.
The HIT Standards 'Privacy & Security' workgroup added to the criteria pointers to existing standards that explain the criteria in general IT language; most of the time it pointed at NIST 800-53. The recommendations were presented to the full HIT Standards Committee in October.

Six adjustments:
The first criterion is for secure messaging with patients. Specific to the security functionality: the message must be encrypted, authenticated, and audited. The Privacy & Security workgroup tried hard to stick purely to the security functions without getting tied into implementation specifications. It indicated that both NwHIN-Exchange and Direct are acceptable (Regulatory coexistence of Direct and Exchange). It is not known whether HHS/ONC will keep this criterion general, or get more specific. Something that came up later in side conversations was how an EHR could be sure that an endpoint it was communicating with was a specific patient. Thus there is a need for further analysis on authenticating patient identities outside of direct treatment scenarios (Patient Identity Matching).

The second criterion is to assure that documents that are created include data-provenance information. This is a direct response to concerns that, when importing documents that the patient provides, there is a need to identify the original author. Unfortunately the security criterion is not strong non-repudiation (IHE - Privacy and Security Profiles - Document Digital Signature, Signing CDA Documents), but rather a simple functional criterion that typical clinical documents (CDA, CCD, Blue-Button) already support. This criterion is linked to reinforcement of the need for patients to be able to download a copy of their health information. The Privacy and Security workgroup didn't try to define the content, but rather simply the functionality of a patient being able to download their health information. In the short term simple data-provenance might be good enough, but eventually we need strong non-repudiation.

The third criterion is a rather small one that surely everyone already supports. It is commonly known as 'inactivity timeout' or 'auto logoff': the system detects an idle session and somehow prevents it from further displaying or allowing access to PHI. This is typical functionality, but it is a difficult thing to describe in words without specifying a particular method.
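To make the functional intent concrete, here is a minimal sketch of one way an inactivity timeout could work. The 15-minute limit, the names, and the lock-screen callback are all assumptions of mine; the criterion itself deliberately does not mandate any particular mechanism or duration.

```python
import time

# Assumed idle limit; the criterion does not specify a duration.
IDLE_LIMIT_SECONDS = 15 * 60

class Session:
    def __init__(self):
        self.last_activity = time.monotonic()

    def touch(self):
        # Called on every user interaction to reset the idle clock.
        self.last_activity = time.monotonic()

    def is_idle(self):
        return time.monotonic() - self.last_activity > IDLE_LIMIT_SECONDS

def enforce_timeout(session, lock_screen):
    # Invoked periodically; on timeout it hides PHI (e.g. locks the screen)
    # rather than necessarily terminating the session.
    if session.is_idle():
        lock_screen()
```

The point of the sketch is that "auto logoff" only requires that PHI stop being displayed; whether that is a screen lock, a logout, or something else is an implementation choice.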

The fourth criterion is to more fully define security audit logging. This one mostly resurrected the wording that I created as a co-chair in CCHIT back in 2005: define a set of auditable events (right from ATNA), a set of audit attributes (right from ATNA), and a set of audit log management functionality (also right from ATNA and other sources). Thus the criterion should end up looking very much like what CCHIT was testing before. I tried to get IHE ATNA listed as 'super-compliant', but this was removed by the larger committee. I am confident that IHE ATNA is viewed as compliant, but I don't think that the Stage 2 criteria will say this. (How granular does an EHR Security Audit Log need to be?, IHE - Privacy and Security Profiles - Audit Trail and Node Authentication, Accountability using ATNA Audit Controls, and ATNA and Accounting of Disclosures)
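To illustrate what "auditable events plus audit attributes" means in practice, here is a sketch of an audit record whose field names are modeled loosely on the ATNA / RFC 3881 audit schema (EventIdentification, ActiveParticipant, ParticipantObjectIdentification). It is not a conformant ATNA message; the function name and example values are mine.

```python
from datetime import datetime, timezone

def audit_event(event_id, outcome, user_id, patient_id):
    # Build one audit record capturing who did what to whom, and when.
    return {
        "EventIdentification": {
            "EventID": event_id,               # the auditable event, e.g. "PHI-Export"
            "EventDateTime": datetime.now(timezone.utc).isoformat(),
            "EventOutcomeIndicator": outcome,  # 0 = success in RFC 3881
        },
        "ActiveParticipant": {"UserID": user_id},                 # who
        "ParticipantObjectIdentification": {"PatientID": patient_id},  # whom
    }
```

In a real ATNA implementation the record would be serialized to the standard XML schema and sent to an audit repository over syslog; the sketch only shows the attribute structure the criterion asks for.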

The fifth criterion is that systems need to authenticate themselves to other systems on the network. This is the typical system-to-system authentication found in IHE ATNA (e.g. mutually-authenticated TLS). The workgroup tried to focus this criterion on only those communications that cross organizational boundaries, so that it would not be applied to internal communications. I am not sure which way HHS/ONC will go on this. (IHE - Privacy and Security Profiles - Audit Trail and Node Authentication, S/MIME vs TLS -- Two great solutions for different architectures)
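For readers unfamiliar with mutually-authenticated TLS, here is a sketch using Python's standard ssl module. The file names are assumptions; the key point is that the client both verifies the server's certificate and presents its own, so each system authenticates the other.

```python
import socket
import ssl

def make_mutual_tls_context(cafile=None, certfile=None, keyfile=None):
    # Verify the peer against our trusted CAs (cafile=None falls back to the
    # platform defaults here, purely so the sketch runs without files).
    ctx = ssl.create_default_context(ssl.Purpose.SERVER_AUTH, cafile=cafile)
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2
    if certfile:
        # Presenting our own certificate lets the peer authenticate us too,
        # making the TLS session mutually authenticated.
        ctx.load_cert_chain(certfile=certfile, keyfile=keyfile)
    return ctx

def connect(host, port, ctx):
    raw = socket.create_connection((host, port))
    return ctx.wrap_socket(raw, server_hostname=host)
```

The server side mirrors this by setting `verify_mode = ssl.CERT_REQUIRED` so that connections without a trusted client certificate are refused.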

The last criterion clears up the most confusing security criterion from Stage 1: defining exactly what is required for encryption of data-at-rest. Many members of the Privacy and Security workgroup expressed that the Stage 1 criterion was hard to understand (Meaningful Use Encryption - passing the tests), and we all expressed that the criterion needs to be very specific about the risk it is trying to address. The EHR vendors on the call were strongly advocating for wording that would encourage good software design, specifically an EHR design where the end-user system doesn't save PHI onto the hard drive, whether that system is a desktop, laptop, tablet, or other mobile device. We were unified that this is a good system design that doesn't put PHI at risk of exposure if the system is lost or stolen. Yet if the EHR system does utilize the hard drive on the end-user system, then the EHR system must support encryption of that PHI. Clearly HHS/ONC is very worried about perceptions of the HHS Breach Notification 'wall of shame', and thus wants to provide a politically-correct message that tells the general public it is addressing these breaches. I thus recommend both: make sure the system design avoids risk of exposure, and allow workstations to use transparent hard-drive encryption.
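The design the vendors advocated can be sketched simply: the client holds PHI only in memory, fetching it from the server over a secure channel, so a lost or stolen device exposes nothing. The class and function names below are illustrative, not from any actual EHR.

```python
class PhiViewer:
    """Sketch of a thin EHR client that never writes PHI to local disk."""

    def __init__(self):
        self._cache = {}  # in-memory only; never serialized to the hard drive

    def load_record(self, patient_id, fetch):
        # fetch() retrieves the record from the server over a secure channel;
        # the result is cached in RAM for the session, not persisted locally.
        if patient_id not in self._cache:
            self._cache[patient_id] = fetch()
        return self._cache[patient_id]

    def close(self):
        # End of session: nothing remains on the device.
        self._cache.clear()
```

If the client must spill to disk (large attachments, offline use), that is the case where the criterion requires the EHR to support encrypting the stored PHI.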

The workgroup did recognize that the functionality of an EHR to export documents (e.g. to give a copy of health information to the patient) is exempted from the workstation encryption criterion, while also recognizing that there is a need for encryption of this exported information. We recognized that IHE has just released the Document Encryption profile, which would be a future possibility, likely for Stage 3. Prior to that approach, the provider organization is expected to protect this exported PHI through other means, such as transparently-encrypting USB memory devices and transparent hard-drive encryption.

There really is not much new this year; it is mostly providing clarity to previously known security functionality. I see more interest in leveraging existing general-purpose IT security functionality standards, such as NIST 800-53. There is also a recognition that the IHE profiles are a proper solution for interoperability (they don't cover functional or operational security), but there is HHS/ONC hesitancy to specify them out of fear that they would drive a specific architecture or specific organizational infrastructure. The workgroup and my interactions with HHS/ONC show that there is a reasonable approach to security functionality as foundational to a high-quality EHR.


  1. Thanks John. Very helpful summary. I notice that you didn't mention "Amendments" including the historical account of amendments (new criterion, #52 in the table). While I presume that was because it is really more "application functionality" rather than "security," still the Security WG did write comments about it. What are your thoughts about that? It is something new to MU.

  2. David,

    The P&S workgroup pushed the amendments topic back to the larger group as it is more an issue of Medical Records retention. Yes, it is true that one reason an amendment happens is the HIPAA privacy regulation, but the functionality needs to follow Medical Records practice.