Sunday, January 31, 2010

Patrick Pyette: The Case for Privacy Accounting

Here is another guest blogger: Patrick Pyette. Pat is a friend and very active security/privacy standards geek.

I started this exercise as a response to Glen Marshall’s submission to the HIMSS Working Group around disclosure accounting, but it has taken on a life of its own.   It’s now my attempt to explain why privacy accounting (a term that I’ll define a bit later) is critically important, even in relation to quality care and availability. 

Before I begin, some disclaimers are probably in order:  I’m a Canadian, and my views (and spelling) are coloured by that background.   I do not represent any particular organisation, jurisdiction, or agenda, other than my own.   And while my involvement in healthcare privacy and security issues has exposed me to legal issues, especially when it pertains to consent, I am not a lawyer.

The first thing to note is that the disclosure accounting requirements are based upon the Individual Access and Challenging Compliance principles of fair information practices. 
  • The Individual Access principle states that a patient has the right to, upon request, be informed of the existence, use, and disclosure of their personal information.  
  • The Challenging Compliance principle states that a person has the right to challenge an organization's compliance with the other principles (depending on the country, there are a total of between 8 and 10 principles, but they all cover substantively the same ground).  
The U.S., Canada, all of the EU countries, Japan, Australia, and Argentina (to name more than a few) have adopted substantively similar principles (see the references below for links to various versions of these). The message here is that this is a global issue and the solution needs to be viewed in that light.  

Disclosure accounting is only part of the requirement, as far as I’m concerned.   Accounting of collection, use, and disclosure of personal (health) information (I’ll refer to this as “privacy accounting” from here on in) is something that we need our health systems to do efficiently and effectively.   I do recognize that mandatory auditing of collection and use of personal health information is generally not a requirement in the U.S. but I would question the ability to establish accountability without it.
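To make the idea concrete, here is a minimal sketch of what a privacy accounting log might look like. This is purely illustrative and not drawn from any standard (IHE ATNA and similar specifications define real audit record formats); the field names and the simple in-memory list are my own assumptions. The point is that every collection, use, and disclosure is recorded, and an Individual Access request can then be answered directly from the log.

```python
# Illustrative sketch of a "privacy accounting" log: an append-only record
# of every collection, use, and disclosure of a patient's information.
# Field names (actor, action, patient_id, purpose) are hypothetical.
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

ACTIONS = {"collect", "use", "disclose"}

@dataclass(frozen=True)
class PrivacyEvent:
    timestamp: str    # when the event occurred (UTC, ISO 8601)
    actor: str        # who collected/used/disclosed the information
    action: str       # one of ACTIONS
    patient_id: str   # whose information was touched
    purpose: str      # stated purpose, e.g. "treatment"

log: list[PrivacyEvent] = []

def record_event(actor: str, action: str, patient_id: str, purpose: str) -> None:
    """Append one accounting entry; reject unknown action types."""
    if action not in ACTIONS:
        raise ValueError(f"unknown action: {action}")
    log.append(PrivacyEvent(datetime.now(timezone.utc).isoformat(),
                            actor, action, patient_id, purpose))

def accounting_for(patient_id: str) -> list[dict]:
    """Answer an Individual Access request: report every collection,
    use, and disclosure involving this patient."""
    return [asdict(e) for e in log if e.patient_id == patient_id]
```

A real system would, of course, need tamper-evident storage, access controls on the log itself, and standardized record formats, but even this toy version shows why auditing *use* and *collection* matters: without those entries, the accounting report a patient receives is incomplete, and accountability cannot be established.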

The reason is simple:  Trust.   Without trust in the electronic health systems that we are building, people (both patients and providers) will not want to disclose accurate information to those systems.  The result will be a stranded investment of immense proportions without any of the benefits in terms of outcomes and reduced cost that are purported to be achievable by the interconnection and interoperation of these systems.  

I was amazed to learn last week at the HL7 Working Group Meeting that several of the U.S. states have enacted legislation that gives people the right to consent (or withdraw that consent) to the communications channel over which personal health information collection, use, and disclosure can take place!  This appears to me to be a direct result of the lack of trust placed in the systems that are being designed, implemented, and operated to enable the exchange of health information today. 

If we can’t demonstrate the trustworthiness of these systems and the accountability to which we hold the users of those systems (via things such as privacy accounting), then more states will adopt similar legislation and will make it increasingly difficult to realize the benefits that are not just desired, but necessary if we hope to provide healthcare supported by better information at a lower cost. 

Some may argue that the legislation is intended to shield providers from liability for the inevitable breach  that will occur as a result of communicating personal health information over a Health Information Exchange (or interconnected EHR). I would assert that breaches are an inevitable part of information exchange.   If the information was transmitted by dog-sled, there would still be a breach at some point.   What we want to try to do is minimize the risk of that happening, and the subsequent damage done. The problem is that in order for consent to be valid, it must be informed.  

I’m a pretty technology-savvy guy, with a fair bit of understanding of privacy and security issues, and I can’t properly assess the risk of giving consent for a particular HIE without much more information than would be reasonable to provide (for security reasons).   If my consent is invalid (as it is not informed), then certainly my Aunt Mary’s consent would be also, as would that of everyone else I know.   As a result, we’re effectively left with a requirement to provide consent in those states that have that legislation and no real way of doing so (note that this is my opinion only, and remember – I’m not a lawyer.  I have no indication that this has been tested in the courts).   An Electronic Health Record?  An HIE?   Those states have effectively made those terms irrelevant, as they could never be used as intended.  And if they were, the liability still rests with those who have custodial responsibility for the information - as it always has.

The only way forward that I can see is to build enough business and technical safeguards into these systems to earn that trust (I include people, processes, and technology in this concept of a “system”).   Policies need to be established that go well beyond the security and privacy floors that are legislated, so that enough trust can be established with all stakeholders that these systems can start to provide the benefits that we believe are possible.   Privacy accounting is one of those safeguards that I believe are required.

Will it cost more if we do this?   Absolutely. 

Will it cost even more if we don’t?  Absolutely.


Canadian Standards Association: Privacy Code:

U.S. FTC Fair Information Practice Principles:

U.S. Office of the National Coordinator (ONC), Nationwide Privacy and Security Framework For Electronic Exchange of Individually Identifiable Health Information

Organisation for Economic Co-operation and Development: OECD Guidelines on the Protection of Privacy and Transborder Flows of Personal Data:
