Patients are intimately connected to their bodies, and when those bodies are measured the result is health information; so it should be no surprise that patients feel they SHOULD be just as connected to where that health information goes. The feeling that your health information is going places you don't know about can, and should, make anyone uncomfortable.
A wonderful article in the NY Times - ‘Informed Consent’ and the Ethics of DNA Research - does a tremendous job of explaining various cases where a patient found out LATER that their data was used in ways that they didn't understand or authorize. This re-purposing of the information was clearly not the patient's idea. In many of the cases the patient said that they WOULD HAVE allowed the research to be done, but were outraged that they were not asked.
There are also clear indications in the article that there is a huge gap between how researchers communicate and how patients understand. This is not to put down patients' intelligence; I actually see this as a total failure of the book-smart researcher. Possibly our higher-education system needs to do a better job of exposing students to simple facts of humanity. Researchers tend to focus so much on the outcome that they don't recognize the human part of their source data. This is the focus of the 'informed' part of consent. Communicating with patients is critical to acceptance.
There is a small paragraph on a hugely important case, that of Henrietta Lacks from the 1950s. A new book by Rebecca Skloot on the life of Henrietta is out, which has caused some of this reflection. I encourage everyone to do even the smallest research on Henrietta. We all have so many reasons to thank her, but she never knew. In her case it wasn't just her health information that was re-purposed, it was her actual body cells, which turned out to be amazing.
I have written on technical methods of De-Identification as highly contextual. I believe that we can appropriately de-identify data for a specific purpose. But the fact that data has been de-identified for one purpose does not mean it is appropriately de-identified for another purpose. I can't tell which of the abuses noted in the NY Times article tried to de-identify, but clearly the method used failed. It is clear that in the case of the Havasupai Indians, the knowledge was exposed by someone involved in the research. The human factor is very hard to control with de-identification technology.
Any single security or privacy tool is only a tool. It can be used wrongly or carelessly. Using multiple tools together helps greatly, but can still fail. But before technology is applied, policies must first be thought through carefully and written down. I really hate that everything must wait on policies, but without policies we cannot understand the context or know when there is a loss of control. I very much want healthcare researchers to do their job. I do not want to suffer failing health as my ancestors have. But that does not mean that I want to suffer because of a health information exposure. I am going to get my genes sequenced, and would love to allow researchers to play with the information. I want, truly want, to believe that nothing bad will happen.
Friday, April 23, 2010
HIT Standards - Privacy & Security committee - Presentation of BPPC
I presented the BPPC profile to the HIT Standards Privacy and Security Committee. The HIT Policy Privacy Committee was also in attendance. The meeting was also open to the public, but I don't know how many people were there.
The slides are available on the HHS web site. Dixie did a wonderful job of setting the stage. I really like how she dissected the problem space and set up the parts of the problem that BPPC fills.
I also created a single document with 90% of the current BPPC profile. This is not an official document, but it is far easier to read than trying to pick the parts out of the IHE Technical Framework.
Blog Post Updated: The recording of this session is available
During the Question segment I received really good support from Kathleen Connor, Walter Suarez, and Ioana Singureanu. The questions they asked were pointedly in support of BPPC as a stepping stone, and also set the stage for May's presentation.
On May 14, Ioana will be presenting the current HL7 ballot of CDA Consent.
Thursday, April 22, 2010
Buggy McAfee update whacks Windows XP PCs
McAfee is the latest AntiVirus vendor to cause more damage than good.
A flawed McAfee antivirus update sent enterprise administrators scrambling today as the new signatures quarantined a crucial Windows system file, crippling an unknown number of Windows XP computers, according to messages on the security vendor’s support forum. More
Not only did it whack Windows XP PCs, it hurt multiple healthcare organizations that had automatic signature updating enabled.
The damage was widespread: the University of Michigan’s medical school reported that 8,000 of its 25,000 computers crashed. Police in Lexington, Ky., resorted to hand-writing reports and turned off their patrol car terminals as a precaution. Some jails canceled visitation, and Rhode Island hospitals turned away non-trauma patients at emergency rooms and postponed some elective surgeries. More
In this case there is no evidence that this caused any problems that resulted in patient safety concerns, but clearly a hospital that needs to turn anyone away is not a good thing. There has been great concern in the Medical-Device industry around AntiVirus mandates. As usual, the Medical-Device industry position is to carefully weigh all the risks involved, and it points to various other times when AntiVirus has caused damage or delay.
There is also the humorous XKCD Comic: Diebold blames customers for applying antivirus…
Monday, April 19, 2010
Get Privacy Right, So We Can Move On Already
Seems like a down-to-earth assessment in a blog post by Lygeia Ricciardi. I wish someone would prove her assertions, because I think that they are likely correct. Please go and read her post.
A national survey released today by the California HealthCare Foundation shows that 66% of Americans believe we should address privacy worries, but not let them stop us from learning how technology can improve our health care. Amen.
This is particularly heartening news given that the same survey also documents for the first time real consumer benefits from the use of personal health records (PHRs). Seven percent of Americans now use PHRs, more than double the number in 2008. According to the survey, significant proportions of PHR users feel they know more about their health and health care, ask their doctors questions, feel connected to their doctor, and even take action to improve their health as a result of using a PHR.
Please continue and read the whole article
Saturday, April 17, 2010
Security NOW
This is a shout-out to those security geeks who listen to the podcast "Security Now" on the TWIT network. I have listened to this podcast since the beginning; well, I might have had to review a half-dozen older episodes when I first started. Last week's podcast was about a specific problem with the SSL certificate stores found in all browsers, but it was casually characterized as "SSL is broken". I posted a comment on the feedback site:
I grit my teeth every time you say SSL is broken. Yet most of the time it isn't SSL that's broken, but the policies some have chosen to use to simplify our lives. So as an example, last episode, the problem with SSL server certificates: this isn't broken SSL, this is a broken policy. I recommend SSL very often to protect healthcare. I'm involved in all of that stuff going on in Washington, D.C. around healthcare IT. I often have to reverse misunderstandings. In addition, I have to point out that the recommendations that we're giving with healthcare are to use mutual-authenticated TLS to a well-controlled certificate or CA branch that is highly controlled, following a system inspection and business agreement. This isn't just server authentication to a list that some browser vendor chooses.
I received an email from a colleague in GE's Energy division mentioning that he heard my feedback on this week's podcast, clearly someone who was able to listen to the live recording. This morning I finally got to listening to the recording and sure enough my comment was his 'Comment #1' at almost minute 23. I was happily vindicated that Steve Gibson, the host of Security Now, agreed with my assessment. It was discussed and clarified for a whole 6 minutes (does that count against my 15 minutes of fame?). Policy is so important to think through, declare in writing, inspect for the capability to enforce, and regularly audit that it is being executed.
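To make the distinction concrete, here is a minimal sketch (Python, with entirely hypothetical host and file names) of what "mutual-authenticated TLS to a well-controlled CA branch" looks like on the client side, as opposed to trusting whatever certificate list a browser vendor ships:

```python
import socket
import ssl

# Hypothetical names for illustration only: in a real deployment the CA file
# comes from the exchange's certificate provisioning process (system
# inspection plus business agreement), not from the OS or browser trust store.
TRUSTED_CA = "exchange_ca.pem"       # the single, well-controlled CA branch
CLIENT_CERT = "our_system_cert.pem"  # this system's identity certificate
CLIENT_KEY = "our_system_key.pem"

context = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
context.load_verify_locations(cafile=TRUSTED_CA)   # trust ONLY this CA
context.load_cert_chain(CLIENT_CERT, CLIENT_KEY)   # present our cert (mutual auth)
context.verify_mode = ssl.CERT_REQUIRED
context.check_hostname = True

with socket.create_connection(("hie.example.org", 443)) as sock:
    with context.wrap_socket(sock, server_hostname="hie.example.org") as tls:
        # Both sides have now proven possession of a key certified by the
        # agreed-upon CA; a failure on either side aborts the handshake.
        print("Negotiated", tls.version(), "with", tls.getpeercert()["subject"])
```

The server side is the mirror image: it loads the same CA file and requires a client certificate, so unauthenticated clients are rejected. Nothing here changes the SSL/TLS protocol itself; it only changes the policy about which certificates are acceptable, which was exactly the point of my comment.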
I was happy to see that my blog's Google Analytics showed my numbers had increased by 36% since the live broadcast on Wednesday, and in the days since, while people listen offline. This is even though I fully expect that 99.44% of the listeners of Security Now do NOT have JavaScript enabled. So I suspect that my blog has been visited by many more people than I will ever know about. I hope they enjoyed my blog, feel free to leave their own feedback on my articles, and visit often (even if I will never know they are there).
Tuesday, April 13, 2010
Research program aims to advance security in Health IT
It pains me greatly that this project is perceived as being needed. There has been so much research and standards development in this area already. The problems are not unique to healthcare, and the path is well worn by other industries. However, experts in any field will likely recognize the feeling that everyone else wants to fix whatever it is you have spent your life working on.
"To accomplish this," Gunter added, "SHARPS has assembled an elite multidisciplinary team of healthcare and cyber-security experts that is uniquely capable of carrying out an ambitious strategic program to bring security and privacy in health information technology to a new level of sophistication." MoreYes, that is an actual quote... It terifies me that they feel we need a 'new level of sophistication'. Why is it so clear to Gunther that we need a new level of sophistication? Is it that the current solutions are too simple? Further it saddens me that they feel they have an 'elite' team that is 'uniquely capable'. Oh dear are we going to get some crap out of this.
No one has identified to me what this vast list of "critical problems" is. I suspect that the root of the problems is 99% a lack of declared policies:
1) Lack of Policies -- this is usually due to those responsible for the protection not taking the time to make compliance clear. The article E-health security a problem at Vancouver Coastal Health Authority should have made this very clear. If no policies are declared, then no one can say that anything is not working as it should. Yet without policies there will be breaches. This is a big concern that I have with NHIN-Direct, as this project is clearly trying to put off Privacy and portions of Security under the singular simplification that the sending provider has determined out-of-band that they are authorized to send the data to the specified receiver. The good news is that this is a declared policy of NHIN-Direct.
Other Posts on the topic: EHR not used securely, A Look into the HHS Posts Data Breach Notifications.
2) Lack of Consent -- this is very related but distinctly different. One can have policies without allowing the consumer/patient to have a say. This lack of control is a problem. Simply giving control, however limited, can calm the consumer. This is shown over and over with social networking sites.
Former Posts on the Topic: The meaning of Opt-Out, Opt-In, Opt-Out.... Don't publish THAT!, Consent standards are not just for consent, Consumer Preferences and the Consumer, and RHIO: 100,000 Give Consent.
3) Identity. Including Patient Identity, Provider Identity, and User Identity. Identity is important to make sure that we get operational access controls correct, patient-privacy-preferences correct, delegation correct, accounting of disclosures correct, oversight correct, etc.
- Patient Identity. The aversion to a federally issued identity is well known. I am amazed that we are the only country that can’t get over this, yet have so many federally issued identities. We don’t need a federally issued identity, but a federation of identities: a strong linkage between the various healthcare identities. There could be purpose-of-use views into this federation that forbid the viewing of identities that your purpose-of-use does not need access to. What is important is that this is a STRONGLY provisioned linkage (see the sketch after this list). This is NOT an algorithm that evaluates two sets of demographics and ‘decides’ that there is a probability that they match. The act of entering a new identity and matching it to the existing identities must be done following only a well-documented process. Essentially I am worried that a fuzzy matching of patient identities will have too many false positives and false negatives. Patients’ lives are too important; patient misbehavior is too valuable.
- User Identity. This is highly linked to Patient Identity, as patients are often going to be ‘users’ in the context of a PHR. This is also highly linked to “Provider Registries” that can be used to find a provider for treatment. And this is highly linked to “Directories” that are used today in the operational setting to manage user accounts for healthcare applications as well as other business applications (yes, healthcare is a business and users need to do things besides use the EHR). Again, a federated approach is needed. The reason here is different than for Patient Identities. There is already a federally issued identity for Providers, the NPI. This does not address all the ‘users’ of a Health Information Exchange; that is, the P and O in TPO. Further, there are many ‘users’ on the Treatment side that would not be issued an NPI.
Former Posts on the topic: Federated ID is not a universal ID
Former Posts on the topic: IT security problems continue, What has HITSP done to protect confidentiality with a suite of implementable security standards, Implementing standards takes time.
- A sub-theme here is abuse of de-identification. De-Identification is a method of lowering risk; it does not eliminate risk. ONC to test re-identification of protected data, De-Identification is highly contextual, How Private can Electronic Information Ever Be?
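To make the patient-identity point above concrete, here is a small sketch (Python, with entirely hypothetical names and purpose-of-use policies) of a cross-reference registry in which a link between identities can only be created by a documented provisioning act, never by a demographic similarity score, and in which purpose-of-use views hide the identity domains a requester has no need to see:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class Link:
    local_id: str        # e.g. "hospital-a:MRN-1234"
    federated_id: str    # the federation-wide identifier it is linked to
    provisioned_by: str  # who vouched for the match
    evidence: str        # the documented process that was followed
    when: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

class IdentityCrossReference:
    """Strongly provisioned linkage: every link is an auditable record of
    who asserted it and on what basis; there is no fuzzy matching here."""

    def __init__(self):
        self._links: list[Link] = []
        # Illustrative purpose-of-use views: which identity domains each
        # purpose of use is allowed to see.
        self._views = {
            "treatment": {"hospital-a", "hospital-b"},
            "research": {"research-pseudonym"},
        }

    def provision_link(self, local_id, federated_id, provisioned_by, evidence):
        # The ONLY way a link enters the registry.
        self._links.append(Link(local_id, federated_id, provisioned_by, evidence))

    def lookup(self, federated_id, purpose_of_use):
        allowed = self._views.get(purpose_of_use, set())
        return [link.local_id for link in self._links
                if link.federated_id == federated_id
                and link.local_id.split(":", 1)[0] in allowed]

xref = IdentityCrossReference()
xref.provision_link("hospital-a:MRN-1234", "fed-0001",
                    provisioned_by="him.clerk@hospital-a.example",
                    evidence="in-person photo ID check per policy HIM-7")
print(xref.lookup("fed-0001", "treatment"))  # ['hospital-a:MRN-1234']
print(xref.lookup("fed-0001", "research"))   # []
```

The design point, not the code, is what matters: links are created through a policy-governed provisioning step and exposed only through policy-governed views.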
Monday, April 12, 2010
Anonymizing patient records for genomics
This article in the journal Nature points to a nice risk analysis and mitigation plan to allow researchers access to genetic information along with the diagnosis codes known for the patient. They have even added a mitigation, using thresholds and grouping, to assure that small populations in diagnosis-code pools don't happen.
To solve this problem, the new method allows researchers to set two parameters: the minimum number of patients (k) that should have the same set of codes, and a 'utility policy' which specifies how codes should be linked in the anonymized data. More
I really like the approach taken, as it looks at what minimal information is desired and determines through a risk assessment how to achieve that goal. From my read, they realized that they simply needed to know what the known diagnosis values were; they didn't need demographics or other indirect identifiers. At least that is all they say they are taking in the article.
I like this approach because it follows nicely the approach that I outlined in De-Identification is highly contextual. I hope that ONC, when they test re-identification of protected data, looks carefully at this output and the process used to come to this conclusion. I do not expect that their output is reusable, because De-Identification is highly contextual.
Surely more investigation needs to be done, but I like that this group was willing to think critically about the minimal information they needed for success.
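For readers who want to see the shape of the idea, here is a minimal sketch (Python, with made-up codes, and with plain suppression standing in for the paper's 'utility policy') of the k-threshold described in the quote above; it is an illustration of the concept, not the authors' algorithm:

```python
from collections import defaultdict

# Records are just sets of diagnosis codes; demographics and other
# indirect identifiers have already been removed.
records = {
    "p1": {"250.00", "401.9"},
    "p2": {"250.00", "401.9"},
    "p3": {"250.00", "401.9"},
    "p4": {"042"},          # a rare combination of codes
    "p5": {"250.00"},
}

def anonymize(records, k):
    """Release a code set only if at least k patients share it exactly."""
    groups = defaultdict(list)
    for pid, codes in records.items():
        groups[frozenset(codes)].append(pid)

    released = {}
    for codes, pids in groups.items():
        if len(pids) >= k:
            for pid in pids:
                released[pid] = set(codes)
        else:
            # Too small a pool: a real implementation would first generalize
            # codes as allowed by the utility policy; here we simply suppress.
            for pid in pids:
                released[pid] = set()
    return released

print(anonymize(records, k=3))
```

As I read the quote, the 'utility policy' is what keeps such a threshold from destroying research value, by specifying how codes may be linked or generalized before records have to be suppressed.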
Tuesday, April 6, 2010
EHR not used securely
I ran across multiple articles this past week on the topic of EHR security. They were all shocked at the results, but I am not. The reason I am not surprised is that this is the first year when it has really become important for a Healthcare Provider Organization to care about security. Sure, in past years they should have been concerned with it, but we all know that organizations (or individuals) will not take security seriously until their neighbors are being publicly flogged. One hopes that it is their neighbor, but someone must be the bad guy. What changed is Breach Notification, and specifically the HHS notification of breaches. I covered this just last week in the article A Look into the HHS Posts Data Breach Notifications.
This means that Provider Organizations are just now talking about it. It will be only slightly better next year; not because they are not changing (well, some will be changing), but because it takes quite a bit of effort to make changes to a system that is not secure by design (see #2 on my Three Security Concerns for 2010). The pressure needs to be kept up month after month (I also note three new breach notifications in healthcare just last week).
This article acts surprised that Healthcare Organizations are reactive and not proactive... This is because they have not had to be proactive, and lacking a motivation, they will not be proactive:
The 2010 HIMSS Analytics Report: Security of Patient Data indicates that healthcare organizations are actively taking steps to ensure that patient data is secure. However, these efforts appear to be more reactive than proactive, as hospitals dedicate more resources toward breach response vs. breach prevention through risk management activities. More
This article boldly declares that EMR Data Theft is on the rise. Well of course it is: as data is moved into EMRs, there are more EMRs to steal data from. This data could simply be telling us of the ever-increasing use of EMRs.
EMR Data Theft Booming
Fraud resulting from exposure of electronic medical records has risen from 3% in 2008 to 7% in 2009, a 112% increase, researcher says. More
And this one finds 'mixed results'. The story they weave is that Provider Organizations say they are compliant, yet breaches are up. Not very useful.
Survey Finds Mixed Results on Security of Electronic Health Data
Health care professionals rated their organizations high for compliance with health IT regulations, but reports of data breaches in the past year were up from two years ago, according to a new biannual report released Monday, Health Data Management reports. More
It is true what they say: Lies, damned lies, and statistics. Yes, we must get better, but we will get better by using Risk Assessment to apply reasonable controls against real risks. See my advice for the Meaningful Use - Security Plan.