Saturday, September 15, 2012

The Magic of FHIR

As I look back on this week of HL7, it was like no week of HL7 I have ever had before. Not only did we have some fantastic and productive discussions on the Privacy and Security front, but there was such openness and excitement about FHIR. Grahame has really stepped into something here. I personally think much of it has to do with Grahame himself. The excitement he has is infectious. His personal background, with experience across all of HL7 (and, I found out on Thursday, he has also implemented DICOM in his distant past), gives him great perspective. He has surrounded himself with a highly competent team. He has insisted on openness and transparency, and on strong governance. However, that alone would not be enough to make it as big as it is.

At the beginning of the week, at the FHIR Connectathon, someone asked whether the reason 'this' was so easy is because of REST. The answers were very positive about the REST approach. There was discussion that this REST focus should not be seen as purely HTTP-REST. It was pointed out strongly that although the initial pilots are using HTTP-REST, there is no reason the FHIR Resources can't be serialized into a Document, communicated using PUSH technology, or even transmitted using HL7 v2, v3, SOAP, etc. The FHIR "Resources" are truly transport agnostic. The REST approach simply focuses the design on the concept of "Resources". These other forms need quite a bit of discussion before they are interoperable, but the idea is powerful. Throughout the week I thought about the question of why this was so big, and although the REST philosophy is very much a contributing factor, it is not alone enough to make it as big as it is.
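
To make the HTTP-REST style concrete, here is a minimal sketch of reading a single Resource over plain HTTP. The server address, resource type, id, and media type are hypothetical illustrations, not taken from the FHIR draft, and the standard library does all the work:

    # A minimal sketch of REST-style access to a Resource (names are hypothetical).
    import json
    import urllib.request

    base = "https://fhir.example.org"      # hypothetical FHIR server
    url = base + "/Patient/123"            # read one Resource by type and id

    req = urllib.request.Request(url, headers={"Accept": "application/json"})
    with urllib.request.urlopen(req) as resp:
        patient = json.load(resp)          # the Resource arrives as a plain JSON document

    # Field names are left alone here; the exact JSON shape is whatever the spec defines.
    print(sorted(patient.keys()))

The same Resource content could just as well be packaged into a Document or pushed over another transport; the REST read above is only the most visible form.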

Part of the FHIR approach is to address the most mainstream use-cases and leave the edge-cases for extensions. Indeed, the concept of extending a FHIR Resource is encouraged. What is discouraged is abusing the specification, such as carrying information in an extension when there is a legitimate FHIR way to carry it. Extensions allow the FHIR core to stick to the basics. The basic philosophy also recognizes that MANDATED values are likely bad. That is not to say that there won't be Profiles that do mandate, but the core leaves as much optional as possible. This is in theory at the base of any standard, but it is stated boldly in FHIR. A contributing factor, but not alone enough to make it as big as it is.
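
As an illustration of that philosophy, here is a hedged sketch of how edge-case data might ride in a labeled extension while the core elements stay simple. The extension URL and the exact serialization are made up for illustration, not copied from the specification:

    # Hypothetical sketch: a core Resource plus one labeled extension.
    patient = {
        "resourceType": "Patient",
        "name": [{"family": "Example", "given": ["Jane"]}],   # core, optional elements
        "extension": [{
            # a made-up extension, identified by a URL that its author controls
            "url": "http://example.org/fhir/extensions/preferred-contact-time",
            "valueString": "after 5pm",
        }],
    }

The point of the sketch is only that the edge-case data is clearly labeled and kept out of the core elements, which is what lets the core stay basic.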

Back to Grahame: his documentation tooling is amazing. The whole FHIR specification is documented in an XML spreadsheet. This core table of truth is processed by a Java application, which he fully publishes, that spits out EVERYTHING. I can't claim that I can prove this, but everything that I heard about is generated by this application from this spreadsheet. This includes XML Schema, test tools, documentation, Java objects, C# objects, JSON, examples, etc, etc. I would not be surprised if this thing spit out stuff that we don't know we need. This tooling is what I have heard many standards organizations want to have. Grahame has made it happen for FHIR, but this alone is not enough to make it as big as it is.
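
To give a feel for the single-source-of-truth idea (and this is emphatically not Grahame's tooling, just a toy sketch with made-up element definitions), one small table can drive several generated artifacts:

    # Toy illustration only: one table of element definitions, several artifacts from it.
    elements = [
        # (name, type, cardinality) -- hypothetical rows from a definition spreadsheet
        ("name", "string", "0..1"),
        ("birthDate", "date", "0..1"),
    ]

    # Artifact 1: a crude XML Schema fragment
    for name, typ, card in elements:
        print('<xs:element name="%s" type="xs:%s" minOccurs="0"/>' % (name, typ))

    # Artifact 2: a crude class stub
    print("class Patient:")
    for name, typ, card in elements:
        print("    %s = None  # %s (%s)" % (name, typ, card))

Schema, documentation, objects, and examples all staying in sync because they come from one table is the property that matters here, whatever the real tooling looks like inside.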

A factor that I heard spoken of, but never spoken of as a factor, is the ready access to programming tools that make the grunt work totally hidden. I am not a programmer; I really want to get my fingers back into programming, but never find the time. Even if I found the time, it is something that one needs to do often. I think this is why Keith continues to write all kinds of demo code that shows this or that. It is really hard to be in the standards development world and yet also have responsibilities for programming. I think this is the sleeper HUGE factor. The array of tooling that is readily available makes processing simple-XML, JSON, and Atom feeds super easy. Lots of people told me and showed me just how easy it is to process simple-XML, JSON, and Atom. This was also the feedback that I got on the IHE MHD Profile, although that didn't really take advantage of this power, yet… I know that this factor is far more powerful than the factors I have described above, likely more powerful than all of them combined. We all know that the use of a standard is what makes that standard powerful. This tooling factor will make FHIR easy to use. This surely vindicates Arien Malec and John Halamka; they did tell us so. Clearly, as big as this factor is, it is not enough to make it as big as it is.
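
As a small illustration of how hidden the grunt work can be, here is a sketch that walks an Atom feed using nothing but the standard library; the feed content is a hypothetical stand-in, not an actual FHIR bundle:

    # Hypothetical Atom feed; only the Atom namespace below is real.
    import xml.etree.ElementTree as ET

    atom_feed = """<feed xmlns="http://www.w3.org/2005/Atom">
      <title>Search results</title>
      <entry>
        <title>Patient 123</title>
        <id>http://fhir.example.org/Patient/123</id>
      </entry>
    </feed>"""

    ns = {"atom": "http://www.w3.org/2005/Atom"}
    for entry in ET.fromstring(atom_feed).findall("atom:entry", ns):
        print(entry.find("atom:title", ns).text, entry.find("atom:id", ns).text)

No toolkit, no code generation, no framework; the same is true for JSON. That is the "grunt work totally hidden" point.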

I have worked to coordinate FHIR, hData, and IHE MHD. I have had detailed discussions with Grahame on the concept of pulling the IHE work into FHIR; we are going to see what this might look like. I have worked with the FHA/S&I effort on RHEx as well. At this meeting I worked to pull the upcoming DICOM WADO work into the tent. Each of these efforts is independent, and each can choose to cooperate, simply align, or compete. I am amazed at how cooperative they all have been. It is early in these efforts themselves, and even earlier in the cooperation phase. I am still hopeful that each can add value, so that the result is more powerful than any one project could be alone. This cooperation was also the subject of joking about how to pronounce FHIR --> "FOUR".

There are many challenges that will need to be addressed. We only just touched upon Security and Privacy this week. The actual problem is far bigger than a security layer like RHEx. It includes object classification and tagging. It includes an understanding of the difference between an object classification and the meaning of a communication, things like obligations. These are areas that we are still working to develop even in the abstract, much less in a medium like FHIR that wants to keep everything simple. Related to this are data provenance, aggregation and disaggregation, and de-identification and re-identification. There are areas like clinical accuracy and usefulness. There are concerns around patient safety, specifically cases where not all the data was displayed to a treating doctor because that data was not understood. What does it mean to understand, and what does mustUnderstand mean?

I am worried that success with the intended use-cases will lead to abuse for non-intended use-cases. This is of course a problem any standard has, but I see it rampant in Healthcare, where the abuse is mostly government mandated.

As I write this, I am indeed listening to “The Firebird” by Stravinsky. There was joking on Twitter that one must read the FHIR specification while listening to “The Firebird”. Somehow it is working.

The excitement is not due to any one thing, nor any specific combination. It is not REST. It is not simple-XML or JSON. It is not Grahame. The excitement is driven by all of these factors converging at just the right time and place. Time will tell if this turns into something that can survive for a long time. We must be very careful to keep this in perspective.

1 comment:

  1. Thanks, John, this is great stuff indeed. Having been in the REST, modeling, oData, and RDF world for a long time it's great to see the healthcare IT world finally moving to REST, model driven architecture, and related technologies. The complexity of healthcare data is certainly higher than many other fields but it's been our lack of proper tooling and modern approaches that has kept us from having highly interoperable systems.
