Clinical Architecture Blog

Clinical Architecture and Apelon - Collaborating to Evolve and Enhance your Terminology Ecosystem

There was a press release issued today announcing our new agreement with Apelon. Apelon has amazing, knowledgeable, and trusted resources that have been serving this industry for decades. Clinical Architecture has the most innovative and comprehensive terminology tooling in the market today. We felt that by working together we could accomplish great things for our mutual clients and the industry in general. I am very excited to have the opportunity to work with the Apelon team and honored that they thought enough of our Symedical product to collaborate with us at this level.

Please stop by either booth at HIMSS (Apelon #5055, CA #1077) to ask us about it and score a commemorative chocolate bar.
[Image: Clinical Architecture and Apelon commemorative chocolate bar]

Posted: 2/20/2014 9:18:03 AM by Charlie Harp | with 0 comments

Symedical for the iPad is now available in the iTunes App Store!

I am extremely pleased to announce that Clinical Architecture has just released Symedical Mobile Map Manager (SM3) for iPad and iPad Mini.

[Image: Symedical Mobile Map Manager (SM3) app]

This application allows a clinical map administrator to connect to one or many Symedical servers to monitor and manage exceptions, work on mapping tasks, or review submitted maps and publish them to the mapping runtime.
 
Our goal was to create a convenient, intuitive, and highly functional application that allows the resources responsible for managing maps to do so quickly, from anywhere they can connect their iPad to the web.

Here are a few screen shots:
[Screenshot: Server configuration and selection]


[Screenshot: Map assignments and quick approval]
 

[Screenshot: Term mapping console]

The SM3 app is free, but it requires access to a Symedical server running the most recent version (1.5.1) with the Mobile Administration option enabled (and you must be at least 4 years old). If you would like a demo or want to try it out, contact our sales department and we can make it happen.

We will also be showing it off at HIMSS, so feel free to stop by and see what it's like to have the power of interoperability in the palm of your hand.
Posted: 2/7/2014 11:50:45 AM by Charlie Harp | with 0 comments

Clinical Architecture at HIMSS in Orlando

Clinical Architecture will have a booth at the 2014 HIMSS conference in Orlando, February 22-27.

We will be camped out in booth # 1077 and would love to meet with you.

Please visit our HIMSS Meeting request page to schedule a face-to-face and learn how you could:
  • Simplify how you obtain and update standard terminologies, ontologies and maps.
  • Distribute, manage, localize and monitor terminology across your entire user base with relative ease.
  • Free your developers from the hamster wheel of terminology by integrating our powerful runtime APIs.
  • Realize the value of true clinical interoperability with Symedical’s semi-automated mapping and normalization pipelines.
  • Take advantage of unstructured data in ways you never thought possible with Symedical’s new SIFT engine.
  • Use the Symedical Mobile Map Manager for iPad to reduce the latency and increase the fidelity of your clinical interoperability.
We are looking forward to seeing you in Orlando!

Charlie

[Image: Clinical Architecture booth location at HIMSS 2014]
Posted: 2/3/2014 3:10:12 PM by Charlie Harp | with 0 comments

Fit for Purpose - by Shaun Shakib, PhD

A mentor of mine once used the expression “Fit-For-Purpose” and I believe it captures an important and practical principle for orienting a discussion around the use of clinical terminology.  Controlled clinical terminology is a means, not an end, and it is too easy to become obsessed with the systems, artifacts, and science around terminology and forget the many valuable reasons and objectives for implementing it.  Your objectives must drive your approach to the creation, management, and implementation of terminology.  This discussion is not a comprehensive list of considerations; rather, it provides examples of things you should think about when implementing and utilizing terminology.

Here are some use scenarios defined in the Lexical Query Services specification that capture the types of things you might want to do with terminology:


  1. Mediation - Using terminology to transform/translate messages or data records from one form or representation into another. (data exchange; see the sketch following this list)
  2. Normalization - Eliminating redundant and invalid content by unifying the multiple codes and terms used to describe concepts into a single, concept-based terminology. (analytics and decision support)
  3. Standardization - Translating local codes to standards for the purpose of information exchange. (data exchange and regulatory requirements)
  4. Information Acquisition - Using terminology to aid in the process of entering coded data (pick-lists, structured documentation).
  5. Information Display - Using terminology to translate coded data elements into human or machine-readable external forms.
  6. Indexing and Inference - Using terminology to inquire about associations which may or may not pertain between various concepts.
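
To make the first scenario concrete, here is a minimal mediation sketch in Python. The code systems, codes, and map it uses are entirely hypothetical examples invented for illustration; they are not real code sets, and this is not how any particular product implements mediation.

    # Hypothetical map from a local lab code system to a hypothetical standard one.
    LOCAL_TO_STANDARD = {
        ("LOCAL-LAB", "GLU01"): ("STANDARD-LAB", "1001", "Glucose measurement, serum"),
        ("LOCAL-LAB", "NA01"):  ("STANDARD-LAB", "1002", "Sodium measurement, serum"),
    }

    def mediate(record: dict) -> dict:
        """Translate the coded field of a record from its local form to the standard form."""
        key = (record["code_system"], record["code"])
        if key in LOCAL_TO_STANDARD:
            system, code, term = LOCAL_TO_STANDARD[key]
            return {**record, "code_system": system, "code": code, "term": term}
        return dict(record)  # no map entry: pass the record through unchanged

    inbound = {"code_system": "LOCAL-LAB", "code": "GLU01", "term": "Serum glucose", "value": 98}
    print(mediate(inbound))

Real mediation also has to deal with ambiguity, one-to-many matches, and unmapped codes; the point here is only that the terminology map is what carries the record across the boundary.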

Once you know what you want to do with terminology, it is important to consider the lifecycle of the content and the interaction it has with your systems. These considerations will drive decisions regarding the architecture of your terminology environment and your strategy for integrating (mapping) and structuring internal and external/third-party content.

Considerations such as:

  1. Can you use a standard terminology directly or do you require a layer of abstraction to insulate from potential semantic shift or drift in the standard?
  2. Will you have a need to author new content?
  3. What standard terminologies do you require?
  4. How does your use of terminology fit into your organization's overall data governance strategy?

Answering these types of questions will help ensure that you don’t spend a lot of time, money, and effort building something you can’t use or something that requires an unnecessarily large team of individuals to maintain and enhance.

Here are some basic capabilities you should look for in your terminology management environment:

  1. Obtaining and updating content – process and tools for acquiring, transforming, and loading both initial and ongoing updates of terminology content
  2. Content distribution – delivering content to other vocabulary servers or applications
  3. Batch and computer-assisted term mapping – integration of local terminologies and standards
  4. Browsing, authoring, subsetting and extending content
  5. Deploying content to a runtime environment
  6. Monitoring the performance of the content in runtime
  7. Versioning capability (a brief sketch touching items 1, 5, and 7 follows this list)
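
As a rough illustration of how items 1, 5, and 7 interact, the sketch below (in Python, with a made-up ContentSet structure that is not any vendor's actual API) deploys a new content version to a runtime only when it is newer than what is already deployed.

    from dataclasses import dataclass, field

    @dataclass
    class ContentSet:
        name: str
        version: tuple                              # e.g. (2014, 2) for "2014, release 2"
        terms: dict = field(default_factory=dict)   # code -> display term

    runtime = {}   # name -> currently deployed ContentSet

    def deploy(content: ContentSet) -> bool:
        """Deploy a content set to the runtime only if it is newer than the deployed version."""
        current = runtime.get(content.name)
        if current is not None and current.version >= content.version:
            return False   # already current; nothing to do
        runtime[content.name] = content
        return True

    update = ContentSet("problems", (2014, 2), {"P001": "Hypertension", "P002": "Asthma"})
    print(deploy(update))   # True: first deployment
    print(deploy(update))   # False: same version already deployed

Even a toy version check like this shows why versioning and deployment need to be first-class capabilities rather than afterthoughts.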

Implementing capabilities such as these requires resources that are supported by tooling and processes. It is important to note that the degree to which these capabilities are automated (how good your tools and processes are) will have a significant impact on the resources needed to implement and maintain content and on the quality and consistency of the work product. Investments in this solution (total cost of ownership) will come in the form of time (content creation, management, mapping, and maintenance) and money (content and software licensing).
 
Finally, since the architecture you put in place and the content you create will be foundational, it is important to address current needs effectively while allowing for future enhancements and growth. This means that the ideal “Fit-For-Purpose” solution must also be flexible and extensible. It must scale with your future needs.
 
These are just some of the things to consider; the key message is: don’t let the tail wag the dog. Let your purpose drive your approach to clinical terminology.


Posted: 9/2/2013 2:21:10 PM by Shaun Shakib, PhD | with 0 comments

What is Data Normalization?

Recently a friend of mine asked me a question.  "What is normalization?" 
 
One formal definition is “Normalization is the process of reducing data to its canonical (normal) form. In doing so, it removes duplicated, invalid, and potentially pre-coordinated data (depending on your definition of the canonical form).”
 
While this definition might be technically correct, when we talk about normalization in the trenches of healthcare we are actually talking about something slightly different.
 
In healthcare we deal with data. This data generally falls into three categories: data intended for humans (free text information, images, audio, video), data intended for algorithms (data tables, indexes, and graphs), and data intended for both (terminology). The last category, terminology, binds language (words) to codes and allows us to bridge the human world and the algorithm world. This works fairly well as long as it stays confined within the walls of a single information ecosystem. When you try to share codes across systems, you find out that, more often than not, different systems understand different codes, and even though the information is coded, it is not coded in a way the receiving system understands.
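
For example, the same concept can arrive coded two different ways (the systems and codes below are made up for illustration):

    records = [
        {"source": "Hospital A EHR", "code_system": "A-PROBLEMS", "code": "HTN",   "term": "Hypertension"},
        {"source": "Hospital B EHR", "code_system": "B-DX",       "code": "40110", "term": "High blood pressure"},
    ]

    # Both rows mean the same thing to a clinician, but a receiving system that only
    # understands one of the two code systems cannot tell that without a map.
    for r in records:
        print(f'{r["source"]}: {r["code_system"]}|{r["code"]} "{r["term"]}"')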
 
So, we find ourselves in a situation where we need to translate the meaning of a term from one terminology into another. This task has many names: "mapping", "mediation", "coordination", "transcoding", "interoperability", and "normalization". In every case you are taking a term from the source terminology and trying to find the most appropriate match in the destination terminology. The "rules" that you use to match terms across the semantic rift are not universal; they will change based on the goals and objectives of the exchange. This is called "purpose driven mapping", and my colleague, Shaun, is working on an excellent article on that topic, so I will leave the full explanation of that to him, stop stalling, and talk about normalization.
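
As a toy illustration of the computer-assisted side of that task (this is not how Symedical does it; the destination terms, codes, and similarity measure are stand-ins), simple string similarity can rank candidate matches for a human mapper to review:

    from difflib import SequenceMatcher

    # Hypothetical destination terminology: code -> term.
    destination = {
        "D001": "Hypertension",
        "D002": "Hypotension",
        "D003": "Type 2 diabetes mellitus",
    }

    def suggest(inbound_term: str, top_n: int = 2):
        """Rank destination terms by simple string similarity to the inbound term."""
        scored = [
            (SequenceMatcher(None, inbound_term.lower(), term.lower()).ratio(), code, term)
            for code, term in destination.items()
        ]
        return sorted(scored, reverse=True)[:top_n]

    print(suggest("essential hypertension"))

The ranking only suggests candidates; deciding which match is appropriate still depends on the purpose of the exchange, which is where the rules below come in.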
 
I typically use "Normalization" when referring to a situation where there are many sources of terminology feeding into a single (normal) target environment. This is usually the pattern you see in some type of clinical aggregation environment, where you are collecting patient information from many sources so that you can combine and reason over the data in a central location using shared logic and "normal" clinical knowledge. So the first rule of normalization is "Many sources going to a single (normal) terminology, for a given domain."
 
"Normalization" also implies that you started with something not normal and made it normal.  If this is true, then it makes no sense to reverse the process as that would result in "deviation".  So typically Normalization is considered a one way trip.  You always want to preserve the original terminology from the source, but the goal is to use the normalized data, not to provide a pivot between the deviant and normal worlds.  So the second rule of normalization is "Normalization flows in one direction".
 
Now, much of mapping is about conceptual equivalence, but not all of it. You should always make sure that you have considered the objective of a map before you start mapping, because the only thing worse than mapping is mapping again. When you are normalizing terminology, the rules of mapping are dictated by the nature and specificity of the "normal" terminology. So the third rule is "There are no fixed rules for normalization mapping; you have to determine the most appropriate way to normalize each inbound terminology".
 
Lastly, many people confuse "Normalization" with "Standardization", but the relationship is not a universal, bidirectional one. They are the same if you are normalizing to a given standard, but you can normalize to any terminology that you deem normal for your purposes. For example, suppose I have inbound medications and all I want to determine is whether they are solids or liquids. I could create a target terminology with three terms: 1=Solid, 2=Liquid, and 3=Other. I would then build a map that normalizes all of my inbound terms to one of those values. This would be a proper, albeit limited, normalization pattern, using a terminology that I built to suit my purposes. So the fourth rule is "Standardization is always Normalization, but Normalization is not always Standardization".
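
Sketched in code (the source systems and codes are hypothetical), that purpose-built target and map might look like this:

    NORMAL = {1: "Solid", 2: "Liquid", 3: "Other"}

    # Map from (source code system, source code) to the purpose-built target.
    NORMALIZATION_MAP = {
        ("PHARMACY-A", "AMOX500TAB"): 1,   # amoxicillin 500 mg tablet   -> Solid
        ("PHARMACY-B", "7781"):       2,   # amoxicillin oral suspension -> Liquid
        ("PHARMACY-B", "9034"):       3,   # nitroglycerin spray         -> Other
    }

    def normalize(code_system: str, code: str) -> str:
        target = NORMALIZATION_MAP.get((code_system, code), 3)   # unmapped terms fall to "Other"
        return NORMAL[target]

    print(normalize("PHARMACY-A", "AMOX500TAB"))   # Solid
    print(normalize("PHARMACY-B", "7781"))         # Liquid
    print(normalize("SOMEWHERE-ELSE", "X999"))     # Other

Whether unmapped terms should default to "Other" or be flagged as exceptions is itself a purpose-driven decision.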
 
To summarize, the pragmatic definition of clinical terminology "Normalization" is:
  1. Many sources going to a single 'normal' terminology, for a given domain.
  2. Normalization flows in one direction.
  3. There are no fixed rules for normalization mapping; you have to determine the most appropriate way to normalize each inbound terminology, depending on purpose.
  4. Standardization is always Normalization but Normalization is not always Standardization.

Posted: 8/5/2013 10:39:21 PM by Charlie Harp | with 0 comments