Auditing transactions


  • #52414
    Peter Heggie
    Participant

      I’m thinking of creating a thread dedicated to receiving copies of all messages. A TPS running here would extract the mrn, visit # and possibly one or two more fields from the data, and store that, along with the msg id and source thread name, in a sqlite database. I want to be able to run a query to select all records for a patient visit, to see the flow of all ADT, orders and results, specifically for troubleshooting (to answer the oft-asked question – what happened to my message?).
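      As a minimal sketch of the table and lookup described above, here is the idea in Python's sqlite3 (the table and column names are illustrative inventions, not anything Cloverleaf defines; in the actual setup the insert would happen in the audit thread's inbound tps):

```python
# Sketch of the audit table: one row per message, keyed by patient
# identifiers so one query shows the whole flow for a visit.
import sqlite3

conn = sqlite3.connect(":memory:")  # a file path in production
conn.execute("""
    CREATE TABLE IF NOT EXISTS msg_audit (
        msgid    TEXT,
        thread   TEXT,   -- source thread name
        mrn      TEXT,
        visit    TEXT,
        msg_time TEXT
    )""")

def audit_insert(msgid, thread, mrn, visit, msg_time):
    """Mirrors the insert the inbound tps would perform per message."""
    conn.execute("INSERT INTO msg_audit VALUES (?,?,?,?,?)",
                 (msgid, thread, mrn, visit, msg_time))

def flow_for_visit(mrn, visit):
    """One query answers 'what happened to my message?' for a visit."""
    cur = conn.execute(
        "SELECT msg_time, thread, msgid FROM msg_audit "
        "WHERE mrn = ? AND visit = ? ORDER BY msg_time",
        (mrn, visit))
    return cur.fetchall()

audit_insert("0.0.1234", "adt_in", "123", "V001", "2014-01-01T10:00:00")
audit_insert("0.0.1240", "orders_in", "123", "V001", "2014-01-01T10:05:00")
print(flow_for_visit("123", "V001"))
```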

      I put a tps in the startup procedure that opens a connection to sqlite, and then a tps in the inbound procedure that extracts the data from each incoming message and performs an insert, using the same ‘handle’ obtained in the startup tps. This part is working fine.

      My goal is to make this non-invasive and require the least amount of maintenance. I’m not sure I would get any significant value from any kind of transaction auditor without opening up the message and extracting some of the content.

      I’m looking for comments on the value of this information, more specifically, the value of being able to find all the information in one place (one sql query), and comments on the support and upheaval required.

      I am faced with the problem of extracting the data. I have to update each existing thread (if I want to audit its transactions). I can either update the routing to send a copy of each message to a new audit thread, or I can update the post-protocol UPOC with a new tps that makes the sqlite insert directly. The latter seems more invasive and more problematic (concerning the sqlite connections). The former seems less of a bottleneck because it is asynchronous; if the audit service fails, the transactions continue to flow. Can anyone comment on the performance trade-offs? I’m thinking that it is ‘relatively easy’ to just add another route to a thread. Being new to Cloverleaf, I realize this may be a naive notion; I’m sure I don’t understand all the consequences of these updates.

      Finally, I can’t get some of this scheme to work in the test environment. I’ve got an inbound thread and an outbound thread. They both have an extra route to the audit thread. The copied message from the inbound thread is sent to the audit thread successfully, but there is nothing sent from the outbound thread to the audit thread. I do see the outbound thread successfully processing the application message (using fileset-local) and writing the data to the specified output file. But the transaction ends there and the entry in Route Messages seems to be ignored. I’ve tried using an hci static thread and also the same message name that I used in the inbound thread. Any ideas?

      Thanks, Pete

      Peter Heggie

      • #74115
        David Harrison
        Participant

          I can see the value in this kind of database. I’m forever trawling smat files to answer the question “What happened to my message?”, usually from 3rd party vendors.

          I would suggest, rather than collecting the data in real-time as you are proposing, it may be better to build a system which updates the database from smat files after you have archived them. This way you could match up messages for a set of patients together with their respective HL7 Acks. It would also be completely non-invasive.

          Dave

        • #74116
          Jim Kosloskey
          Participant

            Pete,

            First of all I am not sure why you wouldn’t just utilize the already available SMAT files (which contain every message sent and received – including acknowledgments if you like) instead of adding overhead to capture the messages (or some portion thereof) again.

            However, I think the answer to your question regarding capturing the outbound messages is that you need to make a copy of the message in Tcl and set the disposition of the copied message to OVER. That will place the copied message on the inbound side of the outbound thread and thus look like a message which can then be routed.

            email: jim.kosloskey@jim-kosloskey.com 29+ years Cloverleaf, 59 years IT - old fart.

          • #74117
            Peter Heggie
            Participant

              Thanks Dave, Jim,

              I’ll look at the SMAT files and interfaces first. Using the proposed audit thread, if I have to use a tps to create a copy of the message, then it would be more invasive than I wanted.

              So far I’m not finding a command-line way to display SMAT file content. In the testing tool and database tool, there is an option to build and show the equivalent command-line syntax if you want to perform the same function in a script. Is there something like that for SMAT file displays?

              How do other people access SMAT Files or archived SMAT files from a script? Is there an API to read these files message by message, or to search messages for specific data values?

              I’m looking for a fast and easy way to trace all transactions for a visit: basically, typing in a command with arguments for mrn, visit number and possibly a date range. When I hit enter, it will return a list of all msgids, thread names, and dates & times.

              So to avoid some complex search programming, it would be better to extract all of this data from each SMAT file when it is archived, using the index portion to get the msgid, source thread name and date, and using the message portion to get the mrn, visit number and any other data, and then storing that in a database table. If I want milliseconds, will I have to put that in the USERDATA in a post-proc tps? I’ll need a lookup table to help with the field search; depending on the message format, I’ll have different search criteria.
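              The "message portion" extraction could be sketched roughly like this (Python for illustration; the field positions used, PID-3 for the MRN and PV1-19 for the visit number, are common HL7 v2.x conventions but should be confirmed against each interface's spec):

```python
# A minimal HL7 v2 field extractor for a batch pass over archived SMAT
# messages. Note that for MSH this naive split is off by one, since
# MSH-1 is the field separator character itself; PID/PV1 are unaffected.
def hl7_field(message, segment, field):
    """Return the given field of the first matching segment ('' if absent)."""
    for seg in message.replace("\n", "\r").split("\r"):
        parts = seg.split("|")
        if parts[0] == segment:
            return parts[field] if field < len(parts) else ""
    return ""

# A tiny made-up ADT message (segments separated by carriage returns).
msg = ("MSH|^~\\&|ADT|HOSP|||20140101100000||ADT^A01|0001|P|2.3\r"
       "PID|1||123456^^^MRN||DOE^JOHN\r"
       "PV1|1|I" + "|" * 17 + "V0001")

mrn = hl7_field(msg, "PID", 3).split("^")[0]   # first component of PID-3
visit = hl7_field(msg, "PV1", 19)              # PV1-19 visit number
```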

              Am I on the right track? Is there an easier way?

              Pete

              Peter Heggie

            • #74118
              Jim Kosloskey
              Participant

                Pete,

                Use the idx and msg files of the SMAT as indicated (via Tcl for example) and then build your mysql db from that information if you like.

                Then the search could be done with the mysql tools you will have (you were going to need them anyway – right?).

                Or — you could merge all of your SMAT files into a gigantic SMAT file (I am not recommending this just saying it could be done) and use the SMAT tool to query that huge SMAT file.

                Or — read through the SMAT files as indicated and maintain an Oracle or SQL Server or other proprietary DBMS and use the plethora of tools in the market for those.

                As for SMAT command line go to the help documents in the Reference Book, Engine commands and you will see hcismat.

                email: jim.kosloskey@jim-kosloskey.com 29+ years Cloverleaf, 59 years IT - old fart.

              • #74119
                Mike Shoemaker
                Participant

                  Hey Pete, I’m not sure if this is what you have in mind, but one of my “extracurricular activities” here has been a message archiver for all of our engines (we have both Cloverleaf and Openlink). We wrote a Java app that receives source messages (untranslated) via tcp/ip (with hl7 ack/nak) and archives them to a mysql database. The app lives on its own server and consists of 3 modules: a tcp/ip port listener, an “archiver/indexer” that parses the data and inserts the appropriate fields, and a fancy little viewer utility that will also read Cloverleaf SMAT files.

                  We are NOT dealing with engine meta-data because I feel it doesn’t really provide any value. We are pulling the data from the MSH segment and are able to provide the appropriate context to retrieve messages that way. I have 1 single thread on Cloverleaf that basically forwards all of my ancillary feeds. Things like ADT, Orders, Results, etc. are combined into a single feed which is received by the Java app.

                  Our next step is to build a bigger index from the stored data the archiver produced. What I’m trying to get out of the indexer is a 3rd-party app that will display the data for a particular patient, message by message, across all sending systems. The patient data is directly linked to an HL7 message which can then be resent if needed. If you or anyone is interested in specifics I can go further.

                  Mike

                • #74120
                  Peter Heggie
                  Participant

                    Thanks Jim,

                    As the new guy in the group, I need to be careful when introducing changes; reading SMAT files is a pretty safe option. I want to have the data ready to query, so I think there would be a batch process that runs often, extracting the data from them. I’d like to get it near-real-time… and yes, it would be better to store this information in mysql, Oracle or SQL Server instead of sqlite to allow access to the data from networked clients. Once it is in a database, it could be accessed from a command line or a small browser app.

                    Thanks Mike,

                    I think I understand what you have at a conceptual level; I worked on something similar in a previous life (using MQ) that had a GUI showing transaction summaries, the ability to filter, the ability to drill down into message content, and the ability to resubmit. I’m guessing that you had to include the ‘sending/copying’ function (tps) in your existing ancillary threads to send a copy to your Java system. I’ve always been a fan of real-time processing, but I can’t go in and update all our existing threads, even though the process would be more efficient. Your indexing/parsing functionality is something I’ll probably try to put into a tcl proc, pulling data from SMAT files and inserting into a table.

                    I don’t know enough about the HL7 content yet to be sure that I don’t need the CL meta-data. If I query by visit number and see all the transactions in and out of CL, I should be able to see, or notice as missing, specific transactions to specific ancillary systems. Maybe it’s enough to have a time-ordered list of rows, so I can follow the workflow based on the message timestamp, but I think I need to see the source and destination of each message, and doesn’t that mean I need CL meta-data? And if different messages have different formats, then it may be easier to use the format name to look up a search/parse scheme for the required data elements.

                    I’m not sure it applies here (in this business), but when I built that previous application, I did it not only to reduce my troubleshooting time, but also to reduce the support calls – my superusers had access to the tool and they looked there first before calling me. My support call volume dropped significantly. I’m a big believer in automation and empowerment. I was fortunate in that the database and tables were already there and the population of those tables was already built into the application – there was not any user interface and the users were not given direct access, so a three-tier gui app solved that problem. I guess that explains my preference for a workflow query.

                    Pete

                    Peter Heggie

                  • #74121
                    Mike Shoemaker
                    Participant

                      Hey Pete, No tcl needed in my solution. I’m simply adding another ancillary application to the mix; in terms of NetConfig, I am splitting the receiving thread at the point at which it is received. It might take a little rewiring in another setting, but I am able to take the single point of input and split it off: one copy going through the interface and one going to an “archive” site. This site does only one thing: it receives feeds of hl7 messages and combines them into a single tcp/ip connection. These are all raw messages.

                      I can then query on visit number, ssn or account number, name or whatever field (we even built it so we can easily add fields of interest and reprocess old data) to produce a list of hl7 messages, and then resend them or do whatever I want with them. I’m trying to avoid needing to use Cloverleaf since we also use Openlink. I can tell from the MSH segment where the message came from, and from that I’ll know the source engine as well.

                      Mike

                    • #74122
                      Peter Heggie
                      Participant

                        OK, I think I understand about splitting the inbound process. I’m still hung up on having to differentiate between incoming messages and outgoing messages; I think I need to track both separately. For completeness, maybe I’ll later add in a feed from the error database…

                        I don’t know enough about HL7 to understand the context available in the MSH, but I’ll get there!

                        Good luck with living in two worlds… I’m trying to use CL as much as I can.

                        Pete

                        Peter Heggie

                      • #74123
                        Jeff Dinsmore
                        Participant

                          I’ve been toying with something similar for searching across multiple days of SMAT files. It’s a cumbersome task to find messages when you don’t know exactly when they were processed.

                          The indexing of .idx files is a nice idea. I’d recommend storing most of the info from the .idx files. There’s a datestamp, current and source message IDs, source/destination connections, and an offset/length for pulling messages from the .msg files. Plus you can store select info from the messages – like message type, MRN, encounter, patient name, etc. – anything you may want to search by.
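                          The offset/length lookup could be sketched like this (Python for illustration; the byte layout is simulated, and a real implementation would first need to parse the .idx file, which is a Tcl keyed list, to recover the offsets):

```python
# Retrieve one message from a SMAT .msg file using the offset/length
# pair recorded in the matching .idx entry.
import io

def read_message(msg_file, offset, length):
    """Seek to the recorded offset and read exactly `length` bytes."""
    msg_file.seek(offset)
    return msg_file.read(length)

# Two messages stored back to back, as in a .msg file.
data = b"MSH|...first...\rMSH|...second...\r"
first_len = len(b"MSH|...first...\r")
with io.BytesIO(data) as f:          # stand-in for open(path, "rb")
    second = read_message(f, first_len, len(data) - first_len)
```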

                          Depending on your message volume, this database could get large quickly.

                          The other option is to work with the .idx/.msg files directly. A method I used in the past was to start with “now” and work backward through logs – displaying messages that match search criteria as they’re found – so you can cancel when it uncovers what you’re looking for. A date range speeds the search if you have an idea of approximate date/time. Depending on how frequently you need to find messages – and assuming that most of what you’re looking for is fairly recent – this may be an acceptable approach.
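                          That backward search can be sketched as a lazy generator, so the caller stops as soon as it has a hit and older archives are never touched (the archive contents and the match test below are placeholders):

```python
# Walk archives newest-first and yield matches lazily; cancelling the
# iteration early means older files are never read.
def search_backward(files_newest_first, matches):
    for msgs in files_newest_first:      # newest archive file first
        for m in reversed(msgs):         # newest message in that file first
            if matches(m):
                yield m

archives = [["msg8", "visit-42 msg9"],   # today's archive
            ["msg1", "visit-42 msg3"]]   # yesterday's archive
hit = next(search_backward(archives, lambda m: "visit-42" in m))
```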

                          Does anyone know if there are any locking concerns with reading the .idx/.msg files the engine is currently using?

                          Jeff Dinsmore
                          Chesapeake Regional Healthcare

                        • #74124
                          Mike Shoemaker
                          Participant

                            It seems like you are saving messages via SMAT for troubleshooting? I’m not sure that’s the intended purpose. SMAT is an archive of messages. Sure, you can save both inbound and outbound and plainly see where you are missing messages, but that’s only part of the picture.

                            My troubleshooting always begins with the question “What happened with Patient XYZ?” or “What happened to the lab result from this patient on this date?” To deal with those issues, I’ve always used SMAT as a way to reproduce the questioned message (or messages) – a source message that hasn’t been translated or filtered – and then use Cloverleaf’s testing tool to see what’s happening. I never save outbound messages because it’s redundant if you have the source. You can always resend the source message or run it through the route tester or translation tester to see what’s happening with that particular message.

                            Also, as it stands now for me, those SMAT files become too big to deal with if I don’t cycle save routinely. I chose to do so every hour, then time-date stamp the files and store them outside of the /hci filesystem. Also, each individual “interface” has its own threads and saves to different SMAT files, so I never get to see “the big picture” without a lot of work pooling all the messages. The other problem I have is dealing with a patient that spans several hours or even days. Pulling those files makes my stomach churn sometimes 😉

                            So with my little utility, I can basically see all the messages in a one-stop-shop, time-lapsed feed, so to speak. Think of it like a patient profile in Facebook format for SMAT 😉 On a side note, how often are you “losing” messages in the engine? 95% of my job is pointing a finger at one of the ancillary apps and, of course, vendor complaints 😉

                          • #74125
                            Jim Kosloskey
                            Participant

                              Jeff,

                              Beginning with Cloverleaf 5.6 on AIX platforms, there is no issue with referencing SMAT files that are active (with or without the SMAT tool).

                              Prior to 5.6 on AIX (and perhaps other platforms) one had to cycle the SMAT files to use them.

                              email: jim.kosloskey@jim-kosloskey.com 29+ years Cloverleaf, 59 years IT - old fart.
