Using local db to capture and compare data


  • Creator
    Topic
  • #122180
    Timothy O’Donnell
    Participant

      Good morning! I’m in a bit of a pickle and I’m hoping there’s a way out.

      I have an interface that sends hundreds of thousands of messages a day into our EMR, many of them duplicate aperiodic vitals ORUs. Basically, the vendor’s machine sends out the first message, let’s say at 05:00, and then resends the same value every minute as separate messages until the vitals are taken again, at which point the new value is resent over and over in the same way. It essentially turns aperiodic vitals into periodic ones, which is clogging up our EMR with junk data to process.

      The vendor can’t help, but I have an idea that may. I just don’t know how to implement it.

      Basically, when the inbound ORU comes in from the vendor, I want to pull the vitals timestamp and the visit number from their respective fields and store them as a pair in a database, then send the original ORU down its respective path to the EMR. When the next message comes in, look up the visit number in the database and compare the timestamps. If the timestamp is the same, suppress the message; if it’s different, update the database with the new timestamp and then send the message to the EMR. I’d also need to clean up old visit numbers as time goes on so the database doesn’t get too full.
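
      Here’s the logic I’m picturing, sketched in Python just to illustrate. I’m assuming the visit number lives in PV1-19 and the vitals timestamp in OBR-7 for our feed, and the dictionary is only a stand-in for whatever database ends up holding the pairs:

        # Illustration only: pull the two fields out of the raw ORU and apply the
        # keep-or-suppress rule. A dict stands in for the real database here.
        last_seen = {}  # visit number -> last vitals timestamp forwarded to the EMR

        def get_field(msg, segment, index):
            """Return field `index` of the first matching segment (1-based, HL7 style)."""
            for seg in msg.split("\r"):
                fields = seg.split("|")
                if fields[0] == segment:
                    return fields[index] if index < len(fields) else ""
            return ""

        def handle_oru(msg):
            visit = get_field(msg, "PV1", 19)     # assumption: visit number in PV1-19
            vitals_ts = get_field(msg, "OBR", 7)  # assumption: observation time in OBR-7
            if last_seen.get(visit) == vitals_ts:
                return None                       # same visit, same timestamp: suppress
            last_seen[visit] = vitals_ts          # new value: remember it, pass it on
            return msg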

      Any ideas? Or maybe a better way to handle the problem I’m not thinking of? Thanks!

    • Author
      Replies
      • #122181
        Jason Russell
        Participant

          We do something similar with another vendor who sends a lot of patient data over and over. You just have to make sure you store enough data to tell the messages apart. As for cleanup, I would have a scheduled task that runs daily or weekly, checks the date/time stamp, and removes anything more than x amount of time old.
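
          For example, assuming the table keeps a last-updated column (the table, column, and file names here are made up), the scheduled job could be as small as:

            # Hypothetical nightly/weekly cleanup: drop rows not updated in N days.
            import sqlite3

            RETENTION_DAYS = 30  # the "x time frame" -- tune it to your longest visits

            def purge_old_rows(db_path="dedup.db"):
                with sqlite3.connect(db_path) as conn:
                    conn.execute(
                        "DELETE FROM seen WHERE updated_at < datetime('now', ?)",
                        (f"-{RETENTION_DAYS} days",),
                    )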

        • #122182
          Brent Fenderson
          Participant

            We do a similar thing using SQLite:

            If the key is not in the table, or the value doesn’t match, write it to the table and continue the message; else, kill the message.
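
            In rough Python/SQLite terms (table and column names are placeholders), the check looks like this:

              # Sketch of the SQLite check: continue the message when the key is new
              # or the value changed, otherwise kill it.
              import sqlite3

              conn = sqlite3.connect("dedup.db")
              conn.execute(
                  "CREATE TABLE IF NOT EXISTS seen ("
                  "  visit_number TEXT PRIMARY KEY,"
                  "  vitals_ts    TEXT,"
                  "  updated_at   TEXT DEFAULT CURRENT_TIMESTAMP)"
              )

              def keep_message(visit_number, vitals_ts):
                  row = conn.execute(
                      "SELECT vitals_ts FROM seen WHERE visit_number = ?",
                      (visit_number,),
                  ).fetchone()
                  if row is not None and row[0] == vitals_ts:
                      return False  # already sent this exact value: kill the message
                  # Not in the table, or the timestamp changed: record it and continue.
                  conn.execute(
                      "INSERT OR REPLACE INTO seen (visit_number, vitals_ts) VALUES (?, ?)",
                      (visit_number, vitals_ts),
                  )
                  conn.commit()
                  return True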

          • #122183
            Jim Vilbrandt
            Participant

              Hi Timothy,

              We have a similar issue with several systems sending many MDM messages in a row for the same document ID. For these systems, I write the message to a directory, with the document ID as the file name. Newer messages with the same document ID overwrite the previous version. A second thread then checks the folder and only processes those files that have not been updated in the last five minutes.
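
              Roughly, in Python (the directory, the five-minute window, and the assumption that document IDs are safe to use as file names are all just illustrative):

                # Sketch of the file-based approach: the latest message wins per document
                # ID, and a second pass only picks up files that have sat untouched.
                import os, time

                SPOOL_DIR = "/tmp/mdm_spool"  # placeholder directory
                QUIET_SECONDS = 5 * 60

                def write_message(document_id, raw_message):
                    # Newer messages with the same document ID overwrite the old file.
                    os.makedirs(SPOOL_DIR, exist_ok=True)
                    with open(os.path.join(SPOOL_DIR, document_id), "w") as f:
                        f.write(raw_message)

                def collect_settled_messages():
                    """Return messages whose files have not changed in the quiet window."""
                    now = time.time()
                    ready = []
                    for name in os.listdir(SPOOL_DIR):
                        path = os.path.join(SPOOL_DIR, name)
                        if now - os.path.getmtime(path) >= QUIET_SECONDS:
                            with open(path) as f:
                                ready.append(f.read())
                            os.remove(path)  # processed; don't pick it up again
                    return ready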

              Best Regards, Jim

            • #122184
              Charlie Bursell
              Participant

                If you are sure that once the stored values change, that is the end of the duplicates, you could simply update the database values in place to preclude cleanup.

                • #122188
                  Jason Russell
                  Participant

                    The ‘cleanup’ comes when the specific accounts are no longer being sent at all (the visit is over!), and there’s nothing to tell the script that this is the last message (it likely just stops). So to keep the DB minimal, you’d have to run a process outside the script itself to remove accounts more than x days old.
