best way to resend in bulk from smatdb


  • Creator
    Topic
  • #55122
    Peter Heggie
    Participant

      Does anyone have any suggestions on the best and/or fastest way to resend thousands of messages, hopefully in an automated way?

      scenario:

      – 270,000 ADT messages in a smatDB

      – only 3600 encounter numbers need to be resent

      I tried using a regex with many options – (enc1|enc2|enc3|enc4) etc. – but when I click on search, the cursor clocks for a few seconds and then comes back with no messages selected. Is this a bug? What is the maximum size or maximum number of alternatives in a regex used in searching a SMAT DB?

      I created a tcl that can perform a quick lookup of the encounter number and kill any message that does not have one of these, but using this tcl means I’ve already selected and resent the message. I’d like to filter the messages I’m resending, but be able to filter on 3600 possible encounters.
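      The lookup-style filter described above could be sketched like this (Python here for illustration rather than Tcl; treating PID-18 as the encounter-number field is an assumption about the message layout):

      ```python
      # Sketch of a set-membership filter: load the wanted encounter numbers
      # once, then test each message against the set instead of a giant regex.
      def keep_message(msg, wanted):
          """Return True if the message's PID-18 encounter number is in `wanted`."""
          for seg in msg.split("\r"):        # HL7 segments are CR-separated
              if seg.startswith("PID"):
                  fields = seg.split("|")    # field 18 is index 18 (segment name is index 0)
                  if len(fields) > 18 and fields[18].split("^")[0] in wanted:
                      return True
          return False

      # The 3600 encounters would be loaded once into a set for O(1) lookups, e.g.:
      # wanted = {line.strip() for line in open("encounters.txt") if line.strip()}
      ```

      The point of the set is that each message costs one hash lookup, however many encounters are on the list.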

      I have not yet found a means to resend using regex from the command line.

      Peter Heggie

    Viewing 9 reply threads
    • Author
      Replies
      • #84181
        Robert Kersemakers
        Participant

          Hi Peter,

          I would create separate threads (this can even be done in a separate site/server) which will take the 270,000 messages and filter out all the messages that need to be resent. You already seem to have a tcl for that, so it shouldn’t be hard to do. Then you can resend the filtered messages in the actual site, either from SMAT or from the outbound messages that were sent to a file, for example.

          Huge number of messages you are talking about. Not going to ask what happened…

          Zuyderland Medisch Centrum; Heerlen/Sittard; The Netherlands

        • #84182
          Keith McLeod
          Participant

            Is there a pattern or range to the encounter numbers? Are you using smatdb? What is the destination system? I have used several methods in the past. Many times I end up creating a NL-delimited file and splitting it into chunks that can be run with room to allow current messages to stay caught up. If you need to process, you may be able to use output from the testing tool as well.
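            The chunking step described here could be scripted along these lines (a Python sketch; the chunk size and part-file naming are arbitrary choices):

            ```python
            def split_into_chunks(path, lines_per_chunk=500):
                """Split a newline-delimited message file into numbered part-files,
                so each chunk can be resent while live traffic stays caught up."""
                part, count, out = 0, 0, None
                with open(path) as f:
                    for line in f:
                        if count % lines_per_chunk == 0:
                            if out:
                                out.close()
                            out = open(f"{path}.part{part:03d}", "w")
                            part += 1
                        out.write(line)
                        count += 1
                if out:
                    out.close()
                return part  # number of chunk files written
            ```

            Each part-file can then be fed to a resend thread one at a time, with pauses in between if the destination needs breathing room.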

          • #84183
            Peter Heggie
            Participant

              thank you – I did something similar, or actually I am doing it now; it will take several hours to resubmit all. I created a new process and a new inbound thread, copied the smatdb files to the new process directory, and only included the single route detail needed to send to the ancillary. I put the tcl that performs extra filtering up front, but still need to resubmit everything.

              We converted to a new HIS. This ancillary is a third-party physician billing system that chose not to have a test system and declined to participate in our several rounds of integration testing. We did tell them that our goal was to update our interfaces to make sure that the output to them was the same as before; however, some data could not be mapped like-for-like, so there was a change to their data (insurance). So we are resending two weeks of data.

              While this is going on I will look at alternatives.

              Peter Heggie

            • #84184
              Robert Kersemakers
              Participant

                I was thinking about using the testing tool as well, but as it involves 270,000 messages I thought it was not a good idea to go that way. A few thousand messages would be doable though.

                Zuyderland Medisch Centrum; Heerlen/Sittard; The Netherlands

              • #84185

                You could export the whole DB to a file using the Resend to a file option and then use the msgParser script (see my signature) to search the file using grep. This is how I would prepare the messages for resending.

                And then create a new temporary thread just for resending (right click, full, resend), and set the priority lower or higher depending on what you need.

                -- Max Drown (Infor)

              • #84186
                Peter Heggie
                Participant

                  I’m going through the utility the normal way, resending using date criteria, one day at a time, to limit the activity. It’s taking about 20 minutes to resend one day.

                  But at the same time, I’m trying something else. I copied the SMAT DB to a second file. Then I used SQLite to create a third, empty version. Now I’m using SQL to pull only the ADT with those encounter numbers from the second file to the third file. Once I complete that for all encounters I want, then I can just (hopefully) resend all messages from the third file.

                  The SQL query is also taking time to run. It seems to slow way down when the WHERE clause is long (where MessageContent like '%enc1%' or MessageContent like '%enc2%' or MessageContent like '%enc3%', etc.). So I’m doing about 100 WHERE conditions at a time. It’s working, and I hope faster than the GUI method, but I’m not sure yet. Both methods are running slow and I actually have not tried to resend from the third file.
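                  That chunked-WHERE copy could be scripted rather than run by hand. A sketch using Python's sqlite3 (the table and column names, smat_msgs and MessageContent, are assumptions about the SMAT DB schema, and the destination table must already exist):

                  ```python
                  import sqlite3

                  def copy_matching(src_db, dst_db, encounters, chunk=100):
                      """Copy rows whose MessageContent mentions any wanted encounter
                      into dst_db, batching the LIKE conditions so no single WHERE
                      clause grows unmanageably long."""
                      src = sqlite3.connect(src_db)
                      src.execute("ATTACH DATABASE ? AS dst", (dst_db,))
                      for i in range(0, len(encounters), chunk):
                          batch = encounters[i:i + chunk]
                          where = " OR ".join("MessageContent LIKE ?" for _ in batch)
                          src.execute(
                              f"INSERT INTO dst.smat_msgs SELECT * FROM smat_msgs WHERE {where}",
                              [f"%{e}%" for e in batch],
                          )
                      src.commit()
                      src.close()
                  ```

                  Note a message matching two encounters in different batches would be copied twice; deduplicating afterwards (or tracking copied rowids) may be needed.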

                  So there is no magic bullet, but I’m not surprised when I consider that I want some way to filter out everything except 3200 encounters.

                  Peter Heggie

                • #84187
                  Joseph Paquette
                  Participant

                    I can’t speak for how fast they would process, but you could do something like this, running on a Linux system. I am running an older version, so make sure to change the paths to fit your needs. This would pick up each file, select all messages and then resend the selected messages.

                    #!/usr/bin/perl

                    my $qdx_rt  = "/usr/quovadx/qdx5.7MB/integrator";
                    my $smatloc = "/usr/quovadx/qdx5.7MB/integrator/exec/processes";

                    # Pick up the Cloverleaf environment
                    eval `$qdx_rt/sbin/hcisetenv -root perl $qdx_rt`;

                    opendir(SMATDIR, $smatloc)
                        or die "Unable to open $smatloc for purge: $!\n";
                    my @msg_files = grep /\.idx$/, readdir(SMATDIR);
                    closedir(SMATDIR);
                    chdir($smatloc);

                    my @smatfils = sort @msg_files;
                    for my $eachfile (@smatfils) {
                        # Select all messages in this SMAT file and resend them
                        my $replaysmat = "hcismat -i $smatloc/$eachfile -ib -sall -orst";
                        system($replaysmat);
                    }

                  • #84188
                    Keith McLeod
                    Participant

                      Did you try a regular expression in your SQL query? It looks like you are using patterns that could be consolidated into a regular expression.

                      Generally SQL supports regexp.  Of course you are probably done by now….
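                      One wrinkle with SQLite specifically: it parses the REGEXP operator but delegates it to a user-defined regexp() function that is not provided by default, so one has to be registered first. A sketch with Python's sqlite3 (the smat_msgs/MessageContent names are assumptions about the SMAT DB schema):

                      ```python
                      import re
                      import sqlite3

                      conn = sqlite3.connect(":memory:")
                      # REGEXP in SQLite calls a user function named "regexp"; register one
                      # backed by Python's re module.
                      conn.create_function(
                          "regexp", 2,
                          lambda pattern, text: text is not None
                          and re.search(pattern, text) is not None,
                      )

                      conn.execute("CREATE TABLE smat_msgs (MessageContent TEXT)")
                      conn.executemany("INSERT INTO smat_msgs VALUES (?)",
                                       [("...PID|...|E1001|...",), ("...PID|...|E9999|...",)])

                      # One alternation replaces a long chain of LIKE clauses:
                      rows = conn.execute(
                          "SELECT COUNT(*) FROM smat_msgs WHERE MessageContent REGEXP ?",
                          ("E1001|E1002|E1003",),
                      ).fetchone()[0]  # rows == 1 here
                      ```

                      Whether this beats chunked LIKE clauses on a 270,000-row table would need measuring; the regex runs once per row either way.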

                    • #84189
                      Peter Heggie
                      Participant

                        Thank you all for helping. Follow-up – using the manual SQL query of the SQLite message table took about 15% less time to run than manually querying and resending in big chunks using the SMAT DB functions. However, it took a few hours to set up, experiment with and test the SQL query, so it was a wash.

                        I don’t blame the SMAT DB resend functions for failing to accommodate a large regular expression when dealing with 100M+ SMAT databases – it’s a lot to load into memory.

                        We are still fighting fires from our HIS conversion over a month ago, and we have not had a chance to set up a good archive process for our SMAT files, so we have only manually archived a few of them.

                        I have mentioned before that we prefer to have a known window of time to search in, whether it is one day or seven days. We are much more comfortable with the SMAT DB functions now and I think we will be fine with a seven day range.

                        To accomplish that, I think we will use the SmatHistory folders – we will script a process that will cycle save once per day, populating the SmatHistory folder, and then run a SQLite “insert from” query to copy all messages older than seven days into a backup SQLite database stored in another location (which will not be picked up by the SMAT DB search functions). A second query will “delete behind” using the same criteria. This will keep a rolling seven-day history of messages.
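                        The “insert from” / “delete behind” pair described above could run as one small transaction against an attached backup database. A sketch (Python sqlite3; the smat_msgs table and TimeIn timestamp column are assumptions about the schema):

                        ```python
                        import sqlite3

                        def roll_off(live_db, backup_db, days=7):
                            """Move messages older than `days` from the live DB into the
                            backup DB: the insert-from and delete-behind statements run
                            in a single transaction so they succeed or fail together."""
                            conn = sqlite3.connect(live_db)
                            conn.execute("ATTACH DATABASE ? AS backup", (backup_db,))
                            cutoff = f"-{days} days"
                            with conn:  # both statements commit together, or neither does
                                conn.execute(
                                    "INSERT INTO backup.smat_msgs "
                                    "SELECT * FROM smat_msgs WHERE TimeIn < datetime('now', ?)",
                                    (cutoff,),
                                )
                                conn.execute(
                                    "DELETE FROM smat_msgs WHERE TimeIn < datetime('now', ?)",
                                    (cutoff,),
                                )
                            conn.close()
                        ```

                        Running this daily right after the cycle save keeps the live database at a rolling seven days.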

                        To access the older backups, we will have another script that can copy the backup SQLite database files into the SmatHistory folder and then include them in our searches. I’m still not clear on the format and functionality of this script – I want to make it easy to use and easy to specify which Smat DB backups to “restore”. Every night we can purge those copies of backups. Before we implement that, we have to verify that these copies of backups can actually be searched once created in the SmatHistory folder.

                        To answer the last question – yes, I have gotten much better at regular expressions, and having the AND functionality of multiple criteria is significantly better than Smat File searches – this is a wonderful, long-awaited improvement, and our searches and resubmissions have gotten much easier.

                        Peter Heggie

                      • #84190
                        Brent Fenderson
                        Participant

                          There is a fix in 6.1.2 that corrects the smatdb issue you’re having with selecting messages for resend. Also, the instant search is a client option in this release, so you’re not running a query on big tables until the search criteria is set.

                      • The forum ‘Cloverleaf’ is closed to new topics and replies.