Jim Vilbrandt

Forum Replies Created

Viewing 15 replies – 1 through 15 (of 26 total)
  • in reply to: Statistic Database tool #122222
    Jim Vilbrandt
    Participant

      Hi Tim,

      I didn’t really do much with this, but as SMAT are SQLite Databases, you can query them.

      cd $HCIROOT/prod_oru_lab/exec/processes/ORU
      sqlite3 GlucoTab_ORU_ob.smatdb
      sqlite> .mode column
      sqlite> .header on
      sqlite> .table
      smat_info  smat_msgs
      select MessageContent from smat_msgs where MessageContent like '%94728999%GLU%';
      select replace(MessageContent,X'0D','\r') msg from smat_msgs where OrigSourceConn = 'POCcelerator_ORU' and MessageContent like '%94728999%GLU%';

      The fields in the SMAT_MSGS table are basically what you see in the SMAT Database tool.
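
      If you want to run the same query from a Tcl script instead of the sqlite3 shell, a minimal sketch along these lines should work, assuming the sqlite3 Tcl package can be loaded in your Cloverleaf Tcl (the file path and search pattern are just the ones from the example above):
      <pre>
      # Minimal sketch: read matching messages straight out of a SMAT database
      package require sqlite3

      sqlite3 smatdb $env(HCIROOT)/prod_oru_lab/exec/processes/ORU/GlucoTab_ORU_ob.smatdb

      smatdb eval {
          SELECT MessageContent FROM smat_msgs
          WHERE MessageContent LIKE '%94728999%GLU%'
      } row {
          # Turn the segment terminators (hex 0D) into newlines so the message is readable
          puts [string map [list \r \n] $row(MessageContent)]
      }

      smatdb close</pre>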

      Regards, Jim

       

      Jim Vilbrandt
      Participant

        Two alternative solutions to your original issue:

        • If you can access the Database of the system where the ADTs originate, create a DB-Lookup table to return the “best encounter” for the patient ID.
        • Create an SQLite Database that is updated by an ADT feed, then query that DB for the “best encounter” (see the sketch below).
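
        For the second option, here is a minimal sketch of the lookup side, assuming the sqlite3 Tcl package is loadable from Cloverleaf Tcl; the database path and the encounters/patient_id/encounter_id/event_ts names are only placeholders for whatever your ADT feed actually maintains:
        <pre>
        # Minimal sketch: pick the "best encounter" for a patient from a local SQLite cache.
        # Table and column names are hypothetical - adapt them to your own ADT cache.
        package require sqlite3

        proc getBestEncounter {patientId} {
            sqlite3 adtdb /hci/adt_cache.db
            # Here "best" simply means the most recent encounter for the patient
            set encounter [adtdb onecolumn {
                SELECT encounter_id FROM encounters
                WHERE patient_id = $patientId
                ORDER BY event_ts DESC LIMIT 1
            }]
            adtdb close
            return $encounter
        }</pre>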

        Regards, Jim

        in reply to: Using local db to capture and compare data #122183
        Jim Vilbrandt
        Participant

          Hi Timothy,

          We have a similar issue with several systems sending many MDM messages in a row for the same document ID. For these systems, I write the message to a directory, with the document ID as the file name. Newer messages with the same document ID overwrite the previous version. A second thread then checks the folder and only processes those files that have not been updated in the last five minutes.
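
          The five-minute check is straightforward in a directory-parse TPS. A rough sketch of the idea only (the quiet period and the file handling are assumptions, not our production code):
          <pre>
          # Minimal sketch: only keep files that have not been modified in the last five minutes
          set conndata [netconfig get connection data $HciConnName]
          set ibdir    [keylget conndata PROTOCOL.IBDIR]
          set now      [clock seconds]

          set keepList {}
          foreach fileName [msgget $mh] {
              # Skip files that were overwritten again within the quiet period
              if {$now - [file mtime $ibdir/$fileName] >= 300} {
                  lappend keepList $fileName
              }
          }

          msgset $mh $keepList
          lappend dispList "CONTINUE $mh"</pre>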

          Best Regards, Jim

          in reply to: Muliple messages from a TPS script into xlate #121971
          Jim Vilbrandt
          Participant

            Hi Jason,

            Two suggestions:

            • If it is possible, do a database query directly in Epic to retrieve the needed data when the documents are generated.
            • Write a semaphore (.sem) file containing the encounter details from the ADT events in the directory with your hold files. A file read thread would be configured to only trigger on .sem files. A parse directory TCL would read the contents of each .sem and search for the matching .hold file or files. The file list returned by the parse dir TCL to the engine would be the .hold files. Each file would be treated as a message (see the sketch below).
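
            A very rough sketch of what that parse directory TCL could look like; how the encounter details map to hold-file names is completely site-specific, so the matching rule below (hold files starting with the encounter ID) is only a placeholder:
            <pre>
            # Minimal sketch: read each .sem file and hand the matching .hold files to the engine
            set conndata [netconfig get connection data $HciConnName]
            set ibdir    [keylget conndata PROTOCOL.IBDIR]

            set holdFiles {}
            foreach semFile [msgget $mh] {
                # Read the encounter details written by the ADT side
                set fh [open $ibdir/$semFile r]
                set encounterId [string trim [read $fh]]
                close $fh

                # Placeholder rule: hold files start with the encounter ID
                foreach holdFile [glob -nocomplain -directory $ibdir -tails ${encounterId}*.hold] {
                    lappend holdFiles $holdFile
                }
                file delete $ibdir/$semFile
            }

            msgset $mh $holdFiles
            lappend dispList "CONTINUE $mh"</pre>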

            Best regards from Germany, Jim Vilbrandt

            in reply to: BASIC CLEANUP TASKS #121779
            Jim Vilbrandt
            Participant

              Hi Omar,

              I also wanted to set up Windows Tasks to perform daily tasks in Cloverleaf. After extensive searching, I found the section “crontab in Windows” in the Infor Cloverleaf Integration Services User Guide.

              It appears that you can only run BAT files (PS1 files did not work for me).

              The Windows Task action is “Start a program” with PowerShell as the program and the arguments “-Command Start-Process -Verb RunAs <path_to_your_batch.bat>”.

              Here is how a batch file for exporting the Site Documentation would look:

              @echo off
              call setroot
              call hcisitedoc -s site1
              call hcisitedoc -s site2

              I hope that helps!

              Best Regards from Germany, Jim Vilbrandt

              in reply to: Document Conversions #121769
              Jim Vilbrandt
              Participant

                Hi Jason,

                We utilize a third-party product installed on a Windows server that monitors a folder or folders for documents. These are then converted to the desired format and written to a secondary folder. The secondary folder is monitored by Cloverleaf.

                In our case, the originating system writes an HL7 message to the secondary folder and the original document to the primary folder with the same file name. Cloverleaf has an IB-parse TPS that only processes documents when both files are present in the secondary folder.
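
                The “both files present” check is easy to express in such a directory-parse TPS. A minimal sketch, assuming (purely as an example) that the HL7 file arrives as <name>.hl7 and the converted document as <name>.pdf:
                <pre>
                # Minimal sketch: only release HL7 files whose companion document has also arrived.
                # The .hl7/.pdf extensions are examples - use whatever your conversion tool produces.
                set conndata [netconfig get connection data $HciConnName]
                set ibdir    [keylget conndata PROTOCOL.IBDIR]

                set readyList {}
                foreach fileName [msgget $mh] {
                    if {[file extension $fileName] ne ".hl7"} { continue }
                    # Keep the HL7 file only if the matching document is already there
                    if {[file exists $ibdir/[file rootname $fileName].pdf]} {
                        lappend readyList $fileName
                    }
                }

                msgset $mh $readyList
                lappend dispList "CONTINUE $mh"</pre>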

                There are several products that can be used for the document conversion, but most of the software that you can download for free is too limited.

                Best Regards, Jim

                in reply to: Error replaying JSON message Cloverleaf 22 #121754
                Jim Vilbrandt
                Participant

                  I have this issue with 22 as well. Set the encoding to “bypass” and you can then resend the message. Not sure why.

                  in reply to: Help with writing UPOC using TCL #121621
                  Jim Vilbrandt
                  Participant

                    Hi Rin,

                    I would suggest using a directory parse TPS for this purpose. The “message” you receive in this script type is a list of files in the configured directory.

                    Below you will find some pseudo code for this purpose.

                    Best Regards, Jim
                    <pre>
                    # Get Input Path from NetConfig
                    set conndata [netconfig get connection data $HciConnName]
                    set ibdir [keylget conndata PROTOCOL.IBDIR]

                    # List of files found
                    set listing [msgget $mh]
                    set newlist ""

                    foreach entry $listing {
                        # Check contents of each file
                        set fileName [cconcat $ibdir "/" $entry]
                        set fh [open $fileName]
                        fconfigure $fh -translation binary
                        set msg [read $fh]
                        close $fh

                        # <add your logic here to determine which files should be processed>
                        if {<keep>} {
                            lappend newlist $entry
                        } else {
                            file delete [cconcat $ibdir "/" $entry]
                        }
                    }

                    # Pass new list to engine
                    msgset $mh $newlist
                    lappend dispList "CONTINUE $mh"</pre>
                     

                    in reply to: dblookup join #121570
                    Jim Vilbrandt
                    Participant

                      Hi Rick,

                      Here is a simple join for a MSSQL Database:

                      select d.name_doc, lower(d.feld9) as file_ext, m.name as medium from object65 d, medien m where d.feld32 = <docid> and d.flags in (1,2,16) and m.id = d.medium_doc

                      The SQL will differ depending on which database you are trying to access (i.e., Oracle, MSSQL, SQLite, PostgreSQL, etc.). I always develop the query in the native database browser first. If it works there, then it should work from Cloverleaf.

                      Best Regards, Jim

                      Jim Vilbrandt
                      Participant

                        We have a system that exports personnel events to a CSV file. This file is then processed through a Windows Task and PowerShell. The personnel data in the CSV is then augmented with information from Active Directory before it is written to a second CSV. This second CSV is then sent to various systems by Cloverleaf in the format they are expecting. This would be an easy work-around if you can’t get the TCL/AD connection to work.

                        Jim Vilbrandt
                        Participant

                          Hi Jim,

                          I am running inbound stored procedures with both Oracle and MSSQL. The process is very different for both. I am not passing parameters to the called procedure, so I am curious if someone has accomplished this.

                          Oracle:
                          CREATE OR REPLACE PROCEDURE cp_clv_test(out_var out sys_refcursor)
                          IS
                          BEGIN
                          open out_var for select <values> from <tablename> where <qualification>;
                          END;

                          Read Action: {call cp_clv_test(rowset OUT CURSOR)}

                          MSSQL:

                          CREATE PROCEDURE cp_clv_test
                          AS
                          BEGIN
                          SET NOCOUNT ON;
                          select <values> from <tablename> where <qualification>;
                          RETURN;
                          END;

                          Read Action: {call cp_clv_test()}

                          You can update the row sent in this way:

                          Read Success Action: UPDATE <tablename> SET <fieldname>=<value> WHERE <fieldname>=<<passedvalue>>

                          Note: passedvalue must be a field returned by the stored procedure and must be enclosed in <>.
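
                          A concrete (made-up) example of that pattern, using hypothetical names clv_outbox and msg_id for the table and the returned key column:

                          Read Success Action: UPDATE clv_outbox SET processed = 1 WHERE msg_id = <msg_id>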

                          I hope that helps!

                          Are you aware that CentOS reached “end of life” on 30.06.2024???

                          Best Regards from Germany, Jim Vilbrandt

                          Jim Vilbrandt
                          Participant

                            Here’s one more:
                            <pre>set inStr [lindex $xlateInVals 0]
                            if {[clength $inStr]} {
                                set obStr [fmtclock [clock scan $inStr -format %Y%m%d%H%M%S] "%m/%d/%Y %H:%M:%S"]
                                xpmstore $xlateId [lindex $xlateOutList 0] c $obStr
                            }</pre>
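
                            For example, an input value of 20240630123045 comes out as 06/30/2024 12:30:45.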

                            in reply to: Xlate: Issue getting hex 0d in field using COPY Action #121265
                            Jim Vilbrandt
                            Participant

                              Hi Jim,

                              In HRL definitions, you need to use '\xd'. You might try that.

                              Regards, Jim

                              in reply to: Log into the xlates #121186
                              Jim Vilbrandt
                              Participant

                                Hello,

                                not sure if I understand your question, but you can always add the following line to a Pre or Post Proc of most actions (Copy, Concat, Table, Call):

                                echo [lindex $xlateInVals 0]

                                = or =

                                echo “Value: [lindex $xlateInVals 0]”

                                You will see the results in “Browse/Watch Output” for the process where the sending process/route is located, or in the output window when debugging an Xlate.

                                When this is in the Pre Proc, this is the value of the first argument passed into the action. In the Post Proc, it is the first value returned by the action (i.e., the value returned from the table action).

                                Some actions do not have pre/post procs, but you can always add a dummy copy before or after to evaluate variables/fields.

                                Best Regards, Jim

                                in reply to: NetConfig extract “DEST” and “TRXID” #121121
                                Jim Vilbrandt
                                Participant

                                  Hello Joe,

                                  DATAXLATE is a single list containing 0-n routes, and each route contains lists of 0-n route details.

                                  Get the list of routes:

                                  set rteLst [lindex $dataxlate]

                                  Then loop through the list looking for the fields you want:

                                  foreach rte $rteLst {
                                    keylget rte TRXID trxid
                                    echo $trxid
                                  }

                                  Regards, Jim
