The best way to handle ever-changing tables


  • Creator
    Topic
  • #53742
    Carl Duff
    Participant

      Hello and thanks in advance for any assistance.

      Running CL version 5.5 in Windows environment.

      We have many tables that we use but most are fairly static with infrequent changes required. But we are getting into some integration that requires (possibly) daily updates to a couple of tables. This has been driving me nuts updating these tables several times daily.

      I have searched the forum and there look to be a couple of options: either using a couple of Tcl scripts to load the tables dynamically, or using the Tcl csv package to load a CSV file into memory.

      I have been playing around with the second option but have failed miserably.

      Does anyone have insight into what they have done in this situation and how your site has handled ever-changing data? Examples would also be greatly appreciated, since I can’t seem to get anywhere with it.

      TIA,

      -Carl

    Viewing 12 reply threads
    • Author
      Replies
      • #78800
        Jim Kosloskey
        Participant

          Carl,

          There is another option if you have ODBC or JDBC, and that is to query a user-maintained DB which contains the entries you need (we would invoke a Stored Procedure rather than embedding the SQL code in the engine – but your choice). We have yet to do this but I know others have.
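
          As a rough sketch of that option — assuming the tclodbc extension is available; the DSN, credentials, table, and column names below are placeholders, not anything from a real site:

```tcl
package require tclodbc

# Connect through an ODBC DSN (name and credentials are hypothetical).
database db MYDSN myuser mypassword

# Pull the whole lookup table; tclodbc returns rows as a list of lists.
set rows [db "SELECT code, value FROM lookup_table"]
foreach row $rows {
    set lookup([lindex $row 0]) [lindex $row 1]
}

db disconnect
```

          A Stored Procedure call would replace the SELECT; either way the result lands in a Tcl array the engine can use, and the refresh-the-cache caveat below still applies.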

          We infrequently allow dynamic tables but when forced to we have thus far used a generic, argument-driven Tcl proc and FTP to build a Cloverleaf Table from an extract file (delimited) provided by the system which is the system of truth for the data.

          You just have to remember to have a mechanism for refreshing the cache (either purge caches or stop/start the process).

          email: jim.kosloskey@jim-kosloskey.com 29+ years Cloverleaf, 59 years IT - old fart.

        • #78801
          Michael Hertel
          Participant

            We use the third option, which is that we don’t do dynamic lookup tables.

            We insist that the source or target system handle the translation tables.

          • #78802
            Carl Duff
            Participant

              Jim,

              We currently utilize several ODBC connections, and this may be the way to go for us. We certainly have more experience doing something like that. I’ll need to discuss with one of the DBAs to see how a non-‘techy’ end user could maintain the table.

              Michael,

              I like that option but unfortunately my management doesn’t. 😕

              The tables that we are looking at are growing daily, and I may have only one shot at doing this the ‘right’ way, so I appreciate the experts’ opinions.

            • #78803
              Charlie Bursell
              Participant

                A few years back this was the subject of my “Power of Tcl” presentation at the User Group – dealing with dynamic tables.

                I have implemented this for a few clients and it works very well.

                Most tables can be maintained externally in an Excel spreadsheet, usually by a clerk or admin type, which takes that load off the analyst.  The table is exported as a CSV file to a fixed file name and location any time it is updated.

                A Tcl proc checks the CSV file each time it runs and gets the last mod time.  If the last mod time has changed, the CSV file is read into a global or namespace array.  If there is no change, it continues using the copy in memory.

                It takes a *LOT* of the routine maintenance of the table out of the hands of the analyst.  ODBC, sqlite, and other methods, IMHO, still require a lot of work by the analyst.

              • #78804
                Gary Atkinson
                Participant

                  Has anyone tried this using a web service?

                • #78805
                  David Harrison
                  Participant

                    We use sqlite as a short term store for data that needs to be cached and also instead of tables which need to be frequently updated.
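
                    For anyone curious, a minimal sketch of that approach using Tcl’s bundled sqlite3 package — the table layout and the values here are made up purely for illustration:

```tcl
package require sqlite3

# An in-memory database for the sketch; a site would use a file path
# so the cache survives across processes.
sqlite3 db :memory:

db eval {CREATE TABLE IF NOT EXISTS lookup (key TEXT PRIMARY KEY, value TEXT)}

# INSERT OR REPLACE lets a refresh job rewrite entries in place.
db eval {INSERT OR REPLACE INTO lookup VALUES ('DR001', '00510')}

# Per-message lookup, with a default when the key is absent
# (onecolumn returns an empty string when no row matches).
proc lookup_or_default {key dflt} {
    set val [db onecolumn {SELECT value FROM lookup WHERE key = $key}]
    if {$val eq ""} { return $dflt }
    return $val
}
```

                    Because sqlite reads the file on every query, there is no in-memory cache to purge when the table changes — which is part of the appeal for frequently updated data.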

                  • #78806
                    Russ Ross
                    Participant

                      I recall doing this for one of our integrations (batch_056) and used the method mentioned by Charlie of having the owner of the table maintain it in Excel and extract it to a CSV file, which I used for generating the Cloverleaf table on an automated daily basis.

                      I agree with Charlie that overall it was an easy way to approach it as opposed to other methods; plus we already had a Tcl script that would do the heavy lifting if I just gave it some arguments.

                      Here is an example of the arguments and how I called the table_builder.tcl script.

                      Here is the calling setup (iftp_056_build_table.ksh) to give you a tangible example, to help solidify the concept and the input args to possibly consider.

                      Code:

                      #!/usr/bin/ksh

                      cd $HCISITEDIR/Tables

                      InputFileName=/ftp/$MDA_BATCH_ENV/iftp_056/work/doctor_to_BA.TXT.pre_processed
                      SeparatorCharacter=,
                      TableName=bill_area.tbl
                      TableDefaultValue=00510
                      LogFileDir=$HCISITEDIR/Tables

                      rm -f /ftp/$MDA_BATCH_ENV/oftp_056/work/*
                      rm -f $HCISITEDIR/Tables/*
                      touch $HCISITEDIR/Tables/$TableName

table_builder.tcl \
   -i     $InputFileName \
   -sep   $SeparatorCharacter \
   -outbound_proc '{oth_trim_chars {{CHARACTERS "\r"} {DIRECTION "right"} {DEBUG "n"}}}' \
   -table $TableName \
   -dflt  $TableDefaultValue \
   -l     $LogFileDir \
   -debug > /ftp/$MDA_BATCH_ENV/oftp_056/work/table_builder.log.processed

                      exit_status=$?

                      mv    $HCISITEDIR/Tables/$TableName   /ftp/$MDA_BATCH_ENV/oftp_056/work/${TableName}.processed
                      rm -f $HCISITEDIR/Tables/*

if [ "$exit_status" != "0" ]
then
   script_name="$0"
   email_address="${MDA_BATCH_ENV}_oftp_056_email"
   alert_log_file="$HCISITEDIR/Alerts/oftp_056.log"
   alert_message="`date` - $HCISITE alert\r\r$script_name FATAL ERROR, HUB on-call will need to investigate by doing:\r\rcd /ftp; findall | grep _056"
   send_alert_message.ksh   "$email_address"   "$alert_message"
   echo "$alert_message" >> $alert_log_file
   exit $exit_status
fi

                      rm -f $InputFileName

                      If memory serves me right, I think I have a preprocessing step somewhere that might remove the header line from the CSV file, so keep that in mind if using a CSV approach.

                      Russ Ross
                      RussRoss318@gmail.com

                    • #78807
                      Jim Kosloskey
                      Participant

                        That proc Russ is referring to has an argument to indicate how many beginning records in the file should be treated as headers (and thus ignored). So no need to pre-process.

                        There are two user points of control, one for the inbound side and one for the outbound side, when building the Table. This allows for adaptation of the data elements (for example, if the provided values need to have trailing characters removed – or some other manipulation).

                        That proc actually builds a Cloverleaf Table file rather than an in-memory array as in the process Charlie is referring to.

                        So the purge-caches step is still needed with this proc. There is an intent to provide the ability to specify the process or threads whose caches should be purged, if I ever get the time to add it, so the whole process would be more automated and self-contained.


                      • #78808
                        Parvathi Bhaskara
                        Participant

                          We have a need to use a proc like the one mentioned below, table_builder.tcl. Was this written by a client or was it from Cloverleaf? Is it available for everyone to use, and how do we get the code into our environment? Thanks in advance for your help!

                        • #78809
                          Jim Kosloskey
                          Participant

                            I wrote that proc and you can have it – just email me and I will reply to your email with the proc and user doc.

                            Just click on the email button on my signature.


                          • #78810
                            Parvathi Bhaskara
                            Participant

                              Thank you, and I appreciate your quick response! Just sent you an email.

                            • #78811
                              Charlie Bursell
                              Participant

                                This was the subject of one of my “Power of Tcl” presentations at the UG a few years back.

                                Basically, pretty simple.

                                Maintain the table in an Excel spreadsheet, and when it changes, kick a copy out to a known location and file name as a CSV file.

                                Then in the Tcl proc use the csv package to split the records into fields.  Read them into an array using the key as the array index.  Make sure the array is global or, preferably, a namespace variable.  Then stat the table file and save the last mod time in a global variable.  With each message, stat the file and compare the current mod time with the one saved.  If different, reload the table; else use the table as is.
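
                                The steps above can be sketched as a small namespace — a sketch only, where the proc names and the key,value file layout are assumptions, and the csv package comes from tcllib:

```tcl
package require csv   ;# tcllib CSV parser, handles quoting correctly

namespace eval lookup {
    variable table         ;# array: key -> value
    variable lastMtime 0   ;# mod time of the CSV at last load
}

# Reload the CSV into the namespace array only when its mod time changes.
# Note: file mtime has one-second resolution, so two updates within the
# same second could be missed.
proc lookup::refresh {csvFile} {
    variable table
    variable lastMtime
    set mtime [file mtime $csvFile]
    if {$mtime == $lastMtime} { return }   ;# unchanged: keep cached copy
    array unset table
    set fh [open $csvFile r]
    while {[gets $fh line] >= 0} {
        set fields [::csv::split $line]
        set table([lindex $fields 0]) [lindex $fields 1]
    }
    close $fh
    set lastMtime $mtime
}

# Look up a key, refreshing first and falling back to a default.
proc lookup::get {csvFile key {dflt ""}} {
    lookup::refresh $csvFile
    variable table
    if {[info exists table($key)]} { return $table($key) }
    return $dflt
}
```

                                In a tps or xlate proc this reduces to one call per message, e.g. `lookup::get $csvFile $doctor 00510` — no cache purge or process bounce needed, since the reload happens inline.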

                                I have implemented this in many places

                              • #78812
                                Parvathi Bhaskara
                                Participant

                                  Charlie and Jim,

                                  Thank you very much for your help with this! I was able to use table_builder.tcl successfully. It was only a one-time load and it was pretty easy.

                                  thanks,

                                  Parvathi

                              • The forum ‘Cloverleaf’ is closed to new topics and replies.