To deal with this, we grab these segments out of the various inbound message threads and throw them into a SQLite DB. Then, when the outbound message hits Cloverleaf, we pull the segments from the DB and lvarpop them back into the message with a Tcl script. This has worked pretty well for some time, but it's getting more complicated as we need to capture this info from more and more inbound feeds and push it all to the same DB.
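To give an idea of the shape of this, here is a stripped-down sketch of what the inbound-side TPS proc looks like. The table name, DB file path, key field (PID-18), and the ZIN segment ID are just placeholders, not our actual schema:

```tcl
# Sketch of an inbound-side TPS proc: pull the segment we care about out of
# the message and stash it in SQLite, keyed on the account number (PID-18).
# Table name, file path, and the ZIN segment ID are placeholders.
proc stash_segments { args } {
    package require sqlite3

    keylget args MODE mode
    set dispList {}

    switch -exact -- $mode {
        start    { }
        shutdown { }
        run {
            keylget args MSGID mh
            set msg [msgget $mh]

            set key  ""
            set zseg ""
            foreach seg [split $msg \r] {
                set fields [split $seg |]
                switch -exact -- [lindex $fields 0] {
                    PID { set key  [lindex $fields 18] }
                    ZIN { set zseg $seg }
                }
            }

            if {$key ne "" && $zseg ne ""} {
                sqlite3 segdb /hci/data/segcache.db
                segdb timeout 10000   ;# wait for other writers instead of erroring out
                segdb eval {
                    INSERT OR REPLACE INTO captured_segs (acct_no, segment)
                    VALUES ($key, $zseg)
                }
                segdb close
            }
            lappend dispList "CONTINUE $mh"
        }
    }
    return $dispList
}
```

The outbound-side script is basically the reverse: look up the row by account number and splice the stored segment back into the result message before it leaves the engine.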
My question is: is it a problem to have several feeds writing to a single SQLite DB? Also, we are about to switch to pulling this data from the DB in real time as results go through the engine, instead of pulling it based on batch charges. Will I run into contention issues with several thousand writes to the DB across four threads and 2-400 reads per day from one thread?
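For what it's worth, the knobs I assume matter here are the busy timeout and the journal mode; a minimal sketch of what I mean, not what we actually run today:

```tcl
# Sketch only -- not our current config. Settings I assume are relevant when
# four writer threads and one reader share the same SQLite file.
package require sqlite3

sqlite3 segdb /hci/data/segcache.db

# Retry busy writes for up to 10 seconds instead of failing right away with
# SQLITE_BUSY when another thread holds the write lock.
segdb timeout 10000

# WAL journaling lets the reader keep going while a writer is active.
segdb eval {PRAGMA journal_mode=WAL}

segdb close
```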
I don’t even know where to look other than Clovertech for design guidance on things like this.