Clovertech Forums › Read Only Archives › Cloverleaf › Cloverleaf › Creating a thread that will add a message to a SQLite DB
Thank you,
Ian
Here is an example of a call from an xlate:
package require sqlite3
proc xlt_phs2idx_InsertOrderNum {} {
    # ... (the rest of the proc is truncated in the archive)
}
-- Max Drown (Infor)
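Since the snippet above is cut off in the archive, here is a minimal, self-contained sketch of what such an xlate callout might look like. The database path, table name, and the use of xlateOutVals are illustrative assumptions, not the original code:

```tcl
package require sqlite3

proc xlt_phs2idx_InsertOrderNum {} {
    # Values bound to the xlate's OUT addresses
    upvar xlateOutVals xlateOutVals
    set ordnum [lindex $xlateOutVals 0]

    # Open, insert, close on every call; $ordnum is bound as a
    # query parameter, so quoting in the data is not a problem.
    sqlite3 db /hci/data/ex1.db
    db eval {INSERT INTO orders (order_num) VALUES ($ordnum)}
    db close
}
```

Opening and closing the connection per message is the simplest approach; it trades some overhead for not having to manage a persistent handle.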
Thank you so much for the examples. Here are the steps I followed to get this going.
1 – Made sure sqlite was installed by typing “sqlite” into the command prompt and getting the sqlite command prompt.
2 – Followed the SQLite “Getting Started” section to create my ex1 database with appropriate tables and fields (http://www.sqlite.org/sqlite.html).
3 – Found example on forum provided by Rob Abbott to create proof of concept (https://usspvlclovertch2.infor.com/viewtopic.php?t=4929&highlight=sql).
4 – Added .tcl as a “pre” proc and proved that I could add data to the SQLite database!
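The database-side steps above can be reproduced from a plain Tcl shell as a quick proof of concept (the table and column names here are made up for illustration):

```tcl
package require sqlite3

# Step 2: create the ex1 database with a simple table
sqlite3 db ex1.db
db eval {CREATE TABLE IF NOT EXISTS tbl1 (one TEXT, two INTEGER)}

# Step 4: prove that data can be inserted and read back
db eval {INSERT INTO tbl1 VALUES ('hello', 10)}
puts [db eval {SELECT * FROM tbl1}]   ;# prints: hello 10
db close
```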
Thank you all very much for your help.
Getting back to ‘creating a thread’… are you intending to use a new, separate thread to insert these values into a table?
I’d like to create a cross-reference table that has rows of msgid, MRN and Visit number, to be used to track all transactions for a patient visit. I’m not interested in looking at message data; I want to have visibility into the ADT, order and results messages going back and forth between the various systems, especially to track down problems of ‘missing messages’. I would use a process like the one above to store a row in such a table. I would store enough data to show ‘where’ a message was – the location would probably be a combination of thread name and destination name (I guess it depends on how a system is configured).
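A cross-reference table along those lines might look like the following sketch (column names and types are assumptions):

```tcl
package require sqlite3

sqlite3 db /hci/data/msg_xref.db
db eval {
    CREATE TABLE IF NOT EXISTS msg_xref (
        msgid     TEXT,   -- Cloverleaf message id
        mrn       TEXT,   -- medical record number
        visit_num TEXT,   -- patient visit / account number
        location  TEXT,   -- e.g. "threadname/destname"
        ts        TEXT DEFAULT CURRENT_TIMESTAMP
    )
}
db close
```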
I am new to Cloverleaf, so I’m wondering if the information I can get is useful enough for this effort.
But I’m also concerned about performance and reliability. In the above proc, if there were a way, an API, to send a separate ‘audit’ message to a new thread dedicated to processing these ‘audit’ messages (performing the insert), it would make the insert asynchronous from the business process and reduce the performance penalty. The audit messages would also be reliably stored in a queue before being picked up by the new thread’s proc. And possibly the audit thread could hold a long-running connection to the database and just execute an insert for each incoming message; that would reduce the database connection overhead. I see from previous postings that there are sometimes problems with too many database connections.
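The long-running-connection idea could be sketched as a pair of procs on the audit thread: open the handle once at startup, then reuse it for every message (proc, file, and table names here are hypothetical):

```tcl
package require sqlite3

# Run once when the audit thread starts: one persistent connection.
proc auditStart {} {
    sqlite3 auditdb /hci/data/audit.db
}

# Run once per incoming audit message: a single INSERT,
# with no per-message connect/disconnect overhead.
proc auditInsert {msgid mrn visit location} {
    auditdb eval {
        INSERT INTO msg_xref (msgid, mrn, visit_num, location)
        VALUES ($msgid, $mrn, $visit, $location)
    }
}
```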
Is there a way for a proc to send a message directly to another thread, from the application IB Proc to the audit IB proc, without having to go through the application Translation thread? (Maybe this would be an enhancement to the base product or to the Global Monitor product, to provide transaction tracking. It would be a TCL extension that would be called with a table name and keyed list of column names and column values. It would put an ‘audit message’ into a queue of an audit thread).
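Such an extension’s calling convention might use the TclX keyed lists Cloverleaf already ships. Everything below, including the auditSend name, is hypothetical:

```tcl
# Hypothetical API: auditSend <tableName> <keyedList>
# queues an audit record for a dedicated audit thread.
keylset rec msgid     $mid
keylset rec mrn       $mrn
keylset rec visit_num $visit
auditSend msg_xref $rec
```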
One more question – to make this audit capability more useful, I’d like to indicate whether a transaction/message was delivered successfully. Is there any way to capture this event and record it? According to some training documentation, a RecoveryDbState of 14 means that the outbound message was delivered, but I probably won’t ever see this status in the output of an hcidbdump command (by the way, what does a state of 16 mean?).
Thanks for reading.
Peter Heggie
I use a detail and another thread to route the message to a different site for database logging. That way, if the database blows up, you won’t affect your production threads. Messages will simply stack up until you fix the problem.
I learned that the hard way.
You might also think about storage allocation. If you can put your DB files on a different filesystem, it will prevent blowing up Cloverleaf if your DB runs out of space. Yup… did that too.
I have mine flowing into a MySQL database. Hardest part is getting out your crystal ball and predicting which fields of the message are “interesting” for your function. I did inbound first, and there are many fewer fields and indexes on my inbound than there are on outbound. Many times I’ve wished I had a particular insurance or odd PV1 field.
Make sure that your inflow rate will not overrun the rate at which you can parse the message and create the indexes. In my installation, I stuff in the whole message and do the parsing and indexing on a different machine.
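Stuffing in the whole message and deferring the parsing can be as simple as one insert of the raw text from a TPS proc (the table name is assumed; msgget is the standard Cloverleaf call to fetch a message handle’s content):

```tcl
# $mh is the message handle passed to the TPS proc by the engine
set raw [msgget $mh]
db eval {INSERT INTO raw_msgs (received, body) VALUES (CURRENT_TIMESTAMP, $raw)}
```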
Just some things to think about.
Thank you Terry
I will have to come up with a purging process before storage becomes a problem. And yes, I don’t yet know which fields will be valuable… I’m sure I’ll be updating that logic many times…
Pete
Peter Heggie