This topic has 7 replies, 4 voices, and was last updated 17 years ago by Max Drown (Infor).
September 25, 2007 at 1:43 pm #49539 - Max Drown (Infor), Keymaster
Is anyone out there doing any sort of metric reporting on Cloverleaf for their organizations? For example, reports showing number of messages per hour, peak traffic times, uptime, number of errors, etc. -- Max Drown (Infor)
September 26, 2007 at 6:07 pm #62384 - Robert Milfajt, Participant
Yes, we have a job that digests a SMAT file, breaks it up into data points in a flat file (CSV), and transmits that file to SQL Server, where it is loaded via SSIS. I did not write the programs, but it seems pretty intuitive how it all works. Hope this helps,
Robert Milfajt
Northwestern Medicine
Chicago, IL
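For readers wondering what the reporting side of such a job might look like, here is a minimal Perl sketch, not part of Robert's job and with a hypothetical script name, that rolls a per-message CSV up into message counts per hour, the kind of figure Max asked about. The position of the timestamp field is an assumption based on the sample loadfile output posted further down in this thread.

Code:#!/usr/bin/perl -w
#
# msgs_per_hour.pl - illustrative sketch only (hypothetical name, not part of
# the job posted in this thread): count messages per hour from the metrics CSV.
#
# Assumes the 7th comma-separated field is a "YYYY-MM-DD HH:MM:SS" timestamp,
# as in the sample loadfile output shown later in the thread.
#
# usage: msgs_per_hour.pl < loadfile

use strict;

my %per_hour;
while (<>) {
    chomp;
    my @flds = split /,/;
    next unless defined $flds[6] && $flds[6] =~ /^(\d{4}-\d{2}-\d{2} \d{2}):/;
    $per_hour{$1}++;    # key is "YYYY-MM-DD HH"
}
for my $hour (sort keys %per_hour) {
    print "$hour:00  $per_hour{$hour}\n";
}

Sorting the output by count instead of by hour would surface peak traffic times the same way.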
September 26, 2007 at 6:20 pm #62385 - Max Drown (Infor), Keymaster
Any chance you could send me the script/program? -- Max Drown (Infor)
September 26, 2007 at 6:24 pm #62386 - John Hamilton, Participant
Or post it as an attachment.
September 26, 2007 at 7:07 pm #62387 - Robert Milfajt, Participant
Here they are. One is a ksh script that calls a Perl script. I see the ksh script could benefit from some indentation to make it more readable. The job runs daily from cron; a file holding the last run date is kept in $HCISITEDIR/data/metrics/julian_date, and the file to be sent to SQL Server is also stored in $HCISITEDIR/data/metrics. I did not write these, and the person who did is no longer here to comment on the code.
Hope this helps,
Robert Milfajt
Northwestern Medicine
Chicago, IL
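One practical detail the thread does not cover is how the julian_date state file gets seeded in the first place. Here is a hedged Perl sketch; the script name is hypothetical, the file location comes from Robert's description above, and seeding one day behind today is an assumption based on the comments in the ksh script posted later in the thread.

Code:#!/usr/bin/perl -w
#
# seed_julian_date.pl - illustrative sketch only (hypothetical, not part of the
# posted job): write an initial value into $HCISITEDIR/data/metrics/julian_date
# so the daily cron job has a day to start from.

use strict;
use POSIX qw(strftime);

die "HCISITEDIR is not set\n" unless $ENV{HCISITEDIR};
my $dir  = "$ENV{HCISITEDIR}/data/metrics";

# Yesterday's day-of-year (001-366); strip leading zeros so the value is a
# plain number like the one the ksh script increments with expr.
my $yday = strftime("%j", localtime(time - 24 * 60 * 60));
$yday =~ s/^0+//;

open(my $fh, '>', "$dir/julian_date") or die "cannot write $dir/julian_date: $!";
print $fh "$yday\n";
close($fh);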
September 27, 2007 at 10:59 pm #62388 - Russ Ross, Participant
One day we might resume doing some work with metrics, so I grabbed your post and quickly saw why you said it would be nice to add some indentation. Little did I expect the entire script to be one long line.
I did the indentation as best I could and added a couple of comments so I could test-drive it.
Here is the underlying Perl script that your K-shell wrapper calls. I ran it okay, so the indentation did not mess anything up:
cl_metric_scan.pl
Code:#!/usr/bin/perl -W
#
# cl_metric_scan.pl - script to scan Cloverleaf *.idx files to generate a
# file suitable for loading into a database table with SSIS.
#
# written: 2/9/2007 - Jim Marselle
# downloaded from clovertech post
# http://clovertech.infor.com/viewtopic.php?t=2234
#
# example of normal usage:
#
# cl_metric_scan.pl < 12023_global_super_adt.in.old.idx
#
# The input in the *.idx file looks like this:
#
# {
# {MID {{DOMAIN 0} {HUB 0} {NUM 433654899}}}
# {SRCMID {{DOMAIN {}} {HUB {}} {NUM {}}}}
# {TYPE DATA}
# {SOURCECONN ib_idx_adt}
# {ORIGSOURCECONN ib_idx_adt}
# {DESTCONN {{}}}
# {ORIGDESTCONN {{}}}
# {TIME 1171187991}
# {PRIORITY 5120}
# {DATAFMT {}}
# {SAVECONTEXT inbound}
# {OFFSET 0}
# {LENGTH 67}
# }
#
# The output looks like this:
#
# 433654899,ib_idx_adt,ib_idx_adt,,,2007-02-19 03:00:36,5120,,inbound,0,67
#
# NOTE - some of this data is redundant and/or NULL and may ultimately
# be removed.

while (<>) {
chomp;
if (/{MID/) {
@flds = split;
$mid = $flds[6];
$mid =~ s/}//g;
next;
}
if (/{SRCMID/) {
@flds = split;
$srcmid = $flds[6];
$srcmid =~ s/[{}]//g;
next;
}
if (/{TYPE/) {
next;
}
if (/{SOURCECONN/) {
@flds = split;
$sourceconn = $flds[1];
$sourceconn =~ s/}//g;
next;
}
if (/{ORIGSOURCECONN/) {
@flds = split;
$origsourceconn = $flds[1];
$origsourceconn =~ s/}//g;
next;
}
if (/{DESTCONN/) {
@flds = split;
$destconn = $flds[1];
$destconn =~ s/[{}]//g;
next;
}
if (/{ORIGDESTCONN/) {
@flds = split;
$origdestconn = $flds[1];
$origdestconn =~ s/[{}]//g;
next;
}
# Convert localtime to text.
if (/{TIME/) {
@flds = split;
$time = $flds[1];
$time =~ s/}//g;
($sec,$min,$hour,$mday,$mon,$year,$wday,$yday,$isdst) = localtime($time);
$mon = (++$mon < 10 ? "0" . $mon : $mon);
$mday = ($mday < 10 ? "0" . $mday : $mday);
$hour = ($hour < 10 ? "0" . $hour : $hour);
$min = ($min < 10 ? "0" . $min : $min);
$sec = ($sec < 10 ? "0" . $sec : $sec);
$year += 1900;
# Clear these to suppress Perl warnings.
$wday = $yday = $isdst = 0;
next;
}
if (/{PRIORITY/) {
@flds = split;
$priority = $flds[1];
$priority =~ s/}//g;
next;
}
if (/{DATAFMT/) {
@flds = split;
$datafmt = $flds[1];
$datafmt =~ s/[{}]//g;
next;
}
if (/{SAVECONTEXT/) {
@flds = split;
$savecontext = $flds[1];
$savecontext =~ s/}//g;
next;
}
if (/{OFFSET/) {
@flds = split;
$offset = $flds[1];
$offset =~ s/}//g;
next;
}
if (/{LENGTH/) {
@flds = split;
$length = $flds[1];
$length =~ s/}//g;
next;
}
if (/^}/) {
# Skip records with sourceconn starting with "ob" and null destconn.
if ($sourceconn =~ /^ob*/ && $destconn eq "") {
next;
}
print "$mid,$srcmid,$sourceconn,$origsourceconn,$destconn,$origdestconn,";
print "$year-$mon-$mday $hour:$min:$sec,";
print "$priority,$datafmt,$savecontext,$offset,$length\n";
}
}

This script generated output consistent with the commented example, and here is some of what I got:
Code:113991567,,12023_global_super_adt,12023_global_super_adt,,,2007-09-27 17:58:08,5120,,inbound,6776393,1947
113991573,,12023_global_super_adt,12023_global_super_adt,,,2007-09-27 17:58:33,5120,,inbound,6778340,2370
113991588,,12023_global_super_adt,12023_global_super_adt,,,2007-09-27 17:58:43,5120,,inbound,6780710,1762
113991605,,12023_global_super_adt,12023_global_super_adt,,,2007-09-27 17:58:53,5120,,inbound,6782472,2384
113991618,,12023_global_super_adt,12023_global_super_adt,,,2007-09-27 17:58:53,5120,,inbound,6784856,1947
113991634,,12023_global_super_adt,12023_global_super_adt,,,2007-09-27 17:58:53,5120,,inbound,6786803,2572
113991640,,12023_global_super_adt,12023_global_super_adt,,,2007-09-27 17:59:03,5120,,inbound,6789375,532
113991646,,12023_global_super_adt,12023_global_super_adt,,,2007-09-27 17:59:18,5120,,inbound,6789907,2719
113991652,,12023_global_super_adt,12023_global_super_adt,,,2007-09-27 17:59:28,5120,,inbound,6792626,2238
113991674,,12023_global_super_adt,12023_global_super_adt,,,2007-09-27 17:59:29,5120,,inbound,6794864,2922
113991680,,12023_global_super_adt,12023_global_super_adt,,,2007-09-27 17:59:29,5120,,inbound,6797786,1919
113991686,,12023_global_super_adt,12023_global_super_adt,,,2007-09-27 17:59:39,5120,,inbound,6799705,1766
113991692,,12023_global_super_adt,12023_global_super_adt,,,2007-09-27 17:59:39,5120,,inbound,6801471,2397
113991698,,12023_global_super_adt,12023_global_super_adt,,,2007-09-27 17:59:39,5120,,inbound,6803868,1407
113991704,,12023_global_super_adt,12023_global_super_adt,,,2007-09-27 17:59:39,5120,,inbound,6805275,1883
113991710,,12023_global_super_adt,12023_global_super_adt,,,2007-09-27 17:59:49,5120,,inbound,6807158,1515
113991719,,12023_global_super_adt,12023_global_super_adt,,,2007-09-27 17:59:49,5120,,inbound,6808673,1790
113991725,,12023_global_super_adt,12023_global_super_adt,,,2007-09-27 17:59:50,5120,,inbound,6810463,1833
113991731,,12023_global_super_adt,12023_global_super_adt,,,2007-09-27 17:59:50,5120,,inbound,6812296,1790
113991737,,12023_global_super_adt,12023_global_super_adt,,,2007-09-27 17:59:59,5120,,inbound,6814086,1515
113991746,,12023_global_super_adt,12023_global_super_adt,,,2007-09-27 18:00:14,5120,,inbound,6815601,1882
113991752,,12023_global_super_adt,12023_global_super_adt,,,2007-09-27 18:00:24,5120,,inbound,6817483,2572
113991762,,12023_global_super_adt,12023_global_super_adt,,,2007-09-27 18:00:24,5120,,inbound,6820055,1372
113991768,,12023_global_super_adt,12023_global_super_adt,,,2007-09-27 18:00:24,5120,,inbound,6821427,1845
113991774,,12023_global_super_adt,12023_global_super_adt,,,2007-09-27 18:00:40,5120,,inbound,6823272,1834
113991783,,12023_global_super_adt,12023_global_super_adt,,,2007-09-27 18:01:03,5120,,inbound,6825106,2737

It is not obvious to me how this data is intended to be arranged into useful metrics. If you care to provide more information on that point, thanks in advance.
Here is the indented version of the K-shell script you posted. I did not run it, so there is a higher likelihood my indentation could have messed something up.
I did not run it because I could see it is a wrapper around the underlying Perl script, is specific to your site, and would have to be modified for my site and its archive naming conventions.
cl_metric_scan.ksh
Code:#!/bin/ksh
#set -xv
# cl_metric_scan - this script scans Cloverleaf message log files to count the number
# of messages processed by each of the various interfaces. It is run daily
# from cron.
#
# written - 2/9/07 - Jim Marselle
# setroot sets HCISITEDIR, which is the root of the hci directory structure.
#
# downloaded from clovertech post
# http://clovertech.infor.com/viewtopic.php?t=2234

#setroot
DATADIR=$HCISITEDIR/data/metrics
LOADFILE=$DATADIR/loadfile

trap "rm -f /tmp/*.idx ; exit 1" INT SIGQUIT SIGHUP

basedir=$HCISITEDIR/exec/processes
cd $basedir

# Get the julian date for processing from a file. We run one day behind to make
# sure all of the files are present for a particular julian date. So, if today is
# julian date 60, we use julian date 58.
julian=$(cat $DATADIR/julian_date)

rm -f $LOADFILE

for dir in $(ls)
do
    # For the inbound and outbound directories.
    for dir2 in Inboundsave Outboundsave
    do
        # For some reason, cd echoes the directory, so we send the output to the bit bucket.
        cd $dir/$dir2 > /dev/null 2>&1

        # For each log file which was created today (send stderr to bit bucket in case no
        # files today).
        for file_cmprsd in $(ls *.${julian}.*.idx.Z 2>/dev/null)
        do
            file_expanded=${file_cmprsd%.Z}

            # Copy and uncompress the file.
            cp $file_cmprsd /tmp
            uncompress /tmp/$file_cmprsd

            # Don't use hard-coded paths.
            /home/hci/metrics/cl_metric_scan.pl < /tmp/$file_expanded >> $LOADFILE

            # Remove the expanded file.
            rm -f /tmp/$file_expanded
        done
        cd ../..
    done
    cd $basedir
done

# Save the julian date for the next run.
# If year modulo 4 == 0, it's a leap year.
year=$(date +%Y)
rem=$(expr $year % 4)
if [ $rem -eq 0 -a $julian -eq 366 -o $rem -ne 0 -a $julian -eq 365 ]
then
    julian=1
else
    julian=$(expr $julian + 1)
fi
echo $julian > $DATADIR/julian_date

exit 0
Russ Ross
RussRoss318@gmail.com
September 28, 2007 at 5:47 pm #62389 - Robert Milfajt, Participant
😳 Sorry about the formatting thing, but when I transferred it from Unix to my PC, I must have picked the incorrect transfer protocol. That is how the line feeds got stripped out. I previewed my response, but not my attachments.

The information about each message is dumped to a SQL database, where queries by source or destination can be run over date/time ranges. I believe there was some talk about dumping the actual content of the msg file to get other data, like MSH data, but that never happened before the programmer left. What we do have is a good idea of volumes by thread by time. The one piece I cannot get for you is the SQL SSIS job that runs to upload the file to the SQL database. If you want, I can get you the SQL file definitions and queries, but from what I recall it was pretty straightforward stuff.
Hope this helps,
Robert Milfajt
Northwestern Medicine
Chicago, IL
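As a rough stand-in for the SQL queries Robert describes (volumes by thread over time), the same rollup can be done straight from the loadfile. This is an illustrative Perl sketch only, not the SSIS/SQL piece from Robert's site; the script name is hypothetical and the field positions are assumptions based on the sample output Russ posted above (field 3 is sourceconn, field 5 is destconn, field 7 is the timestamp).

Code:#!/usr/bin/perl -w
#
# thread_volume.pl - illustrative sketch only (hypothetical, not the SSIS/SQL
# piece described above): tally message volumes by thread and by day straight
# from the loadfile CSV.
#
# usage: thread_volume.pl < loadfile

use strict;

my %volume;
while (<>) {
    chomp;
    my @flds = split /,/;
    next unless defined $flds[6] && $flds[6] =~ /^(\d{4}-\d{2}-\d{2})/;
    my $day    = $1;
    # Report against the destination thread when there is one, otherwise the source.
    my $thread = (defined $flds[4] && $flds[4] ne '') ? $flds[4] : $flds[2];
    $volume{$thread}{$day}++;
}
for my $thread (sort keys %volume) {
    for my $day (sort keys %{ $volume{$thread} }) {
        printf "%-30s %s %8d\n", $thread, $day, $volume{$thread}{$day};
    }
}

Capturing the hour along with the date in the regex gives the per-hour, per-thread view instead; the SQL equivalent is presumably just a GROUP BY over the corresponding columns.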
September 28, 2007 at 5:53 pm #62390 - Max Drown (Infor), Keymaster
How about a screen shot of the results? -- Max Drown (Infor)