Sitworld: ITM Example Perl/SOAP Data Retrieval

The following was rescued from an IBM Technote after a web area was shut down.



How should ITM data be retrieved using Perl/SOAP?


SOAP – Best Practice and Model Perl Program

ITM data can be accessed via Perl/SOAP programs. In some recent cases, such programs severely stressed the hub TEMS with prolonged periods of high CPU utilization. This document presents efficient SOAP techniques and gives a model Perl/SOAP program which demonstrates how to minimize TEMS CPU resource usage.


The document assumes you are comfortable with Perl, SQL, and SOAP.

Each SQL requires a significant amount of CPU and storage resource usage at the
hub TEMS. An occasional SQL will not be noticed, but if SQLs are submitted by
the scores or hundreds, the TEMS CPU utilization can soar to 100% or more for
many minutes at a time. The rules to follow are:

1) Ask for data at moderate intervals.
2) Get multiple items of data with a single SQL.
3) Measure SOAP SQL processing and know the cost.
4) Get data from multiple targets at the same time.

The first three points are simple to understand: fewer SQLs require fewer resources.

This document and its example program focus on point (4): getting data from
multiple targets with one SQL. The problem case SOAP looked like this:
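The original request is not preserved here, but based on the description that follows it would have been a per-agent CT_Get of roughly this form (the userid, agent name, and attribute name are illustrative placeholders):

```xml
<CT_Get>
  <userid>sysadmin</userid>
  <password>*****</password>
  <object>Linux_IO_Ext</object>
  <target>Primary:box1:LZ</target>
  <attribute>Busy_Percent</attribute>
</CT_Get>
```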


The Perl program read a text file of target agent names and issued this SOAP request once per agent. A second SOAP request was used for Disk_Read_Percent. That meant 2 calls per agent, and with 50+ agents this required 100+ SQLs. The activity was repeated every 10 minutes.

The gathered information was correct, but it was very costly and along with other SOAP activity caused 200% or more CPU at the TEMS for ten minutes at a time.

All the information gathered above could be collected with a single SQL. There is more work required: you have to write SQL instead of relying on the SOAP server. However, the benefits are dramatic. The example Perl/SOAP program (in the zip file linked near the end of this page) has the full details. This document will explain the example. It will not teach you Perl, but you can learn from the example.

SQL to gather above information

Here is the end result the TEMS will process:

SELECT ORIGINNODE, IUTIL, OUTIL FROM KLZ.LNXIOEXT AT('rtems1', 'rtems2', 'htems') WHERE SYSTEM.PARMA("NODELIST", "*LINUX_SYSTEM", "13") AND SYSTEM.PARMA("TIMEOUT", "30", "2")

It is always easiest to develop and test the SQL first. See Appendix 1 for a good way to do the testing.

The above looks complex, but sometimes you have to work for a 100 to 1 improvement! Some of the complexity is pure boilerplate and is the same every time. And another part is dynamically calculated.

In the SQL –

"KLZ" is the application name (Linux OS Agent).
LNXIOEXT is the table name for the attribute group.
ORIGINNODE, IUTIL, and OUTIL are the column names associated with the desired attributes.

There are several ways of determining these values. One way is to create a situation using the needed attributes, in this case against the Linux IO Ext attribute group.


After clicking the Show formula button and then the Show detailed formula button, the table and column names are displayed.


The application name KLZ meaning Linux OS Agent is usually clear.

Another way is to review the TEPS object definition interface file, docklz in this case.

Windows: <installdir>\cnps\docklz
Linux/Unix: <installdir>/<arch>/cq/data/docklz

Near the top you see the APPL value, the application name.


Search for Linux IO Ext and see:


The *OBJECT is the attribute group name. The *CAPTION is what gets displayed at the Portal Client. The *TABLE is the table name needed in the SQL.

In the *ATTR entries below it, the column names can be found.



These supply all the raw data needed.

The numeric data you get back from the SQL will be a simple unscaled number. For IUTIL and OUTIL, the *SCALE value of 2 means there is an implied two-digit decimal: divide the returned value by 100. That will be important in the end use of the information.
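As a quick illustration of the implied scaling (the values here are invented):

```perl
# *SCALE of 2 means the raw value carries two implied decimal places.
my $raw_iutil = 9457;              # as returned in the IUTIL column
my $iutil = $raw_iutil / 100;      # apply the implied scaling
printf "IUTIL = %.2f%%\n", $iutil; # prints IUTIL = 94.57%
```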

The AT clause

Next in the SQL is the AT clause

AT ('rtems1', 'rtems2', 'htems')

This specifies which TEMSes the SQL should run on. Each TEMS will gather data from the agents connected to it at that moment. In the original version, the SOAP server determined which TEMS each agent was attached to and created the correct AT clause. Since we are getting all the data in a single SQL, it needs to be sent to all the TEMSes. It would be more efficient to send it only to the TEMSes expected to have the agents connected, but that complicates the logic.

See below for Perl code to calculate the AT clause automatically based on the current hub TEMS environment. Avoiding configuration data is always useful.

Where Clause Boilerplate


To get data from multiple targets, you must supply a Managed System List in the WHERE clause:

WHERE SYSTEM.PARMA("NODELIST", "*LINUX_SYSTEM", "13")

In this case the system-generated *LINUX_SYSTEM is used, a list of all currently connected Linux OS Agents. The number 13 is the number of characters in the Managed System List name. You can use a system-supplied name or a custom MSL.


A timeout parameter is also supplied:

AND SYSTEM.PARMA("TIMEOUT", "30", "2")

If not all agents respond, the above clause will force an early end and delivery of all available results. This can be very important in a large ITM environment. The above setting is a 30 second limit; a Portal Client workspace view uses a 600 second timeout by default. If you must know every single status, you will need a later process to determine which agents did not report.
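A sketch of such a later process, assuming @expected holds the MSL member names and %reported was built from the SOAP results (both names are inventions for this example):

```perl
# Determine which expected agents returned no data.
my @expected = ('Primary:box1:LZ', 'Primary:box2:LZ', 'Primary:box3:LZ');
my %reported = ('Primary:box1:LZ' => 1, 'Primary:box3:LZ' => 1);
my @missing  = grep { !$reported{$_} } @expected;
print "No data from: @missing\n";  # prints No data from: Primary:box2:LZ
```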

Added conditionals

Add any additional clauses to the end to limit the results, for example (a purely hypothetical threshold test):

AND IUTIL > 9000


Use Appendix 1 to test the SQL thoroughly before incorporating into the Perl program.

SOAP and Perl and SQL

The following SOAP form is used for presenting arbitrary SQL to SOAP. First the SQL is wrapped so that embedded < and > characters can be used:

$mySQL = "<![CDATA[" . $mySQL . "]]>";

Perl and SOAP programming model

There are many styles of performing SOAP in Perl. The one presented here is
certainly common, although I have not seen public examples. It uses the following CPAN modules:

LWP::UserAgent
HTTP::Request::Common
XML::Simple
Data::Dumper (used during development)


Here is much of the DoSoap routine with notes at the end. The goal is to do the SOAP and then return the data in a form convenient for later processing.

sub DoSoap {
my $mySQL = shift;
my $rt;
my @results = ();
my $k;
$faultstr = "";

1: $mySQL = "<![CDATA[" . $mySQL . "]]>"; # wrap SQL to allow embedded angle brackets
2: $Message = "<CT_Get><userid>$UserId</userid><password>$Password</password><table>O4SRV.UTCTIME</table><sql>$mySQL</sql></CT_Get>";
3: $Content = $XMLHeader . $Message;
4: $Response = $userAgent->request(POST $CMSSITE, Content_Type => 'text/xml', Content => $Content);
5: if (!$Response->is_success) {
$faultstr = $Response->status_line;
} else {
my $parser = new XML::Simple;
6: my $result = $parser->XMLin($Response->content);
7: $rt = ref($result->{'SOAP-ENV:Body'}->{'SOAP-CHK:Success'}->{TABLE}->{DATA}->{ROW});
if ($rt eq "ARRAY") {
8: @results = @{$result->{'SOAP-ENV:Body'}->{'SOAP-CHK:Success'}->{TABLE}->{DATA}->{ROW}};
} elsif ($rt eq "HASH") {
9: push(@results,$result->{'SOAP-ENV:Body'}->{'SOAP-CHK:Success'}->{TABLE}->{DATA}->{ROW});
} else {
10: $faultstr = "unexpected SOAP response"; # error case
}
}
11: return @results;
}
1) Wraps the SQL in <![CDATA[ and ]]> to allow embedded < and > characters
2) Composes the CT_Get SOAP message.
3) Prepends the XML header
4) Calls useragent request POST function
5) Detects error case

6) Parses results into an associative array

7) Determine if single line or multiple

8) Place the returned rows into the results array for the caller's use

9) Extract single line and put into return array

10) Error case

11) Return array of results to user

DoSoap Usage example – how the AT clause is calculated

Here is a usage example:

1:my $atclause = " AT (";
2:my $sSQL = "SELECT NODE, O4ONLINE FROM O4SRV.INODESTS WHERE PRODUCT='EM'";
3:@list = DoSoap($sSQL);
4:if ($run_status) { exit 1;}
5:foreach $r (@list) {
6: next if ($r->{O4ONLINE} ne 'Y');
7: $atclause .= "'" . $r->{NODE} . "',";
8:}
9:$atclause = substr($atclause,0,-1);
10:$atclause .= ")";

Line 1 The AT clause beginning is set.

Line 2 Read the TEMS rows from the node status table.

Line 3 DoSoap function called and the result is
returned in an array.

Line 4 $run_status counts errors; exit when it is not zero

Line 5 Each result row is processed in turn

Line 6 If TEMS is not online, skip

Line 7 Add the TEMS name to the AT clause

Line 8 Repeat for next row

Line 9 Remove the unneeded final comma character

Line 10 Add the trailing parenthesis

In the example code, you will also see individual columns retrieved from each returned row hash, for example $r->{ORIGINNODE} and $r->{IUTIL}. The numeric data is returned in that form. During development the Data::Dumper package was used to display the returned data structures.
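For example, dumping one returned row hash during development might look like this (the row contents are invented):

```perl
use Data::Dumper;
# One row as DoSoap returns it after XML::Simple parsing.
my $r = { ORIGINNODE => 'Primary:box1:LZ', IUTIL => '9457', OUTIL => '150' };
print Dumper($r);                  # display the structure while developing
```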

The code is minimal and each part is necessary to the result.

The balance of the example code collects the data and collates to calculate a summary. It isn’t very interesting by itself but it is needed for completeness.

Consideration when retrieving large amounts of data

When the results returned are too large for a single SOAP call, the request can be partitioned. First get a list of all the agent names, most likely from the INODESTS table. Sort them in alphabetic order. Then do a series of retrievals like this:

AND ORIGINNODE >= '$start' AND ORIGINNODE <= '$end'

where $start and $end take on a series of values which encompass all the possible agent names in the managed system list. If you determine that 500 agents can safely return data at once, then select the start/end values from the alphabetic list of nodes. In this way you will use N/500 SQLs for a dramatic reduction in the cost of each call.
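The chunking can be sketched like this; the chunk size of 500 and the run_one_sql helper (which would add the ORIGINNODE range clauses and call DoSoap) are assumptions for illustration:

```perl
# Issue one partitioned SQL per chunk of sorted agent names.
my @nodes = sort @all_agent_names; # gathered earlier from INODESTS
my $chunk = 500;                   # measured safe batch size
for (my $i = 0; $i <= $#nodes; $i += $chunk) {
    my $j = $i + $chunk - 1;
    $j = $#nodes if $j > $#nodes;
    run_one_sql($nodes[$i], $nodes[$j]); # hypothetical helper using $start/$end
}
```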

TEMS Impact considerations

SOAP retrievals have storage and CPU impacts on the TEMSes. If you plan to retrieve a large number of rows, then get the results in segments. The current ballpark estimate is a limit of 50,000 rows, but that number could be higher or lower depending on the platform and the server running the TEMS.

Another way to reduce the impact of a large retrieval is to get the data one remote TEMS at a time.

It is entirely possible to create a SOAP request that could cause a remote or hub TEMS to crash. Therefore in your planning, development and testing you must ensure that never happens.

SOAP limitations of retrieval

In large environments, you may not be able to retrieve all data all the time. Server or communication outages may make that impossible. If this is important to you, you will need to identify the non-reporting nodes and record their state in the data captured.

I have sometimes observed a failure to return all data. Typically the last row of XML tags is only partially present. For my purposes, missing a row of data was not harmful.

Fair warning: one IBM partner was using SOAP to retrieve data, essentially duplicating historical data collection but funneled entirely through the hub TEMS. This worked well with a small number of agents. In a large test their SOAP process prevented the hub TEMS from doing normal processing. They implemented the technique described here and achieved approximately a 2/3 reduction in hub TEMS processing. However, that was not enough to avoid damaging normal hub TEMS processing. In the end they resorted to using historical data from the Tivoli Data Warehouse to good effect. This also avoids the issue of agent levels which may have different attributes of interest and thus would require changed SQL columns.


This document explains the details of creating minimal-cost ITM SOAP access programs. You can use it as a model to guide your own development. If there are any errors, please contact the author.


Model programs are found in this zip file: ibm_linux_newmodel

#0.60000 – reset @results before use – corrected defect when a single row was returned

#0.70000 – handle https connections on Perl 5.16 levels

Appendix 1 – Running SQL against the TEMS

Warning: Running SQL against the hub TEMS can make the TEMS unusable, to the point that a reinstall may be necessary. Making small, well-defined changes when you know what you are doing is sometimes useful. If you have any doubts, please contact IBM Support.

The easiest way to run SQL is via the KfwSQLClient program. This program is part of the Portal Server installation and makes use of the TEPS connection to the TEMS. Therefore you do not need any extra knowledge about how to connect to the TEMS. It has the added benefit that the results are never truncated as can occur with other methods. You can put multiple SQL statements in the input file with semicolons at the end of each statement.
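For example, a doit.sql used while developing the AT clause calculation might contain (note the trailing semicolon):

```sql
SELECT NODE, O4ONLINE FROM O4SRV.INODESTS;
```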

The directions assume the default install directory. Make any needed changes if you are using a different install path.


Linux/Unix:

Logon to the server running the Portal Server and make /opt/IBM/ITM/bin the current directory. In general you should be logged on with the same userid the TEPS was started under.

1) create a file doit.sql containing the needed SQL statements in /opt/IBM/ITM/tmp

2) run the following command

./itmcmd execute cq "KfwSQLClient /f /opt/IBM/ITM/tmp/doit.sql" >/opt/IBM/ITM/tmp/doit.lst


Windows:

Logon to the server running the Portal Server and make c:\IBM\ITM\bin the current directory.

1) Make a new directory for the SQL input and output files if it doesn't exist

md \IBM\ITM\tmp

2) create a file doit.sql containing the needed SQL statements in \IBM\ITM\tmp

3) run the following command

KfwSQLClient /f \IBM\ITM\tmp\doit.sql >doit.lst

After running

Review the doit.lst for errors. In the case of INSERT/UPDATE/DELETE there will typically be no output on a success. You can run it without redirection to see the results immediately. On a SELECT there will be a count field and then the selected columns.




