Log and Statistics Tables

DataMigrator stores log information and flow statistics in an internal data source. These tables may be accessed for read-only purposes through the Server. The tables are optimized for use by DataMigrator, and are subject to change by Information Builders. This topic briefly describes the format and function for each column of the tables.

The following tables are described by like-named synonyms.

ETLLOG Table

Flow run-time message log.

ETLSTATS Table

Run-time statistics for each flow.



ETLLOG Table

The ETLLOG table contains the DataMigrator run-time message log. The number of lines written to this table is managed by the Scheduler Configuration.

Column   | Format    | Function
DATE     | Alpha 20  | Date and time of log entry.
USER     | Alpha 32  | User who created the flow.
JOBNAME  | Alpha 23  | Unique internal job name.
APPNAME  | Alpha 64  | Flow application directory.
REQ_NAME | Alpha 64  | Flow name.
COUNTER  | Integer 9 | Line number of the message.
MESSAGE  | Alpha 80  | Message strings in the log.
MSGCODE  | Alpha 12  | Message code number.
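Because the tables are described by like-named synonyms, they can be read with an ordinary server request. The following is a minimal sketch of a read-only report against the ETLLOG synonym; the flow name 'flow1' is a hypothetical value.

```
TABLE FILE ETLLOG
PRINT DATE COUNTER MSGCODE MESSAGE
WHERE REQ_NAME EQ 'flow1'
END
```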



ETLSTATS Table

The ETLSTATS table contains the run-time statistics for each flow.

Column       | Format   | Function
S_REQ_ID     | Alpha 23 | Name of the flow that started this flow.
S_USERID     | Alpha 32 | Flow author.
S_APPNAME    | Alpha 64 | Flow application directory.
S_REQ_NAME   | Alpha 64 | Flow name.
STAT_SBM_T   | Alpha 20 | Submit date and time for a flow.
STAT_STRT_T  | Alpha 20 | Start date and time of execution.
STAT_END_T   | Alpha 20 | End date and time of execution.
STAT_RTRV_T  | Alpha 20 | Date and time of retrieval for a flow.
STAT_FORMAT  | Alpha 15 | Number of format rejections.
REQ_RC       | Alpha 16 | Internal return code for the flow.
TSCOMID      | Alpha 15 | TSCOMID for killing a job.
SESID        | Alpha 15 | SESID from EDAPRINT.
ORIG_REQ     | Alpha 32 | Original flow where the dependency started.
ORIG_USER    | Alpha 32 | Original user ID running the dependent flow.
DEP_CNT      | Integer  | Dependency number.
ETL__ERRTXT  | Alpha 80 | Text explanation of error messages.
FILLER       | Alpha 30 | Reserved for internal use. There are two columns with this field name.
USER_FILE_L  | Alpha 9  | Reserved for internal use.
RUN_BY_SCHED | Alpha 1  | Scheduled request.
CPU_TIME     | Alpha 15 | Total CPU time in milliseconds.
S_TARG_NAME  | Alpha 64 | Target name.

Note: This and the remaining columns apply on a per-target basis.

STAT_TRANS   | Alpha 15 | Number of rows processed.
STAT_ACCEPTS | Alpha 15 | Number of rows accepted.
STAT_NOMATCH | Alpha 15 | Number of rows rejected.
STAT_DBMSERR | Alpha 15 | Number of rows rejected by the database for a constraint violation, such as a Unique Index constraint, a Not Null constraint, or a user-specified constraint.
STAT_INVALID | Alpha 15 | Number of rows rejected for failing validation tests.
STAT_DUPLS   | Alpha 15 | Number of rejected duplicate records.
STAT_INPUT   | Alpha 15 | Number of rows input.
STAT_CHNGD   | Alpha 15 | Number of rows updated.
STAT_DELTD   | Alpha 15 | Number of rows deleted.
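Since the per-target counters are stored in the ETLSTATS table, flows with database rejections can be located with a simple read-only request. This is a minimal sketch against the ETLSTATS synonym, assuming the counter columns hold numeric strings as described above.

```
TABLE FILE ETLSTATS
PRINT S_REQ_NAME S_TARG_NAME STAT_STRT_T STAT_DBMSERR
WHERE STAT_DBMSERR NE '0'
END
```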

Note: You must be logged in as a Server Administrator with workspace configuration privileges to make any configuration changes or to follow any of the procedures below.

In Release 7.6, the etllog file is in FOCUS format, which is limited to 2 GB. In Release 7.7, the etllog file is in XFOCUS format and the limit for this format is 32 GB.



List of Essential ETL LOG Message Codes

The following essential lines are always written to the ETL log, even when the number of log lines is limited to a positive integer.

Message Code | Log Message | Example or Explanation
ICM18007 | CPU Time: %1 | CPU Time : 46
ICM18031 | Finished |
ICM18040 | Return Code = xxx | Return Code = 0
ICM18041 | -- stats for source file | Statistics about processing of a source file.
ICM18072 | Elapsed run time hh:mm:ss | Elapsed run time 0:00:02
ICM18122 | Request - %1 (Owner: %2) submitted. | A request saved by the owner is being submitted. The owner is saved in the flow, for example, as a control variable in the focexec: -*DM_USERID=user1
ICM18237 | Ending bulk load procedure | Shown in the log when the load type for a target is 'Bulk load utility via a disk file'.
ICM18372 | -- stats for target file: %1 | Statistics about processing of a target file.
ICM18514 | %1 : %2 accepted by target table | 18 : Row(s) accepted by target table
ICM18515 | %1 : %2 inserted into target table | 18 : Row(s) inserted into target table
ICM18516 | %1 : %2 processed by job | 18 : Row(s) processed by job
ICM18517 | %1 : %2 updated in target table | 0 : Row(s) updated in target table
ICM18518 | %1 : %2 deleted from target table | 0 : Row(s) deleted from target table
ICM18519 | %1 : %2 rejected due to format error | 0 : Row(s) rejected due to format error
ICM18520 | %1 : %2 rejected due to validation errors | 0 : Row(s) rejected due to validation errors
ICM18521 | %1 : %2 rejected due to no match | 0 : Row(s) rejected due to no match
ICM18522 | %1 : %2 rejected because duplicate exist | 0 : Row(s) rejected because duplicate exist
ICM18744 | Ending Load |
ICM18750 | bulkload: DBMS error code at execution | The attempt to start the bulk load failed. Check the configuration; the environment variables that point to the DBMS bulk load facilities may not have been configured.
ICM18751 | bulkload: Error found in the bulkload log file %1 | Review the log file for DBMS messages that indicate the specific error.
ICM18808 | %1 : %2 rejected due to DBMS error | 2 : Row(s) rejected due to DBMS error
ICM18974 | Start of Log Record for %1/%2 | ------ Start of Log Record for baseapp/flow1 ------
ICM18975 | End of Log Record for %1/%2 | ------ End of Log Record for baseapp/flow1 ------
ICM18995 | The configured maximum number of log lines (%1) has been reached. | Informs the user that further log lines are suppressed once the limit set by sched_log_lines is reached.



Procedure: How to Automatically Delete Rows From the ETL Log and Statistics Table

It may be useful to have the server automatically delete rows from the ETL log and statistics tables. For example, you can delete all rows more than a week old on a daily basis.

  1. Create a stored procedure with the following lines:
    -SET &DDATE=AYMD(&YYMD,-7,'I8');
    EX ETLDELLS DDATE=&DDATE, DTIME=00:00:00, SUSTART=Y, ETLSTART=Y
  2. Create a new process flow.
  3. Drag the procedure into the flow and connect it to the Start object.
  4. Drag a Schedule icon into the workspace.
  5. Double-click the Schedule icon and set a schedule for running the flow.
  6. Save the flow.
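The retention period in step 1 is controlled by the second argument to AYMD. For example, to keep 30 days of history instead of 7, the stored procedure would read:

```
-SET &DDATE=AYMD(&YYMD,-30,'I8');
EX ETLDELLS DDATE=&DDATE, DTIME=00:00:00, SUSTART=Y, ETLSTART=Y
```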

Note: The Clean utility rebuilds the ETLLOG. The FDS service has to be stopped to prevent possible file corruption during the rebuild process, so take this interruption into account when scheduling the flow.


Procedure: How to Automatically Backup and Recreate the ETL Log and Statistics Tables

The DataMigrator server can be configured to respond automatically when there is an error writing to the ETL log. The following procedure shows how to back up and recreate the ETL log and statistics tables.

  1. Create a new process flow and name it pf_backup_recreate.
  2. Add the following stored procedures to the process flow in this exact order: st_backup, st_rename, and st_recreate.

    The following is an explanation of the stored procedures mentioned.

    st_backup

    This procedure will create the etllogbk.foc and etlstabk.foc files in the \dm\catalog\ directory. These files are copies of the etllog.foc and etlstats.foc files. Example code for st_backup is:

    SHH CRTBACKUP
    END
    st_rename

    This step is optional. It prevents the backup files from being overwritten during the next backup process. The procedure renames the etllogbk.foc and etlstabk.foc files, appending a date and time stamp, for example, etllogbk_20101026_112025.foc. Example code for st_rename is:

    -SET &LOGBK = 'c:\ibi\srv77\dm\catalog\etllogbk.foc';
    -SET &STABK = 'c:\ibi\srv77\dm\catalog\etlstabk.foc';
    -SET &LOGBKREN = 'etllogbk' || '_' || &YYMD || '_' || EDIT(&TOD,'99$99$99') || '.foc';
    -SET &STABKREN = 'etlstabk' || '_' || &YYMD || '_' || EDIT(&TOD,'99$99$99') || '.foc';
    !REN &LOGBK &LOGBKREN
    !REN &STABK &STABKREN
    END
    st_recreate

    This procedure deletes the original etllog.foc and etlstats.foc files and replaces them with new, initialized files. Example code for st_recreate is:

    EX ETLCRTLS ALL, Y, N
    END

    The process flow should look like the following image.

    Process Flow

  3. Run the new process flow by clicking Run or Submit, or by scheduling the flow. You can also use Event Routing, as described in How to Send an Email Message When There Is an Error Writing to the ETL Log.

Procedure: How to Send an Email Message When There Is an Error Writing to the ETL Log

You can configure the server so that an email message is sent to an address you specify if there is an error writing to the ETL log. For example, even though a log table is automatically maintained so that normal operations prevent it from reaching its maximum size, it is still possible for a flow to generate so many errors that it fills up the log table. This issue can be addressed by having the server send you a message when the log table is full.

  1. Configure the server to connect to an SMTP mail server, as described in How to Configure the Server for Email Notification.
  2. There are two ways to customize event routing from the DMC:
    • Right-click the Workspace folder and click Event Routing.

      Click Customize New Event Routing.

      or

    • Expand the Workspace and then the Logs and Traces folder.

      Right-click Workspace Logs and click Customize Event Routing.

      Click Customize New Event Routing.

    The Customize Event Routing dialog box opens, as shown in the following image.

  3. Select the following options for each field:
    • Type - Error
    • Filter - 60003 ETLLOG write error
    • EDAPRINT - Yes
  4. In the E-mail field, enter the address where the email message will be sent.
  5. In the Procedure field, enter the name of the process flow, for example, baseapp/pf_backup_recreate.
  6. Click Add.
  7. Click Save and Restart.

Reference: Managing Log and Statistics

To manage the log and statistics from the DMC, expand the Logs and Traces folder. Then, right-click DataMigrator Log and Statistics. A context menu will open with your options, as shown in the following image.

Log and Statistics menu

The following options are available in the context menu:

View

Allows you to view the log and statistics tables.

Recreate

Creates empty new log and statistics tables.

Clean

Allows you to specify a date and time for deletion to occur in the Log and Statistics dialog box.

All records previous to the date and time entered are deleted, and the log and statistics files are rebuilt. This frees the space used by the deleted records so that new records can be added. It does not reduce the size of the log and statistics files on disk; however, the files do not grow again until the newly freed space has been used up.

Backup

Creates a backup copy of the log and statistics tables on disk, and a synonym that can be used to retrieve data from them. These tables are created in the catalog directory on the server (for example, c:\ibi\srv77\dm\catalog). A message appears in the console showing the location.


ETL LOG Message Codes

Message Code | Text
ICM18007 | CPU Time: %1
ICM18031 | Finished
ICM18040 | Return Code = xxx
ICM18072 | Elapsed run time hh:mm:ss
ICM18174 | Agent has been forced to terminate.
ICM18175 | Agent terminated by an operator.
ICM18762 | Job ID: %1
ICM18774 | Request %1/%2 created
ICM18972 | ------Start of Iterator Log Record for %1/%2------
ICM18973 | ------End of Iterator Log Record for %1/%2------
ICM18974 | ------Start of Log Record for %1/%2------
ICM18975 | ------End of Log Record for %1/%2------
ICM18994 | Iterated job parameters: %1
ICM18995 | The configured maximum number of log lines (%1) has been reached.


iWay Software