
JP1 Version 11 Job Management: Getting Started (Scripting Language)


4.2.2 Checking batch job execution results

Under the spool root directory specified in the environment file, a directory is created for each job, and job execution results are output to that directory. Job execution logs and the files output by programs in job steps are written to each job's directory.

The structure of the spool directory is shown below. Note that the description below only covers the directories and files used in this manual. For details about the entire structure of the spool directory, see Outputting job execution results to spool in the manual JP1/Advanced Shell.

spool-root-directory
|-lock-file
+-spool-job-directory
    +-JOBLOG#1
    +-JOBLOG_job-ID_sequence-number
    +-JOBLOG_number-giving-the-order-in-which-a-child-job-starts#2
    +-SCRIPT#1
    +-SCRIPT_number-giving-the-order-in-which-a-child-job-starts#2
    +-STDERR#1
    +-STDOUT#1
    +-step-number_step-name_STDOUT#1
    +-step-number_step-name_STDERR#1
#1

The contents of this file are also output to the job execution log. For details about what is output to the job execution log, see 4.2.1 Job execution log.

#2

This is a temporary file created during job execution. The following explains the contents of such temporary files.

File name: JOBLOG_number-giving-the-order-in-which-a-child-job-starts

Description: Job execution log for a child job for merging that is output when MERGE (merging the child job's spool job into the root job's spool job) is specified in the SPOOLJOB_CHILDJOB parameter in the environment file

File name: SCRIPT_number-giving-the-order-in-which-a-child-job-starts

Description: Script image file for a child job for merging that is output when MERGE (merging the child job's spool job into the root job's spool job) is specified in the SPOOLJOB_CHILDJOB parameter in the environment file

If a job is terminated immediately by SIGKILL in UNIX or by TerminateProcess in Windows, these files might remain in the spool job directory. When you delete spool directories, also delete these files.

The following subsections explain the files and directories that are not temporary files.

Organization of this subsection

(1) spool-root-directory

The directory name is specified in the SPOOL_DIR parameter in the environment file.
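For example, a minimal environment-file entry might look like the following sketch. It assumes the #-adsh_conf notation used for environment-file parameters, and the path is only an example, not a default value:

  # Example only: replace the path with your own spool root directory.
  #-adsh_conf SPOOL_DIR "/var/opt/jp1as/spool"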

(2) spool-job-directory

This directory is created for each job, with the job ID (a sequence number assigned to the job) as its name. When the job terminates, the directory is renamed to job-ID-job-name.

You can use the adshhk command to delete accumulated spool jobs. For details about the adshhk command, see 4.2.3 Deleting spool jobs.

When a job terminates, the spool job directory, which is named with the job ID, is renamed. If a directory with the same name as the new name already exists, renaming fails and the spool job directory keeps the job ID as its name. Because the job itself has completed and succeeding jobs can still be executed, the job returns 0 as the return code. While a directory named with the job ID remains, that job ID cannot be reused and the directory cannot be deleted by using the adshhk command.
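For example, for a hypothetical job whose job ID is 000112 and whose job name is JOB1, the spool job directory would be named as follows:

  During job execution:   000112
  After job termination:  000112-JOB1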

(3) JOBLOG

This is the job execution message file. Messages indicating the job's execution status, including command execution results and file allocation results, are output to this file.

(4) JOBLOG_job-ID_sequence-number

This is the job execution log for a child job.

This file is created only when the child job is started in the minimum output mode.

This file is not created when MERGE (merging the child job's spool job into the root job's spool job) is specified in the SPOOLJOB_CHILDJOB parameter.

(5) SCRIPT

This is the script image file. The contents of the first job definition script started and the contents of external job definition script files specified in the #-adsh_script command are output to this file. External job definition script files specified by other methods, such as the . (dot) command, are not output to this file. If you want the contents of job definition scripts to be output as logs, you must use the #-adsh_script command.
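The following sketch contrasts the two ways of reading an external script in a job definition script. It assumes that the external script path is given as the argument to the #-adsh_script command, and the file paths are examples only:

  # The contents of this external script are output to SCRIPT:
  #-adsh_script /home/user1/scripts/common.ash

  # The contents of this external script are not output to SCRIPT:
  . /home/user1/scripts/local.ash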

If MERGE is specified in the SPOOLJOB_CHILDJOB parameter when the root job runs in the expansion output mode and the child job runs in the minimum output mode, the child job's SCRIPT is not merged into the root job's SCRIPT. For details, see the description in Merging a child job's spool job into the root job's spool job in the manual JP1/Advanced Shell.
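As a reference sketch, merging of a child job's spool job is enabled with a line such as the following in the environment file, again assuming the #-adsh_conf notation:

  # Merge the child job's spool job into the root job's spool job.
  #-adsh_conf SPOOLJOB_CHILDJOB MERGE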

(6) STDERR

This is the standard error output file for the job. This file is not created when the root job is started in the minimum output mode.

The following header is output at the beginning of the file:

********   JOB SCOPE STDERR    ********

(7) STDOUT

This is the standard output file for the job. It is created when the -s option is specified in the adshexec command or when SPOOL is specified in the OUTPUT_STDOUT parameter in the environment file. This file is not created when the root job is started in the minimum output mode.

The following header is output at the beginning of the file:

********   JOB SCOPE STDOUT    ********
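As a sketch, either of the following enables output of the job's standard output to the spool. The script path is only an example, and the environment-file line assumes the #-adsh_conf notation:

  adshexec -s /home/user1/scripts/sample.ash

or, in the environment file:

  #-adsh_conf OUTPUT_STDOUT SPOOL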

(8) step-number_step-name_STDOUT

If job steps are defined, this is the standard output within the corresponding job step. If the job step name consists of more than eight bytes, only the first eight bytes of the job step name are used for step-name.

This file is created when the -s option is specified in the adshexec command or when SPOOL is specified in the OUTPUT_STDOUT parameter in the environment file. It is not created when the root job is started in the minimum output mode.

(9) step-number_step-name_STDERR

If job steps are defined, this is the standard error output within the corresponding job step. If the job step name consists of more than eight bytes, only the first eight bytes of the job step name are used for step-name.

This file is not created when the root job is started in the minimum output mode.
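As a sketch, a job step can be defined in a job definition script as follows. The attribute options of #-adsh_step_start are omitted here, and the step name S1 and the command are examples only; see the JP1/Advanced Shell manual for the full syntax:

  #-adsh_step_start S1
    # Commands executed in the job step; this echo is only an example.
    echo "backup step"
  #-adsh_step_end

With such a definition, the standard output and standard error output produced inside the step are written to files named step-number_S1_STDOUT and step-number_S1_STDERR in the spool job directory, and a step name longer than eight bytes would be truncated to its first eight bytes.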