Perform Insert Process

The distinction between performing an Insert Process with insert only, update only, or both insert and update processing is important. The difference rests on how existing rows in the destination table are processed when the source contains rows with matching primary key values.

When the primary key value of the source row does not already exist in the destination table, there is no conflict and the row is inserted only if you select insert or both as the processing method. If you select update as the processing method, the source row is marked as failed and is not inserted. However, when the primary key value of the source row already exists in the destination table, the source row updates the existing row only if you select update or both as the processing method. If you select insert as the processing method, the source row is marked as failed and is not inserted.
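The outcome for each source row follows from two facts: the processing method and whether the key already exists at the destination. A minimal sketch of this decision (illustrative only, not Optim code):

```python
def insert_outcome(method, key_exists):
    """Outcome for one source row, given the processing method
    ('I'-Insert, 'U'-Update, 'B'-Both) and whether the row's primary
    key value already exists in the destination table."""
    if key_exists:
        # A matching destination row exists: only U or B may update it.
        return "update" if method in ("U", "B") else "fail"
    # No matching destination row: only I or B may insert it.
    return "insert" if method in ("I", "B") else "fail"
```

With method B, every source row is either inserted or updated; with I or U, rows on the "wrong" side of the key test are marked as failed.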

For example, you may want to avoid blindly updating existing rows with duplicate primary key values. You can specify insert processing to identify all rows that are not unique, then review the Extract File using the Browse Extract File option to determine which rows were duplicates and take action based on your findings. You can delete specific destination rows in the database to avoid these conflicts and retry the Insert Process. (For details, see the Common Elements Manual.)

Non-Unique Primary Keys

When the primary key for the source data is defined in the Optim Directory and is not unique, more than one row in the Extract File may exist with the same primary key value. If this data is then used to insert into a destination table defined with a unique primary key, only the first row with that primary key value in the Extract File is processed.

The additional rows are not processed and are identified as such in the Control File.
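This first-row-wins behavior can be sketched as follows (hypothetical row shapes; the Control File reporting is only simulated by the returned list):

```python
def first_row_per_key(extract_rows):
    """Process only the first Extract File row for each primary key
    value; later rows with a duplicate key are left unprocessed and
    would be identified in the Control File."""
    seen = set()
    processed, unprocessed = [], []
    for key, row in extract_rows:
        if key in seen:
            unprocessed.append((key, row))
        else:
            seen.add(key)
            processed.append((key, row))
    return processed, unprocessed
```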

When the primary key for the destination table is also defined in the Optim Directory and is not unique, more than one row in the destination table may exist with the same primary key value. For insert processing, the rows in the Extract File are inserted. If you select update or both as the processing method, the update applies to the matching destination rows.

Determine when to use insert, update, or both insert and update processing by the results you want to obtain. Use insert to insert only new rows, update to replace existing rows, or both to insert new rows and update existing rows. You can also request that the rows in the destination table are deleted prior to performing an insert. This enables you to recreate the original set of test data exactly.

Insert Parameters Panel

When you select Option 2 PERFORM from the INSERT Process menu, the following panel is displayed to prompt for the parameters needed to perform the Insert Process.

Figure 1. Specify INSERT Parameters and Execute
--------------------- Specify INSERT Parameters and Execute -------------------
Command ===> 

Names for Extract File and Control File:    
  Extract File DSN : SAMPLE.FOP.INSERT.DATA 
  Control File DSN : EXTRACT.CTRL           

Process Options:
  Default Options (Overrides are not currently set in the Table Map): 
    Processing Method to Use           ===> B   (I-Insert, U-Update, B-Both)
    For Tables Processed by Insert Only: 
      Delete All Rows in Target Tables ===> N   (Y-Yes, N-No)
        If YES, Commit Frequency       ===> T   (T-After Each Table, E-At End)

  Lock Tables During Process      ===> N        (Y-Yes, N-No)
  Age Date Values                 ===> N        (Y-Yes, N-No) 
  Commit Every Nth Row            ===>          (1-1000, Blank/SL)
  Limit Number of Discarded Rows  ===> 1        (1-4294967295, Blank/NL)
  Review Propagation Key Sets     ===>          (A-Always, E-Error)

Run Process in Batch or Online    ===> O        (B-Batch, O-Online)
  If Batch, Review or Save JCL    ===> R        (N-No, R-Review, S-Save)

Process Report Type               ===> D        (D-Detailed, S-Summary)

Panel

This panel includes:

Extract File DSN
Name of the Extract File that contains the source data as specified on the INSERT Process menu. This value cannot be modified on this panel.
Control File DSN
Name of a sequential file that is to be used to accumulate information and statistics about the Insert Process as specified on the INSERT Process menu. This value cannot be modified on this panel.
Processing Method to Use
Process Options allow you to specify default values for tables that do not have processing overrides set in the Table Map Editor. To set the default Processing Method to Use, specify:
I
Data rows are inserted during the Insert Process when the primary key value of the source row does not already exist in the destination table.
U
Data rows are updated during the Insert Process when the primary key value of the source row already exists in the destination table.
B
Data rows are both inserted and updated during the Insert Process.
Note:
  • A parenthetical note on the Default Options line indicates whether overrides have been set in the Table Map Editor.
  • When the PROP function is specified in one or more Column Maps used by the Insert Process, you must use insert processing. (See the Common Elements Manual for information on specifying a PROP function in a Column Map.)
Delete All Rows in Target Tables
For tables that use insert only processing, you can specify whether to delete all rows in the destination table prior to inserting data from the Extract File. By deleting all rows prior to insert, you obtain a set of data that exactly matches the data in the Extract File. Specify:
Y
Delete before Insert.
N
Do not delete before Insert.
Note:
  • If site management does not allow user specification, the Delete All Rows in Target Tables and Commit Frequency lines may be omitted from this panel.
  • Also, during processing, Move prompts you to confirm any cascade deletes caused by deleting rows in the destination.
If YES, Commit Frequency
If rows are to be deleted, indicate the frequency of commits during the delete processing:
T
Commit after deleting rows from each table.
E
Commit after deleting rows from all tables.
Lock Tables During Process
Specify whether the entire table is to be locked during an Insert Process. Locking the table ensures that other database activity does not interfere with the Insert Process. It will, however, prevent other users from accessing the table.

If you specify YES, a commit is performed only when Move has completed processing a table. This is because a commit causes the table lock to be relinquished.

Site management has the option to establish that tables are not to be locked during Insert. If so established, this option is set to NO and cannot be modified.

Age Date Values
Specify whether date values are to be aged as part of this process. Specify:
Y
Date values are to be aged. The Specify Aging Parameters panel is displayed. On this panel, specify aging values to be used. These values supplement the specifications for columns mapped with AGE functions and are used, if requested, to age DATE and TIMESTAMP columns that are not explicit targets of an AGE function.
N
Date values are not to be aged. The specifications for aging on the Column Maps included in the process are ignored.
See Age Date Values for additional information.
Commit Every Nth Row
Specify the frequency of commits. The commit points will affect the starting point in case of a RESTART. Frequent commits will keep page locks to a minimum. The shipped default value is 1000. Specify:
1-4,294,967,295
Absolute value to determine commit point.
blank
Site limit (S/L).
Note that this option has no effect if Lock Tables During Process is set to YES; in that case, the commit is performed when the processing for a table is completed.

The site-defined limit is displayed on the panel to indicate the maximum value you may specify at your site.

Limit Number of Discarded Rows
Specify a limit to the number of rows that can be discarded. If that limit is met, the process is terminated. You can use RESTART to begin the process at the termination point. Specify:
1-4,294,967,295
Terminate the process when the specified number of rows has been discarded.
blank
No limit (N/L).
To terminate the process if any rows are discarded, specify 1.
Review Propagation Key Sets
Specify whether the Propagating Key Set(s) panel is to be displayed before the Insert Process (with insert only processing) is performed. This option is only displayed when the PROP function has been specified in one or more Column Maps used by the Insert Process. Specify:
A
Always display the panel prior to performing the process.
E
Display the panel prior to performing the process only when the PROP specifications contain errors. This is the default.
Run Process in Batch or Online
Specify whether the Insert Process is to be run in batch or online. Specify:
B
Batch.
Note: If the extract file is stored on tape, you must use batch execution for the insert process.
O
Online.
If site management has established a maximum number of rows for online processing and the Insert Process exceeds that limit, this option is set to Batch and cannot be modified. Consult site management for guidelines.
If Batch, Review or Save JCL
Specify whether the JCL and control statements are reviewed prior to job submission. This is specified for batch execution only. Since the JCL and control statements are displayed in the ISPF editor, you can modify them for the current request and save them to submit later. Specify:
N
Submit job, do not display or save the JCL and control statements.
R
Display the JCL and control statements for review prior to job submission.
S
Save the JCL and control statements. Prompts are provided for you to specify the name of a file in which to store JCL and control statements.
Process Report Type
Specify the level of detail to include in the Insert Process Report. A detailed report includes information about Column Map usage. Specify:
D
Display detailed information in the Insert Process Report.
S
Display summarized information in the Insert Process Report.

Available Commands

The available primary commands include:
  • CANCEL
  • END
  • OPTIONS
  • SHOW INDEXES

SHOW INDEXES Command

One or more missing indexes may cause performance problems in an Insert Process. Use the SHOW INDEXES command to display the Index Analysis pop-up window listing the destination tables of the Insert Process with the status of the supporting indexes. You can use the Index Analysis pop-up window as a diagnostic tool for determining whether to create the missing indexes. If the status of the index is Partial or None, creation of the missing index may enhance processing performance.

Figure 2. Index Analysis
 +----------------------- Index Analysis ----------------------+
 ¦                                                1 of 4       ¦
 ¦ Table Name                  Index Name         Index Status ¦
 ¦ --------------------------  ------------------ ------------ ¦
 ¦ ************************** TOP **************************** ¦
 ¦ FOPDEMO.CUSTOMERS           XCUSTPK            DBPK         ¦
 ¦ FOPDEMO.DETAILS                                Partial      ¦
 ¦ FOPDEMO.ITEMS               XITEMPK            DBPK         ¦
 ¦ FOPDEMO.ORDERS              XORDRPK            Unique       ¦
 ¦ ************************ BOTTOM *************************** ¦
 +-------------------------------------------------------------+

This panel includes the following:

Table Name
The name of the destination table.
Index Name
The name of the index, if any.
Index Status
The status of the index for each destination table.
DBPK
Index exactly matches the database primary key definition for the table.
Unique
A unique index is defined for the table; however, no primary key is defined.
Partial
Index exists with only a partial set of the required columns.
None
No index exists for the table.

Age Date Values

The Move AGE function requires a separate Data Privacy license.

When you specify Y to the prompt Age Date Values, the Specify Aging Parameters panel is displayed. The Specify Aging Parameters panel prompts for the values used to age date values. Date values are identified in Column Maps using the AGE function. The values specified on the Specify Aging Parameters panel are used when explicit values are not defined with the function in the Column Map. All explicit values override any specifications on this panel. (For details about the AGE function, see the Common Elements Manual.)

You have the option of applying these values to DB2-defined DATE and TIMESTAMP columns also.

Figure 3. Specifying Aging Parameters
-------------------------- Specify Aging Parameters ---------------------------
Command ===> 

Aging Specification 
  Explicit Date              ===>             YYYY/MM/DD 

  Or INCREMENTAL, 
    Years                    ===>             (-2500  to +1581)  
    Months                   ===>             (-30000 to +30000) 
    Weeks                    ===>             (-30000 to +30000) 
    Days                     ===>             (-99999 to +99999) 
    Business Rules           ===>             (0 to 30000)       
  Or TARGET DATING, 
    Base Date                ===>             YYYY/MM/DD (Default:Today) 
    Target Date              ===>             YYYY/MM/DD 

Default Aging Rule Table     ===>             (Default: FOP2RUSA) 
Default Aging Rule           ===>             
Century Pivot Year           ===>             (00 - 99) 
Process Date Columns         ===>             (A-All, U-User Defined) 
Report Invalid Dates         ===>             (Y-Yes, N-No) 
Report Skipped Dates         ===>             (Y-Yes, N-No) 
Output Rows w/ Invalid Dates ===>             (Y-Yes, N-No) 
Output Rows w/ Skipped Dates ===>             (Y-Yes, N-No) 

Panel

This panel includes:

Explicit Date
Specify an explicit date for aging. The date must be in the form YYYY/MM/DD or YYYY/DDD (a Julian date). The aging rule is applied to this date.
Years
Adjust the date by a number of years.
+nnnn -nnnn
Increment or decrement the value in the column by a number of years specified as a one- to four-digit number in the range -2500 to +1581.

A plus or a minus sign preceding the value indicates that the date is to be incremented or decremented. Increment is the default.

Months
Adjust the date by a number of months.
+nnnnn -nnnnn
Increment or decrement the value in the column by a number of months specified as a one- to five-digit number in the range -30000 to +30000.

A plus or a minus sign preceding the value indicates whether the date is to be incremented or decremented. Increment is the default.

Weeks
Adjust the date by a number of weeks.
+nnnnn -nnnnn
Increment or decrement the value in the column by a number of weeks specified as a one- to five-digit number in the range -30000 to +30000.

A plus or a minus sign preceding the value indicates whether the date is to be incremented or decremented. Increment is the default.

Days
Adjust the date by a number of days.
+nnnnn -nnnnn
Increment or decrement the value in the column by a number of days specified as a one- to five-digit number in the range -99999 to +99999.

A plus or a minus sign preceding the value indicates whether the date is to be incremented or decremented. Increment is the default.

Business Rules
Specify the date adjustment by “business” units.
nnnnn
Adjust the value in the column by a number of occurrences of the specified business rule date. This number is specified as a one- to five-digit value in the range 0 to 30000. Incrementing and decrementing is controlled by the rule.
For example, if the Aging Rule is specified as NEXTPAYDAY, the date is adjusted by the specified number of paydays. Therefore, a 4 in Business Rules adjusts the date to the fourth payday after the date value in the column.

You can specify either calendar units (years, months, weeks, and days) or business units, not both.

TARGET DATING
Base Date
Specify an explicit date as the origination or starting date for calculating the aging amount. The date must be in the form YYYY/MM/DD or YYYY/DDD (a Julian date). If you leave Base Date blank, the current date is assumed.
Target Date
Specify an explicit date as the target for calculating the aging amount. The date must be in the form YYYY/MM/DD or YYYY/DDD (a Julian date). The aging amount is determined by the difference between the Base Date and the Target Date.
You must specify a value for only one aging method: EXPLICIT, INCREMENTAL, or TARGET DATING. For INCREMENTAL aging, the combined values of Years, Months, Weeks, and Days cannot result in a year value greater than 3999. If the value exceeds 3999, an error occurs during processing.
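For illustration, the TARGET DATING amount is simply the difference between the two dates. The sketch below computes it in days (treating the amount as a day count is an assumption made for the example):

```python
from datetime import date

def target_dating_amount(target, base=None):
    """Aging amount as the difference between Base Date and Target Date.
    If Base Date is blank (None here), the current date is assumed.
    Computing the amount in days is an assumption for illustration."""
    base = base or date.today()
    return (target - base).days
```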
Other Parameters
Default Aging Rule Table
The name of the aging rule table to be used. If blank, the site default aging rule table is used.
Default Aging Rule
The default aging rule used for any date column not explicitly assigned an aging rule. This must be a value in the aging rule table or blank.

To display a selection list of rules in the current table, specify an asterisk as the first or only character. Use S to select a rule. (For details about aging rule tables, see the Customization Guide.)

If Default Aging Rule is blank, an aging rule is not applied to any aged data that is not explicitly assigned a rule.

Century Pivot Year
The year used to determine the century to assign to two-digit values. For example, if the Century Pivot Year is 65, all two-digit years that are 65 or greater are assumed to be in the 20th century (19xx); all two-digit years that are less than 65 are assumed to be in the 21st century (20xx). This information is necessary to properly age the data. Specify a two-digit value from 00 to 99. If a value is specified for Pivot Year for an individual column, the Century Pivot Year value is ignored for that column.
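The pivot rule described above can be sketched as a small function (illustrative only; the pivot value 65 matches the example in the text):

```python
def apply_century(two_digit_year, pivot=65):
    """Assign a century to a two-digit year using the Century Pivot Year.
    Years >= pivot fall in the 20th century (19xx); years below the
    pivot fall in the 21st century (20xx)."""
    if two_digit_year >= pivot:
        return 1900 + two_digit_year
    return 2000 + two_digit_year
```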
Process Date Columns
The type of date columns to be aged. Specify:
A
All DATE and TIMESTAMP columns and the columns mapped to AGE are aged.
U
Only columns mapped to AGE are aged.

If a Column Map is not defined or AGE is not specified on the selected Column Map, these date values apply only to DATE and TIMESTAMP columns.

Report Invalid Dates
List details of invalid dates encountered during the process in the process report.
Report Skipped Dates
List details of skipped dates encountered during the process in the process report. (Dates are skipped when the value is not a valid date but has special meaning to the application. For example, “000000” or “999999” are not valid dates but may be special indicators for the application. Other examples of skipped dates are those containing only spaces, hex zeroes, or hex “FF”.) For a comprehensive list of values handled as skipped dates, see Skipped Columns.
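The skipped-date examples in the text can be sketched as a check. The list below covers only those examples and is not the comprehensive set documented under Skipped Columns:

```python
def looks_skipped(raw):
    """Illustrative test for skipped-date values: not valid dates, but
    treated as application indicators rather than errors. Only the
    examples given in the text are covered here."""
    return (raw in (b"000000", b"999999")
            or raw.strip(b" ") == b""        # only spaces
            or set(raw) == {0x00}            # hex zeroes
            or set(raw) == {0xFF})           # hex 'FF'
```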
Output Rows with Invalid Dates
Specifies whether rows with invalid dates are written to the destination database.
Output Rows with Skipped Dates
Specifies whether the rows with skipped dates are written to the database. For a comprehensive list of values handled as skipped dates, see Skipped Columns.

When you have completed the aging parameter specifications, use END to proceed with the Insert Process.

View PROP Specifications

The Propagating Key Set(s) panel is displayed prior to the execution of the Insert Process, according to your specification for the Review Propagation Key Sets prompt. When you specify A, this panel is always displayed prior to performing the process. When you specify E, this panel is displayed only when errors are encountered in the PROP function.

The Propagating Key Set(s) panel groups the tables affected by a single PROP function together and identifies the table for which the PROP function is specified. The specifications for determining the value to propagate are also displayed. (For details on specifying the PROP function on the Column Map see the Common Elements Manual.)

The following figure demonstrates:

  • The literal “JONES” defined in the ORDERS table is to be propagated to the SALES and CUSTOMERS tables.
  • The value for the CUST_ID column in the CUSTOMERS table is to be assigned sequence numbers starting with 1 and incremented by 1. This value is then propagated to the ORDERS table.
  • The value in the ORDER_ID column of the ORDERS table is to be propagated to the DETAILS table.

This ensures that Move does not insert child rows inappropriately. Note that if the propagated primary key value duplicates an existing value, that row is discarded. Move then discards the rows from related tables whose foreign key columns contain the propagated value. Therefore, if changing the CUST_ID in CUSTOMERS causes a duplicate row, the related ORDERS rows are discarded.

However, to ensure that the related DETAILS rows are also discarded, specify propagate for the ORDER_ID column used to relate ORDERS and DETAILS.
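A sketch of this cascading discard, using the CUSTOMERS/ORDERS/DETAILS example (hypothetical row shapes; not Optim code):

```python
def propagate_discards(discarded_cust_ids, orders, details):
    """When a CUSTOMERS row is discarded for a duplicate propagated key,
    related ORDERS rows are discarded; propagating ORDER_ID in turn
    discards the related DETAILS rows."""
    kept_orders = [o for o in orders
                   if o["CUST_ID"] not in discarded_cust_ids]
    kept_ids = {o["ORDER_ID"] for o in kept_orders}
    kept_details = [d for d in details if d["ORDER_ID"] in kept_ids]
    return kept_orders, kept_details
```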

Figure 4. Propagating Key Set(s)
--------------------------- Propagating Key Set(s) ----------------------------
 Command ===>                                                 Scroll ===> PAGE 
                                                              ROW 0    OF 11  
******************************** Top of Data **********************************
Press PF12 to cancel.  Press END to continue.                     
                                                                               
 Set No.          Table Name            Column         Column Map Specification
 ------   -------------------------- ----------------- ------------------------
                                                                              
 Set:1    FOPDEMO.SALES               SALESMAN_ID                             
          FOPDEMO.CUSTOMERS           SALESMAN_ID                             
          FOPDEMO.ORDERS             *ORDER_SALESMAN     'JONES'              
                                                                              
 Set:2    FOPDEMO.CUSTOMERS          *CUST_ID            SEQ(1,1)   
          FOPDEMO.ORDERS              CUST_ID                                 

 Set:3    FOPDEMO.ORDERS             *ORDER_ID           ORDER_ID
          FOPDEMO.DETAILS             ORDER_ID
******************************* Bottom of Data ********************************

Panel

This panel includes:

Set No.
Number assigned to the set of tables included in the propagation specification.
Table Name
Names of the tables affected by the propagation. These tables are listed from parent to child.

In the figure, in Set 1, propagate is specified on a child table, ORDERS. In Set 2 and Set 3, it is specified on the parent table: CUSTOMERS for Set 2 and ORDERS for Set 3.

Column
Name of the column that is the target of the propagation. The column for which the PROP function is specified is identified by an asterisk (*).
Column Map Specification
The value specified in the Column Map to be propagated.

You can scroll the display, as necessary.

Use END to perform the process, or press PF12 to cancel the process request and return to the INSERT Process menu. (PF12 has been assigned this special use on the Propagating Key Set(s) panel only. Move automatically restores your ISPF values when the Propagating Key Set(s) panel is exited.)

Discarded Rows

When Move is inserting data, rows may be discarded in two ways:

Immediate discards -

During the Insert Process, a row is immediately discarded if a condition exists that cannot be rectified by Move. For example, the primary key value for the row already exists in the destination table.

Pending discards -

A pending discard occurs when the row cannot be inserted at the present time, but the condition preventing the insert may not exist later in the Insert Process. These rows fail the Insert but are held in a pending status while processing continues. As the Insert request proceeds, Move will attempt to insert these pending rows one or more additional times.

This occurs only when there are referential integrity cycles. An RI rule may prevent a row from being added because it references another row that is not present in a related table.

For example, you may not be able to add an ORDERS row that contains a customer ID for a CUSTOMERS row that does not exist in the CUSTOMERS table. Later in the processing, the related CUSTOMERS row is added. When Move re-tries the insert on the pending ORDERS row, the insert completes successfully.

Whether pending discards are ultimately discarded cannot be determined until the end of the Insert Process. Therefore, there can be many more discarded rows than the discard limit. For example, assume the discard limit is 10. There are 5 immediate discards, but during the Insert there are 100 pending discards. If all pending discards are eventually inserted, the number of discards, 5, is well within the limit. If, however, only 50 of the pending discards are inserted, the number of discards is 55, well over the limit, but this cannot be determined until the process terminates.
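The retry behavior for pending discards can be sketched as a simple loop that re-attempts rows while progress is being made (illustrative; try_insert is a stand-in for the database insert and its RI check):

```python
def insert_with_pending(rows, try_insert):
    """Re-attempt pending rows as long as some pass makes progress;
    rows still pending when no pass makes progress become discards."""
    pending = list(rows)
    inserted = []
    progress = True
    while pending and progress:
        progress = False
        still_pending = []
        for row in pending:
            if try_insert(row):
                inserted.append(row)
                progress = True
            else:
                still_pending.append(row)
        pending = still_pending
    return inserted, pending  # leftover pending rows are discarded
```

In the ORDERS/CUSTOMERS example, the ORDERS row fails on the first pass, its parent CUSTOMERS row is inserted, and the retry then succeeds.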

Commits during Cycles

Each time Move completes the Insert Process for a table, it issues a COMMIT statement. This occurs even when Move is processing a cycle and will return to a table later in the Insert Process. That means table locks are held only during the time Move is processing a table. When processing switches to another table, the COMMIT causes the table lock to be relinquished. If that table is processed again, the lock is re-established.

Prompt to Create Table

Before starting the Insert Process, Move checks for any UNKNOWN destination tables. If any destination tables are UNKNOWN, Move displays the CREATE Object List panel with a message indicating that the unknown tables must be created. All object definitions in the Extract File are included, and those that do not exist at the destination are identified. For example, assume three tables and their primary keys, relationships, and indexes are included in the Extract File. Also assume that the ORDERS table is UNKNOWN at the destination and the other two tables exist.

Note:
  • For a VSAM data set, the Create Process can create the destination Legacy Table, VSAM data set, and any Optim objects associated with the Legacy Table.
  • For an IMS™ data set, the Create Process can only create the destination Legacy Table and any Optim objects associated with the Legacy Table (e.g., Relationships, Primary Keys, Foreign Keys). The Create Process cannot create a destination DBD or IMS data set.

The following panel is displayed:

Figure 5. CREATE Object List with Unknown Destination Tables
---------------------------- CREATE Object List ---------CREATE MISSING TABLES 
Command ===>                                                  Scroll ===> PAGE 

Primary : CREATE ALL, DROP ALL, DROP EXISTS, DROP CONFLICTS, DROP CHANGED  
          DEFAULTS, SHOW                                            1 of 10
   Line : S, U, I, CR(A), DR(A), DB2, OPT, SQL 

Cmd  Status     Type              Object Name           Database Tablespace 
--- --------  ---------  ---------------------------->> -------- ---------- 
*** ******************************** TOP ********************************** 
___  SELECT   TABLE      SOFTECH.ORDERS                 DSOFTECH  SSOFTCH2 
___  SELECT     INDEX    FOPDEMO.XORDERPK 
___  SELECT     PK(DB2)      
___  SELECT     FK(DB2)  RCO 
___  EXISTS   TABLE      SOFTECH.CUSTOMERS              DSOFTECH  SSOFTCH1 
___  EXISTS     INDEX    FOPDEMO.XCUSTPK 
___  EXISTS     PK(DB2) 
___  EXISTS   TABLE      SOFTECH.DETAILS                DSOFTECH  SSOFTCH1 
___  EXISTS     INDEX    FOPDEMO.XORDETPK 
___  EXISTS     PK(DB2)
*** ****************************** BOTTOM ********************************* 

Review SQL Before Create ===> Y    (Y-YES, N-NO)

The UNKNOWN table and its subordinate objects are identified by the SELECT status. (For this example, it is the first table listed in the figure.) The tables and other objects that exist at the destination are identified by the EXISTS status.

To create any table in SELECT status, a Database Name is required; a Table Space Name is optional. If you do not explicitly specify values, Move will use default values if they have been established. If a default is not specified for the database name and you do not supply a value, Move will prompt for a database name. If a default is not specified for the table space name and you do not supply a value, the table space name is automatically generated based on the table name.

Establish Defaults

You can establish default values for the database name and the table space name using Option 5 Create Process on the Data Migration menu or the DEFAULTS command. Either displays a menu from which you select the defaults you want to establish. See Create Process for more information on specifying defaults.

Select Objects

You can select and unselect individual objects using the Select line command, S, and the Unselect line command, U. Any unselected object definitions are assigned the status UNSEL and are not included when the SQL to create the objects is generated by Move.

You can scroll the list using the UP, DOWN, TOP, and BOTTOM commands.

Create Objects

You can use the CREATE ALL command to direct Move to generate and execute the SQL DDL statements necessary to create the table and other objects that have the SELECT status. Alternatively, you can use the CR line command to create selected objects individually or the CRA line command to create a table and the objects related to it.

Review SQL

You can display the generated SQL statements prior to execution by specifying Yes to the prompt Review SQL. (This prompt is provided after the last object on the list.) The SQL statements are displayed in the ISPF editor and may be saved or edited as desired. The SQL is executed when you use END. Use CANCEL to abandon executing the SQL.

Details about Creating Objects

For more information on the CREATE Object List panel, see Perform Create Process. This panel is also displayed when you explicitly request the PERFORM option of the Create Process and the details of this panel are discussed in that section.

Inserting Data

If the missing tables are successfully created, the Insert Process proceeds to load the data after the SQL is executed. (If objects are not created, the Insert Process cannot proceed and a message is displayed. If you do not want to create the missing tables, you can remove their names from Destination Table Name on the INSERT Process Table Map panel.)

Online Executions

If the Insert Process is executed online, a panel is displayed noting the progress of the process.

Figure 6. Insert Process Status
-------------------- Specify INSERT Parameters and Execute --------------------
Command ===> 

Names for Extract File and Control File:

  +----------------------UPDATE/INSERT Process Status-------------------------+
  |                                                                           |
  |          UPDATE/INSERT Process in Progress                                |
  |                                                                           |
  |        Number of Rows Processed: 2053 of 10340                            |
  |                                                                           |
  |   Completed Table: FOPDEMO.CUSTOMERS                                      |
  |     Inserted Rows: 523                                                    |
  |      Updated Rows: 0                                                      |
  |       Failed Rows: 0                                                      |
  +---------------------------------------------------------------------------+

The total number of rows that have been inserted out of the total number of rows to be inserted is displayed, along with the name of the table currently being processed and the total rows that have been inserted, updated, or failed for that table. The display is revised:

  • Every 1000 rows for each table to display the current total number of processed rows.
  • When the processing for a table is complete and the processing for the next table begins.

Batch Execution

If you specify batch execution, Move builds the necessary JCL and Batch Utility control statements. The JOB card information is taken from the JCL specified on the Job Card and Print Options panel.
Note: If the Extract File is stored on tape, you must use batch execution for the Insert Process.

If you entered YES to the Prompt for Changes Before Job Submission prompt on the Job Card and Print Options panel, the default Job card, as indicated on that panel, is displayed prior to job submission. You may edit the Job card and print options and specify whether your changes are to apply only to the current job submission or to be applied permanently. (See the Common Elements Manual for details about job card and print options.)

The information on the Job Card and Print Options panel is used, along with the Insert parameters, to build the JCL and control statements required to perform the Insert Process. If you enter Review at the If Batch, Review or Save JCL prompt on the Specify INSERT Parameters and Execute panel, the entire set of JCL and control statements is displayed in the ISPF editor, where it can be edited and saved. (See the Batch Utilities Guide for the INSERT statement keywords and values.)

If you enter Save at the If Batch, Review or Save JCL prompt, you are prompted for the name of the file in which to save the JCL and control statements, and for whether the job should be submitted after saving. (Details are discussed later in this section.)

END returns from the ISPF editor to Move; however, your response to the Submit Jobs with END prompt on the User Options panel determines whether the job is automatically submitted. If you enter NO at that prompt, you must submit the job explicitly from the ISPF editor using the SUBMIT command.

If you enter YES, the job is automatically submitted. Use the CANCEL command to return to the Specify INSERT Parameters and Execute panel without submitting the job. You can modify the specifications or cancel the insert request from this panel.

(See the Common Elements Manual for information on establishing whether jobs are automatically submitted when END is used.)

If you submit the job and an error in the Job Card is encountered, a message is displayed. You can review the Job Card and correct the error or terminate the Insert Process.

Batch Overrides

If you save the generated batch job to a data set, you can submit the job directly from the ISPF editor instead of from within an online session. When you do so, you can override the default destination Creator ID defined in the Table Map used for the Insert Process. This is especially convenient when you want to apply different Extract Files to a single set of tables or a single Extract File to multiple sets of tables using common Insert Process JCL.

Any Extract File can be used in an Insert Process as long as at least one table name on the file matches one table name on the Table Map. The Creator IDs do not have to match. If any table does not match on the Table Map, it is not included in the process.

Use the PSDFOVRD DD statement in the JCL to provide the desired overrides.
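For example, the overrides might be supplied as in-stream data on the PSDFOVRD DD statement, as in the following sketch (the override values shown are illustrative, not taken from an actual job):

```
//PSDFOVRD DD *
DEFCID    TESTCID
UNKNOWN   ALLOW
/*
```

Each override parameter described below is coded on its own line in this input.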

Note: With Release 5.5, a generated batch job executes the Batch Utility to perform the specified function. The batch job includes a series of control statements defining the function to be performed. You can edit these control statements directly as an alternative to providing batch overrides. If batch overrides are not available, you must edit the control statements directly. The Batch Utilities Guide describes the Batch Utility control statements. (All batch overrides that were valid prior to Release 5.5 will continue to be valid.)

COMMIT_COUNT

To override the commit count that was specified when the job was created, specify:

COMMIT_COUNT   value

The value can range from zero to the site limit.
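For example, assuming a site limit of at least 500, the following override commits after every 500 rows:

```
COMMIT_COUNT   500
```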

COMMIT_MINUTES

To change commit processing from number of updates to elapsed time, specify:

COMMIT_MINUTES   value

The value is specified in minutes and overrides the commit count. The value can range from 1 to 1440. The process report reflects the change from number of updates to elapsed time.
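For example, to commit on elapsed time rather than on row counts, committing every 10 minutes:

```
COMMIT_MINUTES   10
```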

Creator ID

To override the default destination Creator ID specified on the Table Map, specify:

DEFCID   cid

where cid is the default Creator ID to be used. This applies only to destination tables that are not explicitly qualified in the Table Map. Only one DEFCID parameter may be specified for an Insert Process.
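For example, to direct all unqualified destination table names to the hypothetical Creator ID FOPTEST:

```
DEFCID   FOPTEST
```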

Date Aging

To override the date aging specifications for the Insert Process, specify one or more of the following:

AGE_AMT_YEAR
For incremental aging, the amount to increment by years.
AGE_AMT_MONTH
For incremental aging, the amount to increment by months.
AGE_AMT_WEEK
For incremental aging, the amount to increment by weeks.
AGE_AMT_DAY
For incremental aging, the amount to increment by days.

About AGE Parameters

The value must be numeric. Specify whether the value increments (+) or decrements (-). Increment is the default. For example, +15 or 15 increments the dates and -15 decrements the dates.

The combination of AGE_AMT parameters specifies the aging.

  • To specify an explicit date, specify a value for YEAR that is greater than or equal to 1582. Then, values must be specified for MONTH and DAY to define an explicit date. (A value for WEEK is invalid for an explicit date.)
  • To specify a date other than an explicit date, you can specify values for any or all units. The valid ranges for the values are:
    YEAR   -2500 to +1581
    MONTH   -30000 to +30000
    WEEK   -30000 to +30000
    DAY   -99999 to +99999
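As an illustration of incremental aging, the following overrides age all dates forward by one year and two weeks:

```
AGE_AMT_YEAR   +1
AGE_AMT_WEEK   +2
```

To specify an explicit date instead, you would give AGE_AMT_YEAR a value of 1582 or greater together with AGE_AMT_MONTH and AGE_AMT_DAY values, and omit AGE_AMT_WEEK.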
AGE_AMT_RULE
For incremental aging, the number of occurrences of an aging rule to increment. For example, if the specified rule is NEXTPAYDAY, specify 4 to age to the fourth payday from the current date in the column.
AGE_BASE_DATE
Specify a base date used by Move to determine the difference between this value and the AGE_TARGET_DATE. The difference is used as the number of days to age the data.

This must be specified in the format:
yyyy/mm/dd or yyyy/ddd

The current date is the default.

AGE_TARGET_DATE
Specify the target date used by Move to determine the difference between this value and the AGE_BASE_DATE. The difference is used as the number of days to age the data.

This must be specified in the format:
yyyy/mm/dd or yyyy/ddd

This is required if AGE_BASE_DATE is specified.
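For example, the following pair of overrides ages the data by the number of days between the two dates shown (the dates are illustrative):

```
AGE_BASE_DATE     2003/01/01
AGE_TARGET_DATE   2004/01/01
```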

TABLE
Specifies the name of the aging rule table to be used. Specify the name of a valid partitioned data set member.
AGE_RULE
Indicates the default aging rule to be used for any date not explicitly assigned an aging rule. Specify any value in the aging rule table specified for the TABLE statement. (For more information about the aging rule table, see the Customization Guide.)

If not specified, no aging rule is applied.

PIVOT_YEAR
Indicates which century to apply to two-digit year values. This information is used for aging rules and to provide the century in the output if so formatted.

Specify a two-digit value from 00 through 99. This value determines the threshold. If not specified, a pivot year is not used.

For example, assume the value is 65. All two-digit years 65 or over are assumed to be in the 20th century (19xx); all two-digit years that are less than 65 are assumed to be in the 21st century (20xx).
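Continuing that example, the pivot year would be specified as:

```
PIVOT_YEAR   65
```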

LIST_INVALID
Specifies whether details of the invalid dates encountered during the aging are listed at the beginning of the process report.

Specify Y to list the details or N to not list them. The default is Y.

LIST_SKIPPED
Specifies whether details of the skipped dates encountered during the aging are listed at the beginning of the process report. Dates are skipped when the date cannot be aged because the value is not a valid date but has special meaning to the application.

For example, “000000” or “999999” are not valid dates but may be special indicators for the application. Other examples of skipped dates are those containing only spaces, hex zeroes, or hex “FF”.

Specify Y to list the details or N to not list them. The default is Y.

PUT_INVALID
Specifies whether the rows with invalid dates are written to the database.

Specify Y to write the rows or N to not write them. The default is Y.

PUT_SKIPPED
Specifies whether the rows with skipped dates are written to the database.

Specify Y to write the records or N to not write them. The default is Y.

PROCESS_DATE_COLUMNS
Specifies whether date columns that are not explicitly mapped are to be aged. Specify one of the following values:
A
All DATE and TIMESTAMP columns and the columns mapped to AGE are aged.
U
Only columns mapped to AGE are aged.

UNKNOWN

To ignore any tables referred to in the Table Map that do not exist when performing the Insert Process, specify:

UNKNOWN    { FAIL | ALLOW }
FAIL
Terminate the insert if any tables or relationships named in the Table Map are unknown. The report will contain a message listing the first unknown object. This is the default.
ALLOW
Bypass the unknown tables and relationships and continue with the Insert Process. The report will contain a message listing the unknown objects.

UNKNOWN ALLOW is most frequently used when you override the default Creator ID with DEFCID. Changing the default Creator ID may result in naming destination tables that do not exist. The UNKNOWN ALLOW parameter enables you to direct the Insert Process to skip these “unknown” tables.
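For example, the following pair of overrides redirects unqualified destination table names to a hypothetical Creator ID and skips any resulting table names that do not exist:

```
DEFCID    PRODCID
UNKNOWN   ALLOW
```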

Store Overrides

You can store these parameters in a sequential file or a partitioned data set rather than specifying them directly in the jobstream. However, these parameters must be the only data in the file. (You cannot use the same file used for a batch Extract Process if selection criteria, SQL WHERE Clause, or Group Selection Processing parameters are also included. Also, you cannot use this file for an Extract Process if the YEAR parameter is included.)
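In that case, the PSDFOVRD DD statement points to the data set rather than to in-stream data, as in this sketch (the data set name is hypothetical):

```
//PSDFOVRD DD DSN=FOPDEMO.INSERT.OVERRIDES,DISP=SHR
```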

Save JCL

You can save the JCL and control statements, modify them, and execute the process without re-invoking Move. Specify S at the If Batch, Review or Save JCL prompt. The following panel prompts for the information needed to save the JCL and control statements.

Figure 7. Save JCL Parameters
 +---------------------------- Save JCL Parameters --------------------------+
 |                                                                           |
 |  DSN to Save JCL to     ===>                                              |
 |  Member (if PDS)        ===>                                              |
 |  Replace Existing Data  ===>       Y-Yes, N-NO                            |
 |                                                                           |
 |  DSN to Hold SYSIN Data ===>                                              |
 |  Member (if PDS)        ===>                                              |
 |  Replace Existing Data  ===>       Y-Yes, N-NO                            |
 |                                                                           |
 |  Submit JCL, or Review  ===>       S-Submit, R-Review, N-Neither          |
 +---------------------------------------------------------------------------+

Panel

This panel includes:

DSN to Save JCL to
Name of the sequential file or partitioned data set to receive the JCL and Batch Utility control statements.

If you specify a partitioned data set, specify the member name in Member.

Member (if PDS)
Name of the member in the partitioned data set specified for the DSN prompt. If a sequential file is specified and you specify a member name, an error message is displayed.
Replace Existing Data
Specify whether the generated JCL and control statements replace existing data in the specified file.
DSN to Hold SYSIN Data
Name of the sequential file or partitioned data set to hold SYSIN data.

If you specify a partitioned data set, specify the member name in Member.

Member (if PDS)
Name of the member in the partitioned data set specified for the DSN prompt. If a sequential file is specified and you specify a member name, an error message is displayed.
Replace Existing Data
Specify whether the SYSIN data replaces existing data in the specified file.
Submit JCL or Review
Specify whether the JCL and control statements are saved and submitted, displayed for review, or both. If you select Submit, the JCL and control statements are saved and the job is submitted. If you select Review, use ISPF facilities to save or submit the JCL and control statements. If you select Neither, the JCL and control statements are saved, but not submitted or displayed for review.

