IBM Support

InfoSphere Information Server, Version 11.7 post-installation requirements

Product Readmes


Abstract

After you install an InfoSphere Information Server product or fixes, in some cases you must complete post-installation steps.

Content


Steps required after you install InfoSphere Information Server

  • If you installed IBM Information Server Enterprise Search, you must synchronize the following asset types manually: Data Classes, Data Rules, Data Rule Sets, Data Rule Definitions and Data Rule Set Definitions. Use the following command:
    /opt/IBM/InformationServer/ASBServer/bin/graphbatchload.sh START --threads 4 --types ASCLAnalysis::DataClass,Rule,RuleSet,RuleExecutable
    For details about the command, see the Synchronizing assets manually topic.


Steps required after you install the fixes

Information Server, Version 11.7
After you install a fix pack or rollup patch, review the following table to find fixes that impact your jobs. If you find an applicable fix, follow the instructions to use that fix in your jobs.
Connectivity
Fix ID | What this fix does | Instructions | More details
Resolves the issue of properties being displayed in regional languages. To resolve this problem, check the text file in the "More Details" column.
Includes support for native client v11 in SQL Server Stage. To resolve this problem, check the text file in the "More Details" column.
Includes Oracle 12c support in the Connector Migration Tool for the DTS Connector. To include the support:
After installation and migration, DTS stage jobs must be recompiled before they can run.
NA
Resolves the issue of the Connector Migration Tool throwing a null pointer exception. To resolve this problem:
After installation, run the variant update command line, which updates the variants in all stages that were not previously updated.
NA
Upgrades the DataDirect OpenSSL libraries to version 1.0.2k. For details, check the text file in the "More Details" column.
Drops Sybase 12.0 support in Information Server 11.7. For details, check the text file in the "More Details" column.
Resolves the issue of the Connector Migration Tool throwing a null pointer exception while migrating the DRS plug-in v7.5 to the DRS Connector. After installation and migration, DRS Connector jobs must be recompiled before they can run.
NA
JR59741 Optionally enforces full suite logon for the dsjob command-line tool

Unix:
To enable this option, a root user must create a file at the top level of the file system. The file is "/.dsslog", and it should have rw-r--r-- (644) permissions.
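The two commands involved can be sketched as follows. The TARGET variable is purely illustrative so the sketch can be tried safely; on a real engine tier, run the commands as root with TARGET=/ so that the file is created as /.dsslog.

```shell
# Sketch: create the dsjob suite-logon flag file with rw-r--r-- (644) permissions.
# TARGET defaults to a throwaway temp directory for illustration; on a real
# system, run as root with TARGET=/ so the file lands at /.dsslog.
TARGET="${TARGET:-$(mktemp -d)}"
touch "${TARGET}/.dsslog"
chmod 644 "${TARGET}/.dsslog"
ls -l "${TARGET}/.dsslog"    # verify the rw-r--r-- permissions
```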

Windows:

To enable this option, a Windows administrator must set a registry entry. Run the regedit program and locate HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\Ascential Software\ in the registry tree. If the "DataStage" key is not present, create it, and then create a "CurrentVersion" key under it. (Ignore the "DataStage Client" and "DataStage Server" keys if they are present.) Under HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\Ascential Software\DataStage\CurrentVersion\, create a new String value named ForceSuiteLogon and set the value to 1.
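The same registry entry can typically be created in one step from an elevated Windows command prompt with the reg add command, which creates any missing intermediate keys automatically (shown as an equivalent to the regedit steps above):

```
reg add "HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\Ascential Software\DataStage\CurrentVersion" /v ForceSuiteLogon /t REG_SZ /d 1
```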

JR59624 Stage jobs migrated to DB2 Connector generate correct but unwanted IIS-CONN-DAAPI-000398/399 warnings. To suppress the warnings, set the environment variable CC_SUPPRESS_SCHEMA_RECONCILIATION_WARNINGS to any value. Leave the environment variable undefined (or set it to nothing) to display the warnings.
JR59707 Adds support for disabling Batch Error Mode for the Oracle Connector running a PL/SQL block as a source stage. To resolve this problem, set the environment variable CC_ORA_DISABLE_BATCH_ERRORS=TRUE to avoid an Oracle error message that is logged by the Oracle Connector.
JR59580 Cannot create a new record when the reference field is null using the Salesforce Connector. To resolve this problem, set the environment variable SF_CREATE_USE_DEFAULT_FOR_NULL to "false", then recompile and rerun the job. The record is then created even when the value of the reference field is null or empty. This overcomes the default behavior of Salesforce, which does not allow creating a new record when the reference field is null.
JR59823 Oracle connector on the server canvas (non-NLS environment) is unable to load accented characters. To resolve this problem, set the environment variable CC_SE_SKIP_NLS_CONVERSIONS to true so that no NLS conversions are performed in the target context. This configuration parameter applies to non-NLS environments only.
JR61567 Performance of the ODBC Connector is significantly low when performing insert or update statements. To resolve this problem, set the environment variable CC_ODBC_BIND_AS_VARCHAR=1.
JR61419 On the connector user interface, if the property value contains '>' followed by a newline, the newline is removed. To resolve this problem, define the environment variable CC_GUI_RETAIN_NEWLINE_IN_PROPVAL and set the value to 'true'. The environment variable can be defined at the job or project level.
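Project-level environment variables such as this one are typically defined with the dsadmin tool on the engine tier. A sketch, assuming a project named dstage1 (the project name and prompt text are illustrative, not part of the fix):

```shell
# Sketch (illustrative project name and prompt): define the variable at the
# project level with dsadmin, run from the DSEngine bin directory.
cd $DSHOME/bin
./dsadmin -envadd CC_GUI_RETAIN_NEWLINE_IN_PROPVAL \
          -type STRING \
          -prompt "Retain newline in property values" \
          -value true \
          dstage1
```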
JR61964 Adds support for INT96 Parquet types in File Connector. See the 'Reference' section in the Knowledge Center.
JR61501 MongoDB ODBC driver returns NULL key values for the second collection of a left join. You need to recreate the schemamap files by using schematool.
JR61117 SQL Server EE Operator in WRITE mode (bulk load) to BCP causes heap corruption when processing DECIMAL data. Perform the following steps for the SQL Native Client that is used. By default, the environment is set for SQL Native Client 6.
  • For SQL Native client 11, enter the following commands on the computer where the engine tier is installed:
         cd %DSHOME%\..\DSComponents\bin
         cp -f liborchsqlservernt.dll liborchsqlserver8nt.dll
         cp -f liborchsqlserver11nt.dll liborchsqlservernt.dll
  • To switch back to SQL Native Client 6 after the environment is set for SQL Server Native Client 11, enter the following commands on the computer where the engine tier is installed:
         cd %DSHOME%\..\DSComponents\bin
         cp -f liborchsqlserver8nt.dll liborchsqlservernt.dll
JR62155 Teradata connector truncates string data that includes the string 0x00. The Teradata API plug-in stage truncates the '\0' NULL characters, whereas the Teradata connector reads the data as is. To match the behavior of the Teradata API stage in the Teradata connector (TDCC), a new environment variable, CC_TERA_TRUNCATE_STRING_WITH_NULL, is added, which truncates the string data that includes the string 0x00.
JR62437 In File Connector, a 'NoSuchMethodError' error occurs while reading ORC date and timestamp in IS 11.7.1.
The File Connector code is enhanced to read ORC date and timestamp columns when using the hive-exec-3.1 version of the Hive binaries.
For jobs that read and write ORC format files, the environment variable CC_USE_LATEST_FILECC_JARS must be set to orcstream.jar and the CLASSPATH environment variable must contain the following values:
(one CLASSPATH value; a line break is shown after each ':' separator for readability)
<ISHOME>/ASBNode/eclipse/plugins/com.ibm.iis.client/commons-logging-1.2.jar:
<ISHOME>/Server/DSComponents/bin/thirdparty/jackson-databind-2.10.1.jar:
<ISHOME>/Server/DSComponents/bin/thirdparty/joda-time-2.1.jar:
<ISHOME>/Server/DSComponents/bin/thirdparty/commons-lang3-3.6.jar:
<ISHOME>/Server/DSComponents/bin/thirdparty/commons-configuration-1.10.jar:
<ISHOME>/Server/DSComponents/bin/thirdparty/commons-lang-2.6.jar:
<ISHOME>/Server/DSComponents/bin/thirdparty/jackson-core-2.10.1.jar:
<ISHOME>/Server/DSComponents/bin/thirdparty/hadoop-mapreduce-client-core-3.1.0.jar:
<ISHOME>/Server/DSComponents/bin/thirdparty/httpclient-4.5.10.jar:
<ISHOME>/Server/DSComponents/bin/thirdparty/commons-collections4-4.1.jar:
<ISHOME>/Server/DSComponents/bin/thirdparty/hadoop-common-2.7.5.jar:
<ISHOME>/Server/DSComponents/bin/thirdparty/hadoop-auth-2.7.7.jar:
<ISHOME>/Server/DSComponents/bin/thirdparty/commons-collections-3.2.2.jar:
<ISHOME>/Server/DSComponents/bin/thirdparty/java-jwt-3.8.3.jar:
<ISHOME>/Server/DSComponents/bin/thirdparty/snappy-java-1.1.7.3.jar:
<ISHOME>/Server/DSComponents/bin/thirdparty/commons-compress-1.19.jar:
<ISHOME>/Server/DSComponents/bin/thirdparty/hadoop-hdfs-2.7.7.jar:
<ISHOME>/Server/DSComponents/bin/thirdparty/guava-26.0-jre.jar:
<ISHOME>/Server/DSComponents/bin/thirdparty/slf4j-api-1.7.30.jar:
<ISHOME>/Server/DSComponents/bin/thirdparty/jackson-annotations-2.10.1.jar:
<ISHOME>/Server/DSComponents/bin/thirdparty/httpcore-4.4.12.jar:
<ISHOME>/Server/DSComponents/bin/thirdparty/slf4j-simple-1.7.30.jar

Replace <ISHOME> with the actual Information Server installation directory.
The environment variable CC_USE_LATEST_FILECC_JARS can be set at the job level or the project level.
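One way to assemble this CLASSPATH without hand-editing a single long line is a small dsenv-style loop. This is only a sketch: the ISHOME default is illustrative, and only the first few jars are shown; extend the list with the full set above.

```shell
# Sketch: build the File Connector CLASSPATH from the jar list above.
# ISHOME default is illustrative; only the first few jars are included here —
# on a real system, add every jar from the documented list.
ISHOME="${ISHOME:-/opt/IBM/InformationServer}"
THIRDPARTY="$ISHOME/Server/DSComponents/bin/thirdparty"
CLASSPATH="$ISHOME/ASBNode/eclipse/plugins/com.ibm.iis.client/commons-logging-1.2.jar"
for jar in jackson-databind-2.10.1.jar joda-time-2.1.jar commons-lang3-3.6.jar; do
    CLASSPATH="$CLASSPATH:$THIRDPARTY/$jar"
done
export CLASSPATH
export CC_USE_LATEST_FILECC_JARS=orcstream.jar
echo "$CLASSPATH"
```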
Metadata import:

For metadata import, the environment variable CC_USE_LATEST_FILECC_JARS must be set to orcstream.jar.

The Agent.sh file must be modified to include the CLASSPATH environment variable mentioned above, and the ASB Agent must be restarted.
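Restarting the ASB Agent is typically done with the NodeAgents.sh script on the engine tier. A sketch, assuming the default installation path (check your own install location and your platform's documented restart procedure):

```shell
# Sketch: restart the ASB Agent (default install path assumed).
cd /opt/IBM/InformationServer/ASBNode/bin
./NodeAgents.sh stop
./NodeAgents.sh start
```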

NOTE:

For jobs that require File Connectors configured for both ORC and Parquet format files, the environment variable must be set to
CC_USE_LATEST_FILECC_JARS=orcstream.jar:parquetstream.jar
JR62905 Provided an option in File Connector to write temp files in a user specified location. You can specify the temporary location for the temp files in HDFS by setting the environment variable, CC_FILECONN_TEMPFILE_DIRECTORY. For example, CC_FILECONN_TEMPFILE_DIRECTORY=/tmp. When this environment variable is set, the temp files are created in the '/tmp' directory in HDFS.
 
Note:
  1. You can force delete the temporary files in HDFS by setting the environment variable CC_FILECONN_FORCE_DELETE_MANIFEST to true. For example, CC_FILECONN_FORCE_DELETE_MANIFEST=true.
     When this environment variable is set, the temporary files are deleted as soon as they have been created and their usage is completed.
  2. The environment variables are independent of each other. You can use either one or both of them for the required functionality in the job.
     Set the environment variables at the job level or at the project level, recompile the job, and then run it.
JR62256 Performance of the ODBC connector is significantly low when performing insert or update statements with a char datatype as the primary key. For better performance, add ISBindAsChar=1, a new ODBC DSN parameter.

For example:
[SQL_SERVER]
Driver=/opt/IBM/InformationServer/Server/branded_odbc/lib/VMsqls00.so
...
Database=<DATABASE_NAME>
HostName=<HOSTNAME>
Password=<PASSWORD>
Pooling=0
PortNumber=1433
...
ISBindAsChar=1    (new DSN parameter)
JR62941 Hierarchical stage - REST step is unable to post or get data larger than 10 MB.

An environment variable, XMLSTAGE_REST_CONTENT_MAX_SIZE, is added, which can be used to override the default limit of 10 MB.
You can set the variable at the job level or project level, like any other environment variable, and increase the body content limit as required. An integer value greater than 10 is accepted (for example, 30 sets the limit to 30 MB); in all other cases the limit defaults to 10 MB.
Connectivity (non-IBM)
JR59991 Oracle CC jobs (migrated from OCI stages) fail on the server canvas while handling date/timestamp columns. Set CC_SE_HANDLE_DATE_IN_BULKMODE to TRUE to resolve this issue.
The following issues were fixed:
  1. Timestamp formats like 'YYYY-MM-DD-HH24.MI.SS', as well as formats like 'YYYYdMMdDDdHH24dMIdSS'.
     Note: In the above format, d can be any delimiter, such as %, /, : or .
  2. Date columns using bulk load (for jobs that are being migrated from OCI Load). As per the connector design, the target connector expects the date in Julian format, hence it needs ICONV logic; however, OCI Load does not need this type of conversion.
JR60028 Oracle CC jobs fail post-migration from OCI/OCI Load plug-in stages. If a source connector contains to_char(<column name>,'YYYY-MM-DD HH24:MI:SS') in the SELECT statement and the corresponding mapped column is Timestamp, set both CC_ORA_BIND_DATETIME_AS_CHAR=TRUE and CC_ORA_HONOR_DATETIME_AS_CHAR=TRUE. The following issues were fixed:
  1. The POSITION attribute in a control file is not generated correctly in the case of manual load.
  2. CC_ORA_BIND_DATETIME_AS_CHAR should be honored only in the source context for Oracle CC jobs (migrated from OCI) to work.

JR60212
Salesforce connector supports bulk extract without using PK Chunking. Added the job property "Enable PK Chunking", which is activated when the Query operation is selected in Bulk Mode. By default, the value of this job property is "Yes", which means that the bulk query job uses the PK Chunking feature.
To run a job without the PK Chunking feature, set the job property "Enable PK Chunking" to "No", then compile and run the job.
JR60241 Adds the option for HardDelete in the Salesforce connector bulk mode during the delete operation. Added the job property "Empty Recycle Bin", which is activated when the Delete operation is selected in bulk mode. Set this job property to "true", then compile and run the job; the record is deleted permanently and is not stored in the "Recycle Bin".
 
JR60482 Introduces the environment variable CC_ORA_DISABLE_PLSQL_BATCH_ERRORS.
Set CC_ORA_DISABLE_PLSQL_BATCH_ERRORS=TRUE as a user-defined environment variable in a job that uses PL/SQL within an Oracle Connector where both the Oracle client and server are 12cR2 or later.
JR61241 Search users by role does not return users that inherited the role when an external user registry is used. Changes are required in the post-installation configuration.
JR61569 Salesforce Connector does not extract or load the timestamp field type in the local timezone. To resolve this problem, use the environment variable SF_USE_LOCAL_TIMEZONE_DATETIMEFIELD so that the timestamp data is extracted from or loaded to Salesforce using the local timezone.
For the fix to work, set the environment variable SF_USE_LOCAL_TIMEZONE_DATETIMEFIELD to 'true', then compile and run the job.
JR61141 Amazon Redshift driver returns non-nullable columns as NULLABILITY UNKNOWN. The Redshift JDBC driver supports a new connect option, 'ExtendedColumnMetaData'.
The default value is false. With ExtendedColumnMetaData=true, the driver correctly reports nullability metadata information.
To enable the connection option, append ExtendedColumnMetaData=true to the URL connection string.
Example:
 jdbc:ibm:redshift://<YourServer>:<portnumber>;DatabaseName=<name>;ExtendedColumnMetaData=true
Setting 'ExtendedColumnMetaData' to true may diminish performance, so it is recommended to use this option selectively, at the job level only.
JR62461 Added support for session token in the Amazon S3 connector. Support for temporary credentials is added. The session token can be used only when 'Use credentials file' is set to 'Yes'.
 
The entries in the credentials file used with the S3 connector must be in the following format:
 accessKey=<Replace with the AccessKey>
 secretKey=<Replace with the SecretKey>
 sessionToken=<Replace with the SessionToken>
The credentials file with sessionToken can also be used with the S3 connector for metadata import by using IMAM.
Added support for read, write, and metadata import of ORC and Parquet format files with the latest specification by using the Amazon S3 connector. To read or write ORC and Parquet format files with the latest specification, the following environment variable must be set:
CC_USE_LATEST_FILECC_JARS=orcstream.jar:parquetstream.jar

The environment variable CC_USE_LATEST_FILECC_JARS can be set at the job level or the project level.
Metadata import:

To import ORC and Parquet format files with the latest specification by using the Amazon S3 connector, the following environment variable must be set, and the ASB Agent must then be restarted.
CC_USE_LATEST_FILECC_JARS=orcstream.jar:parquetstream.jar 
 
Note:

It is recommended to use services or tools that support the latest specification of Parquet and ORC when parquetstream or orcstream is used in the Amazon S3 connector.
Parallel Engine
Fix ID | What this fix does | Instructions | More details
Resolves the issue of stage variables being incorrectly set to NULL at the start of the link evaluation cycle. After installation and migration, DRS Connector jobs must be recompiled before they can run.
DataStage
Ensures that DSODBON is not disabled after saving DSODBConfig.cfg in UTF-8 encoding. Make sure that the existing DSODBConfig.cfg files are saved in ANSI encoding to ensure correct processing.
Provides an option to suppress the generation of a summary report in sequence jobs. Set the new Project Properties option to specify that a job report summary is not accumulated in the log during the job run.
JR59030
Updates Microsoft redistributables to the latest version. Install the cumulative security update from Microsoft for the redistributables that are included in Information Server. It is recommended to restart Windows client systems to ensure that the installation is complete.
Fixes the issue with the Oozie Workflow Invoker when the Oozie client is configured with Kerberos authentication. If Kerberos authentication is used, add the following line to the dsenv file and restart the DataStage engine:
OOZ_AUTHM="-authmode"; export OOZ_AUTHM
If the Oozie client is configured for Kerberos authentication, the Oozie Workflow Invoker does not manage the remote workflow correctly. An environment variable has been added to allow correct configuration of the Oozie Workflow Invoker.
Ensures that large IA rule stage jobs do not fail to load in Designer after being compiled. Compilation is required for the affected jobs. The fix does not correct jobs that are already compiled and cannot be loaded.
If this fix appears to cause unexpected side effects, it can be disabled by setting the environment variable DS_DONT_SUPPRESS_ORCH_CODE=1 in projects.
Ensures that parallel transformer jobs are not aborted with the "No input sort key information found" error message. Any jobs that exhibit the problem must be force-compiled after the fix pack has been applied. All other jobs are unaffected by the change.
If the problem has already occurred, it is possible to identify the job number and manually remove the offending keybreak.so file from the RT_BPnn.O directory.
The fix removes the keybreak.so and modify.so files before the stage compilation. This ensures that the set of shared libraries is generated by the compile.
Enables detection that an Execute Command Activity script without a trap command has been killed.
Set the DSE_CHILD_RETCODE_BEHAVIOR environment variable to 1 to allow detection that an Execute Command Activity script without a trap command has been killed. This causes the return code to be assigned a value of 128 + the signal number used (as is done at the UNIX level).
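The 128 + signal convention mentioned above can be observed in any POSIX shell. For example, a child shell with no trap handler that is killed by SIGTERM (signal 15) reports exit status 128 + 15 = 143:

```shell
# A child process killed by SIGTERM (15) with no trap handler exits 128+15=143.
sh -c 'kill -TERM $$' || echo "exit status: $?"
```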
Sets the 'fieldsToNull' property while updating a null value for the Update operation of the Salesforce Connector. You need to set the environment variable SF_UPDATE_USE_DEFAULT_FOR_NULL to "false". Compile and run the job after setting the variable.
JR60184 Subsets the data rule exception counts. You need to set the parameter by using the IAAdmin command. Installing this patch enables the user to set the parameter 'defaultElseValueForRules' by using the command that specifies global settings for the host name, user name, password, and port number for the analysis engine, and the project name for which the option is to be set.
If -projectName is omitted, 'defaultElseValueForRules' is set at the global level. By default, 'defaultElseValueForRules' is set to 'TRUE'.
JR60184.txt
JR61112 Supports a customizable temporary directory for FTP Enterprise Stage (instead of the local /tmp). Add the new environment variable APT_SFTP_TEMP_FOLDER and set it to a valid path, for example:
  APT_SFTP_TEMP_FOLDER=/opt/IBM/PFTP_Temp_Folder
JR61636 FTP Enterprise supports OpenSSH in SFTP mode on the Windows platform. You need to set the environment variable APT_PFTP_SSH_CLIENT to 'OPENSSH'. After the environment variable is set, compile and run the job.
Common Event Framework
JR59789 New istool command-line utility options for deleting DQEC exception sets. The istools cache must be purged after installing the patch. On the engine or client tier, change to the istools configuration directory:
  cd <IS>/Clients/istools/cli/configuration
Remove all directories under configuration and retain only config.ini.
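The purge step — remove every directory directly under the configuration directory while retaining the config.ini file — can be scripted. A sketch, where ISTOOLS_CONF stands in for <IS>/Clients/istools/cli/configuration; the default below builds a throwaway demo layout so the sketch is safe to try:

```shell
# Sketch: purge the istools cache, keeping config.ini (a file) intact.
# ISTOOLS_CONF stands in for <IS>/Clients/istools/cli/configuration; the demo
# default creates a throwaway layout purely for illustration.
ISTOOLS_CONF="${ISTOOLS_CONF:-$(mktemp -d)}"
mkdir -p "$ISTOOLS_CONF/org.eclipse.osgi"    # example cached directory
touch "$ISTOOLS_CONF/config.ini"
# Delete only the directories directly under the configuration directory.
find "$ISTOOLS_CONF" -mindepth 1 -maxdepth 1 -type d -exec rm -rf {} +
ls "$ISTOOLS_CONF"
```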
JR59789.txt
DSCore Engine
JR60399
DataStage clients fail for certain users on AIX with Active Directory
Add the following line to Server/DSEngine/dsenv.
    export DSE_PW_USERNAME=1
The DataStage server engine must be stopped and restarted to pick up the new environment variable setting.

Applies to: IBM InfoSphere Information Server, Versions 11.7.0.0, 11.7.0.1, and 11.7.0.2, on AIX, Linux, and Windows.

Document Information

Modified date:
09 April 2021

UID

swg27050650