IBM InfoSphere Identity Insight Version 8.1 Fix Pack 4 - Release Notes

These release notes contain information to ensure the successful installation and use of IBM® InfoSphere® Identity Insight Version 8.1 Fix Pack 4. Included is information about updates, fixed problems, usage notes, and known problems.

Contents

About IBM InfoSphere Identity Insight

IBM InfoSphere Identity Insight helps organizations solve business problems related to recognizing the true identity of someone or something ("who is who") and determining the potential value or danger of relationships ("who knows who") among customers, employees, vendors, and other external forces. IBM InfoSphere Identity Insight provides immediate and actionable information to help prevent threat, fraud, abuse, and collusion in all industries.

This product was formerly titled IBM Relationship Resolution.

Summary of product enhancements and fixes for Version 8.1 Fix Pack 4

Note: Fix Pack 4 is cumulative; it includes the fixes from Fix Pack 1, Fix Pack 2, and Fix Pack 3.
Fix Pack 4 includes the following product enhancements and fixes:
Install and Configuration
Windows 2012 support.
IBM InfoSphere Identity Insight Version 8.1 Fix Pack 4 adds support for running the product on Windows 2012. To obtain the Windows 2012 installer, contact the IBM Support team.
Import / Export Tool
A utility that exports product configurations from a database to a flat file and imports them back into a database. The utility has both command-line and UI interfaces.
This tool is useful when promoting configurations from a development to a production environment, or when working with IBM Support.
The tool is located in the sql directory.
The table list files are located in the sql directory.
export
The export mode exports an existing configuration from a database to a flat file. Required parameters are the database credentials (for connecting to the source database), a table-list file name that specifies which tables to export, and an export file name.
To start the UI interface: on Windows, run easexport.bat from the command line; on AIX and Linux, run easexport from the shell.
To use the command line interface:
Usage:  easexport [-t <dbtype>] -s <server> [-u <uid>] [-p <pwd>]
-d <database> -n <Informix server> -f <filename>
[-a <table-list> | -e <exclude-table-list>] [-o <port>] [-r][-w][-g]

-t <dbtype>          is one of:
                     oracle:     Oracle (Default on Solaris and Linux)
                     mssql2008:  Microsoft SQL Server 2008 (Default on Windows)
                     db2:        IBM DB2 (Default on AIX)
                     informix:   IBM Informix
-s <server>          is the Data Source Name (SQL Server) or Hostname (Oracle, DB2)
-u <uid>             is the user ID or Oracle schema name
-p <pwd>             is the password (if unspecified OS authentication is used)
-d <database>        is the instance (Oracle) or database (SQL Server, DB2)
-n <Informix server> is the name of the Informix server (Informix)
-c <schema>          is the name of the DB2 schema to use. Default schema is used if unspecified. (DB2 only)
-f <filename>        is the path to the export file
-o <port>            is the port (Oracle, DB2)
-r                   Include the export of restricted tables (currently only dqm_generics)
-a                   Table only, optional, multiple allowed (comma delimited list)
-e                   Exclude list of tables, optional, multiple allowed (comma delimited list)
-l                   File containing a list of tables (REQUIRED)
-w                   Display the data entry dialog
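For illustration, a minimal sketch of assembling an export command line from these flags; the server, database, user, password, and file names are hypothetical examples, not product defaults:

```shell
# Sketch only: compose an easexport command line for a DB2 source.
# dbhost, easuser, secret, easdb, and both file names are hypothetical.
TABLE_LIST="export_tables.txt"   # table-list file (-l)
EXPORT_FILE="config_export.dat"  # flat file to create (-f)

CMD="easexport -t db2 -s dbhost -u easuser -p secret -d easdb -l $TABLE_LIST -f $EXPORT_FILE"
echo "$CMD"
```

The same flag layout applies to the other database types; swap the -t value and connection flags accordingly.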
import
The import mode imports an exported configuration flat file into an existing database. Required parameters are the database credentials (for connecting to the target database), a table-list file name that specifies which tables to import, and an import file name.
To start the UI interface: on Windows, run easimport.bat from the command line; on AIX and Linux, run easimport from the shell.
To use the command line interface:
Usage:  easimport [-t <dbtype>] -s <server> [-u <uid>] [-p <pwd>]
-d <database> -n <Informix server> -f <filename>
[-a <table-list> | -e <exclude-table-list>] [-o <port>] [-r][-w][-g]
[-ne] [-b backupfilename] [-w] [-g logfilename]

-t <dbtype>          is one of:
                     oracle:     Oracle  (Default on Solaris and Linux)
                     mssql2008:  Microsoft SQL Server 2008 (Default on Windows)
                     db2:        IBM DB2 (Default on AIX)
                     informix:   IBM Informix
-s <server>          is the Data Source Name (SQL Server) or Hostname (Oracle, DB2)
-u <uid>             is the user ID or Oracle schema name
-p <pwd>             is the password (if unspecified OS authentication is used)
-d <database>        is the instance (Oracle) or database (SQL Server, DB2)
-n <Informix server> is the name of the Informix server (Informix)
-c <schema>          is the name of the DB2 schema to use. Default schema is used if unspecified. (DB2 only)
-f <filename>        is the path to the export file
-o <port>            is the port (Oracle, DB2)
-r                   Include the export of restricted tables (currently only dqm_generics)
-a                   Table only, optional, multiple allowed (comma delimited list)
-e                   Exclude list of tables, optional, multiple allowed (comma delimited list)
-l                   File containing a list of tables (REQUIRED)
-w                   Display the data entry dialog
-ne                  Do not export a backup file before importing the data
-b                   Backup filename (optional; backup filename  will be created automatically)
-g                   Import log file name (optional)
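Similarly, a hedged sketch of an import invocation that names the table list, import file, and import log; all values are hypothetical examples:

```shell
# Sketch only: compose an easimport command line for a DB2 target.
# dbhost, easuser, secret, easdb, and the file names are hypothetical.
TABLE_LIST="import_tables.txt"   # table-list file (-l)
IMPORT_FILE="config_export.dat"  # flat file to read (-f)
LOG_FILE="import.log"            # import log (-g)

CMD="easimport -t db2 -s dbhost -u easuser -p secret -d easdb -l $TABLE_LIST -f $IMPORT_FILE -g $LOG_FILE"
echo "$CMD"
```

Leaving out -ne keeps the automatic backup export that precedes the import.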
Pipeline
UMF Validation
In the Basic Console, UMF Validation is accessed through the UMF Validation tab. UMF Validation is an administrative feature built into the Basic Console that validates small sections of UMF/XML files before you load them into Identity Insight. UMF Validation is useful when adding new data sources, data streams, or entity attribute types to an existing Identity Insight system.
After a UMF file is loaded, the Validation Results panel shows various tables which summarize the validation results.
The Validation Log File panel displays errors and warnings for up to 100 input records. To save the entire log file, click Save Log File in the upper-right corner of the Validation Log File panel.
Fixes and issues corrected
  • Pipeline fails to connect to QSAVI and shuts down with errors.
User interfaces
Console
Fixes and issues corrected
  • Patched the ClassLoader manipulation vulnerability in Apache Struts, used by the console.
Cognos
Alert Summary/Detail report
The ISII_AlertSummary Cognos report is a sample report provided to demonstrate how Cognos can be used to view Identity Insight role-alert summary data. The ISII_AlertSummary report works with the existing Identity Insight Cognos model, if you have it installed.
To add the ISII_AlertSummary report to your Cognos installation:
  1. Open the ISII_AlertSummary.xml report file (found in your Identity Insight install directory, under the cognos/reports folder) in a text editor.
  2. Copy the entire contents of the ISII_AlertSummary.xml file to the clipboard.
  3. Open Cognos Report Studio.
  4. Create a blank Cognos report based on the ISII model, and select Tools > Open Report from Clipboard.
  5. Save the report as ISII_AlertSummary under the ISII Cognos folder.
Refer to the Cognos documentation for more information on importing reports from the clipboard.
Expanded Services API
Console
  • Addition of SOA auditing support to track requests, users who issued the requests, and responses.
  • Fixed an HTTP transport issue on Windows.
Global Name Recognition
Upgraded to Global Name Recognition Version 5.0
The Global Name Recognition software included with the product has been updated to Version 5.0.
Added support for custom culture
Identity Insight now supports custom cultures in its processing. This feature lets you create new culture categories by claiming an allocated custom-culture slot. It also lets you restore culture categories that were previously overridden as a workaround for adding a new culture. Existing cultures can be tuned to better match environment-specific behavior by explicitly setting whether a name is or is not attributable to a specific culture category. Up to 20 custom cultures are allowed.
To define a new custom culture:
  1. Insert a row into the SYSTEM_PARAM table, or add an entry to the INI file, to define the name of a custom culture. For example, [NameManager] Custom_Culture_1=HEBREW or [NameManager] Custom_Culture_2=HEBREW_ARAB defines a custom culture.
  2. Insert a row into the LAS_MATCH_CONFIG table to configure the matching parameters for the new culture.
  3. Insert several rows in the LAS_COMP_PARMS_CONFIG table to specify the NameHunter matching parameters for the new culture.
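As a sketch of step 1 in INI form, using the culture names from the example above:

```ini
; Culture names here are the documentation's own examples (HEBREW, HEBREW_ARAB).
[NameManager]
Custom_Culture_1=HEBREW
Custom_Culture_2=HEBREW_ARAB
```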
Added customized transliteration for custom cultures
Customized name transliteration is enabled for custom cultures. Name transliteration is the process of converting a name from one writing system or character-encoding convention into another.
Identity Insight plug-in for i2
Fixes and issues corrected
  • i2 plug-in Find By Alert ID failed when Load was clicked with no value in the alert ID field.
  • i2 plug-in connection failed when trailing spaces were entered in the host or port fields.
  • Addition of Find Path support. Select any two entities on an i2 graph to see whether there is an Identity Insight path between them. If Identity Insight identifies a path between the two selected entities, that path is added to the graph.
  • Addition of support for Entity-node icons by Identity Insight roles.

Modified views and tables

Upgrading to Fix Pack 4 will modify existing views and tables.

Fix Pack 4 adds or modifies the following views and tables:

New views
  • SOA_DOC_TYPE
  • SOA_ENTITY_TYPE
Updated tables
  • DQM_RULE
  • DQM_RULE_PARAM
  • DQM_RULE_SET
  • UMF_DQM_MAPPING
  • UMF_OUTPUT_FORMAT
  • UMF_OUTPUT_RULE
  • UMF_OUTPUT_PARAM
  • LDR_MESSAGE_TYPE
  • SYSTEM_PARAM
  • COMPONENT_CONFIG_TAGS
Note: Back up your modified views and tables to prevent loss if you are upgrading to Version 8.1 in preparation for installing Fix Pack 4. See related information in the information center (or PDF version) under Installing and upgrading > Upgrading the product > Upgrade items > Customized views overwritten or deleted during upgrade.

Install Scenarios for IBM InfoSphere Identity Insight Version 8.1 Fix Pack 4

There are several Fix Pack 4 installation scenarios:
Important: If you use IBM AIX 6.1, you must move to IBM AIX 7.1 before installing Fix Pack 4.
Version 8.1 installed - add Fix Pack 4
Install Fix Pack 4 using the fix pack installation program.
Fix Pack 4 is cumulative of all previous fix packs.
Version 8.1 with Fix Pack 1 installed - add Fix Pack 4
Install Fix Pack 4 using the fix pack installation program.
Version 8.1 with Fix Pack 2 installed - add Fix Pack 4
Install Fix Pack 4 using the fix pack installation program.
Version 8.1 with Fix Pack 3 installed - add Fix Pack 4
Install Fix Pack 4 using the fix pack installation program.
Version 8.1 Fix Pack 1 with IBM Informix (including the addition of Fix Pack 2 or Fix Pack 3) - add Fix Pack 4
Install Fix Pack 4 using the fix pack installation program.
This scenario is for customers who use an IBM Informix database. It requires a separate installation program that installs a new Version 8.1 system adapted for Informix, plus the enhancements and fixes that are part of Fix Pack 1. Note: This special Informix installer installs the product only up to Version 8.1 Fix Pack 1.
Version 8.1 is not installed
You must first install or upgrade to Version 8.1.
Then install Fix Pack 4.

For more information, see the product Support portal. Planning, installation, and configuration information is also available in the product information center at ibm.com®. Note the installation-related items in this document.

Required actions and considerations before installing Fix Pack 4

Important: You must do the following before installing or upgrading to Fix Pack 4.
Platform
AIX: If you use IBM AIX 6.1, you must move to IBM AIX 7.1 before installing Fix Pack 4.
Pipeline
Windows installation: Stop and delete the old pipeline services before running the installer to upgrade to Version 8.1 Fix Pack 4.

Non-Windows Installation: Make sure that no pipeline is running. The upgrade will fail if a pipeline is running.

Embedded WebSphere Application Server
Before you install, stop the embedded version of WebSphere Application Server. From the installation root directory, run:
  • On Windows: StopEAS.bat
  • On other platforms: stopEAS.sh
Installation directory and Identity Insight schema
Back up your installation directory and Identity Insight schema before upgrading.
Database Environment:
Validate that the database directory and instance environment variables are set correctly for your database type.
User ID required for upgrade:
The person performing the upgrade should use the same user ID that was used to install the product.
Note: Always perform upgrades logged on as the installation owner, not as root.

Considerations and issues for AIX users

If you use IBM AIX 6.1, you must move to IBM AIX 7.1 before installing Fix Pack 4.

Considerations and issues for Microsoft Windows 7 and Microsoft SQL Server 2008 users

If you use IBM InfoSphere Identity Insight on a Microsoft Windows 7 client or use Microsoft SQL Server 2008 as your database, be aware of the following before installing:

Windows Server 2003 or 2008 for pipeline or Application Server with a DB2® v9.5 or v9.7 database server:
A potential issue exists for IBM Identity Insight Version 8.1 customers who are using Windows Server 2003 or Windows Server 2008 operating systems for the pipeline or Application Server and IBM DB2 Version 9.5 or Version 9.7 database server. Latin-1 or UTF-8 data may not be encoded correctly by Identity Insight Version 8.1 with this operating system-database combination. If you are using a DB2 v9.5 or v9.7 database, you are strongly encouraged to install IBM Identity Insight Version 8.1 in a test environment and verify correct encoding of Latin-1 or UTF-8 data. Check the following columns:

Table=NAME
Columns=LAST_NAME, FIRST_NAME, MID_NAME,
NAME_PFX, NAME_SFX, NAME_GEN

Table=ADDRESS
Columns=ADDR1, ADDR2, ADDR3, CITY, STATE

Table=ATTRIBUTE
Column=ATTR_VALUE

If the data in any of these tables appears to be incorrectly encoded, verify that the following environment variable is set on your Windows pipeline server:
DB2CODEPAGE
Set DB2CODEPAGE to the same value as the DB2 database CODEPAGE configuration variable. For example, if the DB2 database configuration is:
CODEPAGE=1208
CODESET=UTF-8
The DB2CODEPAGE environment variable should be set as follows:
set DB2CODEPAGE=1208
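The comparison can be sketched in shell; 1208 follows the example above, and the script only reports, it does not change any settings:

```shell
# Sketch only: compare the pipeline server's DB2CODEPAGE environment variable
# against the database CODEPAGE value (1208 = UTF-8, per the example above).
DB_CODEPAGE=1208               # value from the DB2 database configuration
DB2CODEPAGE="${DB2CODEPAGE:-}" # as set in the pipeline server's environment

if [ "$DB2CODEPAGE" = "$DB_CODEPAGE" ]; then
  RESULT="DB2CODEPAGE matches database CODEPAGE ($DB_CODEPAGE)"
else
  RESULT="Mismatch: DB2CODEPAGE='$DB2CODEPAGE', database CODEPAGE=$DB_CODEPAGE"
fi
echo "$RESULT"
```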

Considerations and issues for Oracle database users

If you use an Oracle database with IBM InfoSphere Identity Insight, be aware of the following additional known issues:
Do not segment or partition database tables that are used for UMF input transports
Database tables that are used for UMF input transports must not be segmented or partitioned. Segmenting or partitioning these transport tables will result in errors in the pipeline transport and lost data.

Considerations and issues for IBM Informix users

If you use IBM Informix as your database, be aware of the following additional known issues:

IBM Informix Dynamic Server Release 11.70.xC4DE database support
Support for IBM Informix Dynamic Server Release 11.70.xC4DE databases (Ultimate Edition and Ultimate Warehouse edition) is provided with the following Application Server architectures (Native OS implementation only):
  • IBM AIX 6.1 - 64 bit - POWER 5/6
  • Linux x86
  • Linux x86_64
  • 64-bit Linux on System z
  • Novell SUSE Enterprise Linux 10 - 64-bit - IBM System z only
  • Novell SUSE Enterprise Linux 11 - 64-bit, x86_64
  • RedHat Enterprise Linux 5 - 64-bit, x86_64
  • Microsoft Windows 2008 Server - 64-bit, x86_64
Note: IBM Informix Dynamic Server 11.7 DataBlade functionality is not supported.

See the section on system requirements for detailed information.

Installation prerequisites and setup information for Informix
The environment variables INFORMIXSERVER and INFORMIXDIR must be set before you run the installer to properly configure an Informix installation.
  • INFORMIXSERVER should point to the instance of the target DB server, for example, ol_informix1170.
  • INFORMIXDIR should point to the root installation directory where the Informix client or server package is installed. On UNIX, this is typically under /opt/IBM/informix, possibly with a version number appended; on Windows, it is usually under Program Files/IBM/Informix.
Minimum page sizes
IBM Identity Insight databases managed with Informix must be created with a minimum page size of 8K. If this is not done, installation fails with an error similar to the following:
java.sql.SQLException: Total length of columns in constraint is too long. 
Logging mode
Logging mode must be enabled for the Informix database when it is created. Do not use the MODE ANSI keyword. Doing so can break pipeline transaction-handling functions.
Running SUIT
When installing to an Informix database, the Schema Upgrade and Installation Tool (SUIT) must be run with the following command-line parameter: -mbi 1. This disables statement batching and is required for versions 11.50 and 11.70. If this is not done, the SUIT operation fails with the error "maximum statement length exceeded" and aborts. See SUIT: Schema Upgrade and Installation Tool version 8.1.0.4.
Environment variables
Two Informix environment variables must be set before running the installation: CLIENT_LOCALE and DB_LOCALE. The client and database locale values must be the same, or their code sets must be convertible; ideally both are the same, and both must use the UTF8 code set. The code-set name and language name must both be lowercase when used in the JDBC call.
Enabling logging for Transactional support
For transactions to work with Informix, the database must be created with logging enabled. You can do this through dbaccess by using the menu options, or specify it in DDL:
CREATE DATABASE <database name> WITH LOG
Transaction-handling configuration
The following configuration settings are required for transaction-handling:
  • Row-level table locking: DEF_TABLE_LOCKMODE=ROW in the onconfig database-configuration file.
  • Row lock handling: SET LOCK MODE TO WAIT on the database connection definition.
  • Plan optimization: UPDATE STATISTICS HIGH on the database after some data has been loaded. This does not work on an empty database.
Required entries for LD_LIBRARY_PATH:
  1. Add the following required entries for LD_LIBRARY_PATH:
    • $INFORMIXDIR/lib
    • $INFORMIXDIR/lib/cli
    • $INFORMIXDIR/lib/esql
  2. Ensure that $INFORMIXDIR/bin is in the $PATH.
  3. After setting up the Informix database, you must update the ODBC configuration. For UNIX and Linux, see: http://publib.boulder.ibm.com/infocenter/idshelp/v117/topic/com.ibm.odbc.doc/ids_odbc_062.html. For all platforms: http://publib.boulder.ibm.com/infocenter/idshelp/v117/index.jsp?topic=/com.ibm.relnotes.doc/notes/csdk_370xc3/mach/odbc.html
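A sketch of this environment setup for a UNIX shell profile; /opt/IBM/informix and ol_informix1170 are the example values from above, not required paths:

```shell
# Sketch only: Informix environment setup described in steps 1-2 above.
# The install path and server name are the documentation's example values.
export INFORMIXSERVER=ol_informix1170
export INFORMIXDIR=/opt/IBM/informix
# Prepend the three required library directories, keeping any existing path.
export LD_LIBRARY_PATH="$INFORMIXDIR/lib:$INFORMIXDIR/lib/cli:$INFORMIXDIR/lib/esql${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"
# Ensure $INFORMIXDIR/bin is in $PATH (step 2).
export PATH="$INFORMIXDIR/bin:$PATH"
echo "$LD_LIBRARY_PATH"
```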
JDBC files
Make sure that $INFORMIXDIR/jdbc/lib exists before installing. Some Informix installations do not automatically create the JDBC files. These files must exist before installing:
C:\PROGRA~1\informix_1170\jdbc\lib>dir
Volume in drive C has no label.
Volume Serial Number is D477-2F8F

Directory of C:\PROGRA~1\informix_1170\jdbc\lib

18/07/2012  12:31          <DIR>     .
18/07/2012  12:31          <DIR>     ..
23/09/2011  20:01          816.096 ifxjdbc.jar
23/09/2011  19:58            44.939 ifxjdbcx.jar
23/09/2011  19:58        1.585.532 ifxlang.jar
23/09/2011  19:58          307.332 ifxlsupp.jar
23/09/2011  19:58          806.318 ifxsqlj.jar
23/09/2011  19:58            48.982 ifxtools.jar
              6 File(s)      3.609.199 bytes
              2 Dir(s)  1.093.832.704 bytes free 
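A hedged shell sketch of the check: it verifies that the six jar names from the listing above exist under a given directory, demonstrated against a throwaway directory so the sketch is self-contained:

```shell
# Sketch only: confirm the six JDBC jars from the listing above exist.
JARS="ifxjdbc.jar ifxjdbcx.jar ifxlang.jar ifxlsupp.jar ifxsqlj.jar ifxtools.jar"

check_jdbc_jars() {
  libdir="$1"
  missing=0
  for jar in $JARS; do
    [ -f "$libdir/$jar" ] || { echo "missing: $jar"; missing=1; }
  done
  [ "$missing" -eq 0 ] && echo "all JDBC jars present in $libdir"
}

# Demonstration against a temporary directory standing in for $INFORMIXDIR/jdbc/lib:
DEMO_DIR=$(mktemp -d)
for jar in $JARS; do touch "$DEMO_DIR/$jar"; done
check_jdbc_jars "$DEMO_DIR"
rm -r "$DEMO_DIR"
```

In practice you would call check_jdbc_jars "$INFORMIXDIR/jdbc/lib" instead of the demonstration directory.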
Do not segment or partition database tables that are used for UMF input transports
Database tables that are used for UMF input transports must not be segmented or partitioned. Segmenting or partitioning these transport tables will result in errors in the pipeline transport and lost data.
Known issue: Informix sometimes returns trailing spaces on data
Informix sometimes returns trailing spaces on data. If this occurs:
  1. Edit the onconfig.<server name> file, which is located in $INFORMIXDIR/etc.
  2. Set the following parameter:
    IFX_LEGACY_CONCAT 1
  3. Then restart the server.

Starting the Fix Pack 4 installation program

Complete the following steps to start the product installation program to install Fix Pack 4.
On Microsoft Windows:
You must copy the product installation file to a local drive. The product installation program will not run from either the installation media or from a network drive.
On AIX and Linux:
To enable the license-print function when running the installer in GUI mode, define your printer within the X Window System that you are running on the client machine.
To enable the license-print function when running the installer in command-line mode, set up a default print queue and printer on the machine on which you are installing.
  1. Obtain the IBM InfoSphere Identity Insight product software download package.
  2. Do one of the following steps:
    1. On Microsoft Windows: If obtaining a .tar file, extract the file to a temporary directory on a local drive of the target installation machine.
      Note: On Microsoft Windows, you must copy the product installation file to a local drive. The product installation program will not run from the installation media or from a network drive. Ensure that the .tar file is extracted with the directory structure intact.
    2. On AIX and Linux: If obtaining a .tar file, extract the file to a temporary directory on a local drive of the target installation machine.
      Note: Ensure that the .tar file is extracted with the directory structure intact. Ensure that the product installation file's parent directory structure of \Disk1\InstData\VM\ is retained if you copy the installation file to another location.
  3. Navigate to the /platform/Install/Disk1/InstData/VM/ directory, and run the installer program.
    1. To run the installer in GUI mode, double-click or launch the installer program.
    2. To run the installer in command line mode, from the command line, append -i console when executing the installer program.
      For example: prompt> isii_8.1.0.4_aix_pwr5.bin -i console
      Operating system platform          Installer file
      Microsoft Windows Server x86_64    isii_8.1.0.4_win_x64.exe
      Microsoft Windows 2012             Contact the IBM Support team to obtain the installer.
      IBM AIX                            isii_8.1.0.4_aix_pwr5.bin
      Linux x86_64                       isii_8.1.0.4_linux_x64.bin
      64-bit Linux on System z           isii_8.1.0.4_linux_s390.bin

Completing the Fix Pack 4 installation

For an upgrade installation, all product features will be installed, even if the previous installation did not have all features installed. Complete the following installation program panels to install Fix Pack 4.

  1. On the Introduction panel, review the screen.
  2. On the License Agreement - Software License Agreement panel, review the license agreement and select the Agree to Continue button.
  3. On the Destination - Choose Install Folder panel, type or browse to the directory (fully qualified path) where IBM InfoSphere Identity Insight is installed.
    Note: If browsing to an installation directory, you must click the Browse button, then browse to the directory one level above the install directory (create the new directory if needed). Then select the install directory and click the Open button.
  4. On the Product Features panel, you will see all updates and enhancements included in this fix pack.
  5. On the Database Information panel, review the installed database information on the screen. It should be pre-filled. Validate to ensure you are updating the right database. Note the database information. You will need it for a database update task after the product upgrade and installation is complete.
  6. On the Database Update panel, select the option to skip database population. You must run SUIT to upgrade your database after the installation or upgrade is complete. See Updating the database with SUIT.
  7. Review the Cognos Report panel.
  8. On the Pre-Installation Summary panel, review the summary. Click the Previous button if any changes are needed. Then click the Install button to start the product upgrade.
    Note: When installing on Solaris systems, you might see the No such file or directory message on the final window. You can safely ignore this warning message.

Verifying Fix Pack 4 installation

Verify that you have successfully upgraded to Fix Pack 4:
  1. Verify that the pipeline has been updated. Run the pipeline command without options and confirm that the output shows:
    ** pipeline v8.1.0.224
  2. Verify that Web services have been updated. Open a Web browser and enter the URL for your installation. For example, http://yourserver.domain.com:17110/easws/api/soap. You should see the following:
    8.1.0.224 - 20150505_2208
    - - - - - - - - - - - - - - - - - - - -
    SERVICE_NAME = EntityResolver
    WRAPPED_DOCUMENT_LITERAL = true
    RESOURCE_SUBDIRECTORY = soap
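The verification URL can be assembled as below; yourserver.domain.com and port 17110 are the example values from above, not fixed defaults for every installation:

```shell
# Sketch only: build the Web-services verification URL for your installation.
# Host and port follow the example above; substitute your own values.
EAS_HOST="yourserver.domain.com"
EAS_PORT=17110
EAS_URL="http://$EAS_HOST:$EAS_PORT/easws/api/soap"
echo "$EAS_URL"
# Open this URL in a browser (or fetch it with curl) and look for the
# 8.1.0.224 version banner shown above.
```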

Updating the database with SUIT

You must update the database that you are using with IBM Identity Insight after the product is installed or upgraded if you chose to skip the database population on the Database Update panel (recommended above). Use the Schema Upgrade and Installation Tool (SUIT) to update the database. You must update the following:
  • RELRES.NO_HIST
  • EAS-Console
  • CMEAdmin
You can use SUIT to make changes directly to the database with the -auto option, but it is usually preferable to direct the SUIT output to an SQL file that you can examine before running against the database.
  1. Update the database that you are using with Identity Insight by using SUIT.
    Table 1.
    Database Example Notes
    Informix
    suit -t informix -s dbHost -d
    dbName -u jdoe -n ol_informix
    RELRES.NO_HIST upgrade -auto
    Automatically upgrade RELRES.NO_HIST on IBM Informix host dbHost (default port 9088), database name dbName, server name ol_informix, user ID 'jdoe'. Will prompt for password.
    DB2
    suit -t db2 -s dbHost -o 50000 -u
    db2inst1 -d dbInstance
    RELRES.NO_HIST upgrade
    Print SQL to upgrade RELRES.NO_HIST on DB2 host dbHost, port 50000, instance dbInstance, user ID 'db2inst1'. Will prompt for password.
    Oracle
    suit -t oracle -s dbHost -u frank -d
    dbInstance RELRES.NO_HIST install
    Print SQL to install RELRES.NO_HIST on Oracle host dbHost, instance dbInstance, user ID 'frank'. Will prompt for password.
    SQL Server
    suit -t mssql2008 -s dbHost -d
    dbName -u jdoe RELRES.NO_HIST
    upgrade -auto
    Automatically upgrade RELRES.NO_HIST on SQL Server 2008 host dbHost (default port 1433), database name dbName, user ID 'jdoe'. Will prompt for password.
    DB2
    suit -t db2 -s dbHost -d dbName -u
    jdoe -n ol_db2 RELRES.NO_HIST upgrade
    > 8.1_FP4_Upgrade_RELRES.NO_HIST.sql
    Save the SQL update to a file (8.1_FP4_Upgrade_RELRES.NO_HIST.sql) to be run after examining it.

    See SUIT: Schema Upgrade and Installation Tool version 8.1.0.4 for a list of SUIT commands and options.

Graph server initialization parameters

Server administrators can set graph server initialization parameters in the graph.properties file to improve performance. These parameters are:
getEntityDetailMultiple_chunkSize=nn
10 is the default. 1 is the minimum.
getEntityDetailMultiple_usePreFetching=true/false
True is the default and is recommended.
getEntityRelationshipsMultiple_chunksize=nn
10 is the default. 1 is the minimum.
getEntityRelationshipsMultiple_usePreFetching=true/false
True is the default and is recommended.
getEntityRelationshipsMultiple_limit=nnnnn
10000 is the default. 1 is the minimum.
To set these parameters:
  1. Open the graph.properties file, located in the srd-home/graphs directory.
  2. Look for and edit the line that contains one of the above parameters or add a line if the one you need is not already there.
  3. Save the file.
  4. Restart the graph server. Changes will not take effect until the graph server is restarted.
The default value is used for any absent or unspecified parameters.
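For reference, a graph.properties fragment that states each parameter at its documented default:

```properties
# Documented defaults for the graph server initialization parameters.
getEntityDetailMultiple_chunkSize=10
getEntityDetailMultiple_usePreFetching=true
getEntityRelationshipsMultiple_chunksize=10
getEntityRelationshipsMultiple_usePreFetching=true
getEntityRelationshipsMultiple_limit=10000
```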

chunkSize parameters

The purpose of getEntityDetailMultiple_chunkSize and getDirectEntityRelationshipsMultiple_chunkSize is to avoid long-running queries that consume large amounts of SOA server memory and pipeline memory in a single SOA call. The larger the values used for these two parameters, the fewer total SOA calls are made to the pipeline, and the faster the total amount of data for the graph is retrieved. However, larger chunk sizes require more memory for each SOA call.

For example, if there are 150 entities in a given level of the graph, then instead of making a single call to getEntityDetailMultiple for all 150 entities at once, the graph server makes 15 calls (150 / getEntityDetailMultiple_chunkSize, with chunkSize=10).

This parameter can be adjusted based on your data and hardware to obtain an optimal balance. This is also true for getDirectEntityRelationshipsMultiple_chunkSize.
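The call-count arithmetic above (rounding up to whole chunks) can be sketched as:

```shell
# Sketch only: number of SOA calls for the example above:
# 150 entities with chunkSize=10 yield 15 getEntityDetailMultiple calls.
ENTITIES=150
CHUNK_SIZE=10
# Ceiling division: partial chunks still require a call.
CALLS=$(( (ENTITIES + CHUNK_SIZE - 1) / CHUNK_SIZE ))
echo "$CALLS"   # 15 for this example
```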

PreFetching parameters

Once the graph has been displayed, the pre-fetching feature will continue to gather information about the entities on the graph. This additional information makes the current graph operate more efficiently.

The purpose of getEntityDetailMultiple_usePreFetching and getDirectEntityRelationshipsMultiple_usePreFetching is to throttle back or turn off prefetching.

You can control prefetching as follows:
Prefetching completely off
Set getEntityDetailMultiple_usePreFetching=false and getDirectEntityRelationshipsMultiple_usePreFetching=false.
Prefetching gathers relationships only
Set getEntityDetailMultiple_usePreFetching=false and getDirectEntityRelationshipsMultiple_usePreFetching=true.
Prefetching gathers entity data only
Set getEntityDetailMultiple_usePreFetching=true and getDirectEntityRelationshipsMultiple_usePreFetching=false.

Parameter getDirectEntityRelationshipsMultiple_limit

The graph attempts to identify which entities on the graph are related to each other beyond the relationship to the primary entity. This is a time-consuming operation. The graph attempts to find these relationships for the first n entities, where n = getDirectEntityRelationshipsMultiple_limit. Setting this value to 1 essentially turns the feature off: graph performance goes up, but some entities are not identified as related on the graph.

For example, E1 → E2, E1 → E3, and E1 → E4. The graph always shows these relationships correctly. However, E2, E3, and E4 may be related to each other. Set getDirectEntityRelationshipsMultiple_limit=1 to prevent showing these secondary relationships.

The blue rectangle on each entity now identifies the number of relationships not displayed on the graph regardless of whether the related entity is displayed on the graph.

The following related improvements are part of this new function:
Locking granularity has been fine-tuned
Locking granularity has been fine-tuned to allow better interleaving between the server-side thread that handles user commands from the browser and the server-side prefetching thread.
Five new graph server initialization parameters have been added
Server administrators can set additional graph server initialization parameters in the graph.properties file to improve performance.
Known behaviors: getDirectEntityRelationshipsMultiple() and getDirectEntityRelationships() return null in both of the following scenarios:
  • Entity exists but has no relationships.
  • Entity does not exist.

Known issues and changes when using the product

Be aware of the following considerations and known issues related to using the product:

Match/Merge processing and From/Thru dates
If you have a specific requirement for From/Thru dates to be taken into account during Match/Merge processing, explicitly set the DATERANGETHRESHOLD value to '0' in the pipeline.ini file as part of this fix pack upgrade. This ensures that From/Thru dates continue to be honored. Note that '0' is not the recommended value. Contact IBM Support for further information and guidance.

If you have previously set [MM] DATERANGETHRESHOLD = -1 in your configuration, your system will now correctly ignore the From/Thru dates during Match/Merge processing.
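As a sketch, the pipeline.ini entry for continuing to honor From/Thru dates would look like this (a non-default setting; confirm with IBM Support before relying on it):

```ini
[MM]
DATERANGETHRESHOLD = 0
```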

Do not specify ODBC Isolation level=1 (uncommitted Read) on any ODBC client
Do not specify ODBC Isolation level=1 (Uncommitted Read) on any ODBC client. This is particularly important when using a multi-threaded pipeline or multiple pipelines. Doing so can cause data corruption or unexpected pipeline shutdown. Example pipeline output:
08/07 14:49:38 [pipeline:503380304]
 CRIT: CRITICAL ERROR:
08/07 14:49:38 [pipeline:503380304]
 CRIT: {
08/07 14:49:38 [pipeline:503380304]
 CRIT:  Requested resolve config for an invalid entity type: 0
08/07 14:49:38 [pipeline:503380304]
 CRIT:  Check logs or UMF_EXCEPT table for more information.
08/07 14:49:38 [pipeline:503380304]
 CRIT: }  
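As an illustration for DB2 clients, the isolation level can be pinned to Read Committed in db2cli.ini (the data source name below is a placeholder; in the DB2 CLI/ODBC driver, TXNISOLATION=1 corresponds to Uncommitted Read and must be avoided):

```ini
[IIDB]
; 2 = Read Committed (Cursor Stability); never use 1 (Uncommitted Read)
TXNISOLATION=2
```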
MERGE_ID and NUM_MERGED tags not used or required
The <MERGE_ID> and <NUM_MERGED> tags are not used or required by the pipeline; if present, they are ignored. Only a single <MERGE> segment can be present in a UMF input document.
ILOG graph generates error for custom number types
The Attribute Alert section of the ILOG graphing tool can generate an error when displaying an entity that contains a custom number type.

If new Number-Types or Attribute-Types are added to the system via the Configuration Console, you must restart the pipelines and the Graph server within eWAS (or the entire eWAS) before the ILOG graph can display these new types correctly.

Viewing ILOG graphs
Do not use the browser back arrow when viewing ILOG graphs.

An unresponsive script message can appear when viewing graphs in Firefox or other browsers. Select the "Don't show message" check box and click the "Continue running script" option to prevent subsequent messages and issues when loading graphs in the browser.

MATCH_ID column on the SEP_CONFLICT_REL table
The MATCH_ID column on the SEP_CONFLICT_REL table is currently defined as NUMBER(10). This can artificially limit the size of values that can be stored in this table and cause SQL insert failures and pipeline shutdown with large data volumes. To allow larger Match-ID numbers to be stored, the MATCH_ID column type can be safely altered to maximum precision / BIGINT.
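For example, on DB2 the column could be widened as follows (verify the exact syntax for your database platform and version before running it):

```sql
-- DB2 syntax; on Oracle, an equivalent would be
--   ALTER TABLE SEP_CONFLICT_REL MODIFY (MATCH_ID NUMBER(38));
ALTER TABLE SEP_CONFLICT_REL ALTER COLUMN MATCH_ID SET DATA TYPE BIGINT;
```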
Handling names that include single-name aliases (Main Name 'a/k/a' Alias Name)
To handle names that include single-name aliases (Main Name 'a/k/a' Alias Name) do the following:
  1. The 'Main Name' can be processed using the standard 'M' <NAME_TYPE> tag, in either an unparsed (<FULL_NAME>) or parsed (<FIRST_NAME>, <LAST_NAME>) format.
  2. An alias name can be encoded as follows:
    <NAME_TYPE>A</NAME_TYPE>
    <DSRC_CODE> and <DSRC_ACCT> = same as the Main Name record
    The alias name is passed in via the <FULL_NAME> tag, with DQM Rule #289 (Alternate Parse) enabled for the NAME segment.

    This approach enables the alias name to be encoded in a way that maximizes the possibility of name-matching. This is particularly useful if the alias is a single-token name.
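Putting the steps above together, an alias record might be encoded like this (a hypothetical sketch: the DSRC values and alias name shown are placeholders, and the surrounding UMF document structure must match your schema):

```xml
<DSRC_CODE>CUST</DSRC_CODE>   <!-- placeholder; same as the Main Name record -->
<DSRC_ACCT>10001</DSRC_ACCT>  <!-- placeholder; same as the Main Name record -->
<NAME>
  <NAME_TYPE>A</NAME_TYPE>
  <FULL_NAME>Madonna</FULL_NAME>  <!-- single-token alias -->
</NAME>
```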

Configuration utility (eacfg) known behavior
In Solaris and AIX environments, when running the configuration utility (eacfg) with LOCALE set to 'en_UTF8', a secondary window might appear that steals focus from the main window. This does not affect the operation of the tool and can be worked around by temporarily setting LOCALE='C' before launching the tool.
DateCriterion object in SOAP or REST service requests
When using the DateCriterion object in SOAP or REST service requests, you should be aware of how date values are stored in the database (ATTRIBUTE.ATTR_VALUE). For example, if a particular date attribute is stored without a timestamp (i.e. date only), it is possible that matches will not be found if values are submitted with timestamps. If you are using a SOAP client that forces a timestamp value and you are searching for DOB, you can supply a 00:00:00 timestamp and use a withDays value of 1. This will have the side-effect of forcing the search to use EQ/find-by attribute whenever a DOB is supplied.

If your SOAP client allows you to specify a Date with no timestamp, then make sure the adjustment is in place when searching for dates without timestamps. If you are creating a SOAP client with Java or other programming language, it is possible to format the date in your client code to be date only. Use standard date-formatting functions supplied by the language provider.
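For example, a minimal Java sketch of date-only formatting using the standard java.text.SimpleDateFormat class (the "yyyy-MM-dd" pattern is an assumption; match it to how dates are actually stored in ATTRIBUTE.ATTR_VALUE):

```java
import java.text.SimpleDateFormat;
import java.util.Date;

public class DateOnlyFormat {
    // Format a Date as date-only (no timestamp) before placing it in a
    // DateCriterion, so it matches attributes stored without a timestamp.
    public static String dateOnly(Date d) {
        return new SimpleDateFormat("yyyy-MM-dd").format(d);
    }

    public static void main(String[] args) throws Exception {
        Date withTime = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss")
                .parse("1975-03-14 09:30:00");
        System.out.println(dateOnly(withTime)); // prints 1975-03-14
    }
}
```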

UMF pipeline processing
Order matters for From and Through dates in pipeline processing. When sending data that has From and Through dates, values that are not in the proper order can create problems.
Israel to use non-US address parsing rules
To get Israeli based systems to use non-US address parsing rules, you must add a row to the AP_RULE_SET table.
When an address has a country, it is looked up in the DQM_COUNTRY table to find the two-letter country code. The two-letter country code is then looked up in the AP_RULE_SET table. If it does not exist, the RULE_SET_ID=1 rules are applied, which are the US rules. When the country code is found, the BASE_RULE_SET_ID column is examined. If BASE_RULE_SET_ID = 0, the row is a native set of rules and that rule set is used. If BASE_RULE_SET_ID > 0, the rule set ID in that column is used. In this case, the German rules are used because they match the Israeli format.
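A hypothetical sketch of the extra row follows. Only RULE_SET_ID and BASE_RULE_SET_ID are named in this note; the country-code column name and the numeric IDs shown are placeholders that must be confirmed against your AP_RULE_SET schema (use the actual rule set ID of the German rules):

```sql
-- Hypothetical: map country code 'IL' to the German base rule set
INSERT INTO AP_RULE_SET (RULE_SET_ID, COUNTRY_CODE, BASE_RULE_SET_ID)
VALUES (99, 'IL', 2);
```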
In the Cognos Alert Summary, after clicking on Alert ID or Entity ID in the role alert detail, an hour glass may continue to display after loading an entity report
Due to a Cognos Server issue on affected systems, the following steps reproduce the issue:
  1. Select any role alert in the Alert Summary window.
  2. Click an Alert ID or Entity ID in the Role Alert detail.
Result: The hourglass keeps displaying even after the entity is successfully loaded.
Workaround: To overcome this issue, click the refresh button or any Alert count in the Alert Summary.

To see the latest information about known problems and issues

Known problems are documented in the form of individual technotes in the Support portal at http://www.ibm.com/support/entry/portal/Software/Information_Management/InfoSphere_Identity_Insight:
  1. Use the Search Support feature: in the "Enter terms, error code or APAR #" field, enter a keyword, phrase, error code, or APAR number to search on.
  2. Select Solve a problem.
  3. Click Search.

As problems are discovered and resolved, the IBM Support team updates the Support portal. By searching the Support portal, you can quickly find solutions to problems.

At time of publication, there were no known installation problems. Check the Support portal for the most current information.

System requirements updates

For the latest information about hardware and software compatibility, see the detailed system requirements document at http://www.ibm.com/support/entry/portal/Software/Information_Management/InfoSphere_Identity_Insight.

SUIT: Schema Upgrade and Installation Tool version 8.1.0.4

SUIT is a Schema Upgrade and Installation Tool that comes with IBM Identity Insight.

When installing to an Informix database, the Schema Upgrade and Installation Tool (SUIT) must be run with the following command line parameter: -mbi 1. This disables statement "batching" and is required for Informix versions 11.50 and 11.70. If this is not done, the SUIT operation fails with the error "maximum statement length exceeded" and aborts.

SUIT is used to update the schemas for other databases as part of the installation program and is located in II installer\suit\sql.

suit [-t <dbtype>] -s <server> [-u <uid>] [-p <pwd>] -d <database> [-n <servername>]
             [-o <port>] [-c <schema>] [-x <xmlhome>] [-l <xslhome>]
             [-tokenProperties <tokenprops>]
             [-enc <encoding>] <product> <action> [-auto] [-v]

<dbtype> is one of:
    oracle: Oracle  (Default on Solaris and Linux)
    mssql2008:  Microsoft SQL Server 2008 (Default on Windows)
    db2:    IBM DB2 (Default on AIX)
    informix:   IBM Informix

<server>      is the database Host Name (or IP address)

<uid>         is the user ID or Oracle schema name
<pwd>         is the password (if unspecified the user is prompted for it)
<database>    is the instance (Oracle) or database (DB2, SQL Server)
<servername>  is the name of the informix server (Informix)
<port>        is the port number the database listens on. The default is
              used if unspecified.
<schema>      is the name of the DB2 schema to use. Default schema is
              used if unspecified. (DB2 only)

<xmlhome>     is the path to the directory where the product XML files reside
              (default is "./xml")

<xslhome>     is the path to the directory where the suit XSL files reside
              (default is "xslt")

<tokenprops>  is the path to a properties file containing token keys
              and string replacement values.

<encoding>    is the name of the output character encoding used for all text
              output (stdout and stderr). If unspecified, the JRE default is used.
              Supported encodings include UTF-8, UTF-16, UTF-16LE, UTF-16BE,
              ISO-8859-1 (ISO Latin-1), cp1252 (Windows Latin-1). Other encodings
              may be supported by your Java environment, see:
              http://www.iana.org/assignments/character-sets

<product>    is one of:
    NO PRODUCTS FOUND
    (access to different products may be possible via the "-x <xmlhome>" option)


<action>    is one of:
    install [-auto]
       Prints the SQL for creating a new schema for this
       product. Using -auto will automatically create the schema.

    upgrade [-auto]
       Prints the SQL for upgrading the schema for this product
       to the current version from the specified version.
       Using -auto will automatically upgrade the schema (not
       recommended for most systems).

    verify [-v]
       Examines the current schema against the master schema, and
       prints information about any missing objects.  Specifying
       -v will cause information about extra objects to be generated as well.
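For example, a hypothetical verify run against DB2, and an Informix install with the required -mbi 1 parameter, might look like this (host, user, database, schema, and server names are placeholders, and the product name depends on your installation):

```
suit -t db2 -s dbhost -u db2user -d iidb -c IISCHEMA <product> verify -v
suit -t informix -s dbhost -u informix -d iidb -n ifxserver -mbi 1 <product> install
```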

Licensed Materials - Property of IBM. Copyright IBM Corporation 2003, 2015. All Rights Reserved. US Government Users Restricted Rights - Use, duplication or disclosure restricted by GSA ADP Schedule Contract with IBM Corporation. IBM and the IBM logo are registered trademarks of International Business Machines Corporation in the United States, other countries, or both.