Readme file for IBM Watson® Machine Learning Accelerator Interim Fix 527174  

 

Readme file for: IBM Watson® Machine Learning Accelerator
Product/Component Release: 1.2.1
Fix ID: dli-1.2.3-build527174-wmla

Publication date: October 7, 2019

Last modified date: January 21, 2020

 

Interim fix 527174 includes new features and resolves existing issues in IBM Watson Machine Learning Accelerator 1.2.1, including:

·        Upgrades IBM Spectrum Conductor from 2.3.0 to 2.4.0.

·        Adds a security fix to enable IBM Spectrum Conductor 2.4.0 to work with IBM Spectrum Conductor Deep Learning Impact 1.2.3.

·        Fixes a hyperparameter issue when finding the best hyperparameters during model creation.

·        Fixes a dataset creation issue for CSV files with split data.

·        Updates the default Python version from Python 2 to Python 3.

·        Updates the Spark 2.x packages for an IBM Spectrum Conductor 2.4.0 cluster.

·        Improves the accuracy of elastic distributed training.

·        Enhances the train function in elastic distributed training to include the following options: validation_freq, checkpoint_freq, and effective_batch_size.

 

 

IMPORTANT: Interim fix 536919 supports a direct upgrade from IBM Spectrum Conductor 2.3.0 to 2.4.1. To upgrade directly from IBM Spectrum Conductor 2.3.0 to 2.4.1, apply interim fix 536919.

 

NOTE: After you apply this interim fix, any existing Spark instance group that did not explicitly set a Python version (using the PYTHON_VERSION environment variable) is updated to use Python 3.6. For any Spark instance group where the Python version was explicitly set, the Python version remains unchanged. For new Spark instance groups, the default Python version is 3.6. To change the Python version of a new Spark instance group, you must configure and set the PYTHON_VERSION environment variable.

 

Contents

1.     Download location 

2.     Products or components affected

3.     Installation and configuration

4.     Uninstallation

5.     List of files

6.     Product notifications 

7.     Copyright and trademark information

1.   Download location

Download interim fix 527174 (dli-1.2.3-build527174-wmla) from the following location: http://www.ibm.com/eserver/support/fixes/.

 

2.   Products or components affected

Component names: dlpd, fabric, dl_plugins, plc, WEBGUI, REST, ascd, elasticsearch

Platforms: Linux x86_64, Linux ppc64le

Fix ID: dli-1.2.3-build527174-wmla

 

3.   Installation and configuration

 

3.1 Before installation

Before installing the interim fix, complete the following steps to prepare your environment and perform the rolling upgrade to IBM Spectrum Conductor 2.4.0.

 

1.      Log on to the master host as the cluster administrator (CLUSTERADMIN) and source the environment for your shell:

For sh, ksh or bash:

$ . $EGO_TOP/profile.platform

$ export DLI_SHARED_FS=$DLI_SHARED_FS

$ export CLUSTERADMIN=$CLUSTERADMIN

$ export DLI_USER=${CLUSTERADMIN}

$ export DLI_USER_GROUP=$( id -ng "${DLI_USER}" 2>&1 )

$ export DLI_CONDA_HOME=$DLI_CONDA_HOME

$ export EGO_CONFDIR=$EGO_TOP/kernel/conf                                

$ export WLP=$EGO_TOP/wlp/usr/servers/gui/apps/dli/1.2.3/

where EGO_TOP is the IBM Spectrum Conductor Deep Learning Impact installation path, and DLI_SHARED_FS, CLUSTERADMIN, and DLI_CONDA_HOME must be set to the same values used for the IBM Spectrum Conductor Deep Learning Impact installation.

 

For csh or tcsh:

$ source $EGO_TOP/cshrc.platform

$ setenv DLI_SHARED_FS $DLI_SHARED_FS

$ setenv CLUSTERADMIN $CLUSTERADMIN

$ setenv DLI_USER ${CLUSTERADMIN}

$ setenv DLI_USER_GROUP `id -ng "${DLI_USER}"`

$ setenv DLI_CONDA_HOME $DLI_CONDA_HOME

$ setenv EGO_CONFDIR $EGO_TOP/kernel/conf

$ setenv WLP $EGO_TOP/wlp/usr/servers/gui/apps/dli/1.2.3/

where EGO_TOP is the IBM Spectrum Conductor Deep Learning Impact installation path, and DLI_SHARED_FS, CLUSTERADMIN, and DLI_CONDA_HOME must be set to the same values used for the IBM Spectrum Conductor Deep Learning Impact installation.
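For example, a minimal sketch for a bash shell, assuming a default installation under /opt/ibm/spectrumcomputing, a cluster administrator named egoadmin, a shared directory of /dli_shared_fs, and a conda installation under /opt/anaconda3 (these values are placeholders; substitute the values from your own installation):

$ . /opt/ibm/spectrumcomputing/profile.platform

$ export DLI_SHARED_FS=/dli_shared_fs

$ export CLUSTERADMIN=egoadmin

$ export DLI_USER=${CLUSTERADMIN}

$ export DLI_USER_GROUP=$( id -ng "${DLI_USER}" 2>&1 )

$ export DLI_CONDA_HOME=/opt/anaconda3

$ export EGO_CONFDIR=/opt/ibm/spectrumcomputing/kernel/conf

$ export WLP=/opt/ibm/spectrumcomputing/wlp/usr/servers/gui/apps/dli/1.2.3/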

 

2.      Create a backup directory and back up the following files and directories (for example, see the sketch after this list):

$WLP/dlgui

$EGO_TOP/dli/1.2.3/dlinsights

$DLI_SHARED_FS/tools/dataset/main.py

$DLI_SHARED_FS/tools/dl_plugins/

$DLI_SHARED_FS/tools/tune/

$DLI_SHARED_FS/fabric/

$DLI_SHARED_FS/tools/inference/edi.py

      $DLI_SHARED_FS/tools/inference/classify.py
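For example, a minimal sketch that backs up the listed files and directories to a hypothetical /dlifixes_backup directory (choose any backup location that suits your environment), using the environment sourced in step 1:

$ export BACKUP_DIR=/dlifixes_backup

$ mkdir -p $BACKUP_DIR

$ cp -pr $WLP/dlgui $BACKUP_DIR/dlgui

$ cp -pr $EGO_TOP/dli/1.2.3/dlinsights $BACKUP_DIR/dlinsights

$ cp -p $DLI_SHARED_FS/tools/dataset/main.py $BACKUP_DIR/

$ cp -pr $DLI_SHARED_FS/tools/dl_plugins $BACKUP_DIR/dl_plugins

$ cp -pr $DLI_SHARED_FS/tools/tune $BACKUP_DIR/tune

$ cp -pr $DLI_SHARED_FS/fabric $BACKUP_DIR/fabric

$ cp -p $DLI_SHARED_FS/tools/inference/edi.py $BACKUP_DIR/

$ cp -p $DLI_SHARED_FS/tools/inference/classify.py $BACKUP_DIR/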

      

3.      Upgrade from IBM Spectrum Conductor 2.3.0 to version 2.4.0. To complete the upgrade, follow the upgrade documentation for IBM Spectrum Conductor 2.4.0 in the IBM Knowledge Center, using the conductor2.4-upgrade-wmla-x86_64.bin package for x86_64 and the conductor2.4-upgrade-wmla-ppc64le.bin package for ppc64le.

IMPORTANT: After upgrading to IBM Spectrum Conductor 2.4.0, the following limitations exist:

·        Existing instance groups (previously known as Spark instance groups) and Spark versions are available as part of the upgrade; however, not all Spark versions continue to be supported in IBM Spectrum Conductor 2.4.0.

·        An instance group template that uses the latest Spark version is not available for IBM Spectrum Conductor Deep Learning Impact. An instance group must be created manually.

·        A rollback to the previous version of IBM Spectrum Conductor can be performed only before the Struts security fix is applied.

 

3.2 Installation steps 

After successfully upgrading to IBM Spectrum Conductor 2.4.0, apply the interim fix by completing the following steps.

 

1.      Run the following command to log in: 

$ egosh user logon -u user_name -x password

where user_name and password are your login credentials. For example:

$ egosh user logon -u Admin -x Admin

2.      Run the following command to stop services: 

$ egosh service stop dlinsights-monitor dlinsights-optimizer WEBGUI ascd dlpd elk-shipper elk-manager elk-indexer elk-elasticsearch elk-elasticsearch-master elk-elasticsearch-data

 

3.      On each host, download the fix packages to a directory. For example, packages can be downloaded to the /dlifixes directory, as shown in the sketch below.
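A minimal sketch, assuming the three x86_64 fix packages were already downloaded to /tmp on the host (use the ppc64le packages on ppc64le hosts):

$ mkdir -p /dlifixes

$ mv /tmp/dlimgmt-1.2.3.0_x86_64_build527174.tar.gz /dlifixes/

$ mv /tmp/dlicore-1.2.3.0_x86_64_build527174.tar.gz /dlifixes/

$ mv /tmp/dli-1.2.3.0_build527174_share.tar.gz /dlifixes/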

 

4.      On each host, run the egoinstallfixes command to install the fix packages.

 

For Linux x86_64:

a.      Log in as the root user.

$ su root

 

b.      As the root user, run the following commands:

$ chown -R $DLI_USER:$DLI_USER_GROUP $EGO_TOP/wlp/usr/servers/dlrest

$ chmod o+r /dlifixes/dlimgmt-1.2.3.0_x86_64_build527174.tar.gz

$ chmod o+r /dlifixes/dlicore-1.2.3.0_x86_64_build527174.tar.gz

$ chmod o+r /dlifixes/dli-1.2.3.0_build527174_share.tar.gz

 

c.      Log in as the cluster administrator.

$ su $CLUSTERADMIN

 

d.      As the cluster administrator, run the following commands:

$ egoinstallfixes /dlifixes/dlimgmt-1.2.3.0_x86_64_build527174.tar.gz

$ egoinstallfixes /dlifixes/dlicore-1.2.3.0_x86_64_build527174.tar.gz

$ tar zoxf /dlifixes/dli-1.2.3.0_build527174_share.tar.gz -C $DLI_SHARED_FS

$ chmod 775 $DLI_SHARED_FS/tools/dataset/main.py

$ chmod 755 $DLI_SHARED_FS/tools/inference/classify.py

$ chmod 755 $DLI_SHARED_FS/tools/inference/edi.py

$ chmod 775 $DLI_SHARED_FS/conf/spark-env.sh

$ chmod -R 755 $DLI_SHARED_FS/tools/dl_plugins

$ chmod -R 755 $DLI_SHARED_FS/tools/tune

$ chmod -R 775 $DLI_SHARED_FS/fabric

 

For Linux ppc64le:

a.      Log in as the root user.

$ su root

 

b.      As the root user, run the following commands:

$ chown -R $DLI_USER:$DLI_USER_GROUP $EGO_TOP/wlp/usr/servers/dlrest

$ chmod o+r /dlifixes/dlicore-1.2.3.0_ppc64le_build527174.tar.gz

$ chmod o+r /dlifixes/dlimgmt-1.2.3.0_ppc64le_build527174.tar.gz

$ chmod o+r /dlifixes/dli-1.2.3.0_build527174_share.tar.gz

$ yum install -y ed

 

c.      Log in as the cluster administrator.

$ su $CLUSTERADMIN

 

d.      As the cluster administrator, run the following commands:

$ egoinstallfixes /dlifixes/dlimgmt-1.2.3.0_ppc64le_build527174.tar.gz

$ egoinstallfixes /dlifixes/dlicore-1.2.3.0_ppc64le_build527174.tar.gz

$ tar zoxf /dlifixes/dli-1.2.3.0_build527174_share.tar.gz -C $DLI_SHARED_FS

$ chmod 775 $DLI_SHARED_FS/tools/dataset/main.py

$ chmod 755 $DLI_SHARED_FS/tools/inference/classify.py

$ chmod 755 $DLI_SHARED_FS/tools/inference/edi.py

$ chmod 775 $DLI_SHARED_FS/conf/spark-env.sh

$ chmod -R 755 $DLI_SHARED_FS/tools/dl_plugins

$ chmod -R 755 $DLI_SHARED_FS/tools/tune

$ chmod -R 775 $DLI_SHARED_FS/fabric

NOTE: Running the egoinstallfixes command automatically backs up the current binary files to a fix backup directory for recovery purposes. Do not delete this backup directory; you will need it to recover the original files. For more information about using this command, see the egoinstallfixes command reference.

 

5.       On each management host, verify the installation by running the pversions command.

$ pversions -b 527174

 

6.      Manually update some configuration files in IBM Spectrum Conductor Deep Learning Impact. All of these steps must be completed on the management host.

NOTE: If your cluster uses a local installation, you must also complete step 1 in section 3.1 and steps a, c, e, f, and g in section 3.2.6 on each candidate master host, and step 1 in section 3.1 and steps e and g in section 3.2.6 on each compute host.

a.      Delete struts configuration files.

$ rm $WLP/dlgui/WEB-INF/classes/com/platform/gui/cws_dl/common/SparkBaseListAction.class 

$ rm $WLP/dlgui/WEB-INF/classes/com/platform/gui/cws_dl/dl/action/beans.xml 

$ rm $WLP/dlgui/WEB-INF/classes/com/platform/gui/cws_dl/dl/action/struts.xml 

$ rm $WLP/dlgui/WEB-INF/classes/struts.xml 

$ rm $WLP/dlgui/WEB-INF/struts-config.xml 

$ rm $WLP/dlgui/WEB-INF/classes/spring-cs.xml

 

b.      Update the cluster management console.

$ webapp_dlgui='<include optional="true" location="${env.EGO_CONFDIR}/../../gui/conf/webapp/webapp_dlgui.xml"/>'

$ webapp_dlguiv5='<include optional="true" location="${env.EGO_CONFDIR}/../../gui/conf/webapp/webapp_dlguiv5.xml"/>'

$ sed -i '/dlgui/d' $EGO_CONFDIR/../../gui/conf/webapp/server_internal.xml

$ sed -i "13a $webapp_dlgui" $EGO_CONFDIR/../../gui/conf/webapp/server_internal.xml

$ sed -i "14a $webapp_dlguiv5" $EGO_CONFDIR/../../gui/conf/webapp/server_internal.xml

$ sed -i "s#@EDI_HELP@#<Help role=\"\" display=\"Elastic Distributed Inference RESTful API\" url=\"/dlgui/apidocs/edi/index.html\" width=\"1280\" height=\"720\" />#" $EGO_CONFDIR/../../gui/conf/help/pmc_DLI_help.xml

$ sed -i 's#urlPrefix="true"/>#/>#g' $EGO_CONFDIR/../../gui/conf/help/pmc_DLI_help.xml

$ sed -i 's#gotoAbout.do#gotoAbout.controller#g' $EGO_CONFDIR/../../gui/conf/help/pmc_DLI_help.xml

 

c.      Update framework and widget.

$ mv $WLP/dlgui/framework $WLP/dlgui/framework_bak

$ mv $WLP/dlgui/widget $WLP/dlgui/widget_bak

$ cp -pr $WLP/../../conductor/2.4.0/conductorgui/framework $WLP/dlgui

$ cp -pr $WLP/../../conductor/2.4.0/conductorgui/widget $WLP/dlgui

 

d.      Update model templates.

$ sed -i "s#_DLI_SHARED_CONF_#$DLI_SHARED_FS/conf#g"  $EGO_TOP/conductorspark/work/templates/dli_sig_template.yml

$ sed -i "s#_DLI_SHARED_CONF_#$DLI_SHARED_FS/conf#g"  $EGO_TOP/conductorspark/work/templates/wmla_ig_edt_template.yml

$ sed -i "s#_DLI_SHARED_CONF_#$DLI_SHARED_FS/conf#g"  $EGO_TOP/conductorspark/work/templates/wmla_ig_template.yml

$ sed -i "s#_DLI_SHARED_FS_#$DLI_SHARED_FS#g"  $EGO_TOP/conductorspark/work/templates/wmla_ig_template.yml

$ sed -i "/sparkparameters/a\  SPARK_EGO_RECLAIM_GRACE_PERIOD: '200'" $EGO_TOP/conductorspark/work/templates/dli_sig_template.yml

$ sed -i "s#SPARK_EGO_RECLAIM_GRACE_PERIOD: '120'#SPARK_EGO_RECLAIM_GRACE_PERIOD: '200'#g" $EGO_TOP/conductorspark/work/templates/wmla_ig_edt_template.yml

 

e.      As the root user, reinstall dlinsights to use Python 3 on all management and compute hosts in the cluster:

$ . ${DLI_CONDA_HOME}/etc/profile.d/conda.sh

$ conda remove --name dlinsights --all --yes

$ conda create --name dlinsights --yes pip python=3.6

$ conda activate dlinsights

$ conda install --yes numpy==1.12.1

$ conda install --yes pyopenssl==18.0.0

$ conda install --yes flask==1.0.2 Flask-Cors==3.0.3 scipy==1.0.1 SQLAlchemy==1.1.13 requests==2.21 alembic==1.0.5

$ pip install --no-cache-dir elasticsearch==5.2.0 Flask-Script==2.0.5 Flask-HTTPAuth==3.2.2 mongoengine==0.11.0 

$ conda deactivate

 

f.       Update dlinsights.

$ sed -i 's#enabled\"#enabled:\"#g' $EGO_TOP/dli/1.2.3/dlinsights/bin/start_dlinsights_service.sh

 

g.      Update ELK.

$ DLELK_HOME=$EGO_TOP/dli/1.2.3/dlinsights/elk-patch

$ ELK_HOME=$EGO_CONFDIR/../../integration/elk

$ /usr/bin/cp -f $DLELK_HOME/template/ibm-dlinsights.json $ELK_HOME/init/template/

$ /usr/bin/cp -f $DLELK_HOME/template/ibm-dlinsights-spark-executor-work.json $ELK_HOME/init/template/

$ /usr/bin/cp -f $DLELK_HOME/template/ibm-dlinsights-spark-driver-work.json $ELK_HOME/init/template/

$ /usr/bin/cp -f $DLELK_HOME/template/ibm-dlinsights-batch-job-metrics.json $ELK_HOME/init/template/

$ /usr/bin/cp -f $DLELK_HOME/template/ibm-dlinsights-batch-job-resource-usage.json $ELK_HOME/init/template/

$ chown -R $DLI_USER:$DLI_USER_GROUP $ELK_HOME/init/template/ibm-dlinsights-*.json

$ /usr/bin/cp -f $DLELK_HOME/conf/grok-pattern/dlinsights-pattern $ELK_HOME/conf/grok-pattern/

$ chown -R $DLI_USER:$DLI_USER_GROUP $ELK_HOME/conf/grok-pattern/dlinsights-pattern

$ /usr/bin/cp -f $DLELK_HOME/conf/indexer/dlinsights_logstash_*.conf $ELK_HOME/conf/indexer/

$ chown -R $DLI_USER:$DLI_USER_GROUP ${ELK_HOME}/conf/indexer/dlinsights_logstash_*.conf

$ /usr/bin/cp -f $DLELK_HOME/conf/indexcleanup/*.conf $ELK_HOME/conf/indexcleanup/

$ chown -R $DLI_USER:$DLI_USER_GROUP $ELK_HOME/conf/indexcleanup/*.conf

$ /usr/bin/cp -f $DLELK_HOME/conf/shipper/dlinsights_shipper.yml $ELK_HOME/conf/shipper/

$ chown -R $DLI_USER:$DLI_USER_GROUP ${ELK_HOME}/conf/shipper/dlinsights_shipper.yml

$ rm --verbose -f ${ELK_HOME}/conf/shipper/conductor.yml

$ rm --verbose -f ${ELK_HOME}/conf/indexer/cws_spark.conf

$ mv ${ELK_HOME}/conf/indexer/dlinsights_logstash_worker.conf ${ELK_HOME}/conf/indexer/cws_spark.conf

$ mv ${ELK_HOME}/conf/shipper/dlinsights_shipper.yml ${ELK_HOME}/conf/shipper/conductor.yml

 

h.       Recreate the fabric directory.

$ rm -rf $DLI_SHARED_FS/fabric

$ mkdir -p $DLI_SHARED_FS/fabric/1.2.3

 

i.      Extract the fabric files into the fabric directory.

For Linux ppc64le:

$ tar zoxf $DLI_SHARED_FS/fabric_527174_patch_files/fabric-linux-rhel-ppc64le.tar.gz -C $DLI_SHARED_FS/fabric/1.2.3

For Linux x86_64:

$ tar zoxf $DLI_SHARED_FS/fabric_527174_patch_files/fabric-linux-rhel-x86_64.tar.gz -C $DLI_SHARED_FS/fabric/1.2.3

 

j.       Start the following services:

$ egosh service start dlinsights-monitor dlinsights-optimizer WEBGUI dlpd ascd elk-shipper elk-manager elk-indexer elk-elasticsearch elk-elasticsearch-master elk-elasticsearch-data

 

3.3 After installation

After installing the interim fix, complete the following steps:

1.         Update the Spark package on your cluster. IMPORTANT: Omit this step if you will be installing interim fix 536919.

a.      On your client host, extract the sc-2.4.0.0_build530302.tgz package.

b.      Launch your web browser and clear the browser cache.

c.      Navigate to the cluster management console and log in as a cluster administrator.

d.      Remove the old Spark 2.x.x package from your cluster:

      i.     Click Resources > Frameworks > Version Management.

      ii.     Select the Spark version.

      iii.     Click Remove.

e.      Add the new Spark 2.x.x package to your cluster:

      i.     Click Resources > Frameworks > Version Management.

      ii.     Click Add.

      iii.     Click Browse and select the Spark2.x.x-Conductor2.4.0.tgz.

      iv.     Click Add.

f.       If an instance group uses the Spark 2.x.x package and its status is Started:

      i.     Stop the instance group.

      ii.     Click Workload > Instance Groups > Available: 1 > Update.

      iii.     Start the instance group.

2.         After applying the fix, existing instance groups are in the Ready state. If needed, start your instance groups.

3.      Optionally, create new instance groups using the template provided.

NOTE: The default Python version associated with an instance group changed from Python 2.7 to Python 3.6. To use Python 2.7, you must configure an additional environment variable for the instance group:

 

·        To create a new instance group that uses Python 2.7, do the following:

a.      In the "New Instance Group" window, click on "Configuration" (found under "Spark version").

b.      Click the "All parameters" menu and select "Additional environment variables".

c.      Click "Add an environment variable". In the "Name" text field, input PYTHON_VERSION and in the "Value" text field, input python2.

d.      Click Save and continue with your instance group creation.

·        To modify an existing Spark instance group to use Python 2.7, do the following:

a.      Click the instance group that you want to modify. Stop the instance group, then click Manage > Configure.

b.      Click Configuration in the "Spark configuration" section. 

c.      Click the "All parameters" menu and select "Additional environment variables".

d.      Click "Add an environment variable". In the "Name" text field, input PYTHON_VERSION and in the "Value" text field, input python2.

e.      Click Save, then Modify Instance Group.

 

4.   Uninstallation

If required, follow the instructions in this section to uninstall this interim fix on hosts in your cluster.

1.      Log in to the management host as a cluster administrator and source the environment.

$ su $CLUSTERADMIN

$ . $EGO_TOP/profile.platform

$ export DLI_SHARED_FS=$DLI_SHARED_FS

$ export CLUSTERADMIN=$CLUSTERADMIN

where EGO_TOP is the IBM Spectrum Conductor Deep Learning Impact installation path, and DLI_SHARED_FS and CLUSTERADMIN must be set to the same values used for the IBM Spectrum Conductor Deep Learning Impact installation.

 

2.      On the management host, stop the following services:

$ egosh service stop dlinsights-monitor dlinsights-optimizer WEBGUI ascd dlpd elk-shipper elk-manager elk-indexer elk-elasticsearch elk-elasticsearch-master elk-elasticsearch-data

 

3.      Log on to each management host in the cluster and roll back this interim fix:

$ egoinstallfixes -r 527174

 

4.      On the management host, restore the files and directories that you backed up before installation (for example, see the sketch after this list):

$WLP/dlgui

$EGO_TOP/dli/1.2.3/dlinsights

$DLI_SHARED_FS/tools/dataset/main.py

$DLI_SHARED_FS/tools/dl_plugins/

$DLI_SHARED_FS/tools/tune/

$DLI_SHARED_FS/fabric/
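For example, a minimal sketch that restores the backup taken in section 3.1, assuming it was created in a hypothetical /dlifixes_backup directory (substitute your own backup location):

$ export BACKUP_DIR=/dlifixes_backup

$ rm -rf $EGO_TOP/wlp/usr/servers/gui/apps/dli/1.2.3/dlgui && cp -pr $BACKUP_DIR/dlgui $EGO_TOP/wlp/usr/servers/gui/apps/dli/1.2.3/dlgui

$ rm -rf $EGO_TOP/dli/1.2.3/dlinsights && cp -pr $BACKUP_DIR/dlinsights $EGO_TOP/dli/1.2.3/dlinsights

$ cp -p $BACKUP_DIR/main.py $DLI_SHARED_FS/tools/dataset/main.py

$ rm -rf $DLI_SHARED_FS/tools/dl_plugins && cp -pr $BACKUP_DIR/dl_plugins $DLI_SHARED_FS/tools/dl_plugins

$ rm -rf $DLI_SHARED_FS/tools/tune && cp -pr $BACKUP_DIR/tune $DLI_SHARED_FS/tools/tune

$ rm -rf $DLI_SHARED_FS/fabric && cp -pr $BACKUP_DIR/fabric $DLI_SHARED_FS/fabric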

 

5.      As the root user, remove and reinstall dlinsights to use Python 2 on all management and compute hosts in the cluster:

$ . ${DLI_CONDA_HOME}/etc/profile.d/conda.sh

$ conda remove --name dlinsights --all --yes

$ conda create --name dlinsights --yes pip python=2.7 openssl=1.1.1c

$ conda activate dlinsights

$ conda install --yes numpy=1.12.1 openssl=1.1.1c

$ conda install --yes pyopenssl==18.0.0 openssl=1.1.1c

$ conda install --yes Flask==0.12.2 Flask-Cors==3.0.3 scipy==1.0.1 pathlib==1.0.1 SQLAlchemy==1.1.13 requests=2.21 alembic=1.0.5 openssl=1.1.1c

$ pip install --no-cache-dir warlock==1.3.0 elasticsearch==5.2.0 Flask-Script==2.0.5 Flask-HTTPAuth==3.2.2 mongoengine==0.11.0 python-heatclient==1.2.0 python-keystoneclient==3.17.0

$ conda deactivate

 

6.      Roll back the upgrade from IBM Spectrum Conductor 2.4.0 to your previous version. For instructions, see the rollback documentation in the IBM Knowledge Center.

 

7.      Update ELK on the management hosts. If your cluster uses a local installation, you must also complete this step on each candidate master host and compute host.

$ DLELK_HOME=$EGO_TOP/dli/1.2.3/dlinsights/elk-patch

$ ELK_HOME=$EGO_CONFDIR/../../integration/elk

$ /usr/bin/cp -f $DLELK_HOME/template/ibm-dlinsights.json $ELK_HOME/init/template/

$ /usr/bin/cp -f $DLELK_HOME/template/ibm-dlinsights-spark-executor-work.json $ELK_HOME/init/template/

$ /usr/bin/cp -f $DLELK_HOME/template/ibm-dlinsights-spark-driver-work.json $ELK_HOME/init/template/

$ /usr/bin/cp -f $DLELK_HOME/template/ibm-dlinsights-batch-job-metrics.json $ELK_HOME/init/template/

$ /usr/bin/cp -f $DLELK_HOME/template/ibm-dlinsights-batch-job-resource-usage.json $ELK_HOME/init/template/

$ chown -R $DLI_USER:$DLI_USER_GROUP $ELK_HOME/init/template/ibm-dlinsights-*.json

$ /usr/bin/cp -f $DLELK_HOME/conf/grok-pattern/dlinsights-pattern $ELK_HOME/conf/grok-pattern/

$ chown -R $DLI_USER:$DLI_USER_GROUP $ELK_HOME/conf/grok-pattern/dlinsights-pattern

$ /usr/bin/cp -f $DLELK_HOME/conf/indexer/dlinsights_logstash_*.conf $ELK_HOME/conf/indexer/

$ chown -R $DLI_USER:$DLI_USER_GROUP ${ELK_HOME}/conf/indexer/dlinsights_logstash_*.conf

$ /usr/bin/cp -f $DLELK_HOME/conf/indexcleanup/*.conf $ELK_HOME/conf/indexcleanup/

$ chown -R $DLI_USER:$DLI_USER_GROUP $ELK_HOME/conf/indexcleanup/*.conf

$ /usr/bin/cp -f $DLELK_HOME/conf/shipper/dlinsights_shipper.yml $ELK_HOME/conf/shipper/

$ chown -R $DLI_USER:$DLI_USER_GROUP ${ELK_HOME}/conf/shipper/dlinsights_shipper.yml

$ rm --verbose -f ${ELK_HOME}/conf/shipper/conductor.yml

$ rm --verbose -f ${ELK_HOME}/conf/indexer/cws_spark.conf

$ mv ${ELK_HOME}/conf/indexer/dlinsights_logstash_worker.conf ${ELK_HOME}/conf/indexer/cws_spark.conf

$ mv ${ELK_HOME}/conf/shipper/dlinsights_shipper.yml ${ELK_HOME}/conf/shipper/conductor.yml

 

8.      On the management host, run the following commands to restart the ELK services:

$ egosh service stop elk-shipper elk-manager elk-indexer elk-elasticsearch elk-elasticsearch-master elk-elasticsearch-data

$ egosh service start elk-shipper elk-manager elk-indexer elk-elasticsearch elk-elasticsearch-master elk-elasticsearch-data

 

 

5.   List of files 

Deleted files:

$EGO_TOP/wlp/usr/servers/gui/apps/dli/1.2.3/dlgui/WEB-INF/classes/com/platform/gui/cws_dl/common/SparkBaseListAction.class

$EGO_TOP/wlp/usr/servers/gui/apps/dli/1.2.3/dlgui/WEB-INF/classes/com/platform/gui/cws_dl/dl/action/beans.xml

$EGO_TOP/wlp/usr/servers/gui/apps/dli/1.2.3/dlgui/WEB-INF/classes/com/platform/gui/cws_dl/dl/action/struts.xml

$EGO_TOP/wlp/usr/servers/gui/apps/dli/1.2.3/dlgui/WEB-INF/classes/spring-cs.xml

$EGO_TOP/wlp/usr/servers/gui/apps/dli/1.2.3/dlgui/WEB-INF/classes/struts.xml

$EGO_TOP/wlp/usr/servers/gui/apps/dli/1.2.3/dlgui/WEB-INF/struts-config.xml

Added files:

$EGO_TOP/wlp/usr/servers/gui/apps/dli/1.2.3/dlgui/WEB-INF/classes/springMVC-servlet.xml

Updated files:

$EGO_TOP/gui/conf/help/pmc_DLI_help.xml

$EGO_TOP/wlp/usr/servers/gui/apps/dli/1.2.3/dlgui/WEB-INF/classes/com/platform/gui/cws_dl/common/SparkRestClient.class

$EGO_TOP/wlp/usr/servers/gui/apps/dli/1.2.3/dlgui/WEB-INF/classes/com/platform/gui/cws_dl/dl/action/SparkDeepLearningAction.class

$EGO_TOP/wlp/usr/servers/gui/apps/dli/1.2.3/dlgui/WEB-INF/web.xml

$EGO_TOP/wlp/usr/servers/gui/apps/dli/1.2.3/dlgui/common/conductorError.jsp

$EGO_TOP/wlp/usr/servers/gui/apps/dli/1.2.3/dlgui/common/js/ConductorSparkApp.js

$EGO_TOP/wlp/usr/servers/gui/apps/dli/1.2.3/dlgui/dberror.jsp

$EGO_TOP/wlp/usr/servers/gui/apps/dli/1.2.3/dlgui/dl/dl.jsp

$EGO_TOP/wlp/usr/servers/gui/apps/dli/1.2.3/dlgui/dl/js/datasets.controller.js

$EGO_TOP/wlp/usr/servers/gui/apps/dli/1.2.3/dlgui/dl/js/datasetView.controller.js

$EGO_TOP/wlp/usr/servers/gui/apps/dli/1.2.3/dlgui/dl/js/modelInferenceList.controller.js

$EGO_TOP/wlp/usr/servers/gui/apps/dli/1.2.3/dlgui/dl/js/models.controller.js

$EGO_TOP/wlp/usr/servers/gui/apps/dli/1.2.3/dlgui/dl/js/modelTrainingList.controller.js

$EGO_TOP/wlp/usr/servers/gui/apps/dli/1.2.3/dlgui/dl/js/modelValidationList.controller.js

$EGO_TOP/wlp/usr/servers/gui/apps/dli/1.2.3/dlgui/dl/js/monitor.controller.js

$EGO_TOP/wlp/usr/servers/gui/apps/dli/1.2.3/dlgui/dl/modelView.html

$EGO_TOP/wlp/usr/servers/gui/apps/dli/1.2.3/dlgui/dl/modelView.jsp

$EGO_TOP/wlp/usr/servers/gui/apps/dli/1.2.3/dlgui/index.jsp

$EGO_TOP/conductorspark/work/templates/dli_sig_template.yml

$EGO_TOP/conductorspark/work/templates/wmla_ig_edt_template.yml

$EGO_TOP/conductorspark/work/templates/wmla_ig_template.yml

$EGO_TOP/dli/1.2.3/dlpd/lib/cws_dl-core-1.2.3.jar

$EGO_TOP/dli/1.2.3/dlpd/lib/cws_dl-common-1.2.3.jar

$EGO_TOP/wlp/usr/servers/gui/apps/dli/1.2.3/dlgui/dl/iasModelDefinition.html

$EGO_TOP/wlp/usr/servers/gui/apps/dli/1.2.3/dlgui/dl/iasmodelViewOverview.html

$EGO_TOP/wlp/usr/servers/gui/apps/dli/1.2.3/dlgui/dl/i18n/locale-en.json

$EGO_TOP/wlp/usr/servers/gui/apps/dli/1.2.3/dlgui/dl/js/iasModels.controller.js

$EGO_TOP/wlp/usr/servers/gui/apps/dli/1.2.3/dlgui/dl/iasModelAdd.html

$EGO_TOP/wlp/usr/servers/gui/apps/dli/1.2.3/dlgui/dl/js/iasModelView.controller.js

 

$EGO_TOP/dli/1.2.3/dlinsights/bin/start_dlinsights_service.sh

$EGO_TOP/wlp/usr/servers/gui/apps/dli/1.2.3/dlgui/framework

$EGO_TOP/wlp/usr/servers/gui/apps/dli/1.2.3/dlgui/widget

$EGO_CONFDIR/../../integration/elk/init/template

${DLI_SHARED_FS}/tools/dataset/main.py

$EGO_TOP/dli/1.2.3/dlpd/bin/dlicmd.py

$EGO_TOP/dli/conf/dlpd/dl_plugins/common.conf

$EGO_TOP/dli/1.2.3/dlinsights/monitor/app/main/algorithm.py

$EGO_TOP/dli/1.2.3/dlinsights/monitor/app/main/app_detail.py

$EGO_TOP/dli/1.2.3/dlinsights/monitor/app/main/app_list.py

$EGO_TOP/dli/1.2.3/dlinsights/optimizer/app/bayesian_opt.py

$EGO_TOP/dli/1.2.3/dlinsights/monitor/app/main/cluster_detail.py

$EGO_TOP/dli/1.2.3/dlinsights/monitor/app/main/cluster_list.py

$EGO_TOP/dli/1.2.3/dlinsights/monitor/app/main/commons.py

$EGO_TOP/dli/1.2.3/dlinsights/monitor/app/heat_operation.py

$EGO_TOP/dli/1.2.3/dlinsights/monitor/app/main/host_list.py

$EGO_TOP/dli/1.2.3/dlinsights/optimizer/app/opt_aux.py

$EGO_TOP/dli/1.2.3/dlinsights/optimizer/app/optimizer_engine.py

$EGO_TOP/dli/1.2.3/dlinsights/optimizer/app/sobol_lib.py

$EGO_TOP/dli/1.2.3/dlinsights/monitor/app/main/system.py

$EGO_TOP/dli/1.2.3/dlinsights/monitor/app/main/views.py

$EGO_TOP/dli/1.2.3/dlinsights/optimizer/app/main/views.py

$DLI_SHARED_FS/tools/dl_plugins/caffeModelPrep.py

$DLI_SHARED_FS/tools/dl_plugins/ddlCaffeCmdGen.py

$DLI_SHARED_FS/tools/dl_plugins/ddlTensorFlowCmdGen.py

$DLI_SHARED_FS/tools/dl_plugins/disttensorflowCmdGen.py

$DLI_SHARED_FS/tools/dl_plugins/dlioptgen.py

$DLI_SHARED_FS/tools/dl_plugins/edtPyTorchCmdGen.py

$DLI_SHARED_FS/tools/dl_plugins/edtTensorflowCmdGen.py

$DLI_SHARED_FS/tools/dl_plugins/globalHead.py

$DLI_SHARED_FS/tools/dl_plugins/PowerAICaffeIBMCmdGen.py

$DLI_SHARED_FS/tools/dl_plugins/PyTorchCmdGen.py

$DLI_SHARED_FS/tools/dl_plugins/tensorflowCmdGen.py

$DLI_SHARED_FS/tools/tune/bayes_opt_crl.py

$DLI_SHARED_FS/tools/tune/util/bayesian_opt_utils.py

$DLI_SHARED_FS/tools/tune/util/data_transform.py

$DLI_SHARED_FS/tools/tune/hyperband_opt.py

$DLI_SHARED_FS/tools/tune/util/sobol_lib.py

$EGO_TOP/dli/1.2.3/dlpd/bin/start-dlpd.sh

sc-2.4.0.0_build530302.tar.gz

 

6.   Product notifications

To receive information about product solution and patch updates automatically, subscribe to product notifications on the My Notifications page (http://www.ibm.com/support/mynotifications/) on the IBM Support website (http://support.ibm.com). You can edit your subscription settings to choose the types of information that you want to be notified about, for example, security bulletins, fixes, troubleshooting, and product enhancements or documentation changes.

7.   Copyright and trademark information 

© Copyright IBM Corporation 2019, 2020

U.S. Government Users Restricted Rights - Use, duplication or disclosure restricted by GSA ADP Schedule Contract with IBM Corp.

IBM®, the IBM logo and ibm.com® are trademarks of International Business Machines Corp., registered in many jurisdictions worldwide. Other product and service names might be trademarks of IBM or other companies. A current list of IBM trademarks is available on the Web at "Copyright and trademark information" at www.ibm.com/legal/copytrade.shtml