Readme File for IBM Spectrum Conductor with Spark 2.1.0.1 (Interim Fix 426392)

Readme file for: IBM® Spectrum Conductor with Spark
Product/Component Release: 2.1.0.1
Update Name: Interim Fix 426392
Fix ID: cws-2.1-build426392
Publication date: September 14, 2017

Abstract

Interim fix that enables Spark applications to use different SPARK_HOME settings on the client side and the cluster (Spark instance group) side.

 

Description

For Spark applications submitted in client deployment mode, the SPARK_HOME setting on the client side no longer needs to be the same as the SPARK_HOME of the Spark instance group. Previously, when the SPARK_HOME path on the client side differed from the path on the master side, tasks submitted from the client side failed.
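
For example, with this fix applied, a client-mode submission can point to a client-side Spark installation whose path differs from the Spark instance group's SPARK_HOME. The path, master URL, and application details below are placeholders only; substitute the values for your environment:

     > export SPARK_HOME=/opt/spark-1.6.1-client
     > $SPARK_HOME/bin/spark-submit --master spark://<master_host>:<master_port> --deploy-mode client --class <main_class> <path_to_application_jar>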

 

System requirements

RHEL x86_64 platform

 

Contents

1. List of fixes
2. Download location
3. Installation and configuration
4. List of files
5. Copyright and trademark information

 

1. List of fixes

    APAR: P101954

2. Download location

     Download interim fix 426392 from http://www.ibm.com/eserver/support/fixes/.

3. Installation and configuration

     Before installation

      1. Log on to the cluster master host as the cluster administrator and shut down the cluster:
     > egosh service stop all
     > egosh ego shutdown all

     

      2. Log on to all management hosts in the cluster and back up the following file that will be replaced by this interim fix:
     $EGO_CONFDIR/../../conductorspark/conf/packages/Spark1.6.1-Conductor2.1.0/Spark1.6.1.tgz
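
     For example, you can keep the backup next to the original file (the .bak name is a suggestion only):
     > cd $EGO_CONFDIR/../../conductorspark/conf/packages/Spark1.6.1-Conductor2.1.0
     > cp Spark1.6.1.tgz Spark1.6.1.tgz.bak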

    

     Installation

      1. On each management host, replace the Spark 1.6.1 package with the one from the interim fix:
     > tar zxfo cws2.1-build426392.tgz -C /tmp
     > cp /tmp/Spark1.6.1-Conductor2.1.0/Spark1.6.1.tgz $EGO_CONFDIR/../../conductorspark/conf/packages/Spark1.6.1-Conductor2.1.0/

     

      2. Start the cluster from the egosh command prompt on each host:
     > egosh ego start

     

      3. Create a new Spark instance group that will use the new Spark 1.6.1 package. See https://www.ibm.com/support/knowledgecenter/SSZU2E_2.1.0/developing_instances/developing_instances.html.

    

      4. If required, upgrade your existing Spark instance groups to use the new Spark 1.6.1 package.

 

a. From the cluster management console, go to Resources > Service Packages > Service Packages and determine the package name and the top-level consumer for the Spark instance group. The package name uses the format <SIG_NAME>_Spark<version>; for example, LOB_Spark1.6.1.

 

b. Run the egodeploy command to add the new Spark1.6.1.tgz package to the top-level consumer:
       > egodeploy add <package_name> -p <path_to_Spark1.6.1.tgz> -c <consumer_path>
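
     For example, if the package for the Spark instance group is named LOB_Spark1.6.1 and its top-level consumer is /LOB (both values are illustrative; use the names you identified in step a), and the package was extracted to /tmp in the Installation section:
     > egodeploy add LOB_Spark1.6.1 -p /tmp/Spark1.6.1-Conductor2.1.0/Spark1.6.1.tgz -c /LOB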

 

c. Stop the Spark instance group.

 

d. If high availability was enabled for the Spark master in the Spark instance group, delete the recovery directory. To find this directory, click the Configuration tab for the Spark instance group; then, click Edit Configuration for the Spark version. Under Spark on EGO settings, look for the value of the spark.deploy.recoveryDirectory parameter.
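
For example, if the spark.deploy.recoveryDirectory parameter is set to /share/sparkmaster/recovery (an illustrative path; use the value shown in your configuration), remove that directory from a host that can access it:
     > rm -rf /share/sparkmaster/recovery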

 

e. Modify the configuration for the Spark instance group to trigger redeployment of the Spark package, so that the new Spark version package is deployed to the hosts in the resource group. To trigger the redeployment, change the value of the spark.driver.memory parameter: click the Configuration tab for the Spark instance group; then, click Edit Configuration for the Spark version. Set a new value for the spark.driver.memory parameter, change it back to the original value, and click Save.

          

           NOTE: If the spark-defaults.conf or spark-env.sh file was modified from the command line, back up these files before redeploying the Spark instance group. After redeployment, restore the files.
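
           For example, assuming the modified files are in the Spark instance group's deployed configuration directory, referred to below as <spark_conf_dir> (a placeholder), back them up before redeployment:
     > mkdir -p /tmp/sig_conf_backup
     > cp <spark_conf_dir>/spark-defaults.conf <spark_conf_dir>/spark-env.sh /tmp/sig_conf_backup/

           After the Spark instance group is redeployed, copy the files back:
     > cp /tmp/sig_conf_backup/spark-defaults.conf /tmp/sig_conf_backup/spark-env.sh <spark_conf_dir>/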

 

f.   Start the Spark instance group.

     

       Uninstallation

       1.  Stop all services and shut down the cluster:
     > egosh service stop all
     > egosh ego shutdown all

      

       2. Log on to all management hosts in the cluster and restore the backup file:
     $EGO_CONFDIR/../../conductorspark/conf/packages/Spark1.6.1-Conductor2.1.0/Spark1.6.1.tgz
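
       For example, if you saved the backup as Spark1.6.1.tgz.bak during the Before installation steps (the .bak name is a suggestion only):
     > cd $EGO_CONFDIR/../../conductorspark/conf/packages/Spark1.6.1-Conductor2.1.0
     > cp Spark1.6.1.tgz.bak Spark1.6.1.tgz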

      

       3. Complete steps 2-4 in the Installation section.

 

4.  List of files

   Spark1.6.1.tgz

 5.  Copyright and trademark information

© Copyright IBM Corporation 2017

U.S. Government Users Restricted Rights - Use, duplication or disclosure restricted by GSA ADP Schedule Contract with IBM Corp.

IBM®, the IBM logo, and ibm.com® are trademarks of International Business Machines Corp., registered in many jurisdictions worldwide. Other product and service names might be trademarks of IBM or other companies. A current list of IBM trademarks is available on the Web at "Copyright and trademark information" at www.ibm.com/legal/copytrade.shtml.