Readme for IBM® Spectrum Conductor with Spark 2.2.1 Interim Fix 522535
Readme file for: IBM Spectrum Conductor with Spark
Product/Component Release: 2.2.1
Update Name: Interim Fix 522535
Fix ID: cws-2.2.1-build522535
Publication date: July 1, 2019
This interim fix provides a resolution for the following issues in IBM Spectrum Conductor with Spark 2.2.1:
· When too many Spark executors are created for a Spark application, the Spark master runs out of memory and goes down. Access to the cluster management console is also impacted.
· Sometimes, when a Spark application runs for a long time, the "slots" and "demandslots" fields returned through the Spark master REST API show zero.
This update introduces a new SPARK_EGO_EXECUTORS_MAX_RESULTS_NUM parameter in the Spark version configuration of a Spark instance group to control the maximum number of executors to be returned for each application. It also updates the default for the SPARK_EGO_APPS_MAX_RESULTS_NUM parameter from 10000 to 200.
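As a rough illustration only (not the actual ascd implementation), the capping that these parameters control behaves like truncating a result list to its first N entries; the file names below are demo stand-ins:

```shell
# Hypothetical sketch of result capping: a list of 5000 executor IDs is
# truncated to the first 200, the new default for
# SPARK_EGO_EXECUTORS_MAX_RESULTS_NUM. Demo files only.
seq 1 5000 > /tmp/executors_demo.txt
head -n 200 /tmp/executors_demo.txt > /tmp/executors_capped.txt
wc -l < /tmp/executors_capped.txt   # prints the capped count (200)
```

Bounding the number of entries returned per request keeps the REST response, and therefore the Spark master's memory footprint, bounded regardless of how many executors an application creates.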
Contents
1. List of fixes
2. Download location
3. Products or components affected
4. Installation and configuration
5. Uninstallation
6. List of files
7. Product notifications
8. Copyright and trademark information
1. List of fixes
APAR: P103061
2. Download location
Download fix 522535 from the following location: http://www.ibm.com/eserver/support/fixes/.
3. Products or components affected
Component name, Platform, Fix ID:
IBM Spectrum Conductor with Spark, Linux x86_64, cws-2.2.1-build522535
4. Installation and configuration
System requirements
Linux x86_64
Installation
a. Log on to the master host as CLUSTERADMIN and source the environment.
b. Stop the ascd service:
$ egosh service stop ascd
c. Back up the following files:
$EGO_TOP/ascd/2.2.1/lib/asc-common-2.2.1.jar
$EGO_TOP/ascd/2.2.1/lib/asc-core-2.2.1.jar
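The backup in step c can be scripted; the following is a sketch in which a demo directory stands in for $EGO_TOP so the commands are runnable outside a cluster. On a real host, source the environment first and use the actual $EGO_TOP path:

```shell
# Sketch only: demo paths stand in for $EGO_TOP/ascd/2.2.1/lib.
LIB=/tmp/ego_demo522535/ascd/2.2.1/lib       # real path: $EGO_TOP/ascd/2.2.1/lib
BACKUP="$LIB.backup-522535"
mkdir -p "$LIB" "$BACKUP"                    # demo dirs; these exist on a real cluster
touch "$LIB/asc-common-2.2.1.jar" "$LIB/asc-core-2.2.1.jar"   # demo jars
# Copy both jars into the backup directory, preserving timestamps:
cp -p "$LIB/asc-common-2.2.1.jar" "$LIB/asc-core-2.2.1.jar" "$BACKUP/"
```

Keeping the backup in a sibling directory (rather than renaming the jars in place) ensures that the lib directory contains only one copy of each jar when ascd restarts.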
d. On your client host, extract the cws-2.2.1.0_x86_64_build522535.tar.gz package, for example:
> mkdir -p /tmp/fix522535
> tar zoxf cws-2.2.1.0_x86_64_build522535.tar.gz -C /tmp/fix522535
e. Copy the following files to the $EGO_TOP/ascd/2.2.1/lib/ directory on the management host:
/tmp/fix522535/ascd/2.2.1/lib/asc-common-2.2.1.jar
/tmp/fix522535/ascd/2.2.1/lib/asc-core-2.2.1.jar
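Step e can likewise be scripted and verified. This sketch uses demo directories that stand in for the extracted fix and for $EGO_TOP, so it runs outside a cluster; on a real host, substitute the actual paths:

```shell
# Sketch only: demo directories stand in for /tmp/fix522535 and $EGO_TOP.
FIX=/tmp/ego_copy_demo/fix/ascd/2.2.1/lib    # real path: /tmp/fix522535/ascd/2.2.1/lib
LIB=/tmp/ego_copy_demo/ego/ascd/2.2.1/lib    # real path: $EGO_TOP/ascd/2.2.1/lib
mkdir -p "$FIX" "$LIB"
echo demo > "$FIX/asc-common-2.2.1.jar"      # stand-ins for the fix jars
echo demo > "$FIX/asc-core-2.2.1.jar"
cp -p "$FIX"/asc-common-2.2.1.jar "$FIX"/asc-core-2.2.1.jar "$LIB/"
# Verify the installed copies match the fix package before restarting ascd:
cmp "$FIX/asc-common-2.2.1.jar" "$LIB/asc-common-2.2.1.jar"
cmp "$FIX/asc-core-2.2.1.jar" "$LIB/asc-core-2.2.1.jar"
```

`cmp` exits non-zero if a copy is truncated or corrupted, which catches a failed transfer before the service restart in step f.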
f. Start the ascd service:
$ egosh service start ascd
g. Launch a browser and clear the browser cache.
h. Navigate to the cluster management console and log in as an administrator.
i. Remove the Spark x.x.x package if it already exists:
a) Click Workload > Spark > Version Management.
b) Select the Spark version.
c) Click Remove.
j. Add the Spark x.x.x package to your cluster:
a) Click Workload > Spark > Version Management.
b) Click Add.
c) Click Browse and select the /tmp/fix522535/Sparkx.x.x-Conductor2.2.1.tgz file.
d) Click Add.
e) Repeat for each Spark version you want to add.
After installation
a. Create a new Spark instance group that uses the new Spark version package. For details, see https://www.ibm.com/support/knowledgecenter/SSZU2E_2.2.1/developing_instances/instance_create_about.html.
b. If required, update your existing Spark instance groups to use the new Spark version package. For details, see https://www.ibm.com/support/knowledgecenter/SSZU2E_2.2.1/managing_instances/instance_update_spark_version.html.
IMPORTANT: Once a Spark instance group is updated to the new Spark version package, you cannot roll back the update.
c. If required, edit your Spark instance group to update the default values for the SPARK_EGO_EXECUTORS_MAX_RESULTS_NUM and SPARK_EGO_APPS_MAX_RESULTS_NUM parameters in the Spark on EGO section of the Spark version configuration:
· SPARK_EGO_EXECUTORS_MAX_RESULTS_NUM: Specifies the maximum number of executors to be returned for each application. The default is 200. If you change this parameter to return more than 200 executors per application, ensure that you increase the Spark master's JVM memory.
· SPARK_EGO_APPS_MAX_RESULTS_NUM: Specifies the maximum number of applications to be returned in one request. The default is 200. If you change this parameter to return more than 200 applications, ensure that you increase the Spark master's JVM memory.
5. Uninstallation
a. Log on to the master host as CLUSTERADMIN and source the environment.
b. Stop the ascd service:
$ egosh service stop ascd
c. Restore your backup of the following files:
$EGO_TOP/ascd/2.2.1/lib/asc-common-2.2.1.jar
$EGO_TOP/ascd/2.2.1/lib/asc-core-2.2.1.jar
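The restore in step c reverses the backup made during installation. This sketch uses demo paths and demo file contents so it is runnable outside a cluster; on a real host, copy from your actual backup location into $EGO_TOP/ascd/2.2.1/lib/:

```shell
# Sketch only: demo paths and contents stand in for the real backup and
# the patched jars under $EGO_TOP/ascd/2.2.1/lib.
LIB=/tmp/ego_restore_demo/ascd/2.2.1/lib
BACKUP="$LIB.backup-522535"
mkdir -p "$LIB" "$BACKUP"
echo original > "$BACKUP/asc-common-2.2.1.jar"   # demo backup contents
echo original > "$BACKUP/asc-core-2.2.1.jar"
echo patched  > "$LIB/asc-common-2.2.1.jar"      # demo patched jars
echo patched  > "$LIB/asc-core-2.2.1.jar"
# Overwrite the patched jars with the backed-up originals:
cp -p "$BACKUP"/asc-common-2.2.1.jar "$BACKUP"/asc-core-2.2.1.jar "$LIB/"
```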
d. Start the ascd service:
$ egosh service start ascd
e. Launch your web browser and clear the browser cache.
f. Navigate to the cluster management console and log in as an administrator.
g. Remove the Spark x.x.x package:
a) Click Workload > Spark > Version Management.
b) Select x.x.x.
c) Click Remove.
h. Log in to Fix Central, then download and extract the original Spark version.
i. Add the original Spark x.x.x package to your cluster:
a) Click Workload > Spark > Version Management.
b) Click Add.
c) Click Browse and select the .tgz file.
d) Click Add.
j. Create a new Spark instance group that uses the original Spark version x.x.x package. For details, see https://www.ibm.com/support/knowledgecenter/SSZU2E_2.2.1/developing_instances/instance_create_about.html.
6. List of files
Spark2.1.1-Conductor2.2.1.tgz
Spark2.2.0-Conductor2.2.1.tgz
Spark2.3.1-Conductor2.2.1.tgz
Spark2.3.3-Conductor2.2.1.tgz
asc-common-2.2.1.jar
asc-core-2.2.1.jar
7. Product notifications
To receive information about product solution and patch updates automatically, subscribe to product notifications on the My Notifications page (http://www.ibm.com/support/mynotifications/) on the IBM Support website (http://support.ibm.com). You can edit your subscription settings to choose the types of information that you want to be notified about, for example, security bulletins, fixes, troubleshooting, and product enhancements or documentation changes.
8. Copyright and trademark information
© Copyright IBM Corporation 2019
U.S. Government Users Restricted Rights - Use, duplication or disclosure restricted by GSA ADP Schedule Contract with IBM Corp.
IBM®, the IBM logo, and ibm.com® are trademarks of International Business Machines Corp., registered in many jurisdictions worldwide. Other product and service names might be trademarks of IBM or other companies. A current list of IBM trademarks is available on the Web at "Copyright and trademark information" at www.ibm.com/legal/copytrade.shtml.