Readme for IBM Spectrum Conductor with Spark 2.2 Interim Fix 437255

Readme file for: IBM® Spectrum Conductor with Spark
Product/Component Release: 2.2.0
Update Name: Interim fix 437255
Fix ID: spark1.6.1-conductor2.2.0-build437255-jpmc
Publication date: Jan 20, 2017

Spark 1.6.1 package supporting GPU adaptive scheduling and shared RDDs for batch applications in IBM Spectrum Conductor with Spark 2.2.0
About this interim fix
This interim fix provides a Spark 1.6.1 package that supports the following functions in IBM Spectrum Conductor with Spark V2.2.0:
· Adaptive scheduling when your cluster includes CPU and GPU resources. With this interim fix, you can enable adaptive scheduling for applications that use Spark version 1.6.1, allowing tasks to run first on a portion of the GPU resources in the cluster. When GPU resources are no longer available in the cluster, the remaining tasks run on CPU resources. For more information, see http://www.ibm.com/support/knowledgecenter/SSZU2E_2.2.0/developing_instances/instances_gpu.html.
· Shared RDDs for batch applications. With this interim fix, you can use Spark version 1.6.1 to submit batch applications that are shared among multiple users. With shared batch applications, multiple users can submit Spark jobs to the same application and use the same Resilient Distributed Datasets (RDDs), as illustrated after this list. For more information, see http://www.ibm.com/support/knowledgecenter/SSZU2E_2.2.0/managing_applications/applications_shared.html.
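For illustration only, the following is a minimal sketch of how one user might submit a batch job to a Spark 1.6.1 instance group with spark-submit. The master URL, class name, and jar path are placeholders (assumptions, not values shipped with this fix); obtain the actual master URL for your shared batch application from the Spark instance group in the cluster management console, and see the linked documentation for the full submission procedure.
> # Placeholders only: substitute the master URL, class, and jar for your own Spark instance group
> $SPARK_HOME/bin/spark-submit --master spark://<master-host>:<master-port> \
    --class com.example.MyBatchJob /path/to/my-batch-job.jar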
Installation and configuration
System requirements
Linux 64-bit platform
Installation
Before you begin, IBM Spectrum Conductor with Spark V2.2.0 must be installed on a supported operating system. For details, see https://www.ibm.com/support/knowledgecenter/SSZU2E_2.2.0/installing/install_upgrade.html.
1. Log on to each management host in the cluster and back up the following files, which will be replaced by this interim fix:
> $EGO_CONFDIR/../../conductorspark/conf/packages/Spark1.6.1-Conductor2.2.0/Spark1.6.1.tgz
> $EGO_CONFDIR/../../conductorspark/conf/packages/Spark1.6.1-Conductor2.2.0/platform-conductor-conf.yaml
> $EGO_CONFDIR/../../conductorspark/conf/packages/Spark1.6.1-Conductor2.2.0/platform-conductor.yaml
> $EGO_TOP/wlp/usr/servers/gui/apps/conductor/2.2.0/conductorgui/spark/instance/instanceViewApplications.html
> $EGO_TOP/wlp/usr/servers/gui/apps/conductor/2.2.0/conductorgui/spark/instance/js/instanceView.controller.js
> $EGO_TOP/wlp/usr/servers/gui/apps/conductor/2.2.0/conductorgui/spark/appsandnotebooks/js/sharedApplications.controller.js
> $EGO_TOP/wlp/usr/servers/gui/apps/conductor/2.2.0/conductorgui/spark/appsandnotebooks/i18n/locale-en.json
> $EGO_TOP/wlp/usr/servers/gui/apps/conductor/2.2.0/conductorgui/spark/appsandnotebooks/js/submitApplication.controller.js
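For example, the following is a minimal sketch of one way to back up these files, assuming EGO_CONFDIR and EGO_TOP are set in the cluster administrator's shell and using a hypothetical backup directory /tmp/fix437255_backup (the file names are all distinct, so a flat copy is sufficient):
> # Hypothetical backup directory; adjust the location as needed
> mkdir -p /tmp/fix437255_backup
> for f in \
    $EGO_CONFDIR/../../conductorspark/conf/packages/Spark1.6.1-Conductor2.2.0/Spark1.6.1.tgz \
    $EGO_CONFDIR/../../conductorspark/conf/packages/Spark1.6.1-Conductor2.2.0/platform-conductor-conf.yaml \
    $EGO_CONFDIR/../../conductorspark/conf/packages/Spark1.6.1-Conductor2.2.0/platform-conductor.yaml \
    $EGO_TOP/wlp/usr/servers/gui/apps/conductor/2.2.0/conductorgui/spark/instance/instanceViewApplications.html \
    $EGO_TOP/wlp/usr/servers/gui/apps/conductor/2.2.0/conductorgui/spark/instance/js/instanceView.controller.js \
    $EGO_TOP/wlp/usr/servers/gui/apps/conductor/2.2.0/conductorgui/spark/appsandnotebooks/js/sharedApplications.controller.js \
    $EGO_TOP/wlp/usr/servers/gui/apps/conductor/2.2.0/conductorgui/spark/appsandnotebooks/i18n/locale-en.json \
    $EGO_TOP/wlp/usr/servers/gui/apps/conductor/2.2.0/conductorgui/spark/appsandnotebooks/js/submitApplication.controller.js; do
    # -p preserves permissions and timestamps of the original files
    cp -p "$f" /tmp/fix437255_backup
  done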
2. On each management host, decompress the Spark1.6.1-Conductor2.2.0-build437255.tgz package as the cluster administrator, then copy the extracted files into place:
> mkdir -p /tmp/fix437255
> tar zxf Spark1.6.1-Conductor2.2.0-build437255.tgz -C /tmp/fix437255
> cp /tmp/fix437255/instanceViewApplications.html
$EGO_TOP/wlp/usr/servers/gui/apps/conductor/2.2.0/conductorgui/spark/instance
> cp /tmp/fix437255/instanceView.controller.js
$EGO_TOP/wlp/usr/servers/gui/apps/conductor/2.2.0/conductorgui/spark/instance/js
> cp /tmp/fix437255/sharedApplications.controller.js
$EGO_TOP/wlp/usr/servers/gui/apps/conductor/2.2.0/conductorgui/spark/appsandnotebooks/js
> cp /tmp/fix437255/locale-en.json
$EGO_TOP/wlp/usr/servers/gui/apps/conductor/2.2.0/conductorgui/spark/appsandnotebooks/i18n
> cp /tmp/fix437255/submitApplication.controller.js
$EGO_TOP/wlp/usr/servers/gui/apps/conductor/2.2.0/conductorgui/spark/appsandnotebooks/js
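Optionally, you can confirm that the updated files are in place after copying; a minimal check, assuming EGO_TOP is set:
> # Timestamps should reflect the copies you just made
> ls -l $EGO_TOP/wlp/usr/servers/gui/apps/conductor/2.2.0/conductorgui/spark/instance/instanceViewApplications.html \
    $EGO_TOP/wlp/usr/servers/gui/apps/conductor/2.2.0/conductorgui/spark/instance/js/instanceView.controller.js \
    $EGO_TOP/wlp/usr/servers/gui/apps/conductor/2.2.0/conductorgui/spark/appsandnotebooks/js/sharedApplications.controller.js \
    $EGO_TOP/wlp/usr/servers/gui/apps/conductor/2.2.0/conductorgui/spark/appsandnotebooks/i18n/locale-en.json \
    $EGO_TOP/wlp/usr/servers/gui/apps/conductor/2.2.0/conductorgui/spark/appsandnotebooks/js/submitApplication.controller.js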
3. Launch a browser and clear the browser cache; then, log in to the cluster management console as admin.
4. Add the Spark 1.6.1 package to your cluster:
a. Click Workload > Spark > Version Management.
b. Click Add.
c. Click Browse and select the /tmp/fix437255/Spark1.6.1-Conductor2.2.0.tgz package.
d. Click Add.
5. Create a new Spark instance group that uses the new Spark 1.6.1 package. For details, see http://www.ibm.com/support/knowledgecenter/SSZU2E_2.2.0/developing_instances/developing_instances.html.
6. If required, upgrade your existing Spark instance groups to use the new Spark 1.6.1 package. For details, see https://www.ibm.com/support/knowledgecenter/SSZU2E_2.2.0/managing_instances/instance_update_spark_version.html.
Copyright and trademark information
© Copyright IBM Corporation 2017
U.S. Government Users Restricted Rights - Use, duplication or disclosure restricted by GSA ADP Schedule Contract with IBM Corp.
IBM®, the IBM logo, and ibm.com® are trademarks of International Business Machines Corp., registered in many jurisdictions worldwide. Other product and service names might be trademarks of IBM or other companies. A current list of IBM trademarks is available on the Web at "Copyright and trademark information" at www.ibm.com/legal/copytrade.shtml.