The following prerequisites
are required before you install and configure the in-database deployment
package for Hadoop using the SAS Deployment Manager:
- The SSH user must have passwordless sudo access.
- If your cluster is secured with Kerberos, in addition to having a valid ticket on the client, a valid Kerberos ticket must exist on the node that is running Hive. This is the node that you specify when using the SAS Deployment Manager.
- If you are using Cloudera, the SSH account must have Write permission to these directories:
  - /opt/cloudera
  - /opt/cloudera/csd
  - /opt/cloudera/parcels
- You cannot customize the install location of the SAS Embedded Process on the cluster. By default, the SAS Deployment Manager deploys the SAS Embedded Process in the /opt/cloudera/parcels directory for Cloudera and the /opt/sasep_stack directory for Hortonworks.
- If you are using Cloudera, the Java jar and gzip commands must be available.
- If you are using Hortonworks with the requiretty option enabled and you install the SAS Embedded Process using the SAS Deployment Manager, you must restart the Ambari server after deployment. Otherwise, the SASEP Service does not appear in the Ambari list of services. It is recommended that you disable the requiretty option until the deployment is complete.
- The following information is required:
  - host name and port of the cluster manager
  - credentials (account name and password) for the Hadoop cluster manager
  - SSH credentials of the administrator who has access to both the Hive and Oozie nodes
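For the Hortonworks recommendation above, one common way to disable requiretty temporarily is a drop-in sudoers file. The account name sasdeploy is a placeholder, not from this document; remove the file once the deployment is complete and the Ambari server has been restarted.

```
# /etc/sudoers.d/sasdeploy -- temporary, delete after deployment completes
Defaults:sasdeploy !requiretty
```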
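Several of the prerequisites above can be sanity-checked from the SSH account before launching the SAS Deployment Manager. The following is a hypothetical preflight sketch, not part of the SAS installer; it assumes a POSIX shell and that `klist` is on the PATH when Kerberos is in use.

```shell
#!/bin/sh
# Preflight sketch for the prerequisites listed above (assumptions noted inline).

# Passwordless sudo: "sudo -n" refuses to prompt, so it fails fast
# if a password would be required.
if sudo -n true 2>/dev/null; then
  echo "sudo: passwordless OK"
else
  echo "sudo: password required (grant passwordless sudo first)"
fi

# Kerberos: "klist -s" exits nonzero when no valid ticket is in the cache.
# Run this on the client and on the node that is running Hive.
if klist -s 2>/dev/null; then
  echo "kerberos: valid ticket found"
else
  echo "kerberos: no valid ticket (run kinit, or ignore if not Kerberized)"
fi

# Cloudera only: these directories must be writable by the SSH account.
for d in /opt/cloudera /opt/cloudera/csd /opt/cloudera/parcels; do
  if [ -w "$d" ]; then
    echo "$d: writable"
  else
    echo "$d: NOT writable (or does not exist)"
  fi
done

# Cloudera only: the jar and gzip commands must be available.
for cmd in jar gzip; do
  if command -v "$cmd" >/dev/null 2>&1; then
    echo "$cmd: found"
  else
    echo "$cmd: missing"
  fi
done
```

The script only reports; it does not change anything, so it is safe to run repeatedly while fixing each item.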