Configuring a Hadoop Distributed File System

To enable users to publish scoring model files to a Hadoop Distributed File System (HDFS) from SAS Model Manager using the SAS Embedded Process, follow these steps:
  1. Create an HDFS directory where the model files can be stored.
    Note: The path to this directory is used when a user publishes a model from the SAS Model Manager user interface to Hadoop.
  2. Grant users Write access permission to the HDFS directory. For more information, see Hadoop Permissions.
  3. Add this line of code to the autoexec_usermods.sas file. On Windows, the file is located in \SAS-configuration-directory\Levn\SASApp\WorkspaceServer\:
    %let HADOOP_Auth = Kerberos or blank;
    UNIX Specifics: On UNIX, the autoexec_usermods.sas file is located in /SAS-configuration-directory/Levn/SASApp/WorkspaceServer/.
    If your Hadoop server is configured with Kerberos, set the HADOOP_Auth variable to Kerberos. Otherwise, leave it blank.
  4. (Optional) To enable users to copy the publish code and execute it using Base SAS, add this line of code to the sasv9.cfg file. On Windows, the file is located in \SASHome\SASFoundation\9.4\:
    -AUTOEXEC '\SAS-configuration-directory\Levn\SASApp\WorkspaceServer\autoexec_usermods.sas'
    UNIX Specifics: On UNIX, the sasv9.cfg file is located in /SASHome/SASFoundation/9.4/.
  5. (Optional) If your Hadoop distribution uses Kerberos authentication, each user must have a valid Kerberos ticket to access SAS Model Manager. You must also complete additional post-installation configuration steps to enable users to publish models to HDFS from SAS Model Manager. For more information, see SAS Model Manager: Administrator’s Guide.
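For step 3, the resulting entry in autoexec_usermods.sas might look like the sketch below. In SAS macro syntax, an empty %let assignment is how the variable is left blank:

```sas
/* autoexec_usermods.sas: Hadoop authentication setting for
   publishing with the SAS Embedded Process */
%let HADOOP_Auth = Kerberos;   /* cluster is secured with Kerberos */
/* %let HADOOP_Auth = ; */     /* use this form instead if Kerberos is not used */
```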
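Steps 1 and 2 can be carried out from a shell on a Hadoop client node using the standard hdfs dfs commands. The directory path and group name below are assumptions, not values from this guide; substitute your site's own:

```shell
# Assumed HDFS path for published models; substitute your site's path.
MODEL_DIR=/sas/modelmanager/models

# Step 1: create the HDFS directory (and any missing parent directories).
hdfs dfs -mkdir -p "$MODEL_DIR"

# Step 2: grant Write access. Here group ownership is assigned to a
# hypothetical "modelers" group, with read/write/execute for the group.
hdfs dfs -chown :modelers "$MODEL_DIR"
hdfs dfs -chmod 775 "$MODEL_DIR"
```

These commands require a configured Hadoop client and appropriate HDFS privileges (typically the hdfs superuser for chown).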
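For step 5, a user can obtain and verify a Kerberos ticket with the standard kinit and klist utilities. The principal shown is a placeholder:

```shell
# Obtain a Kerberos ticket; replace the principal with your own.
kinit user@EXAMPLE.COM

# Verify that the credential cache now holds a valid ticket.
klist
```

These commands require a reachable KDC and a Kerberos client configuration (krb5.conf) for your realm.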
Last updated: February 9, 2017