SAS LASR Analytic Server can
load data from the Hadoop Distributed File System (HDFS) that is co-located
on the machines in the cluster. Before loading the data, you must
add the data to HDFS. To load from HDFS:
- In the HDFS content explorer, navigate to the SASHDAT file that you want to use. Select the file and click .
- On the General page, specify values for the Job name, Location, and Description fields if you intend to save this action as a job.
- Select the Settings page and specify the SAS LASR Analytic Server and data parameters:
  - This field specifies the location of the server description files.
  - This field specifies the name of the server description file.
  - This field specifies the path to the SASHDAT file and is populated automatically.
  - Specify a description for the data. This description overrides the description that was associated with the data set before it was added to HDFS. The description is displayed beside the table name in the explorer interface.
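For reference, the same load can also be performed from a SAS session with the LASR procedure. The following is a minimal sketch only: the host name, install path, port, and HDFS path are placeholder values, and the option syntax can vary by release, so verify it against the SAS LASR Analytic Server documentation for your environment.

   /* Minimal sketch: load a SASHDAT table from co-located HDFS into a   */
   /* running SAS LASR Analytic Server. Host, install path, port, and    */
   /* HDFS path below are placeholders for this example.                 */
   proc lasr add
      hdfs(path="/dept/sales/prdsale")         /* SASHDAT table in HDFS       */
      port=10010;                              /* port of the running server  */
      performance host="grid001.example.com"   /* head node of the cluster    */
                  install="/opt/TKGrid"        /* TKGrid installation path    */
                  nodes=all;
   run;

Note that in this approach the server is identified by its host and port rather than by a server description file, so these values must correspond to the server that the explorer interface uses.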
The following items provide additional information about loading from HDFS:
- Before the data can be loaded from HDFS, it must be added to HDFS. For more information, see Prepare Data.
- The data can also be added to HDFS with the OLIPHANT procedure from a SAS session, as sketched after this list.
- SAS LASR Analytic Server reads only SASHDAT files from HDFS. It does not read any other data that is stored in HDFS, such as the output of MapReduce jobs.
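As a rough illustration of the OLIPHANT approach, the step below distributes a SAS data set to HDFS as a SASHDAT file. Treat it as a sketch only: the server name, install path, and HDFS directory are placeholders, and the option names shown are assumptions that can differ by release, so confirm the exact syntax in the SAS LASR Analytic Server reference documentation.

   /* Sketch only: add a SAS data set to HDFS as a SASHDAT file with      */
   /* PROC OLIPHANT. The host, install path, and HDFS directory are       */
   /* placeholders, and option names may differ in your release.          */
   proc oliphant
      hadoopserver="grid001.example.com"   /* Hadoop NameNode host (assumed option name) */
      install="/opt/TKGrid";               /* TKGrid installation path                   */
      add data=sashelp.prdsale             /* source SAS data set                        */
          path="/dept/sales"               /* target directory in HDFS                   */
          replace;                         /* overwrite an existing SASHDAT file         */
   run;

Once the SASHDAT file is in HDFS, it can be loaded through the explorer interface as described in the steps above.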