| File Type | Input | Output |
|---|---|---|
| delimited | HDFS direct-read; HCatalog if partitioned, skewed, or escaped | HDFS direct-read; HCatalog if partitioned, skewed, or escaped |
| RCFile | HCatalog | HCatalog |
| ORC | HCatalog | HCatalog |
| Parquet | HCatalog ¹ | CREATE TABLE AS SELECT ² |
| sequence | HDMD; HCatalog if partitioned or skewed | HCatalog |
| Avro | HCatalog ¹ | CREATE TABLE AS SELECT ³ |
¹ Partitioned Avro or Parquet data is not supported as input to the SAS In-Database Code Accelerator for Hadoop.

² Output cannot be written directly to Parquet files because of this issue: https://issues.apache.org/jira/browse/HIVE-8838.

³ Output cannot be written directly to Avro files because of this issue: https://issues.apache.org/jira/browse/HIVE-8687.
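As the table indicates, Parquet and Avro output is produced through a CREATE TABLE AS SELECT statement rather than by writing the output files directly. A minimal HiveQL sketch of that pattern, using hypothetical table and column names:

```sql
-- Hypothetical CTAS pattern: have Hive create the Parquet (or Avro) table
-- from a SELECT instead of writing the output files directly.
CREATE TABLE sales_summary
STORED AS PARQUET          -- use STORED AS AVRO for Avro output
AS
SELECT region,
       SUM(amount) AS total_amount
FROM   sales_detail
GROUP BY region;
```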
HCatalog support requires the following JAR files:

- webhcat-java-client*.jar
- hbase-storage-handler*.jar
- hcatalog-server-extensions*.jar
- hcatalog-core*.jar
- hcatalog-pig-adapter*.jar
```
ERROR: Job job_1424277669708_2919 has failed. Please, see job log for
details. Job tracking URL :
http://name.unx.company.com:8088/proxy/application_1424277669708_2919/
```