Hadoop MapReduce jobs initiated by recent versions of the SAS® 9.4_M2 Embedded Process for Hadoop might fail. Error messages similar to the following appear in the SAS log file:
```
NOTE: Attempting to run DATA Step in Hadoop.
ERROR: Map/Reduce job failed.
Could not run Hadoop job.
FATAL: Unrecoverable I/O error detected in the execution of the DATA step program.
       Aborted during the COMPILATION phase.
NOTE: The SAS System stopped processing this step because of errors.
WARNING: The data set MYHIVE.TEST may be incomplete. When this step was stopped
         there were 0 observations and 0 variables.
ERROR: Error attempting to CREATE a DBMS table with no columns. This is not supported.
```
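For context, a DATA step of the following general shape can trigger in-database execution through the Embedded Process. This is an illustrative sketch only: the libref options, server name, and table names are hypothetical (the note's log mentions only `MYHIVE.TEST`), and it is not a reproduction of a specific failing job.

```sas
/* Hypothetical connection: server, user, and table names are
   illustrative, not taken from this note. */
libname myhive hadoop server="hive-node.example.com"
        user=sasuser subprotocol=hive2;

/* Allow SAS to push eligible DATA step code into Hadoop,
   where it runs under the SAS Embedded Process. */
options dsaccel=any;

/* When this step runs in Hadoop, the SAS log shows
   "NOTE: Attempting to run DATA Step in Hadoop." */
data myhive.test;
   set myhive.source_table;
   if amount > 100;
run;
```

Jobs of this kind are the ones affected by the failure described above.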
Error messages similar to the following appear in the Yarn ResourceManager logs:
```
java.io.IOException: The server responded with a DS2 prepare failure.
Possible causes: The input metadata does not match what DS2 expects.
The input metadata file format is wrong.
        at com.sas.access.hadoop.ep.client.EpProtocol.isServerMessageOfType(EpProtocol.java:1431)
        at com.sas.access.hadoop.ep.client.EpProtocol.processOutputMetadataReturnMessage(EpProtocol.java:1301)
        at com.sas.access.hadoop.ep.client.EpProtocol.sendMetadata(EpProtocol.java:890)
        at com.sas.access.hadoop.ep.client.EpProtocol.<init>(EpProtocol.java:143)
        at com.sas.access.hadoop.ep.client.EpProtocol.createEPProtocol(EpProtocol.java:209)
        at com.sas.access.hadoop.ep.client.EpProtocol.createEPProtocol(EpProtocol.java:168)
        at com.sas.hadoop.ep.superreader.SuperReaderThread.runEPServer(SuperReaderThread.java:217)
        at com.sas.hadoop.ep.superreader.SuperReaderThread.run(SuperReaderThread.java:147)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
java.io.IOException: One or more super reader threads failed to execute.
Please see job log for additional error information.
        at com.sas.hadoop.ep.superreader.SuperReader.next(SuperReader.java:206)
        at org.apache.hadoop.mapred.MapTask$TrackedRecordReader.moveToNext(MapTask.java:199)
        at org.apache.hadoop.mapred.MapTask$TrackedRecordReader.next(MapTask.java:185)
        at com.sas.hadoop.ep.superreader.SuperMapRunner.run(SuperMapRunner.java:78)
        at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:450)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
        at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1642)
        at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
```
Click the Hot Fix tab in this note to access the hot fix for this issue.
| Product Family | Product | System | Product Release (Reported) | Product Release (Fixed*) | SAS Release (Reported) | SAS Release (Fixed*) |
| --- | --- | --- | --- | --- | --- | --- |
| SAS System | SAS/ACCESS Interface to Hadoop | Linux for x64 | 9.4_M2 | 9.43 | 9.4 TS1M2 | 9.4 TS1M3 |