DiskChecker$DiskErrorException: Could not find any valid local directory

Description:

Sometimes a MapReduce job fails with the following error:

Shuffle Error: Exceeded the abort failure limit; bailing-out.
java.io.IOException: Task: attempt_2670.201510262034_0327_r_000002_1 - The reduce copier failed
at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:388)
at org.apache.hadoop.mapred.Child$3.run(Child.java:205)
Caused by: org.apache.hadoop.util.DiskChecker$DiskErrorException: Could not find any valid directory for <file_path> at
org.apache.hadoop.fs.LocalDirAllocator$AllocatorPerContext$DirSelector.getPathForWrite(LocalDirAllocator.java:466)

Symptom:

During execution, MapReduce tasks store intermediate data in local directories. These directories are specified by the "mapreduce.cluster.local.dir" parameter in mapred-site.xml.

During job processing, the MapReduce framework checks each directory listed in mapreduce.cluster.local.dir for enough free space to create the intermediate file. If no listed directory has the required space, the job fails with the error shown above.
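A typical configuration spreads the intermediate data across several disks. The snippet below is a sketch of such a mapred-site.xml entry; the directory paths are examples, not values from this cluster:

```xml
<!-- mapred-site.xml: local directories used for intermediate map/reduce data.
     The framework round-robins writes across these paths, so listing one
     directory per physical disk spreads both I/O load and space usage.
     The /data1 and /data2 paths below are illustrative placeholders. -->
<property>
  <name>mapreduce.cluster.local.dir</name>
  <value>/data1/mapred/local,/data2/mapred/local</value>
</property>
```

Listing multiple directories also means the job can survive one disk filling up, as long as another listed directory still has the required space.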

Action:

1. Ensure that the local directories have enough free space for the volume of intermediate data the job is expected to produce.
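Free space on each configured directory can be checked with df. A minimal sketch, where the directory list is a placeholder for the actual value of mapreduce.cluster.local.dir:

```shell
#!/bin/sh
# Print free space for each MapReduce local directory.
# Replace the list below with the directories from mapreduce.cluster.local.dir.
for dir in /tmp /var/tmp; do
  if [ -d "$dir" ]; then
    df -h "$dir"
  fi
done
```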

2. Compress the intermediate (map output) files to reduce local disk consumption.
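Intermediate compression is enabled in mapred-site.xml. The properties below are the standard MRv2 names; the choice of SnappyCodec is an example, and any codec available on the cluster can be used:

```xml
<!-- mapred-site.xml: compress map outputs to shrink intermediate files
     written to mapreduce.cluster.local.dir during the shuffle. -->
<property>
  <name>mapreduce.map.output.compress</name>
  <value>true</value>
</property>
<property>
  <name>mapreduce.map.output.compress.codec</name>
  <value>org.apache.hadoop.io.compress.SnappyCodec</value>
</property>
```

Compression trades some CPU time for disk space and shuffle bandwidth, which is usually a good trade when jobs fail with DiskErrorException.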
