Please check your YARN log; it will have details about this error.
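A quick way to pull those logs (a sketch, assuming the job ran on YARN with log aggregation enabled; the application id below is a placeholder, use the one printed when your job launched):

```shell
# Placeholder application id -- substitute the id shown in the job's console output.
yarn logs -applicationId application_1404711000000_0001
```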
Regards,
Dev
+91 958 305 9899
> On Jul 7, 2014, at 10:46 AM, Ritesh Kumar Singh
> wrote:
>
> hive> select COUNT(*) from movies;
> Total MapReduce jobs = 1
> Launching Job 1 out of 1
> Number of reduce tasks determined at compile time: 1
> In order to change the average load for a reducer (in bytes):
>   set hive.exec.reducers.bytes.per.reducer=<number>
> In order to limit the maximum number of reducers:
>   set