I am using hadoop-0.20.2-cdh3u1 from Cloudera.

Jing
On Sat, Dec 17, 2011 at 12:18 AM, Vivek Mishra <vivek.mis...@impetus.co.in> wrote:

> Which version of Hadoop are you experimenting with? AFAIK, only 0.20.x
> works fine. I tried versions > 0.20.x but had no luck.
>
> Vivek
> ________________________________________
> From: alo alt [wget.n...@googlemail.com]
> Sent: 16 December 2011 14:59
> To: user@hive.apache.org
> Subject: Re: hive select count(*) query exception
>
> Hi,
>
> It looks like the user who runs the statement does not have the correct
> rights:
> org.apache.hadoop.fs.permission.FsPermission$2.<init>())'
>
> - Alex
>
> On Fri, Dec 16, 2011 at 8:59 AM, jingjung Ng <jingjun...@gmail.com> wrote:
> > Hi,
> >
> > I have a simple Hive "select count(*)" query which results in the
> > following exception. I am using Cloudera cdh3u1 (hadoop/hbase/hive).
> > However, I am able to run "select * from t1" from the Hive CLI.
> >
> > Here is the output after running "select count(*) from t1":
> >
> > hive> select count(*) from t1;
> > Total MapReduce jobs = 1
> > Launching Job 1 out of 1
> > Number of reduce tasks determined at compile time: 1
> > In order to change the average load for a reducer (in bytes):
> >   set hive.exec.reducers.bytes.per.reducer=<number>
> > In order to limit the maximum number of reducers:
> >   set hive.exec.reducers.max=<number>
> > In order to set a constant number of reducers:
> >   set mapred.reduce.tasks=<number>
> > org.apache.hadoop.ipc.RemoteException: IPC server unable to read call
> > parameters: java.lang.NoSuchMethodException:
> > org.apache.hadoop.fs.permission.FsPermission$2.<init>()
> >   at org.apache.hadoop.ipc.Client.call(Client.java:1107)
> >   at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:226)
> >   at $Proxy4.setPermission(Unknown Source)
> >   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >   at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> >   at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> >   at java.lang.reflect.Method.invoke(Method.java:597)
> >   at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:82)
> >   at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:59)
> >   at $Proxy4.setPermission(Unknown Source)
> >   at org.apache.hadoop.hdfs.DFSClient.setPermission(DFSClient.java:855)
> >   at org.apache.hadoop.hdfs.DistributedFileSystem.setPermission(DistributedFileSystem.java:560)
> >   at org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:123)
> >   at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:839)
> >   at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:833)
> >   at java.security.AccessController.doPrivileged(Native Method)
> >   at javax.security.auth.Subject.doAs(Subject.java:396)
> >   at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1127)
> >   at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:833)
> >   at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:807)
> >   at org.apache.hadoop.hive.ql.exec.ExecDriver.execute(ExecDriver.java:657)
> >   at org.apache.hadoop.hive.ql.exec.MapRedTask.execute(MapRedTask.java:123)
> >   at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:130)
> >   at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
> >   at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1063)
> >   at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:900)
> >   at org.apache.hadoop.hive.ql.Driver.run(Driver.java:748)
> >   at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:209)
> >   at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:286)
> >   at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:513)
> >   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >   at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> >   at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> >   at java.lang.reflect.Method.invoke(Method.java:597)
> >   at org.apache.hadoop.util.RunJar.main(RunJar.java:186)
> > Job Submission failed with exception
> > 'org.apache.hadoop.ipc.RemoteException(IPC server unable to read call
> > parameters: java.lang.NoSuchMethodException:
> > org.apache.hadoop.fs.permission.FsPermission$2.<init>())'
> > FAILED: Execution Error, return code 1 from
> > org.apache.hadoop.hive.ql.exec.MapRedTask
> > hive>
> >
> > Thanks,
> >
> > Jing.
>
> --
> Alexander Lorenz
> http://mapredit.blogspot.com
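Vivek's version question can be checked mechanically: the stack trace shows the failure happens only when job submission first talks to the cluster (`JobSubmissionFiles.getStagingDir` → `setPermission` over IPC), which is consistent with a Hadoop version mismatch between the Hive client's jars and the cluster. A minimal sketch of the comparison; the version strings below are placeholders, not values from this thread — on a live system they would come from `hadoop version` on the Hive client and from the NameNode web UI or `hadoop version` on a cluster node:

```shell
#!/bin/sh
# Placeholder values -- on a real system, fill these in from:
#   client: hadoop version | head -1     (on the machine running the Hive CLI)
#   server: the NameNode web UI, or "hadoop version" run on a cluster node
client_ver="0.20.2-cdh3u1"
server_ver="0.20.2-cdh3u1"

# Any difference here can surface as NoSuchMethodException inside the IPC
# layer, because client and server disagree on the wire format of RPC calls.
if [ "$client_ver" = "$server_ver" ]; then
  echo "OK: client and cluster both run $client_ver"
else
  echo "MISMATCH: client=$client_ver cluster=$server_ver"
fi
```

This would also explain why "select * from t1" succeeds while "select count(*)" fails: a plain "select *" is served by the client reading HDFS directly, whereas "count(*)" must submit a MapReduce job, and job submission is where the incompatible `setPermission` RPC is first exercised.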