Thanks a lot. It worked!

On Thu, May 10, 2012 at 1:14 PM, shashwat shriparv <
dwivedishash...@gmail.com> wrote:

>
> Some suggestions:
>
> 1. chown the Hive folder to your user
> 2. chmod the Hive folder to 755
> 3. Set this in hive-site.xml:
>
>
> <property>
>     <name>hive.exec.scratchdir</name>
>     <value>/home/yourusername/mydir</value>
>     <description>Scratch space for Hive jobs</description>
>   </property>
>
>
>
> 4. Put
>
>
> hadoop-0.20-core.jar
> hive/lib/hive-exec-0.7.1.jar
> hive/lib/hive-jdbc-0.7.1.jar
> hive/lib/hive-metastore-0.7.1.jar
> hive/lib/hive-service-0.7.1.jar
> hive/lib/libfb303.jar
> lib/commons-logging-1.0.4.jar
> slf4j-api-1.6.1.jar
> slf4j-log4j12-1.6.1.jar
>
> these files into the Hive lib folder.
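The scratch-directory and jar-copy steps above can be sketched as shell commands. Every path here is a placeholder; substitute your own Hive install and scratch locations:

```shell
# Sketch: create the scratch directory named in hive-site.xml, then copy
# the required jars into Hive's lib folder. All paths are illustrative.
SCRATCH=/tmp/demo-hive-scratch       # value of hive.exec.scratchdir
HIVE_HOME=/tmp/demo-hive
mkdir -p "$SCRATCH" "$HIVE_HOME/lib"
touch /tmp/hadoop-0.20-core.jar      # stand-in for the real Hadoop core jar
cp /tmp/hadoop-0.20-core.jar "$HIVE_HOME/lib/"
ls "$HIVE_HOME/lib"
```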
>
>
>
>
> Then define the external jars in hive-env.sh and hive-site.xml, as follows.
>
> In hive-env.sh, change the following:
>
>
> # Set HADOOP_HOME to point to a specific hadoop install directory
>
> # Replace the path below with your own Hadoop install directory:
> export HADOOP_HOME=/home/shashwat/Hadoop/hadoop-0.20.205
>
> # Hive Configuration Directory can be controlled by:
>
> # Point this at your Hive conf directory:
> export HIVE_CONF_DIR=/home/shashwat/Hadoop/hive-0.7.1/conf
>
> # The folder containing extra libraries required for Hive compilation/execution
> # can be controlled by HIVE_AUX_JARS_PATH. Point it at Hive's lib directory;
> # you can append the Hadoop and HBase lib directories too, as follows:
> export HIVE_AUX_JARS_PATH=/home/shashwat/Hadoop/hive-0.7.1/lib:/home/shashwat/Hadoop/hadoop-0.20.205/lib:/home/shashwat/Hadoop/hbase/lib
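As a quick sanity check, you can verify that each exported directory actually exists before starting Hive; a missing path is a common cause of ENOENT errors like the one below. The values here are placeholders:

```shell
# Sanity check (illustrative values): confirm each exported path exists.
export HADOOP_HOME=/tmp              # substitute your real Hadoop directory
export HIVE_CONF_DIR=/tmp            # substitute your real Hive conf directory
for d in "$HADOOP_HOME" "$HIVE_CONF_DIR"; do
  if [ -d "$d" ]; then echo "ok: $d"; else echo "missing: $d"; fi
done
```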
>
> Also add this in hive-site.xml:
>
>   <property>
>     <name>hive.aux.jars.path</name>
>     <value>file:///home/shashwat/Hadoop/hive-0.7.1/lib/hive-hbase-handler-0.7.1.jar,file:///home/shashwat/Hadoop/hive-0.7.1/lib/hbase-0.90.4.jar,file:///home/shashwat/Hadoop/hive-0.7.1/lib/zookeeper-3.3.1.jar</value>
>   </property>
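If you keep many jars, building that comma-separated value by hand is error-prone. A small sketch that generates it from a lib folder (the demo directory and jar names below are made up):

```shell
# Sketch (demo paths): build the hive.aux.jars.path value from a lib folder,
# prefixing every jar with file:// and joining the entries with commas.
HIVE_LIB=/tmp/hive-demo-lib                      # substitute your hive/lib
mkdir -p "$HIVE_LIB"
touch "$HIVE_LIB/hive-hbase-handler.jar" "$HIVE_LIB/zookeeper.jar"  # stand-ins
AUX=$(ls "$HIVE_LIB"/*.jar | sed 's|^|file://|' | paste -sd, -)
echo "$AUX"
```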
>
>
> By default, a freshly downloaded Hive has no hive-site.xml or hive-env.sh in
> its conf dir, only hive-site.xml.template and hive-env.sh.template. Make a
> backup of both, then rename hive-site.xml.template to hive-site.xml and
> hive-env.sh.template to hive-env.sh, and make the changes suggested above.
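The template renaming can be sketched like this (the conf path and stand-in files are illustrative; substitute your real hive/conf directory):

```shell
# Sketch: back up each template, then rename it to the live config name.
CONF=/tmp/demo-hive-conf                         # substitute your hive/conf
mkdir -p "$CONF"
touch "$CONF/hive-env.sh.template" "$CONF/hive-site.xml.template"  # stand-ins
for f in hive-env.sh hive-site.xml; do
  cp "$CONF/$f.template" "$CONF/$f.template.bak"   # keep a backup copy
  mv "$CONF/$f.template" "$CONF/$f"
done
ls "$CONF"
```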
>
>
>
>
>
> Note: whenever you download and extract something on Linux, don't forget to
> change the owner and mode of the folder recursively, using chown and chmod.
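That note can be sketched as follows (the extracted directory here is a throwaway example):

```shell
# Sketch: after extracting an archive, recursively take ownership and set
# permissions on the whole tree. The directory is a stand-in.
EXTRACTED=/tmp/demo-extracted
mkdir -p "$EXTRACTED/sub"
chown -R "$(id -un)" "$EXTRACTED"    # no-op if you already own it
chmod -R 755 "$EXTRACTED"
[ -x "$EXTRACTED/sub" ] && echo "permissions ok"
```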
>
>
>
> Hope this helps.
>
> Cheers,
>
>
>
> ∞
>
> Shashwat Shriparv
>
> On Thu, May 10, 2012 at 9:59 PM, Mahsa Mofidpoor <mofidp...@gmail.com> wrote:
>
>> Hi,
>>
>> When I want to join two tables, I receive the following error:
>>
>> 12/05/10 12:03:31 WARN conf.HiveConf: hive-site.xml not found on CLASSPATH
>> WARNING: org.apache.hadoop.metrics.jvm.EventCounter is deprecated. Please
>> use org.apache.hadoop.log.metrics.EventCounter in all the log4j.properties
>> files.
>> Execution log at:
>> /tmp/umroot/umroot_20120510120303_4d0145bb-27fa-4d4a-8cbc-95d8353fccaf.log
>> ENOENT: No such file or directory
>> at org.apache.hadoop.io.nativeio.NativeIO.chmod(Native Method)
>>  at org.apache.hadoop.fs.FileUtil.execSetPermission(FileUtil.java:692)
>> at org.apache.hadoop.fs.FileUtil.setPermission(FileUtil.java:647)
>>  at
>> org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:509)
>> at
>> org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:344)
>>  at
>> org.apache.hadoop.fs.FilterFileSystem.mkdirs(FilterFileSystem.java:189)
>> at
>> org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:116)
>>  at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:856)
>> at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:850)
>>  at java.security.AccessController.doPrivileged(Native Method)
>> at javax.security.auth.Subject.doAs(Subject.java:396)
>>  at
>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1093)
>> at
>> org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:850)
>>  at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:824)
>> at org.apache.hadoop.hive.ql.exec.ExecDriver.execute(ExecDriver.java:435)
>>  at org.apache.hadoop.hive.ql.exec.ExecDriver.main(ExecDriver.java:693)
>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>  at
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>> at
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>>  at java.lang.reflect.Method.invoke(Method.java:597)
>> at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
>> Job Submission failed with exception
>> 'org.apache.hadoop.io.nativeio.NativeIOException(No such file or directory)'
>> Execution failed with exit status: 2
>> Obtaining error information
>>
>> Task failed!
>> Task ID:
>>   Stage-1
>>
>> Logs:
>>
>> /tmp/umroot/hive.log
>> FAILED: Execution Error, return code 2 from
>> org.apache.hadoop.hive.ql.exec.MapRedTask
>>
>>
>> I use hadoop-0.20.2 (single-node setup) and I have built Hive from
>> the latest source code.
>>
>>
>> Thank you in advance for your help,
>>
>> Mahsa
>>
>
>
>
> --
>
>
> ∞
> Shashwat Shriparv
