Praveen,

Including those worked! Thanks.
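For reference, here is roughly what I ended up adding at the top of the script. This is a sketch only; `<version>` stands in for the actual CDH/jar versions installed on the cluster:

```pig
-- Register the suggested jars before the LOAD (paths follow the
-- Cloudera parcel layout; <version> is a placeholder).
REGISTER /opt/cloudera/parcels/CDH-<version>/lib/hbase/hbase-<version>-security.jar;
REGISTER /opt/cloudera/parcels/CDH-<version>/lib/zookeeper/zookeeper-<version>.jar;

events = LOAD 'hbase://events'
    USING org.apache.pig.backend.hadoop.hbase.HBaseStorage('info:field1 info:field2 info:field3')
    AS (field1:chararray, field2:chararray, field3:chararray);
```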

But I have some questions:

1-Why do we even need to do this? If the same nodes in the cluster are
being used as DataNodes/TaskTrackers/RegionServers, shouldn't these libs
already be on the classpath? These are pretty basic libs. Had the
RegionServer nodes been separate from the TaskTracker nodes, I could
understand. But even when the same node/server hosts the
RegionServer/DataNode/TaskTracker, does the TT daemon, when started, not
have the hbase jars on its classpath?

2-Why the hbase security jar, and not the main one? Is the 'security'
name misleading, in the sense that the jar is not 'only' for hbase
security but for hbase 'with' security?

3-Why the zookeeper jar, though? Is it because the hbase jar that is
actually needed here depends on it downstream in the dependency chain?

4-Lastly, is there a way to avoid hardcoding the paths of these
registered libraries?

Or am I missing something here? Thanks for any explanation.
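(As a possible answer to my own question 4: assuming the `hbase` launcher
script is on the PATH of the node running Pig, something like the
following sketch might avoid the hardcoded paths. `myscript.pig` is just a
placeholder name; I have not verified this on the cluster.)

```shell
# Sketch: derive the jar list from `hbase classpath` instead of
# hardcoding parcel paths. Assumes the `hbase` launcher script is
# installed on this node; `myscript.pig` is a placeholder.
export HADOOP_CLASSPATH="$(hbase classpath):${HADOOP_CLASSPATH}"
# Pig expects a comma-separated list in pig.additional.jars, while
# `hbase classpath` emits a colon-separated one, hence the tr:
pig -Dpig.additional.jars="$(hbase classpath | tr ':' ',')" myscript.pig
```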

Regards,
Shahab


On Fri, May 31, 2013 at 7:44 AM, shashwat shriparv <[email protected]> wrote:

> export HBASE_CLASSPATH=<jar files>
> export HADOOP_CLASSPATH=<jar files>
>
> Thanks & Regards
>
> ∞
> Shashwat Shriparv
>
>
>
> On Fri, May 31, 2013 at 4:49 PM, Shahab Yunus <[email protected]> wrote:
>
> > Thanks, I will try that and provide an update. Out of curiosity, did
> > you ever resolve it?
> >
> > Regards,
> > Shahab
> >
> >
> > On Fri, May 31, 2013 at 4:17 AM, Praveen Bysani <[email protected]> wrote:
> >
> > > Yes, it could be. Try registering these jar files:
> > >
> > > /opt/cloudera/parcels/CDH-<version>/lib/hbase/hbase-<version>-security.jar
> > > /opt/cloudera/parcels/CDH-<version>/lib/zookeeper/zookeeper-<version>.jar
> > >
> > > On 30 May 2013 20:14, Shahab Yunus <[email protected]> wrote:
> > >
> > > > I am not explicitly registering any of these jars in the script. The
> > > > cluster was setup through standard Cloudera installation (4.2.0).
> > > > Should I? Is that the issue?
> > > >
> > > > Regards,
> > > > Shahab
> > > >
> > > >
> > > > On Wed, May 29, 2013 at 11:26 PM, Praveen Bysani <[email protected]> wrote:
> > > >
> > > > > Hi,
> > > > >
> > > > > Are you registering the hbase and zookeeper jar files in your pig
> > > > > script?
> > > > >
> > > > > On 30 May 2013 06:24, Shahab Yunus <[email protected]> wrote:
> > > > >
> > > > > > Hello,
> > > > > >
> > > > > > When loading data from an HBase table in Pig using HBaseStorage,
> > > > > > if I specify the type of the fields as chararray, I get an
> > > > > > exception:
> > > > > >
> > > > > > 2013-05-29 16:18:56,557 INFO org.apache.hadoop.mapred.TaskLogsTruncater: Initializing logs' truncater with mapRetainSize=-1 and reduceRetainSize=-1
> > > > > > 2013-05-29 16:18:56,560 FATAL org.apache.hadoop.mapred.Child: Error running child : java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/mapreduce/TableInputFormat
> > > > > >         at java.lang.ClassLoader.defineClass1(Native Method)
> > > > > >         at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
> > > > > >         at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
> > > > > >         at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
> > > > > >         at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
> > > > > >         at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
> > > > > >         at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
> > > > > >         at java.security.AccessController.doPrivileged(Native Method)
> > > > > >         at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
> > > > > >         at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
> > > > > >         at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
> > > > > >         at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
> > > > > >         at java.lang.Class.forName0(Native Method)
> > > > > >         at java.lang.Class.forName(Class.java:247)
> > > > > >
> > > > > > events = LOAD 'hbase://events'
> > > > > >     USING org.apache.pig.backend.hadoop.hbase.HBaseStorage('info:field1 info:field2 info:field3')
> > > > > >     AS (field1:chararray, field2:chararray, field3:chararray);
> > > > > >
> > > > > > It seems to work if I use bytearray instead of chararray (or no
> > > > > > type at all):
> > > > > >
> > > > > > AS (field1:bytearray, field2:bytearray, field3:bytearray);
> > > > > >
> > > > > > or
> > > > > >
> > > > > > AS (field1, field2, field3);
> > > > > >
> > > > > > My question is: why do we get a NoClassDefFoundError for
> > > > > > TableInputFormat, which is kind of misleading? And is the type
> > > > > > casting really the issue? Wouldn't an automatic byte-to-char
> > > > > > conversion happen?
> > > > > >
> > > > > > This issue has been discussed earlier as well, but it doesn't
> > > > > > seem any conclusion was reached. Also, so far I have not seen any
> > > > > > problem regarding the HBase libs anywhere:
> > > > > >
> > > > > > http://search-hadoop.com/m/KUU6m1oZMEi1&subj=Re+Unable+to+typecast+fields+loaded+from+HBase
> > > > > >
> > > > > > Thanks.
> > > > > >
> > > > > > Regards,
> > > > > > Shahab
> > > > > >
> > > > >
> > > > >
> > > > >
> > > > > --
> > > > > Regards,
> > > > > Praveen Bysani
> > > > > http://www.praveenbysani.com
> > > > >
> > > >
> > >
> > >
> > >
> > > --
> > > Regards,
> > > Praveen Bysani
> > > http://www.praveenbysani.com
> > >
> >
>
