On Aug 10, 2011, at 11:10 AM, Alejandro Abdelnur wrote:

> Eric,
> 
> I'd argue that including the JAR as you suggest will most likely break
> because of required dependencies of the Hadoop JAR that may not be part of
> HBase (i.e., the Jackson JARs).
> 
> But if you want to still do that you can always include the jar from the lib
> directory, for example:
> 
> $HBASE_PREFIX/share/hbase/hbase*.jar:$HBASE_PREFIX/share/hbase/lib/*.jar:$HADOOP_PREFIX/share/hadoop/lib/hadoop-*.jar
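
A minimal sketch of how that suggestion might be wired into a launcher script; the
install prefixes and the add_jars helper below are assumptions for illustration,
and the shell has to do the globbing because the JVM's wildcard classpath entries
only accept a whole directory (dir/*), not a pattern such as hadoop-*.jar:

    #!/bin/bash
    # Sketch only -- the install roots are assumed defaults.
    HBASE_PREFIX=${HBASE_PREFIX:-/usr}
    HADOOP_PREFIX=${HADOOP_PREFIX:-/usr}

    CLASSPATH=""

    # Append every jar matching a glob pattern to CLASSPATH.
    add_jars() {
      local jar
      for jar in $1; do
        [ -f "$jar" ] && CLASSPATH="${CLASSPATH:+$CLASSPATH:}$jar"
      done
    }

    add_jars "$HBASE_PREFIX/share/hbase/hbase*.jar"          # HBase's own jars
    add_jars "$HBASE_PREFIX/share/hbase/lib/*.jar"           # HBase's third-party deps
    add_jars "$HADOOP_PREFIX/share/hadoop/lib/hadoop-*.jar"  # only the Hadoop jars from lib/

    export CLASSPATH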

This example shows the HBase jar files and the Hadoop jar files located in different 
directories, which seems inconsistent.  The subtle distinction of keeping 
$project*.jar in share/$project is good to have.  I don't have a strong opinion 
on whether they are merged into one directory, but the layout should be consistent 
across projects.  You might want to get buy-in from the community before making 
this change.  Owen was in favor of keeping third-party libraries in separate 
directories, and I am in favor of his design.
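
To make that convention concrete, a consistent layout across the two projects might
look roughly like this (a hypothetical illustration, not taken from either project's
actual packaging):

    $PREFIX/share/hadoop/hadoop-*.jar    <- Hadoop's own artifacts
    $PREFIX/share/hadoop/lib/*.jar       <- Hadoop's third-party dependencies
    $PREFIX/share/hbase/hbase*.jar       <- HBase's own artifacts
    $PREFIX/share/hbase/lib/*.jar        <- HBase's third-party dependencies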

regards,
Eric

> Thoughts?
> 
> Thanks.
> 
> Alejandro
> 
> On Thu, Aug 4, 2011 at 3:47 PM, Eric Yang <eric...@gmail.com> wrote:
> 
>> It is easier to write a shell script that imports jar files by directory than
>> to explicitly reference a few jars with specific versions.
>> 
>> The common use case is:
>> 
>> HBase needs to use the Hadoop jar files, but HBase depends on a more recent
>> version of log4j.  The classpath would be constructed as:
>> 
>> 
>> $HBASE_PREFIX/share/hbase/hbase*.jar:$HBASE_PREFIX/share/hbase/lib/*.jar:$HADOOP_PREFIX/share/hadoop/*.jar
>> 
>> This provides a way to segment the library loading with the least amount of
>> scripting while keeping the projects loosely coupled.
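
A short sketch of why that segmentation avoids the log4j conflict; the file names
and layout are assumptions for illustration, with project jars in share/$project
and third-party deps in share/$project/lib:

    # Assumed: Hadoop's own deps (including an older log4j) live in
    # $HADOOP_PREFIX/share/hadoop/lib/, which the classpath below never touches.

    CLASSPATH=""
    for jar in "$HBASE_PREFIX"/share/hbase/hbase*.jar \
               "$HBASE_PREFIX"/share/hbase/lib/*.jar \
               "$HADOOP_PREFIX"/share/hadoop/*.jar; do
      [ -f "$jar" ] && CLASSPATH="${CLASSPATH:+$CLASSPATH:}$jar"
    done
    export CLASSPATH

    # HBase's newer log4j from share/hbase/lib/ is the only copy on the
    # classpath, so there is no version conflict with the one Hadoop bundles.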
>> 
>> regards,
>> Eric
>> 
>> On Aug 4, 2011, at 2:40 PM, Alejandro Abdelnur wrote:
>> 
>>> [moving to core-dev@, general@ BCCed]
>>> 
>>> Eric,
>>> 
>>> Even if the JAR is in lib/ you could import/use that JAR only.
>>> 
>>> How would you use the Hadoop JARs without their dependencies? Many things will
>>> break unless you add the dependency JARs.
>>> 
>>> Granted, there are JARs that are used by Hadoop server side only
>>> (JT/NN/TT/DN/SNN), but that is a different thing. Having a client-side set
>>> of JARs would help handle this (MAPREDUCE-1638).
>>> 
>>> Thoughts?
>>> 
>>> Thanks.
>>> 
>>> Alejandro
>>> 
>>> On Thu, Aug 4, 2011 at 1:14 PM, Eric Yang <eric...@gmail.com> wrote:
>>> 
>>>> Placing the jar files outside of the lib directory ensures that the
>>>> project-generated jar files are not mixed with their dependencies.
>>>> Hence, if another project tries to import the current project's jar files
>>>> without their dependencies, it is possible to do so.
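
For example, with that separation a consumer can pull in just the project's own
artifacts and opt into the dependencies only when it wants them (a sketch, with
paths following the share/$project layout discussed in this thread):

    CLASSPATH=""
    # Only the project's own jars, none of its third-party dependencies:
    for jar in "$HADOOP_PREFIX"/share/hadoop/hadoop-*.jar; do
      [ -f "$jar" ] && CLASSPATH="${CLASSPATH:+$CLASSPATH:}$jar"
    done
    # The dependencies are added only when explicitly wanted:
    #   for jar in "$HADOOP_PREFIX"/share/hadoop/lib/*.jar; do ...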
>>>> 
>>>> regards,
>>>> Eric
>>>> 
>>>> On Aug 4, 2011, at 11:03 AM, Alejandro Abdelnur wrote:
>>>> 
>>>>> What is the rationale for having the hadoop JARs outside of the lib/
>>>>> directory?
>>>>> 
>>>>> It would definitely simplify packaging configuration if they are under lib/
>>>>> as well.
>>>>> 
>>>>> Any objection to it?
>>>>> 
>>>>> Thanks.
>>>>> 
>>>>> Alejandro
>>>> 
>>>> 
>> 
>> 
