Hi Harold,
  It seems you don't have the latest HDFS code; the KerberosInfo annotation
has changed slightly in the latest code.
  Also, KerberosInfo not being found means you have an older jar for common
in your cache. Try 'ant clean clean-cache' before compiling.
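
For reference, here is a minimal sketch of what the annotation change amounts
to. This is only an illustration, not the actual diff: the serverPrincipal
element name comes from the javac messages in your log, and the imports assume
the usual package locations.

    import org.apache.hadoop.hdfs.DFSConfigKeys;
    import org.apache.hadoop.security.KerberosInfo;

    // Old form, as it appears in the failing sources -- it relies on the
    // annotation's implicit value() element, which no longer exists:
    //   @KerberosInfo(DFSConfigKeys.DFS_NAMENODE_USER_NAME_KEY)

    // Current form -- the principal key is supplied through the named
    // serverPrincipal element that javac reports as missing:
    @KerberosInfo(serverPrincipal = DFSConfigKeys.DFS_NAMENODE_USER_NAME_KEY)
    public interface ClientProtocol {
        // protocol methods unchanged
    }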

> How do I force the compile to use my own jar file, the jar from the version of
> hadoop-common that I have?

To use your own version of the jar, run 'ant mvn-install' in the common
project, and in hdfs pass the -Dresolvers=internal flag on the ant command
line. Please make sure to run clean-cache before this.
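
Roughly, the steps look like this (run each command from the root of the
corresponding checkout):

    # in your hadoop-common checkout: publish your locally built common jar
    ant mvn-install

    # in your hdfs checkout: drop the cached artifacts, then build against it
    ant clean-cache
    ant -Dresolvers=internal clean jar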


On 5/20/10 10:35 PM, "Harold Lim" <rold...@yahoo.com> wrote:

> Hi,
> 
> I also have the source from hadoop-common. However, when I do ant clean jar
> from the hdfs folder, ivy seems to try to download the hadoop-core jar file
> from the repository.
> 
> Maybe the newer version of hadoop-core is not compatible with mine?
> How do I force the compile to use my own jar file, the jar from the version of
> hadoop-common that I have?
> 
> 
> Thanks,
> Harold
> 
> --- On Fri, 5/21/10, Sagar Shukla <sagar_shu...@persistent.co.in> wrote:
> 
>> From: Sagar Shukla <sagar_shu...@persistent.co.in>
>> Subject: RE: Problem compiling from source
>> To: "hdfs-u...@hadoop.apache.org" <hdfs-u...@hadoop.apache.org>,
>> "hdfs-dev@hadoop.apache.org" <hdfs-dev@hadoop.apache.org>
>> Date: Friday, May 21, 2010, 12:28 AM
>> Hi Harold,
>>       The error message "cannot find symbol"
>> hints at missing libraries. It looks like the build is
>> trying to access Kerberos libraries that it cannot
>> find. You could check whether all the required Kerberos
>> libraries are available.
>> 
>> Regards,
>> Sagar
>> 
>> -----Original Message-----
>> From: Harold Lim [mailto:rold...@yahoo.com]
>> Sent: Friday, May 21, 2010 9:54 AM
>> To: hdfs-dev@hadoop.apache.org
>> Cc: hdfs-u...@hadoop.apache.org
>> Subject: Problem compiling from source
>> 
>> Hi All,
>> 
>> 
>> For some reason, my hdfs source code can't compile anymore.
>> ~1-2 weeks ago it was compiling fine but now it's not. I
>> haven't made any changes to my code since I last compiled.
>> 
>> When I do ant clean jar, I get the following errors.
>> 
>> compile-hdfs-classes:
>>    [javac] Compiling 198 source files to /hadoop_0.22/hdfs_new/build/classes
>>    [javac] /hadoop_0.22/hdfs_new/src/java/org/apache/hadoop/hdfs/protocol/ClientProtocol.java:53: cannot find symbol
>>    [javac] symbol  : method value()
>>    [javac] location: @interface org.apache.hadoop.security.KerberosInfo
>>    [javac] @KerberosInfo(DFSConfigKeys.DFS_NAMENODE_USER_NAME_KEY)
>>    [javac]              ^
>>    [javac] /hadoop_0.22/hdfs_new/src/java/org/apache/hadoop/hdfs/server/protocol/NamenodeProtocol.java:34: cannot find symbol
>>    [javac] symbol  : method value()
>>    [javac] location: @interface org.apache.hadoop.security.KerberosInfo
>>    [javac] @KerberosInfo(DFSConfigKeys.DFS_NAMENODE_USER_NAME_KEY)
>>    [javac]              ^
>>    [javac] /hadoop_0.22/hdfs_new/src/java/org/apache/hadoop/hdfs/server/protocol/InterDatanodeProtocol.java:33: cannot find symbol
>>    [javac] symbol  : method value()
>>    [javac] location: @interface org.apache.hadoop.security.KerberosInfo
>>    [javac] @KerberosInfo(DFSConfigKeys.DFS_NAMENODE_USER_NAME_KEY)
>>    [javac]              ^
>>    [javac] /hadoop_0.22/hdfs_new/src/java/org/apache/hadoop/hdfs/server/protocol/DatanodeProtocol.java:40: cannot find symbol
>>    [javac] symbol  : method value()
>>    [javac] location: @interface org.apache.hadoop.security.KerberosInfo
>>    [javac] @KerberosInfo(DFSConfigKeys.DFS_NAMENODE_USER_NAME_KEY)
>>    [javac]              ^
>>    [javac] /hadoop_0.22/hdfs_new/src/java/org/apache/hadoop/hdfs/protocol/ClientProtocol.java:53: annotation org.apache.hadoop.security.KerberosInfo is missing serverPrincipal
>>    [javac] @KerberosInfo(DFSConfigKeys.DFS_NAMENODE_USER_NAME_KEY)
>>    [javac] ^
>>    [javac] /hadoop_0.22/hdfs_new/src/java/org/apache/hadoop/hdfs/server/protocol/NamenodeProtocol.java:34: annotation org.apache.hadoop.security.KerberosInfo is missing serverPrincipal
>>    [javac] @KerberosInfo(DFSConfigKeys.DFS_NAMENODE_USER_NAME_KEY)
>>    [javac] ^
>>    [javac] /hadoop_0.22/hdfs_new/src/java/org/apache/hadoop/hdfs/server/protocol/InterDatanodeProtocol.java:33: annotation org.apache.hadoop.security.KerberosInfo is missing serverPrincipal
>>    [javac] @KerberosInfo(DFSConfigKeys.DFS_NAMENODE_USER_NAME_KEY)
>>    [javac] ^
>>    [javac] /hadoop_0.22/hdfs_new/src/java/org/apache/hadoop/hdfs/server/protocol/DatanodeProtocol.java:40: annotation org.apache.hadoop.security.KerberosInfo is missing serverPrincipal
>>    [javac] @KerberosInfo(DFSConfigKeys.DFS_NAMENODE_USER_NAME_KEY)
>>    [javac] ^
>>    [javac] Note: Some input files use or override a deprecated API.
>>    [javac] Note: Recompile with -Xlint:deprecation for details.
>>    [javac] 8 errors
>> 
>> BUILD FAILED
>> 
>> 
>> 
>> 
>> 
> 
> 
> 
