I'm not sure about the specific error, but you may not want to be using 0.21.

"23 August, 2010: release 0.21.0 available
This release contains many improvements, new features, bug fixes and 
optimizations. It has not undergone testing at scale and should not be 
considered stable or suitable for production. This release is being classified 
as a minor release, which means that it should be API compatible with 0.20.2."
-- 
http://hadoop.apache.org/hdfs/releases.html#23+August%2C+2010%3A+release+0.21.0+available

So it's a release that's not recommended or suitable for production, which means 
compatibility with 0.21 isn't a big priority right now.
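For context, an IncompatibleClassChangeError of the form "Found interface X, but class was expected" is the standard symptom of a type that was a concrete class when the calling code was compiled but is an interface at run time; JobContext changed in exactly this way between Hadoop 0.20 and 0.21. Here's a minimal sketch of how one could detect which flavor is on the classpath via reflection (the helper and class name are illustrative, not Cassandra's actual code; the JDK types stand in for the Hadoop classes so the example is self-contained):

```java
public class JobContextCheck {
    /** Returns true when the named type exists and is an interface
     *  (the Hadoop 0.21+ shape of JobContext), false otherwise. */
    static boolean isInterface(String className) {
        try {
            return Class.forName(className).isInterface();
        } catch (ClassNotFoundException e) {
            return false; // type not on the classpath at all
        }
    }

    public static void main(String[] args) {
        // JDK stand-ins: an interface behaves like 0.21's JobContext,
        // a concrete class behaves like 0.20's.
        System.out.println(isInterface("java.lang.Runnable")); // true
        System.out.println(isInterface("java.lang.Thread"));   // false
    }
}
```

Code compiled against one shape cannot run against the other, because the JVM emits different call instructions (invokevirtual vs. invokeinterface) at compile time; a check like this only tells you which Hadoop line you're on, it can't paper over the mismatch.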

On Nov 5, 2010, at 12:51 PM, Utku Can Topçu wrote:

> When I try to read a CF from Hadoop, just after issuing the run I get this 
> error:
> 
> Exception in thread "main" java.lang.IncompatibleClassChangeError: Found 
> interface org.apache.hadoop.mapreduce.JobContext, but class was expected
>         at 
> org.apache.cassandra.hadoop.ColumnFamilyInputFormat.getSplits(ColumnFamilyInputFormat.java:88)
>         at 
> org.apache.hadoop.mapreduce.JobSubmitter.writeNewSplits(JobSubmitter.java:401)
>         at 
> org.apache.hadoop.mapreduce.JobSubmitter.writeSplits(JobSubmitter.java:418)
>         at 
> org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:338)
>         at org.apache.hadoop.mapreduce.Job.submit(Job.java:960)
>         at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:976)
> 
> However, the same code works fine with Hadoop 0.20.2. Is there a prospective 
> patch for this issue?
> 
> Regards,
> Utku
