On Thursday 23 April 2015 12:22 PM, Akhil Das wrote:
Here's a complete Scala example:
https://github.com/bbux-proteus/spark-accumulo-examples/blob/1dace96a115f29c44325903195c8135edf828c86/src/main/scala/org/bbux/spark/AccumuloMetadataCount.scala
Thanks
Best Regards
On Thu, Apr 23, 2015 at 12:19 PM, Akhil Das wrote:
Change your import from mapred to mapreduce, like:
import org.apache.accumulo.core.client.mapreduce.AccumuloInputFormat;
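
For reference, here is a minimal, self-contained Java sketch of what the corrected code could look like (the class name AccumuloRDDExample is a placeholder, and the Accumulo connector, ZooKeeper instance and input table settings for accumuloJob are assumed to be configured elsewhere, as in the original post):

import org.apache.accumulo.core.client.mapreduce.AccumuloInputFormat; // mapreduce, not mapred
import org.apache.accumulo.core.data.Key;
import org.apache.accumulo.core.data.Value;
import org.apache.hadoop.mapreduce.Job;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaSparkContext;

public class AccumuloRDDExample {
  public static void main(String[] args) throws Exception {
    JavaSparkContext sc = new JavaSparkContext(new SparkConf().setAppName("accumulo-rdd"));

    // Connector info, ZooKeeper instance and input table are assumed to be
    // set on this Job elsewhere (e.g. via AccumuloInputFormat.setConnectorInfo etc.).
    Job accumuloJob = Job.getInstance();

    // With the mapreduce package, AccumuloInputFormat is a subclass of
    // org.apache.hadoop.mapreduce.InputFormat, which is what newAPIHadoopRDD
    // requires, so this compiles.
    JavaPairRDD<Key, Value> accumuloRDD =
        sc.newAPIHadoopRDD(accumuloJob.getConfiguration(),
                           AccumuloInputFormat.class,
                           Key.class,
                           Value.class);

    System.out.println("Entries read: " + accumuloRDD.count());
    sc.stop();
  }
}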
Thanks
Best Regards
On Wed, Apr 22, 2015 at 2:42 PM, madhvi wrote:
Hi,
I am creating a Spark RDD from Accumulo like this:

JavaPairRDD accumuloRDD =
    sc.newAPIHadoopRDD(accumuloJob.getConfiguration(), AccumuloInputFormat.class, Key.class, Value.class);
But it does not compile, and I am getting the following error:
Bound mismatch: The generic method
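
For context, a rough sketch of the relevant signature (the exact declaration may differ slightly between Spark versions):

public <K, V, F extends org.apache.hadoop.mapreduce.InputFormat<K, V>>
JavaPairRDD<K, V> newAPIHadoopRDD(Configuration conf, Class<F> fClass, Class<K> kClass, Class<V> vClass)

The bound on F only accepts new-API (mapreduce) InputFormat classes, so passing org.apache.accumulo.core.client.mapred.AccumuloInputFormat, which implements the old org.apache.hadoop.mapred.InputFormat interface, triggers the "Bound mismatch" error; switching the import to the mapreduce package, as suggested above, resolves it.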