LongWritable.class, BytesWritable.class, config);
However, when I run the job I get the following error:
com.fasterxml.jackson.databind.JsonMappingException: Infinite recursion
(StackOverflowError) (through reference chain:
scala.collection.convert.IterableWrapper[0]->org.apache.spark.rdd.RDDOp
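For reference, the truncated call at the top of this message appears to be a Hadoop-style read with LongWritable keys, BytesWritable values, and a Configuration object passed in. A minimal sketch of that kind of read in PySpark, assuming a SequenceFile input and a placeholder path (neither is stated in the message), would be:

from pyspark import SparkContext

sc = SparkContext("local", "sequencefile-read-sketch")

# Assumed input: a SequenceFile of (LongWritable, BytesWritable) records.
# The path is a placeholder, not taken from the original message.
pairs = sc.sequenceFile(
    "hdfs:///path/to/input",
    keyClass="org.apache.hadoop.io.LongWritable",
    valueClass="org.apache.hadoop.io.BytesWritable")

# Keys should arrive as Python ints and values as bytearrays; the byte
# payload would then be decoded against a schema such as the one quoted below.
print(pairs.take(1))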
> state table)"},
> { "name": "localorig", "type": "boolean", "doc": "If conn originated locally T; if remotely F."},
> { "name": "localresp", "type": "boolean", "doc": "empty, always unset"},
> { "name": "missedbytes", "type": "int", "doc": "Number of missing bytes in content gaps"},
> { "name": "history", "type": "string"
Hi,
On Windows, in local mode, using PySpark, I got an error about "excessively deep recursion".
I'm using a module for lemmatizing/stemming, which uses a DLL and some binary files (the module is a Python wrapper around C code).
Spark version: 1.4.0.
Any idea w
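For what it's worth, a minimal sketch of the setup being described, with entirely hypothetical names (a "lemmatizer" package wrapping the C code and its DLL), might look like the following; importing and constructing the wrapper inside mapPartitions keeps the DLL-backed objects on the workers instead of asking PySpark to pickle them from the driver:

from pyspark import SparkContext

sc = SparkContext("local", "lemmatize-sketch")
words = sc.parallelize(["running", "mice", "better"])

def lemmatize_partition(part):
    # Hypothetical module name and API: a Python wrapper around C code that
    # loads a DLL and some binary data files, as described above.
    import lemmatizer
    lem = lemmatizer.Lemmatizer()
    for w in part:
        yield lem.lemmatize(w)

# Constructing the wrapper once per partition avoids shipping the DLL-backed
# object itself through the pickler on the driver.
print(words.mapPartitions(lemmatize_partition).collect())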
Hi,
On Fri, Sep 5, 2014 at 6:16 PM, Deep Pradhan wrote:
>
> Does Spark support recursive calls?
>
Can you give an example of what kind of recursion you would like to use?
Tobias
Hi,
Does Spark support recursive calls?
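One kind of recursion that is often meant by this question is a driver-side function that keeps refining an RDD until some condition holds. A minimal sketch of that pattern follows; the data and the stopping condition are made up purely for illustration:

from pyspark import SparkContext

sc = SparkContext("local", "recursion-sketch")

def keep_above_mean(rdd):
    # The recursion runs in the driver program; each level adds a filter to
    # the RDD lineage and triggers ordinary actions (mean, count).
    m = rdd.mean()
    above = rdd.filter(lambda x: x > m)
    if above.count() <= 1:
        return above
    return keep_above_mean(above)

# Illustrative use: repeatedly keep the values above the current mean until a
# single value (the maximum) remains.
print(keep_above_mean(sc.parallelize(range(1000))).collect())

Nothing recursive happens inside the executors here; the executors only see plain filters and actions, while the recursive structure lives entirely in the driver code.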