Hello!
I'm working with some Parquet files stored on Amazon S3 and loading
them into a DataFrame with

Dataset<Row> df = spark.read().parquet(parquetFileLocation);
however, after some time I get the "Timeout waiting for connection from
pool" exception. I hope I'm not mistaken, but I think that there's
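For what it's worth, when "Timeout waiting for connection from pool" comes from the s3a connector, one commonly suggested mitigation is enlarging the HTTP connection pool via `fs.s3a.connection.maximum`. A minimal sketch, assuming the hadoop-aws (s3a) client is on the classpath; the pool size of 100 is an arbitrary example, not a recommendation:

```java
import org.apache.spark.sql.SparkSession;

public class S3aPoolSizeExample {
    public static void main(String[] args) {
        // fs.s3a.connection.maximum is the hadoop-aws HTTP connection pool size.
        // Prefixing with "spark.hadoop." passes it through to the Hadoop config.
        SparkSession spark = SparkSession.builder()
                .appName("s3a-pool-example")
                .config("spark.hadoop.fs.s3a.connection.maximum", "100")
                .getOrCreate();

        // parquetFileLocation below is a placeholder for the real s3a:// path:
        // Dataset<Row> df = spark.read().parquet(parquetFileLocation);

        spark.stop();
    }
}
```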
> List<Row> results = new LinkedList<>();
> JavaRDD<Row> jsonRDD =
>     new JavaSparkContext(sparkSession.sparkContext()).parallelize(results);
>
> Dataset<Row> peopleDF =
>     sparkSession.createDataFrame(jsonRDD, Row.class);
>
> Richard Xin
>
Hello!
I am running Spark on Java and bumped into a problem I can't solve, and I
haven't found anything helpful among the answered questions, so I would
really appreciate your help.
I am running some calculations, creating a row for each result:

List<Row> results = new LinkedList<>();
for (something) {
    results.add(RowFactory.create(...));
}
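A list of rows built this way can be handed to Spark directly via the
`createDataFrame(List<Row>, StructType)` overload, which takes an explicit
schema. A minimal, self-contained sketch; the column names ("name", "value")
and the literal rows are made up for illustration, since the original
calculation isn't shown:

```java
import java.util.Arrays;
import java.util.List;

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.RowFactory;
import org.apache.spark.sql.SparkSession;
import org.apache.spark.sql.types.DataTypes;
import org.apache.spark.sql.types.Metadata;
import org.apache.spark.sql.types.StructField;
import org.apache.spark.sql.types.StructType;

public class RowsToDataFrame {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .master("local[*]")
                .appName("rows-to-df")
                .getOrCreate();

        // Rows built the same way as in the question; values are examples.
        List<Row> results = Arrays.asList(
                RowFactory.create("a", 1),
                RowFactory.create("b", 2));

        // createDataFrame(List<Row>, StructType) needs the schema spelled out;
        // the field names here are hypothetical.
        StructType schema = new StructType(new StructField[]{
                new StructField("name", DataTypes.StringType, false, Metadata.empty()),
                new StructField("value", DataTypes.IntegerType, false, Metadata.empty())
        });

        Dataset<Row> df = spark.createDataFrame(results, schema);
        df.show();

        spark.stop();
    }
}
```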