Re: reading the parquet file

2016-03-09 Thread Xinh Huynh
You might want to avoid that unionAll(), which seems to be repeated over 1000 times. Could you do a collect() in each iteration, and collect your results in a local Array instead of a DataFrame? How many rows are returned in "temp1"?

Xinh

On Tue, Mar 8, 2016 at 10:00 PM, Angel Angel wrote:
> He
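
A minimal sketch of that suggestion, in Spark 1.x Scala (since unionAll is mentioned): collect each small per-iteration result into a local buffer instead of repeatedly unionAll-ing DataFrames, which builds a very deep lineage. The loop structure, the filter condition, the parquet path, and the name "temp1" are assumptions; the original code from the thread is not shown.

import org.apache.spark.sql.{DataFrame, Row, SQLContext}
import scala.collection.mutable.ArrayBuffer

object CollectInsteadOfUnion {
  def collectMatches(sqlContext: SQLContext, keys: Seq[Int]): Array[Row] = {
    // Hypothetical input table; substitute the real path.
    val df = sqlContext.read.parquet("/path/to/table.parquet")
    val buffer = ArrayBuffer[Row]()

    // Instead of: result = result.unionAll(temp1), repeated ~1000 times,
    // pull each small per-iteration result to the driver and append it locally.
    for (k <- keys) {
      val temp1: DataFrame = df.filter(df("key") === k)
      buffer ++= temp1.collect()
    }
    buffer.toArray
  }
}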

Re: reading the parquet file in spark sql

2016-03-07 Thread Manoj Awasthi
From the parquet file content (dir content) it doesn't look like that parquet write was successful or complete.

On Mon, Mar 7, 2016 at 11:17 AM, Angel Angel wrote:
> Hello Sir/Madam,
>
> I am running one Spark application with 3 slaves and one master.
>
> I am writing my information using t
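
One way to check whether a parquet write actually completed is to look for the _SUCCESS marker that the default Hadoop output committer writes when a job finishes, and for a leftover _temporary subdirectory, which an interrupted write typically leaves behind. The sketch below assumes that default committer; the directory name is hypothetical.

import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.{FileSystem, Path}

object CheckParquetWrite {
  // Returns true if the output directory looks like a finished parquet write.
  def writeLooksComplete(dir: String): Boolean = {
    val path = new Path(dir)
    val fs = FileSystem.get(path.toUri, new Configuration())
    fs.exists(new Path(path, "_SUCCESS")) && !fs.exists(new Path(path, "_temporary"))
  }

  def main(args: Array[String]): Unit = {
    println(writeLooksComplete("/data/output/table.parquet")) // hypothetical path
  }
}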