You might want to avoid that unionAll(), which seems to be repeated over
1000 times. Could you do a collect() in each iteration, and collect your
results in a local Array instead of a DataFrame? How many rows are returned
in "temp1"?
Xinh
On Tue, Mar 8, 2016 at 10:00 PM, Angel Angel wrote:
Hello Sir/Madam,
I am writing a Spark application in Spark 1.4.0.
I have one text file with a size of 8 GB.
I saved that file in Parquet format:

// assumes a case class Table(...) and import sqlContext.implicits._
val df2 = sc.textFile("/root/Desktop/database_200/database_200.txt")
  .map(_.split(","))
  .map(p => Table(p(0), p(1).trim.toInt, p(2).trim.toInt, p(3)))
  .toDF()
From the Parquet file content (dir content) it doesn't look like the Parquet
write was successful or complete.
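In Spark 1.4 a complete write should look something like this (sketch only;
the output path here is an assumption):

df2.write.parquet("/root/Desktop/database_200_parquet") // output dir assumed

On success, that directory should contain part-* files plus a _SUCCESS marker.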
On Mon, Mar 7, 2016 at 11:17 AM, Angel Angel wrote:
Hello Sir/Madam,
I am running one Spark application with 3 slaves and one master.
I am writing my information using the Parquet format,
but when I try to read it back I get an error.
Please help me to resolve this problem.
Code:
val sqlContext = new org.apache.spark.sql.SQLContext(sc)
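A minimal read in Spark 1.4 would look something like this (sketch only; the
path is assumed, since the actual path and error are not shown above):

val df3 = sqlContext.read.parquet("/root/Desktop/database_200_parquet") // path assumed
df3.show(5)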