Hi,
I am trying Spark with some sample programs.
In my code, the following items are imported:
import org.apache.spark.mllib.regression.{StreamingLinearRegressionWithSGD, LabeledPoint}
import org.apache.spark.streamin
> Which Spark release are you compiling against?
>
> Cheers
>
> On Sun, Jun 28, 2015 at 5:49 AM, Arthur Chan
> wrote:
>
>> Hi,
>>
>> I am trying Spark with some sample programs,
>>
>>
>> In my code, the following items are imported:
>>
>> im
Also, my Spark is 1.4.
On Mon, Jun 29, 2015 at 9:02 AM, Arthur Chan
wrote:
>
>
> Hi,
>
>
> line 99:  model.trainOn(labeledStream)
>
> line 100: model.predictOn(labeledStream).print()
>
> line 101: ssc.start()
>
> line 102: ssc.awaitTermination()
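For context, the truncated lines 99-102 above fit into a streaming regression job roughly like this (a minimal sketch, not the original code: the socket source, host/port, and feature dimension of 3 are placeholder assumptions):

```scala
import org.apache.spark.SparkConf
import org.apache.spark.mllib.linalg.Vectors
import org.apache.spark.mllib.regression.{LabeledPoint, StreamingLinearRegressionWithSGD}
import org.apache.spark.streaming.{Seconds, StreamingContext}

object StreamingRegressionSketch {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("StreamingRegressionSketch")
    val ssc = new StreamingContext(conf, Seconds(10))

    // Parse lines of the form "label,f1 f2 f3" into LabeledPoint.
    // ("localhost", 9999) is a placeholder source, not from the original mail.
    val labeledStream = ssc.socketTextStream("localhost", 9999).map { line =>
      val Array(label, features) = line.split(",", 2)
      LabeledPoint(label.toDouble, Vectors.dense(features.trim.split(' ').map(_.toDouble)))
    }

    // The streaming model must be seeded with initial weights
    // matching the feature dimension (3 in this sketch).
    val model = new StreamingLinearRegressionWithSGD()
      .setInitialWeights(Vectors.zeros(3))

    model.trainOn(labeledStream)                            // line 99
    model.predictOn(labeledStream.map(_.features)).print()  // line 100
    ssc.start()                                             // line 101
    ssc.awaitTermination()                                  // line 102
  }
}
```

Note that `predictOn` expects a DStream of feature vectors, so the LabeledPoint stream is mapped through `_.features` first; passing `labeledStream` directly, as line 100 above appears to do, would not compile against the 1.4 API.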
Hi,
I use Spark 1.4. When saving the model to HDFS, I got an error.
Please help!
Regards
my scala command:
sc.makeRDD(model.clusterCenters,10).saveAsObjectFile("/tmp/tweets/model")
The error log:
15/07/14 18:27:40 INFO SequenceFileRDDFunctions: Saving as sequence file of
type (NullWritable,Byt
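As an alternative to saveAsObjectFile: if the model here is a KMeansModel (the call to clusterCenters suggests so), Spark 1.4 also provides built-in persistence. A sketch, reusing the path from the command above and assuming `model` is a KMeansModel and `sc` the active SparkContext:

```scala
import org.apache.spark.mllib.clustering.KMeansModel

// Spark 1.4+: built-in save writes metadata plus the cluster centers.
model.save(sc, "/tmp/tweets/model")

// Load it back later with the companion object.
val restored: KMeansModel = KMeansModel.load(sc, "/tmp/tweets/model")
```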
Hi, below is the log from the worker.
15/07/14 17:18:56 ERROR FileAppender: Error writing stream to file
/spark/app-20150714171703-0004/5/stderr
java.io.IOException: Stream closed
at java.io.BufferedInputStream.getBufIfOpen(BufferedInputStream.java:170)
at java.io.BufferedInputStream.read1(Buf
I found the reason; it is related to sc (the SparkContext). Thanks
On Tue, Jul 14, 2015 at 9:45 PM, Akhil Das
wrote:
> Someone else also reported this error with spark 1.4.0
>
> Thanks
> Best Regards
>
> On Tue, Jul 14, 2015 at 6:57 PM, Arthur Chan
> wrote:
>
>> Hi, below is the log from
Hi,
I plan to upgrade from 1.4.1 (+ Hive 1.1.0) to 1.5.2. Is there any upgrade
document available, especially about which Hive version it should be
upgraded to?
Regards
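One relevant knob: Spark 1.5 ships with Hive 1.2.1 bindings but can connect to an older existing metastore via settings in spark-defaults.conf (a sketch; check the Spark SQL guide for your release for the exact supported metastore version range):

```
# Keep the existing Hive 1.1.0 metastore while running Spark 1.5.x
spark.sql.hive.metastore.version  1.1.0
# Fetch matching metastore jars ("maven"), or point to a local classpath
spark.sql.hive.metastore.jars     maven
```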
Hi,
I am trying sample word2vec from
http://spark.apache.org/docs/latest/mllib-feature-extraction.html#example
Following are my test results:
scala> for((synonym, cosineSimilarity) <- synonyms) {
| println(s"$synonym $cosineSimilarity")
| }
taiwan 2.0518918365726297
japan 1.89609623
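For reference, the linked page's full example looks roughly like this (a sketch based on the docs: the "text8" corpus path and the query word "china" are the docs' own, so the actual query that produced the taiwan/japan output above may differ):

```scala
import org.apache.spark.mllib.feature.{Word2Vec, Word2VecModel}

// sc: an existing SparkContext; "text8" is the corpus used in the docs' example.
val input = sc.textFile("text8").map(line => line.split(" ").toSeq)

val word2vec = new Word2Vec()
val model: Word2VecModel = word2vec.fit(input)

// Top-40 nearest words by the model's similarity score.
val synonyms = model.findSynonyms("china", 40)
for ((synonym, cosineSimilarity) <- synonyms) {
  println(s"$synonym $cosineSimilarity")
}
```

Incidentally, scores above 1.0 (as in the output above) suggest the returned values are not strictly cosine similarities in [-1, 1] in this release; the word vectors do not appear to be unit-normalized before comparison.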
Hi,
I use Hive 0.12 with Spark 1.2 at the moment and plan to upgrade to Spark
1.3.x.
Could anyone advise which Hive version should be used to match Spark 1.3.x?
Can I use Hive 1.1.0 with Spark 1.3, or should I use Hive 0.14?
Regards
Arthur