On Apr 21, 2015 4:46 AM, "baris akgun" <baris.akg...@gmail.com> wrote:
> Hi,
>
> Actually I solved the problem. I had just copied an existing file into the
> train folder, but I noticed that Spark Streaming looks at the file creation
> date, so I created a new file after starting the streaming job and the
> problem was solved.
>
> Thanks
>
> 2015-04-21 2:20 GMT+03:00 Xiangrui Meng <men...@gmail.com>:
>
>> Did you keep adding new files under the `train/` folder? What was the
>> exact warn message? -Xiangrui
>>
>> On Fri, Apr 17, 2015 at 4:56 AM, barisak <baris.akg...@gmail.com> wrote:
>> > Hi,
>> >
>> > I wrote this code just to train the streaming linear regression model,
>> > but I got a "no data found" warning, so no weights were updated.
>> >
>> > Is there any solution for this?
>> >
>> > Thanks
>> >
>> > import org.apache.spark.SparkConf
>> > import org.apache.spark.mllib.linalg.Vectors
>> > import org.apache.spark.mllib.regression.{LabeledPoint, StreamingLinearRegressionWithSGD}
>> > import org.apache.spark.streaming.{Seconds, StreamingContext}
>> >
>> > object StreamingLinearRegression {
>> >
>> >   def main(args: Array[String]) {
>> >
>> >     val numFeatures = 3
>> >
>> >     val conf = new SparkConf().setMaster("local[2]").setAppName("StreamingLinearRegression")
>> >     val ssc = new StreamingContext(conf, Seconds(30))
>> >
>> >     // Parse labeled points from text files dropped into the monitored directories.
>> >     val trainingData = ssc.textFileStream("/home/barisakgu/Desktop/Spark/train").map(LabeledPoint.parse).cache()
>> >     val testData = ssc.textFileStream("/home/barisakgu/Desktop/Spark/test").map(LabeledPoint.parse)
>> >
>> >     val model = new StreamingLinearRegressionWithSGD().setInitialWeights(Vectors.zeros(numFeatures))
>> >
>> >     model.trainOn(trainingData)
>> >     model.predictOnValues(testData.map(lp => (lp.label, lp.features))).print()
>> >
>> >     ssc.start()
>> >     ssc.awaitTermination()
>> >   }
>> > }
>> >
>> > --
>> > View this message in context:
>> > http://apache-spark-user-list.1001560.n3.nabble.com/Streaming-Linear-Regression-problem-tp22539.html
>> > Sent from the Apache Spark User List mailing list archive at Nabble.com.
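
For reference, a minimal sketch of the workaround baris describes: write the data somewhere else first and only move it into the monitored train/ directory after ssc.start(), so textFileStream sees it as a new file in the current batch window. The train/ path matches the original post; the staging directory and the sample labeled points are made up for illustration, and the text format is assumed to be the "(label,[f1,f2,f3])" form that LabeledPoint.parse accepts.

import java.nio.charset.StandardCharsets
import java.nio.file.{Files, Paths, StandardCopyOption}

object PushTrainingFile {
  def main(args: Array[String]): Unit = {
    // A couple of labeled points in a text form LabeledPoint.parse can read (illustrative values).
    val lines = Seq("(1.0,[0.5,0.3,0.1])", "(0.0,[0.1,0.2,0.9])")

    // Write the file outside the monitored directory first ("staging" is a made-up
    // sibling directory on the same filesystem), so the stream never reads a half-written file.
    val staging = Files.createDirectories(Paths.get("/home/barisakgu/Desktop/Spark/staging"))
    val tmp = staging.resolve(s"batch-${System.currentTimeMillis()}.txt")
    Files.write(tmp, lines.mkString("\n").getBytes(StandardCharsets.UTF_8))

    // Move it into train/ in one step; the file then appears with a fresh timestamp,
    // which is what textFileStream keys on when deciding which files are new.
    val dest = Paths.get("/home/barisakgu/Desktop/Spark/train").resolve(tmp.getFileName)
    Files.move(tmp, dest, StandardCopyOption.ATOMIC_MOVE)
  }
}

Moving rather than copying is deliberate: the Spark Streaming programming guide recommends atomically moving or renaming files into the monitored directory so a batch never picks up a partially written file.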