Hi,
I have a problem with the JSON parser. I am using Spark Streaming with
HiveContext to store JSON-format tweets. Flume collects the tweets and
sinks them to an HDFS path. My Spark Streaming job watches the HDFS path,
converts the incoming JSON tweets, and inserts them into a Hive table.
My problem is that some
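For context, here is a minimal sketch of the kind of job described above, assuming Spark 1.x-era APIs (HiveContext, Spark Streaming); the HDFS path and the table name `tweets` are hypothetical stand-ins:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.sql.hive.HiveContext

val conf = new SparkConf().setAppName("TweetsToHive")
val sc = new SparkContext(conf)
val ssc = new StreamingContext(sc, Seconds(10))
val hiveContext = new HiveContext(sc)

// Watch the HDFS directory that Flume sinks to; each batch contains the new files.
val tweets = ssc.textFileStream("hdfs:///flume/tweets")

tweets.foreachRDD { rdd =>
  if (!rdd.isEmpty()) {
    // Parse each JSON line into a DataFrame and append it to the Hive table.
    val df = hiveContext.read.json(rdd)
    df.write.mode("append").saveAsTable("tweets")
  }
}

ssc.start()
ssc.awaitTermination()
```

This is only a sketch of the described pipeline, not the poster's actual code; in particular `read.json` infers the schema per batch, which can matter if tweets vary in shape.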
Hi,
I wrote this code just to train a streaming linear regression model, but I
got a "no data found" warning, so no weights were updated.
Is there any solution for this?
Thanks
import org.apache.spark.mllib.linalg.Vectors
import org.apache.spark.mllib.regression.{LabeledPoint, StreamingLinearRegressionWithSGD}
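The "no data found" warning comes from MLlib's gradient descent when it is handed an empty RDD, which in a streaming job usually means no new files landed in the watched directory during the batch. A minimal training sketch, assuming the input files appear in the directory after the context starts and use the `LabeledPoint.parse` text format (`(label,[x1,x2,...])`); the path and feature dimension are hypothetical:

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.mllib.linalg.Vectors
import org.apache.spark.mllib.regression.{LabeledPoint, StreamingLinearRegressionWithSGD}

val conf = new SparkConf().setAppName("StreamingLR").setMaster("local[2]")
val ssc = new StreamingContext(conf, Seconds(5))

// Each line is "(label,[x1,x2,...])", the format LabeledPoint.parse expects.
val trainingData = ssc.textFileStream("/tmp/train").map(LabeledPoint.parse)

val model = new StreamingLinearRegressionWithSGD()
  .setInitialWeights(Vectors.zeros(3)) // must match the feature dimension

model.trainOn(trainingData)

ssc.start()
ssc.awaitTermination()
```

If every batch is empty, the weights stay at their initial values, so checking that files are actually being picked up per batch is the first thing to verify.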
Hi
I have the class described below:
case class weatherCond(dayOfdate: String, minDeg: Int, maxDeg: Int, meanDeg: Int)
I am reading the data from a CSV file and putting it into the weatherCond
class with this code:
val weathersRDD = sc.textFile("weather.csv").map { line =>
  val Array(d
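The snippet above is cut off, but a parser along these lines would fit the case class; the four-column layout `date,min,max,mean` and the helper name `parseWeather` are assumptions, not the poster's actual code:

```scala
case class weatherCond(dayOfdate: String, minDeg: Int, maxDeg: Int, meanDeg: Int)

// Hypothetical parser: assumes each CSV line is "date,min,max,mean".
def parseWeather(line: String): weatherCond = {
  val Array(day, min, max, mean) = line.split(",").map(_.trim)
  weatherCond(day, min.toInt, max.toInt, mean.toInt)
}

// With Spark, the RDD would then be built as:
// val weathersRDD = sc.textFile("weather.csv").map(parseWeather)
```

Keeping the parsing in a plain function makes it easy to unit-test without a SparkContext.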
Hi
I tried to run streaming linear regression locally.
val trainingData =
ssc.textFileStream("/home/barisakgu/Desktop/Spark/train").map(LabeledPoint.parse)
textFileStream is not seeing the new files. I searched on the Internet and
saw that others have had the same issue, but no solution was found.
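One likely explanation: `textFileStream` only registers files that are created in the monitored directory after the streaming context starts, and the Spark docs recommend writing the file elsewhere and then atomically moving (renaming) it into the directory; appending to an existing file, or files already present at start, are not picked up. A small sketch of that move pattern (temp directories stand in for the real watched path):

```scala
import java.nio.file.{Files, StandardCopyOption}

// Stand-ins for the real directories; both live on the same filesystem
// so that the rename below can be atomic.
val watched = Files.createTempDirectory("train")
val staging = Files.createTempFile("staging", ".txt")

// Write the complete file outside the watched directory first.
Files.write(staging, "(1.0,[0.5,0.3])\n".getBytes)

// Atomic rename into the watched directory: textFileStream treats this
// as a brand-new file in the next batch interval.
Files.move(staging, watched.resolve("part-0001.txt"), StandardCopyOption.ATOMIC_MOVE)
```

Copying a large file directly into the directory can also fail to register correctly, because the file appears before its contents are fully written.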