> … % "1.5.1",
> "org.apache.spark" % "spark-streaming_2.11" % "1.5.1",
> "org.apache.spark" % "spark-mllib_2.11" % "1.5.1"
>
> From: Бобров Виктор [mailto:ma...@bk.ru]
>> … "org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:108)"
>> 11 = {StackTraceElement@7143} "org.apache.spark.rdd.RDD.withScope(RDD.scala:306)"
>> 12 = {StackTraceElement@7144} "org.apache.spark.rdd.RDD.filter(RDD.scala:330)"

"org.apache.spark" % "spark-streaming_2.11" % "1.5.1",
"org.apache.spark" % "spark-mllib_2.11" % "1.5.1"
From: Бобров Виктор [mailto:ma...@bk.ru]
Sent: Thursday, December 10, 2015 2:54 PM
To: 'Harsh J'
Cc: user@spark.apache.org
Subject: RE: Can't filter
val step2 = step1.filter(filter1)
//step1.collect().foreach(println)
  }
}
From: Harsh J [mailto:ha...@cloudera.com]
Sent: Thursday, December 10, 2015 2:50 PM
To: Бобров Виктор; Ndjido Ardo Bar
Cc: user@spark.apache.org
Subject: Re: Can't filter

Are you sure you do not have any me…
> 14 = {StackTraceElement@7146} "SimpleApp$.main(test1.scala:26)"
> 15 = {StackTraceElement@7147} "SimpleApp.main(test1.scala)"
>
> From: Ndjido Ardo Bar [mailto:ndj...@gmail.com]
> Sent: Thursday, December 10, 2015 2:20 PM
> To: Бобров Виктор
> Cc: user@spark.apache.org
12 = {StackTraceElement@7144} "org.apache.spark.rdd.RDD.filter(RDD.scala:330)"
13 = {StackTraceElement@7145} "SimpleApp$GeneratedEvaluatorClass$44$1.invoke(FileToCompile0.scala:30)"
14 = {StackTraceElement@7146} "SimpleApp$.main(test1.scala:26)"
15 = {StackTraceElement@7147} "SimpleApp.main(test1.scala)"
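An aside on frame 13: the name SimpleApp$GeneratedEvaluatorClass$44$1 and the source file FileToCompile0.scala look like a class synthesized by an IDE expression evaluator rather than code compiled into the application JAR, and such generated classes never reach the executors. That would be consistent with a "Class not found" error, but it is an inference from the frame names, not something confirmed in the thread. A sketch of the safer pattern, keeping the predicate in compiled top-level code:

// Sketch only: the predicate lives in a compiled, top-level object, so
// its class ships in the application JAR. The object name is hypothetical.
object Predicates {
  def filter1(tp: ((Array[String], Int), (Array[String], Int))): Boolean =
    tp._1._2 > tp._2._2
}
// usage: step1.filter(Predicates.filter1)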
From: Ndjido Ardo Bar [mailto:ndj...@gmail.com]
Please send your call stack with the full description of the exception.
> On 10 Dec 2015, at 12:10, Бобров Виктор wrote:
>
> Hi, I can’t filter my rdd.
>
> def filter1(tp: ((Array[String], Int), (Array[String], Int))): Boolean = {
>   tp._1._2 > tp._2._2
> }
> val mail_rdd = sc.parallelize(A.toSeq).cache()
Hi, I can’t filter my rdd.

def filter1(tp: ((Array[String], Int), (Array[String], Int))): Boolean = {
  tp._1._2 > tp._2._2
}
val mail_rdd = sc.parallelize(A.toSeq).cache()
val step1 = mail_rdd.cartesian(mail_rdd)
val step2 = step1.filter(filter1)

Get error “Class not found”. What am I doing wrong?
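Since no complete version of this snippet appears in the thread, here is a minimal self-contained sketch of it that compiles and runs; the SparkConf setup and the sample contents of A are assumptions, as the thread never shows them.

import org.apache.spark.{SparkConf, SparkContext}

object SimpleApp {
  // Predicate from the thread: keep pairs whose left element has the
  // larger Int component.
  def filter1(tp: ((Array[String], Int), (Array[String], Int))): Boolean = {
    tp._1._2 > tp._2._2
  }

  def main(args: Array[String]): Unit = {
    // Assumed setup; the thread never shows how sc is created.
    val conf = new SparkConf().setAppName("SimpleApp").setMaster("local[2]")
    val sc = new SparkContext(conf)

    // Invented sample data; the thread only shows A.toSeq.
    val A = Seq((Array("alice"), 1), (Array("bob"), 2), (Array("carol"), 3))

    val mail_rdd = sc.parallelize(A.toSeq).cache()
    val step1 = mail_rdd.cartesian(mail_rdd) // all ordered pairs
    val step2 = step1.filter(filter1)

    step2.collect().foreach { case ((a, i), (b, j)) =>
      println(s"${a.mkString} ($i) > ${b.mkString} ($j)")
    }
    sc.stop()
  }
}

Defining filter1 on the top-level object (rather than in an interactive evaluation session) keeps its class inside the application JAR, so the executors can load it when the filter task runs.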