Hi,

Can we convert a Scala collection (e.g. ArrayBuffer[(Int, Double)]) directly to a
Spark RDD without using the parallelize method?
Is there any way to create an RDD from a Scala collection type
using something like a typecast?
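For reference, here is a minimal sketch of what I am doing today (the
SparkConf setup and the sample ArrayBuffer values are just for illustration):

    import scala.collection.mutable.ArrayBuffer
    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.rdd.RDD

    val conf = new SparkConf().setAppName("example").setMaster("local[*]")
    val sc = new SparkContext(conf)

    // Local Scala collection I want to turn into an RDD.
    val buffer = ArrayBuffer((1, 1.0), (2, 2.5), (3, 3.7))

    // Current approach: parallelize distributes the local collection
    // across the cluster as an RDD[(Int, Double)].
    val rdd: RDD[(Int, Double)] = sc.parallelize(buffer)

I would like to avoid the parallelize call if a direct conversion exists.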

Any suggestions would be appreciated.


