Figured out that a constructor can be used for this purpose.
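A minimal sketch of that constructor approach: the label column index is supplied when the parser is constructed instead of hard-coding parts[0]. The class name IndexedCSVLineParser is hypothetical, and the Spark types are left out so the sketch stands alone; in the actual job this class would implement Function<String, LabeledPoint> and call() would return new LabeledPoint(label, Vectors.dense(features)).

```java
import java.util.regex.Pattern;

// Hypothetical name; in Spark this would implement Function<String, LabeledPoint>.
public class IndexedCSVLineParser {
    private static final Pattern COMMA = Pattern.compile(",");
    private final int labelIndex;  // which CSV column holds the label

    public IndexedCSVLineParser(int labelIndex) {
        this.labelIndex = labelIndex;
    }

    // Parses one CSV line into {label, feature0, feature1, ...},
    // skipping the label column when collecting features.
    public double[] call(String line) {
        String[] parts = COMMA.split(line);
        double[] out = new double[parts.length];
        out[0] = Double.parseDouble(parts[labelIndex]);
        int j = 1;
        for (int i = 0; i < parts.length; i++) {
            if (i != labelIndex) {
                out[j++] = Double.parseDouble(parts[i]);
            }
        }
        return out;
    }
}
```

Usage would then be lines.map(new IndexedCSVLineParser(2)) to label on the third column, for example.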
On 10/24/14 7:57 AM, Lochana Menikarachchi wrote:
SparkConf conf = new SparkConf().setAppName("LogisticRegression").setMaster("local[4]");
JavaSparkContext sc = new JavaSparkContext(conf);
JavaRDD<String> lines = sc.textFile("some.csv");
JavaRDD<LabeledPoint> lPoints = lines.map(new CSVLineParser());
Is there any way to pass an index to a function? For example, instead
of hard-coding parts[0] below, is there a way to pass the index in?
import java.util.regex.Pattern;
import org.apache.spark.api.java.function.Function;
import org.apache.spark.mllib.linalg.Vectors;
import org.apache.spark.mllib.regression.LabeledPoint;

public class CSVLineParser implements Function<String, LabeledPoint> {
    private static final Pattern COMMA = Pattern.compile(",");

    @Override
    public LabeledPoint call(String line) {
        String[] parts = COMMA.split(line);
        double y = Double.parseDouble(parts[0]);    // label is the first column
        double[] x = new double[parts.length - 1];  // features exclude the label
        for (int i = 1; i < parts.length; ++i) {
            x[i - 1] = Double.parseDouble(parts[i]);
        }
        return new LabeledPoint(y, Vectors.dense(x));
    }
}
---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
For additional commands, e-mail: dev-h...@spark.apache.org