For an analysis app, I have to make ROC curves on the fly and save them to disk. I am using scala-chart for this purpose and doing the following in my Spark app:
    val rocs = performances.map { case (id, (auRoc, roc)) => (id, roc.collect().toList) }
    XYLineChart(rocs.toSeq, title = "Pooled Data Performance: AuROC")
      .saveAsPNG(outputpath + "/plots/global.png")

However, I am getting the following exception. Does anyone have an idea of the cause?

    Exception in thread "main" java.io.FileNotFoundException: file:/home/njoshi/dev/outputs/test_/plots/global.png (No such file or directory)
            at java.io.FileOutputStream.open(Native Method)
            at java.io.FileOutputStream.<init>(FileOutputStream.java:213)
            at java.io.FileOutputStream.<init>(FileOutputStream.java:101)
            at scalax.chart.exporting.PNGExporter$.saveAsPNG$extension(exporting.scala:138)
            at com.aol.advertising.ml.globaldata.EvaluatorDriver$.main(EvaluatorDriver.scala:313)
            at com.aol.advertising.ml.globaldata.EvaluatorDriver.main(EvaluatorDriver.scala)
            at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
            at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
            at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
            at java.lang.reflect.Method.invoke(Method.java:483)
            at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
            at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
            at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
            at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
            at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

Thanks in advance,
Nikhil

--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Error-while-saving-plots-tp27016.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
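Two things I notice but am not sure about: the path in the exception message starts with a "file:" scheme, which java.io.FileOutputStream (used by saveAsPNG) would treat as a literal relative file name rather than a URI, and the .../plots directory may simply not exist yet, since FileOutputStream does not create missing parent directories. A minimal JDK-only sketch of the two workarounds I am considering (the class name and the temp-directory stand-in for outputpath are hypothetical, only so the sketch runs anywhere):

```java
import java.net.URI;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class SavePathCheck {
    public static void main(String[] args) throws Exception {
        // Hypothetical stand-in for outputpath: a temp directory is used here
        // only so the sketch is runnable; the real value is app configuration.
        Path base = Files.createTempDirectory("test_");
        String outputpath = "file:" + base; // mirrors the "file:/..." in the exception

        // Workaround 1: strip the URI scheme by converting the "file:/..."
        // string into a plain filesystem path before handing it to saveAsPNG.
        Path plotsDir = Paths.get(URI.create(outputpath)).resolve("plots");

        // Workaround 2: FileOutputStream does not create missing parent
        // directories, so create .../plots up front.
        Files.createDirectories(plotsDir);

        System.out.println(Files.isDirectory(plotsDir)); // prints "true"
    }
}
```

If either of these is the actual cause, a pointer to the idiomatic way of handling "file:" URIs from Spark configuration would be appreciated.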