How about:

val range = Range.getRange
val notWorking = "path/output_{" + range + "}/*/*"
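
The idea, as a minimal sketch (it assumes the Range object from the quoted message below; inputPath is just an illustrative name), is to force the glob into a plain String on the driver before any RDD operation touches it, so the non-serializable Joda objects inside Range are never pulled into a closure:

val range: String = Range.getRange                 // evaluated once, on the driver
val inputPath = "path/output_{" + range + "}/*/*"  // plain String, trivially serializable
val lines = sc.textFile(inputPath)                 // only the String is referenced from here on
println(lines.count())

In the spark-shell this can matter because a val that refers to the Range object may drag the whole REPL line wrapper (including the DateTimeFormat field) into a task closure; assigning getRange's result to its own plain String val first keeps that wrapper out of the picture.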
On Fri, Sep 5, 2014 at 3:45 AM, jerryye <jerr...@gmail.com> wrote:
> Hi,
> I have a quick serialization issue. I'm trying to read a date range of
> input files, and I'm getting a serialization error when using an input
> path whose date range is generated by an object. Specifically, my code
> uses DateTimeFormat from the Joda-Time package, which is not serializable.
> How do I get Spark to compute the input path eagerly so it doesn't run
> into the serialization issue?
>
> Code:
>
> object Range {
>   val now = new DateTime
>   val dateFormatter = DateTimeFormat.forPattern("MMddyyyy")
>
>   def dateRange(from: DateTime, to: DateTime, step: Period): Iterator[DateTime] =
>     Iterator.iterate(from)(_.plus(step)).takeWhile(!_.isAfter(to))
>
>   def getRange: String = {
>     dateRange(now.minusDays(22), now, Period.days(1))
>       .map(dateFormatter.print(_))
>       .mkString(",")
>   }
> }
>
> val notWorking = "path/output_{" + Range.getRange + "}/*/*"
> val working =
>   "path/output_{08121914,08132014,08142014,08152014,08162014,08172014,08182014,08192014,08202014,08212014,08222014,08232014,08242014,08252014,08262014,08272014,08282014,08292014,08302014,08312014,09012014,09022014,09032014,09042014}/*/*"
> val lines = sc.textFile(working).count
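
If the formatter ever does need to be referenced from inside a closure, a general Spark/Scala workaround (not something from this thread, just a common pattern) is to declare the non-serializable fields as @transient lazy val so each JVM rebuilds them locally instead of trying to deserialize them:

import org.joda.time.{DateTime, Period}
import org.joda.time.format.DateTimeFormat

object Range extends Serializable {
  // rebuilt lazily on each JVM instead of being serialized with the object
  @transient lazy val now = new DateTime
  @transient lazy val dateFormatter = DateTimeFormat.forPattern("MMddyyyy")

  def dateRange(from: DateTime, to: DateTime, step: Period): Iterator[DateTime] =
    Iterator.iterate(from)(_.plus(step)).takeWhile(!_.isAfter(to))

  def getRange: String =
    dateRange(now.minusDays(22), now, Period.days(1)).map(dateFormatter.print(_)).mkString(",")
}

One caveat with this sketch: a @transient lazy now would be re-evaluated on whichever JVM touches it, so the dates could differ across executors; building the path string once on the driver, as suggested above, is still the simpler fix here.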