Hi all, I was wondering if there is a Docker image to build Spark and/or the Spark documentation.
The idea is that I would start the Docker image, supplying the directory with my code and a target directory, and it would simply build everything (maybe with some options). Any chance something like that already exists and is working and tested?

Thanks,
Assaf
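Something like the sketch below is what I have in mind (just an illustration, not an existing image; the image tag, paths, and memory setting are placeholders I picked, and it assumes the base image can download Maven through Spark's bundled build/mvn wrapper):

  # Mount a local Spark checkout into a stock JDK container and run the
  # normal Maven build there; the artifacts end up back in the mounted tree.
  docker run --rm \
    -v /path/to/spark:/workspace \
    -w /workspace \
    -e MAVEN_OPTS="-Xmx2g -XX:ReservedCodeCacheSize=512m" \
    openjdk:8-jdk \
    ./build/mvn -DskipTests clean package

The documentation build (docs/ with Jekyll) could presumably be handled the same way with a Ruby/Jekyll base image instead of a JDK one.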