Thanks all for responding and helping me with the build issue. I tried building the code at git://github.com/apache/spark.git (master branch) in my ppc64le Ubuntu 16.04 VM and it failed. I tried building a specific branch (branch-2.2) using the following command:
build/mvn -DskipTests -Pkubernetes clean package install

This builds successfully, but again I do not see the "dockerfiles" and "jars" directories anywhere. This behaviour is exactly the same as observed with the source code at https://github.com/apache-spark-on-k8s/spark

Any advice on how to proceed? As far as possible, I need to build v2.2.

Thanks,
Atul.

On Wed, Mar 28, 2018 at 8:06 PM, Anirudh Ramanathan <ramanath...@google.com> wrote:

> As Lucas said, those directories are generated and copied when you run a
> full maven build with the -Pkubernetes flag specified (or use the
> instructions in
> https://spark.apache.org/docs/latest/building-spark.html#building-a-runnable-distribution).
>
> Also, using the Kubernetes integration in the main Apache Spark project
> is recommended. The fork https://github.com/apache-spark-on-k8s/spark/
> will be retired once we finish upstreaming all those features in Spark 2.4.
>
>
> On Wed, Mar 28, 2018, 6:42 AM Lucas Kacher <lu...@vsco.co> wrote:
>
>> Are you building on the fork or on the official release now? I built
>> v2.3.0 from source without issue. One thing I noticed is that I needed
>> to run the build-image command from the bin which was placed in dist/,
>> as opposed to the one in the repo (as that is how it copies the
>> necessary targets).
>>
>> (Failed to reply-all to the list.)
>>
>> On Wed, Mar 28, 2018 at 4:30 AM, Atul Sowani <sow...@gmail.com> wrote:
>>
>>> Hi,
>>>
>>> I built apache-spark-on-k8s from source on Ubuntu 16.04 and it built
>>> without errors. Next, I wanted to create Docker images, so as explained
>>> at https://apache-spark-on-k8s.github.io/userdocs/running-on-kubernetes.html
>>> I used sbin/build-push-docker-images.sh to create those. While using
>>> this script I came across two issues:
>>>
>>> 1. It references a "dockerfiles" directory which should be in "spark";
>>> however, this directory is missing.
>>> I created a "dockerfiles" directory and copied the Dockerfiles from
>>> resource-managers/kubernetes/docker-minimal-bundle.
>>>
>>> 2. The spark-base Dockerfile expects some JAR files to be present in a
>>> directory called "jars" - this directory is also missing. I tried
>>> rebuilding the code, but this directory is not getting generated, if it
>>> is supposed to be.
>>>
>>> My question is whether this is a genuine/known issue, or am I missing
>>> some build steps?
>>>
>>> Thanks,
>>> Atul.
>>>
>>
>>
>> --
>>
>> *Lucas Kacher*
>> Senior Engineer
>> vsco.co <https://www.vsco.co/>
>> New York, NY
>> 818.512.5239
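
[Editor's note for the archive] The replies above boil down to: the "jars" and "dockerfiles" directories are produced by Spark's runnable-distribution script, not by `mvn package` alone. A minimal sketch of the recommended workflow follows; the registry name is a placeholder, the exact output paths vary by branch, and note that upstream apache/spark first shipped Kubernetes support in 2.3.0, so building branch-2.2 of the main repo with -Pkubernetes will not produce these artifacts (v2.2 Kubernetes support existed only in the apache-spark-on-k8s fork).

```shell
# Clone the main Apache Spark repo (the fork is being retired).
git clone https://github.com/apache/spark.git
cd spark

# Build a runnable distribution with the Kubernetes profile enabled.
# This is what populates dist/ with the "jars" directory and (on 2.3+)
# dist/kubernetes/dockerfiles; a plain "mvn package" does not copy them.
./dev/make-distribution.sh --name k8s-spark --tgz -Pkubernetes

# As Lucas noted, run the image-build script from inside dist/, not from
# the repo root, so it can find the copied targets. On Spark 2.3+ the
# script is bin/docker-image-tool.sh; the fork used
# sbin/build-push-docker-images.sh instead.
cd dist
./bin/docker-image-tool.sh -r my-registry.example.com -t v2.3.0 build
```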