LuciferYang commented on code in PR #50423:
URL: https://github.com/apache/spark/pull/50423#discussion_r2017828991


##########
.github/workflows/build_and_test.yml:
##########
@@ -769,6 +772,85 @@ jobs:
       if: inputs.branch != 'branch-3.5'
       run: ./dev/check-protos.py
 
+  repl:
+    needs: [precondition]
+    if: (!cancelled()) && fromJson(needs.precondition.outputs.required).repl == 'true'
+    name: REPL (spark-sql, spark-shell and pyspark)
+    runs-on: ubuntu-latest
+    timeout-minutes: 45
+    env:
+      LC_ALL: C.UTF-8
+      LANG: C.UTF-8
+      PYSPARK_DRIVER_PYTHON: python3.11
+      PYSPARK_PYTHON: python3.11
+    steps:
+      - name: Checkout Spark repository
+        uses: actions/checkout@v4
+        with:
+          fetch-depth: 0
+          repository: apache/spark
+          ref: ${{ inputs.branch }}
+      - name: Sync the current branch with the latest in Apache Spark
+        if: github.repository != 'apache/spark'
+        run: |
+          git fetch https://github.com/$GITHUB_REPOSITORY.git ${GITHUB_REF#refs/heads/}
+          git -c user.name='Apache Spark Test Account' -c user.email='sparktest...@gmail.com' merge --no-commit --progress --squash FETCH_HEAD
+          git -c user.name='Apache Spark Test Account' -c user.email='sparktest...@gmail.com' commit -m "Merged commit" --allow-empty
+      - name: Install Java 17
+        uses: actions/setup-java@v4
+        with:
+          distribution: zulu
+          java-version: 17
+      - name: Install Python 3.11
+        uses: actions/setup-python@v5
+        with:
+          python-version: '3.11'
+      - name: Install dependencies for PySpark
+        run: |
+          python3.11 -m pip install ipython numpy scipy 'protobuf==5.28.3' 'pyarrow>=19.0.0' 'six==1.16.0' 'pandas==2.2.3' 'grpcio==1.67.0' 'grpcio-status==1.67.0' 'googleapis-common-protos==1.65.0'
+          python3.11 -m pip list
+      - name: Build Spark
+        run: |
+          ./build/sbt -Phive -Phive-thriftserver clean package

Review Comment:
   So I believe the output of `dev/make-distribution.sh` should be verified more rigorously, so the check is actually convincing.
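   A sketch of what that could look like as workflow steps (the step names are hypothetical; the profile flags mirror the ones already used in this job, and `dist/` is `make-distribution.sh`'s default output directory):

   ```yaml
         - name: Build Spark distribution
           run: |
             ./dev/make-distribution.sh --name repl-test -Phive -Phive-thriftserver
         - name: Smoke test spark-sql and spark-shell from the distribution
           run: |
             # Run the REPLs out of dist/ so the check exercises the packaged
             # artifacts, not the sbt build output under assembly/target.
             echo "SELECT 1;" | ./dist/bin/spark-sql
             echo "spark.range(10).count()" | ./dist/bin/spark-shell
   ```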



##########
.github/workflows/build_and_test.yml:
##########
@@ -769,6 +772,85 @@ jobs:
       if: inputs.branch != 'branch-3.5'
       run: ./dev/check-protos.py
 
+  repl:
+    needs: [precondition]
+    if: (!cancelled()) && fromJson(needs.precondition.outputs.required).repl == 'true'
+    name: REPL (spark-sql, spark-shell and pyspark)
+    runs-on: ubuntu-latest
+    timeout-minutes: 45
+    env:
+      LC_ALL: C.UTF-8
+      LANG: C.UTF-8
+      PYSPARK_DRIVER_PYTHON: python3.11
+      PYSPARK_PYTHON: python3.11
+    steps:
+      - name: Checkout Spark repository
+        uses: actions/checkout@v4
+        with:
+          fetch-depth: 0
+          repository: apache/spark
+          ref: ${{ inputs.branch }}
+      - name: Sync the current branch with the latest in Apache Spark
+        if: github.repository != 'apache/spark'
+        run: |
+          git fetch https://github.com/$GITHUB_REPOSITORY.git ${GITHUB_REF#refs/heads/}
+          git -c user.name='Apache Spark Test Account' -c user.email='sparktest...@gmail.com' merge --no-commit --progress --squash FETCH_HEAD
+          git -c user.name='Apache Spark Test Account' -c user.email='sparktest...@gmail.com' commit -m "Merged commit" --allow-empty
+      - name: Install Java 17
+        uses: actions/setup-java@v4
+        with:
+          distribution: zulu
+          java-version: 17
+      - name: Install Python 3.11
+        uses: actions/setup-python@v5
+        with:
+          python-version: '3.11'
+      - name: Install dependencies for PySpark

Review Comment:
   If we are only checking spark-shell and spark-sql, there is no need to 
install these Python dependencies.



##########
.github/workflows/build_and_test.yml:
##########
@@ -769,6 +772,85 @@ jobs:
       if: inputs.branch != 'branch-3.5'
       run: ./dev/check-protos.py
 
+  repl:
+    needs: [precondition]
+    if: (!cancelled()) && fromJson(needs.precondition.outputs.required).repl == 'true'
+    name: REPL (spark-sql, spark-shell and pyspark)
+    runs-on: ubuntu-latest
+    timeout-minutes: 45
+    env:
+      LC_ALL: C.UTF-8
+      LANG: C.UTF-8
+      PYSPARK_DRIVER_PYTHON: python3.11
+      PYSPARK_PYTHON: python3.11
+    steps:
+      - name: Checkout Spark repository
+        uses: actions/checkout@v4
+        with:
+          fetch-depth: 0
+          repository: apache/spark
+          ref: ${{ inputs.branch }}
+      - name: Sync the current branch with the latest in Apache Spark
+        if: github.repository != 'apache/spark'
+        run: |
+          git fetch https://github.com/$GITHUB_REPOSITORY.git ${GITHUB_REF#refs/heads/}
+          git -c user.name='Apache Spark Test Account' -c user.email='sparktest...@gmail.com' merge --no-commit --progress --squash FETCH_HEAD
+          git -c user.name='Apache Spark Test Account' -c user.email='sparktest...@gmail.com' commit -m "Merged commit" --allow-empty
+      - name: Install Java 17
+        uses: actions/setup-java@v4
+        with:
+          distribution: zulu
+          java-version: 17
+      - name: Install Python 3.11
+        uses: actions/setup-python@v5
+        with:
+          python-version: '3.11'
+      - name: Install dependencies for PySpark
+        run: |
+          python3.11 -m pip install ipython numpy scipy 'protobuf==5.28.3' 'pyarrow>=19.0.0' 'six==1.16.0' 'pandas==2.2.3' 'grpcio==1.67.0' 'grpcio-status==1.67.0' 'googleapis-common-protos==1.65.0'
+          python3.11 -m pip list
+      - name: Build Spark
+        run: |
+          ./build/sbt -Phive -Phive-thriftserver clean package

Review Comment:
   Ultimately, we still need to use Maven to compile and package the Spark 
Client. Although the result produced by `sbt package` can also be used for 
testing, I think it does not necessarily mean that the Client packaged by Maven 
will also be healthy and free of issues.
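   For reference, `dev/make-distribution.sh` drives the build through `./build/mvn`, so using it here would exercise the Maven-packaged client directly. A minimal sketch, assuming the same profiles as the current sbt step (the step name is hypothetical):

   ```yaml
         - name: Build Spark with Maven
           run: |
             # make-distribution.sh delegates to ./build/mvn internally, so this
             # packages the client the same way a release build would.
             ./dev/make-distribution.sh -Phive -Phive-thriftserver
   ```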



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

