Hi Shen

You can use sbt to run a specific suite.

1. Start the sbt shell from the Spark root directory.
       $ bash build/sbt
2. Switch to the project (module) you modified.
       sbt > project core
     You can find the project name in the `sbt.project.name` property of
the module's pom.xml.
3. Finally, run the specific suite.
       sbt > testOnly org.apache.spark.scheduler.DAGSchedulerSuite
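
If you only need a single test case within that suite, you can pass
ScalaTest's -z filter through sbt (the substring below is a placeholder;
use any part of your test case's name):
       sbt > testOnly org.apache.spark.scheduler.DAGSchedulerSuite -- -z "substring of test name"
You can also prefix the command with ~ (i.e. ~testOnly ...) so sbt
re-runs the suite automatically every time you save a file.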

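About your Maven attempt: as far as I know, -Dtest= only filters the
Java tests run by surefire, while the Scala suites are run by
scalatest-maven-plugin, which is why everything still ran. For ScalaTest
suites you can use -DwildcardSuites instead, run from <spark_root>
(the -pl sql/core part is my assumption that your change is in the
sql/core module):
       $ build/mvn test -pl sql/core -DwildcardSuites=org.apache.spark.sql.XXXSuite -Dtest=none
Note that -pl builds only that module, so you may need a one-time
build/mvn -DskipTests install first so its dependencies are available
in your local repository.
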
Hope this helps
Best regards,
Qian Sun

Fangjia Shen <shen...@purdue.edu> wrote on Tue, Jan 25, 2022, at 07:44:

> Hello all,
>
> How do you run Spark's test suites when you want to test the correctness
> of your code? Is there a way to run a specific test suite for Spark? For
> example, running test suite XXXSuite alone, instead of every class under
> the test/ directories.
>
> Here's some background info about what I want to do: I'm a graduate
> student trying to study Spark's design and find ways to improve Spark's
> performance by doing Software/Hardware co-design. I'm relatively new to
> Maven and so far struggling to find to a way to properly run Spark's own
> test suites.
>
> Let's say I made some modifications to an XXXExec node which belongs to the
> org.apache.spark.sql package. I want to see if my design passes the test
> cases. What should I do?
>
>
> What command should I use:
>
>      *<spark_root>/build/mvn test*  or  *<spark_root>/dev/run-tests* ?
>
> And where should I run that command:
>
>     *<spark_root>*  or  *<package_dir>* ? - where <package_dir> is where
> the modified Scala file is located, e.g. "<spark_root>/sql/core/".
>
>
> I tried adding -Dtest=XXXSuite to *mvn test* but still end up running
> tens of thousands of tests. This takes way too much time and is
> unbearable if I'm just modifying a few files in a specific module.
>
> I would really appreciate any suggestion or comment.
>
>
> Best regards,
>
> Fangjia Shen
>
> Purdue University
>

