Hi John,

>> I don't know how to fix this. I've tried adding `flink-table-planner`
and `flink-table-planner-blink` dependencies with `<type>test-jar</type>`
to my dummy pom.xml, but it still fails.
What's the failure after doing this? The flink-table-planner*-tests.jar
should be available in the Maven repository [1].
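
For reference, a dummy pom.xml along the following lines should let
pyflink_gateway_server.py resolve both the `avro.version` property and the
planner test-jars. This is only a sketch: the 1.13.6 / Scala 2.11 versions
are taken from [1] and should be adjusted to match your installation.

```xml
<project xmlns="http://maven.apache.org/POM/4.0.0">
  <modelVersion>4.0.0</modelVersion>
  <groupId>example</groupId>
  <artifactId>pyflink-test-deps</artifactId>
  <version>1.0</version>

  <properties>
    <!-- Read by `mvn help:evaluate -Dexpression=avro.version` -->
    <avro.version>1.10.0</avro.version>
  </properties>

  <dependencies>
    <!-- test-jar artifacts, i.e. flink-table-planner*-tests.jar -->
    <dependency>
      <groupId>org.apache.flink</groupId>
      <artifactId>flink-table-planner_2.11</artifactId>
      <version>1.13.6</version>
      <type>test-jar</type>
      <scope>test</scope>
    </dependency>
    <dependency>
      <groupId>org.apache.flink</groupId>
      <artifactId>flink-table-planner-blink_2.11</artifactId>
      <version>1.13.6</version>
      <type>test-jar</type>
      <scope>test</scope>
    </dependency>
  </dependencies>
</project>
```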

>> This is starting to feel like a real pain to do something that should be
trivial: basic TDD of a PyFlink project.  Is there a real-world example of
a Python project that shows how to set up a testing environment for unit
testing SQL with PyFlink?
I'm not aware of such a project; however, I agree that this is an
important aspect which should be improved. I will look into it.

Regards,
Dian

[1]
https://repo1.maven.org/maven2/org/apache/flink/flink-table-planner_2.11/1.13.6/


On Sun, Apr 24, 2022 at 4:44 AM John Tipper <john_tip...@hotmail.com> wrote:

> Hi all,
>
> Is there an example of a self-contained repository showing how to perform
> SQL unit testing of PyFlink (specifically 1.13.x if possible)?  I have
> cross-posted the question to Stack Overflow here:
> https://stackoverflow.com/questions/71983434/is-there-an-example-of-pyflink-sql-unit-testing-in-a-self-contained-repo
>
>
> There is a related SO question (
> https://stackoverflow.com/questions/69937520/pyflink-sql-local-test),
> where it is suggested to use some of the tests from PyFlink itself.  The
> issue I'm running into is that the PyFlink repo assumes that a bunch of
> things are on the Java classpath and that some Python utility classes are
> available (they're not distributed in the apache-flink package on PyPI).
>
> I have done the following:
>
>
>    1. Copied `test_case_utils.py` and `source_sink_utils.py` from PyFlink
>    (
>    
> https://github.com/apache/flink/tree/f8172cdbbc27344896d961be4b0b9cdbf000b5cd/flink-python/pyflink/testing)
>    into my project.
>    2. Copy an example unit test (
>    
> https://github.com/apache/flink/blob/f8172cdbbc27344896d961be4b0b9cdbf000b5cd/flink-python/pyflink/table/tests/test_sql.py#L39)
>    as suggested by the related SO question.
>
> When I try to run the test, I get an error because the test case cannot
> determine what version of Avro jars to download (`download_apache_avro()`
> fails, because pyflink_gateway_server.py tries to evaluate the value of
> `avro.version` by running `mvn help:evaluate -Dexpression=avro.version`).
>
> I then added a dummy `pom.xml` defining a Maven property of `avro.version`
> (with a value of `1.10.0`) and my unit test case is loaded.
>
> I now get a new error and my test is skipped:
>
>     'flink-table-planner*-tests.jar' is not available. Will skip the
> related tests.
>
> I don't know how to fix this. I've tried adding `flink-table-planner` and
> `flink-table-planner-blink` dependencies with `<type>test-jar</type>` to my
> dummy pom.xml, but it still fails.
>
> This is starting to feel like a real pain to do something that should be
> trivial: basic TDD of a PyFlink project.  Is there a real-world example of
> a Python project that shows how to set up a testing environment for unit
> testing SQL with PyFlink?
>
> Many thanks,
>
> John
>
