Hi,

everything prefixed with `org.apache.flink.table.planner` is the Blink planner, so you should be able to use those testing classes. The Blink planner has also been the default since 1.11. In general, I would recommend looking a bit into the testing package; there are many different testing examples.

Regards,
Timo


On 31.10.20 00:34, Rex Fenley wrote:
Hello,

Thank you for these examples, they look great. However, I can't seem to import
`org.apache.flink.table.planner.runtime.utils.{StreamingTestBase, StringSink}`;
is it because I'm using the Blink planner and not the regular one?

Thanks

On Fri, Oct 9, 2020 at 7:55 AM Timo Walther <twal...@apache.org> wrote:

    Hi Rex,

    let me copy-paste my answer from a similar thread 2 months ago:

    Hi,

    this might be helpful as well:

    https://lists.apache.org/thread.html/rfe3b45a10fc58cf19d2f71c6841515eb7175ba731d5055b06f236b3f%40%3Cuser.flink.apache.org%3E

    First of all, it is important to know whether you are interested in
    end-to-end tests (incl. connectors) or tests excluding connectors. If you
    just want to test your operators, you can use a lot of Flink's testing
    infrastructure.

    If you are NOT using event time, you can simply use
    `TableEnvironment.fromValues()` and `Table.execute().collect()`. This is
    used in test [1], for example (it is one of the newer generations of tests).
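    Roughly, such a test could look like this (a sketch only; the table and
    column names are made up for illustration, and it assumes a 1.11-era setup
    with the Blink planner):

    import org.apache.flink.table.api.DataTypes;
    import org.apache.flink.table.api.EnvironmentSettings;
    import org.apache.flink.table.api.Table;
    import org.apache.flink.table.api.TableEnvironment;
    import org.apache.flink.types.Row;
    import org.apache.flink.util.CloseableIterator;

    import java.util.ArrayList;
    import java.util.List;

    public class FromValuesCollectTest {

        public static void main(String[] args) throws Exception {
            // Blink planner in streaming mode (the default since 1.11).
            TableEnvironment tEnv = TableEnvironment.create(
                EnvironmentSettings.newInstance().useBlinkPlanner().inStreamingMode().build());

            // Stubbed input: an inline table instead of a real source connector.
            Table input = tEnv.fromValues(
                DataTypes.ROW(
                    DataTypes.FIELD("name", DataTypes.STRING()),
                    DataTypes.FIELD("score", DataTypes.INT())),
                Row.of("a", 1),
                Row.of("b", 2));
            tEnv.createTemporaryView("input_table", input);

            // The query under test.
            Table result = tEnv.sqlQuery("SELECT name, score * 2 AS doubled FROM input_table");

            // Pull the result rows back into the test JVM and assert on them.
            List<Row> rows = new ArrayList<>();
            try (CloseableIterator<Row> it = result.execute().collect()) {
                it.forEachRemaining(rows::add);
            }
            // e.g. assertEquals(2, rows.size());
            System.out.println(rows);
        }
    }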

    Otherwise, you can use or implement your own testing connectors, like in
    `org.apache.flink.table.planner.runtime.stream.sql.FunctionITCase#testStructuredScalarFunction`
    [2].
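    If you prefer not to depend on the planner's internal test utilities, a
    rough alternative sketch is to stub the input with `env.fromElements(...)`
    and collect the output with a small hand-written sink. The `CollectingSink`
    class below is made up for illustration and assumes local execution in a
    single JVM:

    import org.apache.flink.api.common.typeinfo.Types;
    import org.apache.flink.streaming.api.datastream.DataStream;
    import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
    import org.apache.flink.streaming.api.functions.sink.SinkFunction;
    import org.apache.flink.table.api.EnvironmentSettings;
    import org.apache.flink.table.api.Table;
    import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;
    import org.apache.flink.types.Row;

    import java.util.ArrayList;
    import java.util.Collections;
    import java.util.List;

    import static org.apache.flink.table.api.Expressions.$;

    public class CollectingSinkExample {

        // Hypothetical test sink that gathers emitted rows into a static list.
        // This only works because the local MiniCluster runs in the test JVM.
        public static class CollectingSink implements SinkFunction<Row> {
            public static final List<Row> RESULTS =
                Collections.synchronizedList(new ArrayList<>());

            @Override
            public void invoke(Row value, Context context) {
                RESULTS.add(value);
            }
        }

        public static void main(String[] args) throws Exception {
            StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
            StreamTableEnvironment tEnv = StreamTableEnvironment.create(
                env, EnvironmentSettings.newInstance().useBlinkPlanner().inStreamingMode().build());

            // Stubbed source: a bounded stream of rows with explicit field types.
            DataStream<Row> input = env
                .fromElements(Row.of("a", 1), Row.of("b", 2))
                .returns(Types.ROW(Types.STRING, Types.INT));

            Table table = tEnv.fromDataStream(input, $("f0").as("name"), $("f1").as("score"));
            tEnv.createTemporaryView("input_table", table);

            Table result = tEnv.sqlQuery("SELECT name, score * 2 AS doubled FROM input_table");

            // Mocked-out sink: convert back to a DataStream and collect locally.
            tEnv.toAppendStream(result, Row.class).addSink(new CollectingSink());
            env.execute("table-api-test");

            // e.g. assertEquals(2, CollectingSink.RESULTS.size());
            System.out.println(CollectingSink.RESULTS);
        }
    }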

    I hope this helps.

    Regards,
    Timo

    [1]
    https://github.com/apache/flink/blob/master/flink-table/flink-table-planner-blink/src/test/java/org/apache/flink/table/planner/expressions/MathFunctionsITCase.java

    [2]
    https://github.com/apache/flink/blob/master/flink-table/flink-table-planner-blink/src/test/java/org/apache/flink/table/planner/runtime/stream/sql/FunctionITCase.java#L700


    Let me know if you need more information.

    Regards,
    Timo

    On 09.10.20 07:39, Rex Fenley wrote:
     > Hello
     >
     > I'd like to write a unit test for my Flink Job. It consists mostly of
     > the Table API and SQL using a StreamExecutionEnvironment with the Blink
     > planner, from source to sink.
     > What's the best approach for testing Table API/SQL?
     >
     > I read
     > https://flink.apache.org/news/2020/02/07/a-guide-for-unit-testing-in-apache-flink.html
     > however that seems to cover mostly specialized functions with
     > DataStreams vs entire Table API constructs. What I think I'd like is to
     > be able to have some stubbed input sources and mocked-out sinks which I
     > use to test against my Tables.
     >
     > Does this seem reasonable?
     >
     > I did find TestStreamEnvironment and maybe that would be useful, at
     > least for running the tests locally it seems?
     > https://ci.apache.org/projects/flink/flink-docs-stable/api/java/org/apache/flink/streaming/util/TestStreamEnvironment.html
     >
     > Any help appreciated. Thanks!
     >



--

Rex Fenley | Software Engineer - Mobile and Backend

Remind.com <https://www.remind.com/> | BLOG <http://blog.remind.com/> | FOLLOW US <https://twitter.com/remindhq> | LIKE US <https://www.facebook.com/remindhq>

