Hi,

Yes, I have written a custom JDBC sink function based on the JdbcOutputFormat for streaming, and it works: records are written to a Postgres database or to an in-memory H2 database. However, I am trying to figure out how many times the open() method is called and establishes a database connection, because in my integration tests open() is called a couple of times, and therefore a couple of connections are established, for a single event coming from Kafka. Is it the default behaviour to establish the connection multiple times? Or will the connection be taken from a connection pool once I run the app on a cluster? That would be a great help.
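
Roughly, the sink is structured like this (a simplified sketch rather than the exact code; the class name, query and field layout are placeholders):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

import org.apache.flink.configuration.Configuration;
import org.apache.flink.streaming.api.functions.sink.RichSinkFunction;
import org.apache.flink.types.Row;

public class JdbcSinkFunction extends RichSinkFunction<Row> {

    private final String driverName;
    private final String dbUrl;
    private final String insertQuery;   // e.g. "INSERT INTO events (id, payload) VALUES (?, ?)"

    private transient Connection connection;
    private transient PreparedStatement statement;

    public JdbcSinkFunction(String driverName, String dbUrl, String insertQuery) {
        this.driverName = driverName;
        this.dbUrl = dbUrl;
        this.insertQuery = insertQuery;
    }

    @Override
    public void open(Configuration parameters) throws Exception {
        // This is where the connection is established. In my integration tests
        // this is invoked more than once for a single Kafka event, hence my question.
        Class.forName(driverName);
        connection = DriverManager.getConnection(dbUrl);
        statement = connection.prepareStatement(insertQuery);
    }

    @Override
    public void invoke(Row row) throws Exception {
        for (int i = 0; i < row.getArity(); i++) {
            statement.setObject(i + 1, row.getField(i));
        }
        statement.executeUpdate();   // one insert per record, no batching in this sketch
    }

    @Override
    public void close() throws Exception {
        if (statement != null) { statement.close(); }
        if (connection != null) { connection.close(); }
    }
}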

Thanks


On 02/22/2017 10:47 PM, Fabian Hueske wrote:
Hi,

I should also mention that the JdbcOutputFormat batches writes to the database. Since it is not integrated with Flink's checkpointing mechanism, data might get lost in case of a failure. I would recommend implementing a JdbcSinkFunction based on the code of the JdbcOutputFormat. If you use the batch JdbcOutputFormat, you might get duplicates or lose data.
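
A rough sketch of the direction I mean, batching like the JdbcOutputFormat but flushing the batch whenever a checkpoint is taken (simplified and untested, just to illustrate the idea; the URL, query and batch size are placeholders):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

import org.apache.flink.configuration.Configuration;
import org.apache.flink.runtime.state.FunctionInitializationContext;
import org.apache.flink.runtime.state.FunctionSnapshotContext;
import org.apache.flink.streaming.api.checkpoint.CheckpointedFunction;
import org.apache.flink.streaming.api.functions.sink.RichSinkFunction;
import org.apache.flink.types.Row;

public class CheckpointedJdbcSink extends RichSinkFunction<Row>
        implements CheckpointedFunction {

    private final String dbUrl = "jdbc:postgresql://localhost:5432/mydb";            // placeholder
    private final String insertQuery = "INSERT INTO events (id, payload) VALUES (?, ?)"; // placeholder
    private final int batchSize = 100;

    private transient Connection connection;
    private transient PreparedStatement statement;
    private transient int buffered;

    @Override
    public void open(Configuration parameters) throws Exception {
        connection = DriverManager.getConnection(dbUrl);
        statement = connection.prepareStatement(insertQuery);
    }

    @Override
    public void invoke(Row row) throws Exception {
        for (int i = 0; i < row.getArity(); i++) {
            statement.setObject(i + 1, row.getField(i));
        }
        statement.addBatch();
        if (++buffered >= batchSize) {
            flush();
        }
    }

    private void flush() throws Exception {
        // push all buffered rows to the database
        statement.executeBatch();
        buffered = 0;
    }

    @Override
    public void snapshotState(FunctionSnapshotContext context) throws Exception {
        // flush as part of every checkpoint, so a completed checkpoint
        // never has rows sitting only in the JVM buffer
        flush();
    }

    @Override
    public void initializeState(FunctionInitializationContext context) {
        // nothing to restore in this simplified sketch
    }

    @Override
    public void close() throws Exception {
        if (statement != null) { flush(); statement.close(); }
        if (connection != null) { connection.close(); }
    }
}

Even with that, you only get at-least-once behaviour; for exactly-once you would need idempotent writes (e.g. upserts) or a transactional scheme on top.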

Best, Fabian

2017-02-16 15:39 GMT+01:00 Punit Tandel <punit.tan...@ericsson.com>:

    Thanks for the info. At the moment I use flink-jdbc to write the
    streaming data coming from Kafka, which I can process and write
    to a Postgres or MySQL database configured on a cluster or
    sandbox. However, when writing integration tests I am using an
    in-memory H2 database, which is acting somewhat strangely: I
    cannot see any error being thrown by the write-record method,
    but at the same time nothing is written to the database. So it
    is a little hard to figure out what is going wrong here.
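
    The job is wired up roughly along these lines (a simplified sketch,
    not the exact code; the URL, table and query are placeholders):

    import org.apache.flink.api.java.io.jdbc.JDBCOutputFormat;
    import org.apache.flink.streaming.api.datastream.DataStream;
    import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
    import org.apache.flink.types.Row;

    public class KafkaToJdbcJob {
        public static void main(String[] args) throws Exception {
            StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

            // stand-in for the Kafka source, already mapped to Row objects
            Row sample = new Row(2);
            sample.setField(0, 1);
            sample.setField(1, "payload");
            DataStream<Row> rows = env.fromElements(sample);

            JDBCOutputFormat jdbcOutput = JDBCOutputFormat.buildJDBCOutputFormat()
                    .setDrivername("org.h2.Driver")    // Postgres/MySQL driver on the cluster
                    .setDBUrl("jdbc:h2:mem:test")      // placeholder URL
                    .setQuery("INSERT INTO events (id, payload) VALUES (?, ?)")
                    .finish();

            rows.writeUsingOutputFormat(jdbcOutput);
            env.execute("kafka-to-jdbc");
        }
    }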

    Thanks


    On 02/16/2017 02:02 PM, Fabian Hueske wrote:
    The JdbcOutputFormat was originally meant for batch jobs.
    It should be possible to use it for streaming jobs as well;
    however, you should be aware that it is not integrated with
    Flink's checkpointing mechanism.
    So you might have duplicate data in case of failures.

    I also don't know if or how well it works with H2.

    Best, Fabian

    2017-02-16 11:06 GMT+01:00 Punit Tandel <punit.tan...@ericsson.com>:

        Yes, I have been following the tutorials, and reading from
        H2 and writing to H2 works fine. The problem is that data
        coming from Kafka does not seem to get written to H2, and I
        cannot see any error thrown while writing into the in-memory
        H2 database, so I cannot say what the error is or why the
        data is not inserted.
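
        One thing I still need to rule out is the in-memory URL itself:
        as far as I understand, H2 drops an in-memory database as soon
        as the last connection to it closes, so unless the URL keeps it
        alive, for example (database name is a placeholder)

            jdbc:h2:mem:test;DB_CLOSE_DELAY=-1

        the rows written from the sink's connection could be gone before
        the test connection checks for them. I have not confirmed whether
        that is what happens here.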

        I have been trying to find the cause and looking for logs
        while Flink processes the operations, but I could not find
        any error being thrown at the time the data is written. Is
        there anywhere I can check for logs?

        Thanks


        On 02/16/2017 01:10 AM, Ted Yu wrote:
        See the tutorial at the beginning of:

        
        flink-connectors/flink-jdbc/src/main/java/org/apache/flink/api/java/io/jdbc/JDBCInputFormat.java

        Looks like plugging in "org.h2.Driver" should do.
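
        A minimal sketch of what that could look like (untested with H2
        on my side; the query and column types below are placeholders):

            import org.apache.flink.api.common.typeinfo.BasicTypeInfo;
            import org.apache.flink.api.java.io.jdbc.JDBCInputFormat;
            import org.apache.flink.api.java.typeutils.RowTypeInfo;

            JDBCInputFormat h2Input = JDBCInputFormat.buildJDBCInputFormat()
                    .setDrivername("org.h2.Driver")
                    .setDBUrl("jdbc:h2:mem:test;DB_CLOSE_DELAY=-1")
                    .setQuery("SELECT id, payload FROM events")
                    .setRowTypeInfo(new RowTypeInfo(
                            BasicTypeInfo.INT_TYPE_INFO,
                            BasicTypeInfo.STRING_TYPE_INFO))
                    .finish();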

        On Wed, Feb 15, 2017 at 4:59 PM, Punit Tandel <punit.tan...@ericsson.com> wrote:

            Hi All

            Does flink-jdbc support writing data into an H2 database?

            Thanks
            Punit






