There may well be a better way, but I've been able to achieve this by executing 
the following in the DB. The JDBC driver binds the string parameter as 
character varying, so this tells Postgres to convert VARCHAR values to JSONB 
implicitly:

CREATE CAST(VARCHAR AS JSONB) WITH INOUT AS IMPLICIT;

I think you could probably also use:

CREATE CAST(VARCHAR AS JSONB) WITHOUT FUNCTION AS IMPLICIT;

However, that form creates a binary-coercible cast, which requires superuser 
privileges; I am working in AWS, and AWS does not give the owner of the 
database sufficient permissions to execute it.
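
For reference, here is a minimal end-to-end sketch in plain SQL (the table and 
column names are just placeholders matching the original question; the final 
INSERT forces the value to VARCHAR, which is effectively what the JDBC sink 
sends):

-- Target table with a JSONB column:
CREATE TABLE my_table (my_jsonb_column JSONB);

-- Let Postgres convert VARCHAR values to JSONB implicitly, via the
-- types' existing input/output functions:
CREATE CAST (VARCHAR AS JSONB) WITH INOUT AS IMPLICIT;

-- Without the cast this fails with "expression is of type character
-- varying"; with it in place, the insert succeeds:
INSERT INTO my_table (my_jsonb_column) VALUES ('{"key": "value"}'::VARCHAR);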


________________________________
From: John Tipper <john_tip...@hotmail.com>
Sent: 24 July 2022 22:01
To: user@flink.apache.org <user@flink.apache.org>
Subject: Write JSON string to JDBC as JSONB?

Hi all,

I am using PyFlink and SQL and have a JSON string that I wish to write to a 
JDBC sink (it's a PostgreSQL DB). I'd like to write that string to a column 
that is of type JSONB (or even JSON).

I'm getting exceptions when Flink tries to write the column:

Batch entry 0 INSERT INTO my_table(my_jsonb_column) VALUES ('{ ... a valid JSON string...}') ...
...
Error: column "my_jsonb_column" is of type jsonb but expression is of type character varying
Hint: You will need to rewrite or cast the expression

My table sink in Flink is defined like this:

CREATE TABLE my_table(
      my_jsonb_column STRING
) WITH (
      'connector' = 'jdbc'
)
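
For completeness, the real definition also carries the usual JDBC connection 
options, roughly along these lines (the URL and credentials here are 
placeholders):

CREATE TABLE my_table(
      my_jsonb_column STRING
) WITH (
      'connector' = 'jdbc',
      'url' = 'jdbc:postgresql://<host>:5432/<database>',
      'table-name' = 'my_table',
      'username' = '<username>',
      'password' = '<password>'
)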

and in the actual Postgres database, the column my_jsonb_column is defined as 
type JSONB.

I can happily execute SQL statements like this myself, writing a string to the 
JSONB column, without any error:

INSERT INTO my_table(my_jsonb_column) VALUES ('{ ... a valid JSON string...}')
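
The same statement fails, though, if the value is explicitly typed as VARCHAR, 
which appears to be how the JDBC sink binds the parameter:

INSERT INTO my_table(my_jsonb_column) VALUES ('{ ... a valid JSON string...}'::VARCHAR)
-- Error: column "my_jsonb_column" is of type jsonb but expression is of type character varying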


Is there any way of having Flink write to a JSONB column, or otherwise getting 
that JSON string into one?

Many thanks,

John

