Hi all,

I am using PyFlink and SQL and have a JSON string that I wish to write to a 
JDBC sink (it's a Postgresql DB). I'd like to write that string to a column 
that is of type JSONB (or even JSON).

I'm getting an exception when Flink tries to write to the column:

Batch entry 0 INSERT INTO my_table(my_jsonb_column) VALUES ('{ ... a valid JSON 
string...}') ...
...
Error: column "my_jsonb_column" is of type jsonb but expression is of type 
character varying
Hint: You will need to rewrite or cast the expression

My table sink in Flink is defined like this:

CREATE TABLE my_table(
      my_jsonb_column STRING
) WITH (
      'connector' = 'jdbc'
)

In the actual Postgres database, the column my_jsonb_column is defined as 
type JSONB.

If I run the equivalent INSERT myself directly against Postgres, writing a 
string literal to the JSONB column, it succeeds without error:

INSERT INTO my_table(my_jsonb_column) VALUES ('{ ... a valid JSON string...}')


Is there any way to have Flink write to a JSONB column, or otherwise get 
that JSON string into one?
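
One thing I wondered about: the PostgreSQL JDBC driver has a 
stringtype=unspecified connection parameter, which makes the driver send 
string parameters with an unspecified type so the server can infer them, 
allowing an implicit cast from a string value to JSONB. Would appending it 
to the connector URL be the right approach here? Something like this (the 
host, database, and table names below are placeholders, not my real 
config):

CREATE TABLE my_table(
      my_jsonb_column STRING
) WITH (
      'connector' = 'jdbc',
      'url' = 'jdbc:postgresql://localhost:5432/mydb?stringtype=unspecified',
      'table-name' = 'my_table'
)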

Many thanks,

John

