Sorry, I somehow missed the "Scope" column in the docs, which
explicitly states it's for reads only. Does anyone know of another
way to submit SET statements for write sessions?
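As a workaround sketch (not Spark- or Spanner-specific): the pattern sessionInitStatement implements — execute a setup statement on every fresh connection before doing any work — can be reproduced manually whenever you control the connection yourself. A minimal illustration of that pattern in Python, using sqlite3 as a stand-in database and a PRAGMA as the session-level setting (the helper name and PRAGMA choice are mine, purely for demonstration):

```python
import sqlite3

def open_with_session_init(path, init_sql):
    # Mimic the sessionInitStatement idea: run a session-setup
    # statement on each new connection before any reads or writes.
    conn = sqlite3.connect(path)
    conn.execute(init_sql)
    return conn

# Session-level setting applied at connection time (analogous to a SET statement).
conn = open_with_session_init(":memory:", "PRAGMA foreign_keys = ON")

# Verify the setting took effect for this session.
fk_enabled = conn.execute("PRAGMA foreign_keys").fetchone()[0]
print(fk_enabled)  # 1
```

For Spark writes the equivalent would be getting such a statement executed on each executor-side connection, which Spark's JDBC sink does not currently expose an option for, hence the question above.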

On Fri, Nov 26, 2021 at 12:51 PM <trs...@gmail.com> wrote:

> Hello,
>
> Regarding JDBC sinks, the docs state:
> https://spark.apache.org/docs/latest/sql-data-sources-jdbc.html
> sessionInitStatement:
> After each database session is opened to the remote DB and before starting
> to read data, this option executes a custom SQL statement (or a PL/SQL
> block). Use this to implement session initialization code. Example:
> option("sessionInitStatement", """BEGIN execute immediate 'alter session
> set "_serial_direct_read"=true'; END;""")
>
> The language suggests this applies to reads only?
>
> I'm wondering if I can use this with Google Cloud Spanner to run:
> SET AUTOCOMMIT_DML_MODE = 'PARTITIONED_NON_ATOMIC'
> https://cloud.google.com/spanner/docs/use-oss-jdbc#set_autocommit_dml_mode
>
> I've tried, but I think it's simply being ignored, since
> sessionInitStatement isn't recognized by the open-source Spanner JDBC
> driver. Is this handled by Spark independently of the JDBC driver,
> or is sessionInitStatement driver-specific?