Re: Batch load with BigQueryIO fails because of a few bad records.

2021-05-07 Thread Reuven Lax
ignoreUnknownValues is supported for BATCH_LOADS as well.

On Fri, May 7, 2021 at 7:08 AM Matthew Ouyang wrote:
> Thank you for responding Evan. It looks like these options will only work
> for STREAMING_INSERTS. Are there any options for BATCH_LOADS, and if not
> are there any plans for it?
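A minimal sketch of what this looks like in the Beam Java SDK, assuming the google-cloud-platform IO module is on the classpath. The project, dataset, table, and field names are hypothetical; `ignoreUnknownValues()` tells the batch load job to drop fields that are not in the target schema instead of failing the whole load:

```java
import com.google.api.services.bigquery.model.TableRow;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
import org.apache.beam.sdk.io.gcp.bigquery.TableRowJsonCoder;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.Create;

public class BatchLoadIgnoreUnknown {
  public static void main(String[] args) {
    Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

    p.apply(Create.of(
            // "unexpected_field" is not in the table schema; with
            // ignoreUnknownValues() it is dropped rather than fatal.
            new TableRow()
                .set("known_field", "value")
                .set("unexpected_field", "dropped, not fatal"))
        .withCoder(TableRowJsonCoder.of()))
     .apply(BigQueryIO.writeTableRows()
         .to("my-project:my_dataset.my_table")            // hypothetical table
         .withMethod(BigQueryIO.Write.Method.FILE_LOADS)  // batch load jobs
         .ignoreUnknownValues());                         // tolerate extra fields

    p.run().waitUntilFinish();
  }
}
```

Note this only covers *extra* fields; rows that violate the schema in other ways (e.g. a bad type for a declared field) are a separate concern.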

Re: Batch load with BigQueryIO fails because of a few bad records.

2021-05-07 Thread Matthew Ouyang
Thank you for responding Evan. It looks like these options will only work for STREAMING_INSERTS. Are there any options for BATCH_LOADS, and if not are there any plans for it?

On Thu, May 6, 2021 at 6:11 PM Evan Galpin wrote:
> Hey Matthew,
>
> I believe you might also need to use the “ignoreUn…

Re: Batch load with BigQueryIO fails because of a few bad records.

2021-05-06 Thread Evan Galpin
Hey Matthew, I believe you might also need to use the “ignoreUnknownValues”[1] or “skipInvalidRows”[2] options depending on your use case if your goal is to allow valid entities to succeed even if invalid entities exist and separately process failed entities via “getFailedResults”. You could also co…

Batch load with BigQueryIO fails because of a few bad records.

2021-05-06 Thread Matthew Ouyang
I am loading a batch of records with BigQueryIO.Write, but because some records don't match the target table schema, the entire write step fails and nothing gets written to the table. Is there a way for records that do match the target table schema to be inserted, and the records that don't match to be handled separately?