Thanks Robert!

All working now.

It turns out an incorrect log4j2 configuration was swallowing a warning about
converting org.apache.avro.util.Utf8 to Flink's internal data type.
Taking some inspiration from flink-avro, I've added a converter that
converts and re-orders the Avro-encoded BigQuery results into the correct
Flink RowData.
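
For anyone curious, the idea is roughly the sketch below. It's a minimal
stand-in, not the connector's actual code: the Map plays the role of an Avro
GenericRecord (fields in BigQuery's Avro schema order) and the Object[] plays
the role of Flink's RowData; real code would use GenericRecord, StringData,
and the table schema's DataTypes instead.

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class AvroRowConverter {
    // Produce a row whose positions follow the table schema's column order,
    // not the order the BigQuery Avro schema happens to use.
    static Object[] convert(Map<String, Object> avroRecord, String[] tableColumns) {
        Object[] row = new Object[tableColumns.length];
        for (int i = 0; i < tableColumns.length; i++) {
            Object value = avroRecord.get(tableColumns[i]);
            // Avro decodes strings as org.apache.avro.util.Utf8 (a CharSequence),
            // which Flink's internal row types do not accept directly.
            if (value instanceof CharSequence) {
                value = value.toString();
            }
            row[i] = value;
        }
        return row;
    }

    public static void main(String[] args) {
        Map<String, Object> record = new LinkedHashMap<>();
        record.put("name", new StringBuilder("flink")); // stands in for Utf8
        record.put("id", 42L);
        Object[] row = convert(record, new String[] {"id", "name"});
        System.out.println(row[0] + "," + row[1]); // prints "42,flink"
    }
}
```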

Cheers,
Matt.

On Tue, Apr 5, 2022 at 9:28 PM Robert Metzger <metrob...@gmail.com> wrote:

> Hi Matt,
>
> At first glance your code looks fine. I guess you'll need to follow the
> codepaths more with the debugger.
> Have you made sure that "reachedEnd()" returns false?
>
>
> On Tue, Apr 5, 2022 at 9:42 AM Matthew Brown <m...@matthewbrown.io> wrote:
>
>> Hi all,
>>
>> I'm attempting to build a Table API connector for BigQuery using the
>> BigQuery Storage API (
>> https://cloud.google.com/bigquery/docs/reference/storage).
>>
>> I've got a base structure built out at
>> https://github.com/mnbbrown/bigquery-connector
>> There are a couple of things I still have to do, like correcting the
>> mapping between the BigQuery Avro schema and Flink TypeInformation, and
>> adding a test suite.
>>
>> I've added it to an internal project I'm working on and "nextRecord" on
>> the InputFormat is never called. I can see open/close,
>> openInputFormat/closeInputFormat, etc. being called correctly.
>>
>> Does anybody have any debugging tips?
>>
>> Thanks,
>> Matt.
>>
>> --
>> AU: +61 459 493 730
>> UK: +44 7927 618921
>> @mnbbrown
>>
>
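
For context on the reachedEnd() hint above: Flink's runtime consumes an
InputFormat per split with a loop along the lines of the sketch below. The
interface here is a simplified, hypothetical stand-in for
org.apache.flink.api.common.io.InputFormat, just to show the call order, not
Flink's actual API.

```java
import java.util.ArrayList;
import java.util.List;

public class InputFormatLoop {
    // Toy stand-in for Flink's InputFormat, reduced to the calls relevant here.
    interface SimpleInputFormat<T> {
        void open();
        boolean reachedEnd();   // checked BEFORE every read
        T nextRecord();         // only called while reachedEnd() is false
        void close();
    }

    static <T> List<T> drive(SimpleInputFormat<T> format) {
        List<T> out = new ArrayList<>();
        format.open();
        try {
            // If reachedEnd() returns true on the very first check (e.g. because
            // the read state is not initialised in open()), nextRecord() is never
            // invoked -- the symptom described in this thread.
            while (!format.reachedEnd()) {
                out.add(format.nextRecord());
            }
        } finally {
            format.close();
        }
        return out;
    }

    public static void main(String[] args) {
        SimpleInputFormat<Integer> threeInts = new SimpleInputFormat<>() {
            int next = 0;
            public void open() { next = 0; }
            public boolean reachedEnd() { return next >= 3; }
            public Integer nextRecord() { return next++; }
            public void close() {}
        };
        System.out.println(drive(threeInts)); // prints "[0, 1, 2]"
    }
}
```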

-- 
AU: +61 459 493 730
UK: +44 7927 618921
@mnbbrown
