@OrielResearch Eila Arich-Landkof <e...@orielresearch.org>  Depending on
your needs, I wonder about setting up a sheet (or sheets, as needed) that
uses a BQ connector as its data source. If you use Dataflow to write/create
a BQ table, that table would then hydrate the sheet (I'm not sure about the
ordering -- you may need to create the BQ table before creating the
sheet)...? An extra step, and perhaps a bit convoluted.
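
A very rough, untested sketch of the Dataflow side of that idea -- the table
name, schema, and parsing step are made up for the example, and the sheet's
BQ data source would be configured separately in the Sheets UI:

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions()  # project, runner, temp_location, etc.

with beam.Pipeline(options=options) as p:
    (p
     | 'Read file' >> beam.io.ReadFromText('gs://my-bucket/path/to/file')
     | 'Parse to rows' >> beam.Map(lambda line: {'value': line})  # stand-in for real parsing
     | 'Write to BQ' >> beam.io.WriteToBigQuery(
           'my-project:my_dataset.my_table',
           schema='value:STRING',
           create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
           write_disposition=beam.io.BigQueryDisposition.WRITE_TRUNCATE))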

Another idea would be to write to some sheet-compatible file type (e.g.,
CSV) and then upload that to the Drive folder. There *might* then be a CLI
call or API request that can turn the CSV file into a sheet on upload?
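
For the upload route, something along these lines might work (untested
sketch using the Drive API rather than a CLI; the folder ID, filenames, and
service-account file are placeholders):

from google.oauth2 import service_account
from googleapiclient.discovery import build
from googleapiclient.http import MediaFileUpload

creds = service_account.Credentials.from_service_account_file(
    'service-account.json',
    scopes=['https://www.googleapis.com/auth/drive'])
drive = build('drive', 'v3', credentials=creds)

file_metadata = {
    'name': 'my-results',
    'parents': ['DRIVE_FOLDER_ID'],
    # Asking Drive to import the CSV as a native Google Sheet on upload.
    'mimeType': 'application/vnd.google-apps.spreadsheet',
}
media = MediaFileUpload('results.csv', mimetype='text/csv')
created = drive.files().create(body=file_metadata,
                               media_body=media,
                               fields='id').execute()
print('Created sheet with id:', created['id'])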

Neither seems as clean as what you're looking for :-/
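
Re: Luke's suggestion below (a transform after the BQ sink that talks to
Sheets directly) -- a very rough, untested sketch of what such a DoFn might
look like; the spreadsheet ID, range, and upstream batching are placeholders,
and the worker's credentials would need the spreadsheets scope:

import apache_beam as beam

class WriteRowsToSheet(beam.DoFn):
    """Appends batches of rows to a Google Sheet via the Sheets API."""

    def __init__(self, spreadsheet_id, cell_range):
        self.spreadsheet_id = spreadsheet_id
        self.cell_range = cell_range

    def setup(self):
        # Uses application-default credentials on the worker.
        from googleapiclient.discovery import build
        self.service = build('sheets', 'v4')

    def process(self, rows):
        # 'rows' is expected to be a batch: a list of row lists,
        # e.g. produced by beam.BatchElements() upstream.
        self.service.spreadsheets().values().append(
            spreadsheetId=self.spreadsheet_id,
            range=self.cell_range,
            valueInputOption='RAW',
            body={'values': rows}).execute()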


On Mon, Jun 8, 2020 at 8:26 AM Luke Cwik <lc...@google.com> wrote:

> It doesn't look like BigQuery supports exporting to Google Sheets [1]; maybe
> you can invoke this BQ connector directly by adding a transform that
> follows the BQ sink.
>
> 1:
> https://cloud.google.com/bigquery/docs/exporting-data#export_limitations
>
> On Sat, Jun 6, 2020 at 8:31 PM OrielResearch Eila Arich-Landkof <
> e...@orielresearch.org> wrote:
>
>> Hello,
>>
>> Is it possible to have the pipeline sink to a Google Sheet within a
>> specific Google Drive directory?
>> Something like this:
>>
>> p = beam.Pipeline(options=options)
>> (p | 'Step 1: read file' >> beam.io.ReadFromText('path/to/file')
>>    | 'Step 2: process data' >> beam.ParDo(get_data())
>>    | 'Step 3: write data to gsheet' >> beam.io.WriteToXXX(GSHEET_PATH))
>>
>>
>> I know that BQ has a connector to Google Sheets. Is it possible to use
>> this connector from the BQ sink? Or is there another way?
>>
>> Thanks,
>> Eila
>>
>>
