[ https://issues.apache.org/jira/browse/BEAM-1909?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17127045#comment-17127045 ]

Joar Wandborg commented on BEAM-1909:
-------------------------------------

[~muscovite bob] "this issue" was fixed in Apache Beam 2.8.0. Are you running
an older version?
BigQuerySource(query='...')
executes the BigQuery query with dry_run=True and then uses the location of the
first table referenced in the query. See
https://github.com/apache/beam/blob/d58fb742ecc9f8ada721121bf749e23e43ee18e6/sdks/python/apache_beam/io/gcp/bigquery_tools.py#L314-L322
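The location-resolution step described above can be sketched as follows. This is a simplified illustration of the idea in bigquery_tools.py, not the actual Beam code; FakeClient and its method names (dry_run, get_dataset_location) are hypothetical stand-ins for a real google.cloud.bigquery client.

```python
def get_query_location(client, query):
    """Dry-run the query and return the location of the dataset that
    holds the first referenced table (mirrors Beam 2.8.0+ behavior)."""
    # A dry run validates the query and reports which tables it
    # references, without executing it or incurring query cost.
    referenced = client.dry_run(query)
    if not referenced:
        return None
    project, dataset, _table = referenced[0]
    # The temp dataset/table for query results must be created in this
    # same region, otherwise cross-region reads fail (BEAM-1909).
    return client.get_dataset_location(project, dataset)


class FakeClient:
    """Illustrative stand-in for a BigQuery client (hypothetical API)."""

    def __init__(self, dataset_locations):
        # Maps (project, dataset) -> region, e.g. "EU" or "US".
        self._locations = dataset_locations

    def dry_run(self, query):
        # A real client would validate the query server-side; here we
        # pretend the query references a single table in an EU dataset.
        return [("my-project", "eu_dataset", "events")]

    def get_dataset_location(self, project, dataset):
        return self._locations[(project, dataset)]


client = FakeClient({("my-project", "eu_dataset"): "EU"})
print(get_query_location(client, "SELECT * FROM eu_dataset.events"))  # EU
```

With a real client, the same shape applies: run the query with dry_run=True, inspect the referenced tables, and look up the first table's dataset location before creating the temporary dataset.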

> BigQuery read transform fails for DirectRunner when querying non-US regions
> ---------------------------------------------------------------------------
>
>                 Key: BEAM-1909
>                 URL: https://issues.apache.org/jira/browse/BEAM-1909
>             Project: Beam
>          Issue Type: Bug
>          Components: sdk-py-core
>            Reporter: Chamikara Madhusanka Jayalath
>            Assignee: Chamikara Madhusanka Jayalath
>            Priority: P2
>             Fix For: 2.8.0
>
>          Time Spent: 2h 40m
>  Remaining Estimate: 0h
>
> See: 
> http://stackoverflow.com/questions/42135002/google-dataflow-cannot-read-and-write-in-different-locations-python-sdk-v0-5-5/42144748?noredirect=1#comment73621983_42144748
> This should be fixed by creating the temp dataset and table in the correct 
> region.
> cc: [~sb2nov]



--
This message was sent by Atlassian Jira
(v8.3.4#803005)