dliang5 opened a new issue #21279: URL: https://github.com/apache/airflow/issues/21279
### Apache Airflow version

2.1.4

### What happened

At my organization, we currently follow the documentation for BigQuery data transfer in Airflow/Composer (1.17.9). It calls a `BigQueryDataTransferServiceTransferRunSensor`, which during execution creates a `BigQueryDataTransferServiceHook`. Through this hook it calls `get_transfer_run`, which calls its gRPC counterpart. From the passed arguments it builds the request `name` with an f-string, which is passed along to fetch the transfer run object.

In the [bigquery_dts file](https://github.com/apache/airflow/blob/main/airflow/providers/google/cloud/hooks/bigquery_dts.py), at line 263 on `[email protected]` or the main branch, the variables are not interpolated into the string `name` correctly due to a misplaced `f`. This leaves `name` with the literal value `"f{project}/transferConfigs/{transfer_config_id}/runs/{run_id}"`, which causes an "invalid run name" error from gRPC.

### What you expected to happen

Say I pass these variables, identifying the data transfer run I want to monitor, into the sensor class:

- project_id = "123"
- transfer_config_id = "456"
- run_id = "789"

I expect `name` to contain `projects/123/transferConfigs/456/runs/789` when `get_transfer_run` is called with the above arguments during execution.

### How to reproduce

This assumes you have a GCP project and a dataset with any kind of tables in it.

1. Go to BigQuery Data Transfer, which is on the sidebar of the BigQuery workspace, or search for "BigQuery Data Transfer".
2. Click "CREATE TRANSFER".
3. Select "Dataset Copy".
4. Fill in all the required information and change "Repeats" from "daily" to "On-demand".
5. Click "SAVE"; it should bring you back to the homepage with the created transfer config.
6. Click on the created transfer config and click "RUN TRANSFER NOW". This should generate an "in-progress" run.
7. Click on the completed run; a sidebar should appear on the right containing the "Run details".
8. Under `Resource name`, save the numbers to the right of `Projects`, `transferConfigs`, and `runs`, as they are the information needed to run the sensor class in Airflow.
9. In Airflow, create a basic DAG file with a `BigQueryDataTransferServiceTransferRunSensor` using the three values from above.
10. Execute the DAG; it will fail with an "invalid run name" error.

### Operating System

Ubuntu 18.04.6 LTS

### Versions of Apache Airflow Providers

apache-airflow-providers-google==6.3.0

### Deployment

Composer

### Deployment details

None, just default Composer.

### Anything else

I would be down to create a PR for this, as it seems like a chill bug and would be great exposure for me, being new to all this. However, I'm fine leaving it for others to do if it's a lot more dire than expected.

### Are you willing to submit PR?

- [X] Yes I am willing to submit a PR!

### Code of Conduct

- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
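The misplaced-`f` bug described above can be sketched in isolation (variable names are illustrative, not the exact hook code): with the `f` inside the quotes, Python sees a plain string literal and performs no interpolation.

```python
project_id = "123"
transfer_config_id = "456"
run_id = "789"

# Buggy form: the "f" is part of the string, so the braces are kept verbatim.
buggy = "f{project_id}/transferConfigs/{transfer_config_id}/runs/{run_id}"

# Fixed form: the "f" prefixes the literal, so the variables interpolate.
fixed = f"projects/{project_id}/transferConfigs/{transfer_config_id}/runs/{run_id}"

print(buggy)  # f{project_id}/transferConfigs/{transfer_config_id}/runs/{run_id}
print(fixed)  # projects/123/transferConfigs/456/runs/789
```

Passing the buggy value as the run `name` is what triggers the "invalid run name" error from gRPC, since it does not match the expected `projects/.../transferConfigs/.../runs/...` resource path.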
