eladkal opened a new issue, #42882: URL: https://github.com/apache/airflow/issues/42882
### Body

I have a kind request for all the contributors to the latest provider packages release. Could you please help us test the RC versions of the providers? The guidelines on how to test providers can be found in [Verify providers by contributors](https://github.com/apache/airflow/blob/main/dev/README_RELEASE_PROVIDER_PACKAGES.md#verify-the-release-candidate-by-contributors).

Let us know in a comment whether the issue is addressed.

These are the providers that require testing, as some substantial changes were introduced:

## Provider [amazon: 9.0.0rc1](https://pypi.org/project/apache-airflow-providers-amazon/9.0.0rc1)
- [ ] [Remove deprecated stuff from Amazon provider package (#42450)](https://github.com/apache/airflow/pull/42450): @vincbeck
  Linked issues:
  - [ ] [Linked Issue #42218](https://github.com/apache/airflow/pull/42218): @borismo
- [ ] [Support session reuse in RedshiftDataOperator (#42218)](https://github.com/apache/airflow/pull/42218): @borismo
- [ ] [Add STOPPED to the failure cases for Sagemaker Training Jobs (#42423)](https://github.com/apache/airflow/pull/42423): @ferruzzi
- [ ] [S3DeleteObjects Operator: Handle dates passed as strings (#42464)](https://github.com/apache/airflow/pull/42464): @ellisms
  Linked issues:
  - [ ] [Linked Issue #42363](https://github.com/apache/airflow/issues/42363): @mgorsk1
- [ ] [Small fix to AWS AVP cli init script (#42479)](https://github.com/apache/airflow/pull/42479): @o-nikolas
- [ ] [Make `AwsTaskLogFetcher` faster by reducing the amount of sleep (#42449)](https://github.com/apache/airflow/pull/42449): @smsm1-ito
- [ ] [Fix logout in AWS auth manager (#42447)](https://github.com/apache/airflow/pull/42447): @vincbeck
- [ ] [handle ClientError raised after key is missing during DyanmoDB table.get_item (#42408)](https://github.com/apache/airflow/pull/42408): @Lee-W
- [ ] [Drop python3.8 support core and providers (#42766)](https://github.com/apache/airflow/pull/42766): @jscheffl
  Linked issues:
  - [ ] [Linked Issue #42742](https://github.com/apache/airflow/pull/42742): @jscheffl
- [ ] [Removed conditional check for task context logging in airflow version 2.8.0 and above (#42764)](https://github.com/apache/airflow/pull/42764): @dirrao
- [ ] [Rename dataset related python variable names to asset (#41348)](https://github.com/apache/airflow/pull/41348): @Lee-W
- [ ] [Remove identity center auth manager cli (#42481)](https://github.com/apache/airflow/pull/42481): @o-nikolas
- [ ] [Refactor AWS Auth manager user output (#42454)](https://github.com/apache/airflow/pull/42454): @o-nikolas
- [ ] [Remove `sqlalchemy-redshift` dependency from Amazon provider (#42830)](https://github.com/apache/airflow/pull/42830): @vincbeck
- [ ] [Revert "Remove `sqlalchemy-redshift` dependency from Amazon provider" (#42864)](https://github.com/apache/airflow/pull/42864): @mobuchowski

## Provider [apache.beam: 5.8.1rc1](https://pypi.org/project/apache-airflow-providers-apache-beam/5.8.1rc1)
- [ ] [Bugfix/dataflow job location passing (#41887)](https://github.com/apache/airflow/pull/41887): @lukas-mi

## Provider [apache.kafka: 1.6.1rc1](https://pypi.org/project/apache-airflow-providers-apache-kafka/1.6.1rc1)
- [ ] [Remove callable functions parameter from kafka operator template_fields (#42555)](https://github.com/apache/airflow/pull/42555): @gopidesupavan
  Linked issues:
  - [ ] [Linked Issue #42502](https://github.com/apache/airflow/issues/42502): @mxmrlt

## Provider [apache.spark: 4.11.1rc1](https://pypi.org/project/apache-airflow-providers-apache-spark/4.11.1rc1)
- [ ] [The spark hook resolve_kerberos_principal function code update when airflow version 2.8.0 and above (#42777)](https://github.com/apache/airflow/pull/42777): @dirrao

## Provider [celery: 3.8.3rc1](https://pypi.org/project/apache-airflow-providers-celery/3.8.3rc1)
- [ ] [All executors should inherit from BaseExecutor (#41904)](https://github.com/apache/airflow/pull/41904): @dstandish
- [ ] [Remove state sync during celery task processing (#41870)](https://github.com/apache/airflow/pull/41870): @Kytha
- [ ] [Standard provider bash operator (#42252)](https://github.com/apache/airflow/pull/42252): @gopidesupavan

## Provider [cloudant: 4.0.1rc1](https://pypi.org/project/apache-airflow-providers-cloudant/4.0.1rc1)
- [ ] [Drop python3.8 support core and providers (#42766)](https://github.com/apache/airflow/pull/42766): @jscheffl
  Linked issues:
  - [ ] [Linked Issue #42742](https://github.com/apache/airflow/pull/42742): @jscheffl

## Provider [cncf.kubernetes: 9.0.0rc1](https://pypi.org/project/apache-airflow-providers-cncf-kubernetes/9.0.0rc1)
- [ ] [kubernetes executor cleanup_stuck_queued_tasks optimization (#41220)](https://github.com/apache/airflow/pull/41220): @dirrao
- [ ] [All executors should inherit from BaseExecutor (#41904)](https://github.com/apache/airflow/pull/41904): @dstandish
- [ ] [Fix mark as success when pod fails while fetching log (#42815)](https://github.com/apache/airflow/pull/42815): @romsharon98
- [ ] [Fix SparkKubernetesOperator spark name. (#42427)](https://github.com/apache/airflow/pull/42427): @gopidesupavan
  Linked issues:
  - [ ] [Linked Issue #41188](https://github.com/apache/airflow/issues/41188): @andallo
- [ ] [KubernetesPodOperator never stops if credentials are refreshed (#42361)](https://github.com/apache/airflow/pull/42361): @paolo-moriello
- [ ] [Added unit tests and restructred `await_xcom_sidecar_container_start` method. (#42504)](https://github.com/apache/airflow/pull/42504): @harjeevanmaan
  Linked issues:
  - [ ] [Linked Issue #42132](https://github.com/apache/airflow/issues/42132): @captify-mkambur
- [ ] [KubernetesHook kube_config extra can take dict (#41413)](https://github.com/apache/airflow/pull/41413): @dstandish
- [ ] [Drop python3.8 support core and providers (#42766)](https://github.com/apache/airflow/pull/42766): @jscheffl
  Linked issues:
  - [ ] [Linked Issue #42742](https://github.com/apache/airflow/pull/42742): @jscheffl
- [ ] [Remove airflow_version from k8s executor pod selector (#42751)](https://github.com/apache/airflow/pull/42751): @dstandish

## Provider [common.compat: 1.2.1rc1](https://pypi.org/project/apache-airflow-providers-common-compat/1.2.1rc1)
- [ ] [Rename dataset related python variable names to asset (#41348)](https://github.com/apache/airflow/pull/41348): @Lee-W

## Provider [common.io: 1.4.2rc1](https://pypi.org/project/apache-airflow-providers-common-io/1.4.2rc1)
- [ ] [Drop python3.8 support core and providers (#42766)](https://github.com/apache/airflow/pull/42766): @jscheffl
  Linked issues:
  - [ ] [Linked Issue #42742](https://github.com/apache/airflow/pull/42742): @jscheffl
- [ ] [Rename dataset related python variable names to asset (#41348)](https://github.com/apache/airflow/pull/41348): @Lee-W

## Provider [common.sql: 1.18.0rc1](https://pypi.org/project/apache-airflow-providers-common-sql/1.18.0rc1)
- [ ] [feat(providers/common/sql): add warning to connection setter (#42736)](https://github.com/apache/airflow/pull/42736): @Lee-W
- [ ] [FIX: Only pass connection to sqlalchemy engine in JdbcHook (#42705)](https://github.com/apache/airflow/pull/42705): @dabla
  Linked issues:
  - [ ] [Linked Issue #42664](https://github.com/apache/airflow/issues/42664): @emredjan

## Provider [databricks: 6.11.0rc1](https://pypi.org/project/apache-airflow-providers-databricks/6.11.0rc1)
- [ ] [Add `on_kill` to Databricks Workflow Operator (#42115)](https://github.com/apache/airflow/pull/42115): @R7L208
- [ ] [Add warning log in `DatabricksTaskBaseOperator` when task_key>100 (#42813)](https://github.com/apache/airflow/pull/42813): @rawwar
  Linked issues:
  - [ ] [Linked Issue #41816](https://github.com/apache/airflow/issues/41816): @rawwar
- [ ] [Add debug logs to print Request/Response data in Databricks provider (#42662)](https://github.com/apache/airflow/pull/42662): @rawwar

## Provider [dbt.cloud: 3.11.0rc1](https://pypi.org/project/apache-airflow-providers-dbt-cloud/3.11.0rc1)
- [ ] [Add ability to provide proxy for dbt Cloud connection (#42737)](https://github.com/apache/airflow/pull/42737): @b-per
- [ ] [Simplify code for recent dbt provider change (#42840)](https://github.com/apache/airflow/pull/42840): @kaxil

## Provider [elasticsearch: 5.5.2rc1](https://pypi.org/project/apache-airflow-providers-elasticsearch/5.5.2rc1)
- [ ] [Removed conditional check for task context logging in airflow version 2.8.0 and above (#42764)](https://github.com/apache/airflow/pull/42764): @dirrao

## Provider [fab: 1.4.1rc1](https://pypi.org/project/apache-airflow-providers-fab/1.4.1rc1)
- [ ] [Update Rest API tests to no longer rely on FAB auth manager. Move tests specific to FAB permissions to FAB provider (#42523)](https://github.com/apache/airflow/pull/42523): @vincbeck
- [ ] [Rename dataset related python variable names to asset (#41348)](https://github.com/apache/airflow/pull/41348): @Lee-W
- [ ] [Simplify expression for get_permitted_dag_ids query (#42484)](https://github.com/apache/airflow/pull/42484): @dstandish

## Provider [google: 10.24.0rc1](https://pypi.org/project/apache-airflow-providers-google/10.24.0rc1)
- [ ] [Add 'retry_if_resource_not_ready' logic for DataprocCreateClusterOperator and DataprocCreateBatchOperator (#42703)](https://github.com/apache/airflow/pull/42703): @MaksYermak
- [ ] [Publish Dataproc Serverless Batch link after it starts if batch_id was provided (#41153)](https://github.com/apache/airflow/pull/41153): @rafalh
- [ ] [Fix gcp_conn_id in PubsubPullTrigger (#42671)](https://github.com/apache/airflow/pull/42671): @gopidesupavan
  Linked issues:
  - [ ] [Linked Issue #42160](https://github.com/apache/airflow/issues/42160): @nickmarx12345678
- [ ] [Fix consistent return response from PubSubPullSensor (#42080)](https://github.com/apache/airflow/pull/42080): @gopidesupavan
  Linked issues:
  - [ ] [Linked Issue #41877](https://github.com/apache/airflow/issues/41877): @arpit-maheshwari1
- [ ] [Undo partition exclusion from the table name when splitting a full BigQuery table name (#42541)](https://github.com/apache/airflow/pull/42541): @moiseenkov
- [ ] [Fix GCP text to speech operator uri fetch (#42309)](https://github.com/apache/airflow/pull/42309): @olegkachur-e
- [ ] [Refactor ``bucket.get_blob`` calls in ``GCSHook`` to handle validation for non-existent objects. (#42474)](https://github.com/apache/airflow/pull/42474): @jsjasonseba
  Linked issues:
  - [ ] [Linked Issue #42439](https://github.com/apache/airflow/issues/42439): @shahar1
- [ ] [Bugfix/dataflow job location passing (#41887)](https://github.com/apache/airflow/pull/41887): @lukas-mi
- [ ] [Removed conditional check for task context logging in airflow version 2.8.0 and above (#42764)](https://github.com/apache/airflow/pull/42764): @dirrao
- [ ] [Rename dataset related python variable names to asset (#41348)](https://github.com/apache/airflow/pull/41348): @Lee-W
- [ ] [Deprecate `AutoMLBatchPredictOperator` and refactor AutoML system tests (#42260)](https://github.com/apache/airflow/pull/42260): @olegkachur-e

## Provider [jdbc: 4.5.2rc1](https://pypi.org/project/apache-airflow-providers-jdbc/4.5.2rc1)
- [ ] [FIX: Only pass connection to sqlalchemy engine in JdbcHook (#42705)](https://github.com/apache/airflow/pull/42705): @dabla
  Linked issues:
  - [ ] [Linked Issue #42664](https://github.com/apache/airflow/issues/42664): @emredjan

## Provider [microsoft.azure: 10.5.1rc1](https://pypi.org/project/apache-airflow-providers-microsoft-azure/10.5.1rc1)
- [ ] [BUGFIX: Paginated results in MSGraphAsyncOperator (#42414)](https://github.com/apache/airflow/pull/42414): @dabla
- [ ] [Bugfix/42575 workaround pin azure kusto data (#42576)](https://github.com/apache/airflow/pull/42576): @jscheffl
  Linked issues:
  - [ ] [Linked Issue #42575](https://github.com/apache/airflow/issues/42575): @jscheffl
- [ ] [Removed conditional check for task context logging in airflow version 2.8.0 and above (#42764)](https://github.com/apache/airflow/pull/42764): @dirrao

## Provider [mysql: 5.7.2rc1](https://pypi.org/project/apache-airflow-providers-mysql/5.7.2rc1)
- [ ] [Rename dataset related python variable names to asset (#41348)](https://github.com/apache/airflow/pull/41348): @Lee-W

## Provider [openlineage: 1.12.2rc1](https://pypi.org/project/apache-airflow-providers-openlineage/1.12.2rc1)
- [ ] [Standard provider bash operator (#42252)](https://github.com/apache/airflow/pull/42252): @gopidesupavan
- [ ] [Drop python3.8 support core and providers (#42766)](https://github.com/apache/airflow/pull/42766): @jscheffl
  Linked issues:
  - [ ] [Linked Issue #42742](https://github.com/apache/airflow/pull/42742): @jscheffl
- [ ] [Rename dataset related python variable names to asset (#41348)](https://github.com/apache/airflow/pull/41348): @Lee-W

## Provider [opensearch: 1.5.0rc1](https://pypi.org/project/apache-airflow-providers-opensearch/1.5.0rc1)
- [ ] [Add feature to read log from opensearch (#41799)](https://github.com/apache/airflow/pull/41799): @Owen-CH-Leung
  Linked issues:
  - [ ] [Linked Issue #33619](https://github.com/apache/airflow/issues/33619): @djadeau
- [ ] [Don't pass auth to opensearch client with empty login and password (#39982)](https://github.com/apache/airflow/pull/39982): @pdebelak
  Linked issues:
  - [ ] [Linked Issue #39979](https://github.com/apache/airflow/issues/39979): @pdebelak
- [ ] [Removed conditional check for task context logging in airflow version 2.8.0 and above (#42764)](https://github.com/apache/airflow/pull/42764): @dirrao

## Provider [postgres: 5.13.1rc1](https://pypi.org/project/apache-airflow-providers-postgres/5.13.1rc1)
- [ ] [Rename dataset related python variable names to asset (#41348)](https://github.com/apache/airflow/pull/41348): @Lee-W

## Provider [snowflake: 5.8.0rc1](https://pypi.org/project/apache-airflow-providers-snowflake/5.8.0rc1)
- [ ] [Add Snowpark operator and decorator (#42457)](https://github.com/apache/airflow/pull/42457): @sfc-gh-jdu
  Linked issues:
  - [ ] [Linked Issue #24456](https://github.com/apache/airflow/issues/24456): @sfc-gh-madkins
- [ ] [Fixes: SnowflakeSqlApiOperator not resolving parameters in SQL (#42719)](https://github.com/apache/airflow/pull/42719): @harjeevanmaan
  Linked issues:
  - [ ] [Linked Issue #42033](https://github.com/apache/airflow/issues/42033): @chris-okorodudu
- [ ] [Make `private_key_content` a sensitive field in Snowflake connection (#42649)](https://github.com/apache/airflow/pull/42649): @rawwar
  Linked issues:
  - [ ] [Linked Issue #42496](https://github.com/apache/airflow/issues/42496): @TJaniF

## Provider [trino: 5.8.1rc1](https://pypi.org/project/apache-airflow-providers-trino/5.8.1rc1)
- [ ] [Rename dataset related python variable names to asset (#41348)](https://github.com/apache/airflow/pull/41348): @Lee-W

## Provider [ydb: 1.4.0rc1](https://pypi.org/project/apache-airflow-providers-ydb/1.4.0rc1)
- [ ] [Add an ability to use scan queries via new YDB operator (#42311)](https://github.com/apache/airflow/pull/42311): @vgvoleg

<!-- NOTE TO RELEASE MANAGER: You can move here the providers that have doc-only changes or for which changes are trivial, and you could assess that they are OK. -->

All users involved in the PRs:
@Lee-W @jscheffl @romsharon98 @borismo @sfc-gh-jdu @smsm1-ito @o-nikolas @gopidesupavan @Kytha @paolo-moriello @lukas-mi @Owen-CH-Leung @dirrao @olegkachur-e @rafalh @mobuchowski @kaxil @pdebelak @MaksY

### Committer

- [X] I acknowledge that I am a maintainer/committer of the Apache Airflow project.
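For contributors picking a package from the checklist: each provider id above (e.g. `apache.beam`) maps to a PyPI distribution name by prefixing `apache-airflow-providers-` and replacing dots with dashes, as the PyPI links show. A small helper, written here purely for illustration and not part of any Airflow tooling, builds the pip requirement string for an RC:

```python
def rc_requirement(provider_id: str, version: str) -> str:
    """Build the pip requirement string for a provider release candidate.

    PyPI distribution names are ``apache-airflow-providers-`` plus the
    provider id with dots replaced by dashes (illustrative helper only).
    """
    dist = "apache-airflow-providers-" + provider_id.replace(".", "-")
    return f"{dist}=={version}"


# Examples drawn from the checklist above:
print(rc_requirement("amazon", "9.0.0rc1"))
# apache-airflow-providers-amazon==9.0.0rc1
print(rc_requirement("cncf.kubernetes", "9.0.0rc1"))
# apache-airflow-providers-cncf-kubernetes==9.0.0rc1
```

The resulting string can be handed straight to pip in a throwaway test environment, e.g. `pip install "apache-airflow-providers-amazon==9.0.0rc1"`, before following the verification guidelines linked above.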