moomindani commented on PR #63622:
URL: https://github.com/apache/airflow/pull/63622#issuecomment-4062592033

   Yes, I tested this against real S3 with `S3CreateBucketOperator`. Here's the 
test DAG I used:
   
    ```python
    from airflow import DAG
    from airflow.providers.amazon.aws.operators.s3 import (
        S3CreateBucketOperator,
        S3DeleteBucketOperator,
    )

    ACCOUNT_ID = "xxxxxxxxxxxx"
    REGION = "us-east-1"
    BUCKET_NAME = f"airflow-dag-test-{ACCOUNT_ID}-{REGION}-an"

    with DAG(dag_id="s3_account_regional_bucket_test", schedule=None) as dag:
        create_op = S3CreateBucketOperator(
            task_id="create_bucket_account_regional",
            bucket_name=BUCKET_NAME,
            region_name=REGION,
            bucket_namespace="account-regional",
            aws_conn_id=None,  # use the default boto3 credential chain
        )

        delete_op = S3DeleteBucketOperator(
            task_id="delete_bucket",
            bucket_name=BUCKET_NAME,
            force_delete=False,
            aws_conn_id=None,
        )

        create_op >> delete_op
    ```
   
   Results:
   - **Create**: Bucket `airflow-dag-test-xxxxxxxxxxxx-us-east-1-an` created 
successfully
   - **Idempotency**: Second execution correctly detected the existing bucket 
and skipped creation
   - **Delete**: Bucket cleaned up successfully via `S3DeleteBucketOperator`
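
   For reference, the idempotency behavior above can be sketched as a check-then-create: call `head_bucket` first and only create when it raises a not-found error. The names below (`create_bucket_idempotently`, `FakeS3Client`, `BucketNotFound`) are illustrative stand-ins, not the operator's actual implementation; a real boto3 client signals a missing bucket with a 404 `ClientError` instead.

    ```python
    class BucketNotFound(Exception):
        """Stand-in for the 404 ClientError a real boto3 client raises."""

    def create_bucket_idempotently(client, bucket_name: str) -> str:
        """Create the bucket only if head_bucket says it does not exist."""
        try:
            client.head_bucket(Bucket=bucket_name)
            return "skipped"  # bucket already present: second run is a no-op
        except BucketNotFound:
            client.create_bucket(Bucket=bucket_name)
            return "created"

    class FakeS3Client:
        """Tiny in-memory stand-in for a boto3 S3 client."""
        def __init__(self):
            self.buckets = set()

        def head_bucket(self, Bucket):
            if Bucket not in self.buckets:
                raise BucketNotFound(Bucket)

        def create_bucket(self, Bucket):
            self.buckets.add(Bucket)

    client = FakeS3Client()
    print(create_bucket_idempotently(client, "demo"))  # created
    print(create_bucket_idempotently(client, "demo"))  # skipped
    ```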
   
   Note: Requires `botocore>=1.42.0` at runtime for the `BucketNamespace` 
parameter to be recognized by boto3.
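
   If it helps future readers, that runtime requirement can be guarded with a fail-fast version check. `meets_botocore_min` is an illustrative helper (not part of Airflow or this PR), and the naive `X.Y.Z` tuple comparison is an assumption; real code should prefer `packaging.version.Version`:

    ```python
    def meets_botocore_min(installed: str, required: str = "1.42.0") -> bool:
        """Naive X.Y.Z comparison; assumes purely numeric version parts."""
        def as_tuple(v: str) -> tuple:
            return tuple(int(part) for part in v.split("."))
        return as_tuple(installed) >= as_tuple(required)

    # In a DAG you could fail fast with something like:
    #   import botocore
    #   assert meets_botocore_min(botocore.__version__), "botocore too old"
    print(meets_botocore_min("1.41.9"))   # False: too old for BucketNamespace
    print(meets_botocore_min("1.42.0"))   # True
    ```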


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]