Hi Alexey,

Thanks, we will give it a try.
From: Alexey Romanenko <aromanenko....@gmail.com>
Reply-To: "user@beam.apache.org" <user@beam.apache.org>
Date: Thursday, June 10, 2021 at 5:14 AM
To: "user@beam.apache.org" <user@beam.apache.org>
Subject: Re: How to specify a spark config with Beam spark runner

Hi Tao,

The "limited Spark options" that you mentioned are Beam's application arguments. If you run your job via spark-submit, you should still be able to configure the Spark application with the normal spark-submit "--conf key=value" CLI option. Doesn't that work for you?

-- Alexey

On 10 Jun 2021, at 01:29, Tao Li <t...@zillow.com> wrote:

Hi Beam community,

We are trying to specify the Spark config "spark.hadoop.fs.s3a.canned.acl=BucketOwnerFullControl" in the spark-submit command for a Beam app, but only a limited set of Spark options appears to be supported according to this doc: https://beam.apache.org/documentation/runners/spark/

How can we specify an arbitrary Spark config? Please advise. Thanks!
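
For reference, a minimal sketch of what Alexey describes, assuming a self-contained Beam jar submitted to a YARN cluster; the main class, jar name, and master are placeholders for your own setup, and only the --conf flag and the Beam --runner argument are the relevant parts:

    # Sketch only: com.example.MyBeamPipeline and my-beam-app-bundled.jar are placeholders.
    # --conf passes an arbitrary Spark config to spark-submit; --runner=SparkRunner is a
    # Beam pipeline option and goes after the jar, with the application arguments.
    spark-submit \
      --class com.example.MyBeamPipeline \
      --master yarn \
      --conf spark.hadoop.fs.s3a.canned.acl=BucketOwnerFullControl \
      my-beam-app-bundled.jar \
      --runner=SparkRunner

Anything given via --conf is picked up by Spark itself (the spark.hadoop.* prefix forwards it to the Hadoop/S3A configuration), so it does not need to appear among the Beam Spark runner's documented pipeline options.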