Re: Write pyspark dataframe into kms encrypted s3 bucket

2020-10-15 Thread Hariharan
Sorry, I meant setLong only. If you know which version of the Hadoop jars you're using, you can check the code here to find out exactly which line is throwing the error.
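
A minimal sketch of that setLong approach (not verbatim from the thread; the app name is a placeholder). spark._jsc.hadoopConfiguration() exposes the underlying Java Configuration through py4j, and org.apache.hadoop.util.VersionInfo reports which Hadoop version the jars on the classpath come from:

    # Sketch: configure fs.s3a.multipart.size as a long from PySpark.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("kms-s3-write").getOrCreate()

    # hadoopConfiguration() returns org.apache.hadoop.conf.Configuration via
    # py4j; setLong() takes a plain Python int, which py4j maps to a Java
    # long, so no Python-2-style "L" suffix is needed.
    hadoop_conf = spark._jsc.hadoopConfiguration()
    hadoop_conf.setLong("fs.s3a.multipart.size", 104857600)  # 100 MiB

    # Print the Hadoop version to know which source branch to check.
    jvm = spark.sparkContext._jvm
    print(jvm.org.apache.hadoop.util.VersionInfo.getVersion())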

Re: Write pyspark dataframe into kms encrypted s3 bucket

2020-10-15 Thread Devi P V
hadoop_conf.set("fs.s3a.multipart.size", 104857600L) .set only allows string values. Its throwing invalid syntax. I tried following also. But issue not fixed. hadoop_conf.setLong("fs.s3a.multipart.size", 104857600) Thanks On Thu, Oct 15, 2020, 7:22 PM Hariharan wrote: > fs.s3a.multipart.siz

Re: Write pyspark dataframe into kms encrypted s3 bucket

2020-10-15 Thread Hariharan
fs.s3a.multipart.size needs to be a long value, not a string, so you will need to use:

    hadoop_conf.set("fs.s3a.multipart.size", 104857600L)

~ Hariharan

On Thu, Oct 15, 2020 at 6:32 PM Devi P V wrote:
> Hi All,
>
> I am trying to write a pyspark dataframe into a KMS encrypted S3 bucket. I am us
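
For context, a minimal sketch of the overall setup under discussion: writing a DataFrame to a KMS-encrypted bucket through the s3a connector. The property names assume Hadoop 2.9+/3.x s3a SSE-KMS support; the bucket path and key ARN are placeholders, not values from the thread.

    # Sketch: enable SSE-KMS on the s3a connector, then write a DataFrame.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("kms-s3-write-example").getOrCreate()

    # Tell s3a to request server-side encryption with a specific KMS key
    # (placeholder ARN below).
    hadoop_conf = spark._jsc.hadoopConfiguration()
    hadoop_conf.set("fs.s3a.server-side-encryption-algorithm", "SSE-KMS")
    hadoop_conf.set("fs.s3a.server-side-encryption.key",
                    "arn:aws:kms:us-east-1:111122223333:key/placeholder-id")

    # Write a small example DataFrame to the (placeholder) encrypted bucket.
    df = spark.createDataFrame([(1, "a"), (2, "b")], ["id", "value"])
    df.write.mode("overwrite").parquet("s3a://my-encrypted-bucket/output/")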