This is an interesting one.
I have never tried to add --files ...
spark-submit --master yarn --deploy-mode client --files
/etc/hive/conf/hive-site.xml,/etc/hadoop/conf/core-site.xml,/etc/hadoop/conf/hdfs-site.xml
Rather, under $SPARK_HOME/conf, I create soft links to the needed XML files,
as below:

ln -s /etc/hive/conf/hive-site.xml $SPARK_HOME/conf/hive-site.xml
ln -s /etc/hadoop/conf/core-site.xml $SPARK_HOME/conf/core-site.xml
ln -s /etc/hadoop/conf/hdfs-site.xml $SPARK_HOME/conf/hdfs-site.xml
Thanks everyone. I was able to resolve this.
Here is what I did: I just passed the conf file using the --files option.
The mistake I made was reading the JSON conf file before creating the Spark
session. Reading it after creating the Spark session fixed it. Thanks once
again for your valuable suggestions.
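(For later readers, a minimal sketch of the working order of operations
described above. On the driver, SparkFiles can only resolve the local copy of
a file shipped with --files once a session exists; the app name here is
illustrative.)

import json
from pyspark import SparkFiles
from pyspark.sql import SparkSession

# Create the Spark session first; conf.json is assumed to have been passed
# with: spark-submit --files /appl/common/ftp/conf.json ...
spark = SparkSession.builder.appName("conf-demo").getOrCreate()

# Only now can SparkFiles locate the distributed copy. Reading the file
# before the session existed was the mistake described above.
with open(SparkFiles.get("conf.json")) as f:
    conf = json.load(f)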
If code running on the executors needs some local file, like a config file,
then it does have to be passed this way. That much is normal.
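(A sketch of that executor-side pattern, assuming the thread's conf.json was
shipped with --files; the "ftp_host" key is hypothetical.)

import json
from pyspark import SparkFiles
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

def tag_with_host(rows):
    # Runs on an executor: SparkFiles.get resolves that executor's local
    # copy of the file distributed via --files.
    with open(SparkFiles.get("conf.json")) as f:
        conf = json.load(f)
    for row in rows:
        yield (row[0], conf.get("ftp_host"))  # "ftp_host" is a made-up key

df = spark.createDataFrame([("a",), ("b",)], ["id"])
print(df.rdd.mapPartitions(tag_with_host).collect())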
On Sat, May 15, 2021 at 1:41 AM Gourav Sengupta
wrote:
> Hi,
>
> once again, let's start with the requirement. Why are you trying to pass XML
> and JSON files to all executors?
>
> On Fri, May 14, 2021 at 5:01 PM Longjiang.Yang <longjiang.y...@target.com> wrote:
>
>> Could you check whether this file is accessible in executors? (is it
>> in HDFS or in the client local FS)
>> /appl/common/ftp/conf.json
>>
>> *From: *KhajaAsmath Mohammed
>> *Date: *Friday, May 14, 2021 at 4:50 PM
>> *To: *"user @spark"
>> *Subject: *[EXTERNAL] Urgent Help - Py Spark submit error
>>
>> /appl/common/ftp/conf.json
>
Hi,
I am having a weird situation where the command below works when the
deploy mode is client, and fails when it is cluster:
spark-submit --master yarn --deploy-mode client --files
/etc/hive/conf/hive-site.xml,/etc/hadoop/conf/core-site.xml,/etc/hadoop/conf/hdfs-site.xml
--driver-memory 70g --n
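(A sketch of why the deploy mode matters here, assuming
/appl/common/ftp/conf.json exists only on the submitting host:)

import json

# In client mode the driver runs on the submitting host, so this absolute
# local path exists and the open() succeeds. In cluster mode the driver is
# launched on an arbitrary cluster node where the path typically does not
# exist, so the same line fails unless the file is shipped with --files.
with open("/appl/common/ftp/conf.json") as f:
    conf = json.load(f)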