Hi,

Can you please try file://?

If you are using a cluster, make sure that the location you mention is
accessible at the same path on all the executors.
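
As a quick check, here is a minimal sketch in Scala (assuming a
SparkSession named spark; the path is the one from your command and must
exist at that same location on every node that runs a task):

    // Read a file from each node's local filesystem via file://.
    // Tasks scheduled on a node that cannot open this path will
    // typically fail with a FileNotFoundException.
    val lines = spark.read.textFile("file:///tmp/sharefiles/README.md")
    println(lines.count())

Note that with spark-submit in the default client deploy mode the driver
runs on your desktop and lists the input path as well, which is likely why
the file also has to exist there.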


Regards,
Gourav Sengupta

On Fri, Nov 5, 2021 at 4:16 AM Lynx Du <ngl...@outlook.com> wrote:

> Hi experts,
>
> I have just gotten started with Spark and Scala.
>
> I am confused about how to read local files.
>
> I run a Spark cluster using docker-compose. There are one master and two
> worker nodes. I think this is a so-called standalone cluster.
>
> I am trying to submit a simple task to this cluster with this command:
>
> spark-submit --class example.SimpleApp --master spark://localhost:7077
> simple-project_2.12-1.0.jar file:///tmp/sharefiles/README.md
>
>
> These are my test results.
>
> Case 1:
>
> I mount my local (Mac desktop) /tmp/sharefiles into each worker. It works
> fine. That is, /tmp/sharefiles/README.md exists both on my local desktop
> and on the worker machines.
>
> All the other cases failed to read.
>
> Is that expected?
>
> Why does my local desktop need to have this file? How can I remove this
> limitation? In my understanding, file:///xxx should only need to exist on
> the worker nodes.
>
> Thanks
>
>
