Hey Mark,

I believe "_spark_metadata" is the name of the subdirectory that Spark's
streaming file sink uses to store metadata about which output files are
valid; see the comment in the code:
https://github.com/apache/spark/blob/v2.3.0/sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/FileStreamSink.scala#L33

Do you see the exception logged as a warning or as an error in the Alluxio
master log? Posting the stack trace, if it is available, would be helpful.
My hypothesis is that, in your case, Spark was testing for (or attempting
to create) that directory.
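To illustrate what I mean, here is a rough Python sketch (not Spark's
actual Scala implementation) of the kind of check Spark performs: when
reading a path, the file sink logic looks for a "_spark_metadata"
subdirectory to decide whether the output was written by a streaming
file sink. A lookup for a path that does not exist can show up in the
Alluxio master log even when Spark handles the missing directory
gracefully.

```python
import os

# Name of the metadata subdirectory used by Spark's FileStreamSink
# (metadataDir = "_spark_metadata" in FileStreamSink.scala).
METADATA_DIR = "_spark_metadata"

def has_streaming_metadata(output_path: str) -> bool:
    """Rough sketch: does the output directory contain a
    _spark_metadata subdirectory? Spark performs a check like this
    to detect output written by a streaming file sink."""
    return os.path.isdir(os.path.join(output_path, METADATA_DIR))
```

So an absent "/test-data/_spark_metadata" does not necessarily indicate
corruption; it may just mean "/test-data" was never written by a
streaming query.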

-Bin

On Wed, Aug 28, 2019 at 1:59 AM Mark Zhao <guoguo20181...@gmail.com> wrote:

> Hey,
>
>  When running Spark on Alluxio-1.8.2, I encounter the following exception:
> “alluxio.exception.FileDoesNotExistException: Path
> “/test-data/_spark_metadata” does not exist” in Alluxio master.log. What
> exactly is the directory "_spark_metadata" used for? And how can I fix this
> problem?
>
> Thanks.
>
> Mark
>
