Thank you all for the responses. I believe the user shouldn't have to
worry about creating the log directory explicitly. Event logging should
behave like the other logs (e.g. master or slave logs): the directory
should be created automatically if it does not exist.
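In the meantime, a minimal sketch of the manual workaround (assuming the
default local path /tmp/spark-events; the helper name is made up) is to
create the directory before starting the application:

    import java.nio.file.{Files, Paths}

    // Hypothetical helper: pre-create the default event log directory so
    // the application does not fail at startup because it is missing.
    object EnsureEventLogDir {
      def main(args: Array[String]): Unit = {
        val dir = Paths.get("/tmp/spark-events")
        Files.createDirectories(dir) // no-op if it already exists
        println(s"Event log directory ready: $dir")
      }
    }

A plain `mkdir /tmp/spark-events` on the driver host should achieve the
same thing.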
-- ND
On 7/2/20 9:19 AM, Zero wrote:
This could be the result of not setting the event log location
properly. By default it is /tmp/spark-events, and since files in the
/tmp directory are cleaned up regularly, you can run into this problem.
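If you are unsure which settings a particular job actually ended up
using, a rough sketch (run e.g. in spark-shell; the fallback values
shown are the documented defaults) is to read them back from the
SparkConf:

    import org.apache.spark.sql.SparkSession

    // Print the effective event log settings of the current application.
    val spark = SparkSession.builder().appName("check-event-log").getOrCreate()
    val conf = spark.sparkContext.getConf
    println("spark.eventLog.enabled = " + conf.get("spark.eventLog.enabled", "false"))
    println("spark.eventLog.dir     = " + conf.get("spark.eventLog.dir", "/tmp/spark-events"))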
------------------ Original ------------------
*From:* "Xin Jinhan"<18183124...@163.com>;
*Date:* Thu, Jul 2, 2020 08:39 PM
*To:* "user"<user@spark.apache.org>;
*Subject:* Re: File Not Found: /tmp/spark-events in Spark 3.0
Hi,
First, /tmp/spark-events is the default storage location for the Spark
event log, but the log is only written when 'spark.eventLog.enabled' is
set to true, which your Spark 2.4.6 setup may have had set to false. So
you can simply set it back to false and the error will disappear.
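A minimal sketch of that quick fix (the app name is arbitrary), setting
the flag explicitly when building the session:

    import org.apache.spark.sql.SparkSession

    // Quick fix: turn event logging off, matching the presumed 2.4.6 behaviour.
    val spark = SparkSession.builder()
      .appName("no-event-log")
      .config("spark.eventLog.enabled", "false")
      .getOrCreate()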
Second, I would suggest keeping the event log enabled and specifying the
log location with 'spark.eventLog.dir', pointing it at either a
distributed filesystem or a local directory, because you may want to
check the logs later (you can simply browse them with the Spark History
Server).
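A sketch of that setup (the HDFS path is only an example; any durable
location works):

    import org.apache.spark.sql.SparkSession

    // Keep event logging on, but write to a durable location that the
    // Spark History Server can read later.
    val spark = SparkSession.builder()
      .appName("with-event-log")
      .config("spark.eventLog.enabled", "true")
      .config("spark.eventLog.dir", "hdfs:///spark-events")
      .getOrCreate()

The same two keys can also go into spark-defaults.conf, and the history
server is pointed at the same directory via spark.history.fs.logDirectory.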
Regards
Jinhan
--
Sent from: http://apache-spark-user-list.1001560.n3.nabble.com/
---------------------------------------------------------------------
To unsubscribe e-mail: user-unsubscr...@spark.apache.org