Hi, I have a question for you. Do we need to kill a Spark job every time we change it and deploy it to the cluster? Or is there a way for Spark to automatically pick up the latest jar version?
Best regards, Mina