If you are using the principal / keytab params, Spark should create tokens as needed. If it's not, something else is going wrong, and only looking at the full logs for the app would help.

On Wed, Jan 2, 2019 at 5:09 PM Ali Nazemian <alinazem...@gmail.com> wrote:
>
> Hi,
>
> We are using a headless keytab to run our long-running Spark Streaming
> application. The token is renewed automatically every day until it hits
> the max life limit. The problem is that the token expires after the max
> life (7 days) and we need to restart the job. Is there any way we can
> re-issue the token and pass it to a job that is already running? It
> doesn't feel right at all to restart the job every 7 days only because
> of the token issue.
>
> P.S.: We use "--keytab /path/to/the/headless-keytab", "--principal
> principalNameAsPerTheKeytab" and "--conf
> spark.hadoop.fs.hdfs.impl.disable.cache=true" as the arguments for the
> spark-submit command.
>
> Thanks,
> Ali
-- 
Marcelo
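For reference, the submit invocation described in the question can be sketched roughly as below. The master, deploy mode, application class, and jar name are placeholders not given in the thread; only the keytab, principal, and cache-disabling options come from the original message.

```shell
# Sketch of the spark-submit command from the question; app class/jar and
# cluster settings are hypothetical placeholders.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --principal principalNameAsPerTheKeytab \
  --keytab /path/to/the/headless-keytab \
  --conf spark.hadoop.fs.hdfs.impl.disable.cache=true \
  --class com.example.StreamingApp \
  streaming-app.jar
```

With `--principal` and `--keytab` supplied, Spark ships the keytab to the application and periodically re-obtains delegation tokens itself, which is what the reply above refers to.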